Abstract: X-ray computed tomography (CT) is a well-established visualization technique in medicine and nondestructive testing. However, since CT scanning requires sampling of radiographic projections from different viewing angles, common CT systems with mechanically moving parts are too slow for dynamic imaging, for instance of multiphase flows or live animals. A large number of X-ray projections is needed to reconstruct CT images, so the collection and processing of the projection data consume too much time and are harmful to the patient. To solve this problem, in this study we propose a method for tomographic reconstruction of a sample from a limited number of X-ray projections using linear interpolation. In simulation, we present reconstruction from an experimental X-ray CT scan of an aluminum phantom in two steps: the X-ray projections are first interpolated using the linear interpolation method, and the interpolated projections are then used for CT reconstruction based on the Ordered Subsets Expectation Maximization (OSEM) method.
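The first step above, filling in missing viewing angles by linear interpolation between the two nearest measured projections, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the sinogram is represented as plain lists, one detector-value row per measured angle.

```python
# Hypothetical sketch: angular interpolation of a sparse sinogram prior to
# reconstruction. Each row holds the detector readings of one projection
# taken at a known angle; a missing angle is filled by linearly blending
# the two nearest measured projections.

def interpolate_projections(angles, projections, target_angles):
    """Linearly interpolate projection rows at target_angles.

    angles        -- sorted list of measured angles (degrees)
    projections   -- list of detector-value lists, one per measured angle
    target_angles -- angles at which interpolated projections are wanted
    """
    result = []
    for t in target_angles:
        # clamp to the measured range
        if t <= angles[0]:
            result.append(list(projections[0]))
            continue
        if t >= angles[-1]:
            result.append(list(projections[-1]))
            continue
        # find the bracketing measured angles and blend
        for i in range(len(angles) - 1):
            a0, a1 = angles[i], angles[i + 1]
            if a0 <= t <= a1:
                w = (t - a0) / (a1 - a0)
                row = [(1 - w) * p0 + w * p1
                       for p0, p1 in zip(projections[i], projections[i + 1])]
                result.append(row)
                break
    return result
```

For example, with projections measured at 0° and 90°, the interpolated 45° row is the element-wise average of the two measured rows.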
Abstract: The major purpose of this study is to use network and multimedia technologies to build a game-based learning system for junior high school students to apply in learning "World Geography" through "role-playing" game approaches. This study first investigated the motivation and habits of junior high school students in using the Internet and online games, and then designed a game-based learning system according to situated and game-based learning theories. A teaching experiment was conducted to analyze the learning effectiveness of students on the game-based learning system and the major factors affecting their learning. A questionnaire survey was used to understand the students' attitudes towards game-based learning. The results showed that the game-based learning system can enhance students' learning, but that the gender of students and their habits in using the Internet have no significant impact on learning. Game experience has a significant impact on students' learning: the higher the experience value, the better the learning effectiveness. The questionnaire survey also revealed that the system can increase students' motivation and interest in learning "World Geography".
Abstract: A 3D industrial computed tomography (CT) system, manufactured on the basis of a first-generation CT architecture (single source, single detector), was evaluated. The operating accuracy of the manufactured system was assessed by comparing simulations with experimental tests. 137Cs and 60Co were used as gamma sources. Simulations were carried out using the MCNP4C code. Experimental tests with 137Cs were in good agreement with the simulations.
Abstract: Ontology matching is a task needed in various applications, for example for comparison or merging purposes. In the literature, many algorithms solving the matching problem can be found, but most of them do not consider instances at all. Mappings are determined by calculating the string similarity of labels, by recognizing linguistic word relations (synonyms, subsumptions, etc.) or by analyzing the (graph) structure. Since instances are often modeled within the ontology, and since the set of instances describes the meaning of the concepts better than their meta-information, instances should definitely be incorporated into the matching process. In this paper several novel instance-based matching algorithms are presented which enhance the quality of matching results obtained with common concept-based methods. Different kinds of formalisms are used to classify concepts on account of their instances and finally to compare the concepts directly.
Keywords: Instances, Ontology Matching, Semantic Web
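One simple form of instance-based matching, sketched here for illustration only (the paper's own algorithms are more elaborate), scores a concept pair by the Jaccard similarity of their instance sets; concept names and instance identifiers below are invented.

```python
# Illustrative instance-based matcher: two concepts are considered a mapping
# candidate when the overlap of their instance sets (Jaccard similarity)
# exceeds a threshold.

def jaccard(instances_a, instances_b):
    """Jaccard similarity of two instance collections."""
    a, b = set(instances_a), set(instances_b)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def match_concepts(onto_a, onto_b, threshold=0.5):
    """onto_a / onto_b map concept name -> iterable of instance identifiers.

    Returns (concept_a, concept_b, score) triples above the threshold.
    """
    mappings = []
    for ca, insts_a in onto_a.items():
        for cb, insts_b in onto_b.items():
            score = jaccard(insts_a, insts_b)
            if score >= threshold:
                mappings.append((ca, cb, score))
    return mappings
```

Such a score can be combined with label- or structure-based similarities rather than replacing them.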
Abstract: This work presents a new phonetic transcription system based on a tree of hierarchical pronunciation rules expressed as context-specific grapheme-phoneme correspondences. The tree is automatically inferred from a phonetic dictionary by incrementally analyzing deeper context levels, eventually representing a minimum set of exhaustive rules that pronounce without errors all the words in the training dictionary and that can be applied to out-of-vocabulary words. The proposed approach improves upon existing rule-tree-based techniques in that it makes use of graphemes, rather than letters, as elementary orthographic units. A new linear algorithm for the segmentation of a word in graphemes is introduced to enable out-of-vocabulary grapheme-based phonetic transcription. Exhaustive rule trees provide a canonical representation of the pronunciation rules of a language that can be used not only to pronounce out-of-vocabulary words, but also to analyze and compare the pronunciation rules inferred from different dictionaries. The proposed approach has been implemented in C and tested on Oxford British English and Basic English. Experimental results show that grapheme-based rule trees represent phonetically sound rules and provide better performance than letter-based rule trees.
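The abstract does not spell out the segmentation algorithm, so the following is only a plausible sketch of grapheme segmentation by greedy longest match against a known grapheme inventory (the inventory and example words are invented; the paper's actual implementation is in C).

```python
# Hedged sketch: split a word into graphemes by always taking the longest
# inventory entry that matches at the current position, falling back to a
# single letter when nothing matches.

def segment(word, graphemes):
    """Return the list of graphemes covering word, longest-match-first."""
    longest = max(len(g) for g in graphemes)
    out, i = [], 0
    while i < len(word):
        for size in range(min(longest, len(word) - i), 0, -1):
            piece = word[i:i + size]
            if piece in graphemes or size == 1:
                out.append(piece)
                i += size
                break
    return out
```

With a bounded maximum grapheme length, each position is inspected a constant number of times, which is consistent with the linear-time claim.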
Abstract: The prevalence of non-organic constipation differs from country to country, and the reliability of the estimated rates is uncertain. Moreover, the clinical relevance of subdividing the heterogeneous functional constipation disorders into pre-defined subgroups is largely unknown. Aim: to estimate the prevalence of constipation in a population-based sample and determine whether clinical subgroups can be identified. An age- and gender-stratified sample population from 5 Italian cities was evaluated using a previously validated questionnaire. Data mining by cluster analysis was used to determine constipation subgroups. Results: 1,500 complete interviews were obtained from 2,083 contacted households (72%). Self-reported constipation correlated poorly with symptom-based constipation, found in 496 subjects (33.1%). Cluster analysis identified four constipation subgroups, which correlated with subgroups identified according to pre-defined symptom criteria. Significant differences in socio-demographics and lifestyle were observed among subgroups.
Abstract: In our modern world, more and more physical transactions are being substituted by electronic transactions (e.g. banking, shopping, and payments), and many businesses and companies perform most of their operations through the Internet. Instead of physical commerce, Internet visitors are now adapting to electronic commerce (e-Commerce). Web users' ability to reach products worldwide can benefit greatly from friendly and personalized online business portals. Internet visitors will return to a particular website when they can easily find the information they need or want. Dealing with this human conceptualization brings the incorporation of Artificial/Computational Intelligence techniques into the creation of customized portals. Among these techniques, fuzzy-set technologies can make many useful contributions to the development of such a human-centered endeavor as e-Commerce. The main objective of this paper is the implementation of a paradigm for the intelligent design and operation of human-computer interfaces. In particular, the paradigm is well suited to the intelligent design and operation of software modules that display information (such as Web pages, graphical user interfaces (GUIs), and multimedia modules) on a computer screen. The human conceptualization of the user's personal information is analyzed through a cascaded fuzzy inference (decision-making) system to generate the User Ascribed Qualities, which identify the user and can be used to customize portals with appropriate Web links.
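A cascaded fuzzy inference chains the fuzzy output of one stage into the rules of the next. The toy sketch below is not the paper's system: the membership functions, rule strengths and the "ascribed quality" output are all invented to show the mechanism (stage 1 fuzzifies browsing time into interest levels; stage 2 combines them with purchase frequency).

```python
# Minimal two-stage (cascaded) fuzzy inference sketch with triangular
# membership functions and weighted-average defuzzification. All numbers
# and rule consequents are illustrative.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def stage1_interest(minutes_on_site):
    """Stage 1: fuzzify browsing time into low/high interest degrees."""
    return {"low": tri(minutes_on_site, -1, 0, 10),
            "high": tri(minutes_on_site, 5, 15, 100)}

def stage2_quality(interest, purchases_per_month):
    """Stage 2: combine stage-1 fuzzy output with a crisp input into a
    0..1 'ascribed quality' score (weighted-average defuzzification)."""
    frequent = tri(purchases_per_month, 0, 10, 20)
    # rule 1: high interest AND frequent buyer -> quality 1.0
    w1 = min(interest["high"], frequent)
    # rule 2: low interest -> quality 0.2
    w2 = interest["low"]
    if w1 + w2 == 0:
        return 0.0
    return (w1 * 1.0 + w2 * 0.2) / (w1 + w2)
```

A portal could then threshold the final score to decide which personalized links to display.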
Abstract: Diagnosis can be achieved by building a model of a certain organ under surveillance and comparing it with the real-time physiological measurements taken from the patient. This paper presents the benefits of using data mining techniques in computer-aided diagnosis (CAD), focusing on cancer detection, in order to help doctors make optimal decisions quickly and accurately. Among noninvasive diagnosis techniques, endoscopic ultrasound elastography (EUSE) is a recent elasticity imaging technique that allows characterizing the difference between malignant and benign tumors. The main features of the EUSE sample movies are digitized and summarized in vector form by means of exploratory data analysis (EDA). Neural networks are then trained on the corresponding EUSE sample-movie input vectors in such a way that these intelligent systems are able to offer a very precise and objective diagnosis, discriminating between benign and malignant tumors. A concrete application of these data mining techniques illustrates the suitability and reliability of this methodology in CAD.
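The final classification step, training a network on feature vectors and labels, can be illustrated with the simplest possible neural model. This is a stand-in sketch only: a single perceptron on invented two-feature vectors (label 1 = malignant, 0 = benign), not the paper's network or data.

```python
# Toy perceptron standing in for the neural-network classifier described
# above. Feature vectors and labels are invented for illustration.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Online perceptron training; returns learned weights and bias."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            # shift the decision boundary toward misclassified samples
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    """Classify a feature vector with the learned weights."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
```

On linearly separable toy data the perceptron converges in a few epochs; a real EUSE classifier would of course use a multi-layer network and many more features.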
Abstract: This study uses GIS (Geographic Information Systems) to evaluate the degree of sufficiency of public green spaces, such as parks and urban green areas, as an indicator of the density of metropolitan areas, in particular the Chubu metropolitan area in Japan. To that end, it first surveys the distribution of green spaces in the three metropolitan areas of Japan, especially the Chubu metropolitan area, using GIS digital maps. Based on this result, it conducts a GIS evaluation of the degree of sufficiency of public green spaces and arranges the result by distance belt from the center, so as to compare and examine each distance belt away from the center of the Chubu metropolitan area. Furthermore, after pointing out the areas with insufficient public green spaces based on this result, it proposes an improvement policy that can be introduced in the Chubu metropolitan area.
Abstract: This paper describes a code clone visualization method, called FC graph, and its implementation issues. Code clone detection tools usually show their results in a textual representation; when the results are large, software maintainers have difficulty understanding them. One approach to overcoming this situation is visualization of code clone detection results. A scatter plot is a popular approach to such visualization. However, it represents only one-to-one correspondence, and it is difficult to find correspondences of code clones across multiple files. FC graph represents correspondence among files, code clones and packages in Java. All nodes in an FC graph are positioned using a force-directed graph layout, which is dynamically recalculated to adjust the distances between nodes until they stabilize. We applied FC graph to some open source programs and visualized the results. In the authors' experience, FC graph is helpful for grasping correspondences of code clones across multiple files, as well as code clones within a file.
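A single iteration of the kind of force-directed layout FC graph relies on can be sketched as below; this is a generic spring-embedder step, not the paper's implementation, and the force constants and the tiny graph are illustrative.

```python
# One step of a generic force-directed layout: connected nodes attract
# (spring force), all node pairs repel (inverse-square-style force).
# Repeating layout_step until movement is small stabilizes the drawing.

def layout_step(pos, edges, k_attract=0.1, k_repel=0.5):
    """pos: {node: (x, y)}; edges: iterable of (u, v). Returns new positions."""
    force = {n: [0.0, 0.0] for n in pos}
    nodes = list(pos)
    # pairwise repulsion keeps unrelated nodes apart
    for i, u in enumerate(nodes):
        for v in nodes[i + 1:]:
            dx = pos[u][0] - pos[v][0]
            dy = pos[u][1] - pos[v][1]
            d2 = dx * dx + dy * dy or 1e-9  # avoid division by zero
            f = k_repel / d2
            force[u][0] += f * dx; force[u][1] += f * dy
            force[v][0] -= f * dx; force[v][1] -= f * dy
    # spring attraction pulls connected nodes (e.g. a file and its clones) together
    for u, v in edges:
        dx = pos[v][0] - pos[u][0]
        dy = pos[v][1] - pos[u][1]
        force[u][0] += k_attract * dx; force[u][1] += k_attract * dy
        force[v][0] -= k_attract * dx; force[v][1] -= k_attract * dy
    return {n: (pos[n][0] + fx, pos[n][1] + fy) for n, (fx, fy) in force.items()}
```

In an FC graph, file, clone and package nodes would all be laid out this way, so files sharing many clones end up near each other.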
Abstract: This paper proposes a simple model of economic geography within the Dixit-Stiglitz-Iceberg framework that may be used to analyze migration patterns among three cities. The cost–benefit tradeoffs affecting incentives for three types of migration, including echelon migration, are discussed. This paper develops a tractable, heterogeneous-agent, general equilibrium model, where agents share constant human capital, and explores the relationship between the benefits of echelon migration and gross human capital. Using Chinese numerical solutions, we study the manifestation of echelon migration and how it responds to changes in transportation cost and elasticity of substitution. Numerical results demonstrate that (i) there are positive relationships between a migration's benefit and wage ratios, (ii) there are positive relationships between gross human capital ratios and wage ratios of origin and destination, and (iii) we identify 13 varieties of human capital convergence among cities. In particular, this model predicts population shocks resulting from the processes of migration choice and echelon migration.
Abstract: The Random Oracle Model (ROM) is an effective method for measuring the practical security of cryptographic schemes. In this paper, we try to apply it to information hiding systems (IHS). Because an IHS has its own properties, the ROM must be modified if it is to be used for IHS. Firstly, we fully discuss why and how to modify each part of the ROM. The main changes include: 1) dividing the attacks that an IHS may suffer into two phases, and the attacks of each phase into several kinds; 2) distinguishing oracles and black-boxes clearly; 3) defining the oracle and the four black-boxes that the IHS uses; 4) proposing the formalized adversary model; and 5) giving the definition of the judge. Secondly, based on the ROM of IHS, security against the known original cover attack (KOCA-KOCA-security) is defined. Then, we give an actual information hiding scheme and prove that it is KOCA-KOCA-secure. Finally, we conclude the paper and propose open problems for further research.
Abstract: In wireless sensor networks (WSNs), the use of mobile sinks has been attracting more attention in recent times. Mobile sinks are a more effective means of balancing load, reducing the hotspot problem and elongating network lifetime. The sensor nodes in a WSN have limited power supply, computational capability and storage, and therefore reliability of continuous data delivery becomes a high priority in these networks. In this paper, we propose a Reliable Energy-efficient Data Dissemination (REDD) scheme for WSNs with multiple mobile sinks. In this strategy, the sink first determines the location of the source and then communicates directly with the source using geographic forwarding. Every forwarding node (FN) creates a local zone comprising some sensor nodes that can act as representatives of the FN when it fails. Analytical and simulation studies reveal significant improvement in energy conservation and reliable data delivery in comparison to existing schemes.
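The geographic-forwarding step mentioned above can be sketched as follows; this is a generic greedy rule (forward to the neighbor closest to the sink's known location), not REDD's full protocol, and the node names and coordinates are invented.

```python
# Greedy geographic forwarding sketch: a forwarding node hands the packet
# to whichever neighbor makes the most progress toward the sink. If no
# neighbor is closer than the current node, forwarding stalls (a "void");
# in REDD the local zone of representative nodes would take over instead.

import math

def next_hop(current, neighbors, sink):
    """current/sink: (x, y); neighbors: {name: (x, y)}.

    Return the neighbor strictly closer to sink than current, else None.
    """
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    best, best_d = None, dist(current, sink)
    for name, pos in neighbors.items():
        d = dist(pos, sink)
        if d < best_d:
            best, best_d = name, d
    return best
```

Returning None is the signal to fall back on a recovery mechanism, which is where the per-FN local zones described in the abstract would come in.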
Abstract: Real-time 3D applications have to guarantee interactive rendering speed, and the number of polygons that can be rendered is restricted by the performance of the graphics hardware and algorithms. Generally, rendering performance increases drastically when handling only the dynamic 3D models, which are much fewer than the static ones. Since the shapes and colors of static objects do not change while the viewing direction is fixed, their information can be reused. We render huge numbers of polygons that cannot be handled by conventional rendering techniques in real time by using a static-object image and merging it with the rendering result of the dynamic objects. Performance necessarily drops whenever the static-object image must be updated, which involves removing a static object that starts to move and re-rendering the other static objects overlapped by the moving ones. Based on the visibility of the object beginning to move, we can skip this updating process. As a result, we enhance rendering performance and reduce differences in rendering speed between frames. The proposed method renders a total of 200,000,000 polygons, consisting of 500,000 dynamic polygons and the rest static polygons, at about 100 frames per second.
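The static/dynamic split can be illustrated with a toy compositor: the static scene is "rendered" once into a cached image, and each frame only copies that cache and draws the dynamic objects over it. The framebuffer and rectangle "objects" below are stand-ins for real GPU rendering.

```python
# Toy illustration of caching the static layer and compositing dynamic
# objects over it each frame. An "object" is (x, y, width, height, color)
# painted into a row-major framebuffer of color codes.

def render_objects(width, height, objects):
    """Paint axis-aligned rectangles into a fresh framebuffer."""
    fb = [[0] * width for _ in range(height)]
    for x, y, w, h, color in objects:
        for row in range(y, min(y + h, height)):
            for col in range(x, min(x + w, width)):
                fb[row][col] = color
    return fb

def composite_frame(static_fb, dynamic_objects):
    """Copy the cached static image, then draw dynamic objects on top.

    The cache itself is left untouched, so it can be reused next frame
    as long as no static object starts moving.
    """
    frame = [row[:] for row in static_fb]
    h, w = len(frame), len(frame[0])
    for x, y, ww, hh, color in dynamic_objects:
        for row in range(y, min(y + hh, h)):
            for col in range(x, min(x + ww, w)):
                frame[row][col] = color
    return frame
```

The expensive path, re-running render_objects for the static set, is only needed when a static object starts to move, which is exactly the update the paper tries to skip when that object is not visible.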
Abstract: A geothermal power plant multiple simulator for operator training is presented. The simulator is designed to be installed in a wireless local area network and has the capacity to train one to six operators simultaneously, each with an independent simulation session. The sessions are supervised by a single instructor. The main parts of this multiple simulator are the instructor and operators' stations. At the instructor station, the instructor controls the simulation sessions, establishes training exercises and supervises each power plant operator individually. This station is hosted on a main personal computer (NS), and its main functions are: to set initial conditions, snapshots, malfunctions or faults; to monitor trends; and to display process and soft-panel diagrams. The operators, in turn, carry out their actions on the simulated power plant at the operators' stations, each of which is also hosted on a PC. The main software of the instructor and operator stations is executed on the same NS and displayed on the PCs through graphical Interactive Process Diagrams (IDP). The geothermal multiple simulator has been installed in the Geothermal Simulation Training Center (GSTC) of the Comisión Federal de Electricidad (Federal Commission of Electricity, CFE), Mexico, and is being used as part of the training courses for geothermal power plant operators.
Abstract: Steam distillation assisted by microwave extraction (SDAM), considered an accelerated extraction technique, is a combination of microwave heating and steam distillation performed at atmospheric pressure. SDAM has been compared with the same technique coupled with cryogrinding of the seeds (SDAM-CG). Isolation and concentration of volatile compounds are performed in a single stage for the extraction of essential oil from Cuminum cyminum seeds. The essential oils extracted by these two methods for 5 min were quantitatively (yield) and qualitatively (aromatic profile) dissimilar. These methods yield an essential oil with higher amounts of the more valuable oxygenated compounds, and allow substantial savings of costs in terms of time, energy and plant material. SDAM and SDAM-CG are green technologies and appear to be a good alternative for the extraction of essential oils from aromatic plants.
Abstract: Classification is one of the primary themes in computational biology. The accuracy of classification strongly depends on the quality of a dataset, and we need some method to evaluate this quality. In this paper, we propose a new graphical analysis method using the 'Membership-Deviation Graph (MDG)' for analyzing the quality of a dataset. The MDG represents the degree of membership and the deviations for instances of a class in the dataset. The result of MDG analysis is used for understanding specific features and for selecting the best feature for classification.
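The abstract does not give the MDG's exact formulas, so the following is only one plausible reading, sketched for a single numeric feature: an instance's deviation is its distance from the class mean in units of the class standard deviation, and its membership degree falls off with that deviation. All definitions here are assumptions for illustration.

```python
# Hedged sketch of per-instance membership and deviation for one feature
# of one class. The Gaussian-style membership function is an assumption,
# not the paper's definition.

import math

def class_stats(values):
    """Mean and (population) standard deviation of a class's feature values."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return mean, math.sqrt(var)

def deviation(value, mean, std):
    """Distance from the class mean in units of the class std deviation."""
    return abs(value - mean) / std if std else 0.0

def membership(value, mean, std):
    """Membership degree in [0, 1], highest at the class mean."""
    return math.exp(-0.5 * deviation(value, mean, std) ** 2)
```

Plotting (membership, deviation) pairs for all instances of a class would then give one axis-pair of the kind of graph the paper describes: a feature whose instances cluster at high membership and low deviation separates the class well.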
Abstract: The goal of this work is to improve the efficiency and the reliability of automatic artifact rejection, in particular from electroencephalographic (EEG) recordings. Artifact rejection is a key topic in signal processing. Artifacts are unwelcome signals that may occur during signal acquisition and that may alter the analysis of the signals themselves. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for the artifact extraction and on some higher-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper we enhance this technique by introducing Rényi's entropy. The performance of our method was tested by exploiting the Independent Component scalp maps; it was compared to the performance of the method in the literature and was shown to outperform it.
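The statistics named above have standard definitions, sketched below for a single independent component. The decision rule and thresholds are invented for illustration; only the kurtosis and Rényi entropy formulas themselves are standard.

```python
# Per-component statistics for artifact screening: excess kurtosis of the
# component's samples and Rényi entropy (order 2) of a discrete probability
# distribution over its amplitude histogram. The thresholds in is_artifact
# are illustrative, not from the paper.

import math

def excess_kurtosis(x):
    """Fourth standardized moment minus 3 (0 for a Gaussian)."""
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n
    m4 = sum((v - mean) ** 4 for v in x) / n
    return m4 / (m2 ** 2) - 3.0

def renyi_entropy(probs, alpha=2.0):
    """Rényi entropy H_a = log(sum p^a) / (1 - a), for alpha != 1."""
    s = sum(p ** alpha for p in probs if p > 0)
    return math.log(s) / (1.0 - alpha)

def is_artifact(component, probs, kurt_thresh=5.0, entropy_thresh=0.5):
    """Flag a component whose statistics look extreme (illustrative rule)."""
    return (excess_kurtosis(component) > kurt_thresh
            or renyi_entropy(probs) < entropy_thresh)
```

Eye blinks, for example, produce strongly peaked (high-kurtosis) components, while very regular artifacts yield low-entropy amplitude distributions; either extreme marks the component for removal before reconstructing the EEG.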
Abstract: The hydromagnetic flow of a Maxwell fluid past a vertical stretching sheet with thermophoresis is considered. The impact of a chemically reacting species on the flow is analyzed for the first time by using the homotopy analysis method (HAM). The h-curves for the flow boundary-layer equations are presented graphically. Several values of wall skin friction, heat and mass transfer are obtained and discussed.
Abstract: Significant changes in oil and gas drilling have emphasized the need to verify the integrity and reliability of drill stem components. Defects are inevitable in cast components, regardless of application; but if they go undetected, a severe defect could cause down-hole failure.
One such defect is shrinkage porosity. Castings with lower-level shrinkage porosity (CB levels 1 and 2) have scattered pores that do not occupy large volumes, so pressure testing and helium leak testing (HLT) are sufficient for qualifying the castings. However, castings with shrinkage porosity of CB level 3 and higher behave erratically under pressure testing and HLT, making these techniques insufficient for evaluating the castings' integrity.
This paper presents a case study to highlight how the radiography technique is much more effective than pressure testing and HLT.