Abstract: Electrical impedance imaging is a method of
reconstructing the spatial distribution of electrical conductivity inside a
subject. In this paper, a new method of electrical impedance imaging
using eddy currents is proposed. The eddy-current distribution in the
body depends on the conductivity distribution and the magnetic field
pattern. By changing the position of a magnetic core, a set of voltage
differences is measured with a pair of electrodes. This set of voltage
differences is used to reconstruct an image of the conductivity
distribution. A least-squares error minimization method is used as the
reconstruction algorithm, and a back-projection algorithm is used to
obtain two-dimensional images. Based on this principle, a measurement
system was developed and model experiments were performed
with a saline-filled phantom. The shape of each model in the
reconstructed image is similar to that of the corresponding model.
These experiments confirm that the proposed method is applicable
to the realization of electrical impedance imaging.
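As an illustration of the least-squares reconstruction step, the sketch below solves a small simulated version of the problem in Python; the sensitivity matrix `J`, its dimensions and the data are invented for illustration and are not the authors' actual forward model.

```python
import numpy as np

# Illustrative sketch: linear least-squares reconstruction.
# J is a hypothetical sensitivity matrix mapping a conductivity
# distribution sigma (n pixels) to measured voltage differences v
# (m core positions). All values here are made up.
rng = np.random.default_rng(0)
m, n = 12, 6                      # measurements, conductivity pixels
J = rng.normal(size=(m, n))       # assumed sensitivity matrix
sigma_true = rng.uniform(0.5, 2.0, size=n)
v = J @ sigma_true                # simulated voltage-difference data

# Least-squares estimate: minimize ||J sigma - v||^2
sigma_est, *_ = np.linalg.lstsq(J, v, rcond=None)
```

In this noiseless, overdetermined toy setting the estimate recovers the assumed conductivity exactly; a real reconstruction would add regularization to cope with noise and ill-conditioning.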
Abstract: As a result of the daily workflow in the design
development departments of companies, databases containing huge
numbers of 3D geometric models are generated. Given a
problem, engineers create CAD drawings based on their design
ideas and evaluate the performance of the resulting design, e.g. by
computational simulations. Usually, new geometries are built either
by utilizing and modifying sets of existing components or by adding
single newly designed parts to a more complex design.
The present paper addresses the two facets of acquiring
components from large design databases automatically and providing
a reasonable overview of the parts to the engineer. A unified
framework based on the topographic non-negative matrix
factorization (TNMF) is proposed which solves both aspects
simultaneously. First, on a given database meaningful components
are extracted into a parts-based representation in an unsupervised
manner. Second, the extracted components are organized and
visualized on square-lattice 2D maps. It is shown on the example of
turbine-like geometries that these maps efficiently provide a well-structured
overview of the database content and, at the same time,
define a measure of spatial similarity allowing easy access to and
reuse of components in the process of design development.
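The paper's TNMF itself is not reproduced here, but the parts-based decomposition it builds on can be illustrated with plain NMF using the classic multiplicative updates; the data matrix and component count below are made up.

```python
import numpy as np

# Illustrative sketch of plain non-negative matrix factorization (NMF)
# with multiplicative updates. The paper's TNMF adds a topographic
# (map) organization on top of this basic parts-based decomposition.
rng = np.random.default_rng(1)
V = rng.uniform(size=(20, 15))        # data matrix (e.g. vectorized shapes)
k = 4                                 # assumed number of parts/components
W = rng.uniform(size=(20, k)) + 0.1   # parts (basis) matrix
H = rng.uniform(size=(k, 15)) + 0.1   # per-item encodings

err0 = np.linalg.norm(V - W @ H)
for _ in range(200):
    # multiplicative updates keep W and H non-negative by construction
    H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
err1 = np.linalg.norm(V - W @ H)
```

The non-negativity constraint is what yields additive, parts-like components rather than the signed components of, say, PCA.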
Abstract: The aim of this paper is to explain what a multi-enterprise tie is, what evidence its analysis provides, and how the cooperation mechanism influences the establishment of a multi-enterprise tie. The study focuses on smaller businesses that are geographically dispersed and whose owners are learning to cooperate in an international environment. The empirical evidence obtained so far permits the following conclusions: the tie is not long-lasting, it has an end; opportunism is an opportunity to learn; the multi-enterprise tie is a space in which to learn about the cooperation mechanism; the local tie permits a businessman to alternate between competition and cooperation strategies; the disappearance of a tie is a learning experience for a businessman, diminishing the possibility of failure in the next tie; the cooperation mechanism tends to eliminate hierarchical relations; the multi-enterprise tie diminishes asymmetries and gives SMEs a better position when they negotiate with large companies; and the multi-enterprise tie has a positive impact on the local system. The empirical evidence was collected through the following instruments: direct observation at a business encounter attended by the businesses in 2003 (202 Mexican agro-industry SMEs); a survey applied in 2004 (129 businesses); a questionnaire applied in 2005 (86 businesses); field visits to the businesses during the period 2006-2008; and a telephone survey applied in 2008 (55 Mexican agro-industry SMEs).
Abstract: The problems associated with the wind predictions of the
WAsP model in complex terrain have been the target of several
studies over the last decade. In this paper, the influence of surrounding
orography on the accuracy of wind data analysis of a terrain is
investigated. For the case study, a site with complex surrounding
orography is considered. This site is located in Manjil, one of the
windiest cities of Iran. To obtain a precise evaluation of the wind regime
at the site, one year of wind data measurements from two meteorological
masts is used. To validate the results obtained from WAsP, a
cross-prediction between the masts is performed. The analysis
reveals that the WAsP model can estimate the wind speed behavior
accurately. In addition, the results show that this software can be used
for predicting the wind regime at flat sites with complex surrounding
orography.
Abstract: The Minimum Vertex Cover (MVC) problem is a classic
NP-complete graph optimization problem. In this paper an efficient
algorithm, called the Vertex Support Algorithm (VSA), is designed to
find the smallest vertex cover of a graph. The VSA is tested on a
large number of random graphs and DIMACS benchmark graphs, and
a comparative study of this algorithm with other existing methods
has been carried out. Extensive simulation results show that the VSA
can yield better solutions than the other algorithms found in the
literature for solving the minimum vertex cover problem.
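As a rough illustration of the vertex-support idea (a sketch of the general principle, not the authors' exact VSA), a greedy heuristic that repeatedly moves into the cover the vertex whose neighbours have the largest total degree can be written as:

```python
# Greedy vertex-cover heuristic based on "support":
# support(u) = sum of the degrees of u's neighbours.
# Repeatedly take the vertex of maximum support into the cover
# and delete its incident edges, until no edges remain.
def greedy_support_cover(edges):
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    cover = set()
    while any(adj[u] for u in adj):
        support = {u: sum(len(adj[w]) for w in adj[u])
                   for u in adj if adj[u]}
        best = max(support, key=support.get)
        cover.add(best)
        for w in list(adj[best]):          # remove edges incident to best
            adj[w].discard(best)
        adj[best].clear()
    return cover

edges = [(0, 1), (0, 2), (0, 3), (1, 2), (3, 4)]
cover = greedy_support_cover(edges)
```

Like all polynomial-time heuristics for this NP-complete problem, this returns a valid cover that is not guaranteed to be minimum.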
Abstract: In this study, we examined gender differences in: (1) a
flexible remembering task, which asked for episodic memory decisions
at an item-specific versus category-based level, and (2) the retrieval
specificity of autobiographical memory during free recall.
Differences favouring women were found on both measures.
Furthermore, a significant association was observed, across gender
groups, between level of specificity in the autobiographical memory
interview and sensitivity to gist on the flexible remembering task.
These results suggest that similar cognitive processes may partially
contribute to both the ability for specific autobiographical recall and
the capacity for inhibition of gist information on the flexible
remembering task.
Abstract: The primary purpose of this paper is to develop a GIS interface for estimating sequences of stream-flows at ungauged stations based on known flows at gauged stations. The integrated GIS interface is composed of three major steps. The first step characterizes precipitation using statistical analysis: a multiple linear regression equation is constructed to obtain the long-term mean daily flow at ungauged stations, with mean daily flow and drainage area as the independent variables. Traditionally, mean flow data are generated using the Thiessen polygon method; however, the method for obtaining mean flow data can be selected by the user, such as Kriging, IDW (Inverse Distance Weighted) and Spline methods as well as the traditional ones. In the second step, the flow duration curve (FDC) at an ungauged station is computed from the FDCs at gauged stations, and the mean annual daily flow is computed by a spatial interpolation algorithm. The third step obtains watershed/topographic characteristics, which are the most important factors governing stream-flows. Finally, the simulated daily flow time series are compared with the observed time series. The results using the integrated GIS interface agree closely with the observations. Also, the relationship between the topographic/watershed characteristics and the stream-flow time series is highly correlated.
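The regression step can be illustrated with a minimal sketch; the predictor variables, coefficients and data below are hypothetical and are not the paper's fitted model.

```python
import numpy as np

# Illustrative sketch: multiple linear regression relating mean daily
# flow at ungauged stations to basin descriptors. The descriptors
# (drainage area, precipitation), coefficients and data are made up.
rng = np.random.default_rng(2)
area = rng.uniform(50, 500, size=10)          # drainage area (km^2)
precip = rng.uniform(800, 1600, size=10)      # mean annual precipitation (mm)
flow = 0.02 * area + 0.01 * precip + 3.0      # synthetic "observed" mean flow

# Fit flow = b0*area + b1*precip + b2 by least squares
X = np.column_stack([area, precip, np.ones_like(area)])
coef, *_ = np.linalg.lstsq(X, flow, rcond=None)

def predict_flow(a, p):
    """Estimate mean daily flow at an ungauged station."""
    return coef[0] * a + coef[1] * p + coef[2]
```

With noiseless synthetic data the fit recovers the generating coefficients; real gauged-station data would of course carry scatter around the regression line.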
Abstract: This paper presents a novel method for prediction of
the mechanical behavior of proximal femur using the general
framework of the quantitative computed tomography (QCT)-based
finite element analysis (FEA). A systematic imaging and modeling
procedure was developed for reliable correspondence between the
QCT-based FEA and the in-vitro mechanical testing. A specially-designed
holding frame was used to define and maintain a unique
geometrical reference system during the analysis and testing. The
QCT images were directly converted into voxel-based 3D finite
element models for linear and nonlinear analyses. The equivalent
plastic strain and the strain energy density measures were used to
identify the critical elements and predict the failure patterns. The
samples were destructively tested using a specially-designed gripping
fixture (with five degrees of freedom) mounted within a universal
mechanical testing machine. Very good agreement was found
between the experimental and the predicted failure patterns and the
associated load levels.
Abstract: Formal specification languages are widely used
for system specification and testing. Highly critical systems, such as
real-time systems, avionics, and medical systems, are represented
using formal specification languages. Formal specification-based
testing is mostly performed using black-box approaches, thus
testing only the set of inputs and outputs of the system. A formal
specification language such as VDM++ can also be used for white-box
testing, as it provides constructs comparable to any other high-level
programming language. In this work, we perform data and control
flow analysis of VDM++ class specifications. The proposed work is
illustrated with a SavingAccount example.
Abstract: Animated graphs make a good impression when
presenting information. However, not many people are able to produce them, because generating an animated graph requires some technical skill. This work presents the Content
Management System with Animated Graph (CMS-AG), a web-based system enabling users to produce effective and interactive
graphical reports in a short time. It supports three levels of user authentication and provides profile updating, account management, template management, graph management, and change tracking. The system development applies an incremental development approach, object-oriented concepts and Web programming technologies. The design architecture promotes a new reporting technology; it also helps users cut unnecessary expenses, save time and learn new things at different user levels. In this paper, the developed system is described.
Abstract: Image registration plays an important role in the
diagnosis of dental pathologies such as dental caries, alveolar bone
loss and periapical lesions. This paper presents a new wavelet-based
algorithm for registering noisy and poor-contrast dental X-rays.
The proposed algorithm has two stages. The first is a preprocessing
stage that removes noise from the X-ray images using a Gaussian
filter. The second is a geometric transformation stage; the
proposed work uses two levels of affine transformation, and wavelet
coefficients are correlated instead of gray values. The algorithm has been
applied to a number of pre- and post-RCT (root canal treatment)
periapical radiographs. Root mean square error (RMSE) and
correlation coefficients (CC) are used for quantitative evaluation.
The proposed technique outperforms a conventional multiresolution-strategy-based
image registration technique as well as manual registration.
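The two evaluation measures named above can be sketched directly; the tiny example images are made up for illustration.

```python
import numpy as np

# Illustrative sketch of the two evaluation measures: root mean
# square error (RMSE) and the Pearson correlation coefficient (CC)
# between a reference image and a registered image.
def rmse(a, b):
    return np.sqrt(np.mean((a.astype(float) - b.astype(float)) ** 2))

def cc(a, b):
    a = a.astype(float).ravel()
    b = b.astype(float).ravel()
    return np.corrcoef(a, b)[0, 1]

# Hypothetical 2x2 "images"
ref = np.array([[10, 20], [30, 40]])
reg = np.array([[12, 18], [33, 37]])
```

A perfect registration gives RMSE 0 and CC 1; in practice the two measures are reported together because CC is insensitive to global intensity shifts that RMSE penalizes.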
Abstract: Alpinia galanga, generally known as
greater galangal, is a rhizome selected here for the isolation of new
constituents accountable for its various therapeutic activities. The present study is
intended to isolate a glycoside from Alpinia galanga rhizomes. A
methanolic extract of Alpinia galanga was subjected to column chromatography and eluted
with ethyl acetate-methanol (99:1) to isolate the compound β-Sitosterol
Diarabinoside. Herein, the isolation and structural elucidation of the new
compound are described. Chemical investigation of the methanolic extract
of Alpinia galanga rhizomes furnished a new compound, β-
Sitosterol Diarabinoside. The IR, NMR and mass spectral investigations of the
isolated compound confirmed its structure as β-Sitosterol
Diarabinoside, which is isolated for the first time from a medicinal
plant or any synthetic source.
Abstract: A Fuzzy Cognitive Map (FCM) is a causal graph that shows the relations between the essential components of a complex system. Experts who are familiar with the system components and their relations can generate a related FCM. A major gap arises when human experts cannot produce the FCM, or when no expert is available to produce it; a new mechanism is therefore needed to bridge this gap. In this paper, a novel learning method is proposed to construct the causal graph from historical data using a metaheuristic, namely Tabu Search (TS). The efficiency of the proposed method is shown by comparing its results on some numerical examples with those of other methods.
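A minimal sketch of FCM inference may clarify what a learned causal graph is used for; the weight matrix and activations below are hypothetical, and the Tabu Search that would fit the weights to historical data is not shown.

```python
import numpy as np

# Illustrative sketch of FCM inference: concept activations are
# propagated through the weighted causal links and squashed by a
# sigmoid. Learning an FCM from data means searching (e.g. by Tabu
# Search) for a weight matrix W whose simulated activation sequences
# match the historical ones.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fcm_step(x, W):
    # x: current concept activations; W[i, j]: causal weight from j to i
    return sigmoid(W @ x)

W = np.array([[ 0.0, 0.6, -0.4],
              [ 0.5, 0.0,  0.3],
              [-0.2, 0.7,  0.0]])   # hypothetical causal weights
x = np.array([0.2, 0.8, 0.5])       # hypothetical initial activations
for _ in range(10):
    x = fcm_step(x, W)
```

The sigmoid keeps every concept activation inside (0, 1), so repeated updates trace a bounded trajectory that a learning algorithm can compare against recorded data.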
Abstract: This paper presents a rule-based text-to-speech
(TTS) synthesis system for Standard Malay (SM), named SMaTTS. The
proposed system uses a sinusoidal method and some pre-recorded
wave files to generate speech. The use of a phone
database significantly decreases the amount of computer memory
used, making the system very light and embeddable. The
overall system comprises two phases. The first is the Natural Language
Processing (NLP) phase, consisting of the high-level processing of text
analysis, phonetic analysis, text normalization and a morphophonemic
module. This module was designed specially for SM to overcome a
few problems in defining the rules of the SM orthography system before
the text can be passed to the DSP module. The second phase is the Digital
Signal Processing (DSP) phase, which operates on the low-level process of
speech waveform generation. An intelligible and
adequately natural-sounding formant-based speech synthesis system
with a light and user-friendly Graphical User Interface (GUI) is
introduced. A Standard Malay phoneme set and an
inclusive phone database have been constructed carefully for
this phone-based speech synthesizer. By applying generative
phonology, comprehensive letter-to-sound (LTS) rules and a
pronunciation lexicon have been developed for SMaTTS. For the
evaluation tests, a Diagnostic Rhyme Test (DRT) word list was
compiled and several experiments were performed to evaluate
the quality of the synthesized speech by analyzing the Mean Opinion
Score (MOS) obtained. The overall performance of the system, as
well as the room for improvement, is thoroughly discussed.
Abstract: The present study summarizes the control of Vibrio
alginolyticus infection in hatchery-reared clownfish, Amphiprion
sebae, with an extract of the mangrove plant Avicennia marina.
Fishes with visible symptoms of hemorrhagic spots were chosen, and
the genomic DNA of the causative bacterium was isolated and
sequenced based on the 16S rDNA gene. The in vitro assay revealed that
a fraction of the A. marina leaf extract eluted with ethyl acetate:
methanol (6:4) showed high activity (28 mm) at a concentration of 125 μg/ml.
Feeding about 4 % of the fraction along with live V.
alginolyticus significantly decreased the cumulative mortality
(P
Abstract: The intention of this study is to assess the feasibility
of optical coherence tomography (OCT) for biometric recognition.
OCT is based on optical signal acquisition and
processing and has micrometer resolution. In this study,
we used porcine skin to verify the abovementioned approach.
Porcine tissue is well acknowledged for its structural and
immunohistochemical similarity to human skin, so it is
suitable as an investigational specimen for pre-clinical trials. For this
purpose, it was tattooed by a tattoo machine with tattoo pigment.
We detected the pattern of the tattooed skin by OCT as a function of
needle speed. The result was consistent with the histology images.
This result showed that OCT is effective for examining tattooed
skin sections noninvasively. It might also be able to identify
morphological changes inside the skin.
Abstract: This paper describes a combined mathematical-graphical
approach for optimum tool path planning in order to
improve machining efficiency. A methodology has been used that
stabilizes machining operations by adjusting material removal rate in
pocket milling operations while keeping cutting forces within limits.
This increases the life of cutting tool and reduces the risk of tool
breakage, machining vibration, and chatter. Case studies reveal
that applying this approach can result in a slight increase
in machining time; however, a considerable reduction in tooling cost,
machining vibration, noise and chatter can be achieved, in addition to
producing a better surface finish.
Abstract: The present report describes the characteristics of the
damage to, and behavior of, reinforced concrete buildings under
tsunami action. The discussion is based on a field damage survey in
selected cities located on the coast of the zone affected by the Great
East Japan Earthquake of March 11, 2011. This earthquake is the most
powerful known earthquake to have hit Japan, with a magnitude of 9.0 and
an epicenter located 129 km off the coast of Sendai city. The
earthquake triggered a destructive tsunami with a run-up height of up to
40 meters that mainly affected cities located on the Pacific Ocean coast of
the Tohoku region (the north-east region of Japan). Reinforced concrete
buildings in general resisted the tsunami without collapse; however,
non-structural elements like panels and ceilings were severely
damaged. The analysis of the damage has made it possible to understand the
behavior of RC buildings under tsunami attack, and to establish
recommendations for their use as tsunami refuges
in places where the natural topography makes it impossible to reach hilltops
or other safer places.
Abstract: VoIP networks, as an alternative to the traditional PSTN system, have been implemented in a wide variety of structures
with multiple protocols, codecs, and software- and hardware-based
distributions. The use of cryptographic techniques lets users have secure communication, but the achievable throughput as well as the QoS parameters are affected by the algorithm used. This
paper analyzes VoIP throughput and QoS parameters with
different commercial encryption methods. The measurement-based
approach uses lab scenarios to simulate LAN and WAN
environments. Security mechanisms such as TLS, SIAX2, SRTP,
IPSEC and ZRTP are analyzed with μ-LAW and GSM codecs.
Abstract: Both image steganography and image encryption have
advantages and disadvantages. Steganography allows us to hide a
desired image containing confidential information in a cover or
host image, while image encryption transforms the desired image
into a non-readable, incomprehensible form. Encryption
methods are usually much more robust than steganographic ones;
however, they have high visibility and can easily provoke attackers,
since it is usually obvious from an encrypted image that
something is hidden. Combining steganography and
encryption covers both of their weaknesses and therefore
increases security. In this paper an image encryption method
based on sinc-convolution, using an encryption key of 128-bit
length, is introduced. The encrypted image is then hidden in a
host image using a modified version of the JSteg steganography
algorithm. This method can be applied to almost all image formats,
including TIF, BMP, GIF and JPEG. The experimental results show
that our method is able to hide a desired image with high security and
low visibility.
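A minimal sketch of the LSB-embedding idea behind JSteg might look as follows; note that real JSteg embeds in the quantized JPEG DCT coefficients (skipping values 0 and 1), whereas for a self-contained sketch the bits are hidden in the least significant bits of plain 8-bit pixel values.

```python
import numpy as np

# Simplified illustration of LSB embedding, the idea underlying JSteg.
# The host image and secret bits below are made up.
def embed(host, bits):
    stego = host.copy().ravel()
    for i, b in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | b   # overwrite the LSB with one bit
    return stego.reshape(host.shape)

def extract(stego, n):
    # read back the first n least significant bits
    return [int(v) & 1 for v in stego.ravel()[:n]]

host = np.array([[52, 55], [61, 66]], dtype=np.uint8)  # toy 2x2 "image"
secret = [1, 0, 1, 1]
stego = embed(host, secret)
```

Each pixel changes by at most 1 gray level, which is why LSB-style embedding has low visibility; combining it with prior encryption of the payload, as the paper proposes, means an extracted bitstream is still unreadable without the key.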