Abstract: We have developed a microfluidic device system for the continuous production of nanoparticles, and we have clarified the relationship between the mixing performance of the reactors and the particle size. First, we evaluated the mixing performance of the reactors by carrying out the Villermaux–Dushman reaction and determined the experimental conditions for producing AgCl nanoparticles. Next, we produced AgCl nanoparticles and evaluated both the mixing performance and the particle size. We found that as the mixing performance improves, the size of the produced particles decreases and the particle size distribution becomes sharper. Using the microfluidic device with the best mixing performance among the three reactors tested in this study, we produced AgCl nanoparticles with a size of 86 nm; the coefficient of variation (Cv) of their size distribution was 26.1%.
Abstract: A new concept for long-term reagent storage for Lab-on-a-Chip (LoC) devices is described. Here we present a polymer multilayer stack with integrated stick packs for long-term storage of several liquid reagents, which are necessary for many diagnostic applications. Stick packs are widely used in the packaging industry for storing solids and liquids for long periods. The storage concept fulfills two main requirements: first, long-term storage of reagents in stick packs without significant losses or interaction with the surroundings; second, on-demand release of the liquids, realized by pushing a membrane against the stick pack with pneumatic pressure. This concept enables long-term on-chip storage of liquid reagents at room temperature and allows easy implementation in different LoC devices.
Abstract: The purpose of this research was to determine the role of the immunogenic 49 kDa protein from V. alginolyticus in initiating the expression of MHC class II molecules in receptors of Cromileptes altivelis. The method was in vivo experimental research in which the immunogenic 49 kDa protein from V. alginolyticus was tested on Cromileptes altivelis (250-300 g) using three boosters, with the immunogenic protein injected intramuscularly. The response of the expressed MHC molecules was visualized using immunocytochemistry and SEM. The results indicated that the immunogenic 49 kDa adhesin of V. alginolyticus could trigger the expression of MHC class II on the grouper receptor, as proven by immunocytochemical staining and by SEM with anti-MHC antibody labeling (anti-mouse). This visible expression is based on the binding between the antigen epitopes and the anti-MHC antibody at the receptor. Using immunocytochemistry, the intracellular MHC response to in vivo induction by the immunogenic adhesin of V. alginolyticus was demonstrated.
Abstract: If price and quantity are the fundamental building blocks of any theory of market interactions, the importance of trading volume in understanding the behavior of financial markets is clear. However, while many economic models of financial markets have been developed to explain the behavior of prices (predictability, variability, and information content), far less attention has been devoted to explaining the behavior of trading volume. In this article, we hope to expand our understanding of trading volume by developing a new measure of herding behavior based on the cross-sectional dispersion of volume betas. We apply our measure to the Toronto Stock Exchange using monthly data from January 2000 to December 2002. Our findings show that the herding phenomenon consists of three essential components: stationary herding, intentional herding, and feedback herding.
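The dispersion-of-volume-betas idea can be illustrated with a minimal sketch. The paper's exact estimator is not given here, so the OLS volume-beta regression, the synthetic monthly data, and the use of the sample standard deviation as the dispersion statistic are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly log trading volumes: a market series and 20 stocks
# (illustrative data only, not the Toronto Stock Exchange sample).
T, N = 36, 20
market_vol = rng.normal(0.0, 1.0, T)
betas_true = rng.normal(1.0, 0.3, N)
stock_vol = market_vol[:, None] * betas_true + rng.normal(0.0, 0.5, (T, N))

def volume_beta(stock, market):
    """OLS slope of a stock's volume on market volume (its 'volume beta')."""
    x = market - market.mean()
    y = stock - stock.mean()
    return (x @ y) / (x @ x)

betas = np.array([volume_beta(stock_vol[:, i], market_vol) for i in range(N)])

# Herding measure: cross-sectional dispersion of the volume betas.
# Lower dispersion -> betas cluster toward the market -> stronger herding.
herding = betas.std(ddof=1)
print(f"cross-sectional dispersion of volume betas: {herding:.3f}")
```

In this reading, herding is detected as the betas collapsing toward a common value over time, so the statistic would be tracked period by period rather than computed once.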
Abstract: In this paper, we propose an algorithm to compute initial cluster centers for K-means clustering. The data in a cell are partitioned using a cutting plane that divides the cell into two smaller cells. The plane is perpendicular to the data axis with the highest variance and is designed to reduce the sum of squared errors of the two cells as much as possible while at the same time keeping the two cells as far apart as possible. Cells are partitioned one at a time until the number of cells equals the predefined number of clusters, K. The centers of the K cells become the initial cluster centers for K-means. The experimental results suggest that the proposed algorithm is effective, converging to better clustering results than those of the random initialization method. The research also indicates that the proposed algorithm greatly improves the likelihood of every cluster containing some data.
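A minimal sketch of this style of initialization, under simplifying assumptions: the cell with the largest sum of squared errors is split next, and the cutting plane is placed at the mean of the highest-variance axis rather than at the paper's SSE-optimal position:

```python
import numpy as np

def initial_centers(X, k):
    """Cutting-plane initialization sketch: repeatedly split the cell with
    the largest SSE along its highest-variance axis (cut at the mean) until
    there are k cells; the cell means become the initial centers."""
    cells = [X]
    while len(cells) < k:
        # choose the cell with the largest sum of squared errors
        sse = [((c - c.mean(axis=0)) ** 2).sum() for c in cells]
        cell = cells.pop(int(np.argmax(sse)))
        axis = int(np.argmax(cell.var(axis=0)))   # highest-variance axis
        cut = cell[:, axis].mean()                # cutting-plane position
        cells.append(cell[cell[:, axis] <= cut])
        cells.append(cell[cell[:, axis] > cut])
    return np.array([c.mean(axis=0) for c in cells])

rng = np.random.default_rng(1)
# three well-separated 2-D blobs (synthetic data)
X = np.vstack([rng.normal(m, 0.3, (50, 2)) for m in (0.0, 3.0, 6.0)])
centers = initial_centers(X, 3)
print(centers)
```

These centers would then be passed to a standard K-means routine as its starting point in place of random initialization.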
Abstract: In most cases, natural disasters make it necessary to evacuate people. The quality of evacuation management is dramatically improved by the use of information provided by decision support systems, which become indispensable in large-scale evacuation operations. This paper presents a best-practice case study. In November 2007, officers from the "Crisana" Emergency Situations Inspectorate of Bihor County, Romania, participated in a cross-border evacuation exercise in which 700 people were evacuated from the Netherlands to Belgium. One of the main objectives of the exercise was to test four different decision support systems. Afterwards, based on that experience, a software system called TEVAC (Trans-Border Evacuation) was developed in house by the experts of this institution. This original software system was successfully tested in September 2008 during the international exercise EU-HUROMEX 2008, whose scenario involved the real evacuation of 200 people from Hungary to Romania. Based on the lessons learned and the results, the TEVAC software has been used since April 2009 by all Emergency Situations Inspectorates across Romania.
Abstract: In the present study, heterogeneous and homogeneous gas flow dispersion models were developed for the simulation and optimisation of a large-scale catalytic slurry reactor for the direct synthesis of dimethyl ether (DME) from syngas and CO2 operating in the churn-turbulent regime. In the heterogeneous gas flow model, the gas phase was distributed into two bubble phases, small and large, whereas in the homogeneous model the gas phase was distributed into a single large bubble phase. The results indicated that the heterogeneous gas flow model agreed more closely with experimental pilot plant data than the homogeneous one.
Abstract: A major requirement for Grid application developers is ensuring the performance and scalability of their applications. Predicting the performance of an application demands an understanding of its specific features. This paper discusses performance modeling and prediction of multi-agent based simulation (MABS) applications on the Grid. An experiment conducted using a synthetic MABS workload identifies the key features to be included in the performance model. The results obtained from the experiment show that the prediction model developed for the synthetic workload can be used as a guideline to estimate the performance characteristics of real-world simulation applications.
Abstract: The purposes of this paper are to (1) promote excellence in computer science by suggesting a cohesive, innovative approach to fill well-documented deficiencies in current computer science education, (2) justify (using the authors' and others' anecdotal evidence from both the classroom and the real world) why this approach holds great potential to successfully eliminate the deficiencies, and (3) invite other professionals to join the authors in proof-of-concept research. The authors' experiences, though anecdotal, strongly suggest that a new approach involving visual modeling technologies should allow computer science programs to retain a greater percentage of prospective and declared majors as students become more engaged learners, more successful problem-solvers, and better prepared as programmers. In addition, the graduates of such computer science programs will make greater contributions to the profession as skilled problem-solvers. Instead of wearily re-memorizing code as they move to the next course, students will have the problem-solving skills to think and work in more sophisticated and creative ways.
Abstract: Although innovative solutions in the field of sustainable development have been sought worldwide by environmental groups, academia, governments, and companies for many years, citizens and communities have recently emerged as a new group taking an increasingly active role in this field. Many scholars have called for more research on the role of communities and community innovation in sustainable development. This paper responds to those calls. We first summarize a comprehensive set of innovation principles. Then, we conduct a qualitative cross-case study, comparing three community innovation cases in three different areas of sustainable development according to the innovation principles. Finally, we summarize the case comparison and discuss the implications for sustainable development. A unified role model and an innovation distribution map of community innovation are developed to better understand community innovation in sustainable development.
Abstract: Measuring competitiveness between countries or regions is an important topic of many economic analyses and scientific papers. In the European Union (EU), there is no mainstream approach to evaluating and measuring competitiveness; many opinions and methods exist for measuring and evaluating competitiveness between states or regions at the national and European levels. The methods differ in the structure of the competitiveness indicators used and in the ways they are processed. The aim of the paper is to analyze the main sources of competitive potential of the EU Member States with the help of factor analysis (FA) and to classify the EU Member States into homogeneous units (clusters) according to the similarity of selected indicators of competitiveness factors by cluster analysis (CA) in the reference years 2000 and 2011. The theoretical part of the paper is devoted to the fundamentals of competitiveness and to the methodology of the FA and CA methods. The empirical part deals with the evaluation of competitiveness factors in the EU Member States and the cluster-based comparison of the evaluated countries.
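One possible shape of the FA-then-CA pipeline, sketched on synthetic data. The indicator set, the principal-component factor extraction with the Kaiser criterion, and the plain k-means grouping are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic competitiveness indicators for 27 "countries" (illustrative only;
# real inputs would be GDP per capita, R&D spending, employment rate, etc.).
n_countries, n_indicators = 27, 8
latent = rng.normal(size=(n_countries, 2))          # two underlying factors
loadings = rng.normal(size=(2, n_indicators))
X = latent @ loadings + rng.normal(0.0, 0.3, (n_countries, n_indicators))

# Factor analysis via principal components of the correlation matrix,
# retaining factors with eigenvalue > 1 (Kaiser criterion).
Z = (X - X.mean(axis=0)) / X.std(axis=0)
eigval, eigvec = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
order = np.argsort(eigval)[::-1]
keep = order[eigval[order] > 1.0]
scores = Z @ eigvec[:, keep]                        # factor scores

# Cluster analysis: group countries by their factor scores with plain k-means.
k = 3
centers = scores[rng.choice(n_countries, k, replace=False)]
for _ in range(50):
    dist = ((scores[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    labels = dist.argmin(axis=1)
    centers = np.array([scores[labels == j].mean(axis=0)
                        if (labels == j).any() else centers[j]
                        for j in range(k)])

print("factors retained:", len(keep),
      "| cluster sizes:", np.bincount(labels, minlength=k))
```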
Abstract: Using a neural network, we model an unknown function f for given input-output data pairs. The connection strength of each neuron is updated through learning. Repeated simulations of a crisp neural network produce different values of the weight factors, which are directly affected by changes in different parameters. We propose that for each neuron in the network, quasi-fuzzy weight sets (QFWS) can be obtained using repeated simulation of the crisp neural network. Such fuzzy weight functions may be applied where we have multivariate crisp input that needs to be adjusted after iterative learning, as in claim amount distribution analysis. Since real data are subject to noise and uncertainty, QFWS may help simplify such complex problems. Furthermore, these QFWS provide a good initial solution for training fuzzy neural networks with reduced computational complexity.
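A rough sketch of how repeated crisp simulations could yield a quasi-fuzzy weight set. The single linear neuron, bootstrap resampling as the source of run-to-run variation, and the triangular min/median/max summary are all assumptions for illustration, not the paper's construction:

```python
import numpy as np

def train_neuron(X, y, seed, epochs=300, lr=0.05):
    """Gradient-descent training of one linear neuron on a bootstrap
    resample of the data, starting from random weights."""
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, len(y), len(y))       # bootstrap resample
    Xb, yb = X[idx], y[idx]
    w = rng.normal(size=X.shape[1])
    for _ in range(epochs):
        w -= lr * Xb.T @ (Xb @ w - yb) / len(yb)
    return w

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(0.0, 0.2, 100)  # noisy target

# Repeated crisp simulations: each run yields slightly different weights.
runs = np.array([train_neuron(X, y, seed) for seed in range(30)])

# Quasi-fuzzy weight set per connection: a triangular (min, median, max)
# membership summary of the spread of the crisp weights across runs.
qfws = np.stack([runs.min(axis=0), np.median(runs, axis=0), runs.max(axis=0)])
print("QFWS (min / median / max) per weight:\n", qfws)
```

The triangular summaries could then seed the fuzzy weights of a fuzzy neural network, as the abstract suggests.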
Abstract: The notion of the Next Generation Network (NGN) is based on the network convergence concept, which refers to the integration of services (such as IT and communication services) over the IP layer. As the most popular implementation of Service Oriented Architecture (SOA), Web Services technology is the established basis for service integration. In this paper, we present a platform to deliver communication services as web services. We also implement a sample service to show the simplicity of building composite web and communication services on this platform. A Service Logic Execution Environment (SLEE) is used to implement the communication services. The proposed architecture conforms to SOA and can also be integrated with an Enterprise Service Bus to form the basis of an NGN Service Delivery Platform (SDP).
Abstract: With advances in wireless networking, IEEE 802.16 WiMAX technology has been widely deployed for applications such as "last mile" broadband service, cellular backhaul, and high-speed enterprise connectivity. As a result, the military has employed WiMAX for many years as a high-speed wireless data-link connection because of its point-to-multipoint and non-line-of-sight (NLOS) capability. However, the risk of using WiMAX is a critical factor in some sensitive areas of military application, especially in ammunition manufacturing such as solid propellant rocket production. US DoD policy states that the following certification requirements must be met for WiMAX: electromagnetic environmental effects (E3) and Hazards of Electromagnetic Radiation to Ordnance (HERO). This paper discusses the recommended power densities and safe separation distance (SSD) for HERO in WiMAX systems deployed in solid propellant rocket production. This research found that WiMAX is safe to operate in close proximity to the rocket production facility, based on the AF Guidance Memorandum immediately changing AFMAN 91-201.
Abstract: The distribution, enrichment, and accumulation of zinc (Zn) in the sediments of the Kaohsiung Ocean Disposal Site (KODS), Taiwan, were investigated. Sediment samples from two stations outside the disposal site and nine disposal stations in the KODS were collected quarterly in 2009 and characterized for Zn, aluminum, organic matter, and grain size. Results showed that the mean Zn concentrations varied from 48 mg/kg to 456 mg/kg. Results from the enrichment factor (EF) and geo-accumulation index (Igeo) analyses imply that the sediments collected from the KODS exhibit moderate to moderately severe enrichment and none to medium accumulation of Zn, respectively. However, the potential ecological risk index indicates that the sediment poses low potential ecological risk. The EF, Igeo, and Zn concentrations at the disposal stations were slightly higher than those outside the disposal site, indicating that the centers of the disposal area may be affected by the disposal of harbor dredged sediments.
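The EF and Igeo indices mentioned above are commonly computed as EF = (Zn/Al)_sample / (Zn/Al)_background and Igeo = log2(C_n / (1.5 B_n)). A small sketch with hypothetical background values (the background concentrations below are assumptions, not values from the study):

```python
import math

def enrichment_factor(zn_sample, al_sample, zn_background, al_background):
    """EF = (Zn/Al)_sample / (Zn/Al)_background, with Al as the
    conservative reference element."""
    return (zn_sample / al_sample) / (zn_background / al_background)

def geoaccumulation_index(zn_sample, zn_background):
    """Igeo = log2(Cn / (1.5 * Bn)); the factor 1.5 absorbs lithogenic
    variability in the background value Bn."""
    return math.log2(zn_sample / (1.5 * zn_background))

# Hypothetical inputs: Zn in mg/kg, Al in %; background values assumed.
ef = enrichment_factor(456, 6.5, 80, 7.0)
igeo = geoaccumulation_index(456, 80)
print(f"EF = {ef:.2f}, Igeo = {igeo:.2f}")
```

Published EF and Igeo class boundaries (e.g. Igeo > 1 for moderate accumulation) are then used to translate these numbers into the qualitative categories quoted in the abstract.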
Abstract: Artifact rejection plays a key role in many signal processing applications. Artifacts are disturbances that can occur during signal acquisition and that can alter the analysis of the signals themselves. Our aim is to remove artifacts automatically, in particular from electroencephalographic (EEG) recordings. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for artifact extraction and on higher-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper we enhance this technique by proposing a new method based on Rényi's entropy. The performance of our method was tested and compared to that of the method in the literature, and the former proved to outperform the latter.
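The Rényi-entropy marker can be sketched as follows. The ICA step is omitted; the histogram-based entropy estimate, the order alpha = 2, and the synthetic blink-like component are assumptions for illustration:

```python
import numpy as np

def renyi_entropy(x, alpha=2, bins=64):
    """Rényi entropy of order alpha from a histogram estimate of the pdf:
    H_alpha = log(sum(p_i ** alpha)) / (1 - alpha)."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / len(x)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

rng = np.random.default_rng(4)
# Stand-ins for independent components after ICA: ongoing EEG-like activity
# vs. a spiky, blink-like artifact component (both synthetic).
eeg_like = rng.normal(size=5000)
artifact = rng.normal(size=5000)
artifact[::250] += 12.0                     # sparse high-amplitude spikes

h_eeg = renyi_entropy(eeg_like)
h_art = renyi_entropy(artifact)
print(f"Renyi entropy  eeg-like: {h_eeg:.2f}  artifact: {h_art:.2f}")
# Peaky, outlier-heavy components yield markedly lower Rényi entropy,
# which is the kind of marker used to flag components for rejection.
```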
Abstract: The adsorption properties of CO and H2 on an iron-based catalyst with added Zr and Ni were investigated using temperature-programmed desorption. It was found that on the carburized iron-based catalysts, CO existed in both molecular and dissociative states. The addition of Zr favored the molecular-state adsorption of CO on the iron-based catalyst, whereas the presence of Ni was beneficial to the dissociative adsorption of CO. On H2-reduced catalysts, hydrogen mainly adsorbs on surface iron sites and surface oxide sites. On CO-reduced catalysts, hydrogen probably existed as the most stable CH and OH species. The addition of Zr did not favor the dissociative adsorption of hydrogen on the iron-based catalyst, whereas the presence of Ni promoted it.
Abstract: The in-phase/quadrature (I/Q) amplitude and phase imbalance effects are studied in coherent optical orthogonal frequency division multiplexing (CO-OFDM) systems. An analytical model for the I/Q imbalance is developed and supported by simulation results. The results indicate that the I/Q imbalance degrades the BER performance considerably.
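A common analytical model for receiver I/Q imbalance writes the distorted signal as y = alpha*x + beta*conj(x); whether this matches the paper's model is an assumption, as are the QPSK stand-in data and the example imbalance values:

```python
import numpy as np

def iq_imbalance(x, g, phi):
    """Apply amplitude imbalance g and phase imbalance phi (radians) to a
    complex baseband signal via the standard model y = a*x + b*conj(x)."""
    alpha = (1 + g * np.exp(-1j * phi)) / 2
    beta = (1 - g * np.exp(1j * phi)) / 2
    return alpha * x + beta * np.conj(x)

rng = np.random.default_rng(5)
# QPSK symbols standing in for one CO-OFDM subcarrier stream (synthetic).
x = (rng.choice([-1, 1], 2000) + 1j * rng.choice([-1, 1], 2000)) / np.sqrt(2)

g, phi = 1.05, np.deg2rad(5.0)        # example imbalance values (assumed)
y = iq_imbalance(x, g, phi)

# Image rejection ratio: how strongly the mirror-image term conj(x)
# is suppressed relative to the desired term x.
alpha = (1 + g * np.exp(-1j * phi)) / 2
beta = (1 - g * np.exp(1j * phi)) / 2
irr_db = 10 * np.log10(abs(alpha) ** 2 / abs(beta) ** 2)
print(f"IRR = {irr_db:.1f} dB")
```

In an OFDM context the conj(x) term shows up as interference from the mirror subcarrier, which is why the imbalance degrades BER across the band.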
Abstract: HSDPA is a new feature introduced in the Release 5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. Moreover, the HSDPA concept offers an outstanding improvement in packet throughput and also significantly reduces the packet call transfer delay compared to the Release 99 DSCH. So far, the HSDPA system has used turbo coding, which is among the best coding techniques for approaching the Shannon limit. However, the main drawbacks of turbo coding are its high decoding complexity and high latency, which make it unsuitable for some applications such as satellite communications, where the transmission distance itself introduces latency due to the finite speed of light. Hence, in this paper it is proposed to use LDPC coding in place of turbo coding for the HSDPA system, which decreases the latency and the decoding complexity. LDPC coding, however, increases the encoding complexity. Although the complexity of the transmitter at the Node B increases, the end user benefits in terms of receiver complexity and bit error rate. In this paper, an LDPC encoder is implemented using a sparse parity check matrix H to generate the codeword, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that with LDPC coding the BER drops sharply as the number of iterations increases with a small increase in Eb/No, which is not possible with turbo coding. The same BER was also achieved using fewer iterations, so the latency and receiver complexity are reduced with LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better quality and more reliable, more robust data services. In other words, while realistic data rates are only a few Mbps, the actual quality and the number of users served will improve significantly.