Abstract: We have developed a microfluidic device system for the continuous production of nanoparticles and clarified the relationship between the mixing performance of reactors and the particle size. First, we evaluated the mixing performance of the reactors by carrying out the Villermaux–Dushman reaction and determined the experimental conditions for producing AgCl nanoparticles. Next, we produced AgCl nanoparticles and evaluated both the mixing performance and the particle size. We found that as the mixing performance improves, the size of the produced particles decreases and the particle size distribution becomes sharper. Using the microfluidic device with the best mixing performance among the three reactors tested in this study, we produced AgCl nanoparticles with a size of 86 nm; the coefficient of variation (Cv) of the size distribution of the produced nanoparticles was 26.1%.
Abstract: The purpose of this research was to determine the role of an immunogenic 49 kDa protein from V. alginolyticus in initiating the expression of MHC class II molecules on receptors of Cromileptes altivelis. The method was an in vivo experiment in which the immunogenic 49 kDa protein from V. alginolyticus was tested on Cromileptes altivelis (250-300 g) with three boosters given by intramuscular injection of the immunogenic protein. The response of the expressed MHC molecules was visualized by immunocytochemistry and SEM. The results indicated that the immunogenic 49 kDa adhesin of V. alginolyticus can trigger expression of MHC class II on grouper receptors, as demonstrated by immunocytochemical staining and by SEM with labeling using an anti-MHC antibody (anti-mouse). This expression is visible through the binding between antigen epitopes and the anti-MHC antibody at the receptor. Immunocytochemistry thus revealed the intracellular MHC response to in vivo induction by the immunogenic adhesin of V. alginolyticus.
Abstract: In this paper, we propose an algorithm to compute initial cluster centers for K-means clustering. Data in a cell are partitioned using a cutting plane that divides the cell into two smaller cells. The plane is perpendicular to the data axis with the highest variance and is chosen to reduce the sum of squared errors of the two cells as much as possible while at the same time keeping the two cells as far apart as possible. Cells are partitioned one at a time until the number of cells equals the predefined number of clusters, K. The centers of the K cells then become the initial cluster centers for K-means. The experimental results suggest that the proposed algorithm is effective and converges to better clustering results than those of the random initialization method. The results also indicate that the proposed algorithm greatly improves the likelihood of every cluster containing some data.
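As a rough illustration of the idea described above, the following Python sketch splits one cell at a time along its highest-variance axis until K cells remain and returns their centers. The choices of splitting the cell with the largest sum of squared errors next and of placing the cut at the cell mean are simplifying assumptions, not the paper's exact cutting-plane criterion.

```python
import numpy as np

def cutting_plane_init(X, k):
    """Sketch of cutting-plane initialization for K-means (simplified criterion)."""
    cells = [X]
    while len(cells) < k:
        sse = [np.sum((c - c.mean(axis=0)) ** 2) for c in cells]
        cell = cells.pop(int(np.argmax(sse)))       # split the cell with largest SSE (assumption)
        axis = int(np.argmax(cell.var(axis=0)))     # axis with the highest variance
        cut = cell[:, axis].mean()                  # cutting-plane position (assumption)
        left = cell[cell[:, axis] <= cut]
        right = cell[cell[:, axis] > cut]
        cells.extend([left, right])
    return np.array([c.mean(axis=0) for c in cells])  # initial cluster centers

# Usage: pass the centers to K-means, e.g. scikit-learn's KMeans(init=centers, n_init=1)
X = np.random.default_rng(0).normal(size=(500, 3))
centers = cutting_plane_init(X, k=4)
```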
Abstract: Motivated by the impact of maps in enhancing the perception of the quality of life in a region, this work examines the use of spatial analytical techniques to explore the role of space in shaping human development patterns in Assiut governorate. Variations in the human development index (HDI) of the governorate's villages, districts and cities are mapped using geographic information systems (GIS). Global and local spatial autocorrelation measures are employed to assess the levels of spatial dependency in the data and to map clusters of human development. Results show prominent disparities in HDI between regions of Assiut. Strong patterns of spatial association were found, indicating the presence of clusters in the distribution of HDI. Finally, the study identifies several "hot spots" in the governorate as areas deserving further investigation into the attributes underlying such levels of human development. This is very important for accomplishing the development plan for the poorest regions currently adopted in Egypt.
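For readers unfamiliar with the global measure used above, the following Python sketch computes Moran's I for a toy set of regional values and a binary contiguity weights matrix; the values and weights are illustrative, not the actual Assiut HDI data.

```python
import numpy as np

def morans_i(y, W):
    """Global Moran's I for values y and spatial weights W (W[i, j] > 0 if i and j are neighbours)."""
    y = np.asarray(y, dtype=float)
    n = y.size
    z = y - y.mean()
    num = n * np.sum(W * np.outer(z, z))
    den = W.sum() * np.sum(z ** 2)
    return num / den

# Toy example: 4 regions on a line, each neighbouring the next (not the Assiut data)
hdi = np.array([0.55, 0.58, 0.72, 0.75])
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(morans_i(hdi, W))   # positive value: similar HDI values cluster in space
```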
Abstract: In most cases, natural disasters lead to the necessity of evacuating people. The quality of evacuation management is dramatically improved by the use of information provided by decision support systems, which become indispensable in large-scale evacuation operations. This paper presents a best-practice case study. In November 2007, officers from the Emergency Situations Inspectorate "Crisana" of Bihor County, Romania, participated in a cross-border evacuation exercise in which 700 people were evacuated from the Netherlands to Belgium. One of the main objectives of the exercise was to test four different decision support systems. Afterwards, based on that experience, a software system called TEVAC (Trans Border Evacuation) was developed "in house" by the experts of this institution. This original software system was successfully tested in September 2008 during the international exercise EU-HUROMEX 2008, whose scenario involved the real evacuation of 200 persons from Hungary to Romania. Based on the lessons learned and the results, since April 2009 the TEVAC software has been used by all Emergency Situations Inspectorates across Romania.
Abstract: A major requirement for Grid application developers is ensuring the performance and scalability of their applications. Predicting the performance of an application demands an understanding of its specific features. This paper discusses performance modeling and prediction of multi-agent based simulation (MABS) applications on the Grid. An experiment conducted using a synthetic MABS workload explains the key features to be included in the performance model. The results obtained from the experiment show that the prediction model developed for the synthetic workload can be used as a guideline for estimating the performance characteristics of real-world simulation applications.
Abstract: The purposes of this paper are to (1) promote excellence in computer science by suggesting a cohesive, innovative approach to fill well-documented deficiencies in current computer science education, (2) justify (using the authors' and others' anecdotal evidence from both the classroom and the real world) why this approach holds great potential to successfully eliminate the deficiencies, and (3) invite other professionals to join the authors in proof-of-concept research. The authors' experiences, though anecdotal, strongly suggest that a new approach involving visual modeling technologies should allow computer science programs to retain a greater percentage of prospective and declared majors, as students become more engaged learners, more successful problem-solvers, and better prepared as programmers. In addition, the graduates of such computer science programs will make greater contributions to the profession as skilled problem-solvers. Instead of wearily re-memorizing code as they move to the next course, students will have the problem-solving skills to think and work in more sophisticated and creative ways.
Abstract: Measurement of competitiveness between countries or regions is an important topic of many economic analyses and scientific papers. In the European Union (EU), there is no mainstream approach to evaluating and measuring competitiveness, and there are many opinions and methods for measuring and evaluating competitiveness between states or regions at the national and European levels. The methods differ in the structure of the competitiveness indicators used and in the ways those indicators are processed. The aim of the paper is to analyze the main sources of competitive potential of the EU Member States with the help of Factor analysis (FA) and to classify the EU Member States into homogeneous units (clusters), according to the similarity of selected indicators of competitiveness factors, by Cluster analysis (CA) in the reference years 2000 and 2011. The theoretical part of the paper is devoted to the fundamental bases of competitiveness and to the methodology of the FA and CA methods. The empirical part deals with the evaluation of competitiveness factors in the EU Member States and a cluster comparison of the evaluated countries.
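The following Python sketch illustrates the FA-then-CA workflow on hypothetical indicator data: indicators are standardised, reduced to factor scores, and the countries are grouped by hierarchical clustering. The data, the number of factors and the number of clusters are assumptions for illustration only, not the paper's specification.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical indicator matrix: rows = member states, columns = competitiveness
# indicators (GDP per capita, R&D intensity, employment rate, ...); random here.
rng = np.random.default_rng(0)
X = rng.normal(size=(27, 10))

Z = StandardScaler().fit_transform(X)                       # standardise indicators
scores = FactorAnalysis(n_components=3).fit_transform(Z)    # factor scores per country

# Group countries on their factor scores (Ward linkage, 4 clusters assumed)
clusters = fcluster(linkage(scores, method="ward"), t=4, criterion="maxclust")
print(clusters)
```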
Abstract: Using a neural network, we model the unknown function f for given input-output data pairs. The connection strength of each neuron is updated through learning. Repeated simulations of a crisp neural network produce different values of the weight factors, which are directly affected by changes in different parameters. We propose the idea that for each neuron in the network we can obtain quasi-fuzzy weight sets (QFWS) using repeated simulation of the crisp neural network. Such fuzzy weight functions may be applied where we have multivariate crisp input that needs to be adjusted after iterative learning, as in claim amount distribution analysis. As real data are subject to noise and uncertainty, QFWS may help simplify such complex problems. Secondly, these QFWS provide a good initial solution for training fuzzy neural networks with reduced computational complexity.
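A minimal sketch of how such quasi-fuzzy weight sets could be assembled: a small crisp linear neuron is trained repeatedly from different random initialisations and the spread of each final weight is summarised as a triangular (min, median, max) set. The network, data and summary form are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

# Synthetic input-output pairs for an unknown function f (plus noise)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([0.5, -1.0, 2.0]) + 0.1 * rng.normal(size=200)

def train_once(seed, lr=0.05, epochs=200):
    """Train a crisp linear neuron by gradient descent on mean squared error."""
    r = np.random.default_rng(seed)
    w = r.normal(size=3)
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

weights = np.array([train_once(s) for s in range(30)])   # repeated simulations
# Quasi-fuzzy weight set per connection: triangular (min, median, max) summary
qfws = np.stack([weights.min(0), np.median(weights, 0), weights.max(0)], axis=1)
print(qfws)
```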
Abstract: Technology advances day by day, and the problems associated with it increase as well. Several studies have investigated the safe use of waste materials in geotechnical engineering in particular and civil engineering in general, and different types of waste material, such as cement dust, fly ash and slag, have been proven suitable for several applications. In this research, cement dust mixed with different percentages of sand is used in civil engineering applications, as explained later in this paper through field and laboratory tests. The mixture used (waste material with sand) showed high performance, durability under environmental conditions, low cost and high benefits. At a higher cement dust ratio, a small cement ratio is valuable for compressive strength and permeability, while at a small cement dust ratio a higher cement ratio is valuable for compressive strength.
Abstract: The notion of the Next Generation Network (NGN) is based on the network convergence concept, which refers to the integration of services (such as IT and communication services) over the IP layer. As the most popular implementation of Service Oriented Architecture (SOA), Web Services technology is known to be the basis for service integration. In this paper, we present a platform to deliver communication services as web services. We also implement a sample service to show the simplicity of building composite web and communication services using this platform. A Service Logic Execution Environment (SLEE) is used to implement the communication services. The proposed architecture is in agreement with SOA and can also be integrated with an Enterprise Service Bus to form a basis for an NGN Service Delivery Platform (SDP).
Abstract: With the advance in wireless networking, IEEE 802.16 WiMAX technology has been widely deployed for several applications such as "last mile" broadband service, cellular backhaul, and high-speed enterprise connectivity. As a result, the military has employed WiMAX for many years as a high-speed wireless data-link connection because of its point-to-multipoint and non-line-of-sight (NLOS) capability. However, the risk of using WiMAX is a critical factor in some sensitive military applications, especially in ammunition manufacturing such as solid propellant rocket production. US DoD policy states that the following certification requirements must be met for WiMAX: electromagnetic environmental effects (E3) and Hazards of Electromagnetic Radiation to Ordnance (HERO). This paper discusses the recommended power densities and safe separation distance (SSD) for HERO for WiMAX systems deployed in solid propellant rocket production. This research found that WiMAX is safe to operate at close proximity to the rocket production facility, based on the AF Guidance Memorandum immediately changing AFMAN 91-201.
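As background on the quantities mentioned above, the sketch below estimates far-field power density and a corresponding separation distance using the standard free-space relation S = EIRP / (4πd²). The transmitter power and permissible-density limit are invented example values, not figures from AFMAN 91-201 or from this study.

```python
import numpy as np

def power_density(eirp_w, d_m):
    """Far-field free-space power density (W/m^2): S = EIRP / (4 * pi * d^2)."""
    return eirp_w / (4 * np.pi * d_m ** 2)

def safe_separation_distance(eirp_w, s_limit_w_m2):
    """Distance (m) at which the free-space power density falls to the given limit."""
    return np.sqrt(eirp_w / (4 * np.pi * s_limit_w_m2))

eirp_w = 10 ** ((36 - 30) / 10)   # example 36 dBm EIRP, about 4 W (assumed value)
s_limit = 2.0                     # example permissible power density in W/m^2 (assumed value)
print(power_density(eirp_w, 5.0))                 # density at 5 m
print(safe_separation_distance(eirp_w, s_limit))  # SSD estimate in metres
```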
Abstract: Artifact rejection plays a key role in many signal processing applications. Artifacts are disturbances that can occur during signal acquisition and that can alter the analysis of the signals themselves. Our aim is to automatically remove the artifacts, in particular from electroencephalographic (EEG) recordings. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for the artifact extraction and on high-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper we enhance this technique by proposing a new method based on Renyi's entropy. The performance of our method was tested and compared to that of the method in the literature, and the former proved to outperform the latter.
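A minimal sketch of this style of ICA-plus-statistics artifact rejection, assuming synthetic multichannel data, scikit-learn's FastICA, and illustrative thresholds on kurtosis and Renyi entropy; the original method's exact criteria are not reproduced here.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

def renyi_entropy(x, alpha=2.0, bins=64):
    """Renyi entropy of order alpha estimated from a histogram of x."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# Synthetic 4-channel "EEG": three sinusoidal sources plus one spiky ocular-like artifact
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 2000)
clean = np.stack([np.sin(2 * np.pi * f * t) for f in (4, 7, 11)])
blink = np.zeros_like(t)
blink[500:520] = 8.0
A = rng.uniform(0.2, 1.0, size=(4, 4))            # random mixing matrix
eeg = A @ np.vstack([clean, blink])               # channels x samples

ica = FastICA(n_components=4, random_state=0)
sources = ica.fit_transform(eeg.T).T              # independent components

k = np.array([kurtosis(s) for s in sources])
h = np.array([renyi_entropy(s) for s in sources])
# Illustrative thresholds: artifacts are spiky (high kurtosis) and concentrated (low entropy)
bad = (k > 5.0) | (h < 1.0)

sources[bad] = 0.0                                # reject artifact components
cleaned = ica.inverse_transform(sources.T).T      # reconstruct artifact-free channels
print("rejected components:", np.flatnonzero(bad))
```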
Abstract: The adsorption properties of CO and H2 on iron-based catalysts with added Zr and Ni were investigated using temperature-programmed desorption. It was found that on the carburized iron-based catalysts, molecular-state and dissociative-state CO coexisted. The addition of Zr favored the molecular-state adsorption of CO on the iron-based catalyst, while the presence of Ni was beneficial to the dissociative adsorption of CO. On H2-reduced catalysts, hydrogen mainly adsorbed on the surface iron sites and surface oxide sites. On CO-reduced catalysts, hydrogen probably existed as the most stable CH and OH species. The addition of Zr did not benefit the dissociative adsorption of hydrogen on the iron-based catalyst, whereas the presence of Ni favored it.
Abstract: HSDPA is a new feature introduced in the Release-5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. Moreover, the HSDPA concept offers an outstanding improvement in packet throughput and also significantly reduces the packet call transfer delay compared to the Release-99 DSCH. Until now, the HSDPA system has used turbo coding, which is among the best coding techniques for approaching the Shannon limit. However, the main drawbacks of turbo coding are high decoding complexity and high latency, which make it unsuitable for some applications such as satellite communications, where the transmission distance itself introduces latency due to the limited speed of light. Hence, in this paper it is proposed to use LDPC coding in place of turbo coding for the HSDPA system, which decreases the latency and decoding complexity, although LDPC coding increases the encoding complexity. Even though the complexity of the transmitter at the NodeB increases, the end user benefits in terms of receiver complexity and bit error rate. In this paper, the LDPC encoder is implemented using a sparse parity-check matrix H to generate the codewords, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that with LDPC coding the BER drops sharply as the number of iterations increases, with only a small increase in Eb/No, which is not possible with turbo coding. The same BER was also achieved using fewer iterations, so the latency and receiver complexity are lower for LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better quality and more reliable and robust data services. In other words, while realistic data rates are only a few Mbps, the actual quality and number of users achieved will improve significantly.
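As a compact illustration of the decoding side, the sketch below runs a min-sum approximation of belief propagation on a toy 4x6 parity-check matrix; the matrix, channel model and parameters are assumptions for illustration and are unrelated to the actual HSDPA code construction.

```python
import numpy as np

# Toy parity-check matrix H (not the sparse matrices proposed for HSDPA in the paper)
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 1]])

def decode_min_sum(llr, H, max_iter=20):
    """Min-sum LDPC decoding from channel LLRs (positive LLR means bit 0 is likelier)."""
    m, n = H.shape
    msg_vc = np.tile(llr, (m, 1)) * H                   # variable-to-check messages
    hard = (llr < 0).astype(int)
    for _ in range(max_iter):
        msg_cv = np.zeros_like(msg_vc)
        for i in range(m):                              # check-node update
            idx = np.flatnonzero(H[i])
            for j in idx:
                others = idx[idx != j]
                msg_cv[i, j] = (np.prod(np.sign(msg_vc[i, others]))
                                * np.min(np.abs(msg_vc[i, others])))
        total = llr + msg_cv.sum(axis=0)                # posterior LLR per bit
        hard = (total < 0).astype(int)
        if not np.any((H @ hard) % 2):                  # all parity checks satisfied
            break
        for j in range(n):                              # variable-node update (extrinsic)
            idx = np.flatnonzero(H[:, j])
            for i in idx:
                msg_vc[i, j] = total[j] - msg_cv[i, j]
    return hard

# Toy usage: all-zero codeword, BPSK over AWGN, LLR = 2*y / sigma^2
rng = np.random.default_rng(0)
sigma = 0.8
y = 1.0 + sigma * rng.normal(size=H.shape[1])           # +1 symbols plus noise
print(decode_min_sum(2 * y / sigma ** 2, H))            # expect the all-zero word
```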
Abstract: The main objective of this paper is to compare the Wolf Pack Search (WPS), a newly introduced intelligent algorithm, with several other known algorithms including Particle Swarm Optimization (PSO), Shuffled Frog Leaping (SFL), and binary and continuous genetic algorithms. All algorithms are applied to two benchmark cost functions. The aim is to identify the best algorithm in terms of speed and accuracy in finding the solution, where speed is measured in terms of function evaluations. The simulation results show that the SFL algorithm, with fewer function evaluations, ranks first if simulation time is important, while if accuracy is the significant issue, WPS and PSO have better performance.
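For concreteness, here is a minimal PSO sketch on an assumed benchmark (the sphere function) that also counts function evaluations, the speed measure used in the comparison; the swarm size and coefficients are illustrative, not the paper's settings.

```python
import numpy as np

def sphere(x):
    """Simple benchmark cost function (assumed example, not necessarily the paper's)."""
    return np.sum(x ** 2, axis=-1)

def pso(cost, dim=2, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), cost(x)
    gbest = pbest[np.argmin(pbest_val)].copy()
    evals = n_particles                                  # function-evaluation counter
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        val = cost(x)
        evals += n_particles
        better = val < pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min(), evals

best_x, best_f, n_evals = pso(sphere)
print(best_f, n_evals)    # accuracy achieved vs. number of function evaluations
```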
Abstract: Using data on listed Croatian firms from the Zagreb Stock Exchange, we analyze the relationship between firm ownership (ownership concentration and type) and performance (ROA). The empirical research was conducted for the period 2003-2010, yielding a total of 1,430 observations. Empirical findings based on dynamic panel analysis indicate that the ownership concentration variable CR4 is negatively related to performance, i.e. listed firms with dispersed ownership perform better than firms with concentrated ownership. The research also indicated that foreign-controlled listed firms perform better than domestically controlled firms. Majority state-owned firms perform worse than privately held firms, but the dummy variable for privately controlled firms was not statistically significant in the estimated panel model.
Abstract: Results are presented from a combined experimental and modeling study undertaken to understand the effect of fuel spray angle on soot production in turbulent liquid spray flames. The experimental work was conducted in a cylindrical laboratory furnace at fuel spray cone angles of 30º, 45º and 60º. Soot concentrations inside the combustor were measured by the filter paper technique. The soot concentration is modeled using the soot particle number density and the mass density based on acetylene concentrations. Soot oxidation occurs by both hydroxyl radicals and oxygen molecules. The comparison of calculated results against experimental measurements shows good agreement. Both the numerical and experimental results show that the peak value of soot and its location in the furnace depend on the fuel spray cone angle. An increase in spray angle enhances the evaporation rate and the peak temperature near the nozzle. Although the peak soot concentration increases with the fuel spray angle, soot emission from the furnace decreases.
Abstract: Generally, administrative systems in an academic environment are disjoint and support independent queries. The objective of this work is to semantically connect these independent systems so as to support queries run on the integrated platform. The proposed framework, by enriching educational material in the legacy systems, provides a value-added semantics layer where activities such as annotation, query and reasoning can be carried out to support management requirements. We discuss the development of this ontology framework with a case study of UAE University program administration to show how semantic web technologies can be used by the administration to develop student profiles for better academic program management.
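A minimal sketch of the kind of semantics layer described above, using rdflib to assert a few student-profile triples and query them with SPARQL; the namespace, classes and properties are invented for illustration and are not the paper's actual ontology.

```python
from rdflib import Graph, Namespace, Literal, RDF

# Hypothetical university namespace and a tiny student-profile graph
EX = Namespace("http://example.org/university#")
g = Graph()
g.bind("ex", EX)

g.add((EX.Student_1001, RDF.type, EX.Student))
g.add((EX.Student_1001, EX.enrolledIn, EX.CS_Program))
g.add((EX.Student_1001, EX.completedCourse, EX.CSBP_219))
g.add((EX.CSBP_219, EX.creditHours, Literal(3)))

# SPARQL query over the integrated graph: which courses has each student completed?
query = """
PREFIX ex: <http://example.org/university#>
SELECT ?student ?course WHERE {
    ?student a ex:Student ;
             ex:completedCourse ?course .
}
"""
for row in g.query(query):
    print(row.student, row.course)
```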
Abstract: With the development of the Internet, E-commerce is growing at an exponential rate, and many online stores have been built to sell goods online. A major factor influencing the successful adoption of E-commerce is consumers' trust. For new or unknown Internet businesses, consumers' lack of trust has been cited as a major barrier to proliferation. As web sites provide the key interface for consumer use of E-commerce, we investigate the design of web sites to build trust in E-commerce from a design science approach. A conceptual model is proposed in this paper to describe the ontology of online transactions and human-computer interaction. Based on this conceptual model, we provide a personalized webpage design approach using a Bayesian network learning method. Experimental evaluations are designed to show the effectiveness of web personalization in improving consumers' trust in new or unknown online stores.
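A minimal sketch of the "learn, then personalise" step, assuming a hypothetical two-node network (page design -> trust) whose conditional probability table is learned by maximum-likelihood counting from simulated feedback; the variables, data and smoothing are illustrative, not the paper's Bayesian network.

```python
import numpy as np

# Simulated logs: which of three page designs each visitor saw and whether they
# reported trusting the store (1) or not (0). Purely hypothetical data.
rng = np.random.default_rng(0)
n_designs, n_ratings = 3, 2
design = rng.integers(0, n_designs, size=500)
trust = (rng.random(500) < np.array([0.4, 0.6, 0.8])[design]).astype(int)

# Maximum-likelihood CPT with add-one (Laplace) smoothing: P(trust | design)
cpt = np.ones((n_designs, n_ratings))
for d, t in zip(design, trust):
    cpt[d, t] += 1
cpt /= cpt.sum(axis=1, keepdims=True)

# Personalisation decision: serve the design with the highest P(trust = 1 | design)
best = int(np.argmax(cpt[:, 1]))
print(cpt, "-> personalised choice: design", best)
```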