Size Control of Nanoparticles Using a Microfluidic Device

We have developed a microfluidic device system for the continuous production of nanoparticles, and we have clarified the relationship between the mixing performance of the reactors and the particle size. First, we evaluated the mixing performance of the reactors by carrying out the Villermaux–Dushman reaction and determined the experimental conditions for producing AgCl nanoparticles. Next, we produced AgCl nanoparticles and evaluated the mixing performance and the particle size. We found that as the mixing performance improves, the size of the produced particles decreases and the particle size distribution becomes sharper. We produced AgCl nanoparticles with a size of 86 nm using the microfluidic device that had the best mixing performance among the three reactors tested in this study; the coefficient of variation (Cv) of the size distribution of the produced nanoparticles was 26.1%.
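
For reference, the coefficient of variation quoted above is simply the standard deviation of the measured particle diameters divided by their mean. A minimal sketch in Python (not the authors' code; the diameter values below are hypothetical):

    import numpy as np

    sizes_nm = np.array([78.0, 92.0, 85.0, 101.0, 74.0, 88.0])  # hypothetical measured diameters
    mean_size = sizes_nm.mean()
    cv_percent = sizes_nm.std(ddof=1) / mean_size * 100.0       # sample standard deviation / mean

    print(f"mean size = {mean_size:.1f} nm, Cv = {cv_percent:.1f} %")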

Long-Term On-Chip Storage and Release of Liquid Reagents for Diagnostic Lab-on-a-Chip Applications

A new concept for long-term reagent storage in Lab-on-a-Chip (LoC) devices is described. Here we present a polymer multilayer stack with integrated stick packs for long-term storage of several liquid reagents, which are necessary for many diagnostic applications. Stick packs are widely used in the packaging industry for storing solids and liquids over long periods. The storage concept fulfills two main requirements: first, long-term storage of reagents in stick packs without significant losses or interaction with the surroundings; second, on-demand release of the liquids, which is realized by pushing a membrane against the stick pack with pneumatic pressure. This concept enables long-term on-chip storage of liquid reagents at room temperature and allows easy implementation in different LoC devices.

The Role of the Immunogenic 49 kDa Adhesin of Vibrio alginolyticus in the Expression of Major Histocompatibility Complex Molecules on Receptors of the Humpback Grouper Cromileptes altivelis

The purpose of this research was to determine the role of the immunogenic 49 kDa protein from V. alginolyticus, which is capable of initiating the expression of MHC class II molecules on receptors of Cromileptes altivelis. The method used was in vivo experimental research, in which the immunogenic 49 kDa protein from V. alginolyticus was tested in Cromileptes altivelis (250-300 g) using three boosters given by intramuscular injection of the immunogenic protein. The expression of MHC molecules was demonstrated using immunocytochemistry and SEM. The results indicated that the immunogenic 49 kDa adhesin of V. alginolyticus could trigger the expression of MHC class II on grouper receptors, as proven by immunocytochemical staining and SEM with labeling using an anti-MHC antibody (anti-mouse). This visible expression is based on the binding between antigen epitopes and the anti-MHC antibody at the receptor. Using immunocytochemistry, the intracellular MHC response to in vivo induction by the immunogenic adhesin from V. alginolyticus was shown.

A New Measure of Herding Behavior: Derivation and Implications

If price and quantity are the fundamental building blocks of any theory of market interactions, the importance of trading volume in understanding the behavior of financial markets is clear. However, while many economic models of financial markets have been developed to explain the behavior of prices (predictability, variability, and information content), far less attention has been devoted to explaining the behavior of trading volume. In this article, we hope to expand our understanding of trading volume by developing a new measure of herding behavior based on the cross-sectional dispersion of volume betas. We apply our measure to the Toronto Stock Exchange using monthly data from January 2000 to December 2002. Our findings show that the herding phenomenon consists of three essential components: stationary herding, intentional herding, and feedback herding.
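
A minimal sketch of one plausible reading of the measure, assuming each stock's volume beta is estimated by regressing its trading volume on market-wide volume and that the herding signal is the cross-sectional dispersion of those betas; the data and variable names are illustrative, not the authors' dataset or code:

    import numpy as np

    rng = np.random.default_rng(0)
    n_stocks, n_months = 50, 36
    market_vol = rng.normal(size=n_months)                         # market-wide (demeaned) volume
    stock_vol = 1.0 + 0.8 * market_vol + rng.normal(scale=0.5, size=(n_stocks, n_months))

    # Volume beta of each stock: OLS slope of stock volume on market volume.
    betas = np.array([np.polyfit(market_vol, stock_vol[i], 1)[0] for i in range(n_stocks)])

    # Herding proxy: cross-sectional dispersion of the volume betas
    # (a lower dispersion would be read as stronger herding toward the market).
    dispersion = betas.std(ddof=1)
    print(f"cross-sectional dispersion of volume betas = {dispersion:.3f}")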

A Simple Method for Tracing PV Curve of a Radial Transmission Line

An analytical expression for the maximum power transfer through a transmission line limited by voltage stability has been formulated using the exact representation of the transmission line with ABCD parameters. The expression has been used to plot the PV curve of a radial transmission line at different power factors. Limiting values of reactive power have also been obtained.
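
For reference, with the exact line representation V_S = A V_R + B I_R, where A = |A|∠α and B = |B|∠β, the standard textbook receiving-end power expressions from which a PV curve can be traced are (this is the conventional form, not necessarily the exact derivation used in the paper):

    P_R = \frac{|V_S||V_R|}{|B|}\cos(\beta - \delta) - \frac{|A||V_R|^2}{|B|}\cos(\beta - \alpha)
    Q_R = \frac{|V_S||V_R|}{|B|}\sin(\beta - \delta) - \frac{|A||V_R|^2}{|B|}\sin(\beta - \alpha)

where δ is the angle by which V_S leads V_R; sweeping the loading at a fixed power factor and solving for |V_R| traces the PV curve, whose nose gives the voltage-stability-limited maximum power transfer.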

Spatial Variability in Human Development Patterns in Assiut, Egypt

Motivated by the impact of maps in enhancing the perception of the quality of life in a region, this work examines the use of spatial analytical techniques to explore the role of space in shaping human development patterns in Assiut governorate. Variations of the human development index (HDI) across the governorate's villages, districts, and cities are mapped using geographic information systems (GIS). Global and local spatial autocorrelation measures are employed to assess the levels of spatial dependency in the data and to map clusters of human development. Results show prominent disparities in HDI between regions of Assiut. Strong patterns of spatial association were found, proving the presence of clusters in the distribution of HDI. Finally, the study identifies several "hot spots" in the governorate as areas warranting further investigation into the attributes of such levels of human development. This is very important for accomplishing the development plan for the poorest regions currently adopted in Egypt.
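
The global spatial autocorrelation statistic most commonly used for this kind of analysis is Moran's I (the abstract does not name its exact measure, so this is given only as the standard definition):

    I = \frac{n}{\sum_i \sum_j w_{ij}} \cdot \frac{\sum_i \sum_j w_{ij}(x_i - \bar{x})(x_j - \bar{x})}{\sum_i (x_i - \bar{x})^2}

where x_i is the HDI of areal unit i, w_ij is the spatial weight between units i and j, and values of I significantly above the expectation E[I] = -1/(n-1) indicate positive spatial clustering.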

The Management of Large Emergency Situations – A Best Practice Case Study Based on GIS for Management of Evacuation

In most cases, natural disasters lead to the necessity of evacuating people. The quality of evacuation management is dramatically improved by the use of information provided by decision support systems, which become indispensable in large-scale evacuation operations. This paper presents a best practice case study. In November 2007, officers from the Emergency Situations Inspectorate "Crisana" of Bihor County, Romania, participated in a cross-border evacuation exercise in which 700 people were evacuated from the Netherlands to Belgium. One of the main objectives of the exercise was to test four different decision support systems. Afterwards, based on that experience, a software system called TEVAC (Trans-Border Evacuation) was developed "in house" by the experts of this institution. This original software system was successfully tested in September 2008 during the international exercise EU-HUROMEX 2008, whose scenario involved the real evacuation of 200 people from Hungary to Romania. Based on the lessons learned and the results, since April 2009 the TEVAC software has been used by all Emergency Situations Inspectorates across Romania.

A Comparison between Heterogeneous and Homogeneous Gas Flow Models in a Slurry Bubble Column Reactor for Direct Synthesis of DME

In the present study, heterogeneous and homogeneous gas flow dispersion models were developed for the simulation and optimisation of a large-scale catalytic slurry reactor for the direct synthesis of dimethyl ether (DME) from syngas and CO2 operating in the churn-turbulent regime. In the heterogeneous gas flow model, the gas phase was distributed into two bubble phases, small and large, whereas in the homogeneous model the gas phase was distributed into only one large bubble phase. The results indicated that the heterogeneous gas flow model was in better agreement with experimental pilot plant data than the homogeneous one.

Performance Prediction of Multi-Agent Based Simulation Applications on the Grid

A major requirement for Grid application developers is ensuring the performance and scalability of their applications. Predicting the performance of an application demands an understanding of its specific features. This paper discusses performance modeling and prediction of multi-agent based simulation (MABS) applications on the Grid. An experiment conducted using a synthetic MABS workload explains the key features to be included in the performance model. The results obtained from the experiment show that the prediction model developed for the synthetic workload can be used as a guideline to estimate the performance characteristics of real-world simulation applications.

Community Innovation in Sustainable Development: A Cross Case Study

Although innovative solutions in the field of sustainable development have been sought worldwide by environmental groups, academia, governments, and companies for many years, citizens and communities have recently emerged as a new group and taken an increasingly active role in this field. Many scholars call for more research on the role of communities and community innovation in sustainable development. This paper responds to those calls. We first summarize a comprehensive set of innovation principles. Then, we conduct a qualitative cross-case study by comparing three community innovation cases in three different areas of sustainable development according to the innovation principles. Finally, we summarize the case comparison and discuss the implications for sustainable development. A unified role model and an innovation distribution map of community innovation are developed to better understand community innovation in sustainable development.

Assessment of EU Competitiveness Factors by Multivariate Methods

Measurement of competitiveness between countries or regions is an important topic of many economic analyses and scientific papers. In the European Union (EU), there is no mainstream approach to evaluating and measuring competitiveness. There are many opinions on and methods for measuring and evaluating competitiveness between states or regions at the national and European level. The methods differ in the structure of the competitiveness indicators used and in the way they are processed. The aim of the paper is to analyze the main sources of competitive potential of the EU Member States with the help of factor analysis (FA) and to classify the EU Member States into homogeneous units (clusters) according to the similarity of selected indicators of competitiveness factors by cluster analysis (CA) in the reference years 2000 and 2011. The theoretical part of the paper is devoted to the fundamentals of competitiveness and the methodology of the FA and CA methods. The empirical part deals with the evaluation of competitiveness factors in the EU Member States and a cluster comparison of the evaluated countries.
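
A minimal sketch of the analysis pipeline described above, under the assumption that the competitiveness indicators are standardized, reduced to factors, and the countries are then clustered hierarchically on the factor scores; the indicator matrix is synthetic and the scikit-learn/SciPy calls stand in for whatever software the authors actually used:

    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import FactorAnalysis
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(1)
    X = rng.normal(size=(27, 12))            # 27 member states x 12 competitiveness indicators (synthetic)

    Z = StandardScaler().fit_transform(X)    # standardize indicators
    scores = FactorAnalysis(n_components=3, random_state=0).fit_transform(Z)  # factor scores

    # Ward hierarchical clustering of the countries on their factor scores.
    tree = linkage(scores, method="ward")
    clusters = fcluster(tree, t=4, criterion="maxclust")
    print(clusters)                          # cluster label per country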

Approximate Bounded Knowledge Extraction Using Type-I Fuzzy Logic

Using a neural network, we try to model the unknown function f for given input-output data pairs. The connection strength of each neuron is updated through learning. Repeated simulations of the crisp neural network produce different values of the weight factors, which are directly affected by changes in the various parameters. We propose the idea that, for each neuron in the network, we can obtain quasi-fuzzy weight sets (QFWS) using repeated simulation of the crisp neural network. Such fuzzy weight functions may be applied where we have multivariate crisp input that needs to be adjusted after iterative learning, as in claim amount distribution analysis. As real data is subject to noise and uncertainty, QFWS may be helpful in simplifying such complex problems. Secondly, these QFWS provide a good initial solution for training fuzzy neural networks with reduced computational complexity.
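
A minimal sketch of the QFWS idea, under the assumption that the network is retrained several times from different random initialisations and that each weight's collected values are summarised as a triangular fuzzy number (min, mean, max); the toy data and single-neuron model are purely illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 2))
    y = 0.7 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(scale=0.1, size=200)   # noisy target

    def train_once(seed, epochs=200, lr=0.1):
        """Train a single linear neuron by gradient descent from a random start."""
        r = np.random.default_rng(seed)
        w = r.normal(size=2)
        for _ in range(epochs):
            grad = 2 * X.T @ (X @ w - y) / len(y)
            w -= lr * grad
        return w

    weights = np.array([train_once(s) for s in range(20)])   # repeated crisp simulations

    # Quasi-fuzzy weight set per connection: triangular number (min, mean, max).
    qfws = np.stack([weights.min(axis=0), weights.mean(axis=0), weights.max(axis=0)], axis=1)
    print(qfws)   # one (min, mean, max) triple per weight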

The Safety of WiMAX in Solid Propellant Rocket Production

With the advance of wireless networking, IEEE 802.16 WiMAX technology has been widely deployed for several applications such as "last mile" broadband service, cellular backhaul, and high-speed enterprise connectivity. As a result, the military has for many years employed WiMAX as a high-speed wireless data link because of its point-to-multipoint and non-line-of-sight (NLOS) capability. However, the risk of using WiMAX is a critical factor in some sensitive areas of military application, especially in ammunition manufacturing such as solid propellant rocket production. US DoD policy states that the following certification requirements must be met for WiMAX: electromagnetic effects on the environment (E3) and Hazards of Electromagnetic Radiation to Ordnance (HERO). This paper discusses the recommended power densities and safe separation distance (SSD) for HERO for WiMAX systems deployed in solid propellant rocket production. This research found that WiMAX is safe to operate in close proximity to the rocket production line, based on the AF Guidance Memorandum immediately changing AFMAN 91-201.
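
As a point of reference (not the specific HERO calculation in the paper), the quantities above follow from the standard far-field relation between transmitter power and power density:

    S = \frac{P_t G_t}{4\pi d^2}

where P_t is the transmitter power, G_t the antenna gain, and d the distance; for a maximum permissible power density S_max (e.g. the HERO limit for the ordnance item), the corresponding safe separation distance is d_SSD = \sqrt{P_t G_t / (4\pi S_{max})}.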

Evaluation of Zinc Status in the Sediments of the Kaohsiung Ocean Disposal Site, Taiwan

The distribution, enrichment, and accumulation of zinc (Zn) in the sediments of the Kaohsiung Ocean Disposal Site (KODS), Taiwan, were investigated. Sediment samples from two stations outside the disposal site and nine disposal stations in the KODS were collected quarterly in 2009 and characterized for Zn, aluminum, organic matter, and grain size. Results showed that the mean Zn concentrations varied from 48 mg/kg to 456 mg/kg. Results from the enrichment factor (EF) and geo-accumulation index (Igeo) analyses imply that the sediments collected from the KODS can be characterized as showing moderate to moderately severe enrichment and none to medium accumulation of Zn, respectively. However, the results of the potential ecological risk index indicate that the sediment poses a low potential ecological risk. The EF, Igeo, and Zn concentrations at the disposal stations were slightly higher than those outside the disposal site, indicating that the centers of the disposal area may be impacted by the disposal of harbor-dredged sediments.
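
The two indices used above are commonly defined as follows (standard definitions, with aluminum as the normalizing element as in this study; the background concentrations are assumed to come from a local or crustal reference, which the abstract does not specify):

    EF = \frac{(C_{Zn}/C_{Al})_{sample}}{(C_{Zn}/C_{Al})_{background}}, \qquad
    I_{geo} = \log_2\!\left(\frac{C_{Zn}}{1.5\,B_{Zn}}\right)

where C denotes the measured concentration, B_Zn the background Zn concentration, and the factor 1.5 compensates for natural lithogenic variability.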

Semi-Automatic Artifact Rejection Procedure Based on Kurtosis, Renyi's Entropy and Independent Component Scalp Maps

Artifact rejection plays a key role in many signal processing applications. Artifacts are disturbances that can occur during signal acquisition and that can alter the analysis of the signals themselves. Our aim is to automatically remove the artifacts, in particular from electroencephalographic (EEG) recordings. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for artifact extraction and on higher-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper we try to enhance this technique by proposing a new method based on Renyi's entropy. The performance of our method was tested and compared with that of the method in the literature, and the former proved to outperform the latter.
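
A minimal sketch of the statistics involved, assuming the independent components have already been extracted (e.g. with ICA) and are stored as rows of a matrix; the Renyi entropy is estimated here from a simple histogram and the rejection thresholds are purely illustrative, not the paper's criteria:

    import numpy as np
    from scipy.stats import kurtosis

    def renyi_entropy(x, alpha=2.0, bins=64):
        """Histogram estimate of the Renyi entropy of order alpha."""
        p, _ = np.histogram(x, bins=bins)
        p = p[p > 0] / p.sum()
        return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

    def mark_artifacts(components, kurt_z=2.0, ent_z=2.0):
        """Flag components whose kurtosis or Renyi entropy deviates strongly
        from the other components (z-score thresholds are illustrative)."""
        k = np.array([kurtosis(c) for c in components])
        h = np.array([renyi_entropy(c) for c in components])
        zk = (k - k.mean()) / k.std()
        zh = (h - h.mean()) / h.std()
        return (np.abs(zk) > kurt_z) | (np.abs(zh) > ent_z)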

Performance Analysis of HSDPA Systems Using Low-Density Parity-Check (LDPC) Coding as Compared to Turbo Coding

HSDPA is a new feature introduced in the Release 5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. Moreover, the HSDPA concept offers an outstanding improvement in packet throughput and also significantly reduces the packet call transfer delay compared to the Release 99 DSCH. Until now the HSDPA system has used turbo coding, a coding technique that comes close to the Shannon limit. However, the main drawbacks of turbo coding are its high decoding complexity and high latency, which make it unsuitable for some applications such as satellite communications, where the transmission distance itself introduces latency due to the finite speed of light. Hence, in this paper we propose using LDPC coding in place of turbo coding for the HSDPA system, which decreases the latency and decoding complexity, although LDPC coding increases the encoding complexity. Even though the transmitter complexity increases at the NodeB, the end user benefits in terms of receiver complexity and bit error rate (BER). In this paper the LDPC encoder is implemented using a sparse parity-check matrix H to generate codewords, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that with LDPC coding the BER drops sharply as the number of iterations increases, for only a small increase in Eb/No, which is not the case with turbo coding. The same BER was also achieved with fewer iterations, so the latency and receiver complexity decreased with LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better quality and more reliable and robust data services. In other words, while realistic data rates are only a few Mbps, the actual quality and the number of users served will improve significantly.
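
As a small illustration of the role of the parity-check matrix H (a toy systematic code, far smaller and denser than the LDPC codes that would actually be used in HSDPA, and not the paper's encoder), a valid codeword satisfies H c^T = 0 (mod 2), and any single bit error produces a non-zero syndrome:

    import numpy as np

    P = np.array([[1, 1, 0],
                  [1, 0, 1],
                  [0, 1, 1],
                  [1, 1, 1]])                     # parity block of a toy (7,4) systematic code
    G = np.hstack([np.eye(4, dtype=int), P])      # generator matrix  [I | P]
    H = np.hstack([P.T, np.eye(3, dtype=int)])    # parity-check matrix [P^T | I]

    m = np.array([1, 0, 1, 1])                    # message bits
    c = m @ G % 2                                 # codeword
    print(H @ c % 2)                              # syndrome of a valid codeword -> [0 0 0]

    c_err = c.copy()
    c_err[2] ^= 1                                 # flip one bit
    print(H @ c_err % 2)                          # non-zero syndrome flags the error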

Classification of the Latin Alphabet as Pattern on ARToolkit Markers for Augmented Reality Applications

Augmented reality is a technique used to insert virtual objects into real scenes. One of the most widely used libraries in this area is the ARToolkit library. It is based on the recognition of markers that take the form of squares with a pattern inside. This pattern, which is mostly textual, is a source of confusion. In this paper, we present the results of a classification of Latin characters used as patterns on ARToolkit markers, in order to determine which of them are the most distinguishable.

Academic Program Administration via Semantic Web – A Case Study

Generally, administrative systems in an academic environment are disjoint and support independent queries. The objective of this work is to semantically connect these independent systems to support queries run on the integrated platform. The proposed framework, by enriching the educational material in the legacy systems, provides a value-added semantic layer where activities such as annotation, querying, and reasoning can be carried out to support management requirements. We discuss the development of this ontology framework with a case study of UAE University program administration to show how semantic web technologies can be used by the administration to develop student profiles for better academic program management.

Application of a Systemic Soft Domain-Driven Design Framework

This paper proposes a "soft systems" approach to domain-driven design of computer-based information systems. We propose a systemic framework combining techniques from Soft Systems Methodology (SSM), the Unified Modelling Language (UML), and an implementation pattern known as "Naked Objects". We have used this framework in action research projects that have involved the investigation and modelling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. Within the proposed framework, Soft Systems Methodology (SSM) is used as a guiding methodology to explore the problem situation and to generate a ubiquitous language (soft language) which can serve as the basis for developing an object-oriented domain model. The domain model is further developed using techniques based on the UML and is implemented in software following the "Naked Objects" implementation pattern. We argue that there are advantages in combining and using techniques from different methodologies in this way. The proposed systemic framework is overviewed and justified as a multimethodology using Mingers' multimethodology ideas. This multimethodology approach is being evaluated through a series of action research projects based on real-world case studies. A peer-tutoring case study is presented here as a sample of the framework evaluation process.

Web Personalization to Build Trust in E-Commerce: A Design Science Approach

With the development of the Internet, E-commerce is growing at an exponential rate, and many online stores have been set up to sell goods online. A major factor influencing the successful adoption of E-commerce is consumer trust. For new or unknown Internet businesses, consumers' lack of trust has been cited as a major barrier to proliferation. As web sites provide the key interface for consumer use of E-commerce, we investigate web site design to build trust in E-commerce from a design science approach. A conceptual model is proposed in this paper to describe the ontology of online transactions and human-computer interaction. Based on this conceptual model, we provide a personalized webpage design approach using a Bayesian network learning method. Experimental evaluations are designed to show the effectiveness of web personalization in improving consumer trust in new or unknown online stores.
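
A minimal sketch of the learning step, deliberately reduced to a two-node Bayesian network whose conditional probability table P(layout preference | customer segment) is estimated from interaction logs by maximum likelihood; the variable names and data are illustrative and much simpler than the personalization system described in the paper:

    from collections import Counter, defaultdict

    # Hypothetical interaction log: (customer_segment, chosen_page_layout)
    log = [("new_visitor", "trust_badges"), ("new_visitor", "trust_badges"),
           ("new_visitor", "minimal"), ("returning", "minimal"),
           ("returning", "minimal"), ("returning", "trust_badges")]

    counts = defaultdict(Counter)
    for segment, layout in log:
        counts[segment][layout] += 1

    # Maximum-likelihood conditional probability table P(layout | segment).
    cpt = {seg: {lay: n / sum(c.values()) for lay, n in c.items()} for seg, c in counts.items()}

    def personalize(segment):
        """Pick the layout with the highest learned probability for this segment."""
        return max(cpt[segment], key=cpt[segment].get)

    print(cpt)
    print(personalize("new_visitor"))   # -> 'trust_badges'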