Proposal of Additional Fuzzy Membership Functions in Smoothing Transition Autoregressive Models

In this paper we propose and examine additional membership functions for Smoothing Transition Autoregressive (STAR) models, specifically the hyperbolic tangent, Gaussian and generalized bell functions. Because STAR models follow a fuzzy logic approach, more fuzzy membership functions should be tested. Furthermore, fuzzy rules can be incorporated, or other training or computational methods, such as error backpropagation or genetic algorithms, can be applied instead of nonlinear least squares. We examine two macroeconomic variables of the US economy: the inflation rate and the 6-month Treasury bill interest rate.
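As an illustration, the three candidate membership functions can be written down directly. The parameterizations below (slope gamma, center c, and bell-shape parameters a and b) are common textbook forms, not necessarily the exact notation used in the paper:

```python
import math

def tanh_transition(s, gamma=1.0, c=0.0):
    # Hyperbolic tangent transition, rescaled to lie in [0, 1].
    return 0.5 * (1.0 + math.tanh(gamma * (s - c)))

def gaussian_membership(s, c=0.0, sigma=1.0):
    # Gaussian membership function, peaking at 1 when s == c.
    return math.exp(-((s - c) ** 2) / (2.0 * sigma ** 2))

def generalized_bell(s, a=1.0, b=2.0, c=0.0):
    # Generalized bell membership function; a sets the width, b the steepness.
    return 1.0 / (1.0 + abs((s - c) / a) ** (2 * b))
```

The tanh form is monotonic like the logistic transition in classical STAR models, while the Gaussian and generalized bell forms are symmetric around the threshold, which allows regime behavior that differs on both sides of the center.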

Analysis of Different Combining Schemes of Two Amplify-Forward Relay Branches with Individual Links Experiencing Nakagami Fading

Relay-based communication has gained considerable importance in recent years. In this paper we derive the end-to-end statistics of a two-hop non-regenerative relay branch, each hop being Nakagami-m faded. Closed-form expressions for the probability density functions of the signal envelope at the output of a selection combiner and a maximal ratio combiner at the destination node are also derived, and the analytical formulations are verified through computer simulation. These density functions are useful in evaluating the system performance in terms of bit error rate and outage probability.
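The two combining schemes can be illustrated with a small Monte Carlo sketch. The min(g1, g2) end-to-end SNR used here is a common tight upper bound for a non-regenerative two-hop branch, standing in for the paper's exact closed-form statistics, and all parameter values are illustrative:

```python
import random

def nakagami_power(m, omega, rng):
    # The squared Nakagami-m envelope (instantaneous SNR gain) is
    # Gamma-distributed with shape m and scale omega/m.
    return rng.gammavariate(m, omega / m)

def branch_snr(m, omega, rng):
    # End-to-end SNR of a two-hop amplify-and-forward branch, using the
    # common upper-bound approximation min(g1, g2).
    g1 = nakagami_power(m, omega, rng)
    g2 = nakagami_power(m, omega, rng)
    return min(g1, g2)

def outage_probability(combiner, m=2.0, omega=1.0, threshold=0.1,
                       trials=20000, seed=1):
    # Estimate P(combined SNR < threshold) for two relay branches.
    rng = random.Random(seed)
    outages = 0
    for _ in range(trials):
        s1 = branch_snr(m, omega, rng)
        s2 = branch_snr(m, omega, rng)
        snr = max(s1, s2) if combiner == "SC" else s1 + s2  # SC vs. MRC
        outages += snr < threshold
    return outages / trials
```

Because MRC adds branch SNRs while SC takes their maximum, the MRC outage probability can never exceed the SC one for the same channel realizations.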

Visualization of Searching and Sorting Algorithms

This paper employs multimedia tools to present the execution sequences of algorithms in an interactive manner. This helps learners grasp fundamental algorithms, such as searching and sorting methods, in a simple way. Visualization attracts more attention than theoretical study and makes for an easier learning process. We propose methods for tracing the runtime sequence of each algorithm interactively, aiming to overcome the drawbacks of existing character-based systems. The system illustrates each and every step clearly using text and animation. Comparisons of time complexity have been carried out, and the results show that our approach provides a better understanding of the algorithms.
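A minimal sketch of the kind of step tracing described, assuming a frame-per-swap model of the animation (bubble sort is used here purely as an example, not as the paper's specific algorithm set):

```python
def bubble_sort_steps(values):
    # Yield a snapshot of the list after every swap, so that each step
    # can be rendered as one frame of text or animation.
    a = list(values)
    yield tuple(a)
    for end in range(len(a) - 1, 0, -1):
        for i in range(end):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                yield tuple(a)

# Each frame shows the list state; a front end would draw these in sequence.
frames = list(bubble_sort_steps([3, 1, 2]))
```

Counting the yielded frames also gives a direct, visual handle on the number of swaps, which supports the time-complexity comparisons mentioned above.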

Exploration of Autistic Children using Case Based Reasoning System with Cognitive Map

Assessing an autistic child in elementary school is a difficult task that must be fully thought out, and teachers should be aware of the many challenges they face, especially the behavioral problems of autistic children. Hence there is a need for contemporary Artificial Intelligence (AI) techniques to help diagnose autism. In this research, we propose an expert-system architecture that combines Cognitive Maps (CM) with Case-Based Reasoning (CBR) in order to reduce the time and cost of the traditional diagnosis process for the early detection of autistic children. The teacher enters the child's information, which is analyzed by the CM module. The reasoning processor then translates the output into a case to be solved by the CBR module. We will implement a prototype of the model as a proof of concept using Java and MySQL. This provides a new hybrid approach that achieves new synergies and improves problem-solving capabilities in AI. We predict that it will reduce time, cost and the number of human errors, and make expertise available to more people who want to serve autistic children and their families.

Is Management Science doing Enough to Improve Healthcare?

Healthcare issues continue to pose huge problems and incur massive costs. As a result, many challenging problems remain unresolved. In this paper, we carry out an extensive scientific survey of different areas of management and planning in an attempt to identify where management science methods have already made a substantial contribution to healthcare problems, and where there is clear potential for more work to be done. The focus is on the read-across to the healthcare domain from such approaches as applied generally to management and planning, and on how the methods can be used to improve patient care. We conclude that, since the healthcare domain differs significantly from traditional areas of management and planning, in some cases there is a need to modify the approaches so as to incorporate the complexities of healthcare and fully exploit the potential for improvement.

Flow Regime Characterization in a Diseased Artery Model

Cardiovascular disease, mostly in the form of atherosclerosis, is responsible for 30% of all world deaths, amounting to 17 million people per year. Atherosclerosis is due to the formation of plaque. The fatty plaque may be at risk of rupture, leading typically to stroke and heart attack. The plaque is usually associated with a high degree of lumen reduction, called a stenosis. The initiation and progression of the disease are strongly linked to the hemodynamic environment near the vessel wall. The aim of this study is to validate the flow of a blood mimic through an arterial stenosis model against a computational fluid dynamics (CFD) package. In the experiment, an axisymmetric model was constructed consisting of contraction and expansion regions that follow the mathematical form of a cosine function. A 30% diameter reduction was used in this study. Particle image velocimetry (PIV) was used to characterize the flow. The fluid consists of rigid spherical particles suspended in a water-glycerol-NaCl mixture. Particles of 20 μm diameter were selected to follow the flow of the fluid. The flow at Re = 155, 270 and 390 was investigated. The experimental results are compared with a FLUENT simulation that uses a viscous laminar flow model. The results suggest that the laminar flow model was sufficient to predict the flow velocity at the inlet, but the velocity at the stenosis throat at Re = 390 was overestimated. Hence, a transition to a turbulent regime may have developed at the throat region as the flow rate increased.
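The abstract does not give the exact functional form of the geometry; the sketch below uses a commonly encountered cosine stenosis profile consistent with the stated 30% diameter reduction, together with the standard Reynolds number definition. All default values are illustrative assumptions:

```python
import math

def stenosis_radius(z, r0=1.0, reduction=0.30, length=2.0):
    # Cosine-shaped axisymmetric stenosis: the radius tapers smoothly from
    # r0 at the ends (|z| >= length/2) to r0*(1 - reduction) at the throat
    # (z = 0), giving a 30% diameter reduction by default.
    if abs(z) >= length / 2.0:
        return r0
    return r0 * (1.0 - (reduction / 2.0)
                 * (1.0 + math.cos(2.0 * math.pi * z / length)))

def reynolds_number(velocity, diameter, density, viscosity):
    # Re = rho * U * D / mu, the parameter setting the Re = 155, 270, 390 cases.
    return density * velocity * diameter / viscosity
```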

Robust Human Rights Governance: Developing International Criteria

Many states are now committed to implementing international human rights standards domestically. In terms of practical governance, how might effectiveness be measured? A face-value answer can be found in domestic laws and institutions relating to human rights. However, this article provides two further tools to help states assess their status on the spectrum from robust to fragile human rights governance. The first recognises that each state has its own 'human rights history' and that the ideal end stage is robust human rights governance; the second is a set of criteria for assessing robustness. Although a New Zealand case study is used to illustrate these tools, the widespread adoption of human rights standards by many states inevitably means that the issues are relevant to other countries, even though there will always be varying degrees of similarity and difference in constitutional background and in developed or emerging human rights systems.

Business Intelligence and Strategic Decision Simulation

The purpose of this study is two-fold. First, it explores potential opportunities for utilizing visual interactive simulation along with Business Intelligence (BI) as a decision support tool for strategic decision making. Second, it tries to identify the essential top-level managerial requirements that would transform strategic decision simulation into an integral component of BI systems. The domain of particular interest is the application of visual interactive simulation capabilities in the field of supply chains. A qualitative exploratory method was applied, through interviews with two leading companies. The collected data was then analysed to demonstrate the difference between the literature perspective and the practical managerial perspective on the issue. The results of the study suggest that although the use of simulation, particularly in managing supply chains, is very evident in the literature, in practice such utilization is still in its infancy, particularly for strategic decisions. Based on these insights, a prototype of a simulation-based BI solution extension was developed and evaluated.

Optimal Embedded Generation Allocation in Distribution System Employing Real Coded Genetic Algorithm Method

This paper proposes a new methodology for the optimal allocation and sizing of Embedded Generation (EG), employing a Real Coded Genetic Algorithm (RCGA) to minimize the total power losses and to improve voltage profiles in radial distribution networks. The RCGA uses continuous floating-point numbers as its representation, unlike the conventional binary encoding. It is used as the solution tool and can determine the optimal location and size of EG in a radial system simultaneously. The method is developed in MATLAB. The effect of EG unit installation and sizing on the distribution network is demonstrated using a 24-bus system.
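A minimal sketch of a real-coded GA of the kind described, with tournament selection, arithmetic (blend) crossover and Gaussian mutation on floating-point chromosomes. The toy two-variable objective stands in for the paper's power-loss objective, and the operator choices and parameter values are our own illustrative assumptions:

```python
import random

def rcga_minimize(objective, bounds, pop_size=30, generations=60, seed=7):
    # Minimal real-coded GA: tournament selection, arithmetic crossover,
    # Gaussian mutation, and elitist survival of the best individuals.
    rng = random.Random(seed)
    pop = [[rng.uniform(l, h) for l, h in bounds] for _ in range(pop_size)]

    def clamp(x, l, h):
        return max(l, min(h, x))

    for _ in range(generations):
        children = []
        for _ in range(pop_size):
            a = min(rng.sample(pop, 3), key=objective)  # tournament of 3
            b = min(rng.sample(pop, 3), key=objective)
            w = rng.random()                            # blend weight
            child = [clamp(w * x + (1 - w) * y + rng.gauss(0, 0.05 * (h - l)),
                           l, h)
                     for x, y, (l, h) in zip(a, b, bounds)]
            children.append(child)
        pop = sorted(pop + children, key=objective)[:pop_size]  # elitism
    return min(pop, key=objective)

# Toy objective standing in for total power loss: minimum at (1.0, 2.0).
best = rcga_minimize(lambda v: (v[0] - 1) ** 2 + (v[1] - 2) ** 2,
                     bounds=[(0.0, 5.0), (0.0, 5.0)])
```

In the paper's setting, a chromosome would instead encode the candidate bus location and EG size, with the objective evaluated by a load-flow computation.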

Using a Trust-Based Environment Key for Mobile Agent Code Protection

Human activities are increasingly based on the use of remote resources and services, and on interaction between remotely located parties that may know little about each other. Mobile agents must be prepared to execute on different hosts with various environmental security conditions. The aim of this paper is to propose a trust-based mechanism to improve the security of mobile agents and allow their execution in various environments. Thus, an adaptive trust mechanism is proposed, based on the dynamic interaction between the agent and the environment. Information collected during the interaction enables the generation of an environment key. This key reflects the host's trust degree and permits the mobile agent to adapt its execution. Trust estimation is based on concrete parameter values; thus, in case of distrust, the source of the problem can be located and an appropriate mobile agent behavior can be selected.
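The key-generation and adaptation steps can be sketched as follows. The parameter names, the SHA-256 choice and the averaging rule are illustrative assumptions, not the paper's actual mechanism:

```python
import hashlib

def environment_key(parameters):
    # Derive a deterministic key from the concrete environment parameters
    # collected during the agent-host interaction.
    blob = "|".join(f"{k}={parameters[k]}" for k in sorted(parameters))
    return hashlib.sha256(blob.encode()).hexdigest()

def trust_degree(scores):
    # Aggregate per-parameter trust scores (each in [0, 1]) into one degree.
    return sum(scores.values()) / len(scores)

def select_behavior(degree, threshold=0.6):
    # The agent adapts its execution to the host's trust degree: full
    # execution on trusted hosts, a restricted mode otherwise.
    return "full_execution" if degree >= threshold else "restricted_execution"
```

Because the degree is built from individual parameter scores, a distrusted host can be traced back to the specific parameter that failed, as the abstract describes.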

Changes to Oxidative Stress Levels Following Exposure to Formaldehyde in Lymphocytes

Formaldehyde is an illegal chemical substance used for food preservation in fish and vegetables. It can promote carcinogenesis. Superoxide dismutases are important antioxidative enzymes that catalyze the dismutation of the superoxide anion into oxygen and hydrogen peroxide. The resultant level of oxidative stress in formaldehyde-treated lymphocytes was investigated. Human lymphocytes were treated with formaldehyde at concentrations of 0, 20, 40, 60, 80 and 120 μmol/L for 12 hours. After the 12-hour treatment, the change in superoxide dismutase activity was measured in the formaldehyde-treated lymphocytes. The results showed that formaldehyde concentrations of 60, 80 and 120 μmol/L significantly decreased superoxide dismutase activity in lymphocytes (P < 0.05). The change in superoxide dismutase activity in formaldehyde-treated lymphocytes may serve as a biomarker for detecting cellular injury, such as damage to DNA, due to formaldehyde exposure.

Climate Change and Environmental Education: The Application of Concept Map for Representing the Knowledge Complexity of Climate Change

Climate change, a topic of high knowledge complexity, has become an essential issue with significant impact on human existence. Therefore, specific national policies, some of which address educational aspects, have been published to confront this imperative problem. Accordingly, this study aims to analyze and integrate the relationship between climate change and environmental education, and to apply the concept-map perspective to represent the knowledge contents and structures of climate change; by doing so, the knowledge contents of climate change can be represented more comprehensively and used as a tool for environmental education. The method adopted for this study is a knowledge conversion model that provided a platform for the participating experts and teachers to cooperate and combine each participant's standpoint into a complete knowledge framework, which is the foundation for structuring the concept map. The result of this research contains the important concepts, the precise propositions and the entire concept map representing the robust concepts of climate change.

Pipelined Control-Path Effects on Area and Performance of a Wormhole-Switched Network-on-Chip

This paper presents the design trade-offs and performance impacts of the number of pipeline stages in the control-path signals of a wormhole-switched network-on-chip (NoC). The number of control-path pipeline stages varies between one and two cycles. The control paths consist of the routing request paths for output selection and the arbitration paths for input selection. Data communications between on-chip routers are implemented synchronously and, for quality of service, the inter-router data transports are controlled by a link-level congestion control to avoid loss of data due to overflow. The trade-off between the area (logic cell area) and the performance (bandwidth gain) of the two proposed NoC router microarchitectures is presented in this paper. The performance evaluation is made using a traffic scenario with different numbers of workloads on a 2D mesh NoC topology with a static routing algorithm. Using a 130-nm CMOS standard-cell technology, our NoC routers can be clocked at 1 GHz, resulting in a high-speed network link and a high router bandwidth capacity of about 320 Gbit/s. Based on our experiments, the number of control-path pipeline stages has a more significant impact on NoC performance than on the logic area of the NoC router.
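The quoted 320 Gbit/s aggregate bandwidth is consistent with, for example, a five-port 2D mesh router with 64-bit links clocked at 1 GHz; the port count and link width here are illustrative assumptions, since the abstract does not state them:

```python
def router_bandwidth_gbit_s(ports, link_width_bits, clock_hz):
    # Aggregate router bandwidth, assuming every port can move one
    # link-width flit per clock cycle.
    return ports * link_width_bits * clock_hz / 1e9

# A 2D mesh router has 5 ports: north, south, east, west, and local.
bw = router_bandwidth_gbit_s(ports=5, link_width_bits=64, clock_hz=1e9)
```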

The Nonlinear Dynamic Elasto-Plastic Analysis for Evaluating the Controlling Effectiveness and Failure Mechanism of the MSCSS

This paper focuses on the performance and characteristics of Mega-Sub Controlled Structure Systems (MSCSS) with regard to the new control principle contained in MSCSS subjected to strong earthquake excitations. The adopted control scheme consists of modulated sub-structures in which the control action is achieved by viscous dampers and the sub-structures' own configuration. The elasto-plastic time history analysis under severe earthquake excitation is carried out based on the Finite Element Analysis Method (FEAM), and some comparative results are also given in this paper. The results show that the MSCSS can reduce vibration effects considerably more than the mega-sub structure (MSS). The study illustrates that the improved MSCSS presents good seismic resistance even at 1.2 g and can absorb seismic energy within the structure, implying that structural member cross-sections can be reduced to achieve good economy. Furthermore, the elasto-plastic analysis demonstrates that the MSCSS is accurate enough with regard to international building evaluation and design codes. This paper also shows that the elasto-plastic dynamic analysis method is a reasonable and reliable analysis method for structures subjected to strong earthquake excitations, and that it yields precise computed results.

Increasing the Efficiency of Rake Receivers for Ultra-Wideband Applications

In diversity-rich environments, such as Ultra-Wideband (UWB) applications, the a priori determination of the number of strong diversity branches is difficult because of the considerably large number of diversity paths, which are characterized by a variety of power delay profiles (PDPs). Several Rake implementations have been proposed in the past in order to reduce the number of estimated and combined paths. To this end, we introduce two adaptive Rake receivers, which combine a subset of the resolvable paths by simultaneously considering both the total combined output signal-to-noise ratio (SNR) and the individual SNR of each path. These schemes achieve better adaptation to channel conditions compared to other known receivers, without further increasing the complexity. Their performance is evaluated in different practical UWB channels, whose models are based on extensive propagation measurements. The proposed receivers strike a compromise between power consumption, complexity and the performance gain from additional paths, resulting in important savings in power and computational resources.
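A greedy sketch of the selection idea: fingers are added only while each path is individually strong enough and still contributes a meaningful relative amount to the combined SNR. The thresholds and the specific selection rule are illustrative assumptions, not the paper's exact schemes:

```python
def select_rake_paths(path_snrs, min_path_snr=0.05, min_relative_gain=0.01):
    # Sort resolvable paths by SNR and keep assigning Rake fingers while
    # (a) the path's own SNR exceeds a floor, and (b) adding it still
    # improves the running combined SNR by a meaningful relative amount.
    selected, total = [], 0.0
    for snr in sorted(path_snrs, reverse=True):
        if snr < min_path_snr:
            break
        if total > 0 and snr / total < min_relative_gain:
            break
        selected.append(snr)
        total += snr
    return selected, total

# Example channel: three strong paths and a long tail of negligible ones.
fingers, combined = select_rake_paths([1.0, 0.5, 0.2, 0.004, 0.001])
```

Stopping early on weak or marginal paths is what yields the power and complexity savings mentioned above, since each extra finger costs channel estimation and combining hardware.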

Dynamic Traffic Simulation for Traffic Congestion Problem Using an Enhanced Algorithm

Traffic congestion has become a major problem in many countries, and one of its main causes is road merges: vehicles tend to move more slowly when they reach a merging point. In this paper, an enhanced traffic simulation algorithm based on the fluid-dynamic algorithm and kinematic wave theory is proposed and used to study traffic congestion at a road merge. The paper also describes the development of a dynamic traffic simulation tool used for scenario planning and for forecasting the level of traffic congestion at a given time from defined parameter values. The tool incorporates the enhanced algorithm as well as the two original algorithms. Output from the three algorithms is measured in terms of traffic queue length, travel time and the total number of vehicles passing through the merging point. The paper also suggests an efficient way of reducing traffic congestion at a road merge by analyzing traffic queue length and travel time.
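A minimal cell-transmission-model update for a merge, in the spirit of the kinematic wave theory mentioned above. The triangular fundamental diagram and the demand-proportional merge rule are standard textbook choices, not necessarily the paper's enhanced algorithm:

```python
def triangular_flux(density, v_free=1.0, w=0.5, rho_jam=1.0):
    # Demand (sending) and supply (receiving) functions of the triangular
    # fundamental diagram used in the cell transmission model (CTM).
    q_max = v_free * w * rho_jam / (v_free + w)
    demand = min(v_free * density, q_max)
    supply = min(w * (rho_jam - density), q_max)
    return demand, supply

def merge_step(rho_a, rho_b, rho_down, dt=0.2, dx=1.0):
    # One CTM update of two branches merging into one downstream cell:
    # combined demand is capped by the downstream supply and split
    # proportionally to each branch's demand.
    d_a, _ = triangular_flux(rho_a)
    d_b, _ = triangular_flux(rho_b)
    _, s = triangular_flux(rho_down)
    total = d_a + d_b
    scale = min(1.0, s / total) if total > 0 else 0.0
    f_a, f_b = d_a * scale, d_b * scale
    rho_a -= dt / dx * f_a
    rho_b -= dt / dx * f_b
    rho_down += dt / dx * (f_a + f_b)
    return rho_a, rho_b, rho_down
```

Iterating this step over a chain of cells produces the queue lengths and travel times used as output measures, since vehicles held back when supply is scarce accumulate upstream of the merge.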

Performance Analysis of Heat Pipe Using Copper Nanofluid with Aqueous Solution of n-Butanol

This study presents the improvement in thermal performance of a heat pipe using a copper nanofluid with an aqueous solution of n-Butanol. Nanofluids, suspensions of nanoparticles in conventional fluids, have the potential for superior heat transfer capability compared to the base fluids due to their improved thermal conductivity. In this work, copper nanoparticles of 40 nm size at a concentration of 100 mg/L are suspended in de-ionized (DI) water and in an aqueous solution of n-Butanol, and these fluids are used as working media in the heat pipe. The study discusses the effect of heat pipe inclination, type of working fluid and heat input on the thermal efficiency and thermal resistance. The experimental results are evaluated in terms of these performance metrics and are compared with those for DI water.
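The two performance metrics mentioned are computed directly from measured quantities; the sketch below uses their standard definitions, and the numbers in the usage line are illustrative values, not measurements from the study:

```python
def thermal_resistance(t_evaporator_c, t_condenser_c, heat_input_w):
    # Standard heat-pipe metric: temperature drop across the pipe per watt.
    return (t_evaporator_c - t_condenser_c) / heat_input_w

def thermal_efficiency(heat_output_w, heat_input_w):
    # Fraction of the supplied heat recovered at the condenser.
    return heat_output_w / heat_input_w

# Illustrative values: 80 degC evaporator, 60 degC condenser, 100 W input.
r = thermal_resistance(80.0, 60.0, 100.0)
```

A lower thermal resistance at the same heat input indicates the improved performance the nanofluid is expected to deliver.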

Formulation Development and Moisturising Effects of a Topical Cream of Aloe vera Extract

This study was designed to formulate and pharmaceutically evaluate a topical skin-care cream (w/o emulsion) of Aloe vera versus its vehicle (Base) as control, and to determine their effects on stratum corneum (SC) water content and transepidermal water loss (TEWL). A Base containing no extract and a Formulation containing 3% concentrated extract of Aloe vera were developed by entrapping the extract in the inner aqueous phase of a w/o emulsion (cream). Lemon oil was incorporated to improve the odor. Both the Base and the Formulation were stored at 8 °C ± 0.1 °C (in a refrigerator), 25 °C ± 0.1 °C, 40 °C ± 0.1 °C, and 40 °C ± 0.1 °C with 75% RH (in an incubator) for a period of 4 weeks to predict their stability. The evaluation parameters consisted of color, smell, type of emulsion, phase separation, electrical conductivity, centrifugation, liquefaction and pH. Both the Base and the Formulation were applied to the cheeks of 21 healthy human volunteers for a period of 8 weeks, and SC water content and TEWL were monitored every week to measure any effect produced by these topical creams. The expected organoleptic stability of the creams was confirmed over the 4-week in-vitro study period. The odor faded with the passage of time due to volatilization of the lemon oil. Both the Base and the Formulation produced significant (p ≤ 0.05) changes in TEWL with respect to time. SC water content was significantly (p ≤ 0.05) increased by the Formulation, while the Base had insignificant (p > 0.05) effects on SC water content. The newly formulated Aloe vera cream is suitable for improving and quantitatively monitoring skin hydration (SC water content/moisturizing effect) and for reducing TEWL in people with dry skin.

Visualized Characterization of Molecular Mobility for Water Species in Foods

Six parameters, the effective diffusivity (De), the activation energy of De, the pre-exponential factor of De, the amount (ASOW) of self-organized water species, and the amplitude (α) of the forced oscillation of the molecular mobility (1/tC) derived from the forced cyclic temperature change operation, were characterized using six typical foods, squid, sardine, scallop, salmon, beef, and pork, as a function of the correlation time (tC) of the protons of the water molecules retained in the foods. Each of the six parameters was clearly divided into the water species A1 and A2 at a specified value of tC = 10^-8 s (= CtC), indicating an anomalous change in the physicochemical nature of the water species at the CtC. The forced oscillation of 1/tC clearly demonstrated a characteristic mode depending on the food, shown as a three-dimensional map relating 1/tC, the amount of self-organized water, and tC.

Optimization of Breast Tumor Cells Isolation Efficiency and Purity by Membrane Filtration

Size-based filtration is one of the common methods employed to isolate circulating tumor cells (CTCs) from whole blood. It is well known that this method suffers from a tradeoff between isolation efficiency and purity, yet this tradeoff is poorly understood. In this paper, we present the design and manufacturing of a special rectangular-slit filter. The filter was designed to retain a maximal number of nucleated cells while minimizing the pressure on the cells, thereby preserving their morphology. The key parameter, the input pressure, was optimized to retain the maximal number of tumor cells whilst maximizing the depletion of normal blood cells (red and white blood cells and platelets). Our results indicate that for a slit geometry of 5 × 40 μm on a 13 mm circular membrane with a fill factor of 21%, a pressure of 6.9 mBar yields the optimum for maximizing isolation of MCF-7 cells and depletion of normal blood cells.