A Method to Predict Trends of Hemorrhage Disease of Grass Carp

Hemorrhage Disease of Grass Carp (HDGC) is a commonly occurring illness in summer, and its extremely high mortality rate results in colossal losses to aquaculture. Because of the complex interactions among the factors that influence aquaculture diseases, no adequate mathematical model for this problem exists at present. A BP neural network, with its excellent nonlinear mapping capability, was therefore adopted to establish such a model. Environmental factors that can be easily measured, such as stocking density, water temperature, pH and light intensity, were set as the main inputs for analysis. Twenty-five groups of experimental data were used for training and testing, and the accuracy of the model in predicting the trend of HDGC was above 80%. It is demonstrated that a BP neural network for predicting HDGC trends is particularly objective and practical, so the approach can be extended to other aquaculture diseases.
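
As a rough illustration of the modelling approach described above, the sketch below trains a small backpropagation (BP) network on the four environmental factors and a binary outbreak-trend label. The feature values, labels and network size are invented placeholders, not the paper's 25 experimental groups.

```python
# Minimal sketch (not the authors' implementation): a backpropagation-trained
# MLP that maps four environmental factors to an outbreak-trend label.
# Feature order and example values are assumptions for illustration only.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Columns: stocking density, water temperature (deg C), pH, light intensity (lux)
X = np.array([
    [12.0, 28.5, 7.2, 800],
    [ 8.0, 24.0, 7.8, 600],
    [15.0, 30.0, 6.9, 900],
    [ 6.0, 22.0, 7.5, 500],
])
y = np.array([1, 0, 1, 0])  # 1 = rising outbreak risk, 0 = stable/low

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(8,), activation="logistic",
                  solver="lbfgs", max_iter=2000, random_state=0),
)
model.fit(X, y)
print(model.predict([[10.0, 29.0, 7.0, 850]]))  # predicted trend class
```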

The Adoption and Diffusion of Electronic Wallets

Despite the strong and consistent increase in the use of electronic payment methods worldwide, the diffusion of electronic wallets is still far from widespread. Analysis of the failure of electronic wallet uptake has either focused on technical issues or chosen to analyse a specific scheme. This article proposes a joint approach to analysing the key factors affecting the adoption of e-wallets by using the Technology Acceptance Model [1], which we have expanded to take into account the cost of using e-wallets. We use this model to analyse Monéo, the only French electronic wallet still in operation.

Comparative Analysis of Different Control Strategies for Electro-hydraulic Servo Systems

The main goal of this study is to analyze the relevant properties of electro-hydraulic systems and, based on that, to make a proper choice of the control strategy to be used for the servomechanism. A combination of electronic and hydraulic systems is widely used since it combines the advantages of both. Hydraulic systems are widespread because of properties such as accuracy, flexibility, a high horsepower-to-weight ratio, fast starting, stopping and reversal with smoothness and precision, and simplicity of operation. Modern control of hydraulic systems, in turn, is based on controlling the current fed to the inductive solenoid that sets the position of the hydraulic valve. Since this current can easily be driven by a PWM (Pulse Width Modulation) signal of a proper frequency, the combination of electrical and hydraulic systems has become very fruitful and usable in areas such as the aircraft and military industries. The study shows and discusses the experimental results obtained with two control strategies, classical feedback (PID) and a neural network, using MATLAB and SIMULINK [1]. Finally, special attention is paid to the design of a neuro-controller, its application to the control of electro-hydraulic systems, and its comparison with classical control.
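
As a minimal sketch of the classical feedback strategy compared in the study, the loop below runs a discrete PID controller against a crude first-order model of a valve-controlled actuator. The gains, time constant and PWM interpretation of the control signal are assumptions for illustration, not the values used in the paper's MATLAB/SIMULINK experiments.

```python
# Illustrative sketch only: a discrete PID loop on a first-order actuator model.
dt, T_end = 0.001, 1.0
Kp, Ki, Kd = 60.0, 20.0, 0.5        # assumed PID gains
tau, gain = 0.05, 1.0               # assumed actuator time constant and gain

setpoint, y, integ, prev_err = 1.0, 0.0, 0.0, 0.0
for step in range(int(T_end / dt)):
    err = setpoint - y
    integ += err * dt
    deriv = (err - prev_err) / dt
    u = Kp * err + Ki * integ + Kd * deriv   # control signal (e.g. PWM duty)
    y += dt * (-y / tau + gain * u / tau)    # first-order plant response
    prev_err = err
print(f"final position: {y:.3f} (target {setpoint})")
```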

Robust Human Rights Governance: Developing International Criteria

Many states are now committed to implementing international human rights standards domestically. In terms of practical governance, how might effectiveness be measured? A face-value answer can be found in domestic laws and institutions relating to human rights. However, this article provides two further tools to help states assess their status on the spectrum from robust to fragile human rights governance. The first recognises that each state has its own 'human rights history' and that the ideal end stage is robust human rights governance; the second develops criteria to assess robustness. Although a New Zealand case study is used to illustrate these tools, the widespread adoption of human rights standards by many states inevitably means that the issues are relevant to other countries, even though there will always be varying degrees of similarity and difference in constitutional background and in developed or emerging human rights systems.

Scale Time Offset Robust Modulation (STORM) in a Code Division Multiaccess Environment

Scale Time Offset Robust Modulation (STORM) [1]–[3] is a high bandwidth waveform design that adds time-scale to embedded reference modulations using only time-delay [4]. In an environment where each user has a specific delay and scale, identification of the user with the highest signal power, and of that user's phase, is facilitated by the STORM processor. Both of these parameters are required in an efficient multiuser detection algorithm. In this paper, the STORM modulation approach is evaluated with a direct sequence spread quadrature phase shift keying (DS-QPSK) system. A misconception about STORM time-scale modulation is that a fine temporal resolution is required at the receiver. STORM is applied to a QPSK code division multiaccess (CDMA) system by modifying the spreading codes. Specifically, the in-phase code uses a typical spreading code, and the quadrature code uses a time-delayed and time-scaled version of the in-phase code. Consequently, the same temporal resolution is required in the receiver before and after the application of STORM. In this paper, the bit error performance of STORM in a synchronous CDMA system is evaluated and compared to theory, and the bit error performance of STORM incorporated in a single-user WCDMA downlink is presented to demonstrate the applicability of STORM in a modern communication system.
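
The following sketch illustrates, under stated assumptions, the spreading-code modification described above: the quadrature code is built as a time-scaled, time-delayed copy of the in-phase code. The code length, delay and scale factor are illustrative values, not the paper's system parameters.

```python
# Sketch under assumptions: build the quadrature spreading code from the
# in-phase code by time-scaling (resampling) and a circular time-delay.
import numpy as np

rng = np.random.default_rng(0)
N = 64
c_i = rng.choice([-1.0, 1.0], size=N)      # in-phase spreading code (chips)

delay_chips = 3        # assumed embedded time delay
scale = 1.02           # assumed time-scale factor (close to 1)

t = np.arange(N)
c_q = np.interp((t / scale) % N, t, c_i)   # time-scaled copy (interpolated)
c_q = np.roll(np.sign(c_q), delay_chips)   # re-quantize to chips and delay

# Spread one QPSK symbol: the I and Q bits ride on the two related codes.
b_i, b_q = 1.0, -1.0
baseband = b_i * c_i + 1j * b_q * c_q
print(baseband[:4])
```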

On the Early Development of Dispersion in Flow through a Tube with Wall Reactions

This is a study on numerical simulation of the convection-diffusion transport of a chemical species in steady flow through a small-diameter tube, which is lined with a very thin layer made up of retentive and absorptive materials. The species may be subject to a first-order kinetic reversible phase exchange with the wall material and to irreversible absorption into the tube wall. Owing to the velocity shear across the tube section, the chemical species may spread out axially along the tube at a rate much larger than that given by molecular diffusion; this process is known as dispersion. While the long-time dispersion behavior, well described by the Taylor model, has been extensively studied in the literature, the early development of the dispersion process is by contrast much less investigated. By early development, we mean a span of time, after the release of the chemical into the flow, that is shorter than or comparable to the diffusion time scale across the tube section. To understand the early development of the dispersion, the governing equations along with the reactive boundary conditions are solved numerically using the Flux Corrected Transport Algorithm (FCTA). The computation has enabled us to investigate the combined effects of the reversible and irreversible wall reactions on the early development of the dispersion coefficient. One of the results shows that the dispersion coefficient may approach its steady-state limit in a short time under the following conditions: (i) a high value of the Damkohler number (say Da ≥ 10); (ii) a small but non-zero value of the absorption rate (say Γ* ≤ 0.5).
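
To make the transport problem concrete, one common way to write it is shown below. The notation is ours rather than the paper's, so the exact form of the phase-exchange and absorption terms should be read as an assumption consistent with the description above.

```latex
% Assumed illustrative form: axisymmetric convection-diffusion in a tube of
% radius a, with reversible phase exchange (rate k, partition alpha) and
% irreversible absorption (rate Gamma) at the wall.
\begin{align}
  \frac{\partial C}{\partial t} + u(r)\,\frac{\partial C}{\partial x}
    &= D\left(\frac{\partial^2 C}{\partial x^2}
       + \frac{1}{r}\frac{\partial}{\partial r}
         \Bigl(r\,\frac{\partial C}{\partial r}\Bigr)\right), \\
  \frac{\partial C_s}{\partial t}
    &= k\,(\alpha C - C_s) \quad \text{at } r = a, \\
  -D\,\frac{\partial C}{\partial r}
    &= \frac{\partial C_s}{\partial t} + \Gamma\, C \quad \text{at } r = a .
\end{align}
```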

Estimation of Buffer Size of Internet Gateway Server via G/M/1 Queuing Model

How to efficiently assign system resources to route client demand through Gateway servers is a challenging problem. In this paper, we present an enhanced proposal for autonomous performance of Gateway servers under highly dynamic traffic loads. We devise a methodology to calculate queue length and waiting time from Gateway server information in order to reduce response time variance in the presence of bursty traffic. The most widespread consideration is performance, because Gateway servers must offer cost-effective and highly available services over long periods, and thus have to be scaled to meet the expected load. Performance measurements can be the basis for performance modeling and prediction. With the help of performance models, performance metrics (such as buffer size estimation and waiting time) can be determined during the development process. This paper describes the queue models that can be applied in the estimation of queue length in order to estimate the final value of the memory size. Both simulation and experimental studies using synthesized workloads, together with analysis of real-world Gateway servers, demonstrate the effectiveness of the proposed system.
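
As a brief worked illustration of the G/M/1 model named in the title, the sketch below solves the standard fixed-point equation for σ and derives the mean waiting time and queue length, which could then inform a buffer-size estimate. The Erlang-2 interarrival distribution and the rates are assumptions, not measurements from the Gateway servers studied.

```python
# Illustrative G/M/1 sketch (not the paper's code): fixed-point solution for
# sigma, then mean waiting time and mean queue length.
lam, mu = 0.8, 1.0                       # arrival and service rates (rho = 0.8)

def lst_interarrival(s, lam=lam, k=2):
    """Laplace-Stieltjes transform of an assumed Erlang-k interarrival time."""
    return (k * lam / (s + k * lam)) ** k

sigma = 0.5
for _ in range(200):                      # fixed-point iteration: sigma = A*(mu(1-sigma))
    sigma = lst_interarrival(mu * (1.0 - sigma))

Wq = sigma / (mu * (1.0 - sigma))         # mean waiting time in queue
Lq = lam * Wq                             # mean queue length (Little's law)
print(f"sigma={sigma:.4f}  Wq={Wq:.3f}  Lq={Lq:.3f}  -> buffer of roughly {Lq:.0f}+ requests")
```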

Analysis of Codebook Based Channel Feedback Techniques for MIMO-OFDM Systems

This paper investigates the performance of a Multiple-Input Multiple-Output (MIMO) feedback system combined with Orthogonal Frequency Division Multiplexing (OFDM). Two types of codebook based channel feedback techniques are used in this work. The first feedback technique uses a combination of both the long-term and short-term channel state information (CSI) at the transmitter, whereas the second technique uses only the short-term CSI. The long-term and short-term CSI at the transmitter are used for efficient channel utilization. OFDM is a powerful technique employed in communication systems suffering from frequency selectivity. Combined with multiple antennas at the transmitter and receiver, OFDM proves to be robust against delay spread. Moreover, it leads to significant data rates with improved bit error performance over links having only a single antenna at both the transmitter and receiver. The effectiveness of these techniques has been demonstrated through the simulation of a MIMO-OFDM feedback system. The results have been evaluated for 4×4 MIMO channels. Simulation results indicate the benefits of the MIMO-OFDM channel feedback system over one without OFDM. A performance gain of about 3 dB is observed for the MIMO-OFDM feedback system compared to the one without OFDM. Hence MIMO-OFDM becomes an attractive approach for future high-speed wireless communication systems.
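
A minimal sketch of the short-term (codebook index) feedback idea follows: the receiver evaluates each unit-norm codeword against the current channel and feeds back only the index of the one with the largest effective gain. The random channel and the 16-entry random codebook are placeholders, not the codebooks evaluated in the paper.

```python
# Minimal sketch, assuming a random 4x4 channel and a small unit-norm codebook.
import numpy as np

rng = np.random.default_rng(1)
Nt = Nr = 4
H = (rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))) / np.sqrt(2)

codebook = [v / np.linalg.norm(v)
            for v in (rng.standard_normal((Nt,)) + 1j * rng.standard_normal((Nt,))
                      for _ in range(16))]

gains = [np.linalg.norm(H @ w) ** 2 for w in codebook]   # effective channel gains
best = int(np.argmax(gains))
print(f"feedback index: {best}, effective gain: {gains[best]:.2f}")
```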

Direct Sequence Spread Spectrum Technique with Residue Number System

In this paper, residue number arithmetic is used in a direct sequence spread spectrum system. The system is evaluated, and its bit error probability is compared to that of a system without a residue number system. The effects of channel bandwidth, PN sequences, multipath, and the modulation scheme are studied. A MATLAB program is developed to measure the signal-to-noise ratio (SNR) and the bit error probability for the various schemes.
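
The sketch below shows the basic residue-number operations assumed by such a system: forward conversion of a value into residues with respect to pairwise coprime moduli and reverse conversion via the Chinese Remainder Theorem. The moduli set is an arbitrary example, not the one used in the paper's simulations.

```python
# Hedged sketch of the residue-number idea: split a value into residues,
# process them independently, and recover the value with the CRT.
from math import prod

moduli = (3, 5, 7)                 # assumed pairwise coprime moduli
M = prod(moduli)                   # dynamic range = 105

def to_rns(x):
    return tuple(x % m for m in moduli)

def from_rns(residues):
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # modular inverse via pow(Mi, -1, m)
    return x % M

x = 42
print(to_rns(x), from_rns(to_rns(x)))  # (0, 2, 0) 42
```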

Futures Trading: Design of a Strategy

The paper describes futures trading and aims to design a speculator's trading strategy. The problem is formulated as a decision-making task and is solved as such. The solution of the task leads to complex mathematical problems, and approximations of the decision making are required. Two kinds of approximation are used in the paper: Monte Carlo simulation for the multi-step prediction and iterations spread in time for the optimization. The solution is applied to real market data, and the results of the off-line experiments are presented.
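
As a hedged illustration of the Monte Carlo multi-step prediction step, the snippet below simulates many price paths from a simple geometric random-walk model and summarizes the terminal price distribution; the drift, volatility and horizon are invented, and the paper's actual prediction model may differ.

```python
# Toy Monte Carlo multi-step prediction under an assumed geometric random walk.
import numpy as np

rng = np.random.default_rng(42)
price0, drift, vol = 100.0, 0.0002, 0.01   # assumed daily drift and volatility
horizon, n_paths = 10, 20_000              # 10-step-ahead prediction

steps = rng.normal(drift, vol, size=(n_paths, horizon))
terminal = price0 * np.exp(steps.sum(axis=1))

print(f"mean={terminal.mean():.2f}  5%={np.percentile(terminal, 5):.2f}  "
      f"95%={np.percentile(terminal, 95):.2f}")
# A long/short/flat decision could then weigh these quantiles against costs.
```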

Analysis of Bit Error Rate (BER) in Cognitive Radio Using the PSO Algorithm

The electromagnetic spectrum is a natural resource, and well-organized usage of this limited resource is a necessity for better communication. The present static frequency allocation schemes cannot accommodate the demands of the rapidly increasing number of higher data rate services. Therefore, dynamic usage of the spectrum must be distinguished from static usage to increase the availability of the frequency spectrum. Cognitive radio is not a single piece of apparatus but a technology that can incorporate components spread across a network. It offers great promise for improving system efficiency, spectrum utilization and application effectiveness, as well as reducing interference and the complexity of usage for users. A cognitive radio is aware of its environment, internal state, and location, and autonomously adjusts its operation to achieve designated objectives. It first senses its spectral environment over a wide frequency band, and then adapts its parameters to maximize spectrum efficiency with high performance. This paper focuses on the analysis of the Bit Error Rate in cognitive radio by using the Particle Swarm Optimization (PSO) Algorithm. The analysis is carried out both theoretically and practically, and the results are interpreted in terms of advantages and drawbacks and of how the BER affects the efficiency and performance of the communication system.
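
To illustrate how PSO can be tied to a BER objective, the sketch below runs a basic particle swarm over a toy power-allocation problem whose cost is the average BPSK BER across assumed sub-channel gains. The scenario, gains and PSO constants are our assumptions, not the setup analyzed in the paper.

```python
# Illustrative PSO sketch: allocate transmit power across sub-channels to
# minimize the average BPSK BER. All constants are assumed.
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(7)
gains = np.array([0.4, 1.0, 2.5, 0.8])      # assumed sub-channel SNR gains
P_total = 4.0                                # total power budget

def avg_ber(p):
    p = np.clip(p, 1e-9, None)
    p = p / p.sum() * P_total                # project onto the power budget
    return np.mean([0.5 * erfc(sqrt(pi * gi)) for pi, gi in zip(p, gains)])

n_particles, n_iter, w, c1, c2 = 20, 200, 0.7, 1.5, 1.5
pos = rng.random((n_particles, gains.size))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([avg_ber(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([avg_ber(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print(f"best average BER: {avg_ber(gbest):.2e}")
```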

Thermal Management of Space Power Electronics using TLM-3D

When designing satellites, one of the major issues, aside from designing the primary subsystems, is to devise their thermal management. The thermal management of satellites requires solving different sets of modelling issues. Even when the body of the satellite is well conditioned, other parts of the satellite may still reach higher temperatures. The main issue of thermal modelling for satellite design is therefore making sure that all points of the satellite remain within the temperature limits for which they were designed. The insertion of power electronics into aerospace technologies is becoming widespread, and the modern electronic systems used in space must be reliable and efficient, with thermal management unaffected by outer space constraints. Many advanced thermal management techniques developed in recent years have application in high power electronic systems. This paper presents a three-dimensional Transmission Line Matrix (3D-TLM) implementation of transient heat flow in space power electronics. In such components, heat dissipation and good thermal management are essential. Simulation provides the cheapest tool to investigate all aspects of power handling. The 3D-TLM has been successful in modeling heat diffusion problems and has proven to be efficient in terms of stability and handling of complex geometry. The results show a three-dimensional visualisation of self-heating phenomena in the device affected by outer space constraints, and possible approaches for increasing the heat dissipation capability of the power modules are presented.
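
For orientation only, the snippet below uses a plain explicit finite-difference scheme, not the paper's 3D-TLM formulation, to step transient heat diffusion in a small 3-D block with a central self-heating source; material constants and the source strength are assumed.

```python
# Simplified stand-in (explicit finite differences, NOT 3D-TLM): transient heat
# diffusion in a 3-D block with a constant central self-heating source.
import numpy as np

alpha = 1e-4          # assumed thermal diffusivity (m^2/s)
dx, dt = 1e-3, 1e-3   # grid spacing and time step
assert alpha * dt / dx**2 <= 1 / 6, "explicit 3-D stability limit"

n = 21
T = np.zeros((n, n, n))           # temperature rise above ambient
q = np.zeros_like(T)
q[n // 2, n // 2, n // 2] = 50.0  # assumed heat-source term (K/s)

for _ in range(500):
    lap = (-6 * T
           + np.roll(T, 1, 0) + np.roll(T, -1, 0)
           + np.roll(T, 1, 1) + np.roll(T, -1, 1)
           + np.roll(T, 1, 2) + np.roll(T, -1, 2)) / dx**2
    T += dt * (alpha * lap + q)
    # hold all outer faces at ambient temperature
    T[0, :, :] = T[-1, :, :] = T[:, 0, :] = T[:, -1, :] = T[:, :, 0] = T[:, :, -1] = 0.0

print(f"peak temperature rise: {T.max():.2f} K")
```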

Implementing an Adaptive Behavior for Spread Spectrum Watermarking Procedures

The advances in multimedia and networking technologies have created opportunities for Internet pirates, who can easily copy multimedia contents and illegally distribute them on the Internet, thus violating the legal rights of content owners. This paper describes how a simple and well-known watermarking procedure, based on a spread spectrum method and watermark recovery by correlation, can be improved to effectively and adaptively protect MPEG-2 videos distributed on the Internet. The procedure, in its simplest form, is vulnerable to a variety of attacks. However, its security and robustness have been increased, and its behavior has been made adaptive with respect to the video terminals used to open the videos and the network transactions carried out to deliver them to buyers. Such an adaptive behavior enables the proposed procedure to embed watermarks efficiently, and this characteristic makes the procedure well suited to web contexts, where watermarks, usually generated from fingerprinting codes, have to be inserted into the distributed videos "on the fly", i.e. during the purchase web transactions.
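
The simple form of the procedure, spread spectrum embedding with correlation-based recovery, can be sketched as follows on a generic coefficient vector (a stand-in for transform coefficients of an MPEG-2 frame); the embedding strength and detection threshold are illustrative assumptions.

```python
# Minimal spread-spectrum watermarking sketch: add a scaled pseudo-random
# pattern and detect it by correlation against a threshold.
import numpy as np

rng = np.random.default_rng(3)
coeffs = rng.normal(0, 10, size=4096)        # stand-in for transform coefficients
watermark = rng.choice([-1.0, 1.0], size=coeffs.size)
alpha = 0.8                                  # assumed embedding strength

marked = coeffs + alpha * watermark          # embedding

def detect(signal, wm, threshold=0.4):
    corr = np.dot(signal, wm) / wm.size      # correlation statistic
    return corr, corr > threshold

print(detect(marked, watermark))             # high correlation: watermark present
print(detect(coeffs, watermark))             # low correlation: watermark absent
```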

The Development of the Multi-Agent Classification System (MACS) in Compliance with FIPA Specifications

The paper investigates the feasibility of constructing a software multi-agent based monitoring and classification system and utilizing it to provide automated and accurate classification of end users developing applications in the spreadsheet domain. The agents function autonomously to provide continuous and periodic monitoring of Excel spreadsheet workbooks. The result is the Multi-Agent Classification System (MACS), which complies with the specifications of the Foundation for Intelligent Physical Agents (FIPA). Different technologies have been brought together to build MACS. The strength of the system is the integration of the agent technology and the FIPA specifications with other technologies, namely Windows Communication Foundation (WCF) services, Service Oriented Architecture (SOA), and Oracle Data Mining (ODM). Microsoft .NET Windows-service-based agents were utilized to develop the monitoring agents of MACS, while the .NET WCF services, together with the SOA approach, allow distribution of and communication between agents over the WWW in order to support the monitoring and classification of multiple developers. ODM was used to automate the classification phase of MACS.

A Web Oriented Spread Spectrum Watermarking Procedure for MPEG-2 Videos

In the last decade, digital watermarking procedures have become increasingly applied to implement the copyright protection of multimedia digital contents distributed on the Internet. To this end, it is worth noting that many of the watermarking procedures for images and videos proposed in the literature are based on spread spectrum techniques. However, some scepticism about the robustness and security of such watermarking procedures has arisen because of some documented attacks which claim to render the inserted watermarks undetectable. On the other hand, web content providers wish to exploit watermarking procedures characterized by flexible and efficient implementations that can be easily integrated into their existing web services frameworks or platforms. This paper presents how a simple spread spectrum watermarking procedure for MPEG-2 videos can be modified to be exploited in web contexts. To this end, the proposed procedure has been made secure and robust against some well-known and dangerous attacks. Furthermore, its basic scheme has been optimized by making the insertion procedure adaptive with respect to the terminals used to open the videos and the network transactions carried out to deliver them to buyers. Finally, two different implementations of the procedure have been developed: the former is a high performance parallel implementation, whereas the latter is a portable Java and XML based implementation. Thus, the paper demonstrates that a simple spread spectrum watermarking procedure, with limited and appropriate modifications to the embedding scheme, can still represent a valid alternative to many other well-known and more recent watermarking procedures proposed in the literature.

Effects of pH, Temperature, Enzyme and Substrate Concentration on Xylooligosaccharides Production

Agricultural residues such as oil palm fronds (OPF) are cheap, widespread and available throughout the year. Hemicelluloses extracted from OPF can be hydrolyzed to their monomers and used in the production of xylooligosaccharides (XOs). The objective of the present study was to optimize the enzymatic hydrolysis of OPF hemicellulose by varying pH, temperature, and enzyme and substrate concentration for the production of XOs. Hemicellulose was extracted from OPF using 3 M potassium hydroxide (KOH) at a temperature of 40°C for 4 h with stirring at 400 rpm. The hemicellulose was then hydrolyzed using Trichoderma longibrachiatum xylanase at different pH, temperature, and enzyme and substrate concentrations. XOs were characterized based on reducing sugar determination. The optimum conditions to produce XOs from OPF hemicellulose were obtained at pH 4.6, a temperature of 40°C, an enzyme concentration of 2 U/mL and a 2% substrate concentration. The results establish the suitability of oil palm fronds as a raw material for the production of XOs.

User Experience Evolution Lifecycle Framework

Perceptions of quality from both the designers' and the users' perspectives have now stretched beyond traditional usability, incorporating abstract and subjective concepts. This has led to a shift in the focus of human-computer interaction research communities: a shift toward achieving user experience (UX) by not only fulfilling conventional usability needs but also those that go beyond them. The term UX, although widespread and given significant importance, lacks a unified, consensual definition. In this paper, we survey various UX definitions and modeling frameworks and examine them as the foundation for proposing a UX evolution lifecycle framework for understanding UX in detail. In the proposed framework, we identify the building blocks of UX and discuss how UX evolves in various phases. The framework can be used as a tool to understand experience requirements and evaluate them, resulting in better UX design and hence improved user satisfaction.

ARMrayan Multimedia Mobile CMS: a Simplified Approach towards Content-Oriented Mobile Application Designing

The ARMrayan Multimedia Mobile CMS (Content Management System) is the first mobile CMS that gives users the opportunity to create multimedia J2ME mobile applications with their desired content, design and logo, simply and without any need to write even a line of code. The low-level programming and compatibility problems of J2ME, along with UI designing difficulties, make it hard for most people, even programmers, to broadcast their content to the widespread mobile phones used by nearly everyone. This system provides user-friendly, PC-based tools for creating a tree index of pages and inserting multiple multimedia contents (e.g. sound, video and pictures) into each page to create a J2ME mobile application. The output is a standalone Java mobile application that has a user interface, shows texts and pictures, and plays music and videos regardless of the type of device used, as long as the device supports the J2ME platform. Bitmap fonts have also been used, so Middle Eastern languages can be easily supported on all mobile phone devices. We omitted programming concepts for users in order to simplify multimedia content-oriented mobile application design for use in educational, cultural or marketing centers. Ordinary operators can now create a variety of multimedia mobile applications such as tutorials, catalogues, books, and guides in minutes rather than months. Simplicity and power have been the goals of this CMS. In this paper, we present the software engineering concepts behind the design of the ARMrayan MCMS along with the implementation challenges faced and the solutions adopted.

Energy Based Temperature Profile for Heat Transfer Analysis of Concrete Section Exposed to Fire on One Side

For fire safety purposes, the fire resistance and the structural behavior of reinforced concrete members are assessed to satisfy specific fire performance criteria. The available prescriptive provisions are based on the standard fire load. Under various fire scenarios, engineers need both heat transfer analysis and structural analysis. For the heat transfer analysis, this study proposes a modified finite difference method to evaluate the temperature profile within a cross section. The research is limited to concrete sections exposed to fire on one side. The method is based on the energy conservation principle and a pre-determined power function of the temperature profile. A power value of 2.7 is found to be suitable for concrete sections. The temperature profiles of the proposed method deviate only slightly from those of the experiment, the FEM and the FDM for various fire loads such as ASTM E 119, ASTM 1529, BS EN 1991-1-2 and 550 °C. The proposed method is useful to avoid the inconvenience of the large matrix system of the typical finite difference method when solving for the temperature profile. Furthermore, design engineers can simply apply the proposed method in regular spreadsheet software.
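
Purely as an illustrative reading of how an energy balance can be combined with a power-function temperature profile, one possible form is given below; the symbols and the exact functional form are our assumptions, and only the exponent 2.7 comes from the study.

```latex
% Illustrative only: assumed power-law rise above the initial temperature T_0
% within a heated depth \delta(t), with T_s(t) the exposed-surface temperature,
% and an energy balance equating stored heat to the net surface heat flux q_s(t).
\begin{align}
  T(x,t) - T_0 &= \bigl(T_s(t) - T_0\bigr)\left(1 - \frac{x}{\delta(t)}\right)^{2.7},
  \qquad 0 \le x \le \delta(t), \\
  \frac{\mathrm{d}}{\mathrm{d}t}\int_0^{\delta(t)} \rho\, c\,\bigl(T(x,t) - T_0\bigr)\,\mathrm{d}x
  &= q_s(t).
\end{align}
```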

Optimization of Process Parameters of Pressure Die Casting using Taguchi Methodology

The present work analyses different parameters of pressure die casting to minimize casting defects. Pressure die casting is usually applied for the casting of aluminium alloys. A good surface finish with the required tolerances and dimensional accuracy can be achieved by optimization of controllable process parameters such as solidification time, molten metal temperature, filling time, injection pressure and plunger velocity. Moreover, by selection of optimum process parameters, pressure die casting defects such as porosity, insufficient spread of molten material, and flash are also minimized. Therefore, a pressure die cast component, a carburetor housing of aluminium alloy (Al2Si2O5), has been considered. The effects of the selected process parameters on casting defects, and the subsequent setting of the parameter levels, have been accomplished using Taguchi's parameter design approach. The experiments have been performed as per the combinations of levels of the different process parameters suggested by an L18 orthogonal array. Analyses of variance have been performed for the mean and the signal-to-noise ratio to estimate the percent contribution of the different process parameters. A confidence interval has also been estimated at a 95% confidence level, and three confirmation experiments have been performed to validate the optimum levels of the different parameters. Overall, a 2.352% reduction in defects has been observed with the suggested optimum process parameters.
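
For reference, the smaller-the-better signal-to-noise ratio used in a Taguchi analysis of defect data can be computed as below; the replicate values are made-up placeholders, not the paper's measurements.

```python
# Smaller-the-better S/N ratio for one orthogonal-array run (placeholder data).
import numpy as np

y = np.array([3.2, 2.8, 3.5])              # replicated defect measurements

sn_smaller_better = -10 * np.log10(np.mean(y ** 2))
print(f"S/N (smaller-the-better): {sn_smaller_better:.2f} dB")
# Repeating this for every L18 run, then averaging S/N per factor level,
# identifies the level combination that minimizes casting defects.
```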