Stepsize Control of the Finite Difference Method for Solving Ordinary Differential Equations

An important task in solving second-order linear ordinary differential equations by the finite difference method is to choose a suitable stepsize h. In this paper, using stochastic arithmetic, the CESTAC method and the CADNA library, we present a procedure to estimate the optimal stepsize h_opt, the stepsize which minimizes the global error consisting of truncation and round-off errors.
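The trade-off behind h_opt can be illustrated with a minimal sketch in ordinary double precision (not the paper's stochastic-arithmetic procedure): for a central second difference, the truncation error shrinks like h^2 while the round-off error grows like 1/h^2, so the measured error is minimized at an intermediate stepsize. The test problem y = sin(x) is an assumption chosen for illustration.

```python
import math

def second_derivative_error(f, d2f, x, h):
    """Central-difference estimate of f''(x) and its absolute error."""
    approx = (f(x + h) - 2.0 * f(x) + f(x - h)) / (h * h)
    return abs(approx - d2f(x))

# Scan candidate stepsizes spanning several decades and keep the best one.
x = 1.0
hs = [10.0 ** (-k) for k in range(1, 9)]
errors = [second_derivative_error(math.sin, lambda t: -math.sin(t), x, h) for h in hs]
h_opt = hs[errors.index(min(errors))]
```

Neither the largest nor the smallest h wins: very small stepsizes are dominated by round-off, very large ones by truncation, which is exactly why an estimate of h_opt is needed.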

Block Cipher Based on Randomly Generated Quasigroups

Quasigroups are algebraic structures closely related to Latin squares and have many different applications. The construction of the block cipher presented here is based on quasigroup string transformations. This article describes a block cipher based on a quasigroup of order 256, suitable for fast software encryption of messages written in the universal ASCII code. The novelty of this cipher lies in the fact that every time the cipher is invoked, a new pair of randomly generated quasigroups is used, which in turn is used to create a pair of quasigroups with dual operations. The cryptographic strength of the block cipher is examined by calculating the XOR-distribution tables. In this approach, some algebraic operations allow quasigroups of huge order to be used without any requirement to store them.
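The basic quasigroup string transformation and its inverse via the dual (left-division) operation can be sketched as follows. The Latin square here is built from cyclic shifts of a random permutation, a deliberately simple construction rather than the paper's random generation scheme, and the leader value 7 is an arbitrary illustrative choice.

```python
import random

def make_latin_square(n, rng):
    # Cyclic shifts of a random permutation: a simple (not fully random) Latin square.
    p = list(range(n))
    rng.shuffle(p)
    return [[p[(a + b) % n] for b in range(n)] for a in range(n)]

def left_division(q):
    # Dual operation: ldiv[a][c] = b  <=>  q[a][b] = c
    n = len(q)
    ldiv = [[0] * n for _ in range(n)]
    for a in range(n):
        for b in range(n):
            ldiv[a][q[a][b]] = b
    return ldiv

def e_transform(q, leader, msg):
    # b_1 = leader * a_1,  b_i = b_{i-1} * a_i
    out, prev = [], leader
    for m in msg:
        prev = q[prev][m]
        out.append(prev)
    return out

def d_transform(ldiv, leader, cipher):
    # Inverts e_transform using the left-division quasigroup.
    out, prev = [], leader
    for c in cipher:
        out.append(ldiv[prev][c])
        prev = c
    return out

rng = random.Random(1)
q = make_latin_square(256, rng)
ldiv = left_division(q)
msg = list(b"QUASIGROUP")
cipher = e_transform(q, 7, msg)
plain = d_transform(ldiv, 7, cipher)
```

Because each output symbol feeds into the next transformation step, a single changed message byte propagates through the rest of the ciphertext, which is the property the string transformation contributes to the cipher.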

Properties of Al2O3 – hBN Composites

Alumina matrix composites with an addition of hexagonal boron nitride (hBN), acting as a solid lubricant, were produced. The main purpose of solid lubricants is to eliminate the need for cooling lubricants in the machining process. Hot pressing was used as the consolidation process for Al2O3–x wt.% hBN (x = 1, 2.5, 5, 7.5, 10) composites. Properties of the sinters such as relative density, hardness, Young's modulus and fracture toughness were examined. The obtained samples are characterized by high relative density. The hardness and fracture toughness values allow the use of alumina–hBN composites for machining steels, even in the hardened condition. However, it was observed that a high weight content of hBN can negatively influence the mechanical properties of the composites.

Designing of the Heating Process for Fiber-Reinforced Thermoplastics with Middle-Wave Infrared Radiators

Manufacturing components from fiber-reinforced thermoplastics requires three steps: heating the matrix, forming and consolidating the composite, and finally cooling the matrix. For the heating process, a pre-determined temperature distribution through the layers and the thickness of the pre-consolidated sheets is required to enable the forming mechanism. Thus, a design method for the heating process used in forming composites with thermoplastic matrices is necessary. To obtain a constant temperature through the thickness and width of the sheet, the heating process was analyzed with the help of the finite element method. The simulation models were validated by experiments with resistance thermometers as well as with an infrared camera. Based on the finite element simulation, heating methods for infrared radiators have been developed. Using numerical simulation, many iteration loops are required to determine the process parameters. Hence, a model for calculating the relevant process parameters was initiated by applying regression functions.

A Markov Chain Model for Load-Balancing Based and Service Based RAT Selection Algorithms in Heterogeneous Networks

The Next Generation Wireless Network (NGWN) is expected to be a heterogeneous network which integrates different Radio Access Technologies (RATs) through a common platform. A major challenge is how to allocate users to the most suitable RAT. An optimized solution can maximize the efficient use of radio resources, achieve better performance for service providers, and provide Quality of Service (QoS) to users at low cost. Currently, Radio Resource Management (RRM) is implemented efficiently only for the RAT for which it was developed and is not suitable for a heterogeneous network. Common RRM (CRRM) was proposed to manage radio resource utilization in heterogeneous networks. This paper presents a user-level Markov model for a network of three co-located RATs. The load-balancing based and service based CRRM algorithms are studied using the presented Markov model, and their performance is compared in terms of traffic distribution, new call blocking probability, vertical handover (VHO) call dropping probability and throughput.
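As a toy illustration of the quantities such a Markov model yields, the sketch below computes the new-call blocking probability of a single RAT with c channels and offered load a erlangs via the Erlang-B recursion (the M/M/c/c birth-death chain in steady state), together with a load-balancing style selection that admits a call on the least-utilized RAT. This is a single-RAT simplification assumed for illustration, not the paper's three-RAT model.

```python
def erlang_b(a, c):
    """Blocking probability for offered load a (erlangs) on c channels,
    computed with the numerically stable Erlang-B recursion."""
    b = 1.0
    for k in range(1, c + 1):
        b = a * b / (k + a * b)
    return b

def pick_rat(occupancy, capacity):
    """Load-balancing selection: admit the new call on the RAT
    with the lowest current utilization."""
    utils = [o / cap for o, cap in zip(occupancy, capacity)]
    return utils.index(min(utils))

p_block = erlang_b(2.0, 5)          # ~3.7% blocking at 2 erlangs on 5 channels
chosen = pick_rat([3, 5, 2], [10, 10, 4])
```

A service-based policy would replace `pick_rat` with a mapping from service class to a preferred RAT ordering; the Markov model then lets both policies be compared on the same blocking and dropping metrics.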

RBF Modeling of Incipient Motion of Plane Sand Bed Channels

To define or predict incipient motion in an alluvial channel, most investigators use a standard or modified form of Shields' diagram. Shields' diagram does provide a process to determine the incipient motion parameters, but an iterative one. To design properly (without iteration), one needs an additional resistance equation, and the absence of a universal resistance equation magnifies the difficulty of defining the model. The neural network technique, which is particularly useful in modeling complex processes, is presented as a complementary tool for modeling incipient motion. The present work develops a neural network model employing the RBF network to predict the average velocity u and water depth y based on experimental data on the incipient condition. Based on the model, design curves have been presented for field application.
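The core of an RBF network of the kind described, a weighted sum of Gaussian basis functions whose weights are obtained by solving a linear system on the training data, can be sketched in a few lines. The one-dimensional synthetic data (fitting y = x^2) stands in for the experimental incipient-motion data and is an assumption for illustration.

```python
import math

def gauss_solve(A, b):
    # Naive Gaussian elimination with partial pivoting.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rbf_fit(xs, ys, width):
    # Solve the interpolation system: sum_j w_j * phi(|x_i - x_j|) = y_i
    phi = lambda a, b: math.exp(-((a - b) / width) ** 2)
    A = [[phi(xi, xj) for xj in xs] for xi in xs]
    return gauss_solve(A, ys)

def rbf_predict(xs, w, width, x):
    return sum(wi * math.exp(-((x - xi) / width) ** 2) for wi, xi in zip(w, xs))

xs = [0.0, 0.25, 0.5, 0.75, 1.0]
ys = [x * x for x in xs]
w = rbf_fit(xs, ys, width=0.5)
pred = rbf_predict(xs, w, 0.5, 0.3)
```

In the paper's setting, the inputs would be the incipient-condition variables and the outputs the average velocity u and depth y; the fitted network then plays the role of the non-iterative design curves.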

Performance of a Transcritical CO2 Heat Pump for Simultaneous Water Cooling and Heating

This paper presents experimental as well as simulated performance studies of a transcritical CO2 heat pump for simultaneous water cooling and heating; the effects of the water mass flow rates and water inlet temperatures of both the evaporator and the gas cooler on the cooling and heating capacities, system COP and water outlet temperatures are investigated. The study shows that both the water mass flow rate and the inlet temperature have significant effects on system performance. Test results show that the effect of the evaporator water mass flow rate on the system performance and water outlet temperatures is more pronounced (COP increases by 0.6 per 1 kg/min) than that of the gas cooler water mass flow rate (COP increases by 0.4 per 1 kg/min), and the effect of the gas cooler water inlet temperature is more significant (COP decreases by 0.48 over the given range) than that of the evaporator water inlet temperature (COP increases by 0.43 over the given range). Comparisons of experimental values with simulated results show maximum deviations of 5% for cooling capacity, 10% for heating capacity, and 16% for system COP. This study offers useful guidelines for selecting an appropriate water mass flow rate to obtain the required system performance.

Mechanisms Involved in Organic Solvent Resistance in Gram-Negative Bacteria

The worldwide interest in research on moderately halophilic, solvent-tolerant bacteria isolated from polluted marine environments is due to their high biotechnological potential and to the prospect of their application in different remediation technologies. Using enrichment procedures, two moderately halophilic, organic-solvent-tolerant Gram-negative bacterial strains were isolated from a seawater sample. The cell tolerance, adhesion and cell viability of Aeromonas salmonicida IBBCt2 and Pseudomonas aeruginosa IBBCt3 in the presence of organic solvents depend not only on the solvent's physicochemical properties and concentration, but also on the specific response of the cells, and the cellular response is not the same for the two strains. n-Hexane, n-heptane and propylbenzene, with log POW between 3.69 and 4.39, were less toxic for Aeromonas salmonicida IBBCt2 and Pseudomonas aeruginosa IBBCt3 than toluene, styrene, the xylene isomers and ethylbenzene, with log POW between 2.64 and 3.17. The results indicated that Aeromonas salmonicida IBBCt2 is more susceptible to organic solvents than Pseudomonas aeruginosa IBBCt3. The mechanisms underlying solvent tolerance (e.g., the existence of efflux pumps) in Aeromonas salmonicida IBBCt2 and Pseudomonas aeruginosa IBBCt3 were also studied.

Lattice Boltzmann Simulation of Binary Mixture Diffusion Using Modern Graphics Processors

A highly optimized implementation of binary mixture diffusion with no initial bulk velocity on graphics processors is presented. The lattice Boltzmann model is employed for simulating the binary diffusion of oxygen and nitrogen into each other with different initial concentration distributions. Simulations have been performed using the latest proposed lattice Boltzmann model that satisfies both the indifferentiability principle and the H-theorem for multi-component gas mixtures. Contemporary numerical optimization techniques such as memory alignment and increasing the multiprocessor occupancy are exploited along with some novel optimization strategies to enhance the computational performance on graphics processors using the C for CUDA programming language. Speedup of more than two orders of magnitude over single-core processors is achieved on a variety of Graphical Processing Unit (GPU) devices ranging from conventional graphics cards to advanced, high-end GPUs, while the numerical results are in excellent agreement with the available analytical and numerical data in the literature.
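Stripped of the GPU optimizations and the multi-component physics, the stream-and-collide loop that lattice Boltzmann diffusion codes are built on can be sketched for single-component diffusion on a one-dimensional D1Q3 lattice. The lattice size, relaxation time and step-profile initial condition are illustrative assumptions; the paper's model treats a binary O2/N2 mixture with a multi-component collision operator on GPUs.

```python
# D1Q3 lattice: velocities 0, +1, -1 with weights 2/3, 1/6, 1/6.
# Diffusivity in lattice units is (tau - 0.5) / 3.
N, tau, steps = 64, 1.0, 200
w = [2 / 3, 1 / 6, 1 / 6]
C0 = [1.0 if N // 4 <= i < 3 * N // 4 else 0.0 for i in range(N)]
f = [[w[k] * C0[i] for i in range(N)] for k in range(3)]

for _ in range(steps):
    C = [f[0][i] + f[1][i] + f[2][i] for i in range(N)]
    # BGK collision: relax each population toward the local equilibrium w_k * C.
    for k in range(3):
        for i in range(N):
            f[k][i] += (w[k] * C[i] - f[k][i]) / tau
    # Streaming with periodic boundaries.
    f[1] = [f[1][(i - 1) % N] for i in range(N)]
    f[2] = [f[2][(i + 1) % N] for i in range(N)]

C = [f[0][i] + f[1][i] + f[2][i] for i in range(N)]
```

Each time step conserves the total concentration exactly while smoothing the initial step profile; on a GPU, the per-site independence of collision and the regular memory pattern of streaming are what make the coalesced, high-occupancy implementations described in the abstract possible.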

The Design of the HL7 RIM-based Sharing Components for Clinical Information Systems

The American Health Level Seven (HL7) Reference Information Model (RIM) consists of six backbone classes that have different specialized attributes. Furthermore, to enforce semantic expression, specific mandatory vocabulary domains have been defined for representing the content values of some attributes. Because of the variety of workflows, most hospitals spend a duplicated effort in time and labor to develop and modify their Clinical Information Systems (CIS). This study attempts to design and develop sharing RIM-based components of the CIS for the different business processes, so that the CIS contains data of a consistent format and type. Programmers can perform transactions with the RIM-based clinical repository through the sharing RIM-based components, and when developing functions of the CIS, the sharing components can also be adopted in the system. These components not only satisfy physicians' needs in using a CIS but also reduce the time needed to develop new components of a system. All in all, this study provides a new viewpoint: integrating the data and functions with the business processes is an easy and flexible approach to building a new CIS.

The Benefit of Green Logistics to Organization

This research studied green logistics, the benefits an organization can expect when it adopts green logistics, and the green logistics activities an organization should focus on during implementation. The study found that the benefit an organization obtains from green logistics comes through logistics management, namely a more efficient process of managing the product from producer to customer: reduced production costs, increased value added, energy savings, and environmental protection. The study also found that organizations apply green logistics to logistics activities throughout the supply chain, from downstream to upstream, to protect the environment, as follows: 1) the purchasing process, through enhanced trade facilitation such as business-to-business (B2B) information technology links; 2) the production process, improved through business logistics improvement; and 3) the warehouse management process, such as recycled packaging, moving goods into the warehouse, transporting goods, and planning inbound receiving and product delivery.

Technological Innovation Capabilities and Firm Performance

Technological innovation capability (TIC) is defined as a comprehensive set of characteristics of a firm that facilitates and supports its technological innovation strategies. An audit to evaluate the TICs of a firm may trigger improvement in its future practices. Such an audit can be used by the firm for self-assessment or by a third party for independent assessment to identify problems in its capability status. This paper attempts to develop such an auditing framework that can help to determine the subtle links between innovation capabilities and business performance, and to enable the auditor to determine whether good practice is in place. The seven TICs in this study comprise learning, R&D, resource allocation, manufacturing, marketing, organization and strategic planning capabilities. Empirical data were acquired through a survey of 200 manufacturing firms in the Hong Kong/Pearl River Delta (HK/PRD) region. Structural equation modelling was employed to examine the relationships among the TICs and various performance indicators: sales performance, innovation performance, product performance, and sales growth. The results revealed that different TICs have different impacts on different performance measures, with organization capability having the most influential impact. Hong Kong manufacturers are now facing the challenge of high-mix, low-volume customer orders; to cope with this change, a good capability for organizing different activities among various departments is critical to the success of a company.

Assessment of Reliability and Quality Measures in Power Systems

The paper presents new results of a recent industry-supported research and development study in which an efficient framework for evaluating practical and meaningful power system reliability and quality indices was applied. The system-wide integrated performance indices are capable of addressing and revealing areas of deficiencies and bottlenecks, as well as redundancies, in the composite generation-transmission-demand structure of large-scale power grids. The technique utilizes a linear programming formulation, which simulates practical operating actions and offers a general and comprehensive framework to assess the harmony and compatibility of generation, transmission and demand in a power system. Practical applications to a reduced system model as well as to a portion of the Saudi power grid are also presented in the paper for demonstration purposes.

Capacitor Placement in Radial Distribution System for Loss Reduction Using Artificial Bee Colony Algorithm

This paper presents a new method which applies the artificial bee colony (ABC) algorithm to capacitor placement in distribution systems, with the objectives of improving the voltage profile and reducing power loss. The ABC algorithm is a population-based metaheuristic approach inspired by the intelligent foraging behavior of honeybee swarms. One advantage of the ABC algorithm is that it does not require external parameters such as the crossover rate and mutation rate, as in the case of genetic algorithms and differential evolution; such parameters are, moreover, hard to determine in advance. Another advantage is that global search ability is implemented in the algorithm through a neighborhood-source production mechanism, which is similar to the mutation process. To demonstrate the validity of the proposed algorithm, computer simulations are carried out on a 69-bus system and the results are compared with other approaches available in the literature. The proposed method outperformed the other methods in terms of solution quality and computational efficiency.
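A minimal version of the ABC loop, with employed, onlooker and scout phases and the neighborhood-source rule v_ij = x_ij + phi * (x_ij - x_kj), can be sketched on a toy objective. The sphere function and all parameter values below are illustrative assumptions; the paper applies ABC to a power-loss objective on a 69-bus feeder.

```python
import random

def abc_minimize(f, dim, lo, hi, n_sources=10, cycles=200, limit=20, seed=1):
    """Minimize a nonnegative objective f over [lo, hi]^dim with a basic ABC loop."""
    rng = random.Random(seed)
    new_source = lambda: [rng.uniform(lo, hi) for _ in range(dim)]
    xs = [new_source() for _ in range(n_sources)]
    cost = [f(x) for x in xs]
    trials = [0] * n_sources

    def try_improve(i):
        # Neighborhood-source production: v_ij = x_ij + phi * (x_ij - x_kj)
        k = rng.choice([j for j in range(n_sources) if j != i])
        j = rng.randrange(dim)
        v = xs[i][:]
        v[j] += rng.uniform(-1.0, 1.0) * (xs[i][j] - xs[k][j])
        v[j] = min(max(v[j], lo), hi)
        fv = f(v)
        if fv < cost[i]:                      # greedy selection
            xs[i], cost[i], trials[i] = v, fv, 0
        else:
            trials[i] += 1

    for _ in range(cycles):
        for i in range(n_sources):            # employed bees
            try_improve(i)
        for _ in range(n_sources):            # onlookers: fitness-proportional roulette
            weights = [1.0 / (1.0 + c) for c in cost]
            r = rng.uniform(0.0, sum(weights))
            i = 0
            while r > weights[i] and i < n_sources - 1:
                r -= weights[i]
                i += 1
            try_improve(i)
        for i in range(n_sources):            # scouts replace exhausted sources
            if trials[i] > limit:
                xs[i] = new_source()
                cost[i] = f(xs[i])
                trials[i] = 0

    best = min(range(n_sources), key=lambda i: cost[i])
    return xs[best], cost[best]

x_best, f_best = abc_minimize(lambda x: sum(t * t for t in x), dim=2, lo=-5.0, hi=5.0)
```

In the capacitor-placement setting, a food source would encode candidate capacitor sizes/locations and f would be the power loss (plus voltage-limit penalties) from a load-flow run; the loop itself is unchanged.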

Sous Vide Packaging Technology Application for Salad with Meat in Mayonnaise Shelf Life Extension

Experiments were carried out at the Department of Food Technology, Latvia University of Agriculture. The aim of this work was to assess the effect of sous vide packaging on the shelf life of salad with meat in mayonnaise at different storage temperatures. Samples were evaluated after 0, 1, 3, 7, 10, 15, 18, 25, 29, 42, and 52 days of storage at temperatures of +4±0.5 ºC and +10±0.5 ºC. Experimentally, the quality of the salad with meat in mayonnaise was characterized by measuring colour, pH and microbiological properties. The sous vide packaging was effective in protecting the product from physical, chemical, and microbial quality degradation, and significantly reduced microbial growth at storage temperatures of +4±0.5 ºC and +10±0.5 ºC. Moreover, it was possible to extend the product shelf life to 52 days even when stored at +10±0.5 ºC.

DNS of a Laminar Separation Bubble

Direct numerical simulation (DNS) is used to study the evolution of a boundary layer that is laminar initially, then separates and subsequently reattaches owing to the generation of turbulence. This creates a closed region of recirculation, known as the laminar separation bubble. The present simulation emulates the flow environment encountered on a modern LP turbine blade, where a laminar separation bubble may occur on the suction surface. The unsteady, incompressible, three-dimensional (3-D) Navier-Stokes (NS) equations have been solved over a flat plate in Cartesian coordinates. The adverse pressure gradient, which causes the flow to separate, is created by a boundary condition. The separated shear layer undergoes transition through the appearance of Λ-vortices, whose stretching creates longitudinal streaks. Breakdown of the streaks into small, irregular structures makes the flow turbulent downstream.

A Novel Receiver Algorithm for Coherent Underwater Acoustic Communications

In this paper, we propose a novel receiver algorithm for coherent underwater acoustic communications. The proposed receiver is composed of three parts: (1) Doppler tracking and correction, (2) time reversal (TR) channel estimation and combining, and (3) joint iterative equalization and decoding (JIED). To reduce computational complexity and optimize the equalization algorithm, TR channel estimation and combining is adopted to simplify the multi-channel adaptive decision feedback equalizer (ADFE) into a single-channel ADFE without reducing system performance. Simultaneously, turbo theory is adopted to form a joint iterative ADFE and convolutional decoder (JIED). In the JIED scheme, the ADFE and the decoder exchange soft information in an iterative manner, which can enhance the equalizer performance using the decoding gain. The simulation results show that the proposed algorithm reduces computational complexity and improves the equalizer performance; therefore, the performance of coherent underwater acoustic communications can be improved greatly.

Equivalence Class Subset Algorithm

The equivalence class subset algorithm is a powerful tool for solving a wide variety of constraint satisfaction problems and is based on the use of a decision function which has a very high, but not perfect, accuracy. Perfect accuracy is not required in the decision function, as even a suboptimal solution contains valuable information that can be used to help find an optimal solution. In the hardest problems, the decision function can break down, leading to a suboptimal solution with more equivalence classes than necessary, which can be viewed as a mixture of good and bad decisions. By choosing a subset of the decisions made in reaching a suboptimal solution, an iterative technique can lead to an optimal solution through a series of steadily improving suboptimal solutions. The goal is to reach an optimal solution as quickly as possible. Various techniques for choosing the decision subset are evaluated.

Arterial CO2 Pressure Drives Ventilation with a Time Delay during Recovery from an Impulse-like Exercise without Metabolic Acidosis

We investigated the hypothesis that arterial CO2 pressure (PaCO2) drives ventilation (V̇E) with a time delay during recovery from short impulse-like exercise (10 s) with a workload of 200 W. V̇E and end-tidal CO2 pressure (PETCO2) were measured continuously during the rest, warming-up, exercise and recovery periods. PaCO2 was predicted (PaCO2 pre) from PETCO2 and tidal volume (VT). PETCO2 and PaCO2 pre peaked at 20 s of recovery. V̇E increased and peaked at the end of exercise and then decreased during recovery; however, it peaked again at 30 s of recovery, which was 10 s later than the peak of PaCO2 pre. The relationship between V̇E and PaCO2 pre was not significant when using data obtained at the same time, but was significant when the V̇E data were taken 10 s later than the PaCO2 pre data. The results support our hypothesis that PaCO2 drives V̇E with a time delay.
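The kind of delayed-relationship analysis described, correlating the response against the driver shifted by candidate lags and picking the lag with the strongest correlation, can be sketched as below. The synthetic PaCO2 pulse, the linear ventilation response and the 3 s (6-sample) delay are assumptions for illustration, not the measured data.

```python
import math

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def best_lag(driver, response, max_lag):
    # Correlate response[t] against driver[t - lag] for each candidate lag.
    scores = {}
    for lag in range(max_lag + 1):
        scores[lag] = pearson(driver[:len(driver) - lag], response[lag:])
    return max(scores, key=scores.get)

# Synthetic example: a PaCO2 pulse and a ventilation response delayed by 6 samples.
t = [i * 0.5 for i in range(100)]                       # 0.5 s sampling interval
paco2 = [40 + 5 * math.exp(-((x - 10) ** 2) / 8) for x in t]
ve = [10 + 0.8 * (p - 40) for p in paco2]
ve = [ve[0]] * 6 + ve[:-6]                              # impose a 3 s delay
lag = best_lag(paco2, ve, max_lag=20)
```

Applied to the real V̇E and PaCO2 pre series, the lag that maximizes the correlation is exactly the time delay the abstract reports (10 s): the zero-lag correlation is weak, and it becomes strong only once the response is shifted back by the delay.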

Multiple Job-Shop Scheduling Using a Hybrid Heuristic Algorithm

In this paper, multi-processor job shop scheduling problems are solved by a heuristic algorithm based on a hybrid of priority dispatching rules within an ant colony optimization algorithm. The objective function is to minimize the makespan, i.e. the total completion time, and the simultaneous presence of various kinds of pheromones is allowed. By using a suitable hybrid of priority dispatching rules, the process of finding the best solution is improved. The ant colony optimization algorithm not only promotes the ability of the proposed algorithm, but also decreases the total working time, because of decreased setup times and a modified working production line; thus, similar jobs share the same production lines. Another advantage of this algorithm is that similar (though not identical) machines can be considered, so these machines are able to process a job with different processing and setup times. To evaluate this capability, a number of test problems are solved and the associated results are analyzed. The results show a significant decrease in throughput time. They also show that this algorithm is able to recognize the bottleneck machine and to schedule jobs in an efficient way.
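The core idea of hybridizing dispatching rules with ant colony optimization, letting pheromone levels bias the choice among priority rules and reinforcing the rules that produce good schedules, can be sketched on a toy single-machine problem. The rule set, the total-completion-time objective and all parameters are illustrative assumptions, far simpler than the paper's multi-processor job shop.

```python
import random

RULES = {
    "spt": lambda jobs: sorted(jobs),                # shortest processing time first
    "lpt": lambda jobs: sorted(jobs, reverse=True),  # longest processing time first
    "fifo": lambda jobs: list(jobs),                 # arrival order
}

def total_completion_time(seq):
    done, total = 0, 0
    for p in seq:
        done += p
        total += done
    return total

def aco_rule_search(jobs, iters=300, rho=0.1, q=100.0, seed=1):
    rng = random.Random(seed)
    tau = {name: 1.0 for name in RULES}              # pheromone per dispatching rule
    best_rule, best_cost = None, float("inf")
    for _ in range(iters):
        # Pheromone-proportional roulette choice of a dispatching rule.
        r = rng.uniform(0.0, sum(tau.values()))
        for name in RULES:
            r -= tau[name]
            if r <= 0.0:
                break
        cost = total_completion_time(RULES[name](jobs))
        if cost < best_cost:
            best_rule, best_cost = name, cost
        for n in tau:                                # evaporation
            tau[n] *= 1.0 - rho
        tau[name] += q / cost                        # reinforce the rule just used
    return best_rule, best_cost

rule, cost = aco_rule_search([4, 2, 7, 1, 3])
```

Because better schedules deposit more pheromone, the search concentrates on the rule that minimizes the objective (here SPT, which is optimal for total completion time on a single machine); in the full job-shop setting, the same mechanism is applied per decision point rather than globally.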