Effects of Polymers and Alkaline on Recovery Improvement from Fractured Models

In this work, several ASP (alkaline-surfactant-polymer) solutions were flooded into fractured models initially saturated with heavy oil, at a constant flow rate and with different geometrical characteristics of the fracture. The ASP solutions were formulated from two polymers, i.e. a synthetic hydrolyzed polyacrylamide and a biopolymer, together with a surfactant and two types of alkaline. The results showed that using the synthetic hydrolyzed polyacrylamide increases ultimate oil recovery, whereas the type of alkaline does not play a significant role in oil recovery. In addition, the position of the injection well with respect to the fracture system has a remarkable effect on ASP flooding. For instance, increasing the angle between the fractures and the mean flow direction yields higher oil recovery and delays the breakthrough time. This work can be regarded as a comprehensive survey of ASP flooding that considers most of the effective factors in this chemical EOR method.

Force Analysis of an Automated Rapid Maxillary Expansion (ARME) Appliance

An Automated Rapid Maxillary Expander (ARME) is a specially designed microcontroller-based orthodontic appliance that overcomes the shortcomings of traditional maxillary expansion appliances. The new device operates by automatically widening the maxilla (upper jaw) through expansion of the midpalatal suture [1]. The ARME appliance developed here combines a modified butterfly expander appliance, a micro gear, a micro motor, and a microcontroller to automatically produce light, continuous pressure to expand the maxilla. For this study, the functionality of the system was verified through laboratory tests by measuring the force applied to the teeth each time the maxilla expands. The laboratory test results show that the developed appliance consistently meets the desired performance specifications.

Performance Evaluation of Improved Ball End Magnetorheological Finishing Process

A novel nanofinishing process using an improved ball-end magnetorheological (MR) finishing tool was developed for finishing flat as well as 3D surfaces of ferromagnetic and non-ferromagnetic workpieces. In this process, a magnetically controlled ball end of smart MR polishing fluid is generated at the tip surface of the tool and used as the finishing medium; it is guided over the surface to be finished by a computer-controlled three-axis motion controller. Experiments were performed on a ferromagnetic workpiece surface in the developed MR finishing setup to study the effect of finishing time on the final surface roughness. The surface morphology was observed under a scanning electron microscope and an atomic force microscope. A final surface finish as low as 19.7 nm was obtained from an initial surface roughness of 142.9 nm. The newly developed finishing process should find applications in the aerospace, automotive, die and mold manufacturing, semiconductor, and optics machining industries.

The Impact of Semantic Web on E-Commerce

Semantic Web technologies enable machines to interpret data published on the web in a machine-interpretable form. At present, only human beings are able to understand the product information published online. The emerging Semantic Web technologies have the potential to deeply influence the further development of the Internet economy. In this paper we propose a scenario-based research approach to predict the effects of these new technologies on electronic markets and on the business models of traders, intermediaries, and customers. Over 300 million searches are conducted every day on the Internet by people trying to find what they need. A majority of these searches are in the domain of consumer e-commerce, where a web user is looking for something to buy. This represents a huge cost in terms of person-hours and an enormous drain on resources. Agent-enabled semantic search will have a dramatic impact on the precision of these searches, and it can reduce and possibly eliminate the information asymmetry in which a better-informed buyer gets the best value. By affecting this key determinant of market prices, the Semantic Web will foster the evolution of different business and economic models. We submit that there is a need for developing these futuristic models based on our current understanding of e-commerce models and nascent Semantic Web technologies. We believe these business models will encourage mainstream web developers and businesses to join the “semantic web revolution.”

Computer-aided Lenke Classification of Scoliotic Spines

The identification and classification of spine deformity play an important role in surgical planning for adolescent patients with idiopathic scoliosis. The subject of this article is the Lenke classification of scoliotic spines using Cobb angle measurements. The purpose is two-fold: (1) design a rule-based diagram to assist clinicians in the classification process and (2) investigate a computer classifier that improves classification time and accuracy. The efficiency of the rule-based diagram was evaluated in a series of scoliotic classifications by 10 clinicians. The computer classifier was tested on a radiographic measurement database of 603 patients. Classification accuracy was 93% using the rule-based diagram and 99% for the computer classifier. Both the computer classifier and the rule-based diagram can efficiently assist clinicians in their Lenke classification of spinal scoliosis.
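
The rule-based diagram itself is not reproduced here, but its flavor can be sketched in code. The Python fragment below is an illustrative reduction of the Lenke curve-type rules, not the paper's actual diagram: the 25° structural threshold matches the published side-bending criterion, while the handling of the major curve is deliberately simplified, and the example Cobb angles are hypothetical.

```python
def structural(bending_cobb_deg, threshold=25.0):
    """A curve is treated as structural if its side-bending Cobb angle
    stays at or above the 25-degree threshold."""
    return bending_cobb_deg >= threshold

def lenke_curve_type(pt, mt, tl, major):
    """Illustrative reduction of the Lenke curve-type rules.
    pt, mt, tl: structural flags of the proximal thoracic, main thoracic,
    and thoracolumbar/lumbar curves; major: "MT" or "TL", the largest curve.
    The main thoracic curve is assumed structural whenever it is major."""
    if major == "MT":
        if pt and tl:
            return 4           # triple major
        if pt:
            return 2           # double thoracic
        if tl:
            return 3           # double major
        return 1               # main thoracic
    if major == "TL":
        return 6 if mt else 5  # TL/L with / without a structural MT curve
    return None

# Hypothetical side-bending Cobb angles for the three curve regions.
pt_cobb, mt_cobb, tl_cobb = 18.0, 52.0, 30.0
major = "MT" if mt_cobb >= tl_cobb else "TL"
curve_type = lenke_curve_type(structural(pt_cobb), structural(mt_cobb),
                              structural(tl_cobb), major)
print(curve_type)
```

Encoding the rules as an explicit lookup is what allows a computer classifier to apply them deterministically, which is consistent with the higher accuracy reported for the automated version.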

A Subtractive Clustering Based Approach for Early Prediction of Fault Proneness in Software Modules

In this paper, a subtractive-clustering-based fuzzy inference system is used for early detection of faults in function-oriented software systems. The approach has been tested on real defect datasets from the NASA software projects PC1 and CM1. Both a code-based model and a joined model (a combination of requirement-based and code-based metrics) of the datasets are used for training and testing of the proposed approach. The performance of the models is recorded in terms of accuracy, MAE, and RMSE values, and is better for the joined model. As evidenced by the results obtained, clustering and fuzzy logic together provide a simple yet powerful means of modeling early fault detection in function-oriented software systems.
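
The center-selection step of subtractive clustering, which seeds the fuzzy inference system's rules, can be sketched compactly. The sketch below implements Chiu's potential-based selection on toy data; the radii and the two-blob data are hypothetical stand-ins, not the PC1/CM1 metrics.

```python
import numpy as np

def subtractive_clustering(X, ra=0.3, rb=None, eps=0.15):
    """Chiu's subtractive clustering: each data point gets a 'potential'
    from its neighbors; the highest-potential point becomes a center, and
    potentials near it are subtracted before picking the next center."""
    if rb is None:
        rb = 1.5 * ra
    alpha, beta = 4.0 / ra**2, 4.0 / rb**2
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    P = np.exp(-alpha * d2).sum(axis=1)                  # initial potentials
    p_first, centers = P.max(), []
    while True:
        k = int(P.argmax())
        if P[k] < eps * p_first or len(centers) >= len(X):
            break
        centers.append(X[k])
        P = P - P[k] * np.exp(-beta * d2[k])             # revise potentials
    return np.array(centers)

# Two well-separated blobs should yield exactly two rule centers.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.2, 0.03, (30, 2)),
               rng.normal(0.8, 0.03, (30, 2))])
centers = subtractive_clustering(X)
print(len(centers))
```

Each discovered center would then become the premise of one fuzzy rule, with membership functions centered on it; the fault datasets simply replace the toy blobs with module metric vectors.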

Real Time Monitoring of Long Slender Shaft by Distributed-Lumped Modeling Techniques

The aim of this paper is to determine, in real time, the stress levels at the end of a long slender shaft, such as a drilling assembly used in the oil and gas industry, using a mathematical model. The torsional deflection experienced by this type of drilling shaft (a hollow shaft about 4 km long and 20 cm in diameter, with a wall thickness of 1 cm) can only be determined using a distributed modeling technique. The main objective is to calculate the angular velocity and torque at the end of the shaft by the TLM method and to analyze the behavior of the system through its transient response. The results are compared with those of a lumped modeling technique; the significance of the results becomes evident from this comparison. The two models have different transient responses, and because of the length of the shaft the transient response is especially important in this application.

Isolation and Identification of an Acetobacter Strain from Iranian White-Red Cherry with High Acetic Acid Productivity as a Potential Strain for Cherry Vinegar Production in Food and Agriculture Biotechnology

According to the FDA (Food and Drug Administration of the United States), vinegar is defined as a sour liquid containing at least 4 grams of acetic acid per 100 cubic centimeters of solution (a 4% acetic acid solution), produced from sugary materials by alcoholic fermentation. Depending on their microbial starters, vinegars can contain more than 50 types of volatile and aromatic substances that are responsible for their taste and smell. The vinegar industry today represents a substantial share of agricultural, food, and microbial biotechnology. The acetic acid bacteria belong to the family Acetobacteraceae. According to the latest edition of Bergey's Manual of Systematic Bacteriology, which categorizes bacteria on the basis of their 16S rRNA differences, the most important acetic acid genera are Acetobacter (genus I), Gluconacetobacter (genus VIII), and Gluconobacter (genus IX). The genus Acetobacter, which is primarily used in vinegar manufacturing plants, is a Gram-negative, obligately aerobic, coccus- or rod-shaped bacterium of size 0.6-0.8 × 1.0-4.0 μm, nonmotile or motile with peritrichous flagella, and biochemically catalase-positive and oxidase-negative. Some strains are overoxidizers that can convert acetic acid to carbon dioxide and water. In this research, a native Acetobacter strain with high acetic acid productivity was isolated from Iranian white-red cherry. We used two specific culture media, Carr medium [yeast extract, 3%; ethanol, 2% (v/v); bromocresol green, 0.002%; agar, 2%; distilled water, 1000 ml] and Frateur medium [yeast extract, 10 g/l; CaCO3, 20 g/l; ethanol, 20 g/l; agar, 20 g/l; distilled water, 1000 ml], as well as an industrial culture medium. In addition to high acetic acid production and a high growth rate, this strain showed good tolerance to ethanol, which was examined using modified Carr media with 5%, 7%, and 9% ethanol concentrations.
While industrial strains of acetic acid bacteria grow in the thermal range of 28-30 °C, this strain adapted to growth at 34-36 °C after a 96-hour incubation period. These characteristics suggest a potentially valuable biotechnological strain for the production of cherry vinegar with a sweet smell and nutritional properties different from those of existing vinegar types. The lack of growth after 24, 48, and 72 hours of incubation at 34-36 °C, followed by growth after 96 hours, indicates good and fast thermal flexibility, a significant characteristic of biotechnological and industrial strains.

A New Approach to Polynomial Neural Networks based on Genetic Algorithm

Recently, much attention has been devoted to advanced techniques of system modeling. The polynomial neural network (PNN) is a GMDH-type (Group Method of Data Handling) algorithm and a useful method for modeling nonlinear systems, but PNN performance depends strongly on the number of input variables and the order of the polynomials, which are normally determined by trial and error. In this paper, we introduce the genetic polynomial neural network (GPNN) to improve the performance of PNN. GPNN determines the number of input variables and the order of each neuron using a genetic algorithm (GA): the GA searches over all possible values for the number of input variables and the polynomial order. GPNN performance is evaluated on two nonlinear systems: a quadratic equation and the Dow Jones stock index time series.
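
The structure search that GPNN delegates to the GA can be illustrated on a toy problem. In the sketch below, the genome encoding (input-selection bits plus a polynomial order) and the least-squares fitness are simplified stand-ins for the paper's PNN neurons; the data set is synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: y depends only on x0 (linearly) and x2 (quadratically).
X = rng.uniform(-1, 1, (200, 4))
y = 1.0 + 2.0 * X[:, 0] + 3.0 * X[:, 2] ** 2

def fitness(genome):
    """genome = 4 input-selection bits + polynomial order in {1, 2, 3}.
    Fit a polynomial model on a training split, score on validation,
    and add a small penalty per selected input."""
    mask, order = genome[:4], genome[4]
    cols = [i for i in range(4) if mask[i]]
    if not cols:
        return 1e9
    feats = [np.ones(len(X))]
    for i in cols:
        for p in range(1, order + 1):
            feats.append(X[:, i] ** p)
    A = np.column_stack(feats)
    tr, va = slice(0, 150), slice(150, 200)
    coef, *_ = np.linalg.lstsq(A[tr], y[tr], rcond=None)
    err = np.mean((A[va] @ coef - y[va]) ** 2)
    return err + 0.01 * len(cols)

def mutate(g):
    g = list(g)
    i = rng.integers(0, 5)
    if i < 4:
        g[i] = 1 - g[i]                  # flip one input-selection bit
    else:
        g[4] = int(rng.integers(1, 4))   # resample the polynomial order
    return tuple(g)

# Simple (mu + lambda) evolution over structure genomes.
pop = [tuple(rng.integers(0, 2, 4)) + (int(rng.integers(1, 4)),) for _ in range(20)]
for _ in range(30):
    pop += [mutate(g) for g in pop]
    pop = sorted(set(pop), key=fitness)[:20]

best = pop[0]
print(best)
```

The GA reliably recovers the structure {x0, x2} with order at least 2, i.e., exactly the choices a PNN designer would otherwise have to find by trial and error.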

A Type of Urban Genesis in the Romanian Outer-Carpathian Area: The Genoese Cities

The Mongol expansion westward, and the political and commercial interests arising from the antagonism between the Golden Horde and the Persian Ilkhanate, turned the Black Sea into a hub of international trade beginning in the last third of the 13th century. As the Volga khanate drew the maritime power of Genoa into its transcontinental project of diverting the Silk Road to its own benefit, Genoa took full advantage of the new historical conjuncture, to the detriment of its rival, Venice. As a consequence, Genoa established important urban centers, mainly commercial in character, on the Pontic shores. In the Romanian outer-Carpathian area, Vicina, Cetatea Albă, and Chilia are notable, representing distinct and important types of cities within the broader typology of Romanian medieval urban genesis.

Application of the Data Distribution Service for Flexible Manufacturing Automation

This paper discusses the applicability of the Data Distribution Service (DDS) to the development of automated and modular manufacturing systems that require a flexible and robust communication infrastructure. DDS is an emerging standard for data-centric publish/subscribe middleware that provides an infrastructure for platform-independent, many-to-many communication. It particularly addresses the needs of real-time systems that require deterministic data transfer, low memory footprints, and high robustness. After an overview of the standard, several aspects of DDS are related to current challenges in the development of modern manufacturing systems with distributed architectures. Finally, an example application based on a modular active fixturing system is presented to illustrate the described aspects.
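
To make the publish/subscribe pattern concrete, here is a minimal in-process sketch in Python. It mimics two DDS ideas, topic-based decoupling and a keep-last history QoS, but it is not the DDS API; the topic name and samples are hypothetical, and a real system would use a standard-compliant DDS implementation.

```python
from collections import deque

class Topic:
    """Minimal sketch of DDS-style data-centric pub/sub: publishers write
    samples to a named topic; subscribers receive them without knowing who
    published. A bounded history mimics a keep-last-N HISTORY QoS, and
    replaying it to late joiners loosely mimics DURABILITY."""
    def __init__(self, name, history_depth=10):
        self.name = name
        self.history = deque(maxlen=history_depth)
        self.readers = []

    def write(self, sample):
        self.history.append(sample)
        for reader in self.readers:
            reader(sample)

    def subscribe(self, reader):
        for sample in self.history:   # late joiners replay retained samples
            reader(sample)
        self.readers.append(reader)

# Hypothetical fixture-state topic for a modular fixturing cell.
fixture_state = Topic("FixtureState", history_depth=3)
received = []
fixture_state.write({"clamp": 1, "closed": True})   # published before joining
fixture_state.subscribe(received.append)            # late-joining subscriber
fixture_state.write({"clamp": 2, "closed": False})
print(received)
```

The decoupling shown here is what makes module reconfiguration cheap: adding or removing a fixture module changes who publishes on the topic, not the wiring between modules.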

Comanche – A Compiler-Driven I/O Management System

Most scientific programs have large input and output data sets that require out-of-core programming or the use of virtual memory management (VMM). Out-of-core programming is error-prone and tedious and is therefore generally avoided; however, in many instances VMM is not an effective approach because it often results in a substantial performance reduction. In contrast, compiler-driven I/O management allows a program's data sets to be retrieved in parts, called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a compiler combined with a user-level runtime system that can replace standard VMM for out-of-core programs. We describe Comanche and demonstrate on a number of representative problems that it substantially outperforms VMM. Significantly, our system requires neither special services from the operating system nor modification of the operating system kernel.
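
The block/tile retrieval idea can be illustrated with a small Python sketch. This is a user-level illustration of the concept, not Comanche itself: the data set is processed in fixed-size blocks so that only one block is resident in memory at a time, which is exactly the access pattern a compiler-managed cache would generate automatically.

```python
import os
import tempfile
import numpy as np

def blocked_sum(path, block_elems=4096, dtype=np.float64):
    """Reduce a large binary array by reading fixed-size blocks (tiles),
    so memory use is bounded by one block regardless of file size."""
    itemsize = np.dtype(dtype).itemsize
    total = 0.0
    with open(path, "rb") as f:
        while True:
            buf = f.read(block_elems * itemsize)
            if not buf:
                break
            total += float(np.frombuffer(buf, dtype=dtype).sum())
    return total

# Demo: a data set spanning several blocks, reduced block-wise.
path = os.path.join(tempfile.mkdtemp(), "dataset.bin")
data = np.arange(10_000, dtype=np.float64)
data.tofile(path)
result = blocked_sum(path)
print(result)   # equals data.sum()
os.remove(path)
```

The point of a compiler-driven system is that the programmer writes the whole-array loop and the compiler plus runtime insert the blocking shown above, avoiding both the tedium of hand-written out-of-core code and the page-fault overhead of VMM.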

Entropy Based Spatial Design: A Genetic Algorithm Approach (Case Study)

We study the spatial design of experiments, in which the goal is to select a maximally informative subset of prespecified size from a set of correlated random variables. The problem arises in many applied domains, such as meteorology, environmental statistics, and statistical geology, where observations can be collected at different locations and possibly at different times. In spatial design, when the design region and the set of interest are discrete, the covariance matrix completely determines the entropy-based objective function, and our goal is to choose a feasible design that minimizes the resulting uncertainty. The problem is recast as maximizing the determinant of the covariance matrix of the chosen subset, which is NP-hard. When such designs are used in computer experiments, the design space is often very large and calculating the exact optimal solution is not possible. Heuristic optimization methods can discover efficient experimental designs in situations where traditional designs cannot be applied, exchange methods are ineffective, and exact solution is infeasible. We therefore developed a genetic algorithm (GA) to exploit the exploratory power of this class of methods, and we demonstrate its successful application in a large design space on a real design-of-experiments case.
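
A compact sketch of the approach is to encode each candidate design as a k-subset of sites and let a GA maximize the log-determinant of the corresponding covariance submatrix. The covariance model and site layout below are hypothetical, not the case-study data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Covariance of n candidate monitoring sites, decaying with distance.
n, k = 30, 5                      # choose k of n sites
pts = rng.uniform(0, 1, (n, 2))
d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
C = np.exp(-3.0 * d)              # a valid correlation matrix

def log_det(subset):
    """Log-determinant (entropy, up to constants) of the chosen submatrix."""
    sign, ld = np.linalg.slogdet(C[np.ix_(subset, subset)])
    return ld if sign > 0 else -np.inf

def mutate(subset):
    """Swap one selected site for one unselected site."""
    s = set(subset)
    out = rng.choice(sorted(s))
    inn = rng.choice(sorted(set(range(n)) - s))
    s.discard(out)
    s.add(inn)
    return tuple(sorted(s))

# (mu + lambda) GA over k-subsets, maximizing the log-determinant.
pop = [tuple(sorted(rng.choice(n, k, replace=False))) for _ in range(15)]
for _ in range(60):
    pop += [mutate(p) for p in pop]
    pop = sorted(set(pop), key=log_det, reverse=True)[:15]

best = pop[0]
print(best, round(log_det(best), 3))
```

The resulting design spreads the chosen sites apart (low mutual correlation means a determinant close to its maximum), which is the intuitive signature of an entropy-optimal monitoring network.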

Study of the Effect of Tower Grounding Resistance on Back Flashover for a 500 kV Transmission Line in Thailand Using ATP/EMTP

This study analyzes the effect of tower grounding resistance on the back flashover voltage across insulator strings in a transmission system. It considers the 500 kV transmission line from Mae Moh, Lampang to Nong Chok, Bangkok, Thailand, a double circuit on shared steel towers with two overhead ground wires. The factors studied include the magnitude and the front time of the lightning stroke. The steel tower is represented by a multistory tower model. The study assumes return stroke currents in the range 1-200 kA and front times between 1 μs and 3 μs. The simulations examine how varying the tower grounding resistance affects the lightning current, and the results are analyzed for the lightning overvoltages that cause back flashover at the insulator strings. This study clarifies the causes of back flashover problems in transmission line systems and can serve as a guideline for solving the problem in 500 kV transmission line systems.

The Removal of Cu (II) Ions from Aqueous Solutions on Synthetic Zeolite NaA

In this study, the adsorption of Cu(II) ions from aqueous solutions on synthetic zeolite NaA was evaluated. The effect of solution temperature and the kinetic parameters of the adsorption of Cu(II) on zeolite NaA are important for understanding the adsorption mechanism. The variables of the system include adsorption time, temperature (293-328 K), initial solution concentration, and pH. The sorption kinetics of the copper ions were found to be strongly dependent on pH (optimum pH 3-5), solute concentration, and temperature. The pseudo-second-order model was the best choice among all the kinetic models for describing the adsorption of Cu(II) onto zeolite NaA, suggesting that the adsorption mechanism might be a chemisorption process. The activation energy of adsorption (Ea) for Cu(II) was determined to be 13.5 kJ mol⁻¹; this low value indicates that the Cu(II) adsorption process on zeolite NaA may be an activated chemical adsorption. The thermodynamic parameters (ΔG°, ΔH°, and ΔS°) were also determined from the temperature dependence. The results show that the adsorption of Cu(II) is a spontaneous and endothermic process and that a rise in temperature favors adsorption.
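
The two quantitative steps reported, fitting the pseudo-second-order model and extracting the activation energy from an Arrhenius plot, can be reproduced on synthetic data. In the Python sketch below, the uptake curve and the rate constants are hypothetical; only the 13.5 kJ mol⁻¹ value from the text is reused, as the ground truth of the synthetic Arrhenius data.

```python
import numpy as np

R = 8.314  # gas constant, J mol-1 K-1

def fit_pso(t, q):
    """Pseudo-second-order linearization: t/q = 1/(k*qe^2) + t/qe, so a
    linear fit of t/q vs t yields qe from the slope and k from the intercept."""
    slope, intercept = np.polyfit(t, t / q, 1)
    qe = 1.0 / slope
    k = 1.0 / (intercept * qe**2)
    return k, qe

# Synthetic uptake curve (hypothetical qe = 12 mg/g, k = 0.05 g mg-1 min-1),
# using the integrated pseudo-second-order form q = qe^2*k*t / (1 + qe*k*t).
t = np.linspace(1, 120, 40)
qe_true, k_true = 12.0, 0.05
q = qe_true**2 * k_true * t / (1 + qe_true * k_true * t)
k_fit, qe_fit = fit_pso(t, q)

# Arrhenius plot: the slope of ln k vs 1/T is -Ea/R; here the 13.5 kJ/mol
# value from the study is used to generate the synthetic rate constants.
T = np.array([293.0, 308.0, 318.0, 328.0])
Ea_true, A = 13.5e3, 2.0
k_T = A * np.exp(-Ea_true / (R * T))
slope, _ = np.polyfit(1.0 / T, np.log(k_T), 1)
Ea_fit = -slope * R
print(round(qe_fit, 2), round(k_fit, 3), round(Ea_fit / 1e3, 1))
```

On noise-free data both fits recover the generating parameters exactly; with real measurements the same linearizations give the reported kinetic and activation-energy estimates.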

An Enhanced Artificial Neural Network for Air Temperature Prediction

The mitigation of crop loss due to damaging freezes requires accurate air temperature prediction models. An improved model for temperature prediction in Georgia was developed by including information on seasonality and modifying the parameters of an existing artificial neural network model. Alternative models were compared by instantiating and training multiple networks for each model. The inclusion of up to 24 hours of prior weather information, and of inputs reflecting the day of year, was among the improvements that reduced the average four-hour prediction error by 0.18°C compared to the prior model. The results strongly suggest that model developers should instantiate and train multiple networks with different initial weights to establish appropriate model parameters.
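
The recommendation to instantiate and train multiple networks can be demonstrated with a toy model. The sketch below (a tiny numpy network on synthetic hour-of-day and day-of-year data; none of it is the paper's actual model or data set) trains the same architecture from five different random initializations and reports the spread of training errors.

```python
import numpy as np

def train_mlp(X, y, hidden=8, lr=0.05, epochs=400, seed=0):
    """One-hidden-layer regressor trained by batch gradient descent;
    the seed controls only the random initial weights."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)
        err = (H @ W2 + b2) - y[:, None]
        dH = (err @ W2.T) * (1.0 - H**2)     # backprop through tanh
        W2 -= lr * H.T @ err / len(X); b2 -= lr * err.mean(0)
        W1 -= lr * X.T @ dH / len(X); b1 -= lr * dH.mean(0)
    return float(np.sqrt((err**2).mean()))   # training RMSE

# Synthetic "temperature anomaly" driven by hour-of-day and day-of-year,
# encoded with the cyclic inputs the abstract describes.
rng = np.random.default_rng(42)
hour = rng.uniform(0, 24, 200)
doy = rng.uniform(0, 365, 200)
X = np.column_stack([np.sin(2 * np.pi * hour / 24),
                     np.cos(2 * np.pi * hour / 24),
                     np.sin(2 * np.pi * doy / 365)])
y = 8.0 * X[:, 0] + 5.0 * X[:, 2]

# Instantiate and train several networks with different initial weights.
rmses = [train_mlp(X, y, seed=s) for s in range(5)]
print([round(r, 3) for r in rmses])
```

The run-to-run spread in the printed RMSEs is the point: a single initialization can land on an unrepresentative solution, so comparing model variants fairly requires averaging over several instantiations.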

Belief Theory-Based Classifiers Comparison for Static Human Body Postures Recognition in Video

This paper presents results from various classifiers in a system that automatically recognizes four static human body postures in video sequences: standing, sitting, squatting, and lying. Three classifiers are considered: a naïve one and two based on belief theory. The belief-theory-based classifiers use either a classic or a restricted plausibility criterion to make a decision after data fusion. The data come from 2D segmentation of the people and from localization of their faces, and the measurements consist of distances relative to a reference posture. The efficiency and the limits of the different classifiers in the recognition system are highlighted through the analysis of a large number of results. The system allows real-time processing.
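
The fusion-then-decision step can be illustrated with Dempster's rule of combination followed by a plausibility-based decision. The mass assignments below are hypothetical stand-ins for the segmentation and face-localization measurements, not the paper's actual values.

```python
from itertools import product

POSTURES = frozenset(["standing", "sitting", "squatting", "lying"])

def dempster(m1, m2):
    """Dempster's rule of combination for two mass functions whose keys
    are frozensets of postures; conflicting mass is normalized away."""
    combined, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + a * b
        else:
            conflict += a * b
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

def plausibility(m, hypothesis):
    """Pl(H) = total mass of all focal sets intersecting H."""
    return sum(v for A, v in m.items() if A & hypothesis)

# Hypothetical evidence: the silhouette favors {sitting, squatting}, the
# face position favors {sitting}; residual mass on POSTURES is ignorance.
m_silhouette = {frozenset(["sitting", "squatting"]): 0.7, POSTURES: 0.3}
m_face = {frozenset(["sitting"]): 0.6, frozenset(["lying"]): 0.1, POSTURES: 0.3}

m = dempster(m_silhouette, m_face)
decision = max(POSTURES, key=lambda p: plausibility(m, frozenset([p])))
print(decision)
```

A restricted plausibility criterion would additionally require the winning plausibility to exceed a threshold before committing to a posture, trading coverage for fewer misclassifications.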

On General Stability for Switched Positive Linear Systems with Bounded Time-varying Delays

This paper focuses on the existence of a common linear copositive Lyapunov function (CLCLF) for discrete-time switched positive linear systems (SPLSs) with bounded time-varying delays. In particular, a special class of matrices is constructed in an appropriate manner from the system matrices. Our results reveal that the existence of a common copositive Lyapunov function can be related to the Schur stability of these matrices. A simple example is provided to illustrate the implications of our results.
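
Schur stability, the condition the constructed matrices must satisfy, is easy to check numerically: a matrix is Schur stable when its spectral radius is below one. The sketch below uses a hypothetical pair of positive system matrices and their elementwise maximum, whose Schur stability bounds that of both (the spectral radius is monotone on nonnegative matrices); this is an illustration of the stability test, not the paper's actual construction.

```python
import numpy as np

def is_schur_stable(M, tol=1e-9):
    """A matrix is Schur stable iff its spectral radius is < 1."""
    return bool(max(abs(np.linalg.eigvals(M))) < 1.0 - tol)

# Hypothetical positive (nonnegative) system matrices for two switching modes.
A1 = np.array([[0.3, 0.2],
               [0.1, 0.4]])
A2 = np.array([[0.2, 0.3],
               [0.2, 0.3]])

# Elementwise maximum: since A1, A2 <= A_max entrywise and all entries are
# nonnegative, rho(A1), rho(A2) <= rho(A_max), so Schur stability of A_max
# implies Schur stability of both modes.
A_max = np.maximum(A1, A2)

print(is_schur_stable(A1), is_schur_stable(A2), is_schur_stable(A_max))
```

This monotonicity is why worst-case matrices built from the mode dynamics can certify stability of the whole switched system under arbitrary switching.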

Performance Modeling for Web based J2EE and .NET Applications

When architecting an application, key nonfunctional requirements such as performance, scalability, availability, and security, which influence the architecture of the system, are sometimes not adequately addressed, and application performance may not be examined until it becomes a concern. There are several problems with this reactive approach: if the system does not meet its performance objectives, the application is unlikely to be accepted by the stakeholders. This paper suggests an approach to performance modeling for web-based J2EE and .NET applications that addresses performance issues early in the development life cycle. It also includes a performance modeling case study, with a proof-of-concept (PoC) and implementation details for the .NET and J2EE platforms.
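
Early-lifecycle performance models are often as simple as queueing formulas applied to estimated service demands. As a hypothetical illustration (not the case study's actual model), an M/M/1 approximation predicts response time from a transaction's service time and its arrival rate:

```python
def mm1_response_time(service_time_s, arrival_rate_per_s):
    """M/M/1 estimate: R = S / (1 - U), with utilization U = arrival_rate * S.
    A rough but useful early-lifecycle model for a single-server resource."""
    U = arrival_rate_per_s * service_time_s
    if U >= 1.0:
        raise ValueError("system saturated: utilization >= 1")
    return service_time_s / (1.0 - U)

# A hypothetical web transaction: 40 ms service demand at 20 requests/s
# gives 80% utilization and a predicted 200 ms response time.
r = mm1_response_time(0.040, 20.0)
print(round(r * 1000.0, 1), "ms")
```

Running such a model against the expected workload during design, rather than after deployment, is precisely the proactive step the reactive approach skips.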

Optimization of Reaction Rate Parameters in Modeling of Heavy Paraffins Dehydrogenation

In the present study, a procedure was developed to determine optimum reaction rate constants in generalized Arrhenius form, optimized by the Nelder-Mead method. For this purpose, a comprehensive mathematical model of a fixed-bed reactor for the dehydrogenation of heavy paraffins over a Pt-Sn/Al2O3 catalyst was developed. Using appropriate kinetic rate expressions for the main dehydrogenation reaction as well as for the side reactions and catalyst deactivation, a detailed model of the radial flow reactor was obtained. The reactor model comprises a set of partial differential equations (PDEs), ordinary differential equations (ODEs), and algebraic equations, all of which were solved numerically to determine the variations in the components' concentrations, in mole percent, as functions of time and reactor radius. The most significant variations were observed at the entrance of the bed, and the initial olefin production was rather high. The method combines a direct-search optimization algorithm with the numerical solution of the governing differential equations. Its usefulness and validity were demonstrated by comparing the kinetic constants predicted by the proposed method with experimental values reported in the literature for different systems.
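
The optimization step can be sketched with SciPy's Nelder-Mead implementation on a synthetic problem. Everything below is illustrative (the rate constants, temperatures, and log-residual objective are assumptions, not the paper's reactor model); it shows recovery of the Arrhenius parameters A and Ea from simulated rate data by direct search.

```python
import numpy as np
from scipy.optimize import minimize

R = 8.314  # gas constant, J mol-1 K-1

# Synthetic "observed" rate constants from a known Arrhenius law
# (hypothetical A in 1/s and Ea in kJ/mol).
A_true, Ea_true_kJ = 5.0e4, 90.0
T = np.array([700.0, 730.0, 760.0, 790.0, 820.0])
k_obs = A_true * np.exp(-Ea_true_kJ * 1000.0 / (R * T))

def sse(params):
    """Sum of squared log-residuals; optimizing (ln A, Ea in kJ/mol)
    keeps the two search directions comparably scaled for Nelder-Mead."""
    lnA, Ea_kJ = params
    log_k_model = lnA - Ea_kJ * 1000.0 / (R * T)
    return float(np.sum((log_k_model - np.log(k_obs)) ** 2))

res = minimize(sse, x0=[5.0, 50.0], method="Nelder-Mead",
               options={"xatol": 1e-10, "fatol": 1e-18, "maxiter": 10000})
A_fit = float(np.exp(res.x[0]))
Ea_fit_kJ = float(res.x[1])
print(round(A_fit), round(Ea_fit_kJ, 2))
```

In the actual procedure, evaluating the objective requires solving the full PDE/ODE reactor model at each candidate parameter set; Nelder-Mead is attractive there precisely because it needs no gradients of that numerical solution.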