Preliminary Study on Determining Stem Diameter Variations of Sympodial Orchid

Changes in the stem diameter of orchid plants were investigated in a controlled growing climate. Previous studies have focused on stem diameter in relation to plant water status in terrestrial plants in order to schedule irrigation. The objective of this work was to evaluate the ability of strain gauges to capture changes in the stem of an epiphytic plant. Experiments were carried out on the sympodial orchid Dendrobium Sonia under stressed conditions. The findings show that the sensor can detect changes in the plant stem, and the results can serve as a reference for further studies on the development of a proper watering system.

The Suitability of GPS Receivers Update Rates for Navigation Applications

Navigation is the process of monitoring and controlling the movement of an object from one place to another. Currently, the Global Positioning System (GPS) is the main system used worldwide for navigation applications. A GPS receiver needs signals from at least three satellites (four for a three-dimensional fix) to compute and display its position, and the displayed positioning information is refreshed continuously. The update rate is the number of times per second that the displayed position is refreshed, that is, the number of position solutions the receiver computes per second. A higher update rate decreases display lag and improves distance measurement and tracking, especially when moving along a curved route. The majority of GPS receivers in use today update the position once per second. This interval is adequate for many applications but relatively long for high-speed applications. In this paper, the suitability and feasibility of GPS receivers with different update rates are evaluated for various applications according to the level of speed and the update rate needed for each application.
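
As a hedged illustration (not taken from the paper), the distance travelled between two consecutive fixes is simply speed divided by update rate, which is why higher update rates matter at high speed and on curved routes. The speeds and rates below are assumed example values.

    # Illustrative only: distance covered between consecutive GPS fixes
    # as a function of vehicle speed and receiver update rate.

    def metres_between_fixes(speed_kmh: float, update_rate_hz: float) -> float:
        """Distance (m) travelled between two consecutive position updates."""
        speed_ms = speed_kmh / 3.6          # convert km/h to m/s
        return speed_ms / update_rate_hz    # metres per update interval

    if __name__ == "__main__":
        for rate in (1.0, 5.0, 10.0):             # common receiver update rates (Hz)
            for speed in (5.0, 60.0, 180.0):      # walking, urban driving, high speed (km/h)
                gap = metres_between_fixes(speed, rate)
                print(f"{rate:4.0f} Hz, {speed:5.0f} km/h -> {gap:6.2f} m between fixes")

At 1 Hz a vehicle travelling 180 km/h moves 50 m between fixes; at 10 Hz the gap shrinks to 5 m, which illustrates the trade-off discussed in the abstract.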

A Simulation Software for DNA Computing Algorithms Implementation

The captured gel electrophoresis image represents the output of a DNA computing algorithm. Before this image is captured, the computation involves parallel overlap assembly (POA) and the polymerase chain reaction (PCR), which are the main operations of the computing algorithm. However, the design of the DNA oligonucleotides that represent a problem is quite complicated and prone to errors. In order to reduce these errors during the design stage, before the actual in-vitro experiment is carried out, simulation software capable of simulating the POA and PCR processes was developed. The capability of the software is not limited: a problem of any size and complexity can be simulated, thus saving the cost of possible errors during the design process. Information about the DNA sequences during the computing process, as well as the computing output, can be extracted at the same time using the simulation software.
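
The paper's oligonucleotide design rules are not given in the abstract; purely as a minimal sketch of the kind of check such a simulator performs, the hypothetical function below tests whether the 3' ends of two strands are complementary, which is the annealing step that drives POA extension. The sequences are invented examples.

    # Illustrative sketch (hypothetical sequences): test whether two single
    # strands can anneal at their 3' ends, the basic step of parallel
    # overlap assembly (POA) before polymerase extension.

    COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

    def reverse_complement(seq: str) -> str:
        return "".join(COMPLEMENT[base] for base in reversed(seq))

    def anneal_length(strand_a: str, strand_b: str, min_overlap: int = 6) -> int:
        """Longest 3'-end overlap (>= min_overlap) at which the strands are
        complementary and antiparallel, or 0 if they cannot anneal."""
        best = 0
        limit = min(len(strand_a), len(strand_b))
        for n in range(min_overlap, limit + 1):
            # Suffix of A must equal the reverse complement of B's suffix.
            if strand_a[-n:] == reverse_complement(strand_b[-n:]):
                best = n
        return best

    if __name__ == "__main__":
        a = "ATGCCGTAACGTTGCA"    # hypothetical oligo
        b = "TTAGGCATTGCAACGT"    # hypothetical partner
        print("overlap length:", anneal_length(a, b))   # prints 8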

Distribution Feeder Reconfiguration Considering Distributed Generators

Recently, distributed generation technologies have received much attention for the potential energy savings and reliability assurances that might be achieved as a result of their widespread adoption. Fueling this attention are the possibilities of international agreements to reduce greenhouse gas emissions, electricity sector restructuring, high power-reliability requirements for certain activities, and concern about easing transmission and distribution capacity bottlenecks and congestion. It is therefore necessary to investigate the impact of such generators on distribution feeder reconfiguration. This paper presents an approach to distribution reconfiguration considering Distributed Generators (DGs). The objective function is the summation of electrical power losses, and a Tabu search optimization is used to solve the optimal operation problem. The approach is tested on a real distribution feeder.
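
The paper's exact formulation is not reproduced in the abstract; the sketch below is only a generic tabu-search skeleton for a switch-reconfiguration problem. The loss evaluation and radiality check are placeholders that, in the authors' setting, would come from a distribution power flow including the DG injections.

    # Generic tabu-search skeleton for feeder reconfiguration (illustrative).
    from collections import deque

    def tabu_search(initial, power_losses, is_radial, iters=200, tenure=10):
        current = list(initial)                  # 0/1 status of each switch
        best, best_loss = list(current), power_losses(current)
        tabu = deque(maxlen=tenure)              # indices of recently toggled switches

        for _ in range(iters):
            candidates = []
            for i in range(len(current)):
                if i in tabu:
                    continue                     # skip tabu moves
                neighbour = list(current)
                neighbour[i] ^= 1                # toggle one switch
                if not is_radial(neighbour):
                    continue                     # keep the feeder radial
                candidates.append((power_losses(neighbour), i, neighbour))
            if not candidates:
                break
            loss, i, neighbour = min(candidates) # best admissible neighbour
            current = neighbour
            tabu.append(i)
            if loss < best_loss:
                best, best_loss = list(neighbour), loss
        return best, best_loss

    # Usage: tabu_search(initial_switch_states, my_loss_function, my_radiality_check)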

Technology Based Learning Environment and Student Achievement in English as a Foreign Language in Pakistan

The fast-growing accessibility and capability of emerging technologies have created enormous possibilities for designing, developing and implementing innovative teaching methods in the classroom. The global technological scenario has paved the way to new pedagogies in the teaching-learning process, focusing on technology-based learning environments and their impact on student achievement. The present experimental study was conducted to determine the effectiveness of a technology-based learning environment on student achievement in English as a foreign language. The sample of the study was 90 students of 10th grade at a public school located in Islamabad. A pretest-posttest equivalent-group design was used to compare the achievement of the two groups. A pretest and a posttest, each containing 50 items from the English textbook, were developed and administered. The collected data were statistically analyzed. The results showed a significant difference between the mean scores of the experimental group and the control group. The performance of the experimental group was better on posttest scores, which indicated that teaching through a technology-based learning environment enhanced the achievement level of the students. On the basis of the results, it was recommended that teaching and learning through information and communication technologies be adopted to enhance the language learning capability of students.

Hybrid of Hunting Search and Modified Simplex Methods for Grease Position Parameter Design Optimisation

This study proposes a multi-response surface optimization problem (MRSOP) for determining the proper choices in a process parameter design (PPD) decision problem in the noisy environment of a grease position process in the electronics industry. The proposed model attempts to maximize dual process responses, the mean number of parts between failures on the left and right processes. The conventional modified simplex method and its hybridization with the stochastic operator from the hunting search algorithm are applied to determine the proper levels of the controllable design parameters affecting the quality performances. A numerical example demonstrates the feasibility of applying the proposed model to the PPD problem via the two iterative methods, and its advantages are discussed. Numerical results demonstrate that the hybridization is superior to the conventional method: the mean number of parts between failures on the left and right lines improved by approximately 39.51%. All experimental data presented in this research have been normalized to disguise actual performance measures, as the raw data are considered confidential.
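
The authors' operators are not described in the abstract; the sketch below only illustrates the general idea of hybridising a simplex search with a stochastic jump, using SciPy's Nelder-Mead simplex plus a random, hunting-style perturbation of the incumbent best point. The objective in the usage comment is a stand-in, not the paper's response model.

    # Hedged illustration: alternate local simplex refinement with random
    # "hunting-style" perturbations of the best point found so far.
    import numpy as np
    from scipy.optimize import minimize

    def hybrid_simplex(objective, x0, rounds=10, step=0.5, seed=0):
        rng = np.random.default_rng(seed)
        best_x = np.asarray(x0, dtype=float)
        best_f = objective(best_x)
        for _ in range(rounds):
            # Local refinement with the simplex method.
            res = minimize(objective, best_x, method="Nelder-Mead")
            if res.fun < best_f:
                best_x, best_f = res.x, res.fun
            # Stochastic operator: random jump near the leader, shrinking over time.
            trial = best_x + rng.normal(scale=step, size=best_x.shape)
            if objective(trial) < best_f:
                best_x, best_f = trial, objective(trial)
            step *= 0.8
        return best_x, best_f

    # Example with a stand-in objective (negate a response to maximise it):
    # x_opt, f_opt = hybrid_simplex(lambda x: (x[0] - 1.0)**2 + (x[1] + 2.0)**2, [0.0, 0.0])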

Distributed Data-Mining by Probability-Based Patterns

In this paper, a new method is suggested for distributed data mining based on probability patterns. These patterns use decision trees and decision graphs, and care is taken that the patterns be valid, novel, useful, and understandable. Considering a set of functions, the system reaches a good pattern or better objectives. Using the suggested method, useful information can be extracted from massive and multi-relational databases.

A Study on Mobile Web Generation Using Elements of User Experience

As the number of mobile service subscribers increases, mobile content services are becoming more and more varied. Mobile content development therefore needs not only content design but also guidelines specific to mobile platforms. When mobile content is developed, it is important to overcome the limits and restrictions of the mobile device: a small browser and screen size, limited download size, and inconvenient navigation. Guidelines for each type of mobile content are therefore presented with regard to usability, ease of development, and consistency of rules. This paper proposes a methodology consisting of these per-content guidelines, and a mobile web site is developed following the proposed guidelines.

Development of a Methodology for Processing of Drilling Operations

Drilling is the most common machining operation, and it accounts for the highest machining cost in many manufacturing activities, including automotive engine production. The outcome of this operation depends on many factors, including the use of proper cutting tool geometry, the cutting tool material and the type of coating used to improve hardness and wear resistance, as well as the cutting parameters. With the availability of a large array of tool geometries, materials and coatings, it has become a challenging task to select the best tool and cutting parameters that would result in the lowest machining cost or highest profit rate. This paper describes an algorithm developed to help achieve good performance in drilling operations by automatically determining the proper cutting tools and cutting parameters. It also helps determine machining sequences resulting in the minimum number of tool changes, which eventually reduces machining time and cost where multiple tools are used.
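
A hedged, simplified example of the kind of selection such an algorithm performs: estimate the drilling time per hole from feed and spindle speed for each candidate tool and pick the lowest-cost combination. The cost model and the candidate data are hypothetical, not taken from the paper.

    # Hypothetical minimum-cost tool/parameter selection for a drilling operation.

    def cost_per_hole(depth_mm, feed_mm_rev, speed_rpm,
                      machine_rate, tool_price, tool_life_min):
        time_min = depth_mm / (feed_mm_rev * speed_rpm)     # drilling time per hole
        machining_cost = machine_rate * time_min            # machine and labour cost
        tool_cost = tool_price * time_min / tool_life_min   # share of tool wear cost
        return machining_cost + tool_cost

    candidates = [
        # (name, feed mm/rev, speed rpm, tool price, tool life min)  -- assumed values
        ("HSS drill",          0.15, 1200,  8.0,  40.0),
        ("TiN-coated carbide", 0.25, 3000, 35.0,  90.0),
        ("TiAlN-coated",       0.30, 3500, 45.0, 120.0),
    ]

    best = min(candidates,
               key=lambda c: cost_per_hole(30.0, c[1], c[2], 1.2, c[3], c[4]))
    print("lowest-cost tool:", best[0])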

Motor Imagery Signal Classification for a Four State Brain Machine Interface

Motor imagery classification provides an important basis for designing Brain Machine Interfaces (BMIs). A BMI captures and decodes brain EEG signals and transforms human thought into actions. The ability of an individual to control his EEG through imaginary mental tasks enables him to control devices through the BMI. This paper presents a method to design a four-state BMI using EEG signals recorded from the C3 and C4 locations. Principal features extracted through principal component analysis of the segmented EEG are analyzed using two novel classification algorithms based on an Elman recurrent neural network and a functional link neural network. The performance of both classifiers is evaluated using a particle swarm optimization (PSO) training algorithm; results are also compared with the conventional back propagation (BP) training algorithm. EEG motor imagery recorded from two subjects is used in the offline analysis. From the overall classification performance, it is observed that the BP algorithm has a higher average classification accuracy of 93.5%, while the PSO algorithm has better training time and maximum classification accuracy. The proposed methods promise to provide a useful alternative general procedure for motor imagery classification.
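
Only as a hedged sketch of the feature-extraction stage: the window length and trial counts below are assumptions, the data are synthetic, and the classifiers used in the paper (Elman and functional link networks) are not shown.

    # Illustrative PCA feature extraction for segmented two-channel (C3, C4) EEG.
    import numpy as np
    from sklearn.decomposition import PCA

    def principal_features(segments: np.ndarray, n_components: int = 5) -> np.ndarray:
        """segments: (n_trials, n_features) matrix of flattened EEG windows;
        returns the leading principal-component scores per trial."""
        pca = PCA(n_components=n_components)
        return pca.fit_transform(segments)

    # Synthetic data standing in for real motor-imagery trials:
    rng = np.random.default_rng(0)
    trials = rng.standard_normal((120, 2 * 256))   # 120 trials, 2 channels x 256 samples
    features = principal_features(trials)
    print(features.shape)                          # (120, 5)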

A Finite Point Method Based on Directional Derivatives for Diffusion Equation

This paper presents a finite point method based on directional derivatives for the diffusion equation on 2D scattered points. To discretize the diffusion operator at a given point, a six-point stencil is derived by employing explicit numerical formulae for directional derivatives; that is, for the point under consideration only five neighboring points are involved, which is the smallest number needed to discretize the diffusion operator with first-order accuracy. A method for selecting the neighbor point set is proposed which satisfies the solvability condition of the numerical derivatives. Some numerical examples are given to show the good performance of the proposed method.
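
The paper's explicit formulae are not reproduced in the abstract; the standard relation underlying such stencils expresses a second directional derivative in terms of Cartesian second derivatives, from which the diffusion (Laplace) operator can be assembled:

\[
\frac{\partial^2 u}{\partial \ell_\theta^2}
  = \cos^2\theta\,\frac{\partial^2 u}{\partial x^2}
  + 2\sin\theta\cos\theta\,\frac{\partial^2 u}{\partial x\,\partial y}
  + \sin^2\theta\,\frac{\partial^2 u}{\partial y^2},
\qquad
\Delta u
  = \frac{\partial^2 u}{\partial \ell_\theta^2}
  + \frac{\partial^2 u}{\partial \ell_{\theta+\pi/2}^2}.
\]

Approximating second derivatives along suitably chosen directions through the five neighboring points therefore yields a six-point (center plus five neighbors) discretization of the diffusion operator.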

Spectral Amplitude Coding Optical CDMA: Performance Analysis of PIIN Reduction Using VC Code Family

Multi-user interference (MUI) is the main cause of performance degradation in Spectral Amplitude Coding Optical Code Division Multiple Access (SAC-OCDMA) systems. MUI increases with the number of simultaneous users, resulting in a higher bit error probability and limiting the maximum number of simultaneous users. In addition, phase-induced intensity noise (PIIN), which originates from the spontaneous emission of the broadband source through MUI and severely limits system performance, should be addressed as well. Since the MUI is caused by the interference between simultaneous users, keeping the MUI as small as possible is desirable. In this paper, an extensive study of system performance in terms of MUI and PIIN reduction is presented. The Vector Combinatorial (VC) code family is adopted as the signature sequence for the performance analysis, and a comparison with previously reported codes is performed. The results show that, when the received power increases, the PIIN noise for all the codes increases linearly. The results also show that the effect of PIIN can be minimized by increasing the code weight, which preserves an adequate signal-to-noise ratio and bit error probability. A comparison between the proposed code and existing codes such as the Modified Frequency Hopping (MFH) and Modified Quadratic Congruence (MQC) codes has been carried out.

Forming of Nanodimensional Structure Parts in Carbon Steels

A way of achieving nanodimensional structural elements in high-carbon steel by a special kind of heat treatment and cold plastic deformation is explored. This leads to an increase in the interlamellar spacing of the ferrite-carbide mixture. A decrease in the interlamellar spacing with increasing cooling temperature is determined. Experiments identify the interlamellar spacing at which high-carbon steel demonstrates the highest treatability and hardening capability. The effect of the total deformation degree on the interlamellar spacing of the ferrite-carbide mixture is obtained. Mechanical test results show that, after heat treatment and repeated cold plastic deformation, high-carbon steel possesses high tensile and yield strength while retaining good percentage elongation.

Topical Delivery of Thymidine Dinucleotide to Induce p53 Generation in the Skin by Elastic Liposome

Transcription factor p53 has a powerful tumor-suppressing function and is associated with many cancers. However, the high molecular weight of p53 limits its passage across the skin or the cell membrane. Thymidine dinucleotide (pTT), an oligonucleotide, can activate the p53 transcription factor. pTT is a hydrophilic, negatively charged oligonucleotide, so its delivery across the cell membrane requires an appropriate carrier. The aim of this study was to improve the bioavailability of the nucleotide fragment pTT by using elastic liposome carriers to deliver the drug into the skin. The study demonstrates that dioleoylphosphocholine (DOPC) incorporated with sodium cholate at a 1:1 molar ratio achieves a particle size of about 220 nm. Confocal laser scanning microscopy (CLSM) showed that this elastic liposome could penetrate the skin from the stratum corneum through the whole epidermis. Moreover, a slight increase in the generation of p53 was observed by western blot.

Mining and Visual Management of XML-Based Image Collections

This article describes Uruk, the virtual museum of Iraq that we developed for the visual exploration and retrieval of image collections. The system largely exploits the loosely structured hierarchy of XML documents, which provides a useful representation for storing semi-structured or unstructured data that does not fit easily into existing databases. The system offers users the capability to mine and manage the XML-based image collections through a web-based Graphical User Interface (GUI). In a typical interactive session, the user can browse a visual structural summary of the XML database in order to select interesting elements. Using this intermediate result, queries combining structure and textual references can be composed and presented to the system. After query evaluation, the full set of answers is presented in a visual and structured way.

Fuzzy C-Means Clustering Algorithm for Voltage Stability in Large Power Systems

The steady-state operation of maintaining voltage stability is achieved by switching various controllers scattered throughout the power network. When a contingency occurs, whether forced or unforced, the dispatcher must alleviate the problem with minimum time, cost, and effort; a persistent problem may lead to a blackout. The dispatcher must therefore apply the appropriate switching of controllers in terms of type, location, and size to remove the contingency and maintain voltage stability, since wrong switching may worsen the problem and itself lead to a blackout. This work proposes and uses Fuzzy C-Means Clustering (FCMC) to assist the dispatcher in decision making. FCMC is used in static voltage stability analysis to instantaneously map a contingency to a set of controllers from which the types, locations, and amounts of switching are derived.
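
As a hedged illustration of the clustering engine only (the power-system features, contingency encoding, and controller mapping used by the authors are not shown), the standard fuzzy c-means updates can be sketched as follows on synthetic data.

    # Minimal fuzzy c-means (FCM) sketch; in the paper's setting the feature
    # vectors would encode contingency/operating-point data.
    import numpy as np

    def fuzzy_c_means(X, c=3, m=2.0, iters=100, tol=1e-5, seed=0):
        rng = np.random.default_rng(seed)
        n = X.shape[0]
        U = rng.random((c, n))
        U /= U.sum(axis=0)                                  # memberships sum to 1 per point
        for _ in range(iters):
            Um = U ** m
            centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
            dist = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
            new_U = 1.0 / (dist ** (2.0 / (m - 1.0)))       # u_ij ~ d_ij^(-2/(m-1))
            new_U /= new_U.sum(axis=0)
            if np.max(np.abs(new_U - U)) < tol:
                U = new_U
                break
            U = new_U
        return centers, U

    # Example on synthetic 2-D data with two well-separated groups:
    rng = np.random.default_rng(1)
    X = np.vstack([rng.standard_normal((50, 2)), rng.standard_normal((50, 2)) + 5.0])
    centers, U = fuzzy_c_means(X, c=2)
    print(centers)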

Gas Detonation Forming by a Mixture of H2+O2 Detonation

Explosive forming is one of the unconventional forming techniques in which, most commonly, water is used as the pressure transmission medium. One of the newest methods in explosive forming is gas detonation forming, which uses the normal shock wave derived from a gas detonation to form sheet metal. For this purpose a detonation is developed from the reaction of an H2+O2 mixture in a long cylindrical detonation tube. The detonation wave travels through the tube and acts as a blast load on the steel blank, forming it. Experimental results are compared with a finite element model in terms of strain, thickness variation and deformed geometry. Numerical and experimental results show approximately 75-90% agreement in the formability of the desired shape. The optimum gas mixture was obtained at 68% H2 and 32% O2.

Vibration Induced Fatigue Assessment in Vehicle Development Process

Improvements in CAE methods play an important role in shortening vehicle product development time, allowing design validation and durability improvements to be carried out without building hardware prototypes. In recent years, several methods have been developed to investigate fatigue damage in vehicles; their common goal is to predict fatigue damage quickly and at reduced cost. This study developed a new fatigue damage prediction method for the automotive sector using power spectral densities of accelerations. The study also confirmed that weak regions in the vehicle can easily be detected with the developed method, whose results were compared with those of the conventional method.
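
As a hedged illustration of the input to such an approach (not the authors' damage model), the power spectral density of a measured or simulated acceleration signal can be estimated with Welch's method and then fed to a frequency-domain fatigue criterion. The sampling rate and the synthetic signal below are assumptions.

    # Estimate the PSD of an acceleration record; in a frequency-domain fatigue
    # approach this PSD (mapped to stress) is the input to the damage estimate.
    import numpy as np
    from scipy.signal import welch

    fs = 1024.0                                   # sampling rate in Hz (assumed)
    t = np.arange(0, 60, 1 / fs)                  # 60 s record
    accel = (np.sin(2 * np.pi * 35 * t)           # synthetic resonance at 35 Hz
             + 0.5 * np.random.default_rng(0).standard_normal(t.size))

    f, pxx = welch(accel, fs=fs, nperseg=4096)    # PSD in (m/s^2)^2 per Hz
    print("dominant frequency: %.1f Hz" % f[np.argmax(pxx)])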

A Serializability Condition for Multi-step Transactions Accessing Ordered Data

In mobile environments, unspecified numbers of transactions arrive in continuous streams. To prove correctness of their concurrent execution a method of modelling an infinite number of transactions is needed. Standard database techniques model fixed finite schedules of transactions. Lately, techniques based on temporal logic have been proposed as suitable for modelling infinite schedules. The drawback of these techniques is that proving the basic serializability correctness condition is impractical, as encoding (the absence of) conflict cyclicity within large sets of transactions results in prohibitively large temporal logic formulae. In this paper, we show that, under certain common assumptions on the graph structure of data items accessed by the transactions, conflict cyclicity need only be checked within all possible pairs of transactions. This results in formulae of considerably reduced size in any temporal-logic-based approach to proving serializability, and scales to arbitrary numbers of transactions.
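
The paper works with temporal-logic encodings; purely as an informal illustration of the pairwise check, the function below inspects one interleaved schedule of two transactions and reports whether their conflicts point in both directions, i.e. whether the pair's conflict graph contains a cycle.

    # Informal illustration: detect a conflict cycle between a pair of transactions
    # in one interleaved schedule. Each operation is (txn_id, 'r' or 'w', data_item).

    def pair_has_conflict_cycle(schedule, t1, t2):
        forward = backward = False            # t1 -> t2 and t2 -> t1 conflict edges
        for i, (ti, ai, xi) in enumerate(schedule):
            for tj, aj, xj in schedule[i + 1:]:
                if ti == tj or xi != xj:
                    continue                  # same transaction or different item
                if ai == 'r' and aj == 'r':
                    continue                  # read-read pairs do not conflict
                if (ti, tj) == (t1, t2):
                    forward = True
                elif (ti, tj) == (t2, t1):
                    backward = True
        return forward and backward           # both directions => conflict cycle

    # Example: r1(x) w2(x) r2(y) w1(y) gives edges T1->T2 (on x) and T2->T1 (on y).
    schedule = [(1, 'r', 'x'), (2, 'w', 'x'), (2, 'r', 'y'), (1, 'w', 'y')]
    print(pair_has_conflict_cycle(schedule, 1, 2))   # True: not conflict-serializable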