Numerical Analysis and Experimental Validation of a Downhole Stress/Strain Measurement Tool

Real-time measurement of applied forces such as tension, compression, torsion, and bending moment quantifies the energy transferred to the bottomhole assembly (BHA). These forces are highly detrimental to measurement/logging-while-drilling tools and other downhole equipment. Real-time measurement of the dynamic downhole behavior, including weight, torque, and bending on bit as well as vibration, establishes a real-time feedback loop between the downhole drilling system and the drilling team at the surface. This paper describes the numerical analysis of the strain data acquired by the measurement tool at different locations on the strain pockets. The strain values obtained by finite element analysis (FEA) for various loading conditions (tension, compression, torque, and bending moment) are compared against experimental results obtained from an identical setup. The numerical results agree with the experimental data to within 8% and therefore substantiate and validate the FEA model. The model can now be used to analyze the combined loading conditions that reflect the actual drilling environment.
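As a minimal illustration of how such a comparison works for the simple load cases, the sketch below computes axial and outer-fibre bending strains from the standard formulas (F/(AE) and Mc/(EI)) and reports the percent deviation from placeholder "measured" values; the tool geometry, loads, and measured strains are illustrative assumptions, not the paper's data.

    # Illustrative comparison of FEA-predicted and measured strains for simple
    # load cases (axial force and bending moment) on a hollow cylindrical section.
    # All dimensions, loads, and "measured" values below are hypothetical.
    import math

    E = 205e9                      # Young's modulus of steel, Pa
    D_o, D_i = 0.170, 0.070        # outer/inner diameter of the collar, m (assumed)
    A = math.pi / 4 * (D_o**2 - D_i**2)          # cross-sectional area, m^2
    I = math.pi / 64 * (D_o**4 - D_i**4)         # second moment of area, m^4

    F = 200e3                      # axial load, N (assumed)
    M = 15e3                       # bending moment, N*m (assumed)

    eps_axial = F / (A * E)                      # axial strain
    eps_bending = M * (D_o / 2) / (E * I)        # outer-fibre bending strain

    measured = {"axial": 1.05 * eps_axial,       # placeholder "experimental" strains
                "bending": 0.95 * eps_bending}
    predicted = {"axial": eps_axial, "bending": eps_bending}

    for case in predicted:
        dev = abs(predicted[case] - measured[case]) / measured[case] * 100
        print(f"{case}: FEA {predicted[case]:.3e}, test {measured[case]:.3e}, "
              f"deviation {dev:.1f}%")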

Adaptive PID Control of Wind Energy Conversion Systems Using RASP1 Mother Wavelet Basis Function Networks

In this paper, a PID control strategy using an adaptive RASP1 wavelet neural network for the control of wind energy conversion systems (WECSs) is proposed. The controller is based on a single-layer feedforward neural network whose hidden nodes are adaptive RASP1 wavelet functions, combined with an infinite impulse response (IIR) recurrent structure. The IIR filter is cascaded to the network to provide a doubly local structure, which improves the speed of learning. This neuro-PID controller assumes a certain model structure in order to approximately identify the system dynamics of the unknown plant (the WECS) and to generate the control signal. The results are applied to a typical turbine/generator pair, showing the feasibility of the proposed solution.
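As a concrete illustration of the hidden layer, the sketch below evaluates a small wavelet network whose nodes use a RASP1 mother wavelet in the form psi(x) = x/(x^2 + 1)^2, a form quoted in the wavelet-network literature; the node count, translations, dilations, and weights are illustrative assumptions rather than the tuned controller parameters.

    # Minimal sketch of a single-layer wavelet network with RASP1 hidden nodes,
    # assuming the RASP1 mother wavelet psi(x) = x / (x**2 + 1)**2; translations,
    # dilations, and weights are illustrative, not tuned controller parameters.
    import numpy as np

    def rasp1(x):
        return x / (x**2 + 1.0)**2

    class WaveletNet:
        def __init__(self, n_nodes, rng=np.random.default_rng(0)):
            self.t = rng.uniform(-1, 1, n_nodes)   # translations
            self.d = rng.uniform(0.5, 2, n_nodes)  # dilations
            self.w = rng.uniform(-1, 1, n_nodes)   # output weights

        def forward(self, e):
            # e: scalar controller input (e.g., speed error of the WECS)
            z = rasp1((e - self.t) / self.d)       # hidden-layer wavelet activations
            return float(self.w @ z)               # network output (pre-IIR stage)

    net = WaveletNet(n_nodes=7)
    u = net.forward(0.3)    # in the full controller this output would feed the
    print(u)                # cascaded IIR recurrent stage and the PID terms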

The Effect of e-Learning on the Promotion of Optoelectronics Technology and Daily Living Literacy among Students in Universities of Technology

This study analyzes the effect of e-learning on photonics technology and daily living literacy among college students. The course contents of photonics technology and daily living were first drafted based on research discussions and expert interviews. After three rounds of expert questionnaires using the Delphi technique, the knowledge units and items for the course were established. The e-learning materials and the drafts of the instructional strategies, academic achievement scale, and learning attitude scale were then developed. Through expert inspection, reliability and validity testing, and trial instruction, the scales and the materials were further revised. Finally, formal instruction was implemented to test the effect of different instructional methods on the academic achievement of photonics technology and daily living among students in universities of technology. The results show that e-learning can effectively promote academic achievement and learning attitude, and that students taught with e-learning clearly outperform those taught with traditional instruction.

Investigation of Novel Metaheuristic Algorithms for Combinatorial Optimization Problems in Ad Hoc Networks

Routing in mobile ad hoc networks (MANETs) is extremely challenging because of their dynamic nature, limited bandwidth, frequent topology changes caused by node mobility, and constrained energy resources. In order to transmit data to destinations efficiently, suitable routing algorithms must be implemented in MANETs. Routing efficiency can therefore be increased by developing routing algorithms that satisfy the Quality of Service (QoS) parameters. Algorithms inspired by the principles of natural biological evolution and the distributed collective behavior of social colonies have shown excellence in dealing with complex optimization problems and are becoming increasingly popular. This paper presents a survey of several such metaheuristic and nature-inspired algorithms.

e-Learning Program with Voice Assistance for Tactile Braille

With the increasing incidence of glaucoma, diabetic retinitis pigmentosa, and similar conditions, the number of people with vision loss is also increasing in Japan. It is difficult for the visually impaired to learn and acquire braille because most of them are middle-aged. In addition, the number of braille teachers in Japan is insufficient and decreasing, which makes the situation even more difficult for the visually impaired. Therefore, we research and develop a Web-based e-learning program for tactile braille that works together with a braille display and voice assistance.

Supplementation of Saccharomyces cerevisiae or Lactobacillus acidophilus in Goat Diets

This experiment was performed to investigate the effect of a supplemental blend of the probiotics Saccharomyces cerevisiae and Lactobacillus acidophilus on plasma fatty acid profiles, particularly conjugated linoleic acid (CLA), in growing goats fed corn silage, and to select the optimal probiotic levels for further study. Twenty-four growing crossbred (Thai native x Anglo-Nubian) goats, weighing 14.2 ± 2.3 kg and aged about 6 months, were purchased and allocated to 4 treatments according to a Randomized Complete Block Design (RCBD) with 6 goats per treatment. The goats were blocked by weight into heavy, medium, and light groups, and each treatment contained two goats from each block. Average ruminal pH was unaffected, but NH3-N and plasma urea nitrogen (p0.05) were raised, while the propionic acid proportion (p0.05) was reduced concurrently with an increase in the acetic acid proportion and, consequently, the C2:C3 ratio (p>0.05). In the plasma fatty acid profiles, total saturated fatty acids (p>0.05) increased, in contrast with a decrease in C15:0 (p0.05), and C18-C22 polyunsaturated fatty acids (p

Assessing Extension of Meeting System Performance in Information Technology in Defense and Aerospace Projects

The Ministry of Defense (MoD) spends hundreds of millions of dollars on software to support its infrastructure, operate its weapons, and provide command, control, communications, computing, intelligence, surveillance, and reconnaissance (C4ISR) functions. These and other new advanced systems share a common critical component: information technology. The defense and aerospace environment is continuously striving to keep up with increasingly sophisticated information technology (IT) in order to remain effective in today's dynamic and unpredictable threat environment. This makes IT one of the largest and fastest growing expenses of defense. Hundreds of millions of dollars are spent each year on IT projects, but too many of those millions are wasted on costly mistakes: systems that do not work properly, new components that are not compatible with old ones, trendy new applications that do not really satisfy defense needs, or funds lost through poorly managed contracts. This paper investigates and compiles effective strategies that aim to end the exasperation with low returns and the high cost of information technology acquisition for defense; it tries to show how to maximize value while reducing time and expenditure.

Broadening of Raw Materials in the Steel Industry by Recycling and Recovery of Wastes

Technological processes yield, in addition to the main product, large amounts of materials called wastes; because they can be recovered through recycling and reuse, these materials can be classified as by-products. The large amounts of dust generated by the steel industry are a major problem for the environment, human health, the landscape, and more. These problems, and the impressive amounts of waste involved, can be addressed through proper management and recovery of each type of waste. This article examines the recovery, through pelletizing and briquetting, of fine and powdery wastes with the aim of obtaining sponge iron as a raw material for blast furnaces and electric arc furnaces. The data were processed in Excel and are presented in the form of diagrams.

School Design and Energy Efficiency

Auckland has a temperate climate with comfortable, warm, dry summers and mild, wet winters. An Auckland school normally does not need air conditioning for cooling during the summer and only needs heating during the winter. Space heating accounts for the major portion of winter school energy consumption, and winter consumption is the major portion of annual school energy consumption. School building thermal design should therefore focus on winter thermal performance in order to reduce space heating energy. Design data and energy consumption data from a number of Auckland schools are used for this study. This pilot study investigates the relationships between the schools' energy consumption data and their building design data in order to improve future school design for energy efficiency.

A Markov Chain Model for Load-Balancing Based and Service Based RAT Selection Algorithms in Heterogeneous Networks

The Next Generation Wireless Network (NGWN) is expected to be a heterogeneous network that integrates all the different Radio Access Technologies (RATs) through a common platform. A major challenge is how to allocate users to the RAT that is most suitable for them. An optimized solution can maximize the efficient use of radio resources, achieve better performance for service providers, and provide Quality of Service (QoS) at low cost to users. Currently, Radio Resource Management (RRM) is implemented efficiently only for the RAT for which it was developed and is not suitable for a heterogeneous network. Common RRM (CRRM) has been proposed to manage radio resource utilization in the heterogeneous network. This paper presents a user-level Markov model for three co-located RAT networks. The load-balancing based and service based CRRM algorithms are studied using the presented Markov model, and their performance is compared in terms of traffic distribution, new call blocking probability, vertical handover (VHO) call dropping probability, and throughput.
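As a minimal sketch of the kind of user-level Markov model involved, the code below builds the generator matrix for a load-balancing admission policy over two co-located RATs (reduced from the paper's three for brevity), solves for the stationary distribution, and reads off the new-call blocking probability; the capacities and traffic parameters are illustrative assumptions.

    # Minimal sketch of a load-balancing RAT-selection Markov model, reduced to
    # two co-located RATs for brevity.  Capacities, arrival and service rates
    # below are illustrative assumptions.
    import numpy as np
    from itertools import product

    C1, C2 = 4, 6          # channels in RAT 1 and RAT 2 (assumed)
    lam, mu = 3.0, 1.0     # call arrival and service rates (assumed)

    states = list(product(range(C1 + 1), range(C2 + 1)))
    idx = {s: i for i, s in enumerate(states)}
    Q = np.zeros((len(states), len(states)))

    for (n1, n2) in states:
        i = idx[(n1, n2)]
        # load-balancing admission: send a new call to the less-loaded RAT,
        # overflow to the other RAT if the preferred one is full
        if n1 / C1 <= n2 / C2 and n1 < C1:
            Q[i, idx[(n1 + 1, n2)]] += lam
        elif n2 < C2:
            Q[i, idx[(n1, n2 + 1)]] += lam
        elif n1 < C1:
            Q[i, idx[(n1 + 1, n2)]] += lam
        # call departures
        if n1 > 0:
            Q[i, idx[(n1 - 1, n2)]] += n1 * mu
        if n2 > 0:
            Q[i, idx[(n1, n2 - 1)]] += n2 * mu

    np.fill_diagonal(Q, -Q.sum(axis=1))

    # stationary distribution: solve pi @ Q = 0 with sum(pi) = 1
    A = np.vstack([Q.T, np.ones(len(states))])
    b = np.append(np.zeros(len(states)), 1.0)
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)

    p_block = pi[idx[(C1, C2)]]   # new calls are blocked only when both RATs are full
    print(f"new-call blocking probability = {p_block:.4f}")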

Information Gain Ratio Based Clustering for Investigation of Environmental Parameters Effects on Human Mental Performance

Clustering methods developed in data mining can be successfully applied to investigating various kinds of dependencies between environmental conditions and human activities. It is known that environmental parameters such as temperature, relative humidity, atmospheric pressure, and illumination have significant effects on human mental performance. To investigate these effects, a data mining clustering technique based on entropy and the Information Gain Ratio (IGR), K(Y/X) = (H(Y) - H(Y/X))/H(Y), where H(Y) = -Σ pi ln(pi), is used. This technique allows the boundaries of the clusters to be adjusted. It is shown that the IGR grows monotonically with the degree of connection between the two variables. This approach has advantages over, for example, correlation analysis because it is less sensitive to the shape of the functional dependence. A variant of an algorithm implementing the proposed method, together with an analysis of the environmental-effects problem described above, is also presented. It is shown that the proposed method converges in a finite number of steps.
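A minimal sketch of computing this entropy-based measure from binned data follows; the synthetic temperature/performance samples and the candidate bin boundaries are illustrative assumptions.

    # Minimal sketch of the entropy-based connectivity measure used for cluster
    # boundary adjustment: K(Y|X) = (H(Y) - H(Y|X)) / H(Y) with natural-log
    # entropies.  The binned temperature/performance data below are illustrative.
    import numpy as np

    def entropy(labels):
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))

    def cond_entropy(y, x):
        # H(Y|X) = sum over x of P(x) * H(Y | X = x)
        h = 0.0
        for v in np.unique(x):
            mask = (x == v)
            h += mask.mean() * entropy(y[mask])
        return h

    def igr(y, x):
        return (entropy(y) - cond_entropy(y, x)) / entropy(y)

    rng = np.random.default_rng(1)
    temp = rng.uniform(18, 30, 500)                               # temperature samples
    perf = 100 - 2.5 * np.abs(temp - 22) + rng.normal(0, 3, 500)  # synthetic "performance"

    x_bins = np.digitize(temp, bins=[20, 22, 24, 26, 28])  # candidate cluster boundaries
    y_bins = np.digitize(perf, bins=[80, 90, 95])
    print(f"IGR = {igr(y_bins, x_bins):.3f}")   # larger values indicate stronger dependence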

RBF Modeling of Incipient Motion of Plane Sand Bed Channels

To define or predict incipient motion in an alluvial channel, most investigators use a standard or modified form of Shields' diagram. Shields' diagram does provide a procedure for determining the incipient motion parameters, but an iterative one. To design properly (without iteration), one also needs a resistance equation, and the absence of a universal resistance equation further magnifies the difficulty of defining the model. The neural network technique, which is particularly useful in modeling complex processes, is presented as a complementary tool for modeling incipient motion. The present work develops a neural network model employing an RBF network to predict the average velocity u and water depth y based on experimental data for the incipient condition. Based on the model, design curves are presented for field application.
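A minimal sketch of such an RBF model, trained by linear least squares on synthetic data, is given below; the choice of inputs (median grain size and bed slope) and all numerical values are illustrative assumptions rather than the experimental data set used in the paper.

    # Minimal sketch of a Gaussian-RBF network trained by linear least squares to
    # map incipient-condition inputs to average velocity u and depth y.  The
    # synthetic data and input choices (d50, slope) are illustrative assumptions.
    import numpy as np

    def rbf_design(X, centres, width):
        d2 = ((X[:, None, :] - centres[None, :, :])**2).sum(-1)
        return np.exp(-d2 / (2 * width**2))

    rng = np.random.default_rng(0)
    X = rng.uniform([0.2e-3, 1e-4], [2e-3, 5e-3], size=(200, 2))   # [d50 (m), slope]
    U = np.column_stack([                                          # synthetic targets [u, y]
        0.5 + 300 * X[:, 0] + 20 * X[:, 1],
        0.05 + 50 * X[:, 0] + 4 * X[:, 1],
    ]) + rng.normal(0, 0.01, (200, 2))

    centres = X[rng.choice(len(X), 15, replace=False)]   # RBF centres taken from the data
    Phi = rbf_design(X, centres, width=0.5e-3)
    W, *_ = np.linalg.lstsq(Phi, U, rcond=None)          # output-layer weights

    X_new = np.array([[1.0e-3, 2.0e-3]])
    u_pred, y_pred = (rbf_design(X_new, centres, 0.5e-3) @ W)[0]
    print(f"predicted u = {u_pred:.3f} m/s, y = {y_pred:.3f} m")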

Suitability of Requirements Abstraction Model (RAM) Requirements for High-Level System Testing

The Requirements Abstraction Model (RAM) helps in managing abstraction in requirements by organizing them at four levels (product, feature, function, and component). The RAM is adaptable and can be tailored to meet the needs of various organizations. Because software requirements are an important source of information for developing high-level tests, organizations willing to adopt the RAM need to know whether RAM requirements are suitable for developing high-level tests. To investigate this suitability, test cases from twenty randomly selected requirements were developed, analyzed, and graded. The requirements were selected from the requirements document of a Course Management System, a web-based software system that supports teachers and students in performing course-related tasks. This paper describes the results of the requirements document analysis. The results show that requirements at the lower levels of the RAM are suitable for developing executable tests, whereas it is hard to develop such tests from requirements at the higher levels.

Erosion-based Modeling of Abrasive Waterjet Turning

In this paper, an erosion-based model of the abrasive waterjet (AWJ) turning process is presented. Using a modified Hashish erosion model, the volume of material removed by abrasive particles impacting the surface of the rotating cylindrical specimen is estimated, and the radius reduction at each rotation is calculated. Unlike previous work, the proposed model considers the continuous change in the local impact angle due to the change in workpiece diameter, the axial traverse rate of the jet, the abrasive particle roundness, and the abrasive density. The accuracy of the proposed model is examined by experimental tests under various traverse rates. The final diameters estimated by the proposed model are in good agreement with the experiments.
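The sketch below illustrates only the rotation-by-rotation bookkeeping (update the local impact angle from the current radius, estimate the volume removed, reduce the radius); the erosion-rate expression is a generic placeholder rather than the actual modified Hashish equations, and all geometry and material constants are assumed.

    # Minimal sketch of the rotation-by-rotation radius update in AWJ turning.
    # The erosion-rate expression below is a generic placeholder (not the
    # modified Hashish equations); all constants are illustrative assumptions.
    import math

    r = 10.0e-3            # initial workpiece radius, m (assumed)
    r_jet = 0.5e-3         # jet radius, m (assumed)
    n_rev = 200            # number of workpiece rotations simulated
    k_erosion = 4.0e-10    # lumped erosion coefficient, m^3 per rev (assumed)

    def removal_per_rev(radius):
        # the local impact angle changes as the workpiece radius decreases;
        # this dependence is purely illustrative
        alpha = math.atan2(r_jet, radius)
        return k_erosion * math.sin(alpha)        # volume removed in one revolution

    for rev in range(n_rev):
        dV = removal_per_rev(r)
        # spread the removed volume over the ring swept in one revolution
        ring_area = 2 * math.pi * r * 2 * r_jet   # circumference x jet footprint width
        r -= dV / ring_area
        if rev % 50 == 0:
            print(f"rev {rev:3d}: radius = {r * 1e3:.4f} mm")

    print(f"final radius = {r * 1e3:.4f} mm")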

The Design of the HL7 RIM-based Sharing Components for Clinical Information Systems

The American Health Level Seven (HL7) Reference Information Model (RIM) consists of six backbone classes with different specialized attributes. Furthermore, to enforce semantic expression, specific mandatory vocabulary domains have been defined for representing the content values of some attributes. Because of the variety of clinical workflows, most hospitals spend a great deal of time and human cost developing and modifying Clinical Information Systems (CIS), which is a duplicated effort. This study attempts to design and develop shared RIM-based components of the CIS for the different business processes, so that the CIS contains data of a consistent format and type. Programmers can perform transactions with the RIM-based clinical repository through the shared RIM-based components, and the shared components can also be adopted when developing new functions of the CIS. These components not only satisfy physicians' needs in using a CIS but also reduce the time required to develop new components of a system. All in all, this study provides a new viewpoint: by integrating the data and functions with the business processes, it becomes an easy and flexible approach to building a new CIS.
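A minimal sketch of what such a shared RIM-based component might look like is given below, using simplified stand-ins for four of the backbone classes (Act, Entity, Role, Participation); the attribute names, vocabulary table, and helper function are illustrative, not the normative HL7 definitions or the components developed in this study.

    # Minimal sketch of a shared, RIM-flavoured component: simplified stand-ins
    # for some RIM backbone classes with coded attributes drawn from a small
    # vocabulary table.  All names and codes here are illustrative only.
    from dataclasses import dataclass, field

    ACT_CLASS_CODES = {"OBS": "observation", "PROC": "procedure"}   # tiny vocabulary domain

    @dataclass
    class Entity:            # a person, organization, material, ...
        class_code: str
        name: str

    @dataclass
    class Role:              # an Entity playing a role, e.g. patient or assigned physician
        class_code: str
        player: Entity

    @dataclass
    class Act:               # a clinical action or record entry
        class_code: str
        code: str
        effective_time: str
        participations: list = field(default_factory=list)

    @dataclass
    class Participation:     # links a Role to an Act (subject, author, performer, ...)
        type_code: str
        role: Role

    def new_observation(code, when, patient_role, author_role):
        # shared component: every business process creates observation Acts the same way
        act = Act(class_code="OBS", code=code, effective_time=when)
        act.participations.append(Participation("SBJ", patient_role))
        act.participations.append(Participation("AUT", author_role))
        return act

    patient = Role("PAT", Entity("PSN", "Jane Doe"))
    doctor = Role("ASSIGNED", Entity("PSN", "Dr. Lee"))
    obs = new_observation("8480-6", "2011-05-20T09:30", patient, doctor)
    print(ACT_CLASS_CODES[obs.class_code], len(obs.participations))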

Technological Innovation Capabilities and Firm Performance

Technological innovation capability (TIC) is defined as a comprehensive set of characteristics of a firm that facilitates and supports its technological innovation strategies. An audit to evaluate the TICs of a firm may trigger improvements in its future practices. Such an audit can be used by the firm for self-assessment or by a third party for independent assessment to identify problems in its capability status. This paper attempts to develop such an auditing framework that can help to determine the subtle links between innovation capabilities and business performance, and to enable the auditor to determine whether good practice is in place. The seven TICs in this study are learning, R&D, resource allocation, manufacturing, marketing, organization, and strategic planning capabilities. Empirical data were acquired through a survey of 200 manufacturing firms in the Hong Kong/Pearl River Delta (HK/PRD) region. Structural equation modelling was employed to examine the relationships between the TICs and various performance indicators: sales performance, innovation performance, product performance, and sales growth. The results revealed that different TICs have different impacts on different performance measures, with organization capability having the most influential impact. Hong Kong manufacturers are now facing the challenge of high-mix, low-volume customer orders. In order to cope with this change, good capability in organizing different activities among various departments is critical to a company's success.

Capacitor Placement in Radial Distribution System for Loss Reduction Using Artificial Bee Colony Algorithm

This paper presents a new method that applies the artificial bee colony (ABC) algorithm to capacitor placement in distribution systems, with the objective of improving the voltage profile and reducing power loss. The ABC algorithm is a population-based metaheuristic approach inspired by the intelligent foraging behavior of honeybee swarms. One advantage of the ABC algorithm is that it does not require external parameters such as the crossover rate and mutation rate used in genetic algorithms and differential evolution, parameters that are hard to determine in advance. The other advantage is that the global search ability of the algorithm is implemented through a neighborhood source production mechanism, which is similar to a mutation process. To demonstrate the validity of the proposed algorithm, computer simulations are carried out on a 69-bus system and the results are compared with another approach available in the literature. The proposed method outperforms the other methods in terms of solution quality and computational efficiency.
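A minimal sketch of the neighborhood source production step, v_ij = x_ij + phi*(x_ij - x_kj), applied to candidate capacitor settings is shown below; the loss model is a smooth placeholder rather than a load-flow of the 69-bus system, and all parameters are assumed.

    # Minimal sketch of the ABC neighbourhood-source step applied to capacitor
    # sizing.  The loss model is a placeholder surrogate, not an actual
    # distribution load-flow; all parameters are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    n_food, n_buses = 10, 3                   # food sources, candidate capacitor buses (assumed)
    q_max = 1200.0                            # kvar limit per location (assumed)

    def loss_kw(q):
        # placeholder objective: losses minimised near an (unknown) optimal setting
        q_opt = np.array([300.0, 450.0, 600.0])
        return 225.0 + 1e-4 * np.sum((q - q_opt)**2)

    X = rng.uniform(0, q_max, (n_food, n_buses))      # initial food sources (kvar settings)
    fitness = np.array([loss_kw(x) for x in X])

    for _ in range(200):                              # employed-bee phase only, for brevity
        for i in range(n_food):
            k = rng.choice([m for m in range(n_food) if m != i])   # random partner source
            j = rng.integers(n_buses)                              # random dimension
            v = X[i].copy()
            v[j] += rng.uniform(-1, 1) * (X[i, j] - X[k, j])       # neighbourhood production
            v[j] = np.clip(v[j], 0, q_max)
            if loss_kw(v) < fitness[i]:                            # greedy selection
                X[i], fitness[i] = v, loss_kw(v)

    best = X[np.argmin(fitness)]
    print("best capacitor settings (kvar):", np.round(best, 1), "loss:", round(fitness.min(), 2))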

Dominant Flow Features of Two Inclined Impinging Jets Confined in Large Enclosure

The present study examines the vortical structures generated by two inclined impinging jets through experimental and numerical investigations. The jets issue at a pitch angle α = 40° into a confined quiescent fluid. The flow patterns were visualized experimentally using olive particles injected into the jets and illuminated by an Nd:YAG laser sheet to reveal the finer details of the confined jet interaction. It was observed that two counter-rotating vortex pairs (CVPs) are generated in the near region. A numerical investigation was also performed. First, the numerical results were validated against the experimental results, and then the numerical model was used to study the effect of the section ratio on the evolution of the CVPs. Our results show promising agreement with the experimental data and indicate that our model has the potential to produce useful and accurate data regarding the evolution of CVPs.

Combined Feature Based Hyperspectral Image Classification Technique Using Support Vector Machines

A spatial classification technique incorporating a state-of-the-art feature extraction algorithm is proposed in this paper for classifying the heterogeneous classes present in hyperspectral images. Classification accuracy can be improved only if both the feature extraction and the classifier selection are appropriate. As the classes in hyperspectral images are assumed to have different textures, textural classification is adopted. Run-length feature extraction is employed along with principal components and independent components. A hyperspectral image of the Indiana site taken by AVIRIS is used for the experiment. Among the original 220 bands, a subset of 120 bands is selected. The Gray Level Run Length Matrix (GLRLM) is calculated for forty of the selected bands, and from the GLRLMs the run-length features for individual pixels are calculated. Principal components are calculated for another forty bands, and independent components for the remaining forty bands. As principal and independent components have the ability to represent the textural content of pixels, they are treated as features. The combination of the run-length features, principal components, and independent components forms the combined feature set, which is used for classification. An SVM with a binary hierarchical tree is used to classify the hyperspectral image. The results are validated against ground truth and the accuracies are calculated.
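A minimal sketch of the combined-feature pipeline on synthetic data is given below; the run-length function is a simplified stand-in for GLRLM statistics, and the random data cube stands in for the AVIRIS scene.

    # Minimal sketch of the combined-feature pipeline: placeholder per-pixel
    # run-length-style features on one band subset, PCA on another, ICA on a
    # third, then an SVM classifier.  The data here are synthetic stand-ins.
    import numpy as np
    from sklearn.decomposition import PCA, FastICA
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n_pixels, n_bands = 2000, 120
    cube = rng.normal(size=(n_pixels, n_bands))          # pixels x selected bands (synthetic)
    labels = rng.integers(0, 4, n_pixels)                # ground-truth classes (synthetic)

    bands_rl, bands_pca, bands_ica = cube[:, :40], cube[:, 40:80], cube[:, 80:120]

    def run_length_features(block):
        # simplified stand-in for GLRLM statistics: per-pixel texture summaries
        return np.column_stack([block.mean(1), block.std(1),
                                np.abs(np.diff(block, axis=1)).mean(1)])

    f_rl = run_length_features(bands_rl)
    f_pca = PCA(n_components=5).fit_transform(bands_pca)
    f_ica = FastICA(n_components=5, random_state=0).fit_transform(bands_ica)

    combined = np.hstack([f_rl, f_pca, f_ica])           # combined feature vector per pixel
    clf = SVC(kernel="rbf").fit(combined[:1500], labels[:1500])
    print("held-out accuracy:", round(clf.score(combined[1500:], labels[1500:]), 3))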

An Intelligent Water Drop Algorithm for Solving Economic Load Dispatch Problem

Economic Load Dispatch (ELD) is a method of determining the most efficient, low-cost, and reliable operation of a power system by dispatching the available electricity generation resources to supply the load on the system. The primary objective of economic dispatch is to minimize the total cost of generation while honoring the operational constraints of the available generation resources. In this paper, an intelligent water drop (IWD) algorithm is proposed to solve the ELD problem with the objective of minimizing the total cost of generation. The intelligent water drop algorithm is a swarm-based, nature-inspired optimization algorithm modeled on natural rivers. A natural river often finds good paths among the many possible paths on its way from source to destination and finally reaches an almost optimal path. These ideas are embedded into the proposed algorithm for solving the economic load dispatch problem. The main advantages of the proposed technique are that it is easy to implement and capable of finding a feasible, near-globally-optimal solution with less computational effort. In order to illustrate the effectiveness of the proposed method, it has been tested on 6-unit and 20-unit test systems with incremental fuel cost functions that take the valve-point loading effects into account. Numerical results show that the proposed method has good convergence properties and yields better solution quality than other algorithms reported in the recent literature.
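For reference, a minimal sketch of the dispatch cost that such an algorithm evaluates, including the valve-point term |e*sin(f*(Pmin - P))| and a simple power-balance penalty, is given below; the three-unit coefficients and demand are illustrative assumptions, not the 6-unit or 20-unit test-system data.

    # Minimal sketch of the ELD objective with valve-point loading effects,
    # F_i(P_i) = a_i + b_i*P_i + c_i*P_i**2 + |e_i*sin(f_i*(P_i_min - P_i))|,
    # evaluated for a 3-unit example with illustrative coefficients and demand.
    import numpy as np

    # columns: a, b, c, e, f, P_min, P_max
    units = np.array([
        [500.0, 5.3, 0.004, 300.0, 0.035, 100.0, 450.0],
        [400.0, 5.5, 0.006, 200.0, 0.042,  60.0, 350.0],
        [200.0, 5.8, 0.009, 150.0, 0.063,  40.0, 200.0],
    ])
    demand = 750.0     # MW (assumed, losses neglected)

    def total_cost(P, penalty=1e4):
        a, b, c, e, f, pmin, pmax = units.T
        fuel = a + b * P + c * P**2 + np.abs(e * np.sin(f * (pmin - P)))
        balance = abs(P.sum() - demand)                 # power-balance violation
        bounds = np.sum(np.clip(pmin - P, 0, None) + np.clip(P - pmax, 0, None))
        return fuel.sum() + penalty * (balance + bounds)

    # a candidate dispatch such as one produced by an IWD (or any metaheuristic) iteration
    P_candidate = np.array([400.0, 230.0, 120.0])
    print(f"total cost of candidate dispatch: {total_cost(P_candidate):.1f} $/h")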