Oscillation Effect of the Multi-stage Learning for the Layered Neural Networks and Its Analysis

This paper proposes an efficient learning method for layered neural networks based on the selection of training data and on the input characteristics of an output layer unit. Compared with more recent neural network models, such as pulse neural networks and quantum neuro-computation, the multilayer network is widely used because of its simple structure. When the learning objects are complicated, however, problems such as unsuccessful learning or excessively long learning times remain unsolved. Focusing on the input data during the learning stage, we undertook an experiment to identify the data that produce large errors and interfere with the learning process. Our method divides the learning process into several stages. In general, the input characteristics of an output layer unit oscillate during the learning process for complicated problems. The multi-stage learning method proposed by the authors for function approximation problems classifies the learning data in a phased manner, focusing on their learnability prior to learning in the multilayer neural network, and this paper demonstrates the validity of the multi-stage learning method. Specifically, computer experiments verify that both learning accuracy and learning time are improved when the BP method is used as the learning rule of the multi-stage learning method. During learning, oscillatory phenomena of the learning curve play an important role in learning performance. The authors also discuss the mechanisms by which these oscillatory phenomena occur. Furthermore, by observing the behavior during learning, the authors discuss the reasons why the errors of some data remain large even after learning.
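
As a rough illustration of the staged idea described above, the sketch below trains a small one-hidden-layer network by plain backpropagation, ranks the training samples by their remaining error after a preliminary pass, and then continues learning on the easier half before returning to the full set. The network size, the error-based split, and the stage lengths are assumptions for illustration, not the authors' exact procedure.

    import numpy as np

    # Toy function approximation task.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(200, 1))
    y = np.sin(np.pi * X)

    n_hidden = 10
    W1 = rng.normal(scale=0.5, size=(1, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
    b2 = np.zeros(1)

    def forward(x):
        h = np.tanh(x @ W1 + b1)
        return h, h @ W2 + b2

    def train(idx, epochs, lr=0.05):
        """Plain backpropagation (BP) on the selected subset of samples."""
        global W1, b1, W2, b2
        for _ in range(epochs):
            h, out = forward(X[idx])
            err = out - y[idx]
            grad_W2 = h.T @ err / len(idx)
            grad_b2 = err.mean(0)
            dh = (err @ W2.T) * (1 - h ** 2)
            grad_W1 = X[idx].T @ dh / len(idx)
            grad_b1 = dh.mean(0)
            W2 -= lr * grad_W2; b2 -= lr * grad_b2
            W1 -= lr * grad_W1; b1 -= lr * grad_b1

    # Stage 1: preliminary pass on all data, then rank samples by error.
    all_idx = np.arange(len(X))
    train(all_idx, epochs=300)
    _, out = forward(X)
    per_sample_err = ((out - y) ** 2).ravel()
    easy = all_idx[np.argsort(per_sample_err)[: len(X) // 2]]

    # Stage 2: refine on the easier half first, then on the full set.
    train(easy, epochs=500)
    train(all_idx, epochs=500)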

Numerical Analysis of Flow through Abrasive Water Suspension Jet: The Effect of Garnet, Aluminum Oxide and Silicon Carbide Abrasive on Skin Friction Coefficient Due To Wall Shear and Jet Exit Kinetic Energy

It is well known that the abrasive particles in an abrasive water suspension have a significant effect on the erosion characteristics of the inside surface of the nozzle. Abrasive particles moving with the flow cause a severe skin friction effect, thereby altering the nozzle diameter through wear, which in turn affects the life of the nozzle for effective machining. Various commercial abrasives are available for abrasive water jet machining, and the erosion characteristics of each abrasive are different. In consideration of this aspect, the present work analyzes the effect of the abrasive materials garnet, aluminum oxide, and silicon carbide on the skin friction coefficient due to wall shear stress and on the jet exit kinetic energy. It is found that an abrasive material of lower density produces a relatively higher skin friction effect and a higher jet exit kinetic energy.

Static Headspace GC Method for Aldehydes Determination in Different Food Matrices

Aldehydes, as secondary lipid oxidation products, are highly specific to the oxidative degradation of the particular polyunsaturated fatty acids present in foods. Gas chromatographic analysis of these volatile compounds has been widely used for monitoring the deterioration of food products. A static headspace gas chromatography method with flame ionization detection (SHS-GC-FID) was developed and applied to monitor the aldehydes present in processed foods such as bakery, meat, and confectionery products. Five selected aldehydes were determined in samples without any sample preparation, except grinding for the bakery and meat products. The SHS-GC analysis allows the separation of propanal, pentanal, hexanal, heptanal, and octanal within 15 min. The aldehydes were quantified in fresh and stored samples; the obtained range of aldehydes was 1.62±0.05 to 9.95±0.05 mg/kg in crackers, 6.62±0.46 to 39.16±0.39 mg/kg in sausages, and 0.48±0.01 to 1.13±0.02 mg/kg in cocoa spread cream. From the obtained results it can be concluded that the proposed method is suitable for different types of samples, that the aldehyde content varies with the type of sample, and that it differs between fresh and stored samples of the same type.

Design of Smart Energy Monitoring System for Green IT Life

This paper describes a smart energy monitoring system with a wireless sensor network for monitoring electricity usage in a smart house. The proposed system is composed of wireless plugs and an energy control wallpad server. A wireless plug integrates an AC power socket, a relay to switch the socket ON/OFF, a Hall effect sensor to sense the current of the load appliance, and a Kmote. The Kmote is a wireless communication interface based on TinyOS. We evaluated the wireless plug in a laboratory and analyzed and present energy consumption data collected from household electrical appliances over three months.
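
As a small, hypothetical illustration of how current readings from a Hall effect sensor can be turned into energy figures on the server side, the sketch below computes RMS current, power, and accumulated energy under an assumed resistive load and a fixed 220 V supply; the values and function names are assumptions for this sketch, not taken from the paper.

    import math

    MAINS_VRMS = 220.0          # assumed supply voltage
    WINDOW_PERIOD_S = 1.0       # one RMS reading per second (assumption)

    def rms(samples):
        return math.sqrt(sum(s * s for s in samples) / len(samples))

    def accumulate_energy(current_windows):
        """current_windows: iterable of lists of instantaneous current samples (A)."""
        energy_wh = 0.0
        for window in current_windows:
            i_rms = rms(window)
            power_w = MAINS_VRMS * i_rms                 # resistive-load assumption
            energy_wh += power_w * WINDOW_PERIOD_S / 3600.0
        return energy_wh

    # Example: a sinusoidal ~0.5 A (RMS) load sampled at 1 kHz for 10 seconds.
    window = [0.707 * math.sin(2 * math.pi * 50 * t / 1000) for t in range(1000)]
    print(round(accumulate_energy([window] * 10), 4), "Wh")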

Product Configuration Strategy Based On Product Family Similarity

To offer a large variety of products while maintaining low costs, high speed, and high quality in a mass customization product development environment, platform-based product development offers significant benefits in many industrial fields. This paper proposes a product configuration strategy based on a similarity measure, incorporating knowledge engineering principles such as a product information model, ontology engineering, and formal concept analysis.
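
A minimal sketch of configuring by similarity is given below, assuming product variants are represented as sets of options and similarity is measured with the Jaccard index; the feature names are hypothetical, and the paper's own similarity measure and ontology-based model are richer than this.

    # Similarity between product configurations as the Jaccard index over option sets.
    def jaccard(a: set, b: set) -> float:
        return len(a & b) / len(a | b) if (a | b) else 1.0

    variants = {
        "variant1": {"chassis_A", "motor_200W", "controller_v2", "display_touch"},
        "variant2": {"chassis_B", "motor_350W", "controller_v2"},
    }

    # Configure a new order by reusing the most similar existing variant.
    order = {"chassis_A", "motor_200W", "display_touch"}
    best = max(variants, key=lambda name: jaccard(order, variants[name]))
    print(best, round(jaccard(order, variants[best]), 2))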

Incidence, Occurrence, Classification and Outcome of Small Animal Fractures: A Retrospective Study (2005-2010)

A retrospective study was undertaken to record the occurrence and pattern of fractures in small animals (dogs and cats) from 2005 to 2010. A total of 650 cases were presented to the small animal surgery unit, of which 116 (dogs and cats) had a history of fractures of different bones. Fractures thus accounted for 17.8% (116/650) of the cases, of which dogs constituted 67% and cats 23%. The majority of the animals were intact. Trauma in the form of roadside accidents was the principal cause of fractures in dogs, whereas in cats it was a fall from height. The ages of the fractured dogs ranged from 4 months to 12 years, whereas in cats they ranged from 4 weeks to 10 years. Femoral fractures represented 37.5% and 25% of fractures in dogs and cats, respectively. The diaphysis, distal metaphysis, and supracondylar region were the most affected sites in dogs and cats. Tibial fractures in dogs and cats represented 21.5% and 10%, while humeral fractures were 7.9% and 14% in dogs and cats, respectively. Humeral condylar fractures were most commonly seen in puppies aged 4 to 6 months. The incidence of radius-ulna fractures was 19% and 14% in dogs and cats, respectively. Other recorded fractures involved the lumbar vertebrae, mandible, and metacarpals. Management comprised external and internal fixation in both species. The most common internal fixation technique was intramedullary fixation of long bones, followed by other methods such as stack or cross pinning and wiring, as indicated by the individual cases. Cast bandages were mainly used for external coaptation. The paper discusses the outcomes of the cases according to the technique employed.

The Vulnerability Analysis of Java Bytecode Based on Points-to Dataflow

Today many developers use Java components collected from the Internet as external libraries to design and develop their own software. However, unknown security bugs may exist in these components; for example, an SQL injection bug may come from a component that performs no specific check on user input strings. Detecting these bugs is very difficult without source code. A novel method is therefore needed to check for such bugs in Java bytecode based on points-to dataflow analysis, which differs from the common analysis techniques based on vulnerability pattern checking. It can be used as an assistant tool for the security analysis of Java bytecode from unknown software that will be used as external libraries.
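
The sketch below illustrates the flavor of such an analysis on a toy three-address representation: taint/points-to facts from an untrusted input are propagated through assignments and reported when they reach an SQL sink. It is a drastic simplification under assumed instruction names, not the method of the paper, which operates on actual Java bytecode.

    # (op, dst, src): 'input' marks untrusted user data, 'sanitize' clears it,
    # 'assign' propagates facts, 'sql' is a sensitive sink.
    code = [
        ("input",    "s",  None),
        ("assign",   "q",  "s"),
        ("sql",      None, "q"),
        ("sanitize", "t",  "s"),
        ("sql",      None, "t"),
    ]

    tainted = set()
    findings = []
    for op, dst, src in code:
        if op == "input":
            tainted.add(dst)
        elif op == "assign" and src in tainted:
            tainted.add(dst)                  # dst may point to tainted data
        elif op == "sanitize":
            tainted.discard(src)              # hypothetical sanitizer
        elif op == "sql" and src in tainted:
            findings.append(f"possible SQL injection via '{src}'")

    print(findings)   # -> ["possible SQL injection via 'q'"]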

Variance Based Component Analysis for Texture Segmentation

This paper presents a comparative analysis of a new unsupervised PCA-based technique for steel plate texture segmentation aimed at defect detection. The proposed scheme, called Variance Based Component Analysis (VBCA), employs PCA for feature extraction, applies a feature reduction algorithm based on the variance of the eigenpictures, and classifies the pixels as defective or normal. Whereas classic PCA uses a clusterer such as k-means for pixel clustering, VBCA employs thresholding and some post-processing operations to label pixels as defective or normal. The experimental results show that the proposed VBCA algorithm is 12.46% more accurate and 78.85% faster than classic PCA.
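
A rough sketch of the pipeline on synthetic data is given below, assuming that features are extracted by PCA over fixed-size pixel patches, that the retained components are those carrying most of the variance of the eigenpictures, and that defective regions are labeled by a simple statistical threshold instead of k-means; the patch size, variance cut-off, and threshold are illustrative choices, not the settings of the paper.

    import numpy as np

    # Synthetic "steel plate" image: low-variance background plus one bright defect.
    rng = np.random.default_rng(1)
    image = rng.normal(0.5, 0.02, size=(64, 64))
    image[30:34, 30:34] += 0.4

    def patches(img, k=4):
        h, w = img.shape
        return np.array([img[i:i + k, j:j + k].ravel()
                         for i in range(0, h - k + 1, k)
                         for j in range(0, w - k + 1, k)])

    P = patches(image)
    P_centered = P - P.mean(axis=0)

    # PCA: eigen-decomposition of the patch covariance matrix ("eigenpictures").
    eigvals, eigvecs = np.linalg.eigh(np.cov(P_centered, rowvar=False))
    order = np.argsort(eigvals)[::-1]
    # Keep the components that carry 95% of the variance (illustrative cut-off).
    keep = np.searchsorted(np.cumsum(eigvals[order]) / eigvals.sum(), 0.95) + 1
    basis = eigvecs[:, order[:keep]]

    # Threshold the strongest projection of each patch instead of clustering.
    score = np.abs(P_centered @ basis).max(axis=1)
    threshold = score.mean() + 3 * score.std()
    print("patches labeled defective:", np.where(score > threshold)[0])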

Detecting and Measuring Fabric Pills Using Digital Image Analysis

In this paper a novel method is presented for evaluating fabric pills using digital image processing techniques. This work provides a new technique for detecting pills and for measuring their heights, surface areas, and volumes. Measuring the intensity of defects by human vision is an inaccurate method for quality control; this problem motivated the use of digital image processing techniques for detecting defects on the fabric surface. In earlier work, systems were limited to measuring the surface area of defects, whereas in the presented method the height and volume of defects are also measured, which leads to more accurate quality control. An algorithm was developed that first finds the pills and then measures their average intensity using three criteria: height, surface area, and volume. The results showed a meaningful relation between the number of rotations and the quality of the pilled fabrics.
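
As an illustration of the measurement step, the sketch below segments bright connected regions of a synthetic fabric image and reports the surface area, height, and volume of each detected pill. Treating intensity above the background level as a stand-in for pill height is an assumption of this sketch, not the paper's actual height measurement, and the threshold and test image are likewise illustrative.

    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(2)
    fabric = rng.normal(0.3, 0.02, size=(100, 100))
    fabric[20:26, 40:46] += 0.5                      # synthetic pill
    fabric[70:73, 10:13] += 0.4                      # smaller synthetic pill

    background = np.median(fabric)
    height_map = np.clip(fabric - background, 0, None)
    mask = height_map > 0.2                          # illustrative threshold

    labels, n_pills = ndimage.label(mask)
    for pill_id in range(1, n_pills + 1):
        region = labels == pill_id
        surface = int(region.sum())                  # area in pixels
        height = float(height_map[region].max())     # tallest point of the pill
        volume = float(height_map[region].sum())     # area x local height
        print(f"pill {pill_id}: surface={surface}px height={height:.2f} volume={volume:.2f}")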

Oil Refineries Emissions: Source and Impact: A Study using AERMOD

The main objectives of this paper are to measure pollutant concentrations in the oil refinery area of Kuwait over three periods during one year; to obtain a recent emission inventory for the three refineries of Kuwait; to use AERMOD and the emission inventory to predict pollutant concentrations and their distribution; to compare the model predictions against measured data; and to perform numerical experiments to determine the conditions at which emission rates and the resulting pollutant dispersion remain below the maximum allowable limits.

Impacts of the Courtyard with Glazed Roof on House Winter Thermal Conditions

The 'wind-rain' house has a courtyard with a glazed roof, which allows more direct sunlight to enter the indoor spaces during the winter. The glazed roof can be partially opened or closed and automatically controlled to provide natural ventilation in order to adjust indoor thermal conditions, and the roof area can be shaded with reflective insulation materials during the summer. Two field studies evaluating the indoor thermal conditions of two 'wind-rain' houses were carried out by the author in 2009 and 2010. Indoor and outdoor air temperature and relative humidity adjacent to the floor and ceiling of the two sample houses were continuously recorded at 15-minute intervals, 24 hours a day, during the winter months. Based on the field study data, this study investigates the relationships between building design and the indoor thermal conditions of the 'wind-rain' house in order to improve future house designs for thermal comfort and energy efficiency.

Feasibility Study on Designing a Flat Loop Heat Pipe (LHP) to Recover the Heat from Exhaust of a Gas Turbine

A theoretical study is conducted to design a Loop Heat Pipe (LHP) and explore the effect of different parameters, such as heat load, the tube size of the piping system, wick thickness, porosity, and hole size, on its performance and capability. This paper presents a steady-state model that describes the different phenomena inside an LHP. Loop Heat Pipes (LHPs) are two-phase heat transfer devices with capillary pumping of a working fluid. Owing to their design, which differs from that of conventional heat pipes, and to the special properties of their capillary structure, they are capable of transferring heat efficiently over distances of up to several meters at any orientation in the gravity field, or over several meters in a horizontal position. The theoretical model is described by relations that satisfy important limits such as the capillary and nucleate boiling limits. An algorithm is developed to predict the size of the LHP satisfying the limits mentioned above for a wide range of applied loads. Finally, to assess the algorithm and all the relations considered, they were used to design a new kind of LHP to recover heat from the exhaust of an actual gas turbine. The results show that the LHP can be used as a highly efficient device to recover heat even at high loads, such as the exhaust of a gas turbine. The sizes of all parts of the LHP were obtained using the developed algorithm.
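
One of the limits mentioned above, the capillary limit, can be illustrated in a few lines: the maximum capillary pressure of the wick must exceed the sum of the pressure losses around the loop. The fluid properties, pore radius, and loss values below are placeholder numbers for illustration, not the design values of the paper.

    import math

    def capillary_pressure(sigma, r_eff, contact_angle_deg=0.0):
        """Maximum capillary pressure of the wick: 2*sigma*cos(theta)/r_eff."""
        return 2.0 * sigma * math.cos(math.radians(contact_angle_deg)) / r_eff

    def capillary_limit_ok(sigma, r_eff, dp_liquid, dp_vapor, dp_gravity):
        # The wick can sustain the flow only if it can pump against all losses.
        return capillary_pressure(sigma, r_eff) >= dp_liquid + dp_vapor + dp_gravity

    # Example with a water-like surface tension and a 5-micron effective pore radius.
    sigma = 0.059          # N/m (water near 100 C, approximate)
    r_eff = 5e-6           # m, effective pore radius of the wick
    print(capillary_pressure(sigma, r_eff))                    # ~23.6 kPa
    print(capillary_limit_ok(sigma, r_eff, 8e3, 4e3, 2e3))     # True for these losses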

A New Method of Adaptation in Integrated Learning Environment

A new method of adaptation in a partially integrated learning environment that includes an electronic textbook (ET) and an integrated tutoring system (ITS) is described. The adaptation algorithm is described in detail. It includes: the establishment of interconnections between operations and concepts; an estimate of the mastery level for each concept; an estimate of the student's non-mastery level, at the current learning step, of the information on each page of the ET; and the creation of a rank-ordered list of links to the e-manual pages containing information that requires repeated work.
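
The final step of the algorithm, the rank-ordered list of links, can be sketched as follows, assuming hypothetical concept mastery estimates and page-to-concept mappings; the way concept non-mastery is aggregated into a page score is an illustrative choice.

    # Hypothetical mastery estimates per concept (1.0 = fully mastered).
    concept_mastery = {"loops": 0.9, "recursion": 0.4, "pointers": 0.2}

    pages = {
        "et/page12.html": ["loops"],
        "et/page27.html": ["recursion", "loops"],
        "et/page31.html": ["pointers", "recursion"],
    }

    def non_mastery(page_concepts):
        # Non-mastery of a page = mean (1 - mastery) over its concepts.
        return sum(1.0 - concept_mastery[c] for c in page_concepts) / len(page_concepts)

    ranked = sorted(pages, key=lambda p: non_mastery(pages[p]), reverse=True)
    for link in ranked:
        print(f"{non_mastery(pages[link]):.2f}  {link}")
    # Pages with the least-mastered concepts appear first for repeated work.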

An Improved Quality Adaptive Rate Filtering Technique Based on the Level Crossing Sampling

Most systems deal with time-varying signals. Power efficiency can be achieved by adapting the system activity to the input signal variations. In this context, an adaptive rate filtering technique based on level-crossing sampling is devised. It adapts the sampling frequency and the filter order by following the local variations of the input signal, thus correlating the processing activity with the signal variations. Interpolation is required in the proposed technique, and a drastic reduction in the interpolation error is achieved by exploiting symmetry during the interpolation process. The processing error of the proposed technique is calculated. The computational complexity of the proposed filtering technique is derived and compared to that of the classical approach. The results promise a significant gain in computational efficiency and hence in power consumption.
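
A minimal sketch of level-crossing sampling, the mechanism behind the adaptive rate, is shown below: a new sample is taken only when the signal crosses one of a set of uniformly spaced amplitude levels, so the local sampling rate follows the signal activity. The quantum size and the test signal are illustrative assumptions.

    import numpy as np

    def level_crossing_sample(t, x, quantum=0.1):
        levels = np.arange(x.min(), x.max() + quantum, quantum)
        samples = [(t[0], x[0])]
        last = x[0]
        for ti, xi in zip(t[1:], x[1:]):
            crossed = levels[(levels > min(last, xi)) & (levels <= max(last, xi))]
            if crossed.size:                 # sample only when a level is crossed
                samples.append((ti, xi))
                last = xi
        return np.array(samples)

    t = np.linspace(0, 1, 2000)
    x = np.sin(2 * np.pi * 2 * t) * np.exp(-2 * t)   # decaying, time-varying test signal
    s = level_crossing_sample(t, x)
    print(len(t), "uniform samples ->", len(s), "level-crossing samples")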

Value Stream Oriented Inventory Management

Manufacturing companies aspire to high delivery availability despite the disruptions that occur. To ensure high delivery availability, safety stocks are required. However, safety stock ties up additional capital and compensates for disruptions instead of resolving their causes. The intention is to increase the stability of production by configuring production planning and control systematically, so that the safety stock can be reduced. The largest proportion of inventory in manufacturing companies is caused by batch inventory, schedule deviations, and the variability of demand rates. These causes of high inventory levels can be addressed by configuring production planning and control accordingly, and hence the inventory level can be reduced. This is enabled by synchronizing lot sizes, smoothing demand, and optimizing order release, sequencing, and capacity control.

Cognitive Landscape of Values – Understanding the Information Contents of Mental Representations

The values of managers and employees in organizations are phenomena that have captured the interest of many researchers. Despite this attention, there continues to be a lack of agreement on what values are, how they influence individuals, or how they are constituted in individuals' minds. In this article, a content-based approach is presented as an alternative frame of reference for exploring values. In the content-based approach, human thinking in different contexts is the focal point. Differences in valuations can be explained through the information contents of mental representations. In addition to the information contents, attention is devoted to the cognitive processes through which mental representations of values are constructed. Such information contents play a decisive role in understanding human behavior. By applying content-based analysis to an examination of values as mental representations, it is possible to reach a deeper understanding of the motivational foundation of behaviors, such as decision making in organizational procedures, through understanding the structure and meanings of the specific values at play.

Granulation using Clustering and Rough Set Theory and its Tree Representation

Granular computing deals with the representation of information in the form of aggregates and with related methods for their transformation and analysis in problem solving. A granulation scheme based on clustering and Rough Set Theory, with a focus on the structured conceptualization of information, is presented in this paper. Experiments with the proposed method on four labeled datasets show good results with reference to the classification problem. The proposed granulation technique is semi-supervised, incorporating both global and local information into the granulation. To represent the results of the attribute-oriented granulation, a tree structure is also proposed in this paper.
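
The rough-set side of the granulation can be illustrated on toy data as below: given a granule produced by clustering and the equivalence classes induced by an attribute, the lower and upper approximations of the granule are computed. The objects and attribute values are hypothetical and only show the mechanism.

    # Objects described by a single attribute, and a granule found by clustering.
    objects = {"o1": "red", "o2": "red", "o3": "blue", "o4": "blue", "o5": "green"}
    cluster = {"o1", "o2", "o3"}

    # Equivalence classes of the indiscernibility relation on the attribute.
    classes = {}
    for obj, value in objects.items():
        classes.setdefault(value, set()).add(obj)

    lower = {o for c in classes.values() if c <= cluster for o in c}
    upper = {o for c in classes.values() if c & cluster for o in c}

    print("lower approximation:", sorted(lower))   # classes fully inside the cluster
    print("upper approximation:", sorted(upper))   # classes overlapping the cluster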

Instability of Ties in Compression

Masonry cavity walls are loaded by wind pressure and by vertical loads from the upper floors. These loads result in bending moments and compression forces in the ties connecting the outer and the inner wall of a cavity wall. Large cavity walls are furthermore loaded by differential movements arising from the temperature gradient between the outer and the inner wall, which results in a critical increase of the bending moments in the ties. Since the ties are loaded by combined compression and bending, the load-bearing capacity is derived from instability equilibrium equations. Most of these are iterative, since exact instability solutions are complex to derive, not to mention the extra complexity introduced by the dimensional instability arising from the temperature gradients. Using an inverse variable substitution and comparing an exact theory with an analytical instability solution, a method to design tie connectors in cavity walls was developed. The method takes into account constraint conditions limiting the free length of the wall tie, and the instability in the case of pure compression, which gives an optimal load-bearing capacity. The model is illustrated with examples from practice.

Dynamic Action Induced By Walking Pedestrian

The main focus of this paper is on human-induced forces. Almost all existing force models for this type of load (defined either in the time or the frequency domain) are developed from the assumption of perfect periodicity of the force and are based on force measurements conducted on rigid (i.e. high-frequency) surfaces. To verify the conclusions of different authors, vertical pressure measurements during walking were performed using pressure gauges in various configurations. The obtained forces are analyzed using the Fourier transformation. This load is often decisive in the design of footbridges. Design criteria and load models proposed by widely used standards and by other researchers are introduced and compared.
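
The Fourier analysis mentioned above can be sketched as follows: the harmonic amplitudes of a vertical walking force record are extracted at the pacing frequency and its multiples and expressed as dynamic load factors. The synthetic force signal, pacing frequency, and pedestrian weight are placeholder assumptions standing in for the pressure-gauge measurements.

    import numpy as np

    fs = 200.0                            # sampling frequency, Hz (assumed)
    fp = 2.0                              # pacing frequency, Hz (assumed)
    G = 700.0                             # static weight of the pedestrian, N
    t = np.arange(0, 10, 1 / fs)
    force = G * (1 + 0.4 * np.sin(2 * np.pi * fp * t)
                   + 0.1 * np.sin(2 * np.pi * 2 * fp * t))

    spectrum = np.fft.rfft(force) / len(force)
    freqs = np.fft.rfftfreq(len(force), 1 / fs)
    for harmonic in (1, 2, 3):
        idx = np.argmin(np.abs(freqs - harmonic * fp))
        amplitude = 2 * np.abs(spectrum[idx])        # single-sided amplitude at the harmonic
        print(f"harmonic {harmonic}: {amplitude:.1f} N "
              f"(dynamic load factor {amplitude / G:.2f})")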

Performance Evaluation of the Post-Installed Anchor for Sign Structure

Numerous experimental tests of post-installed anchor systems drilled into hardened concrete were conducted in order to estimate their pull-out and shear strength, accounting for uncertainties such as torque ratios, embedment depths, and different anchor diameters. In this study, the strength of the systems was significantly changed by the effect of those three uncertainties in the pull-out tests, whereas the shear strength of the systems was not affected by the torque ratios. Concrete cone failure and damage mechanisms were generally investigated during and after the pull-out tests, while in the shear strength tests the anchor systems mostly failed prior to failure of the primary structural system. Furthermore, a 3D finite element model of the anchor systems was created in ABAQUS for the numerical analysis. For verification, the finite element model reproduced the load-displacement relationship obtained from the experimental tests up to the failure points.