Abstraction Hierarchies for Engineering Design

Complex engineering design problems consist of numerous factors of varying criticalities. Treating fundamental design features and minor details alike results in an extensive waste of time and effort. Design parameters should be introduced gradually, as appropriate, based on their significance within the problem context. This motivates the representation of design parameters at multiple levels of an abstraction hierarchy. However, the development of abstraction hierarchies is not well understood. Our research proposes a novel hierarchical abstraction methodology for planning effective engineering designs and processes. It provides a theoretically sound foundation to represent, abstract and stratify engineering design parameters and tasks according to causality and criticality. The methodology creates abstraction hierarchies in a recursive, bottom-up approach that guarantees no backtracking across any of the abstraction levels. The methodology consists of three main phases: representation, abstraction, and layering into multiple hierarchical levels. The effectiveness of the developed methodology is demonstrated on a design problem.

Corporate Credit Rating Using Multiclass Classification Models with Order Information

Corporate credit rating prediction using statistical and artificial intelligence (AI) techniques has been an attractive research topic in the literature. In recent years, multiclass classification models such as the artificial neural network (ANN) or the multiclass support vector machine (MSVM) have become very appealing machine learning approaches due to their good performance. However, most of them focus only on classifying samples into nominal categories, so the unique characteristic of credit ratings - ordinality - has seldom been considered in their approaches. This study proposes new types of ANN and MSVM classifiers, named OMANN and OMSVM respectively. OMANN and OMSVM are designed to extend binary ANN or SVM classifiers by applying the ordinal pairwise partitioning (OPP) strategy. These models can handle ordinal multiple classes efficiently and effectively. To validate the usefulness of these two models, we applied them to a real-world bond rating case and compared the results of our models with those of conventional approaches. The experimental results showed that our proposed models improve classification accuracy compared with typical multiclass classification techniques while requiring fewer computational resources.
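To illustrate the general OPP idea, the following is a minimal Python sketch, not the authors' exact OMANN/OMSVM formulation: binary SVMs are chained in an ordered-partitions style, where the k-th classifier learns whether a sample's rating exceeds grade k, and per-grade probabilities are recovered by differencing the cumulative outputs. The class name, integer labels, and the use of scikit-learn's SVC are illustrative assumptions.

```python
# Hedged sketch of ordinal pairwise partitioning (OPP) with binary SVMs.
# Assumptions: integer rating labels 0..K-1; scikit-learn available.
import numpy as np
from sklearn.svm import SVC

class OrdinalOPP:
    def fit(self, X, y):
        self.classes_ = np.sort(np.unique(y))
        # One binary SVM per threshold: "is rating > k?"
        self.models_ = [SVC(probability=True).fit(X, (y > k).astype(int))
                        for k in self.classes_[:-1]]
        return self

    def predict(self, X):
        # P(y > k) for each threshold, then P(y = k) by differencing.
        # Note: the binary estimates are not guaranteed monotone in k,
        # a known caveat of this cumulative scheme.
        gt = np.column_stack([m.predict_proba(X)[:, 1] for m in self.models_])
        cum = np.hstack([np.ones((len(X), 1)), gt, np.zeros((len(X), 1))])
        probs = cum[:, :-1] - cum[:, 1:]  # P(y=k) = P(y>k-1) - P(y>k)
        return self.classes_[np.argmax(probs, axis=1)]
```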

An Effective Framework for Chinese Syntactic Parsing

This paper presents an effective framework for Chinese syntactic parsing, which comprises two parts. The first is a parsing framework based on an improved bottom-up chart parsing algorithm; it integrates the beam search strategy of the N-best algorithm with the heuristic function of the A* algorithm for pruning, producing multiple candidate parse trees. The second is a novel evaluation model, which integrates contextual and partial lexical information into the traditional PCFG model and defines a new score function. Using this model, the tree with the highest score is selected as the best parse tree. Finally, comparative experimental results are given. Keywords: syntactic parsing, PCFG, pruning, evaluation model.
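The pruning idea described above, keeping only the N best-scoring items per chart cell, can be sketched as follows. This is a hedged illustration of beam pruning in a CKY-style PCFG chart parser, not the paper's improved bottom-up algorithm or its enriched score function; the toy grammar, probabilities, and beam width are assumptions.

```python
# Hedged sketch: beam-pruned CKY chart parsing over a tiny CNF PCFG.
# Grammar, sentence, and beam width are illustrative assumptions.
import math
from collections import defaultdict

RULES = {                       # (B, C) -> [(A, log P(A -> B C))]
    ("NP", "VP"): [("S", math.log(1.0))],
    ("V", "NP"): [("VP", math.log(1.0))],
}
LEX = {"dogs": [("NP", math.log(0.5))],
       "chase": [("V", math.log(1.0))],
       "cats": [("NP", math.log(0.5))]}
BEAM = 3                        # keep at most BEAM items per chart cell

def parse(words):
    n = len(words)
    chart = defaultdict(list)   # (i, j) -> [(label, log probability)]
    for i, w in enumerate(words):
        chart[(i, i + 1)] = LEX.get(w, [])
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            cell = []
            for k in range(i + 1, j):
                for b, pb in chart[(i, k)]:
                    for c, pc in chart[(k, j)]:
                        for a, pr in RULES.get((b, c), []):
                            cell.append((a, pr + pb + pc))
            # Beam pruning: keep only the N best-scoring items.
            chart[(i, j)] = sorted(cell, key=lambda t: -t[1])[:BEAM]
    return chart[(0, n)]

print(parse("dogs chase cats".split()))  # [('S', log 0.25)]
```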

Comparison of Three Turbulence Models in Wear Prediction of Multi-Size Particulate Flow through a Rotating Channel

The present work compares the performance of three turbulence modeling approaches (based on the two-equation k-ε model) in predicting erosive wear in multi-size dense slurry flow through a rotating channel. All three turbulence models include a rotation modification to the production term in the turbulent kinetic energy equation. The two-phase flow field, obtained numerically using a Galerkin finite element methodology, relates the local flow velocity and concentration to the wear rate via a suitable wear model. The wear models for both the sliding wear and impact wear mechanisms account for particle size dependence. Predicted wear rates from the three turbulence models are compared for a large number of cases spanning such operating parameters as rotation rate, solids concentration, flow rate, and particle size distribution. The root-mean-square error between the FE-generated data and the correlation of maximum wear rate with the operating parameters is found to be less than 2.5% for all three models.
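For reference, the relative root-mean-square error used to compare the FE-generated wear rates with a fitted correlation can be computed as in this short sketch; the data arrays are placeholders, not values from the paper.

```python
# Hedged sketch: relative RMS error between FE data and a fitted correlation.
import numpy as np

fe_wear = np.array([1.02, 0.87, 1.31])     # placeholder FE-generated wear rates
corr_wear = np.array([1.00, 0.89, 1.28])   # placeholder correlation predictions

rms_error = np.sqrt(np.mean(((fe_wear - corr_wear) / fe_wear) ** 2))
print(f"relative RMS error: {rms_error:.2%}")  # paper reports < 2.5%
```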

Ensuring Data Security and Consistency in FTIMA - A Fault Tolerant Infrastructure for Mobile Agents

Transaction management is one of the most crucial requirements for enterprise applications, which often need concurrent access to distributed data shared among multiple applications/nodes. Transactions guarantee the consistency of data records when multiple users or processes perform concurrent operations. The existing Fault Tolerant Infrastructure for Mobile Agents (FTIMA) provides fault-tolerant behavior in distributed transactions and uses a multi-agent system for distributed transaction processing. In the existing FTIMA architecture, data flowing through the network may contain personal, private or confidential information, and in banking transactions a minor change to a transaction can cause a great loss to the user. In this paper we modify the FTIMA architecture to ensure that the user request reaches the destination server securely and without any alteration. We use Triple DES for encryption/decryption and the MD5 algorithm to verify message integrity.
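As a hedged illustration of the confidentiality-plus-integrity scheme named above (Triple DES with an MD5 digest), the sketch below uses the pycryptodome library. Key handling and message framing are simplified assumptions rather than the paper's protocol, and both 3DES and MD5 are legacy primitives shown here only to mirror the abstract.

```python
# Hedged sketch: Triple DES encryption with an MD5 digest for integrity,
# using pycryptodome. Key handling and framing are simplified assumptions;
# 3DES and MD5 are legacy primitives, used here only to mirror the paper.
import hashlib
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

key = DES3.adjust_key_parity(get_random_bytes(24))  # 3-key Triple DES

def protect(message: bytes) -> bytes:
    digest = hashlib.md5(message).digest()          # 16-byte integrity tag
    cipher = DES3.new(key, DES3.MODE_CBC)
    return cipher.iv + cipher.encrypt(pad(digest + message, DES3.block_size))

def verify(blob: bytes) -> bytes:
    iv, body = blob[:8], blob[8:]                   # DES3 block size is 8 bytes
    plain = unpad(DES3.new(key, DES3.MODE_CBC, iv).decrypt(body), DES3.block_size)
    digest, message = plain[:16], plain[16:]
    if hashlib.md5(message).digest() != digest:
        raise ValueError("message was modified in transit")
    return message

assert verify(protect(b"transfer 100 to account 42")) == b"transfer 100 to account 42"
```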

An Approach for Data Analysis, Evaluation and Correction: A Case Study from Man-Made River Project in Libya

The world's largest Pre-stressed Concrete Cylinder Pipe (PCCP) water supply project suffered a series of pipe failures between 1999 and 2001. This led the Man-Made River Authority (MMRA), the authority in charge of the implementation and operation of the project, to set up a rehabilitation plan for the conveyance system while maintaining the uninterrupted flow of water to consumers. At the same time, MMRA recognized the need for a long-term management tool that would facilitate repair and maintenance decisions and enable taking appropriate preventive measures through continuous monitoring and estimation of the remaining life of each pipe. This management tool, known as the Pipe Risk Management System (PRMS), is now in operation at MMRA. Both the rehabilitation plan and the PRMS require the availability of complete and accurate pipe construction and manufacturing data. This paper describes a systematic approach to the collection, analysis, evaluation and correction of the construction and manufacturing data files of Phase I pipes, which form the platform for the PRMS database and any other related decision support system.

Bandwidth Estimation Algorithms for the Dynamic Adaptation of Voice Codec

In recent years multimedia traffic, and VoIP services in particular, have grown dramatically. We present a new algorithm to control resource utilization and to optimize voice codec selection during SIP call setup, based on the traffic conditions estimated on the network path. The most suitable methodologies and tools for real-time evaluation of the available bandwidth on a network path have been integrated with our proposed algorithm, which selects the best codec for a VoIP call as a function of the instantaneous available bandwidth on the path. The algorithm does not require any explicit feedback from the network, which makes it easily deployable over the Internet. We have also performed intensive tests on real network scenarios with a software prototype, verifying the algorithm's efficiency with different network topologies and traffic patterns between two SIP PBXs. The promising results obtained during the experimental validation of the algorithm are now the basis for an extension towards a larger set of multimedia services and for the integration of our methodology with existing PBX appliances.
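The core selection step can be sketched as follows: pick the highest-quality codec whose per-call bandwidth demand fits the measured available bandwidth. This is a hedged illustration, not the paper's algorithm; the codec list, the approximate per-call bandwidth figures (payload plus RTP/UDP/IP overhead), and the safety margin are assumptions.

```python
# Hedged sketch: pick the best voice codec that fits the measured
# available bandwidth. Per-call bandwidth figures (kbps, including
# RTP/UDP/IP overhead) and the quality ordering are assumptions.
CODECS = [                 # (name, approx. bandwidth kbps), best quality first
    ("G.711", 87.2),
    ("G.726-32", 55.2),
    ("G.729", 31.2),
    ("G.723.1", 21.9),     # lowest-bandwidth fallback
]

def select_codec(available_kbps: float, safety_margin: float = 0.9) -> str:
    """Return the highest-quality codec whose demand fits the path."""
    budget = available_kbps * safety_margin
    for name, demand in CODECS:
        if demand <= budget:
            return name
    return CODECS[-1][0]   # degrade gracefully to the leanest codec

print(select_codec(80.0))  # -> "G.726-32" under these assumptions
```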

Tri-Axis Receiver for Wireless Micro-Power Transmission

An innovative tri-axis micro-power receiver is proposed. It consists of two sets of 3-D micro-solenoids and one set of planar micro-coils in which an iron core is embedded. The three sets of micro-coils are designed to be orthogonal to each other; therefore, no matter which direction the flux arrives along, the magnetic energy can be harvested and transformed into electric power. Not only is the dead space of power reception largely reduced, but the conversion efficiency from electromagnetic energy to electric power is also raised. Preliminary simulations with the commercial software Ansoft Maxwell verify that the proposed micro-power receiver can efficiently pick up the energy transmitted by a magnetic power source. As for the fabrication process, an isotropic etching technique is employed to micro-machine the inverse-trapezoid fillister so that the copper wire can be successfully electroplated; the adhesion between the micro-coils and the fillister is thereby much enhanced.

Simulation of Sample Paths of Non-Gaussian Stationary Random Fields

Mathematical justifications are given for a simulation technique of multivariate non-Gaussian random processes and fields based on Rosenblatt's transformation of Gaussian processes. Several types of convergence are established for the approximating sequence. Moreover, an original numerical method is proposed to solve the functional equation yielding the underlying Gaussian process autocorrelation function.
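The translation idea can be sketched in a few lines: simulate a Gaussian process, push it through the standard normal CDF, and apply the target marginal's inverse CDF. This is a hedged illustration only; the paper's contribution includes solving the functional equation that matches the Gaussian autocorrelation to the target one, which this sketch sidesteps by simply fixing an AR(1) correlation. The target marginal, correlation parameter, and grid size are assumptions.

```python
# Hedged sketch: simulate a non-Gaussian stationary process by translating
# a Gaussian one through its marginal CDF (in the spirit of Rosenblatt's
# transformation). Target marginal, correlation, and grid are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, rho = 512, 0.95                      # grid size, AR(1) correlation
# Underlying stationary Gaussian process (AR(1) for simplicity).
g = np.empty(n)
g[0] = rng.standard_normal()
for t in range(1, n):
    g[t] = rho * g[t - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()

# Translation: u = Phi(g) is uniform; F^{-1}(u) has the target marginal.
u = stats.norm.cdf(g)
x = stats.lognorm.ppf(u, s=0.8)         # target: lognormal marginal (assumed)
print(x.mean(), x.std())                # non-Gaussian sample path statistics
```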

Effect of Twelve Weeks of Brisk Walking on Blood Pressure, Body Mass Index, and Anthropometric Circumference of Obese Males

Introduction: Obesity is a major global health risk. It is one of the chief public health concerns, given recent increasing trends in obesity-related diseases such as Type 2 diabetes (Kazuya, 1994) and hyperlipidemia (Sakata, 1990), which are more prevalent in Japanese adults with body mass index (BMI) values ≥ 25 kg/m2 (Japanese Ministry of Health and Welfare, 1997). The purpose of the study was to assess the effect of twelve weeks of brisk walking on the blood pressure, body mass index, and anthropometric measurements of obese males. Method: Thirty obese males (BMI above 30), aged 18 to 22 years, were selected from King Fahd University of Petroleum & Minerals, Saudi Arabia. Each subject's height (cm) was measured using a stadiometer and body mass (kg) was measured with an electronic weighing machine; BMI was subsequently calculated (kg/m2). Blood pressure was measured with a standardized sphygmomanometer in mm Hg. All measurements were taken twice before and twice after the experimental period. The pre- and post-experiment anthropometric measurements of waist and hip circumference were taken with a steel tape in cm. The subjects followed a walking schedule of two sessions per week for 12 weeks; the 45-minute sessions of brisk walking were undertaken at an average intensity of 65% to 85% of maximum heart rate (HRmax, calculated as 220 - age). Results & Discussion: Statistical findings revealed significant changes from pre-test to post-test in both systolic and diastolic blood pressure in the walking group. Results also showed a significant decrease in body mass index and in the anthropometric measurements (waist and hip circumference). Conclusion: It was concluded that twelve weeks of brisk walking is beneficial for lowering the blood pressure, body mass index, and anthropometric circumferences of obese males.
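The two formulas the study design relies on, BMI = weight/height² and the 65%-85% zone of HRmax = 220 - age, are made concrete in this small sketch; the sample height, weight, and age values are illustrative, not subject data.

```python
# Hedged sketch: BMI and target heart-rate zone used in the study design.
# The sample height/weight/age values are illustrative, not subject data.
def bmi(weight_kg: float, height_cm: float) -> float:
    return weight_kg / (height_cm / 100) ** 2          # kg/m^2

def brisk_walk_hr_zone(age: int, lo=0.65, hi=0.85) -> tuple:
    hr_max = 220 - age                                 # study's HRmax estimate
    return round(lo * hr_max), round(hi * hr_max)      # 65%-85% of HRmax

print(bmi(95.0, 172.0))        # ~32.1 -> meets the BMI > 30 inclusion criterion
print(brisk_walk_hr_zone(20))  # (130, 170) beats/min for a 20-year-old
```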

Detecting and Locating Wormhole Attacks in Wireless Sensor Networks Using Beacon Nodes

This paper focuses on wormhole attack detection in wireless sensor networks. The wormhole attack is particularly challenging to deal with since the adversary does not need to compromise any nodes and can use laptops or other wireless devices to send packets over a low-latency channel. This paper introduces an easy and effective method to detect and locate wormholes: since beacon nodes are assumed to know their coordinates, the straight-line distance between each pair of them can be calculated and then compared with the corresponding hop distance, which in this paper equals hop count × node transmission range R. A dramatic difference between the two may emerge because of an existing wormhole, and our detection mechanism is based on this discrepancy. The approximate location of the wormhole can also be derived in further steps based on this information. To the best of our knowledge, our method is much simpler than other wormhole detection schemes that also use beacon nodes, and compared with those that place special requirements on every node (e.g., GPS receivers, tightly synchronized clocks, or directional antennas), ours is more economical. Simulation results show that the algorithm is successful in detecting and locating wormholes when the density of beacon nodes reaches 0.008 per m2.
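The core check stated above compares the straight-line distance between two beacons with hop count × R; the sketch below implements that comparison. The coordinates, the value of R, and the detection threshold are illustrative assumptions, not the paper's simulation parameters.

```python
# Hedged sketch of the paper's core check: compare the straight-line
# distance between two beacon nodes with the hop distance
# (hop count x transmission range R). Coordinates, R, and the
# detection threshold are illustrative assumptions.
import math

R = 30.0  # node transmission range in meters (assumed)

def euclidean(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def wormhole_suspected(beacon_a, beacon_b, hop_count, slack=1.5):
    """Flag a pair if hops*R badly understates the true distance:
    a wormhole tunnel makes far-apart nodes look few hops apart."""
    hop_distance = hop_count * R          # upper bound on the real distance
    return euclidean(beacon_a, beacon_b) > slack * hop_distance

# Two beacons 200 m apart that appear only 2 hops away are suspicious.
print(wormhole_suspected((0, 0), (200, 0), hop_count=2))  # True
```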

Ultrasonic Echo Image Adaptive Watermarking Using the Just-Noticeable Difference Estimation

Many image watermarking methods that exploit properties of the human visual system (HVS) have been proposed in the literature. The visual threshold component is usually related to either the spatial contrast sensitivity function (CSF) or visual masking. Regarding contrast masking in particular, most methods do not consider the effect near edge regions, even though the HVS is sensitive to what happens in edge areas. This paper proposes ultrasound image watermarking using a visual threshold corresponding to the HVS, in which the coefficients in a DCT block are classified according to texture, edge, and plain areas. This classification is useful not only for imperceptibility when the watermark is inserted into an image but also for achieving robust watermark detection. A comparison of the proposed method with other methods has been carried out, showing that the proposed method is robust to blockwise memoryless manipulations and also robust against noise addition.
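A common way to classify a DCT block as plain, edge, or texture is by how its AC energy is distributed across frequency bands; the sketch below shows that heuristic. It is a hedged illustration, not the paper's exact classifier: the thresholds, the band split, and the block size are assumptions.

```python
# Hedged sketch: classify an 8x8 image block as plain, edge, or texture
# from its DCT coefficient energy, a common heuristic in HVS-based
# watermarking. Thresholds and band definitions are assumptions.
import numpy as np
from scipy.fft import dctn

def classify_block(block8x8, t_plain=50.0, t_edge=2.0):
    c = dctn(block8x8.astype(float), norm="ortho")
    c[0, 0] = 0.0                       # ignore the DC term (mean brightness)
    total = np.sum(c**2)
    if total < t_plain:
        return "plain"                  # almost no AC energy
    low = np.sum(c[:4, :4] ** 2)        # low-frequency band (assumed split)
    high = total - low
    # Edges concentrate energy at low frequencies; texture spreads it out.
    return "edge" if low > t_edge * high else "texture"

flat = np.full((8, 8), 128.0)
step = np.hstack([np.zeros((8, 4)), np.full((8, 4), 255.0)])
print(classify_block(flat), classify_block(step))   # plain edge
```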

Characterization of Lactose Consumption during Biogas Production from Acid Whey by FT-IR Spectroscopy

The consumption of lactose during the anaerobic fermentation of acid cheese whey under fed-batch conditions was studied. During the 100-hour fermentation, the biogas production (CO2 and CH4) was analyzed online. Alongside the standard analyses, FT-IR spectroscopy was used to follow the consumption of lactose by the bacteria. The absorption bands at 990, 894 and 787 cm-1 in the 2nd derivative spectra were shown to be characteristic of lactose and were used to follow the lactose conversion. It was shown that the acid cheese whey lactose was converted by the bacteria within the first 7 hours. In the spectra of the 17, 18 and 95 hour fermentation samples, lactose was not identified, and these results correlated with the HPLC data.
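Second-derivative band tracking of the kind described can be sketched with a Savitzky-Golay derivative filter; this is a hedged illustration under stated assumptions, not the authors' processing pipeline. The synthetic spectrum, filter settings, and band window are placeholders; only the 990, 894 and 787 cm-1 marker bands come from the abstract.

```python
# Hedged sketch: track a lactose marker band in second-derivative FT-IR
# spectra using a Savitzky-Golay filter. The synthetic spectrum and band
# window are placeholders; the paper's bands are 990, 894, 787 cm-1.
import numpy as np
from scipy.signal import savgol_filter

wn = np.linspace(1100, 700, 801)                 # wavenumber axis, cm-1
spectrum = np.exp(-((wn - 990) / 8.0) ** 2)      # synthetic band at 990 cm-1

d2 = savgol_filter(spectrum, window_length=15, polyorder=3, deriv=2)

band = (wn > 980) & (wn < 1000)                  # window around 990 cm-1
intensity = -d2[band].min()                      # band depth in 2nd derivative
print(f"990 cm-1 second-derivative intensity: {intensity:.4f}")
```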

Directional Drilling Optimization by Non-Rotating Stabilizer

The Non-Rotating Adjustable Stabilizer / Directional Solution (NAS/DS) imitates a mechanical process or object in a directional drilling operation, responding mathematically and graphically to data and decisions so that the best conditions can be chosen relative to the previous mode. The NAS/DS Auto Guide rotary steerable tool is undergoing final field trials. The point-the-bit tool can use any bit, work at any rotating speed, work with any MWD/LWD system, and introduces no pressure drop through the tool. It is a fully closed-loop system that automatically maintains a specified curvature rate. The Non-Rotating Adjustable Stabilizer (NAS) controls the curvature rate by exact positioning and can be run with the optimum bit, using the most effective weight on bit (WOB) and rotary speed (RPM) and applying all of the available hydraulic energy to the bit. The directional simulator, called the Directional Solution (DS), allows the size of the curvature-rate performance errors of the NAS tool and the magnitude of the random errors in the survey measurements to be specified. The combination of these technologies (NAS/DS) will provide smoother boreholes, reduced drilling time, reduced drilling cost and greatly improved targeting precision. The simulator controls the curvature rate by precisely adjusting the radial extension of the stabilizer blades on a near-bit non-rotating stabilizer, and the control process corrects for the secondary effects caused by formation characteristics, bit and tool wear, and manufacturing tolerances.

Investigation of the Possibility of Preparing a Supervised Classification Map of Gully Erosion by RS and GIS

This study investigates the possibility of producing a gully erosion map by supervised classification of satellite images (ETM+) in two land types, mountainous and plain. The study areas were part of the Varamin plain, Tehran province, and the Roodbar subbasin, Guilan province, as the plain and mountain land types, respectively. The positions of 652 and 124 ground control points were recorded by GPS in the mountain and plain land types, respectively. Soil gully erosion, land uses and plant covers were investigated at these points. Based on the ground control points and auxiliary points, training points for gully erosion and other surface features were introduced into the software (ILWIS 3.3 Academic). The supervised classification map of gully erosion was prepared by the maximum likelihood method, and the overall accuracy of this map was computed. Results showed that supervised classification of gully erosion is not feasible, although more studies are needed before generalizing this result to other mountainous regions. Moreover, classification accuracy decreases as land uses and other surface features increase in the plain physiography.
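The maximum likelihood classifier named above fits a Gaussian per class from training pixels and labels each pixel by the class with the highest likelihood; the sketch below shows that procedure in miniature. The class names, band values, and training data are placeholders, not the study's ETM+ data or ILWIS internals.

```python
# Hedged sketch of maximum likelihood supervised classification: fit a
# Gaussian per class from training pixels, then label each pixel by the
# class with the highest log-likelihood. All data are placeholders.
import numpy as np
from scipy.stats import multivariate_normal

def fit_ml(train):                    # train: {class_name: (n, bands) array}
    return {c: (X.mean(axis=0), np.cov(X, rowvar=False))
            for c, X in train.items()}

def classify(pixels, params):         # pixels: (m, bands) array
    ll = np.column_stack([
        multivariate_normal.logpdf(pixels, mean=mu, cov=cov)
        for mu, cov in params.values()
    ])
    names = list(params)
    return [names[i] for i in ll.argmax(axis=1)]

rng = np.random.default_rng(1)
train = {"gully": rng.normal(80, 5, (50, 3)),
         "bare soil": rng.normal(120, 5, (50, 3))}
print(classify(np.array([[82.0, 79.0, 81.0]]), fit_ml(train)))  # ['gully']
```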

Application of Life Data Analysis for the Reliability Assessment of Numerical Overcurrent Relays

Protective relays are components of a power system protection scheme that provide the decision-making element for correct protection and fault-clearing operations. Failure of these protection devices may reduce the integrity and reliability of the power system protection, which in turn impacts the overall performance of the power system. Hence it is imperative for power utilities to assess the reliability of protective relays to assure that they will perform their intended function without failure. This paper discusses the application of a statistical reliability method called Life Data Analysis in the Transmission Division of Tenaga Nasional Berhad (TNB), a government-linked power utility company in Malaysia, to assess and evaluate the reliability of numerical overcurrent protective relays from two different manufacturers.
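Life data analysis typically means fitting a lifetime distribution (commonly Weibull) to observed times-to-failure and reading off reliability metrics; the hedged sketch below shows that workflow. The failure times are illustrative placeholders, not TNB field data, and the 2-parameter Weibull choice is an assumption.

```python
# Hedged sketch of life data analysis: fit a Weibull distribution to
# relay times-to-failure and report the shape parameter and B10 life.
# The failure times below are illustrative, not TNB field data.
import numpy as np
from scipy import stats

failure_years = np.array([3.1, 4.7, 5.2, 6.8, 7.4, 8.9, 9.5, 11.2])

# Fix the location parameter at 0 (2-parameter Weibull).
shape, _loc, scale = stats.weibull_min.fit(failure_years, floc=0)

b10 = scale * (-np.log(0.90)) ** (1 / shape)   # time by which 10% have failed
print(f"beta={shape:.2f}, eta={scale:.2f} years, B10={b10:.2f} years")
```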

Low Complexity Regular LDPC Codes for Magnetic Storage Devices

LDPC codes could be used in magnetic storage devices because of their better decoding performance compared to other error correction codes. However, their hardware implementation results in large and complex decoders, which is one of the main obstacles to incorporating such decoders in magnetic storage devices. We construct small, high-girth, column-weight-2 codes from cage graphs. Though these codes have lower performance than higher column-weight codes, they are easier to implement, and this ease of implementation makes them more suitable for applications such as magnetic recording. Cages are the smallest known regular graphs of a given degree and girth, which gives us the smallest known column-weight-2 codes for a given size, girth and rate.
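A column-weight-2 parity-check matrix can be obtained as the incidence matrix of a graph: each edge becomes a code bit whose column has exactly two ones, at its endpoint checks. The hedged sketch below uses the Petersen graph, the (3,5)-cage, as one small concrete instance; the paper's code family may use other cages.

```python
# Hedged sketch: a column-weight-2 LDPC parity-check matrix as the
# incidence matrix of a cage graph. The Petersen graph (the (3,5)-cage)
# is used as a small concrete example.
import numpy as np

# Petersen graph: outer 5-cycle on 0-4, inner pentagram on 5-9, plus spokes.
edges = [(i, (i + 1) % 5) for i in range(5)]               # outer cycle
edges += [(5 + i, 5 + (i + 2) % 5) for i in range(5)]      # inner pentagram
edges += [(i, i + 5) for i in range(5)]                    # spokes

H = np.zeros((10, len(edges)), dtype=int)   # rows: checks, cols: bits
for col, (u, v) in enumerate(edges):
    H[u, col] = H[v, col] = 1               # each column has weight 2

assert (H.sum(axis=0) == 2).all()           # column weight 2
assert (H.sum(axis=1) == 3).all()           # row weight 3 (3-regular cage)
print(H.shape)  # (10, 15): graph girth 5 gives a girth-10 Tanner graph
```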

The Appropriate Time Required for Newborn Camel Calves to Obtain the Optimal Amount of Colostrum Immunoglobulin (IgG) in Relation to Levels of Cortisol and Thyroxin

A major challenge in camel productivity is the high mortality rate of camel calves at an early stage due to the lack of colostrum. This study investigates the time required for calves to obtain the optimum amount of immunoglobulin (IgG). Eleven pregnant female camels (Camelus dromedarius), varying in age and gestation, were selected randomly. After delivery, 7 calves were obtained and used for this investigation. Colostrum samples were collected from the mothers immediately after parturition. Blood samples were obtained from the calves as follows: day 0 (before suckling); 24, 48, 72, 96, 120 and 144 hours; and the 2nd, 3rd, and 4th weeks post suckling. Blood serum and colostrum whey were separated and used to determine IgG concentration, total protein, and the concentrations of cortisol and thyroxin. The results showed high levels of IgG in camel colostrum (328.8 ± 4.5 mg/ml). The IgG concentration in the serum of calves was highest within the first 24 h after suckling (140.75 mg/ml) and then declined gradually, reaching a lower level at 144 h (41.97 mg/ml). The average turnover rate (t1/2) of serum IgG across all cases was 3.22 days, ranging from 2.56 days for calves with above-average IgG values to 7.7 days for those with below-average values. In spite of very high levels of thyroxin in the sera of the newborns, the results showed no correlation of cortisol or thyroxin with IgG levels.
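As a worked check of the turnover figure, assuming first-order (exponential) decay, a half-life can be estimated from the two serum values quoted above; this two-point method is an illustration and not necessarily the authors' exact estimator, which likely used more time points.

```python
# Hedged sketch: estimating serum IgG half-life from two time points,
# assuming first-order (exponential) decay. The 24 h and 144 h serum
# values come from the abstract; the two-point method is illustrative.
import math

c1, t1 = 140.75, 24.0     # mg/ml at 24 h post suckling
c2, t2 = 41.97, 144.0     # mg/ml at 144 h post suckling

k = math.log(c1 / c2) / (t2 - t1)           # decay constant, 1/h
half_life_days = math.log(2) / k / 24.0
print(f"t1/2 ~ {half_life_days:.2f} days")  # ~2.9 days (abstract average: 3.22)
```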

Prediction of Basic Wind Speed for Ayeyarwady

The paper presents a preliminary study on the modeling and estimation of basic wind speed (extreme wind gusts) for the consideration of vulnerability and the design of buildings in the Ayeyarwady Region. The establishment of appropriate design wind speeds is a critical step towards the calculation of design wind loads for structures. The extreme value analysis in this prediction work is based on the anemometer data (1970-2009) maintained by the Department of Meteorology and Hydrology at Pathein. Statistical and probabilistic approaches are used to derive formulas for estimating 3-second gusts from the recorded data (10-minute sustained mean wind speeds).
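A typical extreme value workflow of this kind fits a distribution to annual maxima, takes a return-period quantile, and converts averaging periods with a gust factor; the hedged sketch below illustrates it. The sample data, the Gumbel choice, and the 1.42 Durst-type gust factor are assumptions, not the paper's data or fitted model.

```python
# Hedged sketch of the extreme value workflow: fit a Gumbel distribution
# to annual maximum 10-minute mean wind speeds, take a return-period
# quantile, and convert to a 3-second gust with a gust factor. The
# sample data and the 1.42 gust factor (Durst-type) are assumptions.
import numpy as np
from scipy import stats

annual_max_10min = np.array([18.2, 21.5, 19.8, 24.1, 20.3,
                             22.7, 17.9, 25.4, 21.1, 23.0])  # m/s, placeholder

loc, scale = stats.gumbel_r.fit(annual_max_10min)
v50_10min = stats.gumbel_r.ppf(1 - 1 / 50, loc, scale)  # 50-year return level

GUST_FACTOR = 1.42                       # 3-s gust / 10-min mean (assumed)
print(f"50-year 3-s gust ~ {GUST_FACTOR * v50_10min:.1f} m/s")
```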

The Index of Sustainable Functionality: An Application for Measuring Sustainability

The index of sustainable functionality (ISF) is an adaptive, multi-criteria technique for measuring sustainability; it is a concept that can be transposed to many regions throughout the world. An ISF application to the Southern Regional Organisation of Councils (SouthROC) in South East Queensland (SEQ), the fastest growing region in Australia, indicated over a 25-year period an increase in the level of functionality of over 10%, from 58.0% to 68.3%. The ISF of SouthROC utilised methodologies derived from an expert-panel-based approach. The overall results indicate an intermediate level of functionality, reflecting related concerns about economic progress and a lack of social awareness. Within the region, a solid basis for future testing, by way of measured changes and developed trends, can be established. In this regard, as a management tool, the ISF record offers support for regional sustainability practice and decision making alike. This research analyses sustainability adaptively, an approach that is lacking throughout much of the academic literature and associated experimentation. This gap in the knowledge base is where future sustainability research can grow and prove useful in rapidly growing regions. It is the intention of this research to help further develop the notion of index-based quantitative sustainability.