Abstraction Hierarchies for Engineering Design

Complex engineering design problems involve numerous factors of varying criticality. Considering fundamental design features and minor details alike results in an extensive waste of time and effort. Design parameters should instead be introduced gradually, according to their significance within the problem context. This motivates representing design parameters at multiple levels of an abstraction hierarchy. However, the development of abstraction hierarchies is not well understood. Our research proposes a novel hierarchical abstraction methodology for planning effective engineering designs and processes. It provides a theoretically sound foundation to represent, abstract, and stratify engineering design parameters and tasks according to causality and criticality. The methodology creates abstraction hierarchies in a recursive, bottom-up manner that guarantees no backtracking across any of the abstraction levels. It consists of three main phases: representation, abstraction, and layering into multiple hierarchical levels. The effectiveness of the developed methodology is demonstrated on a design problem.
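
The abstract does not give the algorithmic details of the layering phase; purely as an illustration, the following minimal sketch (parameter names, criticality scores, and thresholds are all hypothetical assumptions, not the authors' method) shows one way design parameters could be stratified into abstraction levels by criticality:

```python
# Hypothetical sketch: stratify design parameters into abstraction levels
# by a criticality score. The scoring and thresholds are illustrative
# assumptions, not the paper's actual methodology.

def layer_parameters(params, thresholds):
    """params: dict name -> criticality in [0, 1];
    thresholds: ascending cut-offs separating abstraction levels."""
    levels = [[] for _ in range(len(thresholds) + 1)]
    for name, crit in params.items():
        level = sum(crit >= t for t in thresholds)  # higher criticality -> higher level
        levels[level].append(name)
    return levels  # levels[0] = fine details, levels[-1] = fundamental features

params = {"wall_thickness": 0.9, "bolt_finish": 0.1, "load_capacity": 0.95,
          "paint_colour": 0.05, "joint_type": 0.6}
for i, lvl in enumerate(layer_parameters(params, [0.3, 0.8])):
    print(f"level {i}: {lvl}")
```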

Optical Coherence Tomography Combined with the Confocal Microscopy Method and Fluorescence for Class V Cavity Investigations

The purpose of this study is to present a non-invasive method for evaluating marginal adaptation in Class V composite restorations. Standardized Class V cavities, prepared in extracted human teeth, were filled with Premise (Kerr) composite. The specimens were thermocycled. The interfaces were examined by optical coherence tomography (OCT) combined with confocal microscopy and fluorescence. The optical configuration uses two single-mode directional couplers with a superluminescent diode as the source at 1300 nm. The scanning procedure is similar to that used in any confocal microscope, where the fast scanning is en-face (at the line rate) and the depth scanning is much slower (at the frame rate). Gaps at the interfaces as well as inside the composite resin material were identified. OCT has numerous advantages that justify its use both in vivo and in vitro in comparison with conventional techniques.

Corporate Credit Rating using Multiclass Classification Models with Order Information

Corporate credit rating prediction using statistical and artificial intelligence (AI) techniques has been one of the attractive research topics in the literature. In recent years, multiclass classification models such as the artificial neural network (ANN) and the multiclass support vector machine (MSVM) have become very appealing machine learning approaches due to their good performance. However, most of them focus on classifying samples into nominal categories, so the unique characteristic of credit ratings, their ordinality, has seldom been considered. This study proposes new types of ANN and MSVM classifiers, named OMANN and OMSVM respectively, which extend binary ANN and SVM classifiers by applying the ordinal pairwise partitioning (OPP) strategy. These models handle ordinal multiple classes efficiently and effectively. To validate their usefulness, we applied the two models to a real-world bond rating case and compared the results to those of conventional approaches. The experimental results show that our proposed models improve classification accuracy over typical multiclass classification techniques while requiring fewer computational resources.
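
The OPP strategy decomposes an ordinal problem into a sequence of binary ones. As a minimal sketch (using scikit-learn's SVC as the base binary classifier on synthetic data; the authors' exact OPP variant may differ), one common threshold-based decomposition looks like this:

```python
# Sketch of ordinal pairwise partitioning with binary SVMs: train one
# binary classifier per threshold "rating > k" and combine the estimated
# probabilities into ordered class predictions. Data is synthetic.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = np.digitize(X[:, 0] + 0.5 * X[:, 1], [-1.0, 0.0, 1.0])  # ordinal classes 0..3

K = int(y.max())
models = [SVC(probability=True).fit(X, (y > k).astype(int)) for k in range(K)]

def predict_ordinal(X_new):
    # P(y > k) for each threshold; class probabilities follow by differencing.
    p_gt = np.column_stack([m.predict_proba(X_new)[:, 1] for m in models])
    p = np.column_stack([1 - p_gt[:, 0],
                         *[p_gt[:, k - 1] - p_gt[:, k] for k in range(1, K)],
                         p_gt[:, K - 1]])
    return p.argmax(axis=1)

print((predict_ordinal(X) == y).mean())  # training accuracy of the sketch
```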

Extended Dynamic Source Routing Protocol for Non-Cooperating Nodes in Mobile Ad Hoc Networks

In this paper, a new approach based on the extent of friendship between nodes is proposed, which induces the nodes to cooperate in an ad hoc environment. The extended DSR protocol is tested under different scenarios by varying the number of malicious nodes and the node moving speed; it is also tested by varying the number of nodes used in the simulation. The results indicate that the throughput achieved by the extended DSR is greater than that of the standard DSR, and that the percentage of malicious drops over total drops is lower for the extended DSR than for the standard DSR.
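
The abstract does not specify the friendship metric, so the following is only a plausible sketch of the idea (the update rule, threshold, and class names are invented for illustration): each node scores its neighbours from observed forwarding behaviour and relays only through sufficiently friendly ones.

```python
# Illustrative sketch of a friendship-score scheme: reward observed
# forwarding, punish observed dropping, and route via friendly neighbours.
# All constants and names are assumptions, not the paper's protocol.

FORWARD_THRESHOLD = 0.4  # assumed cut-off for co-operation

class Node:
    def __init__(self, name):
        self.name = name
        self.friendship = {}  # neighbour name -> score in [0, 1]

    def observe(self, neighbour, forwarded):
        s = self.friendship.get(neighbour, 0.5)  # strangers start neutral
        # Exponentially weighted update of the friendship score.
        self.friendship[neighbour] = 0.9 * s + 0.1 * (1.0 if forwarded else 0.0)

    def willing_relays(self):
        return [n for n, s in self.friendship.items() if s >= FORWARD_THRESHOLD]

a = Node("A")
for outcome in [True, True, False, True]:
    a.observe("B", outcome)
a.observe("C", False)
print(a.friendship, a.willing_relays())
```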

Effect of the Step Size in the Response Surface Methodology using Nonlinear Test Functions

The response surface methodology (RSM) is a collection of mathematical and statistical techniques useful for modeling and analyzing problems in which a dependent variable is influenced by several independent variables, with the aim of determining the conditions under which these variables should operate to optimize a production process. RSM estimates a first-order regression model and sets the search direction using the method of maximum/minimum slope up/down (MMS U/D). However, this method selects the step size intuitively, which can affect the efficiency of the RSM. This paper assesses how the step size affects the efficiency of the methodology. The numerical examples are carried out through Monte Carlo experiments, evaluating three response variables: the efficiency of the gain function, the distance to the optimum, and the number of iterations. The simulation experiments show that the efficiency of the gain function and the distance to the optimum are not affected by the step size, while the number of iterations is affected by both the step size and the type of test function used.
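
To make the role of the step size concrete, here is a minimal sketch of the first-order RSM search loop: fit a local first-order model from a small factorial design and move a distance h along the estimated slope. The test function, design, and step values are illustrative assumptions, not the paper's experimental setup.

```python
# Sketch of RSM with a first-order model and steepest-ascent steps of size h.
import numpy as np

def f(x):  # assumed nonlinear test function (maximum at the origin)
    return -(x[0] ** 2 + 2 * x[1] ** 2)

def steepest_ascent(x0, h, iters=20, delta=0.1):
    x = np.asarray(x0, float)
    # 2^2 factorial design offsets used to fit the local first-order model
    D = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]]) * delta
    for _ in range(iters):
        y = np.array([f(x + d) for d in D])
        A = np.column_stack([np.ones(len(D)), D])
        b = np.linalg.lstsq(A, y, rcond=None)[0]  # coefficients b0, b1, b2
        grad = b[1:]
        if np.linalg.norm(grad) < 1e-8:
            break
        x = x + h * grad / np.linalg.norm(grad)  # step of size h along the slope
    return x

for h in (0.05, 0.2, 0.8):  # the step size under study
    print(h, steepest_ascent([2.0, 2.0], h))
```

Small steps converge slowly; large steps overshoot and oscillate around the optimum, which is the trade-off the Monte Carlo experiments quantify.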

Performance Improvements of DSP Applications on a Generic Reconfigurable Platform

Speedups from mapping four real-life DSP applications onto an embedded system-on-chip that couples coarse-grained reconfigurable logic with an instruction-set processor are presented. The reconfigurable logic is realized by a two-dimensional array of Processing Elements. A design flow for improving application performance is proposed. Critical software parts, called kernels, are accelerated on the Coarse-Grained Reconfigurable Array. The kernels are detected by profiling the source code. For mapping the detected kernels onto the reconfigurable logic, a priority-based mapping algorithm has been developed. Two 4x4 array architectures, which differ in their interconnection structure among the Processing Elements, are considered. Experiments on eight different instances of a generic system show significant overall speedups for the four applications. The performance improvements range from 1.86 to 3.67, with an average value of 2.53, compared with an all-software execution. These speedups are quite close to the maximum theoretical speedups imposed by Amdahl's law.
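
The theoretical bound referred to here is Amdahl's law: if a fraction f of the execution time is accelerated by a factor s, the overall speedup is S = 1 / ((1 - f) + f / s), capped at 1 / (1 - f) as s grows. A quick illustration (the values of f and s below are made up, not taken from the paper):

```python
# Amdahl's law: overall speedup when a fraction f of execution time is
# accelerated by a factor s. Values of f and s are illustrative only.

def amdahl(f, s):
    return 1.0 / ((1.0 - f) + f / s)

for f in (0.5, 0.7, 0.9):
    # Even with infinitely fast kernels (s -> inf), speedup is capped at 1/(1-f).
    print(f, amdahl(f, 10.0), 1.0 / (1.0 - f))
```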

Development of UiTM Robotic Prosthetic Hand

The study of human hand morphology reveals that developing an artificial hand with the capabilities of the human hand is an extremely challenging task. This paper presents the development of a robotic prosthetic hand focusing on the improvement of a tendon-driven mechanism towards a biomimetic prosthetic hand. The design of this prosthetic hand is geared towards achieving a high level of dexterity and anthropomorphism by means of a new hybrid mechanism that integrates a miniature motor-driven actuation mechanism, a Shape Memory Alloy actuated mechanism, and a passive mechanical linkage. The synergy of these actuators enables flexion-extension movement at each of the finger joints within limited size, shape, and weight constraints. Tactile sensors are integrated on the fingertips and the finger phalange areas. The prosthetic hand is developed with an exact size ratio that mimics a biological hand. Its behavior resembles the human counterpart in terms of working envelope, speed, and torque, and it thus reproduces both the key physical features and the grasping functionality of an adult hand.

Optimal Facility Layout Problem Solution Using Genetic Algorithm

The Facility Layout Problem (FLP) is one of the essential problems in several types of manufacturing and service sectors. It is an optimization problem in which the main objective is to obtain efficient locations, arrangements, and orderings of the facilities. In the literature, numerous facility layout studies have been presented that use meta-heuristic approaches to achieve optimal facility layout designs. This paper presents a genetic algorithm to solve the facility layout problem by minimizing a total cost function. The performance of the proposed approach is verified and compared using problems from the literature.
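
As a minimal sketch of the approach (the cost function, flow matrix, operators, and all GA settings below are illustrative assumptions, not the paper's formulation), a permutation-encoded GA for a single-row layout might look like this:

```python
# Sketch of a permutation-encoded genetic algorithm for a facility layout
# problem: cost = sum of flow x distance over equally spaced locations,
# with order crossover and swap mutation. All data and settings invented.
import random

random.seed(1)
N = 6
FLOW = [[0 if i == j else (i + j) % 5 + 1 for j in range(N)] for i in range(N)]

def cost(layout):  # layout[k] = facility placed at location k
    pos = {f: k for k, f in enumerate(layout)}
    return sum(FLOW[i][j] * abs(pos[i] - pos[j])
               for i in range(N) for j in range(i + 1, N))

def crossover(p1, p2):  # order crossover (OX)
    a, b = sorted(random.sample(range(N), 2))
    child = [None] * N
    child[a:b] = p1[a:b]
    fill = [f for f in p2 if f not in child]
    for k in range(N):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def mutate(layout):  # swap mutation
    a, b = random.sample(range(N), 2)
    layout[a], layout[b] = layout[b], layout[a]

pop = [random.sample(range(N), N) for _ in range(40)]
for _ in range(200):
    pop.sort(key=cost)
    parents = pop[:20]                       # elitist selection
    pop = parents + [crossover(*random.sample(parents, 2)) for _ in range(20)]
    for ind in pop[20:]:
        if random.random() < 0.3:
            mutate(ind)
best = min(pop, key=cost)
print(best, cost(best))
```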

Evaluation of Shear Strength Parameters of Loess Amended Using Common Admixtures in Gorgan, Iran

Loess soils are unsaturated soils that lose much of their volume upon saturation, undergoing sudden settlement, fracture, and structural cracking as moisture increases. Given the importance of civil projects, including dams, canals, and structures founded on this type of soil, and the problems it poses, further research and study of loess soils is required. This research studies shear strength parameters using grading, Atterberg limit, compaction, direct shear, and consolidation tests, and then examines the effect of cement and lime additives on the stability of loess soils. In the related tests, lime and cement are separately added to the soil at different mix percentages, the stabilized samples are cured for different periods, and the effect of the aforesaid additives on the shear strength parameters of the soil is studied. The results show that, with curing time, the additives greatly decrease the collapse potential, and that with an increasing percentage of cement and lime the maximum dry density decreases while the optimum moisture content increases. In addition, the liquid limit and plasticity index decrease, while the plastic limit increases. Notably, the results of the direct shear tests reveal increased soil shear strength due to increases in the cohesion parameter and the soil friction angle.

Probabilistic Modeling of Network-induced Delays in Networked Control Systems

Time-varying network-induced delays in networked control systems (NCS) are known to degrade the control system's quality of performance (QoP) and to cause stability problems. In the literature, control methods that model communication delays as probability distributions have proved superior. This paper therefore focuses on modeling network-induced delays as probability distributions. CAN and MIL-STD-1553B are extensively used to carry periodic control and monitoring data in networked control systems, yet the methods available in the literature estimate only the worst-case delays for these networks. In this paper, probabilistic network delay models for CAN and MIL-STD-1553B networks are given, together with a systematic method to estimate the model parameter values from network parameters. A method to predict the network delay in the next cycle based on the present network delay is presented. The effects of active network redundancy and of redundancy at the node level on network delay and system response time are also analyzed.
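
The two modelling ideas can be illustrated generically (the gamma distribution and the AR(1)-style predictor below are assumptions for the sketch; the paper's specific CAN and MIL-STD-1553B models are not reproduced here): fit a probability distribution to a measured delay trace, then predict the next cycle's delay from the present one.

```python
# Illustrative sketch: fit a distribution to network-induced delays and
# predict the next-cycle delay. Delay trace is synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
delays = rng.gamma(shape=4.0, scale=0.5, size=2000)  # synthetic delay trace (ms)

# Fit a gamma distribution to the observed delays.
shape, loc, scale = stats.gamma.fit(delays, floc=0.0)
print(f"fitted gamma: shape={shape:.2f}, scale={scale:.2f}")
print("P(delay > 4 ms) =", 1 - stats.gamma.cdf(4.0, shape, loc=loc, scale=scale))

# One-step prediction of the next-cycle delay from the present delay,
# via the lag-1 correlation of the trace (an AR(1)-style predictor).
mu = delays.mean()
rho = np.corrcoef(delays[:-1], delays[1:])[0, 1]
predict_next = lambda d_now: mu + rho * (d_now - mu)
print("next-cycle prediction given 5 ms now:", predict_next(5.0))
```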

Independent Component Analysis Applied to Mass Spectra of Aluminium Sulphate

Independent component analysis (ICA) is a computational method for finding underlying signals or components in multivariate statistical data. The ICA method has been successfully applied in many fields, e.g. vision research, brain imaging, geological signals, and telecommunications. In this paper, we apply the ICA method to the analysis of mass spectra of oligomeric species emerging from aluminium sulphate. Mass spectra are typically complex, because they are linear combinations of spectra from different types of oligomeric species. The results show that ICA can decompose the spectra into components that yield useful information, which is essential in developing the coagulation phases of water treatment processes.
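
The decomposition idea can be sketched with scikit-learn's FastICA: observed spectra are treated as linear mixtures of source spectra, which ICA separates. The data below are synthetic stand-ins, not aluminium sulphate spectra.

```python
# Minimal ICA sketch: three observed "spectra" are linear mixtures of two
# synthetic source spectra; FastICA recovers the sources.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
mz = np.arange(500)                        # m/z axis
peak = lambda c, w: np.exp(-0.5 * ((mz - c) / w) ** 2)
S = np.column_stack([peak(120, 4) + peak(310, 6),   # "oligomer" source 1
                     peak(200, 5) + peak(410, 3)])  # "oligomer" source 2
A = np.array([[0.7, 0.3], [0.4, 0.6], [0.2, 0.8]])  # mixing matrix
X = S @ A.T + 0.01 * rng.normal(size=(500, 3))      # 3 observed spectra

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)               # recovered source spectra
print(S_est.shape, ica.mixing_.shape)      # (500, 2), (3, 2)
```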

Block Sorting: A New Characterization and a New Heuristic

The Block Sorting problem is to sort a given permutation by moving blocks, where a block is defined as a substring of the given permutation that is also a substring of the identity permutation. Block Sorting has been proved to be NP-hard. To date, two different 2-approximation algorithms have been presented for block sorting; these remain the best known algorithms for the problem. In this work, we present a different characterization of Block Sorting in terms of a transposition cycle graph. We then suggest a heuristic, which we show to exhibit a 2-approximation performance guarantee for most permutations.
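
The block structure the problem is defined on is easy to make concrete: a block is a maximal run of consecutive increasing-by-one values. The following small sketch computes the block decomposition of a permutation (the cycle-graph characterization and the heuristic themselves are not reproduced here):

```python
# Decompose a permutation into its blocks: maximal substrings that are
# also substrings of the identity permutation.

def blocks(perm):
    out, start = [], 0
    for i in range(1, len(perm) + 1):
        if i == len(perm) or perm[i] != perm[i - 1] + 1:
            out.append(perm[start:i])
            start = i
    return out

perm = [4, 5, 6, 1, 2, 7, 8, 3]
print(blocks(perm))  # [[4, 5, 6], [1, 2], [7, 8], [3]]
```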

Periodic Storage Control Problem

Considering a reservoir with periodic states and different cost functions with penalty, its release rules can be modeled as a periodic Markov decision process (PMDP). First, we prove that the policy-iteration algorithm also works for the PMDP. Then, using the policy-iteration algorithm, we obtain the optimal policies for a special aperiodic reservoir model with two cost functions under a large penalty, and discuss the case where the penalty is small.
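
For reference, here is a minimal sketch of the policy-iteration algorithm on a small finite MDP with discounted cost. The reservoir states, release actions, transition probabilities, and costs are toy placeholders, not the paper's PMDP model.

```python
# Policy iteration on a toy finite MDP: alternate exact policy evaluation
# (a linear solve) with greedy policy improvement until the policy is stable.
import numpy as np

nS, nA, gamma = 3, 2, 0.95
rng = np.random.default_rng(4)
P = rng.dirichlet(np.ones(nS), size=(nS, nA))  # P[s, a] = next-state distribution
C = rng.uniform(0, 1, size=(nS, nA))           # immediate costs

policy = np.zeros(nS, dtype=int)
while True:
    # Policy evaluation: solve (I - gamma * P_pi) V = C_pi.
    P_pi = P[np.arange(nS), policy]
    C_pi = C[np.arange(nS), policy]
    V = np.linalg.solve(np.eye(nS) - gamma * P_pi, C_pi)
    # Policy improvement: act greedily with respect to V.
    Q = C + gamma * P @ V
    new_policy = Q.argmin(axis=1)              # minimize expected cost
    if np.array_equal(new_policy, policy):
        break
    policy = new_policy
print("optimal policy:", policy, "values:", np.round(V, 3))
```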

Comparison of Three Turbulence Models in Wear Prediction of Multi-Size Particulate Flow through Rotating Channel

The present work compares the performance of three turbulence modeling approaches (based on the two-equation k-ε model) in predicting erosive wear in multi-size dense slurry flow through a rotating channel. All three turbulence models include a rotation modification to the production term in the turbulent kinetic-energy equation. The two-phase flow field is obtained numerically using a Galerkin finite element methodology, and the local flow velocity and concentration are related to the wear rate via a suitable wear model. The wear models for both the sliding wear and impact wear mechanisms account for particle size dependence. The wear rates predicted by the three turbulence models are compared for a large number of cases spanning such operating parameters as rotation rate, solids concentration, flow rate, and particle size distribution. The root-mean-square error between the FE-generated data and the correlation between maximum wear rate and the operating parameters is found to be less than 2.5% for all three models.
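
For clarity on the error figure quoted, a relative root-mean-square deviation between FE-generated values and a fitted correlation can be computed as below (the arrays are made-up illustrations, not the paper's data):

```python
# Relative RMS error between FE-generated wear rates and the values
# returned by a fitted correlation; numbers are illustrative only.
import numpy as np

fe_wear = np.array([1.02, 2.10, 3.95, 5.03])    # FE-generated data
corr_wear = np.array([1.00, 2.15, 3.90, 5.10])  # correlation predictions
rel_rms = np.sqrt(np.mean(((fe_wear - corr_wear) / fe_wear) ** 2))
print(f"relative RMS error = {100 * rel_rms:.2f}%")
```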

Property Aggregation and Uncertainty with Links to the Management and Determination of Critical Design Features

Within the domain of Systems Engineering, the need to perform property aggregation to understand, analyze, and manage complex systems is unequivocal. This can be seen in numerous areas such as capability analysis, Mission Essential Competencies (MEC), and Critical Design Features (CDF). Furthermore, considering uncertainty propagation as well as the sensitivity of related properties within such analysis is equally important when determining a set of critical properties within such a system. This paper describes this property breakdown in a number of domains within Systems Engineering and, within the area of CDFs, emphasizes the importance of uncertainty analysis. As part of this, a section of the paper describes possible techniques for uncertainty propagation. In conclusion, an example is described that applies one of these techniques to property and uncertainty aggregation within an aircraft system to aid the determination of Critical Design Features.
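
One widely used propagation technique is Monte Carlo sampling, sketched below: uncertain lower-level properties are sampled and pushed through an aggregation function, and a simple correlation-based indicator ranks which properties drive the aggregate. The aircraft property names and the aggregation function are hypothetical, not taken from the paper's example.

```python
# Monte Carlo sketch of property and uncertainty aggregation with a
# crude correlation-based sensitivity ranking. All inputs are invented.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
props = {
    "wing_area":     rng.normal(120.0, 2.0, n),   # m^2
    "empty_mass":    rng.normal(40e3, 1.5e3, n),  # kg
    "engine_thrust": rng.normal(210e3, 8e3, n),   # N
}

def aggregate(p):  # hypothetical system-level property
    return p["engine_thrust"] / (9.81 * p["empty_mass"]) * (p["wing_area"] / 100.0)

agg = aggregate(props)
print(f"aggregate: mean={agg.mean():.3f}, std={agg.std():.3f}")
for name, samples in props.items():
    sens = np.corrcoef(samples, agg)[0, 1]  # crude sensitivity indicator
    print(f"{name:>14}: correlation with aggregate = {sens:+.2f}")
```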

Ensuring Data Security and Consistency in FTIMA - A Fault Tolerant Infrastructure for Mobile Agents

Transaction management is one of the most crucial requirements for enterprise application development, which often requires concurrent access to distributed data shared among multiple applications/nodes. Transactions guarantee the consistency of data records when multiple users or processes perform concurrent operations. The existing Fault Tolerant Infrastructure for Mobile Agents (FTIMA) provides fault-tolerant behavior in distributed transactions and uses a multi-agent system for distributed transaction processing. In the existing FTIMA architecture, data flows through the network and contains personal, private, or confidential information; in banking transactions, a minor change to the transaction can cause a great loss to the user. In this paper we modify the FTIMA architecture to ensure that the user request reaches the destination server securely and without any change. We use Triple DES for encryption/decryption and the MD5 algorithm to verify message integrity.
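
A minimal sketch of this protection step is shown below, using pycryptodome for Triple DES and hashlib for the MD5 digest; key handling and message framing are simplified assumptions, not FTIMA's wire format. (Note that MD5 is considered cryptographically weak by today's standards.)

```python
# Sketch: append an MD5 digest to the message, encrypt with 3DES-CBC,
# then decrypt and verify integrity on the receiving side.
import hashlib
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

key = DES3.adjust_key_parity(get_random_bytes(24))  # 3-key Triple DES
message = b"transfer 100.00 to account 12345"

# Sender: digest + message, padded and encrypted; IV is prepended.
digest = hashlib.md5(message).digest()
cipher = DES3.new(key, DES3.MODE_CBC)
ciphertext = cipher.iv + cipher.encrypt(pad(digest + message, DES3.block_size))

# Receiver: decrypt, split off the 16-byte digest, verify integrity.
iv, body = ciphertext[:8], ciphertext[8:]
plain = unpad(DES3.new(key, DES3.MODE_CBC, iv).decrypt(body), DES3.block_size)
recv_digest, recv_message = plain[:16], plain[16:]
assert hashlib.md5(recv_message).digest() == recv_digest
print("message intact:", recv_message.decode())
```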

An Assessment of Software Process Optimization Compared to International Best Practice in Bangladesh

The challenge for software development houses in Bangladesh is to find a path that uses a minimal process rather than the extensive practices and process areas of CMMI or ISO. Small and medium-sized organizations in Bangladesh want to ensure minimal basic Software Process Improvement (SPI) in day-to-day operational activities, and these basic practices may be enough to realize their companies' improvement goals. This paper focuses on the key issues in basic software practices for small and medium-sized software organizations that cannot afford CMMI, ISO, ITIL, and similar compliance certifications. This research also suggests a basic software process practice model for Bangladesh and maps our suggestions to international best practice. In today's competitive IT world, small and medium-sized software companies require collaboration and strengthening to transform their current position into an inseparable part of the global IT scenario. This research investigated and analyzed several projects' life cycles, current good practices, effective approaches, and the realities and pain points of practitioners. We performed reasoning, root cause analysis, and comparative analysis of various approaches, methods, and practices, weighing the justifications of CMMI against real-life conditions. We avoided reinventing the wheel; our focus is on a minimal set of practices that ensures satisfaction for both organizations and software customers.

An Approach for Data Analysis, Evaluation and Correction: A Case Study from Man-Made River Project in Libya

The world's largest Prestressed Concrete Cylinder Pipe (PCCP) water supply project had a series of pipe failures between 1999 and 2001. This led the Man-Made River Authority (MMRA), the authority in charge of the implementation and operation of the project, to set up a rehabilitation plan for the conveyance system while maintaining the uninterrupted flow of water to consumers. At the same time, MMRA recognized the need for a long-term management tool that would facilitate repair and maintenance decisions and enable the appropriate preventive measures to be taken through continuous monitoring and estimation of the remaining life of each pipe. This management tool, known as the Pipe Risk Management System (PRMS), is now in operation at MMRA. Both the rehabilitation plan and the PRMS require complete and accurate pipe construction and manufacturing data. This paper describes a systematic approach to data collection, analysis, evaluation, and correction for the construction and manufacturing data files of the Phase I pipes, which are the platform for the PRMS database and any other related decision support system.

3D Oil Reservoir Visualisation Using Octree Compression Techniques Utilising Logical Grid Co-Ordinates

Octree compression techniques have been used for several years to compress large three-dimensional data sets into homogeneous regions. The technique is ideally suited to datasets whose similar values occur in clusters. Oil engineers represent reservoirs as three-dimensional grids in which hydrocarbons naturally occur in clusters. This research examines the efficiency of storing these grids using octree compression, where grid cells are divided into active and inactive regions. Initial experiments yielded high compression ratios, as only active leaf nodes and their ancestor (header) nodes are stored as a bitstream in a file on disk. Savings in computational time and memory were possible at decompression, as only active leaf nodes are sent to the graphics card, eliminating the need to reconstruct the original matrix. This results in a more compact vertex table, which can be loaded into the graphics card more quickly, producing shorter refresh delay times.
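
The core mechanism can be sketched in a few lines: a region becomes a leaf when it is homogeneous, so large inactive (or active) volumes collapse into single nodes. The grid contents below are synthetic, and the actual bitstream encoding and reservoir geometry are not shown.

```python
# Sketch of octree compression of an active/inactive cell grid: count the
# nodes needed when homogeneous regions collapse into single leaves.
import numpy as np

def octree_nodes(grid):
    """Count octree nodes needed to represent a boolean cube."""
    if grid.all() or not grid.any():      # homogeneous region -> leaf
        return 1
    h = grid.shape[0] // 2
    return 1 + sum(octree_nodes(grid[x:x + h, y:y + h, z:z + h])
                   for x in (0, h) for y in (0, h) for z in (0, h))

rng = np.random.default_rng(6)
grid = np.zeros((32, 32, 32), dtype=bool)
grid[4:12, 4:12, 4:12] = True             # clustered "active" hydrocarbon cells
grid[20:28, 18:26, 6:14] = True

nodes, cells = octree_nodes(grid), grid.size
print(f"octree nodes: {nodes}, raw cells: {cells}, ratio: {cells / nodes:.1f}x")
```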

Psychological Structure of “Aitys” as a Process of Oral Creative Competition in Kazakh Traditional Folklore

The aim of this study was to analyze the ethnopsychological content of “Aitys”, a process of creative competition in Kazakh traditional folklore, by means of Transactional Analysis (with its three ego states: Parent, Adult, and Child). “Aitys” is a source of Kazakh national self-consciousness and a form of Kazakh oral creativity. A comparative psychological analysis of classical and modern “aityses” is carried out. The empirical results show that victory in “Aitys” is associated with the “Adult” ego-state position.