A Novel Design Approach for Mechatronic Systems Based on Multidisciplinary Design Optimization

In this paper, a novel approach for the multidisciplinary design optimization (MDO) of complex mechatronic systems is presented. This approach is part of a global project aiming to include the MDO aspect in an innovative design process. As a first step, the paper treats MDO as a redesign approach limited to parametric optimization. After defining and introducing the key terms, the proposed method is described; it is based on the V-Model commonly used in mechatronics.

A Study on the Effect of Valve Timing on the Combustion and Emission Characteristics for a 4-cylinder PCCI Diesel Engine

PCCI engines can reduce NOx and PM emissions simultaneously without sacrificing thermal efficiency, but the low combustion temperature resulting from early fuel injection, and ignition occurring prior to TDC, can cause higher THC and CO emissions and higher fuel consumption. In conclusion, it was found that PCCI combustion achieved by a 2-stage injection strategy with optimized calibration factors (e.g. EGR rate, injection pressure, swirl ratio, intake pressure, injection timing) can reduce NOx and PM emissions simultaneously. This research work is expected to provide valuable information conducive to the development of an innovative combustion engine that can fulfill upcoming stringent emission standards.

Discovery of Human HMG-CoA Reductase Inhibitors Using Structure-Based Pharmacophore Modeling Combined with Molecular Dynamics Simulation Methodologies

3-Hydroxy-3-methylglutaryl coenzyme A reductase (HMGR) catalyzes the conversion of HMG-CoA to mevalonate using NADPH, a rate-controlling step of the mevalonate pathway. Inhibition of HMGR is considered an effective way to lower cholesterol levels, so the enzyme is a drug target for treating hypercholesterolemia, a major risk factor for cardiovascular disease. To discover novel HMGR inhibitors, we performed structure-based pharmacophore modeling combined with molecular dynamics (MD) simulation. Four HMGR inhibitors were used for MD simulations, and a representative structure from each simulation was selected by clustering analysis. Four structure-based pharmacophore models were generated using these representative structures. The generated models were validated and used in virtual screening to find novel scaffolds for inhibiting HMGR. The screened compounds were filtered by applying drug-like properties and used in molecular docking. Finally, four hit compounds were obtained, and their complexes were refined using energy minimization. These compounds might be potential leads for designing novel HMGR inhibitors.
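
As a minimal sketch of the representative-structure selection step (the trajectory format, atom selection, and clustering settings are not specified in the abstract, so the choices below are assumptions), trajectory frames can be clustered and the frame nearest the centroid of the most populated cluster taken as representative:

    # Hypothetical sketch: pick a representative frame from an MD trajectory
    # by k-means clustering on (aligned) atomic coordinates.
    import numpy as np
    from sklearn.cluster import KMeans

    def representative_frame(frames, n_clusters=5, seed=0):
        """frames: (n_frames, n_atoms*3) array of aligned coordinates."""
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(frames)
        # The most populated cluster is taken as the dominant conformation.
        dominant = np.bincount(km.labels_).argmax()
        members = np.where(km.labels_ == dominant)[0]
        # Representative = member frame closest to the cluster centroid.
        d = np.linalg.norm(frames[members] - km.cluster_centers_[dominant], axis=1)
        return members[d.argmin()]

    # Usage with synthetic data standing in for a real trajectory:
    rng = np.random.default_rng(0)
    traj = rng.normal(size=(200, 300))   # 200 frames, 100 atoms x 3
    print("representative frame index:", representative_frame(traj))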

Improved Robust Stability and Stabilization Conditions of Discrete-time Delayed System

The problem of robust stability and robust stabilization for a class of discrete-time uncertain systems with time delay is investigated. Based on the Tchebychev inequality and by constructing a new augmented Lyapunov function, some improved sufficient conditions ensuring exponential stability and stabilization are established. These conditions are expressed in the form of linear matrix inequalities (LMIs), whose feasibility can be easily checked using the Matlab LMI Toolbox. Compared with previous results derived in the literature, the newly obtained criteria are less conservative. Two numerical examples are provided to demonstrate the improvement and effectiveness of the proposed method.
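
For orientation, the abstract does not reproduce the authors' system model, uncertainty structure, or augmented functional; a standard nominal setting of this kind, written in LaTeX, would be:

    % Assumed standard form (nominal case only; the paper's exact matrices
    % and augmented functional are not given in the abstract).
    \[
    x(k+1) = (A+\Delta A)\,x(k) + (A_d+\Delta A_d)\,x(k-d), \qquad d>0,
    \]
    \[
    V(k) = x^{T}(k)Px(k) + \sum_{i=k-d}^{k-1} x^{T}(i)Qx(i), \qquad P,Q \succ 0.
    \]
    % Requiring V(k+1)-V(k)<0 for the nominal system and applying a Schur
    % complement yields an LMI feasibility test of the form
    \[
    \begin{bmatrix} -P+Q & 0 & A^{T}P \\ 0 & -Q & A_d^{T}P \\ PA & PA_d & -P \end{bmatrix} \prec 0 .
    \]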

Video Data Mining based on Information Fusion for Tamper Detection

In this paper, we propose novel algorithmic models, based on information fusion and feature transformation in a cross-modal subspace, for different types of residue features extracted from several intra-frame and inter-frame pixel sub-blocks in video sequences, with the aim of detecting digital video tampering or forgery. An evaluation of the proposed residue features, namely the noise residue features and the quantization features, of their transformation in the cross-modal subspace, and of their multimodal fusion, on an emulated copy-move tamper scenario shows a significant improvement in tamper detection accuracy compared to single-mode features without transformation into the cross-modal subspace.
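
The abstract does not define the residue features precisely; as a hedged illustration of one common construction of a noise residue (a frame minus its denoised version, aggregated per sub-block), a sketch might look like:

    # Illustrative noise-residue feature per pixel sub-block: the residue is
    # the difference between a frame and its median-filtered (denoised)
    # version; this is an assumed construction, not the paper's exact one.
    import numpy as np
    from scipy.ndimage import median_filter

    def block_noise_residue(frame, block=16):
        """frame: 2-D grayscale array; returns per-block mean absolute residue."""
        f = frame.astype(float)
        residue = f - median_filter(f, size=3)
        h, w = residue.shape
        feats = [np.abs(residue[i:i+block, j:j+block]).mean()
                 for i in range(0, h - block + 1, block)
                 for j in range(0, w - block + 1, block)]
        return np.array(feats)

    frame = np.random.default_rng(0).integers(0, 256, size=(64, 64))
    print(block_noise_residue(frame).shape)   # one feature per 16x16 sub-block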

Forecasting Fraudulent Financial Statements using Data Mining

This paper explores the effectiveness of machine learning techniques in detecting firms that issue fraudulent financial statements (FFS) and deals with the identification of factors associated with FFS. To this end, a number of experiments have been conducted using representative learning algorithms, which were trained on a data set of 164 fraud and non-fraud Greek firms from the period 2001-2002. Deciding which particular method to choose is a complicated problem. A good alternative to choosing only one method is to create a hybrid forecasting system incorporating a number of possible solution methods as components (an ensemble of classifiers). For this purpose, we have implemented a hybrid decision support system that combines the representative algorithms using a stacking variant methodology and achieves better performance than any of the examined simple and ensemble methods. To sum up, this study indicates that the investigation of financial information can be used in the identification of FFS and underlines the importance of financial ratios.
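
As a hedged sketch of the stacking idea (the abstract does not name the exact base learners or the meta-level variant, so the choices below are placeholders):

    # Minimal stacking ensemble for a binary fraud/non-fraud task, with
    # placeholder base learners and a logistic-regression meta-learner.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic stand-in for the 164-firm data set described in the abstract.
    X, y = make_classification(n_samples=164, n_features=10, random_state=0)
    stack = StackingClassifier(
        estimators=[("tree", DecisionTreeClassifier(random_state=0)),
                    ("forest", RandomForestClassifier(random_state=0))],
        final_estimator=LogisticRegression(),
        cv=5)  # out-of-fold base predictions feed the meta-learner
    print("accuracy: %.3f" % cross_val_score(stack, X, y, cv=5).mean())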

Extracting Single Trial Visual Evoked Potentials using Selective Eigen-Rate Principal Components

In single trial analysis, when using Principal Component Analysis (PCA) to extract Visual Evoked Potential (VEP) signals, the selection of principal components (PCs) is an important issue. We propose a new method here that selects only the appropriate PCs. We denote the method as selective eigen-rate (SER). In this method, the VEP is reconstructed based on the rate of the eigenvalues of the PCs. When this technique is applied to emulated VEP signals corrupted by background electroencephalogram (EEG), with a focus on extracting the evoked P3 parameter, it is found to be feasible. The improvement in signal to noise ratio (SNR) is superior to two other existing methods of PC selection: Kaiser (KSR) and Residual Power (RP). Although another PC selection method, Spectral Power Ratio (SPR), gives a comparable SNR at high noise levels (i.e. strong background EEG), SER gives more impressive results in such cases. Next, we applied the SER method to real VEP signals to analyse the P3 responses for matched and non-matched stimuli. The P3 parameters extracted through our proposed SER method showed a higher P3 response for the matched stimulus, which conforms to existing neuroscience knowledge. Single trial PCA using the KSR and RP methods failed to indicate any difference between the stimuli.
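
As a minimal sketch of eigen-rate-based PC selection (the paper's exact SER criterion is not given in the abstract; here we assume PCs are kept when their share of the total eigenvalue sum exceeds a threshold):

    # Hedged sketch: reconstruct single trials from only those PCs whose
    # eigenvalue share ("eigen-rate") exceeds a threshold.
    import numpy as np

    def ser_reconstruct(trials, rate_threshold=0.1):
        """trials: (n_trials, n_samples) matrix of single-trial recordings."""
        cov = trials.T @ trials / len(trials)      # time-domain covariance
        evals, evecs = np.linalg.eigh(cov)         # eigenvalues, ascending
        rates = evals / evals.sum()                # eigen-rate of each PC
        keep = evecs[:, rates > rate_threshold]    # keep high-rate PCs only
        return trials @ keep @ keep.T              # project and reconstruct

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 128)
    vep = np.exp(-((t - 0.3) / 0.05) ** 2)         # toy P3-like peak
    trials = rng.uniform(0.5, 1.5, (40, 1)) * vep \
             + 0.5 * rng.normal(size=(40, 128))    # EEG-like background noise
    print(ser_reconstruct(trials).shape)           # denoised (40, 128) trials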

A New Edit Distance Method for Finding Similarity in DNA Sequences

The P-Bigram method is a string comparison method based on an internal two-character (bigram) similarity measure. The edit distance between two strings is the minimal number of elementary editing operations required to transform one string into the other; the elementary editing operations are deletion, insertion, and substitution of characters (in the P-Bigram setting, of two-character units). In this paper, we apply the P-Bigram method to solve the similarity problem in DNA sequences. The method provides an efficient algorithm that locates all minimum-cost operations between two strings. We implemented the algorithm and found that our program calculates a smaller distance than the single-character method. We develop the P-Bigram edit distance, show how it measures similarity, and implement it using dynamic programming. The performance of the proposed approach is evaluated using the number of edits and percentage similarity measures.
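
For context, the classical single-character edit distance that the P-Bigram method builds on is computed by dynamic programming (the bigram-specific scoring is not detailed in the abstract, so only the standard algorithm is shown):

    # Classical Levenshtein edit distance via dynamic programming; the
    # P-Bigram variant would change the unit of comparison to bigrams.
    def edit_distance(a: str, b: str) -> int:
        m, n = len(a), len(b)
        d = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m + 1):
            d[i][0] = i                      # delete all of a[:i]
        for j in range(n + 1):
            d[0][j] = j                      # insert all of b[:j]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                cost = 0 if a[i - 1] == b[j - 1] else 1   # substitution cost
                d[i][j] = min(d[i - 1][j] + 1,            # deletion
                              d[i][j - 1] + 1,            # insertion
                              d[i - 1][j - 1] + cost)     # match/substitute
        return d[m][n]

    print(edit_distance("ACGTACGT", "ACGGACT"))  # distance between DNA reads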

Health Care Ethics in Vulnerable Populations: Clinical Research through the Patient's Eyes

Chronic conditions carry with them strong emotions and often lead to charged relationships between patients and their health providers and, by extension, patients and health researchers. Persons are both autonomous and relational and a purely cognitive model of autonomy neglects the social and relational basis of chronic illness. Ensuring genuine informed consent in research requires a thorough understanding of how participants perceive a study and their reasons for participation. Surveys may not capture the complexities of reasoning that underlies study participation. Contradictory reasons for participation, for instance an initial claim of altruism as rationale and a subsequent claim of personal benefit (therapeutic misconception), affect the quality of informed consent. Individuals apply principles through the filter of personal values and lived experience. Authentic autonomy, and hence authentic consent to research, occurs within the context of patients' unique life narratives and illness experiences.

Numerical and Experimental Stress Analysis of Stiffened Cylindrical Composite Shell under Transverse End Load

Grid composite structures have many applications in the aerospace industry, in which they frequently carry transverse loadings. In the present paper, a stiffened composite cylindrical shell with a clamped-free boundary condition under a transverse end load was studied experimentally and numerically. Electrical strain gauges were employed to measure the strains, and a finite element analysis was performed in the FEM software ANSYS 11 to validate the experimental results. In addition, the results for the stiffened composite shell were compared with those for an unstiffened composite shell. It was observed that the intersection of two stiffeners has an important effect in decreasing the stress in the shell. Fairly good agreement was observed between the numerical and the measured results. To the authors' knowledge of recent studies on grid composite structures, no investigation like the present one has been reported.

Time Comparative Simulator for Distributed Process Scheduling Algorithms

In any distributed system, process scheduling plays a vital role in determining the efficiency of the system. Process scheduling algorithms are used to ensure that the components of the system maximize their utilization and complete all assigned processes in a specified period of time. This paper focuses on the development of a comparative simulator for distributed process scheduling algorithms. The objectives of the work carried out include the development of the comparative simulator as well as a comparative study of three distributed process scheduling algorithms: the sender-initiated, receiver-initiated, and hybrid sender-receiver-initiated algorithms. The comparative study was based on the Average Waiting Time (AWT) and Average Turnaround Time (ATT) of the processes involved. The simulation results show that the performance of the algorithms depends on the number of nodes in the system.
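
For reference, the two metrics can be computed from per-process timestamps as follows (a minimal sketch; the field names are assumptions, since the simulator's data model is not described in the abstract):

    # Hypothetical per-process records: arrival, start of service, and
    # completion times; AWT and ATT are averages over these timestamps.
    from dataclasses import dataclass

    @dataclass
    class Proc:
        arrival: float
        start: float
        finish: float

    def awt(procs):   # Average Waiting Time: time queued before service
        return sum(p.start - p.arrival for p in procs) / len(procs)

    def att(procs):   # Average Turnaround Time: arrival to completion
        return sum(p.finish - p.arrival for p in procs) / len(procs)

    procs = [Proc(0.0, 1.0, 4.0), Proc(2.0, 4.0, 9.0), Proc(3.0, 9.0, 11.0)]
    print("AWT =", awt(procs), "ATT =", att(procs))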

A Self-Configuring System for Object Recognition in Color Images

The MEMORI system automatically detects and recognizes rotated and/or rescaled versions of the objects of a database within digital color images with cluttered backgrounds. This task is accomplished by means of a region grouping algorithm guided by heuristic rules, whose parameters concern some geometrical properties and the recognition score of the database objects. This paper focuses on the strategies implemented in MEMORI for the estimation of the heuristic rule parameters. Since this estimation is automatic, it makes the system a highly user-friendly tool.

Mounting Time Reduction using Content-Based Block Management for NAND Flash File System

Flash memory has many advantages, such as low power consumption, strong shock resistance, fast I/O, and non-volatility, and it is increasingly used in mobile storage devices. YAFFS, one of the NAND flash file systems, is widely used in embedded devices. However, the existing YAFFS takes a long time to mount the file system because it scans the whole spare area in every page of the NAND flash memory. In order to solve this problem, we propose a new content-based flash file system using a mounting time reduction technique. The proposed method scans only partial spare areas of some special pages by using content-based block management. The experimental results show that the proposed method reduces the average mounting time by 87.2% compared with JFFS2 and by 69.9% compared with YAFFS.

Reduction of Chlorine Dioxide in Paper Bleaching using Peroxide Activation

All around the world, the pulp and paper industry is among the biggest plant production industries, with environmental pollution as the biggest challenge facing pulp manufacturing operations. The concern in this industry is to produce a high volume of paper of a high quality standard and at low cost without affecting the environment. The results obtained from this bleaching study show that the activation of peroxide is an effective method of reducing the total applied charge of chlorine dioxide, which is harmful to the environment, and also show that softwood and hardwood Kraft pulps responded linearly to the peroxide treatments. During the bleaching process, the production plant generates chlorine compounds. In the trial stages, the chlorine dioxide charge was reduced by 3 kg/ton, reducing the brightness of the pulp from 65% ISO to 60% ISO, and the dosing point was returned to the E-stage charges by pre-treating Kraft pulps with hydrogen peroxide. The pulp and paper industry has developed elemental chlorine free (ECF) and totally chlorine free (TCF) bleaching; in its quest to be environmentally friendly, it has been looking at ways to turn the ECF process into a TCF process while still remaining competitive. This prompted the research to investigate the capability of hydrogen peroxide as a catalyst to reduce the chlorine dioxide charge.

Exploring the Application of Knowledge Management Factors in Esfahan University's Medical College

In this competitive age, one of the key tools of most successful organizations is knowledge management. Today some organizations measure their current knowledge and use it as an indicator for rating the organization in their reports. Given that universities and colleges of medical science have a great role in the public health of societies, their access to the newest scientific research and the establishment of organizational knowledge management systems are very important. In order to explore the application of knowledge management factors, a national study was undertaken. The main purpose of this study was to find the rate of application of knowledge management factors and some ways to establish wider application of the knowledge management system in Esfahan University's Medical College (EUMC). Esfahan is the second largest city after Tehran, the capital city of Iran, and the EUMC is the biggest medical college in Esfahan. To rate the application of knowledge management, this study uses a quantitative research methodology based on the Probst, Raub and Romhardt model of knowledge management. A group of 267 faculty members and staff of the EUMC were surveyed via questionnaire. The findings showed that the rate of application of knowledge management factors in the EUMC was lower than average. As a result, interviews with ten faculty members were conducted to find guidelines for establishing wider application of the knowledge management system in the EUMC.

Estimating the Absorption of Volatile Organic Compounds in Four Biodiesels Using the UNIFAC Procedure

This work considered the thermodynamic feasibility of scrubbing volatile organic compounds (VOCs) into biodiesel with a view to designing a gas treatment process with this absorbent. A detailed vapour-liquid equilibrium investigation was performed using the original UNIFAC group contribution method. The four biodiesels studied in this work are methyl oleate, methyl palmitate, methyl linolenate and ethyl stearate. The original UNIFAC procedure was used to estimate the infinite dilution activity coefficients of 13 selected volatile organic compounds in the biodiesels. The calculations were done at a VOC mole fraction of 9.213x10^-8. Ethyl stearate gave the most favourable phase equilibrium. A close agreement was found between the infinite dilution activity coefficient of toluene found in this work and those reported in the literature. Thermodynamic models can efficiently be used to calculate a vast amount of phase equilibrium behaviour using a limited number of experimental data.
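
For reference, the original UNIFAC method splits the activity coefficient into combinatorial and residual contributions; in LaTeX (standard form of the method, not the paper's own notation):

    % Original UNIFAC: the activity coefficient of component i splits into a
    % combinatorial (size/shape) part and a residual (group-interaction) part:
    \[
    \ln \gamma_i = \ln \gamma_i^{C} + \ln \gamma_i^{R}.
    \]
    % The infinite dilution activity coefficient is the limit
    \[
    \gamma_i^{\infty} = \lim_{x_i \to 0} \gamma_i ,
    \]
    % which characterizes how favourably a trace VOC dissolves in the
    % biodiesel solvent (a smaller \gamma_i^{\infty} means more favourable
    % absorption).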

Vulnerabilities of IEEE 802.11i Wireless LAN CCMP Protocol

IEEE has recently incorporated the CCMP protocol to provide robust security to IEEE 802.11 wireless LANs. It is found that CCMP has been designed with a weak nonce construction and transmission mechanism, which leads to the exposure of the initial counter value. This weak construction of the nonce renders the protocol vulnerable to attacks by intruders. This paper presents how the initial counter can be pre-computed by the intruder. This vulnerability of the counter block value leads to a pre-computation attack on the counter mode encryption of CCMP. The failure of the counter mode would result in the collapse of the whole security mechanism of the 802.11 WLAN.
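
To illustrate why a predictable counter block is dangerous in counter mode generally (this is a generic CTR-mode demonstration, not the paper's CCMP-specific pre-computation attack, and the byte values below are illustrative rather than CCMP's actual counter layout):

    # Generic CTR-mode illustration: if the full counter input is predictable
    # and reused, the keystream depends only on the key, so known plaintext
    # in one message reveals the keystream and decrypts another message.
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = b"\x00" * 16                     # secret AES key
    counter0 = b"\x01" * 16                # predictable initial counter block

    def ctr_encrypt(data):
        enc = Cipher(algorithms.AES(key), modes.CTR(counter0)).encryptor()
        return enc.update(data) + enc.finalize()

    c1 = ctr_encrypt(b"KNOWN PLAINTEXT!")  # attacker knows this plaintext
    c2 = ctr_encrypt(b"secret message..")  # same (key, counter) pair reused

    keystream = bytes(a ^ b for a, b in zip(c1, b"KNOWN PLAINTEXT!"))
    recovered = bytes(a ^ b for a, b in zip(c2, keystream))
    print(recovered)                       # b'secret message..'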

An Integrative Bayesian Approach to Supporting the Prediction of Protein-Protein Interactions: A Case Study in Human Heart Failure

Recent years have seen a growing trend towards the integration of multiple information sources to support large-scale prediction of protein-protein interaction (PPI) networks in model organisms. Despite advances in computational approaches, the combination of multiple "omic" datasets representing the same type of data, e.g. different gene expression datasets, has not been rigorously studied. Furthermore, there is a need to further investigate the inference capability of powerful approaches, such as fully-connected Bayesian networks, in the context of the prediction of PPI networks. This paper addresses these limitations by proposing a Bayesian approach to integrate multiple datasets, some of which encode the same type of "omic" data, to support the identification of PPI networks. The case study reported involved the combination of three gene expression datasets relevant to human heart failure (HF). In comparison with two traditional methods, the Naive Bayesian and maximum likelihood ratio approaches, the proposed technique can accurately identify known PPIs and can be applied to infer potentially novel interactions.
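
As a hedged sketch of the naive Bayesian baseline the paper compares against (the structure and scoring of the proposed fully-connected model are not given in the abstract), independent evidence sources can be combined by multiplying per-source likelihood ratios:

    # Naive Bayesian integration of independent evidence for a candidate
    # protein pair: the combined likelihood ratio is the product of
    # per-source ratios LR_s = P(e_s | interaction) / P(e_s | no interaction).
    import math

    def combined_log_lr(evidence):
        """evidence: list of (p_given_ppi, p_given_no_ppi) per source."""
        return sum(math.log(p1 / p0) for p1, p0 in evidence)

    # Hypothetical numbers for three gene-expression-derived sources:
    pair_evidence = [(0.30, 0.05), (0.20, 0.10), (0.15, 0.12)]
    print("log likelihood ratio: %.2f" % combined_log_lr(pair_evidence))
    # A pair is predicted to interact when the combined ratio exceeds a
    # threshold set from the prior odds of interaction.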

Signature Identification Scheme Based on Iterated Function Systems

Since 1984, many schemes have been proposed for digital signature protocols, among them those based on the discrete logarithm and on factorization. Recently, a new identification scheme based on iterated function systems (IFS) was proposed and shown to be more efficient. In this study, the proposed identification scheme is transformed into a digital signature scheme by using a one-way hash function. It is a generalization of the GQ signature scheme. The attractor of the IFS is used to obtain the public key from a private one, and in the encryption and decryption of a hash value. Our aim is to provide techniques and tools which may be useful for developing cryptographic protocols. Comparisons between the proposed scheme and a fractal digital signature scheme based on the RSA setting, as well as with the conventional Guillou-Quisquater and RSA signature schemes, are performed to show that the proposed scheme is efficient and has high performance.

Industrial Development, Environment and Occupational Problems: The Case of Iran

There are three distinct stages in the evolution of economic thought. In the first stage, the major concern was to accelerate economic growth with increased availability of material goods, especially in developing economies with very low living standards, because poverty eradication required faster economic growth. In the second stage, economists made a distinction between growth and development: development was seen as going beyond economic growth, bringing certain changes in the structure of the economy with a more equitable distribution of the benefits of growth, with growth becoming automatic and sustained. The third stage has now been reached. Our concern is now with "sustainable development", that is, development not only for the present but also for the future; the focus has thus changed from "sustained growth" to "sustainable development". Sustainable development brings to the fore the long-term relationship between ecology and economic development. Since its creation in 1972, UNEP has worked for development without destruction, for environmentally sound and sustainable development. It was realised that the environment cannot be viewed in a vacuum; it is not separate from development, nor is it competing with it. UNEP suggested the integration of the environment with development, whereby ecological factors enter development planning, socio-economic policies, cost-benefit analysis, trade, technology transfer, waste management, education and other specific areas. Industrialisation has contributed to the growth of the economies of several countries; it has improved the standard of living of their people and provided benefits to society. It has also created in the process great environmental problems such as climate change, forest destruction and denudation, soil erosion and desertification. On the other hand, industry has provided jobs and improved the prospects of wealth for the industrialists, while working-class communities have had to put up with high levels of pollution simply to keep their jobs and protect their income. The environmental problem has many roots, which may lie in the political, economic, cultural and technological conditions of modern society, but experts concede that industrial growth lies somewhere close to the heart of the matter. Therefore, the objective of this paper is not to document all roots of the environmental crisis but rather to discuss the effects of industrial growth and development. We have come to the conclusion that although public intervention is often unnecessary to ensure that perfectly competitive markets function in society's best interests, such intervention is necessary when firms or consumers pollute.