STRPRO Tool for Manipulation of Stratified Programs Based on SEPN

Negation is useful in most real-world applications. However, its introduction raises semantic and canonical problems. SEPN nets are an extension of predicate nets well suited to the definition and manipulation of stratified programs. This formalism is characterized by two main contributions. The first concerns the management of the whole class of stratified programs. The second is related to the optimization of common operations (maximal stratification, incremental updates, etc.). In this paper, we propose useful algorithms for manipulating stratified programs using SEPN. These algorithms were implemented and validated in the STRPRO tool.
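
As a concrete illustration of the kind of operation involved (a hedged sketch, not the SEPN formalism itself), the following Python fragment computes stratum levels for a logic program from its rules, accepting the program only when the level assignment reaches a fixed point; the rule encoding and example program are made up for this sketch.

```python
# Hedged sketch (not the SEPN formalism): stratifying a logic program from
# its rules. Each rule is encoded as (head, [(body_predicate, is_negated), ...]).
def stratify(rules):
    preds = {h for h, _ in rules} | {p for _, body in rules for p, _ in body}
    stratum = {p: 0 for p in preds}
    for _ in range(len(preds) + 1):             # enough passes if stratifiable
        changed = False
        for head, body in rules:
            for pred, negated in body:
                need = stratum[pred] + (1 if negated else 0)
                if stratum[head] < need:
                    stratum[head] = need
                    changed = True
        if not changed:
            return stratum                      # stable level assignment
    raise ValueError("not stratifiable: negation through a cycle")

# Example program:  p :- q, not r.    r :- s.
print(stratify([("p", [("q", False), ("r", True)]),
                ("r", [("s", False)])]))        # e.g. {'p': 1, 'q': 0, 'r': 0, 's': 0}
```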

The Development of the Positive Emotion Regulation Strategies Scale for Children and Adolescents

The study was designed to develop the Positive Emotion Regulation Questionnaire (PERQ), a measure that assesses positive emotion regulation strategies through self-report. The 14 items developed for the survey instrument were based on the literature on positive emotion regulation strategies. A total of 319 elementary school students (aged 12 to 14) were recruited from three public elementary schools and surveyed on their use of positive emotion regulation strategies. Of the 319 questionnaires, 20 were invalid, yielding a response rate of 92%. The collected data were analyzed using item analysis, factor analysis, and structural equation modeling. Based on the item analysis, the formal survey instrument was reduced to 11 items. A principal axis factor analysis with varimax rotation was performed on the responses, yielding a two-factor solution (savoring strategy and neutralizing strategy) that accounted for 55.5% of the total variance. The two-factor structure of the scale was then confirmed by structural equation modeling. Finally, the reliability coefficients of the two factors were Cronbach's α of .92 and .74. A gender difference was found only for the savoring strategy. In conclusion, the positive emotion regulation strategies questionnaire offers a brief, internally consistent, and valid self-report measure for understanding the emotion regulation strategies of children, which may be useful to researchers and applied professionals.
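
For readers who want to reproduce the reliability analysis on their own data, the sketch below computes Cronbach's α from an item-score matrix. It is a generic illustration with simulated Likert-type responses (a single latent trait plus noise), not the study's data; the sample size and loadings are assumed for the example only.

```python
import numpy as np

# Hedged illustration: Cronbach's alpha on simulated item scores, not the study's data.
def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
trait = rng.normal(size=(300, 1))                  # one latent trait, 300 respondents
scores = trait + 0.6 * rng.normal(size=(300, 6))   # 6 items loading on the trait
print(round(cronbach_alpha(scores), 2))            # reliability of the 6-item scale
```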

Compensation-Based Current Decomposition

This paper deals with the current space-vector decomposition in three-phase, three-wire systems on the basis of several case studies. We propose four components of the current space-vector, expressed in terms of the DC and AC components of the instantaneous active and reactive powers. The notion of a supplementary useless current vector is also pointed out. The analysis shows that a current decomposition which respects the definition of the instantaneous apparent power vector is useful for compensation purposes only if the supply voltages are sinusoidal. A modified definition of the current components is proposed for operation under non-sinusoidal voltage conditions.
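
As background only (sign conventions vary, and the modified definition proposed in the paper is not reproduced here), the classical instantaneous p-q quantities in the stationary αβ frame and the resulting four-component current decomposition can be sketched as:

```latex
% Classical p-q quantities in the stationary alpha-beta frame (background only):
\begin{align}
  p &= v_\alpha i_\alpha + v_\beta i_\beta \;=\; \bar{p} + \tilde{p}, \\
  q &= v_\alpha i_\beta - v_\beta i_\alpha \;=\; \bar{q} + \tilde{q}, \\
  \mathbf{i} &= \frac{\bar{p}\,\mathbf{v} + \tilde{p}\,\mathbf{v}
               + \bar{q}\,\mathbf{v}_\perp + \tilde{q}\,\mathbf{v}_\perp}
               {v_\alpha^2 + v_\beta^2},
  \qquad \mathbf{v}_\perp = (-v_\beta,\, v_\alpha).
\end{align}
```

The four terms of the last expression correspond to four current components built from the DC and AC parts of the instantaneous active and reactive powers; in the classical theory only the term driven by the DC active power transports the average power, and the remaining components are the usual compensation targets.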

A Hybrid Multi-Objective Algorithm for Flexible Job Shop Scheduling

Scheduling for the flexible job shop is very important in both production management and combinatorial optimization. However, it is quite difficult to achieve an optimal solution to this problem with traditional optimization approaches owing to its high computational complexity. Combining several optimization criteria induces additional complexity and new problems. In this paper, a Pareto approach to the multi-objective flexible job shop scheduling problem is proposed. The objectives considered are minimizing the overall completion time (makespan) and the total weighted tardiness (TWT). An effective simulated annealing algorithm based on the proposed approach is presented to solve the multi-objective flexible job shop scheduling problem. An external memory of non-dominated solutions is maintained to save and update the non-dominated solutions found during the search. Numerical examples are used to evaluate and study the performance of the proposed algorithm. The proposed algorithm can be applied easily in real factory conditions and to large-size problems. It should thus be useful to both practitioners and researchers.
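
A minimal sketch of the external archive idea follows (Python, assuming both makespan and TWT are minimized); the simulated annealing acceptance rule itself and the authors' exact archive management are not reproduced here.

```python
# Sketch of an external archive of non-dominated schedules, as used in
# Pareto-based multi-objective search. Objectives: (makespan, TWT), both minimized.
def dominates(a, b):
    """True if a is at least as good as b in both objectives and strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """Insert candidate if no archived point dominates it; drop points it dominates."""
    if any(dominates(old, candidate) for old in archive):
        return archive
    return [old for old in archive if not dominates(candidate, old)] + [candidate]

archive = []
for point in [(120, 45.0), (110, 60.0), (130, 40.0), (110, 55.0)]:   # made-up evaluations
    archive = update_archive(archive, point)
print(archive)   # only the mutually non-dominated (makespan, TWT) pairs remain
```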

Machining of FRP Composites by Abrasive Jet Machining: Optimization Using Taguchi

Abrasive jet machining (AJM) is an unconventional machining process in which material is removed from brittle and hard workpieces in the form of micro-chips. With the increasing use of materials such as ceramics and composites in the manufacture of various mechanical and electronic components, AJM has become a useful technique for micro-machining. The present study highlights the influence of parameters such as pressure, stand-off distance (SOD), machining time, abrasive grain size, and nozzle diameter on the material removal of a fiber-reinforced polymer (FRP) composite by abrasive jet machining. The results of the experiments conducted were analyzed and optimized with the Taguchi method, and ANOVA was used to determine the optimal parameter values.
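
For a material-removal response that should be maximized, Taguchi analysis typically uses the larger-the-better signal-to-noise ratio; the sketch below shows that calculation on hypothetical replicate values (the actual experimental data are not given in the abstract).

```python
import math

# Larger-the-better signal-to-noise ratio used in Taguchi analysis when the
# response (here, material removal rate) should be maximized.
def sn_larger_the_better(replicates):
    n = len(replicates)
    return -10.0 * math.log10(sum(1.0 / y**2 for y in replicates) / n)

trial_mrr = [0.42, 0.45, 0.40]   # hypothetical g/min replicates for one orthogonal-array trial
print(round(sn_larger_the_better(trial_mrr), 2))   # higher S/N = better, more robust setting
```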

In silico Analysis of Human microRNAs Targeting Influenza A Viruses (Subtypes H1N1, H5N1 and H3N2)

In this study, three subtypes of influenza A virus (pH1N1, H5N1 and H3N2) that naturally infect humans were analyzed using bioinformatic approaches to find candidate human cellular miRNAs targeting the viral genomes. Seventy-six miRNAs targeting influenza A viruses were identified. Among these candidates, 70 miRNAs were subtype-specific: 21 targeted subtype H1N1, 27 targeted subtype H5N1, and 22 targeted subtype H3N2. The remaining 6 miRNAs targeted multiple subtypes of influenza A virus. Notably, hsa-miR-3145 was the only candidate miRNA targeting the PB1 gene of all three subtypes. Most of the candidate miRNAs target the polymerase complex genes (PB2, PB1 and PA) of influenza A viruses. This study predicted potential human miRNAs targeting different subtypes of influenza A virus, which might be useful for inhibiting viral replication and for better understanding the interaction between virus and host cell.

Preparation of Nanostructured ZnO-SnO2 Thin Films for Optoelectronic Properties and Post-Annealing Influence

ZnO-SnO2, i.e. zinc tin oxide (ZTO), thin films were deposited on glass substrates with varying concentrations (ZnO:SnO2 = 100:0, 90:10, 70:30 and 50:50 wt.%) at room temperature by the flash evaporation technique. The deposited ZTO films were annealed at 450 °C in vacuum. These films were characterized to study the effect of annealing on their structural, electrical, and optical properties. Atomic force microscopy (AFM) and scanning electron microscopy (SEM) images reveal the surface morphology of the ZTO thin films. The apparent growth of surface features indicates the formation of nanostructured ZTO thin films. The small root-mean-square surface roughness (R_RMS) makes the films suitable for optical coatings. The sheet resistance was also found to decrease for both types of films with increasing SnO2 concentration. The optical transmittance was found to decrease, while a blue shift was observed after annealing.

A Simplified Model for Mechanical Loads under Angular Misalignment and Unbalance

This paper presents a dynamic model for mechanical loads of an electric drive, including angular misalignment and load unbalance. The misalignment model represents the effects of the universal joint between the motor and the mechanical load. Simulation results are presented for an induction motor driving a mechanical load with angular misalignment, for both flexible and rigid coupling. The models presented are very useful in the study of mechanical fault detection in induction motors using mechanical and electrical signals already available in a drive system, such as speed, torque and stator currents.
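
For reference, the textbook kinematics of a single Cardan (universal) joint with misalignment angle β already show the characteristic effect such a model has to capture; this is only the classical relation, and the paper's joint model may differ in detail.

```latex
% Classical kinematics of a single Cardan (universal) joint, misalignment angle \beta,
% driving-shaft rotation angle \theta_1 (background relation only):
\begin{equation}
  \omega_2 \;=\; \frac{\omega_1 \cos\beta}{1 - \sin^2\!\beta\,\sin^2\!\theta_1}
\end{equation}
```

Even at constant input speed ω1, the driven-shaft speed ω2 therefore oscillates at twice the shaft rotation frequency.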

Computational Study on Cardiac-Coronary Interaction in Terms of Coronary Flow-Pressure Waveforms in Presence of Drugs: Comparison Between Simulated and In Vivo Data

A human cardiovascular simulator can be a useful tool for understanding complex pathophysiological processes in the cardiocirculatory system. It can also be used to investigate the effects of different drugs on hemodynamic parameters. The aim of this work is to test the ability of our cardiovascular numerical simulator CARDIOSIM© to reproduce coronary flow/pressure waveforms in the presence of two different drugs: Amlodipine (AMLO) and Adenosine (ADO). In particular, a time-varying intramyocardial compression, assumed to be proportional to the left ventricular pressure, was applied to the venous coronary compliances in order to study its effects on the coronary blood flow and on the flow/pressure loop. Since coronary circulation dynamics is strongly interrelated with the mechanics of left ventricular contraction, relaxation, and filling, the numerical model made it possible to analyze the effects induced by the left ventricular pressure on the coronary flow.
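
A deliberately crude lumped sketch of this cardiac-coronary interaction is given below, assuming (as stated above) an intramyocardial pressure proportional to the left ventricular pressure; the waveforms, resistance and proportionality constant are invented placeholders, and this is not the CARDIOSIM© model.

```python
import math

# Deliberately crude lumped sketch (NOT the CARDIOSIM model): coronary flow
# driven by aortic pressure against an intramyocardial back pressure assumed
# proportional to left ventricular pressure. All numbers are placeholders.
K_IM = 0.75        # assumed intramyocardial / LV pressure ratio
R_COR = 20.0       # assumed coronary resistance, mmHg.s/ml

def lv_pressure(t, period=0.8, p_max=120.0):
    """Half-sine systolic pulse over the first 40% of the beat, ~10 mmHg otherwise."""
    phase = (t % period) / period
    return p_max * math.sin(math.pi * phase / 0.4) if phase < 0.4 else 10.0

def aortic_pressure(t):
    return 80.0 + 0.35 * lv_pressure(t)            # rough aortic waveform (assumed)

def coronary_flow(t):
    return max(0.0, aortic_pressure(t) - K_IM * lv_pressure(t)) / R_COR

for t in (0.1, 0.2, 0.6):                          # early systole, peak systole, diastole
    print(f"t = {t:.1f} s   q = {coronary_flow(t):.2f} ml/s")
# Flow is lowest during systole, when the intramyocardial compression is highest.
```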

Optimum Shape and Design of Cooling Towers

The aim of the current study is to develop a numerical tool capable of achieving an optimum shape and design of hyperbolic cooling towers, based on coupling an in-house non-linear finite element model with a genetic algorithm optimization technique. The objective function is the minimum weight of the tower. The geometry of the tower is represented by means of B-spline curves. The finite element method is applied to model the elastic buckling behaviour of a tower subjected to wind pressure and dead load. The study is divided into two main parts. The first part investigates the optimum shape of the tower corresponding to minimum weight, assuming constant thickness. The study is extended in the second part by introducing the shell thickness as one of the design variables in order to achieve an optimum shape and design. Design, functionality and practicality constraints are applied.
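
The geometry and objective setup can be illustrated with the short sketch below, in which the tower meridian is a cubic B-spline in the control-point radii and the objective is the shell weight; the finite-element buckling constraint used in the study is omitted, and all dimensions, material values and control points are assumed for illustration.

```python
import numpy as np
from scipy.interpolate import BSpline

# Sketch of the geometry/objective setup only: the meridian radius r(z) is a
# clamped cubic B-spline in the control-point radii (the design variables) and
# the objective is the shell weight. Buckling constraints are omitted here.
Z = np.linspace(0.0, 120.0, 400)                                  # height coordinate, m (assumed)
KNOTS = np.concatenate(([0.0] * 4, [40.0, 80.0], [120.0] * 4))    # clamped cubic knot vector
RHO, THICKNESS = 2500.0, 0.25                                     # kg/m3, m (constant-thickness case)

def shell_weight(ctrl_radii):
    r = BSpline(KNOTS, np.asarray(ctrl_radii), 3)(Z)              # meridian r(z)
    dr_dz = np.gradient(r, Z)
    integrand = 2.0 * np.pi * r * np.sqrt(1.0 + dr_dz**2)         # shell surface per unit height
    area = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(Z))
    return RHO * THICKNESS * area                                 # shell mass, kg

# A GA individual is simply the vector of control-point radii:
print(f"{shell_weight([45.0, 35.0, 28.0, 26.0, 30.0, 33.0]):.3e} kg")
```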

Biological Data Integration using SOA

Nowadays scientific data are inevitably digital and stored in a wide variety of formats in heterogeneous systems. Scientists need an integrated view of remote or local heterogeneous data sources, together with advanced tools for accessing, analyzing, and visualizing data. This research suggests the use of Service Oriented Architecture (SOA) to integrate biological data from different data sources. This work shows how SOA can solve the problems facing the integration process and whether biologists can access biological data in an easier way. There are several ways to implement SOA, but web services are the most popular. The Microsoft .NET Framework was used to implement the proposed architecture.
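
As a language-neutral illustration of the service idea (the work itself uses .NET web services), the sketch below wraps a single, made-up biological record store behind an HTTP endpoint so that clients see one uniform interface regardless of the underlying source; the endpoint path and data are hypothetical.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustration only (the paper's architecture is implemented with .NET web
# services): one data source exposed behind a service endpoint with a uniform
# JSON interface. Records and URL scheme are made up.
SEQUENCES = {"P69905": {"gene": "HBA1", "organism": "Homo sapiens", "length": 142}}

class BioService(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /protein/P69905 returns the integrated record as JSON
        parts = self.path.strip("/").split("/")
        record = SEQUENCES.get(parts[1]) if len(parts) == 2 and parts[0] == "protein" else None
        body = json.dumps(record if record else {"error": "not found"}).encode()
        self.send_response(200 if record else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), BioService).serve_forever()
```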

Offline Signature Recognition using Radon Transform

In this work, a new offline signature recognition system based on the Radon transform, fractal dimension (FD) and support vector machine (SVM) is presented. In the first step, projections of the original signatures along four specified directions are computed using the Radon transform. Then, the FDs of the four resulting projection vectors are calculated to construct a feature vector for each signature. These vectors are fed into an SVM classifier for signature recognition. In order to evaluate the effectiveness of the system, several experiments are carried out using the offline signature database from the Signature Verification Competition (SVC) 2004. Experimental results indicate that the proposed method achieves a high accuracy rate in signature recognition.
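
A sketch of the pipeline is given below. The Radon projections follow the description above; for the fractal dimension, Katz's estimator is used here as one common choice (the abstract does not specify which FD estimator is used), and the images and labels are random stand-ins for real signature data.

```python
import numpy as np
from skimage.transform import radon
from sklearn.svm import SVC

# Pipeline sketch: Radon projections at four directions -> fractal dimension of
# each projection (Katz's estimator as one common choice) -> 4-D feature -> SVM.
ANGLES = [0.0, 45.0, 90.0, 135.0]

def katz_fd(x):
    """Katz fractal dimension of a 1-D waveform."""
    x = np.asarray(x, dtype=float)
    steps = np.sqrt(1.0 + np.diff(x) ** 2)
    L, a = steps.sum(), steps.mean()                       # curve length, mean step
    d = np.sqrt(np.arange(1, len(x)) ** 2 + (x[1:] - x[0]) ** 2).max()
    n = L / a
    return np.log10(n) / (np.log10(n) + np.log10(d / L))

def signature_features(image):
    """image: 2-D grayscale signature array -> 4 fractal-dimension features."""
    sinogram = radon(image, theta=ANGLES, circle=False)    # one column per angle
    return [katz_fd(sinogram[:, k]) for k in range(len(ANGLES))]

# Hypothetical usage with toy data (replace with real signature images/labels):
rng = np.random.default_rng(1)
images = rng.random((20, 64, 64))
labels = rng.integers(0, 2, 20)
X = np.array([signature_features(img) for img in images])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:3]))
```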

A Distributed Cognition Framework to Compare E-Commerce Websites Using Data Envelopment Analysis

This paper presents an approach based on a distributed cognition framework and a non-parametric multi-criteria evaluation methodology, Data Envelopment Analysis (DEA), designed specifically to compare e-commerce websites from the consumer/user viewpoint. In particular, the framework considers a website's relative efficiency as a measure of its quality and usability. A website is modelled as a black box capable of providing the consumer/user with a set of functionalities. When the consumer/user interacts with the website to perform a task, he/she is involved in a cognitive activity, sustaining a cognitive cost to search, interpret and process information, and experiencing a sense of satisfaction. The degree of ambiguity and uncertainty he/she perceives and the required search time determine the size of the effort, and hence the cognitive cost, he/she has to sustain to perform the task. Conversely, performing the task and achieving the result induce a sense of gratification, satisfaction and usefulness. In total, 9 variables are measured, grouped into 3 website macro-dimensions (user experience, site navigability and structure). The framework is applied to compare 40 websites of businesses performing electronic commerce in the information technology market. A questionnaire to collect subjective judgements for the websites in the sample was purposely designed and administered to 85 university students enrolled in computer science and information systems engineering undergraduate courses.
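
The DEA step can be sketched as one linear program per website (decision-making unit, DMU), here using the standard input-oriented CCR multiplier model; the inputs and outputs below are invented scores, not the questionnaire data from the study.

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented CCR multiplier model, one LP per website (DMU): maximize the
# weighted outputs of the DMU under evaluation, subject to unit weighted input
# and to no DMU exceeding efficiency 1. Data below are made-up scores.
X = np.array([[3.0], [2.0], [4.0]])                    # one input per DMU, e.g. cognitive cost
Y = np.array([[5.0, 4.0], [4.0, 2.0], [6.0, 3.0]])     # two outputs, e.g. satisfaction, usefulness

def ccr_efficiency(j, X, Y):
    n, m = X.shape
    _, s = Y.shape
    c = np.concatenate([-Y[j], np.zeros(m)])                       # maximize u.y_j -> minimize -u.y_j
    A_eq = np.concatenate([np.zeros(s), X[j]]).reshape(1, -1)      # v.x_j = 1
    A_ub = np.hstack([Y, -X])                                      # u.y_k - v.x_k <= 0 for all k
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun                                                # efficiency score in (0, 1]

for j in range(len(X)):
    print(f"DMU {j}: efficiency = {ccr_efficiency(j, X, Y):.3f}")
```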

Feature Reduction of Nearest Neighbor Classifiers using Genetic Algorithm

The design of a pattern classifier includes an attempt to select, among a set of possible features, a minimum subset of weakly correlated features that better discriminate the pattern classes. This is usually a difficult task in practice, normally requiring the application of heuristic knowledge about the specific problem domain. The selection and quality of the features representing each pattern have a considerable bearing on the success of subsequent pattern classification. Feature extraction is the process of deriving new features from the original features in order to reduce the cost of feature measurement, increase classifier efficiency, and allow higher classification accuracy. Many current feature extraction techniques involve linear transformations of the original pattern vectors to new vectors of lower dimensionality. While this is useful for data visualization and for increasing classification efficiency, it does not necessarily reduce the number of features that must be measured, since each new feature may be a linear combination of all of the features in the original pattern vector. In this paper, a new approach to feature extraction is presented in which feature selection, feature extraction, and classifier training are performed simultaneously using a genetic algorithm. In this approach, each feature value is first normalized by a linear equation and then scaled by the associated weight prior to training, testing, and classification. A k-NN classifier is used to evaluate each set of feature weights. The genetic algorithm optimizes a vector of feature weights, which are used to scale the individual features in the original pattern vectors in either a linear or a nonlinear fashion. With this approach, the number of features used in classification can be effectively reduced.
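
A minimal sketch of this idea follows: a genetic algorithm evolves one weight per feature, and the fitness of a weight vector is the cross-validated accuracy of a k-NN classifier on the weighted features. The GA operators, parameters and the Iris stand-in data are illustrative choices, not the authors' exact setup.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Minimal sketch: evolve per-feature weights; fitness = cross-validated k-NN
# accuracy on the weighted features. Weights near zero effectively drop features.
X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(0)
POP, GENS, MUT = 30, 25, 0.1

def fitness(w):
    return cross_val_score(KNeighborsClassifier(n_neighbors=3), X * w, y, cv=3).mean()

pop = rng.random((POP, X.shape[1]))                 # initial weights in [0, 1)
for _ in range(GENS):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]   # truncation selection: keep best half
    cut = rng.integers(1, X.shape[1], POP // 2)     # one-point crossover of random parent pairs
    pairs = parents[rng.integers(0, len(parents), (POP // 2, 2))]
    children = np.where(np.arange(X.shape[1]) < cut[:, None], pairs[:, 0], pairs[:, 1])
    children += MUT * rng.normal(size=children.shape)          # Gaussian mutation
    pop = np.vstack([parents, np.clip(children, 0.0, 1.0)])

best = pop[np.argmax([fitness(w) for w in pop])]
print("best feature weights:", np.round(best, 2))
```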

Using Memetic Algorithms for the Solution of Technical Problems

The intention of this paper is to help users of evolutionary algorithms adapt them more easily to the problem at hand. For many problems in the technical field it is not necessary to reach an optimal solution, but rather to reach a good solution in time. In many cases the solution is undetermined or no method exists to determine it. For such cases an evolutionary algorithm can be useful. This paper intends to give the user rules of thumb that make it easier to decide whether a problem is suitable for an evolutionary algorithm and how to design one.

Environmental Management in Arid Regions: The Question of Water

Only recently have water ethics received focused interest in the international water community. Because water is metabolically basic to life, an ethical dimension persists in every decision related to water. Water ethics at once express human society's approach to water and act as guidelines for behaviour. Ideas around water are often implicit and embedded as assumptions. They can be entrenched in behaviour and difficult to contest because they are difficult to "see". By explicitly revealing the ethical ideas underlying water-related decisions, human society's relationship with water, and with the natural systems of which water is part, can be contested and shifted, or accepted with conscious intention by human society. In recent decades, improved understanding of water's importance for ecosystem functioning and ecological services for human survival has been moving us beyond the growth-driven, supply-focused management paradigm. Environmental ethics challenge this paradigm by extending the ethical sphere to the environment and thus to water and water resources management per se. An ethical approach is a legitimate, important, and often ignored way to effect change in environmental decision making. This qualitative research explores principles of water ethics and examines the underlying ethical precepts of selected water policy examples. The constructed water ethic principles act as a set of criteria against which a policy comparison can be established. This study shows that water resources management is a progressive issue that embraces full public participation, a new planning model, and knowledge-generation initiatives.

Topology Preservation in SOM

The SOM has several beneficial features which make it a useful method for data mining. One of the most important features is the ability to preserve the topology of the data in the projection. Several measures can be used to quantify the goodness of the map in order to obtain the optimal projection, including the average quantization error and several topological errors. Many researchers have studied how topology preservation should be measured. One option is the topographic error, which considers the ratio of data vectors for which the first and second best matching units (BMUs) are not adjacent. In this work we present a study of the behaviour of the topographic error in different kinds of maps. We have found that this error penalizes rectangular maps, and we have studied the reasons why this happens. Finally, we suggest a new topological error to remedy this deficiency of the topographic error.
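
The topographic error described above can be computed as in the sketch below, where two BMUs are taken to be adjacent when they are neighbours on the map grid (the 8-neighbourhood convention is assumed here); the codebook and data are random placeholders.

```python
import numpy as np

# Topographic error: fraction of data vectors whose first and second best
# matching units (BMUs) are not adjacent on the map grid.
def topographic_error(data, codebook, grid_positions):
    """codebook: (units, dim) weight vectors; grid_positions: (units, 2) map coordinates."""
    errors = 0
    for x in data:
        d = np.linalg.norm(codebook - x, axis=1)
        bmu1, bmu2 = np.argsort(d)[:2]                     # first and second BMU
        if np.abs(grid_positions[bmu1] - grid_positions[bmu2]).max() > 1:
            errors += 1                                    # not neighbours (8-neighbourhood)
    return errors / len(data)

rows, cols, dim = 6, 10, 3                                 # a rectangular 6x10 map
grid = np.array([(r, c) for r in range(rows) for c in range(cols)])
rng = np.random.default_rng(2)
codebook = rng.random((rows * cols, dim))                  # placeholder trained weights
data = rng.random((500, dim))
print(topographic_error(data, codebook, grid))
```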

Deterministic Random Number Generators for Online Applications

Cryptography, image watermarking and e-banking are filled with apparent oxymora and paradoxes. Random sequences are used as keys to encrypt information, as watermarks during embedding, and again to extract the watermark during detection. Keys are also heavily used in 24x7x365 banking operations. A deterministic random sequence is therefore very useful for online applications. In order to obtain the same random sequence, we need to supply the same seed to the generator. Many researchers have used Deterministic Random Number Generators (DRNGs) for cryptographic applications and Pseudo Noise (PN) random sequences for watermarking. Even though PN sequences have some weaknesses under attack, the research community has used them mostly in digital watermarking. On the other hand, DRNGs have not been widely used in online watermarking due to their computational complexity and lack of robustness. Therefore, we have developed a new design for generating a DRNG using a Pi-series, to make it useful for online cryptographic, digital watermarking and banking applications.
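
The determinism property the scheme relies on is illustrated below with a generic seeded generator (this is not the proposed Pi-series DRNG, and Python's random module is not cryptographically secure): the same seed reproduces the same key sequence at the embedder and at the detector.

```python
import random

# Illustration of seed-determinism only: a generic seeded generator, NOT the
# Pi-series DRNG proposed in the paper, and not cryptographically secure.
def key_sequence(seed, length=8):
    gen = random.Random(seed)                   # deterministic, seed-driven generator
    return [gen.getrandbits(8) for _ in range(length)]

print(key_sequence(20240101))
print(key_sequence(20240101) == key_sequence(20240101))   # True: same seed, same sequence
print(key_sequence(20240102) == key_sequence(20240101))   # False: different seed
```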

Effect of Crude Oil Intoxication on Antioxidant and Marker Enzymes of Tissue Damage in Rat Liver

The objective of the present study was to examine the dose-response relationships between antioxidant parameters and liver contaminant levels of Kazakhstan light crude oil (KLCO) in albino rats. The animals were repeatedly exposed, by intraperitoneal injection, to low dosages (0.5–1.5 ml/kg) of KLCO. Rats exposed to these dose levels did not show any apparent symptoms of intoxication. Serum aminotransferases increased significantly (p

Sustainability Policies and Corporate Social Responsibility (CSR): Ergonomics Contribution Regarding Work in Companies

The growing importance of sustainability in corporate policies represents a great opportunity for workers to gain more consideration, with great benefits to their well-being. Sustainable work is believed to be work which improves the organization's performance and fosters professional development as well as workers' health. In a multiple case study based on document research, information was sought about work activities and about the sustainability or corporate social responsibility (CSR) policies disseminated by corporations. All the companies devoted attention to work activities and delivered a good amount of information about them. Nevertheless, the information presented was generic, all the actions developed were top-down, and there was no information about the impact of the changes aimed at sustainability on workers' activities. It was found that the companies seemed to be at an early stage. In the future, they need to show more commitment through concrete goals: they must be aware that workers contribute directly to the corporations' sustainability. This would allow room for Ergonomics and Work Psychodynamics to be incorporated and to be useful to both companies and society, so as to promote and ensure work sustainability.