A Model for the Study of Defects in Rolling Element Bearings at Higher Speed by Vibration Signature Analysis

The vibrations produced by a single point defect on various parts of a bearing under constant radial load are predicted using a theoretical model. The model accounts for variation in the response due to the bearing dimensions, the rotating frequency, and the distribution of the load. Excitation forces are generated when defects on the races strike the rolling elements. For an outer ring defect, the generated pulses have the periodicity of the outer ring defect frequency, whereas for an inner ring defect the pulses have the periodicity of the inner ring defect frequency. A physical model of the system has been prepared. The different defect frequencies are obtained and are used to find the amplitudes of vibration due to excitation of the bearing parts. An increase in the radial load or in the severity of the defect produces a significant change in the bearing signature characteristics.
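
For reference, the characteristic defect frequencies mentioned above are usually obtained from standard bearing kinematics; assuming a stationary outer ring and pure rolling (the notation here is generic, not necessarily the paper's), the outer and inner ring defect frequencies are

    f_{BPFO} = \frac{N_b}{2} f_r \left(1 - \frac{d}{D}\cos\alpha\right), \qquad
    f_{BPFI} = \frac{N_b}{2} f_r \left(1 + \frac{d}{D}\cos\alpha\right),

where N_b is the number of rolling elements, f_r the shaft rotation frequency, d the rolling element diameter, D the pitch diameter and \alpha the contact angle.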

Study of the Optical Properties of Glutathione-Capped Gold Nanoparticles Using a Linker (MHDA) by Fourier Transform Infrared Spectroscopy and Surface Enhanced Raman Scattering

16-Mercaptohexadecanoic acid (MHDA) and the tripeptide glutathione conjugated with gold nanoparticles (Au-NPs) are characterized by Fourier Transform Infrared (FTIR) spectroscopy combined with Surface-Enhanced Raman Scattering (SERS) spectroscopy. The Surface Plasmon Resonance (SPR) technique based on FTIR spectroscopy has become an important tool in biophysics and is promising for the study of organic compounds. The FTIR spectrum of MHDA shows a line at 2500 cm⁻¹ attributed to the thiol group, which is modified by the presence of Au-NPs, suggesting the formation of a bond between the thiol group and gold. We can also observe peaks originating from characteristic chemical groups. A Raman spectrum of the same sample is also promising. Our preliminary experiments confirm that the SERS effect takes place for MHDA connected with Au-NPs and enables us to detect a small number (less than 10⁶ cm⁻²) of MHDA molecules. The combination of the FTIR and SERS spectroscopic methods enables the study of the optical properties of Au-NPs and immobilized biomolecules in the context of bio-nano-sensors.

Enhanced Image Transmission Based on DWT with a Pixel Interleaver

The recent growth in multimedia transmission over wireless communication systems creates the challenge of protecting data from loss due to wireless channel effects. Images are corrupted by noise and fading when transmitted over a wireless channel. The image is transmitted block by block, and under severe fading entire image blocks can be damaged. The aim of this paper arises from the need to enhance digital images at the wireless receiver side. A Boundary Interpolation (BI) algorithm using wavelets is adapted here to reconstruct a lost block of the image at the receiver, based on the correlation between the lost block and its neighbors. A new technique combining the wavelet-based Boundary Interpolation (BI) algorithm with a pixel interleaver has been implemented. The pixel interleaver distributes the pixels of the original image to new positions before the image is transmitted, so that a block lost in the wireless channel affects only isolated individual pixels. The lost pixels can then be recovered at the receiver side using the wavelet-based Boundary Interpolation (BI) algorithm. The results show that the proposed algorithm, boundary interpolation (BI) using wavelets with a pixel interleaver, performs better in terms of MSE and PSNR.
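
As a minimal sketch of the interleaving step described above (the pseudo-random permutation, shared seed and NumPy-based interface are illustrative assumptions, not details taken from the paper):

    import numpy as np

    def interleave(image, seed=0):
        """Scatter pixels to pseudo-random positions before block-wise transmission."""
        rng = np.random.default_rng(seed)
        perm = rng.permutation(image.size)       # permutation shared with the receiver
        return image.flatten()[perm].reshape(image.shape), perm

    def deinterleave(received, perm):
        """Restore the original pixel order at the receiver."""
        flat = np.empty(received.size, dtype=received.dtype)
        flat[perm] = received.flatten()
        return flat.reshape(received.shape)

A transmission block lost in the channel then corresponds to scattered, isolated pixels in the de-interleaved image, which the wavelet-based boundary interpolation can estimate from their intact neighbours.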

Relationship between Communication Effectiveness and the Extent of Communication among Organizational Units

This contribution deals with the relationship between communication effectiveness and the extent of communication among organizational units. To facilitate communication between employees and to increase the level of understanding, knowledge of communication tools is necessary. Recent experience has shown that personal communication is critical for the smooth running of companies and cannot be fully replaced by any form of technical communication device. The outcomes of the research on the relationship between the extent of communication among organizational units and its efficiency are presented below.

Clustered Signatures for Modeling and Recognizing 3D Rigid Objects

This paper describes a probabilistic method for three-dimensional object recognition using a shared pool of surface signatures. The technique uses flatness, orientation, and convexity signatures that encode the surface of a free-form object into three discriminative vectors, and then creates a shared pool of data by clustering the signatures using a distance function. The method applies Bayes' rule for the recognition process, and it is extensible to a large collection of three-dimensional objects.
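
The recognition step can be summarised by Bayes' rule: for an observed signature vector s and candidate objects O_k (the likelihood model itself is not restated here),

    P(O_k \mid s) = \frac{P(s \mid O_k)\, P(O_k)}{\sum_j P(s \mid O_j)\, P(O_j)},

and the object with the largest posterior probability is reported.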

A Voltage Based Maximum Power Point Tracker for Low Power and Low Cost Photovoltaic Applications

This paper describes the design of a voltage-based maximum power point tracker (MPPT) for photovoltaic (PV) applications. Of the various MPPT methods, the voltage-based method is considered the simplest and most cost-effective. The major disadvantage of this method is that the PV array is disconnected from the load while its open-circuit voltage is sampled, which inevitably results in power loss. Another disadvantage, in the case of rapid irradiance variation, is that if the duration between two successive samplings, called the sampling period, is too long, there is a considerable loss. This is because the output voltage of the PV array follows an unchanged reference during one sampling period: once a maximum power point (MPP) is tracked and a change in irradiation occurs between two successive samplings, the new MPP is not tracked until the next sampling of the PV array voltage. This paper proposes an MPPT circuit in which both the sampling interval of the PV array voltage and the sampling period have been shortened. The sample-and-hold circuit has also been simplified. The proposed circuit does not use a microcontroller or a digital signal processor and is thus suitable for low-cost and low-power applications.
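
In the voltage-based (fractional open-circuit voltage) method discussed above, the reference operating voltage is taken as a fixed fraction of the sampled open-circuit voltage,

    V_{ref} \approx k \, V_{oc},

where k is an empirically determined constant for the particular PV module, commonly quoted in the range of about 0.7 to 0.8; this range is typical of the literature and is not a value taken from the proposed circuit.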

Geostatistical Analysis and Mapping of Ground-Level Ozone in a Medium-Sized Urban Area

Ground-level tropospheric ozone is one of the air pollutants of greatest concern. It is mainly produced by photochemical processes involving nitrogen oxides and volatile organic compounds in the lower parts of the atmosphere. Ozone levels become particularly high in regions close to large emissions of ozone precursors and during summer, when stagnant meteorological conditions with high insolation and high temperatures are common. In this work, some results of a study on urban ozone distribution patterns in the city of Badajoz, the largest and most industrialized city in the Extremadura region (southwest Spain), are shown. Fourteen sampling campaigns, at least one per month, were carried out with an automatic portable analyzer to measure ambient air ozone concentrations during periods selected for conditions favourable to ozone production. The measured ozone data were then analyzed using geostatistical techniques to evaluate the ozone distribution across the city. First, the exploratory analysis of the data revealed that they were normally distributed, which is a desirable property for the subsequent stages of the geostatistical study. Secondly, in the structural analysis of the data, theoretical spherical models provided the best fit for all monthly experimental variograms. The parameters of these variograms (sill, range and nugget) revealed that the maximum distance of spatial dependence is between 302 and 790 m and that the variable, air ozone concentration, is not evenly distributed over short distances. Finally, predictive ozone maps were derived for all points of the experimental study area using geostatistical algorithms (kriging). High prediction accuracy was obtained in all cases, as cross-validation showed. Useful information for hazard assessment was also provided by probability maps based on kriging interpolation and the kriging standard deviation.
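
For reference, the spherical variogram model fitted in the structural analysis has the standard form (generic geostatistical notation, not necessarily the paper's),

    \gamma(h) = c_0 + c\left(\frac{3h}{2a} - \frac{h^3}{2a^3}\right) \quad \text{for } 0 < h \le a, \qquad \gamma(h) = c_0 + c \quad \text{for } h > a,

where c_0 is the nugget, c_0 + c the sill and a the range, the latter corresponding to the 302-790 m distances of spatial dependence reported above.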

Towards Modeling for Crashes: A Low-Cost Adaptive Methodology for Karachi

The aim of this paper is to discuss a low-cost methodology that can predict traffic flow conflicts and quantitatively rank crash expectancies (based on relative probability) for various traffic facilities. The paper focuses on the application of statistical distributions to model traffic flow and of Monte Carlo techniques to simulate traffic, and discusses how to create a tool to predict the possibility of a traffic crash. A low-cost data collection methodology is discussed for the prevailing heterogeneous traffic flow, and a GIS platform is proposed to represent thematically the simulated traffic flow and the probability of a crash. Furthermore, the dynamism of the model is discussed with reference to its adaptability, adequacy, economy, and efficiency, to ensure adoption.
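
A minimal sketch of the Monte Carlo idea, under illustrative assumptions only (exponentially distributed headways and a 1.5 s critical gap are placeholders, not parameters from the study):

    import numpy as np

    def conflict_probability(flow_veh_per_h, critical_gap_s=1.5, n_samples=100_000, seed=0):
        """Estimate the probability that a following vehicle arrives within the critical gap."""
        rng = np.random.default_rng(seed)
        mean_headway_s = 3600.0 / flow_veh_per_h          # mean time between arrivals
        headways = rng.exponential(mean_headway_s, n_samples)
        return float(np.mean(headways < critical_gap_s))

    print(conflict_probability(1200))   # e.g. a flow of 1200 veh/h

Relative probabilities of this kind are what the proposed GIS platform would represent thematically.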

Robotic Hands: Design Review and Proposal of New Design Process

In this paper we intend to ascertain the state of the art of multi-fingered end-effectors, also known as robotic hands or dexterous robot hands, and propose an experimental setup for an innovative task-based design approach involving cutting-edge motion capture technologies. After an initial description of the capabilities and complexity of a human hand when grasping objects, intended to point out the importance of replicating them, we analyze the mechanical and kinematic structure of some important hands developed around the world over the last three decades and review the actuation and sensing technologies used. Finally, we describe a new design philosophy, proposing an experimental setup for its first stage that uses recent developments in human body motion capture systems and might lead to lighter and ever more dexterous robotic hands.

Contingent Pay and Experience with Its Use by Organizations of the Czech Republic Operating in the Field of Environmental Protection

Apart from basic wages or salary, employee benefits and intangible elements, one part of an employee's total reward is so-called contingent (variable) pay. Contingent pay is linked to the performance, contribution, competency or skills of individual employees, to team or company-wide performance, or to a combination of several of these. The main aim of this article is to define contingent pay on the basis of the available information, describe the reasons for its implementation and the arguments for and against this type of remuneration, and also present information about the extent and level of its utilization by organizations of the Czech Republic operating in the field of environmental protection, as well as their practical experience with this type of remuneration.

Using Structural Equation Modeling in Causal Relationship Design for the Balanced Scorecard's Strategic Map

During the 1980s, management accounting researchers described the increasing irrelevance of traditional control and performance measurement systems. The Balanced Scorecard (BSC) is a critical business tool for many organizations. It is a performance measurement system which translates mission and strategy into objectives. The strategy map approach is a development of the BSC in which certain necessary causal relations must be established. To identify these relations, experts usually rely on experience; it is also possible to use regression for the same purpose. Structural Equation Modeling (SEM), one of the most powerful methods of multivariate data analysis, obtains more appropriate results than traditional methods such as regression. In the present paper, we propose SEM for the first time to identify the relations between objectives in the strategy map, together with a test to measure the importance of the relations. In SEM, factor analysis and hypothesis testing are carried out within the same analysis. SEM is known to be better than other techniques at supporting analysis and reporting. Our approach provides a framework which permits experts to design the strategy map by applying a comprehensive and scientific method together with their experience. This scheme is therefore a more reliable method in comparison with previously established methods.
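
In standard LISREL-type notation (used here only to indicate the class of model, not the authors' exact specification), the structural part of such a model can be written as

    \eta = B\eta + \Gamma\xi + \zeta,

where \eta and \xi are the endogenous and exogenous latent objectives, B and \Gamma contain the path coefficients whose estimates and significance tests indicate which causal links to draw in the strategy map, and \zeta collects the structural residuals.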

Distributed Splay Suffix Arrays: A New Structure for Distributed String Search

As a structure for processing string problems, the suffix array is widely known and extensively studied. But if the string access pattern follows the "90/10" rule, a suffix array cannot take advantage of the fact that we often search for something that we have just found. Although the splay tree is an efficient data structure for small documents when the access pattern follows the "90/10" rule, it requires many structures and an excessive amount of pointer manipulation for efficiently processing and searching large documents. In this paper, we propose a new and conceptually powerful data structure for string search, called the splay suffix array (SSA). This data structure combines the features of the splay tree and the suffix array into a new approach which is suitable for implementation on both conventional and clustered computers.
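
For context, a plain (non-splayed) suffix array simply stores the lexicographic order of all suffix start positions; the naive construction and search below is only a baseline sketch of the structure that the proposed SSA augments with splay-tree caching:

    def build_suffix_array(text):
        """Naive construction: sort suffix start positions lexicographically."""
        return sorted(range(len(text)), key=lambda i: text[i:])

    def contains(text, sa, pattern):
        """Binary search for the pattern over the sorted suffixes."""
        lo, hi = 0, len(sa)
        while lo < hi:
            mid = (lo + hi) // 2
            if text[sa[mid]:sa[mid] + len(pattern)] < pattern:
                lo = mid + 1
            else:
                hi = mid
        return lo < len(sa) and text[sa[lo]:sa[lo] + len(pattern)] == pattern

    text = "mississippi"
    sa = build_suffix_array(text)
    print(contains(text, sa, "issi"))   # True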

Manufacturing Dispersions Based Simulation and Synthesis of Design Tolerances

The objective of this work, which is based on the simultaneous engineering approach, is to contribute to the development of a CIM tool for the synthesis of functional design dimensions expressed by average values and tolerance intervals. In this paper, the dispersions method known as the Δl method, which has proved reliable in the simulation of manufacturing dimensions, is used to develop a methodology for automating the simulation. The methodology is constructed around three procedures. The first procedure verifies the functional requirements by automatically extracting the functional dimension chains in the mechanical sub-assembly. A second procedure then optimizes the dispersions on the basis of the unknown variables. The third procedure uses the optimized values of the dispersions to compute the optimized average values and tolerances of the functional dimensions in the chains. A statistical and cost-based approach is integrated in the methodology in order to take account of the capabilities of the manufacturing processes and to distribute optimal values among the individual components of the chains.
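
For context, the dimension-chain relation underlying these procedures can be stated in generic notation (not the paper's): if a functional requirement C closes a chain of component dimensions l_i, then C = \sum_i \pm l_i and, in the worst-case (arithmetic) model, its dispersion is

    \Delta C = \sum_i \Delta l_i,

while a statistical model combines the dispersions as \Delta C = \sqrt{\sum_i \Delta l_i^2}. The optimization therefore distributes the individual \Delta l_i, within the capabilities of the manufacturing processes, so that \Delta C remains inside the specified functional tolerance.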

A Blind Digital Watermark in the Hadamard Domain

A new blind gray-level watermarking scheme is described. In the proposed method, the host image is first divided into 4×4 non-overlapping blocks. For each block, the first two AC coefficients of its Hadamard transform are estimated using the DC coefficients of its neighboring blocks. A gray-level watermark is then added to the estimated values. Since embedding the watermark does not change the DC coefficients, the watermark can be extracted by estimating the AC coefficients and comparing them with their actual values. Several experiments were carried out, and the results suggest the robustness of the proposed algorithm.
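
For illustration, the 4×4 Hadamard transform of a block can be computed as below; the 1/4 normalization and the choice of which coefficients count as the first two AC coefficients are assumptions of this sketch, not specifications from the paper:

    import numpy as np
    from scipy.linalg import hadamard

    H = hadamard(4)                          # 4x4 Hadamard matrix of +/-1 entries

    def hadamard_2d(block):
        """2-D Hadamard transform of a 4x4 block; coefficient [0, 0] is the DC term."""
        return H @ block @ H / 4.0

    block = np.arange(16, dtype=float).reshape(4, 4)
    coeffs = hadamard_2d(block)
    dc = coeffs[0, 0]                        # left unchanged by the embedding described above
    ac1, ac2 = coeffs[0, 1], coeffs[1, 0]    # assumed "first two AC coefficients"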

Design and Simulation of an Air-Fuel Ratio Control System for a Distributorless CNG Engine

This paper puts forward an air-fuel ratio control method using a PI controller. With the help of MATLAB/SIMULINK software, a mathematical model of the air-fuel ratio control system for a distributorless CNG engine is constructed. The objective is to maintain the cylinder-to-cylinder air-fuel ratio at a prescribed set point, determined primarily by the state of the Three-Way Catalyst (TWC), so that the pollutants in the exhaust are removed with the highest efficiency. The concurrent control of the air-fuel ratio under transient conditions is implemented with a Proportional-Integral (PI) controller. The simulation results indicate that the control method can readily eliminate air/fuel maldistribution and maintain the air/fuel ratio at stoichiometry within a minimum number of engine events.
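
The PI law used in such a loop has the familiar discrete form; the gains, sampling time and set point below are placeholders, not values from the paper (roughly 17.2 is a commonly cited stoichiometric air-fuel ratio for natural gas):

    class PIController:
        """Discrete PI controller: u = Kp*e + Ki*integral(e) dt."""
        def __init__(self, kp, ki, dt):
            self.kp, self.ki, self.dt = kp, ki, dt
            self.integral = 0.0

        def update(self, setpoint, measurement):
            error = setpoint - measurement           # e.g. desired AFR minus measured AFR
            self.integral += error * self.dt
            return self.kp * error + self.ki * self.integral

    pi = PIController(kp=0.5, ki=2.0, dt=0.01)        # placeholder gains and sampling time
    fuel_correction = pi.update(setpoint=17.2, measurement=16.8)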

An Investigation into Turbine Blade Tip Leakage Flows at High Speeds

The effect of the blade tip geometry of a high-pressure gas turbine on high-speed leakage flows is studied experimentally and computationally. For this purpose two simplified models are constructed: one models a flat blade tip and the second a cavity blade tip. Experimental results obtained in a transonic wind tunnel show the static pressure distribution along the tip wall and provide flow visualization. RANS computations were carried out to provide further insight into the mean flow behavior and to calculate the discharge coefficient, which is a measure of the flow leaking over the tip. It is shown that for both tip geometries the flow separates over the tip to form a separation bubble. The bubble is higher for the cavity tip, while a complete shock wave system of oblique waves ending in a normal wave can be seen for the flat tip. The discharge coefficient of the flat tip shows less dependence on the pressure ratio over the blade tip than that of the cavity tip. However, the discharge coefficient of the cavity tip is lower than that of the flat tip, showing a better ability to reduce the leakage flow and thus increase turbine efficiency.
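
The discharge coefficient quoted above is conventionally defined as the ratio of the measured leakage mass flow to the ideal (isentropic) mass flow through the same tip gap at the same pressure ratio,

    C_d = \frac{\dot{m}_{actual}}{\dot{m}_{ideal}},

so the lower C_d of the cavity tip corresponds directly to less leakage for a given driving pressure difference.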

MIM: A Species-Independent Approach for Classifying Coding and Non-Coding DNA Sequences in Bacterial and Archaeal Genomes

A number of competing methodologies have been developed to identify genes and classify DNA sequences into coding and non-coding sequences. This classification process is fundamental to gene finding and gene annotation tools and is one of the most challenging tasks in bioinformatics and computational biology. An information-theoretic measure based on mutual information has shown good accuracy in classifying DNA sequences into coding and non-coding. In this paper we describe a species-independent iterative approach that distinguishes coding from non-coding sequences using the mutual information measure (MIM). A set of sixty prokaryotes is used to extract universal training data. To facilitate comparisons with the published results of other researchers, a test set of 51 bacterial and archaeal genomes was used to evaluate MIM. The results demonstrate that MIM produces superior results while remaining species independent.
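
The mutual information at the core of MIM is the standard quantity for a pair of symbols X and Y with joint distribution p(x, y) (the particular pairing of nucleotide positions used by the authors is not restated here),

    I(X;Y) = \sum_{x}\sum_{y} p(x,y)\,\log_2\frac{p(x,y)}{p(x)\,p(y)},

with the probabilities estimated from symbol frequencies in the sequence window being classified.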

Digital Image Encryption Scheme Using Chaotic Sequences with a Nonlinear Function

In this study, an encryption system based on chaotic sequences is described. The system is used to encrypt digital image data for the purpose of secure image transmission. An image secure-communication scheme based on Logistic map chaotic sequences with a nonlinear function is proposed. The encryption and decryption keys are obtained from a one-dimensional Logistic map that generates the secret key used as the input of the nonlinear function. The receiver can recover the information from the received signal using identical key sequences through the inverse-system technique. The results of computer simulations indicate that the transmitted source image can be correctly and reliably recovered using the proposed scheme, even over a noisy channel. The performance of the system is discussed by evaluating the quality of the recovered image with and without channel noise.
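
A minimal sketch of the keystream generation and combination steps (the byte quantization and XOR combination shown here are common choices and are assumptions of this sketch, not necessarily the paper's exact nonlinear function):

    import numpy as np

    def logistic_keystream(x0, r, n):
        """Iterate the Logistic map x_{k+1} = r*x_k*(1 - x_k) and quantize to bytes."""
        x, stream = x0, np.empty(n, dtype=np.uint8)
        for i in range(n):
            x = r * x * (1.0 - x)
            stream[i] = int(x * 256) % 256
        return stream

    def apply_cipher(image_bytes, x0=0.3141592, r=3.99):     # (x0, r) play the role of the key
        key = logistic_keystream(x0, r, image_bytes.size)
        return image_bytes ^ key                  # XOR; decryption repeats the same operation

    img = np.random.randint(0, 256, size=(8, 8), dtype=np.uint8)
    cipher = apply_cipher(img.flatten()).reshape(img.shape)
    plain = apply_cipher(cipher.flatten()).reshape(img.shape)   # identical keys recover the image
    assert np.array_equal(plain, img)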

Hybridized Technique to Analyze Work-Stress-Related Data via the StressCafé

This paper presents an approach of hybridizing two or more artificial intelligence (AI) techniques, which are used to fuzzify the work-stress level ranking and categorize the rating accordingly. The use of two or more techniques (a hybrid approach) has been considered in this case, as combining different techniques may neutralize each other's weaknesses and generate a superior hybrid solution. Recent research has shown that there is a need for more valid and reliable tools for assessing work stress. Artificial intelligence techniques have therefore been applied here to provide a solution to a psychological application. An overview of the novel and autonomous interactive model for analysing work-stress that has been developed using multi-agent systems is also presented in this paper. The establishment of the intelligent multi-agent decision analyser (IMADA), using a hybridized technique of neural networks and fuzzy logic within the multi-agent based framework, is also described.
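
As a small illustration of the fuzzification step (the score ranges and linguistic labels below are hypothetical, not taken from IMADA or the StressCafé):

    def triangular(x, a, b, c):
        """Triangular membership function peaking at b, zero outside (a, c)."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def fuzzify_stress(score):
        """Map a 0-100 work-stress score to membership degrees (hypothetical ranges)."""
        return {
            "low":    triangular(score, -1, 0, 50),
            "medium": triangular(score, 25, 50, 75),
            "high":   triangular(score, 50, 100, 101),
        }

    print(fuzzify_stress(60))   # e.g. partly 'medium', partly 'high'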

Curriculum of Ethical Education in Slovakia

Ethical Education is a compulsory elective subject in Slovak primary and secondary schools. The objective of Ethical Education is the education of a personality with one's own identity, with interiorized ethical standards, with mature moral judgement and therefore with behaviour determined by one's own beliefs, and with a positive attitude to himself/herself and to other people, so that he/she is able to cooperate and to initiate cooperation. In the paper we describe the contents and the principles of Ethical Education. We also show that Ethical Education is a subject that supports primary socio-pathological prevention and education for citizenship. In this context we try to show that Ethical Education contributes to the education of good people who are aware of the necessity to respect social norms and are able to assume responsibility for their own behaviour in any situation, at present and in the future.