Design of a Low Cost Motion Data Acquisition Setup for Mechatronic Systems

Motion sensors are commonly used as valuable components in mechatronic systems; however, many mechatronic designs and applications that need motion sensors, especially high-tech systems, cost an enormous amount of money. Designing software for the communication protocol between the data acquisition card and the motion sensor is another issue that has to be solved. This study presents how to design a low cost motion data acquisition setup consisting of an MPU-6050 motion sensor (3-axis gyroscope and accelerometer) and an Arduino Mega2560 microcontroller. The design parameters are calibration of the sensor, identification of and communication between the sensor and the data acquisition card, and interpretation of the data collected by the sensor.
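
As an illustration of the interpretation step, the sketch below (a minimal example, not the authors' Arduino firmware) shows how raw 16-bit MPU-6050 readings might be offset-calibrated and converted to physical units; the ±2 g and ±250 °/s default full-scale sensitivities come from the MPU-6050 datasheet, while the read_raw_sample function is a hypothetical placeholder for the actual I2C driver.

```python
# Minimal sketch: converting raw MPU-6050 samples to physical units.
# Default full-scale ranges: +/-2 g (16384 LSB/g) and +/-250 deg/s (131 LSB per deg/s).
ACCEL_SENSITIVITY = 16384.0   # LSB per g at the +/-2 g range
GYRO_SENSITIVITY = 131.0      # LSB per deg/s at the +/-250 deg/s range

def read_raw_sample():
    """Hypothetical placeholder for the I2C driver returning six raw 16-bit values."""
    return {"ax": 812, "ay": -130, "az": 16500, "gx": 45, "gy": -12, "gz": 3}

def calibrate(samples):
    """Estimate static offsets by averaging samples taken while the sensor is at rest."""
    keys = ["ax", "ay", "az", "gx", "gy", "gz"]
    offsets = {k: sum(s[k] for s in samples) / len(samples) for k in keys}
    offsets["az"] -= ACCEL_SENSITIVITY  # at rest, the z axis should read +1 g
    return offsets

def interpret(raw, offsets):
    """Convert offset-corrected raw counts to g and deg/s."""
    return {
        "accel_g": tuple((raw[k] - offsets[k]) / ACCEL_SENSITIVITY for k in ("ax", "ay", "az")),
        "gyro_dps": tuple((raw[k] - offsets[k]) / GYRO_SENSITIVITY for k in ("gx", "gy", "gz")),
    }

offsets = calibrate([read_raw_sample() for _ in range(100)])
print(interpret(read_raw_sample(), offsets))
```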

Impact of the Decoder Connection Schemes on Iterative Decoding of GPCB Codes

In this paper we present a study of the impact of connection schemes on the performance of iterative decoding of Generalized Parallel Concatenated Block (GPCB) codes constructed from one-step majority-logic decodable (OSMLD) codes, and we propose a new connection scheme for decoding them. All iterative decoding connection schemes use a soft-input soft-output threshold decoding algorithm as a component decoder. Numerical results for GPCB codes transmitted over an Additive White Gaussian Noise (AWGN) channel are provided. They show that the proposed scheme outperforms Hagenauer's scheme and Lucas's scheme [1] and is slightly better than Pyndiah's scheme.

Removal of CO2 and H2S Using Aqueous Alkanolamine Solutions

This work presents a theoretical investigation of the simultaneous absorption of CO2 and H2S into aqueous solutions of MDEA and DEA. In this process the acid components react with the basic alkanolamine solution via an exothermic, reversible reaction in a gas/liquid absorber. The use of amine solvents for gas sweetening has been investigated using the process simulation programs HYSYS and ASPEN, with the Electrolyte NRTL model, the Amine Package, and the Amines (experimental) equation of state. The effects of temperature, circulation rate, amine concentration, packed column height, and Murphree efficiency on the rate of absorption were studied. When the lean amine flow rate and concentration increase, CO2 and H2S absorption also increase. As the inlet amine temperature to the absorber increases, CO2 and H2S penetrate to the upper stages of the absorber and the absorption of acid gases decreases. The CO2 concentration in the clean gas is greatly influenced by the packing height, whereas for the H2S concentration in the clean gas the packing height plays a minor role. HYSYS cannot estimate the Murphree efficiency correctly and applies the same value throughout, which is reflected in all the diagrams obtained with HYSYS. As the Murphree efficiency improves, the maximum absorber temperature decreases, the reaction zone moves towards the bottom stages of the absorber, and the absorption of acid gases increases.

Flagging Critical Components to Prevent Transient Faults in Real-Time Systems

This paper proposes the use of metrics in design space exploration that highlight where in the structure of the model, and at what point in its behaviour, prevention against transient faults is needed. Previous approaches to tackling transient faults focused on recovery after detection; almost no research has been directed towards preventive measures. In real-time systems, however, hard deadlines are performance requirements that absolutely must be met, and a missed deadline constitutes an erroneous action and a possible system failure. This paper proposes the use of metrics to assess the system design and flag where transient faults may have significant impact. These tools then allow the design to be changed to minimize that impact, and they also flag where particular design techniques, such as coding of communications or memories, need to be applied in later stages of design.

Integrating Low and High Level Object Recognition Steps by Probabilistic Networks

In pattern recognition applications, low-level segmentation and high-level object recognition are generally considered as two separate steps. This paper presents a method that bridges the gap between low-level and high-level object recognition. It is based on a Bayesian network representation and a network propagation algorithm. At the low level it uses a hierarchical structure of quadratic spline wavelet image bases. The method is demonstrated on a simple circuit diagram component identification problem.
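
To convey the basic idea of coupling low-level evidence with high-level hypotheses, the sketch below performs a single Bayesian update: hypothetical likelihoods of one low-level feature response under each component class are combined with class priors. It is a simplified assumption on our part, not the paper's actual network or wavelet front end, and all numbers are illustrative.

```python
# Minimal sketch of a single-node Bayesian update: combining a low-level
# feature likelihood with high-level class priors.  All numbers are
# hypothetical; the paper's network and wavelet features are not reproduced.
priors = {"resistor": 0.5, "capacitor": 0.3, "diode": 0.2}

# P(feature response | class) for one observed low-level feature (hypothetical values).
likelihoods = {"resistor": 0.10, "capacitor": 0.60, "diode": 0.30}

unnormalized = {c: priors[c] * likelihoods[c] for c in priors}
evidence = sum(unnormalized.values())
posterior = {c: p / evidence for c, p in unnormalized.items()}

print(posterior)  # the posterior now favours "capacitor"
```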

Satellite Thermal Control: Cooling by a Diphasic Loop

In space during functioning, a satellite will be heated up due to the behavior of its components such as power electronics. In order to prevent problems in the satellite, this heat has to be released in space thanks to the cooling system. This system consists of a loop heat pipe (LHP), in which a fluid streams through an evaporator and a condenser. In the evaporator, the fluid captures the heat from the satellite and evaporates. Then it flows to the condenser where it releases the heat and it condenses. In this project, the two mains parts of a cooling system are studied: the evaporator and the condenser. The study of the diphasic loop was done starting from digital simulations carried out under Matlab and Femlab.
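
As a rough order-of-magnitude illustration of evaporator sizing, the sketch below applies the simple energy balance Q = m_dot * h_fg; the heat load and latent heat are assumed values for illustration only, not data from the project's Matlab/Femlab simulations.

```python
# Minimal energy-balance sketch for a loop heat pipe evaporator: the mass flow
# rate needed to absorb a given heat load by evaporation, Q = m_dot * h_fg.
# Both numbers below are illustrative assumptions, not project data.
Q_load = 500.0   # heat dissipated by the electronics [W] (assumed)
h_fg = 1.2e6     # latent heat of vaporization of the working fluid [J/kg] (assumed)

m_dot = Q_load / h_fg   # required mass flow rate [kg/s]
print(f"required mass flow rate: {m_dot * 1000:.3f} g/s")
```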

Stochastic Subspace Modelling of Turbulence

Turbulence of the incoming wind field is of paramount importance to the dynamic response of civil engineering structures. Hence reliable stochastic models of the turbulence should be available, from which time series can be generated for dynamic response and structural safety analysis. In this paper an empirical cross-spectral density function for the along-wind turbulence component over the wind field area is taken as the starting point. The spectrum is spatially discretized in terms of a Hermitian cross-spectral density matrix for the turbulence state vector, which turns out not to be positive definite. Since the succeeding state space and ARMA modelling of the turbulence rely on the positive definiteness of the cross-spectral density matrix, the problem of the non-positive definiteness of such matrices is first addressed and suitable treatments are proposed. From the adjusted positive definite cross-spectral density matrix, a frequency response matrix is constructed which determines the turbulence vector as a linear filtration of Gaussian white noise. Finally, an accurate state space modelling method is proposed which allows selection of an appropriate model order and estimation, in one stage, of a state space model for the vector turbulence process incorporating its phase spectrum; its results are compared with a conventional ARMA modelling method.
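
The following is a minimal numerical sketch of the two central steps, assuming eigenvalue clipping as the positive-definiteness treatment (the paper proposes its own treatments, which may differ): a Hermitian cross-spectral density matrix is adjusted by truncating negative eigenvalues, and a frequency response matrix H is taken as a matrix square root of the adjusted matrix so that H H^H reproduces it, i.e. Gaussian white noise filtered by H has the adjusted cross-spectral density.

```python
import numpy as np

def adjust_csd(S, eps=0.0):
    """Clip negative eigenvalues of a Hermitian CSD matrix (one possible treatment)."""
    w, V = np.linalg.eigh(S)               # S is Hermitian, so eigh applies
    w_clipped = np.clip(w, eps, None)
    return (V * w_clipped) @ V.conj().T

def frequency_response(S_pd):
    """Matrix 'square root' H with H @ H^H = S_pd, via the eigendecomposition."""
    w, V = np.linalg.eigh(S_pd)
    return V * np.sqrt(np.clip(w, 0.0, None))

# Illustrative 3x3 Hermitian matrix (not a real turbulence spectrum) that is
# slightly indefinite, as may happen with empirical coherence/phase models.
S = np.array([[1.0, 0.9 + 0.1j, 0.2],
              [0.9 - 0.1j, 1.0, 0.9 + 0.1j],
              [0.2, 0.9 - 0.1j, 1.0]])
S_pd = adjust_csd(S)
H = frequency_response(S_pd)
print(np.allclose(H @ H.conj().T, S_pd))   # True
```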

Implementation of the SIP Express Router with Mediaproxy Method on VoIP

Voice over IP (VoIP) is a technology that carries voice traffic as data packets over an IP network. The network can be an intranet or the Internet. Phone calls using VoIP can cost less than half as much as PSTN calls, because the cost is determined by Internet access charges rather than by the telephone network. Session Initiation Protocol (SIP) is a signaling protocol at the application layer which serves to establish, modify, and terminate a multimedia session involving one or more users. SIP signaling carries SIP messages in text form that are used for session management by the SIP components, such as the User Agent, Registrar, Redirect Server, and Proxy Server. To build SIP communication, a SIP Express Router (SER) is required to receive SIP messages and handle their basic functions. Problems occur when traffic passes through NAT, which affects voice communication: either no audio is sent at all, or audio is sent in one direction only (half duplex). One way to traverse NAT is to use a mediaproxy that relays the RTP streams over randomly assigned ports.

Automation of Fishhooks Objective Measures

Fishing has always been an essential component of the Polynesians' life. Fishhooks, mostly made of pearl shell and found during archaeological excavations, are the most numerous artifacts related to this activity. From them, we try to reconstruct the ancient techniques of resource exploitation, both inside the lagoons and offshore. They can also be used as chronological and cultural indicators. The shapes and dimensions of these artifacts allow comparisons and classifications used both in a functional approach and in a chrono-cultural perspective. Hence it is very important for ethno-archaeologists to have reliable methods and standardized measurements of these artifacts. Such a reliable, objective, and standardized method has been proposed previously, but it cannot be applied manually because of the considerable time required to measure each fishhook and the quantity of fishhooks to measure (many hundreds). In this paper we propose a detailed acquisition protocol for fishhooks and an automation of every step of this method. We also provide experimental results obtained on fishhooks coming from three archaeological excavation sites.

Observations about the Principal Components Analysis and Data Clustering Techniques in the Study of Medical Data

Statistical analysis of medical data often requires the use of special techniques because of the particularities of these data. Principal components analysis and data clustering are two statistical data mining methods that are very useful in the medical field: the first as a method to decrease the number of studied parameters, and the second as a method to analyze the connections between diagnosis and the data about the patient's condition. In this paper we investigate the implications of a specific data analysis technique: data clustering preceded by a selection of the most relevant parameters, made using principal components analysis. Our assumption was that applying principal components analysis before data clustering, in order to select and classify only the most relevant parameters, improves the clustering accuracy, but the practical results showed the opposite: the clustering accuracy decreases, by a percentage approximately equal to the percentage of information loss reported by the principal components analysis.
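
The sketch below shows the evaluated pipeline using synthetic data and scikit-learn rather than the paper's medical dataset and clustering criterion: principal components analysis keeps only the directions explaining most of the variance, clustering is then run on the reduced data, and the retained variance is reported as a proxy for the information loss discussed above. On synthetic blob data the two accuracies are typically similar; the decrease reported in the paper was observed on the medical dataset.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

# Synthetic stand-in for patient data: 200 "patients", 10 parameters, 3 diagnoses.
X, y = make_blobs(n_samples=200, n_features=10, centers=3, random_state=0)

# Clustering on the full parameter set.
labels_full = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# PCA first: keep the components explaining ~90% of the variance, then cluster.
pca = PCA(n_components=0.90)
X_reduced = pca.fit_transform(X)
labels_pca = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_reduced)

print("information retained:", pca.explained_variance_ratio_.sum())
print("accuracy (ARI), full data:", adjusted_rand_score(y, labels_full))
print("accuracy (ARI), after PCA:", adjusted_rand_score(y, labels_pca))
```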

An Integrated Software Architecture for Bandwidth Adaptive Video Streaming

Video streaming over lossy IP networks is an important issue due to the heterogeneous structure of networks. The Internet infrastructure exhibits variable bandwidth, delay, congestion, and time-varying packet loss. Because of these variable attributes, video streaming applications should not only have good end-to-end transport performance but also a robust rate control and, furthermore, a multipath rate allocation mechanism. Therefore, to provide video streaming service quality, additional components such as bandwidth estimation and an adaptive rate controller should be taken into consideration. This paper gives an overview of the video streaming concept and bandwidth estimation tools, and then introduces a special architecture for bandwidth adaptive video streaming. A bandwidth estimation algorithm (pathChirp), optimized rate controllers, and a multipath rate allocation algorithm are considered as an all-in-one solution for the video streaming problem. This solution is directed and optimized by a decision center which is designed to obtain the maximum quality at the receiving side.
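
As a conceptual sketch of the adaptive rate control and multipath allocation components (not the actual pathChirp estimator or the optimized controllers of the architecture above), the following illustrates a smoothed rate controller that tracks an available-bandwidth estimate and splits the sending rate across paths in proportion to their estimated capacity; the smoothing factor and safety margin are assumed parameters.

```python
# Conceptual sketch: smoothed adaptive rate control plus proportional multipath
# allocation.  The bandwidth estimates stand in for a pathChirp-style estimator.
ALPHA = 0.3    # smoothing factor for the rate controller (assumed)
SAFETY = 0.85  # send below the estimate to absorb estimation error (assumed)

def update_rate(current_rate, estimated_bw):
    """Move the sending rate towards a safe fraction of the estimated bandwidth."""
    target = SAFETY * estimated_bw
    return current_rate + ALPHA * (target - current_rate)

def allocate_multipath(total_rate, path_estimates):
    """Split the total sending rate across paths in proportion to their estimates."""
    total_bw = sum(path_estimates.values())
    return {path: total_rate * bw / total_bw for path, bw in path_estimates.items()}

rate = 1000.0  # current sending rate [kbit/s]
for estimate in [2000.0, 1500.0, 800.0]:   # successive bandwidth estimates [kbit/s]
    rate = update_rate(rate, estimate)
    print(round(rate, 1), allocate_multipath(rate, {"path_A": 600.0, "path_B": 200.0}))
```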

A User Friendly Tool for Performance Evaluation of Different Reference Evapotranspiration Methods

Evapotranspiration (ET) is a major component of the hydrologic cycle, and its accurate estimation is essential for hydrological studies. In the past, various estimation methods have been developed for different climatological data, and the accuracy of these methods varies with climatic conditions. Reference crop evapotranspiration (ET0) is a key variable in procedures established for estimating evapotranspiration rates of agricultural crops. Values of ET0 are used with crop coefficients for many aspects of irrigation and water resources planning and management. Numerous methods are used for estimating ET0. As per the internationally accepted procedures outlined in the United Nations Food and Agriculture Organization's Irrigation and Drainage Paper No. 56 (FAO-56), the Penman-Monteith equation is recommended for computing ET0 from ground-based climatological observations. In the present study, seven methods have been selected for performance evaluation. User-friendly software has been developed using the Visual Basic programming language, which makes it possible to create a graphical environment with little coding. For the given data availability, the developed software estimates reference evapotranspiration for any given area and period for which data are available. The accuracy of the software has been checked against the examples given in FAO-56. The developed software is a user-friendly tool for estimating ET0 under different data availability and climatic conditions.
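
For reference, the sketch below implements the FAO-56 Penman-Monteith equation for daily ET0 from standard meteorological inputs. It follows the published FAO-56 formulation, but it is an independent illustration in Python, not the Visual Basic code of the tool described above, and the example inputs are arbitrary.

```python
import math

def fao56_penman_monteith(t_mean, rn, g, u2, ea, altitude):
    """Daily reference evapotranspiration ET0 [mm/day] per FAO-56.

    t_mean: mean air temperature [deg C]; rn: net radiation [MJ m-2 day-1];
    g: soil heat flux [MJ m-2 day-1]; u2: wind speed at 2 m [m/s];
    ea: actual vapour pressure [kPa]; altitude: station elevation [m].
    """
    es = 0.6108 * math.exp(17.27 * t_mean / (t_mean + 237.3))  # saturation vapour pressure [kPa]
    delta = 4098.0 * es / (t_mean + 237.3) ** 2                # slope of the vapour pressure curve
    pressure = 101.3 * ((293.0 - 0.0065 * altitude) / 293.0) ** 5.26
    gamma = 0.000665 * pressure                                # psychrometric constant [kPa/degC]
    numerator = 0.408 * delta * (rn - g) + gamma * (900.0 / (t_mean + 273.0)) * u2 * (es - ea)
    denominator = delta + gamma * (1.0 + 0.34 * u2)
    return numerator / denominator

# Illustrative inputs (not an FAO-56 worked example):
print(round(fao56_penman_monteith(t_mean=25.0, rn=13.0, g=0.0, u2=2.0, ea=2.0, altitude=100.0), 2))
```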

Hardware Centric Machine Vision for High Precision Center of Gravity Calculation

We present a hardware-oriented method for real-time measurement of an object's position in video. The targeted application area is light spots used as references for robotic navigation. Different algorithms for dynamic thresholding are explored in combination with component labeling and Center of Gravity (COG) calculation for the highest possible precision versus signal-to-noise ratio (SNR). The method was developed with low hardware cost in focus, requiring only one convolution operation for preprocessing of the data.
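
As a software reference for the processing chain (the paper targets a hardware implementation, so this is only a behavioural sketch with a fixed rather than dynamic threshold), the following thresholds an image, labels connected components, and computes each component's intensity-weighted centre of gravity.

```python
import numpy as np
from scipy import ndimage

def centre_of_gravity(image, threshold):
    """Threshold, label connected components, and return intensity-weighted centroids."""
    mask = image > threshold                 # a dynamic threshold would be computed per frame
    labels, n = ndimage.label(mask)          # connected-component labeling
    # Intensity-weighted centre of gravity of each labeled light spot.
    return ndimage.center_of_mass(image, labels, range(1, n + 1))

# Synthetic frame with one bright spot (stand-in for a navigation reference light).
frame = np.zeros((64, 64))
frame[30:33, 40:43] = [[10, 20, 10], [20, 40, 20], [10, 20, 10]]
print(centre_of_gravity(frame, threshold=5))   # approximately [(31.0, 41.0)]
```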

Achieving High Availability by Implementing Beowulf Cluster

A computer cluster is a group of tightly coupled computers that work together closely so that in many respects they can be viewed as a single computer. The components of a cluster are commonly, but not always, connected to each other through fast local area networks. Clusters are usually deployed to improve performance and/or availability over that provided by a single computer, while typically being much more cost-effective than single computers of comparable speed or availability. This paper proposes a way to implement a Beowulf cluster in order to achieve high performance as well as high availability.
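
As a minimal illustration of how work is distributed across the nodes of such a cluster (assuming an MPI installation with mpi4py, not the specific software stack used in the paper), the following splits a computation over all processes and reduces the partial results on the master node.

```python
# Minimal sketch of distributing work across cluster nodes with MPI (mpi4py assumed).
# Run with, e.g.: mpirun -np 8 python partial_sum.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this process's index
size = comm.Get_size()   # number of processes in the cluster

# Each process sums its own slice of the range 0..n-1.
n = 10_000_000
partial = sum(range(rank, n, size))

total = comm.reduce(partial, op=MPI.SUM, root=0)
if rank == 0:
    print("total:", total)   # equals n * (n - 1) // 2
```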

Capacity of Anchors in Structural Connections

When dealing with safety in structures, the connections between structural components play an important role. The robustness of a structure as a whole depends both on the load-bearing capacity of the structural components and on the structure's capacity to resist total failure, even though a local failure occurs in a component or in a connection between components. To avoid progressive collapse it is necessary to be able to carry out a design for the connections. In structures built with prefabricated components, a connection may be executed with anchors to withstand local failure of the connection. For the design of these anchors, a model is developed for connections in structures made of prefabricated autoclaved aerated concrete components. The design model takes into account the effect of anchors placed close to the edge, which may result in splitting failure. Further, the model is developed to consider the effect of reinforcement diameter and anchor depth. The model is analytical and theoretically derived assuming a static equilibrium stress distribution along the anchor. The theory is compared with laboratory tests covering the relevant parameters, and the model is refined and theoretically justified by analyzing the observed test results. The method presented can be used to improve safety in structures or even to optimize the design of the connections.

RTCoord: A Methodology to Design WSAN Applications

Wireless Sensor and Actor Networks (WSANs) constitute an emerging and pervasive technology that is attracting increasing interest in the research community for a wide range of applications. WSANs have two important requirements: coordination of interactions and real-time communication to perform correct and timely actions. This paper introduces a methodology to facilitate the task of the application programmer, focusing on the coordination and real-time requirements of WSANs. The proposed methodology uses a real-time component model, UM-RTCOM, which supports the design and implementation of WSAN applications using the component-oriented paradigm. This makes it possible to develop software components offering features such as reusability and adaptability, which are well suited to WSANs, since these are highly dynamic environments with rapidly changing conditions. In addition, a high-level coordination model based on tuple channels (TC-WSAN) is integrated into the methodology by providing a component-based specification of this model in UM-RTCOM; this allows both sensor-actor and actor-actor coordination requirements in WSANs to be satisfied. Finally, we present the design and implementation of an application that shows how easily the methodology can be used to develop WSAN applications.
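
To convey the flavour of tuple-channel coordination, the following is a generic Linda-style sketch, not the actual TC-WSAN or UM-RTCOM API: a sensor component publishes readings into a channel and an actor component takes the next tuple matching a pattern.

```python
import queue

class TupleChannel:
    """Generic Linda-style channel sketch: put tuples, take the next one matching a pattern."""
    def __init__(self):
        self._items = queue.Queue()

    def put(self, tup):
        self._items.put(tup)

    def take(self, pattern):
        """Return the first tuple whose fields equal the pattern (None acts as a wildcard)."""
        pending = []
        while True:
            tup = self._items.get()   # blocks until a tuple is available
            if all(p is None or p == v for p, v in zip(pattern, tup)):
                for t in pending:     # put back the non-matching tuples
                    self._items.put(t)
                return tup
            pending.append(tup)

# Sensor-actor coordination: a sensor publishes events, an actor reacts to temperature tuples.
channel = TupleChannel()
channel.put(("temperature", "node-3", 41.5))
channel.put(("humidity", "node-7", 0.55))
event = channel.take(("temperature", None, None))
print("actor reacts to:", event)
```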

Web-Based Architecture of a System for Design Assessment of Night Vision Devices

Nowadays night vision devices are widely used in both military and civil applications, and the variety of night vision applications requires a variety of night vision device designs. A web-based architecture of a software system for design assessment of night vision devices before production is developed. The proposed architecture of the web-based system is based on the application of a mathematical model for the design of night vision devices. An algorithm with two components, one for iterative design and one for intelligent design, is developed and integrated into the system architecture. The iterative component suggests compatible module combinations to choose from. The intelligent component provides compatible combinations of modules satisfying given user requirements on the device parameters. The proposed web-based architecture of a system for design assessment of night vision devices is tested via a prototype of the system. The testing showed the applicability of both the iterative and the intelligent components of the algorithm.
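
A minimal sketch of the two algorithm components is given below, using hypothetical module catalogues, compatibility rules, and a placeholder performance estimate (the actual mathematical model of the night vision device is not reproduced): the iterative part enumerates compatible module combinations, and the intelligent part keeps only those meeting a user requirement on a device parameter.

```python
from itertools import product

# Hypothetical module catalogues; "gen" denotes the image intensifier tube generation.
objectives = [{"name": "O1", "f_number": 1.2}, {"name": "O2", "f_number": 1.6}]
tubes = [{"name": "T1", "gen": 2, "gain": 25000}, {"name": "T2", "gen": 3, "gain": 50000}]
oculars = [{"name": "E1", "magnification": 1.0}, {"name": "E2", "magnification": 3.0}]

def compatible(obj, tube, ocu):
    """Hypothetical compatibility rule between modules."""
    return not (tube["gen"] == 2 and obj["f_number"] > 1.4)

def estimated_range(obj, tube, ocu):
    """Hypothetical stand-in for the design-assessment model's range prediction [m]."""
    return tube["gain"] / 1000.0 * ocu["magnification"] / obj["f_number"] * 10.0

# Iterative component: enumerate all compatible module combinations.
combos = [c for c in product(objectives, tubes, oculars) if compatible(*c)]

# Intelligent component: keep combinations meeting a user requirement on the range.
required_range = 400.0
suitable = [c for c in combos if estimated_range(*c) >= required_range]
for obj, tube, ocu in suitable:
    print(obj["name"], tube["name"], ocu["name"], round(estimated_range(obj, tube, ocu)))
```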

A Comparison of Grey Model and Fuzzy Predictive Model for Time Series

The prediction of meteorological parameters at a meteorological station is an interesting and open problem. A first-order linear dynamic model, GM(1,1), is the main component of grey system theory. The grey model requires only a few previous data points in order to make a real-time forecast. In this paper, we consider the daily average ambient temperature as a time series and apply the grey model GM(1,1) to local (short-term) prediction of the temperature. In the same case study we use a fuzzy predictive model for global prediction. We conclude the paper with a comparison between the local and global prediction schemes.
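
For concreteness, the sketch below is a standard GM(1,1) implementation (the textbook formulation, not necessarily the exact variant used in the paper): the series is accumulated, the development coefficient a and grey input b are estimated by least squares, and forecasts are produced from the whitened equation and restored by inverse accumulation. The temperature values in the example are illustrative, not the paper's data.

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Standard GM(1,1) grey model: fit on the positive series x0, forecast `steps` ahead."""
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    x1 = np.cumsum(x0)                          # 1-AGO accumulated series
    z1 = 0.5 * (x1[1:] + x1[:-1])               # background values
    B = np.column_stack([-z1, np.ones(n - 1)])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]  # development coefficient, grey input

    def x1_hat(k):                               # whitened-equation solution (0-based index k)
        return (x0[0] - b / a) * np.exp(-a * k) + b / a

    # Restore the original-series forecast by inverse accumulation (1-IAGO).
    ks = np.arange(n, n + steps)
    return x1_hat(ks) - x1_hat(ks - 1)

# Example with daily average temperatures (illustrative values):
temps = [18.2, 18.9, 19.4, 20.1, 20.6]
print(gm11_forecast(temps, steps=2))             # short-term local forecast
```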

M-band Wavelet and Cosine Transform Based Watermark Algorithm Using Randomization and Principal Component Analysis

Computational techniques derived from digital image processing are playing a significant role in the security and digital copyright protection of multimedia and visual arts, and this technology has a broad impact within the domain of computing. This research presents a discrete M-band wavelet transform (MWT) and discrete cosine transform (DCT) based watermarking algorithm incorporating principal component analysis (PCA). The proposed algorithm is expected to achieve higher perceptual transparency. Specifically, the developed watermarking scheme can successfully resist common signal processing attacks, such as geometric distortions and Gaussian noise. In addition, the proposed algorithm can be parameterized, resulting in greater security. To meet these requirements, the image is transformed by a combination of MWT and DCT. In order to further improve security, we randomize the watermark image to create three code books. During watermark embedding, PCA is applied to the coefficients in the approximation sub-band. Finally, the first few principal component bands represent an excellent domain for inserting the watermark.

Mechanism of Alcohol Related Disruption of the Error Monitoring and Processing System

The error monitoring and processing system (EMPS) is located in the substantia nigra of the midbrain, the basal ganglia, and the cortex of the forebrain, and plays a leading role in error detection and correction. The main components of the EMPS are the dopaminergic system and the anterior cingulate cortex. Although recent studies show that alcohol disrupts the EMPS, the ways in which alcohol affects this system are poorly understood. Based on current literature data, we suggest here a hypothesis of an alcohol-related, glucose-dependent system of error monitoring and processing, which holds that the disruption of the EMPS is related to the competency of glucose homeostasis regulation, which in turn may determine the dopamine level as a major component of the EMPS. Alcohol may indirectly disrupt the EMPS by affecting the dopamine level through disorders in blood glucose homeostasis regulation.