Low Jitter ADPLL based Clock Generator for High Speed SoC Applications

An efficient architecture for a low-jitter All Digital Phase Locked Loop (ADPLL) suitable for high-speed SoC applications is presented in this paper. The ADPLL is designed using standard cells and described in a Hardware Description Language (HDL). Implemented in a 90 nm CMOS process, the ADPLL can operate from 10 to 200 MHz and achieves worst-case frequency acquisition in 14 reference clock cycles. Simulation results show that the PLL has a cycle-to-cycle jitter of 164 ps and a period jitter of 100 ps at 100 MHz. Since the digitally controlled oscillator (DCO) achieves both high resolution and a wide frequency range, it can meet the demands of system-level integration. The proposed ADPLL can easily be ported to different processes in a short time, reducing the design time and design complexity of the ADPLL and making it very suitable for System-on-Chip (SoC) applications.
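The abstract does not spell out the acquisition algorithm; a binary search over the DCO control word is one common way an ADPLL reaches lock within a bounded number of reference cycles. The Python sketch below is an illustration under that assumption only: the idealized DCO model, the 8-bit control word and the frequency range are hypothetical, not the paper's design.

```python
# Minimal sketch (assumption): binary-search frequency acquisition for a DCO.
# 'dco_freq' is a hypothetical, idealized linear DCO model; the 8-bit control
# word and the 10-200 MHz range are illustrative only.

def dco_freq(code, f_min=10e6, f_max=200e6, bits=8):
    """Map a digital control word to an output frequency (idealized linear DCO)."""
    return f_min + (f_max - f_min) * code / (2**bits - 1)

def acquire(f_ref, bits=8):
    """Binary-search the DCO control word; settles in about 'bits' reference cycles."""
    code, step = 2**(bits - 1), 2**(bits - 2)
    for _ in range(bits):
        if dco_freq(code) < f_ref:
            code += step          # DCO too slow: increase the control word
        else:
            code -= step          # DCO too fast: decrease the control word
        step = max(step // 2, 1)
    return code

if __name__ == "__main__":
    code = acquire(100e6)
    print(f"control word {code}, DCO frequency {dco_freq(code)/1e6:.2f} MHz")
```

With an n-bit control word such a search settles in roughly n reference cycles, which is consistent in spirit with the bounded acquisition time reported above.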

Improvement of Semen Quality in Holstein Bulls during Heat Stress by Supplementing Omega-3 Fatty Acids

The aim of the current study was to investigate changes in the quality parameters of Holstein bull semen during heat stress and the effect of feeding a source of omega-3 fatty acids during this period. Samples were obtained from 19 Holstein bulls during the expected period of heat stress in Iran (June to September 2009). The control group (n=10) was fed a standard concentrate feed, while the treatment group (n=9) received the same feed top-dressed with 100 g of an omega-3-enriched nutraceutical. Semen quality was assessed on ejaculates collected after 1, 5, 9 and 12 weeks of supplementation. Computer-assisted assessment of sperm motility, viability (eosin-nigrosin) and the hypo-osmotic swelling test (HOST) were conducted. Heat stress affected sperm quality parameters by weeks 5 and 9 (p

A Perceptually Optimized Wavelet Embedded Zero Tree Image Coder

In this paper, we propose a Perceptually Optimized Embedded ZeroTree Image Coder (POEZIC) that applies perceptual weighting to the wavelet transform coefficients prior to the SPIHT encoding algorithm, in order to reach a targeted bit rate with a perceptual quality improvement over the coding quality obtained using the SPIHT algorithm alone. The paper also introduces a new objective quality metric based on a psychovisual model that integrates the properties of the HVS, which plays an important role in our POEZIC quality assessment. Our POEZIC coder is based on a vision model that incorporates various masking effects of human visual system (HVS) perception. Thus, our coder weights the wavelet coefficients based on that model and attempts to increase the perceptual quality for a given bit rate and observation distance. The perceptual weights for all wavelet subbands are computed based on 1) luminance masking and contrast masking, 2) the contrast sensitivity function (CSF), used to achieve the perceptual decomposition weighting, and 3) the Wavelet Error Sensitivity (WES), used to reduce perceptual quantization errors. The new perceptually optimized codec has the same complexity as the original SPIHT technique. However, the experimental results show that our coder achieves very good performance in terms of quality measurement.
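As an illustration of the kind of subband weighting described above, the Python sketch below scales wavelet detail subbands by level-dependent factors before any zerotree/SPIHT coding. The wavelet choice and the weight values are placeholders, not the CSF/WES weights actually used by POEZIC.

```python
# Minimal sketch (assumption): CSF-style perceptual weighting of wavelet subbands
# prior to embedded zerotree/SPIHT coding. The per-level weights below are
# illustrative placeholders, not the CSF/WES values of POEZIC.
import numpy as np
import pywt

def perceptual_weighting(image, wavelet="bior4.4", levels=3):
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    # Hypothetical weights: coarse (low-frequency) subbands matter more to the HVS.
    level_weights = {1: 0.6, 2: 0.8, 3: 1.0}   # level 3 = coarsest detail band
    weighted = [coeffs[0]]                      # approximation band left unchanged
    for i, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
        w = level_weights.get(levels - i + 1, 1.0)
        weighted.append((w * cH, w * cV, w * cD))
    return weighted

if __name__ == "__main__":
    img = np.random.rand(64, 64)
    wc = perceptual_weighting(img)
    print([c[0].shape for c in wc[1:]])
```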

SOA-Based Mobile Application for Crime Control in Thailand

Crime is a major societal problem for most of the world's nations. Consequently, the police need to develop new methods to improve their efficiency in dealing with ever-increasing crime rates. Two of the common difficulties that the police face in crime control are crime investigation and the provision of crime information to the general public to help them protect themselves. Crime control in police operations involves the use of spatial data, crime data and related data from different organizations (depending on the nature of the analysis to be made). These types of data are collected from several heterogeneous sources in different formats and from different platforms, resulting in a lack of standardization. Moreover, there is no standard framework for crime data collection, integration and dissemination through mobile devices. An investigation into the current situation in crime control was carried out to identify what is needed to resolve these issues. This paper proposes and investigates the use of service-oriented architecture (SOA) and mobile spatial information services in crime control. SOA plays an important role in crime control as an appropriate way to support data exchange and model sharing among heterogeneous sources. Crime control also needs to facilitate mobile spatial information services in order to exchange, receive, share and release location-based information to mobile users anytime and anywhere.

Methane and Other Hydrocarbon Gas Emissions Resulting from Flaring in Kuwait Oilfields

Air pollution is a major environmental health problem, affecting developed and developing countries around the world. Increasing amounts of potentially harmful gases and particulate matter are being emitted into the atmosphere on a global scale, resulting in damage to human health and the environment. Petroleum-related air pollutants can have a wide variety of adverse environmental impacts. In the crude oil production sector, there is a strong need for thorough knowledge of the gaseous emissions resulting from the flaring of associated gas of known composition, on a daily basis, through combustion activities under several operating conditions. This can help in the control of gaseous emissions from flares and thus in the protection of their immediate and distant surroundings against environmental degradation. The impacts of methane and non-methane hydrocarbon emissions from flaring activities at oil production facilities in the Kuwait oilfields have been assessed through a screening study using records of flaring operations taken at the gas and oil production sites, and by analyzing available meteorological and air quality data measured at stations located near anthropogenic sources. In the present study, the Industrial Source Complex (ISCST3) dispersion model is used to calculate the ground-level concentrations of methane and non-methane hydrocarbons emitted due to flaring across the Kuwait oilfields. The simulation of real hourly air quality in and around oil production facilities in the State of Kuwait for the year 2006, with the respective source emission data inserted into the ISCST3 software, indicates that the levels of non-methane hydrocarbons from flaring activities exceed the allowable ambient air standard set by the Kuwait EPA. There is therefore a strong need to address this acute problem and to minimize the impact of methane and non-methane hydrocarbons released from flaring activities over the urban areas of Kuwait.
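For context, ISCST3 is a steady-state Gaussian plume model; a standard textbook form of the plume equation for the concentration downwind of an elevated source (not the exact ISCST3 implementation, which adds further correction terms) is

\[
C(x,y,z) = \frac{Q}{2\pi\, u\, \sigma_y \sigma_z}\,
\exp\!\left(-\frac{y^2}{2\sigma_y^2}\right)
\left[\exp\!\left(-\frac{(z-H)^2}{2\sigma_z^2}\right)
+ \exp\!\left(-\frac{(z+H)^2}{2\sigma_z^2}\right)\right],
\]

where Q is the emission rate, u the wind speed, H the effective release (flare) height, and \(\sigma_y, \sigma_z\) the lateral and vertical dispersion coefficients determined by atmospheric stability.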

Dynamic Models versus Frailty Models for Recurrent Event Data

Recurrent event data are a special type of multivariate survival data. Dynamic models and frailty models are two approaches for dealing with this kind of data. A comparison between these two models is studied using the empirical standard deviation of the standardized martingale residual processes as a way of assessing the fit of the two models, based on the Aalen additive regression model. We found that both approaches take heterogeneity into account and produce residual standard deviations close to each other, both in the simulation study and in the real data set.
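For readers unfamiliar with the residual diagnostic used here, a standard formulation (sketched from the general literature rather than taken from this paper) is as follows. The Aalen additive model specifies the intensity for subject i as

\[
\lambda_i(t) = \beta_0(t) + \beta_1(t)\,x_{i1}(t) + \dots + \beta_p(t)\,x_{ip}(t),
\]

and the martingale residual process compares the observed counting process with its estimated cumulative intensity,

\[
\hat M_i(t) = N_i(t) - \int_0^t \hat\lambda_i(s)\,ds,
\]

so the empirical standard deviation of the standardized residual processes measures how much unexplained variation (heterogeneity) each model leaves.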

An Experimental Study on Development of the Connection System of Concrete Barriers Applicable to Modular Bridge

Although many studies on assembly technology for bridge construction have dealt mostly with the pier, the girder or the deck of the bridge, studies on prefabricated barriers have rarely been performed. To understand the structural characteristics and applicability of the concrete barrier in the modular bridge, which is an assembly of structural members, a static loading test was performed. The structural performance as a road barrier of three methods, conventional cast-in-place (ST), vertical bolt connection (BVC) and horizontal bolt connection (BHC), was evaluated and compared through analyses of load-displacement curves, steel strain curves, concrete strain curves and the visual appearance of crack patterns. The vertical bolt connection (BVC) method demonstrated performance comparable to conventional cast-in-place (ST) construction while providing all the advantages of prefabrication. The need for future improvements in nut fastening, as well as in the relevant legal standards and regulations, is also addressed.

A Comparison on Healing Effects of an Ayurvedic Preparation and Silver Sulfadiazine on Burn Wounds in Albino Rats

Objective: To compare the healing effects of an ayurvedic preparation and silver sulfadiazine on burn wounds in albino rats. Methods: Thirty male/female albino rats weighing between 150 and 200 g were used in the study. They were individually housed and maintained on a normal diet and water ad libitum. Partial-thickness burn wounds were inflicted on overnight-starved animals under pentobarbitone (30 mg/kg, i.p.) anaesthesia by pouring hot molten wax at 80 °C into a plastic cylinder with a 300 mm² circular opening placed on the shaven back of the animal. Apart from the drugs under investigation, no local or systemic chemotherapeutic cover was provided to the animals. All the animals were assessed for percentage of wound contraction, signs of infection, scab formation and histopathological findings. Results: The percentage of wound healing was significantly better in the test ointment group compared to the standard. Signs of infection were observed in more animals in the test ointment group compared to the standard. Scab formation also took place earlier in the test ointment group compared to the standard. Epithelial regeneration and the healing profile were better in the test ointment group compared to the standard. Moreover, the test ointment group did not show any raised wound margins or blackish discoloration, as was observed in the silver sulfadiazine group. Conclusion: The burn wound healing effect of the ayurvedic ointment under study is better than the standard therapy of silver sulfadiazine. The problem of infection encountered with the test ointment may be overcome by changing the concentrations and proportions of its ingredients, which constitutes the further plan of the study.

Dengue Disease Mapping with Standardized Morbidity Ratio and Poisson-gamma Model: An Analysis of Dengue Disease in Perak, Malaysia

Dengue is an infectious vector-borne viral disease that is commonly found in tropical and sub-tropical regions around the world, especially in urban and semi-urban areas, including Malaysia. There is currently no available vaccine or chemotherapy for the prevention or treatment of dengue, so prevention and treatment of the disease depend on vector surveillance and control measures. Disease risk mapping has been recognized as an important tool in prevention and control strategies for diseases. The choice of statistical model used for relative risk estimation is important, as a good model will subsequently produce a good disease risk map. The aim of this study is therefore to estimate the relative risk of dengue, first using the most common statistic in disease mapping, the Standardized Morbidity Ratio (SMR), and then using one of the earliest applications of Bayesian methodology, the Poisson-gamma model. This paper begins with a review of the SMR method, which we then apply to dengue data from Perak, Malaysia. We then fit an extension of the SMR method, the Poisson-gamma model. Both sets of results are displayed and compared using graphs, tables and maps. The results of the analysis show that the latter method gives better relative risk estimates than the SMR. The Poisson-gamma model is shown to overcome the problem with the SMR that arises when there are no observed dengue cases in certain regions. However, covariate adjustment in this model is difficult, and there is no possibility of allowing for spatial correlation between risks in adjacent areas. The drawbacks of this model have motivated many researchers to propose other alternative methods for estimating the risk.
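As background, the two estimators compared above have the following standard forms (generic formulations, not this paper's specific parameterization). With \(O_i\) observed and \(E_i\) expected cases in region i,

\[
\mathrm{SMR}_i = \frac{O_i}{E_i},
\]

while the Poisson-gamma model assumes

\[
O_i \mid \theta_i \sim \mathrm{Poisson}(E_i\theta_i), \qquad \theta_i \sim \mathrm{Gamma}(\alpha,\nu),
\]

giving the posterior expected relative risk

\[
\hat{\theta}_i = \frac{\alpha + O_i}{\nu + E_i},
\]

which stays positive and shrinks towards the prior mean \(\alpha/\nu\) even when \(O_i = 0\), whereas the SMR collapses to zero in regions with no observed cases.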

Effective Traffic Lights Recognition Method for a Real-Time Driving Assistance System in the Daytime

This paper presents an effective traffic lights recognition method for the daytime. First, the Potential Traffic Lights Detector (PTLD) uses the full color information of the YCbCr image to produce binary images of the green and red traffic lights. After the PTLD step, a Shape Filter (SF) is used to remove noise such as traffic signs, street trees, vehicles and buildings. The noise-removal criteria are based on properties of the blobs in the binary image: length, area, bounding-box area, etc. Finally, after an intermediate association step whose goal is to define relevant candidate regions from the previously detected traffic lights, an Adaptive Multi-class Classifier (AMC) is executed. The classification method uses Haar-like features and the Adaboost algorithm. The method was implemented on an Intel Core CPU at 2.80 GHz with 4 GB RAM and tested on urban and rural roads. In these tests, our method was compared with standard object-recognition learning processes and reached a detection rate of up to 94%, which is better than the results achieved with cascade classifiers. The computation time of our proposed method is 15 ms.
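A minimal sketch of the first two stages (color thresholding followed by blob shape filtering) is shown below, assuming OpenCV; the YCrCb threshold ranges and blob size limits are illustrative placeholders, not the paper's PTLD/SF parameters.

```python
# Minimal sketch (assumption): color thresholding in YCrCb plus simple blob
# shape filtering, in the spirit of the PTLD + Shape Filter stages.
# Threshold values and size limits are illustrative, not the paper's.
import cv2
import numpy as np

def detect_candidates(bgr):
    ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)
    # Hypothetical ranges for bright red / green light blobs.
    red_mask   = cv2.inRange(ycrcb, np.array((60, 160, 0), np.uint8),
                                     np.array((255, 255, 120), np.uint8))
    green_mask = cv2.inRange(ycrcb, np.array((60, 0, 0), np.uint8),
                                     np.array((255, 110, 110), np.uint8))
    candidates = []
    for mask, label in ((red_mask, "red"), (green_mask, "green")):
        n, _, stats, _ = cv2.connectedComponentsWithStats(mask)
        for i in range(1, n):  # label 0 is background
            x, y, w, h, area = stats[i]
            # Shape-filter style rules: plausible size, roughly square bounding box.
            if 20 < area < 500 and 0.5 < w / float(h) < 2.0:
                candidates.append((label, (x, y, w, h)))
    return candidates

if __name__ == "__main__":
    frame = np.zeros((120, 160, 3), dtype=np.uint8)
    cv2.circle(frame, (80, 60), 6, (0, 0, 255), -1)  # synthetic red blob
    print(detect_candidates(frame))
```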

A Methodology for Reducing the BGP Convergence Time

Border Gateway Protocol (BGP) is the standard routing protocol between the various autonomous systems (AS) in the Internet. In the event of a failure, empirical measurements have shown a considerable delay in BGP convergence. During the convergence time, BGP will repeatedly advertise new routes to some destination and withdraw old ones until it reaches a stable state. It has been found that the KEEPALIVE message timer and the HOLD time are two parameters affecting the convergence speed. This paper aims to find the optimum values for the KEEPALIVE timer and the HOLD time that maximally reduce the convergence time without increasing the traffic. The optimal KEEPALIVE timer value found in this paper is 30 seconds instead of 60 seconds, and the optimal value for the HOLD time is 90 seconds instead of 180 seconds.
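On most BGP implementations these timers are configured per neighbor; for example, in a Cisco-style configuration the recommended values above would be applied roughly as follows (an illustrative snippet; the AS numbers and neighbor address are hypothetical):

```
router bgp 65001
 neighbor 192.0.2.1 remote-as 65002
 neighbor 192.0.2.1 timers 30 90
```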

On Formalizing Predefined OCL Properties

The ability of UML to handle the modeling of complex industrial software applications has increased its popularity to the extent of becoming the de facto language for design purposes. Although its rich graphical notation, naturally oriented towards object-oriented concepts, facilitates understandability, it hardly succeeds in capturing all domain-specific aspects in a satisfactory way. OCL, as the standard language for expressing additional constraints on UML models, has great potential to help improve expressiveness. Unfortunately, it suffers from a weak formalism due to its poor semantics, resulting in many obstacles to building tool support and thus to its application in industry. For this reason, much research has been conducted to formalize OCL expressions using more rigorous approaches. Our contribution joins this work in a complementary way, since it focuses specifically on the OCL predefined properties, which constitute an important part of the construction of OCL expressions. Using formal methods, we succeed in rigorously expressing the OCL predefined functions.
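To make concrete what formalizing a predefined property involves, a set-theoretic reading of two common OCL collection operations can be written as follows (an illustrative formalization, not the one developed in the paper):

\[
c\text{->forAll}(x \mid P(x)) \;\equiv\; \forall x \in c.\; P(x), \qquad
c\text{->includes}(o) \;\equiv\; o \in c .
\]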

Automatic Generation of OWL Ontologies from UML Class Diagrams Based on Meta-Modelling and Graph Grammars

The modeling paradigm places models at the center of the development process. These models are represented in languages such as UML, the language standardized by the OMG, which has become essential for development. Similarly, the ontology engineering paradigm places ontologies at the center of the development process; in this paradigm, OWL is the principal language for knowledge representation. Building ontologies from scratch is generally a difficult task. The bridging between UML and OWL appears in several regards, such as classes and associations. In this paper, we take advantage of the convergence between UML and OWL to propose an approach, based on meta-modelling and graph grammars and registered in the MDA architecture, for the automatic generation of OWL ontologies from UML class diagrams. The transformation is based on transformation rules; the level of abstraction of these rules is kept close to the application in order to obtain usable ontologies. We illustrate this approach with an example.
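The sketch below illustrates, in Python with rdflib, the kind of class/association mapping rules the paper describes (UML class → owl:Class, association → owl:ObjectProperty, generalization → rdfs:subClassOf). The class names, property name and namespace are hypothetical, and this is not the graph-grammar implementation itself.

```python
# Minimal sketch (assumption): mapping a small UML class diagram fragment to OWL
# with rdflib. Names and namespace are hypothetical.
from rdflib import Graph, Namespace
from rdflib.namespace import RDF, RDFS, OWL

EX = Namespace("http://example.org/onto#")
g = Graph()
g.bind("ex", EX)

# UML classes Person and Student, with Student specializing Person.
for cls in (EX.Person, EX.Student, EX.Course):
    g.add((cls, RDF.type, OWL.Class))
g.add((EX.Student, RDFS.subClassOf, EX.Person))

# UML association "enrolledIn" between Student and Course.
g.add((EX.enrolledIn, RDF.type, OWL.ObjectProperty))
g.add((EX.enrolledIn, RDFS.domain, EX.Student))
g.add((EX.enrolledIn, RDFS.range, EX.Course))

print(g.serialize(format="turtle"))
```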

A Study of Recycled Materials Developed for Auto Parts

At present, the auto parts industry faces increasingly challenging market conditions. As a consequence, manufacturers need to respond better to customers in terms of quality, cost and delivery time. Moreover, they need good factory management to comply with international standards at maximum capacity and lower cost. This leads companies to order standard parts from abroad, which becomes the major inventory cost. This research develops auto parts from recycled materials and compares them with international auto parts (CKD). The factors studied in this research were the recycled-material ratios of PU foam, felt and fabric. The recycled materials were evaluated against the CKD parts in terms of quality and properties such as weight, sound absorption, water absorption, tensile strength, elongation and heat resistance. The results showed that recycled materials can be used as replacements for the CKD parts.

Modeling Ambient Carbon Monoxide Pollutant Due to Road Traffic

Rapid urbanization, industrialization and population growth have led to an increase in the number of automobiles, which cause air pollution. It is estimated that road traffic contributes 60% of the air pollution in urban areas. A case-by-case assessment is required to predict the air quality in urban situations, so as to devise traffic management measures that keep air quality levels within tolerable limits. Calicut city in the state of Kerala, India, was chosen as the study area. Carbon monoxide (CO) concentration was monitored at 15 links in Calicut city and air quality performance was evaluated over each link. The CO concentration values were compared with the National Ambient Air Quality Standards (NAAQS), and the CO values were predicted using the CALINE4, IITLS and linear regression models. The study revealed that the linear regression model performs better than the CALINE4 and IITLS models. The possible association between CO concentration and traffic parameters such as traffic flow, vehicle type and traffic stream speed was also evaluated.
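As an illustration of the linear regression approach found to perform best, the following Python sketch fits CO concentration against traffic flow and stream speed by least squares; the data are synthetic and the chosen predictors are assumptions, not the study's measured values.

```python
# Minimal sketch (assumption): linear regression of CO concentration on traffic
# parameters, of the kind compared against CALINE4/IITLS. Data are synthetic.
import numpy as np

# Columns: traffic flow (veh/h), stream speed (km/h); target: CO concentration (ppm).
flow  = np.array([800, 1200, 1500, 2000, 2400, 3000], dtype=float)
speed = np.array([40,   35,   30,   25,   20,   15 ], dtype=float)
co    = np.array([1.1,  1.6,  2.0,  2.7,  3.3,  4.1], dtype=float)

X = np.column_stack([np.ones_like(flow), flow, speed])   # intercept + predictors
beta, *_ = np.linalg.lstsq(X, co, rcond=None)
pred = X @ beta

print("coefficients:", beta)
print("R^2:", 1 - np.sum((co - pred) ** 2) / np.sum((co - co.mean()) ** 2))
```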

Tracking Objects in Color Image Sequences: Application to Football Images

In this paper, we present a comparative study of two computer vision systems for object recognition and tracking. The two algorithms describe different approaches based on regions, constituted by sets of pixels, which parameterize the objects in shot sequences. For image segmentation and object detection, the FCM technique is used; the overlap between the cluster distributions is minimized by the use of a suitable color space (other than RGB). The first technique takes into account the a priori probabilities governing the computation of the various clusters to track objects. A Parzen kernel method is described that allows the players to be identified in each frame; we also show the importance of the choice of the standard deviation of the Gaussian probability density function. Region matching is carried out by an algorithm that operates on the Mahalanobis distance between region descriptors in two subsequent frames and uses singular value decomposition to compute a set of correspondences satisfying both the principle of proximity and the principle of exclusion.
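The following Python sketch illustrates Mahalanobis-distance region matching between two frames. For simplicity it uses a greedy nearest-neighbour assignment that enforces the principle of exclusion, rather than the SVD-based correspondence method used in the paper; the descriptors and covariance are synthetic.

```python
# Minimal sketch (assumption): matching regions across two frames by the
# Mahalanobis distance between their descriptors (e.g. centroid + mean color).
import numpy as np

def mahalanobis(x, y, cov_inv):
    d = x - y
    return float(np.sqrt(d @ cov_inv @ d))

def match_regions(desc_prev, desc_curr, cov):
    """Greedy nearest-neighbour matching by Mahalanobis distance."""
    cov_inv = np.linalg.inv(cov)
    dist = np.array([[mahalanobis(p, c, cov_inv) for c in desc_curr]
                     for p in desc_prev])
    matches = []
    while dist.size and np.isfinite(dist).any():
        i, j = np.unravel_index(np.argmin(dist), dist.shape)
        matches.append((i, j, dist[i, j]))
        dist[i, :] = np.inf   # principle of exclusion: each region matched once
        dist[:, j] = np.inf
    return matches

if __name__ == "__main__":
    prev = np.array([[10.0, 20.0, 0.8], [50.0, 60.0, 0.2]])  # synthetic descriptors
    curr = np.array([[12.0, 21.0, 0.8], [48.0, 63.0, 0.3]])
    cov  = np.diag([4.0, 4.0, 0.05])
    print(match_regions(prev, curr, cov))
```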

Quality-Driven Business Process Refactoring

An appropriate description of business processes through standard notations has become one of the most important assets for organizations. Organizations must therefore deal with quality faults in business process models, such as a lack of understandability and modifiability. These quality faults may be exacerbated when business process models are mined by reverse engineering, e.g., from existing information systems that support those business processes. Hence, business process refactoring is often used, which changes the internal structure of business processes while preserving their external behavior. This paper aims to choose the most appropriate set of refactoring operators through quality assessment concerning understandability and modifiability. These quality features are assessed through well-proven measures proposed in the literature. Additionally, a set of measure thresholds is heuristically established for applying the most promising refactoring operators, i.e., those that achieve the highest quality improvement according to the selected measures in each case.

Improved Back Propagation Algorithm to Avoid Local Minima in Multiplicative Neuron Model

The back propagation algorithm calculates the weight changes of artificial neural networks, and a common approach is to use a training algorithm consisting of a learning rate and a momentum factor. The major drawbacks of this learning algorithm are the problems of local minima and slow convergence. The addition of an extra term, called the proportional factor, reduces the convergence time of the back propagation algorithm. We have applied the three-term back propagation algorithm to multiplicative neural network learning. The algorithm is tested on the XOR and parity problems and compared with the standard back propagation training algorithm.
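A common form of the three-term update rule (a generic formulation; the paper's exact expression may differ) adds a term proportional to the output error to the usual gradient and momentum terms:

\[
\Delta w(t) = -\alpha\,\frac{\partial E}{\partial w} + \beta\,\Delta w(t-1) + \gamma\, e(t),
\]

where \(\alpha\) is the learning rate, \(\beta\) the momentum factor, \(\gamma\) the proportional factor and \(e(t)\) the output error.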

Selection of Initial Modes for the Belief K-modes Method

The belief K-modes method (BKM) is a new clustering technique that handles uncertainty in the attribute values of objects in both the cluster construction task and the classification task. As with the standard version of this method, the BKM results depend on the chosen initial modes. A method for selecting initial modes is therefore developed in this paper, aiming to improve the performance of the BKM approach. Experiments with several real data sets show that, with the developed initial-mode selection method, the clustering algorithm produces more accurate results.

Secure Protocol for Short Message Service

The Short Message Service (SMS) has grown in popularity over the years and has become a common way of communication. It is a service provided through the Global System for Mobile Communications (GSM) that allows users to send text messages to others. SMS is usually used to transport unclassified information, but with the rise of mobile commerce it has become a popular tool for transmitting sensitive information between businesses and their clients. By default, SMS does not guarantee confidentiality and integrity of the message content. In mobile communication systems, the security (encryption) offered by the network operator applies only to the wireless link; data delivered through the mobile core network may not be protected. Existing end-to-end security mechanisms are provided at the application level and are typically based on public-key cryptosystems. The main concern in a public-key setting is the authenticity of the public key; this issue can be resolved by identity-based (ID-based) cryptography, where the public key of a user can be derived from public information that uniquely identifies the user. This paper presents an encryption mechanism based on the ID-based scheme using elliptic curves to provide end-to-end security for SMS. This mechanism has been implemented over the standard SMS network architecture, and the encryption overhead has been estimated and compared with an RSA scheme. This study indicates that the ID-based mechanism has advantages over the RSA mechanism in key distribution and in the scalability of increasing the security level for mobile services.