A Programmable FSK-Modulator in 350nm CMOS Technology

This paper describes the design of a programmable, VCO-based FSK modulator and its implementation in a 0.35 µm CMOS process. The circuit is used to transmit digital data at a rate of 100 kbps in the 400-600 MHz frequency range. The design and operation of the modulator are discussed briefly; further, the characteristics of the PLL, frequency synthesizer, VCO and the complete design are elaborated. The deviation between the proposed and measured specifications is presented. Finally, the layouts of the sub-modules, the pin configuration, the final chip and the test results are presented.
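
The binary FSK principle underlying the modulator can be sketched numerically. The following Python snippet is purely illustrative: the carrier frequencies, bit rate and sample rate are stand-ins for the chip's actual (analog, VCO-generated) parameters, not values from the paper.

    import numpy as np

    def fsk_modulate(bits, f0=400e6, f1=600e6, bit_rate=100e3, fs=2.4e9):
        """Binary FSK: each bit selects one of two carrier frequencies."""
        n = int(fs / bit_rate)            # samples per bit
        t = np.arange(n) / fs
        # One sinusoidal burst per bit; a real VCO output is phase-continuous.
        return np.concatenate(
            [np.sin(2 * np.pi * (f1 if b else f0) * t) for b in bits])

    signal = fsk_modulate([1, 0, 1, 1])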

Mirror Neuron System Study on the Elderly Using Dynamic Causal Modeling fMRI Analysis

Dynamic Causal Modeling (DCM) of functional Magnetic Resonance Imaging (fMRI) is a promising technique for studying connectivity among brain regions and the effects of stimuli by modeling neuronal interactions from time-series neuroimaging data. The aim of this study is to characterize the mirror neuron system (MNS) in an elderly group (age: 60-70 years). Twenty volunteers underwent fMRI scanning with visual stimuli to study a functional brain network. DCM was employed to determine the mechanism of mirror neuron effects. The results revealed major activated areas including the precentral gyrus, inferior parietal lobule, inferior occipital gyrus, and supplementary motor area. When visual stimuli were presented, the feed-forward connectivity from the visual area to the conjunction area increased and was forwarded to the motor area. Moreover, the connectivity from the conjunction areas to the premotor area also increased. Such findings can be useful for future diagnostic processes for the elderly with diseases such as Parkinson's and Alzheimer's.
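
For context, DCM describes the hidden neuronal dynamics with the standard bilinear state equation (this is the generic form of the model, not the study-specific parameterization):

    \dot{x} = \Bigl(A + \sum_{j} u_j B^{(j)}\Bigr)x + Cu

where x is the vector of regional neuronal states, A encodes intrinsic connectivity, each B^{(j)} encodes how input u_j modulates connections, and C encodes direct driving inputs; the reported connectivity increases correspond to changes in these coupling parameters.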

An Advanced Stereo Vision Based Obstacle Detection with a Robust Shadow Removal Technique

This paper presents a robust method to detect obstacles in stereo images using a shadow removal technique and color information. Stereo vision based obstacle detection is an algorithm that aims to detect obstacles and compute their depth using stereo matching and a disparity map. The proposed method is divided into three phases: the first phase detects obstacles and removes shadows, the second performs matching, and the last computes depth. In the first phase, we propose a robust method for detecting obstacles in stereo images using a shadow removal technique based on color information in HSI space. For matching, we use the Normalized Cross Correlation (NCC) function with a 5 × 5 window: we prepare an empty matching table τ, draw a seed s from the seed set S, which is computed using the Canny edge detector, add it to τ, and grow disparity components from it. In this way we achieve higher performance than previous works [2,17]. The proposed fast stereo matching algorithm visits only a small fraction of the disparity space in order to find a semi-dense disparity map, working by growing from a small set of correspondence seeds. An obstacle identified in phase one that appears in the disparity map of phase two enters the third phase of depth computation. Finally, experimental results are presented to show the effectiveness of the proposed method.
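
The matching score used in the second phase is standard normalized cross-correlation. A minimal Python sketch of scoring one candidate disparity with a 5 × 5 window follows; image access and border handling are simplified for illustration and are not the paper's implementation.

    import numpy as np

    def ncc(patch_l, patch_r):
        """Normalized cross-correlation between two equally sized patches."""
        a = patch_l - patch_l.mean()
        b = patch_r - patch_r.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return (a * b).sum() / denom if denom > 0 else 0.0

    def match_score(left, right, y, x, d, w=2):
        """Score candidate disparity d at pixel (y, x) using a 5x5 window."""
        pl = left[y - w:y + w + 1, x - w:x + w + 1]
        pr = right[y - w:y + w + 1, x - d - w:x - d + w + 1]
        return ncc(pl, pr)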

Synchronization of Oestrus in Goats with Progestogen Sponges and Short Term Combined FGA, PGF2α Protocols

The study aimed to evaluate the reproductive performance in response to short-term oestrus synchronization during the transition period. One hundred and sixty-five indigenous multiparous non-lactating goats were subdivided into the following six treatment groups for oestrus synchronization: NT control group (N = 30); Fe-21d, FGA vaginal sponge for 21 days + eCG on day 19; FPe-11d, FGA for 11 days + PGF2α and eCG on day 9; FPe-10d, FGA for 10 days + PGF2α and eCG on day 8; FPe-9d, FGA for 9 days + PGF2α and eCG on day 7; PFe-5d, PGF2α on day 0 + FGA for 5 days + eCG on day 5. The goats were naturally mated (1 male/6 females). Fecundity rates (no. of births / no. of females treated × 100) were statistically higher (P < 0.05) in the short-term FPe-9d (157.9%), FPe-11d (115.4%), FPe-10d (111.1%) and PFe-5d (107.7%) groups compared to the NT control group (66.7%).

Studying Implication of Globalization on Engineering Education

The primary purpose of this article is to examine the implications of globalization for education. Globalization plays an important role as a process in the economic, political, cultural and technological dimensions of contemporary human life. Education has its own effects in this process: while it influences globalization by educating global citizens with universal human features and characteristics, it has been influenced by this phenomenon too. Nowadays, the role of education is not just to develop in students the knowledge and skills necessary for new kinds of jobs. If education is to help students be prepared for the new global society, it has to make them engaged, productive and critical citizens of the global era, so that they can reflect on their roles as key actors in a dynamic, often uneven, matrix of economic and cultural exchanges. If education is to reinforce and strengthen national identity and the value system of children and teenagers, it should make them ready for living in the global era of this century. The method used in this research is documentary analysis. Studies in this field show that globalization influences the processes of production, distribution and consumption of knowledge. Occurring in the information era, it has not only provided the necessary opportunities for educational exchange worldwide but also offered advantages to developing countries, enabling them to strengthen the educational bases of their societies and take an important step toward their future.

Enhancements in Blended e-Learning Management System

A learning management system (commonly abbreviated as LMS) is a software application for the administration, documentation, tracking, and reporting of training programs, classroom and online events, e-learning programs, and training content (Ellis 2009). Hall (2003) defines an LMS as "software that automates the administration of training events. All Learning Management Systems manage the log-in of registered users, manage course catalogs, record data from learners, and provide reports to management". Evidence of the worldwide spread of e-learning in recent years is easy to obtain. In April 2003, no fewer than 66,000 fully online courses and 1,200 complete online programs were listed on the TeleCampus portal from TeleEducation (Paulsen 2003). According to the report "The US Market for Self-paced eLearning Products and Services: 2010-2015 Forecast and Analysis", the number of students taking classes exclusively online will be nearly equal (1% less) to the number taking classes exclusively on physical campuses, and the number of students taking online courses will increase from 1.37 million in 2010 to 3.86 million in 2015 in the USA. In another report, by The Sloan Consortium, three-quarters of institutions report that the economic downturn has increased demand for online courses and programs.

Power System Security Constrained Economic Dispatch Using Real Coded Quantum Inspired Evolution Algorithm

This paper presents a new optimization technique based on quantum computing principles to solve the security constrained power system economic dispatch (SCED) problem. The proposed technique is a population-based algorithm that uses quantum computing elements in coding and evolving groups of potential solutions to reach the optimum following a partially directed random approach. The SCED problem is formulated as a constrained optimization problem in a way that ensures secure and economic system operation. The Real Coded Quantum-Inspired Evolution Algorithm (RQIEA) is then applied to solve the constrained optimization formulation. Simulation results of the proposed approach are compared with those reported in the literature. The outcome is very encouraging and proves that RQIEA is well suited to solving the security constrained power system economic dispatch problem.
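
The quantum-inspired idea, probabilistic genes that are "observed" into concrete real-coded solutions and then rotated toward the best solution found, can be sketched as follows. The objective, bounds and update rule below are placeholders for illustration, not the authors' formulation; a real SCED run would replace the toy objective with the dispatch cost plus security constraints.

    import numpy as np

    rng = np.random.default_rng(0)

    def sphere(x):                     # placeholder objective; SCED cost goes here
        return np.sum(x ** 2)

    dim, pop, iters = 5, 20, 200
    mu = rng.uniform(-5, 5, dim)       # "quantum" gene centers
    sigma = np.full(dim, 2.0)          # gene widths (uncertainty)

    best_x, best_f = None, np.inf
    for _ in range(iters):
        # "Observe" the probabilistic genes into a population of real vectors.
        X = rng.normal(mu, sigma, size=(pop, dim))
        f = np.apply_along_axis(sphere, 1, X)
        i = f.argmin()
        if f[i] < best_f:
            best_x, best_f = X[i].copy(), f[i]
        # Shift the gene distribution toward the best solution found so far.
        mu += 0.1 * (best_x - mu)
        sigma *= 0.98                  # gradually collapse the uncertainty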

PI Control for Positive Output Elementary Super Lift Luo Converter

The objective of this paper is to design and analyze a proportional-integral (PI) control for the positive output elementary super lift Luo converter (POESLLC), which is a state-of-the-art DC-DC converter. The positive output elementary super lift Luo converter performs voltage conversion from a positive source voltage to a positive load voltage. This paper proposes a PI control capable of providing good static and dynamic performance compared to a proportional-integral-derivative (PID) controller. The dynamic equations describing the positive output elementary super lift Luo converter are derived using the state-space averaging method, and the PI control is designed. The simulation model of the positive output elementary super lift Luo converter with its control circuit is implemented in Matlab/Simulink. The PI control for the positive output elementary super lift Luo converter is tested in the transient region, for line changes, load changes, the steady state region, and also for component variations.
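
As a generic illustration of the control law (the gains, sampling period and reference values below are placeholders, not the tuned values used in the paper), a discrete-time PI regulator of the output voltage can be sketched as:

    def pi_controller(kp, ki, ts):
        """Discrete PI controller: returns a function mapping error -> duty cycle."""
        integral = 0.0
        def step(error):
            nonlocal integral
            integral += error * ts
            u = kp * error + ki * integral
            return min(max(u, 0.0), 1.0)   # clamp duty cycle to [0, 1]
        return step

    control = pi_controller(kp=0.05, ki=20.0, ts=1e-5)
    duty = control(error=12.0 - 11.4)      # v_ref - v_out, illustrative values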

Mechanical Evaluation of Stainless Steel and Titanium Dynamic Hip Screws for Trochanteric Fracture

This study presents a mechanical performance evaluation of the dynamic hip screw (DHS) for trochanteric fracture by means of the finite element method. The analyses were performed with stainless steel and titanium implant material definitions at various stages of bone healing, including after implant removal. The assessment of mechanical performance used two parameters: von Mises stress, to evaluate the strength of the bone and implant, and elastic strain, to evaluate fracture stability. The results show several critical aspects of dynamic hip screw stabilization of trochanteric fractures. In the initial stage of the bone healing process, partial weight bearing should be applied to avoid implant failure. In the late stage of bone healing, the stainless steel implant should be removed.
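
For reference, the von Mises (equivalent) stress used as the strength criterion follows the textbook definition in terms of the principal stresses (a standard formula, not a study-specific quantity):

    \sigma_{vM} = \sqrt{\tfrac{1}{2}\left[(\sigma_1-\sigma_2)^2 + (\sigma_2-\sigma_3)^2 + (\sigma_3-\sigma_1)^2\right]}

with implant failure predicted where \sigma_{vM} approaches the material's yield strength.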

A Robust Controller for Output Variance Reduction and Minimum Variance with Application on a Permanent Field DC-Motor

In this paper, we present experimental testing of a new algorithm that determines an optimal controller's coefficients for output variance reduction in Linear Time Invariant (LTI) systems. The algorithm features simplicity of calculation, generalizes to minimal and non-minimal phase systems, and can be configured to achieve reference tracking as well as variance reduction by compromising on the output variance. An experiment on DC motor velocity control demonstrates the application of this new algorithm in designing the controller. The results show that the controller achieves minimum variance and reference tracking for a preset velocity reference, relying on an identified model of the motor.
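
As background, the output variance being minimized can be evaluated from the closed-loop impulse response: for a white-noise disturbance of variance σ² driving a stable LTI loop with impulse response h, the steady-state output variance is σ² Σ h[k]². A minimal numeric check follows; the first-order model is illustrative only, not the identified motor model.

    import numpy as np

    def output_variance(h, sigma2=1.0):
        """Variance of y when y = h * e and e is white noise of variance sigma2."""
        return sigma2 * np.sum(np.asarray(h) ** 2)

    # Illustrative closed-loop impulse response of y[k] = a*y[k-1] + e[k]
    a = 0.8
    h = a ** np.arange(200)            # truncated impulse response
    print(output_variance(h))          # approx 1 / (1 - a**2) = 2.78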

Connectivity Characteristic of Transcription Factor

Transcription factors are a group of proteins that help interpret the genetic information in DNA. Protein-protein interactions play a major role in the execution of key biological functions of a cell. These interactions are represented in the form of a graph with nodes and edges. Studies have shown that some nodes have a high degree of connectivity, and such nodes, known as hub nodes, are indispensable parts of the network. In the present paper a method is proposed to identify hub transcription factor proteins using sequence information. On a complete data set of transcription factor proteins available from the APID database, the proposed method showed an accuracy of 77%, sensitivity of 79% and specificity of 76%.
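
The graph notion involved is simple: hubs are high-degree nodes in the interaction graph. As an illustration (the interaction list and cutoff below are invented for this example, not drawn from the APID data):

    from collections import defaultdict

    interactions = [("TP53", "MDM2"), ("TP53", "EP300"),
                    ("TP53", "ATM"), ("MDM2", "EP300")]

    degree = defaultdict(int)
    for a, b in interactions:
        degree[a] += 1
        degree[b] += 1

    hub_cutoff = 3   # illustrative; real studies derive it from the degree distribution
    hubs = [node for node, d in degree.items() if d >= hub_cutoff]
    print(hubs)      # ['TP53']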

The Statistical Properties of Filtered Signals

In this paper, the statistical properties of filtered or convolved signals are considered by deriving the resulting density functions, as well as exact mean and variance expressions, given prior knowledge of the statistics of the individual signals in the filtering or convolution process. It is shown that the density function after linear convolution is a mixture density, where the number of density components is equal to the number of observations of the shortest signal. For circular convolution, the observed samples are characterized by a single density function, which is a sum of products.
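
As a simple special case (a sanity check, not the paper's general mixture-density derivation): if an i.i.d. input x with mean \mu and variance \sigma^2 is passed through an FIR filter with taps h_k, the output mean and variance are

    E[y] = \mu \sum_k h_k, \qquad \mathrm{Var}[y] = \sigma^2 \sum_k h_k^2.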

Determination of Chemical Oxygen Demand in Spent Caustic by Potentiometric Titration

Measurement of the COD of a spent caustic solution involves, firstly, digestion of a test sample with dichromate solution and, secondly, measurement of the remaining dichromate by titration with ferrous ammonium sulfate (FAS) to an end point. In this paper we study a potentiometric end point with an Ag/AgCl reference electrode and a gold rod electrode. The potentiometric end point is sharp and easily identified, especially for samples with high turbidity and color, for which other methods, such as colorimetry, do not yield high precision. During titration the electrode responds quickly to potential changes within the Cr(VI)/Cr(III) and Fe(II)/Fe(III) solution, producing stable readings that lead to accurate COD measurement. Finally, the results are compared with data determined using the colorimetric method for standard samples. It is shown that potentiometric end-point titration with a gold rod electrode can be used with equal or better facility.
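
The back-titration arithmetic follows the standard dichromate COD calculation (quoted here from common practice, not from the paper): COD in mg O2/L = (A − B) × M × 8000 / V, where A and B are the FAS volumes in mL consumed by the blank and the sample, M is the FAS molarity, and V is the sample volume in mL. A minimal sketch with illustrative numbers:

    def cod_mg_per_l(fas_blank_ml, fas_sample_ml, fas_molarity, sample_ml):
        """Standard dichromate back-titration COD formula (mg O2 / L)."""
        return (fas_blank_ml - fas_sample_ml) * fas_molarity * 8000.0 / sample_ml

    print(cod_mg_per_l(10.0, 4.0, 0.25, 20.0))   # 600.0 mg O2/L, illustrative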

Sustainable Walkability and Place Identity

The sustainability of a place depends on a series of factors that contribute to quality of life, sense of place and recognition of identity. An activity like walking, which in itself is obviously 'sustainable', can become unsustainable if the context in which it is carried out does not meet the conditions for an adequate quality of life. This work proposes the analytical method of Place Maker to identify the elements that do not feature in traditional mapping and which constitute the contemporary identity of places, together with the relative complex map to represent those elements and support sustainable urban identity design. The method's potential for areas with a predominantly pedestrian vocation is illustrated by means of the case study of the Ramblas in Barcelona.

Real-Time Hand Tracking and Gesture Recognition System Using Neural Networks

This paper introduces a hand gesture recognition system to recognize gestures in real time in unconstrained environments. Efforts should be made to adapt computers to our natural means of communication: speech and body language. A simple and fast algorithm using orientation histograms is developed to recognize a subset of MAL static hand gestures. The pattern recognition system uses a transform that converts an image into a feature vector, which is then compared with the feature vectors of a training set of gestures. The final system is a perceptron implementation in MATLAB. This paper includes experiments on 33 hand postures and discusses the results. Experiments show that the system can achieve an average recognition rate of 90% and is suitable for real-time applications.
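
The orientation-histogram feature itself is straightforward to sketch. In the snippet below, the bin count and gradient operator are typical choices for this kind of feature, not necessarily those of the paper:

    import numpy as np

    def orientation_histogram(img, bins=36):
        """Histogram of local gradient orientations as a hand-pose feature vector."""
        gy, gx = np.gradient(img.astype(float))
        angles = np.arctan2(gy, gx)        # orientation in [-pi, pi]
        magnitude = np.hypot(gx, gy)
        hist, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi),
                               weights=magnitude)
        return hist / (hist.sum() + 1e-9)  # normalize for illumination/scale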

Neural Network Based Determination of Splice Junctions by ROC Analysis

The gene, the principal unit of inheritance, is an ordered sequence of nucleotides. The genes of eukaryotic organisms include alternating segments of exons and introns. A region of deoxyribonucleic acid (DNA) within a gene containing instructions for coding a protein is called an exon. On the other hand, the non-coding regions called introns are parts of the DNA that regulate gene expression and are removed from the messenger ribonucleic acid (RNA) in a splicing process. This paper proposes to determine splice junctions, i.e., exon-intron boundaries, by analyzing DNA sequences. A splice junction can be either exon-intron (EI) or intron-exon (IE). Because of the popularity and suitability of artificial neural networks (ANNs) in genetic fields, various ANN models are applied in this research. Multi-layer Perceptron (MLP), Radial Basis Function (RBF) and Generalized Regression Neural Networks (GRNN) are used to analyze and detect the splice junctions of gene sequences. 10-fold cross validation is used to demonstrate the accuracy of the networks, and their real performances are found by applying Receiver Operating Characteristic (ROC) analysis.
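
Before any of these networks can be applied, each nucleotide sequence must be turned into a numeric vector. A common encoding, shown here for illustration and not necessarily the one used in the paper, is one-hot:

    import numpy as np

    ONE_HOT = {"A": [1, 0, 0, 0], "C": [0, 1, 0, 0],
               "G": [0, 0, 1, 0], "T": [0, 0, 0, 1]}

    def encode(seq):
        """One-hot encode a DNA window around a candidate splice junction."""
        return np.array([ONE_HOT[base] for base in seq.upper()]).flatten()

    x = encode("ACGTGT")   # 24-dimensional input vector for the ANN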

Examining Corporate Tax Evaders: Evidence from the Finalized Audit Cases

This paper aims to (1) analyze the profiles of transgressors (detected evaders); (2) examine the reason(s) that triggered a tax audit, the causes of tax evasion, the audit timeframe and the tax penalty charged; and (3) assess whether tax auditors followed the guidelines stated in the 'Tax Audit Framework' when conducting tax audits. In 2011, the Inland Revenue Board Malaysia (IRBM) audited and finalized 557 company cases. With official permission, data on all 557 cases were obtained from the IRBM. Of these, a total of 421 cases with complete information were analyzed. About 58.1% were small and medium corporations, with the largest share from the construction industry (32.8%). Selection for tax audit was based on risk analysis (66.8%), information from third parties (11.1%), and firms with low profitability or fluctuating profit patterns (7.8%). The three persistent causes of tax evasion by firms were overclaimed expenses (46.8%), fraudulent reporting of income (38.5%) and overstated purchases (10.5%). These findings are consistent with past literature. The results showed that tax auditors took six to 18 months to close audit cases. More than half of the tax evaders were fined 45% of the additional tax raised during the audit for a first offence. The study found that tax auditors did follow the guidelines in the 'Tax Audit Framework' in audit selection, settlement and penalty imposition.

Using Automated Database Reverse Engineering for Database Integration

One important problem in today's organizations is the existence of non-integrated information systems, with inconsistency and a lack of suitable correlations between legacy and modern systems. One main solution is to transfer the local databases into a global one. In this regard, we need to extract the data structures from the legacy systems and integrate them with the new technology systems. In legacy systems, huge amounts of data are stored in legacy databases. These require particular attention, since considerable effort is needed to normalize, reformat and move them to modern database environments. Designing the new integrated (global) database architecture and applying reverse engineering require data normalization. This paper proposes the use of database reverse engineering to integrate legacy and modern databases in organizations. The suggested approach consists of methods and techniques for generating the data transformation rules needed for data structure normalization.
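
As a toy illustration of what such a transformation rule might look like (the field names and rule format are invented for this example, not taken from the paper), a denormalized legacy record can be split into normalized target tables:

    # Hypothetical legacy record with repeating, denormalized fields.
    legacy_row = {"cust_id": 7, "cust_name": "Acme",
                  "order_ids": "101;102", "city": "Oslo"}

    def normalize(row):
        """One transformation rule: split the flat record into customer and order tables."""
        customer = {"id": row["cust_id"], "name": row["cust_name"], "city": row["city"]}
        orders = [{"order_id": int(oid), "cust_id": row["cust_id"]}
                  for oid in row["order_ids"].split(";")]
        return customer, orders

    customer, orders = normalize(legacy_row)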

Structure of Linkages and Cam Gear for Integral Steering of Vehicles

This paper addresses issues of integral steering for vehicles with two steering axles, where the rear wheels pivot either in the same direction as the front wheels or in the opposite direction. The steering box of the rear axle is presented with simple linkages (single contour) that correlate the pivoting of the rear wheels with the direction of the front wheels and, respectively, with the rotation angle of the steering wheel. The functionality of the system is analyzed, namely the extent to which the requirements of integral steering are met by the considered/proposed mechanisms. The paper highlights the quality of single-contour linkages with two driving elements for meeting these requirements, presenting diagrams of mechanisms with two driving elements. Cam variants are analyzed and proposed for the rear axle steering box, and the cam profiles are determined by various factors.

Towards Development of Solution for Business Process-Oriented Data Analysis

This paper proposes a modeling methodology for the development of a data analysis solution. The author introduces an approach to addressing data warehousing issues at the enterprise level. The methodology covers the requirements elicitation and analysis stage as well as the initial design of the data warehouse. The paper reviews an extended business process model that satisfies the needs of data warehouse development. The author considers the use of business process models necessary, as they reflect both enterprise information systems and business functions, which are important for data analysis. The described approach divides development into three steps with models elaborated at different levels of detail, and it makes it possible to gather requirements and present them to business users in an easy manner.