OCR for Script Identification of Hindi (Devnagari) Numerals using Error Diffusion Halftoning Algorithm with Neural Classifier

Applications involving numbers are so widespread that there is much scope for study. The style of writing numerals is diverse, varying in form, size and font. Identification of Indian-language scripts is a challenging problem. In Optical Character Recognition (OCR), machine-printed or handwritten characters/numerals are recognized. There are numerous approaches to the problem of detecting numerals/characters, depending on the type of feature extracted and the way it is extracted. This paper proposes a recognition scheme for handwritten Hindi (Devnagari) numerals, one of the most widely used scripts in the Indian subcontinent. Our work focuses on a local feature-extraction approach, a method based on the 16-segment display concept, in which features are extracted from halftoned and binary images of isolated numerals. These feature vectors are fed to a neural classifier trained to recognize Hindi numerals. The prototype system has been tested on a variety of numeral images. Experimental results show a recognition rate of 98% for halftoned images compared to 95% for binary images.
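
To make the halftoning step concrete, the following is a minimal sketch of classic Floyd-Steinberg error diffusion, one common form of error-diffusion halftoning; the abstract does not specify which variant the authors use, so the weights and function name below are illustrative assumptions.

```python
import numpy as np

def error_diffusion_halftone(gray):
    """Floyd-Steinberg error diffusion: binarize a grayscale image (0-255)
    while pushing the quantization error onto unprocessed neighbours."""
    img = gray.astype(float).copy()
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = 255.0 if old >= 128 else 0.0
            out[y, x] = int(new)
            err = old - new
            # distribute the error with the classic 7/16, 3/16, 5/16, 1/16 weights
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return out
```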

Automatic Building an Extensive Arabic FA Terms Dictionary

Field Association (FA) terms are a limited set of discriminating terms that provide the knowledge needed to identify document fields; they are effective in document classification, similar-file retrieval and passage retrieval. The problem, however, lies in the lack of an effective method to automatically extract relevant Arabic FA terms and build a comprehensive dictionary. Moreover, all previous studies are based on FA terms in English and Japanese, and extending FA terms to other languages such as Arabic would definitely strengthen further research. This paper presents a new method to extract Arabic FA terms from domain-specific corpora using part-of-speech (POS) pattern rules and corpora comparison. An experimental evaluation is carried out for 14 different fields using 251 MB of domain-specific corpora obtained from Arabic Wikipedia dumps and Alhyah news, selecting on average 2,825 FA terms (single and compound) per field. From the experimental results, recall and precision are 84% and 79%, respectively. The method therefore selects a large number of relevant Arabic FA terms with high precision and recall.
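
As an aside, recall and precision figures of this kind are typically computed against a manually validated reference term list; a minimal sketch of such an evaluation is given below, with all names (extracted_terms, reference_terms) being illustrative rather than taken from the paper.

```python
def precision_recall(extracted_terms, reference_terms):
    """Compare automatically extracted FA terms against a validated reference list."""
    extracted = set(extracted_terms)
    reference = set(reference_terms)
    true_positives = len(extracted & reference)
    precision = true_positives / len(extracted) if extracted else 0.0
    recall = true_positives / len(reference) if reference else 0.0
    return precision, recall
```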

Development of Cooling Load Demand Program for Building in Malaysia

Air conditioning is mainly used as a medium for human comfort, and it is used most often in countries where daily temperatures are high. Scientifically, air conditioning is defined as the process of controlling the moisture, cooling, heating and cleaning of air. Without proper estimation of the cooling load, a large amount of energy is wasted because an unsuitable air-conditioning system does not match the heat gains from the surroundings: either the room is too large and the air conditioner must use more energy to cool it, or the air conditioner is too small for the room. This study develops a program to calculate the cooling load, making cooling-load estimation straightforward and allowing hourly and yearly estimates to be compared. In previous work, the developed software was not user-friendly, which can be a problem for individuals without proper knowledge of cooling-load calculation; easy access and user-friendliness should be the main objectives of any design. The program presented here allows the cooling load to be estimated by any user rather than by rule of thumb. Several limitations of the case study are assessed to ensure that it meets Malaysian building specifications. Finally, validation is performed by comparing manual calculations with the results of the developed program.
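
For illustration, one building block of a cooling-load estimate is the conduction heat gain through a wall or roof element, Q = U·A·ΔT; the sketch below assumes this simplified form with illustrative parameter names, and is not the authors' actual program.

```python
def conduction_heat_gain(u_value, area_m2, t_outside_c, t_inside_c):
    """Sensible heat gain through a wall/roof element: Q = U * A * dT (watts).

    u_value : overall heat transfer coefficient, W/(m^2*K)
    area_m2 : surface area of the element, m^2
    """
    return u_value * area_m2 * (t_outside_c - t_inside_c)

# Example: a 12 m^2 wall with U = 2.5 W/(m^2*K), 32 degC outside and 24 degC inside
# yields 2.5 * 12 * 8 = 240 W of sensible heat gain.
```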

Comparison of Compression Ability Using DCT and Fractal Technique on Different Imaging Modalities

Image compression is one of the most important applications of Digital Image Processing. Advanced medical imaging requires the storage of large quantities of digitized clinical data. Because of constrained bandwidth and storage capacity, however, a medical image must be compressed before transmission and storage. There are two types of compression methods, lossless and lossy. In lossless compression the original image is retrieved without any distortion, while in lossy compression the reconstructed image contains some distortion. The Discrete Cosine Transform (DCT) and Fractal Image Compression (FIC) are lossy compression methods. This work shows that lossy compression methods can be chosen for medical image compression without significant degradation of image quality. DCT and fractal compression using Partitioned Iterated Function Systems (PIFS) are applied to images of different modalities, namely CT scan, ultrasound, angiogram, X-ray and mammogram. Approximately 20 images are considered in each modality, and the average compression ratio and Peak Signal to Noise Ratio (PSNR) are computed and studied. The quality of the reconstructed image is assessed by its PSNR value. Based on the results, DCT yields higher PSNR values while FIC yields higher compression ratios. Hence, in medical image compression, DCT can be used wherever picture quality is preferred, and FIC wherever compression for storage and transmission is the priority, without diagnostically significant loss of picture quality.
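
For reference, the PSNR and compression-ratio figures of merit mentioned above can be computed as in the following minimal sketch (8-bit images assumed; function names are illustrative).

```python
import numpy as np

def psnr(original, reconstructed, max_value=255.0):
    """Peak Signal to Noise Ratio in dB between an original and a reconstructed image."""
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_value ** 2 / mse)

def compression_ratio(original_bytes, compressed_bytes):
    """Ratio of uncompressed size to compressed size."""
    return original_bytes / compressed_bytes
```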

Application of Artificial Neural Network to Forecast Actual Cost of a Project to Improve Earned Value Management System

This paper presents an application of an Artificial Neural Network (ANN) to forecast the actual cost of a project based on the Earned Value Management System (EVMS). For this purpose, some projects were randomly selected from a standard data set, and the necessary progress data, such as actual cost, actual percent complete, baseline cost and percent complete, were produced for five periods of each project. An ANN with five inputs, five outputs and one hidden layer was then trained to produce forecasted actual costs. The comparison between real and forecasted data shows good performance based on the Mean Absolute Percentage Error (MAPE) criterion. This approach can be applied to forecast project cost more accurately, decreasing the risk of project cost overrun, and is therefore beneficial for planning preventive actions.
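
For clarity, the MAPE criterion used to assess the forecasts can be computed as in the minimal sketch below; the names are illustrative, since the exact evaluation setup is not specified in the abstract.

```python
import numpy as np

def mape(actual, forecast):
    """Mean Absolute Percentage Error between actual and forecasted project costs."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))
```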

Comparison of Evolutionary Algorithms and their Hybrids Applied to MarioAI

Researchers have been applying artificial/computational intelligence (AI/CI) methods to computer games. In this research field, further studies are required to compare AI/CI methods with respect to each game application. In this paper, we report our experimental results comparing an evolution strategy, a genetic algorithm and their hybrids, applied to evolving controller agents for MarioAI. The GA revealed its advantage in our experiment, whereas the expected ability of the ES to exploit (fine-tune) solutions was not clearly observed. The blend crossover operator and the mutation operator of the GA appear to contribute well to exploring the vast search space.
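
As an illustration of the operators mentioned above, the following is a minimal sketch of a blend (BLX-alpha) crossover and a Gaussian mutation for real-valued genomes; the alpha, sigma and rate values are assumptions, not the paper's settings.

```python
import random

def blend_crossover(parent_a, parent_b, alpha=0.5):
    """BLX-alpha: each child gene is drawn uniformly from an interval that extends
    alpha times beyond the range spanned by the two parent genes."""
    child = []
    for a, b in zip(parent_a, parent_b):
        lo, hi = min(a, b), max(a, b)
        span = hi - lo
        child.append(random.uniform(lo - alpha * span, hi + alpha * span))
    return child

def gaussian_mutation(genome, sigma=0.1, rate=0.05):
    """Perturb each gene with Gaussian noise at a given per-gene mutation rate."""
    return [g + random.gauss(0.0, sigma) if random.random() < rate else g
            for g in genome]
```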

Bangla Vowel Characterization Based on Analysis by Synthesis

Bangla vowel characterization determines the spectral properties of Bangla vowels for efficient synthesis as well as recognition. In this paper, Bangla vowels in isolated words have been analyzed based on a speech production model within the framework of Analysis-by-Synthesis. This has led to the extraction of spectral parameters for the production model in order to produce different Bangla vowel sounds. The real and synthetic spectra are compared, and a weighted square error is computed along with the error in the formant bandwidths for efficient representation of Bangla vowels. The extracted features produce a good representation of the targeted Bangla vowels. Such a representation also plays an essential role in low-bit-rate speech coding and vocoders.
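
A weighted square error between real and synthetic spectra, as mentioned above, can be computed as in the sketch below; the actual weighting used by the authors is not given in the abstract, so a generic per-frequency weight array is assumed.

```python
import numpy as np

def weighted_spectral_error(real_spectrum, synthetic_spectrum, weights=None):
    """Weighted squared error between a measured and a synthesized magnitude spectrum."""
    real = np.asarray(real_spectrum, dtype=float)
    synth = np.asarray(synthetic_spectrum, dtype=float)
    if weights is None:
        weights = np.ones_like(real)  # uniform weighting as a default assumption
    return float(np.sum(weights * (real - synth) ** 2))
```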

Novel Ridge Orientation Based Approach for Fingerprint Identification Using Co-Occurrence Matrix

In this paper we use the property of the co-occurrence matrix for finding parallel lines in binary pictures for fingerprint identification. In the proposed algorithm, we reduce noise by filtering the fingerprint images and then convert them to binary images using a proper threshold. Next, we divide the binary images into regions containing parallel lines in the same direction. The lines in each region have a specific angle that can be used for comparison. The method is simple, performs the comparison step quickly and is robust in the presence of noise.
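
To make the co-occurrence idea concrete, the sketch below builds a 2x2 co-occurrence matrix of a binary image for a given pixel displacement; regions dominated by parallel lines along that displacement produce strong diagonal counts. The function is an illustration, not the authors' implementation.

```python
import numpy as np

def cooccurrence_matrix(binary_img, dy, dx):
    """2x2 co-occurrence matrix of a binary (0/1) image for displacement (dy, dx).

    Large diagonal entries indicate that pixels (dy, dx) apart tend to share the
    same value, which is what parallel ridge lines along that direction produce.
    """
    img = np.asarray(binary_img, dtype=int)
    h, w = img.shape
    mat = np.zeros((2, 2), dtype=int)
    for y in range(max(0, -dy), min(h, h - dy)):
        for x in range(max(0, -dx), min(w, w - dx)):
            mat[img[y, x], img[y + dy, x + dx]] += 1
    return mat
```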

Decision Support System “Crop-9-DSS” for Identified Crops

Application of expert systems in agriculture takes the form of integrated crop management decision aids encompassing water management, fertilizer management, crop protection systems and identification of implements. In order to remain competitive, the modern farmer often relies on agricultural specialists and advisors to provide information for decision-making. An expert system is normally composed of a knowledge base (information, heuristics, etc.), an inference engine (which analyzes the knowledge base) and an end-user interface (accepting inputs, generating outputs). Software named 'CROP-9-DSS', incorporating modern features such as graphics, photos and video clips, has been developed. This package serves as a decision support system for identification of pests and diseases with control measures, fertilizer recommendation, water management and identification of farm implements for the leading crops of Kerala (India), namely Coconut, Rice, Cashew, Pepper, Banana and four vegetables: Amaranthus, Bhindi, Brinjal and Cucurbits. 'CROP-9-DSS' will act as an expert system for agricultural officers, scientists in the field of agriculture and extension workers, supporting their decision-making and helping them suggest suitable recommendations.

Removal of Elemental Mercury from Dry Methane Gas with Manganese Oxides

In this study, we investigated the mercury removal efficiency of manganese oxides applied to natural gas. Fundamental studies on mercury removal with manganese oxide sorbents were carried out in a laboratory-scale fixed-bed reactor at 30 °C with a mixture of methane (20%) and nitrogen gas laden with 4.8 ppb of elemental mercury. Manganese oxides with varying surface area and crystalline phase were prepared by a conventional precipitation method. The effects of surface area, crystallinity and other metal oxides on mercury removal efficiency were investigated, as was the effect of Ag impregnation. Ag supported on metal oxides such as titania and zirconia was also used as a reference material for comparison. The characteristics of the mercury removal reaction with manganese oxide were investigated using a temperature-programmed desorption (TPD) technique. Manganese oxides showed very high Hg removal activity (about 73-93% Hg removal) on first use. The surface area of the manganese oxide samples decreased after heat treatment, resulting in a complete loss of Hg removal ability upon reuse after Hg desorption in the case of amorphous MnO2, and a 75% loss of the initial Hg removal activity for crystalline MnO2. The mercury desorption efficiency of crystalline MnO2 was very low (37%) after first-time use and high (98%) after second-time use. Residual potassium content in MnO2 may affect the thermal stability of the adsorbed Hg species. Desorption of Hg from manganese oxides occurs at much higher temperatures (with a peak at 400 °C) than from Ag/TiO2 or Ag/ZrO2. Mercury may be captured on manganese oxides in the form of mercury manganese oxide.

Accurate Optical Flow Based on Spatiotemporal Gradient Constancy Assumption

Variational methods for optical flow estimation are known for their excellent performance. The method proposed by Brox et al. [5] exemplifies the strength of that framework: it combines several concepts into a single energy functional that is then minimized according to a clear numerical procedure. In this paper we propose a modification of that algorithm, starting from the spatiotemporal gradient constancy assumption. The numerical scheme makes it possible to establish the connection between our model and the CLG(H) method introduced in [18]. Experimental evaluation carried out on synthetic sequences shows the significant superiority of the spatial variant of the proposed method. A comparison between the methods on a real-world sequence is also included.
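
For reference, the energy functional of Brox et al. with brightness and gradient constancy data terms is commonly written as below; the notation is assumed here for illustration, and the proposed spatiotemporal modification is not reproduced.

```latex
% Data term with brightness and gradient constancy, plus a flow-driven smoothness term;
% w = (u, v, 1)^T is the sought displacement and \Psi(s^2) = \sqrt{s^2 + \varepsilon^2}.
E(u,v) = \int_{\Omega} \Psi\Big( |I(\mathbf{x}+\mathbf{w}) - I(\mathbf{x})|^2
       + \gamma \, |\nabla I(\mathbf{x}+\mathbf{w}) - \nabla I(\mathbf{x})|^2 \Big)\, d\mathbf{x}
       + \alpha \int_{\Omega} \Psi\big( |\nabla u|^2 + |\nabla v|^2 \big)\, d\mathbf{x}
```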

Interfacing C and TMS320C6713 Assembly Language (Part-I)

This paper describes the interfacing of C and the TMS320C6713 assembly language, which is crucially important for many real-time applications. The interfacing of C with the assembly language of a conventional microprocessor such as the MC68000 is also presented for comparison. It should be noted that the way the C compiler passes arguments among functions in the TMS320C6713-based environment is entirely different from the way it does so on a conventional microprocessor such as the MC68000. Therefore, it is very important for a user of a TMS320C6713-based system to properly understand and follow the register conventions when interfacing C with a TMS320C6713 assembly language subroutine. It should also be noted that in some cases (examples 6-9) the endian mode of the board needs to be taken into consideration. In this paper, one method is presented in great detail; other methods will be presented in the future.

Treatment or Re-Victimizing the Victims

Severe symptoms, such as dissociation, depersonalization, self-mutilation, and suicidal ideations and gestures, are the main reasons for a person to be diagnosed with Borderline Personality Disorder (BPD) and admitted to an inpatient psychiatric hospital. However, these symptoms are also indicators of a severe traumatic history, as shown by the extensive research on the topic. Unfortunately, patients with such a clinical presentation are often treated repeatedly only for their symptomatic behavior, while the main cause of their suffering, the trauma itself, is usually left unaddressed therapeutically. The highly structured, replicable, manualized treatments fail to recognize the uniqueness of the person and to respect his or her right to experience and react in an idiosyncratic manner. Thus the communicative and adaptive meaning of such symptomatic behavior is missed; only its pathological side is recognized and subjected to correction and stigmatization, and the message that the person is damaged goods in need of fixing is conveyed once again. This time, however, the message is even more convincing for the victim, because it is sent by mental health providers, who have the credibility to make such a judgment. The result is a revolving door of very expensive hospitalizations for only a temporary and patchy fix. In this way the patients, once victims of abuse and hardship, are left invalidated, and their re-victimization is perpetuated in their search for understanding and help. Keywords: borderline personality disorder (BPD), complex PTSD, integrative treatment of trauma, re-victimization of trauma victims.

Design of Encoding Calculator Software for Huffman and Shannon-Fano Algorithms

This paper presents the design of source encoding calculator software that applies two famous algorithms in the field of information theory: the Shannon-Fano and Huffman schemes. The design makes it easy to apply the algorithms without the cumbersome, tedious and error-prone manual process of encoding signals for transmission. The work describes the design of the software, how it works, a comparison with related works, its efficiency, its usefulness in information technology studies, and its future prospects for engineers, students, technicians and others. The designed “Encodia” software has been developed, tested and found to meet the intended requirements. It is expected that this application will help students and teaching staff in their day-to-day information-theory tasks. Work is ongoing to modify the tool so that it can also be used more intensively in research on source coding.
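
For illustration, a minimal sketch of Huffman code construction (one of the two schemes handled by the software) is given below; it is a generic textbook construction, not the Encodia implementation.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table (symbol -> bit string) for the symbols of `text`."""
    freq = Counter(text)
    # heap items: (frequency, tie-breaker, {symbol: code-so-far})
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol input
        return {sym: "0" for sym in heap[0][2]}
    tie = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)  # two least frequent subtrees
        f2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

# Example: huffman_codes("abracadabra") assigns shorter codes to frequent symbols.
```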

Big Bang – Big Crunch Optimization Method in Optimum Design of Complex Composite Laminates

An accurate optimal design of laminated composite structures may present considerable difficulties due to the complexity and multi-modality of the functional design space. The Big Bang – Big Crunch (BB-BC) optimization method is a relatively new technique that has already proved to be a valuable tool for structural optimization. In the present study, the exceptional efficiency of the method is demonstrated by an example of the lay-up optimization of multilayered anisotropic cylinders based on a three-dimensional elasticity solution. It is shown that, due to its simplicity and speed, BB-BC is much more efficient for this class of problems than genetic algorithms.
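
A minimal sketch of the generic BB-BC loop is given below for a continuous minimization problem: a random "big bang" population, a fitness-weighted "big crunch" center of mass, and re-scattering around the center with a shrinking radius. The population size, iteration count and 1/k shrinkage are illustrative assumptions, not the settings of the study.

```python
import numpy as np

def big_bang_big_crunch(objective, lower, upper, pop_size=50, iterations=100):
    """Minimize `objective` over the box [lower, upper]^dim with the BB-BC scheme."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    dim = lower.size
    pop = np.random.uniform(lower, upper, size=(pop_size, dim))  # initial big bang
    best, best_f = None, np.inf
    for k in range(1, iterations + 1):
        fitness = np.array([objective(x) for x in pop])
        if fitness.min() < best_f:
            best_f, best = fitness.min(), pop[fitness.argmin()].copy()
        # Big crunch: center of mass weighted by inverse (shifted) fitness
        weights = 1.0 / (fitness - fitness.min() + 1e-12)
        center = (weights[:, None] * pop).sum(axis=0) / weights.sum()
        # New big bang: scatter around the center with a radius shrinking as 1/k
        spread = (upper - lower) * np.random.randn(pop_size, dim) / k
        pop = np.clip(center + spread, lower, upper)
    return best, best_f
```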

Effect of a Probiotic Compound in Rumen Development, Diarrhea Incidence and Weight Gain in Young Holstein Calves

It has been proven that early establishment of microbial flora in the digestive tract of ruminants has a beneficial effect on their health and productivity. A probiotic compound made from five bacteria isolated from adult bovine cattle was dosed to 15 newborn Holstein calves in order to measure its capacity to improve body weight gain and reduce diarrhea incidence. The test was performed in the municipality of Cajicá (Colombia), at 2,580 m.a.s.l., throughout the rainy season, with environmental temperatures ranging from 4 to 25 °C. Five calves were allotted to the control group (no probiotic added). Treatments 1 and 2 (5 calves per group) received 10 ml of probiotic mixes 1 and 2, respectively. The two mixes were similar in microbial composition but differed in production process. The probiotics were added to the morning milk and dosed daily for a month and then weekly for three additional months. Diarrhea incidence was measured by observing the number of animals affected in each group; each animal was weighed daily to obtain weight gain, and rumen fluid samples were extracted with an oro-esophageal catheter to determine the level of fiber and grain consumption.

Comparison of SVC and STATCOM in Static Voltage Stability Margin Enhancement

One of the major causes of voltage instability is the reactive power limit of the system. Improving the system's reactive power handling capacity via Flexible AC Transmission System (FACTS) devices is a remedy for the prevention of voltage instability and hence voltage collapse. In this paper, the effects of the SVC and STATCOM on static voltage stability margin enhancement are studied. AC and DC representations of the SVC and STATCOM are used in the continuation power flow process of the static voltage stability study. The IEEE 14-bus system is simulated to test the increase in loadability. It is found that these controllers significantly increase the loadability margin of power systems.

Measuring Pressure Wave Velocity in a Hydraulic System

The pressure wave velocity in a hydraulic system was determined using piezo pressure sensors without removing fluid from the system. The measurements were carried out in a low pressure range (0.2 – 6 bar) and the results were compared with those of other studies. This method is not as accurate as measurement with separate measurement equipment, but the fluid remains in the actual machine the whole time, and the effect of any air present in the system is taken into consideration. The amount of air is estimated by calculations and comparison with other studies. The measurement equipment can also be installed in an existing machine and programmed to measure in real time; thus, it could be used, for example, to control dampers.
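
One common way to obtain a wave velocity from two pressure sensors a known distance apart is to estimate the transit time from the cross-correlation of their signals, as in the sketch below; this is an assumed approach shown for illustration, not necessarily the method used in the study.

```python
import numpy as np

def wave_velocity(signal_a, signal_b, sensor_distance_m, sample_rate_hz):
    """Estimate pressure wave velocity from two sensor traces a known distance apart.

    The transit time is taken as the lag that maximizes the cross-correlation
    between the downstream and upstream signals.
    """
    a = np.asarray(signal_a, float) - np.mean(signal_a)
    b = np.asarray(signal_b, float) - np.mean(signal_b)
    corr = np.correlate(b, a, mode="full")
    lag_samples = np.argmax(corr) - (len(a) - 1)  # positive if b lags a
    transit_time_s = lag_samples / sample_rate_hz
    return sensor_distance_m / transit_time_s
```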

Towards Security in Virtualization of SDN

In this paper, the potential security issues introduced by the virtualization of Software Defined Networks (SDN) are analyzed. The virtualization of SDN is achieved with FlowVisor (FV). With FV, a physical network is divided into multiple isolated logical networks (slices) while the underlying resources are still shared among the slices. However, along with the benefits brought by network virtualization, it also presents some security issues. By examining the security issues in an OpenFlow network that FlowVisor slices into multiple virtual networks, we hope to obtain significant results and to stimulate further discussion on the security of SDN virtualization.

The Decentralized Nonlinear Controller of Robot Manipulator with External Load Compensation

This paper describes a newly designed decentralized nonlinear control strategy for a robot manipulator. Based on nonlinear state feedback theory, a decentralized concept is developed to address the drawbacks of previous works concerning complicated intelligent control and cost-effective sensing. The control methodology is derived in the sense of the Lyapunov theorem, so that the stability of the control system is guaranteed. The decentralized algorithm does not require information on the other joint angles and velocities. Each individual joint controller is implemented on a digital processor located near its actuator, making it possible to achieve good dynamics and modularity. Computer simulations have been conducted to validate the effectiveness of the proposed control scheme under possible uncertainties and different reference trajectories. The merit of the proposed control system is shown in comparison with a classical control system.