Estimation of Skew Angle in Binary Document Images Using Hough Transform

This paper presents two novel techniques for skew estimation of binary document images. The algorithms are based on connected component analysis and the Hough transform, and both focus on reducing the amount of input data supplied to the Hough transform. In the first method, referred to as the word centroid approach, the centroids of selected words are used for skew detection. In the second method, referred to as the dilate & thin approach, selected characters are blocked and dilated to obtain word blocks, and thinning is then applied; the final image fed to the Hough transform contains only the thinned coordinates of the word blocks. The methods succeed in reducing the computational complexity of Hough transform based skew estimation, and promising experimental results demonstrate their effectiveness.
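
A minimal sketch of the word centroid idea, assuming an OpenCV/NumPy pipeline; the component-size limits, Hough threshold, and word-selection rule below are illustrative placeholders, not the paper's exact settings.

```python
# Sketch: feed only word/component centroids to the Hough transform.
import cv2
import numpy as np

def estimate_skew(binary_img):
    """binary_img: uint8 image, text pixels = 255, background = 0."""
    # Label connected components and take their centroids.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary_img)
    # Keep medium-sized components as a rough proxy for "selected words".
    areas = stats[1:, cv2.CC_STAT_AREA]
    keep = (areas > 20) & (areas < 5000)
    pts = centroids[1:][keep].astype(int)

    # Plot only the centroids into a sparse image -> far fewer Hough votes.
    sparse = np.zeros_like(binary_img)
    sparse[pts[:, 1].clip(0, sparse.shape[0] - 1),
           pts[:, 0].clip(0, sparse.shape[1] - 1)] = 255

    # Standard Hough line transform on the sparse centroid image.
    lines = cv2.HoughLines(sparse, 1, np.pi / 180, 10)
    if lines is None:
        return 0.0
    theta = lines[0][0][1]          # angle of the dominant line's normal
    return np.degrees(theta) - 90.0 # skew of the text lines, in degrees
```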

A Dual Method for Solving General Convex Quadratic Programs

In this paper, we present a new method for solving general convex quadratic programming problems that are not necessarily strictly convex. The constraints of the problem are linear equalities and inequalities, and the variables are bounded. The suggested method combines active-set strategies with support methods. The algorithm and numerical experiments are presented, and our approach is compared with the active-set method on randomly generated problems.
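
In a standard form (not necessarily the paper's notation), the problem class described above can be written as

```latex
\min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\, x^{\top} Q x + c^{\top} x
\quad \text{s.t.} \quad A x = b, \quad C x \le d, \quad l \le x \le u,
```

where Q is symmetric positive semidefinite, so the objective is convex but not necessarily strictly convex.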

Error Rate Probability for Coded MQAM with MRC Diversity in the Presence of Cochannel Interferers over Nakagami-Fading Channels

Exact expressions for the bit-error probability (BEP) of coherent square detection of uncoded and coded M-ary quadrature amplitude modulation (MQAM), using an array of antennas with maximal ratio combining (MRC) in a flat-fading, interference-limited system in a Nakagami-m fading environment, are derived. The analysis assumes an arbitrary number of independent and identically distributed Nakagami interferers. The results for coded MQAM are computed numerically for the (24,12) extended Golay code and compared with uncoded MQAM by plotting error probabilities versus average signal-to-interference ratio (SIR) for various values of the diversity order N and the number of distinct symbols M, in order to examine the effect of cochannel interferers on the performance of the digital communication system. The diversity gains and net gains are also presented in tabular form to examine the performance of the system in the presence of interferers as the diversity order increases. The analytical results presented in this paper are expected to provide useful information for the design and analysis of digital communication systems with space diversity in wireless fading channels.
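
For background only (this is the standard fading model, not the paper's derived BEP expression), the instantaneous SNR under Nakagami-m fading follows the Gamma density

```latex
f_{\gamma}(\gamma) \;=\; \frac{m^{m}\,\gamma^{m-1}}{\Gamma(m)\,\bar{\gamma}^{\,m}}
\exp\!\left(-\frac{m\,\gamma}{\bar{\gamma}}\right), \qquad \gamma \ge 0,\; m \ge \tfrac{1}{2},
```

where \(\bar{\gamma}\) is the average SNR and m is the fading parameter; the received power of each i.i.d. Nakagami interferer has the same Gamma form.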

Comparison of an Interior Mounted Permanent Magnet Synchronous Generator with a Synchronous Reluctance Generator for a Wind Application

This article presents a performance comparison of an interior-mounted permanent magnet synchronous generator (IPMSG) with a synchronous reluctance generator (SynRG) of the same size for a wind application. It is found that, using the same geometrical dimensions, a SynRG can convert 74% of the power that an IPMSG can convert, while it has 80% of the IPMSG's weight. Moreover, the efficiency of the IPMSG is 99% at rated power, compared to 98.7% for the SynRG.

Fuzzy Ideology based Long Term Load Forecasting

Load forecasting plays a paramount role in the operation and management of power systems. Accurate estimation of future power demand for various lead times facilitates the task of generating power reliably and economically. The forecasting of future loads for a relatively large lead time (months to a few years), i.e. long-term load forecasting, is studied here. Among the various techniques used in load forecasting, artificial intelligence techniques provide greater accuracy than conventional techniques. Fuzzy logic, a very robust artificial intelligence technique, is applied in this paper to forecast load on a long-term basis. The paper gives a general algorithm for long-term load forecasting. The algorithm extends a short-term load forecasting method to long-term forecasting and concentrates not only on the forecast values of load but also on the errors incorporated into the forecast. Hence, by correcting the errors in the forecast, forecasts with very high accuracy have been achieved. The algorithm is demonstrated with data collected for the residential sector (LT2(a) type load: domestic consumers). Load is determined for three consecutive years (April 2006 to March 2009) to demonstrate the efficiency of the algorithm and to forecast for the next two years (April 2009 to March 2011).
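
A minimal illustration of fuzzy correction of a forecast error is sketched below; the membership functions, rule consequents, and numbers are illustrative assumptions, not the paper's actual model.

```python
# Toy fuzzy error correction of a long-term load forecast.
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_error_correction(forecast_mw, past_error_pct):
    """Adjust a raw forecast using the fuzzified historical forecast error (%)."""
    # Fuzzify the historical percentage error into three linguistic sets.
    mu = {
        "negative": tri(past_error_pct, -20.0, -10.0, 0.0),
        "zero":     tri(past_error_pct, -5.0, 0.0, 5.0),
        "positive": tri(past_error_pct, 0.0, 10.0, 20.0),
    }
    # Rule consequents: percentage adjustment applied to the forecast.
    adjust = {"negative": +8.0, "zero": 0.0, "positive": -8.0}
    # Weighted-average (centroid-like) defuzzification.
    num = sum(mu[s] * adjust[s] for s in mu)
    den = sum(mu.values()) or 1.0
    return forecast_mw * (1.0 + (num / den) / 100.0)

print(fuzzy_error_correction(1200.0, past_error_pct=7.5))  # forecast revised downward
```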

Using Perspective Schemata to Model the ETL Process

Data Warehouses (DWs) are repositories that contain the unified history of an enterprise for decision support. The data must be Extracted from information sources, Transformed and integrated, and Loaded (ETL) into the DW using ETL tools. These tools focus on data movement, with models used only as a means to that end. From a conceptual viewpoint, the authors want to innovate the ETL process in two ways: 1) to make the compatibility between models explicit in a declarative fashion, using correspondence assertions, and 2) to identify the instances from different sources that represent the same real-world entity. This paper presents an overview of the proposed framework for modeling the ETL process, which is based on the use of a reference model and perspective schemata. This approach provides the designer with a better understanding of the semantics associated with the ETL process.
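
Purely as an illustration of the declarative idea (the class and attribute names below are hypothetical, not the authors' formalism), a correspondence assertion can be stated as data rather than hard-coded movement logic:

```python
# Illustrative sketch: declarative correspondence assertions plus a toy
# instance-matching rule for "same real-world entity".
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class CorrespondenceAssertion:
    source_schema: str          # e.g. "CRM.customers"
    source_attribute: str       # e.g. "cust_name"
    target_schema: str          # e.g. "DW.dim_customer"
    target_attribute: str       # e.g. "customer_name"
    transform: Callable[[Any], Any] = lambda v: v   # optional conversion

assertions = [
    CorrespondenceAssertion("CRM.customers", "cust_name",
                            "DW.dim_customer", "customer_name",
                            transform=str.strip),
    CorrespondenceAssertion("ERP.clients", "client_nm",
                            "DW.dim_customer", "customer_name"),
]

def same_real_world_entity(rec_a: dict, rec_b: dict) -> bool:
    """Toy matching rule: same entity if normalized names coincide."""
    return rec_a["customer_name"].lower() == rec_b["customer_name"].lower()
```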

Geometric Data Structures and Their Selected Applications

Finding the shortest path between two positions is a fundamental problem in transportation, routing, and communications applications. In robot motion planning, the robot must pass around obstacles without touching any of them, i.e., the goal is to find a collision-free path from a starting to a target position. This task has many specific formulations depending on the shape of the obstacles, the allowable directions of movement, knowledge of the scene, etc. Research on path planning has yielded many fundamentally different approaches to its solution, mainly based on various decomposition and roadmap methods. In this paper, we show a possible use of visibility graphs in point-to-point motion planning in the Euclidean plane, and an alternative approach using Voronoi diagrams that decreases the probability of collisions with obstacles. The second application area investigated here focuses on problems of finding minimal networks connecting a set of given points in the plane, using either only straight connections between pairs of points (minimum spanning tree) or allowing the addition of auxiliary points to obtain shorter spanning networks (minimum Steiner tree).
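
A minimal sketch of one of the networks mentioned above, the Euclidean minimum spanning tree, built with Prim's algorithm (the O(n^2) variant, adequate for small planar instances; the example points are arbitrary).

```python
import math

def euclidean_mst(points):
    """points: list of (x, y) tuples; returns a list of MST edges (i, j)."""
    n = len(points)
    if n == 0:
        return []
    in_tree = [False] * n
    best_dist = [math.inf] * n   # cheapest connection of each point to the tree
    best_edge = [-1] * n
    in_tree[0] = True
    for j in range(1, n):
        best_dist[j] = math.dist(points[0], points[j])
        best_edge[j] = 0
    edges = []
    for _ in range(n - 1):
        # Pick the point outside the tree that is closest to the tree.
        k = min((j for j in range(n) if not in_tree[j]), key=lambda j: best_dist[j])
        edges.append((best_edge[k], k))
        in_tree[k] = True
        # Update cheapest connections through the newly added point.
        for j in range(n):
            if not in_tree[j]:
                d = math.dist(points[k], points[j])
                if d < best_dist[j]:
                    best_dist[j], best_edge[j] = d, k
    return edges

print(euclidean_mst([(0, 0), (2, 0), (1, 1), (5, 5)]))  # [(0, 2), (2, 1), (2, 3)]
```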

The Development of the Multi-Agent Classification System (MACS) in Compliance with FIPA Specifications

The paper investigates the feasibility of constructing a software multi-agent-based monitoring and classification system and using it to provide automated and accurate classification of end users developing applications in the spreadsheet domain. The agents function autonomously to provide continuous and periodic monitoring of Excel spreadsheet workbooks. The result is the Multi-Agent Classification System (MACS), which complies with the specifications of the Foundation for Intelligent Physical Agents (FIPA). Different technologies have been brought together to build MACS. The strength of the system is the integration of agent technology and the FIPA specifications with other technologies, namely Windows Communication Foundation (WCF) services, Service Oriented Architecture (SOA), and Oracle Data Mining (ODM). Microsoft .NET Windows-service-based agents were used to develop the monitoring agents of MACS, while the .NET WCF services, together with the SOA approach, allow agents to be distributed and to communicate over the Web in order to support monitoring and classification across multiple developers. ODM was used to automate the classification phase of MACS.

A Hybrid Machine Learning System for Stock Market Forecasting

In this paper, we propose a hybrid machine learning system based on a Genetic Algorithm (GA) and Support Vector Machines (SVM) for stock market prediction. A variety of indicators from the field of technical analysis are used as input features. We also exploit the correlation between the stock prices of different companies to forecast the price of a stock, using the technical indicators of highly correlated stocks rather than only those of the stock to be predicted. The genetic algorithm is used to select the most informative input features from among all the technical indicators. The results show that the hybrid GA-SVM system outperforms the stand-alone SVM system.
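
A compact sketch of GA-based feature selection wrapped around an SVM, using synthetic stand-in data; the population size, rates, kernel, and labels below are illustrative choices, not the paper's configuration.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))                              # stand-in for technical indicators
y = (X[:, 0] + 0.5 * X[:, 3] - X[:, 7] > 0).astype(int)     # toy up/down label

def fitness(mask):
    """Cross-validated accuracy of an SVM trained on the selected features."""
    if not mask.any():
        return 0.0
    clf = SVC(kernel="rbf", C=1.0, gamma="scale")
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

pop = rng.integers(0, 2, size=(20, X.shape[1])).astype(bool)    # random feature masks
for gen in range(15):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]                # truncation selection
    children = []
    for _ in range(len(pop) - len(parents)):
        a, b = parents[rng.integers(10)], parents[rng.integers(10)]
        cut = rng.integers(1, X.shape[1])                       # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(X.shape[1]) < 0.05                    # bit-flip mutation
        children.append(child ^ flip)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best))
```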

The Applications of Quantum Mechanics Simulation for Solvent Selection in Chemicals Separation

Quantum mechanics simulation was applied to calculate the interaction force between two molecules at the atomic level. A simple extractive distillation system is a ternary mixture consisting of two close-boiling components (A, the lower-boiling component, and B, the higher-boiling component) and a solvent (S). The quantum mechanics simulation was used to calculate the intermolecular (interaction) forces between the close-boiling components and the solvents, i.e., the A-S and B-S interactions. The requirement for a promising extractive distillation solvent is that the solvent (S) form a stronger intermolecular interaction with only one of the components (A or B) than with the other. In this study, aromatic-aromatic, aromatic-cycloparaffin, and paraffin-diolefin systems were selected to demonstrate solvent selection. This study defines a new term for screening solvents, called the relative interaction force, which is calculated from the quantum mechanics simulation. The results showed that the relative interaction force agreed well with literature data (relative volatilities from experiments), and the reasons are discussed. Finally, this study suggests that quantum mechanics results can improve relative volatility estimation for solvent screening, reducing the time and cost involved.
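
The abstract does not give the formula for the relative interaction force; purely as a labeled assumption for illustration, one plausible form is the ratio of the computed interaction energies,

```latex
\mathrm{RIF} \;=\; \frac{E_{\mathrm{int}}(A\text{--}S)}{E_{\mathrm{int}}(B\text{--}S)},
```

so that a solvent looks promising when the RIF deviates strongly from unity, i.e., when S interacts much more strongly with one of the two close-boiling components than with the other.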

Partial Connection Architecture for Mobile Computing

In mobile computing environments, many new problems arise that do not exist in distributed systems consisting of stationary hosts, caused by host mobility, sudden disconnection due to handoff in wireless networks, voluntary disconnection to reduce the power consumption of a mobile host, etc. To address these problems, we propose the Partial Connection Manager (PCM) architecture in this paper. PCM creates a limited number of mobile agents according to priority, sends them to servers in parallel, and combines the results to process the user request rapidly. By applying the proposed PCM to a mobile market agent service, we show that the mobile agent technique is well suited to the mobile computing environment and to managing the partial connection problem.
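
A minimal sketch of the dispatch-and-combine idea, modeling each "mobile agent" as a task sent to one server; the function names, priority rule, and combination step are illustrative, not the PCM implementation.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

MAX_AGENTS = 3   # limited number of agents created by the manager

def query_server(server, request):
    """Placeholder for the work a dispatched agent performs at one server."""
    return {"server": server, "offer": hash((server, request)) % 100}

def partial_connection_dispatch(servers_by_priority, request):
    # Create only the highest-priority agents, up to the allowed limit.
    targets = sorted(servers_by_priority, key=servers_by_priority.get)[:MAX_AGENTS]
    results = []
    with ThreadPoolExecutor(max_workers=MAX_AGENTS) as pool:
        futures = {pool.submit(query_server, s, request): s for s in targets}
        for fut in as_completed(futures, timeout=5):
            try:
                results.append(fut.result())
            except Exception:
                pass   # a disconnected server simply contributes no result
    # Combine the partial results into one answer for the mobile user.
    return min(results, key=lambda r: r["offer"], default=None)

print(partial_connection_dispatch(
    {"mall-a": 1, "mall-b": 2, "mall-c": 3, "mall-d": 4}, "price of item 42"))
```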

Laboratory Scale Extraction of Sugar Cane using High Electric Field Pulses

The aim of this study was to extract sugar from sugar cane using high electric field pulses (HELP) as a non-thermal cell permeabilization method. The results showed that it is possible to permeabilize sugar cane cells using HELP within very short times (less than 10 s) and at room temperature. Increasing the field strength (from 0.5 kV/cm to 2 kV/cm) and the pulse number (1 to 12) increased the permeabilization of sugar cane cells. The energy consumption during HELP treatment of sugar cane (2.4 kJ/kg) was about 100 times lower than that of thermal cell disintegration at 85 °C.

An Evaluation of Land Use Control in Hokkaido, Japan

This study focuses on an evaluation of Hokkaido, the northernmost and largest prefecture by surface area in Japan, and particularly on two points: the rivalry between kinds of land use, such as urban land versus agricultural and forestry land, in various cities and their surrounding areas, and the possibilities for forestry biomass in areas other than those mentioned above. It identifies which areas require examination of the nature of land use control and guidance by conducting land use analysis at the district level using GIS (Geographic Information Systems). The results of the analysis demonstrate that it is essential to divide the whole of Hokkaido into two categories, areas within delineated city planning areas and areas outside them, and to evaluate land use control for each. Within delineated urban areas, particularly urban districts, it is essential to re-examine land use from the point of view of compact cities or smart cities, along with evaluating land use control with a focus on the rivalry between kinds of land use such as urban land versus agricultural and forestry land. In areas outside delineated urban areas, it is desirable to build a community recycling sphere based on forest biomass utilization by evaluating land use control with respect to the possibilities for forest biomass, focusing particularly on forests within and outside city planning areas.

Night-Time Traffic Light Detection Based On SVM with Geometric Moment Features

This paper presents an effective traffic light detection method for night-time conditions. First, candidate blobs of traffic lights are extracted from the RGB color image. The input image is represented in a dominant-color domain using the color transform proposed by Ruta, and red- and green-dominant regions are selected as candidates. After candidate blob selection, a shape filter is applied for noise reduction using blob information such as length, area, bounding-box area, etc. A multi-class classifier based on an SVM (Support Vector Machine) is then applied to the candidates. Three kinds of features are used: basic features such as blob width, height, center coordinates, and area; brightness-based stochastic features; and, in particular, proposed geometric moment-based values between a candidate region and its adjacent region, which improve detection performance. The proposed system is implemented on an Intel Core CPU at 2.80 GHz with 4 GB RAM and tested on urban and rural road videos. The tests show that the proposed method using PF, BMF, and GMF reaches a detection rate of up to 93% with an average computation time of 15 ms/frame.
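
A sketch of the candidate/feature/classifier pipeline, assuming OpenCV 4.x and scikit-learn; the color thresholds, shape-filter limits, and training data are placeholders, and Hu moments stand in here for the proposed geometric moment features.

```python
import cv2
import numpy as np
from sklearn.svm import SVC

def candidate_blobs(bgr_night_frame):
    """Return contours of red/green-dominant bright regions."""
    hsv = cv2.cvtColor(bgr_night_frame, cv2.COLOR_BGR2HSV)
    red = cv2.inRange(hsv, (0, 120, 150), (10, 255, 255))
    green = cv2.inRange(hsv, (45, 120, 150), (90, 255, 255))
    mask = cv2.bitwise_or(red, green)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Shape filter: drop blobs whose area or bounding box is implausible.
    keep = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if 10 < cv2.contourArea(c) < 2000 and 0.5 < w / max(h, 1) < 2.0:
            keep.append(c)
    return keep

def blob_features(contour):
    """Basic features plus moment features for one candidate blob."""
    x, y, w, h = cv2.boundingRect(contour)
    hu = cv2.HuMoments(cv2.moments(contour)).flatten()
    return np.concatenate([[w, h, x + w / 2, y + h / 2, cv2.contourArea(contour)], hu])

# Training the multi-class SVM (traffic light / vehicle lamp / other) would use
# labeled blobs; the fit call is left commented because no labeled data is shown.
clf = SVC(kernel="rbf", decision_function_shape="ovr")
# clf.fit(np.vstack([blob_features(c) for c in labeled_contours]), labels)
```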

Programming Aid Tool for Detecting Common Mistakes of Novice Programmers in OpenMP Code

OpenMP is an API for the shared-memory multiprocessor parallel programming model. Novice OpenMP programmers often produce code containing human errors that the compiler cannot find. We investigated how the compiler copes with common mistakes that can occur in OpenMP code, using the latest version (4.4.3) of GCC for this research. It was found that GCC compiled the codes without any errors or warnings. In this paper, a programming aid tool for OpenMP programs is presented. It can check 12 common mistakes that novice programmers can make when writing OpenMP code. It is demonstrated that the programming aid tool can detect various common mistakes that GCC fails to detect.

Fatty Acids Composition of Elk, Deer, Roe Deer and Wild Boar Meat Hunted in Latvia

Game animals, namely elk (Alces alces), deer (Cervus elaphus), roe deer (Capreolus capreolus), and wild boar (Sus scrofa scrofa), hunted every autumn and winter, provide a valuable diversification of many consumer meals. In recent years, the consumption and assortment of game meat products have increased significantly, yet investigations of the biochemical composition of game meat are scarce. The meat of wild animals is more favourable for human health because it has a lower saturated fatty acid content and a higher protein content. Therefore, the aim of this investigation was to compare the biochemical composition of ungulates hunted in Latvia. Samples of wild animals were collected in different regions of Latvia, and protein, intramuscular fat, fatty acids, and cholesterol were determined; biochemical analyses of 54 samples were carried out. The results showed that the protein content (22.36-22.92%) did not differ statistically among the meat types, while elk meat samples (1.33 ± 0.88%) and roe deer samples (1.59 ± 0.59%) had significantly lower fat contents. Cholesterol content varied from 64.41 to 95.07% among the ruminant meat samples of different species. From a dietetic point of view, roe deer meat samples had the best fatty acid composition.

Hybrid Coding for Animated Polygonal Meshes

A new hybrid coding method for compressing animated polygonal meshes is presented. The paper assumes a simple representation of the geometric data: a temporal sequence of polygonal meshes, one for each discrete frame of the animated sequence. The method utilizes delta coding and an octree-based method. In this hybrid method, both the octree approach and the delta coding approach are applied to each frame of the animation sequence in parallel, and the approach that produces the smaller encoded file is chosen to encode the current frame. Given the same quality requirement, the hybrid coding method can achieve a much higher compression ratio than the octree-only or delta-only method, representing 3D animated sequences with higher compression factors while maintaining reasonable quality. It is easy to implement and has a low-cost encoding process and a fast decoding process, which make it a good choice for real-time applications.
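
A sketch of the per-frame "pick the smaller encoding" logic only. The delta coder is real; the octree coder is replaced by a simple stand-in (zlib over the raw quantized frame) because a full octree coder is beyond a short example, and the quantization and frame data are synthetic.

```python
import zlib
import numpy as np

def encode_frame(prev_q, curr_q):
    """prev_q, curr_q: quantized vertex coordinates, shape (V, 3), integer dtype."""
    delta_stream = zlib.compress((curr_q - prev_q).astype(np.int32).tobytes())
    raw_stream = zlib.compress(curr_q.astype(np.int32).tobytes())   # octree stand-in
    if len(delta_stream) <= len(raw_stream):
        return b"D" + delta_stream          # 1-byte tag: delta-coded frame
    return b"R" + raw_stream                # tag: the other coder was smaller

def decode_frame(prev_q, blob):
    data = np.frombuffer(zlib.decompress(blob[1:]), dtype=np.int32).reshape(-1, 3)
    return prev_q + data if blob[:1] == b"D" else data

# Toy animated sequence: 1000 vertices drifting slightly from frame to frame.
rng = np.random.default_rng(1)
frames = np.cumsum(rng.integers(-1, 2, size=(5, 1000, 3)), axis=0) + 512
prev = frames[0]
for f in frames[1:]:
    blob = encode_frame(prev, f)
    assert np.array_equal(decode_frame(prev, blob), f)   # lossless round trip
    prev = f
```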

Robust Power System Stabilizer Design Using Particle Swarm Optimization Technique

Power system stabilizers (PSS) are now routinely used in industry to damp out power system oscillations. In this paper, the particle swarm optimization (PSO) technique is applied to design a robust power system stabilizer. The design problem of the proposed controller is formulated as an optimization problem, and PSO is employed to search for the optimal controller parameters. By minimizing a time-domain objective function that involves the deviation in the oscillatory rotor speed of the generator, the stability performance of the system is improved. Non-linear simulation results are presented for a wide range of operating conditions, disturbances at different locations, and various fault clearing sequences to show the effectiveness and robustness of the proposed controller and its ability to provide efficient damping of low-frequency oscillations. Further, all simulation results are compared with a conventionally designed power system stabilizer to show the superiority of the proposed design approach.
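
A generic PSO sketch applied to a stand-in objective: the integral of squared speed deviation of a toy second-order model whose damping grows with the stabilizer gain. The model, parameter bounds, and PSO settings are illustrative only, not the paper's test system or tuning.

```python
import numpy as np

def objective(params):
    K, T = params                      # stabilizer gain and a washout-like time constant
    d = 0.2 + 0.8 * K / (1.0 + T)      # toy: effective damping contributed by the PSS
    w, dw, dt, cost = 0.05, 0.0, 0.01, 0.0
    for _ in range(1000):              # integrate  w'' + d*w' + 4*w = 0
        ddw = -d * dw - 4.0 * w
        dw += dt * ddw
        w += dt * dw
        cost += dt * w * w             # time-domain ISE of the speed deviation
    return cost

def pso(obj, bounds, n_particles=20, iters=50, w_in=0.7, c1=1.5, c2=1.5):
    lo, hi = np.array(bounds).T
    rng = np.random.default_rng(2)
    x = rng.uniform(lo, hi, size=(n_particles, len(bounds)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([obj(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)]
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w_in * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([obj(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)]
    return gbest, pbest_f.min()

print(pso(objective, bounds=[(0.1, 50.0), (0.01, 5.0)]))
```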

Protein Delivery from Polymeric Nanoparticles

The aim of this work was to compare the efficacy of two methods for loading proteins onto polymeric nanocarriers: adsorption and encapsulation. Preliminary protein-loading studies were done using Bovine Serum Albumin (BSA) as a model protein. Nanocarriers were prepared from poly(lactic-co-glycolic acid) (PLGA) polymer; the production methods used were two different variants of the emulsion evaporation method. The nanoparticles obtained were analyzed for size by dynamic light scattering and for BSA loading efficiency by Bradford assay. Loaded nanoparticles were then subjected to an in vitro protein dissolution test in order to study the effect of the delivery system on the release rate of the protein.

Influences of Breed, Sex and Sodium Butyrate Supplementation on the Performance, Carcass Traits and Mortality of Fattening Rabbits

Twenty-four New Zealand White rabbits (12 does and 12 bucks) and twenty-four Flanders rabbits (12 does and 12 bucks) were allotted to two feeding regimes (6 per breed, 3 males and 3 females): the first group was fed a commercial ration and the second was fed the commercial diet plus sodium butyrate (300 g/ton). The results showed that at the end of the 8-week experimental period, New Zealand White rabbits had a heavier body weight than Flanders rabbits (1934.55 ± 39.05 vs. 1802.5 ± 30.99 g), a significantly higher body weight gain, especially during the 8th week (136.1 ± 3.5 vs. 126.8 ± 1.8 g/week), a better feed conversion ratio in every week of the experiment, from the first week (3.07 ± 0.16 vs. 3.12 ± 0.10) to the 8th week (5.54 ± 0.16 vs. 5.76 ± 0.07), and a significantly higher dressing percentage (0.54 ± 0.01 vs. 0.52 ± 0.01). All carcass cuts were also significantly heavier in New Zealand White rabbits than in Flanders. Female rabbits (at the same age) had a lower body weight than males from the start of the experiment (941.1 ± 39.8 vs. 972.1 ± 33.5 g) to its end (1833.64 ± 37.69 vs. 1903.41 ± 36.93 g), gained less in every week of the experiment except the 8th week (132.1 ± 2.3 vs. 130.9 ± 3.4 g/week), and had a lower dressing percentage (0.52 ± 0.01 vs. 0.53 ± 0.01) and lighter carcass cuts than males; however, they had a better feed conversion ratio during the 1st, 7th, and 8th weeks. Addition of 300 g sodium butyrate per ton of diet increased body weight at the end of the experimental period (1882.71 ± 26.45 vs. 1851.5 ± 49.82 g), improved body weight gain in the 3rd, 4th, 5th, 6th, and 7th weeks, and significantly improved the feed conversion ratio in every week, from the 1st week (2.85 ± 0.07 vs. 3.30 ± 0.15) to the 8th week (5.51 ± 0.12 vs. 5.77 ± 0.12). The dressing percentage was also higher in the sodium butyrate-fed group than in the control group (0.53 ± 0.01 vs. 0.52 ± 0.01), and the most important result of feeding sodium butyrate was the reduction of mortality during the 8-week experiment to zero, compared with 16% in the control group.