The Effects of Detector Spacing on Travel Time Prediction on Freeways

Loop detectors report traffic characteristics in real time and are at the core of the traffic control process. Intuitively, one would expect that as the density of detection increases, so would the quality of estimates derived from detector data. However, as detector deployment increases, so do the associated operating and maintenance costs. Thus, traffic agencies often need to decide where to add new detectors and which detectors should continue receiving maintenance, given their resource constraints. This paper evaluates the effect of detector spacing on freeway travel time estimation. A freeway section (Interstate-15) in the Salt Lake City metropolitan region is examined. The research reveals that travel time accuracy does not necessarily deteriorate with increased detector spacing. Rather, the actual location of the detectors has a far greater influence on the quality of travel time estimates. The study presents an innovative computational approach that delivers optimal detector locations through a process that relies on a Genetic Algorithm formulation.
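
As a rough illustration of how such a Genetic Algorithm formulation can be set up (a minimal sketch: the candidate locations, detector count, and error function below are hypothetical placeholders, not the study's data or objective), a placement can be encoded as a subset of candidate locations and evolved against a travel-time estimation error:

```python
import random

# Hedged sketch of a GA for detector placement; all quantities are assumptions.
CANDIDATES = list(range(40))   # hypothetical candidate locations (e.g. mileposts)
N_DETECTORS = 10               # hypothetical number of detectors to place

def estimation_error(placement):
    # Placeholder proxy objective: the largest gap between adjacent detectors.
    # A real objective would evaluate travel-time estimates against ground truth.
    return max(b - a for a, b in zip(placement, placement[1:]))

def crossover(a, b):
    pool = list(set(a) | set(b))                    # inherit locations from both parents
    return tuple(sorted(random.sample(pool, N_DETECTORS)))

def mutate(placement):
    placement = list(placement)
    unused = [c for c in CANDIDATES if c not in placement]
    placement[random.randrange(N_DETECTORS)] = random.choice(unused)
    return tuple(sorted(placement))

def genetic_search(pop_size=30, generations=100):
    pop = [tuple(sorted(random.sample(CANDIDATES, N_DETECTORS)))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=estimation_error)              # lower error = fitter
        survivors = pop[: pop_size // 2]            # truncation selection
        children = [mutate(crossover(*random.sample(survivors, 2)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return min(pop, key=estimation_error)

print(genetic_search())
```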

An Approach for a Bidding Process Knowledge Capitalization

Preparation and negotiation of innovative and future projects can be characterized as a strategic decision situation involving many uncertainties and an unpredictable environment. In this paper we focus on the bidding process, which includes both cooperative and strategic decisions. Our approach for bidding process knowledge capitalization is aimed at information management in project-oriented organizations and is based on the MUSIC (Management and Use of Co-operative Information Systems) model. We show how to capitalize the company's strategic knowledge and how to organize the corporate memory. The result of the adopted approach is an improvement in corporate memory quality.

Embedding a Large Amount of Information Using High Secure Neural Based Steganography Algorithm

In this paper, we construct and implement a new steganography algorithm based on a learning system to hide a large amount of information in a color BMP image. We use adaptive image filtering and adaptive non-uniform image segmentation with bit replacement on the appropriate pixels. These pixels are selected randomly rather than sequentially, using a new concept defined by main cases with sub-cases for each byte in one pixel. Following the design steps, we derive 16 main cases with their sub-cases, which cover all aspects of embedding the input information into a color bitmap image. Four layers of security are proposed to make it difficult to break the encryption of the input information and to confuse steganalysis. A learning system is introduced at the fourth security layer through a neural network; this layer is used to increase the difficulty of statistical attacks. Our results against statistical and visual attacks are discussed before and after using the learning system, and we compare our method with a previous steganography algorithm. We show that our algorithm can efficiently embed a large amount of information, up to 75% of the image size (replacing a maximum of 18 bits per pixel), with high output quality.
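
To make the bit-replacement step concrete, a minimal sketch is shown below, assuming 6 bits per colour channel (18 bits per 24-bit pixel, the maximum rate quoted above); the random pixel selection, the 16 main cases and the neural-network security layer of the actual algorithm are not modelled here:

```python
# Hedged illustration of replacing the k least-significant bits of each colour
# channel of a 24-bit pixel with message bits.
def embed_bits(pixel, bits, k=6):
    """Embed 3*k message bits (as a string of '0'/'1') into one (r, g, b) pixel."""
    out = []
    for i, channel in enumerate(pixel):
        chunk = bits[i * k:(i + 1) * k]
        value = int("".join(chunk), 2)
        out.append((channel & ~((1 << k) - 1)) | value)   # replace the k low bits
    return tuple(out)

def extract_bits(pixel, k=6):
    """Recover the 3*k embedded bits from one pixel."""
    return "".join(format(c & ((1 << k) - 1), f"0{k}b") for c in pixel)

message = "101100110011001101"                    # 18 bits for one pixel
stego = embed_bits((200, 116, 53), list(message))
assert extract_bits(stego) == message
```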

Biometric Technology in Securing the Internet Using Large Neural Network Technology

The article examines methods for protecting citizens' personal data on the Internet using biometric identity authentication technology. It notes the potential danger arising from the threat of losing databases of biometric templates. To eliminate the threat of compromised biometric templates, it is proposed to use neural networks of large and extra-large size, which on the one hand authenticate a person by his or her biometrics with high reliability, and on the other hand make the person's biometrics unavailable for observation and analysis. The article also describes in detail the transformation of personal biometric data into an access code. Requirements are formulated for the biometrics-to-code converter regarding its behavior on the images of the "Insider", a "Stranger" and all "Strangers". The effect of neural network dimensionality on the quality of converters that keep biometric data secret within the access code is analyzed.

Research on Weakly Hard Real-Time Constraints and Their Boolean Combination to Support Adaptive QoS

Advances in computing applications in recent years have prompted the demand for more flexible scheduling models to meet QoS demands. Moreover, in practical applications, partial violations of temporal constraints can be tolerated if the violations follow a certain distribution. The traditional Liu and Layland model therefore needs to be extended to accommodate these circumstances. Two such extensions are the (m, k)-firm model and the window-constrained model. This paper studies weakly hard real-time constraints and their Boolean combination to support QoS. The fact that a practical application can tolerate some violations of a temporal constraint under a certain distribution is exploited to support adaptive QoS on an open real-time system. Experimental results show that these approaches are effective compared with traditional scheduling algorithms.
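
As an illustration of the (m, k)-firm semantics assumed here, where at least m deadlines must be met within any window of k consecutive job instances, a minimal monitor might look like the sketch below; it is not the scheduling algorithm evaluated in the paper:

```python
from collections import deque

# Hedged sketch of an (m, k)-firm constraint monitor over a sliding window.
class MKFirmMonitor:
    def __init__(self, m, k):
        self.m, self.k = m, k
        self.window = deque(maxlen=k)    # True = deadline met, False = missed

    def record(self, deadline_met):
        self.window.append(deadline_met)

    def satisfied(self):
        # Until k instances have been observed, assume the constraint holds.
        if len(self.window) < self.k:
            return True
        return sum(self.window) >= self.m

monitor = MKFirmMonitor(m=3, k=5)
for met in [True, True, False, True, False, True]:
    monitor.record(met)
    print(monitor.satisfied())
```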

Quality of Life: Expectations and Achievements of Middle Class in Kazakhstan

The improvement of quality of life is the main visible, integrated indicator of a state's well-being. More and more states pay attention to defining and achieving social standards of quality of life as part of their socio-economic development strategy. These standards are determined by the specifics of the state and by the complex of needs and interests of the individual, the family and society. The question of what the middle class is in contemporary Kazakhstan still remains open. The appearance of new social standards of quality of life is an important indicator of its successful establishment. The middle class, as an agent of social, political and economic reform, helps to improve the quality of life in the country. However, if we consider the lower and middle strata of the middle class, we can see that high social expectations and real achievements still differ significantly. The article relies on sociological data collected during a survey of household living standards in Almaty city and Almaty region, and on a case study of the cottage town "Jana Kuat".

Multicast Optimization Techniques using Best Effort Genetic Algorithms

Multicast network technology has pervaded our lives through many networking techniques and through improvements to the routing devices we use. Multicast data technology offers many applications to the user, such as high-speed voice and high-speed data services, an area presently dominated by conventional networking, cable systems and digital subscriber line (DSL) technologies, and it has advantages over other routing techniques. QoS (Quality of Service) guarantees are required in most multicast applications. We address bandwidth-delay constrained optimization using a multi-objective model and a routing approach based on a genetic algorithm that optimizes multiple QoS parameters simultaneously. The proposed approach produces non-dominated routes and demonstrates the performance and high efficiency of the GA; its improvement and high degree of optimization have been verified. We have also related the results of the multicast GA to broadband wireless in order to minimize the delay along the path.
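
A minimal sketch of the non-dominance test underlying such a multi-objective GA is given below; the routes and their QoS figures are hypothetical, and a full GA would evolve candidate multicast routes rather than enumerate them:

```python
# Hedged sketch: Pareto filtering of candidate routes on two QoS criteria,
# end-to-end delay (minimize) and available bandwidth (maximize).
routes = {
    "A-B-D":   {"delay_ms": 40, "bandwidth_mbps": 10},
    "A-C-D":   {"delay_ms": 55, "bandwidth_mbps": 20},
    "A-B-C-D": {"delay_ms": 70, "bandwidth_mbps": 12},
}

def dominates(a, b):
    """Route a dominates b if it is no worse on both criteria and better on one."""
    no_worse = (a["delay_ms"] <= b["delay_ms"]
                and a["bandwidth_mbps"] >= b["bandwidth_mbps"])
    better = (a["delay_ms"] < b["delay_ms"]
              or a["bandwidth_mbps"] > b["bandwidth_mbps"])
    return no_worse and better

pareto = [name for name, metrics in routes.items()
          if not any(dominates(other, metrics)
                     for key, other in routes.items() if key != name)]
print(pareto)   # ['A-B-D', 'A-C-D']; A-B-C-D is dominated by A-C-D
```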

Effect of Soil Tillage System upon the Soil Properties, Weed Control, Quality and Quantity Yield in Some Arable Crops

The paper presents the influence of the conventional ploughing tillage technology in comparison with minimum tillage on soil properties, weed control and yield in the case of maize (Zea mays L.), soya-bean (Glycine hispida L.) and winter wheat (Triticum aestivum L.) in a three-year crop rotation. The research was conducted at the University of Agricultural Sciences and Veterinary Medicine Cluj-Napoca, Romania. The use of minimum soil tillage systems within the three-year rotation of maize, soya-bean and wheat favours an increase in aggregate hydro-stability of 5.6-7.5% at 0-20 cm depth and 5-11% at 20-30 cm depth. The minimum soil tillage systems (paraplow, chisel or rotary harrow) are polyvalent alternatives for basic soil preparation, germination bed preparation and sowing for fields and crops with moderate loosening requirements, being optimized technologies for activating and rationally using the soil's natural fertility, reducing erosion, increasing water accumulation capacity and sowing within the optimal period. The soil tillage system influences the productivity elements of the cultivated species and, finally, the yields obtained. Thus, relative to the conventional working system, the yields registered under minimum tillage represented 89-97% in maize, 103-112% in soya-bean and 93-99% in winter wheat. The results of the investigations showed that yield reflects the influence of the soil tillage system on soil properties, plant density and weed control. Among the minimum tillage systems considered as options for replacing classic ploughing in winter wheat, the best results in terms of quality indices were obtained with paraplow, followed by rotary harrow and chisel. The variants worked with paraplow achieved quality indices close to those of the variant worked with the plough, and the protein and gluten contents were even higher. For the Ariesan variety, the highest protein content (12.50%) and gluten content (28.6%) were obtained with the paraplow variant.

Quality Assurance and Effectiveness in Kurdistan Higher Education: The Reform Process

Implementing quality assurance in higher education establishments is the main focus of the reform process currently undertaken by the Ministry of Higher Education and Scientific Research in the Kurdistan Region of Iraq. The reform agenda has involved attempts to improve academic quality and management processes in universities, technical institutions and colleges. The central challenge for the reform process is to produce change in higher education in a region whose administration is described as centralized and bureaucratic. To make these changes, there should be well-designed plans and follow-up processes in order to monitor progress and develop responses to obstacles. Lack of skills and resources, political dilemmas, poor motivation, and limited readiness to face the consequences of change are factors that will determine the success of the reform process.

Grid-HPA: Predicting Resource Requirements of a Job in the Grid Computing Environment

For complete support of Quality of Service, it is preferable that the Grid computing environment itself predict the resource requirements of a job using dedicated methods. Exact and correct prediction allows the required resources to be matched precisely with the available resources. After the execution of each job, the resources used are saved in an active database named "History". First, some attributes are extracted from the submitted job; then, according to a defined similarity algorithm, the most similar previously executed jobs are retrieved from "History" and, using statistical measures such as linear regression or averaging, the resource requirements are predicted. The new idea in this research is the use of an active database and centralized history maintenance. Implementation and testing of the proposed architecture yield prediction accuracies of 96.68% for CPU usage, 91.29% for memory usage and 89.80% for bandwidth usage.
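
A minimal sketch of such a history-based lookup is given below; the job attributes, similarity measure and history records are hypothetical, and the paper's own similarity algorithm may differ:

```python
import statistics

# Hedged sketch: predict the CPU time of a new job by retrieving the most
# similar jobs from a "History" store and averaging their recorded usage
# (a regression over one attribute could be used instead).
history = [
    {"input_mb": 100, "n_tasks": 4,  "cpu_s": 120},
    {"input_mb": 250, "n_tasks": 8,  "cpu_s": 310},
    {"input_mb": 120, "n_tasks": 4,  "cpu_s": 140},
    {"input_mb": 500, "n_tasks": 16, "cpu_s": 640},
]

def similarity(job, past):
    # Simple inverse-distance similarity over roughly normalized attributes.
    d = (abs(job["input_mb"] - past["input_mb"]) / 500
         + abs(job["n_tasks"] - past["n_tasks"]) / 16)
    return 1.0 / (1.0 + d)

def predict_cpu(job, k=2):
    ranked = sorted(history, key=lambda p: similarity(job, p), reverse=True)
    return statistics.mean(p["cpu_s"] for p in ranked[:k])

print(predict_cpu({"input_mb": 110, "n_tasks": 4}))   # ~130 s
```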

Household Demand for Solid Waste Disposal Options in Malaysia

This paper estimates the economic values of household preferences for enhanced solid waste disposal services in Malaysia. The contingent valuation (CV) method estimates an average additional monthly willingness-to-pay (WTP) in solid waste management charges of €0.77 to €0.80 for improved waste disposal service quality. The finding of a slightly higher WTP from the generic CV question than from the label-specific one further reveals a higher WTP for sanitary landfill, at €0.90, than for incineration, at €0.63. This suggests that sanitary landfill is the more preferred alternative. The logistic regression estimation procedure reveals that households' concern about where their rubbish is disposed, age, house ownership, household income and the format of the CV question are significant factors influencing WTP.

Impact of Loading Conditions on the Emission-Economic Dispatch

Environmental awareness and recent environmental policies have forced many electric utilities to restructure their operational practices to account for their emission impacts. One way to accomplish this is by reformulating the traditional economic dispatch problem so that emission effects are included in the mathematical model. This paper presents a Particle Swarm Optimization (PSO) algorithm to solve the Economic-Emission Dispatch (EED) problem, which has gained recent attention due to the deregulation of the power industry and strict environmental regulations. The problem is formulated as a multi-objective one with two competing functions, namely the economic cost and emission functions, subject to different constraints. The inequality constraints considered are the generating unit capacity limits, while the equality constraint is the generation-demand balance. A novel equality constraint handling mechanism is proposed in this paper. The PSO algorithm is tested on a 30-bus standard test system. The results obtained show that the PSO algorithm has great potential in handling multi-objective optimization problems and is capable of capturing the Pareto-optimal solution set under different loading conditions.
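
For illustration, a weighted-sum PSO on a toy three-unit system is sketched below. The cost and emission coefficients are invented, the proportional repair used for the generation-demand balance is a common simplification and not the novel equality-constraint mechanism proposed in the paper, and the paper's actual treatment is multi-objective rather than weighted-sum:

```python
import random

# Hedged sketch of PSO for a toy combined cost/emission dispatch.
PMIN, PMAX, DEMAND = [10.0, 10.0, 10.0], [100.0, 80.0, 60.0], 150.0
COST = [(0.010, 2.0, 10.0), (0.020, 1.8, 12.0), (0.030, 2.2, 8.0)]   # a, b, c
EMIS = [(0.012, 0.5, 4.0), (0.008, 0.6, 3.0), (0.015, 0.4, 5.0)]

def repair(p):
    # Scale outputs toward the demand, then clip to unit limits (approximate).
    scale = DEMAND / max(sum(p), 1e-6)
    return [min(max(x * scale, lo), hi) for x, lo, hi in zip(p, PMIN, PMAX)]

def objective(p, w=0.5):
    quad = lambda c, x: c[0] * x * x + c[1] * x + c[2]
    cost = sum(quad(c, x) for c, x in zip(COST, p))
    emis = sum(quad(e, x) for e, x in zip(EMIS, p))
    return w * cost + (1 - w) * emis          # weighted-sum scalarization

def pso(n=20, iters=200):
    pos = [repair([random.uniform(lo, hi) for lo, hi in zip(PMIN, PMAX)])
           for _ in range(n)]
    vel = [[0.0, 0.0, 0.0] for _ in range(n)]
    pbest = [list(p) for p in pos]
    gbest = min(pbest, key=objective)
    for _ in range(iters):
        for i in range(n):
            for d in range(3):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            pos[i] = repair(pos[i])
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = list(pos[i])
        gbest = min(pbest, key=objective)
    return gbest

best = pso()
print(best, sum(best))
```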

A Feature-based Invariant Watermarking Scheme Using Zernike Moments

In this paper, a novel feature-based image watermarking scheme is proposed. Zernike moments, which have invariance properties, are adopted in the scheme. In the proposed scheme, feature points are first extracted from the host image and several circular patches centered on these points are generated. The patches are used as carriers of watermark information because they can be regenerated to locate watermark embedding positions even when watermarked images are severely distorted. The Zernike transform is then applied to the patches to calculate local Zernike moments. Dither modulation is adopted to quantize the magnitudes of the Zernike moments, followed by a false alarm analysis. Experimental results show that the quality degradation of the watermarked image is visually imperceptible. The proposed scheme is very robust against image processing operations and geometric attacks.
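
A minimal sketch of dither modulation applied to a single Zernike-moment magnitude follows; the quantization step and dither values are assumed for illustration, and the actual scheme spreads the watermark over the moments of several local patches:

```python
# Hedged sketch of dither modulation (quantization index modulation) on one value.
STEP = 0.02                        # quantization step (assumed)
DITHER = {0: 0.0, 1: STEP / 2}     # one dither value per watermark bit

def embed(magnitude, bit):
    d = DITHER[bit]
    return STEP * round((magnitude - d) / STEP) + d

def detect(magnitude):
    # Choose the bit whose quantizer reconstructs the value most closely.
    return min((0, 1), key=lambda b: abs(embed(magnitude, b) - magnitude))

marked = embed(0.4137, 1)
print(marked, detect(marked))      # detect() recovers the embedded bit
```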

Analysis of the Communication Methods of an iCIM 3000 System within the Frame of Research Purpose

Current trends in manufacturing are characterized by broadening production, shortening innovation cycles, and products with new shapes, materials and functions. Production strategies focused on time have required a change from the traditional functional production structure to flexible manufacturing cells and lines. Production by automated manufacturing systems (AMS) has been one of the most important manufacturing philosophies of recent years. The main goal of the project we are involved in lies in building a laboratory housing a flexible manufacturing system consisting of at least two NC-controlled production machines (milling machine, lathe). These machines will be linked to a transport system and served by industrial robots. This flexible manufacturing system will also include a quality control station consisting of a camera system and a rack warehouse. The design, analysis and improvement of this manufacturing system, with a particular focus on the communication among devices, constitute the main aims of this paper. The key determining factors for the manufacturing system design are: the product, the production volume, the machines used, the available manpower, the available infrastructure and the legislative framework for the specific cases.

Software Architecture and Support for Patient Tracking Systems in Critical Scenarios

In this work a new platform for mobile-health systems is presented. The target application is providing decision support to rescue corps or military medical personnel in combat areas. The software architecture relies on a distributed client-server system that manages a hierarchy of wireless ad-hoc networks in which several different types of client operate, each characterized by different hardware and software requirements. The lower hierarchy levels rely on a network of completely custom devices that store clinical information and patient status; these are designed to form an ad-hoc network operating in the 2.4 GHz ISM band and complying with the IEEE 802.15.4 standard (ZigBee). Medical personnel may interact with such devices, called MICs (Medical Information Carriers), by means of a PDA (Personal Digital Assistant) or an MDA (Medical Digital Assistant), transmit the information stored in their local databases, and issue service requests to the upper hierarchy levels using the IEEE 802.11 a/b/g standard (WiFi). The server acts as a repository that stores both medical evacuation forms and associated events (e.g., a teleconsulting request). All the actors participating in the diagnostic or evacuation process may access this repository asynchronously and update its content or generate new events. The designed system aims to optimise and improve information spreading and flow among all the system components, with the goal of improving both diagnostic quality and the evacuation process.

Sampling of Variables in Discrete-Event Simulation using the Example of Inventory Evolutions in Job-Shop-Systems Based on Deterministic and Non-Deterministic Data

Time series analysis often requires data that represent the evolution of an observed variable in equidistant time steps. In order to collect such data, sampling is applied. While continuous signals may be sampled, analyzed and reconstructed applying Shannon's sampling theorem, time-discrete signals have to be dealt with differently. In this article we consider the discrete-event simulation (DES) of job-shop systems and study the effects of different sampling rates on data quality regarding completeness and accuracy of reconstructed inventory evolutions. In doing so we discuss deterministic as well as non-deterministic behavior of system variables. Error curves are used to illustrate and discuss the sampling rate's impact and to derive recommendations for its well-founded choice.
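
The zero-order-hold character of such sampling can be sketched as follows (a minimal example; the event times and inventory levels are hypothetical):

```python
import bisect

# Hedged sketch: a job-shop inventory changes only at discrete events, so each
# equidistant sample takes the value of the most recent event not later than
# the sample instant (zero-order hold).
events = [(0.0, 5), (1.3, 7), (2.1, 6), (4.8, 9), (6.0, 4)]   # (time, inventory)
times = [t for t, _ in events]

def sample(rate_hz, horizon):
    step, samples, t = 1.0 / rate_hz, [], 0.0
    while t <= horizon:
        i = bisect.bisect_right(times, t) - 1     # last event at or before t
        samples.append((t, events[i][1]))
        t += step
    return samples

print(sample(rate_hz=1.0, horizon=6.0))
# Coarser rates can miss short-lived inventory levels; e.g. the value 6, held
# only between t = 2.1 and t = 4.8, can be lost at rates below roughly 0.4 Hz.
```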

A Framework for Scalable Autonomous P2P Resource Discovery for the Grid Implementation

Recently, there have been considerable efforts towards the convergence of P2P and Grid computing in order to reach a solution that takes the best of both worlds by exploiting the advantages each offers. Augmenting the services of the Grid with the peer-to-peer model promises to eliminate bottlenecks and ensure greater scalability, availability, and fault tolerance. The Grid Information Service (GIS) directly influences the quality of service of grid platforms. Most of the proposed solutions for decentralizing the GIS are based on completely flat overlays. The main contributions of this paper are the investigation of a novel resource discovery framework for Grid implementations based on a hierarchy of structured peer-to-peer overlay networks, and the introduction of a discovery algorithm utilizing the proposed framework. Validation of the framework's performance is done via simulation. Experimental results show that the proposed organization has the advantage of being scalable while providing fault isolation, effective bandwidth utilization, and hierarchical access control. In addition, it leads to a reliable, guaranteed sub-linear search which returns results within a bounded interval of time and with a smaller amount of generated traffic within each domain.

Planning for Minimization of Socioeconomic Inequalities within Vidarbha Region, Maharashtra, India

Disparity in India has persisted since independence, causing many socioeconomic problems, and its removal has become the prime objective of planned development in India. Hence this paper attempts to study disparity at the State and Regional levels and gives inclusive planning guidelines to achieve balanced regional development. At the State level, the relative socioeconomic backwardness of the Vidarbha Region was assessed through inter-regional analysis using selected indicators such as Foreign Direct Investment, the Human Development Index and Per Capita District Domestic Product, and broad guidelines have been proposed. In the latter part, at the Regional level, the relative backwardness of districts within the Nagpur sub-region was assessed through intra-regional analysis using socioeconomic indicators, and the factors responsible for backwardness and disparity have been identified. Policy guidelines for the identified sub-region have been proposed based on the most significant factors and the extent to which they explain the backwardness of the Nagpur sub-region.

Enhanced Quality of Zeolite LSX: Studying Effect of Crystallized Containers

Low-silica type X (LSX) zeolite is a useful material in many manufacturing processes owing to advantageous properties including high surface area, stability, a microporous crystalline aluminosilicate framework and cations in extra-framework positions. The LSX was synthesized from rice husk silica, obtained by leaching with hydrochloric acid and calcination at 500 °C. To improve the synthesis method, the LSX was crystallized in a Teflon-lined autoclave, which accelerates the disappearance of amorphous particles. The mixed gel with composition 5.5 Na2O : 1.65 K2O : Al2O3 : 2.2 SiO2 : 122 H2O was crystallized in different containers (a polypropylene bottle and a Teflon-lined autoclave). The obtained powder was characterized by X-ray diffraction (XRD), X-ray fluorescence spectrometry, N2 adsorption-desorption analysis (BET surface area), scanning electron microscopy (SEM) and Fourier transform infrared spectroscopy to assess the quality of the zeolite. The results showed that the zeolite crystallized in the Teflon-lined autoclave had a crystal size of 102.8 nm, a surface area of 286 m2/g and fewer round amorphous particles compared with the zeolite crystallized in the polypropylene bottle.

An Efficient 3D Animation Data Reduction Using Frame Removal

Existing methods that store and reproduce the animation data of all frames, as in vertex animation, cannot be used in mobile device environments because they require large amounts of memory. 3D animation data reduction methods aimed at solving this problem have therefore been studied extensively, and we propose a new method as follows. First, we find and remove frames in which motion changes are small and store only the animation data of the remaining frames (those involving large motion changes). When playing the animation, the removed frame areas are reconstructed by interpolating the remaining frames. Our key contribution is to calculate the accelerations of the joints in individual frames and the standard deviations of these accelerations, using the joint locations of the relevant 3D model, in order to find and delete frames in which motion changes are small. Our method can reduce data sizes by approximately 50% or more while providing quality that is not much lower than that of the original animations. Therefore, our method is expected to be useful in mobile device environments and other environments in which memory is limited.
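
A minimal sketch of the frame-removal criterion is shown below; it uses a single scalar track in place of per-joint 3D positions and an acceleration threshold tied to the standard deviation, as an assumed simplification of the authors' criterion:

```python
import statistics

# Hedged sketch: mark frames for removal when the joint acceleration (second
# finite difference of positions) is small relative to the clip-wide std dev.
positions = [0.0, 0.1, 0.2, 0.3, 0.8, 1.6, 2.4, 2.5, 2.6, 2.7]   # hypothetical track

def accelerations(track):
    return [track[i - 1] - 2 * track[i] + track[i + 1]
            for i in range(1, len(track) - 1)]

acc = [abs(a) for a in accelerations(positions)]
threshold = statistics.pstdev(acc)              # threshold tied to the std dev

keep = [0, len(positions) - 1]                  # always keep the end frames
keep += [i + 1 for i, a in enumerate(acc) if a >= threshold]
keep = sorted(set(keep))
print(keep)                                     # retained frames

def reconstruct(frame, kept, track):
    # Linear interpolation between the nearest kept frames (playback step).
    lo = max(k for k in kept if k <= frame)
    hi = min(k for k in kept if k >= frame)
    if lo == hi:
        return track[lo]
    w = (frame - lo) / (hi - lo)
    return (1 - w) * track[lo] + w * track[hi]

print(round(reconstruct(2, keep, positions), 3))   # rebuilt value of a removed frame
```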