A Study on Optimal Determination of Partial Transmission Ratios of Helical Gearboxes with Second-Step Double Gear-Sets

This paper presents a study on the application of optimization and regression techniques to the optimal calculation of the partial ratios of helical gearboxes with second-step double gear-sets for minimal cross-sectional dimension. From the moment-equilibrium condition of a mechanical system comprising three gear units, together with their regular resistance condition, models for calculating the partial ratios of helical gearboxes with second-step double gear-sets were derived. In particular, regression analysis yields explicit models for calculating the partial ratios. These models allow the partial ratios to be determined accurately and simply.
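
As an illustration of how such explicit models can be applied, the Python sketch below assumes a hypothetical power-law regression model u1 = c * ut^k for the first-step partial ratio; the coefficients c and k are placeholders, not the fitted values from this study, and the second-step ratio follows from the total ratio ut = u1 * u2.

    def partial_ratios(u_total, c=0.58, k=0.67):
        """Split a total gearbox ratio into two partial ratios.
        c and k are placeholder regression coefficients, not the
        study's fitted values."""
        u1 = c * u_total ** k    # hypothetical explicit regression model
        u2 = u_total / u1        # second step follows from u_total = u1 * u2
        return u1, u2

    u1, u2 = partial_ratios(20.0)
    print(f"u1 = {u1:.3f}, u2 = {u2:.3f}")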

Markov Game Controller Design Algorithms

Markov games generalize Markov decision processes to a multi-agent setting. The two-player zero-sum Markov game framework offers an effective platform for designing robust controllers. This paper presents two novel controller design algorithms that use ideas from the game-theory literature to produce reliable controllers that maintain performance in the presence of noise and parameter variations. A more widely used approach to robust controller design is H∞ optimal control, which suffers from high computational demand and may at times be infeasible. Our approach generates an optimal control policy for the agent (the controller) via a simple linear program, enabling the controller to learn about the unknown environment. The controller faces an unknown environment, which in our formulation corresponds to the behavior rules of the noise, modeled as the opponent. The proposed controller architectures attempt to improve controller reliability by gradually mixing algorithmic approaches drawn from the game-theory literature with the Minimax-Q Markov game solution approach, in a reinforcement-learning framework. We test the proposed algorithms on a simulated inverted pendulum swing-up task and compare their performance against standard Q-learning.
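
As a minimal sketch of the linear program at the core of the Minimax-Q solution approach, the Python fragment below solves a single zero-sum matrix game for the controller's optimal mixed policy; the Q-learning loop and the pendulum dynamics are omitted, and scipy is assumed to be available.

    import numpy as np
    from scipy.optimize import linprog

    def solve_matrix_game(A):
        """Return the controller's optimal mixed policy and the game value
        for a zero-sum matrix game A (rows: controller actions, columns:
        opponent/noise actions, entries: controller payoffs)."""
        n_rows, n_cols = A.shape
        # Variables: [x_1 .. x_n, v]; maximize v  <=>  minimize -v.
        c = np.zeros(n_rows + 1); c[-1] = -1.0
        # For every opponent column j:  v - sum_i x_i A[i, j] <= 0.
        A_ub = np.hstack([-A.T, np.ones((n_cols, 1))])
        b_ub = np.zeros(n_cols)
        # Probabilities sum to one.
        A_eq = np.hstack([np.ones((1, n_rows)), np.zeros((1, 1))])
        b_eq = np.array([1.0])
        bounds = [(0, None)] * n_rows + [(None, None)]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        return res.x[:-1], res.x[-1]

    # Matching pennies: the optimal policy is uniform with value 0.
    policy, value = solve_matrix_game(np.array([[1.0, -1.0], [-1.0, 1.0]]))
    print(policy, value)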

Minimal Residual Method for Adaptive Filtering with Finite Termination

We present a discussion of three adaptive filtering algorithms well known for their one-step termination property, in terms of their relationship with the minimal residual method. These algorithms are the normalized least mean square (NLMS) algorithm, the affine projection algorithm (APA), and the recursive least squares (RLS) algorithm. The NLMS algorithm is shown to result from the orthogonality condition imposed on the instantaneous approximation of the Wiener equation, while the APA and RLS algorithms result from the orthogonality condition in a multi-dimensional minimal residual formulation. Further analysis of the minimal residual formulation for RLS leads to a triangular system which also possesses the one-step termination property (in exact arithmetic).
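
The following minimal Python sketch shows the NLMS update in this light: with step size mu = 1, the a-posteriori error on the current sample is driven to zero, which is the one-step termination property mentioned above. This is a generic illustration, not the paper's minimal residual derivation.

    import numpy as np

    def nlms_step(w, x, d, mu=1.0, eps=1e-8):
        """One NLMS update: with mu = 1, the weight error is projected
        onto the current regressor so the a-posteriori error on (x, d)
        is zero, i.e. one-step termination on that sample."""
        e = d - w @ x                              # a-priori error
        w_new = w + mu * e * x / (x @ x + eps)     # normalized gradient step
        return w_new, e

    rng = np.random.default_rng(0)
    w_true = np.array([0.5, -0.2, 0.1])
    w = np.zeros(3)
    for _ in range(200):
        x = rng.standard_normal(3)
        d = w_true @ x
        w, e = nlms_step(w, x, d)
    print(np.round(w, 3))    # converges toward w_true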

Symmetry Breaking and the Emergence of Branching Structures in Morphogenesis: Minimal Conditions and Mechanical Interactions between Cells

The minimal condition for symmetry breaking in the morphogenesis of a cellular population was investigated using cellular automata based on reaction-diffusion dynamics. In particular, the study looked for the possibility of the emergence of branching structures due to mechanical interactions. The model used two types of cells and an external gradient. The results showed that the external gradient influenced the movement of type-I cells, and also revealed that clusters formed by type-II cells acted as barriers to the movement of type-I cells.
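
The toy Python sketch below illustrates only the barrier effect, not the paper's full reaction-diffusion model: type-I cells drift along a fixed external gradient, and a hand-placed cluster of type-II cells blocks their movement.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 30
    grid = np.zeros((N, N), dtype=int)          # 0 empty, 1 type-I, 2 type-II
    grid[rng.integers(0, N, 40), rng.integers(0, N, 40)] = 1
    grid[10:20, 15] = 2                          # a type-II cluster as a wall

    for _ in range(50):
        ys, xs = np.nonzero(grid == 1)
        for y, x in zip(ys, xs):
            nx = min(x + 1, N - 1)               # gradient biases motion in +x
            if grid[y, nx] == 0:                 # type-II cells block movement
                grid[y, x], grid[y, nx] = 0, 1
    print((np.nonzero(grid == 1)[1] > 15).sum(), "type-I cells passed the barrier")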

A Logic Approach to Database Dynamic Updating

We introduce a logic-based framework for database updating under constraints, in which the constraints are represented as an instantiated extended logic program. Since performing an update may violate database consistency, we provide an approach to maintaining consistency and study the conditions under which the maintenance process is deterministic. We show that the complexity of the computations and decision problems arising in our framework is polynomial in each case.
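
As a rough illustration of consistency maintenance under update, the Python sketch below models constraints simply as predicates over a set of facts, a stand-in for the paper's extended logic programs rather than its actual procedure; an update whose result would violate a constraint is rejected.

    def update(db, insertions, deletions, constraints):
        """Apply an update, then enforce consistency: commit only if the
        candidate state satisfies every constraint, else keep the old state."""
        candidate = (db - set(deletions)) | set(insertions)
        if all(c(candidate) for c in constraints):
            return candidate       # consistent: commit
        return db                  # inconsistent: reject the update

    db = {("emp", "ann"), ("dept", "ann", "sales")}
    # Hypothetical constraint: every department record refers to an employee.
    no_orphan_dept = lambda s: all(("emp", e) in s for (_, e, _) in
                                   {f for f in s if f[0] == "dept"})
    db2 = update(db, [], [("emp", "ann")], [no_orphan_dept])
    print(db2 == db)    # True: the deletion was rejected, consistency maintained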

Pollution Induced Structural and Physico-Chemical Changes in Algal Community: A Case Study of River Pandu of North India

The study area receives a wide variety of wastes generated by municipalities and by industries such as paint and pigment manufacturing, metal processing, thermal power plants, and electroprocessing. Physico-chemical and structural investigation of water from the river Pandu indicated high levels of chlorides and calcium, which make the water unsuitable for human use. Algae such as Cyclotella fumida, Asterionella formosa, Cladophora glomerata, Pediastrum simplex, and Scenedesmus bijuga were the dominant pollution-tolerant species recorded under these conditions. The sensitive and less abundant species of algae included Spirogyra spp. and Merismopedia spp. The predominance of colonies of Zygnema spp., Phormidium spp., Microcystis aeruginosa, Merismopedia minima, and Pandorina morum seems to correlate with the high organic content of Pandu river water. This study assumes significance because some algae can serve as bioindicators of water pollution, and it characterizes the algal flora of a municipal drain that carries waste effluents from the industrial area of Kanpur and discharges them into the river Pandu, which flows along the southern outskirts of Kanpur city.

A Study of Replacement Policies for Warranty Products with Different Failure Rates

This paper provides a replacement policy for warranty products with different failure rates from the consumer's viewpoint. Assume that the product is replaced once within a finite planning horizon, and that the failure rate of the second product is lower than that of the first. Within the warranty period (WP), a failed product is corrected by minimal repair at no cost to the consumer. After the WP, a failed product is repaired at a fixed repair cost to the consumer. In addition, each failure incurs a fixed downtime cost to the consumer over the finite planning horizon. We derive a model of the expected total disbursement cost within a finite planning horizon and obtain some properties of the optimal replacement policy under reasonable conditions. Finally, numerical examples illustrate the features of the optimal replacement policy under various maintenance costs.
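
A hedged numerical sketch of such a cost model is given below. The failure rates, cost parameters, and the exact cost structure are illustrative assumptions rather than the paper's; the sketch only shows how minimal-repair (NHPP) failure counts under and after the warranty combine into an expected total cost that is minimized over the replacement age.

    from scipy.integrate import quad

    def expected_cost(tau, T, W, r1, r2, c_r, c_d, c_p):
        """Expected consumer cost when the first product is replaced at
        time tau in horizon T, both products have warranty W, failures
        follow minimal repair (expected failures = integral of the rate),
        post-warranty repairs cost c_r, every failure costs c_d downtime,
        and the replacement purchase costs c_p. Illustrative only."""
        n1_w = quad(r1, 0, min(tau, W))[0]          # product 1, under warranty
        n1_o = quad(r1, min(tau, W), tau)[0]        # product 1, out of warranty
        life2 = T - tau
        n2_w = quad(r2, 0, min(life2, W))[0]        # product 2, under warranty
        n2_o = quad(r2, min(life2, W), life2)[0]    # product 2, out of warranty
        return (c_r * (n1_o + n2_o)                 # repair bills after warranty
                + c_d * (n1_w + n1_o + n2_w + n2_o) # downtime on every failure
                + c_p)                              # purchase of the replacement

    r1 = lambda t: 0.2 * t    # increasing failure rate, product 1
    r2 = lambda t: 0.1 * t    # the second product fails less often
    costs = {tau: expected_cost(tau, T=10, W=2, r1=r1, r2=r2,
                                c_r=50, c_d=20, c_p=200) for tau in range(1, 10)}
    print(min(costs, key=costs.get))   # replacement age with lowest expected cost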

Potential of Agro-Waste Extracts as Supplements for the Continuous Bioremediation of Free Cyanide Contaminated Wastewater

Different agricultural waste peels were assessed for their suitability as primary substrates for the bioremediation of free cyanide (CN-) by a cyanide-degrading fungus, Aspergillus awamori, isolated from cyanide-containing wastewater. The bioremediated CN- concentrations were in the range of 36 to 110 mg CN-/L, with orange (C. sinensis) > carrot (D. carota) > onion (A. cepa) > apple (M. pumila) being chosen as suitable substrates for large-scale CN- degradation processes due to: 1) the high concentration of bioremediated CN-, 2) the total reduced sugars released into solution to sustain the biocatalyst, and 3) the minimal residual NH4+-N concentration after fermentation. The bioremediation rate constant (k) was 0.017 h-1 (0 h < t < 24 h), with an improved bioremediation rate (0.02189 h-1) observed after 24 h. The averaged nitrilase activity was ~10 U/L.
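
Assuming first-order kinetics, which is the usual reading of a rate constant in h-1, the reported constants imply the decay profile computed below; the initial concentration is taken from the upper end of the reported range purely for illustration.

    import math

    # First-order decay: C(t) = C0 * exp(-k * t), with the reported constants.
    C0 = 110.0                  # mg CN-/L, upper end of the reported range
    k1, k2 = 0.017, 0.02189     # h^-1, before and after the first 24 h
    C24 = C0 * math.exp(-k1 * 24)        # concentration after the first 24 h
    C48 = C24 * math.exp(-k2 * 24)       # 24 further hours at the faster rate
    print(f"{C24:.1f} mg/L after 24 h, {C48:.1f} mg/L after 48 h")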

Robust Design and Optimization of Production Wastes: An Application for Industries

This paper focuses on robust design and optimization of industrial production wastes. The literature was reviewed, and Clamason Industries Limited (CIL), a leading ladder-tops manufacturer, was taken as a case study. A painstaking study of the firm's shop-floor practices revealed that overproduction, waiting time, excess inventory, and defects are the major wastes impeding its progress and profitability. Design-Expert 8 software was used to apply Taguchi robust design and response surface methodology to model, analyse and optimise the cost of wastes in CIL. Waiting time and overproduction rank first and second in contributing to the cost of wastes in CIL. For minimal waste cost, the control factors of overproduction, waiting time, defects and excess inventory must be set at 0.30, 390.70, 4 and 55.70 respectively for CIL. The optimal value of the cost of wastes for the months studied was 22.3679. Finally, it was recommended that, to enhance profitability and customer satisfaction, the company adopt Shigeo Shingo's Single Minute Exchange of Dies (SMED), which will immediately tackle the waste of waiting by drastically reducing setup time.

Utilization of Advanced Data Storage Technology in the Construction Industry for a Cleaner Environment

Construction projects generally take place in uncontrolled and dynamic environments, and construction waste is a serious environmental problem in many large cities. The total amount of waste and of carbon dioxide emissions from transportation vehicles remains out of control due to the growing number of construction projects, massive urban development projects, and the lack of effective tools for minimizing adverse environmental impacts in construction. This research concerns the integrated application of automated advanced tracking and data storage technologies in environmental management to monitor and control adverse environmental impacts such as construction waste and carbon dioxide emissions. Radio Frequency Identification (RFID) integrated with the Global Positioning System (GPS) provides an opportunity to uniquely identify materials, components, and equipment and to locate and track them with minimal or no worker input. The transmission of data to the central database is carried out with the help of the Global System for Mobile Communications (GSM).
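
A hypothetical example of the kind of tracking message such a system might transmit is sketched below in Python; the field names and format are purely illustrative and are not a standard of RFID, GPS, or GSM.

    import json, time

    def tracking_record(tag_id, lat, lon, event):
        """One tracking message: the RFID tag identifies the item, GPS
        supplies the position, and the record is serialized for
        transmission to the central database over GSM."""
        return json.dumps({
            "tag_id": tag_id,         # unique RFID identifier
            "lat": lat, "lon": lon,   # GPS fix
            "event": event,           # e.g. "loaded", "dumped"
            "ts": int(time.time()),   # capture time
        })

    print(tracking_record("RFID-00042", 43.65, -79.38, "dumped"))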

Investigating the Performance of Minimax Search and Aggregate Mahalanobis Distance Function in Evolving an Ayo/Awale Player

In this paper we describe a hybrid technique combining Minimax search with aggregate Mahalanobis distance function synthesis to evolve an Awale game player. The hybrid technique helps to suggest a move in a short amount of time without consulting an endgame database. However, the effectiveness of the technique depends heavily on the training dataset of Awale strategies used. The evolved player was tested against an Awale shareware program, and the results are encouraging.
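
The Python sketch below shows how an aggregate Mahalanobis distance can score leaf positions inside a Minimax search: positions whose feature vectors lie closest to the statistics of winning training positions are preferred. The features and statistics here are toy values, not the trained model from this work, and the Minimax wrapper and feature extraction are omitted.

    import numpy as np

    def mahalanobis_score(board_features, mean, cov_inv):
        """Distance of a candidate position's feature vector from the
        mean of winning positions in the training set; smaller is better."""
        d = board_features - mean
        return float(np.sqrt(d @ cov_inv @ d))

    mean = np.array([24.0, 3.0])    # e.g. seeds captured, mobility (toy values)
    cov_inv = np.linalg.inv(np.array([[16.0, 2.0], [2.0, 4.0]]))
    for feats in ([26.0, 4.0], [10.0, 1.0]):
        print(feats, round(mahalanobis_score(np.array(feats), mean, cov_inv), 3))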

Geometry Design Supported by Minimizing and Visualizing Collision in Dynamic Packing

This paper presents a method to support dynamic packing in cases where no collision-free path can be found. The method, which is primarily based on path planning and shrinking of geometries, suggests a minimal geometry design change that results in a collision-free assembly path. A supplementary approach to optimizing the geometry design change with respect to redesign cost is also described. Supporting this dynamic packing method, a new method to shrink geometry based on vertex translation, interwoven with retriangulation, is suggested. The shrinking method requires neither tetrahedralization nor calculation of the medial axis, and it preserves the topology of the geometry, i.e. holes are neither lost nor introduced. The proposed methods are successfully applied to industrial geometries.
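
The sketch below shows only the vertex-translation idea in its simplest global form, moving every vertex toward the centroid; the paper's method interweaves translation with retriangulation and is local and topology-preserving, which this illustration does not capture.

    import numpy as np

    def shrink(vertices, t):
        """Translate every vertex toward the geometry's centroid by a
        fraction t in [0, 1). A global simplification of the paper's
        local, retriangulation-interwoven shrinking."""
        centroid = vertices.mean(axis=0)
        return vertices + t * (centroid - vertices)

    box = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0],
                    [0, 0, 1], [1, 0, 1], [1, 1, 1], [0, 1, 1]], dtype=float)
    print(shrink(box, 0.2))    # a box uniformly shrunk about its center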

Performance Analysis of Chrominance Red and Chrominance Blue in JPEG

While compressing text files is useful, compressing still image files is almost a necessity. A typical image takes up much more storage than a typical text message, and without compression, images would be extremely clumsy to store and distribute. The amount of information required to store pictures on modern computers is quite large in relation to the bandwidth commonly available to transmit them over the Internet. Image compression addresses the problem of reducing the amount of data required to represent a digital image. The performance of any image compression method can be evaluated by measuring the root-mean-square error (RMSE) and the peak signal-to-noise ratio (PSNR). The method analyzed in this paper is based on lossy JPEG compression, the most popular compression technique for color images. JPEG compression greatly reduces file size with minimal image degradation by discarding the least "important" information. In standard JPEG, both chroma components are downsampled simultaneously; in this paper we compare the results when only a single chroma component is downsampled. We demonstrate that a higher compression ratio is achieved when chrominance blue is downsampled than when chrominance red is downsampled, whereas the peak signal-to-noise ratio is higher when chrominance red is downsampled than when chrominance blue is downsampled. In particular, we use hats.jpg as a demonstration of JPEG compression using a low-pass filter and show that the image is compressed with barely any visible differences under either method.
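
The Python sketch below shows the two measurements involved: PSNR computed from the RMSE, and a 2x2 averaging downsample/upsample of a single chroma plane. The random plane stands in for the Cb or Cr channel of hats.jpg; this illustrates the metric, not the paper's results.

    import numpy as np

    def psnr(original, compressed, peak=255.0):
        """Peak signal-to-noise ratio in dB from the root-mean-square error."""
        diff = original.astype(float) - compressed.astype(float)
        rmse = np.sqrt(np.mean(diff ** 2))
        return 20 * np.log10(peak / rmse)

    def downsample_channel(ch):
        """2x2 averaging, the usual 4:2:0-style chroma reduction."""
        return ch.reshape(ch.shape[0] // 2, 2, ch.shape[1] // 2, 2).mean(axis=(1, 3))

    def upsample_channel(ch):
        return ch.repeat(2, axis=0).repeat(2, axis=1)

    rng = np.random.default_rng(0)
    cb = rng.integers(0, 256, (64, 64)).astype(float)   # stand-in chroma plane
    print(f"{psnr(cb, upsample_channel(downsample_channel(cb))):.1f} dB")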

A Robust Controller for Output Variance Reduction and Minimum Variance with Application on a Permanent Field DC-Motor

In this paper, we present experimental testing of a new algorithm that determines an optimal controller's coefficients for output variance reduction in Linear Time Invariant (LTI) systems. The algorithm features simplicity of calculation, generalizes to minimum- and non-minimum-phase systems, and can be configured to achieve reference tracking as well as variance reduction, at the cost of a compromise in the output variance. A DC-motor velocity-control experiment demonstrates the application of this new algorithm in designing the controller. The results show that the controller achieves minimum variance and reference tracking for a preset velocity reference, relying on an identified model of the motor.
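
As a toy stand-in for the variance-reduction objective, not the paper's algorithm, the Python sketch below simulates a scalar LTI plant under proportional feedback and shows how the feedback gain shifts the steady-state output variance toward its minimum.

    import numpy as np

    def output_variance(k, a=0.9, b=1.0, n=100_000, seed=0):
        """Simulated steady-state output variance of the scalar plant
        y[t+1] = a*y[t] + b*u[t] + w[t] under feedback u = -k*y, with
        unit-variance white noise w."""
        rng = np.random.default_rng(seed)
        y, ys = 0.0, []
        for w in rng.standard_normal(n):
            y = (a - b * k) * y + w      # closed-loop pole at a - b*k
            ys.append(y)
        return np.var(ys[1000:])         # discard the transient

    for k in (0.0, 0.5, 0.9):
        print(k, round(output_variance(k), 3))   # variance falls as k -> a/b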

Cost and Productivity Experiences of Pakistan with Aggregate Learning Curve

The principal focus of this study is the measurement and analysis of labor learning in Pakistan. At the aggregate economy level the study focuses on labor productivity movements, and at the large-scale manufacturing level on the cost structure, isolating the contribution of the learning curve. Analysis of the S-shaped curve suggests that learning occurs only in the lower half of the aggregate learning curve, while the other half shows retardation in learning and hence in productivity movements. The study implies the existence of learning economies in terms of cost reduction: input cost per unit produced decreases by 0.51 percent every time cumulative production output doubles.
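
The learning-curve arithmetic behind that figure can be made explicit: writing unit cost as c(Q) = c0 * Q^(-b), a 0.51 percent reduction per doubling of cumulative output corresponds to a progress ratio of 0.9949, so b = -log2(0.9949). The initial unit cost below is illustrative.

    import math

    b = -math.log2(1 - 0.0051)    # elasticity implied by 0.51% per doubling
    c0 = 100.0                    # illustrative initial unit cost
    for doublings in range(5):
        Q = 2 ** doublings
        print(Q, round(c0 * Q ** (-b), 3))   # 100.0, 99.49, 98.98, ...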

Statistical Reliability-Based Modeling of Series and Parallel Operating Systems Using Extreme Value Theory

This paper presents a new method for computing the reliability of a system whose components are arranged in a series or parallel configuration. In this method, we estimate the life distribution function of the whole structure using the asymptotic Extreme Value (EV) distribution of Type I (Gumbel theory). We use the EV distribution in its minimal mode to estimate the life distribution function of a series structure, and in its maximal mode for a parallel system. All parameters are estimated by the method of moments. The reliability function, the failure (hazard) rate, and the p-th percentile point of each function are determined. Other important indexes, such as the Mean Time To Failure (MTTF) and the Mean Time To Repair (MTTR), are computed for non-repairable and renewal systems in both series and parallel structures.
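
A minimal Python sketch of the two Gumbel modes is given below; the location and scale parameters are illustrative placeholders rather than values fitted by the method of moments to real data.

    import math

    def R_series(t, mu, sigma):
        """Reliability under the minimal (smallest-extreme-value) Gumbel
        model used for series systems: R(t) = exp(-exp((t - mu)/sigma))."""
        return math.exp(-math.exp((t - mu) / sigma))

    def R_parallel(t, mu, sigma):
        """Reliability under the maximal Gumbel model used for parallel
        systems: R(t) = 1 - exp(-exp(-(t - mu)/sigma))."""
        return 1 - math.exp(-math.exp(-(t - mu) / sigma))

    def hazard_series(t, mu, sigma):
        """Hazard rate of the minimal model: r(t) = exp((t - mu)/sigma)/sigma."""
        return math.exp((t - mu) / sigma) / sigma

    mu, sigma = 1000.0, 150.0    # illustrative location/scale parameters
    for t in (700, 1000, 1300):
        print(t, round(R_series(t, mu, sigma), 4), round(R_parallel(t, mu, sigma), 4))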

A New Edit Distance Method for Finding Similarity in DNA Sequences

The P-Bigram method is a string comparison method based on an internal two-character similarity measure. The edit distance between two strings is the minimal number of elementary editing operations required to transform one string into the other; the elementary editing operations are deletion, insertion, and substitution of characters. In this paper, we apply the P-Bigram method to the similarity problem in DNA sequences. The method provides an efficient algorithm that locates a minimal sequence of operations between two strings. We implemented the algorithm and found that our program computes smaller distances than the single-character approach. We develop the P-Bigram edit distance, relate it to similarity, and implement it using dynamic programming. The performance of the proposed approach is evaluated using the number of edits and percentage-similarity measures.
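
For reference, the classic single-character edit distance computed by dynamic programming is sketched below in Python; the bigram-level refinement of the P-Bigram method itself is omitted.

    def edit_distance(s, t):
        """Classic dynamic-programming edit distance: dp[i][j] is the
        minimal number of insertions, deletions, and substitutions
        turning s[:i] into t[:j]."""
        m, n = len(s), len(t)
        dp = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m + 1):
            dp[i][0] = i
        for j in range(n + 1):
            dp[0][j] = j
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                cost = 0 if s[i - 1] == t[j - 1] else 1
                dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                               dp[i][j - 1] + 1,         # insertion
                               dp[i - 1][j - 1] + cost)  # substitution/match
        return dp[m][n]

    s, t = "ACGTTGA", "ACTTGAC"
    d = edit_distance(s, t)
    print(d, round(1 - d / max(len(s), len(t)), 3))   # edits and % similarity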

Economic Effects and Energy Use Efficiency of Incorporating Alfalfa and Fertilizer into Grass-Based Pasture Systems

A ten-year grazing study was conducted at the Agriculture and Agri-Food Canada Brandon Research Centre in Manitoba to study the effect of alfalfa inclusion and fertilizer (N, P, K, and S) addition on economics and efficiency of non-renewable energy use in meadow brome grass-based pasture systems for beef production. Fertilizing grass-only or alfalfa-grass pastures to full soil test recommendations improved pasture productivity, but did not improve profitability compared to unfertilized pastures. Fertilizing grass-only pastures resulted in the highest net loss of any pasture management strategy in this study. Adding alfalfa at the time of seeding, with no added fertilizer, was economically the best pasture improvement strategy in this study. Because of moisture limitations, adding commercial fertilizer to full soil test recommendations is probably not economically justifiable in most years, especially with the rising cost of fertilizer. Improving grass-only pastures by adding fertilizer and/or alfalfa required additional non-renewable energy inputs; however, the additional energy required for unfertilized alfalfa-grass pastures was minimal compared to the fertilized pastures. Of the four pasture management strategies, adding alfalfa to grass pastures without adding fertilizer had the highest efficiency of energy use. Based on energy use and economic performance, the unfertilized alfalfa-grass pasture was the most efficient and sustainable pasture system.

An Efficient Multi Join Algorithm Utilizing a Lattice of Double Indices

In this paper, a novel multi-join algorithm for joining multiple relations is introduced. The algorithm is based on a hash-based join of two relations that produces a double index. This is done by scanning the two relations once; but instead of moving the records into buckets, a double index is built. This eliminates the collisions that can occur with a complete hash algorithm. The double index is divided into join buckets of similar categories from the two relations, and buckets with similar keys are then joined to produce joined buckets. This leads, in the end, to a complete join index of the two relations without actually joining the relations themselves. The time complexity required to build the join index of two categories is O(m log m), where m is the size of each category, giving a total time complexity of O(n log m) over all buckets. The join index is used to materialize the joined relation if required; otherwise, it is used along with the join indices of other relations to build a lattice for multi-join operations with minimal I/O requirements. The lattice of join indices can be fitted into main memory to reduce the time complexity of the multi-join algorithm.
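
A simplified Python sketch of the double-index idea is given below; for brevity it groups record ids by join key with hashing rather than building the sorted structure that yields the O(m log m) bound, but it shows how buckets with equal keys combine into a join index without materializing the join.

    from collections import defaultdict

    def double_index(r, s, key_r, key_s):
        """Build a join index of two relations: one scan of each relation
        groups record ids by join key (the 'double index'), then buckets
        with equal keys are combined into (rid, sid) pairs; the join is
        never materialized. A sketch, not the paper's exact structure."""
        idx_r, idx_s = defaultdict(list), defaultdict(list)
        for rid, rec in enumerate(r):       # single scan of relation r
            idx_r[rec[key_r]].append(rid)
        for sid, rec in enumerate(s):       # single scan of relation s
            idx_s[rec[key_s]].append(sid)
        return {k: [(i, j) for i in idx_r[k] for j in idx_s[k]]
                for k in idx_r.keys() & idx_s.keys()}

    emp  = [("ann", 10), ("bob", 20), ("eve", 10)]
    dept = [(10, "sales"), (20, "hr")]
    print(double_index(emp, dept, key_r=1, key_s=0))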

An Enhanced Tool for Implementing Dialogue Forms in Conversational Applications

Natural Language Understanding (NLU) systems will not be widely deployed unless they are technically mature and cost-effective to develop. Cost-effective development hinges on the availability of tools and techniques enabling the rapid production of NLU applications with minimal human resources. Further, these tools and techniques should allow quick development of applications in a user-friendly way and should be easy to upgrade in order to continuously follow evolving technologies and standards. This paper presents a visual tool for structuring and editing dialogue forms, the key element driving conversation in NLU applications based on IBM technology. The main focus is on the basic component used to describe human-machine interactions of that kind, the Dialogue Manager. In essence, we illustrate a tool that enables the visual representation of the Dialogue Manager, mainly during the implementation phase.