On Standardizing the Metal Die of Punch and Matrix by Mechanical Desktop Software

In industry, one of the most important subjects is the die and its characteristics: various punch and matrix metal dies are used for cutting and forming different mechanical pieces. However, the common parts that form the main die frame are often not proportioned to the pieces and dies, so using a so-called common part for frames within specified dimension ranges can reduce design time, warehouse space and manufacturing costs. By making their shapes and dimensions uniform, the parts in dies become common die parts. The common parts of a punch and matrix metal die are the bolster, guide bush, guide pillar and shank. In this paper, the common parts and the parameters relevant to selecting each of them are first studied as primary information; then the selection and design of mechanical parts based on the Mechanical Desktop software is introduced and investigated, so that developing this software can standardize the common metal parts of punch and matrix dies. These studies will be very useful to designers, and their application offers manufacturers considerable advantages in reducing the space occupied by dies.

Efficient Numerical Model for Studying Bridge Pier Collapse in Floods

High-level and high-velocity flood flows are potentially harmful to bridge piers, as evidenced by many toppled piers, among which single-column piers are considered the most vulnerable. The flood flow characteristic parameters, including drag coefficient, scouring and vortex shedding, are built into a pier-flood interaction model to investigate structural safety against flood hazards, considering the effects of local scouring, hydrodynamic forces and vortex-induced resonance vibrations. By feeding the pier-flood simulation results into a neural network code, two cases of pier toppling that occurred during typhoons were re-examined: (1) a bridge overcome by a flash flood near a mountainside; (2) a bridge washed away in a flood across a wide channel near the estuary. The modeling procedures and simulations are capable of identifying the probable causes of the toppled bridge piers during heavy floods, which include excessive pier bending moments and resonance in structural vibrations.

Text Mining Technique for Data Mining Application

Applying knowledge discovery techniques to unstructured text is termed knowledge discovery in text (KDT), text data mining or text mining. The decision tree approach is among the most useful for classification problems: a tree is constructed to model the classification process, and there are two basic steps in the technique, building the tree and applying it to the database. This paper describes a proposed C5.0 classifier that adds rulesets, cross-validation and boosting to the original C5.0 in order to reduce the error rate. The feasibility and benefits of the proposed approach are demonstrated on a medical data set, hypothyroid. It is shown that the performance of a classifier on the training cases from which it was constructed gives a poor estimate of its accuracy on new cases; by sampling, or by using a separate test file, the classifier is instead evaluated on cases that were not used to build it. If the cases in hypothyroid.data and hypothyroid.test were shuffled and divided into a new 2772-case training set and a 1000-case test set, C5.0 might construct a different classifier with a lower or higher error rate on the test cases. An important feature of See5 is its ability to generate classifiers called rulesets; the ruleset achieves an error rate of 0.5% on the test cases. The standard errors of the means provide an estimate of the variability of the results. One way to obtain a more reliable estimate of predictive accuracy is f-fold cross-validation: the error rate of a classifier produced from all the cases is estimated as the ratio of the total number of errors on the hold-out cases to the total number of cases. The Boost option with x trials instructs See5 to construct up to x classifiers in this manner; trials over numerous datasets, large and small, show that on average 10-classifier boosting reduces the error rate on test cases by about 25%.
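The f-fold cross-validation estimate described above can be sketched as follows; a trivial majority-class learner stands in for C5.0, and the data are synthetic, chosen only to illustrate the pooled error-rate computation:

```python
import random

def f_fold_error(cases, labels, train_fn, f=10, seed=0):
    """Shuffle the cases, split them into f blocks, hold each block out once,
    and report the pooled error rate: total errors on hold-out cases divided
    by the total number of cases."""
    idx = list(range(len(cases)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::f] for i in range(f)]
    errors = 0
    for hold_out in folds:
        ho = set(hold_out)
        train = [i for i in idx if i not in ho]
        clf = train_fn([cases[i] for i in train], [labels[i] for i in train])
        errors += sum(clf(cases[i]) != labels[i] for i in hold_out)
    return errors / len(cases)

# A trivial majority-class "classifier" stands in for C5.0 here.
def majority(train_cases, train_labels):
    pred = max(set(train_labels), key=train_labels.count)
    return lambda case: pred

data = [[x] for x in range(100)]
labels = [0] * 90 + [1] * 10           # 10% minority class
err = f_fold_error(data, labels, majority, f=10)
print(err)                              # majority vote errs on the minority class
```

Each case is held out exactly once, so the pooled estimate uses every case as a test case without ever testing a classifier on its own training data.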

Kinetics of Hydrodesulphurization of Diesel: Mass Transfer Aspects

In order to meet environmental norms, Indian fuel policy aims at producing ultra-low-sulphur diesel (ULSD) in the near future. A catalyst for meeting such requirements has been developed, and the kinetics of this catalytic process are being investigated. In the present investigation, the effect of mass transfer on the kinetics of ultra-deep hydrodesulphurization (UDHDS) to produce ULSD has been studied in order to determine the intrinsic kinetics over a pre-sulphided catalyst. Experiments were carried out in a continuous-flow micro-reactor operated in the temperature range of 330 to 360 °C at a WHSV of 1 h⁻¹ and a pressure of 35 bar, and the kinetic parameters were estimated. Based on the derived rate expression and the estimated parameters, the optimum operating range has been determined for this UDHDS catalyst to obtain the ULSD product.
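The temperature dependence underlying such kinetic parameter estimation is commonly captured by the Arrhenius law; the sketch below fits an apparent activation energy to hypothetical rate constants over the reactor's 330 to 360 °C range (the values are illustrative, not the paper's data, and the paper's actual rate expression may be more complex):

```python
import math

# Hypothetical first-order rate constants k (1/h) at the quoted reactor
# temperatures; these values are invented for illustration only.
T_C = [330, 340, 350, 360]
k   = [0.82, 1.10, 1.46, 1.91]

# Arrhenius: ln k = ln A - Ea/(R T)  ->  linear fit of ln k against 1/T.
R = 8.314  # J/(mol K)
x = [1.0 / (t + 273.15) for t in T_C]
y = [math.log(v) for v in k]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
        sum((xi - xbar) ** 2 for xi in x)
Ea = -slope * R                       # apparent activation energy, J/mol
A = math.exp(ybar - slope * xbar)     # pre-exponential factor, 1/h
print(f"Ea = {Ea/1000:.0f} kJ/mol, A = {A:.3g} 1/h")
```

With four temperature points a simple linear regression of ln k against 1/T is enough to recover both parameters.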

Recursive Similarity Hashing of Fractal Geometry

A new technique of topological multi-scale analysis is introduced. By performing clustering recursively to build a hierarchy, and analyzing the co-scale and intra-scale similarities, an Iterated Function System can be extracted from any data set. The study of fractals shows that this method is efficient at extracting self-similarities, and can find elegant solutions to the inverse problem of building fractals. The theoretical aspects and practical implementations are discussed, together with examples of analyses of simple fractals.

The Place and Effects of Information Management in Corporate Identity

Corporate identity, which offers several advantages, such as integrating employees with their corporation, distinguishing the corporation from its competitors, and making it recognizable to the masses, is the sum of the distinctive features that a corporation has. The fact that information takes part in production as a more important component than labor and capital has required corporations to be reorganized as information-based. Information and its management have therefore reached a fundamental and pervasive position in sustaining competitive advantage. Thanks to information management, which organizes information and makes it reachable and available, information can be produced in line with a specific purpose and used in all corporate processes. As an auxiliary power for increasing the economic potential, efficiency and productivity of the corporation, corporate identity consists of four components: corporate philosophy, corporate design, corporate behavior and corporate communication. In this study, the effects of information management on corporate identity are discussed in terms of these four elements.

Evaluating the Effect of Farmers’ Training on Rice Production in Sierra Leone: A Case Study of Rice Cultivation in Lowland Ecology

This study evaluates the effects of a farmers' training program on the adoption of improved farming practices, the output of rice farming, and the income and profit from rice farming, employing ex-post non-experimental data from Sierra Leone. It was established that participating in the farmers' training program increased the likelihood of adopting the improved farming practices that were introduced in the study area. Through the training program, the proceeds from rice production were also found to have increased considerably. These results are in line with the assumption that one of the main constraints on the growth of agricultural output, particularly rice cultivation, in most African states is the lack of efficient extension programs.

Parallel Direct Integration Variable Step Block Method for Solving Large System of Higher Order Ordinary Differential Equations

The aim of this paper is to investigate the performance of a two point block method developed for two processors for directly solving non-stiff large systems of higher order ordinary differential equations (ODEs). The method calculates the numerical solution at two points simultaneously and produces two new equally spaced solution values within a block, making it possible to assign the computational task at each point to its own processor. The algorithm was developed in the C language, and the parallel computation was carried out in a shared memory environment. Numerical results are given to compare the parallel timings of the developed method against the sequential timings. For large problems, the parallel implementation achieved a speed-up of 1.95 and an efficiency of 98% on the two processors.
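The reported parallel performance figures follow from the standard definitions of speed-up and efficiency; the timings below are illustrative assumptions chosen only to reproduce numbers of the same order, not measurements from the paper:

```python
def speedup(t_seq, t_par):
    # Speed-up: sequential wall-clock time over parallel wall-clock time.
    return t_seq / t_par

def efficiency(t_seq, t_par, p):
    # Efficiency: speed-up divided by the number of processors p.
    return speedup(t_seq, t_par) / p

# Illustrative timings (not from the paper): a 200 s sequential run against
# a 102.5 s two-processor run gives figures of the reported order.
s = speedup(200.0, 102.5)
e = efficiency(200.0, 102.5, 2)
print(f"speed-up = {s:.2f}, efficiency = {e:.0%}")
```

An efficiency near 100% on two processors indicates that the two-point block structure keeps both processors busy with almost no idle time.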

Stress Relaxation of Date at Different Temperatures and Moisture Contents of the Product: A New Approach

Iran is one of the largest producers of dates in the world. However, due to a lack of information about their viscoelastic properties, much of the production is downgraded during harvesting and post-harvest processes. In this study, the effects of the temperature and moisture content of the product on stress relaxation characteristics were investigated. Freshly harvested dates (kabkab) at the tamar stage were placed in a controlled-environment chamber to obtain different temperature levels (25, 35, 45 and 55 °C) and moisture contents (8.5, 8.7, 9.2, 15.3, 20 and 32.2% d.b.). A texture analyzer (TA.XT2, Stable Micro Systems, UK) was used to apply uniaxial compression tests. A chamber capable of controlling temperature was designed and fabricated around the plunger of the texture analyzer to control the temperature during the experiments. As a new approach, a CCD camera (A4tech, 30 fps) was mounted on a cylindrical glass probe to scan and record the contact area between the date and the disk. The pictures were then analyzed using the image processing toolbox of the Matlab software. Individual date fruits were uniaxially compressed at a speed of 1 mm/s, applying a constant strain of 30% of the thickness of the date to the horizontally oriented fruit. To select a suitable model for describing the stress relaxation of dates, the experimental data were fitted with three well-known stress relaxation models: the generalized Maxwell, Nussinovitch and Peleg models. The constants in these models were determined and correlated with the temperature and moisture content of the product using non-linear regression analysis. It was found that the generalized Maxwell and Nussinovitch models describe the viscoelastic characteristics of date fruits more appropriately than the Peleg model.
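Fitting a relaxation model of the kind compared above can be sketched for a one-term generalized Maxwell model; the stress-time data and the assumed equilibrium stress below are synthetic, shown only to illustrate the procedure (the paper fits the full multi-term models by non-linear regression):

```python
import math

# Illustrative relaxation data (stress in kPa vs time in s); the values are
# synthetic, generated only to demonstrate the fitting procedure.
t_data = [0, 5, 10, 20, 40, 80]
sigma  = [120.0, 95.1, 76.8, 52.3, 29.8, 18.1]

# One-term generalized Maxwell model: sigma(t) = s_inf + s1 * exp(-t / tau).
# With an assumed equilibrium stress s_inf, ln(sigma - s_inf) is linear in t,
# so tau and s1 can be recovered by a simple linear fit.
s_inf = 15.0
x = t_data
y = [math.log(s - s_inf) for s in sigma]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
slope = sum((a - xbar) * (b - ybar) for a, b in zip(x, y)) / \
        sum((a - xbar) ** 2 for a in x)
tau = -1.0 / slope                    # relaxation time, s
s1 = math.exp(ybar - slope * xbar)    # decaying stress component, kPa
print(f"tau = {tau:.1f} s, s1 = {s1:.1f} kPa")
```

Each additional Maxwell element adds another exponential term, which is why the full model requires non-linear rather than linearized regression.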

From Hype to Ignorance – A Review of 30 Years of Lean Production

Lean production (or lean management) has gained popularity in several waves. The last three decades have been filled with numerous attempts to apply these concepts in companies; however, they have been only partially successful. The roots of lean production can be traced back to Toyota's just-in-time production. This concept, which according to the research of Womack, Jones and Roos at MIT was employed by Japanese car manufacturers, became popular under the international names "lean production" and "lean manufacturing", and was termed "Schlanke Produktion" in Germany. This contribution presents a review of lean production in Germany over the last thirty years: its development, trial and error, and implementation.

Evolutionary Approach for Automated Discovery of Censored Production Rules

In the recent past, there has been increasing interest in applying evolutionary methods to Knowledge Discovery in Databases (KDD), and a number of successful applications of Genetic Algorithms (GA) and Genetic Programming (GP) to KDD have been demonstrated. The most predominant representation of the discovered knowledge is the standard Production Rule (PR) of the form If P Then D. PRs, however, are unable to handle exceptions and do not exhibit variable precision. Censored Production Rules (CPRs), an extension of PRs proposed by Michalski and Winston, exhibit variable precision and support an efficient mechanism for handling exceptions. A CPR is an augmented production rule of the form If P Then D Unless C, where C (the censor) is an exception to the rule. Such rules are employed in situations in which the conditional statement 'If P Then D' holds frequently and the assertion C holds rarely. With a rule of this type we are free to ignore the exception conditions when the resources needed to establish their presence are tight, or when there is simply no information available as to whether they hold or not. Thus, the 'If P Then D' part of the CPR expresses the important information, while the Unless C part acts only as a switch that changes the polarity of D to ~D. This paper presents a classification algorithm, based on an evolutionary approach, that discovers comprehensible rules with exceptions in the form of CPRs. The proposed approach has a flexible chromosome encoding in which each chromosome corresponds to a CPR. Appropriate genetic operators are suggested, and a fitness function is proposed that incorporates the basic constraints on CPRs. Experimental results are presented to demonstrate the performance of the proposed algorithm.
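The CPR semantics described above can be made concrete in a few lines; the predicates and facts below are hypothetical, chosen only to illustrate how the censor flips the polarity of D:

```python
# Evaluate a Censored Production Rule "If P Then D Unless C": when the
# censor C holds, the decision D is flipped to its negation ~D.
def cpr(premise, decision, censor):
    def apply(facts):
        if not premise(facts):
            return None                   # rule does not fire
        if censor is not None and censor(facts):
            return ("not", decision)      # exception: polarity of D flipped
        return decision
    return apply

# Hypothetical rule: "If bird Then flies Unless penguin".
rule = cpr(lambda f: "bird" in f, "flies", lambda f: "penguin" in f)
print(rule({"bird"}))               # → 'flies'
print(rule({"bird", "penguin"}))    # → ('not', 'flies')
print(rule({"fish"}))               # → None
```

When checking the censor is too costly, the same rule can be applied with the censor skipped, which reduces it to the plain PR 'If P Then D' at the cost of some precision.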

A Microscopic Simulation Model for Earthmoving Operations

Earthmoving operations are a major part of many construction projects. Because of the complexity and fast-changing environment of such operations, planning and estimating are crucial at both the planning and operational levels. This paper presents the framework of a microscopic discrete-event simulation system for modeling earthmoving operations and conducting productivity estimations at an operational level. A prototype has been developed to demonstrate the applicability of the proposed framework, and the simulation system is presented via a case study based on an actual earthmoving project. The case study shows that the proposed simulation model is capable of evaluating alternative operating strategies and resource utilization at a very detailed level.
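A discrete-event core of the kind such a framework relies on can be sketched as follows; the earthmoving cycle, its durations and the fleet size are illustrative assumptions, not figures from the case study:

```python
import heapq

# Minimal discrete-event sketch of an earthmoving cycle (load -> haul ->
# dump -> return) with one loader, for estimating productivity.
LOAD, HAUL, DUMP, RETURN = 3.0, 10.0, 1.5, 8.0    # minutes per activity
TRUCKS, HORIZON = 4, 480.0                        # fleet size, shift length (min)

events = [(0.0, i, "arrive") for i in range(TRUCKS)]   # (time, truck, state)
heapq.heapify(events)
loader_free, loads = 0.0, 0
while events:
    t, truck, state = heapq.heappop(events)
    if t > HORIZON:
        break
    if state == "arrive":               # queue for the single loader
        start = max(t, loader_free)
        loader_free = start + LOAD
        heapq.heappush(events, (loader_free, truck, "loaded"))
    else:                               # haul, dump, return, then re-queue
        loads += 1
        heapq.heappush(events, (t + HAUL + DUMP + RETURN, truck, "arrive"))
print(f"{loads} loads in {HORIZON/60:.0f} h")
```

Swapping the fixed durations for sampled ones, or changing the fleet size, turns the same event loop into a tool for comparing operating strategies, which is the role the prototype plays in the case study.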

Solving Part Type Selection and Loading Problem in Flexible Manufacturing System Using Real Coded Genetic Algorithms – Part I: Modeling

This paper and its companion (Part 2) deal with the modeling and optimization of two NP-hard problems in the production planning of flexible manufacturing systems (FMS): the part type selection problem and the loading problem. The two problems are strongly related and heavily influence the system's efficiency and productivity. They become even harder when operation flexibilities, such as the possibility of processing an operation on alternative machines with alternative tools, are considered. The problems have been modeled and solved simultaneously using real coded genetic algorithms (RCGA), which use an array of real numbers as the chromosome representation; these real numbers can be decoded into the part type sequence and the machines used to process the part types. This first part focuses on modeling the problems and discusses how the novel chromosome representation can be applied to solve them. The second part will discuss the effectiveness of the RCGA in solving various test bed problems.
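One common way an array of real numbers can be decoded into a part type sequence and machine assignments is the random-keys scheme sketched below; the paper's actual decoding may differ, and the gene values and machine count here are hypothetical:

```python
# Decode a real coded chromosome into a part type sequence and a machine
# assignment (random-keys style decoding).
def decode(chromosome, machines):
    # Sorting the gene values yields a permutation: the part type order.
    order = sorted(range(len(chromosome)), key=lambda i: chromosome[i])
    # The same genes can also pick a machine for each part type.
    assignment = [int(g * machines) % machines for g in chromosome]
    return order, assignment

genes = [0.72, 0.15, 0.98, 0.41]        # one gene in [0, 1) per part type
seq, machine = decode(genes, machines=3)
print("sequence:", seq)                  # → [1, 3, 0, 2]
print("machines:", machine)              # → [2, 0, 2, 1]
```

Because any array of reals decodes to a valid sequence, standard real-valued crossover and mutation operators never produce infeasible permutations, which is a key attraction of this representation.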

Identifying Significant Factors of Brick Laying Process through Design of Experiment and Computer Simulation: A Case Study

Improving performance measures in construction processes has been a major concern for managers and decision makers in the industry. They seek ways to recognize the key factors that have the largest effect on the process; identifying such factors can guide them to focus on the right parts of the process in order to gain the best possible result. In the present study, design of experiments (DOE) has been applied to a computer simulation model of a brick laying process to determine the significant factors, with productivity chosen as the response of the experiment. To this end, four controllable factors and their interactions were examined, and the best level was calculated for each factor. The results indicate that three factors, namely the labor of brick, the labor of mortar and the inter-arrival time of mortar, along with the interaction of the labor of brick and the labor of mortar, are significant.

Modeling of Dielectric Heating in a Radio-Frequency Applicator Optimized for Uniform Temperature by Means of Genetic Algorithms

The paper presents an optimization study based on genetic algorithms (GAs) for a radio-frequency applicator used in heating dielectric band products. The weakly coupled electro-thermal problem is analyzed using 2D FEM. The design variables in the optimization process are the voltage of a supplementary "guard" electrode and six geometric parameters of the applicator. Two objective functions are used: temperature uniformity and the total active power absorbed by the dielectric. Both mono-objective and multi-objective formulations are implemented in the GA optimization.

Hydrogen Integration in Petrochemical Complexes, Using Modified Automated Targeting Method

Owing to the extensive use of hydrogen in refining and petrochemical units, it is essential to manage the hydrogen network in order to make the most efficient use of hydrogen. On the other hand, hydrogen is an important byproduct that is not properly used in petrochemical complexes and is mostly sent to the fuel system. Few works have been reported in the literature on improving hydrogen networks in petrochemical complexes. In this study, a comprehensive analysis is carried out on petrochemical units using a modified automated targeting technique, which is applied to determine the minimum hydrogen consumption. Applying the modified targeting method to two petrochemical cases showed a significant reduction in the required fresh hydrogen.

An Adaptive Model for Blind Image Restoration using Bayesian Approach

Image restoration involves the elimination of noise; filtering techniques have been used to restore images for the last five decades. In this paper, we consider the problem of restoring an image degraded by a blur function and corrupted by random noise. A method for reducing additive noise in images by explicit analysis of local image statistics is introduced and compared to other noise reduction methods. The proposed method, which makes use of an a priori noise model, has been evaluated on various types of images. Bayesian-based algorithms and image processing techniques are described and substantiated with experiments using MATLAB.
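Noise reduction by explicit analysis of local image statistics can be sketched with a Lee/adaptive-Wiener style filter; this is a generic illustration of the idea, not the paper's exact estimator, and the image and noise variance below are invented:

```python
# Adaptive filter driven by local statistics: shrink each pixel toward the
# local mean in flat regions (local variance ~ noise variance) and keep it
# in detailed regions (local variance >> noise variance).
def local_stats_filter(img, noise_var, r=1):
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # (2r+1) x (2r+1) neighbourhood, clipped at the borders.
            win = [img[j][i]
                   for j in range(max(0, y - r), min(h, y + r + 1))
                   for i in range(max(0, x - r), min(w, x + r + 1))]
            mean = sum(win) / len(win)
            var = sum((v - mean) ** 2 for v in win) / len(win)
            gain = max(var - noise_var, 0.0) / var if var > 0 else 0.0
            out[y][x] = mean + gain * (img[y][x] - mean)
    return out

# A mostly flat 3x3 patch with one noisy spike at the centre.
flat = [[10.0, 10.2, 9.9], [10.1, 13.0, 10.0], [9.8, 10.1, 10.2]]
den = local_stats_filter(flat, noise_var=1.0)
print(round(den[1][1], 2))   # the spike is pulled toward the local mean
```

The a priori noise model enters through `noise_var`: overestimating it oversmooths detail, underestimating it leaves noise behind, which is why estimating the noise statistics well matters for this class of methods.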

Characterization and Development of Anthropomorphic Phantoms Liver for Use in Nuclear Medicine

The objective of this study was to characterize and develop anthropomorphic liver phantoms for hepatic tomography procedures, for quality control and the continuing education of professionals in nuclear medicine. The anthropomorphic phantoms were made from plaster and acrylic. Three phantoms representing livers with cirrhosis were constructed. The phantoms were filled with 99mTc diluted with water to obtain the scintigraphic images. Anterior and posterior tomography images of the phantom representing the most cirrhotic liver were analyzed. It was noted that the phantoms allow the acquisition of images similar to those of a real liver with cirrhosis. Simulations of hemangiomas may contribute to the continuing professional education of nuclear medicine staff in image acquisition, allowing the study of parameters such as the matrix, energy window and count statistics.

Haematological Characterization of the Reproductive Status of Laying Hens by Age

The physiological activity of the pineal gland, with its specific responses in the reproductive territory, may be interpreted by monitoring the parameters used in poultry practice in batches of laying hens of different ages. As biological material, 105 clinically healthy laying hens of the ALBO SL-2000 hybrid, raised on the ground, were used; blood samples were taken at the ages of 12 and 28 weeks. The haematological examinations determined the total numbers of erythrocytes and leukocytes and the main erythrocyte constants (RBC, PCV, MCV, MCH, MCHC and WBC). The results allow the reproductive status to be interpreted through the dynamics of the presented values.

Data Preprocessing for Supervised Learning

Many factors affect the success of Machine Learning (ML) on a given task. The representation and quality of the instance data are first and foremost: if there is much irrelevant and redundant information present, or the data are noisy and unreliable, then knowledge discovery during the training phase is more difficult. It is well known that data preparation and filtering steps take a considerable amount of processing time in ML problems. Data pre-processing includes data cleaning, normalization, transformation, feature extraction and selection, among other steps; the product of data pre-processing is the final training set. It would be convenient if a single sequence of data pre-processing algorithms had the best performance on every data set, but this is not the case. We therefore present the best-known algorithms for each step of data pre-processing, so that one can achieve the best performance for a given data set.
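The cleaning, normalization and feature-selection steps listed above can be sketched as a single pipeline; the tiny data set is hypothetical, with None marking a missing value:

```python
# Minimal pre-processing pipeline: impute missing values with the column
# mean (cleaning), drop constant features (a trivial selection step), and
# min-max normalize each remaining feature to [0, 1].
def preprocess(rows):
    cols = list(zip(*rows))
    out_cols = []
    for col in cols:
        present = [v for v in col if v is not None]
        mean = sum(present) / len(present)
        col = [mean if v is None else v for v in col]       # cleaning
        lo, hi = min(col), max(col)
        if hi == lo:
            continue                                        # constant feature: drop
        col = [(v - lo) / (hi - lo) for v in col]           # min-max normalization
        out_cols.append(col)
    return [list(r) for r in zip(*out_cols)]

raw = [[2.0, None, 7.0],
       [4.0, 1.0, 7.0],
       [None, 3.0, 7.0]]
clean = preprocess(raw)
print(clean)    # third (constant) feature removed, values scaled to [0, 1]
```

Even this toy pipeline shows why ordering matters: imputation must precede normalization, or the column mean would be computed on already-rescaled values.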