A Novel Fuzzy Technique for Image Noise Reduction

A new fuzzy filter is presented for noise reduction of images corrupted with additive noise. The filter consists of two stages. In the first stage, all pixels of the image are processed to identify noisy pixels. For this, a fuzzy rule-based system assigns a degree to each pixel. The degree of a pixel is a real number in the range [0,1] denoting the degree to which the pixel is considered noise-free. In the second stage, another fuzzy rule-based system is employed. It uses the output of the first fuzzy system to perform fuzzy smoothing by weighting the contributions of neighboring pixel values. Experimental results are obtained to show the feasibility of the proposed filter. These results are also compared to other filters by numerical measures and visual inspection.
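The two-stage scheme described above can be sketched as follows. This is an illustrative reconstruction, not the authors' rule base: the triangular membership function (based on deviation from the local median), the 3×3 window, and the `scale` parameter are all assumptions made for the sketch.

```python
import numpy as np

def noise_free_degree(img, win=3, scale=30.0):
    """Stage 1: assign each pixel a degree in [0,1] that it is noise-free.
    Here the degree falls off linearly with deviation from the local median
    (a simple stand-in for a fuzzy rule base)."""
    pad = win // 2
    padded = np.pad(img.astype(float), pad, mode='edge')
    h, w = img.shape
    deg = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            window = padded[i:i + win, j:j + win]
            dev = abs(float(img[i, j]) - np.median(window))
            deg[i, j] = max(0.0, 1.0 - dev / scale)   # triangular membership
    return deg

def fuzzy_smooth(img, deg, win=3):
    """Stage 2: replace each pixel by the average of its neighborhood,
    weighting every neighbor by its noise-free degree."""
    pad = win // 2
    pimg = np.pad(img.astype(float), pad, mode='edge')
    pdeg = np.pad(deg, pad, mode='edge')
    h, w = img.shape
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            vals = pimg[i:i + win, j:j + win]
            wts = pdeg[i:i + win, j:j + win]
            out[i, j] = (vals * wts).sum() / max(wts.sum(), 1e-9)
    return out
```

An impulse-corrupted pixel receives a low degree in stage 1, so its own value contributes little in stage 2 and it is replaced mostly by its clean neighbors.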

Prediction of Optimum Cutting Parameters to Obtain Desired Surface in Finish-Pass End Milling of Aluminium Alloy with Carbide Tool Using Artificial Neural Network

End milling is one of the common metal cutting operations used for machining parts in the manufacturing industry. It is usually performed at the final stage of manufacturing a product, so the surface roughness of the produced job plays an important role. In general, surface roughness affects wear resistance, ductility, tensile and fatigue strength, etc., of machined parts and cannot be neglected in design. In the present work, an experimental investigation of end milling of aluminium alloy with a carbide tool is carried out, and the effects of different cutting parameters on the response are studied with three-dimensional surface plots. An artificial neural network (ANN) is used to establish the relationship between the surface roughness and the input cutting parameters (i.e., spindle speed, feed, and depth of cut). The Matlab ANN toolbox, which works on the feed-forward back-propagation algorithm, is used for modeling. A 3-12-1 network structure, having the minimum average prediction error, was found to be the best architecture for predicting the surface roughness value. The network predicts surface roughness well for unseen data. For a desired surface finish of the component to be produced, many different combinations of cutting parameters are available; the optimum cutting parameters for obtaining the desired surface finish while maximizing tool life are predicted. The methodology is demonstrated, a number of problems are solved, and the algorithm is coded in Matlab®.
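The paper uses Matlab's ANN toolbox; an equivalent 3-12-1 feed-forward network trained by plain batch back-propagation can be sketched in Python/NumPy. The synthetic data, learning rate, and iteration count below are illustrative stand-ins, not the paper's measured roughness data or settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# 3-12-1 architecture: 3 inputs (speed, feed, depth of cut) -> 12 hidden -> 1 output (Ra)
W1 = rng.normal(0, 0.5, (3, 12)); b1 = np.zeros(12)
W2 = rng.normal(0, 0.5, (12, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)        # hidden layer, tanh activation
    return h, h @ W2 + b2           # linear output unit

# Illustrative synthetic data standing in for measured roughness values
X = rng.uniform(0, 1, (200, 3))
y = (0.5 * X[:, 0] - 0.3 * X[:, 1] + 0.2 * X[:, 2])[:, None]

lr = 0.1
for _ in range(3000):                # batch back-propagation
    h, out = forward(X)
    err = out - y
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)           # back-prop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(((forward(X)[1] - y) ** 2).mean())
```

In the paper's setting, the trained network would then be evaluated over a grid of candidate cutting parameters to pick the combination that meets the target roughness.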

Just-In-Time for Reducing Inventory Costs throughout a Supply Chain: A Case Study

Supply Chain Management (SCM) is the integration of manufacturer, transporter and customer into one seamless chain that allows a smooth flow of raw materials, information and products throughout the entire network, helping to minimize all related efforts and costs. The main objective of this paper is to develop a model that can accept a specified number of spare parts within the supply chain and simulate its inventory operations throughout all stages in order to minimize the inventory holding costs, base stock and safety stock, and to find the optimum inventory levels, thereby suggesting a way forward by adapting some factors of Just-In-Time to minimize the inventory costs throughout the entire supply chain. The model has been developed using Microsoft Excel and Visual Basic in order to study inventory allocations in any network of the supply chain. The applicability and reproducibility of this model were tested by comparing the actual system implemented in the case study with the results of the developed model. The findings showed that the total inventory costs of the developed model are about 50% less than the actual costs of the inventory items within the case study.
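The paper's model was built in Excel and Visual Basic; the core idea of simulating a stage's inventory under a base-stock (order-up-to) policy and searching for the cost-minimizing level can be sketched in Python. The demand stream, lead time, and cost rates below are hypothetical, not the case-study values.

```python
import random

def simulate_base_stock(base_stock, demand, lead_time=2, hold_cost=1.0, backlog_cost=5.0):
    """Simulate a single-stage base-stock (order-up-to) policy.
    Each period: the oldest in-transit order arrives, demand is satisfied
    (or backlogged), and we reorder up to the base-stock level."""
    on_hand = base_stock
    pipeline = [0] * lead_time          # orders in transit
    total_cost = 0.0
    for d in demand:
        on_hand += pipeline.pop(0)      # receive oldest order
        on_hand -= d                    # satisfy (or backlog) demand
        # order-up-to: inventory position = on hand + in transit
        position = on_hand + sum(pipeline)
        pipeline.append(max(0, base_stock - position))
        total_cost += hold_cost * max(on_hand, 0) + backlog_cost * max(-on_hand, 0)
    return total_cost

random.seed(1)
demand = [random.randint(0, 10) for _ in range(200)]
# search for the base-stock level minimizing simulated cost
best = min(range(0, 40), key=lambda s: simulate_base_stock(s, demand))
```

Repeating this search per stage of the chain is the kind of experiment the paper's Excel/VB model automates.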

A Hybrid Approach Using Particle Swarm Optimization and Simulated Annealing for the N-Queen Problem

This paper presents a hybrid approach for solving the n-queen problem by combining PSO and SA. PSO is a population-based heuristic method that sometimes becomes trapped in a local optimum; SA can be used to solve this problem. Although SA suffers from many iterations and long convergence times on some problems, with good adjustment of its initial parameters, such as the temperature and the length of the temperature stages, SA guarantees convergence. In this article we use discrete PSO (due to the nature of the n-queen problem) to reach a good local optimum, and then use SA to escape from it. The experimental results show that our hybrid method converges to a result faster than the SA method alone, especially for high-dimensional n-queen problems.
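The hybrid can be sketched compactly: a swap-based discrete PSO pass finds a good permutation (one queen per row and column, so only diagonals can conflict), and SA then escapes the remaining local optimum. The velocity handling, swarm size, and cooling schedule below are simplified assumptions, not the paper's exact formulation.

```python
import math, random

def conflicts(perm):
    """Number of attacking queen pairs; rows/columns are unique by
    construction, so only diagonals can conflict."""
    n = len(perm)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if abs(perm[i] - perm[j]) == j - i)

def pso_stage(n, particles=20, iters=200):
    """Simplified discrete PSO: each particle is a permutation whose
    'velocity' is a tendency to copy swaps moving it toward the global best
    (a common discrete-PSO adaptation)."""
    swarm = [random.sample(range(n), n) for _ in range(particles)]
    gbest = min(swarm, key=conflicts)[:]
    for _ in range(iters):
        for p in swarm:
            i = random.randrange(n)
            # move element toward its position in gbest, else random swap
            j = p.index(gbest[i]) if random.random() < 0.7 else random.randrange(n)
            p[i], p[j] = p[j], p[i]
            if conflicts(p) < conflicts(gbest):
                gbest = p[:]
    return gbest

def sa_stage(perm, t0=2.0, cooling=0.995, iters=5000):
    """SA: accept worsening swaps with Boltzmann probability to escape
    the local optimum left by the PSO stage."""
    cur, cost, t = perm[:], conflicts(perm), t0
    for _ in range(iters):
        if cost == 0:
            break
        i, j = random.sample(range(len(cur)), 2)
        cur[i], cur[j] = cur[j], cur[i]
        new = conflicts(cur)
        if new <= cost or random.random() < math.exp((cost - new) / t):
            cost = new
        else:
            cur[i], cur[j] = cur[j], cur[i]   # undo the swap
        t *= cooling
    return cur, cost
```

For example, `sa_stage(pso_stage(8))` typically returns an 8-queen placement with zero conflicts.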

A Self-Adaptive Genetic-Based Algorithm for the Identification and Elimination of Bad Data

The identification and elimination of bad measurements is one of the basic functions of a robust state estimator, as bad data corrupt the results of state estimation by the popular weighted least squares method. However, this is a difficult problem to handle, especially when dealing with multiple errors of the interacting, conforming type. In this paper, a self-adaptive genetic-based algorithm is proposed. The algorithm uses the results of the classical linearized normal residuals approach to tune the genetic operators; thus, instead of a randomized search throughout the whole search space, the search is directed, and the optimum solution is obtained at very early stages (a maximum of 5 generations). The algorithm also uses accumulating databases of already computed cases to reduce the computational burden to a minimum. Tests are conducted with reference to the standard IEEE test systems, and the test results are very promising.

A New Heuristic Statistical Methodology for Optimizing Queuing Networks Using Discrete Event Simulation

Most real queuing systems include special properties and constraints that cannot be analyzed directly using the results of solved classical queuing models. Lack of Markov-chain features, non-exponential patterns, and service constraints are among such conditions. This paper presents an applied general algorithm for analyzing and optimizing queuing systems. The stages of the algorithm are described through a real case study, which consists of an almost completely non-Markov system with a limited number of customers and capacities, as well as many of the common exceptions of real queuing networks. Simulation is used to optimize this system. The stages introduced in this article include primary modeling, determining the kind of queuing system, index definition, statistical analysis and goodness-of-fit testing, model validation, and methods for optimizing the system with simulation.
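The kind of system described, with no closed-form solution, is exactly what discrete-event simulation handles. Below is a minimal sketch of such a simulation: a multi-server queue with a finite waiting room and a non-exponential (uniform) service pattern. The arrival rate, service distribution, and capacities are arbitrary illustrative choices.

```python
import heapq, random

def simulate_queue(n_customers=2000, servers=2, capacity=5, seed=42):
    """Minimal discrete-event simulation: Poisson arrivals, uniform
    (non-exponential) service, `servers` parallel servers, and a waiting
    room limited to `capacity` customers (arrivals beyond that are lost)."""
    random.seed(seed)
    t, busy, waiting, lost, served, wait_sum = 0.0, 0, [], 0, 0, 0.0
    events = [(random.expovariate(1.0), 'arrival')]      # future-event list
    while events and served < n_customers:
        t, kind = heapq.heappop(events)
        if kind == 'arrival':
            heapq.heappush(events, (t + random.expovariate(1.0), 'arrival'))
            if busy < servers:
                busy += 1
                heapq.heappush(events, (t + random.uniform(0.5, 1.5), 'departure'))
            elif len(waiting) < capacity:
                waiting.append(t)                        # join the queue
            else:
                lost += 1                                # blocked: finite capacity
        else:  # departure
            served += 1
            if waiting:
                wait_sum += t - waiting.pop(0)           # time spent in queue
                heapq.heappush(events, (t + random.uniform(0.5, 1.5), 'departure'))
            else:
                busy -= 1
    # average wait over all served customers (non-queued ones waited 0)
    return served, lost, wait_sum / max(served, 1)
```

Optimization then amounts to re-running the simulation over candidate configurations (e.g., server counts) and comparing the estimated indices.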

Implementation of Technology Concept for the Reduction of Cyanobacteria in Laboratory

Following research in the Department of Environmental Engineering of the Faculty of Mechanical Engineering at the Technical University of Kosice, and experience with electrocoagulation as a method of wastewater disposal, equipment for two-stage revitalization of standing and slow-flowing water in small ponds, based on the principle of electrolysis, was designed and partly tested. In cooperation with veterinary experts, the method was shown to be harmless to animals, while cyanobacteria are totally paralyzed. EU structural development funds were obtained for the implementation of these science and research results.

Real-Time Laser Monitoring Based on Pipe Defect Inspection Operation

Pipe inspection is a difficult detection task. Most applications rely mainly on manual recognition of defective areas, with detection carried out by an engineer. Automating this task therefore becomes necessary in order to avoid the cost incurred by the manual process. An automated monitoring method to obtain a complete picture of the sewer condition is proposed in this work. The focus of the research is the automated identification and classification of discontinuities in the internal surface of the pipe. The methodology consists of several processing stages, including image segmentation into potential defect regions and extraction of geometrical characteristic features. Automatic recognition and classification of pipe defects are carried out using an artificial neural network (ANN) technique based on Radial Basis Functions (RBF). Experiments in a realistic environment have been conducted and results are presented.

Labeling Method in Steganography

In this paper, a way of hiding a text message in a gray-scale image (steganography) is presented. The method first finds the binary value of each character of the text message. It then finds the dark (black) places of the gray image by converting the original image to a binary image and labeling each object of the image using 8-connectivity. These images are then converted to RGB images in order to locate the dark places, because in this way each gray level maps to an RGB color and the dark level of the gray image can be identified; if the gray image is very light, the histogram must be adjusted manually to isolate only the dark places. In the final stage, each group of 8 pixels in the dark places is treated as a byte, and the binary value of each character is put in the low bit of each byte formed from the dark-place pixels, increasing the security of the basic least-significant-bit (LSB) steganography method.
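The final embedding stage is standard LSB substitution, which can be sketched as follows. In the paper the pixels would be the pre-selected dark-place pixels; here they are simply a flat list, and the labeling/dark-region selection step is omitted.

```python
def embed(message, pixels):
    """Hide each bit of the message in the least significant bit of the
    given pixel values (in the paper, pixels from the dark places would
    be selected beforehand)."""
    bits = [int(b) for ch in message for b in format(ord(ch), '08b')]
    if len(bits) > len(pixels):
        raise ValueError("message too long for the available pixels")
    out = pixels[:]
    for k, bit in enumerate(bits):
        out[k] = (out[k] & ~1) | bit       # clear the LSB, set the message bit
    return out

def extract(pixels, n_chars):
    """Read back n_chars characters from the LSBs, 8 bits per character."""
    bits = [p & 1 for p in pixels[:8 * n_chars]]
    return ''.join(chr(int(''.join(map(str, bits[i:i + 8])), 2))
                   for i in range(0, len(bits), 8))
```

Since only the lowest bit changes, each carrier pixel value moves by at most 1, which is what makes the embedding visually imperceptible.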

A Power Reduction Technique for Built-In Self-Test Using a Modified Linear Feedback Shift Register

A linear feedback shift register (LFSR) is proposed which targets reducing power consumption from within. It reduces the power consumed during testing of a Circuit Under Test (CUT) at two stages. At the first stage, Control Logic (CL) makes the clocks of the switching units of the register inactive for the time period when their output is going to be the same as the previous one, thus avoiding unnecessary switching of the flip-flops. At the second stage, the LFSR reorders the test vectors by interchanging each bit with its next and closest neighbor bit. This keeps the fault coverage of the vectors unchanged but reduces the Total Hamming Distance (THD), so that power is reduced during the shifting operation.
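The pattern-generation and THD ideas can be sketched in software. The greedy vector reordering below is a simplified stand-in for the paper's bit-interchange scheme: it reorders whole vectors by Hamming-distance proximity, which leaves the vector set (and hence fault coverage) untouched while tending to lower the transition count. The taps and seed are arbitrary examples.

```python
def lfsr_sequence(seed, taps, width, n):
    """Generate n patterns from a Fibonacci LFSR with the given tap bits."""
    state, out = seed, []
    for _ in range(n):
        out.append(state)
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1           # XOR of the tapped bits
        state = ((state << 1) | fb) & ((1 << width) - 1)
    return out

def hamming(a, b):
    """Number of differing bits between two patterns."""
    return bin(a ^ b).count('1')

def total_hamming(seq):
    """THD: sum of transitions between consecutive patterns."""
    return sum(hamming(x, y) for x, y in zip(seq, seq[1:]))

def reorder_adjacent(seq):
    """Greedy reordering: after each vector, place the remaining vector
    closest to it in Hamming distance. Same vectors, fewer transitions."""
    seq = seq[:]
    for i in range(len(seq) - 1):
        j = min(range(i + 1, len(seq)), key=lambda k: hamming(seq[i], seq[k]))
        seq[i + 1], seq[j] = seq[j], seq[i + 1]
    return seq
```

Comparing `total_hamming` before and after reordering gives a quick software estimate of the switching-power saving.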

Effect of Tube Materials and Special Coating on Coke Deposition in the Steam Cracking of Hydrocarbons

Steam cracking reactions are always accompanied by the formation of coke, which deposits on the walls of the tubular reactors. This investigation has attempted to control catalytic coking by applying aluminum, zinc and ceramic coatings such as aluminum-magnesium, using thermal spray and pack cementation methods. The rate of coke formation during steam cracking of naphtha has been investigated both for uncoated stainless steels (of different alloys) and for metal coatings produced by thermal spray and pack cementation with metal powders of aluminum, aluminum-magnesium, zinc, silicon, nickel and chromium. The results of the study show that passivating the surface of SS321 with a coating of aluminum or aluminum-magnesium can significantly reduce the rate of coke deposition during naphtha pyrolysis. SEM and EDAX techniques (Philips XL series) were used to examine the coke deposits formed by the metal-hydrocarbon reactions. Our objective was to separate the different stages by identifying their characteristic morphologies.

Testing Object-Oriented Framework Applications Using FIST2 Tool: A Case Study

An application framework provides a reusable design and implementation for a family of software systems. Frameworks are introduced to reduce the cost of a product line (i.e., a family of products that shares common features). Software testing is a time-consuming and costly ongoing activity during the application software development process. Generating reusable test cases for framework applications during the framework development stage, and providing and using these test cases to test part of the framework application whenever the framework is used, reduces the application development time and cost considerably. This paper introduces the Framework Interface State Transition Tester (FIST2), a tool for automated unit testing of Java framework applications. During the framework development stage, given the formal descriptions of the framework hooks, the specifications of the methods of the framework's extensible classes, and the illegal behavior description of the Framework Interface Classes (FICs), FIST2 generates unit-level test cases for the classes. At the framework application development stage, given the customized method specifications of the implemented FICs, FIST2 automates the use, execution, and evaluation of the already generated test cases to test the implemented FICs. The paper illustrates the use of the FIST2 tool for testing several applications that use the SalesPoint framework.

Sustainability Policies and Corporate Social Responsibility (CSR): Ergonomics Contribution Regarding Work in Companies

The growing importance of sustainability in corporate policies represents a great opportunity for workers to gain more consideration, with great benefits to their well-being. Sustainable work is believed to be work which improves the organization's performance and fosters professional development as well as workers' health. In a multiple case study based on document research, information was sought about work activities in the sustainability or corporate social responsibility (CSR) policies disseminated by corporations. All the companies devoted attention to work activities and delivered a good amount of information about them. Nevertheless, the information presented was generic; all the actions developed were top-down, and there was no information about the impact of changes aimed at sustainability on the workers' activities. It was found that the companies seemed to be at an early stage. In the future, they need to show more commitment through concrete goals: they must be aware that workers contribute directly to the corporations' sustainability. This would allow room for Ergonomics and Work Psychodynamics to be incorporated and to be useful for both companies and society, so as to promote and ensure work sustainability.

Effect of Valve Pressure Drop in Exergy Analysis of C2+ Recovery Plants Refrigeration Cycles

This paper provides an exergy analysis of the multistage refrigeration cycle used in a C2+ recovery plant. The behavior of an industrial refrigeration cycle with propane as the refrigerant has been investigated by the exergy method. A computational model based on exergy analysis is presented for investigating the effects of the valves on the exergy losses, the second-law efficiency, and the coefficient of performance (COP) of a vapor compression refrigeration cycle. The equations of exergy destruction and exergetic efficiency for the main cycle components, such as evaporators, condensers, compressors, and expansion valves, are developed, and the relations for the total exergy destruction in the cycle and the cycle exergetic efficiency are obtained. An ethane recovery unit with its refrigeration cycle has been simulated to prepare the exergy analysis. Using a typical actual work input value, the exergetic efficiency of the refrigeration cycle is determined to be 39.90%, indicating great potential for improvement. The simulation results reveal that the exergetic efficiencies of the heat exchanger and expansion sections rank lowest among the compartments of the refrigeration cycle. Refrigeration calculations have been carried out through the analysis of T–S and P–H diagrams, where the coefficient of performance (COP) was obtained as 1.85. The novelty of this article includes the effect and sensitivity analysis of molar flow, pressure drops and temperature on the exergy efficiency and the coefficient of performance of the cycle.
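The basic relations involved (COP from an enthalpy balance, and second-law efficiency as the ratio of actual COP to the reversible Carnot COP) can be illustrated with a small worked example. All state-point values below are hypothetical placeholders, not the paper's simulated propane-cycle data.

```python
# Hypothetical refrigerant state enthalpies (kJ/kg) -- illustrative only
h_evap_in, h_evap_out = 260.0, 570.0   # enthalpy across the evaporator
h_comp_in, h_comp_out = 570.0, 740.0   # enthalpy across the compressor

refrigeration_effect = h_evap_out - h_evap_in   # kJ/kg absorbed from the load
work_input = h_comp_out - h_comp_in             # kJ/kg supplied by the compressor
cop = refrigeration_effect / work_input

# Second-law (exergetic) efficiency: actual COP relative to the reversible
# Carnot COP between the assumed evaporating and condensing temperatures.
T_evap, T_cond = 250.0, 310.0                   # K, assumed temperatures
cop_carnot = T_evap / (T_cond - T_evap)
eta_exergetic = cop / cop_carnot
```

A throttling valve's pressure drop lowers `h_evap_out - h_evap_in` at fixed compressor work, which is how valve losses propagate into both COP and exergetic efficiency.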

The Alterations of Some Pancreas Gland Hormones after a Strenuous Aerobic Exercise in Male Students

The purpose of this study was to examine the alterations in the secretion of pancreas gland hormones following an aerobic, exhausting exercise. Sixteen healthy men participated in the study. Blood samples were taken from the participants in four stages under fasting conditions: the first sample was taken before the exhausting aerobic Bruce test, the second after the Bruce exercise, and the third and fourth samples 24 and 48 hours after the exercise, respectively. The final results indicated that a strenuous aerobic exercise can have a significant effect on the glucagon and insulin concentrations of blood serum. The increase in blood serum insulin was higher after 24 and 48 hours. It seems that an intensive exercise has little effect on changes in the glucagon concentration of blood serum. Also, disorders in the secretion of glucagon and insulin disturb athletes' exercise.

Non-Immersive Virtual Reality for Improving Teaching Processes

The following paper presents an interactive tool whose main purpose is to teach how to play a flute. It consists of three stages: the first is the instruction and teaching process, carried out through a software application; the second is the practice stage, in which the user starts to play a flute (hardware specially designed for this application) that is capable of capturing how it is being played; and the final stage is the one in which the captured data are sent to the software and the user is evaluated in order to give him or her a correction or an acceptance.

Evaluation of Optimum Performance of Lateral Intakes

In designing river intakes and diversion structures, it is paramount that the sediments entering the intake are minimized or, if possible, completely separated. Due to high water velocity, sediments can significantly damage hydraulic structures, especially when mechanical equipment like pumps and turbines is used. This subsequently results in wasted water and electricity and further costs. Therefore, it is prudent to investigate and analyze the performance of lateral intakes affected by sediment control structures. Laboratory experiments, despite their vast potential and benefits, can face certain limitations and challenges, including limitations in equipment and facilities, space constraints, equipment errors such as lack of adequate precision or mal-operation, and, finally, human error. Research has shown that in order to achieve the ultimate goal of intake structure design – which is to design long-lasting and proficient structures – the best combination of sediment control structures (such as sills and submerged vanes), along with the parameters that increase their performance (such as diversion angle and location), should be determined. Cost, difficulty of execution and environmental impacts should also be included in evaluating the optimal design. This solution can then be applied to similar problems in the future. Consequently, the model used to arrive at the optimal design requires a high level of accuracy and precision in order to avoid improper design and execution of projects. The process of creating and executing the design should be as comprehensive and applicable as possible. Therefore, it is important that the influential parameters and vital criteria are fully understood and applied at all stages of choosing the optimal design. In this article, the parameters influencing optimal performance of the intake, the advantages and disadvantages, and the efficiency of a given design are studied.
Then, a multi-criterion decision matrix is utilized to choose the optimal model that can be used to determine the proper parameters in constructing the intake.

A Study of Indentation Energy in Three-Point Bending of Sandwich Beams with Composite Laminated Faces and Foam Core

This paper deals with the analysis of flexural stiffness, indentation and their energies in three-point loading of sandwich beams with composite faces of E-glass/epoxy and cores of polyurethane or PVC. Energy is consumed in three stages: indentation of the laminated beam, indentation of the sandwich beam, and bending of the sandwich beam. The theory of elasticity is chosen to present equations for indentation of the laminated beam; these equations have then been corrected to offer better results. An analytical model is used assuming an elastic-perfectly plastic compressive behavior of the foam core. Classical beam theory is used to describe the three-point bending. Finite element (FE) analysis of static indentation of sandwich beams is performed using the FE code ABAQUS, where the foam core is modeled using the crushable foam material model and the response of the foam core is experimentally characterized in uniaxial compression. Three-point bending and indentation have been carried out experimentally in two cases: low-velocity and higher-velocity (quasi-impact) loading. The results describe the response of the beam in terms of core and face thicknesses, core material, indentor diameter, energy absorbed, and length of the plastic area in the testing. The experimental results are in good agreement with the analytical and FE analyses. These results can serve as an introduction to impact loading and energy absorption of sandwich structures.

Trace Emergence of Ants' Traffic Flow, Based upon an Exclusion Process

Biological evolution has generated a rich variety of successful solutions, and optimized strategies can be inspired by nature. One interesting example is ant colonies, which are able to exhibit collective intelligence even though their dynamics are simple. The emergence of different patterns depends on the pheromone trail laid by the foragers, which serves as a positive feedback mechanism for sharing information. In this paper, we use the dynamics of TASEP as a model of interaction at a low level of the collective environment in the ants' traffic flow. This work consists of modifying the movement rules of the particles ("ants") belonging to the TASEP model so that they match the natural movement of ants. To respect the constraint of having no more than one particle per site, and to avoid collisions in bidirectional circulation, we suggest two strategies: a decease strategy and a waiting strategy. The third stage of this work is devoted to studying the stability of these two proposed strategies. As a final stage, we apply the first strategy to the whole environment in order to obtain the emergence of traffic flow, which is a way of learning.
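The underlying TASEP dynamics are simple to state: particles on a lattice hop one step forward only if the target site is empty, which enforces the one-particle-per-site exclusion constraint the paper builds on. Below is a minimal sketch on a ring with random-sequential updates; the lattice size, density, and hop probability are arbitrary illustrative choices (the paper's ant-specific rules and bidirectional strategies are not modeled here).

```python
import random

def tasep_step(lattice, p=1.0):
    """One update sweep of the totally asymmetric simple exclusion process
    on a ring: a particle (1) hops to the next site only if that site is
    empty (0), so there is never more than one particle per site."""
    n = len(lattice)
    new = lattice[:]
    # random-sequential update: visit sites in random order
    for i in random.sample(range(n), n):
        j = (i + 1) % n
        if new[i] == 1 and new[j] == 0 and random.random() < p:
            new[i], new[j] = 0, 1
    return new

random.seed(3)
state = [1, 1, 1, 0, 0, 0, 1, 0]
for _ in range(50):
    state = tasep_step(state)
```

The number of particles is conserved by construction; only their arrangement evolves, which is what produces the traffic-flow patterns studied in the paper.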

A New Divide and Conquer Software Process Model

A software system goes through a number of stages during its life, and a software process model gives a standard format for planning, organizing and running a project. This article presents a new software development process model named the "Divide and Conquer Process Model", based on the idea of first dividing things to make them simple and then gathering them to get the whole work done. The article begins with the background of different software process models and the problems in these models. This is followed by the new divide and conquer process model, an explanation of its different stages, and, at the end, its edge over other models.