Self-Tuning Method of Fuzzy System: An Application on Greenhouse Process

The approach proposed here applies fuzzy systems to the analysis and synthesis of intelligent climate controllers. The internal climate of the greenhouse is simulated by a linear model whose coefficients are obtained by identification. The use of fuzzy logic controllers for the regulation of climate variables represents a powerful way to minimize the energy cost. Strategies of reduction and optimization are adopted to facilitate the tuning and to reduce the complexity of the controller.
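
For orientation, the sketch below shows a minimal Sugeno-style fuzzy rule evaluation of the kind such a climate controller builds on; the membership functions, rules, and output levels are hypothetical placeholders, not the paper's tuned controller.

    # Minimal fuzzy rule evaluation for greenhouse heating (illustrative
    # only: membership functions, rules, and output levels are hypothetical).

    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def heating_command(temp_c):
        # Rule strengths: IF temp is cold THEN heat high; IF temp is mild THEN heat low.
        cold = tri(temp_c, 0.0, 10.0, 18.0)
        mild = tri(temp_c, 15.0, 20.0, 25.0)
        # Weighted-average (Sugeno-style) defuzzification with singleton outputs.
        num = cold * 100.0 + mild * 20.0
        den = cold + mild
        return num / den if den > 0 else 0.0

    if __name__ == "__main__":
        for t in (5, 12, 18, 22):
            print(t, "C ->", round(heating_command(t), 1), "% heating")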

Exploiting Machine Learning Techniques for the Enhancement of Acceptance Sampling

This paper proposes an innovative methodology for Acceptance Sampling by Variables, a category of Statistical Quality Control dealing with the assurance of product quality. Our contribution lies in the exploitation of machine learning techniques to address the complexity and remedy the drawbacks of existing approaches. More specifically, the proposed methodology exploits Artificial Neural Networks (ANNs) to aid decision making about the acceptance or rejection of an inspected sample. For any type of inspection, ANNs are trained on data from the corresponding tables of a standard's sampling plan schemes. Once trained, ANNs can give closed-form solutions for any acceptance quality level and sample size, thus automating the reading of the sampling plan tables without any need to compromise with the values of the specific standard chosen each time. The proposed methodology provides enough flexibility to quality control engineers during the inspection of their samples, allowing the consideration of specific needs, while it also reduces the time and cost required for these inspections. Its applicability and advantages are demonstrated through two numerical examples.
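
As a hedged illustration of the table-learning idea, the sketch below fits a small feed-forward network to a few (AQL, sample size) -> acceptability-constant rows; the numeric values are invented placeholders, not entries of any actual standard.

    # Illustrative sketch: train a small ANN to interpolate a sampling-plan
    # table. The (AQL, sample size) -> acceptability constant k values are
    # hypothetical placeholders, not taken from any actual standard.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Hypothetical table rows: [AQL (%), sample size n] -> k
    X = np.array([[0.65, 20], [0.65, 50], [1.0, 20], [1.0, 50],
                  [1.5, 20], [1.5, 50], [2.5, 20], [2.5, 50]], dtype=float)
    y = np.array([2.00, 2.11, 1.88, 1.98, 1.77, 1.87, 1.58, 1.68])

    model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                         random_state=0)
    model.fit(X, y)

    # Once trained, the network answers for AQL/sample-size combinations
    # that fall between the tabulated rows.
    print(model.predict([[1.2, 35]]))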

Implementation of SU-MIMO and MU-MIMO GTD-System under Imperfect CSI Knowledge

We study the performance of a compressed beamforming weights feedback technique in a generalized triangular decomposition (GTD) based MIMO system. GTD is a beamforming technique that enjoys QoS flexibility. The technique, however, performs at its optimum only when full knowledge of the channel state information (CSI) is available at the transmitter, which is impossible in a real system, where channel estimation error and limited feedback are present. We suggest a way to implement quantized beamforming weights feedback, which can significantly reduce the feedback data, in a GTD-based MIMO system, and investigate the performance of the system. Interestingly, we find that compressed beamforming weights feedback does not degrade the BER performance of the system at low input power, while channel estimation error and quantization do. By comparison, GTD is more sensitive to compression and quantization, while SVD is more sensitive to channel estimation error. We also explore the performance of the GTD-based MU-MIMO system, and find that the BER performance starts to degrade markedly at around -20 dB channel estimation error.
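
The following numpy sketch illustrates quantized beamforming-weight feedback in the simplest setting, using SVD precoding as a stand-in (the paper's GTD construction is not reproduced here); the matrix sizes and bit widths are assumptions.

    # Quantized beamforming-weight feedback, illustrated with SVD precoding
    # as a stand-in for the paper's GTD scheme. Sizes/bits are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    H = (rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))) / np.sqrt(2)

    U, s, Vh = np.linalg.svd(H)
    V = Vh.conj().T                      # ideal transmit beamforming weights

    def quantize(x, bits):
        """Uniformly quantize real and imaginary parts to the given bit width."""
        levels = 2 ** bits
        scale = np.max(np.abs([x.real, x.imag]))
        q = lambda r: np.round((r / scale) * (levels / 2 - 1)) / (levels / 2 - 1) * scale
        return q(x.real) + 1j * q(x.imag)

    for bits in (2, 4, 8):
        Vq = quantize(V, bits)
        # Distortion of the fed-back weights relative to perfect-CSI weights.
        err = np.linalg.norm(V - Vq) / np.linalg.norm(V)
        print(f"{bits}-bit feedback, relative weight error = {err:.4f}")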

Improving the Patient Discharge Process in Hospitals by Using the Six Sigma Approach

The need to increase the efficiency of health care systems is becoming an obligation, and one area of improvement is the discharge process. The objective of this work is to reduce the discharge time for insured patients to less than 50 minutes by using the Six Sigma approach. This improvement will also lead to an increase in customer satisfaction, increase the number of admissions and the turnover of rooms, and increase hospital profitability. Three departments were considered in this study: Female, Male, and Pediatrics. The Six Sigma approach, coupled with simulation, was applied to reduce the patient discharge time in these three departments at the hospital. Upon applying the resulting recommendations, 60%, 80%, and 22% of insured female, male, and pediatric patients, respectively, will have a discharge time less than the upper specification time, i.e., 50 min.
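
As a minimal illustration of the specification check involved, the snippet below computes the fraction of discharges falling under the 50-minute upper specification limit for a hypothetical, normally distributed discharge time; the mean and standard deviation are invented, not the hospital's data.

    # Fraction of discharges within the upper specification limit, assuming
    # a normal discharge-time distribution; mean and std are hypothetical.
    from statistics import NormalDist

    usl = 50.0                     # upper specification limit (minutes)
    mean, std = 45.0, 8.0          # hypothetical post-improvement values
    within_spec = NormalDist(mean, std).cdf(usl)
    print(f"{within_spec:.1%} of discharges within {usl} min")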

The Micro-Ecosystem Restoration Mechanism Applied to Feasibility Research on the Remediation of Lake Eutrophication

The technique of inducing micro-ecosystem restoration is one of the aquatic ecological engineering methods used to remediate polluted water. Batch-scale, pilot-plant, and field studies were carried out to observe eutrophication using the Inducing Ecology Restorative Symbiosis Agent (IERSA), consisting mainly of products degraded by lactobacillus, saccharomycete, and phycomycete. The results obtained from the batch-scale and pilot-plant experiments allowed us to develop the parameters for the field study. A pond 5 m from the outlet of a lake, with an area of 500 m2, a depth of 0.6-1.2 m, and containing about 500 tons of water, was selected as a model. After treatment with 10 mg IERSA/L of water twice a week for 70 days, the micro-restoration mechanism proceeded in three stages (i.e., restoration, impact maintenance, and ecological recovery after impact). The COD, TN, TKN, and chlorophyll a were reduced significantly in the first week. Although unexpected heavy rain and contamination from the sewage system slowed the ecological restoration, the self-cleaning function continued and chlorophyll a was reduced by 50% within one month. In the fourth week, amoeba, paramecium, rotifer, and red wriggle worm reappeared, and the number of fish fry reached up to 1000 fish fry/m3. These results proved that the induced restorative mechanism can be applied to remediate eutrophication and to control the growth of algae in lakes by establishing self-cleaning through the induction of, and competition among, microbes. Fish growth also improved considerably owing to the improvement in water quality.

Replicating Data Objects in Large-scale Distributed Computing Systems using Extended Vickrey Auction

This paper proposes a novel game theoretical technique to address the problem of data object replication in large-scale distributed computing systems. The proposed technique draws inspiration from computational economic theory and employs the extended Vickrey auction. Specifically, players in a non-cooperative environment compete for server-side scarce memory space to replicate data objects so as to minimize the total network object transfer cost, while maintaining object concurrency. Optimization of such a cost in turn leads to load balancing, fault-tolerance and reduced user access time. The method is experimentally evaluated against four well-known techniques from the literature: branch and bound, greedy, bin-packing and genetic algorithms. The experimental results reveal that the proposed approach outperforms the four techniques in both the execution time and solution quality.
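
For readers unfamiliar with the mechanism, here is a minimal single-item Vickrey (second-price) auction in which servers bid their replication benefit; the extended, object-replication-specific version in the paper is more involved, and the bid values below are hypothetical.

    # Sealed-bid Vickrey (second-price) auction for one unit of server
    # memory: the highest bidder wins but pays the second-highest bid,
    # which makes truthful bidding a dominant strategy. The bid values
    # are hypothetical replication benefits (e.g., saved transfer cost).
    def vickrey_auction(bids):
        """bids: dict mapping player -> bid. Returns (winner, price)."""
        ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
        winner, _ = ranked[0]
        price = ranked[1][1] if len(ranked) > 1 else 0.0
        return winner, price

    if __name__ == "__main__":
        bids = {"server_A": 120.0, "server_B": 95.0, "server_C": 140.0}
        winner, price = vickrey_auction(bids)
        print(f"{winner} replicates the object and pays {price}")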

Enhanced Spectral Envelope Coding Based on NLMS for G.729.1

In this paper, a new encoding algorithm for the spectral envelope, based on NLMS, is proposed for G.729.1 in VoIP. In the TDAC part of G.729.1, the spectral envelope and the MDCT coefficients extracted from the weighted CELP coding error (lower band) and from the higher-band input signal are encoded. In order to reduce the bits allocated to spectral envelope coding, a new quantization algorithm based on NLMS is proposed, and the saved bits are used to enhance sound quality. The performance of the proposed algorithm is evaluated in terms of sound quality and bit reduction rates under clean and frame-loss conditions.
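
As background, the sketch below shows the standard NLMS weight update on a toy system-identification task; it is not the G.729.1 spectral-envelope quantizer itself, and the signal and filter length are placeholders.

    # Standard NLMS adaptive-filter update on a toy task; the signal and
    # filter length are placeholders, not the G.729.1 setup itself.
    import numpy as np

    def nlms(x, d, taps=8, mu=0.5, eps=1e-8):
        """Adapt filter weights w so that w . x_n tracks the desired signal d."""
        w = np.zeros(taps)
        y = np.zeros(len(x))
        for n in range(taps - 1, len(x)):
            x_n = x[n - taps + 1:n + 1][::-1]       # most recent sample first
            y[n] = w @ x_n
            e = d[n] - y[n]                         # prediction error
            w += mu * e * x_n / (eps + x_n @ x_n)   # normalized update
        return w, y

    rng = np.random.default_rng(1)
    x = rng.standard_normal(1000)
    d = np.convolve(x, [0.6, -0.3, 0.1], mode="full")[:1000]  # unknown system
    w, y = nlms(x, d)
    print("learned leading taps:", np.round(w[:3], 3))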

Modeling of Cross Flow Classifier with Water Injection

In hydrocyclones, particle separation efficiency is limited by suspended fine particles, which are discharged with the coarse product in the underflow. It is well known that injecting water into the conical part of the cyclone reduces the fine-particle fraction in the underflow. This paper presents a mathematical model that simulates this water injection. The model accounts for the fluid flow and the particle motion. Particle interactions, due to hindered settling caused by the increased density and viscosity of the suspension, and the entrainment of fine particles by settling coarse particles are included in the model. The model demonstrates the impact of the injection rate, injection velocity, and injection location on the shape of the partition curve. The simulations are compared with experimental data from a 50-mm cyclone.
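
The abstract does not specify the hindered-settling correction used; a standard relation of this type, shown here purely for orientation, is the Richardson-Zaki law,

    \[
      w_h = w_0 \, (1 - c)^{n},
    \]

where w_h is the hindered settling velocity, w_0 the terminal velocity of a single particle, c the local solids volume fraction, and n an empirical exponent (approximately 4.65 in the Stokes regime).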

Reduced Inventories, High Reliability and Short Throughput Times by Using CONWIP Production Planning System

CONWIP (constant work-in-process), a pull production system, has been widely studied by researchers to date. The CONWIP pull production system is an alternative to pure push and pure pull production systems. It lowers and controls inventory levels, which improves throughput, reduces production lead time, and increases delivery reliability and the utilization of work. In this article, a CONWIP pull production system was simulated alongside a push planning system. To compare the two systems via a production planning system (PPS) game, the parameters of each production planning system were adjusted. The main target was to reduce the total WIP to a minimum while maintaining throughput and delivery reliability. Data were recorded and evaluated. A future state was designed for a real production of plastic components, together with the setup of the two indicators under the CONWIP pull production system, which can greatly help the company to be more competitive on the market.
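
The time-stepped sketch below conveys the CONWIP release rule on a hypothetical two-station line (a job enters only while total WIP is under the cap); it is an illustration, not the PPS game used in the study.

    # Two-station line under a CONWIP cap: a new job is released only when
    # total WIP is below the limit. Processing times and cap are hypothetical.
    import random

    def simulate(conwip_limit, horizon=10_000, seed=0):
        rng = random.Random(seed)
        q1, q2 = [], []                 # queues in front of stations 1 and 2
        busy1 = busy2 = 0               # remaining processing time per station
        finished, wip = 0, 0
        for _ in range(horizon):
            if wip < conwip_limit:      # CONWIP release rule
                q1.append(object()); wip += 1
            if busy1 == 0 and q1:
                q1.pop(0); busy1 = rng.randint(1, 5)
            if busy2 == 0 and q2:
                q2.pop(0); busy2 = rng.randint(1, 5)
            if busy1 > 0:
                busy1 -= 1
                if busy1 == 0:
                    q2.append(object())
            if busy2 > 0:
                busy2 -= 1
                if busy2 == 0:
                    finished += 1; wip -= 1
        return finished / horizon       # throughput (jobs per time step)

    for cap in (2, 4, 8, 16):
        print(f"CONWIP cap {cap:2d}: throughput ~ {simulate(cap):.3f}")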

Learning Monte Carlo Data for Circuit Path Length

This paper analyzes the patterns of Monte Carlo data for a large number of variables and minterms in order to characterize circuit path length behavior. We propose models obtained by training on shortest path lengths derived from a wide range of binary decision diagram (BDD) simulations. The model was created using a feed-forward neural network (NN) modeling methodology. Experimental results for the ISCAS benchmark circuits show an RMS error of 0.102 for the shortest-path-length complexity estimation predicted by the NN model (NNM). Use of such a model can help reduce the time complexity of very large scale integration (VLSI) circuits and of related computer-aided design (CAD) tools that use BDDs.
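
As a hedged sketch of the modeling methodology, the following trains a one-hidden-layer feed-forward network by plain gradient descent on synthetic data; the features, targets, and reported error are placeholders, not the ISCAS results.

    # Bare-bones one-hidden-layer regression network in numpy, of the kind
    # that could map circuit features to a path-length estimate. The
    # training data are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1, (200, 2))          # e.g., scaled (#variables, #minterms)
    y = (0.7 * X[:, 0] + 0.3 * X[:, 1] ** 2).reshape(-1, 1)   # synthetic target

    W1 = rng.standard_normal((2, 16)) * 0.5
    b1 = np.zeros(16)
    W2 = rng.standard_normal((16, 1)) * 0.5
    b2 = np.zeros(1)
    lr = 0.1

    for epoch in range(2000):
        h = np.tanh(X @ W1 + b1)             # forward pass
        pred = h @ W2 + b2
        err = pred - y
        # Backpropagation of the mean-squared-error gradient.
        gW2 = h.T @ err / len(X)
        gb2 = err.mean(0)
        gh = (err @ W2.T) * (1 - h ** 2)
        gW1 = X.T @ gh / len(X)
        gb1 = gh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2

    rmse = np.sqrt(np.mean((pred - y) ** 2))
    print(f"training RMSE: {rmse:.4f}")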

A Multiple-Objective Environmental Rationalization and Optimization for Material Substitution in the Production of Stone-Washed Jeans-Garments

As the textile industry is the second largest industry in Egypt, and as small and medium-sized enterprises (SMEs) make up a great portion of it, it is essential to apply the concept of Cleaner Production to reduce pollution. To achieve this goal, a case study concerned with the eco-friendly stone-washing of jeans garments was investigated. A raw-material-substitution option was adopted, whereby the toxic potassium permanganate and sodium sulfide were replaced by the environmentally compatible hydrogen peroxide and glucose, respectively, and the concentrations of both replacement chemicals, together with the operating time, were optimized. In addition, a process-rationalization option involving four additional processes was investigated. Using criteria such as product quality, effluent analysis, mass and heat balance, and cost analysis, with the aid of a statistical model, the process optimization revealed that the optima were 50%, 0.15%, and 50 min for the H2O2 concentration, glucose concentration, and time, respectively. With these values, the superior process ought to reduce the annual cost by about EGP 10^5 relative to the currently used conventional method.

Simulation Software for the Implementation of DNA Computing Algorithms

The captured gel electrophoresis image represents the output of a DNA computing algorithm. Before this image is captured, the DNA computation involves parallel overlap assembly (POA) and the polymerase chain reaction (PCR), which are the core of the computing algorithm. However, the design of the DNA oligonucleotides that represent a problem is quite complicated and prone to errors. In order to reduce these errors at the design stage, before the actual in-vitro experiment is carried out, simulation software capable of simulating the POA and PCR processes was developed. The capability of this software is not limited: a problem of any size and complexity can be simulated, thus saving the cost of possible errors during the design process. Information about the DNA sequences during the computation, as well as the computing output, can be extracted at the same time using the simulation software.
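
A toy flavor of what such a simulator must do is sketched below: extending one strand along another at their longest complementary overlap, as in a single POA step. The sequences are hypothetical, and real oligonucleotide design involves many more constraints (melting temperature, secondary structure, etc.).

    # Toy parallel-overlap-assembly step: two single strands anneal at
    # their longest complementary 3' overlap and are extended to a full
    # duplex. Sequences are hypothetical.
    COMP = str.maketrans("ACGT", "TGCA")

    def revcomp(s):
        return s.translate(COMP)[::-1]

    def overlap_assemble(a, b, min_overlap=4):
        """Extend strand a with strand b when a's 3' end anneals to b."""
        b_rc = revcomp(b)
        for k in range(min(len(a), len(b)), min_overlap - 1, -1):
            if a.endswith(b_rc[:k]):       # complementary overlap of length k
                return a + b_rc[k:]        # polymerase fills in the rest
        return None

    a = "ATGGCCAT" + "GATTACA"             # 3' end overlaps b's complement
    b = revcomp("GATTACA" + "CCGGTTAA")
    print(overlap_assemble(a, b))          # -> ATGGCCATGATTACACCGGTTAA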

Toward Strengthening Social Resilience: A Case Study on Recovery of Capture Fisheries after Asia's Tsunami in Aceh, Indonesia

Social resilience plays a role in governing the local community and coastal fisheries resources toward sustainable fisheries development in the tsunami-affected area. This paper assesses, explores, and investigates indigenous institutions and external and internal facilitators toward strengthening social resilience. Identification of the role of genuine organizations was conducted twice, using Rapid Assessment Appraisal, Focus Group Discussions, and in-depth interviews for collecting primary and secondary data. Local wisdom contributed to, and was adaptable in, rebuilding social resilience. The Panglima Laot Lhok (sea commander) had a decisive and adaptive role in the recovery of the fishing community, particularly in facilitating aid delivery to fishermen, as shown in the anchovy fisheries relief case in Krueng Raya Bay. The Toke Bangku (financial trader) stimulated the reinforcement of advance payments and market channels. The other institutions supported linking and bridging connectivity among stakeholders. Collaborative governance can avoid conflict, reduce donor dependency, and strengthen social resilience within the fishing community.

Distribution Feeder Reconfiguration Considering Distributed Generators

Recently, distributed generation technologies have received much attention for the potential energy savings and reliability assurances that might be achieved as a result of their widespread adoption. Fueling this attention have been the possibilities of international agreements to reduce greenhouse gas emissions, electricity sector restructuring, high power-reliability requirements for certain activities, and concern about easing transmission and distribution capacity bottlenecks and congestion. It is therefore necessary to investigate the impact of these kinds of generators on distribution feeder reconfiguration. This paper presents an approach for distribution reconfiguration considering Distributed Generators (DGs). The objective function is the summation of electrical power losses. A Tabu search optimization is used to solve the optimal operation problem. The approach is tested on a real distribution feeder.
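
For orientation, here is a generic Tabu search skeleton of the kind applied to switch reconfiguration, with single-switch toggles as moves and a short-term tabu list; the loss function is a hypothetical stand-in for feeder power losses.

    # Generic Tabu-search skeleton: neighbors differ in one switch state,
    # and a short-term tabu list blocks move reversals. The loss function
    # below is a hypothetical stand-in for feeder power losses.
    import random

    def tabu_search(n_switches, loss, iters=200, tenure=7, seed=0):
        rng = random.Random(seed)
        state = [rng.random() < 0.5 for _ in range(n_switches)]
        best, best_loss = state[:], loss(state)
        tabu = {}                                 # switch index -> release iteration
        for it in range(iters):
            candidates = []
            for i in range(n_switches):
                if tabu.get(i, 0) > it:           # move is tabu
                    continue
                neighbor = state[:]
                neighbor[i] = not neighbor[i]     # toggle one switch
                candidates.append((loss(neighbor), i, neighbor))
            if not candidates:
                continue
            l, i, state = min(candidates)
            tabu[i] = it + tenure                 # forbid reversing this move
            if l < best_loss:
                best, best_loss = state[:], l
        return best, best_loss

    # Hypothetical smooth loss standing in for feeder power losses.
    target = [True, False, True, True, False, False, True, False]
    loss = lambda s: sum(a != b for a, b in zip(s, target))
    print(tabu_search(8, loss))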

Solar Energy Collection using a Double-layer Roof

The purpose of this study is to investigate the efficiency of a double-layer roof in collecting solar energy, for applications such as raising the high-end temperature of an organic Rankine cycle (ORC). A by-product of the solar roof is the reduction of building air-conditioning loads. An experimental apparatus was arranged to evaluate the effectiveness of the solar roof in absorbing solar energy. The flow channel is basically formed by an aluminum plate on top of a plywood plate. The geometric configurations for which the energy absorption is analyzed include: a bare uncovered aluminum plate, a glass-covered aluminum plate, a glass-covered/black-painted aluminum plate, a plate with variable lengths, a flow channel stuffed with material (in an attempt to enhance heat conduction), and a flow channel with variable slant angles. The experimental results show that the efficiency of energy collection varies from 0.6% to 11% for the geometric configurations mentioned above. An additional study was carried out using CFD simulation to investigate the effect of fins on the aluminum plate. It shows that, owing to vastly enhanced heat conduction, the efficiency can reach ~23% if 50 fins are installed on the aluminum plate. The study shows that a double-layer roof can efficiently absorb solar energy and substantially reduce building air-conditioning loads. On the high end of the organic Rankine cycle, a solar pond is used to replace the warm surface seawater of OTEC (ocean thermal energy conversion) as the driving heat source for the ORC. The energy collected from the double-layer solar roof can be pumped into the pond and raise the pond temperature, as the pond surface area is equivalently increased by nearly one-fourth of the total area of the double-layer solar roofs. The effect of raising the solar pond temperature is especially prominent if double-layer solar roofs are installed over a community area.

CT-Based Monte Carlo Dose Calculations for Proton Therapy Using a New Interface Program

The purpose of this study is to introduce a new interface program for calculating dose distributions with the Monte Carlo method in complex heterogeneous systems, such as organs or tissues, in proton therapy. The interface program was developed in MATLAB and includes a friendly graphical user interface with several tools, such as image-property adjustment and results display. A quadtree decomposition technique was used as the image segmentation algorithm to create optimal geometries from Computed Tomography (CT) images for proton-beam dose calculations. The result of this technique is a set of non-overlapping squares of different sizes in each image. In this way, the segmentation resolution is high in and near heterogeneous areas, preserving the precision of the dose calculations, and low in homogeneous areas, directly reducing the number of cells. Furthermore, a cell-reduction algorithm can be used to combine neighboring cells of the same material. The method was validated in two ways: first, by comparison with experimental data obtained with an 80 MeV proton beam at the Cyclotron and Radioisotope Center (CYRIC) of Tohoku University and, second, by comparison with data based on the polybinary tissue calibration method, also performed at CYRIC. These results are presented in this paper. The program can read the output file of the Monte Carlo code while a region of interest is selected manually, and it plots the dose distribution of the proton beam superimposed onto the CT images.
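
A minimal quadtree decomposition, the segmentation idea described above, is sketched below: a block is split while its intensity range exceeds a tolerance, so cells stay small near heterogeneities and large elsewhere. The thresholds and test image are arbitrary.

    # Minimal quadtree decomposition: split a square block while its
    # intensity range exceeds a tolerance. Thresholds/image are arbitrary.
    import numpy as np

    def quadtree(img, x, y, size, tol, min_size, leaves):
        block = img[y:y + size, x:x + size]
        if size <= min_size or block.max() - block.min() <= tol:
            leaves.append((x, y, size))      # homogeneous enough: keep cell
            return
        h = size // 2                        # otherwise split into 4 quadrants
        for dx, dy in ((0, 0), (h, 0), (0, h), (h, h)):
            quadtree(img, x + dx, y + dy, h, tol, min_size, leaves)

    # Synthetic 64x64 "CT slice": uniform background with a dense insert.
    img = np.full((64, 64), 100.0)
    img[20:40, 24:44] = 1000.0               # bone-like region
    leaves = []
    quadtree(img, 0, 0, 64, tol=50.0, min_size=2, leaves=leaves)
    print(f"{len(leaves)} cells instead of {64 * 64} pixels")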

Development of a Methodology for Processing of Drilling Operations

Drilling is the most common machining operation, and it constitutes the highest machining cost in many manufacturing activities, including automotive engine production. The outcome of this operation depends upon many factors, including the use of proper cutting tool geometry, cutting tool material, and the type of coating used to improve hardness and wear resistance, as well as the cutting parameters. With the availability of a large array of tool geometries, materials, and coatings, it has become a challenging task to select the best tool and cutting parameters that would result in the lowest machining cost or highest profit rate. This paper describes an algorithm developed to help achieve good performance in drilling operations by automatically determining proper cutting tools and cutting parameters. It also helps determine machining sequences that result in minimum tool changes, which eventually reduces machining time and cost where multiple tools are used.
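
One simple ingredient of such sequencing is illustrated below: grouping operations by tool so that each distinct tool is loaded only once; the operation names and tool IDs are hypothetical, and the paper's algorithm is considerably richer.

    # Order drilling operations so that all holes sharing a tool are
    # machined consecutively, minimizing tool changes. Names/IDs are
    # hypothetical.
    from itertools import groupby

    ops = [("hole_1", "drill_5mm"), ("hole_2", "drill_8mm"),
           ("hole_3", "drill_5mm"), ("hole_4", "drill_8mm"),
           ("hole_5", "drill_5mm")]

    # Group by tool: one tool change per distinct tool instead of per hole.
    ops_sorted = sorted(ops, key=lambda op: op[1])
    sequence = [list(g) for _, g in groupby(ops_sorted, key=lambda op: op[1])]

    changes = len(sequence) - 1
    print(f"{changes} tool change(s) for {len(ops)} operations")
    for batch in sequence:
        print(batch)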

A New Effective Local Search Heuristic for the Maximum Clique Problem

An edge-based local search algorithm, called ELS, is proposed for the maximum clique problem (MCP), a well-known combinatorial optimization problem. ELS is a two-phase local search method that effectively finds near-optimal solutions for the MCP. A parameter, the 'support' of vertices, defined in ELS greatly reduces the number of random selections among vertices, as well as the number of iterations and the running time. Computational results on the BHOSLIB and DIMACS benchmark graphs indicate that ELS achieves state-of-the-art performance for the maximum clique problem with reasonable average running times.
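
For a concrete feel of clique local search, the sketch below alternates greedy growth with a random perturbation once the clique is maximal; it is far simpler than ELS (no edge-based moves or 'support' parameter), and the example graph is hypothetical.

    # Small add/perturb local search for the maximum clique problem, in
    # the spirit of (but much simpler than) ELS. The graph is hypothetical.
    import random

    def local_search_clique(adj, iters=1000, seed=0):
        rng = random.Random(seed)
        best, clique = [], []
        for _ in range(iters):
            if not clique:
                clique = [rng.choice(list(adj))]
            # Add phase: grow the clique with any fully connected vertex.
            cands = [v for v in adj if v not in clique
                     and all(u in adj[v] for u in clique)]
            if cands:
                clique.append(rng.choice(cands))
            else:
                if len(clique) > len(best):
                    best = clique[:]
                clique.pop(rng.randrange(len(clique)))   # perturb and continue
        return best

    # Hypothetical graph: {1, 2, 3, 4} is the maximum clique.
    edges = [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4), (4, 5), (5, 6)]
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    print(local_search_clique(adj))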

On Constructing Approximate Convex Hull

Convex hull algorithms have been extensively studied in the literature, principally because of their wide range of applications in different areas. This article presents an efficient algorithm for constructing an approximate convex hull from a set of n points in the plane in O(n + k) time, where k is the approximation error control parameter. The proposed algorithm is suitable for applications that prefer to trade accuracy for computation time, such as animation and interaction in computer graphics, where rapid and real-time rendering is indispensable.
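
The classic way to obtain such an O(n + k) approximation, in the Bentley-Faust-Preparata style, is sketched below: cut the x-range into k strips, keep only each strip's lowest and highest point, and build an exact hull on that small subset. This is offered as orientation and may differ from the paper's construction.

    # O(n + k)-style approximate hull: one O(n) pass keeps the lowest and
    # highest point per x-strip, then an exact hull is built on <= 2k points.
    import random

    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

    def convex_hull(pts):
        """Andrew's monotone chain on an already small point set."""
        pts = sorted(set(pts))
        if len(pts) <= 2:
            return pts
        lower, upper = [], []
        for p in pts:
            while len(lower) > 1 and cross(lower[-2], lower[-1], p) <= 0:
                lower.pop()
            lower.append(p)
        for p in reversed(pts):
            while len(upper) > 1 and cross(upper[-2], upper[-1], p) <= 0:
                upper.pop()
            upper.append(p)
        return lower[:-1] + upper[:-1]

    def approx_hull(pts, k):
        xmin = min(p[0] for p in pts); xmax = max(p[0] for p in pts)
        width = (xmax - xmin) / k or 1.0
        lo, hi = {}, {}
        for p in pts:                       # one O(n) pass over the input
            s = min(int((p[0] - xmin) / width), k - 1)
            if s not in lo or p[1] < lo[s][1]: lo[s] = p
            if s not in hi or p[1] > hi[s][1]: hi[s] = p
        return convex_hull(list(lo.values()) + list(hi.values()))

    pts = [(random.random(), random.random()) for _ in range(10000)]
    print(len(approx_hull(pts, k=32)), "approximate hull vertices")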

Constructing Distinct Kinds of Solutions for the Time-Dependent Coefficients Coupled Klein-Gordon-Schrödinger Equation

We seek exact solutions of the coupled Klein-Gordon-Schrödinger equation with variable coefficients with the aid of the classical Lie approach. Using the classical Lie method, we derive symmetries that reduce the coupled system of partial differential equations to ordinary differential equations. From the reduced differential equations, we derive some new exact solutions of the coupled Klein-Gordon-Schrödinger equations involving special functions such as Airy functions, Bessel functions, and Mathieu functions.
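
For reference, a commonly studied constant-coefficient form of the coupled Klein-Gordon-Schrödinger system reads

    \[
      i \, \psi_t + \tfrac{1}{2} \, \psi_{xx} + \phi \, \psi = 0, \qquad
      \phi_{tt} - \phi_{xx} + \phi = |\psi|^{2},
    \]

where \psi is the complex (nucleon) field and \phi the real (meson) field; the paper treats a generalization of this system in which the coefficients depend on time.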