Stress Relaxation of Date at Different Temperatures and Moisture Contents of the Product: A New Approach

Iran is one of the largest producers of dates in the world. However, due to a lack of information about the fruit's viscoelastic properties, much of the production is downgraded during harvesting and postharvest processes. In this study, the effects of temperature and moisture content on the stress relaxation characteristics of date were investigated. Freshly harvested dates (cv. Kabkab) at the Tamar stage were placed in a controlled-environment chamber to obtain different temperature levels (25, 35, 45, and 55 °C) and moisture contents (8.5, 8.7, 9.2, 15.3, 20, and 32.2% d.b.). A TA.XT2 texture analyzer (Stable Micro Systems, UK) was used for the uniaxial compression tests, and a temperature-controlled chamber was designed and fabricated around the plunger of the texture analyzer to regulate temperature during the experiments. As a new approach, a CCD camera (A4Tech, 30 fps) was mounted on a cylindrical glass probe to scan and record the contact area between the date and the disk; the images were then analyzed using the Image Processing Toolbox of MATLAB. Individual date fruits were uniaxially compressed at a speed of 1 mm/s, and a constant strain of 30% of the fruit thickness was applied to the horizontally oriented fruit. To select a suitable model for describing the stress relaxation of date, the experimental data were fitted with three well-known stress relaxation models: the generalized Maxwell, Nussinovitch, and Peleg models. The constants of these models were determined and correlated with the temperature and moisture content of the product using non-linear regression analysis. It was found that the generalized Maxwell and Nussinovitch models describe the viscoelastic characteristics of date fruit more appropriately than the Peleg model.
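As a hedged illustration of the model-fitting step only, the sketch below fits a two-term generalized Maxwell model, σ(t) = σe + σ1·e^(−t/τ1) + σ2·e^(−t/τ2), to a synthetic relaxation record with SciPy; the data, noise level, and initial guesses are illustrative assumptions, not values from the study.

```python
# A minimal sketch (not the study's code): fitting a two-term generalized
# Maxwell model to a stress relaxation record with SciPy. The data arrays,
# noise level, and initial guesses are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

def maxwell2(t, sigma_e, s1, tau1, s2, tau2):
    """sigma(t) = sigma_e + s1*exp(-t/tau1) + s2*exp(-t/tau2)."""
    return sigma_e + s1 * np.exp(-t / tau1) + s2 * np.exp(-t / tau2)

# Hypothetical relaxation record: time (s) and measured stress (kPa).
t = np.linspace(0.0, 60.0, 120)
rng = np.random.default_rng(0)
sigma = maxwell2(t, 20.0, 15.0, 2.0, 10.0, 25.0) + 0.2 * rng.standard_normal(t.size)

p0 = [10.0, 10.0, 1.0, 10.0, 10.0]      # rough initial guess for the solver
popt, _ = curve_fit(maxwell2, t, sigma, p0=p0, maxfev=10000)
print("fitted [sigma_e, s1, tau1, s2, tau2]:", popt)
```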

Neural Network Based Icing Identification and Fault Tolerant Control of an A340 Aircraft

This paper presents a Neural Network (NN) identification of icing parameters in an A340 aircraft and a reconfiguration technique that keeps the aircraft performance close to its pre-icing performance. Five aircraft parameters are assumed to be considerably affected by icing. The off-line training for identifying the clean and iced dynamics is based on the Levenberg-Marquardt backpropagation algorithm. The icing parameters are located in the system matrix, and the physical locations of the icing are assumed to be the right and left wings. The reconfiguration is based on the technique known as the control mixer approach, or pseudo-inverse technique, which generates a new control input vector such that the aircraft dynamics are not much affected by icing. In the simulations, the longitudinal and lateral dynamics of an Airbus A340 aircraft model are considered, and the stability derivatives affected by icing are identified. The simulation results show successful NN identification of the icing parameters and reconfigured flight dynamics with performance similar to that before icing; in other words, the destabilizing icing effect is compensated.
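A minimal sketch of the control mixer (pseudo-inverse) idea: given a nominal control-effectiveness matrix and its iced counterpart (as would be identified by the NN), the mixer computes a new input so that the iced dynamics approximate the nominal ones. The matrices below are illustrative placeholders, not the A340 model.

```python
# Minimal sketch of the pseudo-inverse (control mixer) reconfiguration.
# B_nom is a nominal control-effectiveness matrix and B_ice its iced
# counterpart; both are illustrative placeholders, not A340 data.
import numpy as np

B_nom = np.array([[1.0, 0.5],
                  [0.2, 1.0],
                  [0.0, 0.8]])
B_ice = 0.7 * B_nom                      # assumed uniform loss of effectiveness

# Mixer gain: u_new = pinv(B_ice) @ B_nom @ u, so B_ice @ u_new ~ B_nom @ u.
K_mix = np.linalg.pinv(B_ice) @ B_nom

u = np.array([0.3, -0.1])                # commanded (pre-icing) input
u_new = K_mix @ u
print("reconfiguration residual:", np.linalg.norm(B_ice @ u_new - B_nom @ u))
```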

Wiener Filter as an Optimal MMSE Interpolator

The ideal sinc filter, which ignores the noise statistics, is often applied to generate an arbitrary sample of a bandlimited signal from uniformly sampled data. In this article, an optimal interpolator is proposed that reaches the minimum mean square error (MMSE) at its output in the presence of noise. The resulting interpolator is thus a Wiener filter, and both the optimal infinite impulse response (IIR) and finite impulse response (FIR) filters are presented. The mean square errors (MSEs) for interpolators with impulse responses of different lengths are obtained by computer simulation; the results show that the MSEs of the proposed interpolators with a reasonable length improve by about 0.4 dB under flat power spectra in a noisy environment with a signal-to-noise ratio (SNR) of 10 dB. As expected, the results also demonstrate improved MSEs for various fractional delays of the optimal interpolator against the ideal sinc filter with a fixed-length impulse response.
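A minimal sketch of the FIR Wiener interpolator for a fractional delay d, under the simplifying assumption of a unit-power signal with a flat, full-band spectrum (signal autocorrelation r_s[k] = sinc(k)) and white noise; the tap count and SNR are illustrative, not the paper's simulation settings.

```python
# Minimal sketch: FIR Wiener (MMSE) interpolator vs. the ideal sinc filter
# for a fractional delay d. Assumes a unit-power signal with a flat,
# full-band spectrum and white noise; N and snr_db are illustrative.
import numpy as np

N, d, snr_db = 16, 0.3, 10.0
sigma_v2 = 10 ** (-snr_db / 10)          # noise power for unit signal power

n = np.arange(N) - N // 2                # tap indices centered on the origin
R = np.sinc(n[:, None] - n[None, :]) + sigma_v2 * np.eye(N)  # R_x = R_s + noise
p = np.sinc(n - d)                       # cross-correlation with s(n0 + d)

h_wiener = np.linalg.solve(R, p)         # optimal MMSE taps
h_sinc = np.sinc(n - d)                  # ideal (noise-ignoring) sinc taps
mmse = 1.0 - p @ h_wiener                # residual MSE of the Wiener solution
print("Wiener MMSE:", mmse)
```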

From Hype to Ignorance – A Review of 30 Years of Lean Production

Lean production (or lean management) gained popularity in several waves. The last three decades have been filled with numerous attempts to apply these concepts in companies; however, this has been only partially successful. The roots of lean production can be traced back to Toyota's just-in-time production. This concept, which according to the MIT research of Womack, Jones, and Roos was employed by Japanese car manufacturers, became popular under the international names "lean production" and "lean manufacturing" and was termed "Schlanke Produktion" in Germany. This contribution reviews lean production in Germany over the last thirty years: its development, trial and error, and implementation.

Evolutionary Approach for Automated Discovery of Censored Production Rules

In the recent past, there has been increasing interest in applying evolutionary methods to Knowledge Discovery in Databases (KDD), and a number of successful applications of Genetic Algorithms (GA) and Genetic Programming (GP) to KDD have been demonstrated. The most predominant representation of the discovered knowledge is the standard Production Rule (PR) of the form If P Then D. PRs, however, are unable to handle exceptions and do not exhibit variable precision. Censored Production Rules (CPRs), an extension of PRs proposed by Michalski and Winston, exhibit variable precision and support an efficient mechanism for handling exceptions. A CPR is an augmented production rule of the form If P Then D Unless C, where C (the censor) is an exception to the rule. Such rules are employed in situations in which the conditional statement If P Then D holds frequently and the assertion C holds rarely. Using a rule of this type, we are free to ignore the exception conditions when the resources needed to establish their presence are tight or when there is simply no information available as to whether they hold. Thus, the If P Then D part of a CPR expresses the important information, while the Unless C part acts only as a switch that changes the polarity of D to ~D. This paper presents a classification algorithm, based on an evolutionary approach, that discovers comprehensible rules with exceptions in the form of CPRs. The proposed approach has a flexible chromosome encoding in which each chromosome corresponds to a CPR. Appropriate genetic operators are suggested, and a fitness function is proposed that incorporates the basic constraints on CPRs. Experimental results are presented to demonstrate the performance of the proposed algorithm.
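As a hedged sketch of how such a chromosome can be evaluated, the code below scores one "If P Then D Unless C" rule on a toy boolean dataset, with a fitness that rewards accuracy and rare censors. The encoding, predicates, and weights are illustrative assumptions, not the paper's exact operators.

```python
# Minimal sketch: evaluating one CPR chromosome ("If P Then D Unless C")
# on a toy boolean dataset. Encoding, predicates, and fitness weights are
# illustrative assumptions, not the paper's exact genetic operators.
import numpy as np

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 4))              # four boolean attributes
y = ((X[:, 0] == 1) & (X[:, 3] == 0)).astype(int)  # hidden concept with exception

def cpr_predict(data, p_idx, c_idx):
    """Assert D when the premise holds, unless the censor flips it."""
    premise = data[:, p_idx] == 1
    censor = data[:, c_idx] == 1
    return (premise & ~censor).astype(int)

def fitness(p_idx, c_idx):
    accuracy = np.mean(cpr_predict(X, p_idx, c_idx) == y)
    censor_rarity = 1.0 - np.mean(X[:, c_idx])     # censors should hold rarely
    return 0.8 * accuracy + 0.2 * censor_rarity

print("fitness of rule (P=x0, D=1, C=x3):", fitness(p_idx=0, c_idx=3))
```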

Blind Channel Estimation Based on URV Decomposition Technique for Uplink of MC-CDMA

In this paper, we investigate a blind channel estimation method for multi-carrier CDMA systems based on a subspace decomposition technique. The technique exploits the orthogonality between the noise subspace and the received user codes to obtain the channel of each user. The Singular Value Decomposition (SVD) has previously been used for this purpose, but it is computationally expensive. Here it is replaced by the URV decomposition, an algorithm that serves as an intermediary between the QR decomposition and the SVD, to track the noise subspace of the received data. The URV decomposition achieves almost the same estimation performance as the SVD but with lower computational complexity.
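A minimal sketch of the noise-subspace orthogonality step: the channel estimate is the vector whose code-filtered image has the smallest projection onto the noise subspace. For brevity the subspace here comes from an SVD; the paper's point is that a rank-revealing URV decomposition can supply the same subspace more cheaply. All sizes and signals are toy assumptions.

```python
# Minimal sketch of noise-subspace channel estimation. The noise subspace
# is taken from an SVD here for brevity; the paper replaces this step with
# the cheaper URV decomposition. Sizes and signals are toy assumptions.
import numpy as np

rng = np.random.default_rng(1)
N, K, L = 32, 4, 3                         # observation size, users, channel taps
C = rng.normal(size=(N, L))                # toy code-convolution matrix
h_true = rng.normal(size=L)

# The signal subspace contains C @ h_true; add K-1 other users and noise.
S = np.column_stack([C @ h_true] + [rng.normal(size=N) for _ in range(K - 1)])
X = S @ rng.normal(size=(K, 200)) + 0.05 * rng.normal(size=(N, 200))

U, s, Vh = np.linalg.svd(X)
Un = U[:, K:]                              # estimated noise subspace

Q = C.T @ Un @ Un.T @ C                    # ||Un^T C h||^2 = h^T Q h
w, V = np.linalg.eigh(Q)
h_est = V[:, 0]                            # eigenvector of the smallest eigenvalue
print("match with true channel:", abs(h_est @ h_true) / np.linalg.norm(h_true))
```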

Directors' Islamic Code of Ethics

This paper discusses a new model of an Islamic code of ethics for directors. Several corporate scandals and collapses, both local (e.g., Transmile and Megan Media) and overseas (e.g., Parmalat and Enron), show that current corporate governance and regulatory reforms are unable to prevent such events from recurring. Arguably, the code of ethics for directors is under-researched, and current codes of ethics concentrate only on binding the work of the employees of the organization as a whole, without giving direct attention to the directors, the group of people responsible for the performance of the company. This study used a semi-structured interview survey of well-known Islamic scholars, such as Muftis, to develop the model. The expected outcome of the research is a comprehensive model of a code of ethics based on Islamic principles that companies can apply and use to construct a code of ethics for their directors.

The Analysis of the Software Industry in Thailand

The software industry has been considered a critical infrastructure for any nation. Several studies have indicated that national competitiveness increasingly depends upon Information and Communication Technology (ICT), and software is one of the major components of ICT, important for both large and small enterprises. Even though there has been strong growth in the software industry in Thailand, the industry has faced many challenges and problems that need to be resolved. For example, the amount of pirated software has been rising, Thailand still has a large gap in the digital divide, and adoption among SMEs has been slow. This paper investigates various issues in the software industry in Thailand, using information acquired through the analysis of secondary sources, observation, and focus groups. The results of this study can be used as "lessons learned" for the development of the software industry in any developing country.

Investigation of an Adjustable Mirror Bender Using Light Beam Size

In this research, the use of the light beam size to design an adjustable mirror bender is presented. The focused beam, characterized by its size along the synchrotron light beamline, is investigated. COSMOSWorks is used to simulate all components of the curvature adjustment system and to analyze them with the finite element method. Results covering the forces applied during adjustment of the mirror radius are presented.

Non-negative Principal Component Analysis for Face Recognition

Principal component analysis (PCA) is often combined with state-of-the-art classification algorithms to recognize human faces. However, PCA can only capture features contributing to the global characteristics of data because it is a global feature selection algorithm; it misses features contributing to the local characteristics of data because each principal component contains only some level of the data's global characteristics. In this study, we present a novel face recognition approach using non-negative principal component analysis, which adds a non-negativity constraint to improve data locality and help elucidate latent data structures. Experiments are performed on the Cambridge ORL face database. We demonstrate the strong performance of the algorithm in recognizing human faces in comparison with the PCA and NREMF approaches.
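A hedged sketch of one way to impose the non-negativity constraint: projected gradient ascent on the PCA variance objective, clipping the loading vector to the non-negative orthant at each step. The data, step size, and iteration count are illustrative assumptions, not the paper's exact algorithm.

```python
# Minimal sketch: PCA with a non-negativity constraint via projected
# gradient ascent on w^T C w. Data, step size, and iterations are
# illustrative assumptions, not the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(2)
X = rng.random((100, 20))
X = X - X.mean(axis=0)                 # center the data
C = X.T @ X / len(X)                   # sample covariance

w = rng.random(20)                     # non-negative initial direction
w /= np.linalg.norm(w)
for _ in range(500):
    w = w + 0.1 * (C @ w)              # gradient step on the variance w^T C w
    w = np.maximum(w, 0.0)             # project onto the non-negative orthant
    w /= np.linalg.norm(w)             # keep unit length

print("captured variance:", w @ C @ w)
```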

Designing and Implementing an Innovative Course about World Wide Web, Based on the Conceptual Representations of Students

The Internet is nowadays included in all national curriculums for the elementary school. A comparative study of their goals leads to the conclusion that a complete curriculum should aim at students' acquisition of the abilities to navigate and search for information, and should additionally emphasize the evaluation of the information provided by the World Wide Web. In a constructivist knowledge framework, the design of a course has to take into consideration the conceptual representations of students. This paper presents the conceptual representations of eleven-year-old students, attending the sixth grade of the Greek elementary school, about the World Wide Web, and their use in the design and implementation of an innovative course.

Solving Part Type Selection and Loading Problem in Flexible Manufacturing System Using Real Coded Genetic Algorithms – Part I: Modeling

This paper and its companion (Part II) deal with the modeling and optimization of two NP-hard problems in the production planning of flexible manufacturing systems (FMS): the part type selection problem and the loading problem. The two problems are strongly related and heavily influence the system's efficiency and productivity. They become even more complex when operational flexibilities, such as the possibility of processing an operation on alternative machines with alternative tools, are considered. The problems are modeled and solved simultaneously using real coded genetic algorithms (RCGA), which use an array of real numbers as the chromosome representation. These real numbers can be converted into the part type sequence and the machines used to process the part types. This first part focuses on modeling the problems and discusses how the novel chromosome representation can be applied to solve them. The second part will discuss the effectiveness of the RCGA in solving various test bed problems.
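A minimal sketch of how such a real-number chromosome can be decoded (random-key style): sorting one block of genes yields the part type sequence, while scaling the other block picks a machine per part. The sizes and decoding rule are illustrative assumptions consistent with the description above, not the paper's exact scheme.

```python
# Minimal sketch: decoding a real-coded chromosome into a part type
# sequence and machine assignments (random-key style). Sizes and the
# decoding rule are illustrative, not the paper's exact scheme.
import numpy as np

rng = np.random.default_rng(3)
n_parts, n_machines = 6, 3
chromosome = rng.random(2 * n_parts)   # one key per part + one per assignment

part_keys = chromosome[:n_parts]
machine_keys = chromosome[n_parts:]

part_sequence = np.argsort(part_keys)                      # processing order
machine_assign = (machine_keys * n_machines).astype(int)   # machine per part

print("part sequence:", part_sequence)
print("machines in that order:", machine_assign[part_sequence])
```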

Quasi Multi-Pulse Back-to-Back Static Synchronous Compensator Employing Line Frequency Switching 2-Level GTO Inverters

A back-to-back static synchronous compensator (BtB-STATCOM) consists of two back-to-back voltage-source converters (VSCs) with a common DC link in a substation. This configuration extends the capabilities of the conventional STATCOM in that bidirectional active power transfer from one bus to another becomes possible. In this paper, the VSCs are designed in quasi multi-pulse form, in which the GTOs are triggered only once per cycle, in PSCAD/EMTDC. The design details of the VSCs as well as the gate switching circuits and controllers are fully presented. The regulation modes of the BtB-STATCOM are verified and tested on a multi-machine power system through different simulation cases. The results, presented in the form of typical time responses, show that the practical PI controllers are robust and stable in the cases of start-up, set-point changes, and line faults.
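As a hedged sketch of the kind of practical PI regulator referred to above (for example, for DC-link or bus-voltage regulation), the code below implements one discrete PI update with simple anti-windup clamping; the gains, limits, and time step are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a practical discrete PI regulator with anti-windup
# clamping, of the kind used for DC-link or bus-voltage control. Gains,
# limits, and the time step are illustrative assumptions.
def pi_step(error, integ, kp=0.5, ki=20.0, dt=1e-4, u_min=-1.0, u_max=1.0):
    """One PI update; the integrator is frozen while the output saturates."""
    integ += ki * error * dt
    u = kp * error + integ
    if u > u_max:
        u, integ = u_max, integ - ki * error * dt   # undo integration (anti-windup)
    elif u < u_min:
        u, integ = u_min, integ - ki * error * dt
    return u, integ

u, integ = 0.0, 0.0
for ref, meas in [(1.0, 0.0), (1.0, 0.4), (1.0, 0.8)]:
    u, integ = pi_step(ref - meas, integ)
    print("control output:", u)
```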

Identifying Significant Factors of Brick Laying Process through Design of Experiment and Computer Simulation: A Case Study

Improving performance measures in construction processes has been a major concern for managers and decision makers in the industry, who seek ways to recognize the key factors that have the largest effect on a process. Identifying such factors can guide them to focus on the right parts of the process in order to gain the best possible result. In the present study, design of experiments (DOE) has been applied to a computer simulation model of the brick laying process to determine the significant factors, with productivity chosen as the response of the experiment. To this end, four controllable factors and their interactions were examined, and the best level was calculated for each factor. The results indicate that three factors, namely the brick labor, the mortar labor, and the inter-arrival time of mortar, together with the interaction between the brick labor and the mortar labor, are significant.
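A minimal sketch of the effect-estimation arithmetic behind such a factorial experiment, shown for a 2x2 slice with the two labor factors; the response values are made up for illustration and are not results from the simulation model.

```python
# Minimal sketch: main and interaction effects in a 2^2 factorial slice
# of the experiment. The productivity responses are made-up numbers,
# not results from the brick laying simulation model.
import numpy as np

# Coded levels for two factors: brick labor (A) and mortar labor (B).
A = np.array([-1, +1, -1, +1])
B = np.array([-1, -1, +1, +1])
y = np.array([52.0, 60.0, 55.0, 71.0])   # hypothetical productivity responses

effect_A = y[A == 1].mean() - y[A == -1].mean()
effect_B = y[B == 1].mean() - y[B == -1].mean()
effect_AB = y[A * B == 1].mean() - y[A * B == -1].mean()
print("A:", effect_A, " B:", effect_B, " AB:", effect_AB)
```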

The Residual Effects of Different Doses of Atrazine+Alachlor and Foramsulfuron on the Growth and Physiology of Rapeseed (Brassica napus L.)

A pot experiment was carried out under controlled conditions to evaluate the residual effects of different doses of atrazine+alachlor and foramsulfuron, herbicides used in corn fields, on the growth and physiology of rapeseed (Brassica napus L.). A split-plot experiment in a completely randomized design (CRD) with four replications was used. The main plots consisted of the herbicide type (the atrazine+alachlor mixture and foramsulfuron), and the sub-plots were different residual doses of the herbicides (0, 1%, 5%, 10%, 20%, 40%, 50%, and 100%). Pots of 7 cm diameter were filled with virgin soil, and seeds of rapeseed cv. Hayola were planted in them. The pots were kept under controlled conditions for 8 weeks after germination. At harvest, the growth parameters and the chlorophyll contents of the leaves were determined. The results showed that the growth of the rapeseed plants was completely prevented at the highest residual doses of the herbicides (50% and 100%). The growth parameters were affected by all doses of both herbicides as compared with the controls, and the residual effects of the atrazine+alachlor mixture in reducing the growth parameters of rapeseed were more pronounced than those of foramsulfuron alone.

Investigating Simple Multipath Compensation for Frequency Modulated Signals at Lower Frequencies

Radio propagation from point to point is affected by the physical channel in many ways. A signal arriving at a destination travels through a number of different paths, referred to as multipaths. Research in this area of wireless communications has progressed well over the years, taking different angles of focus: some researchers concentrate on ways of reducing or avoiding multipath effects, whilst others concentrate on mitigating those effects through compensation schemes. Baseband processing is seen as a field of signal processing that is cardinal to the advancement of software-defined radio technology, and this has led to wide research into carrying out certain algorithms at baseband. This paper considers compensating for multipath for frequency modulated signals. The compensation is carried out at radio frequency (RF) and at quadrature baseband (QBB), and the results are compared. Simulations are carried out in MATLAB to show the benefits of working at the lower QBB frequencies rather than at RF.
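A hedged sketch of one simple compensation scheme at complex baseband: a two-ray channel H(z) = 1 + a·z^-2 is inverted with a truncated series filter. The delay, echo gain, and toy FM-like signal are illustrative assumptions, not the paper's simulation setup.

```python
# Minimal sketch: compensating a two-ray multipath channel at complex
# baseband with a truncated inverse filter. Delay, echo gain, and the
# toy FM-like signal are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(4)
x = np.exp(1j * np.cumsum(0.2 * rng.standard_normal(1000)))  # FM-like phase signal

a = 0.4
h = np.array([1.0, 0.0, a])              # direct path plus a delayed echo
y = np.convolve(x, h)[:len(x)]           # multipath channel output

# Zero-forcing compensator: 1 / (1 + a z^-2) = sum_k (-a)^k z^(-2k), truncated.
g = np.zeros(64, dtype=complex)
g[0::2] = (-a) ** np.arange(32)
x_hat = np.convolve(y, g)[:len(x)]

err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
print("relative compensation error:", err)
```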

Modeling of Dielectric Heating in a Radio-Frequency Applicator Optimized for Uniform Temperature by Means of Genetic Algorithms

The paper presents an optimization study based on genetic algorithms (GAs) for a radio-frequency applicator used in heating dielectric band products. The weakly coupled electro-thermal problem is analyzed using 2D FEM. The design variables in the optimization process are the voltage of a supplementary "guard" electrode and six geometric parameters of the applicator. Two objective functions are used: temperature uniformity and the total active power absorbed by the dielectric. Both mono-objective and multi-objective formulations are implemented in the GA optimization.
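As a hedged illustration of how the two goals can be merged in the mono-objective formulation, a common weighted-sum scalarization is (the symbols and weights are illustrative, not the paper's exact definitions):

$$\min_{\mathbf{x}}\; F(\mathbf{x}) = w_1\,\frac{T_{\max}(\mathbf{x})-T_{\min}(\mathbf{x})}{\bar{T}(\mathbf{x})} + w_2\,\left|\frac{P_{\mathrm{abs}}(\mathbf{x})-P_{\mathrm{ref}}}{P_{\mathrm{ref}}}\right|, \qquad w_1+w_2=1,$$

where x collects the guard-electrode voltage and the six geometric parameters, the first term penalizes temperature non-uniformity across the product, and the second drives the absorbed active power toward a reference value; the multi-objective variant instead treats the two terms separately on a Pareto front.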

Hydrogen Integration in Petrochemical Complexes, Using Modified Automated Targeting Method

Owing to the extensive use of hydrogen in refining and petrochemical units, it is essential to manage the hydrogen network in order to utilize hydrogen as efficiently as possible. On the other hand, hydrogen is an important byproduct that is not properly used in petrochemical complexes and is mostly sent to the fuel system. Few works have been reported in the literature on improving hydrogen networks for petrochemical complexes. In this study, a comprehensive analysis of petrochemical units is carried out using a modified automated targeting technique, which is applied to determine the minimum hydrogen consumption. Applying the modified targeting method to two petrochemical cases showed a significant reduction in the required fresh hydrogen.
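A heavily hedged sketch of the cascade idea behind automated targeting: net source-sink flows are cascaded down descending purity levels, the impurity-load cascade must stay non-negative, and the smallest fresh feed achieving this is the target. The purity levels, flows, and cascade bookkeeping below are illustrative assumptions, not this study's modified formulation.

```python
# Minimal sketch of the purity-cascade idea behind automated targeting.
# Net flows, purity levels, and the cascade bookkeeping are illustrative
# assumptions, not this study's modified formulation.
import numpy as np

purity = np.array([1.00, 0.95, 0.90, 0.80, 0.70])  # descending purity levels
net = np.array([0.0, -50.0, 30.0, -20.0, 45.0])    # source(+) minus sink(-) flows

def min_load(fresh):
    flows = net.copy()
    flows[0] += fresh                                # fresh feed enters at the top
    delta = np.cumsum(flows)                         # flow cascade between levels
    load = np.cumsum(delta[:-1] * -np.diff(purity))  # impurity-load cascade
    return load.min()

# The load cascade grows monotonically with the fresh feed, so bisection
# finds the smallest feasible (non-negative-load) fresh flow.
lo, hi = 0.0, 1000.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if min_load(mid) < 0 else (lo, mid)
print("minimum fresh hydrogen target:", hi)
```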

Technique for Processing and Preservation of Human Amniotic Membrane for Ocular Surface Reconstruction

Human amniotic membrane (HAM) is a useful biological material for the reconstruction of a damaged ocular surface. The processing and preservation of HAM are critical to protect patients undergoing amniotic membrane transplantation (AMT) from cross-infection. For HAM preparation, a human placenta is obtained after an elective cesarean delivery. Before collection, the donor is screened for seronegativity for HCV, HBsAg, HIV, and syphilis. After collection, the placenta is washed in balanced salt solution (BSS) in a sterile environment. The amniotic membrane is then separated from the placenta as well as the chorion while keeping the preparation in BSS. Scraping of the HAM is carried out manually until all debris is removed and a clear, transparent membrane is obtained. Nitrocellulose membrane filters are then placed on the stromal side of the HAM and cut around the edges, with a little membrane folded towards the other side, making it easy to separate during surgery. The HAM is finally stored in a 1:1 solution of glycerine and Dulbecco's Modified Eagle Medium (DMEM) containing antibiotics. The capped Borosil vials containing the HAM are kept at -80°C until use. At the time of surgery, a vial is thawed to room temperature and opened under sterile operating theatre conditions.

Measurement of Small PDs in a Compressed SF6 (10%) - N2 (90%) Gas Mixture

Partial discharge (PD) measurement is a very important means of assessing the integrity of insulation systems in high-voltage apparatus. In compressed gas insulation systems, floating particles can initiate partial discharge activity that adversely affects the insulation. Partial discharges below the inception voltage also play a crucial role in damaging the integrity of insulation over a period of time. This paper discusses the effect of loose and fixed copper and Nichrome wire particles on the PD characteristics in an SF6-N2 (10:90) gas mixture at a pressure of 0.4 MPa. The partial discharge statistical parameters and their correlation with the observed results are discussed.