Biomass and Pigment Production by Monascus during Miniaturized Submerged Culture on Adlay

Three reactor types were explored and successfully used for pigment production by Monascus: shake flasks, and shaken and stirred miniaturized reactors. The use of dielectric spectroscopy for on-line measurement of biomass levels was also explored. Shake flasks gave good pigment yields, but they are difficult to scale up and cannot be automated. Shaken bioreactors were less successful for pigment production than stirred reactors; experiments with different impeller speeds and liquid volumes in the reactor confirmed that this is most likely due to oxygen availability. Oxygen availability appeared to affect biomass levels less than pigment production; red pigment production in particular required very high oxygen levels. Dielectric spectroscopy was effectively used to continuously measure biomass levels during the submerged fungal fermentation in the shaken and stirred miniaturized bioreactors, despite the presence of solid substrate particles. The capacitance signal also gave useful information about the viability of the cells in the culture.

Semi-Automatic Analyzer to Detect Authorial Intentions in Scientific Documents

Information Retrieval studies models and systems that allow a user to find documents relevant to his or her information need. Information search remains a difficult problem because of the difficulty of representing and processing natural language, with phenomena such as polysemy. Intentional structures promise to be a new paradigm for extending existing document structures and enhancing the different phases of document processing, such as creation, editing, search and retrieval. Recognizing the intentions of the authors of texts can reduce the scale of this problem. In this article, we present an intention recognition system based on a semi-automatic method for extracting intentional information from a corpus of texts. The system is also able to update the ontology of intentions, enriching the knowledge base that contains all possible intentions of a domain. The approach relies on the construction of a semi-formal ontology, regarded as the conceptualization of the intentional information contained in a text. Experiments on scientific publications in the field of computer science were conducted to validate this approach.

System Identification and Performance Improvement to a Micro Gas Turbine Applying Biogas

In this study, the effects of biogas fuels on the performance of an annular micro gas turbine (MGT) were assessed experimentally and numerically. In the experiments, the proposed MGT system operated successfully under each test condition; the leanest biogas composition that sustained operation was roughly 50% CH4 with 50% CO2. The power output was around 170 W at 85,000 RPM with 90% CH4 and 10% CO2, and 70 W at 65,000 RPM with 70% CH4 and 30% CO2. When the critical limit of 60% CH4 was reached, the power output was extremely low. Furthermore, the theoretical Brayton cycle efficiency and the electric efficiency of the MGT were calculated as 23% and 10%, respectively. Following the experiments, the measured data were used to identify the parameters of the dynamic model in the numerical simulation. Additionally, a numerical analysis of a redesigned combustion chamber showed that the performance of the MGT could be improved by raising the turbine inlet temperature. This study presents a novel distributed power supply system that can utilize renewable biogas. The completed micro biogas power supply system is small, low cost, easy to maintain and suited to household use.
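
As a quick sanity check on the reported 23% theoretical efficiency, the sketch below evaluates the ideal Brayton cycle formula eta = 1 - r_p^(-(gamma-1)/gamma). The pressure ratio of 2.5 is an assumed illustrative value, not a figure from the abstract; it is typical for small MGTs and happens to reproduce the stated efficiency.

```python
# Ideal Brayton cycle efficiency: eta = 1 - r_p**(-(gamma - 1) / gamma)
# The pressure ratio r_p = 2.5 is an assumed, illustrative value; the
# abstract reports only the resulting ~23% theoretical efficiency.

gamma = 1.4   # ratio of specific heats for air
r_p = 2.5     # assumed compressor pressure ratio (hypothetical)

eta_brayton = 1.0 - r_p ** (-(gamma - 1.0) / gamma)
print(f"Theoretical Brayton efficiency: {eta_brayton:.1%}")  # ~23.0%
```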

An Ensemble of Weighted Support Vector Machines for Ordinal Regression

Instead of traditional (nominal) classification, we investigate ordinal classification, or ranking. An enhanced method based on an ensemble of Support Vector Machines (SVMs) is proposed, in which each binary classifier is trained with specific weights for each object in the training data set. Experiments on benchmark and synthetic datasets indicate that the performance of our approach is comparable to state-of-the-art kernel methods for ordinal regression. The ensemble method, which is straightforward to implement, provides a very good sensitivity-specificity trade-off for the highest and lowest ranks.
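
A minimal sketch of one common way to build such an ensemble, assuming a Frank-and-Hall-style decomposition of a K-class ordinal problem into K-1 weighted binary SVMs; the per-object weighting scheme below is illustrative, not the authors' specific choice.

```python
import numpy as np
from sklearn.svm import SVC

def fit_ordinal_ensemble(X, y, ranks, C=1.0):
    """Train K-1 binary SVMs; classifier r separates y <= r from y > r."""
    models = []
    for r in ranks[:-1]:
        target = (y > r).astype(int)
        # Illustrative per-object weights: emphasize objects near threshold r
        weights = 1.0 / (1.0 + np.abs(y - r))
        clf = SVC(C=C, kernel="rbf", gamma="scale")
        clf.fit(X, target, sample_weight=weights)
        models.append(clf)
    return models

def predict_rank(models, X, ranks):
    """Predicted rank = lowest rank + number of 'greater than' votes."""
    votes = sum(m.predict(X) for m in models)
    return ranks[0] + votes

# Tiny synthetic example with 4 ordered classes
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) + np.repeat(np.arange(4), 50)[:, None]
y = np.repeat(np.arange(4), 50)
models = fit_ordinal_ensemble(X, y, ranks=np.arange(4))
print(predict_rank(models, X[:5], np.arange(4)))
```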

Clustering Mixed Data Using Non-normal Regression Tree for Process Monitoring

In the semiconductor manufacturing process, large amounts of data are collected from various sensors across multiple facilities. The collected sensor data have several different characteristics due to variables such as product types, preceding processes and recipes. In general, Statistical Quality Control (SQC) methods assume normality of the data when detecting out-of-control states of processes. If data with different characteristics are used directly as SQC inputs, the variation of the data increases, wide control limits are required, and the ability to detect out-of-control states decreases. Therefore, it is necessary to separate similar data groups from the mixed data for more accurate process control. In this paper, we propose a regression tree with a split algorithm based on the Pearson distribution system to handle non-normal distributions in a parametric manner. The regression tree finds similar properties of data across different variables. Experiments using real semiconductor manufacturing process data show improved fault detection performance.

Removal of Arsenic (III) from Contaminated Water by Synthetic Nano Size Zerovalent Iron

The present work addressed the removal of Arsenic (III), one of the most poisonous groundwater pollutants, by synthetic nano size zerovalent iron (nZVI). Batch experiments were performed to investigate the influence of As (III) concentration, nZVI concentration, solution pH and contact time on the efficiency of As (III) removal. nZVI was synthesized by reduction of ferric chloride with sodium borohydride. SEM and XRD were used to determine the particle size and characterize the produced nanoparticles. Up to 99.9% removal efficiency for arsenic (III) was obtained with an nZVI dosage of 1 g/L at a contact time of 10 min and pH 7. It could be concluded that the removal efficiency was enhanced by increasing ZVI dosage and reaction time, but decreased with increasing arsenic concentration and pH. nZVI showed an outstanding ability to remove As (III) due not only to its high surface area and small particle size but also to its high inherent activity.
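
A minimal sketch of the standard batch removal efficiency calculation behind such figures, R(%) = 100 * (C0 - Ct) / C0; the concentration values are hypothetical placeholders, not data from the study.

```python
def removal_efficiency(c0_mg_l, ct_mg_l):
    """Percent removal from initial (C0) and residual (Ct) concentrations."""
    return 100.0 * (c0_mg_l - ct_mg_l) / c0_mg_l

# Hypothetical example: 2.0 mg/L As(III) reduced to 0.002 mg/L
print(f"{removal_efficiency(2.0, 0.002):.1f}% removal")  # 99.9% removal
```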

Pyrolysis of Rice Husk in a Fixed Bed Reactor

Fixed-bed slow pyrolysis experiments on rice husk were conducted to determine the effect of pyrolysis temperature, heating rate, particle size and reactor length on the pyrolysis product yields. Pyrolysis experiments were performed at temperatures between 400 and 600°C with a constant heating rate of 60°C/min and particle sizes of 0.60-1.18 mm. The optimum process conditions for maximum liquid yield from rice husk pyrolysis in a fixed bed reactor were also identified. The highest liquid yield was obtained at a pyrolysis temperature of 500°C and a particle size of 1.18-1.80 mm, with a heating rate of 60°C/min in a 300 mm long reactor. The yields of liquid, gas and solid were found to be in the ranges of 22.57-31.78%, 27.75-42.26% and 34.17-42.52% (all on a weight basis), respectively, over the different pyrolysis conditions. The results indicate that the effects of pyrolysis temperature and particle size on the pyrolysis yields are more significant than those of heating rate and reactor length. The functional groups and chemical compositions present in the liquid obtained at optimum conditions were identified by Fourier Transform Infrared (FT-IR) spectroscopy and Gas Chromatography/Mass Spectrometry (GC/MS) analysis, respectively.
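
A minimal sketch of the mass-balance yield calculation that usually underlies such weight-basis figures, assuming the gas fraction is obtained by difference (a common convention, not stated in the abstract); the masses below are hypothetical, not the study's data.

```python
def product_yields(feed_g, liquid_g, solid_g):
    """Weight-basis yields (%); gas is obtained here by difference."""
    gas_g = feed_g - liquid_g - solid_g
    to_pct = lambda m: 100.0 * m / feed_g
    return {"liquid": to_pct(liquid_g), "solid": to_pct(solid_g),
            "gas": to_pct(gas_g)}

# Hypothetical run: 100 g husk -> 31 g liquid, 35 g char
print(product_yields(feed_g=100.0, liquid_g=31.0, solid_g=35.0))
```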

A Real Time Collision Avoidance Algorithm for Mobile Robot based on Elastic Force

This paper proposes a modified Elastic Strip method for a mobile robot to avoid obstacles in real time in an uncertain environment. The method deals with the problem of driving the robot from an initial position to a target position based on elastic force and potential field force. To avoid obstacles, the robot modifies its trajectory based on signals received from the sensor system at each sampling time. It was evident that by combining the modified Elastic Strip method with a pseudomedian filter to process the nonlinear sensor data, uncertainties in the data received from the sensor system can be reduced. Simulations and experiments with these methods were carried out.
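
A minimal sketch of a 1-D pseudomedian filter in Pratt's maximin/minimax formulation, one standard definition; the abstract does not detail the exact variant used, so treat this as an assumption.

```python
import numpy as np

def pseudomedian(window):
    """Pratt's pseudomedian of a window of odd length L = 2M + 1:
    0.5 * (max of subwindow minima) + 0.5 * (min of subwindow maxima),
    taken over all M + 1 contiguous subwindows of length M + 1."""
    L = len(window)
    m = (L - 1) // 2
    subs = [window[i:i + m + 1] for i in range(m + 1)]
    maximin = max(min(s) for s in subs)
    minimax = min(max(s) for s in subs)
    return 0.5 * (maximin + minimax)

def pseudomedian_filter(signal, L=5):
    """Slide an odd-length window over the signal (edges left unfiltered)."""
    x = np.asarray(signal, dtype=float)
    y = x.copy()
    h = L // 2
    for i in range(h, len(x) - h):
        y[i] = pseudomedian(x[i - h:i + h + 1])
    return y

# The isolated spike is attenuated while the step edge is preserved
raw = np.array([0, 0, 9, 0, 0, 5, 5, 5, 5])
print(pseudomedian_filter(raw, L=3))
```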

Text Retrieval Relevance Feedback Techniques for Bag of Words Model in CBIR

The state-of-the-art Bag of Words model in Content-Based Image Retrieval (CBIR) has been used for years, but relevance feedback strategies for this model are not fully investigated. Being inspired by text retrieval, the Bag of Words model can draw on the wealth of knowledge and practice available in that field. We study relevance feedback models from text retrieval and experiment with adapting them to image retrieval. The experiments show that techniques from text retrieval give good results for image retrieval and that further improvement is possible.
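
A minimal sketch of Rocchio relevance feedback, the classical text-retrieval technique such adaptations typically start from; the alpha, beta, gamma values are conventional defaults, not parameters reported in the abstract, and the vectors here stand in for bag-of-visual-words histograms.

```python
import numpy as np

def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Move the query vector toward relevant and away from non-relevant
    documents (here: bag-of-visual-words histograms of images)."""
    q = alpha * np.asarray(query, dtype=float)
    if len(relevant):
        q += beta * np.mean(relevant, axis=0)
    if len(nonrelevant):
        q -= gamma * np.mean(nonrelevant, axis=0)
    return np.maximum(q, 0.0)  # negative term weights are usually clipped

# Toy 5-word vocabulary: refine a query with one relevant, one non-relevant hit
q0 = np.array([1, 0, 1, 0, 0])
rel = np.array([[1, 1, 1, 0, 0]])
nonrel = np.array([[0, 0, 0, 1, 1]])
print(rocchio(q0, rel, nonrel))
```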

New Corneal Reflection Removal Method Used In Iris Recognition System

Images of the human iris contain specular highlights due to the reflective properties of the cornea. This corneal reflection causes many errors, not only in iris and pupil center estimation but also in locating the iris and pupil boundaries, especially for methods that use active contours. An iris recognition system has four steps: segmentation, normalization, encoding and matching. To address the corneal reflection, a novel reflection removal method is proposed in this paper. Comparative experiments with two existing reflection removal methods are carried out on the CASIA V3 iris image database. The experimental results reveal that the proposed algorithm provides higher reflection removal performance.
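
A minimal sketch of a common baseline for specular highlight removal (threshold the bright spots, then inpaint), shown only to make the task concrete; this is not the paper's proposed method, and the threshold, dilation size and file path are illustrative assumptions.

```python
import cv2
import numpy as np

def remove_specular_highlights(gray, thresh=230, dilate_px=2):
    """Baseline reflection removal: threshold the bright specular spots,
    grow the mask slightly, then inpaint the masked region."""
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    kernel = np.ones((2 * dilate_px + 1, 2 * dilate_px + 1), np.uint8)
    mask = cv2.dilate(mask, kernel)
    return cv2.inpaint(gray, mask, inpaintRadius=3, flags=cv2.INPAINT_TELEA)

# Usage (path is a placeholder):
# eye = cv2.imread("iris.png", cv2.IMREAD_GRAYSCALE)
# clean = remove_specular_highlights(eye)
```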

A Perceptually Optimized Wavelet Embedded Zero Tree Image Coder

In this paper, we propose a Perceptually Optimized Embedded ZeroTree Image Coder (POEZIC) that applies perceptual weighting to the wavelet transform coefficients prior to SPIHT encoding, in order to reach a targeted bit rate with improved perceptual quality with respect to the coding quality obtained using the SPIHT algorithm alone. The paper also introduces a new objective quality metric based on a psychovisual model that integrates properties of the human visual system (HVS); this metric plays an important role in our POEZIC quality assessment. Our POEZIC coder is based on a vision model that incorporates various masking effects of HVS perception. Thus, our coder weights the wavelet coefficients based on that model and attempts to increase the perceptual quality for a given bit rate and observation distance. The perceptual weights for all wavelet subbands are computed based on 1) luminance masking and contrast masking, 2) the contrast sensitivity function (CSF), to achieve the perceptual decomposition weighting, and 3) the Wavelet Error Sensitivity (WES), used to reduce the perceptual quantization errors. The new perceptually optimized codec has the same complexity as the original SPIHT technique, yet the experimental results show that our coder achieves very good performance in terms of quality measurement.
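
A minimal sketch of per-subband perceptual weighting before zerotree coding, assuming illustrative CSF-style weights; the actual luminance/contrast masking and WES weights of POEZIC are not given in the abstract, and pywt merely stands in for the coder's wavelet front end.

```python
import numpy as np
import pywt

# Illustrative CSF-style weights, keyed by detail level from coarsest (1)
# to finest (3); finer subbands get smaller weights because the eye is
# less sensitive to high-frequency error.
LEVEL_WEIGHTS = {1: 1.0, 2: 0.7, 3: 0.4}

def perceptually_weight(image, wavelet="bior4.4", levels=3):
    """Scale each detail subband by its perceptual weight prior to encoding."""
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    weighted = [coeffs[0]]  # approximation band left unweighted
    for lvl, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
        w = LEVEL_WEIGHTS[lvl]
        weighted.append((w * cH, w * cV, w * cD))
    return weighted

img = np.random.rand(64, 64)  # stand-in for a test image
coeffs = perceptually_weight(img)
# The inverse weights would be applied after decoding, before waverec2.
```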

Mechanical Quadrature Methods and Their Extrapolations for Solving First Kind Boundary Integral Equations of Anisotropic Darcy's Equation

Mechanical quadrature methods for solving the boundary integral equations of the anisotropic Darcy's equations with Dirichlet conditions in smooth domains are presented. By applying collectively compact theory, we prove the convergence and stability of the approximate solutions. The asymptotic expansions of the error show that the methods converge with order O(h^3), where h is the mesh size. Based on this analysis, extrapolation methods can be introduced to achieve a higher convergence rate of O(h^5). An a posteriori asymptotic error representation is derived in order to construct self-adaptive algorithms. Finally, numerical experiments show the efficiency of our methods.
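
A minimal sketch of the Richardson extrapolation step implied by the O(h^3) to O(h^5) jump, assuming an error expansion u_h = u + c3*h^3 + c5*h^5 + ...: halving h and forming (8*u_{h/2} - u_h)/7 cancels the h^3 term. The model problem below is synthetic, meant only to exhibit the rates.

```python
def extrapolate(u_h, u_h2):
    """Richardson extrapolation for an O(h^3) method: combine solutions at
    mesh sizes h and h/2 to cancel the h^3 error term, leaving O(h^5)."""
    return (8.0 * u_h2 - u_h) / 7.0

# Synthetic 'approximate solutions' with a known h^3 + h^5 error expansion
u_exact = 1.0
u = lambda h: u_exact + 0.3 * h**3 + 0.1 * h**5

for h in (0.1, 0.05, 0.025):
    err_plain = abs(u(h) - u_exact)
    err_extra = abs(extrapolate(u(h), u(h / 2)) - u_exact)
    print(f"h={h:<6} plain error={err_plain:.3e}  extrapolated={err_extra:.3e}")
```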

Bone Generation through Mechanical Loading

Bones are dynamic and responsive organs; they regulate their strength and mass according to the loads to which they are subjected. Because the Wnt/β-catenin pathway has profound effects on the regulation of bone mass, we hypothesized that mechanical loading of bone cells stimulates Wnt/β-catenin signaling, which results in the generation of new bone mass. Mechanical loading triggers the secretion of the Wnt molecule, which, after binding to transmembrane proteins, causes GSK-3β (glycogen synthase kinase 3 beta) to cease the phosphorylation of β-catenin. β-catenin then accumulates in the cytoplasm and is transported into the nucleus, where it binds to transcription factors (TCF/LEF) that initiate transcription of genes related to bone formation. To test this hypothesis, we used TOPGAL (Tcf Optimal Promoter β-galactosidase) mice in an experiment in which cyclic loads were applied to the forearm. TOPGAL mice are reporters for cells affected by the Wnt/β-catenin signaling pathway: they are genetically engineered so that transcriptional activation by β-catenin results in the production of the enzyme β-galactosidase. The presence of this enzyme allows us to localize transcriptional activation of β-catenin to individual cells, thereby allowing us to quantify the effects that mechanical loading has on the Wnt/β-catenin pathway and new bone formation. The ulnae of loaded TOPGAL mice were excised and transverse slices along different parts of the ulnar shaft were assayed for the presence of β-galactosidase. Our results indicate that loading increases β-catenin transcriptional activity in regions where this pathway is already primed (i.e. where basal activity is already higher), in a load-magnitude-dependent manner. Further experiments are needed to determine the temporal and spatial activation of this signaling in relation to bone formation.

Kinetic and Optimization Studies on Ethanol Production from Corn Flour

Simultaneous Saccharification and Fermentation (SSF) of corn flour, a major agricultural product, was studied in batch fermentation using the starch-digesting glucoamylase enzyme derived from Aspergillus niger and the non-starch-digesting, sugar-fermenting yeast Saccharomyces cerevisiae. Experiments based on a Central Composite Design (CCD) were conducted to study the effects of substrate concentration, pH, temperature and enzyme concentration on ethanol concentration, and these parameters were optimized using Response Surface Methodology (RSM). The optimum values of substrate concentration, pH, temperature and enzyme concentration were found to be 160 g/l, 5.5, 30°C and 50 IU, respectively. The effect of inoculum age on ethanol concentration was also investigated. The corn flour solution equivalent to 16% initial starch concentration gave the highest ethanol concentration of 63.04 g/l after 48 h of fermentation at the optimum pH and temperature. The Monod and Logistic models were used for growth kinetics and the Luedeking-Piret model was used for product formation kinetics.
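
A minimal sketch of the logistic growth and Luedeking-Piret product kinetics named in the abstract, dX/dt = mu_max * X * (1 - X/X_max) and dP/dt = alpha * dX/dt + beta * X; all parameter values below are hypothetical placeholders, not fitted values from the study.

```python
from scipy.integrate import solve_ivp

# Hypothetical kinetic parameters (illustrative only, not fitted values)
MU_MAX, X_MAX = 0.25, 9.0   # 1/h, g/L  (logistic growth)
ALPHA, BETA = 4.0, 0.05     # g/g, g/(g*h)  (Luedeking-Piret)

def ssf_kinetics(t, y):
    X, P = y
    dX = MU_MAX * X * (1.0 - X / X_MAX)   # logistic biomass growth
    dP = ALPHA * dX + BETA * X            # growth- + non-growth-associated
    return [dX, dP]

sol = solve_ivp(ssf_kinetics, (0.0, 48.0), [0.2, 0.0])
X48, P48 = sol.y[:, -1]
print(f"After 48 h: biomass = {X48:.2f} g/L, ethanol = {P48:.2f} g/L")
```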

Fast Database Indexing for Large Protein Sequence Collections Using Parallel N-Gram Transformation Algorithm

With the rapid development of the life sciences and the flood of genomic information, the need for faster and more scalable searching methods has become urgent. One of the approaches that has been investigated is indexing. Indexing methods can be categorized into three classes: length-based index algorithms, transformation-based algorithms and mixed-technique algorithms. In this research, we focus on the transformation-based methods. We embed the N-gram method into a transformation-based method to build an inverted index table, and then apply parallel methods to speed up the index building time and to reduce the overall retrieval time when querying the genomic database. Our experiments show that the N-gram transformation algorithm is an economical solution, saving both time and space: the size of the index is smaller than the size of the dataset when the N-gram size is 5 or 6. The results of the parallel N-gram transformation algorithm indicate that the use of parallel programming with large datasets is promising and can be improved further.
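
A minimal sketch of a parallel N-gram inverted index in the spirit the abstract describes: each worker indexes some sequences (map), and the partial tables are merged (reduce). The chunking and merging strategy here is an illustrative assumption, not the authors' exact scheme.

```python
from collections import defaultdict
from multiprocessing import Pool

N = 5  # n-gram size; the abstract reports index < dataset size for n = 5, 6

def index_one(arg):
    """Map step: n-gram -> (sequence, offset) postings for one sequence."""
    seq_id, seq = arg
    grams = defaultdict(list)
    for i in range(len(seq) - N + 1):
        grams[seq[i:i + N]].append((seq_id, i))
    return grams

def build_index(sequences, workers=4):
    """Index sequences in parallel, then merge the partial tables."""
    index = defaultdict(list)
    with Pool(workers) as pool:
        for partial in pool.imap_unordered(index_one, sequences.items()):
            for gram, postings in partial.items():
                index[gram].extend(postings)
    return index

if __name__ == "__main__":
    # Toy sequences; real input would be a large protein collection
    seqs = {"P1": "MKVLAAGITKVLA", "P2": "GITKVLAMMKV"}
    idx = build_index(seqs, workers=2)
    print(idx["ITKVL"])  # all (sequence, offset) hits for this 5-gram
```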

Design of Nonlinear Observer by Using Augmented Linear System based on Formal Linearization of Polynomial Type

The objective of this study is to propose an observer design for nonlinear systems using an augmented linear system derived by application of a formal linearization method. A given nonlinear differential equation is linearized by the formal linearization method, which is based on a Taylor expansion that retains higher order terms, and the measurement equation is transformed into an augmented linear one. To this augmented linear system, linear estimation theory is applied and a nonlinear observer is derived. As an application of this method, the estimation of the transient state of electric power systems is studied, and numerical experiments indicate that this observer design achieves remarkable performance for nonlinear systems.
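
A minimal sketch of the idea on a scalar toy system, assuming a Carleman-style augmentation z = (x, x^2) with third-order terms truncated; the example system and observer gain are illustrative, not the paper's power system model.

```python
import numpy as np

# Toy nonlinear system: dx/dt = -x + 0.5*x^2, measurement y = x.
# Formal linearization: augment z = (x, x^2); then
#   d(x^2)/dt = -2*x^2 + x^3  ~  -2*x^2   (third-order term truncated),
# giving the linear model dz/dt = A z with y = C z.
A = np.array([[-1.0, 0.5],
              [ 0.0, -2.0]])
C = np.array([[1.0, 0.0]])
L = np.array([[2.0],             # illustrative observer gain; A - L C is stable
              [1.0]])

dt, steps = 0.001, 5000
x = 0.8                          # true initial state
z_hat = np.array([[0.0], [0.0]]) # observer starts with no knowledge

for _ in range(steps):
    x += dt * (-x + 0.5 * x**2)                       # true nonlinear dynamics
    y = np.array([[x]])                               # measurement
    z_hat += dt * (A @ z_hat + L @ (y - C @ z_hat))   # linear observer update

print(f"true x = {x:.4f}, estimated x = {z_hat[0, 0]:.4f}")
```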

Removal of Hexavalent Chromium from Wastewater by Use of Scrap Iron

Hexavalent chromium is highly toxic to most living organisms and a known human carcinogen by the inhalation route of exposure. Therefore, treatment of Cr(VI)-contaminated wastewater is essential before its discharge to natural water bodies. Reduction of Cr(VI) to Cr(III) is beneficial because a more mobile and more toxic chromium species is converted to a less mobile and less toxic form. Zero-valent metals, such as scrap iron, can serve as electron donors for reducing Cr(VI) to Cr(III). The influence of pH on the capacity of scrap iron to reduce Cr(VI) was investigated in this study. The maximum reduction capacity of scrap iron was observed at the beginning of the column experiments; the lower the pH, the longer the experiments ran at this maximum reduction capacity. The experimental results showed that the highest maximum reduction capacity of scrap iron was 12.5 mg Cr(VI)/g scrap iron at pH 2.0, decreasing with increasing pH down to 1.9 mg Cr(VI)/g scrap iron at pH 7.3.

Comparison of Three Metaheuristics to Optimize the Hybrid Flow Shop Scheduling Problem with Parallel Machines

This study compares three metaheuristics for minimizing the makespan (Cmax) of the Hybrid Flow Shop (HFS) scheduling problem with parallel machines, which is known to be NP-hard. Three improvement heuristic searches are considered: Genetic Algorithm (GA), Simulated Annealing (SA) and Tabu Search (TS). TS is a deterministic improvement heuristic search, whereas GA and SA are stochastic. A comprehensive comparison of these three improvement heuristic searches is presented. The results of the experiments conducted show that TS is effective and efficient in solving HFS scheduling problems.
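
A minimal sketch of one of the compared metaheuristics, simulated annealing over a job permutation, with the makespan evaluated by list-scheduling each stage's jobs onto the earliest-free parallel machine; the toy instance, neighborhood and cooling schedule are illustrative assumptions, not the study's setup.

```python
import math
import random

# Processing times PT[job][stage]; MACHINES[s] parallel machines at stage s
PT = [[4, 3], [2, 5], [5, 2], [3, 4], [6, 1]]   # toy instance (hypothetical)
MACHINES = [2, 2]

def makespan(perm):
    """List-schedule jobs stage by stage on the earliest-free machine."""
    ready = {j: 0 for j in perm}            # completion time at previous stage
    for s, m in enumerate(MACHINES):
        free = [0] * m                      # machine free times
        for j in sorted(perm, key=lambda j: ready[j]):  # serve by release time
            k = min(range(m), key=lambda i: free[i])
            start = max(free[k], ready[j])
            free[k] = ready[j] = start + PT[j][s]
    return max(ready.values())

def simulated_annealing(jobs, temp=10.0, cooling=0.995, iters=2000):
    cur, cur_c = jobs[:], makespan(jobs)
    best, best_c = cur[:], cur_c
    for _ in range(iters):
        cand = cur[:]
        a, b = random.sample(range(len(cand)), 2)   # swap-two neighborhood
        cand[a], cand[b] = cand[b], cand[a]
        c = makespan(cand)
        if c < cur_c or random.random() < math.exp((cur_c - c) / temp):
            cur, cur_c = cand, c                    # Metropolis acceptance
            if c < best_c:
                best, best_c = cand[:], c
        temp *= cooling                             # geometric cooling
    return best, best_c

random.seed(1)
print(simulated_annealing(list(range(5))))
```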

Solving Part Type Selection and Loading Problem in Flexible Manufacturing System Using Real Coded Genetic Algorithms – Part II: Optimization

This paper presents the modeling and optimization of two NP-hard problems in flexible manufacturing systems (FMS): the part type selection problem and the loading problem. Due to the complexity and extent of the problems, the work was split into two parts. The first part discussed the modeling of the problems and showed how real coded genetic algorithms (RCGA) can be applied to solve them. This second part discusses the effectiveness of the RCGA, which uses an array of real numbers as its chromosome representation. The proposed chromosome representation produces only feasible solutions, which minimizes the computational time otherwise needed by a GA to push its population toward the feasible search space or to repair infeasible chromosomes. The proposed RCGA improves FMS performance by considering two objectives: maximizing system throughput and maintaining the balance of the system (minimizing system unbalance). The resulting objective values are compared to the optimum values produced by the branch-and-bound method. The experiments show that the proposed RCGA can reach near-optimum solutions in a reasonable amount of time.
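
A minimal sketch of one common way a real-valued chromosome can be decoded so that only feasible selections arise: random-key decoding with a greedy capacity check. The abstract does not specify the authors' exact representation, so the decoding, tool-slot data and capacity below are all illustrative assumptions.

```python
import random

# Toy FMS data (hypothetical): tool-slot demand per part type and capacity
SLOTS_NEEDED = [3, 2, 4, 1, 2]
SLOT_CAPACITY = 7

def decode(chromosome):
    """Random-key decoding: rank part types by their real-valued gene and
    greedily admit them while tool-slot capacity remains, so every
    chromosome maps to a feasible part type selection (no repair needed)."""
    order = sorted(range(len(chromosome)), key=lambda i: chromosome[i])
    selected, used = [], 0
    for i in order:
        if used + SLOTS_NEEDED[i] <= SLOT_CAPACITY:
            selected.append(i)
            used += SLOTS_NEEDED[i]
    return selected

random.seed(0)
chrom = [random.random() for _ in SLOTS_NEEDED]  # one real gene per part type
print(decode(chrom))   # feasible selection for any real-valued chromosome
```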

Enhancing the Error-Correcting Performance of LDPC Codes through an Efficient Use of Decoding Iterations

The decoding of Low-Density Parity-Check (LDPC) codes operates over a redundant structure known as the bipartite graph, meaning that the full set of bit nodes is not absolutely necessary for decoder convergence. In 2008, Soyjaudah and Catherine designed a recovery algorithm for LDPC codes based on this assumption and showed that the error-correcting performance of their codes outperformed conventional LDPC codes. In this work, the use of the recovery algorithm is further explored by testing the performance of the LDPC codes as the number of decoding iterations is progressively increased. For experiments conducted with small block lengths of up to 800 bits and up to 2000 iterations, the results demonstrate that, contrary to conventional wisdom, the error-correcting performance keeps improving as the number of iterations increases.
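
A minimal sketch of sweeping the iteration budget of an iterative LDPC decoder, the experimental knob the abstract varies. A simple Gallager-style bit-flipping decoder stands in for the actual message-passing decoder, and the recovery algorithm itself is not reproduced; the parity-check matrix and received word are toy examples.

```python
import numpy as np

def bit_flip_decode(H, r, max_iter):
    """Gallager-style hard-decision bit flipping: while parity checks fail,
    flip the bits involved in the most unsatisfied checks."""
    x = r.copy()
    for _ in range(max_iter):
        syndrome = H @ x % 2
        if not syndrome.any():
            return x, True                 # all checks satisfied
        unsat = H.T @ syndrome             # unsatisfied-check count per bit
        x[unsat == unsat.max()] ^= 1       # flip the worst offenders
    return x, False

# Toy (7,4) Hamming parity-check matrix; received word is the all-zero
# codeword with a single bit error (hypothetical channel output)
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
r = np.array([0, 0, 1, 0, 0, 0, 0])

# Sweep the iteration budget, as in the abstract's experiment design
for max_iter in (1, 10, 100):
    decoded, ok = bit_flip_decode(H, r.copy(), max_iter)
    print(max_iter, decoded, "converged" if ok else "not converged")
```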