On the Analysis and a Few Optimization Issues of a New iCIM 3000 System at an Academic-Research Oriented Institution

In recent years, the world has witnessed significant work in the field of manufacturing. Special efforts have been made in the implementation of new technologies and of management and control systems, among many others, all of which have advanced the field. Closely following all of this, and due to the scope of new projects and the need to turn the existing flexible ideas into more autonomous and intelligent ones, i.e., to move toward more intelligent manufacturing, the present paper emerges with the main aim of contributing to the analysis and a few customization issues of a new iCIM 3000 system at the IPSAM. In this process, special emphasis is placed on the material flow problem. To this end, besides offering a description and analysis of the system and its main parts, some tips on how to define other possible alternative material flow scenarios and a partial analysis of the combinatorial nature of the problem are offered as well. All of this is done with the intention of relating it to the use of simulation tools, which are briefly addressed with a special focus on the Witness simulation package. For better comprehension, the previous elements are supported by a few figures and expressions that help in obtaining the necessary data. Such data, among others, will be used in the future when simulating the scenarios in the search for the best material flow configurations.
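
As a rough illustration of the combinatorial nature of the material flow problem mentioned above, the short sketch below enumerates candidate routings as permutations of intermediate stations; the station names and the fixed start and end points are hypothetical and are not taken from the actual iCIM 3000 layout.

from itertools import permutations

# Hypothetical intermediate stations a pallet may visit between the
# warehouse (fixed start) and the assembly cell (fixed end).
stations = ["cnc_turning", "cnc_milling", "quality_control", "buffer"]

# Every ordering of the intermediate stations is one candidate material
# flow scenario; their number grows factorially with the station count.
scenarios = [("warehouse", *route, "assembly") for route in permutations(stations)]

print(f"{len(scenarios)} candidate routings for {len(stations)} stations")
for route in scenarios[:3]:  # print a few examples
    print(" -> ".join(route))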

A Few Descriptive and Optimization Issues on the Material Flow at a Research-Academic Institution: The Role of Simulation

Lately, significant work in the area of Intelligent Manufacturing has become public, mainly applied within the frame of industrial purposes. Special efforts have been made in the implementation of new technologies and of management and control systems, among many others, all of which have advanced the field. Aware of all this, and due to the scope of new projects and the need to turn the existing flexible ideas into more autonomous and intelligent ones, i.e., Intelligent Manufacturing, the present paper emerges with the main aim of contributing to the design and analysis of the material flow in systems, cells, or workstations under this new “intelligent" denomination. To this end, besides offering a conceptual basis covering some of the key points to be taken into account and some general principles to consider in the design and analysis of the material flow, some tips on how to define other possible alternative material flow scenarios and a classification of the states a system, cell, or workstation can be in are offered as well. All of this is done with the intention of relating it to the use of simulation tools, which are briefly addressed with a special focus on the Witness simulation package. For better comprehension, the previous elements are supported by a detailed layout, other figures, and a few expressions that could help in obtaining the necessary data. Such data, among others, will be used in the future when simulating the scenarios in the search for the best material flow configurations.

Creativity and Economic Development

The objective of this paper is to construct a creativity composite index designed to capture the growing role of creativity in driving economic and social development for the 27 European Union countries. The paper proposes a new approach for the measurement of EU-27 creative potential and for determining its capacity to attract and develop creative human capital. We apply a modified version of the 3T model developed by Richard Florida and Irene Tinagli for constructing a Euro-Creativity Index. The resulting indexes establish a quantitative base for policy makers, supporting their efforts to determine the contribution of creativity to economic development.
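
As a minimal sketch of how such a composite index is typically assembled (min-max normalization of each sub-index followed by equal-weight averaging), consider the Python fragment below; the country names and values are invented and the equal weighting is an assumption, not the actual Euro-Creativity Index data or methodology.

# Illustrative composite-index construction with invented values; the 3T
# dimensions follow Florida and Tinagli, the numbers do not.
raw = {
    "country_A": {"talent": 3.2, "technology": 1.8, "tolerance": 0.55},
    "country_B": {"talent": 4.1, "technology": 2.6, "tolerance": 0.40},
    "country_C": {"talent": 2.7, "technology": 3.0, "tolerance": 0.70},
}
dims = ["talent", "technology", "tolerance"]

# Min-max normalise each dimension across countries, then average the
# normalised scores to obtain the composite index per country.
index = {}
for country in raw:
    scores = []
    for d in dims:
        col = [raw[c][d] for c in raw]
        lo, hi = min(col), max(col)
        scores.append((raw[country][d] - lo) / (hi - lo))
    index[country] = sum(scores) / len(dims)

for country, value in sorted(index.items(), key=lambda kv: -kv[1]):
    print(f"{country}: {value:.3f}")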

Ignition Time Delay in Swirling Supersonic Flow Combustion

A supersonic hydrogen-air cylindrical mixing layer is numerically analyzed to investigate the effect of inlet swirl on ignition time delay in scramjets. Combustion is treated using detailed chemical kinetics. The one-equation turbulence model of Spalart and Allmaras is chosen to study the problem, and the advection upstream splitting method is used as the computational scheme. The results show that swirling both the fuel and oxidizer streams may drastically decrease the ignition distance in supersonic combustion, unlike applying swirl to the fuel stream alone, which has no beneficial effect.

Induction of Alternative Oxidase Activity in Candida albicans by Oxidising Conditions

Candida albicans ATCC 10231 had low endogenous activity of the alternative oxidase compared with that of C. albicans ATCC 10261. In C. albicans ATCC 10231 the endogenous activity declined as the cultures aged. Alternative oxidase activity could be induced in C. albicans ATCC 10231 by treatment with cyanide, but the induction of this activity required the presence of oxygen, which could be replaced, at least in part, with high concentrations of potassium ferricyanide. We infer from this that the expression of the gene encoding the alternative oxidase is under the control of a redox-sensitive transcription factor.

Effects of Beak Trimming on Behavior and Agonistic Activity of Thai Native Pullets Raised in Floor Pens

The effect of beak trimming on the behavior of two strains of Thai native pullets kept in floor pens was studied. Six general activities (standing, crouching, moving, comforting, roosting, and nesting), 6 beak-related activities (preening, feeding, drinking, pecking at inedible objects, feather pecking, and litter pecking), and 4 agonistic activities (head pecking, threatening, avoiding, and fighting) were measured twice a day for 15 consecutive days, starting when the pullets were 19 wk old. It was found that beak-trimmed pullets drank more frequently (P

Quality-Driven Business Process Refactoring

Appropriate description of business processes through standard notations has become one of the most important assets for organizations. Organizations must therefore deal with quality faults in business process models, such as a lack of understandability and modifiability. These quality faults may be exacerbated if business process models are mined by reverse engineering, e.g., from existing information systems that support those business processes. Hence, business process refactoring is often used, which changes the internal structure of business processes whilst their external behavior is preserved. This paper aims to choose the most appropriate set of refactoring operators through quality assessment concerning understandability and modifiability. These quality features are assessed through well-proven measures proposed in the literature. Additionally, a set of measure thresholds is heuristically established for applying the most promising refactoring operators, i.e., those that achieve the highest quality improvement according to the selected measures in each case.
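
The threshold-driven selection of refactoring operators can be sketched as a simple lookup, as below; the measure names, threshold values, and operator names are hypothetical placeholders, not the measures or thresholds established in the paper.

# Hypothetical mapping: if a measure exceeds its threshold, the associated
# refactoring operator is suggested (placeholder names and values).
THRESHOLDS = {
    "number_of_nodes":  (50, "extract_subprocess"),
    "gateway_mismatch": (3,  "balance_gateways"),
    "nesting_depth":    (4,  "flatten_nesting"),
}

def suggest_refactorings(measures):
    """Return the operators whose associated measure exceeds its threshold."""
    return [op for name, (limit, op) in THRESHOLDS.items()
            if measures.get(name, 0) > limit]

print(suggest_refactorings({"number_of_nodes": 73,
                            "gateway_mismatch": 5,
                            "nesting_depth": 2}))
# -> ['extract_subprocess', 'balance_gateways']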

The Role of Ga to Improve AlN-Nucleation Layer for Al0.1Ga0.9N/Si(111)

Group-III nitride materials, particularly AlxGa1-xN, are among the promising optoelectronic materials required for short-wavelength devices. To achieve high-quality AlxGa1-xN films for high device performance, the AlN nucleation layer is an important factor. To improve the AlN nucleation layers through varied Ga addition, XRD measurements were conducted to analyze the crystalline quality of the subsequent Al0.1Ga0.9N, with minimum ω-FWHMs of the (0002) and (10-10) reflections of 425 arcsec and 750 arcsec, respectively. SEM and AFM measurements were performed to observe the surface morphology, and TEM measurements to identify the microstructures and orientations. The results showed that optimized Ga addition in the Al(Ga)N nucleation layers improved surface diffusion, forming crystallites more uniform in structure and size, better alignment of each crystallite, and better homogeneity of the island distribution. This improves the orientation of the epilayers on the Si surface and finally improves the crystalline quality and reduces the residual strain of the subsequent Al0.1Ga0.9N layers.

Visualising Energy Efficiency Landscape

This paper discusses landscape design that could increase energy efficiency in a house. By planting trees in a house compound, the tree shade prevents direct sunlight from heating up the building and enables cooling of the surrounding air. The requirement for air-conditioning could be minimized and the air quality improved. Over the lifetime of a tree, the cost savings from these benefits could be up to US$200 per tree. The project intends to visually describe a landscape design in a house compound that could enhance energy efficiency and consequently lead to energy savings. The house compound model was developed in three dimensions using AutoCAD 2005, and the animation was programmed using the LightWave 3D software (Modeler and Layout) to display the tree shading on the wall. The visualization was executed on a VRML Pad platform and implemented in a web environment.

Ensembling Adaptively Constructed Polynomial Regression Models

The approach of subset selection in polynomial regression model building assumes that the chosen fixed full set of predefined basis functions contains a subset that describes the target relation sufficiently well. However, in most cases the necessary set of basis functions is not known and needs to be guessed – a potentially non-trivial (and long) trial-and-error process. In our research we consider a potentially more efficient approach – Adaptive Basis Function Construction (ABFC). It lets the model building method itself construct the basis functions necessary for creating a model of arbitrary complexity with adequate predictive performance. However, there are two issues that to some extent plague both subset selection and ABFC methods, especially when working with relatively small data samples: selection bias and selection instability. We try to correct these issues by model post-evaluation using cross-validation and model ensembling. To evaluate the proposed method, we empirically compare it to ABFC methods without ensembling, to a widely used method of subset selection, and to some other well-known regression modeling methods, using publicly available data sets.
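
The post-evaluation and ensembling idea can be sketched as follows; this is not ABFC itself, only a simplified stand-in in which fixed-degree polynomial models play the role of adaptively constructed basis-function sets, using scikit-learn and synthetic data.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Synthetic data standing in for a relatively small regression sample.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(60, 2))
y = X[:, 0] ** 2 - X[:, 1] + rng.normal(scale=0.1, size=60)

# Candidate models of increasing complexity (fixed polynomial degrees here
# stand in for adaptively constructed sets of basis functions).
candidates = {d: make_pipeline(PolynomialFeatures(d), LinearRegression())
              for d in (1, 2, 3, 4)}

# Post-evaluate every candidate by cross-validation ...
cv_scores = {d: cross_val_score(m, X, y, cv=5, scoring="r2").mean()
             for d, m in candidates.items()}

# ... and average the predictions of the best-scoring models (the ensemble).
best = sorted(cv_scores, key=cv_scores.get, reverse=True)[:2]
fitted = [candidates[d].fit(X, y) for d in best]
ensemble_prediction = np.mean([m.predict(X) for m in fitted], axis=0)
print(cv_scores, ensemble_prediction[:3])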

The Sequestration of Heavy Metals Contaminating the Wonderfonteinspruit Catchment Area using Natural Zeolite

For more than 120 years, gold mining formed the backbone of South Africa's economy. The consequences of mine closure have been large-scale land degradation and widespread pollution of surface water and groundwater. This paper investigates the feasibility of using natural zeolite to remove heavy metals contaminating the Wonderfonteinspruit Catchment Area (WCA), a water stream with high levels of heavy metal and radionuclide pollution. Batch experiments were conducted to study the adsorption behavior of natural zeolite with respect to Fe2+, Mn2+, Ni2+, and Zn2+. The data were analysed using the Langmuir and Freundlich isotherms. The Langmuir isotherm was found to correlate the adsorption of Fe2+, Mn2+, Ni2+, and Zn2+ better, with adsorption capacities of 11.9 mg/g, 1.2 mg/g, 1.3 mg/g, and 14.7 mg/g, respectively. Two kinetic models, namely pseudo-first-order and pseudo-second-order, were also tested to fit the data. The pseudo-second-order equation was found to best fit the adsorption of heavy metals by natural zeolite. Functionalization of the zeolite with humic acid increased its uptake ability.
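
For reference, the models mentioned above have the following standard forms, where q_e and q_t are the amounts adsorbed at equilibrium and at time t (mg/g) and C_e is the equilibrium solute concentration; these are the general equations, not the fitted parameters of this study:

\begin{align*}
\text{Langmuir:}\quad & q_e = \frac{q_{\max} K_L C_e}{1 + K_L C_e}\\
\text{Freundlich:}\quad & q_e = K_F C_e^{1/n}\\
\text{Pseudo-first-order:}\quad & \ln(q_e - q_t) = \ln q_e - k_1 t\\
\text{Pseudo-second-order:}\quad & \frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e}
\end{align*}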

Three Dimensional Modeling of Mixture Formation and Combustion in a Direct Injection Heavy-Duty Diesel Engine

Due to stringent emission legislation for diesel engines and increasing demands on fuel consumption, the importance of detailed 3D simulation of fuel injection, mixing, and combustion has increased in recent years. In the present work, the FIRE code has been used to study the detailed modeling of spray and mixture formation in a Caterpillar heavy-duty diesel engine. The paper provides an overview of the submodels implemented, which account for liquid spray atomization, droplet secondary break-up, droplet collision, impingement, turbulent dispersion, and evaporation. The simulation was performed from intake valve closing (IVC) to exhaust valve opening (EVO). The predicted in-cylinder pressure is validated by comparison with existing experimental data. Good agreement between the predicted and experimental values confirms the accuracy of the numerical predictions of the present work. Predictions of engine emissions were also performed, and good quantitative agreement between measured and predicted NOx and soot emission data was obtained using the present Zeldovich mechanism and the Hiroyasu model. In addition, the results reported in this paper illustrate that numerical simulation can be one of the most powerful and beneficial tools for internal combustion engine design, optimization, and performance analysis.
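
For reference, thermal NO formation by the extended Zeldovich mechanism referred to above consists of the elementary reactions

\begin{align*}
\mathrm{N_2 + O} &\rightleftharpoons \mathrm{NO + N},\\
\mathrm{N + O_2} &\rightleftharpoons \mathrm{NO + O},\\
\mathrm{N + OH} &\rightleftharpoons \mathrm{NO + H}.
\end{align*}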

Integrated Cultivation Technique for Microbial Lipid Production by Photosynthetic Microalgae and Locally Oleaginous Yeast

The objective of this research is to study microbial lipid production by locally isolated photosynthetic microalgae and oleaginous yeast via an integrated cultivation technique using CO2 emissions from yeast fermentation. A maximum specific growth rate of Chlorella sp. KKU-S2 of 0.284 (1/d) was obtained under integrated cultivation, and a maximum lipid yield of 1.339 g/L was found after cultivation for 5 days, while a lipid yield of 0.969 g/L was obtained after day 6 of cultivation using CO2 from air. High values of the volumetric lipid production rate (QP, 0.223 g/L/d), specific product yield (YP/X, 0.194), and volumetric cell mass production rate (QX, 1.153 g/L/d) were found using ambient air coupled with CO2 emissions from yeast fermentation. An overall lipid yield of 8.33 g/L was obtained (1.339 g/L from Chlorella sp. KKU-S2 and 7.06 g/L from T. maleeae Y30), while a low lipid yield of 0.969 g/L was found using the non-integrated cultivation technique. To our knowledge, this is the first report on lipid production by the locally isolated microalga Chlorella sp. KKU-S2 and the yeast T. maleeae Y30 using an integrated technique to improve biomass and lipid yield through CO2 emissions from yeast fermentation.
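
The rate and yield quantities quoted above follow the usual fermentation-technology definitions, assuming P denotes the lipid concentration (g/L), X the cell mass concentration (g/L), and t the cultivation time (d):

\[
Q_P = \frac{\Delta P}{\Delta t}, \qquad
Y_{P/X} = \frac{\Delta P}{\Delta X}, \qquad
Q_X = \frac{\Delta X}{\Delta t}
\]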

An Analytical Framework for Multi-Site Supply Chain Planning Problems

With the gradual increase in enterprise scale, firms may possess many manufacturing plants located in geographically different places. This change results in multi-site production planning problems in an environment of multiple plants or production resources. Our research proposes a structural framework for analyzing multi-site planning problems. The analytical framework is composed of six elements: a multi-site conceptual model, product structure (bill of manufacturing), production strategy, manufacturing capability and characteristics, production planning constraints, and key performance indicators. In addition to discussing these six ingredients, we also review the related literature in this paper to match our analytical framework. Finally, we take a real-world practical example of a TFT-LCD manufacturer in Taiwan to explain the proposed analytical framework for multi-site production planning problems.

An Efficient and Generic Hybrid Framework for High Dimensional Data Clustering

Clustering in high dimensional space is a difficult problem which is recurrent in many fields of science and engineering, e.g., bioinformatics, image processing, pattern recognition, and data mining. In high dimensional space some of the dimensions are likely to be irrelevant, thus hiding the possible clustering. In very high dimensions it is common for all the objects in a dataset to be nearly equidistant from each other, completely masking the clusters. Hence, the performance of the clustering algorithm decreases. In this paper, we propose an algorithmic framework which combines the reduct concept of rough set theory with the k-means algorithm to remove the irrelevant dimensions in a high dimensional space and obtain appropriate clusters. Our experiments on test data show that this framework increases the efficiency of the clustering process and the accuracy of the results.
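
A minimal sketch of the framework's two stages appears below; a simple variance filter stands in for the rough-set reduct computation, and the data are synthetic, so this illustrates only the overall flow (dimensionality reduction followed by k-means), not the paper's actual reduct algorithm.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_selection import VarianceThreshold

# Synthetic high-dimensional data: 2 informative dimensions plus many
# nearly constant (irrelevant) ones that would mask the two clusters.
rng = np.random.default_rng(1)
informative = np.vstack([rng.normal(loc=c, scale=0.3, size=(50, 2))
                         for c in (0.0, 5.0)])
noise = rng.normal(scale=0.01, size=(100, 48))
X = np.hstack([informative, noise])

# Stand-in for the reduct step: keep only attributes that still
# discriminate between objects (here, via a simple variance filter).
reduced = VarianceThreshold(threshold=0.05).fit_transform(X)

# Cluster in the reduced space.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(reduced)
print(reduced.shape, np.bincount(labels))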

Secure Protocol for Short Message Service

Short Message Service (SMS) has grown in popularity over the years and has become a common way of communication. It is a service provided through the Global System for Mobile Communications (GSM) that allows users to send text messages to others. SMS is usually used to transport unclassified information, but with the rise of mobile commerce it has become a popular tool for transmitting sensitive information between businesses and their clients. By default, SMS does not guarantee confidentiality and integrity of the message content. In mobile communication systems, the security (encryption) offered by the network operator only applies on the wireless link; data delivered through the mobile core network may not be protected. Existing end-to-end security mechanisms are provided at the application level and are typically based on public-key cryptosystems. The main concern in a public-key setting is the authenticity of the public key; this issue can be resolved by identity-based (ID-based) cryptography, where the public key of a user can be derived from public information that uniquely identifies the user. This paper presents an encryption mechanism based on an ID-based scheme using elliptic curves to provide end-to-end security for SMS. This mechanism has been implemented over the standard SMS network architecture, and the encryption overhead has been estimated and compared with an RSA scheme. This study indicates that the ID-based mechanism has advantages over the RSA mechanism in key distribution and in the scalability of increasing the security level for mobile services.
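
The proposed scheme is identity-based; the sketch below is not that protocol but only illustrates, under the assumption that the Python cryptography package is available, the general pattern of elliptic-curve end-to-end protection of an SMS payload (key agreement plus authenticated encryption).

import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party holds an elliptic-curve key pair; in an ID-based scheme these
# would instead be derived by a key-generation centre from the users' IDs.
sender_priv = ec.generate_private_key(ec.SECP256R1())
receiver_priv = ec.generate_private_key(ec.SECP256R1())

# ECDH key agreement followed by HKDF gives both ends the same AES key.
shared_secret = sender_priv.exchange(ec.ECDH(), receiver_priv.public_key())
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"sms-end-to-end").derive(shared_secret)

# The short SMS payload is protected with authenticated encryption.
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"PIN 4321 confirmed", None)
print(len(ciphertext), "bytes of protected payload")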

Quantum Computation using Two Component Bose-Einstein Condensates

Quantum computation using qubits made of two-component Bose-Einstein condensates (BECs) is analyzed. We construct a general framework for quantum algorithms to be executed using the collective states of the BECs. The use of BECs allows for an increase of energy scales via bosonic enhancement, resulting in two-qubit gate operations that can be performed in a time reduced by a factor of N, where N is the number of bosons per qubit. We illustrate the scheme by an application to Deutsch's and Grover's algorithms, and discuss possible experimental implementations. Decoherence effects are analyzed under both general conditions and for the proposed experimental implementation.

Accurate Visualization of Graphs of Functions of Two Real Variables

The study of a real function of two real variables can be supported by visualization using a Computer Algebra System (CAS). One type of constraint of such systems is due to the algorithms implemented, which yield continuous approximations of the given function by interpolation. This often masks discontinuities of the function and can produce strange plots that are not compatible with the mathematics. In recent years, point-based geometry has gained increasing attention as an alternative surface representation, both for efficient rendering and for flexible geometry processing of complex surfaces. In this paper we present different artifacts created by mesh surfaces near discontinuities and propose a point-based method that controls and reduces these artifacts. A least-squares penalty method for the automatic generation of a mesh that controls the behavior of the chosen function is presented. The special feature of this method is its ability to improve the accuracy of the surface visualization near a set of interior points where the function may be discontinuous. The present method is formulated as a minimax problem, and the non-uniform mesh is generated using an iterative algorithm. Results show that for large, poorly conditioned matrices, the new algorithm gives more accurate results than the classical preconditioned conjugate gradient algorithm.

Comparative Analysis of Concentration in Insurance Markets in New EU Member States

The purpose of this article is to analyze the market structure as well as the degree of concentration in the insurance markets of the new EU member states. The analysis was conducted using several of the most commonly used concentration indicators, such as the concentration ratio, the Herfindahl-Hirschman index, and the entropy index. These indicators were calculated for the 2000-2010 period on the basis of total gross written premium as the most relevant indicator of market power in insurance markets. The results of the analysis showed that the level of concentration decreased in all observed countries, though with significantly different intensity. Yet, in some countries, the level of concentration remains very high.
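
With s_i denoting the market share of insurer i in total gross written premium and n the number of insurers, the three indicators mentioned above take their standard forms:

\[
CR_k = \sum_{i=1}^{k} s_i, \qquad
HHI = \sum_{i=1}^{n} s_i^{2}, \qquad
E = -\sum_{i=1}^{n} s_i \ln s_i
\]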