Design for Manufacturability and Concurrent Engineering for Product Development

In the 1980s, companies began to feel the effect of three major influences on their product development: newer and innovative technologies, increasing product complexity, and larger organizations, and were therefore forced to look for new product development methods. This paper focuses on two of these methods, Design for Manufacturability (DFM) and Concurrent Engineering (CE), and analyzes how they are applied in product development. By adopting them, companies can shorten the product life cycle, reduce cost, and meet delivery schedules. The paper also presents simplified models that can be adapted and used by different companies according to their objectives and requirements. The research methodology is based on case studies: two companies were analyzed with respect to their product development process, using historical data and interviews, together with a survey of the literature and previous research on similar topics. The paper also presents an implementation cost-benefit analysis and estimates the implementation time. The research found that the two companies did not meet their delivery times to the customer. Some of the most frequently ordered products were analyzed, and 50% to 80% of them were not delivered on time. The companies follow the traditional, sequential design-then-production approach to product development, which strongly affects time to market. The case study shows that by implementing these new methods and by forming a multidisciplinary team for design and quality inspection, a company can reduce its workflow from 40 steps to 30.

Forecasting Enrollment Model Based on First-Order Fuzzy Time Series

This paper proposes a novel improvement of a forecasting approach based on time-invariant fuzzy time series. In contrast to traditional forecasting methods, fuzzy time series can also be applied to problems in which the historical data are linguistic values. It is shown that the proposed time-invariant method improves the performance of the forecasting process. Further, the effect of using different numbers of fuzzy sets is tested as well. As in most of the cited papers, the historical enrollment of the University of Alabama is used in this study to illustrate the forecasting process. Subsequently, the performance of the proposed method is compared with existing time-invariant fuzzy time series models in terms of forecasting accuracy. The comparison reveals a certain performance superiority of the proposed method over the methods described in the literature.
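
As an illustration of the general first-order fuzzy time series procedure (partitioning the universe of discourse, fuzzifying the historical values, building fuzzy logical relationship groups, and defuzzifying the forecast), a minimal Python sketch is given below. The interval count, the toy enrollment figures and the midpoint defuzzification rule are illustrative assumptions, not the exact model proposed in the paper.

```python
# Minimal first-order fuzzy time series forecaster (sketch only).
# The interval count and the toy data are illustrative assumptions.

def build_intervals(values, n_intervals, margin=100):
    lo, hi = min(values) - margin, max(values) + margin
    width = (hi - lo) / n_intervals
    return [(lo + i * width, lo + (i + 1) * width) for i in range(n_intervals)]

def fuzzify(value, intervals):
    # Index of the fuzzy set A_i whose interval contains the value.
    for i, (a, b) in enumerate(intervals):
        if a <= value <= b:
            return i
    return len(intervals) - 1

def forecast(values, n_intervals=7):
    intervals = build_intervals(values, n_intervals)
    states = [fuzzify(v, intervals) for v in values]

    # First-order fuzzy logical relationship groups: A_i -> {A_j, ...}
    flrg = {}
    for cur, nxt in zip(states, states[1:]):
        flrg.setdefault(cur, set()).add(nxt)

    mids = [(a + b) / 2 for a, b in intervals]
    last = states[-1]
    targets = flrg.get(last, {last})
    # Defuzzify as the average of the midpoints of the consequent sets.
    return sum(mids[j] for j in targets) / len(targets)

if __name__ == "__main__":
    enrollment = [13055, 13563, 13867, 14696, 15460, 15311, 15603, 15861]  # toy data
    print("next-year forecast:", round(forecast(enrollment)))
```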

A Face-to-Face Education Support System Capable of Lecture Adaptation and Q&A Assistance Based on Probabilistic Inference

Keys to high-quality face-to-face education are ensuring flexibility in the way lectures are given and providing care and responsiveness to learners. This paper describes a face-to-face education support system designed to raise the satisfaction of learners and reduce the workload on instructors. The system consists of a lecture adaptation assistance part, which assists instructors in adapting teaching content and strategy, and a Q&A assistance part, which provides learners with answers to their questions. The core component of the former is a "learning achievement map", which is composed of a Bayesian network (BN). From learners' performance in exercises on relevant past lectures, the lecture adaptation assistance part obtains the information required to appropriately adapt the presentation of the next lecture. The core component of the Q&A assistance part is a case base, which accumulates cases consisting of questions expected from learners and answers to them. The Q&A assistance part is a case-based search system equipped with a search index that performs probabilistic inference. A prototype face-to-face education support system intended for the teaching of Java programming has been built, and the approach was evaluated using this system. The expected degree of understanding of each learner for a future lecture was derived from his or her performance in exercises on past lectures, and this expected degree of understanding was used to select one of three adaptation levels. A model for determining the adaptation level most suitable for the individual learner has been identified. An experimental case base was built to examine the search performance of the Q&A assistance part, and it was found that the rate of successfully finding an appropriate case was 56%.
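
A minimal sketch of the kind of computation involved in the lecture adaptation part is shown below: past exercise scores are combined into an expected degree of understanding, which is then mapped to one of three adaptation levels. The weighting scheme, the thresholds and the level names are hypothetical; the paper's actual model is a Bayesian-network "learning achievement map", which is not reproduced here.

```python
# Hypothetical sketch: map past exercise performance to one of three
# adaptation levels. The real system uses a Bayesian network
# ("learning achievement map"); weights and thresholds below are assumptions.

def expected_understanding(scores, weights=None):
    """Weighted average of exercise scores in [0, 1]."""
    if weights is None:
        weights = [1.0] * len(scores)
    total = sum(w * s for w, s in zip(weights, scores))
    return total / sum(weights)

def adaptation_level(expected):
    """Pick one of three adaptation levels from the expected understanding."""
    if expected < 0.4:
        return "detailed"      # slow down, add basics
    if expected < 0.7:
        return "standard"      # default presentation
    return "advanced"          # skip basics, add extensions

if __name__ == "__main__":
    past_scores = [0.8, 0.6, 0.9]   # exercise results on relevant past lectures
    e = expected_understanding(past_scores)
    print(e, adaptation_level(e))
```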

Calibration of Time-Skew Error in an M-Channel Time-Interleaved Analog-to-Digital Converter

Offset mismatch, gain mismatch, and time-skew error between time-interleaved channels limit the performance of time-interleaved analog-to-digital converters (TIADCs). This paper focuses on the time-skew error. A new technique for calibrating time-skew error in an M-channel TIADC is described, and simulation results are also presented.
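
The abstract does not detail the calibration algorithm itself, but the underlying channel model is standard: channel m of an M-channel TIADC samples the input at t = (kM + m)Ts + delta_m, where delta_m is the time skew of that channel. The short sketch below only simulates this error model for a sinusoidal input, as a hedged illustration of the distortion the proposed calibration is meant to remove; the sampling rate and skew values are assumptions.

```python
import numpy as np

# Illustrative simulation of time-skew error in an M-channel TIADC.
# Channel m samples at t = (k*M + m)*Ts + skew[m]; the calibration method
# of the paper itself is not reproduced here.

def tiadc_samples(freq, fs, n_samples, skew):
    M = len(skew)
    Ts = 1.0 / fs
    n = np.arange(n_samples)
    ideal_t = n * Ts
    skewed_t = ideal_t + np.array([skew[i % M] for i in n])
    x_ideal = np.sin(2 * np.pi * freq * ideal_t)
    x_skewed = np.sin(2 * np.pi * freq * skewed_t)
    return x_ideal, x_skewed

if __name__ == "__main__":
    fs = 1e9                              # aggregate sampling rate (assumed)
    skew = [0.0, 3e-12, -2e-12, 5e-12]    # per-channel skew, M = 4 (assumed)
    ideal, skewed = tiadc_samples(101e6, fs, 4096, skew)
    err = skewed - ideal
    print("rms distortion due to time skew:", np.sqrt(np.mean(err ** 2)))
```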

The National Security Assurance of the Republic of Kazakhstan

The article analyzes national security as a scientific and practical problem, characterized by the ability of the state's political institutions to take effective action to maintain optimal conditions for the existence and development of the individual and society. National security, as a category of political science, reflects the relationship between security and the nation, including public relations and social consciousness, and social institutions and their activities, which ensure the realization of national interests in a particular historical situation. National security comprises three levels: the individual, society, and the state. Their role and place are determined by the nature of social relations, the political system, and the presence of internal and external threats. In terms of content, the concept of national security is taken to include political, economic, military, environmental, and information security, as well as the security of the cultural development of the nation.

Numerical Analysis and Sensitivity Study of Non-Premixed Combustion Using LES

A Computational Fluid Dynamics (CFD) study of non-premixed turbulent combustion has been carried out in a simplified methane-fuelled coaxial jet combustor employing Large Eddy Simulation (LES). The objective of this study is to evaluate the performance of LES in modelling non-premixed combustion using the commercial software FLUENT, and to investigate the effects of grid density and of the chemistry models employed on the accuracy of the simulation results. A comparison has also been made between LES and Reynolds-Averaged Navier-Stokes (RANS) predictions. For the LES grid sensitivity test, 2.3 and 6.2 million cell grids are employed with the equilibrium model. The chemistry model sensitivity analysis is carried out by comparing the simulation results from the equilibrium chemistry and steady flamelet models. The LES predictions of mixture fraction, axial velocity, species mass fraction and temperature are in good agreement with the experimental data. The LES results are similar for the two chemistry models but are influenced considerably by the grid resolution in the inner flame and near-wall regions.

An Advanced Nelder Mead Simplex Method for Clustering of Gene Expression Data

DNA microarray technology concurrently monitors the expression levels of thousands of genes during significant biological processes and across related samples. A better understanding of functional genomics is obtained by extracting the patterns hidden in gene expression data. This is handled by clustering, which reveals natural structures and identifies interesting patterns in the underlying data. In the proposed work, clustering of gene expression data is performed using an Advanced Nelder Mead (ANM) algorithm. The Nelder Mead (NM) method is an optimization method in which the vertices of a simplex (a triangle in two dimensions) are treated as candidate solutions, and a set of operations is applied to the simplex to obtain a better result. In the proposed work, the reflection and expansion operations are eliminated and a new operation called spread-out is introduced. The spread-out operation increases the global search area and thus provides a better optimization result: it generates three points, and the best among them is used to replace the worst point. The experimental results are analyzed on optimization benchmark test functions and gene expression benchmark datasets, and show that ANM outperforms NM on both.
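
To make the modification concrete, the sketch below shows a simplex iteration in which the usual reflection/expansion moves are replaced by a hypothetical "spread-out" operation that generates three candidate points and replaces the worst vertex with the best of them, as the abstract describes. The way the three candidates are generated here (scaled moves through the centroid) is an assumption; the contraction and shrink steps follow the standard Nelder Mead rules.

```python
import numpy as np

# Sketch of a Nelder-Mead-style simplex step where reflection/expansion are
# replaced by a "spread-out" move: three candidates are generated and the
# best of them replaces the worst vertex. How the candidates are generated
# (scales 1.0, 1.5, 2.0 through the centroid) is an assumption.

def anm_minimize(f, x0, iters=500, step=0.5):
    n = len(x0)
    simplex = [np.array(x0, float)]
    for i in range(n):
        v = np.array(x0, float)
        v[i] += step
        simplex.append(v)

    for _ in range(iters):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        centroid = np.mean(simplex[:-1], axis=0)

        # Spread-out: three candidates along the centroid-worst direction.
        candidates = [centroid + a * (centroid - worst) for a in (1.0, 1.5, 2.0)]
        cand = min(candidates, key=f)

        if f(cand) < f(simplex[-2]):
            simplex[-1] = cand                       # accept spread-out point
        else:
            contracted = centroid + 0.5 * (worst - centroid)
            if f(contracted) < f(worst):
                simplex[-1] = contracted             # inside contraction
            else:
                simplex = [best + 0.5 * (v - best) for v in simplex]  # shrink

    return min(simplex, key=f)

if __name__ == "__main__":
    rosenbrock = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
    print(anm_minimize(rosenbrock, [-1.2, 1.0]))
```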

Performance of a Power Generator System Using Crude Plant Oil Blend with Diesel Fuel

Given the variation of crude oil prices and the impact of the greenhouse effect, it is urgent to find a potential alternative fuel. Among alternative fuels, non-edible plant oils are the most promising, because they do not raise problems of competition with food crops and cropland. Among the non-edible plant oils, Jatropha oil is the most promising one: it is non-edible, has good oil quality and good low-temperature performance, and has the potential to become one of the most competitive biomass crude oils. In this study, the crude plant oil is blended with diesel fuel and tested in a power generator. The international collaboration between Taiwan and Indonesia on the production of Jatropha in Indonesia is also presented.

Applying Theory of Perceived Risk and Technology Acceptance Model in the Online Shopping Channel

With the advancement of technology, the online shopping channel has developed rapidly in recent years. According to a report of the Taiwan Network Information Center, almost eighty percent of the Internet population shops through the online channel. Synthesizing insights from previous research, this study develops a conceptual model that integrates the Theory of Perceived Risk (TPR) and the Technology Acceptance Model (TAM) and applies it to online shopping. Using data collected from 637 respondents of an online survey website, we use structural equation modeling to test the measurement and structural models. The results suggest the need to consider perceived risk as an antecedent in the Technology Acceptance Model. The limitations and implications are discussed.
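
Since the study estimates a structural model in which perceived risk acts as an antecedent to the TAM constructs, a very rough illustration is sketched below. It estimates the hypothesized paths with ordinary least squares on construct scores rather than full structural equation modeling with latent variables, and the construct names, path values and simulated data are purely hypothetical.

```python
import numpy as np

# Rough illustration only: hypothesized TPR/TAM paths estimated with OLS on
# construct scores, not the full latent-variable SEM used in the study.
# Construct names and the simulated data are hypothetical.

rng = np.random.default_rng(0)
n = 637

perceived_risk = rng.normal(size=n)
ease_of_use = -0.3 * perceived_risk + rng.normal(scale=0.9, size=n)
usefulness = -0.2 * perceived_risk + 0.5 * ease_of_use + rng.normal(scale=0.8, size=n)
intention = -0.25 * perceived_risk + 0.4 * usefulness + 0.2 * ease_of_use \
            + rng.normal(scale=0.8, size=n)

def ols(y, X):
    """Return OLS path coefficients of y on the columns of X (plus intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta[1:]

print("PR -> PEOU:", ols(ease_of_use, perceived_risk[:, None]))
print("PR, PEOU -> PU:", ols(usefulness, np.column_stack([perceived_risk, ease_of_use])))
print("PR, PU, PEOU -> BI:",
      ols(intention, np.column_stack([perceived_risk, usefulness, ease_of_use])))
```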

Water Security in Rural Areas through Solar Energy in Baja California Sur, Mexico

This study aims to assess the potential of solar energy technology for improving access to water and hence the livelihood strategies of rural communities in Baja California Sur, Mexico. It focuses on livestock ranches and on photovoltaic (PV) water-pump technology as well as other water extraction methods. The methodologies used are the Sustainable Livelihoods and Appropriate Technology approaches. A household survey was applied in June 2006 to 32 ranches in the municipality, of which 22 used PV pumps, and semi-structured interviews were conducted. Findings indicate that solar pumps have in fact helped people improve their quality of life by allowing them to pursue a different livelihood strategy, and that improved access to water (not necessarily as more water, but as less effort to extract and collect it) does not automatically imply overexploitation of the resource; consumption is based on basic needs as well as on storage and pumping capacity. Justification for such systems lies in the avoidance of the logistical problems associated with fossil fuels; PV pumps proved to be the most beneficial when substituting gasoline or diesel equipment, but of dubious advantage when intended to replace wind or gravity systems. The main obstacles to the dissemination of solar water-pumping technology are high investment and repair costs, and it is therefore not suitable for all cases even when insolation rates and water availability are adequate. In cases where affordability is not an obstacle, it has become an important asset that contributes, through reduced expenses, less effort and saved time, to the improvement of livestock raising, the main livelihood provider for these ranches.

Fuzzy Control of Macroeconomic Models

Optimal control is one of the possible controllers for a dynamic system, typically formulated as a linear quadratic regulator and solved using Pontryagin's principle or the dynamic programming method. Stochastic disturbances may affect the coefficients (multiplicative disturbances) or the equations (additive disturbances), provided that the shocks are not too great. Nevertheless, this approach encounters difficulties when uncertainties are very large or when the probability calculus is of no help because the data are very imprecise. Fuzzy logic contributes to a pragmatic solution of such a problem since it operates on fuzzy numbers. A fuzzy controller acts as an artificial decision maker that operates in a closed-loop system in real time. This contribution explores the tracking problem and the control of dynamic macroeconomic models using a fuzzy learning algorithm. A two-input single-output (TISO) fuzzy model is applied to the linear fluctuation model of Phillips and to the nonlinear growth model of Goodwin.
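
A minimal sketch of a two-input single-output fuzzy controller of the kind described (tracking error and change of error as inputs, a control adjustment as output) is given below, applied to a simple first-order tracking problem. The triangular membership functions, the rule base and the toy plant are illustrative assumptions, not the Phillips or Goodwin models analysed in the paper.

```python
# Sketch of a two-input single-output (TISO) fuzzy controller: inputs are the
# tracking error and its change, output is a control value. Membership
# functions, rule base and the toy first-order plant are illustrative
# assumptions, not the macroeconomic models treated in the paper.

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def memberships(x):
    # Negative / Zero / Positive fuzzy sets on [-2, 2].
    return {"N": tri(x, -2.0, -1.0, 0.0),
            "Z": tri(x, -1.0, 0.0, 1.0),
            "P": tri(x, 0.0, 1.0, 2.0)}

# Rule base: (error set, delta-error set) -> output singleton.
RULES = {("N", "N"): -1.0, ("N", "Z"): -0.7, ("N", "P"): 0.0,
         ("Z", "N"): -0.5, ("Z", "Z"): 0.0, ("Z", "P"): 0.5,
         ("P", "N"): 0.0, ("P", "Z"): 0.7, ("P", "P"): 1.0}

def fuzzy_control(error, d_error):
    me, md = memberships(error), memberships(d_error)
    num = den = 0.0
    for (e_set, d_set), u in RULES.items():
        w = min(me[e_set], md[d_set])      # rule firing strength (min t-norm)
        num, den = num + w * u, den + w
    return num / den if den else 0.0

if __name__ == "__main__":
    x, target, prev_err = 0.0, 1.0, 0.0
    for k in range(30):
        err = target - x
        u = fuzzy_control(err, err - prev_err)
        prev_err = err
        x = 0.9 * x + 0.5 * u              # toy first-order plant
    print("state after 30 steps:", round(x, 3))
```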

Calculating the Efficiency of Steam Boilers Based on Their Most Influential Factors: A Case Study

This paper is concerned with calculating boiler efficiency, one of the most important performance measurements in any steam power plant, as it has a key role in determining the overall effectiveness of the whole system within the power station. For this calculation, a Visual Basic program was developed, and a steam power plant known as the El-Khmus power plant, Libya, was selected as a case study. The boiler efficiency was calculated using the heat balance method. The findings show that the boiler efficiency, and the maximum heat energy produced by the boiler, increases with increasing feed-water temperature and with decreasing exhaust temperature and moisture content of the fuel used in the boiler.
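
For reference, the heat-balance (indirect) method computes efficiency as 100% minus the sum of the individual heat losses expressed as percentages of the fuel heat input. The sketch below shows this bookkeeping with a simplified, assumed set of loss terms; the actual program in the paper was written in Visual Basic and uses the plant's measured data.

```python
# Simplified heat-balance (indirect) method: efficiency = 100 - sum of losses,
# with each loss expressed as a percentage of the fuel heat input.
# The loss values below are assumed placeholders, not El-Khmus plant data.

def boiler_efficiency(losses_percent):
    """losses_percent: mapping of loss name -> % of heat input lost."""
    return 100.0 - sum(losses_percent.values())

if __name__ == "__main__":
    losses = {
        "dry_flue_gas": 5.2,          # rises with exhaust-gas temperature
        "moisture_in_fuel": 0.4,      # rises with fuel moisture content
        "hydrogen_in_fuel": 3.8,
        "radiation_and_convection": 0.8,
        "unburnt_combustibles": 0.5,
    }
    print("boiler efficiency: %.1f %%" % boiler_efficiency(losses))
```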

Towards a New Methodology for Developing Web-Based Systems

Web-based systems have become increasingly important because the Internet and the World Wide Web have become ubiquitous, surpassing all other technological developments in our history. The Internet, and especially companies' websites, has rapidly evolved in scope and extent of use, from being little more than fixed advertising material, i.e. a "web presence" with no particular influence on the company's business, to being one of the most essential parts of the company's core business. Traditional software engineering approaches with process models such as, for example, CMM and the Waterfall model do not work very well, since web system development differs from traditional development. The development differs in several ways: for example, there is a large gap between traditional software engineering designs and concepts and the low-level implementation model, and many web-based system development activities are business-oriented (for example, web applications are sales-oriented, and intranets are content-oriented) rather than engineering-oriented. This paper introduces the Increment Iterative extreme Programming (IIXP) methodology for developing web-based systems. In contrast to existing methodologies, it is a combination of different traditional and modern software engineering and web engineering principles.

Dynamic Interaction Network to Model the Interactive Patterns of International Stock Markets

Studies in the economics domain have tried to reveal the correlations between stock markets. Since the globalization era, the interdependence between stock markets has become more obvious. The Dynamic Interaction Network (DIN) algorithm, inspired by a Gene Regulatory Network (GRN) extraction method from the bioinformatics field, is applied to reveal important and complex dynamic relationships between stock markets. We use the stock market indices of eight countries around the world in this study. Our results show that DIN is able to reveal and model patterns of dynamic interaction among the observed variables (i.e. stock market indices). Furthermore, the extracted network models can be utilized to predict the movement of the stock market indices with considerably good accuracy.
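
The abstract does not spell out the DIN extraction procedure, so the sketch below only illustrates one simple way to obtain a directed interaction network from index series: lag-1 cross-correlations with a threshold. This is a hedged stand-in for the idea of extracting directed dynamic relationships, not the GRN-inspired algorithm used in the paper.

```python
import numpy as np

# Hedged stand-in for a dynamic interaction network: add a directed edge
# i -> j when the lag-1 cross-correlation between series i and the next-day
# value of series j exceeds a threshold. This is NOT the GRN-inspired DIN
# algorithm of the paper, only an illustration.

def lagged_interaction_network(series, threshold=0.3):
    """series: array of shape (T, n_markets); output: list of (i, j, corr)."""
    T, n = series.shape
    edges = []
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            c = np.corrcoef(series[:-1, i], series[1:, j])[0, 1]
            if abs(c) >= threshold:
                edges.append((i, j, round(float(c), 3)))
    return edges

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    m0 = rng.normal(size=250)
    # Toy data: market 1 partly follows market 0 with a one-day lag.
    m1 = 0.6 * np.roll(m0, 1) + rng.normal(scale=0.8, size=250)
    data = np.column_stack([m0, m1, rng.normal(size=250)])
    print(lagged_interaction_network(data))
```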

Multivalued Knowledge-Base based on Multivalued Datalog

The basic aim of our study is to give a possible model for handling uncertain information. This model is worked out in the framework of DATALOG. The concept of a multivalued knowledge-base is defined as a quadruple of background knowledge, a deduction mechanism, a connecting algorithm, and a function set of the program, which help us to determine the uncertainty levels of the results. First the concept of fuzzy Datalog is summarized, then its extensions to intuitionistic and interval-valued fuzzy logic are given, and the concept of bipolar fuzzy Datalog is introduced. Based on these extensions, the concept of a multivalued knowledge-base is defined. This knowledge-base can be a possible background for a future agent model.
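
To give a flavour of the kind of evaluation involved, the sketch below performs naive bottom-up evaluation of fuzzy Datalog-style rules in which every fact carries an uncertainty level and a rule fires with the minimum (Gödel t-norm) of its body degrees, capped by the rule's own degree. The concrete syntax, the t-norm choice and the example program are illustrative assumptions, not the multivalued (intuitionistic, interval-valued, bipolar) extensions defined in the paper.

```python
# Naive bottom-up evaluation of fuzzy Datalog-style rules: each fact has an
# uncertainty level in [0, 1]; a rule fires with min(body degrees, rule degree).
# Syntax, t-norm and the example program are illustrative assumptions.

facts = {("parent", "a", "b"): 0.9, ("parent", "b", "c"): 0.8}

# (head pattern, body patterns, rule degree); variables start with uppercase.
rules = [
    (("ancestor", "X", "Y"), [("parent", "X", "Y")], 1.0),
    (("ancestor", "X", "Z"), [("parent", "X", "Y"), ("ancestor", "Y", "Z")], 0.7),
]

def match(pattern, fact, env):
    env = dict(env)
    for p, f in zip(pattern, fact):
        if p[0].isupper():                   # variable
            if env.get(p, f) != f:
                return None
            env[p] = f
        elif p != f:                         # constant mismatch
            return None
    return env

def evaluate(facts, rules):
    db = dict(facts)
    changed = True
    while changed:
        changed = False
        for head, body, rdeg in rules:
            envs = [({}, rdeg)]
            for atom in body:                # join the body atoms left to right
                new_envs = []
                for env, deg in envs:
                    for fact, fdeg in db.items():
                        if len(fact) == len(atom) and fact[0] == atom[0]:
                            e2 = match(atom, fact, env)
                            if e2 is not None:
                                new_envs.append((e2, min(deg, fdeg)))
                envs = new_envs
            for env, deg in envs:            # assert head facts with their degree
                ground = tuple(env.get(t, t) for t in head)
                if deg > db.get(ground, 0.0):
                    db[ground] = deg
                    changed = True
    return db

print(evaluate(facts, rules))
```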

Stochastic Subspace Modelling of Turbulence

Turbulence of the incoming wind field is of paramount importance to the dynamic response of civil engineering structures. Hence, reliable stochastic models of the turbulence should be available from which time series can be generated for dynamic response and structural safety analysis. In this paper, an empirical cross-spectral density function for the along-wind turbulence component over the wind field area is taken as the starting point. The spectrum is spatially discretized in terms of a Hermitian cross-spectral density matrix for the turbulence state vector, which turns out not to be positive definite. Since the succeeding state space and ARMA modelling of the turbulence rely on the positive definiteness of the cross-spectral density matrix, the problem of the non-positive definiteness of such matrices is first addressed and suitable treatments are proposed. From the adjusted positive definite cross-spectral density matrix, a frequency response matrix is constructed which determines the turbulence vector as a linear filtration of Gaussian white noise. Finally, an accurate state space modelling method is proposed which allows selection of an appropriate model order and estimation of a state space model for the vector turbulence process incorporating its phase spectrum in one stage, and its results are compared with a conventional ARMA modelling method.
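
One standard way to handle a cross-spectral density matrix that fails to be positive (semi-)definite is to clip its negative eigenvalues and then factor the adjusted matrix to obtain a frequency response matrix H with H H^H = S. The sketch below shows this eigenvalue-clipping and factorization step for a single frequency; it is a generic illustration, not necessarily the specific treatment proposed in the paper.

```python
import numpy as np

# Generic illustration: make a Hermitian cross-spectral density matrix
# positive semi-definite by clipping negative eigenvalues, then factor it
# into a frequency response matrix H with H @ H.conj().T == S_adjusted.
# This is not necessarily the specific treatment proposed in the paper.

def nearest_psd(S, eps=0.0):
    """Clip negative eigenvalues of a Hermitian matrix S."""
    S = 0.5 * (S + S.conj().T)              # enforce Hermitian symmetry
    w, V = np.linalg.eigh(S)
    w_clipped = np.clip(w, eps, None)
    return (V * w_clipped) @ V.conj().T

def frequency_response(S_psd):
    """Return H such that H @ H^H = S_psd (eigen-decomposition square root)."""
    w, V = np.linalg.eigh(S_psd)
    return V @ np.diag(np.sqrt(np.clip(w, 0.0, None)))

if __name__ == "__main__":
    # Toy Hermitian matrix with a slightly negative eigenvalue.
    S = np.array([[1.0, 0.9 + 0.3j], [0.9 - 0.3j, 0.8]])
    S_adj = nearest_psd(S)
    H = frequency_response(S_adj)
    print("max factorization error:", np.abs(H @ H.conj().T - S_adj).max())
```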

Classification and Resolution of Urban Problems by Means of a Fuzzy Approach

Urban problems are problems of organized complexity; thus, many models and scientific methods for resolving urban problems have failed. This study proposes a fuzzy-system-driven approach for classifying and solving urban problems. It mainly investigates the selection of the inputs and outputs of urban systems for the classification of urban problems. In this research, five categories of urban problems were recognized with respect to the fuzzy system approach: control, polytely, optimizing, open, and decision-making problems. Grounded Theory techniques were then applied to analyze the data and develop a new solving method for each category. The findings indicate that fuzzy system methods are powerful processes and analytic tools for helping planners resolve complex urban problems. These tools can succeed where others have failed because they address uncertainty and risk, as well as complexity and systems interacting with other systems.

Influence of the Number of Parallel Paths of a Winding on Overvoltage in Asynchronous Motors Fed by PWM Converters

This work is devoted to the calculation of the undulatory parameters and the study of the influence of the number of parallel paths of a winding on the overvoltage to the frame and between turns (sections) in a multi-turn random winding of asynchronous motors supplied by PWM converters.

Effect of Concrete Nonlinear Parameters on the Seismic Response of Concrete Gravity Dams

The behavior of dams under seismic loads has been studied by many researchers, most of whom proposed new numerical methods to investigate dam safety. In this paper, to study the effect of the nonlinear parameters of concrete in gravity dams, a two-dimensional approach was used, combining the finite element method, the staggered method and the smeared crack approach. The effective parameters in the models are physical properties of the concrete, such as the modulus of elasticity, the tensile strength and the specific fracture energy. Two different foundation models (massless and massed) were used in order to determine the seismic response of concrete gravity dams. The results show that when the nonlinear analysis includes dam-foundation interaction, the foundation's mass, flexibility and radiation damping are important to the gravity dam's response.

Application of Simulation and Response Surface to Optimize Hospital Resources

This paper presents a case study that uses process-oriented simulation to identify bottlenecks in the service delivery system of an emergency department of a hospital in the United Arab Emirates. Using the results of the simulation, response surface models were developed to explain patient waiting time and the total time patients spend in the hospital system. The results of the study could be used as a service improvement tool to help hospital management improve patient throughput and service quality in the hospital system.
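
As an indication of the response-surface step, the sketch below fits a second-order polynomial model of patient waiting time as a function of two controllable resources (for instance, numbers of doctors and nurses) to results produced by a simulation. The factor names, the stand-in "simulate" function and the design points are assumptions for illustration only.

```python
import numpy as np

# Illustrative response-surface fit: quadratic model of patient waiting time
# in terms of two resource levels. The "simulate" function below is a stand-in
# for the process-oriented simulation model, and factor names are assumptions.

def simulate(doctors, nurses, rng):
    """Stand-in for the emergency-department simulation (toy response + noise)."""
    wait = 120 - 8 * doctors - 5 * nurses + 0.4 * doctors ** 2 \
           + 0.3 * nurses ** 2 + 0.2 * doctors * nurses
    return wait + rng.normal(scale=2.0)

def fit_response_surface(X, y):
    """Least-squares fit of y = b0 + b1 d + b2 n + b3 d^2 + b4 n^2 + b5 d*n."""
    d, n = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(d), d, n, d ** 2, n ** 2, d * n])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    design = np.array([(d, n) for d in range(2, 9) for n in range(4, 13)], float)
    waits = np.array([simulate(d, n, rng) for d, n in design])
    print("fitted coefficients:", np.round(fit_response_surface(design, waits), 2))
```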