Learning Monte Carlo Data for Circuit Path Length

This paper analyzes patterns in Monte Carlo data for large numbers of variables and minterms in order to characterize circuit path length behavior. We propose models determined by a training process on shortest path lengths derived from a wide range of binary decision diagram (BDD) simulations. The model was built using a feed-forward neural network (NN) modeling methodology. Experimental results for ISCAS benchmark circuits show an RMS error of 0.102 for the shortest path length complexity estimated by the NN model (NNM). Such a model can help reduce the time complexity of very large scale integration (VLSI) circuits and of related computer-aided design (CAD) tools that use BDDs.
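
As a minimal sketch of the modeling step (the feature set, network size, and data below are placeholders, not the paper's Monte Carlo data), a feed-forward regressor trained on (variables, minterms) samples could be set up with scikit-learn as follows:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder samples: features are (number of variables, log2 of minterm count);
# the target is a synthetic stand-in for the shortest BDD path length.
n_vars = rng.integers(5, 40, 2000).astype(float)
log_minterms = rng.uniform(0, 12, 2000)
X = np.c_[n_vars, log_minterms]
y = 0.6 * n_vars + 0.3 * log_minterms + rng.normal(0.0, 1.0, 2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
nnm = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0))
nnm.fit(X_tr, y_tr)
rmse = np.sqrt(np.mean((nnm.predict(X_te) - y_te) ** 2))
print(f"RMS error on held-out samples: {rmse:.3f}")
```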

A Study on the Quality of Hexapod Machine Tool's Workspace

One of the main concerns about parallel mechanisms is the presence of singular points within their workspaces. In singular positions the mechanism gains or loses one or several degrees of freedom and becomes impossible to control, so these positions have to be avoided. This is a vital need especially in computer-controlled machine tools designed and manufactured on the basis of parallel mechanisms, and it has to be taken into consideration when selecting design parameters. A prerequisite is a thorough knowledge of the effect of design parameters and constraints on singularity. In this paper, a quality condition index was introduced as a criterion for evaluating the singularities of the different configurations of a hexapod mechanism obtainable with different design parameters. It was illustrated that this method can effectively be employed to obtain the optimum configuration of the hexapod mechanism so as to avoid singularities within the workspace. The method was then employed to design the hexapod table of a CNC milling machine.
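
For illustration, a common proxy for closeness to singularity is the inverse condition number of the mechanism's 6x6 Jacobian (1 for a well-conditioned pose, approaching 0 near a singular pose). The sketch below uses a purely hypothetical Jacobian; the quality condition index defined in the paper may be computed differently.

```python
import numpy as np

def quality_index(J):
    """Inverse condition number of a hexapod Jacobian: 1.0 means perfectly
    conditioned, values near 0.0 indicate a near-singular pose."""
    return 1.0 / np.linalg.cond(J)

# Hypothetical 6x6 Jacobian at one pose; in practice it would be assembled from
# the unit vectors along the struts and their moments about the platform centre.
J = np.array([
    [0.8, 0.1, 0.2, 0.0, 0.1, 0.0],
    [0.1, 0.9, 0.0, 0.1, 0.0, 0.2],
    [0.2, 0.0, 0.7, 0.0, 0.2, 0.1],
    [0.0, 0.1, 0.0, 0.6, 0.1, 0.0],
    [0.1, 0.0, 0.2, 0.1, 0.8, 0.0],
    [0.0, 0.2, 0.1, 0.0, 0.0, 0.9],
])
print(f"quality index at this pose: {quality_index(J):.3f}")

# Sweeping the index over a discretized workspace for each candidate set of
# design parameters flags configurations whose minimum index approaches zero.
```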

Moment Generating Functions of Observed Gaps between Hypopnea Events Using Saddlepoint Approximations

Saddlepoint approximation is one of the tools used to obtain expressions for densities and distribution functions. We approximate the densities of the observed gaps between hypopnea events using the Huzurbazar saddlepoint approximation. We also demonstrate the density of a maximum likelihood estimator in exponential families.
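
For reference, the classical (Daniels) saddlepoint density approximation, on which the Huzurbazar approach builds, is shown below together with a worked exponential-gap example; the exponential form is an illustrative assumption, not necessarily the distribution fitted in the paper.

```latex
% Daniels saddlepoint approximation to a density with CGF K(s) = \log M(s):
\hat{f}(x) \;=\; \bigl(2\pi K''(\hat{s})\bigr)^{-1/2}
               \exp\bigl\{K(\hat{s}) - \hat{s}\,x\bigr\},
\qquad \text{where } \hat{s} \text{ solves } K'(\hat{s}) = x .

% Worked example with exponential gaps of rate \lambda:
% M(s) = \lambda/(\lambda - s), so K'(\hat{s}) = x gives \hat{s} = \lambda - 1/x
% and K''(\hat{s}) = x^{2}; substituting,
\hat{f}(x) \;=\; (2\pi x^{2})^{-1/2}\,\lambda x\, e^{\,1-\lambda x}
           \;=\; \frac{e}{\sqrt{2\pi}}\;\lambda e^{-\lambda x},
% i.e. the true exponential density up to a constant that renormalization removes.
```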

CT-based Monte Carlo Dose Calculations for Proton Therapy Using a New Interface Program

The purpose of this study is to introduce a new interface program for calculating dose distributions with the Monte Carlo method in complex heterogeneous systems, such as organs or tissues, in proton therapy. The interface program was developed under MATLAB and includes a friendly graphical user interface with several tools, such as image property adjustment and results display. The quadtree decomposition technique was used as the image segmentation algorithm to create optimal geometries from Computed Tomography (CT) images for proton beam dose calculations. The result of this technique is a set of non-overlapping squares of different sizes in every image. In this way the resolution of the image segmentation is high enough in and near heterogeneous areas to preserve the precision of the dose calculations, and low enough in homogeneous areas to directly reduce the number of cells. Furthermore, a cell reduction algorithm can be used to combine neighboring cells of the same material. The method was validated in two ways: first, against experimental data obtained with an 80 MeV proton beam at the Cyclotron and Radioisotope Center (CYRIC) of Tohoku University, and second, against data based on the polybinary tissue calibration method, also performed at CYRIC. These results are presented in this paper. The program can read the output file of the Monte Carlo code while a region of interest is selected manually, and it plots the proton beam dose distribution superimposed onto the CT images.
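
A minimal sketch of the quadtree idea in pure Python/NumPy (a random array stands in for a CT slice; the homogeneity threshold and minimum block size are illustrative, not the program's actual settings):

```python
import numpy as np

def quadtree(image, x, y, size, threshold, min_size, leaves):
    """Recursively split a square block until it is homogeneous enough."""
    block = image[y:y + size, x:x + size]
    if size <= min_size or block.max() - block.min() <= threshold:
        leaves.append((x, y, size, float(block.mean())))   # one non-overlapping cell
        return
    half = size // 2
    for dx, dy in ((0, 0), (half, 0), (0, half), (half, half)):
        quadtree(image, x + dx, y + dy, half, threshold, min_size, leaves)

rng = np.random.default_rng(0)
ct_slice = rng.integers(-1000, 1000, size=(256, 256))   # placeholder HU values
leaves = []
quadtree(ct_slice, 0, 0, 256, threshold=50, min_size=4, leaves=leaves)
print(len(leaves), "non-overlapping square cells")
```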

Development of a Methodology for Processing of Drilling Operations

Drilling is the most common machining operation and accounts for the highest machining cost in many manufacturing activities, including automotive engine production. The outcome of this operation depends on many factors, including the use of proper cutting tool geometry, cutting tool material and the type of coating used to improve hardness and wear resistance, as well as the cutting parameters. With a large array of tool geometries, materials and coatings available, it has become a challenging task to select the best tool and cutting parameters that would result in the lowest machining cost or highest profit rate. This paper describes an algorithm developed to help achieve good performance in drilling operations by automatically determining the proper cutting tools and cutting parameters. It also helps determine machining sequences resulting in minimum tool changes, which eventually reduces machining time and cost where multiple tools are used.
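
One simple way to realize the "minimum tool changes" idea is to group operations that share a tool before sequencing; the toy sketch below (hypothetical hole and tool identifiers) illustrates the grouping step only, not the paper's actual algorithm:

```python
from itertools import groupby

# Hypothetical list of (hole_id, tool_id) drilling operations.
ops = [(1, "D8_TiN"), (2, "D10_TiAlN"), (3, "D8_TiN"), (4, "D10_TiAlN"), (5, "D8_TiN")]

# Stable sort by tool, so all holes sharing a tool are machined consecutively
# while the original hole order within each tool group is preserved.
sequenced = sorted(ops, key=lambda op: op[1])
tool_changes = len(list(groupby(sequenced, key=lambda op: op[1]))) - 1

print(sequenced)
print("tool changes:", tool_changes)
```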

Utilization of Advanced Data Storage Technology toward a Cleaner Environment in the Construction Industry

Construction projects generally take place in uncontrolled and dynamic environments, and construction waste is a serious environmental problem in many large cities. The total amount of waste and of carbon dioxide emissions from transportation vehicles remains out of control due to the increasing number of construction projects, massive urban development projects and the lack of effective tools for minimizing adverse environmental impacts in construction. This research concerns the integrated application of automated advanced tracking and data storage technologies to environmental management, in order to monitor and control adverse environmental impacts such as construction waste and carbon dioxide emissions. Radio Frequency Identification (RFID) integrated with the Global Positioning System (GPS) provides an opportunity to uniquely identify materials, components, and equipment and to locate and track them with minimal or no worker input. The transmission of data to the central database is carried out with the help of the Global System for Mobile Communications (GSM).

A Microcontroller Implementation of Model Predictive Control

Model Predictive Control (MPC) is increasingly being proposed for real-time applications and embedded systems. However, compared to the PID controller, implementation of MPC in miniaturized devices such as Field Programmable Gate Arrays (FPGAs) and microcontrollers has historically been very limited, due to its implementation complexity and computation time requirements. At the same time, such embedded technologies have become an enabler for future manufacturing enterprises as well as a transformer of organizations and markets. Recent advances in microelectronics and software now allow such techniques to be implemented in embedded systems. In this work, we take advantage of these advances to deploy one of the most studied and applied control techniques in industrial engineering. Specifically, we propose an efficient framework for the implementation of Generalized Predictive Control (GPC) on the STM32 microcontroller. The STM32 Keil starter kit, based on a JTAG interface and the STM32 board, was used to implement the proposed GPC firmware. Besides the GPC, an anti-windup PID algorithm was also implemented using the Keil development tools designed for ARM processor-based microcontroller devices, working with the C/C++ language. A performance comparison between the two firmwares shows good execution speed and a low computational burden. These results encourage the development of simple predictive algorithms to be programmed in industrial standard hardware. The main features of the proposed framework are illustrated through two examples and compared with the anti-windup PID controller.
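
For illustration only, the unconstrained receding-horizon law at the core of such predictive controllers can be prototyped in a few lines. The sketch below uses a hypothetical discrete state-space plant and absolute inputs rather than the CARIMA/increment formulation of full GPC, and it is a Python prototype, not the STM32 C firmware described in the paper.

```python
import numpy as np

# Hypothetical discrete-time plant: x(k+1) = A x(k) + B u(k), y(k) = C x(k)
A = np.array([[1.0, 0.1], [0.0, 0.9]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])

N, lam = 10, 0.1                       # prediction horizon and control weighting
F = np.vstack([C @ np.linalg.matrix_power(A, i) for i in range(1, N + 1)])
Phi = np.zeros((N, N))
for i in range(N):                     # Phi[i, j] = C A^(i-j) B for j <= i
    for j in range(i + 1):
        Phi[i, j] = (C @ np.linalg.matrix_power(A, i - j) @ B)[0, 0]

# Minimize ||r - Y||^2 + lam ||U||^2  ->  U = (Phi'Phi + lam I)^-1 Phi' (r - F x)
K = np.linalg.solve(Phi.T @ Phi + lam * np.eye(N), Phi.T)

x = np.zeros((2, 1))
r = np.ones((N, 1))                    # constant set-point over the horizon
for k in range(50):
    U = K @ (r - F @ x)                # optimal future input sequence
    u = U[0, 0]                        # receding horizon: apply the first move only
    x = A @ x + B * u
print("final output:", (C @ x).item())
```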

Development of Fragility Curves for a Two-Span Simply Supported Concrete Bridge in a Near-Fault Area

Bridges are among the main components of transportation networks and should remain functional before and after an earthquake for emergency services. Therefore, the seismic performance of bridges under different seismic loadings needs to be assessed. Fragility curves are among the popular tools for seismic evaluation. They are conditional probability statements giving the probability of a bridge reaching or exceeding a particular damage level for a given intensity level. In this study, the seismic performance of a two-span simply supported concrete bridge is assessed. Due to the usual lack of empirical data, the analytical fragility curves were developed from the results of dynamic analyses of the bridge subjected to different time histories in a near-fault area.
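
Fragility curves of this kind are commonly modeled as lognormal conditional probabilities, P(D >= d | IM = x) = Phi((ln x - ln theta)/beta). A brief sketch follows, with hypothetical median capacities and dispersions rather than the values derived from the bridge analyses in the paper:

```python
import numpy as np
from scipy.stats import norm

def fragility(im, theta, beta):
    """Lognormal fragility: P(damage state reached or exceeded | IM = im)."""
    return norm.cdf((np.log(im) - np.log(theta)) / beta)

# Hypothetical median capacities theta (in g) and dispersions beta per damage state;
# in the study these would come from the near-fault time-history analyses.
states = {"slight": (0.25, 0.6), "moderate": (0.50, 0.6), "extensive": (0.90, 0.6)}
pga = np.linspace(0.05, 2.0, 40)           # intensity measure grid (PGA in g)
for name, (theta, beta) in states.items():
    curve = fragility(pga, theta, beta)    # one fragility curve per damage state
    print(f"{name:>9}: P(exceedance | PGA = 0.5 g) = {np.interp(0.5, pga, curve):.2f}")
```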

An OWL Ontology for CommonKADS Template Knowledge Models

This paper gives an overview of how an OWL ontology has been created to represent the template knowledge models, defined in CML, that are provided by CommonKADS. CommonKADS is a mature knowledge engineering methodology which proposes the use of template knowledge models for knowledge modelling. The aim of developing this ontology is to present the template knowledge models in a knowledge representation language that can be easily understood and shared in the knowledge engineering community. Hence, OWL is used, as it has become a standard for ontologies and already has user-friendly tools for viewing and editing.
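
As a hedged illustration of the ontology-building step (the namespace, class names, and the single property below are invented for the example and are not the paper's actual vocabulary), a fragment of such an OWL ontology could be emitted with rdflib:

```python
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/commonkads#")   # hypothetical namespace
g = Graph()
g.bind("ex", EX)
g.bind("owl", OWL)

# Illustrative fragment: a CommonKADS-style diagnosis template as OWL classes.
g.add((EX.KnowledgeModel, RDF.type, OWL.Class))
g.add((EX.DiagnosisTemplate, RDF.type, OWL.Class))
g.add((EX.DiagnosisTemplate, RDFS.subClassOf, EX.KnowledgeModel))
g.add((EX.InferenceStep, RDF.type, OWL.Class))
g.add((EX.hasInference, RDF.type, OWL.ObjectProperty))
g.add((EX.hasInference, RDFS.domain, EX.DiagnosisTemplate))
g.add((EX.hasInference, RDFS.range, EX.InferenceStep))
g.add((EX.DiagnosisTemplate, RDFS.comment, Literal("Diagnosis template (sketch)")))

print(g.serialize(format="turtle"))
```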

Coping with the Rapidity of Information Technology Changes – A Comparison Review on Current Practices

Information technology managers nowadays face tremendous pressure to plan, implement, and adopt new technology solutions due to the rapidity of technological change. Given the lack of studies on this topic, the aim of this paper is to provide a comparative review of the tools currently used to respond to technological change. The study is based on an extensive literature review of published works, the majority of which range from 2000 to the first part of 2011. The works were gathered from journals, books, and other information sources available on the Web. Findings show that each tool has a different focus and that none provides a holistic framework covering the technical, people, process, and business environment aspects. Hence, this result provides useful information about currently available tools that IT managers could use to manage changes in technology. Further, the results reveal a research gap: industry is short of such a framework.

From Individual Memory to Organizational Memory (Intelligence of Organizations)

Intensive changes in the environment and strong market competition have raised the management of information and knowledge to the strategic level of companies. In a knowledge-based economy, only those organizations that have up-to-date, specialized knowledge and are able to exploit and develop it are capable of surviving. Companies have to know what knowledge they possess by surveying their organizational knowledge, and they have to fix current and additional knowledge in organizational memory. The question is how to identify, acquire, fix and use knowledge effectively. The paper shows that, over and above the information technology tools supporting the acquisition, storage and use of information, organizational learning and the knowledge arising from it, the fixing and storage of knowledge in the memory of a company play an important role in the intelligence of organizations and in a company's competitiveness.

Demand and Supply Chain Simulation in the Telecommunication Industry by Multi-Rate Expert Systems

In the modern telecommunications industry, demand and supply chain management (DSCM) needs reliable design and versatile tools to control the material flow. The objective of efficient DSCM is to reduce inventory, lead times and related costs in order to assure reliable and on-time deliveries from manufacturing units to customers. In this paper, a multi-rate expert system based methodology is proposed for developing simulation tools that enable optimal DSCM for a multi-region, high-volume and high-complexity manufacturing environment.

Intellectual Capital and Competitive Advantage: An Analysis of the Biotechnology Industry

Intellectual capital measurement is a central aspect of knowledge management. The measurement and evaluation of intangible assets play a key role in allowing effective management of these assets as sources of competitiveness. For these reasons, managers and practitioners need conceptual and analytical tools that take into account the unique characteristics and economic significance of Intellectual Capital. Following this lead, we propose an efficiency and productivity analysis of Intellectual Capital as a determinant of a company's competitive advantage. The analysis is carried out by means of Data Envelopment Analysis (DEA) and the Malmquist Productivity Index (MPI). These techniques identify Best Practice companies that have achieved competitive advantage by implementing successful Intellectual Capital management strategies, and offer inefficient companies development paths through benchmarking. The proposed methodology is applied to the biotechnology industry over the period 2007-2010.
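
For readers unfamiliar with DEA, the input-oriented CCR efficiency of each decision-making unit (DMU) is obtained from a small linear program. The sketch below uses hypothetical input/output data, not the biotechnology panel analyzed in the paper:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical inputs X (n_dmu x m) and outputs Y (n_dmu x s),
# e.g. intellectual-capital proxies as inputs and revenue as output.
X = np.array([[4.0, 140.0], [7.0, 90.0], [5.0, 160.0], [9.0, 200.0]])
Y = np.array([[2.0], [1.5], [2.2], [2.4]])
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(k):
    """Input-oriented CCR efficiency of DMU k; variables are [theta, lambda_1..n]."""
    c = np.r_[1.0, np.zeros(n)]                   # minimize theta
    A_in = np.c_[-X[k].reshape(m, 1), X.T]        # X^T lam - theta * x_k <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y.T]         # -Y^T lam <= -y_k
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for k in range(n):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
```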

An Analytical Electron Mobility Model Based on Particle Swarm Computation for Silicon-based Devices

The study of transport coefficients in electronic devices is currently carried out with analytical and empirical models. These models require several simplifying assumptions, generally necessary to arrive at analytical expressions for studying the different characteristics of silicon-based electronic devices. Further progress in the development, design and optimization of silicon-based devices necessarily requires new theory and modeling tools. In our study, we use the Particle Swarm Optimization (PSO) technique as a computational tool to develop analytical approaches for studying electron transport in crystalline silicon as a function of temperature and doping concentration. Good agreement between our results and measured data has been found. The optimized analytical models can also be incorporated into circuit simulators to study Si-based devices without impacting computational time or data storage.
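
A compact sketch of the approach: a global-best PSO loop fitting a Caughey-Thomas-type doping dependence to mobility data. The synthetic "measurements", parameter bounds, and the specific functional form are illustrative assumptions; the paper's analytical models may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def mobility(N, p):
    """Caughey-Thomas-type fit: mu_min + (mu_max - mu_min) / (1 + (N/N_ref)^alpha)."""
    mu_min, mu_max, N_ref, alpha = p
    return mu_min + (mu_max - mu_min) / (1.0 + (N / N_ref) ** alpha)

# Synthetic "measured" electron mobility vs. doping (placeholder for real Si data);
# the generating parameters are typical literature values used only as a stand-in.
N_dop = np.logspace(14, 20, 30)
mu_meas = mobility(N_dop, (68.0, 1414.0, 9.2e16, 0.71)) * (1 + 0.02 * rng.standard_normal(30))

def cost(p):
    return np.mean((mobility(N_dop, p) - mu_meas) ** 2)

# Basic global-best PSO over the four model parameters.
n_part, n_iter, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5
lo = np.array([10.0, 500.0, 1e15, 0.3])
hi = np.array([200.0, 2000.0, 1e18, 1.5])
pos = lo + (hi - lo) * rng.random((n_part, 4))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()
for _ in range(n_iter):
    r1, r2 = rng.random((n_part, 4)), rng.random((n_part, 4))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()
print("fitted parameters:", gbest)
```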

CAD Based Predictive Models of the Undeformed Chip Geometry in Drilling

Twist drills are geometrically complex tools, and various researchers have therefore adopted different mathematical and experimental approaches for their simulation. The present paper exploits the increasing use of modern CAD systems: drilling simulations are carried out through the API (Application Programming Interface) of a CAD system. The developed DRILL3D software routine creates parametrically controlled tool geometries and, for different cutting conditions, generates solid models of all the relevant entities involved (drilling tool, cut workpiece, undeformed chip). The resulting data constitute a platform for further direct simulations regarding the determination of cutting forces, tool wear, drilling optimization, etc.

A Remote Sensing Approach for Vulnerability and Environmental Change in Apodi Valley Region, Northeast Brazil

The objective of this study was to improve our understanding of vulnerability and environmental change in the Apodi Valley Region: its causes, intensity, distribution and human-environment effects on the ecosystem. This paper identifies, assesses and classifies vulnerability and environmental change in the Apodi Valley region using a combined approach of landscape pattern and ecosystem sensitivity. Models were developed using five thematic layers: geology, geomorphology, soil, vegetation and land use/cover, by means of a Geographical Information System (GIS) based on hydro-geophysical parameters. In spite of data problems and shortcomings, the vulnerability score, computed in ESRI's ArcGIS 9.3 by classifying, weighting and combining 15 separate land cover classes into a single indicator, provides a reliable measure of the differences (6 classes) among regions and communities that are exposed to similar ranges of hazards. Indeed, the ongoing and active development of vulnerability concepts and methods has already produced tools to help overcome common issues, such as acting in a context of high uncertainty, taking into account the dynamics and spatial scale of a social-ecological system, or gathering viewpoints from different sciences to combine human- and impact-based approaches. Based on this assessment, the paper proposes concrete perspectives and possibilities to benefit from existing commonalities in the construction and application of assessment tools.
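
A minimal sketch of the weighted-overlay step behind such a vulnerability score (random rasters, invented weights, and an even 6-class split stand in for the study's rated layers and ArcGIS workflow):

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (200, 200)

# Placeholder rasters for the five thematic layers, each already rated 1 (low) .. 5 (high).
layers = {name: rng.integers(1, 6, shape) for name in
          ("geology", "geomorphology", "soil", "vegetation", "landuse")}

# Hypothetical weights summing to 1; the study derives its own scores per class.
weights = {"geology": 0.15, "geomorphology": 0.20, "soil": 0.20,
           "vegetation": 0.20, "landuse": 0.25}

score = sum(w * layers[name] for name, w in weights.items())

# Classify the continuous score into 6 vulnerability classes, as in the study.
bins = np.linspace(score.min(), score.max(), 7)[1:-1]
vuln_class = np.digitize(score, bins) + 1        # classes 1..6
print("pixels per class:", np.bincount(vuln_class.ravel())[1:])
```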

Dynamic Feature Selection for Heart Disease Classification

The healthcare environment is generally perceived as being information rich yet knowledge poor: there is a lack of effective analysis tools to discover hidden relationships and trends in the data. In fact, valuable knowledge can be discovered by applying data mining techniques to healthcare systems. In this study, a proficient methodology is presented for extracting significant patterns from coronary heart disease data warehouses for heart attack prediction, a condition that unfortunately continues to be a leading cause of mortality worldwide. For this purpose, we propose to dynamically enumerate the optimal subsets of reduced features of high interest using the rough sets technique combined with dynamic programming. We then validate the classification using a Random Forest (RF) decision-tree ensemble to identify risky heart disease cases. This work is based on a large amount of data collected from several clinical institutions, based on the medical profiles of patients. Moreover, expert knowledge in this field has been taken into consideration in order to define the disease and its risk factors, and to establish significant knowledge relationships among the medical factors. A computer-aided system is developed for this purpose based on a population of 525 adults. The performance of the proposed model is analyzed and evaluated against a set of benchmark techniques applied to this classification problem.
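
As a sketch of the final classification stage only (the feature names, labels, and 525-record array below are synthetic placeholders, and the rough-sets/dynamic-programming feature reduction is omitted), the Random Forest validation could look like this:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 525
# Hypothetical clinical features: age, systolic BP, cholesterol, smoker flag.
X = np.c_[rng.normal(55, 10, n),
          rng.normal(130, 15, n),
          rng.normal(220, 40, n),
          rng.integers(0, 2, n)]
y = rng.integers(0, 2, n)    # random labels here; real diagnoses in the study

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```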

Value Engineering and Its Effect in Reduction of Industrial Organization Energy Expenses

A review of the state of energy consumption in Iran shows that, unfortunately, the optimization and conservation of energy in the country's active industries lacks a practical and effective method, and that in most factories energy consumption is higher than in similar industries in industrialized countries. The increasing demand for electrical energy and the overheads it imposes on the organization force companies to search for suitable approaches to optimize energy consumption and demand management. Application of value engineering techniques is among these approaches. Value engineering is considered a powerful tool for improving profitability. It is used to reduce expenses, increase profits, improve quality, increase market share, complete work in shorter durations, and utilize resources more efficiently. In this article, we review value engineering and its capability for creating effective transformations in industrial organizations in order to reduce energy costs. The results are investigated and described through a case study in the Mazandaran wood and paper industries, the biggest energy consumer in the north of Iran, with the purpose of presenting the effects of the tasks performed in optimizing energy consumption by means of value engineering techniques.

XML Integration of Data from CloudSat Satellite and GMS-6 Water Vapor Satellite

This study aimed at developing visualization tools for integrating CloudSat images and water vapor satellite images. KML was used to integrate data from the CloudSat satellite and the GMS-6 water vapor satellite. CloudSat 2D images were transformed into 3D polygons in order to obtain 3D images. Before overlaying the images on Google Earth, the GMS-6 water vapor satellite images had to be rescaled into linear images. A web service was developed using webMathematica. For evaluation, the shoreline from the GMS-6 images was compared with the shoreline from Landsat images on Google Earth; the results showed that they matched closely. For the CloudSat images, the visualizations were compared with GMS-6 images on Google Earth, and the results showed that the CloudSat and GMS-6 images were highly correlated.
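
A minimal sketch of the kind of KML handed to Google Earth, with one absolute-altitude polygon standing in for a CloudSat profile and a GMS-6 ground overlay; coordinates, bounds, and file names are placeholders, and the study's actual KML generated via webMathematica will differ:

```python
# Build a minimal KML file: a 3D polygon plus a ground overlay (placeholder values).
coords = [(100.0, 13.0, 0), (100.0, 13.0, 12000), (100.5, 13.2, 12000),
          (100.5, 13.2, 0), (100.0, 13.0, 0)]
coord_str = " ".join(f"{lon},{lat},{alt}" for lon, lat, alt in coords)

kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>CloudSat profile (3D polygon)</name>
      <Polygon>
        <altitudeMode>absolute</altitudeMode>
        <outerBoundaryIs><LinearRing>
          <coordinates>{coord_str}</coordinates>
        </LinearRing></outerBoundaryIs>
      </Polygon>
    </Placemark>
    <GroundOverlay>
      <name>GMS-6 water vapor image</name>
      <Icon><href>gms6_wv_rescaled.png</href></Icon>
      <LatLonBox><north>20</north><south>5</south><east>110</east><west>95</west></LatLonBox>
    </GroundOverlay>
  </Document>
</kml>"""

with open("cloudsat_gms6.kml", "w") as f:
    f.write(kml)
```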

Data Organization Before Learning Multi-Entity Bayesian Network Structures

The objective of our work is to develop a new approach for discovering knowledge from a large mass of data; the result of applying this approach will be an expert system serving as a diagnostic tool for a phenomenon related to a huge information system. We first recall the general problem of learning Bayesian network structure from data and suggest a solution for managing its complexity by using data organization and optimization methods. Afterwards, we propose a new heuristic for learning Multi-Entity Bayesian Network structures. We apply our approach to biological data concerning complex hereditary illnesses for which the biological literature identifies the variables responsible for those diseases. Finally, we conclude on the limits reached by this work.
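
For context, classical single-entity structure learning from a flat data table can be done with a score-based search; the fragment below uses pgmpy's hill-climbing search with a BIC score as that baseline (the tiny data frame is invented, and the exact pgmpy class names depend on the installed version), whereas the paper's contribution is a heuristic for the multi-entity setting:

```python
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore   # assumes a pgmpy version exposing these names

# Placeholder discrete data set; in the paper the data are first organized per
# entity (genes, individuals, ...) before the multi-entity structure search.
df = pd.DataFrame({
    "gene_variant":   [0, 1, 1, 0, 1, 0, 1, 1],
    "family_history": [1, 1, 0, 0, 1, 0, 1, 0],
    "disease":        [1, 1, 0, 0, 1, 0, 1, 1],
})

# Classical (single-entity) score-based structure learning as a baseline.
search = HillClimbSearch(df)
best_model = search.estimate(scoring_method=BicScore(df))
print(sorted(best_model.edges()))
```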