Abstract: Because manufacturing processes are inherently imprecise, parts can never be realized exactly to their dimensional specifications. It must therefore be ensured that each manufactured part falls strictly within tolerance intervals compatible with the correct functioning of the assembly. In this paper we present an approach based on combining two theories with different characteristics: fuzzy systems and Petri nets. The resulting tool is proposed to model and control quality in an assembly system. As an application, a robust control scheme for a mechanical assembly process is presented; this controller maintains the parts within their specification intervals in the face of variations. The application also illustrates how the technique reacts when product quality is high, medium, or low.
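As a rough illustration of the fuzzy side of such an approach (a minimal sketch with invented tolerance values, not the paper's actual rule base), triangular membership functions can grade a measured dimensional deviation into overlapping quality classes:

```python
# Minimal sketch of the fuzzy grading step: triangular membership functions
# rate how well a measured dimension fits its tolerance interval. The
# tolerance values and set shapes are illustrative assumptions only.

def triangular(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def quality_grades(deviation_mm):
    """Degrees of membership of a dimensional deviation (mm) in three
    hypothetical quality classes."""
    d = abs(deviation_mm)
    return {
        "high":   triangular(d, -0.001, 0.0, 0.02),
        "medium": triangular(d, 0.01, 0.03, 0.05),
        "low":    triangular(d, 0.04, 0.08, 1.0),
    }

print(quality_grades(0.015))  # partly "high", partly "medium"
```

In a fuzzy Petri net, grades like these would weight the firing of transitions that route a part toward acceptance, rework, or rejection.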
Abstract: Ontologies are broadly used in the context of networked home environments. With ontologies it is possible to define and store context information, as well as to model different kinds of physical environments. Ontologies are central to networked home environments because they carry the meaning. However, ontologies and the OWL language are complex. Several ontology visualization approaches have been developed to enhance the understanding of ontologies, and the domain of networked home environments sets some special requirements for them. The visualization tool presented here visualizes ontologies in a domain-specific way: it effectively represents the physical structures and spatial relationships of networked home environments, and it provides extensive interaction possibilities for editing and manipulating the visualization. The tool shortens the path from beginner to intermediate OWL ontology reader by visualizing instances in their actual locations and by making OWL ontologies more interesting, more concrete, and above all easier to comprehend.
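The kind of extraction such a visualizer relies on can be sketched with rdflib; the file name, namespace, and property name below are hypothetical placeholders, not the tool's actual schema:

```python
# Minimal sketch: pull device instances and their spatial containment
# from an OWL file so they can be drawn in their actual rooms.
from rdflib import Graph, Namespace
from rdflib.namespace import RDF

HOME = Namespace("http://example.org/home#")  # assumed namespace

g = Graph()
g.parse("home.owl")  # hypothetical networked-home ontology

# For every device instance, find the room it is located in.
for device in g.subjects(RDF.type, HOME.Device):
    for room in g.objects(device, HOME.isInRoom):  # assumed property
        print(f"{device} is located in {room}")
```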
Abstract: This paper presents the DC voltage control design of a D-STATCOM used for load voltage regulation. Although the DC voltage can be controlled through the active current of the D-STATCOM, the reactive current still affects the DC voltage. To eliminate this effect, a control strategy that cancels the influence of the reactive current is proposed, and the results with and without this cancellation are compared. The proportional and integral gains of the PI controllers are obtained with the symmetrical optimum method and with genetic algorithms. The stability margins of these methods are derived and discussed in detail, and the performance of the DC voltage control based on the two tuning methods is compared. The effectiveness of the designed controllers was verified through computer simulation using the Power System Blockset (PSB) in SIMULINK/MATLAB. The simulation results demonstrate that the proposed DC voltage control is effective in regulating the DC voltage when the D-STATCOM is used for load voltage regulation.
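For the symmetrical optimum part, a hedged sketch of the textbook tuning rule (illustrative plant values, not the paper's D-STATCOM parameters) looks like this:

```python
# Symmetrical-optimum PI tuning for an integrating plant
#   G(s) = K / (s (1 + s*T_sigma)),
# the structure typically assumed for DC-link voltage loops.
import math

def symmetrical_optimum(K, T_sigma, a=2.0):
    """Return (Kp, Ti, phase margin in degrees) for
    C(s) = Kp (1 + 1/(Ti s)); a is the symmetry factor."""
    Ti = a**2 * T_sigma
    Kp = 1.0 / (a * K * T_sigma)
    # Phase margin at crossover w_c = 1/(a*T_sigma).
    pm = math.degrees(math.atan2(a**2 - 1, 2 * a))
    return Kp, Ti, pm

Kp, Ti, pm = symmetrical_optimum(K=50.0, T_sigma=2e-3)  # assumed plant
print(f"Kp={Kp:.3f}, Ti={Ti*1e3:.1f} ms, phase margin~{pm:.1f} deg")
```

A genetic algorithm would instead search (Kp, Ti) directly against a simulated closed-loop cost, which is why the paper compares the stability margins of the two methods.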
Abstract: This paper analyzes the patterns of Monte Carlo data for a large number of variables and minterms in order to characterize circuit path length behavior. We propose models obtained by training on shortest path lengths derived from a wide range of binary decision diagram (BDD) simulations. The model was created using a feed-forward neural network (NN) modeling methodology. Experimental results for ISCAS benchmark circuits show an RMS error of 0.102 for the shortest-path-length complexity estimates predicted by the NN model (NNM). Such a model can help reduce the time complexity of designing very large scale integration (VLSI) circuits and of related computer-aided design (CAD) tools that use BDDs.
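A minimal sketch of the modeling step (with synthetic stand-in data, since the paper's Monte Carlo BDD measurements are not reproduced here) is a feed-forward network regressing path length on circuit parameters:

```python
# Fit a feed-forward NN mapping (number of variables, number of minterms)
# to an observed shortest path length. The training data here is a
# synthetic placeholder for the paper's BDD simulation results.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform([4, 16], [20, 4096], size=(500, 2))  # (vars, minterms), assumed ranges
y = 10 * np.log2(X[:, 1]) / X[:, 0] + rng.normal(0, 0.1, 500)  # placeholder target

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(X[:400], y[:400])
rmse = mean_squared_error(y[400:], model.predict(X[400:])) ** 0.5
print(f"held-out RMSE: {rmse:.3f}")
```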
Abstract: One of the main concerns about parallel mechanisms is the presence of singular points within their workspaces. In singular positions the mechanism gains or loses one or several degrees of freedom, and it is impossible to control the mechanism there; such positions must therefore be avoided. This is a vital need, especially in computer-controlled machine tools designed and manufactured on the basis of parallel mechanisms, and it has to be taken into consideration when selecting design parameters. A prerequisite is thorough knowledge of the effect of design parameters and constraints on singularity. In this paper, a quality condition index is introduced as a criterion for evaluating the singularities of the different configurations of a hexapod mechanism obtainable with different design parameters. We illustrate that this method can effectively be employed to obtain the optimum configuration of a hexapod mechanism that avoids singularity within the workspace. The method is then employed to design the hexapod table of a CNC milling machine.
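The underlying criterion can be illustrated with the condition number of the mechanism Jacobian, which degenerates near a singular pose; the Jacobian below is a random stand-in, not a real hexapod Jacobian built from leg directions and platform geometry:

```python
# Inverse condition number as a singularity-proximity measure:
# values near 0 indicate a (near-)singular configuration.
import numpy as np

def condition_index(J):
    """Inverse condition number in [0, 1]; 0 means singular."""
    s = np.linalg.svd(J, compute_uv=False)
    return s.min() / s.max()

rng = np.random.default_rng(1)
J_regular = np.eye(6) + 0.1 * rng.standard_normal((6, 6))
J_singular = J_regular.copy()
J_singular[:, 5] = J_singular[:, 4]   # dependent columns -> singular

print(condition_index(J_regular))     # well away from 0
print(condition_index(J_singular))    # ~0: singular configuration
```

Sweeping such an index over the workspace for each candidate set of design parameters is one way to compare configurations for singularity avoidance.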
Abstract: The saddlepoint approximation is a tool for obtaining accurate expressions for densities and distribution functions. We approximate the densities of the observed gaps between hypopnea events using the Huzurbazar saddlepoint approximation, and we demonstrate the technique on the density of a maximum likelihood estimator in exponential families.
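As a worked illustration of the generic recipe (on a Gamma model rather than the paper's hypopnea-gap data): with cumulant generating function K(s), the saddlepoint ŝ solves K'(ŝ) = x, and the density approximation is f̂(x) = (2π K''(ŝ))^(-1/2) exp(K(ŝ) - ŝx):

```python
# Saddlepoint density approximation for Gamma(alpha, rate lam):
#   K(s)  = -alpha * log(1 - s/lam)
#   K'(s) = alpha / (lam - s)   =>  s_hat = lam - alpha/x
import math

def saddlepoint_gamma(x, alpha, lam):
    s_hat = lam - alpha / x                # solves K'(s_hat) = x
    K = -alpha * math.log(1 - s_hat / lam)
    K2 = alpha / (lam - s_hat) ** 2        # K''(s_hat)
    return math.exp(K - s_hat * x) / math.sqrt(2 * math.pi * K2)

def exact_gamma(x, alpha, lam):
    return lam**alpha * x**(alpha - 1) * math.exp(-lam * x) / math.gamma(alpha)

x, alpha, lam = 2.5, 3.0, 1.0
print(saddlepoint_gamma(x, alpha, lam))  # ~0.264
print(exact_gamma(x, alpha, lam))        # ~0.257: close up to renormalization
```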
Abstract: The purpose of this study is to introduce a new interface program for calculating dose distributions with the Monte Carlo method in complex heterogeneous systems, such as organs or tissues in proton therapy. The interface was developed in MATLAB and includes a friendly graphical user interface with several tools, such as image property adjustment and results display. The quadtree decomposition technique was used as the image segmentation algorithm to create optimal geometries from computed tomography (CT) images for proton beam dose calculations. The technique yields a set of non-overlapping squares of different sizes in every image. In this way the segmentation resolution is high enough in and near heterogeneous areas to preserve the precision of the dose calculations, and low enough in homogeneous areas to directly reduce the number of cells. Furthermore, a cell reduction algorithm can be used to combine neighboring cells of the same material. The method was validated in two ways: first, against experimental data obtained with an 80 MeV proton beam at the Cyclotron and Radioisotope Center (CYRIC) of Tohoku University, and second, against data based on the polybinary tissue calibration method, also performed at CYRIC. These results are presented in this paper. The program can read the output file of the Monte Carlo code while the region of interest is selected manually, and it plots the dose distribution of the proton beam superimposed onto the CT images.
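The quadtree idea itself is compact enough to sketch: a block is split into four quadrants until it is homogeneous, here until the value range in the block falls below a tolerance (the tolerance and the synthetic "CT slice" are illustrative assumptions):

```python
# Quadtree image segmentation: recursively split non-homogeneous blocks.
import numpy as np

def quadtree(img, x, y, size, tol, leaves):
    block = img[y:y + size, x:x + size]
    if size == 1 or block.max() - block.min() <= tol:
        leaves.append((x, y, size))            # homogeneous cell
        return
    half = size // 2
    for dx, dy in [(0, 0), (half, 0), (0, half), (half, half)]:
        quadtree(img, x + dx, y + dy, half, tol, leaves)

# Synthetic 64x64 "slice": uniform background with a denser inclusion.
img = np.zeros((64, 64))
img[20:40, 24:44] = 1000.0
leaves = []
quadtree(img, 0, 0, 64, tol=50.0, leaves=leaves)
print(f"{len(leaves)} cells instead of {64 * 64} pixels")
```

Small cells concentrate along the inclusion boundary while the homogeneous background collapses into a few large cells, which is exactly the resolution trade-off described above.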
Abstract: Drilling is the most common machining operation, and it accounts for the highest machining cost in many manufacturing activities, including automotive engine production. The outcome of this operation depends on many factors, including the cutting tool geometry, the cutting tool material, the type of coating used to improve hardness and wear resistance, and the cutting parameters. With a large array of tool geometries, materials and coatings available, it has become a challenging task to select the tool and cutting parameters that result in the lowest machining cost or highest profit rate. This paper describes an algorithm developed to help achieve good performance in drilling operations by automatically determining proper cutting tools and cutting parameters. It also helps determine machining sequences with a minimum number of tool changes, which eventually reduces machining time and cost when multiple tools are used.
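One simple facet of the sequencing problem can be sketched (the hole/tool data and the grouping heuristic are illustrative, not the paper's algorithm): once each hole has a chosen tool, machining all holes that share a tool consecutively bounds the number of tool changes by the number of distinct tools:

```python
# Group drilling operations by tool to reduce tool changes.
from itertools import groupby

holes = [("h1", "T3"), ("h2", "T1"), ("h3", "T3"), ("h4", "T1"), ("h5", "T2")]

# Sort by tool so identical tools are machined back to back.
sequence = sorted(holes, key=lambda h: h[1])
changes = sum(1 for _ in groupby(sequence, key=lambda h: h[1])) - 1
print(sequence)                    # machining order
print(f"{changes} tool changes")   # 2 instead of 4 in the original order
```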
Abstract: System testing exercises the complete system against the Functional Requirement Specification and/or the System Requirement Specification. It is also an investigatory testing phase, in which the focus is an almost destructive attitude that tests not only the design but also the behavior, and even the believed expectations of the customer; it is intended to test up to and beyond the bounds defined in the software/hardware requirements specifications. In Motorola®, Automated Testing is one of the testing methodologies used by GSG-iSGT (Global Software Group - iDEN™ Subscriber Group-Test) to increase testing volume and productivity and to reduce test cycle time in iDEN™ phone testing, producing more robust products before release to the market. In this paper, iHopper is proposed as a tool to perform stress tests on iDEN™ phones. We discuss the value that automation has brought to iDEN™ phone testing, such as improved software quality in the iDEN™ phone, together with some metrics. We also look into the advantages of the proposed system and discuss future work.
Abstract: This paper presents preliminary results on the modeling and control of a quadrotor UAV. Based on aerodynamic concepts, a mathematical model is first proposed to describe the dynamics of the quadrotor UAV. The parameters of this model are identified from experiments with the MATLAB System Identification Toolbox. A group of PID controllers is then designed based on the developed model. To verify the developed model and controllers, simulations and experiments for altitude control, position control and trajectory tracking are carried out. The results show that the quadrotor UAV follows the reference commands well, which clearly demonstrates the effectiveness of the proposed approach.
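The altitude loop can be sketched with a discrete PID acting on a crude double-integrator model (the gains and model constants are illustrative, not the paper's identified values):

```python
# Minimal discrete PID driving a toy quadrotor altitude model.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.i = 0.0
        self.prev = 0.0

    def step(self, error):
        self.i += error * self.dt
        d = (error - self.prev) / self.dt
        self.prev = error
        return self.kp * error + self.ki * self.i + self.kd * d

dt, z, vz = 0.01, 0.0, 0.0
ctrl = PID(kp=6.0, ki=1.0, kd=4.5, dt=dt)   # assumed gains
for _ in range(600):                        # 6 s of simulated flight
    thrust = ctrl.step(1.0 - z)             # track a 1 m altitude step
    vz += (thrust - 0.3 * vz) * dt          # toy dynamics with drag
    z += vz * dt
print(f"altitude after 6 s: {z:.2f} m (target 1.0)")
```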
Abstract: In North America, most power distribution systems employ a four-wire multi-grounded neutral (MGN) design. The inherent characteristics of multi-grounded three-phase four-wire distribution systems under unbalanced situations make the mechanism of voltage swell and voltage sag in MGN feeders difficult to understand. This paper explains these characteristics and introduces an equivalent model of a full-scale multi-grounded distribution system implemented in MATLAB under Windows, the simulation tool used in this work. The results are expected to help utility engineers understand the impact of the MGN on distribution system operations.
Abstract: Construction projects generally take place in uncontrolled and dynamic environments, and construction waste is a serious environmental problem in many large cities. The total amount of waste and of carbon dioxide emissions from transportation vehicles remains out of control due to the increasing number of construction projects, massive urban development projects, and the lack of effective tools for minimizing adverse environmental impacts in construction. This research concerns the integrated application of automated tracking and data storage technologies to environmental management, in order to monitor and control adverse environmental impacts such as construction waste and carbon dioxide emissions. Radio Frequency Identification (RFID) integrated with the Global Positioning System (GPS) provides an opportunity to uniquely identify materials, components, and equipment and to locate and track them with minimal or no worker input. The transmission of data to the central database is carried out with the help of the Global System for Mobile Communications (GSM).
Abstract: Model Predictive Control (MPC) is increasingly being proposed for real-time applications and embedded systems. However, compared with the PID controller, implementation of MPC in miniaturized devices such as Field Programmable Gate Arrays (FPGAs) and microcontrollers has historically been very limited, due to its implementation complexity and its computation time requirements. At the same time, such embedded technologies have become an enabler for future manufacturing enterprises as well as a transformer of organizations and markets. Recently, advances in microelectronics and software have allowed such techniques to be implemented in embedded systems. In this work, we take advantage of these recent advances to deploy one of the most studied and applied control techniques in industrial engineering. Specifically, we propose an efficient framework for the implementation of Generalized Predictive Control (GPC) on an STM32 microcontroller. The STM32 Keil starter kit, based on a JTAG interface and the STM32 board, was used to implement the proposed GPC firmware. Besides the GPC, a PID anti-windup algorithm was also implemented using the Keil development tools designed for ARM processor-based microcontroller devices, working with the C/C++ language. A performance comparison study between the two firmwares shows good execution speed and low computational burden. These results encourage the development of simple predictive algorithms to be programmed on industry-standard hardware. The main features of the proposed framework are illustrated through two examples and compared with the anti-windup PID controller.
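The core of unconstrained GPC is compact enough to sketch (the plant, horizons and weighting below are illustrative, not the paper's STM32 setup): with dynamic matrix G built from the plant step response, the optimal move sequence is Δu = (GᵀG + λI)⁻¹Gᵀ(w − f), and only the first move is applied at each sample (receding horizon):

```python
# Unconstrained GPC control law for an assumed first-order plant
#   y[k+1] = a*y[k] + b*u[k].
import numpy as np

a, b = 0.9, 0.1                    # assumed plant
Np, Nu, lam = 10, 3, 0.1           # prediction/control horizons, move weight

# Step-response coefficients and dynamic (Toeplitz) matrix G.
step = np.array([b * (1 - a**(k + 1)) / (1 - a) for k in range(Np)])
G = np.zeros((Np, Nu))
for i in range(Np):
    for j in range(Nu):
        if i >= j:
            G[i, j] = step[i - j]

y, u, w = 0.0, 0.0, 1.0            # current output, input, setpoint
# Free response: output evolution if the input is held constant.
f = np.array([a**(k + 1) * y + step[k] * u for k in range(Np)])
du = np.linalg.solve(G.T @ G + lam * np.eye(Nu), G.T @ (w - f))
u += du[0]                          # receding horizon: apply first move only
print(f"first control move: {du[0]:.3f}, new u = {u:.3f}")
```

On a microcontroller, G and the (GᵀG + λI)⁻¹Gᵀ factor are typically precomputed offline, leaving only a small matrix-vector product per sample.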
Abstract: Does open-ended creative technology have a positive impact on learning design? Although many researchers have examined the impact of technology on design education, there are very few conclusive studies on the impact of open-ended use of software on learning design. This paper investigates a group of students' experience with a relatively wide range of software applications within the context of a design project. A typography design project was used to create a learning environment with the aim of inculcating design skills in the learners and increasing their creative problem-solving and critical thinking skills. The methods used in this study were a questionnaire survey and personal observation, focusing on individual and group responses during the completion of the task.
Abstract: Testable software has two inherent properties: observability and controllability. Observability facilitates observation of the internal behavior of software to the required degree of detail, while controllability allows the creation of difficult-to-achieve states prior to the execution of various tests. In this paper, we describe COTT, a Controllability and Observability Testing Tool, for creating testable object-oriented software. COTT provides a framework that helps the user instrument object-oriented software to build in the required controllability and observability. During testing, the tool facilitates the creation of difficult-to-achieve states required for testing difficult-to-test conditions, and the observation of internal details of execution at the unit, integration and system levels. The execution observations are logged in a test log file, which is used for post-analysis and to generate test coverage reports.
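The two properties can be illustrated in miniature (this is a conceptual sketch, not COTT's actual API): observations are appended to a test log, and a test-only hook lets a test force a difficult-to-reach internal state:

```python
# Observability: a decorator logs entry/exit of instrumented methods.
# Controllability: a test-only hook injects hard-to-reach state.
test_log = []

def observe(fn):
    def wrapper(self, *args, **kwargs):
        test_log.append(f"enter {fn.__name__} args={args}")
        result = fn(self, *args, **kwargs)
        test_log.append(f"exit {fn.__name__} -> {result}")
        return result
    return wrapper

class Account:
    def __init__(self):
        self.balance = 0

    @observe
    def withdraw(self, amount):
        if amount > self.balance:
            return "rejected"
        self.balance -= amount
        return "ok"

    def _force_state(self, balance):   # controllability hook for tests
        self.balance = balance

acct = Account()
acct._force_state(-5)                  # create a difficult-to-achieve state
print(acct.withdraw(10), test_log)    # observed behavior in the log
```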
Abstract: Bridges are one of the main components of transportation networks. They should remain functional before and after an earthquake for emergency services, so their seismic performance must be assessed under different seismic loadings. Fragility curves are one of the popular tools in seismic evaluation. A fragility curve is a conditional probability statement giving the probability that a bridge reaches or exceeds a particular damage level at a given intensity level. In this study, the seismic performance of a two-span simply supported concrete bridge is assessed. Due to the usual lack of empirical data, the analytical fragility curve was developed from the results of dynamic analyses of the bridge subjected to different time histories in a near-fault area.
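A standard analytical form for such curves (a reasonable sketch, not necessarily this study's fitted parameters) is the lognormal model P(DS ≥ ds | IM) = Φ(ln(IM/θ)/β), with median capacity θ and dispersion β:

```python
# Lognormal fragility curve with illustrative parameters for a
# hypothetical "moderate damage" state.
from math import log
from scipy.stats import norm

def fragility(im, theta, beta):
    """P(damage state reached | intensity measure im)."""
    return norm.cdf(log(im / theta) / beta)

theta, beta = 0.45, 0.55   # assumed median PGA (g) and dispersion
for pga in (0.1, 0.3, 0.5, 0.8):
    print(f"PGA {pga:.1f} g -> P(damage) = {fragility(pga, theta, beta):.2f}")
```

In an analytical study, θ and β are fitted to the damage outcomes of the nonlinear dynamic analyses across the suite of ground motions.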
Abstract: This paper gives an overview of how an OWL ontology has been created to represent the template knowledge models, defined in CML, that are provided by CommonKADS. CommonKADS is a mature knowledge engineering methodology which proposes the use of template knowledge models for knowledge modelling. The aim of developing this ontology is to present the template knowledge models in a knowledge representation language that can be easily understood and shared within the knowledge engineering community. OWL is used because it has become a standard for ontologies and because it already has user-friendly tools for viewing and editing.
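The direction of the mapping can be sketched with rdflib; the names and structure below are illustrative placeholders, not the paper's actual ontology:

```python
# Expressing a CommonKADS-style template element as OWL triples.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

KADS = Namespace("http://example.org/commonkads#")  # assumed namespace
g = Graph()
g.bind("kads", KADS)

# A diagnosis template knowledge model as an OWL class hierarchy.
g.add((KADS.TemplateKnowledgeModel, RDF.type, OWL.Class))
g.add((KADS.Diagnosis, RDF.type, OWL.Class))
g.add((KADS.Diagnosis, RDFS.subClassOf, KADS.TemplateKnowledgeModel))
g.add((KADS.Diagnosis, RDFS.comment,
       Literal("Template model for diagnostic reasoning (from CML).")))

print(g.serialize(format="turtle"))
```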
Abstract: Vector quantization is a powerful tool for speech coding applications. This paper deals with LPC coding of speech signals using a new technique called Multi Switched Split Vector Quantization (MSSVQ), a hybrid of multi-stage, switched, and split vector quantization techniques. The spectral distortion performance, computational complexity, and memory requirements of MSSVQ are compared to those of split vector quantization (SVQ), multi-stage vector quantization (MSVQ) and switched split vector quantization (SSVQ). The results show that MSSVQ has better spectral distortion performance, lower computational complexity and lower memory requirements than all of the above product-code vector quantization techniques. Computational complexity is measured in floating point operations (flops), and memory requirements are measured in floats.
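The "split" idea underlying these product-code schemes can be sketched as follows (dimensions, split points and codebook sizes are illustrative): an LPC parameter vector is split into parts, each quantized with its own small codebook:

```python
# Split vector quantization: per-part codebooks trained with k-means.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
train = rng.standard_normal((2000, 10))        # stand-in for LSF vectors
splits = [(0, 3), (3, 6), (6, 10)]             # assumed 3-3-4 split

codebooks = [KMeans(n_clusters=16, n_init=4, random_state=0)
             .fit(train[:, lo:hi]) for lo, hi in splits]

def quantize(v):
    out = np.empty_like(v)
    for (lo, hi), cb in zip(splits, codebooks):
        idx = cb.predict(v[lo:hi][None, :])[0]
        out[lo:hi] = cb.cluster_centers_[idx]
    return out

v = rng.standard_normal(10)
print(np.linalg.norm(v - quantize(v)))          # quantization error
```

Splitting keeps the codebooks small (here three 16-entry codebooks instead of one 4096-entry codebook for the same 12 bits), which is the source of the memory and complexity savings the paper measures.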
Abstract: Neighborhood rough sets (NRS) have been proven to be an efficient tool for heterogeneous attribute reduction. However, most research has focused on complete and noiseless data, whereas most real information systems are noisy, that is, filled with incomplete and inconsistent data. In this paper, we introduce a generalized neighborhood rough set model, called VPTNRS, to deal with heterogeneous attribute reduction in noisy systems. We generalize the classical NRS model with a tolerance neighborhood relation and probabilistic theory. Furthermore, we use the neighborhood dependency to evaluate the significance of a subset of heterogeneous attributes, and we construct a forward greedy algorithm for attribute reduction based on it. Experimental results show that the model deals with noisy data efficiently.
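The dependency measure driving the greedy search can be sketched for the classical NRS model (VPTNRS adds tolerance relations and probabilistic thresholds on top of this; the data and radius δ below are illustrative):

```python
# Neighborhood dependency: the fraction of samples whose delta-neighborhood
# (on the chosen attributes) is pure in class, i.e. the size of the
# neighborhood positive region divided by |U|.
import numpy as np

def dependency(X, y, attrs, delta):
    Xa = X[:, attrs]
    pos = 0
    for i in range(len(Xa)):
        dist = np.linalg.norm(Xa - Xa[i], axis=1)
        neigh = dist <= delta
        if np.all(y[neigh] == y[i]):
            pos += 1
    return pos / len(Xa)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 3)), rng.normal(2, 0.3, (50, 3))])
y = np.array([0] * 50 + [1] * 50)
print(dependency(X, y, attrs=[0, 1], delta=0.5))   # close to 1: consistent
```

A forward greedy reduction would repeatedly add the attribute whose inclusion increases this dependency the most, stopping when no attribute improves it.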
Abstract: Information technology managers nowadays face tremendous pressure to plan, implement, and adopt new technology solutions due to the rapidity of technological change. Given the lack of studies on this topic, the aim of this paper is to provide a comparative review of the tools currently used to respond to technological change. The study is based on an extensive review of published works, most of them ranging from 2000 to the first part of 2011, gathered from journals, books, and other information sources available on the Web. The findings show that each tool has a different focus and that none of them provides a holistic framework covering the technical, people, process, and business environment aspects. The result thus provides useful information about currently available tools that IT managers can use to manage technological change, and it reveals a research gap: the industry is short of such a framework.