Abstract: The study of transport coefficients in electronic
devices is currently carried out with analytical and empirical models.
Such models rely on several simplifying assumptions, generally
necessary to obtain analytical expressions for the various
characteristics of silicon-based electronic devices.
Further progress in the development, design and optimization of
silicon-based devices necessarily requires new theory and modeling
tools. In our study, we use the Particle Swarm Optimization (PSO)
technique as a computational tool to develop analytical approaches
for studying electron transport in crystalline silicon as a function
of temperature and doping concentration. Good agreement between our
results and measured data has been found.
The optimized analytical models can also be incorporated into
circuit simulators to study Si-based devices without penalizing
computational time or data storage.
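As an illustration of the optimization engine named above, here is a minimal PSO sketch in Python, assuming a generic scalar objective to be minimized; the actual transport-model fitting objective, search bounds and swarm hyperparameters used in the paper are not given in the abstract:

```python
import random

def pso(objective, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimize `objective` over [-1, 1]^dim with a basic particle swarm."""
    pos = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # personal best positions
    pbest_val = [objective(p) for p in pos]        # personal best values
    gbest = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + cognitive pull (own best) + social pull (swarm best)
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest_val[i], pbest[i] = val, pos[i][:]
                if val < objective(gbest):
                    gbest = pos[i][:]
    return gbest
```

In a model-fitting setting such as the one described, `objective` would be the squared error between a candidate analytical mobility model and the measured data points.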
Abstract: This paper proposes a modeling methodology for the
development of data analysis solutions. The author introduces an
approach to addressing data warehousing issues at the enterprise level.
The methodology covers the requirements elicitation and
analysis stage as well as the initial design of the data warehouse.
The paper reviews an extended business process model that satisfies
the needs of data warehouse development. The author considers the use
of business process models necessary, as they reflect both enterprise
information systems and business functions, which are important for
data analysis. The described approach divides development into
three steps with different levels of model elaboration, and makes it
possible to gather requirements and present them to business users
in an accessible manner.
Abstract: Twist drills are geometrically complex tools, and various researchers have therefore adopted different mathematical and experimental approaches for their simulation. The present paper acknowledges the increasing use of modern CAD systems: drilling simulations are carried out using the API (Application Programming Interface) of a CAD system. The developed DRILL3D software routine creates parametrically controlled tool geometries and, using different cutting conditions, generates solid models of all the relevant data involved (drilling tool, cut workpiece, undeformed chip). The final data constitute a platform for further direct simulations regarding the determination of cutting forces, tool wear, drilling optimization, etc.
Abstract: Automatic face detection is a complex problem in
image processing. Many methods exist to solve it, such as
template matching, Fisher Linear Discriminant, neural networks,
SVM, and MRC, each achieving success to varying degrees and with
varying complexity. The proposed algorithm targets upright,
frontal faces in single gray-scale images with decent
resolution and under good lighting conditions. In face
recognition, a single face is matched against a single face
from the training dataset. The author proposes a neural-network-based
face detection algorithm that works on photographs and, when new
test data appears, checks it against the online scanned training
dataset. Experimental results show that the algorithm achieves
detection accuracy of up to 95%.
Abstract: A high performance computer includes a fast
processor and millions of bytes of memory. During data processing,
huge amounts of information are shuffled between the memory and the
processor. Because of its small size and effective speed, cache
has become a common feature of high performance computers.
Enhancing cache performance has proved essential in speeding up
cache-based computers. Most enhancement approaches can be
classified as either software based or hardware controlled. Cache
performance is quantified in terms of the hit ratio or miss
ratio. In this paper, we optimize cache performance by
enhancing the cache hit ratio. The optimum cache performance is
obtained through a cache hardware modification that quickly rejects
mismatched line tags before the hit-or-miss comparison stage, so
that a low hit time for the wanted line in the cache is achieved.
In the proposed technique, which we call Even-Odd Tabulation (EOT),
the cache lines coming from main memory are classified into two
types, even line tags and odd line tags, depending on their Least
Significant Bit (LSB). The EOT technique exploits this division to
reject mismatched line tags in much less time than the main
comparator in the cache requires, giving an optimum hit time for the
wanted cache line. Simulation results show the high performance of
the EOT technique compared with the familiar FAM mapping technique.
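The LSB-based classification described above can be sketched as follows. This is an illustrative software model of the idea, not the hardware implementation: the two parity buckets stand in for the split tag store, and the membership test stands in for the tag comparators:

```python
def eot_lookup(cache_tags, wanted_tag):
    """Even-Odd Tabulation sketch: stored tags are split by their LSB, so a
    lookup only compares against tags of matching parity, rejecting roughly
    half of the lines before the full hit-or-miss tag comparison."""
    even = [t for t in cache_tags if t & 1 == 0]
    odd = [t for t in cache_tags if t & 1 == 1]
    bucket = even if wanted_tag & 1 == 0 else odd
    return wanted_tag in bucket  # hit or miss within the reduced bucket

# A fully associative lookup would compare wanted_tag against every stored
# tag; EOT first discards all tags whose LSB differs from the wanted tag's.
```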
Abstract: This paper presents an interval-based multi-attribute
decision making (MADM) approach in support of decision
processes with imprecise information. The proposed decision
methodology is based on the model of a linear additive utility
function but extends the problem formulation with a measure of
composite utility variance. A sample study concerning the evaluation
of electric generation expansion strategies is provided, showing how
imprecise data may affect the choice of the best solution and
how a set of alternatives, acceptable to the decision maker (DM),
may be identified with certain confidence.
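A minimal sketch of a linear additive utility with interval-valued inputs, assuming each attribute utility is independent and uniformly distributed on its interval; the paper's exact variance model is not given in the abstract, so these distributional assumptions are illustrative:

```python
def composite_utility(weights, intervals):
    """Linear additive utility with interval-valued attribute utilities.
    Assuming each attribute utility u_i is uniform on [lo, hi] and
    independent, return the mean and variance of U = sum(w_i * u_i).
    Uniform on [lo, hi] has mean (lo+hi)/2 and variance (hi-lo)^2/12."""
    mean = sum(w * (lo + hi) / 2 for w, (lo, hi) in zip(weights, intervals))
    var = sum(w ** 2 * (hi - lo) ** 2 / 12 for w, (lo, hi) in zip(weights, intervals))
    return mean, var
```

Comparing alternatives by both the mean and the variance of the composite utility is what lets the DM retain a set of acceptable alternatives rather than a single point ranking.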
Abstract: The dynamic behaviour of a four-bar linkage driven by a velocity-controlled DC motor is discussed in the paper. In particular, the author presents the results obtained by means of specially developed software that implements the mathematical models of all components of the system (linkage, transmission, electric motor, control devices). The use of this software enables a more efficient design approach, since it allows the designer to check, in a simple and immediate way, the dynamic behaviour of the mechanism arising from different values of the system parameters.
Abstract: The job shop scheduling problem (JSSP) is well known as one of the most difficult combinatorial optimization problems. This paper presents a hybrid genetic algorithm for the JSSP with the objective of minimizing makespan. The efficiency of the genetic algorithm is enhanced by integrating it with a local search method. The chromosome representation of the problem is based on operations. Schedules are constructed using a procedure that generates full active schedules. In each generation, a local search heuristic based on Nowicki and Smutnicki's neighborhood is applied to improve the solutions. The approach is tested on a set of standard instances taken from the literature and compared with other approaches. The computational results validate the effectiveness of the proposed algorithm.
Abstract: Graph-based image segmentation techniques are
considered among the most efficient segmentation techniques
and are mainly used as time- and space-efficient methods for real-time
applications. However, the quality of the segmented images
obtained from earlier graph-based methods needs improvement.
This paper proposes an improvement to the graph-based
image segmentation methods already described in the literature. We
contribute to the existing method by proposing the use of a weighted
Euclidean distance to calculate the edge weight, which is the key
element in building the graph. We also propose a slight modification
of the segmentation method already described in the literature, which
results in the selection of more prominent edges in the graph. The
experimental results show an improvement in segmentation
quality compared with existing methods, with a slight
compromise in efficiency.
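The proposed weighted Euclidean edge weight can be sketched as below. The per-channel weights shown are illustrative assumptions (a luminance-style weighting), not the values from the paper:

```python
import math

def edge_weight(p1, p2, channel_weights=(0.30, 0.59, 0.11)):
    """Weighted Euclidean distance between two RGB pixels, used as the
    weight of the edge joining their vertices when building the
    segmentation graph. channel_weights are illustrative, not the
    paper's values; uniform weights (1, 1, 1) recover the plain
    Euclidean distance of the earlier methods."""
    return math.sqrt(sum(w * (a - b) ** 2
                         for w, a, b in zip(channel_weights, p1, p2)))
```

In a graph-based segmenter, each pixel is a vertex, edges connect neighboring pixels, and components are merged or kept apart by comparing these edge weights against a region-dependent threshold.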
Abstract: In this article we discuss improving multiclass
classification using a multilayer perceptron. The considered
approach consists in breaking the n-class problem down into
two-class subproblems. Each two-class subproblem is trained
independently; in the test phase, the vector to be classified is
confronted with all the two-class models, and the elected class is
the one that wins every competition against the other classes.
Recognition rates obtained with the multiclass approach by two-class
decomposition are clearly better than those obtained with the simple
multiclass approach.
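The election rule described above, where a class is elected only if it loses no pairwise competition, can be sketched as:

```python
def elect_class(classes, pairwise_winner):
    """One-vs-one election: return the class that wins every pairwise
    contest against the other classes, or None if no class is unbeaten.
    `pairwise_winner(a, b)` stands for the trained two-class model for
    the pair (a, b) and returns the class it votes for."""
    for c in classes:
        if all(pairwise_winner(c, other) == c
               for other in classes if other != c):
            return c
    return None
```

In the MLP setting of the abstract, `pairwise_winner` would wrap the two-class perceptron trained on the corresponding pair of classes; the `None` case (no unbeaten class) is why the paper's "won't lose any competition" criterion matters.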
Abstract: PARADIGMA (PARticipative Approach to DIsease
Global Management) is a pilot project which aims to develop and
demonstrate an Internet based reference framework to share scientific
resources and findings in the treatment of major diseases.
PARADIGMA defines and disseminates a common methodology and
optimised protocols (Clinical Pathways) to support service functions
directed to patients and individuals on matters like prevention,
post-hospitalisation support and awareness. PARADIGMA will provide a
platform of information services - user oriented and optimised
against social, cultural and technological constraints - supporting the
Health Care Global System of the Euro-Mediterranean Community
in a continuous improvement process.
Abstract: The objective of this study was to improve our
understanding of vulnerability and environmental change - its causes,
intensity, distribution and human-environment effects on the
ecosystem - in the Apodi Valley Region. This paper identifies,
assesses and classifies vulnerability and environmental change
in the Apodi Valley region using a combined approach of landscape
pattern and ecosystem sensitivity. Models were developed using
five thematic layers - geology, geomorphology, soil, vegetation and
land use/cover - by means of a Geographical Information System (GIS)
based on hydro-geophysical parameters. In spite of the data problems
and shortcomings, the vulnerability score, computed with ESRI's
ArcGIS 9.3 by classifying, weighting and combining 15 separate land
cover classes into a single indicator, provides a reliable measure
of differences (6 classes) among regions and communities that are
exposed to similar ranges of hazards.
Indeed, the ongoing and active development of vulnerability
concepts and methods has already produced tools that help
overcome common issues, such as acting in a context of high
uncertainty, taking into account the dynamics and spatial scale of
a social-ecological system, or gathering viewpoints from different
sciences to combine human- and impact-based approaches. Based on
this assessment, this paper proposes concrete perspectives and
possibilities to benefit from existing commonalities in the
construction and application of assessment tools.
Abstract: Indoor air distribution has great impact on people's thermal sensation. Therefore, how to remove the indoor excess heat becomes an important issue to create a thermally comfortable indoor environment. To expel the extra indoor heat effectively, this paper used a dynamic CFD approach to study the effect of an air-supply guide vane swinging periodically on the indoor air distribution within a model room. The numerical results revealed that the indoor heat transfer performance caused by the swing guide vane had close relation with the number of vortices developing under the inlet cold jet. At larger swing amplitude, two smaller vortices continued to shed outward under the cold jet and remove the indoor heat load more effectively. As a result, it can be found that the average Nusselt number on the floor increased with the increase of the swing amplitude of the guide vane.
Abstract: Carbonylation of methanol in the homogeneous phase is
one of the major routes for production of acetic acid. Amongst the
group VIII metal catalysts used in this process, iridium has displayed
the best capabilities. To investigate the effect of operating
parameters such as temperature, pressure, and methyl iodide, methyl
acetate, iridium, ruthenium, and water concentrations on the reaction
rate, an experimental design for this system based on the central
composite design (CCD) was utilized. The statistical rate equation
developed by this method contained the individual, interaction and
curvature effects of the parameters on the reaction rate. The model,
with a p-value less than 0.0001 and R2 values greater than 0.9,
confirmed a satisfactory fit between the experimental and theoretical
studies. In other words, the developed model and the experimental
data obtained passed all diagnostic tests, establishing the model as
statistically significant.
Abstract: Stable nonzero populations without random deaths
caused by the Verhulst factor (Verhulst-free) are a rarity. The
majority either grow without bounds or die of excessive harmful
mutations. To delay the accumulation of bad genes or diseases, a new
environmental parameter Γ is introduced into the simulation. Current
results demonstrate that stability may be achieved by setting Γ = 0.1.
These steady states approach a maximum size that scales inversely
with reproduction age.
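For context, the Verhulst factor referenced above makes each individual die at random with probability proportional to the current population size. A toy sketch of one time step follows; the birth rate and the lack of genetic structure are illustrative simplifications, not the paper's model:

```python
import random

def verhulst_step(n, n_max, birth_rate=0.3):
    """One step of a toy population under the Verhulst factor: each of the
    n individuals dies with probability n / n_max, and each survivor
    reproduces with probability birth_rate. The death term caps the
    population below n_max; parameters are illustrative."""
    survivors = sum(1 for _ in range(n) if random.random() >= n / n_max)
    births = sum(1 for _ in range(survivors) if random.random() < birth_rate)
    return survivors + births
```

Removing this factor (the "Verhulst-free" case studied in the abstract) removes the random cap, which is why such populations typically either explode or collapse under mutation load.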
Abstract: The design of the Automatic Generation Control (AGC) system plays a vital role in power system automation. This paper proposes a Hybrid Neuro-Fuzzy (HNF) approach for AGC of a two-area interconnected reheat thermal power system with consideration of the Generation Rate Constraint (GRC). The advantage of the proposed controller is that it can handle system non-linearities while remaining faster than conventional controllers. The performance of the HNF controller has been compared with that of both a conventional Proportional-Integral (PI) controller and a Fuzzy Logic Controller (FLC), in both the absence and the presence of the GRC. System performance is examined considering a disturbance in each area of the interconnected power system.
Abstract: A review of energy consumption and rates in Iran
shows that, unfortunately, the optimization and conservation of
energy in the country's active industries lacks a practical and
effective method, and in most factories energy consumption and rates
are higher than in similar industries of industrialized countries.
The increasing demand for electrical energy and the overheads it
imposes on the organization force companies to search for suitable
approaches to optimize energy consumption and demand management.
The application of value engineering techniques is among these
approaches. Value engineering is considered a powerful tool for
improving profitability; it is used to reduce expenses, increase
profits, improve quality, increase market share, perform work in
shorter durations, utilize resources more efficiently, etc.
In this article, we review value engineering and its capabilities
for creating effective transformations in industrial organizations
in order to reduce energy costs. The results are investigated and
described in a case study at the Mazandaran wood and paper
industries, the biggest consumer of energy in northern Iran, to
present the effects of the tasks performed to optimize energy
consumption using value engineering techniques.
Abstract: The paper presents an approach for handling uncertain
information in deductive databases using multivalued logics. Uncertainty
means that database facts may be assigned logical values other
than the conventional ones, true and false. The logical values represent
various degrees of truth, which may be combined and propagated
by applying the database rules. A corresponding multivalued database
semantics is defined. We show that it extends successful conventional
semantics such as the well-founded semantics, and has polynomial time
data complexity.
Abstract: In this study, the Taguchi method was used to optimize the effect of HALO structure (halo implant) variations on threshold voltage (VTH) and leakage current (ILeak) in a 45 nm p-type Metal Oxide Semiconductor Field Effect Transistor (MOSFET) device. Besides halo implant dose, the other process parameters used were Source/Drain (S/D) implant dose, oxide growth temperature and silicide anneal temperature. The work was done using a TCAD simulator consisting of a process simulator, ATHENA, and a device simulator, ATLAS; these two simulators were combined with the Taguchi method to aid in designing and optimizing the process parameters. In this research, the most effective process parameters with respect to VTH and ILeak are halo implant dose (40%) and S/D implant dose (52%) respectively, while the second-ranking factors affecting VTH and ILeak are oxide growth temperature (32%) and halo implant dose (34%) respectively. The results show that after optimization the threshold voltage is -0.157 V at ILeak = 0.195 mA/μm.
Abstract: In this paper, the implementation of a rule-based
intuitive reasoner is presented. The implementation comprises two
parts: the rule induction module and the intuitive reasoner. A large
weather database was acquired as the data source, and twelve weather
variables from those data were chosen as the "target variables"
whose values were to be predicted by the intuitive reasoner. A
"complex" situation was simulated by making only subsets of the data
available to the rule induction module. As a result, the rules induced
were based on incomplete information with variable levels of certainty.
The certainty level was modeled by a metric called "Strength of
Belief", which was assigned to each rule or datum as ancillary
information about the confidence in its accuracy. Two techniques
were employed to induce rules from the data subsets: decision trees
for the discrete target variables and multi-polynomial regression for
the continuous ones. The intuitive reasoner was tested
for its ability to use the induced rules to predict the classes of the
discrete target variables and the values of the continuous target
variables. The intuitive reasoner implemented two types of
reasoning, fast and broad, where, by analogy to human thought, the
former corresponds to fast decision making and the latter to deeper
contemplation. For reference, a weather data analysis approach
that had been applied to similar tasks was adopted to analyze the
complete database and create predictive models for the same 12
target variables. The values predicted by the intuitive reasoner and
the reference approach were compared with actual data. The intuitive
reasoner reached near-100% accuracy for two continuous target
variables. For the discrete target variables, the intuitive reasoner
predicted at least 70% as accurately as the reference reasoner. Since
the intuitive reasoner operated on rules derived from only about 10%
of the total data, it demonstrates potential advantages in dealing
with sparse data sets as compared with conventional methods.