Abstract: Small perturbations in the generator can cause instability in the power network. It is generally known that small signal stability is directly related to generator and load properties. This paper examines the effects of generator input variations on power system oscillations in a small signal stability study. Eigenvalues and eigenvectors are used to examine the stability of the power system. The dynamic power system's mathematical model is constructed and then solved using a load flow and small signal stability toolbox in MATLAB. The power system model is based on a 3-machine 9-bus system that was modified to suit this study. Participation factors, as a means to gauge the effects of variations in generation and other parameters on the network, are also incorporated.
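The eigenvalue and participation-factor analysis described above can be sketched as follows. This is a minimal illustration with a made-up 2x2 state matrix, not the paper's 3-machine 9-bus model; the participation-factor formula p_ki = |phi_ki * psi_ik| (right times left eigenvector entries) is the standard definition.

```python
import numpy as np

# Hypothetical 2x2 state matrix A of a linearized generator model
# (values are illustrative, not from the paper's 3-machine 9-bus system).
A = np.array([[0.0, 377.0],
              [-0.1, -0.5]])

# Small-signal stability: the system is stable if every eigenvalue
# of A has a negative real part.
eigvals, right = np.linalg.eig(A)
left = np.linalg.inv(right).T  # columns of `left` are the left eigenvectors

# Participation factor of state k in mode i: p_ki = |phi_ki * psi_ik|,
# normalized so each mode's factors sum to 1.
P = np.abs(right * left)
P = P / P.sum(axis=0)

print("eigenvalues:", eigvals)
print("stable:", bool(np.all(eigvals.real < 0)))
print("participation factors:\n", P)
```

Each column of P then shows which states dominate a given oscillatory mode, which is how participation factors are used to trace an oscillation back to a particular generator.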
Abstract: The aim of this paper is to discuss a low-cost methodology that can predict traffic flow conflicts and quantitatively rank crash expectancies (based on relative probability) for various traffic facilities. The paper focuses on the application of statistical distributions to model traffic flow and of Monte Carlo techniques to simulate traffic, and discusses how to create a tool to predict the probability of a traffic crash. A low-cost data collection methodology is discussed for the prevailing heterogeneous traffic flow, and a GIS platform is proposed to thematically represent the simulated traffic flow and the probability of a crash. The dynamism of the model is also discussed with reference to its adaptability, adequacy, economy, and efficiency, to ensure adoption.
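The Monte Carlo idea above can be sketched with a toy simulation. The distributional assumptions here (Poisson arrivals, hence exponential headways, and a fixed critical gap) are my own illustration of the technique, not the paper's calibrated model:

```python
import random

random.seed(42)

# Illustrative assumptions (not from the paper): arrivals follow a
# Poisson process, so headways are exponential; a gap shorter than
# the critical gap is counted as a traffic conflict.
FLOW_VPH = 900                     # flow rate, vehicles per hour
MEAN_HEADWAY = 3600.0 / FLOW_VPH   # seconds between vehicles, on average
CRITICAL_GAP = 1.0                 # seconds

def simulate_conflicts(n_vehicles=10000):
    """Estimate the relative conflict probability by Monte Carlo sampling."""
    conflicts = 0
    for _ in range(n_vehicles):
        headway = random.expovariate(1.0 / MEAN_HEADWAY)
        if headway < CRITICAL_GAP:
            conflicts += 1
    return conflicts / n_vehicles

print("estimated conflict probability:", simulate_conflicts())
```

Repeating the estimate for different facilities would give the relative ranking of crash expectancies the abstract describes; the closed-form answer for this toy case, 1 - exp(-1/4), serves as a sanity check.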
Abstract: Throughout the 1980s, management accounting researchers described the increasing irrelevance of traditional control and performance measurement systems. The Balanced Scorecard (BSC) is now a critical business tool for many organizations. It is a performance measurement system that translates mission and strategy into objectives. The strategy map approach is a development of the BSC in which certain necessary causal relations must be established. To recognize these relations, experts usually rely on experience; it is also possible to use regression for the same purpose. Structural Equation Modeling (SEM), one of the most powerful methods of multivariate data analysis, obtains more appropriate results than traditional methods such as regression. In the present paper, we propose SEM for the first time to identify the relations between objectives in the strategy map, together with a test to measure the importance of those relations. In SEM, factor analysis and hypothesis testing are done in the same analysis, and SEM is known to be better than other techniques at supporting analysis and reporting. Our approach provides a framework that permits experts to design the strategy map by applying a comprehensive and scientific method together with their experience. This scheme is therefore more reliable than previously established methods.
Abstract: The objective of this work, which is based on the approach of simultaneous engineering, is to contribute to the development of a CIM tool for the synthesis of functional design dimensions expressed by average values and tolerance intervals. In this paper, the dispersions method known as the Δl method, which has proved reliable in the simulation of manufacturing dimensions, is used to develop a methodology for automating the simulation.
This methodology is constructed around three procedures. The first
procedure executes the verification of the functional requirements by
automatically extracting the functional dimension chains in the
mechanical sub-assembly. Then a second procedure performs an
optimization of the dispersions on the basis of unknown variables.
The third procedure uses the optimized values of the dispersions to
compute the optimized average values and tolerances of the
functional dimensions in the chains. A statistical and cost-based
approach is integrated into the methodology in order to account for
the capabilities of the manufacturing processes and to distribute
optimal values among the individual components of the chains.
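The stack-up arithmetic underlying a dimension chain can be illustrated briefly. The values and the root-sum-square formula below are my own illustration of worst-case versus statistical tolerancing, not the paper's Δl optimization procedure:

```python
from math import sqrt

# Illustrative dimension chain (values are made up, not the paper's):
# each link contributes a dispersion Δl_i to the functional requirement.
dispersions = [0.02, 0.05, 0.03]   # Δl of each link, in mm

# Worst-case stack-up: the functional tolerance is the sum of dispersions.
worst_case = sum(dispersions)

# Statistical (root-sum-square) stack-up, assuming independent,
# normally distributed deviations -- usually less conservative.
statistical = sqrt(sum(d * d for d in dispersions))

print(f"worst case: {worst_case:.3f} mm, statistical: {statistical:.3f} mm")
```

The gap between the two totals is exactly the slack that a statistical, capability-aware distribution of tolerances can exploit.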
Abstract: A number of competing methodologies have been developed
to identify genes and classify DNA sequences into coding
and non-coding sequences. This classification process is fundamental
in gene finding and gene annotation tools and is one of the most
challenging tasks in bioinformatics and computational biology. An
information theory measure based on mutual information has shown
good accuracy in classifying DNA sequences into coding and non-coding.
In this paper we describe a species independent iterative
approach that distinguishes coding from non-coding sequences using
the mutual information measure (MIM). A set of sixty prokaryotes is
used to extract universal training data. To facilitate comparisons with
the published results of other researchers, a test set of 51 bacterial
and archaeal genomes was used to evaluate MIM. These results
demonstrate that MIM produces superior results while remaining
species independent.
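The mutual information computation at the heart of such a measure can be sketched as follows. This is my own illustration of mutual information between nucleotides at a fixed separation; the authors' exact MIM definition and training procedure may differ, and the toy sequence is not real genomic data:

```python
from collections import Counter
from math import log2

def mutual_information(seq, k):
    """Mutual information (bits) between nucleotides k positions apart."""
    pairs = Counter((seq[i], seq[i + k]) for i in range(len(seq) - k))
    n = sum(pairs.values())
    first = Counter(seq[:-k])    # marginal of the first base of each pair
    second = Counter(seq[k:])    # marginal of the second base
    mi = 0.0
    for (a, b), c in pairs.items():
        p_ab = c / n
        mi += p_ab * log2(p_ab / ((first[a] / n) * (second[b] / n)))
    return mi

# In real coding DNA the codon structure induces a period-3 statistical
# bias, so MI at separations 3, 6, 9 tends to be elevated; this toy
# repetitive sequence simply demonstrates the computation.
coding_like = "ATGGCC" * 50
print([round(mutual_information(coding_like, k), 3) for k in (1, 2, 3)])
```

Comparing such MI profiles against thresholds learned from the prokaryotic training set is one way a classifier can stay species independent.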
Abstract: This paper presents an approach of hybridizing two or more artificial intelligence (AI) techniques, which are used to fuzzify the work stress level ranking and categorize the rating accordingly. The use of two or more techniques (a hybrid approach) has been considered in this case, as combining different techniques may neutralize each other's weaknesses, generating a superior hybrid solution. Recent research has shown that there is a need for more valid and reliable tools for assessing work stress. Artificial intelligence techniques have therefore been applied in this instance to provide a solution to a psychological application. An overview of the novel and autonomous interactive model for analysing work stress that has been developed using multi-agent systems is also presented in this paper. The establishment of the intelligent multi-agent decision analyser (IMADA), using a hybridized technique of neural networks and fuzzy logic within the multi-agent based framework, is also described.
Abstract: Cutting tools are widely used in manufacturing processes, and drilling is the most commonly used machining process. Although the drill-bits used in drilling may not be expensive, their breakage can damage the expensive workpiece being drilled and at the same time has a major impact on productivity. Predicting drill-bit breakage is therefore important in reducing cost and improving productivity. This study uses twenty features extracted from two degradation signals, viz. thrust force and torque. The methodology involves developing and comparing decision tree, random forest, and multinomial logistic regression models for classifying and predicting drill-bit breakage from the degradation signals.
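The classification task can be sketched with a one-level decision tree (a stump) on synthetic thrust/torque features. The data, thresholds, and the assumption that both signals rise near breakage are my own illustration, not the study's twenty features or its fitted models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic degradation features (illustrative, not the study's data):
# healthy drilling shows low thrust force and torque; near breakage both rise.
n = 200
thrust_ok  = rng.normal(100, 10, n); torque_ok  = rng.normal(2.0, 0.3, n)
thrust_bad = rng.normal(160, 15, n); torque_bad = rng.normal(3.5, 0.4, n)

X = np.column_stack([np.concatenate([thrust_ok, thrust_bad]),
                     np.concatenate([torque_ok, torque_bad])])
y = np.concatenate([np.zeros(n), np.ones(n)])  # 1 = imminent breakage

# A one-level decision tree (stump): choose the thrust threshold that
# minimizes misclassifications on the training data.
thresholds = np.linspace(X[:, 0].min(), X[:, 0].max(), 200)
errors = [np.mean((X[:, 0] > t) != y) for t in thresholds]
best_t = thresholds[int(np.argmin(errors))]

pred = X[:, 0] > best_t
print(f"threshold = {best_t:.1f} N, accuracy = {np.mean(pred == y):.3f}")
```

A full decision tree, random forest, or multinomial logistic regression generalizes this idea to all twenty features at once, which is where the comparative study comes in.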
Abstract: The choice of data modeling technique for an information system is determined by the objective of the resultant data model. Dimensional modeling is the preferred technique for data destined for data warehouses and data mining, as it produces data models that ease analysis and queries, in contrast with entity relationship modeling. The establishment of data warehouses as components of information system landscapes in many organizations has subsequently led to the development of dimensional modeling. This development has been significantly greater, and better reported, for commercial database management systems than for open source ones, making dimensional modeling less affordable for those in resource-constrained settings. This paper presents dimensional modeling of HIV patient information using open source modeling tools. It aims to take advantage of the fact that the regions most affected by the HIV virus (sub-Saharan Africa) are also heavily resource constrained, while holding large quantities of HIV data. Two HIV data source systems were studied to identify appropriate dimensions and facts; these were then modeled using two open source dimensional modeling tools. The use of open source tools would reduce the software costs of dimensional modeling and in turn make data warehousing and data mining more feasible even for those in resource-constrained settings who have data available.
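The star-schema shape of a dimensional model can be sketched in a few lines of SQL. The table and column names below (patient and date dimensions, a visit fact with a CD4 measure) are illustrative guesses at a plausible HIV schema, not the paper's actual models:

```python
import sqlite3

# Hedged sketch of a star schema for HIV patient data: one fact table of
# measures, joined to dimension tables that describe each measurement.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_patient (patient_key INTEGER PRIMARY KEY, sex TEXT, birth_year INTEGER);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_visit  (
    patient_key INTEGER REFERENCES dim_patient(patient_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    cd4_count   INTEGER        -- the measure analysed across the dimensions
);
""")
con.execute("INSERT INTO dim_patient VALUES (1, 'F', 1985)")
con.execute("INSERT INTO dim_date VALUES (20240101, 2024, 1)")
con.execute("INSERT INTO fact_visit VALUES (1, 20240101, 450)")

# A typical analytical query: average CD4 count per year.
row = con.execute("""
    SELECT d.year, AVG(f.cd4_count)
    FROM fact_visit f JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.year
""").fetchone()
print(row)  # -> (2024, 450.0)
```

Queries like this one, a join from the fact table out to a dimension and a group-by, are exactly what dimensional models make easy compared with a normalized entity relationship design.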
Abstract: This study conducts a preliminary investigation to determine the topic to be focused on in developing a Virtual Laboratory for Biology (VLab-Bio). The questionnaire sample comprised Form Five students (equivalent to A-Level) and biology teachers. The time and economic resources required for setting up and constructing scientific laboratories can be reduced by adopting virtual laboratories as an educational tool. It is thus hoped that the proposed virtual laboratory will help students learn the abstract concepts in biology. Findings show that the difficult topic chosen is Cell Division, and the learning objective to be focused on in developing the virtual lab is “Describe the application of knowledge on mitosis in cloning”.
Abstract: The Bond Graph, as a unified multidisciplinary tool, is widely used not only for dynamic modelling but also for Fault Detection and Isolation because of its structural and causal properties. A binary Fault Signature Matrix can be generated systematically, but making the final binary decision is not always feasible because of the problems such a method reveals. The purpose of this paper is to introduce a methodology that improves the classical binary decision-making method, so that unknown and identical failure signatures can be treated and robustness improved. The approach consists of associating the evaluated residuals with the components' reliability data to build a Hybrid Bayesian Network. This network is used in two
distinct inference procedures: one for the continuous part and the
other for the discrete part. The continuous nodes of the network are
the prior probabilities of the components failures, which are used by
the inference procedure on the discrete part to compute the posterior
probabilities of the failures. The developed methodology is applied
to a real steam generator pilot process.
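The core inference step, turning a prior failure probability from reliability data into a posterior given residual evidence, can be shown in miniature. The numbers below are invented for illustration and the full paper uses a hybrid network over many components, not this single scalar update:

```python
# Toy Bayes update: a prior component failure probability (from
# reliability data) is combined with the likelihood that an evaluated
# residual fires, yielding the posterior failure probability.
prior_fail = 0.02            # prior P(component failed), from reliability data
p_residual_if_fail = 0.95    # residual sensitivity (illustrative)
p_residual_if_ok = 0.05      # residual false-alarm rate (illustrative)

# The residual has fired: apply Bayes' rule.
evidence = prior_fail * p_residual_if_fail + (1 - prior_fail) * p_residual_if_ok
posterior = prior_fail * p_residual_if_fail / evidence
print(f"posterior failure probability: {posterior:.3f}")  # -> 0.279
```

Note how a residual with a modest false-alarm rate raises a 2% prior only to about 28%, which is precisely why combining several residuals and reliability priors in a network is more robust than a single binary signature match.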
Abstract: Data Warehouses (DWs) are repositories which contain the unified history of an enterprise for decision support. The data must be Extracted from information sources, Transformed and integrated to be Loaded (ETL) into the DW, using ETL tools. These tools focus on data movement, with models used only as a means to that end. From a conceptual viewpoint, the authors want to innovate the ETL process in two ways: 1) to make the compatibility between models explicit in a declarative fashion, using correspondence assertions, and 2) to identify the instances from different sources that represent the same real-world entity. This paper presents an overview of the proposed framework for modeling the ETL process, which is based on the use of a reference model and perspective schemata. This approach gives the designer a better understanding of the semantics associated with the ETL process.
Abstract: OpenMP is an API for the parallel programming model of shared memory multiprocessors. Novice OpenMP programmers often produce code containing human errors that the compiler cannot find. We investigated how the compiler copes with the common mistakes that can occur in OpenMP code, using the latest version (4.4.3) of GCC. It was found that GCC compiled the codes without any errors or warnings. In this paper a programming aid tool for OpenMP programs is presented. It can check 12 common mistakes that a novice programmer can commit while programming in OpenMP. It is demonstrated that the programming aid tool can detect various common mistakes that GCC failed to detect.
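One check such a tool might perform can be sketched as a small source scanner. This is my own illustration of the kind of mistake involved (a shared accumulator without a reduction clause, a classic data race), not one of the paper's 12 actual checks or its implementation:

```python
import re

def check_missing_reduction(code):
    """Flag a parallel-for loop that accumulates into a shared variable
    without a reduction clause -- a classic OpenMP data race."""
    warnings = []
    lines = code.splitlines()
    for i, line in enumerate(lines):
        if "#pragma omp" in line and "for" in line and "reduction" not in line:
            body = "\n".join(lines[i + 1:i + 6])   # look a few lines ahead
            m = re.search(r"(\w+)\s*\+=", body)
            if m:
                warnings.append(f"line {i + 1}: '{m.group(1)}' accumulated "
                                f"without a reduction clause (possible data race)")
    return warnings

sample = """#pragma omp parallel for
for (i = 0; i < n; i++)
    sum += a[i];
"""
print(check_missing_reduction(sample))
```

GCC happily compiles the sample loop, which is exactly the gap a dedicated OpenMP programming aid tool is meant to close.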
Abstract: The purpose of my research proposal is to
demonstrate that there is a relationship between EEG and
endometrial cancer.
The above relationship is based on an Aristotelian Syllogism;
since it is known that the 14-3-3 protein is related to the electrical
activity of the brain via control of the flow of Na+ and K+ ions and
since it is also known that many types of cancer are associated with
14-3-3 protein, it is possible that there is a relationship between EEG
and cancer. This research will be carried out using well-defined
diagnostic indicators, obtained from the EEG through signal processing
procedures and pattern recognition tools such as neural networks, in
order to recognize the endometrial cancer type. The current research
shall compare the findings from EEG and hysteroscopy performed on
women of a wide age range. Moreover, this practice could be
expanded to other types of cancer. The implementation of this
methodology will be completed with the creation of an ontology.
This ontology shall define the concepts existing in this research's
domain and the relationships between them. It will represent the
types of relationships between hysteroscopy and EEG findings.
Abstract: FW4 is a newly developed hot die material widely
used in forging die manufacturing. The right selection of the
machining conditions is one of the most important aspects to take
into consideration in the Electrical Discharge Machining (EDM) of
FW4. In this paper an attempt has been made to develop
mathematical models for relating the Material Removal Rate (MRR),
Tool Wear Ratio (TWR) and surface roughness (Ra) to machining
parameters (current, pulse-on time and voltage). Furthermore, a study
was carried out to analyze the effects of the machining parameters on
the listed technological characteristics. The results of analysis
of variance (ANOVA) indicate that the proposed mathematical
models can adequately describe the performance within the limits of
the factors studied.
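A common way to build such models is a power-law relation fitted by log-linear least squares. The sketch below uses synthetic data generated from an assumed form MRR = k·I^a·Ton^b·V^c; the functional form, coefficients, and data are my own illustration, not the paper's fitted models:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic EDM data (illustrative): MRR generated from an assumed
# power law MRR = k * I^a * Ton^b * V^c plus multiplicative noise.
n = 60
I   = rng.uniform(5, 25, n)      # discharge current, A
Ton = rng.uniform(50, 400, n)    # pulse-on time, microseconds
V   = rng.uniform(40, 80, n)     # gap voltage, V
true = dict(k=0.05, a=1.2, b=0.4, c=-0.3)
MRR = true["k"] * I**true["a"] * Ton**true["b"] * V**true["c"]
MRR *= np.exp(rng.normal(0, 0.02, n))

# Taking logs makes the model linear:
#   ln MRR = ln k + a ln I + b ln Ton + c ln V
X = np.column_stack([np.ones(n), np.log(I), np.log(Ton), np.log(V)])
coef, *_ = np.linalg.lstsq(X, np.log(MRR), rcond=None)
print("estimated exponents a, b, c:", np.round(coef[1:], 2))
```

ANOVA on the fitted regression then tests whether each parameter's exponent contributes significantly, which is the adequacy check the abstract reports.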
Abstract: Quantitative methods of economic decision-making, as the methodological base of so-called operational research, represent an important set of tools for managing complex economic systems, both at the microeconomic level and on the macroeconomic scale. Mathematical models of controlled and controlling processes allow information to be obtained, by means of artificial experiments, for optimal or near-optimal managerial decision-making. The quantitative methods of economic decision-making usually include a methodology known as structural analysis: an analysis of interdisciplinary production-consumption relations.
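The classic formal core of structural analysis is the Leontief input-output model: given a technical coefficient matrix A and final demand d, total output x solves x = Ax + d. The two-sector coefficients below are invented for illustration:

```python
import numpy as np

# Illustrative 2-sector Leontief model (coefficients are made up):
# A[i, j] = units of sector i's output consumed per unit of sector j's output.
A = np.array([[0.2, 0.3],
              [0.1, 0.4]])
d = np.array([100.0, 50.0])   # final demand per sector

# Total output must cover intermediate consumption plus final demand:
#   x = A x + d  =>  (I - A) x = d
x = np.linalg.solve(np.eye(2) - A, d)
print("total output per sector:", np.round(x, 1))  # -> [166.7 111.1]
```

Such an "artificial experiment" shows, for instance, how much extra output every sector must produce when final demand for one sector rises, without touching the real economy.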
Abstract: Applications such as telecommunications, hands-free communication and recording need at least one microphone, and the microphone signal is usually corrupted by noise and echo. An important application is speech enhancement, which removes the noise and echo picked up by a microphone alongside the desired speech. Accordingly, the microphone signal has to be cleaned using digital signal processing (DSP) tools before it is played out, transmitted, or stored. Engineers have tried different approaches to improving speech by recovering the desired speech signal from the noisy observations, especially for mobile communication. In this paper we reconstruct a speech signal, observed in additive background noise, using the Kalman filter technique to estimate the parameters of an autoregressive (AR) process in the state space model; the output speech signal is obtained in MATLAB. Accurate Kalman filter estimation enhances the speech and reduces the noise; we then compare and discuss the results between the actual values and the estimated values that produce the reconstructed signals.
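The estimation scheme can be sketched with a scalar Kalman filter on a first-order AR signal in additive noise. The AR coefficient and noise variances below are illustrative assumptions, and a Python/NumPy sketch stands in for the paper's MATLAB implementation with a higher-order AR speech model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative AR(1) "speech" model x[t] = a*x[t-1] + w[t], observed in
# additive noise y[t] = x[t] + v[t]. Parameters are assumptions.
a, q, r = 0.95, 0.1, 1.0     # AR coefficient, process var, measurement var
n = 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = a * x[t - 1] + rng.normal(0, np.sqrt(q))
y = x + rng.normal(0, np.sqrt(r), n)

# Scalar Kalman filter: predict with the AR model, correct with y[t].
xhat, P = 0.0, 1.0
est = np.zeros(n)
for t in range(n):
    xhat, P = a * xhat, a * a * P + q      # predict
    K = P / (P + r)                        # Kalman gain
    xhat = xhat + K * (y[t] - xhat)        # update with the measurement
    P = (1 - K) * P
    est[t] = xhat

mse_raw = np.mean((y - x) ** 2)
mse_kf  = np.mean((est - x) ** 2)
print(f"noisy MSE = {mse_raw:.3f}, filtered MSE = {mse_kf:.3f}")
```

The drop from the raw to the filtered mean squared error is the enhancement effect the abstract describes; with a full AR(p) speech model the state vector simply holds the last p samples.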
Abstract: This article presents a complete model of IS/IT architecture exception governance. First, the assumptions of the presented model are set out. Next, a generic governance model is defined that serves as a basis for architecture exception governance, followed by the definition of an architecture exception and its attributes. The model respects well-known approaches to the area, which are described in the text, but adopts a higher granularity of description and expands the process view with the further governance components, such as roles, principles and policies, and tools, needed to implement the model in organizations. The architecture exception process is decomposed into a set of processes related to the architecture exception lifecycle, consisting of a set of phases and architecture exception states. Finally, future research related to this area is outlined.
Abstract: Flow behavior in a centrifugal fan is generally observed
to be in a state of instability, with flow separation zones on the
suction surface as well as near the front shroud. The overall performance of the
diffusion process in a centrifugal fan could be enhanced by
judiciously introducing the boundary layer suction slots. With easy
accessibility of CFD as an analytical tool, an extensive numerical
whole field analysis of the effect of boundary layer suction slots in
discrete regions of suspected separation points is possible. This paper
attempts to explore the effect of boundary layer suction slots
corresponding to various geometrical locations on the impeller with
converging configurations for the slots. The analysis shows that
converging suction slots located on the impeller blade about 25%
from the trailing edge significantly improve the static pressure
recovery across the fan. It is also found that slots provided at a
radial distance of about 12% from the leading and trailing edges
marginally improve the static pressure recovery across the fan.
Abstract: Enterprise applications are complex systems that are hard to develop and deploy in organizations. Although software application development tools, frameworks, methodologies and patterns are developing rapidly, many projects fail, incurring large costs. There are challenging issues that programmers and designers face while working on enterprise applications. In this paper, we present three of the significant issues: architectural, technological and performance. The important subjects in each issue are pointed out and recommendations are given. Under architectural issues, the lifecycle, meta-architecture and guidelines are pointed out. The .NET and Java EE platforms are presented under technological issues. The importance of performance, measuring performance, and profilers are explained under performance issues.
Abstract: Tanzanian secondary schools in rural areas are geographically and socially isolated, and hence face a number of problems in obtaining learning materials, resulting in poor performance in national examinations. E-learning, defined as the use of information and communication technology (ICT) to support educational processes, has motivated Tanzania to apply ICT in its education system. There have been efforts to improve secondary school education using ICT through several projects. ICT for e-learning in Tanzanian rural secondary schools is one of the research projects conceived by the University of Dar-es-Salaam through its College of Engineering and Technology. The main objective of the project is to develop a tool to enable ICT to support rural secondary schools. The project is comprehensive, with a number of components, one being the development of an e-learning management system (e-LMS) for Tanzanian secondary schools. This paper presents strategies for developing the e-LMS. It shows the importance of integrating an action research methodology with the modeling methods presented by model driven architecture (MDA), and the usefulness of the Unified Modeling Language (UML) for modeling. The benefits of MDA go along with development based on the software development life cycle (SDLC) process, from the analysis and requirements phase through the design and implementation stages, as employed by the object oriented system analysis and design approach. The paper also explains the reuse of open source code from open source learning platforms for the context-sensitive development of the e-LMS for Tanzanian secondary schools.