Abstract: The success of an e-learning system is highly
dependent on the quality of its educational content and how effective,
complete, and simple the design tool can be for teachers. Educational
modeling languages (EMLs) are proposed as design languages
intended for teachers to model diverse teaching-learning
experiences, independently of the pedagogical approach and in
different contexts. However, most existing EMLs are criticized for
being too abstract and too complex to be understood and manipulated
by teachers. In this paper, we present a visual EML that simplifies the
process of designing learning scenarios for teachers with no
programming background. Based on the conceptual framework of activity theory, the resulting visual EML focuses on using domain-specific modeling techniques to provide a pedagogical level of abstraction in the design process.
Abstract: This paper presents a new optimization technique based on quantum computing principles to solve the security constrained economic dispatch (SCED) problem in power systems. The proposed technique is a population-based algorithm that uses quantum computing elements to encode and evolve groups of potential solutions, reaching the optimum through a partially directed random approach. The SCED problem is formulated as a constrained optimization problem in a way that ensures secure and economic system operation. A Real Coded Quantum-Inspired Evolution Algorithm (RQIEA) is then applied to solve the constrained optimization formulation. Simulation results of the proposed approach are compared with those reported in the literature. The outcome is very encouraging and shows that RQIEA is well suited to solving the SCED problem.
Abstract: The broadcast problem, including broadcast plan design, is considered. Data items are inserted and numbered in a predefined order into relations of customized size. The server's ability to create a full, Regular Broadcast Plan (RBP) over single and multiple channels after some data transformations is examined. The Regular Geometric Algorithm (RGA) prepares an RBP and enables users to retrieve their items while avoiding energy waste on their devices. Moreover, the Grouping Dimensioning Algorithm (GDA), based on integrated relations, can guarantee the differentiation of services with a minimum number of channels. This last property, along with self-monitoring and self-organizing, can be offered by today's servers, also providing channel availability and lower energy consumption through a smaller number of channels. Simulation results are provided.
Abstract: This paper introduces a framework based on the collaboration of multi-agent systems and hyper-heuristics to solve a real single-machine production problem. Many techniques have been used to solve this problem, each with its own advantages and disadvantages. By combining a multi-agent system with hyper-heuristics, we can obtain better solutions. The hyper-heuristics approach operates on a search space of heuristics rather than directly on a search space of solutions. The proposed framework consists of several agents, i.e. a problem agent, trainer agent, algorithm agents (GPHH, GAHH, and SAHH), optimizer agent, and solver agent. The low-level heuristics used in this paper are MRT, SPT, LPT, EDD, LDD, and MON.
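The low-level heuristics listed above are simple dispatch rules. As a minimal sketch (the job fields, names, and sample data are illustrative assumptions, not the paper's instances), SPT, LPT, and EDD can be expressed as sort orders and evaluated with a tardiness measure:

```python
# Illustrative dispatch rules for single-machine scheduling; the Job fields
# and the sample instance are assumptions, not data from the paper.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    processing_time: int
    due_date: int

def spt(jobs):
    """Shortest Processing Time first."""
    return sorted(jobs, key=lambda j: j.processing_time)

def lpt(jobs):
    """Longest Processing Time first."""
    return sorted(jobs, key=lambda j: -j.processing_time)

def edd(jobs):
    """Earliest Due Date first."""
    return sorted(jobs, key=lambda j: j.due_date)

def total_tardiness(sequence):
    """Sum of max(0, completion - due_date) over the sequence."""
    t, tardiness = 0, 0
    for job in sequence:
        t += job.processing_time
        tardiness += max(0, t - job.due_date)
    return tardiness

jobs = [Job("A", 4, 5), Job("B", 2, 3), Job("C", 6, 12)]
print([j.name for j in edd(jobs)], total_tardiness(edd(jobs)))  # → ['B', 'A', 'C'] 1
```

A hyper-heuristic would then search over sequences of such rules rather than over job sequences directly.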
Abstract: A design method for fractional delay FIR filters based on the differential evolution algorithm is presented. Differential evolution is an evolutionary algorithm for solving global optimization problems in a continuous search space. In the proposed approach, the evolutionary algorithm is used to determine the coefficients of a fractional delay FIR filter based on the Farrow structure. Basic
differential evolution is enhanced with a restricted mating technique, which improves the algorithm's performance in terms of convergence speed and quality of the obtained solution. Evolutionary optimization is carried out by minimizing an objective function based on the amplitude
response and phase delay errors. Experimental results show that the proposed algorithm leads to a reduction in the amplitude response and phase delay errors relative to those achieved with the Least-Squares
method.
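The basic differential evolution loop that the abstract builds on (classic rand/1/bin, before the restricted-mating enhancement) can be sketched as follows; the sphere objective is an illustrative stand-in for the amplitude-response and phase-delay error function, which is an assumption here:

```python
# Sketch of basic differential evolution (rand/1/bin). The sphere objective
# stands in for the filter-design error function; all parameter values are
# illustrative assumptions.
import random

def differential_evolution(objective, dim, bounds, pop_size=20,
                           F=0.5, CR=0.9, generations=200, seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: difference of two random vectors added to a third.
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            mutant = [a[k] + F * (b[k] - c[k]) for k in range(dim)]
            # Binomial crossover with one guaranteed mutant component.
            jrand = rng.randrange(dim)
            trial = [mutant[k] if (rng.random() < CR or k == jrand) else pop[i][k]
                     for k in range(dim)]
            # Greedy selection: keep the trial only if it is no worse.
            if objective(trial) <= objective(pop[i]):
                pop[i] = trial
    return min(pop, key=objective)

sphere = lambda x: sum(v * v for v in x)
best = differential_evolution(sphere, dim=5, bounds=(-5, 5))
print(round(sphere(best), 6))
```

The restricted-mating enhancement would constrain which individuals may serve as the vectors a, b, c; it is not reproduced here.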
Abstract: Roundabouts work on the principle of circulating and entry flows, where the maximum entry flow rates depend largely on the circulating flow, bearing in mind that entry flows must give way to circulating flows. Where an existing roundabout has a road hump installed at the entry arm, it can be hypothesized that the kinematics of vehicles may prevent the entry arm from achieving optimum performance. Road humps are traffic calming devices placed across the road width solely as a speed-reduction mechanism. They are the preferred traffic calming option in Malaysia and are often used on single and dual carriageway local routes. The speed limit on local routes is 30 mph (50 km/hr). According to the UK Department of Transport, road humps in their various forms achieve the biggest mean speed reduction (based on a mean speed before traffic calming of 30 mph) of up to 10 mph (16 km/hr). The underlying aim of reduced speed should be to achieve a 'safe' distribution of speeds that reflects the function of the road and the impacts on the local community. Constraining the safe distribution of speeds may lead to poor driver timing and delayed reflex reactions that can cause accidents. Previous studies on road hump impact have focused mainly on speed reduction, traffic volume, noise and vibrations, and discomfort and delay from the use of road humps. This paper examines the optimal entry and circulating flows as influenced by road humps. Results show that roundabout entry and circulating flows perform better where there is no road hump at the entrance.
Abstract: Quality Function Deployment (QFD) is an elaborated, multi-step planning method for delivering commodities, services, and processes to customers, both external and internal to an organization. It is a way to translate between the diverse customer languages expressing demands (the Voice of the Customer) and the organization's languages expressing results that satisfy those demands. The policy is to establish one or more matrices that inter-relate the reciprocal expectations of producer and consumer. Due to its visual appearance, the main matrix is called the "House of Quality" (HOQ). In this paper, we cast the HOQ as a multi-attribute decision making (MADM) problem and, through a proposed MADM method, rank the technical specifications. We then compute the satisfaction degree of the customer requirements, applying fuzzy set theory to handle vagueness and uncertainty in decision making. The approach employs a supervised neural network (perceptron) to solve the MADM problem.
Abstract: The purpose of this paper is to present two different
approaches of financial distress pre-warning models appropriate for
risk supervisors, investors and policy makers. We examine a sample
of the financial institutions and electronic companies of Taiwan
Security Exchange (TSE) market from 2002 through 2008. We
present a binary logistic regression with panel data analysis. With
the pooled binary logistic regression we build a model including
more variables in the regression than with random effects, while the
in-sample and out-sample forecasting performance is higher in
random effects estimation than in pooled regression. On the other
hand we estimate an Adaptive Neuro-Fuzzy Inference System
(ANFIS) with Gaussian and Generalized Bell (Gbell) functions and
we find that ANFIS significantly outperforms the logit regressions in both
in-sample and out-of-sample periods, indicating that ANFIS is a
more appropriate tool for financial risk managers and for the
economic policy makers in central banks and national statistical
services.
Abstract: Magnesium is a potential implant material because it is non-toxic to the human body. Owing to this excellent biocompatibility, Mg alloys can be used for implants that avoid a second removal surgery. However, commercial magnesium alloys containing aluminum have been found to show low corrosion resistance, which produces subcutaneous gas bubbles and hinders their use as permanent biomaterials. Moreover, aluminum is known to be a pollutant and is toxic to the nervous system. Therefore, an Mg-35Zn-3Ca alloy was prepared as a new biodegradable material in this study, and a pulsed power supply was used in constant-current mode for anodization. Corrosion resistance and biocompatibility were examined as functions of current and frequency variation. The surface properties and oxide thickness were compared using scanning electron microscopy, corrosion resistance was assessed via potentiodynamic polarization, and the effect of the oxide layer on the body was assessed through cell viability tests. The anodized Mg-35Zn-3Ca alloy showed good in vitro biocompatibility under the applied current and frequency variations.
Abstract: One important problem in today's organizations is the existence of non-integrated information systems, inconsistency, and the lack of suitable correlations between legacy and modern systems. One main solution is to transfer the local databases into a global one. In this regard, we need to extract the data structures from the legacy systems and integrate them with the new technology systems. In legacy systems, huge amounts of data are stored in legacy
databases. They require particular attention since they need more
efforts to be normalized, reformatted and moved to the modern
database environments. Designing the new integrated (global)
database architecture and applying the reverse engineering requires
data normalization. This paper proposes the use of database reverse
engineering in order to integrate legacy and modern databases in
organizations. The suggested approach consists of methods and
techniques for generating data transformation rules needed for the
data structure normalization.
Abstract: The study of the transport coefficients in electronic
devices is currently carried out by analytical and empirical models.
This study requires several simplifying assumptions, generally
necessary to lead to analytical expressions in order to study the
different characteristics of the electronic silicon-based devices.
Further progress in the development, design and optimization of
Silicon-based devices necessarily requires new theory and modeling
tools. In our study, we use the PSO (Particle Swarm Optimization)
technique as a computational tool to develop analytical approaches in
order to study the transport phenomenon of the electron in crystalline
silicon as a function of temperature and doping concentration. Good
agreement between our results and measured data has been found.
The optimized analytical models can also be incorporated into circuit simulators to study Si-based devices without significant impact on computational time and data storage.
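A generic PSO loop of the kind used as the computational tool can be sketched as below; the quadratic objective is an illustrative stand-in for fitting analytical transport models to measured data, and the parameter values are assumptions:

```python
# Sketch of a basic particle swarm optimization (PSO) loop. The quadratic
# objective and all parameter values are illustrative assumptions, not the
# paper's transport model.
import random

def pso(objective, dim, bounds, swarm=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]                 # personal bests
    gbest = min(pbest, key=objective)           # global best
    for _ in range(iters):
        for i in range(swarm):
            for k in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + cognitive pull (pbest) + social pull (gbest).
                vel[i][k] = (w * vel[i][k]
                             + c1 * r1 * (pbest[i][k] - pos[i][k])
                             + c2 * r2 * (gbest[k] - pos[i][k]))
                pos[i][k] += vel[i][k]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
                if objective(pbest[i]) < objective(gbest):
                    gbest = pbest[i][:]
    return gbest

objective = lambda x: sum((v - 1.0) ** 2 for v in x)  # minimum at (1, ..., 1)
best = pso(objective, dim=3, bounds=(-5, 5))
print([round(v, 3) for v in best])
```

In the fitting setting, the objective would measure the error between the analytical model's predictions and the measured transport data.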
Abstract: This paper proposes a modeling methodology for the development of data analysis solutions. The author introduces an approach to address data warehousing issues at the enterprise level. The methodology covers the requirements elicitation and analysis stage as well as the initial design of the data warehouse. The paper reviews an extended business process model that satisfies the needs of data warehouse development. The author considers the use of business process models necessary, as they reflect both enterprise information systems and business functions, which are important for data analysis. The described approach divides development into three steps with different levels of model elaboration, and makes it possible to gather requirements and present them to business users in an accessible manner.
Abstract: Twist drills are geometrically complex tools, and thus various researchers have adopted different mathematical and experimental approaches for their simulation. The present paper acknowledges the increasing use of modern CAD systems, and drilling simulations are carried out using the API (Application Programming Interface) of a CAD system. The developed DRILL3D software routine creates parametrically controlled tool geometries and, under different cutting conditions, generates solid models for all the relevant data involved (drilling tool, cut workpiece, undeformed chip). The final data derived constitute a platform for further direct simulations regarding the determination of cutting forces, tool wear, drilling optimization, etc.
Abstract: Automatic face detection is a complex problem in
image processing. Many methods exist to solve this problem such as
template matching, Fisher Linear Discriminate, Neural Networks,
SVM, and MRC. Success has been achieved with each method to
varying degrees and complexities. The proposed algorithm uses upright, frontal faces in single gray-scale images with decent resolution under good lighting conditions. In face recognition, a single face is matched against single faces from the training dataset. The authors propose a neural-network-based face detection algorithm for photographs; when new test data appear, they are checked against the online scanned training dataset. Experimental results show that the algorithm achieves up to 95% detection accuracy.
Abstract: A high performance computer includes a fast processor and millions of bytes of memory. During data processing, huge amounts of information are shuffled between the memory and the processor. Because of its small size and its speed, cache has become a common feature of high performance computers. Enhancing cache performance has proved essential in speeding up cache-based computers. Most enhancement approaches can be classified as either software based or hardware controlled. The performance of the cache is quantified in terms of the hit ratio or miss ratio. In this paper, we optimize cache performance by enhancing the cache hit ratio. The optimum cache performance is obtained through a cache hardware modification that quickly rejects mismatched line tags at the hit-or-miss comparison stage, thus achieving a low hit time for the wanted line in the cache. In the proposed technique, which we call Even-Odd Tabulation (EOT), the cache lines coming from main memory into the cache are classified into two types, even line tags and odd line tags, depending on their Least Significant Bit (LSB). The EOT technique exploits this division to reject mismatched line tags in a very short time compared to the time spent by the main comparator in the cache, giving an optimum hit time for the wanted cache line. The simulated results show the high performance of the EOT technique against the familiar FAM mapping technique.
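The even/odd tag partitioning behind EOT can be sketched as a toy software lookup; the data layout and function names are illustrative assumptions, not the simulated hardware:

```python
# Toy sketch of the Even-Odd Tabulation (EOT) idea: stored cache line tags
# are split by their least significant bit, so a lookup compares only
# against tags of matching parity. Names and layout are assumptions.
def build_eot_tables(tags):
    """Partition the stored tags into even and odd tables by LSB."""
    even = {t for t in tags if t & 1 == 0}
    odd = {t for t in tags if t & 1 == 1}
    return even, odd

def eot_lookup(tag, even, odd):
    """Check only the half whose parity matches the requested tag."""
    table = even if tag & 1 == 0 else odd
    return tag in table  # True -> hit, False -> quick miss

even, odd = build_eot_tables([0b1010, 0b0111, 0b1100])
print(eot_lookup(0b0111, even, odd), eot_lookup(0b0110, even, odd))  # → True False
```

In hardware, the point of the split is that roughly half the tag comparators can be skipped before the main comparison stage.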
Abstract: This paper presents an interval-based multi-attribute
decision making (MADM) approach in support of the decision
process with imprecise information. The proposed decision
methodology is based on the model of linear additive utility function
but extends the problem formulation with the measure of composite
utility variance. A sample study concerning the evaluation of
electric generation expansion strategies is provided showing how the
imprecise data may affect the choice toward the best solution and
how a set of alternatives, acceptable to the decision maker (DM),
may be identified with certain confidence.
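The linear additive utility model over interval-valued attributes can be illustrated with a small sketch; the weights and interval scores are invented for illustration, and the paper's composite utility variance measure is not reproduced:

```python
# Illustrative linear additive utility with interval attribute scores: the
# interval endpoints yield lower and upper utility bounds per alternative.
# Weights and data are assumptions, not the paper's case study values.
def additive_utility_bounds(weights, intervals):
    """intervals[i] = (low, high) normalized score for attribute i."""
    low = sum(w * lo for w, (lo, hi) in zip(weights, intervals))
    high = sum(w * hi for w, (lo, hi) in zip(weights, intervals))
    return low, high

weights = [0.5, 0.3, 0.2]                      # attribute weights (sum to 1)
alt = [(0.6, 0.8), (0.4, 0.5), (0.9, 1.0)]     # interval scores, one per attribute
lo, hi = additive_utility_bounds(weights, alt)
print(round(lo, 4), round(hi, 4))  # → 0.6 0.75
```

Comparing such utility intervals across alternatives is one way imprecise data can change which alternative appears best.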
Abstract: The dynamic behaviour of a four-bar linkage driven by a velocity-controlled DC motor is discussed in the paper. In particular, the author presents the results obtained by means of specifically developed software, which implements the mathematical models of all components of the system (linkage, transmission, electric motor, control devices). The use of this software enables a more efficient design approach, since it allows the designer to check, in a simple and immediate way, the dynamic behaviour of the mechanism arising from different values of the system parameters.
Abstract: The job shop scheduling problem (JSSP) is well known as one of the most difficult combinatorial optimization problems. This paper presents a hybrid genetic algorithm for the JSSP with the objective of minimizing makespan. The efficiency of the genetic algorithm is enhanced by integrating it with a local search method. The chromosome representation of the problem is based on operations. Schedules are constructed using a procedure that generates full active schedules. In each generation, a local search heuristic based on Nowicki and Smutnicki's neighborhood is applied to improve the solutions. The approach is tested on a set of standard instances taken from the literature and compared with other approaches. The computational results validate the effectiveness of the proposed algorithm.
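The operation-based chromosome representation mentioned above can be illustrated with a small decoder: each job ID appears once per operation, and scanning the chromosome schedules each job's next operation at the earliest feasible time. The 2-job, 2-machine instance is an assumption for illustration:

```python
# Sketch of operation-based JSSP chromosome decoding. jobs[j] lists job j's
# operations as (machine, duration) in technological order; the instance is
# an illustrative assumption.
def decode(chromosome, jobs):
    """Return the makespan of the schedule implied by the chromosome."""
    next_op = [0] * len(jobs)      # next operation index for each job
    job_ready = [0] * len(jobs)    # time at which each job becomes free
    mach_ready = {}                # time at which each machine becomes free
    for j in chromosome:
        machine, duration = jobs[j][next_op[j]]
        # An operation starts when both its job and its machine are free.
        start = max(job_ready[j], mach_ready.get(machine, 0))
        job_ready[j] = mach_ready[machine] = start + duration
        next_op[j] += 1
    return max(job_ready)

jobs = [[(0, 3), (1, 2)],   # job 0: machine 0 for 3, then machine 1 for 2
        [(1, 2), (0, 4)]]   # job 1: machine 1 for 2, then machine 0 for 4
print(decode([0, 1, 1, 0], jobs))  # → 7
```

A genetic algorithm would evolve such chromosomes with permutation-preserving crossover and use the decoded makespan as the fitness.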
Abstract: Graph based image segmentation techniques are considered to be among the most efficient segmentation techniques, mainly used as time- and space-efficient methods for real-time applications. However, there is a need to focus on improving the
quality of segmented images obtained from the earlier graph based
methods. This paper proposes an improvement to the graph based
image segmentation methods already described in the literature. We
contribute to the existing method by proposing the use of a weighted
Euclidean distance to calculate the edge weight which is the key
element in building the graph. We also propose a slight modification
of the segmentation method already described in the literature, which
results in selection of more prominent edges in the graph. The
experimental results show the improvement in the segmentation
quality as compared to the methods that already exist, with a slight
compromise in efficiency.
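The weighted Euclidean distance used as the edge weight can be sketched as follows; the per-channel weights here are illustrative assumptions, not the paper's values:

```python
# Sketch of a weighted Euclidean distance between two neighboring pixels'
# RGB values, usable as a graph edge weight. The channel weights are
# illustrative assumptions.
import math

def edge_weight(p, q, channel_weights=(0.5, 0.3, 0.2)):
    """Weighted Euclidean distance between RGB pixels p and q."""
    return math.sqrt(sum(w * (a - b) ** 2
                         for w, a, b in zip(channel_weights, p, q)))

print(round(edge_weight((255, 0, 0), (250, 10, 5)), 3))  # → 6.892
```

In the segmentation graph, each pixel is a vertex and each pair of neighboring pixels gets an edge carrying this weight.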
Abstract: In this article we discuss improving multi-class classification using a multilayer perceptron. The considered approach consists in breaking down the n-class problem into two-class subproblems. Each two-class subproblem is trained independently; in the test phase, the vector to be classified is confronted with all the two-class models, and the elected class is the strongest one, the class that does not lose any competition against the other classes. The recognition rates obtained with the multi-class approach by two-class decomposition are clearly better than those obtained by the simple multi-class approach.
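The two-class decomposition scheme described above can be sketched as pairwise voting over all class pairs; the toy pairwise rule stands in for trained perceptron subnetworks, which is an assumption for illustration:

```python
# Sketch of one-vs-one decomposition: every class pair gets its own
# two-class decision, and the elected class is the one losing the fewest
# pairwise contests. The toy decision rule is an illustrative assumption.
from itertools import combinations

def classify(x, classes, pairwise_decide):
    """pairwise_decide(a, b, x) returns the winner of the (a, b) contest."""
    losses = {c: 0 for c in classes}
    for a, b in combinations(classes, 2):
        loser = b if pairwise_decide(a, b, x) == a else a
        losses[loser] += 1
    # Elected class: fewest pairwise losses (0 when one class beats all).
    return min(classes, key=lambda c: losses[c])

# Toy pairwise rule: the class whose index is closest to x wins.
decide = lambda a, b, x: a if abs(a - x) <= abs(b - x) else b
print(classify(2.2, [0, 1, 2, 3], decide))  # → 2
```

With trained models, each pairwise decision would come from the two-class perceptron trained on that pair's data.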