Abstract: CScheme is a concurrent programming paradigm, based on the concept of schemes, that enables concurrency schemes to be constructed from smaller synchronization units through a GUI-based composer and later reused on other concurrency problems of a similar nature. This paradigm is particularly important in the multi-core environments prevalent today. In this paper, we demonstrate techniques to separate concurrency from functional code using the CScheme paradigm. We then illustrate how the CScheme methodology can be used to solve some traditional concurrency problems, namely the critical section problem and the readers-writers problem, using synchronization schemes such as the Single Threaded Execution Scheme and the Readers Writers Scheme.
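The abstract names the Readers Writers Scheme without showing its construction. As a rough illustration of the synchronization unit it refers to, the following is a minimal readers-preference readers-writers lock in Python; the class and attribute names are ours, not CScheme's, and this is a sketch of the general pattern rather than the paper's composer-built scheme:

```python
import threading

class ReadersWriterLock:
    """Minimal readers-writers scheme: many concurrent readers,
    writers get exclusive access (readers-preference variant)."""

    def __init__(self):
        self._readers = 0
        self._mutex = threading.Lock()   # guards the reader count
        self._room = threading.Lock()    # held by a writer, or by the reader group

    def acquire_read(self):
        with self._mutex:
            self._readers += 1
            if self._readers == 1:       # first reader locks out writers
                self._room.acquire()

    def release_read(self):
        with self._mutex:
            self._readers -= 1
            if self._readers == 0:       # last reader readmits writers
                self._room.release()

    def acquire_write(self):
        self._room.acquire()

    def release_write(self):
        self._room.release()
```

The Single Threaded Execution Scheme mentioned alongside it corresponds to the degenerate case where every participant takes the writer path, i.e. a plain mutex.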
Abstract: In this paper, we present C@sa, a multiagent system for modeling, controlling and simulating the behavior of an intelligent house. The system aims to provide architects, designers and psychologists with a simulation and control tool for understanding the impact of embedded and pervasive technology on people's daily lives. In this vision, the house is seen as an environment made up of independent and distributed devices, controlled by agents, that interact to support the user's goals and tasks.
Abstract: In this paper, an accurate theoretical analysis of the achievable average channel capacity (in the Shannon sense) per user of a hybrid cellular direct-sequence/fast frequency hopping code-division multiple-access (DS/FFH-CDMA) system operating in a Rayleigh fading environment is presented. The analysis covers downlink operation and leads to the derivation of an exact mathematical expression relating the normalized average channel capacity available to each system user, under simultaneous optimal power and rate adaptation, to the system's parameters, such as the number of hops per bit, the applied processing gain, the number of users per cell and the received signal-to-noise power ratio over the signal bandwidth. Finally, numerical results are presented to illustrate the proposed mathematical analysis.
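The paper derives an exact closed-form expression; as a hedged numerical sketch of the quantity being analyzed, the following Monte Carlo estimate of the normalized ergodic capacity E[log2(1 + gamma)] for an exponentially distributed SNR (Rayleigh fading) illustrates the single-user baseline. It deliberately omits the hops-per-bit, processing-gain and multiuser-interference terms of the paper's analysis, and the function name and parameters are our own:

```python
import random
import math

def avg_capacity_rayleigh(mean_snr, n_samples=200_000, seed=1):
    """Monte Carlo estimate of the normalized average channel capacity
    C/B = E[log2(1 + gamma)], where the instantaneous SNR gamma is
    exponentially distributed (Rayleigh fading) with mean mean_snr (linear)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        gamma = rng.expovariate(1.0 / mean_snr)  # exponential SNR draw
        total += math.log2(1.0 + gamma)
    return total / n_samples
```

For an average SNR of 10 dB (linear 10) this lands near 2.9 bit/s/Hz, and the estimate grows monotonically with the average SNR, as expected.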
Abstract: The success of an e-learning system is highly dependent on the quality of its educational content and on how effective, complete, and simple the design tool can be for teachers. Educational modeling languages (EMLs) are proposed as design languages intended for teachers to model diverse teaching-learning experiences, independently of the pedagogical approach and in different contexts. However, most existing EMLs are criticized for being too abstract and too complex to be understood and manipulated by teachers. In this paper, we present a visual EML that simplifies the process of designing learning scenarios for teachers with no programming background. Based on the conceptual framework of activity theory, our visual EML uses domain-specific modeling techniques to provide a pedagogical level of abstraction in the design process.
Abstract: In today's world, where everything is changing rapidly and information technology is developing fast, many aspects of culture, society, politics and the economy have changed. The advent of information technology and electronic data transmission has made communication easy, and fields such as e-learning and e-commerce are now easily accessible to everyone. One such technology is virtual training, and the quality of such education systems is critical. In this study, 131 questionnaires were prepared and distributed among students at Toba University. The research examined the factors that affect the quality of learning from the perspective of the staff, students and professors of this type of university. It is concluded that the important factors in virtual training are the quality of the professors, the quality of the staff, and the quality of the university. These factors had the highest priority in this education system and are necessary for improving virtual training.
Abstract: A learning management system (commonly abbreviated as LMS) is a software application for the administration, documentation, tracking, and reporting of training programs, classroom and online events, e-learning programs, and training content (Ellis 2009). Hall (2003) defines an LMS as "software that automates the administration of training events. All Learning Management Systems manage the log-in of registered users, manage course catalogs, record data from learners, and provide reports to management". Evidence of the worldwide spread of e-learning in recent years is easy to obtain. In April 2003, no fewer than 66,000 fully online courses and 1,200 complete online programs were listed on the TeleCampus portal from TeleEducation (Paulsen 2003). According to the report "The US Market in the Self-paced eLearning Products and Services: 2010-2015 Forecast and Analysis", the number of students taking classes exclusively online will be nearly equal (1% less) to the number taking classes exclusively on physical campuses, and the number of students taking online courses in the USA will increase from 1.37 million in 2010 to 3.86 million in 2015. In another report, by The Sloan Consortium, three-quarters of institutions report that the economic downturn has increased demand for online courses and programs.
Abstract: In this paper, we present experimental testing of a new algorithm that determines optimal controller coefficients for output variance reduction in Linear Time Invariant (LTI) systems. The algorithm features simplicity of calculation, generalizes to minimum- and non-minimum-phase systems, and can be configured to achieve reference tracking as well as variance reduction by compromising on the output variance. An experiment on DC-motor velocity control demonstrates the application of this new algorithm in designing the controller. The results show that the controller achieves minimum variance and reference tracking for a preset velocity reference, relying on an identified model of the motor.
Abstract: This paper introduces a framework based on the collaboration of a multiagent system and hyper-heuristics to find a solution to a real single-machine production problem. Many techniques have been used to solve this problem, each with its own advantages and disadvantages. Through the collaboration of a multiagent system and hyper-heuristics, a more optimal solution can be obtained. The hyper-heuristics approach operates on a search space of heuristics rather than directly on a search space of solutions. The proposed framework consists of several agents, i.e. a problem agent, a trainer agent, algorithm agents (GPHH, GAHH, and SAHH), an optimizer agent, and a solver agent. The low-level heuristics used in this paper are MRT, SPT, LPT, EDD, LDD, and MON.
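Three of the low-level heuristics named above (SPT, LPT, EDD) are standard dispatching rules and can be sketched directly; the job representation below (dicts with processing time "p" and due date "d") is our own assumption, and MRT, LDD and MON are not reproduced here:

```python
def spt(jobs):
    """Shortest Processing Time first."""
    return sorted(jobs, key=lambda j: j["p"])

def lpt(jobs):
    """Longest Processing Time first."""
    return sorted(jobs, key=lambda j: j["p"], reverse=True)

def edd(jobs):
    """Earliest Due Date first."""
    return sorted(jobs, key=lambda j: j["d"])

def total_flow_time(seq):
    """Sum of job completion times on a single machine
    processing the sequence in order."""
    t = total = 0
    for j in seq:
        t += j["p"]     # completion time of this job
        total += t
    return total
```

A hyper-heuristic in the sense of the abstract would search over orderings and combinations of such rules, scoring each by an objective like total flow time, rather than searching the space of job sequences directly.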
Abstract: In Orthogonal Frequency Division Multiplexing (OFDM) systems, the Peak to Average power Ratio (PAR) is high. Clipping the signal is a useful and simple method of reducing the PAR; however, it introduces additional noise that degrades system performance. We propose an oversampling scheme that processes the received signal with a Finite Impulse Response (FIR) filter in order to reduce the clipping noise. The filter coefficients are obtained from the correlation function of the received signal and the oversampling information at the receiver. The performance of the proposed technique is evaluated for a frequency-selective channel. Results show that the proposed scheme can mitigate the clipping noise significantly for OFDM systems, and that in order to maintain the system's capacity, the clipping ratio should be larger than 2.5.
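To make the terms concrete: the PAR (often PAPR) compares peak to average signal power, and amplitude clipping limits the envelope to a multiple of the RMS level. The sketch below illustrates only these two definitions; it does not reproduce the paper's receiver-side oversampling FIR filter, and the function names are ours:

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10.0 * np.log10(p.max() / p.mean())

def clip(x, clip_ratio):
    """Amplitude clipping: limit |x| to clip_ratio * rms(x), preserving phase.
    clip_ratio is the clipping ratio referred to in the abstract."""
    a_max = clip_ratio * np.sqrt(np.mean(np.abs(x) ** 2))
    mag = np.abs(x)
    scale = np.where(mag > a_max, a_max / mag, 1.0)
    return x * scale
```

An OFDM signal with many subcarriers behaves approximately like complex Gaussian noise, which is why its envelope exhibits large peaks; clipping at a ratio of 2.5 bounds the peak power near 10*log10(2.5^2) ≈ 8 dB above the original average power.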
Abstract: This paper investigates the robust stability of uncertain neutral systems with time-varying delay. By using the Lyapunov method and linear matrix inequality techniques, new delay-dependent stability criteria are obtained and formulated in terms of linear matrix inequalities (LMIs), which make it easy to check the robust stability of the considered systems. Numerical examples are given to indicate significant improvements over some existing results.
Abstract: Carbon nanotubes (CNTs), with their outstanding mechanical, electrical, thermal and chemical properties, are regarded as promising materials for many different potential applications. Owing to these unique properties, they can be used in a wide range of fields such as electronic devices, electrodes, drug delivery systems, hydrogen storage, and textiles. Catalytic chemical vapor deposition (CCVD) is a common method for CNT production, especially for mass production. Catalysts impregnated on a suitable substrate are important for production by the chemical vapor deposition (CVD) method, and an iron catalyst on an MgO substrate is one of the most common catalyst-substrate combinations used for CNT synthesis. In this study, CNTs were produced by CCVD of acetylene (C2H2) on a magnesium oxide (MgO) powder substrate impregnated with an iron nitrate (Fe(NO3)3·9H2O) solution. The synthesis conditions were as follows: at synthesis temperatures of 500 and 800°C, multi-wall and single-wall CNTs were produced, respectively. Iron (Fe) catalysts were prepared with Fe:MgO ratios of 1:100, 5:100 and 10:100, and the synthesis durations were 30 and 60 minutes for all temperatures and catalyst percentages. The synthesized materials were characterized by thermal gravimetric analysis (TGA), transmission electron microscopy (TEM) and Raman spectroscopy.
Abstract: This paper presents an investigation of the power
penalties imposed by four-wave mixing (FWM) on G.652 (Single-
Mode Fiber - SMF), G.653 (Dispersion-Shifted Fiber - DSF), and
G.655 (Non-Zero Dispersion-Shifted Fiber - NZDSF) compliant
fibers, considering the DWDM grids suggested by ITU-T
Recommendations G.692 and G.694.1, with uniform channel
spacings of 100, 50, 25, and 12.5 GHz. The mathematical/numerical
model assumes undepleted pumping, and shows very clearly the
deleterious effect of FWM on the performance of DWDM systems,
measured by the signal-to-noise ratio (SNR). The results make it
evident that non-uniform channel spacing is practically mandatory
for WDM systems based on DSF fibers.
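The reason uniform spacing is so damaging can be seen combinatorially: FWM generates products at frequencies f_i + f_j - f_k, and on a uniform grid these land exactly on existing channels, whereas a suitably unequal spacing keeps them off-channel. The small sketch below (our own illustration, not the paper's undepleted-pump model, and ignoring product powers and fiber dispersion) demonstrates this:

```python
def fwm_products(freqs):
    """All four-wave mixing product frequencies f_i + f_j - f_k.
    i and j may coincide (degenerate FWM); k must differ from both."""
    out = set()
    n = len(freqs)
    for i in range(n):
        for j in range(n):
            for k in range(n):
                if k != i and k != j:
                    out.add(freqs[i] + freqs[j] - freqs[k])
    return out

def products_on_channels(freqs):
    """FWM products that fall exactly on an existing channel."""
    return fwm_products(freqs) & set(freqs)
```

On a uniform grid such as {0, 100, 200, 300} GHz (relative frequencies), products like 0 + 200 - 100 = 100 GHz fall directly on a channel; an unequally spaced set with all pairwise differences distinct, such as {0, 100, 250, 450}, produces no on-channel products.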
Abstract: One important problem in today's organizations is the existence of non-integrated information systems, with inconsistency and a lack of suitable correlations between legacy and modern systems. One main solution is to transfer the local databases into a global one. In this regard, we need to extract the data structures from the legacy systems and integrate them with the new-technology systems. In legacy systems, huge amounts of data are stored in legacy databases. These require particular attention, since considerable effort is needed to normalize, reformat and move them to modern database environments. Designing the new integrated (global) database architecture and applying reverse engineering requires data normalization. This paper proposes the use of database reverse engineering to integrate legacy and modern databases in organizations. The suggested approach consists of methods and techniques for generating the data transformation rules needed for data structure normalization.
Abstract: This paper proposes a modeling methodology for the development of data analysis solutions. The author introduces an approach that addresses data warehousing issues at the enterprise level. The methodology covers the requirements elicitation and analysis stage as well as the initial design of the data warehouse. The paper reviews an extended business process model that satisfies the needs of data warehouse development. The author considers the use of business process models necessary, as they reflect both enterprise information systems and business functions, which are important for data analysis. The described approach divides development into three steps with different levels of model elaboration, and makes it possible to gather requirements and present them to business users in an accessible manner.
Abstract: Twist drills are geometrically complex tools, and thus various researchers have adopted different mathematical and experimental approaches to their simulation. The present paper acknowledges the increasing use of modern CAD systems and carries out drilling simulations using the API (Application Programming Interface) of a CAD system. The developed DRILL3D software routine creates parametrically controlled tool geometries and, under different cutting conditions, generates solid models of all the relevant data involved (drilling tool, cut workpiece, undeformed chip). The final data derived constitute a platform for further direct simulations regarding the determination of cutting forces, tool wear, drilling optimizations, etc.
Abstract: PARADIGMA (PARticipative Approach to DIsease
Global Management) is a pilot project which aims to develop and
demonstrate an Internet based reference framework to share scientific
resources and findings in the treatment of major diseases.
PARADIGMA defines and disseminates a common methodology and
optimised protocols (Clinical Pathways) to support service functions
directed to patients and individuals on matters such as prevention,
post-hospitalisation support and awareness. PARADIGMA will provide a
platform of information services - user-oriented and optimised
against social, cultural and technological constraints - supporting the
Health Care Global System of the Euro-Mediterranean Community
in a continuous improvement process.
Abstract: Software effort estimation is the process of predicting the most realistic amount of effort required to develop or maintain software based on incomplete, uncertain and/or noisy input. Effort estimates may be used as input to project plans, iteration plans, and budgets. Various models, such as the Halstead, Walston-Felix, Bailey-Basili, Doty and GA-based models, have already been used to estimate software effort for projects. In this study, statistical models, a Fuzzy-GA model and a Neuro-Fuzzy (NF) inference system are applied to estimate software effort for projects. The performance of the developed models was tested on NASA software project datasets, and the results are compared with the Halstead, Walston-Felix, Bailey-Basili, Doty and Genetic Algorithm based models reported in the literature. The NF model shows the best results, with the lowest MMRE and RMSE values, compared with the Fuzzy-GA based hybrid inference system and the other existing models used for effort prediction.
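The two evaluation metrics named in the abstract have standard definitions, sketched below; the paper's datasets and model outputs are not reproduced, so the example values in the usage note are purely illustrative:

```python
import math

def mmre(actual, predicted):
    """Mean Magnitude of Relative Error:
    mean over projects of |actual - predicted| / actual."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root Mean Squared Error of the effort predictions."""
    return math.sqrt(
        sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)
    )
```

For instance, actual efforts [10, 20, 40] against predictions [8, 25, 40] give an MMRE of (0.2 + 0.25 + 0) / 3 = 0.15; lower values of both metrics indicate a better estimation model, which is the sense in which the NF model outperforms the others.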
Abstract: The objective of this study was to improve our understanding of vulnerability and environmental change, its causes, intensity and distribution, and the human-environment effect on the ecosystem in the Apodi Valley region. This paper identifies, assesses and classifies vulnerability and environmental change in the Apodi Valley region using a combined approach of landscape pattern and ecosystem sensitivity. Models were developed using five thematic layers: geology, geomorphology, soil, vegetation and land use/cover, by means of a Geographical Information System (GIS) based on hydro-geophysical parameters. In spite of data problems and shortcomings, using ESRI's ArcGIS 9.3 program to classify, weight and combine 15 separate land cover classes into a single vulnerability indicator provides a reliable measure of differences (6 classes) among regions and communities that are exposed to similar ranges of hazards. Indeed, the ongoing and active development of vulnerability concepts and methods has already produced tools to help overcome common issues, such as acting in a context of high uncertainty, taking into account the dynamics and spatial scale of a social-ecological system, or gathering viewpoints from different sciences to combine human- and impact-based approaches. Based on this assessment, this paper proposes concrete perspectives and possibilities to benefit from existing commonalities in the construction and application of assessment tools.
Abstract: In Algeria, liberalization reforms undertaken since the 1990s have had negative effects on the development and management of irrigation schemes, as well as on the conditions of farmers. Reforms have been undertaken to improve the performance of irrigation schemes, such as the national plan of agricultural development (PNDA) in 2000 and the water pricing policy of 2005. However, after implementation of these policies, questions have arisen with regard to irrigation performance and its suitability for agricultural development. Hence, the aim of this paper is to provide insight into the profitability of irrigation during the transition period under current irrigation agricultural policies in Algeria. Using the method of farm crop budget analysis in the East Mitidja irrigation scheme, the returns from using surface water resources, based on farm typology, were found to vary among crops and farmers' groups within the scheme. Irrigation under the current situation is profitable for all farmers, including both those who benefit from subsidies and those who do not. However, the returns to water were found to be very sensitive to crop price fluctuations, particularly for non-subsidized groups and less so for those whose farming is based on orchards. Moreover, the socio-economic environment of the farmers contributed to the less significant impact of the PNDA policy. In fact, the limiting factor is not only water, but also the lack of land ownership titles. Market access constraints led to less agricultural investment and therefore to low intensification and low water productivity. It is financially feasible to recover the annual O&M costs in the irrigation scheme. By comparing the irrigation water price, the returns to water, and the O&M costs of water delivery, it is clear that irrigation can be profitable in the future. However, water productivity must be improved by enhancing farmers' income through farming investment, improving access to assets, and allocating activities and crops that bring high returns to water; this would allow farmers to pay more for water and permit cost recovery for water systems.
Abstract: We investigate planar quasi-septic non-analytic systems that have a center-focus equilibrium at the origin and whose angular speed is constant. The system can be transformed into an analytic system by two transformations; with the help of the computer algebra system MATHEMATICA, the conditions for a uniform isochronous center are obtained.