Abstract: Sedimentation is a complex hydraulic phenomenon that has emerged as a major operational and maintenance consideration in modern hydraulic engineering in general and river engineering in particular. Sediment accumulation along a river course, and its eventual deposition in the form of islands, affects water intake into the canal systems fed by storage reservoirs. Without proper management, sediment transport can lead to major operational challenges in the water distribution systems of arid regions such as the Dez and Hamidieh command areas. This paper investigates sedimentation in the Western Canal of the Dez Diversion Weir using the SHARC model and compares the results with those for the two intake structures of the Hamidieh dam in Iran, obtained with the SSIIM model. The objective was to identify the factors that influence the process, check the reliability of the outcome, and suggest ways to mitigate the implications for the operation and maintenance of the structures. The results estimated sand and silt bed load concentrations to be 193 ppm and 827 ppm, respectively. A broadly similar pattern was observed at Hamidieh, where sediment formation impeded water intake into the canal system. Given the available data on average annual bed loads and average suspended sediment loads of 165 ppm and 837 ppm in the Dez, there was a significant statistical difference (16%) for the sand grains, whereas no significant difference (1.2%) was found for the silt grain sizes. One explanation for this finding is that along the 6 km river course there were considerable meandering effects, which explains the recent shift in hydraulic behavior along the stream course under investigation. The sand concentration downstream, relative to the present state of the canal, showed a steeply descending curve, while sediment trapping showed a steeply ascending curve. These results occurred because the diversion weir was not included in the simulation model.
The comparative study showed very close similarities in the results, indicating that both software packages can be used as accurate and reliable analytical tools for simulating sedimentation in hydraulic engineering.
Abstract: Swarm principles are increasingly being used to design controllers for the coordination of multi-robot systems or, in general, multi-agent systems. This paper proposes a two-dimensional Lagrangian swarm model that enables the planar agents, modeled as point masses, to swarm whilst effectively avoiding each other and obstacles in the environment. A novel method, based on an extended Lyapunov approach, is used to construct the model. Importantly, the Lyapunov method ensures a form of practical stability that guarantees an emergent behavior, namely, a cohesive and well-spaced swarm with a constant arrangement of individuals about the swarm centroid. Computer simulations illustrate this basic feature of collective behavior. As an application, we show how multiple planar mobile unicycle-like robots swarm to eventually form patterns in which their velocities and orientations stabilize.
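The emergent behavior described above, cohesion with collision avoidance, can be illustrated with a minimal point-mass simulation. The sketch below uses a simple centroid-attraction plus short-range-repulsion rule; this rule, and all gains in it, are illustrative assumptions, not the paper's Lyapunov-derived controllers.

```python
import math
import random

def simulate_swarm(n=10, steps=2000, dt=0.01, seed=1):
    """Planar point-mass swarm: each agent moves along the negative
    gradient of a potential combining attraction to the swarm centroid
    with short-range inter-agent repulsion (illustrative sketch)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5.0, 5.0), rng.uniform(-5.0, 5.0)] for _ in range(n)]
    for _ in range(steps):
        cx = sum(p[0] for p in pos) / n
        cy = sum(p[1] for p in pos) / n
        nxt = []
        for i, (xi, yi) in enumerate(pos):
            vx, vy = cx - xi, cy - yi          # attraction to centroid
            for j, (xj, yj) in enumerate(pos):
                if i == j:
                    continue
                dx, dy = xi - xj, yi - yj
                d2 = max(dx * dx + dy * dy, 1e-2)  # clamp keeps steps bounded
                vx += dx / d2                   # 1/d repulsion from neighbour
                vy += dy / d2
            nxt.append([xi + dt * vx, yi + dt * vy])
        pos = nxt
    return pos
```

After the transient, the agents settle into a cohesive, well-spaced cluster about the centroid, which is the qualitative behavior the abstract describes.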
Abstract: A captured gel electrophoresis image represents the output of a DNA computing algorithm. Before this image is captured, DNA computing involves parallel overlap assembly (POA) and the polymerase chain reaction (PCR), which form the core of the computing algorithm. However, the design of the DNA oligonucleotides that represent a problem is quite complicated and prone to errors. In order to reduce these errors during the design stage, before the actual in-vitro experiment is carried out, simulation software capable of simulating the POA and PCR processes was developed. The software's capability is not limited: a problem of any size and complexity can be simulated, saving the cost of possible errors during the design process. Information about the DNA sequences during the computing process, as well as the computing output, can be extracted at the same time using the simulation software.
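As a sketch of what such a simulation does at its core, one POA step can be modeled as annealing two single strands at a complementary 3' overlap and then extending both strands with polymerase. The function below is illustrative only, not the paper's software; it returns the resulting duplex as its upper strand.

```python
def complement(seq):
    """Watson-Crick complement of a 5'->3' DNA string (not reversed)."""
    return seq.translate(str.maketrans("ACGT", "TGCA"))

def overlap_extend(upper, lower, min_overlap=4):
    """Simulate one POA annealing/extension step.

    Both `upper` and `lower` are given 5'->3'. If the 3' end of `upper`
    is complementary to the 3' end of `lower` over at least
    `min_overlap` bases, polymerase extension completes both strands;
    the full product is returned as its upper strand (5'->3').
    Returns None when no sufficient overlap exists."""
    # The reverse complement of `lower` is the sequence it pairs with.
    rc_lower = complement(lower)[::-1]
    # Longest suffix of `upper` matching a prefix of rc_lower.
    for k in range(min(len(upper), len(rc_lower)), min_overlap - 1, -1):
        if upper[-k:] == rc_lower[:k]:
            return upper + rc_lower[k:]
    return None
```

For example, `overlap_extend("AAAACCCC", "AACCGGGG")` anneals on the 4-base `CCCC`/`GGGG` overlap and extends to `"AAAACCCCGGTT"`.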
Abstract: Cellular communication is widely used all over the world, and the number of handset users keeps increasing due to market demand. The key aspect addressed in this paper is the security system of cellular communication: it is important to provide users with a secure channel for communication. A brief description of the new GSM cellular network architecture is provided. The limitations of cellular networks, their security issues, and the different types of attacks are discussed. The paper then reviews some new security mechanisms that have been proposed by researchers. Overall, this paper clarifies the security system and services of cellular communication using GSM. Three Malaysian communication companies were taken as case studies in this paper.
Abstract: The quality of Ribbed Smoked Sheets (RSS) is based primarily on color, dryness, and the presence or absence of fungus and bubbles. This quality is strongly influenced by the drying and fumigation process, namely smoking. Smoking at a high temperature for a long time produces scorched, dark brown sheets, whereas a temperature that is too low, or too slow a drying rate, results in undercured sheets and fungal growth. It is therefore necessary to find the smoking time and temperature that give optimum sheet quality. Furthermore, unmonitored heat and mass transfer during the smoking process leads to high energy losses. This research aims to derive simple empirical mathematical models describing the effect of smoking time and temperature on RSS quality in terms of color, water content, fungus, and bubbles. The second goal of the study was to analyze the energy balance during the smoking process. An experimental study was conducted by measuring the temperature, residence time, and quality parameters of 16 sheet samples in smoking rooms. Data for the energy balance, such as the mass of fuel wood, the mass of the sheets being smoked, construction temperature, ambient temperature, and relative humidity, were taken directly throughout the smoking process. It was found that the mathematical model correlating smoking temperature and time with color is Color = -169 - 0.184 T4 - 0.193 T3 - 0.160 T2 + 0.405 T1 + 0.388 t1 + 3.11 t2 + 3.92 t3 + 0.215 t4, with an R-squared of 50.8%, and with moisture is Moisture = -1.40 - 0.00123 T4 + 0.00032 T3 + 0.00260 T2 - 0.00292 T1 - 0.0105 t1 + 0.0290 t2 + 0.0452 t3 + 0.00061 t4, with an R-squared of 49.9%. The smoking room energy analysis found that useful energy was 27.8% and the energy stored in the construction material was 7.3%. Energy lost in the conversion of wood combustion, ventilation, and other processes was 16.6%. The energy flowing out through the contact of the construction material with the ambient air was found to be the largest contributor to energy losses, reaching 48.3%.
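The two fitted regressions above can be evaluated directly. The sketch below transcribes the abstract's coefficients; note that the grouping of the T1/T2 terms in the color model is partly garbled in the abstract, so the form -0.160 T2 + 0.405 T1 used here is an assumption.

```python
def rss_color(T1, T2, T3, T4, t1, t2, t3, t4):
    """Empirical color model from the abstract (R-squared ~ 50.8%).
    ASSUMPTION: the ambiguous T1/T2 terms are read as
    -0.160*T2 + 0.405*T1."""
    return (-169 - 0.184 * T4 - 0.193 * T3 - 0.160 * T2 + 0.405 * T1
            + 0.388 * t1 + 3.11 * t2 + 3.92 * t3 + 0.215 * t4)

def rss_moisture(T1, T2, T3, T4, t1, t2, t3, t4):
    """Empirical moisture model from the abstract (R-squared ~ 49.9%)."""
    return (-1.40 - 0.00123 * T4 + 0.00032 * T3 + 0.00260 * T2
            - 0.00292 * T1 - 0.0105 * t1 + 0.0290 * t2 + 0.0452 * t3
            + 0.00061 * t4)
```

Here T1..T4 are the measured temperatures and t1..t4 the corresponding residence times at the four measurement positions, as in the fitted models.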
Abstract: A way of achieving nanodimensional structural elements in high-carbon steel by a special kind of heat treatment combined with cold plastic deformation is explored, acting on the interlamellar spacing of the ferrite-carbide mixture. A decrease in the interlamellar spacing with increasing cooling temperature is determined. Experiments confirm the interlamellar spacing at which high-carbon steel demonstrates the highest treatability and hardening capability. The effect of the total degree of deformation on the interlamellar spacing of the ferrite-carbide mixture is obtained. Mechanical test results show that high-carbon steel, after heat treatment and repeated cold plastic deformation, possesses high tensile and yield strength while retaining good percentage elongation.
Abstract: This paper deals with e-government issues at several levels. Initially, we examine the concept of e-government itself in order to give it a sound framework. Then we look at e-government issues at three levels: first, we analyse it at the global level; second, at the level of transition economies; and finally, we take a closer look at developments in Croatia. The analysis covers the actual progress being made in selected transition economies relative to Euro-area averages, along with e-government's potential in the demanding period ahead.
Abstract: Model Predictive Control (MPC) is increasingly being proposed for real-time applications and embedded systems. However, compared to the PID controller, implementation of MPC in miniaturized devices such as Field Programmable Gate Arrays (FPGAs) and microcontrollers has historically been very limited, due to its implementation complexity and its computation time requirements. At the same time, such embedded technologies have become an enabler for future manufacturing enterprises as well as a transformer of organizations and markets. Recently, advances in microelectronics and software have allowed such techniques to be implemented in embedded systems. In this work, we take advantage of these recent advances to deploy one of the most studied and applied control techniques in industrial engineering. Specifically, in this paper we propose an efficient framework for the implementation of Generalized Predictive Control (GPC) on the STM32 microcontroller. The STM32 Keil starter kit, based on a JTAG interface and the STM32 board, was used to implement the proposed GPC firmware. Besides GPC, a PID anti-windup algorithm was also implemented using Keil development tools designed for ARM processor-based microcontroller devices, working in the C language. A performance comparison study was carried out between the two firmwares; it shows good execution speed and low computational burden. These results encourage the development of simple predictive algorithms to be programmed on industry-standard hardware. The main features of the proposed framework are illustrated through two examples and compared with the anti-windup PID controller.
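For readers unfamiliar with the baseline, a PID controller with anti-windup can be sketched in a few lines. The version below uses conditional integration (clamping), one common anti-windup scheme; the gains, limits, and test plant are illustrative assumptions, not the paper's STM32 C firmware or its tuning.

```python
class PIDAntiWindup:
    """Discrete PID controller with conditional-integration (clamping)
    anti-windup: the integrator is frozen while the actuator is
    saturated, unless the error would drive the output back in range."""

    def __init__(self, kp, ki, kd, dt, u_min, u_max):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.u_min, self.u_max = u_min, u_max
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # Tentative (unsaturated) control signal, including this step's
        # integral contribution.
        u_unsat = (self.kp * error
                   + self.ki * (self.integral + error * self.dt)
                   + self.kd * derivative)
        # Saturate to the actuator limits.
        u = min(max(u_unsat, self.u_min), self.u_max)
        # Commit the integral only when not saturated, or when the
        # error opposes the saturated output (unwinding direction).
        if u == u_unsat or error * u_unsat < 0:
            self.integral += error * self.dt
        return u
```

Driving a simple first-order plant (x' = -x + u) toward a unit setpoint with this controller converges to the reference while the output stays within its limits.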
Abstract: The purpose of this study is to find a natural gait for a biped robot, similar to that of a human being, by analyzing the COG (Center of Gravity) trajectory of human gait. Human gait naturally maintains stability while using minimum energy. This paper aims to find a natural gait pattern for a biped robot that uses minimum energy while maintaining stability, by analyzing the human gait pattern measured from gait images in the sagittal plane and the COG trajectory in the frontal plane. It is not possible to apply the torques of the human joints directly to those of a biped robot, because they have different degrees of freedom; nonetheless, a human and a 5-link biped robot are kinematically similar. We therefore generate the gait pattern of the 5-link biped robot using a genetic algorithm (GA) for gait-pattern adaptation, which utilizes the human ZMP (Zero Moment Point) and the joint torques measured from the human gait pattern. The proposed algorithm creates a fluent biped-robot gait pattern, like that of a human being, and minimizes energy consumption, because the gait pattern of the 5-link biped robot model is derived from the human joint torques in the sagittal plane and the ZMP trajectory in the frontal plane. This paper demonstrates the superiority of the proposed algorithm by evaluating two 5-link biped robots: one using a gait pattern generated in the conventional way with inverse kinematics, and the other using the proposed approach, which takes visual appearance and efficiency into account.
Abstract: In this paper, we investigate a number of Internet congestion control algorithms that have been developed in the last few years. We found that many of these algorithms were designed to treat Internet traffic merely as a train of consecutive packets. A few other algorithms were specifically tailored to handle Internet congestion caused by media traffic carrying audiovisual content; this latter set of algorithms can be considered aware of the nature of the media content. In this context, we briefly explain a number of congestion control algorithms and categorize them into the following two groups: i) media congestion control algorithms, and ii) common congestion control algorithms. We recommend the use of media congestion control algorithms, because they are media content-aware, over the common type of algorithms that manipulate such traffic blindly. We show that the spread of such media content-aware algorithms over the Internet will lead to better congestion control in the coming years, given the observed emergence of the era of digital convergence, in which media traffic will form the majority of Internet traffic.
Abstract: We provide a maximum-norm analysis of a finite element Schwarz alternating method for a nonlinear elliptic boundary value problem of the form -Δu = f(u) on two overlapping subdomains with nonmatching grids. We consider a domain that is the union of two overlapping subdomains, each with its own independently generated grid. Since the two meshes are mutually independent on the overlap region, a triangle belonging to one triangulation does not necessarily belong to the other. Under a Lipschitz assumption on the nonlinearity, we establish, on each subdomain, an optimal L∞ error estimate between the discrete Schwarz sequence and the exact solution of the boundary value problem.
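The structure of the alternating iteration can be illustrated in one dimension: solve on one subdomain with boundary data taken from the current iterate, then on the other, and repeat. The sketch below uses finite differences and nonlinear Gauss-Seidel sweeps on -u'' = f(u) with homogeneous Dirichlet data; this is only a structural analogue of the method, since the paper's setting is finite elements on nonmatching 2D grids.

```python
def schwarz_alternating(f, n=101, overlap=(0.4, 0.6), iters=50, sweeps=200):
    """1D Schwarz alternating sketch for -u'' = f(u), u(0) = u(1) = 0.

    The domain (0, 1) is split into overlapping subdomains (0, b) and
    (a, 1) with (a, b) = `overlap`. Each subdomain solve is done by
    nonlinear Gauss-Seidel sweeps with the interface value frozen at
    the current iterate. Returns the grid and the discrete solution."""
    h = 1.0 / (n - 1)
    x = [i * h for i in range(n)]
    a_idx = round(overlap[0] / h)   # left interface (right subdomain)
    b_idx = round(overlap[1] / h)   # right interface (left subdomain)
    u = [0.0] * n
    for _ in range(iters):
        # Left subdomain (0, b): u[b_idx] acts as Dirichlet data.
        for _ in range(sweeps):
            for i in range(1, b_idx):
                u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f(u[i]))
        # Right subdomain (a, 1): u[a_idx] acts as Dirichlet data.
        for _ in range(sweeps):
            for i in range(a_idx + 1, n - 1):
                u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f(u[i]))
    return x, u
```

With the linear test case f(u) = 1, the iteration converges to the exact solution u(x) = x(1 - x)/2, which the central-difference scheme reproduces exactly at the grid points.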
Abstract: Information technology managers nowadays face tremendous pressure to plan, implement, and adopt new technology solutions due to the rapidity of technological change. Given the lack of studies on this topic, the aim of this paper is to provide a comparative review of the tools currently being used to respond to technological change. The study is based on an extensive review of published literature, most of it ranging from 2000 to the first part of 2011, gathered from journals, books, and other information sources available on the Web. The findings show that each tool has a different focus and that none of the tools provides a holistic framework, which should include the technical, people, process, and business environment aspects. Hence, this result provides useful information about the currently available tools that IT managers could use to manage changes in technology. Further, the result reveals a research gap: industry is short of such a framework.
Abstract: Querying a data source and routing data towards a sink become a serious challenge in static wireless sensor networks if the sink and/or the data source are mobile. Often, the event to be observed either moves or spreads across a wide area, making the maintenance of a continuous path between source and sink a challenge. In addition, the sink may move while a query is being issued or while data is on its way towards it. In this paper, we extend our previously proposed Grid Based Data Dissemination (GBDD) scheme, a virtual-grid-based topology management scheme that restricts the impact of the movement of sinks and events to specific cells of the grid. This obviates the need for frequent path modifications and hence maintains a continuous flow of data while minimizing the network's energy consumption. Simulation experiments show significant improvements in network energy savings and in the average delay for a packet to reach the sink.
Abstract: Cell phone forensics, which acquires and analyzes the data in a cellular phone, is nowadays used by national investigation organizations and private companies. There are two methods for collecting cell phone flash memory data. The first is a logical method, which acquires files and directories from the file system of the cell phone flash memory. The second obtains all the data from a bit-by-bit copy of the entire physical memory using a low-level access method. In this paper, we describe a forensic tool that acquires cell phone flash memory data using the logical-level approach. With our tool, we can obtain the EFS file system and inspect memory data in an arbitrary region of Korean CDMA cell phones.
Abstract: Intensive changes in the environment and strong market competition have raised the management of information and knowledge to the strategic level of companies. In a knowledge-based economy, only those organizations that have up-to-date, specialized knowledge, and are able to exploit and develop it, are capable of surviving. Companies have to know what knowledge they have, by taking a survey of their organizational knowledge, and they have to fix current and additional knowledge in organizational memory. The question is how to identify, acquire, fix, and use knowledge effectively. The paper shows that, over and above the information technology tools that support the acquisition, storage, and use of information, and organizational learning together with the knowledge arising from it, the fixing and storage of knowledge in a company's memory play an important role in the intelligence of organizations and the competitiveness of a company.
Abstract: The research presented in this paper is an ongoing project applying neural network and fuzzy models to evaluate the sociological factors that affect the educational performance of students in Sri Lanka. One of its major goals is to prepare the ground for devising a counseling tool that helps these students perform better in their examinations, especially the G.C.E O/L (General Certificate of Education, Ordinary Level) examination. Closely related sociological factors are collected as raw data, the noise in these data is filtered through the fuzzy interface, and the supervised neural network is used to recognize performance patterns against the chosen social factors.
Abstract: A multicriteria linear programming problem with integer variables and a parameterized optimality principle, "from lexicographic to Slater", is considered. We study a situation in which the initial coefficients of the penalty cost functions are not fixed but may potentially be subject to variations. For any efficient solution, appropriate measures of quality are introduced that incorporate information about variations of the penalty cost function coefficients. These measures correspond to the so-called stability and accuracy functions defined earlier for efficient solutions of a generic multicriteria combinatorial optimization problem with Pareto and lexicographic optimality principles. Various properties of such functions are studied, and the maximum norms of perturbations for which an efficient solution preserves the property of being efficient are calculated.
Abstract: Continuity of supply to electrical installations is becoming one of the main requirements of the electric supply network (generation, transmission, and distribution of electric energy). Meeting this requirement depends on the one hand on the structure of the electric network, and on the other hand on the availability of the reserve source provided to maintain the supply in case of failure of the principal one. The availability of supply depends not only on the reliability parameters of both sources (principal and reserve) but also on the reliability of the circuit breaker that interlocks the reserve source when the principal one fails. In addition, since the principal source is in operation, its monitoring can be ideal and certain; for the reserve source, however, which is at standstill, preventive maintenance carried out at given time intervals (periodicity) and for well-defined lengths of time is envisaged, so that this source is always available in case of failure of the principal source. The chosen periodicity of preventive maintenance of the reserve source directly influences the reliability of the electric feeder system. In this work, on the basis of semi-Markov processes, the influence of the time taken to interlock the reserve source on the reliability of an industrial electric network is studied, and the optimal interlocking time in case of failure of the principal source is given; the influence of the periodicity of preventive maintenance of the reserve source is also studied, and the optimal periodicity is given.
Abstract: The primary purpose of this article is to explore the implications of globalization for education. Globalization plays an important role as a process in the economic, political, cultural, and technological dimensions of contemporary human life, and is affected by them. Education takes part in this process: while influencing it by educating global citizens with universal human features and characteristics, it has also been influenced by this phenomenon. Nowadays, the role of education is not just to develop in students the knowledge and skills necessary for new kinds of jobs. If education is to help students be prepared for the new global society, it has to make them engaged, productive, and critical citizens for the global era, so that they can reflect on their roles as key actors in a dynamic, often uneven, matrix of economic and cultural exchanges. If education is to reinforce and strengthen national identity and the value system of children and teenagers, it should make them ready for living in the global era of this century. The method used in this research is documentary analysis. Studies in this field show that globalization influences the processes of production, distribution, and consumption of knowledge. Occurring in the information era, this has not only provided the necessary opportunities for educational exchange worldwide but also offers advantages to developing countries, enabling them to strengthen the educational bases of their societies and take an important step toward their future.
Abstract: The success of an e-learning system is highly dependent on the quality of its educational content and on how effective, complete, and simple the design tool can be for teachers. Educational modeling languages (EMLs) are proposed as design languages intended for teachers, for modeling diverse teaching-learning experiences independently of the pedagogical approach and in different contexts. However, most existing EMLs are criticized for being too abstract and too complex to be understood and manipulated by teachers. In this paper, we present a visual EML that simplifies the process of designing learning scenarios for teachers with no programming background. Based on the conceptual framework of activity theory, our visual EML uses domain-specific modeling techniques to provide a pedagogical level of abstraction in the design process.