Abstract: This paper provides an in-depth study of Wireless
Sensor Network (WSN) application to monitor and control the
swiftlet habitat. A complete system is designed and developed
that includes the hardware design of the nodes, Graphical User
Interface (GUI) software, sensor network, and interconnectivity for
remote data access and management. System architecture is proposed
to address the requirements for habitat monitoring. Such
application-driven design identifies important areas for further work
in data sampling, communications and networking. For this
monitoring system, an MTS400 sensor board, IRIS and MICAz radio
transceivers, and a USB-interfaced gateway base station from Crossbow
(Xbow) Technology are employed. The GUI of this monitoring
system is written using a Laboratory Virtual Instrumentation
Engineering Workbench (LabVIEW) along with Xbow Technology
drivers provided by National Instruments. As a result, this monitoring
system is capable of collecting data and presenting it in both tables
and waveform charts for further analysis. This system is also able to
send notification messages by email, provided Internet connectivity is
available, whenever changes in the habitat occur at remote sites
(swiftlet farms). Other functions implemented in this system include a
database for record keeping and management, and remote access
through the Internet using the LogMeIn software. Finally, this
research concludes that a WSN can be effectively used to monitor and
manage the swiftlet farming industry in Sarawak.
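The email notification described above can be sketched as a simple threshold check over sensor readings. This is an illustrative sketch only: the parameter names, limits, and addresses are assumptions, not values from the paper, and actual delivery would use an SMTP connection when Internet access is available.

```python
from email.message import EmailMessage

# Hypothetical habitat limits; the paper's actual thresholds for
# swiftlet farms are not given in the abstract.
LIMITS = {"temperature_c": (26.0, 30.0), "humidity_pct": (80.0, 95.0)}

def out_of_range(reading):
    """Return the parameters whose values fall outside LIMITS."""
    return [k for k, (lo, hi) in LIMITS.items()
            if k in reading and not lo <= reading[k] <= hi]

def build_alert(reading, site="swiftlet-farm-1"):
    """Compose (but do not send) an email alert for abnormal readings."""
    bad = out_of_range(reading)
    if not bad:
        return None
    msg = EmailMessage()
    msg["Subject"] = f"Habitat alert at {site}: {', '.join(bad)}"
    msg["From"] = "wsn-gateway@example.com"   # placeholder address
    msg["To"] = "farm-manager@example.com"    # placeholder address
    msg.set_content("\n".join(f"{k} = {reading[k]}" for k in bad))
    return msg

# Sending would use smtplib.SMTP(...).send_message(msg) once
# Internet connectivity is available, as the abstract describes.
alert = build_alert({"temperature_c": 33.5, "humidity_pct": 90.0})
```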
Abstract: Transient simulation of power electronic circuits is of
considerable interest to the designer. The switching nature of the
devices used permits development of specialized algorithms which
allow a considerable reduction in simulation time compared to
general purpose simulation algorithms. This paper describes a
method for simulating power electronic circuits using the
SIMULINK toolbox within the MATLAB software. Theoretical results
are presented that provide the basis for transient analysis of power
electronic circuits.
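The speed-up such specialized algorithms obtain can be illustrated with a piecewise-exact simulation: between switching instants the circuit is linear, so each interval can be advanced with its closed-form solution instead of many small integration steps. The circuit below (an RC load behind an ideal chopper switch) is a hypothetical example, not the paper's test circuit.

```python
import math

def simulate_rc_chopper(V=10.0, R=1.0, C=1e-3, d=0.5, T=1e-3, cycles=5):
    """Piecewise-exact simulation of an RC load fed through an ideal
    chopper: the switch is closed for d*T (capacitor charges toward V)
    and open for (1-d)*T (capacitor discharges through R). Each
    interval uses the closed-form exponential solution, so only two
    evaluations per switching period are needed.
    Returns the capacitor voltage at the end of each period."""
    tau = R * C
    vc, trace = 0.0, []
    for _ in range(cycles):
        # switch closed: vc -> V with time constant tau
        vc = V + (vc - V) * math.exp(-d * T / tau)
        # switch open: vc decays toward 0
        vc = vc * math.exp(-(1 - d) * T / tau)
        trace.append(vc)
    return trace
```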
Abstract: Sedimentation is a hydraulic phenomenon that is
emerging as a serious challenge in river engineering. When the flow
reaches a state in which it has gathered sufficient potential energy, it
shifts the sediment load along the channel bed. The transport of such
material can be in the form of suspended and bed loads. The
movement of these loads along river courses and channels, and the
ways in which they can influence water intakes, is considered one of
the major challenges for sustainable O&M of hydraulic structures.
This can be very serious in arid and semi-arid regions like Iran, where
inappropriate watershed management can lead to shifting a great deal
of sediment into reservoirs and irrigation systems. This paper aims to
investigate sedimentation in the Western Canal of the Dez Diversion
Weir in Iran, to identify factors that influence the process, and to
provide ways to mitigate its detrimental effects by using the SHARC
software.
For the purpose of this paper, data from the Dezful water authority
and Dezful Hydrometric Station pertinent to a river course of about
6 km were used.
Results estimated sand and silt bed-load concentrations at 193 ppm
and 827 ppm, respectively. Given the available data on average
annual bed loads and average suspended sediment loads of 165 ppm
and 837 ppm, there was a significant statistical difference (16%)
between the sand grains, whereas no significant difference (1.2%)
was found in the silt grain sizes. One explanation for this finding is
that along the 6 km river course there were considerable
meandering effects, which explain the recent shift in hydraulic
behavior along the stream course under investigation. The sand
concentration downstream, relative to the present state of the canal,
showed a steep descending curve. Sediment trapping, on the other
hand, indicated a steep ascending curve. These occurred because the
diversion weir was not considered in the simulation model.
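The 16% and 1.2% figures above are consistent with a relative difference taken against the mean of the simulated and measured concentrations. This is an assumption about how the percentages were computed, since the abstract does not state the formula:

```python
def pct_diff(simulated, measured):
    """Relative difference in percent, taken against the mean of the
    two values (one plausible reading of the abstract's statistics)."""
    mean = (simulated + measured) / 2
    return abs(simulated - measured) / mean * 100

sand = pct_diff(193, 165)   # simulated vs. measured sand load, ppm
silt = pct_diff(827, 837)   # simulated vs. measured silt load, ppm
```

With these inputs the sand difference comes out near 16% and the silt difference near 1.2%, matching the reported values.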
Abstract: In this paper, we propose a hardware and software
design method for automotive Electronic Control Units (ECU)
considering functional safety. The proposed ECU is intended for
application to Electro-Mechanical Actuator systems, and the
validity of the design method is shown by applying it to the
Electro-Mechanical Brake (EMB) control system which is used as a
brake actuator in Brake-By-Wire (BBW) systems. The importance of a
functional safety-based design approach to EMB ECU design has been
emphasized because of its safety-critical functions, which are executed
with the aid of many electric actuators, sensors, and application
software. Based on hazard analysis and risk assessment according to
ISO 26262, the EMB system should be ASIL-D-compliant, the highest
ASIL level. To this end, an external signature watchdog and an
Infineon 32-bit microcontroller TriCore are used to reduce risks
considering common-cause hardware failure. Moreover, a software
design method is introduced for implementing functional
safety-oriented monitoring functions based on an asymmetric dual
core architecture considering redundancy and diversity. The validity
of the proposed ECU design approach is verified by using the EMB
Hardware-In-the-Loop (HILS) system, which consists of the EMB
assembly, actuator ECU, a host PC, and a few debugging devices.
Furthermore, it is shown that the existing sensor fault tolerant control
system can be used more effectively for mitigating the effects of
hardware and software faults by applying the proposed ECU design
method.
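The redundancy-and-diversity idea behind the asymmetric dual-core monitoring can be sketched as two diversely implemented computations of the same brake command that are cross-checked at runtime. This is a minimal illustration of the concept only; the functions, pedal-to-force map, and tolerance below are hypothetical, not the paper's implementation.

```python
def clamp_force_primary(pedal_pct):
    """Primary channel: hypothetical linear pedal-to-clamp-force map (N)."""
    return 0.0 if pedal_pct <= 0 else min(pedal_pct, 100.0) * 50.0

def clamp_force_monitor(pedal_pct):
    """Monitoring channel: diverse implementation via a lookup table
    with linear interpolation, intended to catch common-cause
    software faults in the primary computation."""
    table = [(0.0, 0.0), (50.0, 2500.0), (100.0, 5000.0)]
    p = max(0.0, min(pedal_pct, 100.0))
    for (x0, y0), (x1, y1) in zip(table, table[1:]):
        if x0 <= p <= x1:
            return y0 + (y1 - y0) * (p - x0) / (x1 - x0)

def cross_check(pedal_pct, tol_n=10.0):
    """Compare both channels; disagreement beyond tol_n would flag a
    fault and trigger the safe state (analogous to watchdog action)."""
    a = clamp_force_primary(pedal_pct)
    b = clamp_force_monitor(pedal_pct)
    return abs(a - b) <= tol_n
```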
Abstract: In this research work, a novel parallel manipulator
with high positioning and orienting rate is introduced. This
mechanism has two rotational and one translational degree of
freedom. Kinematics and Jacobian analyses are investigated.
Moreover, workspace analysis and optimization have been performed
using the Genetic Algorithm Toolbox in the MATLAB software.
Because of the reduced number of moving elements, much better
dynamic performance is expected with respect to counterpart
mechanisms with the same degrees of freedom. In addition, using a
combination of cylindrical and revolute joints increases the
mechanism's ability to achieve a more extended workspace.
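The paper uses MATLAB's Genetic Algorithm Toolbox; the same workflow can be sketched in a few lines with a tiny real-coded GA. The objective below is a hypothetical stand-in for the manipulator's workspace measure, not the paper's actual objective function.

```python
import random

def workspace_score(link_lengths):
    """Hypothetical stand-in objective: rewards longer links but
    penalizes spread between them (a real objective would come from
    the manipulator's workspace analysis). Maximum is 2.0 on [0.1, 1]."""
    l1, l2 = link_lengths
    return l1 + l2 - abs(l1 - l2)

def genetic_optimize(score, bounds=(0.1, 1.0), pop=30, gens=60, seed=1):
    """Tiny real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, with clipping to the design bounds."""
    rng = random.Random(seed)
    lo, hi = bounds
    clip = lambda x: max(lo, min(hi, x))
    P = [(rng.uniform(lo, hi), rng.uniform(lo, hi)) for _ in range(pop)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            a = max(rng.sample(P, 3), key=score)   # tournament parent 1
            b = max(rng.sample(P, 3), key=score)   # tournament parent 2
            w = rng.random()
            child = tuple(clip(w * x + (1 - w) * y + rng.gauss(0, 0.02))
                          for x, y in zip(a, b))
            nxt.append(child)
        P = nxt
    return max(P, key=score)

best = genetic_optimize(workspace_score)
```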
Abstract: Estimating the time and cost of work completion in a
project, and following them up during execution, contribute to the
success or failure of a project and are very important to the project
management team. Delivering on time and within budgeted cost
requires managing and controlling the projects well. To deal with the
complex task of controlling and modifying the baseline project
schedule during execution, earned value management systems have
been set up and are widely used to measure and communicate the real
physical progress of a project. However, they often fail to predict the
total duration of the project. In this paper, data mining techniques are
used to predict the total project duration in terms of the Time
Estimate At Completion, EAC(t). For this purpose, we have used a
project with 90 activities, updated day by day. The regular indexes
from the literature and the Earned Duration Method are then used to
calculate the time estimate at completion; these are set as input data
for prediction, and the major parameters among them are identified
using the Clem software. By using data mining, the effective
parameters on EAC(t), and the relationships between them, can be
extracted, which is very useful for managing a project with minimum
delay risk. As we state, this can be a simple, safe and applicable
method for predicting the completion time of a project during
execution.
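The earned-duration forecast can be written in a few lines. The formulas below are the standard Earned Duration Method relations from the earned value literature; the paper's exact index set may differ.

```python
def eac_t(planned_duration, actual_time, ev, pv, performance_factor=None):
    """Time Estimate At Completion, EAC(t), via the Earned Duration
    Method: ED = AT * SPI with SPI = EV / PV, and
    EAC(t) = AT + (PD - ED) / PF.
    PF = 1 assumes past schedule performance does not persist;
    PF = SPI (the default here) assumes it does, in which case the
    formula reduces to EAC(t) = PD / SPI."""
    spi = ev / pv
    ed = actual_time * spi
    pf = spi if performance_factor is None else performance_factor
    return actual_time + (planned_duration - ed) / pf

# Hypothetical status: PD = 100 days, AT = 40 days, EV = 30, PV = 40,
# so SPI = 0.75 (the project is running behind schedule).
optimistic = eac_t(100, 40, 30, 40, performance_factor=1)  # 110 days
trend      = eac_t(100, 40, 30, 40)                        # ~133.3 days
```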
Abstract: Nowadays, HPC, Grid and Cloud systems are evolving
very rapidly. However, the development of infrastructure solutions
related to HPC is lagging behind. While the existing infrastructure is
sufficient for simple cases, many computational problems have more
complex requirements. Such computational experiments use different
resources simultaneously to start a large number of computational
jobs. These resources are heterogeneous: they have different
purposes, architectures, performance and installed software. Users
need a convenient tool that allows them to describe and run complex
computational experiments in an HPC environment.
This paper introduces a modular workflow system called SEGL,
which makes it possible to run complex computational experiments
in a real HPC organization. The system can be used in a great number
of organizations that provide HPC power. Important requirements for
this system are high efficiency and interoperability with the
organization's existing HPC infrastructure without any changes.
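At its core, a workflow system like the one described must order dependent jobs and dispatch them to heterogeneous resources. The sketch below illustrates only that dependency-ordering step with Python's standard library; the experiment, job names, and resource mapping are hypothetical and unrelated to SEGL's actual interface.

```python
from graphlib import TopologicalSorter

# Hypothetical experiment: job name -> set of jobs it depends on.
jobs = {
    "preprocess": set(),
    "sim_a": {"preprocess"},
    "sim_b": {"preprocess"},
    "collect": {"sim_a", "sim_b"},
}

# Hypothetical mapping of jobs to heterogeneous resources, standing in
# for the differing purposes/architectures the abstract mentions.
resource_of = {"preprocess": "cluster-x86", "sim_a": "cluster-x86",
               "sim_b": "gpu-partition", "collect": "cluster-x86"}

def run_order(dag):
    """Return a dependency-respecting execution order for the jobs."""
    return list(TopologicalSorter(dag).static_order())

order = run_order(jobs)  # e.g. preprocess first, collect last
```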
Abstract: This study has applied the L16 orthogonal array of the
Taguchi method to determine the optimized polymeric
nanocomposite asphalt binder. Three control factors are defined:
polypropylene plastomer (PP), styrene-butadiene-styrene elastomer
(SBS) and nanoclay. Four concentration levels are introduced for the
prepared asphalt binder samples. All samples were prepared with
4.5% bitumen 60/70 content. Compressive strength tests were carried
out to identify the optimized sample via the QUALITEK-4 software.
SBS at 3%, PP at 5% and nanoclay at 1.5% concentration are
identified as the optimized nanocomposite asphalt binder. The
confirmation compressive strength and softening point tests showed
that modifying asphalt binders with this method improved the
compressive strength and softening points of the asphalt binders by
up to 55%.
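Taguchi analysis (which QUALITEK-4 automates) ranks factor levels by signal-to-noise ratio. For a response like compressive strength, the larger-the-better form applies; the replicate data below are hypothetical, for illustration only.

```python
import math

def sn_larger_is_better(values):
    """Taguchi signal-to-noise ratio for a larger-the-better response
    (e.g. compressive strength): S/N = -10 * log10(mean(1/y^2))."""
    return -10 * math.log10(sum(1 / y**2 for y in values) / len(values))

# Hypothetical compressive-strength replicates (MPa) for two trial
# binder formulations of the L16 array; the higher S/N wins.
trial_1 = sn_larger_is_better([4.1, 4.3, 4.2])
trial_2 = sn_larger_is_better([5.0, 5.2, 5.1])
```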
Abstract: Design is the primordial part of the realization of
a computer system. Several tools have been used to help designers
describe their software. These tools enjoyed great success in the
relational database domain, since they can generate SQL scripts
modeling the database from an Entity/Association model. However,
with the evolution of the computing domain, relational databases
have shown their limits and the object-relational model has come into
increasing use. Current design tools do not support all the new
concepts introduced by this model, nor the syntax of the SQL3
language. We propose in this paper a tool, called "NAVIGTOOLS",
that assists in the design and implementation of object-relational
databases and allows the user to generate a script modeling the
database in the SQL3 language. This tool is based on the
Entity/Association and navigational models for modeling
object-relational databases.
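The kind of SQL3 (SQL:1999 object-relational) script such a tool emits can be sketched from a simple entity description. The generator and the Oracle-style object-type syntax below are illustrative assumptions, not NAVIGTOOLS' actual output format.

```python
def entity_to_sql3(name, attrs):
    """Emit a minimal SQL3 script for an entity given as
    (attribute, sql_type) pairs: an object type plus a typed table."""
    cols = ",\n  ".join(f"{a} {t}" for a, t in attrs)
    return (f"CREATE TYPE {name}_t AS OBJECT (\n  {cols}\n);\n"
            f"CREATE TABLE {name} OF {name}_t;")

script = entity_to_sql3("person", [("id", "NUMBER"), ("name", "VARCHAR(40)")])
```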
Abstract: In this paper, we first consider the quality-of-service
problems in heterogeneous wireless networks for sending video
data, for which the real-time requirement is pronounced. We then
present a method for ensuring end-to-end quality of service at the
application-layer level for adaptive sending of video data over
heterogeneous wireless networks. To do this, mechanisms at different
layers have been used: the stop mechanism, the adaptation
mechanism and graceful degradation at the application layer, the
multi-level congestion feedback mechanism in the network layer, and
the connection cut-off decision mechanism in the link layer. Finally,
the presented method and the achieved improvement are simulated
and presented using the NS-2 software.
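The application-layer adaptation idea can be sketched as a mapping from a multi-level congestion feedback signal to a video bitrate, with graceful degradation and a stop level. The bitrate ladder, level semantics, and thresholds below are assumptions for illustration; the paper's actual mechanisms are not specified in the abstract.

```python
# Hypothetical bitrate ladder (kbit/s), highest quality first.
LADDER = [1500, 1000, 600, 300]

def adapt_rate(congestion_level, current_idx):
    """Multi-level congestion feedback -> graceful degradation:
    level 0 = no congestion (step quality back up),
    levels 1-2 = step down by that many rungs,
    level 3 = severe congestion (stop sending: return None)."""
    if congestion_level >= 3:
        return None                       # stop mechanism
    if congestion_level == 0:
        return max(current_idx - 1, 0)    # recover quality slowly
    return min(current_idx + congestion_level, len(LADDER) - 1)

idx = adapt_rate(2, 0)   # heavy congestion: drop from 1500 to 600 kbit/s
```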
Abstract: In this study, a 3D combustion chamber was simulated
using FLUENT 6.32. The aim is to obtain detailed information on
combustion characteristics and nitrogen oxides in the furnace, and on
the effect of oxygen enrichment on the combustion process.
Oxygen-enriched combustion is an effective way to reduce emissions.
This paper analyzes NO emission, including thermal NO and prompt
NO. The flow rate ratio of air to fuel is varied as 1.3, 3.2 and 5.1, and
the oxygen-enriched flow rates are 28, 54 and 68 L/min. The 3D
Reynolds-Averaged Navier-Stokes (RANS) equations with the
standard k-ε turbulence model are solved together by the FLUENT
6.32 software. A first-order upwind scheme is used to discretize the
governing equations, and the SIMPLE algorithm is used for
pressure-velocity coupling. Results show that for AF = 1.3, increasing
the oxygen flow rate at the lance reduces NO emissions. Moreover, in
a fixed oxygen enrichment condition, increasing the air-to-fuel ratio
will increase the temperature peak, but not the NO emission rate. As
a result, oxygen enrichment can reduce the NO emission in this kind
of furnace at low air-to-fuel rates.
Abstract: Distributed Power generation has gained a lot of
attention in recent times due to constraints associated with
conventional power generation and new advancements in DG
technologies. The need to operate the power system economically
and with optimum levels of reliability has further increased interest
in Distributed Generation. However, it is important to place a
Distributed Generator at an optimal location so that the purposes of
loss minimization and voltage regulation are duly served on the
feeder. This paper investigates the impact of DG unit installation on
the electric losses, reliability and voltage profile of distribution
networks. Our aim is to find the optimal distributed generation
allocation for loss reduction, subject to the constraint of voltage
regulation in the distribution network. The system is further analyzed
for increased levels of reliability. A Distributed Generator offers the
additional advantage of increased reliability, as suggested by
improvements in various reliability indices such as SAIDI, CAIDI
and AENS. Comparative studies are performed and the related results
are addressed. An analytical technique is used to find the optimal
location of the Distributed Generator. The suggested technique is
programmed in the MATLAB software. The results
clearly indicate that DG can reduce the electrical line loss while
simultaneously improving the reliability of the system.
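The reliability indices cited above follow standard definitions (IEEE 1366): SAIDI is customer-weighted outage duration, CAIDI is SAIDI/SAIFI, and AENS is energy not supplied per customer. The feeder data below are hypothetical, purely to show the arithmetic.

```python
def reliability_indices(customers_total, interruptions):
    """Compute SAIFI, SAIDI, CAIDI and AENS from interruption events
    given as (customers_affected, duration_h, energy_not_supplied_kwh)
    tuples, following the IEEE 1366 definitions."""
    ni = sum(c for c, _, _ in interruptions)
    saifi = ni / customers_total                                  # int./cust.
    saidi = sum(c * d for c, d, _ in interruptions) / customers_total  # h/cust.
    caidi = saidi / saifi if saifi else 0.0                       # h/int.
    aens = sum(e for _, _, e in interruptions) / customers_total  # kWh/cust.
    return saifi, saidi, caidi, aens

# Hypothetical feeder with 1000 customers and two outages.
saifi, saidi, caidi, aens = reliability_indices(
    1000, [(200, 2.0, 500.0), (100, 1.0, 150.0)])
```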
Abstract: Learning Management Systems provide a learning
environment which offers a collection of e-learning tools in a
package that allows a common interface and information sharing
among the tools. South East European University's initial experience
with an LMS was the usage of the commercial LMS ANGEL. After a
three-year experience of ANGEL usage, because its expenses were
very high, it was decided to develop our own software. As part
of the research project team for the in-house design and development
of the new LMS, we primarily had to select the features that would
cover our needs and also comply with the actual trends in the area of
software development, and then design and develop the system. In
this paper, we present the process of in-house LMS development for
South East European University, and its architecture, conception and
strengths, with a special accent on the process of migration and
integration with other enterprise applications.
Abstract: Air conditioning is mainly used as a human comfort
cooling medium. It is used more in high-temperature countries such
as Malaysia. Proper estimation of the cooling load will achieve the
ideal temperature; improper estimation can lead to over-estimation or
under-estimation. The ideal temperature should be comfortable
enough. This study develops a program to calculate the ideal cooling
load demand, which is matched with the heat gain. Through this
study, it is easy to calculate the cooling load estimate. The objective
of this study is to develop a user-friendly and easily accessible
cooling load program. This is to ensure that the cooling load can be
estimated by any individual rather than by using rules of thumb. The
software is developed using the MATLAB GUI. The development is
only valid for common buildings in Malaysia. An office building was
selected as a case study to verify the applicability and accuracy of the
developed software. In conclusion, the main objective has been
achieved: the developed software is user friendly and can easily
estimate the cooling load demand.
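The heat-gain matching described above reduces, in its simplest sensible-load form, to summing conduction gains (Q = U·A·ΔT) and internal gains. The sketch below uses illustrative values only; a real estimate would follow an established procedure such as the ASHRAE CLTD method, and the building data here are hypothetical.

```python
def cooling_load_w(surfaces, occupants, equipment_w, lighting_w):
    """Rough sensible cooling load in watts: conduction through each
    surface (Q = U * A * dT) plus internal gains from people,
    equipment and lighting."""
    conduction = sum(u * a * dt for u, a, dt in surfaces)
    people = occupants * 75.0   # ~75 W sensible per seated person (assumed)
    return conduction + people + equipment_w + lighting_w

# Hypothetical small Malaysian office: two walls and a roof, each as
# (U in W/m^2.K, area in m^2, indoor-outdoor dT in K).
load = cooling_load_w(
    surfaces=[(2.5, 30.0, 8.0), (2.5, 24.0, 8.0), (0.8, 50.0, 10.0)],
    occupants=6, equipment_w=900.0, lighting_w=400.0)
```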
Abstract: The article investigates how 14- to 15-year-olds build informal conceptions of inferential statistics as they engage in a modelling process and build their own computer simulations with dynamic statistical software. This study proposes four primary phases of informal inferential reasoning for students in the statistical modelling and simulation process. Findings show shifts in the conceptual structures across the four phases and point to the potential of all of these phases for fostering the development of students' robust knowledge of the logic of inference when using computer-based simulations to model and investigate statistical questions.
Abstract: This paper presents a proposed design for transcutaneous
inductive powering links. The design is used to transfer power and
data to implanted devices, such as implanted microsystems for
stimulating and monitoring nerves and muscles. The system operates
at the low-band frequency of 13.56 MHz, in the
industrial-scientific-medical (ISM) band, to avoid tissue heating. For
the external part, the modulation index is 13% and the modulation
rate is 7.3%, with a data rate of 1 Mbit/s assuming Tbit = 1 us. The
system has been designed using a 0.35-μm CMOS fabrication
technology. The mathematical model is given, and the design is
simulated using the OrCAD PSpice 16.2 software tool; for real-time
simulation, the electronic workbench MULTISIM 11 has been used.
The novel circular planar (pancake) coils were simulated using the
ANSOFT HFSS software.
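A basic design step for such a link is tuning the coil to the 13.56 MHz carrier with a resonant capacitor, C = 1/((2πf)²L). The coil inductance below is a hypothetical value; the paper's coil parameters are not given in the abstract.

```python
import math

F_ISM = 13.56e6  # Hz, ISM-band carrier used by the link

def tuning_capacitance(l_henry, f_hz=F_ISM):
    """Capacitance that resonates a coil of inductance L at f:
    C = 1 / ((2*pi*f)^2 * L)."""
    return 1.0 / ((2 * math.pi * f_hz) ** 2 * l_henry)

# Hypothetical 2.2 uH pancake coil; the result is on the order of
# tens of picofarads.
c = tuning_capacitance(2.2e-6)
```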
Abstract: This paper presents the findings of two experiments that were performed on the Redundancy in Wireless Connection Model (RiWC) using the 802.11b standard. The experiments were simulated using the OPNET 11.5 Modeler software. The first was aimed at finding the maximum number of simultaneous Voice over Internet Protocol (VoIP) users the model would support under the G.711 and G.729 codec standards when the packetization interval was 10 milliseconds (ms). The second experiment examined the model's VoIP user capacity using the G.729 codec standard along with background traffic, using the same packetization interval as in the first experiment. To determine the capacity of the model under the various experiments, we checked three metrics: jitter, delay and data loss. When background traffic was added, we checked the response time in addition to the previous three metrics. The findings of the first experiment indicated that with the G.711 codec the maximum number of simultaneous VoIP users the model was able to support was 5, which is consistent with recent research findings. When using the G.729 codec, the model was able to support up to 16 VoIP users; similar experiments in the current literature have indicated a maximum of 7 users. The findings of the second experiment demonstrated that the maximum number of VoIP users the model was able to support was 12 in the presence of background traffic.
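Of the metrics checked above, jitter has a standard definition worth making concrete: the RFC 3550 interarrival jitter estimator, a running smoothed average of transit-time differences. OPNET's reported statistic may be defined slightly differently; the transit times below are hypothetical.

```python
def interarrival_jitter(transit_times_ms):
    """RFC 3550 interarrival jitter: for each consecutive packet pair,
    D = |difference in one-way transit time|, and the estimate is
    updated as J += (D - J) / 16."""
    j = 0.0
    for prev, cur in zip(transit_times_ms, transit_times_ms[1:]):
        d = abs(cur - prev)
        j += (d - j) / 16.0
    return j

# Hypothetical one-way transit times (ms) for successive VoIP packets.
jitter = interarrival_jitter([20.0, 22.0, 21.0, 25.0, 24.0])
```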
Abstract: Nowadays, the rapid development of multimedia and the
Internet allows for wide distribution of digital media data. It has
become much easier to edit, modify and duplicate digital
information. Besides that, digital documents are also easy to copy
and distribute, and therefore face many threats. Security and privacy
are a big issue: with the large flood of information and the
development of digital formats, it has become necessary to find
appropriate protection, because of the significance, accuracy and
sensitivity of the information. Nowadays, protection systems are
classified more specifically as information hiding, information
encryption, and combinations of hiding and encryption to increase
information security. The strength of the information hiding science
is due to the non-existence of standard algorithms to be used in
hiding secret messages. There is also randomness in hiding methods,
such as combining several media (covers) with different methods to
pass a secret message. In addition, there are no formal methods to be
followed to discover hidden data. For these reasons, the task of this
research is difficult. In this paper, a new information hiding system is
presented. The proposed system aims to hide information (a data file)
in any executable file (EXE) and to detect the hidden file; we present
an implementation of a steganography system which embeds
information in executable files. (EXE) files have been investigated.
The system tries to find a solution to the size of the cover file and to
make it undetectable by anti-virus software. The system includes two
main functions. The first is hiding the information in a Portable
Executable file (EXE) through the execution of four processes
(specify the cover file, specify the information file, encrypt the
information, and hide the information). The second function is
extracting the hidden information through three processes (specify
the stego file, extract the information, and decrypt the information).
The system has achieved its main goals: the size of the cover file is
made independent of the size of the information, and the resulting
file does not cause any conflict with anti-virus software.
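The abstract does not specify the embedding method, so the sketch below uses one common, simple technique: appending an encrypted payload after the executable image behind a marker (appended bytes are outside the sections a PE loader maps, so the program still runs, and cover size stays independent of payload size). The marker, format, and toy XOR cipher are all assumptions, not the paper's scheme.

```python
MAGIC = b"STEGO1"  # hypothetical marker; the paper's format is not given

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'encryption' standing in for the paper's unspecified
    encryption step; it is symmetric, so it also decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def hide(cover: bytes, secret: bytes, key: bytes) -> bytes:
    """Append marker + 4-byte length + encrypted payload to the cover
    file bytes. The cover's own bytes are left untouched."""
    enc = xor_crypt(secret, key)
    return cover + MAGIC + len(enc).to_bytes(4, "big") + enc

def extract(stego: bytes, key: bytes) -> bytes:
    """Locate the marker, read the payload length, decrypt the payload."""
    i = stego.rindex(MAGIC) + len(MAGIC)
    n = int.from_bytes(stego[i:i + 4], "big")
    return xor_crypt(stego[i + 4:i + 4 + n], key)
```

A round trip (`extract(hide(cover, secret, key), key)`) returns the original secret, and the cover prefix of the stego file is byte-identical to the input.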
Abstract: Components of a software system may be related in a
wide variety of ways. These relationships need to be represented in
software architecture in order to develop quality software. In
practice, software architecture is immensely challenging, strikingly
multifaceted, extravagantly domain based, perpetually changing,
rarely cost-effective, and deceptively ambiguous. This paper analyses
relations among the major components of software systems and
argues for using several broad categories of software architecture for
assessment purposes: strongly adequate, weakly adequate and
functionally adequate software architectures, among other categories.
These categories are intended for formative assessments of
architectural designs.
Abstract: Analytical investigation of the sedimentation
processes in the river engineering and hydraulic structures is of vital
importance as this can affect water supply for the cultivating lands in
the command area. The reason is that gradual sediment accumulation
behind the reservoir can reduce the nominal capacity of these dams.
The aim of the present paper is to analytically investigate the
sedimentation process along the river course and behind storage
reservoirs in general, and at the Eastern Intake of the Dez Diversion
Weir in particular, using the SHARC software. Results of the model
indicated a water level of 115.97 m, whereas the real-time
measurement from the river cross section was 115.98 m, which
suggests a very close agreement between them. The average diameter
of the transported sediment load in the river was measured at
0.25 mm, from which it can be concluded that nearly 100% of the
suspended loads in the river are moving, which suggests no sediment
settling but indicates that almost all sediment loads enter the intake.
It was further shown that the average sediment diameter entering the
intake is 0.293 mm, which in turn suggests that about 85% of
suspended sediments in the river enter the intake. Comparison of the
results from the SHARC model with those obtained from the SSIIM
software suggests quite similar outputs, but distinguishes the SHARC
model as more appropriate for the analysis of simpler problems than
the other model.