Abstract: We present BeeBot, the Binus Multi-client Intelligent Telepresence Robot, a custom-built robot system specifically designed for teleconferencing with multiple people using omnidirectional actuators. The robot is controlled over a computer network, so a manager or supervisor can direct the robot to the intended person to start a discussion or inspection. People tracking and autonomous navigation are intelligent features of this robot. We built a web application for controlling the multi-client telepresence robot and used an open-source teleconferencing system. Experimental results are presented and the system's performance is evaluated.
Abstract: This work presents the Risk Threshold RED (RTRED)
congestion control strategy for TCP networks. In addition to the
maximum and minimum thresholds in existing RED-based strategies,
we add a third dropping level: the risk threshold, which uses both
the actual and average queue sizes to detect immediate congestion at
gateways. RTRED reacts to congestion on time: neither too early,
which would cause unfair packet losses, nor too late, which would
cause packet drops from time-outs. We compared the novel strategy
with the RED and ARED strategies for TCP congestion handling using
NS-2 simulations and found that RTRED outperformed both.
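The three-level drop decision described above can be sketched in a few lines of Python. This is only an illustration of the idea: the EWMA weight, the threshold values, and the drop-probability shape below are our assumptions, not the paper's parameters.

```python
import random

# Illustrative RTRED-style drop decision (parameter values and the
# drop-probability shape are assumptions, not the paper's specification).
MIN_TH, RISK_TH, MAX_TH = 5, 15, 25    # queue thresholds, in packets
WEIGHT = 0.002                          # EWMA weight for the average queue
MAX_P = 0.1                             # maximum early-drop probability

def update_avg(avg, actual):
    """Classic RED exponentially weighted moving average of the queue size."""
    return (1 - WEIGHT) * avg + WEIGHT * actual

def should_drop(avg, actual, rng=random):
    """Drop decision using BOTH the average and the actual queue size.

    The risk threshold consults the actual queue so that imminent
    congestion is caught even while the slow-moving average lags."""
    if avg >= MAX_TH or actual >= MAX_TH:
        return True                      # hard drop: queue effectively full
    if actual >= RISK_TH:                # immediate-congestion risk level
        return rng.random() < MAX_P
    if avg >= MIN_TH:                    # classic RED probabilistic early drop
        p = MAX_P * (avg - MIN_TH) / (MAX_TH - MIN_TH)
        return rng.random() < p
    return False
```

The point of the middle branch is visible in the signature: RED alone would ignore `actual`, so a sudden burst could overflow the queue before `avg` crosses `MIN_TH`.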
Abstract: In this paper, we combine a probabilistic neural method with radial-basis functions to construct the lithofacies of wells DF01, DF02 and DF03, situated in the Triassic province of Algeria (Sahara). Lithofacies identification is a crucial problem in reservoir characterization. Our objective is to facilitate the experts' work in the geological domain and to allow them to quickly obtain the structure and nature of the formations around the borehole. This study designs a tool that supports automatic deduction from numerical data. We used a probabilistic formalism to enhance the classification process initiated by a Self-Organizing Map procedure. From well-log data, our system produces the lithofacies of the reservoir wells concerned in a form easy to read by a geology expert, who can then identify the potential for oil production at a given source and so form the basis for estimating the financial returns and economic benefits.
Abstract: The purpose of this paper is to assess the value of neural networks for the classification of cancer and noncancer prostate cells. Gauss-Markov random field, Fourier entropy, and wavelet average deviation features are calculated from 80 noncancer and 80 cancer prostate cell nuclei. For classification, three artificial neural network techniques are used: the multilayer perceptron, the radial basis function network, and learning vector quantization. Two configurations are used for the multilayer perceptron: the first has a single hidden layer with 3-15 nodes; the second has two hidden layers, each with 3-15 nodes. An overall classification rate of 86.88% is achieved.
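The first multilayer-perceptron configuration (one hidden layer of 3-15 nodes) can be sketched as a plain forward pass. The weights, feature values, and the label encoding (output near 1.0 meaning "cancer") below are placeholders, not values from the study; in practice the weights would be learned from the 160 labelled nuclei.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mlp_forward(features, w_hidden, b_hidden, w_out, b_out):
    """Single-hidden-layer MLP: features -> hidden layer -> one output unit.

    An output near 1.0 would indicate 'cancer', near 0.0 'noncancer'
    (an assumed label encoding for illustration)."""
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, features)) + b)
              for ws, b in zip(w_hidden, b_hidden)]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)) + b_out)

# Toy example: 3 input features, 3 hidden nodes, placeholder weights.
score = mlp_forward([0.2, -0.5, 0.8],
                    [[0.1, 0.4, -0.2], [0.3, -0.1, 0.5], [-0.4, 0.2, 0.1]],
                    [0.0, 0.1, -0.1],
                    [0.6, -0.3, 0.2], 0.05)
```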
Abstract: Entrepreneurs are important for national labour markets and economies in that they contribute significantly to economic growth as well as provide the majority of jobs and create new ones. According to the Global Entrepreneurship Monitor’s “Report on Women and Entrepreneurship”, investment in women’s entrepreneurship is an important way to exponentially increase the impact of new venture creation, and finding ways to empower women’s participation and success in entrepreneurship is critical for more sustainable and successful economic development. Our results confirm that there are still differences between men and women entrepreneurs. The reasons seem to be a lack of specific business skills, less extensive social networks, and a lack of identification patterns among women. These differences can be explained by the fact that women still have fewer opportunities to build a career. If this is correct, we can expect an increasing proportion of women among entrepreneurs in the coming years. Concerning the development of a favourable environment for developing and enhancing women’s entrepreneurial activities, our results show that membership in a network and the presence of a role model are doubtless determining elements in the decision to launch an entrepreneurial activity, as well as a precious resource for the success of the company.
Abstract: The preparation of size-controlled silver nanoparticle catalysts on a carbon substrate from e-waste has been investigated. A chemical route was developed based on extraction of the available metals in nitric acid, followed by treatment with hydrofluoric acid. Silver metal particles were deposited with an average size of 4-10 nm. A stabilizer concentration of 10-40 g/l was used. The average size of the prepared silver decreased with increasing anode current density, and the size uniformity of the silver nanoparticles improved distinctly at higher current densities of no more than 20 mA. Grain size increased with EK time, whereby aggregation of particles was observed after 6 h of reaction. The chemical method involves adsorption of silver nitrate on the carbon substrate; the adsorbed silver ions were directly reduced to metal particles using hydrazine hydrate. An alternative method is treatment with ammonia followed by heating the silver hydroxide-loaded carbon at 980°C. The product was characterized using XRD, XRF, ICP, SEM and TEM techniques.
Abstract: It has been said that the “network is the system".
This implies providing levels of service, reliability, predictability and
availability that are commensurate with, or better than, those that
individual computers provide today. Providing this requires
integrated network management for interconnected networks of
heterogeneous devices, covering the local campus and beyond. In this
paper we present a framework to deal effectively with this issue. It
consists of the components, and the interactions between them,
required to perform service fault management. A real-world
scenario is used to derive the requirements, which have been applied
to the component identification. An analysis of existing frameworks
and approaches with respect to their applicability to the framework is
also carried out.
Abstract: In IETF RFC 2002, Mobile-IP was developed to
enable laptops to maintain Internet connectivity while moving
between subnets. However, packet loss arises when switching
subnets because network connectivity is lost while the mobile host
registers with the foreign agent, and this incurs large end-to-end
packet delays. The criterion for initiating a simple and fast
full-duplex connection between the home agent and the foreign
agent, so as to reduce the roaming duration, is the central issue
considered in this paper. State-transition Petri nets modeling the
scenario-based CIA (Communication Inter-Agents) procedure, an
extension of the basic Mobile-IP registration process, were designed
and manipulated to describe the system in terms of discrete events.
A configuration file for the registration parameters was created
during a practical setup session on a Cisco 1760 router running IOS
12.3(15)T with TFTP server software. Finally, stand-alone
performance simulations in Simulink/MATLAB, within each subnet
and between subnets, are presented, reporting improved end-to-end
packet delays. The results verified the effectiveness of our Mathcad
analytical manipulation and experimental implementation, showing
lower end-to-end packet delays for Mobile-IP with CIA-based early
registration and improved packet loss for flows between subnets.
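The registration flow that the Petri nets model can be illustrated as a toy discrete-event state machine. The state and event names below are our assumptions for a simplified basic-Mobile-IP flow, not the paper's Petri-net places and transitions.

```python
# Simplified discrete-event model of basic Mobile-IP registration
# (state and event names are illustrative, not the paper's Petri net).
TRANSITIONS = {
    ("HOME", "move_to_foreign_subnet"): "DISCONNECTED",
    ("DISCONNECTED", "agent_advertisement"): "REGISTERING",
    ("REGISTERING", "registration_reply"): "REGISTERED",
    ("REGISTERED", "move_to_home_subnet"): "HOME",
}

def run(events, state="HOME"):
    """Fire a sequence of events; events with no defined transition
    from the current state leave the state unchanged."""
    for e in events:
        state = TRANSITIONS.get((state, e), state)
    return state
```

In this picture, the CIA procedure would presumably shorten the time spent in the DISCONNECTED/REGISTERING window by letting the home and foreign agents communicate before the handoff completes.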
Abstract: Artificial-intelligence-based gaming is an interesting topic in state-of-the-art technology. This paper presents the automation of a traditional Omani game called Al-Hawalees. The related issues are resolved and implemented using an artificial intelligence approach: the minimax procedure is incorporated to generate diverse moves in on-line play. As the number of moves increases, the time complexity grows proportionally. To tackle the time and space complexities, we employ a back-propagation neural network (BPNN), trained off-line, to decide the resources required to automate the game. We use Levenberg-Marquardt training to obtain rapid responses during play. A set of optimal moves is determined by on-line back-propagation training combined with alpha-beta pruning. The results and analyses reveal that the proposed scheme can easily be incorporated in an on-line scenario with one player against the system.
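The minimax procedure with alpha-beta pruning mentioned above can be sketched over a nested-list game tree. Al-Hawalees move generation and the neural evaluation function are omitted; leaves here are just numeric evaluations, so this is a skeleton of the search, not the game engine.

```python
import math

def alphabeta(node, maximizing, alpha=-math.inf, beta=math.inf):
    """Minimax with alpha-beta pruning over a nested-list game tree.

    Leaves are static evaluation scores; internal nodes are lists of
    children. Alpha-beta skips siblings that cannot change the result."""
    if isinstance(node, (int, float)):        # leaf: static evaluation
        return node
    best = -math.inf if maximizing else math.inf
    for child in node:
        val = alphabeta(child, not maximizing, alpha, beta)
        if maximizing:
            best = max(best, val)
            alpha = max(alpha, best)
        else:
            best = min(best, val)
            beta = min(beta, best)
        if beta <= alpha:                     # prune remaining siblings
            break
    return best

# Tiny two-ply tree: the maximizing player can secure a value of 3.
value = alphabeta([[3, 12, 8], [2, 4, 6], [14, 5, 2]], True)
```

In the paper's scheme, the leaf evaluations would come from the off-line-trained BPNN rather than fixed numbers.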
Abstract: Traditionally, the Internet has provided best-effort service to every user regardless of requirements. However, as the Internet becomes universally available, users demand more bandwidth, applications require more and more resources, and interest has developed in having the Internet provide some degree of Quality of Service (QoS). Although QoS is an important issue, the question of how to bring it to the Internet has not yet been solved. Owing to rapid advances in technology, researchers are proposing new and more desirable capabilities for the next generation of IP infrastructures. However, not all applications demand the same amount of resources, nor are all users service providers. This paper is the first of a series that presents an architecture as a first step toward the optimization of QoS in the Internet environment, as a solution to the problem of an SMSE whose objective is to provide public Internet service with certain Quality of Service expectations. The service creates new business opportunities, but also presents new challenges. We have designed and implemented a scalable service framework that supports adaptive bandwidth based on user demand, and billing based on usage and on QoS. The developed application has been evaluated, and the results show that traffic limiting performs optimally, as does the distribution of excess bandwidth. Research is currently under way in two basic areas: (i) developing and testing new transfer protocols, and (ii) developing new strategies for traffic improvement based on service differentiation.
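The abstract does not name the traffic-limiting mechanism used; a token bucket is one common way such limiting is implemented, so the sketch below is a generic illustration with assumed parameters, not the paper's design.

```python
class TokenBucket:
    """Token-bucket traffic limiter: 'rate' tokens/second, bursts up to
    'capacity' packets.

    A common bandwidth-limiting mechanism, shown here only to make the
    idea of per-user traffic limiting concrete."""
    def __init__(self, rate, capacity):
        self.rate, self.capacity = rate, capacity
        self.tokens, self.last = float(capacity), 0.0

    def allow(self, now):
        """Return True if a packet arriving at time 'now' may pass."""
        # Refill tokens earned since the last arrival, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate=1.0, capacity=3)
burst = [bucket.allow(0.0) for _ in range(4)]   # 3 packets pass, 4th refused
```

After the burst is refused, waiting lets tokens accumulate again at `rate`, so sustained throughput is capped while short bursts are tolerated.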
Abstract: Multi-Radio Multi-Channel Wireless Mesh Networks (MRMC-WMNs) operate in the backbone to access and route high volumes of traffic simultaneously. Such roles demand high network capacity and long “online" time, at the expense of accelerated transmission-energy depletion and poor connectivity. This is the problem of transmission power control. Numerous power control methods for wireless networks exist in the literature; however, contributions addressing MRMC configurations still face many challenges. In this paper, an energy-efficient power selection protocol called PMMUP is proposed at the link layer. The protocol first divides the MRMC-WMN into a set of unified channel graphs (UCGs), each consisting of multiple radios interconnected via a common wireless channel. In each UCG, a stochastic linear quadratic cost function is formulated, and each user minimizes this cost function, which trades off the size of the unification states against the control action. The unification state variables come from independent UCGs and from higher layers of the protocol stack. PMMUP coordinates power optimization at the network interface cards (NICs) of wireless mesh routers. The proposed PMMUP-based algorithm is shown analytically to converge quickly, at a linear rate, and performance evaluations through simulations confirm the efficacy of the proposed dynamic power control.
Abstract: Research shows that the application of probability-statistical methods is unfounded at the early stage of diagnosing the technical condition of an aviation Gas Turbine Engine (GTE), when the flight information is fuzzy, limited and uncertain. Hence, the efficiency of applying Soft Computing technology at these diagnosing stages, using Fuzzy Logic and Neural Network methods, is considered. To this end, fuzzy multiple linear and non-linear models (fuzzy regression equations), obtained from statistical fuzzy data, are trained with high accuracy. To build a more adequate model of GTE technical condition, the dynamics of changes in the skewness and kurtosis coefficients are analysed. These analyses show that the distributions of GTE operating parameters have a fuzzy character, so the consideration of fuzzy skewness and kurtosis coefficients is expedient. Investigation of the dynamics of changes in the basic characteristics of GTE operating parameters leads to the conclusion that Fuzzy Statistical Analysis is necessary for preliminary identification of the engines' technical condition. The changes in correlation coefficient values are likewise fuzzy in character; therefore, the application of Fuzzy Correlation Analysis results is proposed for model selection. When the information is sufficient, a recurrent algorithm for identifying aviation GTE technical condition (using Hard Computing technology) is proposed, based on measurements of the input and output parameters of the multiple linear and non-linear generalised models in the presence of measurement noise (a new recursive Least Squares Method (LSM)). The developed GTE condition monitoring system provides stage-by-stage estimation of engine technical condition. As an application of the technique, the technical condition of a new operating aviation engine was estimated.
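The recursive least squares identification mentioned above can be sketched in its simplest, single-parameter form. This is a one-parameter stand-in for the paper's multiple linear and non-linear generalised models: it estimates theta in y = theta * x from a stream of measurements, updating the estimate as each sample arrives.

```python
def rls(samples, theta=0.0, p=1000.0):
    """Scalar recursive least squares: estimate theta in y = theta * x.

    'p' is the estimate covariance; a large initial value means the
    prior estimate is weakly trusted. Each new (x, y) sample corrects
    the running estimate without reprocessing earlier data."""
    for x, y in samples:
        k = p * x / (1.0 + x * p * x)   # gain: how much to trust this sample
        theta += k * (y - theta * x)    # correct estimate with the new data
        p *= (1.0 - k * x)              # covariance shrinks as data accrues
    return theta

# Noiseless data from y = 2x: the estimate converges to 2.
est = rls([(x, 2.0 * x) for x in range(1, 11)])
```

With noisy measurements, the same recursion averages the noise out over time, which is the setting the abstract describes.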
Abstract: Sedimentation results from soil erosion in water basins, especially in arid and semi-arid regions where poor vegetation cover on the upstream mountain slopes contributes to sediment formation. Sedimentation not only causes considerable change in the morphology of the river and its hydraulic characteristics, but also poses a major challenge for the operation and maintenance of the canal network, which depends on water flow to meet stakeholders' requirements. For this reason, mathematical modeling can be used to simulate the factors governing scouring, sediment transport and settling along waterways. This is particularly important behind reservoirs, as it enables operators to estimate the useful life of these hydraulic structures. The aim of this paper is to simulate sedimentation and erosion at the eastern and western water intake structures of the Dez Diversion weir using the GSTARS-3 software, in order to estimate sedimentation and investigate ways to optimize the process and minimize operational problems. Results indicated that at the furthest point upstream of the diversion weir, the coarser sediment grains tended to settle. The reason for this is the construction of the phantom bridge and the outstanding rocks just upstream of the structure: these constructions along the river course have reduced the momentum energy required to push the sediment loads, making it possible for them to settle wherever the river regime allows. Results further indicated a trend in sediment size whereby the grains get smaller as the focus of study shifts downstream, and vice versa. It was also found that the findings of GSTARS-3 closely matched the observed data. This suggests that the software is a powerful analytical tool which can be applied in river engineering projects at minimal cost and with relatively accurate results.
Abstract: The quality of short-term load forecasting (STLF) can improve the efficiency of planning and operation of electric utilities. Artificial Neural Networks (ANNs) are employed for nonlinear short-term load forecasting owing to their powerful nonlinear mapping capabilities. At present, there is no systematic methodology for the optimal design and training of an artificial neural network; one often has to resort to trial and error. This paper describes the process of developing three-layer feed-forward large neural networks for short-term load forecasting and then presents a heuristic search algorithm for an important task in this process, namely optimal network structure design. Particle Swarm Optimization (PSO) is used to develop the optimal large-neural-network structure and connection weights for the one-day-ahead electric load forecasting problem. PSO is a stochastic optimization method based on swarm intelligence with a powerful global optimization capability. Employing PSO algorithms in the design and training of ANNs allows the ANN architecture and parameters to be optimized easily. The proposed method is applied to STLF for a local utility. Data are clustered according to differences in their characteristics, and special days are extracted from the normal training sets and handled separately. In this way, a solution is provided for all load types, including working days, weekends and special days. The experimental results show that the proposed PSO-optimized method can quicken the learning speed of the network and improve the forecasting precision compared with the conventional Back Propagation (BP) method. Moreover, it is simple to calculate as well as practical and effective; it provides a greater degree of accuracy in many cases and consistently gives lower percentage errors for the STLF problem compared to the BP method. Thus, it can be applied to automatically design an optimal load forecaster based on historical data.
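The PSO loop at the heart of the method can be sketched in its standard inertia/cognitive/social form. The test function, parameter values, and population sizes below are illustrative; in the paper's setting the fitness would instead score a candidate network structure by its forecasting error.

```python
import random

def pso(f, dim=2, particles=20, iters=200, seed=0):
    """Minimal particle swarm optimizer (standard velocity update with
    inertia w and cognitive/social weights c1, c2), minimizing f."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = pbest[min(range(particles), key=lambda i: pbest_val[i])][:]  # global best
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (g[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < f(g):
                    g = pos[i][:]
    return g, f(g)

best, best_val = pso(lambda p: sum(x * x for x in p))  # sphere test function
```

To search network structures instead, the position vector would encode hidden-layer sizes (rounded to integers) and connection weights, with the same update rule.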
Abstract: Since the advent of the information era, the Internet has
brought various positive effects in everyday life. Nevertheless,
recently, problems and side-effects have been noted. Internet
witch-trials and spread of pornography are only a few of these
problems. In this study, problems and causes of malicious replies on
internet boards were analyzed, using the key ideas of game theory. The
study provides a mathematical model for the internet reply game to
devise three possible plans that could efficiently counteract malicious
replies. Furthermore, seven specific measures that comply with one of
the three plans were proposed and evaluated according to the
importance and utility of each measure using the orthogonal array
survey and SPSS conjoint analysis. The conclusion was that the most
effective measure would be forbidding unsigned user access to
malicious replies. Also notable was that some analytically proposed
measures, when implemented, could backfire and encourage malicious
replies.
Abstract: IEEE 802.16 is a new wireless technology standard with
several advantages, including wider coverage, higher bandwidth,
and QoS support. As a new wireless technology for the last-mile
solution, the IEEE 802.16 standard defines two models: PMP (point
to multipoint) and Mesh. In this paper we focus only on the IEEE
802.16 Mesh model. According to the IEEE 802.16 standard, the
Mesh model has two scheduling modes, centralized and distributed.
Considering the pros and cons of the two scheduling modes, we
present a combined-scheduling QoS framework in which the BS
(Base Station) controls time-frame scheduling and directly selects
the shortest path from source to destination. In addition, we propose
an Expedited Queue mechanism to cut down the transmission time;
the EQ mechanism greatly reduces end-to-end delay in our QoS
framework. A simulation study shows that the average delay is
smaller than that of the compared schemes, and that our proposed
scheme also achieves higher performance.
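The effect of an expedited queue can be illustrated with a minimal two-class queue in which expedited packets always dequeue first while each class stays FIFO internally. This is only a toy model of the idea; the actual EQ mechanism's details (classification, frame placement) are specific to the paper.

```python
import heapq

class ExpeditedQueue:
    """Two-class queue: expedited packets always dequeue before normal
    ones, and arrival order is preserved within each class.

    Illustrative model of an 'expedited queue' that cuts queueing delay
    for urgent traffic; not the paper's actual EQ implementation."""
    EXPEDITED, NORMAL = 0, 1

    def __init__(self):
        self._heap, self._seq = [], 0

    def enqueue(self, packet, expedited=False):
        cls = self.EXPEDITED if expedited else self.NORMAL
        # The sequence number breaks ties, giving FIFO within a class.
        heapq.heappush(self._heap, (cls, self._seq, packet))
        self._seq += 1

    def dequeue(self):
        return heapq.heappop(self._heap)[2]

q = ExpeditedQueue()
q.enqueue("data1")
q.enqueue("voice", expedited=True)
q.enqueue("data2")
order = [q.dequeue() for _ in range(3)]    # the voice packet jumps the queue
```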
Abstract: In this paper, a periodic surveillance scheme is proposed for any convex region using mobile wireless sensor nodes. A sensor network typically consists of a fixed number of sensor nodes that report measurements of sensed data such as temperature, pressure and humidity in their immediate proximity (the area within sensing range). To sense an area of interest, an adequate number of fixed sensor nodes is required to cover the entire region; the number needed depends on the sensing range of the sensors as well as the deployment strategy employed. Here it is assumed that the sensors are mobile within the region of surveillance and can be mounted on moving bodies such as robots or vehicles. In our scheme, therefore, the surveillance time period determines the number of sensor nodes required to be deployed in the region of interest. The proposed scheme comprises three algorithms, namely Hexagonalization, Clustering, and Scheduling. The first algorithm partitions the coverage area into fixed-size hexagons that approximate the sensing range (cell) of an individual sensor node. The clustering algorithm groups the cells into clusters, each of which is covered by a single sensor node. The scheduling algorithm determines a schedule for each sensor to serve its respective cluster: each sensor node traverses all the cells belonging to its assigned cluster by oscillating between the first and the last cell for the duration of its lifetime. Simulation results show that our scheme provides full coverage within a given period of time using fewer sensors with minimal movement, lower power consumption, and relatively low infrastructure cost.
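The oscillating traversal produced by the Scheduling step can be sketched directly. The list-of-cells representation is our simplification: the paper's algorithm works on clusters of hexagonal cells, but the back-and-forth visit order is the same idea.

```python
def oscillating_schedule(cells, steps):
    """Visit order for one sensor sweeping its cluster back and forth.

    'cells' is the ordered list of cells in the cluster; the sensor
    oscillates between the first and last cell, as in the paper's
    Scheduling algorithm (list form is our simplification)."""
    if len(cells) < 2:
        return cells * steps
    # One full sweep cycle, e.g. [A, B, C] -> A, B, C, B, then repeat.
    cycle = cells + cells[-2:0:-1]
    return [cycle[i % len(cycle)] for i in range(steps)]

visits = oscillating_schedule(["A", "B", "C"], 7)
```

Every cell in the cluster is revisited at least once per cycle of length 2(n-1), which is what bounds the surveillance period for a cluster of n cells.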
Abstract: One main drawback of intrusion detection systems is their
inability to detect new attacks that do not have known signatures. In
this paper we discuss an intrusion detection method that uses
independent component analysis (ICA)-based feature selection
heuristics and rough-fuzzy clustering of the data. ICA separates the
independent components (ICs) from the monitored variables; rough
sets reduce the amount of data and remove redundancy; and fuzzy
methods allow objects to belong to several clusters simultaneously,
with different degrees of membership. Our approach allows us not
only to recognize known attacks but also to detect activity that may
be the result of a new, unknown attack. Experimental results are
reported on the Knowledge Discovery and Data Mining (KDD Cup
1999) dataset.
Abstract: In this paper, the SFQ (Start-time Fair Queuing)
algorithm is analyzed as applied to computer networks, to determine
how network traffic behaves when different data sources are
managed by the scheduler. The networks were simulated using the
NS-2 software to obtain graphs showing the scheduler's performance.
Different traffic sources were introduced in the scripts to
approximate a realistic scenario. The results show that traffic can be
affected to different degrees depending on the data source: when
Constant Bit Rate traffic is applied, the scheduler ensures a constant
level of data sent and received, although in practice it is impossible
to guarantee a level that withstands changes in workload.
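The core of the SFQ algorithm analyzed above is the tag computation: each packet gets a start tag S = max(v, F_prev) and a finish tag F = S + length/weight, packets are served in start-tag order, and the virtual time v advances to the start tag of the packet in service. The sketch below shows just this core, not the NS-2 implementation used in the paper.

```python
import heapq

class SFQ:
    """Start-time Fair Queuing core: tag packets, serve smallest start tag.

    S(p) = max(v, F_prev(flow)); F(p) = S(p) + length / weight; the
    virtual time v follows the start tag of the packet in service."""
    def __init__(self):
        self.v = 0.0        # virtual time
        self.finish = {}    # last finish tag per flow
        self.heap = []
        self.seq = 0        # FIFO tie-breaker for equal start tags

    def enqueue(self, flow, length, weight):
        start = max(self.v, self.finish.get(flow, 0.0))
        self.finish[flow] = start + length / weight
        heapq.heappush(self.heap, (start, self.seq, flow))
        self.seq += 1

    def dequeue(self):
        start, _, flow = heapq.heappop(self.heap)
        self.v = start      # v tracks the start tag of the packet in service
        return flow

sfq = SFQ()
for _ in range(3):
    sfq.enqueue("A", length=1.0, weight=1.0)
    sfq.enqueue("B", length=1.0, weight=1.0)
order = [sfq.dequeue() for _ in range(6)]   # equal weights -> flows alternate
```

Giving one flow a larger weight shrinks its finish-tag increments, so its packets accumulate smaller start tags and it receives a proportionally larger share of service.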
Abstract: This paper deals with the effect of a power transformer’s vector group on the basic voltage sag characteristics during unbalanced faults in a meshed or radial power network. Specifically, the propagation of voltage sags through a power transformer is studied with advanced short-circuit analysis, and a method to incorporate this effect into analytical mathematical expressions is proposed. Based on this methodology, the positive effect of transformers of certain vector groups on mitigating the expected number of voltage sags per year (the sag frequency) at the terminals of critical industrial customers can be estimated.