Abstract: Fine-grained data replication over the Internet allows duplication of frequently accessed data objects, as opposed to entire sites, at selected locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit, since they are driven by different goals such as minimizing their communication costs, latency, etc. In this paper, we use game-theoretical techniques, in particular auctions, to identify a bidding mechanism that encapsulates the selfishness of the agents while keeping a controlling hand over them. In essence, the proposed game-theory-based mechanism studies what happens when independent agents act selfishly and how to control them to maximize overall performance. A bidding mechanism asks how one can design systems so that agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality while maintaining fast execution time. The comparisons are recorded against some well-known techniques such as greedy, branch and bound, game-theoretical auctions, and genetic algorithms.
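As a hedged illustration of the kind of bidding mechanism discussed above (the site names and bid values are hypothetical, not taken from the paper), a second-price sealed-bid auction can be sketched in a few lines; its appeal when agents are selfish is that truthful bidding is a dominant strategy:

```python
# Sketch of a second-price (Vickrey) auction for deciding which site wins
# the right to host a replica of a frequently accessed object. Bids are
# assumed to reflect each agent's private valuation, e.g. the communication
# cost it would save by hosting the object locally.

def run_vickrey_auction(bids):
    """bids: dict mapping site name -> bid value.
    Returns (winner, price): the highest bidder wins but pays only the
    second-highest bid, which makes truthful bidding a dominant strategy."""
    if len(bids) < 2:
        raise ValueError("a second-price auction needs at least two bidders")
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]  # second-highest bid
    return winner, price

# Hypothetical sites bidding for one replica:
winner, price = run_vickrey_auction({"siteA": 12.0, "siteB": 9.5, "siteC": 7.0})
```

Here siteA wins the replica but is charged 9.5, the bid of the runner-up, so no agent gains by misreporting its valuation.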
Abstract: Due to the three-dimensional flow pattern interacting with the bed material, the process of local scour around bridge piers is complex. Modeling the 3D flow field and scour hole evolution around a bridge pier is more feasible nowadays because computational cost and computation time have significantly decreased. In order to evaluate local flow and scouring around a bridge pier, a fully three-dimensional numerical model, the SSIIM program, was used. The model solves the 3-D Navier-Stokes equations and a bed load conservation equation. The model was applied to simulate local flow and scouring around a bridge pier in a large natural river with four piers. A computation for one day of flood conditions was carried out to predict the maximum local scour depth. The results show that the SSIIM program can be used efficiently for simulating scouring in natural rivers. The results also showed that, among the various turbulence models, the k-ω model gives the most reasonable results.
Abstract: In this paper we propose an intelligent agent approach
to control the electric power grid at a smaller granularity in order to
give it self-healing capabilities. We develop a method using the
influence model to transform transmission substations into
information processing, analyzing and decision making (intelligent
behavior) units. We also develop a wireless communication method
to deliver real-time uncorrupted information to an intelligent
controller in a power system environment. A combined networking
and information theoretic approach is adopted in meeting both the
delay and error probability requirements. We use a mobile agent
approach in optimizing the achievable information rate vector and in
the distribution of rates to users (sensors). We developed the concept
and the quantitative tools required in the creation of cooperating
semi-autonomous subsystems, which puts the electric grid on the path
towards an intelligent and self-healing system.
Abstract: Traffic management in an urban area is highly facilitated by knowledge of the traffic conditions in every street or highway involved in the vehicular mobility system. The aim of this paper is to propose a neuro-fuzzy approach able to compute the main parameters of a traffic system, i.e., car density, velocity, and flow, using the images collected by web-cams located at the crossroads of the traffic network. The performance of this approach encourages its application when the traffic system is far from saturation. A fuzzy model is also outlined to evaluate when it is appropriate to use more accurate, though more time-consuming, algorithms for measuring traffic conditions near saturation.
Abstract: This paper presents an indirect adaptive stabilization
scheme for first-order continuous-time systems under a saturated input
described by a sigmoidal function. Singularities are avoided through a
modification scheme for the estimated plant parameter vector, so that
its associated Sylvester matrix is guaranteed to be non-singular and,
therefore, the estimated plant model is controllable. The modification
mechanism involves the use of a hysteresis switching function. An
alternative hybrid scheme, whose estimated parameters are updated at
sampling instants, is also given to solve a similar adaptive
stabilization problem. Such a scheme also uses hysteresis switching to
modify the parameter estimates so as to ensure the controllability of
the estimated plant model.
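The link between a non-singular Sylvester matrix and controllability can be made concrete with a small generic sketch (this is a textbook resultant computation, not the paper's estimation scheme): the Sylvester matrix of the plant's numerator and denominator polynomials is non-singular exactly when the two polynomials are coprime, i.e. when the plant model has no pole-zero cancellation.

```python
# Sketch: Sylvester matrix of two polynomials and its determinant (the
# resultant). The polynomials and coefficients below are illustrative.

def sylvester_matrix(p, q):
    """Sylvester matrix of polynomials p and q, given as coefficient lists
    with the highest degree first; non-singular iff p and q are coprime."""
    m, n = len(p) - 1, len(q) - 1
    rows = [[0] * i + p + [0] * (n - 1 - i) for i in range(n)]
    rows += [[0] * i + q + [0] * (m - 1 - i) for i in range(m)]
    return rows

def det(M):
    # Laplace expansion; fine for the small matrices used here
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([r[:j] + r[j + 1:] for r in M[1:]])
               for j in range(len(M)))

# (s+1)(s+2) and (s+1) share a root -> singular Sylvester matrix
singular = det(sylvester_matrix([1, 3, 2], [1, 1]))
# (s+1)(s+2) and (s+3) are coprime -> non-singular Sylvester matrix
coprime = det(sylvester_matrix([1, 3, 2], [1, 3]))
```

A modification scheme such as the one in the abstract keeps the estimated parameters away from the set where this determinant vanishes.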
Abstract: The objective of this paper is to present explicit analytical formulas for evaluating important characteristics of the Double Moving Average (DMA) control chart for the Poisson distribution. The most popular characteristics of a control chart are the Average Run Length (ARL0), the mean number of observations taken before a system is signaled to be out-of-control when it is actually still in-control, and the Average Delay time (ARL1), the mean delay of true alarm times. An important property required of ARL0 is that it should be sufficiently large when the process is in-control, to reduce the number of false alarms. On the other side, if the process is actually out-of-control, then ARL1 should be as small as possible. In particular, the explicit analytical formulas for evaluating ARL0 and ARL1 make it possible to obtain a set of optimal parameters, which depend on the width of the moving average (w) and the width of the control limit (H), for designing a DMA chart with minimum ARL1.
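While the paper derives explicit analytical formulas, ARL0 and ARL1 for a DMA chart can also be approximated by straightforward Monte Carlo simulation. The sketch below is a hedged illustration only: the parameter values (lam0 = 4, w = 5, H = 3), the simplified variance term, and the warm-up convention are assumptions, not the paper's design.

```python
import math
import random

random.seed(7)

def poisson_sample(lam):
    # Knuth's multiplication method for Poisson variates (fine for small lam)
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def run_length(lam, lam0, w, H, max_n=2000):
    """First signal time of a DMA chart designed for in-control mean lam0,
    monitoring Poisson(lam) observations; checks start after a 2w warm-up."""
    sigma = math.sqrt(lam0 / w)  # rough stand-in for the exact DMA variance
    xs, mas = [], []
    for t in range(1, max_n + 1):
        xs.append(poisson_sample(lam))
        mas.append(sum(xs[-w:]) / min(w, t))          # moving average
        dma = sum(mas[-w:]) / min(w, len(mas))        # double moving average
        if t >= 2 * w and abs(dma - lam0) > H * sigma:
            return t
    return max_n

def estimate_arl(lam, lam0=4.0, w=5, H=3.0, reps=50):
    """Average run length over repeated simulated sequences."""
    return sum(run_length(lam, lam0, w, H) for _ in range(reps)) / reps

arl_in = estimate_arl(4.0)   # process in-control: approximates ARL0
arl_out = estimate_arl(8.0)  # mean shifted upward: approximates ARL1
```

As the abstract requires, the simulated ARL0 comes out much larger than the simulated ARL1; the analytical formulas replace this costly simulation with closed-form evaluation.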
Abstract: Rambutan is a tropical fruit whose peel possesses antioxidant properties. This work was conducted to optimize the extraction conditions of phenolic compounds from rambutan peel. Response surface methodology (RSM) was adopted to optimize subcritical water extraction (SWE) over temperature, extraction time, and percent solvent mixture. The results demonstrated that the optimum conditions for SWE were as follows: temperature 160°C, extraction time 20 min, and a concentration of 50% ethanol. Comparing the phenolic compounds from rambutan peel obtained by maceration for 6 h, Soxhlet extraction for 4 h, and SWE for 20 min, the total phenolic content (using the Folin-Ciocalteu phenol reagent) was 26.42, 70.29, and 172.47 mg of tannic acid equivalent (TAE) per g of dry rambutan peel, respectively. The comparative study concluded that SWE is a promising technique for phenolic compound extraction from rambutan peel, since it yielded more than twice the phenolic content of the conventional techniques in a shorter extraction time.
Abstract: Hypersonic flows around spatial vehicles during their
reentry phase in planetary atmospheres are characterized by intense
aerothermal phenomena. The aim of this work is to analyze high
temperature flows around an axisymmetric blunt body taking into
account chemical and vibrational non-equilibrium for air mixture
species. For this purpose, a finite volume methodology is employed
to determine the supersonic flow parameters around the axisymmetric
blunt body, especially at the stagnation point and along the wall of
spacecraft for several altitudes. This allows capturing the shock
wave ahead of a blunt body placed in a supersonic free stream. The
numerical technique uses the flux vector splitting method of Van Leer.
Here, an adequate time-stepping parameter, along with the CFL
coefficient and mesh size level, is selected to ensure numerical
convergence, sought with an order of 10^-8.
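The Van Leer flux vector splitting mentioned above has a simple closed form for the 1-D Euler equations. The sketch below is a generic textbook version (with γ = 1.4 assumed, and only the 1-D case), not the paper's axisymmetric solver; it shows the subsonic split fluxes and checks the defining consistency property F = F⁺ + F⁻:

```python
import math

GAMMA = 1.4  # ratio of specific heats for air (assumed)

def exact_flux(rho, u, p):
    """Exact 1-D Euler flux F(U) = [rho*u, rho*u^2 + p, u*(E + p)]."""
    E = p / (GAMMA - 1.0) + 0.5 * rho * u * u
    return [rho * u, rho * u * u + p, u * (E + p)]

def van_leer_split(rho, u, p):
    """Van Leer flux vector splitting F = F+ + F-.
    Supersonic flow is fully upwinded; the subsonic branch uses the
    classic (M +/- 1)^2 polynomial split."""
    a = math.sqrt(GAMMA * p / rho)  # speed of sound
    M = u / a                       # Mach number
    if M >= 1.0:
        return exact_flux(rho, u, p), [0.0, 0.0, 0.0]
    if M <= -1.0:
        return [0.0, 0.0, 0.0], exact_flux(rho, u, p)
    def half(sign):
        f1 = sign * rho * a * (M + sign) ** 2 / 4.0
        b = (GAMMA - 1.0) * u + sign * 2.0 * a
        return [f1, f1 * b / GAMMA, f1 * b * b / (2.0 * (GAMMA ** 2 - 1.0))]
    return half(+1.0), half(-1.0)

# Subsonic test state (illustrative values): the split fluxes must
# reconstruct the exact flux term by term.
fp, fm = van_leer_split(1.0, 0.5, 1.0)
f = exact_flux(1.0, 0.5, 1.0)
```

In a finite volume scheme, F⁺ is evaluated from the left cell and F⁻ from the right cell at each interface, which is what provides the upwinding.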
Abstract: Evaluation of educational portals is an important
subject area that needs more attention from researchers. A university
whose educational portal is difficult for teachers, students, or
management staff to use and interact with can see its position and
reputation reduced. Therefore, it is important to be able to evaluate
the quality of the e-services the university provides, in order to
improve them over time.
The present study evaluates the usability of the Information
Technology Faculty portal at University of Benghazi. Two evaluation
methods were used: a questionnaire-based method and an online
automated tool-based method. The first method was used to measure
the portal's external attributes of usability (Information, Content and
Organization of the portal, Navigation, Links and Accessibility,
Aesthetic and Visual Appeal, Performance and Effectiveness and
educational purpose) from users' perspectives, while the second
method was used to measure the portal's internal attributes of
usability (number and size of HTML files, number and size of images,
load time, HTML check errors, browsers compatibility problems,
number of bad and broken links), which cannot be perceived by the
users. The study showed that some of the usability aspects were found
to be at an acceptable level of performance and quality, while others
were not. In general, it was concluded that the usability of the IT
faculty educational portal is generally acceptable. Recommendations
and suggestions to address the weaknesses and improve the quality of
the portal's usability are presented in this study.
Abstract: The purpose of this research was to determine the
prevalence of post-exposure preventive measures (PEP) after
needle-stick injuries and their relationship with locus of control
beliefs in a sample of medical students. In this cross-sectional
study, 300 medical students with a history of having experienced
needle stick injuries (NSI) at least once filled in a questionnaire to
determine whether they perceived themselves to be responsible and
effective in preventing
blood-borne infections after NSI. About 38% of students did not seek
any professional consultation or PEP after NSI, due to lack of time or
access, anxiety about test results, belief in the uselessness of
follow-up, and not being able to change destiny. These 114 students
were not different from the others regarding their scores on the
NSI-specific scale of locus of health control. Thus, the potential of
NSI locus of control beliefs to predict PEP was not seen in this study.
Abstract: The study of proteomics reached unexpected levels of
interest, as a direct consequence of its discovered influence over
some complex biological phenomena, such as problematic diseases
like cancer. This paper presents a new technique that allows for an
accurate analysis of the human interactome network. It is basically
a two-step analysis process that involves, at first, the detection of
each protein's absolute importance through the betweenness centrality
computation. Then, the second step determines the functionally related
communities of proteins. For this purpose, we use a community
detection technique based on the edge betweenness calculation. The new
technique was thoroughly tested on real biological data, and the
results prove some interesting properties of those proteins that are
involved in the carcinogenesis process. Apart from its experimental
usefulness, the novel technique is also computationally effective in
terms of execution times. Based on the analysis results, some
topological features of cancer-mutated proteins are presented, and a
possible optimization solution for cancer drug design is suggested.
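The two-step analysis can be illustrated with a minimal stdlib sketch on a toy graph (not the interactome data): Brandes' algorithm yields the edge betweenness values, and a Girvan-Newman step then removes the highest-betweenness edge until the network splits into communities.

```python
from collections import deque, defaultdict

def edge_betweenness(graph):
    """Brandes' algorithm for unweighted graphs: betweenness centrality of
    each edge, i.e. how many shortest paths pass through it.
    graph: dict node -> set of neighbour nodes (undirected)."""
    eb = defaultdict(float)
    for s in graph:
        # BFS from s, counting shortest paths (sigma) and predecessors
        dist = {s: 0}
        sigma = defaultdict(float); sigma[s] = 1.0
        preds = defaultdict(list)
        order, q = [], deque([s])
        while q:
            v = q.popleft()
            order.append(v)
            for w in graph[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        # back-propagate path dependencies onto edges
        delta = defaultdict(float)
        for w in reversed(order):
            for v in preds[w]:
                c = sigma[v] / sigma[w] * (1.0 + delta[w])
                eb[frozenset((v, w))] += c
                delta[v] += c
    return {e: b / 2.0 for e, b in eb.items()}  # each edge seen from both ends

def components(graph):
    seen, comps = set(), []
    for s in graph:
        if s in seen:
            continue
        comp, q = set(), deque([s])
        while q:
            v = q.popleft()
            if v in comp:
                continue
            comp.add(v)
            q.extend(graph[v])
        seen |= comp
        comps.append(comp)
    return comps

def girvan_newman_split(graph):
    """Remove the highest-betweenness edge until the graph splits."""
    g = {v: set(ns) for v, ns in graph.items()}
    while len(components(g)) == 1:
        eb = edge_betweenness(g)
        u, v = tuple(max(eb, key=eb.get))
        g[u].discard(v)
        g[v].discard(u)
    return components(g)

# Toy network: two triangles joined by a bridge c-d; the bridge carries
# all cross-community shortest paths, so it is removed first.
toy = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"},
       "d": {"c", "e", "f"}, "e": {"d", "f"}, "f": {"d", "e"}}
communities = girvan_newman_split(toy)
```

On real interactome-scale networks one would recompute betweenness incrementally or use an optimized library, but the logic is the same.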
Abstract: Project managers are ultimately responsible for the
overall characteristics of a project, i.e. they should deliver the project
on time with minimum cost and with maximum quality. It is vital for
any manager to decide on a trade-off between these conflicting
objectives, and managers would benefit from any scientific decision
support tool. Our work tries to determine a set of optimal solutions
(rather than a single optimal solution) from which the project manager
can select his desired choice to run the project. In this paper, the
problem in project scheduling notated as
(1,T|cpm,disc,mu|curve:quality,time,cost) will be studied. The
problem is multi-objective and the purpose is finding the Pareto
optimal front of time, cost and quality of a project
(curve:quality,time,cost), whose activities belong to a start to finish
activity relationship network (cpm) and they can be done in different
possible modes (mu) which are non-continuous or discrete (disc), and
each mode has a different cost, time, and quality. The project is
constrained to a non-renewable resource i.e. money (1,T). Because
the problem is NP-Hard, to solve the problem, a meta-heuristic is
developed based on a version of genetic algorithm specially adapted
to solve multi-objective problems namely FastPGA. A sample project
with 30 activities is generated and then solved by the proposed
method.
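The Pareto-optimal front that the FastPGA-based method searches for can be made concrete with a tiny sketch (the schedule tuples below are illustrative, not the 30-activity project): casting all three objectives as minimization (quality is negated), the front is simply the set of non-dominated solutions.

```python
# Sketch: Pareto dominance and front extraction for (time, cost, -quality)
# tuples, all to be minimized. Values are hypothetical schedules.

def dominates(a, b):
    """a dominates b if a is no worse in every objective
    and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(solutions):
    """Keep only the solutions not dominated by any other."""
    return [s for i, s in enumerate(solutions)
            if not any(dominates(t, s)
                       for j, t in enumerate(solutions) if j != i)]

schedules = [
    (10, 100, -0.9),  # fast, expensive, high quality
    (12, 80, -0.8),   # slower but cheaper
    (10, 120, -0.9),  # dominated: same time/quality, higher cost
    (15, 150, -0.5),  # dominated on all objectives
]
front = pareto_front(schedules)
```

A multi-objective GA such as FastPGA evolves a population toward this front instead of collapsing the three objectives into one weighted score, which is what lets the manager pick a trade-off afterwards.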
Abstract: In this paper the real money demand function is analyzed
within a multivariate time-series framework. A cointegration approach
(the Johansen procedure) is used, assuming interdependence between the
money demand determinants, which are nonstationary variables. This
helps us understand the behavior of money demand in Croatia, revealing
the significant influence between endogenous variables in a vector
autoregression (VAR) system, i.e. a vector error correction model
(VECM). Exogeneity of the explanatory variables is tested. A long-run
money demand function is estimated, indicating a slow speed of
adjustment in removing the disequilibrium. Empirical results provide
evidence that real industrial production and the exchange rate explain
most of the variation in money demand in the long run, while the
interest rate is significant only in the short run.
Abstract: Despite the fact that Arabic is currently one of the
most widely spoken languages worldwide, there has been relatively
little research on Arabic speech recognition compared to other
languages such as English and Japanese. Generally, digital speech
processing and voice recognition algorithms are of special
importance for designing efficient, accurate, as well as fast automatic
speech recognition systems. The speech recognition process carried
out in this paper is divided into three stages. Firstly, the signal is
preprocessed to reduce noise effects; after that, the signal is
digitized and hearingized, and then the voice activity regions are
segmented using a voice activity detection (VAD)
algorithm. Secondly, features are extracted from the speech signal
using Mel-frequency cepstral coefficients (MFCC) algorithm.
Moreover, delta and acceleration (delta-delta) coefficients have been
added to improve the recognition accuracy. Finally, each test word's
features are compared to the training database using the dynamic time
warping (DTW) algorithm. Using the best setup found for all relevant
parameters of the aforementioned techniques, the proposed system
achieved a recognition rate of about 98.5%, which outperformed other
HMM- and ANN-based approaches available in the literature.
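The DTW comparison in the final stage can be sketched as follows (a scalar-sequence toy; in the actual system each element would be an MFCC feature vector and the local cost a vector distance):

```python
def dtw_distance(x, y):
    """Classic O(len(x)*len(y)) dynamic time warping distance between two
    sequences, using absolute difference as the local cost. DTW aligns
    sequences of different lengths, which is why it suits spoken words
    uttered at different speeds."""
    n, m = len(x), len(y)
    INF = float("inf")
    # D[i][j] = cost of the best alignment of x[:i] with y[:j]
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]
```

A recognizer of this kind labels a test word with the training template whose DTW distance to it is smallest.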
Abstract: In this paper we present a modification to an existing threshold model for shot cut detection, which is able to adapt itself to the sequence statistics and operate in real time, because it uses only previously evaluated frames in its calculation. The efficiency of the proposed modified adaptive threshold scheme was verified through extensive experiments with several similarity metrics, and the achieved results were compared to those of the original model. According to the results, the proposed threshold scheme reached higher accuracy than the original model.
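A minimal sketch of such a causal adaptive threshold (the window size, the mean-plus-k-sigma rule, and the toy data are assumptions, not the paper's exact model) flags a cut when a frame's dissimilarity exceeds statistics computed from already-seen frames only, which is what makes real-time operation possible:

```python
import statistics

def detect_cuts(frame_diffs, window=10, k=3.0):
    """Flag frame t as a shot cut when its dissimilarity exceeds
    mean + k * std of the previous `window` frame differences.
    Only previously evaluated frames are used (causal, real-time)."""
    cuts = []
    for t in range(window, len(frame_diffs)):
        history = frame_diffs[t - window:t]
        mu = statistics.fmean(history)
        sd = statistics.pstdev(history)
        if frame_diffs[t] > mu + k * sd + 1e-9:  # epsilon guards flat history
            cuts.append(t)
    return cuts

# Toy dissimilarity sequence: steady content with one abrupt cut at frame 15.
diffs = [1.0] * 15 + [50.0] + [1.0] * 8
cuts = detect_cuts(diffs)
```

Because the threshold follows the local statistics, a noisy but gradual sequence raises the bar automatically, while an isolated spike is still detected.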
Abstract: It is not easy to imagine how the existing city can be
converted to the principles of sustainability; however, the need for
innovation requires a pioneering phase, which must address the main
problems of rehabilitating the operating models of the city. Today,
however, there is a growing awareness that the identification and
implementation of policies and measures to promote the adaptation,
resilience and reversibility of the city, require the contribution of our
discipline. This breakthrough is present in some recent international
experiences of Climate Plans, in which the envisaged measures are
closely interwoven with those of urban planning. These experiences
provide some answers to principal questions, such as: how strategies
to combat climate change can be integrated into the instruments of
local government; what new and specific analyses must be introduced
into urban planning in order to understand the issues of urban
sustainability; and how the project compares across different spatial
scales.
Abstract: This paper explores the plant maintenance management system (PMMS) used by a giant oil and gas company in Malaysia. The system is used to manage the upstream operations of more than 100 plants of the case study company. Based on observations, focus group discussions with PMMS personnel, and application through simulation (SAP R/3), the paper reviews the step-by-step approach and the elements required for the PMMS. The findings show that the PMMS integrates the overall business strategy in upstream operations, consisting of asset management, work management, and performance management. In addition, the roles of the PMMS are to help operations personnel organize and plan their daily activities, to improve productivity and reduce equipment downtime, to help operations management analyze the facilities and their performance, and to provide and maintain the operational effectiveness of the facilities.
Abstract: The modeling of sound radiation is of fundamental importance for understanding the propagation of acoustic waves and, consequently, for developing mechanisms to reduce acoustic noise. The propagation of acoustic waves involves various phenomena such as radiation, absorption, transmission, and reflection. Radiation is studied through the linear acoustic wave equation, which is obtained from the equation for the conservation of momentum, the equation of state, and the continuity equation. From these equations, the Helmholtz differential equation, which describes the acoustic radiation problem, is derived. In this paper we obtain the solution of the Helmholtz differential equation for an infinite pulsating cylinder in a free and homogeneous medium. The analytical solution is implemented and the results are compared with the literature. A numerical formulation for this problem is obtained using the Boundary Element Method (BEM). This method is powerful for solving certain acoustic problems in open fields, compared to differential methods. BEM reduces the size of the problem, thereby simplifying the input data to be processed and reducing the computational time.
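The reduction from the wave equation to the Helmholtz equation described above can be written out explicitly (a standard derivation; the e^{iωt} time convention and the symbols A, k are assumptions for illustration):

```latex
% Time-harmonic ansatz: p(\mathbf{x},t) = \operatorname{Re}\{P(\mathbf{x})\,e^{i\omega t}\}.
% Substituting into the linear acoustic wave equation
\nabla^{2} p - \frac{1}{c^{2}}\,\frac{\partial^{2} p}{\partial t^{2}} = 0
% yields the Helmholtz equation for the complex amplitude P:
\nabla^{2} P + k^{2} P = 0, \qquad k = \frac{\omega}{c} .
% For an infinite pulsating cylinder, the outgoing radial solution is
P(r) = A\, H_{0}^{(2)}(kr),
% where H_0^{(2)} is the Hankel function of the second kind
% (the outgoing wave under the e^{i\omega t} convention).
```

The constant A is fixed by the prescribed radial velocity on the cylinder surface, which is the boundary condition the BEM formulation discretizes.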
Abstract: The System Identification problem looks for a
suitably parameterized model, representing a given process. The
parameters of the model are adjusted to optimize a performance
function based on error between the given process output and
identified process output. The linear system identification field is
well established with many classical approaches whereas most of
those methods cannot be applied to nonlinear systems. The problem
becomes tougher if the system is completely unknown, with only the
output time series available. It has been reported that the capability
of Artificial Neural Networks to approximate all linear and nonlinear
input-output maps makes them predominantly suitable for the
identification of nonlinear systems where only the output time series
is available [1], [2], [4], [5]. The work reported here is an attempt
to implement a few of the well-known algorithms in the context of
modeling nonlinear systems, and to make a performance comparison to
establish their relative merits and demerits.
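A minimal illustration of this kind of neural identification (a toy plant and a hand-rolled one-hidden-layer network trained by stochastic gradient descent; none of this reproduces the paper's algorithms or data):

```python
import math
import random

random.seed(0)

# Toy plant to identify; in practice only its input/output data is known.
def plant(y_prev, u):
    return 0.6 * y_prev + 0.2 * u * u  # mildly nonlinear first-order system

# Generate identification data
us = [random.uniform(-1, 1) for _ in range(400)]
ys = [0.0]
for u in us:
    ys.append(plant(ys[-1], u))

# Series-parallel NARX model: y_hat[k+1] = net(y[k], u[k])
H = 8  # hidden units
W1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
W2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(W1[i][0] * x[0] + W1[i][1] * x[1] + b1[i]) for i in range(H)]
    return sum(W2[i] * h[i] for i in range(H)) + b2, h

def mse():
    return sum((forward((ys[k], us[k]))[0] - ys[k + 1]) ** 2
               for k in range(len(us))) / len(us)

lr = 0.05
before = mse()
for epoch in range(30):
    for k in range(len(us)):
        x = (ys[k], us[k])
        yhat, h = forward(x)
        err = yhat - ys[k + 1]
        # backpropagate the squared-error gradient by hand
        for i in range(H):
            gh = err * W2[i] * (1.0 - h[i] ** 2)
            W2[i] -= lr * err * h[i]
            W1[i][0] -= lr * gh * x[0]
            W1[i][1] -= lr * gh * x[1]
            b1[i] -= lr * gh
        b2 -= lr * err
after = mse()
```

The identification error on the training data drops substantially during training; the algorithms compared in the paper differ mainly in how this weight update is computed.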
Abstract: Pattern matching based on regular tree grammars has been widely used in many areas of computer science. In this paper, we propose a pattern matcher within the framework of code generation, based on a generic and formalized approach. According to this approach, parsers for regular tree grammars are adapted to a general pattern matching solution, rather than adapting the pattern matching to their parsing behavior. Hence, we first formalize the construction of the pattern matches with respect to input trees drawn from a regular tree grammar, in the form of so-called match trees. Then, we adopt a recently developed generic parser and tightly couple its parsing behavior with this construction. In addition to its generality, the resulting pattern matcher is characterized by its soundness and efficient implementation. This is demonstrated by the proposed theory and by the derived algorithms for its implementation. A comparison with similar and well-known approaches, such as the ones based on tree automata and LR parsers, has shown that our pattern matcher can be applied to a broader class of grammars, and achieves a better approximation of pattern matches in one pass. Furthermore, its use as a machine code selector is characterized by a minimized overhead, due to the balanced distribution of the cost computations into static ones, during parser generation time, and dynamic ones, during parsing time.
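To fix the terminology, here is a hedged sketch of plain tree pattern matching (a naive traversal with a wildcard, not the paper's grammar-based match-tree construction): trees are tuples of a label followed by child trees, and "_" matches any subtree, the way a code selector pattern like plus(const, _) would.

```python
# Naive tree pattern matching sketch. Trees are tuples: (label, child, ...);
# leaves are one-element tuples. The wildcard "_" matches any subtree.
# The tree labels below are hypothetical IR operators for illustration.

WILDCARD = "_"

def matches(pattern, tree):
    """Does the pattern match the subtree rooted here?"""
    if pattern == WILDCARD:
        return True
    if pattern[0] != tree[0] or len(pattern) != len(tree):
        return False
    return all(matches(p, t) for p, t in zip(pattern[1:], tree[1:]))

def match_sites(pattern, tree):
    """All subtrees of `tree` that the pattern matches, in one traversal."""
    hits = [tree] if matches(pattern, tree) else []
    for child in tree[1:]:
        hits.extend(match_sites(pattern, child))
    return hits

# plus(const, plus(const, reg)) matched against the pattern plus(const, _):
ir_tree = ("plus", ("const",), ("plus", ("const",), ("reg",)))
pattern = ("plus", ("const",), "_")
sites = match_sites(pattern, ir_tree)
```

A grammar-based matcher like the one proposed in the paper replaces this per-pattern traversal with a single parse that discovers all pattern matches (and their costs) at once.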