Abstract: Text-based games are expected to be low-resource
applications that deliver good performance compared with
graphics-intensive games. Nowadays, however, some online text-based
games do not offer performance that is acceptable to users.
Therefore, an online text-based game called Star_Quest has been
developed in order to analyze its behavior under different
performance measurements. Performance metrics such as throughput,
scalability, response time and page loading time are captured to
characterize the performance of the game. The load-testing
techniques employed are also described to demonstrate the viability
of our work. The results obtained are compared against accepted
performance levels in order to determine the performance level of
the game. The study reveals that the developed game managed to meet
all the performance objectives set forth.
Abstract: Many recent high-energy physics calculations involving
charm and beauty invoke the wave function at the origin (WFO) of
the meson bound state. Uncertainties in the charm and beauty quark
masses, and the different models for the potentials governing these
bound states, call for a simple numerical algorithm to evaluate the
WFO's of these bound states. We present a simple algorithm for this
purpose which provides WFO's with high precision compared with
similar values already obtained in the literature.
Abstract: We present a new method to reconstruct a temporally
coherent 3D animation from single- or multi-view RGB-D video data
using unbiased feature point sampling. Given RGB-D video data in
the form of a 3D point cloud sequence, our method first extracts
feature points using both color and depth information. In the
subsequent steps, these feature points are used to match two 3D
point clouds in consecutive frames independently of their
resolution. Our new motion-vector-based dynamic alignment method
then fully reconstructs a spatio-temporally coherent 3D animation.
We perform extensive quantitative validation using novel error
functions to analyze the results. We show that, despite the
limiting factors of temporal and spatial noise associated with
RGB-D data, it is possible to exploit temporal coherence to
faithfully reconstruct a 3D animation from RGB-D video data.
Abstract: Based on the feature that model disturbances and uncertainty are compensated dynamically in the auto-disturbance-rejection controller (ADRC), a new ADRC-based method is proposed for the decoupling control of dispenser longitudinal movement over a large flight envelope. Developed directly from the nonlinear model, ADRC is especially suitable for dynamic models with large disturbances. Furthermore, this scheme simplifies the design of the flight control system, since the structure and parameters of the controller need not change over the flight envelope. Simulation results over a large flight envelope show that the system achieves high dynamic and steady-state performance and that the controller is strongly robust.
Abstract: Control chart pattern recognition is one of the most important tools for identifying the process state in statistical process control. An abnormal process state can be classified by recognizing the unnatural patterns that arise from assignable causes. In this study, a wavelet-based neural network approach is proposed for the recognition of control chart patterns with various characteristics. The proposed control chart pattern recognizer comprises three stages. First, multi-resolution wavelet analysis is used to generate time-shape and time-frequency coefficients that carry detailed information about the patterns. Second, distance-based features are extracted by a bi-directional Kohonen network to obtain a reduced and robust representation. Third, a back-propagation network classifier is trained on these features. The accuracy of the proposed method is shown by a performance evaluation with numerical results.
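The abstract does not specify the wavelet family used in the first stage; as a minimal sketch, assuming the Haar wavelet (the simplest choice, not necessarily the authors'), multi-resolution coefficients could be collected into a feature vector like this:

```python
import math

def haar_step(signal):
    """One level of the Haar wavelet transform: returns
    (approximation, detail) coefficient lists of half length."""
    s = math.sqrt(2.0)
    approx = [(signal[2 * i] + signal[2 * i + 1]) / s
              for i in range(len(signal) // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / s
              for i in range(len(signal) // 2)]
    return approx, detail

def multiresolution_features(signal, levels):
    """Concatenate the detail coefficients of several resolution
    levels, plus the final approximation, as a feature vector."""
    features = []
    approx = list(signal)
    for _ in range(levels):
        approx, detail = haar_step(approx)
        features.extend(detail)
    features.extend(approx)
    return features
```

For a length-8 control chart window with two levels, this yields 4 + 2 details plus 2 approximations, i.e. 8 features; a classifier such as a back-propagation network would then be trained on these vectors.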
Abstract: The wind resource in the Italian site of Lendinara
(RO) is analyzed through a systematic anemometric campaign
performed on the top of the bell tower, at an altitude of over 100 m
above the ground. Both the average wind speed and the Weibull
distribution are computed. The resulting average wind velocity is in
accordance with the numerical predictions of the Italian Wind Atlas,
confirming the accuracy of the wind data extrapolation adopted to
evaluate the wind potential at altitudes above those of commonly
placed measurement stations.
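The two quantities computed in the campaign, the average wind speed and the Weibull distribution, can be estimated from a series of anemometer readings; a minimal sketch, assuming the common method-of-moments approximation for the Weibull shape parameter (the abstract does not state which fitting method was used):

```python
import math

def fit_weibull(speeds):
    """Average wind speed plus method-of-moments Weibull fit:
    shape k ~ (sigma/mu)^-1.086 (a standard empirical formula),
    scale c = mu / Gamma(1 + 1/k)."""
    n = len(speeds)
    mu = sum(speeds) / n                        # average wind speed (m/s)
    var = sum((v - mu) ** 2 for v in speeds) / n
    cv = math.sqrt(var) / mu                    # coefficient of variation
    k = cv ** -1.086                            # Weibull shape
    c = mu / math.gamma(1.0 + 1.0 / k)          # Weibull scale (m/s)
    return mu, k, c
```

Given hourly averages from the mast, `fit_weibull` returns the mean speed and the two Weibull parameters that summarize the site's wind resource.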
Abstract: Energy-efficient protocol design is the aim of current
research in the area of sensor networks, where limited power
resources impose energy conservation considerations. In this paper
we focus on Medium Access Control (MAC) protocols and, after an
extensive literature review, discuss two adaptive schemes. Of
these, adaptive-rate MACs, which were introduced for throughput
enhancement, show the potential to save energy, even more than
adaptive-power schemes. We then propose an allocation algorithm for
obtaining accurate and reliable results. Through a simulation study
we validate our claim and show the power saving of adaptive-rate
protocols.
Abstract: Parsing is important in Linguistics and Natural
Language Processing for understanding the syntax and semantics of a
natural language grammar. Parsing natural language text is
challenging because of problems such as ambiguity and inefficiency.
Moreover, the interpretation of natural language text depends on
context-based techniques. A probabilistic component is essential to
resolve ambiguity in both syntax and semantics, thereby increasing
the accuracy and efficiency of the parser. The Tamil language has
some inherent features that make it even more challenging. To
address these, a lexicalized and statistical approach is applied in
the parsing with the aid of a language model. Statistical models
mainly focus on the semantics of the language and suit
large-vocabulary tasks, whereas structural methods focus on syntax
and model small-vocabulary tasks. A statistical trigram-based
language model for Tamil with a medium vocabulary of 5,000 words
has been built. Though statistical parsing gives better performance
through trigram probabilities and a large vocabulary size, it has
some disadvantages: a focus on semantics rather than syntax, and a
lack of support for free word order and long-distance
relationships. To overcome these disadvantages, a structural
component must be incorporated into statistical language models,
which leads to hybrid language models. This paper attempts to build
a phrase-structured hybrid language model that resolves the
above-mentioned disadvantages. For the development of the hybrid
language model, a new part-of-speech tagset for Tamil with more
than 500 tags, offering wider coverage, has been developed. A
phrase-structured treebank covering 326 Tamil sentences and more
than 5,000 words has been developed. A hybrid language model has
been trained on the phrase-structured treebank using the
immediate-head parsing technique. A lexicalized and statistical
parser that employs this hybrid language model and immediate-head
parsing gives better results than pure grammar-based and
trigram-based models.
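The statistical component described above rests on trigram probabilities. A minimal sketch of maximum-likelihood trigram estimation over a tokenized corpus (the smoothing scheme, if any, is not stated in the abstract, so none is applied here):

```python
from collections import defaultdict

def train_trigram(sentences):
    """Count trigrams and their bigram histories over tokenized
    sentences, with <s> padding and an </s> end marker."""
    tri = defaultdict(int)
    bi = defaultdict(int)
    for tokens in sentences:
        padded = ["<s>", "<s>"] + tokens + ["</s>"]
        for i in range(len(padded) - 2):
            w1, w2, w3 = padded[i], padded[i + 1], padded[i + 2]
            tri[(w1, w2, w3)] += 1
            bi[(w1, w2)] += 1
    return tri, bi

def trigram_prob(tri, bi, w1, w2, w3):
    """Maximum-likelihood P(w3 | w1, w2); 0.0 for unseen histories."""
    if bi[(w1, w2)] == 0:
        return 0.0
    return tri[(w1, w2, w3)] / bi[(w1, w2)]
```

A hybrid model as described in the paper would combine these probabilities with the phrase-structure grammar induced from the treebank.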
Abstract: One of the most important capabilities expected from ERP systems is the integration of the various operations of the administrative, financial, commercial, human resources, and production departments of the consumer organization. It is also often necessary to integrate the new ERP system with the organization's legacy systems when implementing the ERP package. Without an appropriate software architecture to realize the required integration, ERP implementation becomes error-prone and time-consuming; in some cases, the implementation may even encounter serious risks. In this paper, we propose a new architecture that is based on the agent-oriented vision and supplies the integration expected from ERP systems using several independent but cooperating agents. Besides integration, which is the main issue of this paper, the presented architecture addresses some aspects of the intelligence and learning capabilities found in ERP systems.
Abstract: A decentralized eco-sanitation system is a promising and sustainable alternative to the century-old centralized conventional sanitation system. The decentralized concept relies on an environmentally and economically sound management of water, nutrient and energy fluxes. Source-separation systems for urban waste management collect the different solid waste and wastewater streams separately to facilitate the recovery of valuable resources (energy, nutrients) from wastewater. A resource recovery centre serving 20,000 people acts as the functional unit for the treatment of urban waste in a high-density community such as Singapore. The decentralized system includes urine treatment, co-digestion of faeces and food waste, and treatment of horticultural waste and the organic fraction of municipal solid waste in composting plants. A design model is developed to estimate the inputs and outputs in terms of materials and energy. The inputs of urine (yellow water, YW) and faeces (brown water, BW) are calculated from the daily mean production of urine and faeces by humans and the water consumption of a no-mix vacuum toilet (0.2 and 1 L of flushing water for urine and faeces, respectively). Food waste (FW) production is estimated at 150 g wet weight/person/day. The YW is collected and discharged by gravity into a tank. It was found that two days are required for urine hydrolysis and struvite precipitation. The maximum nitrogen (N) and phosphorus (P) recoveries are 150-266 kg/day and 20-70 kg/day, respectively. In contrast, BW and FW are mixed for co-digestion in a thermophilic acidification tank, and a decentralized/centralized methanogenic reactor is then used for biogas production. It is determined that 6.16-15.67 m3/h of methane is produced, equivalent to 0.07-0.19 kWh/ca/day. The digestion residues are treated with horticultural waste and the organic fraction of municipal waste in co-composting plants.
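The reported 0.07-0.19 kWh/ca/day can be reproduced from the stated methane flow for the 20,000-person unit, assuming an energy content of roughly 10 kWh per cubic metre of methane (an approximation of its lower heating value; the abstract does not state the conversion factor used):

```python
def energy_per_capita(methane_m3_per_h, people=20000, kwh_per_m3=10.0):
    """Daily per-capita energy from a methane flow rate.
    kwh_per_m3 ~ 10 approximates the lower heating value of methane."""
    return methane_m3_per_h * 24.0 * kwh_per_m3 / people

low = energy_per_capita(6.16)     # lower bound of the reported range
high = energy_per_capita(15.67)   # upper bound of the reported range
```

With these assumptions the two flow rates map to about 0.074 and 0.188 kWh/ca/day, consistent with the 0.07-0.19 range quoted above.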
Abstract: This paper presents a new approach to image
segmentation by applying the Pillar K-means algorithm. The
segmentation process includes a new mechanism for clustering the
elements of high-resolution images in order to improve precision
and reduce computation time. The system applies K-means clustering
to the image segmentation after optimization by the Pillar
algorithm. The Pillar algorithm considers the placement of pillars,
which should be located as far as possible from each other to
withstand the pressure distribution of a roof, as analogous to the
placement of centroids within the data distribution. The algorithm
is able to optimize K-means clustering for image segmentation with
respect to precision and computation time. It designates the
initial centroid positions by calculating the accumulated distance
metric between each data point and all previously selected
centroids, and then selects the data point with the maximum
accumulated distance as the next initial centroid. In this way all
initial centroids are distributed according to the maximum
accumulated distance metric. This paper evaluates the proposed
approach to image segmentation by comparison with the K-means and
Gaussian Mixture Model algorithms, over the RGB, HSV, HSL and
CIELAB color spaces. The experimental results demonstrate the
effectiveness of our approach in improving segmentation quality
with respect to precision and computation time.
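The initialization step described above, choosing each new centroid as the point with the maximum accumulated distance to all previously selected centroids, can be sketched as follows. The choice of the very first centroid (here, the point farthest from the grand mean) is an assumption, as the abstract does not fix it:

```python
import math

def dist(p, q):
    """Euclidean distance between two points given as tuples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def pillar_init(points, k):
    """Pillar-style initial centroid selection: each new centroid is
    the not-yet-chosen point with the largest accumulated distance
    to all previously chosen centroids."""
    mean = tuple(sum(c) / len(points) for c in zip(*points))
    centroids = [max(points, key=lambda p: dist(p, mean))]  # assumed seed
    acc = [0.0] * len(points)
    while len(centroids) < k:
        for i, p in enumerate(points):
            acc[i] += dist(p, centroids[-1])   # accumulate over centroids
        chosen = set(centroids)
        candidates = [i for i in range(len(points))
                      if points[i] not in chosen]
        centroids.append(points[max(candidates, key=lambda i: acc[i])])
    return centroids
```

Standard K-means would then be run from these centroids; the full Pillar algorithm additionally handles outliers and grid-based pre-processing, which this sketch omits.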
Abstract: Phase-contrast MR imaging methods are widely used
for the measurement of blood flow velocity components; other tools,
such as CT and ultrasound, are also used for velocity map detection
in intravascular studies. These data are used to derive flow
characteristics. Some clinical applications use the pressure
distribution in the diagnosis of intravascular disorders such as
vascular stenosis. In this paper, an approach to measuring the
intravascular pressure field from the velocity field obtained from
flow images is proposed. The method uses an algorithm to solve the
nonlinear Navier-Stokes equations, assuming blood to be an
incompressible Newtonian fluid. Flow images usually suffer from low
spatial resolution. Our aim is to assess the effect of spatial
resolution on the pressure distribution estimated by this method.
To this end, the velocity map of a numerical phantom is derived at
six different spatial resolutions. To determine the effects of
vascular stenoses on the pressure distribution, a stenotic phantom
geometry is considered. A comparison between the pressure
distribution obtained from the phantom and the pressure computed by
the algorithm is presented. In this regard we also compare the
effects of collocated and staggered computational grids on the
resulting pressure distribution.
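The paper solves the full nonlinear Navier-Stokes equations; as a drastically simplified illustration of recovering pressure information from a velocity map, for steady fully developed laminar flow the momentum balance reduces to dp/dx = mu * d2u/dy2, so a pressure gradient can be extracted from a measured velocity profile by finite differences (this 1D sketch is an assumption for illustration only, not the paper's method):

```python
def pressure_gradient(u, dy, mu):
    """Estimate dp/dx at interior grid points from a 1D velocity
    profile u(y), using dp/dx = mu * d2u/dy2 (steady, fully
    developed flow) with a central second difference."""
    return [mu * (u[i - 1] - 2.0 * u[i] + u[i + 1]) / dy ** 2
            for i in range(1, len(u) - 1)]

# synthetic Poiseuille profile u(y) = U * (1 - (y/h)^2)
U, h, mu, n = 1.0, 1.0, 0.004, 21   # mu ~ blood viscosity in Pa*s (assumed)
dy = 2.0 * h / (n - 1)
u = [U * (1.0 - (-h + i * dy) ** 2 / h ** 2) for i in range(n)]
dpdx = pressure_gradient(u, dy, mu)  # analytic value: -2*mu*U/h^2
```

Because the synthetic profile is quadratic, the central difference recovers the analytic gradient exactly; on real flow images, the spatial resolution of the velocity map limits the accuracy of such derivatives, which is precisely the effect the paper investigates.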
Abstract: Quantitative measurements of tumors in general, and of tumor volume in particular, become more realistic with the use of magnetic resonance imaging, especially when the tumor's morphological changes become irregular and difficult to assess by clinical examination. However, tumor volume estimation strongly depends on the image segmentation, which is fuzzy by nature. In this paper a fuzzy approach to tumor volume segmentation based on the fuzzy connectedness algorithm is presented. The fuzzy affinity matrix resulting from segmentation is then used to estimate a fuzzy volume based on a certainty parameter, an alpha cut, defined by the user. The proposed method was shown to strongly affect treatment decisions. A statistical analysis was performed to validate the results against a manual method of volume estimation, and the importance of using the alpha cut is further explained.
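The role of the user-defined alpha cut can be sketched with one common definition of a crisp volume from a fuzzy segmentation (an illustrative simplification; the paper derives its volume from the fuzzy affinity matrix):

```python
def alpha_cut_volume(memberships, alpha, voxel_volume_mm3=1.0):
    """Crisp tumor volume from a fuzzy segmentation: count the voxels
    whose fuzzy membership reaches the certainty threshold alpha,
    then scale by the per-voxel volume."""
    return sum(1 for m in memberships if m >= alpha) * voxel_volume_mm3
```

Raising alpha demands more certainty and shrinks the estimated volume, which is why the choice of alpha can influence treatment decisions.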
Abstract: Market competition, and the desire to gain advantages in a globalized market, drive companies towards innovation efforts. Project overload is an unpleasant phenomenon experienced by employees inside organizations trying to make the most efficient use of their resources in order to be innovative. But what are the impacts of project overload on an organization's innovation capabilities? Advanced engineering (AE) teams inside a major heavy equipment manufacturer are suffering from project overload in their quest for innovation. In this paper, agent-based modeling (ABM) is used to examine the current reality of the company context and of the AE team, and the opportunities and challenges for reducing the risk of project overload and moving towards innovation are identified. Project overload is likely to stifle innovation and creativity inside teams. On the other hand, motivation through properly challenging goals is likely to help individuals alleviate the negative aspects of a low level of project overload.
Abstract: Grid environments aggregate geographically
distributed resources. Grids come in three types: computational,
data and storage. This paper presents research on data grids. A
data grid provides and secures access to data drawn from many
heterogeneous sources. Users need not be concerned about where the
data are located, provided that they can access them. Metadata are
used to access data in a data grid. At present, application
metadata catalogues and the SRB middleware package are used in data
grids for metadata management. In this paper, updating,
streamlining and searching are made possible simultaneously and
rapidly through a classified table for preserving metadata and the
conversion of each table into numerous tables. Moreover, with
regard to the specific application, the most appropriate division
is determined. As a result of this technique, some requests can be
executed concurrently and in a pipelined fashion.
Abstract: School and university orientation interests a broad
and often poorly informed public. Technically, it is an important
multicriteria decision problem, which requires combining a great
deal of academic, professional and/or legal knowledge; this in turn
justifies software resorting to the techniques of Artificial
Intelligence. CORUS is an expert system for "Conseil et ORientation
Universitaire et Scolaire", based on a knowledge representation
language (KRL) with rules and objects, known as Ibn Rochd. CORUS
was developed with DéGSE, a cognitive engineering workbench that
supports this KRL. CORUS works out many acceptable solutions for
the case considered and retains the most satisfactory among them.
Several versions of CORUS have gradually extended its services.
Abstract: Predicting software quality during the development life cycle of a software project helps the development organization make efficient use of the available resources to produce a product of the highest quality. A "whether a module is faulty or not" approach can be used to predict the quality of a software module. A number of software quality prediction models based on genetic algorithms, artificial neural networks and other data mining algorithms are described in the literature. One promising approach to quality prediction is based on clustering techniques. Most quality prediction models based on clustering use the K-means, Mixture-of-Gaussians, Self-Organizing Map, Neural Gas or fuzzy K-means algorithm. All of these techniques require a predefined structure: the number of neurons or clusters must be known before the clustering process starts. With Growing Neural Gas, by contrast, there is no need to predetermine the number of neurons or the topology of the structure; it starts with a minimal neuron structure that is incremented during training until it reaches a user-defined maximum number of clusters. Hence, in this work we use Growing Neural Gas as the underlying clustering algorithm: it produces an initial set of labeled clusters from the training data set, and this set of clusters is then used to predict the quality of the software modules in the test data set. The best testing results show 80% accuracy in evaluating the quality of software modules. The proposed technique can therefore be used by programmers to evaluate the quality of modules during software development.
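The final prediction step, assigning a test module the quality label of its nearest labeled cluster, can be sketched as nearest-centroid classification (the Growing Neural Gas clustering itself is omitted, and the cluster centers and metric names below are hypothetical placeholders):

```python
import math

def nearest_cluster_label(module, clusters):
    """Predict the quality label of a software module (a vector of
    metrics) from labeled cluster centers: return the label of the
    closest center. `clusters` is a list of (center, label) pairs."""
    def d(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    return min(clusters, key=lambda c: d(module, c[0]))[1]

# hypothetical cluster centers over two module metrics
clusters = [((1.0, 2.0), "not faulty"), ((8.0, 9.0), "faulty")]
```

In the paper's pipeline, the centers and labels would come from Growing Neural Gas trained on labeled module metrics rather than being fixed by hand.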
Abstract: The goal of this article is to develop a robust and accurate numerical method for solving hyperbolic conservation laws in one and two dimensions. A hybrid numerical method is considered, coupling a cheap fourth-order total variation diminishing (TVD) scheme [1] in smooth regions with a robust seventh-order weighted essentially non-oscillatory (WENO) scheme [2] near discontinuities. High-order multi-resolution analysis is used to detect the high-gradient regions of the numerical solution, so that shocks are captured with the WENO scheme while smooth regions are computed with the fourth-order TVD scheme. For time integration, we use the third-order TVD Runge-Kutta scheme. The accuracy of the resulting hybrid high-order scheme is comparable with that of WENO, but at a significantly lower CPU cost. Numerical results demonstrate that the proposed scheme is comparable to the high-order WENO scheme and superior to the fourth-order TVD scheme, with the added advantages of simplicity and computational efficiency. Numerical tests are presented which show the robustness and effectiveness of the proposed scheme.
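The time integrator named above, the third-order TVD Runge-Kutta scheme, has a standard Shu-Osher form that can be sketched directly; the spatial operator `L` (the hybrid TVD/WENO discretization in the paper) is left as a user-supplied function:

```python
def tvd_rk3_step(u, L, dt):
    """One third-order TVD Runge-Kutta step (Shu-Osher form) for
    u' = L(u), with the state u and L(u) given as lists."""
    u1 = [ui + dt * li for ui, li in zip(u, L(u))]
    u2 = [0.75 * ui + 0.25 * (vi + dt * li)
          for ui, vi, li in zip(u, u1, L(u1))]
    return [ui / 3.0 + 2.0 / 3.0 * (vi + dt * li)
            for ui, vi, li in zip(u, u2, L(u2))]
```

On the linear test problem u' = -u, one step from u = 1 with dt = 0.1 reproduces exp(-0.1) to within about 1e-5, as expected for a third-order method.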
Abstract: Contour filter strips planted with perennial vegetation
can be used to improve surface and ground water quality by reducing
outflows of pollutants, such as NO3-N, and sediment from cropland
to a river or lake. Meanwhile, filter strips of perennial grasses
with biofuel potential also offer the economic benefit of producing
ethanol. In this study, the Soil and Water Assessment Tool (SWAT) model was
applied to the Walnut Creek Watershed to examine the effectiveness
of contour strips in reducing NO3-N outflows from crop fields to the
river or lake. Required input data include watershed topography,
slope, soil type, land-use, management practices in the watershed and
climate parameters (precipitation, maximum/minimum air
temperature, solar radiation, wind speed and relative humidity).
Numerical experiments were conducted to identify potential
subbasins in the watershed that have high water quality impact, and
to examine the effects of strip size and location on NO3-N reduction
in the subbasins under various meteorological conditions (dry,
average and wet). Variable sizes of contour strips (10%, 20%, 30%
and 50%, respectively, of a subbasin area) planted with perennial
switchgrass were selected for simulating the effects of strip size and
location on stream water quality. Simulation results showed that a
filter strip occupying 10%-50% of the subbasin area could lead to a
55%-90% NO3-N reduction in the subbasin during an average rainfall
year. Strips occupying 10-20% of the subbasin area were found to be
more efficient in reducing NO3-N when placed along the contour
than when placed along the river. The results of this study can
assist in cost-benefit analysis and decision-making in best water
resources management practices for environmental protection.
Abstract: In this paper, an efficient structural approach for
recognizing on-line handwritten digits is proposed. After the
digit is read from the user, the slope between adjacent nodes is
estimated and normalized. Based on the changes of sign of the
slope values, primitives are identified and extracted. These
primitives are represented by strings, and a finite state machine,
which contains the grammars of the digits, is then traced to
identify the digit. Finally, any remaining ambiguity is resolved.
Experiments showed that this technique is flexible and can achieve
high recognition accuracy for the shapes of the digits considered
in this work.
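The pipeline above, slope-sign changes to a primitive string to a finite-state-machine trace, can be sketched as follows. The primitive alphabet and the toy transition table are hypothetical stand-ins for the ones used in the paper:

```python
def slope_primitives(points):
    """Encode a stroke as a primitive string from the signs of the
    segment slopes: '+' rising, '-' falling, '0' flat; consecutive
    repeats are collapsed. Assuming the stroke is sampled left to
    right, the slope sign equals the sign of dy."""
    out = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dy = y1 - y0
        sym = "0" if dy == 0 else ("+" if dy > 0 else "-")
        if not out or out[-1] != sym:   # collapse repeated primitives
            out.append(sym)
    return "".join(out)

# toy grammar FSM: transitions[(state, primitive)] -> next state
TRANSITIONS = {("S", "-"): "A", ("A", "+"): "V"}  # accepts a V shape

def trace(fsm, string, start="S"):
    """Trace the primitive string through the FSM; return the final
    state, or None if no transition exists."""
    state = start
    for sym in string:
        state = fsm.get((state, sym))
        if state is None:
            return None
    return state
```

A real recognizer would use one grammar per digit and a richer primitive alphabet; this sketch only shows how a falling-then-rising stroke reduces to the string "-+" and is accepted by the V grammar.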