Abstract: ToolTracker is a client-server application: a catalogue of network monitoring and management tools available online. A database maintained on the server side holds information about the tools, and several clients can access and use this information simultaneously. The categories of tools considered for this application include packet sniffers, port mappers, port scanners, encryption tools, and vulnerability scanners. The application provides a front end through which the user can invoke any tool from a central repository for packet sniffing, port scanning, network analysis, and similar tasks. Alongside each tool, its description and associated help files are also stored in the central repository, so the user can view a tool's documentation without having to download and install the tool. The application keeps the central repository updated with the latest versions of the tools, informs the user when a newer version of the tool currently in use is available, and offers the choice of installing it. ToolTracker thus gives network administrators much-needed abstraction and ease of use with respect to the tools they can use to monitor a network efficiently.
Abstract: EGOTHOR is a search engine that indexes the Web and allows users to search Web documents. Its hit list contains the URL and title of each hit, together with a snippet that briefly shows the match. The snippet can almost always be assembled by an algorithm that has full knowledge of the original document (usually an HTML page), which implies that the search engine must store the full text of the documents as part of the index. Such a requirement leads us to choose an appropriate compression algorithm to reduce the space demand. One solution is to use common compression methods, for instance gzip or bzip2, but it may be preferable to develop a new method that takes advantage of the document structure, or rather, the textual character of the documents. Specialized text compression algorithms and methods for compressing XML documents already exist. The aim of this paper is to integrate the two approaches to achieve an optimal compression ratio.
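The general-purpose baseline the abstract mentions can be sketched with the standard-library codecs; the sample text below is hypothetical, standing in for a stored HTML document:

```python
import bz2
import gzip

# Hypothetical stand-in for a stored document: repetitive, text-like HTML.
doc = ("<p>The quick brown fox jumps over the lazy dog.</p> " * 200).encode("utf-8")

gz = gzip.compress(doc)
bz = bz2.compress(doc)

# Both general-purpose codecs exploit the redundancy of textual data,
# but neither uses the document structure explicitly, which is the gap
# the paper's structure-aware method aims to fill.
print(len(doc), len(gz), len(bz))
```

Either codec shrinks such text substantially; a structure-aware method would additionally model markup and word-level regularities.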
Abstract: In this paper, a one-dimensional numerical approach is used to study the effect of applying electrohydrodynamics on the temperature and species mass fraction profiles along a micro-combustor. The premixed mixture is H2-air with multi-step chemistry (9 species and 19 reactions). In micro-scale combustion, thermal and radical quenching mechanisms are important because of the increased area-to-volume ratio, and there is significant heat loss from the combustor walls. By inserting a number of electrodes into the micro-combustor and applying a high voltage to them, corona discharge occurs. The induced ions move toward neutral molecules and collide with them; this sets the molecules in motion and reattaches the flow to the walls. The increased velocity near the walls thins the wall boundary layer. Consequently, applying the electrohydrodynamic mechanism can enhance the temperature profile in the micro-combustor and ultimately prevent flame quenching.
Abstract: The aim of the present study is to analyze empirical research on the social resources dimension of the occupational status attainment process and relate it to the rational choice approach. The analysis suggests that the existing data on the strength-of-ties aspect of social resources are insufficient and do not allow any implications concerning rational actors' behavior. However, the results concerning the work-relation aspect are more encouraging.
Abstract: The data exchanged on the Web are of a different nature from those treated by classical database management systems; these data are called semi-structured because they do not have a regular, static structure like the data found in a relational database: their schema is dynamic and may contain missing data or types. This has raised the need for further techniques and algorithms to exploit and integrate such data and extract information relevant to the user. In this paper we present
the system OSIX (Osiris based System for Integration of XML
Sources). This system has a Data Warehouse model designed for the
integration of semi-structured data and more precisely for the
integration of XML documents. The architecture of OSIX relies on
the Osiris system, a DL-based model designed for the representation
and management of databases and knowledge bases. Osiris is a view-based data model whose indexing system supports semantic query optimization. We show that query processing on an XML source is optimized by the indexing approach proposed by Osiris.
Abstract: Representing objects in a dynamic domain is essential in commonsense reasoning under some circumstances. Classical logics and their nonmonotonic consequence relations, however, are usually unable to deal with reasoning about dynamic domains, because every constant in the logical language denotes some existing object in a static domain. In this paper, we explore a logical formalization that allows us to represent nonexisting objects in commonsense reasoning. A formal system named N-theory is proposed for this purpose, and its possible application in computer security is briefly discussed.
Abstract: With the rapid development of the life sciences and the flood of genomic information, the need for faster and more scalable searching methods has become urgent. One of the approaches that have been investigated is indexing. Indexing methods fall into three categories: length-based index algorithms, transformation-based algorithms, and mixed-technique algorithms. In this research, we focus on the transformation-based methods. We embed the N-gram method into the transformation-based method to build an inverted index table, and then apply parallel methods to speed up index building and to reduce the overall retrieval time when querying the genomic database. Our experiments show that the N-gram transformation algorithm is an economical solution, saving both time and space. The results show that the index is smaller than the dataset when the N-gram size is 5 or 6. The results for the parallel N-gram transformation algorithm indicate that parallel programming with large datasets is promising and can be improved further.
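The core idea of an N-gram inverted index over sequences can be sketched as follows; this is a minimal serial illustration with made-up sequence IDs, not the authors' parallel implementation:

```python
from collections import defaultdict

def build_ngram_index(sequences, n=5):
    """Inverted index mapping each length-n substring (N-gram) of a
    sequence to the set of IDs of sequences that contain it."""
    index = defaultdict(set)
    for seq_id, seq in sequences.items():
        for i in range(len(seq) - n + 1):
            index[seq[i:i + n]].add(seq_id)
    return index

def query(index, pattern, n=5):
    """Candidate sequences: those containing every N-gram of the pattern."""
    grams = [pattern[i:i + n] for i in range(len(pattern) - n + 1)]
    if not grams:
        return set()
    hits = set(index.get(grams[0], set()))
    for g in grams[1:]:
        hits &= index.get(g, set())
    return hits

# Hypothetical toy database of DNA fragments.
db = {"s1": "ACGTACGTGG", "s2": "TTTTACGTAC", "s3": "GGGGCCCCAA"}
idx = build_ngram_index(db, n=5)
print(query(idx, "ACGTA", n=5))  # prints the IDs containing ACGTA: s1 and s2
```

The index stores one posting set per distinct N-gram rather than per position, which is why it can end up smaller than the dataset for suitable N; each sequence's N-gram extraction is independent, which is what makes the build step easy to parallelize.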
Abstract: Recurrent event data are a special type of multivariate survival data. Dynamic models and frailty models are two approaches for dealing with this kind of data. We compare the two models using the empirical standard deviation of the standardized martingale residual processes as a way of assessing their fit, based on the Aalen additive regression model. We found that both approaches take heterogeneity into account and produce residual standard deviations close to each other, both in the simulation study and on the real data set.
Abstract: The study of tourist activities and the mapping of their routes in space and time have become important issues in tourism management. Here we represent space-time paths for the tourism industry by visualizing individual tourist activities and the paths followed using a 3D Geographic Information System (GIS). Considerable attention has been devoted to measuring accessibility to shopping, eating, walking, and other services at the tourist destination. It turns out that GIS is a useful tool for studying the spatial behavior of tourists in the area, and it is especially advantageous for space-time potential path area measures, in particular for the accurate visualization of possible paths through existing city road networks. This study applies space-time concepts with a detailed street network map obtained from Google Maps to measure tourist paths both spatially and temporally. These paths are further determined based on data obtained from map questionnaires covering the trip activities of 40 individuals. The analysis of the data makes it possible to determine the locations of the more popular paths. The results can be visualized using 3D GIS to show the areas and potential activity opportunities accessible to tourists during their travel time.
Abstract: Exclusive breastfeeding is the feeding of a baby on no milk other than breast milk. Exclusive breastfeeding during the first 6 months of life is of fundamental importance because it supports optimal growth and development during infancy and reduces the risk of debilitating diseases and problems. Moreover, in developed countries, exclusive breastfeeding has decreased the incidence and/or severity of diarrhea, lower respiratory infection, and urinary tract infection. In this paper, we study the factors that influence exclusive breastfeeding and use the generalized Poisson regression model to analyze the practice of exclusive breastfeeding in Mauritius. We develop two sets of quasi-likelihood equations (QLE) to estimate the parameters.
Abstract: Although many studies on assembly technology for bridge construction have dealt mostly with the pier, girder, or deck of the bridge, studies on prefabricated barriers have rarely been performed. To understand the structural characteristics and application of the concrete barrier in the modular bridge, which is an assembly of structural members, a static loading test was performed. The structural performance as a road barrier of three methods, conventional cast-in-place (ST), vertical bolt connection (BVC), and horizontal bolt connection (BHC), was evaluated and compared through analyses of load-displacement curves, steel strain curves, concrete strain curves, and the visual appearance of crack patterns. The vertical bolt connection (BVC) method demonstrated performance comparable to conventional cast-in-place (ST) while providing all the advantages of prefabricated technology. The need for future improvement in nut fastening as well as in legal standards and regulations is also addressed.
Abstract: This paper discusses the separation of miscible liquids by means of fractional distillation. For complete separation of the liquids, the processes of heating, condensation, separation, and storage are carried out automatically. A PIC microcontroller is used to control each process, including the storage process, by activating and deactivating the conveyors. The liquids are heated and, on reaching their respective boiling points, evaporate and enter the condensation chamber, where they return to liquid form. The liquids are then directed to their respective tanks by a stepper motor that moves in three directions, each movement corresponding to a different tank. When a tank is full, it sends a signal to the controller, which opens the solenoid valves, and the tank is emptied into a beaker below the nozzle. Once the beaker is filled, the nozzle closes and the conveyors come into operation, replacing the filled beaker with an empty one from behind. This work can be applied in the oil, chemical, and paint industries.
Abstract: This study was conducted to compare the models of two countries, Taiwan and Singapore, using the TIMSS database. We used multi-group hierarchical linear modeling techniques to compare the effects of the two country models and tested our hypotheses on 4,046 Taiwanese students and 4,599 Singaporean students in 2007 at two levels: the class level and the student (individual) level. Design quality is a class-level variable; the student-level variables are achievement and self-confidence. The results challenge the widely held view that retention has a positive impact on self-confidence. Suggestions for future research are discussed.
Abstract: Dengue is an infectious vector-borne viral disease commonly found in tropical and sub-tropical regions around the world, including Malaysia, especially in urban and semi-urban areas. There is currently no available vaccine or chemotherapy for the prevention or treatment of dengue. Therefore, prevention and treatment of the disease depend on vector surveillance and control measures. Disease risk mapping has been
recognized as an important tool in the prevention and control
strategies for diseases. The choice of statistical model used for
relative risk estimation is important as a good model will
subsequently produce a good disease risk map. Therefore, the aim of
this study is to estimate the relative risk for dengue disease based
initially on the most common statistic used in disease mapping called
Standardized Morbidity Ratio (SMR) and one of the earliest
applications of Bayesian methodology called Poisson-gamma model.
This paper begins by providing a review of the SMR method, which
we then apply to dengue data of Perak, Malaysia. We then fit an
extension of the SMR method, which is the Poisson-gamma model.
Both sets of results are displayed and compared using graphs, tables, and maps. The analysis shows that the latter method gives better relative risk estimates than the SMR. The Poisson-gamma model has been shown to overcome the problem the SMR has when there are no observed dengue cases in certain regions. However, covariate adjustment in this model is difficult, and there is no way to allow for spatial correlation between risks in adjacent areas. These drawbacks have motivated many researchers to propose alternative methods for estimating the risk.
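The SMR calculation itself is simple enough to sketch; the region names and counts below are hypothetical, not the Perak data used in the paper:

```python
# Hypothetical case counts and populations for three illustrative regions.
observed = {"A": 30, "B": 0, "C": 12}
population = {"A": 50_000, "B": 8_000, "C": 20_000}

# Expected counts under the overall (pooled) disease rate.
overall_rate = sum(observed.values()) / sum(population.values())
expected = {r: overall_rate * population[r] for r in observed}

# SMR_r = O_r / E_r. It is exactly 0 wherever no cases are observed,
# however small the population -- the instability that the Poisson-gamma
# model addresses by shrinking estimates toward the overall rate.
smr = {r: observed[r] / expected[r] for r in observed}
print({r: round(v, 2) for r, v in smr.items()})
```

Region B illustrates the zero-count problem: its SMR is 0 regardless of how little information its small population carries, whereas an empirical-Bayes Poisson-gamma estimate would borrow strength from the other regions.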
Abstract: The objective of this study is to propose an observer design for nonlinear systems using an augmented linear system derived by applying a formal linearization method. A given nonlinear differential equation is linearized by the formal linearization method, which is based on a Taylor expansion that retains higher-order terms, and the measurement equation is transformed into an augmented linear one. A linear estimation theory is then applied to this augmented linear system, yielding a nonlinear observer. As an application of this method, the problem of estimating the transient state of electric power systems is studied; numerical experiments indicate that this observer design performs remarkably well for nonlinear systems.
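The augmentation step can be sketched in general terms. This is a generic Carleman-style illustration under assumed notation (the matrices \(A_k\), \(A\), \(C\), \(L\) and the truncation order \(N\) are placeholders), not necessarily the paper's exact construction:

```latex
% Nonlinear dynamics expanded by Taylor series up to order N:
\dot{x} = f(x) \approx A_1 x + A_2 x^{(2)} + \cdots + A_N x^{(N)},
\qquad x^{(k)} = \underbrace{x \otimes \cdots \otimes x}_{k \text{ times}}
% Augmented state stacks the state with its higher-order terms,
% giving an (approximately) linear model:
z = \bigl(x,\; x^{(2)},\; \ldots,\; x^{(N)}\bigr), \qquad
\dot{z} \approx A z, \qquad y \approx C z
% A standard linear (Luenberger) observer then applies to the
% augmented system:
\dot{\hat{z}} = A \hat{z} + L\,\bigl(y - C \hat{z}\bigr)
```

Recovering the nonlinear state estimate amounts to reading off the first block of \(\hat{z}\).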
Abstract: This article presents a current-mode quadrature oscillator using a differential difference current conveyor (DDCC) and a voltage differencing transconductance amplifier (VDTA) as active elements. The proposed circuit is realized from a non-inverting lossless integrator and an inverting second-order low-pass filter. The oscillation condition and oscillation frequency can be electronically and orthogonally controlled via input bias currents. The circuit description is very simple, consisting of merely 1 DDCC, 1 VDTA, 1 grounded resistor, and 3 grounded capacitors. Using only grounded elements, the proposed circuit is suitable for IC implementation. The proposed oscillator has high output impedance, which makes it easy to cascade or to drive an external load without buffer devices. PSPICE simulation results are presented and agree well with the theoretical anticipation. The power consumption is approximately 1.76 mW at ±1.25 V supply voltages.
Abstract: In multiple sclerosis, pathological changes in the brain result in deviations in signal intensity on magnetic resonance images (MRI). Quantitative analysis of these changes and their correlation with clinical findings provide important information for diagnosis; this constitutes the objective of our work. A new approach is developed. After enhancing image contrast and extracting the brain with a mathematical morphology algorithm, we proceed to brain segmentation. Our approach builds a statistical model from the data itself for normal brain MRI, including clustering of tissue types. We then detect signal abnormalities (MS lesions) as a rejection class containing the voxels that are not explained by the built model. We validate the method on MR images of multiple sclerosis patients by comparing its results with those of human expert segmentation.
Abstract: Cameron Highlands is a mountainous area subject to torrential tropical showers. It extracts 5.8 million liters of water per day for drinking supply from its rivers at several intake points. The water quality of rivers in Cameron Highlands, however, has deteriorated significantly due to land clearing for agriculture, excessive use of pesticides and fertilizers, and construction activities in rapidly developing urban areas. These pollution sources, known as non-point pollution sources, are diverse and hard to identify, and therefore difficult to estimate. Hence, a Geographical Information System (GIS) was used to provide an extensive approach for evaluating land use and other mapping characteristics to explain the spatial distribution of non-point sources of contamination in Cameron Highlands. The method for assessing pollution sources was developed by using the Cameron Highlands Master Plan (2006-2010) to integrate GIS, databases, and pollution loads in the study area. The results show that the highest annual runoff is created by forest, 3.56 × 10^8 m3/yr, followed by urban development, 1.46 × 10^8 m3/yr. Furthermore, urban development causes the highest BOD load (1.31 × 10^6 kg BOD/yr), while agricultural activities and forest contribute the highest annual loads of phosphorus (6.91 × 10^4 kg P/yr) and nitrogen (2.50 × 10^5 kg N/yr), respectively. Therefore, best management practices (BMPs) are suggested to reduce the pollution level in the area.
Abstract: The vehicle routing problem (VRP) is a famous combinatorial optimization problem. Because of its well-known difficulty, metaheuristics are the most appropriate methods for tackling large, realistic instances. The goal of this paper is to highlight the key ideas for designing VRP metaheuristics according to the following criteria: efficiency, speed, robustness, and the ability to take advantage of the problem structure. Such elements can obviously be used to build solution methods for other combinatorial optimization problems, at least in the deterministic setting.
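A flavor of the local-search moves that sit inside most VRP metaheuristics can be given with the classic 2-opt improvement on a single route; the five-point instance below is hypothetical, and this is an illustration of the generic move, not any specific method from the paper:

```python
import math

def route_length(route, coords):
    """Total length of a closed tour visiting each index in `route` once."""
    return sum(math.dist(coords[route[i]], coords[route[(i + 1) % len(route)]])
               for i in range(len(route)))

def two_opt(route, coords):
    """Classic 2-opt local search: reverse a segment whenever doing so
    shortens the tour; repeat until no improving move remains. Moves like
    this are the workhorse inside VRP metaheuristics such as tabu search
    and simulated annealing."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(route) - 1):
            for j in range(i + 1, len(route)):
                cand = route[:i] + route[i:j + 1][::-1] + route[j + 1:]
                if route_length(cand, coords) < route_length(route, coords) - 1e-9:
                    route, improved = cand, True
    return route

# Hypothetical instance: depot 0 plus four customers.
coords = [(0, 0), (0, 2), (2, 2), (2, 0), (1, 3)]
start = [0, 2, 4, 1, 3]
tour = two_opt(start, coords)
print(tour, round(route_length(tour, coords), 2))
```

A metaheuristic wraps such a move in a strategy for escaping local optima, which is where the paper's design criteria (efficiency, speed, robustness, exploiting structure) come into play.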
Abstract: When the foundations of structures are subjected to cyclic loading with amplitudes less than their permissible load, concern often exists about the amount of uniform and non-uniform settlement of such structures. Storage tank foundations under repeated filling and discharging, and railway ballast courses under repeated transportation loads, are examples of such conditions. This paper deals with the effects of using a new generation of reinforcement, the grid-anchor, to reduce the permanent settlement of these foundations under different proportions of the ultimate load. Other items, such as the type and number of reinforcements as well as the number of loading cycles, are studied numerically. Numerical models were built using the Plaxis 3D Tunnel finite element code. The results show that, by using the grid-anchor and increasing the number of its layers in the same proportion as the applied cyclic load, the permanent settlement decreases by up to 42% relative to the unreinforced condition, depending on the number of reinforcement layers, the percentage of applied load, and the number of loading cycles; the number of loading cycles needed to reach a constant value of dimensionless settlement also decreases, by up to 20% relative to the unreinforced condition.