Abstract: The purpose of this paper is to consider the
introduction of online courses to replace the current classroom-based
staff training. The current training is practical, and must be
completed before access to the financial computer system is
authorized. The long-term objective is to measure the efficacy,
effectiveness, and efficiency of the training, and to establish whether
a transfer of knowledge back to the workplace has occurred. This
paper begins with an overview explaining the importance of staff
training in an evolving, competitive business environment and
defines the problem facing this particular organization. A summary
of the literature review is followed by a brief discussion of the
research methodology and objective. The implementation of the
alpha version of the online course is then described. This paper may
be of interest to those seeking insights into, or new theory regarding,
practical interventions of online learning in the real world.
Abstract: Data from 1731 Gentile di Puglia lambs, sired by 65 rams over a 5-year period, were analyzed using a mixed model to estimate variance components and heritability. The growth traits considered were: birth weight (BW), weight at 30 days of age (W30), and average daily gain from birth to 30 days of age (DG). Year of birth, sex of lamb, type of birth (single or twin), dam age at lambing, and farm were significant sources of variation for all the growth traits considered. The average lamb weights were 3.85±0.16 kg at birth and 9.57±0.91 kg at 30 days of age, and the average daily gain was 191±14 g. Estimates of heritability were 0.33±0.05, 0.41±0.06, and 0.16±0.05, respectively, for the same traits. These values suggest there is a good opportunity to improve Gentile di Puglia lambs by selecting animals for growth traits.
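For readers unfamiliar with the notation, heritability in a sire-model analysis of the kind described is conventionally estimated from the sire and residual variance components; a standard paternal half-sib formula (a textbook convention, not taken from this paper, so the authors' exact model may differ) is:

h^2 = \frac{4\,\sigma^2_s}{\sigma^2_s + \sigma^2_e}

where \sigma^2_s is the between-sire variance component and \sigma^2_e the residual variance; the factor 4 arises because paternal half-sibs share one quarter of the additive genetic variance.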
Abstract: A number of routing algorithms based on the learning
automata technique have been proposed for communication
networks. However, there has been little work on the effects of
variation in graph sparsity on the performance of these algorithms. In
this paper, a comprehensive study is conducted to investigate the
performance of LASPA, the first learning automata based solution to
dynamic shortest path routing, across different graph structures
with varying sparsity. The sensitivity of three main performance
parameters of the algorithm, namely the average number of processed
nodes, the number of scanned edges, and the average time per update, to variation in
graph sparsity is reported. Simulation results indicate that the LASPA
algorithm adapts well to sparsity variation in the graph structure
and clearly outperforms the existing dynamic and fixed
algorithms on these performance criteria.
Abstract: This study aims to clarify which constructions make it possible to improve the socio-cultural values of environments, and to obtain new knowledge for selecting development plans. The Contingent Valuation Method (CVM) is adopted as the method of evaluation. A university campus (hereafter CP) is selected as the case for the research, on account of its varied environments, institutions, and many users. Investigations were conducted from four points of view: the total value and the utility value of the whole CP environment, and the values of each environment existing in CP and of each development plan assumed for CP. Furthermore, respondents' attributes were also investigated. The following results were obtained. 1) Almost all of the total value of CP is composed of the utility value of direct use. 2) The environment and the development plan with the highest values are identified. 3) The development plan that improves environmental value the most is specified.
Abstract: In recent years, much research has been actively conducted on mining the exploding Web world, especially User Generated Content (UGC) such as
weblogs, for knowledge about various phenomena and events in the physical world, and Web services
based on such Web-mined knowledge have begun to be developed for
the public. However, there are few detailed investigations of how accurately Web-mined data reflect physical-world data, and it is
problematic to use Web-mined data in public Web services without sufficiently ensuring their accuracy. Therefore,
this paper introduces the simplest Web Sensor and a spatiotemporally-normalized
Web Sensor to extract spatiotemporal data about a target
phenomenon from weblogs retrieved by keyword(s) representing the
target phenomenon, and tries to validate the potential and reliability of the Web-sensed spatiotemporal data through four kinds of granularity
analyses of the correlation coefficients with per-day, per-region temperature, rainfall, snowfall,
and earthquake statistics from the Japan Meteorological
Agency as physical-world data: spatial granularity (a region's population
density), temporal granularity (time period, e.g., per day vs. per week), representation granularity (e.g., "rain" vs. "heavy rain"), and
media granularity (weblogs vs. microblogs such as tweets).
Abstract: The automobile industry is of great importance to the
Spanish economy (8.7% of the active Spanish population is
employed in this sector). It has also been one of
the principal sectors affected by the current economic crisis;
consequently, advertising budgets have been severely cut
(46.9% less over the reference period). These cuts
have driven a substantial change in advertising strategy (since
2007, advertising investment in the Internet has increased by 251.6%)
and a push to increase profitability. The growing use of social media by
consumers therefore makes online consumer conversations an
attractive additional format for automobile firms to promote
products at a lower cost. This research analyzes the relation between
social media activity and design in the car industry,
looking for relations between social-media-based design strategies
and sales, and for a channel of information through which companies
can learn consumer preferences. For this ongoing research,
information was collected longitudinally using panel data.
Managerial and research implications of the
findings are discussed.
Abstract: Cryptographic protocols are widely used in various
applications to provide secure communications. They are usually
represented as communicating agents that send and receive messages.
These agents use their knowledge to exchange information and
communicate with other agents involved in the protocol. An agent's
knowledge can be partitioned into explicit knowledge and procedural
knowledge. The explicit knowledge refers to the set of information
that is either proper to the agent or directly obtained from other
agents through communication. The procedural knowledge relates to
the set of mechanisms used to get new information from what is
already available to the agent.
In this paper, we propose a mathematical framework which specifies
the explicit knowledge of an agent involved in a cryptographic
protocol. Modelling this knowledge is crucial for the specification,
analysis, and implementation of cryptographic protocols. We also
report on a prototype tool that allows the representation and the
manipulation of the explicit knowledge.
Abstract: The state-of-the-art Bag of Words model in Content-Based
Image Retrieval has been used for years, but relevance
feedback strategies for this model have not been fully investigated. Being
inspired by text retrieval, the Bag of Words model can draw on the
wealth of knowledge and practices available in text retrieval. We
study and experiment with the relevance feedback model from text retrieval,
adapting it to image retrieval. The experiments show that the
techniques from text retrieval give good results for image retrieval
and that further improvement is possible.
Abstract: In this paper, we consider the effect of the initial
sample size on the performance of a sequential approach used
in selecting a good enough simulated system when the number
of alternatives is very large. We apply the sequential approach
to an M/M/1 queueing system under several parameter settings, with
different choices of the initial sample size, to explore the impact on
the performance of this approach. The results show that the choice
of the initial sample size does affect the performance of our selection
approach.
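As an illustrative sketch of the kind of experiment described, an M/M/1 simulation can be run with different initial sample sizes and the resulting estimates compared. The function below uses Lindley's recursion for successive waiting times; all names, parameter values, and sample sizes are assumptions for illustration, not the authors' actual setup:

```python
import random

def mm1_waiting_times(lam, mu, n, seed=0):
    """Simulate n customer waiting times in an M/M/1 queue using
    Lindley's recursion: W_{k+1} = max(0, W_k + S_k - A_{k+1})."""
    rng = random.Random(seed)
    w, waits = 0.0, []
    for _ in range(n):
        waits.append(w)
        service = rng.expovariate(mu)        # service time S_k
        interarrival = rng.expovariate(lam)  # next interarrival A_{k+1}
        w = max(0.0, w + service - interarrival)
    return waits

# Compare the mean-waiting-time estimate under different
# (hypothetical) initial sample sizes.
for n0 in (10, 100, 1000):
    waits = mm1_waiting_times(lam=0.8, mu=1.0, n=n0)
    print(n0, sum(waits) / len(waits))
```

With a small initial sample the estimate is dominated by the empty-queue start-up transient, which is one way the choice of initial sample size can bias a sequential selection procedure.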
Abstract: In this study, a longitudinal joint connection is
proposed for short-span slab-type modular bridges intended for rapid
construction. The slab-type modular bridge consists of a number of
precast slab modules and has joint connections between the
modules in the longitudinal direction of the bridge. A finite element
based parameter analysis was conducted to design the shape and the
dimensions of the longitudinal joint connection. The number of shear
keys within the joint, the height and depth of the shear key, the tooth angle,
and the spacing were considered as design parameters. Using the
local cracking load at the corner of the shear key and the
cross-sectional area of the joint, an efficiency factor was proposed to
evaluate the effectiveness of the longitudinal joint connection. The
dimensions of the shear key were determined by comparing the cracking
loads and the efficiency factors obtained from the finite element
analysis.
Abstract: In this paper, we investigate the strategic stochastic air traffic flow management problem, which seeks to balance airspace capacity and demand under weather disruptions. The goal is to reduce the need for myopic tactical decisions that do not account for probabilistic knowledge about the near-future states of the NAS. We present and discuss a scenario-based modeling approach, based on a time-space stochastic process, to depict weather disruption occurrences in the NAS. A solution framework is also proposed, along with a distributed implementation aimed at overcoming scalability problems. Issues related to this implementation are also discussed.
Abstract: EGOTHOR is a search engine that indexes the Web
and allows users to search Web documents. Its hit list contains the URL
and title of each hit, along with a snippet that briefly
shows a match. The snippet can almost always be assembled by an
algorithm that has full knowledge of the original document (mostly an
HTML page). This implies that the search engine is required to store
the full text of the documents as part of the index.
Such a requirement leads us to pick an appropriate compression
algorithm to reduce the space demand. One solution
could be to use common compression methods, for instance gzip or
bzip2, but it might be preferable to develop a new method that
takes advantage of the document structure, or rather, the textual
character of the documents.
There already exist special text compression algorithms and
methods for the compression of XML documents. The aim of this
paper is an integration of the two approaches to achieve an optimal
compression ratio.
Abstract: Air pollution is a major environmental health
problem, affecting developed and developing countries around the
world. Increasing amounts of potentially harmful gases and
particulate matter are being emitted into the atmosphere on a global
scale, resulting in damage to human health and the environment.
Petroleum-related air pollutants can have a wide variety of adverse
environmental impacts. In the crude oil production sector, there is a
strong need for thorough knowledge of the gaseous emissions resulting
from the daily flaring of associated gas of known composition
through combustion activities under several operating
conditions. This can help in controlling gaseous emissions from
flares and thus in protecting their immediate and distant
surroundings against environmental degradation.
The impacts of methane and non-methane hydrocarbon emissions
from flaring activities at oil production facilities in the Kuwait Oilfields
have been assessed through a screening study using records of flaring
operations taken at the gas and oil production sites, and by analyzing
available meteorological and air quality data measured at stations
located near anthropogenic sources. In the present study, the
Industrial Source Complex (ISCST3) Dispersion Model is used to
calculate the ground level concentrations of methane and non-methane
hydrocarbons emitted due to flaring all over the Kuwait
Oilfields.
Simulating the real hourly air quality in and around oil
production facilities in the State of Kuwait for the year 2006,
by inserting the respective source emission data into the ISCST3
software, indicates that the levels of non-methane hydrocarbons from
the flaring activities exceed the allowable ambient air standard set by
the Kuwait EPA. There is therefore a strong need to address this acute
problem to minimize the impact of methane and non-methane hydrocarbons
released from flaring activities over the urban areas of Kuwait.
Abstract: The data exchanged on the Web are of a different nature
from those handled by classical database management systems;
these data are called semi-structured data, since they do not have the
regular, static structure of data found in a relational database:
their schema is dynamic and may contain missing data or types.
This raises the need to develop further techniques and
algorithms to exploit and integrate such data and to extract
information relevant to the user. In this paper we present
the system OSIX (Osiris based System for Integration of XML
Sources). This system has a Data Warehouse model designed for the
integration of semi-structured data, and more precisely for the
integration of XML documents. The architecture of OSIX relies on
the Osiris system, a DL-based model designed for the representation
and management of databases and knowledge bases. Osiris is a view-based
data model whose indexing system supports semantic query
optimization. We show that the problem of query processing on an
XML source is optimized by the indexing approach proposed by
Osiris.
Abstract: Representing objects in a dynamic domain is essential
in commonsense reasoning under some circumstances. Classical logics
and their nonmonotonic consequences, however, are usually unable
to deal with reasoning about dynamic domains, due to the fact that
every constant in the logical language denotes some existing object
in the static domain. In this paper, we explore a logical formalization
that allows us to represent nonexisting objects in commonsense
reasoning. A formal system named N-theory is proposed for this
purpose, and its possible application in computer security is briefly
discussed.
Abstract: Although many studies on assembly technology for
bridge construction have dealt mostly with the pier, girder, or
deck of the bridge, studies on the prefabricated barrier have rarely been
performed. To understand the structural characteristics and
application of the concrete barrier in the modular bridge, which is an
assembly of structural members, a static loading test was performed.
The structural performance as a road barrier of three methods,
conventional cast-in-place (ST), vertical bolt connection (BVC), and
horizontal bolt connection (BHC), was evaluated and compared
through analyses of load-displacement curves, steel strain curves,
concrete strain curves, and the visual appearance of crack
patterns. The vertical bolt connection (BVC) method demonstrated
performance comparable to conventional
cast-in-place (ST) while providing all the advantages of prefabricated
technology. The need for future improvements in nut
fastening, as well as in legal standards and regulations, is also
addressed.
Abstract: In recent years, a number of works proposing the
combination of multiple classifiers to produce a single
classification have been reported in the remote sensing literature. The
resulting classifier, referred to as an ensemble classifier, is
generally found to be more accurate than any of the individual
classifiers making up the ensemble. As accuracy is the primary
concern, much of the research in the field of land cover
classification is focused on improving classification accuracy. This
study compares the performance of four ensemble approaches
(boosting, bagging, DECORATE, and random subspace) with a
univariate decision tree as the base classifier. Two training datasets,
one without any noise and the other with 20 percent noise, were used to
judge the performance of the different ensemble approaches. Results
with the noise-free dataset suggest an improvement of about 4% in
classification accuracy with all ensemble approaches in
comparison to the results provided by the univariate decision tree
classifier. The highest classification accuracy, 87.43%, was achieved
by the boosted decision tree. A comparison of results with the noisy
dataset suggests that the bagging, DECORATE, and random subspace
approaches work well with this data, whereas the performance of the
boosted decision tree degrades, achieving a classification accuracy of
79.7%, which is even lower than the 80.02% achieved by the
unboosted decision tree classifier.
Abstract: The modeling paradigm places models at the center of the development process. These models are represented by languages such as UML, the language standardized by the OMG, which has become essential for development. Likewise, the ontology engineering paradigm places ontologies at the center of the development process; in this paradigm, OWL is the principal language for knowledge representation. Building ontologies from scratch is generally a difficult task. Bridges between UML and OWL appear in several respects, such as classes and associations. In this paper, we exploit the convergence between UML and OWL to propose an approach, based on meta-modelling and graph grammars and registered in the MDA architecture, for the automatic generation of OWL ontologies from UML class diagrams. The transformation is based on transformation rules; the level of abstraction in these rules is close to the application level, in order to produce usable ontologies. We illustrate this approach with an example.
Abstract: In many buildings we rely on large footings to provide
structural stability. Designers often compensate for the lack of
knowledge available with regard to foundation-soil interaction by
furnishing structures with overly large footings. This may lead to a
significant increase in building expenditures if many large
foundations are present. This paper describes the interface material
law that governs the behavior along the contact surface of adjacent
materials, and the behavior of a large foundation under ultimate limit
loading. A case study is chosen that represents a common
foundation-soil system frequently used in general practice and
therefore relevant to other structures. Investigations include
compressing versus uplifting wind forces, alterations to the
foundation size and subgrade composition, the role of the slab's
presence and stiffness, and the effect of commonly used structural
joints and connections. These investigations aim to provide the
reader with an objective design approach, efficiently preventing
structural instability.
Abstract: The purpose of this study was to evaluate the tissue
composition and carcass muscularity of 32 legs of Ile de France
lambs fed diets containing sunflower seeds and vitamin E,
lodged in individual pens at a mean body weight of 15 kg and
slaughtered at 32 kg of body weight. The treatments influenced
(P0,05) by the treatments. The
interaction of sunflower seeds and vitamin E was positive for total bone
weight and intermuscular fat.