Abstract: In this paper we propose a comparison of four content-based objective metrics with the results of subjective tests on 80 video sequences. We also include two objective metrics, VQM and SSIM, in our comparison to serve as “reference” objective metrics, because their pros and cons have already been published. Each video sequence was preprocessed by the region recognition algorithm and then the particular objective video quality metrics were calculated, i.e. mutual information, angular distance, moment of angle and the normalized cross-correlation measure. The Pearson coefficient was calculated to express each metric's relationship to the accuracy of the model, and the Spearman rank order correlation coefficient to represent its relationship to monotonicity. The results show that the model with mutual information as the objective metric provides the best results and is suitable for evaluating the quality of video sequences.
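These two figures of merit are easy to state concretely. The sketch below computes both in plain Python; the metric and subjective scores are invented for illustration and are not the paper's data.

```python
# Pearson correlation measures prediction accuracy (linearity) and the
# Spearman rank order coefficient measures monotonicity, as in the
# evaluation described above.  Dependency-free sketch.
from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def spearman(x, y):
    # Rank the data (no ties in this sketch), then correlate the ranks.
    rank = lambda v: [sorted(v).index(e) + 1 for e in v]
    return pearson(rank(x), rank(y))

# Hypothetical per-sequence scores (not the paper's data):
objective = [0.91, 0.85, 0.78, 0.66, 0.52, 0.43]   # e.g. mutual information
subjective = [4.6, 4.2, 3.9, 3.1, 2.5, 2.2]        # e.g. mean opinion scores
print(round(pearson(objective, subjective), 3),
      round(spearman(objective, subjective), 3))
```

A metric whose scores rank sequences in the same order as the viewers gives a Spearman coefficient of 1 even when its scale is nonlinear; Pearson additionally rewards a linear relationship.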
Abstract: Reducing river sediments through path correction and
preservation of river walls leads to a considerable reduction of
sedimentation at pumping stations. Path correction and the
preservation of walls are not limited to one particular method;
depending on various conditions, a combination of several methods
can be employed. In this article, we review and evaluate methods
for the preservation of river banks in order to reduce sediments.
Abstract: Tracing and locating the geographical location of users (Geolocation) is used extensively in today's Internet. Whenever we, e.g., request a page from Google, we are - unless a specific configuration has been made - automatically forwarded to the page in the relevant language and, among other things, presented with specific advertisements that depend on our identified location. Geolocation has a significant impact especially in the area of network security. Because of the way the Internet works, attacks can be executed from almost anywhere. Therefore, for attribution, knowledge of the origin of an attack - and thus Geolocation - is mandatory in order to be able to trace back an attacker. In addition, Geolocation can also be used very successfully to increase the security of a network during operation (i.e. before an intrusion has actually taken place). Similar to greylisting for e-mails, Geolocation allows one to (i) correlate detected attacks with new connections and (ii) consequently classify traffic a priori as more suspicious (in particular, allowing this traffic to be inspected in more detail). Although numerous techniques for Geolocation exist, each strategy is subject to certain restrictions. Following the ideas of Endo et al., this publication tries to overcome these shortcomings with a combined solution of different methods to allow improved and optimized Geolocation. Thus, we present our architecture for improved Geolocation, designing a new algorithm that combines several Geolocation techniques to increase accuracy.
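As a toy illustration of the combination idea (not the algorithm actually proposed here), several techniques' position estimates can be fused by confidence-weighted averaging:

```python
# Combine several geolocation estimates by weighting each technique's
# (lat, lon) guess by a confidence score.  Purely illustrative; the
# paper's combined algorithm is not reproduced here.
def combine(estimates):
    """estimates: list of (lat, lon, confidence) from different techniques.
    Naive averaging is fine for nearby guesses; it would need extra care
    for points straddling the antimeridian."""
    total = sum(c for _, _, c in estimates)
    lat = sum(la * c for la, _, c in estimates) / total
    lon = sum(lo * c for _, lo, c in estimates) / total
    return lat, lon

# Hypothetical outputs of, e.g., a GeoIP database, a delay-based
# measurement and a whois lookup, each with a confidence weight:
print(combine([(52.52, 13.40, 0.6), (52.40, 13.10, 0.3), (53.00, 14.00, 0.1)]))
```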
Abstract: Due to a high unemployment rate among local people
and a high reliance on expatriate workers, the governments in the
Gulf Co-operation Council (GCC) countries have been implementing
programmes of localisation (replacing foreign workers with GCC
nationals). These programmes have been successful in the public
sector but much less so in the private sector. However, there are now
insufficient jobs for locals in the public sector and the onus to provide
employment has fallen on the private sector. This paper is concerned
with a study, which is a work in progress (certain elements are
complete but not the whole study), investigating the effective
implementation of localisation policies in four- and five-star hotels in
the Kingdom of Saudi Arabia (KSA) and the United Arab Emirates
(UAE). The purpose of the paper is to identify the research gap, and
to present the need for the research. Further, it will explain how this
research was conducted.
Studies of localisation in the GCC countries are under-represented
in scholarly literature. Currently, the hotel sectors in KSA and UAE
play an important part in the countries’ economies. However, the
total proportion of Saudis working in the hotel sector in KSA is
slightly under 8%, and in the UAE, the hotel sector remains highly
reliant on expatriates. There is therefore a need for research on
strategies to enhance the implementation of the localisation policies
in general and in the hotel sector in particular.
Further, despite the importance of the hotel sector to their
economies, there remains a dearth of research into the
implementation of localisation policies in this sector. Indeed, as far as
the researchers are aware, there is no study examining localisation in
the hotel sector in KSA, and few in the UAE. This represents a
considerable research gap.
Regarding how the research was carried out, a multiple case study
strategy was used. The four- and five-star hotel sector in KSA is one
of the cases, while the four- and five-star hotel sector in the UAE is
the other case. Four- and five-star hotels in KSA and the UAE were
chosen as these countries have the longest established localisation
policies of all the GCC states and there are more hotels of these
classifications in these countries than in any of the other Gulf
countries. A literature review was carried out to underpin the
research. The empirical data were gathered in three phases. In order
to gain a pre-understanding of the issues pertaining to the research
context, Phase I involved eight unstructured interviews with officials
from the Saudi Commission for Tourism and Antiquities (three
interviewees); the Saudi Human Resources Development Fund (one);
the Abu Dhabi Tourism and Culture Authority (three); and the Abu
Dhabi Development Fund (one).
In Phase II, a questionnaire was administered to 24 managers and
24 employees in four- and five-star hotels in each country to obtain
their beliefs, attitudes, opinions, preferences and practices concerning
localisation.
Unstructured interviews were carried out in Phase III with six
managers in each country in order to allow them to express opinions
that may not have been explored in sufficient depth in the
questionnaire. The interviews in Phases I and III were analysed using
thematic analysis and SPSS will be used to analyse the questionnaire
data.
It is recommended that future research be undertaken on a larger
scale, with a larger sample taken from all over KSA and the UAE
rather than from only four cities (i.e., Riyadh and Jeddah in KSA and
Abu Dhabi and Sharjah in the UAE), as was the case in this research.
Abstract: Automatic currency note recognition invariably
depends on the currency note characteristics of a particular country,
and the extraction of features directly affects the recognition ability.
Sri Lanka has not previously been involved in any research or
implementation of this kind. The proposed system, “SLCRec”, offers
a solution focused on minimizing the false rejection of notes.
Sri Lankan currency notes undergo severe changes in image quality
during usage. Hence, a special linear transformation function is
adopted to wipe out noise patterns from backgrounds without
affecting the notes' characteristic images, and to restore the images
of interest. The transformation maps the original gray-scale range
into a smaller range of 0 to 125. Applying edge detection after the
transformation provided better robustness to noise and a fair
representation of edges for both new and old damaged notes. A
three-layer back-propagation neural network is presented with the
number of edges detected in row order of the notes, and
classification is accepted into four classes of interest: the 100, 500,
1000 and 2000 rupee notes. The experiments showed good
classification results and proved that the proposed methodology is
capable of separating the classes properly under varying image
conditions.
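The gray-scale compression step can be sketched as follows, assuming a simple linear scaling into [0, 125] (the paper's exact transformation function is not given) and a crude per-row difference count standing in for a real edge detector:

```python
# Sketch of the described preprocessing: compress the gray-scale range
# into [0, 125], then count edges per row.  The linear scaling and the
# threshold-based edge test are illustrative assumptions, not the
# paper's actual functions.
def compress_gray(pixel):
    """Map a gray level in [0, 255] to the smaller range [0, 125]."""
    return pixel * 125 // 255

def row_edge_counts(image, threshold=10):
    """Count, per row, the positions where adjacent pixels differ strongly."""
    counts = []
    for row in image:
        mapped = [compress_gray(p) for p in row]
        counts.append(sum(1 for a, b in zip(mapped, mapped[1:])
                          if abs(a - b) > threshold))
    return counts

toy = [[0, 0, 255, 255], [10, 12, 11, 13]]   # hypothetical 2x4 note crop
print(row_edge_counts(toy))
```

The resulting per-row edge counts are the kind of feature vector the abstract describes feeding to the back-propagation network.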
Abstract: Neonatal lupus erythematosus (NLE) is a rare disease marked by characteristic clinical findings and specific maternal autoantibodies. Cutaneous, cardiac, hepatic, and hematological manifestations may occur, affecting a single organ or multiple organs. In the cases reported here, both babies were premature, of low birth weight (LBW), small for gestational age (SGA), and born by caesarean section to mothers with systemic lupus erythematosus (SLE). In the first case, we found a baby girl with dyspnea and grunting. A chest X-ray showed grade I respiratory distress syndrome (RDS), and echocardiography showed a small atrial septal defect (ASD) and ventricular septal defect (VSD). She also developed anemia, thrombocytopenia, elevated C-reactive protein, hypoalbuminemia, increased coagulation factors, hyperbilirubinemia, and a blood culture positive for Klebsiella pneumoniae. Anti-Ro/SSA and anti-nRNP/Sm were positive. Intravenous fluids, antibiotics, and transfusions of blood, thrombocyte concentrate, and fresh frozen plasma were given. The second baby, a boy, presented with necrotic tissue on the left ear and skin rashes, erythematous maculae, atrophic scarring, and hyperpigmentation of various sizes all over his body, with facial haemorrhage. He also suffered from thrombocytopenia, mildly elevated transaminase enzymes, and hyperbilirubinemia, and anti-Ro/SSA was positive. Intravenous fluids, methylprednisolone, intravenous immunoglobulin (IVIG), and transfusions of blood and thrombocyte concentrate were given. Two cases of neonatal lupus erythematosus have been presented. Diagnosis was based on the clinical presentation and maternal autoantibodies in the neonate. Organ involvement in NLE can occur as single or multiple manifestations.
Abstract: This paper introduces two decoders for binary linear
codes based on metaheuristics. The first one uses a genetic algorithm,
and the second is based on a combination of a genetic algorithm with
a feed-forward neural network. The decoder based on genetic
algorithms (DAG) applied to BCH and convolutional codes gives good
performance compared to the Chase-2 and Viterbi algorithms,
respectively, and reaches the performance of OSD-3 for some
Quadratic Residue (QR) codes. This algorithm is less complex for
linear block codes of large block length; furthermore, its performance
can be improved by tuning the decoder's parameters, in particular the
number of individuals per population and the number of generations.
In the second algorithm the search space, in contrast to DAG, which
was limited to the codeword space, now covers the whole binary
vector space. It tries to avoid a great number of coding operations
by using a neural network. This greatly reduces the complexity of
the decoder while maintaining comparable performance.
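The DAG idea can be illustrated on a toy code. The sketch below evolves candidate information words for the Hamming(7,4) code, scoring each by the Hamming distance between its codeword and the received word; the code, parameters and population scheme are illustrative only, not the paper's configuration.

```python
# Genetic-algorithm decoding sketch: fitness is the negative Hamming
# distance between a candidate's codeword and the received word.
import random

G = [[1, 0, 0, 0, 1, 1, 0],   # generator matrix of the Hamming(7,4) code
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

def encode(msg):
    """Multiply the 4-bit message by G over GF(2)."""
    return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

def fitness(msg, received):
    return -sum(a != b for a, b in zip(encode(msg), received))

def ga_decode(received, generations=30, seed=0):
    rng = random.Random(seed)
    # For this toy 4-bit message space we can seed the population with
    # every message; a real DAG-style decoder samples randomly from a
    # space far too large to enumerate.
    pop = [[(i >> k) & 1 for k in range(4)] for i in range(16)]
    for _ in range(generations):
        pop.sort(key=lambda m: fitness(m, received), reverse=True)
        elite = pop[:8]                      # keep the best half
        children = []
        while len(children) < len(pop) - len(elite):
            a, b = rng.sample(elite, 2)
            cut = rng.randint(1, 3)          # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:           # occasional bit-flip mutation
                child[rng.randrange(4)] ^= 1
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda m: fitness(m, received))

msg = [1, 0, 1, 1]
received = encode(msg)
received[2] ^= 1                             # one transmission error
print(ga_decode(received))                   # recovers the sent message
```

Hamming(7,4) has minimum distance 3, so the transmitted message is the unique fitness optimum after a single bit flip, and the GA retains it through elitist selection.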
Abstract: Macrophomina phaseolina is a devastating soil-borne
fungal plant pathogen that causes charcoal rot disease in many
economically important crops worldwide. So far, no registered
fungicide is available against this plant pathogen. This study was
planned to examine the antifungal activity of an allelopathic grass
Cenchrus pennisetiformis (Hochst. & Steud.) Wipff. for the
management of M. phaseolina isolated from cowpea [Vigna
unguiculata (L.) Walp.] plants suffering from charcoal rot disease.
Different parts of the plants viz. inflorescence, shoot and root were
extracted in methanol. Laboratory bioassays were carried out using
different concentrations (0, 0.5, 1.0, …, 3.0 g mL⁻¹) of methanolic
extracts of the test allelopathic grass species to assess the antifungal
activity against the pathogen. In general, extracts of all parts of the
grass exhibited antifungal activity. All the concentrations of
methanolic extracts of shoot and root significantly reduced fungal
biomass by 20–73% and 40–80%, respectively. Methanolic shoot
extract was fractionated using n-hexane, chloroform, ethyl acetate
and n-butanol. Different concentrations of these fractions (3.125,
6.25, …, 200 mg mL⁻¹) were analyzed for their antifungal activity.
All the concentrations of n-hexane fraction significantly reduced
fungal biomass by 15–96% over corresponding control treatments.
Higher concentrations (12.5–200 mg mL⁻¹) of chloroform, ethyl
acetate and n-butanol also reduced the fungal biomass significantly
by 29–100%, 46–100% and 24–100%, respectively.
Abstract: Titanium alloys like Ti-6Al-2Sn-4Zr-6Mo (Ti-6246)
are widely used in aerospace applications. Component
manufacturing, however, is difficult and expensive, as their
machinability is extremely poor. A thorough understanding of the
chip formation process is needed to improve related metal cutting
operations. In the current study, orthogonal cutting experiments were
performed and the resulting chips were analyzed by optical
microscopy and scanning electron microscopy. Chips from a Ti-6246
ingot were produced at different cutting speeds and cutting depths.
During the experiments, depending on the cutting conditions,
continuous or segmented chips were formed. Narrow, highly
deformed and grain-oriented zones, the so-called shear zones,
separated the individual segments. Different material properties were
measured in the shear zones and the segments.
Abstract: The modeling and simulation of biochemical reactions is of great interest in the context of systems biology. The central dogma of this re-emerging area states that it is the system dynamics and organizing principles of complex biological phenomena that give rise to the functioning and function of cells. Cell functions such as growth, division, differentiation and apoptosis are temporal processes that can be understood if they are treated as dynamic systems. Systems biology focuses on an understanding of functional activity from a system-wide perspective and, consequently, it is defined by two key questions: (i) how do the components within a cell interact, so as to bring about its structure and functioning? (ii) how do cells interact, so as to develop and maintain higher levels of organization and function? In recent years, wet-lab biologists have embraced mathematical modeling and simulation as two essential means toward answering the above questions. The credo of dynamical systems theory is that the behavior of a biological system is given by the temporal evolution of its state. Our understanding of the time behavior of a biological system can be measured by the extent to which a simulation mimics the real behavior of that system. Deviations of a simulation indicate either limitations or errors in our knowledge. The aim of this paper is to summarize and review the main conceptual frameworks in which models of biochemical networks can be developed. In particular, we review the stochastic molecular modeling approaches, reporting the principal conceptualizations suggested by A. A. Markov, P. Langevin, A. Fokker, M. Planck, D. T. Gillespie, N. G. van Kampen, and recently by D. Wilkinson, O. Wolkenhauer, P. Sjöberg, and by the author.
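Among the frameworks reviewed, Gillespie's stochastic simulation algorithm is easy to state concretely. A minimal sketch for a single degradation reaction X → ∅ with rate constant k (toy parameters, not taken from the paper):

```python
# Gillespie's stochastic simulation algorithm for one reaction channel:
# repeatedly sample an exponential waiting time from the current
# propensity, then fire the reaction and update the state.
import random

def gillespie_decay(x0, k, t_end, seed=1):
    rng = random.Random(seed)
    t, x = 0.0, x0
    trajectory = [(0.0, x0)]
    while t < t_end and x > 0:
        propensity = k * x                  # a(x) for X -> 0
        t += rng.expovariate(propensity)    # exponential waiting time
        x -= 1                              # fire the degradation reaction
        trajectory.append((t, x))
    return trajectory

traj = gillespie_decay(x0=100, k=0.1, t_end=50.0)
print(traj[-1])
```

Each run is one realization of the underlying Markov jump process; averaging many runs approximates the chemical master equation's probability distribution.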
Abstract: Classifying biomedical literature is a difficult and
challenging task, especially when a large number of biomedical
articles must be organized into a hierarchical structure. In this paper,
we present an approach for classifying a collection of biomedical text
abstracts downloaded from the Medline database with the help of
ontology alignment. To accomplish our goal, we construct two types
of hierarchies, the OHSUMED disease hierarchy and the Medline
abstract disease hierarchies, from the OHSUMED dataset and the
Medline abstracts, respectively. Then, we enrich the OHSUMED
disease hierarchy before adapting it to the ontology alignment process
for finding probable concepts or categories. Subsequently, we compute
the cosine similarity between the vector of each probable concept (in
the “enriched” OHSUMED disease hierarchy) and the vector in the
Medline abstract disease hierarchies. Finally, we assign a category to
each new Medline abstract based on the similarity score. The results
obtained from the experiments show that the performance of our
proposed approach for hierarchical classification is slightly better than
that of multi-class flat classification.
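The similarity step can be sketched directly; the term vectors below are toy counts over a shared vocabulary, not data from OHSUMED or Medline:

```python
# Assign a new abstract to the concept whose vector has the highest
# cosine similarity, as in the approach described above.  Vectors and
# concept names are invented for illustration.
from math import sqrt

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

concepts = {                      # hypothetical concept term vectors
    "cardiovascular": [3, 0, 1, 0],
    "neoplasm":       [0, 4, 0, 2],
}
abstract_vec = [2, 0, 1, 0]       # hypothetical new Medline abstract
best = max(concepts, key=lambda c: cosine(concepts[c], abstract_vec))
print(best)
```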
Abstract: Currently, there are many local area industrial networks
that can give guaranteed bandwidth to synchronous traffic, in
particular by providing CBR (Constant Bit Rate) channels, which
allow improved bandwidth management. Some of these networks
operate over Ethernet, delivering channels with enough capacity,
especially with compressors, to integrate multimedia traffic from
many sources in industrial monitoring and image processing
applications. In these industrial environments, where low latency is
an essential requirement, JPEG is an adequate compression technique,
but it generates VBR (Variable Bit Rate) traffic. Transmitting VBR
traffic over CBR channels is inefficient, and current solutions to this
problem significantly increase the latency or further degrade the
quality. In this paper an R(q) model is used which allows on-line
calculation of the JPEG quantization factor. We obtained increased
quality and a lower capacity requirement for the CBR channel, with a
reduced number of discarded frames and better use of the channel
bandwidth.
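The on-line idea can be sketched as follows, assuming an illustrative hyperbolic rate model R(q) = a/q + b (the paper's actual R(q) model is not reproduced here): invert the model to pick the best quality factor whose predicted frame size fits the CBR budget.

```python
# Pick the smallest (best-quality) quantization factor q whose predicted
# frame size R(q) = a/q + b fits the per-frame CBR budget.  The model
# form and the fitted parameters a, b are illustrative assumptions.
import math

def pick_q(a, b, budget, q_min=1, q_max=100):
    """R(q) decreases as q grows, so solve a/q + b <= budget for q."""
    if budget <= b:
        return q_max                 # even the coarsest q may not fit
    q = math.ceil(a / (budget - b))
    return min(max(q, q_min), q_max)

# Hypothetical parameters fitted from previous frames (bytes):
print(pick_q(a=400_000, b=5_000, budget=45_000))
```

Recomputing q per frame from the fitted model is what keeps VBR JPEG output inside the CBR channel without buffering delays.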
Abstract: One major difficulty that faces developers of
concurrent and distributed software is the analysis of
concurrency-based faults such as deadlocks. Petri nets are used
extensively in the verification of correctness of concurrent programs.
ECATNets [2] are a category of algebraic Petri nets based on a sound
combination of algebraic abstract types and high-level Petri nets.
ECATNets have 'sound' and 'complete' semantics because of their
integration in rewriting logic [12] and its programming language
Maude [13]. Rewriting logic is considered one of the most powerful
logics in terms of the description, verification and programming of
concurrent systems. We proposed in [4] a method for translating
Ada-95 tasking programs to the ECATNets formalism
(Ada-ECATNet). In this paper, we show that the ECATNets
formalism provides a more compact translation for Ada programs
compared to other approaches based on simple Petri nets or Colored
Petri nets (CPNs). Such a translation not only reduces the size of the
program, but also reduces the number of program states. We also
show how this compact Ada-ECATNet may be reduced further by
applying reduction rules to it. This double reduction of the
Ada-ECATNet permits a considerable minimization of the memory
space and run time of the corresponding Maude program.
Abstract: The objective of this research was to evaluate the
quality of a milk pomade sweet (sherbet) packed in different
packaging materials (Multibarrier 60, met.BOPET/PE, Aluthen)
using several packaging technologies - active and modified
atmosphere (MAP) packaging (consisting of 100% CO2) and, as a
control, ambient air. Experiments were carried out at the Faculty of
Food Technology of the Latvia University of Agriculture. Samples
were stored at room temperature, +21±1 °C. The physico-chemical
properties - weight losses, moisture, hardening, colour and changes
in the headspace atmosphere concentration (CO2 and O2) of the
packs - were analysed before packaging and after 2, 4, 6, 8, 10 and
12 weeks of storage.
Abstract: In order to study the effect of plant density and
competition with field bindweed (Convolvulus arvensis) on the yield
and agronomic properties of wheat (Triticum sativum) under
irrigated conditions, a factorial experiment based on a randomized
complete block design with three replications was conducted in a
field at Kamalvand in the Khorramabad (Lorestan) region of Iran
during 2008-2009. Three plant densities (Factor A = 200, 230 and
260 kg/ha), three cultivars (Factor B = Bahar, Pishtaz and Alvand)
and weed control (Factor C = control and no control of weeds) were
assigned in the experiment. The results show that plant density had a
statistically significant effect on seed yield, 1000-seed weight, weed
density and the dry weight of weeds, while seed yield and harvest
index differed significantly among cultivars. The interactions
between plant density and cultivar for weed density, seed yield,
1000-seed weight and harvest index were significant. A plant density
of 260 kg/ha had the greatest effect on increasing the seed yield of
the Bahar cultivar in the Khorramabad region of Iran.
Abstract: This paper presents a sensing system for 3D sensing
and mapping by a tracked mobile robot with an arm-type sensor
movable unit and a laser range finder (LRF). The arm-type sensor
movable unit is mounted on the robot and the LRF is installed at the
end of the unit. This mechanism enables the sensor to change its
position and orientation so as to avoid occlusions caused by the
terrain. The sensing system is also able to change the height of the
LRF while keeping its orientation flat, for efficient sensing. In this
kind of mapping it may be difficult for a moving robot to apply
mapping algorithms such as the iterative closest point (ICP), because
the sets of 2D data at each sensor height may be distant on a
common surface. To enable this kind of mapping, the authors
therefore applied interpolation to generate plausible model data for
ICP. The results of several experiments demonstrate the validity of
this kind of sensing and mapping in this sensing system.
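A minimal 2-D ICP loop illustrates the matching step; the interpolation of scan lines described above is not reproduced, and the point sets are toy data. In 2-D the best rigid transform has a closed form, so no SVD is needed.

```python
# Minimal 2-D ICP: match each point to its nearest neighbour, solve the
# best rigid transform in closed form, apply it, and repeat.
import math

def best_rigid(src, dst):
    """Closed-form 2-D rigid transform (theta, tx, ty) aligning paired
    points src -> dst in the least-squares sense."""
    n = len(src)
    csx = sum(p[0] for p in src) / n; csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n; cdy = sum(p[1] for p in dst) / n
    sxx = sum((a[0]-csx)*(b[0]-cdx) + (a[1]-csy)*(b[1]-cdy)
              for a, b in zip(src, dst))
    sxy = sum((a[0]-csx)*(b[1]-cdy) - (a[1]-csy)*(b[0]-cdx)
              for a, b in zip(src, dst))
    th = math.atan2(sxy, sxx)
    c, s = math.cos(th), math.sin(th)
    return th, cdx - (c*csx - s*csy), cdy - (s*csx + c*csy)

def apply(th, tx, ty, pts):
    c, s = math.cos(th), math.sin(th)
    return [(c*x - s*y + tx, s*x + c*y + ty) for x, y in pts]

def icp(src, dst, iters=10):
    cur = src
    for _ in range(iters):
        # nearest-neighbour correspondences, then re-align
        pairs = [min(dst, key=lambda d: (d[0]-p[0])**2 + (d[1]-p[1])**2)
                 for p in cur]
        cur = apply(*best_rigid(cur, pairs), cur)
    return cur

scan = [(0, 0), (1, 0), (2, 0), (2, 1)]                 # toy model points
moved = [(x + 0.5, y - 0.3) for x, y in scan]           # displaced copy
aligned = icp(moved, scan)
```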
Abstract: This paper demonstrates a model of an e-Learning
system based on current learning theory and distance education
practice. The relationships in the model are designed to be simple
and functional and do not necessarily represent any particular
e-Learning environment. It is meant to be a generic e-Learning
system model, with implications for the instructional design of any
distance education course. It allows online instructors to move away
from the discrepancy between courses and the body of knowledge.
The interrelationships of the four primary sectors that make up the
e-Learning system are presented in this paper. This integrated model
includes (1) pedagogy, (2) technology, (3) teaching, and (4) learning.
There are interactions within each of these sectors, depicted by a
system loop map.
Abstract: Technology assessment is a vital part of the decision process in manufacturing, particularly for decisions on the selection of new sustainable manufacturing processes. To assess these processes, a matrix approach is introduced and sustainability assessment models are developed. Case studies show that the matrix-based approach provides a flexible and practical way to evaluate the sustainability of new manufacturing technologies such as those used in surface coating. The technology assessment of coating processes reveals that, compared with powder coating, sol-gel coating can deliver better technical, economic and environmental sustainability with respect to the selected sustainability evaluation criteria for a decorative coating application on car wheels.
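The matrix approach can be illustrated as a weighted scoring matrix; the criteria weights and scores below are invented for illustration and are not the case-study values.

```python
# Weighted scoring matrix: each candidate process gets a score per
# sustainability criterion, and the weighted sum ranks the alternatives.
criteria = ["technical", "economic", "environmental"]
weights  = [0.4, 0.3, 0.3]          # hypothetical criterion weights

scores = {                          # 1 (poor) .. 5 (excellent), hypothetical
    "powder coating":  [3, 4, 2],
    "sol-gel coating": [4, 4, 4],
}

totals = {p: sum(w * s for w, s in zip(weights, vals))
          for p, vals in scores.items()}
best = max(totals, key=totals.get)
print(totals, best)
```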
Abstract: Image retrieval is a topic of currently high scientific interest. The important steps in an image retrieval system are the extraction of discriminative features and a feasible similarity metric for retrieving the database images that are similar in content to the search image. Gabor filtering is a widely adopted technique for feature extraction from texture images. The recently proposed sparsity-promoting l1-norm minimization technique finds the sparsest solution of an under-determined system of linear equations. In the present paper, the l1-norm minimization technique is used as a similarity metric in image retrieval. It is demonstrated through simulation results that the l1-norm minimization technique provides a promising alternative to existing similarity metrics. In particular, the cases where the l1-norm minimization technique works better than the Euclidean distance metric are singled out.
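The l1 principle is that, among all solutions of an under-determined system Ax = b, the minimum-l1-norm solution coincides (for suitable A) with the sparsest one. At this toy size the sparsest solution can be found by direct enumeration, which the sketch below does; a real retrieval system would use an LP-based l1 solver. The columns of A play the role of database feature vectors and b the query.

```python
# Find the 1-sparse solutions of A x = b by trying each column alone.
# For suitable A this is exactly what l1-norm minimization recovers;
# the matrix and query here are toy numbers, not real image features.
A = [  # columns = feature vectors of 4 hypothetical database images
    [1.0, 0.0, 0.5, 0.2],
    [0.0, 1.0, 0.5, 0.9],
]
b = [0.5, 0.5]                     # query feature vector

def column(j):
    return [A[0][j], A[1][j]]

matches = []
for j in range(4):
    cj = column(j)
    # a 1-sparse solution x = t * e_j must satisfy t * cj == b componentwise
    ts = {bi / ci for bi, ci in zip(b, cj) if ci != 0}
    if len(ts) == 1:
        t = ts.pop()
        if all(abs(t * ci - bi) < 1e-9 for ci, bi in zip(cj, b)):
            matches.append((j, t))
print(matches)
```

A nonzero coefficient in the recovered sparse vector identifies the database image whose features explain the query, which is how the sparse solution doubles as a similarity judgement.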
Abstract: The LMS adaptive filter has several parameters which can affect its performance. Among these parameters, most papers consider only the step size for controlling performance. In this paper, we examine three parameters: the step size, the filter tap size and the filter form. Regression analysis is used to define the relation between these parameters and the performance of the LMS adaptive filter, using system-level simulation results. The results show that each parameter exhibits a performance trend of its own particular form, which can be estimated from the equations derived by regression analysis.
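For reference, the standard LMS update the parameters act on is w ← w + μ·e[n]·x[n]. The sketch below identifies a toy FIR system; the step size, tap count and signal are arbitrary illustrative choices, not the paper's simulation setup.

```python
# LMS system identification: y[n] = w . x[n], e[n] = d[n] - y[n],
# w <- w + mu * e[n] * x[n].  Step size (mu) and tap size (taps) are
# exactly the parameters whose influence the paper studies.
import random

def lms_identify(x, d, taps=4, mu=0.1):
    """Adapt FIR weights so that w . [x[n], ..., x[n-taps+1]] tracks d[n]."""
    w = [0.0] * taps
    for n in range(taps - 1, len(x)):
        frame = x[n - taps + 1:n + 1][::-1]     # most recent sample first
        y = sum(wi * xi for wi, xi in zip(w, frame))
        e = d[n] - y
        w = [wi + mu * e * xi for wi, xi in zip(w, frame)]
    return w

# Toy unknown system h = [0.5, -0.2]; LMS should converge toward it.
rng = random.Random(3)
x = [rng.uniform(-1, 1) for _ in range(2000)]
h = [0.5, -0.2]
d = [sum(hk * x[n - k] for k, hk in enumerate(h) if n - k >= 0)
     for n in range(len(x))]
w = lms_identify(x, d, taps=2, mu=0.05)
print([round(v, 3) for v in w])
```

Rerunning with different mu and taps values and regressing the resulting error curves against those parameters is the kind of experiment the abstract describes.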