Abstract: Despite the advances made in various new
technologies, applying them to agriculture remains a formidable
task, as it involves integrating diverse domains to monitor the
different processes involved in agricultural management. Advances in
ambient intelligence represent one of the most powerful means of
increasing the yield of agricultural crops and of mitigating the
impact of water scarcity, climate change, and pests, weeds, and
diseases. This paper proposes a GPS-assisted, machine-to-machine
solution that combines information collected by multiple sensors for
the automated management of paddy crops. To maintain the economic
viability of paddy cultivation, the various techniques used in
agriculture are discussed and a novel system that uses ambient
intelligence techniques is proposed in this paper. The ambient
intelligence based agricultural system offers great scope for
improving crop management.
Abstract: A knowledge base stores facts and rules about the
world that applications can use for the purpose of reasoning. By
applying the concept of granular computing to a knowledge base,
several advantages emerge. These can be harnessed by applications
to improve their capabilities and performance. In this paper, the
concept behind such a construct, called a granular knowledge cube,
is defined, and its intended use as an instrument that manages to
cope with different data types and detect knowledge domains is
elaborated. Furthermore, the underlying architecture, consisting of
three layers for storing, representing, and structuring knowledge,
is described. Finally, benefits as well as challenges of deploying it
are listed alongside application types that could profit from having
such an enhanced knowledge base.
Abstract: This paper presents an application of a “Systematic
Soft Domain Driven Design Framework” as a soft systems approach
to domain-driven design of information systems development. The
framework uses SSM as a guiding methodology within which we have
embedded a sequence of design tasks based on the UML leading to
the implementation of a software system using the Naked Objects
framework. This framework has been used in action research
projects that have involved the investigation and modelling of
business processes using object-oriented domain models and the
implementation of software systems based on those domain models.
Within this framework, Soft Systems Methodology (SSM) is used as
a guiding methodology to explore the problem situation and to
develop the domain model using UML for the given business
domain. The framework is proposed and evaluated in our previous
works, and a real case study “Information Retrieval System for
academic research” is used in this paper to show further practice and
evaluation of the framework in a different business domain. We argue
that there are advantages from combining and using techniques from
different methodologies in this way for business domain modelling.
The framework is overviewed and justified as a multimethodology
using Mingers' multimethodology ideas.
Abstract: Information generated from various computerization processes is a potentially rich source of knowledge for its designated community. Passing this information from generation to generation without modifying its meaning is a challenging activity. To preserve and archive data for future generations, it is essential to prove the authenticity of the data. This can be achieved by extracting, from the data, metadata that can prove authenticity and create trust in the archived data. A subsequent challenge is technology obsolescence. Metadata extraction and standardization can be used effectively to tackle this problem. Metadata can broadly be categorized at two levels: technical and domain. Technical metadata provides the information needed to understand and interpret a data record, but this level of metadata alone is not sufficient to establish trustworthiness. We have developed a tool that extracts and standardizes both technical and domain-level metadata. This paper describes the different features of the tool and how we developed it.
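As an illustration of the technical level of metadata described above, the following is a minimal sketch in Python; the field names and the choice of SHA-256 as a fixity value are illustrative assumptions, not the actual schema of the authors' tool.

```python
import hashlib
import mimetypes
import os

def extract_technical_metadata(path):
    """Collect basic technical metadata for a record, plus a SHA-256
    checksum that can serve as a fixity value for authenticity checks."""
    stat = os.stat(path)
    with open(path, "rb") as fh:
        digest = hashlib.sha256(fh.read()).hexdigest()
    return {
        "filename": os.path.basename(path),
        "size_bytes": stat.st_size,
        "modified_time": stat.st_mtime,
        "mime_type": mimetypes.guess_type(path)[0],
        "sha256": digest,  # fixity value for trust in the archived record
    }
```

Domain-level metadata, by contrast, would be extracted from the record's content and is necessarily schema-specific.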
Abstract: The boiling process is characterized by the rapid
formation of vapour bubbles at the solid–liquid interface (nucleate
boiling) with pre-existing vapour or gas pockets. Computational fluid
dynamics (CFD) is an important tool to study bubble dynamics. In
the present study, CFD simulation has been carried out to determine
the bubble detachment diameter and its terminal velocity. Volume of
fluid (VOF) method is used to model the bubble and its surroundings by
solving a single set of momentum equations and tracking the volume
fraction of each fluid throughout the domain. In the
simulation, a bubble is generated by allowing water vapour to enter a
cylinder filled with liquid water through an inlet at the bottom. After
the bubble is fully formed, the bubble detaches from the surface and
rises up during which the bubble accelerates due to the net balance
between the buoyancy force and viscous drag. Finally, when these forces
exactly balance, the bubble attains a constant terminal velocity. The
bubble detachment diameter and the terminal velocity of the bubble
are captured by the monitor function provided in FLUENT. The
detachment diameter and the terminal velocity obtained are compared
with established correlations based on the shape of the bubble. Good
agreement is obtained between the simulation results and these
established correlations.
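The buoyancy-drag balance described in this abstract can be illustrated for the limiting case of a small spherical bubble under Stokes drag; the closed-form balance below is a textbook simplification (the CFD simulation resolves the actual bubble shape), and the fluid properties in the example are illustrative values.

```python
import math

def stokes_terminal_velocity(d, rho_l, rho_v, mu_l, g=9.81):
    """Terminal velocity from the buoyancy-viscous drag balance for a
    small spherical bubble of diameter d, assuming Stokes drag
    (valid only at low Reynolds number):
      buoyancy = (rho_l - rho_v) * g * (pi * d**3 / 6)
      drag     = 3 * pi * mu_l * d * U
    Setting them equal gives U = (rho_l - rho_v) * g * d**2 / (18 * mu_l)."""
    volume = math.pi * d ** 3 / 6.0
    buoyancy = (rho_l - rho_v) * g * volume
    return buoyancy / (3.0 * math.pi * mu_l * d)

# Example: a 0.1 mm vapour bubble in water near saturation conditions
# (illustrative property values, not the paper's simulation inputs)
u = stokes_terminal_velocity(d=1e-4, rho_l=958.0, rho_v=0.6, mu_l=2.8e-4)
```

Under this balance the terminal velocity scales with the square of the detachment diameter, which is why capturing the detachment diameter accurately matters for the rise-velocity prediction.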
Abstract: Model transformation, as a pivotal aspect of model-driven
engineering, attracts more and more attention from both
researchers and practitioners. Many domains (enterprise engineering,
software engineering, knowledge engineering, etc.) use model
transformation principles and practices to address their
domain-specific problems; furthermore, model transformation can also
be used to bridge the gap between different domains by sharing and
exchanging knowledge. Since model transformation is now widely
used, new requirements have emerged: the transformation process must
be defined effectively and efficiently, and the manual effort
involved must be reduced. This paper presents an automatic model
transformation methodology based on semantic and syntactic
comparisons, and focuses particularly on the granularity issue that
exists in the transformation process. Compared with traditional model
transformation methodologies, this methodology serves a general,
cross-domain purpose. Semantic and syntactic checking
measurements are combined into a refined transformation process,
which solves the granularity issue. Moreover, the semantic and
syntactic comparisons are supported by a software tool, replacing
manual effort.
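The combination of syntactic and semantic comparison that the methodology relies on can be sketched in miniature as follows; the synonym table, the equal weighting, and the element names are invented placeholders, not the paper's actual measurements.

```python
def edit_distance(a, b):
    """Levenshtein distance: the syntactic comparison."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

# Assumed synonym pairs standing in for a semantic resource:
SYNONYMS = {("employee", "worker"), ("firm", "company")}

def similarity(src, dst):
    """Blend a normalized syntactic score with a semantic synonym check
    (equal weights here are an arbitrary illustrative choice)."""
    syntactic = 1 - edit_distance(src, dst) / max(len(src), len(dst))
    semantic = 1.0 if (src, dst) in SYNONYMS or (dst, src) in SYNONYMS else 0.0
    return 0.5 * syntactic + 0.5 * semantic

def best_match(src, candidates):
    """Map a source-model element to its best target-model candidate."""
    return max(candidates, key=lambda c: similarity(src, c))
```

A real tool would of course draw its semantic scores from an ontology or thesaurus rather than a hand-written pair set, but the structure of the decision is the same.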
Abstract: This paper presents an approach for the classification of
an unstructured format description for identification of file formats.
The main contribution of this work is the employment of data mining
techniques to support file format selection with just the unstructured
text description that comprises the most important format features for
a particular organisation. Subsequently, the file format identification
method employs a file format classifier and associated configurations to
support digital preservation experts with an estimate of the required
file format. Our goal is to make use of a format specification knowledge
base aggregated from different Web sources in order to select a file
format for a particular institution. Using the naive Bayes method,
the decision support system recommends a file format to an expert for
his or her institution. The proposed methods facilitate file format
selection and improve the quality of the digital preservation process. The
presented approach is meant to facilitate decision making for the
preservation of digital content in libraries and archives using domain
expert knowledge and specifications of file formats. To facilitate
decision-making, the aggregated information about the file formats is
presented as a file format vocabulary that comprises the most common
terms characteristic of all researched formats. The goal is to
suggest a particular file format based on this vocabulary for analysis
by an expert. The sample file format calculation and the calculation
results including probabilities are presented in the evaluation section.
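The naive Bayes recommendation step can be illustrated with a small self-contained sketch; the format descriptions, labels, and query below are invented placeholders standing in for the aggregated knowledge base.

```python
import math
from collections import Counter, defaultdict

def train(docs, labels):
    """Fit a tiny multinomial naive Bayes text classifier with
    Laplace (add-one) smoothing."""
    word_counts = defaultdict(Counter)   # per-format word frequencies
    class_counts = Counter(labels)
    vocab = set()
    for text, label in zip(docs, labels):
        words = text.lower().split()
        word_counts[label].update(words)
        vocab.update(words)
    return word_counts, class_counts, vocab

def recommend(model, description):
    """Return the file format that maximises the (log) posterior
    given an unstructured text description."""
    word_counts, class_counts, vocab = model
    n_docs = sum(class_counts.values())
    best, best_logp = None, float("-inf")
    for fmt in class_counts:
        logp = math.log(class_counts[fmt] / n_docs)          # prior
        denom = sum(word_counts[fmt].values()) + len(vocab)
        for w in description.lower().split():
            logp += math.log((word_counts[fmt][w] + 1) / denom)
        if logp > best_logp:
            best, best_logp = fmt, logp
    return best

# Placeholder training descriptions, not the paper's vocabulary:
docs = ["lossless raster image with transparency",
        "fixed page layout document for printing and archival",
        "plain text lightweight readable notes"]
labels = ["PNG", "PDF", "TXT"]
model = train(docs, labels)
```

In the paper's setting the training texts would be the aggregated format specifications and the query would be the institution's requirements description; the expert then reviews the recommended format rather than accepting it automatically.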
Abstract: With the evolution of technology, the expression of
opinions has shifted to the digital world. The domain of politics,
one of the hottest topics of opinion mining research, is here
combined with behavioral analysis for determining political
affiliation in texts, which constitutes the subject of this paper.
This study aims to classify the text in news/blogs either as
Republican or Democrat with the minimum number of features. As
an initial set, 68 features, 64 of which were Linguistic
Inquiry and Word Count (LIWC) features, were tested against 14
benchmark classification algorithms. In later experiments, the
dimensionality of the feature vector was reduced using 7 feature
selection algorithms. The results show that the “Decision Tree”,
“Rule Induction” and “M5 Rule” classifiers when used with “SVM”
and “IGR” feature selection algorithms performed the best up to
82.5% accuracy on the given dataset. Further tests on single-feature
and linguistics-based feature sets showed similar results. The
feature “Function”, an aggregate feature of the linguistic category,
was found to be the most discriminative of the 68 features,
with an accuracy of 81% in classifying articles either as Republican
or Democrat.
Abstract: Rapidly changing factors that affect daily life also affect the operational environment and the way military leaders fulfill their missions. With the help of technological developments, the traditional linearity of conflict and war has started to fade away. Furthermore, the mission domain has broadened to include traditional threats, hybrid threats, and the new challenges of cyber and space. Considering the future operational environment, future military leaders need to adapt themselves to the new challenges of the future battlefield. But how can one decide what leadership qualities are required to operate and accomplish missions on this new, complex battlefield? The main aim of this article is to answer that question. To find the right answers, leadership and its components are first defined, then the characteristics of the future operational environment are analyzed, and finally the leadership qualities required to succeed on the redefined battlefield are explained.
Abstract: Methicillin/multiple-resistant Staphylococcus aureus
(MRSA) are infectious bacteria that are resistant to common
antibiotics. A previous in silico study in our group has identified a
hypothetical protein SAV1226 as one of the potential drug targets. In
this study, we reported the bioinformatics characterization, as well as
cloning, expression, purification and kinetic assays of hypothetical
protein SAV1226 from methicillin/vancomycin-resistant
Staphylococcus aureus Mu50 strain. MALDI-TOF/MS analysis
revealed a low degree of structural similarity with known proteins.
Kinetic assays demonstrated that hypothetical protein SAV1226 is
neither a domain of an ATP dependent dihydroxyacetone kinase nor
of a phosphotransferase system (PTS) dihydroxyacetone kinase,
suggesting that the function of hypothetical protein SAV1226 might
be misannotated on public databases such as UniProt and
InterProScan 5.
Abstract: Plasmin plays an important role in the human
circulatory system owing to its catalytic ability of fibrinolysis. The
immediate injection of plasmin into stroke patients has intrigued
many scientists to design vectors that can transport plasmin to the
desired location in human body. Here we predict the structure of
human plasmin and investigate the interaction of plasmin with the
gold-nanoparticle.
Because the crystal structure of plasminogen has been solved, we
deleted the N-terminal (Pan-apple) domain of plasminogen to
generate a mimic of the active form of this enzyme (plasmin). We
conducted a simulated annealing process on plasmin and discovered that
a very large conformational change occurs. Kringle domains 1, 4 and 5
were observed to leave their original locations relative to the main
body of the enzyme, and the original doughnut shape of the enzyme
was transformed into a V-shape by the opening of its two arms. This observation
of conformational change is consistent with the experimental results of
neutron scattering and centrifugation.
We subsequently docked the plasmin on the simulated gold surface
to predict their interaction. The V-shaped plasmin could utilize its
Kringle domain and catalytic domain to contact the gold surface.
Our findings not only reveal the flexibility of plasmin structure but
also provide a guide for the design of plasmin-gold nanoparticle vectors.
Abstract: This work is the first step in a rather wide research
activity, in collaboration with the Euro Mediterranean Center for Climate
Changes, aimed at introducing scalable approaches in Ocean
Circulation Models. We discuss the design and implementation of
a parallel algorithm for solving the Variational Data Assimilation
(DA) problem on Graphics Processing Units (GPUs). The algorithm
is based on the fully scalable 3DVar DA model, previously proposed
by the authors, which uses a Domain Decomposition approach
(we refer to this model as the DD-DA model). We proceed with
an incremental porting process consisting of 3 distinct stages:
requirements and source code analysis, incremental development of
CUDA kernels, and testing and optimization. Experiments confirm the
theoretical performance analysis based on the so-called scale-up factor,
demonstrating that the DD-DA model can be suitably mapped onto
GPU architectures.
Abstract: Crosstalk among interconnects and printed-circuit
board (PCB) traces is a major limiting factor of signal quality in
high-speed digital and communication equipment, especially when fast
data buses are involved. Such a bus is modelled as a planar
multiconductor transmission line. This paper will demonstrate how
the finite difference time domain (FDTD) method provides an exact
solution of the transmission-line equations to analyze the near-end
and far-end crosstalk. In addition, this study makes it possible to
analyze the rise time effect on the near and far end voltages of the
victim conductor. The paper also discusses a statistical analysis,
based upon a set of several simulations. Such analysis leads to a
better understanding of the phenomenon and yields useful
information.
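As a simplified illustration of how FDTD solves the transmission-line equations, the sketch below simulates a single lossless line driven by a step source into a matched load; the per-unit-length parameters are illustrative, and the paper's actual analysis treats coupled multiconductor lines to obtain the near-end and far-end crosstalk.

```python
import math

def fdtd_line(nz=200, nt=400, L=250e-9, C=100e-12, vs=1.0):
    """One-dimensional FDTD (leapfrog) solution of the lossless
    telegrapher's equations dV/dz = -L dI/dt, dI/dz = -C dV/dt
    for a single line; L, C are per-unit-length values [H/m, F/m]."""
    dz = 0.001                     # spatial step [m]
    v_p = 1.0 / math.sqrt(L * C)   # propagation velocity
    dt = dz / v_p                  # "magic" time step at the Courant limit
    Z0 = math.sqrt(L / C)          # characteristic impedance
    V = [0.0] * (nz + 1)           # node voltages
    I = [0.0] * nz                 # branch currents (staggered grid)
    for _ in range(nt):
        for k in range(nz):        # update currents from voltage gradient
            I[k] -= dt / (L * dz) * (V[k + 1] - V[k])
        V[0] = vs                  # ideal step source at the near end
        for k in range(1, nz):     # update voltages from current gradient
            V[k] -= dt / (C * dz) * (I[k] - I[k - 1])
        V[nz] = Z0 * I[nz - 1]     # matched (reflectionless) far-end load
    return V
```

Extending this to crosstalk means carrying one voltage and current vector per conductor and replacing the scalar L and C with per-unit-length inductance and capacitance matrices whose off-diagonal terms couple the victim line to the aggressor.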
Abstract: Image compression based on fractal coding is a lossy
compression method, normally applied to gray-level images with range
and domain blocks of rectangular shape. Fractal-based digital image
compression provides a large compression ratio, and in this
paper a method is proposed that uses the YUV colour space and fractal
theory based on iterated transformations. Fractal geometry is
applied in the current study to colour image compression
coding. Colour images possess correlations among their colour
components, and hence a high compression ratio can be achieved by
exploiting these redundancies. The proposed method utilises the
self-similarity within the colour image as well as the cross-correlations
between its colour components. Experimental results show that a greater
compression ratio can be achieved with large domain blocks, while the
trade-off in image quality remains good to acceptable at less than 1 bit
per pixel.
Abstract: The objective of the present research paper is to highlight
the importance of measuring advertisement effectiveness in print
media and to develop a conceptual model for advertisement
effectiveness. The developed model is based on the dimensions on which
advertisement effectiveness depends and on those used to measure
it. An in-depth and extensive literature review is carried out to
understand the concept of advertisement effectiveness and its various
determinants in the context of print media. Based on the insights
gained, a conceptual framework for advertisement effectiveness is
presented. The model is an attempt to explore the relatively
under-explored area of advertisement effectiveness in the Indian
advertising scenario. It is believed that the present work will
encourage scholars and academicians to explore the area further and
will offer conceptual assistance and a fresh direction in the
domain of advertisement effectiveness.
Abstract: A total of 115 yeast strains isolated from local cassava
processing wastes were measured for crude protein content. Among
these strains, the strain MSY-2 possessed the highest protein
concentration (>3.5 mg protein/mL). By using molecular
identification tools, it was identified to be a strain of Pichia
kudriavzevii based on similarity of D1/D2 domain of 26S rDNA
region. In this study, to optimize protein production by the MSY-2
strain, Response Surface Methodology (RSM) was applied. The
tested parameters were the carbon content, nitrogen content, and
incubation time. The coefficient of determination (R²) of 0.7194
indicates that the model explains a large proportion of the variation
in the response, supporting its significance. Under the optimal
conditions, the protein
content produced was up to 3.77 g per L of culture, and the MSY-2
strain contains 66.8 g of protein per 100 g of cell dry weight. These
results reveal the feasibility of applying this novel yeast strain
to single-cell protein production.
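For reference, the reported R² can be read as the proportion of variation in the measured response that the fitted response surface model explains; a minimal sketch of the computation, on invented placeholder data:

```python
def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot.
    An R^2 of 0.7194 means the model explains about 72% of the
    variation in the measured response."""
    mean = sum(observed) / len(observed)
    ss_tot = sum((y - mean) ** 2 for y in observed)
    ss_res = sum((y - p) ** 2 for y, p in zip(observed, predicted))
    return 1.0 - ss_res / ss_tot

# Placeholder yields [g/L], not the paper's measurements:
obs = [3.1, 3.4, 3.7, 3.5, 3.77]
pred = [3.0, 3.5, 3.6, 3.6, 3.7]
fit = r_squared(obs, pred)
```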
Abstract: The distribution of a single global clock across a chip
has become the major design bottleneck for high performance VLSI
systems owing to power dissipation, process variability and multi-cycle
cross-chip signaling. A Network-on-Chip (NoC) architecture
partitioned into several synchronous blocks has become a promising
approach for attaining fine-grain power management at the system
level. In a NoC architecture the communication between the blocks is
handled asynchronously. To interface these blocks on a chip
operating at different frequencies, an asynchronous FIFO interface is
inevitable. However, these asynchronous FIFOs are not required if
adjacent blocks belong to the same clock domain. In this paper, we
have designed and analyzed a 16-bit asynchronous micropipelined
FIFO of depth four, with the awareness of place and route on an
FPGA device. We have used a commercially available Spartan 3
device and designed a high speed implementation of the
asynchronous 4-phase micropipeline. The asynchronous FIFO
implemented on the FPGA device shows 76 Mb/s throughput and a
handshake cycle of 109 ns for write and 101.3 ns for read in
simulation under worst-case operating conditions (voltage =
0.95 V) on a working chip at room temperature.
Abstract: Many organizations are investing in web applications
and technologies in order to be competitive, yet some of them cannot
achieve their goals. The quality of web-based applications can play
an important role in helping organizations be competitive. The aim of
this study is therefore to investigate the impact of the quality of
web-based applications on achieving competitive advantage. A new model
has been developed, and an empirical investigation was performed on the
banking sector in Jordan to test it. The results show that the
impact of web-based applications on competitive advantage is
significant. Finally, further work is planned to validate and evaluate
the proposed model using several domains.
Abstract: Groundwater inflow into tunnels is one of the most
important problems in tunneling operations. The objective of this
study is the investigation of model dimension effects on tunnel inflow
assessment in discontinuous rock masses using numerical modeling.
In the numerical simulation, the model dimension has an important
role in prediction of water inflow rate. When the model dimension is
very small, due to low distance to the tunnel border, the model
boundary conditions affect the estimated amount of groundwater flow
into the tunnel and results show a very high inflow to tunnel. Hence,
in this study, the two-dimensional universal distinct element code
(UDEC) was used, and the impact of different model parameters, such as
tunnel radius, joint spacing, and horizontal and vertical model domain
extent, has been evaluated. Results show that the model domain extent
is a function of the most significant parameters, which are tunnel
radius and joint spacing.
Abstract: This paper attempts to define the validity domain of an
LSDP (Loop Shaping Design Procedure) controller system by
determining the suitable uncertainty region so that the linear system
remains stable. Indeed, the LSDP controller cannot provide stability
for every perturbed system. For this, we use the gap metric, a tool
introduced into the control literature for studying the robustness
properties of feedback systems with uncertainty. A second-order
electric linear system example is given to define the validity domain
of the LSDP controller and the effectiveness of the gap metric.