Abstract: We consider a heterogeneously mixing SIR stochastic
epidemic process in populations described by a general graph.
Likelihood theory is developed to facilitate statistical inference for the
parameters of the model under complete observation. Using a martingale
central limit theorem, we show that these estimators are asymptotically
unbiased and Gaussian.
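The heterogeneously mixing SIR process on a graph can be illustrated with a toy Gillespie-style simulation; the graph, the rates, and all names below are illustrative assumptions, not the authors' code.

```python
import random

def simulate_sir(adj, beta, gamma, seed_node, rng):
    """Gillespie simulation of an SIR epidemic on a graph.

    adj: dict node -> list of neighbours; beta: per-edge infection rate;
    gamma: recovery rate. Returns the total number ever infected.
    """
    state = {v: "S" for v in adj}
    state[seed_node] = "I"
    infected = {seed_node}
    ever_infected = 1
    while infected:
        # all susceptible-infectious contact edges currently at risk
        si_edges = [(i, j) for i in infected for j in adj[i] if state[j] == "S"]
        total_rate = beta * len(si_edges) + gamma * len(infected)
        # choose the next event type in proportion to its rate
        if rng.random() < beta * len(si_edges) / total_rate:
            _, j = rng.choice(si_edges)      # infection along a random SI edge
            state[j] = "I"
            infected.add(j)
            ever_infected += 1
        else:
            i = rng.choice(sorted(infected)) # recovery of a random infective
            state[i] = "R"
            infected.remove(i)
    return ever_infected

rng = random.Random(1)
# complete graph on 6 nodes as a simple mixing structure
adj = {v: [u for u in range(6) if u != v] for v in range(6)}
final_size = simulate_sir(adj, beta=0.8, gamma=1.0, seed_node=0, rng=rng)
```

Under complete observation, the event times and types produced by such a process are exactly the data the likelihood theory operates on.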
Abstract: Automatic face detection is a complex problem in
image processing. Many methods exist to solve this problem, such as
template matching, Fisher Linear Discriminant analysis, neural networks,
SVMs, and MRC. Success has been achieved with each method to
varying degrees and complexities. The proposed algorithm handles
upright, frontal faces in single grayscale images of adequate
resolution under good lighting conditions. In face recognition, a
single face is matched against single faces from the training dataset.
We propose a neural-network-based face detection algorithm that works
on photographs and that checks any incoming test data against the
online scanned training dataset. Experimental results show that the
algorithm achieves detection accuracy of up to 95%.
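As a minimal illustration of the simplest method listed (template matching), the sketch below locates a planted patch by normalized cross-correlation; it is not the neural-network detector this abstract proposes, and the data and names are synthetic.

```python
import numpy as np

def match_template(image, template):
    """Slide a template over a grayscale image and return the
    top-left corner with the highest normalized cross-correlation."""
    th, tw = template.shape
    t = (template - template.mean()) / (template.std() + 1e-9)
    best_score, best_pos = -np.inf, None
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            patch = image[y:y+th, x:x+tw]
            p = (patch - patch.mean()) / (patch.std() + 1e-9)
            score = (p * t).mean()           # 1.0 only at a perfect match
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score

rng = np.random.default_rng(0)
img = rng.random((32, 32))
face = img[10:18, 12:20].copy()              # plant the "face" template
pos, score = match_template(img, face)
```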
Abstract: The aim of this paper is to develop, on the basis of a
detailed analysis of the literature and of the research carried out, a
model for the development and implementation of an innovation strategy
in business. The paper presents the main results of the authors'
research on a sample of 462 respondents, which shows the current use
of innovation strategy in Slovak enterprises. The research and
analysis carried out provided the basis for a model for developing and
implementing an innovation strategy in business, which is explained in
the paper in detail, step by step, with emphasis on the implementation
process. Implementation of the innovation strategy is described by a
separate model. The paper contains recommendations for the successful
implementation of an innovation strategy in business. These
recommendations should serve business managers above all as a valuable
tool in implementing the innovation strategy.
Abstract: Graph-based image segmentation techniques are
considered among the most efficient segmentation techniques and
are mainly used as time- and space-efficient methods for real-time
applications. However, there is a need to improve the
quality of the segmented images obtained from the earlier graph-based
methods. This paper proposes an improvement to the graph-based
image segmentation methods already described in the literature. We
contribute to the existing method by proposing the use of a weighted
Euclidean distance to calculate the edge weight which is the key
element in building the graph. We also propose a slight modification
of the segmentation method already described in the literature, which
results in the selection of more prominent edges in the graph. The
experimental results show the improvement in the segmentation
quality as compared to the methods that already exist, with a slight
compromise in efficiency.
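A minimal sketch of the key element described here, building the pixel graph with a weighted Euclidean distance as the edge weight; the channel weights and all names are illustrative assumptions, since the abstract does not specify them.

```python
import numpy as np

def edge_weight(p1, p2, channel_weights=(0.30, 0.59, 0.11)):
    """Weighted Euclidean distance between two RGB pixels, used as the
    edge weight when building the graph. The luminance-style channel
    weights here are an illustrative assumption."""
    w = np.asarray(channel_weights, dtype=float)
    d = np.asarray(p1, dtype=float) - np.asarray(p2, dtype=float)
    return float(np.sqrt(np.sum(w * d * d)))

def build_edges(img):
    """4-connected grid graph over an H x W x 3 image: one weighted
    edge per neighbouring pixel pair."""
    h, w, _ = img.shape
    edges = []
    for y in range(h):
        for x in range(w):
            if x + 1 < w:
                edges.append(((y, x), (y, x + 1),
                              edge_weight(img[y, x], img[y, x + 1])))
            if y + 1 < h:
                edges.append(((y, x), (y + 1, x),
                              edge_weight(img[y, x], img[y + 1, x])))
    return edges

img = np.zeros((2, 3, 3))
img[:, 2] = [255, 0, 0]          # a red strip next to a black region
edges = build_edges(img)
```

A segmentation method then sorts these edges by weight and merges regions, so the distance used here directly controls which boundaries survive.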
Abstract: This paper describes the results and implications of a correlational study of learning styles and learner satisfaction. The relationship of these empirical concepts was examined in the context of traditional versus e-blended modes of course delivery in an introductory graduate research course. Significant results indicated that the visual side of the visual-verbal dimension of students' learning style(s) was positively correlated to satisfaction with themselves as learners in an e-blended course delivery mode and negatively correlated to satisfaction with the classroom environment in the context of a traditional classroom course delivery mode.
Abstract: Logistics outsourcing is a growing trend and measuring its performance, a challenge. It must be consistent with the objectives set for logistics outsourcing, but we have found no objective-based performance measurement system. We have conducted a comprehensive review of the specialist literature to cover this gap, which has led us to identify and define these objectives. The outcome is that we have obtained a list of the most relevant objectives and their descriptions. This will enable us to analyse in a future study whether the indicators used for measuring logistics outsourcing performance are consistent with the objectives pursued with the outsourcing. If this is not the case, a proposal will be made for a set of financial and operational indicators to measure performance in logistics outsourcing that take the goals being pursued into account.
Abstract: Multimedia information availability has increased
dramatically with the advent of video broadcasting on handheld
devices. But with this availability come problems of maintaining the
security of information that is displayed in public. ISMA Encryption
and Authentication (ISMACryp) is one of the chosen technologies for
service protection in DVB-H (Digital Video Broadcasting -
Handheld), the TV system for portable handheld devices. ISMACryp
content is encoded with H.264/AVC (advanced video coding), while
leaving all structural data as it is. Two modes of ISMACryp are
available: CTR (Counter) mode and CBC (Cipher Block Chaining) mode.
Both modes of ISMACryp are based on the 128-bit AES algorithm. The AES
algorithm is complex and requires considerable execution time, which
makes it unsuitable for real-time applications such as live TV. The
proposed system aims to gain a deep understanding of video data
security in multimedia technologies and to provide security for
real-time video applications using selective encryption for H.264/AVC.
Five levels of security are proposed in this paper, based on the
content of the NAL unit in the Constrained Baseline profile of
H.264/AVC. The selective encryption at the different levels covers the
intra-prediction mode, residual data, inter-prediction mode, or motion
vectors only. Experimental results described in this paper show that
the fifth level, full ISMACryp, provides the highest level of security
at the cost of the longest encryption time, while the first level
provides the lowest level of security by encrypting only the motion
vectors, with the shortest execution time and without compromising
compression or the quality of the visual content. The encryption
scheme adds little cost to the compression process and keeps the file
format unchanged, with some direct operations still supported.
Simulations were carried out in MATLAB.
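Level-1 selective encryption (motion vectors only) can be sketched as follows. AES is deliberately replaced here by a toy hash-based keystream purely for illustration, and the payload layout and all names are hypothetical; the point is that only the chosen byte spans are ciphered, so structural data stays intact and the format remains parseable.

```python
import hashlib

def keystream(key, nonce, length):
    """Toy CTR-style keystream: hash(key || nonce || counter) blocks.
    Stands in for AES-CTR purely for illustration."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_selected(payload, spans, key, nonce):
    """Encrypt only the byte spans holding motion vectors; bytes outside
    the spans are left untouched."""
    data = bytearray(payload)
    for start, end in spans:
        ks = keystream(key, nonce + start.to_bytes(4, "big"), end - start)
        for i in range(start, end):
            data[i] ^= ks[i - start]
    return bytes(data)

payload = b"HDR|mv:\x01\x02\x03\x04|residual-data|END"
mv_spans = [(7, 11)]                 # bytes holding the motion vectors
ct = encrypt_selected(payload, mv_spans, key=b"k", nonce=b"n")
pt = encrypt_selected(ct, mv_spans, key=b"k", nonce=b"n")  # XOR is self-inverse
```

Because CTR-style encryption is an XOR with a keystream, applying the same function again with the same key and nonce decrypts the spans.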
Abstract: Experimental data from an atmospheric air/water terrain slugging case has been made available by the Shell Amsterdam research center, and has been subject to numerical simulation and comparison with a one-dimensional two-phase slug tracking simulator under development at the Norwegian University of Science and Technology. The code is based on tracking of liquid slugs in pipelines by use of a Lagrangian grid formulation implemented in C++ by use of object-oriented techniques. An existing hybrid spatial discretization scheme is tested, in which the stratified regions are modelled by the two-fluid model. The slug regions are treated as incompressible, thus requiring a single momentum balance over the whole slug. Upon comparison with the experimental data, the period of the simulated severe slugging cycle is observed to be sensitive to slug generation in the horizontal parts of the system. Two different slug initiation methods have been tested with the slug tracking code, and grid dependency has been investigated.
Abstract: A review of energy consumption and consumption rates in
Iran shows that, unfortunately, the optimization and conservation of
energy in the country's active industries lacks a practical and
effective method, and that in most factories energy consumption is
higher than in similar industries in industrialized countries. The
increasing demand for electrical energy and the overhead costs it
imposes on the organization force companies to search for suitable
approaches to optimizing energy consumption and demand management.
The application of value engineering techniques is one such approach.
Value engineering is considered a powerful tool for improving
profitability. It is used to reduce expenses, increase profits,
improve quality, increase market share, perform work in shorter
durations, and utilize resources more efficiently.
In this article, we review value engineering and its capability to
create effective transformations in industrial organizations in order
to reduce energy costs. The results are investigated and described
through a case study in the Mazandaran wood and paper industries, the
biggest consumer of energy in northern Iran, to present the effects of
the tasks performed to optimize energy consumption using value
engineering techniques.
Abstract: In this paper, the implementation of a rule-based
intuitive reasoner is presented. The implementation included two
parts: the rule induction module and the intuitive reasoner. A large
weather database was acquired as the data source. Twelve weather
variables from those data were chosen as the "target variables"
whose values were predicted by the intuitive reasoner. A "complex"
situation was simulated by making only subsets of the data available
to the rule induction module. As a result, the rules induced were
based on incomplete information with variable levels of certainty.
The certainty level was modeled by a metric called "Strength of
Belief", which was assigned to each rule or datum as ancillary
information about the confidence in its accuracy. Two techniques
were employed to induce rules from the data subsets: decision tree
and multi-polynomial regression, respectively for the discrete and the
continuous type of target variables. The intuitive reasoner was tested
for its ability to use the induced rules to predict the classes of the
discrete target variables and the values of the continuous target
variables. The intuitive reasoner implemented two types of
reasoning, fast and broad, where, by analogy to human thought, the
former corresponds to fast decision making and the latter to deeper
contemplation. For reference, a weather data analysis approach
which had been applied to similar tasks was adopted to analyze the
complete database and create predictive models for the same 12
target variables. The values predicted by the intuitive reasoner and
the reference approach were compared with actual data. The intuitive
reasoner reached near-100% accuracy for two continuous target
variables. For the discrete target variables, the intuitive reasoner
predicted at least 70% as accurately as the reference reasoner. Since
the intuitive reasoner operated on rules derived from only about 10%
of the total data, it demonstrated the potential advantages in dealing
with sparse data sets as compared with conventional methods.
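The core idea, combining induced rules weighted by their "Strength of Belief", can be sketched for a discrete target as follows; the rules, thresholds, and names are invented for illustration, not taken from the paper.

```python
def predict_discrete(rules, observation):
    """Combine rule votes for a discrete target, weighting each rule's
    vote by its "Strength of Belief" (a confidence in [0, 1])."""
    votes = {}
    for condition, predicted_class, belief in rules:
        if condition(observation):
            votes[predicted_class] = votes.get(predicted_class, 0.0) + belief
    # return the class with the largest accumulated belief
    return max(votes, key=votes.get) if votes else None

# Hypothetical rules induced from a data subset: (condition, class, belief)
rules = [
    (lambda o: o["humidity"] > 80, "rain", 0.9),
    (lambda o: o["pressure"] < 1000, "rain", 0.6),
    (lambda o: o["humidity"] <= 80, "clear", 0.7),
]
label = predict_discrete(rules, {"humidity": 85, "pressure": 1012})
```

Rules induced from sparse subsets simply carry lower belief values, so their votes count for less.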
Abstract: In this paper, we present a novel statistical approach to
corpus-based speech synthesis. Classically, phonetic information is
defined and considered as the acoustic reference to be respected.
Accordingly, many studies have addressed acoustic unit classification.
This type of classification separates units according to their
symbolic characteristics. Indeed, target cost and concatenation cost
were classically defined for unit selection.
In corpus-based speech synthesis systems using large text
corpora, cost functions have been limited to a juxtaposition of
symbolic criteria, and the acoustic information of units is not
exploited in the definition of the target cost.
In this manuscript, we take into consideration the phonetic
information of units corresponding to acoustic information. This is
realized by defining a probabilistic linguistic bi-gram model used
for unit selection. The selected units are extracted from the
English TIMIT corpus.
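Unit selection driven by target and concatenation costs is commonly solved with a Viterbi-style search, sketched below; the toy candidates and cost functions are illustrative assumptions, not the paper's bi-gram model.

```python
def select_units(candidates, target_cost, concat_cost):
    """Viterbi-style unit selection: pick one candidate unit per target
    position minimizing summed target + concatenation cost."""
    best = {u: target_cost(0, u) for u in candidates[0]}
    back = [{}]
    for t in range(1, len(candidates)):
        new_best, links = {}, {}
        for u in candidates[t]:
            prev = min(best, key=lambda p: best[p] + concat_cost(p, u))
            new_best[u] = best[prev] + concat_cost(prev, u) + target_cost(t, u)
            links[u] = prev
        best, back = new_best, back + [links]
    # backtrack the cheapest path
    u = min(best, key=best.get)
    path = [u]
    for t in range(len(candidates) - 1, 0, -1):
        u = back[t][u]
        path.append(u)
    return path[::-1]

# Toy units: (name, pitch); target cost = distance to a wanted pitch,
# concatenation cost = pitch mismatch at the join (all invented)
candidates = [[("a1", 100), ("a2", 140)], [("b1", 110), ("b2", 200)]]
targets = [100, 120]
path = select_units(candidates,
                    lambda t, u: abs(u[1] - targets[t]),
                    lambda p, u: abs(p[1] - u[1]) * 0.5)
```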
Abstract: In this paper I have developed a system for evaluating
the degree of fear emotion that the intelligent agent-based system
may feel when it encounters a persecuting event. In this paper I
describe the behavior of emotional agents, using human behavior
in terms of the way their emotional states evolve over time.
I have implemented a fuzzy inference system in a Java
environment. As the inputs of this system, I have considered three
parameters related to the human fear emotion. The system's outputs can
be used in an agent's decision-making process, or in choosing a person
for teamwork systems by combining the intensity of fear with other
emotion intensities.
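A fuzzy inference step of the kind described can be sketched as follows; the paper's implementation is in Java, and this Python sketch with two invented rules and triangular memberships is only illustrative.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fear_intensity(threat, distance, experience):
    """Tiny Mamdani-style inference: two illustrative rules,
    defuzzified by a weighted average of each rule's output level."""
    # Rule 1: threat HIGH and distance CLOSE -> fear HIGH (level 0.9)
    r1 = min(tri(threat, 0.4, 1.0, 1.6), tri(distance, -0.6, 0.0, 0.6))
    # Rule 2: experience HIGH -> fear LOW (level 0.2)
    r2 = tri(experience, 0.4, 1.0, 1.6)
    if r1 + r2 == 0:
        return 0.0
    return (0.9 * r1 + 0.2 * r2) / (r1 + r2)

f = fear_intensity(threat=0.9, distance=0.1, experience=0.2)
```

The resulting intensity in [0, 1] is the kind of output an agent's decision-making process could consume.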
Abstract: This paper focuses on testing the database of an existing
information system. At the beginning we describe the basic problems
of implemented databases, such as data redundancy, poor design of
database logical structure or inappropriate data types in columns of
database tables. These problems are often the result of incorrect
understanding of the primary requirements for a database of an
information system. Then we propose an algorithm to compare the
conceptual model created from vague requirements for a database
with a conceptual model reconstructed from the implemented database.
The algorithm also suggests steps leading to the optimization of the
implemented database. The proposed algorithm is verified by an
implemented prototype. The paper also describes a fuzzy system
which works with the vague requirements for a database of an
information system, a procedure for creating a conceptual model from
vague requirements, and an algorithm for reconstructing a conceptual
model from an implemented database.
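The comparison of the two conceptual models can be sketched as a set comparison over entities and attributes; the model representation and the suggested steps below are illustrative assumptions, not the paper's algorithm.

```python
def compare_models(required, implemented):
    """Compare a conceptual model built from the requirements with one
    reconstructed from the implemented database, and suggest steps.
    Each model is a dict: entity -> set of attributes."""
    steps = []
    for entity, attrs in required.items():
        if entity not in implemented:
            steps.append(f"add table for entity '{entity}'")
            continue
        for a in sorted(attrs - implemented[entity]):
            steps.append(f"add column '{a}' to '{entity}'")
        for a in sorted(implemented[entity] - attrs):
            steps.append(f"review possibly redundant column '{a}' in '{entity}'")
    for entity in implemented:
        if entity not in required:
            steps.append(f"review possibly redundant table '{entity}'")
    return steps

required = {"Customer": {"id", "name", "email"}}
implemented = {"Customer": {"id", "name", "fax"}, "TempData": {"x"}}
steps = compare_models(required, implemented)
```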
Abstract: Specification-based testing enables us to detect errors
in the implementation of functions defined in given specifications.
Its effectiveness in achieving high path coverage and efficiency in
generating test cases are always major concerns of testers. The automatic
test case generation approach based on formal specifications
proposed by Liu and Nakajima aims to ensure high effectiveness
and efficiency, but it has not been empirically assessed.
In this paper, we present an experiment assessing Liu's testing
approach. The results indicate that this testing approach may not be
effective in some circumstances. We discuss the results, analyse the
specific causes of the ineffectiveness, and describe some suggestions
for improvement.
Abstract: Dedicated Short Range Communication (DSRC) is a
key enabling technology for the next generation of
communication-based safety applications. One of the important
problems for DSRC deployment is maintaining high performance
under heavy channel load. Many studies focus on congestion control
mechanisms for simulating hundreds of physical radios deployed on
vehicles. The U.S. Department of Transportation's (DOT) Intelligent
Transportation Systems (ITS) division plans to choose prototype
on-board devices capable of transmitting basic "Here I am" safety
messages to other vehicles. The devices will be used in an IntelliDrive
safety pilot deployment of up to 3,000 vehicles. It is hard to log the
information of 3,000 vehicles. In this paper we present the design of,
and issues related to, a DSRC radio testbed under heavy channel load.
We describe not only the architecture of the DSRC radio testbed,
but also how the Radio Interference System is used to help emulate
a congested radio environment.
Abstract: Project-based pedagogy has been proven to be an active
learning method, used to develop learners' skills and
knowledge. The use of technology in the learning world has filled
several gaps in the implementation of teaching methods and the online
evaluation of learners. However, the project methodology presents
challenges for the online assessment of learners.
Indeed, interoperability between e-learning platforms (LMS) is
one of the major challenges of project-based learning assessment.
Firstly, we review the characteristics of online assessment
in the context of project-based teaching, and we address the
constraints encountered during the peer evaluation process.
Our approach is to propose a meta-model that describes a
language dedicated to the design of peer assessment scenarios in
project-based learning. We then illustrate our proposal by an
instantiation of the meta-model through a business process in an
online collaborative assessment scenario.
Abstract: Information hiding, especially watermarking is a
promising technique for the protection of intellectual property rights.
This technology has mainly been advanced for multimedia, but the same
has not been done for text. Web pages, like other documents, need
protection against piracy. In this paper, some techniques are
proposed to show how to hide information in web pages using some
features of the markup language used to describe these pages. Most
of the techniques proposed here use the white space to hide
information or some varieties of the language in representing
elements. Experiments on a very small page and an analysis of five
thousand web pages show that these techniques have a wide
bandwidth available for information hiding, and they might form a
solid base to develop a robust algorithm for web page watermarking.
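One of the white-space techniques can be sketched as follows: a trailing space at the end of a line encodes a 1 bit and its absence a 0, which browsers render identically. The specific encoding convention is an illustrative assumption.

```python
def hide_bits(html, bits):
    """Hide one bit at the end of each line: a trailing space encodes 1,
    no trailing space encodes 0. Rendering is unchanged in a browser."""
    lines = html.split("\n")
    assert len(bits) <= len(lines)
    out = []
    for i, line in enumerate(lines):
        line = line.rstrip()                  # normalize first
        if i < len(bits) and bits[i] == "1":
            line += " "
        out.append(line)
    return "\n".join(out)

def extract_bits(html, n):
    """Recover the first n hidden bits from the marked page."""
    lines = html.split("\n")
    return "".join("1" if lines[i].endswith(" ") else "0" for i in range(n))

page = "<html>\n<body>\n<p>Hello</p>\n</body>\n</html>"
marked = hide_bits(page, "1011")
recovered = extract_bits(marked, 4)
```

The bandwidth of this channel is one bit per line, which is why the analysis of many pages matters for capacity estimates.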
Abstract: The main purpose of this study is to provide a detailed
statistical overview of the time and regional distribution and the
relative timing of economic crises and government changes in 51
economies over the 1990–2007 period. In addition, the
predictive power of economic crises for the onset of government
changes is examined using the "signal approach".
The results show that the percentage of government changes is
highest in transition economies (86 percent of observations) and
lowest in Latin American economies (39 percent of observations).
The percentage of government changes is the same in developed
and developing countries (43 percent of observations). However, the
average number of crises per year (frequency of crises) is higher in
developing countries than in developed countries. Also, the
predictive power of economic crises for the onset of a government
change is highest in transition economies (81 percent) and lowest in
Latin American countries (30 percent). The predictive power of
economic crises in developing countries (43 percent) is lower than in
developed countries (55 percent).
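The "signal approach" hit rate can be sketched as the share of crisis signals followed by a government change within a fixed window; the years below are hypothetical and the window length is an assumption.

```python
def predictive_power(crisis_years, change_years, window=1):
    """Share of crisis signals followed by a government change within
    `window` years -- a simplified "signal approach" hit rate."""
    hits = sum(
        1 for c in crisis_years
        if any(c <= g <= c + window for g in change_years)
    )
    return hits / len(crisis_years) if crisis_years else 0.0

crises = [1994, 1998, 2001, 2004]        # hypothetical crisis signals
changes = [1995, 2001, 2006]             # hypothetical government changes
power = predictive_power(crises, changes, window=1)
```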
Abstract: This paper proposes the application of a hierarchical fuzzy system (HFS) based on a multi-input power system stabilizer (MPSS) and a Static Var Compensator (SVC) in a multi-machine environment. In a conventional fuzzy logic system, the number of rules grows exponentially with the number of variables. The proposed HFS method is developed to solve this problem. To reduce the number of rules, the HFS consists of a number of low-dimensional fuzzy systems in a hierarchical structure. In fact, by using an HFS the total number of involved rules increases only linearly with the number of input variables. In the MPSS, for better efficiency, an auxiliary signal of reactive power deviation (ΔQ) is added to a ΔP + Δω input-type power system stabilizer (PSS). A phasor model of the SVC is described and used in this paper. The performances of the MPSS, a conventional power system stabilizer (CPSS), a hierarchical fuzzy multi-input power system stabilizer (HFMPSS), and the proposed method in damping the inter-area mode of oscillation are examined in response to disturbances. The comparative study is illustrated by digital simulations. It can be seen that the proposed PSS performs satisfactorily within the whole range of disturbances.
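The rule-count argument can be made concrete: a flat fuzzy system with n inputs and m labels per input needs m^n rules, while a hierarchical chain of two-input sub-systems needs only (n-1)·m² rules, linear in n. The counting below is a standard illustration of this point, not the paper's specific controller.

```python
def conventional_rules(n_inputs, n_labels):
    """Rule count for a single flat fuzzy system: grows exponentially
    with the number of inputs."""
    return n_labels ** n_inputs

def hierarchical_rules(n_inputs, n_labels):
    """Rule count for a chain of two-input sub-systems: each stage
    combines one new input with the previous stage's output, so the
    total grows only linearly with the number of inputs."""
    return (n_inputs - 1) * n_labels ** 2

flat = conventional_rules(n_inputs=5, n_labels=3)    # 3^5 = 243
chain = hierarchical_rules(n_inputs=5, n_labels=3)   # 4 * 9 = 36
```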
Abstract: Selection of maize (Zea mays) hybrids with wide adaptability across diverse farming environments is important prior to recommending them, in order to achieve a high rate of hybrid adoption. Grain yield of 14 maize hybrids, tested in a randomized complete-block design with four replicates across 22 environments in Iran, was analyzed using the site regression (SREG) stability model. The biplot technique facilitates a visual evaluation of superior genotypes, which is useful for cultivar recommendation and mega-environment identification. The objectives of this study were (i) to identify suitable hybrids with both high mean performance and high stability, and (ii) to determine mega-environments for maize production in Iran. Biplot analysis identified two mega-environments in this study. The first mega-environment included KRM, KSH, MGN, DZF A, KRJ, DRB, DZF B, SHZ B, and KHM, where the G10 hybrid was the best performing hybrid. The second mega-environment included ESF B, ESF A, and SHZ A, where the G4 hybrid was the best hybrid. According to the ideal-hybrid biplot, the G10 hybrid was better than all other hybrids, followed by the G1 and G3 hybrids. These hybrids were identified as the best hybrids, having high grain yield and high yield stability. GGE biplot analysis provided a framework for identifying target testing locations that discriminate genotypes that are high yielding and stable.
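A GGE-style biplot reduces to a singular value decomposition of the environment-centred genotype-by-environment table; the sketch below uses a small hypothetical yield table, not the study's data.

```python
import numpy as np

def gge_biplot_scores(yields):
    """Environment-centred genotype x environment table decomposed by
    SVD; the first two components give the biplot coordinates."""
    centred = yields - yields.mean(axis=0)      # remove environment means
    u, s, vt = np.linalg.svd(centred, full_matrices=False)
    genotype_scores = u[:, :2] * s[:2]          # genotype coordinates
    environment_scores = vt[:2].T               # environment coordinates
    explained = (s[:2] ** 2).sum() / (s ** 2).sum()
    return genotype_scores, environment_scores, explained

# Hypothetical 4-genotype x 3-environment yield table (t/ha)
y = np.array([[6.1, 5.0, 7.2],
              [5.2, 4.8, 6.1],
              [6.8, 5.9, 7.9],
              [4.9, 4.1, 5.8]])
g, e, expl = gge_biplot_scores(y)
```

Plotting the two score sets on the same axes gives the biplot used for mega-environment identification: environments clustering together, with the same winning genotype, form one mega-environment.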