Abstract: Food poisoning and bacterial infection are of public health significance in both developing and developed countries. Samples of ogi (akamu) prepared from white and yellow varieties of maize sold in Uturu and Okigwe were analyzed, together with laboratory-prepared ogi, for bacterial quality using standard microbiological methods. The analyses showed that the laboratory-prepared ogi had total bacterial counts (cfu/g) of 4.0 × 10^7 (white) and 3.9 × 10^7 (yellow), while the commercial ogi had counts of 5.2 × 10^7 and 4.9 × 10^7, 4.9 × 10^7 and 4.5 × 10^7, and 5.4 × 10^7 and 5.0 × 10^7 for the Eke-Okigwe, Up-gate and Nkwo-Achara markets respectively. Staphylococcal counts ranged from 2.0 × 10^2 to 5.0 × 10^2 for the white variety and 1.0 × 10^2 to 4.0 × 10^2 for the yellow variety across the different markets, while no staphylococcal growth was recorded on the laboratory-prepared ogi. The laboratory-prepared ogi likewise showed no coliform growth, while the commercially prepared ogi had coliform counts of 0.5 × 10^3 to 1.6 × 10^3 (white variety) and 0.3 × 10^3 to 1.1 × 10^3 (yellow variety). Lactic acid bacterial counts of 3.5 × 10^6 and 3.0 × 10^6 were recorded for the laboratory ogi, while those of the commercially prepared ogi ranged from 3.2 × 10^6 to 4.2 × 10^6 (white variety) and 3.0 × 10^6 to 3.9 × 10^6 (yellow variety). Among the bacterial isolates from the commercial and laboratory fermented ogi, Lactobacillus sp., Leuconostoc sp. and Citrobacter sp. were present in all samples; Micrococcus sp. and Klebsiella sp. were isolated from the Eke-Okigwe and ABSU-Up-gate market varieties respectively; E. coli and Staphylococcus sp. were present in the Eke-Okigwe and Nkwo-Achara markets; and Salmonella sp. was isolated from all three markets. Hence, there is a risk of contracting food-borne disease from commercially prepared ogi. Sanitary measures are therefore needed in the production of fermented cereals to minimize food-borne pathogens during processing and storage.
Abstract: Current systems complexity has reached a degree that
requires addressing conception and design issues while taking into
account environmental, operational, social, legal and financial
aspects. Therefore, one of the main challenges is the way complex
systems are specified and designed. The exponentially growing effort, cost and time invested in the modeling phase of complex systems emphasize the need for a paradigm, a framework and an environment to handle model complexity. To that end, it is necessary to understand the expectations and limits of the model's human user. This paper presents a generic framework for designing complex systems, highlights the requirements a system model must fulfill to meet human user expectations, and suggests a graph-based formalism for modeling complex systems. Finally, a set of transformations is defined to handle model complexity.
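The graph formalism and the transformations are not detailed in the abstract; purely as an illustrative sketch of how a graph-based system model can be simplified by a transformation (all node names are invented), one aggregation step might look like:

```python
# Illustrative sketch (not the paper's formalism): a system modeled as a
# directed graph, with an aggregation transformation that collapses a
# group of components into one super-node to reduce apparent complexity.

def aggregate(edges, group, super_node):
    """Collapse every node in `group` into `super_node`.

    edges: set of (src, dst) pairs; returns a new edge set in which
    intra-group edges disappear and boundary edges are redirected.
    """
    new_edges = set()
    for src, dst in edges:
        s = super_node if src in group else src
        d = super_node if dst in group else dst
        if s != d:                      # drop edges internal to the group
            new_edges.add((s, d))
    return new_edges

# Hypothetical system: sensor -> filter -> controller -> actuator.
system = {("sensor", "filter"), ("filter", "controller"),
          ("controller", "actuator")}
# Aggregate the processing chain into one "processing" super-node.
reduced = aggregate(system, {"filter", "controller"}, "processing")
```

Repeated aggregation of this kind is one simple way a model can be presented at a level of detail matched to what its human user can absorb.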
Abstract: Market is an important factor for start-ups to look into
during decision-making in product development and related areas.
Emerging-country markets are more uncertain in terms of information availability and institutional support. A review of the literature on market uncertainty reveals the need to identify the factors that represent it. This paper identifies factors of market uncertainty using Exploratory Factor Analysis (EFA) and confirms the number of factors to retain using an alternative retention criterion, Parallel Analysis. 500 entrepreneurs engaged in start-ups from all over India participated in the study. This paper concludes
with the factor structure of ‘market uncertainty’ having dimensions of
uncertainty in industry orientation, uncertainty in customer
orientation and uncertainty in marketing orientation.
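For background, the Parallel Analysis retention criterion named above compares the eigenvalues of the sample correlation matrix with those obtained from uncorrelated random data of the same shape; a minimal sketch (the simulation count and the toy data are assumptions, not the paper's setup):

```python
# Illustrative sketch of Horn's parallel analysis, the factor-retention
# criterion named in the abstract; parameters are assumptions.
import numpy as np

def parallel_analysis(data, n_sims=100, seed=0):
    """Return the number of factors whose eigenvalue exceeds the mean
    eigenvalue obtained from random data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    real_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    sim_eig = np.zeros((n_sims, p))
    for i in range(n_sims):
        sim = rng.standard_normal((n, p))
        sim_eig[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False)))[::-1]
    threshold = sim_eig.mean(axis=0)        # mean random eigenvalue, per rank
    return int(np.sum(real_eig > threshold))

# Toy example: 6 survey items generated from 2 underlying factors plus noise,
# with the same sample size as the study (n = 500).
rng = np.random.default_rng(1)
factors = rng.standard_normal((500, 2))
loadings = np.array([[1, 0], [1, 0], [1, 0], [0, 1], [0, 1], [0, 1]], float)
items = factors @ loadings.T + 0.5 * rng.standard_normal((500, 6))
n_factors = parallel_analysis(items)
```

Unlike the common eigenvalue-greater-than-one rule, this criterion adapts to sample size, which is why it is often preferred for confirming how many factors to retain.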
Abstract: This paper establishes the relevance of the study, the role of the problem of subjective well-being in modern psychology, and how subjective well-being is understood by today's students. The purpose of this research is to identify the peculiarities of the comprehension of subjective well-being by students with various levels of emotional intelligence. The research methods are the adapted Russian-language version of K. Riff's questionnaire 'The Scales of Psychological Well-Being' and the emotional intelligence questionnaire of D. V. Lusin. The research involved 72 students from different universities and disciplines, aged between 18 and 24. Analyzing the results of the studies, it can be concluded that the understanding of happiness differs between groups of students with high and low levels of overall emotional intelligence, and is also differentiated by gender. Students with a higher level of happiness possess a greater capacity and a higher need to control their emotions, to induce and maintain desired emotions, and to keep undesirable ones under control.
Abstract: The new era of digital communication has brought up
many challenges that network operators need to overcome. The high demand for mobile data rates requires improved networks, which challenges operators to maintain the quality of experience (QoE) of their consumers. In live video transmission, there is a pressing need for live surveillance of the videos in order to maintain the quality of the network. For this purpose, objective algorithms are employed to monitor the quality of the videos transmitted over a network. To test these objective algorithms, subjective quality assessment of the streamed videos is required, as the human eye is the best source of perceptual assessment. In this paper we
have conducted subjective evaluation of videos with varying spatial
and temporal impairments. These videos were impaired with frame
freezing distortions so that the impact of frame freezing on the quality
of experience could be studied. We present subjective Mean Opinion Scores (MOS) for these videos, which can be used to fine-tune objective video quality assessment algorithms.
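As background, a MOS is the arithmetic mean of the panel's ratings for one test clip, usually reported with a confidence interval; a minimal sketch (the rating values below are invented, not the study's data):

```python
# Minimal sketch: Mean Opinion Score (MOS) with a ~95% confidence interval
# for one test video; the ratings below are invented for illustration.
import math

def mos(ratings, z=1.96):
    """Return (mean, half-width of the ~95% CI) for a list of 1-5 ratings."""
    n = len(ratings)
    mean = sum(ratings) / n
    var = sum((r - mean) ** 2 for r in ratings) / (n - 1)   # sample variance
    ci = z * math.sqrt(var / n)
    return mean, ci

# 8 hypothetical viewers rating one frame-freeze-impaired clip on a 1-5 scale.
score, half_width = mos([4, 3, 4, 5, 3, 4, 4, 3])
```

The confidence interval matters in practice: an objective algorithm is normally tuned against the MOS only where the panel's agreement makes the score reliable.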
Abstract: Due to today’s globalization and companies’ outsourcing practices, Supply Chain (SC) performance has become more dependent on the efficient movement of material among geographically dispersed places, where there is more chance of disruption. One such disruption is the quality and delivery uncertainty of outsourcing. These uncertainties can render products unsafe and, as a number of recent examples show, companies may end up recalling their products.
As a result of these problems, there is a need to develop a
methodology for selecting suppliers globally in view of risks
associated with low quality and late delivery. Accordingly, we
developed a two-stage stochastic model that captures the risks
associated with uncertainty in quality and delivery as well as a
solution procedure for the model. The stochastic model developed
simultaneously optimizes supplier selection and purchase quantities
under price discounts over a time horizon. In particular, our target is
the study of global organizations with multiple sites and multiple
overseas suppliers, where the pricing is offered in suppliers’ local
currencies. Our proposed methodology is applied to a case study for a
US automotive company having two assembly plants and four
potential global suppliers to illustrate how the proposed model works
in practice.
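The abstract does not specify the model's formulation; purely as a toy illustration of the two-stage idea (stage 1 picks suppliers, stage 2 evaluates expected cost over quality/delivery scenarios), with all numbers invented:

```python
# Toy illustration of a two-stage stochastic supplier-selection idea:
# stage 1 chooses a supplier set, stage 2 evaluates expected cost over
# quality/delivery scenarios. All numbers are invented; the paper's actual
# model also handles price discounts, multiple sites and local currencies.
from itertools import combinations

suppliers = {"A": 10.0, "B": 9.0, "C": 11.0}          # unit price
# Scenarios: (probability, {supplier: fraction of units usable on time}).
scenarios = [(0.7, {"A": 0.98, "B": 0.90, "C": 0.99}),
             (0.3, {"A": 0.95, "B": 0.60, "C": 0.97})]
DEMAND, PENALTY = 100, 50.0                            # units, cost per short unit

def expected_cost(chosen):
    """Split demand evenly over chosen suppliers; penalize expected shortfall."""
    share = DEMAND / len(chosen)
    cost = 0.0
    for prob, yield_rate in scenarios:
        delivered = sum(share * yield_rate[s] for s in chosen)
        purchase = sum(share * suppliers[s] for s in chosen)
        cost += prob * (purchase + PENALTY * max(0.0, DEMAND - delivered))
    return cost

# Enumerate every non-empty supplier subset and keep the cheapest in expectation.
best = min(((expected_cost(set(c)), frozenset(c))
            for r in (1, 2, 3) for c in combinations(suppliers, r)),
           key=lambda t: t[0])
```

A real instance replaces this enumeration with a mathematical-programming solver, but the structure is the same: the selection decision is made before the quality/delivery uncertainty resolves.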
Abstract: In this paper a new model for creating a center of motion is proposed. The new method uses cables, which makes it well suited to robots because it is light and easy to assemble. It is especially useful in robots that need to be in contact with objects, as described in the following. The validity of the idea is demonstrated by two experiments. The system can be used in robots that require a fixed point of contact with an object while making a circular motion.
Abstract: Biodiesel, as an alternative renewable fuel, has been
receiving increasing attention due to the limited supply of fossil fuels
and the increasing need for energy. Microalgae are a promising source of lipids, which can be converted to biodiesel. Biodiesel production from microalgae lipids using a lipase-catalyzed reaction in supercritical CO2 medium has several advantages over conventional
production processes. However, identifying the optimum microalgae
lipid extraction and transesterification conditions is still a challenge.
In this study, the quality of biodiesel produced from lipids extracted
from Scenedesmus sp. and their enzymatic transesterification using
supercritical carbon dioxide have been investigated. At the optimum
conditions, the highest biodiesel production yield was found to be
82%. The fuel properties of the biodiesel produced at the optimum reaction conditions were determined, without any separation step, and compared to ASTM standards. The properties were found to comply with the limits and showed a low glycerol content.
Abstract: The legality of some countries' or agencies' spying on the public's personal phone calls has become a hot topic in many social groups' discussions. It is widely believed that this act constitutes an invasion of privacy. Such an act may be justified when it singles out specific cases, but spying without limits is unacceptable. This paper discusses the need for a technique to secure mobile voice calls that is not only simple and lightweight but also independent of any encryption standard or library. It then presents and tests an encryption algorithm based on a frequency-scrambling technique, showing a fair and delay-free process that can protect phone calls from such spying.
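The abstract does not detail the algorithm; as a hedged illustration of the general frequency-scrambling idea only (not the authors' method), one can permute the FFT bins of an audio frame with a key-seeded permutation and invert the permutation on the receiving side:

```python
# Illustrative frequency scrambling (NOT the paper's exact algorithm):
# permute the interior positive-frequency FFT bins of an audio frame with
# a key-seeded permutation; the receiver applies the inverse permutation.
# DC and Nyquist bins are left fixed so the frame stays exactly invertible.
import numpy as np

def scramble(frame, key, inverse=False):
    spectrum = np.fft.rfft(frame)
    rng = np.random.default_rng(key)
    perm = rng.permutation(np.arange(1, len(spectrum) - 1))  # keep DC & Nyquist
    out = spectrum.copy()
    if inverse:
        out[perm] = spectrum[1:-1]        # undo the permutation
    else:
        out[1:-1] = spectrum[perm]        # apply it
    return np.fft.irfft(out, n=len(frame))

key = 42                                   # toy stand-in for a shared secret
frame = np.sin(2 * np.pi * 440 * np.arange(160) / 8000)  # 20 ms @ 8 kHz
garbled = scramble(frame, key)             # unintelligible on the wire
restored = scramble(garbled, key, inverse=True)
```

Because the operation is a per-frame permutation in the frequency domain, it adds essentially no latency beyond the framing itself, which is the property the abstract emphasizes.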
Abstract: In addition to environmental parameters such as rain and temperature, crop disease is a major factor affecting the quality and quantity of crop yield. Hence disease management is a key issue in agriculture. To manage a disease it must be detected at an early stage, so that it can be treated properly and its spread controlled. Nowadays it is possible to use images of a diseased leaf to detect the type of disease by means of image processing techniques. This can be achieved by extracting features from the images, which can then be used with classification algorithms or content-based image retrieval systems. In this paper, color images are used to extract features such as the mean and standard deviation after a region-cropping step. The selected features are taken from the cropped image at different image-size samples. The extracted features are then used for classification with a Fuzzy Inference System (FIS).
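The feature-extraction step described above reduces to cropping a region of interest and computing per-channel statistics; a minimal sketch (the image and crop box are invented):

```python
# Minimal sketch of the feature-extraction step described above: crop a
# region of interest from a color image and compute the mean and standard
# deviation per channel. The image and crop box below are invented.
import numpy as np

def region_features(image, top, left, height, width):
    """Return (means, stds) over each color channel of the cropped region."""
    region = image[top:top + height, left:left + width, :].astype(float)
    return region.mean(axis=(0, 1)), region.std(axis=(0, 1))

# Hypothetical 8x8 RGB image: a uniform green patch on a black background.
img = np.zeros((8, 8, 3), dtype=np.uint8)
img[2:6, 2:6, 1] = 200                      # green square
means, stds = region_features(img, 2, 2, 4, 4)
```

The resulting six numbers (three means, three standard deviations) are the kind of compact feature vector a fuzzy inference system or content-based retrieval index can consume directly.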
Abstract: STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induce if-then rules from a decision table, which is regarded as a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments with rules specified in advance, and by comparison with conventional methods. However, further development is needed before STRIM can be applied to the analysis of real-world data sets. The first requirement is to determine the size of the dataset needed for inducing the true rules, since finding statistically significant rules is the core of the method. The second is to examine the capability of rule induction from datasets whose attribute values are contaminated by missing data and noise, since real-world datasets usually contain such contamination. This paper examines the first problem theoretically, in connection with rule length. The second problem is then examined in a simulation experiment, utilizing the critical dataset size derived in the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values, and hence is applicable to real-world data.
Abstract: Nowadays, cloud environments are becoming a necessity for companies; this technology provides access to data anywhere and at any time. It also provides optimized and secure access to resources and further secures the data stored on the platform. However, some companies do not trust cloud providers: they fear that providers can access and modify confidential data such as bank accounts. Much work has been done in this context, concluding that encryption performed by the provider ensures confidentiality, but overlooking the fact that the provider can also decrypt those confidential resources. The better solution is to apply operations to the data before sending them to the cloud provider, in order to make them unreadable. The principal idea is to let users protect their data with their own methods. In this paper, we present our approach and show that it is more efficient in terms of execution time than some existing methods. This work aims to enhance providers' quality of service and ensure customers' trust.
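The abstract does not disclose the transformation the authors use; purely as a generic illustration of the client-side idea (and the toy XOR keystream below is explicitly not cryptographically secure), data can be transformed locally before upload so the provider only ever stores unreadable bytes:

```python
# Generic client-side protection sketch (NOT the paper's method; the toy
# XOR keystream here is NOT cryptographically secure): the client
# transforms data before upload, so the provider only stores ciphertext.
import hashlib

def keystream_xor(data: bytes, key: bytes) -> bytes:
    """XOR `data` with a SHA-256-derived keystream; applying twice restores it."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

secret = b"account: 1234-5678"
key = b"user-held key, never sent to the provider"
uploaded = keystream_xor(secret, key)        # what the cloud stores
recovered = keystream_xor(uploaded, key)     # what the user reads back
```

The essential property, independent of the specific transformation, is that the key never leaves the client, so the provider cannot decrypt what it stores.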
Abstract: This paper reviews the model-based qualitative and
quantitative Operations Management research in the context of
Construction Supply Chain Management (CSCM). The construction industry has traditionally been blamed for low productivity, cost and time overruns, waste, high fragmentation and adversarial relationships. It has also been slower than other industries to adopt the Supply Chain Management (SCM) concept and to develop models that support decision-making and planning. Over the last decade, however, there has been a distinct shift from a project-based to a supply-based approach to construction management. CSCM is emerging as a promising new management tool for construction operations, improving the performance of construction projects in terms of cost, time and quality. Modeling the Construction Supply
Chain (CSC) offers the means to reap the benefits of SCM, make
informed decisions and gain competitive advantage. Different
modeling approaches and methodologies have been applied in the
multi-disciplinary and heterogeneous research field of CSCM. The
literature review reveals that a considerable percentage of the CSC
modeling research accommodates conceptual or process models
which present general management frameworks and do not relate to
acknowledged soft Operations Research methods. We particularly
focus on the model-based quantitative research and categorize the
CSCM models depending on their scope, objectives, modeling
approach, solution methods and software used. Although over the last few years there has clearly been an increase in research papers on quantitative CSC models, we find that the relevant literature is
very fragmented with limited applications of simulation,
mathematical programming and simulation-based optimization. Most
applications are project-specific or study only parts of the supply
system. Thus, some complex interdependencies within construction
are neglected and the implementation of the integrated supply chain
management is hindered. We conclude this paper by giving future
research directions and emphasizing the need to develop optimization
models for integrated CSCM. We stress that CSC modeling needs a
multi-dimensional, system-wide and long-term perspective. Finally,
prior applications of SCM to other industries have to be taken into
account in order to model CSCs, but not without translating the
generic concepts to the context of construction industry.
Abstract: To construct a lumped spring-mass model that includes the occupants for the offset frontal crash, the SISAME software and NHTSA test data were used. Data from the 56 kph, 40% offset frontal vehicle-to-deformable-barrier crash test of a MY2007 Mazda 6 4-door sedan were obtained from the NHTSA test database. The overall behaviors of the B-pillar and engine in the simulation models agreed very well with the test data. The trends of the accelerations at the driver and passenger heads were similar, but there were large differences in peak values. These differences in peak values caused large errors in the HIC36 and 3 ms chest g's. To predict the behavior of the dummies well, the spring-mass model for the offset frontal crash needs to be improved.
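For context, the HIC36 metric mentioned above is defined from the head-acceleration trace as the maximum, over all time windows up to 36 ms long, of the window duration times the mean acceleration (in g) raised to the power 2.5; a brute-force sketch with an invented test pulse (not NHTSA data):

```python
# Background sketch: HIC36 computed by brute force from a head-acceleration
# trace a(t) in g sampled every dt seconds:
#   HIC = max over windows [t1, t2], t2 - t1 <= 36 ms,
#         of (t2 - t1) * (mean acceleration over the window) ** 2.5
# The triangular pulse below is invented test data, not NHTSA data.
import numpy as np

def hic36(accel_g, dt):
    cum = np.concatenate(([0.0], np.cumsum(accel_g) * dt))  # running integral
    n = len(accel_g)
    max_win = int(round(0.036 / dt))                        # 36 ms in samples
    best = 0.0
    for i in range(n):
        for j in range(i + 1, min(i + max_win, n) + 1):
            duration = (j - i) * dt
            avg = (cum[j] - cum[i]) / duration
            if avg > 0:
                best = max(best, duration * avg ** 2.5)
    return best

# Invented 30 ms triangular pulse peaking at 60 g, sampled at 1 kHz.
t = np.arange(0, 0.030, 0.001)
pulse = 60 * (1 - np.abs(t - 0.015) / 0.015)
value = hic36(pulse, 0.001)
```

Because of the 2.5 exponent, the metric is dominated by the peak region of the trace, which is why the peak-value errors noted above translate directly into large HIC36 errors.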
Abstract: The main function of Medium Access Control (MAC) is to share the channel efficiently between all nodes. In a real-time scenario, a certain amount of bandwidth is wasted due to back-off periods: more bandwidth is wasted in the idle state if the back-off period is very high, while collisions may occur if the back-off period is small. An optimization is therefore needed for this problem. The main objective of this work is to reduce the delay due to the back-off period, thereby reducing collisions and increasing throughput. A method called the virtual back-off algorithm (VBA) is used to optimize the back-off period, increasing throughput and reducing collisions. The main idea is to optimize the number of transmissions for every node; a counter, whose value represents the sequence number, is introduced at each node to implement this idea. VBA is classified into two types: VBA with counter sharing (VBA-CS) and VBA with no counter sharing (VBA-NCS). These two variants are compared on various parameters. Simulation is done in the NS-2 environment, and the results obtained are promising.
Abstract: The rapidly changing factors that affect daily life also affect the operational environment and the way military leaders fulfill their missions. With the help of technological developments, the traditional linearity of conflict and war has started to fade away. Furthermore, the mission domain has broadened to include traditional threats, hybrid threats and the new challenges of cyber and space. Considering the future operational environment, future military leaders need to adapt themselves to the new challenges of the future battlefield. But how does one decide what leadership qualities are required to operate and accomplish the mission on the new, complex battlefield? The main aim of this article is to answer this question. To find the right answers, leadership and its components are first defined, then the characteristics of the future operational environment are analyzed. Finally, the leadership qualities required to succeed on the redefined battlefield are explained.
Abstract: Distributed applications deployed on LEO satellites
and ground stations require substantial communication between
different members in a constellation to overcome the earth
coverage barriers imposed by GEOs. Applications running on LEO
constellations suffer from the earth line-of-sight blockage effect and need adequate lab testing before launch to space. We propose a scalable cloud-based network simulation framework to simulate the problems created by earth line-of-sight blockage. The framework utilizes cloud IaaS virtual machines to simulate the distributed software of LEO satellites and ground stations. A factorial ANOVA
statistical analysis is conducted to measure simulator overhead on
overall communication performance. The results showed a very low
simulator communication overhead. Consequently, the simulation
framework is proposed as a candidate for testing LEO constellations
with distributed software in the lab before space launch.
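The earth line-of-sight blockage the framework simulates is a simple geometric condition: two nodes can communicate only if the segment between them clears the earth. A minimal sketch (the altitudes and positions are invented, and a real simulator would add an atmospheric margin to the blocking radius):

```python
# Geometric sketch of earth line-of-sight blockage: two satellites see each
# other iff the segment between them does not dip below the blocking radius.
# Positions are earth-centered coordinates in km; the 550 km shell is invented.
import math

EARTH_R = 6371.0  # mean earth radius, km

def has_line_of_sight(p1, p2, blocking_radius=EARTH_R):
    """True if the segment p1-p2 clears the blocking sphere at the origin."""
    dx = [b - a for a, b in zip(p1, p2)]
    seg2 = sum(d * d for d in dx)
    # Parameter of the point on the segment closest to the earth's center.
    t = max(0.0, min(1.0, -sum(a * d for a, d in zip(p1, dx)) / seg2))
    closest = [a + t * d for a, d in zip(p1, dx)]
    return math.dist(closest, (0.0, 0.0, 0.0)) >= blocking_radius

r = EARTH_R + 550.0                              # ~550 km LEO shell
sat_a = (r, 0.0, 0.0)
sat_b = (-r, 0.0, 0.0)                           # antipodal: earth blocks the path
sat_c = (r * math.cos(math.radians(30)),         # 30 degrees away: path is clear
         r * math.sin(math.radians(30)), 0.0)
```

At 550 km altitude the blockage sets in at a surprisingly small separation (central angles beyond roughly 46 degrees), which is why constellation software must constantly route around it.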
Abstract: In dynamic system theory, a mathematical model is often used to describe a system's properties. In order to find the transfer matrix of a dynamic system, we need to calculate an inverse matrix. The paper combines the classical theory with procedures used in the theory of automated control for calculating the inverse matrix. The final part of the paper models the given problem in Matlab.
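For context, the matrix inverse in question typically arises in the state-space transfer matrix G(s) = C (sI − A)⁻¹ B + D; a minimal numeric sketch (in Python rather than the paper's Matlab, with an invented second-order system):

```python
# Context sketch: the transfer matrix of a state-space system,
#   G(s) = C (sI - A)^{-1} B + D,
# requires inverting (sI - A). Python is used here instead of the paper's
# Matlab; the mass-spring-damper system below is invented.
import numpy as np

def transfer_matrix(A, B, C, D, s):
    """Evaluate G(s) = C (sI - A)^{-1} B + D at one complex frequency s."""
    n = A.shape[0]
    return C @ np.linalg.inv(s * np.eye(n) - A) @ B + D

# Invented system: x'' + 2 x' + 5 x = u, output y = x,
# so G(s) = 1 / (s^2 + 2 s + 5) and the DC gain G(0) is 1/5.
A = np.array([[0.0, 1.0], [-5.0, -2.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

g0 = transfer_matrix(A, B, C, D, 0.0)
```

In symbolic work the same inverse is computed via the adjugate over det(sI − A), which is where the connection to classical inverse-matrix procedures comes from.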
Abstract: Exploration and exploitation capabilities are both
important within Operations as means for improvement when
managed separately, and for establishing dynamic improvement
capabilities when combined in balance. However, it is unclear what
exploration and exploitation capabilities imply in improvement and
development work within an Operations context. So, in order to better understand how to develop exploration and exploitation capabilities within Operations, the main characteristics of these constructs need to be identified and further understood. Thus, the objective of this research is to increase the understanding of exploitation and exploration characteristics, to concretize what they translate to within the context of improvement and development work in an Operations unit, and to identify practical challenges. A
literature review and a case study are presented. In the literature
review, different interpretations of exploration and exploitation are
portrayed, key characteristics have been identified, and a deepened
understanding of exploration and exploitation characteristics is
described. The case in the study is an Operations unit, and the aim is
to explore to what extent and in what ways exploration and
exploitation activities are part of the improvement structures and
processes. The contribution includes an identification of key
characteristics of exploitation and exploration, as well as an
interpretation of the constructs. Further, some practical challenges are
identified. For instance, exploration activities tend to be given low priority, both in daily work and in the manufacturing strategy. Also, the overall understanding of the concepts of exploitation and exploration (or any similar aspect of dynamic improvement capabilities) is very low.
Abstract: In order to address construction project requirements
and specifications, scholars and practitioners need to establish a taxonomy according to the scheme that best fits their needs. While
existing characterization methods are continuously being improved,
new ones are devised to cover project properties which have not been
previously addressed. One such method, the Project Definition Rating
Index (PDRI), has received limited consideration strictly as a
classification scheme. Developed by the Construction Industry
Institute (CII) in 1996, the PDRI has been refined over the last two
decades as a method for evaluating a project's scope definition
completeness during front-end planning (FEP). The main
contribution of this study is a review of practical project classification
methods, and a discussion of how PDRI can be used to classify
projects based on their readiness in the FEP phase. The proposed
model has been applied to 59 construction projects in Ontario, and
the results are discussed.