Abstract: The performance of small and medium enterprises has stagnated over the last two decades, mainly due to the emergence of HIV/AIDS. The disease has had a detrimental effect on the general economy of the country, leading to morbidity and mortality among the Kenyan workforce in its prime age. The present study sought to establish the economic impact of HIV/AIDS on micro-enterprise development in Obunga slum, Kisumu, in terms of production loss and increased labor-related costs, and to identify possible strategies to address the impact of HIV/AIDS on micro-enterprises. The study was motivated by the observation that most micro-enterprises in the slum face severe economic and social crisis due to the impact of HIV/AIDS: they become depleted and close down within a short time owing to the death of skilled and experienced workers. The study was carried out between June 2008 and June 2009 in Obunga slum. Data were subjected to computer-aided statistical analysis that included descriptive statistics, chi-squared, and ANOVA techniques. Chi-squared analysis of micro-enterprise owners' opinions on the impact of HIV/AIDS on the depletion of micro-enterprises, compared to other diseases, indicated high levels of negative effects of the disease at significance levels of P
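As an illustrative aside on the chi-squared analysis mentioned above: the abstract reports no contingency table, so the sketch below computes the Pearson chi-squared statistic for a purely hypothetical table of owner opinions (rows: HIV/AIDS vs. other diseases; columns: major vs. minor impact).

```python
def chi_squared(table):
    """Pearson chi-squared statistic for an r x c contingency table
    given as a list of rows of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical opinion counts, for illustration only.
opinions = [[30, 10],   # HIV/AIDS: major impact, minor impact
            [10, 30]]   # other diseases
print(chi_squared(opinions))  # 20.0, 1 degree of freedom
```

The statistic would then be compared against the chi-squared critical value for the table's degrees of freedom at the chosen significance level.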
Abstract: This paper develops a GIS interface for estimating sequences of stream-flows at ungauged stations based on known flows at gauged stations. The integrated GIS interface comprises three major steps. In the first, statistical analysis of precipitation characteristics yields a multiple linear regression equation for the long-term mean daily flow at ungauged stations; the independent variables in the regression equation are mean daily flow and drainage area. Traditionally, mean flow data are generated using the Thiessen polygon method; here, however, the user can select the method for obtaining mean flow data, such as Kriging, IDW (Inverse Distance Weighted), or Spline, as well as the traditional methods. In the second step, the flow duration curve (FDC) at an ungauged station is computed from the FDCs at gauged stations, and the mean annual daily flow is computed by a spatial interpolation algorithm. The third step obtains watershed/topographic characteristics, which are among the most important factors governing stream-flows. Finally, the simulated daily flow time series are compared with observed time series. The results obtained using the integrated GIS interface closely match the observations, and the relationship between topographic/watershed characteristics and stream-flow time series is highly correlated.
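The regression step above can be sketched with an ordinary least-squares fit. The abstract names drainage area as one predictor; the second predictor and all coefficients below are synthetic placeholders, not values from the paper.

```python
import numpy as np

def fit_flow_regression(area, precip, flow):
    """Least-squares fit of long-term mean daily flow against
    drainage area and mean precipitation (with an intercept)."""
    X = np.column_stack([np.ones_like(area), area, precip])
    coef, *_ = np.linalg.lstsq(X, flow, rcond=None)
    return coef

def predict_flow(coef, area, precip):
    """Estimate mean daily flow at an ungauged station."""
    return coef[0] + coef[1] * area + coef[2] * precip

# Synthetic gauged-station data: flow = 0.5 + 0.2*area + 1.5*precip.
area = np.array([10.0, 20.0, 30.0, 40.0])
precip = np.array([1.0, 2.0, 3.0, 5.0])
flow = 0.5 + 0.2 * area + 1.5 * precip
coef = fit_flow_regression(area, precip, flow)
print(predict_flow(coef, 25.0, 2.5))  # 9.25 for this synthetic model
```

In the interface, the precipitation input to such a regression would come from the user-selected spatial interpolation method (Thiessen, Kriging, IDW, or Spline).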
Abstract: Content-Based Image Retrieval (CBIR) has been one of the most active research areas in the field of computer vision over the last 10 years. Many programs and tools have been developed to formulate and execute queries based on visual or audio content and to help browse large multimedia repositories. Still, no general breakthrough has been achieved with respect to large, varied databases containing documents of differing sorts and with varying characteristics. Many questions with respect to speed, semantic descriptors, and objective image interpretation remain unanswered. In the medical field, images, and especially digital images, are produced in ever-increasing quantities and used for diagnostics and therapy. Several articles have proposed content-based access to medical images for supporting clinical decision making, which would ease the management of clinical data, and scenarios for integrating content-based access methods into Picture Archiving and Communication Systems (PACS) have been created. This paper gives an overview of soft computing techniques and of new research directions that may prove useful. Still, very few systems seem to be used in clinical practice. It should also be stated that the goal is not, in general, to replace text-based retrieval methods as they exist at the moment.
Abstract: This paper presents a novel method that allows an agent host to delegate its signing power to an anonymous mobile agent in such a way that the mobile agent does not reveal any information about its host's identity and, at the same time, can be authenticated by the service host, hence ensuring fairness of service provision. The solution introduces a verification server to verify the signature generated by the mobile agent in such a way that even if it colludes with the service host, neither party gains more information than it already has. The solution incorporates three methods: an Agent Signature Key Generation method, an Agent Signature Generation method, and an Agent Signature Verification method. The most notable feature of the solution is that, in addition to allowing secure and anonymous signature delegation, it enables tracking of malicious mobile agents when a service host is attacked. The security properties of the proposed solution are analyzed, and the solution is compared with the most closely related work.
Abstract: The study investigated the effect of rice type on chewing behaviour (chewing time, number of chews, and portion size) and bolus properties (bolus moisture content, solid loss, and particle size distribution (PSD)) in human subjects. Five cooked rice types, brown rice (BR), white rice (WR), parboiled white rice (PR), high-amylose white rice (HR), and waxy white rice (WXR), were chewed by six subjects. Chewing behaviours were recorded and the food boluses were collected during mastication. Rice type was found to significantly influence all chewing parameters evaluated, with WXR and BR showing the most pronounced differences from the other rice types. The initial moisture content of un-chewed WXR was the lowest (43.39%), whereas those of the other rice types ranged from 66.86 to 70.33%. The bolus obtained from chewing WXR had the lowest moisture content (56.43%), whilst its solid loss (22.03%) was not significantly different from those of the other rice types. In the PSD evaluation using a Mastersizer S, the measured particle diameters ranged from 4 to 3500 μm. The boluses from BR, HR, and WXR contained much finer particles than those from WR and PR.
Abstract: This paper presents an overview of the multiobjective shortest path problem (MSPP) and a review of essential and recent issues regarding methods for its solution. The paper further explores a multiobjective evolutionary algorithm as applied to the MSPP and describes its behavior in terms of diversity of solutions, computational complexity, and optimality of solutions. Results show that the evolutionary algorithm can find diverse solutions to the MSPP in polynomial time (based on several network instances) and can be an alternative when other methods are trapped by the tractability problem.
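To make the notion of "diverse solutions" concrete: an MSPP answer is a Pareto front of paths, none of which is dominated in all objectives. The sketch below (exhaustive enumeration on a toy two-objective graph, not the paper's evolutionary algorithm) illustrates the dominance filter that any MSPP solver, evolutionary or exact, ultimately applies.

```python
def dominates(a, b):
    """True if cost vector a is at least as good as b everywhere
    and strictly better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_paths(graph, src, dst):
    """Enumerate simple paths by DFS and keep the non-dominated ones.
    graph: {node: [(neighbor, (cost1, cost2)), ...]}"""
    results = []

    def dfs(node, cost, path):
        if node == dst:
            results.append((cost, list(path)))
            return
        for nxt, (c1, c2) in graph.get(node, []):
            if nxt not in path:           # keep paths simple
                path.append(nxt)
                dfs(nxt, (cost[0] + c1, cost[1] + c2), path)
                path.pop()

    dfs(src, (0, 0), [src])
    return [r for r in results
            if not any(dominates(o[0], r[0]) for o in results if o is not r)]

# Toy network with (time, toll) edge costs.
g = {"A": [("B", (1, 5)), ("C", (2, 2)), ("D", (5, 5))],
     "B": [("D", (1, 1))],
     "C": [("D", (2, 1))]}
print(sorted(c for c, _ in pareto_paths(g, "A", "D")))  # [(2, 6), (4, 3)]
```

The direct edge A-D with cost (5, 5) is dominated by A-C-D at (4, 3) and so drops out; the remaining two paths trade time against toll and both belong to the front.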
Abstract: A novel adaptive fuzzy trajectory tracking algorithm for a Stewart-platform-based motion platform is proposed to compensate for path deviation and degradation of controller performance due to actuator torque limits. The algorithm is divided into two parts: a real-time trajectory shaping part and a joint-space adaptive fuzzy controller part. Whenever any of the actuators saturates for a reference trajectory in task space, the desired acceleration of the reference trajectory is modified on-line using the dynamic model of the motion platform. Meanwhile, an additional action based on the difference between the nominal and modified trajectories is applied in the non-saturated region of the actuators to reduce the path error. Using the modified trajectory as input, the joint-space controller incorporates a computed-torque controller, a leg velocity observer, and a fuzzy disturbance observer with saturation compensation. It ensures the stability and tracking performance of the controller in the presence of external disturbances and position-only measurements. Simulation results verify the effectiveness of the proposed control scheme.
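The trajectory-shaping idea, modifying the desired acceleration whenever an actuator would saturate, can be reduced to a one-degree-of-freedom sketch. The paper works with the full Stewart-platform dynamic model in joint space; the rigid-body model and limits below are placeholders for illustration only.

```python
def shape_acceleration(a_des, mass, tau_max):
    """Scale the desired acceleration so the required torque stays
    within the actuator limit (1-DOF stand-in for the on-line
    trajectory shaping described in the abstract)."""
    tau = mass * a_des            # torque demanded by the reference
    if abs(tau) <= tau_max:
        return a_des              # actuator not saturated: keep reference
    # Clip the torque to its limit and return the feasible acceleration.
    return (tau_max if tau > 0 else -tau_max) / mass

print(shape_acceleration(1.0, 3.0, 4.0))  # 1.0  (3 N*m demand, within limit)
print(shape_acceleration(2.0, 3.0, 4.0))  # 1.333... (6 N*m demand, clipped)
```

The difference between the nominal acceleration and the shaped one is what the paper's additional corrective action consumes once the actuators leave saturation.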
Abstract: This paper proposes fractal patterns for power quality (PQ) detection using a color relational analysis (CRA) based classifier. An iterated function system (IFS) uses non-linear interpolation maps and similarity maps to construct various fractal patterns of power quality disturbances, including harmonics, voltage sag, voltage swell, voltage sag involving harmonics, voltage swell involving harmonics, and voltage interruption. The non-linear interpolation functions (NIFs) with fractal dimension (FD) make the fractal patterns more distinguishable between normal and abnormal voltage signals. The CRA-based classifier discriminates the disturbance events in a power system. Compared with wavelet neural networks, the test results show accurate discrimination, good robustness, and faster processing time for detecting disturbance events.
Abstract: This paper evaluates audio and speech quality with the help of a digital audio watermarking technique under different types of attacks (signal impairments), namely Gaussian noise, compression error, and jittering; together these attacks are considered a hostile environment. Audio and speech quality evaluation is an important research topic. The traditional way to evaluate speech quality is through subjective tests. They are reliable, but very expensive and time-consuming, and cannot be used in certain applications such as online monitoring. Objective models, based on human perception, were developed to predict the results of subjective tests. Existing objective methods require either the original speech or a complicated computation model, which makes some applications of quality evaluation impossible.
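Of the attacks listed above, the Gaussian-noise impairment is straightforward to reproduce. The sketch below adds white Gaussian noise at a target SNR and then measures the resulting SNR against the clean reference; the watermarking itself and the other attacks are outside its scope.

```python
import numpy as np

def add_gaussian_noise(signal, snr_db, rng=None):
    """Additive white Gaussian noise attack at a target SNR (dB)."""
    if rng is None:
        rng = np.random.default_rng(0)
    p_signal = np.mean(signal ** 2)
    p_noise = p_signal / (10 ** (snr_db / 10))   # noise power for target SNR
    return signal + rng.normal(0.0, np.sqrt(p_noise), signal.shape)

def snr_db(clean, degraded):
    """Measured SNR of a degraded signal against the clean reference."""
    noise = degraded - clean
    return 10 * np.log10(np.mean(clean ** 2) / np.mean(noise ** 2))

# A 440 Hz tone attacked at 20 dB SNR; the measured SNR stays near 20 dB.
t = np.linspace(0.0, 1.0, 8000)
tone = np.sin(2 * np.pi * 440 * t)
attacked = add_gaussian_noise(tone, 20.0)
print(round(snr_db(tone, attacked), 1))
```

An objective quality model would then be evaluated on how well its scores track such controlled degradations.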
Abstract: The electrical potentials generated during eye movements and blinks are one of the main sources of artifacts in electroencephalogram (EEG) recordings and can propagate widely across the scalp, masking and distorting brain signals. In recent times, signal separation algorithms have been widely used for removing artifacts from observed EEG data. In this paper, a recently introduced signal separation algorithm, Mutual Information based Least dependent Component Analysis (MILCA), is employed to separate ocular artifacts from EEG. The aim of MILCA is to minimize the mutual information (MI) between the independent components (estimated sources) under a pure rotation. The performance of this algorithm is compared with eleven popular algorithms (Infomax, Extended Infomax, Fast ICA, SOBI, TDSEP, JADE, OGWE, MS-ICA, SHIBBS, Kernel-ICA, and RADICAL) in terms of the actual independence and uniqueness of the estimated source components obtained for different sets of EEG data with ocular artifacts, using a reliable MI estimator. Results show that MILCA performs best in separating ocular artifacts from EEG and is recommended for further analysis.
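MILCA itself relies on an MI estimator and is not reproduced here; as a stand-in, the sketch below implements a minimal version of Fast ICA (one of the eleven baselines above, in its standard textbook form with a tanh nonlinearity and deflation) and separates two synthetic mixed sources, mimicking how any of these algorithms would unmix EEG from an ocular artifact.

```python
import numpy as np

def fast_ica(X, n_iter=200, seed=0):
    """Minimal Fast ICA (tanh nonlinearity, deflation) on X whose
    rows are mixed signals. Returns the estimated sources."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)
    # Whiten via eigendecomposition of the covariance matrix.
    d, E = np.linalg.eigh(np.cov(X))
    Z = (E @ np.diag(1.0 / np.sqrt(d)) @ E.T) @ X
    n = Z.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        w = rng.normal(size=n)
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            g = np.tanh(Z.T @ w)                      # nonlinearity
            w_new = (Z @ g) / Z.shape[1] - (1.0 - g ** 2).mean() * w
            w_new -= W[:i].T @ (W[:i] @ w_new)        # deflation step
            w_new /= np.linalg.norm(w_new)
            converged = abs(abs(w_new @ w) - 1.0) < 1e-8
            w = w_new
            if converged:
                break
        W[i] = w
    return W @ Z

# Two synthetic sources (a slow sine "EEG" and a square "artifact"), mixed.
t = np.linspace(0.0, 1.0, 2000)
s1, s2 = np.sin(2 * np.pi * 5 * t), np.sign(np.sin(2 * np.pi * 13 * t))
X = np.array([[1.0, 0.5], [0.5, 1.0]]) @ np.vstack([s1, s2])
S = fast_ica(X)   # rows correlate strongly with s1 and s2 (up to order/sign)
```

ICA recovers sources only up to permutation and sign, which is why comparisons such as the one in this paper score estimated components against each true source by maximum absolute correlation.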
Abstract: To study the influence of different weed control methods, such as mechanical weeding, and to analyze mechanical weeder efficiency under mechanical cultivation conditions, an experiment was conducted in the 2011 farming year on a farm at the coupling and development of technology center in Haraz, Iran. The treatments consisted of (I) a control treatment in which no weeding was done, (II) mechanical weeding without an engine, and (III) powered mechanical weeding. Results showed that the experimental treatments had significantly different effects (p = 0.05) on yield traits and the number of filled grains per panicle, while the treatments had significant effects on grain weight and dry weight of weeds in the first, second, and third weeding methods at the 1% significance level. Treatment (II) had the most significant effect on the number of filled grains per panicle and on yield, peaking at 3705.97 kg ha-1. Treatment (III) ranked second with 3559.8 kg ha-1, while treatment (I) yielded 2364.73 kg ha-1. The minimum dry weights of weeds across all weeding methods were obtained under treatments (II), (III), and (I), respectively. Correlation analysis showed that total yield had significant positive correlations with panicle grain yield per plant (r = 0.55*), the number of grains per panicle (r = 0.57*), and the number of filled grains (r = 0.63*). Total rice yield also had a negative correlation of r = -0.64* with weed dry weight at the second weed sampling time (17 DAT). Weed dry weights at the third and fourth sampling times (24 and 40 DAT) had negative correlations with rice yield of r = -0.65** and r = -0.61*, respectively.
Abstract: Unlike the best-effort service provided by the Internet today, next-generation wireless networks will support real-time applications. This paper proposes an adaptive early packet discard (AEPD) policy to improve the performance of real-time TCP traffic over ATM networks and avoid the fragmentation problem. Three main aspects are incorporated in the proposed policy. First, it provides quality-of-service (QoS) guarantees for real-time applications by implementing priority scheduling. Second, it resolves the partially corrupted packets problem by differentiating the buffered cells of one packet from another. Third, it adapts the discard threshold dynamically using fuzzy logic, based on the traffic behavior, to maintain high throughput under a variety of load conditions. The simulation is run for two priority classes of input traffic: real-time and non-real-time. Simulation results show that the proposed AEPD policy improves throughput and fairness over a static-threshold policy under the same traffic conditions.
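The fuzzy threshold adaptation in the third aspect can be sketched with a single input. The abstract does not give the membership functions or rule base, so the triangular sets, rules, and threshold range below are illustrative assumptions: low offered load maps to a high discard threshold, high load to a low one, with weighted-average defuzzification in between.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def adapt_threshold(load, t_min=0.5, t_max=0.9):
    """Map offered load in [0, 1] to a buffer discard threshold
    (fraction of buffer) via three fuzzy rules."""
    low = tri(load, -0.5, 0.0, 0.6)      # rule: low load  -> high threshold
    med = tri(load, 0.2, 0.5, 0.8)       # rule: med load  -> mid threshold
    high = tri(load, 0.4, 1.0, 1.5)      # rule: high load -> low threshold
    num = low * t_max + med * (t_min + t_max) / 2 + high * t_min
    return num / (low + med + high)      # weighted-average defuzzification

print(adapt_threshold(0.0))  # 0.9: idle link, discard late
print(adapt_threshold(0.5))  # 0.7: moderate load, intermediate threshold
print(adapt_threshold(1.0))  # 0.5: congested link, discard early
```

In the AEPD setting the defuzzified value would decide how full the cell buffer may get before whole packets of the lower-priority class start being discarded.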
Abstract: Formal specification languages are widely used for system specification and testing. Highly critical systems, such as real-time, avionics, and medical systems, are represented using formal specification languages. Formal-specification-based testing is mostly performed using black-box approaches, thus testing only the set of inputs and outputs of the system. A formal specification language such as VDM++ can also be used for white-box testing, as it provides as many constructs as any other high-level programming language. In this work, we perform data and control flow analysis of VDM++ class specifications. The proposed work is illustrated with a SavingAccount example.
Abstract: Agricultural waste is mainly composed of cellulose and hemicelluloses, which can be converted to sugars. Inexpensive reducing sugar was obtained from durian peel by hydrolysis with HCl at concentrations of 0.5-2.0% (v/v) for 15-60 min while the mixture was autoclaved at 121 °C. The results showed that acid hydrolysis efficiency (AHE) reached its highest value of 80.99% at 2.0% HCl for 15 min. Reducing sugar peaked at 56.07 g/litre at 2.0% HCl for 45 min. Total sugar peaked at 59.83 g/litre at 2.0% HCl for 45 min, which was not significantly different (p < 0.05) from the values at 2.0% HCl for 30 min and at 1.5% HCl for 45 and 60 min. Increasing the acid concentration increased AHE, reducing sugar, and total sugar, whereas hydrolysis time had no effect on them. The maximum reducing sugar at each concentration occurred at a hydrolysis time of 45 min. The hydrolysates were analyzed by HPLC; the results revealed that the principal sugars were glucose, fructose, and xylose.
Abstract: The ability to detect and classify the type of fault plays a great role in the protection of power systems, and the procedure is required to be both precise and fast. In this paper, fault-type detection is implemented using wavelet analysis together with the wavelet entropy principle. The power system is simulated using PSCAD/EMTDC. Different types of faults were studied, yielding various current waveforms. These current waveforms were decomposed by wavelet analysis into approximations and details, and the wavelet entropy of the decompositions was analyzed, leading to a successful methodology for fault classification. The suggested approach is tested on different fault types and successfully identifies the type of fault.
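Wavelet entropy, as used above, measures how the signal's energy spreads across decomposition levels. The sketch below uses a hand-rolled Haar DWT (the paper does not name its mother wavelet, so Haar is an assumption) and the Shannon entropy of the relative wavelet energies: a waveform concentrated in one band has near-zero entropy, while a broadband fault transient scores high.

```python
import numpy as np

def haar_dwt(signal, levels):
    """Haar DWT: return the detail coefficients of each level."""
    a = np.asarray(signal, dtype=float)
    details = []
    for _ in range(levels):
        a = a[: len(a) // 2 * 2]              # even length for pairing
        details.append((a[0::2] - a[1::2]) / np.sqrt(2))   # details
        a = (a[0::2] + a[1::2]) / np.sqrt(2)               # approximation
    return details

def wavelet_entropy(signal, levels=4):
    """Shannon entropy of the relative wavelet energy per detail level."""
    energies = np.array([np.sum(d ** 2) for d in haar_dwt(signal, levels)])
    p = energies / energies.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# All energy in the finest band -> entropy 0; white noise -> high entropy.
narrow = np.tile([1.0, -1.0], 256)
broad = np.random.default_rng(0).normal(size=1024)
print(wavelet_entropy(narrow), wavelet_entropy(broad))
```

A classifier would compute such entropies per phase current and per level and use them as features to distinguish fault types.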
Abstract: In this paper, we study the knapsack sharing problem, a variant of the well-known NP-hard single knapsack problem. We investigate the use of a tree search for optimally solving the problem. The method combines two complementary phases: a reduction interval search phase and a branch-and-bound procedure. First, the reduction phase applies a polynomial reduction strategy that decomposes the problem into a series of knapsack problems. Second, the tree search procedure is applied in order to attain a set of optimal capacities characterizing the knapsack problems. Finally, the performance of the proposed optimal algorithm is evaluated on a set of instances from the literature, and its runtime is compared to that of the best exact algorithm in the literature.
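Since the reduction phase above decomposes the sharing problem into a series of single knapsack problems, the textbook 0/1 knapsack dynamic program is the basic subroutine such an algorithm calls for each candidate capacity. The sketch below is that standard DP, not the paper's tree-search algorithm.

```python
def knapsack(values, weights, capacity):
    """Classic 0/1 knapsack DP over capacities; O(n * capacity) time.
    Returns the maximum total value packable within the capacity."""
    best = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # Iterate capacities downwards so each item is used at most once.
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

print(knapsack([60, 100, 120], [10, 20, 30], 50))  # 220
```

In the knapsack sharing setting, the branch-and-bound phase searches over how the shared capacity is split among the classes, solving a subproblem like this one for each tentative split.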
Abstract: Neem is a highly heterozygous and commercially important perennial plant. Conventionally, it is propagated by seeds, which lose viability within two weeks. The strictly cross-pollinating nature of the plant poses a serious barrier to genetic improvement by conventional methods. Alternative methods of tree improvement, such as somatic hybridization, mutagenesis, and genetic transformation, require an efficient in vitro plant regeneration system. In this regard, somatic embryogenesis, particularly secondary somatic embryogenesis, may offer an effective system for large-scale plant propagation without affecting the clonal fidelity of the regenerants. It can be used for synthetic seed production, which further bolsters conservation of this tree species, which is otherwise very difficult. The present report describes the culture conditions necessary to induce and maintain repetitive somatic embryogenesis, for the first time, in neem. Of the various treatments tested, somatic embryos were induced directly from immature zygotic embryos of neem on MS + TDZ (0.1 μM) + ABA (4 μM) in more than 76% of cultures. Direct secondary somatic embryogenesis occurred from primary somatic embryos on MS + IAA (5 μM) + GA3 (5 μM) in 12.5% of cultures. The embryogenic competence of the explants as well as of the primary embryos was maintained for a long period by repeated subcultures at frequent intervals. A maximum of 10% of these somatic embryos were converted into plantlets.
Abstract: Animated graphs make a good impression when presenting information, yet few people can produce them because generating an animated graph requires technical skill. This work presents a Content Management System with Animated Graph (CMS-AG), a web-based system enabling users to produce effective, interactive graphical reports in a short time. It supports three levels of user authentication and provides profile updating, account management, template management, graph management, and change tracking. The system was developed using an incremental development approach, object-oriented concepts, and Web programming technologies. The design architecture promotes a new technology for reporting; it also helps users cut unnecessary expenses, save time, and learn new things at the different user levels. This paper describes the developed system.
Abstract: Ontologies play an important role in semantic web applications; they are often developed by different groups and continue to evolve over time. The knowledge in ontologies changes so rapidly that applications become outdated if they continue to use old versions, or unstable if they jump to new versions. Temporal frames using frame versioning and slot versioning are employed to handle the dynamic nature of ontologies. The paper proposes new tags and a restructured OWL format enabling applications to work with either the old or the new version of an ontology. The Gene Ontology, a very dynamic ontology, is used as a case study to explain OWL ontologies with temporal tags.
Abstract: Clustering unstructured text documents is an important issue in the data mining community, with applications such as document archive filtering, document organization, topic detection, and subject tracing. In the real world, some already clustered documents may lose importance while new, more significant documents evolve. Most work done so far on clustering unstructured text documents overlooks this aspect. This paper addresses the issue by using a fading function. The unstructured text documents are clustered, and for each cluster a statistics structure called a Cluster Profile (CP) is maintained. The cluster profile incorporates the fading function, which keeps account of the time-dependent importance of the cluster. The work proposes a novel algorithm, the Clustering n-ary Merge Algorithm (CnMA), for unstructured text documents, which uses the Cluster Profile and the fading function. Experimental results illustrating the effectiveness of the proposed technique are included.
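The time-dependent importance described above is commonly modeled in stream clustering with an exponential fading function f(t) = 2^(-lambda*t). The abstract does not give CnMA's exact formula, so the decay form, rate, and the minimal cluster-profile class below are illustrative assumptions.

```python
def fading(age, decay=0.01):
    """Exponential fading function f(t) = 2**(-decay * t): the weight
    of a document that arrived `age` time units ago."""
    return 2.0 ** (-decay * age)

class ClusterProfile:
    """Tiny sketch of a cluster profile whose importance fades over time."""
    def __init__(self, decay=0.01):
        self.decay = decay
        self.arrivals = []            # timestamps of member documents

    def add(self, timestamp):
        self.arrivals.append(timestamp)

    def weight(self, now):
        # Time-dependent importance: sum of the faded document weights.
        return sum(fading(now - t, self.decay) for t in self.arrivals)

cp = ClusterProfile(decay=0.5)
cp.add(0.0)          # old document
cp.add(2.0)          # fresh document
print(cp.weight(2.0))  # 0.5 + 1.0 = 1.5: the old document counts for half
```

A clustering algorithm can then drop or merge clusters whose faded weight falls below a threshold, letting stale topics expire while fresh ones dominate.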