Abstract: Psoriasis is a chronic inflammatory skin condition that
affects 2-3% of the population worldwide. The Psoriasis Area and
Severity Index (PASI) is the gold standard for assessing psoriasis
severity as well as treatment efficacy. Despite being the gold
standard, PASI is rarely used because it is tedious and complex. In
practice, the PASI score is determined subjectively by
dermatologists, so inter- and intra-rater variations in assessment
can occur even among expert dermatologists. This research develops
an algorithm to assess psoriasis lesions for objective PASI scoring.
The focus of this research is thickness assessment, one of the four
PASI parameters besides area, erythema and scaliness. Psoriasis
lesion thickness is measured by averaging the total elevation from
the lesion base to the lesion surface. Thickness values of 122 3D
images taken from 39 patients are grouped into 4 PASI thickness
scores using K-means clustering. Validation of the lesion base
construction is performed using twelve body curvature models and
shows good results, with a coefficient of determination (R2) equal
to 1.
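The grouping step described above - partitioning one-dimensional thickness values into 4 score clusters with K-means - can be sketched as follows. This is a minimal illustrative implementation of Lloyd's algorithm, not the authors' code; the sample values and the quantile-based initialization are assumptions.

```python
import numpy as np

def kmeans_1d(values, k=4, iters=100):
    """Plain Lloyd's k-means for one-dimensional data (illustrative sketch)."""
    x = np.asarray(values, dtype=float)
    # initialize centroids at spread-out quantiles of the data
    centroids = np.quantile(x, np.linspace(0.0, 1.0, k))
    for _ in range(iters):
        # assign each value to its nearest centroid
        labels = np.argmin(np.abs(x[:, None] - centroids[None, :]), axis=1)
        # move each centroid to the mean of its assigned values
        new = np.array([x[labels == j].mean() if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids
```

Each of the k clusters would then be mapped to one PASI thickness score.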
Abstract: Several studies have been carried out, using various techniques including neural networks, to discriminate vigilance states in humans from electroencephalographic (EEG) signals, but we are still far from satisfactorily usable results. The work presented in this paper aims to improve this situation in two respects. Firstly, we introduce an original procedure combining two neural networks, a self-organizing map (SOM) and a learning vector quantization (LVQ) network, that automatically detects artefacted states and separates the different levels of vigilance, which is a major breakthrough in the field. Secondly, and more importantly, our study is oriented toward real-world situations, and the resulting model can easily be implemented as a wearable device: it has restricted computational and memory requirements, and data access is very limited in time. Furthermore, ongoing work suggests that this study should shortly result in the design of a non-invasive electronic wearable device.
Abstract: The most common domestic birds living in Turkey are crows (Corvus corone), pigeons (Columba livia), sparrows (Passer domesticus), starlings (Sturnus vulgaris) and blackbirds (Turdus merula). These birds damage agricultural areas and foul human living areas. To drive them away, various materials and methods are used, such as chemicals, treatments, colored lights, and flash and audible scarers. Many studies of chemical methods can be found in the literature; however, not enough work on audible bird scarers has been reported. Therefore, a solar-powered bird scarer was designed, manufactured and tested in this experimental investigation. First, to understand how sensitive these domestic birds are to the audible scarer, a series of preliminary studies was conducted. These studies showed that crows are the most resistant to the audible bird scarer compared with pigeons, sparrows, starlings and blackbirds, so the solar-powered audible bird scarer was tested on crows. The scarer was tested for about one month during April-May 2007. Eighteen well-known predator sounds (voices or calls) of domestic birds, from the falcons Falco eleonorae and Buteo lagopus, the eagle Aquila chrysaetos, Montagu's harrier (Circus pygargus) and the owl Glaucidium passerinum, were selected for testing the scarer. The results showed that the birds' reaction varied depending on the predator sound type, the camouflage of the scarer, the sound quality and volume, and the loudspeaker play and pause periods within one application. In addition, the sound of the falcon Buteo lagopus was the most effective on crows, and the scarer was sufficiently efficient.
Abstract: In this paper a new fast simplification method is
presented. The method realizes Karnaugh maps with a large number of
variables. To accelerate the operation of the proposed method, a new
approach for fast detection of groups of ones is presented. This
approach is implemented in the frequency domain: the search
operation relies on performing cross correlation in the frequency
domain rather than in the time domain. It is proved mathematically
and practically that the number of computation steps required by the
presented method is less than that needed by conventional cross
correlation. Simulation results using MATLAB confirm the theoretical
computations. Furthermore, a powerful solution for the realization
of complex functions is given. The simplified functions are
implemented using a new design for neural networks. Neural networks
are used because they are fault tolerant and can therefore recognize
signals even with noise or distortion, which is very useful for
logic functions used in data and computer communications. Moreover,
the implemented functions are realized with a minimum number of
components. This is done by using modular neural networks (MNNs)
that divide the input space into several homogeneous regions. This
approach is applied to implement the XOR function, 16 logic
functions at the one-bit level, and a 2-bit digital multiplier.
Compared to previous non-modular designs, a clear reduction in the
order of computations and in hardware requirements is achieved.
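The core speed trick above - replacing time-domain cross correlation with a frequency-domain product - rests on the correlation theorem: corr(a, b) = IFFT(conj(FFT(a)) * FFT(b)), which costs O(n log n) instead of O(n^2). A minimal NumPy sketch (illustrative only, not the paper's MATLAB code) comparing the two routes:

```python
import numpy as np

def xcorr_fft(a, b):
    """Circular cross-correlation via the frequency domain, O(n log n)."""
    n = len(a)
    A = np.fft.fft(a, n)
    B = np.fft.fft(b, n)
    # correlation theorem: corr(a, b) = IFFT(conj(FFT(a)) * FFT(b))
    return np.real(np.fft.ifft(np.conj(A) * B))

def xcorr_direct(a, b):
    """Direct circular cross-correlation, O(n^2), for comparison."""
    n = len(a)
    return np.array([sum(a[k] * b[(k + m) % n] for k in range(n))
                     for m in range(n)])
```

Both functions return the same values; only the operation count differs, which is what makes frequency-domain search for groups of ones attractive on large maps.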
Abstract: This paper analyzes the linkage between migration,
economic globalization and terrorism concerns. On a broad level, I
analyze Canadian economic and political considerations, searching
for causal relationships between political and economic actors on
the one hand, and Canadian immigration law on the other.
Specifically, the paper argues that there are contradictory impulses
affecting state sovereignty. These impulses are currently being
played out in the field of Canadian immigration law through several
proposed changes to Canada's Immigration and Refugee Protection Act
(IRPA). These changes reflect an ideological conception of
sovereignty that is intrinsically connected with decision-making
capacity centered on an individual. This conception of sovereign
decision-making views Parliamentary debate and bureaucratic
inefficiencies as equally responsible for delaying essential
decisions relating to the protection of state sovereignty, economic
benefits and immigration control. This paper discusses these
concepts in relation to Canadian immigration policy under Canadian
governments over the past twenty-five years.
Abstract: Today, design requirements are extending more and more
from electronic (analogue and digital) design to multidisciplinary
design. These needs imply implementing methodologies that make the
CAD product reliable in order to improve time to market, study
costs, and the reusability and reliability of the design process.
This paper proposes a high-level design approach for the
characterization and optimization of switched-current Sigma-Delta
modulators. It uses the hardware description language VHDL-AMS to
help designers optimize the characteristics of the modulator at a
high level, with considerably reduced CPU time, before passing to
transistor-level characterization.
Abstract: Script identification is one of the challenging steps in the development of optical character recognition systems for bilingual or multilingual documents. In this paper an attempt is made to identify English numerals at the word level in Punjabi documents using Gabor features. A support vector machine (SVM) classifier with five-fold cross validation is used to classify the word images. The results obtained are quite encouraging: average accuracy with the RBF, polynomial and linear kernel functions comes out to be greater than 99%.
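The five-fold cross-validation protocol used above can be sketched in terms of its index bookkeeping: the word images are split into five folds, and each fold in turn serves as the test set while the rest train the SVM. The sketch below shows only the fold splitting (the Gabor/SVM pipeline itself is not reproduced); the function name and contiguous-fold assumption are illustrative.

```python
def kfold_indices(n, k=5):
    """Yield (train, test) index lists for k-fold cross validation (sketch)."""
    idx = list(range(n))
    fold = n // k
    for i in range(k):
        start = i * fold
        stop = start + fold if i < k - 1 else n  # last fold takes the remainder
        test = idx[start:stop]
        train = idx[:start] + idx[stop:]
        yield train, test
```

Averaging the per-fold accuracies then gives the reported figure for each kernel.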
Abstract: The system development life cycle (SDLC) is a process
used during the development of any system. The SDLC consists of four
main phases: analysis, design, implementation and testing. During
the analysis phase, a context diagram and data flow diagrams are
used to produce the process model of a system. Consistency between
the context diagram and the lower-level data flow diagrams is very
important in smoothing the development process of a system. However,
manually checking consistency from the context diagram to the
lower-level data flow diagrams using a checklist is a time-consuming
process. At the same time, the limits of human ability to validate
the errors are one of the factors that influence the correctness and
balancing of the diagrams. This paper presents a tool that automates
the consistency check between Data Flow Diagrams (DFDs) based on the
rules of DFDs. The tool serves two purposes: as an editor to draw
the diagrams and as a checker to check the correctness of the
diagrams drawn. The consistency check from the context diagram to
the lower-level data flow diagrams is embedded inside the tool to
overcome the manual checking problem.
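The balancing rule at the heart of such a checker can be sketched very simply: the external data flows declared on the context diagram must equal the external flows of its lower-level decomposition. This is an illustrative sketch only (flow names and the report format are assumptions, not the tool's actual rule set):

```python
def check_balance(context_flows, child_flows):
    """DFD balancing sketch: the external data flows of the context
    diagram must match the external flows of its decomposition."""
    context, child = set(context_flows), set(child_flows)
    return {
        "missing_in_child": context - child,  # declared at top level, absent below
        "extra_in_child": child - context,    # appears below, undeclared at top
        "consistent": context == child,
    }
```

A full tool would apply the same rule recursively at every decomposition level.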
Abstract: Image compression plays a vital role in today's
communication. Limited allocated bandwidth leads to slower
communication; to increase the rate of transmission within the
limited bandwidth, image data must be compressed before
transmission. There are basically two types of compression: 1) lossy
compression and 2) lossless compression. Although lossy compression
gives more compression than lossless compression, the accuracy of
retrieval is lower for lossy compression. The JPEG and JPEG2000
image compression systems use Huffman coding for image compression.
The JPEG 2000 coding system uses the wavelet transform, which
decomposes the image into different levels, where the coefficients
in each sub-band are uncorrelated with the coefficients of other
sub-bands. Embedded Zerotree Wavelet (EZW) coding exploits the
multi-resolution properties of the wavelet transform to give a
computationally simple algorithm with better performance than
existing wavelet-based coders. For further improvement of
compression applications, other coding methods have recently been
suggested; an ANN-based approach is one such method. Artificial
neural networks have been applied to many problems in image
processing and have demonstrated their superiority over classical
methods when dealing with noisy or incomplete data in image
compression applications. A performance analysis of different images
is proposed, with an analysis of the EZW coding system combined with
the error backpropagation algorithm. The implementation and analysis
show approximately 30% more accuracy in the retrieved image compared
to the existing EZW coding system.
Abstract: Most papers model the Joint Replenishment Problem
(JRP) as a (kT, S) policy, where kT is a multiple of a common review
period T, and S is a predefined order-up-to level. In general the
(T, S) policy is characterized by a long out-of-control period,
which requires a large amount of safety stock compared to the (R, Q)
policy. In this paper a probabilistic model is built in which the
item with the shortest order interval (T), call it item (i), is
modeled under an (R, Q) policy and its inventory is continuously
reviewed, while the remaining items (j) are periodically reviewed at
a definite time corresponding to item (i).
Abstract: This paper focuses on a critical component of situational awareness (SA): the neural control of the depth flight of an autonomous underwater vehicle (AUV). Constant-depth flight is a challenging but important task for AUVs in achieving a high level of autonomy under adverse conditions. Following the SA strategy, we propose multirate neural control of an AUV trajectory using a neural network model reference controller for a nontrivial mid/small-size AUV "r2D4" stochastic model. This control system has been demonstrated and evaluated by simulation of diving maneuvers using the Simulink software package. The simulation results show that the chosen AUV model is stable in the presence of high noise, and it can also be concluded that fast SA of similar AUV systems, with economy in battery energy, can be achieved during underwater search-and-rescue missions.
Abstract: In recent years multi-agent systems have emerged as one of the interesting architectures facilitating distributed collaboration and distributed problem solving. Each node (agent) of the network might pursue its own agenda, exploit its environment, develop its own problem-solving strategy and establish the required communication strategies. Within each node of the network one could encounter a diversity of problem-solving approaches. Quite commonly, agents realize their processing at the level of information granules that is most suitable from their local point of view. Information granules can come at various levels of granularity. Each agent could exploit a certain formalism of information granulation, engaging the machinery of fuzzy sets, interval analysis or rough sets, to name a few dominant technologies of granular computing. With this in mind, a fundamental issue arises of forming effective interaction linkages between the agents so that they fully broadcast their findings and benefit from interacting with others.
Abstract: A Bloom filter is a probabilistic, memory-efficient
data structure designed to answer rapidly whether an element is
present in a set. It can report that an element is definitely not in
the set, whereas presence is reported only with a certain
probability. The trade-off in using a Bloom filter is a configurable
risk of false positives; the odds of a false positive can be made
very low if the number of hash functions is sufficiently large. For
spam detection, a weight is attached to each set of elements: the
spam weight of a word is a measure used to rate the e-mail, and each
word is assigned to a Bloom filter based on its weight. The proposed
work introduces an enhanced Bloom filter concept called the Bin
Bloom Filter (BBF). The performance of the BBF over the conventional
Bloom filter is evaluated under various optimization techniques. A
real data set and synthetic data sets are used for the experimental
analysis, and results are demonstrated for bin sizes 4, 5, 6 and 7.
Analysis of the results shows that the BBF using heuristic
techniques performs better than the traditional Bloom filter in spam
detection.
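The conventional Bloom filter that the BBF builds on can be sketched as follows: k hash functions set k bits in an m-bit array per inserted element, and a lookup reports "possibly present" only if all k bits are set. The false-positive probability is approximately (1 - e^(-kn/m))^k for n inserted elements. This is a generic illustrative implementation (the salted-SHA-256 hashing and the parameter defaults are assumptions, not the paper's construction):

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter sketch: k hash functions over an m-bit array."""
    def __init__(self, m=1024, k=4):
        self.m, self.k = m, k
        self.bits = [False] * m

    def _positions(self, item):
        # derive k independent positions by salting one cryptographic hash
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = True

    def __contains__(self, item):
        # False => definitely absent; True => present with some probability
        return all(self.bits[p] for p in self._positions(item))
```

The BBF idea layers weights on top of this: each word's spam weight selects which bin's filter it is inserted into.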
Abstract: In this paper we discuss the influence of the route
flexibility degree, the open rate of operations and the production
type coefficient on makespan. The flexible job-open shop scheduling
problem FJOSP (an extension of classical job shop scheduling) is
analyzed. For the analysis of the production process we use a hybrid
heuristic combining GRASP (greedy randomized adaptive search
procedure) with a simulated annealing algorithm. Experiments with
different levels of the factors have been conducted and compared.
The GRASP+SA algorithm has been tested and is illustrated with
results for the serial route and the parallel one.
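The simulated annealing half of the GRASP+SA hybrid hinges on the Metropolis acceptance rule: a move that shortens the makespan is always accepted, while a worsening move is accepted with probability exp(-delta/T), which shrinks as the temperature T cools. A minimal sketch of that rule (the function name and the injectable rng parameter are illustrative, not the paper's implementation):

```python
import math
import random

def sa_accept(delta, temperature, rng=random.random):
    """Metropolis acceptance rule used inside simulated annealing.
    delta = candidate makespan minus current makespan."""
    if delta <= 0:
        return True                     # improving move: always accept
    # worsening move: accept with probability exp(-delta / T)
    return rng() < math.exp(-delta / temperature)
```

GRASP supplies the randomized greedy starting schedules; this rule then governs the local search walk from each of them.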
Abstract: This work presents the Risk Threshold RED (RTRED)
congestion control strategy for TCP networks. In addition to the
maximum and minimum thresholds in existing RED-based strategies, we
add a third dropping level. This new dropping level is the risk
threshold, which works with the actual and average queue sizes to
detect immediate congestion in gateways. RTRED reacts to congestion
on time: the reaction is neither too early, which avoids unfair
packet losses, nor too late, which avoids packet drops caused by
time-outs. We compared our novel strategy with the RED and ARED
strategies for TCP congestion handling using an NS-2 simulation
script and found that RTRED outperformed both.
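The three-level dropping idea can be illustrated as follows. The abstract does not give RTRED's exact rule, so this is a hedged sketch of how a risk threshold on the instantaneous queue could supplement the classic average-queue RED ramp; the doubling of the drop probability past the risk threshold and all parameter names are assumptions for illustration only.

```python
def drop_probability(avg_q, actual_q, min_th, risk_th, max_th, max_p=0.1):
    """Sketch of a three-threshold RED-style dropping rule (not the
    published RTRED algorithm; illustrative assumptions throughout).
    Requires min_th < max_th."""
    if avg_q < min_th:
        return 0.0                      # no congestion signal
    if avg_q >= max_th:
        return 1.0                      # hard congestion: drop everything
    # classic RED: probability ramps linearly between min_th and max_th
    p = max_p * (avg_q - min_th) / (max_th - min_th)
    # risk threshold: if the *instantaneous* queue already exceeds risk_th,
    # treat congestion as immediate and drop more aggressively
    if actual_q >= risk_th:
        p = min(1.0, 2.0 * p)
    return p
```

The point of the extra check is timing: the average queue reacts slowly, so the instantaneous queue crossing the risk threshold flags congestion before time-outs would.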
Abstract: It has been said that "the network is the system".
This implies providing levels of service, reliability,
predictability and availability that are commensurate with, or
better than, those that individual computers provide today.
Achieving this requires integrated network management for
interconnected networks of heterogeneous devices covering the local
campus and beyond. In this paper we present a framework to deal
effectively with this issue. It consists of the components, and the
interactions between them, that are required to perform service
fault management. A real-world scenario is used to derive the
requirements, which have been applied to the component
identification. An analysis of existing frameworks and approaches
with respect to their applicability to the framework is also carried
out.
Abstract: This paper studied the factors related to the working
behavior of employees at Pakkred Municipality, Nonthaburi Province.
A questionnaire was utilized as the tool for collecting information.
Descriptive statistics included frequency, percentage, mean and
standard deviation; independent-sample t-tests, analysis of variance
and Pearson correlation were also used. The findings revealed that
the majority of the respondents were female, between 25-35 years
old, married, and held a Bachelor's degree. The average monthly
salary of respondents was between 8,001-12,000 Baht, with about 4-7
years of working experience. Regarding the overall working
motivation factors, the findings showed that interrelationship,
respect, and acceptance were ranked as highly important factors,
whereas motivation, remuneration and welfare, career growth, and
working conditions were ranked as moderately important factors.
Overall working behavior was also ranked as high. The hypothesis
testing revealed that different genders had different working
behavior and different ways of working as a team, significant at the
0.05 level. Moreover, employees with different monthly salaries
differed in working behavior, problem solving and decision making,
all significant at the 0.05 level. Employees with different years of
working experience were found to differ in working behavior, both
individually and as a team, at the 0.01 and 0.05 significance
levels. Testing the relationship between motivation and overall
working behavior revealed that interrelationship, respect and
acceptance from others, career growth, and working conditions were
related to working behavior at a moderate level, while motivation in
performing duties and remuneration and welfare were related to
working behavior in the same direction at a low level, with a
statistical significance of 0.01.
Abstract: Artificial-intelligence-based gaming is an interesting topic in state-of-the-art technology. This paper presents an automation of a traditional Omani game called Al-Hawalees. Its related issues are resolved and implemented using an artificial intelligence approach: the minimax procedure is incorporated to generate diverse moves for on-line gaming. As the number of moves increases, the time complexity increases proportionally. To tackle the time and space complexities, we employ a backpropagation neural network (BPNN), trained off-line, to decide the resources required to fulfill the automation of the game. We utilize Levenberg-Marquardt training in order to obtain a rapid response during gaming. A set of optimal moves is determined by the on-line backpropagation training combined with alpha-beta pruning. The results and analyses reveal that the proposed scheme can easily be incorporated in an on-line scenario with one player against the system.
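The minimax-with-alpha-beta-pruning core used above is a standard game-tree search and can be sketched generically. The game rules are abstracted behind `moves` and `evaluate` callbacks (both hypothetical names); the actual Al-Hawalees move generator and evaluation function are not reproduced here.

```python
def alphabeta(state, depth, alpha, beta, maximizing, moves, evaluate):
    """Generic minimax with alpha-beta pruning (illustrative sketch).
    moves(state) -> list of successor states; evaluate(state) -> score."""
    children = moves(state)
    if depth == 0 or not children:
        return evaluate(state)          # leaf or depth limit: static score
    if maximizing:
        best = float("-inf")
        for child in children:
            best = max(best, alphabeta(child, depth - 1, alpha, beta,
                                       False, moves, evaluate))
            alpha = max(alpha, best)
            if alpha >= beta:
                break                   # beta cut-off: opponent avoids this branch
        return best
    else:
        best = float("inf")
        for child in children:
            best = min(best, alphabeta(child, depth - 1, alpha, beta,
                                       True, moves, evaluate))
            beta = min(beta, best)
            if alpha >= beta:
                break                   # alpha cut-off
        return best
```

Pruning cuts off branches that cannot change the final decision, which is what keeps the on-line response time manageable as the move count grows.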
Abstract: At present it is very common to find renewable energy
resources, especially wind power, connected to distribution systems.
The impact of this wind power on distribution voltage levels has
been addressed in the literature. The majority of these works deal
with determining the maximum active and reactive power that can be
connected at a system load bus before the voltage at that bus
reaches the voltage collapse point, using the traditional PV-curve
methods reported in many references. A theoretical expression for
the maximum power transfer through a grid, as limited by voltage
stability, is formulated using an exact representation of the
distribution line with ABCD parameters. The expression is used to
plot PV curves at various power factors of a radial system, and
limiting values of reactive power can be obtained. This paper
presents a method to study the relationship between the active power
and the voltage (PV) at the load bus in order to identify the
voltage stability limit. It is a foundation for building a permitted
operating region that complies with the voltage stability limit at
the point of common coupling (PCC) to which a wind farm is
connected.
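To illustrate what a PV curve computes, here is a sketch for the simplest case: a two-bus system with a series R + jX line rather than the paper's exact ABCD representation (the per-unit values of E, R and X are assumed for illustration). The receiving-end voltage follows the standard quadratic in V^2; past the nose point the discriminant goes negative and no voltage solution exists, which marks the voltage stability limit.

```python
import math

def load_voltage(P, pf, E=1.0, R=0.05, X=0.25):
    """Two-bus PV-curve sketch (hypothetical per-unit line R + jX, source E).
    Solves V^4 + (2(PR + QX) - E^2) V^2 + (P^2 + Q^2)(R^2 + X^2) = 0
    for the upper (stable) root; returns None past the nose point."""
    Q = P * math.tan(math.acos(pf))     # reactive load at the given power factor
    b = 2.0 * (P * R + Q * X) - E * E
    c = (P * P + Q * Q) * (R * R + X * X)
    disc = b * b - 4.0 * c
    if disc < 0:
        return None                     # beyond maximum power transfer
    v2 = (-b + math.sqrt(disc)) / 2.0   # upper solution of the quadratic in V^2
    return math.sqrt(v2)
```

Sweeping P upward at a fixed power factor and plotting the returned voltage traces the PV curve; the largest P with a real solution is the stability limit at that power factor.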
Abstract: Oxidative stress and the overwhelming free radicals
associated with diabetes mellitus are likely linked with the
development of complications such as retinopathy, nephropathy and
neuropathy. Treating diabetic subjects with antioxidants may be
advantageous in attenuating these complications. Olive leaf (Olea
europaea) has been endowed with many beneficial and health-promoting
properties, mostly linked to its antioxidant activity. This study
aimed to evaluate the significance of supplementation with olive
leaf extract (OLE) in reducing oxidative stress, hyperglycemia and
hyperlipidemia in streptozotocin (STZ)-induced diabetic rats. After
induction of diabetes, a significant rise in plasma glucose, lipid
profiles except high-density lipoprotein cholesterol (HDLc), and
malondialdehyde (MDA), and a significant decrease in plasma insulin,
HDLc and plasma reduced glutathione (GSH), as well as alterations in
enzymatic antioxidants, were observed in all diabetic animals.
During treatment of diabetic rats with 0.5 g/kg body weight of OLE,
the levels of plasma MDA, GSH, insulin and lipid profiles, along
with blood glucose and erythrocyte antioxidant enzymes, were
significantly restored to values not different from those of normal
control rats. Untreated diabetic rats, on the other hand,
demonstrated persistent alterations in the oxidative stress marker
(MDA), blood glucose, insulin, lipid profiles and the antioxidant
parameters. These results demonstrate that OLE may be advantageous
in inhibiting the hyperglycemia, hyperlipidemia and oxidative stress
induced by diabetes, and suggest that administration of OLE may help
prevent, or at least reduce, the diabetic complications associated
with oxidative stress.