Abstract: Electronic government is one of the notable concepts to
have been implemented successfully in recent decades. It denotes a
digital, wall-free government with a virtual organization that
delivers governmental services online and supports broader
cooperation in political and social activities. Successfully
implementing an electronic government strategy, and realizing its
full potential and benefits, requires a set of underlying
infrastructures without which its services cannot be delivered. To
this end, this paper identifies the obstacles to establishing
electronic government in Iran. The data needed to identify these
obstacles were collected by questionnaire from a statistical
population of specialists at the Ministry of Communications &
Information Technology of Iran and the Information Technology
Organization of Tehran Municipality. Using a five-point Likert scale
with μ = 3 as the index for the factors of the proposed model, we
identified the current obstacles to electronic government in Iran,
along with guidelines and proposals for addressing them. According
to the results, the obstacles to implementing electronic government
in Iran fall into four groups: technical and technological problems;
legal, judicial and safety problems; economic problems; and human
problems.
Abstract: The new concept of a two-dimensional (2D) image
processing implementation for an auto-guiding system is presented in
this paper. It is dedicated to astrophotography and operates with
astronomy CCD guide cameras or with self-guided dual-detector
CCD cameras and ST4-compatible equatorial mounts. The idea was
verified with a MATLAB model, which was used to test all procedures
and data conversions. Next, the circuit prototype was implemented on
an Altera MAX II CPLD device and tested on images of real
astronomical objects. The digital processing speed of the CPLD
prototype board was sufficient for correct real-time guiding of the
equatorial mount.
Abstract: A basic conceptual study of the TCSC device in Simulink serves as a teaching aid and helps in understanding the rudiments of the topic. This paper therefore starts from the basics of the TCSC device and analyzes its impedance characteristics and the associated single- and multi-resonance conditions. The impedance characteristic curve is drawn for different values of inductance in MATLAB using M-files. The study is also helpful in estimating the appropriate inductance and capacitance values, which influence the multi-resonance points of the TCSC device. The capacitor voltage, line current, thyristor current and capacitor current waveforms are discussed briefly as simulation results. A Simulink model of the TCSC device is given and the corresponding waveforms are analyzed. Subsidiary topics, e.g. power oscillation damping, SSR mitigation and transient stability, are also brought out.
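The impedance characteristic discussed above can be sketched with the
widely used simplified steady-state TCSC model: a thyristor-controlled
reactor (TCR) in parallel with a fixed capacitor, with resonance where
the variable TCR reactance equals the capacitive reactance. This is an
assumed textbook formulation, not necessarily the paper's exact
derivation, and the per-unit values are hypothetical.

```python
import math

# Simplified steady-state TCSC model: TCR in parallel with a fixed
# capacitor.  alpha is the firing angle in radians, measured from the
# voltage zero crossing (pi/2 <= alpha <= pi).

def x_tcr(alpha, x_l):
    """Effective TCR reactance for firing angle alpha."""
    sigma = 2.0 * (math.pi - alpha)        # conduction angle
    return math.pi * x_l / (sigma - math.sin(sigma))

def x_tcsc(alpha, x_l, x_c):
    """Net TCSC reactance: capacitor (-x_c) in parallel with the TCR;
    the expression blows up at the resonance point x_tcr == x_c."""
    xl_eff = x_tcr(alpha, x_l)
    return x_c * xl_eff / (xl_eff - x_c)

# Sweep the firing angle for one inductance value; the sign change of
# the net reactance marks the crossing between the inductive and
# capacitive operating regions around the resonance point.
x_l, x_c = 0.05, 1.0   # hypothetical per-unit values
for deg in (100, 130, 160, 175):
    a = math.radians(deg)
    print(deg, round(x_tcsc(a, x_l, x_c), 3))
```

Repeating the sweep for several inductance values reproduces the family
of impedance characteristic curves, and the number of sign changes
indicates single or multiple resonance conditions.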
Abstract: This paper explores the C-shaped transition curve. The
curve is designed using the concept of circle to circle, where one
circle lies inside the other. The degree of smoothness employed is
curvature continuity. The function used in designing the C-curve is
a Bézier-like cubic function. This function has a low degree, is
flexible for the interactive design of curves and surfaces, and has
a shape parameter, which is used to control the shape of the
C-curve. Once the C-shaped curve design is completed, the curve is
applied to the design of a spur gear tooth. After the tooth design
procedure is finished, the design is analyzed using Finite Element
Analysis (FEA) to determine the applicability of the tooth design
and of the chosen gear material. In this research, cast iron with
4.5 % carbon, ASTM A-48, is selected as the gear material.
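The role of the shape parameter described above can be illustrated with
a cubic curve whose inner basis functions are weighted by a parameter.
This is a generic rational sketch, not the paper's exact Bézier-like
basis; the control points are hypothetical.

```python
# Illustrative sketch only: a cubic Bezier-style curve whose inner
# basis terms are scaled by a shape parameter, mimicking how such a
# parameter lets a designer tighten or relax a transition curve.

def bezier_like(t, p0, p1, p2, p3, shape=1.0):
    """Evaluate at t in [0, 1]; 'shape' pulls the curve toward
    (shape > 1) or away from (shape < 1) the inner points p1, p2."""
    b0 = (1 - t) ** 3
    b1 = 3 * (1 - t) ** 2 * t * shape
    b2 = 3 * (1 - t) * t ** 2 * shape
    b3 = t ** 3
    # renormalise so the weighted basis still sums to one
    s = b0 + b1 + b2 + b3
    return tuple((b0 * a + b1 * b + b2 * c + b3 * d) / s
                 for a, b, c, d in zip(p0, p1, p2, p3))

# Hypothetical endpoints and handles for a C-shaped segment
p = bezier_like(0.5, (0, 0), (1, 2), (3, 2), (4, 0), shape=1.5)
print(p)
```

Whatever the shape parameter, the curve still interpolates its two
endpoints, which is what allows the parameter to be tuned freely while
the circle-to-circle boundary conditions stay fixed.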
Abstract: Palestinian cities face the challenges of land scarcity,
high population growth rates, rapid urbanization, uneven
development and territorial fragmentation. Due to geopolitical
constraints and the absence of an effective Palestinian planning
institution, urban development in Palestinian cities has not followed
any discernable planning scheme. This has led to a number of
internal contradictions in the structure of cities, and adversely
affected land use, the provision of urban services, and the quality of
the living environment.
This paper explores these challenges, and the potential that exists
for introducing a more sustainable urban development pattern in
Palestinian cities. It assesses alternative development approaches
with a particular focus on sustainable development, promoting
eco-development imperatives, limiting random urbanization, and meeting
present and future challenges, including fulfilling the needs of the
people and conserving the scarce land and limited natural resources.
This paper concludes by offering conceptual proposals and guidelines
for promoting sustainable physical development in Palestinian cities.
Abstract: Aloe vera is used worldwide in the pharmaceutical,
food, and cosmetic industries due to the plethora of biological
activities of some of its metabolites. The aim of this study
was to evaluate the antifungal and antioxidant activities of the leaf
extract. The antifungal activity was determined by the agar-well
diffusion method against plant and human fungal pathogens. The
methanol and ethanol portions of the extracts were more bioactive
than the ethyl acetate portion. The activity was also more
pronounced against plant pathogens than human pathogens, with the
exception of Candida albicans, indicating that the extract has the
potential to treat plant fungal infections. The Aloe extract showed
significant antioxidant activity in the DPPH radical scavenging
assay and may therefore serve as a natural antioxidant in health
foods for medicinal and preservative purposes.
Abstract: The p53 tumor suppressor gene plays two important
roles in genomic stability: blocking cell proliferation after DNA
damage until it has been repaired, and starting apoptosis if the
damage is too critical. Codon 72 exon4 polymorphism (Arg72Pro) of
the P53 gene has been implicated in cancer risk. Various studies have
been done to investigate the status of p53 at codon 72 for arginine
(Arg) and proline (Pro) alleles in different populations and also the
association of this codon 72 polymorphism with various tumors. Our
objective was to investigate the possible association between P53
Arg72Pro polymorphism and susceptibility to colorectal cancer
among the populations of Isfahan and Chaharmahal Va Bakhtiari (in
the south-west of Iran). We investigated the status of p53 at codon 72 for
Arg/Arg, Arg/Pro and Pro/Pro allele polymorphisms in blood
samples from 145 colorectal cancer patients and 140 controls by
nested PCR of p53 exon 4 followed by digestion with the BstUI
restriction enzyme; the DNA fragments were then resolved by
electrophoresis in a 2% agarose gel. The Pro allele gave a single
279 bp fragment, while the Arg allele was cut into two fragments of
160 and 119 bp.
Among the 145 colorectal cancer cases, 49 (33.79%) were
homozygous for the Arg72 allele (Arg/Arg), 18 (12.41%) were
homozygous for the Pro72 allele (Pro/Pro) and 78 (53.79%) were
heterozygous (Arg/Pro).
In conclusion, the p53 Arg/Arg genotype may be correlated with an
increased risk of colorectal cancer in the south-west of Iran.
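The genotype percentages and the implied allele counts follow directly
from the case counts reported above; this short arithmetic check uses
only those numbers.

```python
# Recomputing the reported genotype percentages from the case counts
# in the abstract (145 colorectal cancer patients).
cases = {"Arg/Arg": 49, "Pro/Pro": 18, "Arg/Pro": 78}
total = sum(cases.values())          # 145
percent = {g: round(100 * n / total, 2) for g, n in cases.items()}
print(total, percent)

# Allele counts follow directly: each homozygote carries two copies
# of its allele, each heterozygote one of each.
arg_alleles = 2 * cases["Arg/Arg"] + cases["Arg/Pro"]   # 176
pro_alleles = 2 * cases["Pro/Pro"] + cases["Arg/Pro"]   # 114
print(arg_alleles / (2 * total), pro_alleles / (2 * total))
```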
Abstract: In the recent past Learning Classifier Systems have
been successfully used for data mining. Learning Classifier System
(LCS) is basically a machine learning technique which combines
evolutionary computing, reinforcement learning, supervised or
unsupervised learning and heuristics to produce adaptive systems. An
LCS learns by interacting with an environment from which it
receives feedback in the form of numerical reward. Learning is
achieved by trying to maximize the amount of reward received. All
LCS models, more or less, comprise four main components: a finite
population of condition-action rules, called classifiers; the
performance component, which governs the interaction with the
environment; the credit assignment component, which distributes the
reward received from the environment to the classifiers accountable
for the rewards obtained; and the discovery component, which is
responsible for discovering better rules and improving existing ones
through a genetic algorithm. The concatenation of the production rules
in the LCS form the genotype, and therefore the GA should operate
on a population of classifier systems. This approach is known as the
'Pittsburgh' classifier systems. Other LCSs, which perform their GA at
the rule level within a population, are known as 'Michigan' classifier
systems. The most predominant representation of the discovered
knowledge is the standard production rules (PRs) in the form of IF P
THEN D. The PRs, however, are unable to handle exceptions and do
not exhibit variable precision. Censored Production Rules (CPRs),
an extension of PRs proposed by Michalski and Winston, exhibit
variable precision and support an efficient mechanism for handling
exceptions. A CPR is an augmented
production rule of the form: IF P THEN D UNLESS C, where
Censor C is an exception to the rule. Such rules are employed in
situations, in which conditional statement IF P THEN D holds
frequently and the assertion C holds rarely. Using a rule of this
type, we are free to ignore the exception condition when the
resources needed to establish its presence are scarce or there is
simply no information available as to whether it holds. Thus, the IF P
THEN D part of CPR expresses important information, while the
UNLESS C part acts only as a switch and changes the polarity of D
to ~D. In this paper, a Pittsburgh-style LCS approach is used for
the automated discovery of CPRs. An appropriate encoding scheme is
suggested to represent a chromosome consisting of a fixed-size set of
CPRs. Suitable genetic operators are designed for the set of CPRs
and for individual CPRs, and an appropriate fitness function is
proposed that incorporates the basic constraints on CPRs. Experimental results are
presented to demonstrate the performance of the proposed learning
classifier system.
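The "IF P THEN D UNLESS C" semantics described above can be sketched
in a few lines. The predicate names here are illustrative, not the
paper's encoding; the point is how the censor flips the polarity of D
only when it is known to hold.

```python
# Minimal sketch of a censored production rule (CPR): IF P THEN D
# UNLESS C.  When the censor cannot be checked (resources tight or no
# information available), the rule still answers D; when the censor is
# known to hold, the polarity of D flips to ~D.

def cpr(premise, censor, decision):
    """premise: bool; censor: True / False / None (None = unexamined)."""
    if not premise:
        return None                # rule does not fire at all
    if censor is True:
        return ("not", decision)   # UNLESS part flips D to ~D
    return decision                # censor False or unexamined: answer D

# IF bird THEN flies UNLESS penguin
print(cpr(True, None, "flies"))    # censor unexamined -> "flies"
print(cpr(True, True, "flies"))    # censor holds -> ("not", "flies")
print(cpr(False, None, "flies"))   # premise fails -> None
```

Treating the unexamined censor exactly like a false one is what gives
CPRs their variable precision: the cheap IF/THEN part is always usable,
and the censor only costs resources when it is actually checked.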
Abstract: Human identification at a distance has recently gained
growing interest from computer vision researchers. Gait recognition
aims essentially to address this problem by identifying people based
on the way they walk [1]. Gait recognition has three steps:
preprocessing, feature extraction and classification. This paper
focuses on the classification step, which is essential to increasing
the CCR (Correct Classification Rate). A Multilayer Perceptron (MLP)
is used in this work. Neural networks imitate the human brain to
perform intelligent tasks [3]. They can represent complicated
relationships between input and output and acquire knowledge about
these relationships directly from the data [2]. In this paper we
apply an MLP neural network to the 11 views in our database and
compare the CCR values for these views. Experiments are performed
with the NLPR databases, and the effectiveness of the proposed
method for gait recognition is demonstrated.
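The MLP classification step above amounts to a feed-forward pass
mapping extracted gait features to per-class scores. This is a toy
sketch: the layer sizes, weights and feature values are invented
purely to illustrate the input-to-hidden-to-output mapping, not
trained on any gait data.

```python
import math

# Toy forward pass of a multilayer perceptron (MLP).

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mlp_forward(features, w_hidden, w_output):
    """Map a feature vector through one hidden layer to class scores."""
    hidden = [sigmoid(sum(w * f for w, f in zip(row, features)))
              for row in w_hidden]
    output = [sigmoid(sum(w * h for w, h in zip(row, hidden)))
              for row in w_output]
    return output                      # one score per subject class

gait_features = [0.4, 0.7, 0.1]        # hypothetical extracted features
w_hidden = [[0.2, -0.5, 0.8], [0.7, 0.3, -0.1]]
w_output = [[0.6, -0.4], [-0.3, 0.9]]
scores = mlp_forward(gait_features, w_hidden, w_output)
predicted = scores.index(max(scores))  # class with the highest score
print(scores, predicted)
```

In a real system the weights would be trained per view on the NLPR
features, and the CCR is simply the fraction of test sequences whose
`predicted` class matches the true subject.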
Abstract: 17α-ethynylestradiol (EE2) is a synthetic estrogen
used as a key ingredient in oral contraceptive pills. EE2 is an
endocrine-disrupting compound of high estrogenic potency.
Although EE2 exhibits a low degree of biodegradability with the common
microorganisms in wastewater treatment plants (WWTPs), this
compound can be biotransformed by ammonia-oxidizing bacteria
(AOB) via a co-metabolism mechanism in WWTPs. This study
aimed to investigate the effect of real wastewater on
biotransformation of EE2 by AOB. A preliminary experiment on the
effect of nitrite and pH levels on abiotic transformation of EE2
suggested that the abiotic transformation occurred at only pH
Abstract: Data objects are usually organized hierarchically, and
the relations between them are analyzed based on a corresponding
concept hierarchy. The relation between data objects, for example how
similar they are, is usually analyzed based on the conceptual distance
in the hierarchy. If one node is an ancestor of another, it is enough
to analyze how close they are by calculating the distance vertically.
However, if there is no such relation between two nodes, the vertical
distance cannot express their relation explicitly. This paper tries to fill
this gap by improving the analysis method for data objects based on
hierarchy. The contributions of this paper include: (1) proposing an
improved method to evaluate the vertical distance between concepts;
(2) defining the concept of horizontal distance and a method to
calculate it; and (3) discussing methods to confine a
range by the horizontal distance and the vertical distance, and
evaluating the relation between concepts.
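The distinction drawn above can be sketched concretely: vertical
distance applies when one concept is an ancestor of the other, and a
horizontal measure is needed otherwise. This sketch uses an assumed
formulation (edge counts via the lowest common ancestor), not
necessarily the paper's exact measures, on a toy hierarchy.

```python
parent = {                       # toy concept hierarchy
    "animal": None,
    "mammal": "animal", "bird": "animal",
    "dog": "mammal", "cat": "mammal", "sparrow": "bird",
}

def ancestors(node):
    """Chain from the node up to the root, inclusive."""
    chain = [node]
    while parent[node] is not None:
        node = parent[node]
        chain.append(node)
    return chain

def vertical_distance(a, b):
    """Edge count when one node is an ancestor of the other, else None."""
    if b in ancestors(a):
        return ancestors(a).index(b)
    if a in ancestors(b):
        return ancestors(b).index(a)
    return None

def horizontal_distance(a, b):
    """Sum of edge counts from each node up to their lowest common
    ancestor (LCA); defined even when neither node is an ancestor."""
    chain_a = ancestors(a)
    for i, node in enumerate(ancestors(b)):
        if node in chain_a:
            return chain_a.index(node) + i
    return None

print(vertical_distance("dog", "animal"))     # 2
print(horizontal_distance("dog", "sparrow"))  # 4
```

"dog" and "sparrow" have no ancestor relation, so the vertical measure
alone says nothing about them; the LCA-based measure fills exactly the
gap the paper describes.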
Abstract: We aimed to investigate how pulmonary delivery
distribution can be targeted and optimized by changing the
physicochemical characteristics of the instilled liquid. To this
end, we created a new group of liquids that are:
a. eligible for the desired distribution within the lung, owing
to assorted physicochemical characteristics;
b. capable of being inertly augmented with a broad range of
chemicals;
c. free of interference with respiratory function; and
d. compatible with the airway surface liquid.
We developed forty new liquids composed of carboxymethylcellulose
sodium, glycerin and different types of polysorbates. Viscosity was
measured using a programmable rheometer and surface tension with a
KRUSS tensiometer. We subsequently examined the liquids and delivery
protocols in simple and branched glass capillary tube models of the
airways. Eventually, we explored the pulmonary distribution of
liquids augmented with technetium-99m in mechanically ventilated
rabbits, using a single-head, large-field-of-view gamma camera.
Kinematic viscosity between 0.265 and 0.289 Stokes, density between
1 and 1.5 g/cm3, and surface tension between 25 and 35 dyn/cm were
the most acceptable.
Abstract: Mining sequential patterns from large customer transaction databases has been recognized as a key research topic in database systems. However, previous work focused mainly on mining sequential patterns at a single concept level. In this study, we introduce concept hierarchies into the problem and present several algorithms for discovering multiple-level sequential patterns based on these hierarchies. An experiment was conducted to assess the performance of the proposed algorithms, measured as the relative time spent completing the mining tasks on two different datasets. The experimental results showed that performance depends on the characteristics of the datasets and the pre-defined threshold of minimal support for each level of the concept hierarchy. Based on the experimental results, some suggestions are also given on how to select the appropriate algorithm for a given dataset.
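The level-wise idea behind multiple-level sequential patterns can be
sketched briefly: each item is generalized through a concept
hierarchy, and support is counted per level against a per-level
minimum support. This is an illustration of the idea only, not the
paper's algorithms; the taxonomy and sequences are toy data.

```python
hierarchy = {   # item -> (level-1 concept, level-0 concept)
    "skim milk": ("milk", "dairy"),
    "2% milk": ("milk", "dairy"),
    "cheddar": ("cheese", "dairy"),
    "wheat bread": ("bread", "bakery"),
}

sequences = [                      # toy customer transaction sequences
    ["skim milk", "cheddar"],
    ["2% milk", "wheat bread"],
    ["2% milk", "cheddar"],
]

def support(concept, level):
    """Fraction of sequences containing an item that generalises to
    'concept' at the given hierarchy level (0 = most general)."""
    hit = sum(any(hierarchy[i][1 - level] == concept for i in seq)
              for seq in sequences)
    return hit / len(sequences)

print(support("dairy", 0))   # 1.0 -> frequent at the general level
print(support("milk", 1))    # 1.0
print(support("cheese", 1))  # ~0.67
```

A pattern can satisfy the minimum support at a general level while its
specializations fail at deeper levels, which is why each level of the
hierarchy carries its own support threshold.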
Abstract: In recent years, scanning probe atomic force
microscopy (SPM AFM) has gained acceptance over a wide spectrum
of research and science applications. Most work focuses on physical,
chemical and biological fields, while less attention is devoted to
manufacturing and machining aspects. The purpose of the current
study is to assess
the possible implementation of the SPM AFM features and its
NanoScope software in general machining applications with special
attention to the tribological aspects of cutting tool. The surface
morphology of coated and uncoated as-received carbide inserts is
examined, analyzed, and characterized through the determination of
the appropriate scanning setting, the suitable data type imaging
techniques and the most representative data analysis parameters
using the MultiMode SPM AFM in contact mode. The NanoScope
operating software is used to capture three data-type images in real
time: "Height", "Deflection" and "Friction". Three scan sizes are
independently performed: 2, 6, and 12 μm, with a 2.5 μm vertical
range (Z). Offline-mode analysis includes the determination of three
functional topographical parameters: surface "Roughness", power
spectral density "PSD" and "Section". The 12 μm scan size, in
association with "Height" imaging, is found efficient for capturing
every tiny feature and tribological aspect of the examined surface.
"Friction" analysis is also found to produce a comprehensive
explanation of the lateral characteristics of the scanned surface.
Many surface defects and drawbacks have been precisely detected and
analyzed.
Abstract: Fungal infections are becoming more common and the
range of susceptible individuals has expanded. While Candida
albicans remains the most common infective species, other Candida
spp. are becoming increasingly significant. In a range of large-scale
studies of candidaemia between 1999 and 2006, about 52% of 9717
cases involved C. albicans, about 30% involved either C. glabrata or
C. parapsilosis and less than 15% involved C. tropicalis, C. krusei or
C. guilliermondii. However, the probability of mortality within 30
days of infection with a particular species was at least 40% for C.
tropicalis, C. albicans, C. glabrata and C. krusei and only 22% for
C. parapsilosis. Clinical isolates of Candida spp. grew at rates
ranging from 1.65 h-1 to 4.9 h-1. Three species (C. krusei, C. albicans
and C. glabrata) had relatively high growth rates (μm > 4 h-1), C.
tropicalis and C. dubliniensis grew moderately quickly (≈ 3 h-1) and
C. parapsilosis and C. guilliermondii grew slowly (< 2 h-1). Based
on these data, the log of the odds of mortality within 30 days of
diagnosis was linearly related to μm. From this the underlying
probability of mortality is 0.13 (95% CI: 0.10-0.17) and it increases
by about 0.09 ± 0.02 for each unit increase in μm. Given that the
overall crude mortality is about 0.36, the growth of Candida spp.
approximately doubles the rate, consistent with the results of larger
case-matched studies of candidaemia.
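The reported relation between growth rate and mortality can be read as
a back-of-envelope rule: a baseline 30-day mortality probability of
0.13, rising by about 0.09 per unit (h-1) increase in μm. This linear
sketch approximates the abstract's fitted log-odds model rather than
reproducing the original regression.

```python
# Back-of-envelope reading of the reported relation (an approximation,
# not the original fitted model): baseline 30-day mortality 0.13 at
# mu_m = 0, rising by about 0.09 per unit increase in the maximum
# specific growth rate mu_m.
BASELINE = 0.13
SLOPE = 0.09

def mortality_probability(mu_m):
    """Approximate 30-day mortality probability for growth rate mu_m."""
    return min(1.0, BASELINE + SLOPE * mu_m)

# Fast growers (mu_m > 4 h-1) land above the ~0.40 mortality reported
# for C. krusei, C. albicans and C. glabrata; slow growers stay nearer
# the ~0.22 reported for C. parapsilosis.
for mu in (1.65, 3.0, 4.9):
    print(mu, round(mortality_probability(mu), 2))
```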
Abstract: Information sharing and exchange, rather than
information processing, is what characterizes information
technology in the 21st century. Ontologies, as shared common
understanding, gain increasing attention, as they appear as the
most promising solution to enable information sharing both at
a semantic level and in a machine-processable way. Domain
Ontology-based modeling has been exploited to provide
shareability and information exchange among diversified,
heterogeneous applications of enterprises.
Contextual ontologies are "an explicit specification of
contextual conceptualization". That is, an ontology is
characterized by concepts that have multiple representations
and they may exist in several contexts. Hence, contextual
ontologies are a set of concepts and relationships, which are
seen from different perspectives. Contextualization is to allow
for ontologies to be partitioned according to their contexts.
The need for contextual ontologies in enterprise modeling
has become crucial due to the nature of today's competitive
market. Enterprise information resources are distributed and
diversified, and need to be shared and communicated
locally through the intranet and globally through the internet.
This paper discusses the roles that ontologies play in
enterprise modeling, and how ontologies assist in building a
conceptual model in order to provide communicative and
interoperable information systems. The issue of enterprise
modeling based on contextual domain ontology is also
investigated, and a framework is proposed for an enterprise
model that consists of various applications.
Abstract: The effect of wheat flour extraction rates on flour
composition, farinographic characteristics and the quality of
sourdough naans was investigated. The results indicated that by
increasing the extraction rate, the amount of protein, fiber, fat and
ash increased, whereas the moisture content decreased. Farinographic
characteristics such as water absorption and dough development time
increased with an increase in flour extraction rate, but dough
stability and tolerance indices were reduced. Titratable acidity for
both the sourdough and the sourdough naans also increased with the
flour extraction rate. The study showed that the overall quality of
sourdough naans was affected by both the flour extraction rate and
the starter culture used. Sensory
analysis of sourdough naans revealed that desirable extraction rate
for sourdough naan was 76%.
Abstract: There have been different approaches to computing the
analytic instantaneous frequency, with a variety of background
reasoning, practical applicability and restrictions. This paper
presents an instantaneous frequency computation approach based on
the adaptive Fourier decomposition and α-counting. The adaptive
Fourier decomposition is a recently proposed signal decomposition
approach; the instantaneous frequency can be computed through the
so-called mono-components it produces. Due to its fast energy
convergence, the adaptive Fourier decomposition discards the highest
frequency of the signal, which in most situations represents noise.
A new instantaneous frequency definition for a large class of
so-called simple waves is also proposed in this paper. Simple waves
cover a wide range of signals for which the concept of instantaneous
frequency has a clear physical sense. The α-counting instantaneous
frequency can be used to compute the highest frequency of a signal.
Combining these two approaches, one can obtain the instantaneous
frequency of the whole signal. An experiment demonstrates the
computation procedure, with promising results.
Abstract: Several studies have shown the association between
ambient particulate matter (PM) and adverse health effects and
climate change, thus highlighting the need to limit the anthropogenic
sources of PM. PM exposure is commonly monitored as the mass
concentration of PM10 (particle aerodynamic diameter < 10μm) or
PM2.5 (particle aerodynamic diameter < 2.5μm), although increasing
toxicity with decreasing aerodynamic diameter has been reported due
to increased surface area and enhanced chemical reactivity with other
species. Additionally, the light-scattering properties of PM increase
with decreasing size. Hence, it is important to study the chemical
characterization of the finer fractions of particulate matter and to
identify their sources so that they can be controlled, to a large
extent, at the source before reaching the receptors.
Abstract: The group mutual exclusion (GME) problem is a
variant of the mutual exclusion problem. In the present paper a
token-based group mutual exclusion algorithm, capable of handling
transient faults, is proposed. The algorithm uses the concept of
dynamic request sets. A time out mechanism is used to detect the
token loss; also, a distributed scheme is used to regenerate the token.
The worst-case message complexity of the algorithm is n+1. The
maximum concurrency and forum-switch complexity of the
algorithm are n and min(n, m), respectively, where n is the number of
processes and m is the number of groups. The algorithm also satisfies
another desirable property called smooth admission. The scheme can
also be adapted to handle the extended group mutual exclusion
problem.