Abstract: Environmental pollution with pesticides and heavy
metals is a recognized problem that extends to the global scale and
tends to amplify. Despite all the progress in the environmental
field, particularly in highlighting the effects of pollutants on
health, linked environment-health studies remain insufficient, in
Romania as well as worldwide. We aim to describe the particular
situation in Romania regarding the uncontrolled use of pesticides,
and to identify and evaluate the zones of risk to health and the
environment in Romania, with the final goal of designing adequate
programs for the reduction and control of risk sources. An
exploratory study was conducted to determine the magnitude of the
pesticide-use problem in a population living in Saliste, a rural
setting in Transylvania, Romania. The significant stakeholders in
the Saliste region were interviewed, and a sample of the population
living in the Saliste area was selected to fill in a designed
questionnaire. All selected participants declared that they used
pesticides in their activities for more than one purpose. They
reported applying pesticides annually over periods of between 11 and
30 years, for 5 to 9 days per year on average, mainly on crops
situated at some distance from their houses; however, high-risk
behavior was identified, as the volunteers reported using pesticides
in backyard gardens near their homes, where children were playing.
The pesticide applicators did not have the necessary knowledge about
safety and exposure. The health data must be correlated with
exposure biomarkers in an attempt to identify possible health
effects of pesticide exposure. Future plans include educational
campaigns to raise the population's awareness of the dangers of the
uncontrolled use of pesticides.
Abstract: We propose a decoy-pulse protocol for a frequency-coded implementation of the B92 quantum key distribution protocol. A direct extension of the decoy-pulse method to the frequency-coding scheme results in a loss of security, as an eavesdropper can distinguish between signal and decoy pulses by measuring the carrier photon number without affecting other statistics. We overcome this problem by optimizing the ratio of the carrier photon numbers of the decoy and signal pulses to be as close to unity as possible. In our method the switching between signal and decoy pulses is achieved by changing the amplitude of the RF signal, as opposed to modulating the intensity of the optical signal, thus reducing system cost. We find an improvement of approximately a factor of 100 in the key generation rate using the decoy-state protocol. We also study the effect of source fluctuation on the key rate. Our simulation results show a key generation rate of 1.5×10^-4 per pulse for link lengths up to 70 km. Finally, we discuss the optimum value of the average photon number of the signal pulse for a given key rate while also optimizing the carrier ratio.
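As a rough illustration of how decoy-state bounds enter a key-rate estimate, the sketch below implements the standard vacuum-plus-weak-decoy lower bound on the secure key rate in the style of the usual BB84 decoy analysis; it does not reproduce the frequency-coded B92 analysis of the abstract, and all parameter values (dark-count yield `y0`, detector error `e_det`, error-correction inefficiency `f`) are illustrative assumptions.

```python
import math

def h2(x):
    # binary entropy function
    if x <= 0 or x >= 1:
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def decoy_key_rate(mu, nu, eta, y0=1e-5, e_det=0.01, f=1.22, q=0.5):
    """Lower-bound key rate from vacuum + weak-decoy bounds.
    mu, nu: mean photon numbers of signal and decoy pulses (mu > nu);
    eta: overall channel transmittance."""
    e0 = 0.5                                    # error rate of dark counts
    q_mu = y0 + 1 - math.exp(-eta * mu)         # signal gain
    q_nu = y0 + 1 - math.exp(-eta * nu)         # decoy gain
    E_mu = (e0 * y0 + e_det * (1 - math.exp(-eta * mu))) / q_mu
    E_nu = (e0 * y0 + e_det * (1 - math.exp(-eta * nu))) / q_nu
    # lower bound on the single-photon yield Y1
    y1 = (mu / (mu * nu - nu * nu)) * (
        q_nu * math.exp(nu)
        - q_mu * math.exp(mu) * nu * nu / (mu * mu)
        - (mu * mu - nu * nu) / (mu * mu) * y0)
    # upper bound on the single-photon error rate e1
    e1 = (E_nu * q_nu * math.exp(nu) - e0 * y0) / (nu * y1)
    q1 = y1 * mu * math.exp(-mu)                # single-photon gain
    return q * (-q_mu * f * h2(E_mu) + q1 * (1 - h2(e1)))
```

For a lossy link the bound stays positive and grows with transmittance, which is the qualitative behavior the abstract's simulations rely on.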
Abstract: Since Computed Tomography (CT) normally requires
hundreds of projections to reconstruct an image, patients are exposed
to considerable X-ray energy, which may cause side effects such as
cancer. Even when the variability of the particles in the object is
very low, CT requires many projections for a good-quality
reconstruction. In this paper, the low variability of the particles in
an object is exploited to obtain a good-quality reconstruction.
Although the reconstructed image and the original image have the same
projections, in general they need not be identical. If a priori
information about the image is known in addition to the projections,
it is possible to obtain a good-quality reconstructed image. This
paper shows through experimental results why conventional
algorithms fail to reconstruct from a few projections, and gives an
efficient polynomial-time algorithm that reconstructs a bi-level
image from its projections along rows and columns, a known subimage
of the unknown image, and smoothness constraints, by reducing the
reconstruction problem to an integral max-flow problem. The paper also
discusses necessary and sufficient conditions for uniqueness and
the extension of 2D bi-level image reconstruction to 3D bi-level image
reconstruction.
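The row/column part of the problem can be illustrated with a simple constructive (Gale-Ryser-style) algorithm for rebuilding a binary matrix from its row and column sums. This sketch omits the abstract's additional constraints (known subimage, smoothness) and its max-flow formulation; the function name is ours.

```python
def reconstruct(row_sums, col_sums):
    """Build a 0/1 matrix with the given row and column sums, or return
    None if the greedy construction fails. Rows are processed in
    non-increasing order of row sum; each row places its 1s in the
    columns with the largest remaining column sums (Gale-Ryser)."""
    n = len(col_sums)
    if sum(row_sums) != sum(col_sums):
        return None
    order = sorted(range(len(row_sums)), key=lambda i: -row_sums[i])
    remaining = list(col_sums)
    M = [[0] * n for _ in row_sums]
    for i in order:
        # columns with the most remaining demand first
        cols = sorted(range(n), key=lambda j: -remaining[j])[:row_sums[i]]
        if any(remaining[j] == 0 for j in cols):
            return None  # no binary matrix with these projections
        for j in cols:
            M[i][j] = 1
            remaining[j] -= 1
    return M if not any(remaining) else None
```

The same feasibility question is what the max-flow reduction answers: rows and columns become the two sides of a bipartite network with unit-capacity edges between them.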
Abstract: Tomato powder has good potential as a substitute for tomato paste and other tomato products. In order to protect the physicochemical properties and nutritional quality of tomato during the dehydration process, an investigation was carried out using different drying methods and pretreatments. A solar drier and a continuous conveyor (tunnel) drier were used for dehydration, while calcium chloride (CaCl2), potassium metabisulphite (KMS), calcium chloride with potassium metabisulphite (CaCl2 + KMS), and sodium chloride (NaCl) were selected for pretreatment. Lycopene content, dehydration ratio, rehydration ratio and non-enzymatic browning (NEB), in addition to moisture, sugar and titratable acidity, were studied. The results show that pretreatment with CaCl2 and NaCl increased water removal and moisture mobility in tomato slices during drying. When CaCl2 was used along with KMS, the lowest NEB was recorded compared to the other treatments, and the best results overall were obtained using the two chemicals in combination. Storage studies in LDPE and metallized polyester films showed fewer changes in the products packed in metallized polyester pouches; even after 6 months the lycopene content did not decrease by more than 20% compared to the control sample, providing an acceptable shelf life of 6 months. For most quality characteristics, tunnel-drier samples presented better values than solar-drier samples.
Abstract: A biophysically based multilayer continuum model of the facial soft tissue composite has been developed for simulating wrinkle formation. The deformed state of the soft tissue block was determined by solving large-deformation mechanics equations using the Galerkin finite element method. The proposed soft tissue model is composed of four layers with distinct mechanical properties: the stratum corneum, the epidermal-dermal layer (living epidermis and dermis), the subcutaneous tissue and the underlying muscle. All layers were treated as non-linear, isotropic Mooney-Rivlin materials. Contraction of muscle fibres was approximated using a steady-state relationship between the fibre extension ratio, the intracellular calcium concentration and the active stress in the fibre direction. Several variations of the model parameters (stiffness and thickness of the epidermal-dermal layer, thickness of the subcutaneous tissue layer) have been considered.
Abstract: A multi-fingered dexterous anthropomorphic hand is
being developed by the authors. The hand is aimed at the
replacement of human operators in hazardous environments and in
environments where zero tolerance for human error is observed. The
robotic hand will comprise five fingers (four fingers and one
thumb), each having four degrees of freedom (DOF), which can perform
flexion, extension, abduction, adduction and also circumduction.
Pneumatic muscles and springs will be used for actuation. The paper
presents the mechanical design of the robotic hand and also reviews
different mechanical designs that have been developed to date.
Abstract: A Censored Production Rule (CPR) is an extension of the
standard production rule that addresses the problems of reasoning
with incomplete information subject to resource constraints and of
reasoning efficiently with exceptions. A CPR has the form: IF A
(Condition) THEN B (Action) UNLESS C (Censor), where C is the
exception condition. Fuzzy CPRs are obtained by augmenting an
ordinary fuzzy production rule "If X is A then Y is B" with an
exception condition and are written in the form "If X is A then Y is B
Unless Z is C". Such rules are employed in situations in which the
fuzzy conditional statement "If X is A then Y is B" holds frequently
and the exception condition "Z is C" holds rarely. Thus the "If X is A
then Y is B" part of the fuzzy CPR expresses important information,
while the Unless part acts only as a switch that changes the polarity
of "Y is B" to "Y is not B" when the assertion "Z is C" holds. The
proposed approach is an attempt to discover fuzzy censored
production rules from a set of discovered fuzzy if-then rules in the
form:
A(X) → B(Y) || C(Z).
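A minimal sketch of how a fuzzy CPR of this form might be evaluated, assuming the censor flips the polarity of the conclusion when its membership degree exceeds a chosen threshold; the function name and the threshold value are illustrative, not from the paper.

```python
def fuzzy_cpr(mu_a, mu_c, threshold=0.5):
    """Evaluate "If X is A then Y is B Unless Z is C".
    mu_a: membership degree of "X is A" (firing strength of the main rule);
    mu_c: membership degree of the censor assertion "Z is C"."""
    strength = mu_a
    if mu_c >= threshold:           # censor holds (rarely): polarity flips
        return ("Y is not B", strength)
    return ("Y is B", strength)     # usual case: the main rule's conclusion
```

Note that the censor only switches the polarity of the conclusion; the firing strength still comes from the "If X is A" part, matching the switch-like role described above.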
Abstract: The technology of thin-film deposition is of interest in
many engineering fields, from electronics manufacturing to
corrosion-protective coating. A typical deposition process, like that
developed at the University of Eindhoven, considers the deposition of
a thin, amorphous film of C:H or of Si:H on the substrate, using the
Expanding Thermal arc Plasma technique. In this paper a computing
procedure is proposed to simulate the flow field in a deposition
chamber similar to that at the University of Eindhoven, and a
sensitivity analysis is carried out in terms of the precursor mass
flow rate, the electrical power supplied to the torch, and the
fluid-dynamic characteristics of the plasma jet, using different
nozzles. For this purpose a deposition chamber similar in shape,
dimensions and operating parameters to the above-mentioned chamber is
considered. Furthermore, a method is proposed for a very preliminary
evaluation of the film thickness distribution on the substrate. The
computing procedure relies on two codes working in tandem: the output
of the first code is the input to the second one. The first code
simulates the flow field in the torch, where argon is ionized
according to Saha's equation, and in the nozzle. The second code
simulates the flow field in the chamber; due to the high rarefaction
level, this is a (commercial) Direct Simulation Monte Carlo code. The
gas is a mixture of 21 chemical species, and 24 chemical reactions
involving argon plasma and acetylene are implemented in both codes.
The effects of the above-mentioned operating parameters are evaluated
and discussed by means of 2-D maps and profiles of some important
thermo-fluid-dynamic parameters, such as Mach number, velocity and
temperature. The intensity, position and extension of the shock wave
are evaluated, and the influence of the above-mentioned test
conditions on the film thickness and the uniformity of its
distribution is also evaluated.
Abstract: Recently the usefulness of Concept Abduction, a novel non-monotonic inference service for Description Logics (DLs), has been argued in the context of ontology-based applications such as semantic matchmaking and resource retrieval. Based on the tableau calculus, a method has been proposed to realize this reasoning task in ALN, a description logic that supports simple cardinality restrictions as well as other basic constructors. However, in many ontology-based systems the representation of the ontology requires expressive formalisms to capture domain-specific constraints, and this language is not sufficient. In order to increase the applicability of the abductive reasoning method in such contexts, this paper presents an extension of the tableau-based algorithm for dealing with concepts represented in ALCQ, the description logic that extends ALN with full concept negation and qualified number restrictions.
Abstract: Fuzzy logic can be used when knowledge is
incomplete or when ambiguity of data exists. The purpose of
this paper is to propose a proactive fuzzy set-based model for
reacting to the risk inherent in investment activities relative to
a complete view of portfolio management. Fuzzy rules are
given where, depending on the antecedents, the portfolio size
may be slightly or significantly decreased or increased. The
decision maker considers acceptable bounds on the proportions
of acceptable risk and return. The fuzzy controller model
allows learning to be achieved as 1) the firing strength of each
rule is measured, 2) the fuzzy output allows rules to be updated,
and 3) new actions are recommended as the system continues
to loop. An extension is given to the fuzzy controller that
evaluates potential financial loss before adjusting the
portfolio. An application is presented that illustrates the
algorithm and the extension developed in the paper.
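A minimal sketch of such a fuzzy controller, assuming triangular membership functions over normalized risk and return and weighted-average defuzzification; the membership shapes, rule consequents and names are illustrative assumptions, not the paper's actual rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def adjust_portfolio(risk, ret):
    """Recommend a change in portfolio size (in % of current size).
    risk and ret are assumed normalized to [0, 1]."""
    lo = lambda x: tri(x, -1.0, 0.0, 1.0)   # membership in "low"
    hi = lambda x: tri(x, 0.0, 1.0, 2.0)    # membership in "high"
    # rule base: (firing strength via min-AND, crisp consequent)
    rules = [
        (min(hi(risk), lo(ret)), -10.0),  # high risk, low return: decrease significantly
        (min(hi(risk), hi(ret)),  -2.0),  # high risk, high return: decrease slightly
        (min(lo(risk), hi(ret)),  10.0),  # low risk, high return: increase significantly
        (min(lo(risk), lo(ret)),   2.0),  # low risk, low return: increase slightly
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

Measuring each rule's firing strength, as in step 1) above, is exactly the `min(...)` computation per rule; the looping and rule-update steps would wrap this function.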
Abstract: Many corporations are seriously concerned about the
security of their networks, and their network supervisors are
therefore still reluctant to install WLANs. In this regard, the
IEEE 802.11i standard was developed to address the security problems,
even though mistrust of wireless LAN technology still exists. The
thought was that the best security solutions could be found in
open-standards-based technologies such as Virtual Private Networking
(VPN), which has been in use for a long time without major security
holes emerging over the past few years. This work addresses this
issue and presents a simulated wireless LAN based on the IEEE 802.11g
protocol, analyzing the impact of integrating Virtual Private Network
technology to secure the flow of traffic between the client and the
server within the LAN, using the OPNET WLAN utility. Two wireless
LAN scenarios have been introduced and simulated: a normal extension
to a wired network, and a VPN over an extension to a wired network.
The results of the two scenarios are compared and indicate the
performance impact, measured by response time and load, of running a
Virtual Private Network over a wireless LAN.
Abstract: We consider a Principal-Agent model in which the
Principal is a seller who does not know perfectly how much the
buyer (the Agent) is willing to pay for the good. The buyer's
preferences are hence his private information. The model corresponds
to the nonlinear pricing problem of Maskin and Riley. We assume
there are three types of Agents. The model is solved using
"informational rents" as variables. In the last section we present the
main characteristics of the optimal contracts under asymmetric
information and some possible extensions of the model.
Abstract: Texture classification is an important image processing
task with a broad application range. Many different techniques for
texture classification have been explored. Using sparse approximation
as a feature extraction method for texture classification is a
relatively new approach, and Skretting et al. recently presented the
Frame Texture Classification Method (FTCM), showing very good results
on classical texture images. As an extension of that work, the FTCM is
here tested on a real-world application: the detection of
abnormalities in mammograms. Some extensions to the original FTCM that
are useful in certain applications are implemented: two different
smoothing techniques and a vector augmentation technique. Both the
detection of microcalcifications (as a primary detection technique and
as the last stage of a detection scheme) and the detection of
soft-tissue lesions in mammograms are explored. All the results are
interesting, and especially the results of using the FTCM on regions
of interest as the last stage of a detection scheme for
microcalcifications are promising.
Abstract: To support user mobility in a wireless network, new mechanisms such as paging, location updating, routing, and handover are needed and are fundamental. Another important key feature is the mobile QoS offered by Wireless ATM (WATM). Several ATM network protocols must be updated to implement mobility management and to maintain the existing ATM QoS over wireless ATM networks. A survey of the various schemes and types of handover is provided. The handover procedure guarantees re-establishment of a terminal's connection when it moves between areas covered by different base stations; it serves to transfer the user's radio link without interrupting a connection. Failure to offer efficient solutions, however, will result in significant packet loss during handover, severe delays, and degradation of the QoS offered to applications. This paper reviews the requirements, characteristics and open issues of wireless ATM, particularly with regard to handover. It introduces key aspects of WATM and the mobility extensions that are added to the fixed ATM network. We propose a flexible approach to handover management that minimizes QoS deterioration. The functional entities of this flexible approach are discussed in order to achieve minimum impact on connection quality when a mobile terminal (MT) crosses between base stations (BSs).
Abstract: Load forecasting plays a paramount role in the operation and management of power systems. Accurate estimation of future power demand for various lead times facilitates the task of generating power reliably and economically. The forecasting of future load for a relatively large lead time (months to a few years) is studied here (long-term load forecasting). Among the various techniques used in load forecasting, artificial intelligence techniques provide greater accuracy than conventional techniques. Fuzzy logic, a very robust artificial intelligence technique, is applied in this paper to forecast load on a long-term basis. The paper gives a general algorithm to forecast long-term load. The algorithm is an extension of a short-term load forecasting method to long-term load forecasting, and it concentrates not only on the forecast values of load but also on the errors incorporated into the forecast. Hence, by correcting the errors in the forecast, forecasts with very high accuracy have been achieved. The algorithm is demonstrated with the help of data collected for the residential sector (LT2 (a) type load: domestic consumers). Load is determined for three consecutive years (from April 2006 to March 2009) in order to demonstrate the efficiency of the algorithm, and forecasts are made for the next two years (from April 2009 to March 2011).
Abstract: Organizations face challenges in supporting knowledge
workers due to their particular requirements for an environment
supportive of self-guided learning activities, which are important
for increasing their productivity and for developing creative
solutions to non-routine problems. Face-to-face knowledge sharing
remains crucial in spite of the large number of knowledge management
instruments that aim at supporting a more impersonal transfer of
knowledge. This paper first describes the main criteria for a
conceptual and technical solution targeted at the flexible management
of office space, one that aims at assigning to the same room those
knowledge workers who are most likely to thrive when brought
together, thus enhancing their knowledge-work productivity. The paper
reflects on lessons learned from the implementation and operation of
such a solution in a project-focused organization and derives several
implications for future extensions that aim to foster problem
solving, informal learning and personal development.
Abstract: The two-dimensional gel electrophoresis method
(2-DE) is widely used in proteomics to separate thousands of proteins
in a sample. By comparing the expression levels of proteins in
a normal sample with those in a diseased one, it is possible to
identify a meaningful set of marker proteins for the targeted disease.
The major shortcomings of this approach involve the inherent noise and
irregular geometric distortions of spots observed in 2-DE images,
largely caused by varying experimental conditions. In the protein
analysis of samples, these problems eventually lead to incorrect
conclusions. In order to minimize their influence, this paper proposes
a partition-based pair extension method that performs spot matching on
a set of gel images multiple times and segregates the more reliable
mapping results, which can improve the accuracy of gel image analysis.
The improved accuracy of the proposed method is demonstrated through
various experiments on real 2-DE images of human liver tissue.
Abstract: The power consumption of nodes in ad hoc networks is a
critical issue, as the nodes predominantly operate on batteries. In
order to improve the lifetime of an ad hoc network, all nodes must be
utilized evenly and the power required for connections must be
minimized. In this project a link-layer algorithm known as the Power
Aware Medium Access Control (PAMAC) protocol is proposed, which
enables the network layer to select a route with the minimum total
power requirement among the possible routes between a source and a
destination, provided all nodes in the route have battery capacity
above a threshold. When the battery capacity of a node goes below the
predefined threshold, routes through that node are avoided and the
node acts only as a source or destination. Further, the first few
nodes whose battery power drains to the threshold value are pushed to
the exterior of the network and nodes in the exterior are brought to
the interior, since less total power is then required to forward
packets for each connection. The network layer protocol AOMDV is
basically an extension of the AODV routing protocol. AOMDV is
designed to form multiple routes to the destination, and it also
avoids loop formation, thereby reducing unnecessary congestion on the
channel. In this project, the performance of AOMDV is evaluated using
PAMAC as the MAC-layer protocol; the average power consumption,
throughput and average end-to-end delay of the network are calculated
and the results are compared with those of the other network layer
protocol, AODV.
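The route-selection idea (minimum total power among routes whose intermediate nodes have battery capacity above a threshold) can be sketched as a Dijkstra search over link power costs. This is a plain illustration, not the PAMAC/AOMDV implementation, and all names are ours.

```python
import heapq

def min_power_route(graph, battery, src, dst, threshold=0.2):
    """graph: {node: {neighbor: link power cost}};
    battery: {node: level in [0, 1]}.
    Intermediate nodes below the battery threshold are skipped: as in
    the abstract, they may still act as source or destination."""
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    seen = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in seen:
            continue
        seen.add(u)
        if u == dst:
            break
        for v, w in graph.get(u, {}).items():
            # avoid drained nodes as relays
            if v != dst and battery.get(v, 0.0) < threshold:
                continue
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    if dst not in dist:
        return None, float("inf")
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return path[::-1], dist[dst]
```

Lowering the threshold admits more relays and can shorten the minimum-power route; raising it protects nearly drained nodes at the cost of more expensive paths.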
Abstract: Covering-based rough set theory is an extension of rough
set theory based on a covering instead of a partition of the
universe. It is therefore more powerful than classical rough sets in
describing some practical problems. However, by extending rough sets,
covering-based rough sets can increase the roughness of each model
in recognizing objects. How to obtain better approximations from
the models of covering-based rough sets is an important issue.
In this paper, two concepts, determinate elements and indeterminate
elements in a universe, are proposed and given precise definitions.
This research makes a reasonable refinement of the covering
elements from a new viewpoint, and the refinement may
generate better approximations in covering-based rough set models.
To validate the theory, it is applied to eight major covering-based
rough set models adapted from the literature.
The result is that, in all these models, the lower approximation
increases effectively; correspondingly, in all models the upper
approximation decreases, with the exception of two models in some
special situations. Therefore, the roughness in recognizing objects
is reduced. This research provides a new approach to the study and
application of covering-based rough sets.
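For reference, one common pair of covering-based approximation operators can be sketched as follows: covering blocks fully inside the target set X contribute to the lower approximation, and blocks intersecting X contribute to the upper approximation. This is only one of the many models the abstract refers to, and the function name is ours.

```python
def approximations(covering, X):
    """Lower and upper approximations of X under a covering of the
    universe (each block is an iterable of elements)."""
    X = set(X)
    lower, upper = set(), set()
    for block in covering:
        b = set(block)
        if b <= X:      # block entirely inside X: certainly in X
            lower |= b
        if b & X:       # block meets X: possibly in X
            upper |= b
    return lower, upper
```

The gap between the two approximations is the boundary region; the refinement of covering elements described in the abstract aims to shrink exactly this gap.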
Abstract: Reliable results for an insulated oval duct
considering heat radiation are obtained based on an accurate oval
perimeter computed by an integral method together with a
one-dimensional Plane Wedge Thermal Resistance (PWTR) model. This is
an extension of a former study of an insulated oval duct that
neglected heat radiation.
It is found that in the practical situations with long-short-axes ratio a/b
4.5% while t/R2