Abstract: The objective of this study was to improve our
understanding of vulnerability and environmental change in the
Apodi Valley Region: its causes, intensity, distribution, and the
human-environment effects on the ecosystem. This paper identifies,
assesses and classifies vulnerability and environmental change in
the Apodi Valley Region using a combined approach of landscape
pattern and ecosystem sensitivity. Models were developed using the
following five thematic layers: geology, geomorphology, soil,
vegetation and land use/cover, by means of a Geographical
Information System (GIS) based on hydro-geophysical parameters.
In spite of data problems and shortcomings, using ESRI's ArcGIS
9.3, the vulnerability score, used to classify, weight and combine
15 separate land cover classes into a single indicator, provides a
reliable measure of differences (6 classes) among regions and
communities that are exposed to similar ranges of hazards.
Indeed, the ongoing and active development of vulnerability
concepts and methods has already produced some tools to help
overcome common issues, such as acting in a context of high
uncertainty, taking into account the dynamics and spatial scale of
a social-ecological system, or gathering viewpoints from different
sciences to combine human and impact-based approaches. Based on
this assessment, this paper proposes concrete perspectives and
possibilities to benefit from existing commonalities in the
construction and application of assessment tools.
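The weighted combination of thematic layers into a single classed indicator can be sketched as follows. This is an illustrative toy, not the paper's actual model: the layer weights, per-cell ratings and class breaks are invented assumptions.

```python
# Illustrative GIS-style weighted overlay: combine per-layer ratings for one
# raster cell into a vulnerability score, then bin it into discrete classes.
# Layer weights and ratings below are hypothetical, not the study's values.

LAYER_WEIGHTS = {  # assumed relative importance of each thematic layer
    "geology": 0.15, "geomorphology": 0.20, "soil": 0.20,
    "vegetation": 0.20, "land_use": 0.25,
}

def vulnerability_score(cell_ratings):
    """Weighted sum of per-layer ratings (each rated 1..5) for one cell."""
    return sum(LAYER_WEIGHTS[layer] * rating
               for layer, rating in cell_ratings.items())

def vulnerability_class(score, n_classes=6, lo=1.0, hi=5.0):
    """Bin a continuous score into 1..n_classes by equal intervals."""
    step = (hi - lo) / n_classes
    c = int((score - lo) / step) + 1
    return min(max(c, 1), n_classes)

cell = {"geology": 2, "geomorphology": 4, "soil": 3,
        "vegetation": 5, "land_use": 4}
score = vulnerability_score(cell)
```

In a real workflow the same two functions would be applied cell-by-cell over the rasterized layers to produce the classed vulnerability map.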
Abstract: This study investigated and assessed the effects of
relaxation training on the levels of state anxiety of first-year
female nursing students during their initial experience in a
clinical setting. This research is a quasi-experimental study that
was carried out in the nursing and midwifery faculty of Tehran
University of Medical Sciences. The sample consists of 60
first-term female nursing students selected through convenience
and random sampling; 30 of them formed the experimental group and
30 the control group. The data-collection instrument was a
questionnaire consisting of 3 parts. The first part includes 10
questions about demographic characteristics. The second part
includes 20 questions about anxiety (Spielberger's test). The
third part includes physiological indicators of anxiety (blood
pressure, pulse, respiration, body temperature). The statistical
tests included the t-test and Fisher's test. Data were analyzed
with SPSS software.
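The independent two-sample t-test the study reports can be sketched in a few lines (pooled-variance form). The anxiety scores below are made-up numbers for illustration, not the study's data.

```python
# Minimal two-sample t statistic (pooled variance), as used to compare the
# experimental and control groups. Scores are invented example values.
import math

def two_sample_t(a, b):
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

experimental = [38, 35, 40, 33, 36, 34]   # hypothetical anxiety scores
control      = [45, 48, 44, 47, 46, 49]
t = two_sample_t(experimental, control)
```

The statistic is then compared to a critical value from the t distribution (about 2.23 at the 5% level with 10 degrees of freedom for this toy sample).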
Abstract: Multimedia information availability has increased
dramatically with the advent of video broadcasting on handheld
devices. But with this availability comes the problem of
maintaining the security of information that is displayed in
public. ISMA Encryption
and Authentication (ISMACryp) is one of the chosen technologies for
service protection in DVB-H (Digital Video Broadcasting-
Handheld), the TV system for portable handheld devices. The
ISMACryp is encoded with H.264/AVC (advanced video coding),
while leaving all structural data as it is. Two modes of ISMACryp are
available: the CTR (Counter) mode and the CBC (Cipher Block
Chaining) mode. Both modes of ISMACryp are based on the 128-bit
AES algorithm. The AES algorithm is complex and requires a long
execution time, which is not suitable for real-time applications
such as live TV. The proposed system aims to gain a deep
understanding of video data security on multimedia technologies and
to provide security for real time video applications using selective
encryption for H.264/AVC. Five levels of security are proposed in
this paper, based on the content of the NAL unit in the
Constrained Baseline profile of H.264/AVC. The selective
encryption at the different levels encrypts the intra-prediction
modes, residue data, inter-prediction modes or motion vectors
only. Experimental results presented in this paper show that the
fifth level, which is ISMACryp, provides a higher level of
security at the cost of more encryption time, while the first
level provides a lower level of security by encrypting only the
motion vectors, with lower execution time and without compromising
the compression or the quality of the visual content. This
encryption scheme adds little cost to the compression process and
keeps the file format unchanged, with some direct operations still
supported. Simulations were carried out in Matlab.
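The core idea of selective encryption, encrypting only the chosen syntax elements while leaving the rest of the bitstream format-compliant, can be illustrated with a toy sketch. A SHA-256-based keystream stands in for the 128-bit AES-CTR cipher used by ISMACryp, and the NAL-unit byte layout is invented for illustration.

```python
# Conceptual sketch of selective encryption: only the byte ranges holding the
# selected syntax elements (e.g. motion vectors) are encrypted; everything
# else is untouched, so the container stays parseable. The SHA-256 keystream
# is a stdlib stand-in for AES-CTR, not the real ISMACryp cipher.
import hashlib

def keystream(key, nonce, length):
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def selective_encrypt(payload, regions, key, nonce=b"nal0"):
    """XOR-encrypt only the (start, end) byte ranges listed in `regions`."""
    data = bytearray(payload)
    for start, end in regions:
        ks = keystream(key, nonce + start.to_bytes(4, "big"), end - start)
        for i in range(start, end):
            data[i] ^= ks[i - start]
    return bytes(data)

payload = bytes(range(32))       # stand-in NAL unit payload
mv_region = [(20, 28)]           # assumed motion-vector byte range
enc = selective_encrypt(payload, mv_region, key=b"secret-key")
```

Because XOR with a fixed keystream is an involution, calling `selective_encrypt` again with the same key decrypts; the bytes outside the region are bit-identical to the original.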
Abstract: The Wavelet-Galerkin finite element method for
solving the one-dimensional heat equation is presented in this work.
Two types of basis functions which are the Lagrange and multi-level
wavelet bases are employed to derive the full form of matrix system.
We consider both linear and quadratic bases in the Galerkin method.
The time derivative is approximated by a polynomial time basis
that makes it easy to extend the order of approximation in time.
Our numerical results show that the rates of convergence for the
linear Lagrange and the linear wavelet bases are the same, of
order 2, while the rates of convergence for the quadratic Lagrange
and the quadratic wavelet bases are approximately of order 4. The
results also reveal that the wavelet basis provides an easy way to
improve numerical resolution, simply by increasing the desired
number of levels in the multilevel construction process.
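A minimal sketch of the linear-Lagrange Galerkin discretization described above, for u_t = u_xx on [0,1] with homogeneous Dirichlet boundaries, stepped with backward Euler, (M + dt K) u^{n+1} = M u^n. The time scheme and mesh sizes are illustrative choices, not the paper's.

```python
# 1-D heat equation by linear-Lagrange Galerkin FEM + backward Euler.
# Mass matrix rows (h/6)[1,4,1], stiffness rows (1/h)[-1,2,-1]; the
# tridiagonal system is solved with the Thomas algorithm.
import math

def solve_tridiag(sub, diag, sup, rhs):
    n = len(diag)
    c, d = [0.0] * n, [0.0] * n
    c[0], d[0] = sup[0] / diag[0], rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - sub[i] * c[i - 1]
        c[i] = sup[i] / m if i < n - 1 else 0.0
        d[i] = (rhs[i] - sub[i] * d[i - 1]) / m
    for i in range(n - 2, -1, -1):
        d[i] -= c[i] * d[i + 1]
    return d

n, dt, t_end = 40, 1e-3, 0.1
h = 1.0 / n
u = [math.sin(math.pi * (i + 1) * h) for i in range(n - 1)]  # interior nodes
sub = [h / 6 - dt / h] * (n - 1)            # (M + dt*K) sub-diagonal
diag = [4 * h / 6 + 2 * dt / h] * (n - 1)   # (M + dt*K) diagonal
sup = [h / 6 - dt / h] * (n - 1)            # (M + dt*K) super-diagonal
for _ in range(round(t_end / dt)):
    rhs = [(h / 6) * ((u[i - 1] if i else 0.0) + 4 * u[i]
           + (u[i + 1] if i < n - 2 else 0.0)) for i in range(n - 1)]
    u = solve_tridiag(sub, diag, sup, rhs)

# Exact solution of u_t = u_xx with u(x,0) = sin(pi x)
exact = [math.exp(-math.pi ** 2 * t_end) * math.sin(math.pi * (i + 1) * h)
         for i in range(n - 1)]
err = max(abs(a - b) for a, b in zip(u, exact))
```

Halving h (and dt accordingly) roughly quarters the spatial part of `err`, consistent with the order-2 convergence the abstract reports for linear bases.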
Abstract: In this paper, the hardware implementation of the
RSA public-key cryptographic algorithm is presented. The RSA
cryptographic algorithm is depends on the computation of repeated
modular exponentials.
The Montgomery algorithm is used and modified to reduce
hardware resources and to achieve reasonable operating speed for
FPGA. An efficient architecture for modular multiplications based on
the array multiplier is proposed. We have implemented a RSA
cryptosystem based on Montgomery algorithm. As a result, it is
shown that proposed architecture contributes to small area and
reasonable speed.
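The Montgomery reduction (REDC) at the heart of such designs replaces the costly division by the modulus with shifts and multiplications. In hardware the multiply and reduce are interleaved word by word, but the arithmetic below is the same; the parameters are small for illustration.

```python
# Montgomery reduction: redc(T) computes T * R^{-1} mod N using only
# masking, multiplication and a shift by k (R = 2**k), no division by N.
def montgomery_setup(N, k):
    """R = 2**k with R > N and N odd. Returns R and N' with N*N' = -1 mod R."""
    R = 1 << k
    Nprime = (-pow(N, -1, R)) % R
    return R, Nprime

def redc(T, N, k, Nprime):
    R_mask = (1 << k) - 1
    m = ((T & R_mask) * Nprime) & R_mask
    t = (T + m * N) >> k
    return t - N if t >= N else t

N, k = 60443, 17                       # odd modulus, R = 2**17 > N
R, Nprime = montgomery_setup(N, k)
a, b = 12345, 54321
aR, bR = (a * R) % N, (b * R) % N      # map into the Montgomery domain
abR = redc(aR * bR, N, k, Nprime)      # = a*b*R mod N
product = redc(abR, N, k, Nprime)      # map back: a*b mod N
```

A modular exponentiation keeps operands in the Montgomery domain across the whole square-and-multiply loop, so the domain conversions are paid only once.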
Abstract: The paper deals with quality labels used in the food products market, especially labels of quality, labels of origin, and labels of organic farming. The aim of the paper is to identify the perception of these labels by consumers in the Czech Republic. The first part refers to the definition and specification of the food quality labels that are relevant in the Czech Republic. The second part includes the discussion of marketing research results. Data were collected using the personal questioning method. Empirical findings on 150 respondents relate to consumer awareness and perception of the national and European food quality labels used in the Czech Republic, attitudes to purchases of labelled products, and interest in information regarding the labels. Statistical methods, specifically Pearson's chi-square test of independence, the coefficient of contingency, and the coefficient of association, are used to determine whether significant differences exist among selected demographic categories of Czech consumers.
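Pearson's chi-square test of independence on a 2x2 contingency table (for instance, a demographic category against label awareness) can be sketched as follows; the counts are invented, not the survey's data.

```python
# Pearson chi-square test of independence for a 2x2 table:
# chi2 = sum (observed - expected)^2 / expected, expected from the margins.
def chi_square_2x2(table):
    (a, b), (c, d) = table
    n = a + b + c + d
    rows, cols = (a + b, c + d), (a + c, b + d)
    chi2 = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            exp = rows[i] * cols[j] / n       # expected count under independence
            chi2 += (obs - exp) ** 2 / exp
    return chi2

aware = [[45, 30],    # hypothetical: women aware / not aware of a label
         [25, 50]]    # hypothetical: men   aware / not aware
chi2 = chi_square_2x2(aware)
significant = chi2 > 3.841   # 5% critical value, 1 degree of freedom
```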
Abstract: Effective knowledge support relies on providing
operation-relevant knowledge to workers promptly and accurately. A
knowledge flow represents an individual's or a group's
knowledge-needs and referencing behavior of codified knowledge
during operation performance. The flow has been utilized to facilitate
organizational knowledge support by illustrating workers'
knowledge-needs systematically and precisely. However,
conventional knowledge-flow models cannot work well in cooperative
teams, in which team members usually have diverse knowledge-needs in
terms of roles. The reason is that those models only provide one single
view to all participants and do not reflect individual knowledge-needs
in flows. Hence, we propose a role-based knowledge-flow view model
in this work. The model builds knowledge-flow views (or virtual
knowledge flows) by creating appropriate virtual knowledge nodes
and generalizing knowledge concepts to required concept levels. The
customized views can represent an individual role's knowledge-needs
in a teamwork context. The novel model indicates knowledge-needs in
a condensed representation from a role's perspective and enhances the
efficiency of cooperative knowledge support in organizations.
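One way to picture building a role-based view is to generalize each concept in the recorded flow up a concept taxonomy to the abstraction level a role requires, merging consecutive duplicates into one virtual node. The concept names, taxonomy and levels below are invented for illustration and are not the paper's model.

```python
# Illustrative knowledge-flow view: generalize concepts up a (hypothetical)
# taxonomy to the role's required level; repeated concepts collapse into a
# single virtual knowledge node.
PARENT = {                                # child concept -> parent concept
    "java_generics": "java", "java_threads": "java",
    "java": "programming", "unit_testing": "quality_assurance",
    "quality_assurance": "engineering", "programming": "engineering",
}

def generalize(concept, levels):
    for _ in range(levels):
        concept = PARENT.get(concept, concept)
    return concept

def knowledge_flow_view(flow, levels):
    view = []
    for concept in flow:
        g = generalize(concept, levels)
        if not view or view[-1] != g:     # merge repeats into one virtual node
            view.append(g)
    return view

flow = ["java_generics", "java_threads", "unit_testing"]
manager_view = knowledge_flow_view(flow, levels=2)   # coarse role view
developer_view = knowledge_flow_view(flow, levels=0) # detailed role view
```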
Abstract: A series of tests on a cold-formed steel (CFS) wall plate system subjected to uplift force at the mid-span of the wall plate is presented. The aim of the study was to investigate the behaviour and identify the modes of failure of the CFS wall plate system. Two parameters were considered in the study: 1) different dimensions of the U-bracket at the supports and 2) different sizes of lipped C-channel. The lipped C-channels used were C07508, C07512 and C10012. The dimensions of the leg of the U-bracket were 50x35 mm and 50x60 mm respectively, and 25 mm clearance was provided to the connections for specimens with clearance. Results show that specimens with and without clearance experienced the same mode of failure. Failure began with the yielding of the connectors, followed by distortional buckling of the wall plate. However, when C075 sections were used as the wall plate, the system behaved differently. There was a large deformation in the wall plate, and failure began with distortional buckling of the wall plate followed by bearing of the connecting plates at the supports (U-bracket). The ultimate strength of the system also decreased dramatically when C075 sections were used.
Abstract: In this paper, the implementation of a rule-based
intuitive reasoner is presented. The implementation included two
parts: the rule induction module and the intuitive reasoner. A large
weather database was acquired as the data source. Twelve weather
variables from those data were chosen as the "target variables"
whose values were predicted by the intuitive reasoner. A "complex"
situation was simulated by making only subsets of the data available
to the rule induction module. As a result, the rules induced were
based on incomplete information with variable levels of certainty.
The certainty level was modeled by a metric called "Strength of
Belief", which was assigned to each rule or datum as ancillary
information about the confidence in its accuracy. Two techniques
were employed to induce rules from the data subsets: decision tree
and multi-polynomial regression, respectively for the discrete and the
continuous type of target variables. The intuitive reasoner was tested
for its ability to use the induced rules to predict the classes of the
discrete target variables and the values of the continuous target
variables. The intuitive reasoner implemented two types of
reasoning: fast and broad, where, by analogy to human thought, the
former corresponds to fast decision-making and the latter to deeper
contemplation. For reference, a weather data analysis approach
which had been applied on similar tasks was adopted to analyze the
complete database and create predictive models for the same 12
target variables. The values predicted by the intuitive reasoner and
the reference approach were compared with actual data. The intuitive
reasoner reached near-100% accuracy for two continuous target
variables. For the discrete target variables, the intuitive reasoner
predicted at least 70% as accurately as the reference reasoner. Since
the intuitive reasoner operated on rules derived from only about 10%
of the total data, it demonstrated the potential advantages in dealing
with sparse data sets as compared with conventional methods.
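The "Strength of Belief" idea, a confidence attached to each induced rule and used to weight its contribution, can be sketched in miniature. The rules, their support counts and the combination-by-weighted-vote scheme below are illustrative assumptions, not the paper's implementation.

```python
# Illustrative Strength of Belief: each rule's confidence is the fraction of
# its (incomplete) data subset that supports it; class predictions from
# several rules are combined by summing strength-weighted votes.
class Rule:
    def __init__(self, predict, support, hits):
        self.predict = predict                  # predicted class label
        self.strength = hits / support          # Strength of Belief in [0, 1]

def combine(rules):
    votes = {}
    for r in rules:
        votes[r.predict] = votes.get(r.predict, 0.0) + r.strength
    return max(votes, key=votes.get)

rules = [Rule("rain", support=40, hits=30),     # SoB 0.75
         Rule("rain", support=20, hits=12),     # SoB 0.60
         Rule("clear", support=50, hits=45)]    # SoB 0.90
prediction = combine(rules)
```

Here two moderately confident rules outvote one strong rule (0.75 + 0.60 > 0.90), which is the kind of trade-off a belief-weighted reasoner makes on sparse data.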
Abstract: A numerical method is developed for simulating
the motion of particles with arbitrary shapes in an effectively
infinite or bounded viscous flow. The particle translational and
angular motions are numerically investigated using a fluid-structure
interaction (FSI) method based on the Arbitrary-Lagrangian-Eulerian
(ALE) approach and the dynamic mesh method (smoothing and
remeshing) in FLUENT (ANSYS Inc., USA). Also, the effects of
arbitrary shapes on the dynamics are studied using the FSI method
which could be applied to the motions and deformations of a single
blood cell and multiple blood cells, and the primary thrombogenesis
caused by platelet aggregation. It is expected that, combined with a
sophisticated large-scale computational technique, the simulation
method will be useful for understanding the overall properties of blood
flow from blood cellular level (microscopic) to the resulting
rheological properties of blood as a mass (macroscopic).
Abstract: Several models of vulnerability assessment have been proposed. The selection of one of these models depends on the objectives of the study. The classical methodologies for seismic vulnerability analysis, as a part of seismic risk analysis, have been formulated with statistical criteria based on rapid observation. The information relating to the buildings' performance is statistically elaborated. In this paper, we use the European Macroseismic Scale EMS-98 to define the relationship between damage and macroseismic intensity in order to assess seismic vulnerability. Applied to the Algiers area, the first step is to identify building typologies and to assign vulnerability classes. In the second step, damage is investigated according to EMS-98.
Abstract: Modern management of water distribution systems
(WDS) needs water quality models that are able to accurately predict
the dynamics of water quality variations within the distribution system
environment. Before water quality models can be applied to solve
system problems, they should be calibrated. Although former
researchers used GA solvers to calibrate the relevant parameters,
these are difficult to apply to large-scale or medium-scale real
systems because of long computational times. In this paper a new
method is designed
which combines both macro and detailed models to optimize the water
quality parameters. This new combinational algorithm uses radial
basis function (RBF) metamodeling as a surrogate to be optimized,
in order to decrease the number of time-consuming water quality
simulations, and can rapidly calibrate the pipe wall reaction
coefficients of the chlorine model of a large-scale WDS. Two case
studies show this method to be more efficient and promising, and it
deserves to be generalized in the future.
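The RBF metamodeling step can be sketched as follows: fit Gaussian RBF interpolation weights at a few sampled parameter points, then evaluate the cheap surrogate in place of the expensive simulator. The 1-D "coefficient to simulation output" data below are made up for illustration.

```python
# Gaussian RBF surrogate: solve A w = y with A_ij = phi(|x_i - x_j|), then
# evaluate s(x) = sum_j w_j * phi(|x - x_j|). Small dense system solved by
# Gaussian elimination with partial pivoting.
import math

def rbf(r, eps=1.0):
    return math.exp(-(r / eps) ** 2)

def solve(A, y):
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

centers = [0.0, 0.5, 1.0, 1.5]        # sampled reaction coefficients (toy)
samples = [2.0, 0.5, 0.1, 0.8]        # expensive-model outputs (toy)
A = [[rbf(abs(a - b)) for b in centers] for a in centers]
w = solve(A, samples)

def surrogate(x):
    return sum(wi * rbf(abs(x - c)) for wi, c in zip(w, centers))
```

An optimizer then queries `surrogate` many times and calls the real water quality simulation only to add new sample points, which is where the claimed speed-up comes from.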
Abstract: This paper aims at identifying and analyzing the
knowledge transmission channels in textile and clothing clusters
located in Brazil and in Europe. Primary data was obtained through
interviews with key individuals. The collection of primary data was
carried out based on a questionnaire with ten categories of indicators
of knowledge transmission. Secondary data was also collected
through a literature review and through international organizations
sites. Similarities related to the use of the main transmission channels
of knowledge are observed in all cases. The main similarities are:
influence of suppliers of machinery, equipment and raw materials;
imitation of products and best practices; training promoted by
technical institutions and businesses; and cluster companies being
open to acquiring new knowledge. The main differences lie in the
relationships between companies, which are more intense in Europe
than in Brazil. Differences also occur in the importance and
frequency of the relationships with the government, with the
cultural environment, and with research and development
activities. Factors that reduce the importance of geographical
proximity in the transmission of knowledge, and in generating
trust and establishing collaborative behavior, are also found.
Abstract: This paper includes a review of three physics simulation packages that can be used to provide researchers with a virtual ground for modeling, implementing and simulating complex models, as well as testing their control methods with less cost and development time. The inverted pendulum model was used as a test bed for comparing ODE, DANCE and Webots, while Linear State Feedback was used to control its behavior. The packages were compared with respect to model creation, solving systems of differential equations, data storage, setting system variables, controlling the experiment, and ease of use. The purpose of this paper is to give an overview of our experience with these environments and to demonstrate some of the benefits and drawbacks involved in practice for each package.
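The test bed itself is compact enough to sketch: a linearized inverted pendulum stabilized by linear state feedback u = -Kx, integrated with explicit Euler. The plant parameters and gains below are illustrative choices, not the paper's values.

```python
# Linearized inverted pendulum (theta'' = (g/l)*theta + u) under linear state
# feedback u = -k1*theta - k2*omega, integrated with explicit Euler.
G_OVER_L = 9.81          # g/l for a unit-length pendulum (assumed)
K1, K2 = 20.0, 6.0       # gains chosen so the closed loop is stable

def simulate(theta0, dt=1e-3, t_end=5.0):
    theta, omega = theta0, 0.0
    for _ in range(round(t_end / dt)):
        u = -K1 * theta - K2 * omega          # linear state feedback
        alpha = G_OVER_L * theta + u          # linearized angular acceleration
        theta += dt * omega
        omega += dt * alpha
    return theta

final = simulate(theta0=0.2)                  # starts 0.2 rad off upright
```

With these gains the closed-loop dynamics are theta'' = -10.19*theta - 6*omega, whose eigenvalues have real part -3, so the angle decays to zero within a few seconds.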
Abstract: The ElectroEncephaloGram (EEG) is useful for
clinical diagnosis and biomedical research. EEG signals often
contain strong ElectroOculoGram (EOG) artifacts produced
by eye movements and eye blinks especially in EEG recorded
from frontal channels. These artifacts obscure the underlying
brain activity, making its visual or automated inspection
difficult. The goal of ocular artifact removal is to remove
ocular artifacts from the recorded EEG, leaving the underlying
background signals due to brain activity. In recent times,
Independent Component Analysis (ICA) algorithms have
demonstrated superior potential in obtaining the least
dependent source components. In this paper, the independent
components are obtained by using the JADE algorithm (best
separating algorithm) and are classified into either artifact
component or neural component. Neural Network is used for
the classification of the obtained independent components.
Neural Network requires input features that exactly represent
the true character of the input signals so that the neural
network could classify the signals based on those key
characters that differentiate between various signals. In this
work, Auto Regressive (AR) coefficients are used as the input
features for classification. Two neural network approaches
are used to learn classification rules from EEG data. First, a
Polynomial Neural Network (PNN) trained by the GMDH (Group
Method of Data Handling) algorithm is used; secondly, a
feed-forward neural network (FNN) classifier trained by a standard
back-propagation algorithm is used for classification. The
results show that JADE-FNN performs better than JADE-PNN.
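Extracting auto-regressive (AR) coefficients as features can be sketched with a least-squares AR(2) fit. The signal below is a synthetic AR(2) process (not EEG data), so the fitted coefficients should recover the generating ones.

```python
# AR(2) feature extraction: fit x[n] ~ a1*x[n-1] + a2*x[n-2] by solving the
# 2x2 least-squares normal equations; [a1, a2] is the feature vector.
import random

random.seed(0)
A1, A2 = 0.5, -0.3                       # generating AR(2) coefficients
x = [0.0, 0.0]
for _ in range(2000):
    x.append(A1 * x[-1] + A2 * x[-2] + random.gauss(0, 0.1))

s11 = sum(x[n - 1] * x[n - 1] for n in range(2, len(x)))
s12 = sum(x[n - 1] * x[n - 2] for n in range(2, len(x)))
s22 = sum(x[n - 2] * x[n - 2] for n in range(2, len(x)))
b1 = sum(x[n] * x[n - 1] for n in range(2, len(x)))
b2 = sum(x[n] * x[n - 2] for n in range(2, len(x)))
det = s11 * s22 - s12 * s12
a1 = (b1 * s22 - b2 * s12) / det
a2 = (s11 * b2 - s12 * b1) / det
features = [a1, a2]                      # AR coefficients as classifier input
```

For real independent components the same fit (usually at a higher AR order) is applied per component, and the coefficient vectors feed the neural network classifier.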
Abstract: The modern telecommunication industry demands
higher capacity networks with high data rate. Orthogonal frequency
division multiplexing (OFDM) is a promising technique for high data
rate wireless communications at reasonable complexity in wireless
channels. OFDM has been adopted for many types of wireless
systems like wireless local area networks such as IEEE 802.11a, and
digital audio/video broadcasting (DAB/DVB). The proposed research
focuses on a concatenated coding scheme that improves the
performance of OFDM-based wireless communications. It uses a
Redundant Residue Number System (RRNS) code as the outer code
and a convolutional code as the inner code. Here, a direct conversion
of analog signal to residue domain is done to reduce the conversion
complexity using sigma-delta based parallel analog-to-residue
converter. The bit error rate (BER) performances of the proposed
system under different channel conditions are investigated. These
include the effect of additive white Gaussian noise (AWGN),
multipath delay spread, peak power clipping and frame start
synchronization error. The simulation results show that the proposed
RRNS-Convolutional concatenated coding (RCCC) scheme provides
significant improvement in the system performance by exploiting the
inherent properties of RRNS.
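The redundancy that RRNS exploits can be illustrated with a small sketch: a value is carried as residues modulo pairwise-coprime moduli, and extra (redundant) moduli let a corrupted residue be detected by a CRT consistency check. The moduli are illustrative, not the paper's.

```python
# Toy Redundant Residue Number System: encode to residues, reconstruct with
# the Chinese Remainder Theorem, and use redundant residues for error
# detection.
from functools import reduce

MODULI = [7, 11, 13]        # information moduli (dynamic range 7*11*13 = 1001)
REDUNDANT = [17, 19]        # redundant moduli for error detection

def encode(x):
    return [x % m for m in MODULI + REDUNDANT]

def crt(residues, moduli):
    """Chinese Remainder Theorem reconstruction."""
    M = reduce(lambda a, b: a * b, moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)
    return x % M

def detect_error(residues):
    """Decode from the information moduli; flag redundant disagreement."""
    x = crt(residues[:len(MODULI)], MODULI)
    ok = all(x % m == r for m, r in zip(REDUNDANT, residues[len(MODULI):]))
    return x, ok

residues = encode(842)
value, ok = detect_error(residues)
corrupted = residues[:]
corrupted[1] = (corrupted[1] + 3) % 11      # inject a single-residue error
bad_value, bad_ok = detect_error(corrupted)
```

Because the residue digits are independent, a channel error stays confined to one digit, which is the property the proposed concatenated scheme builds on.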
Abstract: With the implied volatility as an important factor in
financial decision-making, in particular in option pricing valuation,
and given the fact that the pricing biases of the Leland option
pricing models and the implied volatility structure of the options
are related,
this study considers examining the implied adjusted volatility smile
patterns and term structures in the S&P/ASX 200 index options using
the different Leland option pricing models. The examination of the
implied adjusted volatility smiles and term structures in the
Australian index options market covers the global financial crisis
of mid-2007. The implied adjusted volatility was found to escalate
to approximately triple its rate prior to the crisis.
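The Leland adjustment can be sketched briefly: under proportional transaction costs k and rehedging interval dt, the Black-Scholes volatility is replaced by an adjusted volatility before pricing. The numerical inputs below are illustrative, not values from the study.

```python
# Leland-adjusted volatility and a plain Black-Scholes call price.
# sigma_adj^2 = sigma^2 * (1 + sqrt(2/pi) * k / (sigma * sqrt(dt)))
import math

def leland_adjusted_vol(sigma, k, dt):
    factor = 1.0 + math.sqrt(2.0 / math.pi) * k / (sigma * math.sqrt(dt))
    return sigma * math.sqrt(factor)

def black_scholes_call(S, K, T, r, sigma):
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

sigma_adj = leland_adjusted_vol(sigma=0.20, k=0.01, dt=1 / 52)  # weekly hedge
plain = black_scholes_call(100, 100, 0.5, 0.03, 0.20)
adjusted = black_scholes_call(100, 100, 0.5, 0.03, sigma_adj)
```

Solving this relation in reverse, recovering sigma_adj from observed option prices, is what produces the implied adjusted volatility smiles and term structures the abstract examines.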
Abstract: Crucial information barely visible to the human eye is
often embedded in a series of low resolution images taken of the
same scene. Super resolution reconstruction is the process of
combining several low resolution images into a single higher
resolution image. The ideal algorithm should be fast, and should add
sharpness and details, both at edges and in regions without adding
artifacts. In this paper we propose a super resolution blind
reconstruction technique for linearly degraded images. In our
proposed technique the algorithm is divided into three parts:
image registration, wavelet-based fusion and image restoration. In
this paper three low resolution images are considered, which may
be sub-pixel shifted, rotated, blurred or noisy. The sub-pixel
shifted images are registered using an affine transformation
model, a wavelet-based fusion is performed, and the noise is
removed using soft thresholding. Our proposed technique reduces
blocking artifacts, smooths the edges, and is also able to restore
high frequency details in an image. Our technique is efficient and
computationally fast, with a clear prospect of real-time
implementation.
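The soft-thresholding rule used in the denoising step is simple enough to state exactly: coefficients are shrunk toward zero by the threshold t, and anything below t in magnitude is set to zero. The coefficient values below are illustrative.

```python
# Soft thresholding for wavelet-domain denoising:
# y = sign(c) * max(|c| - t, 0)
def soft_threshold(coeffs, t):
    return [max(abs(c) - t, 0.0) * (1 if c >= 0 else -1) for c in coeffs]

noisy = [4.0, -0.3, 1.5, 0.2, -2.5]   # illustrative wavelet coefficients
denoised = soft_threshold(noisy, t=0.5)
```

Unlike hard thresholding, the surviving coefficients are also shrunk by t, which avoids the discontinuity responsible for ringing artifacts.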
Abstract: This paper presents an advance in monitoring and
process control of surface roughness in CNC machine for the turning
and milling processes. An integration of the in-process monitoring
and process control of the surface roughness is proposed and
developed during the machining process by using the cutting force
ratio. The previously developed surface roughness models for turning
and milling processes of the author are adopted to predict the
in-process surface roughness; the models consist of the cutting
speed, the feed rate, the tool nose radius, the depth of cut, the
rake angle, and the cutting force ratio. The cutting force ratios
obtained from the
turning and the milling are utilized to estimate the in-process surface
roughness. The dynamometers are installed on the tool turret of CNC
turning machine and the table of 5-axis machining center to monitor
the cutting forces. The in-process control of the surface roughness
has been developed and proposed to control the predicted surface
roughness. It has been proved by the cutting tests that the proposed
integration system of the in-process monitoring and the process
control can be used to check the surface roughness during the cutting
by utilizing the cutting force ratio.
Abstract: 3-hydroxy-3-methylglutaryl coenzyme A reductase (HMGR) catalyzes the conversion of HMG-CoA to mevalonate using NADPH, and the enzyme is involved in the rate-controlling step of mevalonate biosynthesis. Inhibition of HMGR is considered an effective way to lower cholesterol levels, so it is a drug target for treating hypercholesterolemia, a major risk factor of cardiovascular disease. To discover novel HMGR inhibitors, we performed structure-based pharmacophore modeling combined with molecular dynamics (MD) simulation. Four HMGR inhibitors were used for MD simulation, and a representative structure from each simulation was selected by clustering analysis. Four structure-based pharmacophore models were generated using the representative structures. The generated models were validated and used in virtual screening to find novel scaffolds for inhibiting HMGR. The screened compounds were filtered by applying drug-like properties and used in molecular docking. Finally, four hit compounds were obtained, and these complexes were refined using energy minimization. These compounds might be potential leads for designing novel HMGR inhibitors.