Abstract: In this paper we present an efficient approach for the prediction of two sunspot-related time series, namely the Yearly Sunspot Number and the IR5 Index, which are commonly used for monitoring solar activity. The method exploits partially recurrent Elman networks and can be divided into three main steps: the first is a "de-rectification" of the time series under study, producing a new series whose appearance, similar to a sum of sinusoids, can be modelled by our neural networks much better than the original dataset. We then normalize the de-rectified data to zero mean and unit standard deviation and, finally, train an Elman network with a single input, a recurrent hidden layer and a single output using a back-propagation algorithm with variable learning rate and momentum. The results show the efficiency of this approach which, although very simple, can outperform most existing solar activity forecasting methods.
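The normalization step described above (zero mean, unit standard deviation, applied before training and inverted after prediction) can be sketched as follows; the "de-rectification" itself is specific to the paper and is not reproduced here.

```python
# Sketch of the pre-training normalization step: scale a series to zero
# mean and unit standard deviation, keeping the parameters so network
# outputs can be mapped back to the original scale.

def normalize(series):
    """Return (normalized series, mean, std)."""
    n = len(series)
    mean = sum(series) / n
    std = (sum((x - mean) ** 2 for x in series) / n) ** 0.5
    return [(x - mean) / std for x in series], mean, std

def denormalize(series, mean, std):
    """Invert normalize() on predicted values."""
    return [x * std + mean for x in series]
```
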
Abstract: To reveal the temperature field distribution of the disc brake in a downward belt conveyor, mathematical models of heat transfer for the disc brake were established based on heat transfer theory. The simulation process is then described in detail, and the temperature field of the disc brake under dynamic speed and dynamic braking torque was numerically simulated using ANSYS software. Finally, the distribution and variation laws of the temperature field during braking were analyzed. Results indicate that the maximum surface temperature occurs before the end of braking, and that large temperature gradients exist in both the radial and axial directions, while the gradient is relatively small in the circumferential direction.
Abstract: The Scale Invariant Feature Transform (SIFT) has been widely applied, but extracting SIFT features is complicated and time-consuming. In this paper, to meet the demands of real-time applications, SIFT is parallelized and optimized on a cluster system; the resulting implementation is named pSIFT. Redundant storage and communication are used for boundary data to improve performance, and before the feature descriptors are computed, data reallocation is adopted to keep the load balanced in pSIFT. Experimental results show that pSIFT achieves good speedup and scalability.
Abstract: This paper presents a novel method for inferring odors from neural activity observed in rats' main olfactory bulbs. Multi-channel extracellular single-unit recordings were made with micro-wire electrodes (tungsten, 50 μm, 32 channels) implanted in the mitral/tufted cell layers of the main olfactory bulb of anesthetized rats to obtain neural responses to various odors. The neural response, used as the key feature, was measured by subtracting the neural firing rate before the stimulus from the rate after it. For odor inference, we developed a decoding method based on maximum likelihood (ML) estimation. The results show average decoding accuracies of about 100.0%, 96.0%, 84.0%, and 100.0% for the four rats, respectively. This work has profound implications for a novel brain-machine interface system for odor inference.
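The decoding pipeline above can be sketched as follows. This is a hypothetical illustration, not the paper's exact model: the feature is the post-minus-pre firing-rate change per channel as described, and ML decoding is assumed to use an isotropic Gaussian response model per odor, under which it reduces to nearest-mean classification; the channel counts and odor means are invented.

```python
import math

# Hypothetical ML decoder: each odor is modeled by a mean firing-rate
# change vector across channels; an isotropic Gaussian noise assumption
# makes the maximum-likelihood odor the one with the closest mean.

def response_feature(pre_rates, post_rates):
    # Key feature from the abstract: post-stimulus rate minus pre-stimulus rate.
    return [post - pre for pre, post in zip(pre_rates, post_rates)]

def ml_decode(feature, odor_means, sigma=1.0):
    """Return the odor whose Gaussian log-likelihood of the feature is maximal."""
    best_odor, best_ll = None, -math.inf
    for odor, mean in odor_means.items():
        ll = -sum((f - m) ** 2 for f, m in zip(feature, mean)) / (2 * sigma ** 2)
        if ll > best_ll:
            best_odor, best_ll = odor, ll
    return best_odor
```
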
Abstract: The genus Fumaria L. (Papaveraceae) in Iran comprises 8 species with wide medicinal use in Asian folk medicine. These herbs are considered useful in the treatment of gastrointestinal disease and skin disorders. The antioxidant activities of alkaloid and phenolic extracts of these species have been studied previously. The species are: F. officinalis, F. parviflora, F. asepala, F. densiflora, F. schleicheri, F. vaillantii and F. indica. More than 50 populations of Fumaria species were sampled from nature. In this study, different fatty acids were extracted and their peaks recorded by GC. These species contain fatty acids with antioxidant effects, a part of which are phospholipids. As these are unsaturated fatty acids, they may have industrial use as natural additives in cosmetics and in dermal and oral medicines. The presence of different compounds is discussed. Our studies of the antioxidant effects of these substances are continuing.
Abstract: The aim of this study is to test the "work values" inventory developed by Tevruz and Turgut and to utilize the concept in a model that aims to create a greater understanding of the work experience. The study examines the multiple effects of work values, work-value congruence and work centrality on organizational citizenship behavior. In this respect, it is hypothesized that work values and work-value congruence predict organizational citizenship behavior through work centrality. The work-goal congruence test and Tevruz and Turgut's work values inventory were administered along with Kanungo's work centrality test and Podsakoff et al.'s [47] organizational citizenship behavior test to employees working in Turkish SMEs. The study validated that Tevruz and Turgut's work values inventory and the work-value congruence test are reliable and can be used in future research. The study revealed a mediating role of work centrality only for the relationship between work values and the responsibility dimension of citizenship behavior. Most importantly, this study introduced an important concept, work-value congruence, which enables a better understanding of work values and their relation to various attitudinal variables.
Abstract: The proliferation of web applications and the pervasiveness of mobile technology make web-based attacks more attractive and easier to launch. A Web Application Firewall (WAF) is an intermediate tool between the web server and its users that provides comprehensive protection for the web application. The WAF is a negative security model whose detection and prevention mechanisms are based on predefined or user-defined attack signatures and patterns. However, a WAF alone is not adequate to defend against web vulnerabilities that grow in number and complexity daily. This paper presents a methodology for automatically deriving a positive security model which identifies and allows only legitimate web queries. The paper shows that a true positive rate of more than 90% can be achieved.
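A positive security model of this kind can be sketched as follows. This is an illustrative whitelist learner, not the paper's actual method: the character-class profiles, parameter names and reject-by-default policy are assumptions.

```python
import re

# Hypothetical positive (whitelist) model: from a sample of legitimate
# queries, infer the most restrictive character-class profile per
# parameter, then allow only queries whose parameters all match.

PROFILES = [
    ("digits", re.compile(r"^\d+$")),
    ("alpha", re.compile(r"^[A-Za-z]+$")),
    ("alnum", re.compile(r"^[A-Za-z0-9]+$")),
]

def learn_profile(values):
    """Pick the most restrictive profile matching every observed value."""
    for name, rx in PROFILES:
        if all(rx.match(v) for v in values):
            return name
    return None  # no safe profile learned; treated as reject-by-default

def is_legitimate(query, learned):
    """Allow the query only if every parameter matches its learned profile."""
    rx_by_name = dict(PROFILES)
    for param, value in query.items():
        profile = learned.get(param)
        if profile is None or not rx_by_name[profile].match(value):
            return False
    return True
```
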
Abstract: The system development life cycle (SDLC) is a process used during the development of any system. The SDLC consists of four main phases: analysis, design, implementation and testing. During the analysis phase, a context diagram and data flow diagrams are used to produce the process model of a system. Consistency of the context diagram with the lower-level data flow diagrams is very important in smoothing the development process of a system. However, a manual consistency check from the context diagram to lower-level data flow diagrams using a checklist is time-consuming. At the same time, the limits of human ability to spot errors influence the correctness and balancing of the diagrams. This paper presents a tool that automates the consistency check between Data Flow Diagrams (DFDs) based on the rules of DFDs. The tool serves two purposes: as an editor to draw the diagrams and as a checker to verify the correctness of the diagrams drawn. The consistency check from the context diagram to lower-level data flow diagrams is embedded in the tool to overcome the manual checking problem.
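The core balancing rule such a checker automates can be sketched as follows: the data flows entering and leaving a process in the parent diagram must equal the external flows of its child (lower-level) diagram. The flow names are illustrative, and this covers only the balancing rule, not the full DFD rule set.

```python
# Minimal balancing check between a parent process and its child DFD:
# report any flow that appears on one level but not the other.

def balance_check(parent_inputs, parent_outputs, child_inputs, child_outputs):
    """Return a list of inconsistencies; an empty list means balanced."""
    errors = []
    for flow in sorted(set(parent_inputs) ^ set(child_inputs)):
        errors.append(f"unbalanced input flow: {flow}")
    for flow in sorted(set(parent_outputs) ^ set(child_outputs)):
        errors.append(f"unbalanced output flow: {flow}")
    return errors
```
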
Abstract: This research explores the development of a structure for a Carbon Credit Registry System that accords with the needs of future events in Thailand. It also presents the big picture of every connected system by referring to the design of each system, the Data Flow Diagram, and the design of the system's data using the DES standard. The purpose of this paper is to show how to design the model of each system. Furthermore, the paper can serve as a guideline for designing an appropriate Carbon Credit Registry System.
Abstract: Most papers model the Joint Replenishment Problem (JRP) as a (kT, S) policy, where kT is a multiple of a common review period T, and S is a predefined order-up-to level. In general, the (T, S) policy is characterized by a long out-of-control period, which requires a large amount of safety stock compared with the (R, Q) policy. In this paper a probabilistic model is built in which the item with the shortest time between order intervals (T), call it item (i), is modeled under an (R, Q) policy with its inventory continuously reviewed, while the remaining items (j) are periodically reviewed at a definite time corresponding to item
Abstract: A method of gait identification based on nearest neighbor classification with motion similarity assessed by dynamic time warping is proposed. Model-based kinematic motion data, represented by joint rotations coded as Euler angles and unit quaternions, are used. Different pose distance functions in the Euler angle and quaternion spaces are considered. To evaluate the individual character of the movements of the subsequent joints during the gait cycle, joint selection is carried out. To examine the proposed approach, a database containing 353 gaits of 25 humans collected in a motion capture laboratory is used. The obtained results are promising: the classification that takes all joints into consideration has an accuracy of over 91%, and analysis of the movements of the hip joints alone correctly identifies gaits with almost 80% precision.
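The classification pipeline above can be sketched as dynamic time warping plugged into a one-nearest-neighbour classifier. For brevity, poses are simplified here to scalar values with absolute difference as the local cost; in the paper each pose is a set of joint rotations compared with dedicated Euler-angle or quaternion distances.

```python
import math

# Classic DTW distance between two pose sequences, then 1-NN
# classification over a labelled gallery of gait sequences.

def dtw(a, b):
    """DTW distance with absolute pose difference as the local cost."""
    n, m = len(a), len(b)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
    return cost[n][m]

def nearest_neighbour(query, gallery):
    """gallery: list of (label, sequence); return the DTW-closest label."""
    return min(gallery, key=lambda item: dtw(query, item[1]))[0]
```
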
Abstract: Determining the depth of anesthesia is a challenging problem in biomedical signal processing. Various methods have been suggested to produce a quantitative index of the depth of anesthesia, but most of them suffer from high sensitivity during surgery. A novel method based on the energy scattering of samples in the wavelet domain is suggested to represent the basic content of the electroencephalogram (EEG) signal. In this method, the EEG signal is first decomposed into different sub-bands; the samples are then squared and the energy of the sample sequence is constructed across each scale and time and normalized, and finally the entropy of the resulting sequences is proposed as a reliable index. Empirical results show that applying the proposed method to EEG signals can classify the awake, moderate and deep anesthesia states similarly to BIS.
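The index construction above (square the samples, normalize the energy sequence, take its entropy) can be sketched per sub-band as follows. The wavelet decomposition itself is omitted; any sub-band sample sequence can be supplied, and the Shannon-entropy form is an assumption consistent with the description.

```python
import math

# Entropy of the normalized energy sequence of one sub-band: a flat
# (scattered) energy distribution gives high entropy, a concentrated
# one gives low entropy.

def energy_entropy(samples):
    energies = [x * x for x in samples]      # squared samples = energies
    total = sum(energies)
    if total == 0:
        return 0.0
    probs = [e / total for e in energies]    # normalize to a distribution
    return -sum(p * math.log2(p) for p in probs if p > 0)
```
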
Abstract: In high-bitrate information hiding techniques, one bit is embedded within each 4 x 4 Discrete Cosine Transform (DCT) coefficient block by means of vector quantization, and the hidden bit can then be effectively extracted at the terminal end. In this paper, high-bitrate information hiding algorithms are summarized and a video-in-video scheme is implemented. Experimental results show that the host video in which the large amount of auxiliary information is embedded suffers little visual quality decline: the luminance Peak Signal to Noise Ratio (PSNR-Y) of the host video degrades by only 0.22 dB on average, while the hidden information largely survives and remains highly robust under H.264/AVC compression, with an average Bit Error Rate (BER) of 0.015%.
Abstract: Three novel and significant contributions are made in this paper. First, the non-recursive formulation of Haar connection coefficients, pioneered by the present authors, is presented; it can be computed very efficiently and avoids stack and memory overflows. Second, a generalized approach for the state analysis of singular bilinear time-invariant (TI) and time-varying (TV) systems is presented, vis-à-vis the diversified and complex works reported by different authors. Third, a generalized approach for the parameter estimation of bilinear TI and TV systems is also proposed. The unified framework of the proposed method is significant in that the digital hardware, once designed, can single-handedly perform the complex tasks of state analysis and parameter estimation for different types of bilinear systems. The simplicity, effectiveness and generalized nature of the proposed method are established by applying it to different types of bilinear systems for the two tasks.
Abstract: This work proposes a CMOS model for which an algorithm has been created, followed by a performance evaluation of the proposal. For comparison of all the vital features, another commonly used model, the ZSTT (Zero Switching Time Transient) model, is chosen, and the results for the proposed equivalent CMOS model are promising. Excerpts of the created algorithm are also included.
Abstract: Pattern matching is one of the fundamental applications in molecular biology, and searching DNA-related data is a common activity for molecular biologists. In this paper we explore the applicability of a new pattern matching technique, the Index-based Forward Backward Multiple Pattern Matching algorithm (IFBMPM), to DNA sequences. Our approach avoids unnecessary comparisons in the DNA sequence; because of this, the number of comparisons of the proposed algorithm is much smaller than in other existing popular methods. The number of comparisons drops rapidly, execution time decreases accordingly, and the algorithm shows better performance.
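The forward-backward idea can be sketched as follows: at each candidate position, characters are compared from both ends of the pattern toward the middle, so a mismatch at either end is detected early. This is our reading of the name only; the paper's IFBMPM additionally uses indexing and multiple-pattern support that are not reproduced here.

```python
# Single-pattern forward-backward matching sketch: compare the pattern
# against the text window from both ends simultaneously, stopping at
# the first mismatch on either side.

def forward_backward_search(text, pattern):
    """Return starting indices of all occurrences of pattern in text."""
    matches = []
    m = len(pattern)
    for start in range(len(text) - m + 1):
        i, j = 0, m - 1
        while i <= j:
            if text[start + i] != pattern[i] or text[start + j] != pattern[j]:
                break                      # mismatch at either end: give up early
            i, j = i + 1, j - 1
        else:                              # loop finished: full match
            matches.append(start)
    return matches
```
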
Abstract: The Bloom filter is a probabilistic, memory-efficient data structure designed to answer rapidly whether an element is present in a set. It can report that an element is definitely not in the set, while a positive answer holds only with a certain probability. The trade-off in using a Bloom filter is a configurable risk of false positives; the odds of a false positive can be made very low if the number of hash functions is sufficiently large. For spam detection, a weight is attached to each set of elements: the spam weight of a word is a measure used to rate the e-mail, and each word is assigned to a Bloom filter based on its weight. The proposed work introduces an enhanced Bloom filter called the Bin Bloom Filter (BBF). The performance of the BBF relative to the conventional Bloom filter is evaluated under various optimization techniques. A real data set and synthetic data sets are used for the experimental analysis, and results are reported for bin sizes 4, 5, 6 and 7. Analysis of the results shows that the BBF with heuristic techniques performs better than the traditional Bloom filter in spam detection.
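The base structure the BBF builds on can be sketched as follows: k hash functions set k bits per element, membership tests may yield false positives but never false negatives. The seeded-SHA-256 hashing and the sizes are illustrative choices; the BBF's weight-based assignment of words to bins is described in the abstract and not reproduced here.

```python
import hashlib

# Minimal Bloom filter: add() sets num_hashes bit positions per item;
# might_contain() returns True only if all those positions are set.

class BloomFilter:
    def __init__(self, size=1024, num_hashes=4):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = [False] * size

    def _positions(self, item):
        # Derive num_hashes independent positions by seeding the hash input.
        for seed in range(self.num_hashes):
            digest = hashlib.sha256(f"{seed}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def might_contain(self, item):
        return all(self.bits[pos] for pos in self._positions(item))
```

With 1024 bits, 4 hash functions and only a few inserted words, the false positive probability is on the order of 10^-8, which is why "definitely not in the set" answers dominate.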
Abstract: This study was initiated with a three-pronged objective: to identify the relationship between Technological Competency factors (Technical Capability, Firm Innovativeness and E-Business Practices) and professional service firms' business performance; to investigate the predictors of professional service firms' business performance; and finally to evaluate the predictors of business performance according to the type of professional service firm. A survey questionnaire was deployed to collect empirical data, distributed to the owners of professional small and medium-sized service enterprises in the Accounting, Legal, Engineering and Architecture sectors. Analysis showed that all three Technological Competency factors have a moderate effect on business performance. In addition, the regression models indicate that technical capability is the most influential determinant of business performance, followed by e-business practices and firm innovativeness. The main predictor of business performance for all types of firms is technical capability.
Abstract: In this paper we discuss the influence of the route flexibility degree, the open rate of operations and the production type coefficient on the makespan. The flexible job-open shop scheduling problem FJOSP (an extension of the classical job shop scheduling problem) is analyzed. For the analysis of the production process we used a hybrid heuristic combining GRASP (greedy randomized adaptive search procedure) with a simulated annealing algorithm. Experiments with different levels of the factors have been conducted and compared. The GRASP+SA algorithm has been tested and illustrated with results for both the serial route and the parallel one.
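The simulated-annealing half of the hybrid can be illustrated on a toy problem. This is not the paper's FJOSP model: here jobs are merely assigned to two parallel machines and SA perturbs the assignment to reduce the makespan (the load of the busiest machine), which shows only the annealing mechanics (neighbour move, Metropolis acceptance, geometric cooling).

```python
import math
import random

# Toy SA for makespan minimization on two parallel machines.

def makespan(assignment, durations):
    """Load of the busiest machine under a job-to-machine assignment."""
    loads = [0.0, 0.0]
    for job, machine in enumerate(assignment):
        loads[machine] += durations[job]
    return max(loads)

def simulated_annealing(durations, steps=2000, t0=10.0, cooling=0.995, seed=0):
    rng = random.Random(seed)
    current = [rng.randrange(2) for _ in durations]
    best = list(current)
    t = t0
    for _ in range(steps):
        candidate = list(current)
        j = rng.randrange(len(durations))
        candidate[j] = 1 - candidate[j]    # neighbour: move one job across
        delta = makespan(candidate, durations) - makespan(current, durations)
        # Metropolis rule: always accept improvements, sometimes accept worse.
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current = candidate
            if makespan(current, durations) < makespan(best, durations):
                best = list(current)
        t *= cooling                       # geometric cooling schedule
    return best, makespan(best, durations)
```
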
Abstract: Group contribution methods such as UNIFAC are very useful to researchers and engineers involved in synthesis, feasibility studies, and the design and optimization of separation processes. They can be applied successfully to predict phase equilibria and excess properties in the development of chemical and separation processes. The main focus of this work was to investigate the possibility of absorbing selected volatile organic compounds (VOCs) into polydimethylsiloxane (PDMS) using three selected UNIFAC group contribution methods. Absorption followed by subsequent stripping is the predominant available abatement technology for VOCs in flue gases prior to their release into the atmosphere. The original, modified and effective UNIFAC models were used in this work. The thirteen VOCs considered in this research are: pentane, hexane, heptane, trimethylamine, toluene, xylene, cyclohexane, butyl acetate, diethyl acetate, chloroform, acetone, ethyl methyl ketone and isobutyl methyl ketone. The computation was done for a solute VOC concentration of 8.55×10⁻⁸, which is well within the infinite dilution region. The results obtained in this study compare very well with those published in the literature, obtained through both measurements and predictions. The phase equilibria obtained in this study show that PDMS is a good absorbent for the removal of VOCs from contaminated air streams through physical absorption.