Abstract: It is hard to perceive the interaction process with machines when visual information is not available. In this paper, we address this issue by providing interaction through visual techniques. Posture recognition is performed for American Sign Language to recognize static alphabets and numbers. 3D information is exploited to segment the hands and face using a normal Gaussian distribution and depth information. Features for posture recognition are computed from statistical and geometrical properties that are translation, rotation and scale invariant. Hu moments as statistical features, and circularity and rectangularity as geometrical features, are incorporated to build the feature vectors. These feature vectors are used to train an SVM classifier that recognizes the static alphabets and numbers. For the alphabets, curvature analysis is carried out to reduce misclassifications. The experimental results show that the proposed system recognizes posture symbols with recognition rates of 98.65% and 98.6% for ASL alphabets and numbers, respectively.
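The geometrical features mentioned above reduce to two simple ratios; a minimal sketch (illustrative, not the authors' code), assuming the region's area, perimeter and bounding box have already been extracted from the segmented hand:

```python
import math

def circularity(area, perimeter):
    # Circularity = 4*pi*A / P^2; equals 1 for a perfect circle.
    return 4.0 * math.pi * area / (perimeter ** 2)

def rectangularity(area, bbox_w, bbox_h):
    # Rectangularity = region area / bounding-box area; 1 for a rectangle.
    return area / (bbox_w * bbox_h)

# Example: a 10x10 square region (area 100, perimeter 40)
print(circularity(100.0, 40.0))       # pi/4 ~ 0.785
print(rectangularity(100.0, 10, 10))  # 1.0
```

Both quantities are ratios of areas and lengths of matching dimension, which is what makes them translation, rotation and scale invariant.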
Abstract: The general global behavior of particles in a non-linear (Q - xy)² potential cannot be revealed by the Poincaré surface of section (PSS) method because most trajectories take a practically infinite time to integrate numerically before they return to the surface. In this study, as an alternative to PSS, a multiple-scale perturbation is applied to analyze the global adiabatic, non-adiabatic and chaotic behavior of particles in this potential. It was found that the results can be summarized in the form of a Fermi-like map. Additionally, this method gives the variation of the global stochasticity criteria with Q.
Abstract: Image watermarking has proven to be quite an efficient tool for copyright protection and authentication over the last few years. In this paper, a novel image watermarking technique in the wavelet domain is suggested and tested. To achieve more security and robustness, the proposed technique relies on two nested watermarks that are embedded into the image to be watermarked. A primary watermark in the form of a PN sequence is first embedded into an image (the secondary watermark) before being embedded into the host image. The technique is implemented using Daubechies mother wavelets, where an arbitrary embedding factor α is introduced to improve invisibility and robustness. The proposed technique has been applied to several grayscale images, where a PSNR of about 60 dB was achieved.
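The additive use of the embedding factor α can be illustrated with a minimal sketch; this is not the paper's implementation: a plain list stands in for the Daubechies wavelet subband coefficients, and non-blind detection (original coefficients available) is assumed:

```python
ALPHA = 0.02  # embedding factor alpha (illustrative value)

def embed(coeffs, watermark, alpha=ALPHA):
    # Additively embed a bipolar PN sequence into (stand-in)
    # wavelet coefficients: c' = c + alpha * w
    return [c + alpha * w for c, w in zip(coeffs, watermark)]

def detect(marked, original, alpha=ALPHA):
    # Non-blind detection: recover the PN sequence from the difference.
    return [round((m - c) / alpha) for m, c in zip(marked, original)]

coeffs = [12.5, -3.1, 7.8, 0.4]
pn = [1, -1, -1, 1]  # primary PN-sequence watermark
marked = embed(coeffs, pn)
print(detect(marked, coeffs))  # [1, -1, -1, 1]
```

A larger α strengthens detection at the cost of invisibility (lower PSNR), which is the trade-off the abstract tunes.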
Abstract: Model Predictive Control (MPC) is an established control technique in a wide range of process industries. The reason for this success is its ability to handle multivariable systems and systems having input, output or state constraints. Nevertheless, compared to the PID controller, the implementation of MPC in miniaturized devices such as Field Programmable Gate Arrays (FPGAs) and microcontrollers has historically been very small scale because of its implementation complexity and computation time requirements. At the same time, such embedded technologies have become an enabler for future manufacturing enterprises as well as a transformer of organizations and markets. In this work, we take advantage of these recent advances in the deployment of one of the most studied and applied control techniques in industrial engineering. In this paper, we propose an efficient firmware for the implementation of constrained MPC on the STM32 microcontroller using the interior point method. A performance study shows good execution speed and low computational burden. These results encourage the development of predictive control algorithms to be programmed into standard industrial processes. A PID anti-windup controller was also implemented on the STM32 in order to make a performance comparison with the MPC. The main features of the proposed constrained MPC framework are illustrated through two examples.
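The PID anti-windup baseline mentioned above can be sketched as follows; a hedged illustration with arbitrary gains and saturation limits, assuming the clamping (conditional-integration) variant, not the firmware actually deployed on the STM32:

```python
def pid_antiwindup_step(e, state, kp=1.0, ki=0.5, kd=0.0,
                        dt=0.01, u_min=-1.0, u_max=1.0):
    # One step of a PID controller with clamping anti-windup:
    # the integrator is frozen while the actuator output is saturated.
    integ, e_prev = state
    u_raw = kp * e + ki * integ + kd * (e - e_prev) / dt
    u = min(max(u_raw, u_min), u_max)
    if u == u_raw:        # not saturated: integrate normally
        integ += e * dt
    return u, (integ, e)

state = (0.0, 0.0)
u, state = pid_antiwindup_step(10.0, state)  # large error saturates output
print(u)  # 1.0 (clamped at u_max, integrator held)
```

Unlike MPC, this handles the input constraint only reactively; the constrained MPC formulation enforces it inside the optimization itself.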
Abstract: Falling has been one of the major concerns and threats to the independence of the elderly in their daily lives. With the significant worldwide growth of the aging population, it is essential to have a promising fall detection solution that operates at high accuracy in real time and supports large-scale implementation using multiple cameras. The Field Programmable Gate Array (FPGA) is a highly promising tool for use as a hardware accelerator in many emerging embedded vision-based systems. Thus, the main objective of this paper is to present an FPGA-based solution for vision-based fall detection that meets stringent real-time requirements with high accuracy. A hardware architecture for vision-based fall detection which exploits pixel locality to reduce memory accesses is proposed. By exploiting the parallel and pipelined architecture of the FPGA, our hardware implementation of vision-based fall detection is able to achieve a performance of 60 fps for a series of video analytics functions at VGA resolution (640×480). The results of this work show that FPGAs have great potential and impact in enabling large-scale vision systems in the future healthcare industry due to their flexibility and scalability.
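The pixel-locality idea behind the memory-access reduction is the classic line-buffer idiom; a software sketch for a 3×3 window (illustrative only, not the paper's HDL architecture):

```python
from collections import deque

def sliding_3x3(rows):
    # Stream rows through a 3-row buffer so each pixel is fetched from
    # external memory only once (the FPGA line-buffer idiom); 3x3
    # neighborhoods are then served entirely from on-chip storage.
    buf = deque(maxlen=3)
    for row in rows:
        buf.append(row)
        if len(buf) == 3:
            for x in range(1, len(row) - 1):
                yield [r[x - 1:x + 2] for r in buf]

frame = [[0, 1, 2, 3],
         [4, 5, 6, 7],
         [8, 9, 10, 11]]
wins = list(sliding_3x3(frame))
print(len(wins))  # 2 windows for a 3x4 frame
```

In hardware the same structure becomes two block-RAM line delays feeding a 3×3 register window, which is what enables the pipelined per-pixel throughput.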
Abstract: A review of the literature found that domestic violence and child maltreatment co-occur in many families. This study emphasizes the factors relating to intra-family relationships (from an order point of view) in violence against children. For this purpose, a survey was carried out on a sample of 200 students of governmental guidance schools in the city of Gilanegharb, Iran. Violence against children (VAC) was measured with the CTS scale. The results showed that the children had experienced violence more than once during the last year, and that the degree of order in the family is high. The explanatory results indicated that family order variables, including collective thinking, empathy and communal co-circumstance, have significant effects on VAC.
Abstract: Although face detection is not a recent activity in the field of image processing, it is still an open area for research. The greatest step in this field is the work reported by Viola, and its recent analogue is that of Huang et al. Both use similar features and a similar training process. The former detects only upright faces, but the latter can detect multi-view faces in still grayscale images using new features called 'sparse features'. Finding these features with the proposed methods is very time consuming and inefficient. Here, we propose a new approach for finding sparse features using a genetic algorithm. This method requires less computational cost and yields more effective features in the learning process for face detection, resulting in higher accuracy.
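The genetic search described above can be sketched minimally; the fitness function here is a toy stand-in (bit count) for the actual sparse-feature quality score, and all parameters are illustrative, not those of the paper:

```python
import random

random.seed(0)

def evolve(fitness, n_bits=16, pop_size=20, generations=40, p_mut=0.05):
    # Minimal generational GA: tournament selection, one-point
    # crossover, bit-flip mutation. A bit string stands in for a
    # sparse-feature encoding.
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = random.randrange(1, n_bits)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (random.random() < p_mut) for b in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve(fitness=sum)   # toy objective: maximize set bits
print(sum(best))
```

In the actual application, fitness would score a candidate sparse feature by its discriminative power on training faces, replacing the exhaustive feature search.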
Abstract: One part of an employee's total reward, apart from basic wages or salary, employee benefits and intangible remuneration, is the so-called contingent (variable) pay. Contingent pay is linked to the performance, contribution, competency or skills of individual employees, to team or company-wide performance, or to a combination of several of these. Sometimes remuneration based on length of employment is also counted as contingent pay, when the financial reward is connected not to performance or skills but to the length of continuous employment, either in one working position or at one level of the remuneration scale. The main aim of this article is to define contingent pay based on the available information, describe its individual forms, advantages, disadvantages and possibilities for use in practice, and also to report not only the extent and level of utilization of contingent pay by companies in one of the Czech Republic's regions, but also their practical experience with this type of remuneration.
Abstract: The optimization problem using time scales is studied. A time scale is a model of time, and the language of time scales seems to be an ideal tool to unify the continuous-time and discrete-time theories. In this work we present necessary conditions for a solution of an optimization problem on time scales. To obtain this result we use properties and results of the partial diamond-alpha derivatives for continuous multivariable functions; these results are also presented here.
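For context, the diamond-alpha derivative referred to above is conventionally defined on a time scale as a convex combination of the delta (forward) and nabla (backward) derivatives:

```latex
f^{\diamond_\alpha}(t) = \alpha\, f^{\Delta}(t) + (1-\alpha)\, f^{\nabla}(t),
\qquad 0 \le \alpha \le 1,
```

recovering the delta derivative for α = 1 and the nabla derivative for α = 0; on the real line both reduce to the ordinary derivative, and on the integers to the forward and backward differences.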
Abstract: Arvia®, a spin-out company of the University of Manchester, UK, is commercialising a water treatment technology for the removal of low concentrations of organics from water. This technology is based on the adsorption of organics onto graphite-based adsorbents coupled with their electrochemical regeneration in a simple electrochemical cell. In this paper, the potential of the process to adsorb microorganisms present in water and disinfect them electrochemically has been demonstrated. Bench-scale experiments have indicated that the process of adsorption using graphite adsorbents with electrochemical regeneration can be used effectively for water disinfection. The most likely mechanisms of disinfection of water through this process are direct electrochemical oxidation and electrochemical chlorination.
Abstract: Nowadays, there is little information concerning heat shield systems, and this information is not completely reliable for use in many cases; for example, precise calculations cannot be done for various materials. In addition, real-scale tests have two disadvantages, high cost and low flexibility, and a new test must be performed for each case. Hence, a numerical modeling program that calculates the surface recession rate and interior temperature distribution is necessary. A numerical solution of the governing equation for non-charring material ablation is presented in order to predict the recession rate and the heat response of non-charring heat shields. The governing equation is nonlinear, and the Newton-Raphson method along with the TDMA algorithm is used to solve this nonlinear equation system. Using the Newton-Raphson method is one of the advantages of the solution approach because it is simple and can easily be generalized to more difficult problems. The obtained results are compared with reliable sources in order to examine the accuracy of the compiled code.
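The TDMA (Thomas algorithm) that performs the inner linear solve of each Newton-Raphson iteration is standard and can be sketched as follows; this is a generic tridiagonal solver, not the authors' ablation code:

```python
def tdma(a, b, c, d):
    # Thomas algorithm for a tridiagonal system A x = d, where
    # a: sub-diagonal (a[0] unused), b: main diagonal,
    # c: super-diagonal (c[-1] unused), d: right-hand side.
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 2x2 check: [[2, 1], [1, 3]] x = [3, 4]  ->  x = [1, 1]
print(tdma([0.0, 1.0], [2.0, 3.0], [1.0, 0.0], [3.0, 4.0]))
```

Because discretizing a 1-D heat equation yields a tridiagonal Jacobian, each Newton-Raphson step costs only O(n) with this solver.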
Abstract: Employees commonly encounter unpredictable and
unavoidable work related stressors. Exposure to such stressors can
evoke negative appraisals and associated adverse mental, physical,
and behavioral responses. Because Acceptance and Commitment
Therapy (ACT) emphasizes acceptance of unavoidable stressors and
defusion from negative appraisals, it may be particularly beneficial
for work stress. Forty-five workers were randomly assigned to an
ACT intervention for work stress (n = 21) or a waitlist control group
(n = 24). The intervention consisted of two 3-hour sessions spaced
one week apart. An examination of group process and outcomes was
conducted using the Revised Sessions Rating Scale. Results indicated
that the ACT participants reported that they perceived the
intervention to be supportive, task focused, and without adverse
therapist behaviors (e.g., feelings of being criticized or discounted).
Additionally, the second session (values clarification and
commitment to action) was perceived to be more supportive and task
focused than the first session (mindfulness, defusion). Process ratings
were correlated with outcomes. Results indicated that perceptions of
therapy supportiveness and task focus were associated with reduced
psychological distress and improved perceived physical health.
Abstract: Data are available in abundance in any business organization, including records for finance, maintenance, inventory, progress reports, etc. As time progresses, the data keep accumulating, and the challenge is to extract information from this data bank. Knowledge discovery from these large and complex databases is a key problem of this era. Data mining and machine learning techniques are needed that can scale to the size of the problems and can be customized to the business application. To develop accurate and relevant information for a particular problem, business analysts need to develop multidimensional models that provide reliable information on which the right decisions can be taken. If the multidimensional model does not possess advanced features, the required accuracy cannot be expected. The present work involves the development of a multidimensional data model incorporating advanced features. The criterion of computation is based on data precision and the inclusion of a slowly changing time dimension. The final results are displayed in graphical form.
Abstract: Power consumption is increasing rapidly in data centers because the number of data centers is growing and their scale is becoming larger. Reducing power consumption in data centers is therefore a key research topic. The peak power of a typical server is around 250 watts, and when a server is idle it continues to use around 60% of its in-use power, though vendors are putting effort into reducing this "idle" power load. Servers tend to operate at only around a 5% to 20% utilization rate, partly because of response-time concerns, and an average of 10% of servers in data centers is unused. For these reasons, we propose a dynamic power management system to reduce power consumption in a green data center. Experimental results show that about 55% of power consumption is saved at idle time.
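The scale of the possible saving follows from a simple linear power model built on the figures quoted above (250 W peak, ~60% idle draw, low utilization); the consolidation policy below is an illustrative sketch, not the proposed management system:

```python
P_PEAK = 250.0           # watts, typical server peak (from the abstract)
P_IDLE = 0.6 * P_PEAK    # idle draw ~60% of in-use power

def power(util):
    # Simple linear server power model between idle and peak draw.
    return P_IDLE + (P_PEAK - P_IDLE) * util

def cluster_power(utils, consolidate=False):
    # With consolidation, pack the total load onto as few servers as
    # possible and switch the remaining servers off (0 W).
    if not consolidate:
        return sum(power(u) for u in utils)
    total = sum(utils)
    full = int(total)            # servers running at 100%
    rem = total - full           # one partially loaded server
    p = full * power(1.0)
    if rem > 1e-9:
        p += power(rem)
    return p

utils = [0.10] * 10              # ten servers at 10% utilization
before = cluster_power(utils)
after = cluster_power(utils, consolidate=True)
print(round(100 * (1 - after / before)), "% saved")
```

Because idle draw dominates at low utilization, most of the saving comes simply from powering off the idle machines, which is consistent with the ~55% idle-time reduction reported.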
Abstract: Panoramic view generation has always offered novel and distinct challenges in the field of image processing. Panoramic view generation is the construction of a larger mosaic image of the desired view from a set of partial images. This paper presents a solution to one of the problems of image seascape formation, where some of the partial images are color and others are grayscale. The simplest solution would be to convert all image parts into grayscale images and fuse them to get a grayscale panorama. But in a multihued world, obtaining the colored seascape will always be preferred. This can be achieved by picking colors from the color parts and injecting them into the grayscale parts of the seascape. So first the grayscale image parts are colored with the help of the color image parts, and then these parts are fused to construct the seascape image.
The problem of coloring grayscale images has no exact solution. In the proposed technique of panoramic view generation, the job of transferring color traits from the reference color image to the grayscale image is done by a palette-based method. In this technique, the color palette is prepared using pixel windows of a chosen size taken from the color image parts. The grayscale image part is then divided into pixel windows of the same size. For every window of the grayscale image part the palette is searched and equivalent color values are found, which are used to color the grayscale window. For palette preparation we have used the RGB color space and Kekre's LUV color space; Kekre's LUV color space gives better coloring quality. The search time through the color palette is improved over exhaustive search using Kekre's fast search technique.
After coloring the grayscale image pieces, the next job is the fusion of all these pieces to obtain the panoramic view. For similarity estimation between partial images, the correlation coefficient is used.
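The similarity estimation step can be sketched directly as a plain Pearson correlation over flattened overlap regions (illustrative; it assumes the candidate overlap pixels have already been extracted):

```python
import math

def correlation(x, y):
    # Pearson correlation coefficient between two overlap regions
    # (flattened pixel lists), used to rank candidate alignments.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

a = [10, 20, 30, 40]
b = [12, 22, 32, 42]   # same pattern, offset brightness
print(round(correlation(a, b), 3))  # 1.0
```

Because the mean is subtracted out, a constant brightness offset between the pieces does not hurt the match score, which matters when fusing recolored and original parts.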
Abstract: This study was conducted to evaluate the quality characteristics of cookies produced from sweet potato-fermented soybean flour. Cookies were subjected to proximate and sensory analysis to determine the acceptability of the product. Protein, fat and ash contents increased as the proportion of soybean flour increased, ranging from 13.8-21.7%, 1.22-5.25% and 2.20-2.57%, respectively. The crude fibre content was within the range of 3.08-4.83%. The moisture content of the cookies decreased from 3.42% to 2.13% with increasing soybean flour: cookies produced from whole sweet potato flour had the highest moisture content, 3.42%, while the 30% substitution had the lowest, 2.13%. A nine-point hedonic scale was used to evaluate the organoleptic characteristics of the cookies. The sensory analysis indicated that there was no significant difference between the cookies produced, even when compared to the control 100% sweet potato cookies. The 20% soybean flour substitute was ranked highest for overall acceptance.
Abstract: Mel Frequency Cepstral Coefficient (MFCC) features
are widely used as acoustic features for speech recognition as well
as speaker recognition. In MFCC feature representation, the Mel frequency
scale is used to get a high resolution in low frequency region,
and a low resolution in high frequency region. This kind of processing
is good for obtaining stable phonetic information, but not suitable
for speaker features that are located in high frequency regions. The
speaker individual information, which is non-uniformly distributed
in the high frequencies, is equally important for speaker recognition.
Based on this fact, we propose an admissible wavelet packet based
filter structure for speaker identification. Multiresolution capabilities
of wavelet packet transform are used to derive the new features.
The proposed scheme differs from previous wavelet based works,
mainly in designing the filter structure. Unlike others, the proposed
filter structure does not follow the Mel scale. The closed-set speaker identification experiments performed on the TIMIT database show improved identification performance compared to other commonly used Mel-scale-based filter structures using wavelets.
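The resolution argument can be made concrete with the standard Mel-scale mapping (the common HTK-style formula, assumed here rather than taken from the paper):

```python
import math

def hz_to_mel(f):
    # Standard Mel-scale mapping used by typical MFCC front ends.
    return 2595.0 * math.log10(1.0 + f / 700.0)

# Equal 1 kHz steps in Hz shrink on the Mel axis at high frequencies,
# i.e. a Mel filter bank is coarse exactly where speaker cues live.
lo = hz_to_mel(1000.0) - hz_to_mel(0.0)
hi = hz_to_mel(8000.0) - hz_to_mel(7000.0)
print(lo > hi)  # True: finer resolution at low frequencies
```

A wavelet packet tree, by contrast, can split any subband further, letting the filter structure allocate resolution to the high-frequency regions as well.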
Abstract: The intermittent aeration process can be easily applied to an existing activated sludge system and is highly robust to loading changes; it can also be operated in a relatively simple way. Since the moving-bed biofilm reactor method processes pollutants by attaching and securing the microorganisms on the media, the process efficiency can be higher than in the suspended-growth biological treatment process, and the return of sludge can be reduced. In this study, the existing intermittent aeration process with alternating flow, as applied to the oxidation ditch, is applied to a continuous-flow stirred tank reactor so as to combine the advantages of both processes, and we aim to develop the process to significantly reduce the return of sludge in the clarifier and to secure reliable treated-water quality by adding moving media. The corresponding process has an appropriate form as an infrastructure based on the u-environment in the future u-City and is expected to accelerate the implementation of the u-Eco city in conjunction with city-based services. The laboratory-scale system was operated at an HRT of 8 hours, excluding the final clarifier, and showed removal efficiencies of 97.7%, 73.1% and 9.4% for organic matter, TN and TP, respectively, with a 4-hour operating cycle at a system SRT of 10 days. After adding the media, the removal efficiency of phosphorus remained at a similar level to that before the addition, but the removal efficiency of nitrogen improved by 7-10%. In addition, the solids, which were maintained at MLSS 1200-1400 with 25% media packing, all attached onto the media, so that no sludge entered the clarifier. Therefore, the return of sludge is no longer needed.
Abstract: This paper reviews various approaches that have been used for the modeling and simulation of large-scale engineering systems and determines their appropriateness for the development of a RICS modeling and simulation tool. Bond graphs, linear graphs, block diagrams, differential and difference equations, modeling languages, cellular automata and agents are reviewed. This tool should be based on linear graph representation and should support symbolic programming, functional programming, the development of non-causal models and the incorporation of decentralized approaches.
Abstract: Years of extensive research in the field of speech
processing for compression and recognition in the last five decades,
resulted in a severe competition among the various methods and
paradigms introduced. In this paper we include the different representations
of speech in the time-frequency and time-scale domains
for the purpose of compression and recognition. The examination of
these representations in a variety of related work is accomplished.
In particular, we emphasize methods related to Fourier analysis
paradigms and wavelet based ones along with the advantages and
disadvantages of both approaches.