Abstract: A new approach to facial expression recognition based on face context and adaptively weighted sub-pattern PCA (Aw-SpPCA) is presented in this paper. The facial region and other parts of the body are segmented from the complex environment using a skin-color model. An algorithm is proposed for accurate detection of the face region in the segmented image based on the constant ratio of face height to width (δ = 1.618). The paper also introduces a new concept for detecting the eye and mouth positions. The desired part of the face is cropped to analyze the expression of a person. Unlike PCA, which operates on a whole image pattern, Aw-SpPCA operates directly on sub-patterns partitioned from the original whole pattern and extracts features from them separately. Aw-SpPCA can adaptively compute the contribution of each part to the classification task in order to enhance robustness to both expression and illumination variations. Experiments on a standard face database with five types of facial expressions show that the proposed method is competitive.
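The height-to-width test can be sketched as follows (a hypothetical illustration: the bounding-box representation and the pick-the-closest-ratio rule are assumptions, since the abstract only states the constant ratio δ = 1.618):

```python
# Hypothetical sketch: given candidate bounding boxes from skin-colour
# segmentation, keep the one whose height/width ratio is closest to the
# golden ratio delta = 1.618 stated in the abstract's detection rule.
DELTA = 1.618  # assumed height / width of a typical face region

def face_likeness(box):
    """Score a (x, y, w, h) box by closeness of h/w to the golden ratio."""
    x, y, w, h = box
    return abs(h / w - DELTA)

def pick_face(boxes):
    """Return the candidate region that best matches the face ratio."""
    return min(boxes, key=face_likeness)

candidates = [(0, 0, 50, 50), (10, 5, 40, 65), (3, 2, 30, 90)]
print(pick_face(candidates))  # the 40x65 box: 65/40 = 1.625
```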
Abstract: Intensive changes in the environment and strong market
competition have raised the management of information and knowledge
to the strategic level of companies. In a knowledge-based economy,
only those organizations that possess up-to-date, specialized
knowledge, and are able to exploit and develop it, can survive.
Companies have to know what knowledge they have by taking a
survey of organizational knowledge, and they have to fix actual and
additional knowledge in organizational memory. The question is how
to identify, acquire, fix and use knowledge effectively. The paper
shows that, over and above the information technology tools
supporting the acquisition, storage and use of information and
organizational learning, as well as the knowledge arising from them,
the fixing and storage of knowledge in the memory of a company
play an important role in the intelligence of organizations and the
competitiveness of a company.
Abstract: In a world of climate change and limited fossil fuel resources, renewable energy sources are playing an increasingly important role. Due to industrialization and population growth, our economy and technologies today largely depend upon natural resources, which are not replaceable. Approximately 90% of our energy consumption comes from fossil fuels (viz. coal, oil and natural gas), and the irony is that these resources are depleting. Moreover, the huge consumption of fossil fuels has caused visible damage to the environment in various forms, viz. global warming, acid rain, etc.
Abstract: Recently, electric vehicles have become popular as an
alternative to conventional fossil fuel vehicles. A conventional
Internal Combustion Engine (ICE) vehicle uses fossil fuel, which
contributes a major part of the overall carbon emissions in the
environment. Carbon and other greenhouse gas emissions are
responsible for global warming and the resulting climate change. It
therefore becomes vital to evaluate vehicle performance based on
emissions. In this paper an effort has been made to depict the
emissions caused by vehicles, taking the Australian scenario into
account. An effort has been made to compare fossil-fuel-based
vehicles with electric vehicles in phases. The study also evaluates
advancements in electric vehicle technology, the infrastructure
required for sustainability, and the future scope of developments.
This paper also includes an evaluation of the electric vehicle concept
for pollution control and sustainable transport systems in the future.
This study can serve as a benchmark for the development of electric
vehicles as a low-carbon-emission alternative for the cities of
tomorrow.
Abstract: Fluids are used for heat transfer in many engineering
devices. Water, ethylene glycol and propylene glycol are some
of the common heat transfer fluids. Over the years, in an attempt to
reduce the size of equipment and/or improve the efficiency of the
process, various techniques have been employed to improve the heat
transfer rate of these fluids. Surface modification, the use of inserts
and increased fluid velocity are some examples of heat transfer
enhancement techniques. The addition of milli- or micro-sized
particles to the heat transfer fluid is another way of improving the
heat transfer rate. Though this looks simple, the method has practical
problems such as high pressure loss, clogging and erosion of the
material of construction. These problems can be overcome by using
nanofluids, which are dispersions of nanosized particles in a base
fluid. Nanoparticles increase the thermal conductivity of the base
fluid manifold, which in turn increases the heat transfer rate. In this
work, the heat transfer enhancement obtained using an aluminium
oxide nanofluid has been studied by computational fluid dynamic
modeling of the nanofluid flow, adopting the single-phase approach.
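To illustrate how nanoparticles raise the base fluid's thermal conductivity, the classical Maxwell model for dilute suspensions can be used (a standard correlation often paired with the single-phase approach; the abstract does not name the specific model, so this is an assumed stand-in, with illustrative property values):

```python
def maxwell_k(k_f, k_p, phi):
    """Maxwell model for the effective thermal conductivity of a dilute
    particle suspension: k_eff = k_f * (k_p + 2k_f + 2*phi*(k_p - k_f))
                                      / (k_p + 2k_f -   phi*(k_p - k_f)).
    A standard correlation; the abstract does not state the model used."""
    num = k_p + 2 * k_f + 2 * phi * (k_p - k_f)
    den = k_p + 2 * k_f - phi * (k_p - k_f)
    return k_f * num / den

# Water base fluid with alumina nanoparticles at 4 vol% (illustrative values)
k_water, k_alumina = 0.613, 40.0   # W/(m K)
print(round(maxwell_k(k_water, k_alumina, 0.04) / k_water, 3))  # 1.119
```

At 4 vol% alumina in water the model predicts roughly a 12% conductivity enhancement, which is the kind of effect the CFD study above resolves.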
Abstract: Tea is consumed by a large part of the world's
population. It has an enormous importance for Turkish culture: it is
brewed nearly every morning and evening in almost all households,
and it is often consumed with a lemon wedge. The habitual drinking
of tea infusions may contribute significantly to the daily dietary
requirements of trace elements.
Different instrumental techniques are used for the determination of
these elements, but atomic and mass spectroscopic methods are
preferred most. In this study, the chromium, iron and selenium
contents of black and green tea after hot-water brewing were
determined by Inductively Coupled Plasma Optical Emission
Spectroscopy (ICP-OES). Furthermore, the effect of lemon addition
on the chromium, iron and selenium concentrations of the tea
infusions was investigated.
The results of the investigation showed that the concentrations of
chromium, iron and selenium increased in black tea with lemon
addition. In green tea, on the other hand, only the selenium
concentration increased with lemon addition. Iron was not detected
in green tea, but its concentration was determined as 1.420 ppm after
lemon addition.
Abstract: Methods to detect and localize time singularities of polynomial and quasi-polynomial ordinary differential equations are systematically presented and developed. They are applied to examples taken from different fields of application and are also compared to better-known methods such as those based on the existence of linear first integrals or Lyapunov functions.
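A minimal worked example of the kind of time singularity these methods target (an illustration chosen here, not one of the paper's examples) is the scalar polynomial ODE with quadratic right-hand side:

```latex
\dot{y} = y^{2}, \qquad y(0) = y_{0} > 0
\quad\Longrightarrow\quad
y(t) = \frac{y_{0}}{1 - y_{0}\, t},
```

which blows up at the movable singularity $t^{*} = 1/y_{0}$: the location of the singularity depends on the initial condition, which is why detection methods must localize it rather than read it off the equation.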
Abstract: Human activity is a major concern in a wide variety of
applications, such as video surveillance, human computer interface
and face image database management. Detecting and recognizing
faces is a crucial step in these applications. Furthermore, major
advancements and initiatives in security applications in the past years
have propelled face recognition technology into the spotlight. The
performance of existing face recognition systems declines significantly
if the resolution of the face image falls below a certain level.
This is especially critical in surveillance imagery where often, due to
many reasons, only low-resolution video of faces is available. If these
low-resolution images are passed to a face recognition system, the
performance is usually unacceptable. Hence, resolution plays a key
role in face recognition systems. In this paper we introduce a new
low resolution face recognition system based on mixture of expert
neural networks. In order to produce the low resolution input images
we down-sampled the 48 × 48 ORL images to 12 × 12 ones using
the nearest neighbor interpolation method; applying the bicubic
interpolation method afterwards yields enhanced images, which are
given to the Principal Component Analysis feature extractor.
Comparison with some of the most closely related methods indicates
that the proposed model yields an excellent recognition rate in low
resolution face recognition, namely a recognition rate of 100% for
the training set and 96.5% for the test set.
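The input-preparation pipeline described above can be sketched as follows (a minimal sketch: SciPy's spline zoom of order 0 and order 3 stands in for the nearest-neighbor and bicubic interpolation steps, and random arrays stand in for the ORL images):

```python
import numpy as np
from scipy.ndimage import zoom

def degrade_and_enhance(img48):
    """Down-sample a 48x48 face to 12x12 with nearest-neighbour
    interpolation, then up-sample back with a cubic spline (order=3,
    standing in for the paper's bicubic interpolation)."""
    low = zoom(img48, 12 / 48, order=0)   # nearest neighbour down-sampling
    return zoom(low, 48 / 12, order=3)    # cubic up-sampling

def pca_features(images, n_components):
    """Project flattened images onto their leading principal components."""
    X = np.stack([im.ravel() for im in images])
    X = X - X.mean(axis=0)                # centre the data
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T

faces = [degrade_and_enhance(np.random.rand(48, 48)) for _ in range(5)]
feats = pca_features(faces, n_components=3)
print(feats.shape)  # (5, 3)
```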
Abstract: The development of Internet technology in recent years has led to a more active role of users in creating Web content. This has significant effects both on individual learning and collaborative knowledge building. This paper will present an integrative framework model to describe and explain learning and knowledge building with shared digital artifacts on the basis of Luhmann's systems theory and Piaget's model of equilibration. In this model, knowledge progress is based on cognitive conflicts resulting from incongruities between an individual's prior knowledge and the information which is contained in a digital artifact. Empirical support for the model will be provided by 1) applying it descriptively to texts from Wikipedia, 2) examining knowledge-building processes using a social network analysis, and 3) presenting a survey of a series of experimental laboratory studies.
Abstract: How to coordinate the behaviors of the agents through
learning is a challenging problem within multi-agent domains.
Because of its complexity, recent work has focused on how
coordinated strategies can be learned. Here we are interested in using
reinforcement learning techniques to learn the coordinated actions of a
group of agents, without requiring explicit communication among
them. However, traditional reinforcement learning methods are based
on the assumption that the environment can be modeled as a Markov
Decision Process, which usually cannot be satisfied when multiple
agents coexist in the same environment. Moreover, to effectively
coordinate each agent's behavior so as to achieve the goal, it is
necessary to augment the state of each agent with information about
the other existing agents. However, as the number of agents in a
multiagent environment increases, the state space of each agent grows
exponentially, which causes a combinatorial explosion problem.
Profit sharing is one of the reinforcement learning methods that allow
agents to learn effective behaviors from their experiences even within
non-Markovian environments. In this paper, to remedy the drawback
of the original profit sharing approach, which needs much memory to
store each state-action pair during the learning process, we first
present an on-line rational profit sharing algorithm. Then, we
integrate the advantages of a modular learning architecture with the
on-line rational profit sharing algorithm, and propose a new modular
reinforcement learning model. The effectiveness of the technique is
demonstrated using the pursuit problem.
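The profit-sharing credit assignment that underlies the approach can be sketched as follows (a minimal sketch: the geometric credit function and the weight table are illustrative assumptions, and unlike the paper's on-line variant this version stores the whole episode):

```python
from collections import defaultdict

def profit_sharing_update(weights, episode, reward, decay=0.5):
    """Profit-sharing credit assignment: after an episode ends with
    `reward`, reinforce every (state, action) pair on the trajectory
    with geometrically decaying credit, later pairs getting more.
    `decay` plays the role of the rationality condition's discount."""
    credit = reward
    for state, action in reversed(episode):
        weights[(state, action)] += credit
        credit *= decay

weights = defaultdict(float)
episode = [("s0", "right"), ("s1", "right"), ("s2", "up")]
profit_sharing_update(weights, episode, reward=1.0)
print(weights[("s2", "up")], weights[("s0", "right")])  # 1.0 0.25
```

Because credit comes only from the episode's outcome, no Markov assumption on the environment is needed, which is the property the abstract highlights.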
Abstract: Long Term Evolution (LTE) is a 4G wireless
broadband technology developed by the Third Generation
Partnership Project (3GPP) in Release 8, and it represents the
competitiveness of the Universal Mobile Telecommunications
System (UMTS) for the next 10 years and beyond. The concepts for
LTE systems were introduced in 3GPP Release 8, with the objective
of a high-data-rate, low-latency and packet-optimized radio access
technology. In this paper, the performance of different TCP variants
over an LTE network is investigated. The performance of TCP over
LTE is affected mostly by the links of the wired network and the
total bandwidth available at the serving base station. This paper
describes an NS-2 based simulation analysis of TCP-Vegas,
TCP-Tahoe, TCP-Reno, TCP-Newreno, TCP-SACK, and TCP-FACK,
with full modeling of all traffic in the LTE system. The evaluation of
the network performance with all TCP variants is mainly based on
throughput, average delay and packet loss. The analysis of TCP
performance over LTE shows that all TCP variants achieve a similar
throughput, with TCP-Vegas performing best among the variants.
Abstract: Since the 1980s, banks and financial service institutions have been engaged in an endless race of innovation to cope with advancing technology, fierce competition, and ever more sophisticated and demanding customers. In order to guide their innovation efforts, several studies were conducted to identify the success and failure factors of new financial services. These mainly included organizational factors, marketplace factors and new service development process factors. Almost all of them emphasized the importance of customer and market orientation as a response to the highly perceptual and intangible characteristics of financial services. However, they de-emphasized the critical characteristics of high risk involvement and close correlation with economic conditions, a factor that heavily contributed to the global financial crisis of 2008. This paper reviews the success and failure factors of new financial services. It then adds new perspectives emerging from an analysis of the role of innovation in the global financial crisis.
Abstract: Higher-order statistics (HOS), namely cumulants,
cross moments and their frequency-domain counterparts, known as
polyspectra, have emerged as a powerful signal processing tool for
the synthesis and analysis of signals and systems. Algorithms
used for the computation of cross moments are computationally
intensive and require high computational speed for real-time
applications. For efficiency and high speed, it is often advantageous
to realize computation intensive algorithms in hardware. A promising
solution that combines high flexibility together with the speed of a
traditional hardware is Field Programmable Gate Array (FPGA). In
this paper, we present FPGA-based parallel architecture for the
computation of third-order cross moments. The proposed design is
coded in Very High Speed Integrated Circuit (VHSIC) Hardware
Description Language (VHDL) and functionally verified by
implementing it on Xilinx Spartan-3 XC3S2000FG900-4 FPGA.
Implementation results are presented, showing that the proposed
design can operate at a maximum frequency of 86.618 MHz.
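A software reference for the quantity the FPGA computes in parallel, the biased third-order moment estimate, might look like this (the normalization and lag handling are assumptions here; the abstract does not give the estimator's exact form):

```python
def third_order_moment(x, tau1, tau2):
    """Biased estimate of the third-order moment
    m3(tau1, tau2) = E[x(n) * x(n + tau1) * x(n + tau2)]
    for a zero-mean sequence x. Software reference for the hardware
    design, which evaluates the same lagged products in parallel."""
    n = len(x) - max(tau1, tau2, 0)        # number of valid samples
    total = sum(x[i] * x[i + tau1] * x[i + tau2] for i in range(n))
    return total / len(x)                   # biased normalization

x = [1.0, -1.0, 2.0, -2.0, 1.0]
print(third_order_moment(x, 1, 2))  # -0.4
```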
Abstract: Gene expression profiling is rapidly evolving into a
powerful technique for investigating tumor malignancies. The
researchers are overwhelmed with the microarray-based platforms
and methods that confer them the freedom to conduct large-scale
gene expression profiling measurements. Simultaneously,
investigations into cross-platform integration methods have started
gaining momentum due to their underlying potential to help
comprehend a myriad of broad biological issues in tumor diagnosis,
prognosis, and therapy. However, comparing results from different
platforms remains a challenging task, as various inherent technical
differences exist between the microarray platforms. In this paper, we
explain a simple ratio-transformation method which can provide
some common ground for the cDNA and Affymetrix platforms
toward cross-platform integration. The method is based on the
characteristic data attributes of the Affymetrix and cDNA platforms.
In this work, we considered seven childhood leukemia patients and
their gene expression levels on each platform. With a dataset of 822
differentially expressed genes from both platforms, we applied a
specific ratio treatment to the Affymetrix data, which subsequently
showed an improvement in its relationship with the cDNA data.
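One hypothetical form such a ratio treatment could take is sketched below (the gene-wise mean reference used here is an assumption; the abstract does not state the exact reference the authors divide by):

```python
def ratio_transform(intensities):
    """Hypothetical ratio-treatment sketch: convert one gene's Affymetrix
    intensities (one value per patient) into ratios against the gene's
    mean across patients, giving relative values more comparable to the
    cDNA platform's ratio-based measurements. The reference (gene-wise
    mean) is an assumption, not the paper's stated procedure."""
    mean = sum(intensities) / len(intensities)
    return [v / mean for v in intensities]

affy_gene = [200.0, 400.0, 600.0]   # one gene, three patients
print(ratio_transform(affy_gene))   # [0.5, 1.0, 1.5]
```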
Abstract: Herein, we report the different types of surface morphology arising from the interaction between the pure protein Insulin (INS) and a catanionic surfactant mixture of Sodium Dodecyl Sulfate (SDS) and Cetyl Trimethyl Ammonium Bromide (CTAB) at the air/water interface, obtained by the Langmuir-Blodgett (LB) technique. We characterized the aggregates in LB films by Scanning Electron Microscopy (SEM), Atomic Force Microscopy (AFM) and Fourier Transform Infrared spectroscopy (FTIR). We found that INS adsorption increased in the presence of the catanionic surfactant at the air/water interface. The presence of a small amount of surfactant induces two-stage growth kinetics due to pure protein adsorption and protein-catanionic surface micelle interaction. The protein remains in its native state in the presence of a small amount of the surfactant mixture. A smaller amount of the surfactant mixture with INS produces a surface-micelle-type structure, which may be considered for drug delivery systems. On the other hand, INS becomes unfolded and fibrillated in the presence of a larger amount of the surfactant mixture. In both cases, the protein was successfully immobilized on a glass substrate by the LB technique. These results may find applications in the fundamental science of the physical chemistry of surfactant systems, as well as in the preparation of drug-delivery systems.
Abstract: In this article, a modification of the fuzzy ART network algorithm, aiming at making it supervised, is carried out. It consists of searching for the comparison, training and vigilance parameters that give the minimum quadratic distances between the outputs of the training base and those obtained by the network. The same process is applied to determine the parameters of the fuzzy ARTMAP giving the most powerful network. The modification consists in making the fuzzy ARTMAP learn a base of examples not just once, as is usual, but as many times as its architecture keeps evolving or the objective error is not reached. In this way, we do not need to worry about the values to impose on the eight parameters of the network. To evaluate each of these three modified networks, a comparison of their performances is carried out. As an application, we carried out a classification of an image of the Bay of Algiers taken by SPOT XS. We use as evaluation criteria the training duration, the mean square error (MSE) in the control step and the rate of correct classification per class. The results of this study, presented as curves, tables and images, show that the modified fuzzy ARTMAP offers the best quality/computing-time compromise.
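For context, the standard fuzzy ART operations that the tuned parameters control, the comparison (choice) parameter alpha, the learning rate beta and the vigilance rho, can be sketched as follows (these are the textbook fuzzy ART equations, not code from the paper):

```python
def fuzzy_and(a, b):
    """Component-wise minimum: the fuzzy AND used throughout fuzzy ART."""
    return [min(x, y) for x, y in zip(a, b)]

def norm1(v):
    """L1 norm of a vector with components in [0, 1]."""
    return sum(v)

def choice(I, w, alpha):
    """Category choice function T_j = |I ^ w_j| / (alpha + |w_j|)."""
    return norm1(fuzzy_and(I, w)) / (alpha + norm1(w))

def passes_vigilance(I, w, rho):
    """Match criterion |I ^ w_j| / |I| >= rho (resonance test)."""
    return norm1(fuzzy_and(I, w)) / norm1(I) >= rho

def learn(I, w, beta):
    """Weight update w <- beta * (I ^ w) + (1 - beta) * w."""
    m = fuzzy_and(I, w)
    return [beta * mi + (1 - beta) * wi for mi, wi in zip(m, w)]

I, w = [0.2, 0.8], [0.5, 0.5]
print(passes_vigilance(I, w, rho=0.6), learn(I, w, beta=1.0))
```

The article's search is over exactly these scalars: it retunes them until the quadratic distance between the network outputs and the training base stops improving.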
Abstract: Efficient preprocessing is essential for the automatic
recognition of handwritten documents. In this paper, techniques for
segmenting words in handwritten Arabic text are presented. Firstly,
connected components (CCs) are extracted, and the distances among
different components are analyzed. The statistical distribution of
these distances is then obtained to determine an optimal threshold for
word segmentation. Meanwhile, an improved projection-based
method is also employed for baseline detection. The proposed
method has been successfully tested on the IFN/ENIT database,
consisting of 26459 Arabic words handwritten by 411 different
writers, and the results were promising, yielding more accurate
baseline detection and word segmentation for further recognition.
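The distance-threshold grouping step can be sketched as follows (a minimal sketch: the horizontal-span representation of connected components is an assumption, and the threshold, which the paper derives from the statistical gap distribution, is supplied directly here):

```python
def split_words(spans, threshold):
    """Group connected-component horizontal spans (xmin, xmax), sorted
    along the text line, into words: a gap wider than `threshold`
    separates two words. The paper derives `threshold` from the
    statistical distribution of inter-component distances; here it is
    passed in explicitly."""
    words, current = [], [spans[0]]
    for prev, cur in zip(spans, spans[1:]):
        if cur[0] - prev[1] > threshold:   # gap exceeds word threshold
            words.append(current)
            current = []
        current.append(cur)
    words.append(current)
    return words

spans = [(0, 10), (12, 20), (35, 50), (53, 60)]
print(len(split_words(spans, threshold=8)))  # 2
```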
Abstract: In recent years, IT convergence technology has been developed to produce creative solutions by combining robotics and sports science technology. Object detection and recognition have mainly been applied in the sports science field, by recognizing faces and tracking the human body. However, object detection and recognition using a vision sensor is a challenging task in the real world because of illumination. In this paper, object detection and recognition using a vision sensor applied to a sports simulator are introduced. Face recognition is performed to identify the user and to automatically update a person's athletic record. The human body is tracked to offer a more accurate way of riding the horse simulator. Combined image processing is applied to reduce the adverse effect of illumination, which causes low detection and recognition performance in real-world applications. Faces are recognized using a standard face graph, and the human body is tracked using a pose model composed of feature nodes generated from diverse face and pose images. Face recognition using Gabor wavelets and pose recognition using a pose graph are robust in real applications. We ran simulations using the ETRI database, which was constructed on the horse riding simulator.
Abstract: Perth will run out of available sustainable natural
water resources by 2015 if nothing is done to slow usage rates,
according to a Western Australian study [1]. Alternative water
technology options need to be considered for the long-term
guaranteed supply of water for agricultural, commercial, domestic
and industrial purposes. Seawater is an alternative source of water for
human consumption, because seawater can be desalinated and
supplied in large quantities at a very high quality.
While seawater desalination is a promising option, the technology
requires a large amount of energy which is typically generated from
fossil fuels. The combustion of fossil fuels emits greenhouse gases
(GHG) and is implicated in climate change. In addition to
environmental emissions from electricity generation for desalination,
greenhouse gases are emitted in the production of chemicals and
membranes for water treatment. Since Australia is a signatory to the
Kyoto Protocol, it is important to quantify greenhouse gas emissions
from desalinated water production.
A life cycle assessment (LCA) has been carried out to determine
the greenhouse gas emissions from the production of 1 gigalitre (GL)
of water from the new plant. In this LCA analysis, a new desalination
plant that will be installed in Bunbury, Western Australia, and known
as Southern Seawater Desalinization Plant (SSDP), was taken as a
case study. The system boundary of the LCA mainly consists of three
stages: seawater extraction, treatment and delivery. The analysis
found that the equivalent of 3,890 tonnes of CO2 could be emitted
from the production of 1 GL of desalinated water. This LCA analysis
has also identified that the reverse osmosis process would cause the
most significant greenhouse emissions, as a result of the electricity
used, if this electricity is generated from fossil fuels.
Abstract: This paper describes a method for modeling a shadow
play puppet using sophisticated computer graphics techniques
available in OpenGL, in order to allow interactive play in a real-time
environment as well as to produce realistic animation. A novel
real-time method is proposed for modeling the puppet and its
shadow image that allows interactive play of a virtual shadow play
using texture mapping and blending techniques. Special effects such
as lighting and blurring effects for the virtual shadow play
environment are also developed. Moreover, the use of geometric
transformations and hierarchical modeling facilitates interaction
among the different parts of the puppet during animation. Based on
the experiments and the survey that were carried out, the respondents
involved were very satisfied with the outcomes of these techniques.