Abstract: Studies in neuroscience suggest that both global and local feature information are crucial for the perception and recognition of faces. It is widely believed that local features are less sensitive to variations caused by illumination and expression. In this paper, we focus on designing and learning local features for face recognition. We design three types of local features: semi-global features, local patch features and tangent shape features. The semi-global feature is designed to take advantage of global-like information while avoiding suppressing the AdaBoost algorithm when boosting weak classifiers built from small local patches. The local patch feature is designed to select discriminative features automatically, and thus differs from traditional approaches, in which local patches are usually selected manually to cover the salient facial components. Shape features are also considered in this paper for frontal-view face recognition. These features are selected and combined under the framework of a boosting algorithm and a cascade structure. The experimental results demonstrate that the proposed approach outperforms the standard eigenface method and the Bayesian method. Moreover, the selected local features and the observations made in the experiments are informative for further research on local feature design for face recognition.
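The selection and combination of weak classifiers built from local features can be illustrated with a minimal discrete AdaBoost over decision stumps; the toy feature vectors and labels below are hypothetical stand-ins for the paper's local-feature responses, not its actual data.

```python
import math

def stump_predict(x, dim, thresh, polarity):
    # decision stump: a one-feature threshold rule, the weak classifier
    return polarity if x[dim] > thresh else -polarity

def best_stump(X, y, w):
    # exhaustively pick the stump with the lowest weighted error
    best = None
    for dim in range(len(X[0])):
        for thresh in sorted({x[dim] for x in X}):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if stump_predict(xi, dim, thresh, pol) != yi)
                if best is None or err < best[0]:
                    best = (err, dim, thresh, pol)
    return best

def adaboost(X, y, rounds=5):
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, dim, thresh, pol = best_stump(X, y, w)
        err = max(err, 1e-10)                    # avoid log of zero
        alpha = 0.5 * math.log((1 - err) / err)  # weight of this weak learner
        ensemble.append((alpha, dim, thresh, pol))
        # re-weight the samples: misclassified ones gain weight
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, dim, thresh, pol))
             for xi, yi, wi in zip(X, y, w)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * stump_predict(x, d, t, p) for a, d, t, p in ensemble)
    return 1 if score >= 0 else -1
```

A cascade, as in the paper, would chain several such boosted ensembles, each rejecting easy negatives early.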
Abstract: All practical real-time scheduling algorithms for multiprocessor systems present a trade-off between computational complexity and performance. In real-time systems, tasks have to be performed both correctly and on time. Finding a minimal schedule in multiprocessor systems under real-time constraints has been shown to be NP-hard. Although optimal algorithms exist for uni-processor systems, they fail when applied to multiprocessor systems. Practical scheduling algorithms for real-time systems do not have deterministic response times, yet deterministic timing behavior is an important parameter for system robustness analysis. The intrinsic uncertainty of dynamic real-time systems further increases the difficulty of the scheduling problem. To alleviate these difficulties, we propose a fuzzy scheduling approach to arrange real-time periodic and non-periodic tasks in multiprocessor systems. Static and dynamic optimal scheduling algorithms fail under non-critical overload. In contrast, our approach balances the task loads of the processors successfully while also considering starvation prevention and fairness, so that higher-priority tasks have a higher probability of running. A simulation was conducted to evaluate the performance of the proposed approach. Experimental results show that the proposed fuzzy scheduler creates feasible schedules for homogeneous and heterogeneous tasks. It also takes task priorities into account, which leads to higher system utilization and shorter deadline-miss times. According to the results, it performs very close to the optimal schedule of uni-processor systems.
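The idea of a fuzzy scheduler that gives more urgent tasks a higher probability of running can be sketched with a minimal Mamdani-style inference. The membership shapes, the two inputs (deadline slack in time units, processor load in [0, 1]) and the rule outputs are assumptions made purely for illustration, not the paper's actual rule base.

```python
def fuzzy_priority(deadline_slack, cpu_load):
    # fuzzify the inputs (assumed simple linear memberships)
    urgent = max(0.0, 1.0 - deadline_slack / 10.0)
    relaxed = min(1.0, deadline_slack / 10.0)
    busy = min(1.0, max(0.0, cpu_load))
    idle = 1.0 - busy
    # rule base: (firing strength, crisp output priority)
    rules = [(min(urgent, idle), 0.9),
             (min(urgent, busy), 0.7),
             (min(relaxed, idle), 0.4),
             (min(relaxed, busy), 0.1)]
    # defuzzify by the weighted average of the rule outputs
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

A dispatcher would then assign ready tasks to the least-loaded processor in descending order of this priority, which is where load balancing and fairness enter.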
Abstract: Advanced technologies that offer high-precision products through relatively easy, economical and rapid processes are needed to meet the high demand for ultra-precision micro parts. In our research, micromanufacturing based on soft lithography and nanopowder injection molding was investigated. An ultra-thick, high-aspect-ratio metal master pattern was successfully used to fabricate a polydimethylsiloxane (PDMS) micro mold. The process was followed by nanopowder injection molding (PIM) using a simple vacuum hot press. The 17-4PH nanopowder, with a particle diameter of 100 nm, was successfully injected to form a green microbearing sample with a thickness, microchannel width and aspect ratio of 700 μm, 60 μm and 12, respectively. Sintering was performed at 1200°C for 2 hours with a heating rate of 0.83°C/min. Since a low powder loading (45%) was applied to achieve green sample fabrication, ~15% shrinkage occurred at 86% relative density. Several improvements should still be made to produce high-accuracy, fully dense sintered parts.
Abstract: The most common result of the analysis of high-throughput data in molecular biology is a global list of genes, ranked according to a certain score. The score can be a measure of differential expression. Recent work proposed a new method for selecting a number of genes in a ranked gene list from microarray gene expression data such that this set forms the Optimally Functionally Enriched Network (OFTEN), formed by known physical interactions between genes or their products. Here we present the calculated relative connectivity of genes from the META-OFTEN network and a tentative biological interpretation of the most reproducible signal. The relative connectivity and betweenness values of genes from the META-OFTEN network were estimated.
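Relative connectivity of a node can be computed as its degree normalized by the maximum possible degree in the network. The sketch below uses a hypothetical toy edge list; the gene symbols are illustrative only and are not taken from the META-OFTEN network.

```python
def relative_connectivity(edges):
    # degree of each node divided by the maximum possible degree (n - 1);
    # assumes a simple undirected graph without duplicate edges
    nodes, deg = set(), {}
    for u, v in edges:
        nodes.update((u, v))
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    n = len(nodes)
    return {g: deg.get(g, 0) / (n - 1) for g in nodes}
```

Betweenness, the other measure mentioned, would additionally require shortest-path counting (e.g. Brandes' algorithm) and is omitted here.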
Abstract: Face recognition has always been a fascinating research area. It has drawn the attention of many researchers because of its various potential applications, such as security systems, entertainment and criminal identification. Many supervised and unsupervised learning techniques have been reported so far. Principal Component Analysis (PCA), Self-Organizing Maps (SOM) and Independent Component Analysis (ICA) are three of the unsupervised techniques proposed by different researchers for face recognition. This paper proposes the integration of two of these techniques, SOM and PCA, for dimensionality reduction and feature selection. Simulation results show that, although the individual techniques SOM and PCA give excellent performance on their own, their combination can also be utilized for face recognition. Experimental results also indicate that, for the given face database and the classifier used, SOM performs better than the other unsupervised learning techniques. A comparison of the two proposed SOM methodologies, local and global processing, shows the superiority of the latter, but at the cost of more computational time.
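The PCA stage of such a pipeline projects each face vector onto a few leading principal components. A minimal sketch using power iteration for the first component is shown below on toy 2-D data; real use would operate on vectorized face images.

```python
import random

def principal_component(X, iters=200):
    # center the data
    n, d = len(X), len(X[0])
    mean = [sum(col) / n for col in zip(*X)]
    Xc = [[row[j] - mean[j] for j in range(d)] for row in X]
    # power iteration on the covariance matrix, applied implicitly as X^T (X v)
    rng = random.Random(0)
    v = [rng.random() + 0.1 for _ in range(d)]
    for _ in range(iters):
        s = [sum(row[j] * v[j] for j in range(d)) for row in Xc]        # X v
        w = [sum(Xc[i][j] * s[i] for i in range(n)) for j in range(d)]  # X^T (X v)
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return mean, v

def project(x, mean, v):
    # coordinate of a sample along the first principal component
    return sum((xi - mi) * vi for xi, mi, vi in zip(x, mean, v))
```

In the combined scheme, the low-dimensional projections (from PCA or from SOM prototypes) would then be handed to the classifier.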
Abstract: Due to the non-linear characteristics of a photovoltaic (PV) array, PV systems are typically equipped with a maximum power point tracking (MPPT) feature. Moreover, when a PV array is under partially shaded conditions, hot-spot problems can occur that may damage the PV cells. Partial shading causes multiple peaks in the P-V characteristic curves. This paper presents a hybrid MPPT algorithm combining Particle Swarm Optimization (PSO) and an Artificial Neural Network (ANN) for detecting the global peak among the multiple peaks in order to extract the true maximum energy from the PV panel. The PV system consists of a PV array, a dc-dc boost converter controlled by the proposed MPPT algorithm, and a resistive load. The system was simulated using the MATLAB/Simulink package. The simulation results show that the proposed algorithm detects the true global peak power well. The results of the simulations are analyzed and discussed.
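The PSO part of the detection can be sketched as a swarm searching the operating voltage of a toy two-peak P-V curve. The curve shape and all PSO parameters below are assumptions made for illustration; the ANN stage and the converter control are not modeled here.

```python
import math
import random

def pv_power(v):
    # toy P-V curve with a local peak near 12 V and the global peak near 30 V
    return (40.0 * math.exp(-((v - 12.0) ** 2) / 20.0)
            + 60.0 * math.exp(-((v - 30.0) ** 2) / 30.0))

def pso_mppt(f, vmin=0.0, vmax=40.0, particles=10, iters=60):
    rng = random.Random(1)
    pos = [rng.uniform(vmin, vmax) for _ in range(particles)]
    vel = [0.0] * particles
    pbest = list(pos)
    gbest = max(pos, key=f)
    for _ in range(iters):
        for i in range(particles):
            r1, r2 = rng.random(), rng.random()
            # standard PSO velocity update (inertia 0.5, both gains 1.5)
            vel[i] = (0.5 * vel[i] + 1.5 * r1 * (pbest[i] - pos[i])
                      + 1.5 * r2 * (gbest - pos[i]))
            pos[i] = min(vmax, max(vmin, pos[i] + vel[i]))
            if f(pos[i]) > f(pbest[i]):
                pbest[i] = pos[i]
            if f(pos[i]) > f(gbest):
                gbest = pos[i]
    return gbest
```

Because particles are spread over the whole voltage range, the swarm is not trapped by the local peak the way a hill-climbing tracker would be.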
Abstract: This paper presents a fast and efficient on-line technique for estimating the impedance of unbalanced loads in power systems. The proposed technique is an application of a discrete-time dynamic filter based on stochastic estimation theory, which is suitable for estimating parameters in noisy environments. The algorithm uses sets of digital samples of the distorted voltage and current waveforms of the non-linear load to estimate the harmonic contents of these two signals. The non-linear load impedance is then calculated from these contents. The method is tested using practical data, and the results are reported and compared with those obtained using the conventional least-error-squares technique. In addition to producing very accurate results, the method can detect and reject bad measurements. This can be considered a very important advantage over conventional static estimation methods such as the least-error-squares method.
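The harmonic-content step can be illustrated with a simple correlation estimator; for uniformly sampled full periods this coincides with a least-squares fit. This is a stand-in sketch for the general idea, not the paper's stochastic dynamic filter.

```python
import math
import cmath

def harmonic_phasor(samples, k):
    # least-squares fit of a_k*cos + b_k*sin over one fundamental period;
    # with uniform full-period sampling this reduces to correlation
    N = len(samples)
    a = 2.0 / N * sum(s * math.cos(2 * math.pi * k * n / N)
                      for n, s in enumerate(samples))
    b = 2.0 / N * sum(s * math.sin(2 * math.pi * k * n / N)
                      for n, s in enumerate(samples))
    return complex(a, -b)  # phasor A*e^{j*phi} of A*cos(theta + phi)

def load_impedance(v_samples, i_samples, k):
    # impedance at harmonic k as the ratio of voltage and current phasors
    return harmonic_phasor(v_samples, k) / harmonic_phasor(i_samples, k)
```

The dynamic filter in the paper refines such estimates recursively and can flag bad measurements, which this batch sketch cannot.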
Abstract: An assessment of the surface waters in the Enugu metropolis for fecal coliform bacteria was undertaken. Enugu urban was divided into three areas (A1, A2 and A3), and fecal coliform bacteria were analysed in the surface waters found in these areas over four years (2005-2008). The plate count method was used for the analyses. The data generated were subjected to statistical tests including a normality test, a homogeneity-of-variance test, a correlation test and a tolerance limit test. The influence of seasonality and pollution trends were investigated using time series plots. Results from the tolerance limit test at 95% coverage with 95% confidence, with respect to the EU maximum permissible concentration, show that all three areas suffer from fecal coliform pollution. To this end, a remediation procedure involving the use of saw-dust extracts from three woods, namely Chlorophora excelsa (C. excelsa), Khaya senegalensis (K. senegalensis) and Erythrophleum ivorensis (E. ivorensis), for controlling the coliforms was studied. Results show that the mixture of the acetone extracts of the woods exhibited the most effective antibacterial inhibitory activity (26.00 mm zone of inhibition) against E. coli. The methanol extract mixture of the three woods gave the best inhibitory activity (26.00 mm zone of inhibition) against S. aureus, and a 25.00 mm zone of inhibition against E. aerogenes. The aqueous extract mixture gave acceptable zones of inhibition against the three bacterial organisms.
Abstract: The availability of high-dimensional biological datasets, such as those from gene expression, proteomic and metabolic experiments, can be leveraged for the diagnosis and prognosis of diseases. Many classification methods in this area have been studied to predict disease states and separate between predefined classes, such as patients with a particular disease versus healthy controls. However, most of the existing research focuses only on a specific dataset. There is a lack of generic comparison between classifiers that might provide a guideline for biologists or bioinformaticians to select the proper algorithm for new datasets. In this study, we compare the performance of popular classifiers, namely Support Vector Machine (SVM), Logistic Regression, k-Nearest Neighbor (k-NN), Naive Bayes, Decision Tree and Random Forest, on mock datasets. We mimic common biological scenarios by simulating various proportions of real discriminating biomarkers and different effect sizes thereof. The results show that SVM performs quite stably and reaches a higher AUC than the other methods, which may be explained by the ability of SVM to minimize the probability of error. Moreover, Decision Tree, with its good applicability for diagnosis and prognosis, shows good performance in our experimental setup. Logistic Regression and Random Forest, however, depend strongly on the proportion of discriminators and perform better when the number of discriminators is higher.
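The mock-data idea of a controlled fraction of discriminating features with a chosen effect size can be sketched as below. The nearest-centroid scorer and the rank-based AUC estimator are simplified stand-ins for the classifiers compared in the paper; all parameters are illustrative.

```python
import random

def make_mock_data(n_per_class=50, n_feat=20, n_informative=5, effect=1.0, seed=0):
    # two classes; only the first n_informative features differ in mean by `effect`
    rng = random.Random(seed)
    X, y = [], []
    for label in (0, 1):
        for _ in range(n_per_class):
            X.append([rng.gauss(effect * label if j < n_informative else 0.0, 1.0)
                      for j in range(n_feat)])
            y.append(label)
    return X, y

def auc(scores, labels):
    # probability that a random positive ranks above a random negative
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0 for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

def centroid_scores(X, y):
    # score each sample by its projection onto the class-mean difference
    n_feat = len(X[0])
    mu = {c: [sum(x[j] for x, l in zip(X, y) if l == c) / y.count(c)
              for j in range(n_feat)] for c in (0, 1)}
    d = [mu[1][j] - mu[0][j] for j in range(n_feat)]
    return [sum(xj * dj for xj, dj in zip(x, d)) for x in X]
```

Sweeping `n_informative` and `effect` over a grid reproduces the kind of scenario comparison described in the abstract.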
Abstract: The recent development of humanoid robots has led robot designers to imagine a great variety of anthropomorphic forms for human-like machines. Which form is the best? We try to answer this question from a double meaning of anthropomorphism: a positive anthropomorphism corresponding to the realization of an effectively anthropomorphic object, and a negative one corresponding to our natural tendency, in certain circumstances, to give human attributes to non-human beings. We postulate that any humanoid robot is concerned by both of these kinds of anthropomorphism. We propose to use gestalt theory and Heider's balance theory in order to analyze how negative anthropomorphism can influence our perception of human-like robots. From our theoretical approach we conclude that an “even shape”, as defined by gestalt theory, is not a sufficient condition for a good integration of future humanoid robots into a human community. Aesthetic perception of the robot cannot be separated from social perception: a humanoid robot, whatever efforts are made to improve its appearance, could be rejected if it is devoted to a task with overly strong affective implications.
Abstract: Repetitive systems are a class of systems that repetitively perform a simple task on a fixed pattern and are widespread in industry. Hence, many researchers have been interested in such systems, especially in the field of iterative learning control (ILC). In this paper, we propose a finite-horizon tracking control scheme for linear time-varying repetitive systems with uncertain initial conditions. The scheme is derived both analytically and numerically for state-feedback systems, and only numerically for output-feedback systems. It is then extended to stable systems with input constraints. All numerical schemes are developed in the form of linear matrix inequalities (LMIs). A distinguishing feature of the proposed scheme over existing iterative learning control is that it guarantees the tracking performance exactly, even under uncertain initial conditions. The simulation results demonstrate the good performance of the proposed scheme.
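The trial-to-trial learning that underlies repetitive-system control can be illustrated with a classical P-type ILC update on a toy FIR plant. This is a generic sketch, not the paper's LMI-based scheme; the plant coefficients and learning gain are assumptions.

```python
def simulate(u, g):
    # y[t] = sum_k g[k] * u[t-k]: a toy FIR plant executed over one trial
    N = len(u)
    return [sum(g[k] * u[t - k] for k in range(min(t + 1, len(g))))
            for t in range(N)]

def ilc(ref, g, gain=0.5, trials=20):
    # P-type ILC: u_{k+1} = u_k + gain * e_k, repeated over identical trials
    u = [0.0] * len(ref)
    errs = []
    for _ in range(trials):
        y = simulate(u, g)
        e = [r - yi for r, yi in zip(ref, y)]
        errs.append(max(abs(x) for x in e))
        u = [ui + gain * ei for ui, ei in zip(u, e)]
    return errs
```

With |1 - gain*g[0]| < 1 the tracking error contracts from trial to trial, which is the behavior the paper's LMI conditions certify in a far more general setting.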
Abstract: In modern distributed software systems, communication among composing parts represents a critical issue, but the idea of extending conventional programming languages with general-purpose communication constructs seems difficult to realize. As a consequence, there is a growing gap between the abstraction level required by distributed applications and the concepts provided by the platforms that enable communication. This work discusses how the Model Driven Software Development approach can be considered a mature technology for automatically generating the schematic, communication-related part of applications, while at the same time providing high-level specialized languages useful in all phases of software production. To achieve this goal, a stack of languages (meta-metamodels) has been introduced in order to describe, at different levels of abstraction, the collaborative behavior of generic entities in terms of communication actions related to a taxonomy of messages. Finally, the generation of platforms for communication is viewed as a form of specification of language semantics that provides executable models of applications together with model-checking support and effective runtime environments.
Abstract: It is well known that, amid the developments in the economic sector and the financial crises occurring throughout the world, volatility measurement is one of the most important concepts in financial time series analysis. In this paper we therefore discuss the volatility of the Amman stock market (Jordan) over a certain period of time. Since the wavelet transform is one of the most popular filtering methods and has grown rapidly over the last decade, we compare this method with the traditional technique, the Fast Fourier Transform, to decide which is the better method for analyzing volatility. The comparison is carried out on several statistical properties using the Matlab program.
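The two competing analyses can be sketched in a few lines: a one-level orthonormal Haar transform, whose detail coefficients reflect local volatility, and a naive DFT, the quantity the FFT computes efficiently. The toy series in the test is illustrative; in practice the input would be the market's return series.

```python
import math
import cmath

def haar_step(x):
    # one level of the orthonormal Haar wavelet transform (len(x) must be even)
    s = math.sqrt(2.0)
    approx = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
    return approx, detail

def dft_magnitudes(x):
    # naive DFT magnitude spectrum, normalized by the series length
    N = len(x)
    return [abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N)
                    for n in range(N))) / N
            for k in range(N)]
```

The key contrast the paper exploits is that the wavelet detail coefficients are localized in time, while DFT magnitudes describe only global frequency content.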
Abstract: By employing BS (Base Station) cooperation we can substantially increase the spectral efficiency and capacity of cellular systems. The signals received at each BS are sent to a central unit that performs the separation of the different MTs (Mobile Terminals) using the same physical channel. However, accurate sampling and quantization of those signals are needed so as to reduce the backhaul communication requirements.
In this paper we consider the optimization of the quantizers for BS cooperation systems. Four different quantizer types are analyzed and optimized to achieve better SQNR (Signal-to-Quantization-Noise Ratio) and BER (Bit Error Rate) performance.
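A natural baseline for such an optimization is the uniform midrise quantizer and its measured SQNR. The sketch below, with an assumed clipping range, exhibits the roughly 6 dB-per-bit behavior against which optimized quantizers are compared.

```python
import math

def uniform_quantize(x, bits, xmax):
    # uniform midrise quantizer clipped to [-xmax, xmax]
    step = 2.0 * xmax / (2 ** bits)
    q = math.floor(x / step) * step + step / 2.0
    return max(-xmax + step / 2.0, min(xmax - step / 2.0, q))

def sqnr_db(samples, bits, xmax):
    # measured signal-to-quantization-noise ratio in dB
    sig = sum(s * s for s in samples)
    noise = sum((s - uniform_quantize(s, bits, xmax)) ** 2 for s in samples)
    return 10.0 * math.log10(sig / noise)
```

Non-uniform quantizers improve on this by matching the step sizes to the amplitude distribution of the backhauled signals.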
Abstract: This paper presents a new method of analog fault diagnosis based on back-propagation neural networks (BPNNs), using wavelet decomposition and fractal dimension as preprocessors. The proposed method can detect and identify faulty components in an analog electronic circuit with tolerance by analyzing its impulse response. Using wavelet decomposition to preprocess the impulse response drastically de-noises the inputs to the neural network. The second preprocessing step, based on fractal dimension, extracts unique features, which are then fed to a neural network as inputs for further classification. A comparison of our work with [1] and [6], which also employ back-propagation (BP) neural networks, reveals that our system requires a much smaller network and performs significantly better in fault diagnosis of analog circuits, owing to the proposed preprocessing techniques.
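The fractal-dimension preprocessing can be illustrated with the Katz estimator for a sampled waveform. This particular estimator is an assumption for illustration, since the abstract does not specify which fractal-dimension measure the paper uses.

```python
import math

def katz_fd(x):
    # Katz fractal dimension of a 1-D waveform, assuming unit sample spacing:
    # FD = log10(n) / (log10(n) + log10(d / L))
    n = len(x) - 1
    L = sum(math.hypot(1.0, x[i + 1] - x[i]) for i in range(n))    # curve length
    d = max(math.hypot(i, x[i] - x[0]) for i in range(1, len(x)))  # diameter
    return math.log10(n) / (math.log10(n) + math.log10(d / L))
```

A straight line gives FD = 1, while a more irregular impulse response yields a larger value that can serve as a compact classification feature.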
Abstract: The Shanghai Cooperation Organization (SCO) is one of the successful outcomes of China's foreign policy since the end of the Cold War. The expansion of multilateral ties all over the world through institutional strategies such as the SCO identifies China as a more constructive power. The SCO became a new model of cooperation, formed on the remains of the collapsed Soviet system, and predetermined China's geopolitical role in the region. As a fast-developing, effective regional mechanism, the SCO today has a greater external impact on the international system and forms a new type of interaction for promoting China's grand strategy of a 'peaceful rise'.
Abstract: Deformable active contours are widely used in computer vision and image processing applications for image segmentation, especially in biomedical image analysis. The active contour, or “snake”, deforms towards a target object under the control of internal, image and constraint forces. However, if the contour is initialized with a small number of control points, there is a high probability of surpassing the sharp corners of the object during deformation. In this paper, a new technique is proposed to construct the initial contour by incorporating prior knowledge of the significant corners of the object, detected using the Harris operator. This reconstructed contour then begins to deform, attracting the snake towards the targeted object without missing the corners. Experimental results with several synthetic images show the ability of the new technique to deal with sharp corners with higher accuracy than traditional methods.
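The Harris operator used to find the significant corners can be sketched directly from its definition: image gradients, a windowed structure tensor, and the response det(M) - k*trace(M)^2. The tiny synthetic image in the test is a hypothetical example.

```python
def harris_response(img, k=0.04):
    # Harris corner response for a grayscale image given as a list of rows
    h, w = len(img), len(img[0])
    # image gradients by central differences (clamped at the borders)
    Ix = [[(img[y][min(x + 1, w - 1)] - img[y][max(x - 1, 0)]) / 2.0
           for x in range(w)] for y in range(h)]
    Iy = [[(img[min(y + 1, h - 1)][x] - img[max(y - 1, 0)][x]) / 2.0
           for x in range(w)] for y in range(h)]
    R = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # structure tensor summed over a 3x3 window
            sxx = syy = sxy = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ix, iy = Ix[y + dy][x + dx], Iy[y + dy][x + dx]
                    sxx += ix * ix
                    syy += iy * iy
                    sxy += ix * iy
            # corner response: det(M) - k * trace(M)^2
            R[y][x] = sxx * syy - sxy * sxy - k * (sxx + syy) ** 2
    return R
```

Local maxima of R above a threshold give the corner points that seed the initial contour.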
Abstract: The oil and gas industry has moved towards Load and Resistance Factor Design through API RP2A-LRFD and the recently published international standard ISO 19902 for the design of fixed steel offshore structures. ISO 19902 is intended to provide a harmonized design practice that offers balanced structural fitness for purpose, economy and safety. As part of ongoing work, a reliability analysis of the tubular joints of a jacket structure has been carried out to calibrate the load and resistance factors for the design of offshore platforms in Malaysia, as proposed in the ISO standard. Probabilistic models have been established for the load effects (wave, wind and current) and the tubular joint strengths. In this study, the First Order Reliability Method (FORM), coded in MATLAB, has been employed to evaluate the reliability index of typical joints designed using API RP2A-WSD and ISO 19902.
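The FORM computation can be sketched with the Hasofer-Lind Rackwitz-Fiessler (HL-RF) iteration in standard normal space. The linear limit state in the test, resistance minus load effect with assumed means and standard deviations, is a hypothetical example, not the paper's tubular-joint strength model.

```python
import math

def hlrf(g, grad, n, iters=50):
    # HL-RF iteration: locate the most probable failure point u* on g(u) = 0
    # in standard normal space; beta is its distance from the origin
    u = [0.0] * n
    for _ in range(iters):
        gu = g(u)
        gr = grad(u)
        norm2 = sum(c * c for c in gr)
        scale = (sum(c * ui for c, ui in zip(gr, u)) - gu) / norm2
        u = [scale * c for c in gr]
    return math.sqrt(sum(ui * ui for ui in u))  # reliability index beta
```

For a linear limit state g = R - S with independent normals, the iteration reproduces the closed-form beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2) in a single step.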
Abstract: Modern organizations operate under the pressure of dynamic and often unpredictable changes in both their external and internal environments. Market success, in this context, requires a particular competence in the form of flexibility, interpreted here both at the level of individuals and at the level of the organization. This paper addresses the changes taking place in the sphere of employment, as observed in economic entities operating on the Polish market. Based on their own empirical studies, the authors focus on the progressing trend towards the ‘flexibilization’ of employment, particularly in the context of transformations in organizational structure designed to facilitate the transition to management by projects and the differentiation of labor forms.
Abstract: In this paper we propose a novel approach for searching for eCommerce products using a mobile phone, illustrated by a prototype, eCoMobile. The approach aims to globalize mobile search by integrating the concept of user multilingualism; to demonstrate it, we deal specifically with the English and Arabic languages. The mobile user can formulate a query about a commercial product in either language (English/Arabic). The description of the information need relies on an ontology that represents the conceptualization of the product catalogue knowledge domain, defined in both English and Arabic. A query expressed on the mobile device client defines the concept corresponding to the name of the product, followed by a set of (property, value) pairs specifying the characteristics of the product. Once a query is submitted, it is communicated to the server side, which analyzes it and in turn performs an HTTP request to an eCommerce application server (such as Amazon). The latter responds by returning an XML file representing a set of elements, where each element defines an item of the searched product with its specific characteristics. The XML file is analyzed on the server side, and the items are then displayed on the mobile device client along with their relevant characteristics in the chosen language.
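The server-side analysis of the returned XML can be sketched with Python's standard library parser. The element and attribute names below are hypothetical, since the abstract does not specify the actual schema returned by the eCommerce server.

```python
import xml.etree.ElementTree as ET

# hypothetical response: a set of <item> elements, each with a name and
# (property, value) pairs mirroring the query structure in the abstract
SAMPLE = """<items>
  <item>
    <name>Phone X</name>
    <property key="color">black</property>
    <property key="price">199</property>
  </item>
  <item>
    <name>Phone Y</name>
    <property key="color">white</property>
    <property key="price">249</property>
  </item>
</items>"""

def parse_items(xml_text):
    # return a list of (item name, {property: value}) tuples
    root = ET.fromstring(xml_text)
    items = []
    for item in root.findall("item"):
        props = {p.get("key"): p.text for p in item.findall("property")}
        items.append((item.findtext("name"), props))
    return items
```

The client would then render each (name, properties) pair, substituting labels from the bilingual ontology for the chosen language.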