Abstract: AAM has been successfully applied to face alignment, but its performance is highly sensitive to initialization. If the initial values lie far from the global optimum, AAM-based face alignment is quite likely to converge to a local minimum. In this paper, we propose a progressive AAM-based face alignment algorithm which first finds the parameter vector fitting the inner facial feature points and then localizes the feature points of the whole face using this information. The proposed algorithm exploits the fact that the feature points of the inner part of the face are less variable and less affected by the background surrounding the face than those of the outer part (such as the chin contour). The algorithm consists of two stages: a modeling and relation-derivation stage and a fitting stage. The modeling and relation-derivation stage first constructs two AAM models, an inner face AAM model and a whole face AAM model, and then derives a relation matrix between the inner face AAM parameter vector and the whole face AAM parameter vector. In the fitting stage, the algorithm aligns a face progressively in two phases. In the first phase, it finds the parameter vector fitting the inner face AAM model to a new input face image; in the second phase, it localizes the whole set of facial feature points of the input image based on the whole face AAM model, using an initial parameter vector estimated from the inner parameter vector obtained in the first phase and the relation matrix derived in the first stage. Experiments verify that the proposed progressive AAM-based face alignment algorithm is more robust to pose, illumination, and face background than the conventional basic AAM-based face alignment algorithm.
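A minimal sketch of the relation-matrix step described above, assuming (since the abstract does not specify the derivation) that the map from inner-face to whole-face AAM parameters is an affine transform estimated by least squares over training images; all data here are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, d_inner, d_whole = 200, 10, 20

P_inner = rng.normal(size=(n_train, d_inner))       # inner-face params per training image
R_true = rng.normal(size=(d_inner + 1, d_whole))    # unknown "ground truth" relation
P_whole = np.hstack([P_inner, np.ones((n_train, 1))]) @ R_true

# Stage 1 (modeling): derive the relation matrix R with an affine least-squares fit.
A = np.hstack([P_inner, np.ones((n_train, 1))])     # homogeneous coordinates
R, *_ = np.linalg.lstsq(A, P_whole, rcond=None)

# Stage 2 (fitting), phase 2 initialization: map a newly fitted inner
# parameter vector to an initial whole-face parameter vector.
p_inner_new = rng.normal(size=d_inner)              # would come from the phase-1 AAM fit
p_whole_init = np.concatenate([p_inner_new, [1.0]]) @ R
print(p_whole_init.shape)                           # (20,) initial guess for phase 2
```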
Abstract: In this study, the theoretical relationship between pressure and density was investigated for cylindrical hollow fuel briquettes produced from a mixture of fibrous biomass material using a screw press without any chemical binder. The fuel briquettes were made of biomass and other waste materials such as spent coffee beans, mielie husks, saw dust, and coal fines under pressures of 0.878-2.2 MPa. The material was densified into briquettes with an outer diameter of 100 mm, an inner diameter of 35 mm, and a length of 50 mm. It was observed that manual screw compression produces briquettes of relatively low density compared to those made using hydraulic compression. The pressure-density relationship was obtained in the form of a power law and compares well with that of cylindrical solid briquettes made using hydraulic compression. The produced briquettes have a dry density of 989 kg/m3 and contain 26.30% fixed carbon, 39.34% volatile matter, 10.9% moisture, and 10.46% ash as per dry proximate analysis. Bomb calorimeter tests have shown the briquettes yielding a gross calorific value of 18.9 MJ/kg.
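A short sketch of how such a power-law relationship, density = a * P^b, is typically fitted by linear regression in log-log space. The measurement pairs below are invented for illustration (only the pressure range and final density are given in the abstract); the actual data and coefficients are the paper's:

```python
import numpy as np

P = np.array([0.878, 1.2, 1.6, 2.0, 2.2])            # pressure, MPa (illustrative values)
rho = np.array([820.0, 880.0, 930.0, 970.0, 989.0])  # density, kg/m^3 (illustrative)

# Fit log(rho) = b*log(P) + log(a), i.e. rho = a * P**b.
b, log_a = np.polyfit(np.log(P), np.log(rho), 1)
a = np.exp(log_a)
print(f"density ~ {a:.1f} * P^{b:.3f}  (kg/m^3, P in MPa)")
```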
Abstract: In this paper we present an efficient approach for the prediction of two sunspot-related time series, namely the Yearly Sunspot Number and the IR5 Index, that are commonly used for monitoring solar activity. The method is based on exploiting partially recurrent Elman networks and can be divided into three main steps: the first consists of a "de-rectification" of the time series under study in order to obtain a new time series whose appearance, similar to a sum of sinusoids, can be modelled by our neural networks much better than the original dataset. After that, we normalize the de-rectified data so that they have zero mean and unit standard deviation and, finally, train an Elman network with only one input, a recurrent hidden layer, and one output using a back-propagation algorithm with variable learning rate and momentum. The achieved results show the efficiency of this approach, which, although very simple, can perform better than most existing solar activity forecasting methods.
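A minimal numpy sketch of the normalization step and an Elman-style network (one input, recurrent hidden layer, one output) trained with backprop and momentum. The paper's "de-rectification" step is specific to the method and is not reproduced; the series, layer size, and learning schedule are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
series = rng.normal(size=500)                 # stand-in for the de-rectified series
x = (series - series.mean()) / series.std()   # zero mean, unit standard deviation

n_h, lr, mom = 8, 0.01, 0.9
Wx, Wh = rng.normal(0, 0.1, n_h), rng.normal(0, 0.1, (n_h, n_h))
Wy = rng.normal(0, 0.1, n_h)
vWx, vWh, vWy = np.zeros_like(Wx), np.zeros_like(Wh), np.zeros_like(Wy)

h = np.zeros(n_h)
for t in range(len(x) - 1):
    h_prev = h
    h = np.tanh(Wx * x[t] + Wh @ h_prev)      # Elman: context = previous hidden state
    y = Wy @ h
    err = y - x[t + 1]                        # one-step-ahead prediction error
    # Classic Elman training: backprop with the context units treated as
    # plain inputs (no gradient through time), plus momentum on each update.
    dWy = err * h
    dh = err * Wy * (1 - h**2)
    dWx, dWh = dh * x[t], np.outer(dh, h_prev)
    vWy = mom * vWy - lr * dWy; Wy += vWy
    vWx = mom * vWx - lr * dWx; Wx += vWx
    vWh = mom * vWh - lr * dWh; Wh += vWh
```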
Abstract: The objective of this research was to study the influence of the marketing mix on customers' purchasing behavior. A total of 397 respondents were sampled from customers who were patrons of the Chatuchak Plaza market. A questionnaire was utilized as the data collection tool. Statistics utilized in this research included frequency, percentage, mean, standard deviation, and multiple regression analysis. Data were analyzed using the Statistical Package for the Social Sciences. The findings revealed that the majority of respondents were male, aged between 25 and 34 years, held an undergraduate degree, and were married and living together. The average income of respondents was between 10,001 and 20,000 baht. In terms of occupation, the majority worked for private companies. The analysis disclosed that three variables of the marketing mix, namely price (X2), place (X3), and product (X1), had an influence on the frequency of customer purchasing. Together these three variables explain about 30 percent of the variance in purchase frequency through the equation Y1 = 6.851 + .921(X2) + .949(X3) + .591(X1). It was also found that two marketing mix variables had an influence on the amount of customer purchasing: physical characteristics (X6) and process (X7). These two variables explain about 17 percent of the variance in purchase amount through the equation Y2 = 2276.88 + 2980.97(X6) + 2188.09(X7).
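The two fitted regression equations reported above, written out as plain functions so the predictions can be computed directly (the example factor scores are invented):

```python
def purchase_frequency(x1, x2, x3):
    """Y1: purchase frequency from product (X1), price (X2), place (X3) scores."""
    return 6.851 + 0.921 * x2 + 0.949 * x3 + 0.591 * x1

def purchase_amount(x6, x7):
    """Y2: purchase amount (baht) from physical characteristics (X6), process (X7)."""
    return 2276.88 + 2980.97 * x6 + 2188.09 * x7

print(purchase_frequency(3.5, 4.0, 3.8))  # illustrative factor scores
print(purchase_amount(3.2, 3.6))
```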
Abstract: The aim of this study is to test the "work values" inventory developed by Tevruz and Turgut and to utilize the concept in a model which aims to create a greater understanding of the work experience. The study examines the multiple effects of work values, work-value congruence, and work centrality on organizational citizenship behavior. In this respect, it is hypothesized that work values and work-value congruence predict organizational citizenship behavior through work centrality. The work-goal congruence test and Tevruz and Turgut's work values inventory are administered, along with Kanungo's work centrality test and Podsakoff et al.'s [47] organizational citizenship behavior test, to employees working in Turkish SMEs. The study validated that Tevruz and Turgut's work values inventory and the work-value congruence test are reliable and can be used in future research. The study revealed a mediating role of work centrality only for the relationship between work values and the responsibility dimension of citizenship behavior. Most importantly, this study introduced an important concept, work-value congruence, which enables a better understanding of work values and their relation to various attitudinal variables.
Abstract: The proliferation of web applications and the pervasiveness of mobile technology make web-based attacks ever more attractive and easier to launch. A Web Application Firewall (WAF) is an intermediate tool between the web server and users that provides comprehensive protection for web applications. The WAF is a negative security model whose detection and prevention mechanisms are based on predefined or user-defined attack signatures and patterns. However, a WAF alone is not adequate to provide the best defense against web vulnerabilities, which are increasing in number and complexity daily. This paper presents a methodology to automatically design a positive security model which identifies and allows only legitimate web queries. The paper shows that a true positive rate of more than 90% can be achieved.
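An illustrative sketch of the positive security idea: profiles of legitimate query parameters are learned from clean traffic, and any query that does not match a learned profile is rejected. The concrete learning and matching rules here (per-parameter character classes and length bounds, with a small slack) are assumptions for the example, not the paper's actual method:

```python
from urllib.parse import parse_qsl

def _char_class(value):
    return "digits" if value.isdigit() else "alnum" if value.isalnum() else "other"

def learn_profile(clean_queries):
    """Build per-parameter (min_len, max_len, char_classes) from clean traffic."""
    profile = {}
    for q in clean_queries:
        for name, value in parse_qsl(q):
            lo, hi, classes = profile.get(name, (len(value), len(value), set()))
            profile[name] = (min(lo, len(value)), max(hi, len(value)),
                             classes | {_char_class(value)})
    return profile

def is_legitimate(query, profile):
    """Allow only queries whose parameters all match a learned profile."""
    for name, value in parse_qsl(query):
        if name not in profile:
            return False
        lo, hi, classes = profile[name]
        if not (lo <= len(value) <= hi + 4) or _char_class(value) not in classes:
            return False
    return True

profile = learn_profile(["id=42&user=alice", "id=7&user=bob"])
print(is_legitimate("id=13&user=carol", profile))            # True
print(is_legitimate("id=1;DROP TABLE x&user=eve", profile))  # False
```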
Abstract: The system development life cycle (SDLC) is a process used during the development of any system. The SDLC consists of four main phases: analysis, design, implementation, and testing. During the analysis phase, a context diagram and data flow diagrams are used to produce the process model of a system. Consistency between the context diagram and the lower-level data flow diagrams is very important for a smooth system development process. However, manually checking the consistency from the context diagram to the lower-level data flow diagrams using a checklist is time-consuming. At the same time, the limited human ability to spot errors is one of the factors that influence the correctness and balancing of the diagrams. This paper presents a tool that automates the consistency check between Data Flow Diagrams (DFDs) based on the rules of DFDs. The tool serves two purposes: as an editor to draw the diagrams and as a checker to verify the correctness of the diagrams drawn. The consistency check from the context diagram to the lower-level data flow diagrams is embedded inside the tool to overcome the manual checking problem.
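A minimal sketch of the balancing rule such a tool automates: the external inputs and outputs of a process in the context diagram must match those of its decomposition in the lower-level DFD. Diagrams are modeled here simply as sets of (source, target, dataflow) edges, an illustration rather than the tool's actual internal representation:

```python
def external_flows(edges, internal_nodes):
    """Flows crossing the boundary of the given set of processes."""
    inputs = {(s, f) for s, t, f in edges
              if t in internal_nodes and s not in internal_nodes}
    outputs = {(t, f) for s, t, f in edges
               if s in internal_nodes and t not in internal_nodes}
    return inputs, outputs

# Context diagram: one process "System"; level-1 DFD: its decomposition P1, P2.
context = [("Customer", "System", "order"), ("System", "Customer", "invoice")]
level1 = [("Customer", "P1", "order"), ("P1", "P2", "order details"),
          ("P2", "Customer", "invoice")]

ok = external_flows(context, {"System"}) == external_flows(level1, {"P1", "P2"})
print("balanced" if ok else "inconsistent")   # balanced
```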
Abstract: Image compression plays a vital role in today's communication. Limited allocated bandwidth leads to slower communication, so to improve the rate of transmission within the limited bandwidth, image data must be compressed before transmission. There are basically two types of compression: lossy and lossless. Although lossy compression gives a higher compression ratio than lossless compression, the accuracy of retrieval is lower. The JPEG and JPEG 2000 image compression systems follow Huffman coding for image compression. The JPEG 2000 coding system uses the wavelet transform, which decomposes the image into different levels, where the coefficients in each subband are uncorrelated with the coefficients of other subbands. Embedded Zerotree Wavelet (EZW) coding exploits the multi-resolution properties of the wavelet transform to give a computationally simple algorithm with better performance compared to existing wavelet-based coders. For further improvement of compression applications, other coding methods have recently been suggested; an ANN-based approach is one such method. Artificial neural networks have been applied to many problems in image processing and have demonstrated their superiority over classical methods when dealing with noisy or incomplete data in image compression applications. A performance analysis on different images is presented for an EZW coding system combined with the error backpropagation algorithm. The implementation and analysis show approximately 30% higher accuracy in the retrieved image compared to the existing EZW coding system.
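A minimal sketch of the ANN idea referenced above: a single-hidden-layer autoencoder trained with error backpropagation compresses 8x8 image blocks into a smaller hidden code. The architecture, sizes, and training data are illustrative assumptions, not the paper's exact network:

```python
import numpy as np

rng = np.random.default_rng(0)
blocks = rng.random((256, 64))            # stand-in for normalized 8x8 image blocks
n_in, n_hid, lr = 64, 16, 0.05            # 16/64 code ratio, ~4:1 compression

W1 = rng.normal(0, 0.1, (n_in, n_hid))
W2 = rng.normal(0, 0.1, (n_hid, n_in))
for epoch in range(200):
    h = np.tanh(blocks @ W1)              # encode: 64 -> 16
    out = h @ W2                          # decode: 16 -> 64
    err = out - blocks
    # Backpropagate the reconstruction error through both layers.
    W2 -= lr * h.T @ err / len(blocks)
    W1 -= lr * blocks.T @ ((err @ W2.T) * (1 - h**2)) / len(blocks)

mse = float(np.mean((np.tanh(blocks @ W1) @ W2 - blocks) ** 2))
print(f"reconstruction MSE: {mse:.4f}")
```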
Abstract: Most papers model the Joint Replenishment Problem (JRP) as a (kT, S) policy, where kT is a multiple of a common review period T and S is a predefined order-up-to level. In general, the (T, S) policy is characterized by a long out-of-control period, which requires a large amount of safety stock compared to the (R, Q) policy. In this paper a probabilistic model is built where an item, call it item (i), with the shortest ordering interval (T) is modeled under an (R, Q) policy and its inventory is continuously reviewed, while the rest of the items (j) are periodically reviewed at definite times corresponding to item (i).
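An illustrative simulation sketch of the mixed policy described above: item (i) is continuously reviewed under (R, Q), ordering Q whenever its inventory position drops to R, while the remaining items (j) are raised to their order-up-to levels at item (i)'s ordering epochs. The demand processes and all parameter values are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
R, Q = 20, 50                       # (R, Q) policy for item (i)
S = np.array([80, 60])              # order-up-to levels for items (j)
inv_i, inv_j = 40, np.array([70.0, 50.0])

for day in range(30):
    inv_i -= rng.poisson(5)         # continuously reviewed demand for item (i)
    inv_j = inv_j - rng.poisson([4, 3])
    if inv_i <= R:                  # item (i) triggers a joint replenishment
        inv_i += Q
        inv_j = S.astype(float)     # items (j) are reviewed and filled up to S_j
print(inv_i, inv_j)
```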
Abstract: This paper presents an adaptive motion estimator that can be dynamically reconfigured with the best algorithm according to the nature of the video during the lifetime of a running application. The 4 Step Search (4SS) and the Gradient Search (GS) algorithms are integrated in the estimator to be used for rapid and slow video sequences, respectively. The Full Search Block Matching (FSBM) algorithm has also been integrated to be used for video sequences that are not real-time oriented. In order to efficiently reduce the computational cost while achieving better visual quality at low power cost, the proposed motion estimator is based on a Variable Block Size (VBS) scheme that uses only the 16x16, 16x8, 8x16 and 8x8 modes. Experimental results show that the adaptive motion estimator yields better results in terms of Peak Signal to Noise Ratio (PSNR), computational cost, FPGA occupied area, and dissipated power relative to the most popular variable block size schemes presented in the literature.
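A minimal sketch of full-search block matching (FSBM) with the sum-of-absolute-differences (SAD) criterion for one 16x16 block; faster methods such as 4SS and GS visit only a subset of these candidate positions. The frame data and search range below are illustrative:

```python
import numpy as np

def full_search(cur, ref, bx, by, bs=16, srange=8):
    """Exhaustive SAD search for the block at (bx, by) within +/- srange."""
    block = cur[by:by + bs, bx:bx + bs].astype(np.int32)
    best = (0, 0, np.inf)
    for dy in range(-srange, srange + 1):
        for dx in range(-srange, srange + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + bs > ref.shape[0] or x + bs > ref.shape[1]:
                continue
            sad = np.abs(block - ref[y:y + bs, x:x + bs].astype(np.int32)).sum()
            if sad < best[2]:
                best = (dx, dy, sad)
    return best                       # motion vector (dx, dy) and its SAD

cur = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
ref = np.roll(cur, (2, 3), axis=(0, 1))   # reference = current shifted by (dy=2, dx=3)
print(full_search(cur, ref, 24, 24))      # expect (3, 2, 0)
```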
Abstract: Software development has experienced remarkable progress in the past decade. However, due to the rising complexity and magnitude of projects, development productivity has not improved consistently. By analyzing the latest ISBSG data repository of 4106 projects, we discovered that software development productivity actually underwent irregular variations between the years 1995 and 2005. Considering the factors significant to productivity, we found that these variations are primarily caused by variations in average team size and the disproportionate use of less productive 3GL languages.
Abstract: Pattern matching is one of the fundamental applications in molecular biology, and searching DNA-related data is a common activity for molecular biologists. In this paper we explore the applicability of a new pattern matching technique, the Index based Forward Backward Multiple Pattern Matching algorithm (IFBMPM), to DNA sequences. Our approach avoids unnecessary comparisons in the DNA sequence; as a result, the number of comparisons of the proposed algorithm is much lower than that of other popular existing methods. The number of comparisons drops rapidly and the execution time decreases accordingly, showing better performance.
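An illustrative sketch of the forward-backward comparison idea in general: the pattern is compared against each text window from both ends simultaneously, so a mismatch at either end abandons the window early. This is a generic illustration only, not the authors' exact IFBMPM, which additionally uses an index over the sequence and handles multiple patterns:

```python
def forward_backward_search(text, pattern):
    """Report all occurrences of pattern, comparing from both ends at once."""
    m, hits = len(pattern), []
    for s in range(len(text) - m + 1):
        i, j = 0, m - 1
        while i <= j and text[s + i] == pattern[i] and text[s + j] == pattern[j]:
            i, j = i + 1, j - 1
        if i > j:                      # both pointers crossed: full match
            hits.append(s)
    return hits

print(forward_backward_search("ACGTACGTGACGT", "ACGT"))  # [0, 4, 9]
```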
Abstract: In recent years multi-agent systems have emerged as one of the interesting architectures facilitating distributed collaboration and distributed problem solving. Each node (agent) of the network might pursue its own agenda, exploit its environment, develop its own problem-solving strategy, and establish required communication strategies. Within each node of the network, one could encounter a diversity of problem-solving approaches. Quite commonly, agents realize their processing at the level of information granules that is most suitable from their local points of view. Information granules can come at various levels of granularity. Each agent could exploit a certain formalism of information granulation, engaging a machinery of fuzzy sets, interval analysis, or rough sets, to name a few dominant technologies of granular computing. With this in mind, a fundamental issue arises of forming effective interaction linkages between the agents so that they can fully broadcast their findings and benefit from interacting with others.
Abstract: People detection from images has a variety of applications such as video surveillance and driver assistance systems, but it remains a challenging task, and more so in crowded environments such as shopping malls, where occlusion of the lower parts of the human body often occurs. The lack of full-body information requires more effective features than common features such as HOG. In this paper, new features are introduced that exploit the global self-symmetry (GSS) characteristic of head-shoulder patterns. The features encode the similarity or difference of the color histograms and oriented gradient histograms of two vertically symmetric blocks. These domain-specific features are fast to compute from integral images in the Viola-Jones cascade-of-rejectors framework. The proposed features are evaluated on our own head-shoulder dataset, which consists in part of the well-known INRIA pedestrian dataset. Experimental results show that the GSS features are effective in marginally reducing false alarms and that the gradient GSS features are preferred more often than the color GSS ones in the feature selection.
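A sketch of the feature computation described above: cumulative ("integral") per-bin histograms let the histogram of any rectangular block be read off in constant time, and a GSS-style feature then compares the histograms of two vertically symmetric blocks. The bin count, block placement, and distance measure are illustrative assumptions:

```python
import numpy as np

def integral_histograms(bin_map, n_bins):
    """One integral image per histogram bin, with a zero-padded border."""
    h, w = bin_map.shape
    ih = np.zeros((n_bins, h + 1, w + 1))
    for b in range(n_bins):
        ih[b, 1:, 1:] = np.cumsum(np.cumsum(bin_map == b, axis=0), axis=1)
    return ih

def block_hist(ih, y0, x0, y1, x1):
    """Histogram of bin_map[y0:y1, x0:x1] in O(1) per bin."""
    return ih[:, y1, x1] - ih[:, y0, x1] - ih[:, y1, x0] + ih[:, y0, x0]

img = np.random.default_rng(0).integers(0, 8, (64, 64))  # stand-in orientation bins
ih = integral_histograms(img, 8)
left = block_hist(ih, 8, 8, 24, 24)       # block left of the vertical symmetry axis
right = block_hist(ih, 8, 40, 24, 56)     # its vertically symmetric counterpart
gss = np.abs(left - right).sum() / max(left.sum(), 1)  # normalized difference score
print(gss)
```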
Abstract: A Bloom filter is a probabilistic, memory-efficient data structure designed to answer rapidly whether an element is present in a set. It can tell that an element is definitely not in the set, whereas a positive answer holds only with a certain probability. The trade-off of using a Bloom filter is a certain configurable risk of false positives. The odds of a false positive can be made very low if the number of hash functions is sufficiently large. For spam detection, a weight is attached to each set of elements; the spam weight for a word is a measure used to rate the e-mail. Each word is assigned to a Bloom filter based on its weight. The proposed work introduces an enhanced concept in Bloom filters called the Bin Bloom Filter (BBF). The performance of the BBF over the conventional Bloom filter is evaluated under various optimization techniques. A real-time data set and synthetic data sets are used for the experimental analysis, and results are demonstrated for bin sizes 4, 5, 6, and 7. Analysis of the results shows that the BBF, which uses heuristic techniques, performs better than the traditional Bloom filter in spam detection.
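A minimal sketch of the conventional Bloom filter being enhanced here; the proposed BBF additionally partitions words into weight-based bins, each with its own filter, which is not reproduced in this sketch. The filter size m and hash count k below are arbitrary example values:

```python
import hashlib

class BloomFilter:
    def __init__(self, m=1 << 16, k=5):
        self.m, self.k, self.bits = m, k, bytearray(m)

    def _positions(self, item):
        # k hash functions derived from one cryptographic digest family.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = 1

    def __contains__(self, item):
        # False: definitely absent; True: present with a configurable
        # false-positive probability.
        return all(self.bits[p] for p in self._positions(item))

bf = BloomFilter()
bf.add("viagra"); bf.add("lottery")
print("viagra" in bf, "meeting" in bf)   # True False (with high probability)
```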
Abstract: This study was initiated with a three-pronged objective: first, to identify the relationship between Technological Competency factors (Technical Capability, Firm Innovativeness, and E-Business Practices) and professional service firms' business performance; second, to investigate the predictors of professional service firms' business performance; and finally, to evaluate the predictors of business performance according to the type of professional service firm. A survey questionnaire was deployed to collect empirical data. The questionnaire was distributed to the owners of professional small and medium-sized service enterprises in the accounting, legal, engineering, and architecture sectors. Analysis showed that all three Technological Competency factors have a moderate effect on business performance. In addition, the regression models indicate that technical capability is the most influential determinant of business performance, followed by e-business practices and firm innovativeness. Accordingly, the main predictor of business performance for all types of firms is technical capability.
Abstract: Human heart valves diseased by congenital heart defects, rheumatic fever, bacterial infection, or cancer may develop stenosis or insufficiency. Treatment may be with medication but often involves valve repair or replacement (insertion of an artificial heart valve). Bileaflet mechanical heart valves (BMHVs) are widely implanted to replace diseased heart valves, but they still suffer from complications such as hemolysis, platelet activation, tissue overgrowth, and device failure. These complications are closely related to both the flow characteristics through the valves and the leaflet dynamics. In this study, the physiological flow interacting with the moving leaflets of a bileaflet mechanical heart valve (BMHV) is simulated with a strongly coupled implicit fluid-structure interaction (FSI) method newly organized based on the Arbitrary Lagrangian-Eulerian (ALE) approach and the dynamic mesh (remeshing) method of FLUENT. The simulated results are in good agreement with previous experimental studies. This study shows the applicability of the present FSI model to the complicated physics of the interaction between fluid flow and a moving boundary.
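For reference, the incompressible Navier-Stokes equations in the standard ALE form, as generally used in such moving-mesh FSI solvers (the paper's exact formulation is not reproduced here), read

\[
\frac{\partial \mathbf{u}}{\partial t} + \left[(\mathbf{u}-\mathbf{u}_g)\cdot\nabla\right]\mathbf{u} = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u}, \qquad \nabla\cdot\mathbf{u} = 0,
\]

where \(\mathbf{u}\) is the fluid velocity, \(\mathbf{u}_g\) is the mesh (grid) velocity supplied by the dynamic mesh method, \(p\) is the pressure, \(\rho\) is the density, and \(\nu\) is the kinematic viscosity.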
Abstract: It is well known that the channel capacity of a Multiple-Input Multiple-Output (MIMO) system increases as the number of antenna pairs between transmitter and receiver increases, but such a system suffers from the cost of multiple expensive RF chains. To reduce the cost of RF chains, the Antenna Selection (AS) method can offer a good tradeoff between expense and performance. In a transmit AS system, Channel State Information (CSI) feedback is required to choose the best subset of antennas, and the effects of delays and errors occurring in the feedback channel are the most dominant factors degrading the performance of the AS method. This paper presents the concept of an AS method using CSI obtained from channel reciprocity instead of from feedback. The reciprocity technique can easily acquire CSI by utilizing the reverse channel, where the forward and reverse channels are considered symmetric in time, frequency, and location. In this work, the capacity performance of a MIMO system using the AS method at the transmitter with reciprocal channels is investigated on our own developed testbed. The obtained results show that the reciprocity technique offers capacity close to that of a system with perfect CSI and gains 0.9 to 2.2 bps/Hz of capacity over a system without AS at an SNR of 10 dB.
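An illustrative sketch of transmit antenna selection given a known channel H: among all subsets of L transmit antennas, choose the one maximizing the MIMO capacity log2 det(I + (SNR/L) Hs Hs^H). The Rayleigh channel and array sizes are invented for the example; with reciprocity, H would be measured on the reverse link rather than fed back:

```python
import numpy as np
from itertools import combinations

def capacity(Hs, snr):
    """MIMO capacity (bps/Hz) with equal power over the L selected antennas."""
    nr, L = Hs.shape
    return np.log2(np.linalg.det(np.eye(nr) + (snr / L) * Hs @ Hs.conj().T)).real

rng = np.random.default_rng(0)
nt, nr, L = 4, 2, 2                                   # select 2 of 4 Tx antennas
snr = 10 ** (10 / 10)                                 # SNR of 10 dB
H = (rng.normal(size=(nr, nt)) + 1j * rng.normal(size=(nr, nt))) / np.sqrt(2)

best = max(combinations(range(nt), L),
           key=lambda s: capacity(H[:, list(s)], snr))
print(best, capacity(H[:, list(best)], snr))          # chosen subset, bps/Hz
```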
Abstract: The purpose of this paper is to assess the value of neural networks for the classification of cancer and noncancer prostate cells. Gauss Markov Random Field, Fourier entropy, and wavelet average deviation features are calculated from 80 noncancer and 80 cancer prostate cell nuclei. For classification, three artificial neural network techniques are used: the multilayer perceptron, the radial basis function network, and learning vector quantization. Two configurations are utilized for the multilayer perceptron: the first has a single hidden layer with between 3 and 15 nodes; the second has two hidden layers, each with between 3 and 15 nodes. An overall classification rate of 86.88% is achieved.
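An illustrative sketch of the first multilayer perceptron configuration described above, scanning a single hidden layer from 3 to 15 nodes. The feature matrix here is random stand-in data; the real inputs would be the Gauss Markov Random Field, Fourier entropy, and wavelet features of the 160 nuclei:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(160, 12))            # stand-in features: 80 cancer + 80 noncancer
y = np.repeat([0, 1], 80)

for n in range(3, 16):                    # single hidden layer, 3 to 15 nodes
    clf = MLPClassifier(hidden_layer_sizes=(n,), max_iter=2000, random_state=0)
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{n:2d} hidden nodes: {acc:.3f}")
```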
Abstract: Entrepreneurs are important for national labour markets and economies in that they contribute significantly to economic growth as well as provide the majority of jobs and create new ones. According to the Global Entrepreneurship Monitor's "Report on Women and Entrepreneurship", investment in women's entrepreneurship is an important way to exponentially increase the impact of new venture creation; finding ways to empower women's participation and success in entrepreneurship is critical for more sustainable and successful economic development. Our results confirm that there are still differences between men and women entrepreneurs. The reasons seem to be the lack of specific business skills, less extensive social networks, and the lack of identification patterns among women. These differences can be explained by the fact that women still have fewer opportunities to make a career. If this is correct, we can predict an increasing proportion of women among entrepreneurs in the coming years. Concerning the development of a favorable environment for developing and enhancing women's entrepreneurship activities, our results show that insertion in a network and the presence of a role model doubtless represent determining elements in the choice to launch an entrepreneurial activity, as well as a precious resource for the success of the company.