Abstract: Segmentation techniques based on Active Contour Models have benefited greatly from the use of prior information during their evolution. Shape prior information is captured from a training set and introduced into the optimization procedure to restrict the evolution to allowable shapes. In this way, the evolution can converge onto regions even with weak boundaries. Although significant effort has been devoted to different ways of capturing and analyzing prior information, very little attention has been paid to how image information is combined with prior information. This paper focuses on a more natural way of incorporating prior information in the level set framework. As a proof of concept, the method is applied to hippocampus segmentation in T1-MR images. Hippocampus segmentation is a very challenging task, due to the heterogeneous surrounding region and the missing boundary with the neighboring amygdala, whose intensities are identical. The proposed method mimics the way humans perform segmentation and thus improves segmentation accuracy.
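As an illustration of the coupling the abstract describes, the sketch below shows a generic level-set update in which an image-derived force is combined with a shape-prior term. The additive form and the weight lam are assumptions for illustration, not the paper's specific formulation.

```python
import numpy as np

def evolve_level_set(phi, image_force, phi_prior, lam=0.5, dt=0.1, n_iter=100):
    """Minimal sketch of level-set evolution with a shape-prior term.

    phi         : current level-set function (2-D array)
    image_force : data term derived from the image (same shape as phi)
    phi_prior   : level-set embedding of the training-set shape prior
    lam         : assumed weight balancing image and prior information
    """
    for _ in range(n_iter):
        # The prior term pulls the contour toward the allowable shape;
        # the image term drives it toward object boundaries.
        prior_force = phi_prior - phi
        phi = phi + dt * (image_force + lam * prior_force)
    return phi
```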
Abstract: This paper investigates possible optimizations of linear algebra problems that can be solved by parallel processing using special arrays called systolic arrays. Special types of transformations are used for designing these arrays, and their characteristics are shown. The main focus is on the advantages of these arrays in the parallel computation of matrix products, with a special approach to the design of a systolic array for matrix multiplication. Multiplication of large matrices requires a lot of computational time, and its complexity is O(n³). Many algorithms (both sequential and parallel) have been developed with the purpose of minimizing calculation time, and systolic arrays are well suited for this purpose. In this paper we show that using an appropriate transformation leads to more efficient arrays for computations of this type.
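To make the data flow concrete, here is a software simulation of the classical n x n systolic array for matrix multiplication. It models the standard skewed streaming scheme, not the specific transformation-derived designs of the paper: processing element (i, j) sees a[i, k] and b[k, j] at time step t = i + j + k, so the array finishes in O(n) steps rather than the O(n³) serial time.

```python
import numpy as np

def systolic_matmul(A, B):
    """Software simulation of an n x n systolic array computing C = A @ B.

    Rows of A stream in from the left and columns of B from the top,
    each skewed by one time step; every processing element (PE)
    multiplies the values passing through it and accumulates the result.
    """
    n = A.shape[0]
    C = np.zeros((n, n))
    for t in range(3 * n - 2):          # total wavefront time steps
        for i in range(n):
            for j in range(n):
                k = t - i - j           # which operands reach PE (i, j) now
                if 0 <= k < n:
                    C[i, j] += A[i, k] * B[k, j]
    return C

A = np.arange(9).reshape(3, 3)
B = np.eye(3)
assert np.allclose(systolic_matmul(A, B), A @ B)
```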
Abstract: Although services play a crucial role in the economy, service productivity has not gained as much importance as productivity management in manufacturing. This paper presents key findings from the literature and from practice. Based on an initial definition of complex services, seven productivity concepts are briefly presented and assessed against relevant criteria specific to complex services. Based on these findings, a complex service productivity model is proposed. The novel model comprises all specific dimensions of service provision from both the provider's and the customer's perspective. A clear assignment of identified value drivers and the relationships between them is presented. In order to verify the conceptual service productivity model, a case study from the project engineering department of a chemical plant development and construction company is presented.
Abstract: In this paper, biannual time series data on unemployment rates (from the Labour Force Survey) are expanded to quarterly rates and linked to quarterly unemployment rates (from the Quarterly Labour Force Survey). The resultant linked series and the consumer price index (CPI) series are examined using Johansen's cointegration approach and vector error correction modeling. The study finds that both series are integrated of order one and are cointegrated. A statistically significant cointegrating relationship is found to exist between the time series of unemployment rates and the CPI. Given this significant relationship, the study models it using Vector Error Correction Models (VECM), one with a restriction on the deterministic term and the other with no restriction.
A formal statistical confirmation of the existence of a unique linear and lagged relationship between inflation and unemployment for the period between September 2000 and June 2011 is presented. For the given period, the CPI was found to be an unbiased predictor of the unemployment rate. This relationship can be explored further for the development of appropriate forecasting models incorporating other study variables.
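A hedged sketch of the two analysis steps using statsmodels follows. The series below are synthetic placeholders sharing one stochastic trend, since the paper's linked data are not reproduced here; 44 quarters matches the September 2000 to June 2011 window.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

# Hypothetical quarterly series standing in for the linked unemployment
# rate and CPI data.
rng = np.random.default_rng(0)
common = np.cumsum(rng.normal(size=44))        # shared stochastic trend
data = pd.DataFrame({
    "unemployment": common + rng.normal(scale=0.5, size=44),
    "cpi":          common + rng.normal(scale=0.5, size=44),
})

# Johansen trace test for the cointegration rank.
jres = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace statistics:   ", jres.lr1)
print("95% critical values:", jres.cvt[:, 1])

# VECM with cointegration rank 1. deterministic="ci" restricts the
# constant to the cointegrating relation (the restricted model);
# deterministic="co" leaves it unrestricted (the other model).
vecm = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="ci").fit()
print(vecm.beta)    # estimated cointegrating vector
```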
Abstract: The purposes of this research were 1) to survey the number of drugstores that unlawfully dispense asthma prescription drugs, in the form of drug combinations, in the Phaya Thai district of Bangkok, 2) to find the steroids contained in those drug combinations, and 3) to find a means of informing the general public about the dangers of these drugs and of campaigning to stop their dispensing. The researchers collected drug combinations from 69 drugstores in the Phaya Thai district from Feb 15, 2012 to Mar 15, 2012. The survey found that 21 drugstores (30.43%) sold asthma drug combinations to customers without a prescription. The collected samples were tested for steroid contamination using immunochromatography kits. Eleven samples (52.38%) were found to be contaminated with steroids. In short, there should be control and inspection of drugstores regarding the distribution of steroid medications. To improve public knowledge of self-care and drug usage, the Thai Government and the Department of Public Health should educate people about the side effects of using drug combinations and steroids.
Abstract: Adhesively bonded joints are preferred over conventional methods of joining such as riveting, welding, bolting and soldering. Some of the main advantages of adhesive joints compared to conventional joints are the ability to join dissimilar and damage-sensitive materials, better stress distribution, weight reduction, fabrication of complicated shapes, excellent thermal and insulation properties, improved vibration response and damping control, smoother aerodynamic surfaces, and improved corrosion and fatigue resistance. This paper presents the behavior of adhesively bonded joints subjected to combined thermal loadings, using numerical methods. The joint configuration considers aluminum as the central adherend with six different outer adherends, including aluminum, steel, titanium, boron-epoxy, unidirectional graphite-epoxy and cross-ply graphite-epoxy, bonded with epoxy-based adhesives. Free expansion of the joint in the x direction was permitted, and stresses in the adhesive layer and at the interfaces were calculated for the different adherends.
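The adhesive stresses in such joints arise from the mismatch in thermal expansion between dissimilar adherends. A minimal sketch of the driving quantity, with illustrative coefficient values rather than the paper's material data:

```python
# Illustrative thermal mismatch strain between two adherends; the CTE
# values are typical handbook-style placeholders, not the paper's data.
alpha_aluminum = 23e-6         # 1/K, typical for aluminum
alpha_graphite_epoxy = 2e-6    # 1/K, assumed, fiber direction
delta_T = 100.0                # K, assumed temperature change

mismatch_strain = (alpha_aluminum - alpha_graphite_epoxy) * delta_T
print(f"free thermal mismatch strain: {mismatch_strain:.2e}")
```

The larger this mismatch strain, the larger the shear and peel stresses the adhesive layer must carry, which is why the choice of outer adherend matters.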
Abstract: Phishing, or the stealing of sensitive information on the web, has dealt a major blow to Internet security in recent times. Most of the existing anti-phishing solutions fail to handle the fuzziness involved in phish detection, leading to a large number of false positives. This fuzziness is attributed to the use of the highly flexible and, at the same time, highly ambiguous HTML language. We introduce a new perspective on phishing detection that tries to systematically determine whether a given page is a phishing copy, using the corresponding original page as the basis of comparison. It analyzes the layout of the pages under consideration to determine the percentage distortion between them, indicative of any form of malicious alteration. The system design represents an intelligent system employing dynamic assessment, which accurately identifies brand-new phishing attacks and should prove effective in reducing the number of false positives. This framework could potentially be used as a knowledge base for educating Internet users about phishing.
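As a hedged sketch of the layout-comparison idea (not the paper's actual algorithm), one can extract each page's tag sequence as a crude layout signature and score the distortion between them:

```python
from difflib import SequenceMatcher
from html.parser import HTMLParser

class TagExtractor(HTMLParser):
    """Collects the sequence of opening tags as a crude layout signature."""
    def __init__(self):
        super().__init__()
        self.tags = []
    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

def layout_distortion(original_html, suspect_html):
    """Percentage distortion between two pages' tag layouts: a simple
    stand-in for the paper's layout-comparison step."""
    sigs = []
    for html in (original_html, suspect_html):
        p = TagExtractor()
        p.feed(html)
        sigs.append(p.tags)
    similarity = SequenceMatcher(None, sigs[0], sigs[1]).ratio()
    return (1.0 - similarity) * 100.0

# An injected script raises the distortion score above a faithful copy.
print(layout_distortion(
    "<html><body><form><input></form></body></html>",
    "<html><body><form><input><script></script></form></body></html>"))
```

A legitimate mirror of the original yields a near-zero score, while injected scripts or altered forms raise it.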
Abstract: The zero-inflated strict arcsine model is a newly developed model that has been found appropriate for modeling overdispersed count data. In this study, we extend the zero-inflated strict arcsine model to a zero-inflated strict arcsine regression model by taking into consideration the extra variability caused by excess zeros and by covariates in the count data. Maximum likelihood estimation is used to estimate the parameters of this zero-inflated strict arcsine regression model.
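The zero-inflated likelihood structure can be sketched as follows. Because the strict arcsine pmf is not given in the abstract, a Poisson pmf stands in as a placeholder; substituting the strict arcsine pmf recovers the paper's model. The data are simulated purely for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

def zi_negloglik(params, y, X):
    """Negative log-likelihood of a zero-inflated count regression:
    P(Y=0) = omega + (1-omega) f(0; mu),  P(Y=y) = (1-omega) f(y; mu),
    with log link mu = exp(X beta). f is a placeholder Poisson pmf here."""
    k = X.shape[1]
    beta = params[:k]                         # regression coefficients
    omega = 1 / (1 + np.exp(-params[k]))      # zero-inflation probability
    mu = np.exp(X @ beta)
    p_zero = omega + (1 - omega) * poisson.pmf(0, mu)
    p_pos = (1 - omega) * poisson.pmf(y, mu)
    ll = np.where(y == 0, np.log(p_zero), np.log(p_pos))
    return -ll.sum()

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = rng.poisson(np.exp(0.5 + 0.3 * X[:, 1]))
y[rng.random(200) < 0.2] = 0                  # inject excess zeros

res = minimize(zi_negloglik, x0=np.zeros(3), args=(y, X))
print(res.x)   # [beta0, beta1, logit(omega)]
```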
Abstract: The term private equity usually refers to any type of equity investment in an asset in which the equity is not freely tradable on a public stock market. Some researchers believe that private equity contributed to the extent of the crisis and increased the pace of its spread over the world. We do not agree with this view. On the contrary, we argue that during an economic recession private equity may become an important source of funds for firms with special needs (e.g. firms seeking buyout financing, venture capital, expansion capital or distressed debt financing). However, over-regulation of private equity in both the European Union and the US can slow down this specific funding channel to the economy and deepen the credit crunch during global crises.
Abstract: Biofuels such as biobutanol are recognized as renewable and sustainable fuels that can be produced from lignocellulosic biomass. To convert lignocellulosic biomass to biofuel, pretreatment is an important step for removing hemicelluloses and lignin to improve enzymatic hydrolysis. Dilute acid pretreatment has been successfully developed for the pretreatment of corncobs, and the optimum conditions for dilute sulfuric and phosphoric acid pretreatment were obtained at 120 °C for 5 min with a 15:1 liquid-to-solid ratio and at 140 °C for 10 min with a 10:1 liquid-to-solid ratio, respectively. The results show that both acid pretreatments gave a total sugar content of approximately 34–35 g/L. Regarding inhibitor content (furfural), phosphoric acid pretreatment produced more than sulfuric acid pretreatment. Characterization of the corncobs after pretreatment indicates that both acid pretreatments can improve enzymatic accessibility, with better results for corncobs pretreated with sulfuric acid in terms of surface area, crystallinity, and composition analysis.
Abstract: So-called all-pass filter circuits are commonly used in the fields of signal processing, control and measurement. When connected to capacitive loads, these circuits tend to lose their stability; therefore, a thorough analysis of their dynamic behavior is necessary. The compensation methods intended to increase the stability of such circuits are discussed in this paper, with the so-called lead-lag compensation technique treated in detail. For the dynamic modeling, a two-port network model of the all-pass filter is derived. The results of the model analysis show that effective lead-lag compensation can be achieved solely by optimizing the circuit parameters; therefore, no additional electrical components are needed to fulfill the stability requirement.
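For readers who want to experiment, here is a hedged sketch of a first-order all-pass stage and a lead-lag compensator expressed as transfer functions. The corner frequency and time constants are assumed values, not taken from the paper.

```python
import numpy as np
from scipy import signal

# First-order all-pass: unity magnitude at all frequencies, phase shift
# around the (assumed) corner frequency w0.
w0 = 2 * np.pi * 1e3
allpass = signal.TransferFunction([1, -w0], [1, w0])   # (s - w0)/(s + w0)

# Lead-lag compensator (1 + s*T1)/(1 + s*T2); T1 > T2 gives phase lead,
# which is what restores the phase margin under capacitive loading.
T1, T2 = 1e-4, 1e-5                                    # assumed values
leadlag = signal.TransferFunction([T1, 1], [T2, 1])

w, mag, phase = signal.bode(leadlag, np.logspace(2, 7, 200))
print(f"max phase lead: {phase.max():.1f} deg")
```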
Abstract: This paper presents a computational methodology based on matrix operations for a computer-based solution to the problem of performance analysis of software reliability models (SRMs). A set of seven comparison criteria has been formulated to rank the various non-homogeneous Poisson process software reliability models proposed during the past 30 years for estimating software reliability measures such as the number of remaining faults, the software failure rate, and software reliability. Selection of the optimal SRM for use in a particular case has been an area of interest for researchers in the field of software reliability. Tools and techniques for software reliability model selection found in the literature cannot be used with a high level of confidence, as they use a limited number of model selection criteria. A real data set from a medium-sized software project taken from published papers is used to demonstrate the matrix method. The result of this study is a ranking of SRMs based on the permanent of the criteria matrix formed for each model from the comparison criteria. The software reliability model with the highest value of the permanent is ranked first, and so on.
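The ranking step reduces to computing a matrix permanent. A minimal sketch follows, with a hypothetical 3 x 3 criteria matrix, since the paper's seven-criteria matrices are not reproduced here.

```python
from itertools import permutations
import numpy as np

def permanent(M):
    """Permanent of a square matrix by direct expansion over all
    permutations; O(n!) but fine for small criteria matrices."""
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)])
               for p in permutations(range(n)))

# Hypothetical criteria matrix for one SRM (entries are illustrative
# normalized criterion scores, not values from the paper).
crit = np.array([[0.9, 0.2, 0.4],
                 [0.1, 0.8, 0.3],
                 [0.5, 0.6, 0.7]])
print(permanent(crit))   # models are ranked by this value, highest first
```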
Abstract: A state-of-the-art Speaker Identification (SI) system requires a robust feature extraction unit followed by a speaker modeling scheme for a generalized representation of these features. Over the years, Mel-Frequency Cepstral Coefficients (MFCC), modeled on the human auditory system, have been used as a standard acoustic feature set for SI applications. However, due to the structure of its filter bank, MFCC captures vocal tract characteristics more effectively in the lower frequency regions. This paper proposes a new set of features using a complementary filter bank structure which improves the distinguishability of speaker-specific cues present in the higher frequency zone. Unlike high-level features that are difficult to extract, the proposed feature set involves little computational burden during the extraction process. When combined with MFCC via a parallel implementation of speaker models, the proposed feature set outperforms baseline MFCC significantly. This proposition is validated by experiments conducted on two different kinds of public databases, namely YOHO (microphone speech) and POLYCOST (telephone speech), with Gaussian Mixture Models (GMM) as the classifier for various model orders.
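A hedged sketch of the baseline MFCC-GMM pipeline the paper builds on, using librosa and scikit-learn. The file paths are hypothetical, and the paper's complementary high-frequency filter bank (its actual contribution) is not reproduced; a second, parallel model over those features would be combined with this one.

```python
import librosa
from sklearn.mixture import GaussianMixture

def train_speaker_model(wav_path, n_components=32):
    """Fit a GMM to the MFCC frames of one speaker's enrollment speech."""
    y, sr = librosa.load(wav_path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).T   # frames x coeffs
    gmm = GaussianMixture(n_components=n_components, covariance_type="diag")
    return gmm.fit(mfcc)

def identify(test_wav, models):
    """Claim the speaker whose GMM gives the highest average
    log-likelihood over the test frames."""
    y, sr = librosa.load(test_wav, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).T
    return max(models, key=lambda spk: models[spk].score(mfcc))

# Usage with hypothetical files:
# models = {"alice": train_speaker_model("alice.wav"),
#           "bob":   train_speaker_model("bob.wav")}
# print(identify("unknown.wav", models))
```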
Abstract: The purpose of this study is to present a non-invasive method for evaluating marginal adaptation in class V composite restorations. Standardized class V cavities, prepared in extracted human teeth, were filled with Premise (Kerr) composite. The specimens were thermocycled. The interfaces were examined by optical coherence tomography (OCT) combined with confocal microscopy and fluorescence. The optical configuration uses two single-mode directional couplers with a superluminescent diode as the source at 1300 nm. The scanning procedure is similar to that used in any confocal microscope, where the fast scanning is en-face (line rate) and the depth scanning is much slower (at the frame rate). Gaps at the interfaces as well as inside the composite resin materials were identified. OCT has numerous advantages which justify its use in vivo as well as in vitro in comparison with conventional techniques.
Abstract: This paper presents a Geographic Information System (GIS) approach for qualifying and monitoring broadband lines efficiently. The interpolation methodology used is the Delaunay Triangular Irregular Network (TIN). The method is applied to a case study of an ISP in Greece monitoring 120,000 broadband lines.
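A hedged sketch of TIN interpolation with SciPy: LinearNDInterpolator builds a Delaunay triangulation internally and interpolates linearly within each triangle, which is exactly a TIN surface. The points and quality values below are synthetic stand-ins for the monitored lines.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

# Hypothetical measurement points: (x, y) line locations with a measured
# quality metric, standing in for the monitored broadband lines.
rng = np.random.default_rng(2)
points = rng.uniform(0, 10, size=(500, 2))
quality = np.sin(points[:, 0]) + np.cos(points[:, 1])

tin = LinearNDInterpolator(points, quality)
print(tin(5.0, 5.0))   # interpolated quality at an unmeasured location
```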
Abstract: In this paper, a new approach based on the extent of friendship between nodes is proposed, which makes the nodes cooperate in an ad hoc environment. The extended DSR protocol is tested under different scenarios by varying the number of malicious nodes and the node moving speed. It is also tested by varying the number of nodes used in the simulation. The results indicate that the throughput achieved by the extended DSR is greater than that of the standard DSR, and that the percentage of malicious drops over total drops is lower for the extended DSR than for the standard DSR.
Abstract: Response surface methodology (RSM) is a collection of mathematical and statistical techniques useful in the modeling and analysis of problems in which the dependent variable is influenced by several independent variables, with the goal of determining the conditions under which these variables should operate to optimize a production process. RSM estimates a first-order regression model and sets the search direction using the method of maximum/minimum slope up/down (MMS U/D). However, this method selects the step size intuitively, which can affect the efficiency of the RSM. This paper assesses how the step size affects the efficiency of this methodology. The numerical examples are carried out through Monte Carlo experiments, evaluating three response variables: the gain in the objective function, the distance to the optimum, and the number of iterations. The simulation experiments showed that the gain in the objective function and the distance to the optimum were not affected by the step size, while the number of iterations was affected by both the step size and the type of test function used.
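A minimal sketch of the step under study: fit the first-order model over the current design and displace the design center along the gradient by the chosen step size. The factorial design and responses below are illustrative, not the paper's experiments.

```python
import numpy as np

def steepest_ascent_step(X, y, step_size):
    """One RSM iteration: least-squares fit of the first-order model
    y = b0 + b1*x1 + ... + bk*xk, then a move along the gradient scaled
    by the step size (the quantity whose choice the paper studies)."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    grad = coef[1:]                        # first-order effects
    direction = grad / np.linalg.norm(grad)
    return step_size * direction           # displacement of design center

# Hypothetical 2^2 factorial design around the current operating point.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1]], dtype=float)
y = np.array([10.0, 14.0, 11.0, 15.0])     # simulated responses
print(steepest_ascent_step(X, y, step_size=0.5))
```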
Abstract: The aim of this paper is to characterize a larger set of wavelet functions for implementation in a still image compression system using the SPIHT algorithm. The paper discusses important features of the wavelet functions and filters used in subband coding to convert an image into wavelet coefficients in MATLAB. Image quality is measured objectively using the peak signal-to-noise ratio (PSNR) and its variation with bit rate (bpp). The effect of different parameters is studied on different wavelet functions. Our results provide a good reference for application designers of wavelet-based coders.
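The objective quality measure is standard; a minimal PSNR sketch for 8-bit images follows (the image here is synthetic noise, purely for illustration).

```python
import numpy as np

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB, the objective measure used to
    compare wavelet functions at a given bit rate."""
    err = original.astype(float) - reconstructed.astype(float)
    mse = np.mean(err ** 2)
    return 10 * np.log10(peak ** 2 / mse)

# Toy example: an 8-bit image with additive noise as the "coding error".
img = np.random.default_rng(3).integers(0, 256, size=(64, 64))
noisy = np.clip(img + np.random.default_rng(4).normal(0, 5, img.shape), 0, 255)
print(f"{psnr(img, noisy):.2f} dB")
```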
Abstract: Independent component analysis (ICA) is a computational method for finding underlying signals or components in multivariate statistical data. The ICA method has been successfully applied in many fields, e.g. vision research, brain imaging, geological signals and telecommunications. In this paper, we apply the ICA method to the analysis of mass spectra of oligomeric species emerging from aluminium sulphate. Mass spectra are typically complex, because they are linear combinations of spectra from different types of oligomeric species. The results show that ICA can decompose the spectra into components that yield useful information. This information is essential in developing the coagulation phases of water treatment processes.
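A hedged sketch of the unmixing step with scikit-learn's FastICA. The two Gaussian-shaped "spectra" and the mixing matrix are synthetic stand-ins for the aluminium sulphate species data.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Hypothetical mixed spectra: each observed spectrum is a linear
# combination of two underlying oligomer spectra.
rng = np.random.default_rng(5)
channels = np.arange(200)
s1 = np.exp(-0.5 * ((channels - 60) / 5.0) ** 2)    # species 1 peak
s2 = np.exp(-0.5 * ((channels - 130) / 8.0) ** 2)   # species 2 peak
S = np.column_stack([s1, s2])
A = rng.uniform(0.2, 1.0, size=(5, 2))              # unknown mixing
X = S @ A.T + 0.01 * rng.normal(size=(200, 5))      # 5 observed spectra

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X)    # estimated source spectra, per column
print(recovered.shape)              # (200, 2)
```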
Abstract: The Block Sorting problem is to sort a given permutation by moving blocks. A block is defined as a substring of the given permutation which is also a substring of the identity permutation. Block Sorting has been proved to be NP-hard. To date, two different 2-approximation algorithms have been presented for block sorting; these are the best known algorithms for Block Sorting. In this work we present a different characterization of Block Sorting in terms of a transposition cycle graph. We then suggest a heuristic, which we show to exhibit a 2-approximation performance guarantee for most permutations.
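As a concrete handle on the basic object, the sketch below decomposes a permutation into its maximal blocks (runs of consecutive values); the paper's cycle-graph characterization and heuristic are not reproduced here.

```python
def blocks(perm):
    """Split a permutation into maximal blocks: substrings that are also
    substrings of the identity permutation, i.e. runs of consecutive
    values. These are the units a block-sorting move relocates."""
    out, start = [], 0
    for i in range(1, len(perm)):
        if perm[i] != perm[i - 1] + 1:   # run of consecutive values ends
            out.append(perm[start:i])
            start = i
    out.append(perm[start:])
    return out

print(blocks([3, 4, 1, 2, 5]))   # [[3, 4], [1, 2], [5]]
```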