Abstract: We propose an enhanced collaborative filtering method using Hofstede's cultural dimensions, calculated for 111 countries. We employ four of these dimensions, which are correlated with customers' buying behavior, to detect users' preferences for items. In addition, several advantages of this method are demonstrated for data sparseness and cold-start users, which are important challenges in collaborative filtering. We present experiments using a real dataset, the Book-Crossing dataset. Experimental results show that the proposed algorithm provides significant advantages in terms of recommendation quality.
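As an illustration of the idea, here is a minimal sketch of how a country-level similarity over four Hofstede dimensions (power distance, individualism, masculinity, uncertainty avoidance) might be computed and used as a weight in a collaborative-filtering neighbourhood; the distance measure and the sample scores are assumptions, not the paper's exact formulation:

```python
# Illustrative sketch (not the authors' exact method): similarity between two
# countries from four Hofstede dimension scores, usable as a weight when
# selecting culturally similar neighbours in collaborative filtering.
def cultural_similarity(a, b):
    """Inverse normalized Euclidean distance over the four dimensions, in (0, 1]."""
    dist = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    max_dist = (4 * 100 ** 2) ** 0.5  # dimensions scored on a 0-100 scale
    return 1.0 - dist / max_dist

us = (40, 91, 62, 46)  # illustrative scores (PDI, IDV, MAS, UAI)
jp = (54, 46, 95, 92)  # illustrative scores for a second country
print(round(cultural_similarity(us, jp), 3))  # -> 0.632
```

Identical profiles give similarity 1.0; maximally distant profiles approach 0, so the value can multiply a conventional rating-based similarity for cold-start users.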
Abstract: The purpose of this paper is to develop models for predicting student success. These models could improve the allocation of students among colleges and optimize the newly introduced model of government subsidies for higher education. To collect data, an anonymous survey was carried out among last-year undergraduate students using random sampling. Decision trees were created, of which the two most successful at predicting student success were chosen, based on two criteria: Grade Point Average (GPA) and the time a student needs to finish the undergraduate program (time-to-degree). Decision trees have been shown to be a good method for classifying student success, and they could be improved further by increasing the survey sample and developing specialized decision trees for each type of college. These methods have great potential for use in decision support systems.
Abstract: In this paper, a fuzzy algorithm and a fuzzy multicriteria decision framework are developed and applied to the practical question of optimizing biofuels policy making. The methodological framework shows how to incorporate fuzzy set theory into the process of finding a sustainable biofuels policy among several policy options. Fuzzy set theory is used here as a tool to deal with the uncertainties of the decision environment, the vagueness and ambiguity of policy objectives, the subjectivity of human assessments, and imprecise and incomplete information about the evaluated policy instruments.
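One basic ingredient of such a framework can be sketched concretely: a triangular membership function of the kind commonly used to grade vague objectives on a 0-1 scale. The breakpoints below are hypothetical; the authors' actual membership functions are not specified here.

```python
# Illustrative fuzzy-set sketch (not the authors' framework): a triangular
# membership function mapping a crisp score to a degree of membership.
def triangular(x, a, b, c):
    """Membership rising linearly from a to the peak b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

print(triangular(5.0, 0.0, 5.0, 10.0))  # full membership at the peak -> 1.0
print(triangular(2.5, 0.0, 5.0, 10.0))  # halfway up the rising edge -> 0.5
```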
Abstract: Over the last two decades, owing to the hostile environment of the internet, concerns about the confidentiality of information have grown at a phenomenal rate. To safeguard information from attacks, a number of data/information hiding methods have evolved, mostly in the spatial and transform domains. In spatial-domain data hiding techniques, the information is embedded directly in the image plane itself. In transform-domain techniques, the image is first converted from the spatial domain to some other domain and the secret information is then embedded, so that it remains more secure against attack. Information hiding algorithms in the time or spatial domain have high capacity but relatively low robustness; in contrast, algorithms in transform domains such as the DCT and DWT have a certain robustness against some multimedia processing. In this work the authors propose a novel steganographic method for hiding information in the transform domain of a grayscale image. The proposed approach converts the gray-level image to the transform domain using a discrete integer wavelet transform implemented through the lifting scheme. It performs a 2-D lifting wavelet decomposition of the cover image with the lifted Haar wavelet and computes the approximation coefficient matrix CA and the detail coefficient matrices CH, CV, and CD. The next step is to apply the pixel mapping method (PMM) to those coefficients to form the stego image. The aim of this paper is to propose a high-capacity image steganography technique that uses the pixel mapping method in the integer wavelet domain with acceptable levels of imperceptibility and distortion in the cover image and a high level of overall security. The solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
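The decomposition step can be sketched as follows. This is a generic integer Haar lifting (the S-transform), not the authors' code, and it assumes even image dimensions; the CH/CV orientation naming follows one common convention and may differ from the paper's.

```python
# Generic integer Haar wavelet via lifting: d = b - a (predict),
# s = a + (d >> 1) (update). Exactly invertible on integers, which is why
# it suits integer-wavelet steganography.
def haar_lift_1d(x):
    """One lifting level; returns (approximation, detail) lists."""
    s, d = [], []
    for i in range(0, len(x), 2):
        a, b = x[i], x[i + 1]
        detail = b - a
        s.append(a + (detail >> 1))  # floor average
        d.append(detail)
    return s, d

def haar_lift_2d(img):
    """One 2-D level (rows, then columns): returns CA, CH, CV, CD."""
    rows = [haar_lift_1d(r) for r in img]
    low = [r[0] for r in rows]   # row approximations
    high = [r[1] for r in rows]  # row details

    def cols(mat):
        a, d = [], []
        for j in range(len(mat[0])):
            ca, cd = haar_lift_1d([mat[i][j] for i in range(len(mat))])
            a.append(ca)
            d.append(cd)
        # transpose back to row-major order
        return [list(t) for t in zip(*a)], [list(t) for t in zip(*d)]

    CA, CH = cols(low)   # approximation and one detail band
    CV, CD = cols(high)  # remaining detail bands
    return CA, CH, CV, CD

CA, CH, CV, CD = haar_lift_2d([[0, 1, 2, 3], [4, 5, 6, 7],
                               [8, 9, 10, 11], [12, 13, 14, 15]])
print(CA)  # -> [[2, 4], [10, 12]]
```

A PMM-style embedder would then modify selected coefficients in CH, CV, and CD before inverting the lifting steps to produce the stego image.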
Abstract: Recent advances in both the testing and verification of software based on formal specifications of the system to be built have reached a point where the ideas can be applied powerfully in the design of agent-based systems. Software engineering research has highlighted a number of important issues: the importance of the type of modeling technique used; the careful design of the model to enable powerful testing techniques to be used; the automated verification of the behavioural properties of the system; and the need for a mechanism to translate the formal models into executable software in a simple and transparent way. This paper introduces the X-machine formalism as a tool for modeling biology-inspired agents, proposing the use of the techniques built around X-machine models for the construction of effective and reliable agent-based software systems.
Abstract: ZnO nanocrystals with a mean diameter of 14 nm were prepared by a precipitation method and examined as a photocatalyst for the UV-induced degradation of the insecticide diazinon, taken as a representative organic pollutant in aqueous solution. The effects of various parameters, such as illumination time, amount of photocatalyst, initial pH, and initial insecticide concentration, on the photocatalytic degradation of diazinon were investigated to find the desired conditions. These conditions were also tested for the treatment of real water containing the insecticide. The photodegradation efficiency of diazinon was compared between commercial and prepared ZnO nanocrystals. The results indicated that the UV/ZnO process using the prepared nanocrystalline ZnO offered better electrical energy efficiency and quantum yield than commercial ZnO. On the basis of the Langmuir-Hinshelwood mechanism, the present study established a pseudo-first-order kinetic model with a surface reaction rate constant of 0.209 mg l^-1 min^-1 and an adsorption equilibrium constant of 0.124 l mg^-1.
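Using the two constants reported above, the Langmuir-Hinshelwood rate expression can be evaluated directly; the initial concentration in the example below is hypothetical.

```python
# Langmuir-Hinshelwood rate with the constants reported in the abstract:
# k_r = 0.209 mg l^-1 min^-1 (surface reaction), K = 0.124 l mg^-1 (adsorption).
def lh_rate(c, k_r=0.209, K=0.124):
    """Degradation rate r = k_r * K * C / (1 + K * C), in mg l^-1 min^-1."""
    return k_r * K * c / (1.0 + K * c)

# At low concentration (K*C << 1) this reduces to pseudo-first-order kinetics
# with apparent rate constant k_app = k_r * K.
k_app = 0.209 * 0.124
print(round(k_app, 4))          # -> 0.0259 min^-1
print(round(lh_rate(10.0), 4))  # rate at a hypothetical C0 = 10 mg/l
```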
Abstract: This paper proposes new enhancement models for nonlinear anisotropic diffusion methods that greatly reduce speckle while preserving image features in medical ultrasound images. By incorporating a local physical characteristic of the image, in this case scatterer density, in addition to the gradient, into existing tensor-based image diffusion methods, we were able to greatly improve the performance of the existing filtering methods, namely edge-enhancing (EE) and coherence-enhancing (CE) diffusion. The new enhancement methods were tested on various ultrasound images, including phantom and clinical images, to determine the amount of speckle reduction and of edge and coherence enhancement. Scatterer density weighted nonlinear anisotropic diffusion (SDWNAD) for ultrasound images consistently outperformed its traditional tensor-based counterparts, which use the gradient alone to weight the diffusivity function. SDWNAD is shown to greatly reduce speckle noise while preserving image features such as edges, orientation coherence, and scatterer density. SDWNAD's superior performance over nonlinear coherent diffusion (NCD), speckle reducing anisotropic diffusion (SRAD), the adaptive weighted median filter (AWMF), wavelet shrinkage (WS), and wavelet shrinkage with contrast enhancement (WSCE) makes it an ideal preprocessing step for automatic segmentation in ultrasound imaging.
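For context, the gradient-only weighting that SDWNAD extends can be illustrated with the classic Perona-Malik diffusivity; this is a generic sketch, the edge threshold k is a hypothetical value, and SDWNAD additionally weights by scatterer density.

```python
# Generic Perona-Malik edge-stopping function (not the paper's tensor scheme):
# close to 1 in flat, speckle-dominated regions, close to 0 across strong edges,
# so diffusion smooths speckle but halts at boundaries.
def diffusivity(grad_mag, k=10.0):
    """g(|grad I|) = 1 / (1 + (|grad I| / k)^2); k is a hypothetical threshold."""
    return 1.0 / (1.0 + (grad_mag / k) ** 2)

print(diffusivity(0.0))             # homogeneous region: full smoothing
print(round(diffusivity(30.0), 2))  # strong edge: diffusion nearly stopped
```

Schemes like SDWNAD replace the scalar argument with a combination of gradient and local scatterer-density estimates.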
Abstract: A new approach to power transformer protection is presented using a time-frequency transform known as the wavelet transform. Currents from different operating conditions, such as inrush, normal load, external fault, and internal fault, are sampled and processed to obtain wavelet coefficients; the different operating conditions produce variation in these coefficients. Features such as energy and standard deviation are calculated using Parseval's theorem and used as inputs to a probabilistic neural network (PNN) for fault classification. The proposed algorithm provides accurate results even in the presence of noisy inputs and correctly distinguishes inrush from fault currents. The overall classification accuracy of the proposed method is 96.45%. Faults (with and without noise) were simulated using MATLAB and Simulink, taking a two-cycle data window (40 ms) containing 800 samples. The algorithm was evaluated using 10% Gaussian white noise.
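The two features named above can be sketched in a few lines; this is an illustrative computation on a made-up coefficient list, not the authors' MATLAB code.

```python
# Illustrative feature extraction for the PNN inputs: energy as the sum of
# squared wavelet coefficients (Parseval's relation preserves signal energy
# across the transform) and the standard deviation of the coefficients.
import math

def features(coeffs):
    energy = sum(c * c for c in coeffs)
    mean = sum(coeffs) / len(coeffs)
    std = math.sqrt(sum((c - mean) ** 2 for c in coeffs) / len(coeffs))
    return energy, std

e, s = features([1.0, -2.0, 3.0, -2.0])  # hypothetical detail coefficients
print(e, round(s, 4))  # -> 18.0 2.1213
```

One such (energy, std) pair per decomposition band forms the feature vector fed to the classifier.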
Abstract: Automatic segmentation of skin lesions is the first step
towards the automated analysis of malignant melanoma. Although
numerous segmentation methods have been developed, few studies
have focused on determining the most effective color space for
melanoma application. This paper proposes an automatic segmentation
algorithm based on color space analysis and clustering-based histogram
thresholding, a process which is able to determine the optimal
color channel for detecting the borders in dermoscopy images. The
algorithm is tested on a set of 30 high resolution dermoscopy images.
A comprehensive evaluation of the results is provided, in which borders manually drawn by four dermatologists are compared to the automated borders detected by the proposed algorithm, applying three previously used metrics of accuracy, sensitivity, and specificity and a new metric of similarity. By performing ROC analysis and ranking the metrics, it is demonstrated that the best results are obtained with the X and XoYoR color channels, resulting in an accuracy of approximately 97%. The proposed method is also compared with two state-of-the-art skin lesion segmentation methods.
Abstract: In this paper a novel approach for generalized image retrieval based on semantic content is presented. A combination of three feature extraction methods is used: color, texture, and the edge histogram descriptor. Provision is made to add new features in the future for better retrieval efficiency, and any combination of these methods that is most appropriate for the application can be used for retrieval; this is provided through the user interface (UI) in the form of relevance feedback. The image properties are analyzed using computer vision and image processing algorithms: for color, the image histogram is computed; for texture, co-occurrence-matrix-based entropy, energy, and related features are calculated; and for edge density, the Edge Histogram Descriptor (EHD) is found. For the retrieval of images, a novel idea based on a greedy strategy is developed to reduce the computational complexity. The entire system was developed using AForge.Imaging (an open source product), MATLAB .NET Builder, C#, and Oracle 10g. The system was tested with the Corel image database containing 1000 natural images and achieved good results.
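The co-occurrence-based texture features mentioned above reduce to simple sums over a normalized gray-level co-occurrence matrix (GLCM); here is a sketch with a hypothetical 2x2 matrix.

```python
# Illustrative GLCM texture features (not the system's actual code): energy
# (angular second moment) and entropy over normalized co-occurrence
# probabilities that sum to 1.
import math

def glcm_features(p):
    flat = [v for row in p for v in row]
    energy = sum(v * v for v in flat)
    entropy = -sum(v * math.log2(v) for v in flat if v > 0)
    return energy, entropy

p = [[0.25, 0.25], [0.25, 0.25]]  # maximally disordered hypothetical GLCM
energy, entropy = glcm_features(p)
print(energy, entropy)  # -> 0.25 2.0 (low energy, high entropy)
```

A uniform GLCM like this one gives minimal energy and maximal entropy; a concentrated GLCM would give the opposite, which is what makes the pair discriminative for texture.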
Abstract: It was determined that woody biomass and livestock excreta can be utilized as hydrogen resources and that hydrogen produced from such sources can be used to fill fuel cell vehicles (FCVs) at hydrogen stations. It was shown that the biomass transport costs for hydrogen production may be lower than those for co-generation. In the Tokyo Metropolitan Area, there are only a few sites capable of producing hydrogen from woody biomass in amounts greater than 200 m3/h, the scale required for a hydrogen station to be operationally practical. In the case of livestock excreta, however, it was shown that 15% of the municipalities in this area are capable of securing sufficient biomass for practical hydrogen production. The differences in feasibility of practical operation depend on the type of biomass.
Abstract: Under a limited investment budget, a utility company is required to maximize the utilization of its existing assets over their life cycle while satisfying both engineering and financial requirements. However, utilities often lack knowledge about the status of each asset in the portfolio, in terms of both technical and financial value. This paper presents a knowledge-based model that helps utility companies make optimal decisions on power transformer utilization. CommonKADS, a structured methodology for representing knowledge and expertise, is used to design and develop the knowledge-based model. A case study of a 1 MVA power transformer of the Nepal Electricity Authority is presented. The results show that reusable knowledge can be categorized, modeled, and utilized within a utility company using the proposed methodology, and that a utility company can achieve both engineering and financial benefits from doing so.
Abstract: The objective of the paper is to develop a forecast model for HW flows. The research methodology included six modules: historical data, assumptions, choice of indicators, data processing, data analysis with STATGRAPHICS, and forecast models. The proposed methodology was validated in a case study for Latvia. Hypotheses on the changes in HW for the period 2010-2020 were developed and described mathematically at confidence levels of 95.0% and 50.0%. A sensitivity analysis of the analyzed scenarios was performed. The results show that GDP growth affects the total amount of HW in the country. At the 50.0% confidence level, the total amount of HW for 2010-2020 is projected to lie in a corridor from -27.7% in the optimistic scenario up to +87.8% in the pessimistic scenario. The optimistic scenario proved the least sensitive to changes in GDP growth.
Abstract: Among the numerous economic evaluation techniques currently available, Multi-criteria Spatial Analysis lends itself to solving localization problems of property complexes and, in particular, production plants. The methodology involves the use of Geographical Information Systems (GIS) and the mapping overlay technique, which overlaps the different information layers of a territory in order to obtain an overview of the parameters that characterize it. This first phase is used to detect possible settlement surfaces of a new agglomeration, subsequently selected through Analytic Hierarchy Process (AHP), so as to choose the best alternative. The result ensures the synthesis of a multidimensional profile that expresses both the quantitative and qualitative effects. Each criterion can be given a different weight.
Abstract: In this paper two mathematical models for localizing accidental gas escapes in pipelines are suggested. The first model was created for leak localization in a horizontal branched pipeline, and the second for leak detection in an inclined section of a main gas pipeline. The leak localization algorithm for the branched pipeline does not require knowledge of the initial hydraulic parameters at the entrance and end points of each pipeline section. Special functions and equations were constructed for detecting the damaged section and then localizing the leak within it. Results of calculations for compound pipelines with two, four, and five sections are presented. A method and formula for leak localization in a simple inclined section of a main gas pipeline are also suggested, together with results of numerical calculations locating the gas escape for the inclined pipeline.
Abstract: Motion capture technology has been in use for quite a while, and considerable research has been done in this area. Nevertheless, we discovered open issues within current motion capture environments. In this paper we provide a state-of-the-art overview of the relevant research areas and describe issues with current motion capture environments. Observations, interviews, and questionnaires were used to reveal the challenges actors currently face in a motion capture environment. Furthermore, as a potential solution, we introduce the idea of creating a more immersive motion capture environment to improve acting performances and motion capture outcomes. The goal is to explain the open issues we found and the ideas we developed, which shall serve as a basis for further research. Moreover, a methodology for addressing the interaction and systems design issues is proposed. A future outcome could be that motion capture actors are able to perform more naturally, especially when using a non-body-worn solution.
Abstract: A low bit rate still image compression scheme by
compressing the indices of Vector Quantization (VQ) and generating
residual codebook is proposed. The indices of VQ are compressed by
exploiting correlation among image blocks, which reduces the bit per
index. A residual codebook similar to VQ codebook is generated that
represents the distortion produced in VQ. Using this residual
codebook the distortion in the reconstructed image is removed,
thereby increasing the image quality. Our scheme combines these two methods. Experimental results on the standard Lena image show that our scheme can produce a reconstructed image with a PSNR of 31.6 dB at 0.396 bits per pixel. Our scheme is also faster than existing VQ variants.
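The residual-codebook idea can be illustrated with a toy scalar example; the codebooks and values below are hand-picked, and real VQ operates on image blocks rather than single samples.

```python
# Toy sketch of the residual-codebook idea: VQ quantizes each sample, then a
# second (residual) codebook corrects the distortion VQ left behind.
def nearest(codebook, x):
    return min(range(len(codebook)), key=lambda i: abs(codebook[i] - x))

vq_codebook = [10.0, 20.0, 30.0]   # hypothetical VQ codewords
res_codebook = [-2.0, 0.0, 2.0]    # hypothetical quantized residuals

def encode(x):
    i = nearest(vq_codebook, x)
    r = x - vq_codebook[i]         # distortion left by VQ
    return i, nearest(res_codebook, r)

def decode(i, j):
    return vq_codebook[i] + res_codebook[j]

x = 22.3
print(decode(*encode(x)))  # -> 22.0, closer to x than plain VQ's 20.0
```

The extra index per block costs few bits (here log2(3) bits) while measurably reducing reconstruction error, which is the trade-off the scheme exploits.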
Abstract: When the results of the total element concentrations using USEPA method 3051A are compared to the sequential extraction analyses (i.e., the sum of fractions BCR1, BCR2 and BCR3), the recovery values of the elements vary between 56.8% and 69.4% in the bottom ash, and between 11.3% and 70.9% in the fly ash. This indicates that most of the elements in the ashes do not occur in readily soluble forms.
Abstract: In this paper the problem of face recognition under variable illumination conditions is considered. Most works in the literature exhibit good performance under strictly controlled acquisition conditions, but performance drops drastically when changes in pose and illumination occur, so a number of approaches have recently been proposed to deal with such variability. The aim of this work is to introduce an efficient local appearance feature extraction method for face recognition based on the steerable pyramid (SP). Local information is extracted from the SP sub-bands using the Local Binary Pattern (LBP). The underlying statistics allow us to reduce the amount of data that must be stored. Experiments carried out on different face databases confirm the effectiveness of the proposed approach.
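The LBP operator used to summarize the SP sub-bands can be sketched minimally; this is a generic 8-neighbour LBP on a single 3x3 patch with made-up values, not the paper's full SP+LBP pipeline.

```python
# Generic LBP sketch: threshold the 8 neighbours of a pixel against its value
# and read the result off as an 8-bit code; histograms of these codes over a
# region form the local texture descriptor.
def lbp_code(patch):
    """patch: 3x3 grayscale values; returns the LBP code of the centre pixel."""
    c = patch[1][1]
    # neighbours taken clockwise starting from the top-left corner
    neigh = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
             patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    code = 0
    for bit, v in enumerate(neigh):
        if v >= c:
            code |= 1 << bit
    return code

print(lbp_code([[6, 5, 2],
                [7, 6, 1],
                [9, 8, 7]]))
```

Because the code depends only on the sign of local differences, it is largely invariant to monotonic illumination changes, which is what makes it attractive under variable lighting.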
Abstract: Implicit equations play a crucial role in Engineering.
Based on this importance, several techniques have been applied to
solve this particular class of equations. When it comes to practical
applications, in general, iterative procedures are taken into account.
On the other hand, with the improvement of computers, other
numerical methods have been developed to provide a more
straightforward methodology of solution. Exact analytical approaches seem to have been continually neglected owing to the difficulty inherent in their application; nevertheless, they are indispensable for validating numerical routines. Lagrange's Inversion Theorem is a simple mathematical tool which has proved to be widely applicable to engineering problems. In short, it provides the solution to implicit equations by means of an infinite series. To show the validity of this method, the three-parameter infiltration equation is, for the first time, solved analytically and exactly. After manipulation of these series, closed-form solutions are presented in terms of H-functions.
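For reference, the theorem in its standard textbook form (not quoted from the paper): if f is analytic at a with f(a) = b and f'(a) != 0, the inverse g = f^{-1} admits the series

```latex
g(z) = a + \sum_{n=1}^{\infty} \frac{(z-b)^{n}}{n!}
       \lim_{w \to a} \frac{d^{\,n-1}}{dw^{\,n-1}}
       \left[ \left( \frac{w-a}{f(w)-b} \right)^{n} \right]
```

Applied to an implicit equation f(w) = z, each series coefficient is obtained by differentiation alone, which is what yields the explicit infinite-series solutions the abstract refers to.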