Abstract: The one-class support vector machine “support vector
data description” (SVDD) is an ideal approach for anomaly or outlier
detection. However, ease of use is crucial for the applicability of
SVDD in real-world applications. The results of SVDD are strongly
determined by the choice of the regularisation parameter C
and the kernel parameter of the widely used RBF kernel. While for
two-class SVMs the parameters can be tuned using cross-validation
based on the confusion matrix, for a one-class SVM this is not
possible, because only true positives and false negatives can occur
during training. This paper proposes an approach to find the optimal
set of parameters for SVDD solely based on a training set from
one class and without any user parameterisation. Results on artificial
and real data sets are presented, underpinning the usefulness of the
approach.
Abstract: Small cracks or chips appear very frequently in products
in the course of continuous production on an automatic press
process system. These defects not only yield defective products but
also damage the press mold. To solve this problem, an acoustic
emission (AE) system was introduced, which was expected to be very
effective for real-time detection of defective products and for
prevention of damage to the press molds.
In this study, to pick up and analyse the AE signals generated by the
press process, AE sensors, a pre-amplifier, and an analysis and
processing board were used, as frequently found in other similar
cases. For analysing and processing the AE signals picked up in real
time from good and bad products, specialised software called cdm8
was used. As a result of this work, it was confirmed that the intensity
and shape of the various AE signals differ depending on the weight
and thickness of the metal sheet and the process type.
Abstract: The authors report a case of swine urolithiasis caused
by improper administration of sulfamonomethoxine, diagnosed by
examination of urinary sediments and analysis of the composition of
the uroliths. The chemical composition of urinary calculi obtained
from pigs affected with urolithiasis was further confirmed as
sulfamonomethoxine by Fourier transform infrared spectroscopy
(FTIR). It is suggested that the appearance of typical fan-like or
wheat-sheaf-like crystals in urinary sediments under light-microscope
observation, together with FTIR determination of the crystals, is
helpful in diagnosing sulfa-calculus-caused swine urolithiasis.
Abstract: In networks, mainly small and medium-sized businesses benefit from the knowledge, experience and solutions offered by experts from industry and science, or from the exchange with practitioners. Associations which focus, among other things, on networking, information and knowledge transfer, and which are interested in supporting such cooperation, are especially well suited to provide such networks and the appropriate web platforms. Using METORA as an example, a project developed and run by the Federal Association for Information Economy, Telecommunications and New Media e.V. (BITKOM) for the Federal Ministry of Economics and Technology (BMWi), this paper discusses how associations and other network organizations can achieve this task and what conditions they have to consider.
Abstract: Bovine viral diarrhea virus (BVDV) can cause lifelong
persistent infection. One reason for this phenomenon is BVDV
infection of placental tissue. However, the mechanism by which
BVDV invades placental tissue remains unclear. To clarify the
molecular mechanism, we investigated the possible route by which
BVDV enters bovine trophoblast cells (TPC). A yeast two-hybrid
system was used to identify proteins extracted from TPC that
interact with the BVDV envelope glycoprotein E2. A pGBKT7-E2 yeast
expression vector and a TPC cDNA library were constructed. Through
two rounds of screening, three positive clones were identified.
Sequencing analysis indicated that all three positive clones
encoded the same protein, clathrin. The physical interaction between
clathrin and the BVDV E2 protein was further confirmed by
coimmunoprecipitation experiments. This result suggests that
clathrin might play a critical role in the process of BVDV entry into
placental tissue and might be a novel antiviral target for preventing
BVDV infection.
Abstract: Long-term rainfall analysis and prediction is a
challenging task, especially in the modern world where the impact of
global warming is creating complications in environmental issues.
These data-intensive tasks require high-performance
computational modeling for accurate prediction. This research paper
describes a prototype designed and developed in a grid
environment using a number of coupled software infrastructure
building blocks. This grid-enabled system provides the demanded
computational power, efficiency, resources, a user-friendly interface,
secure job submission and high throughput. The results obtained
from sequential execution and grid-enabled execution show that
computational performance improved by between 36% and 75% for
a decade of climate parameters. The large variation in performance
can be attributed to the varying degree of computational resources
available for job execution.
Grid computing enables the dynamic runtime selection, sharing
and aggregation of distributed and autonomous resources, which
plays an important role not only in business but also in scientific
and social applications. This research paper attempts to
explore grid-enabled computing capabilities on weather indices
from HOAPS data for climate impact modeling and change
detection.
Abstract: In the automotive industry, test drives are conducted
during the development of new vehicle models or as part of the
quality assurance of series-production vehicles. The communication
on the in-vehicle network, data from external sensors, and internal
data from the electronic control units are recorded by automotive
data loggers during the test drives. The recordings are used for fault
analysis. Since the resulting data volume is tremendous, manually
analysing each recording in great detail is not feasible.
This paper proposes to use machine learning to support domain
experts by sparing them from contemplating irrelevant data and
instead pointing them to the relevant parts of the recordings. The
underlying idea is to learn the normal behaviour from available
recordings, i.e. a training set, and then to autonomously detect
unexpected deviations and report them as anomalies.
The one-class support vector machine “support vector data description”
(SVDD) is utilised to calculate distances of feature vectors. SVDDSUBSEQ
is proposed as a novel approach that allows subsequences
in multivariate time series data to be classified. The approach
detects unexpected faults without modelling effort, as is shown by
experimental results on recordings from test drives.
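As a rough illustration of the idea (not the authors' SVDDSUBSEQ method): the sketch below cuts a multivariate series into sliding windows and scores each test window by its squared distance to the kernel mean of the training windows, a crude stand-in for the SVDD centre that avoids solving the SVDD quadratic program. The signal shapes, window width, gamma value and threshold rule are all invented for illustration.

```python
import numpy as np

def sliding_windows(series, width, step=1):
    """Cut a multivariate time series (T x d) into flattened subsequences."""
    return np.array([series[i:i + width].ravel()
                     for i in range(0, len(series) - width + 1, step)])

def rbf(a, b, gamma):
    """RBF kernel matrix between the rows of a and the rows of b."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_center_distances(train_w, test_w, gamma):
    """Squared distance of each test window to the kernel mean of training windows."""
    n = len(train_w)
    k_xx = rbf(train_w, train_w, gamma).sum() / n ** 2    # constant centre term
    k_zx = rbf(test_w, train_w, gamma).sum(axis=1) / n    # cross term
    return 1.0 - 2.0 * k_zx + k_xx                        # k(z,z) = 1 for RBF

# Toy example: training = smooth periodic signals, test contains a spike anomaly.
rng = np.random.default_rng(0)
t = np.linspace(0, 8 * np.pi, 400)
train = np.column_stack([np.sin(t), np.cos(t)]) + 0.05 * rng.standard_normal((400, 2))
test = np.column_stack([np.sin(t), np.cos(t)]) + 0.05 * rng.standard_normal((400, 2))
test[200:205] += 3.0                                      # inject an anomaly

tr_w, te_w = sliding_windows(train, 20), sliding_windows(test, 20)
dist = kernel_center_distances(tr_w, te_w, gamma=0.1)
threshold = np.quantile(kernel_center_distances(tr_w, tr_w, gamma=0.1), 0.99)
anomalous = np.where(dist > threshold)[0]                 # flagged window starts
```

Windows overlapping the injected spike receive a clearly larger distance than normal windows, so thresholding on the training distances suffices to point at the anomalous region.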
Abstract: The purpose of this article is to analyze the economic and
political tendencies in the development of integration processes of
different levels and speeds in the Eurasian space, by considering two
organizations in the region, the Eurasian Economic Community
(EurAsEC) and the Shanghai Cooperation Organization (SCO), and the
interests in participation of Russia and China as global powers and of
Kazakhstan as a leader among the Central Asian countries. The article
investigates what goals the Eurasian countries (especially Russia,
Kazakhstan and China) expect from integration within the SCO and
the EurAsEC, linking the process to theories of regional integration.
After the European debt crisis, it is all the more topical to research
integration under the specific conditions of the region.
Abstract: This article provides empirical evidence on the effect
of domestic and international factors on the U.S. current account
deficit. Linear dynamic regression and vector autoregression models
are employed to estimate the relationships during the period from 1986
to 2011. The findings of this study suggest that the current and lagged
private saving rate and foreign current account for East Asian
economies have played a vital role in affecting the U.S. current
account. Additionally, using Granger causality tests and variance
decompositions, changes in productivity growth and foreign
domestic demand are found to significantly influence changes in
the U.S. current account. To summarize, the empirical relationship
between the U.S. current account deficit and its determinants is
sensitive to alternative regression models and specifications.
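For readers unfamiliar with the Granger causality tests mentioned above, here is a minimal, self-contained sketch of a bivariate Granger F-test using ordinary least squares; the series, lag order, and coefficients are invented purely for illustration and have nothing to do with the paper's current account data.

```python
import numpy as np

def rss(X, t):
    """Residual sum of squares of an OLS fit of t on X."""
    beta, *_ = np.linalg.lstsq(X, t, rcond=None)
    r = t - X @ beta
    return float(r @ r)

def granger_f(y, x, lags):
    """F-statistic for 'x Granger-causes y': compare a restricted AR model of y
    against an unrestricted model that adds lagged x terms."""
    T = len(y)
    t = y[lags:]
    ones = np.ones((T - lags, 1))
    Ly = np.column_stack([y[lags - j: T - j] for j in range(1, lags + 1)])
    Lx = np.column_stack([x[lags - j: T - j] for j in range(1, lags + 1)])
    X_r = np.hstack([ones, Ly])             # restricted: own lags only
    X_u = np.hstack([ones, Ly, Lx])         # unrestricted: add lags of x
    rss_r, rss_u = rss(X_r, t), rss(X_u, t)
    df_num, df_den = lags, (T - lags) - X_u.shape[1]
    return ((rss_r - rss_u) / df_num) / (rss_u / df_den)

# Toy data: x drives y with a one-period lag, but not vice versa.
rng = np.random.default_rng(1)
x = rng.standard_normal(300)
y = np.empty(300)
y[0] = 0.0
for i in range(1, 300):
    y[i] = 0.3 * y[i - 1] + 0.8 * x[i - 1] + 0.1 * rng.standard_normal()

f_xy = granger_f(y, x, lags=2)   # large: lags of x help predict y
f_yx = granger_f(x, y, lags=2)   # small: lags of y do not help predict x
```

The F-statistic is then compared against an F(lags, df_den) critical value; a significant statistic in one direction but not the other is the asymmetry a Granger test looks for.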
Abstract: Researchers have been applying artificial and computational
intelligence (AI/CI) methods to computer games. In this research field,
further research is required to compare AI/CI methods with respect to
each game application. In this paper, we report our experimental results
on the comparison of three evolutionary algorithms, an evolution strategy
(ES), a genetic algorithm (GA), and their hybrid, applied to evolving
controller agents for the CIG 2007 Simulated Car Racing competition.
Our experimental results show that premature convergence of solutions
was observed in the case of the ES, and the GA outperformed the ES in
the last half of the generations. Moreover, a hybrid which uses the GA
first and the ES next evolved the best solution among all solutions
generated. This result shows the ability of the GA to globally search
promising areas in the early stage and the ability of the ES to locally
search the focused area (fine-tuning solutions).
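To make the GA-first, ES-next hybrid concrete, here is a minimal sketch on a toy sphere function rather than the car-racing task; the population sizes, operators, and all constants are illustrative choices, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def sphere(v):
    """Toy fitness to minimise; the paper's fitness was a car-racing task."""
    return float(np.sum(v ** 2))

def ga_phase(pop, gens, sigma=0.3):
    """Simple real-coded GA: tournament selection, uniform crossover, mutation."""
    for _ in range(gens):
        fit = np.array([sphere(p) for p in pop])
        new = []
        for _ in range(len(pop)):
            i, j = rng.integers(len(pop), size=2)
            a = pop[i] if fit[i] < fit[j] else pop[j]   # tournament parent 1
            i, j = rng.integers(len(pop), size=2)
            b = pop[i] if fit[i] < fit[j] else pop[j]   # tournament parent 2
            mask = rng.random(a.size) < 0.5             # uniform crossover
            child = np.where(mask, a, b)
            mut = rng.random(a.size) < 0.1              # per-gene mutation
            child[mut] += sigma * rng.standard_normal(mut.sum())
            new.append(child)
        pop = np.array(new)
    return pop

def es_phase(parent, gens, lam=10, sigma=0.3):
    """(1+lambda)-ES with a 1/5-success-rule step-size adaptation (fine-tuning)."""
    best, best_f = parent.copy(), sphere(parent)
    for _ in range(gens):
        kids = best + sigma * rng.standard_normal((lam, best.size))
        fits = np.array([sphere(k) for k in kids])
        successes = int((fits < best_f).sum())
        if fits.min() < best_f:                         # elitist replacement
            best, best_f = kids[fits.argmin()].copy(), float(fits.min())
        sigma *= 1.22 if successes > lam / 5 else 0.82  # 1/5 success rule
    return best, best_f

dim = 10
pop = rng.uniform(-5, 5, size=(30, dim))
pop = ga_phase(pop, gens=30)                            # global search first
seed = min(pop, key=sphere)                             # best GA individual
best, best_f = es_phase(seed, gens=60)                  # local fine-tuning next
```

The hand-over mirrors the abstract's finding: the GA locates a promising basin early, and the self-adapting ES step size then exploits it.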
Abstract: A ten-year grazing study was conducted at the
Agriculture and Agri-Food Canada Brandon Research Centre in
Manitoba to study the effect of alfalfa inclusion and fertilizer (N, P,
K, and S) addition on economics and efficiency of non-renewable
energy use in meadow brome grass-based pasture systems for beef
production. Fertilizing grass-only or alfalfa-grass pastures to full soil
test recommendations improved pasture productivity, but did not
improve profitability compared to unfertilized pastures. Fertilizing
grass-only pastures resulted in the highest net loss of any pasture
management strategy in this study. Adding alfalfa at the time of
seeding, with no added fertilizer, was economically the best pasture
improvement strategy in this study. Because of moisture limitations,
adding commercial fertilizer to full soil test recommendations is
probably not economically justifiable in most years, especially with
the rising cost of fertilizer. Improving grass-only pastures by adding
fertilizer and/or alfalfa required additional non-renewable energy
inputs; however, the additional energy required for unfertilized
alfalfa-grass pastures was minimal compared to the fertilized
pastures. Of the four pasture management strategies, adding alfalfa
to grass pastures without adding fertilizer had the highest efficiency
of energy use. Based on energy use and economic performance, the
unfertilized alfalfa-grass pasture was the most efficient and
sustainable pasture system.
Abstract: Despite the widespread application of finite
mixture models in segmentation, finite mixture model selection is
still an important issue. In fact, the selection of an adequate number
of segments is a key issue in deriving latent segment structures, and
it is desirable that the selection criteria used for this end be effective.
In order to select among several information criteria which may
support the selection of the correct number of segments, we conduct a
simulation study. In particular, this study is intended to determine
which information criteria are more appropriate for mixture model
selection when considering data sets with only categorical
segmentation base variables. The generation of mixtures of
multinomial data supports the proposed analysis. As a result, we
establish a relationship between the level of measurement of the
segmentation variables and the performance of some (eleven)
information criteria. The criterion AIC3 shows the best performance (it
indicates the correct number of segments in the simulated structure
more often) when referring to mixtures of multinomial segmentation
base variables.
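As a simplified illustration of how such criteria compare on categorical data (using binary rather than general multinomial variables, a basic EM fit, and a synthetic data set invented for this sketch):

```python
import numpy as np

rng = np.random.default_rng(3)

def loglik(X, w, theta):
    """Total log-likelihood of binary data X under a Bernoulli mixture (w, theta),
    plus the per-observation, per-component log-probability matrix."""
    lp = (X[:, None, :] * np.log(theta) +
          (1 - X[:, None, :]) * np.log(1 - theta)).sum(-1) + np.log(w)
    m = lp.max(axis=1, keepdims=True)
    return float((m[:, 0] + np.log(np.exp(lp - m).sum(axis=1))).sum()), lp

def em_fit(X, K, iters=200, restarts=5):
    """EM for a mixture of independent Bernoulli variables (a latent class model).
    Returns the best log-likelihood over random restarts."""
    n, d = X.shape
    best = -np.inf
    for _ in range(restarts):
        w = np.full(K, 1.0 / K)
        theta = rng.uniform(0.25, 0.75, size=(K, d))
        for _ in range(iters):
            _, lp = loglik(X, w, theta)
            r = np.exp(lp - lp.max(axis=1, keepdims=True))
            r /= r.sum(axis=1, keepdims=True)           # E-step responsibilities
            nk = r.sum(axis=0) + 1e-12
            w = nk / n                                  # M-step: weights
            theta = np.clip((r.T @ X) / nk[:, None], 1e-6, 1 - 1e-6)
        best = max(best, loglik(X, w, theta)[0])
    return best

def n_params(K, d):
    return (K - 1) + K * d     # free mixture weights + Bernoulli parameters

# Synthetic two-segment binary data set with well-separated segments.
n, d = 300, 6
z = rng.random(n) < 0.5
X = (rng.random((n, d)) < np.where(z[:, None], 0.9, 0.1)).astype(float)

crit = {}
for K in (1, 2, 3):
    ll, p = em_fit(X, K), n_params(K, d)
    crit[K] = {"AIC": -2 * ll + 2 * p,            # AIC  = -2LL + 2p
               "AIC3": -2 * ll + 3 * p,           # AIC3 = -2LL + 3p
               "BIC": -2 * ll + p * np.log(n)}    # BIC  = -2LL + p*ln(n)

best_k = min(crit, key=lambda K: crit[K]["AIC3"])
```

AIC3 differs from AIC only in charging 3 rather than 2 per free parameter; on these well-separated segments all three criteria should agree, and the simulation question is which criterion stays reliable when segments are harder to separate.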
Abstract: Image Compression using Artificial Neural Networks
is a topic where research is being carried out in various directions
towards achieving a generalized and economical network.
Feedforward networks using the back-propagation algorithm, adopting
the method of steepest descent for error minimization, are popular and
widely adopted, and are directly applied to image compression.
Various research works are directed towards achieving quick
convergence of the network without loss of quality of the restored
image. In general, the images used for compression are of different
types, such as dark images, high-intensity images, etc. When these
images are compressed using a back-propagation network, it takes a
longer time to converge. The reason is that the given image may
contain a number of distinct gray levels that differ only narrowly from
their neighborhood pixels. If the gray levels of the pixels in an image
and of their neighbors are mapped in such a way that the difference in
gray level between a pixel and its neighbors is minimal, then the
compression ratio as well as the convergence of the network can be
improved. To achieve this, a cumulative distribution function is
estimated for the image and used to map the image pixels. When
the mapped image pixels are used, the back-propagation neural
network yields a high compression ratio and converges
quickly.
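One plausible reading of this CDF-based gray-level mapping is histogram equalisation; a minimal numpy sketch follows, with the image contents invented for illustration.

```python
import numpy as np

def cdf_map(img, levels=256):
    """Map gray levels through the image's empirical CDF (histogram
    equalisation), spreading out narrowly spaced neighbouring levels."""
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = np.cumsum(hist) / img.size
    lut = np.round(cdf * (levels - 1)).astype(img.dtype)  # per-level lookup table
    return lut[img]

# Hypothetical dark test image: gray levels squeezed into a narrow low range.
rng = np.random.default_rng(4)
dark = rng.integers(10, 60, size=(64, 64)).astype(np.uint8)
mapped = cdf_map(dark)
```

The mapped image uses the full 0 to 255 range, so neighbouring pixels that originally differed by one or two levels end up clearly separated, which is the property the abstract argues helps the network converge.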
Abstract: Knowledge development in companies relies on
knowledge-intensive business processes, which are characterized by
a high complexity in their execution, weak structuring,
communication-oriented tasks and high decision autonomy, and often the need for creativity and innovation. A foundation of knowledge development is provided, which is based on a new conception of
knowledge and knowledge dynamics. This conception consists of a three-dimensional model of knowledge with types, kinds and qualities. Built on this knowledge conception, knowledge dynamics is
modeled with the help of general knowledge conversions between
knowledge assets. Here knowledge dynamics is understood to cover
all of acquisition, conversion, transfer, development and usage of
knowledge. Through this conception we gain a sound basis for
knowledge management and development in an enterprise. The type
dimension of knowledge, which categorizes it according to its
internality and externality with respect to the human being, is
especially crucial for enterprise knowledge management and
development, because knowledge should be made available by
converting it to more external types.
Built on this conception, a modeling approach for knowledge-intensive
business processes is introduced, be it for human-driven, e-driven or
task-driven processes. As an example of this approach, a model of the
creative activity for the renewal planning of a product is given.
Abstract: Wood pyrolysis of Casuarina glauca, Casuarina cunninghamiana, Eucalyptus camaldulensis and Eucalyptus microtheca was carried out at 450°C at 2.5°C/min in a flowing N2 atmosphere. The Eucalyptus genus wood gave higher values of specific gravity, ash, total extractives, lignin, N2-liquid trap distillate (NLTD) and water trap distillate (WSP) than those for the Casuarina genus. The GHC of the NLTD was higher for the Casuarina genus than for the Eucalyptus genus, with the highest value for Casuarina cunninghamiana. Guaiacol, 4-ethyl-2-methoxyphenol and syringol were observed in the NLTD of all four wood species, reflecting their parent hardwood lignin origin. Eucalyptus camaldulensis wood had the highest lignin content (28.89%) and was pyrolyzed to the highest values of phenolics (73.01%), guaiacol (11.2%) and syringol (32.28%) contents in the methylene chloride fraction (MCF) of the NLTD. Accordingly, recovery of syringol and guaiacol from Eucalyptus camaldulensis may become economically attractive.
Abstract: The sub-prime mortgage crisis which began in the US is
regarded as the most severe economic crisis since the Great Depression
of the early 20th century. In particular, hidden problems in the
efficient operation of businesses were exposed all at once, and many
financial institutions went bankrupt and filed for court receivership.
The collapse of the physical market led to the bankruptcy of
manufacturing and construction businesses. This study analyzes the
dynamic efficiency of construction businesses over the five years at
the turn of the global financial crisis. By discovering the trend and
stability of the efficiency of a construction business, this study's
objective is to improve the management efficiency of construction
businesses in the ever-changing construction market. Variables were
selected by
analyzing corporate information on top 20 construction businesses in
Korea and analyzed for static efficiency in 2008 and dynamic
efficiency between 2006 and 2010. Unlike other studies, this study
succeeded in deducing the efficiency trend and stability of a
construction business over five years by using the DEA/Window model.
Using the analysis results, efficient and inefficient companies could
be identified. In addition, the relative efficiency among DMUs was
measured by comparing the relationship between the input and output
variables of the construction businesses. This study can serve as a
reference for improving the management efficiency of companies with
low efficiency, based on the efficiency analysis of construction
businesses.
Abstract: Nanophotocatalysts such as titanium (TiO2), zinc (ZnO), and iron (Fe2O3) oxides can be used in the oxidation of organic pollutants and in many other applications. However, among the challenges for the technological application (scale-up) of nanotechnology research, two aspects are still little explored: the environmental risk of nanomaterial preparation methods, and the variability of nanomaterial properties and/or performance. An environmental analysis was performed for six different methods of ZnO nanoparticle synthesis, and showed that it is possible to identify the most environmentally compatible process even in laboratory-scale research. The obtained ZnO nanoparticles were tested as photocatalysts, and increased the degradation rate of the Rhodamine B dye up to 30 times.
Abstract: Traffic Engineering (TE) is the process of controlling
how traffic flows through a network in order to facilitate efficient and
reliable network operations while simultaneously optimizing network
resource utilization and traffic performance. TE improves the
management of data traffic within a network and provides better
utilization of network resources. Many research works consider intra-
and inter-domain Traffic Engineering separately, but in reality each
influences the other. Hence the network performance of both inter- and
intra-Autonomous-System (AS) traffic is not optimized properly. To
achieve a better joint optimization of both intra- and inter-AS TE, we
propose a joint optimization technique that considers intra-AS
features during inter-AS TE and vice versa. This work considers an
important criterion, namely latency, within an AS and between ASes,
and proposes a Bi-Criteria Latency optimization model. Hence overall
network performance, in terms of latency, can be improved by applying
this joint-optimization technique.
Abstract: This paper presents an intrusion detection system based on a hybrid neural network model combining RBF and Elman networks. It is used for both anomaly detection and misuse detection. The model has a memory function and can effectively detect discrete and related aggressive behavior. The RBF network acts as a real-time pattern classifier, while the Elman network provides the memory of former events. The intrusion detection system based on this hybrid model is evaluated on the DARPA data set, and ROC curves are used to display the test results intuitively. The experiments show that this hybrid-model intrusion detection system can effectively improve the detection rate and reduce the false-alarm and failure rates.
Abstract: “Garbage enzyme”, a fermentation product of kitchen waste, water and brown sugar, is claimed in the media to be a multipurpose solution for household and agricultural uses. This study assesses the effects of dilutions (5% to 75%) of garbage enzyme on reducing pollutants in domestic wastewater. The pH of the garbage enzyme was found to be 3.5, with a BOD concentration of about 150 mg/L. Test results showed that the garbage enzyme raised the wastewater's BOD in proportion to its dilution, owing to its high organic content. For mixtures with more than 10% garbage enzyme, the pH remained acidic after the 5-day digestion period. However, it appears that ammonia nitrogen and phosphorus could be removed by the addition of the garbage enzyme. The most economical dilution for removal of ammonia nitrogen and phosphorus was found to be 9%. Further tests are required to understand the removal mechanisms for ammonia nitrogen and phosphorus.