Abstract: Compared with the original SVM, which involves a
quadratic programming task, LS-SVM simplifies the required
computation, but unfortunately the sparseness of the standard SVM is
lost. Another problem is that LS-SVM is optimal only if the training
samples are corrupted by Gaussian noise. In Least Squares SVM
(LS-SVM), the nonlinear solution is obtained by first mapping the
input vector nonlinearly to a high-dimensional kernel space, where
the solution is calculated from a set of linear equations. In
this paper a geometric view of the kernel space is introduced, which
enables us to develop a new formulation that achieves a sparse and
robust estimate.
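The standard LS-SVM training step the abstract refers to (before the paper's sparse, robust reformulation) can be sketched as solving one linear system; the RBF kernel, toy data, and hyperparameter values below are illustrative choices, not the paper's.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X1 and X2
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # Solve the LS-SVM linear system:
    #   [ 0   1^T          ] [b    ]   [0]
    #   [ 1   K + I/gamma  ] [alpha] = [y]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, coefficients alpha

def lssvm_predict(X_train, b, alpha, X_test, sigma=1.0):
    return rbf_kernel(X_test, X_train, sigma) @ alpha + b

# Toy regression; note that essentially every alpha is nonzero,
# which is exactly the lost-sparseness problem the abstract mentions.
X = np.linspace(0, 1, 20).reshape(-1, 1)
y = np.sin(2 * np.pi * X[:, 0])
b, alpha = lssvm_fit(X, y, gamma=100.0, sigma=0.2)
y_hat = lssvm_predict(X, b, alpha, X, sigma=0.2)
print(np.max(np.abs(y_hat - y)))
```

Replacing the quadratic program by this single solve is the computational simplification; the squared-error loss behind it is what makes the estimate optimal only under Gaussian noise.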
Abstract: This paper considers the problem of finding a low-cost
chip set for minimum-cost partitioning of a large logic circuit. Chip
sets are selected from a given library; each chip in the library has a
different price, area, and I/O pin count. We propose a low-cost chip-set
selection algorithm. Inputs to the algorithm are a netlist and the chip
information in the library. The output is a list of chip sets that satisfy
the area and maximum-partition-count constraints, sorted by cost
from minimum to maximum. We used MCNC benchmark circuits for
the experiments. The experimental results show that all of the chip
sets found satisfy the multiple partitioning constraints.
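The input/output contract described above can be sketched with a brute-force enumeration; the library entries, constraint values, and cost/area model here are invented for illustration and are certainly simpler than the paper's algorithm.

```python
from itertools import combinations_with_replacement

# Hypothetical chip library: name -> (cost, area, io_pins); values are illustrative
library = {"A": (5, 100, 40), "B": (8, 180, 64), "C": (12, 300, 96)}

def feasible_chip_sets(total_area, max_parts):
    """Enumerate chip sets of at most max_parts chips whose combined
    area covers the circuit, returned sorted from minimum to maximum cost."""
    results = []
    names = sorted(library)
    for k in range(1, max_parts + 1):
        for combo in combinations_with_replacement(names, k):
            cost = sum(library[c][0] for c in combo)
            area = sum(library[c][1] for c in combo)
            if area >= total_area:          # area constraint satisfied
                results.append((cost, combo))
    return sorted(results)

for cost, combo in feasible_chip_sets(total_area=250, max_parts=2):
    print(cost, combo)
```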
Abstract: The use of renewable energy sources is becoming ever
more crucial; the widening application of renewable energy devices at
the domestic, commercial, and industrial levels reflects not only
stronger awareness but also significantly increased installed capacity.
Biomass, principally in the form of wood, has been converted into
energy by humans for a long time. Gasification is the conversion of a
solid carbonaceous fuel into combustible gas by partial combustion.
Gasifier models operate under various conditions because the
parameters maintained in each model differ. This study used
experimental data with three input variables (biomass consumption,
combustion-zone temperature, and ash discharge rate) and gas flow
rate as the single output variable. Response surface methods were
applied to identify the gasifier system equation best suited to the
experimental data. The results showed that a linear model gave the
best fit.
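Fitting a first-order (linear) response surface to three inputs and one output reduces to ordinary least squares. The data below are synthetic and noiseless, with made-up coefficient values, purely to show the shape of the computation.

```python
import numpy as np

# Hypothetical gasifier data: columns are biomass consumption, combustion-zone
# temperature, and ash discharge rate; the response is gas flow rate.
rng = np.random.default_rng(0)
X = rng.uniform([10, 700, 0.5], [30, 1000, 2.0], size=(20, 3))
true_coef = np.array([0.8, 0.05, -3.0])
y = 5.0 + X @ true_coef          # noiseless synthetic response

# First-order response surface: y = b0 + b1*x1 + b2*x2 + b3*x3
A = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print(beta)   # recovered intercept and coefficients
```

A second-order response surface would simply add squared and cross-product columns to `A`; the study's finding is that this linear form already fit best.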
Abstract: Prickly pear juice has received renewed attention with regard to the effects of processing and preservation on its sensory qualities (colour, taste, flavour, aroma, astringency, visual browning and overall acceptability). Juice was prepared by homogenizing fruit and treating the pulp with pectinase (Aspergillus niger). The juice treatments applied were sugar addition, acidification, heat treatment, refrigeration, and freezing and thawing. Prickly pear pulp and juice had unique properties (low pH of 3.88, soluble solids of 3.68 °Brix and high titratable acidity of 0.47). Sensory profiling and descriptive analyses revealed that non-treated juice had a bitter taste with high astringency, whereas treated prickly pear juice was significantly sweeter. All treated juices had good sensory acceptance, with values approximating or exceeding 7. Regression analysis of the consumer sensory attributes indicated an overwhelming rejection of non-treated prickly pear juice, while treated prickly pear juice achieved overall acceptability. Thus, the treatments educed favourable sensory responses and may have positive implications for consumer acceptability.
Abstract: CScheme, a concurrent programming paradigm based
on the scheme concept, enables concurrency schemes to be constructed
from smaller synchronization units through a GUI-based composer
and later reused on other concurrency problems of a similar
nature. This paradigm is particularly important in the multi-core
environments prevalent today. In this paper, we demonstrate
techniques for separating concurrency from functional code using the
CScheme paradigm. We then illustrate how the CScheme
methodology can be used to solve some traditional concurrency
problems, such as the critical section problem and the readers-writers
problem, using synchronization schemes such as the Single Threaded
Execution Scheme and the Readers Writers Scheme.
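The readers-writers problem the abstract names can be sketched with a condition variable: many readers may hold the lock concurrently, while a writer waits for exclusive access. This is a generic reusable synchronization unit in the spirit of the abstract, not CScheme's own composer output.

```python
import threading

class ReadersWriterLock:
    """Readers-writers scheme: many concurrent readers, exclusive writers."""
    def __init__(self):
        self._cond = threading.Condition()
        self._readers = 0

    def acquire_read(self):
        with self._cond:                # briefly take the lock to register
            self._readers += 1

    def release_read(self):
        with self._cond:
            self._readers -= 1
            if self._readers == 0:
                self._cond.notify_all()  # wake any waiting writer

    def acquire_write(self):
        self._cond.acquire()             # exclusive: hold the lock itself
        while self._readers > 0:
            self._cond.wait()            # wait until all readers leave

    def release_write(self):
        self._cond.release()

# Single-threaded-execution style demo: 4 writers increment a shared counter
shared = {"value": 0}
lock = ReadersWriterLock()

def writer():
    for _ in range(1000):
        lock.acquire_write()
        shared["value"] += 1             # critical section
        lock.release_write()

threads = [threading.Thread(target=writer) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(shared["value"])
```

Separating this lock class from the counter-updating code mirrors the paper's goal of keeping concurrency logic apart from functional code.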
Abstract: Fischer-Tropsch synthesis is one of the most
important catalytic reactions converting synthesis gas to light
and heavy hydrocarbons. One of the main issues is selecting the type
of reactor. The slurry bubble reactor is a suitable choice for Fischer-
Tropsch synthesis because of its good heat- and mass-transfer
characteristics, high catalyst durability, and low maintenance and
repair cost. The most common catalysts for Fischer-Tropsch synthesis
are iron-based and cobalt-based; the advantage of one over the other
depends on which type of hydrocarbons we wish to produce. In this
study, Fischer-Tropsch synthesis is modeled with iron and cobalt
catalysts in a slurry bubble reactor, considering mass and momentum
balances and the effect of the hydrodynamic relations on reactor
behavior. Profiles of reactant conversion and reactant concentration
in the gas and liquid phases were determined as functions of
residence time in the reactor. The effects of temperature, pressure,
liquid velocity, reactor diameter, catalyst diameter, gas-liquid and
liquid-solid mass transfer coefficients, and kinetic coefficients on the
reactant conversion were studied. With a 5% increase of liquid
velocity (iron catalyst), H2 conversion increases by about 6% and CO
conversion by about 4%; with an 8% increase of liquid velocity
(cobalt catalyst), H2 conversion increases by about 26% and CO
conversion by about 4%. With a 20% increase of the gas-liquid mass
transfer coefficient, H2 conversion increases by about 12% and CO
conversion by about 10% with the iron catalyst, and by about 10%
and 6%, respectively, with the cobalt catalyst. The results show that
the process is sensitive to the gas-liquid mass transfer coefficient and
that the optimal operating condition occurs at the maximum possible
liquid velocity. This velocity must be greater than the minimum
fluidization velocity and less than the terminal velocity, so as to
prevent catalyst particles from leaving the fluidized bed.
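The qualitative sensitivity to the gas-liquid mass transfer coefficient can be illustrated with a drastically simplified pseudo-first-order model in which the mass-transfer and reaction resistances act in series; all parameter values are invented and this is not the paper's full mass/momentum-balance model.

```python
import numpy as np

def conversion(tau, kLa, k_rxn):
    """Pseudo-first-order conversion after residence time tau, with
    gas-liquid mass transfer and intrinsic reaction resistances in
    series: 1/k_eff = 1/kLa + 1/k_rxn (illustrative simplification)."""
    k_eff = 1.0 / (1.0 / kLa + 1.0 / k_rxn)
    return 1.0 - np.exp(-k_eff * tau)

base = conversion(tau=10.0, kLa=0.10, k_rxn=0.5)
fast = conversion(tau=10.0, kLa=0.12, k_rxn=0.5)   # +20% in kLa
print(base, fast)   # conversion rises with the mass transfer coefficient
```

Because `kLa` is the smaller of the two rate constants here, it dominates `k_eff`, which is the sense in which the process is "sensitive" to gas-liquid mass transfer.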
Abstract: Sparse representation, which can represent high-dimensional
data effectively, has been used successfully in computer vision
and pattern recognition problems. However, it does not consider the
label information of data samples. To overcome this limitation,
we develop a novel dimensionality reduction algorithm,
discriminatively regularized sparse subspace learning (DR-SSL).
The proposed DR-SSL algorithm not only uses sparse representation
to model the data, but also effectively employs the label information
to guide the dimensionality reduction procedure. In addition, the
presented algorithm can effectively deal with the out-of-sample
problem. Experiments on gene-expression data sets show that the
proposed algorithm is an effective tool for dimensionality reduction
and gene-expression data classification.
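The idea of label-guided linear dimensionality reduction that also handles out-of-sample points (via an explicit projection matrix) can be sketched with a Fisher-criterion projection; this is a generic stand-in for the discriminative regularization, not the DR-SSL algorithm itself, and the toy data are synthetic.

```python
import numpy as np

def fisher_projection(X, labels, dim=1):
    """Label-guided linear projection maximizing between-class over
    within-class scatter; a small ridge plays the regularization role."""
    classes = np.unique(labels)
    mean = X.mean(axis=0)
    Sw = np.zeros((X.shape[1], X.shape[1]))   # within-class scatter
    Sb = np.zeros_like(Sw)                    # between-class scatter
    for c in classes:
        Xc = X[labels == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(len(Sw)), Sb))
    order = np.argsort(-vals.real)
    W = vecs[:, order[:dim]].real
    return X @ W, W    # out-of-sample points map through the same W

rng = np.random.default_rng(1)
X0 = rng.normal([0, 0, 0], 0.3, (30, 3))
X1 = rng.normal([2, 2, 0], 0.3, (30, 3))
X = np.vstack([X0, X1])
y = np.array([0] * 30 + [1] * 30)
Z, W = fisher_projection(X, y, dim=1)
print(Z[:30].mean(), Z[30:].mean())   # classes well separated after projection
```

An explicit `W` is what solves the out-of-sample problem: a new sample `x` is embedded as `x @ W` without re-running the training.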
Abstract: Intellectual capital measurement is a central aspect of knowledge management. The measurement and evaluation of intangible assets play a key role in enabling effective management of these assets as sources of competitiveness. For these reasons, managers and practitioners need conceptual and analytical tools that take into account the unique characteristics and economic significance of intellectual capital. Following this lead, we propose an efficiency and productivity analysis of intellectual capital as a determinant of a company's competitive advantage. The analysis is carried out by means of Data Envelopment Analysis (DEA) and the Malmquist Productivity Index (MPI). These techniques identify best-practice companies that have achieved competitive advantage by implementing successful intellectual capital management strategies, and offer inefficient companies development paths through benchmarking. The proposed methodology is applied to the biotechnology industry over the period 2007-2010.
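In general, DEA solves one linear program per decision-making unit, but in the degenerate single-input/single-output case the CCR efficiency collapses to an output/input ratio normalized by the best ratio in the sample; the firms and numbers below are invented to show that special case only.

```python
# Hypothetical one-input (e.g. R&D spend), one-output (e.g. revenue) data
firms = {"F1": (10, 30), "F2": (20, 50), "F3": (15, 60),
         "F4": (25, 40), "F5": (12, 24)}

# Single input, single output: CCR efficiency = (output/input) / best ratio
ratios = {f: out / inp for f, (inp, out) in firms.items()}
best = max(ratios.values())
efficiency = {f: r / best for f, r in ratios.items()}

for f in sorted(efficiency, key=efficiency.get, reverse=True):
    print(f, round(efficiency[f], 3))   # F3 is the best-practice frontier firm
```

The Malmquist Productivity Index then compares such efficiencies across two periods, separating frontier shift from catching-up; with multiple inputs and outputs the ratios above are replaced by the standard DEA linear programs.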
Abstract: Nowadays there are many grid-connected converters
in the grid system. These are generally the converters of renewable
energy sources, industrial four-quadrant drives, and other converters
with a DC link, connected to the grid through a three-phase bridge.
The standards prescribe the maximum harmonic emission, which can
easily be limited with a high switching frequency. The resulting
increased switching losses can be halved by using the well-known
flat-top modulation. The suggested control method is an extension of
flat-top modulation with which the losses can be halved again
compared to flat-top modulation. Compared to traditional control,
these requirements can be satisfied simultaneously much better with
the DLF (DC Link Floating) method.
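Classic flat-top (discontinuous) modulation, the baseline the DLF method extends, can be sketched as a common-mode offset that clamps, at every instant, the phase furthest from zero to +1 or -1, so that phase stops switching. The waveform parameters are illustrative.

```python
import numpy as np

t = np.linspace(0, 1, 600, endpoint=False)
phases = [np.sin(2 * np.pi * (t - k / 3)) for k in range(3)]
m = 0.9 * np.vstack(phases)          # three-phase references, modulation index 0.9

# Flat-top offset: clamp whichever phase is furthest from zero to +/-1.
hi, lo = m.max(axis=0), m.min(axis=0)
offset = np.where(np.abs(hi) >= np.abs(lo), 1.0 - hi, -1.0 - lo)
m_ft = m + offset                    # one phase is always clamped -> no switching there

print(np.max(np.abs(m_ft), axis=0).min())   # every sample has a clamped phase
```

The common-mode offset cancels in the line-to-line voltages, so the load sees the same fundamental while each phase leg is idle for a third of the period, which is where the switching-loss halving comes from.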
Abstract: This paper is concerned with a nonautonomous three-species food chain model with Crowley-Martin type functional response and time delay. Using Mawhin's continuation theorem of coincidence degree theory, sufficient conditions for the existence of periodic solutions are obtained.
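For reference, the standard Crowley-Martin functional response has the form below; the symbol names are generic (capture rate, handling-time and predator-interference parameters) and not necessarily those used in the paper.

```latex
% Crowley-Martin functional response of a predator with density y
% feeding on prey with density x:
f(x, y) = \frac{c\,x}{(1 + a x)(1 + b y)}
```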
Abstract: A key requirement for e-learning materials is
reusability and interoperability, that is, the possibility of using at
least part of the content in different courses and of delivering it
through different platforms. These features make it possible to limit
the cost of new packages, but require that material be developed
according to proper specifications. SCORM (Sharable Content Object
Reference Model) is a set of guidelines suitable for this purpose. A
specific adaptation project has been started to make it possible to
reuse existing materials. The paper describes the main characteristics
of the SCORM specification and the procedure used to modify the
existing material.
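The packaging side of SCORM centers on an `imsmanifest.xml` file describing the course structure and its resources; a minimal SCORM 1.2-style skeleton is sketched below, with placeholder identifiers and file names that are not from the project described.

```xml
<!-- Minimal SCORM 1.2 imsmanifest.xml skeleton (identifiers and file
     names are placeholders, not from the adaptation project) -->
<manifest identifier="course.example" version="1.2"
          xmlns="http://www.imsproject.org/xsd/imscp_rootv1p1p2"
          xmlns:adlcp="http://www.adlnet.org/xsd/adlcp_rootv1p2">
  <organizations default="org1">
    <organization identifier="org1">
      <title>Example Course</title>
      <item identifier="item1" identifierref="res1">
        <title>Lesson 1</title>
      </item>
    </organization>
  </organizations>
  <resources>
    <resource identifier="res1" type="webcontent" adlcp:scormtype="sco"
              href="lesson1/index.html">
      <file href="lesson1/index.html"/>
    </resource>
  </resources>
</manifest>
```

Splitting existing material into such independently referenced resources (SCOs) is what makes partial reuse across courses and platforms possible.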
Abstract: This paper presents a hand vein authentication system
using fast spatial correlation of hand vein patterns. In order to
evaluate the system performance, a prototype was designed and a
dataset of 50 persons of different ages above 16 and of different
gender, each has 10 images per person was acquired at different
intervals, 5 images for left hand and 5 images for right hand. In
verification testing analysis, we used 3 images to represent the
templates and 2 images for testing. Each of the 2 images is matched
with the existing 3 templates. FAR of 0.02% and FRR of 3.00 %
were reported at threshold 80. The system efficiency at this threshold
was found to be 99.95%. The system can operate at a 97% genuine
acceptance rate and 99.98 % genuine reject rate, at corresponding
threshold of 80. The EER was reported as 0.25 % at threshold 77. We
verified that no similarity exists between right and left hand vein
patterns for the same person over the acquired dataset sample.
Finally, this distinct 100 hand vein patterns dataset sample can be
accessed by researchers and students upon request for testing other
methods of hand veins matching.
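The FAR/FRR/EER quantities reported above are computed by sweeping a decision threshold over genuine and impostor match scores; the synthetic score distributions below are invented purely to show the computation, not to reproduce the paper's numbers.

```python
import numpy as np

# Hypothetical similarity scores (higher = more similar); a real system
# collects these from genuine and impostor matching attempts.
rng = np.random.default_rng(2)
genuine = rng.normal(85, 5, 500)     # same-person comparisons
impostor = rng.normal(60, 8, 5000)   # different-person comparisons

def far_frr(threshold):
    far = np.mean(impostor >= threshold)   # impostors wrongly accepted
    frr = np.mean(genuine < threshold)     # genuines wrongly rejected
    return far, frr

# Sweep thresholds; the EER is where FAR and FRR (approximately) cross.
thresholds = np.linspace(40, 100, 601)
rates = np.array([far_frr(th) for th in thresholds])
eer_idx = np.argmin(np.abs(rates[:, 0] - rates[:, 1]))
print("EER ~", rates[eer_idx].mean(), "at threshold", thresholds[eer_idx])
```

The genuine acceptance rate is `1 - FRR` and the genuine reject rate is `1 - FAR`, which is how the paper's 97% and 99.98% figures relate to its FRR and FAR at threshold 80.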
Abstract: The object of this paper is to design and analyze a
proportional-integral (PI) control for the positive output elementary
super lift Luo converter (POESLLC), a state-of-the-art DC-DC
converter. The positive output elementary super lift Luo converter
performs voltage conversion from a positive source voltage to a
positive load voltage. This paper proposes a PI control capable of
providing good static and dynamic performance compared with a
proportional-integral-derivative (PID) controller. The dynamic
equations describing the positive output elementary super lift Luo
converter are derived using the state-space averaging method, and
the PI control is designed. The simulation model of the positive
output elementary super lift Luo converter with its control circuit is
implemented in Matlab/Simulink. The PI control for the positive
output elementary super lift Luo converter is tested in the transient
region, for line changes, load changes, the steady-state region, and
also for component variations.
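The closed-loop structure (PI law driving an averaged plant model) can be sketched with a discrete PI controller regulating a generic first-order plant; the plant constants and gains below are made up and are not the POESLLC averaged model.

```python
# Discrete PI controller on a first-order plant dv/dt = -a*v + b*u
# (illustrative constants, not the converter's state-space model)
dt, Kp, Ki = 0.001, 0.5, 50.0
a, b = 50.0, 200.0
v, integ, ref = 0.0, 0.0, 12.0       # start at 0 V, regulate to 12 V

for _ in range(20000):               # 20 s of simulated time
    err = ref - v
    integ += err * dt                # integral state
    u = Kp * err + Ki * integ        # PI control law
    v += (-a * v + b * u) * dt       # forward-Euler step of the plant

print(v)   # settles at the reference: integral action removes steady-state error
```

The integral term is what gives the zero steady-state error under line and load changes; the PID comparison in the paper adds a derivative term on top of this structure.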
Abstract: This paper presents a real-time defect detection
algorithm for high-speed steel bar in coil. Because the target speed is
very high, the proposed algorithm must quickly process large
volumes of image data for real-time operation. The defect detection
algorithm therefore has to satisfy two conflicting requirements:
reducing the processing time and improving the detection efficiency.
To enhance detection performance, an edge-preserving method is
proposed for noise reduction in the target image. Experimental
results show that the proposed algorithm guarantees both real-time
processing and detection accuracy.
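A classic example of edge-preserving noise reduction (offered here as a generic illustration, since the paper's specific method is not detailed in the abstract) is the median filter: it removes impulse noise while keeping step edges sharp, unlike a mean filter.

```python
import numpy as np

def median_filter_1d(signal, width=3):
    """Edge-preserving smoothing: each sample is replaced by the median
    of its neighbourhood, so isolated spikes vanish but steps survive."""
    pad = width // 2
    padded = np.pad(signal, pad, mode="edge")
    return np.array([np.median(padded[i:i + width])
                     for i in range(len(signal))])

# A step edge (defect boundary) plus one impulse-noise spike
sig = np.array([0, 0, 9, 0, 0, 5, 5, 5, 5], dtype=float)
out = median_filter_1d(sig)
print(out)   # spike removed, step edge left exactly in place
```

Preserving the edge matters here because defects on the bar surface appear precisely as intensity edges; blurring them away would trade noise for missed detections.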
Abstract: The vast rural landscape in the southern United States
is conspicuously characterized by the hedgerow trees or groves. The
patchwork landscape of fields surrounded by high hedgerows is a
traditional and familiar feature of the American countryside.
Hedgerows are in effect linear strips of trees, groves, or woodlands,
which are often critical habitats for wildlife and important for the
visual quality of the landscape. As landscape interfaces, hedgerows
define the spaces in the landscape, give the landscape life and
meaning, and enrich ecologies and cultural heritages of the American
countryside. Although hedgerows were originally intended as fences
and as markers of property and townland boundaries, they are not
merely natural or man-made additions to the landscape: they have
gradually become "naturalized" into the landscape, become deeply
rooted in the rural culture, and now form an important component of
the southern American rural environment. However, due to the ever-
expanding real estate industry and high demand for new residential
development, substantial areas of authentic hedgerow landscape in
the southern United States are being urbanized. Using Hudson Farm
as an example, this study illustrates guidelines for how hedgerows
can be integrated into town planning as green infrastructure and as a
landscape interface to innovate and direct sustainable land use, and
suggests ways in which such vernacular landscapes can be preserved
and integrated into new development without losing their contextual
inspiration.
Abstract: Measurement of the COD of a spent caustic solution involves, first, digestion of a test sample with dichromate solution and, second, measurement of the remaining dichromate by titration with ferrous ammonium sulfate (FAS) to an end point. In this paper we study a potentiometric end point using an Ag/AgCl reference electrode and a gold rod indicator electrode. The potentiometric end point is sharp and easily identified, especially for samples with high turbidity and colour, for which other methods, such as colorimetry, do not give high precision. During the titration the electrode responds quickly to potential changes within the [Cr6+/Cr3+ and Fe2+/Fe3+] solution, producing stable readings that lead to accurate COD measurement. Finally, the results are compared with data determined using the colorimetric method for standard samples. It is shown that potentiometric end point titration with a gold rod electrode can be used with equal or better facility.
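The standard back-titration arithmetic behind dichromate COD (as in Standard Methods for the Examination of Water and Wastewater) converts the difference in FAS volumes for blank and sample into mg O2/L; the numerical inputs below are illustrative, not measurements from the paper.

```python
def cod_mg_per_l(blank_fas_ml, sample_fas_ml, fas_molarity, sample_ml):
    """Dichromate-COD back-titration formula:
    COD = (A - B) * M * 8000 / V, where A and B are the FAS volumes (mL)
    consumed by the blank and the sample, M is the FAS molarity, V is the
    sample volume (mL), and 8000 = equivalent weight of oxygen (8) * 1000."""
    return (blank_fas_ml - sample_fas_ml) * fas_molarity * 8000.0 / sample_ml

# Illustrative numbers only
print(cod_mg_per_l(blank_fas_ml=12.0, sample_fas_ml=7.5,
                   fas_molarity=0.25, sample_ml=20.0))   # mg O2/L
```

The potentiometric end point studied in the paper changes only how the FAS volumes A and B are located, not this final computation.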
Abstract: Cenozoic basalts found in Jiangsu province of eastern
China include tholeiites and alkali basalts. The present paper
analyzes the major elements, trace elements, and rare earth elements
of these Cenozoic basalts and, combined with the Sr-Nd isotopic
compositions reported by Chen et al. (1990) [1], discusses the
petrogenesis of these basalts and the geochemical characteristics of
the source mantle.
Based on major, trace elements and fractional crystallization model
established by Brooks and Nielsen (1982)[2] we suggest that the
basaltic magma has experienced olivine + clinopyroxene fractionation
during its evolution. The chemical compositions of basaltic rocks from
Jiangsu province indicate that these basalts may belong to the same
magmatic system. Spidergrams reveal that Cenozoic basalts from
Jiangsu province have geochemical characteristics similar to those of
ocean island basalts (OIB). The slight positive Nb and Ti anomalies
found in basaltic rocks of this study suggest the presence of Ti-bearing
minerals in the mantle source and these Ti-bearing minerals had
contributed to basaltic magma during partial melting, indicating a
metasomatic event might have occurred before the partial melting.
Based on the Sr vs. Nd isotopic ratio plots, we suggest that the
Jiangsu basalts may be derived from partial melting of a mantle
source representing two-end-member mixing of DMM and EM-I.
Some Jiangsu basaltic magma may be derived from partial melting of
EM-I heated by upwelling asthenospheric mantle or asthenospheric
diapirism.
Abstract: This paper introduces a hand gesture recognition system to recognize gestures in real time in unconstrained environments. Efforts should be made to adapt computers to our natural means of communication: speech and body language. A simple and fast algorithm using orientation histograms is developed to recognize a subset of MAL static hand gestures. The pattern recognition system uses a transform that converts an image into a feature vector, which is then compared with the feature vectors of a training set of gestures. The final system is a perceptron implementation in MATLAB. This paper includes experiments on 33 hand postures and discusses the results. The experiments show that the system achieves an average recognition rate of 90% and is suitable for real-time applications.
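The orientation-histogram transform the abstract describes (image to feature vector, then comparison against training templates) can be sketched as follows; the magnitude weighting, bin count, nearest-neighbour matching, and tiny stripe "gestures" are illustrative assumptions, and the paper's classifier is a perceptron rather than nearest neighbour.

```python
import numpy as np

def orientation_histogram(img, bins=8):
    """Feature vector of gradient orientations in [0, pi), weighted by
    gradient magnitude and normalized to be lighting-insensitive."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)
    hist, _ = np.histogram(ang, bins=bins, range=(0, np.pi), weights=mag)
    s = hist.sum()
    return hist / s if s else hist

# Toy "training set": a vertical-stripe and a horizontal-stripe pattern
train = {"vertical": np.tile([0, 0, 1, 1, 0, 0], (6, 1)),
         "horizontal": np.tile([[0], [0], [1], [1], [0], [0]], (1, 6))}
feats = {name: orientation_histogram(im) for name, im in train.items()}

test_img = np.tile([0, 0, 1, 1, 0, 0], (6, 1))   # another vertical pattern
f = orientation_histogram(test_img)
best = min(feats, key=lambda n: np.linalg.norm(f - feats[n]))
print(best)
```

Because the histogram discards pixel positions and keeps only edge directions, the feature is cheap to compute and tolerant of translation, which is what makes the approach fast enough for real time.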
Abstract: Quality Function Deployment (QFD) is a structured, multi-step planning method for delivering commodities, services, and processes to customers, both external and internal to an organization. It is a way to translate between the diverse customer languages expressing demands (the Voice of the Customer) and the organization's languages expressing the results that satisfy those demands. The approach is to establish one or more matrices that inter-relate producer and consumer reciprocal expectations; due to its visual appearance, the principal matrix is called the "House of Quality" (HOQ). In this paper, we cast the HOQ as a multi-attribute decision making (MADM) problem and, through a proposed MADM method, rank the technical specifications. We then compute the satisfaction degree of the customer requirements, handling vagueness and uncertainty in the decision making by means of fuzzy set theory. The approach employs a supervised neural network (perceptron) for solving the MADM problem.
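The crisp core of the HOQ computation (before the paper's fuzzy and perceptron extensions) weights customer requirements and scores each technical specification through the relationship matrix; the weights and the conventional 0/1/3/9 relationship strengths below are illustrative.

```python
import numpy as np

# Illustrative House-of-Quality data: 3 customer needs, 4 technical specs
weights = np.array([0.5, 0.3, 0.2])          # importance of the customer needs
R = np.array([[9, 3, 0, 1],                  # relationship matrix (0/1/3/9 scale)
              [1, 9, 3, 0],
              [0, 1, 9, 3]])

scores = weights @ R                          # absolute importance of each spec
ranking = np.argsort(-scores)                 # technical specs ranked best-first
print(scores, ranking)
```

The paper's contribution replaces these crisp weights and scores with fuzzy quantities and ranks the specifications through its MADM method; this sketch just shows the matrix translation from customer language to technical language.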
Abstract: The purpose of this paper is to present two different
approaches to financial distress pre-warning models appropriate for
risk supervisors, investors, and policy makers. We examine a sample
of the financial institutions and electronic companies listed on the
Taiwan Security Exchange (TSE) from 2002 through 2008. We
present a binary logistic regression with panel data analysis. With
the pooled binary logistic regression we can include more variables
in the model than with random effects, while the in-sample and
out-of-sample forecasting performance is higher with random-effects
estimation than with the pooled regression. On the other hand, we
estimate an Adaptive Neuro-Fuzzy Inference System (ANFIS) with
Gaussian and Generalized Bell (Gbell) membership functions and
find that ANFIS significantly outperforms the logit regressions in
both in-sample and out-of-sample periods, indicating that ANFIS is a
more appropriate tool for financial risk managers and for economic
policy makers in central banks and national statistical services.
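The pooled binary logit baseline can be sketched as maximum-likelihood estimation by gradient ascent on a synthetic "distress" dataset; the two predictors, their true coefficients, and the accuracy check are all invented stand-ins for the paper's panel of financial ratios.

```python
import numpy as np

# Synthetic distress data: two ratios, labels drawn from a true logit model
rng = np.random.default_rng(3)
n = 400
X = rng.normal(size=(n, 2))
true_w, true_b = np.array([2.0, -1.5]), 0.3
p = 1 / (1 + np.exp(-(X @ true_w + true_b)))
y = (rng.uniform(size=n) < p).astype(float)     # 1 = distressed

# Pooled binary logit via gradient ascent on the log-likelihood
w, b = np.zeros(2), 0.0
for _ in range(2000):
    z = 1 / (1 + np.exp(-(X @ w + b)))          # predicted probabilities
    grad_w = X.T @ (y - z) / n                  # score equations
    grad_b = np.mean(y - z)
    w += 0.5 * grad_w
    b += 0.5 * grad_b

in_sample_acc = np.mean(((X @ w + b) > 0) == (y == 1))
print(w, b, in_sample_acc)
```

A random-effects panel logit adds a firm-specific intercept to this likelihood, and ANFIS replaces the linear index entirely with fuzzy membership functions; the in-sample/out-of-sample comparison in the paper is run over exactly such fitted models.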