Abstract: This paper describes the shape optimization of impeller
blades for an anti-heeling bidirectional axial flow pump used in ships.
In general, a bidirectional axial pump has a much lower efficiency
than a classical unidirectional pump because of the symmetric blade
profile. In this paper, focusing on the pump impeller, the blade shape
is redesigned to reach a higher efficiency in a bidirectional axial
pump. The commercial code employed in this simulation is CFX v.13.
CFD results for pump torque, head, and hydraulic efficiency were
compared. Orthogonal array (OA) and analysis of variance (ANOVA)
techniques, together with surrogate-model-based optimization using
orthogonal polynomials, are employed to determine the main effects
and the optimal design variables. Based on the optimal design, we
identify the effective design variables of the impeller blades and show
that the optimal solution satisfies the constraints on pump torque
and head.
Abstract: The authors present a mixed method for reducing the order of large-scale dynamic systems. In this method, the denominator polynomial of the reduced-order model is obtained using the modified pole clustering technique, while the coefficients of the numerator are obtained by Padé approximation. The method is conceptually simple and always generates a stable reduced model if the original high-order system is stable. The proposed method is illustrated with numerical examples taken from the literature.
Abstract: Median filters with larger windows offer greater smoothing and are more robust than median filters with smaller windows. However, the larger median smoothers (median filters with larger windows) fail to track low-order polynomial trends in the signal. As a result, constant regions are produced at signal corners, leading to the loss of fine detail. In this paper, an algorithm that combines the ability of the 3-point median smoother to preserve low-order polynomial trends with the superior noise-filtering characteristics of the larger median smoother is introduced. The proposed algorithm (called the combiner algorithm in this paper) is evaluated for its performance on a test image corrupted with different types of noise, and the results obtained are included.
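The paper's exact combination rule is not reproduced here, but the general idea can be sketched as follows. This is a minimal illustration assuming a simple agreement-based switching rule; the large-window size, the threshold, and the `combiner` decision rule are assumptions for exposition, not the authors' algorithm.

```python
import numpy as np

def median_smooth(x, w):
    # Running median with odd window w; edges use replicated padding.
    pad = w // 2
    xp = np.pad(x, pad, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(xp, w)
    return np.median(windows, axis=1)

def combiner(x, large_w=7, thresh=1.0):
    # Hypothetical rule: where the 3-point and large-window medians
    # agree closely (trend regions), keep the trend-preserving 3-point
    # output; elsewhere (noise-dominated regions) keep the large one.
    m3 = median_smooth(x, 3)
    mL = median_smooth(x, large_w)
    return np.where(np.abs(m3 - mL) < thresh, m3, mL)
```

On a clean linear ramp both smoothers agree, so the combiner passes the trend through unchanged, while an isolated impulse is still rejected by both windows.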
Abstract: The fuzzy fingerprint vault is a recently developed cryptographic construct, based on the polynomial reconstruction problem, that secures critical data with fingerprint data. However, previous work is not applicable to fingerprints with few minutiae, since it uses a fixed polynomial degree without considering the number of minutiae. To solve this problem, we use an adaptive polynomial degree that accounts for the number of minutiae extracted from each user. We also apply multiple polynomials to avoid the possible security degradation of the simple solution (i.e., using a single low-degree polynomial). Based on the experimental results, our method makes a possible attack 2^192 times more difficult than using a low-degree polynomial, while still verifying users with few minutiae.
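The polynomial-reconstruction step that underlies any fuzzy vault can be sketched in a few lines. This is a toy illustration over a tiny prime field: the field size `P`, the sample points, and the absence of chaff points and error correction are all simplifications for exposition, not the scheme described in the abstract.

```python
P = 97  # tiny prime field, illustration only; real vaults use larger fields

def eval_poly(coeffs, x):
    # Horner evaluation over GF(P); coeffs are highest-degree first.
    acc = 0
    for c in coeffs:
        acc = (acc * x + c) % P
    return acc

def lagrange_reconstruct(points, x):
    # Recover p(x) from degree+1 genuine (xi, yi) samples by Lagrange
    # interpolation over GF(P); pow(den, P-2, P) is the modular inverse.
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total
```

A degree-k polynomial needs k+1 matching minutiae points to unlock, which is why the abstract's adaptive degree matters for users with few minutiae.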
Abstract: Enzymatic hydrolysis of starch from natural sources
finds potential application in the commercial production of alcoholic
beverages and bioethanol. In this study, the effects of starch
concentration, temperature, time, and enzyme concentration were
studied and optimized for the hydrolysis of cassava (Manihot
esculenta) starch powder (mesh 80/120) into glucose syrup by
α-amylase immobilized in polyacrylamide gel, using a central
composite design. The experimental results on enzymatic hydrolysis
of cassava starch were subjected to multiple linear regression analysis
using MINITAB 14 software. A positive linear effect of starch
concentration, enzyme concentration, and time on the hydrolysis of
cassava starch by α-amylase was observed. The statistical significance
of the model was validated by the F-test for analysis of variance
(p < 0.01). The optimum values of starch concentration, temperature,
time, and enzyme concentration were found to be 4.5% (w/v), 45 °C,
150 min, and 1% (w/v), respectively. The maximum glucose yield at
the optimum conditions was 5.17 mg/mL.
Abstract: In this paper, the Laplace decomposition method is developed to solve linear and nonlinear fractional integro-differential equations of Volterra type. The fractional derivative is described in the Caputo sense. The Laplace decomposition method is found to be fast and accurate. Illustrative examples are included to demonstrate the validity and applicability of the presented technique, and a comparison is made with exact results.
Abstract: We present a novel scheme to evaluate sinusoidal functions with low complexity and high precision using cubic spline interpolation. To this end, two different approaches are proposed to find the interpolating polynomial of sin(x) within the range [-π, π]. The first deals with only a single data point, while the other uses two, to keep the realization cost as low as possible. An approximation error optimization technique for cubic spline interpolation is introduced next and is shown to increase the interpolator accuracy without increasing the complexity of the associated hardware. Architectures for the proposed approaches are also developed, exhibiting implementation flexibility with low power requirements.
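To give a rough sense of the accuracy a cubic spline interpolator of sin(x) can reach, the sketch below builds a natural cubic spline through a handful of uniformly spaced knots on [-π, π]. The knot count and the natural boundary conditions are our own assumptions for illustration, not the paper's design (natural conditions happen to be exact for sin here, since sin''(±π) = 0).

```python
import numpy as np

def natural_cubic_spline(xk, yk):
    # Natural cubic spline: knot second derivatives M satisfy a
    # tridiagonal system, with M[0] = M[n] = 0 at the endpoints.
    n = len(xk) - 1
    h = np.diff(xk)
    A = np.zeros((n - 1, n - 1))
    rhs = np.zeros(n - 1)
    for i in range(1, n):
        if i > 1:
            A[i - 1, i - 2] = h[i - 1]
        A[i - 1, i - 1] = 2.0 * (h[i - 1] + h[i])
        if i < n - 1:
            A[i - 1, i] = h[i]
        rhs[i - 1] = 6.0 * ((yk[i + 1] - yk[i]) / h[i]
                            - (yk[i] - yk[i - 1]) / h[i - 1])
    M = np.zeros(n + 1)
    M[1:n] = np.linalg.solve(A, rhs)

    def s(x):
        # Locate the interval of each query point, then evaluate the
        # standard per-interval cubic form.
        i = np.clip(np.searchsorted(xk, x) - 1, 0, n - 1)
        t0, t1 = xk[i + 1] - x, x - xk[i]
        return ((M[i] * t0**3 + M[i + 1] * t1**3) / (6.0 * h[i])
                + (yk[i] / h[i] - M[i] * h[i] / 6.0) * t0
                + (yk[i + 1] / h[i] - M[i + 1] * h[i] / 6.0) * t1)
    return s

xk = np.linspace(-np.pi, np.pi, 9)   # 8 uniform segments, an assumed table size
sin_approx = natural_cubic_spline(xk, np.sin(xk))

xs = np.linspace(-np.pi, np.pi, 1001)
max_err = float(np.max(np.abs(sin_approx(xs) - np.sin(xs))))
```

Even this small 9-entry knot table keeps the maximum error in the low thousandths, which is the kind of precision/table-size trade-off the hardware architectures above exploit.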
Abstract: Fake-finger submission attacks are a major problem in fingerprint recognition systems. In this paper, we introduce an aliveness detection method based on multiple static features derived from a single fingerprint image. The static features comprise individual pore spacing, residual noise, and several first-order statistics. Specifically, a correlation filter is adopted to measure individual pore spacing. The multiple static features reflect the physiological and statistical characteristics of live and fake fingerprints. Classification is performed by calculating a liveness score from each feature and fusing the scores through a classifier. On our dataset, we compare nine classifiers; the best classification rate of 85% is attained using a Reduced Multivariate Polynomial classifier. Our approach is fast and convenient for aliveness checks in field applications.
Abstract: Recently, much attention has been devoted to
advanced techniques of system modeling. The PNN (polynomial
neural network) is a GMDH-type (Group Method of Data Handling)
algorithm and a useful method for modeling nonlinear systems, but
PNN performance depends strongly on the number of input variables
and the order of the polynomial, which are determined by trial and
error. In this paper, we introduce the GPNN (genetic polynomial
neural network) to improve the performance of PNN. GPNN
determines the number of input variables and the order of all neurons
with a GA (genetic algorithm), using the GA to search over all
possible values of the number of input variables and the polynomial
order. GPNN performance is evaluated on two nonlinear systems: a
quadratic equation and the Dow Jones stock index time series.
Abstract: Cryptographic algorithms play a crucial role in the
information society by protecting sensitive data from unauthorized
access. Information technology will become increasingly pervasive;
hence we can expect the emergence of ubiquitous or pervasive
computing and ambient intelligence. These new environments and
applications will present new security challenges, and there is no
doubt that cryptographic algorithms and protocols will form part of
the solution. The efficiency of a public key cryptosystem is mainly
measured in computational overhead, key size, and bandwidth. In
particular, the RSA algorithm is used in many applications to provide
security. Although the security of RSA is beyond doubt, the evolution
in computing power has caused a growth in the necessary key length.
The fact that most smart card chips cannot process keys longer than
1024 bits shows that an alternative is needed. NTRU is such an
alternative: a collection of mathematical algorithms based on
manipulating lists of very small integers and polynomials, which
allows NTRU to achieve high speeds with minimal computing power.
NTRU (Nth-degree Truncated Polynomial Ring Unit) is the first
secure public key cryptosystem based on neither the factorization nor
the discrete logarithm problem; even given sufficient computational
resources and time, an adversary should not be able to break the key.
Multi-party communication and the requirement of optimal resource
utilization create a present-day demand for applications that need
security enforcement techniques and can be enhanced with high-end
computing. This has prompted us to develop high-performance NTRU
schemes using approaches such as high-end computing hardware.
Peer-to-peer (P2P) and enterprise grids are proven approaches for
building high-end computing systems, and they can improve the
performance of NTRU through parallel execution. In this paper we
propose and develop an application for NTRU using the enterprise
grid middleware Alchemi. An analysis and comparison of its
performance for various text files is presented.
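The core NTRU operation referred to above, multiplication of truncated polynomials, is cyclic convolution in the ring Z_q[x]/(x^N - 1). The sketch below uses toy parameters purely for illustration; real NTRU parameter sets use N in the hundreds with structured ternary polynomials.

```python
# Toy ring parameters, far below secure NTRU sizes; illustration only.
N, Q = 7, 64

def ring_mul(a, b):
    # Cyclic convolution in Z_Q[x]/(x^N - 1): coefficient lists a, b of
    # length N, where index k is the coefficient of x^k; exponents wrap
    # modulo N because x^N = 1 in this ring.
    c = [0] * N
    for i in range(N):
        for j in range(N):
            c[(i + j) % N] = (c[(i + j) % N] + a[i] * b[j]) % Q
    return c
```

The N^2 independent coefficient products in the double loop are exactly the kind of work a P2P or enterprise-grid deployment can distribute for parallel execution.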
Abstract: The paper shows that in the analysis of a queuing system with fixed-size batch arrivals, there emerges a set of polynomials which are a generalization of Chebyshev polynomials of the second kind. The paper uses these polynomials to assess the transient behaviour of the overflow (equivalently, call blocking) probability in the system. A key figure to note is the proportion of the overflow (or blocking) probability resident in the transient component, which the results show to be more significant at the beginning of the transient and to decay naturally to zero in the limit of large t. The results also show that the significance of transients is more pronounced for lighter loads but lasts longer for heavier loads.
Abstract: Polynomial bases and normal bases are both used for
elliptic curve cryptosystems, but field arithmetic operations such as
multiplication, inversion, and doubling are implemented by different
methods for each basis. In general, it is said that normal bases,
especially optimal normal bases (ONB), which are special cases of
normal bases, are more efficient for hardware implementation than
polynomial bases. However, this claim merits further examination by
implementing and analyzing both systems under similar conditions. In
this paper, we design field arithmetic operators for each basis over
GF(2^233), a field that has both a polynomial basis recommended by
SEC2 and a type-II ONB, and analyze the implementation results. In
addition, we predict the efficiency of two elliptic curve cryptosystems
using these field arithmetic operators.
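For concreteness, polynomial-basis multiplication in GF(2^233) with the SEC2 reduction polynomial f(x) = x^233 + x^74 + 1 can be sketched in software as below. This bit-level loop is only a reference model; the hardware operators the paper designs would realize the same arithmetic with parallel combinational logic.

```python
M = 233
R = (1 << 233) | (1 << 74) | 1   # f(x) = x^233 + x^74 + 1 (SEC2)

def gf2m_mul(a, b):
    # Polynomial-basis multiplication in GF(2^M): carry-less
    # "schoolbook" multiply, then reduction modulo f(x).
    p = 0
    while b:
        if b & 1:
            p ^= a            # XOR is addition in GF(2)
        a <<= 1
        b >>= 1
    # Reduce: clear bits at degree >= M from the top down.
    for i in range(p.bit_length() - 1, M - 1, -1):
        if p >> i & 1:
            p ^= R << (i - M)
    return p
```

Field elements are integers whose bit k is the coefficient of x^k, so operands and results always fit in 233 bits after reduction.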
Abstract: 2D/3D registration is a special case of medical image
registration that is of particular interest to surgeons. Applications
of 2D/3D registration include radiotherapy planning and treatment
verification, spinal surgery, hip replacement, neurointerventions, and
aortic stenting [1]. The purpose of this paper is to provide a literature
review of the main methods for image registration in the 2D/3D
case. At the end of the paper, an algorithm is proposed for 2D/3D
registration based on a Chebyshev polynomial iteration loop.
Abstract: Text categorization is the problem of classifying text documents into a set of predefined classes. After a preprocessing step, the documents are typically represented as large sparse vectors. When training classifiers on large collections of documents, both time and memory restrictions can be quite prohibitive. This justifies the application of feature selection methods to reduce the dimensionality of the document-representation vector. Four feature selection methods are evaluated: Random Selection, Information Gain (IG), Support Vector Machine (SVM_FS), and Genetic Algorithm with SVM (GA_FS). We show that the best results were obtained with the SVM_FS and GA_FS methods for a relatively small feature-vector dimension, compared with the IG method, which requires longer vectors for quite similar classification accuracies. We also present a novel method to better correlate the SVM kernel's parameters (polynomial or Gaussian kernel).
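Of the four criteria, Information Gain is the easiest to make concrete: each term is scored by how much knowing its presence reduces the entropy of the class label. The sketch below assumes a binary term-occurrence feature; it is a standalone illustration of the IG criterion, not the authors' implementation.

```python
import math

def entropy(labels):
    # Shannon entropy (bits) of a list of class labels.
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def information_gain(feature_present, labels):
    # IG(term) = H(class) - H(class | term), where the condition is the
    # binary presence/absence of the term in each document.
    n = len(labels)
    groups = {}
    for f, y in zip(feature_present, labels):
        groups.setdefault(f, []).append(y)
    cond = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - cond
```

Keeping only the top-IG terms shrinks the document vectors, though (as the abstract notes) IG typically needs more retained features than SVM_FS or GA_FS for comparable accuracy.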
Abstract: This paper presents a new way to find the aerodynamic
characteristic equation of a missile so that numerical trajectory
prediction is more accurate. The goal is to obtain a polynomial
equation based on two missile characteristic parameters, angle of
attack (α) and flight speed (ν). First, the missile under study is
modeled and used in a computational flow model to compute the
aerodynamic force and moment. The performance range of the
missile under study is assumed to be -10 < α
Abstract: A novel sponge submerged membrane bioreactor
(SSMBR) was developed to effectively remove organics and
nutrients from wastewater. Sponge is introduced within the SSMBR
as a medium for the attached growth of biomass. This paper evaluates
the effects of new and acclimatized sponges on dissolved organic
carbon (DOC) removal from wastewater at different mixed liquor
suspended solids (MLSS) concentrations of the sludge. In a series of
experimental studies, the acclimatized sponge performed better than
the new sponge, and the optimum DOC removal was achieved at
10 g/L of MLSS with the acclimatized sponge. Moreover, the paper
analyses the relationship between the MLSS_sponge/MLSS_sludge
ratio and the DOC removal efficiency of the SSMBR. The results
showed a non-linear relationship between the biomass parameters of
the sponge and the sludge and the DOC removal efficiency of the
SSMBR; a second-order polynomial function could reasonably
represent these relationships.
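Fitting a second-order polynomial of this kind is a short least-squares step in most numeric environments. The sketch below uses synthetic data generated from a known quadratic purely to exercise the fit; the actual (biomass ratio, DOC removal) measurements of the paper are not reproduced.

```python
import numpy as np

# Synthetic stand-in data from a known quadratic, illustration only.
ratio = np.linspace(0.1, 1.0, 10)
removal = -2.0 * ratio**2 + 3.0 * ratio + 80.0

# Second-order polynomial fit: coeffs = [a, b, c] for a*r^2 + b*r + c.
coeffs = np.polyfit(ratio, removal, 2)
predict = np.poly1d(coeffs)
```

Because the stand-in data are exactly quadratic, the fit recovers the generating coefficients; on real measurements one would also inspect residuals and R^2.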
Abstract: In this paper, novel techniques for increasing the accuracy
and speed of convergence of a Feed-forward Back-propagation
Artificial Neural Network (FFBPNN) with a polynomial activation
function, as reported in the literature, are presented. These techniques
were subsequently used to determine the coefficients of
Autoregressive Moving Average (ARMA) and Autoregressive (AR)
systems. Introducing sequential and batch methods of weight
initialization, a batch method of weight and coefficient update, and
adaptive momentum and learning rate techniques gives more accurate
results and a significant reduction in convergence time compared to
the traditional back-propagation algorithm, making the FFBPNN an
appropriate technique for online ARMA coefficient determination.
Abstract: This paper examines several mathematical methods for
modeling the hourly price forward curve (HPFC); the model is
constructed using numerous regression methods, such as polynomial
regression, radial basis function neural networks, and Fourier series.
The goodness of fit of the models is examined by means of statistical
and graphical tools. The criterion for choosing the model is
minimization of the Root Mean Squared Error (RMSE); using a
correlation analysis approach for the regression analysis, the optimal
model, which is robust against model misspecification, is determined.
Learning and supervision techniques are employed to determine the
optimal parameters corresponding to each measure of overall loss.
Using all of the numerical methods mentioned above, explicit
expressions for the optimal model are derived and the optimal
designs are implemented.
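RMSE-driven model selection of the kind described can be sketched as follows. The hourly price series here is a synthetic sinusoid-plus-trend stand-in, and restricting the candidates to polynomial regression at a few fixed degrees is our assumption for illustration, not the paper's full model set.

```python
import numpy as np

def rmse(y, yhat):
    # Root mean squared error between observations and fitted values.
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

# Synthetic 48-hour price curve: linear trend plus a daily cycle.
hours = np.arange(48.0)
price = 50.0 + 0.1 * hours + 10.0 * np.sin(2.0 * np.pi * hours / 24.0)

# Fit candidate polynomials on hours rescaled to [0, 1] for numerical
# conditioning; keep the degree with the lowest RMSE.
t = hours / hours[-1]
scores = {d: rmse(price, np.poly1d(np.polyfit(t, price, d))(t))
          for d in (1, 3, 5, 7)}
best_degree = min(scores, key=scores.get)
```

In-sample RMSE is non-increasing with degree for nested least-squares fits, so in practice one would also guard against overfitting with a hold-out set or an information criterion.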
Abstract: The adsorption of a simulated aqueous solution containing the textile remazol reactive dye Red 3BS onto palm shell activated carbon (PSAC) as adsorbent was studied using Response Surface Methodology (RSM). A Box-Behnken design in the three most important operating variables (initial dye concentration, adsorbent dosage, and impeller speed) was employed for experimental design and optimization of the results. The significance of the independent variables and their interactions was tested by means of analysis of variance (ANOVA) with 95% confidence limits. The model indicated that increasing the dosage and impeller speed gives removal of up to 90%, with an uptake capacity of more than 7 mg/g. A high regression coefficient between the variables and the response (R-Sq = 93.9%) showed good representation of the experimental data by the polynomial regression model.