Abstract: Response surface methodology (RSM) is an efficient
tool for gaining practical insight into developing new
processes and optimizing them. This methodology helps
engineers build a mathematical model that represents the behavior
of a system as a function of its process parameters.
In this paper, the sequential nature of RSM is surveyed for process
engineers, and its relationship to design of experiments (DOE),
regression analysis and robust design is reviewed. The proposed
four-step procedure, organized in two phases, can help system
analysts resolve parameter design problems involving responses.
To check the accuracy of the designed model, residual analysis
and the prediction error sum of squares (PRESS) are described.
It is believed that the proposed procedure can resolve a
complex parameter design problem with one or more responses. It can
be applied in areas where there are large data sets and a number of
responses are to be optimized simultaneously. In addition, the
procedure is relatively simple and can be implemented easily using
ready-made standard statistical packages.
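The PRESS statistic mentioned above can be computed for a linear model without refitting it n times, via the hat-matrix identity for leave-one-out residuals. A minimal sketch in NumPy (the function name and interface are illustrative, not from the paper):

```python
import numpy as np

def press_statistic(X, y):
    """PRESS for a linear model y ~ X*b (X includes the intercept column).

    Uses the hat-matrix shortcut: the leave-one-out residual equals
    e_i / (1 - h_ii), so the model never has to be refitted.
    """
    H = X @ np.linalg.pinv(X)        # hat matrix H = X (X'X)^{-1} X'
    e = y - H @ y                    # ordinary residuals
    h = np.diag(H)                   # leverages h_ii
    return float(np.sum((e / (1.0 - h)) ** 2))
```

A PRESS value close to the residual sum of squares suggests the fitted response surface predicts new observations about as well as it fits the data used to build it.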
Abstract: This paper presents a text clustering system developed based on a k-means type subspace clustering algorithm to cluster large, high-dimensional and sparse text data. In this algorithm, a new step is added to the k-means clustering process to automatically calculate the weights of keywords in each cluster, so that the important words of a cluster can be identified by their weight values. For understanding and interpreting the clustering results, a few keywords that best represent the semantic topic are extracted from each cluster. Two methods are used to extract the representative words. The candidate words are first selected according to their weights calculated by our new algorithm. Then, the candidates are fed to WordNet to identify the set of nouns and to consolidate synonyms and hyponyms. Experimental results have shown that the clustering algorithm is superior to other subspace clustering algorithms, such as PROCLUS and HARP, and to k-means type algorithms such as Bisecting-KMeans. Furthermore, the word extraction method is effective in selecting words to represent the topics of the clusters.
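The idea of weighting terms per cluster can be illustrated with an entropy-style scheme in the spirit of subspace k-means: terms used consistently within a cluster receive large weights. This is an illustrative sketch, not the paper's exact weighting step; the function name and the `gamma` parameter are our own:

```python
import numpy as np

def cluster_keyword_weights(X, labels, centers, gamma=1.0):
    """Entropy-style per-cluster term weights.

    For cluster l and term j, the within-cluster dispersion is
    D_lj = sum over members i of (x_ij - c_lj)^2, and the weight is
    w_lj = exp(-D_lj / gamma) / sum_k exp(-D_lk / gamma), so small
    dispersion (consistent usage) yields a large weight.
    """
    k, d = centers.shape
    W = np.empty((k, d))
    for l in range(k):
        D = ((X[labels == l] - centers[l]) ** 2).sum(axis=0)
        a = np.exp(-D / gamma)
        W[l] = a / a.sum()      # weights in each cluster sum to 1
    return W
```

Ranking the terms of a cluster by these weights then gives the candidate keywords for that cluster's topic.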
Abstract: In this paper, the concepts of dichotomous logistic
regression (DLR) with leave-one-out (L-O-O) validation are discussed.
To illustrate this, L-O-O was run to determine the importance of the
simulation conditions for robust tests of spread procedures with good
Type I error rates. The resultant model was then evaluated. The
discussion includes 1) assessment of the accuracy of the model, and
2) parameter estimates. These are presented and illustrated by
modeling the relationship between the dichotomous dependent
variable (Type I error rates) and a set of independent variables (the
simulation conditions). The base SAS software, containing PROC
LOGISTIC and DATA step functions, can be used to perform the
DLR analysis.
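Outside of SAS, the same exercise of fitting a dichotomous logistic regression and scoring it by leave-one-out can be sketched in plain NumPy. This is an illustrative re-implementation (function names and the small ridge term are our own, not part of PROC LOGISTIC):

```python
import numpy as np

def _sigmoid(z):
    """Numerically safe logistic function."""
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def fit_logistic(X, y, iters=25):
    """Fit a dichotomous logistic regression by Newton-Raphson (IRLS)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = _sigmoid(X @ beta)
        W = p * (1.0 - p)
        # Newton step; the tiny ridge keeps the Hessian invertible
        H = (X * W[:, None]).T @ X + 1e-8 * np.eye(X.shape[1])
        beta = beta + np.linalg.solve(H, X.T @ (y - p))
    return beta

def loo_accuracy(X, y):
    """Leave-one-out classification accuracy of the logistic model."""
    n = len(y)
    hits = 0
    for i in range(n):
        keep = np.arange(n) != i
        beta = fit_logistic(X[keep], y[keep])
        hits += int((_sigmoid(X[i] @ beta) >= 0.5) == bool(y[i]))
    return hits / n
```

Each observation is held out in turn, the model is refitted on the rest, and the held-out case is classified, which is exactly the L-O-O scheme the abstract refers to.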
Abstract: Selective harmonic elimination pulse width modulation (SHE-PWM) techniques offer tight control of the harmonic spectrum of a given voltage waveform generated by a power electronic converter, along with a low number of switching transitions. Traditional optimization methods suffer from various drawbacks, such as prolonged and tedious computational steps and convergence to local optima; thus, the greater the number of harmonics to be eliminated, the larger the computational complexity and time. This paper presents a novel method for output voltage harmonic elimination and voltage control of PWM AC/AC voltage converters using a hybrid Real-Coded Genetic Algorithm-Pattern Search (RGA-PS) method. RGA is the primary optimizer, exploited for its global search capabilities; PS is then employed to fine-tune the best solution provided by RGA at each evolution. The proposed method enables linear control of the fundamental component of the output voltage and complete elimination of its harmonic content up to a specified order. Theoretical studies have been carried out to show the effectiveness and robustness of the proposed method of selective harmonic elimination. The theoretical results are validated through simulation studies using the PSIM software package.
Abstract: A performance control law is studied for an
interconnected fractional-order nonlinear system. Applying a
backstepping algorithm, a backstepping sliding mode controller
(BSMC) is developed for the fractional nonlinear system. To improve
control performance, the BSMC is coupled to an adaptive sliding mode
observer that uses a filtered error as its sliding surface. The
performance of both architectures is studied on an inverted pendulum
mounted on a cart. Simulation results show that the BSMC coupled to
the adaptive sliding mode observer yields a more stable control law
and a lower control amplitude than the BSMC alone.
Abstract: In this study, the ability of Aspergillus niger and
Penicillium simplicissimum to extract heavy metals from a spent
refinery catalyst was investigated. In the first step, a spent
processing catalyst from one of the oil refineries in Iran was
physically and chemically characterized. Aspergillus niger and
Penicillium simplicissimum were used to mobilize Al/Co/Mo/Ni from
the hazardous spent catalyst. The fungi were adapted to the mixture of
metals at 100-800 mg L-1, with increments in concentration of 100 mg
L-1. Bioleaching experiments were carried out in batch cultures. To
investigate the production of organic acids in sucrose medium,
analyses of the culture medium by HPLC were performed at specific
time intervals after inoculation. The results obtained from inductively
coupled plasma-optical emission spectrometry (ICP-OES) showed
that after the one-step bioleaching process using Aspergillus niger,
maximum removal efficiencies of 27%, 66%, 62% and 38% were
achieved for Al, Co, Mo and Ni, respectively, whereas the highest
removal efficiencies using Penicillium simplicissimum were 32%,
67%, 65% and 38% for Al, Co, Mo and Ni, respectively.
Abstract: Data Structures and Algorithms is a module in most
Computer Science or Information Technology curricula. It is one of
the modules most students identify as being difficult. This paper
demonstrates how programming a solution for Sudoku can make
abstract concepts more concrete. The paper relates the concepts of a
typical Data Structures and Algorithms module to a step-by-step
solution for Sudoku in a human-oriented, as opposed to a
computer-oriented, manner.
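For contrast, the standard computer-oriented counterpart to the human approach discussed above is a recursive backtracking solver, which fits in a few lines of Python (a generic sketch, not the paper's code):

```python
def ok(grid, r, c, v):
    """Check the row, column and 3x3 box constraints for placing v."""
    if any(grid[r][j] == v for j in range(9)):
        return False
    if any(grid[i][c] == v for i in range(9)):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)
    return all(grid[br + i][bc + j] != v for i in range(3) for j in range(3))

def solve(grid):
    """Solve a 9x9 Sudoku in place by backtracking; 0 marks an empty cell."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for v in range(1, 10):
                    if ok(grid, r, c, v):
                        grid[r][c] = v
                        if solve(grid):
                            return True
                        grid[r][c] = 0   # undo and try the next value
                return False             # dead end: backtrack
    return True                          # no empty cells left
```

The recursion, the undo step and the constraint checks map directly onto the module topics of stacks, recursion and search trees.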
Abstract: The Programmable Logic Controller (PLC) plays a
vital role in automation and process control. Grafcet is used for
representing the control logic, and traditional programming
languages are used for describing the pure algorithms. Grafcet
divides the process to be automated into elementary sequences that
can be easily implemented. Each sequence represents a step with
associated actions programmed using textual or graphical languages,
as appropriate. The programming task is simplified by using a set of
subroutines shared by several steps. The paper presents an
example implementation for a punching machine for sheets and
plates. For programming a complex sequential process, the use of
graphical languages is a practical solution. The Grafcet state
can be used for debugging and malfunction determination.
The use of this method, combined with knowledge acquisition for the
process application, reduces the downtime of the machine and
improves productivity.
Abstract: Empirical insights into the implementation of logistics competencies at the top management level are scarce. This paper addresses this issue with an explorative approach based on a dataset of 872 observations from the years 2000, 2004 and 2008, obtained through quantitative content analysis of the annual reports of the 500 publicly listed firms with the highest global research and development expenditures according to the British Department for Business, Innovation and Skills. We find that logistics competencies are more pronounced in Asian companies than in their European or American counterparts. At the industry level the results are quite mixed. Using partial point-biserial correlations, we show that logistics competencies are positively related to financial performance.
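The point-biserial correlation used above is simply the Pearson correlation between a dichotomous indicator and a continuous measure. A minimal sketch (the data and names are illustrative, not the paper's dataset, and the partialling-out of covariates is omitted):

```python
import numpy as np

def point_biserial(binary, metric):
    """Point-biserial correlation: Pearson r between a 0/1 indicator
    (e.g. 'logistics competence mentioned in the annual report') and a
    continuous variable (e.g. a financial performance measure)."""
    b = np.asarray(binary, float)
    x = np.asarray(metric, float)
    return float(np.corrcoef(b, x)[0, 1])
```

A positive value indicates that observations with the indicator set to 1 tend to have higher values of the continuous measure.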
Abstract: This paper presents an application of particle swarm
optimization (PSO) to grounding grid planning and compares it with
the application of a genetic algorithm (GA). Firstly, based on IEEE
Std. 80, the cost function of the grounding grid and the constraints on
ground potential rise, step voltage and touch voltage are constructed
to formulate the optimization problem of grounding grid planning.
Secondly, GA and PSO algorithms for obtaining the optimal grounding
grid solution are developed. Finally, a case of grounding grid
planning is presented to demonstrate the superiority and applicability
of the PSO algorithm and the proposed planning results in terms of
cost and computational time.
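The PSO component of such a study can be sketched as a minimal global-best PSO in NumPy, shown here on a toy objective (the grounding-grid cost function and the IEEE Std. 80 constraints are not reproduced; all names and parameter values are illustrative):

```python
import numpy as np

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best PSO minimizing f over box bounds (lo, hi)."""
    rng = np.random.default_rng(seed)
    lo, hi = map(np.asarray, bounds)
    d = lo.size
    x = rng.uniform(lo, hi, (n_particles, d))        # positions
    v = np.zeros((n_particles, d))                   # velocities
    pbest = x.copy()
    pval = np.apply_along_axis(f, 1, x)              # personal best values
    g = pbest[pval.argmin()].copy()                  # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, d))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                   # keep inside the box
        val = np.apply_along_axis(f, 1, x)
        improved = val < pval
        pbest[improved], pval[improved] = x[improved], val[improved]
        g = pbest[pval.argmin()].copy()
    return g, float(pval.min())
```

In the paper's setting, `f` would be the grid cost augmented with penalties for violating the ground-potential-rise, step-voltage and touch-voltage constraints.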
Abstract: The objective of this work is to present an expert
analysis of flooding hazards and how to reduce the risk. The analysis
concerns the disaster induced by the flood of November 10/11, 2001
in the Bab El Oued district of the city of Algiers. The study begins
with an assessment of the damage in relation to the urban
environment and the history of the urban growth of the site. After
this phase, the work focuses on the identification of existing
correlations between the development of the town and its
vulnerability. The final step consists of elaborating interpretations
of the interactions between the urban growth, the sewerage network
and the vulnerability of the urban system. In conclusion, several
recommendations are formulated to mitigate the risk in the future.
The principal recommendations concern new urban operations and
existing urbanized sites.
Abstract: Soil organic carbon (SOC) plays a key role in soil
fertility, hydrology and contaminant control, and acts as a sink or
source of terrestrial carbon that can affect the concentration of
atmospheric CO2. SOC supports the sustainability and quality of
ecosystems, especially in semi-arid regions. This study was
conducted to determine the relative importance of 13 different
exploratory climatic, soil and geometric factors for the SOC content
in one of the semi-arid watershed zones in Iran. Two methods,
canonical discriminant analysis (CDA) and feed-forward back-
propagation neural networks, were used to predict SOC. Stepwise
regression and sensitivity analysis were performed to identify the
relative importance of the exploratory variables. Results from the
sensitivity analysis showed that a 7-2-1 neural network and a CDA
model with 5 inputs had the highest predictive ability, explaining
70% and 65% of the SOC variability, respectively. Since the neural
network models outperformed the CDA model, they should be
preferred for estimating SOC.
Abstract: Hexapod Machine Tool (HMT) is a parallel robot
mostly based on Stewart platform. Identification of kinematic
parameters of an HMT is an important step of the calibration procedure.
In this paper, an algorithm is presented for identifying the kinematic
parameters of HMT using inverse kinematics error model. Based on
this algorithm, the calibration procedure is simulated. Measurement
configurations with maximum observability are determined in the first
step of this algorithm to achieve a robust calibration. The errors occurring in
various configurations are illustrated graphically. It has been shown
that the boundaries of the workspace should be searched for the
maximum observability of errors. The importance of using
configurations with sufficient observability in calibrating hexapod
machine tools is verified by trial calibration with two different
groups of randomly selected configurations. One group is selected to
have sufficient observability and the other is chosen in disregard of the
observability criterion. Simulation results confirm the validity of the
proposed identification algorithm.
Abstract: In this paper, we evaluate the performance of wavelet-based coding algorithms, namely 3D QT-L, 3D SPIHT and JPEG2K. In the first step we make an objective comparison between the three coders, namely 3D SPIHT, 3D QT-L and JPEG2K. For this purpose, eight MRI head scan test sets of 256 × 256 × 124 voxels have been used. The results show superior performance of the 3D SPIHT algorithm, whereas 3D QT-L outperforms JPEG2K. The second step consists of evaluating the robustness of the 3D SPIHT and JPEG2K coding algorithms over wireless transmission. Compressed dataset images are transmitted over an AWGN wireless channel or over a Rayleigh wireless channel. The results show the superiority of JPEG2K under these two channel models; in fact, it has been deduced that JPEG2K is more robust to coding errors. Thus we may conclude that corrector codes are necessary to protect the transmitted medical information.
Abstract: The application of neural networks to disease
diagnosis has made great progress and is widely used by physicians.
An electrocardiogram (ECG) carries vital information about heart
activity, and physicians use this signal for cardiac disease
diagnosis, which was the main motivation for our study. In our work,
the tachycardia features obtained are used for training and testing a
neural network. In this study we use fuzzy probabilistic neural
networks as an automatic technique for ECG signal analysis. Since
every real signal recorded by the equipment can contain different
artifacts, some preprocessing steps are needed before feeding it
to our system. The wavelet transform is used for extracting the
morphological parameters of the ECG signal. The results of the
approach for a variety of arrhythmias show that the presented
approach is superior to previously presented algorithms, with an
average accuracy of about 95% for more than 7 tachyarrhythmias.
Abstract: As is known, one of the priority research directions
in the natural sciences is the introduction into practice of applied
branches of contemporary mathematics, such as approximate and
numerical methods for solving integral equations. We encounter the
solution of integral equations when studying many natural phenomena,
and quadrature methods are mainly applied for their numerical
solution. Taking into account some deficiencies of quadrature
methods for finding the solution of integral equations, some
scientists have suggested multistep methods with constant
coefficients. Unlike those papers, here we consider the application
of hybrid methods to the numerical solution of Volterra integral
equations. The efficiency of the suggested method is proved and a
concrete method with accuracy order p = 4 is constructed. This
method is more precise than the corresponding known methods.
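For contrast with the hybrid methods discussed, the basic quadrature approach to a linear Volterra equation of the second kind, y(x) = g(x) + ∫ₐˣ K(x, t) y(t) dt, can be sketched with the composite trapezoidal rule (a standard textbook scheme, not the paper's method; names are illustrative):

```python
import numpy as np

def volterra_trapezoid(g, K, a, b, n):
    """Solve y(x) = g(x) + integral from a to x of K(x, t) y(t) dt
    on [a, b] with the composite trapezoidal rule (n subintervals)."""
    h = (b - a) / n
    x = a + h * np.arange(n + 1)
    y = np.empty(n + 1)
    y[0] = g(x[0])                       # the integral vanishes at x = a
    for i in range(1, n + 1):
        s = 0.5 * K(x[i], x[0]) * y[0]
        s += sum(K(x[i], x[j]) * y[j] for j in range(1, i))
        # the trapezoid rule makes y_i appear on both sides; solve for it
        y[i] = (g(x[i]) + h * s) / (1.0 - 0.5 * h * K(x[i], x[i]))
    return x, y
```

This scheme has accuracy order 2, which is the kind of limitation (together with the step-by-step error accumulation) that motivates higher-order multistep and hybrid methods such as the order p = 4 method constructed in the paper.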
Abstract: In this paper we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of some cryptographic algorithms are of great interest. Besides some basic lemmata, we explore an implementation of a complex function that is used in cryptography. More precisely, we describe formal properties of this implementation that we prove by computer. We describe formalized probability distributions (σ-algebras, probability spaces and conditional probabilities). These are given in the formal language of the formal proof system Isabelle/HOL. Moreover, we computer-prove Bayes' formula. We also describe an application of the presented formalized probability distributions to cryptography. Furthermore, this paper shows that computer proofs of complex cryptographic functions are possible by presenting an implementation of the Miller-Rabin primality test that admits formal verification. Our achievements are a step towards the computer verification of cryptographic primitives. They describe a basis for computer verification in cryptography. Computer verification can be applied to further problems in cryptographic research if the corresponding basic mathematical knowledge is available in a database.
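The Miller-Rabin test whose verified implementation the paper describes can be stated, unverified, in a few lines of Python (a standard probabilistic version, not the formally verified Isabelle/HOL artifact):

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):      # quick trial division
        if n % p == 0:
            return n == p
    # write n - 1 = 2^s * d with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)  # random base
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                # a is a witness: n is composite
    return True                         # probably prime
```

A composite n passes a single round with probability at most 1/4, so with 20 independent rounds the error probability is below 4⁻²⁰; it is exactly such probabilistic correctness statements that the paper's formalized probability distributions are used to verify.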
Abstract: A theory for optimal filtering of infinite sets of random
signals is presented. There are several new distinctive features of the
proposed approach. First, a single optimal filter for processing any
signal from a given infinite signal set is provided. Second, the filter is
presented in the special form of a sum with p terms where each term
is represented as a combination of three operations. Each operation
is a special stage of the filtering aimed at facilitating the associated
numerical work. Third, an iterative scheme is incorporated into the
filter structure to improve the filter performance at
each step of the scheme. The final step of the scheme concerns signal
compression and decompression. This step is based on the solution of
a new rank-constrained matrix approximation problem. The solution
to the matrix problem is described in this paper. A rigorous error
analysis is given for the new filter.
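The classical rank-constrained matrix approximation underlying such compression steps is the truncated SVD (the Eckart-Young theorem); the paper's new rank-constrained problem generalizes this setting. A sketch of the classical case:

```python
import numpy as np

def best_rank_k(A, k):
    """Best rank-k approximation of A in the Frobenius norm
    (Eckart-Young), obtained by truncating the SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k]   # keep the k largest singular values
```

The approximation error equals the root sum of squares of the discarded singular values, which is what makes the rank constraint a natural compression mechanism.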
Abstract: Bus network design is an important problem in
public transportation. The main step in this design is determining the
number of required terminals and their locations. This is a special
type of facility location problem, a large-scale combinatorial
optimization problem that requires a long time to solve.
The genetic algorithm (GA) is a search and optimization technique
which works based on the evolutionary principles of natural
genetics. Specifically, the evolution of chromosomes through
crossover, mutation and natural selection, based on Darwin's
survival-of-the-fittest principle, is artificially simulated to
constitute a robust search and optimization procedure.
In this paper, we first state the problem as a mixed integer
programming (MIP) problem. Then we design a new crossover and
mutation for the bus terminal location problem (BTLP). We tested
different parameter settings of the genetic algorithm (on a sample
problem) and obtained the optimal parameters for solving the BTLP
by numerical trial and error.
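The roles of crossover, mutation and selection described above can be illustrated with a generic GA skeleton on the toy one-max problem (an illustration of the operators, not the BTLP-specific crossover and mutation designed in the paper; all names and parameter values are our own):

```python
import random

def genetic_algorithm(fitness, n_bits, pop_size=40, generations=80,
                      p_cross=0.9, p_mut=0.02, seed=1):
    """Minimal GA: tournament selection, one-point crossover,
    bit-flip mutation, maximizing the given fitness function."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)
        return max(a, b, key=fitness)[:]        # fitter of two random parents

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            if rng.random() < p_cross:          # one-point crossover
                cut = rng.randrange(1, n_bits)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):              # bit-flip mutation
                nxt.append([bit ^ (rng.random() < p_mut) for bit in child])
        pop = nxt[:pop_size]
    return max(pop, key=fitness)
```

For the BTLP, the chromosome would instead encode candidate terminal sites, with problem-specific operators preserving feasibility, as the paper proposes.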
Abstract: Decision-making preferences over certain criteria
usually focus on positive degrees without considering negative
degrees. However, in real-life situations, evaluation becomes more
comprehensive if negative degrees are considered concurrently.
Preference is expected to be more effective when both positive and
negative degrees of preference are considered in evaluating the best
selection. Therefore, the aim of this paper is to propose
conflicting bifuzzy preference relations in group decision making
through the utilization of a novel score function. The conflicting
bifuzzy preference relation is obtained by introducing some
modifications to intuitionistic fuzzy preference relations. Relaxing
the intuitionistic condition by taking positive and negative degrees
into account simultaneously and utilizing the novel score function
are the main modifications made to establish the proposed preference
model. The proposed model is tested with a numerical example and
proved to be simple and practical. The four-step decision model
shows the efficiency of obtaining preferences in group decision
making.