Abstract: Ant colony optimization (ACO) and its variants are
applied extensively to solve various continuous optimization
problems. In the various diversification and intensification
schemes of ACO for continuous function optimization, researchers
generally update components of the multidimensional state vector
to generate the new search point(s). However, diversifying to a
new region of the search space by updating only components of the
multidimensional vector does not ensure that the new point lies at
a significant distance from the current solution. If a minimum
distance is not ensured during diversification, there is always a
possibility that the search will end up reaching only a local
optimum. To overcome such situations, a Mahalanobis distance-based
diversification scheme with a Nelder-Mead simplex-based search for
each ant is proposed for the ACO strategy. Comparative
computational results on nine nonlinear standard test problems
confirm that the performance of ACO improves significantly with
the integration of the proposed schemes.
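The distance check underlying the proposed diversification can be sketched as follows (plain Python, a 2-D toy covariance, and hypothetical helper names; this illustrates the acceptance criterion only, not the paper's ACO implementation):

```python
import math

def mahalanobis_2d(x, mu, cov):
    # Mahalanobis distance for a 2-D point given a 2x2 covariance matrix.
    a, b = cov[0]
    c, d = cov[1]
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    dx = [x[0] - mu[0], x[1] - mu[1]]
    # quadratic form dx^T * inv(cov) * dx
    q = (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
         + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))
    return math.sqrt(q)

def accept_candidate(candidate, current, cov, d_min):
    # Hypothetical diversification filter: keep only candidates lying at
    # least d_min away from the current solution in Mahalanobis terms.
    return mahalanobis_2d(candidate, current, cov) >= d_min

cov = [[1.0, 0.0], [0.0, 1.0]]  # identity: reduces to Euclidean distance
print(accept_candidate((3.0, 4.0), (0.0, 0.0), cov, 2.0))  # distance 5.0
print(accept_candidate((0.5, 0.5), (0.0, 0.0), cov, 2.0))
```

With a non-identity covariance the same check accounts for the spread of the ants' current solutions along each dimension.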
Abstract: The literature offers metrics for identifying the
quality of reusable components, but a framework that uses these
metrics to precisely predict the reusability of software components
still needs to be worked out. If these reusability metrics are
identified in the design phase, or even in the coding phase, they
can help reduce rework by improving the quality of reuse of the
software component, and hence improve productivity through a
probable increase in the reuse level. Since the CK metrics suite is
the most widely used set of metrics for extracting the structural
features of object-oriented (OO) software, this study uses a tuned
CK metrics suite, i.e., WMC, DIT, NOC, CBO and LCOM, to obtain a
structural analysis of OO-based software components. An algorithm
is proposed in which the tuned metric values of an OO software
component are given as inputs to a K-Means clustering system, and a
decision tree is formed under 10-fold cross-validation to evaluate
the component in terms of a linguistic reusability value. The
developed reusability model has produced high-precision results,
as desired.
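The clustering step can be sketched as follows (plain Python, toy CK metric vectors, and a deterministic K-Means initialization; the metric values are illustrative, not taken from the study):

```python
def kmeans(points, k, iters=20):
    # Tiny K-Means; initial centroids are the first k points (deterministic).
    centroids = [list(p) for p in points[:k]]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each component vector to its nearest centroid
            i = min(range(k), key=lambda j: sum((a - b) ** 2
                    for a, b in zip(p, centroids[j])))
            clusters[i].append(p)
        for j, members in enumerate(clusters):
            if members:
                centroids[j] = [sum(col) / len(members) for col in zip(*members)]
    return centroids, clusters

# Each vector: (WMC, DIT, NOC, CBO, LCOM) -- illustrative values only.
components = [(5, 1, 0, 2, 1), (6, 1, 1, 3, 2),
              (40, 5, 8, 20, 30), (38, 4, 7, 18, 28)]
centroids, clusters = kmeans(components, 2)
print(len(clusters[0]), len(clusters[1]))  # two low-coupling, two high-coupling
```

The resulting cluster labels would then feed the decision-tree stage that maps components to linguistic reusability values.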
Abstract: This paper presents a procedure of forming the
mathematical model of radial electric power systems for simulation
of both transient and steady-state conditions. The research idea has
been based on nodal voltages technique and on differentiation of
Kirchhoff's current law (KCL) applied to each non-reference node of
the radial system, the result of which the nodal voltages has been
calculated by solving a system of algebraic equations. Currents of the
electric power system components have been determined by solving
their respective differential equations. Transforming the three-phase
coordinate system into Cartesian coordinate system in the model
decreased the overall number of equations by one third. The use of
Cartesian coordinate system does not ignore the DC component
during transient conditions, but restricts the model's implementation
for symmetrical modes of operation only. An example of the input
data for a four-bus radial electric power system has been calculated.
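The core numerical step, solving the nodal equations for the voltages, can be sketched for a hypothetical two-node case (illustrative admittance and injection values, not the paper's four-bus data):

```python
def solve2(G, I):
    # Solve a 2x2 nodal-admittance system G @ V = I by Cramer's rule.
    det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
    v1 = (I[0] * G[1][1] - G[0][1] * I[1]) / det
    v2 = (G[0][0] * I[1] - I[0] * G[1][0]) / det
    return v1, v2

# Hypothetical two-node radial feeder: self and mutual admittances in
# siemens, nodal current injections in amperes (values illustrative only).
G = [[3.0, -1.0], [-1.0, 2.0]]
I = [5.0, 0.0]
V = solve2(G, I)
print(V)
```

A real radial system yields one such equation per non-reference node; larger systems would use a general linear solver instead of Cramer's rule.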
Abstract: Texture information plays an increasingly important
role in remotely sensed imagery classification and many pattern
recognition applications. However, the selection of relevant textural
features to improve classification accuracy is not a straightforward
task. This work investigates the effectiveness of two Mutual
Information Feature Selector (MIFS) algorithms to select salient
textural features that contain highly discriminatory information for
multispectral imagery classification. The input candidate features are
extracted from a SPOT High Resolution Visible (HRV) image using the
Wavelet Transform (WT) at levels l = 1, 2.
The experimental results show that the textural features selected
by the MIFS algorithms contribute more to improving classification
accuracy than classical approaches such as Principal Components
Analysis (PCA) and Linear Discriminant Analysis (LDA).
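A minimal sketch of Battiti-style MIFS selection (toy binary features and labels, and an assumed β = 0.5; not necessarily the exact variants evaluated in the paper):

```python
import math
from collections import Counter

def mutual_info(xs, ys):
    # Empirical mutual information (in nats) of two discrete sequences.
    n = len(xs)
    pxy, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum(c / n * math.log((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

def mifs(features, labels, k, beta=0.5):
    # Greedily pick the feature maximizing I(f; C) minus a redundancy
    # penalty beta * sum of I(f; s) over already-selected features s.
    selected, candidates = [], list(range(len(features)))
    while len(selected) < k and candidates:
        best = max(candidates,
                   key=lambda f: mutual_info(features[f], labels)
                   - beta * sum(mutual_info(features[f], features[s])
                                for s in selected))
        selected.append(best)
        candidates.remove(best)
    return selected

labels = [0, 0, 1, 1, 0, 1, 0, 1]
f0 = [0, 0, 1, 1, 0, 1, 0, 1]   # copies the class: maximally informative
f1 = [0, 0, 1, 1, 0, 1, 0, 0]   # noisy copy of the class
f2 = [1, 0, 1, 0, 1, 0, 1, 0]   # weakly related to the class
print(mifs([f0, f1, f2], labels, 2))
```

In the paper's setting the candidate features would be the wavelet-derived texture measures and the labels the land-cover classes.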
Abstract: WOLEDs are widely used for lighting because of their high efficacy and low power consumption. In this research, the power factors of a WOLED and a fluorescent lamp are tested to determine which one consumes energy more efficiently. Since both lamps use semiconductor components, the power factor calculation needs to consider the effects of harmonics; harmonics increase losses. The study is conducted by comparing the power factor with harmonics disregarded (the displacement power factor, DPF) and with harmonics included (the true power factor, TPF). The average DPF of the fluorescent lamp is 0.953, while that of the WOLED is 0.972. The average TPF of the fluorescent lamp is 0.717, whereas that of the WOLED is 0.933. From the power factor standpoint, therefore, the WOLED is more energy efficient than the fluorescent lamp.
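The relation between the two power factors can be illustrated numerically: TPF equals DPF times the distortion factor 1/√(1 + THD²), so the reported averages imply a current THD for each lamp (a back-of-envelope sketch assuming this standard relation):

```python
import math

def true_power_factor(dpf, thd):
    # TPF = DPF * distortion factor, with the distortion factor equal to
    # 1 / sqrt(1 + THD^2) for current THD expressed as a fraction.
    return dpf / math.sqrt(1.0 + thd ** 2)

def implied_thd(dpf, tpf):
    # Back out the current THD implied by a reported DPF/TPF pair.
    return math.sqrt((dpf / tpf) ** 2 - 1.0)

# Average values reported in the abstract:
print(round(implied_thd(0.953, 0.717), 3))  # fluorescent lamp
print(round(implied_thd(0.972, 0.933), 3))  # WOLED
```

The much smaller implied THD for the WOLED is what keeps its TPF close to its DPF.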
Abstract: A typical definition of Computer Aided Diagnosis
(CAD), found in the literature, is: a diagnosis made by a radiologist
using the output of a computerized scheme for automated image
analysis as a diagnostic aid. Often it is possible to find the expression
Computer Aided Detection (CAD or CADe): this definition
emphasizes the intent of CAD to support, rather than substitute for,
the human observer in the analysis of radiographic images. In this
article we illustrate the application of CAD systems and the aim of
these definitions.
Commercially available CAD systems use computerized
algorithms for identifying suspicious regions of interest. In this
paper, general CAD systems are described as expert systems
constituted of the following components: segmentation / detection,
feature extraction, and classification / decision making.
As an example, this work shows the realization of a Computer-
Aided Detection system that is able to assist the radiologist in
identifying types of mammary tumor lesions. Furthermore, this
prototype station uses a GRID configuration to work on a large
distributed database of digitized mammographic images.
Abstract: The paper presents a method for multivariate time
series forecasting using Independent Component Analysis (ICA) as a preprocessing tool. The idea of this approach is to do the forecasting in the space of independent components (sources), and then to transform the results back to the original time series
space. The forecasting can be done separately and with a different
method for each component, depending on its time structure. The
paper also gives a review of the main algorithms for independent component analysis in the case of instantaneous mixture models, using second- and higher-order statistics. The method has been applied in simulation to an artificial multivariate time series
with five components, generated from three sources and a randomly generated mixing matrix.
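The overall pipeline can be sketched as follows (a toy two-source example in plain Python with a known mixing matrix and a naive linear-extrapolation forecaster; a real ICA algorithm would have to estimate the unmixing matrix from the observations alone):

```python
def mat2_inv(m):
    # Inverse of a 2x2 matrix.
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [[m[1][1] / det, -m[0][1] / det],
            [-m[1][0] / det, m[0][0] / det]]

def matvec(m, v):
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

def forecast_next(series):
    # Placeholder per-source forecaster: linear extrapolation from the last
    # two samples; any method suited to the source's time structure fits here.
    return 2 * series[-1] - series[-2]

# Two hypothetical sources and a known 2x2 mixing matrix A (values illustrative).
S = [[1.0, 2.0, 3.0, 4.0], [5.0, 3.0, 1.0, -1.0]]
A = [[2.0, 1.0], [1.0, 3.0]]
X = [matvec(A, [S[0][t], S[1][t]]) for t in range(4)]  # observed mixtures

W = mat2_inv(A)  # with A known the unmixing is exact; ICA estimates W from X
sources = list(zip(*[matvec(W, x) for x in X]))        # recovered sources
s_next = [forecast_next(list(s)) for s in sources]     # forecast each source
x_next = matvec(A, s_next)                             # map back to series space
print(x_next)
```

Forecasting each independent source separately, then mixing the forecasts, is exactly the transform-forecast-back-transform idea described above.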
Abstract: This paper describes a proposed support system which
enables applications designers to effectively create VR applications
using multiple haptic APIs. When the VR designers create
applications, it is often difficult to handle and understand many
parameters and functions that have to be set in the application program
using documentation manuals only. This complication may disrupt
creative imagination and result in inefficient coding. Therefore, we
previously proposed a support application that improved the efficiency
of VR application development and provided interactive components for
confirming operations through the haptic sense.
In this paper, we describe improvements to our formerly proposed
support application, which is applicable to multiple APIs and haptic
devices, and evaluate the new application by having participants
complete a VR program. Results from a preliminary experiment suggest
that our application facilitates creation of VR applications.
Abstract: In this work, the primary compressive strength
components of human femur trabecular bone are qualitatively
assessed using image processing and wavelet analysis. The Primary
Compressive (PC) component in planar radiographic femur trabecular
images (N=50) is delineated by a semi-automatic image processing
procedure. An auto-threshold binarization algorithm is employed to
recognize the presence of mineralization in the digitized images. The
qualitative parameters such as apparent mineralization and total area
associated with the PC region are derived for normal and abnormal
images. The two-dimensional discrete wavelet transform is utilized
to obtain appropriate features that quantify texture changes in medical
images. The normal and abnormal samples of the human femur are
comprehensively analyzed using the Haar wavelet. Six statistical
parameters, namely the mean, median, mode, standard deviation, mean
absolute deviation and median absolute deviation, are derived at level
4 decomposition for both the approximation and horizontal wavelet
coefficients. The correlation coefficients of the various wavelet-derived
parameters with the normal and abnormal groups, for both the
approximation and horizontal coefficients, are estimated. It is seen
that in almost all cases the abnormal images show a higher degree of
correlation than the normal ones. Further, the parameters derived from
the approximation coefficients show more correlation than those
derived from the horizontal coefficients. The mean and median computed
at the output of the level-4 Haar wavelet channel were found to be
useful predictors to delineate the normal and the abnormal groups.
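The wavelet step can be sketched minimally as follows (one decomposition level on a 4×4 toy image with an averaging normalization convention; the study itself uses level-4 Haar decomposition on full radiographs):

```python
import statistics

def haar2d_level1(img):
    # One level of a 2-D Haar transform: returns the approximation (LL)
    # and horizontal-detail (LH) sub-bands of an even-sized grayscale image.
    # Normalization conventions vary; an averaging form is used here.
    h, w = len(img), len(img[0])
    ll, lh = [], []
    for r in range(0, h, 2):
        ll_row, lh_row = [], []
        for c in range(0, w, 2):
            a, b = img[r][c], img[r][c + 1]
            d, e = img[r + 1][c], img[r + 1][c + 1]
            ll_row.append((a + b + d + e) / 4.0)   # block average
            lh_row.append((a + b - d - e) / 4.0)   # horizontal detail
        ll.append(ll_row)
        lh.append(lh_row)
    return ll, lh

img = [[10, 12, 50, 52],
       [11, 13, 51, 53],
       [10, 10, 50, 50],
       [12, 12, 52, 52]]
ll, lh = haar2d_level1(img)
flat = [v for row in ll for v in row]
print(statistics.mean(flat), statistics.median(flat))
```

Repeating the decomposition on the LL band three more times and computing the six statistics on the level-4 coefficients mirrors the procedure described above.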
Abstract: Yeast cells live in a constantly changing environment that requires the continuous adaptation of their genomic program in order to sustain their homeostasis, survive and proliferate. Due to the advancement of high-throughput technologies, there is currently a large amount of data such as gene expression, gene deletion and protein-protein interactions for S. cerevisiae under various environmental conditions. Mining these datasets requires efficient computational methods capable of integrating different types of data, identifying inter-relations between different components and inferring functional groups or 'modules' that shape intracellular processes. This study uses computational methods to delineate some of the mechanisms used by yeast cells to respond to environmental changes. The GRAM algorithm is first used to integrate gene expression data and ChIP-chip data in order to find modules of co-expressed and co-regulated genes as well as the transcription factors (TFs) that regulate these modules. Since transcription factors are themselves transcriptionally regulated, a three-layer regulatory cascade consisting of the TF-regulators, the TFs and the regulated modules is subsequently considered. This three-layer cascade is then modeled quantitatively using artificial neural networks (ANNs), where the input layer corresponds to the expression of the upstream transcription factors (TF-regulators) and the output layer corresponds to the expression of genes within each module. This work (a) shows that the expression of at least 33 genes over time and under different stress conditions is well predicted by the expression of the top-layer transcription factors, including cases in which the effect of upstream regulators is shifted in time, and (b) identifies at least 6 novel regulatory interactions that were not previously associated with stress-induced changes in gene expression.
These findings suggest that the combination of gene expression and protein-DNA interaction data with artificial neural networks can successfully model biological pathways and capture quantitative dependencies between distant regulators and downstream genes.
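The modeling idea can be illustrated with a highly simplified stand-in for the ANN (a single linear layer trained by stochastic gradient descent on synthetic expression values; the study uses multi-layer networks on real expression data):

```python
def train(X, y, lr=0.1, epochs=500):
    # Fit weights w so that gene expression ~ w . tf_expression via SGD.
    w = [0.0] * len(X[0])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = sum(wj * xj for wj, xj in zip(w, xi))
            err = pred - yi
            for j in range(len(w)):
                w[j] -= lr * err * xi[j]
    return w

# Synthetic data: two TF-regulator expression levels per condition (inputs)
# and one downstream gene's expression (output), generated as 2*tf1 - tf2.
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]]
y = [2.0, -1.0, 1.0, 0.5]
w = train(X, y)
print([round(v, 2) for v in w])
```

The recovered weights reflect the sign and strength of each upstream regulator's influence, which is the kind of quantitative dependency the full ANN model captures.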
Abstract: Knowledge bases are basic components of expert
systems and intelligent computational programs. Knowledge bases
provide the knowledge and facts that serve deduction,
computation and control. Therefore, researching and developing
models for knowledge representation play an important role in
computer science, especially in Artificial Intelligence and
intelligent educational software. In this paper, an extensive
deduction computational model is proposed for designing knowledge
bases whose attributes can take real values or functional values.
The system can also solve problems based on these knowledge bases.
Moreover, the models and algorithms are applied to produce
educational software for solving alternating-current problems or
solving sets of equations automatically.
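A minimal sketch of such rule-based deduction over real-valued attributes (hypothetical rules and electrical relations for illustration, not the paper's actual model):

```python
# Each rule: (required attributes, derived attribute, computation).
rules = [
    ({"V", "Z"}, "I", lambda f: f["V"] / f["Z"]),        # Ohm's law
    ({"V", "I"}, "P", lambda f: f["V"] * f["I"]),        # apparent power
    ({"P", "pf"}, "P_real", lambda f: f["P"] * f["pf"]), # real power
]

def deduce(facts, goal):
    # Forward chaining: fire any rule whose inputs are known until the
    # goal attribute is derived or no rule can fire.
    facts = dict(facts)
    changed = True
    while changed and goal not in facts:
        changed = False
        for needs, gives, fn in rules:
            if needs <= facts.keys() and gives not in facts:
                facts[gives] = fn(facts)
                changed = True
    return facts.get(goal)

print(deduce({"V": 230.0, "Z": 23.0, "pf": 0.9}, "P_real"))
```

Starting from the given attributes, the engine chains through intermediate values (here the current and apparent power) until the requested quantity is derived.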
Abstract: The overall objective of this research is to develop a
strain improvement technology for efficient pectinase production. A
novel cell cultivation technology based on the immobilization of
fungal cells has been studied in long-term continuous fermentations.
Immobilization was achieved using a new carrier material for the
adsorption of the immobilized cultures, employed for the first time
for the immobilization of microorganisms. The effects of various
conditions of nitrogen and carbon nutrition on the biosynthesis of
pectolytic enzymes in the Aspergillus awamori 1-8 strain were
studied. The proposed cultivation technology, along with optimization
of the media components for pectinase overproduction, led to a 7- to
8-fold increase in pectinase productivity in Aspergillus awamori 1-8.
The proposed technology can be applied successfully for the
production of major industrial enzymes such as α-amylase, protease
and collagenase.
Abstract: Interoperability is the ability of information systems to operate in conjunction with each other, encompassing communication protocols, hardware, software, application, and data compatibility layers. There has been considerable work in industry on the development of component interoperability models, such as CORBA, (D)COM and JavaBeans. These models are intended to reduce the complexity of software development and to facilitate reuse of off-the-shelf components. The focus of these models is syntactic interface specification, component packaging, inter-component communications, and bindings to a runtime environment. What these models lack is a consideration of architectural concerns – specifying systems of communicating components, explicitly representing loci of component interaction, and exploiting architectural styles that provide well-understood global design solutions. The development of complex business applications is now focused on an assembly of components available on a local area network or on the net. These components must be located and identified in terms of available services and communication protocols before any request. The first part of the article introduces the basic concepts of components and middleware, while the following sections describe the different up-to-date models of communication and interaction, and the last section shows how different models can communicate among themselves.
Abstract: This paper gives an overview of the mapping
mechanism of SEAM, a methodology for the automatic generation of
knowledge models, and of its mapping onto Java code. It discusses the
rules that will be used to automatically map the different components
of the knowledge model onto Java classes, properties and
methods. The aim of developing this mechanism is to help in the
creation of a prototype which will be used to validate the knowledge
model that has been generated automatically. It will also help to
link the modeling phase with the implementation phase, as existing
knowledge engineering methodologies do not provide proper
guidelines for the transition from the knowledge modeling phase to
the development phase. This will decrease the development overheads
associated with the development of Knowledge Based Systems.
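As a toy illustration of mapping rules of this kind (a hypothetical concept structure, not SEAM's actual metamodel), a knowledge-model concept with typed attributes could be rendered as a Java class skeleton:

```python
def concept_to_java(name, attributes):
    # Map a knowledge-model concept to a Java class: each attribute becomes
    # a private field plus a getter (a simplified stand-in for such rules).
    lines = [f"public class {name} {{"]
    for attr, jtype in attributes:
        lines.append(f"    private {jtype} {attr};")
    for attr, jtype in attributes:
        getter = "get" + attr[0].upper() + attr[1:]
        lines.append(f"    public {jtype} {getter}() {{ return {attr}; }}")
    lines.append("}")
    return "\n".join(lines)

src = concept_to_java("Patient", [("age", "int"), ("name", "String")])
print(src)
```

A full mapping would also cover methods, relations between concepts, and inheritance, each with its own rule.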
Abstract: In this paper a new fast simplification method is presented. The method realizes Karnaugh maps with a large number of variables. In order to accelerate the operation of the proposed method, a new approach for fast detection of groups of ones is presented. This approach is implemented in the frequency domain: the search operation relies on performing cross-correlation in the frequency domain rather than in the time domain. It is proved mathematically and practically that the number of computation steps required by the presented method is less than that needed by conventional cross-correlation. Simulation results using MATLAB confirm the theoretical computations. Furthermore, a powerful solution for the realization of complex functions is given. The simplified functions are implemented using a new design for neural networks. Neural networks are used because they are fault tolerant and, as a result, can recognize signals even with noise or distortion. This is very useful for logic functions used in data and computer communications. Moreover, the implemented functions are realized with a minimum amount of components. This is done by using modular neural nets (MNNs) that divide the input space into several homogeneous regions. This approach is applied to implement the XOR function, 16 logic functions at the one-bit level, and a 2-bit digital multiplier. Compared to previous non-modular designs, a clear reduction in the order of computations and in hardware requirements is achieved.
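The frequency-domain search can be sketched as follows (a naive DFT in plain Python on a toy 8-cell row; a template of ones is cross-correlated with the row, and the peaks mark candidate group positions — an illustration of the principle, not the paper's accelerated method):

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)) / n for t in range(n)]

def circular_xcorr(a, b):
    # Cross-correlation via the frequency domain:
    # c[t] = IDFT(conj(DFT(a)) * DFT(b))[t] = sum_k a[k] * b[(k+t) mod n].
    A, B = dft(a), dft(b)
    return [v.real for v in idft([ac.conjugate() * bc
                                  for ac, bc in zip(A, B)])]

# Search a Karnaugh-map row for 2-wide groups of ones with a [1, 1] template.
row = [0, 1, 1, 0, 0, 0, 1, 1]
template = [1, 1] + [0] * 6
scores = circular_xcorr(template, row)
print([round(s) for s in scores])  # peaks of 2 mark where a group starts
```

With an FFT in place of the naive DFT, this drops the search cost from O(n²) to O(n log n), which is the kind of speed-up the frequency-domain formulation targets.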
Abstract: A minimal complexity version of component mode
synthesis is presented that requires simplified computer
programming, but still provides adequate accuracy for modeling
lower eigenproperties of large structures and their transient
responses. The novelty is that a structural separation into components
is done along a plane/surface that exhibits rigid-like behavior, so
that the normal modes of each component alone are sufficient, without
computing any constraint, attachment, or residual-attachment modes.
The approach requires only such input information as a few (lower)
natural frequencies and the corresponding undamped normal modes of
each component. A novel technique for formulating the equations of
motion is shown, in which a double transformation to generalized
coordinates is employed, and the formulation of a nonproportional
damping matrix in generalized coordinates is presented.
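The kind of input information the approach needs, the lower natural frequencies and normal modes of each component, can be illustrated on a toy 2-DOF spring-mass chain (illustrative unit masses and stiffnesses, solved from the characteristic equation; not the paper's formulation):

```python
import math

# 2-DOF undamped system M x'' + K x = 0 with
# K = [[k1+k2, -k2], [-k2, k2]] and M = diag(m1, m2).
m1 = m2 = 1.0
k1 = k2 = 1.0

# det(K - s*M) = 0 gives a quadratic in s = w^2:
# m1*m2*s^2 - (m1*k2 + m2*(k1+k2))*s + k1*k2 = 0
a = m1 * m2
b = -(m1 * k2 + m2 * (k1 + k2))
c = k1 * k2
disc = math.sqrt(b * b - 4 * a * c)
s1, s2 = (-b - disc) / (2 * a), (-b + disc) / (2 * a)
freqs = sorted(math.sqrt(s) for s in (s1, s2))  # natural frequencies (rad/s)
print([round(f, 4) for f in freqs])
```

In component mode synthesis, a few such frequencies and mode shapes per component are the only inputs retained before assembling the reduced equations of motion.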
Abstract: In modern agriculture, polymeric hydrogels are
known as components able to hold large amounts of water due to their
three-dimensional network structure and their tendency to absorb
water in humid environments. In addition, these hydrogels are able to
release, in a controlled manner, the fertilisers and pesticides loaded
in them. They thereby deliver these materials to the plants' roots
and help the plants grow. These hydrogels also reduce the pollution
of underground water sources by preventing the active components
from leaching. In this study, semi-interpenetrating network (sIPN)
acrylamide-based hydrogels are synthesised via free-radical
polymerization of acrylamide with potassium acrylate and linear
polyvinyl alcohol. Ammonium nitrate is loaded into the hydrogel
as the fertiliser. The effect of various amounts of monomers and
linear polymer, measured in molar ratio, on the swelling rate,
equilibrium swelling, and release of ammonium nitrate is studied.
Abstract: The aim of this paper is to study in depth some
methodological aspects of social intervention, focusing on the
desirable passage from the social maternage method to the peer
advocacy method. For this purpose, we intend to analyze the social
and organizational components that affect the operator's professional
action and that are part of his psychological environment, besides
the physical and social ones. In fact, the operator's intervention
should not be limited to a pure supply of techniques, nor take shape
as improvised action merely "full of good purposes".
Abstract: The preparation of good-quality Environmental Impact Assessment (EIA) reports contributes to enhancing the overall effectiveness of EIA. This component of the EIA process becomes more important in situations where public participation is weak and there is a lack of expertise on the part of the competent authority. In Pakistan, EIA became mandatory from July 1994 for every project likely to cause adverse environmental impacts. The competent authority also formulated guidelines for the preparation and review of EIA reports in 1997. However, EIA has yet to prove itself a successful decision-support tool for environmental protection. One of the several reasons for this ineffectiveness is the generally poor quality of EIA reports. This paper critically reviews the EIA reports of some randomly selected projects. Interviews with EIA consultants, project proponents and concerned government officials have also been conducted to pinpoint the root causes of the poor quality of EIA reports. The analysis reveals several inadequacies, particularly in areas relating to the identification, evaluation and mitigation of key impacts and the consideration of alternatives. The paper identifies some opportunities and suggests measures for improving the quality of EIA reports and hence making EIA an effective tool for environmental protection.
Abstract: Reliability assessment and risk analysis of rotating
machine rotors in various overload and malfunction situations
present a challenge to engineers and operators. In this paper a new
analytical method for evaluating rotors under large deformation is
presented. The model is given in general form so as to also include
composite rotors. The presented simulation procedure is based on the
variational work method and can account for geometric nonlinearity,
large displacement, nonlinear support effects and rotor contact with
other machine components. New shape functions are presented which
are capable of accurately predicting the nonlinear profile of the
rotor. Closed-form solutions for various operating and malfunction
situations are expressed, and the analytical simulation results
are discussed.