Abstract: A vertex configuration of a vertex in an orthogonal pseudo-polyhedron is an identity of the vertex determined by the number of edges, the dihedral angles, and the non-manifold properties meeting at the vertex. There are up to sixteen vertex configurations for any orthogonal pseudo-polyhedron (OPP). Understanding the relationships between these vertex configurations gives insight into the structure of an OPP and helps in designing better algorithms for many three-dimensional geometric problems. In this paper, the 16 vertex configurations for OPPs are described first. This is followed by a number of formulas that give insight into the relationships between different vertex configurations in an OPP. These formulas extend the usefulness of orthogonal polyhedra to pattern analysis in 3D digital images.
Abstract: In this paper, we describe a rule-based message passing method to support the development of collaborative applications, in which multiple users share resources in distributed environments. Message communications of applications in collaborative environments tend to be very complex because of the need to manage context situations such as sharing events, access control of users, and network places. In this paper, we propose a message communication method based on unification, a technique from artificial intelligence and logic programming, for defining rules about such context information in a procedural object-oriented programming language. We also present an implementation of the method as Java classes.
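As a minimal illustration of the rule-based dispatch idea (a hypothetical Python sketch; the paper's actual method uses unification and is implemented as Java classes, and the event names and roles below are invented for the example):

```python
# Hypothetical sketch of rule-based message passing for a collaborative
# application: each rule pairs a predicate on the message context with an
# action to perform when the predicate matches.
def make_rule(predicate, action):
    """A rule fires its action when its predicate matches the message context."""
    return (predicate, action)

RULES = [
    make_rule(lambda m: m["event"] == "share" and m["role"] == "owner",
              lambda m: f"broadcast {m['resource']} to group"),
    make_rule(lambda m: m["event"] == "share" and m["role"] != "owner",
              lambda m: "access denied"),
]

def dispatch(message):
    """Return the action result of the first rule whose predicate matches."""
    for predicate, action in RULES:
        if predicate(message):
            return action(message)
    return "no rule matched"
```

For example, `dispatch({"event": "share", "role": "owner", "resource": "doc1"})` applies the first rule, while a non-owner sharing request is denied by the second.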
Abstract: The use of a high-strength Pulsed Electric Field (PEF) in the food industry is a non-thermal process that can deactivate microorganisms and increase penetration into plant and animal tissues without a serious impact on food taste and quality. In this paper, the design and fabrication of a PEF generator are presented. Pulse generation methods were surveyed and the most suitable one selected. Through its controller, the equipment can generate square pulses with adjustable parameters: amplitude 1-5 kV, frequency 0.1-10 Hz, pulse width 10-100 µs, and duty cycle 0-100%. Setting the number of pulses and displaying the output voltage and current waveforms on an oscilloscope screen are further advantages of this equipment. Finally, some food samples were tested, yielding satisfactory results. Applying PEF had considerable effects on potato, banana, and purple cabbage. It increased the Brix factor of the potato solution from 0.05 to 0.15. It was also very effective in extracting color material from purple cabbage. In the last experiment, the effects of PEF voltage on color extraction from saffron scum were surveyed (about a 6% increase in yield).
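The relation between the generator's adjustable timing parameters can be checked with simple arithmetic. A small sketch (illustrative only, using the parameter ranges stated above):

```python
def pulse_timing(frequency_hz, duty_cycle_pct):
    """Return (period_s, on_time_s) of a square pulse train.

    Sketch only: the actual generator sets amplitude, frequency,
    pulse width, and duty cycle through its controller set.
    """
    period = 1.0 / frequency_hz            # e.g. 10 Hz -> 0.1 s period
    on_time = period * duty_cycle_pct / 100.0
    return period, on_time
```

For example, at the top of the stated ranges, `pulse_timing(10, 50)` gives a 0.1 s period with the output high for 0.05 s.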
Abstract: The construction of an original functional sample of a portable device for the fast analysis of energetic materials is described in this paper. The portable device, consisting of two parts – an original miniaturized microcolumn liquid chromatograph and a unique chemiluminescence detector – has been proposed and realized. In a very short time, this portable device is capable of selectively identifying most military nitramine- and nitroester-based explosives, as well as inorganic nitrates, occurring in trace concentrations in water or in soil. The total time required for the identification of extracts is shorter than 8 minutes.
Abstract: This study investigated the educational implications that can be derived from the work of celebrated figures such as Piaget, Vygotsky, and Bruner that are helpful in the field of language learning. However, the writer believes these views were previously expressed, though not fully fledged, by Comenius, who has been described by Howatt (1984) as the one genius that the history of language teaching can claim, and to whom we owe more than to anyone else.
Abstract: Efficient storage, transmission and use of video information are key requirements in many multimedia applications currently being addressed by MPEG-4. To fulfill these requirements, a new approach for representing video information, which relies on an object-based representation, has been adopted. Therefore, object-based watermarking schemes are needed for copyright protection. This paper proposes a novel blind object watermarking scheme for images and video using the in-place lifting shape-adaptive discrete wavelet transform (SA-DWT). In order to make the watermark robust and transparent, the watermark is embedded in the average of wavelet blocks using a visual model based on the human visual system. The n least significant bits (LSBs) of the wavelet coefficients are adjusted in concert with the average. Simulation results show that the proposed watermarking scheme is perceptually invisible and robust against many attacks such as lossy image/video compression (e.g. JPEG, JPEG2000 and MPEG-4), scaling, adding noise, filtering, etc.
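A toy sketch of the block-average embedding idea (quantizing a block average to carry one bit and spreading the change over the block; the paper's scheme operates on SA-DWT coefficients with a human-visual-system model and LSB adjustment, none of which is reproduced here, and the quantization step is an assumed example value):

```python
def embed_bit(block, bit, step=8.0):
    """Quantize the block average to an even/odd multiple of `step`
    to carry one watermark bit, spreading the change over the block."""
    avg = sum(block) / len(block)
    q = round(avg / step)
    if q % 2 != bit:            # force quantizer parity to encode the bit
        q += 1
    delta = q * step - avg      # shift every sample so the average moves
    return [x + delta for x in block]

def extract_bit(block, step=8.0):
    """Recover the bit from the parity of the quantized block average (blind)."""
    avg = sum(block) / len(block)
    return round(avg / step) % 2
```

Extraction needs only the watermarked block, matching the blind property claimed for the scheme.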
Abstract: In this paper we present the PC cluster built at R.V. College of Engineering (with great help from the Departments of Computer Science and Electrical Engineering). The structure of the cluster is described, and its performance is evaluated by rendering complex 3D Persistence of Vision (POV) images with the ray-tracing algorithm. We propose a novel method to render such images in a distributed fashion on a low-cost, scalable cluster.
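A common way to distribute ray-traced rendering across cluster nodes is to split the image into row bands, one per node. A minimal sketch of such a partition (illustrative; the paper's actual distribution scheme is not detailed in the abstract):

```python
def split_rows(height, nodes):
    """Partition image rows [0, height) into contiguous chunks,
    one per cluster node, differing in size by at most one row."""
    base, extra = divmod(height, nodes)
    chunks, start = [], 0
    for i in range(nodes):
        size = base + (1 if i < extra else 0)  # spread the remainder
        chunks.append((start, start + size))
        start += size
    return chunks
```

Each node then renders only its assigned rows, and the master reassembles the bands into the final image.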
Abstract: In this paper, a wavelet-based ANFIS for detecting inter-turn faults in a generator is proposed. The detector responds uniquely to a winding inter-turn fault with remarkably high sensitivity. Discrimination between different percentages of winding affected by an inter-turn fault is provided by an ANFIS with an eight-dimensional input vector. This input vector is obtained from features extracted from the DWT of the inter-turn fault current leaving the generator phase winding. Training data for the ANFIS are generated via a MATLAB simulation of a generator with an inter-turn fault. The proposed ANFIS-based algorithm gives more satisfactory performance than an ANN with selected statistical data from the decomposed levels of the fault current.
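A minimal sketch of the kind of DWT feature extraction involved (a one-level Haar decomposition and the energy of its detail band; the paper's exact wavelet, decomposition depth, and eight statistical features are not specified in the abstract, so this is illustrative only):

```python
def haar_dwt(signal):
    """One-level Haar DWT: return (approximation, detail) coefficients."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def detail_energy(signal):
    """Energy of the detail band - a typical fault-signature feature."""
    _, detail = haar_dwt(signal)
    return sum(d * d for d in detail)
```

Features like this, computed per decomposition level of the fault current, would form the input vector fed to the ANFIS.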
Abstract: Power system stabilizers (PSS) must be capable of providing appropriate stabilization signals over a broad range of operating conditions and disturbances. Traditional PSS rely on robust linear design methods in an attempt to cover a wide range of operating conditions. Expert or rule-based controllers have also been proposed. Recently, fuzzy logic (FL), as a novel robust control design method, has shown promising results. The emphasis in fuzzy control design centers on uncertainties in the system parameters and operating conditions. In this paper a novel Robust Fuzzy Logic Power System Stabilizer (RFLPSS) design is proposed. The RFLPSS utilizes only one measurable signal as input, the generator shaft speed deviation Δω. The speed signal is discretized, resulting in three inputs to the RFLPSS. There are six rules for the fuzzification and two rules for defuzzification. To provide robustness, an additional signal, namely the speed, is used to enable appropriate gain adjustments for the three RFLPSS inputs. Simulation studies show the superior performance of the RFLPSS compared with an optimally designed conventional PSS and a discrete-mode FLPSS.
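A generic fuzzy-logic stabilizer stage can be sketched as triangular membership functions over the speed deviation followed by centroid defuzzification (an illustrative sketch only; the membership ranges, output singletons, and gain below are assumptions, and the RFLPSS's actual six fuzzification and two defuzzification rules are not reproduced):

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over the interval [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_pss_output(dw, gain=10.0):
    """Centroid defuzzification over three speed-deviation sets
    (negative, zero, positive) mapped to stabilizer outputs -1, 0, +1."""
    mu = [tri(dw, -2, -1, 0), tri(dw, -1, 0, 1), tri(dw, 0, 1, 2)]
    out = [-1.0, 0.0, 1.0]
    s = sum(mu)
    return gain * sum(m * o for m, o in zip(mu, out)) / s if s else 0.0
```

With zero speed deviation the stabilizer output is zero, and a positive deviation yields a proportionally positive stabilizing signal.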
Abstract: By systematically applying different engineering methods, difficult financial problems become approachable. Using a combination of theory and techniques such as the wavelet transform, time series data mining, Markov chain based discrete stochastic optimization, and evolutionary algorithms, this work formulated a strategy to characterize and forecast non-linear time series. It attempted to extract typical features from the volatility data sets of the S&P 100 and S&P 500 indices, which include abrupt drops, jumps and other non-linearities. As a result, the forecasting accuracy reached an average of over 75%, surpassing any other publicly available result on the forecasting of a financial index.
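The Markov chain component of such a strategy can be sketched on a discretized volatility series: estimate empirical transition counts between states and forecast the most likely successor of the current state (a toy sketch; the paper's actual discrete stochastic optimization is far richer than this):

```python
def transition_counts(states, n_states):
    """Count state-to-state transitions in a discretized volatility series."""
    counts = [[0] * n_states for _ in range(n_states)]
    for s, t in zip(states, states[1:]):
        counts[s][t] += 1
    return counts

def most_likely_next(states, n_states):
    """Forecast the next state as the most frequent successor of the
    current state (argmax of the empirical transition row)."""
    row = transition_counts(states, n_states)[states[-1]]
    return row.index(max(row))
```

With states labeled, say, 0 = calm, 1 = moderate, 2 = turbulent, the forecast simply follows the dominant historical transition out of the present regime.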
Abstract: In today's economy, plant engineering faces many challenges. For instance, intensifying competition in this business is leading to cost competition and the need for a shorter time-to-market. To remain competitive, companies need to make their businesses more profitable by implementing improvement programs such as standardization projects. But they have difficulty tapping their full economic potential for various reasons. One of them is the non-holistic planning and implementation of standardization projects. This paper describes a new conceptual framework, the layer model. The model combines and expands existing proven approaches in order to improve the design, implementation and management of standardization projects. Based on a holistic approach, it helps to systematically analyze the effects of standardization projects on different business layers and enables companies to better seize the opportunities offered by standardization.
Abstract: Discrete Cosine Transform (DCT) based transform coding is very popular in image, video and speech compression due to its good energy compaction and decorrelating properties. However, at low bit rates, the reconstructed images generally suffer from visually annoying blocking artifacts as a result of coarse quantization. The lapped transform was proposed as an alternative to the DCT with reduced blocking artifacts and increased coding gain. Lapped transforms are popular for their good performance, robustness against oversmoothing and availability of fast implementation algorithms. However, there is no proper study reported in the literature regarding the statistical distributions of block Lapped Orthogonal Transform (LOT) and Lapped Biorthogonal Transform (LBT) coefficients. This study performs two goodness-of-fit tests, the Kolmogorov-Smirnov (KS) test and the χ²-test, to determine the distribution that best fits the LOT and LBT coefficients. The experimental results show that the distribution of a majority of the significant AC coefficients can be modeled by the Generalized Gaussian distribution. The knowledge of the statistical distribution of transform coefficients greatly helps in the design of optimal quantizers that may lead to minimum distortion and hence achieve optimal coding efficiency.
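The KS statistic underlying such a goodness-of-fit test can be sketched in a few lines. Here the reference CDF is a zero-mean Laplacian, which is the shape-parameter-1 special case of the Generalized Gaussian, chosen purely as an assumed example:

```python
import math

def laplace_cdf(x, b=1.0):
    """CDF of a zero-mean Laplacian (Generalized Gaussian with shape 1)."""
    return 0.5 * math.exp(x / b) if x < 0 else 1.0 - 0.5 * math.exp(-x / b)

def ks_statistic(samples, cdf):
    """Kolmogorov-Smirnov statistic: the maximum gap between the
    empirical CDF of the samples and the reference CDF."""
    xs = sorted(samples)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        # check the gap on both sides of the empirical-CDF step at x
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d
```

The candidate distribution whose fitted CDF minimizes this statistic over the observed LOT/LBT coefficients would be selected as the best-fitting model.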
Abstract: TTV is a non-enveloped, circular, single-stranded DNA virus with a diameter of 30-32 nm that was first described in 1997 in Japan. TTV has been detected in various populations without proven pathology, including blood donors and patients with chronic HBV and HCV hepatitis. The aim of this study was to determine the prevalence of TTV DNA in Iranian patients with chronic hepatitis B and C. Viral TTV DNA was studied in 442 samples (202 with HBV, 138 with HCV and 102 controls) collected from the southwest of Iran. All extracted serum DNA was amplified with TTV ORF1 gene-specific primers using the semi-nested PCR technique. TTV DNA was detected in the serum of 8.9% and 10.8% of patients with chronic hepatitis B and C, respectively. The prevalence of TTV DNA in the serum of the 102 controls was 2.9%. The results showed a significant association of TTV with HBV and HCV in patients using the t-test (P
Abstract: In this paper, a novel multipurpose audio watermarking algorithm is proposed based on Vector Quantization (VQ) in the Discrete Cosine Transform (DCT) domain, using codeword labeling and an index-bit constrained method. This algorithm can fulfill the requirements of both copyright protection and content integrity authentication at the same time for multimedia artworks. The robust watermark is embedded in the middle-frequency coefficients of the DCT transform during the labeled-codeword vector quantization procedure. The fragile watermark is embedded into the indices of the high-frequency coefficients of the DCT transform by using the constrained index vector quantization method for the purpose of integrity authentication of the original audio signals. Both the robust and the fragile watermarks can be extracted without the original audio signals, and the simulation results show that our algorithm is effective with regard to the transparency, robustness and authentication requirements.
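The codeword-labeling idea can be sketched with scalar codewords: the codebook is split into two labeled halves, and embedding a bit restricts quantization to codewords carrying that label (a toy sketch; the paper applies this to DCT-domain vectors of audio frames, and the codebook values below are invented for illustration):

```python
# Toy labeled codebook: scalar codewords with alternating 0/1 labels.
CODEBOOK = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]
LABEL = [i % 2 for i in range(len(CODEBOOK))]

def embed(value, bit):
    """Quantize `value` using only codewords whose label equals `bit`,
    returning the chosen codeword index."""
    candidates = [i for i, lab in enumerate(LABEL) if lab == bit]
    return min(candidates, key=lambda i: abs(CODEBOOK[i] - value))

def extract(index):
    """The hidden bit is simply the label of the chosen codeword."""
    return LABEL[index]
```

Because the label is recoverable from the index alone, extraction needs no original signal, matching the blind extraction property stated above; the restriction to half the codebook is the (small) fidelity price paid for the hidden bit.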
Abstract: Enterprise Architecture (EA) is a framework for the description, coordination and alignment of all activities across an organization in order to achieve strategic goals using ICT enablers. A number of EA-compatible frameworks have been developed. In this paper, we mainly focus on the Federal Enterprise Architecture Framework (FEAF), since its reference models are plentiful. Among these models we are interested here in its business reference model (BRM). The test process is one important subject of an EA project which is somewhat overlooked. This lack of attention may cause drawbacks or even failure of an enterprise architecture project. To address this issue, we intend to use the International Software Testing Qualifications Board (ISTQB) framework and standard test suites to present a method to improve the EA testing process. The main challenge is how to communicate between the concepts of EA and ISTQB. In this paper, we propose a method for integrating these concepts.
Abstract: In modern society, electricity is vital to our health, safety, comfort and well-being. While our daily use of electricity is often taken for granted, public concern has arisen about potential adverse health effects from the electric and magnetic (electromagnetic) fields (EMFs) produced by our use of electricity. This paper aims to compare the measured magnetic field values with simulated models for indoor medium-to-low-voltage (MV/LV) distribution substations. To calculate the magnetic flux density in the substations, the interactive software SUBCALC is used, which is based on a closed-form solution of the Biot-Savart law with a 3D conductor model. The agreement between the measured values and the simulated models was acceptable. However, there were some discrepancies, as expected, which may be due to current variation during the measurements.
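The closed-form Biot-Savart solution for a straight finite conductor that underlies such tools can be sketched as follows (an illustrative pure-Python version, not SUBCALC's implementation):

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability (T*m/A)

def sub(p, q): return [p[i] - q[i] for i in range(3)]
def dot(p, q): return sum(p[i] * q[i] for i in range(3))
def norm(p): return math.sqrt(dot(p, p))

def b_segment(a_pt, b_pt, current, p):
    """|B| (tesla) at point p due to a straight conductor segment from
    a_pt to b_pt, using the closed-form Biot-Savart result
    B = mu0*I/(4*pi*d) * (sin(theta2) - sin(theta1))."""
    u = sub(b_pt, a_pt)
    lu = norm(u)
    u = [c / lu for c in u]                      # unit vector along the conductor
    a = sub(p, a_pt)
    b = sub(p, b_pt)
    # perpendicular distance from p to the conductor line
    d = math.sqrt(max(dot(a, a) - dot(a, u) ** 2, 0.0))
    return MU0 * current / (4 * math.pi * d) * (dot(a, u) / norm(a) - dot(b, u) / norm(b))
```

As a sanity check, a very long segment carrying 100 A reproduces the infinite-wire value μ₀I/(2πd) = 2×10⁻⁵ T at 1 m; a 3D conductor model sums such contributions over all substation conductor segments.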
Abstract: The use of synthetic retardants in polymer-insulated cables is not uncommon in high-voltage engineering for studying the electrical treeing phenomenon. However, few studies on organic materials for the same investigation have been carried out. This paper describes a study of the effects of Oil Palm Empty Fruit Bunch (OPEFB) microfiller on tree initiation and propagation in silicone rubber with different weight percentages (wt%) of filler relative to the bulk insulation material. The weight percentages used were 0 wt% and 1 wt%, respectively. It was found that OPEFB retards the propagation of electrical treeing. In the tree inception study, the addition of 1 wt% OPEFB increased the tree inception voltage of the silicone rubber. OPEFB is therefore a potential retardant for the initiation and growth of electrical treeing in polymeric materials for high-voltage applications. However, more studies on the effects of the physical and electrical properties of OPEFB as a tree-retardant material are required.
Abstract: Protein-protein interactions (PPI) play a crucial role in many biological processes such as cell signalling, transcription, translation, replication, signal transduction, and drug targeting. Structural information about protein-protein interactions is essential for understanding the molecular mechanisms of these processes. Structures of protein-protein complexes are still difficult to obtain by biophysical methods such as NMR and X-ray crystallography, and therefore protein-protein docking computation is considered an important approach for understanding protein-protein interactions. However, reliable prediction of protein-protein complexes is still under way. In the past decades, several grid-based docking algorithms based on the Katchalski-Katzir scoring scheme were developed, e.g., FTDock, ZDOCK, HADDOCK, RosettaDock, and HEX. However, the success rate of protein-protein docking prediction is still far from ideal. In this work, we first propose a more practical measure for evaluating the success of protein-protein docking predictions: the rate of first success (RFS), which is similar to the concept of mean first passage time (MFPT). Accordingly, we have assessed the ZDOCK bound and unbound benchmarks 2.0 and 3.0. We also created a new benchmark set for protein-protein docking predictions, in which the complexes have experimentally determined binding affinity data. We performed free energy calculations based on the solution of the non-linear Poisson-Boltzmann equation (nlPBE) to improve the binding mode prediction. We used the well-studied barnase-barstar system to validate the parameters for the free energy calculations. In addition, the nlPBE-based free energy calculations were conducted for the cases badly predicted by ZDOCK and ZRANK. We found that direct molecular mechanics energetics cannot be used to discriminate the native binding pose from the decoys. Our results indicate that nlPBE-based calculations appear to be one of the promising approaches for improving the success rate of binding pose predictions.
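An RFS-style measure can be sketched as the rank of the first near-native pose in each case's ranked prediction list, aggregated over a benchmark (a hypothetical sketch; the paper's exact definition of RFS may differ from this):

```python
def first_success_rank(ranked_hits):
    """1-based rank of the first near-native pose in a ranked prediction
    list (booleans, best-scored first), or None if no hit is present."""
    for rank, is_hit in enumerate(ranked_hits, start=1):
        if is_hit:
            return rank
    return None

def rate_of_first_success(benchmark, top_n):
    """Fraction of benchmark cases whose first hit falls within the
    top_n ranked predictions."""
    ranks = [first_success_rank(case) for case in benchmark]
    ok = sum(1 for r in ranks if r is not None and r <= top_n)
    return ok / len(benchmark)
```

Unlike a plain top-N success rate, tracking the rank of the *first* success rewards scoring functions that place a near-native pose early in the list, which is what re-ranking methods such as the nlPBE-based calculations aim to improve.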
Abstract: The ability of pomelo peel, a natural biosorbent, to remove Cd(II) ions from aqueous solution by biosorption was investigated. The experiments were carried out by the batch method at 25 °C. The influence of solution pH, initial cadmium ion concentration and contact time was evaluated. Cadmium ion removal increased significantly as the pH of the solution increased from pH 1 to pH 5, reaching a maximum at pH 5. The equilibrium process was described well by the Langmuir isotherm model, with a maximum biosorption capacity of 21.83 mg/g. The biosorption was relatively quick (approx. 20 min), and its kinetics followed a pseudo-second-order model. The results showed that pomelo peel was effective as a biosorbent for removing cadmium ions from aqueous solution. It is a low-cost material that shows potential for application in wastewater technology for the remediation of heavy metal contamination.
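The Langmuir model used to describe the equilibrium can be evaluated directly; a sketch using the reported capacity (the affinity constant `k_l` below is an assumed illustrative value, not one reported in the study):

```python
def langmuir_q(c_eq, q_max=21.83, k_l=0.1):
    """Langmuir isotherm q = q_max * K * C / (1 + K * C).

    q_max = 21.83 mg/g is the maximum biosorption capacity reported for
    pomelo peel; k_l (L/mg) is an assumed illustrative affinity constant.
    Returns the equilibrium uptake q (mg/g) at equilibrium concentration
    c_eq (mg/L).
    """
    return q_max * k_l * c_eq / (1.0 + k_l * c_eq)
```

As the equilibrium concentration grows, the uptake saturates toward q_max, which is the monolayer-coverage assumption behind the Langmuir model.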
Abstract: The issue of classifying objects into one of several predefined groups when the measured variables are a mixture of different types of variables has been of interest to statisticians for many years. Some methods for dealing with this situation have been introduced, including parametric, semi-parametric and non-parametric approaches. This paper discusses the problem of classifying data when the number of measured mixed variables is larger than the sample size. A proposed approach that integrates a dimensionality reduction technique via principal component analysis with a discriminant function based on the location model is discussed. The study aims to offer practitioners another potential tool for classification problems in which the observed variables are mixed and too numerous.
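The reduce-then-discriminate pipeline can be sketched on continuous two-dimensional data: compute the first principal component in closed form from the 2×2 covariance matrix, project, and classify by the nearest class centroid in the score space (illustrative only; the location-model discriminant for mixed variables is not reproduced here, and a simple centroid rule stands in for it):

```python
import math

def first_pc_2d(points):
    """First principal component of 2-D data, via the closed-form
    largest eigenvalue/eigenvector of the 2x2 covariance matrix."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    lam = 0.5 * (sxx + syy + math.sqrt((sxx - syy) ** 2 + 4 * sxy ** 2))
    if abs(sxy) > 1e-12:
        v = (lam - syy, sxy)                 # eigenvector for eigenvalue lam
    else:
        v = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    h = math.hypot(*v)
    return (v[0] / h, v[1] / h), (mx, my)

def score(p, pc, mean):
    """Project a point onto the first principal component."""
    return (p[0] - mean[0]) * pc[0] + (p[1] - mean[1]) * pc[1]

def classify(p, data, labels):
    """Nearest class centroid in the 1-D principal-component score space."""
    pc, mean = first_pc_2d(data)
    cents = {}
    for q, lab in zip(data, labels):
        cents.setdefault(lab, []).append(score(q, pc, mean))
    cents = {lab: sum(s) / len(s) for lab, s in cents.items()}
    sp = score(p, pc, mean)
    return min(cents, key=lambda lab: abs(cents[lab] - sp))
```

In the high-dimensional mixed-variable setting of the paper, the same shape of pipeline applies, with PCA shrinking the variable count below the sample size before the location-model discriminant is fitted.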