Abstract: The world economic crises and budget constraints
have caused authorities, especially those in developing countries, to
rationalize water quality monitoring activities. Rationalization
consists of reducing the number of monitoring sites, the number of
samples, and/or the number of water quality variables measured. The
reduction in water quality variables is usually based on correlation. If
two variables exhibit high correlation, it is an indication that some of
the information produced may be redundant. Consequently, one
variable can be discontinued, and the other continues to be measured.
Later, the ordinary least squares (OLS) regression technique is
employed to reconstitute information about the discontinued variable
by using the continuously measured one as an explanatory variable. In
this paper, two record extension techniques are employed to
reconstitute information about discontinued water quality variables,
the OLS and the Line of Organic Correlation (LOC). An empirical
experiment is conducted using water quality records from the Nile
Delta water quality monitoring network in Egypt. The record
extension techniques are compared for their ability to predict
different statistical parameters of the discontinued variables. Results
show that the OLS is better at estimating individual water quality
records. However, results indicate an underestimation of the variance
in the extended records. The LOC technique is superior in preserving
characteristics of the entire distribution and avoids underestimation
of the variance. It is concluded from this study that the OLS can be
used for the substitution of missing values, while LOC is preferable
for inferring statements about the probability distribution.
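The contrast between the two techniques comes down to the slope: OLS uses r·sy/sx, which shrinks the variance of the extended record by a factor of r², while LOC uses sign(r)·sy/sx, which preserves it. The sketch below illustrates this on synthetic data; the variable roles and values are hypothetical, not the Nile Delta records.

```python
import numpy as np

def ols_fit(x, y):
    """Ordinary least squares: slope = r * sy/sx."""
    r = np.corrcoef(x, y)[0, 1]
    slope = r * y.std() / x.std()
    return slope, y.mean() - slope * x.mean()

def loc_fit(x, y):
    """Line of organic correlation: slope = sign(r) * sy/sx,
    which preserves the variance of y in the extended record."""
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * y.std() / x.std()
    return slope, y.mean() - slope * x.mean()

rng = np.random.default_rng(0)
x = rng.normal(10, 2, 200)             # continued variable (illustrative)
y = 3 * x + rng.normal(0, 4, 200)      # discontinued variable, noisy relation
for fit in (ols_fit, loc_fit):
    b, a = fit(x[:100], y[:100])       # calibrate on the overlap period
    y_hat = a + b * x[100:]            # reconstitute the missing record
    print(fit.__name__, "variance ratio:", y_hat.var() / y[100:].var())
```

Running this shows the OLS variance ratio near r² (well below 1) and the LOC ratio near 1, matching the conclusion that LOC is preferable for distributional statements.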
Abstract: The RR interval series is non-stationary and unevenly
spaced in time. Estimating its power spectral density (PSD) with
traditional techniques such as the FFT requires resampling at uniform
intervals. Researchers have used different interpolation techniques as
resampling methods, but all of them introduce a low-pass filtering
effect into the power spectrum. The Lomb transform is a means of
obtaining PSD estimates directly from an irregularly sampled RR
interval series, thus avoiding resampling. In this work, the superiority
of the Lomb transform over the FFT-based approach, applied after
linear and cubic-spline interpolation resampling, is established in
terms of the reproduction of exact frequency locations as well as the
relative magnitudes of each spectral component.
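A minimal sketch of the two pipelines being compared, using scipy; the simulated RR series and its modulation frequency are illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy.signal import lombscargle
from scipy.interpolate import CubicSpline

# Simulated RR series: intervals modulated at roughly 0.125 Hz.
rng = np.random.default_rng(1)
n = 300
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * np.arange(n)) \
       + 0.01 * rng.standard_normal(n)
t = np.cumsum(rr)                      # beat times (s): unevenly spaced

# Lomb periodogram: PSD directly from the uneven samples, no resampling.
freqs = np.linspace(0.01, 0.5, 500)    # Hz, covers the LF/HF bands of HRV
pgram = lombscargle(t, rr - rr.mean(), 2 * np.pi * freqs)

# FFT approach for comparison: cubic-spline resampling at 4 Hz first;
# the interpolation acts as a low-pass filter on the spectrum.
fs = 4.0
tu = np.arange(t[0], t[-1], 1 / fs)
rru = CubicSpline(t, rr)(tu)
psd_fft = np.abs(np.fft.rfft(rru - rru.mean()))**2 / len(tu)
fft_freqs = np.fft.rfftfreq(len(tu), 1 / fs)
```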
Abstract: Electrical Discharge Machine (EDM) is especially
used for the manufacturing of 3-D complex geometry and hard
material parts that are extremely difficult-to-machine by conventional
machining processes. In this paper, the authors review the research
work carried out in the development of die-sinking EDM over the past
decades for the improvement of machining characteristics such as
Material Removal Rate, Surface Roughness, and Tool Wear Ratio. In
this review, the various techniques reported by EDM researchers for
improving the machining characteristics are categorized as process
parameter optimization, the multi-spark technique, powder-mixed
EDM, servo control systems, and pulse discrimination. Finally, a
flexible machine controller is suggested for die-sinking EDM to
enhance the machining characteristics and to achieve high-level
automation. Die-sinking EDM can thus be integrated into a Computer
Integrated Manufacturing environment, as required by agile
manufacturing systems.
Abstract: This paper presents reliability evaluation techniques
which are applied in distribution system planning studies and
operation. Reliability of distribution systems is an important issue in
power engineering for both utilities and customers, and a key issue in
the design and operation of electric power distribution systems and
their loads. Reliability evaluation of distribution systems has been the
subject of many recent papers, and both the modeling and the
evaluation techniques have improved considerably.
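The abstract does not name specific indices, but distribution reliability evaluation is conventionally reported through customer-based indices such as SAIFI and SAIDI. As a minimal illustration only, the sketch below computes them from hypothetical outage records; all numbers are assumptions.

```python
# Hypothetical outage records for one feeder: (customers_interrupted, hours).
outages = [(120, 1.5), (60, 0.5), (300, 2.0)]
total_customers = 1000   # customers served by the feeder (assumed)

# SAIFI: average number of interruptions per customer served per year.
saifi = sum(c for c, _ in outages) / total_customers
# SAIDI: average interruption duration per customer served per year.
saidi = sum(c * h for c, h in outages) / total_customers

print(f"SAIFI = {saifi:.2f} interruptions/customer, "
      f"SAIDI = {saidi:.2f} h/customer")
```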
Abstract: In the 3D-wavelet video coding framework, temporal
filtering is performed along motion trajectories using Motion
Compensated Temporal Filtering (MCTF); a computationally efficient
motion estimation technique is therefore essential for MCTF. In this
paper, a predictive technique is proposed to reduce the computational
complexity of the MCTF framework by exploiting the high correlation
among the frames in a Group of Pictures (GOP). The proposed
technique applies the coarse and fine searches of any fast block-based
motion estimation algorithm only to the first pair of frames in a GOP.
The generated motion vectors are then supplied to the subsequent
frame pairs, and even to subsequent temporal levels, where only a fine
search is carried out around the predicted motion vectors. The coarse
search is thus skipped for all motion estimations in a GOP except
those of the first pair of frames. The technique has been tested with
different fast block-based motion estimation algorithms over standard
test sequences using MC-EZBC, a state-of-the-art scalable video
coder. The simulation results reveal a substantial reduction (20.75% to
38.24%) in the number of search points during motion estimation,
without compromising the quality of the reconstructed video
compared to non-predictive techniques. Since the motion vectors of
all frame pairs in a GOP except the first lie within ±1 of the motion
vectors of the previous pair, the number of bits required for the
motion vectors is also reduced by 50%.
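A minimal sketch of the fine-search refinement that replaces the coarse search for all but the first frame pair; the SAD cost and all names (mv_pred, refine_mv) are illustrative, not the paper's implementation.

```python
import numpy as np

def sad(block, ref):
    """Sum of absolute differences between two equally sized blocks."""
    return np.abs(block.astype(int) - ref.astype(int)).sum()

def refine_mv(cur, ref, x, y, bs, mv_pred, radius=1):
    """Fine search only: test a small window (default +/-1 pixel) around
    the motion vector predicted from the previous frame pair, skipping
    the coarse search stage entirely."""
    best, best_mv = None, mv_pred
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            mx, my = mv_pred[0] + dx, mv_pred[1] + dy
            rx, ry = x + mx, y + my
            if 0 <= rx <= ref.shape[1] - bs and 0 <= ry <= ref.shape[0] - bs:
                cost = sad(cur[y:y+bs, x:x+bs], ref[ry:ry+bs, rx:rx+bs])
                if best is None or cost < best:
                    best, best_mv = cost, (mx, my)
    return best_mv
```

Under this scheme, only the first frame pair of a GOP pays for a coarse-plus-fine fast search; every later pair runs just this refinement, and because the refined vector differs from its prediction by at most ±1, it can be coded with very few bits.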
Abstract: One important objective in Precision Agriculture is to minimize the volume of herbicides applied to fields through the use of site-specific weed management systems. In order to reach this goal, two major factors need to be considered: 1) the similar spectral signature, shape, and texture of weeds and crops; 2) the irregular distribution of weeds within the crop field. This paper outlines an automatic computer vision system for the detection and differential spraying of Avena sterilis, a noxious weed growing in cereal crops. The proposed system involves two processes: image segmentation and decision making. Image segmentation combines suitable basic image processing techniques to extract cells from the image as the low-level units. Each cell is described by two area-based attributes measuring the relations between the crops and the weeds. From these attributes, a hybrid decision-making approach determines whether or not a cell must be sprayed. The hybrid approach uses the Support Vector Machines and Fuzzy k-Means methods, combined through fuzzy aggregation theory; this constitutes the main contribution of this paper. The performance of the method is compared against other available strategies.
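A minimal sketch of the hybrid decision step, assuming each cell is described by two area-based attributes. The aggregation operator here (an arithmetic mean of the two memberships) merely stands in for the paper's fuzzy aggregation, and all data and thresholds are illustrative.

```python
import numpy as np
from sklearn.svm import SVC

def fuzzy_memberships(X, centers, m=2.0):
    """Fuzzy k-means membership of each sample in each cluster:
    u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-9
    return 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)),
                        axis=2)

# X: two area-based attributes per cell; y: 1 = spray, 0 = do not spray.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.2, 0.1, (50, 2)), rng.normal(0.7, 0.1, (50, 2))])
y = np.r_[np.zeros(50), np.ones(50)]

svm = SVC(probability=True).fit(X, y)
centers = np.array([X[y == 0].mean(0), X[y == 1].mean(0)])

p_svm = svm.predict_proba(X)[:, 1]           # SVM spray probability
p_fkm = fuzzy_memberships(X, centers)[:, 1]  # fuzzy k-means spray membership
spray = 0.5 * (p_svm + p_fkm) > 0.5          # aggregated decision per cell
```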
Abstract: In power quality analysis, the non-stationary nature of
voltage distortions requires precise and powerful analytical
techniques. Time-frequency representations (TFR) provide a powerful
method for the identification of non-stationary signals. This paper
presents a comparative study of two techniques for the analysis and
visualization of voltage distortions with time-varying amplitudes: the
Discrete Wavelet Transform (DWT) and the S-Transform. Several
power quality problems are analyzed using both transforms, showing
clearly the advantage of the S-Transform in detecting, localizing, and
classifying power quality problems.
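A compact frequency-domain implementation of the discrete S-transform (Stockwell transform), which is what gives the method its joint time-frequency localization; the voltage-sag test signal is an illustrative assumption. The DWT side of such a comparison could be computed with pywt.wavedec.

```python
import numpy as np

def stockwell(x):
    """Discrete S-transform via the frequency-domain formulation:
    S[n] = IFFT_m{ X[m+n] * exp(-2*pi^2*m^2/n^2) } for n > 0."""
    N = len(x)
    X = np.fft.fft(x)
    m = np.fft.fftfreq(N) * N            # wrapped frequency indices
    S = np.zeros((N // 2 + 1, N), dtype=complex)
    S[0] = x.mean()                      # zero-frequency row
    for n in range(1, N // 2 + 1):
        gauss = np.exp(-2 * np.pi**2 * m**2 / n**2)
        S[n] = np.fft.ifft(np.roll(X, -n) * gauss)
    return S  # rows: frequency bins, columns: time samples

# Voltage sag with time-varying amplitude (illustrative test signal).
fs, f0 = 3200, 50
t = np.arange(0, 0.4, 1 / fs)
amp = np.where((t > 0.1) & (t < 0.25), 0.6, 1.0)   # 40% sag window
v = amp * np.sin(2 * np.pi * f0 * t)
S = stockwell(v)   # |S| localizes the sag in both time and frequency
```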
Abstract: One of the main research directions in CAD/CAM
machining is the reduction of machining time.
Feedrate scheduling is one of the advanced techniques that allows
the uncut chip area, and consequently the main cutting force, to be
kept constant. There are two main approaches to feedrate
optimization. The first consists of monitoring the cutting force, which
requires complex force measurement equipment, and then setting the
feedrate according to the cutting force variation. The second is to
optimize the feedrate by keeping the material removal rate constant
with respect to the cutting conditions.
This paper proposes a new approach using an extended database
that replaces the system model.
The feedrate schedule is determined based on the identification of
the reconfigurable machine tool, with the feed value determined from
the uncut chip section area, the contact length between tool and
blank, and the geometrical roughness.
The first stage consists of monitoring the blank and the tool to
determine their actual profiles. The next stage is the determination of
the programmed tool path that allows the target profile of the piece to
be obtained.
The graphic representation environment models the tool and blank
regions, and then the tool model is positioned relative to the blank
model according to the programmed tool path. For each of these
positions, the geometrical roughness value, the uncut chip area, and
the contact length between tool and blank are calculated. Each of
these parameters is compared with its admissible value, and the feed
value is established according to the result.
This approach has the following advantages: the cutting force can
be predicted for complex cutting processes; the real cutting profile,
which deviates from the theoretical profile, is taken into account; the
blank-tool contact length can be limited; and the programmed tool
path can be corrected so that the target profile is obtained.
Applying this method yields data sets that allow feedrate
scheduling such that the uncut chip area, and as a result the cutting
force, remains constant, which allows more efficient use of the
machine tool and a reduction of the machining time.
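A schematic sketch of the feed-setting loop described above. The geometry functions (chip_area, contact_length, roughness) and admissible limits are placeholders for the values the graphic environment would compute, and the sketch assumes the chip area is roughly proportional to the feed.

```python
def schedule_feed(positions, f_prog, A_adm, L_adm, Rz_adm,
                  chip_area, contact_length, roughness):
    """For each programmed tool position, scale the feed so that the
    uncut chip area (and hence the cutting force) stays at its admissible
    value, then enforce the contact-length and roughness limits."""
    feeds = []
    for p in positions:
        A = chip_area(p, f_prog)            # uncut chip section area at p
        # Assumes chip area roughly proportional to feed.
        f = f_prog * A_adm / A if A > 0 else f_prog
        if contact_length(p, f) > L_adm:    # limit blank-tool contact length
            f *= L_adm / contact_length(p, f)
        while roughness(p, f) > Rz_adm and f > 0.01:
            f *= 0.9                        # back off until roughness is met
        feeds.append(f)
    return feeds
```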
Abstract: This paper proposes a new technique for improving
the efficiency of software testing, which is based on a conventional
attempt to reduce test cases that have to be tested for any given
software. The approach utilizes the advantage of Regression Testing
where fewer test cases would lessen time consumption of the testing
as a whole. The technique also offers a means to perform test case
generation automatically. Compared to one of the techniques in the
literature where the tester has no option but to perform the test case
generation manually, the proposed technique provides a better
option. As for test case reduction, the technique uses simple algebraic
conditions to assign fixed values (maximum, minimum, and constant)
to variables. By doing this, the variable values are limited to a definite
range, resulting in fewer possible test cases to process. The technique
can also be applied to program loops and arrays.
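A minimal sketch of the reduction idea: instead of enumerating every value in each variable's range, only the fixed values (minimum, maximum, constant) are kept and combined. The variables and their bounds are hypothetical.

```python
from itertools import product

# Hypothetical input variables with their algebraic bounds.
variables = {
    "length": {"min": 1, "max": 100},   # range variable
    "width":  {"min": 5, "max": 50},    # range variable
    "mode":   {"const": 3},             # constant variable
}

def reduced_test_cases(variables):
    """Cartesian product over the fixed values only."""
    domains = [sorted(set(v.values())) for v in variables.values()]
    for combo in product(*domains):
        yield dict(zip(variables.keys(), combo))

cases = list(reduced_test_cases(variables))
print(len(cases), "test cases instead of", 100 * 46 * 1)   # 4 vs 4600
```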
Abstract: Skin color is an important visual cue for computer
vision systems involving human users. In this paper we combine skin
color and optical flow for detection and tracking of skin regions. We
apply these techniques to gesture recognition with encouraging
results. We propose a novel skin similarity measure and a novel
mechanism for grouping detected skin regions. The proposed
techniques work with any number of skin regions, making them
suitable for multiuser scenarios.
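The paper's similarity measure and grouping mechanism are not specified in the abstract; the sketch below shows only the generic combination of a skin-color mask with dense optical flow, using OpenCV, that such a pipeline builds on. The YCrCb thresholds and flow threshold are common illustrative values, not the paper's.

```python
import cv2
import numpy as np

def skin_mask(frame_bgr):
    """Threshold in YCrCb space, a common first-pass skin detector."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    return cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))

def moving_skin(prev_gray, gray, mask, flow_thresh=1.0):
    """Keep only skin pixels that are also moving, separating gesturing
    hands and faces from skin-colored background."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2)
    moving = (mag > flow_thresh).astype(np.uint8) * 255
    return cv2.bitwise_and(mask, moving)
```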
Abstract: The aim of this research is to evaluate surface
roughness and develop a multiple regression model for surface roughness as a function of cutting parameters during the turning of
flame hardened medium carbon steel with TiN-Al2O3-TiCN coated inserts. An experimental plan of work and signal-to-noise ratio (S/N)
were used to relate the influence of turning parameters to the
workpiece surface finish utilizing Taguchi methodology. The effects
of turning parameters were studied by using the analysis of variance (ANOVA) method. Evaluated parameters were feed, cutting speed,
and depth of cut. It was found that the most significant interaction among the considered turning parameters was between depth of cut and feed. The average surface roughness (Ra) obtained with the TiN-Al2O3-TiCN coated inserts was about 2.44 μm, with a minimum value of 0.74 μm. In addition, the regression model was able to predict surface roughness values within reasonable limits of the experimental values.
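A minimal sketch of the two computations named in the abstract: a multiple regression of Ra on feed, cutting speed, and depth of cut, and the smaller-the-better S/N ratio used in Taguchi analysis. The sample rows are illustrative, not the paper's measurements.

```python
import numpy as np

# Illustrative rows: (feed mm/rev, speed m/min, depth mm, measured Ra um).
data = np.array([
    [0.10, 100, 0.5, 0.74],
    [0.20, 100, 1.0, 1.90],
    [0.10, 150, 1.0, 1.10],
    [0.20, 150, 0.5, 1.60],
    [0.15, 125, 0.8, 1.40],
])
X = np.c_[np.ones(len(data)), data[:, :3]]     # design matrix with intercept
ra = data[:, 3]

# Ra = b0 + b1*feed + b2*speed + b3*depth (least-squares fit).
coef, *_ = np.linalg.lstsq(X, ra, rcond=None)
print("regression coefficients:", coef)

def sn_smaller_is_better(y):
    """Taguchi smaller-the-better S/N ratio for a set of replicates."""
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(y**2))

print("S/N for Ra replicates:", sn_smaller_is_better([0.74, 0.80, 0.78]))
```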
Abstract: In this paper, the RSA encryption algorithm and its
hardware implementation in Xilinx's Virtex Field Programmable Gate
Arrays (FPGA) are analyzed. The issues of scalability, flexible
performance, and silicon efficiency for the hardware acceleration of
public key cryptosystems are explored in the present work.
Using techniques based on the interleaved math for exponentiation,
the proposed RSA calculation architecture is compared to existing
FPGA-based solutions for speed, FPGA utilization, and scalability.
The paper covers the RSA encryption algorithm, interleaved
multiplication, the Miller-Rabin algorithm for primality testing, the
extended Euclidean algorithm, basic FPGA technology, and the
implementation details of the proposed RSA calculation architecture.
The performance of several alternative hardware architectures is
discussed and compared. Finally, conclusions are drawn, highlighting
the advantages of a fully flexible and parameterized design.
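A bit-level Python model of interleaved modular multiplication, the operation such architectures are built around: the partial product is reduced modulo n after every shifted addition, so intermediate values never grow beyond a few times n, which keeps the hardware datapath narrow. Combined with square-and-multiply it performs the RSA modular exponentiation; this is a behavioral sketch, not the paper's design.

```python
def interleaved_modmul(a, b, n, width):
    """(a*b) mod n, reducing after every shifted addition, as a
    shift-add hardware datapath would (requires a, b < n)."""
    p = 0
    for i in range(width - 1, -1, -1):
        p <<= 1                       # shift partial product
        if (b >> i) & 1:
            p += a                    # conditionally add multiplicand
        if p >= n: p -= n             # at most two subtractions
        if p >= n: p -= n             # restore p < n
    return p

def modexp(base, exp, n, width):
    """Left-to-right square-and-multiply on the interleaved multiplier."""
    r = 1
    for i in range(width - 1, -1, -1):
        r = interleaved_modmul(r, r, n, width)          # square
        if (exp >> i) & 1:
            r = interleaved_modmul(r, base, n, width)   # multiply
    return r

# Toy RSA check (p=61, q=53): n=3233, e=17, d=2753.
n, e, d = 3233, 17, 2753
c = modexp(65, e, n, n.bit_length())
assert modexp(c, d, n, n.bit_length()) == 65
```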
Abstract: In this paper a hybrid technique of Genetic Algorithm
and Simulated Annealing (HGASA) is applied for Fractal Image
Compression (FIC). This hybrid evolutionary algorithm is used to
reduce the search complexity of matching between range blocks and
domain blocks. The concept of Simulated Annealing (SA) is
incorporated into the Genetic Algorithm (GA) in order to avoid
premature convergence of the strings. FIC is a spatial-domain image
compression technique, but its main drawback is the long
computational time caused by the global search. The HGASA
technique is proposed to improve the computational time while
retaining acceptable quality of the decoded image. Experimental
results show that the proposed HGASA is a better method than GA in
terms of PSNR for fractal image compression.
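A generic skeleton of a GA with SA-style acceptance, written against an abstract fitness function; in the FIC setting a string would encode a candidate domain block and transform for a range block, and fitness would be the matching error. This is a sketch of the hybrid idea, not the paper's exact operators or schedule.

```python
import math
import random

def hgasa(init_pop, fitness, mutate, crossover,
          generations=100, t0=1.0, cooling=0.95):
    """GA whose offspring are admitted by a simulated-annealing
    criterion, so worse strings occasionally survive early on and
    premature convergence is avoided (fitness is minimized)."""
    pop = list(init_pop)
    temp = t0
    for _ in range(generations):
        nxt = []
        for _ in range(len(pop)):
            p1, p2 = random.sample(pop, 2)
            child = mutate(crossover(p1, p2))
            worse = max(p1, p2, key=fitness)     # weaker parent
            delta = fitness(child) - fitness(worse)
            # SA acceptance: take improvements, sometimes accept worse.
            if delta < 0 or random.random() < math.exp(-delta / temp):
                nxt.append(child)
            else:
                nxt.append(min(p1, p2, key=fitness))
        pop = nxt
        temp *= cooling                          # cool the schedule
    return min(pop, key=fitness)
```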
Abstract: This study compares family communication patterns in association with family socio-cultural status, especially marriage and family pattern, and couples' socio-economic status, between Muslim and Santal communities in rural Bangladesh. A total of 288 couples, 145 from the Muslim community and 143 from the Santal community, were randomly selected through a cluster sampling procedure from Kalna village, situated in Tanore Upazila of Rajshahi district of Bangladesh, where both communities dwell as neighbors. Data were collected from the selected samples by the interview method with a semi-structured questionnaire schedule. The responses were analyzed by Pearson's chi-square test and bivariate correlation techniques. The results of Pearson's chi-square test revealed that family communication patterns (χ² = 25.90, df = 2, p < .05) were significantly different between the Muslim and Santal communities. In addition, Spearman's bivariate correlation coefficients suggested that, among the exogenous factors, family type (rs = .135, p < .05) was significantly correlated with family communication patterns.
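A chi-square comparison of this kind can be reproduced for any such cross-tabulation with scipy; the contingency table below is hypothetical, chosen only to mirror the reported structure (two communities, df = 2).

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x3 cross-tabulation: community x communication pattern.
table = [[40, 65, 40],    # Muslim couples per pattern category
         [70, 45, 28]]    # Santal couples per pattern category
chi2, p, df, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {df}, p = {p:.4f}")
```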
Abstract: This paper presents preliminary work on the preparation
and physicochemical characterization of nanocomposite MFI-alumina
structures based on alumina hollow fibres. The fibres are
manufactured by a wet spinning process: α-alumina particles are
dispersed in a solution of polysulfone in NMP, the resulting slurry is
pressed through the annular gap of a spinneret into a precipitation
bath, and the resulting green fibres are sintered. The mechanical
strength of the alumina hollow fibres is determined by a three-point
bending test, while the pore size is characterized by bubble-point
testing. The bending strength is in the range of 110 MPa, while the
average pore size is 450 nm for an internal diameter of 1 mm and an
external diameter of 1.7 mm. Various techniques were used for the
physicochemical characterization of the MFI-ceramic hollow fibre
membranes: nitrogen adsorption, X-ray diffractometry, and scanning
electron microscopy (SEM) combined with energy dispersive X-ray
microanalysis (EDX). SEM and EDX were used to observe the
morphology of the hollow fibre membranes (thickness, infiltration
into the carrier, defects, homogeneity). No surface film was observed
by SEM and EDX analysis, as confirmed by the high-temperature
variation of N2 and CO2 gas permeances before cation exchange.
Local (SEM and EDX) and overall (ICP elemental analysis)
characterization was conducted on two exchanged samples to
determine the quantity and distribution of cesium cations across the
fibre cross-section of the zeolite between the cavities.
Abstract: Repeated additions of the bacterial inoculant through the
drip irrigation system increased the activity of nitrogen-fixing
bacteria in the root zone, compared to traditional manual inoculation,
raising the proportion of fixed nitrogen from 29% to 64%. The
efficiency of the added nitrogen fertilizer did not exceed 9.5%, and
even dropped to 4%, because the amount of fertilizer added did not
exceed 20 kg N/ha and a large amount of available nitrogen was
already present in the soil through fixation. The water use efficiency
of the irrigation system was between 2.08 and 2.26 kg/m3.
Abstract: In this paper, several techniques for the blind identification of moving average (MA) processes are presented. These methods utilize third- and fourth-order cumulants of the noisy observations of the system output. The system is driven by an independent and identically distributed (i.i.d.) non-Gaussian sequence that is not observed. Two nonlinear optimization algorithms, namely the Gradient Descent and Gauss-Newton algorithms, are presented. An algorithm based on the joint diagonalization of the fourth-order cumulant matrices (FOSI) is also considered, as well as an improved version of the classical C(q, 0, k) algorithm based on the choice of the best 1-D slice of fourth-order cumulants. To illustrate the effectiveness of these methods, various simulation examples are presented.
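The C(q, 0, k) method named above is fourth-order; as a compact illustration of the same closed-form idea, the sketch below implements the third-order Giannakis C(q, k) formula, b(k) = c3(q, k)/c3(q, 0) with b(0) = 1, from sample cumulants of the output. The MA(2) coefficients and noise levels are illustrative.

```python
import numpy as np

def c3(y, t1, t2):
    """Sample third-order cumulant c3(t1, t2) = E[y(n) y(n+t1) y(n+t2)]
    of a zero-mean sequence (t1, t2 >= 0)."""
    y = y - y.mean()
    n = len(y) - max(0, t1, t2)
    return np.mean(y[:n] * y[t1:t1 + n] * y[t2:t2 + n])

def giannakis_cq_k(y, q):
    """Blind MA(q) estimate via b(k) = c3(q, k) / c3(q, 0), b(0) = 1."""
    denom = c3(y, q, 0)
    return np.array([c3(y, q, k) / denom for k in range(q + 1)])

# MA(2) driven by i.i.d. non-Gaussian (skewed) noise, plus Gaussian
# observation noise, which does not bias third-order cumulants.
rng = np.random.default_rng(3)
b_true = np.array([1.0, 0.9, 0.385])
e = rng.exponential(1.0, 200_000) - 1.0         # skewed, zero-mean input
y = np.convolve(e, b_true, mode="valid") + 0.1 * rng.standard_normal(199_998)
print(giannakis_cq_k(y, q=2))                   # approx. b_true
```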
Abstract: The running logs of a process hold valuable information
about its executed activity behavior and the generated activity logic
structure. These informative logs can be extracted, analyzed, and
utilized to improve the efficiency of the process's execution and
conduct. One of the techniques used to accomplish such process
improvement is called process mining, and mining similar processes
is one such improvement task. Rather than directly mining similar
processes using a single comparison coefficient or a complicated
fitness function, this paper presents a simplified heuristic process
mining algorithm with two similarity comparisons that relatively
conform the activity logic sequences (traces) of the mined processes
to those of a normalized (regularized) one. Relative process
conformance determines which of the mined processes match the
required activity sequences and relationships, for the necessary and
sufficient application of the mined processes to process
improvements. The first similarity is defined by the relationships in
terms of the number of similar activity sequences existing in different
processes; the second expresses the degree of similar (identical)
activity sequences among the conforming processes. Since these two
similarities refer to typical behavior (activity sequences) occurring in
an entire process, common problems such as the inappropriateness of
absolute comparison and the inability to elicit intrinsic information,
which often appear in other process conformance techniques, can be
solved by the relative process comparison presented in this paper. To
demonstrate the potential of the proposed algorithm, a numerical
example is illustrated.
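One minimal interpretation of the two similarity comparisons, using activity bigrams as the "typical behavior"; the paper's exact definitions may differ, and the traces below are hypothetical.

```python
def bigrams(trace):
    """Consecutive activity pairs, a simple proxy for activity sequences."""
    return {(a, b) for a, b in zip(trace, trace[1:])}

def seq_similarity(trace, norm_trace):
    """Similarity 1: fraction of the normalized process's activity
    sequences (bigrams) that also occur in the mined trace."""
    ref = bigrams(norm_trace)
    return len(bigrams(trace) & ref) / len(ref)

def identity_degree(traces):
    """Similarity 2: degree of identical activity sequences shared
    among the conforming traces."""
    common = set.intersection(*(bigrams(t) for t in traces))
    union = set.union(*(bigrams(t) for t in traces))
    return len(common) / len(union)

normalized = ["register", "check", "approve", "archive"]
mined = [["register", "check", "approve", "archive"],
         ["register", "check", "reject", "archive"]]
conforming = [t for t in mined if seq_similarity(t, normalized) >= 0.6]
print([seq_similarity(t, normalized) for t in mined],
      identity_degree(conforming))
```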
Abstract: Natural Language Understanding (NLU) systems will not be widely deployed unless they are technically mature and cost-effective to develop. Cost-effective development hinges on the availability of tools and techniques enabling the rapid production of NLU applications with minimal human resources. Further, these tools and techniques should allow quick development of applications in a user-friendly way and should be easy to upgrade in order to continuously follow evolving technologies and standards. This paper presents a visual tool for the structuring and editing of dialog forms, the key element driving conversation in NLU applications based on IBM technology. The main focus is on the basic component used to describe Human-Machine interactions of that kind, the Dialogue Manager. In essence, a tool that enables the visual representation of the Dialogue Manager, mainly during the implementation phase, is described.
Abstract: This study describes the preparation of novel proton-
conducting membranes based on bacterial cellulose (BC) modified by
grafting of 2-acrylamido-2-methyl-1-propanesulfonic acid (AMPS)
through UV-induced graft polymerization. These AMPS-g-BC
membranes have been characterized by various techniques including
FTIR, SEM, and TGA, to confirm the successful grafting of AMPS on
BC and to determine their surface morphology and thermal stability,
respectively. The physical properties of the AMPS-g-BC membranes
have been assessed in terms of lambda value (λ), ion exchange
capacity (IEC), and proton conductivity. The relationship between the
degree of grafting and the AMPS concentration used for grafting has
been determined by the weight gain method. An optimum proton
conductivity of 2.89×10⁻² S cm⁻¹ and an IEC value of 1.79 mmol g⁻¹
were obtained when a 20 wt% AMPS concentration was used for
grafting (the corresponding membrane is denoted AMPS20-g-BC).