Abstract: Clustering unstructured text documents is an important problem in the data mining community, with applications such as document archive filtering, document organization, topic detection and subject tracing. In the real world, some already clustered documents may lose importance while new, more significant documents evolve. Most of the work done so far on clustering unstructured text documents overlooks this aspect. This paper addresses the issue by using a Fading Function. The unstructured text documents are clustered, and for each cluster a statistics structure called a Cluster Profile (CP) is maintained. The cluster profile incorporates the Fading Function, which keeps account of the time-dependent importance of the cluster. The work proposes a novel algorithm, the Clustering n-ary Merge Algorithm (CnMA), for unstructured text documents, which uses the Cluster Profile and the Fading Function. Experimental results illustrating the effectiveness of the proposed technique are also included.
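The abstract does not give the exact form of the Fading Function; a common choice in stream clustering, assumed here purely for illustration, is exponential decay, in which a cluster's weight halves every 1/λ time units. A minimal sketch of a cluster profile that fades this way:

```python
import math


def fading_weight(t_now, t_last_update, decay_lambda=0.1):
    """Exponential fading function: the weight halves every 1/decay_lambda
    time units since the profile was last touched (an assumed form)."""
    return 2.0 ** (-decay_lambda * (t_now - t_last_update))


class ClusterProfile:
    """Minimal statistics structure whose importance decays over time."""

    def __init__(self, created_at):
        self.weight = 1.0            # importance of the cluster
        self.last_update = created_at

    def fade_to(self, t_now, decay_lambda=0.1):
        # Apply the fading function for the elapsed interval.
        self.weight *= fading_weight(t_now, self.last_update, decay_lambda)
        self.last_update = t_now

    def absorb(self, t_now, decay_lambda=0.1):
        # Fade the old contribution, then count the newly arrived document.
        self.fade_to(t_now, decay_lambda)
        self.weight += 1.0
```

Under this scheme, a cluster that receives no new documents eventually fades below any fixed importance threshold and can be retired, which is the behavior the abstract motivates.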
Abstract: Horizontal wells have proven to be better producers because they can be extended for a long distance in the pay zone. Engineers have the technical means to forecast well productivity for a given horizontal length. However, experience has shown that the actual production rate is often significantly less than forecasted. It is a difficult, if not impossible, task to identify the real reason why a horizontal well is not producing what was forecasted. Often the source of the problem lies in the drilling of the horizontal section, such as permeability reduction in the pay zone due to mud invasion, or snaky well patterns created during drilling. Although drillers aim to drill a constant-inclination hole in the pay zone, the more frequent outcome is a sinusoidal wellbore trajectory. The two factors that play an important role in wellbore tortuosity are the inclination and the side force at the bit. A constant-inclination horizontal well can only be drilled if the bit face is maintained perpendicular to the longitudinal axis of the bottom hole assembly (BHA) while keeping the side force at the bit nil. This approach assumes that there is no formation force at the bit. Hence, an appropriate BHA can be designed if the bit side force and bit tilt are determined accurately. The Artificial Neural Network (ANN) is superior to existing analytical techniques for this purpose. In this study, neural networks have been employed as a general approximation tool for estimating the bit side forces. A number of samples are analyzed with the ANN for the bit side force parameters, and the results are compared with exact analysis. A Back Propagation Neural network (BPN) is used to approximate the bit side forces. The resulting low relative error on the test set indicates the usability of the BPN in this area.
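The abstract does not specify the BPN architecture or its inputs (in practice these would be BHA design parameters). As an illustration only, the following is a minimal one-hidden-layer backpropagation network for scalar regression, trained here on a toy target function rather than real drilling data:

```python
import math
import random


def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))


class TinyBPN:
    """One-hidden-layer backpropagation network for scalar regression
    (an illustrative sketch, not the paper's network)."""

    def __init__(self, hidden=8, seed=0):
        rng = random.Random(seed)
        self.w1 = [rng.uniform(-1, 1) for _ in range(hidden)]
        self.b1 = [0.0] * hidden
        self.w2 = [rng.uniform(-1, 1) for _ in range(hidden)]
        self.b2 = 0.0

    def forward(self, x):
        h = [sigmoid(w * x + b) for w, b in zip(self.w1, self.b1)]
        y = sum(w * hv for w, hv in zip(self.w2, h)) + self.b2
        return h, y

    def train(self, data, epochs=500, lr=0.05):
        # Stochastic gradient descent on squared error.
        for _ in range(epochs):
            for x, t in data:
                h, y = self.forward(x)
                err = y - t
                for i, hv in enumerate(h):
                    # Hidden-layer gradient via the chain rule (uses the
                    # pre-update output weight).
                    dh = err * self.w2[i] * hv * (1.0 - hv)
                    self.w2[i] -= lr * err * hv
                    self.w1[i] -= lr * dh * x
                    self.b1[i] -= lr * dh
                self.b2 -= lr * err

    def mse(self, data):
        return sum((self.forward(x)[1] - t) ** 2 for x, t in data) / len(data)
```

Training on sampled (input, side-force) pairs and checking the relative error on held-out samples mirrors the evaluation described in the abstract.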
Abstract: In this paper, an approach to reduce the computation steps required by fast neural networks for the searching process is presented. The principle of the divide-and-conquer strategy is applied through image decomposition. Each image is divided into small sub-images, and then each one is tested separately using a fast neural network. The operation of fast neural networks is based on applying cross correlation in the frequency domain between the input image and the weights of the hidden neurons. Compared to conventional and fast neural networks, experimental results show that a speed-up ratio is achieved when applying this technique to locate human faces automatically in cluttered scenes. Furthermore, faster face detection is obtained by using parallel processing techniques to test the resulting sub-images at the same time using the same number of fast neural networks. In contrast to using only fast neural networks, the speed-up ratio increases with the size of the input image when using fast neural networks together with image decomposition.
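The speed of such networks rests on the correlation theorem: cross-correlation in the spatial domain equals a pointwise product in the frequency domain. A small self-contained sketch of this identity in one dimension (using a naive O(N²) DFT for clarity; real implementations use an FFT, which is where the speed-up comes from):

```python
import cmath


def dft(x):
    """Naive discrete Fourier transform (illustrative; use an FFT in practice)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]


def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N)
                for k in range(N)) / N for n in range(N)]


def circular_cross_correlation(x, w):
    """Cross-correlation via the frequency domain:
    r = IDFT(DFT(x) * conj(DFT(w))) for real-valued x and w."""
    X, W = dft(x), dft(w)
    R = [a * b.conjugate() for a, b in zip(X, W)]
    return [c.real for c in idft(R)]


def direct_circular_cross_correlation(x, w):
    """Direct spatial-domain definition, for comparison."""
    N = len(x)
    return [sum(x[(n + m) % N] * w[m] for m in range(N)) for n in range(N)]
```

Both routines return the same values; in two dimensions, replacing the input signal with a sub-image and the template with the hidden-neuron weights gives exactly the frequency-domain testing the abstract describes.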
Abstract: This paper proposes a dual tree complex wavelet transform (DT-CWT) based directional interpolation scheme for noisy images. The problems of denoising and interpolation are modelled as estimating the noiseless and missing samples under the same framework of optimal estimation. Initially, the DT-CWT is used to decompose an input low-resolution (LR) noisy image into low- and high-frequency subbands. The high-frequency subband images are interpolated by linear minimum mean square error estimation (LMMSE) based interpolation, which preserves the edges of the interpolated images. For each noisy LR image sample, we compute multiple estimates of it along different directions and then fuse those directional estimates into a more accurate denoised LR image. The estimation parameters calculated in the denoising process can be readily reused to interpolate the missing samples. The inverse DT-CWT is applied to the denoised input and the interpolated high-frequency subband images to obtain the high-resolution image. Compared with conventional schemes that perform denoising and interpolation in tandem, the proposed DT-CWT based noisy image interpolation method can reduce many noise-caused interpolation artifacts and preserve image edge structures well. The visual and quantitative results show that the proposed technique outperforms many of the existing denoising and interpolation methods.
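The fusion of directional estimates can be illustrated by the standard LMMSE combination of independent unbiased estimates, which weights each estimate by its inverse variance. This is a sketch of the general principle only, not the paper's exact directional estimator:

```python
def lmmse_fuse(estimates, variances):
    """Fuse independent unbiased estimates of one sample by
    inverse-variance weighting (the LMMSE combination)."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    return sum(w * e for w, e in zip(weights, estimates)) / total
```

A low-variance (more reliable) directional estimate thus dominates the fused value, which is why edges estimated along their own direction are preserved.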
Abstract: Feature-based registration is an effective technique for clinical use, because it can greatly reduce computational costs. However, this technique, which estimates the transformation using feature points extracted from two images, may cause misalignments. To address this limitation, we propose to extract the salient edges and control points (CPs) of medical images by exploiting the efficient multiresolution representation of the nonsubsampled contourlet transform (NSCT), which finds the best feature points. The MR images were first decomposed using the NSCT, and then edges and CPs were extracted from the bandpass directional subbands of the NSCT coefficients using a set of proposed rules. After edge and CP extraction, mutual information was adopted for the registration of feature points, and the translation parameters were calculated using particle swarm optimization (PSO). The experimental results showed that the proposed method produces highly accurate registration of medical CT-MR images.
Abstract: Image registration plays an important role in the diagnosis of dental pathologies such as dental caries, alveolar bone loss and periapical lesions. This paper presents a new wavelet-based algorithm for registering noisy and poor-contrast dental X-rays. The proposed algorithm has two stages. The first is a preprocessing stage that removes noise from the X-ray images using a Gaussian filter. The second is a geometric transformation stage: the proposed work uses two levels of affine transformation, and wavelet coefficients are correlated instead of gray values. The algorithm has been applied to a number of pre- and post-RCT (root canal treatment) periapical radiographs. Root Mean Square Error (RMSE) and correlation coefficients (CC) are used for quantitative evaluation. The proposed technique outperforms both a conventional multiresolution-strategy-based image registration technique and manual registration.
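The two evaluation metrics named above are standard; for concreteness, a minimal implementation over flattened image intensity sequences might look like:

```python
import math


def rmse(a, b):
    """Root mean square error between two equal-length intensity sequences."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))


def correlation_coefficient(a, b):
    """Pearson correlation coefficient between two intensity sequences:
    1.0 means perfect linear agreement, 0.0 none."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    na = math.sqrt(sum((x - ma) ** 2 for x in a))
    nb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (na * nb)
```

A good registration drives the RMSE between the reference and the transformed floating image toward zero and the CC toward one.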
Abstract: In this paper, an automatic control system design based on the Integral Squared Error (ISE) parameter optimization technique has been implemented on the longitudinal flight dynamics of a UAV. The aim is to minimize the error function between the reference signal and the output of the plant. The objective function is defined with respect to the error dynamics; the resulting unconstrained optimization problem is solved analytically using the necessary and sufficient conditions of optimality, and the optimum PID parameters are obtained and implemented in the control system dynamics.
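The ISE criterion itself is simply J = ∫₀^∞ e(t)² dt. As a numerical illustration only (a hypothetical first-order plant, not the paper's UAV longitudinal dynamics, and a single proportional gain rather than a full PID), one can evaluate the ISE for a candidate gain by simulation:

```python
def ise_for_gain(kp, r=1.0, dt=0.001, t_end=10.0):
    """Simulate a first-order plant dx/dt = -x + u under proportional
    control u = kp * (r - x) and return the Integral Squared Error
    ISE = integral of e(t)^2 dt over [0, t_end]."""
    x, ise, t = 0.0, 0.0, 0.0
    while t < t_end:
        e = r - x
        ise += e * e * dt          # rectangle-rule quadrature of e^2
        x += (-x + kp * e) * dt    # forward-Euler step of the plant
        t += dt
    return ise
```

For this plant a larger proportional gain shrinks the steady-state error r/(1 + kp) and hence the ISE, which is the kind of trade-off the analytical optimization in the paper resolves for all three PID parameters at once.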
Abstract: This paper evaluates the performance of a novel algorithm for tracking a mobile node, in terms of execution time and root mean square error (RMSE). A particle filter algorithm is used to track the mobile node; in addition, a new technique within the particle filter algorithm is proposed to reduce the execution time. The stationary points were calculated through trilateration and then by averaging the points collected over a specific time, whereas tracking is done through trilateration as well as the particle filter algorithm. The Wi-Fi signal is used to get an initial guess of the position of the mobile node in the x-y coordinate system. The commercially available software "Wireless Mon" was used to read the Wi-Fi signal strength from the Wi-Fi card. Visual C++ version 6 was used to interact with this software and read only the required data from the log file generated by "Wireless Mon". Results are evaluated through mathematical modeling and MATLAB simulation.
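The trilateration step described above (a position fix from three range measurements) can be sketched as follows; the anchor coordinates and ranges in the usage are illustrative, not from the paper's testbed:

```python
def trilaterate(p1, p2, p3, r1, r2, r3):
    """Solve for (x, y) from three anchor positions and measured ranges.
    Subtracting pairs of circle equations (x - xi)^2 + (y - yi)^2 = ri^2
    cancels the quadratic terms and leaves a 2x2 linear system."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a11, a12 = 2.0 * (x2 - x1), 2.0 * (y2 - y1)
    a21, a22 = 2.0 * (x3 - x1), 2.0 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21   # zero iff the anchors are collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

In a Wi-Fi setting, the ranges would come from a signal-strength-to-distance model, so the raw fix is noisy; averaging fixes over time (for stationary points) or filtering them with a particle filter (for tracking) smooths that noise, as the abstract describes.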
Abstract: Humans have long known the importance of the environment to life, but only in the last decade of the 20th century did the environmental challenge become the subject of heated scientific, academic and political debate; the problem not only disturbs the peace and security of life but has come to threaten human existence itself. One problem that has become significant for authorities in recent years is that the results achieved have been unsatisfactory despite the huge sums spent on ambitious environmental projects. This has led thinkers to conclude that solving environmental problems requires new methods drawn from sociology, ethics, philosophy and related fields, in addition to technical measures. Environmental ethics is a new branch of philosophical ethics that discusses the ethical relationship between humans and the world around them. In light of these considerations, the necessity of environmental ethics for environmental management in today's world is redoubled. In what follows, the article focuses on the role of environmental ethics and on environmental management methods and techniques for developing it.
Abstract: Improving the quality of environmental elements can increase the recreational opportunities in a certain area (destination). The need-for-recreation technique focuses on the choice of certain destinations for recreational purposes. The basic trade-off taken into consideration is the one between the satisfaction gained after staying in the area and the value expressed in the money and time allocated. The number of tourists in the respective area, the duration of their stay and the money spent, including transportation, provide information on how individuals rank the place or certain aspects of the area (such as the quality of the environmental elements).
For the statistical analysis of the environmental benefits offered by
an area through the need of recreation technique, the following stages
are suggested:
- characterization of the reference area based on the
statistical variables considered;
- estimation of the environmental benefit through
comparing the reference area with other similar areas
(having the same environmental characteristics), from
the perspective of the statistical variables considered.
The comparison model used in the recreation technique faces a series of difficulties concerning the choice of the reference area and the correct conversion of time into money.
Abstract: Locality Sensitive Hashing (LSH) is one of the most promising techniques for solving the nearest neighbour search problem in high-dimensional spaces. Euclidean LSH is the most popular variant of LSH and has been successfully applied in many multimedia applications. However, Euclidean LSH has limitations that affect structure and query performance. Its main limitation is large memory consumption: to achieve good accuracy, a large number of hash tables is required. In this paper, we propose a new hashing algorithm that overcomes the storage space problem and improves query time, while keeping accuracy similar to that achieved by the original Euclidean LSH. Experimental results on a real large-scale dataset show that the proposed approach achieves good performance and consumes less memory than Euclidean LSH.
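For reference, the original Euclidean LSH family referred to here hashes a vector v as h(v) = ⌊(a·v + b)/w⌋, where a has i.i.d. standard Gaussian entries and b is uniform in [0, w). A minimal sketch of one such hash function (the bucket width w and seed are illustrative):

```python
import math
import random


def make_e2lsh_hash(dim, w=4.0, seed=0):
    """Build one Euclidean (p-stable) LSH function
    h(v) = floor((a . v + b) / w), with a drawn from a standard Gaussian
    and b uniform in [0, w)."""
    rng = random.Random(seed)
    a = [rng.gauss(0.0, 1.0) for _ in range(dim)]
    b = rng.uniform(0.0, w)

    def h(v):
        projection = sum(ai * vi for ai, vi in zip(a, v))
        return math.floor((projection + b) / w)

    return h
```

Nearby vectors project to nearby values and therefore often share a bucket; an index concatenates several such functions per table and, in the scheme criticized above, keeps many tables, which is exactly the memory cost the proposed algorithm targets.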
Abstract: This paper proposes an efficient finite precision block floating point (BFP) treatment of the fixed-coefficient finite impulse response (FIR) digital filter. The treatment includes an effective implementation of all three forms of the conventional FIR filter, namely direct form, cascaded and parallel, and a roundoff error analysis of them in the BFP format. An effective block formatting algorithm together with an adaptive scaling factor is proposed to make the realizations simpler from the hardware viewpoint. To this end, a generic relation between the tap weight vector length and the input block length is deduced. The implementation scheme also emphasises a simple block exponent update technique to prevent overflow even during the block-to-block transition phase. The roundoff noise is investigated along analogous lines, taking these implementation issues into consideration. The simulation results show that the BFP roundoff errors depend on the signal level in almost the same way as floating point roundoff noise, resulting in an approximately constant signal-to-noise ratio over a relatively large dynamic range.
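The core of any BFP treatment is block formatting: all samples in a block share one exponent, chosen from the block's peak magnitude so that every normalized mantissa fits the fixed-point range. A minimal sketch of this step (the paper's adaptive scaling factor and exponent-update details are not reproduced here):

```python
def block_format(block):
    """Represent a block of samples in block floating point: a shared
    exponent gamma is the smallest integer with peak < 2**gamma, so every
    normalized mantissa lies strictly inside (-1, 1)."""
    peak = max(abs(s) for s in block)
    gamma = 0
    while (1 << gamma) <= peak:
        gamma += 1
    scale = float(1 << gamma)
    mantissas = [s / scale for s in block]   # sample = mantissa * 2**gamma
    return mantissas, gamma
```

Because the exponent scales with the block's signal level, quantizing the mantissas yields roundoff noise roughly proportional to that level, which is the floating-point-like constant-SNR behavior the simulations report.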
Abstract: Home is important to Chinese people. Because information about house attributes and the surrounding environment is incomplete at most real estate agencies, most house buyers find it difficult to consider all the relevant factors effectively and can only search for candidates with a sorting-based approach. This study aims to develop a decision support system for house purchasing in which the surrounding facilities of each house are quantified. All the considered house factors and customer preferences are then incorporated into the Simple Multi-Attribute Ranking Technique (SMART) to support the housing evaluation. To evaluate the validity of the proposed approach, an empirical study was conducted with data from a real estate agency. Based on the customer's requirements and preferences, the proposed approach can identify better candidate houses while considering the overall house attributes and surrounding facilities.
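SMART itself reduces to a weighted sum of normalized attribute utilities. A minimal sketch (the house names, attribute utilities and preference weights below are hypothetical):

```python
def smart_score(attribute_values, weights):
    """SMART aggregate score: normalize the preference weights to sum to 1,
    then take the weighted sum of per-attribute utilities (each in [0, 1])."""
    total = sum(weights)
    return sum((w / total) * v for w, v in zip(weights, attribute_values))


def rank_houses(houses, weights):
    """Rank candidate houses (name -> list of attribute utilities) by
    SMART score, best first."""
    return sorted(houses, key=lambda name: smart_score(houses[name], weights),
                  reverse=True)
```

With quantified surrounding-facility scores as extra attributes, a buyer who weights, say, nearby schools heavily gets a ranking that a plain sorting-based search on one attribute cannot produce.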
Abstract: Robotic systems are an important area of artificial intelligence that aims at developing techniques to improve a robot's performance and make it more efficient and more effective in choosing its correct behavior. In this paper, a distributed learning classifier system is used to design a simulated control system that enables a robot to perform complex behaviors. A set of enhanced approaches that support the formation of default hierarchies is suggested, and the approaches are compared with each other in order to make the simulated robot more effective in mapping inputs to the correct output behavior.
Abstract: XML is becoming a de facto standard for online data exchange. Existing XML filtering techniques based on a publish/subscribe model focus on highly structured data marked up with XML tags. These techniques are efficient in filtering data-centric XML documents but are not effective in filtering the element contents of document-centric XML. In this paper, we propose an extended XPath specification which includes the special matching character '%' used in the LIKE operator of SQL, in order to overcome the difficulty of writing queries that adequately filter element contents under the previous XPath specification. We also present a novel technique for filtering collections of document-centric XML documents, called Pfilter, which is able to exploit the extended XPath specification. We present several performance studies, evaluating efficiency and scalability using the multi-query processing time (MQPT).
Abstract: Human pose estimation can be performed using Active Shape Models. Existing Active Shape Model techniques applied to human-body research, such as human detection, primarily represent the human body by its silhouette. Such techniques cannot accurately estimate poses involving the two arms and legs, because the silhouette captures only the outer contour of the body. To solve this problem, we applied a stick-figure, or "skeleton", model of the human body. The skeleton model can accommodate various shapes of human pose. To obtain effective estimation results, we applied background subtraction and a modified matching algorithm based on the original Active Shape Model in the fitting process. The model was built from 600 human-body images and has 17 landmark points, which indicate body junctions and key features of human pose. The maximum number of iterations for the fitting process was 30, and the execution time was less than 0.03 s.
Abstract: Emphasis has been placed on the advancement of new materials and technology for the past few decades. The global trend towards using cheap and durable materials from renewable resources contributes to sustainable development. An experimental investigation of the mechanical behaviour of sisal fibre-reinforced concrete is reported, with a view to making a suitable reinforced building material. Fibre-reinforced composite is one such material, and it has reformed the concept of high strength. Sisal fibres are abundantly available in hot regions, and sisal fibre has emerged as a reinforcing material for the concretes used in civil structures. In this work, properties such as hardness and tensile strength of sisal fibre-reinforced cement composites with 6, 12, 18 and 24% sisal fibre by weight were assessed. Sisal fibre-reinforced cement composite slabs with long sisal fibres were manufactured using a cast hand lay-up technique, and the mechanical response was measured under tension. The high energy absorption capacity of the developed composite system was reflected in high toughness values under tension.
Abstract: This paper presents robust stability criteria for uncertain genetic regulatory networks with time-varying delays. One key point of the criteria is the decomposition of the matrix D̃ as D̃ = D̃1 + D̃2, which corresponds to a decomposition of the delayed terms into two groups: the stabilizing ones and the destabilizing ones. This technique enables one to take into account the stabilizing effect of part of the delayed terms. Meanwhile, by choosing an appropriate new Lyapunov functional, a new delay-dependent stability criterion is obtained and formulated in terms of linear matrix inequalities (LMIs). Finally, numerical examples are presented to illustrate the effectiveness of the theoretical results.
Abstract: Advertising has become an integral part of human life as a building block of the consumer community. As a component of the media value chain, the advertising sector is struggling ever harder to find new methods of reaching consumers. The tendency towards experiential marketing practices is increasing day by day, especially in order to divert consumers from the idea "they are selling something to me." It is therefore worthwhile to investigate the trust in ad media of consumers, who are today exposed to a great bulk of information from the advertising sector.
In this study, the current value of ad media for the young consumer will be investigated. Data on the reliability of various ad media will be comparatively analyzed, with young consumers represented by the university students included in the study. In this research, to be conducted on students of Selçuk University (Turkey) selected by random sampling, data will be obtained by a survey technique and evaluated by statistical analysis.
Abstract: Word sense disambiguation is one of the most important open problems in natural language processing applications such as information retrieval and machine translation. Several strategies can be employed to resolve word ambiguity with a reasonable degree of accuracy: knowledge-based, corpus-based, and hybrid approaches. This paper focuses on the corpus-based strategy, employing an unsupervised learning method for disambiguation. We report our investigation of Latent Semantic Indexing (LSI), an unsupervised information retrieval technique, applied to the task of disambiguating Thai nouns and verbs. Latent Semantic Indexing has been shown to be efficient and effective for information retrieval. For the purposes of this research, we report experiments on two Thai polysemous words, /hua4/ and /kep1/, which serve as representatives of Thai nouns and verbs, respectively. The results of these experiments demonstrate the effectiveness of the approach and indicate the potential of applying vector-based distributional information measures to semantic disambiguation.
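Once contexts have been projected into the reduced LSI space, the vector-based disambiguation step amounts to choosing the sense whose training centroid is most similar to the ambiguous occurrence's context vector. A minimal sketch of that final step (the sense labels and vectors are hypothetical, and the SVD-based LSI projection itself is omitted):

```python
import math


def cosine(u, v):
    """Cosine similarity between two context vectors; 0.0 for a zero vector."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0


def disambiguate(context_vec, sense_centroids):
    """Assign the sense whose centroid vector is most similar (by cosine)
    to the ambiguous word's current context vector."""
    return max(sense_centroids,
               key=lambda sense: cosine(context_vec, sense_centroids[sense]))
```

In the experiments described, each occurrence of /hua4/ or /kep1/ would be disambiguated this way against centroids built from occurrences of each sense in the training corpus.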