Abstract: One difficulty of vibration-based damage identification methods is the nonuniqueness of the identification results: different damage locations and severities may produce identical response signals, a problem that is even more severe for multiple-damage detection. This paper proposes a new damage detection strategy to avoid this nonuniqueness. The strategy first determines the approximate damage area with a statistical pattern recognition method applied to the dynamic strain signal measured by distributed fiber Bragg gratings, and then accurately evaluates the damage information with a Bayesian model updating method using experimental modal data. A stochastic simulation method is used to compute the high-dimensional integral in the Bayesian problem. Finally, an experiment on a plate structure, simulating one part of a mechanical structure, verifies the effectiveness of the approach.
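The stochastic simulation step in such Bayesian model updating can be sketched with a random-walk Metropolis sampler. The toy forward model (a single natural frequency depending on one stiffness-reduction parameter), the measured value, and the noise level below are illustrative assumptions, not the paper's structural model:

```python
import numpy as np

# Random-walk Metropolis sketch for Bayesian model updating.
# Hypothetical forward model: frequency drops as damage severity theta grows.
rng = np.random.default_rng(0)

def predicted_freq(theta):
    return 10.0 * np.sqrt(1.0 - 0.5 * theta)

measured = predicted_freq(0.3)   # pretend a noise-free measurement at theta=0.3
sigma = 0.05                     # assumed measurement-noise standard deviation

def log_post(theta):
    if not 0.0 <= theta <= 1.0:  # uniform prior on [0, 1]
        return -np.inf
    return -0.5 * ((measured - predicted_freq(theta)) / sigma) ** 2

theta, samples = 0.5, []
for _ in range(5000):
    prop = theta + 0.05 * rng.standard_normal()
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)

# the posterior mean should concentrate near the true severity 0.3
print(np.mean(samples[1000:]))
```

Sampling like this replaces the intractable high-dimensional integral over the damage parameters with an average over posterior draws.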
Abstract: The paper analyses the triggering conditions and evolution of piping phenomena with respect to both mechanical and hydraulic aspects. In particular, the aim of the study is to predict slope instabilities triggered by piping, analysing the conditions necessary for a flow failure to occur. In fact, the mechanical effect involved in the redistribution of loads around the pipe is coupled with the drainage process arising from the higher permeability of the pipe. If, after pipe formation, drainage is prevented by pipe clogging, the increase in porewater pressure can lead to failure or even liquefaction, with a subsequent flow slide. To simulate the evolution of piping and to verify the relevant stability conditions, an iterative coupled modelling approach has been developed. As an example, the proposed tool has been applied to the Stava Valley disaster (July 1985), demonstrating that piping may have been one of the triggering phenomena of the tailings dam collapse.
Abstract: Although face detection is not a recent activity in the field of image processing, it is still an open area for research. The greatest step in this field is the work reported by Viola and Jones, and its recent analogue is the work of Huang et al. Both use similar features and a similar training process. The former detects only upright faces, but the latter can detect multi-view faces in still grayscale images using new features called 'sparse features'. With the previously proposed methods, finding these features is very time-consuming and inefficient. Here, we propose a new approach to finding sparse features using a genetic algorithm. This method requires less computational cost and yields more effective features in the learning process, resulting in more accurate face detection.
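The genetic search described above can be sketched in miniature: a population of binary feature masks is evolved by elitist selection, one-point crossover and point mutation. The bitstring encoding, the stand-in fitness (agreement with a hidden "useful feature" pattern) and the operator choices are illustrative assumptions, not the paper's actual feature encoding:

```python
import random

# Toy genetic algorithm evolving binary feature-selection masks.
random.seed(1)
N_BITS, POP, GENS = 20, 30, 60
target = [random.randint(0, 1) for _ in range(N_BITS)]  # stand-in for "useful features"

def fitness(mask):
    return sum(m == t for m, t in zip(mask, target))

pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:POP // 2]               # elitist selection
    children = []
    while len(survivors) + len(children) < POP:
        a, b = random.sample(survivors, 2)
        cut = random.randrange(1, N_BITS)    # one-point crossover
        child = a[:cut] + b[cut:]
        child[random.randrange(N_BITS)] ^= 1  # point mutation
        children.append(child)
    pop = survivors + children

best = max(pop, key=fitness)
print(fitness(best))   # best score out of N_BITS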
Abstract: Visualizing sound and noise often helps us to determine appropriate control over source localization. Near-field acoustic holography (NAH) is a powerful tool for this ill-posed problem. In practice, however, because of the small finite aperture size, discrete-Fourier-transform (FFT) based NAH cannot predict the active region of interest (AROI) near the edges of the measurement plane. A few theoretical approaches have been proposed for solving the finite-aperture problem, but most of them are not well suited to practical implementation, especially near the edges of the source. In this paper, a zip-stuffing extrapolation approach with a 2D Kaiser window is suggested. It operates in the complex wavenumber space to localize the predicted sources. We numerically construct a practical test environment with touch-impact databases to evaluate sound source localization. It is observed that zip-stuffing aperture extrapolation and the 2D window with evanescent components provide greater accuracy, especially for small apertures and their derivatives.
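The general windowing-plus-padding idea behind FFT-based aperture extension can be sketched as follows: taper the small measured aperture with a separable 2D Kaiser window, then zero-pad before the spatial FFT to reduce edge wrap-around. The toy hologram field, the window shape parameter and the padding factor are illustrative assumptions, not the paper's zip-stuffing scheme itself:

```python
import numpy as np

# Taper a small measured aperture and zero-pad before the spatial FFT.
n = 32
x = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, x)
pressure = np.exp(-10 * (X**2 + Y**2))     # toy measured hologram

w = np.kaiser(n, 6.0)                      # 1-D Kaiser window, beta = 6
window2d = np.outer(w, w)                  # separable 2-D Kaiser window
tapered = pressure * window2d

padded = np.zeros((4 * n, 4 * n))          # 4x zero-padding of the aperture
padded[:n, :n] = tapered
spectrum = np.fft.fft2(padded)             # wavenumber-domain spectrum

print(padded.shape)
```

The window suppresses the aperture-edge discontinuity that otherwise leaks across the wavenumber spectrum, while padding refines the wavenumber sampling.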
Abstract: This paper presents an approach to unequal error protection for coding the facial features of personal ID images. We consider unequal error protection (UEP) strategies for the efficient progressive transmission of embedded image codes over noisy channels. The new method is based on the progressive embedded zerotree wavelet (EZW) image compression algorithm and a UEP technique with a defined region of interest (ROI). In this case, the ROI corresponds to the facial features within the personal ID image. ROI techniques are important in applications where different parts of the image have different importance. In ROI coding, a chosen ROI is encoded with higher quality than the background (BG). Unequal error protection of the image is provided by different coding techniques and by encoding the LL band separately. In our proposed method, the image is divided into two parts (ROI and BG) consisting of more important bytes (MIB) and less important bytes (LIB). The proposed unequal error protection of image transmission has been shown to be well suited to low-bit-rate applications, producing better output quality for the ROI of the compressed image. The experimental results verify the effectiveness of the design, comparing UEP of image transmission with an ROI defined on the facial features against equal error protection (EEP) over an additive white Gaussian noise (AWGN) channel.
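The MIB/LIB split at the heart of UEP can be sketched with a deliberately simple scheme: the important stream gets a rate-1/3 repetition code with majority-vote decoding, the unimportant stream goes unprotected, and both pass through a toy binary symmetric channel. The split, the code rate and the channel model are illustrative assumptions, not the paper's EZW-based scheme:

```python
import random

# Unequal error protection over a toy binary symmetric channel.
random.seed(7)

def channel(bits, p=0.05):
    return [b ^ (random.random() < p) for b in bits]   # flip each bit w.p. p

def protect(bits):                 # rate-1/3 repetition code
    return [b for b in bits for _ in range(3)]

def recover(coded):                # majority vote per triple
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

mib = [random.randint(0, 1) for _ in range(300)]   # ROI bits (more important)
lib = [random.randint(0, 1) for _ in range(300)]   # background bits

mib_rx = recover(channel(protect(mib)))
lib_rx = channel(lib)

mib_err = sum(a != b for a, b in zip(mib, mib_rx)) / len(mib)
lib_err = sum(a != b for a, b in zip(lib, lib_rx)) / len(lib)
print(mib_err, lib_err)
```

The protected ROI stream typically shows a markedly lower residual error rate, which is exactly the asymmetry UEP trades bandwidth for.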
Abstract: The paper provides a discussion of the most relevant aspects of yield curve modeling. Two classes of models are considered: stochastic models and parsimonious function-based models, through the approaches developed by Vasicek (1977) and Nelson and Siegel (1987). Yield curve estimates for Croatia are presented, their dynamics are analyzed, and finally a comparative analysis of the models is conducted.
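The parsimonious Nelson-Siegel (1987) curve mentioned above has a closed form: y(t) = b0 + b1·(1−e^(−t/τ))/(t/τ) + b2·((1−e^(−t/τ))/(t/τ) − e^(−t/τ)), where b0 is the long-run level, b1 the slope and b2 the curvature. The parameter values below are purely illustrative, not estimates for Croatia:

```python
import numpy as np

# Nelson-Siegel yield curve evaluated at a few maturities.
def nelson_siegel(t, b0, b1, b2, tau):
    x = t / tau
    slope = (1 - np.exp(-x)) / x
    return b0 + b1 * slope + b2 * (slope - np.exp(-x))

maturities = np.array([0.25, 1.0, 2.0, 5.0, 10.0])   # years
y = nelson_siegel(maturities, b0=0.05, b1=-0.02, b2=0.01, tau=1.5)
print(np.round(y, 4))
```

With a negative b1 the curve slopes upward from b0+b1 at the short end toward the long-run level b0, the typical shape of an upward-sloping term structure.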
Abstract: Water is the key to national development. Wherever a spring has dried up or a river has changed its course, the area's people have migrated and been scattered, and the area's civilization has lost its brilliance. Today, air pollution, global warming and ozone-layer damage are among the problems facing countries, but in the next decade the shortage and pollution of water will certainly be major issues for the world. Polluted waters are especially dangerous when used in agriculture, because they contaminate plants that then enter the human and livestock food chain. With increasing population growth and the consequent growing need for facilities and raw materials, human beings have acted hastily and, intentionally or not, have destroyed their own life basin, overusing and exploiting the environment instead of taking a forward-looking approach to its sustainable use. This process includes the recession of the Zayanderood and has caused its pollution after it passes through industrial and urban areas. The Zayandehrood River in Isfahan is the vital artery of a living ecosystem; it is now the disposal site for the wastewater of many cities, villages and industries. The central area of the province is an important industrial region, and its environmental situation has reached a critical stage. Not only are a large number of pollution-generating industries active within the city limits, but heavy industries outside the city, in districts adjacent to the Zayandehrood River, such as Mobarakeh Steel and tens of other large units, pollute the wildlife. This article studies the contaminant sources of the Zayanderood and their severity, determines and discusses the share of each source among the major industrial centers located in the area, and finally presents a suitable strategy.
Abstract: Semantic Web technologies enable machines to interpret data published on the web in a machine-interpretable form. At present, only human beings are able to understand the product information published online. The emerging Semantic Web technologies have the potential to deeply influence the further development of the Internet economy. In this paper we propose a scenario-based research approach to predict the effects of these new technologies on electronic markets and on the business models of traders, intermediaries and customers. Over 300 million searches are conducted every day on the Internet by people trying to find what they need. A majority of these searches are in the domain of consumer e-commerce, where a web user is looking for something to buy. This represents a huge cost in terms of person-hours and an enormous drain on resources. Agent-enabled semantic search will have a dramatic impact on the precision of these searches and will reduce, and possibly eliminate, the information asymmetry in which a better-informed buyer gets the best value. By affecting this key determinant of market prices, the Semantic Web will foster the evolution of different business and economic models. We submit that there is a need to develop these futuristic models based on our current understanding of e-commerce models and the nascent Semantic Web technologies. We believe these business models will encourage mainstream web developers and businesses to join the "semantic web revolution."
Abstract: In this paper, we explore the applicability of the Sinc-Collocation method to a three-dimensional (3D) oceanography model. The model describes a wind-driven current with depth-dependent eddy viscosity in the complex-velocity system. In general, Sinc-based methods excel over traditional numerical methods because of their exponentially decaying errors, rapid convergence and ability to handle problems with singularities at the endpoints. In addition to these advantages, the Sinc-Collocation approach that we utilize exploits first-derivative interpolation, whose integration is much less sensitive to numerical errors. We present several model problems to demonstrate the accuracy, stability and computational efficiency of the method. The approximate solutions determined by the Sinc-Collocation technique are compared with exact solutions and with those obtained by the Sinc-Galerkin approach in earlier studies. Our findings indicate that the Sinc-Collocation method outperforms the other Sinc-based methods of past studies.
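The cardinal (Whittaker) sinc basis underlying Sinc methods is S_k(h)(x) = sinc((x − kh)/h) on a uniform grid of step h; a truncated sinc expansion reproduces a smooth, rapidly decaying function with exponentially small error. The test function, step size and truncation below are illustrative assumptions, not the oceanography model:

```python
import numpy as np

# Truncated sinc expansion of a smooth function on a uniform grid.
def sinc_basis(k, h, x):
    return np.sinc((x - k * h) / h)      # np.sinc(t) = sin(pi*t)/(pi*t)

h, N = 0.25, 40
k = np.arange(-N, N + 1)
f = lambda x: np.exp(-x ** 2)            # smooth, rapidly decaying test function

x0 = 0.3                                 # evaluation point off the grid
approx = sum(f(kk * h) * sinc_basis(kk, h, x0) for kk in k)
print(abs(approx - f(x0)))               # exponentially small interpolation error
```

Collocation methods impose the governing equation at these same grid points, so the differentiation matrices inherit this exponential accuracy.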
Abstract: In this paper, a subtractive-clustering-based fuzzy inference system approach is used for early detection of faults in function-oriented software systems. The approach has been tested on real defect datasets from the NASA software projects PC1 and CM1. Both the code-based model and the joined model (a combination of requirement-based and code-based metrics) of the datasets are used for training and testing the proposed approach. The performance of the models is recorded in terms of accuracy, MAE and RMSE values, and is better for the joined model. The results show that clustering and fuzzy logic together provide a simple yet powerful means of modeling the early detection of faults in function-oriented software systems.
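Subtractive clustering (Chiu's method, commonly used to seed fuzzy inference systems) can be sketched in a few lines: each point's potential is a sum of Gaussian neighborhood contributions, the highest-potential point becomes a cluster center, and potentials near that center are then subtracted away. The two-blob data and the radii are illustrative assumptions, not the NASA defect metrics:

```python
import numpy as np

# Minimal subtractive-clustering sketch on two synthetic blobs.
rng = np.random.default_rng(3)
data = np.vstack([rng.normal(0, 0.1, (20, 2)),
                  rng.normal(2, 0.1, (20, 2))])

ra, rb = 0.5, 0.75                       # accept radius and squash radius
alpha, beta = 4 / ra ** 2, 4 / rb ** 2
d2 = ((data[:, None, :] - data[None, :, :]) ** 2).sum(-1)
potential = np.exp(-alpha * d2).sum(1)   # potential of each point

centers = []
for _ in range(2):                       # extract two centers
    i = int(np.argmax(potential))
    centers.append(data[i])
    potential = potential - potential[i] * np.exp(-beta * d2[:, i])

print(centers)
```

Each recovered center would then anchor one fuzzy rule, with Gaussian membership functions centered on it.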
Abstract: In this paper a new approach is proposed for adapting simulated annealing search to the field of Multi-Objective Optimization (MOO). The new approach is called Multi-Case Multi-Objective Simulated Annealing (MC-MOSA). It builds on the well-known Multi-Objective Simulated Annealing algorithm proposed by Ulungu et al., referred to in the literature as U-MOSA. However, some drawbacks of that algorithm have been identified and are replaced with alternative mechanisms, especially in the acceptance decision criterion. MC-MOSA has shown better performance than U-MOSA in numerical experiments. This performance is further improved by several subvariants of MC-MOSA, such as Fast-annealing MC-MOSA, Re-annealing MC-MOSA and Two-Stage annealing MC-MOSA.
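A generic acceptance rule of the MOSA family (of which U-MOSA's criterion is a weighted variant) accepts a candidate with probability min(1, exp(−Δ/T)), where Δ is a weighted-sum change of the objectives. The weights, temperature and the rule itself are illustrative assumptions, not the MC-MOSA criterion:

```python
import math
import random

# Weighted-sum acceptance rule for multi-objective simulated annealing.
random.seed(0)

def accept(current, candidate, weights, T):
    # scalarized change across all objectives (minimization)
    delta = sum(w * (c2 - c1) for w, c1, c2 in zip(weights, current, candidate))
    return delta <= 0 or random.random() < math.exp(-delta / T)

# two objectives to minimize; the candidate improves f1 but worsens f2
print(accept(current=(3.0, 2.0), candidate=(2.5, 2.2),
             weights=(0.5, 0.5), T=1.0))
```

At high temperature even dominated moves are sometimes accepted, which is what lets annealing spread over the Pareto front instead of collapsing to a single compromise.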
Abstract: Fake-finger submission attacks are a major problem for fingerprint recognition systems. In this paper, we introduce a liveness detection method based on multiple static features derived from a single fingerprint image. The static features comprise individual pore spacing, residual noise and several first-order statistics; in particular, a correlation filter is adopted to measure individual pore spacing. The multiple static features reflect the physiological and statistical characteristics of live and fake fingerprints. Classification is performed by calculating a liveness score from each feature and fusing the scores through a classifier. On our dataset, we compare nine classifiers; the best classification rate, 85%, is attained by a Reduced Multivariate Polynomial classifier. Our approach is fast and convenient for liveness checking in field applications.
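The score-level fusion step can be sketched with the simplest possible fuser, a weighted sum followed by a threshold; the scores, weights and threshold are illustrative assumptions, and the paper itself fuses scores with a Reduced Multivariate Polynomial classifier rather than a linear rule:

```python
# Weighted-sum fusion of per-feature liveness scores (illustrative only).
def fuse(scores, weights, threshold=0.5):
    s = sum(w * x for w, x in zip(weights, scores))
    return "live" if s >= threshold else "fake"

# hypothetical scores from pore spacing, residual noise, first-order stats
print(fuse([0.8, 0.6, 0.7], weights=[0.5, 0.3, 0.2]))
```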
Abstract: In this research the separation efficiency of a deoiling hydrocyclone is evaluated using three-dimensional simulation of multiphase flow based on an Eulerian-Eulerian finite volume method. The mixture approach of the Reynolds Stress Model is employed to capture the features of turbulent multiphase swirling flow. The separation efficiency obtained for Colman's design is compared with available experimental data, showing that the separation curve of deoiling hydrocyclones can be predicted by numerical simulation.
Abstract: The network traffic data provided for intrusion detection design are typically large, contain much ineffective information, and offer only limited and ambiguous information about users' activities. We study these problems and propose a two-phase approach in our intrusion detection design. In the first phase, we develop a correlation-based feature selection algorithm to remove worthless information from the original high-dimensional database. Next, we design an intrusion detection method to address the uncertainty caused by limited and ambiguous information. In the experiments, we use six UCI databases and the DARPA KDD99 intrusion detection data set as evaluation tools. Empirical studies indicate that our feature selection algorithm is capable of reducing the size of the data set, and that our intrusion detection method achieves better performance than the participating intrusion detectors.
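Correlation-based feature selection is usually driven by a merit function of the CFS form, merit = k·r_cf / sqrt(k + k(k−1)·r_ff), which rewards features correlated with the class (r_cf) and penalizes features correlated with each other (r_ff). Whether the paper uses exactly this merit is not stated, and the correlation values below are illustrative assumptions:

```python
import math

# CFS-style merit of a k-feature subset.
def cfs_merit(k, r_cf, r_ff):
    # r_cf: mean feature-class correlation, r_ff: mean inter-feature correlation
    return (k * r_cf) / math.sqrt(k + k * (k - 1) * r_ff)

# three features, strong class correlation, low redundancy
print(round(cfs_merit(3, 0.6, 0.2), 3))
```

Raising the inter-feature correlation while holding everything else fixed lowers the merit, which is how the search discards redundant traffic features.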
Abstract: Recently, much attention has been devoted to advanced techniques of system modeling. The PNN (polynomial neural network) is a GMDH-type (Group Method of Data Handling) algorithm and one of the useful methods for modeling nonlinear systems, but PNN performance depends strongly on the number of input variables and on the polynomial order, which are determined by trial and error. In this paper, we introduce the GPNN (genetic polynomial neural network) to improve the performance of the PNN. GPNN determines the number of input variables and the order of all neurons with a genetic algorithm (GA): the GA searches over all possible values of the number of input variables and the polynomial order. GPNN performance is evaluated on two nonlinear systems: a quadratic equation and the Dow Jones stock index time series.
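A single GMDH-type polynomial neuron is just a low-order polynomial of a few inputs with least-squares-fitted coefficients. The sketch below fixes a full quadratic neuron of two inputs on synthetic noise-free data; in GPNN a genetic algorithm would instead choose which inputs and which order each neuron uses. The data and target function are illustrative assumptions:

```python
import numpy as np

# One second-order PNN (GMDH-type) neuron fitted by least squares.
rng = np.random.default_rng(5)
x1, x2 = rng.uniform(-1, 1, 100), rng.uniform(-1, 1, 100)
y = 1 + 2 * x1 - x2 + 0.5 * x1 * x2            # hypothetical target system

# design matrix of the quadratic neuron: 1, x1, x2, x1^2, x2^2, x1*x2
A = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)   # should recover approximately [1, 2, -1, 0, 0, 0.5]
```

Stacking layers of such neurons, each fed by outputs of the previous layer, gives the full PNN; the GA's chromosome encodes the per-neuron structural choices.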
Abstract: Medical image data hiding has strict constraints such as high imperceptibility, high capacity and high robustness, and achieving these three requirements simultaneously is highly cumbersome. Some works on data hiding, watermarking and steganography suitable for telemedicine applications have been reported in the literature, but none is reliable in all respects. Electronic Patient Report (EPR) data hiding for telemedicine demands that the scheme be blind and reversible. This paper proposes a novel approach to blind reversible data hiding based on the integer wavelet transform. Experimental results show that the scheme outperforms prior art in terms of zero BER (bit error rate), higher PSNR (peak signal-to-noise ratio) and large EPR data embedding capacity, with a WPSNR (weighted peak signal-to-noise ratio) of around 53 dB, compared with existing reversible data hiding schemes.
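Reversibility in integer-wavelet schemes comes from lifting with integer arithmetic: the one-level integer Haar (S) transform below stores floor averages and differences, and its inverse reconstructs the pixels exactly, so the embedding can be undone bit-for-bit. The sample pixel row is an illustrative assumption:

```python
# Reversible integer (lifting-based) Haar transform of a pixel row.
def int_haar_fwd(pixels):
    s = [(a + b) // 2 for a, b in zip(pixels[::2], pixels[1::2])]  # approximation
    d = [a - b for a, b in zip(pixels[::2], pixels[1::2])]         # detail
    return s, d

def int_haar_inv(s, d):
    out = []
    for si, di in zip(s, d):
        b = si - di // 2    # floor division exactly undoes the forward step
        a = di + b
        out += [a, b]
    return out

row = [10, 12, 200, 195, 7, 7, 13, 40]
s, d = int_haar_fwd(row)
print(int_haar_inv(s, d) == row)   # perfectly reversible
```

EPR bits would be embedded by modifying the detail coefficients d; because the transform is integer-exact, removing the payload restores the original image.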
Abstract: Deaminated lesions were produced via nitrosative oxidation of the natural nucleobases: uracil (Ura, U) from cytosine (Cyt, C), hypoxanthine (Hyp, H) from adenine (Ade, A), and xanthine (Xan, X) and oxanine (Oxa, O) from guanine (Gua, G). Such damaged nucleobases may cause mutagenic problems, so much attention and effort has been devoted to revealing their mechanisms in vivo and in vitro. In this study, we employed these deaminated lesions as useful probes for the analysis of DNA-binding and DNA-recognizing proteins and enzymes. Since the purine lesions Hyp, Oxa and Xan can serve as analogues of guanine, their comparative use is informative for analyzing the role of Gua in a DNA sequence in DNA-protein interactions. Several DNA oligomers containing Hyp, Oxa or Xan substituted for Gua were designed to reveal the molecular interaction between DNA and protein. From this approach, we obtained useful information for understanding the molecular mechanisms of DNA-recognizing enzymes that could not be observed using conventional DNA oligomers composed only of natural nucleobases.
Abstract: Most scientific programs have large input and output data sets that require out-of-core programming or the use of virtual memory management (VMM). Out-of-core programming is very error-prone and tedious, and as a result it is generally avoided; however, in many instances VMM is not an effective approach because it often causes a substantial performance reduction. In contrast, compiler-driven I/O management allows a program's data sets to be retrieved in parts, called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a compiler combined with a user-level runtime system that can be used to replace standard VMM for out-of-core programs. We describe Comanche and demonstrate on a number of representative problems that it substantially outperforms VMM. Significantly, our system does not require any special services from the operating system and does not require modification of the operating system kernel.
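The blocked-retrieval idea can be sketched at user level: instead of mapping a large on-disk array into memory and letting VMM page it, the program streams it in fixed-size tiles and operates on one in-core tile at a time. The file, data type and tile size are illustrative assumptions, not Comanche's actual runtime:

```python
import os
import tempfile
import numpy as np

# Process a large on-disk array in fixed-size tiles instead of via VMM.
path = os.path.join(tempfile.mkdtemp(), "big.dat")
np.arange(1_000_000, dtype=np.float64).tofile(path)   # stand-in data set

total, tile = 0.0, 65_536
with open(path, "rb") as f:
    while True:
        chunk = np.fromfile(f, dtype=np.float64, count=tile)  # read one tile
        if chunk.size == 0:
            break
        total += chunk.sum()        # operate on the in-core tile only

print(total)
```

A compiler-managed scheme automates exactly this transformation, choosing tile sizes and prefetching so the working set stays resident without kernel support.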
Abstract: We study spatial design of experiments, where the goal is to select a most informative subset of prespecified size from a set of correlated random variables. The problem arises in many applied domains, such as meteorology, environmental statistics and statistical geology, where observations can be collected at different locations and possibly at different times. In spatial design, when the design region and the set of interest are discrete, the covariance matrix completely describes any objective function, and our goal is to choose a feasible design that minimizes the resulting uncertainty. The problem is recast as maximizing the determinant of the covariance matrix of the chosen subset, which is NP-hard. When such designs are used in computer experiments, the design space is often very large and it is not possible to compute the exact optimal solution. Heuristic optimization methods can discover efficient experiment designs in situations where traditional designs cannot be applied, exchange methods are ineffective and an exact solution is not possible. We developed a genetic algorithm (GA) to take advantage of its exploratory power, and we demonstrate the successful application of this method on a large design space in a real design-of-experiments case.
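The determinant objective itself is easy to state: among all size-k subsets, pick the one maximizing det of the corresponding covariance submatrix. On a tiny instance, brute force stands in for the GA; the covariance matrix below is an illustrative assumption:

```python
from itertools import combinations
import numpy as np

# D-optimal subset selection: maximize det of the covariance submatrix.
cov = np.array([[1.0, 0.9, 0.1],
                [0.9, 1.0, 0.2],
                [0.1, 0.2, 1.0]])

def best_subset(cov, k):
    n = cov.shape[0]
    return max(combinations(range(n), k),
               key=lambda s: np.linalg.det(cov[np.ix_(s, s)]))

print(best_subset(cov, 2))   # avoids the highly correlated pair (0, 1)
```

The winning subset pairs weakly correlated variables, which is exactly why the determinant (entropy) criterion spreads selected observation sites apart; a GA evaluates this same objective on candidate subsets when enumeration is infeasible.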
Abstract: This paper describes the study of cryptographic hash functions, one of the most important classes of primitives used in modern cryptography. We present different approaches to defining security properties more formally and present basic attacks on hash functions; we also recall the Merkle-Damgård security properties of iterated hash functions. The main aim of this paper is the development of recent techniques applicable to the cryptanalysis of hash functions, mainly from the SHA family. Recently proposed attacks on MD5 and SHA motivate a new hash function design, intended not only to offer higher security but also to be faster than SHA-256: the performance of the new hash function is at least 30% better than that of SHA-256 in software, and it is secure against known cryptographic attacks on hash functions.
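The Merkle-Damgård construction mentioned above iterates a fixed-size compression function over padded message blocks starting from an IV. The sketch below is purely didactic: the compression function is deliberately weak and the 32-bit state, block size and IV are illustrative assumptions, not any real hash:

```python
# Toy Merkle-Damgard construction (didactic, NOT a secure hash).
BLOCK = 8  # bytes per message block

def compress(state, block):
    # weak stand-in compression function
    for b in block:
        state = ((state * 31) ^ b) & 0xFFFFFFFF
    return state

def md_hash(msg: bytes, iv=0x6A09E667):
    # pad with 0x80, zeros, and an 8-byte big-endian message length
    msg = msg + b"\x80" + b"\x00" * ((-len(msg) - 9) % BLOCK) \
          + len(msg).to_bytes(8, "big")
    state = iv
    for i in range(0, len(msg), BLOCK):
        state = compress(state, msg[i:i + BLOCK])   # iterate over blocks
    return state

print(md_hash(b"abc") != md_hash(b"abd"))
```

The length padding (Merkle-Damgård strengthening) is what lifts collision resistance of the compression function to the full hash; attacks like those on MD5 work by breaking the compression function itself.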