Abstract: The classic problem of recovering arbitrary values of
a band-limited signal from its samples has an added complication
in software radio applications; namely, the resampling calculations
inevitably fold aliases of the analog signal back into the original
bandwidth. The phenomenon is quantified by the spur-free dynamic range (SFDR). We demonstrate how a novel application of the Remez (Parks-McClellan) algorithm permits optimal signal recovery and SFDR, far surpassing state-of-the-art resamplers.
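As a rough illustration of the filter-design step (not the authors' implementation), an equiripple anti-alias filter for a resampler can be designed with the Parks-McClellan (Remez) algorithm via scipy.signal.remez; the sample rate, band edges and tap count below are assumed values, and the worst-case stopband level serves as a proxy for the achievable SFDR.

```python
import numpy as np
from scipy import signal

# Assumed, illustrative parameters (not from the paper)
fs = 48_000        # sample rate, Hz
passband = 20_000  # band to preserve, Hz
stopband = 22_000  # aliases folded into the signal band must fall below here
numtaps = 129      # more taps -> deeper stopband -> higher SFDR

# Parks-McClellan (Remez) equiripple design: the uniform stopband ripple
# directly bounds the level of folded aliases, i.e. the spurs.
taps = signal.remez(numtaps, [0, passband, stopband, fs / 2], [1, 0], fs=fs)

# Worst-case spur level relative to the passband (a proxy for SFDR, in dB).
w, h = signal.freqz(taps, worN=8192, fs=fs)
spur = 20 * np.log10(np.abs(h[w >= stopband]).max())
print("worst-case spur level: %.1f dB" % spur)
```

Raising numtaps deepens the equiripple stopband and thus raises the SFDR bound, at the cost of more computation per output sample.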
Abstract: When programming in languages such as C or Java, it is difficult to reconstruct the programmer's ideas from the program code alone. This is mainly because many of the ideas behind the implementation are never recorded in the code: physical aspects of the computation such as spatial structures, activities, and the meaning of variables are not required as instructions to the computer and are often omitted, making future reconstruction of the original ideas difficult. AIDA, a multimedia programming language based on the cyberFilm model, addresses these problems by allowing developers to describe the ideas behind their programs using advanced annotation methods as a natural extension of programming. In this paper, a development environment that implements the AIDA language is presented, with a focus on the annotation methods. In particular, an actual scientific numerical computation code is created and the effects of the annotation methods are analyzed.
Abstract: Fatigue tests of specimens with numerous holes are presented. The tests were run until fatigue cracks had formed on both sides of a hole; crack extension was then arrested by pressing plastic deformation into the mouth of the detected crack. It is shown that the moments at which cracks appear at the holes are stochastically dependent, with both positive and negative correlation relations. The positive correlation forms transverse to the applied force, while the negative one forms along it, and the negative relationship extends over a greater distance. A mathematical model of the formation of the dependence area is presented, together with estimates of the model parameters. Positively correlated initiation of fatigue cracks can be regarded as the extension of a single main crack. Under negative correlation, the first crack fixes the place of its origin, leading to the appearance of multiple cracks that do not merge with one another.
Abstract: In order to evaluate the relationships between sulphur (S), glucose (G), nitrogen (N) and plant residues (wheat straw, ST), sulphur immobilization and microbial transformation were monitored in five soil samples taken from 0-30 cm depth in farmers' fields of Bastam, in the Shahrood area. Eleven treatments with different levels of S, G, N and plant residues were arranged in a randomized block design with three replications and incubated for 20, 45 and 60 days. The immobilization of SO4²⁻-S, expressed as a percentage of that added, was inversely related to its addition rate. Immobilization increased with additions of glucose and plant residues, in line with the C-to-S ratio of the added amendments, irrespective of their origin (glucose or plant residues). In the presence of a C source (glucose or plant residues), N significantly increased the immobilization of SO4²⁻-S, whilst the effect of N was insignificant in the absence of a C amendment. In the first few days, the amounts of added SO4²⁻-S immobilized were linearly correlated with the amounts of added S recovered in the soil microbial biomass. With further incubation, the proportion of immobilized SO4²⁻-S remaining as biomass-S decreased; this decrease is thought to reflect the conversion of biomass-S into soil organic-S. Glucose addition increased the immobilization (microbial utilization and incorporation into the soil organic matter) of native soil SO4²⁻-S. However, N addition enhanced the mineralization of soil organic-S, increasing the concentration of SO4²⁻-S in the soil.
Abstract: Applying knowledge discovery techniques to unstructured text is termed knowledge discovery in text (KDT), text data mining, or text mining. In neural networks that address classification problems, the training set, the testing set and the learning rate are key elements: the collection of input/output patterns used to train the network, the patterns used to assess network performance, and the rate at which weight adjustments are made. This paper describes a proposed back-propagation neural net classifier that performs cross-validation on the original neural network in order to optimize classification accuracy and reduce training time. The feasibility and benefits of the proposed approach are demonstrated on five data sets: contact-lenses, cpu, weather-symbolic, weather and labor-neg-data. It is shown that, compared to the existing neural network, training time is reduced by a factor of more than 10 when the data set is larger than cpu or the network has many hidden units, while accuracy ('percent correct') was the same for all data sets except contact-lenses, the only one with missing attributes. For contact-lenses, the accuracy of the proposed neural network was on average around 0.3% lower than that of the original neural network. The algorithm is independent of specific data sets, so many of its ideas and solutions can be transferred to other classifier paradigms.
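A generic sketch of the described setup (not the paper's classifier): a back-propagation network wrapped in k-fold cross-validation using scikit-learn, with a placeholder data set standing in for contact-lenses, cpu, etc.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Placeholder data set standing in for contact-lenses, cpu, weather, ...
X, y = load_iris(return_X_y=True)

# A small back-propagation network: one hidden layer, fixed learning rate.
net = MLPClassifier(hidden_layer_sizes=(10,), learning_rate_init=0.01,
                    max_iter=1000, random_state=0)

# 5-fold cross-validation: each fold serves once as the testing set while
# the remaining folds form the training set.
scores = cross_val_score(net, X, y, cv=5)
print("percent correct per fold:", np.round(100 * scores, 1))
```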
Abstract: We are concerned with a class of quadratic matrix
equations arising from the overdamped mass-spring system. By
exploiting the structure of the coefficient matrices, we propose a fast cyclic reduction algorithm to compute the extreme solutions of the
equation. Numerical experiments show that the proposed algorithm
outperforms the original cyclic reduction and the structure-preserving
doubling algorithm.
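For reference, the plain (unstructured) cyclic reduction baseline that the paper accelerates can be sketched as follows for AX² + BX + C = 0; the overdamped mass-spring coefficients below are assumed illustrative values, and the residual printout checks that an extreme solvent has been found.

```python
import numpy as np

def cr_solvent(A, B, C, tol=1e-12, max_iter=50):
    """One extreme solvent of A X^2 + B X + C = 0 by plain cyclic reduction."""
    Ak, Bk, Ck, Bhat = A.copy(), B.copy(), C.copy(), B.copy()
    for _ in range(max_iter):
        W = np.linalg.inv(Bk)
        AWC, CWA = Ak @ W @ Ck, Ck @ W @ Ak
        Ak, Ck = -Ak @ W @ Ak, -Ck @ W @ Ck   # decay quadratically to zero
        Bhat -= AWC
        Bk -= AWC + CWA
        if np.linalg.norm(Ak) * np.linalg.norm(Ck) < tol:
            break
    return -np.linalg.solve(Bhat, C)          # X ~= -Bhat^{-1} C

# Assumed overdamped mass-spring coefficients (illustration only):
n = 8
M = np.eye(n)                                                  # mass
K = 5.0 * (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1))   # stiffness
D = 10 * np.eye(n) + K                                         # heavy damping
X = cr_solvent(M, D, K)
print("residual norm:", np.linalg.norm(M @ X @ X + D @ X + K))
```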
Abstract: This paper presents an application of an improved QFD method for determining the specifications of a kitchen utensil rack. The improved method reduces the subjectivity of the original QFD, particularly in defining the relationships between customer requirements and engineering characteristics. The regression analysis used to obtain the relationship functions between customer requirements and engineering characteristics also accommodates the inaccuracy of the competitive assessment results. The improved method, expressed in the form of a mathematical model, serves as formal guidance for allocating resources to improve the specifications of the kitchen utensil rack. The specifications obtained led to the highest feasible customer satisfaction.
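Purely as an illustration of the regression step (the paper's actual model and data are not reproduced here), fitting a relationship function between one engineering characteristic and one customer requirement could look like this, with clearly hypothetical numbers:

```python
import numpy as np

# Hypothetical competitive-assessment data (placeholders, not from the paper):
# x = an engineering characteristic (e.g. rack load capacity, kg)
# y = customer-satisfaction score for one requirement (1-5 scale)
x = np.array([4.0, 6.0, 8.0, 10.0, 12.0])
y = np.array([2.1, 2.9, 3.6, 4.2, 4.5])

# Least-squares regression yields the relationship function y = f(x),
# replacing the subjective 1/3/9 relationship weights of classical QFD.
slope, intercept = np.polyfit(x, y, 1)
print("satisfaction ~= %.3f * capacity + %.3f" % (slope, intercept))
```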
Abstract: The purpose of this research is to compare original intra-oral digital dental radiograph images with images enhanced using a combination of image processing algorithms. Intra-oral digital dental radiographs are often noisy, with blurred edges and low contrast, so a combination of sharpening and enhancement methods is used to overcome these problems. Three compound algorithms are proposed: Sharp Adaptive Histogram Equalization (SAHE), Sharp Median Adaptive Histogram Equalization (SMAHE) and Sharp Contrast Limited Adaptive Histogram Equalization (SCLAHE). This paper presents an initial study of the perception of six dentists regarding the details of abnormal pathologies and the improvement of image quality in ten intra-oral radiographs. The research focuses on the detection of only three types of pathology: periapical radiolucency, widened periodontal ligament space and loss of lamina dura. The overall result shows that SCLAHE slightly improves the appearance of dental abnormalities over the original image and also outperforms the other two proposed compound algorithms.
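A minimal sketch of a sharpen-then-CLAHE pipeline in the spirit of SCLAHE, using OpenCV; the kernel size, weights and clip limit are assumptions, not the paper's settings (for SMAHE one would insert a cv2.medianBlur step before equalization):

```python
import cv2

def sclahe(gray, clip=2.0, tile=(8, 8)):
    """Sharpen, then apply CLAHE (illustrative parameters, not the paper's)."""
    # Unsharp masking: add back the difference from a Gaussian blur.
    blur = cv2.GaussianBlur(gray, (5, 5), 0)
    sharp = cv2.addWeighted(gray, 1.5, blur, -0.5, 0)
    # Contrast Limited Adaptive Histogram Equalization on the sharpened image.
    clahe = cv2.createCLAHE(clipLimit=clip, tileGridSize=tile)
    return clahe.apply(sharp)

# Usage: enhanced = sclahe(cv2.imread("radiograph.png", cv2.IMREAD_GRAYSCALE))
```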
Abstract: Zeolite A and MCM-41 have extensive applications in basic science, petrochemical science, energy conservation/storage, medicine, chemical sensing, air purification, environmentally benign composite structures and waste remediation. However, the use of zeolite A and MCM-41 in these areas, especially environmental remediation, is restricted by prohibitive production costs. Efficient recycling of, and resource recovery from, coal fly ash has been a major topic of current international research interest, aimed at achieving sustainable development of human society from the viewpoints of energy, economy and environmental strategy. This project reports an original, novel, green and fast method to produce nano-porous zeolite A and MCM-41 materials from coal fly ash. For zeolite A, the novel production method halves the total production time while maintaining a high degree of crystallinity and a narrower particle size distribution. For MCM-41, the remarkably green approach, being an environmentally friendly process that reduces the generation of toxic waste, can produce pure and long-range-ordered MCM-41 materials from coal fly ash. The approach took 24 h at 25 °C to produce 9 g of MCM-41 materials from 30 g of coal fly ash, the shortest time and lowest reaction temperature reported in the literature for producing pure and ordered MCM-41 materials (having the largest internal surface area). Performance evaluation of the produced zeolite A and MCM-41 materials in wastewater treatment and air pollution control is reported. The residual fly ash was also converted to zeolite Na-P1, which showed good performance in the removal of multiple metal ions from wastewater. In wastewater treatment, compared to commercial-grade zeolite A, adsorbents produced from coal fly ash were effective in removing multiple heavy metal ions from water and could be an alternative material for wastewater treatment. In methane emission abatement, the zeolite A produced from coal fly ash achieved methane removal efficiency similar to that of zeolite A prepared from pure chemicals. This report provides guidance for the cost-effective production of zeolite A and MCM-41 from coal fly ash, opening potential applications of these materials in the environmental industry. Finally, environmental and economic aspects of the production of zeolite A and MCM-41 from coal fly ash are discussed.
Abstract: Tacit knowledge has been one of the most discussed
and contradictory concepts in the field of knowledge management
since the mid 1990s. The concept is used relatively vaguely to refer
to any type of information that is difficult to articulate, which has led
to discussions about the original meaning of the concept (adopted
from Polanyi's philosophy) and the nature of tacit knowing. It is
proposed that the subject should be approached from the perspective
of cognitive science in order to connect tacit knowledge to
empirically studied cognitive phenomena. Some of the most
important examples of tacit knowing presented by Polanyi are
analyzed in order to trace the cognitive mechanisms of tacit knowing
and to promote better understanding of the nature of tacit knowledge.
The cognitive approach to Polanyi's theory reveals that the tacit/explicit typology of knowledge often presented in the knowledge management literature is not only artificial but directly opposed to Polanyi's thinking.
Abstract: Echocardiography is one of the most common diagnostic imaging tests widely used for assessing abnormalities of regional heart ventricle function. The main goal of the image enhancement task in 2D echocardiography (2DE) is to address two major problems: speckle noise and low quality. Speckle noise reduction is therefore an important pre-processing step for reducing distortion effects in 2DE image segmentation. In this paper, we present the common filters based on low-pass spatial smoothing, such as the Mean, Gaussian and Median filters; the Laplacian filter was used as a high-pass sharpening filter. A comparative analysis tests the effectiveness of these filters after application to original 2DE images of 4-chamber and 2-chamber views. Three statistical quantity measures, root mean square error (RMSE), peak signal-to-noise ratio (PSNR) and signal-to-noise ratio (SNR), are used to evaluate filter performance quantitatively on the output enhanced image.
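For concreteness, the smoothing filters and two of the quantitative measures can be reproduced with NumPy/SciPy as below (a generic sketch; window sizes and the placeholder image are assumptions):

```python
import numpy as np
from scipy import ndimage

def rmse(ref, out):
    return np.sqrt(np.mean((ref.astype(float) - out.astype(float)) ** 2))

def psnr(ref, out, peak=255.0):
    return 20 * np.log10(peak / rmse(ref, out))

# img: a 2DE frame as a 2-D uint8 array (loading omitted; placeholder here).
img = np.random.randint(0, 256, (128, 128)).astype(np.uint8)
mean_f   = ndimage.uniform_filter(img, size=3)    # Mean (low-pass)
gauss_f  = ndimage.gaussian_filter(img, sigma=1)  # Gaussian (low-pass)
median_f = ndimage.median_filter(img, size=3)     # Median (edge-preserving)
lap_f    = ndimage.laplace(img.astype(float))     # Laplacian (high-pass)

for name, out in [("mean", mean_f), ("gauss", gauss_f), ("median", median_f)]:
    print(name, "RMSE=%.2f  PSNR=%.2f dB" % (rmse(img, out), psnr(img, out)))
```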
Abstract: Wireless Sensor Networks (WSNs) are used to monitor vast, inaccessible regions through the deployment of a large number of sensor nodes in the sensing area. For the majority of WSN applications, the collected data must be combined with the geographic information of its origin to be useful: information received from remote Sensor Nodes (SNs) that are several hops away from the base station/sink is meaningless without knowledge of its source. In addition, the location information of SNs can be used to develop new network protocols that improve WSN energy efficiency and lifetime. In this paper, range-free localization protocols for WSNs are proposed. The proposed protocols are based on the weighted centroid localization technique, where the edge weights of SNs are decided by applying fuzzy logic inference to the received signal strength and link quality between nodes. The fuzzification is carried out using (i) Mamdani, (ii) Sugeno, and (iii) combined Mamdani-Sugeno fuzzy logic inference. Simulation results demonstrate that the proposed protocols localize nodes more accurately than conventional centroid-based localization protocols, despite the presence of unintentional interference from radio frequency (RF) sources operating in the same frequency band.
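The weighted-centroid step itself reduces to a short formula; the sketch below uses a crude RSSI-based weighting as a stand-in for the Mamdani/Sugeno fuzzy inference described in the paper, and the anchor coordinates and readings are placeholders.

```python
import numpy as np

def weighted_centroid(anchors, weights):
    """Estimate a node position as the weight-normalized mean of anchors.

    anchors: (k, 2) array of known anchor coordinates
    weights: (k,) edge weights; in the paper these come from fuzzy inference
             over RSSI and link quality (here: a plain RSSI proxy).
    """
    w = np.asarray(weights, dtype=float)
    return (w[:, None] * anchors).sum(axis=0) / w.sum()

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
rssi = np.array([-40.0, -60.0, -55.0, -70.0])   # placeholder readings, dBm
weights = 1.0 / np.abs(rssi)                    # crude stand-in weighting
print("estimated position:", weighted_centroid(anchors, weights))
```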
Abstract: This paper reports work done to improve the modeling of complex processes when only small experimental data sets are available. Neural networks are used to capture the nonlinear underlying phenomena contained in the data set and to partly eliminate the burden of having to specify completely the structure of the model. Two different types of neural networks were applied to the pulping of sugar maple problem. Three-layer feed-forward neural networks, trained with Preconditioned Conjugate Gradient (PCG) methods, were used in this investigation. Preconditioning improves convergence by lowering the condition number and increasing the clustering of the eigenvalues; the idea is to solve the modified problem M⁻¹Ax = M⁻¹b, where M is a positive-definite preconditioner closely related to A. We focused mainly on PCG-based training methods that originated in optimization theory, namely Preconditioned Conjugate Gradient with Fletcher-Reeves update (PCGF), with Polak-Ribiere update (PCGP), and with Powell-Beale restarts (PCGB). In the simulations, the behavior of the PCG methods proved robust against phenomena such as oscillation due to large step sizes.
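For reference, a generic preconditioned conjugate gradient solver in the standard form M⁻¹Ax = M⁻¹b (not the paper's PCGF/PCGP/PCGB training variants) can be sketched as follows, here with a simple Jacobi preconditioner:

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=200):
    """CG on M^{-1} A x = M^{-1} b, with M given by its inverse action."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)                 # preconditioned residual
    p = z.copy()
    for _ in range(max_iter):
        Ap = A @ p
        alpha = (r @ z) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        z_new = M_inv(r_new)
        beta = (r_new @ z_new) / (r @ z)   # standard PCG direction update
        p = z_new + beta * p
        r, z = r_new, z_new
    return x

# Jacobi (diagonal) preconditioner on a random SPD system: M = diag(A) is
# positive definite and "close to A", lowering the condition number.
n = 50
Q = np.random.rand(n, n)
A = Q @ Q.T + n * np.eye(n)
b = np.random.rand(n)
x = pcg(A, b, M_inv=lambda r: r / np.diag(A))
print("residual:", np.linalg.norm(A @ x - b))
```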
Abstract: Robust face recognition under various illumination
environments is very difficult and needs to be accomplished for
successful commercialization. In this paper, we propose an improved
illumination normalization method for face recognition. Illumination
normalization based on anisotropic smoothing is known to be among the most effective illumination normalization methods, but it deteriorates the intensity contrast of the original image and blurs edges. The method proposed in this paper improves the previous anisotropic smoothing-based illumination normalization method so that it increases intensity contrast and enhances edges while diminishing the effect of illumination variations. As a result of these improvements, face images preprocessed by the proposed illumination normalization method have more distinctive Gabor feature vectors for face recognition. Through
experiments of face recognition based on Gabor feature vector
similarity, the effectiveness of the proposed illumination
normalization method is verified.
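As background only, a generic anisotropic (edge-stopping) smoothing of the Perona-Malik type, followed by a reflectance-style division, is sketched below; this is a common baseline for such normalization, not the authors' improved algorithm, and all parameters are assumptions.

```python
import numpy as np

def anisotropic_smooth(img, n_iter=20, kappa=20.0, lam=0.2):
    """Perona-Malik-style edge-stopping diffusion (generic sketch)."""
    u = img.astype(float)
    g = lambda d: np.exp(-(d / kappa) ** 2)   # conduction: small across edges
    for _ in range(n_iter):
        # differences toward the four neighbors (wrap-around borders kept
        # for brevity; a real implementation would replicate the border)
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u,  1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u,  1, axis=1) - u
        u = u + lam * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

def normalize_illumination(img, eps=1e-6):
    """Reflectance-style normalization: image over its smoothed luminance."""
    return img.astype(float) / (anisotropic_smooth(img) + eps)
```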
Abstract: With the advent of digital cinema and digital
broadcasting, copyright protection of video data has been one of the
most important issues.
We present a novel method of watermarking for video image data based on hardware and digital wavelet transform techniques, and name it “traceable watermarking” because the watermarked data is constructed before the transmission process and traced after it has been received by an authorized user.
In our method, we embed the watermark into the lowest part of each image frame of the decoded video using a hardware LSI. Digital cinema is an important application for traceable watermarking, since a digital cinema system makes use of watermarking technology during content encoding, encryption, transmission, decoding and all intermediate processes. The watermark is embedded into randomly selected movie frames using hash functions.
The embedded watermark information can be extracted from the decoded video data without access to the original movie data. Our experimental results show that the proposed traceable watermarking method for digital cinema systems is much better than conventional watermarking techniques in terms of robustness, image quality, speed and simplicity of structure.
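The hash-driven frame selection can be illustrated in a few lines of software (the paper embeds via a hardware LSI; the key and frame counts here are assumptions):

```python
import hashlib

def select_frames(key: bytes, n_frames: int, n_select: int):
    """Pseudo-randomly pick distinct watermark frames via a keyed hash chain."""
    chosen, counter = set(), 0
    while len(chosen) < n_select:
        digest = hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        chosen.add(int.from_bytes(digest[:4], "big") % n_frames)
        counter += 1
    return sorted(chosen)

# e.g. a two-hour movie at 24 fps; key and counts are assumed values
print(select_frames(b"secret-key", n_frames=172_800, n_select=5))
```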
Abstract: The network traffic data provided for the design of intrusion detection systems are typically large, contain much ineffective information, and offer only limited and ambiguous information about users' activities. We study these problems and propose a two-phase approach in our intrusion detection design. In the first phase, we develop a correlation-based feature selection algorithm to remove the worthless information from the original high-dimensional database. Next, we design an intrusion detection method to address the uncertainty caused by limited and ambiguous information. In the experiments, we use six UCI databases and the DARPA KDD99 intrusion detection data set as our evaluation tools. Empirical studies indicate that our feature selection algorithm is capable of reducing the size of the data set, and that our intrusion detection method achieves better performance than that of the participating intrusion detectors.
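A minimal version of the first phase might rank features by correlation with the class label and drop features that are highly correlated with ones already kept; this is a generic correlation-based selection sketch with assumed thresholds, not the paper's exact algorithm.

```python
import numpy as np

def correlation_feature_select(X, y, keep_thresh=0.1, redundancy_thresh=0.9):
    """Keep features correlated with the label, drop mutually redundant ones."""
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                          for j in range(X.shape[1])])
    selected = []
    for j in np.argsort(-relevance):        # strongest features first
        if relevance[j] < keep_thresh:
            break                            # remaining features are noise
        if all(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) < redundancy_thresh
               for k in selected):
            selected.append(j)
    return selected

# Usage: keep = correlation_feature_select(X_train, y_train)
```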
Abstract: Given that entrepreneurship is a very significant factor in regional development, it is necessary to approach development systematically with regional policy measures. According to the international Nomenclature of Territorial Units for Statistics (NUTS II) classification, there are three regions in Croatia. The indicators of entrepreneurial activity at the national level of Croatia are analyzed in the paper, taking into consideration the results of referent research. The level of regional development is shown based on an analysis of entrepreneurs' operations. The results of the analysis show a very unfavorable situation in entrepreneurial activity at the national level in Croatia. The origin of this situation is to be found in surroundings with pronounced inequality of regional development, caused by the absence of a strategically directed regional policy. In this paper, recommendations that could contribute to the reduction of regional inequality in Croatia are made.
Abstract: A proxy signature allows a proxy signer to sign messages on behalf of an original signer. It is very useful when the original signer (e.g. the president of a company) is not available to sign a specific document. If the original signer cannot forge valid proxy signatures by impersonating the proxy signer, the scheme is robust in a virtual environment; the original signer then cannot shift any illegal action she initiates onto the proxy signer. In this paper, we propose a new proxy signature scheme that prevents the original signer from impersonating the proxy signer to sign messages. The proposed scheme is based on the regular ElGamal signature. In addition, the fair privacy of the proxy signer is maintained: the privacy of the proxy signer is preserved, yet it can be revealed when necessary.
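For background, the plain ElGamal signature on which the scheme is based works as sketched below; the parameters are toy values (far too small for real use) and the proxy-delegation layer of the paper is omitted.

```python
import hashlib, secrets
from math import gcd

p, g = 10007, 5                    # toy prime and base (insecure, illustrative)
x = secrets.randbelow(p - 2) + 1   # private key
y = pow(g, x, p)                   # public key

def H(m: bytes) -> int:
    return int.from_bytes(hashlib.sha256(m).digest(), "big") % (p - 1)

def sign(m: bytes):
    while True:
        k = secrets.randbelow(p - 2) + 1   # per-message nonce, gcd(k, p-1) = 1
        if gcd(k, p - 1) == 1:
            break
    r = pow(g, k, p)
    s = (pow(k, -1, p - 1) * (H(m) - x * r)) % (p - 1)
    return r, s

def verify(m: bytes, r: int, s: int) -> bool:
    # g^H(m) == y^r * r^s (mod p), since x*r + k*s == H(m) (mod p-1)
    return 0 < r < p and pow(g, H(m), p) == (pow(y, r, p) * pow(r, s, p)) % p

r, s = sign(b"document")
print(verify(b"document", r, s))
```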
Abstract: Leo Breiman's Random Forests (RF) algorithm is a recent development in tree-based classifiers and has quickly proven to be one of the most important algorithms in the machine learning literature. It has shown robust and improved classification results on standard data sets. Ensemble learning algorithms such as AdaBoost and Bagging have been in active research and have shown improved classification results on several benchmark data sets, mainly with decision trees as their base classifiers. In this paper we apply these meta-learning techniques to random forests, examining the behavior of ensembles of random forests on standard data sets from the UCI repository. We compare the original random forest algorithm with its ensemble counterparts and discuss the results.
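The experimental setup can be approximated in scikit-learn as follows (a schematic stand-in; the paper's UCI data sets and hyperparameters are not reproduced):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)   # stand-in for the UCI sets
rf = RandomForestClassifier(n_estimators=50, random_state=0)

# Random forests as base classifiers inside Bagging and AdaBoost ensembles.
models = {
    "random forest": rf,
    "bagged RF":     BaggingClassifier(rf, n_estimators=10, random_state=0),
    "boosted RF":    AdaBoostClassifier(rf, n_estimators=10, random_state=0),
}
for name, model in models.items():
    print(name, cross_val_score(model, X, y, cv=5).mean())
```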
Abstract: The original idea for a feature film may come from a writer, a director or a producer. The director is the person responsible for the creative aspects, both interpretive and technical, of a motion picture production. The director may be shown discussing the project with co-writers, members of the production staff and the producer, or selecting locales and constructing sets. All these activities provide, of course, ways of externalizing the director's ideas about the film. A director sometimes pushes both the film image and the techniques of narration to new artistic limits, but the director's main responsibility is to lead the spectator to an original opinion through his or her philosophical approach. The director tries to find an artistic angle in every scene, turns the screenplay into an effective story, and sets the film on a spiritual and philosophical base.