Abstract: Limited infrastructure development on peat and
organic soils is a serious geotechnical issue common to many
countries of the world, especially Malaysia, where about 1.5 million
ha of these problematic soils are distributed. These soils have high water
and organic contents, exhibit distinct mechanical properties, and
may also change chemically and biologically with time. Constructing
structures on peaty ground involves the risk of ground failure and
extreme settlement. Nowadays, much effort must be devoted to
making peatlands usable for construction because of increasing land use.
The deep mixing method, employing cement as a binder, is generally used
as a measure against peaty/organic ground failure problems; the
technique is widely adopted because it can improve the ground
considerably in a short period of time. An understanding of
geotechnical properties such as the shear strength, stiffness and compressibility
behavior of these soils is required before construction on
them can proceed. Therefore, 1-1.5 m peat soil samples from the state of Johor and
an organic soil from Melaka, Malaysia, were investigated. Cement
was added to the soils in the pre-mixing stage at water-cement ratios
in the range 3.5, 7, 14 and 140 for the peats and 5, 10 and 30 for the organic soils,
essentially to modify the original soil textures and properties. The
mixtures, in slurry form, were poured into polyvinyl chloride (PVC)
tubes and cured at a room temperature of 25°C for 7, 14 and 28 days.
Laboratory experiments were conducted, including unconfined
compressive strength and bender element tests, to monitor the improved
strength and stiffness of the stabilised mixed soils. In addition,
scanning electron microscopy (SEM) observations were made to
investigate changes in the microstructure of the stabilised soils and to
evaluate the hardening effect of the cement-stabilised peat and organic
soils. This preliminary effort indicates that pre-mixing peat and
organic soils with cement contributes to a gain in soil strength and helps
engineers establish a new method for improving such problematic ground
in further practical and long-term applications.
Abstract: A new analysis of perceptual speech enhancement is
presented. It focuses on the fact that if only noise above the masking
threshold is filtered, then noise below the masking threshold, but
above the absolute threshold of hearing, can become audible after the
masker filtering. This particular drawback of some perceptual filters,
hereafter called the maskee-to-audible-noise (MAN) phenomenon,
favours the emergence of isolated tonal components that increase musical noise.
Two filtering techniques that avoid or correct the MAN phenomenon
are proposed to effectively suppress background noise without introducing
much distortion. Experimental results, including objective
and subjective measurements, show that these techniques improve
the enhanced speech quality and the gain they bring emphasizes the
importance of the MAN phenomenon.
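For illustration, a minimal sketch of the condition under which the MAN phenomenon can arise; the per-bin arrays of noise power, masking threshold and absolute threshold of hearing are assumed inputs, not quantities defined in the paper:

```python
import numpy as np

def man_bins(noise_psd, masking_threshold, absolute_threshold):
    """Flag frequency bins where maskee-to-audible noise (MAN) can occur:
    noise below the masking threshold (left untouched by a classical
    perceptual filter) but above the absolute threshold of hearing,
    so it may become audible once the masker is attenuated."""
    return (noise_psd < masking_threshold) & (noise_psd > absolute_threshold)
```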
Abstract: This paper proposes a zero-voltage transition (ZVT) PWM synchronous buck converter designed to operate at the low output voltage and high efficiency typically required for portable systems. To make the DC-DC converter efficient at lower voltages, a synchronous converter is an obvious choice because of its lower conduction loss compared with a diode. The losses in the high-side MOSFET are dominated by switching losses, and these are eliminated by the soft-switching technique. Additionally, the resonant auxiliary circuit designed here is also devoid of switching losses. The suggested procedure ensures an efficient converter. Theoretical analysis, computer simulation, and experimental results are presented to explain the proposed schemes.
Abstract: The fault-proneness of a software module is the
probability that the module contains faults. To predict the fault-proneness
of modules, different techniques have been proposed, including
statistical methods, machine learning techniques, neural
network techniques and clustering techniques. The aim of the proposed
study is to explore whether metrics available in the early lifecycle
(i.e. requirement metrics), metrics available in the late lifecycle (i.e.
code metrics), and metrics available in the early lifecycle (i.e.
requirement metrics) combined with metrics available in the late
lifecycle (i.e. code metrics) can be used to identify fault-prone
modules using a Genetic Algorithm technique. This approach has been
tested on real defect datasets from NASA software projects written in
the C programming language. The results show that the fusion of
requirement and code metrics is the best prediction model for
detecting faults, compared with the commonly used code-based
model.
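As a rough illustration of the search procedure (not the authors' exact setup), a minimal genetic algorithm sketch that evolves boolean masks over a metric set; the `fitness` callback is a hypothetical placeholder that would score a fault-proneness classifier built on the selected metrics:

```python
import numpy as np

rng = np.random.default_rng(1)

def ga_select(n_metrics, fitness, pop=30, gens=50, p_mut=0.05):
    """Evolve boolean masks over the metric set; fitness(mask) is a
    hypothetical callback scoring a fault-proneness classifier that
    uses only the selected metrics (e.g. cross-validated accuracy)."""
    P = rng.random((pop, n_metrics)) < 0.5
    for _ in range(gens):
        scores = np.array([fitness(m) for m in P])
        elite = P[np.argsort(scores)[::-1][: pop // 2]]    # truncation selection
        cut = rng.integers(1, n_metrics, size=pop // 2)
        kids = np.where(np.arange(n_metrics) < cut[:, None],
                        elite, elite[::-1])                # one-point crossover
        kids = kids ^ (rng.random(kids.shape) < p_mut)     # bit-flip mutation
        P = np.vstack([elite, kids])
    scores = np.array([fitness(m) for m in P])
    return P[np.argmax(scores)]
```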
Abstract: Data Warehousing tools have become very popular and currently many of them have moved to Web-based user interfaces to make it easier to access and use the tools. The next step is to enable these tools to be used within a portal framework. The portal framework consists of pages having several small windows that contain individual data warehouse query results. There are several issues that need to be considered when designing the architecture for a portal enabled data warehouse query tool. Some issues need special techniques that can overcome the limitations that are imposed by the nature of data warehouse queries. Issues such as single sign-on, query result caching and sharing, customization, scheduling and authorization need to be considered. This paper discusses such issues and suggests an architecture to support data warehouse queries within Web portal frameworks.
Abstract: Disposal of health-care waste (HCW) is considered
an important environmental problem, especially in large cities.
Multiple criteria decision making (MCDM) techniques are apt to deal
with quantitative and qualitative considerations of the health-care
waste management (HCWM) problems. This research proposes a
fuzzy multi-criteria group decision making approach with a multilevel
hierarchical structure including qualitative as well as
quantitative performance attributes for evaluating HCW disposal
alternatives for Istanbul. Using the entropy weighting method,
objective weights as well as subjective weights are taken into account
to determine the importance weighting of quantitative performance
attributes. The results obtained using the proposed methodology are
thoroughly analyzed.
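The entropy weighting step referred to above is standard; a minimal sketch, assuming a decision matrix of alternatives (rows) by quantitative attributes (columns):

```python
import numpy as np

def entropy_weights(X):
    """Objective attribute weights from Shannon entropy: attributes
    whose values vary more across alternatives carry more information
    and receive larger weights."""
    P = X / X.sum(axis=0)                        # column-wise proportions
    m = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logs).sum(axis=0) / np.log(m)      # entropy per attribute
    d = 1.0 - e                                  # degree of diversification
    return d / d.sum()
```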
Abstract: This paper addresses an efficient technique to embed and detect a digital fingerprint code. The orthogonal modulation method is a straightforward and widely used approach for digital fingerprinting but shows several limitations in computational cost and signal efficiency. The coded modulation method can solve these limitations in theory; however, it is difficult for it to perform well in practice if host signals are not available when tracing colluders, if other kinds of attacks are applied, or if the size of the fingerprint code becomes large. In this paper, we propose a hybrid modulation method in which the merits of the orthogonal and coded modulation methods are combined so that we can achieve low computational cost and high signal efficiency. To analyze the performance, we design a new fingerprint code based on GD-PBIBD theory and modulate this code into images with our method using spread-spectrum watermarking in the frequency domain. The results show that the proposed method can efficiently handle large fingerprint codes and trace colluders under averaging attacks.
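To make the baseline concrete, a toy sketch of orthogonal modulation (not the proposed hybrid method or the GD-PBIBD code): each user is assigned a pseudo-random, near-orthogonal spreading sequence, and non-blind tracing correlates the residual with every user's sequence. The linear scan over all users is exactly the computational cost the abstract criticizes:

```python
import numpy as np

rng = np.random.default_rng(0)

def embed(host, user_seq, alpha=0.1):
    # Orthogonal modulation: add the user's spreading sequence to the host.
    return host + alpha * user_seq

def trace(marked, host, user_seqs):
    # Non-blind detection: subtract the host, correlate the residual
    # with every user's sequence; the largest score names the colluder.
    residual = marked - host
    scores = user_seqs @ residual
    return int(np.argmax(scores))

n, users = 4096, 8
host = rng.normal(size=n)
seqs = rng.choice([-1.0, 1.0], size=(users, n))  # near-orthogonal for large n
marked = embed(host, seqs[3])
assert trace(marked, host, seqs) == 3
```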
Abstract: The increasing volume of information on the
Internet creates a growing need for new (semi-)automatic
methods for retrieving documents and ranking them according to
their relevance to the user query. In this paper, after a brief review
of ranking models, a new ontology-based approach for ranking
HTML documents is proposed and evaluated under various
circumstances. Our approach is a combination of conceptual,
statistical and linguistic methods. This combination preserves the
precision of ranking without losing speed. Our approach
exploits natural language processing techniques to extract
phrases and stem words. An ontology-based conceptual
method is then used to annotate documents and expand the query.
To expand a query, the spread activation algorithm is improved so
that the expansion can be done in various aspects. The annotated
documents and the expanded query are processed to compute
the relevance degree using statistical methods. The outstanding
features of our approach are (1) combining conceptual, statistical
and linguistic features of documents, (2) expanding the query with
its related concepts before comparing it to documents, (3) extracting
and using both words and phrases to compute the relevance degree, (4)
improving the spread activation algorithm to perform the expansion based
on a weighted combination of different conceptual relationships and
(5) allowing variable document vector dimensions. A ranking
system called ORank was developed to implement and test the
proposed model. The test results are presented at the end of the
paper.
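A minimal sketch of spread activation over a concept graph with weighted relationships; the graph structure, weights and parameters here are illustrative, not ORank's actual configuration:

```python
def spread_activation(graph, seeds, decay=0.5, threshold=0.1, max_hops=2):
    """graph: dict mapping a concept to [(neighbor, relation_weight), ...];
    seeds: dict of query concepts with initial activation. Activation
    propagates along weighted relationships, attenuated by decay and
    pruned below threshold."""
    activation = dict(seeds)
    frontier = dict(seeds)
    for _ in range(max_hops):
        nxt = {}
        for node, act in frontier.items():
            for nbr, w in graph.get(node, []):
                gain = act * w * decay
                if gain >= threshold:
                    nxt[nbr] = nxt.get(nbr, 0.0) + gain
        for nbr, gain in nxt.items():
            activation[nbr] = activation.get(nbr, 0.0) + gain
        frontier = nxt
    return activation
```

Seeding with the query's concepts and running a few hops yields a weighted expanded concept set of the kind used for query expansion.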
Abstract: In this work, we successfully extended the one-dimensional differential transform method (DTM), by presenting and proving some theorems, to solve nonlinear high-order multi-pantograph equations. This technique provides a sequence of functions which converges to the exact solution of the problem. Some examples are given to demonstrate the validity and applicability of the present method, and a comparison is made with existing results.
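For reference (standard definitions, not the paper's new theorems), the one-dimensional differential transform pair about $x_0 = 0$ and the general multi-pantograph form whose delayed terms $y(q_i x)$ motivate the extension:

```latex
Y(k) = \frac{1}{k!}\left[\frac{d^{k}y(x)}{dx^{k}}\right]_{x=0},
\qquad
y(x) = \sum_{k=0}^{\infty} Y(k)\,x^{k},
\qquad
y'(x) = f\bigl(x,\, y(x),\, y(q_{1}x),\, \dots,\, y(q_{m}x)\bigr),
\quad 0 < q_{i} < 1 .
```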
Abstract: Repeated observation of a given area over time yields
potential for many forms of change detection analysis. These
repeated observations are confounded in terms of radiometric
consistency due to changes in sensor calibration over time,
differences in illumination and observation angles, and variation in
atmospheric effects.
This paper demonstrates the applicability of an empirical relative
radiometric normalization method to a set of multitemporal cloudy
images acquired by the Resourcesat-1 LISS-III sensor. The objective of this
study is to detect and remove cloud cover and normalize the images
radiometrically. Cloud detection is achieved using the Average
Brightness Threshold (ABT) algorithm. The detected cloud is
removed and replaced with data from other images of the same
area. After cloud removal, the proposed normalization method is
applied to reduce the radiometric influence caused by non-surface
factors. This process identifies landscape elements whose reflectance
values are nearly constant over time; i.e., the subset of non-changing
pixels is identified using a frequency-based correlation technique. The
quality of radiometric normalization is statistically assessed by the R²
value and mean square error (MSE) between each pair of analogous
bands.
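A minimal sketch of the per-band gain/offset normalization step, assuming the invariant (non-changing) pixel mask has already been found by the frequency-based correlation technique; array names are illustrative:

```python
import numpy as np

def relative_normalize(subject_band, reference_band, invariant_mask):
    """Fit a linear mapping on pseudo-invariant pixels, apply it to
    the whole subject band, and report R² and MSE on the fit."""
    x = subject_band[invariant_mask].astype(float)
    y = reference_band[invariant_mask].astype(float)
    gain, offset = np.polyfit(x, y, 1)
    normalized = gain * subject_band + offset
    pred = gain * x + offset                 # goodness of fit on invariants
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    mse = np.mean((y - pred) ** 2)
    return normalized, r2, mse
```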
Abstract: There are reports of gas and oil well fires caused by various accidents. Many different methods are used for firefighting in the gas and oil industry. Traditional fire extinguishing techniques face many problems; they are usually time consuming and require a great deal of equipment. Besides, they cause damage to facilities and create health and environmental problems. This article proposes an innovative approach to fire extinguishing techniques in the oil and gas industry, especially applicable to burning oil wells located offshore. Fire extinguishment employing a turbojet is a novel approach that can help extinguish the fire in a short period of time. Divergent and convergent turbojets modeled at laboratory scale, along with a high-pressure flame, were used. Different experiments were conducted to determine the relationship between the output discharges of the trumpet (nozzle) and the oil wells. The results were correlated, and the relationships between the dimensionless parameters of flame and fire extinguishment distances, and between the output discharge of the turbojet and the oil wells at specified distances, are presented as specific curves.
Abstract: Software and applications are subjected to serious and damaging security threats, and these threats are increasing as a result of the growing number of potential vulnerabilities. Security testing is an indispensable process for validating software security requirements and identifying security-related vulnerabilities. In this paper we analyze and compare different available vulnerability testing techniques based on predefined criteria using the analytic hierarchy process (AHP). We have selected five testing techniques: source code analysis, fault code injection, robustness testing, stress testing and penetration testing. These techniques have been evaluated against five criteria: cost, thoroughness, ease of use, effectiveness and efficiency. The outcome of the study helps researchers, testers and developers understand the effectiveness of each technique in its respective domain. The study also compares the inner workings of the testing techniques against the selected criteria to achieve optimum testing results.
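The AHP step itself is standard; a minimal sketch computing the priority vector of a 5x5 pairwise-comparison matrix via the principal eigenvector, together with Saaty's consistency ratio:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority vector from an AHP pairwise-comparison matrix:
    principal right eigenvector, normalized to sum to 1, plus the
    consistency ratio (random index 1.12 applies to 5x5 matrices)."""
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = pairwise.shape[0]
    ci = (vals.real[k] - n) / (n - 1)   # consistency index
    cr = ci / 1.12                      # Saaty's RI for n = 5
    return w, cr
```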
Abstract: The modeling of water transfer in the unsaturated zone
uses techniques and methods of soil physics to solve
Richards' equation. However, there is a mismatch between the scale
of the measurements provided by soil physics and the scale of the
fields in hydrological modeling problems, to which the
strong spatial variability of soil hydraulic properties adds further
difficulty. The objective of this work was to develop a methodology to estimate the
hydrodynamic parameters for modeling water transfers at different
hydrological scales in the soil-plant-atmosphere system.
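For reference, the 1-D vertical form of Richards' equation being solved, with volumetric water content $\theta$, pressure head $h$, hydraulic conductivity $K(h)$, and $z$ taken positive upward:

```latex
\frac{\partial \theta(h)}{\partial t}
  = \frac{\partial}{\partial z}
    \left[ K(h)\left( \frac{\partial h}{\partial z} + 1 \right) \right]
```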
Abstract: The inherent flexibility of XML in both structure
and semantics makes mining from XML data a complex task with
more challenges compared to traditional association rule mining in
relational databases. In this paper, we propose a new model for the
effective extraction of generalized association rules from an XML
document collection. We use frequent subtree mining
techniques directly in the discovery process and do not ignore the tree
structure of the data in the final rules. The frequent subtrees, found
under the user-provided support threshold, are split into complementary
subtrees to form the rules. We explain our model in multiple steps,
from data preparation to rule generation.
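A simplified sketch of the rule-generation step, with itemsets standing in for subtrees (the frequent subtree mining itself is omitted, and names and thresholds are illustrative):

```python
from itertools import combinations

def rules_from_frequent(frequent, min_conf):
    """frequent: dict {frozenset of tree nodes: support}. Split each
    frequent pattern into an antecedent and its complement to form
    rules, keeping those with confidence >= min_conf."""
    rules = []
    for pattern, supp in frequent.items():
        for r in range(1, len(pattern)):
            for ante in combinations(pattern, r):
                ante = frozenset(ante)
                if ante in frequent:
                    conf = supp / frequent[ante]
                    if conf >= min_conf:
                        rules.append((ante, pattern - ante, supp, conf))
    return rules
```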
Abstract: Supplier selection, in real situations, is affected by
several qualitative and quantitative factors and is one of the most
important activities of a purchasing department. Since decision makers
(DMs) do not have precise, exact and complete information when
evaluating suppliers against the criteria or factors, supplier
selection becomes more difficult. In this case, grey theory helps us
deal with this problem of uncertainty. Here, we apply the Technique
for Order Preference by Similarity to Ideal Solution (TOPSIS)
method to evaluate and select the best supplier using interval
fuzzy numbers. In this article, we compare TOPSIS with some
other approaches and then demonstrate that the concept of
TOPSIS is very important for ranking and selecting the right supplier.
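For orientation, a sketch of crisp TOPSIS; the paper works with interval (grey) numbers, so this shows only the core ranking logic under that simplification:

```python
import numpy as np

def topsis(X, w, benefit):
    """X: (alternatives x criteria) decision matrix, w: criterion
    weights, benefit: boolean mask (True = larger is better).
    Returns closeness to the ideal solution; higher ranks first."""
    R = X / np.linalg.norm(X, axis=0)            # vector normalization
    V = R * w                                    # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)
```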
Abstract: We address the problem of joint beamforming and multipath channel parameter estimation in Wideband Code Division Multiple Access (WCDMA) communication systems that employ Multiple-Access Interference (MAI) suppression techniques in the uplink (from mobile to base station). Most existing schemes rely on time-multiplexing a training sequence with the user data. In WCDMA, the channel parameters can also be estimated from a code-multiplexed common pilot channel (CPICH), which may be corrupted by strong interference, resulting in a bad estimate. In this paper, we present new methods that combine interference suppression with channel estimation when using multiple receiving antennas, by means of adaptive signal processing techniques. Computer simulation is used to compare the proposed methods with existing conventional estimation techniques.
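As one illustrative adaptive technique (a generic NLMS tracker, not the paper's specific combined scheme), a sketch that estimates multipath channel taps from a known pilot sequence:

```python
import numpy as np

def nlms_estimate(pilot, received, taps, mu=0.5, eps=1e-8):
    """Normalized LMS: adapt a tapped-delay-line channel estimate h
    so that h^H x tracks the received samples driven by the pilot."""
    h = np.zeros(taps, dtype=complex)
    buf = np.zeros(taps, dtype=complex)
    for x, d in zip(pilot, received):
        buf = np.roll(buf, 1); buf[0] = x                  # delay line
        e = d - h.conj() @ buf                             # a-priori error
        h += mu * e.conj() * buf / (eps + np.vdot(buf, buf).real)
    return h
```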
Abstract: Serial Analysis of Gene Expression (SAGE) is a powerful
quantification technique for generating cell or tissue gene expression
data. The gene expression profiles of cells or tissues in several
different states are difficult for biologists to analyze because of the large
number of genes typically involved. However, feature selection in
machine learning can successfully reduce this problem. The method
reduces the features (genes) in specific SAGE data and retains
only the relevant genes. In this study, we used a genetic
algorithm to implement feature selection, and evaluated the
classification accuracy of the selected features with the K-nearest
neighbor method. To validate the proposed method, we used
two SAGE data sets for testing. The results of this study show
that the number of features of the original SAGE data sets can be
significantly reduced while higher classification accuracy is
achieved.
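A minimal sketch of the fitness evaluation: leave-one-out K-nearest-neighbor accuracy on the genes selected by a GA chromosome (a boolean mask); the data shapes, integer class labels and k are illustrative:

```python
import numpy as np

def loo_knn_accuracy(X, y, mask, k=3):
    """GA fitness sketch: leave-one-out K-nearest-neighbor accuracy
    using only the genes selected by the boolean mask.
    X: (samples x genes), y: integer class labels."""
    Xs = X[:, mask]
    correct = 0
    for i in range(len(y)):
        d = np.linalg.norm(Xs - Xs[i], axis=1)
        d[i] = np.inf                      # exclude the left-out sample
        nn = np.argsort(d)[:k]
        votes = np.bincount(y[nn])
        correct += votes.argmax() == y[i]
    return correct / len(y)
```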
Abstract: This paper deals with dynamic load balancing using PVM. In a distributed environment, load balancing and heterogeneity are very critical issues that need to be examined in depth in order to achieve optimal results and efficiency. Various techniques are used to distribute the load dynamically among different nodes and to deal with heterogeneity. These techniques take different approaches, with process migration as the basic concept in various optimized flavors. However, process migration is not an easy job; it imposes a considerable burden and processing effort in order to track each process on the nodes. We propose a dynamic load balancing technique in which the application intelligently balances the load among different nodes, resulting in efficient use of the system with none of the overhead of process migration. It also provides a simple solution to the problem of load balancing in a heterogeneous environment.
Abstract: Among all mechanical joining processes, welding has
been employed for its advantages in design flexibility, cost savings,
reduced overall weight and enhanced structural performance.
However, for structures made of relatively thin components, welding
can introduce significant buckling distortion, which causes loss of
dimensional control and structural integrity and increases fabrication
costs. Different parameters can affect the buckling behavior of welded
thin structures, such as heat input, welding sequence and the dimensions
of the structure. In this work, a 3-D thermo-elastic-viscoplastic finite
element analysis technique is applied to evaluate the effect of shell
dimensions on the buckling behavior and entropy generation of welded
thin shells. In addition, the approximate longitudinal
transient stresses produced at each time step are applied to a
3-D eigenvalue analysis to verify the predicted buckling time and
the corresponding eigenmode. Furthermore, the possibility of predicting
buckling from the entropy generation at each time step is investigated, and it is
found that one can predict the time of buckling by plotting entropy
generation versus out-of-plane deformation. The results of the finite
element analysis show that the length, span and thickness of welded
thin shells affect the number of local buckling modes, the mode shape of global
buckling and the post-buckling behavior of welded thin shells.
Abstract: This research work proposes a model of network security systems aimed at preventing the production system in a data center from being attacked by intrusions. Conceptually, we introduce a decoy system as a part of the security system for luring intrusions, and apply a network intrusion detection system (NIDS), coupled with the decoy system, to perform intrusion prevention. When the NIDS detects intrusion activity, it signals a redirection module to redirect all malicious traffic to the decoy system instead, and hence the production system remains protected and safe. In a normal situation, however, traffic is simply forwarded to the production system as usual. Furthermore, we assess the performance of the model with various bandwidths, packet sizes and inter-attack intervals (attacking frequencies).
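Conceptually, the redirection module reduces to a routing decision keyed on NIDS alerts; a toy sketch of that decision (addresses, the blacklist mechanism and the packet representation are hypothetical):

```python
DECOY, PRODUCTION = "10.0.0.99", "10.0.0.10"
blacklist = set()            # source IPs flagged by the NIDS

def on_nids_alert(src_ip):
    # The NIDS signals the redirection module about a detected intrusion.
    blacklist.add(src_ip)

def route(packet):
    """Redirection module: malicious traffic goes to the decoy system,
    normal traffic is forwarded to the production system as usual."""
    dst = DECOY if packet["src"] in blacklist else PRODUCTION
    return {**packet, "dst": dst}
```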