Abstract: In this paper, an extended study, building on a previous one, is performed on the effect of different factors on the quality of vector data. For the noise factor, a kind of noise that appears in document images, namely Gaussian noise, is studied, whereas the previous study involved only salt-and-pepper noise; both high and low noise levels are considered. For the noise-cleaning factor, algorithms not covered in the previous study are used, namely the median filter and its variants. For the vectorization factor, one of the best available commercial raster-to-vector packages, VPstudio, is used to convert raster images into vector format. The performance of line detection is judged using an objective performance evaluation method, and the evaluation output is then analyzed statistically to highlight the factors that affect vector quality.
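The median filtering studied as a cleaning method can be sketched in a few lines of Python; the 3x3 window and edge padding below are illustrative choices, not the paper's settings.

```python
import numpy as np

def median_filter(img, k=3):
    """Apply a k x k median filter with edge-replicated padding."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

# A salt-and-pepper spike in a flat region is removed entirely.
img = np.full((5, 5), 100, dtype=np.uint8)
img[2, 2] = 255                      # "salt" pixel
clean = median_filter(img)
print(clean[2, 2])                   # -> 100
```

Unlike a mean filter, the median discards the outlier instead of smearing it, which is why median variants are standard for salt-and-pepper noise.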
Abstract: A computational study of two-dimensional supersonic reacting hydrogen-air flows is performed to investigate the effect of nitrogen on ignition delay time for premixed and diffusion flames. Chemical reaction is treated using detailed kinetics, and the advection upstream splitting method is used to calculate the numerical inviscid fluxes. The results show that only under stoichiometric conditions, for both premixed and diffusion flames, does the ignition delay time depend monotonically on nitrogen addition. In other situations, the optimal condition from the ignition viewpoint must be found through numerical investigation.
Abstract: Microtomographic images and thin section (TS) images were analyzed and compared against parameters of geological interest such as porosity and its distribution along the samples. The results show that microtomography (CT) analysis, although limited by its resolution, provides useful information about the distribution of porosity (homogeneous or not) and can also quantify both connected and non-connected pores, i.e., total porosity. TS analysis has no resolution limitation, but it is restricted by the experimental data available (only a few glass slides for analysis) and can give information only about the connected pores, i.e., effective porosity. Each method has its own virtues and flaws, but when paired together they complement one another, making for a more reliable and complete analysis.
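The distinction between total and effective (connected) porosity can be illustrated on a binary pore map; this 2D flood-fill sketch, which counts as "effective" only the pores reachable from the sample surface, is a simplification of the 3D connectivity analysis.

```python
import numpy as np
from collections import deque

def porosities(pore):
    """pore: boolean array, True = pore voxel.
    Returns (total, effective) porosity; effective counts only pores
    connected to the sample boundary (4-connectivity, 2D sketch)."""
    total = pore.mean()
    seen = np.zeros_like(pore, dtype=bool)
    q = deque()
    h, w = pore.shape
    # Seed the flood fill with every pore voxel on the boundary.
    for i in range(h):
        for j in range(w):
            if (i in (0, h - 1) or j in (0, w - 1)) and pore[i, j]:
                seen[i, j] = True
                q.append((i, j))
    while q:
        i, j = q.popleft()
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w and pore[ni, nj] and not seen[ni, nj]:
                seen[ni, nj] = True
                q.append((ni, nj))
    return total, seen.mean()

# One isolated interior pore and one pore open to the surface:
sample = np.zeros((5, 5), dtype=bool)
sample[2, 2] = True          # isolated (non-connected) pore
sample[0, 0] = True          # pore touching the boundary
total, effective = porosities(sample)
print(total, effective)      # -> 0.08 0.04
```

The isolated pore contributes to total porosity (as CT reports) but not to effective porosity (as TS reports), which is exactly the complementarity described above.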
Abstract: Thirty-three re-wetting tests were conducted on barley at different combinations of temperature (5.7-46.3 °C) and relative humidity (48.2-88.6%). The two most commonly used thin-layer drying and re-wetting models, Page and Diffusion, were compared for their ability to fit the experimental re-wetting data, based on the standard error of estimate (SEE) between the measured and simulated moisture contents. The comparison shows that both the Page and Diffusion models fit the re-wetting experimental data of barley well. The average SEE values for the Page and Diffusion models were 0.176% d.b. and 0.199% d.b., respectively. The Page and Diffusion models were found to be the most suitable equations to describe the thin-layer re-wetting characteristics of barley over a typical five-day re-wetting period. These two models can be used to simulate the deep-bed re-wetting of barley that occurs during ventilated storage and deep-bed drying.
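The Page model referred to above is commonly written MR = exp(-k t^n). A hedged sketch of fitting it by linearization, with an SEE of the kind used for the comparison (assuming the usual degrees-of-freedom correction, which the abstract does not specify), is:

```python
import numpy as np

def fit_page(t, mr):
    """Fit the Page model MR = exp(-k * t**n) by linearizing:
    ln(-ln MR) = ln k + n ln t  (illustrative; the paper's fitting
    procedure may differ)."""
    n, ln_k = np.polyfit(np.log(t), np.log(-np.log(mr)), 1)
    return np.exp(ln_k), n

def see(observed, predicted, n_params):
    """Standard error of estimate with n_params fitted parameters."""
    resid = np.asarray(observed) - np.asarray(predicted)
    return np.sqrt(np.sum(resid**2) / (len(resid) - n_params))

# Synthetic check: data generated from the model are recovered exactly.
k_true, n_true = 0.05, 1.3
t = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 24.0])
mr = np.exp(-k_true * t**n_true)
k, n = fit_page(t, mr)
pred = np.exp(-k * t**n)
print(round(k, 3), round(n, 3))   # -> 0.05 1.3
```

Real re-wetting data would of course carry noise, and a nonlinear least-squares fit is often preferred over the linearization shown here.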
Abstract: Climate change leading to global warming affects the earth in many ways: weather (temperature, precipitation, humidity and other weather parameters), snow coverage and ice melting, sea level rise, hydrological cycles, water quality, agriculture, forests, ecosystems and health. Hydrology and water resources are among the areas most affected by climate change, and regions where the majority of runoff consists of snowmelt are especially sensitive to it. The first step of climate change studies is to establish trends in significant climate variables, including precipitation, temperature and flow data, to detect any climate change impacts that may already have occurred. Two popular non-parametric trend analysis methods, Mann-Kendall and Spearman's rho, were applied to the Upper Euphrates Basin (Turkey) to detect trends in precipitation, temperature (maximum, minimum and average) and streamflow.
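The Mann-Kendall test applied here is built from the S statistic over all data pairs; a minimal sketch, omitting the tie correction of the full test, is:

```python
import numpy as np

def mann_kendall(x):
    """Mann-Kendall S statistic and normal-approximation Z
    (no tie correction -- a simplification of the full test)."""
    x = np.asarray(x)
    n = len(x)
    # S sums the signs of all later-minus-earlier differences.
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / np.sqrt(var)
    elif s < 0:
        z = (s + 1) / np.sqrt(var)
    else:
        z = 0.0
    return s, z

# A strictly increasing series gives the maximum S = n(n-1)/2 = 45,
# and |Z| > 1.96 flags a significant trend at the 5% level.
s, z = mann_kendall([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
print(s, z > 1.96)   # -> 45 True
```

Because the test uses only the signs of the pairwise differences, it is non-parametric: no distribution is assumed for the precipitation, temperature or flow series.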
Abstract: Cloud computing is the innovative and leading
information technology model for enabling convenient, on-demand
network access to a shared pool of configurable computing resources
that can be rapidly provisioned and released with minimal
management effort. This paper presents our development on enabling
an individual user's desktop in a virtualized environment, which is
stored on a remote virtual machine rather than locally. We present the
initial work on the integration of virtual desktop and application
sharing with virtualization technology. Given the development of
remote desktop virtualization, this proposed effort has the potential to
provide an efficient, resilient and elastic environment for
online cloud services. Users no longer need to bear the cost of
software licenses and platform maintenance. Moreover, this
development also helps boost user productivity by promoting a
flexible model that lets users access their desktop environments from
virtually anywhere.
Abstract: Recently, Genetic Algorithms (GA) and the Differential Evolution (DE) algorithm have attracted considerable attention among modern heuristic optimization techniques. Since the two approaches are meant to find a solution to a given objective function but employ different strategies and computational effort, it is appropriate to compare their performance. This paper presents the application and performance comparison of the DE and GA optimization techniques for flexible AC transmission system (FACTS)-based controller design, where the design objective is to enhance power system stability. The design problem of the FACTS-based controller is formulated as an optimization problem, and both the DE and GA techniques are employed to search for the optimal controller parameters. The performance of the two optimization techniques is compared. Further, the optimized controllers are tested on a weakly connected power system subjected to different disturbances, and their performance is compared with that of a conventional power system stabilizer (CPSS). Eigenvalue analysis and non-linear simulation results are presented and compared to show the effectiveness of both techniques in designing a FACTS-based controller that enhances power system stability.
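The DE technique compared here follows the standard mutation/crossover/selection loop; a minimal DE/rand/1/bin sketch is shown below, with a generic test function standing in for the controller-design objective (population size, bounds and control parameters F and CR are illustrative choices).

```python
import numpy as np

rng = np.random.default_rng(0)

def de_minimize(f, dim, pop=20, gens=200, F=0.8, CR=0.9, lo=-5.0, hi=5.0):
    """Minimal DE/rand/1/bin sketch."""
    x = rng.uniform(lo, hi, (pop, dim))
    fx = np.array([f(v) for v in x])
    for _ in range(gens):
        for i in range(pop):
            # Mutation: combine three distinct randomly chosen vectors.
            a, b, c = x[rng.choice([j for j in range(pop) if j != i], 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover with at least one mutant component.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, x[i])
            # Greedy selection.
            ft = f(trial)
            if ft <= fx[i]:
                x[i], fx[i] = trial, ft
    return x[fx.argmin()], fx.min()

# Sphere function: DE should reach (near) the global minimum at 0.
best, val = de_minimize(lambda v: float(np.sum(v**2)), dim=3)
print(val < 1e-3)   # -> True
```

In the controller-design setting, the decision vector would hold the controller parameters and f would return the stability-based objective; that mapping is problem-specific and not detailed in the abstract.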
Abstract: As digital technology develops, digital cinema is becoming more widespread. However, content copying and attacks against digital cinema have become a serious problem. To address this security problem, we propose "additional watermarking" for a digital cinema delivery system. With the proposed "additional watermarking" method, we protect content copyrights at the encoder and user-side information at the decoder, which makes the watermark embedded at the encoder traceable. The watermark is embedded into randomly selected frames using a hash function: because the embedding positions are distributed by the hash function, third parties cannot break the watermarking algorithm. Finally, our experimental results show that the proposed method outperforms conventional watermarking techniques in terms of robustness, image quality, and its simple but unbreakable algorithm.
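Hash-based selection of embedding frames, as described above, can be sketched as follows; the exact construction (hash input, index derivation) is an assumption, since the abstract does not specify it.

```python
import hashlib

def embedding_frames(content_id, key, n_frames, n_marks):
    """Pick watermark-embedding frame indices by hashing a secret key
    with the content ID and a counter (illustrative construction)."""
    chosen = []
    counter = 0
    while len(chosen) < n_marks:
        digest = hashlib.sha256(f"{key}:{content_id}:{counter}".encode()).digest()
        idx = int.from_bytes(digest[:4], "big") % n_frames
        if idx not in chosen:
            chosen.append(idx)
        counter += 1
    return chosen

# Deterministic for the key holder, unpredictable without the key:
a = embedding_frames("movie-001", "secret", n_frames=1000, n_marks=5)
b = embedding_frames("movie-001", "secret", n_frames=1000, n_marks=5)
print(a == b, len(set(a)) == 5)   # -> True True
```

The key holder can regenerate the embedding positions to extract the watermark, while a third party without the key sees positions that are computationally indistinguishable from random.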
Abstract: Removal of PCP by a system combining
biodegradation by biofilm and adsorption was investigated here.
Three studies were conducted employing batch tests, sequencing
batch reactor (SBR) and continuous biofilm activated carbon
column reactor (BACCOR). The combined biofilm-GAC batch
process removed about 30% more PCP than GAC adsorption
alone. In the SBR process, both the suspended and attached
biomass could remove more than 90% of the PCP after
acclimatisation. BACCOR was able to remove more than 98% of
PCP-Na at concentrations ranging from 10 to 100 mg/L, at empty
bed contact time (EBCT) ranging from 0.75 to 4 hours. Pure and
mixed cultures from BACCOR were tested for their ability to use PCP
as the sole carbon and energy source under aerobic conditions. The isolates
were able to degrade up to 42% of PCP under aerobic conditions in
pure cultures. However, mixed cultures were able to degrade
more than 99% of the PCP, indicating interdependence of species.
Abstract: This paper outlines the research conducted to propose a framework of 'Knowledge Society' (KS) in the Malaysian context.
It is important to highlight that the emergence of KS is a result of the rapid growth in knowledge and information. However, the discussion
of KS should not only be limited to the importance of knowledge, but a holistic KS is also determined by other imperative dimensions. This
article discusses the results of a study conducted previously in Malaysia in order to identify the essential dimensions of KS, and
consequently propose a KS framework in the Malaysian context.
Two methods were employed, namely the Delphi technique and semi-structured interviews. The modified Delphi involved five
rounds with ten experts, while the interviews were conducted with two prominent figures in Malaysia. The results support the proposed
framework, which contains seven major dimensions needed for Malaysia to become a KS in the future. The dimensions crucial for a holistic Malaysian KS are human capital, spirituality, economy, society, institutions, sustainability, and ICT.
Abstract: This study reports the preparation of soft magnetic ribbons of Fe-based amorphous alloys using the single-roller melt-spinning technique. Ribbons 142 mm and 213 mm wide were produced, with a thickness of approximately 22 μm ± 2 μm. The microstructure and magnetic properties of the ribbons were characterized by differential scanning calorimetry (DSC), X-ray diffraction (XRD), vibrating sample magnetometry (VSM), and electrical resistivity measurements (ERM). The dependence of the amorphous material properties on cooling rate and nozzle pressure, and the resulting unevenness in ribbon thickness, are investigated. Magnetic measurements indicate that some regions of the ribbon exhibit good magnetic properties, with higher saturation induction and lower coercivity. However, due to the uneven surface of the 213 mm wide ribbon, its magnetic response is not uniformly distributed. To understand the transformer's magnetic performance, this study analyzes measurements of a three-phase 2 MVA amorphous-cored transformer. Experimental results confirm that the transformer built with the 142 mm wide ribbon has better magnetic properties in terms of lower core loss, exciting power, and audible noise.
Abstract: METIS, the Multi Element Telescope for Imaging and Spectroscopy, is a coronagraph aboard the European Space Agency's Solar Orbiter mission aimed at observing the solar corona via both VIS and UV/EUV narrow-band imaging and spectroscopy. With its multi-wavelength capabilities, METIS will study in detail the physical processes responsible for coronal heating and the origin and properties of the slow and fast solar wind. The METIS electronics will collect and process scientific data by means of its detector proximity electronics, the digital front-end subsystem electronics, and the MPPU (Main Power and Processing Unit), which hosts a space-qualified processor, memories and rad-hard FPGAs acting as digital controllers. This paper reports on the overall METIS electronics architecture and data processing capabilities, conceived to address all the scientific issues as a trade-off between requirements and allocated resources, just before the Preliminary Design Review, an ESA milestone, in April 2012.
Abstract: Segmentation is an important step in medical image analysis and classification for radiological evaluation or computer-aided diagnosis. Computer-aided diagnosis (CAD) of lung CT generally first segments the area of interest (the lung) and then analyzes the segmented area for nodule detection in order to diagnose disease. For a normal lung, segmentation can be performed by exploiting the excellent contrast between air and the surrounding tissues. However, this approach fails when the lung is affected by high-density pathology. Dense pathologies are present in approximately a fifth of clinical scans, and for computer analysis such as detection and quantification of abnormal areas it is vital that the entire lung region is preserved and that no part present in the original image is lost. In this paper we propose a lung segmentation technique that accurately segments the lung parenchyma from lung CT scan images. The algorithm was tested on 25 datasets of different patients received from Akron University, USA and Aga Khan Medical University, Karachi, Pakistan.
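The air/tissue contrast approach that the paper improves upon can be sketched as a simple density threshold (the -400 HU cutoff and toy values below are illustrative); it is exactly this kind of rule that fails when dense pathology raises lung density toward that of tissue.

```python
import numpy as np

def threshold_lung_mask(ct_slice, air_hu=-400.0):
    """Naive density-threshold segmentation: mark voxels darker
    (more air-like) than the cutoff as lung. Fails on dense
    pathology, which is the problem the paper addresses."""
    return ct_slice < air_hu

# Toy slice: "lung" voxels near -800 HU, "tissue" near +40 HU.
ct = np.full((4, 4), 40.0)
ct[1:3, 1:3] = -800.0
mask = threshold_lung_mask(ct)
print(int(mask.sum()))   # -> 4
```

A consolidated region at, say, +20 HU would fall above the cutoff and be silently dropped from the mask, losing part of the lung that a CAD system needs to analyze.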
Abstract: Local Linear Neuro-Fuzzy Models (LLNFM), like other neuro-fuzzy systems, are adaptive networks that provide robust learning capabilities and are widely used in applications such as pattern recognition, system identification, image processing and prediction. The local linear model tree (LOLIMOT) is a Takagi-Sugeno-Kang neuro-fuzzy algorithm that has proven its efficiency, compared with other neuro-fuzzy networks, in learning nonlinear systems and in pattern recognition. In this paper, dedicated reconfigurable and parallel processing hardware for the LOLIMOT algorithm and its applications is presented. This hardware realizes on-chip learning, which gives it the capability to work as a standalone device in a system. Synthesis results on FPGA platforms show its potential to run at least 250 times faster than software implementations of the algorithm.
Abstract: The empirical mode decomposition (EMD) represents any time series as a finite set of basis functions. The bases are termed intrinsic mode functions (IMFs), which are mutually orthogonal and contain a minimum amount of cross-information. The EMD successively extracts the IMFs with the highest local frequencies in a recursive way, which effectively yields a set of low-pass filters based entirely on the properties exhibited by the data. In this paper, EMD is applied to explore the properties of multi-year air temperature and to observe its effects on climate change under global warming. The method decomposes the original time series into intrinsic time scales and is capable of analyzing the nonlinear, non-stationary climatic time series that cause problems for many linear statistical methods and their users. The analysis shows that the modes of the EMD exhibit seasonal variability; most of the IMFs have a normal distribution, and the energy density distribution of the IMFs satisfies a Chi-square distribution. The IMFs are effective in isolating physical processes of various time scales and are also statistically significant. The analysis also shows that the EMD method does a good job of identifying many characteristics of interannual climate. The results suggest that climate fluctuations of every single element, such as temperature, are the result of variations in the global atmospheric circulation.
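One sifting step of the EMD described above can be sketched as subtracting the mean of the upper and lower extrema envelopes; linear interpolation stands in here for the cubic splines of the full algorithm, a deliberate simplification.

```python
import numpy as np

def sift_once(x):
    """One EMD sifting step: subtract the mean of the upper and lower
    envelopes (linear-interpolated -- the full algorithm uses splines
    and iterates until IMF stopping criteria are met)."""
    t = np.arange(len(x))
    maxima = [i for i in range(1, len(x) - 1) if x[i] > x[i-1] and x[i] > x[i+1]]
    minima = [i for i in range(1, len(x) - 1) if x[i] < x[i-1] and x[i] < x[i+1]]
    if len(maxima) < 2 or len(minima) < 2:
        return x  # monotone residue: nothing left to sift
    upper = np.interp(t, maxima, x[maxima])
    lower = np.interp(t, minima, x[minima])
    return x - (upper + lower) / 2

# Fast oscillation riding on a slow trend: sifting pulls out the
# oscillation (the first IMF candidate) and leaves the trend behind.
t = np.linspace(0, 1, 500)
x = np.sin(2 * np.pi * 25 * t) + t          # signal + linear trend
imf = sift_once(x)
err = np.abs(imf[50:-50] - np.sin(2 * np.pi * 25 * t)[50:-50]).max()
print(err < 0.1)   # -> True
```

Repeating this sifting on the residue x - imf extracts successively slower IMFs, which is how the decomposition separates seasonal from interannual time scales in a temperature record.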
Abstract: This applied research proposes a price quotation method for a contract electronics manufacturer. The company had a precise price quoting method, but it could not provide a result as quickly as customers required, which reduced the company's ability to compete in this business. The cause of the long quotation process was therefore analyzed. Customers demand many product features, and by examining the routine processes it was found that a large fraction of the quoting time was spent estimating production time, which in turn drives the manufacturing cost. Historical product data, including product types, number of components, assembly method, and assembly times, were then used to identify the key components affecting production time, and a price quoting model was proposed. Implementing the proposed model remarkably reduced quoting time while maintaining acceptable precision.
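Estimating production time from key product features, as described above, amounts to fitting a model to historical data; the sketch below uses ordinary least squares, and the feature names and numbers are invented for illustration (the paper's actual model is not specified in the abstract).

```python
import numpy as np

# Hypothetical history: assembly time driven by component count and
# number of manual assembly steps (synthetic data from a known model).
n_components = np.array([10., 25., 40., 60., 80.])
manual_steps = np.array([2., 3., 5., 6., 9.])
assembly_min = 0.8 * n_components + 2.0 * manual_steps + 1.0

# Least-squares fit of time = a*components + b*steps + c.
X = np.column_stack([n_components, manual_steps, np.ones(len(n_components))])
coef, *_ = np.linalg.lstsq(X, assembly_min, rcond=None)
print(np.allclose(coef, [0.8, 2.0, 1.0]))   # -> True
```

Once fitted, such a model lets a quotation be produced from a new product's feature counts in milliseconds, instead of waiting for a detailed production-time study.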
Abstract: A self-evolution algorithm for optimizing neural networks using a combination of PSO and JPSO is proposed. The algorithm optimizes both the network topology and its parameters simultaneously, with the aim of achieving the desired accuracy with less complicated networks. The performance of the proposed approach is compared with conventional back-propagation networks on several synthetic functions, with better results in the case of the former. The proposed algorithm is also applied to a slope stability problem to estimate the critical factor of safety. Based on the results obtained, the proposed self-evolving network produced a better estimate of the critical safety factor than a conventional BPN.
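The particle swarm component of the proposed hybrid can be sketched in its bare form; the topology-evolving JPSO part is omitted, and a generic test function stands in for the network-training objective (all control parameters below are common textbook choices, not the paper's).

```python
import numpy as np

rng = np.random.default_rng(1)

def pso_minimize(f, dim, pop=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Bare-bones global-best PSO sketch."""
    x = rng.uniform(-5, 5, (pop, dim))
    v = np.zeros((pop, dim))
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    g = pbest[pval.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((pop, dim)), rng.random((pop, dim))
        # Velocity update: inertia + cognitive + social terms.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        fx = np.array([f(p) for p in x])
        better = fx < pval
        pbest[better], pval[better] = x[better], fx[better]
        g = pbest[pval.argmin()].copy()
    return g, pval.min()

# Sphere function as a stand-in objective; PSO converges toward 0.
best, val = pso_minimize(lambda p: float(np.sum(p**2)), dim=4)
print(val < 1e-3)   # -> True
```

In the paper's setting the particle position would encode network weights (and, via JPSO, which nodes exist), with f returning the training error.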
Abstract: The Boundary Representation of a 3D manifold contains FACES (connected subsets of a parametric surface S : R^2 -> R^3). In many science and engineering applications it is cumbersome and algebraically difficult to deal with the polynomial set and constraints (LOOPs) representing the FACE. For this reason, a Piecewise Linear (PL) approximation of the FACE is needed, usually represented in terms of triangles (i.e., 2-simplices). Solving the FACE triangulation problem requires producing quality triangles that are: (i) independent of the arguments of S, (ii) sensitive to the local curvatures, (iii) compliant with the boundaries of the FACE, and (iv) topologically compatible with the triangles of the neighboring FACEs. The existing literature offers no guarantees for point (iii). This article contributes to the topic of triangulations conforming to the boundaries of the FACE by applying the concept of the parameter-independent Gabriel complex, which improves the correctness of the triangulation with respect to aspects (iii) and (iv). In addition, the article applies the geometric concept of a ball tangent to the surface at a point to address points (i) and (ii). Additional research is needed on algorithms that (i) take advantage of the concepts presented in the proposed heuristic algorithm and (ii) can be proved correct.
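The Gabriel condition underlying the parameter-independent Gabriel complex is easy to state: an edge belongs to the Gabriel graph iff no other sample lies strictly inside the ball having that edge as its diameter. A 2D sketch of the test (the complex itself extends this to triangles on surface samples):

```python
import numpy as np

def is_gabriel_edge(p, q, points):
    """Edge (p, q) passes the Gabriel test iff no other sample lies
    strictly inside the ball with segment pq as its diameter."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mid = (p + q) / 2
    r2 = np.sum((p - q) ** 2) / 4          # squared radius
    for s in points:
        s = np.asarray(s, float)
        if np.allclose(s, p) or np.allclose(s, q):
            continue
        if np.sum((s - mid) ** 2) < r2:
            return False
    return True

pts = [(0, 0), (2, 0), (1, 0.2), (5, 5)]
print(is_gabriel_edge((0, 0), (2, 0), pts),   # (1, 0.2) lies inside -> False
      is_gabriel_edge((0, 0), (1, 0.2), pts)) # -> True
```

Because the test uses only 3D point positions, not the parameters of S, it is parameter-independent in exactly the sense required by property (i).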
Abstract: In this paper, a two-factor scheme is proposed to generate cryptographic keys directly from biometric data which, unlike passwords, are strongly bound to the user. The hash value of the reference iris code is used as the cryptographic key, and its length depends only on the hash function, independent of any other parameter. The entropy of such keys is 94 bits, much higher than that of any comparable system. The most important and distinctive feature of this scheme is that it regenerates the reference iris code when provided with a genuine iris sample and the correct user password. Since iris codes obtained from two images of the same eye are not exactly the same, error-correcting codes (a Hadamard code and a Reed-Solomon code) are used to deal with the variability. The scheme proposed here can be used to provide keys for a cryptographic system and/or for user authentication. Its performance is evaluated on two publicly available iris biometric databases, namely the CBS and ICE databases. The operating point of the system (the values of the False Acceptance Rate (FAR) and False Rejection Rate (FRR)) can be set by properly selecting the error-correction capacity (ts) of the Reed-Solomon codes; e.g., on the ICE database, at ts = 15, the FAR is 0.096% and the FRR is 0.76%.
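The error-correction step can be illustrated with a fuzzy-commitment-style sketch, in which a simple 7x repetition code stands in for the Hadamard/Reed-Solomon concatenation actually used; all sizes and error counts below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def encode(bits, rep=7):
    """Repetition code: repeat each key bit rep times."""
    return np.repeat(bits, rep)

def decode(bits, rep=7):
    """Majority vote per block; corrects up to rep//2 flips per block."""
    return (bits.reshape(-1, rep).sum(axis=1) > rep // 2).astype(np.uint8)

key = rng.integers(0, 2, 32, dtype=np.uint8)
iris_ref = rng.integers(0, 2, 32 * 7, dtype=np.uint8)
lock = iris_ref ^ encode(key)          # stored helper data

# A fresh sample of the same eye differs in some bits; here we flip
# 3 bits in each of 10 blocks, within the code's correction capacity.
iris_sample = iris_ref.copy()
flips = [7 * b + o for b in range(10) for o in (0, 1, 2)]
iris_sample[flips] ^= 1
recovered = decode(lock ^ iris_sample)
print(np.array_equal(recovered, key))  # -> True
```

XORing the noisy sample with the stored lock leaves the codeword corrupted only by the iris variability, which the code then corrects, so the exact reference-derived key is regenerated.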
Abstract: Video watermarking is usually treated as watermarking of a set of still images. In the frame-by-frame watermarking approach, each video frame is seen as a single watermarked image, so collusion attacks are especially critical in video watermarking. If the same or a redundant watermark is embedded in every frame of a video, the watermark can be estimated and then removed by a watermark estimation remodulation (WER) attack. Conversely, if uncorrelated watermarks are used for every frame, they can be washed out by frame temporal filtering (FTF). The switching watermark system, the so-called SS-N system, performs better against WER and FTF attacks: for each frame, the watermark is randomly picked from a finite pool of watermark patterns. This paper first surveys the SS-N system and then proposes a new collusion attack against it, based on a new algorithm for separating video frames according to their watermark pattern. N sets are built, each containing the frames carrying the same watermark; then, by applying a WER attack to each set, the N different watermark patterns are estimated and subsequently removed.
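The frame-separation step of such an attack can be sketched as assigning each frame to the pool watermark it correlates with most strongly; frame content is modeled as noise here for illustration, and the attacker is assumed (unrealistically, for simplicity) to know the pool.

```python
import numpy as np

rng = np.random.default_rng(3)

def group_frames(frames, pool):
    """Assign each frame to the pool watermark with maximal correlation
    -- the frame-separation step before per-set WER estimation."""
    return [int(np.argmax([np.dot(f, w) for w in pool])) for f in frames]

n_px, N = 1024, 4
pool = [rng.standard_normal(n_px) for _ in range(N)]
truth = rng.integers(0, N, 40)                     # which mark each frame got
frames = [rng.standard_normal(n_px) * 0.5 + pool[k] for k in truth]
groups = group_frames(frames, pool)
print(all(g == k for g, k in zip(groups, truth)))  # -> True
```

Once the frames are partitioned into N sets, each set behaves like a same-watermark video, so the standard WER attack applies to each set separately.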