Abstract: This paper presents a nondestructive testing (NDT) method based on infrared thermography with CO2 laser excitation at a wavelength of 10.6 μm. The excitation acts as a controllable heating beam, as confirmed by a preliminary test on a wooden plate of 1.2 m x 0.9 m x 1 cm. As a first application, the method is used to detect defects in CFRP heated by the 300 W laser for 40 s. Two samples of 40 cm x 40 cm x 4.5 cm are prepared, one with a defect and one without. The laser beam passes through the lens of a deviation device and heats the samples placed at a predetermined position and area. As a result, the absence of adhesive can be detected. The method demonstrates its applicability as an NDT technique for composite materials, and this work provides a good basis for characterizing the laser beam, which will be useful for subsequent detection campaigns.
Abstract: The use of radar in Quantitative Precipitation Estimation (QPE) for radar-rainfall measurement is significantly beneficial. Radar offers high spatial and temporal resolution in rainfall measurement and forecasting. In Malaysia, the application of radar to QPE is still new and needs to be explored. This paper focuses on the derivation of Z/R relationships for radar-rainfall estimation based on rainfall classification. New Z/R relationships were developed for the Klang River Basin in the Selangor area for three general classes of rain event, namely low (below 10 mm/hr), moderate (10-30 mm/hr) and heavy (above 30 mm/hr), as well as for more specific rain types during monsoon seasons. Given the high potential of Doppler radar in QPE, the newly formulated Z/R equations will be useful in improving rainfall measurement for hydrological applications, especially flood forecasting.
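The abstract does not reproduce the fitted coefficients for the Klang River Basin, so as an illustration the classic Marshall-Palmer values (a = 200, b = 1.6) can stand in for a derived Z/R relationship. A minimal sketch of converting radar reflectivity to rain rate under that assumption:

```python
def rain_rate_from_z(dbz, a=200.0, b=1.6):
    # Invert Z = a * R**b for rain rate R (mm/hr).
    # Reflectivity is given in dBZ: Z = 10**(dBZ/10) in mm^6/m^3.
    # a=200, b=1.6 are the Marshall-Palmer defaults, not the
    # basin-specific coefficients derived in the paper.
    z = 10 ** (dbz / 10)
    return (z / a) ** (1 / b)
```

In practice the derived class-specific (low/moderate/heavy, monsoonal) coefficient pairs would be selected per rain event before applying the inversion.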
Abstract: Although the World Wide Web is considered the largest source of information in existence today, its inherently dynamic characteristics can make the task of finding useful, qualified information a very frustrating experience. This study presents research on Web information mining systems and proposes implementing such systems as components built with Web services technology. They can thus take on the features of a service-oriented architecture (SOA), and individual components may be reused by other tools independently of platform or programming language. Hence, the main objective of this work is to provide an architecture for Web mining systems divided into stages, where each stage is a component incorporating the characteristics of SOA. The separation into stages was designed based on the existing literature, and the results obtained are presented here.
Abstract: The energy consumption and delay of read/write operations in conventional SRAM are investigated both analytically and by simulation. Explicit analytical expressions for the energy consumption and delay of read and write operations are derived as functions of device parameters and supply voltage. The expressions are useful for predicting the effect of parameter changes on energy consumption and speed, as well as for optimizing the design of conventional SRAM. HSPICE simulation in a standard 0.25 μm CMOS technology confirms the accuracy of the analytical expressions derived in this paper.
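The derived expressions themselves are not reproduced in the abstract; analytical SRAM energy models of this kind build on the standard dynamic switching energy E = C * Vdd * dV for each (dis)charged node. A minimal sketch, with an illustrative (hypothetical) bitline capacitance and the 2.5 V supply typical of 0.25 μm CMOS:

```python
def dynamic_switching_energy(c_load_f, vdd, swing=None):
    # Energy drawn from the supply to (dis)charge a node of capacitance
    # c_load_f (farads): E = C * Vdd * dV.  Full-swing nodes use dV = Vdd,
    # giving the familiar E = C * Vdd**2; SRAM bitlines during a read only
    # swing by a small dV, which is why `swing` can be overridden.
    dv = vdd if swing is None else swing
    return c_load_f * vdd * dv

# Hypothetical numbers, for illustration only:
# 100 fF bitline, 2.5 V supply, 200 mV read swing.
read_energy = dynamic_switching_energy(100e-15, 2.5, swing=0.2)
write_energy = dynamic_switching_energy(100e-15, 2.5)  # full swing
```

The paper's expressions additionally capture device parameters (threshold voltages, transistor sizing); this sketch shows only the supply-voltage dependence common to all such models.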
Abstract: Long-number multiplication (n ≥ 128 bits) is a primitive in most cryptosystems. It can be performed more efficiently with the Karatsuba-Ofman technique. This algorithm is easy to parallelize on workstation networks and on distributed memory, and it is known as the practical method of choice. Multiplying long numbers with the Karatsuba-Ofman algorithm is fast but highly recursive. In this paper, we propose different designs for implementing a Karatsuba-Ofman multiplier. A mixture of sequential and combinational system design techniques involving pipelining is applied to the proposed designs, so that multiplying large numbers can be adapted flexibly to time, area and power criteria. In computation- and area-constrained embedded systems, such as smart cards and mobile phones, multiplication of finite-field elements can thus be achieved more efficiently. The proposed designs are compared with other existing techniques. Mathematical models (Area(n), Delay(n)) of the proposed designs are also elaborated and evaluated on different FPGA devices.
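The recursion the hardware designs implement can be sketched in software. Karatsuba-Ofman splits each n-bit operand into high and low halves and replaces four half-size multiplications with three, at the cost of a few additions and shifts (this is a software illustration, not the paper's FPGA designs):

```python
def karatsuba(x, y):
    # Base case: small operands are multiplied directly.
    if x < 16 or y < 16:
        return x * y
    half = max(x.bit_length(), y.bit_length()) // 2
    mask = (1 << half) - 1
    x1, x0 = x >> half, x & mask      # x = x1 * 2**half + x0
    y1, y0 = y >> half, y & mask      # y = y1 * 2**half + y0
    z2 = karatsuba(x1, y1)            # high * high
    z0 = karatsuba(x0, y0)            # low * low
    # The middle term reuses z2 and z0: only three recursive products.
    z1 = karatsuba(x1 + x0, y1 + y0) - z2 - z0
    return (z2 << (2 * half)) + (z1 << half) + z0
```

The three independent sub-products are what make the algorithm attractive for the pipelined and parallel hardware organizations the paper explores.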
Abstract: A special case of floating-point data representation is the block floating-point format, where a block of operands is forced to share a joint exponent term. This paper deals with the finite-wordlength properties of this data format. The theoretical errors associated with the error model of the block floating-point quantization process are investigated with the help of error distribution functions. A fast and easy approximation formula for calculating the signal-to-noise ratio of quantization to block floating-point format is derived. This representation is found to be a useful compromise between fixed-point and floating-point formats, owing to its acceptable numerical error properties over a wide dynamic range.
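The quantization process the error model describes can be sketched directly: the block's shared exponent is set by its largest-magnitude element, and every mantissa is then rounded to a fixed wordlength at that scale (a minimal sketch; the paper's exact rounding convention is assumed to be round-to-nearest):

```python
import math

def bfp_quantize(block, mantissa_bits):
    # Shared exponent comes from the largest magnitude in the block:
    # m = f * 2**exp with 0.5 <= |f| < 1 (math.frexp convention).
    m = max(abs(v) for v in block)
    exp = math.frexp(m)[1]
    # Quantization step is 2**(exp - mantissa_bits); small elements in a
    # block with one large element lose precision -- the source of the
    # block-format error analyzed in the paper.
    scale = 2.0 ** (mantissa_bits - exp)
    return [round(v * scale) / scale for v in block]
```

The rounding error per element is bounded by half a step, 2**(exp - mantissa_bits - 1), which is the quantity the error distribution functions characterize statistically.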
Abstract: This research simulates a natural phenomenon, the ocean wave. Our goal is to simulate the ocean wave at a real-time rate, with the water surface interacting with objects. The wave in this research is calm and smooth, driven by the force of the wind above the ocean surface. To make the simulation run in real time, GPU and multithreading techniques are employed. Since current-generation CPUs for personal computers have multiple cores, multithreading can utilize more than one core at a time. The simulation is programmed in C with OpenGL. To make the wave look more realistic, we applied the OpenGL technique of cube mapping (environment mapping) to make the water surface reflective.
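The abstract does not give the wave model; a common choice for calm, smooth wind-driven water is a height field built as a sum of sinusoids, each with its own amplitude, wavelength, speed and travel direction. A minimal sketch under that assumption (Python for illustration; the paper's implementation is in C/OpenGL):

```python
import math

def wave_height(x, z, t, components):
    # Each component is (amplitude, wavelength, speed, (dx, dz)),
    # where (dx, dz) is the horizontal travel direction.
    h = 0.0
    for amp, wavelength, speed, (dx, dz) in components:
        k = 2 * math.pi / wavelength          # wavenumber
        phase = k * (dx * x + dz * z) + k * speed * t
        h += amp * math.sin(phase)
    return h
```

Evaluating this independently per grid vertex is what makes the height field easy to split across CPU threads or GPU shader invocations, as the paper does.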
Abstract: In this paper, the usefulness of the quasi-Newton iteration procedure for parameter estimation of the conditional variance equation within the BHHH algorithm is presented. Analytical maximization of the likelihood function using first and second derivatives is too complex when the variance is time-varying. The advantage of the BHHH algorithm over other optimization algorithms is that it requires no third derivatives while assuring convergence. To simplify the optimization procedure, the BHHH algorithm approximates the matrix of second derivatives using the information identity. However, parameter estimation in symmetric and asymmetric GARCH(1,1) models assuming normally distributed returns is not simple, i.e. it is difficult to solve analytically. The maximum of the likelihood function can instead be found by iterating until no further increase is achieved. Because the solutions of numerical optimization are very sensitive to the initial values, starting parameters for the GARCH(1,1) model are defined; the number of iterations can be reduced by using starting values close to the global maximum. The optimization procedure is illustrated by modeling daily volatility of the most liquid stocks on the Croatian capital market: Podravka (food industry), Petrokemija (fertilizer industry) and Ericsson Nikola Tesla (information and communications industry).
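The objective the iteration maximizes can be sketched concretely. For the symmetric GARCH(1,1) with normal returns, the conditional variance follows the recursion s_t = omega + alpha * r_{t-1}^2 + beta * s_{t-1}, and the Gaussian log-likelihood is summed along it (a minimal sketch; initializing the variance at the sample variance is one common convention, assumed here):

```python
import math

def garch11_loglik(returns, omega, alpha, beta):
    # Gaussian log-likelihood of a symmetric GARCH(1,1) model.
    # Variance recursion: s_t = omega + alpha * r_{t-1}**2 + beta * s_{t-1}.
    s = sum(r * r for r in returns) / len(returns)  # start at sample variance
    ll = 0.0
    for r in returns:
        ll += -0.5 * (math.log(2 * math.pi) + math.log(s) + r * r / s)
        s = omega + alpha * r * r + beta * s
    return ll
```

BHHH then climbs this surface using only the per-observation score vectors (the information identity), repeatedly evaluating the function above at updated (omega, alpha, beta) until no further increase is found.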
Abstract: Network warfare is an emerging concept that focuses on the network- and computer-based forms through which information is attacked and defended. Various computer and network security concepts thus play a role in network warfare. Due to the intricacy of the various interacting components, a model that improves understanding of the complexity of a network warfare environment would be beneficial. Non-quantitative modeling is a useful method for characterizing the field, owing to the rich ideas that can be generated from secular associations, chronological origins, linked concepts, categorizations and context specifications. This paper proposes the use of non-quantitative methods, through a morphological analysis, to better explore and define the influential conditions in a network warfare environment.
Abstract: In a non-super-competitive environment, the concepts of the closed system and management control remain the dominant guiding concepts of management. The merits of the closed loop have been the source of much of the management literature and culture for many decades. It is a useful exercise to investigate the dynamics of the control loop phenomenon and draw lessons for refining the practice of management. This paper examines the multitude of lessons abstracted from the behavior of the input/output/feedback control loop model, which is the core of control theory. Numerous lessons can be learned from the insights this model provides and from how it parallels the management dynamics of the organization. It is assumed that an organization is basically a living system that interacts with internal and external variables; a viable control loop is one that reacts to variation in the environment and exerts a corrective action. In managing organizations, this is reflected in organizational structure and management control practices. This paper reports findings from examining several abstract scenarios exhibited in the design, operation, and dynamics of the control loop, and how they are projected onto the functioning of the organization. Valuable lessons are drawn by finding parallels and new paradigms, and by observing how control theory is reflected in the design of organizational structure and management practices. Further research is needed to extend these findings.
Abstract: On the basis of Bayesian inference using the maximizer of the posterior marginal estimate, we carry out phase unwrapping using multiple interferograms via generalized mean-field theory. Numerical calculations for a typical wave-front in remote sensing by synthetic aperture radar interferometry, together with the phase diagram in hyper-parameter space, clarify that the present method succeeds in phase unwrapping perfectly under the constraint of the surface-consistency condition, provided the interferograms are not corrupted by noise. We also find that the prior is useful for extending the range of phases over which phase unwrapping succeeds under the constraint of the surface-consistency condition. These results are quantitatively confirmed by Monte Carlo simulation.
Abstract: Gamboge disorder (GD), or fruit damage by yellow sap, is a major problem in mangosteen. Mangosteen plants vary in the level of GD, from very low or no GD to low, moderate and high GD. However, it is difficult to differentiate between GD and non-GD plants because evaluation of the disorder is strongly influenced by the environment. In this study, we investigated the usefulness of a primer designed from bioinformatics resources related to cell wall strength, termed MCWS, to predict GD. The plant materials used were 28 mangosteen plants selected on the basis of GD percentage, categorized as high, moderate, low and very low or no GD. The results showed that the specific DNA fragments were absent in the high-GD accessions. The MCWS marker is suggested as a novel polymorphic marker for GD in mangosteen, as well as a marker for detecting variability in mangosteen as an apomictic plant.
Abstract: In competitive electricity markets all over the world, the adoption of a suitable transmission pricing model is a problem, as the transmission segment still operates as a monopoly. Transmission pricing is an important tool to promote investment in various transmission services in order to provide economic, secure and reliable electricity to bulk and retail customers. Nodal pricing based on SRMC (Short-Run Marginal Cost) has been found extremely useful by researchers for sending correct economic signals. The marginal prices must be determined as part of the solution to an optimization problem, i.e. maximizing social welfare. The need to maximize social welfare subject to a number of system operational constraints is a major challenge from computational and societal points of view. The purpose of this paper is to present a nodal transmission pricing model based on SRMC by developing new mathematical expressions for real and reactive power marginal prices using a GA-Fuzzy based optimal power flow framework. The impacts of selecting different social welfare functions on power marginal prices are analyzed and verified against results reported in the literature. Network revenues for two different power systems are determined using the expressions derived for real and reactive power marginal prices in this paper.
Abstract: Solar sunspot rotation and latitudinal bands are studied using intelligent computation methods. A combination of an image fusion method with quad-tree decomposition is used to obtain quantitative values for the latitudes of the trajectories on the sun's surface around which sunspots rotate. Daily solar images taken with the Solar and Heliospheric Observatory (SOHO) satellite are fused for each month separately. The fused image is then decomposed with the quad-tree decomposition method in order to obtain precise information about the latitudes of sunspot trajectories. Such analysis is useful for gathering information about the regions of the sun's surface, and the coordinates in space, that are more exposed to solar geomagnetic storms, tremendous flares and the hot plasma gases that permeate interplanetary space, and it can help protect technical systems. Here, sunspot images from September, October and November 2001 are used to study the magnetic behavior of the sun.
Abstract: A survey and classification of the different security attacks in structured peer-to-peer (P2P) overlay networks can be useful to computer system designers, programmers, administrators, and users. In this paper, we attempt to provide a taxonomy of security attacks on structured P2P overlay networks, focusing especially on the way these attacks can arise at each level of the network. Moreover, we observe that most existing systems, such as Content Addressable Network (CAN), Chord, Pastry, Tapestry, Kademlia, and Viceroy, suffer from threats and vulnerabilities that can disrupt and corrupt their functioning. We hope that our survey will be a good aid for those working in this area of research.
Abstract: This paper presents three new methodologies for the basic fuzzy set operations, aiming at new ways of computing union (maximum) and intersection (minimum) membership values by taking into account the entire set of membership values in a fuzzy set. The new methodologies are conceptually simple and easy to apply, and are illustrated with a variety of problems, such as the Cartesian product of two fuzzy sets, max-min composition of two fuzzy sets in different product spaces, and an inverted-pendulum application, to determine the impact of the new methodologies. The results clearly indicate a difference depending on the nature of the fuzzy sets under consideration, and will hence be highly useful in applications where different values have a significant impact on the behavior of the system.
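The three new methodologies themselves are not specified in the abstract; the conventional operations they generalize can be sketched as a baseline. Fuzzy sets are represented here as dicts mapping elements to membership values, and relations as dicts keyed by pairs:

```python
def fuzzy_union(a, b):
    # Conventional union: element-wise maximum of membership values.
    return {x: max(a.get(x, 0.0), b.get(x, 0.0)) for x in set(a) | set(b)}

def fuzzy_intersection(a, b):
    # Conventional intersection: element-wise minimum.
    return {x: min(a.get(x, 0.0), b.get(x, 0.0)) for x in set(a) & set(b)}

def maxmin_composition(R, S):
    # Max-min composition of relations R on X x Y and S on Y x Z:
    # (R o S)(x, z) = max over y of min(R(x, y), S(y, z)).
    xs = {x for x, _ in R}
    ys = {y for _, y in R}
    zs = {z for _, z in S}
    return {(x, z): max(min(R.get((x, y), 0.0), S.get((y, z), 0.0))
                        for y in ys)
            for x in xs for z in zs}
```

The paper's proposals replace the pointwise max/min above with operators that consider all membership values in the set, which is what produces the differences reported in the results.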
Abstract: The area of Project Risk Management (PRM) has been extensively researched, and the use of various tools and techniques for managing risk in several industries has been amply reported. Formal and systematic PRM practices have been made available for the construction industry. Building on this body of knowledge, this paper seeks to establish a global picture of PRM practices and approaches through a survey of the usage of PRM techniques, the diffusion of software tools, their level of maturity, and their usefulness in the construction sector. Results show that, despite the existing techniques and tools, their usage is limited: software tools are used only by a minority of respondents, and their cost is one of the largest hurdles to adoption. Finally, the paper provides guidelines for future research on quantitative risk analysis techniques and suggestions for the development and improvement of PRM software tools.
Abstract: Multimedia security is an incredibly significant area of concern. A number of papers on robust digital watermarking have been presented, but no standards have been defined so far, so multimedia security remains an open problem. The aim of this paper is to design a robust image-watermarking scheme that can withstand a diverse set of attacks. The proposed scheme provides a robust solution integrating image moment normalization, content-dependent watermarks and the discrete wavelet transform. Moment normalization is useful for recovering the watermark even under geometric attacks. Content-dependent watermarks are a powerful means of authentication, as the data is watermarked with its own features. Discrete wavelet transforms are used because they describe image features well. The proposed scheme finds application in validating identification cards and financial instruments.
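The abstract does not give the exact embedding rule; a common wavelet-domain approach, shown here as a hedged one-dimensional sketch, is to add a small offset to approximation coefficients of a one-level Haar transform, one bit per coefficient (the paper works on 2D images with moment normalization and content-dependent bits, omitted here):

```python
def haar_dwt(x):
    # One-level Haar transform: pairwise averages (approximation)
    # and pairwise half-differences (detail).
    a = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return a, d

def haar_idwt(a, d):
    out = []
    for ai, di in zip(a, d):
        out += [ai + di, ai - di]
    return out

def embed(signal, bits, alpha=4.0):
    # Additively shift each approximation coefficient by +-alpha per bit.
    a, d = haar_dwt(signal)
    a = [ai + alpha * (1 if b else -1) for ai, b in zip(a, bits)]
    return haar_idwt(a, d)

def extract(watermarked, original):
    # Non-blind extraction: compare approximation coefficients
    # of the watermarked signal against the original's.
    aw, _ = haar_dwt(watermarked)
    ao, _ = haar_dwt(original)
    return [1 if w > o else 0 for w, o in zip(aw, ao)]
```

Embedding in the low-frequency (approximation) band is what gives wavelet-domain watermarks their robustness to compression-like attacks, at the cost of a larger perceptual footprint as alpha grows.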
Abstract: Indices summarizing community structure are used to evaluate fundamental community ecology, species interactions, biogeographical factors, and environmental stress. Some of these indices are insensitive to gross community changes induced by pollution contaminants. Diversity indices and similarity indices are reviewed with respect to their ecological application, both theoretical and practical. For some useful indices, empirical equations are given for calculating the expected maximum value of the index, to which observed values can be related at any combination of sample sizes at the experimental sites. This paper examines the effects of sample size and diversity on the expected values of diversity and similarity indices, using various formulae. It is shown that all the indices are strongly affected by sample size and diversity; in some indices this influence is greater than in others, and an attempt has been made to deal with these influences.
Abstract: Changes in consumers' modern lifestyles call for advertising strategies to change as well. This research investigates how telepresence and product experience embedded in a computer game affect users' intention to purchase. Game content developers are urged to consider placing product messages as part of a game design strategy that can influence players' intention to purchase. An experiment was carried out on two hundred and fifty undergraduate students who volunteered to participate in Internet game-playing activities. Factor analysis and correlation analysis were performed on items designed to measure telepresence, attitudes toward telepresence, and players' intention to purchase the product advertised in the game that the respondents experienced. The results indicated that telepresence consists of interactive experience and product experience. The study also found that product experience is positively related to players' intention to purchase. The significance of product experience implies the usefulness of interactive advertising within game play in attracting players' intention to purchase an advertised product placed in a creative game design.