Abstract: A great deal of research in the field of information
systems security has been based on a positivist paradigm. Applying
the reductionism of the positivist paradigm to information security
means missing the bigger picture, and the resulting lack of holism
could be one of the reasons why security is still overlooked, comes
as an afterthought, or is perceived from a purely technical
dimension. We need to reshape our thinking and attitudes towards
security, especially in a complex and dynamic environment such as
e-Business, to develop a holistic understanding of e-Business security
in relation to its context, considering all the stakeholders in
the problem area. In this paper we argue the suitability of, and need
for, a more inductive, interpretive approach and qualitative research
methods to investigate e-Business security. Our discussion is based
on a holistic framework of enquiry, the nature of the research
problem, the underlying theoretical lens, and the complexity of the
e-Business environment. Finally, we present a research strategy for
developing a holistic framework for understanding e-Business
security problems in the context of developing countries, based on an
interdisciplinary inquiry that considers their needs and
requirements.
Abstract: The replacement of conventional materials such as glass,
wood, and metals with polymers continues, the main reason being
simpler and therefore cheaper production. However, owing to high
energy and petrochemical prices, polymer prices are increasing too,
which is why various kinds of fillers are used to make polymers
cheaper. The aim, of course, is to maintain or improve the properties
of these compounds. This paper addresses the rheology of polymers
compounded with fibers of vegetable origin.
Abstract: Wavelets have given researchers significant positive
results in the texture defect detection domain. Their weak point is that they are one-dimensional
by nature, so they are not efficient enough to describe and analyze two-dimensional functions. In this paper we present a new method to
detect defects in texture images using the curvelet transform.
Simulation results of the proposed method on a set of standard
texture images confirm its correctness. A comparison of the obtained results indicates the superior ability of the curvelet transform,
relative to the wavelet transform, to describe discontinuities in
two-dimensional functions.
Abstract: Research and development (R&D) work involves an
enormous amount of effort related to data measurement and
collection. This process evolves as new information is fed, new
technologies are utilized, and eventually new knowledge is created
by the stakeholders i.e., researchers, clients, and end-users. When
new knowledge is created, the procedures of R&D work should evolve
and produce better results, with improved research skills and
improved methods of data measurement and collection. This
measurement improvement should then be benchmarked against a
metric that should be developed at the organization. In this paper, we
are suggesting a conceptual metric for R&D work performance
improvement (PI) at the Kuwait Institute for Scientific Research
(KISR). This PI is to be measured against a set of variables in the
suggested metric, which are more closely correlated to organizational
output, as opposed to organizational norms. The paper also mentions
and discusses knowledge creation and management as an added value
to R&D work and measurement improvement. The research
methodology followed in this work is qualitative in nature, based on
a survey that was distributed to researchers and interviews held with
senior researchers at KISR. The research and analyses in this paper
also include a review of KISR's literature.
Abstract: In this paper, a novel scheme is proposed for ownership identification and color image authentication by deploying cryptography and digital watermarking. The color image is first transformed from RGB to the YST color space, designed exclusively for watermarking. Following the color space transformation, each channel is divided into 4×4 non-overlapping blocks, with selection of the central 2×2 sub-blocks. Depending upon the channel selected, two to three LSBs of each central 2×2 sub-block are set to zero to hold the ownership, authentication, and recovery information. The size and position of the sub-block are important for correct localization, enhanced security, and fast computation. As YS ⊥ T, the T channel is suitable for embedding the recovery information apart from the ownership and authentication information; therefore, each 4×4 block of the T channel, along with the ownership information, is processed by SHA-160 to compute a content-based hash that is unique and invulnerable to birthday attacks or hash collisions, instead of using MD5, which may give rise to the condition H(m)=H(m'). For recovery, the intensity mean of each 4×4 block of every channel is computed and encoded in eight bits. For watermark embedding, key-based mapping of blocks is performed using the 2D Torus Automorphism. Our scheme is oblivious, generates highly imperceptible images with correct localization of tampering within reasonable time, and has the ability to recover the original work with probability near one.
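The block-wise embedding described in this abstract can be sketched roughly as follows. This is a minimal illustration only: the function names and bit-packing details are assumptions, and SHA-1 (160-bit) stands in for the paper's SHA-160; the exact channel-selection and 2D Torus Automorphism mapping steps are omitted.

```python
import hashlib

import numpy as np


def block_recovery_byte(block):
    """Encode the intensity mean of a 4x4 block in eight bits (one byte)."""
    return int(round(float(block.mean()))) & 0xFF


def block_hash_bits(block, owner_info, n_bits):
    """Content-based hash of a 4x4 block plus ownership data, using SHA-1
    (160-bit) as a stand-in for the paper's SHA-160."""
    digest = hashlib.sha1(block.tobytes() + owner_info).digest()
    return np.unpackbits(np.frombuffer(digest, dtype=np.uint8))[:n_bits]


def embed_bits_in_lsbs(sub_block, bits, n_lsb):
    """Zero the n_lsb low bits of each pixel of a central 2x2 sub-block and
    write watermark bits into them (len(bits) must equal 4 * n_lsb)."""
    flat = (sub_block.flatten() >> n_lsb) << n_lsb  # clear the LSBs
    for i in range(flat.size):
        chunk = bits[i * n_lsb:(i + 1) * n_lsb]
        value = int("".join(str(int(b)) for b in chunk), 2)
        flat[i] = flat[i] | value
    return flat.reshape(sub_block.shape)
```

Embedding the recovery byte and hash bits into the zeroed LSBs this way keeps the payload localized per block, which is what enables tamper localization at the 4×4 granularity the abstract describes.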
Abstract: Deciding the numerous parameters involved in
designing a competent artificial neural network is a complicated task.
The existence of several options for selecting an appropriate
architecture for a neural network adds to this complexity, especially
when applications of heterogeneous natures are concerned.
Two completely different applications, in engineering and medical
science, were selected for the present study: prediction of
workpiece surface roughness in ultrasonic-vibration-assisted turning,
and prediction of papillomavirus oncogenicity. Several neural network
architectures with different parameters were developed for each
application and the results were compared. It was illustrated in this
paper that some applications such as the first one mentioned above
are apt to be modeled by a single network with sufficient accuracy,
whereas others such as the second application can be best modeled
by different expert networks for different ranges of output.
Development of knowledge about the essentials of neural networks
for different applications is regarded as the cornerstone of
multidisciplinary network design programs to be developed as a
means of reducing inconsistencies and the burden of the user
intervention.
Abstract: A code has been developed in Mathematica using
Direct Simulation Monte Carlo (DSMC) technique. The code was
tested for 2-D air flow around a circular cylinder. The same
geometry and flow properties were used in FLUENT 6.2 for comparison.
The results obtained from the Mathematica simulation showed close
agreement with the FLUENT calculations, hence providing insight into
the particle nature of fluid flows.
Abstract: One of the common problems encountered in software
engineering is addressing and responding to the changing nature of
requirements. While several approaches have been devised to address
this issue, ranging from instilling resistance to changing requirements
in order to mitigate the impact on project schedules, to developing an
agile mindset towards requirements, the approach discussed in this
paper is one of conceptualizing the delta in requirement and
modeling it, in order to plan a response to it. To provide some
context here, change is first formally identified and categorized as
either formal change or informal change. While agile methodology
facilitates informal change, the approach discussed in this paper
seeks to develop the idea of facilitating formal change. Collecting
and documenting meta-requirements that represent the phenomenon of
change would be a proactive measure towards building a realistic
cognition of the requirements entity that can be further harnessed in
the software engineering process.
Abstract: Land surface temperature (LST) is an important
parameter in the study of urban climate. An understanding of the
influence of biophysical factors could improve the establishment of
modeling the urban thermal landscape. It is well established that
climate holds a great influence on the urban landscape. However, it
is recognized that climate has a low priority in the urban planning
process, due to the complex nature of its influence. This study focuses on
the relatively cloud free Landsat Thematic Mapper image of the study
area, acquired on 2 March 2006. Correlation analyses were
conducted to identify the relationship of LST to the biophysical
factors, namely vegetation indices, impervious surface, and albedo,
in order to investigate the variation of LST. We suggest that the
results can be considered by stakeholders during the decision-making
process to create a cooler and more comfortable environment in the
urban landscape for city dwellers.
Abstract: In this paper we propose a segmentation approach based
on the vector quantization technique. We use Kekre's fast
codebook generation algorithm for segmenting low-altitude aerial
images. This is used as a preprocessing step to form segmented
homogeneous regions. Adjacent regions are then merged using color
similarity and volume difference criteria. Experiments
performed with real aerial images of varied nature demonstrate that
this approach does not result in over segmentation or under
segmentation. The vector quantization approach seems to give far
better results than the conventional on-the-fly watershed algorithm.
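As an illustration of the vector-quantization step, a generic codebook-based segmentation can be sketched as below. Note that a k-means-style refinement stands in here for Kekre's fast codebook generation algorithm, whose details are given in the paper; the function name and deterministic initialization are illustrative assumptions.

```python
import numpy as np


def vq_segment(pixels, k, iters=10):
    """Generic vector-quantization segmentation sketch: cluster color pixel
    vectors around k codevectors (a stand-in for Kekre's fast codebook
    generation) and label each pixel with its nearest codevector."""
    # deterministic initialization: k evenly spaced pixels as codevectors
    idx = np.linspace(0, len(pixels) - 1, k).astype(int)
    codebook = pixels[idx].astype(float)
    labels = np.zeros(len(pixels), dtype=int)
    for _ in range(iters):
        # assign each pixel to its nearest codevector
        dist = np.linalg.norm(pixels[:, None, :] - codebook[None], axis=2)
        labels = dist.argmin(axis=1)
        # recompute each codevector as the mean of its assigned pixels
        for j in range(k):
            if (labels == j).any():
                codebook[j] = pixels[labels == j].mean(axis=0)
    return labels, codebook
```

The per-pixel labels define the homogeneous regions; a merge pass over adjacent regions (by color similarity and volume difference, as in the abstract) would follow this step.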
Abstract: Fractional Fourier Transform is a powerful tool,
which is a generalization of the classical Fourier Transform. This
paper provides a mathematical relation relating the span in Fractional
Fourier domain with the amplitude and phase functions of the signal,
which is further used to study the variation of quality factor with
different values of the transform order. It is seen that with the
increase in the number of transients in the signal, the deviation of
average Fractional Fourier span from the frequency bandwidth
increases. Also, with the increase in the transient nature of the signal,
the optimum value of transform order can be estimated based on the
quality factor variation, and this value is found to be very close to
that for which one can obtain the most compact representation. With
the entire mathematical analysis and experimentation, we consolidate
the fact that the Fractional Fourier Transform gives more optimal
representations for a number of transform orders than the Fourier
transform does.
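For reference, the a-th order Fractional Fourier Transform discussed above is commonly defined as follows (this is the standard definition with rotation angle α = aπ/2, not a result specific to this paper):

```latex
F_a\{f\}(u) = \int_{-\infty}^{\infty} K_a(u,t)\, f(t)\, dt, \qquad
K_a(u,t) = \sqrt{\frac{1 - i\cot\alpha}{2\pi}}
\exp\!\left( i\,\frac{u^2 + t^2}{2}\cot\alpha \;-\; i\,ut\csc\alpha \right),
\qquad \alpha = \frac{a\pi}{2},
```

where a = 1 (α = π/2) recovers the ordinary Fourier transform, and the transform order a interpolates continuously between the time and frequency domains.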
Abstract: In this paper, we propose a dual version of the first
threshold ring signature scheme based on error-correcting codes,
proposed by Aguilar et al. in [1]. Our scheme uses an improvement of
Véron's zero-knowledge identification scheme, which provides smaller
public and private key sizes and better computational complexity than
Stern's scheme. This scheme is secure in the random oracle model.
Abstract: Over the last two decades, owing to the hostile environment
of the Internet, concerns about the confidentiality of information
have increased at a phenomenal rate. Therefore, to safeguard
information from attacks, a number of data/information hiding methods
have evolved, mostly in the spatial and transform domains. In
spatial-domain data hiding techniques, the information is embedded
directly on the image plane itself. In transform-domain data hiding
techniques, the image is first changed from the spatial domain to
some other domain, and then the secret information is embedded so
that it remains more secure against attack. Information hiding
algorithms in the time or spatial domain have high capacity but
relatively low robustness. In contrast, algorithms in transform
domains such as the DCT and DWT have a certain robustness against
some multimedia processing. In this work the authors propose a novel
steganographic method for hiding information in the transform domain
of a grayscale image. The proposed approach works by converting the
gray-level image to the transform domain using a discrete integer
wavelet technique through the lifting scheme. This approach performs
a 2-D lifting wavelet decomposition of the cover image with the
lifted Haar wavelet and computes the approximation coefficients
matrix CA and the detail coefficients matrices CH, CV, and CD. The
next step is to apply the PMM (pixel mapping method) technique to
those coefficients to form the stego image. The
aim of this paper is to propose a high-capacity image steganography
technique that uses pixel mapping method in integer wavelet domain
with acceptable levels of imperceptibility and distortion in the cover
image and high level of overall security. This solution is independent
of the nature of the data to be hidden and produces a stego image
with minimum degradation.
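The 2-D integer Haar lifting decomposition mentioned above (producing CA, CH, CV, CD) can be sketched roughly as follows. This is a minimal illustration using the S-transform form of the lifted Haar wavelet, not the authors' exact code, and the function names are assumptions.

```python
import numpy as np


def haar_lift_1d(x):
    """One level of integer Haar lifting (S-transform) along the last axis:
    predict step gives the detail d, update step gives the approximation s.
    Both outputs are integers, so the transform is exactly invertible."""
    even, odd = x[..., 0::2].astype(int), x[..., 1::2].astype(int)
    d = odd - even       # predict: difference (detail coefficients)
    s = even + (d >> 1)  # update: integer average (approximation)
    return s, d


def haar_lift_2d(img):
    """One level of 2-D integer Haar lifting: rows first, then columns,
    yielding approximation CA and detail bands CH, CV, CD."""
    lo, hi = haar_lift_1d(img)                 # along rows
    ca, ch = haar_lift_1d(lo.swapaxes(0, 1))   # along columns of low band
    cv, cd = haar_lift_1d(hi.swapaxes(0, 1))   # along columns of high band
    return (ca.swapaxes(0, 1), ch.swapaxes(0, 1),
            cv.swapaxes(0, 1), cd.swapaxes(0, 1))
```

Because every step uses only integer additions, subtractions, and shifts, the coefficients remain integers, which is what allows secret bits embedded in them (e.g. by PMM) to survive the inverse transform without rounding loss.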
Abstract: This study explores the perceptions of English as a Foreign
Language (EFL) learners on using computer-mediated communication
(CMC) technology in their learning of English. The data consist of
observations of the synchronous and asynchronous communication
participants engaged in over a period of four months, which included
online and offline communication protocols, open-ended interviews,
and reflection papers composed by participants.
Content analysis of the interview data and the written documents
listed above, as well as member checking and triangulation
techniques, are the major data analysis strategies. The findings suggest that participants
generally do not benefit from computer-mediated communication in
terms of its effect in learning a foreign language. Participants regarded
the nature of CMC as artificial, or pseudo communication that did not
aid their authentic communication skills in English. The results of
this study shed light on the insufficient and inconclusive findings
that most quantitative CMC studies have previously generated.
Abstract: This paper presents a generalized form of the
mechanistic deconvolution technique (GMD) for modeling image sensors, applicable in various pan–tilt planes of view. The mechanistic deconvolution technique (UMD) is modified with the
given angles of a pan–tilt plane of view to formulate constraint parameters and characterize distortion effects, and thereby determine
the corrected image data. As a result, no experimental setup or calibration is required. Due to the mechanistic nature of
the sensor model, the necessity for the sensor image plane to be
orthogonal to its z-axis is eliminated, and it reduces the dependency on image data. An experiment was constructed to evaluate the
accuracy of a model created by GMD and its insensitivity to changes in sensor properties and in pan and tilt angles. This was compared
with a pre-calibrated model and a model created by UMD, using two sensors with different specifications. It achieved similar accuracy
with one-seventh the number of iterations and attained a lower mean error by a factor of 2.4 when compared to the pre-calibrated and
UMD models, respectively. The model has also shown itself to be robust and, in comparison to the pre-calibrated and UMD models, improved the accuracy significantly.
Abstract: Corrosion of metallic water pipelines buried below
the ground surface is a function of the nature of the surrounding
soil and groundwater. Hence the importance of knowing the physical
and chemical characteristics of the pipe's surrounding environment.
The corrosion of externally unprotected metallic water pipelines,
especially ductile iron pipes, in localities with aggressive soil
conditions is becoming a significant problem. Anticorrosive
protection for metallic water pipelines, their fittings, and
accessories is very important, because they may be attacked by
corrosion over time. The tendency of a metallic substrate to corrode is a function of
the surface characteristics of the metal and of the metal/protective
film interface, the physical, electrical and electrochemical properties
of the film, and the nature of the environment in which the pipelines
system is placed. In this work the authors have looked at corrosion
problems of water pipelines and their control. The corrosive
properties of groundwater and soil environments are reviewed, and
parameters affecting corrosion are discussed. The purpose of this
work is to provide guidelines for materials selection in water and
soil environments, and to show how water pipelines can be protected
against metallic corrosion.
Abstract: Frequent pattern discovery over a data stream is a hard
problem because the continuously generated nature of a stream does
not allow revisiting each data element. Furthermore, the pattern discovery
process must be fast to produce timely results. Based on these
requirements, we propose an approximate approach to tackle the
problem of discovering frequent patterns over continuous stream.
Our approximation algorithm is intended to be applied to process a
stream prior to the pattern discovery process. The results of
approximate frequent pattern discovery are reported in this paper.
Abstract: Electronic systems are at the core of everyday life.
They form an integral part in financial networks, mass transit,
telephone systems, power plants and personal computers. Electronic
systems are increasingly based on complex VLSI (Very Large Scale
Integration) circuits. Electronic design automation is concerned
with the design and production of VLSI systems. The next
important step in creating a VLSI circuit is Physical Design. The
input to the physical design is a logical representation of the system
under design. The output of this step is the layout of a physical
package that optimally or near optimally realizes the logical
representation. Physical design problems are combinatorial in nature
and of large problem size. Darwin observed that, as variations are
introduced into a population with each new generation, the less-fit
individuals tend to become extinct in the competition for basic
necessities. This survival-of-the-fittest principle leads to
evolution in species. The objective of genetic algorithms (GAs) is
to find an optimal solution to a problem. Since GAs are heuristic
procedures that can function as optimizers, they are not guaranteed
to find the optimum, but they are able to find acceptable solutions
for a wide range of problems. This survey paper presents a study of
efficient algorithms for VLSI physical design and observes the
common traits of the superior contributions.
Abstract: The purpose of this study is to identify and evaluate
the scale of implementation of Just-In-Time (JIT) in the different industrial sectors in the Middle East. This study analyzes the empirical data collected by a questionnaire survey distributed to
companies in three main industrial sectors in the Middle East, which
are food, chemicals, and fabrics. The following main hypothesis is formulated and tested: the requirements of JIT application differ
according to the type of industrial sector. Descriptive statistics and box plot analysis were used to examine the hypothesis. This study indicates reasonable evidence for accepting the main hypothesis. It
reveals that there is no standard way to adopt JIT as a production system; rather, each industrial sector should concentrate its
investment on the critical requirements that differ according to the
nature and strategy of production followed in that sector.
Abstract: Because the electrical metrics that describe photovoltaic
cell performance are inherently multivariate in nature, the use of a
univariate (one-variable) statistical process control chart can have
important limitations. Development of a comprehensive process
control strategy is known to be significantly beneficial in reducing
the process variability that ultimately drives up the manufacturing
cost of photovoltaic cells. The multivariate moving average, or MMA,
chart is applied to the electrical metrics of photovoltaic cells to
illustrate the improved sensitivity to process variability that this
method of control charting offers. The results show that the ability
of the MMA chart to expand to as many variables as needed suggests
an application with multiple photovoltaic electrical metrics used in
concert to determine the process's state of control.
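A minimal sketch of the MMA statistic for p correlated metrics might look as follows. The function signature is an assumption, and the chart constants and control limits used in the actual study are not reproduced here.

```python
import numpy as np


def mma_statistics(X, mu, sigma, w):
    """Multivariate moving average (MMA) chart statistics.

    X: (n, p) array, one row of p electrical metrics per cell;
    mu: in-control mean vector; sigma: in-control covariance matrix;
    w: moving-average window width. Returns one T^2-style statistic
    per window position; points above a control limit would signal
    an out-of-control process."""
    inv = np.linalg.inv(sigma)
    stats = []
    for t in range(w - 1, len(X)):
        m = X[t - w + 1:t + 1].mean(axis=0)   # moving-average vector
        d = m - mu
        stats.append(w * float(d @ inv @ d))  # scaled Mahalanobis distance
    return np.array(stats)
```

Because the statistic collapses all p metrics into a single Mahalanobis-type distance, the same chart extends to as many electrical metrics as needed, which is the MMA property the abstract highlights.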