Abstract: The research was conducted using the self-reports of
shoplifters who were apprehended in supermarkets while stealing.
Over three years, 943 shoplifters were interviewed immediately
after the act of stealing and before the police were called. The
aim of the study is to identify the characteristics of
shoplifting in Saudi Arabia, including the traits of shoplifters
and the circumstances of the supermarkets where the stealing
takes place. The analysis was based on the written information
about each thief, following the documentary research method.
Descriptive statistics as well as some inferential statistics
were employed. The results show that there are differences
between genders, age groups, occupations, times of day, days of
the week, months, ways of stealing, individual versus group
thieves, and other supermarket circumstances in the type of
items stolen, their total price and the count of items. The
results and recommendations will serve as a guide for retailers
on where, when and whom to watch in order to prevent shoplifting.
Abstract: Rule discovery is an important technique for mining knowledge from large databases. The use of objective measures for discovering interesting rules leads to another data mining problem, although one of reduced complexity. Data mining researchers have studied subjective measures of interestingness to reduce the volume of discovered rules and ultimately improve the overall efficiency of the KDD process. In this paper we study the novelty of discovered rules as a subjective measure of interestingness. We propose a hybrid approach that uses objective and subjective measures to quantify the novelty of discovered rules in terms of their deviations from known rules. We analyze the types of deviation that can arise between two rules and categorize the discovered rules according to a user-specified threshold. We implement the proposed framework and experiment with some public datasets. The experimental results are quite promising.
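The idea of categorizing a discovered rule by its deviation from a known rule can be sketched as follows. This is a hypothetical simplification: it represents rules as (antecedent, consequent) itemsets and uses Jaccard similarity with a threshold, which stands in for the paper's actual deviation measures.

```python
def categorize(discovered, known, threshold=0.5):
    """Compare a discovered rule with a known rule; each rule is a pair
    (antecedent itemset, consequent itemset). Illustrative scheme only."""
    (da, dc), (ka, kc) = discovered, known
    ant_sim = len(da & ka) / len(da | ka)    # Jaccard similarity of antecedents
    con_sim = len(dc & kc) / len(dc | kc)    # Jaccard similarity of consequents
    if ant_sim >= threshold and con_sim >= threshold:
        return "conforming"
    if ant_sim >= threshold:
        return "unexpected consequent"
    if con_sim >= threshold:
        return "unexpected antecedent"
    return "novel"

# hypothetical known rule: {bread, butter} -> {milk}
known = (frozenset({"bread", "butter"}), frozenset({"milk"}))
label = categorize((frozenset({"tea"}), frozenset({"sugar"})), known)
```

A rule sharing neither side with any known rule falls into the "novel" category, which is the class of rules the paper singles out as most interesting.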
Abstract: An advanced Monte Carlo simulation method, called Subset Simulation (SS), is presented in this paper for time-dependent reliability prediction of underground pipelines. SS provides better resolution at low failure probability levels by efficiently investigating the rare failure events that are commonly encountered in pipeline engineering applications. In the SS method, random samples leading to progressive failure are generated efficiently and used to compute probabilistic performance through statistical variables. SS gains its efficiency by expressing a small failure probability as a product of larger conditional probabilities of a sequence of intermediate events. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of the SS application in pipe reliability assessment. It is hoped that this development work will promote the use of SS tools for uncertainty propagation in the decision-making process of underground pipeline network reliability prediction.
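The core SS idea, estimating a small failure probability as a product of conditional probabilities of intermediate levels, can be illustrated with a minimal sketch. This is not the authors' implementation: the limit state (a standard normal exceeding 3, true probability about 1.35e-3), the sample sizes, and the modified-Metropolis proposal spread are all assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(42)

def g(x):
    # illustrative limit-state response: failure when g(x) >= 3
    return x[0]

def subset_simulation(g, dim=1, n=1000, p0=0.1, threshold=3.0, max_levels=10):
    samples = rng.standard_normal((n, dim))
    vals = np.array([g(s) for s in samples])
    prob = 1.0
    nc = int(p0 * n)                         # seed chains per level
    for _ in range(max_levels):
        order = np.argsort(vals)[::-1]       # sort descending by response
        if vals[order[nc - 1]] >= threshold:
            # enough samples already in the failure domain: finalize
            return prob * np.mean(vals >= threshold)
        b = vals[order[nc - 1]]              # intermediate threshold (p0-quantile)
        prob *= p0
        seeds, seed_vals = samples[order[:nc]], vals[order[:nc]]
        chain_len = n // nc
        new_samples, new_vals = [], []
        # modified Metropolis: sample conditionally on g(x) >= b
        for x, gx in zip(seeds, seed_vals):
            for _ in range(chain_len):
                cand = x + 0.8 * rng.standard_normal(dim)
                # accept w.r.t. the standard normal density (symmetric proposal)
                if rng.random() < np.exp(0.5 * (x @ x - cand @ cand)):
                    gcand = g(cand)
                    if gcand >= b:           # stay inside the intermediate domain
                        x, gx = cand, gcand
                new_samples.append(x.copy())
                new_vals.append(gx)
        samples, vals = np.array(new_samples), np.array(new_vals)
    return prob * np.mean(vals >= threshold)

p_hat = subset_simulation(g)
```

Each level conditions on the previous intermediate event, so only a few thousand samples are needed where crude Monte Carlo would need millions for the same rare-event resolution.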
Abstract: A large number of chemical, bio-chemical and pollution-control processes use heterogeneous fixed-bed reactors. The use of finite hollow cylindrical catalyst pellets can enhance conversion levels in such reactors. The absence of the pellet core can significantly lower the diffusional resistance associated with the solid phase. This leads to a better utilization of the catalytic material, which is reflected in the higher values for the effectiveness factor, leading ultimately to an enhanced conversion level in the reactor. It is however important to develop a rigorous heterogeneous model for the reactor incorporating the two-dimensional feature of the solid phase owing to the presence of the finite hollow cylindrical catalyst pellet. Presently, heterogeneous models reported in the literature invariably employ one-dimensional solid-phase models meant for spherical catalyst pellets. The objective of the paper is to present a rigorous model of fixed-bed reactors containing finite hollow cylindrical catalyst pellets. The reaction kinetics considered here is the widely used Michaelis–Menten kinetics for liquid-phase bio-chemical reactions. The reaction parameters used here are for the enzymatic degradation of urea. Results indicate that increasing the height-to-diameter ratio helps to improve the conversion level. On the other hand, decreasing the thickness is apparently not as effective. This can however be explained in terms of the higher void fraction of the bed, which causes a smaller amount of the solid phase to be packed in the fixed-bed bio-chemical reactor.
Abstract: Nowadays it is a trend for electronic circuit designers to
integrate all system components on a single chip. This paper proposes
the design of a single-chip proportional-to-absolute-temperature
(PTAT) sensor, including a voltage reference circuit, using CEDEC
0.18 µm CMOS technology. It is a challenge to design a single-chip
temperature sensor with a wide-range linear response for many
applications. The channel widths of the compensation transistor and
the reference transistor are critical to the design of the PTAT
temperature sensor circuit. The designed temperature sensor shows
excellent linearity between -100 °C and 200 °C, and the sensitivity
is about 0.05 mV/°C. The chip is designed to operate with a single
voltage source of 1.6 V.
Abstract: Vehicle suspension design must satisfy
several conflicting criteria. Among these is ride
comfort, which is attained by minimizing the
acceleration transmitted to the sprung mass via the
suspension spring and damper. Good handling is also a
desirable property, but it requires a stiff suspension
and is therefore in conflict with good ride. Another
desirable feature of a suspension is the minimization
of its maximum travel. This travel, called suspension
working space in the vehicle dynamics literature, is
also a design constraint and favors good ride. In this
research a full-car model with 8 degrees of freedom
has been developed, and the three above-mentioned
criteria, namely ride, handling and working space,
have been adopted as objective functions. The
Multi-Objective Programming (MOP) discipline has been
used to find the Pareto front, and some reasoning has
been used to choose a design point among these
non-dominated points of the Pareto front.
Abstract: The dynamic contouring error is a critical element in the accuracy of machine tools. The contouring error is defined as the difference between the actual processed path and the commanded path, which is realized by following the command curves of the feed drive system in machine tools. The contouring error results from various factors, such as external loads, friction, moment of inertia, feed rate, speed control, and servo control. Thus, this study proposes a 2D compensating system for the contouring accuracy of machine tools. An optical method is adopted, using a frequency-stabilized laser diode and a high-precision position sensitive detector (PSD) to perform non-contact measurement. Results show that the related accuracy of the PSD in the 2D contouring accuracy compensating system was ±1.5 μm over a calculated range of ±3 mm, and the accuracy improvement is over 80% at high-speed feed rates.
Abstract: Resins are used in nuclear power plants for water
ultrapurification. Two approaches are considered in this work:
column experiments and simulations. A software tool called OPTIPUR
was developed, tested and used. The approach simulates
one-dimensional reactive transport in a porous medium with
convective-dispersive transport between particles and diffusive
transport within the boundary layer around the particles. The
transfer limitation in the boundary layer is characterized by the
mass transfer coefficient (MTC). The influences on the MTC were
measured experimentally: variation of the inlet concentration does
not influence the MTC, whereas the Darcy velocity does. This is
consistent with results obtained using the correlation of Dwivedi &
Upadhyay. With the MTC, and knowing the number of exchange sites
and the relative affinity, OPTIPUR can simulate the column outlet
concentration versus time. The duration of use of the resins can
then be predicted under conditions of binary exchange.
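The Dwivedi & Upadhyay packed-bed correlation mentioned above can be sketched as a small helper; note that it predicts exactly the behaviour reported, the MTC depends on the Darcy (superficial) velocity but not on the inlet concentration. The bed and fluid parameter values below are illustrative assumptions, not the paper's data.

```python
def mtc_dwivedi_upadhyay(u, d_p, eps, rho, mu, D):
    """Mass transfer coefficient k [m/s] in a packed bed from the
    Dwivedi & Upadhyay correlation: eps*jD = 0.765/Re^0.82 + 0.365/Re^0.386,
    with jD = (k/u) * Sc^(2/3)."""
    Re = rho * u * d_p / mu            # particle Reynolds number
    Sc = mu / (rho * D)                # Schmidt number
    jD = (0.765 / Re**0.82 + 0.365 / Re**0.386) / eps
    return jD * u * Sc ** (-2.0 / 3.0)

# illustrative resin-bed values (assumed): water at room temperature,
# 0.6 mm resin beads, void fraction 0.4, Darcy velocity 1 cm/s
k = mtc_dwivedi_upadhyay(u=0.01, d_p=6e-4, eps=0.4,
                         rho=1000.0, mu=1e-3, D=2e-9)
```

Doubling `u` changes `Re` and hence `jD`, so the returned MTC shifts with the Darcy velocity, while the inlet concentration never enters the formula.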
Abstract: Software reliability prediction provides a great opportunity to measure the software failure rate at any point throughout system test. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software. Various approaches can be used to improve the reliability of software. In this article we focus on a software reliability model, assuming that there is a time redundancy whose value (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the given mathematical model under the assumption that the system may experience not only irreversible failures but also failures that can be regarded as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) of the transmission of an instruction sequence that consists of a random number of basic blocks. We consider the system software unreliable; the time between adjacent failures has an exponential distribution.
Abstract: The world economic crises and budget constraints
have caused authorities, especially those in developing countries, to
rationalize water quality monitoring activities. Rationalization
consists of reducing the number of monitoring sites, the number of
samples, and/or the number of water quality variables measured. The
reduction in water quality variables is usually based on correlation. If
two variables exhibit high correlation, it is an indication that some of
the information produced may be redundant. Consequently, one
variable can be discontinued, and the other continues to be measured.
Later, the ordinary least squares (OLS) regression technique is
employed to reconstitute information about the discontinued variable by
using the continuously measured one as an explanatory variable. In
this paper, two record extension techniques are employed to
reconstitute information about discontinued water quality variables,
the OLS and the Line of Organic Correlation (LOC). An empirical
experiment is conducted using water quality records from the Nile
Delta water quality monitoring network in Egypt. The record
extension techniques are compared for their ability to predict
different statistical parameters of the discontinued variables. Results
show that the OLS is better at estimating individual water quality
records. However, results indicate an underestimation of the variance
in the extended records. The LOC technique is superior in preserving
characteristics of the entire distribution and avoids underestimation
of the variance. It is concluded from this study that the OLS can be
used for the substitution of missing values, while LOC is preferable
for inferring statements about the probability distribution.
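The contrast between the two extension techniques can be demonstrated on synthetic data (the records below are simulated stand-ins, not the Nile Delta data). OLS shrinks the variance of the extended record by the correlation coefficient, while LOC uses the slope sign(r)·s_y/s_x and so reproduces the spread of the discontinued variable.

```python
import numpy as np

rng = np.random.default_rng(1)

def ols_fit(x, y):
    # ordinary least squares slope and intercept
    b = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    return y.mean() - b * x.mean(), b

def loc_fit(x, y):
    # Line of Organic Correlation: slope = sign(r) * sd(y)/sd(x)
    r = np.corrcoef(x, y)[0, 1]
    b = np.sign(r) * y.std() / x.std()
    return y.mean() - b * x.mean(), b

# synthetic concurrent record of two correlated water-quality variables
n = 500
x = rng.normal(10.0, 2.0, n)                  # continued variable
y = 3.0 + 0.8 * x + rng.normal(0.0, 1.5, n)   # discontinued variable

(a_ols, b_ols), (a_loc, b_loc) = ols_fit(x, y), loc_fit(x, y)

# extend the discontinued record from new observations of x
x_new = rng.normal(10.0, 2.0, 2000)
y_ols = a_ols + b_ols * x_new
y_loc = a_loc + b_loc * x_new
```

Comparing `y_ols.std()` and `y_loc.std()` against `y.std()` reproduces the abstract's conclusion: the OLS extension underestimates the variance, while the LOC extension preserves it.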
Abstract: Complex networks have been intensively studied across
many fields, especially in Internet technology, biological engineering,
and nonlinear science. Software is built up out of many interacting
components at various levels of granularity, such as functions, classes,
and packages, representing another important class of complex networks.
It can also be studied using complex network theory. Over the
last decade, many papers on the interdisciplinary research between
software engineering and complex networks have been published.
This body of work provides a different dimension to our
understanding of software and is also very useful for the design
and development of software systems. This paper explores how
complex network theory can be used to analyze software structure,
and briefly reviews the main advances in the corresponding aspects.
Abstract: The existing literature on design reasoning seems to give
one-sided accounts of expert design behaviour based on internal
processing. In the same way, ecological theories seem to focus
one-sidedly on external elements, resulting in the lack of a
unifying design cognition theory. Although current extended design
cognition studies acknowledge the intellectual interaction between
internal and external resources, there still seems to be
insufficient understanding of the complexities involved in such
interactive processes. As such, this paper proposes a novel
multi-directional model for design researchers to map the complex
and dynamic conduct-controlling behaviour, in which both the
computational and ecological perspectives are integrated in a
vertical manner. A clear distinction between identified intentional
and emerging physical drivers, and the relationships between them
during the early phases of experts' design process, is demonstrated
by presenting a case study in which the model was employed.
Abstract: This paper presents a detailed investigation of the
thermal-hydraulic characteristics of the flow field in a fuel rod
model, especially near the spacer. The investigated area represents
a source of information on the velocity flow field, vortices, and
the amount of heat transfer into the coolant, all of which are
critical for the design and improvement of fuel rods in nuclear
power plants. The flow field investigation uses three-dimensional
Computational Fluid Dynamics (CFD) with the Reynolds stress
turbulence model (RSM). The fuel rod model incorporates a vertical
annular channel in which three different shapes of spacers are
used; each spacer shape is addressed individually. These spacers
are mutually compared with respect to the heat transfer
capabilities between the coolant and the fuel rod model. The
results are complemented with the calculated heat transfer
coefficient at the location of the spacer and along the
stainless-steel pipe.
Abstract: The recycling of concrete, bricks and masonry rubble
as concrete aggregates is an important way to contribute to a
sustainable material flow. However, there are still various
uncertainties limiting the widespread use of Recycled Concrete
Aggregates (RCA). The fluctuations in the composition of graded
recycled aggregates and their influence on the properties of fresh
and hardened concrete are of particular concern regarding the use
of RCA. Most problems that occur when using recycled concrete
aggregates are due to their higher porosity, and hence higher water
absorption, lower mechanical strength, and residual impurities on
the surface of the RCA that form a weaker bond between cement paste
and aggregate. Consequently, the reuse of RCA is still limited. An
efficient polymer-based treatment is proposed in order to make the
reuse of RCA easier. Silicon-based polymer treatments of RCA were
carried out and compared. Such treatment can improve the properties
of RCA; for example, the rate of water absorption of treated RCA is
significantly reduced.
Abstract: The major problem that wireless communication
systems undergo is multipath fading caused by scattering of the
transmitted signal. However, we can treat multipath propagation as
multiple channels between the transmitter and receiver to improve
the signal-to-scattering-noise ratio. While using Single Input
Multiple Output (SIMO) systems, the diversity receivers extract
multiple signal branches or copies of the same signal received from
different channels and apply gain combining schemes such as Root
Mean Square Gain Combining (RMSGC). RMSGC asymptotically
yields an identical performance to that of the theoretically optimal
Maximum Ratio Combining (MRC) for values of mean Signal-to-
Noise-Ratio (SNR) above a certain threshold value without the need
for SNR estimation. This paper introduces an improvement of
RMSGC based on two different techniques: we found that
post-detection combining and de-noising of the received signals
improve the performance of RMSGC and lower the threshold SNR.
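The gain-combining setting can be illustrated with a toy BPSK diversity simulation. The RMS-based weighting below is one plausible reading of RMSGC, each co-phased branch is weighted by its measured RMS received amplitude as a proxy for the channel gain, avoiding explicit SNR estimation; the fading model, SNR, and all other parameters are assumptions for demonstration, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(7)
branches, n, snr_db = 4, 10_000, 5

s = rng.choice([-1.0, 1.0], size=n)                          # BPSK symbols
# block Rayleigh fading: one complex gain per branch
h = (rng.standard_normal((branches, 1)) +
     1j * rng.standard_normal((branches, 1))) / np.sqrt(2)
sigma = 10 ** (-snr_db / 20)                                 # noise amplitude
w = sigma * (rng.standard_normal((branches, n)) +
             1j * rng.standard_normal((branches, n))) / np.sqrt(2)
r = h * s + w                                                # received branches

phase = np.conj(h) / np.abs(h)                               # co-phasing (assumed known)

# MRC: weight each co-phased branch by its channel amplitude
mrc_out = np.real(np.sum(np.abs(h) * phase * r, axis=0))

# RMS-based weights: no SNR estimation, just each branch's RMS amplitude
rms_w = np.sqrt(np.mean(np.abs(r) ** 2, axis=1, keepdims=True))
rms_out = np.real(np.sum(rms_w * phase * r, axis=0))

ber = lambda z: np.mean(np.sign(z) != s)                     # bit error rate
```

At moderate-to-high SNR the RMS amplitude of each branch is dominated by the channel gain, so the RMS weights approach the MRC weights, which matches the abstract's claim of asymptotically identical performance above a threshold SNR.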
Abstract: To investigate the possible correlation between peer aggression and peer victimization, 148 sixth-graders were asked to respond to the Reduced Aggression and Victimization Scales (RAVS). The RAVS measures the frequency of reported aggressive behaviors, or of being victimized, during the week prior to the survey. The scales are composed of six items each, and each point represents one instance of aggression or victimization. The Pearson Product-Moment Correlation Coefficient (PMCC) was used to determine the correlations between the scores of the sixth-graders on the two scales, both for individual items and for total scores. Positive correlations were established, and the correlations were significant at the 0.01 level.
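The PMCC computation used in the study is straightforward to sketch. The scores below are hypothetical examples in the scales' 0-6 range, not the study's data.

```python
def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# hypothetical per-pupil totals on the two six-item scales
aggression    = [2, 0, 5, 1, 3, 4, 0, 2]
victimization = [3, 1, 6, 0, 2, 5, 1, 2]
r = pearson_r(aggression, victimization)
```

A positive `r` close to 1 would indicate, as in the study, that pupils who report more aggression also tend to report more victimization.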
Abstract: This research aims to examine the key success factors
for the diffusion of mobile entertainment services in Malaysia. The
drivers and barriers observed in this research include perceived
benefit; concerns pertaining to pricing, product and technological
standardization, privacy and security; as well as influences from
peers and community. An analysis of a Malaysian survey of 384
respondents aged 18 to 25 years shows that subscribers placed
greater importance on perceived benefit of mobile entertainment
services compared to other factors. Results of the survey also show
that there are strong positive correlations between all the factors,
with pricing issue–perceived benefit showing the strongest
relationship. This paper aims to provide an extensive study on the
drivers and barriers that could be used to derive architecture for
entertainment service provision to serve as a guide for telcos to
outline suitable approaches in order to encourage mass market
adoption of mobile entertainment services in Malaysia.
Abstract: Project managers are ultimately responsible for the
overall characteristics of a project, i.e. they should deliver the
project on time with minimum cost and maximum quality. It is vital
for any manager to decide on a trade-off between these conflicting
objectives, and they would benefit from any scientific
decision-support tool. Our work tries to determine a set of optimal
solutions (rather than a single optimal solution) from which the
project manager can select the desirable choice to run the project.
In this paper, the project scheduling problem notated as
(1,T|cpm,disc,mu|curve:quality,time,cost) is studied. The problem
is multi-objective, and the purpose is to find the Pareto-optimal
front of time, cost and quality of a project
(curve:quality,time,cost) whose activities belong to a
start-to-finish activity relationship network (cpm) and can be
performed in different possible modes (mu) which are non-continuous
or discrete (disc), where each mode has a different cost, time and
quality. The project is constrained by a non-renewable resource,
i.e. money (1,T). Because the problem is NP-hard, a meta-heuristic
is developed to solve it, based on a version of the genetic
algorithm specially adapted to multi-objective problems, namely
FastPGA. A sample project with 30 activities is generated and then
solved by the proposed method.
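The notion of a Pareto-optimal front over (time, cost, quality) can be sketched as follows. Quality is negated so that all three objectives are minimized; the candidate schedules are hypothetical, and the dominance filter stands in for the non-dominated sorting that FastPGA-style multi-objective genetic algorithms perform each generation.

```python
def dominates(a, b):
    """True if schedule a dominates b: no worse in every objective
    and strictly better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated points."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# hypothetical schedules as (time, cost, -quality) tuples
schedules = [(10, 100, -0.9), (12, 80, -0.8), (10, 100, -0.7),
             (15, 60, -0.6), (9, 120, -0.95)]
front = pareto_front(schedules)
```

The manager then picks one point from `front` according to preference; every point not on the front is strictly worse than some alternative in all three respects and can be discarded.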
Abstract: In this paper, a Biochemical Methane Potential (BMP)
test provides a measure of the energy production potential from
co-digestion of frozen seafood wastewater and decanter cake. The
experiments were conducted at laboratory scale. The suitable ratio
of frozen seafood wastewater to decanter cake was determined in
the BMP test. The ratio of the co-digestion of frozen seafood
wastewater and decanter cake has an impact on biogas production
and energy production potential. The best energy production
potential in the BMP test was observed for a ratio of 180 ml of
frozen seafood wastewater to 10 g of decanter cake. This ratio
provided the maximum methane production of 0.351 l CH4/g TCOD
removed. The removal efficiencies were 76.18%, 83.55%, 43.16% and
56.76% for TCOD, SCOD, TS and VS, respectively. It can be
concluded that decanter cake can improve the energy production
potential of frozen seafood wastewater. The energy provided by
co-digestion of frozen seafood wastewater and decanter cake is
approximately 19x10^9 MJ/year in Thailand.
Abstract: One important objective in Precision Agriculture is to minimize the volume of herbicides applied to fields through the use of site-specific weed management systems. In order to reach this goal, two major factors need to be considered: 1) the similar spectral signature, shape and texture of weeds and crops; 2) the irregular distribution of the weeds within the crop field. This paper outlines an automatic computer vision system for the detection and differential spraying of Avena sterilis, a noxious weed growing in cereal crops. The proposed system involves two processes: image segmentation and decision making. Image segmentation combines basic suitable image processing techniques in order to extract cells from the image as the low-level units. Each cell is described by two area-based attributes measuring the relations between the crops and the weeds. From these attributes, a hybrid decision-making approach determines whether or not a cell must be sprayed. The hybrid approach uses the Support Vector Machines and Fuzzy k-Means methods, combined through fuzzy aggregation theory. This constitutes the main contribution of this paper. The method's performance is compared against other available strategies.
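The hybrid decision step can be sketched as follows. The cluster centers, the cell attributes, the SVM score, and the simple averaging aggregator are all illustrative stand-ins; the paper's actual SVM training and fuzzy aggregation operator may differ.

```python
import numpy as np

def fuzzy_memberships(x, centers, m=2.0):
    """Fuzzy k-Means membership of sample x to each center (fuzzifier m)."""
    d = np.linalg.norm(centers - x, axis=1) + 1e-12   # avoid division by zero
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum()

def aggregate(svm_score, weed_membership):
    """Map the SVM decision value to [0, 1] and average the two degrees
    (a simple fuzzy aggregation; the paper's operator is an assumption here)."""
    svm_degree = 1.0 / (1.0 + np.exp(-svm_score))
    return 0.5 * (svm_degree + weed_membership)

centers = np.array([[0.2, 0.1],    # assumed "crop" prototype in attribute space
                    [0.7, 0.6]])   # assumed "weed" prototype
cell = np.array([0.65, 0.55])      # the two area-based attributes of one cell
mem = fuzzy_memberships(cell, centers)
spray = aggregate(svm_score=1.2, weed_membership=mem[1]) > 0.5
```

A cell is sprayed only when the aggregated degree of the two classifiers exceeds the decision threshold, which is the spirit of combining SVM and Fuzzy k-Means through fuzzy aggregation.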