Abstract: This paper proposes a smart design strategy for a sequential detector that reliably detects the primary user's signal, especially in fast fading environments. We study the computation of the log-likelihood ratio for coping with fast-changing received-signal and noise sample variances, which are treated as random variables. First, we analyze the detectability of the conventional generalized log-likelihood ratio (GLLR) scheme when the statistics of the unknown parameters change rapidly due to fast fading. Secondly, we propose an efficient sensing algorithm that performs the sequential probability ratio test robustly and efficiently when the channel statistics are unknown. Finally, the proposed scheme is compared to the conventional method through simulation results with respect to the average number of samples required to reach a detection decision.
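The sequential probability ratio test underlying this comparison can be sketched generically. The following is a minimal illustration of Wald's SPRT for deciding between two known variances on zero-mean Gaussian samples; it is not the paper's GLLR scheme for unknown, fast-fading statistics, and the thresholds and variances are illustrative assumptions.

```python
import math

def sprt(samples, var0, var1, alpha=0.01, beta=0.01):
    """Wald's SPRT deciding between H0: x ~ N(0, var0) (noise only) and
    H1: x ~ N(0, var1) (signal present).  alpha and beta are the target
    false-alarm and missed-detection probabilities.  Returns (decision,
    samples used); decision is None if the data ran out undecided."""
    upper = math.log((1 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    llr = 0.0
    for n, x in enumerate(samples, 1):
        # per-sample log-likelihood ratio of the two Gaussian densities
        llr += 0.5 * math.log(var0 / var1) + 0.5 * x * x * (1 / var0 - 1 / var1)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return None, len(samples)

# strong samples (|x| = 2 against noise variance 1) trigger H1 within a few samples
decision, n_used = sprt([2.0] * 100, var0=1.0, var1=4.0)  # → ("H1", 6)
```

The average of `n_used` over many runs is the "average number of samples to decision" criterion that the abstract uses to compare schemes.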
Abstract: Rule discovery is an important technique for mining knowledge from large databases. The use of objective measures for discovering interesting rules leads to another data mining problem, although of reduced complexity. Data mining researchers have studied subjective measures of interestingness to reduce the volume of discovered rules and ultimately improve the overall efficiency of the KDD process. In this paper we study the novelty of the discovered rules as a subjective measure of interestingness. We propose a hybrid approach that uses objective and subjective measures to quantify the novelty of the discovered rules in terms of their deviations from the known rules. We analyze the types of deviation that can arise between two rules and categorize the discovered rules according to a user-specified threshold. We implement the proposed framework and experiment with some public datasets. The experimental results are quite promising.
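The idea of deviation between a discovered rule and a known rule can be illustrated with a toy categorisation. The rule representation and the category names below are assumptions for illustration, not the authors' exact taxonomy or thresholds.

```python
def deviation(new_rule, known_rule):
    """Illustrative categorisation of how a discovered rule deviates from
    a known rule.  Each rule is an (antecedent, consequent) pair of
    frozensets of items.  Category names are hypothetical placeholders."""
    (a1, c1), (a2, c2) = new_rule, known_rule
    if a1 == a2 and c1 == c2:
        return "conforming"              # no deviation: rule already known
    if a1 == a2:
        return "unexpected consequent"   # same condition, new conclusion
    if c1 == c2:
        return "unexpected antecedent"   # same conclusion, new condition
    return "novel"                       # deviates on both sides

known = (frozenset({"bread", "butter"}), frozenset({"milk"}))
found = (frozenset({"bread", "butter"}), frozenset({"jam"}))
# deviation(found, known) → "unexpected consequent"
```

A threshold-based scheme, as the abstract describes, would additionally grade partial overlaps of the item sets rather than only exact matches.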
Abstract: A motion protection system is designed for a parallel motion platform with a subsided cabin. Owing to its complex structure, a parallel mechanism is prone to interference problems, including link-length limits, joint limits, and self-collision. Thus, a virtual spring algorithm in operational space is developed for the motion protection system to avoid potential damage caused by interference. Simulation results show that the proposed motion protection system can effectively eliminate interference problems and ensure the safety of the whole motion platform.
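The core idea of a virtual spring constraint can be sketched in one dimension: when a coordinate leaves its safe range, a restoring force proportional to the violation pushes it back. The scalar form and the gain below are illustrative assumptions; the paper's formulation acts in the operational space of the full parallel mechanism.

```python
def virtual_spring_force(q, q_min, q_max, k=100.0):
    """One-dimensional sketch of a virtual-spring protection force for a
    coordinate q (e.g. an actuator length) with safe range [q_min, q_max].
    Outside the range, a spring force proportional to the violation is
    applied; inside the range, no force is applied."""
    if q < q_min:
        return k * (q_min - q)     # push up, back above the lower limit
    if q > q_max:
        return -k * (q - q_max)    # push down, back below the upper limit
    return 0.0

# a link 0.5 units below its lower limit feels a restoring force of +50
force = virtual_spring_force(0.5, q_min=1.0, q_max=2.0)  # → 50.0
```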
Abstract: An advanced Monte Carlo simulation method, called Subset Simulation (SS), is presented in this paper for time-dependent reliability prediction of underground pipelines. SS provides better resolution at low failure probability levels by efficiently investigating the rare failure events commonly encountered in pipeline engineering applications. In the SS method, random samples leading to progressive failure are generated efficiently and used to compute probabilistic performance through statistical variables. SS gains its efficiency by expressing a small-probability event as a product of a sequence of intermediate events with larger conditional probabilities. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of the SS application in pipe reliability assessment. It is hoped that this development work can promote the use of SS tools for uncertainty propagation in the decision-making process of underground pipeline network reliability prediction.
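The product-of-conditional-probabilities idea behind Subset Simulation can be sketched in a few dozen lines. The version below estimates a rare exceedance probability for i.i.d. standard normal inputs using a simple Metropolis random walk between levels; the sample sizes, proposal, and limit-state function are illustrative assumptions, not the paper's pipeline model.

```python
import math, random

def subset_simulation(g, dim, y_crit, n=1000, p0=0.1, seed=0):
    """Minimal Subset Simulation sketch: estimates p_F = P(g(X) >= y_crit)
    for X a vector of i.i.d. standard normals, by expressing the rare
    event as a product of conditional probabilities of nested
    intermediate events {g(X) >= b_k}."""
    rng = random.Random(seed)
    nk = int(p0 * n)                     # number of seeds kept per level
    x = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n)]
    p = 1.0
    for _ in range(30):                  # cap on the number of levels
        scored = sorted(((g(s), s) for s in x), key=lambda t: t[0])
        b = scored[n - nk][0]            # p0-exceedance threshold
        if b >= y_crit:                  # final level: count true failures
            return p * sum(1 for v, _ in scored if v >= y_crit) / n
        p *= p0                          # one more conditional factor
        seeds = [s for _, s in scored[n - nk:]]
        x = []
        for s in seeds:                  # Metropolis chains conditioned
            cur = list(s)                # on staying inside {g >= b}
            for _ in range(n // nk):
                cand = [c + rng.gauss(0, 1) for c in cur]
                ratio = math.exp(-0.5 * (sum(c * c for c in cand)
                                         - sum(c * c for c in cur)))
                if rng.random() < ratio and g(cand) >= b:
                    cur = cand           # move only if still in the event
                x.append(list(cur))
    return p                             # level cap reached (unexpected)

# rare event: Z1 + Z2 >= 6 for standard normals (exact p ≈ 1.1e-5)
est = subset_simulation(lambda v: v[0] + v[1], dim=2, y_crit=6.0)
```

Crude Monte Carlo at this probability level would need millions of samples for a single expected hit, whereas the nested levels here each use only `n` samples.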
Abstract: Color image quantization (CQ) is an important problem in computer graphics and image processing. The aim of quantization is to reduce the number of colors in an image with minimum distortion. Clustering is a widely used technique for color quantization; all colors in an image are grouped into small clusters. In this paper, we propose a new hybrid approach for color quantization using the firefly algorithm (FA) and the K-means algorithm. The firefly algorithm is a swarm-based algorithm that can be used for solving optimization problems. The proposed method can overcome the drawbacks of both algorithms, such as the local-optimum convergence problem of K-means and the premature convergence of the firefly algorithm. Experiments on three commonly used images and the comparison results show that the proposed algorithm surpasses both the baseline K-means clustering and the original firefly algorithm.
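The hybrid idea, a global firefly search over candidate palettes followed by local K-means refinement, can be sketched as follows. All parameters (swarm size, attractiveness, randomization) and the toy pixel data are illustrative assumptions, not the paper's tuned settings.

```python
import math, random

def mse(pixels, palette):
    """Mean squared quantization error: each pixel mapped to its nearest palette colour."""
    return sum(min(sum((a - b) ** 2 for a, b in zip(p, c)) for c in palette)
               for p in pixels) / len(pixels)

def kmeans_refine(pixels, palette, iters=10):
    """Plain Lloyd iterations starting from the supplied palette."""
    palette = [list(c) for c in palette]
    for _ in range(iters):
        clusters = [[] for _ in palette]
        for p in pixels:
            i = min(range(len(palette)),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, palette[j])))
            clusters[i].append(p)
        for i, cl in enumerate(clusters):
            if cl:  # empty clusters keep their old centroid
                palette[i] = [sum(ch) / len(cl) for ch in zip(*cl)]
    return palette

def firefly_kmeans(pixels, k, n_fireflies=8, gens=20,
                   alpha=8.0, beta0=1.0, gamma=1e-4, seed=42):
    """Hybrid sketch: each firefly is a k-colour palette; brighter means
    lower quantization error; dimmer flies move toward brighter ones,
    and the best palette found is refined with K-means."""
    rng = random.Random(seed)
    flies = [[list(rng.choice(pixels)) for _ in range(k)]
             for _ in range(n_fireflies)]
    for _ in range(gens):
        bright = [-mse(pixels, f) for f in flies]
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if bright[j] > bright[i]:
                    # attraction decays with squared palette distance
                    r2 = sum((a - b) ** 2
                             for ci, cj in zip(flies[i], flies[j])
                             for a, b in zip(ci, cj))
                    beta = beta0 * math.exp(-gamma * r2)
                    for ci, cj in zip(flies[i], flies[j]):
                        for d in range(3):
                            ci[d] += beta * (cj[d] - ci[d]) + alpha * (rng.random() - 0.5)
            bright[i] = -mse(pixels, flies[i])
    best = min(flies, key=lambda f: mse(pixels, f))
    return kmeans_refine(pixels, best)

# toy "image": two colour clusters, near black and near pure red
rng = random.Random(0)
pixels = ([[rng.uniform(-10, 10) for _ in range(3)] for _ in range(100)] +
          [[255 + rng.uniform(-10, 0), rng.uniform(0, 10), rng.uniform(0, 10)]
           for _ in range(100)])
palette = firefly_kmeans(pixels, k=2)
```

The firefly stage supplies a good global starting palette, which is what lets the K-means stage avoid its usual sensitivity to initialization.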
Abstract: In this article, the accumulated results on the effects and duration of manufacturing and production projects at the university and research level are combined with a working definition of the project management process, in order to arrive at a suitable pattern for the "time and action" stages. Studies show that many problems confronting researchers in these projects stem from the lack of: 1) autonomous timing for gathering the educational material, 2) autonomous timing for planning and design, presented before construction, and 3) autonomous timing for manufacture and presentation of a sample of the output. The result of this study indicates that every manufacturing and production project should be divided into three smaller autonomous projects, each with its own kind, budget, and expenditure, together with the shape and ordering of the stages for managing these kinds of projects. In this case study, real results are compared with theoretical results.
Abstract: Empty Fruit Bunches (EFB) and Palm Oil Mill
Effluent (POME) are two main wastes from oil palm industries which
contain rich lignocellulose. Degradation of EFB and POME by
microorganisms will produce hydrolytic enzyme which will degrade
cellulose and hemicellulose during composting process. However,
normal composting takes about four to six months to reach maturity.
Hence, application of fungi into compost can shorten the period of
composting. This study identifies the effect of xylanase and cellulase
produced by Aspergillus niger and Trichoderma virens on
composting process using EFB and POME. The degradation of EFB
and POME indicates the lignocellulolytic capacity of Aspergillus niger and Trichoderma virens, with more than a 7% decrease in hemicellulose and more than a 25% decrease in cellulose for both inoculated composts. Inoculation of Aspergillus niger and Trichoderma virens also increased the enzyme activities during the composting period by 21% compared to the control compost, for both xylanase and cellulase. A rapid rise in the activities of cellulase and xylanase was observed for Aspergillus niger, with the highest activities of 14.41 FPU/mg and 3.89 IU/mg, respectively. Increased activities of cellulase and xylanase also occurred with the inoculation of Trichoderma virens, with the highest activities of 13.21 FPU/mg and 4.43 IU/mg, respectively. Therefore, it is evident that the inoculation of fungi can increase the enzyme activities, hence effectively degrading the EFB and POME.
Abstract: Pretreatment of lignocellulosic biomass materials from
poplar, acacia, oak, and fir with different ionic liquids (ILs)
containing 1-alkyl-3-methyl-imidazolium cations and various anions
has been carried out. The dissolved cellulose from biomass was
precipitated by adding anti-solvents into the solution and vigorous
stirring. Commercial cellulases Celluclast 1.5L and Accelerase 1000
have been used for hydrolysis of untreated and pretreated
lignocellulosic biomass. Among the tested ILs, [Emim]COOCH3
showed the best efficiency, resulting in the highest amount of liberated reducing sugars. Glycerol-ionic liquid combined pretreatment and dilute acid-ionic liquid combined pretreatment of lignocellulosic biomass were evaluated and compared with glycerol pretreatment, ionic liquid pretreatment, and dilute acid pretreatment.
Abstract: A wireless sensor network (WSN) consists of a set of battery-powered nodes which collaborate to perform sensing tasks in a given environment. Each node in a WSN should be capable of operating for long periods of time with little or no external management. One requirement for this independence is that, in the presence of adverse conditions, the sensor nodes must be able to configure themselves. Hence, to determine the existence of unusual events in their surroundings, the nodes should make use of position-awareness mechanisms. This work approaches the problem by treating the possible unusual events as diseases, thus making it possible to diagnose them through their symptoms, namely, their side effects. Considering these awareness mechanisms as a foundation for high-level monitoring services, this paper also shows how they are included in the initial design of an intrusion detection system.
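The disease-and-symptom metaphor can be illustrated with a toy diagnosis table: each observed symptom narrows the set of candidate events to those consistent with all symptoms. The symptom names and event signatures below are hypothetical placeholders, not the paper's actual monitoring attributes.

```python
# hypothetical symptom → candidate-event signatures, for illustration only
SIGNATURES = {
    "battery_drain_fast": {"node_failure", "jamming"},
    "packet_loss_high":   {"jamming", "link_failure"},
    "rssi_drop":          {"link_failure"},
}

def diagnose(symptoms):
    """Intersect the candidate events of every observed symptom; an
    unknown symptom, or an inconsistent set, yields no diagnosis."""
    candidates = None
    for s in symptoms:
        events = SIGNATURES.get(s, set())
        candidates = events if candidates is None else candidates & events
    return candidates or set()

# both symptoms together are only consistent with jamming
events = diagnose(["battery_drain_fast", "packet_loss_high"])  # → {"jamming"}
```

An intrusion detection system built on such mechanisms would treat attack-specific side effects as one more family of symptoms.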
Abstract: The trial was conducted in Omidiyeh, a city located 170 kilometers from the Iranian city of Ahvaz. The main factor in this project comprised four levels: a control (without hormone) and application of the hormone at the seed, vegetative, and flowering stages, respectively. The sub-plots comprised three local varieties of vetch, so the effects of the hormone auxin applied at different times during the vegetative and reproductive stages were investigated across the different varieties. The experiment was laid out in plots in a randomized complete block design with four replications. In order to study the effects of the hormone auxin at the growth stages (seed, vegetative, and flowering) against the control (no auxin) on the three local varieties of vetch, plant height, number of pods per plant, number of seeds per pod, number of seeds per plant, grain weight, grain yield, plant dry weight, and protein content were measured. Among the vetch varieties, plant height, number of pods per plant, seeds per plant, grain weight, grain yield, plant dry weight, and protein content differed at the 1% level, and the number of seeds per pod at the 5% level. The interactions for seeds per plant, grain yield, and protein content were significant at the 1% level, and those for the number of seeds per pod and seed weight at the 5% level, while for plant height and plant dry weight the interactions showed no significant difference.
Abstract: The IVE toolkit has been created to facilitate research, education, and development in the field of virtual storytelling and computer games. Primarily, the toolkit is intended for modelling action selection mechanisms of virtual humans, investigating level-of-detail AI techniques for large virtual environments, and exploring joint behaviour and the role-passing technique (Sec. V). Additionally, the toolkit can be used as AI middleware without any changes. The main strength of IVE is that it serves for prototyping both the AI and the virtual worlds themselves. The purpose of this paper is to describe IVE's features in general and to present our current work on this platform, including an educational game.
Abstract: Vehicle suspension design must fulfill some conflicting criteria. Among these is ride comfort, which is attained by minimizing the acceleration transmitted to the sprung mass via the suspension spring and damper. Good handling is also a desirable property; it requires a stiff suspension and is therefore in conflict with good ride. Another desirable feature of a suspension is the minimization of its maximum travel. This travel, called the suspension working space in the vehicle dynamics literature, is also a design constraint and favors good ride. In this research, a full-car model with 8 degrees of freedom has been developed, and the three above-mentioned criteria, namely ride, handling, and working space, have been adopted as objective functions. The Multi-Objective Programming (MOP) discipline has been used to find the Pareto front, and some reasoning has been used to choose a design point among these non-dominated points of the Pareto front.
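The non-dominated filtering step behind a Pareto front can be sketched in a few lines. The objective tuples below (lower is better in every coordinate) are illustrative numbers, not the paper's suspension objectives.

```python
def pareto_front(points):
    """Return the non-dominated points, minimising every objective.
    A point q dominates p when q is no worse than p in all objectives
    and differs from p in at least one."""
    return [p for p in points
            if not any(q != p and all(qi <= pi for qi, pi in zip(q, p))
                       for q in points)]

# illustrative (ride discomfort, handling index) pairs, lower is better
designs = [(1, 5), (2, 4), (3, 3), (2, 6), (4, 4)]
front = pareto_front(designs)   # → [(1, 5), (2, 4), (3, 3)]
```

The "reasoning" step the abstract mentions then selects a single compromise point from this front, for example by weighting the objectives.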
Abstract: This study presents a systematic analysis of the dynamic behaviors of a gear-bearing system with a porous squeeze film damper (PSFD) under nonlinear suspension, nonlinear oil-film force, and nonlinear gear meshing force effects. It is found that the system exhibits very rich forms of sub-harmonic and even chaotic vibrations. The bifurcation diagrams also reveal that greater values of permeability may not only effectively improve non-periodic motions but also suppress the dynamic amplitudes of the system. Therefore, the porous effect plays an important role in improving the dynamic stability of gear-bearing systems and other mechanical systems. The results presented in this study provide some useful insights into the design and development of gear-bearing systems for rotating machinery that operates at high rotational speeds and in highly nonlinear regimes.
Abstract: The dynamic contouring error is a critical element in the accuracy of machine tools. The contouring error is defined as the difference between the actual machining path and the commanded path, which is executed by following the command curves from the feed drive system of the machine tool. The contouring error results from various factors, such as external loads, friction, inertia, feed rate, speed control, and servo control. Thus, this study proposes a 2D compensating system for the contouring accuracy of machine tools. An optical method is adopted, using a frequency-stabilized laser diode and a high-precision position sensitive detector (PSD) to perform non-contact measurement. Results show that the accuracy of the PSD in the 2D contouring accuracy compensating system was ±1.5 μm over a measurement range of ±3 mm, and the accuracy improvement exceeds 80% at high feed rates.
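The definition of contouring error, the shortest distance from the actual tool position to the commanded path, can be computed directly when the commanded path is approximated as a polyline. The polyline representation and the sample coordinates below are assumptions for illustration.

```python
import math

def contouring_error(actual, path):
    """Shortest distance from a 2-D actual tool position to the commanded
    path, given as a polyline of (x, y) points."""
    def seg_dist(p, a, b):
        # distance from point p to the segment a-b
        ax, ay = a; bx, by = b; px, py = p
        dx, dy = bx - ax, by - ay
        length2 = dx * dx + dy * dy
        if length2 == 0:
            return math.hypot(px - ax, py - ay)   # degenerate segment
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / length2))
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))
    return min(seg_dist(actual, path[i], path[i + 1])
               for i in range(len(path) - 1))

# tool at (5, 2) while the commanded path runs along the x-axis
err = contouring_error((5.0, 2.0), [(0.0, 0.0), (10.0, 0.0)])  # → 2.0
```

Note the distinction from tracking error: the tracking error compares the actual position with the commanded position at the same time instant, while the contouring error measures the deviation from the path geometry itself.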
Abstract: Historical monuments, as architectural heritage, are considered, both economically and culturally, one of the key assets of modern communities. Cultural heritage represents a country's national identity and pride and maintains and enriches that country's culture. Therefore, conservation of the monuments left by our ancestors requires everyone's serious and unremitting effort. Conservation, renewal, restoration, and technical study of cultural and historical objects are issues that hold a special status among the various forms of art and science in the present century, for two reasons: firstly, the progress of humankind in this century has created a factor called environmental pollution, which has not only caused new destructive processes in cultural and historical monuments but has also accelerated the previous destructive processes several times over; and secondly, the rapid advance of various sciences, especially chemistry, has led to the contribution of new methods and materials to this significant issue.
Abstract: Traditionally, terror groups have been formed by ideologically aligned actors who perceive a lack of options for achieving political or social change. However, terrorist attacks are increasingly carried out by small groups of actors or lone individuals who may be only ideologically affiliated with larger, formal terrorist organizations. The formation of these groups represents the inverse of traditional organizational growth, whereby structural de-evolution within issue-based organizations leads to the formation of small, independent terror cells. Ideological franchising, the bypassing of formal affiliation with the "parent" organization, represents the de-evolution of traditional concepts of organizational structure in favor of an organic, independent, and focused unit. Traditional definitions of issue-based dark networks include a focus on an identified goal, commitment to achieving this goal through unrestrained actions, and selection of symbolic targets. The next step in the de-evolution of small dark networks is the mini-organization, consisting of only a handful of actors working toward a common, violent goal. Information-sharing through social media platforms, coupled with the civil liberties of democratic nations, provides the communication systems, access to information, and freedom of movement necessary for small dark networks to flourish without the aid of a parent organization. As attacks such as the 7/7 bombings demonstrate the effectiveness of small dark networks, terrorist actors will feel increasingly comfortable aligning with an ideology only, without formally organizing. The natural result of this de-evolving organization is the single-actor event, where an individual appears to subscribe to a larger organization's violent ideology with little or no formal ties.
Abstract: Software reliability prediction offers a great opportunity to measure the software failure rate at any point during system test. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software. Various approaches can be used to improve the reliability of software. We focus on a software reliability model in this article, assuming that there is time redundancy, whose value (the number of repeated transmissions of basic blocks) can serve as an optimization parameter. We consider the given mathematical model under the assumption that the system may experience not only irreversible failures but also failures that can be treated as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) of the transmission of an instruction sequence consisting of a random number of basic blocks. We consider the system software unreliable, with the time between adjacent failures exponentially distributed.
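The retransmission mechanism can be sketched with a Monte Carlo model: failures arrive as a Poisson process (exponential inter-failure times, as in the abstract), and each basic block is retransmitted until one attempt completes failure-free, so the attempt count per block is geometric. The block length, failure rate, and block count below are illustrative assumptions, and this simulation estimates only the mean of the transmission-time distribution, not the full DF derived in the paper.

```python
import math, random

def simulate_transmission(n_blocks, t_block, lam, trials=20000, seed=7):
    """Monte Carlo estimate of the mean time to transmit a sequence of
    n_blocks basic blocks of length t_block, where failures arrive with
    rate lam, so an attempt fails with p = 1 - exp(-lam * t_block) and
    the failed block is retransmitted (time redundancy)."""
    rng = random.Random(seed)
    p_fail = 1 - math.exp(-lam * t_block)
    total = 0.0
    for _ in range(trials):
        t = 0.0
        for _ in range(n_blocks):
            attempts = 1
            while rng.random() < p_fail:   # geometric number of attempts
                attempts += 1
            t += attempts * t_block
        total += t
    return total / trials

# illustrative parameters: 10 blocks, unit block time, lam chosen so p_fail ≈ 0.2
mean_time = simulate_transmission(n_blocks=10, t_block=1.0, lam=0.223)
# analytic mean for comparison: n_blocks * t_block / (1 - p_fail) ≈ 12.5
```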
Abstract: The world economic crises and budget constraints
have caused authorities, especially those in developing countries, to
rationalize water quality monitoring activities. Rationalization
consists of reducing the number of monitoring sites, the number of
samples, and/or the number of water quality variables measured. The
reduction in water quality variables is usually based on correlation. If
two variables exhibit high correlation, it is an indication that some of
the information produced may be redundant. Consequently, one
variable can be discontinued, and the other continues to be measured.
Later, the ordinary least squares (OLS) regression technique is
employed to reconstitute information about the discontinued variable by
using the continuously measured one as an explanatory variable. In
this paper, two record extension techniques are employed to
reconstitute information about discontinued water quality variables,
the OLS and the Line of Organic Correlation (LOC). An empirical
experiment is conducted using water quality records from the Nile
Delta water quality monitoring network in Egypt. The record
extension techniques are compared for their ability to predict
different statistical parameters of the discontinued variables. Results
show that the OLS is better at estimating individual water quality
records. However, results indicate an underestimation of the variance
in the extended records. The LOC technique is superior in preserving
characteristics of the entire distribution and avoids underestimation
of the variance. It is concluded from this study that the OLS can be
used for the substitution of missing values, while LOC is preferable
for inferring statements about the probability distribution.
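The two record extension techniques differ only in their slope: OLS uses r·sy/sx, while the Line of Organic Correlation uses sign(r)·sy/sx, which is what lets LOC preserve the variance of the extended record. The following sketch, on synthetic data with hypothetical variable names, illustrates the variance-shrinkage effect the abstract describes; it is not the Nile Delta dataset.

```python
import math, random

def stats(v):
    """Sample mean and variance."""
    m = sum(v) / len(v)
    return m, sum((a - m) ** 2 for a in v) / (len(v) - 1)

def corr(x, y):
    """Pearson correlation coefficient."""
    mx, vx = stats(x); my, vy = stats(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / math.sqrt(vx * vy)

def extend(x, y, x_new, method="LOC"):
    """Estimate the discontinued variable y at new values of the
    continuously measured variable x.  The OLS slope r*sy/sx shrinks the
    variance of the estimates by r**2; the LOC slope sign(r)*sy/sx
    preserves it."""
    mx, vx = stats(x); my, vy = stats(y)
    r = corr(x, y)
    if method == "OLS":
        slope = r * math.sqrt(vy / vx)
    else:  # Line of Organic Correlation
        slope = math.copysign(math.sqrt(vy / vx), r)
    return [my + slope * (xn - mx) for xn in x_new]

# synthetic joint record, then an extension period with x only
rng = random.Random(3)
x = [rng.gauss(10, 2) for _ in range(200)]           # continued variable
y = [2 * xi + rng.gauss(0, 3) for xi in x]           # discontinued variable
x_new = [rng.gauss(10, 2) for _ in range(200)]
ols_est = extend(x, y, x_new, "OLS")
loc_est = extend(x, y, x_new, "LOC")
```

The variance of `ols_est` is exactly r² times that of `loc_est` over the same extension inputs, which matches the abstract's finding that OLS underestimates the variance while LOC preserves distributional characteristics.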
Abstract: Complex networks have been intensively studied across
many fields, especially in Internet technology, biological engineering,
and nonlinear science. Software is built up out of many interacting
components at various levels of granularity, such as functions, classes,
and packages, representing another important class of complex networks.
It can also be studied using complex network theory. Over the
last decade, many papers on the interdisciplinary research between
software engineering and complex networks have been published.
This research provides a different dimension to our understanding of software and is also very useful for the design and development of software
systems. This paper will explore how to use the complex network
theory to analyze software structure, and briefly review the main
advances in corresponding aspects.
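Treating software as a complex network starts from its dependency graph. The following sketch computes the in- and out-degrees of such a graph; the module names and dependency list are hypothetical, for illustration only.

```python
from collections import defaultdict

# hypothetical module dependency edges: (importer, imported)
deps = [("app", "utils"), ("app", "db"), ("db", "utils"),
        ("api", "app"), ("api", "utils"), ("tests", "app")]

def degree_stats(edges):
    """In-degree (how many components depend on a node) and out-degree
    (how many components a node depends on) of a dependency graph."""
    indeg, outdeg = defaultdict(int), defaultdict(int)
    for importer, imported in edges:
        outdeg[importer] += 1
        indeg[imported] += 1
    return dict(indeg), dict(outdeg)

indeg, outdeg = degree_stats(deps)
# "utils" has the highest in-degree here: many components depend on it
```

Degree distributions of this kind are the starting point for the scale-free and small-world analyses that the surveyed literature applies to real software systems.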
Abstract: The existing literature on design reasoning seems to give one-sided accounts of expert design behaviour based on internal processing. In the same way, ecological theories seem to focus one-sidedly on external elements, resulting in the lack of a unifying design cognition theory. Although current extended design cognition studies acknowledge the intellectual interaction between internal and external resources, there still seems to be insufficient understanding of the complexities involved in such interactive processes. As such, this paper proposes a novel multi-directional model for design researchers to map the complex and dynamic conduct-controlling behaviour, in which both the computational and ecological perspectives are integrated in a vertical manner. A clear distinction between identified intentional and emerging physical drivers, and the relationships between them during the early phases of experts' design process, is demonstrated by presenting a case study in which the model was employed.