Abstract: One purpose of robust estimation is to reduce the influence of outliers in the data on the estimates. Outliers arise from gross errors or from contamination by long-tailed distributions. The trimmed mean is a robust estimator, meaning that it is not sensitive to violations of the distributional assumptions of the data. It is called an adaptive estimator when the trimming proportion is determined from the data rather than being fixed a priori.
The main objective of this study is to investigate the robustness properties of adaptive trimmed means in terms of efficiency, high breakdown point, and influence function. Specifically, it seeks to determine the magnitude of the trimming proportion of the adaptive trimmed mean that yields efficient and robust estimates of the parameter for data following a modified Weibull distribution with parameter λ = 1/2, where the trimming proportion is determined by a ratio of two trimmed means defined as the tail length. Secondly, the asymptotic properties of the tail length and the trimmed means are also investigated. Finally, a comparison is made of the efficiency of the adaptive trimmed means, in terms of the standard deviation, for data-determined trimming proportions and for trimming proportions fixed a priori.
The asymptotic tail lengths, defined as the ratio of two trimmed means, and the asymptotic variances were computed using the formulas derived. The standard deviations of the derived tail lengths, for samples of size 40 simulated from a Weibull distribution, were computed over 100 iterations using a computer program written in the Pascal language.
The findings of the study revealed that the tail lengths of the Weibull distribution increase in magnitude as the trimming proportions increase; that the measure of the tail length and the adaptive trimmed mean are asymptotically independent as the number of observations n approaches infinity; that the tail length is asymptotically distributed as the ratio of two independent normal random variables; and that the asymptotic variances decrease as the trimming proportions increase. The simulation study showed empirically that the standard error of the adaptive trimmed mean using the ratio of tail lengths is smaller, across different values of the trimming proportion, than that of its counterpart with trimming proportions fixed a priori.
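As an illustration of the estimator described above, the following is a minimal sketch (not the exact formulas derived in the study) of a symmetric trimmed mean and an adaptive trimming rule driven by a tail-length statistic defined as a ratio of two trimmed means. The trimming levels inside the tail-length ratio, the cutoff values that map the tail length to a trimming proportion, and the use of a plain Weibull shape parameter of 1/2 in place of the study's modified Weibull form are all assumptions for illustration.

```python
import numpy as np

def trimmed_mean(x, alpha):
    """Symmetric alpha-trimmed mean: drop the lowest and highest
    floor(alpha * n) observations and average the rest."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    k = int(np.floor(alpha * n))
    return x[k:n - k].mean()

def tail_length(x, a1=0.05, a2=0.25):
    """Tail-length statistic as a ratio of two trimmed means
    (lightly trimmed over heavily trimmed); a1, a2 are illustrative."""
    return trimmed_mean(x, a1) / trimmed_mean(x, a2)

def adaptive_trimmed_mean(x):
    """Choose the trimming proportion from the data via the tail length.
    The thresholds below are hypothetical, for illustration only."""
    q = tail_length(x)
    if q < 1.05:
        alpha = 0.05      # short-tailed sample: trim little
    elif q < 1.20:
        alpha = 0.10
    else:
        alpha = 0.25      # long-tailed sample: trim heavily
    return trimmed_mean(x, alpha), alpha

# Example with a sample of size 40 simulated from a Weibull distribution
# (shape 1/2 used here as a stand-in for the study's modified form).
rng = np.random.default_rng(0)
sample = rng.weibull(0.5, size=40)
estimate, used_alpha = adaptive_trimmed_mean(sample)
print(estimate, used_alpha)
```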
Abstract: Negation is useful in the majority of real-world applications. However, its introduction leads to semantic and canonical problems. In this paper we propose an approach based on stratification to deal with negation problems. The approach is based on an extension of predicate nets and is characterized by two main contributions. The first concerns the management of the whole class of stratified programs. The second is related to the optimization of the usual operations on stratified programs (maximal stratification, incremental updates, etc.).
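The abstract does not give the stratification procedure itself; for background only, the following is a minimal sketch of the standard stratum computation for a logic program with negation, where a predicate must sit in a strictly higher stratum than any predicate it depends on negatively. The rule representation is hypothetical and is not the predicate-net extension proposed in the paper.

```python
# A rule is (head, [(pred, negated?), ...]); this encoding is illustrative.
def stratify(rules):
    preds = {h for h, _ in rules} | {p for _, body in rules for p, _ in body}
    stratum = {p: 0 for p in preds}
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            for pred, negated in body:
                need = stratum[pred] + (1 if negated else 0)
                if stratum[head] < need:
                    if need > len(preds):      # cycle through negation
                        return None            # program is not stratifiable
                    stratum[head] = need
                    changed = True
    return stratum

# Example: unreachable(X,Y) :- node(X), node(Y), not reachable(X,Y).
rules = [
    ("reachable", [("edge", False)]),
    ("reachable", [("edge", False), ("reachable", False)]),
    ("unreachable", [("node", False), ("node", False), ("reachable", True)]),
]
print(stratify(rules))   # e.g. {'edge': 0, 'node': 0, 'reachable': 0, 'unreachable': 1}
```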
Abstract: Pattern matching based on regular tree grammars has been widely used in many areas of computer science. In this paper, we propose a pattern matcher within the framework of code generation, based on a generic and formalized approach. According to this approach, parsers for regular tree grammars are adapted to a general pattern matching solution, rather than adapting the pattern matching to their parsing behavior. Hence, we first formalize the construction of the pattern matches of input trees drawn from a regular tree grammar in the form of so-called match trees. Then, we adopt a recently developed generic parser and tightly couple its parsing behavior with this construction. In addition to its generality, the resulting pattern matcher is characterized by its soundness and efficient implementation, as demonstrated by the proposed theory and by the algorithms derived for its implementation. A comparison with similar and well-known approaches, such as those based on tree automata and LR parsers, shows that our pattern matcher can be applied to a broader class of grammars and achieves a better approximation of pattern matches in one pass. Furthermore, its use as a machine code selector incurs minimal overhead, due to the balanced distribution of the cost computations between static ones, during parser generation, and dynamic ones, during parsing.
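For orientation, the sketch below shows the underlying tree-pattern-matching problem in its simplest, naive form (one pattern at a time, wildcard leaves); the grammar-driven, single-pass matcher proposed in the paper is considerably more involved. The tuple encoding of trees and the "_" wildcard are illustrative conventions only.

```python
# Trees are (operator, children...) tuples; "_" is a wildcard leaf.
def matches(pattern, tree):
    if pattern == "_":                       # wildcard matches any subtree
        return True
    if isinstance(pattern, str) or isinstance(tree, str):
        return pattern == tree
    return (pattern[0] == tree[0]
            and len(pattern) == len(tree)
            and all(matches(p, t) for p, t in zip(pattern[1:], tree[1:])))

def match_sites(pattern, tree, path=()):
    """Yield the paths of all subtrees of `tree` matched by `pattern`."""
    if matches(pattern, tree):
        yield path
    if not isinstance(tree, str):
        for i, child in enumerate(tree[1:]):
            yield from match_sites(pattern, child, path + (i,))

# Example: find additions with a constant right operand in an expression tree.
expr = ("store", ("add", ("load", "x"), "const"), ("add", "const", "const"))
print(list(match_sites(("add", "_", "const"), expr)))   # [(0,), (1,)]
```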
Abstract: Avalanche release of snow is modeled in the present study. Snow is assumed to behave as a semi-solid, and the governing equations are formulated using a continuum approach. The dynamical equations are solved for two different zones (the starting zone and the track zone) using appropriate initial and boundary conditions. The effects of density (ρ), eddy viscosity (η), slope angle (θ), and slab depth (R) on the flow parameters are examined. Numerical methods are employed for solving the nonlinear differential equations. One of the most interesting and fundamental innovations of the present study is the way the initial condition for the numerical computation of the velocity is obtained: this velocity information is derived through concepts of fracture mechanics applicable to snow. The results for the flow parameters are found to be in qualitative agreement with published results.
Abstract: Object localization is one of the major challenges in creating intelligent transportation. Unfortunately, in densely built-up urban areas, localization based on GPS alone produces large errors or simply becomes impossible. New opportunities for localization arise from the rapidly emerging concept of wireless ad-hoc networks. Such a network allows the distances between objects to be estimated by measuring received signal levels, and a distance graph to be constructed in which the nodes are the objects to be localized and the edges are estimates of the distances between pairs of nodes. Given the known coordinates of individual nodes (anchors), it is possible to determine the location of all (or part) of the remaining nodes of the graph. Moreover, a road map available in digital format can provide localization routines with valuable additional information to narrow the node location search. However, despite an abundance of well-known algorithms for solving the localization problem and significant research efforts, many issues are currently addressed only partially. In this paper, we propose a localization approach that maps the distance graph onto digital road map data. In effect, the problem is reduced to embedding the distance graph into the graph representing the geolocation data of the area. This makes it possible to localize objects, in some cases even when only one reference point is available. We propose a simple embedding algorithm and a sample implementation as spatial queries over sensor network data stored in a spatial database, allowing effective use of spatial indexing, optimized spatial search routines, and geometry functions.
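As a small illustration of the first step described above, here is a minimal sketch of building the distance graph from received signal levels using a log-distance path-loss model; the path-loss parameters and the measurement layout are assumptions, and the road-map embedding itself is not shown.

```python
# Estimate pairwise distances from received signal strength with a
# log-distance path-loss model, then build the distance graph.
# tx_power_dbm and path_loss_exp are assumed illustrative values.
def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.7):
    """Invert the log-distance model: RSSI = P0 - 10 * n * log10(d)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def build_distance_graph(measurements):
    """measurements: iterable of (node_a, node_b, rssi_dbm).
    Returns {node: {neighbor: estimated_distance_m}}."""
    graph = {}
    for a, b, rssi in measurements:
        d = rssi_to_distance(rssi)
        graph.setdefault(a, {})[b] = d
        graph.setdefault(b, {})[a] = d
    return graph

# Example: two vehicles and one anchor with known coordinates.
readings = [("anchor1", "car2", -65.0),
            ("car2", "car3", -72.0),
            ("anchor1", "car3", -80.0)]
g = build_distance_graph(readings)
print({k: {n: round(v, 1) for n, v in nb.items()} for k, nb in g.items()})
```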
Abstract: This paper may be considered a combination of pervasive computing and Differential GPS (Global Positioning System), applied to controlling automatic traffic signals in such a way as to pre-empt normal signal operation and let lifesaving vehicles pass. By learning of the arrival of a lifesaving vehicle before it reaches the signal, there is a chance of clearing the traffic in advance. The traffic signal preemption system includes a vehicle equipped with an onboard computer system capable of capturing diagnostic information and the estimated location of the lifesaving vehicle using the information provided by a GPS receiver connected to the onboard computer system, and of transmitting this information with a wireless transmitter via a wireless network. A fleet management system connected to a wireless receiver receives the information transmitted by the lifesaving vehicle. A computer located at the intersection uses corrected vehicle position, speed, and direction measurements, in conjunction with previously recorded data defining approach routes to the intersection, to determine the optimum time to switch a traffic light controller to preemption mode so that lifesaving vehicles can pass safely. For the case when the ambulance needs to take a U-turn in a heavy-traffic area, we suggest a solution: a computerized median that uses removable linked blocks.
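To make the timing decision concrete, here is a minimal sketch of how an intersection computer might pick the moment to switch the controller to preemption mode from the vehicle's reported position and speed along a recorded approach route; the clearance time and safety margin are hypothetical values, not figures from the paper.

```python
# Minimal sketch of the preemption timing decision; clearance_time_s and
# safety_margin_s are hypothetical values for illustration.
def should_preempt(distance_to_stopline_m, speed_mps,
                   clearance_time_s=12.0, safety_margin_s=3.0):
    """Switch to preemption mode when the vehicle's estimated time of
    arrival drops below the time needed to clear the intersection."""
    if speed_mps <= 0.1:                 # stationary or crawling: hold off
        return False
    eta_s = distance_to_stopline_m / speed_mps
    return eta_s <= clearance_time_s + safety_margin_s

# Example: ambulance 400 m out at 20 m/s -> ETA 20 s, keep waiting;
# at 250 m -> ETA 12.5 s, switch to preemption.
print(should_preempt(400.0, 20.0), should_preempt(250.0, 20.0))
```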
Abstract: The Sphere Method is a flexible interior point algorithm for linear programming problems, developed mainly by Professor Katta G. Murty. It consists of two steps, the centering step and the descent step, of which the centering step is the most expensive part of the algorithm. For this centering step we propose some improvements, such as introducing two or more initial feasible solutions and solving for the more favorable new solution by objective value while working with rigorous updates of the feasible region, along with some ideas integrated into the descent step. An illustration is given confirming the advantage of the proposed procedure.
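As a rough illustration of what a centering computation looks like, the sketch below finds the center of the largest ball inscribed in a polytope {x : Ax <= b} (the Chebyshev center) via an auxiliary LP. This is a textbook formulation used only for orientation; it is not the Sphere Method's centering step nor the improved procedure with multiple initial feasible solutions proposed in the paper.

```python
import numpy as np
from scipy.optimize import linprog

def largest_inscribed_ball(A, b):
    """Chebyshev center of {x : A x <= b}: maximize r subject to
    a_i^T x + ||a_i|| r <= b_i.  Decision variables are (x, r)."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    norms = np.linalg.norm(A, axis=1)
    A_ub = np.hstack([A, norms[:, None]])          # rows [a_i, ||a_i||]
    c = np.zeros(A.shape[1] + 1)
    c[-1] = -1.0                                   # maximize r = minimize -r
    bounds = [(None, None)] * A.shape[1] + [(0, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b, bounds=bounds, method="highs")
    return res.x[:-1], res.x[-1]                   # center, radius

# Example: the unit square 0 <= x1, x2 <= 1 written as A x <= b.
A = [[1, 0], [-1, 0], [0, 1], [0, -1]]
b = [1, 0, 1, 0]
center, radius = largest_inscribed_ball(A, b)
print(center, radius)    # approximately [0.5, 0.5] and 0.5
```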
Abstract: In addition to the internet, one of the mass media that has become quite common in recent years, the relationship of advertising with Television and Cinema, which have always drawn the attention of researchers as basic media where visual use is in the foreground, has also become the subject of various studies. Based on the assumption that the known fundamental effects of advertisements on consumers are closely related to the creative process of advertisements as well as to the nature and characteristics of the medium where they are used, these basic mass media (Television and Cinema) and the consumer motivations for the advertisements they broadcast have become a focus of study.
Given that the viewers of the mass media in question have shifted from a passive position to a more active one, especially in recent years, and approach the contents of advertisements, as they do all content, in a more critical and "pitiless" manner, it is possible to say that individuals make more use of advertisements than in the past and combine their individual goals with the goals of the advertisements.
This study, which aims to find out what the goals of this new individual advertisement use are, how they are shaped by the distinct characteristics of Television and Cinema, where visuality takes precedence as basic mass media, and what kind of place they occupy in the minds of consumers, has identified consumers' motivations as: "Entertainment", "Escapism", "Play", "Monitoring/Discovery", "Opposite Sex", and "Aspirations and Role Models".
This study intends to reveal the differences or similarities among the needs, and hence the gratifications, of viewers who consume advertisements on Television or at the Cinema, the two basic media where visuality is prioritized.
Abstract: The present work is motivated by the idea that layer deformation in anisotropic elasticity can be estimated from the theory of interfacial dislocations. In effect, this work, which is an extension of a previous approach given by one of the authors, determines the anisotropic displacement fields and the critical thickness due to a complex biperiodic network of misfit dislocations (MDs) lying just below the free surface, in view of the arrangement of the dislocations. The elastic fields of such arrangements observed along interfaces play a crucial role in the improvement of the physical properties of epitaxial systems. New results are proposed in anisotropic elasticity for hexagonal networks of MDs which contain intrinsic and extrinsic stacking faults. Using a previous approach based on the relative interfacial displacement and a Fourier series formulation of the displacement fields, we develop the expressions of the elastic fields when a dissociation of the MDs is possible. The numerical investigations for the observed system Si/(111)Si with low twist angles clearly show the effect of anisotropy and thickness when the misfit networks are dissociated.
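The abstract refers to a Fourier series formulation of the displacement fields for a biperiodic dislocation network; for orientation only, a generic biperiodic expansion of a displacement component takes the form below, where the periods and coefficients stand in for the quantities actually derived in the paper.

```latex
% Generic biperiodic Fourier expansion of a displacement component u_k;
% \Lambda_1, \Lambda_2 are the network periods and the coefficients
% A_k^{mn}(x_3) carry the depth dependence (illustrative form only).
u_k(x_1, x_2, x_3) \;=\; \sum_{(m,n)\neq(0,0)}
  A_k^{mn}(x_3)\,
  \exp\!\left[\, 2\pi i \left( \frac{m\,x_1}{\Lambda_1}
                             + \frac{n\,x_2}{\Lambda_2} \right) \right]
```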
Abstract: In this paper, a novel approach is presented for designing multiplier-free state-space digital filters. The multiplier-free design is obtained by finding power-of-2 coefficients and also quantizing the state variables to power-of-2 numbers. Expressions for the noise variance are derived for the quantized state vector and the output of the filter. A "structure-transformation matrix" is incorporated in these expressions. It is shown that quantization effects can be minimized by properly designing the structure-transformation matrix. Simulation results are very promising and illustrate the design algorithm.
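To make the power-of-2 idea concrete, here is a minimal sketch of rounding a coefficient (or state value) to a signed power of two, so that multiplications reduce to bit shifts; the exponent range is an assumption, and this is not the paper's design procedure involving the structure-transformation matrix.

```python
import math

def nearest_power_of_two(value, min_exp=-8, max_exp=0):
    """Round `value` to a signed power of 2 (nearest on a log scale) within
    the exponent range [min_exp, max_exp]; multiplying by the result is a
    bit shift rather than a full multiplication."""
    if value == 0.0:
        return 0.0
    sign = 1.0 if value > 0 else -1.0
    exp = round(math.log2(abs(value)))
    exp = max(min_exp, min(max_exp, exp))
    return sign * 2.0 ** exp

# Example: quantize a state-space coefficient matrix entry-wise.
A = [[0.47, -0.13], [0.09, 0.91]]
Aq = [[nearest_power_of_two(a) for a in row] for row in A]
print(Aq)   # [[0.5, -0.125], [0.125, 1.0]]
```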
Abstract: Automatic detection of syllable repetition is one of the important parameters in assessing stuttered speech objectively. The existing method, which uses an artificial neural network (ANN), requires high levels of agreement as a prerequisite before attempting to train and test ANNs to separate fluent and nonfluent speech. We propose an automatic detection method for syllable repetition in read speech for the objective assessment of stuttered disfluencies, which uses a novel approach with four stages: segmentation, feature extraction, score matching, and decision logic. Feature extraction is implemented using the well-known Mel-frequency cepstral coefficients (MFCC). Score matching is done using dynamic time warping (DTW) between the syllables. The decision logic is implemented by a perceptron based on the score given by score matching. Although many methods are available for segmentation, in this paper it is done manually. The read speech of 10 adults who stutter, assessed by human judges, was evaluated using the proposed method, and the result was 83%.
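To illustrate the score-matching stage, here is a minimal sketch of dynamic time warping between two MFCC sequences (frames x coefficients); the feature dimensions and the random stand-in features are assumptions, and the MFCC extraction, segmentation, and perceptron stages are not shown.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping cost between two feature sequences a (n x d)
    and b (m x d), using Euclidean frame distance; lower = more similar."""
    a, b = np.atleast_2d(a), np.atleast_2d(b)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Example: hypothetical syllable segments as (frames x 13 MFCC) arrays;
# a near-repeat of a syllable should score much lower than a different one.
rng = np.random.default_rng(1)
syllable_a = rng.normal(size=(30, 13))
syllable_b = syllable_a[::2] + 0.05 * rng.normal(size=(15, 13))  # near-repeat
syllable_c = rng.normal(size=(25, 13))                            # different
print(dtw_distance(syllable_a, syllable_b) < dtw_distance(syllable_a, syllable_c))
```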
Abstract: We report the electronic structure and optical properties of the NdF3 compound. Our calculations are based on density functional theory (DFT) using the full potential linearized augmented plane wave (FPLAPW) method with the inclusion of spin-orbit coupling. We employed the local spin density approximation (LSDA) and the Coulomb-corrected local spin density approximation (LSDA + U). We find that the standard LSDA approach is incapable of correctly describing the electronic properties of such materials, since it positions the f-bands incorrectly, resulting in an incorrect metallic ground state. On the other hand, the LSDA + U approximation, known for treating the highly correlated 4f electrons properly, is able to reproduce the correct insulating ground state. Interestingly, however, we do not find any significant differences in the optical properties calculated using LSDA and LSDA + U, suggesting that the 4f electrons do not play a decisive role in the optical properties of these compounds. The reflectivity of the NdF3 compound stays low up to 7 eV, which is consistent with its large energy gap. The calculated energy gaps are in good agreement with experiments. Our calculated reflectivity compares well with the experimental data, and the results are analyzed in the light of band-to-band transitions.
Abstract: This paper proposes a novel architecture for developing decision support systems. Unlike conventional decision support systems, the proposed architecture endeavors to reveal the decision-making process so that humans' subjectivity can be incorporated into a computerized system while, at the same time, preserving the capability of the computerized system to process information objectively. A number of techniques used in developing the decision support system are elaborated to make the decision-making process transparent. These include procedures for high-dimensional data visualization, pattern classification, prediction, and evolutionary computational search. An artificial data set is first employed to compare the proposed approach with other methods. A simulated handwritten data set and a real data set on liver disease diagnosis are then employed to evaluate the efficacy of the proposed approach. The results are analyzed and discussed. The potential of the proposed architecture as a useful decision support system is demonstrated.
Abstract: In controlling urban traffic, traffic lights show the same behavior for different kinds of vehicles at intersections. Emergency vehicles need special treatment at intersections, so traffic lights should behave in a different manner when emergency vehicles approach them. At present, intelligent traffic lights control urban traffic intelligently. In this paper, the ethical aspect of this topic is considered. A model is proposed that adds a special component to emergency vehicles and traffic lights for controlling traffic in an ethical manner. The proposed model is simulated with JADE.
Abstract: Some polycyclic aromatic hydrocarbons (PAHs) are among the strongest known carcinogenic compounds. The majority of them are produced by the incomplete combustion of fossil fuels; motor vehicles are a significant source of PAHs, with diesel emissions being one of the main sources of such compounds in the ambient air. There is great concern about the increasing concentration of PAHs in the environment, and researchers are trying to find optimal methods to reduce these pollutants and improve air quality. Water-blended fuel is one of the possible approaches to reduce PAH emissions from the combustion of diesel in urban and domestic vehicles. In this work, a modeling study was conducted using the CHEMKIN-PRO software to simulate spray combustion at conditions similar to those of a diesel engine. A surrogate fuel (80% n-heptane and 20% toluene) was used because the detailed kinetic and thermodynamic data needed for modeling are available for this kind of fuel but not for diesel. An emulsified fuel with 3, 5, 8, 10, and 20% water by volume was used as the engine feed for this study. The modeling results show that water has a significant effect on reducing engine soot and PAH precursor formation, up to a certain extent.
Abstract: In a time of globalisation, growing uncertainty, ambiguity and change, traditional ways of doing business are no longer sufficient, and it is important to consider non-conventional methods and approaches to release creativity and facilitate innovation and growth. Thus, creative industries, as a natural source of creativity and innovation, draw particular attention. This paper explores the feasibility of building creative partnerships between the creative industries and business and brings attention to the mutual benefits derived from such partnerships. Design/approach - This paper is a theoretical exploration of projects, practices and research findings addressing collaboration between the creative industries and business. It thus considers the creative industries, the arts, business and their representatives in order to define the requirements for creative partnerships to work and succeed. Findings - Current practices in engaging in arts-business partnerships are still very few, although most creative partnerships have proved to be highly valuable and mutually beneficial. Certain conditions must be provided in order to benefit from arts-business creative synergy. Originality/value - By integrating different sources of literature, this article provides a basis for conducting empirical research in several dimensions of arts-business partnerships.
Abstract: One of the most important applications of wireless sensor networks is data collection. This paper proposes an efficient approach for data collection in wireless sensor networks by introducing a Member Forward List. This list includes the nodes with the highest priority for forwarding the data. When a node fails or dies, the list is used to select the next node with the highest priority. The benefit of this list is that it prevents the selection algorithm from having to be repeated when a node fails or dies. The results show that the Member Forward List decreases power consumption and latency in wireless sensor networks.
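As a small illustration of the idea, here is a minimal sketch of a member forward list: candidate forwarders kept ranked by priority so that, when the current forwarder fails or dies, the next-highest-priority node is used without re-running the selection algorithm. The priority measure (residual energy times link quality) is a hypothetical placeholder, not the metric used in the paper.

```python
# Minimal sketch of a member forward list: forwarders ranked once by a
# priority score; a dead node is simply replaced by the next candidate.
class MemberForwardList:
    def __init__(self, nodes):
        # nodes: {node_id: (residual_energy, link_quality)} -- illustrative
        self.ranked = sorted(nodes, key=lambda n: nodes[n][0] * nodes[n][1],
                             reverse=True)
        self.alive = set(self.ranked)

    def current_forwarder(self):
        for node in self.ranked:            # highest surviving priority
            if node in self.alive:
                return node
        return None                         # no forwarder left

    def mark_failed(self, node):
        self.alive.discard(node)

mfl = MemberForwardList({"n1": (0.9, 0.8), "n2": (0.7, 0.9), "n3": (0.4, 0.95)})
print(mfl.current_forwarder())   # n1
mfl.mark_failed("n1")            # n1 dies: fall back without reselection
print(mfl.current_forwarder())   # n2
```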
Abstract: Sequential pattern mining is a challenging task in the data mining area, with broad applications. One of those applications is mining patterns from weblogs. Weblogs are highly dynamic, and some entries may become obsolete over time. In addition, users may frequently change the threshold value during the data mining process until acquiring the required output or mining interesting rules. Some of the recently proposed algorithms for mining weblogs build the tree with two scans and consume a large amount of time and space. In this paper, we build a Revised PLWAP with Non-frequent Items (RePLNI-tree) with a single scan over all items. While mining sequential patterns, the links related to the non-frequent items are not considered. Hence, it is not required to delete or maintain the information of nodes while revising the tree for mining updated transactions. The algorithm supports both incremental and interactive mining: it is not required to re-compute the patterns each time the weblog is updated or the minimum support is changed. The performance of the proposed tree remains better even when the size of the incremental database is more than 50% of the existing one. For evaluation purposes, we used a benchmark weblog dataset and found that the performance of the proposed tree is encouraging compared to some recently proposed approaches.
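For orientation only, the following is a minimal sketch of the general idea of keeping every item (frequent or not) in a single-scan prefix tree of web access sequences and then skipping non-frequent items at mining time. It is a simplified illustration, not the RePLNI-tree construction or its mining procedure.

```python
from collections import Counter, defaultdict

# One scan inserts every session into a prefix tree and counts supports;
# non-frequent items stay in the tree (no rebuild when data or the
# threshold changes) but their links are skipped when reading out paths.
class Node:
    def __init__(self):
        self.count = 0
        self.children = defaultdict(Node)

def build_tree(sessions):
    root, support = Node(), Counter()
    for session in sessions:                 # single scan over the weblog
        node = root
        for page in session:
            support[page] += 1
            node = node.children[page]
            node.count += 1
    return root, support

def frequent_paths(node, support, min_support, prefix=()):
    """Enumerate tree paths, skipping links through non-frequent items."""
    for page, child in node.children.items():
        if support[page] >= min_support:
            path = prefix + (page,)
            yield path, child.count
            yield from frequent_paths(child, support, min_support, path)
        else:                                # non-frequent: skip its link
            yield from frequent_paths(child, support, min_support, prefix)

sessions = [["home", "news", "sports"], ["home", "news"], ["home", "shop"]]
root, support = build_tree(sessions)
print(list(frequent_paths(root, support, min_support=2)))
# [(('home',), 3), (('home', 'news'), 2)]
```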
Abstract: In this paper we present, propose, and examine additional membership functions for Smoothing Transition Autoregressive (STAR) models. More specifically, we present the hyperbolic tangent, Gaussian, and generalized bell functions. Because Smoothing Transition Autoregressive (STAR) models follow a fuzzy logic approach, more fuzzy membership functions should be tested. Furthermore, fuzzy rules can be incorporated, or other training or computational methods, such as error backpropagation or genetic algorithms, can be applied instead of nonlinear least squares. We examine two macroeconomic variables of the US economy: the inflation rate and the 6-month treasury bill interest rate.
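For reference, here is a minimal sketch of the three transition (membership) functions named above in forms commonly used for STAR-type and fuzzy models, with gamma a slope/smoothness parameter and c a location (threshold) parameter; the exact parameterizations used in the paper may differ.

```python
import numpy as np

# Candidate transition functions G(s) evaluated on the transition variable s.
# These are common textbook forms, given here for illustration only.
def tanh_transition(s, gamma, c):
    return 0.5 * (1.0 + np.tanh(gamma * (s - c)))       # hyperbolic tangent

def gaussian_transition(s, gamma, c):
    return np.exp(-gamma * (s - c) ** 2)                 # Gaussian bump

def gbell_transition(s, a, b, c):
    return 1.0 / (1.0 + np.abs((s - c) / a) ** (2 * b))  # generalized bell

# In a two-regime STAR model the prediction mixes the regimes as
# y_t = phi1' x_t * (1 - G(s_t)) + phi2' x_t * G(s_t).
s = np.linspace(-3, 3, 7)
print(np.round(tanh_transition(s, gamma=2.0, c=0.0), 3))
```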
Abstract: The anti-lock braking systems installed on vehicles for safe and effective braking are high-order, nonlinear, and time-variant. Using fuzzy logic controllers increases the efficiency of such systems, but imposes a high computational complexity as well. The main concept introduced in this paper is reducing the computational complexity of fuzzy controllers by deploying a problem-solution data structure. Unlike conventional methods that are based on calculations, this approach is based on data-oriented modeling.
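As a small illustration of the problem-solution data structure idea, the sketch below precomputes a controller's outputs over a grid of inputs offline and replaces runtime inference with a table lookup; the controller surface (a simple stand-in function) and the grid resolution are hypothetical, not the fuzzy controller designed in the paper.

```python
import numpy as np

# Offline: evaluate the (expensive) controller on a grid of inputs.
# Online: replace inference with a cheap grid-cell table lookup.
# fuzzy_control below is a hypothetical stand-in for a real fuzzy controller.
def fuzzy_control(slip_error, slip_error_rate):
    return np.tanh(3.0 * slip_error + 0.5 * slip_error_rate)  # brake command

ERRS = np.linspace(-1.0, 1.0, 101)
RATES = np.linspace(-5.0, 5.0, 101)
TABLE = np.array([[fuzzy_control(e, r) for r in RATES] for e in ERRS])

def lookup_control(slip_error, slip_error_rate):
    """O(1) runtime control: index the precomputed problem-solution table."""
    i = int(np.clip(np.searchsorted(ERRS, slip_error), 0, len(ERRS) - 1))
    j = int(np.clip(np.searchsorted(RATES, slip_error_rate), 0, len(RATES) - 1))
    return TABLE[i, j]

print(round(float(lookup_control(0.12, -0.8)), 3),
      round(float(fuzzy_control(0.12, -0.8)), 3))   # close agreement
```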