Abstract: This paper reports the feasibility of the ARMA model
to describe a bursty video source transmitting over an AAL5 ATM link
(VBR traffic). The traffic represents the activity of the action movie
"Lethal Weapon 3" transmitted over the ATM network using the Fore
Systems AVA-200 ATM video codec with a peak rate of 100 Mbps
and a rate of 25 frames per second. The model parameters were estimated
for a single video source and for independently multiplexed video sources.
It was found that the ARMA(2, 4) model is well suited to the real data
in terms of average traffic rate profile, probability density function,
autocorrelation function, burstiness measure, and the pole-zero
distribution of the filter model.
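As a rough illustration of such a source model, an ARMA(2, 4) process can be simulated and its sample autocorrelation inspected with NumPy alone. The coefficients below are made up for the sketch, not those estimated in the paper:

```python
import numpy as np

def simulate_arma(ar, ma, n, burn=500, seed=0):
    """Simulate an ARMA(p, q) process:
    x_t = sum_i ar[i]*x_{t-1-i} + e_t + sum_j ma[j]*e_{t-1-j}."""
    rng = np.random.default_rng(seed)
    p, q = len(ar), len(ma)
    e = rng.standard_normal(n + burn)
    x = np.zeros(n + burn)
    for t in range(max(p, q), n + burn):
        x[t] = sum(ar[i] * x[t - 1 - i] for i in range(p)) \
             + e[t] + sum(ma[j] * e[t - 1 - j] for j in range(q))
    return x[burn:]          # drop the burn-in transient

def sample_acf(x, nlags):
    """Biased sample autocorrelation function up to nlags."""
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(nlags + 1)])

# Illustrative (stable) coefficients for an ARMA(2, 4) source model.
ar = [0.6, 0.2]
ma = [0.3, 0.1, 0.05, 0.02]
x = simulate_arma(ar, ma, n=5000)
acf = sample_acf(x, nlags=10)
```

In a study such as this one, the fitted model's ACF and marginal density would then be compared against those of the measured trace.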
Abstract: New graph similarity methods are proposed in this work with the aim of refining the chemical information extracted from molecule matching. For this purpose, data fusion of the isomorphic and non-isomorphic subgraphs into a new similarity measure, the Approximate Similarity, was carried out through several approaches. The application of the proposed method to the development of quantitative structure-activity relationships (QSAR) has provided reliable tools for predicting several pharmacological parameters: the binding of steroids to corticosteroid-binding globulin, the activity of benzodiazepine receptor compounds, and blood-brain barrier permeability. Acceptable results were obtained for the models presented here.
Abstract: A novel robust audio watermarking scheme is
proposed in this paper. In the proposed scheme, the host audio signal
is segmented into frames. Two consecutive frames are assessed to
determine whether they are suitable to represent a watermark bit. If so,
a frequency transform is performed on these two frames. A
compression-expansion technique is adopted to generate a distortion
over the two frames, and this distortion is used to represent one
watermark bit. A psychoacoustic model is applied to calculate the
local auditory mask to ensure that the distortion is not audible. The
watermarking schemes for mono and stereo audio signals are designed
differently. A correlation-based detection method is used to detect the
distortion and extract the embedded watermark bits. The experimental
results show that the quality degradation caused by the embedded
watermarks is perceptually transparent and that the proposed schemes
are robust against various types of attacks.
Abstract: In this paper a new approach to prioritizing urban planning projects in an efficient and reliable way is presented. It is based on environmental pressure indices and multicriteria decision methods. The paper introduces a rigorous method, of acceptable complexity, for rank-ordering urban development proposals according to their environmental pressure. The technique combines the use of Environmental Pressure Indicators, the aggregation of the indicators into an Environmental Pressure Index by means of the Analytic Network Process (ANP) method, and the interpretation of the information obtained from the experts during the decision-making process. The ANP method allows the aggregation of the experts' judgments on each of the indicators into one Environmental Pressure Index. In addition, ANP is based on utility ratio functions, which are the most appropriate for the analysis of uncertain data such as experts' estimations. Finally, unlike other multicriteria techniques, ANP allows the decision problem to be modelled using the relationships among dependent criteria. The method has been applied to the proposal for the urban development of La Carlota airport in Caracas (Venezuela). The Venezuelan Government would like to see a recreational project developed on the abandoned area that would bring a significant improvement for the capital. Three options are currently under evaluation: a health club, a residential area, and a theme park. The participating experts agreed that the method proposed in this paper is useful and an improvement on traditional techniques such as environmental impact studies and life-cycle analysis. They found the results coherent, the process sufficiently rigorous and precise, and the use of resources significantly lower than in other methods.
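As a hedged sketch of the aggregation step underlying ANP, the classical Saaty priority-vector computation (power iteration on a reciprocal pairwise comparison matrix; the comparison values below are illustrative, not the experts' judgments from the study) can be written as:

```python
import numpy as np

def priority_vector(pairwise, iters=100):
    """Principal-eigenvector priorities of a reciprocal pairwise
    comparison matrix, computed by power iteration (Saaty's method)."""
    a = np.asarray(pairwise, dtype=float)
    w = np.ones(a.shape[0]) / a.shape[0]
    for _ in range(iters):
        w = a @ w          # one power-iteration step
        w /= w.sum()       # normalize so priorities sum to 1
    return w

# Illustrative 3x3 comparison of three indicators on Saaty's 1-9 scale.
a = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])
w = priority_vector(a)
```

In a full ANP model these local priority vectors would populate a supermatrix capturing the dependencies among criteria, which is then raised to its limit.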
Abstract: The main thrust of this paper is to assess the level of disclosure in the annual reports of non-financial Greek firms and to empirically investigate the hypothesized impact of several firm characteristics on the extent of mandatory disclosure. A disclosure checklist consisting of 100 mandatory items was developed to assess the level of disclosure in the 2009 annual reports of 43 Greek companies listed on the Athens Stock Exchange. The association between the level of disclosure and several firm characteristics was examined using multiple linear regression analysis. The study reveals that Greek companies have in general responded adequately to the mandatory disclosure requirements of the regulatory bodies. The findings also indicate that firm size was significantly and positively associated with the level of disclosure. The remaining variables, such as age, profitability, liquidity, and board composition, were found to be insignificant in explaining the variation in mandatory disclosure. The outcome of this study is of considerable interest to the investment community at large, as it assists in evaluating the extent of mandatory disclosure by Greek firms and in explaining the variation in disclosure in light of firm-specific characteristics.
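The regression step can be sketched with NumPy's least squares on entirely hypothetical data (the study's actual variables and values are not reproduced here; only firm size carries signal in this toy setup, mirroring the reported finding):

```python
import numpy as np

# Hypothetical data for 43 firms: disclosure index vs. firm characteristics.
rng = np.random.default_rng(42)
n = 43
size = rng.normal(6.0, 1.0, n)            # e.g. log of total assets (made up)
age = rng.uniform(5, 50, n)               # firm age in years (made up)
profitability = rng.normal(0.05, 0.03, n) # return on assets (made up)
disclosure = 0.4 + 0.05 * size + rng.normal(0, 0.03, n)  # only size matters

# Multiple linear regression via ordinary least squares.
X = np.column_stack([np.ones(n), size, age, profitability])
beta, *_ = np.linalg.lstsq(X, disclosure, rcond=None)
resid = disclosure - X @ beta
tss = (disclosure - disclosure.mean()) @ (disclosure - disclosure.mean())
r2 = 1.0 - (resid @ resid) / tss
```

A real analysis would additionally report coefficient standard errors and significance tests, which is how the study judged which characteristics were insignificant.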
Abstract: Gabor-based face representation has achieved enormous success in face recognition. This paper presents a novel algorithm for face recognition using neural networks trained on Gabor features. The system begins by convolving a face image with a series of Gabor filter coefficients at different scales and orientations. Two novel contributions of this paper are the scaling of rms contrast and the introduction of a fuzzily skewed filter. The neural network employed for face recognition is based on the multilayer perceptron (MLP) architecture with the backpropagation algorithm and incorporates the convolution filter response of the Gabor jet. The effectiveness of the algorithm has been demonstrated on a face database with images captured under different illumination conditions.
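The first step, convolving an image with a bank of Gabor filters at several orientations, can be sketched in plain NumPy; the kernel size, wavelength, and bandwidth below are illustrative, not the paper's settings:

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma, gamma=0.5):
    """Real-valued Gabor kernel: a cosine carrier under a Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotate to orientation theta
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

def convolve2d_valid(img, k):
    """Naive 'valid'-mode 2-D convolution, adequate for a small demo."""
    kh, kw = k.shape
    out = np.empty((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k[::-1, ::-1])
    return out

# A small bank over 4 orientations at a single scale (parameters illustrative).
img = np.random.default_rng(1).random((32, 32))
bank = [gabor_kernel(9, wavelength=4.0, theta=t, sigma=2.0)
        for t in np.linspace(0, np.pi, 4, endpoint=False)]
responses = [convolve2d_valid(img, k) for k in bank]
```

The stacked filter responses (the "Gabor jet") would then be fed to the MLP as its input features.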
Abstract: Several recent studies have shown that the
transparency of financial reporting has a significant influence on investors' decisions. Thus, regulatory authorities and professional
organizations (e.g. IFAC) have emphasized the role of XBRL (eXtensible Business Reporting Language) and interactive data as a means of
promoting transparency and monitoring corporate reporting. In this
context, the objective of this paper is to analyse interactive reporting through XBRL and its use as a support for decision-making
in corporate governance, namely the potential of interactive XBRL reports to increase the transparency and
monitoring of corporate governance processes.
Abstract: With advances in computer vision, non-contact gaze tracking systems are becoming easier to operate and more comfortable to use, and the technique proposed in this paper is specifically designed to achieve these goals. For convenience of operation, the proposal aims at a system with a simple configuration composed of a fixed wide-angle camera and dual infrared illuminators. To enhance the usability of this single-camera system, a self-adjusting method called the Real-time gaze Tracking Algorithm with head movement Compensation (RTAC) is developed, which estimates the gaze direction under natural head movement while simplifying the calibration procedure. In actual evaluations, an average accuracy of about 1° is achieved over a field of 20×15×15 cm³.
Abstract: Dew harvesting requires only a small investment and
exploits a free, clean and inexhaustible resource. This study aims to
measure the relative contributions of dew and rain water on the
Mediterranean Dalmatian coast and islands of Croatia and to determine
whether dew water is potable. Two sites were chosen: an open site on
the coast favourable to dew formation (Zadar) and a less favourable
site in a cirque of mountains in Komiža (Vis Island). Between July
1st, 2003 and October 31st, 2006, dew was collected daily on a
1 m² test dew condenser tilted at 30°, together with ordinary
meteorological data (air temperature and relative humidity, cloud
coverage, wind speed and direction). The mean yearly cumulative
dew yields were found to be 20 mm (Zadar) and 9.3 mm (Komiža).
During the dry season (May to October), the monthly cumulative dew
water yield can represent up to 38% of the water collected as rainfall.
In July 2003 and 2006, dew water represented about 120% of the
monthly cumulative rain water. Dew and rain water were analyzed in
Zadar, where the following parameters were measured: pH, electrical
conductivity, major anions (HCO₃⁻, Cl⁻, SO₄²⁻, NO₃⁻) and major
cations (NH₄⁺, Na⁺, K⁺, Ca²⁺, Mg²⁺). Both dew and rain water are in
conformity with the WHO directives for potability, except for Mg²⁺.
Using existing roofs and refurbishing the abandoned impluviums to
permit dew collection could then provide a useful supplementary
amount of water, especially during the dry season.
Abstract: A considerable amount of energy is consumed during
transmission and reception of messages in a wireless mesh network
(WMN). Reducing per-node transmission power would greatly
increase the network lifetime via power conservation in addition to
increasing the network capacity via better spatial bandwidth reuse. In
this work, the problem of topology control in a hybrid WMN of
heterogeneous wireless devices with varying maximum transmission
ranges is considered. A localized distributed topology control
algorithm is presented which calculates the optimal transmission
power so that (1) network connectivity is maintained, (2) node
transmission power is reduced to cover only the nearest neighbours,
and (3) the network lifetime is extended. Simulations and analysis of
the results are carried out in the NS-2 environment to demonstrate the
correctness and effectiveness of the proposed algorithm.
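As an illustrative sketch (not the paper's algorithm), criterion (2), reducing each node's power to just cover its nearest neighbours subject to each node's hardware maximum, can be expressed as:

```python
import numpy as np

def assign_tx_ranges(positions, max_ranges, k=1):
    """Set each node's transmission range to the distance of its k-th
    nearest neighbour, capped by that node's maximum range
    (heterogeneous devices)."""
    pos = np.asarray(positions, dtype=float)
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=2)  # pairwise distances
    np.fill_diagonal(d, np.inf)                 # a node is not its own neighbour
    nearest_k = np.sort(d, axis=1)[:, k - 1]    # distance to k-th nearest neighbour
    return np.minimum(nearest_k, np.asarray(max_ranges, dtype=float))

# Toy example: 5 heterogeneous nodes placed on a line.
positions = [(0, 0), (1, 0), (3, 0), (6, 0), (7, 0)]
max_ranges = [5.0, 5.0, 2.5, 5.0, 5.0]
ranges = assign_tx_ranges(positions, max_ranges)
```

Note that covering only the nearest neighbour does not by itself guarantee criterion (1); a complete algorithm such as the one in the paper must also verify that connectivity is preserved.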
Abstract: Recently, the analysis and design of structures
based on reliability theory have been a center of attention.
The reason for this attention is the naturally random character of
structural parameters such as material specifications, external
loads, geometric dimensions, etc. By means of reliability theory,
uncertainties resulting from the statistical nature of the structural
parameters can be translated into mathematical equations, and
safety and operational considerations can be taken into account in the
design process. According to this theory, it is possible to study the
failure probability not only of a specific element but also of the
entire system. Therefore, after being assured of the safety of every
element, the reciprocal effects of the elements on the safety of the
entire system can be investigated.
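A minimal Monte Carlo sketch of this idea estimates the failure probability P(g(X) < 0) for a limit-state function g; the resistance-load limit state and its distributions below are illustrative, not taken from the paper:

```python
import numpy as np

def failure_probability(limit_state, sample, n=200_000, seed=0):
    """Monte Carlo estimate of P(g(X) < 0)."""
    rng = np.random.default_rng(seed)
    x = sample(rng, n)
    return np.mean(limit_state(x) < 0.0)

# Illustrative limit state g = R - S:
# resistance R ~ N(300, 30), load effect S ~ N(200, 40).
def sample(rng, n):
    r = rng.normal(300.0, 30.0, n)
    s = rng.normal(200.0, 40.0, n)
    return np.stack([r, s], axis=1)

def limit_state(x):
    return x[:, 0] - x[:, 1]    # failure when load exceeds resistance

pf = failure_probability(limit_state, sample)
```

For this Gaussian case the exact value is Φ(-100/50) = Φ(-2) ≈ 0.0228, so the estimate can be checked analytically; system-level analysis would combine such element-level probabilities.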
Abstract: Logic-based methods for learning from structured data
are limited with respect to handling large search spaces, preventing
large-sized substructures from being considered by the resulting
classifiers. A novel approach to learning from structured data is
introduced that employs a structure transformation method, called
finger printing, to address these limitations. The method, which
generates features corresponding to arbitrarily complex substructures,
is implemented in a system called DIFFER. The method is demonstrated
to perform comparably to an existing state-of-the-art method on several
benchmark data sets without requiring restrictions on the search space.
Furthermore, learning from the union of the features generated by finger
printing and the previous method outperforms learning from each
individual set of features on all benchmark data sets, demonstrating
the benefit of developing complementary, rather than competing,
methods for structure classification.
Abstract: The wavelet transform is one of the most important
methods used in signal processing. In this study, we introduce the
frequency-energy characteristics of local earthquakes using the
discrete wavelet transform. The frequency-energy characteristics were
analyzed depending on the difference between the P and S wave arrival
times and the noise within the records. We found that local
earthquakes have similar characteristics. If the frequency-energy
characteristics can be determined accurately, this provides a hint for
calculating the P and S wave arrival times, and the wavelet transform
is shown to provide a successful approximation for this purpose. In
this study, approximately 100 earthquakes with 500 records were
analyzed.
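As a sketch of the idea, an orthonormal Haar DWT implemented in plain NumPy can split a record into frequency bands and measure the energy in each; the synthetic signal below (a low-frequency background plus a high-frequency burst) stands in for an earthquake record, and the wavelet family of the study is not reproduced here:

```python
import numpy as np

def haar_dwt_energy(x, levels):
    """Multi-level orthonormal Haar DWT; returns the energy of each
    detail level plus the final approximation energy."""
    x = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        even, odd = x[0::2], x[1::2]
        approx = (even + odd) / np.sqrt(2.0)   # low-pass (scaling) coefficients
        detail = (even - odd) / np.sqrt(2.0)   # high-pass (wavelet) coefficients
        energies.append(np.sum(detail**2))
        x = approx
    energies.append(np.sum(x**2))
    return energies

# Synthetic "record": 1 Hz background plus a 20 Hz burst, sampled at 100 Hz.
t = np.arange(1024) / 100.0
signal = np.sin(2 * np.pi * 1.0 * t)
signal[400:480] += 0.8 * np.sin(2 * np.pi * 20.0 * t[400:480])
e = haar_dwt_energy(signal, levels=4)
```

Because the Haar transform is orthonormal, the per-band energies sum to the total signal energy (Parseval), which is what makes a frequency-energy profile of a record well defined.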
Abstract: Well-developed strategic marketing planning is an essential
prerequisite for establishing the right and unique competitive
advantage. A typical market, however, is a heterogeneous
and decentralized structure that naturally involves individual
or group subjectivity and irrationality. These features cannot be
fully expressed with one-shot rigorous formal models based on,
e.g. mathematics, statistics or empirical formulas. We present an
innovative solution, extending the domain of agent-based computational
economics towards the concept of hybrid modeling in service
provider and consumer markets such as telecommunications. The
behavior of the market is described by two classes of agents -
consumer and service provider agents - whose internal dynamics
are fundamentally different. Customers are rather free multi-state
structures, adjusting behavior and preferences quickly in accordance
with time and changing environment. Producers, on the contrary,
are traditionally structured companies with comparable internal processes
and specific managerial policies. Their business momentum is
higher and their immediate reaction possibilities are limited. This
limitation underlines the importance of proper strategic planning as the main
process advising managers in time whether to continue with more
or less the same business or whether to consider the need for future
structural changes that would ensure retention of existing customers
or acquisition of new ones.
Abstract: A generic and extendible Multi-Agent Data Mining
(MADM) framework, MADMF (the Multi-Agent Data Mining
Framework), is described. The central feature of the framework is that
it avoids the use of agreed meta-language formats by supporting a
framework of wrappers.
The advantage offered is that the framework is easily extendible,
so that further data agents and mining agents can simply be added to
it. A demonstration implementation of MADMF is currently
available. The paper includes details of the MADMF architecture and
the wrapper principle incorporated into it. A full description and
evaluation of the framework's operation is provided by considering
two MADM scenarios.
Abstract: In this paper, we propose a new method to distinguish
between arousal and relaxation states using multiple features
acquired from a photoplethysmogram (PPG) and a support vector
machine (SVM). To induce arousal and relaxation states in subjects,
two kinds of sound stimuli are used, and the corresponding biosignals
are obtained using the PPG sensor. Two features, the pulse-to-pulse
interval (PPI) and the pulse amplitude (PA), are extracted from the
acquired PPG data, and a nonlinear classification between arousal and
relaxation is performed using the SVM.
This methodology has several advantages over previous similar
studies. Firstly, we extract two separate features from the PPG,
i.e., PPI and PA. Secondly, to improve the classification accuracy,
SVM-based nonlinear classification is performed. Thirdly, to overcome
classification problems caused by features generalized over all
subjects, we define thresholds individually for each subject.
Experimental results showed that the average classification
accuracy was 74.67%. The proposed method also showed better
identification performance than single-feature-based methods. From
these results, we confirm that arousal and relaxation states can be
classified using the SVM and PPG features.
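The feature-extraction step can be sketched with a simple local-maximum peak detector on a synthetic PPG-like trace; the paper's actual preprocessing and detector are not specified here, so this is an assumption-laden illustration:

```python
import numpy as np

def extract_ppi_pa(ppg, fs, threshold=0.8):
    """Extract pulse-to-pulse intervals (PPI, seconds) and pulse
    amplitudes (PA) from a PPG trace via local-maximum peak detection."""
    ppg = np.asarray(ppg, dtype=float)
    peaks = np.array([i for i in range(1, len(ppg) - 1)
                      if ppg[i] > ppg[i - 1] and ppg[i] >= ppg[i + 1]
                      and ppg[i] > threshold])
    ppi = np.diff(peaks) / fs     # inter-pulse intervals in seconds
    pa = ppg[peaks]               # pulse amplitudes at the detected peaks
    return ppi, pa

# Synthetic PPG-like trace: a 1.2 Hz "heartbeat" sampled at 100 Hz for 10 s.
fs = 100.0
t = np.arange(0, 10, 1 / fs)
ppg = 0.5 + 0.5 * np.sin(2 * np.pi * 1.2 * t)
ppi, pa = extract_ppi_pa(ppg, fs)
```

The resulting (PPI, PA) pairs, computed per subject, are the kind of feature vectors an SVM with a nonlinear kernel would then classify into arousal vs. relaxation.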
Abstract: This paper presents the results of a measurement campaign
carried out at a carrier frequency of 2.4 GHz with the help of a
TP-LINK router in indoor line-of-sight (LOS) scenarios. Firstly, the
radio wave propagation strategies are analyzed in several rooms with
the router in a point-to-point ad hoc network. Then the floor
attenuation is determined for three floors in the experimental region.
The free space model and dual-slope models are modified by considering
the influence of corridor conditions on each floor. Using these models,
indoor signal attenuation can be estimated when modeling indoor radio
wave propagation. These results and modified models can also be used
in planning networks for future personal communications services.
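A generic dual-slope log-distance path loss model of the kind referred to here can be sketched as follows; the reference loss, exponents, and breakpoint distance are illustrative defaults, not the paper's fitted values:

```python
import numpy as np

def dual_slope_path_loss(d, pl0=40.0, d0=1.0, dbp=10.0, n1=2.0, n2=3.5):
    """Dual-slope log-distance path loss in dB: exponent n1 up to the
    breakpoint distance dbp, exponent n2 beyond it (continuous at dbp)."""
    d = np.asarray(d, dtype=float)
    near = pl0 + 10.0 * n1 * np.log10(d / d0)
    far = (pl0 + 10.0 * n1 * np.log10(dbp / d0)
           + 10.0 * n2 * np.log10(d / dbp))
    return np.where(d <= dbp, near, far)

distances = np.array([1.0, 5.0, 10.0, 20.0, 40.0])   # metres
pl = dual_slope_path_loss(distances)
```

Measurement campaigns like this one fit pl0, n1, n2, and dbp to the recorded received-power samples, and corridor or floor effects enter as additional attenuation terms.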
Abstract: Organic farmers across Saskatchewan face soil
phosphorus (P) shortages. Due to the restriction on inputs in organic
systems, farmers rely on crop rotation and naturally-occurring
arbuscular mycorrhizal fungi (AMF) for plant P supply. Crop rotation
is important for disease, pest, and weed management. Crops that are
not colonized by AMF (non-mycorrhizal) can decrease colonization
of a following crop. An experiment was performed to quantify soil P
cycling in four cropping sequences under organic management and
determine if mustard (non-mycorrhizal) was delaying the
colonization of subsequent wheat. Soils from the four cropping
sequences were measured for inorganic soil P (Pi), AMF spore
density (SD), phospholipid fatty acid analysis (PLFA, for AMF
biomarker counts), and alkaline phosphatase activity (ALPase,
related to AMF metabolic activity). Plants were measured for AMF
colonization and P content and uptake of above-ground biomass. A
lack of difference in AMF activity indicated that mustard was not
depressing colonization. Instead, AMF colonization was largely
determined by crop type and crop rotation.
Abstract: In order to automatically differentiate between two
modes of permanent flow of a liquid simulating blood, it was first
necessary to assemble a data bank. The acquisition of amplitude
spectra of the Doppler signal of this liquid in laminar flow, and of
other spectra in turbulent flow, enabled us to establish an automatic
distinction between the two modes. Based on the number of parameters
and their nature, a comparative study allowed us to choose the best
classifier.
Abstract: Concurrency and synchronization are becoming major
issues as every new PC comes with multi-core processors. A major
original motivation for object-oriented programming was to enable
easier reuse: encode your algorithm in a class and thoroughly
debug it, then reuse the class again and again. However,
when it comes to concurrency and synchronization, this is often not
possible: thread-safety issues mean that synchronization constructs
need to be entangled into every class involved. We contribute a
detailed literature review of the issues and challenges in concurrent
programming and present a methodology that uses the aspect-oriented
paradigm to address this problem. Aspects allow us to extract the
synchronization concerns as schemes to be "weaved in" later into the
main code, so that the aspects can be separately tested and verified.
Hence, the functional components can be weaved with reusable
synchronization schemes that are robust and scalable.
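The paper's methodology targets aspect-oriented weaving in the AspectJ sense; as a language-shifted sketch of the same idea, a Python class decorator can act as a reusable synchronization scheme woven into an existing class without touching its functional code (all names below are hypothetical):

```python
import threading

def synchronize(*method_names):
    """Class decorator 'weaving' a per-instance lock around the named
    methods, keeping the synchronization concern out of the class body."""
    def weave(cls):
        for name in method_names:
            plain = getattr(cls, name)
            def locked(self, *args, __plain=plain, **kwargs):
                with self._lock:                 # the woven-in aspect
                    return __plain(self, *args, **kwargs)
            setattr(cls, name, locked)
        orig_init = cls.__init__
        def init(self, *args, **kwargs):
            self._lock = threading.RLock()       # reentrant, so woven methods may nest
            orig_init(self, *args, **kwargs)
        cls.__init__ = init
        return cls
    return weave

# Functional component written with no synchronization code at all.
@synchronize("deposit", "withdraw")
class Account:
    def __init__(self):
        self.balance = 0
    def deposit(self, amount):
        self.balance += amount
    def withdraw(self, amount):
        self.balance -= amount

acct = Account()
threads = [threading.Thread(target=lambda: [acct.deposit(1) for _ in range(1000)])
           for _ in range(4)]
for th in threads: th.start()
for th in threads: th.join()
# With the woven lock, the four threads' updates are serialized: balance == 4000.
```

Because the `synchronize` scheme lives outside the class, it can be tested and verified separately and then reused across many functional components, which is exactly the separation the abstract argues for.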