Abstract: Decision support systems are usually based on
multidimensional structures built on the concept of a hypercube.
Dimensions are the axes along which facts are analyzed; they form a
space in which a fact is located by a set of coordinates at the
intersections of dimension members. Conventional
multidimensional structures deal with discrete facts linked to discrete
dimensions. However, when dealing with natural continuous
phenomena the discrete representation is not adequate. There is a
need to integrate spatiotemporal continuity within multidimensional
structures to enable analysis and exploration of continuous field data.
The research issues raised by the integration of spatiotemporal
continuity into multidimensional structures are numerous. In this paper,
we discuss these issues, briefly present a multidimensional model for
continuous field data, and define new aggregation operations. The model
and its associated operations and measures are validated by a prototype.
Abstract: The EEG signal is one of the oldest measures of brain
activity and has been used widely for clinical diagnosis and
biomedical research. However, EEG signals are highly contaminated
with various artifacts, both from the subject and from equipment
interference. Among these artifacts, ocular noise is the most
important one. Since many applications, such as BCI, require online
and real-time processing of the EEG signal, it is ideal if artifact
removal is performed in an online fashion. Recently, several methods
for online ocular artifact removal have been proposed. One of these
methods is ARMAX modeling of the EEG signal. This method assumes that
the recorded EEG signal is a combination of EOG artifacts and the
background EEG; the background EEG is then estimated via estimation
of the ARMAX parameters. The other recently proposed method is based
on adaptive filtering. This method uses the EOG signal as the
reference input and subtracts EOG artifacts from the recorded EEG
signal. In this paper we investigate the efficiency of each method
for removing EOG artifacts and compare the two methods. Our
conclusion from this comparison is that the adaptive filtering method
yields better results than ARMAX modeling.
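The adaptive-filtering approach described above can be sketched with a least-mean-squares (LMS) filter; the tap count and step size below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def lms_eog_removal(eeg, eog, n_taps=3, mu=0.01):
    """Remove ocular artifacts from one EEG channel with an LMS adaptive
    filter that uses the EOG channel as the reference input.
    Illustrative sketch; n_taps and mu are assumed values."""
    w = np.zeros(n_taps)                     # adaptive filter weights
    cleaned = np.zeros_like(eeg, dtype=float)
    for n in range(n_taps - 1, len(eeg)):
        x = eog[n - n_taps + 1:n + 1][::-1]  # current and recent EOG samples
        y = w @ x                            # estimated artifact at time n
        e = eeg[n] - y                       # error = estimated background EEG
        w += 2 * mu * e * x                  # LMS weight update
        cleaned[n] = e
    return cleaned
```

Because each sample is processed as it arrives, this kind of filter fits the online, real-time requirement the abstract emphasizes.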
Abstract: This work concerns the measurement of the S-parameters of a
Bulk Acoustic Wave (BAW) emission filter and their comparison with
simulated prototypes. Using HP-ADS, a co-simulation of the filter's
characteristics in a digital radio-communication chain is performed.
Four modulation schemes are studied in order to illustrate the impact
of the spectral occupation of the modulated signal. Results of the
simulations and co-simulations are given in terms of Error Vector
Measurements, which are useful for a general sensitivity analysis of
3rd/4th Generation (3G/4G) emitters (wideband QAM and OFDM signals).
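Error Vector Measurements quantify modulation quality as the RMS distance between measured and ideal constellation symbols; a minimal sketch follows (the QPSK constellation and gain error below are only illustrative assumptions):

```python
import numpy as np

def evm_percent(measured, reference):
    """RMS Error Vector Magnitude (in %) of measured complex symbols
    against their ideal reference constellation points."""
    err = measured - reference
    return 100 * np.sqrt(np.mean(np.abs(err) ** 2) /
                         np.mean(np.abs(reference) ** 2))

# Example: unit-energy QPSK symbols; a 10% gain error yields 10% EVM.
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
```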
Abstract: After the terrorist attacks of September 11, 2001 in the
U.S., the issue of container security received great attention,
especially from the U.S. government, which deployed numerous measures
to promote or improve security systems. The U.S. government not only
enhances its national security system but also allies with other
countries against potential future terrorist attacks. For example,
the Container Security Initiative (CSI) encourages foreign ports
outside the U.S. to become CSI ports as part of the U.S.
anti-terrorism network. Although promoting security can partly
achieve the goal of anti-terrorism, it also affects the efficiency of
the container supply chain, which is the main concern when
implementing inspection measures. This paper proposes a quick
estimation methodology for the inspection service rate, based on a
berth allocation heuristic, such that the inspection activities do
not affect the original container supply chain. Theoretical and
simulation results show that this approach is effective.
Abstract: Word sense disambiguation is one of the most important open problems in natural language processing applications such as information retrieval and machine translation. Several strategies can be employed to resolve word ambiguity with a reasonable degree of accuracy: knowledge-based, corpus-based, and hybrid approaches. This paper focuses on the corpus-based strategy, employing an unsupervised learning method for disambiguation. We report our investigation of Latent Semantic Indexing (LSI), an unsupervised information retrieval technique, applied to the task of disambiguating Thai nouns and verbs. Latent Semantic Indexing has been shown to be efficient and effective for information retrieval. For the purposes of this research, we report experiments on two Thai polysemous words, namely /hua4/ and /kep1/, used as representatives of Thai nouns and verbs respectively. The results of these experiments demonstrate the effectiveness, and indicate the potential, of applying vector-based distributional information measures to semantic disambiguation.
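The LSI machinery can be illustrated on toy English data (not the paper's Thai corpus): contexts of an ambiguous word are mapped into a low-rank latent space via truncated SVD and compared by cosine similarity.

```python
import numpy as np

def lsi_context_vectors(contexts, k=2):
    """Project bag-of-words contexts into a k-dimensional latent space
    with a truncated SVD of the term-by-context matrix (the core of LSI)."""
    vocab = sorted({w for c in contexts for w in c.split()})
    A = np.array([[c.split().count(w) for c in contexts] for w in vocab],
                 dtype=float)                 # term-by-context counts
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt[:k].T * s[:k]                   # one k-dim vector per context

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
```

Contexts that use the ambiguous word in the same sense end up close in the latent space, which is what makes unsupervised sense discrimination possible.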
Abstract: A phenomenological model for species spreading which incorporates the Allee effect, a species' maximum attainable growth rate, collective dispersal rate and dispersal adaptability is presented. This builds on a well-established reaction-diffusion model for the spatial spreading of invading organisms. The model is phrased in terms of the "hostility" (which quantifies the Allee threshold in relation to environmental sustainability) and dispersal adaptability (which measures how well a species is able to adapt its migratory response to environmental conditions). The species' invading/retreating speed and the sharpness of the invading boundary are explicitly characterised in terms of the fundamental parameters and analysed in detail.
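A reaction-diffusion model with a strong Allee effect of the kind the abstract builds on can be written as follows (an illustrative textbook form; the paper's exact formulation and parameterisation may differ):

```latex
\frac{\partial u}{\partial t}
  = D\,\frac{\partial^2 u}{\partial x^2}
  + r\,u\left(1-\frac{u}{K}\right)\left(\frac{u}{A}-1\right)
```

where $u(x,t)$ is the population density, $D$ the collective dispersal rate, $r$ the maximum attainable growth rate, $K$ the carrying capacity and $A$ the Allee threshold ($0 < A < K$). For this cubic nonlinearity the travelling-front speed changes sign with the ratio $A/K$: the population invades when $A < K/2$ and retreats when $A > K/2$, which is the kind of invading/retreating dichotomy the abstract characterises.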
Abstract: Recently, much research has been conducted to identify
pertinent parameters and adequate models for automatic music genre
classification. In this paper, two measures based on information
theory concepts are investigated for mapping the feature space to the
decision space. A Gaussian Mixture Model (GMM) is used as a baseline
and reference system. Various strategies are proposed for the
training and testing sessions, with matched or mismatched conditions:
long training and long testing, and long training and short testing.
For all experiments, the file sections used for testing were never
used during training. With matched conditions, all examined measures
yield the best and similar scores (almost 100%). With mismatched
conditions, the proposed measures yield better scores than the GMM
baseline system, especially in the short testing case. It is also
observed that the average discrimination information measure is most
appropriate for music category classification, whereas the divergence
measure is more suitable for music subcategory classification.
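A common discrimination-information measure of the kind mentioned above is the symmetric Kullback-Leibler divergence, which has a closed form for one-dimensional Gaussian class models (a sketch of the general idea, not necessarily the paper's exact measure):

```python
import numpy as np

def gaussian_divergence(m1, v1, m2, v2):
    """Symmetric KL divergence between N(m1, v1) and N(m2, v2),
    a classic measure of how separable two class models are."""
    kl_12 = 0.5 * (v1 / v2 + (m2 - m1) ** 2 / v2 - 1 + np.log(v2 / v1))
    kl_21 = 0.5 * (v2 / v1 + (m1 - m2) ** 2 / v1 - 1 + np.log(v1 / v2))
    return kl_12 + kl_21
```

The divergence is zero only for identical models and grows as the class-conditional feature distributions separate, which is what makes it usable as a classification score.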
Abstract: In this study, a fuzzy similarity approach for Arabic web page classification is presented. The approach uses a fuzzy term-category relation, manipulating the membership degrees of the training data and the degree value of a test web page. Six measures are used and compared in this study: the Einstein, Algebraic, Hamacher, MinMax, Special case fuzzy and Bounded Difference approaches. These measures are applied and compared using 50 different Arabic web pages. The Einstein measure gave the best performance among them. An analysis of these measures and concluding remarks are drawn in this study.
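Five of the six measure names above correspond to standard fuzzy conjunction (t-norm) forms; a sketch of those standard definitions follows (the paper's exact formulations for term-category matching may differ, and the "Special case fuzzy" measure is omitted because its definition is paper-specific):

```python
def einstein(a, b):
    """Einstein product t-norm."""
    return (a * b) / (2 - (a + b - a * b))

def algebraic(a, b):
    """Algebraic product t-norm."""
    return a * b

def hamacher(a, b):
    """Hamacher product t-norm (0 when both memberships are 0)."""
    return 0.0 if a == b == 0 else (a * b) / (a + b - a * b)

def min_max(a, b):
    """Minimum t-norm (the 'MinMax' combination)."""
    return min(a, b)

def bounded_difference(a, b):
    """Bounded difference (Lukasiewicz) t-norm."""
    return max(0.0, a + b - 1)
```

All of these combine two membership degrees in [0, 1] into one, and they differ in how strictly they penalize partial membership, which is why their classification performance can differ.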
Abstract: Recently, Denial of Service (DoS) attacks and Distributed DoS (DDoS) attacks, a stronger form of DoS attack launched from multiple hosts, have become security threats on the Internet. Identifying the attack source and blocking the attack traffic are important measures against these attacks. In general, it is difficult to identify the attack source because information about it is falsified. Therefore, a method of identifying the attack source by tracing the route of the attack traffic is necessary. A traceback method which uses traffic patterns, taking changes in the number of packets over time as the criterion for the attack traceback, has been proposed. This traceback method can trace the attack by matching the shapes of input traffic patterns with the shape of the output traffic pattern observed at a network branch point such as a router. A traffic pattern is the shape of the traffic over time, and it is information that cannot be falsified. However, the traceback methods proposed to date cannot obtain sufficient tracing accuracy, because they directly use traffic patterns which are influenced by non-attack traffic. In this paper, a new traffic pattern matching method using Independent Component Analysis (ICA) is proposed.
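The role ICA plays here, separating an attack pattern from superimposed non-attack traffic observed at a branch point, can be illustrated with a minimal two-component FastICA (the signals and mixing matrix below are synthetic assumptions, not real traffic):

```python
import numpy as np

def fastica_2(X, n_iter=200):
    """Minimal FastICA (tanh nonlinearity, deflation) separating two
    mixed signals. X has shape (2, n_samples); returns the two
    estimated independent components. Illustrative sketch only."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))          # whitening transform
    Xw = E @ np.diag(d ** -0.5) @ E.T @ X
    rng = np.random.default_rng(0)
    W = np.zeros((2, 2))
    for i in range(2):
        w = rng.standard_normal(2)
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            wx = w @ Xw
            g, g_prime = np.tanh(wx), 1 - np.tanh(wx) ** 2
            w = (Xw * g).mean(axis=1) - g_prime.mean() * w
            for j in range(i):                # deflation: orthogonalize
                w -= (w @ W[j]) * W[j]
            w /= np.linalg.norm(w)
        W[i] = w
    return W @ Xw
```

Up to scaling and ordering, the recovered components match the original patterns, which is what lets pattern matching work on the attack component alone rather than on the contaminated mixture.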
Abstract: This article investigated the validity of the C-test and Cloze test, which purport to measure general English proficiency. To provide empirical evidence pertaining to the validity of the interpretations based on the results of these integrative language tests, their criterion-related validity was investigated. In doing so, the Test of English as a Foreign Language (TOEFL), which is an established, standardized, and internationally administered test of general English proficiency, was used as the criterion measure. Some 90 Iranian English majors participated in this study. They were seniors studying English at a university in Tehran, Iran. The results of the analyses showed that there is a statistically significant correlation among participants' scores on the Cloze test, C-test, and the TOEFL. Building on the findings of the study, and considering criterion-related validity as the evidential basis of the validity argument, it was cautiously deduced that these tests measure the same underlying trait. However, considering the limitations of using criterion measures to validate tests, no absolute claims can be made as to the construct validity of these integrative tests.
Abstract: This paper presents an integrated model that
automatically measures changes in rivers, the damaged area around
bridges, and changes in vegetation. The proposed model is based on a
neuro-fuzzy mechanism enhanced by an SOM optimization algorithm, and
includes three functions to deal with river imagery. High-resolution
imagery from the FORMOSAT-2 satellite, taken before and after the
invasion period, is adopted. For a bridge randomly selected out of
129 destroyed bridges, the recognition results show that the average
river width increased by 66%. The ruined segment of the bridge is
located exactly at the most scoured region. The vegetation coverage
was also reduced to nearly 90% of the original. The results yielded
by the proposed model demonstrate an accuracy rate of 99.94%. This
study provides a successful tool not only for large-scale damage
assessment but also for the precise measurement of disasters.
Abstract: Ten percent of the population will develop plantar
fasciitis (PF) during their lifetime. Two million people are treated
yearly, accounting for 11-15% of visits to medical professionals.
Treatment ranges from conservative to surgical intervention. The
purpose of this study was to assess the effects of extracorporeal
shockwave therapy (ECSWT) on heel pain, function, range of motion
(ROM), and strength in patients with PF. One hundred subjects were
treated with ECSWT and measures were taken before and three
months after treatment. There were significant differences in visual
analog scale scores for pain at rest (p=0.0001), after activity
(p=0.0001), and overall improvement (p=0.0001). There was also
significant improvement in Lower Extremity Functional Scale scores
(p=0.0001); ankle plantarflexion (p=0.0001), dorsiflexion (p=0.001),
and eversion (p=0.017) ROM; and first metatarsophalangeal joint
flexion (p=0.002) and extension (p=0.003) ROM. ECSWT is an effective
treatment, improving heel pain, function, and ROM in patients with
PF.
Abstract: This paper explores the scalability issues associated
with solving the Named Entity Recognition (NER) problem using
Support Vector Machines (SVM) and high-dimensional features. The
performance results of a set of experiments conducted using binary
and multi-class SVM with increasing training data sizes are
examined. The NER domain chosen for these experiments is the
biomedical publications domain, especially selected due to its
importance and inherent challenges. A simple machine learning
approach is used that eliminates prior language knowledge such as
part-of-speech or noun phrase tagging thereby allowing for its
applicability across languages. No domain-specific knowledge is
included. The accuracy measures achieved are comparable to those
obtained using more complex approaches, which motivates investigating
ways to improve the scalability of multi-class SVM in order to make
the solution more practical and usable. Improving the training time
of multi-class SVM would make support vector machines a more viable
and practical machine learning solution for real-world problems with
large datasets. An initial prototype greatly improves training time
at the expense of increased memory requirements.
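The binary-to-multi-class step at the heart of the scalability question can be sketched as a one-vs-rest decomposition. In this sketch a plain perceptron stands in for the binary SVMs (the decomposition is the same either way), and all data are toy assumptions:

```python
import numpy as np

class OneVsRest:
    """One-vs-rest reduction of multi-class classification to binary
    classifiers. A simple perceptron stands in for the binary SVMs;
    the point is the decomposition, not the base learner."""
    def __init__(self, n_epochs=20):
        self.n_epochs = n_epochs

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        Xb = np.hstack([X, np.ones((len(X), 1))])   # append bias column
        self.W = np.zeros((len(self.classes_), Xb.shape[1]))
        for c_idx, c in enumerate(self.classes_):
            t = np.where(y == c, 1.0, -1.0)          # binary relabeling
            w = np.zeros(Xb.shape[1])
            for _ in range(self.n_epochs):
                for xi, ti in zip(Xb, t):
                    if ti * (w @ xi) <= 0:           # misclassified sample
                        w += ti * xi                 # perceptron update
            self.W[c_idx] = w
        return self

    def predict(self, X):
        Xb = np.hstack([X, np.ones((len(X), 1))])
        return self.classes_[np.argmax(self.W @ Xb.T, axis=0)]
```

Training k independent binary classifiers is what makes one-vs-rest attractive for scalability: each binary problem can be trained separately (and in parallel), unlike a single joint multi-class optimization.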
Abstract: This paper presents a research agenda on the adaptation of
the SCOR model. The SCOR model is designed to measure supply chain
performance and logistics impact across the boundaries of individual
organizations. It is in the growth stage of its life cycle and enjoys
the leverage of becoming the industry standard. The SCOR model has
been developed and widely used in the context of developed countries.
This research focuses on adapting the SCOR model for the
manufacturing industry in developing countries. With the necessary
understanding of the characteristics, difficulties and problems of
the manufacturing industry supply chain in developing countries, we
design an adapted model with its building blocks: a business process
model, performance measures and best practices.
Abstract: It is the living conditions in the cities that determine the future of our livelihood. "To change life, we must first change space" (Henri Lefebvre). Sustainable development is a utopian aspiration for South African cities (especially the Gauteng City Region, the case study considered here), which are currently characterized by unplanned growth and increasing urban sprawl. While the reasons for poor environmental quality and living conditions are undoubtedly diverse and complex, having political, economic and social dimensions, it is argued that the prevailing approach to layout planning in South Africa is part of the problem. This article seeks a solution to the problem of sustainability from a spatial planning perspective. The spatial planning tool, the urban development boundary, is introduced as the concept that will ensure that empty talk is translated into a sustainable vision. The urban development boundary is a spatial planning tool that can be implemented to direct urban growth towards a more sustainable form. It aims to ensure planned urban areas, in contrast to the current unplanned areas characterized by urban sprawl and insufficient infrastructure. However, the success of the urban development boundary concept is subject to effective implementation measures, as well as adequate and efficient management. The concept of sustainable development can function as a driving force underlying societal change and transformation, but the interface between spatial planning and environmental management needs to be established (as these are the core aspects underlying sustainable development), and authorities need to understand and implement this interface consistently. This interface can, however, be realized in terms of the objectives of the planning tool, the urban development boundary.
The case study, the Gauteng City Region, is depicted as a site of economic growth and innovation, but there is a lack of good urban and regional governance, impacting on the design (layout) and function of urban areas and land use, as current authorities make uninformed decisions on development applications, leading to unsustainable urban forms and unsustainable nodes. Place and space concepts are thus critical matters in the planning of the Gauteng City Region. The urban development boundary is thus explored as a planning tool to guide decision-making and create a sustainable urban form, leading to better environmental and living conditions, and continuous sustainability.
Abstract: In this work we introduce an efficient method to limit
the impact of the hiding process on the quality of the cover speech.
Vector quantization of the speech spectral information drastically
reduces the number of secret speech parameters to be embedded
in the cover signal. Compared to scalar hiding, the vector quantization
hiding technique provides a stego signal that is indistinguishable from
the cover speech. The objective and subjective performance measures
reveal that the current hiding technique attracts no suspicion about the
presence of the secret message in the stego speech, while being able
to recover an intelligible copy of the secret message at the receiver
side.
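The vector-quantization step can be sketched with a plain k-means codebook: each spectral vector of the secret speech is replaced by a short codebook index, which is what shrinks the payload to be hidden (toy data below; the paper's actual codebook design is not specified here):

```python
import numpy as np

def train_codebook(vectors, k, n_iter=20):
    """k-means codebook: each input vector can then be transmitted as a
    single index into the k entries instead of its full coefficients."""
    init = np.linspace(0, len(vectors) - 1, k).astype(int)
    codebook = vectors[init].astype(float).copy()
    for _ in range(n_iter):
        idx = np.argmin(((vectors[:, None] - codebook) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(idx == j):              # leave empty cells unchanged
                codebook[j] = vectors[idx == j].mean(axis=0)
    return codebook

def quantize(vectors, codebook):
    """Map each vector to the index of its nearest codeword."""
    return np.argmin(((vectors[:, None] - codebook) ** 2).sum(-1), axis=1)
```

Embedding a few index bits per frame instead of full scalar parameters is what leaves the cover speech nearly untouched, at the cost of the quantization error in the recovered secret speech.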
Abstract: Activity-Based Costing (ABC), which has become an important aspect of manufacturing and service organizations, can be defined as a methodology that measures the cost and performance of activities, resources and cost objects. It can be considered an alternative paradigm to traditional cost-based accounting systems. The objective of this paper is to illustrate an application of the ABC method and to compare its results with those of traditional costing methods. The results of the application highlight the weak points of traditional costing methods, and the S-curve obtained is used to identify the undercosted and overcosted products of the firm.
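The mechanics of ABC versus traditional volume-based allocation can be illustrated with hypothetical numbers (all figures below are invented for illustration):

```python
# Hypothetical activity pools: activity -> (total cost, total driver volume)
activities = {
    "setup":      (20_000, 100),    # cost driver: number of setups
    "machining":  (60_000, 3_000),  # cost driver: machine hours
    "inspection": (20_000, 500),    # cost driver: inspections
}
# Driver units each product consumes
usage = {
    "A": {"setup": 20, "machining": 2_000, "inspection": 100},
    "B": {"setup": 80, "machining": 1_000, "inspection": 400},
}

def abc_cost(product):
    """ABC: charge each activity's cost in proportion to the product's
    consumption of that activity's cost driver."""
    return sum(cost / volume * usage[product][act]
               for act, (cost, volume) in activities.items())

def traditional_cost(product):
    """Traditional costing: allocate the whole overhead pool by a single
    volume base (machine hours here)."""
    total = sum(cost for cost, _ in activities.values())
    hours = sum(u["machining"] for u in usage.values())
    return total * usage[product]["machining"] / hours
```

Here product B consumes most of the setups and inspections, so the single-base allocation undercosts it (about 33,333 vs 52,000 under ABC) and overcosts A; this per-product distortion is exactly what an S-curve comparison makes visible.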
Abstract: Road authorities have confronted problems in
maintaining the serviceability of road infrastructure systems using
various traditional methods of contracting. As a solution to these
problems, many road authorities have started contracting out road
maintenance works to the private sector based on performance
measures. This contracting method is named Performance-Based
Maintenance Contracting (PBMC). It is considered more cost-effective
than other traditional methods of contracting, and it has a
substantial record of success in many developed and developing
countries over the last two decades. This paper discusses and
analyses the potential issues to be considered before the
introduction of PBMC in a country.
Abstract: In this paper, the authors examine, using a non-parametric analysis of variance on survey data from the Institute for Information and Communications Policy, whether Japanese Internet users' awareness of information security differs by individual attributes. Generally speaking, it is found that Japanese Internet users' awareness of information security does differ by individual attributes. In particular, the authors verify that users who have received information security education show higher recognition of countermeasures than other users, including self-educated users. It is suggested that information security education should be enhanced so that users may take appropriate information security countermeasures. In addition, information security policies such as carrying out the "e-net caravan" and information security seminars are effective in improving users' awareness of information security in Japan.
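A standard non-parametric analysis-of-variance statistic for this kind of group comparison is the Kruskal-Wallis H; the abstract does not name its exact test, so this is an assumed representative, with the tie correction omitted for brevity:

```python
import numpy as np

def kruskal_h(*groups):
    """Kruskal-Wallis H statistic: one-way ANOVA performed on ranks, so
    no normality assumption is needed (tie correction omitted)."""
    data = np.concatenate([np.asarray(g, dtype=float) for g in groups])
    ranks = data.argsort().argsort() + 1.0    # ranks 1..N (ties not averaged)
    N = len(data)
    h, start = 0.0, 0
    for g in groups:
        r = ranks[start:start + len(g)]       # this group's rank sum
        h += r.sum() ** 2 / len(g)
        start += len(g)
    return 12.0 / (N * (N + 1)) * h - 3.0 * (N + 1)
```

Large H means the groups (e.g. users with and without security education) occupy clearly different rank ranges; H is then compared against a chi-squared threshold for significance.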
Abstract: In large datasets, identifying exceptional or rare cases
with respect to a group of similar cases is considered a very
significant problem. The traditional problem (outlier mining) is to
find exceptional or rare cases in a dataset irrespective of the class
labels of these cases; they are considered rare events with respect
to the whole dataset. In this research, we pose the problem of Class
Outlier Mining and present a method to find such outliers. The
general definition of this problem is "given a set of observations
with class labels, find those that arouse suspicion, taking into
account the class labels". We introduce a novel definition of
outlier, the Class Outlier, and propose the Class Outlier Factor
(COF), which measures the degree to which a data object is a Class
Outlier. Our work includes a proposal of a new algorithm for mining
Class Outliers, experimental results on real-world datasets from
various domains, and a comparison study with related methods.
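The flavour of a class-outlier score can be conveyed by a simplified version: look at an instance's k nearest neighbours and ask how many disagree with its class label (this is a pedagogical stand-in, not the paper's exact COF definition):

```python
import numpy as np

def class_outlier_score(X, y, i, k=3):
    """Simplified class-outlier score for instance i: the fraction of its
    k nearest neighbours carrying a *different* class label, weighted by
    the mean neighbour distance. Illustrative only, not the paper's COF."""
    d = np.linalg.norm(X - X[i], axis=1)
    nn = np.argsort(d)[1:k + 1]            # k nearest, excluding i itself
    disagree = np.mean(y[nn] != y[i])      # label disagreement among them
    return disagree * (1 + d[nn].mean())
```

An instance embedded in a dense cluster of the *other* class gets a high score even though it is not rare with respect to the whole dataset, which is precisely what distinguishes class outliers from traditional outliers.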