Abstract: STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induce if-then rules from a decision table, which is considered a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments specifying rules in advance, and by comparison with conventional methods. However, scope for future development remains before STRIM can be applied to the analysis of real-world data sets. The first requirement is to determine the size of the dataset needed for inducing true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity for rule induction from datasets with attribute values contaminated by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with the rule length. The second problem is then examined in a simulation experiment, utilizing the critical dataset size derived from the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values, and hence is applicable to real-world data.
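STRIM's core step is testing whether a candidate rule is statistically significant rather than an artifact of sampling. The abstract does not reproduce the exact test statistic, so the sketch below uses a generic one-proportion z-test with hypothetical counts: it checks whether the hit rate of a rule's conclusion among the rows matching its condition part deviates significantly from the rate expected under independence.

```python
from math import sqrt

def rule_z_value(n_matched: int, n_hits: int, p0: float) -> float:
    """One-proportion z-statistic for a candidate if-then rule.

    n_matched: decision-table rows matching the rule's condition part
    n_hits:    of those, rows whose decision equals the rule's conclusion
    p0:        expected hit rate under independence
               (e.g. 1/|decision classes|)
    """
    p_hat = n_hits / n_matched
    return (p_hat - p0) / sqrt(p0 * (1 - p0) / n_matched)

# Hypothetical rule matched by 400 samples, 160 hitting one of 4 classes:
z = rule_z_value(400, 160, 0.25)
significant = z > 3.0  # reject independence at a strict one-sided level
```

Larger datasets shrink the denominator, which is why a critical dataset size exists below which even true rules cannot reach significance.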
Abstract: In this study, three robust prediction methods, namely the artificial neural network (ANN), adaptive neuro-fuzzy inference system (ANFIS) and support vector machine (SVM), were used for computing the resonant frequency of A-shaped compact microstrip antennas (ACMAs) operating in the UHF band. Firstly, the resonant frequencies of 144 ACMAs with various dimensions and electrical parameters were simulated with the help of IE3D™, which is based on the method of moments (MoM). The ANN, ANFIS and SVM models for computing the resonant frequency were then built from the simulation data. 124 simulated ACMAs were utilized for training and the remaining 20 ACMAs were used for testing the ANN, ANFIS and SVM models. The performance of the ANN, ANFIS and SVM models is compared over the training and test processes. The average percentage errors (APE) of the computed resonant frequencies in training of the ANN, ANFIS and SVM were obtained as 0.457%, 0.399% and 0.600%, respectively. The constructed models were then tested, and APE values of 0.601% for ANN, 0.744% for ANFIS and 0.623% for SVM were achieved. The results obtained here show that the ANN, ANFIS and SVM methods can be successfully applied to compute the resonant frequency of ACMAs, since they are useful and versatile methods that yield accurate results.
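The APE figure used to compare the three models can be computed directly from the simulated and model-predicted resonant frequencies. A minimal sketch with hypothetical frequency values (not the paper's data):

```python
def average_percentage_error(measured, predicted):
    """APE (%) between simulated and model-computed resonant frequencies."""
    assert len(measured) == len(predicted) and measured
    return 100.0 * sum(abs(m - p) / m
                       for m, p in zip(measured, predicted)) / len(measured)

# Hypothetical resonant frequencies (GHz) for three test ACMAs:
f_sim  = [0.50, 0.62, 0.75]     # IE3D/MoM simulation results
f_pred = [0.502, 0.617, 0.748]  # model predictions
ape = average_percentage_error(f_sim, f_pred)
```

The same metric applied to the 124 training and 20 test antennas yields the percentages reported above.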
Abstract: It is a well-established fact that terrorism is one of the foremost threats to present-day international security. The creation of tools or mechanisms for confronting it in an effective and efficient manner will only be possible by way of an objective assessment of the phenomenon. In order to achieve this, this paper has the following three main objectives: Firstly, setting out to find the reasons that have prevented the establishment of a universally accepted definition of terrorism, and consequently trying to outline the main features defining the face of the terrorist threat in order to discover the fundamental goals of what is now a serious blight on world society. Secondly, trying to explain the differences between a terrorist movement and a terrorist organisation, and the reasons for which a terrorist movement can be led to transform itself into an organisation. After analysing these motivations and the characteristics of a terrorist organisation, an example of the latter will be succinctly analysed to help the reader understand the ideas expressed. Lastly, discovering and exposing the factors that can lead to the appearance of terrorist tendencies, and discussing the most efficient and effective responses that can be given to this global security threat.
Abstract: Nowadays, cloud environments are becoming a necessity for companies; this technology gives the opportunity to access data anywhere and at any time. It also provides optimized and secured access to resources and gives more security to the data stored on the platform. However, some companies do not trust Cloud providers; they think that providers can access and modify confidential data such as bank accounts. Many works have been done in this context, concluding that encryption performed by providers ensures confidentiality; however, they overlook the fact that Cloud providers can decrypt the confidential resources. The best solution here is to apply some operations on the data before sending them to the Cloud provider, with the objective of making them unreadable. The principal idea is to allow the user to protect his data with his own methods. In this paper, we demonstrate our approach and prove that it is more efficient in terms of execution time than some existing methods. This work aims at enhancing the quality of service of providers and ensuring the trust of customers.
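The abstract does not specify the user-side transformation, so the following is only an illustrative sketch of the principle: the user makes the data unreadable with a key that never leaves his side, and the provider stores only ciphertext. The toy keystream here is built from SHA-256 for self-containedness; a real deployment should use a vetted cipher such as AES-GCM.

```python
import hashlib
from itertools import count

def keystream(key: bytes, length: int) -> bytes:
    """Toy keystream from SHA-256(key || counter) -- illustration only."""
    out = b""
    for i in count():
        if len(out) >= length:
            return out[:length]
        out += hashlib.sha256(key + i.to_bytes(8, "big")).digest()

def xor_transform(data: bytes, key: bytes) -> bytes:
    """Applying the same keystream twice restores the original data."""
    ks = keystream(key, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

secret = b"confidential bank account record"
key = b"user-held key, never sent to the provider"
ciphertext = xor_transform(secret, key)          # what the Cloud stores
restored = xor_transform(ciphertext, key)        # only the user can do this
```

Since the provider never sees `key`, it can neither read nor meaningfully modify the record, which is exactly the trust property the paper targets.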
Abstract: Many interventions for social skills acquisition aim to decrease the gap between an individual's social skills deficits and normative social skills; nevertheless, little is known about how typical social skills vary with students' age. In this study, we developed a new quintet of the Hokkaido Social Skills Inventory (HSSI) to identify age-appropriate social skills for school adaptation. First, we selected 13 categories of social skills for school adaptation from previous studies, and created questionnaire items through discussions with 25 teachers across all three school levels, from elementary to senior high school. Second, the factor structures of the five versions of the social skills scale were investigated for 2nd grade (n = 1,864), 4th grade (n = 1,936), 6th grade (n = 2,085), 7th grade (n = 2,007), and 10th grade (n = 912) students, respectively. The exploratory factor analysis showed that the number of constituent factors of social skills increased as school grade advanced. The results of the present study can be useful for characterizing age-appropriate social skills for school adaptation.
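The exploratory factor analysis behind these results can be sketched in miniature. The example below is not the HSSI data: it fabricates responses for 6 hypothetical items loading on two latent skills, then counts factors with the Kaiser criterion (eigenvalues of the correlation matrix greater than 1), one common way to decide how many factors a grade level exhibits.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical responses: 200 students x 6 items, where items 0-2 load on
# one latent skill and items 3-5 on another.
latent = rng.normal(size=(200, 2))
loadings = np.array([[1, 0], [1, 0], [1, 0],
                     [0, 1], [0, 1], [0, 1]], dtype=float)
items = latent @ loadings.T + 0.3 * rng.normal(size=(200, 6))

corr = np.corrcoef(items, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]      # sorted descending
n_factors = int((eigvals > 1.0).sum())        # Kaiser criterion
```

Run per grade level, an increase in `n_factors` with grade would mirror the paper's finding that social skills differentiate as students get older.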
Abstract: This paper reviews the model-based qualitative and
quantitative Operations Management research in the context of
Construction Supply Chain Management (CSCM). The construction
industry has traditionally been blamed for low productivity, cost and
time overruns, waste, high fragmentation and adversarial
relationships. It has also been slower than other industries to employ
the Supply Chain Management (SCM) concept and to develop models
that support decision-making and planning. Over the last decade,
however, there has been a distinct shift from a project-based to a
supply-based approach to construction management. CSCM has
emerged as a promising new management tool for construction
operations that improves the performance of construction projects in
terms of cost, time and quality. Modeling the Construction Supply
Chain (CSC) offers the means to reap the benefits of SCM, make
informed decisions and gain competitive advantage. Different
modeling approaches and methodologies have been applied in the
multi-disciplinary and heterogeneous research field of CSCM. The
literature review reveals that a considerable percentage of the CSC
modeling research accommodates conceptual or process models
which present general management frameworks and do not relate to
acknowledged soft Operations Research methods. We particularly
focus on the model-based quantitative research and categorize the
CSCM models depending on their scope, objectives, modeling
approach, solution methods and software used. Although over the last
few years there has clearly been an increase in research papers on
quantitative CSC models, we find that the relevant literature is
very fragmented with limited applications of simulation,
mathematical programming and simulation-based optimization. Most
applications are project-specific or study only parts of the supply
system. Thus, some complex interdependencies within construction
are neglected and the implementation of the integrated supply chain
management is hindered. We conclude this paper by giving future
research directions and emphasizing the need to develop optimization
models for integrated CSCM. We stress that CSC modeling needs a
multi-dimensional, system-wide and long-term perspective. Finally,
prior applications of SCM to other industries have to be taken into
account when modeling CSCs, but not without translating the
generic concepts to the context of the construction industry.
Abstract: Supply chains are the backbone of trade and
commerce. Their logistics use different transport corridors on a
regular basis for operational purposes. International supply chain
transport corridors include different infrastructure elements (e.g.
weighbridges, package-handling equipment, border clearance
authorities, and so on). This paper presents the use of multi-agent
systems (MAS) to model and simulate some aspects of transportation
corridors, and in particular the area of weighbridge resource
optimization for operational profit. An underlying multi-agent model
provides a means of modeling the relationships among stakeholders
in order to enable coordination in a transport corridor environment.
Simulations of the costs of container unloading, reloading, and
waiting time for queued trucks have been carried out using data
sets. Results of the simulation provide potential guidance for
making decisions about optimal service resource allocation in a trade
corridor.
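The weighbridge resource question the simulation addresses can be illustrated with a deliberately simplified, deterministic queue (the paper's multi-agent model coordinates stakeholders and is far richer; this sketch and its numbers are hypothetical): given truck arrival times and a fixed weighing time, how much total waiting does adding a second weighbridge remove?

```python
def total_wait(arrivals, service_time, n_bridges):
    """Total truck waiting time for n parallel weighbridges,
    first-come-first-served, deterministic service time."""
    free_at = [0.0] * n_bridges            # when each bridge becomes free
    wait = 0.0
    for t in sorted(arrivals):
        i = min(range(n_bridges), key=lambda j: free_at[j])
        start = max(t, free_at[i])         # wait if every bridge is busy
        wait += start - t
        free_at[i] = start + service_time
    return wait

arrivals = [0, 1, 2, 3, 4, 5, 6, 7]        # trucks arriving each minute
w1 = total_wait(arrivals, 3.0, 1)          # one weighbridge -> 56 minutes
w2 = total_wait(arrivals, 3.0, 2)          # two weighbridges -> 12 minutes
```

Comparing the waiting-cost reduction against the cost of an extra weighbridge is the kind of trade-off the agent-based simulation resolves at corridor scale.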
Abstract: The aim of this research study is to investigate and
establish the characteristics of brain dominance (BD) and multiple
intelligences (MI). The experiment was conducted on a sample of 552
undergraduate computer-engineering students. In addition, a
mathematical formulation has been established to exhibit the relation
between thinking and intelligence, and their correlation has been
analyzed. Correlation analysis has been statistically measured
using Pearson’s coefficient. Analysis of the results shows a strong
relationship between thinking and intelligence. This research was
carried out to improve didactic methods in engineering learning and
also to improve e-learning strategies.
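The Pearson coefficient used for the correlation analysis is straightforward to compute from paired scores. The values below are hypothetical stand-ins for a brain-dominance scale and an intelligence scale, not the study's data:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired scores for six students:
bd = [12, 15, 17, 20, 23, 25]          # brain-dominance measure
mi = [30, 34, 41, 45, 48, 52]          # multiple-intelligences measure
r = pearson_r(bd, mi)                  # close to +1: strong relationship
```

A value of r near +1, as here, is what the abstract means by "a strong relationship between thinking and intelligence".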
Abstract: The present study aimed to determine potential
agricultural lands (PALs) on Gokceada (Imroz) Island in Canakkale
province, Turkey. Seven-band Landsat 8 OLI images acquired on
July 12 and August 13, 2013, and their 14-band combination image
were used to identify current Land Use Land Cover (LULC) status.
Principal Component Analysis (PCA) was applied to three Landsat
datasets in order to reduce the correlation between the bands. A total
of six original and PCA images were classified using a supervised
classification method to obtain LULC maps including 6 main
classes (“Forest”, “Agriculture”, “Water Surface”, “Residential Area-
Bare Soil”, “Reforestation” and “Other”). Accuracy assessment was
performed by checking the accuracy of 120 randomized points for
each LULC map. The best overall accuracy and Kappa statistic
values (90.83% and 0.8791, respectively) were found for the PCA
image generated from the 14-band combined image, called 3-B/JA.
Digital Elevation Model (DEM) with 15 m spatial resolution
(ASTER) was used to consider topographical characteristics. Soil
properties were obtained by digitizing 1:25000 scaled soil maps of
Rural Services Directorate General. Potential Agricultural Lands
(PALs) were determined using Geographic information Systems
(GIS). The procedure was applied considering that the "Other" class
of the LULC map may be used for agricultural purposes in the future.
Overlay analysis was conducted using Slope (S), Land
Use Capability Class (LUCC), Other Soil Properties (OSP) and Land
Use Capability Sub-Class (SUBC) properties.
A total of 901.62 ha areas within “Other” class (15798.2 ha) of
LULC map were determined as PALs. These lands were ranked as
“Very Suitable”, “Suitable”, “Moderate Suitable” and “Low
Suitable”. It was determined that the 8.03 ha were classified as “Very
Suitable” while 18.59 ha as suitable and 11.44 ha as “Moderate
Suitable” for PALs. In addition, 756.56 ha were found to be “Low
Suitable”. The results obtained from this preliminary study can serve
as basis for further studies.
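The PCA step that decorrelates the Landsat bands before classification can be sketched compactly. The data here are synthetic stand-ins (correlated pseudo-bands, not Landsat 8 OLI pixels); the mechanics, eigendecomposition of the band covariance matrix, are the standard PCA used in the study:

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic stack of 7 highly correlated "bands", flattened to pixels x bands:
base = rng.normal(size=(1000, 1))
bands = base @ np.ones((1, 7)) + 0.2 * rng.normal(size=(1000, 7))

# PCA via eigendecomposition of the band covariance matrix
centered = bands - bands.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()   # variance fraction per component
components = centered @ eigvecs[:, order]    # decorrelated PCA "images"
```

With strongly correlated bands, the first few components capture nearly all the variance, which is why classifying PCA images can outperform classifying the raw band stack.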
Abstract: Intermittent behavior near the boundary of phase
synchronization in the presence of noise is studied. In a certain range
of the coupling parameter and noise intensity, the intermittency of
eyelet and ring intermittencies is shown to take place. The main
results are illustrated using the example of two unidirectionally
coupled Rössler systems. Similar behavior is shown to take place in
two hydrodynamical models of unidirectionally coupled Pierce diodes.
Abstract: In recent years, there has been a decline in physical
activity among adults. Motivation has been shown to be a crucial
factor in maintaining physical activity. The purpose of this study was
to examine whether physical activity (PA) motives measured by the
Physical Activity and Leisure Motivation Scale (PALMS) predicted
the actual amount of PA at a later time, in order to provide evidence
for the construct validity of the PALMS. A quantitative, cross-
sectional descriptive research design was employed. The
Demographic Form, PALMS, and International Physical Activity
Questionnaire Short form (IPAQ-S) were used to assess motives for
and amount of physical activity in
adults on two occasions. A sample of 489 male undergraduate
students aged 18 to 25 years (mean ±SD; 22.30±8.13 years) took part
in the study. Participants were divided into three types of activities,
namely exercise, racquet sports, and team sports; female participants
took part in only one type of activity, namely team sports. After 14
weeks, all 489 undergraduate students who had filled
in the initial questionnaire (Occasion 1) received the questionnaire
via email (Occasion 2). Of the 489 students, 378 males emailed back
the completed questionnaire. The results showed that not only were
pertinent sub-scales of PALMS positively related to amount of
physical activity, but separate regression analyses showed the
positive predictive effect of PALMS motives for amount of physical
activity for each type of physical activity among participants. This
study supported the construct validity of the PALMS by showing that
the motives measured by PALMS did predict the amount of PA. This
information can be used to match people with a specific sport or
activity, which in turn could potentially promote longer adherence to
that activity.
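The predictive step reported above amounts to regressing later PA amount on earlier motive scores. A minimal ordinary-least-squares sketch with hypothetical values (a PALMS sub-scale score against weekly MET-minutes from the IPAQ-S; not the study's data):

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for simple regression."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# Hypothetical data: motive score (Occasion 1) vs. MET-minutes (Occasion 2)
motive  = [10, 14, 18, 22, 26, 30]
met_min = [600, 820, 1100, 1290, 1480, 1730]
slope, intercept = fit_line(motive, met_min)
predicted = slope * 20 + intercept     # expected PA for a motive score of 20
```

A clearly positive slope for each activity type is what the separate regression analyses in the study demonstrate.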
Abstract: A toxicity study of copper (Cu), lead (Pb) and iron (Fe) on
Tilapia guineensis was carried out for 4 days with a view to
determining their effects on the liver and muscle tissues. Tilapia
guineensis samples of about 10–14 cm in length and 0.2–0.4 kg in
weight each were obtained from University of Calabar fish ponds and
acclimated for three (3) days before the experimental set-up.
Survivors after the 96-hr LC50 test period were selected from test
solutions of the heavy metals for the histopathological studies.
Histological preparations of liver and muscle tissues were randomly
examined for histopathological lesions. Results of the histological
examinations showed gross abnormalities in the liver tissues due to
pathological and degenerative changes compared to liver and muscle
tissues from control samples (tilapia fishes from aquaria without
heavy metals). Extensive hepatocyte necrosis with chronic
inflammatory changes was observed in the liver of fishes exposed to
Cu solution. Similar but less damaging effects were observed in the
liver of fishes exposed to Pb and Fe. The extent of lesion observed
was therefore heavy metal-related. However, no pathologic changes
occurred in the muscle tissues.
Abstract: The flow duration curve (FDC) is an informative
method that represents the flow regime’s properties for a river basin.
Therefore, the FDC is widely used for water resource projects such as
hydropower, water supply, irrigation and water quality management.
The primary purpose of this study is to obtain synthetic daily flow
duration curves for the Çoruh Basin, Turkey. To this end, we first
developed univariate auto-regressive moving average (ARMA)
models for the daily flows of 9 stations located in the Çoruh basin, and then
these models were used to generate 100 synthetic flow series, each
having the same size as the historical series. Secondly, flow duration
curves of each synthetic series were drawn, and the flow values
exceeded 10%, 50% and 95% of the time, together with the 95%
confidence limits of these flows, were calculated. As a result, the
flood, mean and low flow potential of the Çoruh basin is
comprehensively represented.
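The two-step procedure, generating synthetic series and then reading exceedance flows off their duration curves, can be sketched as below. The generator is a plain AR(1) stand-in for the paper's fitted ARMA models, and the parameter values are hypothetical:

```python
import random

random.seed(7)

def generate_ar1(n, phi, mu, sigma):
    """Synthetic AR(1) flow series: q_t = mu + phi*(q_{t-1} - mu) + e_t."""
    q = [mu]
    for _ in range(n - 1):
        q.append(mu + phi * (q[-1] - mu) + random.gauss(0, sigma))
    return q

def exceedance_flow(series, pct):
    """Flow equalled or exceeded pct% of the time (empirical FDC point)."""
    ranked = sorted(series, reverse=True)
    return ranked[int(pct / 100 * len(ranked))]

flows = generate_ar1(365, phi=0.8, mu=50.0, sigma=5.0)   # one synthetic year
q10, q50, q95 = (exceedance_flow(flows, p) for p in (10, 50, 95))
```

Repeating this over 100 synthetic series and collecting the spread of Q10, Q50 and Q95 gives the 95% confidence limits the study reports.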
Abstract: The main objective of this article is to examine the
impact of interest rates on investments in Poland in the context of
financial crisis. The paper also investigates the dependence of bank
loans to enterprises on interbank market rates. The article studies the
impact of interbank market rate on the level of investments in Poland.
Besides, this article focuses on the research of the correlation
between the level of corporate loans and the amount of investments
in Poland in order to determine the indirect impact of central bank
interest rates through the transmission mechanism of monetary policy
on the real economy. To achieve this objective we have used
econometric and statistical research methods, namely econometric
modeling and the Pearson correlation coefficient.
This analysis suggests that the central bank reference rate is
inversely related to the level of investment in Poland, and that this
dependence is moderate. This is also an important issue because it
relates to Poland's preparations for accession to the euro area. The
research is important from both theoretical and empirical points of
view. The formulated conclusions and recommendations determine
the practical significance of the paper which may be used in the
decision making process of monetary and economic authorities of the
country.
Abstract: In this paper, we present a model-based regression test
suite reduction approach that uses EFSM model dependence analysis
and a probability-driven greedy algorithm to reduce software regression
test suites. The approach automatically identifies the difference
between the original model and the modified model as a set of
elementary model modifications. The EFSM dependence analysis is
performed for each elementary modification to reduce the regression
test suite, and then the probability-driven greedy algorithm is adopted
to select the minimum set of test cases from the reduced regression test
suite that cover all interaction patterns. Our initial experience shows
that the approach may significantly reduce the size of regression test
suites.
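The selection step, picking a minimal set of test cases that still covers all interaction patterns, is an instance of greedy set cover. The sketch below shows the plain greedy core with hypothetical test cases and patterns; the paper's variant additionally weights the choice by selection probabilities, which is not reproduced here:

```python
def greedy_select(test_cases):
    """Greedy minimisation: repeatedly pick the test case covering the
    most still-uncovered interaction patterns."""
    uncovered = set().union(*test_cases.values())
    selected = []
    while uncovered:
        best = max(test_cases, key=lambda t: len(test_cases[t] & uncovered))
        if not test_cases[best] & uncovered:
            break                      # remaining patterns are uncoverable
        selected.append(best)
        uncovered -= test_cases[best]
    return selected

# Hypothetical reduced regression suite: test case -> patterns it covers
suite = {
    "t1": {"p1", "p2"},
    "t2": {"p2", "p3", "p4"},
    "t3": {"p1", "p5"},
    "t4": {"p4"},
}
reduced = greedy_select(suite)         # covers all of p1..p5 with 2 tests
```

Here two of the four test cases suffice, which illustrates how the approach "may significantly reduce the size of regression test suites".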
Abstract: To construct the lumped spring-mass model
considering the occupants for the offset frontal crash, the SISAME
software and the NHTSA test data were used. The data from the 56
kph, 40% offset frontal vehicle-to-deformable-barrier crash test of a
MY2007 Mazda 6 4-door sedan were obtained from the NHTSA test
database. The
overall behaviors of the B-pillar and engine in the simulation model
agreed very well with the test data. The trends of the accelerations at
the driver and passenger heads were similar, but there were big
differences in the peak values. These differences in peak values
caused large errors in the HIC36 and 3 ms chest g's. To predict the
behaviors of the dummies well, the spring-mass model for the offset
frontal crash needs to be improved.
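The idea of a lumped spring-mass crash model can be illustrated with its simplest possible instance, far simpler than the SISAME-extracted multi-mass model above, and with hypothetical mass and stiffness values: a single mass striking a rigid barrier through a linear crush spring, integrated to find the peak deceleration that drives injury measures such as HIC.

```python
from math import sqrt

def crash_pulse(m, k, v0, dt=1e-4):
    """Single-DOF lumped model: mass m (kg) hits a rigid barrier through a
    crush spring of stiffness k (N/m) at speed v0 (m/s).
    Returns peak deceleration (m/s^2)."""
    x, v, peak = 0.0, v0, 0.0
    while v > 0 or x > 0:              # until the spring fully unloads
        a = -k / m * x                 # spring force only while compressed
        v += a * dt                    # semi-implicit (symplectic) Euler
        x += v * dt
        peak = max(peak, -a)
        if x <= 0 and v <= 0:
            break
    return peak

m, k, v0 = 1500.0, 700e3, 15.56        # ~56 km/h impact, assumed values
peak = crash_pulse(m, k, v0)
analytic = v0 * sqrt(k / m)            # closed-form peak for an ideal spring
```

Matching such a simulated pulse against measured B-pillar accelerations is how the spring and mass parameters of the full offset-crash model are identified.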
Abstract: Model transformation, as a pivotal aspect of Model-driven
Engineering, attracts more and more attention from both researchers
and practitioners. Many domains (enterprise engineering, software
engineering, knowledge engineering, etc.) use model transformation
principles and practices to address their domain-specific problems;
furthermore, model transformation can also be used to bridge the gap
between different domains by sharing and exchanging knowledge.
Since model transformation has become widely used, a new
requirement has emerged: to define the transformation process
effectively and efficiently and to reduce the manual effort involved.
This paper presents an automatic model transformation methodology
based on semantic and syntactic comparisons, and focuses
particularly on the granularity issue that exists in the transformation
process. Compared to traditional model transformation
methodologies, this methodology serves a general purpose: it is a
cross-domain methodology. Semantic and syntactic checking
measurements are combined into a refined transformation process,
which solves the granularity issue. Moreover, the semantic and
syntactic comparisons are supported by a software tool, which
replaces the manual effort.
Abstract: Ontology validation is an important part of web
applications’ development, where knowledge integration and
ontological reasoning play a fundamental role. It aims to ensure the
consistency and correctness of ontological knowledge and to
guarantee that ontological reasoning is carried out in a meaningful
way. Existing approaches to ontology validation address more or less
specific validation issues, but the overall process of validating web
ontologies has not been formally established yet. As the size and the
number of web ontologies continue to grow, more web applications’
developers will rely on the existing repository of ontologies rather
than develop ontologies from scratch. If an application utilizes
multiple independently created ontologies, their consistency must be
validated and, if necessary, adjusted to ensure proper interoperability
between them. This paper presents a validation technique intended to
test the consistency of independent ontologies utilized by a common
application.
Abstract: This paper presents an approach for the classification of
unstructured format descriptions for the identification of file formats.
The main contribution of this work is the employment of data mining
techniques to support file format selection using just the unstructured
text description that comprises the most important format features for
a particular organisation. Subsequently, the file format identification
method employs a file format classifier and associated configurations
to support digital preservation experts with an estimation of the
required file format. Our goal is to make use of a format specification
knowledge base aggregated from different Web sources in order to
select a file format for a particular institution. Using the naive Bayes
method, the decision support system recommends a file format to an
expert for his institution. The proposed methods facilitate the
selection of file formats and improve the quality of the digital
preservation process. The
presented approach is meant to facilitate decision making for the
preservation of digital content in libraries and archives using domain
expert knowledge and specifications of file formats. To facilitate
decision-making, the aggregated information about the file formats is
presented as a file format vocabulary that comprises the most
common terms characteristic of all the researched formats. The goal is to
suggest a particular file format based on this vocabulary for analysis
by an expert. The sample file format calculation and the calculation
results including probabilities are presented in the evaluation section.
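The naive Bayes recommendation step can be sketched end to end on toy data. The two example formats, their description tokens and the query are hypothetical, not the paper's knowledge base; the mechanics, multinomial naive Bayes over vocabulary terms with Laplace smoothing, match the method named above:

```python
from collections import Counter
from math import log

def train_nb(docs):
    """docs: {format_label: list of tokenised description texts}."""
    priors, likelihoods, vocab = {}, {}, set()
    total = sum(len(texts) for texts in docs.values())
    for label, texts in docs.items():
        priors[label] = log(len(texts) / total)
        counts = Counter(w for t in texts for w in t)
        vocab |= set(counts)
        likelihoods[label] = counts
    return priors, likelihoods, vocab

def classify(tokens, priors, likelihoods, vocab):
    """Multinomial naive Bayes with Laplace smoothing."""
    best, best_score = None, float("-inf")
    for label in priors:
        n = sum(likelihoods[label].values())
        score = priors[label] + sum(
            log((likelihoods[label][w] + 1) / (n + len(vocab)))
            for w in tokens)
        if score > best_score:
            best, best_score = label, score
    return best

# Toy format vocabulary aggregated from description texts:
docs = {
    "PDF":  [["portable", "document", "page", "vector"],
             ["document", "print"]],
    "TIFF": [["raster", "image", "lossless"],
             ["image", "scan", "archive"]],
}
model = train_nb(docs)
label = classify(["archival", "image", "raster"], *model)
```

In the real system the posterior probabilities themselves, not just the winning label, are shown to the expert so the recommendation can be reviewed rather than blindly accepted.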
Abstract: Experts, enterprises and operators expect that bandwidth
demand will increase to rates of 100 to 1,000 Mbps within a few
years. Therefore the most important question is which technology
will satisfy future consumer broadband demands. Currently the
consensus is that fiber technology has the best technical
characteristics to achieve such high bandwidth rates. However, fiber
technology is so far very cost-intensive and resource-consuming. To
avoid these investments, operators are concentrating on upgrading
the existing copper and hybrid fiber-coax infrastructures. This work
presents a comparison of the copper and fiber technologies, including
an overview of the current German broadband market. Both
technologies are reviewed in terms of demand, willingness to pay and
economic efficiency in connection with their technical characteristics.