Abstract: This paper proposes a modeling methodology for the
development of data analysis solutions. The author introduces the
approach to address data warehousing issues at the enterprise level.
The methodology covers the requirements elicitation and analysis
stage as well as the initial design of the data warehouse. The paper
reviews an extended business process model that satisfies the needs
of data warehouse development. The author considers the use of
business process models necessary, as they reflect both enterprise
information systems and business functions, both of which are
important for data analysis. The described approach divides
development into three steps with different levels of model
elaboration, making it possible to gather requirements and present
them to business users in an accessible manner.
Abstract: The objective of this study was to improve our
understanding of vulnerability and environmental change: its causes,
its intensity, its distribution, and the human-environment effect on
the ecosystem in the Apodi Valley Region. This paper identifies,
assesses, and classifies vulnerability and environmental change in
the Apodi valley region using a combined approach of landscape
pattern and ecosystem sensitivity. Models were developed using the
following five thematic layers: geology, geomorphology, soil,
vegetation, and land use/cover, by means of a Geographical
Information System (GIS) based on hydro-geophysical parameters.
In spite of the data problems and shortcomings, using ESRI's ArcGIS
9.3 program to classify, weight, and combine 15 separate land cover
classes into a single vulnerability indicator provides a reliable
measure of differences (6 classes) among regions and communities
that are exposed to similar ranges of hazards. Indeed, the ongoing
and active development of vulnerability concepts and methods has
already produced some tools to help overcome common issues, such as
acting in a context of high uncertainty, taking into account the
dynamics and spatial scale of a social-ecological system, or
gathering viewpoints from different sciences to combine human- and
impact-based approaches. Based on this assessment, this paper
proposes concrete perspectives and possibilities to benefit from
existing commonalities in the construction and application of
assessment tools.
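The classify, weight, and combine step described above can be sketched
as a simple weighted overlay. The layer names, weights, score ranges,
and equal-width class breaks below are illustrative assumptions, not
values from the study.

```python
# Hypothetical sketch of a GIS-style weighted overlay: each thematic
# layer assigns a score per cell; scores are weighted, summed, and
# binned into a fixed number of vulnerability classes.

def overlay_vulnerability(cell_scores, weights, n_classes=6,
                          smin=1.0, smax=5.0):
    """Combine per-layer scores (smin..smax) into one class (1..n_classes)."""
    total_w = sum(weights.values())
    score = sum(weights[layer] * cell_scores[layer]
                for layer in weights) / total_w
    # Bin the weighted score into equal-width classes.
    width = (smax - smin) / n_classes
    cls = int((score - smin) / width) + 1
    return min(max(cls, 1), n_classes)

weights = {"geology": 0.2, "geomorphology": 0.2, "soil": 0.2,
           "vegetation": 0.2, "land_use": 0.2}  # equal weights assumed
cell = {"geology": 4, "geomorphology": 5, "soil": 3,
        "vegetation": 4, "land_use": 5}
print(overlay_vulnerability(cell, weights))
```

In a real GIS workflow the same arithmetic is applied cell by cell
across raster layers; the scalar version above only shows the
weighting and classification logic.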
Abstract: In this paper, a benchmarking framework is presented
for the performance assessment of irrigation systems. First, data
envelopment analysis (DEA) is applied to measure the technical
efficiency of irrigation systems. This method, based on linear
programming, aims to determine a consistent efficiency ranking of
irrigation systems in which known inputs, such as water volume
supplied and total irrigated area, and a given output corresponding
to the total value of irrigation production are taken into account
simultaneously. Second, in order to examine irrigation efficiency in
more detail, a cross-system comparison is elaborated using a set of
performance indicators selected by IWMI. The above methodologies
were applied to the Thessaloniki plain in Northern Greece, and the
results of the application are presented and discussed. The
conjunctive use of DEA and performance indicators appears to be a
very useful tool for efficiency assessment and identification of
best practices in irrigation systems management.
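The DEA step can be illustrated with a minimal input-oriented CCR
model solved by linear programming. The two-unit dataset below is a
toy example with two inputs (water supplied, irrigated area) and one
output (production value), not the Thessaloniki data.

```python
# Minimal sketch of input-oriented CCR DEA: for each unit, solve a
# linear program minimising the radial input contraction factor theta,
# subject to a convex-cone envelope of the observed units.
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """X: (n, m) inputs, Y: (n, s) outputs. Returns efficiency per unit."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # Variables: [theta, lambda_1..lambda_n]; minimise theta.
        c = np.r_[1.0, np.zeros(n)]
        # Inputs:  sum_j lambda_j x_ij - theta * x_io <= 0
        A_in = np.c_[-X[o].reshape(m, 1), X.T]
        # Outputs: -sum_j lambda_j y_rj <= -y_ro
        A_out = np.c_[np.zeros((s, 1)), -Y.T]
        res = linprog(c, A_ub=np.r_[A_in, A_out],
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(0, None)] * (n + 1))
        scores.append(res.x[0])
    return scores

X = np.array([[2.0, 2.0], [4.0, 4.0]])
Y = np.array([[4.0], [4.0]])
print(dea_ccr_input(X, Y))  # unit 0 is efficient, unit 1 is dominated
```

Unit 1 uses twice the inputs of unit 0 for the same output, so its
radial efficiency is 0.5, which is the kind of ranking the abstract
describes.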
Abstract: Specification-based testing enables us to detect errors
in the implementation of functions defined in given specifications.
Its effectiveness in achieving high path coverage and its efficiency
in generating test cases are always major concerns of testers. The
automatic test case generation approach based on formal
specifications proposed by Liu and Nakajima is aimed at ensuring high
effectiveness and efficiency, but this approach has not been
empirically assessed. In this paper, we present an experiment for
assessing Liu's testing approach. The result indicates that this
testing approach may not be effective in some circumstances. We
discuss the result, analyse the specific causes of the
ineffectiveness, and describe some suggestions for improvement.
Abstract: Several models of vulnerability assessment have been proposed. The selection of one of these models depends on the objectives of the study. The classical methodologies for seismic vulnerability analysis, as a part of seismic risk analysis, have been formulated with statistical criteria based on rapid observation. The information relating to the buildings' performance is statistically elaborated. In this paper, we use the European Macroseismic Scale EMS-98 to define the relationship between damage and macroseismic intensity in order to assess seismic vulnerability. Applying the method to the Algiers area, the first step is to identify building typologies and to assign vulnerability classes. In the second step, damage is investigated according to EMS-98.
Abstract: The development of motor car safety devices has reduced
fatality rates in car accidents. Yet despite this increase in car
safety, neck injuries resulting from rear impact collisions,
particularly at low speed, remain a primary concern. In this study,
FEA (Finite Element Analysis) of a seat was performed to evaluate
neck injuries in rear impact, and the FEA result was verified by
comparison with actual test results. The dummy used in the FE model
and the actual test is the BioRID II, which is regarded as suitable
for rear impact collision analysis. A threshold for the BioRID II
neck injury indicators was also proposed to upgrade seat performance
in order to reduce whiplash injury. To optimize the seat for a
low-speed rear impact collision, a multi-objective optimization
method using DOE (Design of Experiments) results was proposed.
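One common way to use DOE results in multi-objective optimization is
to retain the non-dominated (Pareto-optimal) designs. This is a
hedged illustration only: the objective names and values below are
hypothetical and this is not the authors' exact procedure.

```python
# Sketch: filter DOE runs down to the Pareto front when both
# objectives (e.g. two neck-injury indicators) are to be minimised.

def pareto_front(runs):
    """runs: list of (design_id, obj1, obj2); both objectives minimised."""
    front = []
    for rid, f1, f2 in runs:
        dominated = any(g1 <= f1 and g2 <= f2 and (g1 < f1 or g2 < f2)
                        for _, g1, g2 in runs)
        if not dominated:
            front.append(rid)
    return front

# Hypothetical DOE results: (seat design, injury indicator 1, indicator 2).
doe_results = [("A", 15.0, 200.0), ("B", 12.0, 240.0),
               ("C", 18.0, 260.0), ("D", 11.0, 210.0)]
print(pareto_front(doe_results))
```

Designs B and C are each dominated by another run on both objectives,
so only the trade-off designs survive the filter.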
Abstract: Project-based pedagogy has proven to be an active
learning method, used to develop learners' skills and knowledge.
The use of technology in the learning world has filled several gaps
in the implementation of teaching methods and the online evaluation
of learners. However, the project methodology presents challenges in
the online assessment of learners.
Indeed, interoperability between e-learning platforms (LMS) is
one of the major challenges of project-based learning assessment.
First, we review the characteristics of online assessment in the
context of project-based teaching and address the constraints
encountered during the peer evaluation process.
Our approach is to propose a meta-model that describes a language
dedicated to the design of peer assessment scenarios in
project-based learning. We then illustrate our proposal by an
instantiation of the meta-model through a business process in a
scenario of collaborative online assessment.
Abstract: Impaired finger function and difficulty with ambulation
occur when a person suffers damage to the spinal cord. Cervical
spondylotic myelopathy, one such myelopathy, arises not only from
external factors but also from increasing age. In addition,
diagnosis is difficult, since cervical spondylotic myelopathy is
evaluated from a doctor's neurological findings and imaging results.
As a quantitative method for measuring the degree of disability, the
hand-operated triangle step test (TST for short) has been
formulated. In this research, a fully automatic triangle step
counter apparatus is designed and developed to measure the degree of
disability accurately according to the principle of the TST. The
step counter apparatus, shaped as a low triangular prism, displays
the number of steps on each corner. Furthermore, the apparatus has
two modes of operation: one for measuring the degree of disability
and the other for rehabilitation exercise. To establish its
usefulness, clinical trials should be conducted in the near future.
Abstract: Certifications such as the Passive House Standard aim to reduce the final space heating energy demand of residential buildings. Space conditioning, notably heating, is responsible for nearly 70% of final residential energy consumption in Europe. There is therefore significant scope for the reduction of energy consumption through improvements to the energy efficiency of residential buildings. However, these certifications totally overlook the energy embodied in the building materials used to achieve this greater operational energy efficiency. The large amount of insulation and the triple-glazed high efficiency windows require a significant amount of energy to manufacture. While some previous studies have assessed the life cycle energy demand of passive houses, including their embodied energy, these rely on incomplete assessment techniques which greatly underestimate embodied energy and can lead to misleading conclusions. This paper analyses the embodied and operational energy demands of a case study passive house using a comprehensive hybrid analysis technique to quantify embodied energy. Results show that the embodied energy is much more significant than previously thought. Also, compared to a standard house with the same geometry, structure, finishes and number of people, a passive house can use more energy over 80 years, mainly due to the additional materials required. Current building energy efficiency certifications should widen their system boundaries to include embodied energy in order to reduce the life cycle energy demand of residential buildings.
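The life cycle comparison above can be illustrated with simple
arithmetic: a passive house trades lower operational energy for
higher embodied energy, so the total over a long period can come out
higher. All energy figures below are purely hypothetical, chosen only
to show the mechanism, and are not the case study's results.

```python
# Hypothetical life cycle energy comparison over an 80-year period.
# Figures in GJ are illustrative assumptions, not measured data.

YEARS = 80

def life_cycle_energy(embodied_gj, annual_operational_gj, years=YEARS):
    """Total life cycle energy = embodied + operational over the period."""
    return embodied_gj + annual_operational_gj * years

standard = life_cycle_energy(embodied_gj=3000, annual_operational_gj=40)
passive = life_cycle_energy(embodied_gj=6500, annual_operational_gj=25)
print(standard, passive)  # the passive house can come out higher
```

With these illustrative numbers the operational saving (15 GJ/year
over 80 years, i.e. 1200 GJ) is smaller than the extra embodied
energy (3500 GJ), which is the pattern the abstract reports.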
Abstract: In this paper, we propose novel algorithmic models
based on information fusion and feature transformation in a
cross-modal subspace for different types of residue features
extracted from several intra-frame and inter-frame pixel sub-blocks
in video sequences, for detecting digital video tampering or
forgery. An evaluation of the proposed residue features (the noise
residue features and the quantization features), their
transformation in the cross-modal subspace, and their multimodal
fusion for an emulated copy-move tampering scenario shows a
significant improvement in tamper detection accuracy compared to
single-mode features without transformation in the cross-modal
subspace.
Abstract: Long-term rainfall analysis and prediction is a
challenging task, especially in the modern world where the impact of
global warming is creating complications in environmental issues.
These data-intensive factors require high-performance computational
modeling for accurate prediction. This research paper describes a
prototype designed and developed in a grid environment using a
number of coupled software infrastructural building blocks. This
grid-enabled system provides the required computational power,
efficiency, resources, a user-friendly interface, secure job
submission, and high throughput. The results obtained using
sequential execution and grid-enabled execution show that
computational performance improved by 36% to 75% for a decade of
climate parameters. The large variation in performance can be
attributed to the varying degree of computational resources
available for job execution.
Grid computing enables the dynamic runtime selection, sharing, and
aggregation of distributed and autonomous resources, which plays an
important role not only in business but also in scientific and
social applications. This research paper attempts to explore
grid-enabled computing capabilities for weather indices from HOAPS
data for climate impact modeling and change detection.
Abstract: This study reports the implementation of Good
Manufacturing Practice (GMP) in a polycarbonate film processing
plant. The implementation of GMP took place with the creation of a
multidisciplinary team and was carried out in four steps: conduct a
gap assessment, create a gap closure plan, close the gaps, and
follow up the GMP implementation. The basis for the gap assessment
is the guideline on GMP for plastic materials and articles intended
for Food Contact Materials (FCM), published by PlasticsEurope. The
results of the GMP implementation in this study showed 100%
completion of the gap assessment. The key success factors for
implementing GMP in the production process are the commitment,
intention, and support of top management.
Abstract: Concrete strength evaluated from compression tests
on cores is affected by several factors causing differences from the
in-situ strength at the location from which the core specimen was
extracted. Among these factors is the damage possibly occurring
during the drilling phase, which generally leads to underestimation
of the actual in-situ strength. In order to quantify this effect,
two wide datasets have been examined in this study, including: (i)
about 500 core specimens extracted from existing reinforced concrete
structures, and (ii) about 600 cube specimens taken during the
construction of new structures in the framework of routine
acceptance control. The two experimental datasets have been compared
in terms of compression strength and specific weight values,
accounting for the main factors affecting concrete properties, that
is, type and amount of cement, aggregates' grading, type and maximum
size of aggregates, water/cement ratio, placing and curing modality,
and concrete age. The results show that the magnitude of the
strength reduction due to drilling damage is strongly affected by
the actual properties of the concrete, being inversely proportional
to its strength. Therefore, the application of a single value of the
correction coefficient, as generally suggested in the technical
literature and in structural codes, appears inappropriate. A set of
values of the drilling damage coefficient is suggested as a function
of the strength obtained from compression tests on cores.
Abstract: One major issue that is regularly cited as a barrier to
the widespread use of online assessments in eLearning is that of the
authentication of the student and the level of confidence that an
assessor can have that the assessment was actually completed by that
student. Currently, this issue is either ignored, in which case
confidence in the assessment and any ensuing qualification is
damaged, or else assessments are conducted at central, controlled
locations at specified times, losing the benefits of the distributed
nature of the learning programme. Particularly as we move towards
constructivist models of learning, with intentions towards achieving
heutagogic learning environments, the benefits of a properly
managed online assessment system are clear. Here we discuss some
of the approaches that could be adopted to address these issues,
looking at the use of existing security and biometric techniques,
combined with some novel behavioural elements. These approaches
offer the opportunity to validate the student on accessing an
assessment, on submission, and also during the actual production of
the assessment. These techniques are currently under development in
the DECADE project, and future work will evaluate and report on
their use.
Abstract: Sickness absence represents a major economic and
social issue. Analysis of sick leave data is a recurrent challenge
for analysts because of the complexity of the data structure, which
is often time dependent, highly skewed, and clumped at zero.
Ignoring these features when making statistical inference is likely
to be inefficient and misguided. Traditional approaches do not
address these problems. In this study, we discuss modelling
methodologies in terms of statistical techniques for addressing the
difficulties of sick leave data. We also introduce and demonstrate a
new method by performing a longitudinal assessment of long-term
absenteeism, using as a working example a large registration dataset
from the Helsinki Health Study of municipal employees in Finland
during the period 1990-1999. We present a comparative study on model
selection and a critical analysis of the temporal trends,
occurrence, and degree of long-term sickness absences among
municipal employees. The strengths of this working example include
the large sample size and long follow-up period, providing strong
evidence in support of the new model. Our main goal is to propose a
way to select an appropriate model, to introduce a new methodology
for analysing sickness absence data, and to demonstrate the model's
applicability to complicated longitudinal data.
Abstract: An advanced Monte Carlo simulation method, called Subset Simulation (SS), for the time-dependent reliability prediction of underground pipelines is presented in this paper. SS provides better resolution for low failure probability levels by efficiently investigating the rare failure events that are commonly encountered in pipeline engineering applications. In the SS method, random samples leading to progressive failure are generated efficiently and used for computing probabilistic performance through statistical variables. SS gains its efficiency by expressing a small failure probability as a product of a sequence of intermediate events with larger conditional probabilities. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of the SS application in pipe reliability assessment. It is hoped that this development work can promote the use of SS tools for uncertainty propagation in the decision-making process of underground pipeline network reliability prediction.
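A minimal sketch of the SS idea, assuming a one-dimensional standard
normal demonstrator rather than the paper's pipeline model: the
rare-event probability is built up as a product of conditional
probabilities p0 estimated at intermediate thresholds.

```python
# Subset Simulation sketch for P[g(x) > b_fail] with x standard normal.
# Each level keeps the top p0 fraction of samples, then regrows the
# population with short Metropolis chains conditioned on the new
# threshold. Illustrative demonstrator, not the paper's pipeline model.
import math
import random

def subset_simulation(g, b_fail, n=1000, p0=0.1, seed=1):
    rng = random.Random(seed)
    samples = [rng.gauss(0.0, 1.0) for _ in range(n)]
    prob = 1.0
    for _ in range(20):                       # cap on the number of levels
        vals = sorted((g(x) for x in samples), reverse=True)
        n_seed = int(round(len(samples) * p0))
        b = vals[n_seed - 1]                  # intermediate p0-quantile
        if b >= b_fail:                       # final level reached
            n_fail = sum(1 for x in samples if g(x) > b_fail)
            return prob * n_fail / len(samples)
        prob *= p0
        seeds = [x for x in samples if g(x) >= b]
        # Conditional sampling: short Metropolis chains from the seeds,
        # targeting the standard normal restricted to {g >= b}.
        samples = []
        for x in seeds:
            for _ in range(int(round(1 / p0))):
                cand = x + rng.uniform(-1.0, 1.0)
                accept = rng.random() < math.exp((x * x - cand * cand) / 2)
                if accept and g(cand) >= b:
                    x = cand
                samples.append(x)
    return prob

# P[X > 3] for a standard normal is about 1.35e-3.
est = subset_simulation(lambda x: x, 3.0)
print(est)
```

Crude Monte Carlo with the same 1000 samples would see at most a
handful of failures; SS reaches the same probability level with every
sample doing useful work, which is the efficiency argument above.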
Abstract: Simplified coupled engine block-crankshaft models
based on beam theory provide an efficient substitute for engine
simulation in the design process. These models require an accurate
definition of the main bearing stiffness. In this paper, an
investigation of this stiffness is presented. The clearance effect
is studied using a smooth bearing model; it is significant for low
shaft displacements. The hydrodynamic assessment model shows that
the oil film has no stiffness for low loads and is infinitely rigid
for high loads. The deformation stiffness is determined using a
suitable finite element model based on real CAD geometry. As a
result, a main bearing behaviour law is proposed. This behaviour law
takes into account the clearance, the hydrodynamic support, and the
deformation stiffness, and properly ensures the transition from the
low-rigidity configuration to the high-rigidity configuration.
Abstract: The recent drive for the use of performance-based methodologies in the design and assessment of structures in seismic areas has significantly increased the demand for reliable nonlinear inelastic static pushover analysis tools. As a result, adaptive pushover methods have been developed during the last decade; unlike their conventional pushover counterparts, they can account for the effect that higher modes of vibration and progressive stiffness degradation may have on the distribution of seismic storey forces. Even in advanced pushover methods, little attention has been paid to unsymmetric structures. This study evaluates the seismic demands for three-dimensional unsymmetric-plan buildings determined by the Displacement-based Adaptive Pushover (DAP) analysis introduced by Antoniou and Pinho [2004]. The capability of the DAP procedure to capture the torsional effects due to the irregularities of the structures is investigated by comparing its estimates to the exact results obtained from Incremental Dynamic Analysis (IDA). The capability of the procedure to predict the seismic behaviour of the structure is also discussed.
Abstract: The seismic vulnerability of an urban area is of great
concern to local authorities, especially those facing earthquakes.
It is therefore important to have an efficient tool to assess the
vulnerability of existing buildings. The use of the VIP
(Vulnerability Index Program) and GIS (Geographic Information
System) allows us to identify the most vulnerable districts of an
urban area.
The vulnerability index method is used to assess the vulnerability
of the town centre of Blida (Algeria), a historical town that has
grown enormously during the last few decades. In this method, three
levels of vulnerability are defined. GIS has been used to build a
database in order to perform different thematic analyses. These
analyses show the seismic vulnerability of Blida.
Abstract: This study aims to assess the vulnerability and risk of
the coastal areas of Taijiang to abnormal oceanographic phenomena.
In addition, this study investigates and collects data regarding
disaster losses, land utilization, and other social, economic, and
environmental issues in these coastal areas to construct a coastal
vulnerability and risk map based on the obtained climate-change risk
assessment results. Considering the indexes of the three coastal
vulnerability dimensions, namely man-made facilities, environmental
geography, and social economy, this study adopted the equal
weighting process and the Analytic Hierarchy Process to analyze the
vulnerability of these coastal areas to disasters caused by climatic
change. Among the areas with high coastal vulnerability to climatic
change, three towns had the highest coastal vulnerability and four
had the highest relative vulnerability. Areas with lower disaster
risks were found to become increasingly vulnerable to disasters
caused by climatic change as time progresses.
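The Analytic Hierarchy Process weighting step used above can be
sketched with the standard geometric-mean method for deriving
priority weights from a pairwise comparison matrix. The comparison
values below are hypothetical judgements, not the study's.

```python
# AHP sketch: derive criterion weights from a pairwise comparison
# matrix using the geometric mean of each row, then normalise.
import math

def ahp_weights(matrix):
    """matrix[i][j] = relative importance of criterion i over j."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Pairwise comparisons for the three dimensions: man-made facilities,
# environmental geography, social economy (hypothetical judgements).
pairwise = [[1.0, 2.0, 3.0],
            [1/2, 1.0, 2.0],
            [1/3, 1/2, 1.0]]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])
```

The equal weighting process mentioned in the abstract is the special
case where every criterion receives weight 1/n.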