Abstract: This paper presents an application of a “Systematic
Soft Domain Driven Design Framework” as a soft systems approach
to domain-driven design of information systems development. The
framework uses SSM as a guiding methodology within which we have
embedded a sequence of design tasks based on the UML leading to
the implementation of a software system using the Naked Objects
framework. This framework has been used in action research
projects that have involved the investigation and modelling of
business processes using object-oriented domain models and the
implementation of software systems based on those domain models.
Within this framework, Soft Systems Methodology (SSM) is used as
a guiding methodology to explore the problem situation and to
develop the domain model using UML for the given business
domain. The framework was proposed and evaluated in our previous
works, and a real case study, an “Information Retrieval System for
academic research”, is used in this paper to show further practice and
evaluation of the framework in a different business domain. We argue
that there are advantages in combining and using techniques from
different methodologies in this way for business domain modelling.
The framework is overviewed and justified as a multimethodology
using Mingers' multimethodology ideas.
Abstract: This paper reviews the model-based qualitative and
quantitative Operations Management research in the context of
Construction Supply Chain Management (CSCM). The construction
industry has traditionally been blamed for low productivity, cost and
time overruns, waste, high fragmentation and adversarial
relationships. The construction industry has been slower than other
industries to employ the Supply Chain Management (SCM) concept
and develop models that support the decision-making and planning.
Over the last decade, however, there has been a distinct shift from a
project-based to a supply-based approach to construction management.
CSCM has emerged as a promising new management tool for construction
operations that improves the performance of construction projects in
terms of cost, time and quality. Modeling the Construction Supply
Chain (CSC) offers the means to reap the benefits of SCM, make
informed decisions and gain competitive advantage. Different
modeling approaches and methodologies have been applied in the
multi-disciplinary and heterogeneous research field of CSCM. The
literature review reveals that a considerable percentage of the CSC
modeling research accommodates conceptual or process models
which present general management frameworks and do not relate to
acknowledged soft Operations Research methods. We particularly
focus on the model-based quantitative research and categorize the
CSCM models depending on their scope, objectives, modeling
approach, solution methods and software used. Although over the last
few years there has clearly been an increase in research papers on
quantitative CSC models, we identify that the relevant literature is
very fragmented with limited applications of simulation,
mathematical programming and simulation-based optimization. Most
applications are project-specific or study only parts of the supply
system. Thus, some complex interdependencies within construction
are neglected and the implementation of the integrated supply chain
management is hindered. We conclude this paper by giving future
research directions and emphasizing the need to develop optimization
models for integrated CSCM. We stress that CSC modeling needs a
multi-dimensional, system-wide and long-term perspective. Finally,
prior applications of SCM to other industries have to be taken into
account in order to model CSCs, but not without translating the
generic concepts to the context of the construction industry.
Abstract: Model transformation, as a pivotal aspect of Model-driven
engineering, is attracting increasing attention from both
researchers and practitioners. Many domains (enterprise engineering,
software engineering, knowledge engineering, etc.) use model
transformation principles and practices to address their domain-specific
problems; furthermore, model transformation can also be used to bridge
the gap between different domains by sharing and exchanging
knowledge. As model transformation has become widely used, new
requirements have emerged: to define the transformation process
effectively and efficiently and to reduce the manual effort involved.
This paper presents an automatic model transformation
methodology based on semantic and syntactic comparisons, and
focuses particularly on the granularity issue that arises in the
transformation process. Compared with traditional model transformation
methodologies, this methodology serves a general, cross-domain
purpose. Semantic and syntactic checking measurements are combined
into a refined transformation process, which solves the granularity
issue. Moreover, the semantic and syntactic comparisons are supported
by a software tool, thereby replacing manual effort.
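The combined semantic–syntactic check can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's tool: the synonym table stands in for a real semantic resource (e.g. a WordNet-style thesaurus), the weights and threshold are invented, and element matching is reduced to name comparison.

```python
from difflib import SequenceMatcher

# Hypothetical synonym table standing in for a real semantic resource.
SYNONYMS = {
    "client": {"customer", "buyer"},
    "order": {"purchase"},
}

def syntactic_score(a: str, b: str) -> float:
    """String similarity of two element names (0..1)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def semantic_score(a: str, b: str) -> float:
    """1.0 if the names are synonyms in the table, else 0.0."""
    a, b = a.lower(), b.lower()
    if a == b or b in SYNONYMS.get(a, set()) or a in SYNONYMS.get(b, set()):
        return 1.0
    return 0.0

def match_score(a: str, b: str, w_sem: float = 0.6) -> float:
    """Weighted combination of the semantic and syntactic checks."""
    return w_sem * semantic_score(a, b) + (1 - w_sem) * syntactic_score(a, b)

def map_elements(source, target, threshold=0.5):
    """Map each source element to its best target candidate above a threshold."""
    mapping = {}
    for s in source:
        best = max(target, key=lambda t: match_score(s, t))
        if match_score(s, best) >= threshold:
            mapping[s] = best
    return mapping

print(map_elements(["Client", "Order"], ["Customer", "Purchase", "Invoice"]))
```

In a real transformation pipeline the synonym lookup would be replaced by a proper semantic resource and the names by full model elements with types and structure.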
Abstract: Because blueberries are recognized worldwide as a
good source of beneficial components, their consumption has
increased in the past decades, and so have the scientific works about
their properties. Hence, this work was undertaken to evaluate the
effect of some production and conservation factors on the properties
of blueberries from cultivar Bluecrop. The physical and chemical
analyses were done according to established methodologies, and all
data were then treated using SPSS software to assess possible
differences among the factors investigated and/or
correlations between the variables under study. The results showed that the
location of production influenced some of the berries' properties
(caliber, sugars, antioxidant activity, color and texture) and that the
age of the bushes was correlated with moisture, sugars and acidity, as
well as lightness. On the other hand, the altitude of the farm was
correlated only with sugar content. With regard to conservation, it
influenced only anthocyanins content and DPPH antioxidant activity.
Finally, the type of extract and the order of extraction had a
pronounced influence on all the phenolic properties evaluated.
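The correlation analysis mentioned above rests on the Pearson coefficient that SPSS reports. As a minimal illustration of the computation, with invented values standing in for the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative (invented) values: farm altitude (m) vs. sugar content (%).
altitude = [400, 550, 700, 850, 1000]
sugars = [12.1, 11.6, 11.0, 10.4, 9.9]
print(round(pearson_r(altitude, sugars), 3))
```

A value near -1 would indicate a strong negative linear relationship, which is the kind of association the study tests for between altitude and sugar content.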
Abstract: Rehabilitation of dam components such as foundations, buttresses, spillways and overtopping protection requires a wide range of construction and design methodologies. Geotechnical engineering considerations play an important role in the design and construction of the foundations of new dams, and much investigation is required to assess and evaluate existing dams. The application of roller-compacted concrete (RCC) has been accepted as a new method for constructing new dams or rehabilitating old ones. Over the past 40 years there have been many changes in the usage of RCC, and it is now one of the most satisfactory solutions for water and hydropower resources throughout the world. The considerations for rehabilitation and construction of dams may differ owing to the upstream reservoir and its influence on penetration and dewatering downstream, operational requirements and plant layout. One of the advantages of RCC is its rapid placement, which allows the dam to be operated quickly. Unlike ordinary concrete it is a drier mix, stiff enough to be compacted by vibratory rollers. This paper evaluates different aspects of RCC and focuses on its preparation process.
Abstract: The aim of this paper is to present the optimization
methodology developed within the framework of a Coastal Transport
Information System. The system will be used for the effective design
of coastal transportation lines and incorporates subsystems that
implement models, tools and techniques that may support the design
of improved networks. The role of the optimization and decision
subsystem is to provide the user with improved and optimal scenarios
that best fulfill the constraints, goals or requirements posed. The
complexity of the problem and the large number of parameters and
objectives involved led to the adoption of an evolutionary method
(Genetic Algorithms). The problem model and the subsystem
structure are presented in detail, and its support for simulation is also
discussed.
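The evolutionary search adopted here can be sketched compactly. The following toy example is a minimal genetic algorithm; the cost function, the encoding as a vector of service frequencies, and all parameters are hypothetical stand-ins, not the subsystem's actual model:

```python
import random

random.seed(42)  # reproducible toy run

# Hypothetical objective: mismatch between offered service frequencies
# on three coastal lines and an assumed demand profile.
DEMAND = [30, 50, 20]

def cost(freqs):
    return sum(abs(f - d) for f, d in zip(freqs, DEMAND))

def mutate(ind, rate=0.3, step=5):
    # Randomly perturb some genes by a small integer step.
    return [f + random.randint(-step, step) if random.random() < rate else f
            for f in ind]

def crossover(a, b):
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:]

def genetic_search(pop_size=30, generations=100):
    pop = [[random.randint(0, 60) for _ in DEMAND] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]  # elitist truncation selection
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return min(pop, key=cost)

best = genetic_search()
print("best configuration:", best, "cost:", cost(best))
```

In the actual system the fitness evaluation would come from the network models and simulation subsystem rather than a closed-form expression.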
Abstract: The building sector is responsible, in many
industrialized countries, for about 40% of the total energy
requirements, so it seems necessary to devote some efforts in this
area in order to achieve a significant reduction of energy
consumption and of greenhouse gases emissions.
The paper presents a study aiming at providing a design
methodology able to identify the best configuration of the system
building/plant, from a technical, economic and environmental point
of view.
Normally, the classical approach involves a building's energy
loads analysis under steady state conditions, and subsequent selection
of measures aimed at improving the energy performance, based on
previous experience made by architects and engineers in the design
team. Instead, the proposed approach uses a sequence of two
well-known, scientifically validated calculation tools (TRNSYS and
RETScreen) that allow quite a detailed feasibility analysis.
To assess the validity of the calculation model, an existing
historical building in Central Italy, which will be the object of
restoration and preservative redevelopment, was selected as a case
study. The building consists of a basement and three floors, with a
total floor area of about 3,000 square meters.
The first step was the determination of the heating and
cooling energy loads of the building in a dynamic regime by means of
dynamic simulation, which allows the real energy needs of the building
to be simulated as a function of its use. Traditional methodologies,
based as they are on steady-state conditions, cannot faithfully
reproduce the effects of varying climatic conditions and of the inertial
properties of the structure. With this model it is possible to obtain
quite accurate and reliable results that allow effective building–HVAC
system combinations to be identified.
The second step consisted of using the output data obtained as
input to the second calculation model, which enables different
system configurations to be compared from the energy, environmental
and financial points of view, with an analysis of investment and of
operation and maintenance costs, thus allowing the economic benefit
of possible interventions to be determined.
The classical methodology often leads to the choice of
conventional plant systems, while our calculation model provides a
financial-economic assessment of innovative,
low-environmental-impact energy systems.
Computational analysis can help in the design phase, particularly
in the case of complex structures with centralized plant systems, by
comparing the data returned by the calculation model for different
design options.
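At its core, the financial side of the second step reduces to discounted cash-flow arithmetic. A minimal sketch, with invented figures rather than the case-study data:

```python
def npv(rate, cashflows):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical retrofit intervention: 50 k-EUR invested now, 7 k-EUR/year
# of energy savings over a 20-year horizon, discounted at 4%
# (all figures invented for illustration).
flows = [-50_000] + [7_000] * 20
result = npv(0.04, flows)
print(f"NPV of the intervention: {result:,.0f} EUR")
```

A positive NPV indicates that the intervention's discounted savings exceed the investment; comparing NPVs across design options is one way such a model ranks configurations.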
Abstract: Despite the highly touted benefits, emerging
technologies have unleashed pervasive concerns regarding unintended
and unforeseen social impacts. Thus, those wishing to create safe and
socially acceptable products need to identify such side effects and
mitigate them prior to market proliferation. Various methodologies
in the field of technology assessment (TA), namely Delphi, impact
assessment, and scenario planning, have been widely incorporated in
such a circumstance. However, the literature faces a major limitation
in its sole reliance on participatory workshop activities. It has
unfortunately overlooked the availability of a massive untapped data
source of futuristic information flooding through the Internet. This
research thus seeks to gain insights into utilization of futuristic data,
future-oriented documents from the Internet, as a supplementary
method to generate social impact scenarios whilst capturing
perspectives of experts from a wide variety of disciplines. To this end,
network analysis is conducted based on the social keywords extracted
from the futuristic documents by text mining, which is then used as a
guide to produce a comprehensive set of detailed scenarios. Our
proposed approach facilitates harmonized depictions of possible
hazardous consequences of emerging technologies and thereby makes
decision makers more aware of, and responsive to, broad qualitative
uncertainties.
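The keyword co-occurrence network at the heart of the approach can be illustrated in a few lines. The documents and social keywords below are invented placeholders, not the study's corpus:

```python
from collections import Counter
from itertools import combinations

# Hypothetical keyword sets extracted from futuristic documents by text mining.
documents = [
    {"privacy", "surveillance", "ai"},
    {"ai", "unemployment", "automation"},
    {"privacy", "ai", "regulation"},
    {"automation", "unemployment"},
]

# Build the co-occurrence network: nodes are social keywords, and an edge's
# weight counts how many documents mention both of its endpoints.
edges = Counter()
for doc in documents:
    for a, b in combinations(sorted(doc), 2):
        edges[(a, b)] += 1

# Heavily weighted edges suggest impact themes worth expanding into scenarios.
for (a, b), w in edges.most_common(3):
    print(f"{a} -- {b}: {w}")
```

In practice the network would be far larger, and community or centrality analysis on it would guide which clusters of keywords become detailed scenarios.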
Abstract: Our goal is development of an algorithm capable of
predicting the directional trend of the Standard and Poor’s 500 index
(S&P 500). Extensive research has been published attempting to
predict different financial markets using historical data testing on an
in-sample and trend basis, with many authors employing excessively
complex mathematical techniques. In reviewing and evaluating these
in-sample methodologies, it became evident that this approach was
unable to achieve sufficiently reliable prediction performance for
commercial exploitation. For these reasons, we moved to an
out-of-sample strategy based on linear regression analysis of an extensive
set of financial data correlated with historical closing prices of the
S&P 500. We are pleased to report a directional trend accuracy of
greater than 55% for tomorrow (t+1) in predicting the S&P 500.
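The walk-forward, out-of-sample evaluation described above can be sketched as follows. The data here are synthetic (a noisy linear relation with invented coefficients), not the paper's financial series, so the printed accuracy illustrates only the procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the correlated financial data set (hypothetical):
# the next-day "return" is a noisy linear function of two predictor series.
n = 500
X = rng.normal(size=(n, 2))
y = X @ np.array([0.4, -0.3]) + rng.normal(scale=0.5, size=n)

# Walk-forward out-of-sample evaluation: at each step, fit ordinary least
# squares on the past only, then predict the direction of t+1.
hits = trials = 0
for t in range(250, n):
    beta, *_ = np.linalg.lstsq(X[:t], y[:t], rcond=None)
    hits += int(np.sign(X[t] @ beta) == np.sign(y[t]))
    trials += 1

print(f"out-of-sample directional accuracy: {hits / trials:.1%}")
```

The key design point matches the abstract's argument: the model at time t never sees data from t onward, so the accuracy estimate is not inflated by in-sample fitting.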
Abstract: Qatar, a Gulf country highly dependent on its oil and
gas revenues, is looking to innovate, diversify, and ultimately reach
its aim of creating a knowledge economy to prepare for its post-oil
era. One area that the country is investing in is Contemporary Art,
and world-renowned artists such as Damien Hirst and Richard Serra
have been commissioned to design site-specific art for the public
spaces of the city of Doha as well as in more remote desert locations.
This research discusses the changing presence, role and context of
public art in Doha, providing both a historical and cultural overview
and an examination of the different forms and media, as well as the
typologies of urban and public spaces in which the art is installed. It examines the process of
implementing site-specific artworks, looking at questions of scale,
history, social meaning and formal aesthetics. The methodologies
combine theoretical research on the understanding of public art and
its role and placement in public space, as well as empirical research
on contemporary public art projects in Doha, based on documentation
and interviews, as well as site and context analysis of the urban or
architectural spaces within which the art is situated. Surveys and
interviews, conducted via social media across different segments of
contemporary Qatari society, including all nationalities and social
groups, are used to measure and qualify the impacts and effects on
the population.
Abstract: This study was conducted in the area of Vlora Bay,
Albania. Data on the sea turtles Caretta caretta and Chelonia mydas,
from two periods (1984–1991 and 2008–2014), are
presented. All data gathered were analyzed using recent methodologies.
For all turtles captured (as bycatch), the Curved Carapace Length
(CCL) and Curved Carapace Width (CCW) were measured. These
data were statistically analyzed; the mean was 67.11 cm for
CCL and 57.57 cm for CCW of all individuals studied (n=13). All
untagged individuals of marine turtles were tagged using metallic
tags (Stockbrand’s titanium tags) with an Albanian address. Sex was
determined, showing that 45.4% of individuals were females,
27.3% males and 27.3% juveniles. All turtles were examined for the
presence of epibionts. The area of Vlora Bay is used by marine
turtles (Caretta caretta) as a migratory corridor from the
Mediterranean to the northern part of the Adriatic Sea.
Abstract: This study monitored the population of the
European Pond Turtle, Emys orbicularis (Linnaeus, 1758), in the area
of Narta Lagoon, Vlora Bay (Albania), from August to October 2014.
A total of 54 individuals of E. orbicularis were studied using
different methodologies. Curved Carapace Length (CCL), Plastron
Length (PL) and Curved Carapace Width (CCW) were measured for
each individual of E. orbicularis and were statistically analyzed. All
captured turtles were separated into seven size classes based
on their carapace length (CCL). Each individual of E. orbicularis was
marked by notching the carapace (marginal scutes). Of all the
individuals captured, 37 were females (68.5%), 14 males
(25.9%) and 3 juveniles (5.5%); 18 individuals of E. orbicularis
were recaptured once, and some a second time.
Abstract: Construction cost estimation is one of the most
important aspects of construction project design. For generations, the
process of cost estimating has been manual, time-consuming and
error-prone. This has partly led to most cost estimates being unclear
and riddled with inaccuracies that at times lead to over- or
underestimation of construction costs. The development of a standard set of
measurement rules that are understandable by all those involved in a
construction project has not totally solved these challenges. Emerging
Building Information Modelling (BIM) technologies can exploit
standard measurement methods to automate cost estimation process
and improve accuracies. This requires standard measurement
methods to be structured in an ontological and machine-readable format
so that BIM software packages can easily read them. Most standard
measurement methods are still text-based in textbooks and require
manual editing into tables or spreadsheets during cost estimation. The
aim of this study is to explore the development of an ontology based
on New Rules of Measurement (NRM) commonly used in the UK for
cost estimation. The methodology adopted is Methontology, one of
the most widely used ontology engineering methodologies. The
challenges in this exploratory study are also reported and
recommendations for future studies proposed.
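To make the idea of a machine-readable measurement ontology concrete, the fragment below sketches one possible encoding. The class names, units and rule text are invented for illustration; they are not the actual NRM content or the Methontology-produced ontology:

```python
import json

# Hypothetical fragment showing how text-based measurement rules could be
# restructured into a machine-readable form (all names and rules invented).
ontology = {
    "classes": {
        "Element": {"subClassOf": "Thing"},
        "Wall": {"subClassOf": "Element",
                 "measuredIn": "m2",
                 "rule": "measure the area of one face"},
        "Door": {"subClassOf": "Element", "measuredIn": "nr"},
    }
}

def unit_of(cls):
    """Look up the unit of measurement prescribed for an element class."""
    return ontology["classes"][cls].get("measuredIn")

print(unit_of("Wall"), unit_of("Door"))
print(json.dumps(ontology["classes"]["Wall"], indent=2))
```

A BIM package could query such a structure directly (e.g. to fetch the unit and rule for every element type in a model), which is exactly what plain textbook prose does not allow.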
Abstract: Building loss estimation methodologies, which have
advanced considerably in recent decades, are usually used to
estimate the social and economic impacts resulting from seismic structural
damage. In accordance with these methods, this paper presents the
evaluation of an annual loss probability of a reinforced concrete
moment resisting frame designed according to the Korean Building Code.
The annual loss probability is defined by (1) a fragility curve obtained
from a capacity spectrum method which is similar to a method adopted
from HAZUS, and (2) a seismic hazard curve derived from annual
frequencies of exceedance per peak ground acceleration. Seismic
fragilities are computed to calculate the annual loss probability of a
certain structure using functions depending on structural capacity,
seismic demand, structural response and the probability of exceeding
damage state thresholds. This study carried out a nonlinear static
analysis to obtain the capacity of a RC moment resisting frame
selected as a prototype building. The analysis results show that the
annual probability of extensive structural damage in the prototype
building is estimated at 0.01%.
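The combination of a fragility curve with a hazard curve can be sketched numerically. The lognormal fragility and power-law hazard parameters below are invented placeholders, not the values derived for the studied frame; the sum approximates the mean annual frequency of reaching the damage state (close to the annual probability when small):

```python
import math

# Hypothetical lognormal fragility: probability of reaching the extensive
# damage state given peak ground acceleration (PGA, in g).
def fragility(pga, median=0.6, beta=0.5):
    return 0.5 * (1 + math.erf(math.log(pga / median) / (beta * math.sqrt(2))))

# Hypothetical power-law seismic hazard curve: annual frequency with which
# a given PGA is exceeded.
def hazard(pga, k0=1e-4, k=2.5):
    return k0 * pga ** (-k)

# Discretize the PGA range and sum fragility times the annual frequency of
# each PGA bin (the difference of the hazard curve at the bin edges).
def annual_damage_frequency(pgas):
    total = 0.0
    for lo, hi in zip(pgas[:-1], pgas[1:]):
        d_lambda = hazard(lo) - hazard(hi)  # frequency of PGA in [lo, hi]
        total += fragility(0.5 * (lo + hi)) * d_lambda
    return total

pgas = [0.05 * i for i in range(1, 41)]  # 0.05 g .. 2.0 g
p = annual_damage_frequency(pgas)
print(f"mean annual frequency of extensive damage: {p:.2e}")
```

The structure mirrors the abstract's definition: the fragility term carries structural capacity, demand and damage-state thresholds, while the hazard term carries the annual frequencies of exceedance per peak ground acceleration.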
Abstract: The capacity spectrum method (CSM), one of the methodologies used to evaluate the seismic fragility of building structures, has long been recognized as the most convenient method, even though it has several limitations in predicting the seismic response of the structures of interest. This paper proposes a procedure to estimate seismic fragility curves using incremental dynamic analysis (IDA) rather than a CSM. To achieve the research purpose, this study compares the seismic fragility curves of a 5-story reinforced concrete (RC) moment frame obtained from both methods: the IDA method and a CSM. The two sets of fragility curves are similar in the slight and moderate damage states, whereas the fragility curve obtained from the IDA method presents less variation (or uncertainty) in the extensive and complete damage states. This is because the IDA method can capture the structural response beyond yielding more properly than the CSM and can directly account for higher mode effects. From these observations, the CSM could overestimate the seismic vulnerability of the studied structure in the extensive or complete damage states.
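Fitting a lognormal fragility curve to IDA results is, at its simplest, a method-of-moments fit on the logarithms of the per-record capacities. The capacities below are invented placeholders, not the 5-story frame's results:

```python
import math
import statistics

# Hypothetical IDA results: the intensity measure (in g) at which each
# ground-motion record first drives the frame into a given damage state.
ida_capacities = [0.42, 0.55, 0.61, 0.48, 0.70, 0.52, 0.65, 0.58]

# Method-of-moments fit on the logs:
# median = exp(mean(ln x)), beta (lognormal std) = stdev(ln x).
logs = [math.log(x) for x in ida_capacities]
median = math.exp(statistics.fmean(logs))
beta = statistics.stdev(logs)

def fragility(im):
    """P(damage state reached | intensity measure im), lognormal CDF."""
    return 0.5 * (1 + math.erf(math.log(im / median) / (beta * math.sqrt(2))))

print(f"median = {median:.2f} g, beta = {beta:.2f}, "
      f"P(0.6 g) = {fragility(0.6):.2f}")
```

The dispersion parameter beta is exactly the "variation (or uncertainty)" the abstract compares between the IDA-based and CSM-based curves: a smaller fitted beta gives a steeper, less uncertain fragility curve.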
Abstract: This paper presents the design and testing of
nanotechnology-based sequential circuits using multiplexer
conservative QCA (MX-CQCA) logic gates, which are easily testable
using only two vectors. This method has great potential in the
design of sequential circuits based on reversible conservative logic
gates and also surpasses sequential circuits implemented with
traditional gates in terms of testability. Reversible circuits are similar
to conventional logic circuits except that they are built from reversible
gates. Designs of multiplexer conservative QCA logic based, two-vector
testable, double edge triggered (DET) sequential circuits in the VHDL
language are also presented here, which also reduces complexity on the
testing side. Other sequential circuits, such as D, SR and JK
latches, are also designed using this MX-CQCA logic gate. The objective
behind the proposed design methodologies is to amalgamate
arithmetic and logic functional units while optimizing key metrics such as
garbage outputs, delay, area and power. The proposed MX-CQCA
gate outperforms other reversible gates in terms of complexity and delay.
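The classic example of a multiplexer-style conservative reversible gate is the Fredkin (controlled-swap) gate; the sketch below checks its two defining properties, reversibility (bijectivity) and conservativeness (preservation of the number of 1s), by enumeration. This illustrates conservative reversible logic in general, not the MX-CQCA gate itself:

```python
from itertools import product

def fredkin(c, a, b):
    """Fredkin (controlled-swap) gate: if the control c is 1, swap a and b.
    It acts as a multiplexer and is both reversible and conservative."""
    return (c, b, a) if c else (c, a, b)

outputs = set()
for bits in product((0, 1), repeat=3):
    out = fredkin(*bits)
    outputs.add(out)
    # Conservative: the number of 1s is preserved from input to output.
    assert sum(bits) == sum(out)

# Reversible: the mapping is a bijection on the 8 input patterns.
assert len(outputs) == 8
print("Fredkin gate is conservative and reversible")
```

Conservativeness is what makes such gates testable with very few vectors: any stuck-at fault disturbs the 1s count, so checking parity of 1s under a small set of inputs exposes it.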
Abstract: Ontologies provide a common understanding of a
specific domain of interest that can be communicated between people
and used as background knowledge for automated reasoning in a
wide range of applications. In this paper, we address the design of
multilingual ontologies following well-defined knowledge
engineering methodologies with the support of novel collaborative
development approaches. In particular, we present a collaborative
platform which allows ontologies to be developed incrementally in
multiple languages. This is made possible via an appropriate mapping
between language independent concepts and one lexicalization per
language (or a lexical gap in case such lexicalization does not exist).
The collaborative platform has been designed to support the
development of the Universal Knowledge Core, a multilingual
ontology currently in English, Italian, Chinese, Mongolian, Hindi and
Bangladeshi. Its design follows a workflow-based development
methodology that models resources as a set of collaborative objects
and assigns customizable workflows to build and maintain each
collaborative object in a community driven manner, with extensive
support of modern web 2.0 social and collaborative features.
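The mapping between language-independent concepts and per-language lexicalizations (or explicit lexical gaps) can be illustrated with a minimal data structure. The identifiers and the example gap are illustrative, not drawn from the Universal Knowledge Core itself:

```python
# Minimal sketch (names hypothetical): each language-independent concept
# carries at most one lexicalization per language, and a lexical gap is
# recorded explicitly where no lexicalization exists.
LEXICAL_GAP = None

concepts = {
    "concept:sibling": {       # language-independent concept id
        "en": "sibling",
        "it": LEXICAL_GAP,     # no gender-neutral singular word (a gap)
        "zh": "兄弟姐妹",
    },
}

def lexicalize(concept_id, lang):
    """Return the single lexicalization of a concept in a language,
    or an explicit marker when the language has a lexical gap."""
    word = concepts[concept_id].get(lang, LEXICAL_GAP)
    return word if word is not LEXICAL_GAP else f"<gap:{concept_id}:{lang}>"

print(lexicalize("concept:sibling", "en"))
print(lexicalize("concept:sibling", "it"))
```

Recording the gap explicitly, rather than leaving the entry absent, lets the platform distinguish "not yet translated" from "no word exists", which matters for community-driven incremental development.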
Abstract: Enterprise Architecture (EA) Implementation
Methodologies have become an important part of EA projects.
Several implementation methodologies have been proposed, as a
theoretical and practical approach, to facilitate and support the
development of EA within an enterprise. A significant question when
starting an EA implementation is deciding which
methodology to utilize. In order to answer this question, a framework
with several criteria is applied in this paper for the comparative
analysis of existing EA implementation methodologies. Five EA
implementation methodologies, namely EAP, TOGAF, DODAF,
Gartner, and FEA, are selected for comparison using the proposed
framework. The results of the comparison indicate that these
methodologies have not reached sufficient maturity as a whole, due to
a lack of consideration of requirements management, maintenance,
the continuum, and the complexities of their processes. The framework
is also able to evaluate any kind of EA implementation
methodology.
Abstract: Fuzzy systems have been successfully used for
exchange rate forecasting. However, a fuzzy system is confusing
and complex for an expert to design: there is a large set of
parameters (the fuzzy knowledge base) that must be selected, and it is
not a simple task to select an appropriate fuzzy knowledge base for
exchange rate forecasting. Researchers often examine the effect of the
fuzzy knowledge base on the performance of fuzzy
forecasting systems. This paper proposes a genetic fuzzy predictor to forecast
the future value of the daily US Dollar/Euro exchange rate time series.
The approach comprises a set of fuzzy predictors, each of which
forecasts the same time series but with a different
fuzzy partition. Each fuzzy predictor is built in two stages, where
each stage is performed by a real-coded genetic algorithm.
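One fuzzy predictor of the kind described can be sketched as a triangular partition with weighted-average (Sugeno-style) defuzzification. In the paper's setting a real-coded genetic algorithm would tune the partition points and consequents; here they are invented fixed values:

```python
def triangular(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical fuzzy partition of the input universe (yesterday's rate
# change) and the output value (consequent) each fuzzy set votes for.
partition = {
    "falling": ((-0.02, -0.01, 0.0), -0.005),
    "stable":  ((-0.01, 0.0, 0.01),   0.0),
    "rising":  ((0.0, 0.01, 0.02),    0.005),
}

def predict(change):
    """Weighted-average defuzzification over the partition."""
    num = den = 0.0
    for (a, b, c), out in partition.values():
        mu = triangular(change, a, b, c)
        num += mu * out
        den += mu
    return num / den if den else 0.0

print(predict(0.004))  # partially "stable", partially "rising"
```

A genetic algorithm would encode the nine breakpoints and three consequents as a real-valued chromosome and evolve them against forecasting error, which is the two-stage construction the abstract refers to.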
Abstract: Concerns about corrosion and the effective coating
protection of double hull tankers and bulk carriers in service have
been raised, especially in water ballast tanks (WBTs). Test
protocols/methodologies, in particular the one incorporated in the
International Maritime Organisation (IMO) Performance Standard
for Protective Coatings for Dedicated Sea Water Ballast Tanks (PSPC),
are being used to assess and evaluate the performance of coatings
for type approval prior to their application in WBTs. However, some
of the type approved coatings may be applied as very thick films to
less than ideally prepared steel substrates in the WBT. As such films
experience hygrothermal cycling from operating and environmental
conditions, they become embrittled which may ultimately result in
cracking. This embrittlement of the coatings is identified as an
undesirable feature in the PSPC but is not mentioned in the test
protocols within it. There is therefore renewed industrial research
aimed at understanding this issue in order to eliminate cracking and
achieve the intended coating lifespan of 15 years in good condition.
This paper will critically review test protocols currently used for
assessing and evaluating coating performance, particularly the IMO
PSPC.