Abstract: The Unified Modeling Language (UML) is considered one of the most widespread modeling languages standardized by the Object Management Group (OMG). The model-driven engineering (MDE) community therefore attempts to reuse UML diagrams rather than construct them from scratch. A UML model emerges from a specific software development process, yet existing model-generation methods have focused on transformation techniques without considering that process. Our work aims to construct a UML component from fragments of UML diagrams based on an agile method. We define a UML fragment as a portion of a UML diagram that expresses a business target. To guide the generation of UML model fragments within an agile process, we need a flexible approach that adapts to agile changes and covers all agile activities. We use a software product line (SPL) to derive a fragment of the agile process. This paper explains our approach, named RECUP, for generating UML fragments following an agile process, and gives an overview of its different aspects; we present the approach and define its phases and artifacts.
Abstract: The rapid generation of a high volume and broad variety of data from the application of new technologies poses challenges for generating business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods to develop their business. The recently decentralized data-management environment therefore relies on a distributed computing paradigm. While data are stored in highly distributed systems, implementing distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and from all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.
Abstract: To increase the temperature contrast in thermal
images, the characteristics of the electrical conductivity and thermal
imaging modalities can be combined. The objective of this
experimental study is to observe whether the temperature contrast
created by tumor tissue can be improved solely by applying current
within medical safety limits. Various thermal breast phantoms are
developed to simulate female breast tissue. In vitro experiments
are implemented using a thermal infrared camera in a controlled
manner. Since the experiments are implemented in vitro, there is no
metabolic heat generation or blood perfusion; only the effects and
results of the electrical stimulation are investigated. The experimental
study is implemented with two-dimensional models. Temperature
contrasts due to the tumor tissues are obtained, and cancerous tissue is
localized using the difference and the ratio of the healthy and tumor
images. A single tumor tissue of 1 cm diameter causes a temperature
contrast of almost 40 m°C on the thermal breast phantom. Electrode
artifacts are reduced by taking the difference and the ratio of the
background (healthy) and tumor images, and the ratio images show
that the temperature contrast is increased by the current application.
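The difference-and-ratio step described above can be sketched as a per-pixel operation on two thermal images. The 3x3 arrays, temperature values, and electrode-artifact magnitude below are invented for illustration; they are not data from the study:

```python
def contrast_maps(healthy, tumor):
    """Per-pixel difference and ratio of a background (healthy) thermal image
    and a tumor image; fixed patterns such as electrode artifacts that appear
    identically in both acquisitions cancel out in either map."""
    diff = [[t - h for t, h in zip(tr, hr)] for tr, hr in zip(tumor, healthy)]
    ratio = [[t / h for t, h in zip(tr, hr)] for tr, hr in zip(tumor, healthy)]
    return diff, ratio

# Toy 3x3 temperatures in degrees C: a hypothetical electrode artifact
# (+0.5 C at [0][0]) present in both images cancels, while a 40 m-degree
# tumor hot spot at [1][1] survives in the difference map.
healthy = [[25.5, 25.0, 25.0], [25.0, 25.0, 25.0], [25.0, 25.0, 25.0]]
tumor   = [[25.5, 25.0, 25.0], [25.0, 25.04, 25.0], [25.0, 25.0, 25.0]]
diff, ratio = contrast_maps(healthy, tumor)
```

The artifact pixel yields a difference of zero and a ratio of one, while the tumor pixel keeps its 0.04 °C contrast, mirroring how the study suppresses electrode artifacts.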
Abstract: Accelerated steam distillation assisted by microwave
(ASDAM) is a new method combining microwave heating and steam
distillation, performed at atmospheric pressure with a very short
extraction time. Isolation and concentration of volatile compounds
are performed in a single stage. ASDAM has been compared with
ASDAM combined with cryogrinding of seeds (ASDAM-CG) and
with conventional techniques, hydrodistillation assisted by
microwave (HDAM) and hydrodistillation (HD), for the extraction
of essential oil from aromatic herbs such as caraway and cumin
seeds. The essential oils extracted by ASDAM for 1 min were
quantitatively (yield) and qualitatively (aromatic profile) not similar
to those obtained by ASDAM-CG (1 min) and HD (3 h). The
accelerated microwave extraction with cryogrinding inhibits
numerous enzymatic reactions, such as hydrolysis of oils.
Microwave radiation is an adequate means for extraction operations
from the point of view of yield and high content of major
components, and it considerably minimizes energy consumption
and, especially, heating time, which is one of the essential
parameters in artifact formation.
ASDAM and ASDAM-CG are green techniques that yield essential
oils with higher amounts of the more valuable oxygenated
compounds, comparable to the compounds of biosynthesis, and
allow substantial savings of costs in terms of time, energy, and plant
material.
Abstract: Ambient Intelligence (AmI) proposes a new way of thinking about computers, following the Ubiquitous Computing vision of Mark Weiser. Part of this vision is the Disappearing Computer initiative, in which users are immersed in intelligent environments. Hence, technologies need to be adapted so that they are capable of replacing the traditional inputs to the system by embedding them in everyday artifacts. In this work, we present an approach that uses Radio Frequency Identification (RFID) and Near Field Communication (NFC) technologies; with the latter, a new form of interaction by contact appears. We compare both technologies by analyzing their requirements and advantages, and we propose using a combination of RFID and NFC.
Abstract: The introduction of a multitude of new and interactive
e-commerce information technology (IT) artifacts has impacted
adoption research. Rather than solely functioning as productivity
tools, new IT artifacts assume the roles of interaction mediators and
social actors. This paper describes the varying roles assumed by IT
artifacts, and proposes and distinguishes between four distinct foci of
how the artifacts are evaluated. It further proposes a theoretical
model that maps the different views of IT artifacts to four distinct
types of evaluations.
Abstract: The need to merge software artifacts seems inherent
to modern software development. Distribution of development over
several teams and breaking tasks into smaller, more manageable
pieces are effective means of dealing with this kind of complexity. In
each case, the separately developed artifacts need to be assembled as
efficiently as possible into a consistent whole in which the parts still
function as described. In addition, the earlier changes are introduced
into the life cycle, the easier they are for designers to manage.
Interaction-based specifications such as UML sequence diagrams
have been found effective in this regard. As a result, sequence
diagrams can be used not only for capturing system behaviors but
also for merging changes in order to create a new version. The
objective of this paper is to suggest a new approach to deal with the
problem of software merging at the level of sequence diagrams by
using the concept of dependence analysis, which formally captures
all mappings and differences between the elements of sequence
diagrams and serves as a key concept for creating a new version of a
sequence diagram.
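As a rough illustration of merging at the sequence diagram level, the sketch below treats a diagram as an ordered list of (sender, receiver, message) tuples and performs a naive three-way merge. The ATM messages and the set-based mapping logic are invented stand-ins for the paper's formal dependence analysis:

```python
def diff_messages(base, variant):
    """Split a variant's messages into those mapped to the base and additions.
    Messages are (sender, receiver, label) tuples."""
    base_set = set(base)
    mapped = [m for m in variant if m in base_set]
    added = [m for m in variant if m not in base_set]
    return mapped, added

def merge(base, left, right):
    """Naive three-way merge: keep base messages still present in both
    variants, then append each side's additions in their original order."""
    _, add_left = diff_messages(base, left)
    _, add_right = diff_messages(base, right)
    keep = [m for m in base if m in left and m in right]
    return keep + add_left + add_right

base  = [("User", "ATM", "insertCard"), ("ATM", "Bank", "verifyPIN")]
left  = base + [("ATM", "Bank", "withdraw")]          # one team's change
right = [("User", "ATM", "insertCard"), ("ATM", "Bank", "verifyPIN"),
         ("ATM", "User", "showBalance")]              # another team's change
merged = merge(base, left, right)
```

A real dependence analysis would also order additions with respect to the messages they depend on, rather than simply appending them.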
Abstract: This study aims to increase understanding of the
transition of business models in servitization. The significance of
service in all business has increased dramatically during the past
decades. Service-dominant logic (SDL) describes this change in the
economy and questions the goods-dominant logic on which business
has primarily been based in the past. A business model canvas is one
of the most cited and used tools for defining and developing business
models. The starting point of this paper lies in the notion that the
traditional business model canvas is inherently goods-oriented and
best suited to product-based business. However, the basic differences
between goods and services necessitate changes in business model
representations when proceeding in servitization. Therefore, new
knowledge is needed on how the conception of business model and
the business model canvas as its representation should be altered in
servitized firms in order to better serve business developers and interfirm
co-creation. That is to say, compared to products, services are
intangible and they are co-produced between the supplier and the
customer. Value is always co-created in interaction between a
supplier and a customer, and customer experience primarily depends
on how well the interaction succeeds between the actors. The role of
service experience is even stronger in service business compared to
product business, as services are co-produced with the customer. This paper provides business model developers with a service
business model canvas, which takes into account the intangible,
interactive, and relational nature of service. The study employs a
design science approach that contributes to theory development via
design artifacts. This study utilizes qualitative data gathered in
workshops with ten companies from various industries. In particular,
key differences between goods-dominant logic (GDL) and SDL-based
business models are identified when an industrial firm
proceeds in servitization. As the result of the study, an updated version of the business
model canvas is provided based on service-dominant logic. The
service business model canvas ensures a stronger customer focus and
includes aspects salient for services, such as interaction between
companies, service co-production, and customer experience. It can be
used for the analysis and development of a current service business
model of a company or for designing a new business model. It
facilitates customer-focused new service design and service
development. It aids in the identification of development needs, and
facilitates the creation of a common view of the business model.
Therefore, the service business model canvas can be regarded as a
boundary object, which facilitates the creation of a common
understanding of the business model between several actors involved.
The study contributes to the business model and service business
development disciplines by providing a managerial tool for
practitioners in service development. It also provides research insight
into how servitization challenges companies’ business models.
Abstract: This report presents an alternative technique for
applying contrast agent in vivo, i.e. before sampling. With this
new method, the electron micrographs of tissue sections have an
acceptable contrast compared to other methods and show no
precipitation artifacts on the sections. Another advantage is that only
a small amount of contrast agent is needed to obtain a good result,
given that most contrast agents are expensive and extremely toxic.
Abstract: Software fault prediction models are created using
the source code, metrics computed from the same or a previous
version of the code, and the related fault data. Some companies do
not store and keep track of all the artifacts required for software
fault prediction. To construct fault prediction models for such
companies, training data from other projects can be one potential
solution. The earlier a fault is predicted, the less it costs to correct.
The training data consist of metrics data and related fault data at the
function/module level. This paper investigates fault prediction at an
early stage using cross-project data, focusing on design metrics. An
empirical analysis is carried out to validate design metrics for
cross-project fault prediction. The machine learning technique used
for evaluation is Naïve Bayes. The design-phase metrics of other
projects can be used as an initial guideline for projects where no
previous fault data are available. We analyze seven datasets from
the NASA Metrics Data Program, which offer design as well as
code metrics. Overall, the cross-project results are comparable to
learning from within-company data.
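A minimal sketch of the cross-project setup with a Gaussian Naïve Bayes classifier written from scratch. The three design metrics and the two toy "projects" below are invented; the actual study uses the NASA MDP datasets:

```python
import math

def fit_gnb(X, y):
    """Fit Gaussian Naive Bayes: per-class prior plus per-feature mean/variance."""
    model = {}
    for cls in set(y):
        rows = [x for x, label in zip(X, y) if label == cls]
        means = [sum(col) / len(rows) for col in zip(*rows)]
        variances = [max(sum((v - m) ** 2 for v in col) / len(rows), 1e-9)
                     for col, m in zip(zip(*rows), means)]
        model[cls] = (len(rows) / len(y), means, variances)
    return model

def predict_gnb(model, x):
    """Return the class with the highest Gaussian log-posterior for x."""
    def log_post(prior, means, variances):
        lp = math.log(prior)
        for v, m, var in zip(x, means, variances):
            lp += -0.5 * math.log(2 * math.pi * var) - (v - m) ** 2 / (2 * var)
        return lp
    return max(model, key=lambda c: log_post(*model[c]))

# Cross-project setup: train on (hypothetical) design metrics of project A...
train_X = [[3, 1, 2], [10, 6, 8], [2, 1, 1], [12, 7, 9], [4, 2, 2], [11, 5, 7]]
train_y = [0, 1, 0, 1, 0, 1]   # 0 = fault-free module, 1 = faulty module
model = fit_gnb(train_X, train_y)
# ...then predict faults in project B, which has no fault history of its own.
label = predict_gnb(model, [11, 6, 8])
```

Real design metrics (e.g. coupling and complexity measures) would replace the invented feature vectors, and distributions may differ across projects, which is exactly the validity question the paper examines.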
Abstract: High density electrical prospecting has been widely
used in groundwater investigation, civil engineering and
environmental survey. For efficient inversion, the forward modeling
routine, sensitivity calculation, and inversion algorithm must be
efficient. This paper attempts to provide a brief summary of the past
and ongoing developments of the method. It includes reviews of the
procedures used for data acquisition, processing and inversion of
electrical resistivity data based on compilation of academic literature.
In recent times there has been a significant evolution in field survey
designs and data inversion techniques for the resistivity method. In
general, 2-D inversion of resistivity data is carried out using the
linearized least-squares method with a local optimization technique.
Multi-electrode and multi-channel systems have made it possible to
conduct large 2-D, 3-D and even 4-D surveys efficiently to resolve
complex geological structures that were not possible with traditional
1-D surveys. 3-D surveys play an increasingly important role in very
complex areas where 2-D models suffer from artifacts due to off-line
structures. Continued developments in computation technology, as
well as fast data inversion techniques and software, have made it
possible to use optimization techniques to obtain model parameters to
a higher accuracy. A brief discussion on the limitations of the
electrical resistivity method has also been presented.
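The linearized least-squares scheme mentioned above can be illustrated with a damped Gauss-Newton iteration, m <- m + (J^T J + lambda*I)^-1 J^T r. The two-parameter exponential forward model below is a toy stand-in for an actual resistivity forward-modeling routine:

```python
import math

def gauss_newton(forward, jac, d_obs, m, xs, lam=1e-3, iters=100):
    """Damped linearized least squares for a 2-parameter model:
    m <- m + (J^T J + lam*I)^-1 J^T r, with residual r = d_obs - forward(m)."""
    for _ in range(iters):
        r = [d - forward(m, x) for d, x in zip(d_obs, xs)]
        J = [jac(m, x) for x in xs]
        # Normal equations for 2 parameters, solved in closed form (2x2 inverse).
        a11 = sum(j[0] * j[0] for j in J) + lam
        a12 = sum(j[0] * j[1] for j in J)
        a22 = sum(j[1] * j[1] for j in J) + lam
        b1 = sum(j[0] * ri for j, ri in zip(J, r))
        b2 = sum(j[1] * ri for j, ri in zip(J, r))
        det = a11 * a22 - a12 * a12
        m = [m[0] + (a22 * b1 - a12 * b2) / det,
             m[1] + (a11 * b2 - a12 * b1) / det]
    return m

# Toy forward model standing in for the resistivity response: d = m0 * exp(-m1 * x)
fwd = lambda m, x: m[0] * math.exp(-m[1] * x)
jac = lambda m, x: [math.exp(-m[1] * x), -m[0] * x * math.exp(-m[1] * x)]

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
true_m = [100.0, 0.8]
data = [fwd(true_m, x) for x in xs]          # noiseless synthetic observations
est = gauss_newton(fwd, jac, data, [90.0, 0.7], xs)
```

A 2-D resistivity inversion applies the same update with a much larger Jacobian (the sensitivity matrix) and regularization chosen to stabilize the ill-posed problem.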
Abstract: The focal aspiration of e-Government (eGovt) is to offer
citizen-centered service delivery, whereby citizens consume services
from multiple government agencies through a national portal. Thus,
eGovt is an enterprise whose primary business motive is transparent,
efficient, and effective public services for its citizenry, and its
logical structure is the eGovernment Enterprise Architecture
(eGEA). Since eGovt is an IT-oriented, multifaceted, service-centric
system, EA does not do much for an automated enterprise beyond
the business artifacts. The manifestation of Service-Oriented
Architecture (SOA) led some governments to apply it in their
eGovts, but it limits the source of business artifacts. The concurrent
use of EA and SOA in eGovt achieves interoperability and
integration and leads to the Service-Oriented e-Government
Enterprise (SOeGE). Consequently, an agile eGovt system becomes
a reality. From an IT perspective, eGovt comprises centralized
public-service artifacts whose existing application logic belongs to
various departments at the central, state, and local levels. The eGovt
is renovated into a SOeGE by applying Service-Orientation (SO)
principles to the entire system. This paper explores the IT
perspective of SOeGE in India, encompassing the public-service
models, and illustrates it with a case study of the Passport service of
India.
Abstract: Phased-array ultrasound transducers are used for
medical ultrasonography as well as optical imaging. However, their
discontinuous structure limits their applications, due to the artifacts
that contaminate the reconstructed images. Because of the effects of
the ultrasound pressure field pattern on the echo ultrasonic waves,
as well as on the optically modulated signal, the side lobes of the
focused ultrasound beam induced by the discontinuity of the
phased-array ultrasound transducer might be the reason for the
artifacts. In this paper, a simple numerical simulation method was
used to investigate the limitation imposed by the discontinuity of
the elements in a phased-array ultrasound transducer and its effects
on the ultrasound pressure field. Taking into account the change of
the ultrasound pressure field patterns as the pitch between elements
of the phased-array transducer varies, appropriate parameters for
phased-array transducer design were asserted quantitatively.
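A minimal continuous-wave sketch of such a simulation: each array element is modeled as a point source phased to focus at one depth, and the worst off-axis lobe in the focal plane is compared with the main lobe for two pitches. The frequency, sound speed, element count, and scan range are illustrative assumptions, not the paper's parameters:

```python
import cmath
import math

def array_pressure(x, z, pitch, n_elem, freq=2e6, c=1540.0, focus=(0.0, 0.03)):
    """Continuous-wave pressure magnitude at (x, z), in meters, from a linear
    array of point-source elements phased to focus at `focus`."""
    k = 2 * math.pi * freq / c                          # wavenumber
    centers = [(i - (n_elem - 1) / 2) * pitch for i in range(n_elem)]
    total = 0j
    for ex in centers:
        r_focus = math.hypot(focus[0] - ex, focus[1])   # focusing delay term
        r_field = math.hypot(x - ex, z)
        total += cmath.exp(1j * k * (r_field - r_focus)) / r_field
    return abs(total)

wavelength = 1540.0 / 2e6
ratios = {}
for mult in (0.5, 2.0):                                 # pitch in wavelengths
    pitch = mult * wavelength
    main = array_pressure(0.0, 0.03, pitch, 32)         # on-axis focal point
    # Worst lobe found while scanning x = 10..30 mm in the focal plane:
    side = max(array_pressure(i * 0.25e-3, 0.03, pitch, 32)
               for i in range(40, 121))
    ratios[mult] = side / main
# A 2-wavelength pitch under-samples the aperture and should produce grating
# lobes, giving a much higher side/main ratio than the half-wavelength pitch.
```

Sweeping the pitch in finer steps and mapping the full field would reproduce the kind of quantitative pitch-versus-side-lobe analysis the abstract describes.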
Abstract: This research proposes a novel reconstruction protocol
for restoring missing surfaces and low-quality edges and shapes in
photos of artifacts at historical sites. The protocol starts with the
extraction of a cloud of points. This extraction process is based on
four subordinate algorithms, which differ in robustness and in the
amount of resultant data. Moreover, they bring different, but
complementary, accuracy to certain related features and to the way
they build a quality mesh. The performance of our proposed protocol
is compared with other state-of-the-art algorithms and toolkits. The
statistical analysis shows that our algorithm significantly outperforms
its rivals in the resultant quality of its object files used to reconstruct
the desired model.
Abstract: Anti-scatter grids used in radiographic imaging for contrast enhancement leave specific artifacts. Those artifacts may be visible, or may cause a Moiré effect when the digital image is resized on a diagnostic monitor. In this paper we propose an automated grid-artifact detection and suppression algorithm, addressing what is still an open problem. Grid artifact detection is based on a statistical approach in the spatial domain. Grid artifact suppression is based on a Kaiser bandstop filter transfer function, designed and applied so as to avoid ringing artifacts. Experimental results are discussed and concluded with a description of the advantages over existing approaches.
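A one-dimensional sketch of a Kaiser-windowed bandstop FIR design, built from scratch without a signal-processing library. The tap count, stop band, and beta value are illustrative choices, not the paper's parameters, and the statistical detection step is not reproduced:

```python
import cmath
import math

def i0(x):
    """Modified Bessel function I0 via its power series (for the Kaiser window)."""
    s, term, k = 1.0, 1.0, 1
    while term > 1e-12 * s:
        term *= (x / (2 * k)) ** 2
        s += term
        k += 1
    return s

def kaiser_bandstop(num_taps, f_lo, f_hi, beta):
    """Linear-phase FIR bandstop taps; f_lo/f_hi are normalized to Nyquist = 1.
    The ideal bandstop impulse response is multiplied by a Kaiser window."""
    m = num_taps - 1
    taps = []
    for n in range(num_taps):
        k = n - m / 2
        if k == 0:
            ideal = 1.0 - (f_hi - f_lo)        # all-pass minus the stop band
        else:
            ideal = (math.sin(math.pi * f_lo * k)
                     - math.sin(math.pi * f_hi * k)) / (math.pi * k)
        window = i0(beta * math.sqrt(1.0 - (2.0 * k / m) ** 2)) / i0(beta)
        taps.append(ideal * window)
    return taps

def magnitude(taps, f):
    """|H(f)| of the FIR filter at normalized frequency f (Nyquist = 1)."""
    return abs(sum(c * cmath.exp(-1j * math.pi * f * n)
                   for n, c in enumerate(taps)))

# Notch out a hypothetical grid frequency band around 0.4 of Nyquist:
taps = kaiser_bandstop(101, 0.35, 0.45, beta=6.0)
```

Applying these taps along the image axis perpendicular to the grid lines removes the grid frequency while the smooth Kaiser taper keeps passband ringing low, which is the property the abstract highlights.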
Abstract: According to the scientific information management literature, the improper use of information technology (e.g. personal computers) by employees is one main cause of operational and information security loss events. Therefore, organizations implement information security awareness programs to increase employees' awareness and thus further prevent loss events. However, in many cases these information security awareness programs rely on conventional delivery methods such as posters, leaflets, or internal messages to make employees aware of information security policies. We assume that a viral information security awareness video might be a more effective medium than the conventional methods commonly used by organizations. The purpose of this research is to develop a viral video artifact to improve employee security behavior concerning information technology.
Abstract: Enhancing the quality of two-dimensional signals is one of the most important factors in the fields of video surveillance and computer vision. In real-life video surveillance, false detections usually occur due to the presence of random noise, illumination, and shadow artifacts. Detection methods based on background subtraction face several problems in accurately detecting objects in realistic environments. In this paper, we propose a noise removal algorithm using a neighborhood comparison method with thresholding. Illumination variations in the detected foreground objects are corrected using a combination of techniques: homomorphic decomposition, curvelet transformation, and a gamma adjustment operator. Shadow is removed using a chromaticity estimator with a local relation estimator. Results are compared with existing methods and demonstrate high robustness in video surveillance.
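One plausible reading of "neighborhood comparison with thresholding" is sketched below: a pixel that deviates from the median of its 3x3 neighbors by more than a threshold is treated as impulsive noise and replaced. The window size, threshold, and replacement rule are assumptions for illustration, not the paper's exact algorithm:

```python
def denoise(img, thresh=50):
    """Neighborhood-comparison denoising (hypothetical interpretation):
    a pixel deviating from its 3x3 neighbors' median by more than `thresh`
    is treated as impulsive noise and replaced by that median."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neigh = sorted(img[y + dy][x + dx]
                           for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                           if not (dy == 0 and dx == 0))
            med = (neigh[3] + neigh[4]) / 2      # median of the 8 neighbors
            if abs(img[y][x] - med) > thresh:
                out[y][x] = med
    return out

# A flat gray patch with one salt-noise pixel in the middle:
img = [[100] * 5 for _ in range(5)]
img[2][2] = 255
clean = denoise(img)
```

The comparison against the local median rather than the raw mean keeps genuine edges intact, since an edge pixel usually agrees with the median of its own side of the edge.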
Abstract: Software evolution control requires a deep
understanding of changes and their impact on the different
heterogeneous artifacts of a system, and an understanding of
descriptive knowledge of the developed software artifacts is a
prerequisite for the success of the evolutionary process.
Implementing an evolutionary process means making more or less
significant changes to many heterogeneous software artifacts such
as source code, analysis and design models, unit tests, XML
deployment descriptors, user guides, and others. These changes can
be a source of degradation of the modified software in functional,
qualitative, or behavioral terms. Hence the need for a unified
approach to the extraction and representation of the different
heterogeneous artifacts, in order to ensure a unified and detailed
description of heterogeneous software artifacts that is exploitable by
several software tools and allows those responsible for the evolution
to reason about the changes concerned.
Abstract: Petroglyphs, stone sculptures, burial mounds, and
other memorial religious structures are ancient artifacts which find
reflection in contemporary world culture, including the culture of
Kazakhstan. In this article, the problem of the influence of ancient
artifacts on contemporary culture is researched, using as an example
Kazakhstan's sculpture and painting. The practice of creating
petroglyphs, stone sculptures, and memorial religious structures was
closely connected to all fields of human existence, which fostered the
formation of and became an inseparable part of a traditional
worldview. The ancient roots of Saka-Sythian and Turkic nomadic
culture have been studied, and integrated into the foundations of the
contemporary art of Kazakhstan. The study of the ancient cultural
heritage of Kazakhstan by contemporary artists, sculptors and
architects, as well as the influence of European art and cultures on the
art of Kazakhstan are furthering the development of a new national
art.
Abstract: Fishing has always been an essential component of
Polynesian life. Fishhooks, mostly in pearl shell, are the most
numerous artifacts related to this activity found during
archaeological excavations. Thanks to them, we try to reconstruct
the ancient techniques of resource exploitation, inside the lagoons
and offshore. They can also be used as chronological and cultural
indicators. The shapes and dimensions of these artifacts allow
comparisons and classifications used in both a functional approach
and a chrono-cultural perspective. Hence it is very important for
ethno-archaeologists to have reliable methods and standardized
measurements of these artifacts. Such a reliable, objective, and
standardized method has been proposed previously, but it cannot be
applied manually because of the considerable time required to
measure each fishhook and the quantity of fishhooks to be measured
(many hundreds). We propose in this paper a detailed acquisition
protocol for fishhooks and an automation of every step of this
method. We also provide experimental results obtained on fishhooks
coming from three archaeological excavation sites.