Abstract: Digital technologies offer many opportunities in the
design and implementation of brand communication and advertising.
Augmented reality (AR) is an innovative technology in marketing
communication that focuses on the fact that virtual interaction with a
product ad offers additional value to consumers. AR enables
consumers to obtain (almost) real product experiences by way of
virtual information even before purchasing a product. The aim of
AR applications in advertising is the in-depth examination of
product characteristics to enhance product knowledge as well as
brand knowledge. The interactive design of advertising engages
observers in an intense examination of a specific advertising
message and therefore leads to better brand knowledge.
The elaboration likelihood model and the central route to persuasion
strongly support this argumentation. Nevertheless, AR in brand
communication is still in an initial stage and therefore scientific
findings about the impact of AR on information processing and brand
attitude are rare. The aim of this paper is to empirically investigate
the potential of AR applications in combination with traditional print
advertising. To that effect, an experimental design with different
levels of interactivity is built to measure the impact of an ad's
interactivity on different variables of advertising effectiveness.
Abstract: The rapid growth of the human population and the
environmental degradation associated with increased consumption of
resources raise concerns about sustainability. Social sustainability
constitutes one of the three dimensions of sustainability, together with
the environmental and economic dimensions. Even though there is no
agreement on what social sustainability consists of, it is well known
that it necessitates user participation. Therefore, this study aims to
observe and analyze the role of user participation in social
sustainability. In this paper, the links between user participation and indicators of
social sustainability have been examined. To achieve this, a
literature review on social sustainability was first carried out;
the information obtained from this research was then used to
evaluate projects conducted in developing countries with user
participation. These examples serve as role models, with their pros
and cons, for developing the checklist used to evaluate the case
studies. Furthermore, a case study of post-earthquake residential
settlements in Turkey has been conducted.
The case study projects were selected considering different building
scales (differing number of residential units), scale of the problem
(post-earthquake settlements, rehabilitation of shanty dwellings) and
the variety of users (differing socio-economic dimensions). The
decision-making, design, building, and usage processes of the selected
projects, as well as the actors in these processes, have been
investigated in the context of social sustainability. The cases include
New Gourna Village by Hassan Fathy, the Quinta Monroy dwelling
units built in Chile by Alejandro Aravena, and the Beyköy and
Beriköy projects in Turkey, which aimed to solve the housing problem
that emerged after the 1999 earthquake. The study reveals possible
links between social sustainability indicators and user participation,
and between user participation and the peculiarities of place. These
results are compared and discussed in order to find possible ways to
foster social sustainability through user participation.
Results show that social sustainability depends on a community's
characteristics, socio-economic conditions and user profile, but user
participation has positive effects on some social sustainability
indicators, such as user satisfaction, a sense of belonging and social
stability.
Abstract: Strategic investment decisions are characterized by
high innovation potential and long-term effects on the
competitiveness of enterprises. Due to the uncertainty and risks
involved in this complex decision making process, the need arises for
well-structured support activities. A method that considers cost and
the long-term added value is the cost-benefit effectiveness estimation.
One of those methods is the “profitability estimation focused on
benefits – PEFB”-method developed at the Institute of Management
Cybernetics at RWTH Aachen University. The method copes with
the challenges associated with strategic investment decisions by
integrating long-term non-monetary aspects whilst also mapping the
chronological sequence of an investment within the organization’s
target system. Thus, this method is characterized as a holistic
approach for the evaluation of costs and benefits of an investment.
This participation-oriented method was applied to business
environments in many workshops. The results of the workshops are a
library of more than 96 cost aspects, as well as 122 benefit aspects.
These aspects are preprocessed and comparatively analyzed with
regard to their alignment with a series of risk levels. For the first time,
an accumulation and a distribution of cost and benefit aspects
regarding their impact and probability of occurrence are given. The
results give evidence that the PEFB-method combines precise
measures of financial accounting with the incorporation of benefits.
Finally, the results constitute the basis for using information
technology and data science for decision support when applying
the PEFB-method.
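The pairing of cost and benefit aspects with their impact and probability of occurrence can be pictured with a small expected-value calculation. This is a hypothetical sketch: the aspect names, the 1-5 impact scale and the probability values are invented for illustration and are not taken from the PEFB workshop library.

```python
# Hypothetical sketch: scoring cost and benefit aspects by expected impact.
# Aspect names, the 1-5 impact scale and probabilities are illustrative only.

def expected_value(aspects):
    """Return the probability-weighted impact summed over all aspects."""
    return sum(a["impact"] * a["probability"] for a in aspects)

costs = [
    {"name": "license fees", "impact": 4, "probability": 1.0},
    {"name": "integration effort", "impact": 3, "probability": 0.7},
]
benefits = [
    {"name": "process transparency", "impact": 5, "probability": 0.8},
    {"name": "employee acceptance", "impact": 2, "probability": 0.5},
]

# a positive net score would favour the investment under this toy model
net_score = expected_value(benefits) - expected_value(costs)
```

Such a weighting is one simple way to accumulate aspects by impact and probability of occurrence, as the abstract describes.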
Abstract: Recent progress in the next generation of automobile
technology is geared towards incorporating information technology
into cars. Collectively called smart cars, these vehicles bring
intelligence that provides comfort, convenience and safety. One branch
of smart cars is the connected-car system. The key concept of
connected cars is the sharing of driving information among cars in a
decentralized manner, enabling collective intelligence. This paper
proposes a foundation for the information model needed to define
driving information for smart cars. Road conditions are modeled
through a unique data structure that unambiguously represents
time-variant traffic in the streets. Additionally, the modeled data
structure is exemplified in a navigational scenario, with its usage
illustrated in UML. Optimal driving route search under dynamically
changing road conditions is also discussed using the proposed data
structure.
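The route search over time-variant road conditions can be sketched as a shortest-path search whose edge costs depend on the departure time at each node. The toy graph, the rush-hour congestion rule and the cost model below are illustrative assumptions, not the paper's actual data structure.

```python
import heapq

def travel_time(base, depart):
    """Toy time-variant cost: congestion doubles travel time at rush hour."""
    return base * 2 if 8 <= depart % 24 < 9 else base

def fastest_route(graph, start, goal, depart):
    """Dijkstra-style search; edge weights are evaluated at arrival time."""
    # graph: node -> list of (neighbor, base_travel_time_in_hours)
    pq = [(depart, start, [start])]
    best = {}
    while pq:
        t, node, path = heapq.heappop(pq)
        if node == goal:
            return t - depart, path
        if best.get(node, float("inf")) <= t:
            continue
        best[node] = t
        for nxt, base in graph.get(node, []):
            heapq.heappush(pq, (t + travel_time(base, t), nxt, path + [nxt]))
    return None

roads = {"A": [("B", 1.0), ("C", 0.5)], "B": [("D", 1.0)],
         "C": [("D", 2.5)], "D": []}
```

Because costs are recomputed at each node's arrival time, the same query can yield different routes for different departure times, which is the essence of routing under dynamically changing road conditions.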
Abstract: Cloud computing is a business model which provides
an easier management of computing resources. Cloud users can
request virtual machines and install and configure additional software
as needed. However, a user can also request a virtual appliance,
which allows applications to be deployed much faster, as it is a
ready-built operating system image with the necessary software
installed and configured. Large numbers of virtual appliances are
available in different image formats. A user can download available
appliances from a public marketplace and start using them. However,
the information published about a virtual appliance differs from
provider to provider, making it difficult to choose the required
appliance, as each is composed of a specific OS with standard
software versions. Moreover, even if the user chooses an appliance
from a given provider, the user has no flexibility to choose their own
set of software with the required OS and applications. In this paper,
we propose a reference architecture for dynamically customizing
virtual appliances and provisioning them in an easier manner. We
also report our experience in integrating the proposed architecture
with a public marketplace and Mi-Cloud, a cloud management
software.
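The difficulty of choosing an appliance by OS and bundled software can be pictured with a minimal catalogue filter. The catalogue entries and metadata fields below are invented examples, not real marketplace records or the paper's architecture.

```python
# Hypothetical marketplace catalogue; entries are invented for illustration.
catalogue = [
    {"id": "app-1", "os": "Ubuntu 14.04", "software": {"apache", "mysql"}},
    {"id": "app-2", "os": "CentOS 6", "software": {"nginx"}},
    {"id": "app-3", "os": "Ubuntu 14.04", "software": {"apache", "php"}},
]

def find_appliances(os_name, required):
    """Return IDs of appliances on the requested OS that bundle every
    required package, so the user need not build an image from scratch."""
    return [a["id"] for a in catalogue
            if a["os"] == os_name and required <= a["software"]]
```

A dynamic customization step, as proposed in the paper, would go beyond such filtering by assembling an image when no catalogue entry matches exactly.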
Abstract: Interaction between mixing and crystallization is often
ignored despite the fact that it affects almost every aspect of the
operation including nucleation, growth, and maintenance of the
crystal slurry. This is especially pronounced in multiple impeller
systems, where flow complexity is increased. By choosing proper
mixing parameters, which closely depend on knowledge of the
hydrodynamics in the mixing vessel, the process of batch cooling
crystallization may be considerably improved. The values that provide
useful information when making this choice are mixing time and
power consumption. The predominant motivation for this work was
to investigate the extent to which radial dual impeller configuration
influences mixing time, power consumption and consequently the
values of metastable zone width and nucleation rate. In this research,
crystallization of borax was conducted in a 15 dm3 baffled batch
cooling crystallizer with an aspect ratio (H/T) of 1.3. Mixing was
performed using two straight blade turbines (4-SBT) mounted on the
same shaft that generated radial fluid flow. Experiments were
conducted at different values of N/NJS ratio (impeller speed/
minimum impeller speed for complete suspension), D/T ratio
(impeller diameter/crystallizer diameter), c/D ratio (lower impeller
off-bottom clearance/impeller diameter), and s/D ratio (spacing
between impellers/impeller diameter). Mother liquor was saturated at
30°C and cooled at a rate of 6°C/h. Its concentration was
monitored in-line by a Na-ion selective electrode. From the
supersaturation values, monitored continuously over the process time, it
was possible to determine the metastable zone width and
subsequently the nucleation rate using Mersmann's nucleation
criterion. For all applied dual impeller configurations, the mixing
time was determined by potentiometric method using a pulse
technique, while the power consumption was determined using a
torque meter produced by Himmelstein & Co. Results obtained in
this investigation show that dual impeller configuration significantly
influences the values of mixing time, power consumption as well as
the metastable zone width and nucleation rate. Special attention
should be paid to the impeller spacing, considering that the flow
interaction may be more or less pronounced depending on the
spacing value.
Abstract: Smart metering and demand response are gaining
ground in industrial and residential applications. Smart appliances
have received attention as a step towards achieving the smart home.
The success of smart grid development relies on the successful
implementation of Information and Communication Technology
(ICT) in the power sector. Smart appliances have been under
development, and many new contributions to their realization have
been reported in the last few years. The role of ICT here is to capture
data in real time, thereby allowing a bi-directional flow of
information between the producing and utilization points; this paves
the way for smart appliances, where home appliances can
communicate among themselves and control themselves (switching
on and off) using the signals obtained from the grid. This paper
presents the background on ICT for smart appliances, paying
particular attention to current technology and identifying future ICT
trends for load monitoring, through which smart appliances can
facilitate an efficient smart home system that promotes demand
response programs. The paper groups and reviews recent
contributions in order to establish the current state of the art and
trends of the technology, so that the reader is provided with a
comprehensive and insightful review of where ICT for smart
appliances stands and is heading. The paper also presents a brief
overview of communication types and then narrows the discussion
to load monitoring (Non-intrusive Appliance Load Monitoring,
NALM). Finally, some future trends and challenges in the further
development of the ICT framework are discussed to motivate future
contributions that address open problems and explore new
possibilities.
Abstract: Wireless sensors, also known as wireless sensor nodes,
have been making a significant impact on human daily life. The
Radio Frequency Identification (RFID) and Wireless Sensor Network
(WSN) are two complementary technologies; hence, an integrated
implementation of these technologies expands the overall
functionality in obtaining long-range and real-time information on the
location and properties of objects and people. An approach for
integrating ZigBee and RFID networks is proposed in this paper, to
create an energy-efficient network improved by the benefits of
combining ZigBee and RFID architectures. Furthermore, the
compatibility and requirements of ZigBee devices and
communication links in a typical RFID system are examined, and a
real-world experiment demonstrates the capabilities of the proposed
RFID system.
Abstract: Growth and remodeling of biological structures have
gained much attention over the past decades. Determining the
response of living tissues to mechanical loads is necessary for a wide
range of developing fields such as prosthetics design or
computer-assisted surgical interventions. It is a well-known fact that biological
structures are never stress-free, even when externally unloaded. The
exact origin of these residual stresses is not clear, but theoretically,
growth is one of the main sources. Extracting an organ's shape
from medical imaging does not provide any information regarding
the existing residual stresses in that organ. The simplest cause of such
stresses is gravity since an organ grows under its influence from
birth. Ignoring such residual stresses might cause erroneous results in
numerical simulations. Accounting for residual stresses due to tissue
growth can improve the accuracy of mechanical analysis results. This
paper presents an original computational framework based on gradual
growth to determine the residual stresses due to growth. To illustrate
the method, we apply it to a finite element model of a healthy human
face reconstructed from medical images. The distribution of residual
stress in facial tissues is computed, which can overcome the effect of
gravity and maintain tissue firmness. Our assumption is that the
tissue wrinkles caused by aging could be a consequence of decreasing
residual stress that no longer counteracts gravity. Taking these
stresses into account therefore seems extremely important in
maxillofacial surgery; it would indeed help surgeons to estimate
tissue changes after surgery.
Abstract: Mumbai has traditionally been the epicenter of India's
trade and commerce, and the existing major ports, such as Mumbai
Port and Jawaharlal Nehru Port (JN), situated in the Thane estuary,
are also developing their waterfront facilities. Various developments over the
passage of decades in this region have changed the tidal flux
entering/leaving the estuary. The intake at Pir-Pau is facing the
problem of shortage of water in view of advancement of shoreline,
while jetty near Ulwe faces the problem of ship scheduling due to
existence of shallower depths between JN Port and Ulwe Bunder. In
order to solve these problems, it is inevitable to have information
about tide levels over a long duration by field measurements.
However, field measurement is a tedious and costly affair;
therefore, artificial intelligence was applied to predict water levels
by training a network on the tide data measured over one lunar tidal
cycle. A two-layered feed-forward Artificial Neural
Network (ANN) with back-propagation training algorithms, namely
Gradient Descent (GD) and Levenberg-Marquardt (LM), was used to
predict the yearly tide levels at waterfront structures, namely Ulwe
Bunder and Pir-Pau. The tide data collected at Apollo Bunder, Ulwe,
and Vashi for a period of lunar tidal cycle (2013) was used to train,
validate and test the neural networks. These trained networks, having
high correlation coefficients (R = 0.998), were used to predict the tide
at Ulwe and Vashi for verification against the measured tides for the
years 2000 and 2013. The results indicate that the predicted tide levels
by ANN give reasonably accurate estimation of tide. Hence, the
trained network is used to predict the yearly tide data (2015) for
Ulwe. Subsequently, the yearly tide data (2015) at Pir-Pau was
predicted by using the neural network which was trained with the
help of measured tide data (2000) for Apollo and Pir-Pau. The analysis of the
measured data and the study reveals the following. The measured tidal
data at Pir-Pau, Vashi and Ulwe indicate a maximum amplification of
the tide by about 10-20 cm, with a phase lag of 10-20 minutes with
reference to the tide at Apollo Bunder (Mumbai). The LM training
algorithm is faster than GD, and the performance of the network
increases with the number of neurons in the hidden layer. The tide
levels predicted by the ANN at Pir-Pau and Ulwe provide valuable
information about the occurrence of high and low water levels, which
can be used to plan pumping operations at Pir-Pau and improve ship
scheduling at Ulwe.
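The GD variant of the training scheme can be sketched as a minimal two-layer feed-forward network trained by stochastic gradient descent. The network size, the toy tide-like sinusoid and the learning rate below are illustrative assumptions, not the study's actual configuration or data.

```python
import math, random

# Minimal sketch of a two-layer feed-forward network trained by gradient
# descent (GD), one of the two training algorithms named in the abstract.
# Network size, toy signal and learning rate are illustrative assumptions.

random.seed(0)
HIDDEN = 4
w1 = [random.uniform(-0.5, 0.5) for _ in range(HIDDEN)]  # input -> hidden
b1 = [0.0] * HIDDEN
w2 = [random.uniform(-0.5, 0.5) for _ in range(HIDDEN)]  # hidden -> output
b2 = 0.0

def forward(x):
    hidden = [math.tanh(w1[j] * x + b1[j]) for j in range(HIDDEN)]
    return sum(w2[j] * hidden[j] for j in range(HIDDEN)) + b2, hidden

# toy "tide": one sinusoidal cycle sampled hourly over a day
data = [(t / 24.0, math.sin(2 * math.pi * t / 24.0)) for t in range(25)]

def mse():
    return sum((forward(x)[0] - y) ** 2 for x, y in data) / len(data)

def train(epochs=2000, lr=0.1):
    global b2
    for _ in range(epochs):
        for x, y in data:
            out, hidden = forward(x)
            err = out - y
            for j in range(HIDDEN):
                grad_h = err * w2[j] * (1.0 - hidden[j] ** 2)  # back-propagated
                w2[j] -= lr * err * hidden[j]
                w1[j] -= lr * grad_h * x
                b1[j] -= lr * grad_h
            b2 -= lr * err

mse_before = mse()
train()
mse_after = mse()
```

The LM algorithm used in the study replaces these first-order steps with approximate second-order updates, which is why it converges in fewer iterations, as the abstract reports.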
Abstract: The increasing availability of information about earth
surface elevation (Digital Elevation Models, DEM) generated from
different sources (remote sensing, aerial images, LiDAR) raises the
question of how to integrate this huge amount of data and make it
available to the widest possible audience. In order to exploit the potential of
3D elevation representation, the quality of data management plays a
fundamental role. Due to the high acquisition costs and the huge
amount of generated data, high-resolution terrain surveys tend to be
small or medium-sized and available only for limited portions of the
Earth. Hence the need to merge large-scale height maps, which are
typically made available for free at the worldwide level, with very
specific high-resolution datasets. On the other hand, the third
dimension increases the user experience and the data representation
quality, unlocking new possibilities in data analysis for civil
protection, real estate, urban planning, environmental monitoring,
etc. The open-source 3D virtual globes, which are
trending topics in Geovisual Analytics, aim at improving the
visualization of geographical data provided by standard web services
or with proprietary formats. Typically, however, 3D virtual globes do
not offer an open-source tool that allows the generation of a terrain
elevation data structure starting from heterogeneous-resolution terrain
datasets. This paper describes a technological solution aimed at setting
up a so-called “Terrain Builder”. This tool is able to merge
heterogeneous-resolution datasets and to provide a multi-resolution
worldwide terrain service fully compatible with CesiumJS and
therefore accessible via the web using a traditional browser without
any additional plug-ins.
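The core selection step of such a merge, picking the finest-resolution dataset that covers each requested location while falling back to the worldwide layer, can be sketched as follows. The dataset names, bounds and resolutions are invented for illustration; the real Terrain Builder's tiling logic is certainly more involved.

```python
# Hypothetical dataset registry; names, bounds and resolutions are invented.
datasets = [
    {"name": "worldwide-90m", "resolution_m": 90,
     "bounds": (-180.0, -90.0, 180.0, 90.0)},
    {"name": "lidar-city-1m", "resolution_m": 1,
     "bounds": (11.0, 46.0, 11.5, 46.5)},
]

def covers(bounds, lon, lat):
    west, south, east, north = bounds
    return west <= lon <= east and south <= lat <= north

def best_dataset(lon, lat):
    """Pick the finest-resolution dataset covering the location, falling
    back to the worldwide low-resolution layer everywhere else."""
    candidates = [d for d in datasets if covers(d["bounds"], lon, lat)]
    return min(candidates, key=lambda d: d["resolution_m"])["name"]
```

A production tool would apply this choice per tile of a quadtree pyramid and resample the chosen source to the tile's resolution before serving it.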
Abstract: The process of human resources management in the structures of corporate groups demonstrates a certain specificity, resulting from the division of decision-making and executive competencies that occurs within these structures between a parent company and its subsidiaries. The subprocess of employee assessment is considered crucial, since it provides information for the implementation of the personnel function. The empirical studies conducted in corporate groups, within which at least one company is located in Poland, confirmed the critical significance of employee assessment systems in the process of human resources management in corporate groups. Parent companies most often retain their decision-making authority within the framework of the discussed process and introduce uniform employee assessment and personnel controlling systems to subsidiary companies. However, the instruments for employee assessment applied in corporate groups do not present such specificity.
Abstract: Seeking and sharing knowledge on online forums
has made them popular in recent years. Although online forums are
valuable sources of information, retrieving reliable threads with
high-quality content is an issue due to the variety of message sources.
The majority of existing information retrieval systems ignore
the quality of retrieved documents, particularly in the field of thread
retrieval. In this research, we present an approach that employs
various quality features in order to investigate the quality of retrieved
threads. Different aspects of content quality, including completeness,
comprehensiveness, and politeness, are assessed using these features,
which lead to finding not only textual, but also conceptual relevant
threads for a user query within a forum. To analyse the influence of
the features, we used an adapted version of the voting model for
thread search as a retrieval system. We equipped it with each feature
individually, and with various combinations of features, over multiple
runs. The results show that incorporating the quality features
significantly enhances the effectiveness of the utilised retrieval
system.
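One simple way to picture how quality features can be folded into thread ranking is a weighted linear combination. The feature names follow the abstract, but the weights, the linear form and the example threads are assumptions for illustration, not the paper's actual voting model.

```python
# Illustrative scoring sketch; weights and example threads are invented.
WEIGHTS = {"relevance": 0.6, "completeness": 0.2,
           "comprehensiveness": 0.1, "politeness": 0.1}

def quality_score(thread):
    """Weighted sum of normalized feature values (missing features count 0)."""
    return sum(WEIGHTS[f] * thread.get(f, 0.0) for f in WEIGHTS)

def rank(threads):
    return sorted(threads, key=quality_score, reverse=True)

threads = [
    {"id": 1, "relevance": 0.9, "completeness": 0.2, "politeness": 0.5},
    {"id": 2, "relevance": 0.7, "completeness": 0.9,
     "comprehensiveness": 0.8, "politeness": 0.9},
]
```

Under this toy weighting, a slightly less relevant but more complete and polite thread can outrank a purely textual match, which mirrors the abstract's point about conceptual relevance.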
Abstract: In this study, the potential benefits of playing action
video games among congenitally deaf and dumb subjects are reported
in terms of EEG ratio indices. The frontal and occipital lobes are
associated with the development of motor skills, cognition, visual
information processing and color recognition. Sixteen hours of
first-person shooter action video game play resulted in an increase
in the ratios β/(α+θ) and β/θ in the frontal and occipital lobes. This can
be attributed to the enhancement of certain aspects of cognition among
deaf and dumb subjects.
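The ratio indices β/(α+θ) and β/θ can be sketched as band-power ratios computed from a spectral decomposition of an EEG segment. The synthetic two-tone signal and the 256 Hz sampling rate below are illustrative assumptions, not the study's recordings.

```python
import cmath, math

FS = 256          # assumed sampling rate in Hz
N = 2 * FS        # two-second analysis window

def band_power(signal, lo, hi):
    """Sum of squared DFT magnitudes over bins with frequency in [lo, hi)."""
    n = len(signal)
    power = 0.0
    for k in range(n // 2 + 1):
        freq = k * FS / n
        if lo <= freq < hi:
            coeff = sum(signal[i] * cmath.exp(-2j * math.pi * k * i / n)
                        for i in range(n))
            power += abs(coeff) ** 2
    return power

def ratio_indices(signal):
    theta = band_power(signal, 4, 8)    # theta band
    alpha = band_power(signal, 8, 13)   # alpha band
    beta = band_power(signal, 13, 30)   # beta band
    return beta / (alpha + theta), beta / theta

# synthetic segment: a 6 Hz (theta) tone plus a stronger 20 Hz (beta) tone
eeg = [math.sin(2 * math.pi * 6 * i / FS)
       + 2 * math.sin(2 * math.pi * 20 * i / FS) for i in range(N)]
```

An increase in these indices over a recording session, as reported in the abstract, reflects a shift of spectral power toward the beta band.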
Abstract: The 21st century has transformed the labor market
landscape by posing new and different demands on
university graduates as well as university lecturers, which means that
the knowledge and academic skills students acquire in the course of
their studies should be applicable and transferable from the higher
education context to their future professional careers. Given the
context of the Languages for Specific Purposes (LSP) classroom, the
teachers’ objective is not only to teach the language itself, but also to
prepare students to use that language as a medium to develop generic
skills and competences. These include media and information
literacy, critical and creative thinking, problem-solving and analytical
skills, effective written and oral communication, as well as
collaborative work and social skills, all of which are necessary to
make university graduates more competitive in everyday professional
environments. On the other hand, due to limitations of time and large
numbers of students in classes, the frequently topic-centered syllabus
of LSP courses places considerable focus on acquiring the subject
matter and specialist vocabulary instead of sufficient development of
skills and competences required by students’ prospective employers.
This paper intends to explore some of those issues as viewed both by
LSP lecturers and by business professionals in their respective
surveys. The surveys were conducted among more than 50 LSP
lecturers at higher education institutions in Croatia, more than 40 HR
professionals and more than 60 university graduates with degrees in
economics and/or business working in management positions in
mainly large and medium-sized companies in Croatia. Various elements of LSP course content have been taken into
consideration in this research, including reading and listening
comprehension of specialist texts, acquisition of specialist vocabulary
and grammatical structures, as well as presentation and negotiation
skills. The ability to hold meetings, conduct business correspondence,
write reports, academic texts, case studies and take part in debates
were also taken into consideration, as well as informal business
communication, business etiquette and core courses delivered in a
foreign language. The results of the surveys conducted among LSP
lecturers will be analyzed with reference to what extent those
elements are included in their courses and how consistently and
thoroughly they are evaluated according to their course requirements.
Their opinions will be compared to the results of the surveys
conducted among professionals from a range of industries in Croatia
so as to examine how useful and important they perceive the same
elements of the LSP course content in their working environments.
Such comparative analysis will thus show to what extent the syllabi
of LSP courses meet the demands of the employment market when it
comes to the students’ language skills and competences, as well as
transferable skills. Finally, the findings will also be compared to the
observations based on practical teaching experience and the relevant
sources that have been used in this research. In conclusion, the ideas and observations in this paper are merely
open-ended questions that do not have conclusive answers, but might
prompt LSP lecturers to re-evaluate the content and objectives of
their course syllabi.
Abstract: Data fusion technology can be the best way to extract
useful information from multiple sources of data. It has been widely
applied in various applications. This paper presents a data fusion
approach for multimedia data in event detection on Twitter using the
Dempster-Shafer theory of evidence. The methodology applies a
mining algorithm to detect the event. Two types of data are fused.
The first is features extracted from text using the bag-of-words
method, weighted by term frequency-inverse document frequency
(TF-IDF). The second is visual features extracted by applying the
scale-invariant feature transform (SIFT). The Dempster-Shafer theory
of evidence is applied to fuse the information from these two sources.
Our experiments indicate that, compared to approaches using an
individual data source, the proposed data fusion approach can
increase the prediction accuracy for event detection. The experimental
results show that the proposed method achieved a high accuracy of
0.97, compared with 0.93 using text only and 0.86 using images only.
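The fusion step rests on Dempster's rule of combination, which can be sketched directly. The frame of discernment {event, no_event} and the mass values assigned to the text and visual sources below are invented for illustration, not the paper's learned values.

```python
def combine(m1, m2):
    """Dempster's rule: fuse two mass functions keyed by frozenset focal
    elements, normalizing out the conflicting (empty-intersection) mass."""
    fused, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                fused[inter] = fused.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    return {k: v / (1.0 - conflict) for k, v in fused.items()}

E, N = frozenset({"event"}), frozenset({"no_event"})
# invented masses: E | N is the "don't know" focal element of each source
text_mass = {E: 0.7, N: 0.1, E | N: 0.2}
visual_mass = {E: 0.6, N: 0.3, E | N: 0.1}
fused = combine(text_mass, visual_mass)
```

When both sources lean toward the same hypothesis, the combined mass concentrates on it more strongly than either source alone, which is why fusing text and visual evidence can raise detection accuracy.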
Abstract: Ant algorithms are well-known metaheuristics which
have been widely used for two decades. In most of the literature,
an ant is a constructive heuristic able to build a solution from scratch.
However, other types of ant algorithms have recently emerged, so the
discussion is not limited to the common framework of
constructive ant algorithms. Generally, at each generation of an ant
algorithm, each ant builds a solution step by step by adding an
element to it. Each choice is based on the greedy force (also called the
visibility, the short-term profit or the heuristic information) and the
trail system (a central memory which collects historical information
on the search process). Usually, all the ants of the population have the
same characteristics and behaviors. In contrast, this paper proposes a
new type of ant metaheuristic, namely SMART (Solution
Methods with Ants Running by Types). It relies on the use of different
populations of ants, where each population has its own personality.
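The constructive step described above, combining the greedy force (visibility) with the trail system, can be sketched as a probabilistic element choice. The α and β exponents and the trail/visibility tables are illustrative assumptions, not SMART's actual parameters.

```python
import random

ALPHA, BETA = 1.0, 2.0  # assumed trail and visibility exponents

def choose_next(candidates, trail, visibility, rng):
    """Pick the next element with probability proportional to
    trail**ALPHA * visibility**BETA (roulette-wheel selection)."""
    weights = [(trail[c] ** ALPHA) * (visibility[c] ** BETA)
               for c in candidates]
    r = rng.random() * sum(weights)
    for c, w in zip(candidates, weights):
        r -= w
        if r <= 0:
            return c
    return candidates[-1]

def build_solution(elements, trail, visibility, rng):
    """One ant builds a complete solution step by step, element by element."""
    remaining, solution = list(elements), []
    while remaining:
        nxt = choose_next(remaining, trail, visibility, rng)
        solution.append(nxt)
        remaining.remove(nxt)
    return solution

trail = {"a": 1.0, "b": 1.0, "c": 1.0}
visibility = {"a": 3.0, "b": 1.0, "c": 1.0}   # greedy force favours "a"
```

In a population-typed scheme such as SMART, different populations could, for instance, run with different α and β values, giving each its own personality.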
Abstract: In the deep south of Thailand, checkpoints for people
verification are necessary for the security management of risk zones,
such as official buildings in the conflict area. In this paper, we
propose an automatic checkpoint system that verifies persons using
information from ID cards and facial features. Methods for extracting
and verifying a person's information are introduced, based on useful
information such as the ID number and name extracted from official
cards, and facial images from videos. The proposed
system shows promising results and has a real impact on the local
society.
Abstract: Introduction: The aim is to understand the latest
electronic formats available to health care providers and how they
can be used and developed according to standards. The idea is to
compare the keeping of patients' manual medical records with the
maintenance of patients' electronic information in a health care
setup, and thereby to adopt the right technology for the organization
and improve the quality and quantity of health care skills.
Objective: To explain the terms Electronic Medical Record (EMR),
Electronic Health Record (EHR) and Personal Health Record (PHR),
and to select the best among the available electronic sources and
software before implementation; to ensure that the technology is
used by end users without any doubts or difficulties; and to evaluate
the uses and barriers of EMR, EHR and PHR. Aim and Scope: The
target is to enable health care providers such as physicians, nurses,
therapists, medical bill reimbursers, insurers and government bodies
to access patient information in an easy and systematic manner
without diluting the confidentiality of that information. Method:
Health information technology can be implemented with the help of
organizations providing legal guidelines that stand by the health care
provider. The main objective is to select correct, embedded and
affordable database management software capable of generating
large-scale data, together with knowledge of the latest software
available on the market. Conclusion: The question lies in
implementing the electronic information system with health care
providers and organizations. Clinicians are the main users of the
technology and enable us to “go paperless”. Technology changes
from day to day, and the idea is, basically, to show how to store data
electronically in a safe and secure way. All three formats exemplify
the fact that an electronic format has its own benefits as well as
barriers.
Abstract: This paper proposes a method of learning topics for
broadcasting contents. There are two kinds of text related to
broadcasting contents. One is the broadcasting script, a series of
texts including directions and dialogues. The other is blogposts, which
possess relatively abstract content, stories, and diverse information
about the broadcast. Although the two kinds of text cover similar
broadcasting contents, the words used in blogposts and in the
broadcasting script differ. When unseen words appear, a method is
needed to reflect them in the existing topics. In this paper, we introduce a semantic
vocabulary expansion method to reflect unseen words. We expand
topics of the broadcasting script by incorporating the words in
blogposts. Each word in blogposts is added to the most semantically
correlated topics. We use word2vec to get the semantic correlation
between words in blogposts and topics of scripts. The vocabularies of
topics are updated and then posterior inference is performed to
rearrange the topics. In experiments, we verified that the proposed
method can discover more salient topics for broadcasting contents.
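The assignment of an unseen blogpost word to its most semantically correlated topic can be sketched with cosine similarity. Toy two-dimensional vectors stand in here for the word2vec embeddings the paper uses, and the topics and words are invented examples.

```python
import math

# Toy 2-d "embeddings"; in the paper these would come from word2vec.
vectors = {
    "drama": (0.9, 0.1), "actor": (0.8, 0.2), "scene": (0.7, 0.3),
    "goal": (0.1, 0.9), "match": (0.2, 0.8), "striker": (0.15, 0.85),
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

# topics learned from the broadcasting script (invented examples)
topics = {"tv-drama": ["drama", "actor"], "sports": ["goal", "match"]}

def assign(word):
    """Add an unseen blogpost word to its most semantically correlated
    topic, as judged by average similarity to the topic's words."""
    def topic_sim(words):
        return sum(cosine(vectors[word], vectors[w]) for w in words) / len(words)
    best = max(topics, key=lambda t: topic_sim(topics[t]))
    topics[best].append(word)
    return best
```

After such expansion, the paper's pipeline would rerun posterior inference so the enlarged vocabularies rearrange the topics.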