Abstract: This research introduces the cultural adaptation of B2C
E-Service design in Germany. Amid the intense competition in
E-Service development, many companies have realized the importance
of understanding the emotional and cultural characteristics of their
customers. Ignoring customers' needs and requirements during
E-Service design can lead to faults, mistakes, and gaps. The notion of
E-Service usability has therefore expanded: it no longer covers only
the development of high-quality E-Services, but also customer
satisfaction and giving customers the feeling of a local service.
Abstract: This study aims to increase understanding of the
transition of business models in servitization. The significance of
service in all business has increased dramatically during the past
decades. Service-dominant logic (SDL) describes this change in the
economy and questions the goods-dominant logic on which business
has primarily been based in the past. A business model canvas is one
of the most cited and used tools for defining and developing business
models. The starting point of this paper lies in the notion that the
traditional business model canvas is inherently goods-oriented and
is best suited to product-based business. However, the basic differences
between goods and services necessitate changes in business model
representations when proceeding in servitization. Therefore, new
knowledge is needed on how the conception of business model and
the business model canvas as its representation should be altered in
servitized firms in order to better serve business developers and interfirm
co-creation. That is to say, compared to products, services are
intangible and they are co-produced between the supplier and the
customer. Value is always co-created in interaction between a
supplier and a customer, and customer experience primarily depends
on how well the interaction succeeds between the actors. The role of
the service experience is even stronger in service business than in
product business, as services are co-produced with the customer. This paper provides business model developers with a service
business model canvas, which takes into account the intangible,
interactive, and relational nature of service. The study employs a
design science approach that contributes to theory development via
design artifacts. This study utilizes qualitative data gathered in
workshops with ten companies from various industries. In particular,
key differences between goods-dominant logic (GDL) and SDL-based
business models are identified as an industrial firm
proceeds in servitization. As a result of the study, an updated version of the business
model canvas is provided based on service-dominant logic. The
service business model canvas ensures a stronger customer focus and
includes aspects salient for services, such as interaction between
companies, service co-production, and customer experience. It can be
used for the analysis and development of a current service business
model of a company or for designing a new business model. It
facilitates customer-focused new service design and service
development. It aids in the identification of development needs, and
facilitates the creation of a common view of the business model.
Therefore, the service business model canvas can be regarded as a
boundary object, which facilitates the creation of a common
understanding of the business model between several actors involved.
The study contributes to the business model and service business
development disciplines by providing a managerial tool for
practitioners in service development. It also provides research insight
into how servitization challenges companies’ business models.
Abstract: Communicating users' needs, goals, and problems helps
designers and developers overcome the challenges faced by end users.
Personas are used to represent end users’ needs. In our research,
creating personas allowed the following questions to be answered:
Who are the potential user groups? What do they want to achieve by
using the service? What are the problems that users face? What
should the service provide to them? To develop realistic personas, we
conducted a focus group discussion with undergraduate and graduate
students and also interviewed a university librarian. The personas
were created to help evaluate the Institutional Repository, which is
based on the DSpace system. The profiles helped to communicate
users' needs, abilities, tasks, and problems, and the task scenarios
used in the heuristic evaluation were based on these personas. Four
personas resulted from a focus group discussion with undergraduate and
graduate students and from an interview with a university librarian. We
then used these personas to create focused task-scenarios for a
heuristic evaluation on the system interface to ensure that it met
users' needs, goals, problems, and desires. In this paper, we present
the process we used to create the personas, which led us to devise the
task scenarios used in the heuristic evaluation as a follow-up study of
the DSpace university repository.
Abstract: Electrostatic interaction energy (ΔEEDL) is a part of the Extended Derjaguin-Landau-Verwey-Overbeek (XDLVO) theory, which, together with van der Waals (ΔEVDW) and acid-base (ΔEAB) interaction energies, has been extensively used to investigate the initial adhesion of bacteria to surfaces. Electrostatic or electrical double layer interaction energy is considerably affected by surface potential; however, surface potential cannot be determined experimentally and is usually replaced by the zeta (ζ) potential obtained via electrophoretic mobility. This paper focuses on the effect of ionic concentration as a function of pH and the effect of mineral grain size on ζ potential. It was found that both ionic strength and mineral grain size play a major role in determining the value of the ζ potential for the adhesion of P. putida to hematite and quartz surfaces. Higher ζ potential values lead to higher electrostatic interaction energies and eventually to a higher total XDLVO interaction energy, resulting in bacterial repulsion.
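As an illustrative sketch of how the electrostatic double-layer term can be evaluated with the ζ potential substituted for the surface potential, the snippet below implements one commonly used sphere-plate expression (the Hogg-Healy-Fuerstenau constant-potential form) together with the Debye screening parameter for a 1:1 electrolyte. The radius and potential values shown in the test are hypothetical, not taken from this study.

```python
import math

# Physical constants (SI units)
EPS0 = 8.854e-12          # vacuum permittivity, F/m
KB = 1.381e-23            # Boltzmann constant, J/K
E_CHARGE = 1.602e-19      # elementary charge, C
N_A = 6.022e23            # Avogadro's number, 1/mol

def debye_kappa(ionic_strength_M, eps_r=78.5, T=298.15):
    """Inverse Debye length (1/m) for a 1:1 electrolyte of given ionic strength (mol/L)."""
    I = ionic_strength_M * 1000.0  # convert mol/L to mol/m^3
    return math.sqrt(2 * N_A * E_CHARGE**2 * I / (EPS0 * eps_r * KB * T))

def edl_energy_sphere_plate(a, psi1, psi2, h, ionic_strength_M, eps_r=78.5, T=298.15):
    """Sphere-plate EDL interaction energy (J), HHF constant-potential expression.
    a: cell radius (m); psi1, psi2: (zeta) potentials (V); h: separation (m)."""
    kappa = debye_kappa(ionic_strength_M, eps_r, T)
    e = math.exp(-kappa * h)
    eps = EPS0 * eps_r
    return math.pi * eps * a * (
        2 * psi1 * psi2 * math.log((1 + e) / (1 - e))
        + (psi1**2 + psi2**2) * math.log(1 - e**2)
    )
```

For like-charged surfaces (e.g. both ζ potentials negative), the expression yields a repulsive (positive) energy that decays with separation and is screened at higher ionic strength, consistent with the trends discussed above.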
Abstract: Traditional mechanical control systems for thrust
vectoring are efficient in rocket thrust guidance, but their cost
and weight are excessive. Fluidic injection in the nozzle
divergent section constitutes an alternative means of achieving this goal. In
this paper, we present a 3D analytical model for fluidic injection
in a supersonic nozzle integrating an orifice. The fluidic vectoring
uses a sonic secondary injection in the divergent section. The flow
and the interaction between the main and secondary jets are modelled in
order to express the pressure fields, from which the forces and the thrust
vector deflection are deduced. Under various separation criteria, the present
analytical model results are compared with the existing numerical
and experimental data from the literature.
Abstract: This paper presents a fully Lagrangian coupled
Fluid-Structure Interaction (FSI) solver for simulations of
fluid-structure interactions, which is based on the Moving Particle
Semi-implicit (MPS) method to solve the governing equations
corresponding to incompressible flows as well as elastic structures.
The developed solver is verified by reproducing the high-velocity
impact loads on deformable thin wedges of three different materials,
namely mild steel, aluminium, and tin, during water entry. The present
simulation results for aluminium are compared with the analytical solutions
derived from the hydrodynamic Wagner model and linear Wan's
theory. In addition, the impact pressure and strain on the water-entry
wedges are simulated for the three materials, and the effects of
hydro-elasticity are discussed.
Abstract: The objective of this paper is to evaluate the effects of
soil-structure interaction (SSI) on the modal characteristics and on
the dynamic response of common structures. The focus is on the
overall behaviour of a real five-storey reinforced concrete (R/C)
building of a type typically encountered in Algeria. Sensitivity
studies are undertaken in order to study the effects of frequency
content of the input motion, frequency of the soil-structure system,
rigidity and depth of the soil layer on the dynamic response of such
structures. This investigation indicated that the rigidity of the soil
layer is the predominant factor in soil-structure interaction, and
increasing it markedly reduces the deformation of the R/C
structure. On the other hand, increasing the period of the underlying
soil will cause an increase in the lateral displacements at story levels
and create irregularity in the distribution of story shears. Possible
resonance between the frequency content of the input motion and soil
could also play an important role in increasing the structural
response.
Abstract: Determination of genetic variation is useful for plant
breeding and hence production of more efficient plant species under
different conditions, such as drought stress. In this study, a sample of 28
recombinant inbred lines (RILs) of wheat developed from the cross of
Norstar and Zagross varieties, together with their parents, were
evaluated for two years (2010-2012) under normal and water stress
conditions using a split-plot design with three replications. Main plots
included two irrigation treatments of 70 and 140 mm evaporation
from a Class A pan, and the sub-plots consisted of 30 genotypes. The effects
of genotype and of its interactions with year and water
regime were significant for all characters. Significant genotypic
effect implies the existence of genetic variation among the lines
under study. Heritability estimates were high for 1000 grain weight
(0.87). Biomass and grain yield showed the lowest heritability values
(0.42 and 0.50, respectively). Highest genotypic and phenotypic
coefficients of variation (GCV and PCV) belonged to harvest index.
Moderate genetic advance for most of the traits suggested the
feasibility of selection among the RILs under investigation. Some
RILs were higher yielding than either parent in both environments.
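The quantities reported above (heritability, genotypic and phenotypic coefficients of variation, and genetic advance) are conventionally estimated from ANOVA variance components. The sketch below uses a simplified single-trial, replicated-design formulation with illustrative mean squares; it is not the paper's actual analysis or data.

```python
import math

def variance_components(ms_genotype, ms_error, n_reps):
    """Estimate genotypic and phenotypic variances from ANOVA mean squares
    (simplified randomized-block approximation on an entry-mean basis)."""
    var_g = max((ms_genotype - ms_error) / n_reps, 0.0)
    var_p = var_g + ms_error / n_reps   # phenotypic variance of entry means
    return var_g, var_p

def heritability(var_g, var_p):
    """Broad-sense heritability as the genotypic share of phenotypic variance."""
    return var_g / var_p if var_p > 0 else 0.0

def cv_percent(variance, grand_mean):
    """Coefficient of variation (%): GCV if genotypic, PCV if phenotypic variance."""
    return 100.0 * math.sqrt(variance) / grand_mean

def genetic_advance(var_p, h2, k=2.06):
    """Expected genetic advance under selection (k = 2.06 at 5% selection intensity)."""
    return k * math.sqrt(var_p) * h2
```

With these definitions, PCV always exceeds GCV, and a high heritability such as the 0.87 reported for 1000-grain weight corresponds to a genotypic variance dominating the error term.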
Abstract: The development of transport systems has negative
impacts on the environment although it has beneficial effects on
society. Car-centred policy has caused many problems: a
spectacular growth in fuel consumption, and hence a very large increase
in urban pollution, traffic congestion at certain places and
times, and a rise in the number of accidents. The exhaust emissions
from cars and weather conditions are the main factors that determine
the level of pollution in urban atmosphere. These conditions lead to
the phenomenon of heat transfer and radiation occurring between the
air and the soil surface of any town. These exchanges give rise, in
urban areas, to the effects of heat islands that correspond to the
appearance of excess air temperature between the city and its
surrounding space. To this end, we perform a numerical simulation
of the plume generated by car exhaust gases and show that these
gases form a screening layer above the city, which causes the
heat island in the presence of wind flow. This study allows us: (1) to
understand the different mechanisms of interaction between these
phenomena; and (2) to consider appropriate technical solutions to mitigate
the effects of the heat island.
Abstract: The purpose of this project is to propose a quick and
environmentally friendly alternative to measure the quality of oils
used in the food industry. There is evidence that the repeated and
indiscriminate use of oils in food processing causes physicochemical
changes, with the formation of potentially toxic compounds that can
affect the health of consumers and cause organoleptic changes. In
order to assess the quality of oils, non-destructive optical techniques
such as Interferometry offer a rapid alternative to the use of reagents,
using only the interaction of light on the oil. Through this project, we
used interferograms of samples of oil placed under different heating
conditions to establish the changes in their quality. These
interferograms were obtained by means of a Mach-Zehnder
Interferometer using a beam of light from a HeNe laser of 10mW at
632.8 nm. Each interferogram was captured and analyzed, and its
full width at half maximum (FWHM) was measured using the
AMCap and ImageJ software. The FWHM measurements were organized into three
groups. It was observed that the average of the
FWHMs of group A shows almost linear behavior; therefore,
it is probable that the exposure time is not relevant when the oil is
kept at a constant temperature. Group B exhibits a slightly
exponential trend when the temperature rises from 373 K to 393
K. Results of Student's t-test indicate, at the 95% confidence level
(α = 0.05), a variation in the molecular composition of the two samples.
Furthermore, we found a correlation between the Iodine Indexes
(Physicochemical Analysis) and the Interferograms (Optical
Analysis) of group C. Based on these results, this project highlights
the importance of the quality of the oils used in the food industry and
shows how interferometry can be a useful tool for this purpose.
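The FWHM of a fringe-intensity profile such as those extracted with ImageJ can be computed by locating the half-maximum crossings on either side of the peak. The following is a minimal sketch assuming a single-peaked, sampled profile; it is not the authors' measurement pipeline.

```python
def fwhm(x, y):
    """Full width at half maximum of a single-peaked profile y(x),
    using linear interpolation at the half-maximum crossings."""
    ymax = max(y)
    half = ymax / 2.0
    i_peak = y.index(ymax)

    def cross(i_from, i_to, step):
        # Walk away from the peak until the profile crosses the half level.
        i = i_from
        while i != i_to:
            j = i + step
            if (y[i] - half) * (y[j] - half) <= 0 and y[i] != y[j]:
                # Linear interpolation between samples i and j.
                t = (half - y[i]) / (y[j] - y[i])
                return x[i] + t * (x[j] - x[i])
            i = j
        raise ValueError("profile does not cross half maximum")

    left = cross(i_peak, 0, -1)
    right = cross(i_peak, len(y) - 1, 1)
    return abs(right - left)
```

For a Gaussian profile this reproduces the analytic relation FWHM = 2√(2 ln 2)·σ, which is a convenient sanity check for the implementation.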
Abstract: In this paper, we provide a literature survey on the
artificial stock market (ASM). The paper begins by exploring the
complexity of the stock market and the need for ASMs. An ASM
aims to investigate the link between individual behaviors (the micro
level) and financial market dynamics (the macro level). The variety of
patterns at the macro level is a function of the ASM's complexity. The
financial market system is a complex system in which the relationship
between the micro and macro levels cannot be captured analytically.
Computational approaches, such as simulation, are expected to
capture this connection. Agent-based simulation is the simulation
technique commonly used to build ASMs. The paper proceeds by
discussing the components of the ASM. We consider the roles
of behavioral finance (BF) alongside the traditionally risk-averse
assumption in the construction of agents' attributes. The
influence of social networks on the development of agent interactions is
also addressed. Network topologies such as small-world, distance-based,
and scale-free networks may be utilized to model economic
collaborations. In addition, the primary methods for developing
agents' learning and adaptive abilities are summarized.
These include approaches such as Genetic Algorithms, Genetic
Programming, Artificial Neural Networks, and Reinforcement Learning.
In addition, the most common statistical properties (the stylized facts)
of stocks that are used for the calibration and validation of ASMs are
discussed. We also review the major related previous
studies and categorize the approaches utilized in them.
Finally, research directions and potential research questions
are discussed. Research on ASMs may focus on the macro
level, by analyzing the market dynamics, or on the micro level, by
investigating the wealth distributions of the agents.
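Two of the stylized facts commonly used to validate ASMs, fat-tailed returns and volatility clustering, can be checked numerically as sketched below. The GARCH-style synthetic series and its parameters in the test are illustrative assumptions, not results from the surveyed studies.

```python
def _mean(xs):
    return sum(xs) / len(xs)

def excess_kurtosis(returns):
    """Excess kurtosis of a return series; values > 0 indicate fat tails."""
    m = _mean(returns)
    var = _mean([(r - m) ** 2 for r in returns])
    m4 = _mean([(r - m) ** 4 for r in returns])
    return m4 / var ** 2 - 3.0

def autocorr(xs, lag):
    """Sample autocorrelation at a given lag: near zero for raw returns,
    positive for absolute returns when volatility clusters."""
    m = _mean(xs)
    var = sum((x - m) ** 2 for x in xs)
    cov = sum((xs[i] - m) * (xs[i + lag] - m) for i in range(len(xs) - lag))
    return cov / var
```

A calibrated ASM is typically expected to show positive excess kurtosis, negligible autocorrelation in raw returns, and persistent autocorrelation in absolute returns.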
Abstract: Nowadays, several research studies point out that an
active lifestyle is essential for physical and mental health.
Mobile phones have greatly influenced people's habits and attitudes,
including the way they exercise. Our research work is mainly focused on
investigating how to exploit mobile technologies to favour people’s
exertion experience. To this end, we developed an exertion framework
users can exploit through a real world mobile application, called
EverywhereSport Run (EWRun), designed to act as a virtual personal
trainer to support runners during their training sessions. In this work, inspired
by previous findings in the field of interaction design for people
with visual impairments, feedback gathered from real users of our
framework, and positive results obtained from two experiments,
we present some new interaction facilities we designed to enhance
the interaction experience during a training session. The positive results
obtained helped us to derive some interaction design recommendations
we believe will be a valid support for designers of future mobile
systems conceived to be used in circumstances where there are limited
possibilities of interaction.
Abstract: DNA barcodes provide a good source of the
information needed to classify living species. The classification problem has
to be supported by reliable methods and algorithms. To analyze
species regions or entire genomes, it becomes necessary to use
sequence similarity methods. A large set of sequences can be
compared simultaneously using Multiple Sequence Alignment, which
is known to be NP-complete. However, the methods in use are still
computationally very expensive and require significant computational
infrastructure. Our goal is to build predictive models that are highly
accurate and interpretable. In fact, our method permits to avoid the
complex problem of form and structure in different classes of
organisms. The empirical data and their classification performances
are compared with those of other methods. In this study, we present
our system, which consists of three phases. The first, called
transformation, is composed of three sub-steps: Electron-Ion
Interaction Pseudopotential (EIIP) coding of DNA
barcodes, Fourier transform, and power spectrum signal processing.
The second phase is an approximation step; it is
empowered by the use of Multi Library Wavelet Neural Networks
(MLWNN). Finally, the third phase, the classification of DNA
barcodes, is realized by applying a hierarchical classification
algorithm.
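The transformation phase described above can be sketched as follows: each base is mapped to its EIIP value and the power spectrum of the resulting numerical signal is computed. The EIIP values used here are the commonly cited nucleotide pseudopotentials; the plain O(n²) DFT is for illustration only and would be replaced by an FFT in practice.

```python
import cmath

# Commonly cited EIIP values for the four nucleotides (assumed here).
EIIP = {'A': 0.1260, 'C': 0.1340, 'G': 0.0806, 'T': 0.1335}

def encode_eiip(seq):
    """Map a DNA barcode string to its EIIP numerical signal."""
    return [EIIP[base] for base in seq.upper()]

def power_spectrum(signal):
    """Power spectrum |DFT|^2 of the mean-removed EIIP signal (plain O(n^2) DFT)."""
    n = len(signal)
    m = sum(signal) / n
    x = [s - m for s in signal]   # remove the DC component
    spectrum = []
    for k in range(n):
        xk = sum(x[j] * cmath.exp(-2j * cmath.pi * k * j / n) for j in range(n))
        spectrum.append(abs(xk) ** 2)
    return spectrum
```

A strictly periodic sequence concentrates its spectral power at the harmonics of its period, which is the kind of spectral feature such transformations expose for downstream approximation and classification.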
Abstract: This paper presents a state-of-the-art survey of the
operations research models developed for internal audit planning.
Two alternative approaches have been followed in the literature for
audit planning: (1) identifying the optimal audit frequency; and (2)
determining the optimal audit resource allocation. The first approach
identifies the elapsed time between two successive audits, which can
be presented as the optimal number of audits in a given planning
horizon, or the optimal number of transactions after which an audit
should be performed. It also includes the optimal audit schedule. The
second approach determines the optimal allocation of audit frequency
among all auditable units in the firm. In our review, we discuss both
the deterministic and probabilistic models developed for audit
planning. In addition, game theory models are reviewed to find the
optimal auditing strategy based on the interactions between the
auditors and the clients.
Abstract: Numerical studies have been carried out using a
validated two-dimensional standard k-omega turbulence model for
the design optimization of a thrust vector control system based on a
shock-induced, self-impinging supersonic secondary double jet. Parametric
analytical studies have been carried out at different secondary
injection locations to identify the most asymmetric
distribution of the main gas flow due to shock waves, which produces
the desirable side force most effectively for vectoring. The results from
the parametric studies of the case on hand reveal that the shock
induced self-impinging supersonic secondary double jet is more
efficient in certain locations at the divergent region of a CD nozzle
than a case with a single supersonic jet of the same mass flow rate. We
observed that the best axial location of the self-impinging supersonic
secondary double jet nozzle with a given jet interaction angle, built-in
to a CD nozzle having area ratio 1.797, is 0.991 times the primary
nozzle throat diameter from the throat location. We also observed
that flexible steering is possible by invoking an ON/OFF facility in
the secondary nozzles to meet onboard mission requirements.
Through our case studies, we concluded that the supersonic self-impinging
secondary double jet, at a predesigned jet interaction angle
and location, can provide more flexible steering options, with
8.81% higher thrust vectoring efficiency than the conventional
single supersonic secondary jet, without compromising the payload
capability of a supersonic aerospace vehicle.
Abstract: Neural activity in the human brain starts from the
early stages of prenatal development. This activity, or the signals
generated by the brain, is electrical in nature and represents not only
the brain function but also the status of the whole body. At the
present moment, three methods can record functional and
physiological changes within the brain with high temporal resolution
of neuronal interactions at the network level: the
electroencephalogram (EEG), the magnetoencephalogram (MEG),
and functional magnetic resonance imaging (fMRI); each of these has
advantages and shortcomings. EEG recording with a large number of
electrodes is now feasible in clinical practice. Multichannel EEG
recorded from the scalp surface provides very valuable but indirect
information about the source distribution. However, depth electrode
measurements yield more reliable information about source
locations. Intracranial recordings and scalp EEG are used with
source imaging techniques to determine the locations and strengths of
the epileptic activity. As a source localization method, Low
Resolution Electro-Magnetic Tomography (LORETA) is solved for
the realistic geometry based on both forward methods, the Boundary
Element Method (BEM) and the Finite Difference Method (FDM). In
this paper, we review the findings of EEG-LORETA studies of epilepsy.
Abstract: The literature on language teaching and second
language acquisition has been largely driven by monolingual
ideology with a common assumption that a second language (L2) is
best taught and learned in the L2 only. The current study challenges
this assumption by reporting learners' positive perceptions of tertiary
level teachers' code switching practices in Vietnam. The findings of
this study contribute to our understanding of code switching practices
in language classrooms from a learners' perspective.
Data were collected from student participants who were working
towards a Bachelor degree in English within the English for Business
Communication stream through the use of focus group interviews.
The literature has documented that this method of interviewing has a
number of distinct advantages over individual student interviews. For
instance, group interactions generated by focus groups create a more
natural environment than that of an individual interview because they
include a range of communicative processes in which each individual
may influence or be influenced by others - as they are in their real
life. The process of interaction provides the opportunity to obtain the
meanings and answers to a problem that are "socially constructed
rather than individually created" leading to the capture of real-life
data. The distinct feature of group interaction offered by this
technique makes it a powerful means of obtaining deeper and richer
data than those from individual interviews. The data generated
through this study were analysed using a constant comparative
approach. Overall, the students expressed positive views of this
practice indicating that it is a useful teaching strategy. Teacher code
switching was seen as a learning resource and a source supporting
language output. This practice was perceived to promote student
comprehension and to aid the learning of content and target language
knowledge. This practice was also believed to scaffold the students'
language production in different contexts. However, the students
indicated their preference for teacher code switching to be
constrained, as extensive use was believed to negatively impact on
their L2 learning and trigger cognitive reliance on the L1 for L2
learning. The students also perceived that when the L1 was used to a
great extent, their ability to develop as autonomous learners was
negatively impacted.
This study found that teacher code switching was supported in
certain contexts by learners, thus suggesting that there is a need for
the widespread assumption about the monolingual teaching approach
to be reconsidered.
Abstract: Cyberspace has become a more viable arena for
budding artists to share musical acts through digital forms. The
increasing relevance of online communities has attracted scholars
from various fields demonstrating its influence on social capital. This
paper extends this understanding of social capital among Filipino
music artists belonging to the SoundCloud Philippines Facebook
Group.
The study makes use of various qualitative data obtained from
key-informant interviews and participant observation of online and
physical encounters, analyzed using the case study approach.
SoundCloud Philippines has over seven hundred members and is
composed of Filipino singers, instrumentalists, composers, arrangers,
producers, multimedia artists and event managers. Group interactions
are a mix of online encounters based on Facebook and SoundCloud
and physical encounters through meet-ups and events. Benefits
reaped from the community are informational, technical,
instrumental, promotional, motivational and social support. Under the
guidance of online group administrators, collaborative activities such
as music productions, concerts and events transpire. Most conflicts
and problems arising are resolved peacefully. Social capital in
SoundCloud Philippines is mobilized through recognition, respect
and reciprocity.
Abstract: Continuous upflow filters can combine nutrient
(nitrogen and phosphate) and suspended solids removal in one unit
process. The contaminant removal could be achieved chemically or
biologically; in both processes the filter removal efficiency depends
on the interaction between the packed filter media and the influent. In
this paper a residence time distribution (RTD) study was carried out
to understand and compare the transfer behaviour of contaminants
through a selected filter media packed in a laboratory-scale
continuous upflow filter; the selected filter media are limestone and
white dolomite. The experimental work was conducted by injecting a
tracer (red drain dye tracer –RDD) into the filtration system and then
measuring the tracer concentration at the outflow as a function of
time; the tracer injection was applied at hydraulic loading rates
(HLRs) (3.8 to 15.2 m h-1). The results were analysed according to
the cumulative distribution function F(t) to estimate the residence
time of the tracer molecules inside the filter media. The mean
residence time (MRT) and the variance (σ²) are two moments of the RTD that
were calculated to compare the RTD characteristics of limestone with
white dolomite. The results showed that the exit-age distribution of
the tracer was most favourable at HLRs of 3.8 to 7.6 m h-1 for
limestone and 3.8 m h-1 for white dolomite. At these HLRs, the
cumulative distribution function F(t) revealed that the residence time
of the tracer inside the limestone was longer than in the white
dolomite: all the tracer took 8 minutes to leave the white
dolomite at 3.8 m h-1, whereas the same amount of tracer
took 10 minutes to leave the limestone at the same HLR. In
conclusion, determining the optimal hydraulic loading rate, which
achieves a better influent distribution over the
filtration system, helps to assess the applicability of a material as
filter media. Further work will examine the efficiency
of the limestone and white dolomite for phosphate removal by
pumping a phosphate solution into the filter at HLRs (3.8 to 7.6 m h-1).
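The RTD moments compared above (mean residence time and variance σ²) can be computed from a measured tracer response curve as sketched below, using E(t) = C(t)/∫C dt and trapezoidal integration. The exponential test curve is illustrative, not the experimental data.

```python
def rtd_moments(times, conc):
    """Mean residence time and variance of an RTD from a tracer response
    curve C(t), using E(t) = C(t) / integral(C dt) and trapezoidal rule."""
    def trapz(ys):
        return sum((ys[i] + ys[i + 1]) / 2.0 * (times[i + 1] - times[i])
                   for i in range(len(times) - 1))
    area = trapz(conc)
    e = [c / area for c in conc]                      # normalized E(t)
    mrt = trapz([t * ei for t, ei in zip(times, e)])  # first moment (MRT)
    var = trapz([(t - mrt) ** 2 * ei for t, ei in zip(times, e)])
    return mrt, var
```

For an ideal continuously stirred tank with residence time τ, E(t) = e^(-t/τ)/τ, so the computed MRT should recover τ and the variance should recover τ², which provides a simple check before applying the routine to measured outflow data.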
Abstract: The numerical simulation has made tremendous
advances in investigating the blood flow phenomenon through elastic
arteries. Such studies can be useful in demonstrating the disease
progression and hemodynamics of cardiovascular diseases such as
atherosclerosis. In the present study, a patient-specific case diagnosed
with a partially stenosed complete right ICA and a normal left carotid
bifurcation without any atherosclerotic plaque formation is
considered. A 3D patient-specific carotid bifurcation model is generated
from CT scan data using MIMICS-4.0, and numerical analysis is
performed using the FSI solver in ANSYS-14.5. The blood flow is
assumed to be incompressible, homogenous and Newtonian, while
the artery wall is assumed to be linearly elastic. The two-way
sequentially coupled transient FSI analysis is performed using FSI
solver for three pulse cycles. The hemodynamic parameters such as
flow pattern, Wall Shear Stress, pressure contours and arterial wall
deformation are studied at the bifurcation and critical zones such as
stenosis. The variation in flow behavior is studied throughout the
pulse cycle. The simulation results also reveal a
considerable change in the flow behavior in the stenosed carotid in
contrast to the normal carotid bifurcation system. The investigation
also demonstrates a disturbed flow pattern, especially at the
bifurcation and the stenosed zone, with elevated hemodynamic parameters,
particularly during peak systole and the later part of the pulse cycle. The
results obtained agree well with clinical observations and
demonstrate the potential of patient-specific numerical studies in the
prognosis of disease progression and plaque rupture.