Abstract: High-quality requirements analysis is one of the most
crucial activities for ensuring the success of a software project, so
requirements verification for software systems has become increasingly
important in Requirements Engineering (RE) and is one of the most
helpful strategies for improving the quality of software systems.
Related work shows that requirements elicitation and analysis can be
facilitated by ontological approaches and semantic web technologies.
In this paper, we propose a hybrid method that aims to verify
requirements with structural and formal semantics in order to detect
interactions. The proposed method is twofold: one part models
requirements with the semantic web language OWL to construct a
semantic context; the other is a set of interaction detection rules
derived from scenario-based analysis and represented in the Semantic
Web Rule Language (SWRL). SWRL-based rules work with rule
engines such as Jess to reason over the semantic context of
requirements and thus detect interactions. The benefits of the proposed
method lie in three aspects: the method (i) provides systematic steps
for modeling requirements with an ontological approach, (ii) offers
synergy between requirements elicitation and domain engineering for
knowledge sharing, and (iii) provides rules that can systematically
assist in requirements interaction detection.
Abstract: Work stress causes organizational work-life imbalance
among employees. Because of this imbalance, workers put less effort
into finishing assignments, and the organization consequently
experiences reduced productivity. To investigate the problem of
organizational work-life imbalance, this qualitative case study
focuses on organizational work-life imbalance among Thai
software developers in a German-owned company in Chiang Mai,
Thailand. In terms of knowledge management, the fishbone diagram is
a useful analysis tool for systematically investigating the root causes
of an organizational work-life imbalance in focus-group discussions.
Furthermore, the fishbone diagram clearly shows the relationship
between causes and effects. It was found that organizational work-life
imbalance among Thai software developers is influenced over time by
the management team, the work environment, and the information
tools used in the company.
Abstract: Recent studies in the area of supply chain networks
(SCN) have focused on disruption issues in distribution systems.
This paper extends the previous literature by providing a new
bi-objective model for minimizing the cost of designing a
three-echelon SCN across normal and failure scenarios, considering
multiple capacity options for manufacturers and distribution centers.
Moreover, in order to solve the problem by means of the LINGO
software, the novel model is reformulated through a branch of the
LP-metric method called the Min-Max approach.
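The Min-Max branch of the LP-metric method can be illustrated with a small sketch: each candidate design is scored by its largest normalized deviation from the per-objective ideal costs, and the design with the smallest such score is preferred. The candidate designs and ideal values below are hypothetical, purely for illustration; the paper itself solves the reformulated model in LINGO.

```python
def min_max_score(objectives, ideals):
    """Largest normalized deviation of the objectives from their ideal values."""
    return max((f - f_star) / f_star for f, f_star in zip(objectives, ideals))

def best_design(candidates, ideals):
    """Pick the candidate whose worst normalized deviation is smallest."""
    return min(candidates, key=lambda cand: min_max_score(cand[1], ideals))

# Hypothetical designs: (name, (normal-scenario cost, failure-scenario cost))
candidates = [
    ("design_A", (120.0, 300.0)),   # worst deviation 0.20
    ("design_B", (100.0, 360.0)),   # worst deviation 0.20
    ("design_C", (110.0, 320.0)),   # worst deviation 0.10
]
ideals = (100.0, 300.0)             # ideal single-objective optimal costs

best = best_design(candidates, ideals)[0]
```

The same scalarization is what turns the bi-objective model into a single-objective one that an LP/MIP solver such as LINGO can handle directly.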
Abstract: There have been different approaches to computing the
analytic instantaneous frequency, with a variety of underlying
reasoning, practical applicability, and restrictions. This paper
presents an instantaneous frequency computation approach based on the
adaptive Fourier decomposition and α-counting. The adaptive Fourier
decomposition is a recently proposed signal decomposition approach;
the instantaneous frequency can be computed through the so-called
mono-components it produces. Due to its fast energy convergence, the
adaptive Fourier decomposition discards the highest-frequency content
of the signal, which in most situations represents noise. A new
instantaneous frequency definition for a large class of so-called
simple waves is also proposed in this paper. The class of simple waves
contains a wide range of signals for which the concept of
instantaneous frequency has a clear physical sense. The α-counting
instantaneous frequency can be used to compute the highest frequency
of a signal. Combining the two approaches, one can obtain the
instantaneous frequencies of the whole signal. An experiment
demonstrates the computation procedure with promising results.
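For context, the classical analytic-signal route to instantaneous frequency (not the adaptive Fourier decomposition itself, which the paper introduces) can be sketched as follows: build the analytic signal by zeroing the negative-frequency half of the DFT, then differentiate its unwrapped phase. The direct O(n²) DFT below is for clarity only.

```python
import cmath, math

def analytic_signal(x):
    """Discrete analytic signal via the DFT: zero the negative frequencies."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    # keep DC and Nyquist, double the positive frequencies, zero the rest
    H = [0.0] * n
    H[0] = 1.0
    for k in range(1, n // 2):
        H[k] = 2.0
    if n % 2 == 0:
        H[n // 2] = 1.0
    Xa = [X[k] * H[k] for k in range(n)]
    return [sum(Xa[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def instantaneous_frequency(x, fs):
    """IF in Hz from the unwrapped phase of the analytic signal."""
    phase = [cmath.phase(v) for v in analytic_signal(x)]
    ifreq = []
    for t in range(1, len(phase)):
        d = phase[t] - phase[t - 1]
        while d > math.pi:
            d -= 2 * math.pi          # unwrap the phase difference
        while d < -math.pi:
            d += 2 * math.pi
        ifreq.append(d * fs / (2 * math.pi))
    return ifreq
```

For a pure sinusoid with an integer number of cycles, this recovers the carrier frequency at every sample.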
Abstract: In this study we investigate the effects of silica nanoparticles (SiO2-NPs) on the structure and phase properties of supported lipid monolayers and bilayers, coupling surface pressure measurements, fluorescence microscopy, and atomic force microscopy. SiO2-NPs ranging from 10 nm to 100 nm in diameter are tested. Our results suggest, first, that the organization of the lipid molecules depends on their nature. Secondly, lipid molecules in the vicinity of large nanoparticle aggregates organize in the liquid-condensed phase, whereas small aggregates are localized in both the fluid liquid-expanded (LE) and liquid-condensed (LC) phases. We also demonstrate by atomic force microscopy that, by measuring friction forces, it is possible to determine whether or not nanoparticle aggregates are covered by lipid monolayers and bilayers.
Abstract: The group mutual exclusion (GME) problem is a
variant of the mutual exclusion problem. In the present paper a
token-based group mutual exclusion algorithm, capable of handling
transient faults, is proposed. The algorithm uses the concept of
dynamic request sets. A timeout mechanism is used to detect
token loss, and a distributed scheme is used to regenerate the token.
The worst-case message complexity of the algorithm is n+1. The
maximum concurrency and forum switch complexity of the
algorithm are n and min (n, m) respectively, where n is the number of
processes and m is the number of groups. The algorithm also satisfies
another desirable property called smooth admission. The scheme can
also be adapted to handle the extended group mutual exclusion
problem.
Abstract: Nowadays, when most leading economies are
service-oriented and e-business is widely used for their management,
supply chain management has become one of the most studied and
practiced fields. Quality plays an important role in today's business
processes, so it is important to understand the impact of IT service
quality on the performance of supply chains. This paper starts by
analyzing the Supply Chain Operations Reference (SCOR) model and
each of its five activities: Plan, Source, Make, Deliver, and Return.
It then proposes a framework for analyzing the effect of IT service
quality on supply chain performance. Using the proposed framework,
hypotheses are framed for the direct effect of IT service quality on
supply chain performance and its indirect effect through effective
supply chain management. The framework will be validated
empirically based on surveys of executives of various organizations
and statistical analyses of the collected data.
Abstract: In this paper, we propose an efficient data
compression strategy exploiting the multi-resolution characteristic of
the wavelet transform. We have developed a sensor node called the
"Smart Sensor Node" (SSN). The main goals of the SSN design are
light weight, minimal power consumption, modular design, and robust
circuitry. The SSN is made up of four basic components: a sensing
unit, a processing unit, a transceiver unit, and a power unit. The
FiOStd evaluation board is chosen as the main controller of the SSN
for its low cost and high performance. The software implementation
was coded using Simulink models and the MATLAB programming
language. The experimental results show that the proposed data
compression technique reconstructs the signal with good quality.
This technique can be applied to compress the collected data, reducing
data communication as well as the energy consumption of the sensor,
so that the lifetime of the sensor node can be extended.
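A minimal sketch of multi-resolution wavelet compression of the kind described above, using the Haar wavelet for simplicity (the abstract does not state which wavelet the SSN uses): decompose the signal, zero the detail coefficients below a threshold, and reconstruct on the receiving side.

```python
def haar_forward(x):
    """Full multi-level Haar DWT; len(x) must be a power of two."""
    coeffs, approx = [], list(x)
    while len(approx) > 1:
        a = [(approx[i] + approx[i + 1]) / 2 ** 0.5 for i in range(0, len(approx), 2)]
        d = [(approx[i] - approx[i + 1]) / 2 ** 0.5 for i in range(0, len(approx), 2)]
        coeffs.append(d)        # detail coefficients, finest level first
        approx = a
    return approx, coeffs       # final approximation + per-level details

def haar_inverse(approx, coeffs):
    """Invert haar_forward exactly."""
    a = list(approx)
    for d in reversed(coeffs):
        nxt = []
        for ai, di in zip(a, d):
            nxt.append((ai + di) / 2 ** 0.5)
            nxt.append((ai - di) / 2 ** 0.5)
        a = nxt
    return a

def compress(x, threshold):
    """Zero small detail coefficients before transmission."""
    approx, coeffs = haar_forward(x)
    kept = [[c if abs(c) >= threshold else 0.0 for c in level] for level in coeffs]
    return approx, kept
```

With threshold 0 the reconstruction is exact; raising the threshold trades reconstruction quality for fewer nonzero coefficients to transmit, which is the energy saving the abstract describes.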
Abstract: This paper presents a methodology for emulating the electrical power consumption of the RF device during cellular phone/handset transmission using LTE technology. The emulation methodology takes the physical environmental variables and the logical interface between the baseband and the RF system as inputs to compute the emulated power dissipation of the RF device. The emulated power between the measured points, corresponding to discrete values of the logical interface parameters, is computed by polynomial interpolation using polynomial basis functions. The evaluation of polynomial and spline curve-fitting models showed respective divergences (test errors) of 8% and 0.02% from the physically measured power consumption. The precision of the instruments used for the physical measurements has been modeled as intervals. We have been able to model the power consumption of the RF device operating at 5 MHz using a homotopy between the two continuous power consumptions of the RF device operating at bandwidths of 3 MHz and 10 MHz.
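The interpolation step between measured points can be sketched with Lagrange polynomial basis functions; the measurement pairs below are hypothetical, not the paper's data.

```python
def lagrange_interpolate(points, x):
    """Evaluate the Lagrange interpolating polynomial through `points` at x.

    points: list of (parameter_value, measured_power) pairs with distinct
    parameter values. The polynomial passes exactly through every pair.
    """
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if j != i:
                # Lagrange basis: 1 at xi, 0 at every other measured point
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# Hypothetical (interface parameter, measured power in mW) samples
samples = [(0.0, 1.0), (1.0, 6.0), (2.0, 15.0), (3.0, 28.0)]
power_between = lagrange_interpolate(samples, 1.5)
```

With n+1 measured points this reproduces any polynomial of degree up to n exactly, which is why the interpolated curve matches the measurements at the sampled parameter values.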
Abstract: In this paper, we study techniques for scheduling
users for resource allocation in multiple-input multiple-output
(MIMO) packet transmission systems. In these systems, transmit
antennas are assigned to one user or dynamically to different users
using spatial multiplexing. Allocating all transmit antennas to one
user cannot take full advantage of multi-user diversity. Therefore, we
consider the case where resources are allocated dynamically. At each
time slot, users have to feed back their channel information on an
uplink feedback channel. The channel information assumed available
to the schedulers is the zero-forcing (ZF) post-detection
signal-to-interference-plus-noise ratio. Our analysis concerns the
round-robin and opportunistic schemes.
We present an overview and a complete capacity analysis of these
schemes. The main result of our study is an analytical form of the
system capacity using the ZF receiver at the user terminal.
Simulations have been carried out to validate all proposed analytical
solutions and to compare the performance of the two schemes.
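As a sketch, assuming equal per-stream power, the ZF post-detection SINR of stream k is snr / [(H^H H)^{-1}]_{kk}; the round-robin scheduler then cycles through users regardless of channel state, while the opportunistic scheduler serves the user reporting the highest SINR. The two-stream case below is illustrative, not the paper's analytical derivation.

```python
def zf_post_sinr(h1, h2, snr):
    """ZF post-detection SINRs of two spatially multiplexed streams whose
    complex channel columns are h1 and h2: SINR_k = snr / [(H^H H)^{-1}]_{kk}."""
    g00 = sum(abs(a) ** 2 for a in h1)
    g11 = sum(abs(b) ** 2 for b in h2)
    g01 = sum(a.conjugate() * b for a, b in zip(h1, h2))
    det = g00 * g11 - abs(g01) ** 2   # determinant of H^H H (real, > 0)
    # diagonal of (H^H H)^{-1} is (g11/det, g00/det)
    return [snr * det / g11, snr * det / g00]

def round_robin(num_users, slot):
    """Round-robin: serve users cyclically, ignoring channel state."""
    return slot % num_users

def opportunistic(reported_sinrs):
    """Opportunistic: serve the user reporting the best post-detection SINR."""
    return max(range(len(reported_sinrs)), key=lambda k: reported_sinrs[k])
```

For orthogonal channel columns the ZF receiver pays no noise-enhancement penalty and each stream sees the full snr; correlated columns shrink det and reduce both SINRs, which is the effect multi-user scheduling exploits.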
Abstract: Electromyography (EMG) signal processing has been investigated extensively for various applications, such as rehabilitation systems. In particular, the wavelet transform has served as a powerful technique for scrutinizing EMG signals, since it is consistent with the nature of EMG as a non-stationary signal. In this paper, the efficiency of the wavelet transform for surface EMG feature extraction is investigated over four levels of wavelet decomposition, and a comparative study between different mother wavelets is presented. To identify the best function and level of wavelet analysis, two evaluation criteria, the scatter plot and the RES index, are employed. Four wavelet families, namely Daubechies, Coiflets, Symlets, and Biorthogonal, are studied in the wavelet decomposition stage. The results show that only features from the first and second levels of wavelet decomposition yield good performance, and that some functions of the various wavelet families can improve the class separability of different hand movements.
Abstract: In this contribution, a newly developed e-learning environment is presented, which incorporates intelligent agents and computational intelligence techniques. The new e-learning environment consists of three parts: the E-learning Platform Front-End, the Student Questioner Reasoning, and the Student Model Agent. These parts are distributed geographically across dispersed computer servers, with the main focus on the design and development of these subsystems through the use of new and emerging technologies. The parts are interconnected in an interoperable way, using web services for the integration of the subsystems, in order to enhance the user modelling procedure and achieve the goals of the learning process.
Abstract: This study proposes a materials procurement contract
model in which the zero-cost collar option is applied for hedging price
fluctuation risks in construction. The material contract model is based
on the collar option, which consists of the call option striking zone
of the construction company (the buyer), covering a materials price
increase, and the put option striking zone of the material vendor (the
supplier), covering a materials price decrease. This study first
determined the call option strike price Xc of the construction company
by a simple approach: it uses the predicted profit at the project
starting point and then determines the put option strike price Xp that
has an identical option value, which completes the zero-cost material
contract. The analysis results indicate that the cost saving of the
construction company increased as Xc decreased. This was because the
critical level of the steel materials price increase was set at a low
level. However, as Xc decreased, the Xp of a put option with an
identical option value gradually increased. Cost saving increased as
Xc decreased; however, as Xp gradually increased, the construction
company's risk of loss increased as the steel materials price
decreased. Meanwhile, cost saving did not occur for the construction
company because of volatility. This result originated in the zero-cost
features of the two-way contract of the collar option. In the case of
the regular one-way option, the transaction cost had to be subtracted
from the cost saving. The transaction cost originated from an option
value that fluctuated with the volatility; that is, the cost saving of
the one-way option was affected by the volatility. Meanwhile, even
though the collar option with zero transaction cost cuts the connection
between volatility and cost saving, there is a risk of exercising the
put option.
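The zero-cost construction can be sketched numerically: given the call strike Xc, search for the put strike Xp whose premium matches the call premium, so the net contract cost is zero. The abstract does not specify a pricing model; the sketch below assumes Black-Scholes with illustrative parameters.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def _d1_d2(s, k, t, r, sigma):
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    return d1, d1 - sigma * math.sqrt(t)

def bs_call(s, k, t, r, sigma):
    d1, d2 = _d1_d2(s, k, t, r, sigma)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

def bs_put(s, k, t, r, sigma):
    d1, d2 = _d1_d2(s, k, t, r, sigma)
    return k * math.exp(-r * t) * norm_cdf(-d2) - s * norm_cdf(-d1)

def zero_cost_put_strike(s, xc, t, r, sigma, lo=1e-6):
    """Find Xp so the put premium equals the call premium at strike Xc.
    Uses bisection; the put price is increasing in its strike."""
    target = bs_call(s, xc, t, r, sigma)
    hi = xc
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_put(s, mid, t, r, sigma) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Consistent with the abstract, lowering Xc raises the call premium, which forces Xp upward to keep the contract zero-cost and thereby increases the buyer's downside exposure.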
Abstract: In this paper, a new approach for target recognition based on the empirical mode decomposition (EMD) algorithm of Huang et al. [11] and the energy tracking operator of Teager [13]-[14] is introduced. The conjunction of these two methods is called Teager-Huang analysis. This approach is well suited for the analysis of non-stationary signals. The impulse response (IR) of the target is first band-pass filtered into sub-signals (components) called intrinsic mode functions (IMFs), with well-defined instantaneous frequency (IF) and instantaneous amplitude (IA). Each IMF is a zero-mean AM-FM component. In the second step, the energy of each IMF is tracked using the Teager energy operator (TEO). The IF and IA, useful for describing the time-varying characteristics of the signal, are estimated using the energy separation algorithm (ESA) of Maragos et al. [16]-[17]. In the third step, a set of features such as skewness and kurtosis is extracted from the IF, IA, and IMF energy functions. The Teager-Huang analysis is tested on a set of synthetic IRs of sonar targets with different physical characteristics (density, velocity, shape, etc.). PCA is first applied to the features to discriminate between manufactured and natural targets. The manufactured patterns are then classified into spheres and cylinders. One hundred percent correct recognition is achieved with twenty-three echoes, where sixteen IRs, used for training, are noise-free and seven IRs, used for the testing phase, are corrupted with white Gaussian noise.
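The energy-tracking step can be sketched with the discrete Teager energy operator, Psi[x](n) = x[n]^2 - x[n-1]*x[n+1]. For a pure tone A*cos(Omega*n + phi) this returns the constant A^2*sin^2(Omega) at every sample, which is what makes it useful for tracking the energy of a narrowband IMF.

```python
def teager_energy(x):
    """Discrete Teager energy operator applied to a sampled signal.

    Psi[x](n) = x[n]^2 - x[n-1] * x[n+1], defined for interior samples only,
    so the output is two samples shorter than the input.
    """
    return [x[n] ** 2 - x[n - 1] * x[n + 1] for n in range(1, len(x) - 1)]
```

For AM-FM components the operator responds to both amplitude and frequency modulation, which the ESA then separates into IA and IF estimates.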
Abstract: Modular multiplication is the basic operation
in most public-key cryptosystems, such as RSA, DSA, ECC,
and DH key exchange. Unfortunately, very large operands
(on the order of 1024 or 2048 bits) must be used to provide
sufficient security strength. The use of such big numbers
dramatically slows down the whole cipher system, especially
when running on embedded processors.
So far, customized hardware accelerators, developed on
FPGAs or ASICs, have been the best choice for accelerating
modular multiplication in embedded environments. On the
other hand, many algorithms have been developed to speed
up such operations; examples are the Montgomery modular
multiplication and the interleaved modular multiplication
algorithms. Combining customized hardware with an
efficient algorithm is expected to provide a much faster
cipher system.
This paper introduces an enhanced architecture for computing
the modular multiplication of two large numbers X
and Y modulo a given modulus M. The proposed design is
compared with three previous architectures based on
carry-save adders and look-up tables. Look-up tables must
be loaded with a set of pre-computed values. Our proposed
architecture uses the same carry-save addition but replaces
both the look-up tables and the pre-computations with an
enhanced version of sign detection techniques. The proposed
architecture supports higher frequencies than the other
architectures and also has a better overall absolute time
for a single operation.
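Of the two algorithms named above, interleaved modular multiplication is the simpler to sketch in software: the partial product is doubled and reduced at every bit of the multiplier, so it never grows beyond a few multiples of the modulus. This is a reference sketch of the textbook algorithm, not the paper's hardware design.

```python
def interleaved_modmul(x, y, m):
    """Interleaved modular multiplication: (x * y) mod m, with 0 <= x, y < m.

    Scans the bits of x from most to least significant, keeping the
    partial product p reduced below m after every iteration.
    """
    p = 0
    for i in range(x.bit_length() - 1, -1, -1):
        p <<= 1                  # double the partial product (p < 2m)
        if (x >> i) & 1:
            p += y               # conditionally add the multiplicand (p < 3m)
        if p >= m:
            p -= m               # at most two subtractions restore p < m
        if p >= m:
            p -= m
    return p
```

Because the intermediate value stays below 3m, a hardware realization only needs an adder a couple of bits wider than the modulus, which is what makes the algorithm attractive for FPGA/ASIC implementation.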
Abstract: The utility of expert system generators has been
widely recognized in many applications. Several generators based on
the object-oriented paradigm have recently been proposed. Generators
of object-oriented expert systems (GSEOO) offer languages that are
often complex and difficult to use. We propose in this paper an
extension of the expert system generator JESS that permits friendlier
use of this expert system. The new tool, called VISUAL JESS, brings
two main improvements to JESS. The first improvement concerns ease
of use while keeping the syntax and semantics of the JESS
programming language transparent. The second improvement permits
easy access to and modification of the JESS knowledge base. VISUAL
JESS is implemented so as to be extensible and portable.
Abstract: The purpose of this study is mainly to predict collision
frequency on horizontal tangents combined with vertical curves
using artificial neural network (ANN) methods. The proposed ANN
models are compared with existing regression models. First, the
variables that affect collision frequency were investigated. It was
found that only the annual average daily traffic, section length,
access density, rate of vertical curvature, and the smaller curve
radius before and after the tangent were statistically significant for
the relevant combinations. Second, three statistical models (negative
binomial, zero-inflated Poisson, and zero-inflated negative binomial)
were developed using the significant variables for three alignment
combinations. Third, ANN models were developed by applying the
same variables for each combination. The results clearly show that
the ANN models have lower mean square error values than the
statistical models. Similarly, the AIC values of the ANN models are
smaller than those of the regression models for all combinations.
Consequently, the ANN models have better statistical performance
than the statistical models for estimating collision frequency. The
ANN models presented in this paper are recommended for evaluating
the safety impacts of 3D alignment elements on horizontal tangents.
Abstract: An Eulerian numerical method is proposed to analyze
explosions in tunnels. Based on this method, an original software
package, M-MMIC2D, was developed in the Cµ programming
language. With this software, the explosion problem in a tunnel with
three expansion chambers is numerically simulated, and the results
are found to be in full agreement with the observed experimental data.
Abstract: In films, visual effects have played the role of
expressing realities more realistically or depicting imagined things as
if they were real. Such images are immediated images representing
realism, and the logic of immediation for the reality of images has
been perceived as dominant in visual effects. In order for immediation
to have an identity as immediation, there must be the opposite
concept, hypermediation.
In the mid-2000s, hypermediated images became established as a code
of mass culture in Asia. Thus, among Asian films highly popular in
those days, this study selected five displaying hypermediated images
(two Korean, two Japanese, and one Thai) and examined the semiotic
meanings of such images using Roland Barthes' analysis of direct and
implied meaning and Metz's paradigmatic analysis, focusing on how
hypermediated images work in the general context of the films, how
they are associated with spaces, and what meanings they try to carry.
Abstract: The Feeder is one of the airships of the Multibody Advanced Airship for Transport (MAAT) system, under development within the EU FP7 project. MAAT is based on a modular concept composed of two different parts that can join: the so-called Cruiser and Feeder, both designed on the lighter-than-air principle. The Feeder, also named ATEN (Airship Transport Elevator Network), is the smaller of the two and joins the bigger one, the Cruiser, also named PTAH (Photovoltaic modular Transport Airship for High altitude); this joining is envisaged to happen at 15 km altitude. During the MAAT design phase, the aerodynamics of both airships and their interactions are analyzed. The objective of these studies is to understand the aerodynamic behavior of all the preselected configurations, as an important element in the overall MAAT system design. Most of these configurations are only simulated by CFD, while the most feasible one is also analyzed experimentally in order to validate and build confidence in the CFD predictions. This paper presents the numerical and experimental investigation of the Feeder "conical-like" shape configuration. The experiments focus on the aerodynamic force coefficients and the pressure distribution over the Feeder's outer surface, while the numerical simulations also cover the analysis of the velocity and pressure distributions. Finally, the wind tunnel experiment is compared with its CFD model in order to validate such specific simulations against the respective experiments and to better understand the differences between wind tunnel and in-flight conditions.