Abstract: In this work we present a bifurcation analysis of a single-compartment representation of the Traub model, one of the most important conductance-based neuron models. The analysis focuses on two principal parameters: injected current and leakage conductance. Stable and unstable solutions are explored, and the Hopf bifurcation and its frequency interpretation as the current varies are examined. This study enables control of the neuron's dynamics and of its response when these parameters change. Such analysis is particularly important for several applications, such as tuning parameters during learning, testing neuron excitability, and measuring the bursting properties of the neuron. Finally, a hardware implementation was developed to corroborate these results.
Abstract: The increasing availability of information about Earth surface elevation (Digital Elevation Models, DEMs) generated from different sources (remote sensing, aerial images, LiDAR) raises the question of how to integrate this huge amount of data and make it available to the widest possible audience. To exploit the potential of 3D elevation representation, the quality of data management plays a fundamental role. Due to high acquisition costs and the huge amount of generated data, high-resolution terrain surveys tend to be small or medium sized and available only for limited portions of the Earth. Hence the need to merge the large-scale height maps that are typically available for free at worldwide level with very specific high-resolution datasets. On the other hand, the third dimension improves the user experience and the quality of data representation, unlocking new possibilities in data analysis for civil protection, real estate, urban planning, environmental monitoring, etc. Open-source 3D virtual globes, a trending topic in Geovisual Analytics, aim at improving the visualization of geographical data provided by standard web services or in proprietary formats. Typically, however, 3D virtual globes do not offer an open-source tool that allows the generation of a terrain elevation data structure starting from heterogeneous-resolution terrain datasets. This paper describes a technological solution aimed at setting up a so-called “Terrain Builder”. This tool is able to merge heterogeneous-resolution datasets and to provide a multi-resolution worldwide terrain service fully compatible with CesiumJS and therefore accessible via the web using a traditional browser without any additional plug-in.
Abstract: The myoelectric control system is the fundamental component of modern prostheses; it uses the myoelectric signals from an individual’s muscles to control the prosthesis movements. The surface electromyogram (sEMG), being noninvasive, has been used as an input to prosthesis controllers for many years. Recent technological advances have led to the development of implantable myoelectric sensors, which enable the internal myoelectric signal (MES) to be used as an input to these controllers. Intramuscular measurement can provide focal recordings from deep muscles of the forearm and independent signals relatively free of crosstalk, thus allowing for more independent control sites. However, little work has been done to compare the two inputs. In this paper we compare the classification accuracy of six pattern-recognition-based myoelectric controllers that use surface myoelectric signals recorded with untargeted (symmetric) surface electrode arrays to the same controllers with multichannel intramuscular myoelectric signals from targeted intramuscular electrodes as inputs. There was no significant improvement in classification accuracy from the intramuscular EMG measurement technique compared to the results acquired using the surface EMG measurement technique. High classification accuracy (99%) could be achieved by optimally selecting only five channels of surface EMG.
Abstract: Scripts are one of the basic text resources for understanding broadcast content. Topic modeling is a method for summarizing broadcast content from its scripts. Generally, scripts describe content through directions and speech, and provide scene segments that can be seen as semantic units. Therefore, a script can be topic-modeled by treating each scene segment as a document. However, because scene segments consist mainly of speech, relatively few word co-occurrences are observed within them, which inevitably degrades the quality of topics learned by statistical methods. To tackle this problem, we propose a method to improve topic quality with additional word co-occurrence information obtained from scene similarities. The main idea is that knowing that two or more texts are topically related is useful for learning high-quality topics and, in turn, more accurate topical representations yield more accurate information about whether two texts are related. In this paper, we regard two scene segments as related if their topical similarity is high enough, and we consider words to co-occur if they appear together in topically related scene segments. By iteratively inferring topics and determining semantically neighboring scene segments, we derive a topic space that represents broadcast content well. In our experiments, the proposed method generated higher-quality topics from Korean drama scripts than the baselines.
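As an illustration of the relatedness test described above, a minimal sketch: the abstract does not specify the similarity measure, so cosine similarity over per-scene topic distributions and the threshold `tau` are assumptions.

```python
import numpy as np

def related_pairs(theta, tau=0.8):
    """Return index pairs of scene segments whose topic distributions
    have cosine similarity >= tau (assumed relatedness criterion)."""
    unit = theta / np.linalg.norm(theta, axis=1, keepdims=True)
    sim = unit @ unit.T  # pairwise cosine similarities
    n = len(theta)
    return [(i, j) for i in range(n) for j in range(i + 1, n) if sim[i, j] >= tau]
```

Words from scenes appearing in the same related pair would then be counted as co-occurring when topics are re-estimated in the next iteration.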
Abstract: Digital images are widely used in computer applications. Storing or transmitting uncompressed images requires considerable storage capacity and transmission bandwidth. Image compression is a means of transmitting or storing visual data in the most economical way. This paper explains how images can be encoded for transmission over a multiplexed time-frequency-domain channel. Multiplexing involves packing together signals whose representations are compact in the working domain. To optimize transmission resources, each 4 × 4 pixel block of the image is transformed, by a suitable polynomial approximation, into a minimal number of coefficients. Using fewer than 4 × 4 coefficients per block saves a significant amount of transmitted information, but some information is lost. Different approximations for the image transformation have been evaluated: polynomial representation (Vandermonde matrix), least squares with gradient descent, 1-D Chebyshev polynomials, 2-D Chebyshev polynomials, and singular value decomposition (SVD). Results have been compared in terms of nominal compression rate (NCR), compression ratio (CR) and peak signal-to-noise ratio (PSNR), in order to minimize the error function defined as the difference between the original pixel gray levels and the approximated output. The coefficients are then encoded and used to generate chirps, at a target rate of about two chirps per 4 × 4 pixel block, which are submitted to a multiplexing operation in the time-frequency domain.
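To illustrate one of the evaluated transforms, a minimal sketch of approximating a 4 × 4 block by a truncated SVD and scoring it with PSNR; the block values, rank choice, and 255 peak level are illustrative assumptions, not the paper's actual encoder.

```python
import numpy as np

def compress_block(block, rank=1):
    # Truncated SVD of a 4x4 block: keep only `rank` singular triplets.
    u, s, vt = np.linalg.svd(block.astype(float))
    return u[:, :rank], s[:rank], vt[:rank, :]

def reconstruct(u, s, vt):
    # Rebuild the block from the kept triplets.
    return (u * s) @ vt

def psnr(orig, approx, peak=255.0):
    # Peak signal-to-noise ratio in dB.
    mse = np.mean((np.asarray(orig, float) - approx) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak * peak / mse)
```

A rank-1 approximation keeps one 4-vector pair plus one singular value (9 numbers instead of 16); increasing the rank trades compression for fidelity.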
Abstract: Statistical study has become indispensable in many fields of knowledge, and Geotechnics is no different: probabilistic and statistical methods have gained traction for characterizing the uncertainties inherent in soil properties. One situation engineers constantly face is the definition of a probability distribution that adequately represents the sampled data. To discard poorly fitting distributions, goodness-of-fit tests are necessary. In this paper, three non-parametric goodness-of-fit tests are applied to a computationally generated data set to assess their fit to a series of known distributions. It is shown that the normal distribution does not always provide satisfactory results regarding the physical and behavioral representation of the modeled parameters.
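The abstract does not name the three tests; as an illustration, a minimal Kolmogorov–Smirnov-style comparison of a normal and a lognormal fit on a synthetic, positively skewed "soil parameter" sample (all numbers are hypothetical).

```python
import math
import random

def norm_cdf(x, mu, sigma):
    # CDF of the normal distribution via the error function.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic(sample, cdf):
    # Max distance between the empirical CDF and a candidate CDF.
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, (i + 1) / n - f, f - i / n)
    return d

random.seed(7)
# Hypothetical positively skewed sample, as soil strength parameters often are.
sample = [random.lognormvariate(3.0, 0.8) for _ in range(1000)]

# Candidate 1: normal distribution fitted by moments.
mu = sum(sample) / len(sample)
sd = (sum((x - mu) ** 2 for x in sample) / len(sample)) ** 0.5
d_norm = ks_statistic(sample, lambda x: norm_cdf(x, mu, sd))

# Candidate 2: lognormal distribution fitted on the log scale.
logs = [math.log(x) for x in sample]
lmu = sum(logs) / len(logs)
lsd = (sum((v - lmu) ** 2 for v in logs) / len(logs)) ** 0.5
d_lognorm = ks_statistic(sample, lambda x: norm_cdf(math.log(x), lmu, lsd))
```

The smaller KS distance indicates the better-fitting family; for skewed data the lognormal fit typically wins, echoing the abstract's caution about defaulting to the normal distribution.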
Abstract: In this paper, we discuss some properties of the left spectrum and give a representation of linear maps preserving the left spectrum of diagonal quaternionic matrices.
Abstract: Carbon dioxide is one of the major greenhouse gas (GHG) contributors, and it is an obligation of industry to reduce carbon dioxide emissions to acceptable limits. Extensive research has been reported in the past, yet the quest for a suitable and economical absorber for carbon dioxide removal still needs to be explored. Amino acids are potential alternative solvents for carbon dioxide capture from gaseous streams, owing to their resistance to oxidative degradation, low volatility, and ionic structure. In addition, introducing a promoter such as piperazine to an amino acid helps to further enhance solubility. In this work, the effect of piperazine on the thermophysical properties and solubility of aqueous β-alanine solutions was studied at various concentrations. The measured physicochemical property data were correlated as a function of temperature using the least-squares method, and the correlation parameters are reported together with their respective standard deviations. The effect of the activator piperazine on the CO2 loading performance of the selected amino acid under high-pressure conditions (1 bar to 10 bar) over the temperature range (30 to 60) °C was also studied. The solubility of CO2 decreases with increasing temperature and increases with increasing pressure. A quadratic representation of solubility using Response Surface Methodology (RSM) shows that the most important parameter for optimizing solubility is the system pressure. The addition of the promoter increases the solubility effect of the solvent.
Abstract: The ASEAN Economic Community (AEC) is the goal of regional economic integration by 2015. In the region, tourism is an important activity, especially as a source of foreign currency, employment creation and income for the region. Given the complexity of the issues entailed by the concept of sustainable tourism, this paper tries to assess tourism sustainability within ASEAN, based on a number of quantitative indicators for all ten economies: Thailand, Myanmar, Laos, Vietnam, Malaysia, Singapore, Indonesia, the Philippines, Cambodia, and Brunei. The methodological framework provides a number of benchmarks of tourism activities in these countries. These include the identification of dimensions (for example, economic, socio-ecological, and infrastructure) and indicators, the method of scaling, chart representation, and evaluation across the ASEAN countries. This specification shows that a similar level of tourism activity might reflect different implementations of tourism activity and might have different consequences for the socio-ecological environment and sustainability. The heterogeneity of the developing countries briefly exposed here is useful for detecting and preparing to cope with the main problems of each country in its tourism activities, as well as for the competitiveness and value creation of tourism for the ASEAN Economic Community, also in comparison with other parts of the world.
Abstract: The Com-Poisson (CMP) model is one of the most popular discrete generalized linear models (GLMs), as it handles equi-, over- and under-dispersed data. In the longitudinal context, an integer-valued autoregressive (INAR(1)) process that incorporates covariate specification has been developed to model longitudinal CMP counts. However, the joint CMP likelihood function is difficult to specify, which restricts likelihood-based estimation. The joint generalized quasi-likelihood approach (GQL-I) was considered instead, but it is rather computationally intensive and may even fail to estimate the regression effects due to a complex and frequently ill-conditioned covariance structure. This paper proposes a new GQL approach for estimating the regression parameters (GQL-III) that is based on a single score vector representation. The performance of GQL-III is compared with GQL-I and separate marginal GQLs (GQL-II) through simulation experiments and is shown to yield estimates as efficient as GQL-I while being far more computationally stable.
Abstract: This paper presents a case study of using a STATCOM to enhance the performance of the Al-Qatraneh 33-kV transmission line. The location of the STATCOM was identified by maintaining minimum voltage drops at the 110 load nodes. The transmission line and the 110 load nodes were modeled in MATLAB/Simulink. The suggested STATCOM and its location will increase the transmission capability of this line and overcome the overload expected in the year 2020. The annual percentage loading rise was taken as 14.35%. A graphical representation of the line-to-line voltages and the voltage drops at the different load nodes is presented.
Abstract: Non-linear dynamic time history analysis is considered the most advanced and comprehensive analytical
method for evaluating the seismic response and performance of
multi-degree-of-freedom building structures under the influence of
earthquake ground motions. However, effective and accurate
application of the method requires the implementation of advanced
hysteretic constitutive models of the various structural components
including masonry infill panels. Sophisticated computational research
tools that incorporate realistic hysteresis models for non-linear
dynamic time-history analysis are not popular among the professional
engineers as they are not only difficult to access but also complex and
time-consuming to use. In addition, commercial computer programs
for structural analysis and design that are acceptable to practicing
engineers do not generally integrate advanced hysteretic models
which can accurately simulate the hysteresis behavior of structural
elements with a realistic representation of strength degradation,
stiffness deterioration, energy dissipation and ‘pinching’ under cyclic
load reversals in the inelastic range of behavior. In this scenario, “push-over” or non-linear static analysis methods have gained significant popularity: they offer a practical and efficient alternative for rationally evaluating seismic demands while avoiding the complexities and difficulties associated with non-linear dynamic time-history analysis. The present paper is
based on the analytical investigation of the effect of distribution of
masonry infill panels over the elevation of planar masonry infilled
reinforced concrete [R/C] frames on the seismic demands using the
capacity spectrum procedures implementing nonlinear static analysis
[pushover analysis] in conjunction with the response spectrum
concept. An important objective of the present study is to numerically
evaluate the adequacy of the capacity spectrum method using
pushover analysis for performance based design of masonry infilled
R/C frames for near-field earthquake ground motions.
Abstract: Wireless Sensor Networks (WSNs), which sense
environmental data with battery-powered nodes, require multi-hop
communication. This power-demanding task adds an extra workload
that is unfairly distributed across the network. As a result, nodes run
out of battery at different times: this requires an impractical
individual node maintenance scheme. Therefore we investigate a new
Cooperative Sensing approach that extends the WSN operational life
and allows a more practical network maintenance scheme (where all
nodes deplete their batteries almost at the same time). We propose a
novel cooperative algorithm that derives a piecewise representation
of the sensed signal while controlling approximation accuracy.
Simulations show that our algorithm increases WSN operational life
and spreads the communication workload evenly. The results convey a counterintuitive conclusion: distributing the workload fairly amongst nodes can extend the WSN operational life even without decreasing the overall network power consumption. This is achieved because our cooperative
approach decreases the workload of the most burdened cluster in the
network.
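The piecewise representation with controlled approximation accuracy can be illustrated by a greedy piecewise-linear fit with a max-error bound; the actual cooperative, distributed algorithm is not reproduced here, and `eps` and the greedy endpoint extension are assumptions.

```python
import numpy as np

def piecewise_linear(signal, eps):
    """Greedy segmentation: extend each segment while the straight line
    through its endpoints stays within eps of every covered sample."""
    n = len(signal)
    breakpoints = [0]
    i = 0
    while i < n - 1:
        j = i + 1
        while j + 1 < n:
            # Candidate segment covers samples i..j+1.
            x = np.arange(i, j + 2)
            line = np.interp(x, [i, j + 1], [signal[i], signal[j + 1]])
            if np.max(np.abs(signal[i:j + 2] - line)) > eps:
                break  # extending further would violate the error bound
            j += 1
        breakpoints.append(j)
        i = j
    return breakpoints
```

A node (or cluster head) would then transmit only the breakpoint indices and values, trading a bounded reconstruction error for far fewer radio transmissions.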
Abstract: Detecting changes in multiple images of the same
scene has recently seen increased interest due to the many
contemporary applications including smart security systems, smart
homes, remote sensing, surveillance, medical diagnosis, weather
forecasting, speed and distance measurement, post-disaster forensics
and much more. These applications differ in the scale, nature, and
speed of change. This paper presents an application of image
processing techniques to implement a real-time change detection
system. Change is identified by comparing the RGB representation of
two consecutive frames captured in real-time. The detection threshold
can be controlled to account for various luminance levels. The
comparison result is passed through a filter before decision making to
reduce false positives, especially under lower-luminance conditions. The system is implemented with a MATLAB graphical user interface with several controls to manage its operation and performance.
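A minimal sketch of the frame-comparison step: the described system is MATLAB-based, so this Python/NumPy version, the max-channel difference, and the 3×3 majority filter standing in for the decision filter are illustrative assumptions.

```python
import numpy as np

def detect_change(prev, curr, threshold=30):
    """Compare two RGB frames; return a boolean change mask.
    `threshold` plays the role of the tunable detection threshold."""
    # Per-pixel maximum channel difference (frames are HxWx3 uint8).
    diff = np.abs(prev.astype(np.int16) - curr.astype(np.int16)).max(axis=2)
    raw = diff > threshold
    # 3x3 majority vote suppresses isolated false positives.
    p = np.pad(raw, 1).astype(np.int8)
    votes = sum(p[di:di + raw.shape[0], dj:dj + raw.shape[1]]
                for di in range(3) for dj in range(3))
    return votes >= 5
```

Raising `threshold` makes the detector less sensitive, which is how varying luminance levels could be accommodated.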
Abstract: In this study, we examine some spectral properties of non-selfadjoint matrix-valued difference equations admitting a polynomial-type Jost solution. The aim is to investigate the eigenvalues and spectral singularities of the difference operator L generated by the above-mentioned difference equation. First, thanks to the representation of the polynomial-type Jost solution of this equation, we obtain its asymptotics and some analytical
properties. Then, using the uniqueness theorems of analytic functions,
we guarantee that the operator L has a finite number of eigenvalues
and spectral singularities.
Abstract: Carefully scheduling the operation of pumps can result in significant energy savings. Schedules can be defined either implicitly, in terms of other elements of the network such as tank levels, or explicitly, by specifying the times during which each pump is on or off. In this study, two new explicit representations based on time-controlled triggers were analyzed, where the maximum number of pump switches is established beforehand and the schedule may contain fewer switches than this maximum. The optimal operation of the pumping stations was determined using a Jumping Particle Swarm Optimization (JPSO) algorithm to achieve the minimum energy cost. The model integrates the JPSO optimizer with the EPANET hydraulic network solver. The optimal pump operation schedule of the VanZyl water distribution system was determined using the proposed model and compared with those obtained from Genetic and Ant Colony algorithms. The results indicate that the proposed model utilizing the JPSO algorithm is a versatile management model for the operation of real-world water distribution systems.
Abstract: Healthcare safety is perceived as important, and preventing troubles in healthcare processes is essential for it. Trouble prevention is based on trouble prediction using accumulated knowledge of processes, troubles, and countermeasures. However, information on troubles has not been accumulated in hospitals in an appropriate structure, and it has not been utilized effectively to prevent troubles. In a previous study, although a detailed knowledge acquisition process for trouble prediction was proposed, a knowledge base for countermeasures was not involved. In this paper, we propose the structure of a knowledge base for countermeasures within the knowledge acquisition process for trouble prediction in healthcare processes. We first design the structure of countermeasures and propose a knowledge representation form for countermeasures. Then, we evaluate the validity of the proposal by applying it in an actual hospital.
Abstract: This study aims to increase understanding of the
transition of business models in servitization. The significance of
service in all business has increased dramatically during the past
decades. Service-dominant logic (SDL) describes this change in the
economy and questions the goods-dominant logic on which business
has primarily been based in the past. The business model canvas is one of the most cited and used tools for defining and developing business models. The starting point of this paper lies in the notion that the traditional business model canvas is inherently goods-oriented and best suited to product-based business. However, the basic differences
between goods and services necessitate changes in business model
representations when proceeding in servitization. Therefore, new
knowledge is needed on how the conception of business model and
the business model canvas as its representation should be altered in
servitized firms in order to better serve business developers and inter-firm co-creation. That is to say, compared to products, services are
intangible and they are co-produced between the supplier and the
customer. Value is always co-created in interaction between a
supplier and a customer, and customer experience primarily depends
on how well the interaction succeeds between the actors. The role of
service experience is even stronger in service business compared to
product business, as services are co-produced with the customer. This paper provides business model developers with a service
business model canvas, which takes into account the intangible,
interactive, and relational nature of service. The study employs a
design science approach that contributes to theory development via
design artifacts. This study utilizes qualitative data gathered in
workshops with ten companies from various industries. In particular,
key differences between goods-dominant logic (GDL) and SDL-based business models are identified when an industrial firm
proceeds in servitization. As a result of the study, an updated version of the business
model canvas is provided based on service-dominant logic. The
service business model canvas ensures a stronger customer focus and
includes aspects salient for services, such as interaction between
companies, service co-production, and customer experience. It can be
used for the analysis and development of a current service business
model of a company or for designing a new business model. It
facilitates customer-focused new service design and service
development. It aids in the identification of development needs, and
facilitates the creation of a common view of the business model.
Therefore, the service business model canvas can be regarded as a
boundary object, which facilitates the creation of a common
understanding of the business model between several actors involved.
The study contributes to the business model and service business
development disciplines by providing a managerial tool for
practitioners in service development. It also provides research insight
into how servitization challenges companies’ business models.
Abstract: Web mining aims to discover and extract useful information. Different users may have different search goals when they submit queries to a search engine. Inferring and analyzing user search goals can be very useful for improving the results returned for a user's query. In this project, we propose a novel approach to infer user search goals by analyzing search engine query logs. First, feedback sessions are constructed from user click-through logs; they efficiently reflect users' information needs. Second, we propose a preprocessing technique to clean unnecessary data from the web log file (feedback sessions). Third, we propose a technique to generate pseudo-documents as representations of feedback sessions for clustering. Finally, we implement the k-medoids clustering algorithm to discover different user search goals and to provide a more optimal result for a search query based on the user's feedback sessions.
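For illustration, a self-contained k-medoids (PAM-style) routine over a precomputed distance matrix, as it might be applied to pairwise pseudo-document distances; the distance metric, initialization, and update rule are assumptions, not the project's exact implementation.

```python
import numpy as np

def k_medoids(dist, k, max_iter=100, seed=0):
    """Cluster n items given an (n, n) pairwise distance matrix.
    Returns the medoid indices and a label per item."""
    rng = np.random.default_rng(seed)
    n = dist.shape[0]
    medoids = rng.choice(n, size=k, replace=False)
    for _ in range(max_iter):
        # Assign each item to its nearest medoid.
        labels = np.argmin(dist[:, medoids], axis=1)
        new_medoids = medoids.copy()
        for j in range(k):
            members = np.where(labels == j)[0]
            if members.size == 0:
                continue  # keep the old medoid for an empty cluster
            # New medoid minimizes total distance within the cluster.
            costs = dist[np.ix_(members, members)].sum(axis=0)
            new_medoids[j] = members[np.argmin(costs)]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    labels = np.argmin(dist[:, medoids], axis=1)
    return medoids, labels
```

Unlike k-means, k-medoids only needs distances, which suits text clustering where pseudo-documents are compared with, say, cosine or edit distances; each cluster's medoid can then stand in for one inferred search goal.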
Abstract: One of the most critical decision points in the design of a face recognition system is the choice of an appropriate face representation. Effective feature descriptors are expected to convey sufficient, invariant and non-redundant facial information. In this work we propose a set of Hahn moments as a new approach to feature description. Hahn moments have been widely used in image analysis due to their invariance, non-redundancy and their ability to extract features both globally and locally. To assess the applicability of Hahn moments to face recognition, we conduct two experiments on the Olivetti Research Laboratory (ORL) database and the University of Notre Dame (UND) X1 biometric collection. The fusion of global features with features from local facial regions is used as input to a conventional k-NN classifier. The method reaches an accuracy of 93% correctly recognized subjects on the ORL database and 94% on the UND database.