Abstract: The dynamics of a Bertrand duopoly game are analyzed, where the players use different production methods and choose their prices with bounded rationality. The equilibria of the corresponding discrete dynamical system are investigated, and the stability conditions of the Nash equilibrium under a local adjustment process are studied. As some parameters of the model are varied, the Nash equilibrium loses stability and complex dynamics such as higher-order cycles and chaos arise. On this basis, we show that increasing the adjustment speed of the boundedly rational player can drive the Bertrand market into a chaotic state. Finally, the complex dynamics, bifurcations and chaos are illustrated by numerical simulation.
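The kind of gradient-based price adjustment described above can be illustrated with a minimal sketch; the linear demand, the costs and the bounded-rationality update rule below are illustrative assumptions, not the paper's exact model.

```python
# Illustrative bounded-rationality Bertrand price-adjustment map.
# Demand q_i = a - b*p_i + d*p_j and marginal cost c_i are assumed
# for this sketch; they are not the paper's exact specification.
a, b, d = 10.0, 1.0, 0.5   # demand parameters (assumed)
c1, c2 = 1.0, 1.0          # marginal costs (assumed)

def step(p1, p2, alpha):
    """One iteration: each player adjusts its price along its marginal profit."""
    mp1 = a - 2*b*p1 + d*p2 + b*c1   # d(profit_1)/d(p1)
    mp2 = a - 2*b*p2 + d*p1 + b*c2
    return p1 + alpha*p1*mp1, p2 + alpha*p2*mp2

def orbit(alpha, n=2000, p=(2.0, 2.5)):
    """Iterate the map n times from an initial price pair."""
    for _ in range(n):
        p = step(p[0], p[1], alpha)
    return p
```

For a small adjustment speed the map settles near the Nash equilibrium (here p* = 11/1.5 for the assumed parameters); raising alpha eventually destabilizes it through period doubling toward chaos, which is the phenomenon the abstract refers to.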
Abstract: Medical image segmentation based on image smoothing followed by edge detection is of great importance in image processing. This paper proposes a novel algorithm for medical image segmentation based on robust smoothing, achieved by identifying the type of noise present, followed by edge detection, which is a boon to medical image diagnosis. The algorithm takes a medical image as input, preprocesses it to remove the noise content by applying a suitable filter after identifying the noise type, and finally carries out edge detection for segmentation. The algorithm consists of three parts. First, the type of noise present in the medical image is identified as additive, multiplicative or impulsive by analysis of local histograms, and the image is denoised with a Median, Gaussian or Frost filter accordingly. Second, edge detection on the filtered medical image is carried out using the Canny edge detection technique. Third, the edge-detected medical image is segmented by the method of normalized-cut eigenvectors. The method is validated through experiments on real images, with the algorithm simulated on the MATLAB platform. The simulation results show that the proposed algorithm is very effective, can deal with low-quality or marginally vague images with high spatial redundancy, low contrast and considerable noise, and has potential for practical use in medical image diagnosis.
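The first part of the pipeline (noise-type identification followed by a matching filter) can be sketched roughly as follows; the extreme-pixel heuristic, the 3x3 median filter and the separable Gaussian blur are illustrative simplifications introduced here (the Frost filter and the Canny/normalized-cut stages are omitted).

```python
import numpy as np

def classify_noise(img, extreme_frac=0.05):
    """Crude noise-type check (illustrative): a large share of saturated
    pixels suggests impulsive (salt-and-pepper) noise; otherwise assume
    additive noise. Intensities are expected in [0, 1]."""
    frac = np.mean((img <= 0.02) | (img >= 0.98))
    return "impulsive" if frac > extreme_frac else "additive"

def median3(img):
    """3x3 median filter built from stacked shifted copies (edges padded)."""
    p = np.pad(img, 1, mode="edge")
    stack = np.stack([p[i:i+img.shape[0], j:j+img.shape[1]]
                      for i in range(3) for j in range(3)])
    return np.median(stack, axis=0)

def gauss(img, sigma=1.0):
    """Separable Gaussian blur with a truncated kernel."""
    r = int(3*sigma)
    x = np.arange(-r, r+1)
    k = np.exp(-x**2 / (2*sigma**2))
    k /= k.sum()
    p = np.pad(img, r, mode="edge")
    out = np.apply_along_axis(lambda v: np.convolve(v, k, "valid"), 0, p)
    out = np.apply_along_axis(lambda v: np.convolve(v, k, "valid"), 1, out)
    return out

def denoise(img):
    """Pick the filter according to the detected noise type."""
    return median3(img) if classify_noise(img) == "impulsive" else gauss(img)
```

A median filter removes isolated impulses that a Gaussian blur would merely smear, which is why the noise type is identified before choosing the filter.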
Abstract: AAM has been successfully applied to face alignment,
but its performance is very sensitive to initial values. If the
initial values are far from the global optimum, AAM-based face
alignment is quite likely to converge to a local minimum. In this
paper, we propose a progressive AAM-based face alignment algorithm
which first finds the parameter vector fitting the inner facial
feature points of the face and then localizes the feature points of
the whole face using this information. The proposed algorithm
exploits the fact that the feature points of the inner part of the
face are less variable and less affected by the background
surrounding the face than those of the outer part (such as the chin
contour). The algorithm consists of two stages: a modeling and
relation derivation stage and a fitting stage. The first stage
constructs two AAM models, an inner-face model and a whole-face
model, and then derives the relation matrix between the inner-face
AAM parameter vector and the whole-face AAM parameter vector. In
the fitting stage, the algorithm aligns the face progressively in
two phases. In the first phase, it finds the parameter vector
fitting the inner-face AAM model to a new input face image; in the
second phase, it localizes the whole facial feature points of the
image with the whole-face AAM model, using the initial parameter
vector estimated from the inner-face parameter vector obtained in
the first phase and the relation matrix obtained in the first
stage. Experiments verify that the proposed progressive AAM-based
face alignment algorithm is more robust with respect to pose,
illumination and face background than the conventional basic
AAM-based face alignment algorithm.
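The relation-derivation idea of stage one can be sketched as a least-squares mapping between paired parameter vectors; the vector lengths, the training-set size and the synthetic data below are illustrative assumptions, not the paper's actual AAM parameterization.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative dimensions (assumed): inner-face AAM parameter vectors of
# length 8, whole-face vectors of length 12, 200 training face pairs.
P_inner = rng.standard_normal((200, 8))
R_true = rng.standard_normal((8, 12))
P_whole = P_inner @ R_true + 0.01 * rng.standard_normal((200, 12))

# Stage 1 (relation derivation): least-squares relation matrix mapping
# inner-face parameters to whole-face parameters over the training pairs.
R, *_ = np.linalg.lstsq(P_inner, P_whole, rcond=None)

# Stage 2 (fitting): after the inner-face AAM converges on a new image,
# its parameter vector seeds the whole-face AAM's initial values.
p_inner_fit = rng.standard_normal(8)
p_whole_init = p_inner_fit @ R
```

Seeding the whole-face fit with `p_whole_init` rather than the mean shape is what moves the second-phase optimization closer to the global optimum.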
Abstract: A model of neural networks on the small-world
topology, with both metric (local) and random connectivity, is investigated.
The synaptic weights are random, driving the network towards a
chaotic state for the neural activity. An ordered macroscopic neuron
state is induced by a bias in the network connections. When the
connections are mainly local, the network emulates a block-like
structure. It is found that the topology and the bias compete to
influence the network to evolve into a global or a block activity
ordering, according to the initial conditions.
Abstract: In recent years multi-agent systems have emerged as one of the interesting architectures facilitating distributed collaboration and distributed problem solving. Each node (agent) of the network might pursue its own agenda, exploit its environment, develop its own problem-solving strategy and establish the required communication strategies. Within each node of the network, one could encounter a diversity of problem-solving approaches. Quite commonly, the agents realize their processing at the level of information granules that is most suitable from their local points of view. Information granules can come at various levels of granularity. Each agent could exploit a certain formalism of information granulation, engaging the machinery of fuzzy sets, interval analysis or rough sets, to name a few dominant technologies of granular computing. With this in mind, a fundamental issue arises of how to form effective interaction linkages between the agents so that they fully broadcast their findings and benefit from interacting with others.
Abstract: It has been said that "the network is the system".
This implies providing levels of service, reliability, predictability
and availability that are commensurate with, or better than, those
that individual computers provide today. Achieving this requires
integrated network management for interconnected networks of
heterogeneous devices covering the local campus. In this paper we
present a framework to deal effectively with this issue. It consists
of the components, and the interactions between them, that are
required to perform service fault management. A real-world scenario
is used to derive the requirements, which have been applied to the
component identification. An analysis of existing frameworks and
approaches with respect to their applicability to the framework is
also carried out.
Abstract: A continuum model is presented to study the effect of
van der Waals (vdW) interaction on the buckling analysis of
multi-walled carbon nanotubes. Previous studies considered only the
vdW interaction between two adjacent layers and neglected the vdW
interaction between the other layers. The results show that the vdW
interaction coefficients depend on the change of interlayer spacing
and on the radii of the tubes; as the radii increase, the vdW
coefficients approach a constant value. The numerical results show
that the effect of the vdW interaction on the critical strain for a
double-walled CNT is negligible when the radius is large enough,
both before and after buckling.
Abstract: The quality of short-term load forecasting (STLF) can improve the efficiency of planning and operation of electric utilities. Artificial Neural Networks (ANNs) are employed for nonlinear short-term load forecasting owing to their powerful nonlinear mapping capabilities. At present, there is no systematic methodology for the optimal design and training of an artificial neural network; one often has to resort to trial and error. This paper describes the process of developing three-layer feed-forward large neural networks for short-term load forecasting and then presents a heuristic search algorithm for an important task in this process, namely optimal network structure design. Particle Swarm Optimization (PSO) is used to develop the optimal large neural network structure and connection weights for the one-day-ahead electric load forecasting problem. PSO is a random optimization method based on swarm intelligence with a powerful global optimization capability. Employing PSO in the design and training of ANNs allows the ANN architecture and parameters to be optimized easily. The proposed method is applied to STLF for a local utility. Data are clustered according to differences in their characteristics, and special days are extracted from the normal training sets and handled separately; in this way, a solution is provided for all load types, including working days, weekends and special days. The experimental results show that the proposed PSO-based method speeds up the learning of the network and improves the forecasting precision compared with the conventional Back Propagation (BP) method. Moreover, it is simple to compute, practical and effective; it provides greater accuracy in many cases and consistently gives lower percentage errors for the STLF problem than the BP method. Thus, it can be applied to automatically design an optimal load forecaster from historical data.
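The PSO-based training described above can be sketched as follows; the toy series, the network size and the swarm settings are illustrative assumptions, and for brevity this sketch optimizes only the connection weights of a fixed small network rather than the full structure design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "load" curve: fit y = f(x) with a one-hidden-layer network whose
# weights are encoded as a flat particle position vector (illustrative).
X = np.linspace(0, 1, 40).reshape(-1, 1)
y = np.sin(2*np.pi*X).ravel()

H = 6                       # hidden units (assumed, small for the sketch)
DIM = H + H + H + 1         # W1, b1, W2, b2 flattened

def forward(w, X):
    """Decode a particle into network weights and run the network."""
    W1 = w[:H].reshape(1, H); b1 = w[H:2*H]
    W2 = w[2*H:3*H].reshape(H, 1); b2 = w[3*H]
    return (np.tanh(X @ W1 + b1) @ W2).ravel() + b2

def mse(w):
    return np.mean((forward(w, X) - y)**2)

def pso(n=30, iters=300, inertia=0.7, c1=1.5, c2=1.5):
    """Standard global-best PSO minimizing the network's training MSE."""
    pos = rng.uniform(-2, 2, (n, DIM))
    vel = np.zeros((n, DIM))
    pbest = pos.copy()
    pcost = np.array([mse(p) for p in pos])
    g = pbest[pcost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n, DIM))
        vel = inertia*vel + c1*r1*(pbest - pos) + c2*r2*(g - pos)
        pos = pos + vel
        cost = np.array([mse(p) for p in pos])
        better = cost < pcost
        pbest[better] = pos[better]
        pcost[better] = cost[better]
        g = pbest[pcost.argmin()].copy()
    return g, pcost.min()

best_w, best_cost = pso()
```

Because PSO needs only fitness evaluations, not gradients, the same loop can score candidate network structures as easily as candidate weight vectors, which is what makes it attractive for the structure-design task the abstract describes.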
Abstract: A mathematical model for the transmission of SARS is developed. In addition to dividing the population into susceptible (high and low risk), exposed, infected, quarantined, diagnosed and recovered classes, we have included a class called untraced. The model simulates the Gompertz curves which best represent the cumulative numbers of probable SARS cases in Hong Kong and Singapore. The parameter values that produce the best fit to the observed data for each city are obtained using a differential evolution algorithm. The parameter values needed to simulate the observed daily behaviors of the two epidemics turn out to be different for the two cities.
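The fitting step can be sketched with a minimal DE/rand/1/bin loop recovering the parameters of a Gompertz curve from noisy cumulative counts; the synthetic series, the parameter bounds and the DE settings below are illustrative assumptions, not the paper's data or exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def gompertz(t, K, b, r):
    """Gompertz cumulative-case curve C(t) = K * exp(-b * exp(-r t))."""
    return K * np.exp(-b * np.exp(-r * t))

# Synthetic "cumulative case" data (illustrative; not the Hong Kong or
# Singapore series from the paper).
t = np.arange(0, 60.0)
true_params = (1700.0, 5.0, 0.12)
data = gompertz(t, *true_params) + rng.normal(0, 5, t.size)

def sse(p):
    """Sum of squared errors between the curve and the data."""
    return np.sum((gompertz(t, *p) - data)**2)

def diff_evolution(bounds, pop=30, gens=200, F=0.8, CR=0.9):
    """Minimal DE/rand/1/bin minimizing the fitting error."""
    lo, hi = np.array(bounds).T
    X = rng.uniform(lo, hi, (pop, len(bounds)))
    cost = np.array([sse(x) for x in X])
    for _ in range(gens):
        for i in range(pop):
            a, b_, c = X[rng.choice([j for j in range(pop) if j != i],
                                    3, replace=False)]
            # binomial crossover of the mutant a + F*(b_ - c) with X[i]
            trial = np.where(rng.random(len(bounds)) < CR,
                             np.clip(a + F*(b_ - c), lo, hi), X[i])
            tc = sse(trial)
            if tc < cost[i]:
                X[i], cost[i] = trial, tc
    return X[cost.argmin()], cost.min()

best, best_sse = diff_evolution([(500, 3000), (1, 10), (0.01, 0.5)])
```

DE's population-based, derivative-free search is well suited here because the sum-of-squares surface over epidemic parameters can be multimodal.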
Abstract: The BRI-STARS (BRIdge Stream Tube model for Alluvial
River Simulation) program was used to investigate the scour depth
around bridge piers in some of the major river systems in Iran.
Model calibration was performed by collecting various field data.
The field data are cataloged into three categories: bridges whose
river beds are formed of fine material, bridges whose river beds
are formed of sand, and bridges whose river beds are formed of
gravel or cobble. Verification was performed with field data from
Fars Province. The results show that for wide piers the computed
scour depth exceeds the measured one. In gravel-bed streams the
computed scour depth is greater than the measured scour depth
because of the formation of an armor layer on the channel bed;
once this layer is eroded, the computed scour depth is close to
the measured one.
Abstract: In this paper, we carry over some of the results which
are valid on a certain class of Moufang-Klingenberg planes M(A)
coordinatized by a local alternative ring A := A(ε) = A + Aε of
dual numbers to the finite projective Klingenberg plane M(A)
obtained by taking the local ring Z_q (where q = p^k is a prime
power) instead of A. Thus, we show that the collineation group of
M(A) acts transitively on 4-gons, and that any 6-figure
corresponds to exactly one invertible m ∈ A.
Abstract: In this paper a simple terrain evaluation method for a
hexapod robot is introduced. The method is based on evaluating the
feet coordinates when all feet are on the ground. From the
differences between the feet coordinates, a local terrain
evaluation is possible. Terrain evaluation is necessary for
correct gait selection and/or body position correction. For
terrain roughness evaluation, three planes are plotted: two of
them use the coordinates of opposite feet as definition points,
and the third coincides with the robot body plane. The leaning
angle of the body plane is evaluated by measuring the gravity
force with a three-axis accelerometer. The terrain roughness
evaluation method is based on estimating the angles between the
normal vectors of these planes. The aim of this work is to present
a simple method for an embedded robot controller, allowing it to
find the best settings for further movement.
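The roughness measure described above, the angle between plane normals, can be computed directly from three foot contact points per plane; the coordinates below are illustrative.

```python
import numpy as np

def plane_normal(p1, p2, p3):
    """Unit normal of the plane through three foot contact points."""
    n = np.cross(np.asarray(p2, float) - p1, np.asarray(p3, float) - p1)
    return n / np.linalg.norm(n)

def angle_between(n1, n2):
    """Angle (radians) between two plane normals, orientation-insensitive."""
    c = abs(np.dot(n1, n2))
    return np.arccos(np.clip(c, -1.0, 1.0))

# Example: one tripod's feet lie in the z = 0 ground plane, another
# tripod's plane is tilted; the angle between the normals quantifies
# the local terrain roughness.
ground = plane_normal((0, 0, 0), (1, 0, 0), (0, 1, 0))
tilted = plane_normal((0, 0, 0), (1, 0, 0.2), (0, 1, 0))
```

Taking the absolute value of the dot product makes the measure independent of which side of a plane the normal points to, which matters because the cross-product orientation depends on the ordering of the feet.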
Abstract: Technology transfer is a common method for
companies to acquire new technology and presents both challenges
and substantial benefits. In some cases especially in developing
countries, the mere possession of technology does not guarantee a
competitive advantage if the appropriate infrastructure is not in place.
In this paper, we identify the localization factors needed to provide a
better understanding of the conditions necessary for localization in
order to benefit from future technology developments. Our
theoretical and empirical analyses allow us to identify several factors
in the technology transfer process that affect localization and provide
leverage in enhancing capabilities and absorptive capacity. The
impact factors are categorized into groups of government, firms,
institutes and market, and are verified through an empirical
survey of a technology transfer experience. Moreover, statistical
analysis has allowed a deeper understanding of the importance of
each factor and has enabled each group to prioritize their
organizational policies to effectively localize their technology.
Abstract: The OTOP entrepreneurship that used to create a
substantial source of income for local Thai communities is now at
a critical stage requiring assistance from the public sector,
owing to an oversupply of duplicative ideas, an inability to
adjust costs and prices, a lack of innovation, and inadequate
quality control. Moreover, there is a recurring problem of
middlemen who constantly corner the OTOP market. Local OTOP
producers become easy prey since they do not know how to add more
value, how to create and maintain their own brand name, or how to
create proper packaging and labeling. The suggested solutions for
local OTOP producers are to adopt modern management techniques, to
find know-how to add more value to products, and to solve other
marketing problems. The objectives of this research are to study
the prevalent management of OTOP products and to discover
directions for managing OTOP products so as to enhance the
effectiveness of OTOP entrepreneurship in Nonthaburi Province,
Thailand. There were 113 participants in this study. The research
tools can be divided into two parts: the first is a questionnaire
collecting responses on the prevalent management of OTOP
entrepreneurship; the second is a focus group conducted to
encapsulate ideas and local wisdom. Data analysis is performed
using frequency, percentage, mean and standard deviation, as well
as the synthesis of several small group discussions. The findings
reveal that 1) Business resources: product quality is most
important and product marketing is least important. 2) Business
management: leadership is most important and raw material planning
is least important. 3) Business readiness: communication is most
important and packaging is least important. 4) Support from the
public sector: certification from the government is most
important and sourcing of raw materials is least important.
Abstract: Ant colony optimization (ACO) is an algorithmic framework inspired by the foraging behavior of ant colonies. ACO algorithms use a form of chemical communication, represented by pheromone trails, to build good solutions. However, real ants employ several communication channels to interact. This paper therefore introduces acoustic communication between ants while they are foraging. This process allows a fine, local exploration of the search space and helps improve the best solution found.
Abstract: The log periodogram regression is widely used in
empirical applications because of its simplicity (only a least
squares regression is required to estimate the memory parameter
d), its good asymptotic properties and its robustness to
misspecification of the short-term behavior of the series.
However, the asymptotic distribution is a poor approximation of
the (unknown) finite sample distribution if the sample size is
small. Here the finite sample performance of different
nonparametric residual bootstrap procedures is analyzed when
applied to construct confidence intervals. In particular, in
addition to the basic residual bootstrap, the local and block
bootstraps, which might adequately replicate the structure that
may arise in the errors of the regression when the series shows
weak dependence in addition to the long memory component, are
considered. A bias-correcting bootstrap to adjust the bias caused
by that structure is also considered. Finally, the performance of
bootstrap-based confidence intervals for the log periodogram
regression is assessed in different types of models, together with
how this performance changes as the sample size increases.
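The estimator and the basic residual bootstrap can be sketched as follows; the bandwidth choice m = sqrt(n), the white-noise test series (for which d = 0) and the i.i.d. residual resampling are illustrative simplifications (the local and block variants resample residuals within neighborhoods or blocks instead).

```python
import numpy as np

rng = np.random.default_rng(3)

def gph_design(x, m):
    """Regressand log I(lambda_j) and design matrix for j = 1..m."""
    n = len(x)
    lam = 2*np.pi*np.arange(1, m+1) / n
    logI = np.log(np.abs(np.fft.fft(x)[1:m+1])**2 / (2*np.pi*n))
    A = np.column_stack([np.ones(m), -np.log(4*np.sin(lam/2)**2)])
    return A, logI

def gph(x, m=None):
    """Log periodogram regression estimate of the memory parameter d."""
    A, logI = gph_design(x, m or int(len(x)**0.5))
    coef, *_ = np.linalg.lstsq(A, logI, rcond=None)
    return coef[1]                       # slope = estimate of d

def residual_bootstrap_ci(x, B=500, m=None, level=0.95):
    """Basic residual bootstrap percentile interval for d: resample the
    regression residuals i.i.d., rebuild the regressand, re-estimate d."""
    m = m or int(len(x)**0.5)
    A, logI = gph_design(x, m)
    coef, *_ = np.linalg.lstsq(A, logI, rcond=None)
    resid = logI - A @ coef
    d_stars = []
    for _ in range(B):
        y_star = A @ coef + rng.choice(resid, size=m, replace=True)
        c, *_ = np.linalg.lstsq(A, y_star, rcond=None)
        d_stars.append(c[1])
    q = (1 - level) / 2
    return np.quantile(d_stars, [q, 1 - q])

# White noise has d = 0, so the point estimate should be near zero and
# inside its own bootstrap interval.
x = rng.standard_normal(4096)
d_hat = gph(x)
lo, hi = residual_bootstrap_ci(x)
```

Replacing `rng.choice(resid, ...)` with draws from nearby frequencies (local bootstrap) or from contiguous blocks (block bootstrap) gives the variants the abstract compares.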
Abstract: The service life of existing reinforced concrete (RC)
structures in the coastal towns of Sabah has been severely
affected. Concrete cracking, spalling of the concrete cover and
reinforcement rusting are seen in RC buildings in Sabah even
within 5 years of construction. Hence, in this study a new mix
design of concrete grout was developed using locally available
materials and investigated under two curing conditions;
workability, compressive strength, the Accelerated Mortar Bar Test
(AMBT), water absorption, volume of permeable voids (VPV),
sorptivity and 90-day salt ponding tests were conducted. The
compressive strength of the concrete grout at the age of 90 days
was found to be 44.49 N/mm2 under water curing. The change in
mortar bar length was below 1% for the developed concrete grout.
The water absorption of the concrete grout was in the range of
0.88% to 3.60% under the two different curing conditions up to the
age of 90 days. The VPV of the concrete was in the range of 0% to
9.75% and 2.44% to 13.05% under water curing and site curing
respectively. The sorptivity of the concrete grout under water
curing was 0.211 mm/√min at the age of 28 days and 0.067 mm/√min
at the age of 90 days. The chloride content decreased greatly, by
90%, beyond a depth of 15 mm. The site-cured samples showed higher
chloride contents near the surface than the water-cured samples.
This investigation suggests that the developed mix design of
concrete grout using locally available construction materials can
be used for crack repair of existing RC structures in Sabah.
Abstract: Generally, in order to create 3D sound in binaural
systems, head related transfer functions (HRTFs), which encode the
information about how sounds arrive at our ears, are used.
However, owing to the characteristics of HRTFs, some
three-dimensional effects can degrade in the cone of confusion
between the front and back directions.
In this paper, we propose a new method based on psychoacoustic
theory that reduces the confusion in sound image localization. In
the method, the HRTF spectral characteristics are enhanced using
the energy ratio of the Bark bands. Informal listening tests show
that the proposed method improves the front-back sound
localization characteristics much better than conventional methods.
Abstract: Digital Video Broadcasting - Terrestrial (DVB-T)
allows broadcasting, telephone and data services to be combined in
one network, and it has facilitated mobile TV broadcasting. Mobile
TV broadcasting is dominated by a fragmentation of standards
across continents: in Asia T-DMB and ISDB-T are used, Europe uses
mainly DVB-H, and in the USA it is MediaFLO. Issues of royalties
for the developers of these different incompatible technologies,
the investments already made and differing local conditions will
make it difficult to agree on a unified standard in the near
future. Despite this shortcoming, mobile TV has shown very good
market potential. A number of challenges still exist for
regulators, investors and technology developers, but the future
looks bright. There is a need for mobile telephone operators to
cooperate with content providers and those operating terrestrial
digital broadcasting infrastructure for mutual benefit.
Abstract: Given the dramatic growth of internet services, easy and prompt service deployment has become important for internet service providers in order to maintain time-to-market. Before global service deployment, they have to bear a large cost for service evaluation in order to decide on the proper system location, system scale, service delay and so on. However, intra-lab evaluation tends to show large gaps between the measured data and the realistic situation, because it is very difficult to accurately anticipate the local service environment, network congestion, service delay, network bandwidth and other factors. Therefore, to resolve or ease these problems, we propose a multiple-cloud-based GPES Broker system, and a use case, that helps internet service providers alleviate these problems in the beta release phase and make a prompt decision about launching their service. By supporting more realistic and reliable evaluation information, the proposed GPES Broker system saves service release costs and enables internet service providers to make a prompt decision about launching their service in various remote regions.