Abstract: Urbanization and regionalization are two different
approaches to economic structures and development, infrastructure
and mobility, quality of life and living, education, social cohesion
and many other topics. At first glance, the structures associated
with urbanization and regionalization seem to be contradictory. This
paper discusses possibilities of transfer and cooperation between
rural and urban structures. An empirical investigation helped reveal
plausible scenarios of exchange and cooperation between remote rural
areas and big cities.
Abstract: In this paper, a robust watermarking algorithm using
the wavelet transform and edge detection is presented. The efficiency
of an image watermarking technique depends on the preservation of
visually significant information. This is attained by embedding the
watermark transparently with the maximum possible strength. The
watermark embedding process is carried out on the subband
coefficients that lie on edges, where distortions are less noticeable,
with a subband-level-dependent strength. In addition, the watermark is
embedded into selected coefficients around the edges, captured by a
morphological dilation operation, using a different scale factor for
the watermark strength. The experimental evaluation of the proposed
method shows very good results in terms of transparency and
robustness against various attacks such as median filtering, Gaussian
noise, JPEG compression and geometrical transformations.
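As an illustration of the embedding idea, here is a minimal NumPy sketch; the Haar transform, the gradient-based edge detector, and the threshold and strength values are simplifying assumptions, not the paper's exact wavelet, detector or parameters:

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar DWT: returns LL, LH, HL, HH subbands."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def edge_mask(subband, thresh, dilate=0):
    """Coefficients whose local gradient magnitude exceeds thresh,
    optionally grown by a simple 4-neighbour morphological dilation."""
    gy, gx = np.gradient(subband)
    mask = np.hypot(gx, gy) > thresh
    for _ in range(dilate):
        grown = mask.copy()
        grown[1:, :] |= mask[:-1, :]
        grown[:-1, :] |= mask[1:, :]
        grown[:, 1:] |= mask[:, :-1]
        grown[:, :-1] |= mask[:, 1:]
        mask = grown
    return mask

def embed(subband, wm, alpha_edge=2.0, alpha_near=0.5, thresh=5.0):
    """Additively embed a +/-1 watermark: full strength on edge
    coefficients, reduced strength on their dilated neighbourhood."""
    edges = edge_mask(subband, thresh)
    near = edge_mask(subband, thresh, dilate=1) & ~edges
    out = subband.copy()
    out[edges] += alpha_edge * wm[edges]
    out[near] += alpha_near * wm[near]
    return out
```

Coefficients away from edges are left untouched, which is what keeps the embedding transparent.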
Abstract: Following the laser ablation studies that led to a
theory of nuclei confinement by a Debye layer mechanism, we
present here numerical evaluations for the known stable nuclei in
which the Coulomb repulsion is included as a rather minor component,
especially for larger nuclei. This paper investigates the physical
conditions required for the formation and stability of nuclei,
particularly endothermic nuclei of high mass number, whose formation
is an open astrophysical question. Using the Debye layer mechanism
together with the nuclear surface energy, the Fermi energy and the
Coulomb repulsion energy, it is possible to find conditions under
which nucleation is permitted in the early universe. Our numerical
calculations indicate that about 200 seconds after the big bang, at a
temperature of about 100 keV and in the subrelativistic regime with a
nucleon density nearly equal to normal nuclear density, all
endothermic and exothermic nuclei had been formed.
Abstract: This paper introduces a new method called ARPDC (Advanced Robust Parallel Distributed Compensation) for automatic control of nonlinear systems. This method improves the quality of robust control by interpolating between a robust and an optimal controller. The weight of each controller is determined by an original criterion function for model validity and disturbance appreciation. The ARPDC method is based on nonlinear Takagi-Sugeno (T-S) fuzzy systems and the Parallel Distributed Compensation (PDC) control scheme. Relaxed stability conditions for ARPDC control of the nominal system have been derived. The advantages of the presented method are demonstrated on the inverted pendulum benchmark problem. A comparison of three controllers (robust, optimal and ARPDC) shows that ARPDC control is nearly optimal, with robustness close to that of the robust controller. The results indicate that the ARPDC algorithm can be a good alternative not only to robust control but, in some cases, also to adaptive control of nonlinear systems.
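The interpolation idea can be sketched minimally as follows; the exponential-decay weight heuristic and all names are illustrative assumptions, not the paper's criterion function:

```python
import numpy as np

def arpdc_blend(u_robust, u_optimal, model_error, error_scale=1.0):
    """Interpolate between robust and optimal control actions.
    Illustrative weight heuristic: when the nominal model fits well
    (small model_error), the optimal controller dominates; under large
    modelling error, control shifts toward the robust controller."""
    w_opt = np.exp(-model_error / error_scale)  # weight in (0, 1]
    return w_opt * u_optimal + (1.0 - w_opt) * u_robust
```

With zero model error the optimal action is used unchanged; as the error grows, the blend approaches the robust action.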
Abstract: A digital system is proposed for a low-power
100-channel neural recording application; it consists of 100
amplifiers, 100 analog-to-digital converters (ADCs), a digital
controller and baseband, and a transceiver for the data link and RF
command link. The proposed system is designed in 0.18 μm and 65 nm
CMOS processes.
Abstract: Global environmental change has increased the frequency and scale of natural disasters, and Taiwan is under the influence of global warming and extreme weather. As a result, vulnerability has increased and the variability and complexity of disasters have grown. The purpose of this study is to consider the source and magnitude of hazard characteristics affecting the tourism industry. Using modern risk management concepts and integrating related domestic and international basic research, the study extends the Taiwan typhoon disaster risk assessment model and its loss evaluation. The loss evaluation index system considers the impact of extreme weather, in particular heavy rain, on the tourism industry in Taiwan. Considering the compound impact of extreme-climate disasters on the tourism industry, we propose a multi-hazard risk assessment model together with strategies and suggestions. The risk analysis results are expected to provide government departments, tourism-industry asset owners, insurance companies and banks with the information necessary for effective natural disaster risk management in the tourism sector.
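The core of such a loss evaluation can be sketched as an expected-loss sum over hazard scenarios; the scenario probabilities, exposures and vulnerability factors below are illustrative placeholders, not the study's data or its actual model:

```python
def annual_expected_loss(scenarios):
    """Annual expected loss over hazard scenarios, each given as
    (annual probability, exposed asset value, vulnerability in [0, 1])."""
    return sum(p * exposure * vuln for p, exposure, vuln in scenarios)
```

Summing over multiple hazards (typhoon wind, heavy rain, flood) in this way is what makes the assessment "multi-hazard".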
Abstract: The main aim of this study was to examine whether
people understand indicative conditionals on the basis of syntactic
factors or on the basis of subjective conditional probability. The
second aim was to investigate whether the conditional probability of
q given p depends on the antecedent and consequent sizes or derives
from inductive processes that establish a link of plausible
co-occurrence between semantically or experientially associated events.
These competing hypotheses have been tested through a 3 x 2 x 2 x 2
mixed design involving the manipulation of four variables: type of
instructions ("Consider the following statement to be true", "Read
the following statement", and a condition with no conditional statement);
antecedent size (high/low); consequent size (high/low); statement
probability (high/low). The first variable was between-subjects, the
others were within-subjects. The inferences investigated were Modus
Ponens and Modus Tollens. Ninety undergraduates of the Second
University of Naples, without any prior knowledge of logic or
conditional reasoning, participated in this study.
Results suggest that people understand conditionals in a syntactic
way rather than in a probabilistic way, even though the perception of
the conditional probability of q given p is at least partially involved
in the comprehension of conditionals. They also show that, in the
presence of a conditional syllogism, inferences are not affected by the
antecedent or consequent sizes. From a theoretical point of view, these
findings suggest that it would be inappropriate to abandon the idea
that conditionals are naturally understood in a syntactic way in favour
of the idea that they are understood in a probabilistic way.
Abstract: This study applied the Gaussian trajectory
transfer-coefficient model (GTx) to simulate the particulate matter
concentrations and the source apportionments at Nanzih Air Quality
Monitoring Station in southern Taiwan from November 2007 to
February 2008. The correlation coefficient between the observed and
the calculated daily PM10 concentrations is 0.5 and the absolute bias of
the PM10 concentrations is 24%. The simulated PM10 concentrations
matched well with the observed data. Although the emission rate of
PM10 was dominated by area sources (58%), the results of source
apportionments indicated that the primary sources for PM10 at Nanzih
Station were point sources (42%), area sources (20%) and then upwind
boundary concentration (14%). The main difference in PM10 source
apportionment between episode and non-episode days lay in the upwind
boundary concentration, which contributed 20% and 11% of PM10,
respectively. The gas-particle conversion of secondary aerosol and
long-range transport played crucial roles in the PM10 contribution at
a receptor.
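Fit statistics of the kind quoted above can be computed for any observed/simulated series along these lines; the study does not spell out its exact bias definition, so mean absolute bias relative to the observed mean is used here as one common choice:

```python
import numpy as np

def pearson_r(obs, sim):
    """Pearson correlation coefficient between observations and model."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.corrcoef(obs, sim)[0, 1])

def mean_absolute_bias_pct(obs, sim):
    """Mean absolute bias as a percentage of the observed mean
    (one common definition; the paper's may differ)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.mean(np.abs(sim - obs)) / np.mean(obs) * 100.0)
```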
Abstract: This paper addresses the problem of forbidden states in
non-safe Petri nets. To prevent the system from entering forbidden
states, linear constraints can be assigned to those states. These
constraints can then be enforced on the system using control places.
However, when the number of constraints is large, a large number of
control places must be added to the system model, which complicates
it. Some methods exist for reducing the number of constraints in safe
Petri nets, but there is no systematic method for non-safe Petri nets.
In this paper we propose a method, based on solving an integer linear
programming problem, for reducing the number of constraints in
non-safe Petri nets.
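For background, the classical place-invariant construction that enforces a single linear marking constraint with one control place can be sketched as follows; this is the standard method such work builds on, and the paper's ILP-based reduction of the constraint set is not reproduced here:

```python
import numpy as np

def control_place(D, m0, L, b):
    """Synthesize a control place enforcing L @ m <= b on a Petri net
    with incidence matrix D (places x transitions) and initial marking
    m0, via the place-invariant method: the control place gets arc
    weights Dc = -L @ D and initial marking mc0 = b - L @ m0."""
    Dc = -(L @ D)        # incidence row of the control place
    mc0 = b - L @ m0     # its initial marking
    if mc0 < 0:
        raise ValueError("initial marking already violates the constraint")
    return Dc, mc0
```

Example: a single place p1 filled by t1 and emptied by t2, with the constraint m(p1) <= 2, yields a control place that loses a token on each t1 firing and regains it on t2.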
Abstract: This research aims to study tourism preferences and the
factors behind domestic tourists' choice of destination in Bangkok
and the nearby areas of Thailand. The data were collected through
1,249 questionnaires in mid-August 2012. The results illustrate that
religious destinations are the places tourists prefer most. The
average expense is approximately 47 USD per trip. Travellers rely on
television and internet advertisements, and their decisions are based
on the reputation of the destinations.
The results on the place dimension demonstrate that neatness and
good management of a location play a crucial role in destination
choice. Gender, age, marital status and origin affect spending and
travelling behaviour. The researchers suggest that providing arcade
areas, selling souvenirs and promoting tourism among young
professionals would be important steps in line with the income
distribution policy, as would preparing destinations to welcome
family groups, which the results identify as the highest-spending
segment.
Abstract: Routing in a MANET is extremely challenging because of
its dynamic features: limited bandwidth, frequent topology changes
caused by node mobility, and energy consumption. To transmit data to
destinations efficiently, suitable routing algorithms must be
implemented in mobile ad-hoc networks; routing efficiency can be
increased by developing algorithms for MANETs that satisfy Quality of
Service (QoS) parameters. Algorithms inspired by the principles of
natural biological evolution and the distributed collective behaviour
of social colonies have shown excellence in dealing with complex
optimization problems and are becoming increasingly popular. This
paper presents a survey of several meta-heuristic and nature-inspired
algorithms.
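As a flavour of the ant-colony principle behind several of the surveyed algorithms, a next hop can be drawn with probability proportional to pheromone level and heuristic desirability; this is a generic sketch of the selection rule, not any specific MANET protocol:

```python
import random

def choose_next_hop(pheromone, heuristic, alpha=1.0, beta=2.0, rng=random):
    """Ant-colony-style next-hop selection: neighbour j is chosen with
    probability proportional to pheromone[j]**alpha * heuristic[j]**beta
    (heuristic might encode link quality or inverse hop distance)."""
    weights = {j: pheromone[j] ** alpha * heuristic[j] ** beta
               for j in pheromone}
    total = sum(weights.values())
    r, acc = rng.random() * total, 0.0
    for j, w in weights.items():
        acc += w
        if r <= acc:
            return j
    return j  # numerical fallback
```

Successful routes then deposit pheromone on their links, biasing future ants toward them.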
Abstract: In this paper, the design of a multiple U-slotted microstrip patch antenna with a frequency selective surface (FSS) as a superstrate is presented for WLAN and WiMAX applications. The proposed antenna is designed using an FR4 substrate with a permittivity of 4.4 and an air substrate. The antenna is designed and its performance evaluated using CST Microwave Studio. Dual-band operation is achieved over 2.37-2.55 GHz and 3.4-3.6 GHz. Owing to the FSS superstrate, the bandwidths improve from 6.12% to 7.35% and from 3.7% to 5.7% at the resonant frequencies of 2.45 GHz and 3.5 GHz, respectively. The maximum gains at 2.45 and 3.5 GHz are 9.3 and 11.33 dBi, respectively.
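The quoted bandwidth percentages follow from the usual fractional-bandwidth definition, as a quick check shows:

```python
def fractional_bandwidth_pct(f_low, f_high, f_center):
    """Fractional impedance bandwidth in percent:
    (f_high - f_low) / f_center * 100."""
    return (f_high - f_low) / f_center * 100.0
```

Applied to the two bands: (2.55 - 2.37) GHz at 2.45 GHz gives 7.35%, and (3.6 - 3.4) GHz at 3.5 GHz gives 5.7%, matching the improved figures reported.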
Abstract: A spatial classification technique incorporating a state-of-the-art feature extraction algorithm is proposed in this paper for classifying heterogeneous classes present in hyperspectral images. Classification accuracy can be improved only if both the feature extraction and the classifier selection are proper. As the classes in hyperspectral images are assumed to have different textures, textural classification is adopted: Run Length feature extraction is employed along with Principal Components and Independent Components. A hyperspectral image of the Indiana site taken by AVIRIS is used for the experiment. Among the original 220 bands, a subset of 120 bands is selected and divided into three groups of forty. Gray Level Run Length Matrices (GLRLMs) are calculated for the first forty bands, and from the GLRLMs the Run Length features for individual pixels are derived. Principal Components are calculated for the second forty bands and Independent Components for the remaining forty. As Principal and Independent Components have the ability to represent the textural content of pixels, they are treated as features. Together, the Run Length features, Principal Components and Independent Components form the combined feature set used for classification. An SVM with a Binary Hierarchical Tree is used to classify the hyperspectral image. Results are validated against ground truth and accuracies are calculated.
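A minimal sketch of the feature-combination step is given below, with PCA computed via SVD; the GLRLM and ICA computations are omitted, and the function names are illustrative, not the paper's implementation:

```python
import numpy as np

def pca_features(X, k):
    """Project pixels (rows of X, one spectral band per column) onto
    the first k principal components of the centred data."""
    Xc = X - X.mean(axis=0)
    # columns of Vt.T are the principal directions, ordered by variance
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def combined_features(run_length, pcs, ics):
    """Concatenate per-pixel Run Length, PCA and ICA features into the
    combined feature vector fed to the classifier."""
    return np.hstack([run_length, pcs, ics])
```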
Abstract: Grid computing provides a virtual framework for
controlled sharing of resources across institutional boundaries.
Recently, trust has been recognised as an important factor for
selection of optimal resources in a grid. We introduce a new method
that provides a quantitative trust value, based on the past interactions
and present environment characteristics. This quantitative trust value
is used to select a suitable resource for a job and eliminates run time
failures arising from incompatible user-resource pairs. The proposed
work acts as a tool to calculate the trust values of the various
components of the grid, thereby improving the success rate of the
jobs submitted to resources on the grid. Access to a resource depends
not only on the identity and behaviour of the resource but also on
its transaction context, transaction time, connectivity bandwidth,
availability and load. The quality of a recommender is also
evaluated, based on the accuracy of the feedback it provides about a
resource. Jobs are submitted for execution to the selected resource
after its overall trust value, computed from both subjective and
objective parameters, has been determined.
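A minimal sketch of combining subjective (recommendation/feedback) and objective (measured behaviour) scores into an overall trust value used for resource selection is shown below; the 0.6/0.4 weighting and the [0, 1] score scales are illustrative assumptions, not the paper's formulas:

```python
def overall_trust(subjective, objective, objective_weight=0.6):
    """Weighted combination of subjective and objective trust scores,
    each assumed to lie in [0, 1]."""
    if not 0.0 <= objective_weight <= 1.0:
        raise ValueError("objective_weight must lie in [0, 1]")
    return objective_weight * objective + (1.0 - objective_weight) * subjective

def select_resource(resources):
    """Pick the grid resource with the highest overall trust value."""
    return max(resources,
               key=lambda r: overall_trust(r["subjective"], r["objective"]))
```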
Abstract: In this paper an ant colony optimization algorithm is
developed to solve the permutation flow shop scheduling problem. In
this problem, which has been extensively studied in the literature,
there are a set of m machines and a set of n jobs; all jobs are
processed on all machines, and the sequence in which jobs are
processed is the same on every machine. Here the problem is optimized
with respect to two criteria, makespan and total flow time, and the
results are compared with those obtained by previously developed
algorithms. The comparison shows that the proposed approach
outperforms the other algorithms from the literature.
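Both criteria can be evaluated for a candidate permutation with the standard completion-time recurrence C(j, i) = max(C(prev job, i), C(j, i-1)) + p(j, i); this is a generic sketch of the evaluation step, independent of the ant colony search itself:

```python
def flowshop_objectives(perm, proc):
    """Makespan and total flow time of a permutation flow shop schedule.
    proc[j][i] is the processing time of job j on machine i, and perm
    is the common order in which jobs visit every machine."""
    m = len(proc[0])
    completion = [0.0] * m      # completion time of the latest job per machine
    total_flow = 0.0
    for j in perm:
        for i in range(m):
            # a job starts on machine i when both the machine is free
            # and the job has finished on machine i-1
            start = max(completion[i], completion[i - 1] if i else 0.0)
            completion[i] = start + proc[j][i]
        total_flow += completion[-1]
    return completion[-1], total_flow
```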
Abstract: The world of wireless telecommunications is rapidly evolving. Technologies under research and development promise to deliver more services to more users in less time. This paper presents the emerging technologies helping wireless systems grow from where we are today into our visions of the future. It covers the applications and characteristics of emerging wireless technologies: Wireless Local Area Networks (WiFi, 802.11n), Wireless Personal Area Networks (ZigBee) and Wireless Metropolitan Area Networks (WiMAX). The purpose of this paper is to explain the impending 802.11n standard and how it will enable WLANs to support emerging media-rich applications. The paper also details how 802.11n compares with existing WLAN standards and offers strategies for users considering higher-bandwidth alternatives. The emerging IEEE 802.15.4 (ZigBee) standard aims to provide low-data-rate wireless communication as a low-power, low-cost solution, with high-precision ranging and localization addressed by UWB technologies. WiMAX (Worldwide Interoperability for Microwave Access) is a standard for wireless data transmission with a coverage range similar to that of cellular phone towers. With high performance in both distance and throughput, WiMAX technology could be a boon to current Internet providers seeking to lead next-generation wireless Internet access. This paper also explores how these emerging technologies differ from one another.
Abstract: WiMAX (Worldwide Interoperability for Microwave Access)
is a promising technology which can offer high-speed data, voice and
video services to the customer end, a segment presently dominated by
cable and digital subscriber line (DSL) technologies. This paper
deals with the performance assessment of WiMAX systems. The biggest
advantage of broadband wireless access (BWA) over its wired
competitors is its increased capacity and ease of deployment. The
aims of this paper are to model and simulate the fixed OFDM IEEE
802.16d physical layer under various combinations of digital
modulation (BPSK, QPSK and 16-QAM) over a diverse set of fading
channels (AWGN, SUIs). The Stanford University Interim (SUI) channel
series was proposed to model the fixed broadband wireless access
channel environments where IEEE 802.16d is to be deployed. It
comprises six channel models grouped into three categories according
to three typical outdoor terrains, giving a comprehensive picture of
the effect of fading channels on the overall performance of the
system.
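As a baseline for such simulations, the theoretical BPSK bit-error rate over the AWGN channel is Pb = Q(sqrt(2 Eb/N0)) = 0.5 erfc(sqrt(Eb/N0)); the SUI fading cases require full link simulation. This is a reference curve, not the paper's simulator:

```python
import math

def bpsk_awgn_ber(ebn0_db):
    """Theoretical BPSK bit-error rate over AWGN:
    Pb = 0.5 * erfc(sqrt(Eb/N0)), with Eb/N0 given in dB."""
    ebn0 = 10.0 ** (ebn0_db / 10.0)   # dB -> linear
    return 0.5 * math.erfc(math.sqrt(ebn0))
```

A simulated BPSK-over-AWGN curve should track this expression; deviations under SUI channels then isolate the effect of fading.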
Abstract: This research studied waste recycled through the Recyclable Material Bank Projects of four universities in the central region of Thailand, evaluating the reduction in greenhouse gas emissions compared with landfilling from July 2012 to June 2013. The results show that the projects collected about 911,984.80 kilograms of recyclable waste in total. Office paper accounted for the largest share (50.68% of the total recycled waste). By amount, the recycled-waste groups rank, from high to low: paper, plastic, glass, mixed recyclables and metal. The projects reduced greenhouse gas emissions by the equivalent of about 2,814.969 metric tons of carbon dioxide. The recycled waste contributing most to this reduction is office paper, at 70.16% of the total reduction. By reduced emissions, the groups rank, from high to low: paper, plastic, metals, mixed recyclables and glass.
Abstract: The increasing demand for IT resources diverts
enterprises to the cloud as a cheap and scalable solution. The
promises of cloud computing are achieved by using the virtual machine
as the basic unit of computation. However, a virtual machine's
predefined settings may not be sufficient to meet jobs' QoS
requirements. This paper addresses the problem of mapping jobs with
critical start deadlines to virtual machines with predefined
specifications. These virtual machines are hosted by physical
machines and share a fixed amount of bandwidth. The paper proposes an
algorithm that uses idle virtual machines' bandwidth to increase the
quota of the virtual machines nominated as executors of urgent jobs.
An empirical study evaluates the impact of the proposed model on
impatient jobs. The results show the importance of dynamic bandwidth
allocation in a virtualized environment and its effect on throughput.
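A minimal sketch of the reallocation idea follows; the data layout and the even split of idle bandwidth among urgent executors are illustrative assumptions, not the paper's algorithm:

```python
def reallocate_bandwidth(vms, total_bw):
    """Give idle VMs' unused bandwidth to VMs executing urgent jobs.
    Each VM is a dict with a base 'share' (fractions summing to <= 1)
    and boolean 'idle' / 'urgent' flags; the freed bandwidth is split
    evenly among the urgent executors."""
    idle_bw = sum(v["share"] for v in vms if v["idle"]) * total_bw
    urgent = [v for v in vms if v["urgent"] and not v["idle"]]
    bonus = idle_bw / len(urgent) if urgent else 0.0
    alloc = {}
    for v in vms:
        if v["idle"]:
            alloc[v["name"]] = 0.0
        else:
            alloc[v["name"]] = v["share"] * total_bw + (bonus if v["urgent"] else 0.0)
    return alloc
```

Total allocated bandwidth never exceeds the physical machine's fixed budget; idle capacity is merely redirected.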
Abstract: Brassinosteroids (BRs) regulate cell elongation,
vascular differentiation, senescence, and stress responses. BRs signal
through the BES1/BZR1 family of transcription factors, which
regulate hundreds of target genes involved in this pathway. In this
research, a comprehensive genome-wide analysis was carried out on the
BES1/BZR1 gene family in Arabidopsis thaliana, Cucumis sativus,
Vitis vinifera, Glycine max and Brachypodium distachyon.
Specifications of the desired sequences, dot plots and hydropathy
plots were analyzed in the protein and genome sequences of the five
plant species. The maximum amino acid length was attributed to
protein sequence Brdic3g, with 374 aa, and the minimum to protein
sequence Gm7g, with 163 aa. The maximum instability index was
attributed to protein sequence AT1G19350, at 79.99, and the minimum
to protein sequence Gm5g, at 33.22. The aliphatic index of these
protein sequences ranged from 47.82 to 78.79 in Arabidopsis
thaliana, 49.91 to 57.50 in Vitis vinifera, 55.09 to 82.43 in Glycine
max, 54.09 to 54.28 in Brachypodium distachyon and 55.36 to 56.83 in
Cucumis sativus. Overall, the data obtained from our investigation
contribute to a better understanding of the complexity of the
BES1/BZR1 gene family and provide a first step towards directing
future experimental designs for systematic analysis of the functions
of the BES1/BZR1 gene family.
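For reference, the aliphatic index reported above is conventionally computed with Ikai's formula, which can be sketched as:

```python
def aliphatic_index(seq):
    """Aliphatic index of a protein (Ikai, 1980):
    AI = X(Ala) + 2.9 * X(Val) + 3.9 * (X(Ile) + X(Leu)),
    where X is the mole percent of each residue in the sequence."""
    seq = seq.upper()
    n = len(seq)
    mole_pct = lambda aa: 100.0 * seq.count(aa) / n
    return (mole_pct("A") + 2.9 * mole_pct("V")
            + 3.9 * (mole_pct("I") + mole_pct("L")))
```

Higher values indicate a larger relative volume of aliphatic side chains, often interpreted as a thermostability indicator.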