Abstract: Orthogonal Frequency Division Multiplexing
(OFDM) is an efficient method of data transmission for high speed
communication systems. However, the main drawback of OFDM systems is that they suffer from a high Peak-to-Average Power Ratio (PAPR), which causes inefficient use of the High Power Amplifier and can limit transmission efficiency. An OFDM signal consists of a large number of independent subcarriers, so its amplitude can reach high peak values. In this paper,
we propose an effective reduction scheme that combines DCT and
SLM techniques. The scheme is composed of the DCT followed by
the SLM using the Riemann matrix to obtain phase sequences for the
SLM technique. The simulation results show that the PAPR can be greatly reduced by applying the proposed scheme. Whereas conventional OFDM exhibited a high PAPR of about 10.4 dB, our proposed method achieved a PAPR reduction of about 4.7 dB with low computational complexity. This approach also avoids
randomness in phase sequence selection, which makes it simpler to
decode at the receiver. As an added benefit, the matrices can be
generated at the receiver end to obtain the data signal and hence it is
not required to transmit side information (SI).
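The PAPR quantity discussed above can be illustrated with a short sketch. This shows only the baseline measurement for a plain OFDM block (IFFT of the subcarrier symbols, then peak power over average power), not the proposed DCT+SLM scheme; the subcarrier count and QPSK constellation are illustrative choices.

```python
# Minimal sketch: measuring the PAPR of one OFDM block. N independent
# subcarriers are combined by an inverse FFT, and PAPR is the ratio of
# peak instantaneous power to average power of the time-domain signal.
import numpy as np

def papr_db(symbols: np.ndarray) -> float:
    """PAPR in dB of the time-domain OFDM signal for one block of
    frequency-domain subcarrier symbols."""
    x = np.fft.ifft(symbols)            # time-domain OFDM signal
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

rng = np.random.default_rng(0)
# 256 QPSK-modulated subcarriers (unit-power constellation points)
qpsk = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=256) / np.sqrt(2)
print(f"PAPR of one random OFDM block: {papr_db(qpsk):.1f} dB")
```

With many independent subcarriers the peaks add coherently on occasion, which is why random blocks routinely show PAPR values near the 10 dB range reported for unmodified OFDM.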
Abstract: Fixed-bed slow pyrolysis experiments of rice husk
have been conducted to determine the effect of pyrolysis
temperature, heating rate, particle size and reactor length on the
pyrolysis product yields. Pyrolysis experiments were performed at
pyrolysis temperatures between 400 and 600°C with a constant
heating rate of 60°C/min and particle sizes of 0.60-1.18 mm. The
optimum process conditions for maximum liquid yield from the rice
husk pyrolysis in a fixed bed reactor were also identified. The highest
liquid yield was obtained at a pyrolysis temperature of 500°C, a particle size of 1.18-1.80 mm, and a heating rate of 60°C/min in a 300 mm long reactor. The obtained yields of liquid, gas and solid were found to be in the ranges of 22.57-31.78 %, 27.75-42.26 % and 34.17-42.52 % (all on a weight basis), respectively, at different pyrolysis conditions. The
results indicate that the effects of pyrolysis temperature and particle
size on the pyrolysis yield are more significant than that of heating
rate and reactor length. The functional groups and chemical
compositions present in the liquid obtained at optimum conditions
were identified by Fourier Transform-Infrared (FT-IR) spectroscopy
and Gas Chromatography/Mass Spectrometry (GC/MS) analysis,
respectively.
Abstract: In this paper, we consider the problem of identifying the unknown source in the Poisson equation. A modified Tikhonov regularization method is presented to deal with the ill-posedness of the problem, and error estimates are obtained with an a priori strategy and an a posteriori choice rule for finding the regularization parameter. Numerical examples show that the proposed method is effective and stable.
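The abstract does not spell out the modified method. As background, classical Tikhonov regularization of a discretized ill-posed linear system Ax = b can be sketched as below; the Hilbert-type test matrix is an illustrative stand-in, not the paper's Poisson source problem.

```python
# Classical Tikhonov regularization: minimize ||A x - b||^2 + alpha ||x||^2,
# solved via the normal equations (A^T A + alpha I) x = A^T b.
import numpy as np

def tikhonov(A, b, alpha):
    """Tikhonov-regularized solution of the ill-posed system A x = b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

# Ill-conditioned test problem: an 8x8 Hilbert matrix with noisy data.
n = 8
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
x_true = np.ones(n)
b = A @ x_true + 1e-4 * np.sin(np.arange(n))   # small data perturbation

x_naive = np.linalg.solve(A, b)       # noise is amplified enormously
x_reg = tikhonov(A, b, alpha=1e-6)    # stays close to x_true
print(np.linalg.norm(x_naive - x_true), np.linalg.norm(x_reg - x_true))
```

The comparison illustrates why a regularization parameter (chosen a priori or a posteriori) is needed: the unregularized solve amplifies the data noise through the tiny singular values, while the regularized one does not.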
Abstract: The design requirements for successful human
accommodation in urban spaces are well known; and the range of
facilities available for meeting urban water quality and quantity
requirements is also well established. Their competing requirements
must be reconciled in order for urban spaces to be successful for
both. This paper outlines the separate human and water imperatives
and their interactions in urban spaces. Stormwater management
facilities' relative potential contributions to urban spaces are
contrasted, and design choices for achieving those potentials are
described. This study uses human success of urban space as the
evaluative criterion of stormwater amenity: human values call on
stormwater facilities to contribute to successful human spaces.
Placing water's contribution under the overall idea of successful
urban space is an evolution from previous subjective evaluations.
The information is based on photographs and notes from
approximately 1,000 stormwater facilities and urban sites collected
during the last 35 years in North America and overseas, and the
author's experience on multi-disciplinary design teams. This
conceptual study combines the disciplinary roles of engineering,
landscape architecture, and sociology in effecting successful urban
design.
Abstract: A Web-services based grid infrastructure is evolving to be readily available in the near future. In this approach, Web services are inherited (encapsulated) into the existing Grid services class; in practice, there is little difference between the existing Web and grid infrastructures, and Grid services have emerged as stateful Web services. In this paper, we present the key components of a Web-services based grid, and show how resource discovery is performed on such a grid, considering resource discovery to be a critical service that must be provided by any type of grid.
Abstract: Chess is an indoor game that improves the player's confidence, concentration, planning skills and
knowledge. The main objective of this paper is to help the chess
players to improve their chess openings using data mining
techniques. Budding chess players usually practice by analyzing various existing openings, but analyzing and correlating thousands of openings becomes tedious and complex. The work done in this paper is to analyze the best lines of the Blackmar-Diemer Gambit (BDG), which opens with White's d4, using data mining analysis. The analysis is carried out on a collection of winning games
by applying association rules. The first step of this analysis is assigning variables to each distinct move sequence. In the second step, sequence association rules are generated to calculate the support and confidence factors, which help us find the best subsequent chess moves that may lead to a winning position.
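The support/confidence computation described above can be sketched on a toy collection of winning-game move sequences. The games and moves below are invented for illustration; they are not the paper's BDG dataset.

```python
# Support/confidence for sequence association rules over move sequences.
# A rule "prefix -> next move" has support = fraction of games containing
# the whole sequence, and confidence = P(next move | prefix).
games = [
    ["d4", "d5", "e4", "dxe4", "Nc3"],   # a Blackmar-Diemer Gambit line
    ["d4", "d5", "e4", "dxe4", "Nc3"],
    ["d4", "d5", "e4", "e6"],
    ["d4", "Nf6", "c4"],
]

def starts_with(game, seq):
    return game[:len(seq)] == seq

def support(seq):
    """Fraction of games that begin with the given move sequence."""
    return sum(starts_with(g, seq) for g in games) / len(games)

def confidence(prefix, move):
    """P(next move | prefix) over the game collection."""
    return support(prefix + [move]) / support(prefix)

print(support(["d4", "d5", "e4"]))            # 3 of 4 games -> 0.75
print(confidence(["d4", "d5", "e4"], "dxe4"))
```

Rules whose support and confidence exceed chosen thresholds identify the most promising continuations, which is the selection criterion the abstract describes.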
Abstract: In order to optimize annual IT spending and to reduce
the complexity of an entire system architecture, SOA trials have been
started. It is common knowledge that designing an SOA system requires a top-down approach, but in reality silo systems are still being built, so companies cannot reuse the newly designed services and cannot enjoy SOA's economic benefits. To prevent this situation,
we designed a generic SOA development process referred to as the
architecture of “mass customization."
To define the generic detailed development processes, we carried out a case study on an imaginary company. Through the case study, we were able to define practical development processes and found that this approach could vastly reduce development and update costs.
Abstract: Data from 1731 Gentile di Puglia lambs, sired by 65 rams over a 5-year period were analyzed by a mixed model to estimate the variance components for heritability. The considered growth traits were: birth weight (BW), weight at 30 days of age (W30) and average daily gain from birth to 30 days of age (DG). Year of birth, sex of lamb, type of birth (single or twin), dam age at lambing and farm were significant sources of variation for all the considered growth traits. The average lamb weights were 3.85±0.16 kg at birth, 9.57±0.91 kg at 30 days of age and the average daily gain was 191±14 g. Estimates of heritability were 0.33±0.05, 0.41±0.06 and 0.16±0.05 respectively for the same traits. These values suggest there is a good opportunity to improve Gentile di Puglia lambs by selecting animals for growth traits.
Abstract: How to effectively allocate system resources to process client requests by Gateway servers is a challenging problem. In
this paper, we propose an improved scheme for autonomous
performance of Gateway servers under highly dynamic traffic loads.
We devise a methodology to calculate Queue Length and Waiting
Time utilizing Gateway Server information to reduce response time
variance in the presence of bursty traffic. The foremost consideration is performance, because Gateway Servers must offer cost-effective, highly available services over extended periods and thus have to be scaled to meet the expected load. Performance
measurements can be the base for performance modeling and
prediction. With the help of performance models, the performance
metrics (like buffer estimation and waiting time) can be determined during the development process. This paper describes the queue models that can be applied in estimating the queue length, and thereby the final value of the memory size. Both simulation and
experimental studies using synthesized workloads and analysis of
real-world Gateway Servers demonstrate the effectiveness of the
proposed system.
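The abstract does not fix a particular queue model. As a background sketch, the standard M/M/1 model shows how queue length and waiting time can be estimated from a gateway's arrival and service rates; the rates below are illustrative, not measurements from the paper.

```python
# M/M/1 sketch of queue-length and waiting-time estimation for a gateway.
def mm1_metrics(arrival_rate: float, service_rate: float):
    """Mean number in system (L) and mean time in system (W) for an
    M/M/1 queue: L = rho/(1-rho), and Little's law W = L/lambda."""
    rho = arrival_rate / service_rate      # server utilization
    assert rho < 1, "unstable queue: arrivals exceed service capacity"
    L = rho / (1 - rho)                    # mean number in system
    W = L / arrival_rate                   # mean response time
    return L, W

# 80 requests/s arriving at a gateway that serves 100 requests/s:
L, W = mm1_metrics(80.0, 100.0)
print(f"mean queue length ~ {L:.1f} requests, mean response time ~ {W*1000:.0f} ms")
```

The mean queue length L is the quantity that drives buffer (memory) sizing, which is the use the abstract describes for its queue-length estimates.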
Abstract: The problem of exponential stability and periodicity for a class of delayed cellular neural networks (DCNNs) with time-varying delays is investigated. By dividing the network state variables into subgroups according to the characters of the neural networks, some sufficient conditions for exponential stability and periodicity are derived via the method of variation of parameters and inequality techniques. These conditions are expressed in terms of blocks of the interconnection matrices. Compared with some previous methods, the method used in this paper does not resort to any Lyapunov function, and the results improve and generalize some earlier criteria established in the literature. Two examples are discussed to illustrate the main results.
Abstract: Design and land use are closely linked to the energy efficiency levels of an urban area. Current city planning practice does not involve an effective land use-energy evaluation in its 'blueprint' urban plans. The study proposes an appraisal method, embeddable in GIS programs and based on five planning criteria, that measures how far a planner can depart from the planning principles (criteria) for the most energy output s/he can obtain. The case of Balcova, a district in the Izmir Metropolitan area, is used for evaluating the proposed master plan and the use of geothermal energy (heating only) in the district concerned.
If the land use design were revised for at-most energy efficiency (a 30% gain was obtained), mainly by increasing the density around the geothermal wells and proposing more mixed-use zones, we would have a 17% distortion (infidelity to the main planning principles) from the original plan. The proposed method can be an effective simulation tool for planners, with calculations that can be made by GIS-ready tools, to evaluate efficiency levels for different plan proposals, showing how much energy saving causes how much deviation from the other planning ideals. Lower energy use can be achieved with different land use proposals under various policy trials.
Abstract: Semiconductor materials with coatings have a wide range of applications in MEMS and NEMS. This work uses the transfer-matrix method for calculating the radiative properties. Doped silicon is used and the coherent formulation is applied. The Drude model for the optical constants of doped silicon is employed. Results show that for visible wavelengths, more emittance occurs at greater concentrations and the reflectance decreases as the concentration increases. At these wavelengths, transmittance is negligible, and donors and acceptors act similarly. The effect of wave interference can be understood by plotting spectral properties, such as the reflectance or transmittance of a thin dielectric film, versus the film thickness and analyzing the oscillations of the properties due to constructive and destructive interference; however, this effect is not observed at visible wavelengths. At room temperature, the scattering process is dominated by lattice scattering for lightly doped silicon, while impurity scattering becomes important for heavily doped silicon when the dopant concentration exceeds 10^18 cm^-3.
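The Drude model mentioned above can be sketched as follows. The carrier concentration, effective mass, and scattering time used here are illustrative placeholders, not the paper's fitted values for doped silicon.

```python
# Drude dielectric function: eps(w) = eps_inf - wp^2 / (w^2 + i w / tau),
# with plasma frequency wp^2 = N e^2 / (eps0 m*). From eps the complex
# refractive index n + i k follows, which feeds the transfer-matrix method.
import numpy as np

E0 = 8.854e-12      # vacuum permittivity, F/m
QE = 1.602e-19      # elementary charge, C
ME = 9.109e-31      # electron rest mass, kg

def drude_epsilon(omega, N, m_eff, tau, eps_inf=11.7):
    """Complex dielectric function of a doped semiconductor at angular
    frequency omega, carrier density N (m^-3), effective-mass ratio
    m_eff, and scattering time tau (s)."""
    wp2 = N * QE**2 / (E0 * m_eff * ME)        # plasma frequency squared
    return eps_inf - wp2 / (omega**2 + 1j * omega / tau)

# Heavily doped silicon (~1e19 cm^-3 = 1e25 m^-3) at a visible wavelength:
wavelength = 600e-9
omega = 2 * np.pi * 3e8 / wavelength
eps = drude_epsilon(omega, N=1e25, m_eff=0.27, tau=1e-14)
print(eps, np.sqrt(eps))   # dielectric function and complex index n + ik
```

At visible frequencies the free-carrier term is a small correction to the background permittivity, consistent with the abstract's observation that doping effects on visible-range properties grow only with concentration.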
Abstract: The increasing competitiveness in manufacturing
industry is forcing manufacturers to seek effective processing
schedules. The paper presents an optimization-based manufacturing scheduling approach for processing dependent details (parts) with given processing sequences and times on multiple machines. By defining the decision variables as the start and end moments of each detail's processing, it is possible to use straightforward variable restrictions to satisfy different technological requirements and to formulate easy-to-understand, easy-to-solve optimization tasks for multiple details and machines. A case study example is solved for seven base moldings for CNC metalworking machines processed on five different machines with a given processing order among the details and machines and known processing times. By solving the linear optimization task, the optimal manufacturing schedule minimizing the overall processing time is obtained. The
manufacturing schedule defines the moments of moldings delivery
thus minimizing storage costs and provides mounting due-time
satisfaction. The proposed optimization approach is based on a real manufacturing plant problem. Different processing schedule variants for different technological restrictions were defined and implemented in practice at the Bulgarian company RAIS Ltd. The proposed
approach could be generalized for other job shop scheduling
problems for different applications.
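With the processing sequences fixed, the earliest-start schedule that such a linear optimization task yields can be sketched by propagating the precedence restrictions forward. The two-molding, two-machine data below are invented for illustration; they are not the seven-molding, five-machine case study.

```python
# Earliest-start schedule under fixed processing sequences: each
# operation starts as soon as both its job's previous operation and its
# machine are free. Decision values are the start/end moments, as in the
# abstract; the makespan is the latest end moment.
jobs = {
    # job -> ordered list of (machine, processing_time)
    "molding_A": [("M1", 3), ("M2", 2)],
    "molding_B": [("M1", 2), ("M2", 4)],
}

machine_free = {}   # earliest moment each machine is free
job_free = {}       # earliest moment each job's next operation may start
start, end = {}, {}

for job, ops in jobs.items():           # machines take jobs in this order
    for i, (machine, p) in enumerate(ops):
        s = max(job_free.get(job, 0), machine_free.get(machine, 0))
        start[(job, i)] = s
        end[(job, i)] = s + p
        job_free[job] = s + p
        machine_free[machine] = s + p

makespan = max(end.values())
print(start, end, makespan)
```

The end moments directly give the delivery moments of each molding, which is how the schedule supports the storage-cost and due-time arguments in the abstract.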
Abstract: In third-generation nuclear reactors, the core dimensions, the kind of coolant and the fuel enrichment percentage have changed significantly compared with the second generation. The aim of this article is therefore a comparative investigation between two same-power reactors of the second and third generations, in which the neutronic parameters of both reactors, such as K∞ and Keff and their details, and the thermal-hydraulic parameters, such as power density, specific power, volumetric heat rate, released power per fuel volume unit, and the volume and mass of clad and fuel (comprising fissile and fertile fuels), are calculated and compared. Through this comparison, the efficiency and improvements of third-generation nuclear reactors over second-generation reactors of the same power can be distinguished.
In order to calculate the cited parameters, information such as the core dimensions, the lattice pitch, the fuel material, the enrichment percentage and the kind of coolant is used. For calculating the neutronic parameters, a neutronic program entitled SIXFAC and related formulas have been used; for calculating the thermal-hydraulic and other parameters, analytical methods and related formulas have been applied.
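The relation between the K∞ and Keff compared above can be sketched with the standard four-factor and six-factor formulas. The factor values below are illustrative PWR-like numbers, not the SIXFAC results for either reactor.

```python
# Four-factor formula for an infinite lattice, and the six-factor
# extension to a finite core via the non-leakage probabilities.
def k_infinity(eta, epsilon, p, f):
    """k_inf = eta * epsilon * p * f: reproduction factor, fast-fission
    factor, resonance escape probability, thermal utilization."""
    return eta * epsilon * p * f

def k_effective(k_inf, P_fnl, P_tnl):
    """k_eff = k_inf times the fast and thermal non-leakage probabilities."""
    return k_inf * P_fnl * P_tnl

# Illustrative PWR-like factor values:
k_inf = k_infinity(eta=2.02, epsilon=1.03, p=0.75, f=0.72)
k_eff = k_effective(k_inf, P_fnl=0.97, P_tnl=0.99)
print(k_inf, k_eff)
```

Because leakage can only reduce multiplication, Keff is always below K∞; the comparison of both values and their constituent factors is what distinguishes the two reactor generations in the study.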
Abstract: In recent years, much research has been actively done to mine the exploding Web world, especially User Generated Content (UGC) such as weblogs, for knowledge about various phenomena and events in the physical world, and Web services based on the Web-mined knowledge have begun to be developed for the public. However, there are few detailed investigations of how accurately Web-mined data reflect physical-world data, and it is problematic to blindly utilize Web-mined data in public Web services without sufficiently ensuring their accuracy. Therefore, this paper introduces the simplest Web Sensor and the spatiotemporally-normalized
Web Sensor to extract spatiotemporal data about a target
phenomenon from weblogs searched by keyword(s) representing the
target phenomenon, and tries to validate the potential and reliability of the Web-sensed spatiotemporal data by four kinds of granularity
analyses of coefficient correlation with temperature, rainfall, snowfall,
and earthquake statistics per day by region of Japan Meteorological
Agency as physical-world data: spatial granularity (region-s population
density), temporal granularity (time period, e.g., per day vs. per week), representation granularity (e.g., “rain" vs. “heavy rain"), and
media granularity (weblogs vs. microblogs such as Tweets).
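One such granularity analysis can be sketched as correlating a per-day count of keyword-matching weblog posts with the corresponding physical observation. The two series below are invented for illustration; they are not JMA statistics or real weblog counts.

```python
# Pearson correlation between daily "rain"-keyword weblog counts and
# daily rainfall observations: the coefficient measures how well the
# Web-sensed series tracks the physical-world series.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

rainfall_mm = [0, 5, 12, 0, 30, 2, 18]       # per-day observations (made up)
posts_with_rain = [1, 9, 20, 2, 51, 5, 33]   # per-day keyword counts (made up)
print(pearson(rainfall_mm, posts_with_rain))
```

Repeating this computation while varying the region, the aggregation period, the keyword, and the medium yields exactly the four granularity comparisons listed in the abstract.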
Abstract: In real-time networks, a large number of application programs rely on video data and heterogeneous data transmission techniques. The aim of this research is to present a method for guaranteeing end-to-end quality of service at the application layer for sending compressed video data over wireless heterogeneous networks. The method tries to improve video transmission over wireless heterogeneous networks using techniques at the link and application layers. The offered method shows a considerable improvement in the quality observed by the user. In addition, other properties, such as reducing the data load that requires resending and limiting the connection period to the time required for a second data transmission, help make the offered method usable in wireless devices that have limited energy. The presented method and the achieved improvement are simulated and presented in the NS-2 software.
Abstract: This paper proposes a new approach to offer a private
cloud service in HPC clusters. In particular, our approach relies on
automatically scheduling users' customized environment requests as normal jobs in the batch system. After a virtualization request job finishes, its guest operating system is dismissed so that the compute nodes are released again for computing. We present initial work on the innovative integration of an HPC batch system and virtualization tools, aiming at their coexistence with the minimal interference required by a traditional HPC cluster. Given the design of the initial infrastructure, the proposed effort has the potential to positively impact this synergy model. The results from the experiment show that the goal of provisioning customized cluster environments can indeed be fulfilled by using virtual machines, and that efficiency can be improved with proper setup and arrangements.
Abstract: Delay and Disruption Tolerant Networking is part of the Interplanetary Internet, with its primary application being Deep Space Networks. Its terrestrial form has interesting research applications, such as the Alagappa University Delay Tolerant Water Monitoring Network, which doubles as a test bed for improving its routing scheme. DTNs depend on node mobility to deliver packets using a store-carry-and-forward paradigm. Throwboxes are small and
inexpensive stationary devices equipped with wireless interfaces and
storage. We propose the use of Throwboxes to enhance the contact
opportunities of the nodes and hence improve the Throughput. The
enhancement is evaluated using Alunivdtnsim, a desktop simulator written in the C language, and the results are presented graphically.
Abstract: Cerium-doped lanthanum bromide LaBr3:Ce(5%)
crystals are considered to be one of the most advanced scintillator
materials used in PET scanning, combining a high light yield, fast
decay time and excellent energy resolution. Apart from the correct
choice of scintillator, it is also important to optimise the detector
geometry, not least in terms of source-to-detector distance in order to
obtain reliable measurements and efficiency. In this study a
commercially available 25 mm x 25 mm BrilLanCe™ 380 LaBr3:Ce (5%) detector was characterised in terms of its efficiency at varying source-to-detector distances. Gamma-ray spectra of 22Na, 60Co, and 137Cs were separately acquired at distances of 5, 10, 15, and 20 cm. As a result of the change in solid angle subtended by the detector, the geometric efficiency decreased with increasing distance.
High efficiencies at low distances can cause pulse pile-up when
subsequent photons are detected before previously detected events
have decayed. To reduce this systematic error, the choice of source-to-detector distance should balance efficiency against pulse pile-up suppression, as pile-up corrections would otherwise be necessary at short distances. In addition to the experimental
measurements Monte Carlo simulations have been carried out for the
same setup, allowing a comparison of results. The advantages and
disadvantages of each approach have been highlighted.
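The solid-angle argument above can be sketched for an on-axis point source facing a disk-shaped detector face: the subtended solid angle is Ω = 2π(1 − d/√(d² + r²)), and the geometric efficiency is Ω/4π. This is the standard textbook geometry, not the paper's Monte Carlo model.

```python
# Geometric efficiency of a disk-faced detector for an on-axis point
# source at distance d: fraction of the full 4*pi sphere subtended.
import math

def geometric_efficiency(d_cm: float, r_cm: float) -> float:
    """Omega / (4 pi) for a point source on the axis of a disk of
    radius r_cm at distance d_cm."""
    omega = 2 * math.pi * (1 - d_cm / math.sqrt(d_cm**2 + r_cm**2))
    return omega / (4 * math.pi)

# 25 mm diameter detector face (r = 1.25 cm) at the distances used above:
for d in (5, 10, 15, 20):
    print(d, geometric_efficiency(d, 1.25))
```

The rapid fall-off with distance quantifies the trade-off the abstract describes: close geometries maximize efficiency but raise the count rate, and with it the pulse pile-up probability.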
Abstract: The state-of-the-art Bag of Words model in Content-Based Image Retrieval has been used for years, but relevance feedback strategies for this model have not been fully investigated. Being inspired by text retrieval, the Bag of Words model can draw on the wealth of knowledge and practices available there. We study and experiment with the relevance feedback model from text retrieval, adapting it to image retrieval. The experiments show that the techniques from text retrieval give good results for image retrieval and that further improvement is possible.
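The abstract does not name the specific feedback model; Rocchio is the classic text-retrieval scheme and adapts directly to Bag of Words image histograms, as in the following sketch (the four-word vocabulary and histograms are invented for illustration).

```python
# Rocchio relevance feedback on Bag of Words histograms: move the query
# vector toward the mean of relevant images and away from the mean of
# non-relevant ones, then re-rank with the updated query.
import numpy as np

def rocchio(query, relevant, non_relevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Updated query: alpha*q + beta*mean(relevant) - gamma*mean(non_relevant)."""
    q = alpha * query
    if len(relevant):
        q = q + beta * np.mean(relevant, axis=0)
    if len(non_relevant):
        q = q - gamma * np.mean(non_relevant, axis=0)
    return np.clip(q, 0, None)   # visual-word weights cannot go negative

# Toy 4-word vocabulary: the query drifts toward the relevant histograms.
query = np.array([1.0, 0.0, 1.0, 0.0])
rel = np.array([[2.0, 0.0, 3.0, 0.0], [1.0, 0.0, 2.0, 1.0]])
nonrel = np.array([[0.0, 5.0, 0.0, 2.0]])
print(rocchio(query, rel, nonrel))
```

The alpha/beta/gamma weights shown are the conventional text-retrieval defaults; tuning them for visual words is one of the adaptation questions such a study can examine.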