Abstract: Node Gain Scores (NGSs), determined from optimized bandwidth and latency discrepancy ratios, govern the construction of max-heap-form overlays. Each NGS combines the discrepancy ratio of the bandwidth requested with respect to the estimated available bandwidth and the latency discrepancy ratio between the node and the source node. The resulting tree leads to enhanced-delivery overlay multicasting, increasing packet delivery that could otherwise be hindered by packet loss in schemes that do not consider the combined effect of these parameters when placing nodes on the overlay. The NGS is a function of four main parameters: the estimated available bandwidth, Ba; the individual node's requested bandwidth, Br; the proposed node latency to its prospective parent, Lp; and the suggested best latency as advised by the source node, Lb. The bandwidth discrepancy ratio (BDR) and latency discrepancy ratio (LDR) carry weights of α and (1,000 − α), respectively, with α arbitrarily chosen between 0 and 1,000 to ensure that the NGS values, used as node IDs, retain a good likelihood of uniqueness while balancing the relative importance of the BDR and the LDR. A max-heap-form tree is constructed under the assumption that all nodes possess an NGS less than that of the source node. To maintain load balance, children of each level's siblings are evenly distributed: a node cannot accept a second child, and so on, until all of its siblings able to do so have acquired the same number of children, proceeding logically from left to right in the conceptual overlay tree. Records of the pair-wise approximate available bandwidths, as measured by a pathChirp scheme at individual nodes, are maintained.
Evaluations comparing the scheme with other schemes – Bandwidth Aware multicaSt architecturE (BASE), Tree Building Control Protocol (TBCP), and Host Multicast Tree Protocol (HMTP) – have been conducted. The new scheme generally performs better in the trade-off among packet delivery ratio, link stress, control overhead, and end-to-end delay.
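As a concrete illustration, the weighted NGS and its max-heap ordering can be sketched as follows. The exact forms of the BDR and LDR are assumptions here; the abstract only states that they relate Br to Ba and Lp to Lb:

```python
import heapq

def node_gain_score(Ba, Br, Lp, Lb, alpha=500):
    # Assumed ratio forms: more spare bandwidth and lower latency
    # relative to the source's suggested best latency raise the score.
    bdr = Ba / Br          # bandwidth discrepancy ratio (assumption)
    ldr = Lb / Lp          # latency discrepancy ratio (assumption)
    return alpha * bdr + (1000 - alpha) * ldr

# Order joining nodes by NGS; heapq is a min-heap, so scores are
# negated to obtain max-heap behaviour.
nodes = [("n1", 8.0, 2.0, 40.0, 20.0),   # (name, Ba, Br, Lp, Lb)
         ("n2", 4.0, 2.0, 25.0, 20.0)]
heap = []
for name, Ba, Br, Lp, Lb in nodes:
    heapq.heappush(heap, (-node_gain_score(Ba, Br, Lp, Lb), name))
best = heapq.heappop(heap)[1]   # node placed closest to the source
```

With α = 500 both ratios are weighted equally; skewing α toward 0 or 1,000 shifts the placement decision toward latency or bandwidth, respectively.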
Abstract: By investigating the impact of the complexity of stereoscopic frame pairs on stereoscopic video coding and transmission, a new rate control algorithm is presented. The proposed rate control algorithm operates on three levels: the stereoscopic group of pictures (SGOP) level, the stereoscopic frame (SFrame) level, and the frame level. A temporal-spatial frame complexity model is first established; in the bit allocation stage, the frame complexity, position significance, and reference relationship between the left and right frames are taken into account. Meanwhile, the target buffer is set according to the frame complexity. Experimental results show that the proposed method can efficiently control the bitrate and that it outperforms the fixed quantization parameter method from a rate-distortion perspective, with an average PSNR gain between rate-distortion curves (BDPSNR) of 0.21 dB.
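The BDPSNR figure quoted above is the standard Bjøntegaard delta metric: both rate-distortion curves are fitted with cubic polynomials of PSNR versus log-rate, and the average vertical gap over the overlapping rate range is reported. A minimal sketch (function name, interface, and sample points are illustrative):

```python
import numpy as np

def bd_psnr(rates_ref, psnr_ref, rates_test, psnr_test):
    # Fit cubic polynomials of PSNR as a function of log bitrate.
    lr_ref, lr_test = np.log(rates_ref), np.log(rates_test)
    p_ref = np.polyfit(lr_ref, psnr_ref, 3)
    p_test = np.polyfit(lr_test, psnr_test, 3)
    # Integrate both fits over the common log-rate interval.
    lo = max(lr_ref.min(), lr_test.min())
    hi = min(lr_ref.max(), lr_test.max())
    int_ref = np.polyval(np.polyint(p_ref), hi) - np.polyval(np.polyint(p_ref), lo)
    int_test = np.polyval(np.polyint(p_test), hi) - np.polyval(np.polyint(p_test), lo)
    # Positive result: the test codec gains PSNR on average.
    return (int_test - int_ref) / (hi - lo)

rates = np.array([100.0, 200.0, 400.0, 800.0])   # kbps (example points)
psnr = np.array([30.0, 33.0, 36.0, 39.0])        # dB
gain = bd_psnr(rates, psnr, rates, psnr + 0.5)   # uniformly 0.5 dB better
```

A curve that is uniformly 0.5 dB above the reference yields a BDPSNR of exactly 0.5 dB, which is a convenient sanity check for the fit.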
Abstract: In cryptography, confusion and diffusion are essential to achieving the confidentiality and privacy of messages in block ciphers and stream ciphers. Two types of network provide the confusion and diffusion properties of a message in block ciphers: the Substitution-Permutation network (S-P network) and the Feistel network. NLFS (Non-Linear Feedback Stream cipher) is a fast and secure stream cipher for software applications. NLFS has two modes: a basic (synchronous) mode and a self-synchronous mode. True random numbers are non-deterministic. An R-box (random box) is based on dynamic properties and performs a stochastic transformation of data, which can be used effectively to protect information from intentional destructive impacts. In this paper, a new implementation of stochastic transformation is proposed.
Abstract: For the past couple of decades, weak signal detection has been of crucial importance in various engineering and scientific applications, including wireless communication, radar, aerospace engineering, and control systems. Usually, weak signal detection requires a phase-sensitive detector and a demodulation module to detect and analyze the signal. This article provides an introduction to an intrusion detection system that can effectively detect a weak signal within a multiplexed signal. By carefully inspecting and analyzing the respective signal, the system can successfully indicate any peripheral intrusion. The intrusion detection system (IDS) is a comprehensive and straightforward approach to detecting and analyzing any signal that is weakened and garbled due to a low signal-to-noise ratio (SNR). This approach is of significant importance in applications such as peripheral security systems.
Abstract: Sustainable development is a concept that originated with the Brundtland Commission's 1987 report. Although the concept was born out of environmental concerns, it rapidly penetrated all areas, becoming the dominant view of planning. Concern for future generations, especially in connection with heritage, has a long history. Each approach, with all of its characteristics, produces differences in planning; planning always reflects the dominant ideas of its age. This paper studies sustainable development in planning for historical cities, with the aim of finding ways to deal with heritage in planning for historical cities in Iran. In doing so, it illustrates how the challenges between the sustainability concept and heritage can be resolved in planning.
Consequently, the paper emphasizes:
Sustainable development in city planning
Trends regarding heritage
Challenges due to planning for historical cities in Iran
For the first two issues, a documentary method based on the sustainable development and heritage literature is used. As the next step, focusing on Iranian historical cities requires considering the urban planning and management structure and identifying the main heritage-related challenges, so an analysis of challenges regarding heritage is carried out. As a result, it is shown that the key issue in such planning is active conservation, which improves and uses the potential of heritage while guaranteeing its continuous conservation. An emphasis on the planning system in Iran makes it clear that reforms are needed in this system and in its way of relating to heritage. The main weakness in planning for historical cities in Iran is the lack of independent city management; without it, achieving active conservation, the main factor of sustainable development, is not possible.
Abstract: Multimedia security is an incredibly significant area of concern. A number of papers on robust digital watermarking have been presented, but no standards have been defined so far; multimedia security thus remains an open problem. The aim of this paper is to design a robust image-watermarking scheme that can withstand a diverse set of attacks. The proposed scheme provides a robust solution integrating image moment normalization, content-dependent watermarking, and the discrete wavelet transform. Moment normalization makes it possible to recover the watermark even in the case of geometrical attacks. Content-dependent watermarks are a powerful means of authentication, as the data is watermarked with its own features. Discrete wavelet transforms are used because they describe image features well. The proposed scheme finds its place in validating identification cards and financial instruments.
Abstract: Medical Decision Support Systems (MDSSs) are sophisticated, intelligent systems that can provide inference under incomplete information and uncertainty. In such systems, the uncertainty is modeled using various soft computing methods such as Bayesian networks, rough sets, artificial neural networks, fuzzy logic, inductive logic programming, and genetic algorithms, as well as hybrid methods formed by combining several of these. In this study, symptom-disease relationships are presented in a framework modeled with formal concept analysis, with diseases as objects and symptoms as attributes. After the concept lattice is formed, Bayes' theorem can be used to determine the relationships between attributes and objects. A discernibility relation, which forms the basis of rough sets, can be applied to the attribute data sets in order to reduce the attributes and decrease the computational complexity.
Abstract: In this paper, we consider the problem of distributed adaptive estimation in wireless sensor networks under two different observation noise conditions. In the first case, we assume that some sensors in the network have a high observation noise variance (noisy sensors). In the second case, a different observation noise variance is assumed for each sensor, which is closer to a real scenario. In both cases, an initial estimate of each sensor's observation noise is obtained. For the first case, we show that when such sensors are present, the performance of conventional distributed adaptive estimation algorithms, such as the incremental distributed least mean square (IDLMS) algorithm, decreases drastically, and that detecting and ignoring these sensors leads to better estimation performance. We then propose a simple algorithm to detect these noisy sensors and modify the IDLMS algorithm to deal with them. For the second case, we propose a new algorithm in which the step-size parameter is adjusted for each sensor according to its observation noise variance. As the simulation results show, the proposed methods outperform the IDLMS algorithm under the same conditions.
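The second idea above, a per-sensor step size tied to the observation noise variance, can be illustrated with a minimal single-node LMS sketch. The inverse-scaling rule mu0/(1 + var) below is a hypothetical choice, not the paper's exact adjustment:

```python
import numpy as np

def lms_estimate(U, d, mu):
    # Standard LMS: iteratively refine the weight vector w from
    # regressor rows U and noisy desired responses d.
    w = np.zeros(U.shape[1])
    for k in range(len(d)):
        e = d[k] - U[k] @ w
        w = w + mu * e * U[k]
    return w

rng = np.random.default_rng(0)
w_true = np.array([1.0, -0.5])

# Two sensors observe the same parameter with different noise
# variances; noisier sensors get a smaller step size (assumed rule).
mu0 = 0.05
estimates = []
for var in (0.01, 1.0):
    U = rng.standard_normal((2000, 2))
    d = U @ w_true + np.sqrt(var) * rng.standard_normal(2000)
    mu = mu0 / (1.0 + var)
    estimates.append(lms_estimate(U, d, mu))
```

A smaller step size for the noisier sensor lowers its steady-state misadjustment at the cost of slower convergence, which is the trade-off the per-sensor adjustment exploits.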
Abstract: The impact assessment, in its various forms, has recently become a very important part of policy-making and legislation in many countries. Regulatory impact assessment (RIA) is yet another set of analytical methods deployed in the legislation of the European Union and of many developed countries, as well as in developing ones such as Mexico, Malaysia, and the Philippines. The aim of this paper is to provide a theoretical background for economic models in regulatory impact assessment and an overview of their application, especially to the financial market in the Czech Republic. We found an inadequate application of these models, which leaves room for further research in this field.
Abstract: In this paper, the least-squares design of variable fractional-delay (VFD) finite impulse response (FIR) digital differentiators is proposed. The transfer function is formulated so that the Farrow structure can be applied to realize the designed system. The symmetric characteristics of the filter coefficients are also derived, which reduces complexity by saving almost half of the coefficients. Moreover, all the elements of the vectors and matrices involved in the optimization can be represented in closed form, which makes the design easier. A design example is presented to illustrate the effectiveness of the proposed method.
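The Farrow structure mentioned above factors the variable parameter out of the fixed subfilters: H(z, p) = Σ_m p^m C_m(z), so only a Horner combination depends on p. The sketch below illustrates the structure with a first-order fractional-delay interpolator rather than the paper's differentiator; the coefficient matrix is an assumed toy example:

```python
import numpy as np

def farrow_filter(x, C, p):
    # C is an (M+1) x N matrix: row m holds the taps of fixed
    # subfilter C_m(z). The variable p only scales branch outputs,
    # so the subfilters never need redesign when p changes.
    branches = [np.convolve(x, c)[: len(x)] for c in C]
    # Horner evaluation of sum_m p^m * branch_m
    y = branches[-1]
    for b in reversed(branches[:-1]):
        y = b + p * y
    return y

# First-order (linear-interpolation) fractional delay:
# y(n) = (1 - p) x(n) + p x(n-1)
C = np.array([[1.0, 0.0],    # C_0: identity
              [-1.0, 1.0]])  # C_1: x(n-1) - x(n)
x = np.arange(5, dtype=float)   # a ramp shifts by exactly p samples
y = farrow_filter(x, C, p=0.3)
```

On the ramp input, every output sample (after the startup transient) is delayed by exactly 0.3 samples, confirming the structure behaves as a fractional delay.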
Abstract: A preliminary evaluation of the feasibility of installing small wind turbines on offshore oil and gas extraction platforms is presented. Some aerodynamic considerations are developed in order to determine the best rotor architecture for exploiting the wind potential of such installations, assuming that wind conditions over the platforms are similar to those registered on the roofs of urban buildings. Economic considerations about both the advantages and disadvantages of exploiting wind energy on offshore extraction platforms, with respect to conventional offshore wind plants, are also presented. Finally, wind charts of European offshore winds are presented together with a map of the major offshore installations.
Abstract: The information revealed by derivatives can help to
better characterize digital near-end crosstalk signatures with the
ultimate goal of identifying the specific aggressor signal.
Unfortunately, derivatives tend to be very sensitive to even low
levels of noise. In this work we approximated the derivatives of both
quiet and noisy digital signals using a wavelet-based technique. The
results are presented for Gaussian digital edges, IBIS Model digital
edges, and digital edges in oscilloscope data captured from an actual
printed circuit board. Tradeoffs between accuracy and noise
immunity are presented. The results show that the wavelet technique
can produce first derivative approximations that are accurate to
within 5% or better, even under noisy conditions. The wavelet
technique can be used to calculate the derivative of a digital signal
edge when conventional methods fail.
Abstract: This paper presents a new fingerprint coding technique
based on contourlet transform and multistage vector quantization.
Wavelets have shown their ability in representing natural images that
contain smooth areas separated with edges. However, wavelets
cannot efficiently take advantage of the fact that the edges usually
found in fingerprints are smooth curves. This issue is addressed by
directional transforms, known as contourlets, which have the
property of preserving edges. The contourlet transform is a new
extension to the wavelet transform in two dimensions using
nonseparable and directional filter banks. The computation and
storage requirements are the major difficulty in implementing a
vector quantizer. In the full-search algorithm, the computation and
storage complexity is an exponential function of the number of bits
used in quantizing each frame of spectral information. The storage
requirement in multistage vector quantization is less when compared
to full search vector quantization. The coefficients of contourlet
transform are quantized by multistage vector quantization. The
quantized coefficients are encoded by Huffman coding. The results obtained are tabulated and compared with those of existing wavelet-based schemes.
Abstract: This paper deals with the new concept of using compressed atmospheric air as a zero-pollution power source for running motorbikes. The motorbike is equipped with an air turbine in place of an internal combustion engine, which transforms the energy of the compressed air into shaft work. The mathematical modeling and performance evaluation of a small-capacity, compressed-air-driven, vaned-type novel air turbine are presented. The effects of isobaric admission and adiabatic expansion of high-pressure air for different rotor-to-casing diameter ratios and different vane angles (numbers of vanes) have been considered and analyzed. It is found that the shaft work output is optimal for certain rotor-to-casing diameter ratios at a particular vane angle (number of vanes). In this study, the maximum power obtained is 4.5 kW to 5.3 kW (5.5 to 6.25 HP) when the casing diameter is 100 mm and the rotor-to-casing diameter ratio is kept between 0.55 and 0.65. This output is sufficient to run a motorbike.
Abstract: This paper presents a multi-objective order allocation
planning problem with the consideration of various real-world
production features. A novel hybrid intelligent optimization model,
integrating a multi-objective memetic optimization process, a Monte
Carlo simulation technique and a heuristic pruning technique, is
proposed to handle this problem. Experiments based on industrial data
are conducted to validate the proposed model. Results show that the proposed model can effectively solve the investigated problem by providing effective production decision-making solutions, and that it outperforms an NSGA-II-based optimization process and an industrial method.
Abstract: This paper proposes a set of quasi-static mathematical models of the magnetic fields caused by the high-voltage conductors of a distribution transformer, using a set of second-order partial differential equations. Modifications for complex magnetic field analysis and time-harmonic simulation are also utilized. In this research, transformers were studied under both balanced and unbalanced loading conditions. Computer-based simulation utilizing the three-dimensional finite element method (3-D FEM) is exploited as a tool for visualizing the magnetic field distribution throughout a distribution transformer. The finite element method (FEM) is one of the popular numerical methods able to handle problem complexity in various forms, and it is now widely applied in most engineering fields. Even for problems of magnetic field distribution, the FEM can estimate solutions of Maxwell's equations governing power transmission systems. The computer simulation based on the FEM has been developed in the MATLAB programming environment.
Abstract: This paper describes a new measuring algorithm for three-dimensional (3-D) braided composite materials. The braiding angle is an important parameter of braided composites. The objective of this paper is to present an automatic measuring system. The algorithm is implemented in VC++ 6.0 on a PC. An advanced filtering algorithm for images of 3-D braided composite preforms has been developed. The procedure is completely automatic and relies on the gray-scale information content of the images and their local wavelet transform modulus maxima. Experimental results show that the proposed method is feasible. The algorithm was tested on both carbon-fiber and glass-fiber preforms.
Abstract: This paper deals with the thermo-mechanical deformation behavior of shear-deformable functionally graded ceramic-metal (FGM) plates. The theoretical formulation is based on a higher-order shear deformation theory with a considerable amendment in the transverse displacement, using the finite element method (FEM). The mechanical properties of the plate are assumed to be temperature-dependent and graded in the thickness direction according to a power-law distribution in terms of the volume fractions of the constituents. The temperature field is assumed to be uniform over the plate surface (XY plane) and to vary in the thickness direction only. The fundamental equations for the FGM plates are obtained using a variational approach, considering traction-free boundary conditions on the top and bottom faces of the plate. A C0-continuous isoparametric Lagrangian finite element with thirteen degrees of freedom per node has been employed to obtain the results. Convergence and comparison studies have been performed to demonstrate the efficiency of the present model. Numerical results are obtained for different thickness ratios, aspect ratios, volume fraction indices, and temperature rises, with different loading and boundary conditions, and are provided in dimensionless tabular and graphical forms. The results show that the temperature field and the gradient in the material properties play a significant role in the thermo-mechanical deformation behavior of the FGM plates.
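The power-law grading referred to above is commonly written as V_c(z) = (z/h + 1/2)^n, with the effective property interpolated between the metal and ceramic values. This standard form is assumed in the sketch below (the paper additionally makes the properties temperature-dependent, which is omitted here):

```python
def fgm_property(z, h, P_metal, P_ceramic, n):
    # Ceramic volume fraction varies from 0 at the bottom face
    # (z = -h/2) to 1 at the top face (z = +h/2); n is the
    # volume fraction index.
    Vc = (z / h + 0.5) ** n
    return P_metal + (P_ceramic - P_metal) * Vc

# Young's modulus through the thickness of an Al/alumina plate
# (illustrative values in GPa), volume fraction index n = 2.
E_top = fgm_property(0.005, 0.01, 70.0, 380.0, 2.0)  # fully ceramic face
E_mid = fgm_property(0.0,   0.01, 70.0, 380.0, 2.0)  # mid-plane
```

Increasing the index n biases the plate toward the metal phase through most of the thickness, which is why results are typically reported for a range of volume fraction indices.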
Abstract: This paper addresses the problem of multisensor data fusion under non-Gaussian channel noise. Advanced M-estimates are known to be a robust solution, but they trade off some accuracy. In order to improve the estimation accuracy while maintaining equivalent robustness, a two-stage robust fusion algorithm is proposed: preliminary rejection of outliers followed by an optimal linear fusion. Numerical experiments show that the proposed algorithm is equivalent to the M-estimates in the case of uncorrelated local estimates and significantly outperforms the M-estimates when local estimates are correlated.
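A minimal sketch of the two-stage idea, under assumptions the abstract does not pin down: a MAD-based gate stands in for the paper's outlier-rejection stage, and inverse-variance weighting for the optimal linear fusion of the surviving local estimates (valid when the survivors are uncorrelated):

```python
import numpy as np

def two_stage_fusion(estimates, variances, thresh=3.0):
    x = np.asarray(estimates, dtype=float)
    v = np.asarray(variances, dtype=float)
    # Stage 1: reject local estimates far from the median,
    # gated by the (scaled) median absolute deviation.
    med = np.median(x)
    mad = np.median(np.abs(x - med)) + 1e-12
    keep = np.abs(x - med) <= thresh * 1.4826 * mad
    # Stage 2: optimal linear fusion of survivors by
    # inverse-variance weighting.
    w = 1.0 / v[keep]
    return float(np.sum(w * x[keep]) / np.sum(w))

# Four sensors agree near 10.0; one gross outlier at 40.0 is rejected.
fused = two_stage_fusion([9.8, 10.1, 10.0, 9.9, 40.0],
                         [1.0, 1.0, 1.0, 1.0, 1.0])
```

Without the rejection stage, the single gross outlier would pull the inverse-variance fusion far from the consensus value, which is exactly the failure mode the two-stage design avoids.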
Abstract: This paper proposes a novel game theoretical
technique to address the problem of data object replication in large-scale distributed computing systems. The proposed technique draws
inspiration from computational economic theory and employs the
extended Vickrey auction. Specifically, players in a non-cooperative
environment compete for server-side scarce memory space to
replicate data objects so as to minimize the total network object
transfer cost, while maintaining object concurrency. Optimization of
such a cost in turn leads to load balancing, fault-tolerance and
reduced user access time. The method is experimentally evaluated
against four well-known techniques from the literature: branch and
bound, greedy, bin-packing and genetic algorithms. The experimental
results reveal that the proposed approach outperforms the four
techniques in both the execution time and solution quality.
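For reference, the basic Vickrey (second-price) mechanism that the extended auction builds on can be sketched as follows; the extended variant used in the paper handles the multi-object replication setting and is not reproduced here:

```python
def vickrey_winner(bids):
    # Sealed-bid second-price auction: the highest bidder wins but
    # pays the second-highest bid, which makes truthful bidding a
    # dominant strategy for every player.
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1]
    return winner, price

# Three servers bid for scarce memory space to host a replica
# (names and values are illustrative).
winner, price = vickrey_winner({"s1": 12.0, "s2": 9.5, "s3": 7.0})
```

The truthfulness property is what makes the auction attractive in a non-cooperative setting: servers have no incentive to misreport the value they place on the scarce memory space.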