Abstract: Abu Dhabi is one of the fastest-developing cities in the region. On top of all current and future environmental challenges, Abu Dhabi aims to be among the top governments in the world in sustainable development. It plans to create an attractive, livable, and sustainably managed urban environment in which all necessary services and infrastructure are provided in a sustainable and timely manner. Abu Dhabi is engaged in the difficult challenge of developing credible environmental indicators to assess these ambitious environmental targets. The aim of these indicators is to provide reliable guidance to decision makers and the public concerning key factors that determine the state of the urban environment, and to identify major areas for policy intervention. In order to ensure sustainable development in the UAE in general, and in Abu Dhabi City in particular, relevant and contextual environmental indicators need to be carefully considered. Such indicators provide a gauge, at the national government scale, of how close countries are to established environmental policy goals. Environmental indicators assist city decision-making in such areas as the identification of significant environmental aspects and the observation of environmental performance trends; they can help to find ways of reducing environmental pollution and improving eco-efficiency. This paper outlines recent strategies implemented in Abu Dhabi that aim to improve the sustainable performance of the city's built environment. The paper explores the variety of current and possible indicators at different levels and their roles in the development of the city.
Abstract: The paper deals with the calculation of the parameters of
a ceramic material from a set of destruction tests of ceramic heads of
total hip joint endoprostheses. The standard way of calculating the
material parameters consists in carrying out a set of three- or
four-point bending tests on specimens cut out from parts of the
ceramic material to be analysed. In the case of ceramic heads, it is
not possible to cut out specimens of the required dimensions because
the heads are too small (if the cut-out specimens were smaller than
the normalised ones, the material parameters derived from them would
exhibit higher strength values than those which the given ceramic
material really has). For that reason, a special testing jig was made,
in which 40 heads were destructed. From the measured values of
circumferential strains of the head's external spherical surface at
destruction, the state of stress in the head at destruction was
established using the finite element method (FEM). From the values
obtained, the sought parameters of the ceramic material were
calculated using Weibull's weakest-link theory.
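As a generic illustration (not the paper's FEM-based procedure), a Weibull modulus and characteristic strength can be estimated from a set of fracture strengths via the linearised Weibull plot; the median-rank probability estimator used here is one common convention:

```python
import math

def weibull_fit(strengths):
    """Estimate Weibull modulus m and characteristic strength s0 from
    fracture strengths via the linearised Weibull plot:
    ln(ln(1/(1-F))) = m*ln(s) - m*ln(s0)."""
    s = sorted(strengths)
    n = len(s)
    # Median-rank probability estimator F_i = (i - 0.3)/(n + 0.4)
    xs = [math.log(v) for v in s]
    ys = [math.log(math.log(1.0 / (1.0 - (i + 1 - 0.3) / (n + 0.4))))
          for i in range(n)]
    # Ordinary least-squares slope and intercept
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, math.exp(-intercept / slope)  # (modulus m, scale s0)
```

A set of 40 destruction strengths, as in the study, is ample for such a fit; the confidence interval of the modulus narrows with sample size.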
Abstract: We propose a decoy-pulse protocol for a frequency-coded implementation of the B92 quantum key distribution protocol. A direct extension of the decoy-pulse method to the frequency-coding scheme results in a security loss, as an eavesdropper can distinguish between signal and decoy pulses by measuring the carrier photon number without affecting other statistics. We overcome this problem by optimizing the ratio of the carrier photon numbers of the decoy and signal pulses to be as close to unity as possible. In our method, the switching between signal and decoy pulses is achieved by changing the amplitude of the RF signal, as opposed to modulating the intensity of the optical signal, thus reducing system cost. We find an improvement by approximately a factor of 100 in the key generation rate using the decoy-state protocol. We also study the effect of source fluctuation on the key rate. Our simulation results show a key generation rate of 1.5×10⁻⁴/pulse for link lengths up to 70 km. Finally, we discuss the optimum value of the average photon number of the signal pulse for a given key rate while also optimizing the carrier ratio.
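The core of the standard decoy-state idea can be sketched generically (this is the textbook weak-coherent-pulse model, not the frequency-coded carrier-ratio optimization described above): comparing the detection gains of pulses with different mean photon numbers lets the receiver bound the single-photon yield.

```python
import math

def gain(mu, eta, y0=1e-5, nmax=20):
    """Expected detection gain of a phase-randomised weak coherent pulse
    with mean photon number mu over a channel of transmittance eta.
    Photon-number statistics are Poissonian; an n-photon pulse is
    detected with yield Y_n = y0 + 1 - (1-eta)**n (y0 = dark-count rate;
    illustrative values, not the paper's parameters)."""
    q = 0.0
    for n in range(nmax + 1):
        p_n = math.exp(-mu) * mu ** n / math.factorial(n)
        y_n = y0 + 1.0 - (1.0 - eta) ** n
        q += p_n * y_n
    return q
```

Measuring `gain` for a signal level (e.g. mu = 0.5) and a weaker decoy level (e.g. 0.1) over-determines the per-photon-number yields, which is what defeats photon-number-splitting attacks.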
Abstract: The Chinese Postman Problem (CPP) is one of the
classical problems in graph theory and is applicable in a wide range
of fields. With the rapid development of hybrid systems and
model-based testing, the Chinese Postman Problem with Time-Dependent
Travel Times (CPPTDT) becomes more realistic than the classical
problem. In previous work, we proposed the first integer programming
formulation for the CPPTDT, namely the circuit formulation, based on
which some polyhedral results were investigated and a cutting-plane
algorithm was designed. However, it has a main drawback: the circuit
formulation is only available for solving the special instances in
which all circuits pass through the origin. Therefore, this paper
proposes a new integer programming formulation for solving all general
instances of the CPPTDT. Moreover, the new formulation dramatically
reduces the size of the circuit formulation, which was too large. This
makes it possible to design more efficient algorithms for solving the
CPPTDT in future research.
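For reference, the classical (time-independent, undirected) CPP that the above generalises can be solved exactly on small graphs: the optimal tour length is the total edge weight plus a minimum-weight pairing of odd-degree vertices along shortest paths. A brute-force sketch (not the paper's integer programming formulation):

```python
def cpp_tour_length(n, edges):
    """Optimal Chinese-postman tour length on a connected undirected
    graph with vertices 0..n-1 and weighted edges (u, v, w): total edge
    weight plus a minimum-weight perfect pairing of odd-degree vertices
    along shortest paths (brute-force recursion, small graphs only)."""
    INF = float("inf")
    dist = [[INF] * n for _ in range(n)]
    deg = [0] * n
    total = 0.0
    for i in range(n):
        dist[i][i] = 0.0
    for u, v, w in edges:
        total += w
        deg[u] += 1
        deg[v] += 1
        dist[u][v] = min(dist[u][v], w)
        dist[v][u] = min(dist[v][u], w)
    # Floyd-Warshall all-pairs shortest paths
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    odd = [v for v in range(n) if deg[v] % 2 == 1]

    def best_pairing(vs):
        if not vs:
            return 0.0
        a, rest = vs[0], vs[1:]
        return min(dist[a][b] + best_pairing(rest[:i] + rest[i + 1:])
                   for i, b in enumerate(rest))

    return total + best_pairing(odd)
```

Time-dependent travel times break this matching-based argument, which is why the CPPTDT requires the integer programming treatment the abstract describes.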
Abstract: As Computed Tomography (CT) normally requires
hundreds of projections to reconstruct an image, patients are exposed
to more X-ray energy, which may cause side effects such as cancer.
Even when the variability of the particles in the object is very low,
Computed Tomography requires many projections for good-quality
reconstruction. In this paper, low variability of the particles in an
object has been exploited to obtain good-quality reconstruction.
Though the reconstructed image and the original image have the same
projections, in general they need not be the same. In addition
to projections, if a priori information about the image is known,
it is possible to obtain a good-quality reconstructed image. In this
paper, it has been shown by experimental results why conventional
algorithms fail to reconstruct from a few projections, and an efficient
polynomial-time algorithm has been given to reconstruct a bi-level
image from its projections along the rows and columns, together with
a known subimage of the unknown image and smoothness constraints, by
reducing the reconstruction problem to an integral max-flow problem.
This paper also discusses the necessary and sufficient conditions for
uniqueness, and the extension of 2D bi-level image reconstruction to
3D bi-level image reconstruction.
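The basic version of this problem, without subimage or smoothness constraints, is classical: a binary matrix consistent with given row and column sums can be built with Ryser's greedy construction, which succeeds exactly when a solution exists. A minimal sketch (the paper's max-flow formulation handles the richer constrained variant):

```python
def reconstruct_bilevel(row_sums, col_sums):
    """Reconstruct a binary matrix with the given row and column sums
    using Ryser's greedy rule: process rows in decreasing order of row
    sum, placing each row's ones in the columns with the largest
    remaining column sums. Returns None when no such matrix exists."""
    if sum(row_sums) != sum(col_sums):
        return None
    m, n = len(row_sums), len(col_sums)
    remaining = list(col_sums)
    img = [[0] * n for _ in range(m)]
    for i in sorted(range(m), key=lambda r: -row_sums[r]):
        # Columns with the largest remaining demand get this row's ones
        cols = sorted(range(n), key=lambda c: -remaining[c])[:row_sums[i]]
        if any(remaining[c] == 0 for c in cols):
            return None
        for c in cols:
            img[i][c] = 1
            remaining[c] -= 1
    return img if all(r == 0 for r in remaining) else None
```

The non-uniqueness the abstract mentions is visible already here: swapping a 2×2 "checkerboard" submatrix preserves all row and column sums, which is why additional a priori information is needed to pin down the original image.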
Abstract: Dhaka, the capital city of Bangladesh, is one of the most
densely populated cities in the world. Due to rapid urbanization, 60%
of its population lives in slum and squatter settlements. The reasons
behind this poverty are low economic growth, inequitable distribution
of income, unequal distribution of productive assets, unemployment
and underemployment, a high rate of population growth, a low level of
human resource development, natural disasters, and limited access to
public services. Along with poverty, the resulting pressure on urban
land, shelter, plots and open spaces creates environmental and
ecological degradation. These constraints mostly result from failures
of government policies and measures, and only the Government can
solve this problem. It is now high time to establish planning and
environmental management policies and sustainable urban development
for the city and for the urban slum dwellers, free from eviction,
criminals, rent seekers and other miscreants.
Abstract: Both the natural and the built environment are essential
for tourism. However, tourism and the environment maintain a complex
relationship, in which the environment is in most cases at the
receiving end. Many tourism development activities have adverse
environmental effects, mainly emanating from the construction of
general infrastructure and tourism facilities. These negative impacts
of tourism can lead to the destruction of the precious natural
resources on which it depends. These effects vary between locations,
and their effect on a hill destination is highly critical. This study
aims at developing a Sustainable Tourism Planning Model for an
environmentally sensitive tourism destination in Kerala, India. Being
part of the Nilgiri mountain ranges, Munnar falls in the Western
Ghats, one of the biological hotspots of the world. Endowed with a
unique high-altitude environment, Munnar inherits highly significant
ecological wealth. Giving prime importance to the protection of this
ecological heritage, the study proposes a tourism planning model with
resource conservation and sustainability as the paramount focus.
Conceiving a novel approach towards sustainable tourism planning, the
study proposes to assess tourism attractions using an Ecological
Sensitivity Index (ESI) and a Tourism Attractiveness Index (TAI).
Integration of these two indices forms the Ecology-Tourism Matrix
(ETM), outlining the base for tourism planning in an environmentally
sensitive destination. The ETM leads to a classification of tourism
nodes according to their Conservation Significance and Tourism
Significance. The spatial integration of such nodes based on the
Hub & Spoke Principle constitutes sub-regions within the STZ.
Ensuing analyses lead to specific guidelines for the STZ as a whole,
specific tourism nodes, hubs and sub-regions. The study results in a
multi-dimensional output, viz., (1) a classification system for
tourism nodes in an environmentally sensitive region or destination,
(2) conservation and tourism development strategies and guidelines
for the micro and macro regions, and (3) a Sustainable Tourism
Planning Tool particularly for ecologically sensitive destinations,
which can be adapted for other destinations as well.
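The ETM classification described above can be sketched as a simple two-axis rule; the 0.5 cut-off and the normalisation of both indices to [0, 1] are illustrative assumptions, not values from the study:

```python
def etm_class(esi, tai, threshold=0.5):
    """Classify a tourism node in the Ecology-Tourism Matrix (ETM)
    from its Ecological Sensitivity Index (ESI) and Tourism
    Attractiveness Index (TAI), both assumed normalised to [0, 1].
    The 0.5 threshold is a hypothetical illustration."""
    eco = "high" if esi >= threshold else "low"
    tour = "high" if tai >= threshold else "low"
    return (eco + " conservation significance",
            tour + " tourism significance")
```

A node scoring high on both axes, for example, would call for tightly managed tourism with strict conservation safeguards, while a low/low node is a weak candidate for development.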
Abstract: In this paper, an effective sliding mode design is
applied to chaos synchronization. The proposed controller can make
the states of two identical modified Chua's circuits globally
asymptotically synchronized. Numerical results are provided to show
the effectiveness and robustness of the proposed method.
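The flavour of such a scheme can be sketched numerically with a standard (unmodified) Chua circuit and a generic reaching-law controller; the gains and the control structure below are illustrative choices, not the paper's design:

```python
import math

def chua(state, alpha=9.0, beta=100.0 / 7.0, m0=-8.0 / 7.0, m1=-5.0 / 7.0):
    """Dimensionless Chua circuit with piecewise-linear nonlinearity."""
    x, y, z = state
    fx = m1 * x + 0.5 * (m0 - m1) * (abs(x + 1) - abs(x - 1))
    return (alpha * (y - x - fx), x - y + z, -beta * y)

def synchronise(T=5.0, dt=1e-3, k=20.0, eta=0.1):
    """Drive-response synchronisation of two identical Chua circuits
    using a sliding-mode-style reaching law u = -k*e - eta*sign(e)
    applied to each error component. Returns the final error norm."""
    drive = [0.7, 0.0, 0.0]
    resp = [-0.5, 0.2, 0.3]
    sgn = lambda v: (v > 0) - (v < 0)
    for _ in range(int(T / dt)):
        fd, fr = chua(drive), chua(resp)
        e = [resp[i] - drive[i] for i in range(3)]
        drive = [drive[i] + dt * fd[i] for i in range(3)]
        resp = [resp[i] + dt * (fr[i] - k * e[i] - eta * sgn(e[i]))
                for i in range(3)]
    return math.sqrt(sum((resp[i] - drive[i]) ** 2 for i in range(3)))
```

With these gains the error collapses to a small chattering band within a fraction of a second, the typical signature of reaching-law sliding-mode designs.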
Abstract: A new SUZ-4 zeolite membrane, with tetraethylammonium
hydroxide as the template, was fabricated on a mullite tube via
hydrothermal sol-gel synthesis in a rotating autoclave reactor. The
suitable synthesis condition was a SiO2:Al2O3 ratio of 21.2 for 4 days
of crystallization at 155 °C under autogenous pressure. The obtained
SUZ-4 possessed a high BET surface area of 396.4 m²/g, a total pore
volume of 2.611 cm³/g, and a narrow pore size distribution, with
needle-shaped crystals of 97 nm mean diameter and 760 nm length. The
SUZ-4 layer obtained from seeded crystallization was thicker than that
obtained without seeds (in situ crystallization).
Abstract: This paper presents an algorithm which extends the rapidly-exploring random tree (RRT) framework to deal with changes of the task environment. This algorithm, called the Retrieval RRT Strategy (RRS), combines a support vector machine (SVM) with RRT and plans the robot motion in the presence of changes of the surrounding environment. The algorithm consists of two levels. At the first level, the SVM is built and selects a proper path from the bank of RRTs for a given environment. At the second level, a real path is planned by the RRT planners for the given environment. The suggested method is applied to the control of KUKA™, a commercial 6-DOF robot manipulator, and its feasibility and efficiency are demonstrated via the co-simulation of MATLAB™ and RecurDyn™.
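The RRT layer of such a two-level scheme can be sketched minimally in 2D; this is the vanilla RRT only (no obstacles, no SVM retrieval level), with a hypothetical 10×10 workspace:

```python
import math
import random

def rrt(start, goal, step=0.5, iters=2000, goal_tol=0.5, seed=1):
    """Minimal 2D rapidly-exploring random tree in an empty 10x10
    square: repeatedly sample a point, extend the nearest tree node
    toward it by one step, and stop once the goal is within tolerance.
    Returns the node path from start, or None on failure."""
    random.seed(seed)
    nodes = [start]
    parent = {0: None}
    for _ in range(iters):
        sample = (random.uniform(0, 10), random.uniform(0, 10))
        i = min(range(len(nodes)),
                key=lambda j: math.dist(nodes[j], sample))
        nx, ny = nodes[i]
        d = math.dist((nx, ny), sample)
        t = min(1.0, step / d) if d > 0 else 0.0
        new = (nx + t * (sample[0] - nx), ny + t * (sample[1] - ny))
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) <= goal_tol:
            path, j = [], len(nodes) - 1
            while j is not None:          # walk back to the root
                path.append(nodes[j])
                j = parent[j]
            return path[::-1]
    return None
```

The RRS's contribution sits above this layer: the SVM maps an observed environment to a previously grown tree in the bank, so planning resumes from a good prior rather than from scratch.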
Abstract: Li1.5Al0.5Ti1.5(PO4)3 (LATP) has received much
attention as a solid electrolyte for lithium batteries. In this study,
the LATP solid electrolyte is prepared by the co-precipitation method
using Li3PO4 as the Li source. The LATP is successfully prepared, and
the Li-ion conductivities of the bulk (inner crystal) and the total
(inner crystal and grain boundary) are 1.1 × 10⁻³ and 1.1 × 10⁻⁴
S cm⁻¹, respectively. These values are comparable to the reported
values, in which Li2C2O4 is used as the Li source. It is concluded
that the LATP solid electrolyte can be prepared by the
co-precipitation method using Li3PO4 as the Li source, and this
procedure has an advantage in mass production over the previous
procedure using Li2C2O4, because Li3PO4 is a cheaper reagent than
Li2C2O4.
Abstract: Computed tomography and laminography are heavily investigated in a compressive sensing based image reconstruction framework to reduce the dose to patients as well as to radiosensitive devices such as multilayer microelectronic circuit boards. Researchers are actively working on optimizing compressive sensing based iterative image reconstruction algorithms to obtain better-quality images. However, the effects of the sampled data's properties on the reconstructed image's quality, particularly under insufficiently sampled data conditions, have not been explored in computed laminography. In this paper, we investigate the effects of two data properties, i.e. sampling density and data incoherence, on the image reconstructed by conventional computed laminography and by a recently proposed method called the spherical sinusoidal scanning scheme. We have found that in a compressive sensing based image reconstruction framework, the image quality mainly depends upon the data incoherence when the data is uniformly sampled.
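A standard scalar proxy for the incoherence the abstract discusses is the mutual coherence of the sensing matrix; a small sketch (generic compressive-sensing bookkeeping, not the paper's laminography-specific measure):

```python
import math

def mutual_coherence(A):
    """Mutual coherence of a sensing matrix (given as a list of rows):
    the largest normalised inner product between distinct columns.
    Lower coherence generally permits recovery of a sparse image from
    fewer samples in compressive sensing."""
    cols = list(zip(*A))
    norms = [math.sqrt(sum(v * v for v in c)) for c in cols]
    best = 0.0
    for i in range(len(cols)):
        for j in range(i + 1, len(cols)):
            dot = abs(sum(a * b for a, b in zip(cols[i], cols[j])))
            best = max(best, dot / (norms[i] * norms[j]))
    return best
```

An orthonormal system attains the minimum coherence of zero, while nearly parallel columns push the value toward one, which is the regime where few-view reconstructions degrade.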
Abstract: This paper describes the design and results of FROID,
an outbound intrusion detection system built with agent technology
and supported by an attacker-centric ontology. The prototype
features a misuse-based detection mechanism that identifies remote
attack tools in execution. Misuse signatures composed of attributes
selected through entropy analysis of outgoing traffic streams and
process runtime data are derived from execution variants of attack
programs. The core of the architecture is a mesh of self-contained
detection cells organized non-hierarchically that group agents in a
functional fashion. The experiments show performance gains when
the ontology is enabled as well as an increase in accuracy achieved
when correlation cells combine detection evidence received from
independent detection cells.
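The entropy analysis used for attribute selection can be sketched generically; ranking candidate traffic or runtime attributes by Shannon entropy is a common feature-selection step (the specific selection criterion is the paper's, not reproduced here):

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy (in bits) of an attribute's observed values.
    Candidate signature attributes drawn from outgoing traffic streams
    or process runtime data can be ranked by this measure during
    feature selection."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A constant attribute carries zero entropy; a uniformly varying one carries the maximum, and the gap between an attribute's entropy under normal and attack traffic is what makes it a useful signature component.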
Abstract: A great deal of research work in the field of information
systems security has been based on a positivist paradigm. Applying
the reductionism of the positivist paradigm to information security
means missing the bigger picture, and this lack of holism could be
one of the reasons why security is still overlooked, treated as an
afterthought, or perceived from a purely technical dimension. We need
to reshape our thinking and attitudes towards security, especially in
a complex and dynamic environment such as e-Business, to develop a
holistic understanding of e-Business security in relation to its
context, as well as considering all the stakeholders in the problem
area. In this paper we argue for the suitability of, and need for, a
more inductive, interpretive approach and qualitative research
methods to investigate e-Business security. Our discussion is based
on a holistic framework of enquiry, the nature of the research
problem, the underlying theoretical lens and the complexity of the
e-Business environment. Finally, we present a research strategy for
developing a holistic framework for understanding e-Business security
problems in the context of developing countries, based on an
interdisciplinary inquiry which considers their needs and
requirements.
Abstract: How to efficiently assign system resources to route
client demand through Gateway servers is a tricky predicament. In
this paper, we present an enhanced proposal for the autonomous
performance of Gateway servers under highly dynamic traffic loads.
We devise a methodology to calculate queue length and waiting time
using Gateway server information, in order to reduce response-time
variance in the presence of bursty traffic.
The foremost consideration is performance: because Gateway servers
must offer cost-effective and high-availability services over the
long term, they have to be scaled to meet the expected load.
Performance measurements can be the base for performance modeling
and prediction. With the help of performance models, performance
metrics (such as buffer estimation and waiting time) can be
determined during the development process.
This paper describes the queue models that can be applied in the
estimation of queue length to estimate the final value of the memory
size. Both simulation and experimental studies using synthesized
workloads, and analysis of real-world Gateway servers, demonstrate
the effectiveness of the proposed system.
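The simplest queue model of the kind surveyed is the steady-state M/M/1 queue, whose mean queue length and waiting time follow from the arrival and service rates and are tied together by Little's law; a minimal sketch (one candidate model, not the paper's final choice):

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 estimates: utilisation rho, mean number in
    system L, and mean sojourn time W, related by Little's law L = λW.
    Rates are in the same units (e.g. requests per second)."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = arrival_rate / service_rate
    L = rho / (1.0 - rho)                      # mean number in system
    W = 1.0 / (service_rate - arrival_rate)    # mean time in system
    return rho, L, W
```

An L estimate of this kind, multiplied by the per-request buffer footprint, gives the kind of memory-size bound the abstract refers to; bursty traffic inflates L well beyond the M/M/1 prediction, which motivates the variance-reduction methodology.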
Abstract: Most file systems overwrite modified file data and
metadata in their original locations, while the Log-structured File
System (LFS) dynamically relocates them to other locations. We
design and implement the Evergreen file system, which can select
between overwriting and relocation for each block of a file or its
metadata. The Evergreen file system can therefore achieve superior
write performance by sequentializing write requests (similar to
LFS-style relocation) when space utilization is low, and by
overwriting when utilization is high. Another challenging issue is
identifying the performance benefits of LFS-style relocation over
overwriting on the newly introduced SSD (Solid State Drive), which
has only Flash-memory chips and control circuits without mechanical
parts. Our experimental results measured on an SSD show that
relocation outperforms overwriting when space utilization is below
80%, and vice versa.
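The measured crossover suggests a simple per-block policy; the sketch below only encodes the abstract's 80% finding as a threshold rule and is not Evergreen's actual selection logic:

```python
def choose_write_strategy(space_utilisation, threshold=0.8):
    """Per-block write policy implied by the measurements: LFS-style
    relocation (sequentialised writes) while free space is plentiful,
    in-place overwriting once utilisation passes ~80% on the tested
    SSD. The threshold is device-dependent."""
    return "relocate" if space_utilisation < threshold else "overwrite"
```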
Abstract: Accurate timing alignment and stability are important
to maximize the true counts and minimize the random counts in
positron emission tomography. The signals output from the detectors
must therefore be centred using the two isotopes before operation and
fed into four pulse-processing units, each of which can accept up to
eight inputs. The dual-source computed tomography system consists of
two units on the left for the 15 detector signals of the Cs-137
isotope and two units on the right for the 15 detector signals of the
Co-60 isotope. The gamma spectrum consists of either a single
photopeak or multiple photopeaks. This allows the use of
energy-discrimination electronic hardware associated with the data
acquisition system to acquire photon-count data at a specific energy,
even if detectors with poor energy resolution are used. This also
helps to avoid counting Compton-scattered photons, especially if a
single discrete gamma photopeak is emitted by the source, as in the
case of Cs-137. In this study, the polyenergetic version of the
alternating minimization algorithm is applied to the dual-energy
gamma computed tomography problem.
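The energy-discrimination step amounts to keeping only events inside a window around the photopeak; a minimal sketch (the window half-width is illustrative, while 662 keV is the well-known Cs-137 photopeak):

```python
def window_counts(events_kev, center_kev, half_width_kev):
    """Count photons whose measured energy falls inside a
    discrimination window around a photopeak, e.g. the 662 keV peak of
    Cs-137. Events outside the window, including Compton-scattered
    photons of lower energy, are rejected."""
    lo, hi = center_kev - half_width_kev, center_kev + half_width_kev
    return sum(1 for e in events_kev if lo <= e <= hi)
```

In hardware this is done with single-channel analysers before the data acquisition system, which is why even poor-resolution detectors can supply energy-specific count data.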
Abstract: A robust AUSM+ upwind discretisation scheme has been developed to simulate multiphase flow using consistent spatial discretisation schemes and a modified low-Mach number diffusion term. The impact of the selection of an interfacial pressure model has also been investigated. Three representative test cases have been simulated to evaluate the accuracy of the commonly used stiffened-gas equation of state with respect to the IAPWS-IF97 equation of state for water. The algorithm demonstrates a combination of robustness and accuracy over a range of flow conditions, with the stiffened-gas equation tending to overestimate liquid temperature and density profiles.
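The stiffened-gas equation of state being evaluated has the standard closed form p = (γ − 1)ρe − γp∞; a small sketch (the water-like parameter values in the usage comment are illustrative, not the paper's calibration):

```python
def stiffened_gas_pressure(rho, e, gamma, p_inf):
    """Pressure from the stiffened-gas equation of state
    p = (gamma - 1) * rho * e - gamma * p_inf, where rho is density,
    e is specific internal energy, and p_inf models the liquid's
    cohesive stiffness. Setting p_inf = 0 recovers the ideal gas law."""
    return (gamma - 1.0) * rho * e - gamma * p_inf

# Illustrative water-like parameters, e.g. gamma ~ 4.4 and
# p_inf ~ 6e8 Pa, make the liquid nearly incompressible; the constant
# p_inf offset is one source of the temperature/density bias the
# abstract reports relative to IAPWS-IF97.
```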
Abstract: Reinforced concrete has good durability and excellent structural performance. But there are cases of early deterioration due to a number of factors, one prominent factor being corrosion of the steel reinforcement. The process of corrosion sets in due to the ingress of moisture, oxygen and other ingredients into the body of concrete which is unsound, permeable and absorbent. Cracks due to structural and other causes, such as creep, shrinkage, etc., also allow ingress of moisture and other harmful ingredients and thus accelerate the rate of corrosion. There are several interactive factors, both external and internal, which lead to corrosion of reinforcement and ultimately failure of structures. Suitable addition of a mineral admixture like silica fume (SF) in concrete improves the strength and durability of concrete due to considerable improvement in the microstructure of concrete composites, especially at the transition zone. Secondary reinforcement in the form of fibre is added to concrete, which provides three-dimensional random reinforcement in the entire mass of concrete. Reinforced concrete beams of size 0.1 m × 0.15 m and length 1 m have been cast using M 35 grade concrete. After curing, the beams were subjected to a corrosion process by impressing an external direct current (galvanostatic method) for a period of 15 days under stressed and unstressed conditions. The corroded beams were tested by applying two-point loads to determine the ultimate load-carrying capacity and cracking pattern, and the results were compared with those of the companion specimens. The gravimetric method is used to quantify the corrosion that has occurred.
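For impressed-current tests of this kind, the theoretical mass loss against which gravimetric measurements are typically compared follows Faraday's law; a sketch with standard constants for iron (the 0.1 A current in the usage example is a hypothetical value, not the study's):

```python
def faraday_mass_loss(current_amps, time_seconds,
                      molar_mass=55.85, valency=2, faraday=96485.0):
    """Theoretical steel mass loss in grams from an impressed direct
    current, via Faraday's law m = M*I*t/(z*F). Defaults are for iron:
    M = 55.85 g/mol, z = 2, F = 96485 C/mol."""
    return molar_mass * current_amps * time_seconds / (valency * faraday)

# e.g. a hypothetical 0.1 A impressed for the study's 15-day period:
# faraday_mass_loss(0.1, 15 * 24 * 3600) -> roughly 37.5 g of iron.
```

The ratio of the gravimetrically measured loss to this theoretical value indicates the current efficiency of the accelerated corrosion process.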
Abstract: The replacement of conventional materials like glass,
wood or metals with polymer materials is still continuing; simpler,
and therefore cheaper, production is the main reason. However, due to
high energy and petrochemical prices, polymer prices are increasing
too. That is why various kinds of fillers are used to make polymers
cheaper, with the target of maintaining or improving the properties
of these compounds. This paper addresses rheology issues of polymers
compounded with fibres of vegetal origin.
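Filled polymer melts are commonly characterised with the power-law (Ostwald-de Waele) viscosity model; a sketch with illustrative constants (K and n are fit parameters that depend on the polymer and filler content, not values from the paper):

```python
def power_law_viscosity(shear_rate, k, n):
    """Apparent viscosity of a power-law (Ostwald-de Waele) fluid,
    eta = K * shear_rate**(n - 1), with consistency K (Pa*s^n) and
    flow index n. Filled polymer melts are typically shear-thinning,
    i.e. n < 1, so viscosity falls as shear rate rises."""
    return k * shear_rate ** (n - 1.0)
```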