Abstract: We propose a decoy-pulse protocol for a frequency-coded implementation of the B92 quantum key distribution protocol. A direct extension of the decoy-pulse method to the frequency-coding scheme results in a loss of security, as an eavesdropper can distinguish between signal and decoy pulses by measuring the carrier photon number without affecting other statistics. We overcome this problem by optimizing the ratio of the carrier photon numbers of the decoy and signal pulses to be as close to unity as possible. In our method, switching between signal and decoy pulses is achieved by changing the amplitude of the RF signal rather than by modulating the intensity of the optical signal, thus reducing system cost. We find an improvement of approximately a factor of 100 in the key generation rate using the decoy-state protocol. We also study the effect of source fluctuations on the key rate. Our simulation results show a key generation rate of 1.5×10⁻⁴/pulse for link lengths up to 70 km. Finally, we discuss the optimum value of the average photon number of the signal pulse for a given key rate while also optimizing the carrier ratio.
Abstract: The Chinese Postman Problem (CPP) is one of the classical problems in graph theory and is applicable in a wide range of fields. With the rapid development of hybrid systems and model-based testing, the Chinese Postman Problem with Time-Dependent Travel Times (CPPTDT) has become more realistic than the classical problem. In previous work, we proposed the first integer programming formulation for the CPPTDT, namely the circuit formulation, based on which some polyhedral results were investigated and a cutting plane algorithm was designed. However, it has a main drawback: the circuit formulation applies only to the special instances in which all circuits pass through the origin. Therefore, this paper proposes a new integer programming formulation for solving all general instances of the CPPTDT. Moreover, the size of the circuit formulation, which is too large, is reduced dramatically here. This makes it possible to design more efficient algorithms for solving the CPPTDT in future research.
Abstract: Since computed tomography (CT) normally requires hundreds of projections to reconstruct an image, patients are exposed to more X-ray energy, which may cause side effects such as cancer. Even when the variability of the particles in the object is very low, computed tomography requires many projections for good-quality reconstruction. In this paper, the low variability of the particles in an object is exploited to obtain a good-quality reconstruction. Although the reconstructed image and the original image have the same projections, in general they need not be identical. If, in addition to the projections, a priori information about the image is known, it is possible to obtain a good-quality reconstructed image. This paper shows by experimental results why conventional algorithms fail to reconstruct from a few projections, and gives an efficient polynomial-time algorithm to reconstruct a bi-level image from its row and column projections, a known sub-image of the unknown image, and smoothness constraints, by reducing the reconstruction problem to an integral max-flow problem. The paper also discusses the necessary and sufficient conditions for uniqueness, and the extension of 2D bi-level image reconstruction to 3D bi-level image reconstruction.
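The reduction named in the abstract above, recovering a binary matrix from its row and column sums via an integral max flow, can be sketched for the basic case (no smoothness constraints or known sub-image). The function name and the Edmonds-Karp routine below are illustrative choices, not the authors' implementation:

```python
from collections import deque

def reconstruct_bilevel(row_sums, col_sums):
    """Reconstruct a binary matrix with the given row/column projections
    by solving an integral max-flow problem (Edmonds-Karp sketch)."""
    m, n = len(row_sums), len(col_sums)
    if sum(row_sums) != sum(col_sums):
        return None                            # projections are inconsistent
    # Node numbering: 0 = source, 1..m = rows, m+1..m+n = cols, m+n+1 = sink.
    S, T = 0, m + n + 1
    cap = [[0] * (m + n + 2) for _ in range(m + n + 2)]
    for i in range(m):
        cap[S][1 + i] = row_sums[i]
        for j in range(n):
            cap[1 + i][m + 1 + j] = 1          # each pixel holds 0 or 1
    for j in range(n):
        cap[m + 1 + j][T] = col_sums[j]
    flow = 0
    while True:
        # BFS for an augmenting path in the residual network.
        parent = [-1] * (m + n + 2)
        parent[S] = S
        q = deque([S])
        while q and parent[T] == -1:
            u = q.popleft()
            for v in range(m + n + 2):
                if parent[v] == -1 and cap[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[T] == -1:
            break
        # All capacities are integral, so the bottleneck is at least 1.
        v, bottleneck = T, float('inf')
        while v != S:
            bottleneck = min(bottleneck, cap[parent[v]][v])
            v = parent[v]
        v = T
        while v != S:
            cap[parent[v]][v] -= bottleneck
            cap[v][parent[v]] += bottleneck
            v = parent[v]
        flow += bottleneck
    if flow != sum(row_sums):
        return None                            # no matrix fits the projections
    # A saturated unit edge row->col (residual capacity 0) marks a 1-pixel.
    return [[1 - cap[1 + i][m + 1 + j] for j in range(n)] for i in range(m)]
```

Each pixel corresponds to a unit-capacity edge from its row node to its column node; an integral maximum flow that saturates every source edge yields a valid binary matrix, and failure to saturate proves the projections are infeasible.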
Abstract: Wireless Mesh Networking is a promising proposal for broadband data transmission over a large area with low cost and acceptable QoS. The trade-offs among these features in WMNs are a hot research field nowadays. In this paper, a mathematical optimization framework is developed to maximize throughput subject to upper-bound delay constraints. The IEEE 802.11-based infrastructure backhauling mode of WMNs is considered in formulating the MINLP optimization problem. The proposed method gives the full routing and scheduling procedure in the WMN needed to attain these goals.
Abstract: Tomato powder has good potential as a substitute for tomato paste and other tomato products. In order to preserve the physicochemical properties and nutritional quality of tomato during the dehydration process, an investigation was carried out using different drying methods and pretreatments. A solar drier and a continuous conveyor (tunnel) drier were used for dehydration, whereas calcium chloride (CaCl2), potassium metabisulphite (KMS), calcium chloride plus potassium metabisulphite (CaCl2 + KMS), and sodium chloride (NaCl) were selected for pretreatment. Lycopene content, dehydration ratio, rehydration ratio and non-enzymatic browning (NEB), in addition to moisture, sugar and titratable acidity, were studied. Results show that pretreatment with CaCl2 and NaCl increased water removal and moisture mobility in tomato slices during drying. Where CaCl2 was used along with KMS, the lowest NEB was recorded compared to the other treatments, and the best results were obtained when the two chemicals were used in combination. Storage studies in LDPE polymeric and metallized polyester films showed fewer changes in the products packed in metallized polyester pouches; even after 6 months the lycopene content did not decrease by more than 20% compared to the control sample, giving an acceptable shelf life of 6 months. In most of the quality characteristics, tunnel-drier samples presented better values than solar-drier samples.
Abstract: The numerical analytic continuation of a function f(z) = f(x + iy) on a strip is discussed in this paper. The data are given, only approximately, on the real axis, and are assumed to be periodic. A truncated Fourier spectral method is introduced to deal with the ill-posedness of the problem. The theoretical results show that the discrepancy principle works well for this problem. Some numerical results are also given to show the efficiency of the method.
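A minimal sketch of a truncated Fourier spectral continuation of periodic data from the real axis into the strip: each Fourier mode c_k e^{ikx} continues to c_k e^{-ky} e^{ikx}, and discarding modes with |k| > K tames the exponential amplification that makes the problem ill-posed. The truncation level K plays the role of the regularization parameter that the discrepancy principle would select; the function name and NumPy implementation are illustrative, not the paper's code:

```python
import numpy as np

def continue_analytic(f_real, y, K):
    """Continue 2*pi-periodic samples f(x) on the real axis to f(x + iy)
    via a truncated Fourier spectral method (keep only modes |k| <= K)."""
    N = len(f_real)
    c = np.fft.fft(f_real) / N                 # Fourier coefficients c_k
    k = np.fft.fftfreq(N, d=1.0 / N)           # integer wavenumbers
    c = np.where(np.abs(k) <= K, c, 0)         # truncation = regularization
    # e^{ik(x+iy)} = e^{-ky} e^{ikx}: damp/amplify each retained mode.
    return np.fft.ifft(c * np.exp(-k * y)) * N
```

For band-limited data such as cos(x), the method reproduces the exact continuation cos(x + iy); for noisy data, the modes above K that would blow up as e^{|k|y} are simply discarded.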
Abstract: Dhaka, the capital city of Bangladesh, is one of the most densely populated cities in the world. Due to rapid urbanization, 60% of its population lives in slum and squatter settlements. The reasons behind this poverty are low economic growth, inequitable distribution of income, unequal distribution of productive assets, unemployment and underemployment, a high rate of population growth, a low level of human resource development, natural disasters, and limited access to public services. Along with poverty, the pressure on urban land, shelter, plots and open spaces creates environmental and ecological degradation. These constraints mostly result from failures of government policies and measures, and only the Government can solve this problem. It is now high time to establish planning and environmental management policies and sustainable urban development for the city and for the urban slum dwellers, free from eviction, criminals, rent seekers and other miscreants.
Abstract: The environment, both natural and built, is essential for tourism. However, tourism and the environment maintain a complex relationship, in which the environment is in most cases at the receiving end. Many tourism development activities have adverse environmental effects, mainly emanating from the construction of general infrastructure and tourism facilities. These negative impacts of tourism can lead to the destruction of the precious natural resources on which it depends. These effects vary between locations, and their effect on a hill destination is especially critical. This study aims at developing a
Sustainable Tourism Planning Model for an environmentally
sensitive tourism destination in Kerala, India. Being part of the
Nilgiri mountain ranges, Munnar falls in the Western Ghats, one of
the biodiversity hotspots of the world. Endowed with a unique high
altitude environment Munnar inherits highly significant ecological
wealth. Giving prime importance to the protection of this ecological
heritage, the study proposes a tourism planning model with resource
conservation and sustainability as the paramount focus. Conceiving a
novel approach towards sustainable tourism planning, the study
proposes to assess tourism attractions using Ecological Sensitivity
Index (ESI) and Tourism Attractiveness Index (TAI). The integration of these two indices forms the Ecology-Tourism Matrix (ETM), outlining the base for tourism planning in an environmentally sensitive destination. The ETM leads to a classification of tourism nodes according to their Conservation Significance and Tourism Significance. The spatial integration of such nodes based on the Hub & Spoke Principle constitutes sub-regions within the STZ. Ensuing analyses lead to specific guidelines for the STZ as a whole, for specific tourism nodes, hubs and sub-regions. The study results in a multi-dimensional output, viz., (1) a classification system for tourism nodes in an environmentally sensitive region/destination, (2) Conservation/Tourism Development Strategies and Guidelines for the micro and macro regions, and (3) a Sustainable Tourism Planning Tool particularly for Ecologically Sensitive Destinations, which can be adapted for other destinations as well.
Abstract: In this paper, an effective sliding mode design is
applied to chaos synchronization. The proposed controller can make
the states of two identical modified Chua's circuits globally
asymptotically synchronized. Numerical results are provided to show
the effectiveness and robustness of the proposed method.
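As a simplified illustration (not the authors' specific controller for the modified circuit), a full-state sliding-mode law with the switching term -k·sign(e) synchronizes two identical Chua circuits. The parameter values below are the standard dimensionless double-scroll set and are assumptions for the sketch:

```python
import numpy as np

# Dimensionless Chua circuit with standard double-scroll parameters
# (illustrative values; the paper's modified circuit may differ).
ALPHA, BETA, M0, M1 = 9.0, 14.286, -8.0 / 7.0, -5.0 / 7.0

def chua(v):
    x, y, z = v
    fx = M1 * x + 0.5 * (M0 - M1) * (abs(x + 1) - abs(x - 1))
    return np.array([ALPHA * (y - x - fx), x - y + z, -BETA * y])

def synchronize(v_master, v_slave, k=5.0, dt=1e-3, steps=20000):
    """Drive the slave circuit onto the master with the sliding-mode law
    u = f(master) - f(slave) - k*sign(e): each error component then obeys
    de/dt = -k*sign(e) and reaches the sliding surface e = 0 in finite time."""
    vm = np.array(v_master, dtype=float)
    vs = np.array(v_slave, dtype=float)
    for _ in range(steps):
        e = vs - vm                                # synchronization error
        u = chua(vm) - chua(vs) - k * np.sign(e)   # switching control
        vm = vm + dt * chua(vm)                    # explicit Euler step
        vs = vs + dt * (chua(vs) + u)
    return vm, vs
```

After the reaching phase, the discretized sign term makes the error chatter within a band of width about k·dt around zero, the usual artifact that boundary-layer or equivalent-control refinements are designed to remove.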
Abstract: This paper presents a genetic algorithm based
approach for solving security constrained optimal power flow
problem (SCOPF) including FACTS devices. The optimal locations of the FACTS devices are identified using an index called the overload index, and their optimal settings are obtained using an enhanced genetic algorithm. The optimal allocation by the proposed method optimizes the investment, taking into account its effect on security in terms of the alleviation of line overloads. The proposed approach has been tested on the IEEE 30-bus system to show the effectiveness of the proposed algorithm for solving the SCOPF problem.
Abstract: This paper focuses on analyzing medical diagnostic data using classification rules from data mining and context reduction from formal concept analysis. It helps in finding redundancies among the various medical examination tests used in the diagnosis of a disease. Classification rules are derived from positive and negative association rules using the concept lattice structure of Formal Concept Analysis. The context reduction technique of Formal Concept Analysis, together with the classification rules, is used to find redundancies among the various medical examination tests. It also determines whether expensive medical tests can be replaced by cheaper ones.
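One elementary step of the context reduction mentioned above, clarifying the formal context by merging attributes with identical columns, can be sketched as follows; patients are the objects, tests are the attributes, and the test names in the example are hypothetical:

```python
def redundant_test_groups(context, test_names):
    """Group attributes (medical tests) whose columns in the binary formal
    context coincide: tests in one group give identical outcomes for every
    patient, so all but one can be dropped (context clarification in FCA)."""
    groups = {}
    for j, name in enumerate(test_names):
        column = tuple(row[j] for row in context)  # the test's 0/1 outcome pattern
        groups.setdefault(column, []).append(name)
    return [g for g in groups.values() if len(g) > 1]
```

If an expensive test's column coincides with a cheaper test's column over the recorded cases, the expensive test becomes a candidate for replacement, which is the redundancy question the abstract poses.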
Abstract: A new SUZ-4 zeolite membrane with tetraethylammonium hydroxide as the template was fabricated on a mullite tube via hydrothermal sol-gel synthesis in a rotating autoclave reactor. The suitable synthesis condition was a SiO2:Al2O3 ratio of 21.2 with crystallization for 4 days at 155 °C under autogenous pressure. The obtained SUZ-4 possessed a high BET surface area of 396.4 m²/g, a total pore volume of 2.611 cm³/g, and a narrow pore size distribution; the needle-shaped crystals had a mean diameter of 97 nm and a length of 760 nm. The SUZ-4 layer obtained from seeded crystallization was thicker than that obtained without seeds, i.e. by in situ crystallization.
Abstract: Modeling product configurations requires large amounts of knowledge about technical and marketing restrictions on the product. Previous attempts to automate product configuration concentrated on the representation and management of this knowledge for specific domains in fixed and isolated computing environments. Since the knowledge about product configurations is subject to continuous change and hard to express, these attempts often failed to efficiently manage and exchange the knowledge in collaborative product development. In this paper, XML Topic Map (XTM) is introduced to represent and exchange the knowledge about product configurations in collaborative product development. A product configuration model based on XTM, along with its merging and inference facilities, enables configuration engineers in collaborative product development to manage and exchange their knowledge efficiently. A prototype implementation is also presented to demonstrate that the proposed model can be applied in engineering information systems to exchange product configuration knowledge.
Abstract: Discrete particle swarm optimization (DPSO) is a powerful stochastic evolutionary algorithm used to solve large-scale, discrete and nonlinear optimization problems. However, it has been observed that the standard DPSO algorithm suffers from premature convergence when solving a complex optimization problem such as transmission expansion planning (TEP). To resolve this problem, an advanced discrete particle swarm optimization (ADPSO) is proposed in this paper. The simulation results show that, from the viewpoint of precision, the optimization of line loading in transmission expansion planning with ADPSO is better than with DPSO.
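For reference, the standard binary DPSO that the abstract takes as its baseline can be sketched in the Kennedy-Eberhart form: velocities are real-valued, and a sigmoid maps each velocity to a bit-flip probability. It is shown here on a toy one-max objective rather than TEP; all names and parameter values are illustrative:

```python
import math
import random

def dpso(fitness, n_bits, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal binary (discrete) PSO maximizing `fitness` over bit strings."""
    random.seed(1)                                  # reproducible toy run
    X = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(n_particles)]
    V = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [x[:] for x in X]                       # personal best positions
    pfit = [fitness(x) for x in X]
    g = max(range(n_particles), key=lambda i: pfit[i])
    gbest, gfit = pbest[g][:], pfit[g]              # global best so far
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_bits):
                V[i][d] = (w * V[i][d]
                           + c1 * random.random() * (pbest[i][d] - X[i][d])
                           + c2 * random.random() * (gbest[d] - X[i][d]))
                # Sigmoid of the velocity gives the probability of a 1-bit.
                X[i][d] = 1 if random.random() < 1 / (1 + math.exp(-V[i][d])) else 0
            f = fitness(X[i])
            if f > pfit[i]:
                pbest[i], pfit[i] = X[i][:], f
                if f > gfit:
                    gbest, gfit = X[i][:], f
    return gbest, gfit
```

The premature convergence the abstract refers to arises when all particles collapse onto `gbest` early; the proposed ADPSO modifies this baseline update, in a way not detailed in the abstract, to counter that.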
Abstract: Nowadays there are more than thirty maturity models in different knowledge areas. A maturity model is an instrument that helps organizations find out where they stand in a specific knowledge area and how to improve it. Since Information Resource Management (IRM) is the concept that information is a major corporate resource and must be managed using the same basic principles used to manage other assets, assessing the current IRM status and revealing the points for improvement can play a critical role in developing an appropriate information structure in organizations. In this paper we propose a framework for an information resource management maturity model (IRM3) that includes ten best practices for the maturity assessment of organizations' IRM.
Abstract: This paper is concerned with the application of a vision control algorithm to a robot's point placement task along a discontinuous trajectory caused by obstacles. The presented vision control algorithm consists of four models: the robot kinematic model, the vision system model, the parameter estimation model, and the robot joint angle estimation model. When the robot moves toward a target along a discontinuous trajectory, several types of obstacles appear in two obstacle regions. This study investigates how these changes affect the presented vision control algorithm. The practicality of the vision control algorithm is then demonstrated experimentally by performing the robot's point placement task along a trajectory made discontinuous by an obstacle.
Abstract: This is a study on the numerical simulation of the convection-diffusion transport of a chemical species in steady flow through a small-diameter tube lined with a very thin layer of retentive and absorptive material. The species may be subject to a first-order kinetic reversible phase exchange with the wall material and irreversible absorption into the tube wall. Owing to the velocity shear across the tube section, the chemical species may spread out axially along the tube at a rate much larger than that given by molecular diffusion; this process is known as dispersion. While the long-time dispersion behavior, well described by the Taylor model, has been extensively studied in the literature, the early development of the dispersion process is by contrast much less investigated. By early development, we mean a span of time, after the release of the chemical into the flow, that is shorter than or comparable to the diffusion time scale across the tube section. To understand the early development of the dispersion, the governing equations along with the reactive boundary conditions are solved numerically using the Flux Corrected Transport Algorithm (FCTA). The computation has enabled us to investigate the combined effects of the reversible and irreversible wall reactions on the early development of the dispersion coefficient. One of the results shows that the dispersion coefficient may approach its steady-state limit in a short time under the following conditions: (i) a high value of the Damköhler number (say Da ≥ 10); (ii) a small but non-zero value of the absorption rate (say Γ* ≤ 0.5).
Abstract: This paper presents an algorithm that extends the rapidly-exploring random tree (RRT) framework to deal with changes in the task environment. The algorithm, called the Retrieval RRT Strategy (RRS), combines a support vector machine (SVM) with RRT and plans the robot motion in the presence of changes in the surrounding environment. The algorithm consists of two levels. At the first level, the SVM is built and selects a proper path from the bank of RRTs for a given environment. At the second level, a real path is planned by the RRT planners for the given environment. The suggested method is applied to the control of a KUKA™ commercial 6-DOF robot manipulator, and its feasibility and efficiency are demonstrated via co-simulation in MATLAB™ and RecurDyn™.
Abstract: Li1.5Al0.5Ti1.5(PO4)3 (LATP) has received much attention as a solid electrolyte for lithium batteries. In this study, the LATP solid electrolyte is prepared by the co-precipitation method using Li3PO4 as the Li source. The LATP is successfully prepared, and the Li-ion conductivities of the bulk (inner crystal) and the total (inner crystal and grain boundary) are 1.1 × 10⁻³ and 1.1 × 10⁻⁴ S cm⁻¹, respectively. These values are comparable to the reported values for which Li2C2O4 is used as the Li source. It is concluded that the LATP solid electrolyte can be prepared by the co-precipitation method using Li3PO4 as the Li source, and this procedure has an advantage in mass production over the previous procedure using Li2C2O4 because Li3PO4 is a cheaper reagent than Li2C2O4.
Abstract: Computed tomography and laminography are heavily investigated in a compressive-sensing-based image reconstruction framework to reduce the dose to patients as well as to radiosensitive devices such as multilayer microelectronic circuit boards. Researchers are actively working on optimizing the compressive-sensing-based iterative image reconstruction algorithm to obtain better-quality images. However, the effects of the sampled data's properties on the reconstructed image's quality, particularly under insufficiently sampled data conditions, have not been explored in computed laminography. In this paper, we investigate the effects of two data properties, i.e., sampling density and data incoherence, on the image reconstructed by conventional computed laminography and by a recently proposed method called the spherical sinusoidal scanning scheme. We find that in a compressive-sensing-based image reconstruction framework, the image quality mainly depends upon the data incoherence when the data are uniformly sampled.