Abstract: An effective visual error concealment method is presented that employs a robust rotation, scale, and translation (RST) invariant partial patch matching model (RSTI-PPMM) and exemplar-based inpainting. While the proposed robust and inherently feature-enhanced texture synthesis approach ensures the generation of excellent, perceptually plausible visual error concealment results, the outlier pruning property guarantees significant quality improvements, both quantitatively and qualitatively. No intermediate user interaction is required for pre-segmented media, and the presented method follows a bootstrapping approach for automatic visual loss recovery and image and video error concealment.
Abstract: In the context of spectrum surveillance, a new method to recover the code of a spread spectrum signal is presented for the case where the receiver has no knowledge of the transmitter's spreading sequence. In our previous paper, we used a genetic algorithm (GA) to recover the spreading code. Although genetic algorithms are well known for their robustness in solving complex optimization problems, increasing the length of the code often leads to unacceptably slow convergence. To solve this problem, we introduce Particle Swarm Optimization (PSO) into code estimation for spread spectrum communication systems. In the search process for code estimation, the PSO algorithm has the merits of rapid convergence to the global optimum without being trapped in local suboptima, and of good robustness to noise. In this paper we describe how to implement PSO as a component of a search algorithm for code estimation. Swarm intelligence offers a number of advantages due to the use of mobile agents, among them scalability, fault tolerance, adaptation, speed, modularity, autonomy, and parallelism. These properties make swarm intelligence very attractive for spread spectrum code estimation, and also suitable for a variety of other kinds of channels. Our results compare swarm-based algorithms with genetic algorithms, and also show the performance of the PSO algorithm in the code estimation process.
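As a rough illustration of the kind of search loop described above, the following is a minimal binary-PSO sketch in which a hidden reference code and a simple agreement count stand in for the actual fitness derived from the received signal; all names and parameter values here are illustrative assumptions, not the paper's implementation.

```python
import math
import random

random.seed(0)

# Hypothetical stand-in for the real fitness: agreement between a candidate
# spreading code and a hidden reference code. In practice the fitness would
# be computed from correlation with the received spread spectrum signal.
TRUE_CODE = [random.randint(0, 1) for _ in range(31)]

def fitness(code):
    return sum(c == t for c, t in zip(code, TRUE_CODE))

def binary_pso(n_particles=20, n_bits=31, iters=200, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(n_particles)]
    vel = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal best positions
    pbest_f = [fitness(p) for p in pos]
    g = pbest_f.index(max(pbest_f))
    gbest, gbest_f = pbest[g][:], pbest_f[g]    # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_bits):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Sigmoid transfer: velocity -> probability that the bit is 1
                pos[i][d] = 1 if random.random() < 1 / (1 + math.exp(-vel[i][d])) else 0
            f = fitness(pos[i])
            if f > pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f > gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

best, best_f = binary_pso()
```

The sigmoid transfer function is the standard way to apply PSO, which is continuous by nature, to a binary code-search space.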
Abstract: The zero-inflated Strict Arcsine model is a newly developed model that has been found appropriate for modeling overdispersed count data. In this study, the maximum likelihood estimation method is used to estimate the parameters of the zero-inflated Strict Arcsine model. Bootstrapping is then employed to compute confidence intervals for the estimated parameters.
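In outline, the bootstrap step works like the following sketch. A Poisson rate estimated by its sample-mean MLE stands in for the zero-inflated Strict Arcsine likelihood (whose maximization is not reproduced here); the data and all names are illustrative assumptions.

```python
import random
import statistics

random.seed(1)

# Hypothetical stand-in data and estimator: for a Poisson model the MLE of
# the rate is simply the sample mean. The zero-inflated Strict Arcsine MLE
# would replace `estimate` with a numerical likelihood maximization.
data = [random.randint(0, 5) for _ in range(200)]

def estimate(sample):
    return statistics.mean(sample)

def bootstrap_ci(data, estimator, n_boot=2000, alpha=0.05):
    """Percentile bootstrap confidence interval for an estimator."""
    n = len(data)
    stats = sorted(
        estimator([random.choice(data) for _ in range(n)])  # resample with replacement
        for _ in range(n_boot)
    )
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

lo, hi = bootstrap_ci(data, estimate)
```

The same `bootstrap_ci` routine applies unchanged once `estimate` is replaced by the actual zero-inflated Strict Arcsine likelihood maximizer.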
Abstract: Low-density parity-check (LDPC) codes have been shown to deliver capacity-approaching performance; however, problematic graphical structures (e.g. trapping sets) in the Tanner graph of some LDPC codes can cause high error floors in bit-error-ratio (BER) performance under the conventional sum-product algorithm (SPA). This paper presents a serial concatenation scheme to avoid the trapping sets and to lower the error floors of LDPC codes. The outer code in the proposed concatenation is the LDPC code, and the inner code is a high-rate array code. This approach applies an iterative hybrid process between BCJR decoding for the array code and the SPA for the LDPC code, together with bit-pinning and bit-flipping techniques. The Margulis code of size (2640, 1320) has been used for the simulation, and it has been shown that the proposed concatenation and decoding scheme can considerably improve the error floor performance with minimal rate loss.
Abstract: Petri nets are among the most useful graphical tools for modelling complex asynchronous systems, and we have used a Petri net to model a multi-track railway level crossing system. The roadway has been augmented with four half-size barriers. For better control, a three-stage control mechanism has been introduced to ensure that no road vehicle is trapped on the level crossing. A timed Petri net is used to capture the temporal nature of the signalling system. A safeness analysis has also been included in the discussion section.
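The basic Petri-net semantics underlying such a model (a transition fires only when every input place holds enough tokens, consuming them and producing tokens in its output places) can be sketched as a token game; the places and transitions below are hypothetical illustrations, not the paper's actual level-crossing model.

```python
# Minimal Petri-net token game. Place and transition names are illustrative
# stand-ins, not the paper's multi-track level-crossing net.
places = {"train_approaching": 1, "barriers_up": 1,
          "barriers_down": 0, "train_on_crossing": 0}

# transition name -> (tokens consumed from input places,
#                     tokens produced in output places)
transitions = {
    "lower_barriers": ({"train_approaching": 1, "barriers_up": 1},
                       {"barriers_down": 1}),
    "enter_crossing": ({"barriers_down": 1},
                       {"train_on_crossing": 1}),
}

def enabled(t):
    """A transition is enabled when all its input places hold enough tokens."""
    ins, _ = transitions[t]
    return all(places[p] >= n for p, n in ins.items())

def fire(t):
    """Fire an enabled transition: consume input tokens, produce output tokens."""
    assert enabled(t), f"transition {t} not enabled"
    ins, outs = transitions[t]
    for p, n in ins.items():
        places[p] -= n
    for p, n in outs.items():
        places[p] = places.get(p, 0) + n

fire("lower_barriers")
fire("enter_crossing")
```

A timed Petri net extends this scheme by attaching firing delays to transitions, which is how the temporal behaviour of the signalling system is captured.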
Abstract: Technology transfer of renewable energy technologies (RETs) is very often unsuccessful in the developing world. Aside from challenges that have social, economic, financial, institutional and environmental dimensions, technology transfer has generally been misunderstood, and is largely seen as the mere delivery of high-tech equipment from developed to developing countries, or within the developing world from R&D institutions to society. Technology transfer entails much more, including, but not limited to: entire systems and their component parts, know-how, goods and services, equipment, and organisational and managerial procedures. Means to facilitate the successful transfer of energy technologies, including the sharing of lessons, are therefore extremely important for developing countries as they grapple with increasing energy needs to sustain adequate economic growth and development. Improving the success of technology transfer is an ongoing process as more projects are implemented, new problems are encountered and new lessons are learnt. Renewable energy is also critical to improving the quality of life of the majority of people in developing countries. In rural areas, energy comes primarily from traditional biomass, and its consumption typically occurs in an inefficient manner, working against the notion of sustainable development. This paper explores the implementation of technology transfer in the developing world (sub-Saharan Africa). The focus is necessarily on RETs, since most rural energy initiatives are RETs-based. Additionally, it aims to highlight some lessons drawn from the cited renewable energy projects and identifies notable differences where energy technology transfer was judged to be successful. This is done through a literature review based on a selection of documented case studies, which are judged against the definition provided for technology transfer.
This paper also puts forth research recommendations that might contribute to improved technology transfer in the developing world. Key findings of this paper include: technology transfer cannot be complete without satisfying pre-conditions such as affordability, maintenance (and associated plans), knowledge and skills transfer, appropriate know-how, ownership and commitment, the ability to adapt technology, sound business principles such as financial viability and sustainability, project management, and relevance, among many others. It is also shown that lessons are learnt in both successful and unsuccessful projects.
Abstract: Sol-gel immobilization of enzymes, which can considerably improve their properties, is now one of the most widely used techniques. By deposition of the entrapped lipase on a solid support, a new and improved biocatalyst was obtained, which can be used with excellent results in acylation reactions. In this paper, lipase B from Candida antarctica was doubly immobilized on different adsorbents. These biocatalysts were employed in the kinetic resolution of several aliphatic secondary alcohols in organic medium. High total recovery yields of enzymatic activity, up to 560%, were obtained. For all the studied alcohols, the enantiomeric ratios E were over 200. The influence of the reaction medium was studied for the kinetic resolution of 2-pentanol.
Abstract: A master plan is a tool to guide and manage the growth of cities in a planned manner, and its soul lies in its implementation framework. If it is not implemented, people are trapped in a mess of urban problems and laissez-faire development with serious long-term repercussions. Unfortunately, the master plans prepared for several major cities of Pakistan could not be fully implemented due to a host of reasons, and Lahore is no exception. Being the second largest city of Pakistan with a population of over 7 million people, Lahore holds the distinction that the first ever master plan in the country was prepared for this city, in 1966. Recently, in 2004, a new plan titled 'Integrated Master Plan for Lahore-2021' was approved for implementation. This paper provides a comprehensive account of the weaknesses and constraints in the plan preparation process and implementation strategies of the master plans prepared for Lahore. It also critically reviews the new master plan, particularly with respect to the proposed implementation framework. The paper discusses the prospects and pre-conditions for successful implementation of the new plan in the light of historic analysis, interviews with stakeholders, and the new institutional context under the devolution plan.
Abstract: This paper draws a methodological framework adopted within an internal Telecom Italia project aimed at identifying, on a user-centred basis, the potential interest in a technological scenario that extends the typical home environment for communication and media fruition into a personal bubble. The problem is that involving users in the early stages of the development of such a disruptive technology scenario, by asking their opinions on something they do not yet handle even in a rough manner, could lead to wrong or distorted results. For that reason we chose an approach that indirectly aims to understand users' hidden needs, in order to obtain a meaningful picture of the possible interest in a technological proposition that is not yet easily understandable.
Abstract: Fiber optic sensor technology offers the possibility of sensing different parameters such as strain, temperature, and pressure in harsh environments and remote locations. These kinds of sensors modulate some features of the light wave in an optical fiber, such as intensity and phase, or use the optical fiber as a medium for transmitting the measurement information. The advantages of fiber optic sensors over conventional electrical ones make them popular in different applications, and nowadays they are considered a key component in improving industrial processes, quality control systems, medical diagnostics, and the prevention and control of general process abnormalities. This paper is an introduction to fiber optic sensor technology and some of the applications that make this branch of optic technology, which is still in its infancy, an interesting field.
Abstract: A generic and extendible Multi-Agent Data Mining (MADM) framework, MADMF (the Multi-Agent Data Mining Framework), is described. The central feature of the framework is that it avoids the use of agreed meta-language formats by supporting a framework of wrappers. The advantage offered is that the framework is easily extendible, so that further data agents and mining agents can simply be added to it. A demonstration MADMF framework is currently available. The paper includes details of the MADMF architecture and the wrapper principle incorporated into it. A full description and evaluation of the framework's operation is provided by considering two MADM scenarios.
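The wrapper principle, in general terms, is that each data source keeps its native format while a wrapper adapts it to one common interface, so that agents need no agreed meta-language. The following is a minimal sketch of that pattern; the class and method names are illustrative assumptions, not MADMF's actual API.

```python
# Sketch of the wrapper idea: heterogeneous data sources are each adapted
# to one common interface, so a mining agent never sees native formats.
class DataAgentWrapper:
    """Common interface every wrapped data source exposes to mining agents."""
    def records(self):
        raise NotImplementedError

class CsvSourceWrapper(DataAgentWrapper):
    def __init__(self, rows):          # rows: list of comma-separated strings
        self.rows = rows
    def records(self):
        return [r.split(",") for r in self.rows]

class DictSourceWrapper(DataAgentWrapper):
    def __init__(self, dicts):         # dicts: list of field -> value mappings
        self.dicts = dicts
        self.fields = sorted(dicts[0]) if dicts else []
    def records(self):
        return [[str(d[f]) for f in self.fields] for d in self.dicts]

def mining_agent_count(source: DataAgentWrapper):
    """A trivial 'mining agent' that works only through the common interface."""
    return len(source.records())

n = mining_agent_count(CsvSourceWrapper(["a,1", "b,2"])) + \
    mining_agent_count(DictSourceWrapper([{"x": 1}]))
```

Adding a new data agent then amounts to writing one more wrapper subclass, which is what makes such a framework easily extendible.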
Abstract: The most suitable semiconductor detector, cadmium zinc telluride (CdZnTe), has unique properties because of its high atomic number and wide band gap. In this project, it has been studied with respect to different processes, such as leakage, diffusion, generation and recombination, the effect of trapping, and carrier injection in CdZnTe, in order to obtain a complete solution. We then investigate the movement of the carriers (electrons and holes) using this solution.
Abstract: Nanoemulsions are a class of emulsions with a droplet size in the range of 50–500 nm that have attracted a great deal of attention in recent years because of their unique characteristics. The physicochemical properties of nanoemulsions suggest that they can be successfully used to recover the residual oil which is trapped in the fine pores of reservoir rock by capillary forces after primary and secondary recovery. Oil-in-water nanoemulsions, which can be formed by high-energy emulsification techniques using specific surfactants, can reduce the oil-water interfacial tension (IFT) by 3-4 orders of magnitude. The present work is aimed at the characterization of oil-in-water nanoemulsions in terms of their phase behavior, morphology, interfacial energy, and ability to reduce the interfacial tension, and at understanding the mechanisms of mobilization and displacement of entrapped oil blobs by lowering the interfacial tension at both the macroscopic and microscopic levels. In order to investigate the efficiency of oil-in-water nanoemulsions in enhanced oil recovery (EOR), experiments were performed to characterize the emulsions in terms of their physicochemical properties and the size distribution of the dispersed oil droplets in the water phase. Synthetic mineral oil and a series of surfactants were used to prepare the oil-in-water emulsions. Characterization shows that the emulsion follows pseudo-plastic behaviour and that the drop size of the dispersed oil phase follows a lognormal distribution. Flooding experiments were also carried out in a sandpack system to evaluate the effectiveness of the nanoemulsion as a displacing fluid for enhanced oil recovery. Substantial additional recoveries (more than 25% of the original oil in place) over conventional water flooding were obtained in the present investigation.
Abstract: This study was designed to formulate and pharmaceutically evaluate a topical skin-care cream (w/o emulsion) of Aloe vera versus its vehicle (base) as control, and to determine their effects on stratum corneum (SC) water content and transepidermal water loss (TEWL). A base containing no extract and a formulation containing 3% concentrated extract of Aloe vera were developed, the extract being entrapped in the inner aqueous phase of the w/o emulsion (cream). Lemon oil was incorporated to improve the odor. Both the base and the formulation were stored at 8°C±0.1°C (in a refrigerator), 25°C±0.1°C, 40°C±0.1°C, and 40°C±0.1°C with 75% RH (in an incubator) for a period of 4 weeks to predict their stability. The evaluation parameters consisted of color, smell, type of emulsion, phase separation, electrical conductivity, centrifugation, liquefaction, and pH. Both the base and the formulation were applied to the cheeks of 21 healthy human volunteers for a period of 8 weeks, and SC water content and TEWL were monitored every week to measure any effect produced by these topical creams. The expected organoleptic stability of the creams was achieved over the 4-week in-vitro study period. The odor disappeared with the passage of time due to volatilization of the lemon oil. Both the base and the formulation produced significant (p≤0.05) changes in TEWL with respect to time. SC water content was significantly (p≤0.05) increased by the formulation, while the base had insignificant (p>0.05) effects on SC water content. The newly formulated Aloe vera cream is suitable for improving and quantitatively monitoring skin hydration (SC water content/moisturizing effects) and for reducing TEWL in people with dry skin.
Abstract: We investigated the statistical performance of Bayesian inference using maximum entropy and MAP estimation for several models which approximate wave-fronts in remote sensing using SAR interferometry. Using Monte Carlo simulation for a set of wave-fronts generated by an assumed true prior, we found that the method of maximum entropy realized the optimal performance around the Bayes-optimal conditions by using the model of the true prior and the likelihood representing the optical measurement due to the interferometer. We also found that MAP estimation, regarded as a deterministic limit of maximum entropy, achieved almost the same performance as the Bayes-optimal solution for the set of wave-fronts. We then clarified that MAP estimation perfectly carried out phase unwrapping without using prior information, and also that MAP estimation realized accurate phase unwrapping using the conjugate gradient (CG) method, provided that the model of the true prior was assumed appropriately.
Abstract: In this work, we present a novel active learning approach for learning a visual object detection system. Our system is composed of an active learning mechanism acting as a wrapper around a sub-algorithm which implements an online boosting-based object detector. At the core is a combination of a bootstrap procedure and a semi-automatic learning process based on the online boosting procedure. The idea is to exploit the availability of the classifier during learning to automatically label training samples and incrementally improve the classifier. This addresses the issue of reducing labeling effort while obtaining better performance. In addition, we propose a verification process for further improvement of the classifier; the idea is to allow re-updating on seen data during learning to stabilize the detector. The main contribution of this empirical study is a demonstration that active learning based on an online boosting approach trained in this manner can achieve results comparable to, or even better than, those of a framework trained in the conventional manner with much more labeling effort. Empirical experiments on challenging data sets for specific object detection problems show the effectiveness of our approach.
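The automatic-labeling loop described above can be sketched in a much simplified form. Here a toy 1-D threshold classifier stands in for the online boosted detector, and distance from the decision boundary stands in for classifier confidence; all names, data, and thresholds are illustrative assumptions, not the paper's system.

```python
import random

random.seed(2)

# Toy data: 1-D points with true class 1 when x > 0.5. The loop mirrors the
# idea of letting the current classifier label its confident samples and
# retraining on the grown training set.
labeled = [(i / 10, int(i / 10 > 0.5)) for i in range(10)]   # small seed set
unlabeled = [random.random() for _ in range(200)]

def train(samples):
    # Stand-in learner: threshold halfway between the two class means.
    ones = [x for x, y in samples if y == 1]
    zeros = [x for x, y in samples if y == 0]
    return (sum(ones) / len(ones) + sum(zeros) / len(zeros)) / 2

def confidence(thr, x):
    return abs(x - thr)          # distance from the boundary as confidence

thr = train(labeled)
for _ in range(5):               # self-training rounds
    # Let the current classifier label only samples it is confident about...
    confident = [x for x in unlabeled if confidence(thr, x) > 0.3]
    labeled += [(x, int(x > thr)) for x in confident]
    unlabeled = [x for x in unlabeled if confidence(thr, x) <= 0.3]
    # ...then retrain on the enlarged training set.
    thr = train(labeled)
```

The confidence gate is what keeps automatically generated labels from drifting; the verification/re-update step in the abstract plays an analogous stabilizing role for the real detector.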
Abstract: This work describes the process of developing services and applications for seamless communication within a Telecom Italia long-term research project whose central aim is the design of a wearable communication device. In particular, the objective was to design a wrist phone integrated into people's everyday lives in full transparency. The methodology used to design the wristwatch was developed through several subsequent steps, also involving the Personas Layering Framework. The data collected in these phases have been very useful for designing an improved version of the first two wrist phone concepts, changing aspects related to the four critical points expressed by the users.
Abstract: Developing an accurate classifier for high-dimensional microarray datasets is a challenging task due to the small available sample size. It is therefore important to determine a set of relevant genes that classify the data well. Traditionally, gene selection methods select the top-ranked genes according to their discriminatory power, but these genes are often correlated with each other, resulting in redundancy. In this paper, we propose a hybrid method using feature ranking and a wrapper method (a genetic algorithm with a multiclass SVM) to identify a set of relevant genes that classify the data more accurately. A new fitness function for the genetic algorithm is defined that focuses on selecting the smallest set of genes that provides maximum accuracy. Experiments have been carried out on four well-known datasets. The proposed method provides better results than those found in the literature in terms of both classification accuracy and the number of genes selected.
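The wrapper idea, a GA whose fitness evaluates each candidate gene subset by actually training a classifier on it, can be sketched as follows. A leave-one-out nearest-centroid classifier on synthetic data stands in for the multiclass SVM on real microarray data, and the fitness combines accuracy with a small parsimony reward, echoing the smallest-set-with-maximum-accuracy idea; all names and values are illustrative assumptions, not the paper's fitness function.

```python
import random

random.seed(3)

# Synthetic data: 30 samples, 20 "genes"; only genes 0 and 1 carry signal.
def make_sample(cls):
    x = [random.gauss(0, 1) for _ in range(20)]
    x[0] += 3 * cls
    x[1] -= 3 * cls
    return x, cls

data = [make_sample(c) for c in (0, 1) for _ in range(15)]

def loo_accuracy(mask):
    """Leave-one-out accuracy of a nearest-centroid classifier restricted
    to the genes selected by the 0/1 mask (the wrapper evaluation)."""
    genes = [g for g in range(20) if mask[g]]
    if not genes:
        return 0.0
    correct = 0
    for i, (xi, yi) in enumerate(data):
        cents = {}
        for j, (xj, yj) in enumerate(data):
            if j != i:
                cents.setdefault(yj, []).append([xj[g] for g in genes])
        def dist(c):
            pts = cents[c]
            mean = [sum(p[k] for p in pts) / len(pts) for k in range(len(genes))]
            return sum((xi[genes[k]] - mean[k]) ** 2 for k in range(len(genes)))
        correct += min(cents, key=dist) == yi
    return correct / len(data)

def fitness(mask):
    # Accuracy first, plus a small reward for selecting fewer genes.
    return loo_accuracy(mask) - 0.01 * sum(mask)

pop = [[random.randint(0, 1) for _ in range(20)] for _ in range(12)]
for _ in range(15):                      # generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:6]                    # truncation selection
    children = []
    for _ in range(6):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, 20)
        child = a[:cut] + b[cut:]        # one-point crossover
        if random.random() < 0.3:        # mutation: flip one gene bit
            g = random.randrange(20)
            child[g] ^= 1
        children.append(child)
    pop = parents + children
best = max(pop, key=fitness)
```

The parsimony term in `fitness` is what drives the GA toward small gene subsets once accuracy saturates, which is the stated goal of the proposed fitness function.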
Abstract: Sharing consistent and correct master data among disparate applications in a reverse-logistics chain has long been recognized as an intricate problem. Although a master data management (MDM) system can surely assume that responsibility, applications that need to co-operate with it must comply with the proprietary query interfaces provided by the specific MDM system. In this paper, we present an RFID-ready MDM system which makes master data readily available to any participating application in a reverse-logistics chain. We propose an RFID-wrapper as a part of our MDM system; it acts as a gateway between any data retrieval request and the query interfaces that process it. With the RFID-wrapper, any participating application in a reverse-logistics chain can easily retrieve master data in a way that is analogous to the retrieval of any other RFID-based logistics transactional data.
Abstract: On the basis of Bayesian inference using the maximizer of the posterior marginal estimate, we carry out phase unwrapping using multiple interferograms via generalized mean-field theory. In numerical calculations for a typical wave-front in remote sensing using synthetic aperture radar interferometry, the phase diagram in hyper-parameter space clarifies that the present method succeeds in phase unwrapping perfectly under the constraint of the surface-consistency condition, if the interferograms are not corrupted by any noise. We also find that the prior is useful for extending the regime in which phase unwrapping succeeds under the constraint of the surface-consistency condition. These results are quantitatively confirmed by Monte Carlo simulation.
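For reference, the elementary operation underlying phase unwrapping, removing 2π jumps so that neighbouring samples differ by less than π, can be sketched in its naive 1-D form (far simpler than the Bayesian 2-D method of the abstract; the function name is ours):

```python
import math

def unwrap_1d(phases):
    """Naive 1-D phase unwrapping: shift each sample by a multiple of 2*pi
    so that consecutive differences fall within (-pi, pi]."""
    out = [phases[0]]
    for p in phases[1:]:
        d = p - out[-1]
        d -= 2 * math.pi * round(d / (2 * math.pi))  # wrap the difference
        out.append(out[-1] + d)
    return out
```

For example, the wrapped sequence `[0.0, 3.0, -3.0]` (where -3.0 is really 2π-3 ≈ 3.28 folded into (-π, π]) is restored to a monotone ramp. The 2-D problem treated in the abstract is much harder because local decisions like this one must remain globally consistent over a grid, which is exactly where the Bayesian prior and the surface-consistency constraint come in.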