Abstract: Recent years have seen increasing use of image analysis techniques in biomedical imaging, in particular microscopic imaging. Most image analysis techniques rely on a background image free of objects of interest, whether cells or histological samples, to perform further analysis such as segmentation or mosaicing. Commonly, this image is an empty field acquired in advance. However, acquiring an empty field is often not feasible. Moreover, it may differ from the background of the sample actually being studied because of interaction with the organic matter. Finally, it can be expensive, for instance in the case of live cell analyses. We propose a non-parametric, general-purpose approach in which the background is built automatically from a sequence of images, even when they contain objects of interest. The amount of object-free area in each image affects only the overall speed at which the background is obtained. Experiments with different kinds of microscopic images demonstrate the effectiveness of our approach.
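As an illustration of building a background from an image sequence, the sketch below uses the per-pixel temporal median, a common baseline rather than necessarily the method proposed in this abstract; the frame sizes and pixel values are synthetic.

```python
import numpy as np

# Baseline background estimation: the per-pixel temporal median of a frame
# stack ignores objects that occupy any given pixel in only a minority of
# the frames, leaving an object-free background estimate.
def median_background(frames):
    stack = np.stack(frames, axis=0)      # shape: (n_frames, H, W)
    return np.median(stack, axis=0)

# Five synthetic 4x4 frames; one frame has a bright "object" at pixel (1, 1).
frames = [np.zeros((4, 4)) for _ in range(5)]
frames[2][1, 1] = 255.0
bg = median_background(frames)            # object does not survive the median
```

The more of each frame that is object-free, the fewer frames are needed before the median stabilizes, which matches the speed trade-off described above.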
Abstract: This paper presents the fundamentals of origami engineering and its applications in present-day as well as future industry. Several core mathematical approaches, such as the Huzita-Hatori axioms and the Maekawa and Kawasaki theorems, are introduced briefly. Flaps and circle packing by Robert Lang are explained to clarify the underlying principles of crease pattern design. Rigid origami and its corrugation patterns, which are potentially applicable to creating transformable or temporary spaces, are discussed to show the transition of origami from paper to thick materials. Moreover, some innovative applications of origami, such as eyeglasses, the origami stent, and high-tech origami based on the above theories and principles, are showcased in Section III, while recent origami technologies such as Vacuumatics, self-folding polymer sheets, and programmable matter folding, which could greatly enhance origami structures, are demonstrated in Section IV to offer more insight into the future of origami.
Abstract: In this paper we present a technique to speed up ICA based on reducing the dimensionality of the data set while preserving the quality of the results. In particular we refer to the FastICA algorithm, which uses kurtosis as the statistical property to be maximized. By performing a Johnson-Lindenstrauss-like projection of the data set, we find the minimum dimensionality reduction rate ρ, defined as the ratio between the size k of the reduced space and the original size d, which guarantees a narrow confidence interval for this estimator with a high confidence level. The derived dimensionality reduction rate depends on a system control parameter β that is easily computed a priori on the basis of the observations alone. Extensive simulations on different sets of real-world signals show that the achievable dimensionality reduction is in fact very high, that it preserves the quality of the decomposition, and that it speeds up FastICA dramatically. Conversely, signals for which the estimated reduction rate is greater than 1 exhibit poor decomposition results when reduced, validating the reliability of the parameter β. We are confident that our method will lead to a better approach to real-time applications.
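The projection step can be sketched as follows. This is a generic Johnson-Lindenstrauss-style Gaussian random projection with an illustrative reduction rate ρ = k/d, not the paper's specific construction or its β-based rate selection; the sizes and data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: d original mixtures, n samples, k reduced dimensions.
d, n, k = 100, 5000, 20
X = rng.standard_normal((d, n))      # toy stand-in for observed mixtures

rho = k / d                          # dimensionality reduction rate

# Gaussian random projection, scaled by 1/sqrt(k) so that squared norms
# are preserved in expectation (the JL property).
R = rng.standard_normal((k, d)) / np.sqrt(k)
Y = R @ X                            # reduced data set, shape (k, n)

# Norms survive the projection approximately, which is why kurtosis-based
# contrast values computed on Y stay close to those computed on X.
ratio = np.linalg.norm(Y, axis=0).mean() / np.linalg.norm(X, axis=0).mean()
```

FastICA would then be run on the k-dimensional data Y instead of the d-dimensional X, which is where the speed-up comes from.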
Abstract: A mobile ad hoc network (MANET) is an autonomous system of mobile nodes connected by multi-hop wireless links without centralized infrastructure support. As mobile communication gains popularity, the need for suitable ad hoc routing protocols will continue to grow. Efficient dynamic routing is an important research challenge in such networks. Bandwidth-constrained mobile devices use an on-demand approach in their routing protocols because of its effectiveness and efficiency. Many researchers have conducted numerous simulations comparing the performance of these protocols under varying conditions and constraints. Most of these studies, however, disregard the MAC protocol, which affects the relative performance of the routing protocols in different network scenarios. In this paper we investigate how the choice of MAC protocol affects the relative performance of ad hoc routing protocols under different scenarios. We have evaluated the performance of these protocols using NS2 simulations. Our results show that the performance of ad hoc routing protocols suffers when they are run over different MAC layer protocols.
Abstract: Recently, there has been increasing interest in RFID, and RFID systems have been applied to various applications. Load balancing is a fundamental technique for providing scalability by moving workload from overloaded nodes to under-loaded nodes. This paper presents an approach to adaptive load balancing for RFID middlewares. The workload of an RFID middleware can vary considerably with the location of the connected RFID readers and can change abruptly at any instant. The proposed approach takes these characteristics of RFID middlewares into account to provide efficient load balancing.
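The core idea of moving workload from overloaded to under-loaded nodes can be sketched with a simple greedy rebalancer. This is a generic illustration, not the adaptive scheme proposed in this abstract; the middleware names and load figures are hypothetical.

```python
# Greedy rebalancing: repeatedly move one unit of work from the most loaded
# middleware node to the least loaded one until the spread is within the
# given threshold.
def rebalance(loads, threshold=1):
    loads = dict(loads)  # work on a copy
    while True:
        hi = max(loads, key=loads.get)   # most loaded node
        lo = min(loads, key=loads.get)   # least loaded node
        if loads[hi] - loads[lo] <= threshold:
            return loads
        loads[hi] -= 1
        loads[lo] += 1

# Hypothetical middlewares with uneven reader-driven workloads.
balanced = rebalance({"mw1": 10, "mw2": 2, "mw3": 3})
```

An adaptive variant would re-run such a step whenever reader-induced load shifts are detected, matching the abrupt workload changes described above.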
Abstract: Green spaces might be very attractive, but where are the economic benefits? What value do nature and landscape have for us? What difference will they make to jobs, health, and the economic strength of areas struggling with deprivation and social problems? [1] There is a need to consider green spaces from a different perspective. Green planning is not just about flora and fauna, but also about planning for economic benefits [2]. It is worth trying to quantify the value of green spaces, since nature and landscape are crucially important to our quality of life and to sustainable development. The reality, however, is that urban development often takes place at the expense of green spaces. Urbanization is an ongoing process throughout the world; however, hyper-urbanization without environmental planning is destructive, not constructive [3]. Urban spaces are believed to be more valuable than other land uses, particularly green areas, simply because of the market value attached to urban spaces. However, attractive landscapes can raise the quality and value of the urban market even further. In order to reach these objectives of integrated planning, the Green-Value-Gap needs to be bridged. Economists have to understand the concept of green planning and its spinoffs, and environmentalists have to understand the importance of urban economic development and its benefits to green planning. An interface between environmental management, economic development, and sustainable spatial planning is needed to bridge the Green-Value-Gap.
Abstract: This article presents results from using a parametric approach and the Wavelet Transform to analyse signals emitted by the sperm whale. Extracting the intrinsic characteristics of these unique signals emitted by marine mammals remains a difficult exercise for several reasons: firstly, they are non-stationary signals, and secondly, they are obstructed by interfering background noise. In this article, we compare the advantages and disadvantages of the two methods: autoregressive models and the Wavelet Transform. These approaches serve as an alternative to the commonly used estimators based on the Fourier Transform, for which the hypotheses necessary for its application are, in certain cases, not sufficiently well established. These modern approaches provide effective results, particularly for the periodic tracking of the signal's characteristics, and notably when the signal-to-noise ratio negatively affects signal tracking. Our objectives are twofold. The first is to identify the animal through its acoustic signature, including recognition of the marine mammal species and ultimately of the individual animal within the species. The second is much more ambitious and directly involves cetologists in studying the sounds emitted by marine mammals in an effort to characterize their behaviour. We are working on an approach based on recordings of marine mammal signals whose findings result from the Wavelet Transform, and this article explores the reasons for this choice. In addition, thanks to the use of new processors, these algorithms, once heavy in calculation time, can now be integrated in a real-time system.
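To illustrate the wavelet side of the comparison, the sketch below performs one level of a Haar wavelet decomposition, the simplest example of the time-frequency splitting that wavelet analysis applies to transient click signals. It is a generic illustration, not the transform or data used in the article; the input signal is synthetic.

```python
import numpy as np

# One level of a Haar wavelet decomposition: the approximation carries the
# coarse (low-frequency) content, the detail carries the transient
# (high-frequency) content -- the part where clicks show up.
def haar_step(x):
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass pairwise averages
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass pairwise differences
    return approx, detail

# A flat synthetic signal has no transient content: details are all zero.
approx, detail = haar_step([1.0, 1.0, 2.0, 2.0])
```

Repeating the step on the approximation yields a multi-resolution decomposition, which is what makes wavelets suitable for non-stationary signals where Fourier assumptions break down.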
Abstract: Lung cancer accounts for the most cancer-related deaths in both men and women. The identification of cancer-associated genes and the related pathways is essential for the prevention of many types of cancer. In this work, two filter approaches, namely information gain and the biomarker identifier (BMI), are used for the identification of different types of small-cell and non-small-cell lung cancer. A new method is proposed to determine BMI thresholds that prioritize genes (as primary, secondary, and tertiary) using a k-means clustering approach. Sets of key genes were identified that can be found in several pathways. It turned out that the modified BMI is well suited to microarray data, and BMI is therefore proposed as a powerful tool in the search for new and so far undiscovered cancer-related genes.
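The thresholding idea can be sketched as follows: cluster one-dimensional BMI scores into three groups with a small k-means and rank the clusters by centre to get primary/secondary/tertiary tiers. This is an illustrative reconstruction, not the paper's exact procedure; the scores and the quantile initialization are assumptions.

```python
import numpy as np

# 1-D k-means with k=3 over BMI scores; the highest-centre cluster becomes
# tier 0 (primary), then 1 (secondary), then 2 (tertiary).
def tier_genes(scores, iters=50):
    scores = np.asarray(scores, dtype=float)
    centres = np.quantile(scores, [0.1, 0.5, 0.9])  # spread-out initial centres
    for _ in range(iters):
        labels = np.argmin(np.abs(scores[:, None] - centres[None, :]), axis=1)
        centres = np.array([scores[labels == j].mean() if np.any(labels == j)
                            else centres[j] for j in range(3)])
    order = np.argsort(-centres)                    # descending centre order
    rank = {c: r for r, c in enumerate(order)}      # cluster id -> tier
    return np.array([rank[l] for l in labels])

# Synthetic scores with three clear groups of genes.
tiers = tier_genes([0.9, 0.95, 0.92, 0.5, 0.55, 0.1, 0.12])
```

The boundaries between the resulting clusters then serve as the BMI thresholds separating the gene tiers.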
Abstract: Developing a supply chain management (SCM) system is costly but important. However, because of its complicated nature, few such projects are considered successful. Few research publications directly relate to key success factors (KSFs) for implementing an SCM system. Motivated by the above, this research proposes a hierarchy of KSFs for SCM system implementation in the semiconductor industry using a two-step approach. First, a literature review establishes the initial hierarchy. Second, a focus group approach finalizes the proposed KSF hierarchy by extracting valuable experience from executives and managers who actively participated in a project that successfully established seamless SCM integration between the world's largest semiconductor foundry and the world's largest assembly and testing company. Future project executives may refer to the resulting KSF hierarchy as a checklist for SCM system implementation in the semiconductor and related industries.
Abstract: Approximate tandem repeats in a genomic sequence are two or more contiguous, similar copies of a pattern of nucleotides. They are used in DNA mapping, in studying molecular evolution mechanisms, in forensic analysis, and in research on the diagnosis of inherited diseases. Their functions are still under investigation and not well defined, but growing biological databases, together with tools for identifying these repeats, may lead to the discovery of their specific roles or their correlation with particular features. This paper presents a new approach to finding approximate tandem repeats in a given sequence, where the similarity between consecutive repeats is measured using the Hamming distance. It is an enhancement of a method for finding exact tandem repeats in DNA sequences based on the Burrows-Wheeler transform.
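The Hamming-distance similarity criterion can be sketched as follows. This is a naive verification check, not the paper's BWT-based search algorithm; the sequences and the distance budget are illustrative.

```python
# Hamming distance: number of mismatching positions between equal-length strings.
def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# A region is an approximate tandem repeat of the given period if it splits
# into full-length consecutive copies, each within max_dist mismatches of
# the first copy.
def is_approx_tandem(region, period, max_dist):
    pattern = region[:period]
    copies = [region[i:i + period] for i in range(0, len(region), period)]
    return all(len(c) == period and hamming(pattern, c) <= max_dist
               for c in copies)

ok = is_approx_tandem("ACGTACGAACGT", 4, 1)   # ACGT / ACGA / ACGT: distances 0, 1, 0
bad = is_approx_tandem("ACGTTTTT", 4, 1)      # ACGT / TTTT: distance 3 exceeds budget
```

A search algorithm such as the BWT-based one enhanced here would enumerate candidate regions and periods far more efficiently than testing every possibility with a check like this.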
Abstract: Morphological operators transform an original image into another image through its interaction with an image of certain shape and size known as the structuring element. Mathematical morphology provides a systematic approach to analyzing the geometric characteristics of signals and images, and has been applied widely to many applications such as edge detection, object segmentation, and noise suppression. Fuzzy mathematical morphology aims to extend the binary morphological operators to grey-level images. In order to define the basic morphological operations of fuzzy erosion, dilation, opening, and closing, a general method based upon fuzzy implication and inclusion grade operators is introduced. The fuzzy morphological operations extend the ordinary morphological operations by using fuzzy sets, where the union operation is replaced by a maximum operation and the intersection operation is replaced by a minimum operation.
This work consists of two parts. In the first, fuzzy set theory, fuzzy mathematical morphology based on fuzzy logic and fuzzy set theory, and the fuzzy morphological operations and their properties are studied in detail. In the second part, the application of fuzziness in mathematical morphology to practical work such as image processing is discussed with illustrative problems.
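The max-for-union and min-for-intersection substitution can be sketched on a one-dimensional grey-level signal with a flat structuring element. This is a minimal illustration of the general idea, not the implication-based operators introduced above; the signal values and window width are synthetic.

```python
import numpy as np

# Grey-level dilation: maximum over a flat structuring element (the fuzzy
# counterpart of set union). Edge padding keeps the output the same length.
def dilate(signal, width):
    pad = width // 2
    padded = np.pad(signal, pad, mode="edge")
    return np.array([padded[i:i + width].max() for i in range(len(signal))])

# Grey-level erosion: minimum over the same element (the counterpart of
# set intersection).
def erode(signal, width):
    pad = width // 2
    padded = np.pad(signal, pad, mode="edge")
    return np.array([padded[i:i + width].min() for i in range(len(signal))])

s = np.array([0.0, 0.0, 1.0, 0.0, 0.0])   # a single bright "spike"
d = dilate(s, 3)                          # the spike spreads to its neighbours
e = erode(s, 3)                           # the isolated spike is removed
```

Opening (erosion then dilation) and closing (dilation then erosion) are built by composing these two primitives, exactly as in the ordinary grey-level case.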
Abstract: The controllable electrical loss, which consists of the copper loss and the iron loss, can be minimized by optimal control of the armature current vector. A control algorithm for the current vector minimizing the electrical loss is proposed, in which the optimal current vector is determined by the operating speed and the load conditions. The proposed control algorithm is applied to an experimental PM motor drive system, and this paper presents a modern approach to speed control of a permanent magnet synchronous motor (PMSM) for an electric vehicle using nonlinear control. The regulation algorithms are based on the feedback linearization technique. The direct component of the current is controlled to be zero, which ensures maximum-torque operation. Near-unity power factor operation is also achieved. Moreover, among the features of EV electric propulsion motors, energy efficiency is a basic characteristic that is influenced by vehicle dynamics and system architecture. For this reason, the EV dynamics are taken into account.
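The loss-minimizing current vector idea can be sketched numerically: for a given torque-producing q-axis current and speed, choose the d-axis current that minimizes copper loss plus a speed-dependent iron loss. The machine parameters and the crude iron-loss model below are invented for this sketch and are not taken from the paper.

```python
import numpy as np

# Hypothetical machine constants: stator resistance, magnet flux linkage,
# d-axis inductance, and an iron-loss coefficient.
R_S, PSI_M, L_D, C_FE = 0.05, 0.1, 0.01, 0.01

def total_loss(i_d, i_q, speed):
    copper = 1.5 * R_S * (i_d**2 + i_q**2)      # ohmic loss in the windings
    flux = PSI_M + L_D * i_d                    # d-axis flux linkage
    iron = C_FE * speed**2 * flux**2            # crude speed/flux-dependent iron loss
    return copper + iron

# Grid search for the loss-minimizing d-axis current at a given operating point.
def best_id(i_q, speed, grid=np.linspace(-20.0, 0.0, 2001)):
    losses = [total_loss(i_d, i_q, speed) for i_d in grid]
    return float(grid[int(np.argmin(losses))])

id_standstill = best_id(10.0, 0.0)     # no iron loss at zero speed
id_high_speed = best_id(10.0, 3000.0)  # negative i_d trades copper for iron loss
```

At standstill only copper loss matters, so i_d = 0 is optimal, matching the maximum-torque condition above; at high speed a negative (flux-weakening) i_d becomes optimal because it reduces the dominant iron loss.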
Abstract: Methane is the second most important greenhouse gas (GHG) after carbon dioxide. The amount of methane emitted by the energy sector is increasing day by day with various activities. In the present work, various sources of methane emission from the upstream, midstream, and downstream oil & gas sectors are identified and categorised as per the IPCC 2006 guidelines. Data were collected from various oil & gas activities: (i) exploration & production of oil & gas, (ii) supply through pipelines, (iii) refinery throughput & production, (iv) storage & transportation, and (v) usage. Methane emission factors for the various categories were determined by applying the Tier-I and Tier-II approaches to the collected data. Total methane emission from the Indian oil & gas sector was thus estimated for the years 1990 to 2007.
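The Tier-I style estimate reduces to multiplying activity data by an emission factor and summing over categories, which can be sketched as follows. The category names, activity values, and factors below are hypothetical, not the study's data.

```python
# Hypothetical activity data (e.g. throughput per year) and emission factors
# (CH4 emitted per unit of activity) for a few oil & gas categories.
ACTIVITY = {"production": 120.0, "pipelines": 45.0, "refining": 60.0}
FACTOR = {"production": 0.8, "pipelines": 0.2, "refining": 0.1}

# Tier-I style total: emission = sum over categories of activity x factor.
def total_emission(activity, factor):
    return sum(activity[c] * factor[c] for c in activity)

total = total_emission(ACTIVITY, FACTOR)
```

A Tier-II estimate follows the same structure but replaces the default factors with country- or facility-specific ones derived from measured data.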
Abstract: This study used a positivist quantitative approach to examine the mathematical concept acquisition of KS4 (14-16) Special Education Needs (SEN) students within the school sector in England. The research is based on a pilot study, and the design is holistic in its approach, mixing methodologies. The study combines qualitative and quantitative methods in gathering formative data for the design process. Although the approach is best described as mixed-method, it rests fundamentally on a strong positivist paradigm, informed by my earlier understanding of the differentiation of the students, the student-teacher body, and the various indicators being measured, which requires an attenuated description of individual research subjects. The design process involves four phases with five key stages: literature review and document analysis, survey, interview, and observation, followed by analysis of the data set. The research identified the need for triangulation, with Reid's phases of data management providing the scaffold for the study. The study clearly identifies the ideological and philosophical aspects of educational research design for the study of mathematics by Special Education Needs (SEN) students in England using a virtual learning environment (VLE) platform.
Abstract: Firms have invested heavily in knowledge management (KM) with the aim of building a knowledge capability and using it to achieve a competitive advantage. Research has shown, however, that not all knowledge management projects succeed. Some studies report that about 84% of knowledge management projects fail. This paper integrates studies on the impediments to knowledge management into a theoretical framework. Based on this framework, five cases documenting failed KM initiatives were analysed. The analysis gives a clear picture of why certain KM projects fail: the high failure rate of KM can be explained by the gaps that exist between users and management in terms of KM perceptions and objectives.
Abstract: Operational risk has become one of the most discussed topics in the financial industry in recent years. The reasons for this attention can be attributed to higher investments in information systems and technology, the increasing wave of mergers and acquisitions, and the emergence of new financial instruments. In addition, the New Basel Capital Accord (known as Basel II) demands a capital requirement for operational risk and further motivates financial institutions to measure and manage this type of risk more precisely. The aim of this paper is to shed light on the main characteristics of operational risk management and on commonly applied methods: scenario analysis, key risk indicators, risk control self-assessment, and the loss distribution approach.
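The loss distribution approach named above can be sketched with a small Monte Carlo simulation: annual operational loss is modelled as a Poisson number of loss events, each with a lognormal severity, and a high quantile of the simulated distribution is read off as a capital estimate. All distributional choices and parameters below are hypothetical, chosen only to illustrate the structure of the method.

```python
import numpy as np

# Simulate many years of aggregate operational loss: frequency ~ Poisson(lam),
# severity ~ lognormal(mu, sigma), annual loss = sum of that year's severities.
def simulate_annual_losses(lam=5.0, mu=10.0, sigma=1.0, years=20_000, seed=0):
    rng = np.random.default_rng(seed)
    counts = rng.poisson(lam, size=years)            # number of events per year
    return np.array([rng.lognormal(mu, sigma, size=n).sum() for n in counts])

losses = simulate_annual_losses()
var_999 = np.quantile(losses, 0.999)                 # 99.9% annual-loss quantile
```

Under Basel II, the regulatory capital charge is tied to such a high quantile of the annual loss distribution, which is why the tail, rather than the mean, drives the estimate.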
Abstract: Rural areas of Tanzania are still disadvantaged in terms of the diffusion of IP-based services, owing to the lack of Information and Communication Technology (ICT) infrastructure, especially the lack of connectivity. One of the main obstacles to connectivity in rural areas of Tanzania is the high cost of establishing infrastructure for IP-based services [1-2]. However, the cost of connectivity varies from one technology to another and, at the same time, from one operator (service provider) to another within the country. This paper presents the development of a software system to calculate the cost of connectivity to rural areas of Tanzania. The system is developed to provide easy access to the connectivity costs of different technologies and different operators. The development of the calculator follows the V-model software development lifecycle. The calculator is used to evaluate the economic viability of different technologies considered potential candidates for providing rural connectivity. In this paper, the evaluation is based on a techno-economic analysis approach.
Abstract: As the electrical power industry is restructured, power exchange is expanding. A key piece of information used to determine how much power can be transferred through the network is the available transfer capability (ATC). To calculate ATC, the traditional deterministic approach is based on the severest case, but it is procedurally complex. Therefore, a novel approach to ATC calculation using a cost-optimization method is proposed in this paper and compared with the well-being method and the risk-benefit method. This paper derives the optimal transfer capability of the HVDC system between the mainland and a separated island in Korea through these three methods, which consider production cost, the wheeling charge through the HVDC system, and outage cost to a depth of one (N-1 contingency).
Abstract: In this paper, the mesh-free element free Galerkin (EFG) method is extended to solve two-dimensional potential flow problems. Two ideal fluid flow problems (flow over a rigid cylinder and flow over a sphere) have been formulated using a variational approach. Penalty and Lagrange multiplier techniques have been utilized to enforce the essential boundary conditions. Four-point Gauss quadrature has been used for the integration over the two-dimensional domain (Ω), and a nodal integration scheme has been used to enforce the essential boundary conditions on the edges (Γ). The results obtained by the EFG method are compared with those obtained by the finite element method. The effects of the scaling and penalty parameters on the EFG results are also discussed in detail.
Abstract: The work reported in this paper proposes swarm-array computing, a novel technique inspired by swarm robotics and built on the foundations of autonomic and parallel computing. The approach aims to apply autonomic computing constructs to parallel computing systems and, in effect, achieve the self-ware objectives that describe self-managing systems. The constitution of swarm-array computing, comprising four constituents, namely the computing system, the problem/task, the swarm, and the landscape, is considered, and approaches that bind these constituents together are proposed. Space applications employing FPGAs are identified as a potential area for applying swarm-array computing to build reliable systems. The feasibility of the proposed approach is validated on the SeSAm multi-agent simulator, with landscapes generated using the MATLAB toolkit.