Abstract: This paper presents the fundamentals of origami engineering and its applications in present-day as well as future industry. Several core mathematical approaches, such as the Huzita-Hatori axioms and Maekawa's and Kawasaki's theorems, are introduced briefly. Meanwhile, flaps and circle packing by Robert Lang are explained to clarify the underlying principles of crease pattern design. Rigid origami and its corrugation patterns, which are potentially applicable for creating transformable or temporary spaces, are discussed to show the transition of origami from paper to thick material. Moreover, some innovative applications of origami, such as the origami eyeglass, the origami stent, and high-tech origami based on the mentioned theories and principles, are showcased in Section III, while some recent origami technologies, such as Vacuumatics, self-folding of polymer sheets, and programmable matter folding, which could greatly enhance origami structures, are demonstrated in Section IV to offer more insight into future origami.
Abstract: In this paper we present a technique to speed up
ICA based on the idea of reducing the dimensionality of the data
set while preserving the quality of the results. In particular, we refer to
the FastICA algorithm, which uses the kurtosis as the statistical property
to be maximized. By performing a Johnson-Lindenstrauss-like
projection of the data set, we find the minimum dimensionality
reduction rate ρ, defined as the ratio between the size k of the reduced
space and the original dimension d, which guarantees a narrow confidence
interval for this estimator with a high confidence level. The derived
dimensionality reduction rate depends on a system control parameter
β easily computed a priori on the basis of the observations only.
Extensive simulations were performed on different sets of real-world
signals. They show that the achievable dimensionality reduction is in
fact very high, that it preserves the quality of the decomposition, and
that it impressively speeds up FastICA. On the other hand, a set of
signals on which the estimated reduction rate is greater than 1 exhibits
poor decomposition results if reduced, thus validating the reliability of
the parameter β. We are confident that our method will enable better
approaches to real-time applications.
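The projection step described above can be sketched as follows. This is an illustrative sketch with synthetic data: the reduced dimension k is chosen arbitrarily here, whereas the paper derives the admissible reduction rate from the control parameter β; the `kurtosis` helper merely shows the statistic that FastICA maximizes.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n = 100, 2000                 # original dimension, number of samples
k = 20                           # reduced dimension (rate k/d = 0.2)

X = rng.standard_normal((d, n))  # placeholder observations

# Johnson-Lindenstrauss-like projection: i.i.d. Gaussian entries scaled
# by 1/sqrt(k) approximately preserve pairwise distances with high
# probability, so the structure FastICA exploits survives the reduction.
R = rng.standard_normal((k, d)) / np.sqrt(k)
Y = R @ X                        # reduced data, shape (k, n)

def kurtosis(s):
    """Excess kurtosis, the non-Gaussianity measure FastICA maximizes."""
    s = (s - s.mean()) / s.std()
    return float(np.mean(s ** 4) - 3.0)

# A FastICA-style iteration would now run on Y instead of X, at roughly
# a k/d fraction of the per-iteration cost.
print(Y.shape)
```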
Abstract: Health problems linked to urban growth are current
major concerns of developing countries. In 2002 and 2005, an
interdisciplinary program “Populations et Espaces à Risques
SANitaires" (PERSAN) was set up under the patronage of the
Development and Research Institute. Centered on health in
Cameroon's urban environment, the program mainly sought to (i)
identify diarrhoea risk factors in Yaoundé and (ii) measure their
prevalence and apprehend their spatial distribution. The cross-sectional
epidemiological study that was carried out revealed a
diarrhoea prevalence of 14.4% (437 cases of diarrhoea among the 3,034
children examined). Also, among the risk factors studied, the household
refuse management methods used by city dwellers were statistically
associated with these diarrhoeas. Moreover, the levels of
diarrhoeal attacks varied considerably from one neighbourhood to
another because of the uneven urbanization process of the
Yaoundé metropolis.
Abstract: Curves for which the square of the distance
between two points is equal to zero are called minimal or isotropic
curves [4]. In this work, first, necessary and sufficient conditions to
be a pseudo helix, which is a special case of such curves, are
presented. Thereafter, it is proven that an isotropic curve's position
vector and pseudo curvature satisfy a vector differential equation of
fourth order. Additionally, by solving the mentioned
equation, the position vector of pseudo helices is obtained.
Abstract: Green spaces might be very attractive, but
where are the economic benefits? What value do nature and
landscape have for us? What difference will it make to jobs,
health and the economic strength of areas struggling with
deprivation and social problems? [1]. There is a need to consider
green spaces from a different perspective. Green planning is not just
about flora and fauna, but also about planning for economic benefits
[2]. It is worth trying to quantify the value of green spaces since
nature and landscape are crucially important to our quality of life and
sustainable development. The reality, however, is that urban
development often takes place at the expense of green spaces.
Urbanization is an ongoing process throughout the world; however,
hyper-urbanization without environmental planning is destructive,
not constructive [3]. Urban spaces are believed to be more valuable
than other land uses, particularly green areas, simply because of the
market value attached to urban spaces. However, attractive
landscapes can help raise the quality and value of the urban market
even more. In order to reach these objectives of integrated planning,
the Green-Value-Gap needs to be bridged. Economists have to
understand the concept of Green-Planning and its spinoffs, and
Environmentalists have to understand the importance of urban
economic development and its benefits to green planning. An
interface between Environmental Management, Economic
Development and sustainable Spatial Planning is needed to bridge
the Green-Value-Gap.
Abstract: The work reported in this paper proposes
Swarm-Array computing, a novel technique inspired by swarm
robotics, and built on the foundations of autonomic and parallel
computing. The approach aims to apply autonomic computing
constructs to parallel computing systems and in effect achieve the
self-ware objectives that describe self-managing systems. The
constitution of swarm-array computing, comprising four constituents,
namely the computing system, the problem/task, the swarm, and the
landscape, is considered. Approaches that bind these constituents
together are proposed. Space applications employing FPGAs are
identified as a potential area for applying swarm-array computing for
building reliable systems. The feasibility of the proposed approach is
validated on the SeSAm multi-agent simulator, and landscapes are
generated using the MATLAB toolkit.
Abstract: This paper presents the design of a large 20 K cold shield used for infrared radiation shielding in space environment simulation tests. The cold shield is cooled by five G-M cryocoolers and is the largest of its kind in our country. The installation and layout of the cold shield and the compensator for thermal contraction on cooling are described in detail. The temperature distribution and cool-down time of the cold shield surface are also calculated and analysed in this paper. The design successfully resolves the difficulty of compensating for contraction on cooling. Test results show that the actual technical performance indicators of the cold shield met and exceeded the design requirements.
Abstract: In this paper, a fragile watermarking scheme is proposed for the authentication of specified objects in color images. The color image is first transformed from the RGB to the YST color space, which is suitable for watermarking color media. The T channel corresponds to the chrominance component of a color image and YS ⊥ T; it is therefore selected for embedding the watermark. The T channel is first divided into 2×2 non-overlapping blocks and the two LSBs are set to zero. The object to be authenticated is also divided into 2×2 non-overlapping blocks, and each block's intensity mean is computed and encoded into eight bits. The generated watermark is then embedded into the LSBs of randomly selected 2×2 blocks of the T channel using a 2D Torus Automorphism. The choice of block size is paramount for exact localization and recovery of the work. The proposed scheme is blind, efficient, and secure, with the ability to detect and locate even minor tampering applied to the image, with full recovery of the original work. The quality of the watermarked media is quite high, both subjectively and objectively. The technique is suitable for classes of images in formats such as GIF, TIFF, or bitmap.
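The 2D Torus Automorphism used for selecting embedding blocks can be illustrated with a small sketch. This is a generic determinant-one automorphism with a hypothetical key parameter `k`, not necessarily the exact matrix used in the scheme:

```python
# Sketch of a 2D Torus Automorphism for scattering watermark blocks
# over an N x N grid of 2x2 blocks. The map below corresponds to the
# matrix [[1, 1], [k, k + 1]], whose determinant is 1, so it permutes
# the grid positions; k (and the iteration count) act as secret keys.
def torus_automorphism(x, y, k, N):
    """Map block coordinates (x, y) to their scattered position."""
    return (x + y) % N, (k * x + (k + 1) * y) % N

# Because the map is a bijection on Z_N x Z_N, every watermark block
# lands in a unique target block.
N, k = 8, 3
positions = {(x, y) for x in range(N) for y in range(N)}
mapped = {torus_automorphism(x, y, k, N) for (x, y) in positions}
assert mapped == positions
```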
Abstract: In this study, the sorption of Malachite green (MG) on Hydrilla verticillata biomass, a submerged aquatic plant, was investigated in a batch system. The effects of operating parameters such as temperature, adsorbent dosage, contact time, adsorbent size, and agitation speed on the sorption of Malachite green were analyzed using response surface methodology (RSM). According to ANOVA results, the proposed quadratic model for the central composite design (CCD) fitted the experimental data very well and could be used to navigate the design space. The optimum sorption conditions were determined as a temperature of 43.5 °C, an adsorbent dosage of 0.26 g, a contact time of 200 min, an adsorbent size of 0.205 mm (65 mesh), and an agitation speed of 230 rpm. The Langmuir and Freundlich isotherm models were applied to the equilibrium data. The maximum monolayer coverage capacity of Hydrilla verticillata biomass for MG was found to be 91.97 mg/g at an initial pH of 8.0, indicating that this is the optimum initial sorption pH. The external and intra-particle diffusion models were also applied to the sorption data of Hydrilla verticillata biomass with MG, and it was found that both external and intra-particle diffusion contribute to the actual sorption process. The pseudo-second-order kinetic model described the MG sorption process well.
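The Langmuir fit mentioned above can be sketched via its standard linearized form. The equilibrium data below are synthetic values chosen for illustration, not the paper's measurements:

```python
# Illustrative sketch of fitting the Langmuir isotherm.
# Langmuir model:   q_e = q_max * K_L * C_e / (1 + K_L * C_e)
# Linearized form:  C_e/q_e = 1/(q_max*K_L) + C_e/q_max

Ce = [5.0, 10.0, 20.0, 40.0, 80.0]   # equilibrium concentration (mg/L)
qe = [20.0, 33.3, 50.0, 66.7, 80.0]  # sorbed amount (mg/g), synthetic

# Least-squares fit of y = a + b*x with y = Ce/qe, x = Ce
x = Ce
y = [c / q for c, q in zip(Ce, qe)]
n = len(x)
sx, sy = sum(x), sum(y)
sxx = sum(v * v for v in x)
sxy = sum(u * v for u, v in zip(x, y))
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope = 1/q_max
a = (sy - b * sx) / n                          # intercept = 1/(q_max*K_L)
q_max = 1.0 / b
K_L = b / a
print(f"q_max = {q_max:.1f} mg/g, K_L = {K_L:.3f} L/mg")
```

With these synthetic data, the fit recovers the monolayer capacity q_max and the Langmuir constant K_L used to generate them.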
Abstract: The rates of production of the main products of the Fischer-Tropsch reactions over an Fe/HZSM5 bifunctional catalyst in a fixed-bed reactor are investigated over a broad range of temperatures, pressures, space velocities, H2/CO feed molar ratios, and CO2, CH4, and water flow rates. Model discrimination and parameter estimation were performed according to the integral method of kinetic analysis. Owing to the lack of mechanism development for Fischer-Tropsch synthesis on bifunctional catalysts, 26 different models were tested and the best model was selected. Comprehensive one- and two-dimensional heterogeneous reactor models are developed to simulate the performance of fixed-bed Fischer-Tropsch reactors. To reduce the computational time for optimization purposes, an Artificial Feed Forward Neural Network (AFFNN) has been used to describe intra-particle mass and heat transfer diffusion in the catalyst pellet. The products' reaction rates are found to increase with H2 partial pressure and decrease with CO partial pressure. The results show that the hybrid model agrees well with the rigorous mechanistic model while being about 25-30 times faster.
Abstract: The main goal of this work is to propose a way of
using two nontraditional algorithms in combination to solve topological
problems on telecommunications concentrator networks. The
algorithms suggested are the Simulated Annealing algorithm and the
Genetic Algorithm. The Simulated Annealing algorithm unifies
the well-known local search algorithms. In addition, Simulated
Annealing allows the acceptance of moves in the search space which lead
to solutions with higher cost, in order to attempt to escape any
local minima encountered. The Genetic Algorithm is a heuristic approach
which is used in a wide range of optimization problems. In recent
years this approach has also been widely applied to
telecommunications network planning. In order to solve more or
less complex planning problems, it is important to find the most
appropriate parameters for initializing the algorithm.
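The Simulated Annealing acceptance rule described above can be sketched as follows. The cost function, neighbourhood move, and cooling schedule are illustrative placeholders, not the paper's concentrator-network formulation:

```python
import math, random

# Worse (higher-cost) moves are accepted with probability
# exp(-delta/T), which lets the search escape local minima.
def accept(delta, temperature, rng=random.random):
    """Accept a move with cost change `delta` at the given temperature."""
    if delta <= 0:          # improving moves are always accepted
        return True
    return rng() < math.exp(-delta / temperature)

def anneal(cost, neighbour, x0, t0=10.0, alpha=0.95, steps=1000):
    """Minimal SA loop with a geometric cooling schedule."""
    x, t = x0, t0
    best = x
    for _ in range(steps):
        y = neighbour(x)
        if accept(cost(y) - cost(x), t):
            x = y
            if cost(x) < cost(best):
                best = x
        t *= alpha          # cool down
    return best

# Toy usage: minimize (x - 3)^2 over the integers.
random.seed(0)
result = anneal(lambda x: (x - 3) ** 2,
                lambda x: x + random.choice([-1, 1]),
                x0=20)
print(result)
```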
Abstract: The innovative intelligent fuzzy weighted input
estimation method (FWIEM) can be applied to the inverse heat
conduction problem (IHCP) to estimate the unknown
time-varying heat flux of multilayer materials, as presented in this
paper. The feasibility of the method is verified by a
temperature measurement experiment. The experimental module is
constructed from a copper sample stacked with four
aluminum samples of different thicknesses. Furthermore, the
bottom of the copper sample is heated by a standard heat
source, and the temperatures on top of the aluminum samples are measured
with thermocouples. The temperature measurements are then
used as inputs to the presented method to estimate the heat
flux at the bottom of the copper sample. The influence on the estimation
of the temperature measurements from samples of different
thicknesses, the process noise covariance Q, the weighting factor γ,
the sampling time interval Δt, and the spatial discretization interval Δx
is investigated through the experimental verification. The
results show that this method is efficient and robust in estimating the
unknown time-varying heat input of multilayer materials.
Abstract: This research investigates the design of a low-cost 3D
spatial interaction approach using the Wii Remote for immersive
Head-Mounted Display (HMD) virtual reality. Current virtual reality
applications that incorporate the Wii Remote are either desktop
virtual reality applications or systems that use large screen displays.
However, the requirements for an HMD virtual reality system differ
from such systems. This is mainly because in HMD virtual reality,
the display screen does not remain at a fixed location. The user views
the virtual environment through display screens that are in front of
the user's eyes, and when the user moves his/her head, these screens
move as well. This means that the display has to be updated in real time
based on where the user is currently looking. Normal usage of
the Wii Remote requires the controller to be pointed in a certain
direction, typically towards the display. This is too restrictive for
HMD virtual reality systems, which ideally require the user to be able to
turn around in the virtual environment. Previous work proposed a
design to achieve this; however, it suffered from a number of
drawbacks. The aim of this study is to identify a suitable method of
using the Wii Remote for 3D interaction in a space around the user
for HMD virtual reality. This paper presents an overview of issues
that had to be considered, the system design as well as experimental
results.
Abstract: Calcium is a vital second messenger used in signal transduction. Calcium controls secretion, cell movement, muscular contraction, cell differentiation, ciliary beating, and so on. Two theories have been used to simplify the system of reaction-diffusion equations for calcium into a single equation. One is the excess buffer approximation (EBA), which assumes that mobile buffer is present in excess and cannot be saturated. The other is the rapid buffer approximation (RBA), which assumes that calcium binding to buffer is rapid compared to the calcium diffusion rate. In the present work, an attempt has been made to develop a model for calcium diffusion under the excess buffer approximation in neuron cells. This model incorporates the effect of [Na+] influx on [Ca2+] diffusion, variable calcium and sodium sources, the sodium-calcium exchange protein, the sarcolemmal calcium ATPase pump, and sodium and calcium channels. The proposed mathematical model leads to a system of partial differential equations which have been solved numerically using the Forward Time Centered Space (FTCS) approach. The numerical results have been used to study the relationships among different parameters such as buffer concentration, association rate, and calcium permeability.
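The FTCS scheme named above can be illustrated on a bare 1D diffusion equation du/dt = D d²u/dx². The reaction, buffer, exchanger, and pump terms of the full calcium model are omitted, and all parameter values are placeholders:

```python
# Forward Time Centered Space (FTCS) sketch for 1D diffusion.
def ftcs_step(u, D, dt, dx):
    """Advance one FTCS time step with fixed (Dirichlet) boundaries."""
    r = D * dt / dx ** 2
    assert r <= 0.5, "FTCS stability requires D*dt/dx^2 <= 1/2"
    return [u[0]] + [
        u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
        for i in range(1, len(u) - 1)
    ] + [u[-1]]

# Toy usage: an initial concentration spike spreads out over time.
u = [0.0] * 21
u[10] = 1.0
for _ in range(200):
    u = ftcs_step(u, D=1.0, dt=0.2, dx=1.0)
```

Note the explicit stability condition r = D·dt/dx² ≤ 1/2, which constrains the choice of time step in any FTCS discretization.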
Abstract: Rooted in the study of the social functioning of space in architecture, Space Syntax (SS) and the more recent Network Pattern (NP) research demonstrate the 'spatial structures' of the city, i.e. the hierarchical patterns of streets, junctions, and alley ends. Applying SS and NP models, planners can conceptualize the real city's patterns. Although both models yield the optimal paths of the city, their underlying displays of the city's spatial configuration differ. The Axial Map analyzes the topological, non-distance-based connectivity structure, whereas the Central-Node Map and the Shortcut-Path Map, in contrast, analyze the metrical, distance-based structures. This research contrasts and combines them to understand the various forms of the city's structures. It concludes that, while they reveal different spatial structures, the Space Syntax and Network Pattern urban models support each other. Combined, they simulate the global access structure and the locally compact structures, namely the central nodes and the shortcuts, for the city.
Abstract: The introduction of haptic elements into graphic user interfaces is becoming more widespread. Since haptics are being introduced rapidly into computational tools, investigating how these models affect Human-Computer Interaction helps define how to integrate and model new modes of interaction. The interest of this paper is to discuss and investigate the issues surrounding Haptic and Graphic User Interface (GUI) designs as separate systems, as well as to understand how they work in tandem. The development of these systems is explored from a psychological perspective, based on how usability is addressed through learning and affordances, as defined by J.J. Gibson. Haptic design can be a powerful tool, aiding intuitive learning. The problem discussed within the text is how haptic interfaces can be integrated within a GUI without a sense of frivolity. Juxtaposing haptics and graphic user interfaces raises issues of motivation; GUIs tend to involve a performatory process, while haptic interfaces use affordances to learn tool use. In a deeper view, it is noted that two modes of perception, foveal and ambient, dictate perception. These two modes were once thought to work in tandem; however, it has been discovered that these processes work independently from each other. Foveal modes interpret orientation in space, which provides for posture, locomotion, and motor skills with variations of the sensory information, which in turn instructs perceptions of object-task performance. It is contended here that object-task performance is a key element in the use of haptic interfaces because exploratory learning uses affordances in order to use an object, without mediating the experience cognitively. It is a direct experience that, through iteration, can lead to skill sets. It is also indicated that object-task performance will not work as efficiently without the use of exploratory or kinesthetic learning practices. Therefore, object-task performance is not explored as thoroughly in GUIs as it is practiced in haptic interfaces.
Abstract: This paper describes a new approach of classification
using genetic programming. The proposed technique consists of
genetically coevolving a population of non-linear transformations on
the input data to be classified, mapping them to a new space with a
reduced dimension in order to achieve maximum inter-class
discrimination. The classification of new samples is then performed
on the transformed data, and thus becomes much easier. Contrary to the
existing GP-classification techniques, the proposed one uses a
dynamic partition of the transformed data into separate intervals; the
efficacy of a given interval partition is evaluated by the fitness
criterion, aiming for maximum class discrimination. Experiments were
first performed using Fisher's Iris dataset, and then the KDD-99
Cup dataset was used to study the intrusion detection and
classification problem. The obtained results demonstrate that the
proposed genetic approach outperforms the existing GP-classification
methods [1], [2], and [3], and gives very acceptable results compared to
other existing techniques proposed in [4], [5], [6], [7], and [8].
Abstract: Facing the population's concern for its environment and for climatic change, city planners now consider the urban climate in their planning choices. Urban climate data representing different urban morphologies across the central Bangkok Metropolitan Area (BMA) are used to investigate the effects of both the composition and the configuration variables of urban morphology indicators on the summer diurnal range of the urban climate, using correlation analyses and multiple linear regressions. Results first indicate that approximately 92.6% of the variation in the average maximum daytime near-surface air temperature (Ta) was explained jointly by two composition variables of urban morphology indicators: the open space ratio (OSR) and the floor area ratio (FAR). It has been possible to determine the membership of sample areas in the local climate zones (LCZs) using these urban morphology descriptors, automatically computed with GIS and remotely sensed data. Finally, the results show that the temperature differences among widely separated zones, such as the city center relative to the outskirts of Bangkok, range on average from 35.48±1.04 °C (mean±S.D.) for the maximum daytime near-surface temperature to 28.27±0.21 °C for extreme events, and can exceed 8 °C. A spatially disaggregated map of urban thermal responsiveness would be helpful for several reasons. First, it would localize urban areas exhibiting different climate behavior over summer daytime and be a good indicator of urban climate variability. Second, when overlaid with a land cover map, this map may help identify possible urban management strategies to reduce heat wave effects in the BMA.
Abstract: In this paper, we propose a solution to the motion
control problem of a 2-link revolute manipulator arm. We require the
end-effector of the arm to move safely to its designated target in an
a priori known workspace cluttered with fixed circular obstacles of
arbitrary positions and sizes. First, a unique velocity algorithm is
used to move the end-effector to its target. Second, for obstacle
avoidance, a turning angle is designed which, when incorporated into
the control laws, ensures that the entire robot arm avoids any number
of fixed obstacles along its path en route to the target. The control laws
proposed in this paper also ensure that the equilibrium point of the
system is asymptotically stable. Computer simulations of the
proposed technique are presented.
Abstract: Active research is underway on virtual touch screens
that complement the physical limitations of conventional touch
screens. This paper discusses a virtual touch screen that uses a
multi-layer perceptron to recognize and control three-dimensional
(3D) depth information from a time of flight (TOF) camera. This
system extracts an object's area from the image input and compares it
with the trajectory of the object, which is learned in advance, to
recognize gestures. The system enables the maneuvering of content in
virtual space by utilizing human actions.