Abstract: This study presents an improvement in the thermal
performance of a heat pipe using a copper nanofluid with an aqueous
solution of n-Butanol. Nanoparticles suspended in conventional fluids
offer superior heat transfer capability compared with the base fluids
owing to their improved thermal conductivity. In this work, a copper
nanofluid with 40 nm particles at a concentration of 100 mg/L is
prepared in de-ionized (DI) water and in an aqueous solution of
n-Butanol, and these fluids are used as the working medium in the
heat pipe. The study discusses the effects of heat pipe inclination,
type of working fluid and heat input on the thermal efficiency and
thermal resistance. The experimental results are evaluated in terms
of these performance metrics and compared with those for DI water.
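The performance metrics above can be illustrated with the definitions commonly used in heat pipe studies (a sketch; the study's exact definitions may differ): thermal resistance as the evaporator-condenser temperature difference divided by heat input, and thermal efficiency as the fraction of supplied heat transported. The numeric readings below are hypothetical, for illustration only.

```python
# Commonly used heat-pipe performance metrics (assumed definitions,
# not necessarily those of the study above).

def thermal_resistance(t_evap_c, t_cond_c, heat_input_w):
    """Thermal resistance in K/W from evaporator/condenser temperatures."""
    return (t_evap_c - t_cond_c) / heat_input_w

def thermal_efficiency(heat_out_w, heat_in_w):
    """Fraction of the supplied heat transported to the condenser."""
    return heat_out_w / heat_in_w

# Hypothetical readings for illustration only (not from the study):
r = thermal_resistance(t_evap_c=62.0, t_cond_c=42.0, heat_input_w=50.0)
eta = thermal_efficiency(heat_out_w=43.5, heat_in_w=50.0)
print(r, eta)  # 0.4 K/W and 0.87
```

A lower thermal resistance at a given heat input indicates better heat transport, which is how the nanofluid and DI water cases are typically compared.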
Abstract: In this work, we propose a hybrid heuristic in order to
solve the Team Orienteering Problem (TOP). Given a set of points (or
customers), each with an associated score (profit or benefit), and a
team with a fixed number of members, the problem is to visit a subset
of points so as to maximize the total collected score. Each member
performs a tour starting at the start point, visiting distinct
customers, and terminating at the arrival point. In addition, each
point is visited at most once, and the total time of each tour cannot
exceed a given value. The proposed heuristic combines
beam search and a local optimization strategy. The algorithm was
tested on several sets of instances and encouraging results were
obtained.
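The beam-search component can be sketched for a single-member tour (the paper's hybrid adds a local optimization phase and handles a full team; the function and instance below are illustrative assumptions, not the authors' implementation). Each partial route is extended only if the arrival point remains reachable within the time budget, and only the best few partial routes survive each round.

```python
# Minimal beam-search sketch for a one-member orienteering tour.
# State: (route so far, current point, elapsed time, collected score).
from math import dist

def beam_search_top(points, scores, start, end, t_max, beam_width=5):
    """points: dict name -> (x, y); scores: dict name -> score.
    Returns (best_score, best_route) from start to end within t_max."""
    beam = [((start,), start, 0.0, 0)]
    best = (0, (start, end))          # the empty tour is always feasible
    while beam:
        nxt = []
        for route, cur, t, s in beam:
            for p in points:
                if p in route or p == end:
                    continue
                t2 = t + dist(points[cur], points[p])
                # Keep only extensions that can still reach the end point.
                if t2 + dist(points[p], points[end]) <= t_max:
                    nxt.append((route + (p,), p, t2, s + scores[p]))
        for route, cur, t, s in nxt:
            if s > best[0]:
                best = (s, route + (end,))
        # Prune to the best `beam_width` partial routes by score.
        nxt.sort(key=lambda st: -st[3])
        beam = nxt[:beam_width]
    return best

# Tiny illustrative instance: points on a line, time budget 6.
pts = {'s': (0, 0), 'a': (1, 0), 'b': (2, 0), 'e': (4, 0)}
sc = {'s': 0, 'a': 5, 'b': 8, 'e': 0}
best_score, route = beam_search_top(pts, sc, 's', 'e', t_max=6.0)
print(best_score, route)  # collects both a and b for a score of 13
```

The beam width trades solution quality for speed; the local optimization step of the paper would then refine the surviving routes.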
Abstract: The 20th century brought much development to the practice of architecture worldwide, and technology has bridged inhabitation limits in many regions of the world with high levels of comfort and convenience, often at high cost to the environment. Throughout the globe, tropical countries are being urbanized at an unprecedented rate, and housing has become a major issue worldwide in light of increased demand and a lack of appropriate infrastructure and planning. Buildings and urban spaces designed in tropical cities have mainly adopted external concepts that in most cases do not fit the needs of inhabitants living in such a harsh climatic environment, and when they do, they do so at high financial, environmental and cultural cost. Traditional architectural practices can provide valuable understanding of how self-reliance and autonomy of construction can be reinforced in rural-urban tropical environments. From traditional housing knowledge, it is possible to derive lessons for the development of new construction materials that are affordable, environmentally friendly, culturally acceptable and accessible to all. Specifically in the urban context, such solutions are of utmost importance, given the need for a more democratic society where access to housing ranks high on the agenda for development. Traditional or rural constructions are also undergoing extensive changes, even though they have mostly adopted climate-responsive building practices relying on local resources (with minimum embodied energy) and energy (for comfort and quality of life). It is important to note that many of these buildings can actually be called zero-energy, and they hold potential answers for enabling the transition from high-energy, high-cost, low-comfort urban habitations to zero/low-energy habitations with a high quality of urban livelihood.
Increasing access to modern urban lifestyles also affects people's aspirations in terms of performance, comfort and convenience in their housing and the way it is produced and used. These aspirations are driving transitions from local-resource-dependent habitations to non-local-resource-based, high-energy, urban-style habitations. Such transitions leave habitations increasingly unsuited to local climatic conditions, with growing discomfort, ill health, increased CO2 emissions and local environmental disruption. This research studies one specific transition group in the context of 'water communities' in tropical-equatorial regions: the Ribeirinhos housing typology (Amazonas, Brazil). The paper presents the results of a qualitative sustainability assessment of the housing typologies under transition found at the Ribeirinhos communities.
Abstract: Many Wireless Sensor Network (WSN) applications necessitate secure multicast services for broadcasting delay-sensitive data such as video files and live telecasts at fixed time slots. This work provides a novel method to deal with the end-to-end delay and drop rate of packets. Opportunistic routing chooses a link based on the maximum packet delivery probability. Null key generation helps in authenticating packets to the receiver. A Markov decision process based adaptive scheduling algorithm determines the time slot for packet transmission. Both theoretical analysis and simulation results show that the proposed protocol ensures better performance in terms of packet delivery ratio, average end-to-end delay and normalized routing overhead.
Abstract: There has been growing interest in utilizing surfactants in remediation processes to separate hydrophobic volatile organic compounds (HVOCs) from aqueous solution. One attractive process is cloud point extraction (CPE), which utilizes nonionic surfactants as a separating agent. Since the surfactant cost is a key determinant of the economic viability of the process, it is important that the surfactants are recycled and reused. This work aims to study the performance of co-current vacuum stripping using a packed column for HVOC removal from contaminated surfactant solution. Six types of HVOCs are selected as contaminants. The studied surfactant is a branched secondary alcohol ethoxylate (AE), Tergitol TMN-6 (C14H30O2). The volatility and the solubility of HVOCs in the surfactant system are determined in terms of an apparent Henry's law constant and a solubilization constant, respectively. Moreover, the HVOC removal efficiency of the vacuum stripping column is assessed in terms of the percentage of HVOC removal and the overall liquid-phase volumetric mass transfer coefficient. The apparent Henry's law constants of benzene, toluene, and ethylbenzene were 7.00×10⁻⁵, 5.38×10⁻⁵ and 3.35×10⁻⁵, respectively. The solubilization constants of benzene, toluene, and ethylbenzene were 1.71, 2.68 and 7.54, respectively. The HVOC removal for all solutes was around 90 percent.
Abstract: In this paper, we propose a supervised method for
color image classification based on a multilevel sigmoidal neural
network (MSNN) model. In this method, images are classified into
five categories, i.e., “Car", “Building", “Mountain", “Farm" and
“Coast". This classification is performed without any segmentation
processes. To verify the learning capabilities of the proposed method,
we compare our MSNN model with the traditional Sigmoidal Neural
Network (SNN) model. Comparison results show that the MSNN model
performs better than the traditional SNN model in terms of training
run time and classification rate. Both color moments and a
multi-level wavelet decomposition technique are used
to extract features from images. The proposed method has been
tested on a variety of real and synthetic images.
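The color-moment features mentioned above can be sketched directly: the standard formulation takes the per-channel mean, standard deviation and (sign-preserving cube root of the) third central moment, giving nine values for an RGB image. The wavelet features and the MSNN itself are not reproduced here, and this helper is an illustrative assumption rather than the authors' code.

```python
# Sketch of color-moment feature extraction: per-channel mean,
# standard deviation and skewness (9 features for an RGB image).

def color_moments(pixels):
    """pixels: list of (r, g, b) tuples; returns 9 moment features."""
    feats = []
    n = len(pixels)
    for ch in range(3):
        vals = [p[ch] for p in pixels]
        mean = sum(vals) / n
        var = sum((v - mean) ** 2 for v in vals) / n
        std = var ** 0.5
        # Sign-preserving cube root of the third central moment.
        m3 = sum((v - mean) ** 3 for v in vals) / n
        skew = abs(m3) ** (1 / 3) * (1 if m3 >= 0 else -1)
        feats.extend([mean, std, skew])
    return feats

# A uniform gray patch has zero spread and zero skew in every channel.
print(color_moments([(100, 100, 100)] * 4))
```

These nine numbers, concatenated with wavelet-subband statistics, would form the input vector fed to the neural network classifier.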
Abstract: Fault detection and diagnosis of complicated
production processes is one of the essential tasks for running a
process safely and with good final product quality. Unexpected events
occurring in the process may have a serious impact on it. In this
work, a triangular representation of on-line process measurement data
is evaluated using a simulated process. The effect of using
linear and nonlinear reduced spaces is also tested. Their diagnosis
performance was demonstrated using multivariate fault data. It is
shown that the nonlinear-technique-based diagnosis method produces
more reliable results and outperforms the linear method. The use of
appropriate reduced space yielded better diagnosis performance. The
presented diagnosis framework is different from existing ones in that it
attempts to extract the fault pattern in the reduced space, not in the
original process variable space. The use of reduced model space helps
to mitigate the sensitivity of the fault pattern to noise.
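The linear reduced space contrasted with nonlinear techniques above is conventionally obtained by principal component analysis; the sketch below (an illustrative assumption, not the paper's method) projects noisy, correlated process measurements onto their leading principal direction, assuming NumPy is available.

```python
# A linear reduced space via PCA: project measurements onto the top
# principal components so fault patterns are sought in a low-dimensional
# space rather than in the original process-variable space.
import numpy as np

def pca_reduce(X, n_components):
    """Project rows of X onto the top `n_components` principal components."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)          # sample covariance matrix
    vals, vecs = np.linalg.eigh(cov)        # ascending eigenvalues
    order = np.argsort(vals)[::-1][:n_components]
    return Xc @ vecs[:, order]

rng = np.random.default_rng(0)
# Correlated 3-D data that is essentially 1-D plus measurement noise.
t = rng.normal(size=(200, 1))
X = np.hstack([t, 2 * t, -t]) + 0.01 * rng.normal(size=(200, 3))
Z = pca_reduce(X, 1)
print(Z.shape)  # (200, 1): one retained latent direction
```

A nonlinear analogue (e.g. kernel PCA) would replace the covariance eigen-decomposition with an eigen-decomposition of a kernel matrix, which is the kind of substitution the abstract's comparison refers to.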
Abstract: The aim of this paper is to investigate the process of modernization of the People's Republic of China. The theme is of scientific interest, first, because the Chinese model of development is recognized as successful and among the most dynamically developing, successes that are owed to the modernization carried out in the country. Economic modernization, as the basic driving force of a country's progress, is a priority direction of development in the Republic of Kazakhstan, so the example of China's successful modernization processes can be very useful in the design of Kazakhstan's national reforms.
Abstract: Image coding based on clustering provides immediate
access to targeted features of interest in a high quality decoded
image. This approach is useful for intelligent devices, as well as for
multimedia content-based description standards. The result of image
clustering cannot be precise in some positions especially on pixels
with edge information which produce ambiguity among the clusters.
Even with a good enhancement operator based on PDE, the quality of
the decoded image will highly depend on the clustering process. In
this paper, we introduce an ambiguity cluster in image coding to
represent pixels with vagueness properties. The presence of such a
cluster allows preserving some details inherent to edges as well as to
uncertain pixels. It is also very useful during the decoding phase,
in which an anisotropic diffusion operator, such as Perona-Malik,
enhances the quality of the restored image. This work also offers a
comparative study to demonstrate the effectiveness of a fuzzy
clustering technique in detecting the ambiguity cluster without losing
much of the essential image information. Several experiments have been
carried out to demonstrate the usefulness of the ambiguity concept in
image compression. The coding results and the performance of the
proposed algorithms are discussed in terms of the peak signal-to-noise
ratio and the quantity of ambiguous pixels.
Abstract: Wireless Sensor Networks (WSN) are emerging
because of the developments in wireless communication technology and miniaturization of the hardware. WSN consists of a large number of low-cost, low-power, multifunctional sensor nodes to monitor physical conditions, such as temperature, sound, vibration, pressure,
motion, etc. The MAC protocol used in a sensor network must be energy efficient, conserving energy throughout its operation. In this paper, with the focus of analyzing the applicability of
MAC protocols used in wireless ad hoc networks to WSNs, simulation
experiments were conducted in the Global Mobile Simulator
(GloMoSim) software. The number of packets sent by regular nodes and received by the sink node under different deployment strategies, the total energy
spent, and the network lifetime were chosen as the metrics for comparison. The simulation results show that the IEEE 802.11 protocol performs better than the CSMA and MACA protocols.
Abstract: Throughout this paper, a relatively new technique, the Tabu search variable selection model, is elaborated, showing how it can be efficiently applied within the financial world whenever researchers must select a subset of variables from a whole set of descriptive variables under analysis. In the field of financial prediction, researchers often have to select a subset of variables from a larger set to solve different types of problems such as corporate bankruptcy prediction, personal bankruptcy prediction, mortgage and credit scoring, and the Arbitrage Pricing Model (APM). Consequently, to demonstrate how the method operates and to illustrate its usefulness as well as its superiority compared to other commonly used methods, the Tabu search algorithm for variable selection is compared to two main alternative search procedures, namely stepwise regression and the maximum R² improvement method. The Tabu search is then implemented in finance, where it attempts to predict corporate bankruptcy by selecting the most appropriate financial ratios and thus creating its own prediction score equation. In comparison to other methods, most notably the Altman Z-Score model, the Tabu search model produces a higher success rate in correctly predicting the failure of firms or the continued operation of existing entities.
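The core tabu-search loop for subset selection can be sketched as follows. The objective here is a deliberately toy stand-in for a model-fit criterion such as R² from a regression on the chosen financial ratios; the function names and the aspiration rule (a tabu move is allowed if it beats the best score seen) are illustrative assumptions.

```python
# Minimal tabu search for variable (subset) selection: the neighborhood
# flips one variable in or out; recently flipped variables are tabu.
import random

def tabu_select(n_vars, objective, iters=100, tabu_len=5, seed=0):
    rng = random.Random(seed)
    current = frozenset()              # start with no variables selected
    best, best_val = current, objective(current)
    tabu = []
    for _ in range(iters):
        # Neighborhood: add or drop exactly one variable.
        moves = [(v, current ^ {v}) for v in range(n_vars)]
        rng.shuffle(moves)
        # Tabu moves are allowed only if they beat the best score so far.
        candidates = [(objective(s), v, s) for v, s in moves
                      if v not in tabu or objective(s) > best_val]
        if not candidates:
            break
        val, v, current = max(candidates)
        tabu.append(v)
        tabu = tabu[-tabu_len:]        # fixed-length tabu list
        if val > best_val:
            best, best_val = current, val
    return set(best), best_val

# Toy objective: the "ideal" subset is {0, 2}; score penalizes deviation.
sel, val = tabu_select(5, lambda s: -len(s ^ {0, 2}))
print(sel, val)
```

In the paper's setting, `objective` would fit a bankruptcy-prediction model on the selected ratios and return its fit statistic, with the tabu list preventing the search from cycling the way pure hill-climbing (or stepwise regression) can.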
Abstract: We propose an improved version of elastic graph matching (EGM) as a face detector, called multi-scale EGM (MS-EGM). In this improvement, a Gabor wavelet-based pyramid reduces the computational complexity of the feature representation used in conventional EGM while preserving a critical amount of image information. The MS-EGM achieves higher detection performance than the Viola-Jones object detection algorithm with its AdaBoost cascade of Haar-like features. We also show that the detection speed of the MS-EGM is comparable to that of the Viola-Jones method. We find the MS-EGM beneficial in terms of topological feature representation of a face.
Abstract: This paper examines the predictability of stock returns in
developed and emerging markets by testing for long memory in stock
returns using a wavelet approach. The wavelet-based maximum likelihood
estimator of the fractional integration parameter is superior to the
conventional Hurst exponent and the Geweke and Porter-Hudak
estimator in terms of asymptotic properties and mean squared error.
We use 4-year moving windows to estimate the fractional integration
parameter. Evidence suggests that stock returns may not be predictable
in developed countries of the Asia-Pacific region. However,
predictability of stock returns in some developing countries of the
region, such as Indonesia, Malaysia and the Philippines, cannot be
ruled out. Stock returns in the Thai stock market appear not to be
predictable after the political crisis in 2008.
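The wavelet MLE used in the paper is too involved for a short sketch, but the classical Hurst exponent it is benchmarked against can be illustrated with the aggregated-variance method: for uncorrelated returns the variance of block means falls like 1/m, and long memory shows up as a slower decay, Var ∝ m^(2H−2). Everything below is an illustrative baseline, not the paper's estimator.

```python
# Aggregated-variance estimate of the Hurst exponent H: the variance of
# block means of size m scales like m^(2H - 2); fit the log-log slope.
import math
import random

def hurst_aggvar(x, block_sizes=(4, 8, 16, 32)):
    logs_m, logs_v = [], []
    for m in block_sizes:
        means = [sum(x[i:i + m]) / m for i in range(0, len(x) - m + 1, m)]
        mu = sum(means) / len(means)
        var = sum((v - mu) ** 2 for v in means) / len(means)
        logs_m.append(math.log(m))
        logs_v.append(math.log(var))
    # Least-squares slope of log Var against log m; then H = 1 + slope/2.
    n = len(logs_m)
    mx, my = sum(logs_m) / n, sum(logs_v) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(logs_m, logs_v))
             / sum((a - mx) ** 2 for a in logs_m))
    return 1 + slope / 2

rng = random.Random(42)
white = [rng.gauss(0, 1) for _ in range(20000)]
print(hurst_aggvar(white))  # close to 0.5 for uncorrelated "returns"
```

An estimate near H = 0.5 is consistent with unpredictable returns; H significantly above 0.5 inside a moving window is the kind of long-memory signal the paper's wavelet estimator is designed to detect with better asymptotic properties.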
Abstract: The Goursat partial differential equation arises in
linear and nonlinear partial differential equations with mixed
derivatives. It is a second-order hyperbolic partial differential
equation that occurs in various fields of study such as engineering,
physics, and applied mathematics. Many approaches have been suggested
to approximate the solution of the Goursat partial differential
equation. However, the suggested methods have traditionally focused
on numerical differentiation approaches, including forward and
central differences, in deriving the scheme. An innovation has been
made by deriving a Goursat scheme that involves numerical integration
techniques. In this paper we develop a new scheme to solve the
Goursat partial differential equation based on the Adomian
decomposition method (ADM) associated with Boole's integration rule
to approximate the integration terms. The new scheme can easily be
applied to many linear and nonlinear Goursat partial differential
equations and is capable of reducing the amount of computational
work. The accuracy of the results reveals the advantage of this new
scheme over existing numerical methods.
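The ADM-plus-Boole's-rule scheme itself is not reproduced here, but the classical finite-difference baseline it improves on can be sketched for a standard test case: the linear Goursat problem u_xy = u with boundary data u(x,0) = e^x, u(0,y) = e^y, whose exact solution is u(x,y) = e^(x+y). The cell-corner quadrature below is a simple first-order choice, an illustrative assumption rather than any scheme from the paper.

```python
# Classical finite-difference marching for the Goursat problem
# u_xy = f(u) with characteristic boundary data on x = 0 and y = 0:
# integrate u_xy over each grid cell, approximating f at the cell's
# lower-left corner (first-order accurate).
import math

def goursat_fd(f, bx, by, x_max, y_max, h):
    nx, ny = round(x_max / h), round(y_max / h)
    u = [[0.0] * (ny + 1) for _ in range(nx + 1)]
    for i in range(nx + 1):
        u[i][0] = bx(i * h)            # data on the x-axis
    for j in range(ny + 1):
        u[0][j] = by(j * h)            # data on the y-axis
    for i in range(1, nx + 1):
        for j in range(1, ny + 1):
            u[i][j] = (u[i - 1][j] + u[i][j - 1] - u[i - 1][j - 1]
                       + h * h * f(u[i - 1][j - 1]))
    return u[nx][ny]

# Test problem u_xy = u, exact solution u = e^(x+y); check at (0.5, 0.5).
approx = goursat_fd(lambda u: u, math.exp, math.exp, 0.5, 0.5, 0.01)
print(approx, math.exp(1.0))  # the two values agree to a few 1e-3
```

Replacing the corner quadrature with a higher-order rule such as Boole's, and the pointwise evaluation of u with an Adomian series, is the direction the abstract's new scheme takes.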
Abstract: In this paper, a pipelined version of genetic algorithm,
called PLGA, and a corresponding hardware platform are described.
The basic operations of conventional GA (CGA) are made pipelined
using an appropriate selection scheme. The selection operator used
here is stochastic in nature and is called SA-selection. This helps
maintain the basic generational nature of the proposed pipelined
GA (PLGA). A number of benchmark problems are used to compare
the performances of conventional roulette-wheel selection and the
SA-selection. These include unimodal and multimodal functions with
dimensionality varying from very small to very large. The
SA-selection scheme gives performance comparable to the classical
roulette-wheel selection scheme on all instances when solution
quality and rate of convergence are considered.
The speedups obtained by PLGA for different benchmarks
are found to be significant. It is shown that a complete hardware
pipeline can be developed using the proposed scheme, if parallel
evaluation of the fitness expression is possible. In this connection
a low-cost but very fast hardware evaluation unit is described.
Results of simulation experiments show that in a pipelined hardware
environment, PLGA will be much faster than CGA. In terms of
efficiency, PLGA is also found to outperform parallel GA (PGA).
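The conventional roulette-wheel selection that PLGA's SA-selection is benchmarked against can be sketched in a few lines: each individual is drawn with probability proportional to its fitness. This is a generic textbook sketch, not the paper's hardware implementation.

```python
# Roulette-wheel (fitness-proportionate) selection: spin a wheel whose
# sectors are sized by fitness and return the index it lands on.
import random

def roulette_wheel(fitness, rng):
    total = sum(fitness)
    r = rng.uniform(0, total)
    acc = 0.0
    for i, f in enumerate(fitness):
        acc += f
        if r <= acc:
            return i
    return len(fitness) - 1   # guard against floating-point round-off

rng = random.Random(1)
fit = [1.0, 3.0, 6.0]
counts = [0, 0, 0]
for _ in range(10000):
    counts[roulette_wheel(fit, rng)] += 1
print(counts)  # selection frequencies roughly proportional to 1 : 3 : 6
```

Because each spin needs the population-wide fitness total, this operator serializes poorly, which is one motivation for a stochastic alternative like SA-selection in a pipelined design.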
Abstract: The characterization of κ-carrageenan could provide a
better understanding of its functions in biological, medical and
industrial applications. Chemical and physical analyses of
carrageenan from the seaweed Eucheuma cottonii L. were done to offer
information on its properties and the effects of Co-60 γ-irradiation on
its thermochemical characteristics. The structural and morphological
characteristics of κ-carrageenan were determined using scanning
electron microscopy (SEM) while the composition, molecular weight
and thermal properties were determined using attenuated total
reflectance Fourier transform infrared spectroscopy (ATR-FTIR), gel
permeation chromatography (GPC), thermal gravimetric analysis
(TGA) and differential scanning calorimetry (DSC). Further chemical
analysis was done using hydrogen-1 nuclear magnetic resonance (1H
NMR), and functional characteristics in terms of biocompatibility
were evaluated using a cytotoxicity test.
Abstract: This paper presents a genetic algorithm based
approach for solving security constrained optimal power flow
problem (SCOPF) including FACTS devices. The optimal locations of
FACTS devices are identified using an index called the overload
index, and the optimal values are obtained using an enhanced genetic
algorithm. The optimal allocation by the proposed method optimizes
the investment, taking into account its effects on security in terms of
the alleviation of line overloads. The proposed approach has been
tested on the IEEE 30-bus system to show the effectiveness of the
proposed algorithm for solving the SCOPF problem.
Abstract: Most file systems overwrite modified file data and
metadata in their original locations, while the Log-structured File
System (LFS) dynamically relocates them to other locations. We
design and implement the Evergreen file system, which can select
between overwriting and relocation for each block of a file or its metadata.
Therefore, the Evergreen file system can achieve superior write
performance by sequentializing write requests (similar to LFS-style
relocation) when space utilization is low and overwriting when
utilization is high. Another challenging issue is identifying
performance benefits of LFS-style relocation over overwriting on a
newly introduced SSD (Solid State Drive) which has only
Flash-memory chips and control circuits without mechanical parts.
Our experimental results measured on an SSD show that relocation
outperforms overwriting when space utilization is below 80%, and vice
versa.
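The per-block decision the abstract describes amounts to a utilization-driven policy: relocate (append log-style) while space utilization is below a threshold, overwrite in place once it rises above. The class and names below are an illustrative sketch, with the 80% crossover taken from the measured result; the real file system's accounting is of course more involved.

```python
# Sketch of a utilization-driven block placement policy: LFS-style
# relocation at low utilization, in-place overwrite at high utilization.

class PlacementPolicy:
    def __init__(self, capacity_blocks, threshold=0.8):
        self.capacity = capacity_blocks
        self.used = 0
        self.threshold = threshold    # 80% crossover from the experiments

    def utilization(self):
        return self.used / self.capacity

    def choose(self):
        """Return 'relocate' or 'overwrite' for the next dirty block."""
        if self.utilization() < self.threshold:
            return 'relocate'         # sequentialize writes, log-style
        return 'overwrite'            # avoid cleaning cost when nearly full

p = PlacementPolicy(capacity_blocks=1000)
p.used = 500
print(p.choose())   # relocate  (utilization 50%)
p.used = 900
print(p.choose())   # overwrite (utilization 90%)
```

The threshold embodies the trade-off in the abstract: relocation keeps writes sequential while free space is plentiful, but overwriting wins once cleaning relocated blocks would dominate.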
Abstract: Public parks are placed high on the research agenda, with many studies addressing their social, economic and environmental influences in different countries around the world. They have been recognized as contributors to the physical quality of urban environments. Recently, a broader view of public parks has emerged. This view goes well beyond the traditional value of parks as places for recreation and visual delight, to depict them as valuable contributors to broader strategic objectives, such as property values, place attractiveness, job opportunities, social belonging, public health, tourist development, and improving the overall quality of life. This research examines the role of public parks in enhancing the quality of human life in the Egyptian environment. It measures 'quality of life' in terms of 'human needs' and 'well-being'. This should open ways for policymakers, practitioners, researchers and the public to realize the potential of public parks for improving the quality of life.
Abstract: In the context of channel coding, Generalized Belief Propagation (GBP) is an iterative algorithm used to recover the bits sent through a noisy channel. To ensure reliable transmission, we apply a mapping to the bits, called a code. This code induces artificial correlations between the bits to send, and it can be modeled by a graph whose nodes are the bits and whose edges are the correlations. This graph, called the Tanner graph, is used by most decoding algorithms such as Belief Propagation or Gallager-B. GBP is based on a non-unique transformation of the Tanner graph into a so-called region graph. A clear advantage of GBP over the other algorithms is the freedom in the construction of this graph. In this article, we describe a particular construction for specific graph topologies that yields good GBP performance. Moreover, we investigate the behavior of GBP considered as a dynamical system, in order to understand how it evolves over time and as a function of the noise power of the channel. To this end we make use of classical measures and we introduce a new measure, called the hyperspheres method, which makes it possible to estimate the size of the attractors.
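GBP itself is beyond a short sketch, but hard-decision decoding on a Tanner graph, in the spirit of the Gallager-B algorithm the abstract mentions, can be illustrated with a simple bit-flipping decoder for the (7,4) Hamming code. The code and decoder below are a standard textbook construction, not anything from the article.

```python
# Bit-flipping decoding on a Tanner graph: rows of H are parity checks
# (check nodes), columns are codeword bits (variable nodes). Repeatedly
# flip the bit involved in the most unsatisfied checks.

H = [[1, 0, 1, 0, 1, 0, 1],   # parity-check matrix of the
     [0, 1, 1, 0, 0, 1, 1],   # (7,4) Hamming code
     [0, 0, 0, 1, 1, 1, 1]]

def bit_flip_decode(word, H, max_iters=10):
    word = list(word)
    for _ in range(max_iters):
        syndrome = [sum(h[j] * word[j] for j in range(len(word))) % 2
                    for h in H]
        if not any(syndrome):
            return word               # all parity checks satisfied
        # Count unsatisfied checks each bit participates in; flip the worst.
        unsat = [sum(s for h, s in zip(H, syndrome) if h[j])
                 for j in range(len(word))]
        word[unsat.index(max(unsat))] ^= 1
    return word

received = [0, 0, 0, 0, 0, 0, 1]     # all-zero codeword with one bit error
print(bit_flip_decode(received, H))  # recovers [0, 0, 0, 0, 0, 0, 0]
```

Belief Propagation replaces these hard flips with probabilistic messages along the same graph edges, and GBP generalizes further by passing messages between regions (groups of nodes) of the region graph, which is where the construction freedom discussed in the article arises.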