Abstract: Nanoemulsions are a class of emulsions with a droplet
size in the range of 50–500 nm that have attracted a great deal of
attention in recent years because of their unique characteristics. The
physicochemical properties of nanoemulsions suggest that they can be
successfully used to recover the residual oil which is trapped in the
fine pores of reservoir rock by capillary forces after primary and
secondary recovery. Oil-in-water nanoemulsions, which can be formed
by high-energy emulsification techniques using specific surfactants,
can reduce the oil–water interfacial tension (IFT) by 3–4 orders of
magnitude. The present work aims at characterizing oil-in-water
nanoemulsions in terms of their phase behavior, morphology,
interfacial energy, and ability to reduce the interfacial tension, and at
understanding the mechanisms of mobilization and displacement of
entrapped oil blobs through lowered interfacial tension at both the
macroscopic and microscopic levels. To investigate the
efficiency of oil-in-water nanoemulsions in enhanced oil recovery
(EOR), experiments were performed to characterize the emulsion in
terms of their physicochemical properties and the size distribution of
the dispersed oil droplets in the water phase. Synthetic mineral oil and a series
of surfactants were used to prepare oil-in-water emulsions.
Characterization shows that the emulsion exhibits pseudo-plastic
behaviour and that the drop size of the dispersed oil phase follows a
lognormal distribution. Flooding experiments were also carried out in a
sandpack system to evaluate the effectiveness of the nanoemulsion as
a displacing fluid for enhanced oil recovery. Substantial additional
recoveries (more than 25% of original oil in place) over conventional
water flooding were obtained in the present investigation.
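The lognormal drop-size result lends itself to a short illustration. The sketch below generates synthetic droplet diameters and fits a lognormal distribution by fitting a normal law to the log-diameters; the median of 200 nm and shape parameter 0.4 are invented values, not the paper's measurements.

```python
import numpy as np

# Synthetic droplet diameters (nm) drawn from a lognormal law; the median of
# 200 nm and shape 0.4 are invented for illustration, not the paper's data.
rng = np.random.default_rng(0)
diameters = rng.lognormal(mean=np.log(200.0), sigma=0.4, size=5000)

# Fitting a lognormal reduces to fitting a normal to the log-diameters.
mu, sigma = np.log(diameters).mean(), np.log(diameters).std()
median = np.exp(mu)                # recovered median droplet size (nm)

print(abs(median - 200.0) < 10.0)  # median recovered within a few percent
print(abs(sigma - 0.4) < 0.05)     # shape parameter recovered
```

The same two-parameter fit can be checked against the measured drop-size histogram to confirm the lognormal hypothesis.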
Abstract: We measured the major and trace element contents
and Rb-Sr isotopic compositions of 12 tektites from the Maoming
area, Guangdong Province (south China). All the samples studied are
splash-form tektites which show pitted or grooved surfaces with
schlieren structures on some surfaces. The trace element ratios Ba/Rb
(avg. 4.33), Th/Sm (avg. 2.31), Sm/Sc (avg. 0.44), Th/Sc (avg. 1.01),
La/Sc (avg. 2.86), Th/U (avg. 7.47), Zr/Hf (avg. 46.01) and the rare
earth elements (REE) contents of tektites of this study are similar to the
average upper continental crust. From the chemical composition, it is
suggested that the tektites in this study are derived from a similar
parental terrestrial sedimentary deposit, which may be related to post-Archean
upper crustal rocks. The tektites from the Maoming area have high
positive εSr(0) values, ranging from 176.9 to 190.5, which indicate that
the parental material for these tektites has Sr isotopic compositions
similar to those of old terrestrial sedimentary rocks and was not
dominantly derived from recent young sediments (such as soil or
loess). The Sr isotopic data obtained by the present study support the
conclusion proposed by Blum et al. (1992)[1] that the depositional age
of sedimentary target materials is close to 170 Ma (Jurassic). Mixing
calculations based on the model proposed by Ho and Chen (1996)[2]
for various amounts and combinations of target rocks indicate that the
best fit for tektites from the Maoming area is a mixture of 40% shale,
30% greywacke, and 30% quartzite.
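The mixing calculation can be illustrated as a mass-weighted average of endmember compositions. In the sketch below, the SiO2 and Sr values assigned to shale, greywacke and quartzite are placeholders, not the target-rock data of Ho and Chen (1996)[2]; only the 40/30/30 best-fit fractions come from the abstract.

```python
# Tektite composition modelled as a mass-weighted average of endmember
# target rocks. Endmember values are placeholders for illustration only.
endmembers = {
    "shale":     {"SiO2": 62.0, "Sr": 140.0},
    "greywacke": {"SiO2": 69.0, "Sr": 200.0},
    "quartzite": {"SiO2": 95.0, "Sr": 30.0},
}
fractions = {"shale": 0.40, "greywacke": 0.30, "quartzite": 0.30}  # best fit

mixture = {
    element: sum(fractions[rock] * comp[element]
                 for rock, comp in endmembers.items())
    for element in ("SiO2", "Sr")
}
print(mixture["SiO2"])  # 0.4*62 + 0.3*69 + 0.3*95 = 74.0
```

In an actual fit, the fractions are varied and the mixture is compared against the measured tektite composition until the misfit is minimised.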
Abstract: Traffic management in an urban area is greatly facilitated by knowledge of the traffic conditions in every street or highway involved in the vehicular mobility system. The aim of the paper is to propose a neuro-fuzzy approach able to compute the main parameters of a traffic system, i.e., car density, velocity and flow, using the images collected by web-cams located at the crossroads of the traffic network. The performance of this approach encourages its application when the traffic system is far from saturation. A fuzzy model is also outlined to evaluate when it is appropriate to use more accurate, even if more time-consuming, algorithms for measuring traffic conditions near saturation.
Abstract: Automated discovery of hierarchical structures in
large data sets has been an active research area in the recent past.
This paper focuses on the issue of mining generalized rules with a crisp
hierarchical structure using a Genetic Programming (GP) approach to
knowledge discovery. The post-processing scheme presented in this
work uses flat rules as initial individuals of GP and discovers
hierarchical structure. Suitable genetic operators are proposed for the
suggested encoding. Based on the Subsumption Matrix (SM), an
appropriate fitness function is suggested. Finally, Hierarchical
Production Rules (HPRs) are generated from the discovered
hierarchy. Experimental results are presented to demonstrate the
performance of the proposed algorithm.
Abstract: This article analyzes the results of sociological
research carried out by the authors to study the opinions of
representatives of small, medium and big business on the
formation of the Customs Union and the Common Free Market Zone with
the participation of Kazakhstan, Russia and Belarus.
It is forecast that companies and their branches will interpenetrate,
registering and moving their businesses to regions with more
favourable conditions. Respondents say that Kazakhstan offers a more
profitable geo-strategic operating environment for business and lower
taxes. Russia, using this opportunity, will create new conditions for
expansion into other countries of Central Asia and into China. The opinions
of the questionnaire participants and of the expert poll differ in their
estimation of the value of these two integration mechanisms, since market
segments expand on the one hand, but on the other hand there is a loss
of exclusive influence in certain fields of activity.
Abstract: Digital news covering a variety of topics is abundant on the
internet. The problem is to classify news into its appropriate
category so that users can find relevant news rapidly. A classifier
engine is used to assign each news item automatically to its respective
category. This research employs a Support Vector Machine (SVM) to
classify Indonesian news. SVM is a robust method for binary
classification. The core of SVM is the construction of an
optimal separating hyperplane between the different classes. For
the multiclass problem, a mechanism called one-against-one is used to
combine the binary classification results. Documents were taken
from the Indonesian digital news site, www.kompas.com. The
experiment showed promising results, with an accuracy rate of 85%.
The system is feasible for implementation in Indonesian news
classification.
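The one-against-one combination step can be sketched independently of SVM training: one binary classifier is trained per pair of categories, each votes for a class, and the class with the most votes wins. The pairwise decision function below is a toy stand-in for trained binary SVMs, and the category names are illustrative.

```python
from collections import Counter
from itertools import combinations

def one_vs_one_predict(classes, pairwise_winner):
    """Combine binary decisions by majority voting: one classifier per
    class pair, each votes for its predicted class, top class wins."""
    votes = Counter()
    for a, b in combinations(classes, 2):
        votes[pairwise_winner(a, b)] += 1
    return votes.most_common(1)[0][0]

# Toy stand-in for trained pairwise SVMs on a news item about football:
# every classifier involving "sport" picks "sport"; the rest pick
# alphabetically, just to make the votes deterministic.
def pairwise_winner(a, b):
    return "sport" if "sport" in (a, b) else min(a, b)

categories = ["politics", "sport", "economy", "technology"]
print(one_vs_one_predict(categories, pairwise_winner))  # "sport"
```

For k categories this scheme trains k(k-1)/2 binary classifiers, each on only two classes' documents, which keeps the individual training problems small.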
Abstract: The goals of the present research are to assess Six Sigma implementation in Latvian commercial banks and to identify the perceived benefits of its implementation. To achieve these goals, the authors used a sequential explanatory method. To obtain empirical data, the authors developed a questionnaire and adapted it for the employees of Latvian commercial banks. The questions are related to Six Sigma implementation and its perceived benefits. The questionnaire mainly consists of closed questions, evaluated on a 5-point Likert scale. The obtained empirical data have shown that, of the two hypotheses put forward in the present research, Hypothesis 1 has to be rejected, while Hypothesis 2 has been partially confirmed. The authors also faced some research limitations related to the fact that the participants in the questionnaire belong to different ranks of the organizational hierarchy.
Abstract: The preparation of good-quality Environmental Impact Assessment (EIA) reports contributes to enhancing the overall effectiveness of EIA. This component of the EIA process becomes even more important in situations where public participation is weak and there is a lack of expertise on the part of the competent authority. In Pakistan, EIA became mandatory in July 1994 for every project likely to cause adverse environmental impacts. The competent authority also formulated guidelines for the preparation and review of EIA reports in 1997. However, EIA has yet to prove itself a successful decision-support tool for environmental protection. One of the several reasons for this ineffectiveness is the generally poor quality of EIA reports. This paper critically reviews the EIA reports of some randomly selected projects. Interviews with EIA consultants, project proponents and concerned government officials have also been conducted to pinpoint the root causes of the poor quality of EIA reports. The analysis reveals several inadequacies, particularly in areas relating to the identification, evaluation and mitigation of key impacts and the consideration of alternatives. The paper identifies some opportunities and suggests measures for improving the quality of EIA reports and hence making EIA an effective tool for environmental protection.
Abstract: According to Hermite, there exists only a finite
number of number fields having a given degree and a given value of
the discriminant; nevertheless, this number is not known in general.
The purpose of this work is to determine a maximum number of
number fields of degree 10 having a given discriminant that contain a
subfield of degree 5 with a fixed class number, narrow class number
and Galois group. We construct lists of the first
coincidences of 52 (resp. 50, 40, 48, 22, 6) nonisomorphic number
fields with the same discriminant, of degree 10 and signature (6,2) (resp.
(4,3), (8,1), (2,4), (0,5), (10,0)), each containing a quintic field. For each
field in the lists, we indicate its discriminant, the discriminant of its
subfield, and a relative polynomial generating the field over its quintic
subfield together with its relative discriminant; the corresponding polynomial over
Q and its Galois closure are presented with concluding remarks.
Abstract: Computerized lip reading has been one of the most
actively researched areas of computer vision in the recent past because
of its crime fighting potential and invariance to acoustic environment.
However, several factors like fast speech, bad pronunciation,
poor illumination, movement of face, moustaches and beards make
lip reading difficult. In the present work, we propose a solution for
automatic lip contour tracking and for recognizing letters of the
English language spoken by speakers using the information available
from lip movements. A level set method is used for tracking the lip contour
using a contour velocity model and a feature vector of lip movements
is then obtained. Character recognition is performed using a modified
k-nearest-neighbor algorithm which assigns more weight to nearer
neighbors. The proposed system has been found to have an accuracy
of 73.3% for character recognition with speaker lip movements as
the only input and without using any speech recognition system in
parallel. The approach used in this work is found to serve the purpose
of lip reading well when the size of the database is small.
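The weighting idea behind the modified k-nearest-neighbor classifier can be sketched as follows; inverse-distance weighting is shown as one natural choice, since the abstract does not specify the exact weighting function, and the toy feature vectors are invented.

```python
import numpy as np

def weighted_knn(train_X, train_y, query, k=3):
    """k-nearest-neighbor classification with inverse-distance weights,
    so nearer neighbors count more. A sketch of the modification
    described above; the paper's exact weighting is not specified."""
    d = np.linalg.norm(train_X - query, axis=1)
    nearest = np.argsort(d)[:k]
    scores = {}
    for i in nearest:
        w = 1.0 / (d[i] + 1e-9)          # inverse-distance weight
        scores[train_y[i]] = scores.get(train_y[i], 0.0) + w
    return max(scores, key=scores.get)

# Two "A" feature vectors sit farther away than one close "B":
# plain majority voting over k=3 would answer "A", weighting answers "B".
X = np.array([[0.0, 0.0], [2.0, 2.0], [2.1, 2.1]])
y = ["B", "A", "A"]
print(weighted_knn(X, y, np.array([0.2, 0.2])))  # "B"
```

This is precisely the situation where distance weighting matters: a single very close training sample can outvote several distant ones.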
Abstract: A new numerical scheme based on the H1-Galerkin mixed finite element method is constructed for a class of second-order pseudohyperbolic equations. The proposed procedure can be split into three independent differential sub-schemes and does not need to solve a coupled system of equations. Optimal error estimates are derived for both the semidiscrete and fully discrete schemes for problems in one space dimension. Moreover, the proposed method does not require the LBB consistency condition. Finally, some numerical results are provided to illustrate the efficacy of our method.
Abstract: In this paper the real money demand function is analyzed
within a multivariate time-series framework. A cointegration approach
(the Johansen procedure) is used, assuming interdependence between
the money demand determinants, which are nonstationary variables. This
will help us to understand the behavior of money demand in Croatia,
revealing the significant influence among the endogenous variables in a
vector autoregression (VAR) system, i.e. a vector error correction
model (VECM). Exogeneity of the explanatory variables is tested.
The long-run money demand function is estimated, indicating a slow
speed of adjustment in removing the disequilibrium. Empirical results
provide evidence that real industrial production and the exchange
rate explain most of the variation in money demand in the long run,
while the interest rate is significant only in the short run.
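The error-correction mechanism behind the reported slow adjustment can be illustrated with a minimal sketch. The paper uses the Johansen procedure; for brevity, the sketch below uses a two-step, Engle-Granger-style estimation on simulated data, so every number here is an illustrative assumption rather than Croatian data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: a nonstationary driver x_t (random walk) and a series y_t
# cointegrated with it, y_t = 0.8 * x_t + stationary noise.
n = 500
x = np.cumsum(rng.normal(size=n))
y = 0.8 * x + rng.normal(scale=0.5, size=n)

# Step 1 (long-run relation): OLS of y on x estimates the cointegrating vector.
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
ect = y - X @ beta                       # error-correction term (disequilibrium)

# Step 2 (short-run dynamics): regress the change in y on the lagged
# disequilibrium; a negative coefficient is the speed of adjustment.
dy = np.diff(y)
Z = np.column_stack([np.ones(n - 1), ect[:-1]])
alpha = np.linalg.lstsq(Z, dy, rcond=None)[0][1]

print(beta[1])        # close to the true long-run slope 0.8
print(alpha < 0)      # disequilibrium is corrected over time
```

The magnitude of the adjustment coefficient plays the role of the "speed of adjustment" the abstract refers to: the closer it is to zero, the more slowly disequilibrium is removed.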
Abstract: Despite the fact that the Arabic language is currently one
of the most common languages worldwide, there has been relatively
little research on Arabic speech recognition compared with other
languages such as English and Japanese. Generally, digital speech
processing and voice recognition algorithms are of special
importance for designing efficient, accurate, as well as fast automatic
speech recognition systems. The speech recognition process
carried out in this paper is divided into three stages: firstly,
the signal is preprocessed to reduce noise effects. After that, the
signal is digitized and hearingized, and the voice activity
regions are segmented using a voice activity detection (VAD)
algorithm. Secondly, features are extracted from the speech signal
using Mel-frequency cepstral coefficients (MFCC) algorithm.
Moreover, delta and acceleration (delta-delta) coefficients have been
added to improve the recognition accuracy. Finally,
each test word's features are compared with the training database using
the dynamic time warping (DTW) algorithm. Using the best setup of
all the relevant parameters for the aforementioned techniques,
the proposed system achieved a recognition rate of about 98.5%
which outperformed other HMM and ANN-based approaches
available in the literature.
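The DTW comparison step admits a compact sketch. The toy integer sequences below stand in for MFCC feature trajectories and are invented for illustration; real use would compare multidimensional feature vectors frame by frame.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three allowed warping steps.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# A time-stretched copy of a template scores far lower than a different word.
template = [0, 1, 2, 3, 2, 1, 0]
stretched = [0, 0, 1, 1, 2, 2, 3, 3, 2, 2, 1, 1, 0, 0]
other = [3, 3, 0, 0, 3, 3, 0]
print(dtw_distance(template, stretched))  # 0.0: warping absorbs the stretch
print(dtw_distance(template, other) > 0)
```

This tolerance to time stretching is exactly why DTW suits word matching: the same word spoken faster or slower still aligns cheaply with its template.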
Abstract: The paper presents a study of the synthetic transmit
aperture method applying Golay coded transmission to medical
ultrasound imaging. Longer coded excitation allows the total energy
of the transmitted signal to be increased without increasing the peak
pressure. The signal-to-noise ratio and penetration depth are improved
while maintaining high ultrasound image resolution.
In this work a 128-element linear transducer array with 0.3 mm
inter-element spacing was used, excited by a single cycle and by 8-
and 16-bit Golay coded sequences at a nominal frequency of 4 MHz.
A single-element transmit aperture was used to generate a spherical
wave covering the full image region, and all the elements received the
echo signals. A comparison of 2D ultrasound images of a wire
phantom and of a tissue-mimicking phantom is presented to
demonstrate the benefits of the coded transmission. The results were
obtained using the synthetic aperture algorithm with transmit and
receive signals correction based on a single element directivity
function.
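The property that makes Golay coded transmission attractive is that the autocorrelations of the two codes of a complementary pair sum to a delta function, so range sidelobes cancel when the two receptions are summed. The sketch below builds a pair by one standard recursive construction and verifies this; it is an illustration of the principle, not the paper's transmit scheme.

```python
import numpy as np

def golay_pair(n_bits):
    """Build a complementary Golay pair of length n_bits (a power of two)
    by the standard recursive append/negate construction."""
    a, b = np.array([1.0]), np.array([1.0])
    while len(a) < n_bits:
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

a, b = golay_pair(8)
# Defining property: the autocorrelations of the two codes sum to a delta,
# so range sidelobes cancel after summing the two matched-filter outputs.
acf = np.correlate(a, a, "full") + np.correlate(b, b, "full")
print(acf)  # zero everywhere except 2 * n_bits at the center lag
```

In imaging, each code of the pair is transmitted in turn and the two compressed echoes are summed, which is why two firings per scan line are needed.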
Abstract: In this paper we present a modification of an existing threshold model for shot cut detection, which is able to adapt itself to the sequence statistics and to operate in real time, because it uses only previously evaluated frames for the calculation. The efficiency of the proposed modified adaptive threshold scheme was verified through extensive experiments with several similarity metrics, and the achieved results were compared with those of the original model. According to the results, the proposed threshold scheme reached higher accuracy than the existing original model.
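A causal adaptive threshold of this general kind (using only previously evaluated frames) might be sketched as follows; the mean-plus-k-standard-deviations rule and all parameter values are assumptions for illustration, not the paper's actual model.

```python
import numpy as np

def is_shot_cut(dissimilarity, history, k=3.0, window=25):
    """Causal adaptive threshold: flag a cut when the current frame
    dissimilarity exceeds mean + k * std of the recent history only.
    Illustrative sketch; k and window are assumed parameters."""
    recent = history[-window:]
    threshold = np.mean(recent) + k * np.std(recent)
    return dissimilarity > threshold

# Smooth motion produces small frame differences; a cut produces a spike.
rng = np.random.default_rng(1)
history = list(0.1 + 0.02 * rng.random(50))
print(is_shot_cut(0.11, history))  # ordinary frame: below the threshold
print(is_shot_cut(0.90, history))  # abrupt change: flagged as a cut
```

Because the threshold is computed only from already-seen frames, the detector needs no look-ahead buffer, which is what makes real-time operation possible.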
Abstract: It is not easy to imagine how the existing city can be
converted to the principles of sustainability; however, the need for
innovation requires a pioneering phase which must address the main
problems of rehabilitating the operating models of the city. Today
there is a growing awareness that the identification and
implementation of policies and measures to promote the adaptation,
resilience and reversibility of the city require the contribution of our
discipline. This breakthrough is present in some recent international
experiences with Climate Plans, in which the envisaged measures are
closely interwoven with those of urban planning. These experiences
provide some answers to questions of principle, such as: how
strategies to combat climate change can be integrated into the
instruments of local government; what new and specific analyses
must be introduced into urban planning in order to understand the
issues of urban sustainability; and how the project compares across
different spatial scales.
Abstract: Solid waste can be considered as an urban burden or
as a valuable resource depending on how it is managed. To meet the
rising demand for energy and to address environmental concerns, a
conversion from conventional energy systems to renewable resources
is essential. For the sustainability of human civilization, an
environmentally sound and techno-economically feasible waste
treatment method is very important to treat recyclable waste. Several
technologies are available for realizing the potential of solid waste as
an energy source, ranging from very simple systems for disposing of
dry waste to more complex technologies capable of dealing with
large amounts of industrial waste. There are three main pathways for
conversion of waste material to energy: thermochemical,
biochemical and physicochemical. This paper investigates the
thermochemical conversion of solid waste for energy recovery. The
processes, advantages and disadvantages of various thermochemical
conversion processes are discussed and compared. Special attention
is given to the gasification process, as it provides better solutions
regarding public acceptance, feedstock flexibility, near-zero
emissions, efficiency and security. Finally, this paper presents
comparative statements of thermochemical processes and introduces
an integrated waste management system.
Abstract: Built environments have a large impact on environmental sustainability and, if not designed properly, can negatively affect our planet. The application of transformable intelligent building systems that automatically respond to environmental conditions is one of the best ways to help us create a sustainable environment. The significance of this issue is evident, as the energy crisis and environmental change have made sustainability a main concern in many societies. The aim of this research is to review and evaluate the importance and influence of transformable intelligent structures on the creation of sustainable architecture. Intelligent systems in current buildings provide convenience by automatically responding to changes in environmental conditions, reducing energy dissipation and increasing the lifecycle of buildings. By analyzing significant intelligent building systems, this paper evaluates the potential of transformable intelligent systems in the creation of a sustainable architecture and environment.
Abstract: There are two common types of operational research techniques: optimisation and metaheuristic methods. The latter may be defined as a sequential process that intelligently performs exploration and exploitation, inspired by natural intelligence, to form several iterative searches. The aim is to determine near-optimal solutions in a solution space effectively. In this work, a type of metaheuristic called Ant Colony Optimisation, ACO, inspired by the foraging behaviour of ants, was adapted to find optimal solutions of eight non-linear continuous mathematical models. Over a solution space restricted to a specified region of each model, sub-solutions may contain the global optimum or multiple local optima. Moreover, the algorithm has several common parameters: the number of ants, moves, and iterations, which act as the algorithm's drivers. A series of computational experiments for initialising the parameters was conducted using the Rigid Simplex, RS, and Modified Simplex, MSM, methods. The experimental results were analysed in terms of the best-so-far solutions, mean and standard deviation. Finally, a recommendation of proper level settings of the ACO parameters for all eight functions is stated. These parameter settings can be applied as a guideline for future uses of ACO, to promote ease of use of ACO in real industrial processes. It was found that the results obtained from MSM were very similar to those gained from RS. However, when results with noise standard deviations of 1 and 3 are compared, MSM reaches optimal solutions more efficiently than RS in terms of speed of convergence.
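A minimal sketch of ACO adapted to continuous domains (in the style of ACOR, where ants sample new candidate points from Gaussians centred on an archive of good solutions) is given below; the parameter values are illustrative assumptions, not the recommended settings of this study.

```python
import numpy as np

def continuous_aco(f, bounds, n_ants=20, archive=10, iters=200, seed=0):
    """ACOR-style sketch: ants sample new points from Gaussians centred
    on members of a solution archive; the archive keeps the best found.
    All parameter values here are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(lo)
    sols = rng.uniform(lo, hi, size=(archive, dim))
    for _ in range(iters):
        sols = sols[np.argsort([f(s) for s in sols])]          # rank archive
        sigma = sols.std(axis=0) + 1e-12                       # archive spread
        centres = sols[rng.integers(0, archive, size=n_ants)]  # pick guides
        ants = np.clip(centres + rng.normal(scale=sigma, size=(n_ants, dim)),
                       lo, hi)
        pool = np.vstack([sols, ants])
        sols = pool[np.argsort([f(s) for s in pool])][:archive]
    return sols[0]

# Sphere function: the global minimum is 0 at the origin.
best = continuous_aco(lambda x: float(np.sum(x * x)), [(-5, 5), (-5, 5)])
print(np.sum(best * best) < 1e-2)
```

The archive spread plays the role of evaporating pheromone: as good solutions cluster, the sampling Gaussians narrow and the search shifts from exploration to exploitation.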
Abstract: National Biodiversity Database System (NBIDS) has
been developed for collecting Thai biodiversity data. The goal of this
project is to provide advanced tools for querying, analyzing,
modeling, and visualizing patterns of species distribution for
researchers and scientists. NBIDS records two types of datasets:
biodiversity data and environmental data. Biodiversity data comprise
species presence data and species status. The attributes of biodiversity
data can be further classified into two groups: universal and project-specific
attributes. Universal attributes are attributes that are common
to all of the records, e.g., X/Y coordinates, year, and collector name.
Project-specific attributes are attributes that are unique to one or a
few projects, e.g., flowering stage. Environmental data include
atmospheric data, hydrology data, soil data, and land cover data
collected using GLOBE protocols. We have developed web-based
tools for data entry. Google Earth KML and ArcGIS were used
as tools for map visualization. webMathematica was used for simple
data visualization and also for advanced data analysis and
visualization, e.g., spatial interpolation, and statistical analysis.
NBIDS will be used by park rangers at Khao Nan National Park and
by researchers.
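The abstract mentions spatial interpolation without naming a method; inverse distance weighting is sketched below as one simple possibility for estimating a value at an unsampled X/Y coordinate. All data values are invented.

```python
import numpy as np

def idw(points, values, query, power=2.0):
    """Inverse-distance-weighted interpolation: a simple stand-in for the
    spatial interpolation mentioned above (the actual method used via
    webMathematica is not specified in the abstract)."""
    d = np.linalg.norm(points - query, axis=1)
    if np.any(d == 0):                     # query coincides with a sample
        return float(values[np.argmin(d)])
    w = 1.0 / d ** power
    return float(np.sum(w * values) / np.sum(w))

# Interpolate a species-density value at an unsampled X/Y coordinate.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
vals = np.array([10.0, 20.0, 20.0, 30.0])
print(idw(pts, vals, np.array([0.5, 0.5])))  # 20.0 by symmetry
```

Because the weights depend only on distance, IDW needs no fitted model, which makes it a common first choice for mapping sparse field observations.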