Abstract: This paper describes the performance of TCP Vegas
over a wireless IPv6 network. The performance of TCP Vegas is
evaluated using the network simulator ns-2. The simulation experiment
investigates how packet spacing affects the network delay, network
throughput and network efficiency of TCP Vegas. Moreover, we
investigate how variable FTP packet sizes affect the network
performance. The results of the simulation experiment show that when
packet spacing is implemented, network delay is reduced and network
throughput and network efficiency are improved. As the FTP
packet sizes increase, the delay-to-throughput ratio decreases.
From the experimental results, we propose an appropriate packet size
for transmitting File Transfer Protocol traffic using TCP Vegas
with the packet spacing enhancement over a wireless IPv6 environment
in ns-2. Additionally, we suggest an appropriate ratio for determining
the RTT and buffer size in a network.
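As a rough illustration of the mechanisms involved, the Vegas window rule and the packet-spacing (pacing) interval can be sketched as follows; the function names, the thresholds α = 1 and β = 3, and the units are our illustrative assumptions, not the paper's ns-2 code.

```python
# Illustrative sketch of TCP Vegas congestion-window adjustment plus
# packet spacing (pacing); names and thresholds are assumptions, not
# taken from the paper's ns-2 scripts.

def vegas_update(cwnd, base_rtt, rtt, alpha=1.0, beta=3.0):
    """One Vegas update: compare expected vs. actual throughput."""
    expected = cwnd / base_rtt              # throughput with no queuing
    actual = cwnd / rtt                     # observed throughput
    diff = (expected - actual) * base_rtt   # estimated packets queued
    if diff < alpha:
        cwnd += 1                           # too few queued: increase
    elif diff > beta:
        cwnd -= 1                           # too many queued: back off
    return cwnd

def pacing_interval(cwnd, rtt):
    """Packet spacing: spread cwnd packets evenly over one RTT."""
    return rtt / cwnd

cwnd = vegas_update(10, base_rtt=0.1, rtt=0.1)   # no queuing -> grow
gap = pacing_interval(cwnd, rtt=0.1)             # seconds between sends
```

Pacing avoids the bursts of ack-clocked sending, which is one plausible reason delay drops when spacing is enabled.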
Abstract: A realistic 3D face model represents the pose, illumination,
and expression of a face more precisely than a 2D face model, so it
can be usefully applied in areas such as face recognition, games,
avatars, and animation.
In this paper, we propose a 3D face modeling method based on 3D
dense morphable shape model. The proposed 3D modeling method
first constructs a 3D dense morphable shape model from 3D face scan
data obtained using a 3D scanner. Next, the proposed method extracts
and matches facial landmarks from 2D image sequence containing a
face to be modeled, and then reconstructs the 3D vertex coordinates of
the landmarks using a factorization-based structure-from-motion (SfM) technique. Then, the
proposed method obtains a 3D dense shape model of the face to be
modeled by fitting the constructed 3D dense morphable shape model
into the reconstructed 3D vertices. Also, the proposed method makes a
cylindrical texture map using the 2D face image sequence. Finally, the
proposed method generates a 3D face model by rendering the 3D dense
face shape model with the cylindrical texture map. The model-building
process demonstrates that the proposed method is relatively easy,
fast, and precise.
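The factorization-based SfM step can be sketched as follows under an orthographic camera model; the synthetic data and variable names are our assumptions, not the authors' implementation.

```python
import numpy as np

# Sketch of Tomasi-Kanade-style factorization SfM: stack 2D landmark
# tracks into a measurement matrix, center it, and recover rank-3
# motion/shape factors via SVD (up to an affine ambiguity).
rng = np.random.default_rng(0)
P = rng.standard_normal((3, 20))            # 20 landmark 3-D positions
W_rows = []
for _ in range(6):                          # 6 frames, orthographic camera
    R = np.linalg.qr(rng.standard_normal((3, 3)))[0][:2]  # 2x3 projection
    t = rng.standard_normal((2, 1))
    W_rows.append(R @ P + t)
W = np.vstack(W_rows)                       # 12 x 20 measurement matrix
W0 = W - W.mean(axis=1, keepdims=True)      # register to the centroid
U, s, Vt = np.linalg.svd(W0, full_matrices=False)
M = U[:, :3] * s[:3]                        # motion factor
S = Vt[:3]                                  # shape factor (3-D vertices)
err = np.abs(M @ S - W0).max()              # rank-3 model reproduces W0
```

The recovered `S` plays the role of the reconstructed 3D landmark vertices that the dense morphable model is subsequently fitted to.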
Abstract: Electronic commerce is growing rapidly with on-line
sales already heading for hundreds of billion dollars per year. Due to
the huge amount of money transferred everyday, an increased
security level is required. In this work we present the architecture of
an intelligent speaker verification system, which is able to accurately
verify the registered users of an e-commerce service using only their
voices as an input. According to the proposed architecture, a
transaction-based e-commerce application should be complemented
by a biometric server where the customer's unique set of speech models
(voiceprint) is stored. The verification procedure asks the
user to pronounce a personalized sequence of digits; speech is
captured and voice features are extracted at the client side and then
sent to the biometric server. The biometric server uses pattern
recognition to decide whether the received features match the stored
voiceprint of the customer the user claims to be, and accordingly grants
verification. The proposed architecture can provide e-commerce
applications with a higher degree of certainty regarding the identity
of a customer and prevent impostors from executing fraudulent
transactions.
Abstract: In this paper we introduce a new data-oriented model of the
uniform random variable that is well matched with computing systems. Owing to this conformity with the structure of current computers, the model can be used efficiently in statistical inference.
Abstract: While in practice negotiation is always a mix of
cooperation and competition, these two elements correspond to
different approaches to the relationship and to different orientations
in terms of strategy, techniques, tactics and arguments employed by
the negotiators with related effects and in the end leading to different
outcomes. The levels of honesty, trust and therefore cooperation are
influenced not only by the uncertainty of the situation, the objectives,
stakes or power but also by the orientation given from the very
beginning of the relationship. When negotiation is reduced to a
confrontation of power, participants rely on coercive measures, using
different kinds of threats, making false promises, and bluffing in
order to establish a more acceptable balance of power.
Most negotiators tend to complain about the
unethical aspects of the tactics used by their counterparts while, at
the same time, remaining mostly unaware of the sources of influence on
their own vision and practices. In this article, our intention is to
clarify these sources and try to understand what can lead negotiators
to unethical practices.
Abstract: Optimal routing in communication networks is a
major issue to be solved. In this paper, we address the application of
Tabu Search (TS) to the optimal routing problem, where the aims are to
reduce computational time and to improve the quality of the solution;
the objective is to minimize the average communication delay. The
effectiveness of the Tabu Search method in solving the shortest path
problem is shown by simulation results. Through this approach,
computational cost can be reduced.
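A minimal sketch of Tabu Search applied to finding a minimum-delay route is given below; the toy topology, the move set (drop or substitute an intermediate node), and the tabu tenure are our illustrative assumptions, not the paper's setup.

```python
# Illustrative Tabu Search for a minimum-delay route; graph, moves,
# and tenure are assumptions, not taken from the paper.
graph = {                                   # link delays (ms)
    'A': {'B': 4, 'C': 2},
    'B': {'A': 4, 'C': 5, 'D': 10},
    'C': {'A': 2, 'B': 5, 'D': 8},
    'D': {'B': 10, 'C': 8},
}

def delay(path):
    return sum(graph[u][v] for u, v in zip(path, path[1:]))

def neighbors(path):
    """Moves: drop an intermediate node, or swap it for another node."""
    for i in range(1, len(path) - 1):
        u, w = path[i - 1], path[i + 1]
        if w in graph[u]:
            yield path[:i] + path[i + 1:], path[i]      # drop path[i]
        for v in graph:
            if v not in path and v in graph[u] and w in graph[v]:
                yield path[:i] + [v] + path[i + 1:], v  # substitute v

def tabu_search(path, iters=20, tenure=3):
    best, tabu = path, []
    for _ in range(iters):
        cands = [(delay(p), p, m) for p, m in neighbors(path)
                 if m not in tabu or delay(p) < delay(best)]  # aspiration
        if not cands:
            break
        d, path, move = min(cands)
        tabu = (tabu + [move])[-tenure:]    # short-term memory
        if d < delay(best):
            best = path
    return best

route = tabu_search(['A', 'B', 'D'])        # start from a poor route
```

The tabu list blocks immediate reversal of recent moves, which lets the search escape local minima that a greedy descent would be stuck in.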
Abstract: This paper presents a study of the impact of the apparent
reactance injected by a series Flexible AC Transmission System
(FACTS) device, i.e. the Thyristor Controlled Series Reactor (TCSR),
on the measured impedance of a 400 kV single transmission line
in the presence of a phase-to-earth fault with fault resistance. The study
deals with an electrical transmission line of Eastern Algerian
transmission networks at Group Sonelgaz (Algerian Company of
Electrical and Gas) compensated by TCSR connected at midpoint of
the line. This compensator, used to inject active and reactive power,
is controlled by three TCSRs. The simulation results show the
impact of the TCSR on the short-circuit calculation parameters
and on the impedance measured by the distance relay in the
presence of an earth fault for three case studies.
Abstract: Polymeric microreactors have emerged as a new
generation of carriers that hold tremendous promise in the areas of
cancer therapy, controlled drug delivery, and the removal of
pollutants. The present work reports a simple and convenient
methodology for the synthesis of polystyrene and poly(ε-caprolactone)
microreactors. An aqueous suspension of carboxylated (1 μm)
polystyrene latex particles was mixed with toluene solution followed
by freezing with liquid nitrogen. The frozen particles were incubated
at -20°C and characterized for the formation of voids on the surface of
polymer microspheres by Field Emission Scanning Electron
Microscope. The hollow particles were then incubated overnight at
40°C with unfunctionalized quantum dots (QDs) in a 5:1 ratio. The
QD-encapsulated polystyrene microcapsules were characterized by
fluorescence microscopy.
Likewise, poly(ε-caprolactone) microreactors were prepared by
micro-volcanic rupture of freeze-dried microspheres, synthesized by
emulsification of the polymer with aqueous poly(vinyl alcohol) and
frozen in liquid nitrogen. The microreactors were examined with a Field
Emission Scanning Electron Microscope for size and morphology.
The current study is an attempt to create hollow polymer particles
that can be employed for the microencapsulation of nanoparticles and
drug molecules.
Abstract: ZnO+Ga2O3 functionally graded thin films (FGTFs)
were examined for their potential use in solar cells and organic
light-emitting diodes (OLEDs). FGTF transparent conducting oxides (TCOs)
were fabricated by combinatorial RF magnetron sputtering. The
composition gradient was controlled up to 10% by changing the
plasma power of the two sputter guns. A Ga2O3+ZnO graded region
was placed on the top layer of ZnO. The FGTFs showed up to 80%
transmittance. Their surface resistances were reduced to < 10% by
increasing the ratio of Ga2O3 to pure ZnO in the TCO. The FGTFs' work
functions could be controlled within a range of 0.18 eV. The
controllable work function is very promising because it
reduces the contact resistance between the anode and the hole
transport layers of OLED and solar cell devices.
Abstract: The draft Auckland Unitary Plan outlines the future land use for new housing and businesses to accommodate Auckland's population growth over the next thirty years. According to the Auckland Unitary Plan, over the next 30 years the population of Auckland is projected to increase by one million, and up to 70% of total new dwellings are to occur within the existing urban area. Intensification will not only increase the number of medium- or higher-density houses, such as terraced houses and apartment buildings, within the existing urban area but also change the mean housing design data, which can affect building thermal performance under the local climate. Based on the mean energy consumption and building design data of a number of Auckland sample houses, and the relationships between them, this study estimates the future mean housing energy consumption associated with the change in mean housing design data and evaluates housing energy efficiency under the Auckland Unitary Plan.
Abstract: Data mining and knowledge engineering have become difficult tasks due to the large amount of data available on the web nowadays. The validity and reliability of data have also become a major concern in knowledge acquisition. Besides, acquiring knowledge from different languages has become another concern. Many language translators and corpora have been developed, but their functions are usually limited to certain languages and domains. Furthermore, search results from engines with the traditional 'keyword' approach are no longer satisfying. More intelligent knowledge engineering agents are needed. To address these problems, a system known as the Multilingual Word Semantic Network is proposed. This system adapts a semantic network to organize words according to concepts and relations. The system also follows an open-source development philosophy, enabling native speakers and language experts to contribute their knowledge to the system. The contributed words are then defined and linked using lexical and semantic relations. Thus, related words and derivatives can be identified and linked. The implemented system contributes to the development of the semantic web and knowledge engineering.
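The word-organization idea can be sketched as a labelled graph of words and typed relations; the class, relation names, and example entries below are illustrative assumptions, not the system's actual schema.

```python
# Minimal sketch of a multilingual word semantic network: words as
# nodes, typed lexical/semantic relations as labelled edges.
from collections import defaultdict

class SemanticNetwork:
    def __init__(self):
        self.edges = defaultdict(list)      # word -> [(relation, word)]

    def add(self, word, relation, other):
        self.edges[word].append((relation, other))

    def related(self, word, relation=None):
        """Words linked to `word`, optionally filtered by relation type."""
        return [w for r, w in self.edges[word]
                if relation is None or r == relation]

net = SemanticNetwork()
net.add('kucing', 'translation_of', 'cat')  # hypothetical Malay entry
net.add('cat', 'is_a', 'animal')
net.add('cat', 'synonym', 'feline')

hypernyms = net.related('cat', 'is_a')      # ['animal']
```

Because edges are typed, the same structure can hold both cross-language links and within-language lexical relations, which is the core of the concept-based organization described above.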
Abstract: A large section of the society in Urban India is unable
to afford a basic dwelling unit. The housing shortage caused by rising
unaffordability makes it logical to consider alternative technologies
more seriously. How far do these alternative
technologies match up with conventional techniques? How do they
integrate with the present-day need for urban amenities and
facilities? Are the owners of bamboo dwellings, for instance, a part of
the mainstream housing sector, having the same rights and privileges
as those enjoyed by other property owners? Will they have access to
loans for building, improving, renovating or repairing their
dwellings? Why do we still hesitate to build a bamboo house for
ourselves? Are our policy framework and political resolve in place to
welcome such alternative technologies? It is time we found these
answers, in order to explore the reasons for the large-scale
non-acceptance of a technology proven for its worth.
Abstract: The Message Passing Interface (MPI) is widely used for
parallel and distributed computing. MPICH and LAM are popular
open-source MPI implementations available to the parallel computing
community; there are also commercial MPI implementations that perform
better than MPICH. In this paper, we discuss a commercial Message
Passing Interface, C-MPI (C-DAC Message Passing Interface). C-MPI is
an optimized MPI for CLUMPS. It is found to be faster and more robust
than MPICH. We have compared the performance of C-MPI and MPICH on a
Gigabit Ethernet network.
Abstract: The effects of global warming on India vary from the
submergence of low-lying islands and coastal lands to the melting of
glaciers in the Indian Himalayas, threatening the volumetric flow rate
of many of the most important rivers of India and South Asia. In
India, such effects are projected to impact millions of lives. As a
result of ongoing climate change, the climate of India has become
increasingly volatile over the past several decades; this trend is
expected to continue.
Climate change is one of the most important global environmental
challenges, with implications for food production, water supply,
health, energy, and more. Addressing climate change requires a good
scientific understanding as well as coordinated action at the national
and global levels. The climate change issue is part of the larger challenge
of sustainable development. As a result, climate policies can be more
effective when consistently embedded within broader strategies
designed to make national and regional development paths more
sustainable. The impact of climate variability and change, climate
policy responses, and associated socio-economic development will
affect the ability of countries to achieve sustainable development
goals.
A well-calibrated Soil and Water Assessment Tool (SWAT) model (R² =
0.9968, NSE = 0.91) was applied to the Khatra sub-basin of the
Kangsabati River watershed in Bankura district of West Bengal,
India, in order to evaluate projected parameters for agricultural
activities. Evapotranspiration, transmission losses, potential
evapotranspiration and lateral flow to reach are evaluated for the
years 2041-2050 in order to generate a picture for the sustainable
development of the river basin and its inhabitants.
India has a significant stake in scientific advancement as well as
an international understanding to promote mitigation and adaptation.
This requires improved scientific understanding, capacity building,
networking and broad consultation processes. This paper is a
commitment towards the planning, management and development of
the water resources of the Kangsabati River by presenting detailed
future scenarios of the Kangsabati river basin, Khatra sub basin, over
the mentioned time period.
India's economy and societal infrastructures are finely tuned to the
remarkable stability of the Indian monsoon, with the consequence
that vulnerability to even small changes in monsoon rainfall is very high.
In 2002 the monsoon rains failed during July, causing profound losses
in agricultural production and a drop of over 3% in India's GDP.
Neither the prolonged break in the monsoon nor the seasonal rainfall
deficit was predicted. While the general features of monsoon
variability and change are fairly well-documented, the causal
mechanisms and the role of regional ecosystems in modulating the
changes are still not clear. Current climate models are very poor at
modelling the Asian monsoon: this is a challenging and critical
region where the ocean, atmosphere, land surface and mountains all
interact. The impact of climate change on regional ecosystems is
likewise unknown. The potential for the monsoon to become more
volatile has major implications for India itself and for economies
worldwide. Knowledge of future variability of the monsoon system,
particularly in the context of global climate change, is of great
concern for regional water and food security.
The major finding of this paper is that all of the chosen
projected parameters (transmission losses, soil water content,
potential evapotranspiration, evapotranspiration and lateral flow to
reach) display an increasing trend over the period 2041-2050.
Abstract: The current speech interfaces in many military
applications may be adequate for native speakers. However,
the recognition rate drops considerably for non-native speakers
(people with foreign accents). This is mainly because non-native
speakers exhibit large temporal and intra-phoneme
variations when they pronounce the same words. This
problem is also complicated by the presence of large
environmental noise such as tank noise, helicopter noise, etc.
In this paper, we propose a novel continuous acoustic feature
adaptation algorithm for on-line accent and environmental
adaptation. Implemented by incremental singular value
decomposition (SVD), the algorithm captures local acoustic
variation and runs in real-time. This feature-based adaptation
method is then integrated with conventional model-based
maximum likelihood linear regression (MLLR) algorithm.
Extensive experiments have been performed on the NATO
non-native speech corpus with baseline acoustic model trained
on native American English. The proposed feature-based
adaptation algorithm improved the average recognition
accuracy by 15%, while the MLLR model based adaptation
achieved 11% improvement. The corresponding word error
rate (WER) reduction was 25.8% and 2.73%, as compared to
that without adaptation. The combined adaptation achieved
overall recognition accuracy improvement of 29.5%, and
WER reduction of 31.8%, as compared to that without
adaptation.
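One step of an incremental (column-appending) SVD update, of the kind such on-line algorithms rely on, can be sketched as follows; this generic update is our illustration, not the paper's exact procedure.

```python
import numpy as np

# One incremental-SVD step: given the thin SVD A = U @ diag(s) @ Vt,
# update it after a new feature vector c is appended as a column,
# without re-decomposing A from scratch.

def svd_append_column(U, s, Vt, c):
    p = U.T @ c                         # component of c inside current basis
    r = c - U @ p                       # residual orthogonal to the basis
    rho = np.linalg.norm(r)
    k = s.size
    K = np.zeros((k + 1, k + 1))        # small "core" matrix to re-diagonalize
    K[:k, :k] = np.diag(s)
    K[:k, k] = p
    K[k, k] = rho
    Uk, sk, Vtk = np.linalg.svd(K)
    U_new = np.hstack([U, (r / rho)[:, None]]) @ Uk
    V = np.vstack([np.hstack([Vt.T, np.zeros((Vt.shape[1], 1))]),
                   np.zeros((1, k + 1))])
    V[-1, -1] = 1.0
    return U_new, sk, (V @ Vtk.T).T

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))
U, s, Vt = np.linalg.svd(A, full_matrices=False)
c = rng.standard_normal(6)
U2, s2, Vt2 = svd_append_column(U, s, Vt, c)
err = np.abs(U2 @ np.diag(s2) @ Vt2 - np.hstack([A, c[:, None]])).max()
```

Only a small (k+1)x(k+1) matrix is re-decomposed per new vector, which is why this family of updates can track local acoustic variation in real time.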
Abstract: Congestion control is one of the fundamental issues in computer networks. Without proper congestion control mechanisms there is the possibility of inefficient utilization of resources, ultimately leading to network collapse. Hence congestion control is an effort to adapt the performance of a network to changes in the traffic load without adversely affecting users' perceived utilities. AIMD (Additive Increase Multiplicative Decrease) is the best algorithm among the set of linear algorithms because it offers both good efficiency and good fairness. Our control model is based on the assumptions of the original AIMD algorithm, and we show that both the efficiency and the fairness of AIMD can be improved. We call our approach New AIMD. We present experimental results with TCP that match the expectations of our theoretical analysis.
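The classic AIMD rule that the approach builds on can be sketched as follows; the parameters a = 1 and b = 0.5 are the textbook defaults, not necessarily the values used in New AIMD.

```python
# Classic AIMD window evolution: additive increase of a per RTT,
# multiplicative decrease by factor b on loss.

def aimd_step(cwnd, loss, a=1.0, b=0.5):
    return cwnd * b if loss else cwnd + a

w = 10.0
history = []
for loss in [False] * 4 + [True] + [False] * 3:   # one loss event mid-way
    w = aimd_step(w, loss)
    history.append(w)
```

The resulting sawtooth (grow linearly, halve on loss) is what gives AIMD its efficiency/fairness trade-off among linear controls.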
Abstract: In this paper we propose the use of Huffman coding to reduce
the peak-to-average power ratio (PAR) of an OFDM system as a distortionless
scrambling technique, and we utilize the amount saved in the
total bit rate by the Huffman coding to send the encoding table
for accurate decoding at the receiver without reducing the
effective throughput. We found that the use of Huffman coding
reduces the PAR by about 6 dB. Also we have investigated the
effect of PAR reduction due to Huffman coding through testing
the spectral spreading and the inband distortion due to HPA with
different IBO values. We found a complete match of our
expectation from the proposed solution with the obtained
simulation results.
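How the PAR of an OFDM symbol is measured can be sketched as follows; the QPSK mapping and 64 subcarriers are illustrative choices, not the paper's simulation settings.

```python
import numpy as np

# Measure the PAR of one OFDM symbol: map bits to QPSK subcarrier
# symbols, take the IFFT to get the time-domain signal, and compute
# peak power over mean power (in dB).
rng = np.random.default_rng(0)
N = 64                                       # number of subcarriers
bits = rng.integers(0, 2, size=(N, 2))
symbols = (2 * bits[:, 0] - 1 + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
x = np.fft.ifft(symbols) * np.sqrt(N)        # time-domain OFDM symbol
power = np.abs(x) ** 2
par_db = 10 * np.log10(power.max() / power.mean())
```

Scrambling the bit stream (here, via Huffman coding in the paper) changes which subcarrier phases align, and hence how large `power.max()` can get relative to the mean.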
Abstract: The density estimates considered in this paper comprise
a base density and an adjustment component consisting of a linear
combination of orthogonal polynomials. It is shown that, in the
context of density approximation, the coefficients of the linear combination
can be determined either from a moment-matching technique
or a weighted least-squares approach. A kernel representation of
the corresponding density estimates is obtained. Additionally, two
refinements of the Kronmal-Tarter stopping criterion are proposed
for determining the degree of the polynomial adjustment. By way of
illustration, the density estimation methodology advocated herein is
applied to two data sets.
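In the spirit of this abstract, such an approximant can be written in the following standard form for orthogonal-polynomial density adjustments (the symbols below are ours, not the authors'):

```latex
\hat{f}_d(x) \;=\; \psi(x)\sum_{j=0}^{d}\lambda_j\,P_j(x),
\qquad
\lambda_j \;=\; \frac{1}{h_j}\,\mathbb{E}\!\left[P_j(X)\right],
\qquad
h_j \;=\; \int \psi(x)\,P_j(x)^{2}\,dx,
```

where ψ is the base density and the P_j are polynomials orthogonal with respect to ψ; the coefficients λ_j then follow directly from moment matching, and a Kronmal-Tarter-type stopping criterion selects the degree d.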
Abstract: The present work describes a computational study of the
aerodynamic characteristics of the GLC305 airfoil, both clean and with
a 16.7-min ice shape (rime 212) and a 22.5-min ice shape (glaze 944).
The performance of the SA, k-ε, k-ω Std and k-ω SST turbulence
models is assessed against experimental flow fields at Mach numbers of
0.12, 0.21 and 0.28 and Reynolds numbers of 3×10⁶, 6×10⁶ and
10.5×10⁶ on the clean and iced GLC305 airfoil. Lift, drag and
pitching moment coefficients are predicted at different Mach numbers
and angles of attack. The sensitivity of the solutions to the
turbulence model, Mach number, initial conditions, grid resolution and
near-wall grid spacing made the study demanding. A Navier-Stokes-based
computational technique is used. The results are very close to the
experimental results, and the SA and SST models are found to be more
efficient than the standard k-ε and k-ω models for the problem under
study.
Abstract: Throughout thirty years of local, national and international experience in medicine as a medical student, junior doctor and eventually Consultant and Professor in Anaesthesia, Intensive Care and Pain Management, I have noted significant generalised dissatisfaction among medical students and doctors regarding their medical education and practice. We repeatedly hear complaints from patients about the dysfunctional health care system they are dealing with and, subsequently, the poor medical service that they receive. Medical students are bombarded with lectures, tutorials, clinical rounds and various exams. Clinicians are weighed down with a never-ending array of competing duties. Patients are extremely unhappy about the long waiting lists, the loss of their records and the continuous deterioration of the health care service. This problem has been reported in different countries by several authors [1,2,3]. In an attempt to solve this dilemma, it has been suggested that computer technology be implemented in medicine [2,3]. Computers in medicine are a medium of international communication of the revolutionary advances being made in the application of the computer to the fields of bioscience and medicine [4,5]. Awareness about using computers in medicine has recently increased all over the world. In Misr University for Science & Technology (MUST), Egypt, medical students are now given hand-held computers (laptops) with Internet facilities, making their medical education accessible, convenient and up to date. However, this trial still needs to be validated. To help readers catch up with the ongoing rapid development in this interesting field, the author has decided to continue reviewing the literature, exploring the state of the art in computer-based medicine and updating medical professionals, especially the local trainee doctors in Egypt.
In Part I of this review article we give a general background discussing the potential use of computer technology in the various aspects of the medical field, including education, research, clinical practice and the health care service given to patients. We hope this will help to start changing the culture and promoting awareness about the importance of implementing information technology (IT) in medicine, a field in which such help is needed. International collaboration is recommended to support emerging countries in achieving this target.