Abstract: A high-performance computer includes a fast processor
and millions of bytes of memory. During data processing, huge
amounts of information are shuffled between the memory and the
processor. Because of its small size and high speed, cache has
become a common feature of high-performance computers.
Enhancing cache performance has proved essential to speeding up
cache-based computers. Most enhancement approaches can be
classified as either software-based or hardware-controlled. Cache
performance is quantified in terms of the hit ratio or miss ratio.
In this paper, we optimize cache performance by enhancing the
cache hit ratio. Optimum cache performance is obtained through a
cache hardware modification that quickly rejects mismatched line
tags from the hit-or-miss comparison stage, thus achieving a low
hit time for the wanted line in the cache. In the proposed
technique, which we call Even-Odd Tabulation (EOT), the cache
lines coming from main memory into the cache are classified into
two types, even line tags and odd line tags, depending on their
least significant bit (LSB). The EOT technique exploits this
division to reject mismatched line tags in a very short time
compared to the time spent by the main comparator in the cache,
giving an optimum hit time for the wanted cache line. The high
performance of the EOT technique against the familiar mapping
technique FAM is shown in the simulation results.
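The quick-rejection step described above can be sketched in software. This is a minimal illustration of the even/odd classification idea, not the authors' hardware design; the function name and the sequential scan are our own assumptions:

```python
def eot_lookup(requested_tag, stored_tags):
    """Return the index of the matching tag, or -1 on a miss.

    Tags whose LSB differs from the requested tag's LSB are
    rejected without a full tag comparison (the EOT idea).
    """
    lsb = requested_tag & 1  # even (0) or odd (1) classification
    for i, tag in enumerate(stored_tags):
        if (tag & 1) != lsb:
            continue  # quick rejection: LSB mismatch, skip full compare
        if tag == requested_tag:
            return i  # hit
    return -1  # miss
```

On average, about half of the stored tags are eliminated by the one-bit check before the full comparison is attempted, which is the source of the reduced hit time claimed above.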
Abstract: This paper presents the experimental results of a
single cylinder Enfield engine using an electronically controlled fuel
injection system which was developed to carry out exhaustive tests
using neat CNG, and mixtures of hydrogen in compressed natural gas
(HCNG) as 0, 5, 10, 15 and 20% by energy. Experiments were
performed at 2000 and 2400 rpm with wide open throttle and varying
the equivalence ratio. Hydrogen, which has a fast burning rate,
enhances the flame propagation rate of compressed natural gas
when added to it. The emissions of HC and CO decreased with
increasing percentage of hydrogen, but NOx was found to increase.
The results indicated a marked improvement in brake thermal
efficiency with increasing percentage of added hydrogen. The
improvement in thermal efficiency was clearly greater in the lean
region than in the rich region. This study is expected to help
reduce vehicular emissions while increasing thermal efficiency,
and thus to mitigate further environmental degradation.
Abstract: Wavelet transform or wavelet analysis is a recently
developed mathematical tool in applied mathematics. In numerical
analysis, wavelets also serve as a Galerkin basis to solve partial
differential equations. The Haar transform, or Haar wavelet
transform, is the simplest and earliest example of an orthonormal
wavelet transform. Owing to its popularity in wavelet analysis,
several definitions and various generalizations or algorithms
exist for calculating the Haar transform. The fast Haar transform
(FHT) is one algorithm that reduces the tedious calculations in
the Haar transform. In this paper, we present a modified fast and
exact algorithm for the FHT, namely the Modified Fast Haar
Transform (MFHT). The proposed procedure allows certain
calculations in the decomposition process to be omitted without
affecting the results.
Abstract: Principal Component Analysis (PCA) has many
different important applications especially in pattern detection
such as face detection and recognition. For real-time
applications, the response time should therefore be as small as
possible. In this paper, a new implementation of PCA for fast
face detection is presented. The new implementation is based on
cross-correlation in the frequency domain between the input
image and the eigenvectors (weights). Simulation results show
that the proposed implementation of PCA is faster than the
conventional one.
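The core speed-up named above, cross-correlation in the frequency domain, can be sketched with the FFT. This is a generic illustration of the technique, not the paper's exact implementation; the function name is our own:

```python
import numpy as np

def freq_cross_correlation(image, eigenvector):
    """Circular cross-correlation computed in the frequency domain:
    correlation in the spatial domain becomes an element-wise
    product of one FFT with the conjugate of the other."""
    F1 = np.fft.fft2(image)
    F2 = np.fft.fft2(eigenvector)
    return np.real(np.fft.ifft2(F1 * np.conj(F2)))
```

For large images, the O(N log N) FFT route replaces the O(N^2) direct sliding-window correlation, which is the basis of the reported speed advantage.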
Abstract: This study was carried out to evaluate the effect of S-girdling on fruit growth and quality of wax apple. The study was laid out in a randomized complete block design with four replicates. Four treatments were applied: S-girdling, fruit thinning plus bagging with 2,4-D spray, fruit thinning plus bagging, and an untreated control. 2,4-D was sprayed at the small-bud and petal-fall stages. Girdling was applied three weeks before flowering. The effect of all treatments on fruit growth was measured weekly. The number of flowers, fruit set, fruit drop, fruit cracking, and fruit quality were recorded. The results indicated that S-girdling and 2,4-D application produced the lowest bud drop and fruit drop compared with the untreated control. S-girdling promoted faster fruit growth, producing the best final fruit length and diameter compared with the untreated control. S-girdling also markedly enhanced fruit set, fruit weight, and total soluble solids, and reduced fruit cracking and titratable acidity. It was also noticed that 2,4-D application increased the fruit growth rate and improved the physiological and biochemical characteristics of the fruit compared with the control treatment. It was concluded that S-girdling can be recommended as an industry norm to increase fruit set and fruit quality in wax apple. 2,4-D application had a distinctive and significant effect on most of the fruit quality characteristics assessed.
Abstract: The design of an Automatic Generation Control (AGC) system plays a vital role in the automation of power systems. This paper proposes a Hybrid Neuro-Fuzzy (HNF) approach for AGC of a two-area interconnected reheat thermal power system with consideration of the Generation Rate Constraint (GRC). The advantage of the proposed controller is that it can handle system non-linearities while being faster than conventional controllers. The performance of the HNF controller has been compared with that of a conventional Proportional-Integral (PI) controller and a Fuzzy Logic Controller (FLC), both in the absence and in the presence of the GRC. System performance is examined considering a disturbance in each area of the interconnected power system.
Abstract: In this paper, the implementation of a rule-based
intuitive reasoner is presented. The implementation included two
parts: the rule induction module and the intuitive reasoner. A large
weather database was acquired as the data source. Twelve weather
variables from those data were chosen as the "target variables"
whose values were predicted by the intuitive reasoner. A "complex"
situation was simulated by making only subsets of the data available
to the rule induction module. As a result, the rules induced were
based on incomplete information with variable levels of certainty.
The certainty level was modeled by a metric called "Strength of
Belief", which was assigned to each rule or datum as ancillary
information about the confidence in its accuracy. Two techniques
were employed to induce rules from the data subsets: decision tree
and multi-polynomial regression, respectively for the discrete and the
continuous type of target variables. The intuitive reasoner was tested
for its ability to use the induced rules to predict the classes of the
discrete target variables and the values of the continuous target
variables. The intuitive reasoner implemented two types of
reasoning, fast and broad, where, by analogy to human thought, the
former corresponds to quick decision making and the latter to
deeper contemplation. For reference, a weather data analysis approach
which had been applied on similar tasks was adopted to analyze the
complete database and create predictive models for the same 12
target variables. The values predicted by the intuitive reasoner and
the reference approach were compared with actual data. The intuitive
reasoner reached near-100% accuracy for two continuous target
variables. For the discrete target variables, the intuitive reasoner
predicted at least 70% as accurately as the reference reasoner. Since
the intuitive reasoner operated on rules derived from only about 10%
of the total data, it demonstrated the potential advantages in dealing
with sparse data sets as compared with conventional methods.
Abstract: In recent years, the genomes of more and more species
have been sequenced, providing data for phylogenetic
reconstruction based on genome rearrangement measures. A main
task in all phylogenetic reconstruction algorithms is to solve
the median-of-three problem. Although this problem is NP-hard
even for the simplest distance measures, there are exact
algorithms for the breakpoint median and the reversal median
that are fast enough for practical use.
In this paper, this approach is extended to the transposition median as
well as to the weighted reversal and transposition median. Although
there is no exact polynomial algorithm known even for the pairwise
distances, we will show that it is in most cases possible to solve
these problems exactly within reasonable time by using a branch and
bound algorithm.
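As context for the breakpoint median mentioned above, the underlying pairwise breakpoint distance between two gene orders can be computed directly. A minimal sketch for unsigned permutations (the median-of-three then seeks a gene order minimizing the sum of such distances to three given ones):

```python
def breakpoint_distance(pi, sigma):
    """Number of adjacencies in pi that are not adjacencies in sigma
    (unsigned breakpoint distance between two gene orders)."""
    adj = set()
    for a, b in zip(sigma, sigma[1:]):
        adj.add((a, b))
        adj.add((b, a))  # unsigned: adjacency is orientation-free
    return sum(1 for a, b in zip(pi, pi[1:]) if (a, b) not in adj)
```

Exchanging two adjacent genes, for instance, breaks the two adjacencies around the swapped pair, so `[1, 2, 3, 4]` versus `[1, 3, 2, 4]` has distance 2.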
Abstract: Saudi Arabia has seen a drastic increase in
traffic-related crashes in recent years. With a population of
over 29 million, Saudi Arabia is considered a fast-growing and
emerging economy. The rapid population increase and economic
growth have resulted in rapid expansion of the transportation
infrastructure, which has led to an increase in road crashes. The
Saudi Ministry of Interior reported more than 7,000 people killed
and 68,000 injured in 2011, ranking Saudi Arabia among the worst
countries worldwide in traffic safety. The traffic safety issues
in the country also cause distress to road users and an economic
loss exceeding 3.7 billion Euros annually. Keeping this in view,
researchers in Saudi Arabia are investigating ways to improve
traffic safety conditions in the country. This paper presents a
multilevel approach to collecting the traffic safety data
required for traffic safety studies in the region. Two highway
corridors connecting the cities of Dammam and Khobar, the
39-kilometre King Fahd Highway and the 42-kilometre Gulf
Cooperation Council Highway, were selected as the study area.
Traffic data collected included
traffic counts, crash data, travel time data, and speed data. The
collected data was analysed using geographic information system to
evaluate any correlation. Further research is needed to investigate the
effectiveness of traffic safety related data when collected in a
concerted effort.
Abstract: This research explores the links between physical
development and transportation infrastructure around Kumasi,
Ghana. It utilizes census data as well as fieldwork and interviews
carried out during July and December 2005. The results suggest that
there is a weak association between transportation investments and
physical development, and that recent housing has generally occurred
in poorly accessible locations. Road investments have generally
followed physical expansion rather than the reverse. Hence,
policies designed to manage the fast growth now occurring around
Ghanaian cities should not focus exclusively on improving
transportation infrastructure but also on strengthening the
underlying traditional land management structures and the
official land administration institutions that operate within
those structures.
Abstract: Impaired finger function and difficulty with ambulation
occur when the spinal cord is damaged. Cervical spondylotic
myelopathy, one such myelopathy, arises not only from external
factors but also from advancing age. In addition, diagnosis is
difficult, since cervical spondylotic myelopathy is evaluated
from a doctor's neurological observations and imaging findings.
As a quantitative method for measuring the degree of disability,
the hand-operated triangle step test (TST, for short) has been
formulated. In this research, a fully automatic triangle step
counter apparatus is designed and developed to measure the degree
of disability accurately, according to the principle of the TST.
The step counter apparatus, shaped as a low triangular prism,
displays the number of steps onto each corner. Furthermore, the
apparatus has two modes of operation: one for measuring the
degree of disability and the other for rehabilitation exercise.
Given its usefulness, clinical trials should be conducted in the
near future.
Abstract: Crucial information barely visible to the human eye is
often embedded in a series of low resolution images taken of the
same scene. Super resolution reconstruction is the process of
combining several low resolution images into a single higher
resolution image. The ideal algorithm should be fast, and should add
sharpness and details, both at edges and in regions without adding
artifacts. In this paper, we propose a blind super resolution
reconstruction technique for linearly degraded images. In our
proposed technique the algorithm is divided into three parts:
image registration, wavelet-based fusion, and image restoration.
Three low resolution images are considered, which may be
sub-pixel shifted, rotated, blurred, or noisy. The sub-pixel
shifted images are registered using an affine transformation
model, a wavelet-based fusion is performed, and the noise is
removed using soft thresholding. Our proposed technique reduces
blocking artifacts, smooths the edges, and is also able to
restore high-frequency details in an image. Our technique is
efficient and computationally fast, with a clear prospect of
real-time implementation.
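The soft-thresholding step used for noise removal above is a standard wavelet-denoising operation, and the shrinkage rule can be sketched as follows (the threshold t is chosen by the denoising method and is not specified in the abstract):

```python
def soft_threshold(x, t):
    """Soft thresholding of a wavelet coefficient: shrink the
    coefficient toward zero by t, and zero out coefficients whose
    magnitude is below t (presumed noise)."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0
```

Applied to the detail coefficients after fusion, this suppresses small noise-dominated coefficients while only slightly attenuating the large coefficients that carry edges.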
Abstract: Flash memory has many advantages, such as low power consumption, strong shock resistance, fast I/O, and non-volatility, and it is increasingly used in mobile storage devices. YAFFS, one of the NAND flash file systems, is widely used in embedded devices. However, the existing YAFFS takes a long time to mount the file system because it scans the whole spare area in every page of the NAND flash memory. To solve this problem, we propose a new content-based flash file system using a mounting-time reduction technique. The proposed method scans only the partial spare areas of some special pages by using content-based block management. The experimental results show that the proposed method reduces the average mounting time by 87.2% compared with JFFS2 and by 69.9% compared with YAFFS.
Abstract: There are three distinct stages in the evolution of
economic thought, namely:
1. in the first stage, the major concern was to accelerate
economic growth with increased availability of material
goods, especially in developing economies with very low
living standards, because poverty eradication meant faster
economic growth.
2. in the second stage, economists made a distinction between
growth and development. Development was seen as going
beyond economic growth, bringing certain changes in
the structure of the economy with more equitable
distribution of the benefits of growth, with the growth
becoming automatic and sustained.
3. the third stage has now been reached. Our concern is now with
"sustainable development", that is, development not only
for the present but also for the future.
Thus the focus changed from "sustained growth" to "sustainable
development". Sustainable development brings to the fore the
long-term relationship between ecology and economic development.
Since its creation in 1972, UNEP has worked for development
without destruction, for environmentally sound and sustainable
development. It was realised that the environment cannot be
viewed in a vacuum: it is not separate from development, nor in
competition with it. UNEP advocated the integration of the
environment with development, whereby ecological factors enter
development planning, socio-economic policies, cost-benefit
analysis, trade, technology transfer, waste management,
education, and other specific areas.
Industrialisation has contributed to the economic growth of
several countries. It has improved the standard of living of
their people and provided benefits to society. In the process, it
has also created great environmental problems such as climate
change, forest destruction and denudation, soil erosion, and
desertification.
On the other hand, industry has provided jobs and improved the
prospects of wealth for the industrialists. Working-class
communities had simply to put up with high levels of pollution in
order to keep their jobs and preserve their income.
There are many roots of the environmental problem: they may lie
in the political, economic, cultural, and technological
conditions of modern society. The experts concede that industrial growth lies
somewhere close to the heart of the matter. Therefore, the objective
of this paper is not to document all roots of an environmental crisis
but rather to discuss the effects of industrial growth and
development.
We have come to the conclusion that although public intervention
is often unnecessary to ensure that perfectly competitive markets
function in society's best interests, such intervention is
necessary when firms or consumers pollute.
Abstract: This paper presents a very simple and efficient
algorithm for codebook search, which reduces a great deal of
computation as compared to the full codebook search. The algorithm
is based on sorting and a centroid technique for the search. The
results table shows the effectiveness of the proposed algorithm
in terms of computational complexity. In this paper we also
introduce a new performance parameter, named the average
fractional change in pixel value, as we feel that it gives a
better understanding of the closeness of the image, since it is
related to perception. This new performance parameter takes into
consideration the average fractional change in each pixel value.
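One plausible reading of the proposed metric, assuming it averages the per-pixel fractional change (the abstract does not give the exact formula, so this is an illustrative sketch, not the authors' definition):

```python
def avg_fractional_change(original, reconstructed):
    """Mean of |change| / original value over all pixels,
    assuming nonzero original pixel values."""
    total = 0.0
    count = 0
    for o, r in zip(original, reconstructed):
        total += abs(o - r) / o  # fractional change of this pixel
        count += 1
    return total / count
```

Under this reading, a reconstruction that changes a 100-valued pixel to 90 and a 200-valued pixel to 220 scores 0.1, i.e. a 10% average fractional change.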
Abstract: Every day, human life encounters new equipment that is
more automatic and more capable, so the need for faster
processors shows no sign of ending. Despite new architectures and
higher frequencies, a single processor is not adequate for many
applications. Parallel processing and networks are earlier
solutions to this problem. The newer solution of putting a
network of resources on a chip is called a network on a chip
(NoC). The most common topology for an NoC is the mesh topology.
There are several routing algorithms suitable for this topology,
such as XY, fully adaptive, etc. In this paper we suggest a new
algorithm named Intermittent X, Y (IX/Y). We have implemented the
new algorithm in a simulation environment to compare its delay
and power consumption with older algorithms.
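The IX/Y algorithm itself is not specified in the abstract; as context, the baseline deterministic XY routing it departs from can be sketched as follows (the coordinate convention is illustrative):

```python
def xy_route(src, dst):
    """Deterministic XY routing in a mesh NoC: route fully along
    the X dimension first, then along Y. Returns the list of
    router coordinates visited, including source and destination."""
    x, y = src
    dx, dy = dst
    path = [(x, y)]
    while x != dx:                      # X dimension first
        x += 1 if dx > x else -1
        path.append((x, y))
    while y != dy:                      # then Y dimension
        y += 1 if dy > y else -1
        path.append((x, y))
    return path
```

Because every packet resolves X before Y, XY routing is deadlock-free on a mesh but cannot route around congestion, which is what adaptive variants try to improve.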
Abstract: Web-based technologies have created numerous
opportunities for electronic word-of-mouth (eWOM) communication.
There are many factors that affect customers' adoption and
decision-making processes. However, only a few studies focus on
factors such as the length of forum membership and the propensity
to trust. Using a discrete-time event simulation of a diffusion
model along with a consumer decision model, this study shows the
effect of each factor on the adoption of opinions in an online
discussion forum. The purpose of this study is to examine the
effect of factors affecting information adoption and the
decision-making process. The model is constructed to test
quantitative aspects of each factor. The simulation shows that
membership time and the propensity to trust have an effect on
information adoption and purchasing decisions. The results
indicate that longer membership in the communities and a higher
propensity to trust lead to higher demand rates, because
consumers find it easier and faster to trust a person in the
community and then adopt the eWOM. Other implications for both
researchers and practitioners are provided.
Abstract: Wood pyrolysis of Casuarina glauca, Casuarina cunninghamiana, Eucalyptus camaldulensis, and Eucalyptus microtheca was carried out at 450°C at 2.5°C/min in a flowing N2 atmosphere. Wood of the Eucalyptus genus gave higher values of specific gravity, ash, total extractives, lignin, N2-liquid trap distillate (NLTD), and water trap distillate (WSP) than the Casuarina genus. The GHC of the NLTD was higher for the Casuarina genus than for the Eucalyptus genus, with the highest value for Casuarina cunninghamiana. Guaiacol, 4-ethyl-2-methoxyphenol, and syringol were observed in the NLTD of all four wood species, reflecting their parent hardwood lignin origin. Eucalyptus camaldulensis wood had the highest lignin content (28.89%) and pyrolyzed to the highest contents of phenolics (73.01%), guaiacol (11.2%), and syringol (32.28%) in the methylene chloride fraction (MCF) of the NLTD. Accordingly, recovery of syringol and guaiacol from Eucalyptus camaldulensis may become economically attractive.
Abstract: We present a method for fast volume rendering using
graphics hardware (GPU). To our knowledge, it is the first such
implementation on the GPU. Based on the Shear-Warp algorithm, our
GPU-based method provides real-time frame rates and outperforms
the CPU-based implementation. When the number of slices is not
sufficient, we add in-between slices computed by interpolation,
which improves the quality of the rendered images. We have also
implemented the ray marching algorithm on the GPU. The results
generated by the three algorithms (CPU-based and GPU-based
Shear-Warp, GPU-based Ray Marching) for two test models show that
the ray marching algorithm outperforms the shear-warp methods in
terms of speed-up and image quality.
Abstract: This paper proposes a smart design strategy for a sequential detector that reliably detects the primary user's signal, especially in fast-fading environments. We study the computation of the log-likelihood ratio for coping with fast-changing received signal and noise sample variances, which are considered random variables. First, we analyze the detectability of the conventional generalized log-likelihood ratio (GLLR) scheme when considering the fast-changing statistics of unknown parameters caused by fast-fading effects. Secondly, we propose an efficient sensing algorithm that performs the sequential probability ratio test in a robust and efficient manner when the channel statistics are unknown. Finally, the proposed scheme is compared with the conventional method through simulation results, with respect to the average number of samples required to reach a detection decision.
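The sequential probability ratio test that the detector above performs follows Wald's classic scheme: accumulate log-likelihood ratios sample by sample until one of two thresholds is crossed. A generic sketch (the paper's robust LLR computation under unknown channel statistics is not reproduced; `llr_fn` stands in for it):

```python
import math

def sprt(samples, llr_fn, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test.

    llr_fn maps one sample to its log-likelihood ratio
    log(p1(x)/p0(x)); the stopping thresholds follow from the
    target false-alarm (alpha) and missed-detection (beta) rates.
    Returns the decision and the number of samples consumed.
    """
    upper = math.log((1 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += llr_fn(x)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)
```

The "average number of samples required to reach a detection decision" compared in the abstract is exactly the average stopping index `n` returned by such a test.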