Abstract: The past decade has witnessed good opportunities for city development schemes in the UK. The government encouraged the restoration of city centres to comprise mixed-use developments with high-density residential apartments. Investments in regeneration areas were performing well according to analyses by the Investment Property Databank (IPD). However, more recent IPD analysis has shown that since 2007, property in regeneration areas has been more vulnerable to the market downturn than other types of investment property. The early stages of a property market downturn may be felt most in regeneration, where funding, investor confidence and occupier demand dissipate because the sector is considered more marginal or risky when development costs rise. Moreover, Bank of England surveys show that lenders have progressively tightened the availability of credit for commercial real estate since mid-2007, and a sharp reduction in the willingness of banks to lend on commercial property has been recorded. The credit crunch has already affected commercial property, but its impact has been particularly severe where residential developments are extremely difficult, in particular city-centre apartments and buy-to-let markets. Commercial property (retail, industrial, leisure and mixed use) has also come under pressure; in Birmingham, dozens of mixed-use plots were built to replace old factories in the heart of the city. The purpose of these developments was to enable young professionals to work and live in the same place. Thousands of people lost their jobs during the recession, lending became more difficult, and the future of many developments is unknown. The recession has also cast its shadow over society: cuts in public spending, inflation, rising tuition fees and a steep rise in unemployment generated anger among young people, leading to vandalism and riots in many cities. Recent riots targeted many mixed-use developments in the UK, where banks, shops, restaurants and large stores were looted and set on fire, leaving residents in horror and shock. This paper examines the impact of the recession and riots on mixed-use development in the UK.
Abstract: In this paper, we propose a Perceptually Optimized Embedded ZeroTree Image Coder (POEZIC) that applies perceptual weighting to wavelet transform coefficients prior to SPIHT encoding in order to reach a targeted bit rate with improved perceptual quality relative to coding with the SPIHT algorithm alone. The paper also introduces a new objective quality metric based on a psychovisual model that integrates the properties of the human visual system (HVS) and plays an important role in POEZIC quality assessment. The POEZIC coder is based on a vision model that incorporates various masking effects of HVS perception: it weights the wavelet coefficients according to that model and attempts to increase the perceptual quality for a given bit rate and observation distance. The perceptual weights for all wavelet subbands are computed based on 1) luminance masking and contrast masking, 2) the contrast sensitivity function (CSF), used to achieve the perceptual decomposition weighting, and 3) the Wavelet Error Sensitivity (WES), used to reduce perceptual quantization errors. The new perceptually optimized codec has the same complexity as the original SPIHT technique; nevertheless, the experimental results show that our coder performs very well in terms of quality measurement.
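The per-subband weighting idea can be sketched as follows. This is a minimal illustration, not the paper's actual method: the one-level Haar transform and the weight values stand in for the paper's wavelet decomposition and its CSF/WES-derived weights, which the abstract does not specify.

```python
import numpy as np

def haar2d(img):
    """One-level 2-D Haar transform: average/difference along rows, then columns."""
    def step(x):  # operates along the last axis
        return (x[..., 0::2] + x[..., 1::2]) / 2.0, (x[..., 0::2] - x[..., 1::2]) / 2.0
    la, ld = step(img)                 # horizontal pass
    ll, lh = (m.T for m in step(la.T)) # vertical pass on the approximation
    hl, hh = (m.T for m in step(ld.T)) # vertical pass on the detail
    return ll, lh, hl, hh

# Hypothetical perceptual weights: the eye is typically least sensitive
# to diagonal detail (HH), so it is weighted down most before encoding.
WEIGHTS = {"ll": 1.0, "lh": 0.7, "hl": 0.7, "hh": 0.4}

img = np.arange(64, dtype=float).reshape(8, 8)
ll, lh, hl, hh = haar2d(img)
weighted = {"ll": WEIGHTS["ll"] * ll, "lh": WEIGHTS["lh"] * lh,
            "hl": WEIGHTS["hl"] * hl, "hh": WEIGHTS["hh"] * hh}
```

The weighted coefficients would then be passed to the zerotree (SPIHT) encoder, so that bits are spent preferentially on the subbands the HVS is most sensitive to.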
Abstract: This paper presents a hybrid fuzzy-PD plus PID (HFPP) controller and its application to a steam distillation process for an essential oil extraction system. Steam temperature is one of the most significant parameters influencing the composition of the essential oil yield. Because of parameter variations and changes in operating conditions during distillation, a robust steam temperature controller is essential to avoid degradation of essential oil quality. Initially, a PRBS input is applied to the system, and the steam temperature output is modeled using an ARX model structure. Parameter estimation and tuning are carried out in simulation using the HFPP controller scheme. The effectiveness and robustness of the proposed controller are validated by real-time implementation on the system. The performance of the HFPP controller using 25 and 49 fuzzy rules is compared. The experimental results demonstrate that the proposed HFPP with 49 fuzzy rules achieves better, more consistent and more robust control than PID in tests of set-point tracking and disturbance rejection.
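For context, the PID baseline the paper compares against can be sketched on a toy plant. The first-order model and the gains below are illustrative assumptions, not the paper's identified ARX model or its tuning:

```python
# First-order discrete plant standing in for the steam-temperature dynamics,
# in the spirit of an ARX model: y[k+1] = a*y[k] + b*u[k]. Coefficients are illustrative.
a, b, dt = 0.9, 0.1, 0.1
Kp, Ki, Kd = 1.0, 2.0, 0.05   # hypothetical PID gains, not the paper's tuning

y, integral, prev_err = 0.0, 0.0, 0.0
setpoint = 1.0                # normalized temperature set point
for _ in range(200):
    err = setpoint - y
    integral += err * dt                  # integral action removes steady-state error
    deriv = (err - prev_err) / dt         # derivative action adds damping
    u = Kp * err + Ki * integral + Kd * deriv
    prev_err = err
    y = a * y + b * u                     # plant update
```

A fuzzy-PD term, as in the HFPP scheme, would replace or augment the fixed-gain terms with rule-based outputs, which is what gives robustness to parameter variations during distillation.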
Abstract: In this paper, we investigate the strategic stochastic air traffic flow management problem, which seeks to balance airspace capacity and demand under weather disruptions. The goal is to reduce the need for myopic tactical decisions that do not account for probabilistic knowledge about near-future states of the National Airspace System (NAS). We present and discuss a scenario-based modeling approach based on a time-space stochastic process to depict weather disruption occurrences in the NAS. A solution framework is also proposed, along with a distributed implementation aimed at overcoming scalability problems. Issues related to this implementation are also discussed.
Abstract: EGOTHOR is a search engine that indexes the Web and allows users to search Web documents. Its hit list contains the URL and title of each hit, together with a snippet that briefly shows the match. The snippet can almost always be assembled by an algorithm that has full knowledge of the original document (mostly an HTML page). This implies that the search engine must store the full text of the documents as part of the index.
Such a requirement leads us to choose an appropriate compression algorithm to reduce the space demand. One solution would be to use common compression methods, for instance gzip or bzip2, but it may be preferable to develop a new method that takes advantage of the document structure, or rather, the textual character of the documents.
Special text compression algorithms and methods for compressing XML documents already exist. The aim of this paper is to integrate the two approaches to achieve an optimal compression ratio.
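The baseline the abstract mentions (general-purpose compressors applied to stored page text) can be sketched with the standard library. The toy document below is an assumption; real crawled pages would come from the index:

```python
import bz2
import zlib

# A toy HTML document standing in for a crawled page; the repetition
# mimics the highly redundant markup structure of real pages.
doc = ("<html><body>"
       + "<p>full-text snippet for the hit list</p>" * 300
       + "</body></html>").encode("utf-8")

gzip_like = zlib.compress(doc, 9)   # DEFLATE, the algorithm behind gzip
bzip2_out = bz2.compress(doc, 9)    # BWT-based bzip2

ratio_gzip = len(gzip_like) / len(doc)
ratio_bzip2 = len(bzip2_out) / len(doc)
```

A structure-aware method, as the paper proposes, would go further by modeling the markup and the textual character of documents separately rather than treating the page as an opaque byte stream.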
Abstract: The aim of the present study is to analyze empirical research on the social resources dimension of the occupational status attainment process and relate it to the rational choice approach. The analysis suggests that the existing data on the strength-of-ties aspect of social resources are insufficient and do not permit any inference concerning rational actors' behavior. However, the results concerning the work-relation aspect are more encouraging.
Abstract: Although many studies on assembly technology for bridge construction have dealt mostly with the pier, girder or deck of the bridge, studies on prefabricated barriers have rarely been performed. To understand the structural characteristics and application of the concrete barrier in the modular bridge, which is an assembly of structural members, a static loading test was performed. The structural performance as a road barrier of three methods, conventional cast-in-place (ST), vertical bolt connection (BVC) and horizontal bolt connection (BHC), was evaluated and compared through analyses of load-displacement curves, steel strain curves, concrete strain curves and the visual appearance of crack patterns. The vertical bolt connection (BVC) method demonstrated performance comparable to conventional cast-in-place (ST) construction while providing all the advantages of prefabrication. The need for future improvements in nut fastening, as well as in legal standards and regulations, is also addressed.
Abstract: Dengue is an infectious vector-borne viral disease commonly found in tropical and sub-tropical regions around the world, especially in urban and semi-urban areas, including Malaysia. There is currently no vaccine or chemotherapy available for the prevention or treatment of dengue, so prevention and treatment of the disease depend on vector surveillance and control measures. Disease risk mapping has been recognized as an important tool in prevention and control strategies for diseases, and the choice of statistical model used for relative risk estimation is important, as a good model will subsequently produce a good disease risk map. The aim of this study is therefore to estimate the relative risk for dengue based initially on the most common statistic used in disease mapping, the Standardized Morbidity Ratio (SMR), and on one of the earliest applications of Bayesian methodology, the Poisson-gamma model. This paper begins with a review of the SMR method, which we then apply to dengue data from Perak, Malaysia. We then fit an extension of the SMR method, the Poisson-gamma model. Both sets of results are displayed and compared using graphs, tables and maps. The analysis shows that the latter method gives better relative risk estimates than the SMR. The Poisson-gamma model is demonstrated to overcome the problem of the SMR when there are no observed dengue cases in certain regions. However, covariate adjustment in this model is difficult, and there is no possibility of allowing for spatial correlation between risks in adjacent areas. The drawbacks of this model have motivated many researchers to propose alternative methods for estimating the risk.
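The two estimators being compared can be sketched on toy counts. The case numbers and the Gamma prior parameters below are illustrative assumptions, not the Perak data:

```python
# Toy data for four regions: observed dengue cases and expected counts
observed = [12, 0, 7, 30]          # note that region 2 has zero observed cases
expected = [10.0, 4.0, 9.0, 22.0]

# Standardized Morbidity Ratio: SMR_i = O_i / E_i
smr = [o / e for o, e in zip(observed, expected)]

# Poisson-gamma model: with a Gamma(alpha, beta) prior on the relative risk,
# the posterior mean risk for region i is (alpha + O_i) / (beta + E_i).
alpha, beta = 2.0, 2.0             # hypothetical prior parameters
pg_risk = [(alpha + o) / (beta + e) for o, e in zip(observed, expected)]
```

For the region with zero observed cases, the SMR collapses to 0, whereas the Poisson-gamma posterior mean stays positive and is shrunk toward the prior mean, which is exactly the shortcoming of the SMR that the abstract says the model overcomes.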
Abstract: In this paper, full state feedback controllers capable of regulating and tracking the speed trajectory are presented. A fourth-order nonlinear mean value model of a 448 kW turbocharged diesel engine, published earlier, is used for this purpose. For controller design, the nonlinear model is linearized and represented in state-space form. Full state feedback controllers capable of meeting varying driver speed demands are presented. The main focus here is to investigate the sensitivity of the controller to perturbations in the parameters of the original nonlinear model. The suggested controller is shown to be highly insensitive to parameter variations, indicating that it is likely to perform with the same accuracy even after significant wear and tear of the engine over years of use.
Abstract: This paper presents a system overview of Mobile to Server Face Recognition, a face recognition application developed specifically for mobile phones. Images taken with mobile phone cameras lack quality due to the low resolution of the cameras, so a prototype was developed to test the chosen method. However, this paper reports on the system backbone without the face recognition functionality. The results demonstrated in this paper indicate that the interaction between mobile phones and the server works successfully; these results were obtained before the database was completely ready. System testing is currently ongoing, using real images and a mock-up database to test the functionality of the face recognition algorithm used in this system. An overview of the whole system, including screenshots and a system flowchart, is presented in this paper. This paper also presents the motivation for and justification of developing this system.
Abstract: The decoding of Low-Density Parity-Check (LDPC) codes operates over a redundant structure known as the bipartite graph, meaning that the full set of bit nodes is not absolutely necessary for decoder convergence. In 2008, Soyjaudah and Catherine designed a recovery algorithm for LDPC codes based on this assumption and showed that the error-correcting performance of their codes outperformed conventional LDPC codes. In this work, the use of the recovery algorithm is further explored to test the performance of LDPC codes as the number of iterations is progressively increased. For experiments conducted with small block lengths of up to 800 bits and up to 2000 iterations, the results interestingly demonstrate that, contrary to conventional wisdom, the error-correcting performance keeps increasing with the number of iterations.
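To illustrate iterative decoding over a parity-check bipartite graph, here is a standard Gallager-style bit-flipping decoder on a tiny (7,4) Hamming parity-check matrix. This is a generic textbook sketch, not the recovery algorithm of Soyjaudah and Catherine:

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code, used here as a small
# stand-in for an LDPC code's sparse parity-check matrix.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def bit_flip_decode(r, H, max_iters=2000):
    """Iteratively flip the bit involved in the most unsatisfied checks."""
    r = r.copy()
    for it in range(max_iters):
        syndrome = H @ r % 2
        if not syndrome.any():
            return r, it                 # all parity checks satisfied
        unsat = H.T @ syndrome           # per-bit count of failing checks
        r[np.argmax(unsat)] ^= 1         # flip the most suspicious bit
    return r, max_iters

codeword = np.zeros(7, dtype=int)        # all-zero word is always a codeword
received = codeword.copy()
received[0] ^= 1                         # inject a single bit error
decoded, iters = bit_flip_decode(received, H)
```

Each pass over the loop is one decoding iteration; the paper's experiments amount to observing how error-correcting performance behaves as `max_iters` is pushed up to 2000.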
Abstract: The number of features required to represent an image can be very large, and using all available features to recognize objects can suffer from the curse of dimensionality. Feature selection and extraction is the pre-processing step of image mining. The main issues in analyzing images are the effective identification of features and their extraction. The mining problem addressed here is the grouping of features for different shapes. Experiments were conducted using the shape outline as the feature. Shape outline readings are put through normalization and a dimensionality reduction process using an eigenvector-based method to produce a new set of readings. After this pre-processing step, the data are grouped by shape. Through statistical analysis of these readings together with peak measures, a robust classification and recognition process is achieved. Tests showed that the suggested methods are able to automatically recognize objects through their shapes. Finally, experiments also demonstrate that the system is invariant to rotation, translation, scale, reflection and, to a small degree, distortion.
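The normalization plus eigenvector-based reduction step can be sketched as a standard PCA via eigendecomposition of the covariance matrix. The synthetic "readings" below are an assumption standing in for real shape outline data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic shape-outline readings: 50 samples of 6 features that really
# live on a 2-D subspace, plus a little noise.
base = rng.normal(size=(50, 2))
X = base @ rng.normal(size=(2, 6)) + 0.01 * rng.normal(size=(50, 6))

X -= X.mean(axis=0)                        # normalization (mean-centering)
cov = np.cov(X, rowvar=False)              # feature covariance matrix
vals, vecs = np.linalg.eigh(cov)           # eigenvector-based decomposition
order = np.argsort(vals)[::-1]             # sort eigenvalues descending
W = vecs[:, order[:2]]                     # keep the top-2 eigenvectors
reduced = X @ W                            # the new, lower-dimensional readings
```

The `reduced` readings would then feed the grouping and classification stage described in the abstract.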
Abstract: This paper demonstrates a bus location system for route buses through experiments in a real environment. A bus location system provides information such as bus delays and positions. Such a system uses the actual service and position data of buses, and this information must match the data in the database. The system faces two potential problems: it can be expensive to prepare devices for obtaining bus positions, and it can be difficult to match the service data of buses. To avoid these problems, we developed the system at low cost and in a short time by using smartphones with GPS together with the bus route system. The system realizes path planning that takes bus delays into account and displays bus positions on a map. The bus location system was demonstrated on route buses with smartphones for two months.
Abstract: Novel acrylated epoxidized hemp oil (AEHO) based
bioresins were successfully synthesised, characterized and applied to
biocomposites reinforced with woven jute fibre. Characterisation of
the synthesised AEHO consisted of acid number titrations and FTIR
spectroscopy to assess the success of the acrylation reaction. Three
different matrices were produced (vinylester (VE), 50/50 blend of
AEHO/VE and 100% AEHO) and reinforced with jute fibre to form
three different types of biocomposite samples. Mechanical properties
in the form of flexural and interlaminar shear strength (ILSS) were
investigated and compared for the different samples. Results from the
mechanical tests showed that AEHO and 50/50 based neat bioresins
displayed lower flexural properties compared with the VE samples.
However, when applied to biocomposites and compared with VE
based samples, AEHO biocomposites demonstrated comparable
flexural performance and improved ILSS. These results are attributed
to improved fibre-matrix interfacial adhesion due to surface-chemical
compatibility between the natural fibres and bioresin.
Abstract: A numerical simulation of micro Poiseuille flow has been performed for rarefied and compressible flow in the slip flow regime. Wall roughness is simulated in two cases, with triangular microelements and with random micro peaks distributed on the wall surfaces, to study the effects of roughness shape and distribution on the flow field. Two values of the Mach and Knudsen numbers have been used to investigate the effects of rarefaction as well as compressibility. The numerical results have also been checked against available theoretical and experimental relations, and good agreement has been achieved. A strong influence of roughness shape is observed for both compressible and incompressible rarefied flows. In addition, it is found that rarefaction has a more significant effect on the flow field in microchannels with higher relative roughness, and that compressibility has a more significant effect on the Poiseuille number as relative roughness increases.
Abstract: Property investment in the real estate industry carries high risk due to high costs and the uncertainty factors that affect decisions. The analytic hierarchy process has long been used, relying on expert opinion to measure the uncertainty of risk factors for risk analysis. However, experts with different levels of experience produce different opinions, leading to conflict among experts in the field. The objective of this paper is to propose a new technique for measuring the uncertainty of risk factors based on a multidimensional data model and data mining techniques as a deterministic approach. The proposed technique consists of a basic framework that includes four modules: user, technology, end-user access tools and applications. Property investment risk analysis is defined here as a micro-level analysis, since the features of the property are considered in the analysis.
Abstract: Brand names play a vital role in the in-shop buying behavior of consumers, and mutated brand names may affect the sales of leading branded products. In the Indian market, there are many products with mutated brand names that are either orthographically or phonologically similar to established brands. Because of such products, Indian consumers often become confused when buying regularly used items. The authors of the present paper have attempted to demonstrate a relationship between reduced attention and false recognition of mutated brand names during a product selection process. To this end, a visual attention study was conducted on 15 male college students using an eye tracker with a mutated brand name, and recognition errors were recorded using a questionnaire. Statistical analysis of the acquired data revealed more false recognition of the mutated brand name when less attention was paid during selection of the favorite product. Moreover, eye tracking proved to be an effective tool for analyzing false recognition of brand name mutation.
Abstract: This study was conducted to investigate the profile of hepatitis in the Kingdom of Saudi Arabia and to determine which age groups hepatitis viruses most commonly infect. The epidemiology of viral hepatitis in Saudi Arabia has undergone major changes, concurrent with major socioeconomic developments over the last two to three decades. The disease represents a major public health problem in Saudi Arabia, requiring considerable healthcare resources. A retrospective cross-sectional analysis of reported cases of viral hepatitis was conducted based on the reports of the Ministry of Health in Saudi Arabia on hepatitis A, B and C infections in all regions from January 2006 to December 2010. The study demonstrated that the incidence of viral hepatitis is decreasing, except for hepatitis B, which showed a minimal increase. Of hepatitis A, B and C, hepatitis B virus (HBV) was the most predominant type, accounting for 53% of cases, followed by hepatitis C virus (HCV) (30%) and HAV (17%). HAV infection predominates in children (5–14 years), accounting for 60% of viral hepatitis cases in that group; HBV in young adults (15–44 years), with 69%; and HCV in older adults (>45 years), with 59%. Despite significant changes in the prevalence of viral hepatitis A, B and C, it remains a major public health problem in Saudi Arabia; however, it has shown a significant decline over the last two decades that could be attributed to vaccination programs and improved health facilities. Further research is needed to identify the risk factors that make a specific age group or a specific region in Saudi Arabia susceptible to a specific type of hepatitis virus.
Abstract: This paper presents a dynamic adaptation scheme for the frequency of inter-deme migration in distributed genetic algorithms (GAs), together with its VLSI hardware design. A distributed, or multi-deme, GA uses multiple populations that evolve concurrently. The purpose of the dynamic adaptation is to improve convergence performance and thereby obtain better solutions. Through simulation experiments, we show that our scheme achieves better performance than fixed-frequency migration schemes.
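The idea of adapting migration frequency can be sketched in software as follows. This is a hypothetical illustration on the OneMax toy problem with a simple stagnation-based adaptation rule; the paper's actual scheme and its VLSI realization are not specified in the abstract:

```python
import random

random.seed(7)
GENES, POP, DEMES = 32, 20, 2

def fitness(ind):
    return sum(ind)  # OneMax: count of 1-bits, a toy objective

def next_generation(pop):
    """Tournament selection, uniform crossover, occasional single-bit mutation."""
    new = []
    for _ in range(len(pop)):
        p1 = max(random.sample(pop, 3), key=fitness)
        p2 = max(random.sample(pop, 3), key=fitness)
        child = [g1 if random.random() < 0.5 else g2 for g1, g2 in zip(p1, p2)]
        if random.random() < 0.4:
            child[random.randrange(GENES)] ^= 1
        new.append(child)
    return new

# Two demes evolving concurrently (sequentially simulated here)
demes = [[[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
         for _ in range(DEMES)]
best = max(fitness(i) for d in demes for i in d)
initial_best = best
interval, stagnant = 8, 0              # migration interval adapts to stagnation
for gen in range(150):
    demes = [next_generation(d) for d in demes]
    cur = max(fitness(i) for d in demes for i in d)
    if cur > best:
        best, stagnant = cur, 0
    else:
        stagnant += 1
    if stagnant >= interval:           # stagnation detected: migrate now,
        for s in range(DEMES):         # and migrate sooner next time
            donor = max(demes[s], key=fitness)
            dest = demes[(s + 1) % DEMES]
            worst = min(range(POP), key=lambda k: fitness(dest[k]))
            dest[worst] = donor[:]     # best of one deme replaces worst of the next
        interval = max(2, interval - 1)
        stagnant = 0
```

Shortening the migration interval when progress stalls is one plausible adaptation policy; a hardware design would implement the same decision logic alongside the per-deme evolution pipelines.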
Abstract: Fourier transform infrared (FT-IR) spectroscopic imaging
is an emerging technique that provides both chemically and
spatially resolved information. The rich chemical content of data
may be utilized for computer-aided determinations of structure and
pathologic state (cancer diagnosis) in histological tissue sections for
prostate cancer. FT-IR spectroscopic imaging of prostate tissue has
shown that tissue type (histological) classification can be performed to
a high degree of accuracy [1] and cancer diagnosis can be performed
with an accuracy of about 80% [2] on a microscopic (≈ 6μm)
length scale. In performing these analyses, it has been observed
that there is large variability (more than 60%) between spectra from
different points on tissue that is expected to consist of the same
essential chemical constituents. Spectra at the edges of tissues are
characteristically and consistently different from chemically similar
tissue in the middle of the same sample. Here, we explain these
differences using a rigorous electromagnetic model for light-sample
interaction. Spectra from FT-IR spectroscopic imaging of chemically
heterogeneous samples are different from bulk spectra of individual
chemical constituents of the sample. This is because spectra not
only depend on chemistry, but also on the shape of the sample.
Using coupled wave analysis, we characterize and quantify the nature
of spectral distortions at the edges of tissues. Furthermore, we
present a method of performing histological classification of tissue
samples. Since the mid-infrared spectrum is typically assumed to
be a quantitative measure of chemical composition, classification
results can vary widely due to spectral distortions. However, we
demonstrate that the selection of localized metrics based on chemical
information can make our data robust to the spectral distortions
caused by scattering at the tissue boundary.