Abstract: Recently, many researchers have been drawn to retrieving items from multimedia databases using impression words and their values. Ikezoe's research is a representative example and uses eight pairs of opposite impression words. We previously modified its retrieval interface and proposed '2D-RIB'. In 2D-RIB, after a user selects a single piece of music as a base, the system visually displays other pieces around the base one according to their relative positions. The user can then select the piece that best fits his/her intention as the retrieval result. The purpose of this paper is to improve user satisfaction with the retrieval results in 2D-RIB. One of our extensions is to define and introduce the following two measures: 'melody goodness' and 'general acceptance'. We implement them in five different combinations. According to an evaluation experiment, both of these measures can contribute to the improvement. Another extension is three types of customization. We have implemented them and clarified which customization is effective.
Abstract: Lipases are enzymes particularly amenable to immobilization by entrapment methods, as they can work equally well in aqueous or non-conventional media, and long-term stability of enzyme activity and enantioselectivity is needed to develop more efficient bioprocesses. The improvement of Pseudomonas
fluorescens (Amano AK) lipase characteristics was investigated by
optimizing the immobilization procedure in hybrid organic-inorganic
matrices using ionic liquids as additives. Ionic liquids containing a
more hydrophobic alkyl group in the cationic moiety are beneficial
for the activity of immobilized lipase. Silanes with alkyl or aryl non-hydrolyzable groups used as precursors in combination with
tetramethoxysilane could generate composites with higher
enantioselectivity compared to the native enzyme in acylation
reactions of secondary alcohols. The optimal effect on both activity
and enantioselectivity was achieved for the composite made from
octyltrimethoxysilane and tetramethoxysilane at 1:1 molar ratio (60%
increase of total activity following immobilization and enantiomeric
ratio of 30). Ionic liquids also demonstrated valuable properties as
reaction media for the studied reactions, comparable with the usual
organic solvent, hexane.
Abstract: The purpose of this study was to explore the complex flow structure in a novel active-type micromixer based on the concept of a Wankel-type rotor. The characteristics of this micromixer are twofold: rapid mixing of reagents in a limited space due to the generation of multiple vortices, and a gradual increase in dynamic pressure as the mixed reagents are delivered to the output ports. The present micromixer consists of a rotor shaped like a triangular column, a blending chamber, and several inlet and outlet ports. The geometry of the blending chamber is designed so that the rotor can rotate freely inside it with a constant eccentricity ratio. With the shapes of the blending chamber and the rotor fixed, the effects of the rotor's rotating speed and the relative locations of the ports on the mixing efficiency are studied numerically. The governing equations are the unsteady, two-dimensional incompressible Navier-Stokes equations, and the working fluid is water. The species concentration equation is also solved to reveal the mass transfer process of the reagents in various regions and thereby evaluate the mixing efficiency.
The dynamic mesh technique was implemented to model the dynamic volume shrinkage and expansion of the three individual sub-regions of the blending chamber as the rotor completes a full rotating cycle. Six port configurations are considered for their effect on mixing efficiency over a range of Reynolds numbers from 10 to 300. Rapid mixing is accomplished by the multiple vortex structures within a tiny space due to the equilibrium of shear, viscous, and inertial forces. Results showed that the highest mixing efficiency could be attained under the following conditions: a two-inlet, two-outlet port configuration with an included angle of 60 degrees between the two inlets and an included angle of 120 degrees between the inlet and outlet ports, at Re=10.
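The abstract does not state which mixing-efficiency metric is used, but a common choice when a concentration field is available is the standard-deviation mixing index. The sketch below (function and variable names are ours, not the paper's) illustrates that definition:

```python
import math

def mixing_efficiency(concentrations, c_perfect=0.5):
    """Mixing index eta = 1 - sigma/sigma_max, a common definition.
    concentrations: sampled species mass fractions (0..1) across an outlet.
    c_perfect: concentration of a perfectly mixed stream (0.5 for equal feeds).
    """
    n = len(concentrations)
    sigma = math.sqrt(sum((c - c_perfect) ** 2 for c in concentrations) / n)
    sigma_max = math.sqrt(c_perfect * (1.0 - c_perfect))  # fully segregated case
    return 1.0 - sigma / sigma_max

# Fully segregated: half the samples are pure reagent A, half pure B.
print(mixing_efficiency([0.0] * 50 + [1.0] * 50))  # -> 0.0
# Perfectly mixed: every sample sits at 0.5.
print(mixing_efficiency([0.5] * 100))              # -> 1.0
```

The index runs from 0 (unmixed) to 1 (fully mixed), which makes port configurations directly comparable across Reynolds numbers.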
Abstract: The aim of the current study was to investigate changes in the quality parameters of Holstein bull semen during heat stress and the effect of feeding a source of omega-3 fatty acids during this period. Samples were obtained from 19 Holstein bulls during the expected period of heat stress in Iran (June to September 2009). The control group (n=10) was fed a standard concentrate feed, while the treatment group (n=9) had this feed top-dressed with 100 g of an omega-3 enriched nutraceutical. Semen quality was assessed on ejaculates collected after 1, 5, 9 and 12 weeks of supplementation. Computer-assisted assessment of sperm motility, viability (eosin-nigrosin) and the hypo-osmotic swelling test (HOST) were conducted. Heat stress affected sperm quality parameters by weeks 5 and 9 (p
Abstract: A manufacturing inventory model with shortages is considered here in which the carrying cost, shortage cost, setup cost and demand quantity are imprecise numbers, namely interval numbers, instead of real numbers. First, a brief survey of existing work on comparing and ranking any two interval numbers on the real line is presented. A common algorithm for the optimum production quantity (economic lot size) per cycle of a single product (so as to minimize the total average cost) is developed, which works well for the interval-number optimization under consideration. Finally, the designed algorithm is illustrated with a numerical example.
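The paper's own algorithm is not reproduced in the abstract; as a hedged illustration of the underlying idea, the sketch below extends the classical EOQ-with-shortages formula to interval-valued parameters by exploiting its monotonicity in each parameter. All function names and the sample intervals are ours:

```python
import math

def eoq_with_shortages(D, A, h, s):
    """Classical economic lot size with backorders allowed.
    D: demand rate, A: setup cost, h: carrying cost, s: shortage cost.
    Q* = sqrt(2*D*A/h * (h+s)/s)."""
    return math.sqrt(2.0 * D * A / h * (h + s) / s)

def interval_eoq(D, A, h, s):
    """D, A, h, s are (lo, hi) intervals. Q* is increasing in D and A and
    decreasing in h and s, so the endpoints of the interval-valued lot size
    are attained at the corresponding parameter endpoints."""
    lo = eoq_with_shortages(D[0], A[0], h[1], s[1])
    hi = eoq_with_shortages(D[1], A[1], h[0], s[0])
    return (lo, hi)

q_lo, q_hi = interval_eoq(D=(900, 1100), A=(45, 55), h=(1.8, 2.2), s=(9, 11))
print(f"optimal lot size lies in [{q_lo:.1f}, {q_hi:.1f}]")
```

A ranking rule for interval numbers (as surveyed in the paper) would then be used to compare the total-average-cost intervals of candidate lot sizes.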
Abstract: The past decade has witnessed good opportunities for city development schemes in the UK. The government encouraged the restoration of city centres to comprise mixed-use developments with high-density residential apartments. Investments in regeneration areas were doing well according to the analyses of the Investment Property Databank (IPD). However, more recent analysis by IPD has shown that since 2007, property in regeneration areas has been more vulnerable to the market downturn than other types of investment property. The early stages of a property market downturn may be felt most in regeneration, where funding, investor confidence and occupier demand dissipate because the sector is considered more marginal or risky when development costs rise. Moreover, the Bank of England survey shows that lenders have progressively tightened the availability of credit for commercial real estate since mid-2007. A sharp reduction in the willingness of banks to lend on commercial property was recorded. The credit crunch has already affected commercial property, but its impact has been particularly severe for certain kinds of properties where residential development is extremely difficult, in particular city-centre apartments and buy-to-let markets. Commercial property (retail, industrial, leisure and mixed use) was also under pressure; in Birmingham, tens of mixed-use plots were built to replace old factories in the heart of the city. The purpose of these developments was to enable young professionals to work and live in the same place. Thousands of people lost their jobs during the recession; moreover, lending became more difficult and the future of many developments is unknown. The recession cast its shadow upon society: cuts in public spending by the government, inflation, rising tuition fees and a steep rise in unemployment generated anger, and hatred spread among youth, causing vandalism and riots in many cities. Recent riots targeted many mixed-use developments in the UK, where banks, shops, restaurants and big stores were robbed and set on fire, leaving residents in horror and shock. This paper examines the impact of the recession and riots on mixed-use development in the UK.
Abstract: In this paper, we propose a Perceptually Optimized Embedded ZeroTree Image Coder (POEZIC) that applies perceptual weighting to wavelet transform coefficients prior to the SPIHT encoding algorithm, in order to reach a targeted bit rate with an improvement in perceptual quality relative to the coding quality obtained using the SPIHT algorithm alone. The paper also introduces a new objective quality metric based on a psychovisual model that integrates the properties of the human visual system (HVS), which plays an important role in our POEZIC quality assessment. Our POEZIC coder is based on a vision model that incorporates various masking effects of HVS perception. Thus, our coder weights the wavelet coefficients based on that model and attempts to increase the perceptual quality for a given bit rate and observation distance. The perceptual weights for all wavelet subbands are computed based on 1) luminance masking and contrast masking, 2) the contrast sensitivity function (CSF), to achieve the perceptual decomposition weighting, and 3) the wavelet error sensitivity (WES), used to reduce the perceptual quantization errors. The new perceptually optimized codec has the same complexity as the original SPIHT technique. However, the experimental results show that our coder achieves very good performance in terms of quality measurement.
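The mechanism of pre-encoding perceptual weighting can be sketched as follows. The actual CSF/WES weights come from the paper's psychovisual model; the weight values and the subband naming below are made up purely for illustration:

```python
# Hypothetical per-(orientation, level) subband weights; larger = more
# visually important. Real values would come from the CSF/WES model.
CSF_WEIGHTS = {
    ("LL", 3): 1.0,
    ("HL", 3): 0.75, ("LH", 3): 0.75, ("HH", 3): 0.625,
    ("HL", 2): 0.5,  ("LH", 2): 0.5,  ("HH", 2): 0.375,
    ("HL", 1): 0.25, ("LH", 1): 0.25, ("HH", 1): 0.125,
}

def weight_subbands(subbands, weights=CSF_WEIGHTS):
    """Scale each subband's coefficients by its perceptual weight so that
    visually important coefficients look 'larger' to the SPIHT bit-plane
    coder and are therefore transmitted earlier."""
    return {key: [c * weights[key] for c in coeffs]
            for key, coeffs in subbands.items()}

def unweight_subbands(subbands, weights=CSF_WEIGHTS):
    """Inverse scaling applied after decoding, before the inverse DWT."""
    return {key: [c / weights[key] for c in coeffs]
            for key, coeffs in subbands.items()}

coeffs = {("HH", 1): [8.0, -4.0], ("LL", 3): [120.0]}
restored = unweight_subbands(weight_subbands(coeffs))
print(restored[("HH", 1)])  # -> [8.0, -4.0] (the weighting is invertible)
```

Because the weighting is invertible, it only changes the order in which bit planes are refined, not what can ultimately be reconstructed.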
Abstract: An iterative definition of any n-variable mean function is given in this article, which iteratively uses the two-variable form of the corresponding mean function. This extension method avoids recursivity, which is an important improvement compared with certain recursive formulas given before by Ando-Li-Mathias and Petz-Temesi. Furthermore, it is conjectured here that this iterative algorithm coincides with the solution of the Riemannian centroid minimization problem. Simulations are given to compare the convergence rates of the different algorithms in the literature, namely the gradient and Newton methods for Riemannian centroid computation.
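The flavor of such an iterative extension can be shown on scalars (a sketch of the general idea, not the paper's matrix algorithm): each entry is repeatedly replaced by the two-variable mean of cyclically adjacent entries until all entries agree. For the scalar geometric mean the common limit is exactly the n-variable geometric mean, because the product of the entries is invariant under each sweep:

```python
import math

def geo2(a, b):
    """Two-variable geometric mean."""
    return math.sqrt(a * b)

def iterate_mean(xs, mean2=geo2, tol=1e-12, max_iter=10_000):
    """Extend a two-variable mean to n variables by cyclic pairwise sweeps."""
    xs = list(xs)
    n = len(xs)
    for _ in range(max_iter):
        xs = [mean2(xs[i], xs[(i + 1) % n]) for i in range(n)]
        if max(xs) - min(xs) < tol:       # all entries have converged
            return xs[0]
    raise RuntimeError("did not converge")

vals = [1.0, 2.0, 4.0, 8.0]
print(iterate_mean(vals))                  # both lines print ~2.8284
print(math.prod(vals) ** (1 / len(vals)))  # n-variable geometric mean
```

In log-space each sweep is a doubly-stochastic averaging of adjacent entries, so the spread contracts geometrically; this is what makes the method non-recursive, unlike the Ando-Li-Mathias construction.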
Abstract: Fatigue cracking continues to be one of the main challenges in improving the performance of bituminous mixture pavements. The purpose of this paper is to examine some aspects of the effects of fine-aggregate properties on the fatigue behaviour of hot-mix asphalt. Two types of sand (quarry and mining sand), two conventional bitumens (PEN 50/60 and PEN 80/100) and four polymer-modified bitumens (PMB: PM1_82, PM1_76, PM2_82 and PM2_76) were used. Physical, chemical and mechanical tests were performed on the sands to determine their effect when incorporated into a bituminous mixture. According to the beam fatigue results, quarry sand, which has greater angularity, a rougher texture, higher shear strength and a higher percentage of aluminium oxide, presented higher resistance to fatigue. Also, PMB mixtures give better fatigue results than conventional mixtures, because the PMB has better viscosity properties than conventional bitumen.
Abstract: Crime is a major societal problem for most of the world's nations. Consequently, the police need to develop new methods to improve their efficiency in dealing with ever-increasing crime rates. Two of the common difficulties that the police face in crime control are crime investigation and the provision of crime information to the general public to help them protect themselves. Crime control in police operations involves the use of spatial data, crime data and related crime data from different organizations (depending on the nature of the analysis to be made). These types of data are collected from several heterogeneous sources, in different formats and from different platforms, resulting in a lack of standardization. Moreover, there is no standard framework for crime data collection, integration and dissemination through mobile devices. An investigation into the current situation in crime control was carried out to identify what is needed to resolve these issues. This paper proposes and investigates the use of service-oriented architecture (SOA) and mobile spatial information services in crime control. SOA plays an important role in crime control as an appropriate way to support data exchange and model sharing across heterogeneous sources. Crime control also needs to facilitate mobile spatial information services in order to exchange, receive, share and release location-based information to mobile users anytime and anywhere.
Abstract: An interesting method to produce calcium carbonate is based on a gas-liquid reaction between carbon dioxide and aqueous solutions of calcium hydroxide. The design parameters for the gas-liquid phase are the flow regime, the individual mass transfer coefficients and the gas-liquid specific interfacial area. Most studies on the gas-liquid phase have been devoted to the experimental determination of some of these parameters, and more specifically of the mass transfer coefficient kLa, which depends fundamentally on the superficial gas velocity and on the physical properties of the absorption phase. The principal investigation was directed at studying the effect of vibration on the mass transfer coefficient kLa in the gas-liquid phase during absorption of CO2 in an aqueous solution of calcium hydroxide. Vibration at a higher frequency increased the mass transfer coefficient kLa, but vibration at a lower frequency did not improve it; the mass transfer coefficient kLa also increased with increasing superficial gas velocity.
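How kLa is typically extracted from absorption data can be sketched with the standard film-model balance (a generic relation, not this paper's specific procedure): for physical absorption, dC/dt = kLa·(C* − C) gives C(t) = C*·(1 − exp(−kLa·t)), so kLa is the slope of −ln(1 − C/C*) against t. All variable names and the synthetic data below are ours:

```python
import math

def estimate_kla(times, concentrations, c_star):
    """Least-squares slope through the origin of -ln(1 - C/C*) versus t.
    times in s, concentrations and saturation value c_star in mol/m^3."""
    ys = [-math.log(1.0 - c / c_star) for c in concentrations]
    return sum(t * y for t, y in zip(times, ys)) / sum(t * t for t in times)

# Synthetic data generated with kLa = 0.05 1/s and C* = 8.0 mol/m^3.
k_true, c_star = 0.05, 8.0
times = [10.0, 20.0, 40.0, 60.0, 90.0]
concs = [c_star * (1.0 - math.exp(-k_true * t)) for t in times]
print(round(estimate_kla(times, concs, c_star), 4))  # -> 0.05
```

Repeating this fit for runs with and without vibration, and at different superficial gas velocities, gives the kLa comparisons reported in the study.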
Abstract: EGOTHOR is a search engine that indexes the Web and allows us to search Web documents. Its hit list contains the URL and title of each hit, plus a snippet that briefly shows a match. The snippet can almost always be assembled by an algorithm that has full knowledge of the original document (mostly an HTML page). This implies that the search engine is required to store the full text of the documents as part of the index.
Such a requirement leads us to pick an appropriate compression algorithm that would reduce the space demand. One solution would be to use common compression methods, for instance gzip or bzip2, but it might be preferable to develop a new method that takes advantage of the document structure, or rather, the textual character of the documents.
There already exist special text compression algorithms, as well as methods for compressing XML documents. The aim of this paper is an integration of the two approaches to achieve an optimal compression ratio.
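The general-purpose baseline mentioned above is easy to measure; the sketch below (with a made-up HTML-like document) compares gzip and bzip2 from the Python standard library, which any structure-aware method would have to beat:

```python
import bz2
import gzip

# A small, repetitive HTML-like document standing in for a stored page.
doc = ("<html><head><title>EGOTHOR hit</title></head><body>"
       + "<p>the quick brown fox jumps over the lazy dog</p>" * 200
       + "</body></html>").encode("utf-8")

gz = gzip.compress(doc)
bz = bz2.compress(doc)
print(f"original: {len(doc)} B, gzip: {len(gz)} B, bzip2: {len(bz)} B")

# Both must round-trip losslessly for snippet assembly to work.
assert gzip.decompress(gz) == doc and bz2.decompress(bz) == doc
```

A structure-aware scheme would additionally model the markup (tag dictionary, element structure) and the textual content separately, which is the integration the paper pursues.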
Abstract: With the rapid development of the life sciences and the flood of genomic information, the need for faster and more scalable searching methods has become urgent. One of the approaches investigated is indexing. Indexing methods have been categorized into three classes: length-based index algorithms, transformation-based algorithms and mixed-technique algorithms. In this research, we focus on the transformation-based methods. We embedded the N-gram method into the transformation-based method to build an inverted index table. We then applied parallel methods to speed up index building and to reduce the overall retrieval time when querying the genomic database. Our experiments show that the use of the N-gram transformation algorithm is an economical solution; it saves both time and space. The results show that the size of the index is smaller than the size of the dataset when the N-gram size is 5 or 6. The parallel N-gram transformation algorithm's results indicate that the use of parallel programming with large datasets is promising and can be improved further.
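The core data structure can be sketched as follows (our illustration; the paper's exact table layout and parallel scheme are not given in the abstract). Each N-gram maps to the (sequence id, offset) pairs where it occurs, so a query becomes a lookup instead of a scan:

```python
from collections import defaultdict

def build_ngram_index(sequences, n=5):
    """Inverted index: N-gram -> list of (sequence id, offset)."""
    index = defaultdict(list)
    for seq_id, seq in sequences.items():
        for i in range(len(seq) - n + 1):
            index[seq[i:i + n]].append((seq_id, i))
    return index

def query(index, pattern, n=5):
    """Exact lookup for a pattern of length n (longer patterns would be
    resolved by intersecting the postings of their overlapping grams)."""
    return index.get(pattern[:n], [])

seqs = {"s1": "ACGTACGTGG", "s2": "TTACGTAAAC"}
idx = build_ngram_index(seqs, n=5)
print(query(idx, "ACGTA"))  # -> [('s1', 0), ('s2', 2)]
```

Parallelizing the build amounts to indexing disjoint batches of sequences independently and merging the postings lists, which is why the approach scales with dataset size.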
Abstract: Although many studies on assembly technology for bridge construction have dealt mostly with the pier, girder or deck of the bridge, studies on prefabricated barriers have rarely been performed. To understand the structural characteristics and application of the concrete barrier in the modular bridge, which is an assembly of structural members, a static loading test was performed. The structural performance as a road barrier of three methods, conventional cast-in-place (ST), vertical bolt connection (BVC) and horizontal bolt connection (BHC), was evaluated and compared through analyses of load-displacement curves, steel strain curves, concrete strain curves and visual inspection of crack patterns. The vertical bolt connection (BVC) method demonstrated performance comparable to conventional cast-in-place (ST) while providing all the advantages of prefabrication. The need for future improvement in nut fastening, as well as in legal standards and regulations, is also addressed.
Abstract: In recent years, a number of works proposing the combination of multiple classifiers to produce a single classification have been reported in the remote sensing literature. The resulting classifier, referred to as an ensemble classifier, is generally found to be more accurate than any of the individual classifiers making up the ensemble. As accuracy is the primary concern, much of the research in the field of land cover classification is focused on improving classification accuracy. This study compares the performance of four ensemble approaches (boosting, bagging, DECORATE and random subspace) with a univariate decision tree as the base classifier. Two training datasets, one without any noise and the other with 20 percent noise, were used to judge the performance of the different ensemble approaches. Results with the noise-free dataset suggest an improvement of about 4% in classification accuracy with all ensemble approaches in comparison to the results provided by the univariate decision tree classifier. The highest classification accuracy of 87.43% was achieved by the boosted decision tree. A comparison of results with the noisy dataset suggests that the bagging, DECORATE and random subspace approaches work well with these data, whereas the performance of the boosted decision tree degrades: a classification accuracy of 79.7% is achieved, which is even lower than that achieved (80.02%) by the unboosted decision tree classifier.
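To make the bagging mechanism concrete, here is a minimal stdlib-only sketch using one-feature decision stumps as the base learner (the study uses univariate decision trees; a stump is the simplest stand-in, and the toy dataset is ours). Each stump is trained on a bootstrap sample and the ensemble predicts by majority vote:

```python
import random
from collections import Counter

def train_stump(data):
    """data: list of (x, label) with labels in {0, 1}. Exhaustively pick the
    threshold and orientation minimizing training errors."""
    best = None
    for thr, _ in data:
        for left in (0, 1):
            err = sum(lbl != (left if x <= thr else 1 - left)
                      for x, lbl in data)
            if best is None or err < best[0]:
                best = (err, thr, left)
    _, thr, left = best
    return lambda x: left if x <= thr else 1 - left

def bagging(data, n_estimators=15, rng=random.Random(0)):
    """Train each stump on a bootstrap resample; predict by majority vote."""
    stumps = [train_stump([rng.choice(data) for _ in data])
              for _ in range(n_estimators)]
    return lambda x: Counter(s(x) for s in stumps).most_common(1)[0][0]

# Toy 1D data: class 0 below ~5, class 1 above, with one mislabeled point (3, 1).
data = [(1, 0), (2, 0), (3, 1), (4, 0), (6, 1), (7, 1), (8, 1), (9, 1)]
clf = bagging(data)
print([clf(x) for x in (0, 2, 8, 10)])
```

Averaging over bootstrap resamples is what makes bagging robust to label noise, consistent with the study's finding that bagging degrades less than boosting on the noisy dataset.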
Abstract: This paper presents a comparative study of recent integer DCTs and a new method to construct a low-sensitivity structure of the integer DCT for colored input signals. The method uses the sensitivity of the multiplier coefficients to finite word length as an indicator of how word-length truncation affects the quality of the output signal. The sensitivity is also theoretically evaluated as a function of the auto-correlation and covariance matrix of the input signal. The structure of the integer DCT algorithm is optimized by combining the less sensitive lifting structure types of IRT, evaluated through the sensitivity of the multiplier coefficients to finite word length as a function of the covariance matrix of the input signal. The effectiveness of the optimum combination of IRT in the integer DCT algorithm is confirmed by the quality improvement over the existing case. As a result, the optimum combination of IRT in each integer DCT algorithm clearly improves output signal quality while remaining compatible with the existing one.
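The lifting principle behind such integer transforms can be illustrated with a generic three-step integer rotation (a textbook construction, not the paper's specific IRT structures): rounding inside each lifting step makes the transform integer-to-integer yet exactly invertible, so word-length truncation of the multipliers introduces only approximation error, never information loss:

```python
import math

def lift_rotate(x, y, theta):
    """Approximate rotation of integer pair (x, y) via three lifting steps
    with multipliers p = (cos t - 1)/sin t and u = sin t."""
    p = (math.cos(theta) - 1.0) / math.sin(theta)
    u = math.sin(theta)
    y += round(p * x)
    x += round(u * y)
    y += round(p * x)
    return x, y

def lift_unrotate(x, y, theta):
    """Exact inverse: undo the lifting steps in reverse order."""
    p = (math.cos(theta) - 1.0) / math.sin(theta)
    u = math.sin(theta)
    y -= round(p * x)
    x -= round(u * y)
    y -= round(p * x)
    return x, y

x, y = 37, -12
xr, yr = lift_rotate(x, y, math.pi / 8)
print((xr, yr), lift_unrotate(xr, yr, math.pi / 8))  # second pair -> (37, -12)
```

Because invertibility holds for any rounding of p and u, the design freedom lies in choosing lifting structures whose output quality is least sensitive to truncating those multipliers, which is the optimization the paper performs.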
Abstract: The increasing importance of data streams arising in a wide range of advanced applications has led to extensive study of mining frequent patterns. Mining data streams poses many new challenges, among which are the one-scan requirement, the unbounded memory requirement and the high arrival rate of data streams. In this paper, we propose a new approach for mining itemsets on a data stream. Our approach, SFIDS, has been developed based on the FIDS algorithm. The main aim was to keep some advantages of the previous approach, resolve some of its drawbacks, and consequently improve run time and memory consumption. Our approach has the following advantages: using a lattice-like data structure for keeping frequent itemsets; separating regions from each other by deleting common nodes, which decreases search space, memory consumption and run time; and finally, considering the CPU constraint: when an increasing data arrival rate would overload the system, SFIDS automatically detects this situation and discards some unprocessed data. We guarantee, based on a probabilistic technique, that the error of the results is bounded by a user-specified threshold. Final results show that the SFIDS algorithm achieves about a 50% run-time improvement over the FIDS approach.
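The flavor of a one-scan frequency summary with a user-specified error bound can be shown with Manku-Motwani lossy counting for single items (a standard stream technique used here only as an illustration; SFIDS itself handles itemsets and its lattice structure is not reproduced):

```python
def lossy_count(stream, epsilon):
    """One pass over the stream; each surviving count underestimates the
    true frequency by less than epsilon * n, where n is the stream length."""
    counts, deltas = {}, {}
    width = int(1 / epsilon)                 # bucket width
    for n, item in enumerate(stream, start=1):
        bucket = (n - 1) // width + 1
        if item in counts:
            counts[item] += 1
        else:
            counts[item], deltas[item] = 1, bucket - 1
        if n % width == 0:                   # prune at each bucket boundary
            stale = [i for i in counts if counts[i] + deltas[i] <= bucket]
            for it in stale:
                del counts[it], deltas[it]
    return counts

stream = ["a"] * 60 + ["b"] * 30 + ["c"] * 10
result = lossy_count(stream, epsilon=0.05)
print(result.get("a"))  # -> 60
```

The pruning step bounds memory independently of stream length, while the delta bookkeeping yields the deterministic error guarantee; SFIDS adds a probabilistic variant of such a bound plus load shedding under CPU pressure.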
Abstract: This study compares three metaheuristics to minimize makespan (Cmax) for the Hybrid Flow Shop (HFS) Scheduling Problem with Parallel Machines. This problem is known to be NP-hard. The study applies three improvement heuristic searches: Genetic Algorithm (GA), Simulated Annealing (SA), and Tabu Search (TS). SA and TS are single-solution (trajectory-based) improvement heuristics, whereas GA is a population-based stochastic improvement heuristic. A comprehensive comparison of these three improvement heuristic searches is presented. The results of the experiments conducted show that TS is effective and efficient in solving HFS scheduling problems.
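As a hedged illustration of one of the three searches, here is a simulated-annealing sketch for a simplified stand-in problem: assigning jobs to identical parallel machines to minimize makespan (a single stage of an HFS; the full hybrid flow shop neighborhood and the GA/TS variants are not reproduced, and all parameter values are ours):

```python
import math
import random

def makespan(assignment, times, n_machines):
    """Cmax: the load of the busiest machine."""
    loads = [0.0] * n_machines
    for job, machine in enumerate(assignment):
        loads[machine] += times[job]
    return max(loads)

def anneal(times, n_machines, iters=20_000, t0=10.0, cooling=0.9995, seed=42):
    rng = random.Random(seed)
    cur = [rng.randrange(n_machines) for _ in times]   # random initial schedule
    cost = makespan(cur, times, n_machines)
    best, best_cost, temp = cur[:], cost, t0
    for _ in range(iters):
        cand = cur[:]
        cand[rng.randrange(len(times))] = rng.randrange(n_machines)  # move one job
        cand_cost = makespan(cand, times, n_machines)
        # Accept improvements always; worse moves with Boltzmann probability.
        if cand_cost <= cost or rng.random() < math.exp((cost - cand_cost) / temp):
            cur, cost = cand, cand_cost
            if cost < best_cost:
                best, best_cost = cur[:], cost
        temp *= cooling
    return best, best_cost

times = [7, 5, 4, 4, 3, 3, 2, 2]   # total work 30 -> lower bound 10 on 3 machines
best, cmax = anneal(times, n_machines=3)
print(cmax)  # the optimal makespan for this instance is 10
```

TS would replace the temperature-controlled acceptance with a best-admissible move plus a tabu list, and GA would evolve a population of such assignments; the study's comparison is over exactly these design differences.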
Abstract: This paper presents the modeling and optimization of two NP-hard problems in flexible manufacturing systems (FMS): the part type selection problem and the loading problem. Due to the complexity and extent of the problems, the paper was split into two parts. The first part discussed the modeling of the problems and showed how real-coded genetic algorithms (RCGA) can be applied to solve them. This second part discusses the effectiveness of the RCGA, which uses an array of real numbers as its chromosome representation. The proposed chromosome representation produces only feasible solutions, which eliminates the computational time otherwise needed by a GA to push its population toward the feasible search space or to repair infeasible chromosomes. The proposed RCGA improves FMS performance by considering two objectives: maximizing system throughput and maintaining the balance of the system (minimizing system unbalance). The resulting objective values are compared to the optimum values produced by the branch-and-bound method. The experiments show that the proposed RCGA can reach near-optimum solutions in a reasonable amount of time.
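One well-known way a real-valued chromosome can guarantee feasibility is random-key decoding; the paper's exact decoding is not given in the abstract, so the sketch below is a hypothetical illustration (gene semantics, capacity model and all names are ours). One real gene per part type defines a priority order, and parts are accepted greedily while machine-time capacity remains, so every chromosome decodes to a feasible selection:

```python
import random

def decode(chromosome, processing_times, capacity):
    """Random-key decoding: sort genes for a priority order, then accept
    parts greedily under the capacity budget. Any real vector decodes to a
    feasible selection, so no repair step is ever needed."""
    order = sorted(range(len(chromosome)), key=lambda i: chromosome[i],
                   reverse=True)            # higher gene = higher priority
    selected, used = [], 0.0
    for part in order:
        if used + processing_times[part] <= capacity:
            selected.append(part)
            used += processing_times[part]
    return sorted(selected)

processing_times = [4.0, 3.0, 5.0, 2.0, 6.0]
chromosome = [random.random() for _ in processing_times]
selection = decode(chromosome, processing_times, capacity=10.0)
total = sum(processing_times[i] for i in selection)
print(selection, total)  # always feasible: total <= 10.0
```

Because crossover and mutation on real vectors always yield another real vector, every offspring is decodable and feasible, which is the source of the claimed computational saving.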
Abstract: The ability of UML to handle the modeling of complex industrial software applications has increased its popularity to the extent of becoming the de facto language for design. Although its rich graphical notation, naturally oriented towards the object-oriented concept, facilitates understandability, it hardly succeeds in capturing all domain-specific aspects in a satisfactory way. OCL, as the standard language for expressing additional constraints on UML models, has great potential to help improve expressiveness. Unfortunately, it suffers from a weak formalism due to its poor semantics, resulting in many obstacles to the building of tool support and thus to its application in industry. For this reason, much research has been devoted to formalizing OCL expressions using more rigorous approaches. Our contribution joins this work in a complementary way, since it focuses specifically on OCL predefined properties, which constitute an important part of the construction of OCL expressions. Using formal methods, we mainly succeed in rigorously expressing OCL predefined functions.