Abstract: Tip vortex cavitation is one of the well-known patterns of cavitation that occur in axial pumps. This pattern of cavitation arises from the pressure difference between the pressure and suction sides of the blades of an axial pump. Because the pressure on the pressure side of a blade is higher than that on its suction side, a very small portion of the liquid flows back from the pressure side to the suction side. This backflow gives rise to the tip vortex cavitation and gap cavitation that may occur in axial pumps. In this paper, the results of our experimental investigation of the movement of tip vortex cavitation along the blade edge due to reduction of pump flow rate in an axial pump are reported. Results show that reducing the pump flow rate in conjunction with increasing the outlet pressure moves the tip vortex cavitation along the blade edge towards the blade tip. Results also show that, as the tip vortex cavitation approaches the blade tip, the tip vortex pattern of cavitation is replaced by a cavitation phenomenon on the blade tip itself. Furthermore, with further reduction of the pump flow rate and increase of the outlet pressure, an unstable cavitation phenomenon occurs between each blade's leading edge and the next blade's trailing edge.
Abstract: The decision to recruit manpower in an organization
requires clear identification of the criteria (attributes) that distinguish
successful from unsuccessful performance. The choice of appropriate
attributes or criteria in different levels of hierarchy in an organization
is a multi-criteria decision problem and therefore multi-criteria
decision making (MCDM) techniques can be used for prioritization
of such attributes. Analytic Hierarchy Process (AHP) is one such
technique that is widely used for deciding among the complex criteria
structure in different levels. In real applications, however, conventional AHP cannot fully reflect the human thinking style, as precise data concerning human attributes are quite hard to extract. Fuzzy logic offers a systematic basis for dealing with situations that are ambiguous or not well defined. This study aims at defining a methodology to improve the quality of prioritization of an employee's performance measurement attributes under fuzziness. To do so, a methodology based on the Extent Fuzzy Analytic Hierarchy Process is proposed. Within the model, four main attributes, namely Subject knowledge and achievements, Research aptitude, Personal qualities and strengths, and Management skills, together with their sub-attributes, are defined. The two approaches, conventional AHP and the Extent Fuzzy Analytic Hierarchy Process, have been compared on the same hierarchy structure and criteria set.
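As a concrete illustration of the Extent Fuzzy AHP machinery, the sketch below implements Chang's extent analysis over a pairwise-comparison matrix of triangular fuzzy numbers (l, m, u). The three-attribute setup and the matrix entries are illustrative assumptions, not the study's actual comparison data.

```python
def extent_fuzzy_ahp(matrix):
    """Chang's extent analysis on a pairwise-comparison matrix of
    triangular fuzzy numbers (l, m, u); returns crisp priority weights."""
    n = len(matrix)

    def add(a, b):
        return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

    row_sums = []
    for row in matrix:
        s = (0.0, 0.0, 0.0)
        for tfn in row:
            s = add(s, tfn)
        row_sums.append(s)
    total = (0.0, 0.0, 0.0)
    for s in row_sums:
        total = add(total, s)
    # fuzzy synthetic extent: row sum times the inverse of the grand total
    S = [(s[0] / total[2], s[1] / total[1], s[2] / total[0]) for s in row_sums]

    def possibility(s2, s1):
        # degree of possibility V(S2 >= S1)
        l1, m1, _ = s1
        _, m2, u2 = s2
        if m2 >= m1:
            return 1.0
        if l1 >= u2:
            return 0.0
        return (l1 - u2) / ((m2 - u2) - (m1 - l1))

    d = [min(possibility(S[i], S[k]) for k in range(n) if k != i)
         for i in range(n)]
    return [x / sum(d) for x in d]

# illustrative 3-attribute comparison on a Saaty-style triangular scale
matrix = [
    [(1, 1, 1),       (2, 3, 4),     (4, 5, 6)],
    [(1/4, 1/3, 1/2), (1, 1, 1),     (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
]
weights = extent_fuzzy_ahp(matrix)
```

The weights are normalized minimum degrees of possibility, so the attribute dominating the comparisons receives the largest priority.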
Abstract: Selective oxidation of H2S to elemental sulfur in a fixed-bed reactor over newly synthesized alumina nanocatalysts was investigated physico-chemically, and the results were compared with a commercial Claus catalyst. Amongst these new materials, Al2O3-supported sodium oxide prepared by a wet chemical technique and an Al2O3 nanocatalyst prepared by the spray pyrolysis method were the most active catalysts for the selective oxidation of H2S to elemental sulfur. The other prepared nanocatalysts were quickly deactivated, mainly due to interaction with H2S and conversion into sulfides.
Abstract: Polyphenolics and sugars are components of many fruit juices. In this work, the performance of ultrafiltration (UF) for separating phenolic compounds from apple juice was studied by performing batch experiments in a membrane module with an area of 0.1 m2, fitted with a regenerated cellulose membrane of 1 kDa MWCO. The effects of various operating conditions on the performance were determined: transmembrane pressure (3, 4, 5 bar), temperature (30, 35, 40 ºC), pH (2, 3, 4, 5), feed concentration (3, 5, 7, 10, 15 ºBrix for apple juice) and feed flow rate (1, 1.5, 1.8 L/min). The optimum operating conditions were: transmembrane pressure 4 bar, temperature 30 ºC, feed flow rate 1 – 1.8 L/min, pH 3 and 10 ºBrix (apple juice). After performing ultrafiltration under these conditions, the concentration of polyphenolics in the retentate was increased by a factor of up to 2.7, with up to 70% recovered in the permeate along with approx. 20% of the sugar in that stream. Application of diafiltration (addition of water to the concentrate) can regain, by a factor of 1.5, the flux that had decreased due to fouling. The material balance performed on the process has shown the amount of deposits on the membrane and the extent of fouling in the system. In conclusion, ultrafiltration has been demonstrated as a potential technology to separate polyphenolics and sugars from their mixtures and can be applied to remove sugars from fruit juice.
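The material balance mentioned above can be sketched as a simple solute accounting: whatever solute entered with the feed and did not leave in the retentate or permeate is attributed to deposits on the membrane. All the numbers below are hypothetical, chosen only to show how the deposit falls out of the balance.

```python
def membrane_deposit(feed_vol, feed_conc, ret_vol, ret_conc,
                     perm_vol, perm_conc):
    """Solute deposited on the membrane (mg) by overall material balance:
    solute in with the feed minus solute out in retentate and permeate.
    Volumes in L, concentrations in mg/L."""
    solute_in = feed_vol * feed_conc
    solute_out = ret_vol * ret_conc + perm_vol * perm_conc
    return solute_in - solute_out

# hypothetical batch: 10 L of feed at 100 mg/L polyphenolics,
# split into 2.5 L retentate (108 mg/L) and 7.5 L permeate (93 mg/L)
deposit = membrane_deposit(10.0, 100.0, 2.5, 108.0, 7.5, 93.0)
```

The same closure check, repeated for the sugars, indicates how much of each solute class fouls the membrane.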
Abstract: Contractor selection in Saudi Arabia is very important due to the large construction boom and the contractor's role in overcoming construction risks. The need for investigating contractor selection stems from the following reasons: the large number of defaulted or failed projects (18%), the large number of disputes attributed to the contractor during the project execution stage (almost twofold), the extension of the General Agreement on Tariffs and Trade (GATT) into the construction industry, and finally the small number of studies on the subject. The current selection strategy is not perfect and is considered the reason behind irresponsible contractors. In response, this research was conducted to review contractor selection strategies as an integral part of a longer-term research effort to develop a sound selection model. Many techniques can be used to form a selection strategy: multi-criteria methods for optimizing the decision, prequalification to ascertain the contractor's responsibility, a bidding process for competition, third-party guarantees to enhance the selection, and fuzzy techniques for ambiguities and incomplete information.
Abstract: Three service providers in competition try to optimize their quality of service / content level and their service access price. However, they have to deal with uncertainty about the consumers' preferences. To reduce their uncertainty, they have the opportunity to buy information and to build alliances. We determine the Shapley value, which is a fair way to allocate the grand coalition's revenue between the service providers. Then, we identify the values of β (the consumers' sensitivity coefficient to the quality of service / contents) for which allocating the grand coalition's revenue using the Shapley value guarantees system stability. For other values of β, we prove that it is possible for the regulator to impose a per-period interest rate maximizing the market coverage under equal allocation rules.
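The Shapley allocation used above can be sketched directly from its definition: each provider receives its marginal contribution averaged over every order in which the grand coalition can form. The revenue function below is a hypothetical superadditive example, not the paper's market model.

```python
from itertools import permutations

def shapley(players, value):
    """Shapley value: average marginal contribution of each player over
    all orderings of the players; `value` maps a frozenset to revenue."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            phi[p] += value(coalition | {p}) - value(coalition)
            coalition = coalition | {p}
    return {p: v / len(orders) for p, v in phi.items()}

# hypothetical revenue: a coalition of k providers earns k**2
revenue = lambda s: len(s) ** 2
alloc = shapley(["SP1", "SP2", "SP3"], revenue)
```

By construction the allocation is efficient (the shares sum to the grand coalition's revenue) and symmetric providers receive equal shares.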
Abstract: This paper describes a new method for affine parameter estimation between image sequences. Usually, parameter estimation is carried out by least squares with a quadratic error norm. However, this technique is sensitive to the presence of outliers. Therefore, parameter estimation techniques for various image processing applications must be robust enough to withstand the influence of outliers. Progressively, robust estimation functions demanding non-quadratic and perhaps non-convex potentials, adopted from the statistics literature, have been used to solve such problems. To steer the optimization of the error function towards a global optimal solution, the minimization can begin with a convex estimator at the coarser level and gradually introduce non-convexity, i.e., move from soft to hard redescending non-convex estimators as the iteration reaches the finer levels of the multiresolution pyramid. A comparison has been made between the performance of the proposed method and the results found individually using the two different estimators.
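A minimal sketch of the convex-to-non-convex idea on a 1-D line-fitting problem (standing in for the affine image model): iteratively reweighted least squares starts from the convex Huber weight, then refines with a non-convex redescending (Geman-McClure) weight warm-started from the convex solution. The data, weight constants and single-scale setup are illustrative assumptions, not the paper's pyramid implementation.

```python
def irls_fit(xs, ys, weight_fn, a=0.0, b=0.0, iters=50):
    """Fit y = a*x + b by iteratively reweighted least squares,
    reweighting each point by weight_fn(residual) every iteration."""
    for _ in range(iters):
        w = [weight_fn(y - (a * x + b)) for x, y in zip(xs, ys)]
        sw = sum(w)
        sx = sum(wi * x for wi, x in zip(w, xs))
        sy = sum(wi * y for wi, y in zip(w, ys))
        sxx = sum(wi * x * x for wi, x in zip(w, xs))
        sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
        det = sw * sxx - sx * sx
        a = (sw * sxy - sx * sy) / det
        b = (sxx * sy - sx * sxy) / det
    return a, b

huber = lambda r, k=1.0: 1.0 if abs(r) <= k else k / abs(r)  # convex (soft)
gm = lambda r, s=1.0: 1.0 / (1.0 + (r / s) ** 2) ** 2        # redescending (hard)

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.0, 1.0, 2.0, 3.0, 10.0, 5.0]   # y = x, with one outlier at x = 4
a, b = irls_fit(xs, ys, huber)          # convex stage
a, b = irls_fit(xs, ys, gm, a, b)       # non-convex refinement, warm-started
```

The Huber stage tolerates the outlier enough to land near the true line; the redescending stage then drives the outlier's weight towards zero.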
Abstract: Compression algorithms reduce the redundancy in data representation to decrease the storage required for that data. Lossless compression researchers have developed highly sophisticated approaches, such as Huffman encoding, arithmetic encoding, the Lempel-Ziv (LZ) family, Dynamic Markov Compression (DMC), Prediction by Partial Matching (PPM), and Burrows-Wheeler Transform (BWT) based algorithms. Decompression is then required to retrieve the original data by lossless means. This paper presents a compression scheme for text files coupled with the principle of dynamic decompression, which decompresses only the section of the compressed text file required by the user instead of decompressing the entire file. Dynamically decompressed files offer better disk space utilization due to higher compression ratios compared to most of the currently available text file formats.
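Of the schemes named above, Huffman encoding is the simplest to sketch. The toy implementation below builds a prefix-free code table from symbol frequencies; it omits file I/O and the dynamic-decompression layer that the proposed scheme adds on top.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table {symbol: bitstring} from symbol
    frequencies; frequent symbols get the shortest codes."""
    freq = Counter(text)
    if len(freq) == 1:                     # degenerate single-symbol input
        return {next(iter(freq)): "0"}
    # heap entries: (frequency, unique tiebreaker, partial code table)
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
encoded = "".join(codes[ch] for ch in "abracadabra")
```

For "abracadabra" the 11 input symbols compress to 23 bits, the optimal prefix-code cost for these frequencies.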
Abstract: In H.264/AVC video encoding, rate-distortion optimization for mode selection plays a significant role in achieving outstanding performance in compression efficiency and video quality. However, this mode selection process also makes the encoding process extremely complex, especially in the computation of the rate-distortion cost function, which includes the computation of the sum of squared differences (SSD) between the original and reconstructed image blocks and context-based entropy coding of the block. In this paper, a transform-domain rate-distortion optimization accelerator based on fast SSD (FSSD) and a VLC-based rate estimation algorithm is proposed. This algorithm can significantly simplify the hardware architecture for the rate-distortion cost computation with only negligible performance degradation. An efficient hardware structure for implementing the proposed transform-domain rate-distortion optimization accelerator is also proposed. Simulation results demonstrate that the proposed algorithm reduces total encoding time by about 47% with negligible degradation of coding performance. The proposed method can be easily applied to many mobile video applications such as digital cameras and DMB (Digital Multimedia Broadcasting) phones.
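The key enabler of transform-domain distortion computation is that an orthonormal transform preserves the SSD (Parseval's relation), so the distortion term of the Lagrangian cost J = D + λR can be evaluated on coefficients without an inverse transform. The toy 2-point orthonormal transform below only illustrates this property; it is not the paper's FSSD algorithm.

```python
import math

def ssd(a, b):
    """Sum of squared differences between two equal-length blocks."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def rd_cost(distortion, rate_bits, lam):
    """Lagrangian rate-distortion cost J = D + lambda * R."""
    return distortion + lam * rate_bits

def ortho2(v):
    """2-point orthonormal (Haar-like) transform."""
    s = 1.0 / math.sqrt(2.0)
    return [s * (v[0] + v[1]), s * (v[0] - v[1])]

orig = [10.0, 6.0]
recon = [9.0, 7.0]
pixel_ssd = ssd(orig, recon)                  # distortion in the pixel domain
coef_ssd = ssd(ortho2(orig), ortho2(recon))   # same value on the coefficients
```

Because the two distortions agree, mode selection can compare costs such as rd_cost(coef_ssd, rate_bits, lam) entirely in the transform domain.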
Abstract: Wireless LAN technologies have picked up momentum in recent years due to their ease of deployment, cost and availability. The era of the wireless LAN has also given rise to unique applications like VoIP, IPTV and unified messaging. However, these real-time applications are very sensitive to network and handoff latencies. To successfully support these applications, seamless roaming during the movement of a mobile station has become crucial. Nowadays, centralized architecture models support roaming in WLANs; they have the ability to manage, control and troubleshoot large-scale WLAN deployments. This model is managed by the Control and Provisioning of Wireless Access Points (CAPWAP) protocol. This paper covers the CAPWAP architectural solution along with the proposals that have emerged around it. Based on the literature survey conducted in this paper, we found that the proposed algorithms for reducing roaming latency in the CAPWAP architecture do not support seamless roaming. Additionally, they are not sufficient during the initial period of the network. This paper also suggests important design considerations for mobility support in future centralized IEEE 802.11 networks.
Abstract: When a high DC voltage is applied to a capacitor with
strongly asymmetrical electrodes, it generates a mechanical force that
affects the whole capacitor. This phenomenon is most likely to be
caused by the motion of ions generated around the smaller of the two
electrodes and their subsequent interaction with the surrounding
medium. A method to measure this force has been devised and used.
A formula describing the force has also been derived. After
comparing the data gained through experiments with those acquired
using the theoretical formula, a difference was found above a certain
value of current. This paper also gives reasons for this difference.
Abstract: In this paper, a joint source-channel coding (JSCC) scheme for time-varying channels is presented. The proposed scheme uses a hierarchical framework for both the source encoder and transmission via QAM modulation. Hierarchical joint source-channel codes with hierarchical QAM constellations are designed to track the channel variations, which yields a higher throughput by adapting certain parameters of the receiver to the channel variation. We consider the problem of still image transmission over time-varying channels with channel state information (CSI) available at (1) the receiver only and (2) both the transmitter and the receiver. We describe an algorithm that optimizes hierarchical source codebooks by minimizing the distortion due to the source quantizer and channel impairments. Simulation results, based on image representation, show that the proposed hierarchical system outperforms conventional schemes based on a single modulator and channel-optimized source coding.
Abstract: This article presents a computationally tractable probabilistic model for the relation between the complex wavelet coefficients of two images of the same scene. The two images are acquired at distinct moments in time, from distinct viewpoints, or by distinct sensors. By means of the introduced probabilistic model, we argue that the similarity between the two images is controlled not by the values of the wavelet coefficients, which can be altered by many factors, but by the nature of the wavelet coefficients, which we model with the help of hidden state variables. We integrate this probabilistic framework into the construction of a new image registration algorithm. This algorithm has sub-pixel accuracy and is robust to noise and to other variations such as local illumination changes. We present the performance of our algorithm on various image types.
Abstract: This paper presents a comparison of the average outgoing quality limit of the MCSP-2-C plan with that of MCSP-C, from which MCSP-2-C has been developed. The parameters used in MCSP-2-C are: i (the clearance number), c (the acceptance number), m (the number of conforming units to be found before allowing c non-conforming units in the sampling inspection), and f1 and f2 (the sampling frequencies at levels 1 and 2, respectively). The average outgoing quality limit (AOQL) values from the two plans were compared, and we found that for all sets of i, r, and c values, MCSP-2-C gives higher values than MCSP-C. For all sets of i, r, and c values, the average outgoing quality values of MCSP-C and MCSP-2-C are similar when p is low or high but differ when p is moderate.
Abstract: Interior brick-infill partitions are usually considered non-structural components, and only their weight is accounted for in practical structural design. In this study, the brick-infill panels are simulated by compression struts to clarify their effect on the progressive collapse potential of an earthquake-resistant RC building. Three-dimensional finite element models are constructed for the RC building subjected to sudden column loss. Linear static analyses are conducted to investigate the variation of the demand-to-capacity ratio (DCR) of the beam-end moment and the axial force variation of the beams adjacent to the removed column. Study results indicate that the brick-infill effect depends on the panel location with respect to the removed column. When the panels fill a structural bay with a shorter span adjacent to the column-removed line, a more significant reduction of DCR may be achieved. However, under certain conditions, the brick infill may increase the axial tension of the two-span beam bridging the removed column.
Abstract: This paper proposes a Fuzzy Expert System designed to determine the wearing properties of nitrided and non-nitrided steel. The proposed Fuzzy Expert System approach helps the user and the manufacturer to forecast the wearing properties of nitrided and non-nitrided steel under specified laboratory conditions. Surfaces of engineering components are often nitrided to improve their wear, corrosion and fatigue specifications. A major benefit of the nitriding process is reduced distortion and wear of metallic alloys. A Fuzzy Expert System was developed for determining the wearing and durability properties of nitrided and non-nitrided steels that were tested under different loads and different sliding speeds under laboratory conditions.
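A miniature Mamdani-style sketch of such a system: triangular memberships for load and sliding speed, AND-combined rules, and weighted-average defuzzification into a normalized wear figure. All ranges, rules and output levels below are invented for illustration; the actual knowledge base would come from the laboratory tests.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def wear_rate(load, speed):
    """Normalized wear estimate in [0, 1] from load (N) and sliding
    speed (m/s); memberships and rule outputs are hypothetical."""
    load_low = tri(load, -1.0, 0.0, 60.0)
    load_high = tri(load, 40.0, 100.0, 161.0)
    spd_low = tri(speed, -0.1, 0.0, 1.2)
    spd_high = tri(speed, 0.8, 2.0, 3.1)
    # rule strength = min of the antecedent memberships (fuzzy AND)
    rules = [
        (min(load_low, spd_low), 0.1),    # low load, low speed -> low wear
        (min(load_low, spd_high), 0.4),
        (min(load_high, spd_low), 0.5),
        (min(load_high, spd_high), 0.9),  # high load, high speed -> high wear
    ]
    num = sum(strength * out for strength, out in rules)
    den = sum(strength for strength, _ in rules)
    return num / den if den else 0.0

high_case = wear_rate(90.0, 1.8)   # heavy load, fast sliding
low_case = wear_rate(20.0, 0.3)    # light load, slow sliding
```

A nitrided-versus-non-nitrided comparison would simply use two such rule bases (or an extra input) tuned to the respective test data.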
Abstract: The ever-increasing product diversity and competition on the market of goods and services has dictated the pace of growth in the number of advertisements. Despite their admittedly diminished effectiveness over recent years, advertisements remain the favored method of sales promotion. Consequently, the challenge for an advertiser is to explore every possible avenue of making an advertisement more noticeable, attractive and compelling for consumers. One way to achieve this is through celebrity endorsements. On the one hand, the use of a celebrity to endorse a product involves substantial costs; on the other hand, it does not immediately guarantee the success of an advertisement. The question of how celebrities can be used in advertising to the best advantage is therefore of utmost importance. Celebrity endorsements have become commonplace: empirical evidence indicates that approximately 20 to 25 per cent of advertisements feature a famous person as a product endorser. The popularity of celebrity endorsements demonstrates the relevance of the topic, especially in the context of the current global economic downturn, when companies are forced to save in order to survive, yet simultaneously to invest heavily in advertising and sales promotion. The issue of the effective use of celebrity endorsements also figures prominently in academic discourse. The study presented below is thus aimed at exploring which qualities (characteristics) of a celebrity endorser have an impact on the effectiveness of the advertisement in which he/she appears, and how.
Abstract: This paper suggests a new Affine Projection (AP) algorithm with a variable data-reuse factor that uses the condition number as a decision factor. To reduce the computational burden, we adopt a recently reported technique that estimates the condition number of an input data matrix. Several simulations show that the new algorithm performs better than the conventional AP algorithm.
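The affine-projection update itself, with a fixed data-reuse factor K, can be sketched as below; the condition-number-driven choice of K is the paper's contribution and is omitted here. The two-tap unknown system and the input signal are illustrative assumptions.

```python
import math

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ap_update(w, X, d, mu=0.5, delta=1e-4):
    """One affine-projection step with data-reuse factor K = len(X):
    w <- w + mu * X^T (X X^T + delta*I)^{-1} e, with e = d - X w."""
    K, L = len(X), len(w)
    e = [d[k] - sum(X[k][i] * w[i] for i in range(L)) for k in range(K)]
    G = [[sum(X[a][i] * X[b][i] for i in range(L)) +
          (delta if a == b else 0.0) for b in range(K)] for a in range(K)]
    s = solve(G, e)
    return [w[i] + mu * sum(s[k] * X[k][i] for k in range(K)) for i in range(L)]

# identify a hypothetical 2-tap system from a persistently exciting input
w_true = [1.0, -0.5]
w = [0.0, 0.0]
x = [math.sin(0.9 * n) + 0.5 * math.cos(2.3 * n) for n in range(300)]
for n in range(2, 300):
    X = [[x[n], x[n - 1]], [x[n - 1], x[n - 2]]]   # K = 2 reused regressors
    d = [sum(wt * xi for wt, xi in zip(w_true, row)) for row in X]
    w = ap_update(w, X, d)
```

A larger K speeds convergence for colored inputs at the cost of the K x K solve, which is exactly the trade-off a condition-number-based data-reuse rule tries to manage.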
Abstract: Drought is a natural, climatic phenomenon and in fact serves as a part of the climate of an area; it also has significant environmental, social, and economic consequences. Drought differs from other natural disasters in that it is a creeping phenomenon: it progresses slowly, and it is difficult to determine the time of its onset and termination. Most definitions of drought are based on precipitation shortage and, consequently, the shortage of water for water-related activities such as agriculture. In this research, drought conditions in Fars province were evaluated using the SPI method within a 37-year statistical period (1974-2010), and drought maps were prepared for each year of the statistical period. According to the results obtained from this research, the years 1974, 1976, 1975, and 1982, with SPI values of -1.03, 0.39, -1.05, and -1.49 respectively, were the driest years, and the years 1996, 1997, and 2000, with SPI values of 2.49, 1.49, 1.46, and 1.04 respectively, were the most humid within the studied time series; the remaining years were closer to normal conditions in terms of drought.
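The SPI classification step can be sketched as follows. The full SPI fits a gamma distribution to the precipitation record and maps its cumulative probability through the inverse standard normal; the z-score below is only a rough stand-in for that transform, and the rainfall totals are hypothetical. The class thresholds are the commonly used SPI categories.

```python
import math

def spi_zscore(precip):
    """Rough SPI stand-in: standardized anomaly (p - mean) / std.
    (The true SPI uses a fitted gamma distribution instead.)"""
    n = len(precip)
    mean = sum(precip) / n
    std = math.sqrt(sum((p - mean) ** 2 for p in precip) / n)
    return [(p - mean) / std for p in precip]

def drought_class(spi):
    """Commonly used SPI drought categories."""
    if spi <= -2.0:
        return "extremely dry"
    if spi <= -1.5:
        return "severely dry"
    if spi <= -1.0:
        return "moderately dry"
    if spi < 1.0:
        return "near normal"
    if spi < 1.5:
        return "moderately wet"
    if spi < 2.0:
        return "very wet"
    return "extremely wet"

rain = [300.0, 450.0, 500.0, 550.0, 700.0]   # hypothetical annual totals, mm
z = spi_zscore(rain)
classes = [drought_class(v) for v in z]
```

Mapping each year's class onto the station locations is what produces the per-year drought maps described above.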
Abstract: Nowadays, there is a growing interest in biofuel production in most countries because of increasing concerns about hydrocarbon fuel shortages and global climate change, and also to enhance the agricultural economy and meet local needs for transportation fuel. Ethanol can be produced from biomass by hydrolysis and sugar fermentation processes. In this study, ethanol was produced from sugarcane bagasse without using expensive commercial enzymes. Alkali pretreatment was used to prepare the biomass before enzymatic hydrolysis. A comparison between NaOH, KOH and Ca(OH)2 shows that NaOH is the most effective on bagasse. The enzymes required for biomass hydrolysis were produced by solid-state fermentation of sugarcane via two fungi: Trichoderma longibrachiatum and Aspergillus niger. The results show that the enzyme solution produced via A. niger performed better than that of T. longibrachiatum. Ethanol was produced by simultaneous saccharification and fermentation (SSF) with the crude enzyme solution from T. longibrachiatum and Saccharomyces cerevisiae yeast. To evaluate this procedure, SSF of pretreated bagasse was also carried out using Celluclast 1.5L by Novozymes. The ethanol production yields with the commercial enzyme and with the enzyme solution produced via T. longibrachiatum were 81% and 50%, respectively.