Abstract: White scar oyster (Crassostrea belcheri) is often eaten raw
and is a leading vehicle for foodborne disease, especially Salmonella
Weltevreden, which is among the most prominent and the most
radiation-resistant serovars. Gamma irradiation at a low dose of 1 kGy
was sufficient to eliminate S. Weltevreden contaminating oyster meat at
levels up to 5 log CFU/g, while the meat retained its raw characteristics
and a sensory quality equivalent to that of the non-irradiated product. A
process for ready-to-eat chilled oyster meat was developed: the meat was
shucked, individually packed in plastic bags, subjected to 1 kGy gamma
radiation under chilled conditions, and then stored at a refrigerated
temperature of 4 °C. Microbiological determination showed the absence of
S. Weltevreden (initially inoculated at 5 log CFU/g) throughout the
entire 30-day storage period. Sensory evaluation indicated decreasing
sensory scores over storage time, which set the product shelf life at
18 days, compared with 15 days for the non-irradiated product. The main
advantage of the developed process is that it provides safe raw oyster
to consumers while retaining sensory quality and extending shelf life
by 3 days.
Abstract: The rapid development of new technologies and the emergence
of increasingly sophisticated open communication systems create a new
challenge: protecting digital content from piracy. Digital watermarking
is a recent research area and a technique suggested as a solution to
these problems. It consists in inserting identification information (a
watermark) into digital data (audio, video, images, databases, ...) in
an invisible and indelible manner, in such a way that the quality of
the original medium is not degraded. Moreover, the watermark must be
correctly extractable despite deterioration of the watermarked medium
(i.e., attacks). In this paper we propose a system for watermarking
satellite images. We chose to embed the watermark in the frequency
domain, specifically the discrete wavelet transform (DWT) domain. We
applied our algorithm to satellite images of central Tunisia. The
experiments show satisfactory results. In addition, our algorithm
exhibited strong resistance to various attacks, notably compression
(JPEG, JPEG2000), filtering, histogram manipulation, and geometric
distortions such as rotation, cropping, and scaling.
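The embedding scheme described above can be sketched as follows. This is
an illustrative reconstruction, not the authors' exact algorithm: it
assumes a one-level Haar DWT (the abstract does not name the wavelet),
additive spread-spectrum marking of the LH subband with a key-seeded
pseudo-random sequence, and a simple correlation detector.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar DWT: approximation (LL) and detail subbands."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row differences
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return LL, LH, HL, HH

def haar_idwt2(LL, LH, HL, HH):
    """Inverse of haar_dwt2 (perfect reconstruction)."""
    a = np.zeros((LL.shape[0], LL.shape[1] * 2))
    d = np.zeros_like(a)
    a[:, 0::2], a[:, 1::2] = LL + LH, LL - LH
    d[:, 0::2], d[:, 1::2] = HL + HH, HL - HH
    img = np.zeros((a.shape[0] * 2, a.shape[1]))
    img[0::2, :], img[1::2, :] = a + d, a - d
    return img

def embed(img, key, alpha=2.0):
    """Additive spread-spectrum watermark in the mid-frequency LH band."""
    LL, LH, HL, HH = haar_dwt2(img)
    rng = np.random.default_rng(key)
    w = rng.choice([-1.0, 1.0], size=LH.shape)   # key-seeded PRN mark
    return haar_idwt2(LL, LH + alpha * w, HL, HH)

def detect(img, key):
    """Correlation detector: a score near alpha means the mark is present,
    a score near zero means it is absent."""
    _, LH, _, _ = haar_dwt2(img)
    rng = np.random.default_rng(key)
    w = rng.choice([-1.0, 1.0], size=LH.shape)
    return float(np.mean(LH * w))
```

A blind detector of this kind needs only the secret key, not the
original image, which is why mid-frequency subbands are a common
embedding target: they survive mild compression better than HH while
staying less visible than LL changes.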
Abstract: With the drastic growth in optical communication technology,
a lossless, low-crosstalk, multifunction optical switch is most
desirable for large-scale photonic networks. To realize such a switch,
we have introduced a new optical switch architecture that embeds many
functions on a single device. The asymmetrical OXADM architecture
consists of three parts: selective port, add/drop operation, and path
routing. The selective port permits only the wavelength of interest to
pass through, acting as a filter, while the add and drop functions are
implemented in the second part of the OXADM architecture. The signals
can then be re-routed to any output port and/or undergo an accumulation
function, which multiplexes all signals onto a single path before
exiting through any desired output port; this is performed by the
path-routing operation. The unique features offered by OXADM have
extended its application to Fiber-to-the-Home (FTTH) technology, where
the OXADM is used as a wavelength management element in the Optical
Line Terminal (OLT). Each port is assigned specific operating
wavelengths, with dynamic routing management ensuring that no traffic
congestion occurs in the OLT.
Abstract: Coagulation of water involves the use of coagulating agents
to bring the suspended matter in the raw water together for settling
and the filtration stage. The present study examines the effects of
aluminum sulfate as coagulant, in conjunction with Moringa oleifera
coagulant protein as coagulant aid, on turbidity, hardness, and
bacteria in turbid water. A conventional jar test apparatus was
employed for the tests. The best removal was observed at a pH of 7 to
7.5 for all turbidities. Turbidity removal efficiencies between 80% and
99% were obtained with Moringa oleifera coagulant protein as coagulant
aid. The required dosage of coagulant and coagulant aid decreased with
increasing turbidity. In addition, Moringa oleifera coagulant protein
significantly reduced the required dosage of primary coagulant.
Residual Al3+ in the treated water was less than 0.2 mg/l, meeting the
Environmental Protection Agency guidelines. The results showed that a
turbidity reduction of 85.9% to 98%, paralleled by a primary
Escherichia coli reduction of 1-3 log units (99.2-99.97%), was obtained
within the first 1 to 2 h of treatment. In conclusion, Moringa oleifera
coagulant protein can be used as a coagulant aid for drinking water
treatment without the risk of organic or nutrient release. We
demonstrated that the optimal design method is an efficient approach
for optimizing the coagulation-flocculation process and is appropriate
for raw water treatment.
Abstract: The mechanical properties, including the flexural and tensile
properties, of neat vinyl ester and layered-silicate polymer
nanocomposite materials are discussed. The addition of layered silicate
to the polymer matrix increased the tensile and flexural moduli up to
1 wt.% clay loading. Incorporating more clay decreased the mechanical
properties, which was traced to the existence of aggregated layers.
Likewise, up to 1 wt.% clay loading the thermal behaviour showed
significant improvements, while at higher clay loadings the thermal
performance declined. The aggregated layers had a negative impact on
the overall mechanical and thermal properties. Wide-angle X-ray
diffraction, scanning electron microscopy and transmission electron
microscopy were utilised to characterise the interlamellar structure of
the nanocomposites.
Abstract: Microwave energy is a superior alternative to several other thermal treatments. Extraction techniques are widely employed for the isolation of bioactive compounds and vegetable oils from oil seeds. Among the different and newly available techniques, microwave pretreatment of seeds is a simple and desirable method for the production of high-quality vegetable oils. Microwave pretreatment for oil extraction has many advantages, as follows: improved oil extraction yield and quality, direct extraction capability, lower energy consumption, faster processing time and reduced solvent levels compared with conventional methods. It also allows for better retention and availability of desirable nutraceuticals, such as phytosterols, tocopherols, canolol and phenolic compounds, in extracted oils such as rapeseed oil. This can be a new step toward producing nutritional vegetable oils with improved shelf life because of their high antioxidant content.
Abstract: Support Vector Machine (SVM) is a recent class of statistical classification and regression techniques playing an increasing role in detection problems across various engineering fields, notably in statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, SVM is applied to an infrared (IR) binary communication system with different types of channel models, including Ricean multipath fading and a partially developed scattering channel, with additive white Gaussian noise (AWGN) at the receiver. The structure and performance of SVM in terms of the bit error rate (BER) metric are derived and simulated for these stochastic channel models, and the computational complexity of the implementation, in terms of average computational time per bit, is also presented. The performance of SVM is then compared to classical binary maximum likelihood detection using a matched filter driven by On-Off Keying (OOK) modulation. We found that the performance of SVM is superior to that of the traditional optimal detection schemes used in statistical communication, especially for very low signal-to-noise ratio (SNR) ranges. For large SNR, the performance of the SVM is similar to that of the classical detectors. The implication of these results is that SVM can prove very beneficial to IR communication systems, which notoriously suffer from low SNR, at the cost of increased computational complexity.
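The classical baseline that the paper compares against, maximum
likelihood (threshold) detection of OOK in AWGN, can be sketched as
below. The SNR definition (A²/σ² for 'on' amplitude A) and the
parameters are our assumptions, since the abstract does not specify
them; the SVM detector would replace the fixed threshold with a learned
decision boundary.

```python
import numpy as np
from math import erfc, sqrt

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * erfc(x / sqrt(2.0))

def ook_ber(snr_db, n_bits=200_000, seed=0):
    """Monte-Carlo BER of ML threshold detection of OOK in AWGN.
    SNR is taken here as A^2 / sigma^2 for the 'on' amplitude A."""
    rng = np.random.default_rng(seed)
    A = 1.0
    sigma = A / (10 ** (snr_db / 20))
    bits = rng.integers(0, 2, n_bits)
    r = A * bits + sigma * rng.standard_normal(n_bits)  # received samples
    decisions = (r > A / 2).astype(int)                 # ML threshold A/2
    return float(np.mean(decisions != bits))
```

For equiprobable bits the theoretical BER of this detector is
Q(A / (2σ)), which the simulation should match closely; comparing a
learned detector's simulated BER against this curve is the standard way
to report the kind of result the abstract describes.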
Abstract: This paper introduces a new variable step-size affine projection algorithm (APA) with decorrelation of the AR input process, based on mean square deviation (MSD) analysis. To achieve a fast convergence rate and a small steady-state estimation error, the proposed algorithm uses a variable step size determined by minimising the MSD. Experimental results show that the proposed algorithm achieves better performance than the other algorithms.
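A minimal affine projection algorithm for system identification may
look like the sketch below. It uses a fixed step size rather than the
paper's MSD-minimising variable step size (whose formula the abstract
does not give), and omits the AR-input decorrelation stage; it is a
baseline, not the proposed method.

```python
import numpy as np

def apa_identify(x, d, taps=8, order=4, mu=0.5, delta=1e-4):
    """Affine projection algorithm (APA) identifying an FIR system.
    x: input signal; d: desired signal (system output plus noise);
    order: number of reused past regressor vectors (projection order);
    delta: regularisation of the projection matrix inverse."""
    w = np.zeros(taps)
    for n in range(taps + order - 1, len(x)):
        # U: (taps x order) matrix of the most recent regressor vectors
        U = np.column_stack(
            [x[n - k - taps + 1:n - k + 1][::-1] for k in range(order)])
        e = d[n - order + 1:n + 1][::-1] - U.T @ w     # a-priori errors
        # APA update: w += mu * U (U^T U + delta I)^{-1} e
        w += mu * U @ np.linalg.solve(U.T @ U + delta * np.eye(order), e)
    return w
```

With order = 1 this reduces to NLMS; increasing the projection order
speeds convergence for correlated inputs at the cost of an order-sized
linear solve per sample, which is the trade-off variable step-size
schemes try to improve.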
Abstract: Air emissions from waste treatment plants often consist of a
combination of volatile organic compounds (VOCs) and odors. Hydrogen
sulfide (H2S) is one of the major odorous gases present in the waste
emissions from municipal wastewater treatment facilities; it is
odorous, highly toxic and flammable. Exposure to low concentrations can
result in eye irritation, a sore throat and cough, shortness of breath,
and fluid in the lungs. Biofiltration has become a widely accepted
technology for treating air streams containing H2S. Compared with
non-biological technologies, biofiltration is more cost-effective for
treating large volumes of air containing low concentrations of
biodegradable compounds. Optimization of the biofilter media is
essential for many reasons, such as providing a higher surface area for
biofilm growth, low pressure drop, physical stability, and good
moisture retention. In this work, a novel biofilter media is developed
and tested at a pumping station of a municipality located in the United
Arab Emirates (UAE). The media is found to be very effective (>99%) in
removing the H2S concentrations expected in pumping stations under
steady-state and shock loading conditions.
Abstract: This paper introduces an approach to constructing a set of criteria for evaluating alternative options. Content analysis was used to collect criterion elements. The elements were then classified and organized, yielding a hierarchical structure. The reliability of the constructed criteria was evaluated in an experiment. Finally, the criteria were used to evaluate alternative options in decision-making.
Abstract: In this paper we consider the issue of distributed adaptive estimation over sensor networks. To deal with a more realistic scenario, a different observation-noise variance is assumed for each sensor in the network. To handle these differing variances, the proposed method is divided into two phases: I) estimating each sensor's observation noise variance, and II) using the estimated variances to obtain the desired parameter. Our proposed algorithm is based on a diffusion least mean square (LMS) implementation with a linear combiner model, in which the step-size parameter and the coefficients of the linear combiner are adjusted according to the estimated observation noise variances. As the simulation results show, the proposed algorithm considerably improves on the diffusion LMS algorithm reported in the literature.
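An adapt-then-combine diffusion LMS with variance-aware combiner
weights, in the spirit of the method described, can be sketched as
follows. The details are our assumptions: the noise variances are taken
as known inputs (standing in for the paper's first estimation phase),
the network is fully connected, and only the combiner weights, not the
step size, are variance-dependent.

```python
import numpy as np

def diffusion_lms(U, D, noise_var, mu=0.05):
    """Adapt-then-combine diffusion LMS over a fully connected network.
    U: (K, N, M) regressors of K nodes over N time steps; D: (K, N)
    observations; noise_var: per-node observation-noise variances.
    Combiner weights are set inversely proportional to each node's
    noise variance, so noisier sensors contribute less."""
    K, N, M = U.shape
    c = 1.0 / np.asarray(noise_var)
    c = c / c.sum()                       # variance-aware combiner weights
    W = np.zeros((K, M))
    for n in range(N):
        psi = np.empty_like(W)
        for k in range(K):                # local LMS adaptation step
            e = D[k, n] - U[k, n] @ W[k]
            psi[k] = W[k] + mu * e * U[k, n]
        W = np.tile(c @ psi, (K, 1))      # combine (fully connected)
    return W[0]
```

The inverse-variance combiner is the natural linear-combiner choice
here: it down-weights high-noise sensors, which is exactly the effect
the abstract's variance-adjusted coefficients aim for.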
Abstract: Breast motion and discomfort have been studied in Australia,
Britain and the United States, while little is known about the breast
motion conditions of Chinese women. The aim of this paper was to study
the breast motion and discomfort of Chinese women under no-bra,
daily-bra and sports-bra conditions. Breast motion and discomfort of 8
participants were assessed during walking at 5 km/h and running at
10 km/h. Statistical methods were used to analyze the differences and
relationships between breast displacement, perceived breast motion and
breast discomfort. Three indexes were developed to evaluate the
effectiveness of bras in reducing objective breast motion, subjective
breast motion and breast discomfort. The results showed that the breast
motion of Chinese women was smaller than in previous research, which
may result from the smaller breast size of Asian women.
Abstract: The segmentation of endovascular tools in fluoroscopy images can be performed accurately, automatically or with minimal user intervention, using known modern techniques; this has been proven in the literature, but no clinical implementation exists so far because the computational time requirements of such technology have not yet been met. A classical segmentation scheme is composed of edge enhancement filtering, line detection, and segmentation. A new method is presented that consists of a vector that propagates in the image to track an edge as it advances. The filtering is performed progressively along the projected path of the vector, whose orientation allows for oriented edge detection, and only a minimal image area is filtered overall. Such an algorithm is rapidly computed and can be implemented in real-time applications. It was tested on medical fluoroscopy images from an endovascular cerebral intervention. Experiments showed that the 2D tracking was limited to guidewires without intersection crosspoints, while the 3D implementation was able to cope with such planar difficulties.
Abstract: The entropy of an intuitionistic fuzzy set is used to indicate the degree of fuzziness of an interval-valued intuitionistic fuzzy set (IvIFS). In this paper, we deal with the entropies of IvIFSs. First, we propose a family of entropies on IvIFSs with a parameter λ ∈ [0, 1], which generalizes two entropy measures defined independently by Zhang and Wei for IvIFSs, and we prove that the new entropy is an increasing function with respect to the parameter λ. Furthermore, a new multiple attribute decision making (MADM) method using entropy-based attribute weights is proposed to deal with decision-making situations in which the alternatives' ratings on the attributes are expressed by IvIFSs and the attribute weight information is unknown. Finally, a numerical example is given to illustrate the application of the proposed method.
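The abstract does not give the entropy formulas, so as an illustration
of the entropy-based attribute-weighting idea in MADM, here is the
classical Shannon entropy-weight method on a crisp decision matrix, a
simplified stand-in for the IvIFS setting: attributes whose ratings
barely discriminate between alternatives (high entropy) receive low
weight.

```python
import numpy as np

def entropy_weights(X):
    """Entropy-based attribute weights for MADM.
    X: (m alternatives, n attributes) matrix of positive scores.
    Higher-entropy (less informative) attributes get lower weight."""
    m, n = X.shape
    P = X / X.sum(axis=0)                          # column-normalise
    E = -(P * np.log(P)).sum(axis=0) / np.log(m)   # entropies in [0, 1]
    return (1.0 - E) / (1.0 - E).sum()

def rank_alternatives(X):
    """Weighted-sum scores using entropy weights; higher is better."""
    return X @ entropy_weights(X)
```

In the interval-valued intuitionistic setting the same weighting
formula w_j = (1 - e_j) / Σ_k (1 - e_k) applies, with e_j computed by
the IvIFS entropy family instead of Shannon entropy.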
Abstract: Investment in a constructed facility represents a cost in the
short term that returns benefits only over the long-term use of the
facility. Thus, the costs occur earlier than the benefits, and the
owners of facilities must obtain the capital resources to finance the
costs of construction. A project cannot proceed without adequate
financing, and the cost of providing adequate financing can be quite
large. For these reasons, attention to project finance is an important
aspect of project management. Finance is also a concern to the other
organizations involved in a project, such as the general contractor and
material suppliers. Unless an owner immediately and completely covers
the costs incurred by each participant, these organizations face
financing problems of their own. At a more general level, project
finance is only one aspect of the general problem of corporate finance.
If numerous projects are considered and financed together, then the net
cash flow requirements constitute the corporate financing problem for
capital investment. Whether project finance is handled at the project
level or at the corporate level does not alter the basic financing
problem. In this paper, we first consider facility financing from the
owner's perspective, with due consideration for its interaction with
the other organizations involved in a project. Later, we discuss the
problems of construction financing, which are crucial to the
profitability and solvency of construction contractors. The objective
of this paper is to present the steps used to determine the best
combination for minimum project financing. The proposed model considers
financing, schedule and maximum net area, and is called Project
Financing and Schedule Integration using Genetic Algorithms (PFSIGA).
The model is intended to determine further steps (maximum net area) for
any project with subprojects. An illustrative example demonstrates the
features of this technique, and model verification and testing are also
addressed.
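PFSIGA's chromosome encoding and fitness function are not specified in
the abstract; a generic genetic-algorithm skeleton of the kind such a
model builds on might look like the sketch below. The OneMax objective
is a toy stand-in for a real financing-cost/schedule fitness, and all
parameters are illustrative assumptions.

```python
import numpy as np

def genetic_algorithm(fitness, n_bits=16, pop_size=40, gens=60,
                      p_cross=0.9, p_mut=0.02, seed=0):
    """Minimal GA: tournament selection, one-point crossover, bit-flip
    mutation. Maximises `fitness` over binary chromosomes."""
    rng = np.random.default_rng(seed)
    pop = rng.integers(0, 2, (pop_size, n_bits))
    for _ in range(gens):
        fit = np.array([fitness(ind) for ind in pop])
        new = []
        for _ in range(pop_size // 2):
            # binary tournament selection of two parents
            i, j, k, l = rng.integers(0, pop_size, 4)
            p1 = pop[i] if fit[i] >= fit[j] else pop[j]
            p2 = pop[k] if fit[k] >= fit[l] else pop[l]
            c1, c2 = p1.copy(), p2.copy()
            if rng.random() < p_cross:            # one-point crossover
                x = rng.integers(1, n_bits)
                c1[x:], c2[x:] = p2[x:], p1[x:]
            for c in (c1, c2):                    # bit-flip mutation
                flip = rng.random(n_bits) < p_mut
                c[flip] ^= 1
            new += [c1, c2]
        pop = np.array(new)
    fit = np.array([fitness(ind) for ind in pop])
    return pop[np.argmax(fit)], fit.max()

# Toy objective standing in for a financing/schedule fitness: maximise
# the number of ones in the chromosome (OneMax).
best, score = genetic_algorithm(lambda ind: ind.sum())
```

In a financing/schedule integration model the chromosome would instead
encode financing alternatives and activity timings, and the fitness
would penalise financing cost and schedule violations.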
Abstract: The aim of this paper is to present a three-step methodology
for forecasting supply chain demand. In the first step, various data
mining techniques are applied to prepare the data for entry into the
forecasting models. In the second step, the modeling step, an
artificial neural network and a support vector machine are presented,
after defining the Mean Absolute Percentage Error (MAPE) index for
measuring error. The structure of the artificial neural network is
selected based on previous researchers' results, and in this article
the accuracy of the network is increased by using sensitivity analysis.
The best forecast of the classical forecasting methods (moving average,
exponential smoothing, and exponential smoothing with trend) is
obtained from the prepared data, and this forecast is compared with the
results of the support vector machine and the proposed artificial
neural network. The results show that the artificial neural network
forecasts more precisely than the other methods. Finally, the stability
of the forecasting methods is analyzed using the raw data, and the
effectiveness of clustering analysis is measured.
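The classical baselines named above can be sketched as one-step-ahead
forecasters together with the MAPE error index; the window size and
smoothing constant below are illustrative assumptions, not values from
the paper.

```python
import numpy as np

def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent."""
    actual = np.asarray(actual, float)
    forecast = np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

def moving_average_forecast(y, window=3):
    """One-step-ahead forecasts: mean of the last `window` observations."""
    y = np.asarray(y, float)
    return np.array([y[t - window:t].mean() for t in range(window, len(y))])

def exp_smoothing_forecast(y, alpha=0.3):
    """One-step-ahead simple exponential smoothing forecasts."""
    y = np.asarray(y, float)
    s = y[0]
    out = []
    for t in range(1, len(y)):
        out.append(s)          # forecast for y[t] uses data up to t-1
        s = alpha * y[t] + (1 - alpha) * s
    return np.array(out)
```

Comparing such baselines by MAPE on a held-out segment, then doing the
same for the neural network and support vector machine, reproduces the
evaluation protocol the abstract describes.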
Abstract: The advent of multi-million-gate Field Programmable Gate
Arrays (FPGAs) with hardware support for multiplication opens an
opportunity to recreate a significant portion of the front end of the
human cochlea using this technology. In this paper we describe the
implementation of the cochlear filter and show that it is entirely
suited to implementation on a single XC3S500 FPGA device. The filter
gave a good fit to real-time data with efficient hardware usage.
Abstract: In-core memory requirements are a bottleneck in solving large
three-dimensional Navier-Stokes finite element problem formulations
using sparse direct solvers. An out-of-core solution strategy is a
viable alternative for reducing the in-core memory requirements when
solving large-scale problems. This study evaluates the performance of
various out-of-core sequential solvers based on multifrontal or
supernodal techniques in the context of finite element formulations for
three-dimensional problems on a Windows platform. Three solvers,
HSL_MA78, MUMPS and PARDISO, are compared. Their performance is
evaluated on a 64-bit machine with 16 GB RAM for the finite element
formulation of flow through a rectangular channel. It is observed that
relatively large problems can be solved using the out-of-core PARDISO
solver. The implementation of the Newton and modified Newton iterations
is also discussed.
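The Newton and modified Newton iterations mentioned at the end can be
sketched as follows; the test function is illustrative, and in the
setting of the paper the dense `solve` would be replaced by the
out-of-core sparse direct solver, which is precisely why modified
Newton (one Jacobian factorisation reused across iterations) is
attractive there.

```python
import numpy as np

def newton(F, J, x0, tol=1e-10, max_iter=50, modified=False):
    """Newton (or modified Newton) iteration for F(x) = 0.
    Modified Newton evaluates and factorises the Jacobian once at x0
    and reuses it, trading a slower (linear) convergence rate for a
    single expensive factorisation."""
    x = np.asarray(x0, float)
    J0 = J(x) if modified else None
    for it in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            return x, it                       # converged in `it` steps
        A = J0 if modified else J(x)           # reuse or refresh Jacobian
        x = x - np.linalg.solve(A, Fx)
    return x, max_iter
```

On a small test system (a circle intersected with a line), full Newton
converges quadratically in a handful of iterations, while modified
Newton needs more, but each of its iterations avoids a fresh Jacobian
factorisation.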
Abstract: Developing countries face a problem of slums, and there appears to be no foolproof solution to eradicate them. Of the three approaches to slum development for improving quality of life, the in-situ upgradation approach is found to be the best, while the relocation approach has proved to be a failure. The factors responsible for the failure of relocation projects need to be assessed, which is the basic aim of this paper. These factors are loss of livelihood, lack of security of tenure, and inefficiency of the government; they are traced and mapped from examples of Western and Indian cities. The National Habitat and Resettlement Policy emphasized the relationship between shelter and workplace. The SRA has identified 55 slums for relocation due to reservation of land uses, security of tenure and the non-notified status of slums. Policy guidelines are suggested for successful relocation projects. Keywords: Livelihood, Relocation, Slums, Urban poor.
Abstract: In this paper, we propose a Perceptually Optimized Foveation-based Embedded ZeroTree Image Coder (POEFIC) that applies perceptual weighting to wavelet coefficients prior to SPIHT encoding in order to reach a targeted bit rate with improved perceptual quality, given a fixation point that determines the region of interest (ROI). The paper also introduces a new objective quality metric based on a psychovisual model integrating the properties of the human visual system (HVS), which plays an important role in our POEFIC quality assessment. Our POEFIC coder is based on a vision model that incorporates various masking effects of HVS perception. Thus, our coder weights the wavelet coefficients based on that model and attempts to increase the perceptual quality for a given bit rate and observation distance. The perceptual weights for all wavelet subbands are computed based on 1) foveation masking, to remove or reduce high frequencies in peripheral regions; 2) luminance and contrast masking; and 3) the contrast sensitivity function (CSF), to achieve the perceptual decomposition weighting. The new perceptually optimized codec has the same complexity as the original SPIHT technique. The experimental results show that our coder demonstrates very good performance in terms of quality measurement.