Abstract: This paper presents a design method for self-tuning
Quantitative Feedback Theory (QFT) using an improved deadbeat
control algorithm. QFT is a technique for achieving robust control with
pre-defined specifications, whereas deadbeat is an algorithm that
can bring the output to steady state in a minimum number of steps.
Nevertheless, there are usually large peaks in the deadbeat response.
By integrating QFT specifications into the deadbeat algorithm, these
large peaks can be kept within tolerance. On the other hand, merging
QFT with an adaptive element produces a robust controller with wider
coverage of uncertainty. By combining the QFT-based deadbeat
algorithm with an adaptive element, a superior controller, called the
self-tuning QFT-based deadbeat controller, can be achieved. An output
response that is fast, robust and adaptive is expected. Using a grain
dryer plant model as a pilot case study, the performance of the
proposed method has been evaluated and analyzed. The grain drying
process is very complex, with highly nonlinear behaviour and long
delay, and is affected by environmental changes and disturbances.
Performance comparisons have been made between the
proposed self-tuning QFT-based deadbeat, standard QFT and
standard deadbeat controllers. The test results prove the efficiency of
the self-tuning QFT-based deadbeat controller: its parameters are
updated online, and it achieves a lower overshoot percentage and a
shorter settling time, especially when there are variations in
the plant.
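The deadbeat idea underlying the abstract can be illustrated on a first-order discrete plant: choose the input so that the closed-loop output reaches the setpoint in a single step. The plant parameters a and b below are illustrative assumptions, not the paper's grain-dryer model or its QFT-based self-tuning design.

```python
# Illustrative deadbeat control of a first-order discrete plant
# y[k+1] = a*y[k] + b*u[k].  Parameters are hypothetical; the
# paper's self-tuning QFT-based design is more elaborate.

def deadbeat_step(r, y, a, b):
    """Control input that drives a first-order plant output to the
    setpoint r in a single step (the deadbeat property)."""
    return (r - a * y) / b

def simulate(r=1.0, a=0.8, b=0.5, steps=5):
    y, history = 0.0, []
    for _ in range(steps):
        u = deadbeat_step(r, y, a, b)
        y = a * y + b * u          # plant update
        history.append(y)
    return history

print(simulate())  # output settles at the setpoint after one step
```

For higher-order plants the deadbeat response takes as many steps as the plant order, and the large intermediate control peaks mentioned above are exactly what the QFT specifications are meant to bound.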
Abstract: The present study was undertaken to screen the in vitro
antimicrobial activities of the D-galactose-binding sponge lectin
HOL-30. HOL-30 was purified from the marine demosponge
Halichondria okadai by affinity chromatography. The molecular
mass of the lectin was determined to be 30 kDa, with a single
polypeptide, by SDS-PAGE under non-reducing and reducing
conditions. HOL-30 agglutinated trypsinized and glutaraldehyde-fixed
rabbit and human erythrocytes, with a preference for type O
erythrocytes. The lectin was evaluated for inhibition of microbial
growth by the disc diffusion method against eleven human
pathogenic Gram-positive and Gram-negative bacteria. The lectin
exhibited strong antibacterial activity against Gram-positive bacteria
such as Bacillus megaterium and Bacillus subtilis. However, it had
no effect against Gram-negative bacteria such as Salmonella typhi
and Escherichia coli. The largest zones of inhibition were recorded
for Bacillus megaterium (12 mm in diameter) and Bacillus subtilis
(10 mm in diameter) at a lectin concentration of 250 μg/disc. On the
other hand, the antifungal activity of the lectin was investigated
against six phytopathogenic fungi using the food poisoning
technique. The lectin showed maximum inhibition (22.83%) of the
mycelial growth of Botryodiplodia theobromae at a concentration of
100 μg/mL of media. These findings indicate that the lectin may be
of importance to clinical microbiology and may have therapeutic
applications.
Abstract: In this paper, we provide complete end-to-end delay analyses, including the relay nodes, for instant messages. The Message Session Relay Protocol (MSRP) is used to provide congestion control for large messages in the Instant Messaging (IM) service. Large messages are broken into several chunks. According to the IETF specification of the MSRP relay extensions, these chunks may traverse a maximum of two relay nodes before reaching the destination. We discuss the current solutions for sending large instant messages and introduce a proposal to reduce message flows in the IM service. We consider a virtual traffic parameter, i.e., the relay nodes are assumed to be stateless and non-blocking for scalability purposes. This type of relay node is also assumed to have a constant-bit-rate input. We provide a new scheduling policy that schedules chunks according to their previous node's delivery time stamp tags. Validation and analysis are shown for this scheduling policy. The performance analysis with the model introduced in this paper is simple and straightforward, and leads to reduced message flows in the IM service.
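The scheduling policy described, serving chunks in order of the delivery time stamp tagged by the previous node, can be sketched with a priority queue. The chunk identifiers and time stamps below are invented for illustration and are not the paper's traffic model.

```python
# Toy sketch of scheduling chunks by the previous node's delivery
# time stamp tag, as a stateless relay might forward them.
import heapq

def schedule(chunks):
    """chunks: list of (prev_node_timestamp, chunk_id) pairs.
    Returns chunk ids in the order the relay forwards them."""
    heap = list(chunks)
    heapq.heapify(heap)        # min-heap keyed on the time stamp
    order = []
    while heap:
        _, chunk_id = heapq.heappop(heap)
        order.append(chunk_id)
    return order

print(schedule([(3, "c3"), (1, "c1"), (2, "c2")]))  # ['c1', 'c2', 'c3']
```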
Abstract: The purpose of this study was to investigate effects of
modality and redundancy principles on music theory learning among
pupils of different anxiety levels. The lesson of music theory was
developed in three different modes, audio and image (AI), text with
image (TI) and audio with image and text (AIT). The independent
variables were the three modes of courseware. The moderator
variable was the anxiety level, while the dependent variable was the
post-test score. The study sample consisted of 405 third-grade pupils.
Descriptive and inferential statistics were used to analyze the
collected data. Analysis of covariance (ANCOVA) and post hoc tests
were carried out to examine the main effects as well as the
interaction effects of the independent variables on the dependent
variable. The findings of this study showed that medium anxiety
pupils performed significantly better than low and high anxiety
pupils in all the three treatment modes. The AI mode was found to
help pupils with high anxiety significantly more than the TI and AIT
modes.
Abstract: Air conditioning is mainly used as a medium for human
comfort cooling. It is used even more in high-temperature countries
such as Malaysia. Proper estimation of the cooling load will achieve
the ideal temperature; without it, the load may be over- or
under-estimated. The ideal temperature should be comfortable
enough. This study develops a program to calculate the ideal cooling
load demand, matched to the heat gain. With it, cooling load
estimation becomes straightforward. The objective of this study is to
develop a user-friendly and easily accessible cooling load program,
so that the cooling load can be estimated by any individual rather
than by rule-of-thumb. The software was developed using a Matlab
GUI. The development is valid only for common buildings in
Malaysia. An office building was selected as a case study to verify
the applicability and accuracy of the developed software. In
conclusion, the main objective has been achieved: the developed
software is user-friendly and easily estimates the cooling load demand.
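As a minimal sketch of the kind of calculation such a program performs, the snippet below sums sensible envelope heat gains (Q = U x A x dT) plus internal gains. The U-values, areas and gains are hypothetical; the Matlab GUI program described above is more complete.

```python
# Minimal cooling-load sketch: sensible envelope gains plus internal
# gains.  All figures below are hypothetical illustrations.

def cooling_load(surfaces, internal_gain_w):
    """surfaces: list of (U_value W/m2K, area m2, delta_T K).
    Returns the total cooling load in watts."""
    envelope = sum(u * a * dt for u, a, dt in surfaces)
    return envelope + internal_gain_w

office = [(2.5, 40.0, 8.0),   # wall: U, area, indoor-outdoor dT
          (5.8, 10.0, 8.0)]   # window
print(cooling_load(office, internal_gain_w=1500.0))  # watts
```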
Abstract: Market based models are frequently used in the resource
allocation on the computational grid. However, as the size of
the grid grows, it becomes difficult for the customer to negotiate
directly with all the providers. Middle agents are introduced to
mediate between the providers and customers and facilitate the
resource allocation process. The most frequently deployed middle
agents are the matchmakers and the brokers. The matchmaking agent
finds possible candidate providers who can satisfy the requirements
of the consumers, after which the customer directly negotiates with
the candidates. The broker agents mediate the negotiation with
the providers in real time.
In this paper we present a new type of middle agent, the marketmaker.
Its operation is based on two parallel processes: through
the investment process the marketmaker acquires resources and
resource reservations in large quantities, while through the resale
process it sells them to the customers. The operation of the marketmaker
is based on the fact that through its global view of the grid it can
perform a more efficient resource allocation than the one possible in
one-to-one negotiations between the customers and providers.
We present the operation and algorithms governing the operation
of the marketmaker agent, contrasting it with the matchmaker and
broker agents. Through a series of simulations in the task-oriented
domain we compare the operation of the three agent types. We find
that the use of the marketmaker agent leads to better performance in
the allocation of large tasks and a significant reduction in messaging
overhead.
Abstract: Attachment of circulating monocytes to the
endothelium is among the earliest detectable events in the formation
of atherosclerosis. Genes for adhesion molecules, chemokines and
matrix proteases have been identified as being expressed in atherogenesis.
Expressions of these genes may influence structural integrity of the
luminal endothelium. The aim of this study is to relate changes in the
ultrastructural morphology of the aortic luminal surface and gene
expressions of the endothelial surface, chemokine and MMP-12 in
normal and hypercholesterolemic rabbits. Luminal endothelial
surface from rabbit aortic tissue was examined by scanning electron
microscopy (SEM) using low vacuum mode to ascertain
ultrastructural changes in development of atherosclerotic lesion. Gene
expression of adhesion molecules, MCP-1 and MMP-12 was studied
by Real-time PCR. Ultrastructural observations of the aortic luminal
surface exhibited changes from normal regular smooth intact
endothelium to irregular luminal surface including marked globular
appearance and ruptures of the membrane layer. Real-time PCR
demonstrated differential expression of the studied genes in
atherosclerotic tissues. The appearance of ultrastructural changes in
aortic tissue of hypercholesterolemic rabbits is suggested to be
related to underlying changes in endothelial surface molecule,
chemokine and MMP-12 gene expression.
Abstract: Model-based approaches have been applied successfully
to a wide range of tasks such as specification, simulation, testing, and
diagnosis. But one bottleneck often prevents the introduction of these
ideas: Manual modeling is a non-trivial, time-consuming task.
Automatically deriving models by observing and analyzing running
systems is one possible way to alleviate this bottleneck. To
derive a model automatically, some a priori knowledge about the
model structure, i.e. about the system, must exist. Such a model
formalism would be used as follows: (i) By observing the network
traffic, a model of the long-term system behavior could be generated
automatically, (ii) Test vectors can be generated from the model,
(iii) While the system is running, the model could be used to diagnose
non-normal system behavior.
The main contribution of this paper is the introduction of a model
formalism called 'probabilistic regression automaton' suitable for the
tasks mentioned above.
Abstract: Support vector machines (SVMs) have shown
superior performance compared to other machine learning techniques,
especially in classification problems. Yet one limitation of SVMs is
the lack of an explanation capability which is crucial in some
applications, e.g. in the medical and security domains. In this paper, a
novel approach for eclectic rule-extraction from support vector
machines is presented. This approach utilizes the knowledge acquired
by the SVM and represented in its support vectors as well as the
parameters associated with them. The approach includes three stages:
training, propositional rule-extraction and rule quality evaluation.
Results from four different experiments have demonstrated the value
of the approach for extracting comprehensible rules of high accuracy
and fidelity.
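As a toy illustration of propositional rule extraction in the spirit described above (not the authors' eclectic algorithm), one can build an interval rule per class covering that class's support vectors. The support vectors and labels below are invented; a real run would take them, together with their dual coefficients, from the trained SVM.

```python
# Hypothetical sketch: one axis-aligned interval rule per class,
# built from the (made-up) support vectors of a trained SVM.

def extract_rules(support_vectors, labels):
    """Build one propositional rule per class: a conjunction of
    per-feature intervals covering that class's support vectors."""
    rules = {}
    for sv, lab in zip(support_vectors, labels):
        lo, hi = rules.setdefault(lab, ([*sv], [*sv]))
        for i, v in enumerate(sv):
            lo[i] = min(lo[i], v)
            hi[i] = max(hi[i], v)
    return rules  # {class: (lower_bounds, upper_bounds)}

def rule_covers(rule, x):
    """True if x satisfies every interval condition of the rule."""
    lo, hi = rule
    return all(l <= v <= h for l, v, h in zip(lo, x, hi))

svs = [[0.0, 0.1], [0.2, 0.0], [1.0, 0.9], [0.8, 1.0]]
ys  = ["neg", "neg", "pos", "pos"]
rules = extract_rules(svs, ys)
print(rule_covers(rules["pos"], [0.9, 0.95]))  # True
```

Fidelity, one of the rule-quality measures mentioned above, would then be the fraction of inputs on which the rules and the SVM predict the same class.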
Abstract: In this study, the contact problem of a layered composite which consists of two materials with different elastic constants and heights resting on two rigid flat supports with sharp edges is considered. The effect of gravity is neglected. While friction between the layers is taken into account, it is assumed that there is no friction between the supports and the layered composite so that only compressive tractions can be transmitted across the interface. The layered composite is subjected to a uniform clamping pressure over a finite portion of its top surface. The problem is reduced to a singular integral equation in which the contact pressure is the unknown function. The singular integral equation is evaluated numerically and the results for various dimensionless quantities are presented in graphical forms.
Abstract: Isobaric vapor-liquid equilibrium measurements are
reported for the binary mixture of methyl acetate and
isopropylbenzene at 97.3 kPa. The measurements have been
performed using a vapor recirculating type (modified Othmer's)
equilibrium still. The mixture shows positive deviation from ideality
and does not form an azeotrope. The activity coefficients have been
calculated taking into consideration the vapor phase nonideality. The
data satisfy the thermodynamic consistency tests of Herington and
Black. The activity coefficients have been satisfactorily correlated by
means of the Margules, NRTL, and Black equations. A comparison
of the values of activity coefficients obtained by experimental data
with the UNIFAC model has been made.
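Of the correlations named, the two-parameter Margules model is simple to state: ln g1 = x2^2 [A12 + 2(A21 - A12) x1] and symmetrically for g2. The constants below are illustrative, not the fitted values for the methyl acetate + isopropylbenzene system.

```python
# Two-parameter Margules activity coefficients for a binary mixture.
# A12, A21 here are illustrative, not the paper's fitted constants.
import math

def margules(x1, A12, A21):
    """Return (gamma1, gamma2) at liquid mole fraction x1."""
    x2 = 1.0 - x1
    ln_g1 = x2**2 * (A12 + 2.0 * (A21 - A12) * x1)
    ln_g2 = x1**2 * (A21 + 2.0 * (A12 - A21) * x2)
    return math.exp(ln_g1), math.exp(ln_g2)

g1, g2 = margules(x1=0.5, A12=0.6, A21=0.8)
print(g1 > 1.0 and g2 > 1.0)  # positive deviation from ideality: True
```

Both coefficients approach 1 as the corresponding component becomes pure, consistent with the Raoult's-law limit.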
Abstract: The main goal of this paper is to establish a
methodology for testing and optimizing GPRS performance over the
Libyan GSM network, as well as to propose a suitable optimization
technique to improve performance. Some measurements of
download, upload, throughput, round-trip time, reliability, handover,
security enhancement and packet loss over a GPRS access network
were carried out. Measured values are compared to the theoretical
values that can be calculated beforehand. The data are
processed and delivered by the server across the wireless network to
the client, which takes the received pieces of data and processes
them on the fly. We also illustrate the results by describing the
main parameters that affect the quality of service. Finally, Libya's
two mobile operators, Libyana Mobile Phone and Al-Madar al-
Jadeed Company, are selected as a case study to validate our
methodology.
Abstract: Positron emission particle tracking (PEPT) is a
technique in which a single radioactive tracer particle can be
accurately tracked as it moves. A limitation of PET is that in order to
reconstruct a tomographic image it is necessary to acquire a large
volume of data (millions of events), so it is difficult to study rapidly
changing systems. In this respect, PEPT is a very fast process
compared with PET.
In PEPT, the detection of both photons defines a line, and the annihilation
is assumed to have occurred somewhere along this line. The location
of the tracer can be determined to within a few mm from coincident
detection of a small number of pairs of back-to-back gamma rays and
using triangulation. This can be achieved many times per second and
the track of a moving particle can be reliably followed. This
technique was invented at the University of Birmingham [1].
In PEPT the aim is not to form an image of the tracer particle
but simply to determine its location over time. If this tracer is
followed for a long enough period within a closed, circulating system
it explores all possible types of motion.
The application of PEPT to industrial process systems carried out
at the University of Birmingham falls into two areas: the
behaviour of granular materials and of viscous fluids. Granular
materials are processed in industry, for example in the manufacture
of pharmaceuticals, ceramics, food and polymers, and PEPT has
been used in a number of ways to study the behaviour of these systems [2].
PEPT allows the possibility of tracking a single particle within the
bed [3]. PEPT has also been used to study systems such as fluid
flow and viscous fluids in mixers [4], using a neutrally buoyant
tracer particle [5].
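The triangulation step described above can be sketched as follows: each coincident photon pair defines a line, and the tracer position is estimated as the least-squares point closest to all the lines. The lines below are synthetic test data, not detector measurements.

```python
# Sketch of PEPT-style triangulation: find the point minimising the
# summed squared distance to a set of 3-D lines (one per photon pair).
import numpy as np

def locate(points, directions):
    """points[i]: a point on line i; directions[i]: its direction.
    Solves sum_i (I - d_i d_i^T) x = sum_i (I - d_i d_i^T) p_i."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, directions):
        d = d / np.linalg.norm(d)
        M = np.eye(3) - np.outer(d, d)   # projector orthogonal to line i
        A += M
        b += M @ p
    return np.linalg.solve(A, b)

true_pos = np.array([1.0, 2.0, 3.0])
dirs = [np.array([1.0, 0, 0]), np.array([0, 1.0, 0]), np.array([1.0, 1.0, 0])]
pts = [true_pos + 0.5 * d for d in dirs]   # lines through the true position
print(np.allclose(locate(pts, dirs), true_pos))  # True
```

In practice the algorithm is applied iteratively, discarding outlier (corrupt or scattered) events, so that a small number of gamma-ray pairs suffices to locate the tracer to within a few mm.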
Abstract: Feature and model selection are at the center of
attention of many researchers because of their impact on classifier
performance. Both selections are usually performed separately, but
recent developments suggest using a combined GA-SVM approach to
perform them simultaneously. This approach improves the
performance of the classifier identifying the best subset of variables
and the optimal parameter values. Although GA-SVM is an
effective method, it is computationally expensive, so a cheaper
method can be considered. The paper investigates a joint approach
of Genetic Algorithm and kernel matrix criteria to perform
simultaneously feature and model selection for SVM classification
problem. The purpose of this research is to improve the classification
performance of SVM through an efficient approach, the Kernel
Matrix Genetic Algorithm method (KMGA).
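The abstract does not spell out which kernel matrix criterion KMGA uses; kernel-target alignment is one common such measure and serves below as an illustrative stand-in for the GA fitness. The data, the RBF kernel choice and the gamma values are assumptions.

```python
# Kernel-target alignment as a cheap, classifier-free fitness for a
# genetic search over features and kernel parameters (illustrative
# stand-in for the paper's unnamed kernel matrix criterion).
import numpy as np

def rbf_kernel(X, gamma):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def alignment(K, y):
    """Frobenius alignment between K and the ideal kernel y y^T."""
    Y = np.outer(y, y)
    return (K * Y).sum() / (np.linalg.norm(K) * np.linalg.norm(Y))

X = np.array([[0.0, 0.0], [0.1, 0.0], [3.0, 3.0], [3.1, 3.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
# a well-chosen gamma should align better than a degenerate one
print(alignment(rbf_kernel(X, 1.0), y) > alignment(rbf_kernel(X, 1e-6), y))
```

A full KMGA-style method would wrap a genetic search over feature subsets and kernel parameters around this fitness, replacing the repeated SVM training that makes GA-SVM expensive.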
Abstract: In a bi-fuel diesel engine, the carburetor plays a vital
role in switching from fuel-gas to petrol mode operation and vice
versa. The carburetor is the most important part of the fuel system
of a diesel engine. All diesel engines carry variable-venturi mixer
carburetors. The basic operation of the carburetor mainly depends on
the restriction barrel called the venturi. When air flows through the
venturi, its speed increases and its pressure decreases. The main
challenge is to design a mixing device that mixes the supplied gas
with the incoming air at an optimum ratio. In order to overcome the
identified problems, the way fuel gas and air flow in the mixer has
to be analyzed. In this case, the Computational Fluid Dynamics
(CFD) approach is applied to the design of the prototype mixer. The
present work aims at a further understanding of the air and fuel flow
structure by performing CFD studies using a software code. In this
study, several mixers were designed for mixing air and gas under the
conditions mentioned above. Then, using computational fluid
dynamics, the optimum mixer was selected. The results indicated
that the mixer with 12 holes produces a more homogeneous mixture
than the 8-hole and 6-hole mixers. The results also showed that if
the inlet convergence is smoother than the outlet divergence, the
mixture becomes more homogeneous, because turbulence increases
in the outlet divergence.
Abstract: Vapour recompression systems have been used to
reduce energy consumption and improve the energy effectiveness
of distillation columns. However, the effects of
certain parameters have not been taken into consideration. One of
such parameters is the column heat loss which has either been
assumed to be a certain percent of reboiler heat transfer or negligible.
The purpose of this study was to evaluate the heat loss from an
ethanol-water vapour recompression distillation column with
pressure increase across the compressor (VRCAS) and to compare
the results obtained, and their effect on some parameters, with a
similar system (VRCCS) where the column heat loss has been
assumed or neglected. Results show that the evaluated heat loss was
higher than that obtained for the column VRCCS. The results also showed
that increase in heat loss could have significant effect on the total
energy consumption, reboiler heat transfer, the number of trays and
energy effectiveness of the column.
Abstract: Characterized by rich mineral content, low
temperature, few bacteria, and stability, with numerous applications
in aquaculture, food, drinking water, and leisure, deep sea water
(DSW) development has become a new industry in the world. It has
been reported that marine algae contain various biologically active
compounds. This research focused on the effects of cultivating
Sargassum cristaefolium with different proportions of deep sea
water (DSW) and surface sea water (SSW). After two and four
weeks, the total phenolic contents of Sargassum cristaefolium
cultured in the different ways were compared, and their reducing
activity was also assayed with potassium ferricyanide. The fresh
seaweeds were oven-dried and ground to powder. The cultured
marine algae were then extracted with water at 90 °C for 1 h. The
total phenolic contents were determined using the Folin–Ciocalteu
method. The results are as follows: the highest total phenolic content
and the best reducing ability were observed for the 1/4 proportion of
DSW to SSW cultured for two weeks. Furthermore, the 1/2
proportion of DSW to SSW also showed good reducing ability and
plentiful phenolic composition. Finally, we confirmed that the
proportion of DSW to SSW is the major factor affecting both the
total phenolic content and the reducing ability of Sargassum
cristaefolium. In the future, we will use this approach for mass
production of this marine alga or other microalgae in industrial
applications.
Abstract: The production of activated carbon from low- or zero-cost agricultural by-products or wastes has received great attention from academics and practitioners due to its economic and environmental benefits. In the production of bamboo furniture, a significant amount of bamboo waste is inevitably generated. Therefore, this research aimed to prepare activated carbons from bamboo furniture waste by chemical (KOH) activation and to determine their properties and adsorption capacities for water treatment. The influence of carbonization time on the properties and adsorption capacities of the activated carbons was also investigated. The findings showed that the bamboo-derived activated carbons had microporous characteristics. They exhibited a high tendency to reduce the impurities present in effluent water. Their adsorption capacities were comparable to that of a commercial activated carbon with regard to the reduction in COD, TDS and turbidity of the effluent water.
Abstract: Mercury adsorption on soil was investigated at
different ionic strengths using Ca(NO3)2 as a background electrolyte.
Results fitted the Langmuir equation and the adsorption isotherms
reached a plateau at higher equilibrium concentrations. Increasing
ionic strength decreased the sorption of mercury, due to the
competition of Ca ions for the sorption sites in the soils. The
influence of ionic strength was related to the mechanisms of heavy
metal sorption by the soil. These results can be of practical
importance in both agricultural and contaminated soils, since the
solubility of mercury in soils is strictly dependent on the adsorption
and release processes.
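The Langmuir fit mentioned above can be sketched via the linearised form C/q = C/q_max + 1/(q_max K), so that a straight-line fit of C/q against C yields both parameters. The data below are synthetic, not the measured mercury isotherms.

```python
# Fitting the Langmuir isotherm q = q_max*K*C/(1 + K*C) through its
# linearised form C/q = C/q_max + 1/(q_max*K).  Synthetic data only.
import numpy as np

def fit_langmuir(C, q):
    """Return (q_max, K) from equilibrium concentrations C and
    sorbed amounts q via a linear fit of C/q vs C."""
    slope, intercept = np.polyfit(C, C / q, 1)
    q_max = 1.0 / slope
    K = slope / intercept          # = 1 / (q_max * intercept)
    return q_max, K

C = np.linspace(0.5, 10.0, 20)
q = 4.0 * 0.9 * C / (1.0 + 0.9 * C)       # true q_max = 4.0, K = 0.9
q_max, K = fit_langmuir(C, q)
print(round(q_max, 3), round(K, 3))  # 4.0 0.9
```

The plateau the isotherms reach at higher equilibrium concentrations corresponds to q approaching q_max as all sorption sites fill.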
Abstract: Recent years have seen an increasing use of image analysis techniques in the field of biomedical imaging, in particular in microscopic imaging. The basic step of most image analysis techniques relies on a background image free of objects of interest, whether they are cells or histological samples, to perform further analysis such as segmentation or mosaicing. Commonly, this image consists of an empty field acquired in advance. However, acquiring an empty field is often not feasible. Alternatively, it may differ from the background region of the sample actually being studied, because of interaction with the organic matter. Finally, it can be expensive, for instance in the case of live-cell analyses. We propose a non-parametric, general-purpose approach in which the background is built automatically from a sequence of images that may contain objects of interest. The amount of object-free area in each image affects only the overall speed of obtaining the background. Experiments with different kinds of microscopic images prove the effectiveness of our approach.
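A pixel-wise temporal median over the sequence is one simple instance of building a background from images that still contain objects: each pixel only needs to be object-free in a majority of frames. The authors' non-parametric method is more general; the synthetic frames below are illustrative.

```python
# Simple background construction from a sequence containing objects:
# pixel-wise temporal median.  Synthetic frames, not real microscopy.
import numpy as np

def estimate_background(stack):
    """stack: array of shape (n_images, H, W).
    Returns the pixel-wise median image as the background estimate."""
    return np.median(stack, axis=0)

# flat background of 10 with a small bright object that occupies a
# different position in each frame
frames = np.full((5, 8, 8), 10.0)
for k in range(5):
    frames[k, k, k] = 255.0
bg = estimate_background(frames)
print(np.all(bg == 10.0))  # True
```

This also shows why the object-free area per image affects only speed: pixels covered by objects in many frames simply need more frames before the median converges to the background value.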