Abstract: Lutein is a dietary oxycarotenoid which has been found
to reduce the risk of Age-related Macular Degeneration
(AMD). Supercritical fluid extraction of lutein esters from
marigold petals was carried out and was found to be much more
effective than conventional solvent extraction. The
saponification of pre-concentrated lutein esters to produce free
lutein was studied; the product had a composition of about 88%
total carotenoids (UV-VIS spectrophotometry) and 90.7%
lutein (HPLC). The lipase catalyzed hydrolysis of lutein esters
in conventional medium was investigated. The optimal
temperature, pH, enzyme concentration and water activity
were found to be 50°C, 7, 15%, and 0.33, respectively, and the
lipase lost about 25% of its activity after eight re-uses at
50°C over 12 days. However, the lipase-catalyzed hydrolysis of
lutein esters in conventional media resulted in poor
conversions (16.4%).
Abstract: Managing emergency situations within an Emergency
Staff requires close cooperation between its members and fast
decision making. For this purpose, it is necessary to prepare Emergency Staff members adequately. The aim of this paper is to
describe the development of information support that focuses on
emergency staff processes and effective decisions. The information
support is based on the principles of process management, and the
Process Framework for Emergency Management was used during the
development. The output is an information system that allows users
to simulate an emergency situation, including effective decision making. The system also evaluates the progress of emergency
process handling using quantitative and qualitative indicators. By using
the simulator, a higher quality of specialist training can be achieved, and the negative impacts of arising emergency situations can thereby be reduced.
Abstract: Early breast cancer detection is an emerging field of
research, as it can save women affected by malignant tumors.
Microwave breast imaging is based on the electrical property contrast
between healthy tissue and malignant tumors. This contrast can be detected
by using microwave energy with an array of antennas that illuminate
the breast through a coupling medium and by measuring the scattered
fields. In this paper, the authors present the design and
simulation results of a bowtie antenna. This bowtie antenna is
designed for the detection of breast cancer.
Abstract: Basel III (or the Third Basel Accord) is a global
regulatory standard on bank capital adequacy, stress testing and
market liquidity risk agreed upon by the members of the Basel
Committee on Banking Supervision in 2010-2011, and scheduled to
be introduced from 2013 until 2018. Basel III is a comprehensive set
of reform measures. These measures aim to: (1) improve the banking
sector's ability to absorb shocks arising from financial and economic
stress, whatever the source; (2) improve risk management and
governance; and (3) strengthen banks' transparency and disclosures.
Similarly, the reforms target: (1) bank-level, or micro-prudential,
regulation, which will help raise the resilience of individual banking
institutions to periods of stress; and (2) macro-prudential regulation
of system-wide risks that can build up across the banking sector, as well
as the pro-cyclical amplification of these risks over time. These two
approaches to supervision are complementary, as greater resilience at
the individual bank level reduces the risk of system-wide shocks.
Regarding the macroeconomic impact of Basel III, the OECD estimates that the
medium-term impact of Basel III implementation on GDP growth is
in the range of -0.05 percent to -0.15 percent per year. Economic
output is mainly affected by an increase in bank lending
spreads, as banks pass a rise in banking funding costs, due to higher
capital requirements, on to their customers. The estimated
effects on GDP growth assume no active response from monetary
policy; the impact of Basel III on economic output could be offset by a
reduction (or delayed increase) in monetary policy rates of about 30
to 80 basis points. The aim of this paper is to create a framework
based on the recent regulations in order to prevent financial crises.
Thus, the response developed to overcome the global financial crisis will
contribute to preventing financial crises that may occur in future
periods. The first part of the paper examines the effects of the global
crisis on the banking system and the concept of financial regulation.
The second part analyzes financial regulations, and Basel III in
particular. The last section explores the possible macroeconomic
consequences of Basel III.
Abstract: In this paper, the two-dimensional staggered grid
interface pressure (SGIP) model has been generalized to
three-dimensional form. For this purpose, various models of
surface tension force for interfacial flows have been investigated and
compared with each other. The VOF method has been used for
tracking the interface. To show the ability of the SGIP model for
three-dimensional flows in comparison with other models, the pressure
contours, maximum spurious velocities, norms of the spurious
velocities and pressure jump errors for a motionless drop of liquid and
a bubble of gas are calculated using the different models. It has been
shown that the SGIP model, in comparison with the CSF, CSS and
PCIL models, produces the smallest maximum and norm spurious
velocities. Additionally, the new model produces more accurate
results in calculating the pressure jump, generated by the surface
tension force, across the interface for a motionless drop of liquid
and a bubble of gas.
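The pressure-jump benchmark above can be compared against the Young-Laplace relation; a minimal sketch, assuming a spherical drop and an illustrative water-air surface tension value (not taken from the paper):

```python
# Young-Laplace pressure jump across the interface of a motionless drop.
# In 3D a spherical drop of radius R has total curvature kappa = 2/R,
# so the exact jump is dp = 2*sigma/R; in 2D (a cylinder) it is sigma/R.

def laplace_jump(sigma, radius, dim=3):
    """Analytical pressure jump for a static drop/bubble of given radius."""
    if dim == 3:
        return 2.0 * sigma / radius   # sphere: kappa = 2/R
    if dim == 2:
        return sigma / radius         # cylinder: kappa = 1/R
    raise ValueError("dim must be 2 or 3")

def jump_error(p_numerical, sigma, radius, dim=3):
    """Relative error of a computed pressure jump vs. the exact value."""
    exact = laplace_jump(sigma, radius, dim)
    return abs(p_numerical - exact) / exact

# Illustrative values: sigma = 0.0728 N/m (water-air), R = 2 mm
dp = laplace_jump(0.0728, 0.002)   # ≈ 72.8 Pa
```

A surface tension model with small spurious currents should reproduce this jump closely; `jump_error` is the kind of metric the comparison above reports.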
Abstract: Tacit knowledge has been one of the most discussed
and contradictory concepts in the field of knowledge management
since the mid-1990s. The concept is used relatively vaguely to refer
to any type of information that is difficult to articulate, which has led
to discussions about the original meaning of the concept (adopted
from Polanyi's philosophy) and the nature of tacit knowing. It is
proposed that the subject should be approached from the perspective
of cognitive science in order to connect tacit knowledge to
empirically studied cognitive phenomena. Some of the most
important examples of tacit knowing presented by Polanyi are
analyzed in order to trace the cognitive mechanisms of tacit knowing
and to promote better understanding of the nature of tacit knowledge.
The cognitive approach to Polanyi's theory reveals that the
tacit/explicit typology of knowledge often presented in the
knowledge management literature is not only artificial but
directly opposed to Polanyi's thinking.
Abstract: Money laundering has been described by many as the lifeblood of crime and is a major threat to the economic and social well-being of societies. It has been recognized that the banking system has long been the central element of money laundering. This is in part due to the complexity and confidentiality of the banking system itself. It is generally accepted that effective anti-money laundering (AML) measures adopted by banks will make it tougher for criminals to get their "dirty money" into the financial system. In fact, for law enforcement agencies, banks are considered to be an important source of valuable information for the detection of money laundering. However, from the banks' perspective, the main reason for their existence is to make as much profit as possible. Hence their cultural and commercial interests are totally distinct from those of the law enforcement authorities. Undoubtedly, AML laws create a major dilemma for banks, as they produce a significant shift in the way banks interact with their customers. Furthermore, the implementation of the laws not only creates significant compliance problems for banks, but also has the potential to adversely affect their operations. As such, it is legitimate to ask whether these laws are effective in preventing money launderers from using banks, or whether they simply put an unreasonable burden on banks and their customers. This paper attempts to address these issues and analyze them against the background of the Malaysian AML laws. It must be said that effective coordination between the AML regulator and the banking industry is vital to minimize the problems faced by banks and thereby to ensure effective implementation of the laws in combating money laundering.
Abstract: Comparison of two approaches for the simulation of
the dynamic behaviour of a permanent magnet linear actuator is
presented. These are a fully coupled model, where the electromagnetic
field, electric circuit and mechanical motion problems are solved
simultaneously, and a decoupled model, where first a set of static
magnetic field analyses is carried out and then the electric circuit and
mechanical motion equations are solved employing bi-cubic spline
approximations of the field analysis results. The results show that the
proposed decoupled model is of satisfactory accuracy and gives more
flexibility when the actuator response is required to be estimated for
different external conditions, e.g. external circuit parameters or
mechanical loads.
Abstract: A series of microarray experiments produces observations
of differential expression for thousands of genes across multiple
conditions.
Principal component analysis (PCA) has been widely used in
multivariate data analysis to reduce the dimensionality of the data in
order to simplify subsequent analysis and allow for summarization of
the data in a parsimonious manner. PCA, which can be implemented
via a singular value decomposition (SVD), is useful for the analysis of
microarray data.
For an application of PCA using SVD we use the DNA microarray
data for the small round blue cell tumors (SRBCT) of childhood
by Khan et al. (2001). To decide the number of components which
account for a sufficient amount of information, we draw a scree plot.
The biplot, a graphic display associated with PCA, reveals important
features that exhibit the relationship between variables and also the
relationship of variables with observations.
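The PCA-via-SVD pipeline described above can be sketched as follows; random data stand in for the SRBCT matrix, which is not reproduced here:

```python
import numpy as np

# PCA via singular value decomposition: center the data, factor X = U S V^T;
# the rows of Vt are the principal axes, S**2/(n-1) are component variances.
def pca_svd(X):
    Xc = X - X.mean(axis=0)                    # column-center
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = S ** 2 / (X.shape[0] - 1)            # variance per component
    ratio = var / var.sum()                    # the heights in a scree plot
    scores = U * S                             # observations in PC space
    return scores, Vt, ratio

rng = np.random.default_rng(0)
X = rng.normal(size=(63, 200))                 # e.g. 63 samples x 200 genes
scores, axes, ratio = pca_svd(X)
# 'ratio' is what the scree plot displays: a common rule keeps the leading
# components whose cumulative ratio reaches roughly 80-90%.
```

A biplot then overlays the first two columns of `scores` (observations) with the first two rows of `axes` scaled by their singular values (variables).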
Abstract: The objectives of this research are to produce
prototype coconut oil based solvent offset printing inks and to
analyze the basic quality of printed work produced with these inks,
by using coconut oil to produce varnish and then using that varnish
to produce black offset printing inks. The qualities analyzed, i.e. the
CIELAB value, density value, and dot gain value, were measured on
prints made with the coconut oil based solvent offset printing inks on
gloss-coated woodfree paper of 130 gsm. The results indicated that
the suitable varnish formulation uses 51% coconut oil, 36% phenolic
resin, and 14% solvent oil, while the suitable black offset ink formula
uses this varnish mixed with 20% coconut oil. For the printed work,
the results were as follows: the CIELAB value of the black offset
printing ink was L* = 31.90, a* = 0.27, and b* = 1.86; the density
value was 1.27; and the dot gain value was high in the mid-tone area
of the image.
Abstract: In modern human-computer interaction (HCI) systems,
emotion recognition is becoming an imperative characteristic.
The quest for effective and reliable emotion recognition in HCI has
resulted in a need for better face detection, feature extraction and
classification. In this paper we present results of feature space analysis
after briefly explaining our fully automatic vision based emotion
recognition method. We demonstrate the compactness of the feature
space and show how the 2D/3D-based method achieves superior features
for the purpose of emotion classification. It is also shown that,
through feature normalization, a largely person-independent feature
space is created. As a consequence, the classifier architecture has
only a minor influence on the classification result. This is particularly
elucidated with the help of confusion matrices. For this purpose
advanced classification algorithms, such as Support Vector Machines
and Artificial Neural Networks are employed, as well as the simple k-
Nearest Neighbor classifier.
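The abstract does not spell out the normalization used; per-subject z-scoring is one common way to obtain a largely person-independent feature space, sketched here on synthetic data:

```python
import numpy as np

# Per-subject z-score normalization: removing each person's own feature
# mean and scale is a common way to obtain a largely person-independent
# feature space (the paper's exact normalization is not specified here).
def normalize_per_subject(features, subject_ids):
    out = np.empty_like(features, dtype=float)
    for sid in np.unique(subject_ids):
        rows = subject_ids == sid
        mu = features[rows].mean(axis=0)
        sd = features[rows].std(axis=0)
        out[rows] = (features[rows] - mu) / np.where(sd > 0, sd, 1.0)
    return out

rng = np.random.default_rng(1)
# two synthetic "persons" with different feature offsets and scales
f = np.vstack([rng.normal(5.0, 2.0, (20, 4)), rng.normal(-3.0, 0.5, (20, 4))])
ids = np.array([0] * 20 + [1] * 20)
g = normalize_per_subject(f, ids)
# after normalization, each subject's features have mean ~0 and std ~1,
# so person-specific offsets no longer dominate the classifier input
```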
Abstract: Falling has been one of the major concerns and threats
to the independence of the elderly in their daily lives. With the
worldwide significant growth of the aging population, it is essential
to have a promising solution of fall detection which is able to operate
at high accuracy in real-time and supports large scale implementation
using multiple cameras. The Field Programmable Gate Array (FPGA) is a
highly promising tool to be used as a hardware accelerator in many
emerging embedded vision-based systems. Thus, it is the main
objective of this paper to present an FPGA-based solution for
vision-based fall detection that meets stringent real-time requirements
with high accuracy. A hardware architecture for vision-based fall
detection that utilizes pixel locality to reduce memory accesses
is proposed. By exploiting the parallel and pipelined architecture of
the FPGA, our hardware implementation of vision-based fall detection
is able to achieve a performance of 60 fps for a series of
video analytics functions at VGA resolution (640×480). The results
of this work show that the FPGA has great potential in
enabling large-scale vision systems in the future healthcare industry
due to its flexibility and scalability.
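The pixel-locality idea behind the hardware architecture can be illustrated in software: a streaming 3×3 window needs only three row buffers, so each frame row is fetched from memory exactly once. A minimal sketch (a max filter is chosen for illustration; it is not the paper's actual pipeline):

```python
from collections import deque

# Streaming 3x3 maximum filter using three row buffers: each input row is
# read from "memory" exactly once, mimicking how an FPGA line buffer
# exploits pixel locality to cut external memory accesses.
def stream_max3(rows):
    buf = deque(maxlen=3)           # the three most recent rows
    out = []
    for row in rows:
        buf.append(row)
        if len(buf) == 3:           # a full 3-row window is available
            r0, r1, r2 = buf
            w = len(row)
            line = []
            for x in range(w):
                lo, hi = max(0, x - 1), min(w, x + 2)
                line.append(max(max(r[lo:hi]) for r in (r0, r1, r2)))
            out.append(line)
    return out

frame = [[0, 1, 0, 0],
         [0, 0, 0, 2],
         [3, 0, 0, 0],
         [0, 0, 4, 0]]
result = stream_max3(frame)   # → [[3, 3, 2, 2], [3, 4, 4, 4]]
```

In hardware the three buffers become on-chip line memories, which is how a pipeline of such operators sustains 60 fps at VGA resolution without re-reading the frame.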
Abstract: Energetic and structural results for ethanol-water mixtures as a function of the mole fraction were calculated using a Monte Carlo methodology. Energy partitioning results obtained for the equimolar water-ethanol mixture and other organic liquids are compared. It has been shown that at x_et = 0.22 the RDFs for water-ethanol and ethanol-ethanol interactions indicate strong hydrophobic interactions between ethanol molecules, and that the local structure of the solution is less ordered at this concentration than at other ones. Results obtained for the ethanol-water mixture as a function of concentration are in good agreement with the experimental data.
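The abstract does not detail the sampling scheme; at the core of Monte Carlo liquid simulations of this kind is the Metropolis acceptance rule, sketched here (the force field and move set are outside this illustration):

```python
import math
import random

# Metropolis acceptance rule used in Monte Carlo liquid simulations:
# a trial move with energy change dE (expressed in units of kT) is
# accepted with probability min(1, exp(-dE)).
def metropolis_accept(dE_over_kT, rng=random.random):
    if dE_over_kT <= 0.0:
        return True                       # downhill moves: always accepted
    return rng() < math.exp(-dE_over_kT)  # uphill moves: accepted stochastically
```

Repeated over millions of trial displacements and rotations of water and ethanol molecules, this rule generates the configurations from which energy partitioning and RDFs are averaged.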
Abstract: An immunomodulator bioproduct is prepared in a
batch bioprocess with a modified bacterium Pseudomonas
aeruginosa. The bioprocess is performed in a 100 L Bioengineering
bioreactor with 42 L of cultivation medium made of peptone, meat
extract and sodium chloride. The optimal bioprocess parameters were
determined: temperature – 37°C, agitation speed – 300 rpm, aeration
rate – 40 L/min, pressure – 0.5 bar, Dow Corning Antifoam M – max.
4% of the medium volume, duration – 6 hours. This kind of
bioprocess is considered difficult to control because its
dynamic behavior is highly nonlinear and time-varying. The aim of
the paper is to present (by comparison) different models based on
experimental data.
The analysis criteria were modeling error and convergence rate.
The estimated values and the modeling analysis were obtained using
Table Curve 2D.
The preliminary conclusions indicate Andrews's model, with a
maximum specific growth rate of the bacterium of around
0.8 h⁻¹.
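The Andrews (substrate-inhibition) kinetics referred to above take the form μ(S) = μ_max·S/(Ks + S + S²/Ki); a sketch using the reported μ_max ≈ 0.8 h⁻¹ with hypothetical Ks and Ki values (the abstract does not give the fitted constants):

```python
# Andrews (substrate-inhibition) growth kinetics:
#   mu(S) = mu_max * S / (Ks + S + S**2 / Ki)
# mu_max = 0.8 1/h is the value reported in the abstract; Ks and Ki are
# illustrative placeholders, not the paper's fitted constants.
def andrews_mu(S, mu_max=0.8, Ks=0.5, Ki=20.0):
    return mu_max * S / (Ks + S + S * S / Ki)

# Unlike Monod kinetics, the rate peaks at S* = sqrt(Ks * Ki) and then
# declines as the substrate itself becomes inhibitory.
peak_S = (0.5 * 20.0) ** 0.5   # S* ≈ 3.16 in the same units as Ks
```

This peaked shape is one reason such bioprocesses are hard to control: the same growth rate corresponds to two different substrate concentrations.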
Abstract: Housing is a basic human right. A newly provided
house should be free from any defects, even those that people
normally consider to be 'cosmetic defects'. This paper studies
the building defects of 72 newly completed double-storey terraced
houses located in Bangi, Selangor. The building survey was
implemented using protocol 1 (visual inspection). For new
houses, the survey work is very stringent in determining the
condition and priority of defects. The survey and reporting procedure
is carried out based on the CSP1 Matrix, which involves a scoring
system, photographs and plan tagging. The analysis was done using the
Statistical Package for Social Sciences (SPSS). The findings reveal
that 2119 defects were recorded in the 72 terraced houses. The
cumulative score obtained was 27644, while the overall rating is
13.05. These results indicate that the construction quality of the
newly completed terraced houses is low and not up to the standard
expected of a new house.
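As a consistency check on the reported figures, under our assumption (not stated in the abstract) that the overall rating is the mean CSP1 score per recorded defect:

```python
# Reported survey figures: 2119 defects, cumulative score 27644,
# overall rating 13.05. If the rating is the mean score per defect
# (our assumption about the CSP1 Matrix arithmetic), the numbers agree:
defects = 2119
cumulative_score = 27644
rating = cumulative_score / defects
# rating ≈ 13.0458, which rounds to the reported 13.05
```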
Abstract: In this paper, we propose a modified version of the
Constant Modulus Algorithm (CMA) tailored for blind Decision
Feedback Equalizer (DFE) of first order Markovian time varying
channels. The proposed NonStationary CMA (NSCMA) is designed
so that it explicitly takes into account the Markovian structure of
the channel nonstationarity. Hence, unlike the classical CMA, the
NSCMA is not blind with respect to the channel time variations.
This greatly helps the equalizer in the case of realistic channels, and
avoids frequent transmissions of training sequences.
This paper develops a theoretical analysis of the steady state
performance of the CMA and the NSCMA for DFEs within a time
varying context. To this end, approximate expressions for the mean
square errors are derived. We prove that in the steady state, the
NSCMA exhibits better performance than the classical CMA. These
new results are confirmed by simulation.
Through an experimental study, we demonstrate that the Bit Error
Rate (BER) is reduced by the NSCMA-DFE, and that the BER
improvement achieved by the NSCMA-DFE grows with the severity
of the channel time variations.
Abstract: To unveil the mechanism of fast autooxidation of fish
myoglobins, the effect of temperature on the structural change of tuna
myoglobin was investigated. Purified myoglobin was subjected to
preincubation at 5, 20, 50 and 40°C. Overall helical structural decay
through thermal treatment up to 95°C was monitored by circular
dichroism spectrometry, while the structural changes around the heme
pocket were measured by ultraviolet/visible absorption spectrophotometry.
As a result, no essential structural change of myoglobin
was observed below 30°C, roughly equivalent to the body
temperature of the fish, but the structure was clearly damaged at 40°C. The Soret
band absorption hardly differed irrespective of preincubation
temperature, suggesting that the structure around the heme pocket was
not perturbed even after thermal treatment.
Abstract: An Optimal Power Flow based on Improved Particle
Swarm Optimization (OPF-IPSO) with a generator capability curve
constraint is used by the NN-OPF as a reference to obtain the pattern
of generator scheduling. There are three stages in designing the NN-OPF.
The first stage is the design of the OPF-IPSO with the generator
capability curve constraint. The second stage is clustering the load
into specific ranges and calculating its index. The third stage is
training the NN-OPF using the constructive back propagation method.
In the training process, the total load and the load index are used as
inputs, and the pattern of generator scheduling is used as output. The
data used in this paper are from the Java-Bali power system. The
software used in this simulation is MATLAB.
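As an illustration of the training stage's input/output specification, here is a toy stand-in: a one-hidden-layer network trained by plain batch gradient descent (not the paper's constructive back propagation) on synthetic load/dispatch data; every number below is invented:

```python
import numpy as np

# Toy stand-in for the NN-OPF training stage: inputs are (total load,
# load-cluster index), outputs are a generator dispatch pattern. Plain
# batch gradient descent on a one-hidden-layer net replaces the paper's
# constructive back propagation; the data are synthetic.
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(0.5, 1.0, 200),      # normalized total load
                     rng.integers(0, 4, 200) / 3.0])  # load index in [0, 1]
T = np.column_stack([0.6 * X[:, :1] + 0.1,            # dispatch, unit 1
                     0.4 * X[:, :1] - 0.05])          # dispatch, unit 2

W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 2)); b2 = np.zeros(2)
losses, lr = [], 0.1
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)          # hidden layer
    Y = H @ W2 + b2                   # predicted dispatch pattern
    E = Y - T
    losses.append(float(np.mean(E ** 2)))
    dY = 2 * E / len(X)               # backpropagate the MSE gradient
    dW2 = H.T @ dY; db2 = dY.sum(0)
    dH = dY @ W2.T * (1 - H ** 2)
    dW1 = X.T @ dH; db1 = dH.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2
# after training, the net maps (total load, load index) to a dispatch pattern
```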
Abstract: This paper presents an application of level sets for the segmentation of abdominal and thoracic aortic aneurysms in CTA
datasets. An important challenge in reliably detecting aortic
aneurysms is the need to overcome problems associated with intensity
inhomogeneities. Level sets are part of an important class of methods
that utilize partial differential equations (PDEs) and have been extensively applied in image segmentation. A kernel function in the
level set formulation aids the suppression of noise in the extracted
regions of interest and then guides the motion of the evolving contour
for the detection of weak boundaries. The speed of curve evolution
has been significantly improved, with a resulting decrease in
segmentation time compared with previous implementations of level
sets, and the method is shown to be more effective than other
approaches in coping with intensity inhomogeneities. We have applied
the Courant-Friedrichs-Lewy (CFL) condition as the stability criterion
for our algorithm.
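The CFL stability criterion mentioned above bounds the time step by the grid spacing over the fastest front speed; a minimal sketch (the Courant number and speeds are illustrative):

```python
import math

# Courant-Friedrichs-Lewy (CFL) bound for an explicit level-set update:
# the evolving contour may not cross more than a fraction (the Courant
# number C <= 1) of one grid cell per step, i.e. dt <= C * dx / max|F|.
def cfl_timestep(dx, speed_field, courant=0.5):
    vmax = max(abs(f) for f in speed_field)
    if vmax == 0.0:
        return math.inf            # a static front imposes no restriction
    return courant * dx / vmax

# e.g. unit grid spacing, speeds up to 4 cells per unit time, C = 0.5
dt = cfl_timestep(1.0, [0.5, -4.0, 2.5])   # -> 0.125
```

Recomputing this bound from the current speed field each iteration keeps the evolution stable while allowing the largest step, which is how speed improvements avoid sacrificing stability.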
Abstract: In this paper, a mathematical model of human immunodeficiency
virus (HIV) is utilized and an optimization problem is
proposed, with the final goal of implementing an optimal 900-day
structured treatment interruption (STI) protocol. Two types of commonly
used drugs in highly active antiretroviral therapy (HAART),
reverse transcriptase inhibitors (RTIs) and protease inhibitors (PIs), are
considered. In order to solve the proposed optimization problem, an
adaptive memetic algorithm with population management (AMAPM)
is proposed. The AMAPM uses a distance measure to control the
diversity of the population in genotype space, thus preventing
stagnation and premature convergence. Moreover, the AMAPM uses
a diversity parameter in phenotype space to dynamically set the population
size and the number of crossovers during the search process.
Three crossover operators diversify the population simultaneously,
and the progress of each crossover operator is utilized to set the
number of crossovers of that type per generation. In order to escape
local optima and introduce new search directions toward the global optimum,
two local searchers assist the evolutionary process. In contrast to
traditional memetic algorithms, the activation of these local searchers
is not random and depends on both the diversity parameters in
genotype space and phenotype space. The capability of the AMAPM in
finding optimal solutions, compared with three popular metaheuristics,
is demonstrated.
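The genotype-space diversity control can be illustrated with the usual average-pairwise-distance measure; a minimal sketch (the AMAPM's exact diversity formula and thresholds are not given in the abstract):

```python
import itertools
import math

# Average pairwise Euclidean distance in genotype space: the kind of
# diversity measure a memetic algorithm can monitor to trigger local
# search or resize the population (the AMAPM's exact formula is not
# reproduced here).
def genotype_diversity(population):
    pairs = list(itertools.combinations(population, 2))
    if not pairs:
        return 0.0
    def dist(a, b):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))
    return sum(dist(a, b) for a, b in pairs) / len(pairs)

converged = [(1.0, 2.0)] * 5          # stagnated population: diversity 0
spread = [(0.0, 0.0), (3.0, 4.0)]     # diversity 5.0 (a 3-4-5 triangle)
```

When such a measure collapses toward zero, the algorithm can enlarge the population or activate its local searchers, which is the adaptive behavior the abstract describes.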