Abstract: In this paper, a new methodology to automatically detect the optic disc (OD) in retinal images from patients at risk of Diabetic Retinopathy (DR) and Macular Edema (ME) is presented. The detection procedure comprises two independent methodologies. On the one hand, a location methodology obtains a pixel that belongs to the OD using image contrast analysis and structure filtering techniques; on the other hand, a boundary segmentation methodology estimates a circular approximation of the OD boundary by applying mathematical morphology, edge detection techniques and the Circular Hough Transform. The methodologies were tested on a set of 1200 images composed of 229 retinographies from patients affected by DR with risk of ME, 431 with DR and no risk of ME, and 540 images of healthy retinas. The location methodology achieved a 98.83% success rate, whereas the boundary segmentation methodology obtained a good circular approximation of the OD boundary in 94.58% of cases. The average computational time measured over the total set was 1.67 seconds for OD location and 5.78 seconds for OD boundary segmentation.
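As a rough illustration of the Circular Hough Transform step described above (not the authors' implementation, and using toy edge points rather than a real retinography), each edge pixel votes for every candidate circle centre at each candidate radius, and the best-voted (centre, radius) pair is returned:

```python
import math
from collections import defaultdict

def hough_circles(edge_points, radii, n_angles=180):
    """Each edge point (y, x) votes for all centres lying at distance r
    from it; the (centre, radius) cell with the most votes wins."""
    best_votes, best = 0, None
    angles = [2 * math.pi * k / n_angles for k in range(n_angles)]
    for r in radii:
        acc = defaultdict(int)  # sparse accumulator: centre cell -> votes
        for (y, x) in edge_points:
            for t in angles:
                c = (round(y - r * math.sin(t)), round(x - r * math.cos(t)))
                acc[c] += 1
        (cy, cx), votes = max(acc.items(), key=lambda kv: kv[1])
        if votes > best_votes:
            best_votes, best = votes, (cy, cx, r)
    return best
```

With edge points sampled on a circle of radius 10 centred at (30, 30), the sketch recovers that centre and radius from the candidate set [8, 10, 12].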
Abstract: In the present paper the extreme shear stresses and their corresponding planes are established using freely available computer tools such as Gnuplot, Sage, R, Python and Octave. To support the case for these tools, their strong symbolic and graphical abilities are illustrated. The nature of the stationary points obtained by the Method of Lagrange Multipliers can be determined using free symbolic tools such as Sage, and the character of each stationary point is most easily explained using free graphical tools such as Gnuplot, Sage, R, Python and Octave. The presented figures improve the understanding of the problem and of the obtained solutions for the majority of students of civil or mechanical engineering.
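For illustration, the shear stress on a plane with unit normal (n1, n2, n3) in principal axes can be evaluated and its extremum located numerically in Python. The principal stresses (100, 50, 20) are example values, and the brute-force search over the unit sphere is a sketch standing in for the Lagrange-multiplier derivation discussed above:

```python
import math

S = (100.0, 50.0, 20.0)  # assumed example principal stresses

def shear(n1, n2, n3, s=S):
    """Shear stress magnitude on the plane with unit normal (n1, n2, n3):
    tau^2 = sum(si^2 * ni^2) - (sum(si * ni^2))^2 in principal axes."""
    s1, s2, s3 = s
    sn = s1 * n1**2 + s2 * n2**2 + s3 * n3**2  # normal stress
    t2 = s1**2 * n1**2 + s2**2 * n2**2 + s3**2 * n3**2 - sn**2
    return math.sqrt(max(t2, 0.0))

def max_shear(step_deg=1):
    """Brute-force the unit sphere; the Lagrange-multiplier optimum
    (s1 - s3)/2 occurs on planes bisecting the 1- and 3-axes."""
    best = 0.0
    for i in range(0, 181, step_deg):
        for j in range(0, 361, step_deg):
            p, t = math.radians(i), math.radians(j)
            tau = shear(math.sin(p) * math.cos(t),
                        math.sin(p) * math.sin(t),
                        math.cos(p))
            best = max(best, tau)
    return best
```

For these stresses the analytic extreme shear is (100 - 20)/2 = 40, attained at the grid point p = 45°, t = 0°.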
Abstract: The main objectives of this study were to identify
attributes that influence customer satisfaction and determine their
relationships with customer satisfaction. The variables included in
this research are place/ambience, food quality and service quality as
independent variables and customer satisfaction as the dependent
variable. A survey questionnaire which consisted of three parts to
measure demographic factors, independent variables, and dependent
variables was constructed based on items determined by past
research. 149 respondents from one of the well-known hotels in Kuala
Lumpur, Malaysia were selected as a sample. Psychometric
testing was conducted to determine the reliability and validity of the
questionnaire. The findings showed significant positive relationships
between customer satisfaction and both place/ambience (r=0.563**,
p=0.000) and service quality (r=0.544**, p=0.000). However,
although the relationship between food quality and customer
satisfaction was significant, it was in the negative direction
(r=-0.268**, p=0.001). New findings were discovered after conducting
this research and previous research findings were strengthened by the
results of this research. Future researchers could concentrate on
determining the attributes that influence customer satisfaction when
cost/price is not a factor, and the reasons why place/ambience is
currently becoming the leading factor in determining customer satisfaction.
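The Pearson correlations reported above (e.g. r=0.563 for place/ambience) follow from the standard formula applied to raw survey scores; the sketch below is a generic illustration, not the authors' analysis script, and the sample lists are hypothetical:

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation: covariance of x and y
    divided by the product of their standard deviations."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```

A perfectly linear positive pairing yields r = 1, and a perfectly linear negative pairing yields r = -1, matching the sign convention of the findings above.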
Abstract: Islamic institutions in Malaysia play a variety of
socioeconomic roles such as poverty alleviation. To perform this role,
these institutions face a major task in identifying the poverty group.
Most of these institutions measure and operationalize poverty from
the monetary perspective using variables such as income, expenditure
or consumption. In practice, most Islamic institutions in Malaysia use
the monetary approach in measuring poverty through the
conventional Poverty Line Income (PLI) method and recently, the
had al kifayah (HAK) method using total necessities of a household
from an Islamic perspective. The objective of this paper is to present
the PLI and also the HAK method. This micro-data study would
highlight the similarities and differences of both methods. A
survey aided by a structured questionnaire was carried out on 260
selected heads of households in the state of Selangor. The paper
highlights several demographic factors that are associated with the
three monetary indicators in the study, namely income, PLI and
HAK. In addition, the study found that these monetary variables are
significantly related with each other.
Abstract: Intrusion detection systems (IDS) are crucial components
of the security mechanisms of today's computer systems.
Existing research on intrusion detection has focused on sequential
intrusions. However, intrusions can also be formed by concurrent
interactions of multiple processes. Some of the intrusions caused
by these interactions cannot be detected using sequential intrusion
detection methods. Therefore, there is a need for a mechanism that
views the distributed system as a whole. L-BIDS (Lattice-Based
Intrusion Detection System) is proposed to address this problem. In
the L-BIDS framework, a library of intrusions and distributed traces
are represented as lattices. Then these lattices are compared in order
to detect intrusions in the distributed traces.
Abstract: The fluid flow and the properties of the hydraulic
fluid inside a torque converter are the main topics of interest in this
research. The primary goal is to investigate the applicability of
various viscous fluids inside the torque converter. The Taguchi
optimization method is adopted to analyse the fluid flow in a torque
converter from a design perspective. Calculations are conducted to
maximize the pressure, since the greater the pressure, the greater the
torque developed. Using the values of the S/N ratios obtained, graphs are
plotted. Computational Fluid Dynamics (CFD) analysis is also
conducted.
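The Taguchi signal-to-noise ratios mentioned above follow standard formulas; a minimal sketch (the "larger-the-better" form fits the stated goal of maximizing pressure, and the data values are hypothetical):

```python
import math

def sn_larger_is_better(values):
    # Taguchi 'larger-the-better' S/N ratio: -10*log10(mean(1/y^2))
    return -10 * math.log10(sum(1 / v**2 for v in values) / len(values))

def sn_smaller_is_better(values):
    # Taguchi 'smaller-the-better' S/N ratio: -10*log10(mean(y^2))
    return -10 * math.log10(sum(v**2 for v in values) / len(values))

# hypothetical pressure readings (arbitrary units) for one design trial
sn = sn_larger_is_better([10.0, 10.0, 10.0])  # higher S/N = better design
```

Designs with the highest S/N ratio are then chosen as the most robust, which is what the plotted S/N graphs convey.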
Abstract: A new reverse phase-high performance liquid chromatography (RP-HPLC) method with a fluorescence detector (FLD) was developed and optimized for norfloxacin determination in human plasma. Mobile phase specifications, the extraction method, and the excitation and emission wavelengths were varied for optimization. The HPLC system contained a reverse phase C18 (5 μm, 4.6 mm×150 mm) column with the FLD operated at 330 nm excitation and 440 nm emission. The optimized mobile phase consisted of 14% acetonitrile in buffer solution. The aqueous phase, prepared by mixing 2 g of citric acid, 2 g of sodium acetate and 1 mL of triethylamine in 1 L of Milli-Q water, was run at a flow rate of 1.2 mL/min. The standard curve was linear over the range tested (0.156–20 μg/mL) and the coefficient of determination was 0.9978. Aceclofenac sodium was used as the internal standard. A detection limit of 0.078 μg/mL was achieved. The run time was set at 10 minutes; the retention time of norfloxacin was 0.99 min, which shows the rapidity of this method of analysis. The present assay showed good accuracy, precision and sensitivity for norfloxacin determination in human plasma with a new internal standard, and can be applied to the pharmacokinetic evaluation of norfloxacin tablets after oral administration in humans.
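The calibration-curve arithmetic behind the reported linearity (slope, intercept, coefficient of determination, and back-calculated concentrations) can be sketched with ordinary least squares; the data points below are hypothetical, not the paper's measurements:

```python
def fit_line(x, y):
    """Ordinary least squares for a calibration curve y = slope*x + intercept,
    plus the coefficient of determination R^2."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return slope, intercept, 1 - ss_res / ss_tot

def concentration(peak_ratio, slope, intercept):
    # back-calculate plasma concentration from the analyte/IS peak-area ratio
    return (peak_ratio - intercept) / slope
```

In practice the x values would be the spiked standard concentrations over 0.156–20 μg/mL and the y values the norfloxacin/aceclofenac peak-area ratios.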
Abstract: A bond graph model of an electrical transformer
including nonlinear saturation is presented. A nonlinear observer for
the transformer, based on the multivariable circle criterion in the
physical domain, is proposed. Simulation results are obtained in order
to show the saturation and hysteresis effects on the electrical
transformer. Finally, the paper shows that the estimates converge to the true states.
Abstract: The various types of frequent pattern discovery
problems, namely the frequent itemset, sequence and graph mining
problems, are solved in different ways which are, however, similar in
certain aspects. The main approaches to discovering such patterns can
be classified into two classes: level-wise methods and database
projection-based methods. The level-wise algorithms generally use
clever indexing structures for discovering the patterns. In this paper
a new approach is proposed for efficiently discovering frequent
sequences and tree-like patterns, based on the level-wise paradigm.
Because level-wise algorithms spend a lot of time on subpattern
testing, the new approach introduces the idea of using automaton
theory to solve this problem.
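The subpattern test that the automaton idea targets can be pictured, for sequences, as a tiny deterministic automaton whose state is the length of the pattern prefix matched so far; this is an illustrative sketch, not the paper's algorithm:

```python
def is_subsequence(pattern, sequence):
    """Automaton view of subpattern testing: the state is the number of
    pattern items matched so far; each sequence item either advances the
    state (on a match) or leaves it unchanged. Accept when all matched."""
    state = 0
    for item in sequence:
        if state < len(pattern) and item == pattern[state]:
            state += 1
    return state == len(pattern)
```

In a level-wise miner, such a check is run for every candidate against every transaction, which is why speeding it up (e.g. by merging many candidates into one automaton) pays off.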
Abstract: The paper explores the development of an optimized method and apparatus for retrieving extended high dynamic range from a digital negative image. Architectural photo imaging can benefit from the high dynamic range imaging (HDRI) technique for preserving and presenting sufficient luminance in shadow and highlight clipping image areas. The HDRI technique that requires multiple exposure images as the source of HDRI rendering may not be effective in terms of time efficiency during the acquisition process and post-processing stage, considering its numerous potential imaging variables and technical limitations during the multiple exposure process. This paper explores an experimental method and apparatus that aim to expand the dynamic range from a digital negative image in an HDRI environment. The method and apparatus explored are based on a single source of RAW image acquisition for use in HDRI post-processing. They cater for optimization in order to avoid and minimize the conventional HDRI photographic errors caused by different physical conditions during the photographing process and by the misalignment of multiple exposed image sequences. The study observes the characteristics and capabilities of the RAW image format as a digital negative used for the retrieval of the extended high dynamic range process in an HDRI environment.
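One common way to realise the single-RAW idea (a sketch under assumptions, not the paper's apparatus) is to derive several virtual exposures from the linear RAW values and fuse them with a mid-grey-weighted average, which avoids the misalignment problem of true multi-exposure brackets:

```python
import math

def virtual_exposures(linear, stops=(-2, 0, 2)):
    """Simulate bracketed exposures from one set of linear RAW values
    by scaling with 2**stop and clipping to the displayable range."""
    return [[min(v * 2.0 ** s, 1.0) for v in linear] for s in stops]

def fuse(exposures):
    """Exposure-fusion heuristic: weight each simulated exposure by how
    close its value is to mid-grey, then take the weighted average."""
    fused = []
    for vals in zip(*exposures):
        w = [math.exp(-((v - 0.5) ** 2) / 0.08) for v in vals]
        fused.append(sum(a * b for a, b in zip(w, vals)) / sum(w))
    return fused
```

The stop values and the mid-grey weighting constant are illustrative parameters only.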
Abstract: Program slicing is the task of finding all statements in
a program that directly or indirectly influence the value of a variable
occurrence. The set of statements that can affect the value of a
variable at some point in a program is called a program backward
slice. In several software engineering applications, such as program
debugging and measuring program cohesion and parallelism, several
slices are computed at different program points. The existing
algorithms for computing program slices are designed to compute a
slice at a single program point. In these algorithms, the program, or
the model that represents the program, is traversed completely or
partially once. To compute more than one slice, the same algorithm
is applied at every point of interest in the program. Thus, the same
program, or program representation, is traversed several times.
In this paper, an algorithm is introduced to compute all forward
static slices of a computer program by traversing the program
representation graph once. Therefore, the introduced algorithm is
useful for software engineering applications that require computing
program slices at different points of a program. The program
representation graph used in this paper is called Program Dependence
Graph (PDG).
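The one-traversal idea can be sketched as memoised depth-first search over the PDG: each node's forward slice is itself plus the union of its successors' already-computed slices. The sketch assumes an acyclic PDG; the paper's actual algorithm and graph details may differ:

```python
def all_forward_slices(succ):
    """Compute the forward static slice of every node in one traversal.
    succ maps each PDG node to its dependence successors; slice(n) is
    {n} plus the slices of all successors. Memoised DFS visits each
    node and edge exactly once (assumes an acyclic graph)."""
    slices = {}

    def dfs(n):
        if n in slices:
            return slices[n]          # already computed: reuse, don't re-traverse
        s = {n}
        slices[n] = s
        for m in succ.get(n, ()):
            s |= dfs(m)
        return s

    for n in succ:
        dfs(n)
    return slices
```

Because every node's slice is built from memoised successor slices, asking for slices at many program points costs no extra traversals, which is the property the paper exploits.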
Abstract: In this study, an inland metropolitan area, Gwangju, in Korea was selected to assess the amplification potential of earthquake motion and to provide information for regional seismic countermeasures. A geographic information system-based expert system was implemented for reliably predicting the spatial geotechnical layers across the entire region of interest by building a geo-knowledge database. In particular, the database consists of existing boring data gathered from prior geotechnical projects and surface geo-knowledge data acquired from site visits. For practical application of the geo-knowledge database in estimating the earthquake hazard potential related to site amplification effects in the study area, seismic zoning maps of geotechnical parameters, such as the bedrock depth and the site period, were created within the GIS framework. In addition, seismic zonation of site classification was also performed to determine the site amplification coefficients for seismic design at any site in the study area. Keywords: Earthquake hazard, geo-knowledge, geographic information system, seismic zonation, site period.
Abstract: Stipples are desired for pattern fills and
transparency effects. However, some graphics standards, including
OpenGL ES 1.1 and 2.0, omit this feature. We present the details of
providing line stipples and polygon stipples by combining texture
mapping and alpha blending functions. We start from the
OpenGL-specified stipple-related API functions. The mathematical
transformations needed to obtain the correct texture coordinates are
explained. Then the overall algorithm is presented, and its
implementation results follow. We accomplished both line and
polygon stipples, and verified the results with conformance test
routines.
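The line-stipple semantics being emulated can be sketched as a 1-D lookup driven by accumulated screen-space distance, mirroring the classic glLineStipple(factor, pattern) behaviour that the texture-based method bakes into a texture (a CPU-side illustration, not the actual shader code):

```python
def stipple_visible(distance, pattern, factor=1):
    """Emulate glLineStipple(factor, pattern) semantics: bit i of the
    16-bit pattern controls the pixel run [i*factor, (i+1)*factor) of
    accumulated screen-space distance along the line; the texture-based
    method performs this lookup via a 1-D alpha texture instead."""
    bit = int(distance / factor) % 16
    return (pattern >> bit) & 1 == 1
```

With the alternating pattern 0x5555, consecutive unit distances toggle visibility, and a factor of 2 stretches each on/off run to two pixels.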
Abstract: The aim of this paper is to explore the prospects of a new approach to mobile phone banking in Libya. This study evaluates customer knowledge of commercial mobile banking in Libya. To examine the relationship between age, occupation and intention to use mobile banking for commercial purposes, a survey was conducted to gather information from one hundred Libyan bank clients. The results indicate that Libyan customers have accepted the new technology and are ready to use it. No significant joint relationship between age and occupation was found in the intention to use mobile banking in Libya. On the other hand, customers’ knowledge about mobile banking has a greater relationship with intention. This study has implications for demographic research and the consumer behaviour discipline. It also has profitable implications for banks and managers in Libya, as it will assist in a better understanding of Libyan consumers and their activities when they develop their market strategies and new services.
Abstract: Super-resolution is nowadays used to produce a
high-resolution image from several low-resolution noisy frames. In
this work, we consider the problem of high-quality interpolation of a
single noise-free image. Such images may come from different
sources, i.e., they may be frames of videos, individual pictures, etc.
In the encoder we apply downsampling via bidimensional
interpolation of each frame, and in the decoder we apply an
upsampling by which we restore the original size of the image. If the
compression ratio is very high, then we use a convolutive mask that
restores the edges, eliminating the blur. Finally, both the encoder and
the complete decoder are implemented on General-Purpose
computation on Graphics Processing Units (GPGPU) cards. In fact,
the mentioned mask is coded inside the texture memory of a GPGPU.
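A minimal CPU sketch of the encoder/decoder pair: block-average downsampling stands in for the bidimensional interpolation, pixel replication for the upsampling, and a 3x3 sharpening kernel for the edge-restoring convolutive mask. All parameter choices here are assumptions, and the real implementation runs on a GPGPU:

```python
import numpy as np

def encode(img, k=2):
    """Downsample by k via block averaging (stand-in for the paper's
    bidimensional interpolation)."""
    h, w = img.shape
    return img[:h // k * k, :w // k * k].reshape(h // k, k, w // k, k).mean(axis=(1, 3))

def decode(small, k=2):
    """Upsample by pixel replication, then apply a 3x3 convolutive
    sharpening mask (unsharp-style, kernel sums to 1) to reduce blur."""
    big = np.repeat(np.repeat(small, k, axis=0), k, axis=1)
    kern = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], float)
    pad = np.pad(big, 1, mode='edge')
    out = np.zeros_like(big)
    for i in range(3):
        for j in range(3):
            out += kern[i, j] * pad[i:i + big.shape[0], j:j + big.shape[1]]
    return out
```

Because the kernel coefficients sum to 1, flat regions pass through unchanged while intensity steps are exaggerated, which is the blur-removal effect the mask is meant to provide.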
Abstract: In this paper a Public Key Cryptosystem is proposed
using number theoretic transforms (NTT) over a ring of integers
modulo a composite number. The key agreement is similar to the
ElGamal public key algorithm. The security of the system is based on
the solution of multivariate linear congruence equations and the
discrete logarithm problem. In the proposed cryptosystem only a
fixed number of multiplications is carried out (constant complexity),
hence encryption and decryption can be done easily. At the same
time, it is very difficult to attack the cryptosystem, since the
ciphertext is a sequence of interrelated integers. The system also
provides authentication. Using Mathematica version 5.0 the proposed
algorithm is illustrated with a numerical example.
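A toy number theoretic transform and its inverse illustrate the NTT building block; for simplicity this sketch uses a small prime modulus and a naive O(n^2) transform, whereas the paper works modulo a composite number:

```python
def ntt(a, root, mod):
    """Naive number theoretic transform: a DFT over the integers mod
    'mod', using a primitive n-th root of unity in place of exp(-2*pi*i/n)."""
    n = len(a)
    return [sum(a[j] * pow(root, i * j, mod) for j in range(n)) % mod
            for i in range(n)]

def intt(A, root, mod):
    """Inverse transform: scale by 1/n and use the inverse root."""
    n = len(A)
    inv_n = pow(n, -1, mod)        # modular inverse (Python 3.8+)
    inv_root = pow(root, -1, mod)
    return [(inv_n * sum(A[j] * pow(inv_root, i * j, mod) for j in range(n))) % mod
            for i in range(n)]
```

With mod = 17 and root = 4 (a primitive 4th root of unity, since 4**4 = 256 = 1 mod 17), the forward and inverse transforms round-trip a length-4 message exactly.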
Abstract: Steganography is the art of hiding and transmitting data
through apparently innocuous carriers in an effort to conceal the
existence of the data. A lot of steganography algorithms have been
proposed recently. Many of them use the digital image data as a carrier.
In the data hiding scheme of halftoning and coordinate projection, still
image data are used as a carrier, and the carrier image data are
modified for data embedding. In this paper, we present three features
for analysis of data hiding via halftoning and coordinate projection.
Also, we present a classifier using the proposed three features.
Abstract: Due to the tremendous amount of information provided
by the World Wide Web (WWW), developing methods for mining
the structure of web-based documents is of considerable interest. In
this paper we present a similarity measure for graphs representing
web-based hypertext structures. Our similarity measure is mainly
based on a novel representation of a graph as linear integer strings,
whose components represent structural properties of the graph. The
similarity of two graphs is then defined as the optimal alignment of
the underlying property strings. In this paper we apply the
well-known technique of sequence alignment for solving a novel and
challenging problem: measuring the structural similarity of
generalized trees. In other words: we first transform our graphs,
considered as high-dimensional objects, into linear structures. Then
we derive similarity values from the alignments of the property
strings in order to measure the structural similarity of generalized
trees. Hence, we transform a graph similarity problem into a string
similarity problem to develop an efficient graph similarity measure.
We demonstrate that
our similarity measure captures important structural information by
applying it to two different test sets consisting of graphs representing
web-based document structures.
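The alignment of property strings can be illustrated with the standard global (Needleman-Wunsch) scheme; the scoring values below are arbitrary examples, not the paper's parameters:

```python
def align_score(s, t, match=1, mismatch=-1, gap=-1):
    """Global (Needleman-Wunsch) alignment score of two property strings:
    dp[i][j] is the best score aligning s[:i] with t[:j]."""
    m, n = len(s), len(t)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        dp[i][0] = i * gap            # align s[:i] against all gaps
    for j in range(1, n + 1):
        dp[0][j] = j * gap            # align t[:j] against all gaps
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            diag = dp[i - 1][j - 1] + (match if s[i - 1] == t[j - 1] else mismatch)
            dp[i][j] = max(diag, dp[i - 1][j] + gap, dp[i][j - 1] + gap)
    return dp[m][n]
```

In the measure described above, the aligned symbols would be integer components encoding structural properties of each graph level rather than characters, but the dynamic program is the same.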
Abstract: In today's scenario, to meet the enhanced demand imposed
by domestic, commercial and industrial consumers, the various
operational and control activities of a Radial Distribution Network
(RDN) require focused attention. Across RDN research sub-domains
such as network reconfiguration, reactive power compensation and
economic load scheduling, network performance parameters are
usually estimated by an iterative process commonly known as a load
(power) flow algorithm. In this paper, a simple mechanism is
presented to implement the load flow analysis (LFA) algorithm. The
reported algorithm utilizes graph theory principles and is tested on a
69-bus RDN.
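For a radial network, a common graph-based load flow mechanism is the backward/forward sweep; the sketch below is simplified to resistive branches and constant-power loads (assumptions noted in comments) and illustrates the idea rather than the paper's exact algorithm:

```python
def backward_forward_sweep(parent, r, loads, v_source=1.0, iters=20):
    """Simplified backward/forward sweep load flow on a radial feeder.
    parent[k] = upstream bus of bus k (bus 0 is the source, and buses
    are numbered so that parent[k] < k); r[k] = resistance of the branch
    feeding bus k (p.u.); loads[k] = constant-power demand at bus k (p.u.).
    Reactance is ignored for simplicity."""
    n = len(parent)
    v = [v_source] * n
    for _ in range(iters):
        # backward sweep: accumulate branch currents from the leaves up
        i_br = [loads[k] / v[k] for k in range(n)]
        for k in range(n - 1, 0, -1):
            i_br[parent[k]] += i_br[k]
        # forward sweep: propagate voltage drops from the source out
        for k in range(1, n):
            v[k] = v[parent[k]] - r[k] * i_br[k]
    return v
```

On a two-bus example (0.1 p.u. branch resistance, 0.5 p.u. load), the sweep converges to the analytic solution of v1 = 1 - 0.1*(0.5/v1), i.e. v1 ≈ 0.9472 p.u.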
Abstract: Mosques have appeared in Thailand since the Ayutthaya
Kingdom (1350 to 1767 A.D.). Today, more than 400 years later,
many styles of art form lie behind their structures. This research
intended to identify Islamic art in Thai mosques. A framework was
applied using qualitative research methods; Thai Muslims with
dynamic roles in Islamic culture were interviewed. In addition, a
field survey of 40 mosques selected from 175 Thai mosques was
conducted. Data were analysed according to the pattern of each
period. The identities of Islamic art in Thai mosques are: 1) the
image of Thai identity, with Thai traditional art style and government
policy; 2) the image of ethnological identity, with the traditional
culture of Asian Muslims in Thailand; 3) the image of nostalgia
identity, with Islamic and Arabian conservative style; 4) the image of
neo-classic identity, with neo-classic and contemporary art; and
5) the image of new identity, with post-modern and deconstruction art.