Abstract: This paper presents a method for detecting the optic disc (OD) in the retina which takes advantage of powerful preprocessing techniques such as contrast enhancement, the Gabor wavelet transform for vessel segmentation, mathematical morphology, and the Earth Mover's Distance (EMD) as the matching process. The OD detection algorithm is based on matching the expected directional pattern of the retinal blood vessels. The vessel segmentation method produces segmentations by classifying each image pixel as vessel or non-vessel, based on the pixel's feature vector. Feature vectors are composed of the pixel's intensity and 2D Gabor wavelet transform responses taken at multiple scales. A simple matched filter is proposed to roughly match the direction of the vessels in the OD vicinity using the EMD. The minimum distance provides an estimate of the OD center coordinates. The method's performance is evaluated on the publicly available DRIVE and STARE databases. On the DRIVE database the OD center was detected correctly in all 40 images (100%), and on the STARE database the OD was detected correctly in 76 out of the 81 images, even in rather difficult pathological situations.
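As a rough illustration of the feature-extraction step described above, the sketch below stacks each pixel's intensity with real-valued Gabor responses at several scales into one feature vector per pixel. The kernel size, scales, orientation and frequency are illustrative choices of ours, not the paper's parameters.

```python
import numpy as np

def gabor_kernel(size, sigma, theta, freq):
    """Real part of a 2D Gabor kernel (illustrative parameters)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * freq * xr)
    return envelope * carrier

def pixel_features(image, scales=(2.0, 3.0, 4.0), theta=0.0, freq=0.15):
    """Stack the raw intensity with Gabor responses at several scales,
    giving one feature vector per pixel (as the abstract describes)."""
    h, w = image.shape
    feats = [image]
    for sigma in scales:
        k = gabor_kernel(15, sigma, theta, freq)
        pad = k.shape[0] // 2
        padded = np.pad(image, pad, mode="edge")
        resp = np.zeros_like(image, dtype=float)
        for i in range(h):                       # naive 'same'-size convolution
            for j in range(w):
                resp[i, j] = np.sum(padded[i:i + k.shape[0],
                                           j:j + k.shape[1]] * k)
        feats.append(resp)
    return np.stack(feats, axis=-1)              # shape (h, w, 1 + len(scales))

img = np.random.default_rng(0).random((16, 16))
fv = pixel_features(img)
print(fv.shape)  # (16, 16, 4)
```

In the paper's pipeline these per-pixel vectors would then feed a vessel/non-vessel classifier; here we only show the feature construction.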
Abstract: Neural processors have shown good results for
detecting a certain character in a given input matrix. In this paper, a
new idea to speed up the operation of neural processors for character
detection is presented. Such processors are designed based on cross
correlation in the frequency domain between the input matrix and the
weights of neural networks. This approach is developed to reduce the
computation steps required by these faster neural networks for the
searching process. The principle of divide and conquer strategy is
applied through image decomposition. Each image is divided into
small sub-images, and then each one is tested separately by
using a single faster neural processor. Furthermore, faster character
detection is obtained by using parallel processing techniques to test the
resulting sub-images at the same time using the same number of faster
neural networks. In contrast to using only faster neural processors, the
speed up ratio is increased with the size of the input image when using
faster neural processors and image decomposition. Moreover, the
problem of local subimage normalization in the frequency domain is
solved. The effect of image normalization on the speed up ratio of
character detection is discussed. Simulation results show that local
subimage normalization through weight normalization is faster than
subimage normalization in the spatial domain. The overall speed up
ratio of the detection process is further increased because the
normalization of weights is done off-line.
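The core of the frequency-domain approach — cross-correlation becomes element-wise multiplication of spectra — can be sketched as follows. This is a generic FFT cross-correlation, not the paper's neural-processor implementation; array sizes are illustrative.

```python
import numpy as np

def cross_correlate_fft(image, weights):
    """Valid-mode cross-correlation computed in the frequency domain:
    multiply the image spectrum by the conjugate weight spectrum."""
    ih, iw = image.shape
    wh, ww = weights.shape
    F = np.fft.rfft2(image, s=(ih, iw))
    W = np.fft.rfft2(weights, s=(ih, iw))   # zero-padded to image size
    full = np.fft.irfft2(F * np.conj(W), s=(ih, iw))
    # top-left block is the non-wrapped ('valid') correlation region
    return full[: ih - wh + 1, : iw - ww + 1]

def cross_correlate_direct(image, weights):
    """Spatial-domain reference for comparison."""
    ih, iw = image.shape
    wh, ww = weights.shape
    out = np.zeros((ih - wh + 1, iw - ww + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + wh, j:j + ww] * weights)
    return out

rng = np.random.default_rng(1)
img = rng.random((12, 12))
w = rng.random((4, 4))
assert np.allclose(cross_correlate_fft(img, w), cross_correlate_direct(img, w))
```

Decomposing a large image into sub-images and running this on each (in parallel) is the divide-and-conquer idea the abstract describes.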
Abstract: One of the main issues in Computer Vision is extracting the movement of one or several points or objects of interest in an image or video sequence in order to conduct any kind of study or control process. Different techniques to solve this problem have been applied in numerous areas such as surveillance systems, traffic analysis, motion capture, image compression, navigation systems and others, where the specific characteristics of each scenario determine the approach to the problem. This paper puts forward a Computer Vision based algorithm to analyze fish trajectories in high-turbulence conditions in artificial structures called vertical slot fishways, designed to allow the upstream migration of fish past obstructions in rivers. The proposed algorithm calculates the position of the fish at every instant, starting from images recorded with a camera and using neural networks to perform fish detection on the images. Different laboratory tests have been carried out in a full-scale fishway model with live fish, allowing the reconstruction of the fish trajectory and the measurement of the velocities and accelerations of the fish. These data can provide useful information to design more effective vertical slot fishways.
Abstract: Carbon dioxide is one of the major greenhouse gases.
It is removed from different streams using the amine absorption
process. Sterically hindered amines are suggested as good CO2
absorbers. The solubility of carbon dioxide (CO2) was measured in
aqueous solutions of 2-amino-2-methyl-1-propanol (AMP) at
temperatures of 30 °C, 40 °C and 60 °C. The effect of pressure and
temperature was studied over various concentrations of AMP. It was
found that pressure has a positive effect on CO2 solubility, whereas
solubility decreased with increasing temperature. The absorption
performance of AMP increased with increasing pressure. The solubility
in aqueous AMP was compared with monoethanolamine (MEA), and the
absorption capacity of aqueous solutions of AMP was found to be better.
Abstract: The exponentially weighted moving average (EWMA) control chart is a popular chart used for detecting shifts in the mean of a distribution parameter in quality control. The objective of this paper is to compare the efficiency of control charts in detecting an increase in the mean of a process. In particular, we compare the Maximum Exponentially Weighted Moving Average (MaxEWMA) and Maximum Generally Weighted Moving Average (MaxGWMA) control charts when the observations follow an exponential distribution. The criterion used to evaluate the performance of a control chart is the Average Run Length (ARL). The comparison shows that for small sample sizes, the MaxEWMA control chart is more efficient at detecting a shift in the process mean than the MaxGWMA control chart. For large sample sizes, the MaxEWMA control chart is more sensitive in detecting small shifts in the process mean than the MaxGWMA control chart, while for large mean shifts the MaxGWMA control chart is more sensitive than the MaxEWMA control chart.
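For reference, the basic EWMA recursion and a Monte Carlo run-length estimate for exponential observations can be sketched as below. The smoothing constant and control limit are illustrative values, not the paper's design parameters, and the full MaxEWMA/MaxGWMA statistics are not reproduced here.

```python
import numpy as np

def ewma(x, lam=0.2, z0=0.0):
    """EWMA recursion Z_i = lam*X_i + (1 - lam)*Z_{i-1}."""
    z = np.empty(len(x))
    prev = z0
    for i, xi in enumerate(x):
        prev = lam * xi + (1 - lam) * prev
        z[i] = prev
    return z

def run_length(lam, limit, rng, max_n=100_000):
    """Number of exponential(mean=1) observations until the EWMA of the
    centered values exceeds +/- limit. 'limit' is illustrative."""
    z = 0.0
    for n in range(1, max_n + 1):
        x = rng.exponential(1.0)              # exponential observations
        z = lam * (x - 1.0) + (1 - lam) * z   # centered at in-control mean 1
        if abs(z) > limit:
            return n
    return max_n

# ARL = average of many simulated run lengths (in-control here)
arls = [run_length(0.2, 0.8, np.random.default_rng(s)) for s in range(200)]
print(np.mean(arls))
```

Practitioners choose the control limit so that the in-control ARL is large and the out-of-control ARL (under a mean shift) is small; that trade-off is what the paper's comparison measures.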
Abstract: Simulation modeling is an easy way to build models
that represent real-life scenarios, identify bottlenecks and enhance
system performance. Using a valid simulation model can give
several advantages in creating a better manufacturing design in order
to improve system performance. This paper presents the results of
implementing a simulation model to design a hard disk drive
manufacturing process, applying line balancing to improve both
the productivity and quality of the hard disk drive process. With the
balanced line, work in process decreased by 86%, output
increased by an average of 80%, average time in the system
decreased by 86% and waiting time decreased by 90%.
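The percentages above come from the authors' simulation; the standard line-balance efficiency formula that such studies start from can be sketched as follows (task times, station count and cycle time are illustrative numbers, not the paper's data).

```python
def line_balance_efficiency(task_times, n_stations, cycle_time):
    """Classic line-balance efficiency: total task time divided by the
    available station time (n_stations * cycle_time)."""
    return sum(task_times) / (n_stations * cycle_time)

# e.g. five tasks assigned to three stations with a 10-minute cycle time
eff = line_balance_efficiency([8, 6, 7, 4, 5], n_stations=3, cycle_time=10)
print(round(eff, 2))  # 1.0 -> perfectly balanced line
```

An efficiency below 1 indicates idle station time; line balancing redistributes tasks among stations to push this ratio up, which is what drives the WIP and waiting-time reductions reported above.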
Abstract: This paper describes a 2.4 GHz passive switch mixer
and a 5/2.5 GHz voltage-controlled negative Gm oscillator (VCO)
with an inversion-mode MOS varactor. Both circuits are implemented
using a 1P8M 0.13 μm process. The switch mixer has an input
referred 1 dB compression point of -3.89 dBm and a conversion
gain of -0.96 dB when the local oscillator power is +2.5 dBm.
The VCO consumes only 1.75 mW, while drawing 1.45 mA from a
1.2 V supply voltage. In order to reduce the size of the passive
components, the VCO natural oscillation frequency is 5 GHz. A clocked
CMOS divide-by-two circuit is used for frequency division and quadrature phase
generation. The VCO has a -109 dBc/Hz phase noise at 1 MHz
frequency offset and a 2.35-2.5 GHz tuning range (after the frequency
division), thus complying with ZigBee requirements.
Abstract: This work describes the information technologies created
and successfully maintained at the Institute of Information
Technologies of the National Academy of Sciences (NAS) of Azerbaijan.
By decision of the board of the Supreme Certifying Commission under
the President of the Azerbaijan Republic and of the Presidium of the
National Academy of Sciences of the Azerbaijan Republic, the
Institute of Information Technologies was entrusted with organizing
training courses in Computer Science for all post-graduate students
and dissertators of the republic and with administering the
candidate-minimum examinations. Accordingly, the Educational Center
teaches computer science to post-graduate students and dissertators,
provides a scientific-methodological manual on the effective
application of new information technologies in their research work,
and administers the candidate-minimum examinations. Information and
communication technologies offer new opportunities and prospects for
teaching and training. The new level of literacy demands the creation
of essentially new technologies for obtaining scientific knowledge.
Methods of training and development, social and professional
requirements, and the globalization of the communicative, economic
and political projects connected with the construction of a new
society all depend on the level of application of information and
communication technologies in the educational process. Computer
technologies develop the ideas of programmed training and open
completely new, as yet unexplored, technological ways of training
connected to the unique capabilities of modern computers and
telecommunications. Computer-based training is the process of
preparing and transferring information to the trainee by means of a
computer. Scientific and technical progress, as well as the global
spread of the technologies created in the most developed countries of
the world, is the main proof of the leading role of education in the
XXI century. The information society needs individuals with modern
knowledge. In practice, all technologies using special technical
information means (computer, audio, video) are called information
technologies of education.
Abstract: The Generalized Center String (GCS) problem generalizes
the Common Approximate Substring and Common Substring problems.
GCS is known to be NP-hard; the difficulty lies in the explosion of
potential candidate strings. The longest center string is not known
in advance for any particular biological gene process, since the
sequences may not contain any motifs. GCS can be solved by frequent
pattern-mining techniques and is known to be fixed-parameter
tractable with respect to the input sequence length and symbol-set
size. Efficient methods known as the Bpriori algorithms can solve
GCS with reasonable time/space complexities; the Bpriori 2 and
Bpriori 3-2 algorithms find center strings of any length together
with the positions of all their instances in the input sequences. In
this paper, we reduce the time/space complexity of the Bpriori
algorithm with a Constraint-Based Frequent Pattern mining (CBFP)
technique which integrates the ideas of constraint-based mining and
FP-tree mining. The CBFP mining technique solves the GCS problem not
only for center strings of any length, but also for the positions of
all their mutated copies in the input sequences. CBFP constructs a
TRIE-like FP-tree to represent the mutated copies of center strings
of any length, and uses constraints to restrain the growth of the
consensus tree. The complexity of the CBFP technique and of the
Bpriori algorithm is analyzed for both the worst case and the
average case. The correctness of the algorithm is demonstrated by
comparison with the Bpriori algorithm on artificial data.
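To make the candidate explosion concrete, here is a naive brute-force solver for the underlying center-string question: find every length-L string that occurs in all input sequences with at most d mismatches. It enumerates all |Σ|^L candidates, which is exactly the blow-up that Bpriori and CBFP are designed to avoid; the DNA alphabet and toy sequences are our illustration.

```python
from itertools import product

def hamming(a, b):
    """Number of mismatching positions between equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def occurs_with_mismatches(center, seq, d):
    """True if 'center' matches some window of 'seq' with <= d mismatches."""
    L = len(center)
    return any(hamming(center, seq[i:i + L]) <= d
               for i in range(len(seq) - L + 1))

def center_strings(seqs, L, d, alphabet="ACGT"):
    """Brute force: test every string in alphabet^L against every sequence."""
    return ["".join(c)
            for c in product(alphabet, repeat=L)
            if all(occurs_with_mismatches("".join(c), s, d) for s in seqs)]

print(center_strings(["ACGTAC", "TTACGA", "GACGTT"], L=3, d=0))  # ['ACG']
```

The cost is O(|Σ|^L · n · m) for n sequences of length m; pattern-mining approaches like CBFP prune this candidate space with constraints instead of enumerating it exhaustively.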
Abstract: With the proliferation of mobile device
technologies, mobile learning can be used to complement and
improve on traditional learning. Both students and teachers
need a proper and handy system to monitor and keep track of the
performance of the students. This paper presents an implementation
of m-learning for primary schools in Malaysia using open
source technology. It focuses on learning mathematics using
handheld devices for primary school students aged 11 and 12 years
old. The main users of this system are students, teachers and the
administrator. The application offers a new mobile learning
environment with a mobile graph for tracking the students' progress
and performance. The purpose of this system is not to replace the
traditional classroom but to complement the learning process. In
testing conducted, students who used this system performed better in
their examination.
Abstract: Segmentation of a color image composed of different
kinds of regions can be a hard problem, especially when exact
texture fields must be computed and the optimum number of
segmentation areas must be decided for an image containing similar
and/or non-stationary texture fields. A novel neighborhood-based
segmentation approach is proposed. A genetic algorithm is used in the
proposed segment-pass optimization process; in this pass, an energy
function defined on Markov Random Fields is minimized. In this paper
we use an adaptive threshold estimation method for image thresholding
in the wavelet domain based on generalized Gaussian distribution
(GGD) modeling of the subband coefficients. This method, called
NormalShrink, is computationally efficient and adaptive because the
parameters required for estimating the threshold depend on the
subband data energy used in the pre-stage of segmentation. A quadtree
is employed to implement the multiresolution framework, which enables
the use of different strategies at different resolution levels, and
hence the computation can be accelerated. The experimental results
using the proposed segmentation approach are very encouraging.
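The NormalShrink threshold referred to above is commonly published in the form sketched below; this is a minimal sketch assuming the usual definitions (subband size L_k, J decomposition levels, noise estimated from the finest diagonal subband) — the variable names and the random test band are ours.

```python
import numpy as np

def estimate_noise_sigma(diag_subband):
    """Robust noise estimate from the finest diagonal subband:
    sigma_n = median(|HH1|) / 0.6745 (standard wavelet practice)."""
    return np.median(np.abs(diag_subband)) / 0.6745

def normal_shrink_threshold(subband, noise_sigma, n_decompositions):
    """NormalShrink-style threshold T = beta * sigma_n^2 / sigma_x,
    beta = sqrt(log(L_k / J)); sigma_x is the signal std estimated
    from the subband (assumed form of the published formula)."""
    L_k = subband.size
    beta = np.sqrt(np.log(L_k / n_decompositions))
    sigma_x = np.sqrt(max(subband.var() - noise_sigma**2, 1e-12))
    return beta * noise_sigma**2 / sigma_x

def soft_threshold(coeffs, t):
    """Soft thresholding applied with the estimated T."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

rng = np.random.default_rng(0)
band = rng.normal(0.0, 1.0, (32, 32))         # pure-noise subband
t = normal_shrink_threshold(band, estimate_noise_sigma(band), 3)
den = soft_threshold(band, t)
```

Note that on a pure-noise subband the estimated signal variance is near zero, so the threshold grows large and the coefficients are suppressed — the intended behavior of the shrinkage rule.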
Abstract: The present article deals with a composite casting process that allows the production of bilayer AlSn6-Al strips based on the technique of horizontal continuous casting. In the first part, experimental investigations on the production of a single-layer AlSn6 strip are described. Afterwards, essential results of basic compound-casting trials using simple test specimens are presented to define the thermal conditions required for a metallurgical bond between the alloy AlSn6 and pure aluminium. Subsequently, numerical analyses are described. A finite element model was used to examine a continuous composite casting process. As a result of the simulations, the main parameters influencing the thermal conditions within the composite casting region could be pointed out. Finally, basic guidance is given for the design of an appropriate composite mould system.
Abstract: CO2 is the primary anthropogenic greenhouse gas,
accounting for 77% of the human contribution to the greenhouse
effect in 2004. In recent years, the global concentration of CO2 in the
atmosphere has been increasing rapidly. CO2 emissions have an impact on
global climate change. Anthropogenic CO2 is emitted primarily from
fossil fuel combustion. Carbon capture and storage (CCS) is one
option for reducing CO2 emissions. There are three major approaches
for CCS: post-combustion capture, pre-combustion capture and
oxyfuel process. Post-combustion capture offers some advantages, as
existing combustion technologies can still be used without radical
changes.
There are several post-combustion gas separation and capture
technologies being investigated, namely: (a) absorption, (b)
cryogenic separation, (c) membrane separation, (d) micro-algal
biofixation and (e) adsorption. Apart from establishing new
techniques, the exploration of capture materials with high separation
performance and low capital cost is of paramount importance.
Moreover, any adsorption-based technology requires easily regenerable
and durable adsorbents with a high CO2 adsorption capacity. It has
recently been reported that the cost of CO2 capture can be reduced by
using this technology. In this paper, the research progress (from
experimental results) in adsorbents for CO2 adsorption, storage, and
separation is reviewed, and future research directions are suggested
as well.
Abstract: The burst of Web 2.0 technology and social
networking tools manifest different styles of learning and managing
knowledge among both knowledge workers and adult learners. In the
Western countries, open-learning concept has been made popular due
to the ease of use and the reach that the technology provides. In
Malaysia, there are still some gaps between learners' acceptance
of technology and the full implementation of the technology in the
education system. There is a need to understand how adult learners,
who are knowledge workers, manage their personal knowledge via
social networking tools, especially in their learning process. Four
processes of personal knowledge management (PKM) and four
cognitive enablers are proposed, supported by analysed data on adult
learners in a university. The model derived from these processes and
enablers is tested and presented, with recommendations on features to be included in adult learners' learning environments.
Abstract: In this paper a new approach to prioritize urban planning projects in an efficient and reliable way is presented. It is based on environmental pressure indices and multicriteria decision methods. The paper introduces a rigorous method, with acceptable complexity, of rank-ordering urban development proposals according to their environmental pressure. The technique combines the use of Environmental Pressure Indicators, the aggregation of the indicators into an Environmental Pressure Index by means of the Analytic Network Process (ANP) method, and the interpretation of the information obtained from the experts during the decision-making process. The ANP method allows the aggregation of the experts' judgments on each of the indicators into one Environmental Pressure Index. In addition, ANP is based on utility ratio functions, which are the most appropriate for the analysis of uncertain data such as experts' estimations. Finally, unlike other multicriteria techniques, ANP allows the decision problem to be modelled using the relationships among dependent criteria. The method has been applied to the proposal for urban development of La Carlota airport in Caracas (Venezuela). The Venezuelan Government would like to see a recreational project developed on the abandoned area that would mean a significant improvement for the capital. Three options are currently on the table and under evaluation: a Health Club, a Residential Area and a Theme Park. The participating experts agreed that the method proposed in this paper is useful and an improvement over traditional techniques such as environmental impact studies, life-cycle analysis, etc. They find the results obtained coherent, the process sufficiently rigorous and precise, and the use of resources significantly lower than in other methods.
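The ANP aggregation relies on priority weights extracted from pairwise-comparison judgments. A minimal sketch of that eigenvector step is below; the comparison matrix is an illustrative example of ours, not the case-study data.

```python
import numpy as np

def priority_vector(pairwise, iters=100):
    """Principal eigenvector of a pairwise-comparison matrix via power
    iteration, as used in AHP/ANP to turn expert judgments into weights."""
    w = np.ones(pairwise.shape[0])
    for _ in range(iters):
        w = pairwise @ w
        w /= w.sum()          # keep the weights normalized to sum 1
    return w

# Example judgments: indicator A is 3x as important as B, 5x as important as C
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])
w = priority_vector(A)
print(np.round(w, 3))  # weights sum to 1, with A weighted highest
```

In a full ANP model these local priority vectors are assembled into a supermatrix that captures the dependencies among criteria; the sketch shows only the judgment-to-weight step.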
Abstract: One of the major challenges in the Information
Retrieval field is handling the massive amount of information
available to Internet users. Existing ranking techniques and strategies
that govern the retrieval process fall short of expected accuracy.
Often relevant documents are buried deep in the list of documents
returned by the search engine. In order to improve retrieval accuracy
we examine the issue of language effect on the retrieval process.
Then, we propose a solution for a more user-centric relevance ranking
of retrieved data, biased toward the individual user. The results demonstrate that using indices based
on variations of the same language enhances the accuracy of search
engines for individual users.
Abstract: Processing tabah bamboo shoot as a fermented pickle is
one way to increase the shelf life of this bamboo shoot. The
advantage of this shoot is its low concentration of hydrocyanic acid
(HCN), which makes it a potential functional food product. This study
aimed to determine the characteristics of tabah bamboo shoot pickle,
such as the total count of lactic acid bacteria (LAB), pH, total
acidity, and hydrocyanic acid (HCN) content, and also to identify the
LAB types involved during fermentation and the organic acid profiles.
The pickle was made by natural fermentation with a 6% salt
concentration, and fermentation was conducted for 13 days.
The results showed that during the fermentation, the LAB count
peaked on the 4th day at 72 x 10^7 CFU/ml, and the
lowest pH was 3.09. We also found a decrease in HCN from 37.8
ppm at the beginning to 20.52 ppm at the end of the fermentation
process. The organic acids detected during the fermentation were
lactic acid, with a highest concentration of 0.0546 g/100 g, and a
small amount of acetic acid. Using the PCR method, 18 rod-shaped
LAB isolates were identified as members of Lactobacillus spp.,
of which 17 strains were identified as L. plantarum.
Abstract: Recently, the analysis and design of structures based on
reliability theory has been the center of attention. The reason for
this attention is the naturally random character of structural
parameters such as material specifications, external loads, geometric
dimensions, etc. By means of reliability theory, uncertainties
resulting from the statistical nature of the structural parameters
can be turned into mathematical equations, and safety and operational
considerations can be taken into account in the design process.
According to this theory, it is possible to study the failure
probability not only of a specific element but also of the entire
system. Therefore, after being assured of the safety of every
element, their reciprocal effects on the safety of the entire system
can be investigated.
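A minimal sketch of the core reliability computation: with a resistance R and a load effect S modeled as random variables, the failure probability is P(R < S), which a Monte Carlo simulation estimates directly. The distributions and parameters below are illustrative assumptions of ours, not from the paper.

```python
import numpy as np

def failure_probability(n=200_000, seed=0):
    """Monte Carlo estimate of P(R < S): fraction of random samples in
    which the load effect S exceeds the resistance R."""
    rng = np.random.default_rng(seed)
    R = rng.normal(10.0, 1.0, n)   # resistance, e.g. member strength
    S = rng.normal(7.0, 1.5, n)    # load effect
    return np.mean(R < S)

p = failure_probability()
print(p)  # roughly 0.05 for these illustrative parameters
```

For a system, the same idea extends by sampling all element resistances and loads jointly and counting samples in which the system limit state is violated, which captures the reciprocal effects mentioned above.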
Abstract: The typical coupled-tanks process is a TITO plant whose
controller design is difficult because of its changing dynamics
and process interactions. This paper presents a design
methodology for an auto-adjustable PI controller using the MRAC
technique. The proposed method can adjust the controller
parameters in real time in response to changes in the plant and
disturbances, by referring to the reference model that
specifies the properties of the desired control system.
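A generic sketch of the MRAC idea with the MIT rule on a first-order plant — not the paper's coupled-tanks PI design; the plant gain, adaptation gain and reference are illustrative assumptions.

```python
def simulate_mrac(gamma=0.5, dt=0.01, steps=20_000):
    """Toy MRAC: plant y' = -y + k*u with unknown gain k, reference
    model ym' = -ym + r, control u = theta*r. The MIT rule
    dtheta/dt = -gamma * e * ym drives theta toward the ideal 1/k."""
    k = 2.0                  # unknown plant gain; ideal theta = 1/k = 0.5
    y = ym = theta = 0.0
    r = 1.0                  # constant reference input
    for _ in range(steps):
        u = theta * r
        y += dt * (-y + k * u)           # plant step (forward Euler)
        ym += dt * (-ym + r)             # reference model step
        e = y - ym                       # model-following error
        theta += dt * (-gamma * e * ym)  # MIT-rule parameter update
    return theta, y, ym

theta, y, ym = simulate_mrac()
print(round(theta, 2))  # converges toward 1/k = 0.5
```

The paper's method adapts PI gains rather than a single feedforward gain, but the mechanism is the same: the error between the plant output and the reference-model output drives the parameter update in real time.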
Abstract: Utilization of waste material in asphalt pavement would be
beneficial as an alternative way to increase the service life of
asphalt pavement and to reduce environmental pollution as well. One
of these waste materials is polyethylene terephthalate (PET), a type
of polyester material produced in large quantities. This research
program investigates the effects of adding waste PET particles, with
a maximum size of 2.36 mm, into the asphalt mixture. Different
percentages of PET were added to the mixture via the dry process. A
gap-graded mixture (SMA 14) and PG 80-100 asphalt binder were used
for this study. To evaluate the PET-reinforced asphalt mixture,
different laboratory investigations were conducted on specimens: the
Marshall stability test was carried out, and the stiffness modulus
test and indirect tensile fatigue test were conducted on specimens at
optimum asphalt content. It was observed that in many cases the
PET-reinforced SMA mixture had better mechanical properties than the
control mixture.