Abstract: In this research, the preparation of a land-use map from
LISS-III scanner data of the IRS satellite for the Aghche region in
Isfahan province is studied. For this purpose, IRS satellite images
from August 2008 were used, and the various land-use classes in the
region, including rangelands, irrigated farming, dry farming, gardens
and urban areas, were identified and separated. GPS and the ERDAS
Imagine software were used, and three classification methods, Maximum
Likelihood, Mahalanobis Distance and Minimum Distance, were analyzed.
For each method the error (confusion) matrix and Kappa index were
calculated, yielding accuracies of 53.13%, 56.64% and 48.44%,
respectively. Given the low accuracy of these methods in separating
land-use classes, visual interpretation was used to prepare the map.
Finally, 150 randomly selected points were checked during field
visits and no error was observed, showing that the map prepared by
visual interpretation has high accuracy. Although errors due to
visual interpretation and geometric correction may occur, the desired
map accuracy of more than 85 percent is reliable.
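The per-method accuracies above come from an error (confusion) matrix and the Kappa index. As a minimal sketch of that computation (the matrix counts below are hypothetical, not the study's data):

```python
# Hypothetical 3-class confusion matrix (rows: reference, cols: classified);
# the counts are illustrative, not the study's data.
confusion = [
    [50, 10, 5],
    [8, 40, 12],
    [4, 9, 30],
]

def kappa_and_accuracy(matrix):
    """Return (overall accuracy, Cohen's kappa) for a square confusion matrix."""
    total = sum(sum(row) for row in matrix)
    diag = sum(matrix[i][i] for i in range(len(matrix)))
    po = diag / total                      # observed agreement
    pe = sum(                              # agreement expected by chance
        sum(matrix[i]) * sum(row[i] for row in matrix)
        for i in range(len(matrix))
    ) / total ** 2
    return po, (po - pe) / (1 - pe)

acc, kappa = kappa_and_accuracy(confusion)
```

Because the Kappa index discounts the agreement expected by chance, it is lower than the raw overall accuracy, which is why the two figures are reported separately.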
Abstract: This paper focuses on operational risk measurement
techniques and on economic capital estimation methods. A data
sample of operational losses provided by an anonymous Central
European bank is analyzed using several approaches. Loss
Distribution Approach and scenario analysis method are considered.
Custom plausible loss events defined in a particular scenario are
merged with the original data sample and their impact on capital
estimates and on the financial institution is evaluated. Two main
questions are assessed: what is the most appropriate statistical
method to measure and model the operational loss data distribution,
and what is the impact of hypothetical plausible events on the
financial institution? The g&h distribution was found to be the most
suitable one for operational risk modeling. The method based on the
combination of historical loss events modeling and scenario analysis
provides reasonable capital estimates and allows for the measurement
of the impact of extreme events on banking operations.
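As a rough illustration of the modeling step, a g-and-h random variable is a monotone transform of a standard normal, and a capital estimate can be read off as a high quantile of simulated annual aggregate losses. The sketch below assumes illustrative parameter values and a Poisson loss frequency; it is not the bank's data or the paper's calibration:

```python
import math
import random

random.seed(42)

def gh_loss(a=0.0, b=1.0, g=2.0, h=0.2):
    """One draw from a g-and-h distribution: a monotone transform of a
    standard normal Z. The parameters here are purely illustrative; a real
    calibration would fit them to the loss sample (e.g. by quantile matching)."""
    z = random.gauss(0.0, 1.0)
    return a + b * (math.expm1(g * z) / g) * math.exp(h * z * z / 2.0)

def poisson(lam):
    """Knuth's simple Poisson sampler (adequate for small lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def annual_capital(years=20000, lam=25, quantile=0.999):
    """Monte-Carlo aggregate annual losses and return the VaR-style
    high quantile used as an economic-capital estimate."""
    totals = sorted(sum(gh_loss() for _ in range(poisson(lam)))
                    for _ in range(years))
    return totals[int(quantile * years)]

capital = annual_capital()
```

The heavy right tail produced by positive g and h is what makes the distribution attractive for operational losses; the scenario-analysis step of the paper would correspond to appending the custom plausible events to the simulated (or historical) sample before taking the quantile.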
Abstract: This paper presents a comparative study of Ant Colony and Genetic Algorithms for VLSI circuit bi-partitioning. Ant Colony Optimization is an optimization method based on the behaviour of social insects [27], whereas the Genetic Algorithm is an evolutionary optimization technique based on the Darwinian theory of natural evolution and its concept of survival of the fittest [19]. Both methods are stochastic in nature and have been successfully applied to solve many NP-hard problems. The results obtained show that Genetic Algorithms outperform the Ant Colony Optimization technique when tested on the VLSI circuit bi-partitioning problem.
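A genetic algorithm for bi-partitioning can be sketched in a few lines: a bitstring assigns each cell to one of two partitions, and fitness combines the cut size with a balance penalty. The graph, parameters and operators below are illustrative choices, not the paper's implementation:

```python
import random

random.seed(1)

# A tiny hypothetical netlist as an undirected graph: list of edges.
EDGES = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
N = 6  # number of cells

def cut_size(bits):
    """Number of edges crossing the two partitions (lower is better)."""
    return sum(1 for u, v in EDGES if bits[u] != bits[v])

def fitness(bits):
    """Cut size plus a penalty for unbalanced partitions."""
    imbalance = abs(sum(bits) - (N - sum(bits)))
    return cut_size(bits) + 2 * imbalance

def crossover(a, b):
    point = random.randrange(1, N)
    return a[:point] + b[point:]

def mutate(bits, rate=0.1):
    return [1 - g if random.random() < rate else g for g in bits]

def ga(pop_size=30, generations=60):
    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[:pop_size // 2]          # truncation selection + elitism
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=fitness)

best = ga()
```

An ACO variant would instead build partitions incrementally, guided by pheromone trails on cell-to-partition assignments; the comparison in the paper is between these two search strategies on the same cut-size objective.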
Abstract: This paper describes a new method for extracting the fetal heart rate (fHR) and the fetal heart rate variability (fHRV) signal non-invasively from abdominal maternal electrocardiogram (mECG) recordings. The extraction is based on the fundamental frequency (Fourier) theorem. The fundamental frequency of the mother's electrocardiogram signal (f0,m) is calculated directly from the abdominal signal. The heart rate of the fetus is usually higher than that of the mother; as a result, the fundamental frequency of the fetal electrocardiogram signal (f0,f) is higher than the mother's (f0,f > f0,m). Notch filters were designed to suppress the mother's higher harmonics; then a bandpass filter targeting f0,f and rejecting f0,m was implemented. Although the bandpass filter also passes some other frequencies (harmonics), we show in this study that those harmonics are actually carried on f0,f and thus have no impact on the evaluation of the beat-to-beat changes (RR intervals). The oscillations of the time-domain extracted signal represent the RR intervals. We also show that zero-to-zero (zero-crossing) evaluation of the periods is more accurate than peak-to-peak evaluation. The method is evaluated both on simulated signals and on abdominal recordings obtained at different gestational ages.
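The harmonic bookkeeping can be illustrated on a toy synthetic signal: suppress the maternal fundamental and its harmonics, band-limit the search, and read off the fetal fundamental from the residual spectrum. Everything below (sample rate, fundamentals, amplitudes, and crude DFT-bin masking standing in for the paper's notch and bandpass filters) is an illustrative assumption:

```python
import cmath
import math

FS, N = 32.0, 512                      # sample rate (Hz) and record length
F0_M, F0_F = 1.4, 2.3                  # assumed maternal / fetal fundamentals

# Synthetic abdominal signal: strong maternal harmonics plus a weaker
# fetal component (an illustrative stand-in for a real recording).
def abdominal(n):
    t = n / FS
    m = sum((1.0 / k) * math.cos(2 * math.pi * k * F0_M * t) for k in range(1, 4))
    f = 0.3 * math.cos(2 * math.pi * F0_F * t)
    return m + f

x = [abdominal(n) for n in range(N)]

def dft_mag(x):
    """Naive DFT magnitudes (O(N^2); fine for a short sketch)."""
    return [abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N)
                    for n in range(N))) for k in range(N // 2)]

mag = dft_mag(x)
hz = FS / N                                        # bin width

# "Notch" out the maternal fundamental and its harmonics...
for k in (1, 2, 3):
    centre = round(k * F0_M / hz)
    for b in range(centre - 2, centre + 3):
        mag[b] = 0.0
# ...then "band-pass": search for the fetal fundamental above F0_M only.
lo = int(1.8 / hz)
peak_bin = max(range(lo, len(mag)), key=mag.__getitem__)
f_fetal = peak_bin * hz                            # estimated fetal fundamental
```

With the assumed f0,m = 1.4 Hz and f0,f = 2.3 Hz, the surviving spectral peak falls at the fetal fundamental, mirroring the paper's observation that the residual harmonics ride on f0,f and do not disturb the RR-interval estimate.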
Abstract: To understand life as a biological system, an evolutionary
understanding is indispensable. Protein interaction data are rapidly
accumulating and are suitable for system-level evolutionary analysis.
We have analyzed the yeast protein interaction network by both
mathematical and biological approaches. In this poster presentation,
we infer the evolutionary birth periods of yeast proteins by
reconstructing phylogenetic profiles. It has been thought that hub
proteins, which have a high connection degree, are evolutionarily
old, but our analysis showed that hub proteins are entirely
evolutionarily new. We also examined the evolutionary processes of
protein complexes and found that the member proteins of a complex
tend to have appeared in the same evolutionary period. Our results
suggest that the protein interaction network evolved by modules that
form functional units. We also reconstructed standardized
phylogenetic trees and calculated the evolutionary rates of yeast
proteins, finding no obvious correlation between the evolutionary
rates and connection degrees of yeast proteins.
Abstract: Terrorism represents an unexpected and unwanted change which challenges one's social identity. We carried out a study to explore the role of demographic variables in the perception of personal and national threat, and to investigate the effects of perceived terrorist threat on people's ways of life, moods, opinions and hopes. 313 residents of Palermo (Italy) were interviewed. The results pointed out that the fear of terrorism affects three areas: the cognitive, the emotional and the behavioural.
Abstract: This paper focuses on the data-driven generation of fuzzy
IF...THEN rules. The resulting fuzzy rule base can be applied to
build a classifier, a model used for prediction, or a decision
support system. Among the wide range of possible approaches, the
decision tree and association rule based algorithms are overviewed,
and two new approaches are presented based on a priori
fuzzy-clustering-based partitioning of the continuous input
variables. An application study is also presented, in which the
developed methods are tested on the well-known Wisconsin Breast
Cancer classification problem.
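The rule-firing side of such a rule base can be sketched very compactly: fuzzy partitions of a continuous input give membership functions, and classification takes the label of the strongest-firing rule. The partitions, thresholds and labels below are hypothetical, chosen only to echo the breast-cancer setting; this is not the paper's algorithm:

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical fuzzy partitions of one continuous input (say, cell size
# on a 1..10 scale), as a fuzzy-clustering step might produce them.
SMALL = lambda x: tri(x, 0, 1, 5)
LARGE = lambda x: tri(x, 4, 10, 11)

# Two data-driven IF...THEN rules: (antecedent membership, class label).
RULES = [(SMALL, "benign"), (LARGE, "malignant")]

def classify(x):
    """Fire every rule and return the label of the strongest one."""
    degree, label = max((mf(x), lab) for mf, lab in RULES)
    return label
```

For example, `classify(2)` fires the SMALL rule most strongly and returns "benign", while `classify(9)` returns "malignant"; the paper's contribution lies in deriving the partitions and rules from data rather than in this inference step.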
Abstract: This work deals with aspects of support vector learning for large-scale data mining tasks. Based on a decomposition algorithm that can be run in serial and parallel mode, we introduce a data transformation that allows the use of an expensive generalized kernel without additional costs. To speed up the decomposition algorithm, we analyze the problem of working set selection for large data sets and the influence of the working set sizes on the scalability of the parallel decomposition scheme. Our modifications and settings improve support vector learning performance and thus allow the use of extensive parameter search methods to optimize classification accuracy.
Abstract: The Bologna process has influenced the enhancement of
student-centered learning in Estonian higher education since 2009,
but there is no information about what helps or hinders students in
achieving learning outcomes and how the quality of student-centered
learning might be improved. The purpose of this study is to analyze
two questions from the outcome-based course evaluation questionnaire
used at the Estonian Entrepreneurship University of Applied Sciences.
In this qualitative research, 384 students from 22 different courses
described what helped and hindered them in achieving learning
outcomes. The analysis showed that the aspects that hinder students
from achieving learning outcomes are mostly personal: time
management, family and personal matters, motivation and non-academic
activities. The results indicate that students' learning is commonly
supported by the school, where the teacher, the teaching and the
characteristics of the teaching methods help most in achieving
learning outcomes; learning materials, practical assignments and
independent study were also brought up as key elements.
Abstract: One of the important applications of gas turbines is their
use with heat recovery steam generators in combined-cycle technology.
Exhaust flow and exhaust energy are two key parameters for
determining heat recovery steam generator performance, and they are
mainly determined by the performance data of the main gas turbine
components. For this reason, a method for determining the exhaust
energy was developed in the new edition of ASME PTC 22. This
investigation shows that the standard's method has considerable
error. Therefore, in this paper a new method is presented to modify
the performance calculation. The modified method is based on exhaust
gas constituent analysis and combustion calculations. Design data for
two General Electric gas turbines are used as case studies to
validate the methodologies. The results show that the modified method
is more precise than the ASME PTC 22 method: the deviation of the
calculated exhaust flow from the design data is 1.5-2% with the ASME
PTC 22 method, whereas it is 0.3-0.5% with the modified method. Given
the precision of analyzer instruments, the method can be a suitable
alternative for the standard gas turbine performance test. In
addition, two procedures are proposed within the modified method,
based on known and unknown fuel composition. The difference between
the two procedures is below 0.02%. Given the reasonable results of
the second procedure (unknown fuel composition), the method can be
applied to gas turbine performance evaluation, reducing measurement
cost and data gathering.
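For orientation, the exhaust energy that both the standard and the modified method target is essentially a sensible-heat quantity, E = m·cp·(T_exhaust − T_reference). A back-of-the-envelope sketch with purely illustrative numbers (not ASME PTC 22 values or GE design data):

```python
# Minimal sensible-heat estimate of gas-turbine exhaust energy.
# All numbers below are illustrative placeholders.
M_DOT = 400.0                 # exhaust mass flow, kg/s
CP_MEAN = 1.15                # mean specific heat of the exhaust gas, kJ/(kg*K)
T_EXH, T_REF = 580.0, 15.0    # exhaust and reference temperatures, degC

exhaust_energy_mw = M_DOT * CP_MEAN * (T_EXH - T_REF) / 1000.0  # MW
```

Replacing the assumed CP_MEAN with a composition-weighted value obtained from exhaust gas constituent analysis is, roughly, where a constituent-based method would gain its precision over a fixed-property calculation.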
Abstract: This paper presents the automated methods employed
for extracting craniofacial landmarks in white light images as part of
a registration framework designed to support three neurosurgical
procedures. The intraoperative space is characterised by white light
stereo imaging while the preoperative plan is performed on CT scans.
The registration aims at aligning these two modalities to provide a
calibrated environment to enable image-guided solutions. The
neurosurgical procedures can then be carried out by mapping the
entry and target points from CT space onto the patient's space. The
registration basis adopted consists of natural landmarks (eye corner
and ear tragus). A 5mm accuracy is deemed sufficient for these three
procedures and the validity of the selected registration basis in
achieving this accuracy has been assessed by simulation studies. The
registration protocol is briefly described, followed by a presentation
of the automated techniques developed for the extraction of the
craniofacial features and results obtained from tests on the AR and
FERET databases. Since the three targeted neurosurgical procedures
are routinely used for head injury management, the effect of
bruised/swollen faces on the automated algorithms is assessed. A
user-interactive method is proposed to deal with such unpredictable
circumstances.
Abstract: Optical networks use a routing tool called the Latin
router. These routers use particular algorithms for routing; for
example, the LDF algorithm uses backtracking (one of the CSP methods)
for problem solving. In this paper, we propose new approaches for
completing the routing table (the DRA and CRA algorithms), compare
them with previously proposed methods, and show that the number of
backtrackings, the number of blockings and the run time of the DRA
algorithm are lower than those of the LDF and CRA algorithms.
Abstract: This paper surveys the level of awareness of traditional
grocery stores in Bangkok in the following categories: location,
service quality, risk, shopping, worthwhileness, shopping
satisfaction, and future shopping intention. It also surveys the
factors influencing the decision to shop at traditional grocery
stores in Bangkok in the future. The findings revealed that consumers
had a high level of awareness of traditional grocery stores in
Bangkok. Consumers were aware that prices were higher and that it was
riskier to buy goods and services at traditional grocery stores, but
they still had a high level of preference for patronizing them. This
was due to a high level of satisfaction with the friendliness of the
owner, the ability to negotiate the price, the ability to buy on
credit, free delivery, and the enjoyment of meeting other customers
from the same neighborhood.
Abstract: The design of a constant-chord propeller is presented in
this paper in order to reduce the costs of the propeller design
procedure. The design process was based on Lock's and Goldstein's
techniques of propeller design and analysis. To calculate the
optimum chord of the propeller, the chord of a reference element is
generalized as the chord of the whole blade. The design outcome,
named CS-X-1, was modeled and analyzed by CFD methods using the RNG
k-ε turbulence model. The convergence of the results of the two codes
proved that the outcome of the design process is reliable. The design
result is a two-blade propeller with a total diameter of 1.1 m, a
rotational speed of 3000 RPM, an efficiency above 0.75 and a power
coefficient near 1.05.
Abstract: In this paper, a new face recognition method based on PCA
(Principal Component Analysis), LDA (Linear Discriminant Analysis)
and neural networks is proposed. The method consists of four steps:
i) preprocessing, ii) dimension reduction using PCA, iii) feature
extraction using LDA and iv) classification using a neural network.
The combination of PCA and LDA is used to improve the capability of
LDA when only a few sample images are available, and the neural
classifier is used to reduce the number of misclassifications caused
by non-linearly separable classes. The proposed method was tested on
the Yale face database. Experimental results on this database
demonstrate the effectiveness of the proposed method for face
recognition, with fewer misclassifications than previous methods.
Abstract: Carbon Capture & Storage (CCS) is one of the various
methods that can be used to reduce the carbon footprint of the
energy sector. This paper focuses on the absorption of CO2 from
flue gas using packed columns, whose efficiency is highly dependent
on the structure of the liquid films within the column. To study the
characteristics of these liquid films, the CFD solver OpenFOAM is utilised
to solve two-phase, isothermal film flow using the volume-of-fluid
(VOF) method. The model was validated using existing experimental
data and the Nusselt theory. It was found that smaller plate inclination
angles, with respect to the horizontal plane, resulted in larger wetted
areas on smooth plates. However, only a slight improvement in
the wetted area was observed. Simulations were also performed
using a ridged plate and it was observed that these surface textures
significantly increase the wetted area of the plate. This was mainly
attributed to the channelling effect of the ridges, which helped to
oppose the surface tension forces trying to minimise the surface area.
Rivulet formations on the ridged plate were also flattened out and
spread across a larger proportion of the plate width.
Abstract: This paper investigates the effects of a sharp-edged gust on the aeroelastic behavior and time-domain response of a typical-section model, using Jones' approximate aerodynamics for pure plunging motion. Flutter analysis has been carried out using the p and p-k methods developed for the presented finite-state aerodynamic model of a typical-section model (airfoil). The gust analysis is introduced as a linear set of ordinary differential equations in a simplified procedure, carried out by transformation into an eigenvalue problem.
Abstract: Given the new developments in the field of information and communication technologies, the necessity arises for the active use of these new technologies in education. It is clear that the integration of technology in the education system will differ between primary and higher education, and between traditional and distance education. In this study, the integration of technology in distance education is discussed from the viewpoint of students. Using student feedback about an education program in which new technological media are used, we explain how the survey variables can be separated into positive, negative and supporting factors, and how the education strategy of higher education associations can be redesigned by examining the variables of each determined factor. The paper concludes with recommendations on the necessity of working in groups of experts from different areas and of using numerical methods when establishing a successful education strategy.
Abstract: The amount and heterogeneity of data in biomedical research, notably in interdisciplinary research, require new methods for the collection, presentation and analysis of information. Important data from laboratory experiments as well as patient trials are available but come from distributed resources. The Charité Medical School in Berlin has established, together with the German Research Foundation (DFG), a new information service center for kidney diseases and transplantation (Open European Nephrology Science Centre, OpEN.SC). The system is based on a service-oriented architecture (SOA) with main and auxiliary modules arranged in four layers. To improve the reuse and efficient arrangement of the services, the functionalities are described as business processes using the standardised Business Process Execution Language (BPEL).
Abstract: Analysis and visualization of microarray data is very helpful for biologists and clinicians in the field of diagnosis and treatment of patients. It allows clinicians to better understand the structure of microarrays and facilitates understanding gene expression in cells. However, a microarray dataset is a complex data set with thousands of features and a very small number of observations. This very high-dimensional data often contains noise, non-useful information and only a small number of features relevant to the disease or genotype. This paper proposes a non-linear dimensionality reduction algorithm, Local Principal Component (LPC), which aims to map high-dimensional data to a lower-dimensional space. The reduced data represent the most important variables underlying the original data. Experimental results and comparisons are presented to show the quality of the proposed algorithm. Moreover, experiments show how the algorithm reduces high-dimensional data whilst preserving the neighbourhoods of the points in the low-dimensional space as in the high-dimensional space.