Abstract: E-Commerce services have grown rapidly over the past
decade. However, the methods used to authenticate users still rely
widely on numeric approaches, so the search for other verification
methods suitable for online e-Commerce is an interesting issue. In
this paper, a new online signature-verification method using angular
transformation is presented. Delay shifts present in online
signatures are estimated by a method relying on angle representation.
In the proposed signature-verification algorithm, all components of
the input signature are extracted by detecting the discontinuous
break points in the stream of angular values. The estimated delay
shift is then obtained by comparison with the selected reference
signature, and the matching error is computed as the main feature for
the verification process. The threshold offsets are calculated from
the two error characteristics of the signature-verification problem,
the False Rejection Rate (FRR) and the False Acceptance Rate (FAR).
The level of these two error rates depends on the chosen decision
threshold, whose value is set so as to realize the Equal Error Rate
(EER; FAR = FRR). Experimental results obtained with a simple program
deployed on the Internet to demonstrate e-Commerce services show that
the proposed method achieves 95.39% correct verification, 7% better
than a DP-matching-based signature-verification method. In addition,
verification based on extracted components gives more reliable
results than making a decision on the whole signature.
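The threshold-selection idea above (FAR falls and FRR rises as the decision threshold on the matching error tightens, and the EER is where they meet) can be sketched as follows. The score lists in the sketch are invented example data, not the paper's; the acceptance rule (accept when matching error is below the threshold) is an assumption consistent with the description.

```python
# Hypothetical illustration of Equal Error Rate (EER) threshold selection.
# Genuine/forgery matching errors below are made-up example data.

def far_frr(genuine_errors, forgery_errors, threshold):
    """Accept a signature when its matching error is below the threshold."""
    frr = sum(e >= threshold for e in genuine_errors) / len(genuine_errors)
    far = sum(e < threshold for e in forgery_errors) / len(forgery_errors)
    return far, frr

def equal_error_rate(genuine_errors, forgery_errors):
    """Scan candidate thresholds; return the one minimising |FAR - FRR|."""
    candidates = sorted(set(genuine_errors) | set(forgery_errors))
    best = min(candidates,
               key=lambda t: abs(far_frr(genuine_errors, forgery_errors, t)[0]
                                 - far_frr(genuine_errors, forgery_errors, t)[1]))
    far, frr = far_frr(genuine_errors, forgery_errors, best)
    return best, (far + frr) / 2
```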
Abstract: Pattern recognition is the research area of Artificial Intelligence that studies the operation and design of systems that recognize patterns in data. Important application areas are image analysis, character recognition, fingerprint classification, speech analysis, DNA sequence identification, man-machine diagnostics, person identification and industrial inspection. The interest in improving classification systems for data analysis is independent of the application context: many studies must recognize and distinguish groups of diverse objects, which calls for valid instruments capable of performing this task. The objective of this article is to present several Artificial Intelligence methodologies for data classification applied to biomedical patterns. In particular, this work deals with the realization of a Computer-Aided Detection (CADe) system able to assist the radiologist in identifying types of mammary tumor lesions. As an additional biomedical application of the classification systems, we present a study conducted on blood samples which shows how these methods may help distinguish between carriers of Thalassemia (or Mediterranean Anaemia) and healthy subjects.
Abstract: Image processing for capsule endoscopy requires large
memory and takes hours of diagnosis time, since the operation
normally lasts more than 8 hours. A real-time analysis algorithm for
capsule images can therefore be clinically very useful: it can
differentiate abnormal tissue from healthy structures and provide
correlation information among the images. Bleeding is our interest in
this regard, and we propose a method for detecting frames with
potential bleeding in real time. Our detection algorithm is based on
statistical analysis and the shapes of bleeding spots. We tested the
algorithm on 30 cases of capsule endoscopy of the digestive tract.
Results were excellent: a sensitivity of 99% and a specificity of 97%
were achieved in detecting the image frames with bleeding spots.
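For reference, the sensitivity and specificity figures quoted above are defined at the frame level from the usual confusion-matrix counts. The counts in this sketch are illustrative, not the study's actual numbers.

```python
# Standard definitions used for frame-level detection results:
# sensitivity = TP / (TP + FN), specificity = TN / (TN + FP).
# The example counts are illustrative only.

def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from confusion-matrix counts."""
    return tp / (tp + fn), tn / (tn + fp)
```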
Abstract: In this paper we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of cryptographic algorithms are of great interest. Besides some basic lemmata, we explore an implementation of a complex function that is used in cryptography. More precisely, we describe formal properties of this implementation that we computer-prove. We describe formalized probability distributions (σ-algebras, probability spaces and conditional probabilities). These are given in the formal language of the formal proof system Isabelle/HOL. Moreover, we computer-prove Bayes' Formula, and we describe an application of the presented formalized probability distributions to cryptography. Furthermore, this paper shows that computer proofs of complex cryptographic functions are possible, by presenting an implementation of the Miller-Rabin primality test that admits formal verification. Our achievements are a step towards computer verification of cryptographic primitives and describe a basis for computer verification in cryptography. Computer verification can be applied to further problems in cryptographic research if the corresponding basic mathematical knowledge is available in a database.
Abstract: One of the key research issues in wireless sensor networks (WSNs) is how to efficiently deploy sensors to cover an area. In this paper, we present a Fishnet Based Dispatch Scheme (FiBDS) with energy-aware mobility and interest-based sensing angle. We propose two algorithms: a centralized FiBDS algorithm and a distributed FiBDS algorithm. The centralized algorithm is designed for non-time-critical (non-real-time) applications, while the distributed algorithm is designed for time-critical (real-time) applications. The proposed dispatch scheme works in a phase-selection manner: in each phase a specific constraint is handled according to its specified priority before moving on to the next phase, and at the end of each phase only the nodes best suited for that phase are chosen. Simulation results are presented to verify the effectiveness of both algorithms.
Abstract: People differ in their capabilities, skills and mental agility. Human development, from a childhood of complete dependency to the gradual independence of adulthood, is highly complicated, considering that everyone starts from almost the same point yet some become more capable than others. The main control command of a cybernetic hand should be issued by the remaining healthy organs of the disabled person. These commands can come from several channels whose recording and detection differ and require complicated study. In this research we assume that this stage has already been completed, in other words that the command has already been sent and detected. The main goal is then to control a prosthetic arm for an above-elbow amputation according to an angle of interest defined by the disabled user: the system input is the position desired by the user and the output is the elbow-joint angle variation. The goal is therefore a suitable control design based on neural network theory that realizes the given mapping.
Abstract: Sleep spindles are the most interesting hallmark of
stage 2 sleep EEG. Their accurate identification in a
polysomnographic signal is essential for sleep professionals to help
them mark stage 2 sleep. Sleep spindles are also promising objective
indicators of neurodegenerative disorders. Visual spindle scoring,
however, is tedious work. In this paper, three different
approaches are used for the automatic detection of sleep spindles:
Short Time Fourier Transform, Wavelet Transform and Wave
Morphology for Spindle Detection. In order to improve the results, a
combination of the three detectors is presented and comparison with
human expert scorers is performed. The best performance is obtained
with a combination of the three algorithms which resulted in a
sensitivity and specificity of 94% when compared to human expert
scorers.
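The combination of the three detectors could take several forms; the paper does not spell out its fusion rule, so the majority vote below is only an assumed illustration of how per-epoch decisions from the STFT, wavelet and wave-morphology detectors might be merged.

```python
# Assumed fusion rule (not necessarily the paper's): mark a spindle
# where at least two of the three detectors agree on an epoch.

def majority_vote(stft, wavelet, morphology):
    """Combine three parallel boolean/0-1 detection streams by majority."""
    return [sum(votes) >= 2 for votes in zip(stft, wavelet, morphology)]
```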
Abstract: Microtomographic images and thin section (TS) images were
analyzed and compared against parameters of geological interest such
as porosity and its distribution along the samples. The results show
that microtomography (CT) analysis, although limited by its
resolution, provides interesting information about the distribution
of porosity (homogeneous or not) and can also quantify both connected
and non-connected pores, i.e., total porosity. TS images have no
resolution limitation, but are limited by the available experimental
data (a few glass slides for analysis) and can give information only
about the connected pores, i.e., effective porosity. The two methods
have their own virtues and flaws, but when paired they complement one
another, making for a more reliable and complete analysis.
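The "total porosity" quantified from micro-CT reduces, after the volume has been binarised into pore and solid voxels, to a simple voxel fraction. The sketch below assumes such a binarised volume; the toy array is not real tomographic data.

```python
# Total porosity from a binarised micro-CT volume: the fraction of
# voxels classified as pore space (1 = pore, 0 = solid).
# The nested-list "volume" used in tests is a toy example.

def total_porosity(binary_volume):
    """binary_volume: nested lists [z][y][x] of 0 (solid) / 1 (pore)."""
    flat = [v for plane in binary_volume for row in plane for v in row]
    return sum(flat) / len(flat)
```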
Abstract: After the accounting scandals and the financial crisis, regulators have stressed the need for more financial experts on boards. Several studies conducted in countries with developed capital markets report positive effects of board financial competencies. As each country offers a different context and specific institutional factors, this paper addresses the subject in the context of Romania. The Romanian capital market offers an interesting research field because of the heterogeneity of listed firms. After analyzing board members' education based on public information posted on listed companies' websites and in their annual reports, we found a positive association between the proportion of board members holding a postgraduate degree in a financial field and market-based performance measured by Tobin's Q. We also found that the proportion of board members holding degrees in financial fields is higher in bigger firms and in firms with more concentrated ownership.
Abstract: Segmentation is an important step in medical image
analysis and classification for radiological evaluation or
computer-aided diagnosis. Computer-Aided Diagnosis (CAD) of lung CT
generally first segments the area of interest (the lung) and then
analyzes the segmented area for nodule detection in order to diagnose
the disease. For a normal lung, segmentation can be performed by
exploiting the excellent contrast between air and the surrounding
tissues. However, this approach fails when the lung is affected by
high-density pathology. Dense pathologies are present in
approximately a fifth of clinical scans, and for computer analysis
such as detection and quantification of abnormal areas it is vital
that the entire lung region of the image is preserved and that no
part present in the original image is eradicated. In this paper we
propose a lung segmentation technique which accurately segments the
lung parenchyma in lung CT scan images. The algorithm was tested on
25 datasets of different patients received from Akron University, USA
and Aga Khan Medical University, Karachi, Pakistan.
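The contrast-based baseline that the paper improves on can be sketched very simply: lung air is far darker (around -1000 HU) than surrounding soft tissue (roughly -100 to +100 HU), so on a normal scan a single intensity threshold separates them. The -400 HU cut-off and the toy 2-D "slice" below are illustrative assumptions, not the paper's algorithm, which is designed precisely for the pathological cases where this baseline fails.

```python
# Baseline contrast-based lung segmentation (assumed illustration):
# threshold the Hounsfield-unit image so dark (air-filled) voxels
# become the lung mask. Fails for dense pathology, as noted above.

def threshold_lung(slice_hu, cutoff=-400):
    """Return a binary mask: True where the voxel is lung/air."""
    return [[hu < cutoff for hu in row] for row in slice_hu]
```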
Abstract: Monitoring lightning electromagnetic pulses (sferics) and other terrestrial as well as extraterrestrial transient radiation signals is of considerable interest for practical and theoretical purposes in astro- and geophysics as well as meteorology. To manage a continuous flow of data, automation of the analysis and classification process is important. Features based on a combination of wavelet and statistical methods proved efficient for this task and serve as input into a radial basis function network that is trained to discriminate transient shapes ranging from pulse-like to wave-like. We concentrate on signals in the Very Low Frequency (VLF, 3-30 kHz) range in this paper, but the developed methods are independent of this specific choice.
Abstract: The present energy situation and concerns about global
warming have stimulated active research interest in non-petroleum,
carbon-free compounds and non-polluting fuels, particularly for the
transportation, power generation and agricultural sectors.
Environmental concerns and the limited amount of petroleum fuel have
motivated the development of alternative fuels for internal
combustion (IC) engines. Petroleum crude reserves are declining while
consumption of transport fuels, particularly in developing countries,
is increasing at high rates, and a severe shortage of
petroleum-derived liquid fuels may be faced in the second half of
this century. Increasingly stringent environmental regulations
enacted in the USA and Europe have also driven research and
development on clean alternative fuels. Among gaseous fuels, hydrogen
is considered one of the cleanest alternatives and is an interesting
candidate for future internal-combustion-engine-based power trains.
In this experimental investigation, performance and combustion
analyses were carried out on a direct injection (DI) diesel engine
using hydrogen with diesel, following the Time Manifold Injection
(TMI) technique at injection timings of 10°, 45° and 80° ATDC, with
the injection durations controlled by an electronic control unit
(ECU). The tests were carried out at a constant speed of 1500 rpm
under different load conditions. It was observed that brake thermal
efficiency increases with load, with a maximum gain of 15% at full
load for all hydrogen injection strategies. It was also observed
that, with increasing hydrogen energy share, the brake specific
energy consumption (BSEC) decreased, by a maximum of 9% compared with
baseline diesel at 10° ATDC injection with maximum injection
duration, demonstrating the exceptional combustion properties of
hydrogen.
Abstract: This paper is a continuation of our interest in the influence of temperature on specific retention volumes and the resulting infinite-dilution activity coefficients, which has a direct effect on the design of absorption and stripping columns for the abatement of volatile organic compounds. The interaction of 13 volatile organic compounds (VOCs) with polydimethylsiloxane (PDMS) at varying temperatures was studied by gas-liquid chromatography (GLC). The infinite-dilution activity coefficients and specific retention volumes obtained in this study were found to be in agreement with those obtained from static headspace and group contribution methods by the authors, as well as with literature values for similar systems. Temperature variation also allows transport calculations for different seasons. The results of this work confirm that PDMS is well suited for the scrubbing of VOCs from waste gas streams. Plots of specific retention volumes against temperature gave linear van 't Hoff plots.
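In the usual van 't Hoff treatment, linearity means that ln(Vg) plotted against reciprocal absolute temperature 1/T gives a straight line. The sketch below fits that line by ordinary least squares; the slope/intercept values and data points are invented for illustration, not the paper's measurements.

```python
import math

# Ordinary least-squares fit of ln(Vg) versus 1/T (K), the standard
# linear van 't Hoff form. Data passed in tests are synthetic.

def vant_hoff_fit(temps_K, vg):
    """Return (slope, intercept) of the ln(Vg)-vs-1/T regression line."""
    x = [1.0 / T for T in temps_K]
    y = [math.log(v) for v in vg]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
             / sum((xi - xbar) ** 2 for xi in x))
    return slope, ybar - slope * xbar
```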
Abstract: Broadcasting has changed rapidly in parallel with the
changes in the world. It has also been influenced and reshaped by the
emergence of new communication technologies, and these developments
have had many economic and social consequences. The most important of
these concern the powers of governments to control the means of
communication and the control mechanisms related to the regulation of
new issues. For this purpose, autonomous and independent regulatory
bodies have been established by the state. One of these regulatory
bodies is the Radio and Television Supreme Council, which was
established in 1994 by Code No. 3984. Today's Radio and Television
Supreme Council, which is responsible for regulating radio and
television broadcasts all across Turkey, holds an important and
effective position as an autonomous and independent regulatory body.
On the one hand, the Council acts as a remarkable organizer for the
sensitive area of radio and television broadcasting; on the other
hand, as one of the central organs of media policy, it sets
principles for the functioning of broadcasting that keep in mind
democratic and liberal values and the concept of the public interest.
In this study, the role of the Radio and Television Supreme Council
in controlling communication and control mechanisms is examined in
accordance with Code No. 3984, together with the changes in its
duties introduced by Code No. 6112 of 2011.
Abstract: Hydrogen is an important chemical in many industries
and it is expected to become one of the major fuels for energy
generation in the future. Unfortunately, hydrogen does not exist in its
elemental form in nature and therefore has to be produced from
hydrocarbons, hydrogen-containing compounds or water.
Above its critical point (374.8 °C and 22.1 MPa), water has a lower
density and viscosity, and a higher heat capacity, than ambient
water. Mass transfer in supercritical water (SCW) is enhanced due to
its increased diffusivity and transport ability. The reduced
dielectric constant makes supercritical water a better solvent
for organic compounds and gases. Hence, due to the aforementioned
desirable properties, there is a growing interest toward studies
regarding the gasification of organic matter containing biomass or
model biomass solutions in supercritical water.
In this study, hydrogen and biofuel production by the catalytic
gasification of 2-propanol under supercritical water conditions was
investigated. Pt/Al2O3 and Ni/Al2O3 were the catalysts used in the
gasification reactions. All of the experiments were performed under a
constant pressure of 25 MPa. The effects of five reaction
temperatures (400, 450, 500, 550 and 600 °C) and five reaction times
(10, 15, 20, 25 and 30 s) on the gasification yield and flammable
component content were investigated.
Abstract: Personal computers draw non-sinusoidal current in which
odd harmonics are the most significant. The power quality of
distribution networks is severely affected by the flow of these
harmonics during the operation of electronic loads. In this paper,
mathematical modeling of the odd current harmonics (3rd, 5th, 7th and
9th) that influence power quality is presented. Live signals were
captured with a power quality analyzer for analysis. An interesting
finding, verified theoretically, is that the Total Harmonic
Distortion (THD) of the current decreases as the number of nonlinear
loads increases. The results obtained from the mathematical
expressions were compared with the practical results and close
agreement was found.
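The THD figure discussed above has a standard definition: the RMS of the harmonic components divided by the fundamental. The magnitudes in the sketch are illustrative, not the measured signals.

```python
import math

# Total Harmonic Distortion of a current waveform:
# THD = sqrt(I3^2 + I5^2 + ...) / I1, using harmonic magnitudes.
# Example magnitudes are illustrative only.

def thd(fundamental, harmonics):
    """harmonics: magnitudes of the 3rd, 5th, ... components."""
    return math.sqrt(sum(h * h for h in harmonics)) / fundamental
```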
Abstract: Probabilistic techniques in computer programs are becoming
more and more widely used. Therefore, there is great
interest in the formal specification, verification, and development
of probabilistic programs. In our work-in-progress project, we are
attempting to make a constructive framework for developing probabilistic
programs formally. The main contribution of this paper
is to introduce an intermediate artifact of our work, a Z-based
formalism called PZ, by which one can build set theoretical models of
probabilistic programs. We propose to use a constructive set theory,
called CZ set theory, to interpret the specifications written in PZ.
Since CZ has an interpretation in Martin-Löf's theory of types, this
idea enables us to derive probabilistic programs from correctness
proofs of their PZ specifications.
Abstract: The purpose of this paper is to describe the process of
setting up a learning community within an elementary school in
Ontario, Canada. The description is provided through reflection and
examination of field notes taken during the yearlong training and
implementation process. Specifically, the impact of teachers' capacity
on the creation of a learning community was of interest. This paper is
intended to inform and add to the debate around the tensions that
exist in implementing a bottom-up professional development model
like the learning community in a top-down organizational structure.
My reflections on the process illustrate that implementation of the
learning community professional development model may be
difficult and yet transformative in the professional lives of the
teachers, students, and administration involved in the change process.
I conclude by suggesting the need for a new model of professional
development that requires a transformative shift in power dynamics
and a shift in the view of what constitutes effective professional
learning.
Abstract: Emotion in speech is an issue that has been attracting
the interest of the speech community for many years, both in the
context of speech synthesis as well as in automatic speech
recognition (ASR). In spite of the remarkable recent progress in
Large Vocabulary Recognition (LVR), it is still far behind the
ultimate goal of recognising free conversational speech uttered by
any speaker in any environment. Current experimental tests show
that, with state-of-the-art large vocabulary recognition systems,
the error rate increases substantially when they are applied to
spontaneous/emotional speech. This paper shows that the recognition
rate for emotionally coloured speech can be improved by using a
language model based on increased representation of emotional
utterances.
Abstract: The amount of information churned out by the field of biology has grown manifold and now requires extensive use of computer techniques for its management. Within this sea of biological information, protein sequence similarity is key information for detecting protein evolutionary relationships: sequence similarity typically implies homology, which in turn may imply structural and functional similarity. In this work, we propose a learning method for detecting remote protein homology. The proposed method uses a transformation that converts a protein sequence into a fixed-dimensional representative feature vector. Each feature vector records the sensitivity of a protein sequence to a set of amino acid substrings generated from the protein sequences of interest. These features are then used in conjunction with support vector machines for the detection of remote protein homology. The proposed method is tested and evaluated on two different benchmark protein datasets and is able to deliver improvements over most of the existing homology detection methods.
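The fixed-dimensional representation described above can be sketched in its simplest form: one feature per substring in the chosen set, recording how strongly the sequence "responds" to it. Here the sensitivity measure is simplified to an overlapping occurrence count, which is an assumption; the actual paper's scoring, the substring set and the toy sequence are all illustrative.

```python
# Assumed simplification of the substring-sensitivity features: map a
# protein sequence to a fixed-length vector of overlapping occurrence
# counts, one entry per substring in the chosen set.

def substring_features(sequence, substrings):
    """Return a fixed-length count vector, one entry per substring."""
    return [sum(sequence[i:i + len(s)] == s
                for i in range(len(sequence) - len(s) + 1))
            for s in substrings]
```

Vectors of this kind can then be fed to any off-the-shelf SVM implementation, as the abstract describes.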