Abstract: Several attempts have been made in political science to identify
and discuss the U.S. experience of forming a political nation. The purpose
of this research paper is to identify the main aspects of the formation of
civic identity in the United States and Kazakhstan by identifying
similarities and differences that can find practical application in
decisions on national policy issues in the context of globalization, and to
answer the questions "What should unite the citizens of Kazakhstan into a
nation?" and "Which identity should be dominant: the civic or the ethnic
(national) one?"
Can Kazakhstan, being a multiethnic country like America, adopt its
experience in the formation of a civic nation? It is believed that "a
multi-ethnic composition of the population is a characteristic feature of
most modern countries in the world," and that "inter-ethnic integration is
one of the most important aspects of the problem of forming a new social
community (metaethnic: the Kazakh people, the Kazakh nation)" [1].
Abstract: Modern simulation solutions in the wind turbine industry have achieved a high degree of complexity and detail in their results. Limitations appear, however, when model results have to be validated against measurements. For model validation it is of special interest to identify modal frequencies and to differentiate them from the various excitations. A wind turbine is a complex device, and measurements of any part of the assembly show a lot of noise. Input excitations are difficult or even impossible to measure due to the stochastic nature of the environment. Traditional techniques for frequency analysis or feature extraction are widely used to analyze wind turbine sensor signals, but they have several limitations, especially for non-stationary signals (events). A new technique based on autoregressive analysis is introduced here for a specific application, and a comparison and examples related to different events in wind turbine operation are presented.
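The abstract does not detail the autoregressive technique itself, but the core idea of reading a mode frequency from the poles of a fitted AR model can be sketched. The function name, model order, and signal values below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def ar2_frequency(x, fs):
    """Estimate the dominant frequency of a signal by fitting an AR(2)
    model with the Yule-Walker equations and reading the pole angle."""
    x = np.asarray(x, float) - np.mean(x)
    # Biased autocorrelation at lags 0, 1, 2.
    r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(3)])
    # Solve the 2x2 Yule-Walker system R a = r for the AR coefficients.
    R = np.array([[r[0], r[1]], [r[1], r[0]]])
    a1, a2 = np.linalg.solve(R, r[1:])
    # Poles of z^2 - a1 z - a2; the pole angle encodes the mode frequency.
    poles = np.roots([1.0, -a1, -a2])
    return abs(np.angle(poles[0])) * fs / (2 * np.pi)

# Example: a 5 Hz mode sampled at 50 Hz for 60 s, with measurement noise.
rng = np.random.default_rng(0)
t = np.arange(0, 60, 1 / 50)
signal = np.sin(2 * np.pi * 5.0 * t) + 0.1 * rng.standard_normal(t.size)
f_est = ar2_frequency(signal, fs=50)
```

Unlike a plain periodogram, this parametric estimate stays meaningful on short, noisy segments, which is the motivation the abstract gives for moving beyond traditional frequency analysis.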
Abstract: A Learning Management System provides a learning
environment that offers a collection of e-learning tools in a
package with a common interface and information sharing
among the tools. South East European University's initial experience
with LMSs was the usage of the commercial LMS ANGEL. After
three years of ANGEL usage, because of the very high expenses, it
was decided to develop our own software. As part of the research
project team for the in-house design and development of the new
LMS, we first had to select the features that would cover our needs
and comply with current trends in software development, and then
design and develop the system. In this paper we present the process
of in-house LMS development for South East European University,
its architecture, conception and strengths, with a special accent on
the process of migration and integration with other enterprise
applications.
Abstract: In this paper we present a frequency domain based
classification method for video scenes. Videos from certain topical
areas often contain activities with repeating movements; sports
videos, home improvement videos, and videos showing mechanical
motion are some examples. Assessing the main and side frequencies
of each repeating movement reveals the motion type. We obtain the
frequency domain representation by transforming spatio-temporal
motion trajectories. We further explain how to compute frequency
features for video clips and how to use them for classification. The
focus of the experimental phase is on the transforms utilized in our
system. By comparing various transforms, the experiments identify
the optimal transform for a motion frequency based approach.
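As a minimal illustration of the transform step, the dominant frequency of a repeating movement can be read from the Fourier spectrum of a tracked coordinate. The helper below is a hypothetical sketch (the paper compares several transforms; only the Fourier case is shown here):

```python
import numpy as np

def dominant_frequency(trajectory, fps):
    """Return the strongest non-DC frequency (Hz) of a 1-D motion trajectory."""
    y = np.asarray(trajectory, float) - np.mean(trajectory)  # remove DC offset
    spectrum = np.abs(np.fft.rfft(y))
    freqs = np.fft.rfftfreq(len(y), d=1.0 / fps)
    return freqs[np.argmax(spectrum)]

# Example: a repeating movement at 2 Hz tracked at 30 frames per second.
t = np.arange(0, 10, 1 / 30)                         # 10 s clip, 300 samples
trajectory = 40 * np.sin(2 * np.pi * 2.0 * t) + 5.0  # pixel x-coordinate
f = dominant_frequency(trajectory, fps=30)
```

A feature vector for classification could then collect such main and side frequencies (and their magnitudes) per trajectory.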
Abstract: In this paper we examine the use of global texture analysis based approaches for Persian font recognition in machine-printed document images. Most existing methods for font recognition make use of local typographical features and connected component analysis; however, deriving such features is not an easy task. Gabor filters are appropriate tools for texture analysis and are motivated by the human visual system. Here we treat document images as textures and use Gabor filter responses to identify the fonts. The method is content independent and involves no local feature analysis. Two different classifiers, Weighted Euclidean Distance (WED) and SVM, are used for classification. Experiments on seven different typefaces and four font styles show an average accuracy over typefaces of 85% with the WED and 82% with the SVM classifier.
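The texture-feature idea can be sketched in a few lines: filter the image with Gabor kernels at several orientations and keep simple statistics of the responses. Kernel sizes, orientations, and parameter values below are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

def gabor_kernel(size, theta, lam, sigma=4.0, gamma=0.5):
    """Real Gabor kernel: an oriented cosine carrier under a Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def gabor_features(image, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4), lam=8.0):
    """Mean and standard deviation of the filter response per orientation."""
    feats = []
    for theta in thetas:
        k = gabor_kernel(15, theta, lam)
        # FFT-based circular convolution keeps the sketch NumPy-only.
        resp = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(k, image.shape)))
        feats += [np.abs(resp).mean(), np.abs(resp).std()]
    return np.array(feats)

# Two synthetic "textures": horizontal vs. vertical stripes of period 8.
y, x = np.mgrid[0:64, 0:64]
horiz = np.sin(2 * np.pi * y / 8.0)
vert = np.sin(2 * np.pi * x / 8.0)
f_h, f_v = gabor_features(horiz), gabor_features(vert)
```

The resulting feature vectors separate the two orientations, which is the property the font-as-texture classifiers (WED or SVM) would exploit.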
Abstract: Automatic detection of bleeding is of practical
importance since capsule endoscopy produces an extremely large
number of images. Developing algorithms for bleeding detection in
the digestive tract is difficult due to the varying contrast among
images and the presence of food debris, secretions and other matter.
In this study, weighting factors were derived from independent
features of contrast and brightness that distinguish bleeding from
normal tissue. Spectral analysis based on these weighting factors
was fast and accurate. The results were a sensitivity of 87% and a
specificity of 90% when the accuracy was determined for each pixel
over 42 endoscope images.
Abstract: Feature and model selection are at the center of
attention of many researchers because of their impact on classifier
performance. Both selections are usually performed separately, but
recent developments suggest using a combined GA-SVM approach to
perform them simultaneously. This approach improves the
performance of the classifier by identifying the best subset of
variables and the optimal parameter values. Although GA-SVM is an
effective method, it is computationally expensive, so a cheaper
criterion can be considered. This paper investigates a joint approach
of a Genetic Algorithm and kernel matrix criteria to perform
feature and model selection simultaneously for the SVM classification
problem. The purpose of this research is to improve the classification
performance of the SVM through an efficient approach, the Kernel
Matrix Genetic Algorithm (KMGA) method.
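The abstract does not specify which kernel matrix criterion KMGA uses. One well-known criterion of this kind is kernel-target alignment (Cristianini et al.), sketched below as a cheap GA fitness that scores a (feature mask, kernel parameter) chromosome without training an SVM. All names, data, and parameter values are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(X, gamma):
    """Gram matrix of the RBF kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def alignment(K, y):
    """Kernel-target alignment <K, y y^T> / (||K|| ||y y^T||); higher is better."""
    Y = np.outer(y, y)
    return np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y))

def fitness(mask, gamma, X, y):
    """GA fitness for a candidate (feature mask, gamma) chromosome."""
    cols = np.flatnonzero(mask)
    if cols.size == 0:
        return -1.0                       # penalize empty feature subsets
    return alignment(rbf_kernel(X[:, cols], gamma), y)

# Toy data: the class depends on feature 0 only; feature 1 is pure noise.
rng = np.random.default_rng(1)
X = rng.standard_normal((60, 2))
y = np.where(X[:, 0] > 0, 1, -1)
X[:, 0] += 3 * y                          # make feature 0 strongly separating
good = fitness(np.array([1, 0]), 0.5, X, y)
bad = fitness(np.array([0, 1]), 0.5, X, y)
```

Because the fitness is a matrix statistic rather than a cross-validated SVM accuracy, each GA generation is much cheaper to evaluate, which is the trade-off the abstract motivates.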
Abstract: The purpose of this work is to develop an automatic classification system that could be useful for radiologists in breast cancer investigation. The software has been designed in the framework of the MAGIC-5 collaboration. In an automatic classification system, suspicious regions with a high probability of including a lesion are extracted from the image as regions of interest (ROIs). Each ROI is characterized by features generally based on morphological lesion differences. A study of the feature space representation is made, and several classifiers are tested to distinguish the pathological regions from the healthy ones. The results, in terms of sensitivity and specificity, are presented through ROC (Receiver Operating Characteristic) curves. The best performances are obtained with Neural Networks in comparison with K-Nearest Neighbours and the Support Vector Machine: the Radial Basis Function network supplies the best result, with an area under the ROC curve of 0.89 ± 0.01, but similar results are obtained with the Probabilistic Neural Network and a Multi-Layer Perceptron.
Abstract: One of the disadvantages of OFDM is the large
peak-to-average power ratio (PAPR) of its time domain signal. A
signal with larger PAPR causes severe degradation of the bit error
rate (BER) performance due to inter-modulation noise in the
non-linear channel. This paper proposes an improved DSI (Dummy
Sequence Insertion) method, which can achieve better PAPR and
BER performance. The feature of the proposed method is to optimize
the phase of each dummy sub-carrier so as to reduce the PAPR
by changing all the predetermined phase coefficients in the
time domain signal, which is calculated for the data sub-carriers and
dummy sub-carriers separately. To further improve the PAPR
performance, this paper also proposes to employ a time-frequency
domain swapping algorithm for fine adjustment of the phase
coefficients of the dummy sub-carriers, which achieves lower
processing complexity and better PAPR and BER performance
than the conventional DSI method. This paper presents
various computer simulation results that verify the effectiveness of
the proposed method in comparison with conventional methods in the
non-linear channel.
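For context, the quantity the DSI optimization targets can be computed directly. The sketch below shows the PAPR of one OFDM block (with the usual frequency-domain zero-padding for oversampling); it is the metric only, not the proposed DSI method. Block size and modulation are illustrative assumptions:

```python
import numpy as np

def papr_db(freq_symbols, oversample=4):
    """PAPR (dB) of the OFDM time-domain signal for one block of sub-carriers."""
    n = len(freq_symbols)
    # Zero-pad in the frequency domain to oversample the time-domain waveform.
    padded = np.concatenate([freq_symbols[:n // 2],
                             np.zeros((oversample - 1) * n, complex),
                             freq_symbols[n // 2:]])
    x = np.fft.ifft(padded)
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

# 64 QPSK sub-carriers with random phases.
rng = np.random.default_rng(7)
symbols = np.exp(1j * np.pi / 2 * rng.integers(0, 4, 64))
p = papr_db(symbols)
```

A DSI-style scheme would append dummy sub-carriers to `symbols` and search their phases to drive this value down.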
Abstract: Lung cancer accounts for the most cancer-related deaths among both men and women. The identification of cancer-associated genes and the related pathways is essential, offering an important possibility for the prevention of many types of cancer. In this work two filter approaches, namely information gain and the biomarker identifier (BMI), are used for the identification of different types of small-cell and non-small-cell lung cancer. A new method to determine the BMI thresholds is proposed that prioritizes genes (i.e., primary, secondary and tertiary) using a k-means clustering approach. Sets of key genes were identified that can be found in several pathways. It turned out that the modified BMI is well suited for microarray data, and BMI is therefore proposed as a powerful tool for the search for new and so far undiscovered cancer-related genes.
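The abstract does not give the clustering details, but the threshold-from-clusters idea can be sketched: run 1-D k-means on the BMI scores with k = 3 and take the midpoints between adjacent cluster centres as the primary/secondary/tertiary thresholds. All names, data, and initialization choices below are illustrative assumptions:

```python
import numpy as np

def kmeans_1d(values, k=3, iters=50):
    """Plain 1-D k-means; returns cluster centres and the boundaries
    between adjacent clusters, usable as priority thresholds."""
    v = np.sort(np.asarray(values, float))
    # Initialise centres at evenly spaced quantiles of the scores.
    centres = np.quantile(v, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(v[:, None] - centres[None, :]), axis=1)
        centres = np.array([v[labels == j].mean() for j in range(k)])
    thresholds = (centres[:-1] + centres[1:]) / 2
    return centres, thresholds

# Synthetic BMI scores falling into three well-separated tiers.
rng = np.random.default_rng(3)
scores = np.concatenate([rng.normal(1, 0.2, 50),
                         rng.normal(5, 0.2, 30),
                         rng.normal(9, 0.2, 20)])
centres, thresholds = kmeans_1d(scores, k=3)
```

Genes scoring above the upper threshold would be labelled primary, between the thresholds secondary, and below the lower threshold tertiary.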
Abstract: Approximate tandem repeats in a genomic sequence are
two or more contiguous, similar copies of a pattern of nucleotides.
They are used in DNA mapping, the study of molecular evolution
mechanisms, forensic analysis, and research into the diagnosis of
inherited diseases. All their functions are still being investigated and
are not yet well defined, but growing biological databases, together
with tools for identifying these repeats, may lead to the discovery of
their specific roles or correlations with particular features. This paper
presents a new approach for finding approximate tandem repeats in a
given sequence, where the similarity between consecutive repeats is
measured using the Hamming distance. It is an enhancement of a
method for finding exact tandem repeats in DNA sequences based on
the Burrows-Wheeler transform.
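The notion being searched for can be made concrete with a brute-force scan (the paper's actual method builds on the Burrows-Wheeler transform; this sketch only defines the target object). Function names and the example sequence are illustrative:

```python
def hamming(a, b):
    """Number of mismatching positions between two equal-length strings."""
    return sum(c1 != c2 for c1, c2 in zip(a, b))

def find_tandem_repeats(seq, period, max_mismatch):
    """Return (start, copies) for runs of two or more adjacent period-length
    copies, each within max_mismatch Hamming distance of the previous copy."""
    hits = []
    i = 0
    while i + 2 * period <= len(seq):
        copies, j = 1, i
        while (j + 2 * period <= len(seq) and
               hamming(seq[j:j + period],
                       seq[j + period:j + 2 * period]) <= max_mismatch):
            copies += 1
            j += period
        if copies >= 2:
            hits.append((i, copies))
            i = j + period        # skip past the whole repeat region
        else:
            i += 1
    return hits

# A sequence containing approximate tandem copies of a 4-mer.
repeats = find_tandem_repeats("AAACGTACGTACGAAA", period=4, max_mismatch=1)
```

Setting `max_mismatch=0` recovers the exact-repeat case that the original BWT-based method handles.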
Abstract: The controllable electrical loss, which consists of the
copper loss and the iron loss, can be minimized by optimal control of
the armature current vector. A control algorithm for the current
vector minimizing the electrical loss is proposed, and the optimal
current vector can be decided according to the operating speed and
the load conditions. The proposed control algorithm is applied to an
experimental PM motor drive system, and this paper presents a
modern approach to speed control for a permanent magnet
synchronous motor (PMSM) applied to an Electric Vehicle using
nonlinear control. The regulation algorithms are based on the
feedback linearization technique. The direct component of the current
is controlled to be zero, which ensures maximum torque operation.
Near unity power factor operation is also achieved. Moreover, among
the features of EV electric propulsion, energy efficiency is a basic
characteristic that is influenced by vehicle dynamics and system
architecture. For this reason, the EV dynamics are taken into
account.
Abstract: In this paper, a new robust audio fingerprinting
algorithm in the MP3 compressed domain is proposed, with high
robustness to time scale modification (TSM). Instead of simply
employing short-term information of the MP3 stream, the new
algorithm extracts long-term features in the MP3 compressed domain
using modulation frequency analysis. Our experiments have
demonstrated that the proposed method can achieve a hit rate above
95% in audio retrieval and can resist attacks of up to 20% TSM. It
also shows a lower bit error rate (BER) than the other algorithms.
The proposed algorithm can be applied in other compressed domains
as well, such as AAC.
Abstract: Traditional object segmentation methods are time-consuming and computationally difficult. In this paper, one-dimensional object detection along secant lines is applied. Statistical features of texture images are computed for the recognition process. Example matrices of these features and formulae for calculating the similarity between two feature patterns are presented, and experiments are carried out using these features.
Abstract: Gesture recognition is a challenging task in which
meaningful gestures must be extracted from continuous hand motion.
In this paper, we propose an automatic system that recognizes both
isolated gestures and meaningful gestures within continuous hand
motion, for Arabic numbers from 0 to 9, in real time, based on
Hidden Markov Models (HMMs). To handle isolated gestures, HMMs
with Ergodic, Left-Right (LR) and Left-Right Banded (LRB)
topologies are applied to the discrete feature vectors extracted from
stereo color image sequences. These topologies are evaluated with
different numbers of states ranging from 3 to 10. A new system is
developed to recognize meaningful gestures based on zero-codeword
detection with static velocity motion for continuous gestures. The
LRB topology, in conjunction with the Baum-Welch (BW) algorithm
for training and the forward algorithm with the Viterbi path for
testing, yields the best performance. Experimental results show that
the proposed system can successfully recognize isolated and
meaningful gestures, achieving average recognition rates of 98.6%
and 94.29% respectively.
Abstract: This paper proposes a visual cryptography by random
grids scheme with identifiable shares. The method encodes an image
O into two shares that exhibit the following features: (1) each
generated share has the same scale as O; (2) any single share has a
noise-like appearance that reveals no secret information about O; (3)
the secrets can be revealed by superimposing the two shares; (4)
folding a share up discloses some identification patterns; and (5) both
the secret information and the designated identification patterns are
recognized by the naked eye without any computation. The ability to
show identification patterns on folded shares establishes a simple and
friendly interface for users to manage the numerous shares created by
VC schemes.
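The base random-grid encoding (without the folding-based identification patterns, which the paper adds on top) can be sketched as follows; names and the example image are illustrative:

```python
import numpy as np

def rg_encode(secret, seed=0):
    """Random-grid visual cryptography for a binary image (True = black).
    Share 1 is a uniformly random grid; share 2 equals share 1 where the
    secret is white and is its inverse where the secret is black, so
    stacking the shares (pixel-wise OR) reveals the secret."""
    rng = np.random.default_rng(seed)
    share1 = rng.integers(0, 2, size=secret.shape).astype(bool)
    share2 = np.where(secret, ~share1, share1)
    return share1, share2

# A black square on a white background as the secret image O.
secret = np.zeros((32, 32), dtype=bool)
secret[8:24, 8:24] = True
s1, s2 = rg_encode(secret)
stacked = s1 | s2            # superimposing the two shares
```

Black secret pixels become fully black in the stacked result, while white pixels stay about half black, so the secret appears by contrast; each share on its own is statistically uniform noise.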
Abstract: In this paper, we propose a novel adaptive
spatiotemporal filter that utilizes image sequences to remove noise.
The consecutive frames comprise the current, previous and next
noisy frames. The proposed filter is based on weighted averaging of
pixel intensities and the noise variance in image sequences. It utilizes
an Appropriate Number of Consecutive Frames (ANCF) based on the
noisy pixel intensities among the frames. The number of consecutive
frames is adaptively calculated for each region in the image, and its
value may change from one region to another depending on the pixel
intensities within the region. The weights are determined by a
well-defined mathematical criterion, which adapts to the
spatiotemporal features of the pixels in the consecutive frames. It is
experimentally shown that the proposed filter can preserve image
structures and edges under motion while suppressing noise, and thus
can be used effectively for filtering image sequences. In addition, the
Adaptive Weighted Averaging (AWA) filter using ANCF is
particularly well suited for filtering sequences that contain segments
with abruptly changing scene content due to, for example, rapid
zooming and changes in the camera view.
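The weighted-averaging core can be illustrated with a minimal temporal filter that down-weights neighbouring frames where they disagree with the current one (a proxy for motion). This is a generic sketch, not the paper's AWA/ANCF criterion; the weighting form and all values are assumptions:

```python
import numpy as np

def temporal_weighted_average(frames, noise_var):
    """Denoise the middle frame by averaging with its temporal neighbours,
    attenuating pixels that differ from the current frame by more than
    the noise statistics allow (likely motion)."""
    frames = np.asarray(frames, float)
    current = frames[len(frames) // 2]
    diff2 = (frames - current) ** 2
    # Inverse-variance base weight, shrunk where a neighbour disagrees.
    weights = np.exp(-diff2 / (2 * noise_var)) / noise_var
    return np.sum(weights * frames, axis=0) / np.sum(weights, axis=0)

# A static scene observed in 3 consecutive frames with Gaussian noise.
rng = np.random.default_rng(2)
clean = np.tile(np.linspace(0, 255, 64), (64, 1))
noisy = [clean + rng.normal(0, 10, clean.shape) for _ in range(3)]
out = temporal_weighted_average(noisy, noise_var=100.0)
```

On static content the filter averages noise away; where a pixel moves between frames, the neighbours' weights collapse and the output stays close to the current frame, which is how edge preservation under motion arises.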
Abstract: Iris localization is a very important step in
biometric identification systems. The identification process is usually
implemented in three stages: iris localization, feature extraction and,
finally, pattern matching. The accuracy of iris localization, as the
first step, affects all subsequent stages, which shows the importance
of iris localization in an iris based biometric system. In this paper, we
take the Daugman iris localization method as a standard, propose a
new method in this field, and then analyze and compare the results of
both on a standard set of iris images. The proposed method is based
on the detection of the circular edge of the iris, improved by fuzzy
circles and surface energy difference contexts. The method is easy to
implement and, compared to other methods, has rather high accuracy
and speed. Test results show that the accuracy of our proposed
method is close to that of the Daugman method, while its
computation speed is 10 times faster.
Abstract: A number of mass spectrometry applications are already available as web-based and Windows-based systems that calculate isotope patterns and display the mass spectrum for a specific molecular formula, besides providing other necessary information. These applications were evaluated and compared with our new alternative application, called Theoretical Isotope Generator (TIG), in terms of functionality and the features provided, to show that the new application works better and performs well. TIG provides extra features over the others, complete with functionality such as drawing, normalizing and zooming the generated graph, together with the molecular information in a number of formats, and provides details of the calculation and the molecules. Thus chemists, students, lecturers and researchers anywhere can use TIG to obtain information on molecules and their relative intensities.
Abstract: Psoriasis is a widespread skin disease affecting up to 2% of the population, with plaque psoriasis accounting for about 80% of cases. It can be identified as a red lesion, and at higher severity the lesion is usually covered with rough scale. Psoriasis Area Severity Index (PASI) scoring is the gold standard method for measuring psoriasis severity. Scaliness is one of the PASI parameters that needs to be quantified in PASI scoring. The surface roughness of a lesion can be used as a scaliness feature, since scale on the lesion surface makes the lesion rougher. The dermatologist usually assesses severity through the tactile sense, so direct contact between doctor and patient is required, and the doctor may not assess the lesion objectively. In this paper, a digital image analysis technique is developed to objectively determine the scaliness of a psoriasis lesion and provide the PASI scaliness score. A psoriasis lesion is modelled by a rough surface, created by superimposing a smooth average (curved) surface with a triangular waveform. For roughness determination, polynomial surface fitting is used to estimate the average surface, followed by subtraction of the average surface from the rough surface to give the elevation surface (surface deviations). The roughness index is calculated by applying the average roughness equation to the height map matrix. The roughness algorithm has been tested on 444 lesion models. In the roughness validation, only 6 models could not be accepted (percentage error greater than 10%); these errors occur due to the scanned image quality. The roughness algorithm is also validated by roughness measurement on abrasive papers on a flat surface. The Pearson correlation coefficient between the grade value (G) of the abrasive paper and Ra is -0.9488, which shows a strong relation between G and Ra. The algorithm needs to be improved by surface filtering, especially to overcome problems with noisy data.
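The pipeline described above (fit a smooth surface, subtract it, average the absolute deviations) can be sketched directly. Polynomial order, surface model, and all values below are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

def roughness_ra(height_map, order=2):
    """Average roughness Ra: fit a smooth polynomial surface to the height
    map, subtract it, and average the absolute deviations (elevation)."""
    h, w = height_map.shape
    y, x = np.mgrid[0:h, 0:w]
    # Design matrix with all monomials x^i * y^j up to the given total order.
    cols = [(x**i * y**j).ravel() for i in range(order + 1)
            for j in range(order + 1 - i)]
    A = np.stack(cols, axis=1).astype(float)
    coef, *_ = np.linalg.lstsq(A, height_map.ravel(), rcond=None)
    average_surface = (A @ coef).reshape(h, w)
    return np.mean(np.abs(height_map - average_surface))

# Lesion model: a smooth curved surface plus a triangular scale waveform.
y, x = np.mgrid[0:64, 0:64]
smooth = 0.01 * (x - 32) ** 2
tri = 2.0 * np.abs((x % 8) - 4) - 4.0   # triangular wave, mean |value| = 2
ra = roughness_ra(smooth + tri)
```

Because the quadratic fit absorbs the smooth curvature but not the fast triangular waveform, the returned Ra is essentially the mean absolute amplitude of the scale pattern.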