Abstract: A frequency grouping approach for multi-channel
instantaneous blind source separation (I-BSS) of convolutive
mixtures is proposed that achieves a lower net residual inter-symbol
interference (ISI) and inter-channel interference (ICI) than the
conventional short-time Fourier transform (STFT) approach. Starting
in the time domain, STFTs are taken with overlapping windows to
convert the convolutive mixing problem into frequency domain
instantaneous mixing. Mixture samples at the same frequency but
from different STFT windows are grouped together forming unique
frequency groups.
The individual frequency group vectors are input to the I-BSS
algorithm of choice, from which the output samples are dispersed
back to their respective STFT windows. After applying the inverse
STFT, the resulting time domain signals are used to construct the
complete source estimates via the weighted overlap-add method
(WOLA). The proposed algorithm is tested for source deconvolution
given two mixtures, and simulated along with the STFT approach to
illustrate its superiority for fairly motionless sources.
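The frequency-grouping step described above can be sketched in NumPy as follows; this is a minimal illustration only (window length, hop size, and the Hann window are assumed values, and the I-BSS stage itself is omitted):

```python
import numpy as np

def stft(x, win_len=256, hop=128):
    """STFT with overlapping Hann windows; returns (frames, bins)."""
    win = np.hanning(win_len)
    n_frames = 1 + (len(x) - win_len) // hop
    frames = np.stack([x[i*hop:i*hop + win_len] * win for i in range(n_frames)])
    return np.fft.rfft(frames, axis=1)

def frequency_groups(mixtures, win_len=256, hop=128):
    """Group mixture samples at the same frequency bin but from different
    STFT windows: one (channels x frames) instantaneous-mixing data
    matrix per frequency bin, ready for the I-BSS algorithm of choice."""
    X = np.stack([stft(m, win_len, hop) for m in mixtures])  # (ch, frames, bins)
    return [X[:, :, k] for k in range(X.shape[2])]

# two synthetic mixture channels standing in for real convolutive mixtures
rng = np.random.default_rng(0)
mixtures = rng.standard_normal((2, 4096))
groups = frequency_groups(mixtures)
print(len(groups), groups[0].shape)
```

After separation, the per-bin outputs would be dispersed back to their STFT windows and resynthesized with the inverse STFT and weighted overlap-add.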
Abstract: Location-based services (LBS) exploit the known
location of a user to provide services dependent on their geographic
context and personalized needs [1].
The development and arrival of broadband mobile data networks,
supported by mobile terminals equipped with new location
technologies such as GPS, have finally created opportunities for the
implementation of LBS applications. On the other hand, collecting
location data in general raises privacy concerns.
This paper presents results from two surveys of LBS acceptance in
Croatia. The first survey was administered to 181 students, and the
second, extended survey involved a sample of 180 Croatian citizens.
We developed a questionnaire consisting of descriptions of 15
different applications, with a scale measuring users' perceptions of
and attitudes towards these applications.
We report the results to identify potential commercial applications
for LBS in the B2C segment. Our findings suggest that some types of
applications, such as emergency & safety services and navigation,
have a significantly higher rate of acceptance than other types.
Abstract: Heavy rainfall greatly affects the aerodynamic performance of aircraft, and many aircraft accidents have been caused by the degradation of aerodynamic efficiency in heavy rain.
In this paper we study the effects of heavy rain on the aerodynamic efficiency of the cambered NACA 64-210 and symmetric
NACA 0012 airfoils. Our results show a significant increase in drag and decrease in lift. We used the preprocessing software Gridgen to create the geometry and mesh, Fluent as the solver, and Tecplot as the postprocessor. Discrete phase modeling (DPM) is used to model the rain particles in a two-phase flow approach. The rain particles are assumed to be inert.
Both airfoils showed a significant decrease in lift and increase in drag in the simulated rain environment. The most significant difference between the two airfoils was that the NACA 64-210 is more sensitive than the NACA 0012 to liquid water content (LWC). We believe that the results presented in this paper will be useful to designers of commercial aircraft and UAVs, and helpful for training pilots to control airplanes in heavy rain.
Abstract: Rapid progress in process automation and tightening
quality standards result in a growing demand being placed on fault
detection and diagnostics methods to provide both speed and
reliability of motor quality testing. Doubly fed induction generators
are used mainly for wind energy conversion in MW power plants.
This paper presents the detection of inter-turn stator and open-phase
faults in a doubly fed induction machine whose stator and
rotor are supplied by two pulse-width modulation (PWM) inverters.
The method used in this article to detect these faults is based on
Park's Vector Approach, using a neural network.
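As an illustration of the feature extraction behind Park's Vector Approach, the sketch below computes the Park's vector components from three-phase stator currents (the amplitude-invariant scaling and the synthetic healthy currents are assumptions; the neural-network classifier that consumes these features is not shown):

```python
import numpy as np

def park_vector(ia, ib, ic):
    """Park's vector components from the three stator phase currents.
    A healthy machine traces a circle in the (id, iq) plane; stator
    faults distort it into an ellipse, a signature a neural network
    can learn to classify."""
    i_d = np.sqrt(2 / 3) * ia - ib / np.sqrt(6) - ic / np.sqrt(6)
    i_q = (ib - ic) / np.sqrt(2)
    return i_d, i_q

# balanced (healthy) three-phase currents at 50 Hz
t = np.linspace(0, 0.04, 1000)
w = 2 * np.pi * 50
ia = np.cos(w * t)
ib = np.cos(w * t - 2 * np.pi / 3)
ic = np.cos(w * t + 2 * np.pi / 3)
i_d, i_q = park_vector(ia, ib, ic)
radius = np.hypot(i_d, i_q)  # constant radius -> healthy circular pattern
print(round(float(radius.min()), 3), round(float(radius.max()), 3))
```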
Abstract: Defining the strategic position of an organization within
its industry environment is one of the basic and most important
phases of strategic planning, to the extent that one of the
fundamental schools of strategic planning is the strategic positioning
school. In today's knowledge-based economy and dynamic
environment, strategic positioning is essential for universities as the
centers of education, knowledge creation and knowledge worker
development. To date, various models with different approaches to
strategic positioning have been deployed for defining the strategic
position within various industries. The Balanced Scorecard, as one of
the powerful models for strategic positioning, analyzes all aspects of
the organization evenly. In this paper, considering the strength of the
BSC in strategic evaluation, it is used to analyze the environmental
position of the best Iranian business schools. The results could be
used in developing strategic plans for these schools as well as other
Iranian management and business schools.
Abstract: Today’s technology is heavily dependent on web applications, which are being accepted by users at a very rapid pace and have made our work more efficient. They include webmail, online retail, online gaming, wikis, train and flight schedules, and the list is very long. They are developed in different languages such as PHP, Python, C# and ASP.NET, using scripts such as HTML and JavaScript. Attackers develop tools and techniques to exploit web applications and legitimate websites. This has led to the rise of web application security, which can be broadly classified into declarative security and program security. The most common attacks on applications are SQL injection and XSS, which give access to unauthorized users who can damage or destroy the system. This paper presents a detailed literature description and analysis of web application security, examples of attacks, and steps to mitigate the vulnerabilities.
Abstract: Self-organizing map (SOM) is a well known data
reduction technique used in data mining. It can reveal structure in
data sets through data visualization that is otherwise hard to detect
from raw data alone. However, interpretation through visual
inspection is prone to errors and can be very tedious. There are
several techniques for the automatic detection of clusters of code
vectors found by SOM, but they generally do not take into account
the distribution of code vectors; this may lead to unsatisfactory
clustering and poor definition of cluster boundaries, particularly
where the density of data points is low. In this paper, we propose the
use of an adaptive heuristic particle swarm optimization (PSO)
algorithm for finding cluster boundaries directly from the code
vectors obtained from SOM. The application of our method to
several standard data sets demonstrates its feasibility. The PSO
algorithm utilizes the so-called U-matrix of the SOM to determine
cluster boundaries; the results of this novel automatic method compare
very favorably to boundary detection through traditional algorithms,
namely k-means and hierarchical clustering, which are normally used
to interpret the output of SOM.
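To make the role of the U-matrix concrete, here is a minimal sketch of its computation from SOM code vectors (a 4-connected rectangular grid and Euclidean distance are assumptions; the PSO boundary search that operates on it is not shown):

```python
import numpy as np

def u_matrix(codebook):
    """U-matrix of a rectangular SOM: each cell holds the mean distance
    between a unit's code vector and its 4-connected neighbours; high
    values mark the cluster boundaries to be traced."""
    rows, cols, _ = codebook.shape
    U = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            dists = []
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    dists.append(np.linalg.norm(codebook[r, c] - codebook[rr, cc]))
            U[r, c] = np.mean(dists)
    return U

# toy 4x4 map: left half near 0, right half near 1 -> boundary down the middle
codebook = np.zeros((4, 4, 2))
codebook[:, 2:] = 1.0
U = u_matrix(codebook)
print(U.round(2))
```

The ridge of high U-values between the two flat regions is exactly the kind of boundary a swarm-based search can locate automatically.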
Abstract: R&D risk management has been suggested as one of
the management approaches for accomplishing the goals of public
R&D investment. Investment in basic science and core technology
development is an essential role of government in securing the
social base needed for continuous economic growth. It is also an
important role of the science and technology policy sectors to generate
a positive environment in which the outcomes of public R&D can be
diffused in a stable fashion, by controlling in advance the uncertainties
and risk factors that may arise during the application of such
achievements to society and industry. Various policies have already
been implemented to manage uncertainties and variables that may
have a negative impact on accomplishing public R&D investment goals.
But we may derive new policy measures for complementing the
existing policies and for exploring progress direction by analyzing
them in a policy package from the viewpoint of R&D risk
management.
Abstract: Text mining applies knowledge discovery techniques to unstructured text, and is also termed knowledge discovery in text (KDT) or text data mining. In neural networks that address classification problems, the training set, the testing set and the learning rate are key elements: the collection of input/output patterns used to train the network, the patterns used to assess the network's performance, and the rate at which weight adjustments are made. This paper describes a proposed back-propagation neural net classifier that performs cross-validation on the original neural network, in order to optimize classification accuracy and reduce training time. The feasibility and benefits of the proposed approach are demonstrated by means of five data sets: contact-lenses, cpu, weather symbolic, weather and labor-nega-data. It is shown that, compared to the existing neural network, training time is reduced by a factor of more than 10 when the data set is larger than cpu or the network has many hidden units, while accuracy ('percent correct') was the same for all data sets except contact-lenses, which is the only one with missing attributes. For contact-lenses, the accuracy of the proposed neural network was on average around 0.3% lower than that of the original neural network. The algorithm is independent of specific data sets, so many of its ideas and solutions can be transferred to other classifier paradigms.
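A minimal sketch of back-propagation combined with k-fold cross-validation, the two components the abstract brings together (toy two-dimensional data stands in for the paper's data sets; the network size, learning rate and epoch count are assumed values):

```python
import numpy as np

rng = np.random.default_rng(0)

def train_mlp(X, y, hidden=8, lr=0.5, epochs=2000):
    """Minimal back-propagation network: one hidden sigmoid layer."""
    W1 = rng.standard_normal((X.shape[1], hidden)) * 0.5
    W2 = rng.standard_normal((hidden, 1)) * 0.5
    sig = lambda z: 1 / (1 + np.exp(-z))
    for _ in range(epochs):
        H = sig(X @ W1)                      # forward pass
        p = sig(H @ W2)
        d2 = (p - y[:, None]) * p * (1 - p)  # backward pass (MSE loss)
        d1 = (d2 @ W2.T) * H * (1 - H)
        W2 -= lr * H.T @ d2 / len(X)
        W1 -= lr * X.T @ d1 / len(X)
    return W1, W2

def cross_val_accuracy(X, y, k=5):
    """k-fold cross-validation: train on k-1 folds, test on the held-out one."""
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, k)
    sig = lambda z: 1 / (1 + np.exp(-z))
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        W1, W2 = train_mlp(X[train], y[train])
        pred = (sig(sig(X[test] @ W1) @ W2)[:, 0] > 0.5).astype(int)
        accs.append(float((pred == y[test]).mean()))
    return float(np.mean(accs))

# toy separable data in place of the paper's data sets (contact-lenses etc.)
X = rng.standard_normal((200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
print(cross_val_accuracy(X, y, k=5))
```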
Abstract: In this paper, we first show a relationship between two
stabilizing controllers, which presents an extended feedback system
using two stabilizing controllers. Then, we apply this relationship to
the two-stage compensator design. In this paper, we consider single-input
single-output plants. On the other hand, we do not assume the
coprime factorizability of the model. Thus, the results of this paper
are based on the factorization approach only, so that they can be
applied to numerous linear systems.
Abstract: The organizational structure of Turkish state
universities is a form of bureaucracy, a highly efficient system of
rational and formal control. According to the dimensional approach,
bureaucracy can occur in an organization to a degree, as some
bureaucratic characteristics can be stronger than others. In addition,
the units of an organization, owing to their specific
characteristics, can perceive the bureaucracy differently. In
this study, Hall's Organizational Inventory, which was developed for
evaluating the degree of bureaucratization from the dimensional
perspective, is used to find out whether there is a difference in the
perception of bureaucracy between the academicians working in
three different departments and two faculties of the same university.
Abstract: We provide a supervised speech-independent voice recognition technique in this paper. In the feature extraction stage we propose a mel-cepstral based approach. Our feature vector classification method uses a special nonlinear metric, derived from the Hausdorff distance for sets, and a minimum mean distance classifier.
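A minimal sketch of the set-based classification idea above (the standard Hausdorff distance and toy 2-D feature sets are assumptions; the paper derives its own nonlinear metric from the Hausdorff distance, and the mel-cepstral front end is omitted):

```python
import numpy as np

def directed_hausdorff(A, B):
    """max over a in A of the distance from a to the set B."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    return float(d.min(axis=1).max())

def hausdorff(A, B):
    """Symmetric Hausdorff distance between two sets of feature vectors."""
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))

def classify(sample_set, reference_sets):
    """Minimum-distance classifier: assign the sample to the speaker
    whose reference feature set is closest."""
    return int(np.argmin([hausdorff(sample_set, R) for R in reference_sets]))

rng = np.random.default_rng(0)
ref0 = rng.normal(0.0, 0.1, (50, 2))   # toy stand-ins for per-speaker
ref1 = rng.normal(3.0, 0.1, (50, 2))   # mel-cepstral feature sets
sample = rng.normal(0.0, 0.1, (20, 2))
label = classify(sample, [ref0, ref1])
print(label)
```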
Abstract: This paper discusses a new, systematic approach to
the synthesis of a NP-hard class of non-regenerative Boolean
networks, described by FON[FOFF]={mi}[{Mi}], where for every
mj[Mj]∈{mi}[{Mi}], there exists another mk[Mk]∈{mi}[{Mi}], such
that their Hamming distance HD(mj, mk)=HD(Mj, Mk)=O(n), (where
'n' represents the number of distinct primary inputs). The method
automatically ensures exact minimization for certain important self-dual
functions with 2^(n-1) points in their one-sets. The elements meant for
grouping are determined from a newly proposed weighted incidence
matrix. Then the binary value corresponding to the candidate pair is
correlated with the proposed binary value matrix to enable direct
synthesis. We recommend algebraic factorization operations as a post
processing step to enable reduction in literal count. The algorithm
can be implemented in any high level language and achieves best
cost optimization for the problem dealt with, irrespective of the
number of inputs. For other cases, the method is iterated to
subsequently reduce it to a problem of O(n-1), O(n-2),.... and then
solved. In addition, it leads to optimal results for problems exhibiting
higher degree of adjacency, with a different interpretation of the
heuristic, and the results are comparable with other methods.
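The defining Hamming-distance condition on the target class can be illustrated with a small check (minterms are encoded as integers; taking HD = n, i.e. complementary minterms, as the extreme case of the O(n) condition is an assumption for this sketch):

```python
def hamming(a, b):
    """Hamming distance between two minterms encoded as integers."""
    return bin(a ^ b).count("1")

def has_distant_partner(minterms, n):
    """Check that every minterm mj in the one-set has another minterm mk
    in the one-set with HD(mj, mk) = n, i.e. its bitwise complement
    on n primary inputs is also present."""
    s = set(minterms)
    return all(any(hamming(m, k) == n for k in s if k != m) for m in s)

# n = 3: each minterm's complement is also in the one-set -> in the class
print(has_distant_partner([0b000, 0b111, 0b010, 0b101], 3))
# 0b000 has no partner at distance 3 here -> outside the class
print(has_distant_partner([0b000, 0b001, 0b010], 3))
```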
In terms of literal cost, at the technology independent stage, the
circuits synthesized using our algorithm enabled net savings over
AOI (AND-OR-Invert) logic, AND-EXOR logic (EXOR Sum-of-
Products or ESOP forms) and AND-OR-EXOR logic by 45.57%,
41.78% and 41.78% respectively for the various problems.
Circuit level simulations were performed for a wide variety of
case studies at 3.3V and 2.5V supply to validate the performance of
the proposed method and the quality of the resulting synthesized
circuits at two different voltage corners. Power estimation was
carried out for a 0.35micron TSMC CMOS process technology. In
comparison with AOI logic, the proposed method enabled mean
savings in power by 42.46%. With respect to AND-EXOR logic, the
proposed method yielded power savings to the tune of 31.88%, while
in comparison with AND-OR-EXOR level networks; average power
savings of 33.23% was obtained.
Abstract: This research studies the types of products and
services that employ ambient media, and the respective techniques used
in their advertisement materials. Data collection was done via analyses
of a total of 62 advertisements that employed the ambient media
approach in Thailand during the years 2004 to 2011. The 62
advertisements were qualifying advertisements of the Adman Awards
& Symposium under the category of Outdoor & Ambience. Analysis
results reveal that a total of 14 products and services chose to utilize
ambient media in their advertisements. Amongst all ambient media
techniques, 'intrusion', which uses the value of a medium in its
representation of content, is employed most often. Following intrusion
is 'interaction', where consumers are invited to participate and interact
with the advertising materials. 'Illusion' ranks third, subjecting
viewers to distortions of reality that make the division between reality
and fantasy less clear.
Abstract: We propose a reduced-order model for the instantaneous
hydrodynamic force on a cylinder. The model consists of a system of
two ordinary differential equations (ODEs), which can be integrated
in time to yield very accurate histories of the resultant force and
its direction. In contrast to several existing models, the proposed
model considers the actual (total) hydrodynamic force rather than its
perpendicular or parallel projection (the lift and drag), and captures
the complete force rather than the oscillatory part only. We study
and provide descriptions of the relationship between the model
parameters, evaluated utilizing results from numerical simulations,
and the Reynolds number so that the model can be used at any
arbitrary value within the considered range of 100 to 500 to provide
accurate representation of the force without the need to perform
time-consuming simulations and solve the partial differential equations
(PDEs) governing the flow field.
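The abstract does not state the model equations, so as a hedged illustration of integrating such a two-ODE reduced-order model in time, the sketch below uses a classical RK4 scheme with a hypothetical limit-cycle oscillator standing in for the force model (the van der Pol form and parameter value are assumptions, not the paper's model):

```python
import numpy as np

def rk4(f, y0, t0, t1, n):
    """Classical fourth-order Runge-Kutta integration of dy/dt = f(t, y)."""
    h = (t1 - t0) / n
    t, y = t0, np.asarray(y0, float)
    out = [y.copy()]
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
        out.append(y.copy())
    return np.array(out)

# hypothetical stand-in for the two-ODE force model; in the paper the
# model parameters are fitted against simulations as functions of Re
mu = 1.0
f = lambda t, y: np.array([y[1], mu * (1 - y[0] ** 2) * y[1] - y[0]])
hist = rk4(f, [0.5, 0.0], 0.0, 50.0, 5000)
print(hist.shape)
```

Integrating two ODEs this way is orders of magnitude cheaper than resolving the PDEs of the flow field at each Reynolds number.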
Abstract: This paper studies the duration, or survival time, of commercial banks active in the Moscow three-month rouble deposit market during the 1994-1997 period. The privatization process of the Russian commercial banking industry after the 1988 banking reform caused a massive entry of new banks, followed by a period of high rates of exit. As a consequence, many firms went bankrupt without refunding their deposits. Therefore, both for the banks and for the banks' depositors, it is of interest to analyze which significant characteristics motivate the exit or closing of a bank. We propose a different methodology based on penalized weighted least squares, which represents a very general, flexible and innovative approach for this type of analysis. The most relevant results are that smaller banks exit sooner, and banks that entered the market in the last part of the study period have shorter durations. As expected, more experienced banks have a longer duration in the market. In addition, the mean survival time is lower for banks which offer extreme interest rates.
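A minimal sketch of the penalized weighted least squares estimator named above (a ridge-type penalty and synthetic covariates are assumptions; the paper's penalty and duration model may differ):

```python
import numpy as np

def penalized_wls(X, y, w, lam):
    """Penalized weighted least squares: minimize
    sum_i w_i (y_i - x_i' beta)^2 + lam * ||beta||^2
    via the normal equations (X'WX + lam*I) beta = X'Wy."""
    W = np.diag(w)
    p = X.shape[1]
    return np.linalg.solve(X.T @ W @ X + lam * np.eye(p), X.T @ W @ y)

# toy durations: y = 2*x1 - 1*x2, noiseless, so a tiny penalty recovers it
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 2))
y = X @ np.array([2.0, -1.0])
beta = penalized_wls(X, y, np.ones(100), lam=1e-8)
print(beta.round(3))
```

The weights allow, for example, down-weighting censored observations, while the penalty stabilizes the fit, which is what makes the approach flexible for duration data.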
Abstract: Trust is essential for further and wider acceptance of
contemporary e-services. Trust was first addressed almost thirty years
ago in the Trusted Computer System Evaluation Criteria standard of
the US DoD, but this and other approaches proposed in that
period were actually addressing security. Roughly ten years ago,
methodologies followed that addressed trust phenomenon at its core,
and they were based on Bayesian statistics and its derivatives, while
some approaches were based on game theory. However, trust is a
manifestation of judgment and reasoning processes. It has to be dealt
with in accordance with this fact and adequately supported in cyber
environment. On the basis of the results in the field of psychology
and our own findings, a methodology called qualitative algebra has
been developed, which deals with so far overlooked elements of trust
phenomenon. It complements existing methodologies and provides a
basis for a practical technical solution that supports management of
trust in contemporary computing environments. Such a solution is also
presented at the end of this paper.
Abstract: Several methods have been proposed for color image
compression, but the reconstructed images have a very low
signal-to-noise ratio, which makes them inefficient. This paper
describes a lossy compression technique for color images which
overcomes these drawbacks. The technique works in the spatial domain,
where the pixel values of the RGB planes of the input color image are
mapped onto two-dimensional planes. The proposed technique produced
better results than JPEG2000 and 2DPCA, and a comparative study is
reported based on image quality measures such as PSNR and MSE.
Experiments on real images compare this methodology with previous
ones and demonstrate its advantages.
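The quality measures used in such comparisons can be sketched as follows (the 8-bit peak value of 255 and the synthetic test image are assumptions):

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two images."""
    return float(np.mean((a.astype(float) - b.astype(float)) ** 2))

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    m = mse(a, b)
    return float('inf') if m == 0 else 10 * np.log10(peak ** 2 / m)

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
# simulate mild reconstruction error with small uniform perturbations
noisy = np.clip(img.astype(int) + rng.integers(-5, 6, img.shape),
                0, 255).astype(np.uint8)
print(round(psnr(img, noisy), 1))
```

Higher PSNR (equivalently lower MSE) indicates a reconstruction closer to the original.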
Abstract: Heating is inevitable in any bearing operation. It leads
not only to thinning of the lubricant but can also lead to thermal
deformation of the bearing. The present work is an attempt to
analyze the influence of thermal deformation on the thermohydrodynamic
lubrication of infinitely long tilted pad slider rough
bearings. As a consequence of heating the slider is deformed and is
assumed to take a parabolic shape. Also the asperities expand leading
to smaller effective film thickness. Two different types of surface
roughness are considered: longitudinal roughness and transverse
roughness. Christensen's stochastic approach is used to derive the
Reynolds-type equations. Density and viscosity are considered to be
temperature dependent. The modified Reynolds equation, momentum
equation, continuity equation and energy equation are decoupled and
solved using finite difference method to yield various bearing
characteristics. From the numerical simulations it is observed that the
performance of the bearing is significantly affected by the thermal
distortion of the slider and asperities and even the parallel sliders
seem to carry some load.
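As a hedged illustration of the finite-difference solution step, the sketch below solves a much-simplified isothermal, smooth, one-dimensional Reynolds equation for a plane tilted-pad slider (all geometry and fluid parameters are assumed values; the paper's coupled, temperature-dependent, rough-surface system is far richer):

```python
import numpy as np

def slider_pressure(n=200, h1=1e-4, h2=5e-5, L=0.1, mu=0.05, U=5.0):
    """Finite-difference solve of d/dx(h^3 dp/dx) = 6*mu*U*dh/dx
    for a linearly converging film, ambient pressure at both ends."""
    x = np.linspace(0, L, n)
    h = h1 + (h2 - h1) * x / L          # film: inlet h1 to outlet h2
    dx = x[1] - x[0]
    hm = (h[:-1] + h[1:]) / 2           # film thickness at cell faces
    A = np.zeros((n, n)); b = np.zeros(n)
    A[0, 0] = A[-1, -1] = 1.0           # p = 0 boundary conditions
    for i in range(1, n - 1):
        A[i, i - 1] = hm[i - 1] ** 3
        A[i, i]     = -(hm[i - 1] ** 3 + hm[i] ** 3)
        A[i, i + 1] = hm[i] ** 3
        b[i] = 6 * mu * U * (h[i + 1] - h[i - 1]) / 2 * dx
    return x, np.linalg.solve(A, b)

x, p = slider_pressure()
print(round(float(p.max()), 1))
```

The positive pressure peak inside the converging film is what gives the slider its load-carrying capacity; the paper's THD analysis adds coupled momentum, continuity and energy equations plus stochastic roughness.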
Abstract: In this paper, a novel method for estimating the frequencies of multiple one-dimensional real-valued sinusoidal signals in the presence of additive Gaussian noise is postulated. A computationally simple frequency estimation method with efficient statistical performance is attractive in many array signal processing applications. The prime focus of this paper is to combine a subspace-based technique with a simple peak search approach. The paper presents a variant of the Propagator Method (PM), in which a collaborative approach of SUMWE and the Propagator Method is applied to estimate multiple real-valued sine wave frequencies. A new data model is proposed in which the dimension of the signal subspace is equal to the number of frequencies present in the observation, whereas the signal subspace dimension is twice the number of frequencies in the conventional MUSIC method for estimating the frequencies of real-valued sinusoidal signals. The statistical analysis of the proposed method is studied, and an explicit expression for the asymptotic (large-sample) mean squared error (MSE), or variance of the estimation error, is derived. The performance of the method is demonstrated, and the theoretical analysis is substantiated, through numerical examples. The proposed method achieves sustainably high estimation accuracy and frequency resolution at lower SNR, which is verified by simulations comparing it with the conventional MUSIC, ESPRIT and Propagator methods.
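The subspace-plus-peak-search idea can be illustrated with a standard MUSIC-style pseudospectrum for real sinusoids; this is classical MUSIC, shown here only to exhibit the two-dimensions-per-real-sine subspace the abstract contrasts against, not the proposed PM/SUMWE variant (snapshot length, search grid and test frequencies are assumptions):

```python
import numpy as np

def music_freqs(x, n_freqs, m=20, grid=2000):
    """Noise-subspace peak search: each real sinusoid occupies a 2-D
    signal subspace, so 2*n_freqs eigenvectors are excluded."""
    N = len(x)
    snaps = np.stack([x[i:i + m] for i in range(N - m)])
    R = snaps.T @ snaps / len(snaps)          # sample covariance
    w, V = np.linalg.eigh(R)                  # ascending eigenvalues
    En = V[:, :m - 2 * n_freqs]               # noise subspace
    f = np.linspace(0.01, 0.49, grid)
    a = np.exp(2j * np.pi * np.outer(np.arange(m), f))
    P = 1 / np.linalg.norm(En.conj().T @ a, axis=0) ** 2   # pseudospectrum
    found = []                                # greedy, well-separated peaks
    for i in np.argsort(P)[::-1]:
        if all(abs(f[i] - g) > 0.02 for g in found):
            found.append(float(f[i]))
        if len(found) == n_freqs:
            break
    return sorted(found)

rng = np.random.default_rng(1)
n = np.arange(400)
x = (np.sin(2 * np.pi * 0.10 * n) + 0.8 * np.sin(2 * np.pi * 0.27 * n)
     + 0.1 * rng.standard_normal(400))
print([round(f, 3) for f in music_freqs(x, 2)])
```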