Abstract: Cloud computing is an approach that provides computation and storage services on demand to clients over the network, independent of device and location. In the last few years, cloud computing has become a trend in information technology, with many companies transferring their business processes and applications to the cloud. Cloud computing with service-oriented architecture has contributed to the rapid development of Geographic Information Systems (GIS). The Open Geospatial Consortium (OGC), through its standards, provides the interfaces that expose hosted spatial data and GIS functionality to integrated GIS applications. Furthermore, with their enormous processing power, clouds provide an efficient environment for data-intensive applications, which can be performed efficiently, with higher precision and greater reliability. This paper presents our work on geospatial data services within the cloud computing environment and its technology. The strengths and weaknesses of running a geographic information system in a cloud computing environment are introduced. The OGC standards that solve our application interoperability problems are highlighted. Finally, we outline our system architecture, with utilities for requesting and invoking our developed data-intensive applications as a web service.
Abstract: Heavy rainfall greatly affects the aerodynamic performance of an aircraft, and many aircraft accidents have been caused by the resulting degradation of aerodynamic efficiency. In this paper we study the effects of heavy rain on the aerodynamic efficiency of the NACA 64-210 and NACA 0012 airfoils. For our analysis, a CFD method and a preprocessing grid generator are used as the main analytical tools, and rain is simulated via the two-phase flow approach's Discrete Phase Model (DPM). Raindrops are assumed to be non-interacting, non-deforming, non-evaporating, and non-spinning spheres. Both airfoil sections exhibited a significant reduction in lift and an increase in drag for a given lift condition in simulated rain. The most significant difference between the two airfoils was the sensitivity of the NACA 64-210 to liquid water content (LWC), whereas the performance losses of the NACA 0012 in the rain environment are not a function of LWC. It is expected that the quantitative information gained in this paper will be useful to the operational airline industry, and that greater effort, such as small-scale and full-scale flight tests, should be put in this direction to further improve aviation safety.
Abstract: Many digital signal processing techniques have been used to automatically distinguish protein coding regions (exons) from non-coding regions (introns) in DNA sequences. In this work, we have characterized these sequences according to their nonlinear dynamical features, such as moment invariants, correlation dimension, and largest Lyapunov exponent estimates. We have applied our model to a number of real sequences encoded into a time series using EIIP sequence indicators. In order to discriminate between coding and non-coding DNA regions, the phase space trajectory was first reconstructed for coding and non-coding regions. Nonlinear dynamical features were then extracted from those regions and used to investigate the difference between them. Our results indicate that the nonlinear dynamical characteristics yield significant differences between coding regions (CR) and non-coding regions (NCR) in DNA sequences. Finally, the classifier is tested on real genes in which the coding and non-coding regions are well known.
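The encoding and reconstruction steps mentioned above can be sketched briefly. The EIIP indicator maps each nucleotide to its electron-ion interaction potential value, turning a DNA string into a numeric series whose phase space trajectory can then be reconstructed by time-delay embedding. A minimal sketch (the embedding dimension and delay below are illustrative, not the paper's settings):

```python
# Standard EIIP values for the four nucleotides.
EIIP = {"A": 0.1260, "C": 0.1340, "G": 0.0806, "T": 0.1335}

def dna_to_series(seq):
    """Map a DNA string to its EIIP numerical time series."""
    return [EIIP[base] for base in seq.upper()]

def delay_embed(x, dim=3, tau=1):
    """Reconstruct phase-space vectors by time-delay embedding:
    v_i = (x_i, x_{i+tau}, ..., x_{i+(dim-1)*tau})."""
    n = len(x) - (dim - 1) * tau
    return [[x[i + j * tau] for j in range(dim)] for i in range(n)]

series = dna_to_series("ATGCATGC")
vectors = delay_embed(series, dim=3, tau=1)
```

Features such as the correlation dimension and the largest Lyapunov exponent are then estimated from the embedded vectors.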
Abstract: Over the last two decades, owing to the hostility of the environment over the internet, concerns about the confidentiality of information have increased at a phenomenal rate. Therefore, to safeguard information from attacks, a number of data/information hiding methods have evolved, mostly in the spatial and transform domains. In spatial domain data hiding techniques, the information is embedded directly on the image plane itself. In transform domain data hiding techniques, the image is first changed from the spatial domain to some other domain, and the secret information is then embedded so that it remains more secure from any attack. Information hiding algorithms in the time or spatial domain have high capacity but relatively lower robustness. In contrast, algorithms in transform domains such as the DCT and DWT have a certain robustness against some multimedia processing. In this work the authors propose a novel steganographic method for hiding information in the transform domain of a grayscale image. The proposed approach works by converting the gray level image to the transform domain using a discrete integer wavelet technique through the lifting scheme. The approach performs a 2-D lifting wavelet decomposition of the cover image through the lifted Haar wavelet and computes the approximation coefficients matrix CA and the detail coefficients matrices CH, CV, and CD. The next step is to apply the pixel mapping method (PMM) to those coefficients to form the stego image. The aim of this paper is to propose a high-capacity image steganography technique that uses the pixel mapping method in the integer wavelet domain with acceptable levels of imperceptibility and distortion in the cover image and a high level of overall security. The solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
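The integer Haar lifting step that the decomposition relies on can be sketched in one dimension; a 2-D decomposition applies it along rows and then columns to produce CA, CH, CV, and CD. This is an illustrative predict/update pair with integer arithmetic (exactly invertible, which is what makes lossless embedding possible), not the authors' implementation:

```python
def haar_lift(x):
    """One level of the integer Haar wavelet via lifting (even-length input).

    Split into even/odd samples; the predict step forms details,
    the update step forms approximations, all in integer arithmetic
    so the transform is exactly invertible.
    """
    even, odd = x[0::2], x[1::2]
    detail = [o - e for e, o in zip(even, odd)]          # predict step
    approx = [e + d // 2 for e, d in zip(even, detail)]  # update step
    return approx, detail

def haar_unlift(approx, detail):
    """Exact inverse of haar_lift: undo update, then undo predict."""
    even = [a - d // 2 for a, d in zip(approx, detail)]
    odd = [d + e for e, d in zip(even, detail)]
    out = []
    for e, o in zip(even, odd):
        out += [e, o]
    return out
```

Because both steps use only integer additions and floor divisions, the reconstruction is bit-exact, unlike a floating-point DWT.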
Abstract: This paper proposes new enhancement models for nonlinear anisotropic diffusion methods that greatly reduce speckle and preserve image features in medical ultrasound images. By incorporating a local physical characteristic of the image, in this case scatterer density, in addition to the gradient, into existing tensor-based image diffusion methods, we were able to greatly improve the performance of the existing filtering methods, namely edge enhancing (EE) and coherence enhancing (CE) diffusion. The new enhancement methods were tested on various ultrasound images, including phantom and some clinical images, to determine the amount of speckle reduction and of edge and coherence enhancement. Scatterer density weighted nonlinear anisotropic diffusion (SDWNAD) for ultrasound images consistently outperformed its traditional tensor-based counterparts, which use the gradient alone to weight the diffusivity function. SDWNAD is shown to greatly reduce speckle noise while preserving image features such as edges, orientation coherence, and scatterer density. SDWNAD's superior performance over nonlinear coherent diffusion (NCD), speckle reducing anisotropic diffusion (SRAD), the adaptive weighted median filter (AWMF), wavelet shrinkage (WS), and wavelet shrinkage with contrast enhancement (WSCE) makes it an ideal preprocessing step for automatic segmentation in ultrasound imaging.
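SDWNAD itself weights a diffusion tensor by scatterer density; the gradient-only behavior it improves on can be illustrated with a minimal 1-D Perona-Malik-style step, where the diffusivity function shrinks near large gradients so edges survive while homogeneous (speckled) regions are smoothed. The edge-stopping function and constants below are illustrative only:

```python
import math

def diffusivity(grad, k=10.0):
    """Edge-stopping function: near 1 in flat regions, near 0 at edges."""
    return math.exp(-(grad / k) ** 2)

def diffuse_step(row, dt=0.2, k=10.0):
    """One explicit 1-D nonlinear-diffusion iteration (illustration only).

    Each interior sample moves toward its neighbors, with flux
    attenuated by the gradient-dependent diffusivity."""
    out = list(row)
    for i in range(1, len(row) - 1):
        east = row[i + 1] - row[i]
        west = row[i - 1] - row[i]
        out[i] = row[i] + dt * (diffusivity(abs(east), k) * east +
                                diffusivity(abs(west), k) * west)
    return out
```

In the tensor-based 2-D methods the scalar diffusivity is replaced by a matrix whose eigenvalues steer smoothing along, rather than across, image structures.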
Abstract: In this paper a novel approach for generalized image retrieval based on semantic contents is presented. It uses a combination of three feature extraction methods, namely color, texture, and the edge histogram descriptor, with provision to add new features in the future for better retrieval efficiency. Any combination of these methods that is more appropriate for the application can be used for retrieval; this is provided through the user interface (UI) in the form of relevance feedback. The image properties analyzed in this work are computed using computer vision and image processing algorithms: for color, the histograms of the images are computed; for texture, co-occurrence matrix based measures such as entropy and energy are calculated; and for edge density, the Edge Histogram Descriptor (EHD) is found. For retrieval of images, a novel idea based on a greedy strategy is developed to reduce the computational complexity. The entire system was developed using AForge.Imaging (an open source product), MATLAB .NET Builder, C#, and Oracle 10g. The system was tested on the Corel image database containing 1000 natural images and achieved good results.
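The texture measures mentioned above derive from a gray-level co-occurrence matrix (GLCM): counts of how often pairs of gray levels occur at a fixed spatial offset, normalized into probabilities. A minimal sketch of entropy and energy from a GLCM (the horizontal-neighbor offset is an assumption; the paper does not state its offsets):

```python
import math
from collections import Counter

def glcm_features(img, dx=1, dy=0):
    """Entropy and energy from a normalized gray-level co-occurrence
    matrix, built over the (dx, dy) neighbor offset."""
    pairs = Counter()
    h, w = len(img), len(img[0])
    for y in range(h - dy):
        for x in range(w - dx):
            pairs[(img[y][x], img[y + dy][x + dx])] += 1
    total = sum(pairs.values())
    probs = [c / total for c in pairs.values()]
    entropy = -sum(p * math.log2(p) for p in probs)  # randomness of texture
    energy = sum(p * p for p in probs)               # uniformity of texture
    return entropy, energy
```

A perfectly uniform region gives entropy 0 and energy 1; busier textures push entropy up and energy down.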
Abstract: The objective of this paper is to develop a forecast model for HW flows. The research methodology comprised six modules: historical data, assumptions, choice of indicators, data processing, data analysis with STATGRAPHICS, and forecast models. The proposed methodology was validated in a case study for Latvia. Hypotheses on the changes in HW for the time period 2010-2020 were developed and mathematically described with confidence levels of 95.0% and 50.0%, and a sensitivity analysis for the analyzed scenarios was performed. The results show that the growth of GDP affects the total amount of HW in the country. The total amount of HW is projected to lie within a corridor from -27.7% in the optimistic scenario up to +87.8% in the pessimistic scenario, with a confidence level of 50.0%, for the period 2010-2020. The optimistic scenario proved to be the least flexible to changes in GDP growth.
Abstract: The upgrading of low-quality crude natural gas (NG) is attracting interest due to the high demand for pipeline-grade gas in recent years. Membrane processes are a commercially proven technology for the removal of impurities such as carbon dioxide from NG. In this work, a cross-flow mathematical model is incorporated into ASPEN HYSYS as a user-defined unit operation in order to design membrane systems for CO2/CH4 separation. The effect of operating conditions (such as feed composition and pressure) and membrane selectivity on the design parameters (methane recovery and the total membrane area required for the separation) has been studied for different design configurations. These configurations include single-stage (with and without recycle) and double-stage membrane systems (with and without permeate or retentate recycle). It is shown that methane recovery can be improved by recycling the permeate or retentate stream as well as by using double-stage membrane systems. The ASPEN HYSYS user-defined unit operation proposed in this study has the potential to be applied to complex membrane system design and optimization.
Abstract: In this study, a loop back algorithm for connected component labeling for detecting objects in a digital image is presented. The approach uses a loop back connected component labeling algorithm that helps the system distinguish detected objects according to their labels. Unlike the whole-window scanning technique, this technique reduces the time needed to locate an object by focusing on suspected objects based on certain defined features. In this study, the approach was also implemented in a face detection system. Face detection is becoming an interesting research area, since many devices and systems require detecting faces for certain purposes. The input can be a still image or video, so the sub-processes of such a system have to be simple, efficient, and accurate to give good results.
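The paper's loop back variant additionally restricts the scan to suspected regions; the baseline operation it builds on, assigning one label per connected group of foreground pixels, can be sketched as a 4-connected flood fill:

```python
from collections import deque

def label_components(binary):
    """4-connected component labeling by BFS flood fill.

    Returns a label image (0 = background) and the component count."""
    h, w = len(binary), len(binary[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not labels[y][x]:
                current += 1                     # start a new component
                labels[y][x] = current
                queue = deque([(y, x)])
                while queue:
                    cy, cx = queue.popleft()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                           and binary[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return labels, current
```

Each labeled region can then be checked against the defined object features (e.g., size and shape for face candidates) instead of rescanning the whole window.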
Abstract: On-board Error Detection and Correction (EDAC)
devices aim to secure data transmitted between the central
processing unit (CPU) of a satellite onboard computer and its local
memory. This paper presents a comparison of the performance of
four low complexity EDAC techniques for application in Random
Access Memories (RAMs) on-board small satellites. The
performance of a newly proposed EDAC architecture is measured
and compared with three different EDAC strategies, using the same
FPGA technology. A statistical analysis of single-event upset (SEU)
and multiple-bit upset (MBU) activity in commercial memories
onboard Alsat-1 is given for a period of 8 years.
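The abstract does not specify the proposed architecture, but a classic low-complexity building block for the compared EDAC strategies is a Hamming single-error-correcting code. A minimal Hamming(7,4) sketch, encoding 4 data bits with 3 parity bits and correcting any single flipped bit (an SEU) via the syndrome:

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit codeword
    (positions 1..7 hold p1 p2 d1 p3 d2 d3 d4)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct at most one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the error
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]
```

Practical RAM protection adds an overall parity bit (SEC-DED) so MBUs can at least be detected; the comparison in the paper is over such hardware-oriented variants in FPGA logic.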
Abstract: The effectiveness of Artificial Neural Network (ANN) and Support Vector Machine (SVM) classifiers for fault diagnosis of rolling element bearings is presented in this paper. The characteristic features of vibration signals of a rotating driveline, run both in its normal condition and with faults introduced, were used as inputs to the ANN and SVM classifiers. Simple statistical features of the time-domain vibration signal segments, such as standard deviation, skewness, and kurtosis, along with the peaks of the signal and the peak of the power spectral density (PSD), are used as input features for the ANN and SVM classifiers. The effect of preprocessing the vibration signal with the Discrete Wavelet Transform (DWT) prior to feature extraction is also studied. The experimental results show that the performance of the SVM classifier in identifying the bearing condition is better than that of the ANN, and that preprocessing the vibration signal with the DWT enhances the effectiveness of both the ANN and SVM classifiers.
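The statistical features named above are central moments of each signal segment; a minimal sketch of their computation (population moments, with the signal peak as the paper also uses):

```python
import math

def segment_features(x):
    """Standard deviation, skewness, and kurtosis of a signal segment,
    plus its absolute peak -- typical inputs for a bearing-fault classifier."""
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n   # variance
    m3 = sum((v - mean) ** 3 for v in x) / n   # third central moment
    m4 = sum((v - mean) ** 4 for v in x) / n   # fourth central moment
    std = math.sqrt(m2)
    return {"std": std,
            "skewness": m3 / std ** 3,   # asymmetry of the distribution
            "kurtosis": m4 / m2 ** 2,    # impulsiveness of the signal
            "peak": max(abs(v) for v in x)}
```

Kurtosis in particular rises sharply for the impulsive vibration produced by a defective bearing, which is why it is a standard diagnostic feature.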
Abstract: ANN-ARIMA, which combines the autoregressive integrated moving average (ARIMA) model and the artificial neural network (ANN) model, is a valuable tool for modeling and forecasting nonlinear time series, yet the over-fitting problem is more likely to occur in neural network models. This paper provides a hybrid methodology, called BS-RBFAR, that combines a radial basis function (RBF) neural network and an autoregressive (AR) model based on the binomial smoothing (BS) technique, which is efficient in data processing. The method is examined using the Canadian lynx data. Empirical results indicate that the over-fitting problem can be eased using an RBF neural network based on binomial smoothing (called BS-RBF), and that the hybrid model BS-RBFAR can be an effective way to improve the forecasting accuracy achieved by BS-RBF used separately.
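Binomial smoothing convolves a series with normalized binomial coefficients; the simplest kernel is the 3-point (1/4, 1/2, 1/4) weighting, and repeated passes approximate wider binomial (near-Gaussian) kernels. A minimal sketch (endpoint handling here is an assumption, since the paper does not specify it):

```python
def binomial_smooth(x, passes=1):
    """Smooth a series with the 3-point binomial kernel (1/4, 1/2, 1/4).

    Endpoints are left unchanged; more passes give heavier smoothing."""
    for _ in range(passes):
        y = list(x)
        for i in range(1, len(x) - 1):
            y[i] = 0.25 * x[i - 1] + 0.5 * x[i] + 0.25 * x[i + 1]
        x = y
    return x
```

Feeding the smoothed series, rather than the raw one, to the RBF network is what damps the noise the network would otherwise over-fit.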
Abstract: This project relates to a two-wheeled self-balancing robot for transferring loads to different locations along a path. The robot functions in a dual navigation mode to navigate efficiently along a desired path. First, a plurality of distance sensors mounted on both sides of the body collects information on the tilt angle of the body; second, a plurality of speed sensors mounted at the bottom of the body collects information on the velocity of the body relative to the ground. A microcontroller processes the information collected from the sensors and is configured to set the path and balance the body automatically, while a processor operatively coupled to the microcontroller is configured to compute changes in the tilt and velocity of the body. A direct current motor operatively coupled to the microcontroller controls the wheels, and a remote control operatively coupled to the microcontroller operates the robot in the dual navigation modes.
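The balancing computation described, deriving a motor command from tilt and velocity, is commonly realized as a state-feedback law: drive the wheels in the direction of the lean so the base moves back under the center of mass. A minimal sketch (the gains and the saturation range are invented for illustration, not taken from the project):

```python
# Illustrative feedback gains (not from the project).
K_TILT, K_TILT_RATE, K_VEL = 25.0, 3.0, 1.5

def motor_command(tilt, tilt_rate, velocity):
    """State-feedback balance law for a two-wheeled robot.

    Positive tilt (leaning forward) yields a positive command,
    driving the wheels forward to catch the fall; the output is
    saturated to an assumed +/-100 PWM range."""
    u = K_TILT * tilt + K_TILT_RATE * tilt_rate + K_VEL * velocity
    return max(-100.0, min(100.0, u))
```

In the described architecture, the microcontroller would evaluate such a law each control cycle from the fused distance- and speed-sensor readings.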
Abstract: A wireless sensor network (WSN), an emerging technology for procuring and processing information, consists of autonomous nodes with versatile devices underpinned by applications. Nodes are equipped with different capabilities, such as sensing, computing, actuation, and wireless communication, based on application requirements. WSN applications range from military implementation on the battlefield to environmental monitoring, the health sector, and emergency response and surveillance. The nodes are deployed independently to cooperatively monitor physical and environmental conditions. The architecture of a WSN differs based on the application requirements, with a focus on low cost, flexibility, fault tolerance, the deployment process, and energy conservation. In this paper we present the characteristics, architecture design objectives, and architecture of WSNs.
Abstract: In this paper, we propose a practical digital music matching system that is robust to variation in sound quality. The proposed system is subdivided into two parts: client and server. The client part consists of the input, preprocessing, and feature extraction modules. The preprocessing module, including the music onset module, revises the value gap occurring on the time axis between identical songs of different formats. The proposed method uses delta-grouped Mel frequency cepstral coefficients (MFCCs) to extract music features that are robust to changes in sound quality. According to the number of sound quality formats (SQFs) used, the music server is constructed with a feature database (FD) that contains different sub-feature databases (SFDs). When the proposed system receives a music file, the selection module selects an appropriate SFD from the feature database; the selected SFD is subsequently used by the matching module. In this study, we used 3,000 queries for matching experiments in three cases with different FDs. In each case, we used 1,000 queries constructed by mixing 8 SQFs and 125 songs. The success rate of music matching improved from 88.6% when using a single SFD to 93.2% when using quadruple SFDs. With this experiment, we show that the proposed method is robust to various sound qualities.
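Delta features augment static MFCCs with their local rate of change, computed by a regression over a window of neighboring frames, which is part of what makes the features robust to per-format spectral shifts. A minimal sketch of the standard delta formula over an MFCC matrix (frames x coefficients; the window size is illustrative):

```python
def delta_features(frames, N=2):
    """Delta coefficients: windowed regression slope of each MFCC
    coefficient over +/-N neighboring frames (edges are clamped)."""
    T, D = len(frames), len(frames[0])
    denom = 2 * sum(n * n for n in range(1, N + 1))
    deltas = []
    for t in range(T):
        row = []
        for d in range(D):
            num = sum(n * (frames[min(T - 1, t + n)][d] -
                           frames[max(0, t - n)][d])
                      for n in range(1, N + 1))
            row.append(num / denom)
        deltas.append(row)
    return deltas
```

Concatenating each static MFCC vector with its delta row gives the delta-grouped feature the matching module compares against the selected SFD.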
Abstract: Machine translation (hereafter in this document referred to as "MT") has faced many complex problems since its origination. Extracting multiword expressions is one of those complex problems. Finding multiword expressions while translating a sentence from English into Urdu through existing solutions takes a lot of time and occupies system resources. We have designed a simple relational data approach, in which we simply set a bit in the dictionary (database) for multiwords, to find and handle multiword expressions. This approach handles multiwords efficiently.
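The bit-flag idea can be sketched simply: each dictionary entry carries a flag marking words that can begin a multiword expression, so the matcher probes the phrase table only when the flag is set instead of testing every token. The entries and phrase below are hypothetical examples, not the paper's lexicon:

```python
# Hypothetical dictionary rows; multiword_bit marks possible MWE starters.
DICTIONARY = {
    "kick": {"multiword_bit": 1},
    "bucket": {"multiword_bit": 0},
    "the": {"multiword_bit": 0},
}
PHRASES = {("kick", "the", "bucket")}   # hypothetical multiword expression

def find_multiwords(tokens):
    """Return (start, end) spans of known multiword expressions,
    matching greedily from each flagged starter, longest first."""
    spans, i = [], 0
    while i < len(tokens):
        entry = DICTIONARY.get(tokens[i])
        matched = False
        if entry and entry["multiword_bit"]:
            for j in range(len(tokens), i + 1, -1):
                if tuple(tokens[i:j]) in PHRASES:
                    spans.append((i, j))
                    i = j
                    matched = True
                    break
        if not matched:
            i += 1
    return spans
```

Unflagged tokens cost a single hash lookup, which is where the time saving over exhaustive phrase matching comes from.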
Abstract: One of the approaches enabling people with amputated
limbs to establish some sort of interface with the real world includes
the utilization of the myoelectric signal (MES) from the remaining
muscles of those limbs. The MES can be used as a control input to a
multifunction prosthetic device. In this control scheme, known as the
myoelectric control, a pattern recognition approach is usually utilized
to discriminate between the MES signals that belong to different
classes of the forearm movements. Since the MES is recorded using
multiple channels, the feature vector size can become very large. In
order to reduce the computational cost and enhance the generalization
capability of the classifier, a dimensionality reduction method is
needed to identify an informative yet moderate size feature set. This
paper proposes a new fuzzy version of the well-known Fisher's
Linear Discriminant Analysis (LDA) feature projection technique.
Furthermore, based on the fact that certain muscles might contribute
more to the discrimination process, a novel feature weighting scheme
is also presented by employing Particle Swarm Optimization (PSO)
for estimating the weight of each feature. The new method, called
PSOFLDA, is tested on real MES datasets and compared with other
techniques to prove its superiority.
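The fuzzy formulation and the PSO feature weighting are the paper's contributions; the crisp two-class Fisher LDA they build on projects features onto the direction w = Sw^-1 (m_a - m_b), where Sw is the pooled within-class scatter. A self-contained sketch for 2-D features (the fuzzy variant would replace the means and scatter with membership-weighted versions):

```python
def lda_direction(class_a, class_b):
    """Two-class Fisher discriminant direction for 2-D feature vectors:
    w = Sw^-1 (m_a - m_b), with the 2x2 inverse written out explicitly."""
    def mean(pts):
        return [sum(p[i] for p in pts) / len(pts) for i in (0, 1)]
    ma, mb = mean(class_a), mean(class_b)
    # Pooled within-class scatter matrix Sw.
    s = [[0.0, 0.0], [0.0, 0.0]]
    for pts, m in ((class_a, ma), (class_b, mb)):
        for p in pts:
            d = [p[0] - m[0], p[1] - m[1]]
            for i in (0, 1):
                for j in (0, 1):
                    s[i][j] += d[i] * d[j]
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    diff = [ma[0] - mb[0], ma[1] - mb[1]]
    return [(s[1][1] * diff[0] - s[0][1] * diff[1]) / det,
            (-s[1][0] * diff[0] + s[0][0] * diff[1]) / det]
```

Projecting each multichannel MES feature vector onto such directions is what shrinks the feature set before classification.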
Abstract: Cognitive dissonance can be conceived of both as a concept related to the tendency to avoid internal contradictions in certain situations, and as a higher-order theory about information processing in the human mind. In recent decades, the latter sense has been strongly surpassed by the former, as nearly all experiments on the matter discuss cognitive dissonance as an output of motivational contradictions. The question therefore remains: is cognitive dissonance a process intrinsically associated with the way the mind processes information, or is it caused by such specific contradictions? Objective: To evaluate the effects of cognitive dissonance in the absence of rewards or any mechanism to manipulate motivation. Method: To address this question, we introduce a new task, the hypothetical social arrays paradigm, which was applied to 50 undergraduate students. Results: Our findings support the perspective that the human mind shows a tendency to avoid internal dissonance even when no rewards or punishments are involved. Moreover, our findings also suggest that this principle works outside the conscious level.
Abstract: This study focuses on bureau management technologies and information systems in developing countries. Developing countries use such systems to facilitate executive and organizational functions through bureau management technologies and to provide the executive staff with the necessary information.
The concepts of data and information differ from each other in developing countries, and thus the concepts of data processing and information processing are also different. Symbols represent ideas, objects, figures, letters, and numbers. A data processing system is an integrated system that processes the data related to the internal and external environment of the organization in order to make decisions, create plans, and develop strategies; it goes without saying that this system is composed of both human beings and machines. Information is obtained through the acquisition and processing of data; data, on the other hand, are raw communicative messages. Within this framework, data processing amounts to producing plausible information out of raw data.
Organizations in developing countries need to obtain information
relevant to them because rapid changes in the organizational arena
require rapid access to accurate information. The most significant
role of the directors and managers who work in the organizational
arena is to make decisions. Making a correct decision is possible only
when the directors and managers are equipped with sound ideas and
appropriate information. Therefore, acquisition, organization and
distribution of information gain significance. Today's organizations make use of computer-assisted "Management Information Systems" in order to obtain and distribute information.
The Decision Support System, which is closely related to practice, is an information system that facilitates the director's task of making decisions. A Decision Support System integrates human intelligence, information technology, and software in order to solve complex problems. With the support of computer technology and software systems, a Decision Support System produces information relevant to the decision to be made by the director and provides the executive staff with supportive ideas about the decision.
Artificial intelligence programs that transfer the knowledge and experience of people to the computer are called expert systems. An expert system stores expert information in a limited area and can solve problems by deriving rational consequences.
Bureau management technologies and information systems in developing countries create a kind of information society and information economy that gives those countries a place in the global socio-economic structure and enables them to play a reasonable and fruitful role. It is therefore of crucial importance to make use of information and management technologies in order to work together with innovative and enterprising individuals, and it is also significant to create "scientific policies" based on information and technology in the fields of economy, politics, law, and culture.
Abstract: The purpose of this paper is to propose a framework for constructing correct parallel processing programs based on the Equivalent Transformation Framework (ETF). In this framework, a problem's domain knowledge and a query are described in definite clauses, and computation is regarded as transformation of the definite clauses. Its meaning is defined by a model of the set of definite clauses, and the transformation rules generated must preserve this meaning. We have previously proposed a parallel processing method based on "specialization", a part of the operations in the transformations, which resembles substitution in logic programming. That method requires a "Memo-tree", a history of specializations, to maintain correctness. In this paper we propose a new method for specialization-based parallel processing without the Memo-tree.