Abstract: In this paper, the relative performance of spectral
classification of short exon and intron sequences from humans and
eleven model organisms is studied. In the simulations, all
combinations of sixteen one-sequence numerical representations, four
threshold values, and four window lengths are considered. Sequences
150 bases in length are chosen, and for each organism a total of
16,000 sequences are used for training and testing. Results indicate
that an appropriate combination of one-sequence numerical
representation, threshold value, and window length is essential for
arriving at the best spectral classification results. For fixed-length
sequences, the precision of exon and intron classification obtained
for different organisms differs because of their genomic
differences. In general, precision increases as sequence length
increases.
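As a minimal sketch of this kind of pipeline (the Voss binary-indicator mapping below is one common building block of such numerical representations; the paper's sixteen representations and tuned thresholds are not reproduced, and the threshold value here is an assumption):

```python
import numpy as np

def voss_indicators(seq):
    """Map a DNA string to four binary indicator sequences (the Voss
    representation), one basis for one-sequence numerical encodings."""
    return {b: np.array([1.0 if c == b else 0.0 for c in seq]) for b in "ACGT"}

def period3_power(seq):
    """Total spectral power at the period-3 frequency (k = N/3), the
    classic coding-region signature used in spectral exon classification."""
    n = len(seq)
    k = n // 3
    return sum(abs(np.fft.fft(x)[k]) ** 2 for x in voss_indicators(seq).values())

def classify(seq, threshold):
    """Label a fixed-length sequence 'exon' when its period-3 power
    exceeds the chosen threshold value."""
    return "exon" if period3_power(seq) > threshold else "intron"
```

A codon-periodic 150-base sequence shows strong period-3 power, while a homogeneous one shows essentially none, which is what the threshold separates.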
Abstract: Certifications such as the Passive House Standard aim to reduce the final space heating energy demand of residential buildings. Space conditioning, notably heating, is responsible for nearly 70% of final residential energy consumption in Europe. There is therefore significant scope for the reduction of energy consumption through improvements to the energy efficiency of residential buildings. However, these certifications totally overlook the energy embodied in the building materials used to achieve this greater operational energy efficiency. The large amount of insulation and the triple-glazed high efficiency windows require a significant amount of energy to manufacture. While some previous studies have assessed the life cycle energy demand of passive houses, including their embodied energy, these rely on incomplete assessment techniques which greatly underestimate embodied energy and can lead to misleading conclusions. This paper analyses the embodied and operational energy demands of a case study passive house using a comprehensive hybrid analysis technique to quantify embodied energy. Results show that the embodied energy is much more significant than previously thought. Also, compared to a standard house with the same geometry, structure, finishes and number of people, a passive house can use more energy over 80 years, mainly due to the additional materials required. Current building energy efficiency certifications should widen their system boundaries to include embodied energy in order to reduce the life cycle energy demand of residential buildings.
Abstract: The sub-prime mortgage crisis, which began in the US, is
regarded as the most severe economic crisis since the Great Depression
of the early 20th century. In particular, hidden problems in the
efficient operation of businesses were exposed all at once, and many
financial institutions went bankrupt or filed for court receivership.
The collapse of the physical market led to the bankruptcy of
manufacturing and construction businesses. This study analyzes the
dynamic efficiency of construction businesses over the five years
surrounding the global financial crisis. By uncovering the trends in,
and stability of, the efficiency of construction businesses, its
objective is to improve the management efficiency of construction
businesses in the ever-changing construction market. Variables were selected by
analyzing corporate information on top 20 construction businesses in
Korea and analyzed for static efficiency in 2008 and dynamic
efficiency between 2006 and 2010. Unlike other studies, this study
derives the efficiency trends and stability of construction
businesses over five years by using the DEA/Window model. From the
analysis results, efficient and inefficient companies can be
identified. In addition, relative efficiency among DMUs was measured by
comparing the relationships between the input and output variables of
the construction businesses. This study can serve as a reference for
improving the management efficiency of companies with low efficiency,
based on the efficiency analysis of construction businesses.
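As an illustrative sketch of the efficiency score underlying DEA (a basic input-oriented CCR multiplier model; the paper's actual variables and window construction are not reproduced here), window analysis would then simply treat each firm-year within a window as a separate DMU:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR (multiplier-form) efficiency of DMU o.
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs). Returns a value in (0, 1]:
    maximise u.y_o subject to v.x_o = 1 and u.y_j - v.x_j <= 0 for all j."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])          # maximise u.y_o
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]  # v.x_o = 1
    A_ub = np.hstack([Y, -X])                         # u.y_j - v.x_j <= 0
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun
```

A DMU scoring 1.0 lies on the efficiency frontier; lower scores quantify relative inefficiency against the other DMUs.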
Abstract: The cursive nature of Arabic writing makes it
difficult to accurately segment characters, or even to deal with the
whole word efficiently. Therefore, in this paper, a printed Arabic
sub-word recognition system is proposed. The suggested algorithm utilizes
geometrical moments as descriptors for the separated sub-words.
Three types of moments are investigated and applied to the printed
sub-word images after dividing each image into multiple parts using
windowing. Since moments are global descriptors, the windowing
mechanism allows the moments to be applied to local regions of the
sub-word. The local-global mixture of the proposed scheme increases
the discrimination power of the moments while keeping the
simplicity and ease of use of moments.
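A minimal sketch of this local-global windowing idea (raw geometric moments over vertical strips; the paper's three moment types and exact partitioning are assumptions not reproduced here):

```python
import numpy as np

def raw_moment(img, p, q):
    """Geometric (raw) moment m_pq = sum over pixels of x^p * y^q * I(y, x)."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    return float((xs ** p * ys ** q * img).sum())

def windowed_moments(img, n_parts, order=2):
    """Split the sub-word image into n_parts vertical strips (windowing)
    and compute all moments up to the given order per strip, so the
    globally defined moments describe local regions of the sub-word."""
    feats = []
    for strip in np.array_split(img, n_parts, axis=1):
        feats.extend(raw_moment(strip, p, q)
                     for p in range(order + 1) for q in range(order + 1 - p))
    return feats
```

Concatenating per-strip moments yields the descriptor vector whose discrimination power the windowing is meant to increase.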
Abstract: Routing security is a major concern in Wireless
Sensor Networks, since a large number of unattended nodes is deployed
in an ad hoc fashion with no possibility of global addressing, owing to
the limited memory of each node, and the nodes must be self-organizing
whenever the system requires a connection with other nodes. Security
becomes even more challenging when the nodes must act as routers
while tightly constrained in energy and computational capability,
so that existing security mechanisms cannot be fitted in
directly. These factors increase the vulnerability of the network
layer in particular, and of the whole network in general. In this paper,
a Dynamic Window Secured Implicit Geographic Forwarding
(DWSIGF) routing protocol is presented, in which a dynamic time is used
for the collection window that gathers Clear to Send (CTS) control packets
in order to find an appropriate next-hop node. DWSIGF is expected to
minimize the chance of selecting an attacker as the next-hop node,
as happens under a blackhole attack mounted through a CTS rushing
attack, and thus promises good network performance with high packet
delivery ratios.
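A toy sketch of the collection-window idea (node names, timings, and the link metric below are hypothetical, not the paper's protocol details): with a near-zero window, the first responder wins, which is exactly what a CTS-rushing attacker exploits; a longer, dynamically chosen window lets honest candidates be heard and ranked.

```python
def choose_next_hop(cts_replies, window):
    """Collect the CTS replies that arrive within the collection window and
    pick the candidate with the best link metric, rather than the first
    responder. cts_replies: list of (arrival_time, node_id, metric),
    higher metric better. Returns the chosen node_id, or None."""
    heard = [r for r in cts_replies if r[0] <= window]
    if not heard:
        return None
    return max(heard, key=lambda r: r[2])[1]
```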
Abstract: Geophysical phenomena, including river
networks and flow time series, are inherently fractal, and fractal
patterns can be investigated through their behavior. A non-linear
system such as a river basin can be well analyzed by a non-linear
measure such as fractal analysis. A bilateral study is conducted on the
fractal properties of the river network and of the river flow time
series, using a moving window technique to scan their fractal
properties. Results show that both events follow the same strategy
with regard to their fractal properties: the fractal dimensions of both
the river network and the time series tend to saturate at a distinct value.
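As an illustrative sketch of one common fractal measure (box counting; the paper's exact estimator and moving-window scan are not specified here), the dimension is the negative slope of log N(s) versus log s, where N(s) counts boxes of size s occupied by the pattern:

```python
import numpy as np

def box_count_dimension(points, sizes):
    """Estimate the box-counting fractal dimension of a 2-D point set:
    count occupied boxes N(s) at each box size s, then fit
    log N(s) ~ -D * log s and return D."""
    counts = []
    for s in sizes:
        boxes = {(int(x // s), int(y // s)) for x, y in points}
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope
```

Applied inside a moving window, the same estimator traces how the dimension evolves and saturates along the network or series.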
Abstract: Interactions among proteins underlie various
life events, so it is important to recognize and study protein
interaction sites. A control set containing 149 protein molecules
was used here. Ten features were extracted, and four sample sets
containing nine sliding windows were constructed according to these
features. The four sample sets were each processed by Radial Basis
Function neural networks optimized by Particle Swarm Optimization,
yielding four groups of results. Finally, these four groups of results
were integrated by decision fusion (DF) and Genetic Algorithm based
Selected Ensemble (GASEN). Better accuracy was obtained with DF and
GASEN, demonstrating that the integrated methods are effective.
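A minimal sketch of the sliding-window encoding described (the per-residue feature table below is hypothetical; the paper's ten features are not specified here):

```python
import numpy as np

# Hypothetical per-residue feature table (stand-in for a real property
# scale such as hydrophobicity); NOT the paper's actual features.
FEATURE = {aa: i / 20.0 for i, aa in enumerate("ACDEFGHIKLMNPQRSTVWY")}

def sliding_windows(seq, width=9):
    """Encode each residue by the feature values of a window of the given
    width centred on it (zero-padded at the ends) -- the usual input layout
    when predicting whether the central residue is an interaction site."""
    half = width // 2
    scores = [FEATURE.get(c, 0.0) for c in seq]
    padded = [0.0] * half + scores + [0.0] * half
    return np.array([padded[i:i + width] for i in range(len(seq))])
```

Each row then becomes one training sample for the RBF network, with the label indicating whether the central residue is an interaction site.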
Abstract: The paper examines the performance of bit-interleaved parity (BIP) methods in error rate monitoring, and in declaration and clearing of alarms in those transport networks that employ automatic protection switching (APS). The BIP-based error rate monitoring is attractive for its simplicity and ease of implementation. The BIP-based results are compared with exact results and are found to declare the alarms too late, and to clear the alarms too early. It is concluded that the standards development and systems implementation should take into account the fact of early clearing and late declaration of alarms. The window parameters defining the detection and clearing thresholds should be set so as to build sufficient hysteresis into the system to ensure that BIP-based implementations yield acceptable performance results.
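As a brief illustration of the monitoring primitive (BIP-8 as standardized for SDH/SONET-style transport; the alarm-threshold logic itself is not reproduced here):

```python
from functools import reduce

def bip8(payload: bytes) -> int:
    """BIP-8: even bit-interleaved parity over a frame, computed as the XOR
    of all payload bytes; bit i of the result makes the count of ones in
    all bit-i positions even."""
    return reduce(lambda a, b: a ^ b, payload, 0)

def bip_violations(sent_bip: int, recomputed_bip: int) -> int:
    """Number of mismatched parity bits between the transmitted BIP byte
    and the one recomputed at the receiver -- the per-frame error count
    fed into the windowed alarm declaration/clearing thresholds."""
    return bin(sent_bip ^ recomputed_bip).count("1")
```

Because each parity bit can flag at most one error per frame, this count underestimates high error rates, which is one reason BIP-based monitoring declares alarms late and clears them early.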
Abstract: The continuously growing needs of Internet applications
that transmit massive amounts of data have led to the emergence of
high-speed networks. Data transfer must take place without
congestion, so feedback parameters must be conveyed from the
receiver to the sender in order to restrict the sending rate and
avoid congestion. Although TCP tries to avoid congestion by
restricting the sending rate and window size, it never informs the
sender of the capacity of data that can be sent, and it halves the
window size at the time of congestion, resulting in decreased
throughput, low utilization of the bandwidth, and maximum delay. In
this paper, the XCP protocol is used: feedback parameters are
calculated based on the arrival rate, service rate, traffic rate, and
queue size, and the receiver informs the sender about the throughput,
the capacity of data that can be sent, and the window size
adjustment. This avoids drastic decreases in window size and allows a
better increase in the sending rate, so that data flows continuously
without congestion. As a result, throughput is maximized, bandwidth
utilization is high, and delay is minimal. The results of the proposed
work are presented as graphs of throughput, delay, and window size.
Thus, in this paper, the XCP protocol is illustrated and its various
parameters are thoroughly analyzed and presented.
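As an illustrative sketch (not this paper's exact formulation), the XCP router computes, once per average RTT, an aggregate feedback combining spare bandwidth and persistent queue; the constants 0.4 and 0.226 are those of the original XCP proposal, and the function name and units here are assumptions:

```python
def xcp_aggregate_feedback(capacity, input_rate, queue, avg_rtt,
                           alpha=0.4, beta=0.226):
    """XCP-style aggregate feedback per control interval:
    phi = alpha * rtt * spare_bandwidth - beta * persistent_queue.
    Positive phi tells senders to grow their congestion windows;
    negative phi shrinks them gradually instead of TCP's halving.
    Rates in bytes/s, queue in bytes, rtt in seconds."""
    spare = capacity - input_rate   # spare bandwidth on the link
    return alpha * avg_rtt * spare - beta * queue
```

The aggregate feedback is then apportioned among flows and carried in packet headers, which is how the sender learns the capacity-aware window adjustment.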
Abstract: An image texture analysis and target recognition approach using an improved texture feature coding method (TFCM) and a Support Vector Machine (SVM) for target detection is presented. With the proposed target detection framework, targets of interest can be detected accurately. A Cascade-Sliding-Window technique was also developed for automated target localization. Application to mammograms showed that over 88% of normal mammograms and 80% of abnormal mammograms can be correctly identified. The approach was also successfully applied to Synthetic Aperture Radar (SAR) and Ground Penetrating Radar (GPR) images for target detection.
Abstract: We present a general comparison of punctual-kriging-based image restoration for different neighbourhood sizes. The technique under consideration is based on punctual kriging and fuzzy concepts for image restoration in the spatial domain. Three different neighbourhood windows are considered for estimating the semivariance at different lags, in order to study their effect on the reduction of the negative weights that arise in punctual kriging and, consequently, on the restoration of degraded images. Our results show that the effect of neighbourhood sizes larger than 5x5 on the reduction of negative weights is insignificant. In addition, image quality measures, such as structural similarity indices, peak signal-to-noise ratios, and new variogram-based quality measures, show that a 3x3 window size gives better performance than larger window sizes.
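A sketch of the semivariance estimation over a neighbourhood window, assuming the classical empirical estimator gamma(h) = (sum of squared differences over all pixel pairs at lag h) / (2 * number of pairs):

```python
import numpy as np

def semivariance(window, lag):
    """Empirical semivariance at integer lag h within a neighbourhood
    window, using all pixel pairs separated by h horizontally or
    vertically: gamma(h) = sum((z_i - z_j)^2) / (2 * n_pairs)."""
    diffs = []
    diffs.extend((window[:, lag:] - window[:, :-lag]).ravel())
    diffs.extend((window[lag:, :] - window[:-lag, :]).ravel())
    diffs = np.array(diffs, dtype=float)
    return float((diffs ** 2).sum() / (2 * diffs.size))
```

Fitting a variogram model to these lag estimates supplies the kriging weights; larger windows add more pairs per lag, which is what the 3x3 versus 5x5 comparison probes.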
Abstract: Client-server systems that use mobile
communications networks for data transmission have become very
attractive to many economic agents for promoting and offering
electronic services to their clients. E-services are suitable for
developing business and increasing financial benefits. Products
or services can be efficiently delivered to a large number of clients
using mobile Internet access technologies. Clients can access
e-services anywhere and anytime, supported by the channel bandwidth,
data services, and protocols of 3G, GPRS, WLAN, etc.
Based on the evolution and development of mobile communications
networks, a convergence of the technological and financial interests
of mobile operators, software developers, mobile terminal producers,
and e-content providers has been established. This will lead to a high
level of integration of IT&C resources and will facilitate the
delivery of value-added services through mobile communications
networks. This paper presents a client-server system for e-services
access, with mobile software applications for Smartphones and PDAs
installed on the Symbian and Windows Mobile operating systems.
Abstract: Noise contamination in a magnetic resonance (MR)
image can occur during acquisition, storage, and transmission, and
effective filtering is required to avoid repeating the MR
procedure. In this paper, an iterative asymmetrical triangle fuzzy
filter with moving average center (ATMAVi filter) is used to reduce
different levels of salt and pepper noise in a brain MR image. Besides
visual inspection on filtered images, the mean squared error (MSE) is
used as an objective measurement. When compared with the median
filter, simulation results indicate that the ATMAVi filter is effective
especially for filtering a higher level noise (such as noise density =
0.45) using a smaller window size (such as 3x3) when operated
iteratively or using a larger window size (such as 5x5) when operated
non-iteratively.
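As a rough illustration of the comparison baseline (not the ATMAVi filter itself, whose fuzzy membership functions are not reproduced here), the following sketches salt-and-pepper corruption, a plain median filter, and the MSE measure:

```python
import numpy as np

def add_salt_pepper(img, density, rng):
    """Corrupt a fraction `density` of pixels with 0 (pepper) or 255 (salt)."""
    noisy = img.copy()
    mask = rng.random(img.shape) < density
    noisy[mask] = rng.choice([0, 255], size=mask.sum())
    return noisy

def median_filter(img, k=3):
    """Plain k x k median filter (edge pixels left unchanged)."""
    h = k // 2
    out = img.copy()
    for i in range(h, img.shape[0] - h):
        for j in range(h, img.shape[1] - h):
            out[i, j] = np.median(img[i - h:i + h + 1, j - h:j + h + 1])
    return out

def mse(a, b):
    """Mean squared error, the objective measure used alongside inspection."""
    return float(np.mean((a.astype(float) - b.astype(float)) ** 2))
```

Swapping the median step for the ATMAVi update, and iterating the filter, reproduces the kind of comparison reported in the abstract.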
Abstract: Automatic Speech Recognition (ASR) applied to the
Arabic language is a challenging task. This is mainly due to the
language's specificities, which confront researchers with multiple
difficulties such as insufficient linguistic resources and the very
limited number of available transcribed Arabic speech corpora. In
this paper, we are interested in the development of a HMM-based
ASR system for Standard Arabic (SA) language. Our fundamental
research goal is to select the most appropriate acoustic parameters
describing each audio frame, acoustic models and speech recognition
unit. To achieve this purpose, we analyze the effect of varying frame
windowing (size and period), acoustic parameter number resulting
from features extraction methods traditionally used in ASR, speech
recognition unit, Gaussian number per HMM state and number of
embedded re-estimations of the Baum-Welch Algorithm. To evaluate
the proposed ASR system, a multi-speaker SA connected-digits
corpus is collected, transcribed and used throughout all experiments.
A further evaluation is conducted on a speaker-independent continuous
SA speech corpus. The phoneme recognition rate is 94.02%, which is
relatively high compared with another ASR system
evaluated on the same corpus.
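A minimal sketch of the frame-windowing step whose size and period are varied in the experiments (the 25 ms size and 10 ms period defaults below are common ASR choices assumed for illustration, not the paper's tuned values):

```python
import numpy as np

def frame_signal(signal, rate, size_ms=25.0, period_ms=10.0):
    """Split a speech signal into overlapping frames -- frame size and
    shift period are the windowing parameters under study -- and apply a
    Hamming window to each frame before acoustic feature extraction."""
    size = int(rate * size_ms / 1000)   # samples per frame
    step = int(rate * period_ms / 1000) # samples between frame starts
    n = 1 + max(0, (len(signal) - size) // step)
    win = np.hamming(size)
    return np.array([signal[i * step:i * step + size] * win
                     for i in range(n)])
```

Each windowed frame then feeds the feature extraction (e.g. MFCCs) whose parameter count is another factor varied in the experiments.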
Abstract: Real-time embedded systems should benefit from
component-based software engineering to handle complexity and
deal with dependability. In these systems, applications should not
only be logically correct but also behave within time windows.
However, in current component-based software engineering
approaches, few component models handle time properties in
a manner that allows efficient analysis and checking at the
architectural level. In this paper, we present a meta-model for
component-based software description that integrates timing
issues. To achieve a complete functional model of software
components, our meta-model focuses on four functional aspects:
interface, static behavior, dynamic behavior, and interaction
protocol. With each aspect we have explicitly associated a time
model. Such a time model can be used to check a component's
design against certain properties and to compute the timing
properties of component assemblies.
Abstract: Advances in computing applications in recent years
have prompted demand for more flexible scheduling models for
QoS. Moreover, in practical applications, partially violated
temporal constraints can be tolerated if the violations follow a
certain distribution, so the traditional Liu and Layland model must be
extended to accommodate these circumstances. Two such extensions are
the (m, k)-firm model and the Window-Constrained model. This paper
investigates weakly hard real-time constraints and their combination
to support QoS. The fact that a practical application can tolerate some
violations of its temporal constraints under a certain distribution is
exploited to support adaptive QoS in open real-time systems. The
experimental results show that these approaches are effective compared
with traditional scheduling algorithms.
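The (m, k)-firm constraint itself can be checked in a few lines; this sketch assumes one boolean per job instance indicating whether its deadline was met:

```python
def satisfies_mk_firm(met_deadlines, m, k):
    """Check a weakly hard (m, k)-firm constraint: in every window of k
    consecutive job instances, at least m must meet their deadlines.
    met_deadlines: sequence of booleans, one per job instance."""
    return all(sum(met_deadlines[i:i + k]) >= m
               for i in range(len(met_deadlines) - k + 1))
```

The Window-Constrained model is the dual bookkeeping, bounding the losses allowed per fixed window rather than the hits required per sliding window.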
Abstract: This article is an extension and a practical application
of Wheeler's NEBIC (Net Enabled Business Innovation Cycle) theory.
NEBIC theory is a new approach in IS research and can be applied to
dynamic environments involving new technology. Firms can follow
market changes rapidly with the support of IT resources. Flexible
firms adapt their market strategies and respond more quickly to
customers' changing behaviors. When every leading firm in an industry
has access to the same IT resources, the way those IT resources are
managed determines a firm's competitive advantage or disadvantage.
From the dynamic capabilities perspective, and from the NEBIC theory
newly introduced by Wheeler, we know that IT resources alone cannot
deliver customer value, but that a good configuration of those
resources can guarantee customer value by choosing the right emerging
technology and grasping economic opportunities through business
innovation and growth. We found evidence in the literature that SOA
(Service Oriented Architecture) is a promising emerging technology
that can deliver the desired economic opportunity through modularity,
flexibility, and loose coupling. SOA can also help firms connect in
networks, which can open a new window of opportunity to collaborate
in innovation and the right kind of outsourcing.
Abstract: The Power Spectral Density (PSD) computed by taking the Fourier transform of auto-correlation functions (Wiener-Khintchine theorem) gives better results for noisy data than the Periodogram approach does. However, the computational complexity of the Wiener-Khintchine approach is greater than that of the Periodogram approach. For the computation of the short-time Fourier transform (STFT), this problem becomes even more prominent, since the PSD must be recomputed after every shift of the analysis window. In this paper, a recursive version of the Wiener-Khintchine theorem is derived by using the sliding DFT approach meant for computation of the STFT. The computational complexity of the proposed recursive Wiener-Khintchine algorithm, for a window size of N, is O(N).
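A sketch of the sliding-DFT update on which such a recursive scheme can be built (the paper's full recursive Wiener-Khintchine derivation is not reproduced; a per-window PSD estimate would then follow from |X_k|^2 / N):

```python
import numpy as np

def sliding_dft_step(X, x_old, x_new, N):
    """One sliding-DFT update: given the length-N DFT X of the previous
    window, remove the outgoing sample and rotate in the incoming one,
    X_k <- (X_k - x_old + x_new) * exp(j*2*pi*k/N),
    costing O(N) per one-sample shift instead of an O(N log N) FFT."""
    k = np.arange(N)
    return (X - x_old + x_new) * np.exp(2j * np.pi * k / N)
```

Seeding with one full FFT and applying this update per shift yields the STFT spectra at O(N) per window, matching the stated complexity.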
Abstract: The inherently iterative nature of product design and development poses a significant challenge to reducing the product design and development (PD) time. In order to shorten the time to market, organizations have adopted concurrent development, where multiple specialized tasks and design activities are carried out in parallel. The iterative nature of the work, coupled with the overlap of activities, can result in unpredictable completion times and significant rework. Many products have missed their time-to-market window due to unanticipated, or rather unplanned, iteration and rework. The iterative and often overlapped processes introduce greater ambiguity into design and development, where traditional project management methods and tools provide less value. In this context, identifying critical metrics for understanding iteration probability is an open research area where a significant contribution can be made, given that iteration has been the key driver of cost and schedule risk in PD projects. Two important questions that the proposed study attempts to address are: Can we predict and identify the number of iterations in a product development flow? Can we provide managerial insights for better control over iteration? The proposal introduces the concept of decision points and uses it to develop metrics that can provide managerial insight into iteration predictability. By characterizing the product development flow as a network of decision points, the proposed research intends to delve further into iteration probability and to provide more clarity.
Abstract: This paper describes the process used in the
automation of the Maritime UAV commands using the Kinect sensor.
The AR Drone is a quadrocopter manufactured by Parrot [1], designed to be
controlled using Apple operating systems on devices such as the iPhone
and iPad. However, this project uses the Microsoft Kinect SDK and
Microsoft Visual Studio C# (C sharp) software, which are compatible
with the Windows operating system, for the automation of the navigation
and control of the AR Drone.
The navigation and control software for the quadrocopter runs on
a Windows 7 computer. The project is divided into two sections: the
quadrocopter control system and the Kinect sensor control system.
The Kinect sensor is connected to the computer using a USB cable,
through which commands can be sent to and from the Kinect sensor.
The AR Drone has Wi-Fi capability, through which it can be connected
to the computer to enable the transfer of commands to and from the
quadrocopter.
The project was implemented in C#, a programming language that
is commonly used in automation systems. The language was
chosen because more libraries are already established in C# for
both the AR Drone and the Kinect sensor.
The study will contribute toward research in the automation of
systems using a quadrocopter and the Kinect sensor for navigation
with a human operator in the loop. The prototype created has
numerous applications, including the inspection of vessels
such as ships and aircraft, and of areas that are not accessible to
human operators.