Abstract: This paper presents an algorithm for reconstructing the phase and magnitude responses of an impulse response when only the output data are available. The system is driven by a zero-mean, independent, identically distributed (i.i.d.) non-Gaussian sequence that is not observed. The additive noise is assumed to be Gaussian. This is an important and essential problem in many practical applications across science and engineering, such as biomedical, seismic, and speech signal processing. The method is based on evaluating the bicepstrum of the third-order statistics of the observed output data. Simulation results are presented that demonstrate the performance of this method.
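The core quantity behind the bicepstrum approach is the third-order cumulant of the observed output. As a minimal sketch (not the authors' full algorithm — lag ranges, FFT sizes, and the cepstral recursion are omitted), a biased cumulant estimator for non-negative lags might look like this; the function name is ours, not from the paper:

```python
import numpy as np

def third_order_cumulant(x, tau1, tau2):
    """Estimate C3(tau1, tau2) = E[x(n) x(n+tau1) x(n+tau2)]
    for a zero-mean sequence x, non-negative lags (biased estimator)."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                      # enforce zero mean
    n = len(x)
    m = max(0, tau1, tau2)
    idx = np.arange(n - m)
    return np.mean(x[idx] * x[idx + tau1] * x[idx + tau2])

# For a symmetric (e.g., Gaussian) sequence the third-order cumulants
# vanish — which is why Gaussian additive noise does not contaminate
# the estimate — while a skewed non-Gaussian input keeps them nonzero.
rng = np.random.default_rng(0)
gauss = rng.standard_normal(200_000)
skewed = rng.exponential(1.0, 200_000)    # non-Gaussian, skewed
print(abs(third_order_cumulant(gauss, 0, 0)) < 0.05)   # near zero
print(third_order_cumulant(skewed, 0, 0) > 1.0)        # clearly nonzero
```

The Gaussian-noise immunity shown here is exactly the property the abstract relies on: third-order statistics of Gaussian processes are identically zero.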
Abstract: Enzymatic hydrolysis is one of the major steps involved in the conversion of sugarcane bagasse to ethanol. This process offers the potential for higher yields and selectivity, lower energy costs and milder operating conditions than chemical processes. However, factors such as lignin content, the crystallinity degree of the cellulose, and particle sizes limit the digestibility of the cellulose present in lignocellulosic biomasses. Pretreatment aims to improve the access of the enzyme to the substrate. In this study, sugarcane bagasse was submitted to a chemical pretreatment that consisted of two consecutive steps: the first with dilute sulfuric acid (1 % (v/v) H2SO4), and the second with alkaline solutions with different concentrations of NaOH (1, 2, 3 and 4 % (w/v)). Thermal Analysis (TG/DTG and DTA) was used to evaluate the hemicellulose, cellulose and lignin contents in the samples. Scanning Electron Microscopy (SEM) was used to evaluate the morphological structures of the in natura and chemically treated samples. Results showed that the pretreatments were effective in the chemical degradation of the lignocellulosic materials of the samples, and it was also possible to observe the morphological changes occurring in the biomasses after the pretreatments.
Abstract: In the 1980s, companies began to feel the effect of three major influences on their product development: newer and innovative technologies, increasing product complexity, and larger organizations. Companies were therefore forced to look for new product development methods. This paper focuses on two such methods: Design for Manufacturability (DFM) and Concurrent Engineering (CE). The aim of this paper is to analyze different product development methods, specifically DFM and CE. Companies can benefit by minimizing the product life cycle and cost and by meeting delivery schedules. This paper also presents simplified models that can be modified and used by different companies based on their objectives and requirements. The methodology followed in this research is case-study based. Two companies were examined and analyzed with respect to their product development processes. Historical data were gathered and interviews were conducted at these companies; in addition, a survey of the literature and of previous research on similar topics was carried out during this research. This paper also presents a cost-benefit analysis of implementation and estimates the implementation time. From this research, it has been found that the two companies did not achieve the delivery time to the customer. Some of the most frequently produced products were analyzed, and 50% to 80% of them were not delivered on time to the customers. The companies follow the traditional way of product development, that is, a sequential design and production method, which strongly affects time to market. In the case study, it was found that by implementing these new methods and by forming multidisciplinary teams for design and quality inspection, the company can reduce the workflow steps from 40 to 30.
Abstract: The goal of a network-based intrusion detection
system is to classify activities of network traffics into two major
categories: normal and attack (intrusive) activities. Nowadays, data
mining and machine learning play an important role in many
sciences, including intrusion detection systems (IDS), using both
supervised and unsupervised techniques. However, one of the
essential steps of data mining is feature selection, which helps
improve the efficiency, performance and prediction rate of the
proposed approach. This paper applies the unsupervised K-means
clustering algorithm with information gain (IG) for feature selection
and reduction to build a network intrusion detection system. For our
experimental analysis, we have used the new NSL-KDD dataset,
which is a modified version of the KDDCup 1999 intrusion detection
benchmark dataset. With a split of 60.0% for the training set and the
remainder for the testing set, a two-class classification (Normal vs.
Attack) was implemented. The Weka framework, a Java-based open-source
collection of machine learning algorithms for data mining tasks, was
used in the testing process. The experimental results show that the
proposed approach is
very accurate, with a low false positive rate and a high true positive
rate, and it takes less learning time in comparison with using the full
features of the dataset with the same algorithm.
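The information-gain ranking used above for feature reduction can be sketched as follows. This is a minimal illustration, not the Weka implementation; feature discretization and the K-means clustering stage are omitted, and the toy data are ours:

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (bits) of a discrete label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, labels):
    """IG(class; feature) = H(class) - H(class | feature),
    for a discrete feature column."""
    h = entropy(labels)
    vals, counts = np.unique(feature, return_counts=True)
    weights = counts / counts.sum()
    cond = sum(w * entropy(labels[feature == v])
               for v, w in zip(vals, weights))
    return h - cond

# Toy 2-class data: feature 0 determines the class, feature 1 is noise.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 1, 1])
gains = [information_gain(X[:, j], y) for j in range(X.shape[1])]
print(gains)  # feature 0 has IG = 1 bit, feature 1 has IG = 0
```

Features would then be ranked by IG and only the top-scoring ones kept before clustering, which is what shrinks the learning time relative to the full feature set.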
Abstract: In Content-Based Image Retrieval systems it is
important to use an efficient indexing technique in order to
accelerate the search in huge databases. The chosen indexing
technique should also support the high dimensionality of image features.
In this paper we present the hierarchical index NOHIS-tree
(Non-Overlapping Hierarchical Index Structure) as we scale up to
very large databases. We also present a study of the influence of clustering
on search time. The performance test results show that NOHIS-tree
performs better than the SR-tree. Tests also show that NOHIS-tree
maintains its performance in high-dimensional spaces. We also include
performance tests that determine the number of clusters in
NOHIS-tree that yields the best search time.
Abstract: The Mobile IP standard has been developed to support mobility over the Internet. This standard has several drawbacks: packets may be routed via sub-optimal paths, and a significant amount of signaling messages is generated by the home registration procedure, which keeps the network aware of the current location of the mobile nodes. Recently, a dynamic hierarchical mobility management strategy for mobile IP networks (DHMIP) has been proposed to reduce home registration costs. However, this strategy induces a packet delivery delay and increases the risk of packet loss. In this paper, we propose an enhanced version of the dynamic hierarchical strategy that reduces the packet delivery delay and minimizes the risk of packet loss. Preliminary results obtained from simulations are promising. They show that the enhanced version outperforms the original dynamic hierarchical mobility management strategy.
Abstract: This paper explores the idea of globalisation and
considers accounting-s role in that process in order to develop new
spaces for accounting research. That is why in this paper we are
looking for questions, not necessarily for answers. Adopting an
'alternative' view of accounting means that we see
accounting as a social and evolutionary process, one that pays heed to those
voices arguing for greater social and environmental justice, and that
draws attention to the role of accounting researchers in the process of
globalisation. The paper defines globalisation and expands the
globalisation and accounting research agenda introducing in this
context the harmonization process in accounting. Two main
systems are competing to become the
benchmark: GAAP and IFRS. Each has its advantages and
drawbacks as a candidate. For this reason, a convergence of
the two, combining their strengths and weaknesses, could
be the path to a unique international accounting solution. Is this
idea realizable, what steps have been made so far, and what should be
done in the future? The paper emphasises the role of cultural
differences in the process of imposing a unique international
accounting system by the global organizations.
Abstract: This paper develops a pedometer with a three-axis acceleration sensor that can be placed at any angle. The proposed pedometer measures the number of steps while users walk, jog or run. It can be worn on the user's waistband or placed in a pocket or backpack. The work addresses the shortcomings of general pedometers, which can only be used in a single orientation or can only count steps without a continuous-exercise judgment mechanism. Finally, experimental results confirm the superior performance of the proposed pedometer.
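The angle-independent counting described above can be approximated by thresholding the acceleration-vector magnitude, which is invariant to device orientation. This is an illustrative sketch, not the paper's algorithm; the threshold and refractory interval are assumed tuning values:

```python
import numpy as np

def count_steps(ax, ay, az, fs, thresh=1.2, min_interval=0.3):
    """Count steps from 3-axis accelerometer samples (in g).
    Orientation-independent: uses the magnitude |a|, so the device
    can be worn at any angle. thresh (g) and min_interval (s) are
    illustrative parameters, not values from the paper."""
    mag = np.sqrt(np.asarray(ax)**2 + np.asarray(ay)**2 + np.asarray(az)**2)
    min_gap = int(min_interval * fs)      # refractory period in samples
    steps, last = 0, -min_gap
    for i in range(1, len(mag)):
        # rising crossing of the threshold, far enough from the last step
        if mag[i-1] < thresh <= mag[i] and i - last >= min_gap:
            steps += 1
            last = i
    return steps

# Synthetic walk: 2 Hz step rate on top of 1 g gravity, sampled at 50 Hz.
fs = 50
t = np.arange(0, 5, 1/fs)
mag_signal = 1.0 + 0.5 * np.sin(2*np.pi*2*t)  # fed in as a single axis
n = count_steps(mag_signal, 0*t, 0*t, fs)
print(n)  # → 10 steps in 5 s
```

The refractory period plays the role of the "continuous exercise judgment" mentioned in the abstract in a crude form: isolated jolts spaced closer than a plausible step interval are not counted.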
Abstract: In pattern recognition applications, low-level segmentation and high-level object recognition are generally considered as two separate steps. The paper presents a method that bridges the gap between low-level segmentation and high-level object recognition. It is based on a Bayesian network representation and a network propagation algorithm. At the low level it uses a hierarchical structure of quadratic spline wavelet image bases. The method is demonstrated on a simple circuit diagram component identification problem.
Abstract: In Egypt, the concept of Asset Management (AM) is
new; however, the need for applying it has become crucial because
deteriorating or losing an asset is unaffordable in a developing
country like Egypt. Therefore, the current study focuses on
educational buildings as one of the most important assets regarding
planning, building, operating and maintenance expenditures. The
main objective of this study is to develop a Strategic Asset Management
Framework (SAMF) for educational buildings in Egypt. The General
Authority for Educational Buildings
(GAEB) was chosen as a case study of the current research as it
represents the biggest governmental organization responsible for
planning, operating and maintaining schools in Egypt. To achieve the
research objective, structured interviews were conducted with senior
managers of GAEB, using a pre-designed questionnaire, to explore the
current practice of AM. Gap analysis was applied against
best practices compiled from a vast literature review to identify
gaps between current practices and desired ones. These
steps mainly revealed: limited knowledge about strategic asset
management, no clear goals, no training, no real risk plan, and a lack
of data and of technical and financial resources. Based on the findings, a
SAMF for GAEB was introduced, and its implementation
steps and assessment techniques were explained in detail.
Abstract: Continuously variable transmission (CVT) is a type of
automatic transmission that can change the gear ratio to any arbitrary
setting within the limits. The most common type of CVT operates on
a pulley system that allows an infinite variability between highest and
lowest gears with no discrete steps. However, the current CVT
system with hydraulic actuation method suffers from the power loss.
It needs continuous force for the pulley to clamp the belt and hold the
torque, resulting in a large amount of energy consumption. This study
focused on the development of an electromechanically actuated
CVT to eliminate the problems faced by the existing CVT. It was
conducted in several steps: computing and selecting the
appropriate sizing for the stroke length, the lead screw system, etc. From
visual observation it was found that the CVT system developed in this
research is satisfactory.
Abstract: Using ab initio theoretical calculations, we present an
analysis of the fragmentation process. The analysis is performed in two
steps. The first step is the calculation of fragmentation energies by ab
initio calculations. The second step is the application of the energies to
a kinetic description of the process. The energies of the fragments are
presented in this paper. The kinetics of the fragmentation process can be
described by numerical models. The method for kinetic analysis is
described in this paper. The result, the composition of the fragmentation
products, will be calculated in future work. The results from the model
can be compared to the concentrations of fragments from the mass spectrum.
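As a toy stand-in for the numerical kinetic models mentioned above (the actual rate constants would be derived from the ab initio fragmentation energies, e.g. via an Arrhenius-type expression, which is an assumption here), first-order fragmentation kinetics can be integrated numerically as:

```python
def fragment_kinetics(k, c0, t_end, dt=1e-3):
    """Forward-Euler integration of first-order fragmentation:
    dC_parent/dt = -k * C_parent,  dC_frag/dt = +k * C_parent.
    A minimal illustration, not the paper's model; k is assumed
    to come from the computed fragmentation energies."""
    parent, frag = c0, 0.0
    for _ in range(int(t_end / dt)):
        d = k * parent * dt   # amount fragmented in this time-step
        parent -= d
        frag += d
    return parent, frag

parent, frag = fragment_kinetics(k=2.0, c0=1.0, t_end=1.0)
print(round(parent, 3), round(frag, 3))  # ≈ exp(-2) ≈ 0.135 and 0.865
```

The computed fragment concentrations from such a model are what would be compared against the mass-spectrum intensities, as the abstract proposes.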
Abstract: In this study, we present a new and fast algorithm for lung segmentation using CTA images. This process is quite important, especially for lung vessel segmentation, detection of pulmonary emboli, finding nodules, or segmentation of airways. The proposed method is carried out in four steps. In the first step, optimal thresholding is applied to the images. In the second, the subsegmental vessels, which lie in the lung region and are of small dimension, are removed. In the third, identification and segmentation of the lungs and airway edges are carried out. Lastly, by removing the airways, the lung segmentation is obtained.
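The first step, optimal thresholding, is commonly implemented with Otsu's method; the abstract does not name the variant used, so the following is an assumed illustration rather than the authors' implementation:

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: choose the threshold that maximizes the
    between-class variance of the intensity histogram. One common
    'optimal threshold'; the paper may use a different variant."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, 0.0
    w0, sum0 = 0, 0.0
    for t in range(256):
        w0 += hist[t]                 # pixels at or below t
        if w0 == 0 or w0 == total:
            continue
        sum0 += t * hist[t]
        w1 = total - w0
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Two well-separated intensity populations (e.g., air vs. tissue in CT).
rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(50, 5, 5000), rng.normal(200, 5, 5000)])
img = np.clip(img, 0, 255)
t = otsu_threshold(img)
print(50 < t < 200)  # threshold falls in the gap between the two modes
```

In CT data the air-filled lung parenchyma is far darker than surrounding tissue, which is why a single global optimal threshold is an effective first step.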
Abstract: In this paper, the shape design process is briefly discussed, emphasizing the use of topology optimization in the conceptual design stage. The basic idea is to view feasible domains through sensitivity region concepts. In this method, the main process consists of two steps: as the design moves further inside the feasible domain using the Taguchi method, the topology optimization becomes more successful and the sensitivity region becomes larger. In designing a double-eccentric butterfly valve, issues related to hydrodynamic performance and disc structure are discussed, where the use of topology optimization has proven to dramatically improve an existing design and significantly decrease the development time of a shape design. Computational Fluid Dynamics (CFD) analysis results demonstrate the validity of this approach.
Abstract: The Time-Domain Boundary Element Method (TD-BEM)
is a well-known numerical technique that properly handles
dynamic analyses considering infinite-dimension media.
However, when these analyses are also related to nonlinear behavior,
very complex numerical procedures arise considering the TD-BEM,
which may turn its application prohibitive. In order to avoid this
drawback and model nonlinear infinite media, the present work
couples two BEM formulations, aiming to achieve the best of two
worlds. In this context, the regions expected to behave nonlinearly
are discretized by the Domain Boundary Element Method (D-BEM),
which has a simpler mathematical formulation but is unable to deal
with infinite-domain analyses; the TD-BEM is employed in the
sense of an effective non-reflexive boundary. An iterative procedure
is considered for the coupling of the TD-BEM and D-BEM, which is
based on a relaxed renewal of the variables at the common interfaces.
Elastoplastic models are focused on, and different time-steps are
allowed for each BEM formulation in the coupled analysis.
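The relaxed interface iteration can be sketched generically: each solver maps interface values to updated values, and the renewal is under-relaxed until the interface converges. The functions, the relaxation factor, and the toy operators below are placeholders, not the D-BEM/TD-BEM operators themselves:

```python
import numpy as np

def relaxed_coupling(apply_A, apply_B, u0, alpha=0.5, tol=1e-10, max_it=200):
    """Generic relaxed interface iteration:
    u <- (1 - alpha) * u + alpha * G(u), repeated until the interface
    values stop changing. apply_A / apply_B stand in for the D-BEM
    and TD-BEM solves of the two regions (assumed placeholders)."""
    u = np.asarray(u0, dtype=float)
    for k in range(max_it):
        g = apply_B(apply_A(u))           # one pass through both regions
        u_new = (1 - alpha) * u + alpha * g
        if np.max(np.abs(u_new - u)) < tol:
            return u_new, k + 1
        u = u_new
    return u, max_it

# Toy stand-ins: a contraction whose interface fixed point is u = 1.
A = lambda u: 0.5 * u + 0.5
B = lambda u: u
u, iters = relaxed_coupling(A, B, np.zeros(3))
print(np.allclose(u, 1.0), iters < 200)
```

Under-relaxation (alpha < 1) trades speed for robustness; for stiff nonlinear interface exchanges it keeps the fixed-point iteration from diverging.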
Abstract: One of the significant factors for improving the
accuracy of Land Surface Temperature (LST) retrieval is the correct
understanding of the directional anisotropy for thermal radiance. In
this paper, the multiple scattering effect between heterogeneous
non-isothermal surfaces is described rigorously according to the
concept of configuration factor, based on which a directional thermal
radiance model is built, and the directional radiant character for urban
canopy is analyzed. The model is applied to a simple urban canopy
with row structure to simulate the change of Directional Brightness
Temperature (DBT). The results show that the DBT is aggrandized
because of the multiple scattering effects, whereas the change range of
DBT is smoothed. The temperature difference, spatial distribution, and
emissivity of the components can all lead to changes in the DBT. The
“hot spot” phenomenon occurs when the proportion of the
high-temperature component in the field of view reaches its maximum.
On the other hand, the “cool spot” phenomenon occurs when the
low-temperature proportion reaches its maximum. The “spot” effect
disappears only when the proportion of every component remains
invariant. The model
built in this paper can be used for the study of directional effect on
emissivity, the LST retrieval over urban areas and the adjacency effect
of thermal remote sensing pixels.
Abstract: In this work we present a solution for DAGC (Digital
Automatic Gain Control) in WLAN receivers compatible with the IEEE
802.11a/g standards. Those standards define communication in the
5/2.4 GHz bands using the Orthogonal Frequency Division Multiplexing
(OFDM) modulation scheme. The WLAN transceiver that we have used
enables gain control over a Low Noise Amplifier (LNA) and a
Variable Gain Amplifier (VGA). The control over those signals is
performed in our digital baseband processor using a dedicated hardware
block, the DAGC. The DAGC is used to automatically control the VGA
and LNA in order to achieve a better signal-to-noise ratio, decrease
the FER (Frame Error Rate) and hold the average power of the baseband
signal close to the desired set point. The DAGC function in the
baseband processor is performed in a few steps: measuring the power
levels of baseband samples of an RF signal, accumulating the
differences between the measured power level and the actual gain
setting, adjusting a gain factor of the accumulation, and applying the
adjusted gain factor to the baseband values. Based on measurements of
the dependence of the RSSI signal on input power, we have concluded
that this digital AGC can be implemented using a simple linearization
of the RSSI. This solution is very simple yet effective, and reduces
the complexity and power consumption of the DAGC. This DAGC is
implemented and tested both in FPGA and in ASIC as part of our WLAN
baseband processor. Finally, we have integrated this circuit in a
compact WLAN PCMCIA board based on MAC and baseband ASIC chips
designed by us.
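The measure-accumulate-adjust loop described above can be sketched as follows. This is illustrative only; the loop gain k, the dB bookkeeping, and the set point are assumptions, not the hardware block's actual parameters:

```python
def dagc_step(sample_power_db, set_point_db, gain_db, acc, k=0.25):
    """One iteration of a digital AGC loop (illustrative sketch):
    accumulate the error between measured baseband power and the set
    point, then correct the gain by a scaled amount. k is an assumed
    loop gain, not a value from the paper."""
    error = set_point_db - sample_power_db   # positive -> signal too weak
    acc += error                             # accumulated difference
    gain_db += k * error                     # scaled gain correction
    return gain_db, acc

# Drive the loop: input power is -40 dB; desired set point is -10 dB.
gain, acc = 0.0, 0.0
input_db = -40.0
for _ in range(50):
    measured = input_db + gain               # power seen at the ADC
    gain, acc = dagc_step(measured, -10.0, gain, acc)
print(round(input_db + gain, 1))  # settles at the -10 dB set point
```

The loop converges geometrically (factor 1 - k per iteration here), which is the trade-off a real DAGC makes between settling time and stability against noisy power estimates.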
Abstract: Recently, the business environment and customer needs
have been changing rapidly; hence, it is very difficult to fulfill
sophisticated customer needs through product or service innovation alone. In
practice, to cope with this problem, various manufacturing companies
have developed services to combine with their products. Along with
this, many academic studies on PSS (Product Service System), the
integrated system of products and services, have been conducted
from the viewpoint of manufacturers. On the other hand, service
providers are also attempting to develop service-supporting products
to increase their service competitiveness and provide differentiated
value. However, there is a lack of research based on the service-centric
point of view. Accordingly, this paper proposes a concept generation
method for service-supporting product development from the
service-centric point of view. This method is designed to be executed
in five consecutive steps: situation analysis, problem definition,
problem resolution, solution evaluation, and concept generation. In
the proposed approach, some tools of TRIZ (Theory of Inventive
Problem Solving), such as the ISQ (Innovative Situation Questionnaire)
and 40 inventive principles are employed in order to define problems
of the current services and solve them by generating
service-supporting product concepts. This research contributes to the
development of service-supporting products and service-centric PSSs.
Abstract: Calcite and aragonite are the two common
polymorphs of CaCO3 observed as biominerals. It is universal that
seawater contains a high Mg2+ concentration (50 mM) relative to Ca2+
(10 mM). In in vivo crystallization, Mg2+ inhibits calcite formation.
For this reason, stony coral skeletons may be formed only of aragonite
crystals in biocalcification. Soft corals are special in that they
form only calcite crystals; this interesting phenomenon,
still uncharacterized in the marine environment, has been explored in
this study using newly purified cell-free proteins isolated from the
endoskeletal sclerites of soft coral. By recording the decline of pH
in vitro, the control of CaCO3 nucleation and crystal growth by the
cell-free proteins was revealed. Using Atomic Force Microscopy, we
find that these endoskeletal cell-free proteins significantly shape
the morphology in the molecular-scale kinetics of crystal
formation, and that those proteins act as surfactants to promote ion
attachment at calcite steps.
Keywords: Biomineralization, Calcite, Cell-free protein, Soft coral
Abstract: This paper presents the novel Rao-Blackwellised
particle filter (RBPF) for mobile robot simultaneous localization and
mapping (SLAM) using monocular vision. The particle filter is
combined with an unscented Kalman filter (UKF) to extend the path
posterior by sampling new poses that integrate the current observation,
which drastically reduces the uncertainty about the robot pose. The
landmark position estimation and update are also implemented through
the UKF. Furthermore, the number of resampling steps is determined
adaptively, which greatly reduces the particle depletion problem,
and evolution strategies (ES) are introduced to avoid particle
impoverishment. The 3D natural point landmarks are structured with
matching Scale Invariant Feature Transform (SIFT) feature pairs. The
matching of multi-dimensional SIFT features is implemented with a
KD-Tree at a time cost of O(log2 N). Experimental results on a real robot
in our indoor environment show the advantages of our methods over
previous approaches.
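Adaptive determination of resampling steps, as mentioned above, usually follows the effective-sample-size criterion N_eff = 1 / Σ w_i²; resampling fires only when N_eff falls below a fraction of the particle count. A minimal sketch, with the threshold and the (multinomial) resampling scheme assumed rather than taken from the paper:

```python
import numpy as np

def adaptive_resample(particles, weights, threshold=0.5, rng=None):
    """Resample only when the effective sample size
    N_eff = 1 / sum(w^2) drops below threshold * N. Threshold and
    multinomial scheme are illustrative assumptions."""
    rng = rng or np.random.default_rng()
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                         # normalize weights
    n_eff = 1.0 / np.sum(w ** 2)
    if n_eff < threshold * len(w):
        idx = rng.choice(len(w), size=len(w), p=w)  # multinomial resampling
        return particles[idx], np.full(len(w), 1.0 / len(w)), True
    return particles, w, False

# Uniform weights: N_eff = N, so no resampling is triggered.
p = np.arange(10.0)
_, _, did = adaptive_resample(p, np.ones(10))
print(did)  # False
# Degenerate weights: almost all mass on one particle -> resample.
w_deg = np.r_[0.91, np.full(9, 0.01)]
p2, w2, did2 = adaptive_resample(p, w_deg)
print(did2)  # True
```

Skipping resampling while the weights stay well spread is exactly what mitigates particle depletion: healthy hypotheses are not thrown away by unnecessary resampling passes.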