Abstract: In 1990 [1] the subband-DFT (SB-DFT) technique was proposed. This technique uses Hadamard filters in the decomposition step to split the input sequence into lowpass and highpass sequences. In the next step, either DFTs are performed on both bands to compute the full-band DFT, or a single DFT is performed on one of the two bands to compute an approximate DFT. A combination network with correction factors is then applied after the DFTs. Another approach was proposed in 1997 [2], using a special discrete wavelet transform (DWT) to compute the discrete Fourier transform (DFT). In the first step of that algorithm, the input sequence is decomposed, in a manner similar to the SB-DFT, into two sequences using wavelet decomposition with Haar filters. The second step is to perform DFTs on both bands to obtain the full-band DFT, or to obtain a fast approximate DFT by pruning at both the input and output sides. In this paper, the wavelet-based DFT (W-DFT) with Haar filters is interpreted as the SB-DFT with Hadamard filters; the only difference is a constant factor in the combination network. This result completes the analysis of the W-DFT, since all results concerning accuracy and approximation errors in the SB-DFT become applicable. An application example in spectral analysis is given for both the SB-DFT and the W-DFT (with different filters). The adaptive capability of the SB-DFT is incorporated into the W-DFT algorithm to select the band with the most energy as the band to be computed. Finally, the W-DFT is extended to the two-dimensional case, and an application in image transformation is given using two different types of wavelet filters.
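The decomposition-plus-combination structure described above can be sketched numerically. The following is a minimal sketch, not either paper's exact implementation: the input is split into half-length sum and difference (Hadamard) bands, half-length DFTs are taken on each, and a combination network reassembles the full-band DFT. The Haar variant differs from these unnormalized sums and differences only by a constant factor of 1/√2 per band, which is the constant-factor difference the abstract refers to.

```python
import cmath

def dft(x):
    # Naive N-point DFT (O(N^2)), sufficient for illustration.
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def subband_dft(x):
    # Hadamard decomposition into half-length low- and high-pass bands.
    N = len(x)
    low  = [x[2*m] + x[2*m+1] for m in range(N // 2)]   # sums (lowpass)
    high = [x[2*m] - x[2*m+1] for m in range(N // 2)]   # differences (highpass)
    L, H = dft(low), dft(high)
    # Combination network:
    #   X[k] = ((1 + W^k) L[k mod N/2] + (1 - W^k) H[k mod N/2]) / 2,
    # with W = exp(-2*pi*j/N); L and H are periodic with period N/2.
    X = []
    for k in range(N):
        W = cmath.exp(-2j * cmath.pi * k / N)
        X.append(0.5 * ((1 + W) * L[k % (N // 2)] + (1 - W) * H[k % (N // 2)]))
    return X
```

Dropping one of the two bands in the combination yields the approximate DFT mentioned above.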
Abstract: We developed a GPS-based navigation device for the
blind, with audio guidance in the Thai language. The device is composed
of simple and inexpensive hardware components, and its user interface
is deliberately simple. It determines optimal routes to various landmarks
on our university campus by using heuristic search over the next
waypoints. We tested the device and noted its limitations and possible
extensions.
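The waypoint routing described above can be sketched with a standard heuristic search. The campus map, coordinates, and straight-line heuristic below are hypothetical illustrations, not the device's actual data.

```python
import heapq
import math

# Hypothetical campus waypoints (x, y in metres) and footpath edges.
WAYPOINTS = {
    'gate':    (0, 0),
    'library': (120, 40),
    'canteen': (60, 90),
    'dorm':    (180, 120),
}
EDGES = {
    'gate':    ['library', 'canteen'],
    'library': ['gate', 'dorm'],
    'canteen': ['gate', 'dorm'],
    'dorm':    ['library', 'canteen'],
}

def dist(a, b):
    (x1, y1), (x2, y2) = WAYPOINTS[a], WAYPOINTS[b]
    return math.hypot(x2 - x1, y2 - y1)

def a_star(start, goal):
    # A* search with straight-line distance as the admissible heuristic.
    frontier = [(dist(start, goal), 0.0, start, [start])]
    best = {start: 0.0}
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        for nxt in EDGES[node]:
            g2 = g + dist(node, nxt)
            if g2 < best.get(nxt, float('inf')):
                best[nxt] = g2
                heapq.heappush(frontier,
                               (g2 + dist(nxt, goal), g2, nxt, path + [nxt]))
    return None
```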
Abstract: The intention of this study is to design a probability-optimized sack-sewing workstation based on ergonomics, to improve productivity and reduce musculoskeletal disorders. The physical dimensions of two workers were used to design the new workstation: (1) sitting height, (2) mid-shoulder height sitting, (3) shoulder breadth, (4) knee height, (5) popliteal height, (6) hip breadth, and (7) buttock-knee length. The 5th percentile of buttock-knee length sitting (51 cm), the 50th percentile of mid-shoulder height sitting (62 cm), and the 95th percentiles of popliteal height (43 cm) and hip breadth (45 cm) were applied to design the workstation for the sack-sewing operator, and the remaining dimensions were used to adjust the components of the workstation. Risk assessment scores by RULA before and after using the probability-optimized workstation were 7 and 7, while REBA scores were 11 and 5, respectively. A body discomfort/abnormality index was used to assess operator muscle fatigue; before workstation adjustment, fatigue was found in the neck muscles, arm muscles, back muscles, and lower-back muscles. Therefore, extension and flexion exercises were applied to relieve musculoskeletal stress: the workers exercised for 15 minutes before the beginning and at the end of work for 5 days. Afterwards, the flexion and extension capability of the workers increased in three muscle groups (arm, leg, and back).
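Design percentiles like those cited above are extracted from anthropometric samples. A minimal sketch follows; the sample data here are randomly generated for illustration and are not the study's measurements.

```python
import random
import statistics

# Hypothetical popliteal-height samples (cm), generated for illustration only.
random.seed(1)
samples = [random.gauss(40, 2.5) for _ in range(500)]

# statistics.quantiles with n=100 yields the 1st..99th percentile cut points,
# so indices 4, 49 and 94 give the 5th, 50th and 95th percentiles.
cuts = statistics.quantiles(samples, n=100)
p5, p50, p95 = cuts[4], cuts[49], cuts[94]
```

In ergonomic design, low percentiles size reach-limited dimensions (e.g. buttock-knee length) and high percentiles size clearance-limited dimensions (e.g. popliteal height, hip breadth), as in the workstation above.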
Abstract: The rapidly increasing costs of power line extensions
and fossil fuel, combined with the desire to reduce carbon dioxide
emissions, have pushed the development of hybrid power systems
suited to remote locations, with the aim of creating autonomous local
power systems. The paper presents a proposed solution for a "high
penetration" hybrid power system, determined by the location of the
settlement and its "zero policy" on carbon dioxide emissions. The
paper focuses on the technical solution and the power flow
management algorithm of the system, taking local development
conditions into consideration.
Abstract: In this paper, a theoretical formula is presented to
predict the instantaneous folding force during creation of the first fold
in a square column under axial loading. Calculations are based on an
analysis of the "Basic Folding Mechanism" introduced by Wierzbicki
and Abramowicz. For this purpose, the sum of the dissipated energy
rate in bending around horizontal and inclined hinge lines and the
dissipated energy rate in extensional deformation is equated to the
work rate of the external force on the structure. The final formula
obtained in this research predicts the instantaneous folding force of
the first fold as a function of folding distance and folding angle,
rather than only the average value. Finally, according to the derived
theoretical relation, the instantaneous folding force of the first fold
in a square column was plotted against folding distance and compared
to experimental results, showing good correlation.
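Schematically, the energy-rate balance described above takes the form (generic symbols, not necessarily the paper's exact notation):

```latex
\dot{E}_{b}^{\mathrm{horiz}} + \dot{E}_{b}^{\mathrm{incl}} + \dot{E}_{m}
  = P(\delta,\alpha)\,\dot{\delta},
```

where the first two terms are the bending dissipation rates around the horizontal and inclined hinge lines, \(\dot{E}_{m}\) is the extensional (membrane) dissipation rate, and \(P\) is the instantaneous folding force at folding distance \(\delta\) and folding angle \(\alpha\). Solving for \(P\) gives the instantaneous force rather than its average over the fold.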
Abstract: We summarize information that facilitates choosing an ontology language for knowledge-intensive applications. This paper is a short version of the ontology-language state-of-the-art and evolution analysis carried out for choosing an ontology language in the IST Esperonto project. First, we analyze the changes and evolution that have taken place in the field of Semantic Web languages in recent years, in particular around the ontology languages of the RDF/S and OWL family. Second, we present current trends in the development of Semantic Web languages, in particular rule-support extensions for Semantic Web languages and emerging ontology languages such as the WSMO languages.
Abstract: The dual bell nozzle is a promising altitude-adaptation
nozzle concept, offering increased nozzle performance in rocket
engines. Its advantage is its simplicity, due to the absence of any
additional mechanical device or movable parts. Hence it offers
reliability along with the improved nozzle performance demanded by
future launch vehicles. Among other issues, the flow transition to the
extension nozzle is one of the major issues studied in the development
of dual bell nozzles. A parameter named the over-expansion factor,
which controls the value of the wall inflection angle, has been
reported to have substantial influence on this transition process. This
paper studies, through CFD and cold-flow experiments, the effect of
the over-expansion factor on flow transition in dual bell nozzles.
Abstract: This paper is an extension of a previous work in which a diagonally implicit harmonic balance method was developed and applied to simulate oscillatory motions of a pitching airfoil and wing. A more detailed study of the accuracy, convergence, and efficiency of the method is carried out in the current paper by varying the number of harmonics in the solution approximation. As the main advantage of the method is its use in design optimization for unsteady problems, its application to the more practical case of rotor flow analysis during forward flight is carried out and compared with flight test data and time-accurate computation results.
Abstract: Advances in computing applications in recent years
have prompted the demand for more flexible scheduling models to
meet QoS demands. Moreover, in practical applications, partly
violated temporal constraints can be tolerated if the violations follow
a certain distribution, so the traditional Liu and Layland model needs
to be extended to these circumstances. Two such extensions are the
(m, k)-firm model and the Window-Constrained model. This paper
investigates weakly hard real-time constraints and their combination
to support QoS. The fact that a practical application can tolerate some
violations of its temporal constraints under a certain distribution is
employed to support adaptive QoS in an open real-time system.
Experimental results show these approaches to be effective compared
to traditional scheduling algorithms.
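An (m, k)-firm constraint requires that, in every window of k consecutive job instances, at least m meet their deadlines. A minimal checker sketch (an illustration of the constraint, not the paper's scheduler) is:

```python
from collections import deque

def meets_mk_firm(deadline_met, m, k):
    # deadline_met: iterable of 1/0 (or True/False) flags per job instance.
    # Returns True iff every window of k consecutive jobs has >= m hits.
    window = deque(maxlen=k)
    for ok in deadline_met:
        window.append(ok)
        if len(window) == k and sum(window) < m:
            return False
    return True
```

For example, a (2, 3)-firm task may miss at most one deadline in any three consecutive invocations.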
Abstract: This article is an extension, and a practical application
approach, of Wheeler's NEBIC (Net Enabled Business Innovation
Cycle) theory. NEBIC theory is a new approach in IS research
and can be used for dynamic environments related to new technology.
Firms can follow market changes rapidly with the support of IT
resources, and flexible firms adapt their market strategies and respond
more quickly to customers' changing behaviors. When every leading
firm in an industry has access to the same IT resources, the way
those IT resources are managed determines the competitive
advantages or disadvantages of the firm. From the Dynamic
Capabilities Perspective, and from the NEBIC theory newly
introduced by Wheeler, we know that IT resources alone cannot
deliver customer value; a good configuration of those resources,
however, can guarantee customer value by choosing the right
emerging technology and grasping economic opportunities through
business innovation and growth. We found evidence in the literature
that SOA (Service Oriented Architecture) is a promising emerging
technology which can deliver the desired economic opportunity
through modularity, flexibility and loose coupling. SOA can also help
firms connect in networks, which can open a new window of
opportunity to collaborate in innovation and the right kind of
outsourcing.
Abstract: Three new algorithms, based on minimization of the autocorrelation of transmitted symbols and on the SLM approach, which are computationally less demanding, are proposed. In the first algorithm, the autocorrelation of the complex data sequence is minimized to a value of 1, which results in a reduction of PAPR. The second algorithm generates multiple random sequences from the sequence generated in the first algorithm with the same autocorrelation value, i.e., 1; of these, the sequence with minimum PAPR is transmitted. The third algorithm is an extension of the second and requires minimal side information to be transmitted: multiple sequences are generated by modifying a fixed number of complex numbers in an OFDM data sequence using only one factor. The multiple sequences represent the same data sequence, and the one giving minimum PAPR is transmitted. Simulation results for a 256-subcarrier OFDM system show that significant PAPR reduction is achieved using the proposed algorithms.
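The SLM step, generating candidate sequences and transmitting the one with minimum PAPR, can be sketched as follows. The phase alphabet, candidate count, and naive IDFT below are illustrative assumptions, not the paper's exact parameters.

```python
import cmath
import math
import random

def papr_db(symbols):
    # PAPR (in dB) of the time-domain OFDM signal from an N-point IDFT.
    N = len(symbols)
    time = [sum(symbols[k] * cmath.exp(2j * cmath.pi * k * n / N)
                for k in range(N)) / N
            for n in range(N)]
    powers = [abs(s) ** 2 for s in time]
    return 10 * math.log10(max(powers) / (sum(powers) / N))

def slm_select(data, num_candidates=8, seed=0):
    # SLM: rotate the data by random phase sequences and keep the candidate
    # with lowest PAPR; the receiver needs the winning index as side info.
    rng = random.Random(seed)
    best = (papr_db(data), 0, data)
    for i in range(1, num_candidates):
        phases = [rng.choice([1, -1, 1j, -1j]) for _ in data]
        cand = [d * p for d, p in zip(data, phases)]
        best = min(best, (papr_db(cand), i, cand))
    return best
```

An all-ones subcarrier vector is the worst case (all energy in one time sample), so even a few random candidates reduce its PAPR noticeably.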
Abstract: A novel methodology has been used to design a
refrigerant evaporator coil. The methodology is a complete
Computer Aided Design/Computer Aided Engineering approach,
by means of a Computational Fluid Dynamics/Finite Element
Analysis model which is executed many times for the thermal-fluid
exploration of several design configurations by a commercial
optimizer. The design is thus carried out automatically by parallel
computations, with an optimization package taking the decisions
rather than the design engineer. The engineer instead decides on the
physical settings and initialization of the computational models, the
number and range of the geometrical parameters of the coil fins, and
the optimization tools to be employed. The final coil geometry was
found to be better than the initial design.
Abstract: Magnetic and semiconductor nanomaterials exhibit
novel magnetic and optical properties owing to their unique size-
and shape-dependent effects. As the size shrinks down to the
nanoscale region, various anomalous properties not normally present
in the bulk start to dominate. The ability to harness these anomalous
properties for the design of advanced electronic devices depends
strictly on the synthetic strategy. Hence, current research has focused
on developing rational synthetic control to produce high-quality
nanocrystals using an organometallic approach to tune both the size
and shape of the nanomaterials. To elucidate the growth mechanism,
transmission electron microscopy was employed as a powerful tool
for time-resolved morphological and structural characterization of
magnetic (Fe3O4) and semiconductor (ZnO) nanocrystals. The
current synthetic approach is able to produce nanostructures with
well-defined shapes. We have found that oleic acid is an effective
capping ligand for preparing oxide-based nanostructures without any
agglomeration, even at high temperature. The oleate-based precursors
and capping ligands are fatty-acid compounds originating from
natural palm oil, with low toxicity. In comparison with other synthetic
approaches for producing nanostructures, the current method offers
an effective route to oxide-based nanomaterials with well-defined
shapes and good monodispersity. The nanocrystals are well separated
from each other, without any stacking effect. In addition, the
as-synthesized nanopellets are chemically and physically stable
compared to previously reported nanomaterials. Further development
and extension of the current synthetic strategy is being pursued to
combine both materials into a nanocomposite that will be used as a
"smart magnetic nanophotocatalyst" for industrial wastewater
treatment.
Abstract: The increased use of biodiesel implies variations in both greenhouse gas and air pollutant emissions. Some studies point out that the use of biodiesel blends in diesel can help control air pollution and promote a reduction of CO2 emissions. Reductions in PM, SO2, VOC and CO emissions are also expected; however, NOx emissions may increase, which may potentiate O3 formation. This work aims to assess the impact of biodiesel use on air quality through a numerical modeling study, taking the Northern region of Portugal as a case study. The emission scenarios are focused on 2008 (baseline year) and 2020 (target year of the Renewable Energy Directive, RED) and on three biodiesel blends (B0, B10 and B20). In general, the use of biodiesel by 2020 will reduce CO2 and air pollutant emissions in Northern Portugal, improving air quality, but only to a very small extent.
Abstract: This paper looks into areas not covered by prominent
Agent-Oriented Software Engineering (AOSE) methodologies.
An extensive literature review led to the identification of two issues:
first, most of these methodologies largely neglect the semantic web
and ontologies; second, as expected, each has its strengths and
weaknesses and may focus on some phases of the development
lifecycle but not all of them. The work presented here builds
extensions to a highly regarded AOSE methodology (MaSE) in order
to cover the areas on which this methodology does not concentrate.
The extensions include introducing an ontology stage for semantic
representation and integrating early requirement specification from a
methodology which focuses mainly on that. The integration involved
developing transformation rules (with the necessary handling of
non-matching notions) between the two sets of representations and
building the software which automates the transformation. The
application of this integration to a case study is also presented. The
main flow of MaSE stages was changed to smoothly accommodate
the new additions.
Abstract: A healthcare monitoring system is presented in this
paper. The system is based on ultra-low-power sensor nodes and a
personal server built on hardware and software extensions to a
Personal Digital Assistant (PDA)/Smartphone. A sensor node collects
data from the body of a patient and sends it to the personal server,
where the data is processed, displayed, and made ready to be sent to
a healthcare network if necessary. The personal server consists of a
compact low-power receiver module and Smartphone software. The
receiver module occupies a board area of less than 30 × 30 mm and
consumes approximately 25 mA in active mode.
Abstract: This paper presents an improved image segmentation
model with edge-preserving regularization based on the
piecewise-smooth Mumford-Shah functional. A level set formulation
is considered for minimization of the Mumford-Shah functional in
segmentation, and the corresponding partial differential equations are
solved by backward Euler discretization. To encourage edge-preserving
regularization, a new edge indicator function is introduced into the
level set framework, in which all the grid points used to locate the
level set curve are considered to avoid blurring the edges, and a
nonlinear smoothness constraint function is applied as the
regularization term to smooth the image in the isophote direction
instead of the gradient direction. In the implementation, strategies
such as a new scheme for the extension of the u+ and u- computation
to the grid points and for speeding up convergence are studied to
improve the efficacy of the algorithm. The resulting algorithm has
been implemented and compared with previous methods, and has
proved efficient in several test cases.
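For reference, the piecewise-smooth Mumford-Shah functional underlying models of this kind has the standard form (generic notation, not necessarily the paper's exact weights):

```latex
E(u, C) \;=\; \int_{\Omega} \bigl(u - g\bigr)^{2}\, dx\, dy
  \;+\; \mu \int_{\Omega \setminus C} \lvert \nabla u \rvert^{2}\, dx\, dy
  \;+\; \nu \,\lvert C \rvert,
```

where \(g\) is the observed image, \(u\) the piecewise-smooth approximation (with branches \(u^{+}\) and \(u^{-}\) on either side of the level set curve), \(C\) the edge set, \(\lvert C \rvert\) its length, and \(\mu, \nu\) positive weights.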
Abstract: This paper presents a new steganography approach suitable for Arabic texts, classifiable under steganographic feature-coding methods. The approach hides secret information bits within the letters, benefiting from their inherent points. To mark the specific letters holding secret bits, the scheme considers two features: the existence of points in the letters and the redundant Arabic extension character. We use pointed letters with an extension to hold the secret bit 'one' and un-pointed letters with an extension to hold 'zero'. This steganography technique is also attractive for other languages with scripts similar to Arabic, such as Persian and Urdu.
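The embedding rule above can be sketched as follows. The letter sets are the standard pointed and un-pointed Arabic letters; the joining rules governing where an extension (kashida) may legally appear are ignored in this simplified illustration, so it is a sketch of the coding scheme rather than the paper's full method.

```python
POINTED   = set("بتثجخذزشضظغفقنية")   # letters carrying points (dots)
UNPOINTED = set("احدرسصطعكلمهو")      # letters without points
KASHIDA = "\u0640"                     # Arabic tatweel (extension character)

def embed(cover, bits):
    # Insert a kashida after a pointed letter to hide '1' and after an
    # un-pointed letter to hide '0'; unused letters get no extension.
    out, i = [], 0
    for ch in cover:
        out.append(ch)
        if i < len(bits):
            if (bits[i] == '1' and ch in POINTED) or \
               (bits[i] == '0' and ch in UNPOINTED):
                out.append(KASHIDA)
                i += 1
    if i < len(bits):
        raise ValueError("cover text too short for message")
    return ''.join(out)

def extract(stego):
    # A kashida marks the preceding letter as a carrier:
    # pointed -> '1', un-pointed -> '0'.
    bits = []
    for prev, ch in zip(stego, stego[1:]):
        if ch == KASHIDA:
            bits.append('1' if prev in POINTED else '0')
    return ''.join(bits)
```

Because the kashida is visually redundant, removing all extensions recovers the original cover text.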
Abstract: Current advancements in nanotechnology depend on capabilities that enable nano-scientists to extend their eyes and hands into the nano-world. For this purpose, a haptics-based system (haptic devices are capable of recreating tactile or force sensations) for the AFM (Atomic Force Microscope) is proposed. The system enables nano-scientists to touch and feel the sample surfaces viewed through the AFM, providing a better understanding of the physical properties of the surface, such as roughness, stiffness, and the shape of the molecular architecture. At this stage, the proposed work uses offline images produced by the AFM and performs image analysis to create virtual surfaces suitable for haptic force analysis. The work is in the process of being extended from an offline to an online process, in which interaction will be performed directly on the material surface for realistic analysis.
Abstract: Multi-dimensional principal component analysis
(PCA) is the extension to multi-dimensional data of ordinary PCA,
which is widely used as a dimensionality reduction technique in
multivariate data analysis. To calculate the PCA, the singular value
decomposition (SVD) is commonly employed because of its
numerical stability. The multi-dimensional PCA can be calculated
using the higher-order SVD (HOSVD), proposed by Lathauwer et al.,
analogously to the ordinary PCA. In this paper, we apply
multi-dimensional PCA to multi-dimensional medical data including
the functional independence measure (FIM) score, and describe the
results of our experimental analysis.
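The HOSVD computation referred to above can be sketched as follows: a generic HOSVD via mode unfoldings, not the authors' exact pipeline.

```python
import numpy as np

def unfold(tensor, mode):
    # Mode-n unfolding: arrange the mode-`mode` fibres as rows of a matrix.
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def hosvd(tensor):
    # Higher-order SVD (Lathauwer et al.): one orthonormal factor matrix per
    # mode, taken from the SVD of each unfolding, plus the core tensor
    # obtained by multiplying each mode by the transposed factor.
    factors = [np.linalg.svd(unfold(tensor, m), full_matrices=False)[0]
               for m in range(tensor.ndim)]
    core = tensor
    for m, U in enumerate(factors):
        core = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(core, m, 0), axes=1), 0, m)
    return core, factors
```

Multiplying the core back by each factor reconstructs the tensor; truncating the factor columns gives the multi-dimensional PCA dimensionality reduction, analogously to keeping the leading singular vectors in ordinary PCA.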