Abstract: This paper shows that the application of probability-statistical methods is unfounded at the early stage of aviation gas turbine engine (GTE) technical condition diagnosing, when the flight information is fuzzy, limited and uncertain. The efficiency of applying the new Soft Computing technology at these diagnosing stages, using Fuzzy Logic and Neural Network methods, is therefore considered. Fuzzy multiple linear and non-linear models (fuzzy regression equations), obtained on the basis of statistical fuzzy data, are trained with high accuracy. To build a more adequate model of GTE technical condition, the dynamics of changes in the skewness and kurtosis coefficients are analysed. Research into the changes of skewness and kurtosis coefficient values shows that the distributions of GTE operating parameters have a fuzzy character; consideration of fuzzy skewness and kurtosis coefficients is therefore expedient. Investigation of the dynamics of changes in the basic characteristics of GTE operating parameters leads to the conclusion that Fuzzy Statistical Analysis is necessary for preliminary identification of engine technical condition. Research into the changes of correlation coefficient values also reveals their fuzzy character; therefore, for model selection, the application of Fuzzy Correlation Analysis results is proposed. To check model adequacy, the Fuzzy Multiple Correlation Coefficient of the Fuzzy Multiple Regression is considered. When information is sufficient, a recurrent algorithm for aviation GTE technical condition identification (using Hard Computing technology) is proposed, based on measurements of the input and output parameters of the multiple linear and non-linear generalised models in the presence of measurement noise (a new recursive Least Squares Method (LSM)). The developed GTE condition monitoring system provides stage-by-stage estimation of engine technical condition.
As an application of the given technique, the temperature condition of a new operating aviation engine was estimated.
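As a hedged illustration of the recursive least-squares idea mentioned above (a generic textbook form, not the authors' exact algorithm; all variable names and the noisy-data example are assumptions), one update step can be sketched as:

```python
import numpy as np

def rls_update(theta, P, x, y):
    """One recursive least-squares step: refine the parameter estimate
    theta and its covariance P with a new regressor x and output y."""
    x = x.reshape(-1, 1)
    K = P @ x / (1.0 + x.T @ P @ x)                     # gain vector
    err = y - float(x.T @ theta.reshape(-1, 1))         # prediction error
    theta = theta + K.flatten() * err
    P = P - K @ x.T @ P                                 # covariance update
    return theta, P

# Hypothetical example: estimate slope and intercept of y = 2*u + 1
# from noisy measurements.
rng = np.random.default_rng(0)
theta = np.zeros(2)
P = np.eye(2) * 1000.0
for _ in range(200):
    u = rng.uniform(-1, 1)
    x = np.array([u, 1.0])
    y = 2.0 * u + 1.0 + rng.normal(scale=0.01)
    theta, P = rls_update(theta, P, x, y)
# theta now approximates [2.0, 1.0]
```

The estimate is refined sample by sample, which is what makes the method usable when flight data arrive sequentially rather than as a complete batch.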
Abstract: Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused its attention on Web application design, development, analysis and testing, studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. The use of traditional static source code analysis can be very difficult, due to the presence of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutational techniques in order to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation source code analysis applied to Web software to build application models. Mutation-based generated models may contain more information than necessary, so a pruning mechanism is needed.
Abstract: With the deepening of software reuse, component-related technologies have been widely applied in the development of large-scale complex applications. Component identification (CI) is one of the primary research problems in software reuse: analyzing domain business models to obtain a set of business components with high reuse value and good reuse performance that support effective reuse. Based on the concept and classification of CI, its technical stack is briefly discussed from four views, i.e., the form of the input business models, identification goals, identification strategies, and the identification process. Various CI methods presented in the literature are then classified into four types, i.e., domain-analysis-based methods, cohesion-coupling-based clustering methods, CRUD-matrix-based methods, and other methods, and these methods are compared with respect to their advantages and disadvantages. Additionally, some shortcomings of CI research are discussed and their causes explained. Finally, promising research directions for this problem are outlined.
Abstract: This article is an extension and a practical application of Wheeler's NEBIC (Net Enabled Business Innovation Cycle) theory. NEBIC theory is a new approach in IS research and can be applied to dynamic environments related to new technology. Firms can follow market changes rapidly with the support of IT resources. Flexible firms adapt their market strategies and respond more quickly to customers' changing behaviors. When every leading firm in an industry has access to the same IT resources, the way these IT resources are managed determines a firm's competitive advantages or disadvantages. From the Dynamic Capabilities Perspective and from Wheeler's newly introduced NEBIC theory, we know that IT resources alone cannot deliver customer value; only a good configuration of those resources can guarantee customer value, by choosing the right emerging technology and grasping economic opportunities through business innovation and growth. We found evidence in the literature that SOA (Service Oriented Architecture) is a promising emerging technology that can deliver the desired economic opportunity through modularity, flexibility and loose coupling. SOA can also help firms to connect in networks, which can open a new window of opportunity to collaborate in innovation and the right kind of outsourcing.
Abstract: Product Data Management (PDM) systems for Computer Aided Design (CAD) file management are widely established in design processes. Such a management system is indispensable for design collaboration or when design tasks are distributed. It is thus surprising that engineering design curricula have paid little attention to the education of PDM systems. The same holds for the education of ecodesign and environmental evaluation of products. With the rise of sustainability as a strategic aspect in companies, environmental concerns are becoming a key issue in design. This paper discusses the establishment of a PDM platform to be used among technical and vocational schools in Austria. The PDM system facilitates design collaboration among these schools. Further, it is discussed how the PDM system has been prepared in order to facilitate the environmental evaluation of parts, components and sub-assemblies of a product. By integrating a Business Intelligence solution, environmental Life Cycle Assessment and the communication of its results are enabled.
Abstract: Pipeline exploration is one of many bio-mimetic robot applications. Such a robot may work in ordinary buildings, for example between ceilings and in ducts, in addition to the complicated and massive pipeline systems of large industrial plants. The bio-mimetic robot finds any troubled area or malfunction and then reports its data. Importantly, it can not only prepare for but also react to any abnormal routes in the pipeline. Pipeline monitoring tasks require special types of mobile robots. For effective movement along a pipeline, the movement of the robot should be similar to that of insects or crawling animals. During its movement along the pipelines, a pipeline monitoring robot has the important task of identifying the shape of the approaching path on the pipes. In this paper we propose an effective solution to pipeline pattern recognition, based on fuzzy classification rules for the measured IR distance data.
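A fuzzy rule-based classification of a distance reading might, under assumed triangular membership functions and labels not given in the abstract, look roughly like this minimal sketch:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_distance(d_cm):
    """Fire simple fuzzy rules on one IR distance reading (cm) and
    return the label with the highest membership degree.
    The breakpoints below are illustrative assumptions only."""
    degrees = {
        "near":   tri(d_cm, -10, 0, 30),
        "medium": tri(d_cm, 20, 50, 80),
        "far":    tri(d_cm, 70, 100, 200),
    }
    return max(degrees, key=degrees.get)

label = classify_distance(15)   # "near" under these assumed sets
```

A real pipeline-shape recognizer would aggregate many such graded memberships over a scan of readings rather than one crisp cut-off, which is the point of using fuzzy rules on noisy IR data.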
Abstract: This study explores how the mechanics of learning pave the way to engineering innovation. Theories related to learning in new product/service innovation are reviewed from an organizational perspective, a behavioral perspective, and an engineering perspective. From this, an engineering team's external interactions for knowledge brokering and its internal composition for skill balance are examined from learning and innovation viewpoints. As a result, an integrated learning model is developed by reconciling the theoretical perspectives and by developing propositions that emphasize the centrality of learning, and its drivers, in engineering product/service development. The paper also provides a review and partial validation of the propositions using the results of a previously published field study in the aerospace industry.
Abstract: This article deals with dividends and their distribution to investors from a theoretical point of view. Some studies analyze the reaction of the market to dividend announcements and find that a change of dividend policy is associated with abnormal returns around the dividend announcement date. Other studies directly question investors about their dividend preferences and beliefs. Investors want dividends for many reasons (e.g., some explain the dividend preference by the existence of transaction costs; investors prefer a dividend today because it is less risky; managers have private information about the firm). The most controversial theory of dividend policy was developed by Modigliani and Miller (1961), who demonstrated that in perfect and complete capital markets dividend policy is irrelevant and the value of the company is independent of its payout policy. Nevertheless, in the real world capital markets are imperfect because of asymmetric information, transaction costs, incomplete contracting possibilities and taxes.
Abstract: We present in this paper a new approach to specific JPEG steganalysis, based on studying the statistics of the compressed DCT coefficients. Traditionally, steganographic algorithms try to preserve the statistics of the DCT and of the spatial domain, but they cannot preserve both and at the same time control the alteration of the compressed data. We have noticed a deviation of the entropy of the compressed data after a first embedding. This deviation is greater when the image is a cover medium than when the image is a stego image. To observe this deviation, we introduced new statistical features and combined them with the Multiple Embedding Method. This approach is motivated by the Avalanche Criterion of the JPEG lossless compression step. This criterion makes it possible to design detectors whose detection rates are independent of the payload. Finally, we designed a Fisher discriminant based classifier for the well-known steganographic algorithms Outguess, F5 and Hide and Seek. The experimental results we obtained show the efficiency of our classifier for these algorithms. Moreover, it is also designed to work with low embedding rates (< 10^-5) and, according to the avalanche criterion of the RLE and Huffman compression step, its efficiency is independent of the quantity of hidden information.
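A Fisher-discriminant classifier of the kind mentioned can be sketched on generic feature vectors (a standard two-class formulation; the synthetic 2-D features below stand in for the paper's entropy-deviation features, which are not reproduced here):

```python
import numpy as np

def fisher_discriminant(X0, X1):
    """Fisher linear discriminant for two classes: the projection
    direction is w = Sw^-1 (mu1 - mu0), where Sw is the pooled
    within-class scatter; the threshold is set at the projected midpoint."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    w = np.linalg.solve(Sw, mu1 - mu0)
    thresh = w @ (mu0 + mu1) / 2.0
    return w, thresh

# Hypothetical feature clouds standing in for cover vs. stego statistics.
rng = np.random.default_rng(1)
cover = rng.normal([0.0, 0.0], 0.3, size=(100, 2))
stego = rng.normal([1.0, 1.0], 0.3, size=(100, 2))
w, t = fisher_discriminant(cover, stego)
pred_stego = stego @ w > t   # True = classified as stego
```

The classifier reduces each feature vector to a single score `x @ w` and thresholds it, which is why it stays cheap even when the feature set grows.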
Abstract: Structural redundancy is an interesting issue in the seismic design of structures. Initially, structural redundancy was described as the degree of static indeterminacy of a system. Although many definitions of redundancy in structures have been presented, recently the definition of structural redundancy has been related to the configuration of the structural system and the number of lateral-load-transferring directions in the structure. Steel frames with infill walls are common systems in the construction of ordinary residential buildings in some countries. It is well established that the performance of structures is affected by adding masonry infill walls. In order to investigate the effect of infill walls on the redundancy of steel frames constructed with masonry walls, the components of redundancy, including the redundancy variation index, the redundancy strength index and the redundancy response modification factor, were extracted for frames with masonry infills. Several steel frames with typical storey numbers and various numbers of bays were designed and considered. The redundancy of frames with and without infill walls was evaluated by the proposed method. The results showed that the presence of infill walls increases redundancy.
Abstract: Response to public health-related emergencies is analysed here for a rural university in South Africa. The structure of the designated emergency plan covers all the phases of the disaster management cycle. The plan contains elements of the vulnerability model and the technocratic model of emergency management. The response structures are vertically and horizontally integrated, while the planning contains elements of scenario-based and functional planning. The available number of medical professionals at Rhodes University, along with the medical insurance rates, makes the staff and students potentially more medically vulnerable than the South African population. The main improvements in emergency management are required in tornado response and in information dissemination during health emergencies. The latter should involve the increased use of social media and e-mails, following the Taylor model of communication. Infrastructure must be improved in the telecommunication sector in the face of unpredictable electricity outages.
Abstract: In this paper we present an enhanced noise reduction method for robust speech recognition using an Adaptive Gain Equalizer with Non-linear Spectral Subtraction. In the Adaptive Gain Equalizer (AGE) method, the input signal is divided into a number of subbands that are individually weighted in the time domain, in accordance with an estimate of the short-time Signal-to-Noise Ratio (SNR) in each subband at every time instant. The focus is on speech enhancement rather than on noise suppression. Analysis under various noise conditions for speech recognition showed that the AGE algorithm has an obvious failing point at an SNR of -5 dB, with inadequate noise suppression for SNRs below this point. This work proposes coupling AGE with Non-linear Spectral Subtraction (AGE-NSS) for robust speech recognition. The experimental results show that our AGE-NSS outperforms AGE when the SNR drops below the -5 dB level.
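Basic spectral subtraction, the starting point that the non-linear variant refines, can be sketched as follows (a minimal magnitude-domain sketch with over-subtraction and flooring; the frame length, factors alpha/beta and the sine-plus-noise example are assumptions, not the paper's settings):

```python
import numpy as np

def spectral_subtract(frame, noise_mag, alpha=2.0, beta=0.01):
    """Subtract an (over-estimated) noise magnitude spectrum from one
    signal frame, floor the result, and resynthesize with the noisy phase."""
    spec = np.fft.rfft(frame)
    mag, phase = np.abs(spec), np.angle(spec)
    clean = np.maximum(mag - alpha * noise_mag, beta * mag)  # spectral floor
    return np.fft.irfft(clean * np.exp(1j * phase), n=len(frame))

# Hypothetical frame: a 440 Hz tone plus white noise; the noise spectrum
# is estimated from a separate noise-only frame.
fs, n = 8000, 256
t = np.arange(n) / fs
rng = np.random.default_rng(2)
noisy = np.sin(2 * np.pi * 440 * t) + rng.normal(scale=0.1, size=n)
noise_mag = np.abs(np.fft.rfft(rng.normal(scale=0.1, size=n)))
enhanced = spectral_subtract(noisy, noise_mag)
```

The floor `beta * mag` is what keeps the non-linearity from producing negative magnitudes (and the associated "musical noise" from hard zeroing).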
Abstract: Three new algorithms, based on minimization of the autocorrelation of transmitted symbols and on the SLM approach, are proposed that are computationally less demanding. In the first algorithm, the autocorrelation of the complex data sequence is minimized to a value of 1, which results in a reduction of PAPR. The second algorithm generates multiple random sequences from the sequence produced by the first algorithm, all with the same autocorrelation value of 1; of these, the sequence with minimum PAPR is transmitted. The third algorithm is an extension of the second and requires minimal side information to be transmitted. Multiple sequences are generated by modifying a fixed number of complex numbers in an OFDM data sequence using only one factor. The multiple sequences represent the same data sequence, and the one giving the minimum PAPR is transmitted. Simulation results for a 256-subcarrier OFDM system show that significant PAPR reduction is achieved using the proposed algorithms.
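The PAPR metric and a generic SLM-style candidate search can be sketched as follows (a textbook illustration of the SLM idea only, not any of the paper's three algorithms; the +/-1 phase sequences and QPSK block are assumptions):

```python
import numpy as np

def papr_db(freq_symbols):
    """PAPR (dB) of the OFDM time-domain signal obtained by IFFT of
    one block of frequency-domain symbols."""
    x = np.fft.ifft(freq_symbols)
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def slm_select(freq_symbols, n_candidates=8, rng=None):
    """SLM-style search: multiply the block by random +/-1 phase
    sequences and keep the candidate with the lowest PAPR."""
    rng = rng or np.random.default_rng()
    best, best_papr = freq_symbols, papr_db(freq_symbols)
    for _ in range(n_candidates):
        cand = freq_symbols * rng.choice([1, -1], size=len(freq_symbols))
        p = papr_db(cand)
        if p < best_papr:
            best, best_papr = cand, p
    return best, best_papr

# Hypothetical QPSK block on 256 subcarriers.
rng = np.random.default_rng(3)
sym = rng.choice([1+1j, 1-1j, -1+1j, -1-1j], size=256)
selected, papr = slm_select(sym, rng=rng)
```

The receiver only needs to know which phase sequence was chosen, which is the side information the third algorithm in the abstract tries to minimize.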
Abstract: In this paper, a novel feature-based image watermarking scheme is proposed. Zernike moments, which have invariance properties, are adopted in the scheme. In the proposed scheme, feature points are first extracted from the host image and several circular patches centered on these points are generated. The patches are used as carriers of the watermark information because they can be regenerated to locate watermark embedding positions even when watermarked images are severely distorted. The Zernike transform is then applied to the patches to calculate local Zernike moments. Dither modulation is adopted to quantize the magnitudes of the Zernike moments, followed by a false-alarm analysis. Experimental results show that the quality degradation of the watermarked image is visually transparent. The proposed scheme is very robust against image processing operations and geometric attacks.
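Dither-modulation quantization of a magnitude can be sketched generically (a minimal binary quantization-index-modulation form; the step size and example values are assumptions, not the paper's parameters):

```python
def dm_embed(magnitude, bit, step=2.0):
    """Embed one bit by quantizing a (Zernike-moment) magnitude onto
    one of two interleaved lattices: dither 0 for bit 0, step/2 for bit 1."""
    d = 0.0 if bit == 0 else step / 2.0
    return round((magnitude - d) / step) * step + d

def dm_detect(magnitude, step=2.0):
    """Recover the bit by re-quantizing onto both lattices and
    choosing the nearer one."""
    e0 = abs(magnitude - dm_embed(magnitude, 0, step))
    e1 = abs(magnitude - dm_embed(magnitude, 1, step))
    return 0 if e0 <= e1 else 1

m = dm_embed(7.3, 1)            # quantize the magnitude, embedding bit 1
assert dm_detect(m) == 1
assert dm_detect(m + 0.4) == 1  # small perturbations stay decodable
```

Any distortion smaller than a quarter of the step size leaves the bit decodable, which is how the scheme trades embedding strength against robustness.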
Abstract: Current trends in manufacturing are characterized by a broadening of production, shortening innovation cycles, and products with new shapes, materials and functions. A time-focused production strategy requires a change from the traditional functional production structure to flexible manufacturing cells and lines. Production by automated manufacturing systems (AMS) has been one of the most important manufacturing philosophies of recent years. The main goal of the project we are involved in is to build a laboratory housing a flexible manufacturing system consisting of at least two NC-controlled production machines (milling machines, a lathe). These machines will be linked to a transport system and served by industrial robots. Within this flexible manufacturing system, a quality control station consisting of a camera system and a rack warehouse will also be located. The design, analysis and improvement of this manufacturing system, with a special focus on the communication among devices, constitute the main aims of this paper. The key determining factors for the manufacturing system design are: the product, the production volume, the machines used, the available manpower, the available infrastructure and the legislative frame for the specific cases.
Abstract: This paper reviews recent studies of the effects of climate change in the North Tropical Atlantic by examining the atmospheric conditions that prevailed in 2005: the Coral Bleaching HotSpot and Hurricane Katrina. With the aim of better understanding and estimating the impact of the physical phenomenon, i.e. the Thermal Oceanic HotSpot (TOHS), isotopic studies of δ18O and δ13C on marine animals from Guadeloupe (a French Caribbean island) were carried out. The recorded measurements show a Sea Surface Temperature (SST) of up to 35°C in August, much higher than the 32°C recorded by NOAA satellites. After reviewing the process that led to the formation of Hurricane Katrina, which hit New Orleans on August 29, 2005, it is shown that the climatic conditions in the Caribbean from August to October 2005 influenced Katrina's evolution. This TOHS is a combined effect of various phenomena and represents an additional factor in estimating future climate change.
Abstract: A linear system is called a fully fuzzy linear system (FFLS) if the quantities in the system are all fuzzy numbers. We investigate the solution of the FFLS and develop a new approximate method for solving it. The numerical results show that our method is more accurate than the iterative Jacobi and Gauss-Seidel methods in approximating the solution of the FFLS.
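For reference, the crisp Jacobi iteration against which the method is compared looks as follows (a fully fuzzy system would carry fuzzy-number arithmetic on every coefficient; that machinery is omitted in this sketch, and the small example system is an assumption):

```python
import numpy as np

def jacobi(A, b, iters=50):
    """Jacobi iteration x_{k+1} = D^-1 (b - (A - D) x_k), which
    converges when A is strictly diagonally dominant."""
    D = np.diag(A)              # diagonal entries of A
    R = A - np.diag(D)          # off-diagonal part
    x = np.zeros_like(b, dtype=float)
    for _ in range(iters):
        x = (b - R @ x) / D
    return x

# Example: a diagonally dominant 2x2 system with exact solution [2, 1].
A = np.array([[4.0, 1.0], [2.0, 5.0]])
b = np.array([9.0, 9.0])
x = jacobi(A, b)
```

In the fuzzy setting each entry of `A`, `b` and `x` becomes a fuzzy number (e.g. a triple of left spread, center, right spread), so each arithmetic operation above is replaced by its fuzzy counterpart.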
Abstract: This paper focuses on the modeling, analysis and simulation of a 42V/14V dc/dc converter based architecture. This architecture is considered a technically viable solution for the automotive dual-voltage power system of passenger cars in the near future. An interleaved dc/dc converter system is chosen as the automotive converter topology due to its advantages regarding filter reduction, dynamic response, and power management. Presented herein is a model of a one-kilowatt interleaved six-phase buck converter designed to operate in Discontinuous Conduction Mode (DCM). The control strategy of the converter is based on voltage-mode-controlled Pulse Width Modulation (PWM) with a Proportional-Integral-Derivative (PID) controller. The effectiveness of the interleaved step-down converter is verified through simulation results obtained with the control-oriented simulator MATLAB/Simulink.
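The voltage-mode PID loop can be sketched in discrete form (a generic sketch against a first-order stand-in for the averaged buck model; the gains, time constant and sample time are placeholder assumptions, not the paper's tuned MATLAB/Simulink values):

```python
class PID:
    """Discrete PID controller producing a PWM duty cycle from the
    output-voltage error (voltage-mode control)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, v_ref, v_out):
        err = v_ref - v_out
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        duty = self.kp * err + self.ki * self.integral + self.kd * deriv
        return min(max(duty, 0.0), 1.0)   # clamp to a valid PWM duty cycle

# Hypothetical regulation of 42 V down to 14 V through a first-order
# plant (time constant 2 ms) standing in for the averaged converter.
pid = PID(kp=0.05, ki=20.0, kd=0.0, dt=1e-4)
v = 0.0
for _ in range(5000):           # 0.5 s of simulated time
    d = pid.step(14.0, v)
    v += (42.0 * d - v) * 1e-4 / 2e-3
```

At steady state the integral term alone holds the duty cycle near 14/42 = 1/3, which is the defining property of integral action in this loop.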
Abstract: In this paper an optical code-division multiple-access (O-CDMA) packet network is considered, which offers inherent security in access networks. Two types of random access protocols are proposed for packet transmission. In protocol 1, all distinct codes are used; in protocol 2, distinct codes as well as shifted versions of all these codes are used. O-CDMA network performance is analyzed using one-dimensional (1-D) optical orthogonal codes (OOCs) and two-dimensional (2-D) wavelength/time single-pulse-per-row (W/T SPR) codes. The main advantage of using 2-D codes instead of 1-D codes is the reduction of errors due to multiple-access interference among different users. A correlation receiver is considered in the analysis. Using an analytical model, we compute and compare the packet-success probability for 1-D and 2-D codes in an O-CDMA network; the analysis shows improved performance with 2-D codes compared to 1-D codes.
Abstract: In this work a new platform for mobile-health systems is presented. The target application of the system is to provide decision support to rescue corps or military medical personnel in combat areas. The software architecture relies on a distributed client-server system that manages a hierarchy of wireless ad-hoc networks in which several different types of client operate. Each client is characterized by different hardware and software requirements. The lower hierarchy levels rely on a network of fully custom devices that store clinical information and patient status; these devices are designed to form an ad-hoc network operating in the 2.4 GHz ISM band and complying with the IEEE 802.15.4 standard (ZigBee). Medical personnel may interact with such devices, called MICs (Medical Information Carriers), by means of a PDA (Personal Digital Assistant) or an MDA (Medical Digital Assistant), transmit the information stored in their local databases, and issue service requests to the upper hierarchy levels using the IEEE 802.11a/b/g standard (WiFi). The server acts as a repository that stores both medical evacuation forms and associated events (e.g., a teleconsulting request). All the actors participating in the diagnostic or evacuation process may asynchronously access this repository and update its content or generate new events. The designed system aims to optimise and improve information spreading and flow among all the system components, with the goal of improving both diagnostic quality and the evacuation process.