Abstract: This paper shows that applying probabilistic-statistical methods at the early stage of diagnosing the technical condition of an aviation gas turbine engine (GTE), when the flight information is fuzzy, limited, and uncertain, is not well founded. We therefore consider the efficiency of applying the new Soft Computing technology at these diagnosing stages, using Fuzzy Logic and Neural Network methods. Fuzzy multiple linear and non-linear models (fuzzy regression equations), obtained on the basis of statistical fuzzy data, are trained with high accuracy. To construct a more adequate model of the GTE technical condition, the dynamics of changes in the skewness and kurtosis coefficients are analysed. These investigations show that the distributions of GTE operating parameters have a fuzzy character, so consideration of fuzzy skewness and kurtosis coefficients is expedient. Investigation of the dynamics of changes in the basic characteristics of GTE operating parameters leads to the conclusion that Fuzzy Statistical Analysis is necessary for preliminary identification of the engine's technical condition. The changes in the correlation coefficients also have a fuzzy character, so the results of Fuzzy Correlation Analysis are offered for model selection. To check model adequacy, the Fuzzy Multiple Correlation Coefficient of the Fuzzy Multiple Regression is considered. When sufficient information is available, a recurrent algorithm (Hard Computing technology) is offered for identifying the GTE technical condition from measurements of the input and output parameters of the multiple linear and non-linear generalised models in the presence of measurement noise (a new recursive Least Squares Method (LSM)). The developed GTE condition monitoring system provides stage-by-stage estimation of the engine's technical condition.
As an application of the given technique, the temperature condition of a new operating aviation engine was estimated.
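The recursive least-squares identification step mentioned above can be illustrated with a minimal sketch for a scalar linear model; the model, forgetting factor, and data below are invented for illustration and are not the paper's GTE formulation:

```python
# Minimal sketch of recursive least squares (RLS) for a scalar model
# y_k = theta * x_k + noise; 'theta' and the data here are illustrative.
import random

def rls_scalar(xs, ys, lam=1.0):
    theta, P = 0.0, 1e6              # initial estimate and (large) covariance
    for x, y in zip(xs, ys):
        k = P * x / (lam + x * P * x)    # gain
        theta += k * (y - theta * x)     # update with the prediction error
        P = (P - k * x * P) / lam        # covariance update (lam = forgetting factor)
    return theta

random.seed(0)
xs = [random.uniform(1, 2) for _ in range(200)]
ys = [3.0 * x + random.gauss(0, 0.05) for x in xs]   # true parameter is 3.0
est = rls_scalar(xs, ys)
```

Each new measurement refines the estimate without reprocessing past data, which is what makes the recurrent formulation attractive for on-line condition monitoring.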
Abstract: Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused its attention on Web application design, development, analysis, and testing, studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. Traditional static source code analysis may be very difficult because of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutational techniques in order to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation source code analysis applied to Web software to build application models. Mutation-based generated models may contain more information than necessary, so we need a pruning mechanism.
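A toy illustration of the kind of mutation operator such analyses apply: a relational operator in a source fragment is flipped and both variants are executed to compare behavior. The fragment and operator choice are invented, not the WAAT project's actual operators:

```python
# Mutation analysis sketch: flip ">=" to ">" in a code fragment and check
# whether some input distinguishes original from mutant ("kills" it).
src = "def is_adult(age):\n    return age >= 18\n"
mutant_src = src.replace(">=", ">")       # the mutation operator

scope, mscope = {}, {}
exec(src, scope)
exec(mutant_src, mscope)
original, mutant = scope["is_adult"], mscope["is_adult"]

# the boundary input 18 kills the mutant: behaviors differ
killed = original(18) != mutant(18)
```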
Abstract: Environmental awareness and the recent
environmental policies have forced many electric utilities to
restructure their operational practices to account for their emission
impacts. One way to accomplish this is by reformulating the
traditional economic dispatch problem such that emission effects are
included in the mathematical model. This paper presents a Particle
Swarm Optimization (PSO) algorithm to solve the Economic-Emission Dispatch (EED) problem, which has gained recent attention due to the deregulation of the power industry and strict environmental regulations. The problem is formulated as a multi-objective one with two competing functions, namely economic cost and emission functions, subject to different constraints. The inequality constraints considered are the generating unit capacity limits, while the equality constraint is the generation-demand balance. A novel equality constraint handling mechanism is proposed in this paper. The PSO algorithm is tested on a 30-bus standard test system. The results obtained show that the PSO algorithm has great potential for handling multi-objective optimization problems and is capable of capturing the Pareto optimal solution set under different loading conditions.
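The basic PSO mechanics for a dispatch-style problem can be sketched as follows. The cost and emission coefficients, demand, weights, and the penalty-based handling of the balance constraint are all illustrative inventions; the paper proposes its own (different) equality constraint handling mechanism and uses real 30-bus system data:

```python
# PSO sketch: minimize a weighted sum of hypothetical cost and emission
# functions for two generators, with a demand-balance penalty term.
import random

def fitness(p):
    cost = 0.01*p[0]**2 + 2*p[0] + 0.012*p[1]**2 + 1.8*p[1]  # $/h (made up)
    emis = 0.004*p[0]**2 + 0.005*p[1]**2                     # kg/h (made up)
    balance = abs(p[0] + p[1] - 300.0)                       # demand = 300 MW
    return 0.5*cost + 0.5*emis + 1e3*balance                 # penalty method

random.seed(1)
lo, hi, dim, n = 50.0, 250.0, 2, 30          # unit capacity limits, swarm size
X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
V = [[0.0]*dim for _ in range(n)]
pbest = [x[:] for x in X]
gbest = min(X, key=fitness)[:]
for _ in range(200):
    for i in range(n):
        for d in range(dim):
            V[i][d] = (0.7*V[i][d]                                   # inertia
                       + 1.5*random.random()*(pbest[i][d] - X[i][d]) # cognitive
                       + 1.5*random.random()*(gbest[d] - X[i][d]))   # social
            X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))  # enforce capacity limits
        if fitness(X[i]) < fitness(pbest[i]):
            pbest[i] = X[i][:]
            if fitness(X[i]) < fitness(gbest):
                gbest = X[i][:]
```

Clamping positions to [lo, hi] handles the inequality (capacity) constraints directly, while the equality (balance) constraint is softened into a penalty here purely for the sketch.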
Abstract: With the deep development of software reuse, component-related technologies have been widely applied in the development of large-scale complex applications. Component identification (CI) is one of the primary research problems in software reuse: analyzing domain business models to obtain a set of business components with high reuse value and good reuse performance to support effective reuse. Based on the concept and classification of CI, its technical stack is briefly discussed from four views, i.e., the form of input business models, identification goals, identification strategies, and the identification process. Various CI methods presented in the literature are then classified into four types, i.e., domain analysis based methods, cohesion-coupling based clustering methods, CRUD matrix based methods, and other methods, and these methods are compared with respect to their advantages and disadvantages. Additionally, some insufficiencies of the study of CI are discussed, and their causes are explained. Finally, some promising research directions on this problem are outlined.
Abstract: This article examines the emergence and development of the Kazakhstani form of humanism. The biggest challenge for Kazakhstan in terms of humanism is advocating human values in parallel with promoting national interests, and preserving the continuity of traditions in various spheres of life, business, and culture. This should be a common goal for the entire society, the main direction for the national intelligentsia, and a platform for state policy. An idea worth considering is the formation of a national humanist tradition model; the challenges are adapting people to live in the context of new industrial and innovative economic conditions, keeping the balance during intensive economic development of the country, and ensuring social harmony in the society.
Abstract: The article deals with dividends and their distribution to investors from a theoretical point of view. Some studies have tried to analyze the reaction of the market to dividend announcements and found that a change of dividend policy is associated with abnormal returns around the dividend announcement date. Other studies directly questioned investors about their dividend preferences and beliefs. Investors want dividends for many reasons (e.g., some explain the dividend preference by the existence of transaction costs; investors prefer the dividend today because it is less risky; managers have private information about the firm). The most controversial theory of dividend policy was developed by Modigliani and Miller (1961), who demonstrated that in perfect and complete capital markets the dividend policy is irrelevant and the value of the company is independent of its payout policy. Nevertheless, in the real world capital markets are imperfect because of asymmetric information, transaction costs, incomplete contracting possibilities, and taxes.
Abstract: We present in this paper a new approach for specific JPEG steganalysis and propose studying statistics of the compressed DCT coefficients. Traditionally, steganographic algorithms try to preserve statistics of the DCT and of the spatial domain, but they cannot preserve both and also control the alteration of the compressed data. We have noticed a deviation of the entropy of the compressed data after a first embedding. This deviation is greater when the image is a cover medium than when the image is a stego image. To observe this deviation, we introduced new statistical features and combined them with the Multiple Embedding Method. This approach is motivated by the Avalanche Criterion of the JPEG lossless compression step. This criterion makes it possible to design detectors whose detection rates are independent of the payload. Finally, we designed a Fisher discriminant based classifier for the well-known steganographic algorithms Outguess, F5, and Hide and Seek. The experimental results we obtained show the efficiency of our classifier for these algorithms. Moreover, it is also designed to work with low embedding rates (< 10^-5) and, according to the avalanche criterion of the RLE and Huffman compression steps, its efficiency is independent of the quantity of hidden information.
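The Fisher discriminant at the core of such a classifier can be sketched for two-dimensional feature vectors; the synthetic Gaussian "cover" and "stego" features below stand in for the paper's entropy-based statistics:

```python
# Two-class Fisher linear discriminant: project features onto the direction
# w = Sw^-1 (m_cover - m_stego). Features here are synthetic, not the paper's.
import random

def fisher_direction(A, B):
    mA = [sum(x[d] for x in A)/len(A) for d in range(2)]   # class means
    mB = [sum(x[d] for x in B)/len(B) for d in range(2)]
    # pooled within-class scatter matrix Sw (2x2)
    S = [[0.0, 0.0], [0.0, 0.0]]
    for cls, m in ((A, mA), (B, mB)):
        for x in cls:
            d0, d1 = x[0]-m[0], x[1]-m[1]
            S[0][0] += d0*d0; S[0][1] += d0*d1
            S[1][0] += d1*d0; S[1][1] += d1*d1
    det = S[0][0]*S[1][1] - S[0][1]*S[1][0]
    dm = [mA[0]-mB[0], mA[1]-mB[1]]
    # w = Sw^-1 (mA - mB), via the explicit 2x2 inverse
    return [( S[1][1]*dm[0] - S[0][1]*dm[1]) / det,
            (-S[1][0]*dm[0] + S[0][0]*dm[1]) / det]

random.seed(2)
cover = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(200)]
stego = [[random.gauss(2, 1), random.gauss(1, 1)] for _ in range(200)]
w = fisher_direction(cover, stego)
proj_c = [w[0]*x[0] + w[1]*x[1] for x in cover]   # projections; a threshold on
proj_s = [w[0]*x[0] + w[1]*x[1] for x in stego]   # these separates the classes
```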
Abstract: In this paper, the direct kinematic model of a multiple-application three-degrees-of-freedom industrial manipulator was developed using homogeneous transformation matrices and the Denavit-Hartenberg parameters; the inverse kinematic model was developed using the same method. After verifying that the inverse kinematics presents considerable errors at the border of the workspace, a genetic algorithm was implemented to optimize the model, greatly improving its efficiency.
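The direct (forward) kinematics via Denavit-Hartenberg homogeneous transformation matrices can be sketched as follows. The 3-DOF chain, link lengths, and D-H table below are hypothetical, not the manipulator studied in the paper:

```python
# Forward kinematics by chaining standard D-H homogeneous transforms.
# Illustrative all-revolute planar chain: alpha = 0 and d = 0 for each joint.
import math

def dh(theta, d, a, alpha):
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [[ct, -st*ca,  st*sa, a*ct],
            [st,  ct*ca, -ct*sa, a*st],
            [0.0,    sa,     ca,    d],
            [0.0,   0.0,    0.0, 1.0]]

def matmul(A, B):
    return [[sum(A[i][k]*B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward(q, links=(0.3, 0.25, 0.15)):      # link lengths in metres (made up)
    T = [[float(i == j) for j in range(4)] for i in range(4)]  # identity
    for theta, a in zip(q, links):
        T = matmul(T, dh(theta, 0.0, a, 0.0))
    return T[0][3], T[1][3]                   # end-effector x, y position

x, y = forward([0.0, 0.0, 0.0])               # fully extended along x
```

The inverse problem inverts this mapping; a genetic algorithm, as in the paper, can minimize the position error between `forward(q)` and a target point over the joint angles `q`.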
Abstract: E-government projects have potential for greater efficiency and effectiveness of government operations. For this reason, many developing countries' governments have invested heavily in this agenda, and an increasing number of e-government projects are being implemented. However, there is a lack of clear case material describing the potentialities and consequences experienced by organizations trying to manage this change. The Ministry of State for Administrative Development (MSAD) has been the organization responsible for the e-Government program in Egypt since early 2004. This paper presents a case study of the process of admission to public universities and institutions in Egypt, which is led by MSAD. Underlining the key benefits resulting from the initiative, explaining the strategies and development steps used to implement it, and highlighting the main obstacles encountered and how they were overcome will help repeat the experience in other useful e-government projects.
Abstract: Cell formation is the first step in the design of cellular
manufacturing systems. In this study, a general purpose
computational scheme employing a hybrid tabu search algorithm as
the core is proposed to solve the cell formation problem and its
variants. In the proposed scheme, great flexibility is left to the users. The core solution-searching algorithm embedded in the scheme can easily be changed to any other meta-heuristic algorithm, such as simulated annealing, a genetic algorithm, etc., based on the characteristics of the problems to be solved or the preferences the users might have. In addition, several counters are designed to control the timing of conducting the intensified and diversified solution-searching strategies interactively.
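A generic tabu-search skeleton of the kind such a scheme embeds can be sketched on a toy cell formation instance. The 6-machine part-machine data, neighborhood, tenure, and penalty are all invented for illustration:

```python
# Tabu search sketch: assign 6 machines to 2 cells, minimizing inter-cell
# part movements on a made-up instance; empty cells are penalized.
import random

parts = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]  # machine pairs per part

def cost(assign):
    inter = sum(1 for a, b in parts if assign[a] != assign[b])   # inter-cell moves
    empty = 100 if 0 in (assign.count(0), assign.count(1)) else 0  # forbid empty cell
    return inter + empty

random.seed(3)
cur = [random.randint(0, 1) for _ in range(6)]
best, best_c = cur[:], cost(cur)
tabu = {}                                   # machine -> iteration until flip is tabu
for it in range(100):
    cand = None
    for m in range(6):                      # neighborhood: move one machine
        nxt = cur[:]
        nxt[m] ^= 1
        c = cost(nxt)
        if tabu.get(m, -1) > it and c >= best_c:    # tabu unless aspiration holds
            continue
        if cand is None or c < cand[1]:
            cand = (nxt, c, m)
    cur, c, m = cand
    tabu[m] = it + 5                        # tabu tenure of 5 iterations
    if c < best_c:
        best, best_c = cur[:], c
```

Swapping the move loop for an annealing or genetic step is exactly the kind of core-algorithm replacement the proposed scheme allows; counters for intensification and diversification would be added around this loop.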
Abstract: Response to public health-related emergencies is analysed here for a rural university in South Africa. The structure of the designated emergency plan covers all the phases of the disaster management cycle. The plan contains elements of the vulnerability model and the technocratic model of emergency management. The response structures are vertically and horizontally integrated, while the planning contains elements of scenario-based and functional planning. The available number of medical professionals at Rhodes University, along with the medical insurance rates, makes the staff and students potentially more medically vulnerable than the South African population. The main improvements to the emergency management are required in tornado response and in information dissemination during health emergencies. The latter should involve the increased use of social media and e-mails, following the Taylor model of communication. Infrastructure must be improved in the telecommunication sector in the face of unpredictable electricity outages.
Abstract: As the global climate changes, the threat from
landslides and debris flows increases. Learning how a watershed
initiates landslides under abnormal rainfall conditions and predicting
landslide magnitude and frequency distribution is thus important.
Landslides show a power-law frequency-area distribution, and the distribution curve shows an exponent gradient of 1.0 in the Sandpile model test. Will the landslide frequency-area statistics
show a distribution similar to the Sandpile model under extreme
rainfall conditions? The purpose of the study is to identify the extreme
rainfall-induced landslide frequency-area distribution in the Laonong
River Basin in southern Taiwan. Results of the analysis show that a
lower gradient of landslide frequency-area distribution could be
attributed to the transportation and deposition of debris flow areas that
are included in the landslide area.
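The power-law exponent of a frequency-area distribution can be estimated as in this sketch, where synthetic "landslide areas" are drawn from a Pareto law and the exponent is recovered with the maximum-likelihood (Hill) estimator; the data are synthetic, and the exponent 1.0 merely echoes the Sandpile-model gradient cited above:

```python
# Draw areas from a Pareto law P(A > a) = (a_min/a)^alpha via inverse-CDF
# sampling, then recover alpha with the Hill / maximum-likelihood estimator.
import math
import random

random.seed(4)
a_min, true_exp = 1.0, 1.0                    # areas >= a_min, tail exponent 1.0
areas = [a_min * (1.0 - random.random()) ** (-1.0 / true_exp)
         for _ in range(5000)]

# Hill estimator: alpha_hat = n / sum(ln(a_i / a_min))
est = len(areas) / sum(math.log(a / a_min) for a in areas)
```

Fitting the observed landslide inventory this way and comparing the estimated gradient against the Sandpile value is one way to pose the question raised in the abstract.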
Abstract: This study aims to investigate the gender differences in
spatial navigation using the tasks of 2-D matrix navigation and
recognition of a real driving scene. The results can be summarized as follows. First, female subjects responded faster in the 2-D matrix
navigation task than male subjects when landmark instructions were
provided. Second, in recognition task, male subjects recognized the
key elements involved in the past driving scene more accurately than
female subjects. In particular, female subjects tended to miss
peripheral information. These results suggest the possibility of gender
differences in spatial navigation.
Abstract: In this paper we present an enhanced noise reduction method for robust speech recognition using an Adaptive Gain Equalizer with Nonlinear Spectral Subtraction. In the Adaptive Gain Equalizer (AGE) method, the input signal is divided into a number of subbands that are individually weighted in the time domain, according to a short-time Signal-to-Noise Ratio (SNR) estimate in each subband at every time instant. Instead of focusing on noise suppression, the method focuses on speech enhancement. When the method was analyzed under various noise conditions for speech recognition, it was found that the Adaptive Gain Equalizer algorithm has an obvious failing point at an SNR of -5 dB, with inadequate levels of noise suppression for SNRs below this point. This work proposes the implementation of AGE coupled with Nonlinear Spectral Subtraction (AGE-NSS) for robust speech recognition. The experimental results show that our AGE-NSS outperforms AGE when the SNR drops below the -5 dB level.
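The spectral-subtraction idea that AGE is coupled with can be sketched as follows: an estimated noise magnitude spectrum is subtracted from the noisy spectrum, with a floor to avoid negative magnitudes. A naive DFT keeps the sketch dependency-free; the signal, frame size, floor, and the (here exactly known) noise estimate are all illustrative, not the AGE-NSS design:

```python
# Magnitude spectral subtraction on one synthetic frame: subtract the noise
# magnitude per bin, keep the noisy phase, floor at 5% of the noisy magnitude.
import cmath
import math
import random

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j*math.pi*k*t/n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j*math.pi*k*t/n) for k in range(n)).real / n
            for t in range(n)]

random.seed(5)
n = 64
clean = [math.sin(2*math.pi*8*t/n) for t in range(n)]     # tone at bin 8
noise = [random.gauss(0, 0.3) for _ in range(n)]
noisy = [c + w for c, w in zip(clean, noise)]

N_est = [abs(v) for v in dft(noise)]     # noise spectrum (known here for the sketch)
X = dft(noisy)
Y = [max(abs(v) - N_est[k], 0.05*abs(v)) * cmath.exp(1j*cmath.phase(v))
     for k, v in enumerate(X)]
denoised = idft(Y)

def err(a, b):
    return sum((x - y)**2 for x, y in zip(a, b))
```

In a real system the noise spectrum must be estimated (e.g. during speech pauses), and a nonlinear, SNR-dependent over-subtraction factor replaces the plain subtraction used here.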
Abstract: This article presents a resistorless current-mode first-order allpass filter based on second-generation current controlled current conveyors (CCCIIs). The features of the circuit are that the pole frequency can be electronically controlled via the input bias current, and the circuit description is very simple, consisting of two CCCIIs and a single grounded capacitor, without any external resistors or component-matching requirements. Consequently, the proposed circuit is very appropriate for further development into an integrated circuit. The low input and high output impedances of the proposed configuration enable the circuit to be cascaded in current mode without additional current buffers. PSpice simulation results are depicted; they agree well with the theoretical anticipation. An application example as a current-mode quadrature oscillator is included.
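For reference, a generic first-order allpass response of the kind realized here can be written as below; the exact transfer function of the proposed circuit may differ, and the intrinsic-resistance formula assumes a bipolar translinear CCCII implementation:

```latex
T(s) = \frac{1 - s\,R_x C}{1 + s\,R_x C}, \qquad
\omega_p = \frac{1}{R_x C}, \qquad
R_x \approx \frac{V_T}{2 I_B} \;\Rightarrow\; \omega_p \approx \frac{2 I_B}{V_T C}
```

The dependence of the parasitic resistance $R_x$ on the bias current $I_B$ is what makes the pole frequency electronically tunable, as the abstract states.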
Abstract: Three new algorithms, based on minimization of the autocorrelation of transmitted symbols and the SLM approach, which are computationally less demanding, are proposed. In the first algorithm, the autocorrelation of the complex data sequence is minimized to a value of 1, which results in a reduction of PAPR. The second algorithm generates multiple random sequences from the sequence generated in the first algorithm with the same autocorrelation value, i.e., 1. Of these, the sequence with the minimum PAPR is transmitted. The third algorithm is an extension of the second and requires minimum side information to be transmitted. Multiple sequences are generated by modifying a fixed number of complex numbers in an OFDM data sequence using only one factor. The multiple sequences represent the same data sequence, and the one giving the minimum PAPR is transmitted. Simulation results for a 256-subcarrier OFDM system show that a significant reduction in PAPR is achieved using the proposed algorithms.
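The PAPR computation and the SLM-style "pick the best candidate" step can be sketched as follows; 256 subcarriers match the paper's simulations, but the QPSK data and the random phase sequences are made up and do not reproduce the proposed autocorrelation-minimizing algorithms:

```python
# PAPR of an OFDM symbol (peak-to-average power of the IDFT output) and an
# SLM-style selection over phase-rotated candidates of the same data.
import cmath
import math
import random

def papr_db(symbols):
    n = len(symbols)
    x = [sum(symbols[k]*cmath.exp(2j*math.pi*k*t/n) for k in range(n)) / n
         for t in range(n)]                       # naive IDFT (time samples)
    powers = [abs(v)**2 for v in x]
    return 10*math.log10(max(powers) / (sum(powers)/n))

random.seed(6)
n = 256
qpsk = [cmath.exp(1j*(math.pi/4 + math.pi/2*random.randrange(4)))
        for _ in range(n)]                        # random QPSK data
base = papr_db(qpsk)

candidates = [base]
for _ in range(4):                                # 4 random phase sequences (SLM)
    phases = [random.choice((1, -1, 1j, -1j)) for _ in range(n)]
    candidates.append(papr_db([s*p for s, p in zip(qpsk, phases)]))
best = min(candidates)                            # transmit the lowest-PAPR variant
```

The receiver only needs to know which phase sequence was used, which is the side information the paper's third algorithm tries to minimize.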
Abstract: In this paper, a novel feature-based image
watermarking scheme is proposed. Zernike moments which have
invariance properties are adopted in the scheme. In the proposed
scheme, feature points are first extracted from the host image, and several
circular patches centered on these points are generated. The patches
are used as carriers of watermark information because they can be
regenerated to locate watermark embedding positions even when
watermarked images are severely distorted. The Zernike transform is then
applied to the patches to calculate local Zernike moments. Dither
modulation is adopted to quantize the magnitudes of the Zernike
moments, followed by a false alarm analysis. Experimental results show that the quality degradation of the watermarked image is visually transparent. The proposed scheme is very robust against image
processing operations and geometric attacks.
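Dither modulation (a form of quantization index modulation) as used to embed bits into the Zernike-moment magnitudes can be sketched as follows; the step size and the "magnitudes" are arbitrary illustrative numbers, not values from the paper:

```python
# Dither modulation sketch: each bit selects one of two offset quantization
# lattices; detection picks the bit whose lattice is nearer to the value.
def dm_embed(mag, bit, step=4.0):
    d = 0.0 if bit == 0 else step / 2.0          # dither offset per bit
    return round((mag - d) / step) * step + d    # quantize onto that lattice

def dm_detect(mag, step=4.0):
    e0 = abs(mag - dm_embed(mag, 0, step))       # distance to bit-0 lattice
    e1 = abs(mag - dm_embed(mag, 1, step))       # distance to bit-1 lattice
    return 0 if e0 <= e1 else 1

magnitudes = [10.3, 7.8, 15.2, 3.6]              # stand-ins for Zernike magnitudes
bits = [1, 0, 1, 1]
marked = [dm_embed(m, b) for m, b in zip(magnitudes, bits)]

# mild distortion (|noise| < step/4) should not flip the detected bits
noisy = [m + e for m, e in zip(marked, [0.7, -0.9, 0.5, -0.6])]
decoded = [dm_detect(m) for m in noisy]
```

Robustness comes from the step size: any distortion smaller than a quarter of the quantization step leaves each magnitude closer to its own lattice than to the other.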
Abstract: Current trends in manufacturing are characterized by production broadening, innovation cycle shortening, and products with new shapes, materials, and functions. A time-focused production strategy requires a change from the traditional functional production structure to flexible manufacturing cells and lines. Production by automated manufacturing systems (AMS) is one of the most important manufacturing philosophies of recent years. The main goal of the project we are involved in is to build a laboratory housing a flexible manufacturing system consisting of at least two production machines with NC control (milling machines, lathe). These machines will be linked to a transport system and served by industrial robots. Within this flexible manufacturing system, a quality control station consisting of a camera system and a rack warehouse will also be located. The design, analysis, and improvement of this manufacturing system, with a special focus on the communication among devices, constitute the main aims of this paper. The key determining factors for the manufacturing system design are: the product, the production volume, the machines used, the available manpower, the available infrastructure, and the legislative frame for the specific cases.
Abstract: This paper reviews recent studies, and particularly the effects of climate change in the North Tropical Atlantic, by studying the atmospheric conditions that prevailed in 2005: the Coral Bleaching HotSpot and Hurricane Katrina. With the aim of better understanding and estimating the impact of the physical phenomenon, i.e. the Thermal Oceanic HotSpot (TOHS), isotopic studies of δ18O and δ13C on marine animals from Guadeloupe (French Caribbean Island) were carried out. The recorded measurements show a Sea Surface Temperature (SST) of up to 35°C in August, which is much higher than the 32°C recorded by NOAA satellites. After reviewing the process that led to the creation of Hurricane Katrina, which hit New Orleans on August 29, 2005, it is shown that the climatic conditions in the Caribbean from August to October 2005 influenced Katrina's evolution. This TOHS is a combined effect of various phenomena and represents an additional factor for estimating future climate changes.
Abstract: A linear system is called a fully fuzzy linear system (FFLS) if the quantities in the system are all fuzzy numbers. For the FFLS, we investigate its solution and develop a new approximate method for solving the FFLS. Observing the numerical results, we find that our method is more accurate than the iterative Jacobi and Gauss-Seidel methods in approximating the solution of the FFLS.
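The crisp-number Jacobi iteration that such comparisons build on can be sketched as below; a fully fuzzy system would apply the same recursion to the centers and spreads of the fuzzy numbers. The 2x2 system here is an illustrative example, not one of the paper's test problems:

```python
# Jacobi iteration for Ax = b: each component is updated from the previous
# iterate, x_i <- (b_i - sum_{j != i} a_ij x_j) / a_ii.
def jacobi(A, b, iters=50):
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        x = [(b[i] - sum(A[i][j]*x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
    return x

A = [[4.0, 1.0],          # strictly diagonally dominant, so Jacobi converges
     [2.0, 5.0]]
b = [9.0, 12.0]
x = jacobi(A, b)          # exact solution: x = [11/6, 5/3]
```

Gauss-Seidel differs only in using the freshly updated components within the same sweep, which typically converges faster on the same diagonally dominant systems.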