Abstract: Real-time embedded systems should benefit from
component-based software engineering to handle complexity and
deal with dependability. In these systems, applications should not
only be logically correct but also behave within time windows.
However, in current component-based software engineering
approaches, few component models handle time properties in
a manner that allows efficient analysis and checking at the
architectural level. In this paper, we present a meta-model for
component-based software description that integrates timing
issues. To achieve a complete functional model of software
components, our meta-model focuses on four functional aspects:
interface, static behavior, dynamic behavior, and interaction
protocol. With each aspect we have explicitly associated a time
model. Such a time model can be used to check a component's
design against certain properties and to compute the timing
properties of component assemblies.
Abstract: This paper shows that the application of probability-statistical methods at the early stages of diagnosing the technical condition of an aviation gas turbine engine (GTE), when the flight information is fuzzy, limited and uncertain, is unfounded. Hence the efficiency of applying the new Soft Computing technology at these diagnosing stages, using Fuzzy Logic and Neural Network methods, is considered. Fuzzy multiple linear and non-linear models (fuzzy regression equations), obtained on the basis of statistical fuzzy data, are trained with high accuracy. To build a more adequate model of the GTE technical condition, the dynamics of changes in the skewness and kurtosis coefficients are analysed. Research into the changes of skewness and kurtosis coefficient values shows that the distributions of GTE operating parameters have a fuzzy character; hence consideration of fuzzy skewness and kurtosis coefficients is expedient. Investigation of the dynamics of changes of the basic characteristics of GTE operating parameters leads to the conclusion that Fuzzy Statistical Analysis is necessary for preliminary identification of the engines' technical condition. Research into the changes of correlation coefficient values also shows their fuzzy character; therefore, for model selection, the application of Fuzzy Correlation Analysis results is proposed. To check model adequacy, the Fuzzy Multiple Correlation Coefficient of Fuzzy Multiple Regression is considered. When sufficient information is available, a recurrent algorithm for identifying the aviation GTE technical condition (using Hard Computing technology) is proposed, based on measurements of the input and output parameters of the multiple linear and non-linear generalised models in the presence of measurement noise (a new recursive Least Squares Method (LSM)). The developed GTE condition monitoring system provides stage-by-stage estimation of the engine technical condition.
As an application of the given technique, the temperature condition of a new operating aviation engine was estimated.
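The recursive least-squares identification the abstract refers to can be sketched as follows; this is a generic RLS update for a linear-in-parameters model, not the paper's specific algorithm, and all variable names and the illustrative data are assumptions:

```python
import numpy as np

def rls_identify(X, y, lam=1.0, delta=1e3):
    """Recursive least squares for y_k = phi_k . theta + noise.

    lam is the forgetting factor (1.0 = ordinary least squares);
    delta sets a large initial covariance expressing prior uncertainty.
    """
    n = X.shape[1]
    theta = np.zeros(n)
    P = delta * np.eye(n)
    for phi, yk in zip(X, y):
        k = P @ phi / (lam + phi @ P @ phi)      # gain vector
        theta = theta + k * (yk - phi @ theta)   # correct the estimate
        P = (P - np.outer(k, phi) @ P) / lam     # shrink the covariance
    return theta

# Illustrative model y = 2*x1 - 0.5*x2 recovered from generated data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = X @ np.array([2.0, -0.5])
theta = rls_identify(X, y)
```

With a forgetting factor lam < 1 the estimate tracks slowly varying parameters, which is what makes recursive estimation suitable for stage-by-stage condition monitoring.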
Abstract: Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused attention on Web application design, development, analysis, and testing, by studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. The use of traditional static source code analysis may be very difficult, because of the presence of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutational techniques in order to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation source code analysis applied to Web software to build application models. Mutation-based generated models may contain more information than necessary, so we need a pruning mechanism.
Abstract: Product Data Management (PDM) systems for Computer
Aided Design (CAD) file management are widely established
in design processes. This management system is indispensable for
design collaboration or when design task distribution is present. It is
thus surprising that engineering design curricula have not paid much
attention to education in PDM systems. This is also the case
for education in ecodesign and environmental evaluation of products.
With the rise of sustainability as a strategic aspect in companies,
environmental concerns are becoming a key issue in design. This
paper discusses the establishment of a PDM platform to be used
among technical and vocational schools in Austria. The PDM system
facilitates design collaboration among these schools. Further, it will
be discussed how the PDM system has been prepared in order to
facilitate environmental evaluation of parts, components and subassemblies
of a product. By integrating a Business Intelligence
solution, environmental Life Cycle Assessment and communication
of the results are enabled.
Abstract: This study explores how the mechanics of learning
paves the way to engineering innovation. Theories related to learning
in the new product/service innovation are reviewed from an
organizational perspective, behavioral perspective, and engineering
perspective. From this, an engineering team's external interactions
for knowledge brokering and internal composition for skill balance
are examined from a learning and innovation viewpoint. As a result,
an integrated learning model is developed by reconciling the
theoretical perspectives as well as developing propositions that
emphasize the centrality of learning, and its drivers, in
engineering product/service development. The paper also provides a
review and partial validation of the propositions using the results of a
previously published field study in the aerospace industry.
Abstract: Nondestructive testing in engineering is an inverse
Cauchy problem for the Laplace equation. In this paper the problem
of nondestructive testing is expressed by a Laplace equation with
third-kind boundary conditions. In order to find unknown values on
the boundary, the method of fundamental solution is introduced and
realized. Because of the ill-posedness of studied problems, the TSVD
regularization technique in combination with L-curve criteria and
Generalized Cross Validation (GCV) criteria is employed. Numerical
results show that the TSVD method combined with the L-curve
criterion is more efficient than the TSVD method combined with the
GCV criterion.
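The TSVD regularization step can be illustrated with a minimal sketch; the Vandermonde test system is only an illustrative ill-conditioned example, and choosing the truncation level k is where the L-curve or GCV criterion would come in:

```python
import numpy as np

def tsvd_solve(A, b, k):
    """Truncated-SVD solution of A x = b: invert only the k largest
    singular values, discarding the small ones that amplify noise."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    inv_s = np.zeros_like(s)
    inv_s[:k] = 1.0 / s[:k]
    return Vt.T @ (inv_s * (U.T @ b))

# Illustrative ill-conditioned system (Vandermonde matrix).
A = np.vander(np.linspace(0.0, 1.0, 8), 5)
x_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
b = A @ x_true
x = tsvd_solve(A, b, k=5)   # noiseless data: full rank; noisy data needs k < 5
```

Selecting k is the regularization decision: the L-curve plots the residual norm against the solution norm over k and picks the corner, while GCV minimizes a cross-validation functional.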
Abstract: A novel methodology has been used to design an
evaporator coil of a refrigeration unit. The methodology followed is a
complete Computer Aided Design / Computer Aided Engineering
approach, by means of a Computational Fluid Dynamics / Finite
Element Analysis model which is executed many times for the
thermal-fluid exploration of several design configurations by a
commercial optimizer. Hence the design is carried out automatically
by parallel computations, with an optimization package taking the
decisions rather than the design engineer. The engineer instead takes
decisions regarding the physical settings and initialization of the
computational models to employ, the number and the ranges of the
geometrical parameters of the coil fins, and the optimization tools to
be employed. The final design of the coil geometry was found to be
better than the initial design.
Abstract: This paper focuses on a novel method for semantic
searching and retrieval of information about learning materials.
Metametadata encapsulate metadata instances by using the properties
and attributes provided by ontologies rather than describing learning
objects. A novel metametadata taxonomy has been developed which
provides the basis for a semantic search engine to extract, match and
map queries to retrieve relevant results. The use of ontological views
is a foundation for viewing the pedagogical content of metadata
extracted from learning objects by using the pedagogical attributes
from the metametadata taxonomy. Using the ontological approach
and metametadata (based on the metametadata taxonomy) we present
a novel semantic searching mechanism. These three strands – the
taxonomy, the ontological views, and the search algorithm – are
incorporated into a novel architecture (OMESCOD) which has been
implemented.
Abstract: Green buildings have been commonly cited to be more
expensive than conventional buildings. However, limited research
has been conducted to clearly identify elements that contribute to this
cost differential. The construction cost of buildings can be typically
divided into “hard" costs and “soft" cost elements. Using a review
analysis of existing literature, the study identified six main elements
in green buildings that contribute to the general cost elements that are
“soft" in nature. The six elements found are insurance, developer-s
experience, design cost, certification, commissioning and energy
modeling. Out of the six elements, most literatures have highlighted
the increase in design cost for green design as compared to
conventional design due to additional architectural and engineering
costs, eco-charrettes, extra design time, and the further need for a
green consultant. The study concluded that these elements of soft cost
contribute to the green premium or cost differential of green
buildings.
Abstract: The problem of spam has been seriously troubling the Internet community during the last few years and has currently reached an alarming scale. Observations made at CERN (the European Organization for Nuclear Research, located in Geneva, Switzerland) show that spam mails can constitute up to 75% of daily SMTP traffic. A naïve Bayesian classifier based on a Bag of Words representation of an email is widely used to stop this unwanted flood, as it combines good performance with simplicity of the training and classification processes. However, facing the constantly changing patterns of spam, it is necessary to assure online adaptability of the classifier. This work proposes combining such a classifier with another NBC (naïve Bayesian classifier) based on pairs of adjacent words. Only the latter will be retrained with examples of spam reported by users. Tests are performed on considerable sets of mails, both from public spam archives and from CERN mailboxes. They suggest that this architecture can increase spam recall without affecting the classifier precision, as happens when only the NBC based on single words is retrained.
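The two-classifier architecture rests on standard naive Bayes over token counts, where the tokens may be single words or pairs of adjacent words. A minimal sketch, assuming Laplace smoothing and log-probabilities (the tiny corpus and all names here are illustrative):

```python
import math
from collections import Counter

def train_nb(docs_tokens, labels):
    """Naive Bayes over token counts with Laplace (add-one) smoothing.
    Tokens may be single words or adjacent-word pairs (bigrams)."""
    counts = {"spam": Counter(), "ham": Counter()}
    n_docs = Counter(labels)
    for toks, lab in zip(docs_tokens, labels):
        counts[lab].update(toks)
    vocab = set(counts["spam"]) | set(counts["ham"])
    model = {"prior": {c: math.log(n_docs[c] / len(labels)) for c in counts},
             "loglik": {}, "vocab": vocab}
    for c in counts:
        total = sum(counts[c].values())
        model["loglik"][c] = {w: math.log((counts[c][w] + 1) / (total + len(vocab)))
                              for w in vocab}
    return model

def classify(model, toks):
    """Return the class with the highest posterior log-probability."""
    best, best_lp = None, -math.inf
    for c in ("spam", "ham"):
        lp = model["prior"][c] + sum(model["loglik"][c][w]
                                     for w in toks if w in model["vocab"])
        if lp > best_lp:
            best, best_lp = c, lp
    return best

def bigrams(words):
    return [a + "_" + b for a, b in zip(words, words[1:])]

# Word-pair model: only this one would be retrained on newly reported spam.
docs = [["buy", "cheap", "pills"], ["meeting", "at", "noon"],
        ["cheap", "pills", "now"], ["lunch", "at", "noon"]]
labels = ["spam", "ham", "spam", "ham"]
pair_model = train_nb([bigrams(d) for d in docs], labels)
```

In the proposed architecture a word-based model trained on a large fixed corpus would be combined with this pair-based model, and only the latter retrained online.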
Abstract: This paper presents a concept for a multidisciplinary process
supporting effective task transitions between different technical
domains during the architectural design stage.
One challenge in system configuration is the increased solution
space driven by multifunctionality. As a consequence, more iterations
are needed to find a global optimum, i.e. a compromise between the
involved disciplines, without a negative impact on development time.
Since state-of-the-art standards like ISO 15288 and VDI 2206 do not
provide a detailed methodology for the multidisciplinary design
process, higher uncertainties regarding final specifications arise. This
leads to the need for more detailed and standardized concepts or
processes which could mitigate risks.
The work performed is based on an analysis of multidisciplinary
interaction and of modeling and simulation techniques. To demonstrate
and prove the applicability of the presented concept, it is applied to
the design of aircraft high lift systems, in the context of the
engineering disciplines kinematics, actuation, monitoring, installation
and structure design.
Abstract: With high speed vessels getting ever more sophisticated,
travelling at higher and higher speeds and operating in areas of
high maritime traffic density, training becomes of the highest priority
to ensure that safety levels are maintained, and risks are adequately
mitigated. Training onboard the actual craft on the actual route still
remains the most effective way for crews to gain experience. However,
operational experience and incidents during the last 10 years
demonstrate the need for supplementary training whether in the area
of simulation or of man-to-man and man/machine interaction. Training and
familiarisation of the crew is the most important aspect in preventing
incidents. The use of simulator, computer and web based training
systems in conjunction with onboard training focusing on critical
situations will improve the man machine interaction and thereby
reduce the risk of accidents. Today, both ship simulator and bridge
teamwork courses are now becoming the norm in order to improve
further emergency response and crisis management skills. One of the
main causes of accidents is the human factor. An efficient way to
reduce human errors is to provide high-quality training to the personnel
and to select the navigators carefully.
Keywords: CBT/WBT systems, human factors.
Abstract: This paper looks into areas not covered by prominent
Agent-Oriented Software Engineering (AOSE) methodologies.
An extensive paper review led to the identification of two issues:
first, most of these methodologies almost neglect the semantic web
and ontology; second, as expected, each one has its strengths and
weaknesses and may focus on some phases of the development
lifecycle but not all of the phases. The work presented here builds
extensions to a highly regarded AOSE methodology (MaSE) in order
to cover the areas that this methodology does not concentrate on. The
extensions include introducing an ontology stage for semantic
representation and integrating early requirement specification from a
methodology which mainly focuses on that. The integration involved
developing transformation rules (with the necessary handling of
non-matching notions) between the two sets of representations and
building the software which automates the transformation. The
application of this integration on a case study is also presented in the
paper. The main flow of MaSE stages was changed to smoothly
accommodate the new additions.
Abstract: The prediction of long-term deformations of concrete and reinforced concrete structures has been a field of extensive research, and several different creep models have been developed so far. Most of the models were developed for constant concrete stresses; thus, in the case of varying stresses, a specific superposition principle or time integration, respectively, is necessary. Nowadays, when modeling concrete creep, the engineering focus is rather on the application of sophisticated time-integration methods than on choosing the more appropriate creep model. For this reason, this paper presents a method to quantify the uncertainties of creep prediction originating from the selection of creep models or from the time-integration methods. By adapting variance-based global sensitivity analysis, a methodology is developed to quantify the influence of creep model selection or choice of time-integration method. Applying the developed method, general recommendations on how to model creep behavior under varying stresses are given.
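The variance-based global sensitivity analysis can be sketched with a pick-freeze estimator of first-order Sobol indices; the toy linear surrogate below stands in for the actual creep models, and all names are illustrative, not the paper's method:

```python
import numpy as np

def sobol_first_order(model, d, n=100_000, seed=0):
    """First-order Sobol indices S_i = Var(E[Y|X_i]) / Var(Y), estimated
    with the pick-freeze scheme on independent uniform [0, 1) inputs."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))
    B = rng.random((n, d))
    yA, yB = model(A), model(B)
    var_y = np.var(np.concatenate([yA, yB]))
    S = np.empty(d)
    for i in range(d):
        AB_i = A.copy()
        AB_i[:, i] = B[:, i]                 # swap only input i
        S[i] = np.mean(yB * (model(AB_i) - yA)) / var_y
    return S

# Toy surrogate Y = X1 + 2*X2: analytically S = [0.2, 0.8].
S = sobol_first_order(lambda X: X[:, 0] + 2.0 * X[:, 1], d=2)
```

Here a high S_i for the "model choice" input relative to the "time-integration" input would indicate that model selection dominates the prediction uncertainty.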
Abstract: The internet has become an attractive avenue for
global e-business, e-learning, knowledge sharing, etc. Due to
continuous increase in the volume of web content, it is not practically
possible for a user to extract information by browsing and integrating
data from a huge amount of web sources retrieved by the existing
search engines. The semantic web technology enables advancement
in information extraction by providing a suite of tools to integrate
data from different sources. To take full advantage of semantic web,
it is necessary to annotate existing web pages into semantic web
pages. This research develops a tool, named OWIE (Ontology-based
Web Information Extraction), for semantic web annotation using
domain specific ontologies. The tool automatically extracts
information from HTML pages with the help of pre-defined ontologies
and gives them a semantic representation. Two case studies have been
conducted to analyze the accuracy of OWIE.
Abstract: The malate dehydrogenase-glutamate oxaloacetate
aminotransferase (MDh-GOAT) enzyme complex (the EC) was
isolated and purified from wheat and rice, and some of its main
physico-chemical properties were studied. The Michaelis constants of
the EC MDh-GOAT for malate, glutamate and NAD were
investigated. These kinetic results show a high affinity for glutamate.
Taking into account the important role of the EC in the catabolism of
glutamate, the central amino acid of nitrogen exchange, there is a
pressing need for deeper study of this enzyme complex. Therefore the
basic purpose of this work is to study the basic physical and chemical
properties of this enzyme complex discovered by us, which would be
very important for understanding the mechanisms of the reactions
catalyzed by the EC.
Abstract: Recently, genetic algorithms (GA) and the particle swarm optimization (PSO) technique have attracted considerable attention among various modern heuristic optimization techniques. The GA has been popular in academia and industry mainly because of its intuitiveness, ease of implementation, and ability to effectively solve highly non-linear, mixed-integer optimization problems that are typical of complex engineering systems. The PSO technique is a relatively recent heuristic search method whose mechanics are inspired by the swarming or collaborative behavior of biological populations. In this paper both PSO and GA optimization are employed for finding stable reduced order models of single-input single-output large-scale linear systems. Both techniques guarantee the stability of the reduced order model if the original high order model is stable. The PSO method is based on the minimization of the Integral Squared Error (ISE) between the transient responses of the original higher order model and the reduced order model for a unit step input. Both methods are illustrated through a numerical example from the literature, and the results are compared with a recently published conventional model reduction technique.
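The ISE objective that PSO or GA would minimize can be sketched as follows; a coarse grid search over the reduced model's pole stands in for the swarm/evolutionary search, and the second-order example model is illustrative, not the paper's numerical example:

```python
import numpy as np

t = np.linspace(0.0, 10.0, 2001)
dt = t[1] - t[0]

# Analytic unit-step response of an illustrative 2nd-order model
# G(s) = 1 / ((s + 1)(s + 2)), obtained via partial fractions.
y_full = 0.5 - np.exp(-t) + 0.5 * np.exp(-2.0 * t)

def ise(a):
    """Integral Squared Error against a 1st-order reduced model
    G_r(s) = k / (s + a), with k = 0.5 * a to preserve the DC gain."""
    y_red = 0.5 * (1.0 - np.exp(-a * t))   # its analytic step response
    return float(np.sum((y_full - y_red) ** 2) * dt)

# A coarse grid search over the reduced pole stands in for PSO/GA;
# a real optimizer would search gains and poles jointly.
poles = np.linspace(0.2, 3.0, 281)
best_a = poles[np.argmin([ise(a) for a in poles])]
```

Fixing the DC gain of the reduced model and restricting its pole to the open left half-plane is one simple way to guarantee a stable reduced model with matching steady-state behavior.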
Abstract: Since most aviation products face the problem of fatigue fracture in vibration environments, this paper makes use of the test results of a bracket, analyses the structure with ANSYS Workbench, predicts the life of the bracket in different ways, and compares the predictions with the test results. Through research on the analysis methods, an organic combination of simulation analysis and testing is achieved, which not only ensures the accuracy of the simulation analysis and life prediction, but also enables dynamic supervision of the product life process and promotes the application of finite element simulation analysis in engineering practice.
Abstract: Electrospinning is a broadly used technology to obtain
polymeric nanofibers ranging from several micrometers down to
several hundred nanometers for a wide range of applications. It offers
unique capabilities to produce nanofibers with controllable porous
structure. With smaller pores and higher surface area than regular
fibers, electrospun fibers have been successfully applied in various
fields, such as nanocatalysis, tissue engineering scaffolds, protective
clothing, filtration, biomedical, pharmaceutical, optical electronics,
healthcare, biotechnology, defense and security, and environmental
engineering. In this study, polyurethane nanofibers were obtained
under different electrospinning parameters. Fiber morphology and
diameter distribution were investigated in order to understand them
as a function of process parameters.
Abstract: The foundation of a tower crane serves to ensure stability
against vertical and horizontal forces. If the foundation strength is not
sufficient, the tower crane may be subject to overturning, shearing or
foundation settlement. Therefore, an engineering review of stable support
is a highly critical part of foundation design. However, there are not
many professionals who can conduct engineering review of tower
crane foundation and, if any, they have information only on a small
number of cranes in which they have hands-on experience. It is also
customary to rely on empirical knowledge and the tower crane renter's
recommendations rather than designing foundation on the basis of
engineering knowledge. Therefore, a foundation design automation
system considering not only lifting conditions but also overturning
risk, shearing and vertical force may facilitate production of foolproof
foundation design for experts and enable even non-experts to utilize
professional knowledge that only experts can access now. This study
proposes an Automatic Design Algorithm for Tower Crane Foundations
considering load and horizontal force.
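The kinds of checks such an algorithm automates can be sketched as follows; these are textbook pad-foundation checks under simplifying assumptions (rigid square pad, load eccentricity within the middle third), and every name, number, and safety factor here is an assumption, not the paper's algorithm:

```python
def foundation_checks(W_found_kN, W_crane_kN, H_kN, M_kNm, B_m,
                      q_allow_kPa, sf_overturn=1.5, sf_slide=1.5, mu=0.5):
    """Illustrative checks for a rigid square tower crane pad of width B_m:
    overturning about the toe, sliding under the horizontal force, and
    peak bearing pressure under an eccentric vertical load."""
    W = W_found_kN + W_crane_kN                      # total vertical load
    ok_overturn = W * B_m / 2.0 >= sf_overturn * M_kNm
    ok_slide = mu * W >= sf_slide * H_kN             # base friction vs shear
    e = M_kNm / W                                    # load eccentricity
    q_max = W / B_m**2 * (1.0 + 6.0 * e / B_m)       # peak bearing pressure
    ok_bearing = e <= B_m / 6.0 and q_max <= q_allow_kPa
    return ok_overturn, ok_slide, ok_bearing

# Illustrative case: 6 m pad, 1200 kN pad weight, 600 kN crane load,
# 50 kN shear, 900 kNm overturning moment, 150 kPa allowable pressure.
result = foundation_checks(1200.0, 600.0, 50.0, 900.0, 6.0, 150.0)
```

An automated design loop would iterate pad dimensions until all three checks pass under the governing lifting condition.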