Abstract: Absolute pitch is the ability to identify a musical note without a reference tone. Training for absolute pitch often occurs in preschool education. To evaluate the effect of this training, it is necessary to clarify how well the trainee can make use of synesthesia. To the best of our knowledge, there are no existing methods for objectively confirming whether a subject is using synesthesia. Therefore, in this study, we present a method to distinguish the use of color-auditory synesthesia from the separate use of color and audition during absolute pitch training. The method measures blood volume in the prefrontal cortex using functional near-infrared spectroscopy (fNIRS) and assumes that the cognitive process has two parts: a non-linear step and a linear step. For the linear step, we assume a second-order ordinary differential equation. For the non-linear step, it is extremely difficult, if not impossible, to create an inverse filter of a system as complex as the brain. Therefore, we apply a data-driven method based on a self-organizing map (SOM). The presented method was tested on 15 subjects, and the estimation accuracy is reported.
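The SOM-based, data-guided step can be illustrated with a minimal sketch. The grid size, learning-rate schedule, and toy feature vectors below are illustrative assumptions, not the parameters or fNIRS data of the study:

```python
import numpy as np

def train_som(data, grid=(8, 8), epochs=200, lr0=0.5, sigma0=3.0, seed=0):
    """Train a small self-organizing map on row-vector samples."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    # Grid coordinates, used by the Gaussian neighborhood function.
    ys, xs = np.mgrid[0:h, 0:w]
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)        # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)  # shrinking neighborhood
        for x in data[rng.permutation(len(data))]:
            d = np.linalg.norm(weights - x, axis=2)
            by, bx = np.unravel_index(np.argmin(d), d.shape)  # best-matching unit
            g = np.exp(-((ys - by) ** 2 + (xs - bx) ** 2) / (2 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
    return weights

def bmu(weights, x):
    """Map a sample to its best-matching unit on the grid."""
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)
```

After training, distinct input patterns land on distinct map units, which is the property a classifier of synesthetic versus non-synesthetic responses would exploit.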
Abstract: The successful realization of complex systems depends not only on technology issues and the process for implementing them, but on management issues as well. Managing the systems development lifecycle requires technical management, which is the role of systems engineering management. Systems engineering management incorporates many activities, of which the three major ones are development phasing, the systems engineering process, and lifecycle integration, and these activities are performed across the system development lifecycle. Due to the ever-increasing complexity of systems, as well as the difficulty of managing and tracking development activities, new ways to carry out systems engineering management activities are required. This paper presents a systematic approach used as a design management tool applied across systems engineering management roles. In this approach, the Transdisciplinary System Development Lifecycle (TSDL) Model has been modified and integrated with Quality Function Deployment (QFD). Hereinafter, this systematic approach is called the Transdisciplinary Quality System Development Lifecycle (TQSDL) Model. The QFD translates the voice of the customer (VOC) into measurable technical characteristics. The modified TSDL model is based on Axiomatic Design, developed by Suh, which is applicable to all designs: products, processes, systems and organizations. The TQSDL model aims to provide a robust structure and systematic thinking to support the implementation of systems engineering management roles. This approach ensures that the customer requirements are fulfilled and that all the systems engineering management roles and activities are satisfied.
Abstract: The emergency department (ED) can be considered a complex system of interacting entities: patients, human resources, software and hardware systems, interfaces, and other systems. This paper presents research on implementing a detailed Systems Engineering (SE) approach in a mid-size hospital in central Indiana. The methodology will be applied by “The Initiative for Product Lifecycle Innovation (IPLI)” institution at Indiana University to study and solve the crowding problem, with the aim of increasing patient throughput and enhancing the treatment experience; therefore, the nature of the crowding problem needs to be investigated together with all the other problems that lead to it. The presented SE methods are workflow analysis and systems modeling, where SE tools such as Microsoft Visio are used to construct a group of system-level diagrams that demonstrate: patient workflow, documentation and communication flow, data systems, human resources workflow and requirements, the leadership involved, and the integration between the different ED systems. Finally, the ultimate goal will be managing the process through the implementation of an executable model using commercial software tools, which will identify bottlenecks, improve documentation flow, and help make the process faster.
Abstract: Finite Element Models (FEMs) are widely used to study and predict the dynamic properties of structures, and usually the prediction can be obtained with much more accuracy for a single component than for assemblies. Especially for structural dynamics studies in the low and middle frequency range, most complex FEMs can be seen as assemblies of linear components joined together at interfaces. From a modelling and computational point of view, these joints can be seen as localized sources of stiffness and damping and can be modelled as lumped spring/damper elements, most of the time characterized by nonlinear constitutive laws. On the other hand, most FE programs are able to run nonlinear analysis in the time domain. They treat the whole structure as nonlinear even if there is only one nonlinear degree of freedom (DOF) out of thousands of linear ones, making the analysis unnecessarily expensive from a computational point of view. In this work, a methodology for obtaining the nonlinear frequency response of structures whose nonlinearities can be considered localized sources is presented. The work extends the well-known Structural Dynamic Modification Method (SDMM) to a nonlinear set of modifications and allows obtaining the Nonlinear Frequency Response Functions (NLFRFs) through an ‘updating’ process of the Linear Frequency Response Functions (LFRFs). A brief summary of the analytical concepts is given, starting from the linear formulation and examining the implications of the nonlinear one. The response of the system is formulated in both the time and frequency domains. First, the modal database is extracted and the linear response is calculated. Second, the nonlinear response is obtained through the NL SDMM by updating the underlying linear behavior of the system. The methodology, implemented in MATLAB, has been successfully applied to estimate the nonlinear frequency response of two systems: a two-DOF spring-mass-damper system and a full aircraft FE model. In spite of the different levels of complexity, both examples show the reliability and effectiveness of the method. The results highlight a feasible and robust procedure that allows a quick estimation of the effect of localized nonlinearities on the dynamic behavior. The method is particularly powerful when most of the FE model can be considered to act linearly and the nonlinear behavior is restricted to a few degrees of freedom. The procedure is very attractive from a computational point of view because the FEM needs to be run just once, which allows faster nonlinear sensitivity analyses and easier implementation of optimization procedures for the calibration of nonlinear models.
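The 'updating' of a linear FRF by a localized nonlinearity can be sketched with a simplified single-harmonic (describing-function) iteration on a two-DOF system. All numerical values, the cubic-spring law, and the damped fixed-point scheme below are illustrative assumptions, not the authors' MATLAB implementation:

```python
import numpy as np

# Two-DOF spring-mass-damper parameters (illustrative values).
M = np.diag([1.0, 1.0])
C = np.array([[0.4, -0.2], [-0.2, 0.4]])
K = np.array([[2000.0, -1000.0], [-1000.0, 2000.0]])
k3 = 5e6                   # localized cubic stiffness at DOF 0 (assumed)
F = np.array([1.0, 0.0])   # harmonic force amplitude

def nl_frf(omegas, iters=50):
    """Single-harmonic fixed-point update of the linear FRF for a
    localized cubic spring (describing-function approximation)."""
    amps = []
    for w in omegas:
        Z = K - w**2 * M + 1j * w * C   # linear dynamic stiffness
        X = np.linalg.solve(Z, F)       # linear response as starting point
        for _ in range(iters):
            a = abs(X[0])
            dk = 0.75 * k3 * a**2       # equivalent stiffness at amplitude a
            Zn = Z.copy()
            Zn[0, 0] += dk              # localized modification only
            X_new = np.linalg.solve(Zn, F)
            X = 0.5 * X + 0.5 * X_new   # damped update for stability
        amps.append(abs(X[0]))
    return np.array(amps)
```

The key computational point of the abstract survives in the sketch: the linear dynamic stiffness Z is built once, and only the few nonlinear entries are updated per iteration.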
Abstract: Current production-oriented factories need maintenance operators to work in shifts, monitoring and inspecting complex systems and different equipment in situations of mechanical breakdown. Augmented reality (AR) is an emerging technology that embeds data into the environment for situation awareness, helping maintenance operators make decisions and solve problems. An application was designed to identify problems in steam generators and to inspect centrifugal pumps. The objective of this research was to find the best AR medium and the best type of problem-solving strategy among analogy, the focal object method, and means-ends analysis. The two scenarios of inspecting leakage involved temperature and vibration. Two experiments were used for usability evaluation and future innovation, covering the decision-making process and problem-solving strategy. This study found that maintenance operators prefer a built-in magnifier to zoom in on components (55.6%), a 3D exploded view to track the problem parts (50%), and a line chart to identify changes in data or information (61.1%). There is a significant difference in the use of analogy (44.4%), focal objects (38.9%), and the means-ends strategy (16.7%). The marked differences between maintainers and operators lie in the application of a problem-solving strategy. Future work should explore multimedia information retrieval to support maintenance operators in decision-making.
Abstract: Nowadays, manufacturers are faced with the production of different versions of products due to quality, cost and time constraints. At the same time, Additive Manufacturing (AM), as a production method based on the CAD model, disrupts the design and manufacturing cycle with new parameters. To address these issues, researchers have applied the Design For Manufacturing (DFM) approach to AM, but until now there has been no integrated approach for the design and manufacturing of products through AM. This paper therefore aims to provide a general methodology for managing the different production issues, as well as to support interoperability between the AM process and different Product Life Cycle Management tools. The problem is that the Systems Engineering models used for managing complex systems cannot support product evolution and its impact on the product life cycle. It therefore seems necessary to provide a general methodology for managing the product diversity created by using AM. This methodology must consider manufacturing and assembly as early as possible in the design stage. The latest DFM approach, as a methodology for analyzing the system comprehensively, integrates manufacturing constraints upstream in the numerical model. DFM for AM is thus used to import the characteristics of AM into the design and manufacturing process of a hybrid product, in order to manage the criteria coming from AM. The research also presents an integrated design method that takes into account knowledge of layer manufacturing technologies. For this purpose, an interface model based on the skin and skeleton concepts is provided: the usage and manufacturing skins are used to show the functional surfaces of the product, while the material flow and the links between the skins are represented by the usage and manufacturing skeletons. This integrated approach is therefore a helpful methodology for designers and manufacturers in decisions such as material and process selection, as well as the evaluation of product manufacturability.
Abstract: Accounting for 40% of total world energy consumption, building systems are developing into technically complex, large energy consumers suitable for the application of sophisticated power management approaches that can greatly increase energy efficiency and even make buildings active energy market participants. A centralized control system for building heating and cooling, managed by economically optimal model predictive control, shows promising results, with an estimated 30% increase in energy efficiency. The research focuses on the implementation of such a method in a case study performed on two floors of our faculty building, with corresponding wireless sensor data acquisition, remote heating/cooling units, and a central climate controller. The building walls are mathematically modeled with their corresponding material types, surface shapes and sizes. The models are then used to predict thermal characteristics and changes in different building zones. Exterior influences such as environmental conditions and weather forecasts, occupant behavior, and comfort demands are all taken into account in deriving the price-optimal climate control. Finally, a DC microgrid with photovoltaics, a wind turbine, a supercapacitor, batteries and fuel cell stacks is added to make the building a unit capable of active participation in a price-varying energy market. The computational burden of applying model predictive control to such a complex system is relaxed through a hierarchical decomposition of the microgrid and climate control: the former is designed as the higher hierarchical level with pre-calculated price-optimal power flow control, while the latter is designed as the lower-level control responsible for ensuring thermal comfort and exploiting the optimal supply conditions enabled by the microgrid energy flow management. Such an approach is expected to enable the inclusion of more complex building subsystems in order to further increase energy efficiency.
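The price-optimal climate control idea can be sketched for a single zone with an unconstrained finite-horizon quadratic cost, solved in closed form as a least-squares problem. The first-order wall model and all parameter values are illustrative assumptions; the actual building model and the hierarchical microgrid layer are far richer:

```python
import numpy as np

def thermal_mpc(x0, T_out, price, T_ref, a=0.9, b=0.5, q=10.0, H=None):
    """Finite-horizon, unconstrained MPC for a one-zone thermal model
    x[k+1] = a*x[k] + (1-a)*T_out[k] + b*u[k], minimizing
    sum_k price[k]*u[k]^2 + q*(x[k+1] - T_ref)^2 in closed form."""
    H = len(price) if H is None else H
    # Prediction matrices: x = Phi*x0 + Gamma*u + drift
    Phi = np.array([a ** (k + 1) for k in range(H)])
    Gamma = np.zeros((H, H))
    drift = np.zeros(H)
    for k in range(H):
        for j in range(k + 1):
            Gamma[k, j] = a ** (k - j) * b
        drift[k] = sum(a ** (k - j) * (1 - a) * T_out[j] for j in range(k + 1))
    # Least-squares form: minimize ||sqrt(q)(Gamma u - r)||^2 + ||Wu u||^2
    r = T_ref - Phi * x0 - drift
    Wu = np.diag(np.sqrt(price))
    A = np.vstack([np.sqrt(q) * Gamma, Wu])
    y = np.concatenate([np.sqrt(q) * r, np.zeros(H)])
    u, *_ = np.linalg.lstsq(A, y, rcond=None)
    return u
```

With a price profile that is cheap early and expensive late, the optimizer front-loads heating into the cheap hours, which is the qualitative behavior the abstract's price-optimal controller exploits.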
Abstract: The rheological response of blends obtained from quaternized polysulfone and polyvinyl alcohol in N-methyl-2-pyrrolidone was investigated in relation to the structural peculiarities of the polymers in the blend, the composition of the polymer mixtures, and the types of interactions. The results show that varying the polyvinyl alcohol composition in the studied system changes the rheological properties, suggesting that the PVA acts as a plasticizer. Consequently, the rheological behavior of the complex system, described by the nonlinear flow curve, indicates the impact of the polyvinyl alcohol content on the polysulfone solution, facilitating the subsequent preparation of bioactive membranes.
Abstract: Energy has a prominent role in the development of nations. Countries that have energy resources also have strategic power in the international energy trade, since energy is essential for all stages of production in the economy. It is therefore important for countries to analyze the strengths and weaknesses of the system. On the other hand, international trade is one of the fields analyzed as a complex network via network analysis. Complex network analysis is a tool for studying complex systems with heterogeneous agents and the interactions between them. A complex network consists of nodes and the interactions between these nodes, and the aggregate properties that emerge from these interactions are distinct from the (more or less) simple sum of the parts. Standard approaches to international trade are therefore superficial for analyzing such systems, while network analysis provides a new way to analyze international trade as a network. In this network, countries constitute the nodes and trade relations (exports or imports) constitute the edges. It then becomes possible to analyze the international trade network in terms of higher-order indicators specific to complex networks, such as connectivity, clustering, assortativity/disassortativity, and centrality. In this analysis, the international trade of crude oil and coal, two types of fossil fuel, has been analyzed from 2005 to 2014 via network analysis. First, it has been analyzed in terms of topological parameters such as density, transitivity, and clustering. Afterwards, the fit to a Pareto distribution has been tested via the Kolmogorov-Smirnov test. Finally, the weighted HITS algorithm has been applied to the data as a centrality measure to determine the real prominence of countries in these trade networks. The weighted HITS algorithm is a strong tool for analyzing a network by ranking countries with regard to the prominence of their trade partners. We have calculated both an export centrality and an import centrality by applying the w-HITS algorithm to the data. As a result, the impacts of the trading countries are presented in terms of these higher-order indicators.
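The weighted HITS centrality can be sketched as a power iteration on a weighted adjacency matrix; reading hubs as export centrality and authorities as import centrality follows the abstract, while the toy trade matrix in the example is hypothetical:

```python
import numpy as np

def weighted_hits(W, iters=100, tol=1e-12):
    """Weighted HITS on a directed trade matrix W, where W[i, j] is the
    export flow from country i to country j. Returns (hubs, authorities):
    hubs ~ export centrality, authorities ~ import centrality."""
    n = W.shape[0]
    h = np.ones(n)
    a = np.ones(n)
    for _ in range(iters):
        a_new = W.T @ h           # authority: weighted sum of incoming hub scores
        a_new /= np.linalg.norm(a_new)
        h_new = W @ a_new         # hub: weighted sum of pointed-to authority scores
        h_new /= np.linalg.norm(h_new)
        if np.allclose(h, h_new, atol=tol) and np.allclose(a, a_new, atol=tol):
            return h_new, a_new
        h, a = h_new, a_new
    return h, a
```

In a toy network where one country dominates exports, that country receives the top hub (export centrality) score, and its main destination receives the top authority (import centrality) score.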
Abstract: In this paper, synchronization of multiple chaotic semiconductor lasers is achieved by appealing to complex system theory. In particular, we consider dynamical networks composed of semiconductor lasers as interconnected nodes, where the interactions in the network are defined by coupling the first state of each node. A case of interest is synchronization in a master-slave configuration with star topology. The nodes of these networks are modeled with the laser equations and simulated in MATLAB. These results are applicable to private communications.
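First-state master-slave coupling in a star topology can be illustrated with a generic chaotic oscillator. A Lorenz system stands in here for the semiconductor laser model, and the coupling gain and initial conditions are assumed values, so this is only a structural sketch of the scheme:

```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Chaotic oscillator standing in for the laser node dynamics."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def simulate_star(c=20.0, dt=0.001, steps=30000, seed=1):
    """Master-slave star network of three chaotic nodes coupled through
    their first state: each slave receives c*(x_master - x_slave)."""
    rng = np.random.default_rng(seed)
    master = np.array([1.0, 1.0, 1.0])
    slaves = [master + rng.normal(0, 5.0, 3) for _ in range(2)]
    for _ in range(steps):
        dm = lorenz(master)
        new_slaves = []
        for s in slaves:
            ds = lorenz(s)
            ds[0] += c * (master[0] - s[0])   # first-state coupling only
            new_slaves.append(s + dt * ds)
        master = master + dt * dm
        slaves = new_slaves
    return master, slaves
```

Despite starting from different initial conditions, both slave nodes converge onto the master trajectory, which is the property exploited for private communications.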
Abstract: In this paper, we provide a literature survey on the artificial stock market (ASM) problem. The paper begins by exploring the complexity of the stock market and the need for ASMs. An ASM aims to investigate the link between individual behaviors (micro level) and financial market dynamics (macro level), and the variety of patterns at the macro level is a function of the ASM's complexity. The financial market is a complex system in which the relationship between the micro and macro levels cannot be captured analytically; computational approaches, such as simulation, are expected to capture this connection. Agent-based simulation is the simulation technique most commonly used to build ASMs. The paper proceeds by discussing the components of the ASM. We consider the role of behavioral finance (BF) alongside the traditional risk-aversion assumption in the construction of agents' attributes. The influence of social networks on the development of agent interactions is also addressed: network topologies such as small-world, distance-based, and scale-free networks may be utilized to outline economic collaborations. In addition, the primary methods for developing agents' learning and adaptive abilities are summarized, including approaches such as Genetic Algorithms, Genetic Programming, Artificial Neural Networks and Reinforcement Learning. The most common statistical properties (the stylized facts) of stock markets that are used for the calibration and validation of ASMs are also discussed. We then review the major related previous studies and categorize the approaches they employ. Finally, research directions and potential research questions are discussed: ASM research may focus on the macro level, by analyzing market dynamics, or on the micro level, by investigating the wealth distributions of the agents.
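A minimal agent-based market along these lines couples micro-level agent demands to a macro-level price. The agent types, demand rules, and price-impact constant below are illustrative assumptions, not a model taken from the surveyed literature:

```python
import numpy as np

def simulate_market(n_agents=100, steps=500, seed=0):
    """Toy agent-based market: fundamentalists and noise traders submit
    demands; a market maker moves the log-price by the excess demand."""
    rng = np.random.default_rng(seed)
    log_p = 0.0
    fundamental = 0.0
    is_fund = rng.random(n_agents) < 0.5       # fixed agent types
    prices = []
    for _ in range(steps):
        fundamental += rng.normal(0, 0.01)     # random-walk fundamental value
        noise = rng.normal(0, 1, n_agents)
        # Fundamentalists buy when under-priced and sell when over-priced;
        # noise traders trade randomly.
        demand = np.where(is_fund, fundamental - log_p, noise)
        log_p += 0.01 * demand.mean()          # price impact of excess demand
        prices.append(np.exp(log_p))
    returns = np.diff(np.log(prices))
    return np.array(prices), returns

prices, returns = simulate_market()
```

The macro-level return series produced this way is exactly the kind of output that is compared against the stylized facts during calibration and validation.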
Abstract: The complexity of current systems has reached a degree that requires addressing conception and design issues while taking into account environmental, operational, social, legal and financial aspects. Therefore, one of the main challenges is the way complex systems are specified and designed. The exponentially growing effort, cost and time investment of complex systems in the modeling phase emphasize the need for a paradigm, a framework and an environment to handle system model complexity. For that, it is necessary to understand the expectations of the human user of the model and his limits. This paper presents a generic framework for designing complex systems, highlights the requirements a system model needs to fulfill to meet human user expectations, and suggests a graph-based formalism for modeling complex systems. Finally, a set of transformations is defined to handle the model complexity.
Abstract: In more complex systems, such as an automotive gearbox, rigorous treatment of the data is necessary because there are several moving parts (gears, bearings, shafts, etc.) and thus several possible sources of error and noise. The basic objective of this work is the detection of damage in automotive gearboxes. The detection methods used are the wavelet method, the bispectrum, advanced (selective) filtering of vibration signals, and mathematical morphology. Vibration tests were performed on gearboxes (in good condition and with defects) from the production line of a large vehicle assembler. The vibration signals were obtained using five accelerometers in different positions on the sample. The results obtained using kurtosis, the bispectrum, wavelets and mathematical morphology showed that it is possible to identify the existence of defects in automotive gearboxes.
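Among the listed indicators, kurtosis is the simplest to sketch: impacting defects add sharp transients that inflate the fourth moment of the vibration signal. The synthetic signal below is illustrative, not data from the gearbox tests:

```python
import numpy as np

def excess_kurtosis(signal):
    """Excess kurtosis of a vibration signal; healthy gear vibration is
    close to a tone in Gaussian noise (excess near or below 0), while
    impacting defects push it well above 0."""
    x = signal - signal.mean()
    return np.mean(x ** 4) / np.mean(x ** 2) ** 2 - 3.0

# Synthetic example: a gear-mesh tone in noise, with and without
# periodic impacts standing in for a local tooth defect.
rng = np.random.default_rng(0)
t = np.arange(0, 1, 1 / 20000)
healthy = np.sin(2 * np.pi * 300 * t) + 0.3 * rng.normal(size=t.size)
faulty = healthy.copy()
faulty[::400] += 8.0      # sharp periodic impacts from the defect
```

A threshold on this single scalar already separates the two conditions in the synthetic case; the abstract's bispectrum, wavelet, and morphology methods refine this by localizing the defect in frequency and time.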
Abstract: The building sector is responsible, in many industrialized countries, for about 40% of total energy requirements, so it seems necessary to devote some effort to this area in order to achieve a significant reduction of energy consumption and greenhouse gas emissions.
The paper presents a study aiming at providing a design methodology able to identify the best configuration of the building/plant system from a technical, economic and environmental point of view.
Normally, the classical approach involves an analysis of the building's energy loads under steady-state conditions, and a subsequent selection of measures aimed at improving the energy performance, based on the previous experience of the architects and engineers in the design team. Instead, the proposed approach uses a sequence of two well-known, scientifically validated calculation methods (TRNSYS and RETScreen) that allow quite a detailed feasibility analysis.
To assess the validity of the calculation model, an existing historical building in Central Italy, which will be the object of restoration and preservative redevelopment, was selected as a case study. The building consists of a basement and three floors, with a total floor area of about 3,000 square meters.
The first step was the determination of the heating and cooling energy loads of the building in a dynamic regime, which allows simulating the real energy needs of the building as a function of its use. Traditional methodologies, based as they are on steady-state conditions, cannot faithfully reproduce the effects of varying climatic conditions and of the inertial properties of the structure. With this model it is possible to obtain quite accurate and reliable results that allow identifying effective building-HVAC system combinations.
The second step consisted of using the output data obtained as input to the second calculation model, which enables the comparison of different system configurations from the energy, environmental and financial points of view, with an analysis of investment, operation and maintenance costs, thus allowing determination of the economic benefit of possible interventions.
The classical methodology often leads to the choice of conventional plant systems, while our calculation model provides a financial-economic assessment for innovative energy systems with low environmental impact.
Computational analysis can help in the design phase, particularly in the case of complex structures with centralized plant systems, by comparing the data returned by the calculation model for different design options.
Abstract: Ontologies offer a means for representing and sharing information in many domains, particularly complex domains. For example, they can be used for representing and sharing the information of the System Requirement Specification (SRS) of complex systems, such as the SRS of ERTMS/ETCS, written in natural language. Since this system is a real-time, critical system, generic ontologies such as OWL and generic ERTMS ontologies provide minimal support for modeling the temporal information omnipresent in these SRS documents. To support the modeling of temporal information, one of the challenges is to enable the representation of dynamic features evolving in time within a generic ontology with minimal redesign of it. Separating temporal information from other information can help to predict system runtime operation and to properly design and implement it. In addition, it is helpful to provide reasoning and querying techniques over the temporal information represented in the ontology in order to detect potential temporal inconsistencies. To address this challenge, we propose a lightweight three-layer temporal Quality of Service (QoS) ontology for representing, reasoning over and querying temporal and non-temporal information in a complex domain ontology. Representing QoS entities in separate layers clarifies the distinction between the non-QoS entities and the QoS entities in the ontology. The upper, generic layer of the proposed ontology provides an intuitive knowledge of domain components, especially ERTMS/ETCS components. The separation of the intermediate QoS layer from the lower QoS layer allows us to focus on specific QoS characteristics, such as temporal or integrity characteristics. In this paper, we focus on temporal information that can be used to predict system runtime operation. To evaluate our approach, an example of the proposed domain ontology for the handover operation, as well as a reasoning rule over temporal relations in this domain-specific ontology, are presented.
Abstract: Recently, an increasing number of researchers have been focusing on working out realistic solutions to sustainability problems. As sustainability issues gain importance for organisations, the management of such decisions becomes critical. Knowledge representation is a fundamental issue of complex knowledge-based systems, and many types of sustainability problems would benefit from models based on experts' knowledge. Cognitive maps have been used for analyzing and aiding decision making, and a cognitive map can be made of almost any system or problem. A fuzzy cognitive map (FCM) can successfully represent knowledge and human experience, introducing concepts to represent the essential elements and the cause-and-effect relationships among the concepts, in order to model the behaviour of any system. Integrated waste management systems (IWMS) are complex systems that can be decomposed into related and non-related subsystems and elements, where many factors have to be taken into consideration that may be complementary, contradictory, and competitive; these factors influence each other and determine the overall decision process of the system. The goal of the present paper is to construct an efficient IWMS that considers the various factors. The authors' intention is to propose an expert-based system design approach for implementing expert decision support in the area of IWMSs and to introduce an appropriate methodology for the development and analysis of group FCMs. A framework for such a methodology, consisting of a development phase and an application phase, is presented.
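The FCM inference underlying such a methodology can be sketched in a few lines: concept activations propagate along weighted causal edges and are squashed into [0, 1]. The sigmoid squashing function and the tiny example weight matrix in the test are illustrative assumptions, not an IWMS model from the paper:

```python
import numpy as np

def fcm_step(state, W):
    """One FCM inference step. W[i, j] is the causal influence of
    concept i on concept j; the sigmoid squashes activations to [0, 1]."""
    return 1.0 / (1.0 + np.exp(-(W.T @ state)))

def fcm_run(state, W, iters=50, tol=1e-6):
    """Iterate until the concept activation vector stabilizes."""
    for _ in range(iters):
        new = fcm_step(state, W)
        if np.max(np.abs(new - state)) < tol:
            return new
        state = new
    return state
```

In a group-FCM setting, the expert-supplied weight matrices are aggregated into one W before running the same iteration, and the resulting fixed point is read as the system's predicted steady behaviour.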
Abstract: The study investigated the implementation of Neural Network (NN) techniques for the prediction of the loading of Cu ions onto clinoptilolite. An experimental design using analysis of variance (ANOVA) was chosen for testing the adequacy of the neural network and for optimizing the effective input parameters (pH, temperature and initial concentration). A feed-forward, multi-layer perceptron (MLP) NN successfully tracked the non-linear behavior of the adsorption process versus the input parameters, with a mean squared error (MSE), correlation coefficient (R) and mean squared relative error (MSRE) of 0.102, 0.998 and 0.004, respectively. The results showed that NN modeling techniques can effectively predict and simulate highly complex, non-linear processes such as ion exchange.
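A feed-forward MLP tracking a nonlinear response of three inputs can be sketched with plain gradient descent. The architecture, learning rate, and the synthetic three-input surface below are illustrative assumptions, not the network or adsorption data of the study:

```python
import numpy as np

def train_mlp(X, y, hidden=8, lr=0.05, epochs=3000, seed=0):
    """One-hidden-layer MLP regression trained by full-batch gradient
    descent on mean squared error."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1));          b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)                 # hidden activations
        pred = h @ W2 + b2
        err = pred - y[:, None]
        gW2 = h.T @ err / len(X); gb2 = err.mean(0)
        gh = err @ W2.T * (1 - h ** 2)           # backprop through tanh
        gW1 = X.T @ gh / len(X); gb1 = gh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Xn: (np.tanh(Xn @ W1 + b1) @ W2 + b2).ravel()

# Toy nonlinear surface standing in for the (pH, temperature,
# concentration) -> loading relationship.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (200, 3))
y = np.sin(X[:, 0]) * X[:, 1] + 0.5 * X[:, 2] ** 2
model = train_mlp(X, y)
```

The target here is purely nonlinear in its inputs, so any fit better than predicting the mean demonstrates that the hidden tanh layer, not a linear trend, is doing the tracking.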
Abstract: One of the main biomedical problems lies in detecting dependencies in semi-structured data. The solution includes a biomedical portal and algorithms (integral health rating criteria and multidimensional data visualization methods). The biomedical portal allows diagnostic and research data to be processed in parallel using Microsoft System Center 2012 and Windows HPC Server cloud technologies. The service does not expose its internal calculations to the user; instead, it provides a practical interface. When data is sent for processing, the user may track the status of the task and will receive the results as soon as the computation is completed. The service includes its own algorithms and allows the diagnosis and prediction of medical cases. The approved methods are based on complex system entropy methods, algorithms for determining the energy patterns of development and trajectory models of biological systems, and a logical-probabilistic approach with the blurring of images.
Abstract: The customs supply chain is widely regarded as a complex system, due not only to the variety and large number of actors, but also to their complex structural links and the interactions between them; this is why the system is subject to various types of risk. The economic, political and social impacts of those risks are highly detrimental to countries, businesses and the public; for this reason, risk management in the customs supply chain is becoming a crucial issue for ensuring sustainability, security and safety. The main characteristic of the customs risk management approach is determining which goods and means of transport should be examined, to what extent, and where future compliance resources should be directed. The purposes of this article are, firstly, to deal with the concept of the customs supply chain; secondly, to present our risk management approach based on the cross Activity Based Costing (ABC) method as an interactive tool to support decision making in customs risk management; and finally, to analyze a case study of Moroccan customs, putting theory into practice and thus drawing together the various elements of a structured and efficient risk management approach.
Abstract: Because only a small number of complex systems in the artificial immune system (AIS) field address nonlinear problems, nonlinear AIS approaches need to be developed among the well-known solution techniques. The Gaussian function is usually used for similarity estimation in classification problems and pattern recognition. In this study, the diagnosis of breast cancer, the second most widespread type of cancer in women, was performed with different distance calculation functions (Euclidean, Gaussian, and a Gaussian-Euclidean hybrid) in the clonal selection model of classical AIS on the Wisconsin Breast Cancer Dataset (WBCD), taken from the University of California, Irvine Machine Learning Repository. We used the 3-fold cross-validation method to train and test on the dataset. According to the results, the maximum test classification accuracy was 97.35%, obtained using the Gaussian-Euclidean hybrid function on fold 3. The mean test classification accuracies across all folds were 94.78%, 94.45% and 95.31% using the Euclidean, Gaussian and Gaussian-Euclidean functions, respectively. With these results, the Gaussian-Euclidean hybrid function appears to be a promising distance calculation method and may be considered an alternative for hard nonlinear classification problems.
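The three distance functions and the 3-fold cross-validation loop can be sketched as follows. The abstract does not specify the exact form of the Gaussian-Euclidean hybrid, so the combination used here is an assumption, and a 1-nearest-neighbor rule stands in for the AIS clonal-selection classifier:

```python
import numpy as np

def euclidean(a, b):
    return np.linalg.norm(a - b)

def gaussian(a, b, sigma=1.0):
    # Gaussian similarity turned into a distance (1 - similarity).
    return 1.0 - np.exp(-np.sum((a - b) ** 2) / (2 * sigma ** 2))

def hybrid(a, b, sigma=1.0):
    # One plausible Gaussian-Euclidean combination (assumed form).
    return euclidean(a, b) * gaussian(a, b, sigma)

def nn_accuracy(X, y, dist, folds=3, seed=0):
    """k-fold cross-validation of a 1-nearest-neighbor classifier that
    stands in for the AIS clonal selection model."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    accs = []
    for f in range(folds):
        test = idx[f::folds]
        train = np.setdiff1d(idx, test)
        correct = 0
        for i in test:
            d = [dist(X[i], X[j]) for j in train]
            correct += y[train[int(np.argmin(d))]] == y[i]
        accs.append(correct / len(test))
    return float(np.mean(accs))
```

On well-separated toy data all three distances classify accurately; differences of the kind reported in the abstract only emerge on harder, overlapping class boundaries such as those in the WBCD.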