Abstract: This paper presents a multi-objective optimal design of
a cascade control system for an underactuated mechanical system.
Cascade control structures usually include two control algorithms
(inner and outer). To design such a control system properly, the
following conflicting objectives should be considered at the same
time: 1) the inner closed-loop control must be faster than the outer
one, 2) the inner loop should quickly reject any disturbance and prevent
it from propagating to the outer loop, 3) the controlled system
should be insensitive to measurement noise, and 4) the controlled
system should be driven with minimal control energy. Such a control problem
can be formulated as a multi-objective optimization problem such
that the optimal trade-offs among these design goals are found.
To the best of the authors' knowledge, such a problem has not been studied
in multi-objective settings so far. In this work, an underactuated
mechanical system consisting of a rotary servo motor and a ball
and beam is used for the computer simulations, the setup parameters
of the inner and outer control systems are tuned by NSGA-II
(the Non-dominated Sorting Genetic Algorithm II), and the dominance
concept is used to find the optimal design points. The solution of
this problem is not a single optimal cascade control, but rather a set
of optimal cascade controllers (called Pareto set) which represent the
optimal trade-offs among the selected design criteria. The image of the
Pareto set in objective space is called the Pareto front. The solution
set is introduced to the decision-maker who can choose any point
to implement. The simulation results in terms of Pareto front and
time responses to external signals show the competing nature among
the design objectives. The presented study may become the basis for
multi-objective optimal design of multi-loop control systems.
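The dominance concept used to extract the Pareto set can be sketched in a few lines. In this minimal illustration, the four-element objective vectors (settling time, disturbance-rejection error, noise sensitivity, control energy) are hypothetical values for demonstration, not results from the study:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_set(points):
    """Keep only the non-dominated objective vectors (the Pareto set's image)."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical candidate designs:
# (settling time, disturbance IAE, noise sensitivity, control energy)
designs = [(1.0, 2.0, 0.3, 5.0), (1.5, 1.0, 0.2, 6.0), (2.0, 3.0, 0.4, 7.0)]
front = pareto_set(designs)  # the third design is dominated by the first
```

NSGA-II applies this same dominance test inside its non-dominated sorting step; the filter above is only the final selection idea, not the full algorithm.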
Abstract: Critical-depth meters, such as the broad-crested weir, the Venturi flume and the combined control flume, are standard devices for measuring flow in open channels. The discharge relation for these devices cannot be solved directly; an iterative process is needed to account for the approach velocity head. In this paper, an analytical solution was developed to calculate the discharge in a combined critical-depth meter, namely a hump combined with a lateral contraction in a rectangular channel with subcritical approach flow, including energy losses. Analytical formulae were also derived for the approach-velocity-head coefficient for different types of critical-depth meters. The solution was obtained by solving a standard cubic equation, with energy loss included, on the basis of a trigonometric identity. The advantage of this technique is that it avoids the iterative process otherwise adopted in measuring flow with these devices. Numerical examples are chosen to demonstrate the proposed solution.
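As a hedged illustration of the trigonometric technique, the sketch below solves a specific-energy cubic of the form y^3 - E*y^2 + C = 0 using the casus-irreducibilis identity. The function names and the sample values are assumptions for demonstration only; the paper's actual formulation additionally incorporates energy losses and the combined-contraction geometry:

```python
import math

def depressed_cubic_roots(p, q):
    """All real roots of t^3 + p*t + q = 0 via the trigonometric identity,
    valid when the cubic has three real roots (4p^3 + 27q^2 <= 0)."""
    assert 4 * p**3 + 27 * q**2 <= 0, "trigonometric method needs three real roots"
    m = 2.0 * math.sqrt(-p / 3.0)
    # clamp guards against tiny floating-point overshoot outside [-1, 1]
    x = max(-1.0, min(1.0, 3.0 * q / (p * m)))
    theta = math.acos(x) / 3.0
    return [m * math.cos(theta - 2.0 * math.pi * k / 3.0) for k in (0, 1, 2)]

def subcritical_depth(E, C):
    """Subcritical root of y^3 - E*y^2 + C = 0 (specific-energy form).
    Shifting y = t + E/3 gives the depressed cubic t^3 + p*t + q = 0."""
    p = -E**2 / 3.0
    q = C - 2.0 * E**3 / 27.0
    roots = [t + E / 3.0 for t in depressed_cubic_roots(p, q)]
    # the largest positive root below E corresponds to subcritical approach flow
    return max(r for r in roots if 0 < r < E)
```

For example, y^3 - 3y^2 + 2 = 0 factors as (y - 1)(y^2 - 2y - 2), so `subcritical_depth(3.0, 2.0)` returns the subcritical root 1 + sqrt(3), with no iteration required.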
Abstract: Inherent complexity is one of the difficult problems in the software engineering field. Further, there are no physical laws or standard guidelines suited to designing different types of software. Hence, for software engineering to become a mature engineering discipline like others, it must have its own theoretical frameworks and laws. Software design and development is a human effort that takes considerable time and must account for various parameters for successful completion of the software. Cognitive informatics plays an important role in understanding the essential characteristics of software. The aim of this work is to consider two fundamental characteristics of the source code of object-oriented software: complexity and understandability. The complexity of the programs is analyzed with the help of important attributes extracted from the source code, which are then used to evaluate the understandability factor. These characteristics are analyzed on the basis of 16 C++ programs distributed to forty MCA students. Each student tried to understand the source code of the given program, and the mean time is taken as the actual time needed to understand the program. For validation of this work, Briand's framework is used, and the presented metric is also evaluated against an existing metric, which demonstrates its robustness.
Abstract: Climate change is projected to cause mean sea level to rise
by up to 1 m by 2100. To prevent coastal floods resulting from the sea level
rising, different flood control structures have been built, with
acceptable protection levels. Gothenburg, situated on the River Göta älv
on the southwest coast of Sweden, is a city vulnerable to
accelerated rises in mean sea level. We evaluated using a sea barrage
in the River Göta älv to protect Gothenburg during this century. The
highest sea level was estimated at 2.95 m above the current mean sea
level by 2100. To ensure flood protection against such high sea levels,
both barriers have to be closed. To prevent high water levels in the
River Göta älv reservoir, the barriers would be opened when the sea
level is low. The suggested flood control structures would
successfully protect the city from flooding events during this century.
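The barrier operation described above amounts to a threshold rule: close against high seas, reopen when the sea drops so the reservoir can drain. The sketch below is a minimal illustration with hypothetical threshold values, not the operating policy evaluated in the study:

```python
def barrier_action(sea_level, reservoir_level, close_threshold=1.5, open_margin=0.1):
    """Illustrative open/close rule (all thresholds hypothetical, in metres
    above current mean sea level). Close when the sea threatens the reservoir;
    reopen once the sea sits safely below the reservoir level so it can drain."""
    if sea_level >= close_threshold:
        return "close"
    if sea_level < reservoir_level - open_margin:
        return "open"
    return "hold"

# e.g. the estimated 2.95 m extreme sea level would trigger closure:
action = barrier_action(sea_level=2.95, reservoir_level=0.5)
```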
Abstract: Measuring the complexity of software has been an
insoluble problem in software engineering. Complexity measures can
be used to predict critical information about testability, reliability,
and maintainability of software systems from automatic analysis of
the source code. During the past few years, many complexity
measures have been invented based on the emerging Cognitive
Informatics discipline. These software complexity measures,
including cognitive functional size, are based on the total
cognitive weights of basic control structures such as loops
and branches. This paper shows that the existing calculation
method can generate different results that are algebraically
equivalent. However, analysis of the combinatorial meaning of this
calculation method reveals a significant flaw in the measure, which also
explains why it does not satisfy Weyuker's properties. Based on the
findings, improvement directions, such as measure fusion and a
cumulative variable counting scheme, are suggested to enhance the
effectiveness of cognitive complexity measures.
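For readers unfamiliar with the measure being critiqued, the sketch below applies one common reading of the cognitive-weight aggregation rule: sequential structures add their weights, nested structures multiply them. The weight values and the tree encoding are illustrative assumptions, not the paper's exact definition:

```python
# Illustrative cognitive weights for basic control structures (BCS).
WEIGHTS = {"sequence": 1, "branch": 2, "loop": 3, "call": 2}

def cognitive_weight(node):
    """node = (kind, [children]). A structure's weight multiplies the sum of
    the weights of the structures nested inside it; siblings add."""
    kind, children = node
    w = WEIGHTS[kind]
    if not children:
        return w
    return w * sum(cognitive_weight(c) for c in children)

# a loop whose body contains a branch followed by a plain statement:
program = ("loop", [("branch", []), ("sequence", [])])
total = cognitive_weight(program)  # 3 * (2 + 1) = 9
```

The abstract's point is precisely that structurally different programs can yield algebraically equivalent totals under such an add-and-multiply scheme, which is where the combinatorial flaw appears.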
Abstract: Software maintenance and mainly software
comprehension pose the largest costs in the software lifecycle. In
order to assess the cost of software comprehension, various
complexity measures have been proposed in the literature. This paper
proposes new cognitive-spatial complexity measures, which combine
the impact of the spatial as well as the architectural aspects of the software to
compute the software complexity. The spatial aspect of the software
complexity is taken into account using the lexical distances (in
number of lines of code) between different program elements and the
architectural aspect of the software complexity is taken into
consideration using the cognitive weights of control structures
present in the control flow of the program. The proposed measures are
evaluated using standard axiomatic frameworks and then, the
proposed measures are compared with the corresponding existing
cognitive complexity measures as well as the spatial complexity
measures for object-oriented software. This study establishes that the
proposed measures are better indicators of the cognitive effort
required for software comprehension than the other existing
complexity measures for object-oriented software.
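A minimal sketch of the spatial ingredient, assuming spatial complexity is taken as the mean lexical distance, in lines of code, between a program element's definition and its uses. The exact formulation combined with cognitive weights in the paper may differ; the data below is hypothetical:

```python
def spatial_complexity(elements):
    """Mean lexical distance in lines of code between each element's
    definition line and every line where it is used.
    elements: list of (definition_line, [use_lines])."""
    distances = [abs(use - defn)
                 for defn, use_lines in elements
                 for use in use_lines]
    return sum(distances) / len(distances)

# hypothetical program: one element defined at line 10 and used at lines
# 12 and 40, another defined at line 5 and used at line 6
example = [(10, [12, 40]), (5, [6])]
spatial = spatial_complexity(example)  # (2 + 30 + 1) / 3 = 11.0
```

Widely separated definitions and uses raise the measure, matching the intuition that distant references cost more comprehension effort.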
Abstract: In designing river intakes and diversion structures, it is paramount that the sediments entering the intake be minimized or, if possible, completely excluded. Due to high water velocity, sediments can significantly damage hydraulic structures, especially when mechanical equipment like pumps and turbines is used. This subsequently results in wasted water, electricity and further costs. Therefore, it is prudent to investigate and analyze the performance of lateral intakes affected by sediment control structures. Laboratory experiments, despite their vast potential and benefits, face certain limitations and challenges, including limitations in equipment and facilities, space constraints, equipment errors such as lack of adequate precision or mal-operation, and human error. Research has shown that, in order to achieve the ultimate goal of intake structure design (long-lasting and proficient structures), the best combination of sediment control structures (such as a sill and submerged vanes) along with the parameters that increase their performance (such as diversion angle and location) should be determined. Cost, difficulty of execution and environmental impacts should also be included in evaluating the optimal design. This solution can then be applied to similar problems in the future. Consequently, the model used to arrive at the optimal design requires a high level of accuracy and precision in order to avoid improper design and execution of projects. The process of creating and executing the design should be as comprehensive and applicable as possible. It is therefore important that influential parameters and vital criteria are fully understood and applied at all stages of choosing the optimal design. In this article, the parameters influencing the optimal performance of the intake, the advantages and disadvantages, and the efficiency of a given design are studied.
Then, a multi-criterion decision matrix is utilized to choose the optimal model that can be used to determine the proper parameters in constructing the intake.
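A multi-criterion decision matrix of the kind mentioned can be sketched as a simple weighted-sum ranking. The alternatives, criterion names, normalized scores and weights below are hypothetical illustrations, not values from the article:

```python
def decision_matrix_rank(alternatives, weights):
    """Weighted-sum MCDM: score each alternative by the weighted sum of its
    normalized criterion scores (all criteria oriented so higher is better)
    and return the alternatives best-first."""
    def score(criteria):
        return sum(weights[name] * value for name, value in criteria.items())
    return sorted(alternatives, key=lambda a: score(a[1]), reverse=True)

# hypothetical criteria weights and normalized scores in [0, 1]
weights = {"efficiency": 0.5, "cost": 0.3, "environment": 0.2}
alternatives = [
    ("sill+vanes", {"efficiency": 0.9, "cost": 0.4, "environment": 0.7}),
    ("sill only",  {"efficiency": 0.6, "cost": 0.8, "environment": 0.8}),
]
ranked = decision_matrix_rank(alternatives, weights)
best = ranked[0][0]
```

Real applications typically normalize raw measurements per criterion and may use more elaborate schemes (e.g. outranking methods); the weighted sum is only the simplest instance of a decision matrix.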
Abstract: Evolutionary robotics is concerned with the design of
intelligent systems with life-like properties by means of simulated
evolution. Approaches in evolutionary robotics can be categorized
according to the control structures that represent the behavior and the
parameters of the controller that undergo adaptation. The basic idea
is to automatically synthesize behaviors that enable the robot to
perform useful tasks in complex environments. The evolutionary
algorithm searches through the space of parameterized controllers
that map sensory perceptions to control actions, thus realizing a
specific robotic behavior. Further, the evolutionary algorithm
maintains and improves a population of candidate behaviors by
means of selection, recombination and mutation. A fitness function
evaluates the performance of the resulting behavior according to the
robot's task or mission. In this paper, the focus is on the use of
genetic algorithms to solve a multi-objective optimization problem
representing robot behaviors; in particular, the A-Compander Law is
employed in selecting the weight of each objective during the
optimization process. Results using an adaptive fitness function show
that this approach can efficiently react to complex tasks under
variable environments.
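Assuming the "A-Compander Law" refers to the standard A-law compander, the curve compresses large normalized magnitudes relative to small ones, which boosts the relative weight of lagging objectives. The sketch below shows the standard A-law function and one hypothetical way it could reweight objective errors; its use inside the paper's fitness function is an assumption for illustration:

```python
import math

def a_law(x, A=87.6):
    """Standard A-law compander for a normalized value x in [-1, 1]:
    linear near zero, logarithmic compression for larger magnitudes."""
    ax = abs(x)
    if ax < 1.0 / A:
        y = A * ax / (1.0 + math.log(A))
    else:
        y = (1.0 + math.log(A * ax)) / (1.0 + math.log(A))
    return math.copysign(y, x)

def companded_weights(errors, A=87.6):
    """Hypothetical weighting scheme: compand each normalized objective error,
    then renormalize so the weights sum to 1. Compression gives small-error
    objectives relatively more weight than a plain proportional split."""
    y = [a_law(e, A) for e in errors]
    s = sum(y)
    return [v / s for v in y]
```

For example, for normalized errors 0.1 and 0.9, a proportional split would assign weights 0.1 and 0.9, whereas the companded split pulls them much closer together, keeping both objectives active during the genetic search.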