Abstract: Water erosion is the principal agent of the erosion that shapes the earth's surface. Modeling it requires GIS software, which is often commercial and closed source. The very high prices of commercial GIS licenses motivate users and researchers to seek open-source software that is as relevant and applicable as proprietary GIS. The objective of this study is the modeling of water erosion and the hydrogeological and morphophysical characterization of the Oued M'Goun watershed (southern flank of the Central High Atlas) using free GIS programs. Very pertinent results are obtained by executing tasks and algorithms in a simple and easy way. Thus, the various geoscientific and geostatistical analyses of a digital elevation model (SRTM, 30 m resolution), combined with the processing and interpretation of satellite imagery, allowed us to characterize the studied region and to map the areas most vulnerable to water erosion.
Abstract: In order to analyze large-scale scientific data, research
on data exploration and visualization has gained popularity. In this
paper, we focus on the exploration and visualization of scientific
simulation data, and define a spatial V-Optimal histogram for
data summarization. We propose histogram construction algorithms
based on a general binary hierarchical partitioning as well as
a more specific one, the l-grid partitioning. For effective data
summarization and efficient data visualization in scientific data
analysis, we propose an optimal algorithm as well as a heuristic
algorithm for histogram construction. To verify the effectiveness and
efficiency of the proposed methods, we conduct experiments on the
massive evacuation simulation data.
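For readers unfamiliar with V-Optimal histograms, the classical one-dimensional construction (which the paper's spatial variant generalizes) chooses bucket boundaries that minimize the total sum of squared errors via dynamic programming. The sketch below is a minimal illustration of that baseline, not the paper's spatial or l-grid algorithm:

```python
def v_optimal_histogram(data, num_buckets):
    """Minimal 1-D V-Optimal histogram: choose bucket boundaries that
    minimize the total sum of squared errors (SSE) via dynamic programming."""
    n = len(data)
    # Prefix sums of values and squared values give O(1) SSE of any range.
    ps = [0.0] * (n + 1)
    ps2 = [0.0] * (n + 1)
    for i, v in enumerate(data):
        ps[i + 1] = ps[i] + v
        ps2[i + 1] = ps2[i] + v * v

    def sse(i, j):  # SSE of data[i:j] around its mean
        s, s2, m = ps[j] - ps[i], ps2[j] - ps2[i], j - i
        return s2 - s * s / m

    INF = float("inf")
    # cost[k][j]: minimal SSE covering data[:j] with k buckets
    cost = [[INF] * (n + 1) for _ in range(num_buckets + 1)]
    cut = [[0] * (n + 1) for _ in range(num_buckets + 1)]
    cost[0][0] = 0.0
    for k in range(1, num_buckets + 1):
        for j in range(k, n + 1):
            for i in range(k - 1, j):
                c = cost[k - 1][i] + sse(i, j)
                if c < cost[k][j]:
                    cost[k][j], cut[k][j] = c, i
    # Recover bucket boundaries by walking the cut table backwards.
    bounds, j = [], n
    for k in range(num_buckets, 0, -1):
        bounds.append(j)
        j = cut[k][j]
    return cost[num_buckets][n], sorted(bounds)
```

The cubic-time DP above is exactly the kind of cost that motivates the paper's heuristic construction alongside the optimal one.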
Abstract: This paper presents a context-sensitive media similarity search algorithm. One of the central problems in media search is the semantic gap between the low-level features computed automatically from media data and the human interpretation of them. This is because the notion of similarity is usually based on high-level abstraction, but the low-level features sometimes do not reflect human perception. Many media search algorithms have used the Minkowski metric to measure similarity between image pairs. However, these functions cannot adequately capture the characteristics of the human visual system or the nonlinear relationships in the contextual information given by images in a collection. Our search algorithm tackles this problem by employing a similarity measure and a ranking strategy that reflect the nonlinearity of human perception and the contextual information in a dataset. Similarity search in an image database based on this contextual information shows encouraging experimental results.
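The Minkowski metric referred to above is the familiar order-p distance; for p = 2 it reduces to the Euclidean distance and for p = 1 to the city-block distance. A one-line sketch:

```python
def minkowski(x, y, p):
    """Minkowski distance of order p between two feature vectors."""
    return sum(abs(a - b) ** p for a, b in zip(x, y)) ** (1.0 / p)
```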
Abstract: This paper discusses the implementation of the boundary element method (BEM) on an Excel spreadsheet and how it can be used in teaching vector calculus and simulation. There are two separate spreadsheets, within which the Laplace equation is solved by the BEM in two dimensions (LIBEM2) and in axisymmetric three dimensions (LBEMA). The main algorithms are implemented in the programming language associated with Excel, Visual Basic for Applications (VBA). The BEM only requires a boundary mesh and hence is a relatively accessible method. The BEM in the open spreadsheet environment is demonstrated to be useful as an aid to teaching and learning. The application of the BEM implemented on a spreadsheet for educational purposes in introductory vector calculus and simulation is explored. The development of assignment work is discussed, and sample results from student work are given. The spreadsheets were found to be useful tools in developing the students’ understanding of vector calculus and in simulating heat conduction.
Abstract: Driven by the demand of intelligent monitoring in
rehabilitation centers or hospitals, a high accuracy real-time location
system based on UWB (ultra-wideband) technology was proposed.
The system measures precise location of a specific person, traces his
movement and visualizes his trajectory on the screen for doctors or
administrators. Therefore, doctors can view the position of a
patient at any time and locate them immediately and precisely when
an emergency occurs. In our design process, different
algorithms were discussed and their errors were analyzed. In addition,
we discuss a simple but effective way of correcting the
antenna delay error. By choosing the
best algorithm and correcting errors with corresponding methods, the
system attained a good accuracy. Experiments indicated that the
ranging error of the system is lower than 7 cm, the locating error is
lower than 20 cm, and the refresh rate exceeds 5 times per second. In
future works, by embedding the system in wearable IoT (Internet of
Things) devices, it could provide not only physical parameters but
also the activity status of the patient, which would greatly assist
doctors in providing healthcare.
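The abstract does not spell out its positioning algorithm, but a common baseline for UWB ranging is linearized trilateration: subtracting one circle equation from the others yields a linear system in the unknown coordinates. A minimal 2-D sketch with three anchors (illustrative only, not the paper's method):

```python
def trilaterate(anchors, ranges):
    """Estimate a 2-D position from three anchor positions and measured
    ranges by linearizing the circle equations and solving the 2x2 system."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    # Subtract the first circle equation from the other two:
    # 2(xi-x1)x + 2(yi-y1)y = r1^2 - ri^2 + xi^2 - x1^2 + yi^2 - y1^2
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1 ** 2 - r2 ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    b2 = r1 ** 2 - r3 ** 2 + x3 ** 2 - x1 ** 2 + y3 ** 2 - y1 ** 2
    det = a11 * a22 - a12 * a21  # non-zero if the anchors are not collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
```

With more than three anchors the same linearization is typically solved by least squares, which averages out ranging noise.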
Abstract: In this paper, ways of modeling dynamic measurement
systems are discussed. In particular, a linear single-input
single-output system can be modeled with a shallow neural network.
Gradient-based optimization algorithms are then used to search for
the proper coefficients. In addition, methods based on the normal
equation and second-order gradient descent are proposed to accelerate
the modeling process, and ways of obtaining better gradient estimates
are discussed. We show that the mathematical essence of the learning
objective is maximum likelihood estimation under Gaussian noise. For
conventional gradient descent, mini-batch learning and gradient
descent with momentum contribute to faster convergence and enhance
model capability. Lastly, experimental results prove the effectiveness
of the second-order gradient descent algorithm and indicate that
optimization with the normal equation is the most suitable for linear
dynamic models.
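As a minimal illustration of the normal-equation approach (closed-form least squares, which coincides with the maximum-likelihood objective under Gaussian noise mentioned above), consider fitting a single-input single-output linear model y = a·x + b; the paper's dynamic models are more elaborate, but the principle is the same:

```python
def fit_line_normal_equation(xs, ys):
    """Least-squares fit of y = a*x + b via the normal equation; this is
    the closed-form counterpart of iterative gradient descent."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b
```

The closed form needs no learning rate or iteration count, which is why the paper finds it the most suitable option for linear dynamic models.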
Abstract: Multiobjective Particle Swarm Optimization (MOPSO) has shown effective performance in solving test functions and real-world optimization problems. However, this method suffers from premature convergence, which may lead to a lack of diversity. To improve its performance, this paper presents a hybrid approach that embeds MOPSO in the island model and integrates a local search technique, Variable Neighborhood Search, to enhance diversity in the swarm. Experiments on two series of test functions have shown the effectiveness of the proposed approach. A comparison with other evolutionary algorithms shows that the proposed approach performs well in solving multiobjective optimization problems.
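For context, the single-objective PSO update that MOPSO builds on combines an inertia term with cognitive and social attraction. The sketch below is a generic PSO, not the paper's island-model MOPSO with Variable Neighborhood Search; all parameter values are illustrative assumptions:

```python
import random

def pso_minimize(f, dim, iters=100, swarm=20, seed=0):
    """Minimal single-objective PSO: velocity/position updates with
    inertia (w), cognitive (c1) and social (c2) terms."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:          # update personal best
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:         # and, if needed, the global best
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f
```

The strong pull toward `gbest` is what causes the premature-convergence problem the paper addresses with islands and local search.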
Abstract: Today, the Short Message Service (SMS) is an important means of communication. SMS is used not only in informal environments for communication and transactions, but also in formal environments such as institutions, organizations, companies, and the business world. Therefore, there is a need to secure the information transmitted through this medium, both in transit and at rest. Encryption has been identified as a means of providing this security, and several past studies have proposed and developed encryption algorithms for SMS and information security. This research aims at comparing the performance of common asymmetric encryption algorithms for SMS security. It employs three algorithms, namely RSA, McEliece, and Rabin. Several experiments were performed on SMS messages of various sizes on an Android mobile device. The experimental results show that each of the three techniques has different key generation, encryption, and decryption times. The efficiency of an algorithm is determined by the time it takes for encryption, decryption, and key generation, so the best algorithm can be chosen based on the least time required for each operation. The results show the least encryption time when McEliece with key size 4096 is used, while Rabin with key size 4096 takes the most time for encryption and is thus the least effective algorithm in that respect. The research also shows that McEliece with key size 2048 has the least key generation time and is hence the best algorithm for key generation, while RSA with key size 1024 is the most preferable algorithm for decryption, as it gives the least decryption time.
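To make the keygen/encrypt/decrypt timing comparison concrete, here is a textbook-RSA toy with tiny fixed primes and a simple timer. This is purely illustrative: real SMS encryption requires proper key sizes and padding, and the Rabin and McEliece schemes measured in the paper are not shown:

```python
import time

def toy_rsa_keygen():
    """Textbook RSA with tiny fixed primes -- for illustrating the
    keygen/encrypt/decrypt timing methodology only, NOT secure."""
    p, q = 61, 53
    n, phi = p * q, (p - 1) * (q - 1)
    e = 17
    d = pow(e, -1, phi)          # modular inverse (Python 3.8+)
    return (e, n), (d, n)

def encrypt(m, pub):
    e, n = pub
    return pow(m, e, n)          # c = m^e mod n

def decrypt(c, priv):
    d, n = priv
    return pow(c, d, n)          # m = c^d mod n

pub, priv = toy_rsa_keygen()
t0 = time.perf_counter()
c = encrypt(42, pub)
t_enc = time.perf_counter() - t0  # per-operation timing, as in the study
```

Timing each of the three operations separately over messages of varying sizes is exactly the experimental design the abstract describes.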
Abstract: Cellular complexity stems from the interactions
among thousands of different molecular species. Thanks to the
emerging fields of systems and synthetic biology, scientists are
beginning to unravel these regulatory, signaling, and metabolic
interactions and to understand their coordinated action. Reverse
engineering of biological networks has several benefits, but
poor data quality combined with the difficulty of reproducing
it limits the applicability of these methods. A few years back,
many of the commonly used predictive algorithms were tested
on a network constructed in the yeast Saccharomyces cerevisiae
(S. cerevisiae) to resolve this issue. The network was a synthetic
network of five genes regulating each other for the so-called in
vivo reverse-engineering and modeling assessment (IRMA). The
network was constructed in S. cerevisiae since it is a simple and well
characterized organism. The synthetic network included a variety
of regulatory interactions, thus capturing the behaviour of larger
eukaryotic gene networks on a smaller scale. We derive a new set of
algorithms by solving a nonlinear optimization problem and show
how these algorithms outperform other algorithms on these datasets.
Abstract: In recent years, real-time spatial applications, like
location-aware services and traffic monitoring, have become more
and more important. Such applications result in dynamic environments
where data as well as queries are continuously moving. As a result,
there is a tremendous amount of real-time spatial data generated
every day. The growth of the data volume seems to outspeed the
advance of our computing infrastructure. For instance, in real-time
spatial Big Data, users expect to receive the results of each query
within a short time period regardless of the load
on the system. But with a huge amount of real-time spatial data
generated, the system performance degrades rapidly especially in
overload situations. To solve this problem, we propose the use of
data partitioning as an optimization technique. Traditional horizontal
and vertical partitioning can increase the performance of the system
and simplify data management. But they remain insufficient for
real-time spatial Big data; they can’t deal with real-time and
stream queries efficiently. Thus, in this paper, we propose a novel
data partitioning approach for real-time spatial Big data named
VPA-RTSBD (Vertical Partitioning Approach for Real-Time Spatial
Big data). This contribution builds on the Matching
algorithm used in traditional vertical partitioning. We first find the
optimal attribute sequence by means of the Matching algorithm. Then,
we propose a new cost model for database partitioning that
keeps the data volume of each partition balanced and
provides parallel-execution guarantees for the most frequent
queries. VPA-RTSBD aims to obtain a real-time partitioning scheme
and deals with stream data. It improves the performance of query
execution by maximizing the degree of parallel execution. This affects
QoS (Quality Of Service) improvement in real-time spatial Big Data
especially with a huge volume of stream data. The performance of
our contribution is evaluated via simulation experiments. The results
show that the proposed algorithm is both efficient and scalable, and
that it outperforms comparable algorithms.
Abstract: Optimization is an important tool in making decisions and in analysing physical systems. In mathematical terms, an optimization problem is the problem of finding the best solution from among the set of all feasible solutions. This paper discusses the Whale Optimization Algorithm (WOA) and its applications in different fields. The algorithm is implemented and tested in MATLAB. The benchmark functions used with the WOA are grouped as unimodal (F1-F7), multimodal (F8-F13), and fixed-dimension multimodal (F14-F23). Of these benchmark functions, we show the experimental results for F7, F11, and F19 for different numbers of iterations. The search space and objective space for the selected functions are drawn, and finally, the best solution as well as the best optimal value of the objective function found by WOA is presented. The results demonstrate that the WOA performs better than state-of-the-art meta-heuristic and conventional algorithms.
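The WOA search combines three standard moves: encircling the current best solution, a spiral "bubble-net" update, and random exploration around another whale. A minimal sketch of these update rules on a generic objective (parameter choices are illustrative assumptions, not the paper's MATLAB setup):

```python
import math, random

def woa_minimize(f, dim, n_agents=20, iters=100, lb=-10.0, ub=10.0, seed=0):
    """Minimal Whale Optimization Algorithm: encircling prey, spiral
    bubble-net attack, and random search for prey."""
    rng = random.Random(seed)
    X = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(n_agents)]
    best = min(X, key=f)[:]
    best_f = f(best)
    for t in range(iters):
        a = 2.0 - 2.0 * t / iters          # linearly decreases from 2 to 0
        for i in range(n_agents):
            r1, r2 = rng.random(), rng.random()
            A, C = 2 * a * r1 - a, 2 * r2
            if rng.random() < 0.5:
                if abs(A) < 1:             # exploit: encircle the best
                    ref = best
                else:                      # explore: around a random whale
                    ref = X[rng.randrange(n_agents)]
                X[i] = [ref[d] - A * abs(C * ref[d] - X[i][d])
                        for d in range(dim)]
            else:                          # spiral update toward the best
                l = rng.uniform(-1, 1)
                X[i] = [abs(best[d] - X[i][d]) * math.exp(l)
                        * math.cos(2 * math.pi * l) + best[d]
                        for d in range(dim)]
            X[i] = [min(ub, max(lb, x)) for x in X[i]]  # clip to bounds
            fi = f(X[i])
            if fi < best_f:
                best, best_f = X[i][:], fi
    return best, best_f
```

On a unimodal function such as the sphere (the F1 family of the benchmarks above), this sketch converges toward the origin.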
Abstract: Motion recognition from videos is a very
complex task due to the high variability of motions. This paper
describes the challenges of human motion recognition, especially the
motion representation step with relevant features. Our descriptor
vector is inspired by the Laban Movement Analysis method. We
select discriminative features using the Random Forest algorithm
in order to remove redundant features and make learning algorithms
operate faster and more effectively. We validate our method on
MSRC-12 and UTKinect datasets.
Abstract: This paper presents an optimization method based
on a genetic algorithm for energy management inside buildings,
developed within the framework of the Smart Living Lab (SLL) project
in Fribourg (Switzerland). This algorithm optimizes the interaction
between renewable energy production, storage systems and energy
consumers. In comparison with standard algorithms, the innovative
aspect of this project is the extension of the smart regulation
over three simultaneous criteria: energy self-consumption, the
reduction of greenhouse gas emissions, and operating costs. The
genetic algorithm approach was chosen due to the large quantity
of optimization variables and the non-linearity of the optimization
function. The optimization process also includes real-time data of the
building as well as weather forecasts and user habits. This information
is used by a physical model of the building's energy resources to predict
future energy production and needs, to select the best energy
strategy, and to combine production and storage of energy in order to
meet the demand for electrical and thermal energy. The principle
of operation of the algorithm as well as a typical output example
are presented.
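As a generic illustration of the genetic-algorithm machinery (not the project's actual implementation), the sketch below evolves a real-valued decision vector with tournament selection, uniform crossover, Gaussian mutation, and elitism; in the paper's setting the cost function would fold the three criteria into a single fitness value:

```python
import random

def ga_minimize(cost, dim, pop_size=30, gens=80, seed=1):
    """Minimal real-coded genetic algorithm: tournament selection,
    uniform crossover, Gaussian mutation, and elitism."""
    rng = random.Random(seed)
    pop = [[rng.uniform(0, 1) for _ in range(dim)] for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if cost(a) < cost(b) else b

    for _ in range(gens):
        elite = min(pop, key=cost)[:]      # keep the best individual
        new_pop = [elite]
        while len(new_pop) < pop_size:
            p1, p2 = tournament(), tournament()
            # uniform crossover: each gene comes from either parent
            child = [p1[d] if rng.random() < 0.5 else p2[d]
                     for d in range(dim)]
            if rng.random() < 0.2:         # Gaussian mutation
                d = rng.randrange(dim)
                child[d] = min(1.0, max(0.0, child[d] + rng.gauss(0, 0.1)))
            new_pop.append(child)
        pop = new_pop
    best = min(pop, key=cost)
    return best, cost(best)
```

The population-based search handles many decision variables and a non-linear objective without needing gradients, which is the reason the abstract gives for choosing a genetic algorithm.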
Abstract: In the present work we developed an image processing
algorithm to measure water droplet characteristics during dropwise
condensation on pillared surfaces. The main problem in this process is
the similarity between shape and size of water droplets and the pillars.
The developed method divides droplets into four main groups based
on their size and applies the corresponding algorithm to segment each
group. These algorithms generate binary images of droplets based
on both their geometrical and intensity properties. Information
on droplet evolution over time, including the mean radius and the
number of drops per unit area, is then extracted from the binary images.
The developed image processing algorithm is verified using manual
detection and applied to two different sets of images corresponding
to two kinds of pillared surfaces.
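The paper's segmentation is size-group specific, but its building blocks reduce to intensity thresholding and area measurement on the resulting binary mask. A minimal, illustrative sketch of those two steps:

```python
def binarize(image, threshold):
    """Global intensity threshold: the simplest form of the intensity-based
    segmentation step, producing a binary droplet mask (1 = droplet)."""
    return [[1 if px >= threshold else 0 for px in row] for row in image]

def foreground_fraction(mask):
    """Fraction of pixels marked as droplet -- a proxy for covered area."""
    total = sum(len(row) for row in mask)
    return sum(sum(row) for row in mask) / total
```

A per-size-group algorithm, as in the paper, would apply different thresholds and geometric criteria to each group before merging the masks.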
Abstract: Structural design and analysis is an important and time-consuming process, particularly at the conceptual design stage. Decisions made at this stage can have an enormous effect on the entire project, as it becomes ever costlier and more difficult to alter the choices made early on in the construction process. Hence, optimisation of the early stages of structural design can provide important efficiencies in terms of cost and time. This paper suggests a structural design optimisation (SDO) framework in which Genetic Algorithms (GAs) may be used to semi-automate the production and optimisation of early structural design alternatives. This framework has the potential to leverage conceptual structural design innovation in Architecture, Engineering and Construction (AEC) projects. Moreover, this framework improves the collaboration between the architectural stage and the structural stage. It will be shown that this SDO framework can make this achievable by generating the structural model based on the extracted data from the architectural model. At the moment, the proposed SDO framework is in the process of validation, involving the distribution of an online questionnaire among structural engineers in the UK.
Abstract: Classification of high resolution polarimetric Synthetic Aperture Radar (PolSAR) images plays an important role in land cover and land use management. Recently, classification algorithms based on the Bag of Visual Words (BOVW) model have attracted significant interest among scholars and researchers in and out of the field of remote sensing. In this paper, the BOVW model with pixel-based low-level features has been implemented to classify a subset of the San Francisco bay PolSAR image, acquired by RADARSAT-2 in C-band. We have used a segment-based decision-making strategy and compared the result with that of a traditional Support Vector Machine (SVM) classifier. The 90.95% overall classification accuracy shows that the proposed algorithm is comparable with state-of-the-art methods. In addition to increasing classification accuracy, the proposed method reduces the undesirable speckle effect of SAR images.
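In the BOVW model, each image (or each segment, in the segment-based strategy) is represented by a normalized histogram of visual-word occurrences. A minimal sketch of that representation step, assuming local features have already been assigned word ids by a clustering step:

```python
def bovw_histogram(word_ids, vocab_size):
    """Normalized Bag-of-Visual-Words histogram for one image or segment,
    given the visual-word ids assigned to its local features."""
    hist = [0.0] * vocab_size
    for w in word_ids:
        hist[w] += 1.0
    total = sum(hist)
    return [h / total for h in hist] if total else hist
```

These histograms are then fed to a classifier such as the SVM used as the baseline in the paper.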
Abstract: The present work proposes the development of an adaptive control system which enables the suppression of Pilot Induced Oscillations (PIO) in Digital Fly-By-Wire (DFBW) aircraft. The proposed system consists of a Modified Model Reference Adaptive Control (M-MRAC) integrated with the Gain Scheduling technique. PIO events are detected using a Real Time Oscillation Verifier (ROVER) algorithm, which then enables the system to switch between two reference models: one for the PIO condition, with low proneness to the phenomenon, and another for the normal condition, with high (or medium) proneness. The reference models are defined in a closed-loop condition using the Linear Quadratic Regulator (LQR) control methodology for Multiple-Input Multiple-Output (MIMO) systems. The implemented algorithms are simulated in software with state-space models and commercial flight simulators as the controlled elements, together with pilot dynamics models. A sequence of pitch angles, named the Synthetic Task (Syntask), is considered as the reference signal that must be tracked by the pilot models. The initial outcomes show that the proposed system can detect and suppress (or mitigate) PIO in real time before the oscillations reach high amplitudes.
Abstract: This paper considers the problem of mining
sequential patterns embedded in a database while handling the time
constraints defined in the GSP algorithm (a level-wise algorithm).
We compare two previous approaches, GTC and PSP, which
retain the general principles of GSP. Furthermore, this paper
discusses the PG-hybrid algorithm, which combines PSP and GTC. The
results show that PSP and GTC are more efficient than GSP. On the
other hand, the GTC algorithm performs better than PSP. The PG-hybrid
algorithm uses the PSP algorithm for the first two passes over the
database and the GTC approach for the following scans. Experiments show
that the hybrid approach is very efficient for short, frequent sequences.
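At the core of GSP-style level-wise mining is a support count based on ordered itemset containment. A minimal sketch of that containment test and support computation (the time constraints such as windows and gaps, which GSP and GTC handle, are omitted here):

```python
def is_subsequence(pattern, sequence):
    """Check whether `pattern` (a list of itemsets) is contained in
    `sequence` (a list of itemsets) in order -- the containment test at
    the heart of GSP-style support counting."""
    it = iter(sequence)
    # each pattern itemset must be a subset of some later sequence itemset
    return all(any(p <= s for s in it) for p in pattern)

def support(pattern, database):
    """Fraction of data sequences that contain the pattern."""
    return sum(is_subsequence(pattern, seq) for seq in database) / len(database)
```

A level-wise algorithm repeats this count for candidates of growing length, pruning those below the minimum support at each pass over the database.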
Abstract: Road traffic accidents are among the principal causes of
traffic congestion, causing human losses, damages to health and the
environment, economic losses and material damages. Traditional
studies of road traffic accidents in urban zones demand a large
investment of time and money, and their results quickly become outdated.
However, nowadays in many countries, crowdsourced GPS-based
traffic and navigation apps have emerged as an important low-cost source
of information for studies of road traffic accidents and the
urban congestion they cause. In this article we identified the
zones, roads, and specific times in the CDMX in which the largest
number of road traffic accidents were concentrated during 2016. We
built a database compiling information obtained from the social
network known as Waze. The methodology employed was Knowledge
Discovery in Databases (KDD) for discovering patterns
in the accident reports, applying data mining techniques
with the help of Weka. The selected algorithms were Expectation
Maximization (EM), to obtain the ideal number of clusters for the
data, and k-means as the grouping method. Finally, the results were
visualized with the Geographic Information System QGIS.
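As a minimal illustration of the grouping step (after EM has suggested the number of clusters k, as in the workflow above), here is a plain k-means on 2-D coordinates; this is a generic sketch, not the Weka implementation used in the article:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means for grouping 2-D coordinates (e.g. accident
    locations): alternate assignment to the nearest center and
    recomputation of each center as its cluster mean."""
    rng = random.Random(seed)
    centers = [list(p) for p in rng.sample(points, k)]

    def nearest(p):
        return min(range(k),
                   key=lambda c: sum((a - b) ** 2
                                     for a, b in zip(p, centers[c])))

    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[nearest(p)].append(p)
        for c, members in enumerate(clusters):
            if members:  # keep the old center if a cluster goes empty
                centers[c] = [sum(v) / len(members) for v in zip(*members)]
    labels = [nearest(p) for p in points]
    return centers, labels
```

The resulting labels can be exported with their coordinates for visualization in a GIS such as QGIS.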
Abstract: Any industrial company needs to determine the amount of variation that exists within its measurement process and guarantee the reliability of its data by studying the performance of its measurement system in terms of linearity, bias, repeatability, reproducibility, and stability. This issue is critical for automotive industry suppliers, who are required to be certified to the IATF 16949:2016 standard (which replaces ISO/TS 16949) of the International Automotive Task Force, defining the requirements of a quality management system for companies in the automotive industry. Measurement System Analysis (MSA) is one of its mandatory tools. Frequently, the measurement system in companies is not connected to the equipment and does not incorporate the methods proposed by the Automotive Industry Action Group (AIAG). To address these constraints, an R&D project is in progress whose objective is to develop a web- and cloud-based MSA tool. This MSA tool incorporates Industry 4.0 concepts, such as Internet of Things (IoT) protocols to assure the connection with the measuring equipment, cloud computing, artificial intelligence, statistical tools, and advanced mathematical algorithms. This paper presents the preliminary findings of the project. The web- and cloud-based MSA tool is innovative because it implements all statistical tests proposed in the MSA-4 reference manual from AIAG as well as other emerging methods and techniques. As it is integrated with the measuring devices, it reduces the manual input of data and therefore the errors. The tool ensures traceability of all performed tests and can be used in quality laboratories and on production lines. Besides, it monitors MSAs over time, allowing both the analysis of deviations in the variation of the measurements performed and the management of measurement equipment and calibrations. To develop the MSA tool, a ten-step approach was implemented.
Firstly, a benchmarking analysis of the current competitors and commercial solutions linked to MSA was performed with regard to the Industry 4.0 paradigm. Next, an analysis of the size of the target market for the MSA tool was done. Afterwards, data flow and traceability requirements were analysed in order to implement an IoT data network that interconnects with the equipment, preferably wirelessly. The MSA web solution was designed under UI/UX principles, and an API in Python was developed to implement the algorithms and the statistical analysis. Continuous validation of the tool by companies is being performed to assure real-time management of the ‘big data’. The main results of this R&D project are: the web- and cloud-based MSA tool; the Python API; new algorithms for the market; and the UI/UX style guide of the tool. The proposed MSA tool adds value to the state of the art as it ensures an effective response to the new challenges of measurement systems, which are increasingly critical in production processes. Although the automotive industry has triggered the development of this innovative MSA tool, other industries would also benefit from it. Currently, companies from the molds and plastics, chemical, and food industries are already validating it.
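As a simplified illustration of what an MSA computation looks like, the sketch below estimates repeatability (pooled within-cell variance of repeated readings) and reproducibility (between-operator variance) from a nested set of measurements. This is a toy decomposition for intuition only, not the full MSA-4 ANOVA method the tool implements:

```python
def gage_rr(measurements):
    """Simplified Gage R&R: measurements[operator][part] is a list of
    repeated readings. Returns (repeatability, reproducibility) as
    variance estimates."""
    # Repeatability: pooled variance of repeats within each operator/part cell.
    ss, dof = 0.0, 0
    cell_means = {}
    for op, parts in measurements.items():
        for part, reps in parts.items():
            m = sum(reps) / len(reps)
            cell_means[(op, part)] = m
            ss += sum((r - m) ** 2 for r in reps)
            dof += len(reps) - 1
    repeatability = ss / dof
    # Reproducibility: variance of operator means around the grand mean.
    op_means = []
    for op in measurements:
        vals = [m for (o, _), m in cell_means.items() if o == op]
        op_means.append(sum(vals) / len(vals))
    grand = sum(op_means) / len(op_means)
    reproducibility = (sum((m - grand) ** 2 for m in op_means)
                       / (len(op_means) - 1))
    return repeatability, reproducibility
```

In the real tool, readings would arrive from the connected equipment over the IoT network rather than being typed in, which is the manual-input error source the abstract highlights.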