The Whale Optimization Algorithm and Its Implementation in MATLAB

Optimization is an important tool for decision making and for analysing physical systems. In mathematical terms, an optimization problem is the problem of finding the best solution from the set of all feasible solutions. This paper discusses the Whale Optimization Algorithm (WOA) and its applications in different fields. The algorithm is tested using MATLAB because of its unique and powerful features. The benchmark functions used with the WOA are grouped as unimodal (F1-F7), multimodal (F8-F13), and fixed-dimension multimodal (F14-F23). Of these benchmark functions, we show the experimental results for F7, F11, and F19 for different numbers of iterations. The search space and objective space for the selected functions are drawn, and finally, the best solution as well as the optimal value of the objective function found by WOA is presented. The results demonstrate that WOA performs better than state-of-the-art meta-heuristic and conventional algorithms.
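To make the algorithm concrete, here is a minimal Python sketch of the core WOA loop (shrinking encircling, random search, and the spiral bubble-net move) applied to a sphere-type benchmark. The paper's experiments use MATLAB, and all choices below (population size, iteration count, spiral constant b = 1) are illustrative defaults, not the paper's settings.

```python
# Minimal sketch of the Whale Optimization Algorithm (WOA) core loop.
import numpy as np

def woa(objective, dim=10, n_whales=30, n_iter=200, lb=-10.0, ub=10.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_whales, dim))      # whale positions
    fit = np.array([objective(x) for x in X])
    best, best_f = X[fit.argmin()].copy(), fit.min()   # best solution ("prey")
    for t in range(n_iter):
        a = 2.0 * (1.0 - t / n_iter)              # 'a' decreases linearly 2 -> 0
        for i in range(n_whales):
            A = 2.0 * a * rng.random() - a        # coefficient A
            C = 2.0 * rng.random()                # coefficient C
            if rng.random() < 0.5:
                if abs(A) < 1.0:                  # exploitation: encircle the best
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                             # exploration: follow a random whale
                    Xr = X[rng.integers(n_whales)]
                    X[i] = Xr - A * np.abs(C * Xr - X[i])
            else:                                 # spiral bubble-net move (b = 1)
                l = rng.uniform(-1.0, 1.0)
                X[i] = np.abs(best - X[i]) * np.exp(l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lb, ub)
            f = objective(X[i])
            if f < best_f:
                best, best_f = X[i].copy(), f
    return best, best_f

# Example: sphere-type benchmark (comparable in spirit to F1).
sol, val = woa(lambda x: float(np.sum(x * x)))
print(val)
```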

The Creative Unfolding of “Reduced Descriptive Structures” in Musical Cognition: Technical and Theoretical Insights Based on Long-Term Feedback from OpenMusic and PWGL

We describe here the theoretical and philosophical understanding gained from the long-term use and development of algorithmic computer-based tools for music composition. The findings of our research lead us to interrogate specific processes and systems of communication engaged in the discovery of cultural artworks: artistic creation in the sono-musical domain. Our hypothesis is that patterns of auditory learning cannot be understood solely in terms of social transmission; they should also be questioned in terms of how they rely on various ranges of acoustic stimuli and modes of consciousness, and how the different types of memory engaged in the percept-action expressive systems of our cultural communities also rely on the shadowy conscious entities we have named “Reduced Descriptive Structures”.

A Neuron Model of Facial Recognition and Detection of an Authorized Entity Using a Machine Learning System

This paper critically examines the use of machine learning procedures to curb unauthorized access to valuable areas of an organization. The use of passwords, PIN codes, and user identification has in recent times been only partially successful in curbing identity-related crimes, hence the need for a system which incorporates biometric characteristics such as DNA and pattern recognition of variations in facial expressions. The facial model uses the OpenCV library, which is based on certain physiological features; the OpenCV library is compiled on a Raspberry Pi 3 module, which extracts and stores the detected faces in the datasets directory through the use of a camera. The model is trained with 50 epoch runs on the database and recognized by the Local Binary Pattern Histogram (LBPH) recognizer contained in OpenCV. The training algorithm used by the neural network is backpropagation, coded in Python, with 200 epoch runs to identify specific resemblance in the exclusive OR (XOR) output neurons. The research confirms that physiological parameters are more effective measures for curbing identity-related crimes.
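A minimal sketch of the detection-plus-LBPH pipeline described above, using OpenCV's Haar cascade detector and the LBPH recognizer from opencv-contrib-python. The file paths, label IDs, and the distance threshold are illustrative assumptions; the Raspberry Pi camera capture and the neural network stages are omitted.

```python
# Sketch: detect faces, train the LBPH recognizer, then authorize a new capture.
import cv2
import numpy as np

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def extract_face(image_path):
    """Detect the largest face in an image and return a 200x200 grayscale patch."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # keep the largest detection
    return cv2.resize(gray[y:y + h, x:x + w], (200, 200))

# Enrol the authorized person from images in the datasets directory (paths assumed).
train_faces, train_labels = [], []
for path, label in [("datasets/user1_01.jpg", 1), ("datasets/user1_02.jpg", 1)]:
    face = extract_face(path)
    if face is not None:
        train_faces.append(face)
        train_labels.append(label)

recognizer = cv2.face.LBPHFaceRecognizer_create()
recognizer.train(train_faces, np.array(train_labels))

# Authorize or reject a new capture; LBPH returns a distance (lower = closer match).
probe = extract_face("capture.jpg")
label, distance = recognizer.predict(probe)
print("authorized" if label == 1 and distance < 60 else "access denied")
```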

Activation Parameters of the Low Temperature Creep Controlling Mechanism in Martensitic Steels

Martensitic steels with an ultimate tensile strength beyond 2000 MPa are applied in vehicle powertrains due to their excellent fatigue strength and high creep resistance. However, the creep controlling mechanism in martensitic steels at ambient temperatures up to 423 K is not evident. The purpose of this study is to review the low temperature creep (LTC) behavior of martensitic steels at temperatures from 363 K to 523 K. Thus, the validity of a logarithmic creep law is reviewed, and the stress and temperature dependence of the creep parameters α and β are revealed. Furthermore, creep tests that include stepped changes in temperature or stress are carried out. On the one hand, the change of the creep rate due to a temperature step provides information on the magnitude of the activation energy of the LTC controlling mechanism; on the other hand, the stress step approach provides information on the magnitude of the activation volume. The magnitude, temperature dependency, and stress dependency of both material-specific activation parameters may contribute significantly to disclosing the nature of the LTC rate controlling mechanism.
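For reference, the logarithmic creep law and the standard step-test relations referred to above can be sketched as follows (assumed standard forms; the paper's exact conventions for α, β, and the rate sensitivities may differ):

```latex
% Logarithmic creep law and its rate (assumed standard form):
\varepsilon(t) = \varepsilon_0 + \alpha \ln(1 + \beta t), \qquad
\dot{\varepsilon}(t) = \frac{\alpha \beta}{1 + \beta t}
% Step tests: apparent activation energy Q from a temperature step T_1 -> T_2,
% activation volume V* from a stress step (k = Boltzmann constant):
Q \approx k\,\frac{T_1 T_2}{T_2 - T_1}\,\ln\frac{\dot{\varepsilon}_2}{\dot{\varepsilon}_1},
\qquad
V^{*} = k T \left(\frac{\partial \ln \dot{\varepsilon}}{\partial \sigma}\right)_{T}
```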

Ultrasound Assisted Extraction and Microwave Assisted Extraction of Carotenoids from Melon Shells

Melons such as muskmelon and watermelon contain biologically active molecules such as carotenoids, which are natural pigments used as food colorants and afford health benefits. β-carotene is the major carotenoid present in muskmelon and watermelon shell. Carotenoids were extracted using microwave-assisted extraction (MAE) and ultrasound-assisted extraction (UAE) utilising organic lipophilic solvents such as acetone, methanol, and hexane. The extraction conditions (feed-solvent ratio, microwave power, ultrasound frequency, temperature, and particle size) were varied and optimized. It was found that the yield of carotenoids was higher using UAE than MAE, and muskmelon gave the highest yield of carotenoids when ethanol was used as the solvent for a 0.5 mm particle size.

Performance of the Strong Stability Method in the Univariate Classical Risk Model

In this paper, we study the performance of the strong stability method in the univariate classical risk model. We are interested in the stability bounds established using two approaches: the first is based on the strong stability method developed for general Markov chains, and the second on the theory of regenerative processes. By adopting an algorithmic procedure, we study the performance of the stability method in the case of exponentially distributed claim amounts. After presenting the stability bounds numerically and graphically, we interpret and compare the results.
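As an illustration of the model under study, here is a minimal Monte Carlo sketch of the univariate classical (Cramér-Lundberg) risk process with exponential claims, checked against the well-known closed-form ruin probability. The parameters are illustrative, and this is not the paper's stability-bound algorithm.

```python
# Sketch: simulate the classical risk process U(t) = u + c*t - sum of claims,
# with Poisson(lam) claim arrivals and Exp(mu) claim amounts (mean 1/mu).
import numpy as np

def ruin_probability(u, c, lam, mu, horizon=200.0, n_paths=5000, seed=0):
    """Estimate P(ruin before `horizon`) for initial reserve u and premium rate c."""
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(n_paths):
        t, reserve = 0.0, u
        while t < horizon:
            dt = rng.exponential(1.0 / lam)       # time to next claim
            t += dt
            reserve += c * dt                     # premium income between claims
            reserve -= rng.exponential(1.0 / mu)  # claim amount
            if reserve < 0.0:
                ruined += 1
                break
    return ruined / n_paths

# Exponential claims admit the closed form psi(u) = lam/(c*mu) * exp(-(mu - lam/c)*u),
# valid under the net profit condition c > lam/mu.
u, c, lam, mu = 5.0, 1.2, 1.0, 1.0
print("simulated  :", ruin_probability(u, c, lam, mu))
print("closed form:", lam / (c * mu) * np.exp(-(mu - lam / c) * u))
```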

Performance Analysis of a Hybrid DF-AF Hybrid RF/FSO System under Gamma Gamma Atmospheric Turbulence Channel Using MPPM Modulation

The performance of a hybrid amplify-and-forward/decode-and-forward (AF-DF) hybrid radio frequency/free space optical (RF/FSO) communication system that adopts M-ary pulse position modulation (MPPM) is analyzed. Both exact and approximate symbol-error rates (SERs) are derived. The random variations of the received optical irradiance produced by atmospheric turbulence are modeled by the gamma-gamma (GG) statistical distribution. A closed-form expression for the probability density function (PDF) of the overall system is derived. Thanks to the hybrid AF-DF hybrid RF/FSO configuration and MPPM, the effects of atmospheric turbulence are mitigated; hence the capacity to combat atmospheric turbulence and the transmitted signal quality are improved.
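For reference, the gamma-gamma irradiance PDF that models the turbulence above has the following standard form (α and β are the large-scale and small-scale scintillation parameters, and K_ν is the modified Bessel function of the second kind); the paper's closed-form system PDF builds on this channel model:

```latex
f_I(I) = \frac{2\,(\alpha\beta)^{(\alpha+\beta)/2}}{\Gamma(\alpha)\,\Gamma(\beta)}\,
I^{\frac{\alpha+\beta}{2}-1}\,
K_{\alpha-\beta}\!\left(2\sqrt{\alpha\beta\, I}\right), \qquad I > 0
```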

A Semi-Implicit Phase Field Model for Droplet Evolution

A semi-implicit phase field method for droplet evolution is proposed. Using the phase field Cahn-Hilliard equation, we are able to track the interface in multiphase flow. The idea of a semi-implicit finite difference scheme is reviewed and employed to solve two nonlinear equations: the Navier-Stokes and Cahn-Hilliard equations. The use of a semi-implicit method allows us to take larger time steps than explicit schemes. The governing equations are coupled and then solved with a GMRES (generalized minimal residual method) solver using modified Gram-Schmidt orthogonalization. To show the validity of the method, we apply it to the simulation of a rising droplet, a leaky dielectric drop, and the coalescence of drops. The numerical solutions of the phase field model match well with existing solutions over a defined range of variables.
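To illustrate the semi-implicit idea, here is a minimal Python sketch of one time step for the Cahn-Hilliard equation alone, Fourier-spectral in space, with the stiff biharmonic term treated implicitly and the nonlinearity explicitly: this splitting is what permits the larger time steps mentioned above. The paper's own finite difference discretization, Navier-Stokes coupling, and GMRES solve are omitted, and all parameters are illustrative.

```python
# Semi-implicit Fourier-spectral step for phi_t = M * lap(phi^3 - phi - eps^2 * lap(phi)).
import numpy as np

N, L, eps, M, dt = 128, 2 * np.pi, 0.05, 1.0, 0.005
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)           # angular wavenumbers
kx, ky = np.meshgrid(k, k, indexing="ij")
k2 = kx**2 + ky**2                                   # symbol of -Laplacian

rng = np.random.default_rng(0)
phi = 0.1 * rng.standard_normal((N, N))              # random initial mixture

def step(phi):
    f_hat = np.fft.fft2(phi**3 - phi)                # explicit nonlinear part
    phi_hat = np.fft.fft2(phi)
    # implicit treatment of the eps^2 biharmonic term keeps the step stable:
    phi_hat = (phi_hat - dt * M * k2 * f_hat) / (1.0 + dt * M * eps**2 * k2**2)
    return np.real(np.fft.ifft2(phi_hat))

for _ in range(500):                                 # coarsening / spinodal decomposition
    phi = step(phi)
```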

Implementation of State-Space and Super-Element Techniques for the Modeling and Control of Smart Structures with Damping Characteristics

Minimizing the weight of flexible structures reduces material use and costs as well. However, these structures can become prone to vibrations. Attenuating these vibrations has become a pivotal engineering problem and the focus of many research endeavors. One technique is to design and implement an active control system. Such a system is mainly composed of a vibrating structure, a sensor to perceive the vibrations, an actuator to counteract the influence of disturbances, and finally a controller to generate the appropriate control signals. In this work, two different techniques are explored to create two different mathematical models of an active control system. The first model is a finite element model with a reduced number of nodes, called a super-element. The second model is a state-space representation, i.e., a set of first-order ordinary differential equations. The damping coefficients are calculated and incorporated into both models. The effectiveness of these models is demonstrated by exciting the system at its first natural frequency and developing and implementing an active control strategy to attenuate the resulting vibrations. Results from both modeling techniques are presented and compared.
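As a sketch of the state-space form for a damped structure (standard form, assuming mass, damping, and stiffness matrices M, C, K and input matrix F; these are not the paper's specific matrices):

```latex
% Second-order FE model  M q'' + C q' + K q = F u  recast in first-order
% state-space form with state x = [q; q']:
\dot{x} = A x + B u, \qquad
A = \begin{bmatrix} 0 & I \\ -M^{-1}K & -M^{-1}C \end{bmatrix}, \qquad
B = \begin{bmatrix} 0 \\ M^{-1}F \end{bmatrix}
```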

Experimental Simulation Set-Up for Validating Out-Of-The-Loop Mitigation when Monitoring High Levels of Automation in Air Traffic Control

An increasing degree of automation in air traffic will also change the role of the air traffic controller (ATCO). ATCOs will fulfill significantly more monitoring tasks than today. However, this rather passive role may lead to Out-Of-The-Loop (OOTL) effects comprising vigilance decrement and reduced situation awareness. The project MINIMA (Mitigating Negative Impacts of Monitoring high levels of Automation) has conceived a system to control and mitigate such OOTL phenomena. In order to demonstrate the MINIMA concept, an experimental simulation set-up has been designed. This set-up consists of two parts: 1) a Task Environment (TE) comprising a Terminal Maneuvering Area (TMA) simulator and 2) a Vigilance and Attention Controller (VAC) based on neurophysiological data recording, such as electroencephalography (EEG), and eye-tracking devices. The current vigilance level and the attention focus of the controller are measured during the ATCO’s active work in front of the human machine interface (HMI). The derived vigilance level and attention focus trigger adaptive automation functionalities in the TE to avoid OOTL effects. This paper describes the full-scale experimental set-up and the component development work towards it. Hence, it encompasses a pre-test whose results influenced the development of the VAC as well as the functionalities of the final TE and the VAC’s two sub-components.

A Study of Combined Mechanical and Chemical Stabilisation of Fine Grained Dredge Soil of River Jhelum

After the devastating flood in Kashmir in 2014, dredging of the local water bodies, especially the Jhelum River, has become a priority for the government. Under the 'Comprehensive Flood Management Programme', the local government plans to increase the discharge of existing flood channels through removal of encroachments, acquisition of additional land, dredging, and other works on the water bodies. The total quantity of soil to be dredged will be 16.15 lakh cubic metres. This dredged soil is a major output of the project and requires disposal or utilization. This study analyses the effect of cement and sand on the engineering properties of the soil. The tests were conducted with variable additions of sand (10%, 20%, and 30%) at a fixed cement content of 12%; samples of soil-cement (12%) alone and soil-sand (30%) alone were tested as well. Laboratory experiments were conducted to determine the engineering characteristics of the soil, i.e., compaction, strength, and CBR characteristics. The strength characteristics of the soil were determined by the unconfined compressive strength test and the direct shear test. Unconfined compressive strength was tested both immediately and after a curing period of seven days. The CBR test was performed on unsoaked, soaked (worst condition, 4 days), and cured (4 days) samples.

Excess Loop Delay Calibration in Bandpass Continuous-Time Delta-Sigma Modulators Based on a Q-Enhanced LC Filter

Q-enhanced LC filters are the most used architecture in Bandpass (BP) Continuous-Time (CT) Delta-Sigma (ΣΔ) modulators due to their high-frequency operation, higher linearity than active filters, and the high quality factor obtained by the Q-enhancement technique. This technique consists of using a negative resistance that compensates the ohmic losses of the on-chip inductor. However, it introduces a zero in the filter transfer function which affects the modulator performance in terms of Dynamic Range (DR), stability, and in-band noise (Signal-to-Noise Ratio (SNR)). In this paper, we study the effect of this zero and demonstrate that a calibration of the excess loop delay (ELD) is required to ensure the best performance of the modulator. System-level simulations are carried out for a second-order BP CT ΣΔ modulator at a center frequency of 300 MHz. Simulation results indicate that the optimal ELD should be reduced by 13% to achieve the maximum SNR and DR compared to the ideal LC-based ΣΔ modulator.
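As a first-order sketch of the Q-enhancement principle mentioned above (standard parallel-tank relations; R_p, G_m, and Q_0 are assumed notation, not the paper's):

```latex
% A negative conductance -G_m placed across a lossy tank with parallel loss
% resistance R_p raises the effective resistance and hence the quality factor:
R_{\mathrm{eff}} = \frac{R_p}{1 - G_m R_p}, \qquad
Q_{\mathrm{enh}} = \frac{Q_0}{1 - G_m R_p}, \qquad Q_0 = \omega_0 R_p C
```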

Generative Syntaxes: Macro-Heterophony and the Form of ‘Synchrony’

One of the most powerful language innovations in twentieth-century music was heterophony, a hypostasis of the vertical syntax that entered the sphere of interest of many composers, such as George Enescu, Pierre Boulez, Mauricio Kagel, György Ligeti and others. The heterophonic syntax has a history of its own growth, meaning a succession of different concepts and writing techniques. The trajectory of the settling of this phenomenon does not necessarily follow chronology: there are highly complex primary stages and advanced stages that return to simple forms of writing. In folklore, plurimelodic simultaneities are free or random and originate from the (unintentional) differences/‘deviations’ from the state of unison, through a variety of ornaments, melismas, imitations, elongations and abbreviations, all within a flexible, non-periodic/immeasurable rhythmic framework proper to parlando-rubato rhythmics. Within the general framework of multivocal organization, the heterophonic syntax in its elaborate (academic) version imposed itself relatively late compared with polyphony and homophony. The explanation is simple if we consider the causal relationship between the elements of the sound vocabulary – in this case, modalism – and the typologies of vertical organization appropriate to it. Therefore, extending the ‘classic’ pathway of writing typologies (monody – polyphony – homophony), heterophony – applied equally to structures of modal, serial or synthesis vocabulary – necessarily claims a macrotemporal form of its own, in the sense of the analogies enshrined by the evolution of musical styles and languages: polyphony→fugue, homophony→sonata. Concerned with the prospect of building a new musical ontology, the composer Ştefan Niculescu experimented – along with the mathematical organization of heterophony according to his own original methods – with the possibility of extrapolating this phenomenon to the macrostructural plane, arriving in this way at the unique form of ‘synchrony’. Founded on the principle of coincidentia oppositorum (involving the ‘one-multiple’ pair), the sound architecture imagined by Ştefan Niculescu consists of one (temporal) model/algorithm of articulation of two sound states: 1. the monovocality state (principle of identity) and 2. the multivocality state (principle of difference). In this context, heterophony becomes an (auto)generative mechanism with macrotemporal amplitude, a strategy the composer cultivated practically throughout his creative output (see the works Ison I, Ison II, Unisonos I, Unisonos II, Duplum, Triplum, Psalmus, and Héterophonies pour Montreux (Homages to Enescu and Bartók), among others). For the present demonstration, we selected one of the most edifying works of Ştefan Niculescu – Symphony II, Opus dacicum – in which the form of (heterophony-)synchrony acquires monumental-symphonic features, representing an emblematic case of the level of complexity achieved by this type of vertical syntax in twentieth-century music.

Building the Professional Readiness of Graduates from Day One: An Empirical Approach to Curriculum Continuous Improvement

Industry employers require new graduates to bring with them a range of knowledge, skills and abilities that enable these new employees to immediately make valuable work contributions. These will be a combination of discipline and professional knowledge, skills and abilities which give graduates the technical capabilities to solve practical problems whilst interacting with a range of stakeholders. Underpinning the development of this discipline and professional knowledge, skills and abilities are “enabling” knowledge, skills and abilities which assist students to engage in learning. These are academic and learning skills which provide essential common starting points for students entering the course and form the foundation for the fully developed graduate knowledge, skills and abilities. This paper reports on a project created to introduce and strengthen these enabling skills in the first semester of a Bachelor of Information Technology degree at an Australian polytechnic. The project uses an action research approach in the context of ongoing continuous improvement of the course to enhance the overall learning experience, learning sequencing, graduate outcomes and, most importantly in the first semester, student engagement and retention. The focus is on implementing the new curriculum in first semester subjects of the course with the aim of developing the “enabling” learning skills, such as literacy, research and numeracy-based knowledge, skills and abilities (KSAs). The approach used for the introduction and embedding of these KSAs, as both enablers of learning and as underpinnings of graduate attribute development, is presented. Building on previous publications which reported different aspects of this longitudinal study, this paper recaps the rationale for the curriculum redevelopment and then presents the quantitative findings on entering students’ reading literacy and numeracy knowledge and skills, as well as their perceived research ability. The paper presents the methodology and findings for this stage of the research. Overall, the cohort exhibits mixed KSA levels in these areas, with a relatively low aggregated score. In addition, the paper describes the considerations for adjusting the design and delivery of the new subjects with a targeted learning experience, in response to the feedback gained through continuous monitoring. Such a strategy is aimed at accommodating the changing learning needs of the students and serves to support them towards achieving the enabling learning goals from day one of their higher education studies.

A Review of the Characteristics and Optimization of Optical Properties of Zirconia Ceramics for Aesthetic Dental Restorations

The ceramic yttria-stabilized tetragonal zirconia polycrystal (Y-TZP) has been used as a dental biomaterial for several decades. The strength and toughness of this material can be accounted for by its toughening mechanisms, which include transformation toughening, crack deflection, zone shielding, contact shielding, and crack bridging. Prevention of crack propagation is of critical importance in high-fatigue situations, such as those encountered in mastication and para-function. However, the poor translucence of Y-TZP in polycrystalline form is such that it may not meet aesthetic requirements due to its white/grey appearance. To improve the optical properties of Y-TZP, more detailed study of those properties is required; in particular, precise evaluation of the refractive index, absorption coefficient, and scattering coefficient is necessary. The measurement of these optical parameters has typically been based on the assumption that light scattered from biological media is isotropically distributed over all angles. In fact, the optical behavior of real biological materials depends on the angular scattering of light due to the anisotropic nature of the materials. The purpose of the present work is to evaluate the optical properties (including color, opacity/translucence, scattering, and fluorescence) of zirconia dental ceramics and their control through modification of the chemical composition, phase composition, and surface microstructure.
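One standard way the angular (anisotropic) scattering mentioned above is accounted for is through the scattering anisotropy factor; this is a sketch of the usual radiative-transfer convention, not a result of the paper:

```latex
% Reduced scattering coefficient: g = <cos(theta)> is the mean cosine of the
% scattering angle; g = 0 recovers the isotropic-scattering assumption.
\mu_s' = \mu_s\,(1 - g)
```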

Pythagorean-Platonic Lattice Method for Finding all Co-Prime Right Angle Triangles

This paper presents a method for determining all of the co-prime right angle triangles in the Euclidean field by looking at the intersection of the Pythagorean and Platonic right angle triangles and the corresponding lattice that this produces. The co-prime properties of each lattice point representing a unique right angle triangle are then considered. This paper proposes a conjunction between these two ancient, disparate theories. This work has wide application in information security, where cryptography involves improved ways of finding tuples of prime numbers for secure communication systems. In particular, this paper has a direct impact on enhancing the encryption and decryption algorithms in cryptography.
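A minimal Python sketch, in the spirit of the lattice view described above, of how the co-prime (primitive) right angle triangles can be enumerated from integer lattice points via Euclid's classical parametrisation; the function name and the limit are illustrative.

```python
# Each lattice point (m, n) with m > n >= 1, gcd(m, n) = 1 and m - n odd
# yields exactly one primitive (co-prime) Pythagorean triple.
from math import gcd

def primitive_triples(limit):
    """Yield all primitive triples (a, b, c) with hypotenuse c <= limit."""
    m = 2
    while m * m + 1 <= limit:
        for n in range(1, m):
            if (m - n) % 2 == 1 and gcd(m, n) == 1:
                a, b, c = m * m - n * n, 2 * m * n, m * m + n * n
                if c <= limit:
                    yield (a, b, c)
        m += 1

print(sorted(primitive_triples(50)))
# -> (3, 4, 5), (5, 12, 13), (7, 24, 25), (15, 8, 17), ...
```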

Adsorption of Phenolic Compounds on Activated Carbon DSAC36-24

Activated carbon DSAC36-24 is an adsorbent material characterized by a specific surface area of 548.13 m²g⁻¹. It is manufactured from natural raw materials such as date stones. In this study, the treatment is carried out in two stages: a chemical treatment with H3PO4 followed by a physical treatment under nitrogen for 1 hour and then under a stream of CO2 for 24 hours. Various parameters were characterized, such as the specific surface area, pHPZC, bulk density, and iodine value. The study of the adsorption of organic molecules (hydroquinone, paranitrophenol, 2,4-dinitrophenol, 2,4,6-trinitrophenol) indicates that the adsorption phenomena are essentially due to van der Waals interactions. In the case of organic molecules carrying polar substituents, the existence of hydrogen bonds is also evidenced by donor-acceptor forces. The effect of pH was studied, the isotherms were modeled with different models (Langmuir, Freundlich, Langmuir-Freundlich, Redlich-Peterson), and the kinetic treatment was carried out by applying the Lagergren, Weber, and Macky models.
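For reference, two of the isotherm models cited above and the Lagergren kinetic law take the following standard forms (q_e: equilibrium uptake, C_e: equilibrium concentration; parameter symbols are assumed, not the paper's):

```latex
% Langmuir and Freundlich isotherms (standard forms):
q_e = \frac{q_m K_L C_e}{1 + K_L C_e} \quad\text{(Langmuir)}, \qquad
q_e = K_F\, C_e^{1/n} \quad\text{(Freundlich)}
% Lagergren pseudo-first-order kinetics:
\ln(q_e - q_t) = \ln q_e - k_1 t
```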

Comparative Study of Different Enhancement Techniques for Computed Tomography Images

One of the key problems in the analysis of Computed Tomography (CT) images is their poor contrast. Image enhancement can be used to improve the visual clarity and quality of the images or to provide a better transform representation for further processing. Contrast enhancement is one of the accepted methods of image enhancement in various medical applications and is helpful for visualizing and extracting details of brain infarctions, tumors, and cancers from CT images. This paper presents a comparative study of five contrast enhancement techniques suitable for CT images: Power Law Transformation, Logarithmic Transformation, Histogram Equalization, Contrast Stretching, and Laplacian Transformation. These techniques are compared with each other to find out which provides better contrast for CT images. For the comparison, the parameters Peak Signal to Noise Ratio (PSNR) and Mean Square Error (MSE) are used. Logarithmic Transformation provided the clearest and best quality image of all the techniques studied and achieved the highest PSNR. The comparison concludes with the better approach for future research, especially for mapping abnormalities resulting from brain injuries in CT images.
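A minimal sketch of one of the compared techniques (Logarithmic Transformation) together with the MSE/PSNR metrics used for the comparison; pure NumPy, with a random array standing in for a CT slice, so the printed numbers are illustrative and not the paper's results.

```python
# Log transformation s = c * log(1 + r), plus MSE and PSNR quality metrics.
import numpy as np

def log_transform(img):
    """Apply s = c * log(1 + r), scaled back to the 8-bit range."""
    img = img.astype(np.float64)
    c = 255.0 / np.log(1.0 + img.max())
    return np.clip(c * np.log1p(img), 0, 255).astype(np.uint8)

def mse(original, enhanced):
    return np.mean((original.astype(np.float64) - enhanced.astype(np.float64)) ** 2)

def psnr(original, enhanced, peak=255.0):
    m = mse(original, enhanced)
    return np.inf if m == 0 else 10.0 * np.log10(peak**2 / m)

ct = (np.random.default_rng(0).random((512, 512)) * 255).astype(np.uint8)
enhanced = log_transform(ct)
print(f"MSE = {mse(ct, enhanced):.2f}, PSNR = {psnr(ct, enhanced):.2f} dB")
```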

Forecasting the Volatility of Geophysical Time Series with Stochastic Volatility Models

This work is devoted to the modeling of geophysical time series. A stochastic technique with time-varying parameters is used to forecast the volatility of data arising in geophysics. In this study, the volatility is defined as a logarithmic first-order autoregressive process. We observe that the inclusion of log-volatility in the time-varying parameter estimation, facilitated via maximum likelihood estimation, significantly improves forecasting. This allows us to conclude that the estimation algorithm for the corresponding one-step-ahead suggested volatility (with ±2 standard prediction errors) is feasible, since it possesses good convergence properties.
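A minimal sketch of the model class described above, with log-volatility following a first-order autoregression (canonical notation assumed; the paper's parameterization may differ):

```latex
% Canonical stochastic volatility model: observation y_t, log-volatility h_t
y_t = e^{h_t/2}\,\varepsilon_t, \qquad
h_t = \mu + \phi\,(h_{t-1} - \mu) + \sigma_\eta\,\eta_t, \qquad
\varepsilon_t,\ \eta_t \overset{\text{iid}}{\sim} \mathcal{N}(0,1)
```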

A Distributed Mobile Agent-Based Intrusion Detection System for MANET

This study concerns an Artificial Neural Network approach, based on the Multilayer Perceptron (MLP), to the classification and clustering of Mobile Ad hoc Network vulnerabilities. A mobile ad hoc network (MANET) is a ubiquitous internetwork of intelligent devices: an autonomous system of mobile nodes connected via wireless links and able to sense their environment. Security is the most important concern in a MANET due to the ease of penetration in such a self-configuring network. One powerful technique for inspecting network packets is the Intrusion Detection System (IDS). In this article, we show the effectiveness of artificial neural networks, used as machine learning together with a stochastic approach (information gain), in classifying malicious behaviors in a simulated network with respect to different IDS techniques. The monitoring agent is responsible for the detection inference engine; the audit data is collected from the collecting agent by simulating node attacks, and its outputs are contrasted with the normal behaviors of the framework. Whenever there is any deviation from the ordinary behaviors, the monitoring agent treats the event as an attack. We demonstrate a signature-based IDS approach in a MANET by implementing the backpropagation algorithm over an ensemble-based Traffic Table (TT), so that the signatures of malicious behaviors or undesirable activities can be significantly prognosticated and efficiently figured out. By tuning the parametric set-up of the backpropagation algorithm, the experimental results empirically show its effectiveness, with a detection rate of up to 98.6%. Performance metrics are also included, with Xgraph plots of different measures such as Packet Delivery Ratio (PDR), Throughput (TP), and Average Delay (AD).
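A minimal sketch of the classification stage described above: information gain (mutual information) feature selection followed by a backpropagation-trained MLP, using scikit-learn. The synthetic traffic features, labels, and scores are stand-ins for the audit data, not the paper's simulation.

```python
# Information-gain feature selection + backpropagation-trained MLP classifier.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.random((1000, 12))                     # e.g. per-node traffic statistics
y = (X[:, 0] + X[:, 3] > 1.0).astype(int)      # 1 = malicious, 0 = normal (toy rule)

# Information gain: keep only the most informative features.
gain = mutual_info_classif(X, y, random_state=0)
top = np.argsort(gain)[-4:]                    # four best-scoring features

# Multilayer perceptron trained by backpropagation on the selected features.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(X[:, top], y)
print(f"training detection rate: {clf.score(X[:, top], y):.3f}")
```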