Introduction to Electron Spectroscopy for Surface Characterization

Spectroscopy is the study of the spectra produced by the interaction of radiation with matter; it examines the electromagnetic radiation (or electrons) emitted, absorbed, or scattered by matter. Spectral analysis relies on spectrometers, which yield curves expressing the distribution of the emitted energy (the spectrum). Analysis of emission spectra therefore gives rise to several methods, depending on the energy range of the radiation. The most common are Auger electron spectroscopy (AES) and electron energy loss spectroscopy (EELS), which allow the atomic structure of the surface to be determined. This paper focuses essentially on electron energy loss spectroscopy.

Data Analysis Techniques for Predictive Maintenance on a Fleet of Heavy-Duty Vehicles

The present study proposes a methodology for the efficient daily management of fleet vehicles and construction machinery. The application covers remote monitoring of the operating parameters of heavy-duty vehicles, where specific sensor data are stored and examined in order to provide information about the vehicle's health. The vehicle diagnostics allow the user to check whether maintenance tasks need to be performed before a fault occurs. A suitably designed machine learning model is proposed for detecting two different types of faults through classification. Cross-validation is used, and the accuracy of the trained model is assessed with a confusion matrix.
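The abstract does not name the classifier or the details of the validation scheme; the sketch below, assuming scikit-learn, a random forest, and placeholder sensor features with three classes (healthy plus two fault types), shows how cross-validated predictions feed a confusion matrix.

```python
# Minimal sketch: cross-validated fault classification checked with a confusion
# matrix. The classifier, features, and labels are assumptions for illustration,
# not the paper's actual model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix, accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 12))        # placeholder sensor-derived features
y = rng.integers(0, 3, size=600)      # 0: healthy, 1: fault type A, 2: fault type B

clf = RandomForestClassifier(n_estimators=200, random_state=0)
y_pred = cross_val_predict(clf, X, y, cv=5)   # 5-fold cross-validation

print(confusion_matrix(y, y_pred))            # rows: true class, columns: predicted
print("accuracy:", accuracy_score(y, y_pred))
```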

The Applicability of Distillation as an Alternative Nuclear Reprocessing Method

A customized two-stage model has been developed to simulate, analyse, and visualize the distillation of actinides as a useful alternative low-pressure separation method for nuclear recycling. Under the idealized conditions of thermodynamic equilibrium stages and total reflux of distillate, the investigated chloride systems for the separation of such actinides are (A) UCl4-CsCl-PuCl3 and (B) ThCl4-NaCl-PuCl3. In the simulations, uranium tetrachloride in case A is successfully separated in a six-stage distillation column, and thorium tetrachloride in case B in an eight-stage column. For this, a permissible mole fraction of 1E-06 has been assumed for the residual impurity level. With a further separation effort of eleven to seventeen stages, the alkali monochlorides and plutonium trichloride from both systems A and B are shown in the simulations to be separable as high-purity distillation products.
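For orientation, the minimum number of ideal stages at total reflux for a binary split is commonly estimated with the Fenske shortcut equation; the sketch below uses it with a hypothetical relative volatility and the study's assumed residual mole fraction of 1E-06. It is a generic estimate, not the paper's customized two-stage model.

```python
# Hedged sketch: Fenske estimate of minimum equilibrium stages at total reflux.
# A generic shortcut method; alpha (relative volatility) is a placeholder value.
import math

def fenske_min_stages(x_dist: float, x_bottoms: float, alpha: float) -> float:
    """Minimum stages for a binary separation at total reflux (Fenske equation)."""
    return math.log((x_dist / (1 - x_dist)) * ((1 - x_bottoms) / x_bottoms)) / math.log(alpha)

# Drive the light-key mole fraction in the bottoms down to 1e-6, assuming alpha = 10:
print(round(fenske_min_stages(x_dist=0.999999, x_bottoms=1e-6, alpha=10.0), 1))  # ~12 stages
```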

Lagrangian Flow Skeletons Captured in the Wake of a Swimming Nematode C. elegans Using an Immersed Boundary Fluid-Structure Interaction Approach

In this paper, the Lagrangian coherent structure (LCS) concept is applied to wake flows generated upstream and downstream of a swimming nematode C. elegans in an intermediate Reynolds number range of 250-1200. The LCS analysis materializes hidden Lagrangian structures that act as flow transport barriers. To this end, the nematode swimming in a quiescent fluid environment is numerically simulated by a two-way fluid-structure interaction (FSI) approach with the aid of the immersed boundary method (IBM). The incompressible Navier-Stokes equations, fully coupled with Lagrangian deformation equations for the immersed body, are solved using the IB2d code. For all simulations, the nematode's body is modeled with a parametrized spring-fiber built-in case available in the computational code. Reverse von Kármán vortex street formation and vortex shedding characteristics are studied and discussed in detail via the LCS approach, including the effects of grid resolution, integration time, and Reynolds number. Results unveil the presence of different flow regions with distinct fluid-particle fates in the swimming animal's wake and the formation of so-called 'mushroom-shaped' structures in the attracting LCS fields.
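In practice, LCS are often extracted as ridges of the finite-time Lyapunov exponent (FTLE) field computed from particle flow maps; the sketch below shows that standard post-processing step in 2D, assuming the flow map has already been obtained by advecting a particle grid (it is not code from IB2d itself).

```python
# Hedged sketch: FTLE field from a precomputed 2D flow map on a regular grid.
# Ridges of this field approximate the LCS; a standard post-processing recipe,
# not the paper's own implementation.
import numpy as np

def ftle(flow_x, flow_y, dx, dy, T):
    """FTLE from flow-map components; T is the (signed) integration time."""
    dxdx, dxdy = np.gradient(flow_x, dx, dy)   # derivatives of final x-positions
    dydx, dydy = np.gradient(flow_y, dx, dy)   # derivatives of final y-positions
    out = np.zeros_like(flow_x)
    for i in np.ndindex(flow_x.shape):
        F = np.array([[dxdx[i], dxdy[i]],      # deformation-gradient tensor
                      [dydx[i], dydy[i]]])
        C = F.T @ F                            # right Cauchy-Green tensor
        lam_max = np.linalg.eigvalsh(C)[-1]    # largest eigenvalue
        out[i] = np.log(max(lam_max, 1e-300)) / (2.0 * abs(T))
    return out
```

Negative integration times (backward advection) yield the attracting LCS in which the 'mushroom-shaped' structures appear.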

Fast and Robust Long-term Tracking with Effective Searching Model

Kernelized Correlation Filter (KCF) based trackers have recently gained a lot of attention because of their accuracy and fast calculation speed. However, the algorithm is not robust in cases where the target is lost due to a sudden change of direction, occlusion, or leaving the field of view. In order to improve KCF performance in long-term tracking, this paper proposes an anomaly detection method that warns of target loss by analyzing the response map of each frame, and a classification algorithm based on random ferns for a reliable target re-locating mechanism. In tests on the Visual Tracker Benchmark and Visual Object Tracking datasets, the precision and success rate of the proposed algorithm were 2.92 and 2.61 times higher than those of the original KCF algorithm, respectively. Moreover, the proposed tracker handles occlusion better than many state-of-the-art long-term tracking methods while running at 60 frames per second.
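The abstract does not specify which response-map statistic signals an anomaly; a common choice for correlation-filter trackers is the peak-to-sidelobe ratio (PSR), sketched below as an illustration.

```python
# Hedged sketch: peak-to-sidelobe ratio (PSR) of a correlation response map.
# A low PSR over consecutive frames is a typical target-loss signal that can
# trigger re-detection (e.g., by a random-fern classifier). Illustrative only.
import numpy as np

def psr(response: np.ndarray, exclude: int = 5) -> float:
    """(peak - sidelobe mean) / sidelobe std, excluding a window around the peak."""
    r0, c0 = np.unravel_index(np.argmax(response), response.shape)
    mask = np.ones_like(response, dtype=bool)
    mask[max(r0 - exclude, 0):r0 + exclude + 1,
         max(c0 - exclude, 0):c0 + exclude + 1] = False
    sidelobe = response[mask]
    return (response[r0, c0] - sidelobe.mean()) / (sidelobe.std() + 1e-12)
```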

Study of Compatibility and Oxidation Stability of Vegetable Insulating Oils

The use of vegetable oil (natural ester) as an insulating fluid in electrical transformers is a trend that aims to contribute to environmental preservation, since it is biodegradable and non-toxic. In addition, vegetable oil has high flash and combustion points and is therefore considered a fire-safe fluid. However, vegetable oil is usually less stable towards oxidation than mineral oil. Both insulating fluids, mineral and vegetable oils, need to be tested periodically according to specific standards. Oxidation stability can be determined from the induction period measured by the conductivity (Rancimat) method, which monitors the effectiveness of the oil's antioxidant additives; this methodology is already established for food and biodiesel applications but is not yet standardized for insulating fluids. Besides adequate oxidation stability, fluids must be compatible with the transformer's construction materials under normal operating conditions, to ensure that damage to the oil and to parts of the transformer does not occur. The ASTM standard and the Brazilian norm differ in the parameters evaluated, which reveals the need to regulate the tests for each oil type. The aim of this study was to assess the oxidation stability and compatibility of vegetable oils in order to suggest the best way to assure viable performance of vegetable oil as a transformer insulating fluid. The induction period of several vegetable insulating oils from the local market was determined by Rancimat according to the BS EN 14112 standard at different temperatures (110, 120, and 130 °C). The compatibility of vegetable oil was also assessed according to ASTM and ABNT NBR standards. The main results showed that the best temperature for the Rancimat test is 130 °C, which allows a better observation of the conductivity change. The compatibility test results showed differences between the vegetable and mineral oil standards that should be taken into account in oil testing, since materials compatibility and oxidation stability are essential for equipment reliability.

Towards End-To-End Disease Prediction from Raw Metagenomic Data

Analysis of the human microbiome using metagenomic sequencing data has demonstrated a strong ability to discriminate between various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequences, read from fragmented DNA and stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use and time-consuming, and they rely on a large number of parameters that often introduce variability and affect the estimation of the microbiome elements. Training deep neural networks directly on raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most such methods use the concept of word and sentence embeddings, which creates a meaningful numerical representation of DNA sequences while extracting features and reducing the dimensionality of the data. In this paper we present metagenome2vec, an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which each sequence is most likely to come; and (iv) training a multiple instance learning classifier that predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to each genome's influence on the prediction. Using two public real-life datasets as well as a simulated one, we demonstrate that this original approach reaches performance comparable to state-of-the-art methods applied to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that, with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
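A minimal sketch of step (i): splitting reads into overlapping k-mers and learning their embeddings. The abstract does not name the embedding model; gensim's Word2Vec and all parameter values below are assumptions for illustration.

```python
# Hedged sketch of step (i): k-mer tokenization of reads plus embedding training.
# gensim's Word2Vec and the parameters are illustrative assumptions, not the
# paper's exact configuration.
from gensim.models import Word2Vec

def kmers(read: str, k: int = 6):
    """Overlapping k-mer 'words' of a DNA read."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

reads = ["ACGTGACCTGAACCTG", "TTGACGTGACCAAGTC"]   # placeholder raw reads
corpus = [kmers(r) for r in reads]                 # each read acts as a 'sentence'
model = Word2Vec(corpus, vector_size=64, window=5, min_count=1, sg=1)

# Step (ii) could then form a read embedding, e.g., by averaging its k-mer vectors.
```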

The Journey from Lean Manufacturing to Industry 4.0: The Rail Manufacturing Process in Mexico

Nowadays, Lean Manufacturing and Industry 4.0 are very important in every country, one of their main benefits being continued market presence. A need has been identified to change existing educational programs, as well as to update the knowledge and skills of existing employees. It should be borne in mind that behind each technological improvement there is a human being; human talent cannot be neglected. The main objectives of this article are to review the link between Lean Manufacturing and the incorporation of Industry 4.0, together with the steps to follow to implement it; to analyze the current situation; and to study the implications and benefits of this new trend, with a particular focus on Mexico. Lean Manufacturing and Industry 4.0 implementation waves must always take care of the most important capital: intellectual capital. The methodology used in this article comprised the following steps: reviewing the reality of the fourth industrial revolution, reviewing employees' skills on the journey to becoming world-class, and analyzing the situation in Mexico. Lean Manufacturing and Industry 4.0 were studied not as exclusive concepts but as complementary ones. The methodological framework used focuses on motivating companies' collaborators to guarantee common results, innovate, and remain in the market in the face of new requirements from company stakeholders. The key findings were that both trends emphasize the need to improve communication across the entire company and to incorporate new technologies into everyday work, from the shop floor to administrative staff, to help improve processes. Taking care of people, activities, and processes will bring a company success. In the specific case of Mexico, companies in all sectors need to be aware of and implement technological improvements according to their specific needs; low-cost labor represents one of the most typical barriers. In conclusion, companies must build a roadmap according to their strategy and needs to achieve their short-, medium-, and long-term goals.

Multi-Temporal Mapping of Built-up Areas Using Daytime and Nighttime Satellite Images Based on Google Earth Engine Platform

The built-up area is a significant proxy for measuring regional economic growth and reflects the Gross Provincial Product (GPP). However, an up-to-date and reliable database of built-up areas is not always available, especially in developing countries. Cloud-based geospatial analysis platforms such as Google Earth Engine (GEE) provide those countries with the accessibility and computational power to generate built-up data. Therefore, this study aims to extract the built-up areas in the Eastern Economic Corridor (EEC), Thailand, using daytime and nighttime satellite imagery on the GEE platform. Normalized indices were generated from the Landsat 8 surface reflectance dataset, including the Normalized Difference Built-up Index (NDBI), the Built-up Index (BUI), and the Modified Built-up Index (MBUI), and applied to identify built-up areas in the EEC. The results show that MBUI performs better than BUI and NDBI, with the highest accuracy of 0.85 and a Kappa of 0.82. Moreover, after incorporating nighttime light data from the Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB), the overall classification accuracy improved from 79% to 90%, and the error in the total built-up area decreased from 29% to 0.7%. The results suggest that MBUI combined with nighttime light imagery is appropriate for built-up area extraction and can be utilized for further study of the socioeconomic impacts of regional development policy over the EEC region.
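As an illustration of the index computation on GEE, the sketch below derives NDBI from a Landsat 8 Collection 2 surface reflectance composite using the Python API; the region, dates, and collection choice are assumptions, and the paper's MBUI formulation is not reproduced here.

```python
# Hedged sketch: NDBI from Landsat 8 surface reflectance in the GEE Python API.
# Band names follow the Collection 2 Level 2 catalog; the bounding box and date
# range are placeholder assumptions, not the study's exact setup.
import ee

ee.Initialize()

eec = ee.Geometry.Rectangle([100.8, 12.4, 102.1, 13.9])   # rough EEC extent (assumed)
composite = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
             .filterBounds(eec)
             .filterDate('2019-01-01', '2019-12-31')
             .median())

# NDBI = (SWIR1 - NIR) / (SWIR1 + NIR); on Landsat 8, SR_B6 is SWIR1 and SR_B5 is NIR.
ndbi = composite.normalizedDifference(['SR_B6', 'SR_B5']).rename('NDBI')
built_up = ndbi.gt(0)   # a simple threshold; the study's actual rule may differ
```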

Enhancing the Effectiveness of Air Defense Systems through Simulation Analysis

Air Defense Systems contain high-value assets that are expected to fulfill their mission for several years, in many cases even decades, while operating in a fast-changing, technology-driven environment. It is therefore paramount that decision-makers can assess how effective an Air Defense System is in the face of newly developing threats, and can identify the bottlenecks that could jeopardize the security of a country's airspace. Given the broad extent of activities and the great variety of assets necessary to achieve the strategic objectives, a systems approach was taken to delineate the core requirements and the physical architecture of an Air Defense System. Value-focused thinking then helped define the measures of effectiveness, and analytical methods were applied to create a formal structure that preliminarily assesses those measures. To validate the proposed methodology, a simulation model was also used to determine the measures of effectiveness in more complex environments that incorporate both uncertainty and multiple interactions among entities. The results regarding the validity of this methodology suggest that the approach can support decisions aimed at enhancing the capabilities of Air Defense Systems. In conclusion, this paper sheds some light on how consolidated approaches from Systems Engineering and Operations Research can serve as valid techniques for solving problems in a complex and yet vital domain.
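Value-focused thinking commonly aggregates measures of effectiveness through a weighted additive value model; the sketch below illustrates that aggregation with placeholder measures, weights, and scores that are not taken from the paper.

```python
# Hedged sketch: weighted additive aggregation of measures of effectiveness (MOEs),
# a common construct in value-focused thinking. All names and numbers below are
# placeholder assumptions for illustration.
moes = {                       # normalized MOE scores in [0, 1]
    "detection_coverage": 0.80,
    "engagement_success": 0.60,
    "reaction_time": 0.70,
}
weights = {                    # swing weights summing to 1 (assumed)
    "detection_coverage": 0.4,
    "engagement_success": 0.4,
    "reaction_time": 0.2,
}
overall = sum(weights[m] * moes[m] for m in moes)
print(f"overall effectiveness: {overall:.2f}")   # 0.70 for these placeholder values
```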

Systematic Examination of Methods Supporting the Social Innovation Process

Innovation is the key element of economic development and a key factor in social processes. Technical innovations can be identified as prerequisites and causes of social change, and they cannot be created without the renewal of society. The study of social innovation can be characterised as one of the significant research areas of our day. The study's aim is to describe the process of social innovation, which can be defined by input, transformation, and output factors. This approach divides the social innovation process into three parts: situation analysis, implementation, and follow-up. The methods associated with each stage of the process are illustrated along the chronological line of social innovation. In this study, we have sought to present methodologies that support long- and short-term decision-making, that are easy to apply, that have different complementary content, and that are well visualised for different user groups. When applying the methods, the reference objects differ: a county, a district, a settlement, or a specific organisation. The solution proposed by the study supports the development of a methodological combination adapted to different situations. Having reviewed metric and conceptualisation issues, we aimed to develop a methodological combination, along with a change management logic, suitable for the structured support of generating social innovation in a locality or a specific organisation. In addition to the theoretical summary, in the second part of the study we give a non-exhaustive picture of two counties located in the north-eastern part of Hungary through specific analyses and case descriptions.

A Review and Comparative Analysis on Cluster Ensemble Methods

Clustering is an unsupervised learning technique in data mining that aggregates data objects into meaningful classes so that intra-cluster similarity is maximized and inter-cluster similarity is minimized. However, no single clustering algorithm proves to be the most effective at producing the best result in all settings. As a consequence, the cluster ensemble approach has blossomed as a new and challenging technique for solving this problem, and it has proved a successful approach to cluster analysis. The cluster ensemble's main goal is to combine multiple clustering solutions in a way that preserves precision while improving on the quality of any individual clustering. Because of the massive and rapid creation of new approaches in the field of data mining, the ongoing interest in inventing novel algorithms necessitates a thorough examination of current techniques and future innovation. This paper presents a comparative analysis of various cluster ensemble approaches, including their methodologies, formal working processes, and standard accuracy and error rates. Clustering practitioners will benefit from this exploratory and clear review, which will aid in determining the most appropriate solution to the problem at hand.
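One classic family of ensemble methods is evidence accumulation: base clusterings vote through a co-association matrix, which is then re-clustered for a consensus partition. The sketch below illustrates this generic scheme with scikit-learn; the data, parameters, and choice of consensus algorithm are placeholders.

```python
# Hedged sketch: evidence-accumulation cluster ensemble via a co-association
# matrix. A generic illustration of one well-known scheme; all parameters and
# the toy data are placeholder assumptions.
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering

X = np.random.default_rng(0).normal(size=(100, 2))   # toy data

# Several base clusterings with different k and seeds.
base = [KMeans(n_clusters=k, random_state=s, n_init=10).fit_predict(X)
        for k, s in [(2, 0), (3, 1), (4, 2), (5, 3)]]

# Co-association: fraction of base clusterings placing points i and j together.
co = np.mean([(l[:, None] == l[None, :]).astype(float) for l in base], axis=0)

# Consensus partition from the co-association similarity (1 - co as a distance).
consensus = AgglomerativeClustering(n_clusters=3, metric="precomputed",
                                    linkage="average").fit_predict(1.0 - co)
```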

Lean Manufacturing: Systematic Layout Planning Application to an Assembly Line Layout of a Welding Industry

The purpose of this paper is to present the process of elaborating the layout of an assembly line in a welding industry using the principles of lean manufacturing as the main driver. The work is relevant because the current layout of the assembly line causes non-productive time for operators, related to the lean waste of unnecessary movement. The methodology used for the project's development was Project-Based Learning (PBL), an active way of learning focused on real problems. The layout planning methodology was selected by evaluating candidates against three criteria relevant to this paper's goal. As a result of this evaluation, Systematic Layout Planning was selected, and three steps were added to it: Value Stream Mapping of the current situation and of the situation after the layout change, and the definition of lean tools and layout type. This addition brought lean manufacturing into the layout redesign. The layout change resulted in an increase in the value-adding time of operations carried out in the sector, a reduction in movement times between the preliminary and final assemblies, and cost savings in the man-hour value of the employees, which can be invested in productive hours instead of movement times.

Lean Production to Increase Reproducibility and Work Safety in the Laser Beam Melting Process Chain

Additive Manufacturing processes are becoming increasingly established in industry for the economic production of complex prototypes and functional components. Laser beam melting (LBM), the most frequently used Additive Manufacturing technology for metal parts, has been gaining industrial importance for several years. The LBM process chain, from material storage to machine set-up and component post-processing, requires many manual operations. These steps often depend on the manufactured component and are therefore not standardized; instead, they rely on the experience of the machine operator, e.g., levelling the build plate and adjusting the first powder layer in the LBM machine. This lack of standardization limits the reproducibility of component quality. When processing metal powders with inhalable and alveolar particle fractions, the machine operator is at high risk due to the high reactivity and the toxic (e.g., carcinogenic) effects of the various metal powders, and faulty execution of an operation or unintentional omission of safety-relevant steps can impair the operator's health. In this paper, all steps of the LBM process chain are first analysed in terms of their influence on these two challenges: reproducibility and work safety. Standardization to avoid errors increases the reproducibility of component quality as well as the adherence to, and correct execution of, safety-relevant operations. The corresponding lean method, 5S, is therefore applied to develop approaches, in the form of recommended actions, that standardize the work processes. These approaches are then evaluated in terms of ease of implementation and their potential for improving reproducibility and work safety. The analysis and evaluation showed that sorting tools and spare parts, as well as standardizing the workflow, are likely to increase reproducibility, while organizing the operational steps and production environment decreases the hazards of material handling and consequently improves work safety.

Evaluating the Feasibility of Magnetic Induction to Cross an Air-Water Boundary

A magnetic induction based underwater communication link is evaluated using an analytical model and a custom Finite-Difference Time-Domain (FDTD) simulation tool. The analytical model is based on the Sommerfeld integral, while the full-wave simulation tool evaluates Maxwell's equations using the FDTD method in cylindrical coordinates. The analytical model and the FDTD simulation tool are compared and used to predict the system performance for various transmitter depths and optimum frequencies of operation. To this end, the system bandwidth, the signal-to-noise ratio, and the magnitude of the induced voltage are used to estimate the expected channel capacity. The models show that, in seawater, relatively low-power and small coils may be capable of achieving a throughput of 40 to 300 kbps for a transmitter at depths of 1 to 3 m and a receiver at a height of 1 m.
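Turning a bandwidth and signal-to-noise ratio into an expected channel capacity is conventionally done with the Shannon-Hartley theorem; the sketch below shows the calculation with placeholder numbers, not the paper's modeled values.

```python
# Hedged sketch: Shannon-Hartley channel capacity from bandwidth and SNR.
# The bandwidth and SNR values below are placeholders for illustration only.
import math

def capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """C = B * log2(1 + SNR), with the SNR supplied in dB."""
    return bandwidth_hz * math.log2(1.0 + 10.0 ** (snr_db / 10.0))

# e.g., a 20 kHz link at 10 dB SNR:
print(f"{capacity_bps(20e3, 10.0) / 1e3:.0f} kbps")   # about 69 kbps
```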

Evaluation of Gingival Hyperplasia Caused by Medications

Purpose: Drug-induced gingival hyperplasia is an uncommon pathology encountered during routine work in dental units. The purpose of this paper is to present the clinical appearance of gingival hyperplasia caused by medications. Three classes of medications are known to cause hyperplasia, and the clinical cases encountered and included in this study were compared against data from the literature. Materials and Methods: The study was conducted on a total of 311 patients, of whom 182 met the inclusion criteria and were included in the study. After each patient's history was recorded, confirming that the patient was aware of a chronic illness and was undergoing treatment with drugs associated with gingival hypertrophy, a clinical examination of the oral cavity was performed, with vertical and horizontal assessment according to the periodontal indexes. Results: From the data collected during the study, 97% of patients with gingival hyperplasia were treated with nifedipine. In 84% of the patients treated with the selected medications, gingival hyperplasia in the oral cavity appeared after an exposure period of more than 1 year and 1 month. According to the GOI, about 21% of patients were in the first rank of this index, 52% in the second, 24% in the third, and 3% in the fourth. According to the horizontal growth index of gingival hyperplasia, grade 1 included about 61% of patients and grade 2 about 39%. The bacterial index divided patients by degree as follows: grade 0, 8.2%; grade 1, 32.4%; grade 2, 14%; and grade 3, 45.1%. Conclusions: The highest percentage of gingival hyperplasia caused by drugs is due to nifedipine administered as systemic treatment for more than 1 year.

Traditional Dyeing of Silk with Natural Dyes by Eco-Friendly Method

In traditional dyeing of natural fibers with natural dyes, metal salts are commonly used to increase color stability. This method always carries the risk of environmental pollution (contamination of arable soils and fresh groundwater) due to the release of dyeing effluents containing large amounts of metal. Therefore, researchers are always looking for new methods to obtain a green dyeing system. In this research, an enzymatic dyeing method is proposed to prevent environmental pollution with metals and reduce production costs. After degumming and bleaching, raw silk fabrics were dyed with natural dyes (madder and sumac) by three methods: pre-mordanting with a metal salt, one-step enzymatic dyeing, and two-step enzymatic dyeing. The results show that silk dyed with natural dyes by the enzymatic method has higher color strength and colorfastness than silk pre-treated with a metal salt. Also, the amount of residual dye in the dyeing wastewater is significantly reduced by the enzymatic method. The enzymatic dyeing method is found to yield improved dye absorption and color strength, a soft hand, no change in color shade, low production costs (due to the low dyeing temperature), and a significant reduction in environmental pollution.

Real-Time Land Use and Land Information System in Homagama Divisional Secretariat Division

Land is a valuable and limited resource that constantly changes with the growth of the population. An efficient land management system is essential to avoid conflicts associated with land. This paper aims to design a prototype model of a real-time Mobile GIS Land Use and Land Information System. The Homagama Divisional Secretariat Division, situated in the Western Province of Sri Lanka, was selected as the study area. The prototype model was developed after reviewing the related literature. The methodology consisted of designing and modeling the prototype as an application running on a mobile platform. The system architecture mainly consists of a Google mapping application for real-time updates with Firebase support tools, so the implementation comprises front-end and back-end components. The application was designed in Android Studio with Java, based on a GeoJSON file structure. Synchronizing GeoJSON files to Firebase from Android Studio with Java was found to be an effective mobile solution for continuously updating the Land Use and Land Information System (LIS) in real time in the present scenario. The mobile-based land use and land information system developed in this study is a multi-user application catering to different hierarchy levels, such as basic users, supervisory managers, and database administrators. This mobile mapping application will help public sector field officers without GIS expertise to overcome land use planning challenges, with land use data updated in real time.
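As an illustration of the data structure involved, the sketch below shows the kind of GeoJSON Feature a land-parcel record could take before being synchronized to Firebase; the property names, values, and coordinates are hypothetical, and the study's actual schema (implemented in Java) is not given in the abstract.

```python
# Hedged sketch: a hypothetical GeoJSON Feature for a land-use record.
# All attribute names and coordinate values are illustrative assumptions.
import json

parcel = {
    "type": "Feature",
    "geometry": {
        "type": "Polygon",
        "coordinates": [[[80.002, 6.844], [80.004, 6.844],
                         [80.004, 6.846], [80.002, 6.846], [80.002, 6.844]]],
    },
    "properties": {
        "parcel_id": "HOM-0001",        # hypothetical identifier
        "land_use": "residential",
        "updated_by": "field_officer",
    },
}
print(json.dumps(parcel, indent=2))     # payload that would be synchronized
```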

Select-Low and Select-High Methods for Wheeled Robot Dynamic State Control

The paper examines two methods of controlling the braking torque of a wheeled robot. Both methods are applied when the adhesion coefficient under the left-side wheels differs from the adhesion coefficient under the right-side wheels. In the select-low (SL) method, the braking torque on both wheels is controlled by the signals originating from the wheels on the side with the lower adhesion. In the select-high (SH) method, the torque is controlled by the signals originating from the wheels on the side with the higher adhesion. The SL method secures stable and safe robot behavior during the braking process; however, its efficiency is relatively low. The SH method is more efficient in terms of braking time and distance, but in some situations it may cause wheel locking. It is therefore important to monitor the velocity of all wheels and decide on the braking torque distribution accordingly. In the SH method, the slope of the braking torque may require a significant decrease in order to avoid wheel locking.
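Reduced to its selection rule, the difference between the two methods is which side's wheel-slip signal governs the common braking torque; the sketch below illustrates that rule with an assumed slip-based torque law (the target slip and gain are placeholders, not the paper's controller).

```python
# Hedged sketch: select-low (SL) vs. select-high (SH) choice of the governing
# wheel-slip signal. The proportional torque law, target slip, and gain are
# placeholder assumptions used only to make the selection rule concrete.
def braking_torque(slip_left: float, slip_right: float,
                   mode: str = "SL", gain: float = 100.0) -> float:
    """Common braking torque command from the selected side's slip signal."""
    if mode == "SL":
        # Select-low: the lower-adhesion side (higher slip) governs -> stable, less efficient.
        governing_slip = max(slip_left, slip_right)
    else:
        # Select-high: the higher-adhesion side (lower slip) governs -> efficient, may lock wheels.
        governing_slip = min(slip_left, slip_right)
    target_slip = 0.2                                      # assumed optimal slip
    return gain * max(0.0, target_slip - governing_slip)   # back off as slip grows
```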

Numerical Modelling of Dust Propagation in the Atmosphere of Tbilisi City in Case of Western Background Light Air

Tbilisi, a large city of the South Caucasus, is a junction point connecting Asia and Europe, Russia and the republics of Asia Minor. Over the last years, its atmosphere has experienced an increasing anthropogenic load. A numerical modeling method is used to study the air pollution of the Tbilisi atmosphere. By means of a 3D non-linear, non-steady numerical model, the peculiarities of the city's atmospheric pollution are investigated during background western light air. Spatial and temporal changes in dust concentration are determined, and zones of high, average, and low pollution, dust accumulation areas, transfer directions, etc., are identified. The numerical modeling shows that the process of air pollution by dust proceeds in four stages, which depend on the intensity of motor traffic, the micro-relief of the city, and the location of the city's main roads. Dust concentrations grow intensively in the interval 06:00-09:00, stay constant or decrease weakly from 09:00 to 15:00, increase from 18:00 to 21:00, and decrease from 21:00 to 06:00. The most polluted areas are located in the vicinity of the city center and at some peripheral territories of the city, where the maximum dust concentration at 21:00 equals twice the maximum allowable concentration. Similar investigations conducted for various meteorological situations will enable us to compile a map of background urban pollution and to elaborate practical measures for ambient air protection.