Early Depression Detection for Young Adults with a Psychiatric and AI Interdisciplinary Multimodal Framework

During COVID-19, the depression rate increased dramatically, and young adults are among the most vulnerable to the mental health effects of the pandemic. Individuals from lower-income families are diagnosed with depression at a higher rate than the general population but have less access to clinics. This research aims to achieve early depression detection at low cost, large scale, and high accuracy with an interdisciplinary approach that incorporates clinical practices defined by the American Psychiatric Association (APA) as well as a multimodal AI framework. The proposed approach detects the nine depression symptoms with Natural Language Processing (NLP) sentiment analysis and a symptom-based lexicon designed specifically for young adults. The experiments were conducted on multimedia survey results from adolescents and young adults and on unbiased Twitter communications. The result was further aggregated with facial emotional cues analyzed by a Convolutional Neural Network (CNN) on the multimedia survey videos. Five experiments, each conducted on 10,000 data entries, reached consistent results with an average accuracy of 88.31%, higher than existing natural language analysis models. This approach can reach 300+ million daily active Twitter users and is highly accessible to low-income populations, promoting early depression detection, raising awareness among adolescents and young adults, and revealing complementary cues to assist clinical depression diagnosis.
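
Since the abstract does not reproduce the lexicon or the fusion rule, the following is a minimal illustrative sketch, assuming the nine symptoms correspond to the DSM-5 categories and using hypothetical lexicon entries and fusion weights, of how symptom-based lexicon matching could be combined with a CNN facial-emotion score:

```python
# Hedged sketch: lexicon-based detection of the nine depression symptom
# categories in text, fused with a facial-emotion score from a CNN. The
# lexicon entries, fusion weights and the mapping to DSM-5 categories are
# illustrative assumptions; the paper's actual lexicon is not reproduced.
from collections import Counter

# Toy slice of a symptom-based lexicon keyed by symptom category.
SYMPTOM_LEXICON = {
    "depressed_mood": {"sad", "hopeless", "empty", "crying"},
    "anhedonia": {"no fun", "nothing matters", "don't enjoy"},
    "sleep_disturbance": {"insomnia", "can't sleep", "sleeping all day"},
    "fatigue": {"exhausted", "tired", "drained"},
    "worthlessness": {"worthless", "failure", "guilty"},
    # ... remaining symptom categories omitted for brevity
}

def symptom_scores(text):
    """Count lexicon hits per symptom category in a text sample."""
    text = text.lower()
    return Counter({symptom: sum(kw in text for kw in keywords)
                    for symptom, keywords in SYMPTOM_LEXICON.items()})

def fuse(text_scores, facial_negativity, w_text=0.6, w_face=0.4):
    """Late fusion of text symptom evidence with the CNN facial cue;
    facial_negativity is assumed to be a probability in [0, 1]."""
    n_symptoms = sum(1 for count in text_scores.values() if count > 0)
    text_risk = min(n_symptoms / 5.0, 1.0)  # 5+ symptoms as a rough threshold
    return w_text * text_risk + w_face * facial_negativity

sample = "I feel so tired and hopeless, I can't sleep and nothing matters"
scores = symptom_scores(sample)
print(scores, fuse(scores, facial_negativity=0.7))
```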

The Journey from Lean Manufacturing to Industry 4.0: The Rail Manufacturing Process in Mexico

Lean Manufacturing and Industry 4.0 are now important in every country; one of their main benefits is continued market presence. It has been identified that there is a need to change existing educational programs, as well as to update the knowledge and skills of existing employees. It should be borne in mind that behind each technological improvement there is a human being; human talent cannot be neglected. The main objectives of this article are to review the link between Lean Manufacturing, the incorporation of Industry 4.0 and the steps to follow to implement it; to analyze the current situation; and to study the implications and benefits of this new trend, with a particular focus on Mexico. Lean Manufacturing and Industry 4.0 implementation waves must always take care of the most important capital: intellectual capital. The methodology used in this article comprised the following steps: reviewing the reality of the fourth industrial revolution, reviewing employees’ skills on the journey to become world-class, and analyzing the situation in Mexico. Lean Manufacturing and Industry 4.0 were studied not as exclusive concepts, but as complementary ones. The methodological framework used is focused on motivating companies’ collaborators to guarantee common results, innovate, and remain in the market in the face of new requirements from company stakeholders. The key findings were that both trends emphasize the need to improve communication across the entire company and to incorporate new technologies into everyday work, from the shop floor to administrative staff, to help improve processes. Taking care of people, activities and processes will bring a company success. In the specific case of Mexico, companies in all sectors need to be aware of and implement technological improvements according to their specific needs. Low-cost labor represents one of the most typical barriers. In conclusion, companies must build a roadmap according to their strategy and needs to achieve their short-, medium- and long-term goals.

The Impact of Alumina Cement on Properties of Portland Cement Slurries and Mortars

The addition of a small amount of alumina cement to Portland cement results in immediate setting, a rapid increase in compressive strength and a clear increase in adhesion to the concrete substrate. This phenomenon is used, among other applications, in the production of liquid floor self-levelling compounds. Alumina cement is several times more expensive than Portland cement and significantly affects the price of products manufactured with it. For the production of liquid floor self-levelling compounds, low-alumina cement containing approximately 40% Al2O3 is normally used. The aim of the study was to determine the impact of adding alumina cement to Portland cement on the basic physical and mechanical properties of cement slurries and mortars. CEM I 42.5R and three types of alumina cement containing 40%, 50% and 70% Al2O3 were used for the tests. Mixes containing 4%, 6%, 8%, 10% and 12% of the different varieties of alumina cement were prepared, for which the initial and final setting times, compressive and flexural strength, and adhesion to the concrete substrate were determined. The analysis of the obtained test results showed that a similar immediate setting effect and clearly better adhesion strength can be obtained with the addition of 6% high-alumina cement than with 12% low-alumina cement. As the prices of these cements are similar, this can yield significant financial savings in the production of liquid floor self-levelling compounds.

A Review and Comparative Analysis on Cluster Ensemble Methods

Clustering is an unsupervised learning technique in data mining that aggregates data objects into meaningful classes so that intra-cluster similarity is maximized and inter-cluster similarity is minimized. However, no single clustering algorithm proves the most effective at producing the best result. As a consequence, a challenging new technique known as the cluster ensemble approach has emerged to address this problem, and it has proven to be a successful approach to cluster analysis. The cluster ensemble's main goal is to combine multiple clustering solutions in a way that achieves precision while also improving on the quality of the individual data clusterings. Because of the massive and rapid creation of new approaches in the field of data mining, the ongoing interest in inventing novel algorithms necessitates a thorough examination of current techniques and future innovation. This paper presents a comparative analysis of various cluster ensemble approaches, including their methodologies, formal working processes, and standard accuracy and error rates. The community of clustering practitioners will benefit from this exploratory and clear research, which will aid in determining the most appropriate solution to the problem at hand.
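
As an orientation to the kind of method surveyed, the sketch below implements one classical cluster ensemble scheme, evidence accumulation via a co-association matrix with a hierarchical consensus step; the dataset, the number of base clusterings and the final number of clusters are illustrative assumptions, not values taken from the reviewed papers:

```python
# Hedged sketch: evidence-accumulation cluster ensemble
# (co-association matrix + hierarchical consensus).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)
n = X.shape[0]

# 1. Generate diverse base clusterings (random k and random seeds).
rng = np.random.default_rng(0)
co_assoc = np.zeros((n, n))
n_partitions = 20
for _ in range(n_partitions):
    k = rng.integers(2, 10)
    labels = KMeans(n_clusters=k, n_init=5,
                    random_state=rng.integers(1_000_000)).fit_predict(X)
    # 2. Accumulate evidence: how often each pair lands in the same cluster.
    co_assoc += (labels[:, None] == labels[None, :]).astype(float)
co_assoc /= n_partitions

# 3. Consensus: hierarchical clustering on the co-association "distance".
dist = 1.0 - co_assoc
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist, checks=False), method="average")
consensus_labels = fcluster(Z, t=4, criterion="maxclust")
print(np.bincount(consensus_labels))
```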

Lean Production to Increase Reproducibility and Work Safety in the Laser Beam Melting Process Chain

Additive Manufacturing processes are becoming increasingly established in industry for the economic production of complex prototypes and functional components. Laser beam melting (LBM), the most frequently used Additive Manufacturing technology for metal parts, has been gaining in industrial importance for several years. The LBM process chain, from material storage to machine set-up and component post-processing, requires many manual operations. These steps often depend on the manufactured component and on the experience of the machine operator, e.g., levelling of the build plate and adjusting the first powder layer in the LBM machine, and are therefore not performed in a standardized manner. This lack of standardization limits the reproducibility of the component quality. When processing metal powders with inhalable and alveolar particle fractions, the machine operator is at high risk due to the high reactivity and the toxic (e.g., carcinogenic) effects of the various metal powders. Faulty execution of an operation or unintentional omission of safety-relevant steps can impair the health of the machine operator. In this paper, all the steps of the LBM process chain are first analysed in terms of their influence on the two aforementioned challenges: reproducibility and work safety. Standardization to avoid errors increases the reproducibility of component quality as well as the adherence to, and correct execution of, safety-relevant operations. The corresponding lean method, 5S, is therefore applied in order to develop approaches in the form of recommended actions that standardize the work processes. These approaches are then evaluated in terms of ease of implementation and their potential for improving reproducibility and work safety. The analysis and evaluation showed that sorting tools and spare parts as well as standardizing the workflow are likely to increase reproducibility. Organizing the operational steps and the production environment decreases the hazards of material handling and consequently improves work safety.

Chemistry and Biological Activity of Feed Additive for Poultry Farming

Essential oils are one of the most important groups of biologically active substances present in plants. Due to the chemical diversity of their components, essential oils and their preparations have a wide spectrum of pharmacological action: they have bactericidal, antiviral, fungicidal, antiprotozoal, anti-inflammatory, spasmolytic, sedative and other activities, and act as expectorant, hypotensive, secretion-enhancing and antioxidant remedies. Based on preliminary pharmacological studies, we developed a formulation called “Phytobiotic” containing essential oils, a feed additive for poultry intended as an alternative to antibiotics. Phytobiotic is a water-soluble powder containing a composition of essential oils of thyme, clary and monarda and auxiliary substances: dry extract of liquorice and inhalation lactose. At this stage of the research, the goals were to study the chemical composition of the phytobiotic, identify the main substances and determine their quantities, and investigate its biological activity through in vitro and in vivo studies. Using gas chromatography-mass spectrometry, 38 components were identified in the phytobiotic, representing acyclic, monocyclic, bicyclic and sesquiterpenes. Together with the identification of the main active substances, their quantitative content was determined, including the acyclic terpene alcohol β-linalool, the ester linalyl acetate, the monocyclic terpenes D-limonene and γ-terpinene, and the monocyclic aromatic terpene thymol. The phytobiotic has a pronounced and broad spectrum of antibacterial activity. In the cell model, the phytobiotic showed weak antioxidant activity, which was stronger in the ORAC (chemical model) tests; anti-inflammatory activity was also observed. When fowls were fed a diet enriched with the phytobiotic, the weight gain of the chickens in the experimental group exceeded that of the control group during the entire period of the experiment. The survival rate of broilers in the experimental group during the growth period was 98% compared to 94% in the control group. The research identified four probable mechanisms important for the action of phytobiotics: sensory, metabolic, antioxidant and antibacterial. The general toxic, possible local irritant and allergenic effects of the phytobiotic were also investigated, and the performed assays showed that the formulation is safe.

Lead-Free Inorganic Cesium Tin-Germanium Triiodide Perovskites for Photovoltaic Application

The toxicity of lead associated with the lifecycle of perovskite solar cells (PSCs) is a serious concern which may prove to be a major hurdle on the path toward their commercialization. The currently proposed lead-free PSCs, including those based on the low-toxicity cations Ag(I), Bi(III), Sb(III), Ti(IV), Ge(II), and Sn(II), are still plagued by the critical issues of poor stability and low efficiency, mainly because of their poor chemical stability. In the present research, the utilization of all-inorganic CsSnGeI3 based materials offers the advantages of enhancing the device's resistance to degradation, reducing the cost of the cells, and minimizing carrier recombination. The presence of the inorganic halide perovskite improves the photovoltaic parameters of PSCs via improved surface coverage and stability. The inverted structure of the devices, simulated using a 1D simulator, the solar cell capacitance simulator (SCAPS) version 3308, involves a TCO/HTL/Perovskite/ETL/Au contact layer. PEDOT:PSS, PCBM, and CsSnGeI3 are used as the hole transporting layer (HTL), electron transporting layer (ETL), and perovskite absorber layer, respectively, in the inverted structure for the first time. Holes are injected from the highly stable and air-tolerant Sn0.5Ge0.5I3 perovskite composition into the HTL and electrons from the perovskite into the ETL. Simulation results revealed a strong dependence of the power conversion efficiency (PCE) on the thickness and defect density of the perovskite layer. Here, the effect of an increase in operating temperature from 300 K to 400 K on the performance of CsSnGeI3 based perovskite devices is also investigated. Comparison between the simulated CsSnGeI3 based PSCs and similar real reported devices with spiro-OMeTAD as the HTL showed that the extraction of carriers at the interfaces of the perovskite absorber depends on the energy level mismatches between the perovskite and the HTL/ETL. We believe that the optimization results reported here represent a critical avenue for fabricating stable, low-cost, efficient, and eco-friendly all-inorganic Cs-Sn-Ge based lead-free perovskite devices.
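
For reference, the power conversion efficiency that such simulations report is obtained from the simulated current-voltage curve; in standard photovoltaic notation (assumed here, not quoted from SCAPS output conventions):

\[
\mathrm{PCE} = \frac{P_{\max}}{P_{\mathrm{in}}} = \frac{J_{\mathrm{SC}}\,V_{\mathrm{OC}}\,\mathrm{FF}}{P_{\mathrm{in}}},
\qquad
\mathrm{FF} = \frac{J_{\mathrm{mp}}\,V_{\mathrm{mp}}}{J_{\mathrm{SC}}\,V_{\mathrm{OC}}},
\]

where J_SC is the short-circuit current density, V_OC the open-circuit voltage, FF the fill factor, J_mp and V_mp the current density and voltage at the maximum power point, and P_in the incident power density (typically 100 mW/cm2 under AM1.5G illumination).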

A Real-Time Bayesian Decision-Support System for Predicting Suspect Vehicle’s Intended Target Using a Sparse Camera Network

We present a decision-support tool to assist an operator in the detection and tracking of a suspect vehicle traveling to an unknown target destination. Multiple data sources, such as traffic cameras, traffic information, weather, etc., are integrated and processed in real-time to infer a suspect’s intended destination chosen from a list of pre-determined high-value targets. Previously, we presented our work in the detection and tracking of vehicles using traffic and airborne cameras. Here, we focus on the fusion and processing of that information to predict a suspect’s behavior. The network of cameras is represented by a directional graph, where the edges correspond to direct road connections between the nodes and the edge weights are proportional to the average time it takes to travel from one node to another. For our experiments, we construct our graph based on the greater Los Angeles subset of the Caltrans’s “Performance Measurement System” (PeMS) dataset. We propose a Bayesian approach where a posterior probability for each target is continuously updated based on detections of the suspect in the live video feeds. Additionally, we introduce the concept of ‘soft interventions’, inspired by the field of Causal Inference. Soft interventions are herein defined as interventions that do not immediately interfere with the suspect’s movements; rather, a soft intervention may induce the suspect into making a new decision, ultimately making their intent more transparent. For example, a soft intervention could be temporarily closing a road a few blocks from the suspect’s current location, which may require the suspect to change their current course. The objective of these interventions is to gain the maximum amount of information about the suspect’s intent in the shortest possible time. Our system currently operates in a human-on-the-loop mode where at each step, a set of recommendations are presented to the operator to aid in decision-making. In principle, the system could operate autonomously, only prompting the operator for critical decisions, allowing the system to significantly scale up to larger areas and multiple suspects. Once the intended target is identified with sufficient confidence, the vehicle is reported to the authorities to take further action. Other recommendations include a selection of road closures, i.e., soft interventions, or to continue monitoring. We evaluate the performance of the proposed system using simulated scenarios where the suspect, starting at random locations, takes a noisy shortest path to their intended target. In all scenarios, the suspect’s intended target is unknown to our system. The decision thresholds are selected to maximize the chances of determining the suspect’s intended target in the minimum amount of time and with the smallest number of interventions. We conclude by discussing the limitations of our current approach to motivate a machine learning approach, based on reinforcement learning in order to relax some of the current limiting assumptions.
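
The abstract does not give the update rule explicitly; the sketch below illustrates the kind of Bayesian posterior update described, on a small toy road graph (not the PeMS/Los Angeles network), assuming a noisy-shortest-path likelihood for each observed camera-to-camera move:

```python
# Hedged sketch of a Bayesian target-posterior update over candidate targets.
# The graph, node/target names and the softmax likelihood are assumptions
# made for illustration only.
import networkx as nx
import numpy as np

# Toy directed road graph; edge weights = average travel time (minutes).
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("A", "B", 4), ("B", "C", 3), ("B", "D", 5),
    ("C", "T1", 2), ("C", "D", 4), ("D", "T2", 2),
])
targets = ["T1", "T2"]
posterior = {t: 1.0 / len(targets) for t in targets}  # uniform prior

def move_likelihood(prev_node, next_node, target, beta=1.0):
    """P(observed move | intended target): moves that shorten the remaining
    travel time toward the target are exponentially more likely (softmax)."""
    scores = {}
    for nbr in G.successors(prev_node):
        try:
            remaining = nx.shortest_path_length(G, nbr, target, weight="weight")
        except nx.NetworkXNoPath:
            continue  # this neighbor cannot reach the target at all
        scores[nbr] = np.exp(-beta * (G[prev_node][nbr]["weight"] + remaining))
    z = sum(scores.values())
    return scores.get(next_node, 0.0) / z if z > 0 else 0.0

def update_posterior(prev_node, next_node):
    """Bayes rule: posterior(target) is proportional to prior * likelihood."""
    for t in targets:
        posterior[t] *= move_likelihood(prev_node, next_node, t)
    z = sum(posterior.values())
    for t in targets:
        posterior[t] = posterior[t] / z if z > 0 else 1.0 / len(targets)

# Camera detections: the suspect is observed moving A -> B, then B -> C.
for prev, nxt in [("A", "B"), ("B", "C")]:
    update_posterior(prev, nxt)
print(posterior)  # probability mass shifts toward target T1
```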

Fighting COVID-19: Lessons and Experience from the World’s Largest Economies

This paper reviews the insights gained in combating COVID-19 in the US, Japan, and China. After evaluation and investigation, we found that China's and Japan's experience of fighting COVID-19 is commendable: the Chinese government and the Japanese administration implemented highly effective governance and public health courses of action to fight COVID-19. Government-led epidemic control with a staunch belief in science can roll out effective pandemic control strategies. In contrast, the US failed to react to COVID-19 effectively; the relaxed public health measures of ending shutdowns prematurely did not work, and when the US kept businesses open after the spring shutdown, COVID-19 cases soared. Such experiences show that effective governance and a mandatory, stricter approach can curb a pandemic better than milder measures in handling a public health emergency, and that China and Japan, where collectivistic culture reigns, can better manage a public health crisis through collective effort.

Cardiac Biosignal and Adaptation in Confined Nuclear Submarine Patrol

Isolated and confined environments (ICE) present several challenges which may adversely affect human psychology and physiology. Submariners on Sub-Surface Ballistic Nuclear (SSBN) missions, exposed to these environmental constraints, must be able to perform complex tasks as part of their normal duties, as well as during crisis periods when emergency actions are required or imminent. The operational and environmental constraints they face challenge human adaptability. The impact of such a constrained environment has yet to be explored. Establishing a knowledge framework is a determining factor, particularly in view of forthcoming long-duration space travel. Ensuring that the crews are maintained in optimal operational conditions is a real challenge because the success of the mission depends on them. This study focused on evaluating the impact of stress on the mental health and sensory degradation of submariners during an SSBN mission using cardiac biosignal (heart rate variability, HRV) clustering. This pragmatic exploratory study of a prospective cohort included 19 submariner volunteers. HRV was recorded at baseline to classify the submariners, by clustering, according to their stress level based on parasympathetic (Pa) activity. The impacts of a high Pa (HPa) versus a low Pa (LPa) level at baseline on emotional state and sensory perception (interoception and exteroception) were assessed via the cardiac biosignal during the patrol and at a recovery time point one month later. At no time point was a significant difference in mental health found between the groups. There were significant differences in interoceptive, exteroceptive and physiological functioning during the patrol and at recovery. In summary, compared to the LPa group, the HPa group maintained a higher level of psychosensory functioning during the patrol and at recovery but exhibited a decrease in Pa level. The HPa group had less adaptable HRV characteristics, with less unpredictability and flexibility of the cardiac biosignal, while the LPa group increased them during the patrol and at recovery. This dissociation between psychosensory and physiological adaptation suggests two treatment modalities for ICE environments. To the best of our knowledge, our results are the first to highlight the impact of physiological differences in the HRV profile on the adaptability of submariners. Further studies are needed to evaluate the negative emotional and cognitive effects of ICEs based on the cardiac profile. Artificial intelligence offers a promising future for maintaining a high level of operational conditions. These future perspectives will not only allow submariners to be better prepared, but will also support the design of feasible countermeasures for analog environments that bring us closer to a trip to Mars.
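
The abstract does not state which clustering algorithm or HRV indices were used; the sketch below is an illustrative assumption, splitting subjects into high- and low-parasympathetic groups with k-means on two common time-domain indices (RMSSD and SDNN) computed from baseline RR intervals:

```python
# Hedged sketch: splitting submariners into HPa / LPa groups from baseline HRV.
# The feature choice (RMSSD, SDNN), the simulated RR data and k-means with k=2
# are illustrative assumptions; the study's exact clustering method is not given.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (ms),
    a time-domain index of parasympathetic (vagal) activity."""
    return float(np.sqrt(np.mean(np.diff(rr_ms) ** 2)))

def sdnn(rr_ms):
    """Standard deviation of RR intervals (ms), overall HRV."""
    return float(np.std(rr_ms, ddof=1))

# Simulated baseline RR-interval series (ms) for 19 subjects.
rng = np.random.default_rng(1)
subjects = [1000 + 60 * (0.5 + rng.random()) * rng.standard_normal(300)
            for _ in range(19)]

features = np.array([[rmssd(rr), sdnn(rr)] for rr in subjects])
X = StandardScaler().fit_transform(features)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Relabel so that group 1 is the higher-RMSSD ("HPa") cluster.
if features[labels == 1, 0].mean() < features[labels == 0, 0].mean():
    labels = 1 - labels
print("HPa:", int(labels.sum()), "subjects; LPa:", int((labels == 0).sum()))
```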

Traditional Dyeing of Silk with Natural Dyes by Eco-Friendly Method

In the traditional dyeing of natural fibers with natural dyes, metal salts are commonly used to increase color stability. This method always carries the risk of environmental pollution (contamination of arable soils and fresh groundwater) due to the release of dyeing effluents containing large amounts of metal. Therefore, researchers are always looking for new methods to obtain a green dyeing system. In this research, the use of an enzymatic dyeing method is proposed to prevent environmental pollution by metals and to reduce production costs. After degumming and bleaching, raw silk fabrics were dyed with natural dyes (madder and sumac) by three methods (pre-mordanting with a metal salt, one-step enzymatic dyeing, and two-step enzymatic dyeing). The results show that silk dyed with natural dyes by the enzymatic method has higher color strength and colorfastness than silk pretreated with a metal salt. Also, the amount of residual dye in the dyeing wastewater is significantly reduced by the enzymatic method. It is found that the enzymatic dyeing method leads to improved dye absorption and color strength, a soft hand, no change in color shade, lower production costs (due to the low dyeing temperature), and a significant reduction in environmental pollution.

Solid State Drive End to End Reliability Prediction, Characterization and Control

A flaw or drift from expected operational performance in one component (NAND, PMIC, controller, DRAM, etc.) may affect the reliability of the entire Solid State Drive (SSD) system. Therefore, it is important to ensure the required quality of each individual component through qualification testing specified by standards or user requirements. Qualification testing is time-consuming and comes at a substantial cost for product manufacturers. A highly technical team drawn from all the key stakeholders embarks on reliability prediction from the beginning of new product development, identifies critical-to-reliability parameters, performs full characterization to embed margin into product reliability, and establishes controls to ensure that product reliability is sustainable in mass production. This paper discusses a comprehensive development framework that covers the SSD end to end, from design to assembly, in-line inspection and in-line testing, and that can predict and validate product reliability at an early stage of new product development. During the design stage, the SSD goes through an intensive reliability margin investigation with a focus on assembly process attributes, process equipment control, in-process metrology, and the forward-looking product roadmap. Once these pillars are completed, the next step is to perform process characterization and build a reliability prediction model. Next, for design validation, the reliability prediction, specifically a solder joint simulation, is established. The SSDs are stratified into Non-Operating and Operating tests with a focus on solder joint reliability and connectivity/component latent failures: prevention through design intervention and containment through the Temperature Cycle Test (TCT). Some of the SSDs are subjected to physical solder joint analysis, namely Dye and Pry (DP) and cross-section analysis. The results are fed back to the simulation team for any corrective actions required to further improve the design. Once the SSD is validated and proven to work, the monitoring phase is implemented, whereby Design for Assembly (DFA) rules are updated. At this stage, the design changes and the process and equipment parameters are under control. Predictable product reliability early in product development enables on-time sample qualification delivery to the customer, optimizes product development validation, uses development resources effectively, and avoids forced late investment to patch end-of-life product failures. Understanding the critical-to-reliability parameters earlier allows a focus on increasing the product margin, which increases customer confidence in product reliability.

Numerical Modelling of Dust Propagation in the Atmosphere of Tbilisi City in Case of Western Background Light Air

Tbilisi, a large city of the South Caucasus, is a junction point connecting Asia and Europe, Russia and the republics of Asia Minor. In recent years, its atmosphere has experienced an increasing anthropogenic load. A numerical modeling method is used to study Tbilisi's atmospheric air pollution. By means of a 3D non-linear, non-steady numerical model, the peculiarities of city atmosphere pollution are investigated during a background western light air. Spatial and temporal changes in dust concentration are determined. Zones of high, average and low pollution, dust accumulation areas, transfer directions, etc., are identified. Numerical modeling shows that the process of air pollution by dust proceeds in four stages, which depend on the intensity of motor traffic, the micro-relief of the city, and the location of the city mains. Dust concentrations grow intensively in the interval 06:00-09:00, remain constant or weakly decrease in 09:00-15:00, increase in 18:00-21:00, and decrease from 21:00 to 06:00. The highly polluted areas are located in the vicinity of the city center and in some peripheral territories of the city, where the maximum dust concentration at 21:00 is equal to twice the maximum allowable concentration. Similar investigations conducted for various meteorological situations will enable us to compile a map of background urban pollution and to elaborate practical measures for ambient air protection.
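
The abstract does not reproduce its governing equations; for orientation, dust transport models of this kind are typically based on an advection-diffusion equation of the general form (notation assumed here, not taken from the paper):

\[
\frac{\partial C}{\partial t}
+ u\frac{\partial C}{\partial x}
+ v\frac{\partial C}{\partial y}
+ (w - w_g)\frac{\partial C}{\partial z}
= \frac{\partial}{\partial z}\!\left(K_z \frac{\partial C}{\partial z}\right)
+ K_h\!\left(\frac{\partial^2 C}{\partial x^2} + \frac{\partial^2 C}{\partial y^2}\right)
+ Q,
\]

where C is the dust concentration, (u, v, w) the wind components, w_g the gravitational settling velocity, K_z and K_h the vertical and horizontal turbulent diffusion coefficients, and Q the emission source term (dominated here by motor traffic).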

Toward Discovering an Architectural Typology Based on the Theory of Affordance

This paper revolves around the concept of affordance. It aims to discover and develop an architectural typology based on the ecological concept of affordance. In order to achieve this aim, an analytical study was conducted and two sources were taken into account: 1. Gibson's definition of the concept of affordance, and 2. research concerned with the categorisation of affordances. As a result, the paper derives 16 typologies of affordance, including the possibilities of mixing them, based on both sources. To clarify these typologies and provide further understanding, a wide range of architectural examples is presented in the paper. To prove this vocabulary’s capability to diagnose and evaluate the affordances of different environments, an experimental study with two processes has been adopted: 1. a diagnostic process, interpreting the environments with regard to their affordances using the new vocabulary (the developed typologies); and 2. an evaluating process, evaluating the environments that have been interpreted and classified with regard to their affordances, using measures of emotional experience (positive affect ‘PA’ and negative affect ‘NA’) and architectural evaluation criteria (beauty, economy and function). The experimental study proves that the typologies are capable of reading the affordances within different environments, and it explains how these different typologies reflect different interactions based on the two processes. The data derived from the evaluation measures show that different typologies of affordance, reflected in different environments, received different evaluations; some of them are recommended while others are not. In other words, the paper draws a roadmap for designers to diagnose, evaluate and analyse affordances in different architectural environments, and then guides them in adopting the best interaction (affordance category) for their proposed designs.

Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery

Data flows and the purposes of reporting data differ and depend on business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form the dataset constructed for each time point, which contains all the information required for freight moving decisions. As a significant amount of these data is used for various purposes, an integrating methodological approach must be developed to respond to the indicated problem. The proposed methodology contains several steps: (1) collecting context data sets and validating the data; (2) performing a multi-objective analysis for optimizing freight transfer services. For data validation, the study applies Grubbs' outlier analysis, particularly for data cleaning and for identifying the statistical significance of data reporting event cases. The Grubbs test is often used because it tests one extreme value at a time against the bounds of a normal distribution. In the study area, the test has not been widely applied by other authors, except where the Grubbs test for outlier detection was used to identify outliers in fuel consumption data. In this study, the authors applied the method with a confidence level of 99%. For the multi-objective analysis, the authors select those forms of genetic algorithm construction that have more possibilities of extracting the best solution. For freight delivery management, genetic algorithm schemas are used as a more effective technique; accordingly, an adaptable genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and select the appropriate transport corridor. The authors suggest a methodology for the multi-objective analysis which evaluates the collected context data sets and uses this evaluation to determine a delivery corridor for the freight transfer service in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value for the management of multi-modal transportation processes.
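
As an illustration of the validation step, the sketch below implements a standard Grubbs test for a single outlier at the 99% confidence level; the sample values and the field being checked are hypothetical, not taken from the paper's data:

```python
# Hedged sketch of a Grubbs test for one outlier at the 99% confidence level,
# as used for data cleaning. The delivery-time sample below is illustrative.
import numpy as np
from scipy import stats

def grubbs_test(x, alpha=0.01):
    """Return (is_outlier, index) for the most extreme value in x."""
    x = np.asarray(x, dtype=float)
    n = x.size
    mean, sd = x.mean(), x.std(ddof=1)
    idx = int(np.argmax(np.abs(x - mean)))
    G = abs(x[idx] - mean) / sd                      # Grubbs statistic
    # Critical value derived from the t distribution (two-sided form).
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    G_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return G > G_crit, idx

# Example: reported delivery times (hours) with one suspicious entry.
times = [12.1, 11.8, 12.4, 12.0, 11.9, 12.2, 19.5, 12.3]
outlier, i = grubbs_test(times)
print(outlier, times[i])  # expected: True 19.5
```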

Film Sensors for the Harsh Environment Application

A capacitance level sensor with a segmented film electrode and a thin-film volume flow sensor with an innovative by-pass sleeve are presented as industrial products for application in harsh environments. The working principle of such sensors is well known; however, traditional sensors show some limitations for certain industrial measurements. The two sensors presented in this paper overcome these limitations and enlarge the application spectrum. The problem is analyzed, and the solution is given. The emphasis of the paper is on developing the problem-solving concepts and on the realization of the corresponding measuring circuits. These should offer advice and encouragement on how electronic measuring products can still be developed in an almost saturated market.

Shaking Force Balancing of Mechanisms: An Overview

The balancing of mechanisms is a well-known problem in the field of mechanical engineering because variable dynamic loads cause vibration, as well as noise, wear and fatigue of the machines. A mechanical system with an unbalanced shaking force and shaking moment transmits substantial vibration to the frame. Therefore, the objective of balancing is to cancel or reduce the variable dynamic reactions transmitted to the frame. The resolution of this problem consists in balancing the shaking force and shaking moment. It can be done fully or partially, by internal mass redistribution via adding counterweights or by modifying the mechanism's architecture via adding auxiliary structures. Balancing problems are of continuing interest to researchers: several laboratories around the world are very active in this area and new results are published regularly. Despite its long history, mechanism balancing theory continues to be developed, and new approaches and solutions are constantly being reported. Various surveys have been published that disclose the particularities of balancing methods. The author believes that this is an appropriate moment to present the state of the art of shaking force balancing studies, complemented by new research results. This paper presents an overview of methods devoted to the shaking force balancing of mechanisms, as well as the historical aspects of the origins and evolution of the balancing theory of mechanisms.
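
For orientation, the quantity that shaking force balancing seeks to cancel is the resultant of the inertia forces of the moving links; in standard notation (assumed here, not quoted from the surveyed works):

\[
\mathbf{F}_{\mathrm{sh}} = -\sum_{i=1}^{n} m_i \,\ddot{\mathbf{r}}_{S_i} = -\,m_{\mathrm{tot}}\,\ddot{\mathbf{r}}_{S},
\]

where m_i and r_{S_i} are the mass and center-of-mass position of link i, m_tot the total moving mass, and r_S the position of the overall center of mass of the moving links. The shaking force therefore vanishes when the total center of mass remains stationary, which is the condition exploited by counterweight-based (mass redistribution) balancing.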

A Review on Process Parameters of Ti/Al Dissimilar Joint Using Laser Beam Welding

The use of laser beam welding for joining titanium and aluminum offers several advantages over conventional joining processes. Dissimilar metal combinations are in great demand in the aircraft structural industry and in research activities. The quality of a weld joint is directly influenced by the welding input parameters. The common problem faced by manufacturers is controlling the process parameters to obtain a good weld joint with minimal detrimental effects. To overcome this issue, various parameters can be tuned to obtain a quality weld joint. In the present study, the literature on processing parameters such as offset distance, welding speed, laser power, shielding gas and filler metals is reviewed, along with their effects on weldment quality. Additionally, the mechanical properties of the weld joints are discussed. The aim of the report is to review recent progress in the welding of dissimilar titanium (Ti) and aluminum (Al) alloys to provide a basis for follow-up research.

Generative Adversarial Network Based Fingerprint Anti-Spoofing Limitations

Fingerprint Anti-Spoofing approaches have been actively developed and applied in real-world applications. One of the main problems of Fingerprint Anti-Spoofing is a lack of robustness to unseen samples, especially in real-world scenarios. A possible solution is to generate artificial but realistic fingerprint samples and use them for training in order to achieve good generalization. This paper contains experimental and comparative results with currently popular GAN-based methods and uses realistic synthesis of fingerprints in training in order to increase performance. Among various GAN models, the most popular, StyleGAN, is used for the experiments. The CNN models were first trained with a dataset that did not contain generated fake images, and the accuracy along with the mean average error rate were recorded. Then, the generated fake images (fake images of live fingerprints and fake images of spoof fingerprints) were each combined with the original images (real images of live fingerprints and real images of spoof fingerprints), and the various CNN models were trained again. The best performance for each CNN model trained with the dataset containing generated fake images was recorded, along with the accuracy and the mean average error rate in each case. We observe that current GAN-based approaches need significant improvements in Anti-Spoofing performance, although the overall quality of the synthesized fingerprints seems reasonable. We include an analysis of this performance degradation, especially with a small number of samples. In addition, we suggest several approaches towards improved generalization with a small number of samples, by focusing on what GAN-based approaches should and should not learn.
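
The exact training configuration is not given in the abstract; the following is an illustrative sketch, assuming directory-based image folders (names hypothetical) and a standard torchvision CNN, of how real and GAN-generated live/spoof images might be combined for training:

```python
# Hedged sketch: training a CNN for live/spoof classification on real images
# combined with GAN-generated fakes. Directory names, model choice (ResNet-18)
# and hyperparameters are illustrative assumptions, not the paper's setup.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, ConcatDataset
from torchvision import datasets, models, transforms

tfm = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Each folder is assumed to contain "live/" and "spoof/" subfolders.
real_ds = datasets.ImageFolder("data/real", transform=tfm)
fake_ds = datasets.ImageFolder("data/stylegan_generated", transform=tfm)
train_loader = DataLoader(ConcatDataset([real_ds, fake_ds]),
                          batch_size=32, shuffle=True)

model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, 2)  # two classes: live vs. spoof
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```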

Study of the Energy Efficiency of Buildings under Tropical Climate with a View to Sustainable Development: Choice of Material Adapted to the Protection of the Environment

In the context of sustainable development and climate change, the adaptation of buildings to the climatic context in hot climates is a necessity if we want to improve living conditions in housing and reduce the risks to the health and productivity of occupants caused by thermal discomfort in buildings. A wide variety of efficient solutions can be found, but at high cost. In developing countries, especially tropical ones, we need a technology with a very limited cost that is affordable for everyone, energy efficient, and protective of the environment. Biosourced insulation is a product based on plant fibers, animal products or products from recyclable paper or clothing. Its development meets the objectives of maintaining biodiversity, reducing waste and protecting the environment. In tropical or hot countries, the aim is to protect the building from solar thermal radiation, a source of discomfort. The aim of this work is in line with the logic of energy control and environmental protection: the approach is to make building occupants comfortable, reduce carbon dioxide (CO2) emissions and decrease energy consumption (energy efficiency). We chose to study the thermo-physical properties of banana leaves and sawdust, especially their thermal conductivities; direct measurements were made using the flash method and the hot plate method. We also measured the heat flow on both sides of each sample by the hot box method. The results from these different experiments show that these materials are very efficient when used as insulation. We also conducted a building thermal simulation in Design Builder software using banana leaves as one of the materials, with the air-conditioning load and CO2 release used as performance indicators. When the air-conditioned building cell is protected on the roof by banana leaves and the material is integrated into the walls, with solar protection of the glazing, it saves up to 64.3% of energy and avoids 57% of CO2 emissions.
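
For reference, the steady-state hot plate measurement mentioned above infers the thermal conductivity from Fourier's law; in standard notation (assumed here, the paper's own symbols are not reproduced):

\[
\lambda = \frac{\Phi\, e}{A\,\Delta T},
\]

where Φ is the heat flow through the sample (W), e the sample thickness (m), A the exchange area (m2) and ΔT the temperature difference between the hot and cold faces (K), giving λ in W/(m·K). The flash method instead yields the thermal diffusivity a, from which λ = a·ρ·c_p follows once the density ρ and specific heat c_p of the material are known.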