Fracture Toughness Properties and FTIR Analysis of Corn Fiber Green Composites

The present work introduces a green composite consisting of corn natural fiber at a constant concentration of 10% by weight incorporated into a poly(methyl methacrylate) matrix biomaterial, prepared by the hand lay-up technique. Corn natural fibers were treated with two concentrations of sodium hydroxide solution (3% and 5%) for two immersion times (1.5 and 3 hours) at room temperature. Fracture toughness tests of the untreated and alkali-treated corn fiber composites were performed. The effect of the chemical treatment on the fracture properties of the composites was analyzed using Fourier transform infrared (FTIR) spectroscopy. The experimental results showed that the alkali treatment improved the fracture properties in terms of the plane strain fracture toughness KIC, which increased by up to 62% compared to the untreated fiber composites. On the other hand, increasing the alkali concentration and soaking time to 5% NaOH and 3 hours, respectively, reduced KIC below the value of the unfilled material.
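
For reference, plane strain fracture toughness for a notched bending specimen is computed from the peak load and specimen geometry. The sketch below assumes a single-edge-notch bend (SENB) specimen evaluated with the ASTM D5045 geometry factor for a span-to-width ratio of 4; the abstract does not state the specimen geometry or dimensions, so all values here are illustrative.

```python
# Hypothetical K_IC calculation for an SENB specimen (ASTM D5045, S/W = 4).
# Specimen dimensions and load below are illustrative, not from the paper.

def f_senb(x):
    """Geometry factor f(a/W) for a single-edge-notch bend specimen."""
    return (6 * x**0.5 * (1.99 - x * (1 - x) * (2.15 - 3.93 * x + 2.7 * x**2))
            / ((1 + 2 * x) * (1 - x)**1.5))

def k_ic(P, B, W, a):
    """Plane strain fracture toughness in MPa*sqrt(m).

    P: peak load (N), B: thickness (m), W: width (m), a: crack length (m).
    """
    return (P / (B * W**0.5)) * f_senb(a / W) / 1e6  # Pa*sqrt(m) -> MPa*sqrt(m)

# Example with assumed values: P = 150 N, B = 5 mm, W = 10 mm, a = 5 mm
print(f"K_IC = {k_ic(150, 0.005, 0.010, 0.005):.2f} MPa*sqrt(m)")
```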

Freighter Aircraft Selection Using Entropic Programming for Multiple Criteria Decision Making Analysis

This paper proposes entropic programming for the freighter aircraft selection problem using the multiple criteria decision analysis method. The study aims to provide a systematic and comprehensive framework focused on the perspective of freighter aircraft selection. To achieve this goal, an integrated entropic programming approach was proposed to evaluate and rank alternatives. The decision criteria and aircraft alternatives were identified from the research data analysis. The objective criteria weights were determined by the mean weight method and the standard deviation method. The proposed entropic programming model was applied to a practical decision problem for evaluating and selecting freighter aircraft, and it gives robust, reliable, and efficient results in modeling decision making analysis problems. As a result of the entropic programming analysis, the Boeing B747-8F, freighter aircraft alternative (a3), was chosen as the most suitable freighter aircraft candidate.
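
The abstract names the mean weight and standard deviation methods for objective criteria weighting; Shannon-entropy weighting is a natural companion in an entropic framework, though the paper's exact formulation is not given here. A minimal sketch of the three weighting schemes on a hypothetical decision matrix:

```python
# Sketch of objective criteria weighting for an MCDM decision matrix.
# The decision matrix below is a placeholder, not the paper's data.
import numpy as np

# rows = aircraft alternatives (a1..a3), cols = benefit criteria
X = np.array([[120., 0.82, 7.5],
              [134., 0.85, 6.9],
              [140., 0.88, 7.2]])

def mean_weights(X):
    # equal importance: w_j = 1/n
    return np.full(X.shape[1], 1.0 / X.shape[1])

def std_weights(X):
    # weight proportional to each criterion's standard deviation
    s = X.std(axis=0, ddof=1)
    return s / s.sum()

def entropy_weights(X):
    # Shannon-entropy weighting: more discriminating criteria get more weight
    P = X / X.sum(axis=0)
    e = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
    d = 1.0 - e
    return d / d.sum()

for w in (mean_weights(X), std_weights(X), entropy_weights(X)):
    scores = (X / X.max(axis=0)) @ w        # simple weighted-sum ranking
    print(np.round(w, 3), "-> best alternative: a%d" % (scores.argmax() + 1))
```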

Comparison of Composite Programming and Compromise Programming for Aircraft Selection Problem Using Multiple Criteria Decision Making Analysis Method

In this paper, the comparison of composite programming and compromise programming for the aircraft selection problem is discussed using the multiple criteria decision analysis method. The decision making process requires the prior definition and fulfillment of certain factors, especially in complex areas such as aircraft selection. The proposed technique gives more efficient results by extending composite programming and compromise programming, which are widely used in modeling multiple criteria decisions. The proposed model is applied to a practical decision problem for evaluating and selecting aircraft, and a selection is made based on the proposed approach developed in the field of multiple criteria decision making. The presented model is solved using two methods: composite programming and compromise programming. The importance values of the weight coefficients of the criteria are calculated using the mean weight method, and the evaluation and ranking of aircraft are carried out using the composite programming and compromise programming methods. To determine the stability of the model and the applicability of the developed composite programming and compromise programming approach, the paper analyzes its sensitivity in three parts: the first part changes the values of the coefficients λ and q; the second part applies the two different multiple criteria decision making methods, composite programming and compromise programming; and the third part calculates the Spearman correlation coefficient of the obtained ranks, which confirms the applicability of all the proposed approaches.
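
A hedged sketch of the two distance-based scores being compared is given below. Compromise programming ranks alternatives by their L_q distance from the ideal solution; the λ coefficient is treated here as a blending factor between the L_q and Chebyshev (L_inf) distances, one common composite formulation, since the abstract does not spell out the exact roles of λ and q.

```python
# Illustrative compromise/composite programming scores; smaller is better.
import numpy as np

def regret(X):
    # normalized distance from the ideal value per benefit criterion
    f_best, f_worst = X.max(axis=0), X.min(axis=0)
    return (f_best - X) / (f_best - f_worst)

def compromise(X, w, q):
    # L_q distance of each alternative from the ideal solution
    return ((w * regret(X) ** q).sum(axis=1)) ** (1.0 / q)

def composite(X, w, q, lam):
    # blend of L_q and L_inf distances, controlled by lambda in [0, 1]
    l_inf = (w * regret(X)).max(axis=1)
    return lam * compromise(X, w, q) + (1.0 - lam) * l_inf

X = np.array([[120., 0.82, 7.5],      # placeholder decision matrix
              [134., 0.85, 6.9],
              [140., 0.88, 7.2]])
w = np.full(3, 1.0 / 3.0)             # mean weight method
print("compromise:", np.round(compromise(X, w, q=2), 3))
print("composite: ", np.round(composite(X, w, q=2, lam=0.5), 3))
```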

Decision-Making Strategies on Smart Dairy Farms: A Review

Farm management and operations will drastically change due to access to real-time data, real-time forecasting, and tracking of physical items, in combination with Internet of Things (IoT) developments to further automate farm operations. Dairy farms have embraced technological innovations and acquired vast amounts of continuous data streams during the past decade; however, a system that integrates this information to improve the whole-farm decision-making process does not yet exist. It is now imperative to develop a system that can collect, integrate, manage, and analyze on-farm and off-farm data in real time for practical and relevant environmental and economic actions. Systems based on machine learning and artificial intelligence need to be connected to produce useful output and a better understanding of the whole farming operation and its environmental impact. Evolutionary Computing (EC) can be very effective in finding the optimal combination of sets of objects and, ultimately, in strategy determination. The system of the future should be able to manage the dairy farm as competently as an experienced dairy farm manager supported by a team of the best agricultural advisors. All these changes should bring resilience and sustainability to dairy farming, while improving and maintaining good animal welfare and the quality of dairy products. This review aims to provide insight into the state of the art of big data applications and EC in relation to smart dairy farming and to identify the most important research and development challenges to be addressed in the future. Smart dairy farming influences every area of management, and its uptake has become a continuing trend.

Combination of Tensile Strength and Elongation of Reverse Rolled TaNbHfZrTi Refractory High Entropy Alloy

Refractory high entropy alloys are potential materials for high-temperature applications because of their ability to retain high strength up to 1600°C. However, their practical applications are limited by poor elongation at room temperature. Decreasing the average valence electron concentration (VEC) is an effective design strategy to improve the intrinsic ductility of refractory high entropy alloys. In this work, the high-entropy alloy TaNbHfZrTi was processed at room temperature by reverse rolling, with the rolling direction reversed at each pass, up to a 90% reduction in thickness. Subsequently, the 90% reverse-rolled samples were annealed at 800°C and 1000°C for 1 h to study phase stability, microstructure, texture, and mechanical properties. The 90% reverse-rolled condition contains a single body-centered cubic (BCC) phase; upon annealing at 800°C, a secondary BCC-2 phase formed. Partially and fully recrystallized microstructures developed after annealing at 800°C and 1000°C, respectively. The reverse-rolled condition and the 1000°C annealed condition exhibit extraordinary room-temperature tensile properties, combining high ultimate tensile strength (UTS) with no loss of ductility and thereby overcoming the usual "strength-ductility" trade-off. The 90% reverse-rolled sample and the sample annealed at 1000°C for 1 h reach UTS values of 1430 MPa and 1556 MPa with appreciable elongations of 21% and 20%, respectively. A hierarchical microstructure developed in the 1000°C annealed condition, which led to the simultaneous increase in tensile strength and elongation.
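
As a worked example of the VEC design rule cited above: for an equiatomic five-component alloy, the average VEC is the equal-weighted mean of the elemental valence electron concentrations (taken here as the periodic-table group numbers). For TaNbHfZrTi this gives 4.4, on the low-VEC side commonly associated with improved intrinsic ductility in BCC refractory alloys.

```python
# Average valence electron concentration: VEC = sum_i c_i * VEC_i.
# Elemental VEC values are the periodic-table group numbers.
vec = {"Ta": 5, "Nb": 5, "Hf": 4, "Zr": 4, "Ti": 4}
c = 1.0 / len(vec)                       # equiatomic: 20 at.% each
avg_vec = sum(c * v for v in vec.values())
print(f"average VEC of TaNbHfZrTi = {avg_vec:.1f}")   # -> 4.4
```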

Methodology of Personalizing Interior Spaces in Public Libraries

Creating public spaces tailored to the specific demands of individuals is one of the challenges for contemporary interior designers. Improving general knowledge and providing a forum for people from all walks of life are among the objectives of a public library. In this regard, interior design consistent with the demands of individuals is of paramount importance. Study spaces, in particular those closely related to the personalized sector, have proven to be challenging, according to the literature. To address this challenge, attributes of individuals, namely their perception of public spaces and their interactions with those spaces, should be analyzed to give interior designers something to work with. This paper follows an analytic-descriptive research methodology, outlining case-study libraries that have personalized their public spaces, with the investigation of the type of personalization as its primary objective and (I) recognition of the physical program and the spatial connections in the interior design of a library and (II) analysis of each personalized space in relation to the other spaces of the library as its secondary objectives. The significance of the current research lies in the concept of personalization as one of the most recent methods of attracting people to libraries. Previous research exists in this regard, but the lack of data concerning personalization makes this topic worth investigating. Hence, this study aims to put forward approaches, through real case studies, for designers to deal with this concept.

A Comprehensive Survey on Machine Learning Techniques and User Authentication Approaches for Credit Card Fraud Detection

With the increase in credit card usage, the volume of credit card misuse has also increased significantly, which may cause appreciable financial losses for both credit card holders and the financial organizations issuing the cards. As a result, financial organizations are working hard to develop and deploy credit card fraud detection methods, in order to adapt to ever-evolving, increasingly sophisticated defrauding strategies and to identify illicit transactions as quickly as possible to protect themselves and their customers. Compounding the complex nature of such adverse strategies, credit card fraudulent activities are rare events compared to the number of legitimate transactions. Hence, the challenge of developing fraud detection methods that are accurate and efficient is substantially intensified, and, as a consequence, credit card fraud detection has lately become a very active area of research. In this work, we provide a survey of current techniques most relevant to the problem of credit card fraud detection. We carry out our survey in two main parts. In the first part, we focus on studies utilizing classical machine learning models, which mostly employ traditional transactional features to make fraud predictions. These models typically rely on static characteristics, such as what the user knows (knowledge-based methods) or what he/she has access to (object-based methods). In the second part of our survey, we review more advanced techniques of user authentication, which use behavioral biometrics to identify an individual based on his/her unique behavior while interacting with his/her electronic devices. These approaches rely on how people behave (instead of what they know or possess), which cannot be easily forged. By providing an overview of current approaches and the results reported in the literature, this survey aims to drive the future research agenda for the community in order to develop more accurate, reliable, and scalable models of credit card fraud detection.
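
To make the class-imbalance point concrete, the sketch below trains a classical classifier with class weighting on a synthetic imbalanced dataset and reports precision/recall rather than raw accuracy. It is a generic illustration of the first family of methods surveyed, not a reproduction of any specific study.

```python
# Classical fraud classifier on synthetic, heavily imbalanced data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# ~1% "fraud" class, mimicking the rarity of illicit transactions
X, y = make_classification(n_samples=20_000, n_features=20,
                           weights=[0.99], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y,
                                          test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=200,
                             class_weight="balanced",  # reweight rare fraud class
                             random_state=0).fit(X_tr, y_tr)
# Accuracy is misleading at 99:1 imbalance; inspect per-class precision/recall.
print(classification_report(y_te, clf.predict(X_te), digits=3))
```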

Dissecting Big Trajectory Data to Analyse Road Network Travel Efficiency

Digital innovation has played a crucial role in managing smart transportation. For this, big trajectory data collected from traveling vehicles, such as taxis equipped with global positioning system (GPS)-enabled devices, can be utilized. Such data offer an unprecedented opportunity to trace the movements of vehicles at fine spatiotemporal granularity. This paper explores big trajectory data to measure the travel efficiency of road networks across an entire city using the proposed statistical travel efficiency measure (STEM). Further, it identifies the causes of low travel efficiency using the proposed least-squares approximation network-based causality exploration (LANCE) method. Finally, the resulting data analysis reveals the causes of low travel efficiency, along with the road segments that need to be optimized to improve traffic conditions and thus minimize the average travel time from a given point A to a point B in the road network. The obtained results show that our proposed approach outperforms the baseline algorithms for measuring the travel efficiency of road networks.
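
The abstract does not define STEM; one common travel-efficiency proxy, assumed for the sketch below, is the ratio of free-flow travel time to observed travel time per road segment, aggregated over GPS trajectories. Segment IDs and times are illustrative.

```python
# Per-segment travel-efficiency proxy from trajectory traversal times.
import pandas as pd

trips = pd.DataFrame({
    "segment":    ["s1", "s1", "s2", "s2", "s2"],
    "observed_s": [95, 110, 240, 260, 300],    # observed traversal time (s)
})
free_flow = {"s1": 80, "s2": 180}              # free-flow time per segment (s)

trips["free_flow_s"] = trips["segment"].map(free_flow)
trips["efficiency"] = trips["free_flow_s"] / trips["observed_s"]  # 1.0 = ideal
# Low mean efficiency flags segments that are candidates for optimization.
print(trips.groupby("segment")["efficiency"].mean())
```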

Energy Policy in Nigeria: Prospects and Challenges

Energy is the major force that drives any country's socio-economic development. Without electricity, a country could be at risk of losing many potential investors. As such, good policy implementation could play a significant role in harnessing all the available energy resources. Nigeria has the prospect of meeting its energy demand and supply if good policies are in place and properly implemented. The current energy supply needs to improve in order to meet present and future demand, and sustainable energy development is the way forward. Renewable energy plays a significant role in the socio-economic development of any country, and Nigeria is blessed with abundant natural resources such as solar radiation for solar power, water for hydropower, wind for wind power, and biomass from both plant and animal waste. Conventional energy (fossil fuels) and unconventional energy (renewables) could be harmonized, as in an energy mix or biofuels: biodiesel, for example, could be produced from biomass and blended with petro-diesel in different ratios. All of this can be achieved if good policy is in place. The challenges could be overcome with good policy, public awareness, technological knowledge, and other incentives that can attract investors to the Nigerian energy sector.

Design and Construction of an Impulse Current Generator for Lightning Strike Experiments

There has been a rising trend in using impulse current generators to investigate the lightning strike protection of materials, including aluminum and composites, in structures such as wind turbine blades and aircraft bodies. The focus of this research is to present an impulse current generator built at the High Voltage Lab at Mississippi State University. The generator is capable of producing components A and D of natural lightning discharges in accordance with the Society of Automotive Engineers (SAE) standard, which is widely used in the aerospace industry. The generator can supply lightning impulse energy up to 400 kJ and can produce impulse currents with magnitudes greater than 200 kA. The electrical circuit and physical components of the improved impulse current generator are described, and several lightning strike waveforms with different amplitudes are presented for comparison with the standard waveform. The results of this study contribute to a fundamental understanding of the functionality of impulse current generators.
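
For context, the SAE standard idealizes each lightning current component as a double exponential, i(t) = I0(e^(-at) - e^(-bt)). The sketch below uses the commonly cited component A parameter set (peak near 200 kA, action integral near 2e6 A^2*s); verify the constants against SAE ARP5412 before relying on them.

```python
# Double-exponential idealization of lightning component A.
# Parameters are the commonly cited ARP5412 values (assumed here).
import numpy as np

I0, a, b = 218_810.0, 11_354.0, 647_265.0   # A, 1/s, 1/s
t = np.linspace(0.0, 500e-6, 5001)          # 0..500 microseconds
i = I0 * (np.exp(-a * t) - np.exp(-b * t))

print(f"peak current    ~ {i.max() / 1e3:.0f} kA")          # spec: ~200 kA
print(f"action integral ~ {np.trapz(i**2, t):.2e} A^2*s")   # spec: ~2e6
```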

Emotion Detection in Twitter Messages Using Combination of Long Short-Term Memory and Convolutional Deep Neural Networks

One of the most significant issues to have attracted attention in recent years is recognizing the sentiments and emotions in social media texts. The analysis of sentiments and emotions is intended to recognize conceptual information such as the opinions, feelings, attitudes, and emotions of people towards products, services, organizations, people, topics, events, and features in written text, which indicates the breadth of the problem space. In the real world, businesses and organizations are always looking for tools to gather the opinions, emotions, and inclinations of people about their products, services, or related events. This article uses the Twitter social network, one of the most popular social networks with about 420 million active users, to extract data. Using this social network, users can share their information and opinions about personal issues, policies, products, events, etc., and the availability of its data makes it suitable for classifying emotional states. In this study, supervised learning and deep neural network algorithms are used to classify the emotional states of Twitter users. The use of deep learning methods to increase the learning capacity of the model is an advantage given the large amount of available data. Tweets collected on various topics are classified into four classes using a combination of two bidirectional long short-term memory (BiLSTM) networks and a convolutional network. The results, with an average accuracy of 93%, demonstrate the effectiveness of the proposed framework and show improved accuracy compared to previous work.
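
A minimal sketch of the named architecture (two bidirectional LSTM layers feeding a convolutional layer, four output classes) is shown below in Keras. The vocabulary size, sequence length, embedding dimension, and filter counts are assumptions, not the paper's settings.

```python
# BiLSTM + CNN text classifier for four emotion classes (illustrative).
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB, MAXLEN, EMB = 20_000, 60, 128     # assumed tweet tokenization settings

model = models.Sequential([
    layers.Input(shape=(MAXLEN,)),
    layers.Embedding(VOCAB, EMB),
    layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
    layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
    layers.Conv1D(128, kernel_size=3, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dropout(0.5),
    layers.Dense(4, activation="softmax"),   # four emotion classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```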

Cirrhosis Mortality Prediction as Classification Using Frequent Subgraph Mining

In this work, we use machine learning and data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis were collected at a single medical center. Different machine learning models are applied to predict one-year mortality, and a comprehensive feature space including demographic information, comorbidities, clinical procedures, and laboratory tests is analyzed. A temporal pattern mining technique called Frequent Subgraph Mining (FSM) is used, and the Model for End-Stage Liver Disease (MELD) prediction of mortality serves as a comparator. All of our models statistically significantly outperform the MELD-score model, showing an average 10% improvement in the area under the curve (AUC). The FSM technique by itself does not yield significant improvements, due to the sparsity of the temporal information it needs; however, FSM combined with a machine learning technique called ensembling further improves model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk of higher mortality. Our work applies modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients and builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.
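
A minimal sketch of the evaluation pattern described, an ensemble model versus a single-score baseline compared by AUC, is shown below on synthetic data, since the 2,322-patient cohort is not public. The single "score-like" feature stands in for using the MELD score alone.

```python
# Ensemble classifier vs. single-score baseline, compared by AUC.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the EHR cohort (same sample size, imbalanced outcome).
X, y = make_classification(n_samples=2322, n_features=40,
                           weights=[0.85], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

ensemble = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
auc_ens = roc_auc_score(y_te, ensemble.predict_proba(X_te)[:, 1])

# Baseline: rank patients by a single feature, analogous to MELD alone.
auc_base = roc_auc_score(y_te, X_te[:, 0])
print(f"ensemble AUC = {auc_ens:.3f}, single-score baseline AUC = {auc_base:.3f}")
```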

The Role of Synthetic Data in Aerial Object Detection

The purpose of this study is to explore the characteristics of developing a machine learning application using synthetic data. The study is structured around developing the application for the purpose of deploying a computer vision model. The findings discuss the realities of attempting to develop a computer vision model for a practical purpose and detail the processes, tools, and techniques that were used to meet accuracy requirements. The research reveals that synthetic data represent another variable that can be adjusted to improve the performance of a computer vision model. Further, a suite of tools and tuning recommendations is provided.

A Commercial Building Plug Load Management System That Uses Internet of Things Technology to Automatically Identify Plugged-In Devices and Their Locations

Plug and process loads (PPLs) account for a large portion of U.S. commercial building energy use. There is a huge potential to reduce whole-building consumption by targeting PPLs for energy savings measures or implementing some form of plug load management (PLM). Despite this potential, there has yet to be a widely adopted commercial PLM technology. This paper describes the Automatic Type and Location Identification System (ATLIS), a PLM system framework with automatic and dynamic load detection (ADLD). ADLD gives PLM systems the ability to automatically identify devices as they are plugged into the outlets of a building. The ATLIS framework takes advantage of smart, connected devices to identify device locations in a building, meter and control their power, and communicate this information to a central database. ATLIS includes five primary capabilities: location identification, communication, control, energy metering, and data storage. A laboratory proof of concept (PoC) demonstrated all but the energy metering capability, and these capabilities were validated using a series of system tests. The PoC was able to identify when a device was plugged into an outlet and the location of the device in the building. When a device was moved, the PoC's dashboard and database were automatically updated with the new location. The PoC implemented control of devices from the system dashboard so that devices maintained correct schedules regardless of where they were plugged in within the building. ATLIS's primary technology application is improved PLM, but other applications include asset management, energy audits, and interoperability for grid-interactive efficient buildings. An ATLIS-based system could also be used to direct power to critical devices, such as ventilators, during a brownout or blackout. Such a framework is an opportunity to make PLM more widespread and reduce the amount of energy consumed by PPLs in current and future commercial buildings.
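
A hypothetical sketch of the location-update flow the PoC demonstrates, a device announcing itself from a new outlet and the database record following it, is given below. The schema and event format are assumptions; ATLIS's actual protocol is not described here.

```python
# Hypothetical plug-in event handler: upsert the device's location so its
# control schedule follows it wherever it is plugged in.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE devices (
    device_id TEXT PRIMARY KEY, outlet_id TEXT, room TEXT, schedule TEXT)""")

def on_plug_in(device_id, outlet_id, room, schedule="workday_9_to_5"):
    """Record a plug-in event; move the device if it already exists."""
    db.execute("""INSERT INTO devices VALUES (?, ?, ?, ?)
                  ON CONFLICT(device_id) DO UPDATE
                  SET outlet_id = excluded.outlet_id, room = excluded.room""",
               (device_id, outlet_id, room, schedule))

on_plug_in("monitor-17", "outlet-3A", "Room 210")
on_plug_in("monitor-17", "outlet-7C", "Room 305")   # device moved
print(db.execute("SELECT * FROM devices").fetchone())
```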

An Overview of Technology Availability to Support Remote Decentralized Clinical Trials

Developing new medicines and health solutions and improving patient health currently rely on the successful execution of clinical trials, which generate the relevant safety and efficacy data. For their success, recruitment and retention of participants are among the most challenging aspects of protocol adherence. The main barriers include: i) lack of awareness of clinical trials; ii) long distance from the clinical site; iii) the burden on participants, including the duration and number of clinical visits; and iv) high dropout rates. Most of these aspects could be addressed with a new paradigm, namely Remote Decentralized Clinical Trials (RDCTs). Furthermore, the COVID-19 pandemic has highlighted additional advantages and challenges for RDCTs in practice, such as allowing participants to join trials from home without depending on site visits. Nevertheless, RDCTs should follow the processes and quality assurance of conventional clinical trials. For each part of the trial, the building blocks, existing software and technologies were assessed through a systematic search. The technology needed to perform RDCTs is widely available and validated but is still segmented and developed in silos, as different software solutions address different parts of the trial and at various levels. The current paper analyzes the availability of technology to perform RDCTs, identifies gaps, and provides an overview of the basic building blocks and functionalities that need to be covered to support the described processes.

Performance Evaluation of Minimum Quantity Lubrication on EN3 Mild Steel Turning

Lubrication, cooling, and chip removal are the desired functions of any cutting fluid. Conventional or flood lubrication requires a high volume flow rate, and the associated cost is high. In addition, flood lubrication poses health risks to the machine operator. To avoid these consequences, dry machining and minimum quantity lubrication (MQL) are two alternatives. Dry machining is not a suitable alternative, as it generates greater heat and a poor surface finish. Here, turning work is carried out on a lathe using EN3 mild steel. Variable cutting speeds and depths of cut are applied, and the corresponding temperatures and surface roughness values are recorded. The experimental results are analyzed with Minitab software, and conclusions are drawn from regression analysis, main effects plots, and interaction plots using ANOVA. There is a 95.83% reduction in the use of cutting fluid. MQL gives a 9.88% reduction in tool temperature, which will improve tool life, and produces a 17.64% better surface finish. MQL appears to be an economical and environmentally compatible lubrication technique for sustainable manufacturing.
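
The paper's analysis was done in Minitab; the sketch below reproduces the same pattern (regression of a response on cutting speed and depth of cut, followed by ANOVA) with statsmodels on placeholder data, since the measured values are not given in the abstract.

```python
# Regression + ANOVA for a turning response; data values are illustrative.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "speed":  [355, 355, 530, 530, 800, 800],         # cutting speed (assumed levels)
    "doc":    [0.5, 1.0, 0.5, 1.0, 0.5, 1.0],         # depth of cut, mm (assumed)
    "temp_c": [38.0, 42.5, 44.1, 49.8, 51.2, 58.9],   # illustrative tool temperatures
})

model = smf.ols("temp_c ~ speed * doc", data=df).fit()
print(model.summary())
print(sm.stats.anova_lm(model, typ=2))   # main effects and interaction
```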

Sustainable Balanced Scorecard for Kaizen Evaluation: Comparative Study between Egypt and Japan

Continuous improvement activities are becoming a key organizational success factor; these activities include, but are not limited to, kaizen, six sigma, lean production, and continuous improvement projects. Kaizen is a Japanese philosophy of continuous improvement through small incremental changes that improve an organization's performance, reduce costs, reduce delay time, reduce waste in production, etc. This research proposes a measurement system for kaizen activities from a sustainable balanced scorecard perspective. A survey was developed and disseminated among kaizen experts in both Egypt and Japan to allocate key performance indicators for both the kaizen process (critical success factors) and its results (kaizen benefits) into the five sustainable balanced scorecard perspectives. This research contributes to the extant literature by presenting a kaizen measurement of both process and results that will illuminate the benefits of using kaizen, and the presented measurement can help sustain kaizen implementation across various sectors and industries. Thus, grasping the full benefits of kaizen implementation will contribute to the spread of kaizen understanding and practice. This research also provides insights into the social and cultural differences that influence kaizen success, and determining the proper combination of kaizen measures could be used by any industry, whether service or manufacturing, for better measurement of kaizen activities. The comparison between the Japanese implementation of kaizen, as the pioneers of continuous improvement, and the Egyptian implementation will help recommend better kaizen practices in Egypt and contribute to the 2030 sustainable development goals. The study results reveal no significant difference in the allocation of kaizen benefits between Egypt and Japan. However, with regard to the critical success factors, some differences appeared, reflecting the social differences and understanding between the two countries; a single integrated measurement was reached between the Egyptian and Japanese allocations, with the Japanese experts' opinion taken as the ultimate criterion for selection.

Technological Applications in Automobile Manufacturing Sector: A Case Study Analysis

The research focuses on the applicable technologies in the automobile industry and their effects on the productivity and annual revenue of the industry. A study has been conducted on six major automobile manufacturing companies, represented in this research as M1, M2, M3, M4, M5, and M6. The results indicate that M1, a pioneer in technological applications, remains the market leader, with M5 and M2 taking the second and third positions, respectively; M3, M6, and M4 are the followers, placed next in order. It has also been observed that M1 and M2 have entered into an agreement to share basic structural technologies, and they maintain long-term, trusted relationships with their suppliers through the keiretsu system. With technological giants such as Apple, Microsoft, Uber, and Google entering the automobile industry in recent years, an upward trend is expected in the future market, with self-driving cars set to dominate the automobile sector. To keep up with this trend, it is essential for automobile manufacturers to understand the importance of developing the technological capabilities and skills needed to be competitive in the marketplace.

Adaptive Control Strategy of Robot Polishing Force Based on Position Impedance

Manual polishing suffers from problems such as high labor intensity, low production efficiency, and difficulty in guaranteeing consistent polishing quality. Using robot polishing instead of manual polishing can effectively avoid these problems. Polishing force directly affects polishing quality, so accurate tracking and control of the polishing force is one of the most important conditions for improving the accuracy of robot polishing. Traditional force control strategies struggle to cope with the strong coupling of force control and position control during the robot polishing process. Therefore, based on an analysis of force-based impedance control and position-based impedance control, this paper proposes an adaptive controller. Built on force-feedback active compliance control, the controller adaptively estimates the stiffness and position of the external environment and eliminates the steady-state force error produced by traditional impedance control. Simulation results show that the adaptive controller adapts well to changing environmental positions and stiffnesses and can accurately track and control the polishing force.
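
A simplified simulation of the idea is sketched below: the position reference is corrected through a target impedance (M, B, K), and a slow integral adaptation of the reference removes the steady-state force error, standing in for the paper's online estimation of environment stiffness and position. All gains and the spring-contact environment model are illustrative assumptions.

```python
# Simplified position-based impedance loop with adaptive reference correction.
# Environment modeled as a linear spring (unknown to the controller); the
# inner position loop is assumed ideal. All parameter values are illustrative.
M, B, K = 1.0, 80.0, 400.0     # target impedance: mass, damping, stiffness
dt, F_d = 0.001, 10.0          # control period (s), desired polishing force (N)
k_env, x_env = 5000.0, 0.0     # "true" surface stiffness (N/m) and location (m)

x_ref, e, de = 0.0, 0.0, 0.0   # adapted reference and impedance-correction state
k_i = 0.002                    # adaptation gain on the force error

for _ in range(5000):          # 5 s of simulated contact
    x = x_ref + e                          # commanded = reference + correction
    F = max(k_env * (x - x_env), 0.0)      # measured contact force
    F_err = F_d - F
    # target impedance dynamics: M*dde + B*de + K*e = F_err
    dde = (F_err - B * de - K * e) / M
    de += dde * dt
    e += de * dt
    x_ref += k_i * F_err * dt              # slow adaptation removes steady-state error

print(f"final contact force = {max(k_env * (x_ref + e - x_env), 0.0):.2f} N "
      f"(target {F_d} N)")
```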

Optimization of Hemp Fiber Reinforced Concrete for Mix Design Method

The purpose of this study is to evaluate the incorporation of hemp fibers (HF) in concrete. Hemp fiber reinforced concrete (HFRC) is becoming more popular as an alternative to regular mix designs. This study evaluates the compressive strength of HFRC with respect to the mix procedure. HF were obtained from the manufacturer and hand processed to ensure uniformity in width and length. The fibers were added to the concrete in both wet and dry mixes to investigate and optimize the mix design process. Results indicated that the dry mix had a compressive strength of 1157 psi compared to 985 psi for the wet mix. The dry mix compressive strength was within range of the standard mix compressive strength of 1533 psi. The statistical analysis revealed that the mix design process needs further optimization and uniformity concerning the addition of HF. Regression analysis showed that the standard mix design had a coefficient of 0.9, compared to 0.375 for the dry mix, indicating variation in the mixing process. While completing the dry mix, the addition of plain HF caused the fibers to intertwine, creating lumps and inconsistency. During the wet mixing process, however, combining water and HF before incorporation allowed the fibers to disperse uniformly within the mix; hence, the regression analysis indicated a better coefficient of 0.55. This study concludes that HFRC is a viable alternative to regular mixes; however, more research into its characteristics needs to be conducted.
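
The "coefficients" reported are read here as coefficients of determination (R^2) of the fitted strength regressions, an interpretation the abstract leaves open. A minimal sketch of computing R^2 for a strength-versus-predictor fit, with placeholder data:

```python
# R^2 of a linear fit of compressive strength vs. a mix predictor.
# Data points are hypothetical; only the reported strengths and
# coefficients appear in the abstract.
import numpy as np

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

days = np.array([7, 14, 21, 28], dtype=float)       # curing time (days)
psi = np.array([640, 910, 1060, 1157], dtype=float) # hypothetical strengths

slope, intercept = np.polyfit(days, psi, 1)
print(f"R^2 = {r_squared(psi, slope * days + intercept):.3f}")
```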