Implementing a Mobility Platform to Connect Hubs in Rural Areas

Mobility is, for many people, an important factor in meeting daily needs, and many people depend on public transport. In rural areas with low population density, it is difficult to provide public transportation with sufficient coverage and frequency, so the available public transport is unattractive. As a result, people use their own cars, which is undesirable from a sustainability point of view and is not an option for children and elderly people. Sometimes people organize themselves and create volunteer transport services, which resemble demand-oriented taxis. However, these transport services are usually independent of each other and of the available line-based public transport, limiting both their usability and sustainability. We have developed a platform that improves usability and sustainability by connecting the different demand-oriented transport offerings with line-based public transport. The system was implemented and tested in a rural area in Germany, but the SARS-CoV-2 pandemic limited real-life operation.

Soil-Structure Interaction Models for the Reinforced Foundation System: A State-of-the-Art Review

Challenges posed by a weak soil subgrade are often resolved either by stabilizing the soil or by reinforcing it. It is also common practice to reinforce the granular fill to improve its load-settlement behavior over weak soil strata. The inclusion of reinforcement in the engineered granular fill provided a new impetus for the development of enhanced Soil-Structure Interaction (SSI) models, also known as mechanical foundation models or lumped parameter models. Several researchers have been working in this direction to understand the mechanism of granular fill-reinforcement interaction and the response of weak soil under the application of load. These models have been developed by extending available SSI models such as the Winkler, Pasternak, Hetenyi, and Kerr models, and they are helpful for visualizing the load-settlement behavior of a physical system through 1-D and 2-D analyses considering a beam and a plate resting on the foundation, respectively. Based on the literature survey, these models are categorized as the ‘Reinforced Pasternak Model,’ ‘Double Beam Model,’ ‘Reinforced Timoshenko Beam Model,’ and ‘Reinforced Kerr Model.’ The present work reviews the past 30+ years of research in the field of SSI models for reinforced foundation systems, presenting the conceptual development of these models systematically and discussing their limitations. A flow chart showing the procedure for computing deformation and mobilized tension is also included in the paper. Special effort has been made to tabulate the parameters and their significance in the load-settlement analysis, which may be helpful in future studies for the comparison and enhancement of the results and findings of physical models.
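
To make the starting point of these models concrete, the sketch below solves the simplest of them, a beam on a Winkler foundation, by finite differences; the parameter values, the simply supported boundary treatment, and the central point load are illustrative assumptions rather than data from the reviewed studies. The reinforced models discussed in the review add shear-layer and mobilized-tension terms on top of this equation.

```python
import numpy as np

# Finite-difference sketch of a simply supported Euler-Bernoulli beam on a
# Winkler foundation:  EI w'''' + k w = q(x).  All values are illustrative.
L, n = 10.0, 201                       # beam length [m], grid points
EI = 5.0e4                             # flexural rigidity [kN m^2]
k = 2.0e3                              # Winkler subgrade modulus [kN/m^2]
h = L / (n - 1)

q = np.zeros(n)
q[n // 2] = 100.0 / h                  # 100 kN central point load as an equivalent line load

A = np.zeros((n, n))
for i in range(2, n - 2):              # interior fourth-difference stencil
    A[i, i - 2:i + 3] += EI / h**4 * np.array([1.0, -4.0, 6.0, -4.0, 1.0])
    A[i, i] += k

# Simply supported ends: w = 0 and (one-sided) w'' = 0
A[0, 0] = A[-1, -1] = 1.0
A[1, 0:3] = [1.0, -2.0, 1.0]
A[-2, -3:] = [1.0, -2.0, 1.0]
b = q.copy()
b[[0, 1, -2, -1]] = 0.0

w = np.linalg.solve(A, b)
print(f"maximum settlement: {w.max() * 1000:.2f} mm")
```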

Review and Evaluation of Trending Canonical Correlation Analyses-Based Brain-Computer Interface Methods

The fast development of technology that has advanced neuroscience and human-computer interaction has enabled solutions to various problems and issues of this new era. The Brain-Computer Interface (BCI) has opened the door to several new research areas and has been able to provide solutions to critical and vital issues, such as helping a paralyzed patient interact with the outside world, controlling a robotic arm, playing games in VR with the brain, and driving a wheelchair. This review presents the state-of-the-art methods and improvements of canonical correlation analysis (CCA), an SSVEP-based BCI method. These methods are used to extract EEG signal features, in other words, the features of interest sought in the EEG analysis. Each method, from oldest to newest, is discussed, and their advantages and disadvantages are compared. This provides context and helps researchers understand the most advanced methods available in this field, their pros and cons, and their mathematical representations and usage. This work makes a vital contribution to the existing field of study. It differs from other similar recently published works by providing the following: (1) stating most of the main methods used in this field in a hierarchical way, (2) explaining the pros and cons of each method and their performance, and (3) presenting, at the end of each method, the gaps that remain, which can improve understanding and open the door to new research or improvements.
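
As a point of reference for the reviewed methods, the sketch below shows standard CCA-based SSVEP detection: the multichannel EEG segment is correlated with sine/cosine reference templates at each candidate stimulation frequency, and the frequency with the largest canonical correlation is selected. The sampling rate, channel count, and synthetic signal are assumptions for illustration.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

fs, duration, n_channels = 250, 2.0, 8          # assumed sampling setup
t = np.arange(0, duration, 1 / fs)

def references(freq, n_harmonics=2):
    # Sine/cosine reference templates at the fundamental and harmonics
    refs = []
    for h in range(1, n_harmonics + 1):
        refs += [np.sin(2 * np.pi * h * freq * t), np.cos(2 * np.pi * h * freq * t)]
    return np.column_stack(refs)

def detect(eeg, candidate_freqs):
    # eeg: (n_samples, n_channels) filtered EEG segment
    scores = []
    for f in candidate_freqs:
        u, v = CCA(n_components=1).fit_transform(eeg, references(f))
        scores.append(abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1]))
    return candidate_freqs[int(np.argmax(scores))], scores

# Synthetic example: EEG containing a weak 10 Hz SSVEP plus noise
rng = np.random.default_rng(0)
eeg = rng.normal(size=(len(t), n_channels))
eeg[:, 0] += 0.5 * np.sin(2 * np.pi * 10 * t)
print(detect(eeg, [8.0, 10.0, 12.0, 15.0])[0])
```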

Research on the Optimization of the Facility Layout of Efficient Cafeterias for Troops

Background: A facility layout problem (FLP) is an NP-complete (non-deterministic polynomial) problem for which it is hard to obtain an exact optimal solution. FLPs have been widely studied in various limited spaces and workflows. For example, troop cafeterias containing many types of equipment can suffer from chaotic dining processes. Objective: This article aims to optimize the layout of a troop cafeteria and to improve the overall efficiency of the dining process. Methods: First, the original cafeteria layout design scheme was analyzed from an ergonomic perspective, and two new design schemes were generated. Next, three facility layout models were designed, and simulation was applied to compare the total time and density of troops across the schemes. Finally, an experiment on the dining process with video observation and analysis verified the simulation results. Results: In simulation, the dining time under the second new layout is shortened by 2.25% and 1.89% (p

Twitter Sentiment Analysis during the Lockdown in New Zealand

One of the most common fields of natural language processing (NLP) is sentiment analysis. The feeling conveyed in text can be successfully mined for various events using sentiment analysis. Twitter is viewed as a reliable data source for sentiment analytics studies, since people used social media to receive and exchange different types of data on a broad scale during the COVID-19 pandemic. Processing such data may aid in making critical decisions on how to keep the situation under control. The aim of this research is to examine how sentiment in a single geographic region differed between two lockdown periods. A total of 1162 tweets related to the COVID-19 pandemic lockdown were analyzed using the keyword hashtags (lockdown, COVID-19); the first sample of tweets was from March 23, 2020 to April 23, 2020, and the second sample, for the following year, was from March 1, 2021 to April 4, 2021. Natural language processing, a form of artificial intelligence, was used to calculate the sentiment value of all of the tweets using the AFINN lexicon sentiment analysis method. The findings revealed that sentiment during both lockdown periods was positive in the samples of this study, which are specific to the geographical area of New Zealand. This research suggests applying machine-learning-based sentiment methods such as Crystal Feel and extending the sample size by using more tweets over a longer period of time.
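
A minimal sketch of the AFINN lexicon scoring step is shown below, using the open-source `afinn` Python package; the example tweets are placeholders, not tweets from the collected samples.

```python
from afinn import Afinn

# Lexicon-based scoring: AFINN sums the valence of known words in each text.
afinn = Afinn()
tweets = [
    "Staying home and feeling grateful for our community support during lockdown",
    "So frustrated and anxious, this lockdown is exhausting",
]

scores = [afinn.score(t) for t in tweets]
labels = ["positive" if s > 0 else "negative" if s < 0 else "neutral" for s in scores]
print(list(zip(scores, labels)))
print("mean sentiment:", sum(scores) / len(scores))
```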

Performance Evaluation of a Millimeter-Wave Phased Array Antenna Using Circularly Polarized Elements

This paper focuses on the design of a mm-wave phased array. To date, linear polarization has been adopted in the reported designs of phased arrays. However, linear polarization faces several well-known challenges. As such, an advanced design for phased array antennas that offers circularly polarized (CP) radiation is required. A feasible solution for achieving CP phased array antennas is proposed using open-circular loop antennas. To this end, a 3-element circular loop phased array antenna is designed to operate at 28 GHz. In addition, the array's ability to control the direction of the main lobe is investigated. The results show that the highest achievable field of view (FOV) is 100°, i.e., 50° to the left and 50° to the right of broadside. The results are achieved with a CP bandwidth of 15%. Furthermore, the results demonstrate that a high broadside gain of circa 11 dBi can be achieved for the steered beam. In addition, a radiation efficiency of 97% can be achieved with the proposed design.
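
As a simplified illustration of the beam-steering principle (ignoring the CP loop element pattern and mutual coupling, which the full design accounts for), the sketch below computes the array factor of a 3-element linear array steered by a progressive phase shift; the half-wavelength spacing and steering angle are assumed example values.

```python
import numpy as np

n_elem, d_over_lambda = 3, 0.5
steer_deg = 30.0                                     # desired main-beam direction from broadside
theta = np.radians(np.linspace(-90, 90, 721))

k_d = 2 * np.pi * d_over_lambda
beta = -k_d * np.sin(np.radians(steer_deg))          # progressive phase shift per element
n = np.arange(n_elem)

# Array factor magnitude over the scan plane
af = np.abs(np.exp(1j * n[:, None] * (k_d * np.sin(theta) + beta)).sum(axis=0))

peak_deg = np.degrees(theta[np.argmax(af)])
print(f"main lobe points at {peak_deg:.1f} deg for a {steer_deg:.0f} deg steering command")
```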

Comparison of Composite Programming and Compromise Programming for Aircraft Selection Problem Using Multiple Criteria Decision Making Analysis Method

In this paper, composite programming and compromise programming are compared for the aircraft selection problem using a multiple criteria decision analysis method. The decision-making process requires the prior definition and fulfillment of certain factors, especially in complex areas such as aircraft selection. The proposed technique gives more efficient results by extending composite programming and compromise programming, which are widely used in modeling multiple criteria decisions. The proposed model is applied to a practical decision problem for evaluating and selecting aircraft. A selection of aircraft was made based on the proposed approach developed in the field of multiple criteria decision making. The presented model is solved using the following methods: composite programming and compromise programming. The importance values of the weight coefficients of the criteria are calculated using the mean weight method. The evaluation and ranking of aircraft are carried out using the composite programming and compromise programming methods. In order to determine the stability of the model and the applicability of the developed composite programming and compromise programming approach, the paper analyzes its sensitivity, which involves changing the values of the coefficients λ and q in the first part. The second part of the sensitivity analysis relates to the application of different multiple criteria decision making methods, composite programming and compromise programming. In addition, in the third part of the sensitivity analysis, the Spearman correlation coefficient of the obtained ranks was calculated, which confirms the applicability of all the proposed approaches.
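
For readers unfamiliar with the ranking step, the sketch below shows a generic compromise programming (Lp-metric) calculation with mean-weighted criteria; the decision matrix, criterion directions, and the value of the metric parameter are placeholder assumptions, not the aircraft data used in the paper.

```python
import numpy as np

decision = np.array([        # rows: aircraft alternatives, columns: criteria (placeholders)
    [820.0, 14.5, 4200.0, 7.2],
    [790.0, 15.8, 3900.0, 6.8],
    [850.0, 13.9, 4500.0, 7.5],
])
benefit = np.array([True, False, True, True])                 # assumed criterion directions
weights = np.full(decision.shape[1], 1 / decision.shape[1])   # mean weight method
p = 2.0                                                       # metric parameter (cf. lambda/q)

ideal = np.where(benefit, decision.max(axis=0), decision.min(axis=0))
anti  = np.where(benefit, decision.min(axis=0), decision.max(axis=0))
norm_dev = np.abs(ideal - decision) / np.abs(ideal - anti)    # normalized deviation from ideal

lp = np.sum((weights * norm_dev) ** p, axis=1) ** (1 / p)     # compromise programming distance
print("Lp distances:", lp.round(4), "-> best alternative:", int(np.argmin(lp)) + 1)
```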

Maximum Distance Separable b-Symbol Repeated-Root γ-Constacyclic Codes over a Finite Chain Ring of Length 2

Let p be a prime and let b be an integer. MDS b-symbol codes are a direct generalization of MDS codes. The γ-constacyclic codes of length pˢ over the finite commutative chain ring Fₚᵐ[u]/⟨u²⟩ have been classified into four distinct types, where γ is a nonzero element of the field Fₚᵐ. Let C₃ be a code of Type 3. In this paper, we obtain the b-symbol distance d_b(C₃) of the code C₃. Using this result, necessary and sufficient conditions under which C₃ is an MDS b-symbol code are given.
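
For context, the b-symbol distance and the Singleton-type bound that an MDS b-symbol code attains can be stated as follows (written here for a linear [n, k] code over a finite field; the chain-ring setting of the paper requires the appropriate analogue of the dimension):

```latex
% b-symbol read vector of x = (x_0, \dots, x_{n-1}), indices taken modulo n
\pi_b(x) = \bigl( (x_0,\dots,x_{b-1}),\ (x_1,\dots,x_b),\ \dots,\ (x_{n-1},x_0,\dots,x_{b-2}) \bigr)

% b-symbol distance: the Hamming distance between the read vectors
d_b(x,y) = d_H\bigl(\pi_b(x), \pi_b(y)\bigr), \qquad
d_b(C) = \min_{\substack{x,y \in C \\ x \neq y}} d_b(x,y)

% Singleton-type bound for a linear [n,k] code with b \le d_H(C);
% a code attaining it with equality is an MDS b-symbol code
d_b(C) \le n - k + b
```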

Simulation with Uncertainties of Active Controlled Vibration Isolation System for Astronaut’s Exercise Platform

In a task to assist NASA in analyzing the dynamic forces caused by operational countermeasures of an astronaut's exercise platform impacting the spacecraft, an active proportional-integral-derivative (PID) controller commanding a linear actuator is proposed in a vibration isolation system to regulate the movement of the exercise platform. Computer simulation shows promising results: most exciter forces can be reduced or even eliminated. This paper emphasizes parameter uncertainties and variations as well as exciter force variations. Drift and variations of system parameters in the vibration isolation system for the astronaut's exercise platform are analyzed. An active control scheme is applied with the goals of reducing the platform displacement and minimizing the force transmitted to the spacecraft structure. The controller must be robust enough to accommodate the wide variations of system parameters and exciter forces. Computer simulation of the vibration isolation system was performed in MATLAB/Simulink and Trick. The simulation results demonstrate that force reduction is achieved with small platform displacement under wide ranges of variation in the system parameters.
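
A minimal sketch of the control idea is given below: a PID-commanded actuator force regulating the displacement of a single-mass platform under a sinusoidal exciter force. The mass, stiffness, damping, gains, and excitation are assumed example values, not the NASA exercise-platform model.

```python
import numpy as np

m, k, c = 100.0, 2.0e4, 300.0          # assumed platform mass, stiffness, damping
Kp, Ki, Kd = 5.0e4, 1.0e4, 2.0e3       # assumed PID gains
dt, T = 1e-3, 5.0
f_exc = lambda t: 400.0 * np.sin(2 * np.pi * 1.5 * t)   # assumed exciter force [N]

x = v = integ = prev_err = 0.0
for step in range(int(T / dt)):
    err = 0.0 - x                       # regulate platform displacement toward zero
    integ += err * dt
    deriv = (err - prev_err) / dt
    u = Kp * err + Ki * integ + Kd * deriv   # actuator force command
    prev_err = err
    a = (f_exc(step * dt) + u - c * v - k * x) / m
    v += a * dt
    x += v * dt

print(f"final displacement: {x:.3e} m")
```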

Strongly Coupled Finite Element Formulation of Electromechanical Systems with Integrated Mesh Morphing using Radial Basis Functions

The paper introduces a method to efficiently simulate the nonlinearly changing electrostatic fields occurring in micro-electromechanical systems (MEMS). Large deflections of the capacitor electrodes usually introduce nonlinear electromechanical forces on the mechanical system. Traditional finite element methods require a time-consuming remeshing process to capture accurate results for this physical domain interaction. In order to accelerate the simulation process and eliminate the remeshing step, a formulation of a strongly coupled electromechanical transducer element is introduced that combines finite elements with an advanced mesh morphing technique using radial basis functions (RBF). The RBF allows large geometrical changes of the electric field domain while retaining high element quality of the deformed mesh. Coupling effects between the mechanical and electrical domains are directly included within the element formulation. Fringing field effects are described accurately by using traditional arbitrary shape functions.
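
The sketch below illustrates the mesh morphing idea on its own (outside the coupled element formulation): displacements prescribed at control points on a moving electrode are interpolated to the remaining mesh nodes with an RBF. The Gaussian kernel and the node coordinates are assumptions for illustration.

```python
import numpy as np

def rbf(r, eps=1.0):
    return np.exp(-(eps * r) ** 2)          # assumed Gaussian kernel

def morph(control_pts, control_disp, nodes, eps=1.0):
    # Solve for RBF weights from the prescribed control-point displacements
    d = np.linalg.norm(control_pts[:, None, :] - control_pts[None, :, :], axis=-1)
    weights = np.linalg.solve(rbf(d, eps), control_disp)
    # Evaluate the interpolant at all remaining mesh nodes
    d_nodes = np.linalg.norm(nodes[:, None, :] - control_pts[None, :, :], axis=-1)
    return nodes + rbf(d_nodes, eps) @ weights

# Example: move the upper electrode of a 2-D capacitor gap downward by 0.1
control_pts  = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
control_disp = np.array([[0.0, 0.0], [0.0, 0.0], [0.0, -0.1], [0.0, -0.1]])
nodes = np.array([[0.5, 0.25], [0.5, 0.5], [0.5, 0.75]])
print(morph(control_pts, control_disp, nodes))
```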

Neural Network Supervisory Proportional-Integral-Derivative Control of the Pressurized Water Reactor Core Power Load Following Operation

This work presents a particle swarm optimization trained neural network (PSO-NN) supervisory proportional-integral-derivative (PID) control method to regulate the pressurized water reactor (PWR) core power for safe operation. The proposed control approach is implemented on the transfer function of the PWR core, which is computed from the state-space model. The PWR core state-space model is derived from the neutronics, thermal-hydraulics, and reactivity models using perturbation around the equilibrium value. The proposed control approach computes the control rod speed to maneuver the core power so that it tracks the reference in a closed-loop scheme. The particle swarm optimization (PSO) algorithm is used to train the neural network (NN) and to tune the PID simultaneously. The controller performance is examined using the integral absolute error, integral time absolute error, integral square error, and integral time square error functions, and the stability of the system is analyzed using the Bode diagram. The simulation results indicate that the controller controls and tracks the load power more effectively and smoothly than the PSO-PID control technique. This study will benefit the design of supervisory controllers for control applications in nuclear engineering research.
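
To illustrate the tuning idea in isolation, the sketch below uses PSO to search PID gains that minimize the integral absolute error for a simple first-order plant tracking a step reference; the plant, swarm settings, and gain bounds are assumptions and do not represent the PWR core model or the supervisory NN layer.

```python
import numpy as np

def iae(gains, dt=0.01, T=10.0, tau=2.0):
    # Integral absolute error of a PID loop around an assumed plant tau*dy/dt + y = u
    Kp, Ki, Kd = gains
    y = integ = prev_err = cost = 0.0
    for _ in range(int(T / dt)):
        err = 1.0 - y
        integ += err * dt
        u = Kp * err + Ki * integ + Kd * (err - prev_err) / dt
        prev_err = err
        y += dt * (-y + u) / tau
        cost += abs(err) * dt
    return cost

rng = np.random.default_rng(0)
n, dims = 20, 3
pos = rng.uniform(0.0, 10.0, (n, dims))                 # particles = candidate (Kp, Ki, Kd)
vel = np.zeros((n, dims))
pbest, pbest_cost = pos.copy(), np.array([iae(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(50):
    r1, r2 = rng.random((n, dims)), rng.random((n, dims))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 10.0)
    cost = np.array([iae(p) for p in pos])
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("best PID gains (Kp, Ki, Kd):", gbest)
```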

Automatic Classification of the Stand-to-Sit Phase in the TUG Test Using Machine Learning

Over the past several years, researchers have shown great interest in assessing the mobility of elderly people to measure their functional status. Usually, such an assessment is done by conducting tests that require the subject to walk a certain distance, turn around, and finally sit back down. Accordingly, this study aims to provide an at-home monitoring system that assesses the patient's status continuously; to that end, we propose a technique to automatically detect when a subject sits down while walking at home. In this study, we utilized a Doppler radar system to capture the motion of the subjects. More than 20 features were extracted from the radar signals, out of which 11 were chosen based on their Intraclass Correlation Coefficient (ICC > 0.75). A sequential floating forward selection wrapper was then applied to further narrow down the final feature vector. Finally, five features were introduced to a Linear Discriminant Analysis classifier, and an accuracy of 93.75% was achieved, with a precision and recall of 95% and 90%, respectively.
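
A minimal sketch of the selection-plus-classification stage is shown below using scikit-learn; it uses plain sequential forward selection as an approximation of the floating variant, and the feature matrix and labels are random placeholders rather than the radar features.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# X would hold the 11 ICC-screened radar features; y labels stand-to-sit vs. other segments.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 11))                       # placeholder feature matrix
y = rng.integers(0, 2, size=200)                     # placeholder labels

selector = SequentialFeatureSelector(
    LinearDiscriminantAnalysis(), n_features_to_select=5, direction="forward", cv=5
)
model = make_pipeline(selector, LinearDiscriminantAnalysis())
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print("cross-validated accuracy:", scores.mean())
```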

Cirrhosis Mortality Prediction as Classification Using Frequent Subgraph Mining

In this work, we use machine learning and data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis were collected at a single medical center. Different machine learning models are applied to predict one-year mortality. A comprehensive feature space including demographic information, comorbidities, clinical procedures, and laboratory tests is analyzed. A temporal pattern mining technique called Frequent Subgraph Mining (FSM) is used. The Model for End-Stage Liver Disease (MELD) prediction of mortality is used as a comparator. All of our models statistically significantly outperform the MELD-score model and show an average 10% improvement in the area under the curve (AUC). The FSM technique by itself does not yield a significant improvement, because of the sparsity of the temporal information it requires; however, FSM combined with an ensemble machine learning technique further improves the model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk of higher mortality. Our work applies modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients and builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.
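
The sketch below mirrors the evaluation setup, comparing the AUC of individual classifiers against a soft-voting ensemble; it runs on synthetic data, not the cirrhosis cohort, and omits the FSM-derived features and the MELD comparator.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic, imbalanced stand-in for the patient feature matrix
X, y = make_classification(n_samples=2000, n_features=40, weights=[0.85], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "logistic": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}
models["ensemble"] = VotingClassifier(list(models.items()), voting="soft")

for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```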

Titanium Dioxide Modified with Glutathione as Potential Drug Carrier with Reduced Toxic Properties

The paper presents a process to obtain glutathione-modified titanium oxide nanoparticles. The processes were carried out in a microwave radiation field. The influence of the molar ratio of glutathione to titanium oxide, and of the excess of NaOH relative to the stoichiometric amount, on the size of the formed TiO2 nanoparticles was determined. The physicochemical properties of the obtained products were evaluated using dynamic light scattering (DLS), transmission electron microscopy with energy-dispersive X-ray spectroscopy (TEM-EDS), the low-temperature nitrogen adsorption method (BET), X-ray diffraction (XRD), and Fourier-transform infrared spectroscopy (FTIR). The size of the TiO2 nanoparticles ranged from 30 nm to 336 nm. The release of titanium ions from the prepared products was evaluated. These studies were carried out using different media in which the powders were incubated for a specific time: water, SBF, and Ringer's solution. The release of titanium ions from the modified products is lower than from unmodified titanium oxide nanoparticles. The reduced release of titanium ions may allow the use of such modified materials as carriers in drug delivery systems.

Incorporating Lexical-Semantic Knowledge into Convolutional Neural Network Framework for Pediatric Disease Diagnosis

The utilization of electronic medical record (EMR) data to establish disease diagnosis models has become an important research topic in biomedical informatics. Deep learning can automatically extract features from massive data, which has brought about breakthroughs in the study of EMR data. The challenge is that deep learning lacks semantic knowledge, which limits its practicality in medical science. This research proposes a method of incorporating lexical-semantic knowledge from abundant entities into a convolutional neural network (CNN) framework for pediatric disease diagnosis. Firstly, medical terms are vectorized into Lexical Semantic Vectors (LSV), which are concatenated with the embedded word vectors of word2vec to enrich the feature representation. Secondly, the semantic distribution of medical terms serves as a Semantic Decision Guide (SDG) for the optimization of the deep learning models. The study evaluates the performance of the LSV-SDG-CNN model on four kinds of Chinese EMR datasets. Additionally, CNN, LSV-CNN, and SDG-CNN are designed as baseline models for comparison. The experimental results show that the LSV-SDG-CNN model outperforms the baseline models on all four datasets. The best configuration of the model yielded an F1 score of 86.20%. The results clearly demonstrate that the CNN has been effectively guided and optimized by lexical-semantic knowledge, and the LSV-SDG-CNN model improves the disease classification accuracy by a clear margin.
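
The sketch below illustrates the feature-enrichment idea: per-token lexical-semantic vectors are concatenated with word embeddings before a 1-D convolutional text classifier. The architecture, dimensions, and random inputs are assumptions for illustration, and the SDG guidance step is omitted.

```python
import torch
import torch.nn as nn

class LSVCNN(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, lsv_dim=32, n_classes=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)   # would be initialized from word2vec
        self.conv = nn.Conv1d(emb_dim + lsv_dim, 100, kernel_size=3, padding=1)
        self.fc = nn.Linear(100, n_classes)

    def forward(self, tokens, lsv):
        # tokens: (batch, seq_len) token ids; lsv: (batch, seq_len, lsv_dim) semantic vectors
        x = torch.cat([self.embed(tokens), lsv], dim=-1)  # enrich the representation
        x = torch.relu(self.conv(x.transpose(1, 2)))      # (batch, channels, seq_len)
        x = torch.amax(x, dim=-1)                         # global max pooling
        return self.fc(x)

model = LSVCNN(vocab_size=5000)
logits = model(torch.randint(0, 5000, (8, 50)), torch.randn(8, 50, 32))
print(logits.shape)   # torch.Size([8, 4])
```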

An Approach to Capture, Evaluate and Handle Complexity of Engineering Change Occurrences in New Product Development

This paper presents the conception that complex problems do not necessarily need similarly complex solutions in order to cope with the complexity; rather, a simple solution based on established methods can provide a sufficient way of dealing with it. To verify this conception, the paper focuses on the field of change management as part of the new product development process in the automotive sector. Dealing with increasing complexity is essential in this field, yet only inflexible, rigid processes that are not designed to handle complexity are available. The basic methodology of this paper can be divided into four main sections: 1) analyzing the complexity of change management, 2) a literature review to identify potential solutions and methods, 3) capturing and implementing the expertise of change management experts from an automobile manufacturing company, and 4) a systematic comparison of the methods identified in the literature and connecting them with the defined requirements of the complexity of change management in order to develop a solution. As a practical outcome, this paper provides a method to capture the complexity of engineering changes (EC) and to include it within the EC evaluation process, followed by case-related process guidance to cope with the complexity. Furthermore, this approach supports the conception that dealing with complexity is possible using rather simple and established methods by combining them into a powerful tool.

Farming Production in Brazil: Innovation and Land-Sparing Effect

Innovation and technology can be determinant factors in ensuring sustainable agricultural growth, as well as productivity gains. Technical change has contributed considerably to the expansion of agricultural supply in Brazil. This agricultural growth could be achieved by incorporating more land or more capital; if capital is the main source of agricultural growth, it is possible to increase production per unit of land. The objective of this paper is to estimate: 1) total factor productivity (TFP), which is measured as the ratio of output to input; and 2) the land-saving effect (LSE), which is the amount of land that would have been required had the yield rate remained constant over time. According to this study, from 1990 to 2019, 87% of Brazilian agricultural output growth came from productivity gains; the remaining 13% came from input growth. In the same period, the total LSE was roughly 400 Mha, which corresponds to 47% of the national territory. These effects reflect the greater efficiency in the use of productive factors; technical change has allowed agricultural production to increase based on productivity gains.
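
The two indicators can be computed as sketched below; the numbers are stylized examples, not the Brazilian series used in the paper.

```python
# TFP is an output index divided by an input index; the land-saving effect (LSE) is the
# land that would have been needed to produce current output at the base-year yield,
# minus the land actually used.
output_index = {1990: 100.0, 2019: 280.0}      # assumed aggregate output index
input_index  = {1990: 100.0, 2019: 120.0}      # assumed aggregate input index
production   = {1990: 150.0, 2019: 450.0}      # assumed output, million tonnes
area         = {1990: 50.0,  2019: 65.0}       # assumed harvested area, Mha

tfp_growth = (output_index[2019] / input_index[2019]) / (output_index[1990] / input_index[1990]) - 1
base_yield = production[1990] / area[1990]                 # tonnes per ha in the base year
lse = production[2019] / base_yield - area[2019]           # Mha of land "spared"

print(f"TFP growth 1990-2019: {tfp_growth:.1%}")
print(f"Land-saving effect:   {lse:.1f} Mha")
```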

Assessing the Theoretical Suitability of Sentinel-2 and WorldView-3 Data for Hydrocarbon Mapping of Spill Events, Using HYSS

Identification of hydrocarbon oil in remote sensing images is often the first step in monitoring oil during spill events. Most remote sensing methods adopt hydrocarbon identification techniques to achieve detection and thereby inform an appropriate cleanup program. Identification with optical sensors allows not only detection but also characterization and quantification. Until recently, in optical remote sensing, quantification and characterization were only potentially possible using high-resolution laboratory and airborne imaging spectrometers (hyperspectral data). Unlike multispectral data, hyperspectral data are not freely available, as this data category is at present mainly obtained via airborne surveys. In this research, two operational high-resolution multispectral satellites (WorldView-3 and Sentinel-2) are theoretically assessed for their suitability for hydrocarbon characterization using the Hydrocarbon Spectra Slope model (HYSS). This method utilizes the two most persistent hydrocarbon diagnostic/absorption features, at 1.73 µm and 2.30 µm, for hydrocarbon mapping on multispectral data. Spectral measurements of seven different hydrocarbon oils (crude and refined) taken on 10 different substrates with a laboratory ASD FieldSpec spectrometer were convolved to the Sentinel-2 and WorldView-3 resolutions using their full width at half maximum (FWHM) parameters. The resulting hydrocarbon slope values obtained from the studied samples enable clear qualitative discrimination of most hydrocarbons, despite the presence of different background substrates, particularly on WorldView-3. Due to the close conformity of the central wavelengths and narrow bandwidths to the key hydrocarbon bands used in HYSS, the qualitative analysis on the WorldView-3 sensor was statistically significant at the 95% confidence level (p-value < 0.01) for all studied hydrocarbon oils except diesel. Using multifactor analysis of variance (MANOVA), the discriminating power of HYSS is statistically significant for most hydrocarbon-substrate combinations on the Sentinel-2 and WorldView-3 FWHM, revealing the potential of these two operational multispectral sensors as rapid response tools for hydrocarbon mapping. One notable exception is highly transmissive hydrocarbons on Sentinel-2 data, due to the non-conformity of its spectral bands with the key hydrocarbon absorptions and its relatively coarse bandwidth (> 100 nm).
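
The sketch below illustrates the band-convolution step: a laboratory spectrum is resampled to a broad sensor band through a Gaussian spectral response built from the band centre and FWHM, and a slope is then taken between bands near the 1.73 µm and 2.30 µm features. The band centres, FWHM values, and the synthetic spectrum are assumptions, and the exact HYSS formulation may differ.

```python
import numpy as np

def band_response(wl, centre, fwhm):
    # Gaussian spectral response function normalized to unit area
    sigma = fwhm / (2 * np.sqrt(2 * np.log(2)))
    srf = np.exp(-0.5 * ((wl - centre) / sigma) ** 2)
    return srf / srf.sum()

def to_band(wl, reflectance, centre, fwhm):
    return np.sum(reflectance * band_response(wl, centre, fwhm))

wl = np.arange(1.4, 2.5, 0.001)                               # wavelength grid, micrometres
reflectance = 0.4 - 0.1 * np.exp(-((wl - 1.73) / 0.02) ** 2) \
                  - 0.1 * np.exp(-((wl - 2.30) / 0.02) ** 2)  # synthetic oil-like spectrum

# Assumed example band centres/FWHM near the 1.73 and 2.30 micrometre features
r_1730 = to_band(wl, reflectance, centre=1.730, fwhm=0.040)
r_2300 = to_band(wl, reflectance, centre=2.300, fwhm=0.040)
slope = (r_2300 - r_1730) / (2.300 - 1.730)
print(f"band reflectances: {r_1730:.3f}, {r_2300:.3f}; slope: {slope:.3f}")
```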

A Simulation Tool for Projection Mapping Based on Mapbox and Unity

A simulation tool is proposed for large-scale projection mapping events. The tool has four main functions based on Mapbox and Unity utilities. The first function is building three-dimensional models of real cities using Mapbox. The second function is projecting movies onto buildings in real cities using Unity. The third is sending a movie from a PC to a virtual projector. The fourth function is mapping movies so that they fit the buildings. The simulation tool was applied to a real projection mapping event held in 2019. The event was completed, but it faced a severe problem with the movie projection onto the target building: extra tents were set up in front of the target building and became obstacles to the projection. The simulation tool developed herein could reconstruct the problems of the event. Therefore, if the simulation tool had been available before the 2019 projection mapping event, the problem of the tents acting as obstacles could have been avoided by using the tool. Moreover, we confirmed that the simulation tool is useful for planning future projection mapping events so as to avoid various obstacles from extra equipment, such as utility poles, planted trees, and monument towers.

OILU Tag: A Projective Invariant Fiducial System

This paper presents the development of a 2D visual marker derived from recent patented work in the field of numbering systems. The proposed fiducial uses a group of projective invariant straight-line patterns that are easily detectable and remotely recognizable. Based on an efficient data coding scheme, the developed marker enables the production of a large set of unique real-time identifiers with highly distinguishable patterns. The proposed marker incorporates decimal and binary information simultaneously, making it readable by both humans and machines. This important feature opens up new opportunities for the development of efficient visual human-machine communication and monitoring protocols. Extensive experimental tests validate the robustness of the marker against acquisition and geometric distortions.
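
As background on projective invariance (illustrative only; this is not the patented OILU encoding), the sketch below shows the classic invariant of four collinear points, the cross-ratio, which is preserved under an arbitrary homography such as a camera's perspective view of a planar marker.

```python
import numpy as np

def cross_ratio(p1, p2, p3, p4):
    # Cross-ratio of four collinear points, computed from pairwise distances
    d = lambda a, b: np.linalg.norm(np.asarray(a) - np.asarray(b))
    return (d(p1, p3) * d(p2, p4)) / (d(p1, p4) * d(p2, p3))

def project(H, pts):
    # Apply a homography to 2-D points given in inhomogeneous coordinates
    pts_h = np.c_[pts, np.ones(len(pts))] @ H.T
    return pts_h[:, :2] / pts_h[:, 2:]

pts = np.array([[0.0, 0.0], [1.0, 0.0], [3.0, 0.0], [7.0, 0.0]])        # four collinear points
H = np.array([[1.2, 0.1, 3.0], [0.2, 0.9, -1.0], [0.001, 0.002, 1.0]])  # arbitrary homography

print(cross_ratio(*pts))              # cross-ratio before projection
print(cross_ratio(*project(H, pts)))  # (approximately) the same after projection
```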