Topic Modeling Using Latent Dirichlet Allocation and Latent Semantic Indexing on South African Telco Twitter Data

Twitter is one of the most popular social media platforms where users share their opinions on different subjects. Twitter can be considered a rich source for text mining due to the high volumes of data generated on the platform daily. Many industries, such as telecommunication companies, can leverage the availability of Twitter data to better understand their markets and make appropriate business decisions. This study performs topic modeling on Twitter data using Latent Dirichlet Allocation (LDA). The obtained results are benchmarked against another topic modeling technique, Latent Semantic Indexing (LSI). The study aims to retrieve topics from a Twitter dataset containing user tweets about South African Telcos. Results from this study show that LSI is much faster than LDA; however, LDA yields better results, with a topic coherence 8% higher for the best-performing model in this experiment. A higher topic coherence score indicates better model performance.
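
A minimal sketch of the comparison described above, using gensim to fit LDA and LSI models on tokenized tweets and score them with c_v topic coherence; the example tweets, topic count, and parameters are illustrative assumptions, not the study's actual data or configuration.

```python
# Sketch only: compare LDA and LSI topic coherence on a tiny, made-up tweet corpus.
from gensim.corpora import Dictionary
from gensim.models import LdaModel, LsiModel
from gensim.models.coherencemodel import CoherenceModel

# Hypothetical pre-processed tweets (tokenized, stop words removed).
tweets = [
    ["network", "down", "data", "slow", "signal"],
    ["data", "bundle", "price", "increase", "network"],
    ["customer", "service", "call", "centre", "wait"],
    ["call", "centre", "service", "slow", "response"],
]

dictionary = Dictionary(tweets)
corpus = [dictionary.doc2bow(t) for t in tweets]

lda = LdaModel(corpus, id2word=dictionary, num_topics=2, random_state=0)
lsi = LsiModel(corpus, id2word=dictionary, num_topics=2)

for name, model in [("LDA", lda), ("LSI", lsi)]:
    cm = CoherenceModel(model=model, texts=tweets,
                        dictionary=dictionary, coherence="c_v")
    print(name, "c_v coherence:", cm.get_coherence())
```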

A Multi-Feature Deep Learning Algorithm for Urban Traffic Classification with Limited Labeled Data

Acoustic sensors, if embedded in smart street lights, can help capture the activities (car honking, sirens, events, traffic, etc.) in cities. The acoustic data from such scenarios are complex because multiple audio streams originate from different events, and when the mixture is decomposed into independent signals, the amount of retrieved data is too small to train deep neural networks. In this paper, we address two challenges: a) separating the mixed signals, and b) developing an efficient acoustic classifier under data paucity. To address these challenges, we propose a supervised deep learning architecture in which the captured mixed acoustic data are first analyzed with the Fast Fourier Transform (FFT), the noise is then filtered from the signal, and the result is decomposed into independent signals by fast independent component analysis (FastICA). To address the challenge of data paucity, we propose a multi-feature-based deep neural network whose high performance is reflected in our experiments when compared to a conventional convolutional neural network (CNN) and multi-layer perceptron (MLP).
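
The following sketch illustrates the pre-processing chain described above (FFT-based noise filtering followed by FastICA source separation) under stated assumptions; the synthetic signals, mixing matrix, and cut-off frequency are placeholders, not the authors' values.

```python
# Sketch only: FFT low-pass filtering followed by FastICA on a synthetic two-source mixture.
import numpy as np
from sklearn.decomposition import FastICA

fs = 16000                        # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1.0 / fs)
honk = np.sign(np.sin(2 * np.pi * 440 * t))                              # stand-in for car honking
siren = np.sin(2 * np.pi * 700 * t * (1 + 0.3 * np.sin(2 * np.pi * 2 * t)))  # stand-in for a siren
mixed = np.c_[honk, siren] @ np.array([[0.6, 0.4], [0.35, 0.65]])        # two "microphone" channels

# Simple FFT low-pass filter to suppress high-frequency noise (cut-off assumed).
spectrum = np.fft.rfft(mixed, axis=0)
freqs = np.fft.rfftfreq(mixed.shape[0], d=1.0 / fs)
spectrum[freqs > 2000] = 0
filtered = np.fft.irfft(spectrum, n=mixed.shape[0], axis=0)

# Decompose the filtered mixture into independent signals.
ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(filtered)
print(sources.shape)              # (samples, 2) estimated independent signals
```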

Public-Private Partnership Transportation Projects: An Exploratory Study

When public transportation projects were delivered through design-bid-build and later design-build, governments faced a serious issue: inadequate funding. With population growth, governments began to develop new arrangements in which the private sector was involved to ease the financial burden. This arrangement, the Public-Private Partnership (PPP), has its own risks; however, performance outputs can motivate or discourage its use. Chief among such outputs are time and budget, which can be affected by the project delivery method. Project completion within or ahead of schedule, as well as within or under budget, is among any owner's objectives. Given the increasing application of PPP in the US highway industry and the insufficient research on its outcomes, the current study addresses the schedule and cost performance of PPP highway projects and determines which of the two is stronger. To meet this objective, after collecting performance data for the PPP projects, schedule growth and cost growth are calculated, and statistical analysis is conducted to evaluate PPP performance. The results show that PPP highway projects on average have saved time and cost; however, the main benefit is faster delivery rather than under-budget completion. This study provides better insight into PPP highway performance and assists practitioners in applying PPP to transportation projects with the opportunity to save time and cost.
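
As a rough illustration of the performance metrics mentioned above, the sketch below computes schedule growth and cost growth using the common definition (actual minus planned, divided by planned); the sample project values are purely hypothetical.

```python
# Sketch only: schedule and cost growth for illustrative, made-up PPP project records.
def growth(actual, planned):
    """Relative growth; negative values mean time or cost savings."""
    return (actual - planned) / planned

projects = [
    # (planned_duration_months, actual_duration_months, planned_cost_M, actual_cost_M)
    (36, 33, 450.0, 445.0),
    (48, 47, 800.0, 812.0),
]

for planned_d, actual_d, planned_c, actual_c in projects:
    print("schedule growth:", round(growth(actual_d, planned_d), 3),
          "cost growth:", round(growth(actual_c, planned_c), 3))
```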

Possibilities for Testing User Experience and User Interface Design on Mobile Devices

In an era when everything is increasingly digital, consumers are constantly looking for new solutions to their everyday needs. In this context, mobile apps are developing at an exponential pace. One of the fastest growing segments of mobile technologies is, obviously, e-commerce. It can be predicted that mobile commerce will record nearly three times the global growth of e-commerce across all platforms, which indicates its importance in this segment. The current coronavirus pandemic is also changing many existing paradigms socially, economically, and technologically, which has a major impact on consumer behavior and places new emphasis on the simplicity and clarity of mobile solutions. This is the area that User Experience (UX) and User Interface (UI) designers deal with. Their task is to design a solution that is sufficiently attractive and interesting, available on all mobile devices, and at the same time easy enough for the customer or visitor to reach their destination or obtain the necessary information in a few clicks. The basis for changes in UX design can now be obtained not only through online analytical tools but also through neuromarketing, especially in the case of mobile devices. The paper highlights possibilities for testing the UX design of applications on mobile devices using a special platform that combines a stationary eye camera (eye tracking) and facial analysis (facial coding).

Dielectric Recovery Characteristics of High Voltage Gas Circuit Breakers Operating with CO2 Mixture

CO₂-based gas mixtures exhibit considerable potential as an interruption medium to replace SF₆ in high voltage switchgear. In this paper, the dielectric strength recovery characteristics of a CO₂-O₂ mixture in the post-arc phase after current zero are presented. As representative examples, dielectric recovery curves under different gas filling pressures and short-circuit current amplitudes are presented. A series of dielectric recovery measurements suggests that the dielectric recovery rate is proportional to the mass flux of the blowing gas, and that the dielectric strength recovers faster at lower short-circuit currents.

Development of a Basic Robot System for Medical and Nursing Care for Patients with Glaucoma

Medical methods to completely cure glaucoma are yet to be developed; therefore, ophthalmologists manage patients mainly to delay disease progression. Patients with glaucoma are mainly elderly individuals. In elderly people's homes, equipment that can provide medical treatment and care can relieve their families of caregiving. To enable elderly people with glaucoma to live by themselves as much as possible, we developed a support robot with five functions: elderly care, ophthalmological examination, trip assistance to the neighborhood, medical treatment, and data referral to a hospital. The medical and nursing care robot should approach from within the visual field that the patient can still see, at a speed suited to their eyesight, because a robot approaching from outside the visible field would be dangerous. We experimentally developed a robot that brings a white cane to elderly people with glaucoma. The base of the robot is a carriage, a Megarover 1.1, fitted with two infrared sensors. The robot moves along a white line on the floor using the infrared sensors and has a special arm that does not use electricity; the arm can scoop up the block attached to the white cane. Next, we also developed a direction detector composed of a charge-coupled device camera (SVR41ResucueHD; Sun Mechatronics), goggles (MG-277MLF; Midori Anzen Co. Ltd.), and biconvex lenses with a focal length of 25 mm (Edmund Co.). Several young volunteers wore the direction detector on their faces and were photographed with it. Image processing was performed using Scilab 6.1.0 and the Image Processing and Computer Vision Toolbox 4.1.2. To measure a person's line of vision, we calculated the center of gravity of the iris using five processes: reduction, trimming, binarization or grayscale conversion, edge extraction, and the Hough transform. Comparing binarization and grayscale conversion, the binarization process performed better. For edge extraction, we compared five methods: Sobel, Prewitt, Laplacian of Gaussian, fast Fourier transform, and Canny; the Canny method was the optimal extraction method. We performed the Hough transform to search for the main coordinates of the iris's edge and found that it could calculate the center point of the iris.
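
As an illustration of the iris-localisation pipeline described above, the sketch below reproduces the reduction, trimming, binarization, Canny edge extraction, and Hough transform steps with OpenCV (the study itself used Scilab); the synthetic input image, region of interest, and parameter values are assumptions.

```python
# Sketch only: iris-centre estimation on a synthetic eye-like image.
import cv2
import numpy as np

# Synthetic stand-in for a captured eye frame: a dark iris disc on a brighter background.
frame = np.full((240, 320), 200, dtype=np.uint8)
cv2.circle(frame, (160, 120), 40, 60, -1)                     # "iris" of radius 40 at (160, 120)

small = cv2.resize(frame, None, fx=0.5, fy=0.5)               # reduction
roi = small[20:100, 40:120]                                   # trimming (assumed region)
_, binary = cv2.threshold(roi, 100, 255, cv2.THRESH_BINARY)   # binarization
edges = cv2.Canny(binary, 50, 150)                            # edge extraction (Canny)

# Hough transform for circles; HOUGH_GRADIENT performs its own internal edge
# detection, so the binarized ROI is passed here and the Canny map above is
# kept only to mirror the pipeline described in the abstract.
circles = cv2.HoughCircles(binary, cv2.HOUGH_GRADIENT, dp=1, minDist=40,
                           param1=120, param2=10, minRadius=10, maxRadius=40)
if circles is not None:
    x, y, r = np.round(circles[0, 0]).astype(int)
    print("estimated iris centre (ROI coordinates):", (x, y), "radius:", r)
```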

Engineering Topology of Photonic Systems for Sustainable Molecular Structure: Autopoiesis Systems

This paper introduces topological order in described social systems, starting with the original concept of autopoiesis developed by biologists and scientists, including the modification of general systems based on socialized medicine. Topological order is important in describing physical systems for exploiting optical systems and improving photonic devices. The states of topological order have interesting properties, such as topological degeneracy and fractional statistics, that reveal the entanglement origin of topological order. Topological ideas in photonics build on exciting developments in solid-state materials, which are insulating in the bulk yet conduct electricity on their surface without dissipation or back-scattering, even in the presence of large impurities. A specific type of autopoiesis system is interrelated with the main categories among existing groups of ecological phenomena at the interface of the social and medical sciences. The hypothesis, nevertheless, involves a nonlinear interaction with its natural environment, an 'interactional cycle' for exchanging photon energy with molecules without changes in topology (i.e., chemical transformation into products does not propagate any change or variation in the network topology of the physical configuration). The engineering topology of a biosensor is based on the excitation boundary of surface electromagnetic waves in photonic band gap multilayer films. The device operates similarly to surface plasmon biosensors, with a photonic band gap film replacing the metal film as the medium in which surface electromagnetic waves are excited. The use of a photonic band gap film offers sharper surface wave resonance, leading to the potential for greatly enhanced sensitivity. The properties of the photonic band gap material can thus be engineered to operate a sensor at any wavelength and to support a surface wave resonance at wavelengths ranging up to 470 nm, a range not generally accessible with surface plasmon sensing. Lastly, photonic band gap films have robust mechanical properties that offer new substrates for surface chemistry, making it possible to understand the molecular design structure and to create sensing chip surfaces with different concentrations of DNA sequences in solution, in order to observe and track the surface mode resonance under the influence of processes taking place in the spectroscopic environment. These processes have led to the development of several advanced analytical technologies that are automated, real-time, reliable, reproducible, and cost-effective, resulting in faster and more accurate monitoring and detection of biomolecules in refractive index sensing and in antibody–antigen reactions with DNA or protein binding. Ultimately, the controversial aspects of molecular frictional properties are adjusted to each other in order to form the unique spatial structure and dynamics of biological molecules, providing the environment's mutual contribution to investigating changes due to the pathogenic archival architecture of cell clusters.

Developing a Coronavirus Academic Paper Sorting Application

The COVID-19 Literature Summary App, now live on the university website, was created primarily to enable academicians and clinicians to quickly sort through the vast array of recent coronavirus publications by topics of interest. Multiple methods of summarizing and sorting the manuscripts were created. A summary page introduces the application's functions and capabilities, while an interactive map provides daily updates on infection, death, and recovery rates. A page with a pivot table allows publications to be sorted by topic, with an interactive data table that allows sorting topics by columns, as well as the capability to view abstracts. Additionally, publications may be sorted by the medical topics they cover. We used the CORD-19 database to compile the lists of publications. The data table can sort binary variables, allowing the user to pick desired publication topics, such as papers that describe COVID-19 symptoms. The application is primarily designed for use by researchers but can be used by anybody who wants a faster and more efficient means of locating papers of interest.
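
A minimal sketch (not the deployed application) of how binary topic flags in a publication table can drive the sorting described above; the column names and file name are assumptions.

```python
# Sketch only: filter a hypothetical pre-labelled CORD-19 export by a binary topic flag.
import pandas as pd

papers = pd.read_csv("cord19_topics.csv")   # hypothetical export with topic flag columns

# Keep papers flagged as describing COVID-19 symptoms, newest first.
symptoms = papers[papers["covers_symptoms"] == 1]
symptoms = symptoms.sort_values("publish_time", ascending=False)
print(symptoms[["title", "publish_time"]].head())
```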

Review and Evaluation of Trending Canonical Correlation Analyses-Based Brain-Computer Interface Methods

The fast development of technology that has advanced neuroscience and human interaction with computers has enabled solutions to various problems and issues of this new era. The Brain-Computer Interface (BCI) has opened the door to several new research areas and has provided solutions to critical and vital issues, such as enabling a paralyzed patient to interact with the outside world, controlling a robot arm, playing games in VR with the brain, and driving a wheelchair. This review presents the state-of-the-art methods and improvements of canonical correlation analysis (CCA), an SSVEP-based BCI method. These methods are used to extract EEG signal features, in other words, the features of interest that we are looking for in EEG analysis. Each method, from oldest to newest, is discussed, and their advantages and disadvantages are compared. This provides useful context and helps researchers understand the most state-of-the-art methods available in this field, their pros and cons, and their mathematical representations and usage. This work makes a vital contribution to the existing field of study. It differs from other similar recently published works by providing the following: (1) presenting most of the main methods used in this field in a hierarchical way, (2) explaining the pros and cons of each method and their performance, and (3) presenting the gaps that remain at the end of each method, which can improve understanding and open doors to new research or improvements.
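
For context, the sketch below shows the standard CCA-based SSVEP detection scheme that the reviewed methods build on: canonical correlations are computed between a multi-channel EEG segment and sinusoidal reference signals at each candidate stimulation frequency, and the frequency with the highest correlation is selected. The signals, frequencies, and window length are synthetic assumptions.

```python
# Sketch only: baseline CCA-based SSVEP frequency detection on synthetic EEG.
import numpy as np
from sklearn.cross_decomposition import CCA

fs, duration = 250, 2.0                  # assumed sampling rate (Hz) and window (s)
t = np.arange(0, duration, 1.0 / fs)
stim_freqs = [8.0, 10.0, 12.0]           # candidate stimulation frequencies

# Synthetic 8-channel EEG segment containing a 10 Hz SSVEP plus noise.
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10.0 * t)[:, None] + 0.5 * rng.standard_normal((t.size, 8))

def reference(f):
    """Sine/cosine reference set at the fundamental and second harmonic."""
    return np.column_stack([np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t),
                            np.sin(4 * np.pi * f * t), np.cos(4 * np.pi * f * t)])

scores = {}
for f in stim_freqs:
    cca = CCA(n_components=1)
    x_c, y_c = cca.fit_transform(eeg, reference(f))
    scores[f] = np.corrcoef(x_c.ravel(), y_c.ravel())[0, 1]

print(max(scores, key=scores.get), "Hz detected; correlations:", scores)
```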

Graph Codes-2D Projections of Multimedia Feature Graphs for Fast and Effective Retrieval

Multimedia Indexing and Retrieval is generally designed and implemented by employing feature graphs. These graphs typically contain a significant number of nodes and edges to reflect the level of detail in feature detection. A higher level of detail increases the effectiveness of the results but also leads to more complex graph structures. However, graph-traversal-based algorithms for similarity are quite inefficient and computation-intensive, especially for large data structures. To deliver fast and effective retrieval, an efficient similarity algorithm, particularly for large graphs, is mandatory. Hence, in this paper, we define a graph projection into a 2D space (Graph Code) as well as the corresponding algorithms for indexing and retrieval. We show that calculations in this space can be performed more efficiently than graph traversals due to a simpler processing model and a high level of parallelisation. As a consequence, we show that the effectiveness of retrieval also increases substantially, as Graph Codes facilitate more levels of detail in feature fusion. Thus, Graph Codes provide a significant increase in efficiency and effectiveness (especially for multimedia indexing and retrieval) and can be applied to images, videos, audio, and text information.
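
A toy illustration of the general idea, not the authors' exact encoding: a feature graph is projected onto a fixed vocabulary as a 2D matrix so that similarity becomes an element-wise matrix comparison rather than a graph traversal. The vocabulary, encoding scheme, and similarity measure here are assumptions.

```python
# Sketch only: encode tiny feature graphs as 2D matrices and compare them cell-wise.
import numpy as np

vocabulary = ["person", "car", "street", "holds", "near"]   # assumed feature terms
index = {term: i for i, term in enumerate(vocabulary)}

def graph_code(nodes, edges):
    """Encode node presence on the diagonal and edge relations off-diagonal."""
    m = np.zeros((len(vocabulary), len(vocabulary)), dtype=int)
    for n in nodes:
        m[index[n], index[n]] = 1
    for src, dst, relation in edges:
        m[index[src], index[dst]] = index[relation] + 2      # non-zero relation label
    return m

a = graph_code(["person", "car"], [("person", "car", "near")])
b = graph_code(["person", "street"], [("person", "street", "near")])

# Simple similarity: fraction of matching matrix cells (parallelisable, no traversal).
print("similarity:", np.mean(a == b))
```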

On the Algorithmic Iterative Solutions of Conjugate Gradient, Gauss-Seidel and Jacobi Methods for Solving Systems of Linear Equations

In this paper, efforts were made to examine and compare the algorithmic iterative solutions of the conjugate gradient method against other methods, namely the Gauss-Seidel and Jacobi approaches, for solving systems of linear equations of the form Ax = b, where A is a real n × n symmetric and positive definite matrix. We performed the algorithmic iterative steps and obtained analytical solutions for a typical 3 × 3 symmetric and positive definite matrix using the three methods described in this paper (the Gauss-Seidel, Jacobi, and Conjugate Gradient methods). From the results obtained, we found that the Conjugate Gradient method converges to the exact solution in fewer iterative steps than the other two methods, which required many more iterations and much more time while only tending toward the exact solution.
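
A compact sketch of the three iterative schemes compared above, applied to an illustrative 3 × 3 symmetric positive definite system; the specific matrix, right-hand side, and iteration counts are assumptions rather than the values used in the paper.

```python
# Sketch only: Jacobi, Gauss-Seidel, and Conjugate Gradient on an illustrative SPD system.
import numpy as np

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 3.0, 0.0],
              [1.0, 0.0, 2.0]])   # symmetric, diagonally dominant, hence positive definite
b = np.array([6.0, 4.0, 3.0])

def jacobi(A, b, x0, iters=50):
    D = np.diag(A)
    R = A - np.diagflat(D)
    x = x0.copy()
    for _ in range(iters):
        x = (b - R @ x) / D          # update all components from the previous iterate
    return x

def gauss_seidel(A, b, x0, iters=50):
    x = x0.copy()
    for _ in range(iters):
        for i in range(len(b)):      # use already-updated components within the sweep
            x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

def conjugate_gradient(A, b, x0, iters=3):
    # For an n x n SPD matrix, CG converges in at most n steps in exact arithmetic.
    x = x0.copy()
    r = b - A @ x
    p = r.copy()
    for _ in range(iters):
        alpha = (r @ r) / (p @ A @ p)
        x = x + alpha * p
        r_new = r - alpha * (A @ p)
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return x

x0 = np.zeros(3)
for name, sol in [("Jacobi", jacobi(A, b, x0)),
                  ("Gauss-Seidel", gauss_seidel(A, b, x0)),
                  ("Conjugate Gradient", conjugate_gradient(A, b, x0))]:
    print(name, sol, "residual:", np.linalg.norm(A @ sol - b))
```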

Efficient Pre-Processing of Single-Cell Assay for Transposase Accessible Chromatin with High-Throughput Sequencing Data

The primary tool currently used to pre-process 10X Chromium single-cell ATAC-seq data is Cell Ranger, which can take a very long time to run on standard datasets. To facilitate rapid pre-processing that enables reproducible workflows, we present a suite of tools called scATAK for pre-processing single-cell ATAC-seq data that is 15 to 18 times faster than Cell Ranger on mouse and human samples. Our tool can also calculate chromatin interaction potential matrices and generate open chromatin signal and interaction traces for cell groups. We use the scATAK tool to explore the chromatin regulatory landscape of a healthy adult human brain and unveil cell-type-specific features, and we show that it provides a convenient and computationally efficient approach for pre-processing single-cell ATAC-seq data.

Relationship between Hepatokines and Insulin Resistance in Childhood Obesity

Childhood obesity is an important clinical problem because it may lead to chronic diseases during adulthood. Obesity is a metabolic disease associated with low-grade inflammation. The liver lies at the center of metabolic pathways. Adropin, fibroblast growth factor-21 (FGF-21), and fetuin A are hepatokines. Because of the liver's central role in glucose metabolism, these liver-derived factors may be associated with insulin resistance (IR), a phenomenon discussed within the scope of obesity. The aim of this study is to determine the concentrations of adropin, FGF-21, and fetuin A in childhood obesity, to point out possible differences between the obesity groups, and to investigate possible associations among these three hepatokines in obese and morbidly obese children. A total of 132 children were included in the study. Two obese groups were constituted and matched in terms of age (mean ± SD). Body mass index values of the obese and morbidly obese groups were 25.0±3.5 kg/m2 and 29.8±5.7 kg/m2, respectively. Anthropometric measurements including waist circumference, hip circumference, head circumference, and neck circumference were recorded. Informed consent forms were obtained from the parents of the participants, and the Ethics Committee of the institution approved the study protocol. Blood samples were obtained after an overnight fast. Routine biochemical tests including glucose- and lipid-related parameters were performed. Concentrations of the hepatokines (adropin, FGF-21, fetuin A) were determined by enzyme-linked immunosorbent assay. Insulin resistance indices such as the homeostasis model assessment for IR (HOMA-IR), the alanine transaminase-to-aspartate transaminase ratio (ALT/AST), the diagnostic obesity notation model assessment laboratory index, and the diagnostic obesity notation model assessment metabolic syndrome index, as well as obesity indices such as the diagnostic obesity notation model assessment-II index and the fat mass index, were calculated using previously derived formulas. Statistical evaluation of the study data was performed with SPSS for Windows, and differences were accepted as statistically significant when p < 0.05. Statistically significant differences were found for the insulin, triglyceride, and high-density lipoprotein cholesterol levels of the groups. A significant increase was observed for FGF-21 concentrations in the morbidly obese group. Higher adropin and fetuin A concentrations were also observed in this group in comparison with the values detected in the obese group (p > 0.05). There was no statistically significant difference between the ALT/AST values of the groups. For all of the remaining IR and obesity indices, significantly increased values were calculated for morbidly obese children. Significant correlations were detected between HOMA-IR and each of the hepatokines; the strongest was the association with fetuin A (r = 0.373, p = 0.001). In conclusion, the increased levels observed for adropin, FGF-21, and fetuin A show that these hepatokines tend to rise going from the obese to the morbidly obese state. Among the correlations with the IR index, the most strongly associated hepatokine was fetuin A, a parameter that may serve as an indicator of the advanced obesity stage.
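
For reference, the HOMA-IR index mentioned above is commonly computed from fasting insulin and glucose as follows; the study cites previously derived formulas, and this standard form is assumed to match them.

```latex
% Standard HOMA-IR definition (stated for reference; assumed to match the
% "previously derived formulas" the study refers to).
\[
\text{HOMA-IR} = \frac{\text{fasting insulin}\ (\mu\mathrm{U/mL}) \times \text{fasting glucose}\ (\mathrm{mg/dL})}{405}
               = \frac{\text{fasting insulin}\ (\mu\mathrm{U/mL}) \times \text{fasting glucose}\ (\mathrm{mmol/L})}{22.5}
\]
```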

Spexin and Fetuin A in Morbid Obese Children

Spexin, expressed in the central nervous system, has attracted much interest for its roles in feeding behavior, obesity, diabetes, energy metabolism, and cardiovascular function. Fetuin A is known as a negative acute-phase reactant synthesized in the liver. Eosinophils are early indicators of cardiometabolic complications. Patients with an elevated platelet count, associated with a hypercoagulable state in the body, are also more liable to cardiovascular diseases (CVDs). The aim of this study is to examine the profiles of spexin and fetuin A alongside the variations detected in eosinophil and platelet counts in morbidly obese children. A total of 34 children with normal body mass index (N-BMI) and 51 morbidly obese (MO) children participated in the study. Written informed consent was obtained prior to the study, and the institutional ethics committee approved the study protocol. Age- and sex-adjusted BMI percentile tables prepared by the World Health Organization were used to classify healthy and obese children. The mean age ± SEM of the children was 9.3 ± 0.6 years and 10.7 ± 0.5 years in the N-BMI and MO groups, respectively. Anthropometric measurements of the children were taken, and BMI values were calculated from weight and height. Blood samples were obtained after an overnight fast, and routine hematologic and biochemical tests were performed. Within this context, fasting blood glucose (FBG), insulin (INS), triglyceride (TRG), and high-density lipoprotein cholesterol (HDL-C) concentrations were measured. Homeostatic model assessment for insulin resistance (HOMA-IR) values were calculated. Spexin and fetuin A levels were determined by enzyme-linked immunosorbent assay. The data were evaluated statistically. Statistically significant differences were found between the groups in terms of BMI, fat mass index, INS, HOMA-IR, and HDL-C. In the MO group, all of these parameters increased while HDL-C decreased. Elevated counts were detected in the MO group for eosinophils (p < 0.05) and platelets (p > 0.05). Fetuin A levels decreased in the MO group (p > 0.05), whereas the decrease in spexin levels in this group was statistically significant (p < 0.05). In conclusion, these results suggest that increases in eosinophils and platelets behave as cardiovascular risk factors, and that decreased fetuin A acts as a risk factor consistent with the increased risk for cardiovascular problems associated with the severity of obesity. Along with increased eosinophils, increased platelets, and decreased fetuin A, decreased spexin was the parameter that best reflects its possible participation in the early development of CVD risk in MO children.

6D Posture Estimation of Road Vehicles from Color Images

Currently, in the field of object pose estimation, much research estimates the position and angle of an object by storing a 3D model of the object in a computer in advance and matching observations against that model. In this research, however, we have succeeded in creating a module that is much simpler, smaller in scale, and faster in operation. Our 6D pose estimation model consists of two different networks: a classification network and a regression network. From a single RGB image, the trained model estimates the class of the object in the image, the coordinates of the object, and its rotation angle in 3D space. In addition, we compared the estimation accuracy for each camera position, i.e., the angle from which the object was captured. The highest accuracy was recorded when the camera position was 75°, with a classification accuracy of about 87.3% and a regression accuracy of about 98.9%.
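
A hedged sketch of the two-branch design described above, with shared image features feeding a classification head and a 6D-pose regression head; the backbone, layer sizes, and number of classes are assumptions, not the authors' configuration.

```python
# Sketch only: a shared CNN backbone with separate class and 6D-pose heads.
import torch
import torch.nn as nn

class PoseEstimator(nn.Module):
    def __init__(self, num_classes=5):
        super().__init__()
        self.backbone = nn.Sequential(            # small CNN feature extractor (assumed)
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(32, num_classes)   # vehicle class
        self.regressor = nn.Linear(32, 6)              # x, y, z plus three rotation angles

    def forward(self, rgb):
        features = self.backbone(rgb)
        return self.classifier(features), self.regressor(features)

model = PoseEstimator()
logits, pose = model(torch.randn(1, 3, 224, 224))   # single RGB image
print(logits.shape, pose.shape)                     # torch.Size([1, 5]) torch.Size([1, 6])
```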

Fast and Robust Long-term Tracking with Effective Searching Model

Kernelized Correlation Filter (KCF) based trackers have gained a lot of attention recently because of their accuracy and fast calculation speed. However, the algorithm is not robust in cases where the target is lost due to a sudden change of direction, occlusion, or leaving the field of view. To improve KCF performance in long-term tracking, this paper proposes an anomaly detection method for target-loss warning, based on analyzing the response map of each frame, and a classification algorithm using random ferns for a reliable target re-locating mechanism. Tested on the Visual Tracker Benchmark and Visual Object Tracking datasets, the proposed algorithm achieved precision and success rates 2.92 and 2.61 times higher than those of the original KCF algorithm, respectively. Moreover, the proposed tracker handles occlusion better than many state-of-the-art long-term tracking methods while running at 60 frames per second.
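
One common way to flag target loss from a correlation-filter response map is the peak-to-sidelobe ratio (PSR); the paper analyses the response map, but its exact criterion is not given here, so the sketch below is an assumed illustration with a hypothetical threshold.

```python
# Sketch only: PSR-based target-loss warning from a correlation-filter response map.
import numpy as np

def peak_to_sidelobe_ratio(response, exclude=5):
    """High PSR means a confident detection; a sharp drop suggests the target is lost."""
    peak_idx = np.unravel_index(np.argmax(response), response.shape)
    peak = response[peak_idx]
    mask = np.ones_like(response, dtype=bool)
    r0, c0 = peak_idx
    mask[max(r0 - exclude, 0):r0 + exclude + 1,
         max(c0 - exclude, 0):c0 + exclude + 1] = False   # exclude the region around the peak
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-12)

response = np.random.rand(64, 64) * 0.1
response[30, 30] = 1.0                       # a confident, sharp peak
psr = peak_to_sidelobe_ratio(response)
print("PSR:", psr, "target lost:", psr < 8.0)   # threshold value is an assumption
```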

Towards End-To-End Disease Prediction from Raw Metagenomic Data

Analysis of the human microbiome using metagenomic sequencing data has demonstrated a strong ability to discriminate various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequences read from fragmented DNA and stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use, time-consuming, and rely on a large number of parameters that often introduce variability and impact the estimation of the microbiome elements. Training deep neural networks directly on raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most of these methods use the concept of word and sentence embeddings, which creates a meaningful numerical representation of DNA sequences while extracting features and reducing the dimensionality of the data. In this paper we present metagenome2vec, an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple-instance-learning classifier that predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to the influence of each genome on the prediction. Using two public real-life datasets as well as a simulated one, we demonstrate that this original approach reaches high performance, comparable with state-of-the-art methods applied directly on data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that, with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
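
A minimal sketch of steps (i) and (ii) above: building a k-mer vocabulary from raw reads and learning k-mer embeddings with word2vec, then averaging them into a read embedding. The reads, k-mer length, and model sizes are illustrative assumptions, not metagenome2vec's actual configuration.

```python
# Sketch only: k-mer vocabulary, k-mer embeddings, and a simple averaged read embedding.
from gensim.models import Word2Vec
import numpy as np

reads = ["ATGCGTACGTTAG", "GGCATCGTACGTA", "TTGCATGCGTACG"]   # stand-in fastq reads
k = 4

def kmerize(read, k):
    """Slide a window of length k over the read to produce its k-mer 'sentence'."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

sentences = [kmerize(r, k) for r in reads]
model = Word2Vec(sentences, vector_size=16, window=5, min_count=1, epochs=50, seed=0)

# Read embedding as the mean of its k-mer embeddings (one simple pooling choice).
read_vec = np.mean([model.wv[km] for km in sentences[0]], axis=0)
print(read_vec.shape)    # (16,)
```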

Enhancing the Effectiveness of Air Defense Systems through Simulation Analysis

Air Defense Systems contain high-value assets that are expected to fulfill their mission for several years, in many cases even decades, while operating in a fast-changing, technology-driven environment. Thus, it is paramount that decision-makers can assess how effective an Air Defense System is in the face of newly developing threats, as well as identify the bottlenecks that could jeopardize the security of a country's airspace. Given the broad extent of activities and the great variety of assets necessary to achieve the strategic objectives, a systems approach was taken to delineate the core requirements and the physical architecture of an Air Defense System. Then, value-focused thinking helped in defining the measures of effectiveness. Furthermore, analytical methods were applied to create a formal structure that preliminarily assesses such measures. To validate the proposed methodology, simulation was also used to determine the measures of effectiveness, this time in more complex environments that incorporate both uncertainty and multiple interactions among the entities. The results regarding the validity of this methodology suggest that the approach can support decisions aimed at enhancing the capabilities of Air Defense Systems. In conclusion, this paper sheds some light on how consolidated approaches from Systems Engineering and Operations Research can be used as valid techniques for solving problems in this complex and yet vital domain.

Challenges and Opportunities of E-Procurement in the Construction Industry

The construction industry is evolving amid the fourth industrial revolution. Transportation, commerce, manufacturing, and many other industries have embraced current technological advancements and are striving to utilise every development in the IT sector. The procurement of construction works, by contrast, is known to be very conventional and backward in the adoption of digitalisation. The construction industry's procurement and supply chain are blamed for much of the inflated cost of construction projects, mainly attributed to a lack of transparency and trust between industry stakeholders. This research explores the challenges of e-procurement adoption in the industry and identifies the potential opportunities for its use. The data for this investigation were acquired through interviews and analysed using qualitative content analysis. The study reveals compounding challenges (i.e., corruption and lack of commitment) that lead to the failure of such efforts in Nigeria, as well as the potential prospects (i.e., transparency and efficiency). This study is essential for developing a more effective and transparent procurement process so that the Nigerian construction industry is not left behind in fast-digitalising markets.

Traditional Dyeing of Silk with Natural Dyes by Eco-Friendly Method

In the traditional dyeing of natural fibers with natural dyes, metal salts are commonly used to increase color stability. This method always carries the risk of environmental pollution (contamination of arable soils and fresh groundwater) due to the release of dyeing effluents containing large amounts of metal. Therefore, researchers are always looking for new methods to obtain a green dyeing system. In this research, the use of an enzymatic dyeing method to prevent environmental pollution with metals and reduce production costs is proposed. After degumming and bleaching, raw silk fabrics were dyed with natural dyes (madder and sumac) by three methods (pre-mordanting with a metal salt, one-step enzymatic dyeing, and two-step enzymatic dyeing). Results show that silk dyed with natural dyes by the enzymatic method has higher color strength and colorfastness than silk pretreated with a metal salt. In addition, the amount of residual dye in the dyeing wastewater is significantly reduced by the enzymatic method. It is found that the enzymatic dyeing method leads to improved dye absorption and color strength, a soft hand, no change in color shade, lower production costs (due to the low dyeing temperature), and a significant reduction in environmental pollution.