The Forensic Swing of Things: The Current Legal and Technical Challenges of IoT Forensics

The inability of organizations to put in place management control measures for Internet of Things (IoT) complexities remains a risk concern. Policy makers have been left scrambling to find measures to combat these security and privacy concerns. IoT forensics is a cumbersome process, as there is no standardization of IoT products and little or no historical data is stored on the devices. This paper highlights why IoT forensics is a unique adventure and brings out the legal challenges encountered in the investigation process. A quadrant model is presented to study the conflicting aspects in IoT forensics. The model analyses the effectiveness of the forensic investigation process against the admissibility and integrity of the evidence, taking into account user privacy and the providers’ compliance with laws and regulations. Our analysis concludes that a semi-automated forensic process using machine learning could remove the human factor from the profiling and surveillance processes and hence resolve the issues of data protection (privacy and confidentiality).

Lexicon-Based Sentiment Analysis for Stock Movement Prediction

Sentiment analysis is a broad and expanding field that aims to extract and classify opinions from textual data. Lexicon-based approaches rely on a sentiment lexicon, i.e., a list of words each mapped to a sentiment score, to rate the sentiment of a text chunk. Our work focuses on predicting stock price change using a sentiment lexicon built from financial conference call logs. We introduce a method to generate a sentiment lexicon based upon an existing probabilistic approach. By using a domain-specific lexicon, we outperform traditional techniques and demonstrate that domain-specific sentiment lexicons provide higher accuracy than generic sentiment lexicons when predicting stock price change.
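To illustrate the mechanics the abstract describes, the sketch below scores a text chunk against a tiny, made-up financial lexicon; the actual lexicon in the paper is induced from conference-call logs, and the words and scores used here are assumptions for demonstration only.

```python
# Toy lexicon-based scorer. The lexicon is made up for illustration;
# the paper induces its lexicon from financial conference-call logs.
toy_lexicon = {"growth": 0.8, "beat": 0.6, "miss": -0.7, "impairment": -0.9}

def score(text, lexicon):
    tokens = text.lower().split()
    hits = [lexicon[t] for t in tokens if t in lexicon]
    return sum(hits) / len(hits) if hits else 0.0   # mean sentiment of matched tokens

print(score("revenue growth beat expectations despite one impairment charge", toy_lexicon))
# -> positive overall score, suggesting upward price pressure in this toy setup
```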

Music Aptitude and School Readiness in Indonesian Children

This study investigated the relationship between music aptitude and school readiness in Indonesian children. Music aptitude is described as children’s music potential, whereas school readiness is defined as a condition in which a child is deemed ready to enter the formal education system. This study presents the hypothesis that music aptitude is correlated with school readiness. This is a correlational study of 17 children aged 5-6 years old (M = 6.10, SD = 0.33) who were enrolled in a kindergarten in Jakarta, Indonesia. Music aptitude scores were obtained from the Primary Measures of Music Audiation, whereas school readiness scores were obtained from the Bracken School Readiness Assessment, Third Edition. The data were analysed using the Pearson correlation. The analysis found no correlation between music aptitude and school readiness (r = 0.196, p = 0.452). Discussion of the results and perspectives from the measures and cultures involved are presented. Further study is recommended to establish links between music aptitude and school readiness.

In situ Real-Time Multivariate Analysis of Methanolysis Monitoring of Sunflower Oil Using FTIR

The combination of world population growth and the third industrial revolution has led to a high demand for fuels. On the other hand, the decline of global fossil fuel deposits and the environmental air pollution caused by these fuels have compounded the challenges the world faces in meeting its need for energy. Therefore, new forms of environmentally friendly and renewable fuels such as biodiesel are needed. The primary analytical techniques for monitoring methanolysis yield have been chromatography and spectroscopy; these methods have proven reliable but are demanding, costly, and do not provide real-time monitoring. In this work, the in situ monitoring of biodiesel production from sunflower oil using FTIR (Fourier Transform Infrared) spectroscopy has been studied; the study was performed using an EasyMax Mettler Toledo reactor equipped with a DiComp (diamond) probe. The quantitative monitoring of methanolysis was performed by building a quantitative model with multivariate calibration using the iC Quant module of the iC IR 7.0 software. Fifteen samples of known concentration, taken in duplicate for model calibration and cross-validation, were used for the modelling; the data were pre-processed using mean centering and variance scaling, a square-root spectrum transform, and solvent subtraction. These pre-processing methods improved the performance indices RMSEC, RMSECV, RMSEP, and cumulative R² from 7.98 to 0.0096, 11.2 to 3.41, 6.32 to 2.72, and 0.9416 to 0.9999, respectively. The R² values of 1 (training), 0.9918 (test), and 0.9946 (cross-validation) indicated the fitness of the model. The model was tested against a univariate model; small discrepancies were observed at low concentrations due to unmodelled intermediates, but the two were quite close at concentrations above 18%. The software eliminated the complexity of the Partial Least Squares (PLS) chemometrics. It was concluded that the model obtained could be used to monitor the methanolysis of sunflower oil at industrial and laboratory scale.
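As a rough illustration of the multivariate calibration workflow that tools such as iC Quant automate, the following sketch fits a PLS model and reports RMSEC and RMSECV on synthetic data; the spectra, concentrations, number of latent variables, and preprocessing are placeholders, not the paper's FTIR data or settings.

```python
# Minimal PLS calibration sketch with cross-validation on synthetic "spectra".
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(15, 200))                              # 15 samples x 200 wavenumbers (synthetic)
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=15)   # synthetic "concentrations"

X = X - X.mean(axis=0)                                      # mean centering, as in the paper
pls = PLSRegression(n_components=3).fit(X, y)

rmsec = mean_squared_error(y, pls.predict(X)) ** 0.5        # calibration error
y_cv = cross_val_predict(pls, X, y, cv=5)
rmsecv = mean_squared_error(y, y_cv) ** 0.5                 # cross-validation error
print(f"RMSEC={rmsec:.3f}  RMSECV={rmsecv:.3f}  R2={r2_score(y, pls.predict(X)):.4f}")
```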

Significance of Bike-Frame Geometric Factors for Cycling Efficiency and Muscle Activation

With the advocacy of green transportation and green traveling, cycling has become increasingly popular. Physiology and bike design are key factors influencing cycling efficiency. Therefore, this study aimed to investigate the significance of bike-frame geometric factors on cycling efficiency and muscle activation for non-professional Asian male cyclists of different body sizes. Participants representing various body sizes, as measured by leg and back lengths, carried out cycling tests using a tailor-assembled road bike with different ergonomic design configurations, including seat-height adjustments (i.e., 96%, 100%, and 104% of trochanteric height) and bike-frame sizes (i.e., small and medium frames), over an assessed distance of 1 km. A dedicated power meter and a self-developed adaptable surface electromyography (sEMG) system were used to measure the average pedaling power and cadence generated and muscle activation, respectively. The results showed that the effect of changing the seat height was far more significant than that of body or bike-frame size. The sEMG data provided a better understanding of muscle activation as a function of seat height. Therefore, the interpretation of this study is that the major bike ergonomic design factor dominating the cycling efficiency of Asian participants with different body sizes was the seat height.

Rank-Based Chain-Mode Ensemble for Binary Classification

In the field of machine learning, ensembling has been employed as a common methodology to improve performance over multiple base classifiers. However, correct predictions are often canceled out by incorrect ones during consensus due to a phenomenon called the “curse of correlation”, which manifests as strong interference among the predictions produced by the base classifiers. In addition, existing practices are still not able to effectively mitigate the problem of imbalanced classification. Based on the analysis of our experimental results, we conclude that the two problems are caused by inherent deficiencies in the consensus approach. Therefore, we propose an enhanced ensemble algorithm that adopts a rank-based chain-mode consensus to overcome the two problems. To evaluate the proposed ensemble algorithm, we employ the well-known benchmark dataset NSL-KDD (the improved version of KDDCup99 produced by the University of New Brunswick) to compare the proposed algorithm with 8 common ensemble algorithms. In particular, each compared ensemble classifier uses the same 22 base classifiers, so that the differences in the improvements in accuracy and reliability over the base classifiers can be truly revealed. As a result, the proposed rank-based chain-mode consensus proves to be a more effective ensemble solution than the traditional consensus approach, outperforming the 8 ensemble algorithms by 20% on almost all compared metrics, including accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve.
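The abstract does not fully specify the rank-based chain-mode consensus, so the sketch below is only a loose illustration of the general idea of letting validation-ranked classifiers carry unequal weight in the consensus rather than an unweighted majority vote; it is not the authors' algorithm, and the data, base classifiers, and weighting scheme are all assumptions.

```python
# Illustrative rank-weighted consensus over base classifiers (NOT the paper's algorithm).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)  # imbalanced toy data
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
X_fit, X_val, y_fit, y_val = train_test_split(X_tr, y_tr, stratify=y_tr, random_state=0)

bases = [DecisionTreeClassifier(random_state=0), GaussianNB(), LogisticRegression(max_iter=1000)]
for clf in bases:
    clf.fit(X_fit, y_fit)

# Rank the base classifiers by validation F1 (best first).
ranked = sorted(bases, key=lambda c: f1_score(y_val, c.predict(X_val)), reverse=True)

# Consensus: each vote is weighted by the classifier's rank position.
weights = np.array([len(ranked) - i for i in range(len(ranked))], dtype=float)
votes = np.stack([clf.predict(X_te) for clf in ranked])            # shape (k, n)
consensus = ((weights @ votes) / weights.sum() >= 0.5).astype(int)  # weighted majority
print("ensemble F1:", f1_score(y_te, consensus))
```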

Critical Psychosocial Risk Treatment for Engineers and Technicians

This study explores how management addresses psychosocial risks in seven teams of engineers and technicians in the midst of the fourth industrial revolution. The sample is drawn from an ongoing quasi-experiment on psychosocial risk management in a manufacturing company in Sweden. Each of the seven teams belongs to one of two clusters: a positive cluster or a negative cluster. The positive cluster reports a significantly positive change in psychosocial risk levels between two time points, and the negative cluster reports a significantly negative change. The data were collected using semi-structured interviews. The results of the computer-aided thematic analysis show that there are more differences than similarities in the risk treatment actions taken between the two clusters. The findings show that the managers in the positive cluster use more enabling actions that foster and support formal and informal relationship building. In contrast, managers who use fewer enabling actions hinder the development of positive group processes and contribute to negative changes in psychosocial risk levels. This exploratory study sheds some light on how management can influence significant positive and negative changes in psychosocial risk levels during a risk management process.

Levels and Trends of Under-Five Mortality in South Africa from 1998 to 2012

Childhood mortality is a key indicator of the coverage of child survival interventions and of social and economic progress. Although the level of under-five mortality has been declining, it remains unacceptably high. The primary aim of this paper is to establish and analyse the levels and trends of under-five mortality in South Africa for 1998, 2003, and 2012. Methods: The data used for the analysis came from the 1998 SADHS, the 2003 SADHS, and the 2012 SABSSM, which collected information on the survival status of children. The Kaplan-Meier estimator of the survival function was used to determine the probabilities of failure (death) from birth up to 59 months. Results and Conclusion: The overall U5MR (expressed per 1,000 live births) declined by 28.2%, from 53.1 in 1998 to 38.1 in 2012. The U5MR of male children declined from 59.2 in 1998 to 46.2 in 2003 and dropped further to 41.4 in 2012. The U5MR of children of mothers aged 40 years and older increased from 64.0 in 1998 to 89.0 in 2003 and rose further to 129.9 in 2012. The U5MR of children of mothers with 12 or more years of education increased from 32.2 in 1998 to 35.2 in 2003 and then declined substantially to 17.5 in 2012.
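For readers unfamiliar with the method, a minimal Kaplan-Meier sketch (using the lifelines package and synthetic child-survival records, not the SADHS/SABSSM data) shows how the probability of death before 60 months can be estimated:

```python
# Kaplan-Meier estimate of the probability of dying before 60 months (synthetic data).
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(1)
age_months = rng.integers(1, 60, size=500)   # observed age at death or censoring (synthetic)
died = rng.random(500) < 0.05                # True = death observed, False = censored

kmf = KaplanMeierFitter()
kmf.fit(durations=age_months, event_observed=died)
p_death_by_59 = 1 - kmf.survival_function_at_times(59).iloc[0]
print(f"Estimated under-five failure probability: {p_death_by_59:.3f}")  # x1000 gives a U5MR-style rate
```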

The Alliance for Grassland Renewal: A Model for Teaching Endophyte Technology

To the author’s best knowledge, there are no published reports of effective methods for teaching fescue toxicosis and grass endophyte technology in the USA. To address this need, a group of university scientists, industry representatives, government agents, and livestock producers formed an organization called the Alliance for Grassland Renewal. One goal of the Alliance was to develop a teaching method that could be employed across all regions of the USA and all sectors of the agricultural community. The first step in developing this method was the identification of experts familiar with the science and management of fescue toxicosis. The second step was curriculum development. The experts wrote a curriculum that addressed all aspects of toxicosis and its management, including toxicology, animal nutrition, pasture management, economics, and mycology. The curriculum was created for presentation in lectures, laboratories, and the field. The curriculum was unique in that it could be delivered across state lines, regardless of particular in-state recommendations. The curriculum was also unique in that it was unanimously supported by private companies otherwise in competition with each other. The final step in developing this teaching method was formulating a delivery plan. All experts, from university, industry, government, and production sectors, volunteered to travel from any state in the USA, converge in one location, teach a 1-day workshop, and then travel to the next location. The results of this teaching method indicate widespread success. Since 2012, experts from across the USA have converged to teach Alliance workshops in Kansas, Oklahoma, Missouri, Kentucky, Georgia, South Carolina, North Carolina, and Virginia, with ongoing workshops in Arkansas and Tennessee. Data from post-workshop surveys indicate that instruction has been effective, as at least 50% of the participants stated their intention to adopt the endophyte technology presented in these workshops. The teaching method developed by the Alliance for Grassland Renewal has proved effective, and the Alliance continues to expand across the USA.

Building and Tree Detection Using Multiscale Matched Filtering

In this study, an automated building and tree detection method is proposed using DSM (digital surface model) data and true orthophoto images. Multiscale matched filtering is applied to the DSM data; a watershed transform is then applied, and Otsu’s thresholding method is used as an adaptive threshold to segment each watershed region. The detected objects are masked with NDVI to separate buildings from trees. The proposed method is able to detect buildings and trees without requiring any elevation threshold. We tested our method on the ISPRS semantic labeling dataset and obtained promising results.
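A simplified sketch of a pipeline of this kind, on synthetic rasters: watershed segmentation of a DSM-like surface, per-region Otsu thresholding instead of a global elevation cutoff, and an NDVI mask to split trees from buildings. The arrays, the seeding strategy, and the 0.3 NDVI cutoff are assumptions, not the paper's exact implementation.

```python
# Per-region Otsu thresholding on watershed segments of a DSM, then NDVI masking.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.segmentation import watershed
from skimage.feature import peak_local_max

dsm = np.random.default_rng(0).random((256, 256))   # stand-in for the DSM
nir = np.random.default_rng(1).random((256, 256))   # stand-in near-infrared band
red = np.random.default_rng(2).random((256, 256))   # stand-in red band

# Watershed over the inverted DSM, seeded at local maxima of the surface.
peaks = peak_local_max(dsm, min_distance=10)
markers = np.zeros_like(dsm, dtype=int)
markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
labels = watershed(-dsm, markers)

# Adaptive (per-region) Otsu thresholding instead of a single global elevation threshold.
objects = np.zeros_like(dsm, dtype=bool)
for lab in np.unique(labels):
    region = labels == lab
    objects[region] = dsm[region] > threshold_otsu(dsm[region])

ndvi = (nir - red) / (nir + red + 1e-9)
trees = objects & (ndvi > 0.3)        # 0.3 is a common NDVI cutoff, assumed here
buildings = objects & ~(ndvi > 0.3)
```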

The Impact of Culture on Tourists’ Evaluation of Hotel Service Experiences

The purpose of this study is to investigate the impact of tourists’ culture on the perception and evaluation of hotel service experiences and on behavioral intentions. Drawing on Hofstede’s cultural dimensions, this study seeks to further contribute to understanding the effect of culture on the perception and evaluation of hotel services, and whether there are differences between Saudi and European tourists’ evaluations of hotel services. A descriptive cross-sectional design was used. Data were collected from tourists staying in five-star hotels in Saudi Arabia using a self-completion technique. The findings show that evaluations of hotel services differ from one culture to another. T-test results reveal that, compared to their European counterparts, Saudis were more tolerant, reported significantly higher levels of satisfaction, were more likely to return and recommend the hotel, and perceived the price of the hotel stay as good value for money. The sample was relatively small and specific to five-star hotel evaluations; as a result, the findings cannot be generalized to the wider tourist population. The results of this research have important implications for management within the Saudi hospitality industry. The study contributes to tourist cultural theory by emphasizing the relative importance of cultural dimensions in service evaluation. The author argues that no studies could be identified that compare Saudis and Europeans in their evaluations of hotel-stay experiences. Therefore, the current study enhances understanding of the effects of cultural factors on service evaluations and provides valuable input for international market segmentation and resource allocation in the Saudi hotel industry.

Liquid Chromatography Microfluidics for Detection and Quantification of Urine Albumin Using Linear Regression Method

Nearly a hundred per million of the Filipino population are diagnosed with Chronic Kidney Disease (CKD). The early stage of CKD has no symptoms and can only be discovered once the patient undergoes urinalysis. Over the years, different methods have been used to quantify urinary albumin, such as immunochemical assays, most of which require large machinery with high maintenance and resource costs, and the dipstick test, whose reliability in detecting early stages of microalbuminuria is still debated. This study applies the liquid chromatography concept in a microfluidic instrument with a biosensor, as the means of separation and detection respectively, and uses linear regression to quantify human urinary albumin. The main objective was to create a miniature system that detects and quantifies patients’ urinary albumin while reducing the volume consumed per five test samples. For this study, 30 urine samples of unknown albumin concentration were tested using the VITROS Analyzer and the microfluidic system for comparison. Based on the data from both methods, the actual-versus-predicted regression showed a positive linear relationship with an R² of 0.9995 and a linear equation of y = 1.09x + 0.07, indicating that the predicted and actual values are approximately equal. Furthermore, the microfluidic instrument uses 75% less total volume (sample and reagents combined) than the VITROS Analyzer per five test samples.
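The kind of actual-versus-predicted comparison reported above can be reproduced with a few lines of Python; the albumin values below are hypothetical and only illustrate how the slope, intercept, and R² are obtained.

```python
# Fit a line between reference (VITROS) readings and device readings and report R^2.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

vitros = np.array([1.2, 2.5, 3.1, 4.8, 6.0, 7.4, 9.1])        # reference albumin values (hypothetical)
microfluidic = np.array([1.3, 2.6, 3.3, 5.0, 6.4, 7.9, 9.8])  # device readings (hypothetical)

model = LinearRegression().fit(vitros.reshape(-1, 1), microfluidic)
pred = model.predict(vitros.reshape(-1, 1))
print(f"y = {model.coef_[0]:.2f}x + {model.intercept_:.2f},  R^2 = {r2_score(microfluidic, pred):.4f}")
```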

Remote Monitoring and Control System of Potentiostat Based on the Internet of Things

The potentiostat is an important component of pipeline anti-corrosion systems in the chemical industry. Based on Internet of Things (IoT) technology, Programmable Logic Controller (PLC) technology, and database technology, this paper develops a remote monitoring and management system for potentiostats. Remote monitoring and remote adjustment of the potentiostat’s working status are realized. The system provides real-time data display, historical data query, alarm push management, and user permission management, and supports both Web access and mobile client application (app) access. Test results from an actual engineering project show the stability of the system, which can be widely used in cathodic protection systems.
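The abstract does not specify the data transport between the potentiostat site and the monitoring platform, so the following sketch of a single telemetry upload uses a generic HTTP POST; the endpoint, field names, and values are assumptions for illustration only.

```python
# Hypothetical telemetry upload from a field potentiostat to the monitoring backend.
import time
import requests

reading = {
    "device_id": "potentiostat-01",     # hypothetical device identifier
    "output_potential_v": -0.85,        # example cathodic-protection potential reading
    "timestamp": time.time(),
}
resp = requests.post("https://scada.example.com/api/potentiostat/readings",
                     json=reading, timeout=5)
resp.raise_for_status()                 # surface upload failures for the alarm/push layer
```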

Design of an Eddy Current Brake System for the Use of Roller Coasters Based on a Human Factors Engineering Approach

The goal of this paper is to converge upon a design for a brake system that could be used on a roller coaster at an amusement park. It was necessary to determine what could be deemed a “comfortable” deceleration so that passengers do not feel as if they are suddenly jerked and pressed against the restraining harnesses. A human factors engineering approach was taken to determine this deceleration. Using a previous study that tested the deceleration of transit vehicles, a -0.45 g deceleration was adopted as the design requirement to build the system around. An adjustable linear eddy current brake using permanent magnets was identified as the ideal system to meet this design requirement. Anthropometric data were then used to determine a realistic weight and length for the roller coaster that the brake was being designed for. The weight and length data were then factored into magnetic brake force equations, which were used to determine how the brake system and the brake-run layout would be designed. A final design for the brake was determined, and it was found that a total of 12 brakes would be needed, with a maximum braking distance of 53.6 m, to stop a roller coaster travelling at its top speed and loaded to maximum capacity. This design is derived from theoretical calculations but is within the realm of feasibility.
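As a quick consistency check on the reported figures, assuming constant deceleration at the -0.45 g design limit over the full 53.6 m maximum braking distance, the implied speed at brake entry follows from v² = 2ad:

```python
# Kinematic consistency check: implied entry speed for the stated deceleration and distance.
g = 9.81                      # m/s^2
a = 0.45 * g                  # design deceleration magnitude, m/s^2
d = 53.6                      # maximum braking distance, m

v = (2 * a * d) ** 0.5        # implied top speed at brake entry
print(f"implied entry speed: {v:.1f} m/s ({v * 3.6:.0f} km/h)")   # ~21.8 m/s (~78 km/h)
```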

Implementation of the Quality Management System and Development of Organizational Learning: Case of Three Small and Medium-Sized Enterprises in Morocco

The profusion of studies on the concept of organizational learning shows the importance that has been given to this concept in the management sciences. For some years now, companies have leaned towards ISO 9001 certification, which requires the implementation of a quality management system (QMS). For this objective to be achieved, companies must possess a set of skills, which pushes them to develop learning through continuous training. The results of empirical research have shown that implementing a QMS in a company promotes the development of learning, and several types of learning are developed in this process. Given that skills development is normative in the context of the quality approach, companies are obliged to qualify and improve the skills of their human resources. Continuous training is the keystone for developing the necessary learning. To carry out continuous training, companies need to be able to identify their real needs by developing training plans based on well-defined training engineering. The training process naturally goes through several stages. Initially, training has a general character, that is, it focuses on topics and actions of a general nature. Subsequently, it is carried out in a more targeted and precise way to accompany the evolution of the QMS and to implement the changes decided upon each time (changes of working method, practices, objectives, mentality, etc.). To address our research question, we opted for a qualitative research method. It should be noted that the case study method combines several data collection techniques to explain and understand a phenomenon. Three company cases were studied as part of this research work using the different data collection techniques associated with this method.

Sedimentary Response to Coastal Defense Works in São Vicente Bay, São Paulo

This article evaluates the effectiveness of two groins located at Gonzaguinha and Milionários Beaches on the southeast coast of Brazil. The effectiveness of these coastal defense structures is evaluated in terms of sedimentary dynamics, one of the most important environmental processes to be assessed in coastal engineering studies. The applied method is based on the Delft3D numerical modelling system: the Delft3D-WAVE module was used for wave modelling, Delft3D-FLOW for hydrodynamic modelling, and Delft3D-SED for sediment transport modelling. The models were calibrated so that the simulations adequately represent the studied region, with improvements to the model elements evaluated through statistical comparisons between the results and the wave, current, and tide data recorded in the study area. An analysis of maximum wave heights was carried out to select the months with the highest accumulated energy for use in the engineering scenarios. The engineering studies were performed for two scenarios: 1) numerical simulation of the area considering only the two existing groins; 2) breakwaters coupled to the ends of the existing groins, resulting in two “T”-shaped structures. The sediment model showed that, for the simulated period, the area is affected by erosive processes and that the existing groins have little effectiveness in defending the coast in question. The implemented T structures showed some effectiveness in protecting the beaches against erosion and allowed the recovery of the stretch directly sheltered by them on Milionários Beach. To complement this study, further engineering scenarios are suggested that might recover other areas of the studied region.

Investigation of the Physical Computing in Computational Thinking Practices, Computer Programming Concepts and Self-Efficacy for Crosscutting Ideas in STEM Content Environments

Physical Computing, as an instructional model, is applied in the framework of Engineering Pedagogy to teach “transversal/cross-cutting ideas” within a STEM content approach. LabVIEW and Arduino were used to connect the physical world with real data in the framework of the so-called Computational Experiment. Prospective tertiary engineering educators were engaged during their course, and Computational Thinking (CT) concepts were recorded before and after the intervention across the didactic activities, using validated questionnaires, to examine the relationship between self-efficacy, computer programming, and CT concepts when STEM content epistemology is implemented in alignment with the Computational Pedagogy model. The results show a significant change in students’ responses regarding self-efficacy for CT before and after the instruction. The results also indicate a significant relationship between the responses to the different CT concepts/practices. According to the findings, STEM content epistemology combined with Physical Computing appears to be a good candidate for a learning and teaching approach in university settings that enhances students’ engagement with CT concepts/practices.

Study of Proton-9,11Li Elastic Scattering at 60~75 MeV/Nucleon

The radial form of the nuclear matter distribution, the charge distribution, and the shape of nuclei are essential properties of nuclei and are hence of great interest for several areas of research in nuclear physics. More than the last three decades have witnessed a range of experimental efforts employing leptonic probes (such as muons and electrons) to explore nuclear charge distributions, whereas hadronic probes (for example, alpha particles and protons) have been used to investigate nuclear matter distributions. In this paper, p-9,11Li elastic scattering differential cross sections in the energy range 60 to 75 MeV/nucleon have been studied by means of the Coulomb-modified Glauber scattering formalism. By applying the semi-phenomenological Bhagwat-Gambhir-Patil (BGP) nuclear density for the loosely bound, neutron-rich 11Li nucleus, the estimated matter radius is found to be 3.446 fm, which is quite large compared to the known experimental value of 3.12 fm. The results of a microscopic optical-model-based calculation applying the Bethe-Brueckner-Hartree-Fock (BHF) formalism have also been compared. It should be noted that in most of the phenomenological density models used to reproduce the p-11Li differential elastic scattering cross-section data, the calculated matter radius lies between 2.964 and 3.55 fm. The calculated results with the phenomenological BGP model density and with the nucleon density calculated in the relativistic mean-field (RMF) framework reproduce the p-9Li and p-11Li experimental data quite well compared to Gaussian-Gaussian or Gaussian-Oscillator densities at all energies under consideration. In the approach described here, no free or adjustable parameter has been employed to reproduce the elastic scattering data, in contrast to the well-known optical-model-based studies that involve at least four to six adjustable parameters to match the experimental data. The calculated reaction cross sections σR for p-11Li at these energies are quite large compared to the estimated values reported in earlier works, although so far no experimental studies have been performed to measure them.

Government (Big) Data Ecosystem: Definition, Classification of Actors, and Their Roles

Organizations, including governments, generate (big) data that are high in volume, velocity, and veracity and that come from a variety of sources. Public administrations are using (big) data, implementing base registries, and enforcing data sharing across the entire government to deliver (big) data-related integrated services, provide insights to users, and support good governance. Government (big) data ecosystem actors represent distinct entities that provide data, consume data, manipulate data to offer paid services, and extend data services such as data storage and hosting to other actors. In this research work, we perform a systematic literature review. The key objectives of this paper are to propose a robust definition of the government (big) data ecosystem and a classification of government (big) data ecosystem actors and their roles. We present a graphical view of the actors, their roles, and their relationships in the government (big) data ecosystem, and we discuss our research findings. We did not find many published research articles about the government (big) data ecosystem, including its definition and the classification of its actors and their roles. Therefore, we borrowed ideas for the government (big) data ecosystem from numerous areas in the literature, including scientific research data, humanitarian data, open government data, and industry data.

Relationship between the Use of Hormonal Contraceptives and the Promotion of Changes in Hemodynamic Factors that Predispose to Cardiovascular Risk

Hormonal contraceptive drugs are widely used across different age groups, mainly due to their easy acquisition and the large number of prescriptions. In addition, several studies point to a high cardiovascular risk associated with the use of hormonal contraceptives. In this study, we evaluated 620 Brazilian women to examine the correlation between hormonal contraceptive use and an increased predisposition to cardiovascular disease. Our data demonstrated that concomitant use of contraceptives was associated with a significant reduction in activated partial thromboplastin time and prothrombin time, similar to clinical hypercoagulability conditions. In addition, as a compensatory mechanism, there was an increase in fibrinogen levels. We also verified a significant increase in total cholesterol and an increase in platelet aggregation of up to 10%. Therefore, this study provides evidence that the use of hormonal contraceptives may increase cardiovascular risk. Our data also represent an alert since, in Brazil, the primary hormonal contraceptives prescribed in public health units are second-generation drugs, which show the strongest associations with cardiovascular risk.