Spatial Distribution of Ambient BTEX Concentrations at an International Airport in South Africa

Air travel, and with it the use of airports, has grown rapidly in the past few decades, resulting in the concomitant release of air pollutants. Air pollution needs to be monitored because of the known relationship between exposure to air pollutants and increased adverse effects on human health. This study monitored a group of volatile organic compounds (VOCs), specifically BTEX (benzene, toluene, ethylbenzene and xylenes), as these compounds are detrimental to human health. Through the use of passive sampling methods, the spatial variability of BTEX within an international airport was investigated in order to identify ‘hotspots’ where occupational exposure to BTEX may be intensified. The passive sampling campaign revealed that total BTEX concentrations ranged between 12.95 and 124.04 µg m⁻³, and that BTEX concentrations were dispersed heterogeneously within the airport. Owing to the slow wind speeds recorded (1.13 m s⁻¹), the hotspots were located close to their main BTEX sources, with the main hotspot over the main apron of the airport. Employees working in this area may be chronically exposed to these emissions, which could be detrimental to their health.

Considerations for Effectively Using Probability of Failure as a Means of Slope Design Appraisal for Homogeneous and Heterogeneous Rock Masses

Probability of failure (PF) often appears alongside factor of safety (FS) in design acceptance criteria for rock slope, underground excavation and open pit mine designs. However, the design acceptance criteria generally provide no guidance relating to how PF should be calculated for homogeneous and heterogeneous rock masses, or what qualifies as a ‘reasonable’ PF assessment for a given slope design. Observational and kinematic methods were widely used in the 1990s until advances in computing permitted the routine use of numerical modelling. In the 2000s and early 2010s, PF in numerical models was generally calculated using the point estimate method. More recently, some limit equilibrium analysis software offers statistical parameter inputs along with Monte Carlo or Latin hypercube sampling methods to calculate PF automatically. Factors including rock type and density, weathering and alteration, intact rock strength, rock mass quality and shear strength, the location and orientation of geologic structure, shear strength of geologic structure and groundwater pore pressure influence the stability of rock slopes. Significant engineering and geological judgement, interpretation and data interpolation are usually applied in determining these factors and amalgamating them into a geotechnical model which can then be analysed. Most factors are estimated ‘approximately’ or with allowances for some variability rather than ‘exactly’. When it comes to numerical modelling, some of these factors are then treated deterministically (i.e. as exact values), while others have probabilistic inputs based on the user’s discretion and understanding of the problem being analysed. This paper discusses the importance of understanding the key aspects of slope design for homogeneous and heterogeneous rock masses and how they can be translated into reasonable PF assessments where the data permits. A case study from a large open pit gold mine in a complex geological setting in Western Australia is presented to illustrate how PF calculated using different methods can yield markedly different results. Ultimately, sound engineering judgement and logic are often required to decipher the true meaning and significance (if any) of some PF results.
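To make the Monte Carlo route concrete, the sketch below (not taken from the paper; a minimal illustration) estimates PF for a hypothetical planar sliding mechanism by sampling cohesion and friction angle from assumed normal distributions and counting the fraction of realizations with FS below unity. All geometry, parameter values and distributions are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of Monte Carlo realizations

# Hypothetical planar-failure geometry (all values assumed for illustration)
W = 2500.0              # block weight per metre of slope, kN/m
A = 35.0                # sliding-plane area per metre, m^2/m
psi = np.radians(35.0)  # dip of the sliding plane

# Probabilistic strength inputs: assumed means and standard deviations
c = rng.normal(25.0, 5.0, n)                # cohesion, kPa
phi = np.radians(rng.normal(30.0, 3.0, n))  # friction angle

# Limit-equilibrium factor of safety for dry planar sliding
fs = (c * A + W * np.cos(psi) * np.tan(phi)) / (W * np.sin(psi))

pf = np.mean(fs < 1.0)  # probability of failure = fraction with FS < 1
print(f"mean FS = {fs.mean():.2f}, PF = {pf:.3%}")
```

The same inputs pushed through a point estimate method or a Latin hypercube sampler will generally return a different PF, which is precisely the sensitivity the case study highlights.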

An Empirical Evaluation of Performance of Machine Learning Techniques on Imbalanced Software Quality Data

The development of change prediction models can help software practitioners plan testing and inspection resources in the early phases of software development. However, a major challenge faced during the training of any classification model is the imbalanced nature of software quality data. A dataset with very few instances of the minority outcome category leads to an inefficient learning process, and a classification model developed from imbalanced data generally does not predict these minority categories correctly. In change prediction, for instance, only a minority of the software classes in a dataset may be change-prone while the majority are non-change-prone. This study explores various alternatives for adeptly handling imbalanced software quality data using different sampling methods and effective MetaCost learners. The study also analyzes and justifies the use of different performance metrics when dealing with imbalanced data. To empirically validate the different alternatives, the study uses change data from three application packages of the open-source Android dataset and evaluates the performance of six different machine learning techniques. The results of the study indicate extensive improvement in the performance of the classification models when resampling methods and robust performance measures are used.
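As a hedged illustration of the resampling idea (not the study's actual pipeline, which uses Android change data and MetaCost learners), the sketch below randomly oversamples the minority class of a synthetic imbalanced dataset with scikit-learn and reports a robust metric (ROC AUC) before and after. The dataset, classifier and metric choices are assumptions for demonstration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.utils import resample

# Synthetic stand-in for imbalanced change data: ~10% "change-prone"
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

def auc_of(X_train, y_train):
    clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
    return roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])

print("AUC without resampling:", round(auc_of(X_tr, y_tr), 3))

# Random oversampling: replicate minority-class rows until classes balance
X_min, y_min = X_tr[y_tr == 1], y_tr[y_tr == 1]
X_up, y_up = resample(X_min, y_min, n_samples=(y_tr == 0).sum(), random_state=0)
X_bal = np.vstack([X_tr[y_tr == 0], X_up])
y_bal = np.concatenate([y_tr[y_tr == 0], y_up])

print("AUC with oversampling: ", round(auc_of(X_bal, y_bal), 3))
```

Accuracy would look deceptively high on the unbalanced split; AUC (and similar threshold-independent measures) is what makes the effect of resampling visible, which is the point the study makes about metric choice.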

Novel Adaptive Channel Equalization Algorithms by Statistical Sampling

In this paper, novel statistical sampling based equalization techniques and Cellular Neural Network (CNN) based detection are proposed to increase the spectral efficiency of multiuser communication systems over fading channels. Multiuser communication combined with selective fading can result in interference that severely deteriorates the quality of service in wireless data transmission (e.g. CDMA in mobile communication). The paper introduces new equalization methods that combat interference by minimizing the Bit Error Rate (BER) as a function of the equalizer coefficients, which provides higher performance than traditional Minimum Mean Square Error (MMSE) equalization. Since the calculation of BER as a function of the equalizer coefficients is of exponential complexity, statistical sampling methods are proposed to approximate its gradient, yielding fast equalization and performance superior to the traditional algorithms. Efficient estimation of the gradient is achieved by using stratified sampling and the Li-Silvester bounds, and a simple mechanism is derived to identify the dominant samples in real time for the sake of efficient estimation. The equalizer weights are adapted recursively by minimizing the estimated BER, and the near-optimal performance of the new algorithms is demonstrated by extensive simulations. The paper also develops a CNN based approach to detection, in which fast quadratic optimization is carried out by the CNN, while the task of the equalizer is to ensure the required template structure (sparseness) for the CNN. The performance of this method is also analyzed by simulations.
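The core computational idea, estimating BER as a function of the equalizer coefficients by sampling rather than exhaustive enumeration, can be sketched as follows. This is a plain Monte Carlo illustration with an assumed FIR channel, BPSK signalling and a linear equalizer; the paper's stratified sampling and Li-Silvester bounds refine this basic estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

h = np.array([1.0, 0.5, 0.2])  # assumed FIR channel impulse response (ISI)
sigma = 0.3                    # assumed noise standard deviation

def estimate_ber(w, n_bits=50_000):
    """Monte Carlo estimate of BER for linear equalizer taps w."""
    bits = rng.integers(0, 2, n_bits)
    s = 2.0 * bits - 1.0                         # BPSK symbols in {-1, +1}
    r = np.convolve(s, h, mode="full")[:n_bits]  # channel introduces ISI
    r += rng.normal(0.0, sigma, n_bits)          # additive Gaussian noise
    y = np.convolve(r, w, mode="full")[:n_bits]  # linear equalizer
    return np.mean((y > 0).astype(int) != bits)

# BER as a function of the equalizer coefficients: two sample points
print("no equalizer :", estimate_ber(np.array([1.0])))
print("3-tap example:", estimate_ber(np.array([1.1, -0.5, 0.05])))
```

A gradient-based adaptation would perturb w, re-estimate BER, and descend; stratified sampling concentrates the draws on the error-dominant noise configurations so that each such estimate stays cheap.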

Promoting Mental and Spiritual Health among Postpartum Mothers to Extend Breastfeeding Period

The purpose of this study was to examine postpartum breastfeeding mothers and determine the role their psychosocial and spiritual dimensions play in promoting full-term (6 month duration) breastfeeding of their infants. Purposive and snowball sampling methods were used to identify and recruit the study's participants. A total of 23 postpartum mothers, who were breastfeeding within 6 weeks of giving birth, participated in this study. In-depth interviews combined with observations, participant focus groups, and ethnographic records were used for data collection. The data were then analyzed using content analysis and typology. The results of this study illustrated that postpartum mothers experienced fear and worry that they would lack support from their spouse, family and peers, and that their infants would not get enough milk. It was found that the main barrier mothers faced in breastfeeding to full term was the difficulty of continuing to breastfeed after returning to work. Overall, 81.82% of the primiparous mothers and 91.67% of the non-primiparous mothers were able to breastfeed for the desired full term of 6 months. Factors found to be related to breastfeeding for six months included 1) belief and faith in breastfeeding, 2) support from spouse and family members, and 3) counseling from public health nurses and friends. The sample also provided evidence that religious principles such as tolerance, effort, love, and compassion toward their infants, together with positive thinking, were used in solving their physical, mental and spiritual problems.

Improved Power Spectrum Estimation for RR-Interval Time Series

The RR interval series is non-stationary and unevenly spaced in time. Estimating its power spectral density (PSD) with traditional techniques such as the FFT requires resampling at uniform intervals, and researchers have used different interpolation techniques as resampling methods. All of these resampling methods introduce a low-pass filtering effect in the power spectrum. The Lomb transform is a means of obtaining PSD estimates directly from the irregularly sampled RR interval series, thus avoiding resampling. In this work, the superiority of the Lomb transform over the FFT-based approach, applied after linear and cubic-spline interpolation as resampling methods, is established in terms of reproducing the exact frequency locations as well as the relative magnitudes of each spectral component.
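A minimal illustration of the two approaches, using SciPy's Lomb-Scargle periodogram against cubic-spline resampling followed by an FFT, on a synthetic unevenly sampled series; the signal components, sample times and resampling rate are assumptions, not the paper's data.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import lombscargle

rng = np.random.default_rng(1)

# Synthetic "RR-like" series: 0.10 Hz and 0.25 Hz components, uneven times
t = np.sort(rng.uniform(0, 300, 300))  # uneven sample times, s
x = 0.05 * np.sin(2 * np.pi * 0.10 * t) + 0.03 * np.sin(2 * np.pi * 0.25 * t)
x += 0.005 * rng.normal(size=t.size)

# 1) Lomb periodogram: works directly on the uneven samples
freqs = np.linspace(0.01, 0.5, 500)  # Hz
pgram = lombscargle(t, x - x.mean(), 2 * np.pi * freqs)  # expects rad/s

# 2) FFT route: cubic-spline resampling to a uniform 4 Hz grid first
fs = 4.0
tu = np.arange(t[0], t[-1], 1 / fs)
xu = interp1d(t, x, kind="cubic")(tu)
X = np.fft.rfft(xu - xu.mean())
f_fft = np.fft.rfftfreq(tu.size, 1 / fs)
psd_fft = np.abs(X) ** 2 / tu.size

print("Lomb peak at %.3f Hz" % freqs[np.argmax(pgram)])
print("FFT  peak at %.3f Hz" % f_fft[np.argmax(psd_fft)])
```

Comparing the relative heights of the two spectral peaks in each estimate exposes the low-pass filtering effect of the interpolation step described above.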

High Dynamic Range Resampling for Software Radio

The classic problem of recovering arbitrary values of a band-limited signal from its samples has an added complication in software radio applications; namely, the resampling calculations inevitably fold aliases of the analog signal back into the original bandwidth. The phenomenon is quantified by the spur-free dynamic range (SFDR). We demonstrate how a novel application of the Remez (Parks-McClellan) algorithm permits optimal signal recovery and SFDR, far surpassing state-of-the-art resamplers.
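For context, the sketch below uses SciPy's implementation of the Parks-McClellan (Remez) exchange algorithm to design an equiripple low-pass interpolation prototype and reads off its stopband rejection, which bounds how strongly a resampler suppresses folded aliases. The band edges, tap count and 4x interpolation factor are illustrative assumptions, not the paper's design.

```python
import numpy as np
from scipy.signal import remez, freqz

L = 4          # assumed interpolation factor
numtaps = 96   # assumed filter length

# Equiripple low-pass for 4x interpolation: pass 0..0.10, stop 0.15..0.5
# (frequencies normalized to the high-rate sample rate)
taps = remez(numtaps, [0.0, 0.10, 0.15, 0.5], [L, 0.0], fs=1.0)

w, H = freqz(taps, worN=8192, fs=1.0)
mag = 20 * np.log10(np.abs(H) + 1e-12)

passband = mag[w <= 0.10]
stopband = mag[w >= 0.15]
print("passband ripple: %.2f dB" % (passband.max() - passband.min()))
print("stopband peak  : %.1f dB" % stopband.max())  # alias rejection bound
```

The peak stopband level relative to the passband serves here as a rough proxy for the SFDR the resampler can achieve.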

Development of Workplace Environmental Monitoring Systems Using Ubiquitous Sensor Network

In this study, workplace environmental monitoring systems were established using USN (Ubiquitous Sensor Network) technology and LabVIEW. Although existing direct sampling methods yield accurate values at the time points of measurement, they make continuous management and supervision difficult and are costly to operate, so the efficiency and reliability of workplace management by supervisors are relatively low when those methods are used. In this study, systems were established so that information on workplace environmental factors such as temperature, humidity and noise is measured and transmitted to a PC in real time, enabling supervisors to monitor workplaces through LabVIEW on the PC. When an accident occurs in a workplace, supervisors can respond immediately through the monitoring system; the system thus enables integrated workplace management and the prevention of safety accidents. By introducing these monitoring systems, safety accidents due to harmful environmental factors in workplaces can be prevented, and the systems will also be helpful in identifying correlations between safety accidents and occupational diseases by comparing and linking the databases they establish with existing statistical data.
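The study's implementation is built in LabVIEW over a USN, but the basic node-to-monitor flow can be sketched language-agnostically. The Python example below is a hypothetical stand-in: a simulated sensor node sends temperature, humidity and noise readings as JSON over UDP, and a monitor flags threshold violations in real time. All thresholds, ports and field names are assumptions.

```python
import json
import random
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 9999  # assumed monitor address
LIMITS = {"temp_c": 35.0, "humidity": 80.0, "noise_db": 90.0}  # assumed limits

def sensor_node(node_id, n_readings=10):
    """Simulated USN node: periodically transmits one JSON reading."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    time.sleep(0.2)  # give the monitor time to bind
    for _ in range(n_readings):
        reading = {
            "node": node_id,
            "temp_c": random.uniform(20, 40),
            "humidity": random.uniform(30, 95),
            "noise_db": random.uniform(60, 100),
        }
        sock.sendto(json.dumps(reading).encode(), (HOST, PORT))
        time.sleep(0.1)

def monitor():
    """Monitoring PC: receives readings and flags limit violations."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((HOST, PORT))
    sock.settimeout(2.0)
    try:
        while True:
            data, _ = sock.recvfrom(4096)
            r = json.loads(data)
            alarms = [k for k, lim in LIMITS.items() if r[k] > lim]
            print(f"node {r['node']}: " + ("ALARM " + ",".join(alarms) if alarms else "ok"))
    except socket.timeout:
        pass  # no readings for 2 s; end the demo

threading.Thread(target=sensor_node, args=("A1",), daemon=True).start()
monitor()
```

In the deployed system this role is played by the USN radio link and the LabVIEW front panel; the sketch only shows the shape of the data flow.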