Contact Stress Analysis of Spur Gear Teeth Pair

Contact stress analysis between two spur gear teeth was considered at different contact positions, representing a pair of mating gears during rotation. A programme was developed to plot a pair of teeth in contact. This programme was run for every 3° of pinion rotation, from the first point of contact to the last, producing 10 cases. Each case represented a sequential position of contact between the two teeth. The programme gives graphical results for the profiles of the teeth at each position and location of contact during rotation. Finite element models were built for these cases and stress analysis was performed. The results are presented, and the finite element analysis results are compared with theoretical calculations wherever available.

Making Ends Meet: The Challenges of Investing in and Accounting for Sustainability

The transition to sustainable development requires considerable investments from stakeholders, both financial and immaterial. However, accounting for such investments often poses a challenge, as ventures with intangible or non-financial returns remain invisible to conventional accounting techniques and risk assessment. That such investments may significantly contribute to the welfare of those affected is a driving force behind attempts to bridge this gap. This gains crucial importance as investments must also be backed by governments and administrations: entities whose budgets depend on taxpayers' contributions and whose tasks are based on securing the welfare of their citizens. Besides economic welfare, citizens also require social and environmental wellbeing. However, administrations must also safeguard that welfare is guaranteed not only to present but also to future generations. With already strained budgets and the requirement of sustainable development, governments at all levels face the double challenge of making both of these ends meet.

Biosorption of Heavy Metals Contaminating the Wonderfonteinspruit Catchment Area using Desmodesmus sp.

A vast array of biological materials, especially algae, have received increasing attention for heavy metal removal. Algae have proven to be cheaper and more effective for the removal of metallic elements from aqueous solutions. A freshwater algal strain was isolated from Zoo Lake, Johannesburg, South Africa, and identified as Desmodesmus sp. This paper investigates the efficacy of Desmodesmus sp. in removing heavy metals contaminating the Wonderfonteinspruit Catchment Area (WCA) water bodies. The biosorption data fitted the pseudo-second order kinetic and Langmuir isotherm models. The Langmuir maximum uptakes gave the sequence Mn2+ > Ni2+ > Fe2+. The best kinetic results were obtained at a concentration of 120 ppm for Fe3+ and Mn2+, whilst for Ni2+ the best result was at 20 ppm; these are about the same concentrations found in contaminated water in the WCA (Fe3+ 115 ppm, Mn2+ 121 ppm and Ni2+ 26.5 ppm).
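
As an illustration of the isotherm fitting mentioned above, the Langmuir parameters can be recovered from the linearised form Ce/qe = Ce/qmax + 1/(KL·qmax). The sketch below uses synthetic data generated from assumed values of qmax and KL, not the paper's measurements:

```python
import numpy as np

def langmuir_qmax_kl(ce, qe):
    """Fit the linearised Langmuir isotherm Ce/qe = Ce/qmax + 1/(KL*qmax).

    Returns (qmax, KL) from a least-squares line through (Ce, Ce/qe).
    """
    slope, intercept = np.polyfit(ce, ce / qe, 1)
    qmax = 1.0 / slope
    kl = slope / intercept
    return qmax, kl

# Synthetic equilibrium data from a known Langmuir curve
# (qmax = 25 mg/g, KL = 0.05 L/mg), purely to check the fit recovers them.
ce = np.linspace(5.0, 120.0, 10)           # equilibrium concentration, mg/L
qe = 25.0 * 0.05 * ce / (1.0 + 0.05 * ce)  # uptake, mg/g
qmax, kl = langmuir_qmax_kl(ce, qe)
print(round(qmax, 2), round(kl, 3))        # ≈ 25.0 and 0.05
```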

A Comparison of Artificial Neural Networks for Prediction of Suspended Sediment Discharge in Rivers: A Case Study in Malaysia

Prediction of the highly nonlinear behavior of suspended sediment flow in rivers is of prime importance in the field of water resources engineering. In this study, the predictive performance of two Artificial Neural Networks (ANNs), namely the Radial Basis Function (RBF) network and the Multi-Layer Feed-Forward (MLFF) network, has been compared. Time series data of daily suspended sediment discharge and water discharge at the Pari River were used for training and testing the networks. A number of statistical parameters, i.e. root mean square error (RMSE), mean absolute error (MAE), coefficient of efficiency (CE) and coefficient of determination (R2), were used for performance evaluation of the models. Both models produced satisfactory results and showed good agreement between the predicted and observed data. The RBF network model provided slightly better results than the MLFF network model in predicting suspended sediment discharge.
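
The four evaluation statistics named above are straightforward to compute. A minimal sketch follows; CE is taken here in its common Nash-Sutcliffe form, and the obs/pred arrays are made-up placeholders, not the Pari River data:

```python
import numpy as np

def rmse(obs, pred):
    return float(np.sqrt(np.mean((obs - pred) ** 2)))

def mae(obs, pred):
    return float(np.mean(np.abs(obs - pred)))

def nash_sutcliffe_ce(obs, pred):
    # Coefficient of efficiency: 1 - SSE / variance of observations about their mean
    return float(1.0 - np.sum((obs - pred) ** 2) / np.sum((obs - np.mean(obs)) ** 2))

def r_squared(obs, pred):
    r = np.corrcoef(obs, pred)[0, 1]
    return float(r ** 2)

obs = np.array([10.0, 12.0, 15.0, 11.0, 14.0])
pred = np.array([10.5, 11.5, 14.0, 11.5, 14.5])
print(round(rmse(obs, pred), 3), round(mae(obs, pred), 3))  # → 0.632 0.6
```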

Security Risk Analysis Based on the Policy Formalization and the Modeling of Big Systems

Security risk models have been successful in estimating the likelihood of attack for simple security threats. However, modeling complex systems and their security risks remains a challenge. Many methods have been proposed to address this problem, but, being difficult to manipulate and not comprehensive enough, they are not as popular with administrators and decision-makers as they should be. In this paper we propose a new tool purpose-built for modeling big systems. The software takes into account attack threats and security strength.

Salt-Tolerance of Tissue-Cultured Date Palm Cultivars under Controlled Environment

A study was conducted in a greenhouse environment to determine the response of five tissue-cultured date palm cultivars, Al-Ahamad, Nabusaif, Barhee, Khalas, and Kasab, to irrigation water salinity of 1.6, 5, 10, or 20 dS/m. The salinity level of 1.6 dS/m was used as a control. The effects of high salinity on plant survival were manifested from 360 days after planting (DAP) onwards. Three cultivars, Khalas, Kasab and Barhee, were able to tolerate the 10 dS/m salinity level at 24 months after the start of the study. Khalas tolerated the highest salinity level of 20 dS/m, and 'Nabusaif' was found to be the least tolerant cultivar. The average heights of palms and the number of fronds decreased with increasing salinity levels as time progressed.

Variational Iteration Method for the Solution of Boundary Value Problems

In this work, we present a reliable framework for solving boundary value problems of particular significance in solid mechanics. Such problems arise as mathematical models of beam deformation. The algorithm rests mainly on a relatively new technique, the Variational Iteration Method. Some examples are given to confirm the efficiency and accuracy of the method.
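
For reference, the core of the Variational Iteration Method is a correction functional. A generic sketch is given below, with placeholder linear operator L, nonlinear operator N, source g and Lagrange multiplier λ; these are not the paper's specific beam equations:

```latex
u_{n+1}(x) = u_n(x)
  + \int_0^x \lambda(s)\,\bigl[\, L u_n(s) + N \tilde{u}_n(s) - g(s) \,\bigr]\, ds
```

Here \tilde{u}_n is treated as a restricted variation (\delta \tilde{u}_n = 0), and \lambda is identified via integration by parts before iterating from a suitable initial guess u_0.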

Tuning of Thermal FEA Using Krylov Parametric MOR for Subsea Application

A dead leg is a typical subsea production system component. CFD is required to model heat transfer within the dead leg; unfortunately, its solution is time-demanding and thus not suitable for fast prediction or repeated simulations. There is therefore a need to create a thermal FEA model mimicking the heat flows and temperatures seen in CFD cool-down simulations. This paper describes the conventional way of tuning and a new automated way using parametric model order reduction (PMOR) together with an optimization algorithm. The tuned FE analyses replicate the steady-state CFD parameters within a maximum error in heat flow of 6% and 3% using the manual and PMOR methods, respectively. During cool-down, the relative error of the tuned FEA models with respect to temperature is below 5% compared to the CFD. In addition, the PMOR method obtained the correct FEA setup five times faster than the manually tuned FEA.
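
As a sketch of the Krylov idea behind such reduction (one-sided moment matching at s = 0, applied here to a random stable toy system rather than the authors' dead-leg FEA model):

```python
import numpy as np

def krylov_reduce(A, b, c, k):
    """One-sided Arnoldi model order reduction about s = 0.

    Builds an orthonormal basis V of the Krylov subspace
    K_k(A^{-1}, A^{-1} b) and Galerkin-projects the LTI system
    x' = A x + b u, y = c^T x onto it, matching the first k moments
    of the transfer function H(s) = c^T (sI - A)^{-1} b at s = 0.
    """
    n = len(b)
    V = np.zeros((n, k))
    w = np.linalg.solve(A, b)
    V[:, 0] = w / np.linalg.norm(w)
    for j in range(1, k):
        w = np.linalg.solve(A, V[:, j - 1])
        w -= V[:, :j] @ (V[:, :j].T @ w)   # Gram-Schmidt orthogonalisation
        V[:, j] = w / np.linalg.norm(w)
    return V.T @ A @ V, V.T @ b, V.T @ c

# Toy stable "thermal" system (hypothetical data, not the paper's model).
rng = np.random.default_rng(0)
n = 50
A = -np.eye(n) * 5.0 + 0.1 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
c = rng.standard_normal(n)
Ar, br, cr = krylov_reduce(A, b, c, k=6)

H0_full = -c @ np.linalg.solve(A, b)       # H(0) of the full model
H0_red = -cr @ np.linalg.solve(Ar, br)     # H(0) of the 6-state reduced model
print(abs(H0_full - H0_red))               # near machine precision
```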

Modeling Peer-to-Peer Networks with Interest-Based Clusters

In the world of Peer-to-Peer (P2P) networking, different protocols have been developed to make resource sharing and information retrieval more efficient. The SemPeer protocol is a new layer on Gnutella that transforms the connections of the nodes based on semantic information in order to make information retrieval more efficient. However, this transformation causes high clustering in the network, which decreases the number of nodes reached and therefore also the probability of finding a document. In this paper we describe a mathematical model of the Gnutella and SemPeer protocols that captures clustering-related issues, followed by a proposal to modify the SemPeer protocol to achieve moderate clustering. This modification is a form of link management for the individual nodes that allows the SemPeer protocol to be more efficient, because the probability of a successful query in the P2P network is considerably increased. For the validation of the models, we carried out a series of simulations, which supported our results.

A Metric-Set and Model Suggestion for Better Software Project Cost Estimation

Software project effort estimation is frequently seen as complex and expensive for individual software engineers. Software production is in a crisis: it suffers from excessive costs and is often out of control. It has been suggested that software production is out of control because we do not measure, and you cannot control what you cannot measure. During the last decade, a number of studies on cost estimation have been conducted. Metric-set selection has a vital role in software cost estimation studies, yet its importance has been ignored, especially in neural-network-based studies. In this study we explore the reasons for those disappointing results and implement different neural network models using augmented new metrics. The results obtained are compared with previous studies using traditional metrics. To be able to make comparisons, two types of data have been used. The first part of the data is taken from the Constructive Cost Model (COCOMO'81), which is commonly used in previous studies, and the second part is collected according to the new metrics in a leading international company in Turkey. The accuracy of the selected metrics and the data samples is verified using statistical techniques. The model presented here is based on the Multi-Layer Perceptron (MLP). Another difficulty associated with cost estimation studies is the fact that data collection requires time and care. To make more thorough use of the samples collected, the k-fold cross validation method is also implemented. It is concluded that, as long as an accurate and quantifiable set of metrics is defined and measured correctly, neural networks can be applied to software cost estimation studies with success.
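
The k-fold procedure mentioned above can be sketched as follows; the linear least-squares "model" and the synthetic effort data are stand-ins for illustration, not the MLP or the COCOMO'81 data:

```python
import numpy as np

def k_fold_indices(n_samples, k, seed=0):
    """Shuffle sample indices and split them into k roughly equal folds."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    return np.array_split(idx, k)

def cross_validate(X, y, fit, predict, k=5):
    """Average RMSE over k folds; `fit` trains a model, `predict` scores it."""
    folds = k_fold_indices(len(y), k)
    errors = []
    for i in range(k):
        test_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        model = fit(X[train_idx], y[train_idx])
        pred = predict(model, X[test_idx])
        errors.append(np.sqrt(np.mean((pred - y[test_idx]) ** 2)))
    return float(np.mean(errors))

# Demo with a linear least-squares "model" on synthetic effort data.
rng = np.random.default_rng(1)
X = np.c_[np.ones(40), rng.uniform(1, 100, 40)]   # bias + a size metric
y = X @ np.array([5.0, 0.8]) + rng.normal(0, 1.0, 40)
score = cross_validate(X, y,
                       fit=lambda Xt, yt: np.linalg.lstsq(Xt, yt, rcond=None)[0],
                       predict=lambda w, Xt: Xt @ w)
print(round(score, 2))  # mean out-of-fold RMSE, close to the noise std of 1.0
```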

Simulating the Dynamics of Distribution of Hazardous Substances Emitted by Motor Engines in a Residential Quarter

This article is dedicated to the development of mathematical models for determining the dynamics of concentration of hazardous substances in a turbulent urban atmosphere. Developing the mathematical models implied taking into account the time-space variability of the meteorological fields and such properties of the turbulent atmosphere as its vortical and nonlinear nature, dissipativity and diffusivity. Knowledge of the turbulent airflow velocity is not assumed when developing the model. However, a simplified model assumes that the ratio of turbulent to molecular diffusion is a piecewise constant function that changes with vertical distance from the earth's surface. Thereby an important assumption of vertical stratification of urban air, due to atmospheric accumulation of hazardous substances emitted by motor vehicles, is introduced into the mathematical model. The suggested simplified nonlinear mathematical model for determining the sought exhaust concentration at an a priori unknown turbulent flow velocity is reduced, through a non-degenerate transformation, to a model which is subsequently solved analytically.
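
To illustrate the role of a piecewise constant diffusivity, a minimal 1-D vertical diffusion sketch is given below. Every number (grid, diffusivities, source) is hypothetical, and the scheme is a plain explicit finite-difference method, not the authors' analytical solution:

```python
import numpy as np

# 1-D vertical diffusion of a pollutant with piecewise constant diffusivity D(z).
nz, dz, dt = 100, 1.0, 0.2          # 100 m column, 1 m cells, 0.2 s steps
z = (np.arange(nz) + 0.5) * dz
D = np.where(z < 20.0, 0.5, 2.0)    # weaker mixing in the lowest 20 m
c = np.zeros(nz)
c[0] = 100.0                        # exhaust released in the lowest cell

Dface = 0.5 * (D[1:] + D[:-1])      # diffusivity at cell faces
for _ in range(5000):
    flux = -Dface * (c[1:] - c[:-1]) / dz      # Fickian flux between cells
    c[1:-1] += dt / dz * (flux[:-1] - flux[1:])
    c[0] += -dt / dz * flux[0]                 # zero-flux ground boundary
    c[-1] += dt / dz * flux[-1]                # zero-flux top boundary

print(round(c.sum(), 6))  # flux form conserves mass: still 100.0
```

The conservative flux form keeps the column total constant while the pollutant spreads upward more slowly through the low-diffusivity surface layer.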

Laser-Excited Nuclear γ-Source of High Spectral Brightness

This paper considers various channels of gamma-quantum generation via ultra-short high-power laser pulse interaction with different targets. We analyse the possibilities of creating a pulsed gamma-radiation source using laser triggering of some nuclear reactions and isomer targets. It is shown that a sub-MeV monochromatic short pulse of gamma-radiation with pulse energy at the sub-mJ level can be obtained from an isomer target irradiated by an intense laser pulse. For the nuclear reaction channel in light-atom materials, it is shown that a sub-PW laser pulse gives rise to the formation of about a million gamma-photons of multi-MeV energy.

Hardware Prototyping of an Efficient Encryption Engine

An approach to developing an FPGA implementation of a flexible-key RSA encryption engine, usable as a standard device in secured communication systems, is presented. The VHDL model of this RSA encryption engine has the unique characteristic of supporting multiple key sizes, and thus can easily be fitted into systems that require different levels of security. Simple nested-loop addition and subtraction have been used to implement the RSA operation. This has made the processing time faster and used a comparatively small amount of space in the FPGA. The hardware design is targeted at the Altera STRATIX II family, and it was determined that the flexible-key RSA encryption engine is best suited to the device EP2S30F484C3. The RSA encryption implementation used 13,779 logic elements and achieved a clock frequency of 17.77 MHz. It has been verified that this RSA encryption engine can perform 32-bit, 256-bit and 1024-bit encryption operations in less than 41.585 us, 531.515 us and 790.61 us respectively.
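
The phrase "nested-loop addition and subtraction" suggests an interleaved shift-add modular multiplier. A software sketch of that idea (a toy illustration, not the authors' VHDL design) might look like:

```python
def mod_mult(a, b, n):
    """Modular multiplication using only addition, doubling and subtraction,
    as in a hardware nested-loop multiplier: scan the bits of b, double the
    accumulator, add a when the bit is set, and subtract n to reduce.
    """
    result = 0
    for bit in bin(b)[2:]:            # most significant bit first
        result += result              # doubling = left shift
        if bit == "1":
            result += a
        while result >= n:            # reduce by repeated subtraction
            result -= n
    return result

def mod_exp(m, e, n):
    """Square-and-multiply exponentiation built on mod_mult (the RSA core op)."""
    result = 1
    base = m % n
    for bit in bin(e)[2:]:
        result = mod_mult(result, result, n)
        if bit == "1":
            result = mod_mult(result, base, n)
    return result

# Textbook toy key (p=61, q=53, n=3233, e=17, d=2753), far too small for real use.
n, e, d = 3233, 17, 2753
cipher = mod_exp(65, e, n)           # cipher == pow(65, e, n)
assert mod_exp(cipher, d, n) == 65   # decryption recovers the message
```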

Using the Submerged Fermentation Method for Production of Extracellular Lipase by Aspergillus niger

In this study, lipase production by Aspergillus niger has been investigated using submerged fermentation with Kilka fish oil as the main substrate. The Taguchi method with an L9 orthogonal array design was used to investigate the effect of the parameters and their levels on lipase productivity. The optimum conditions for Kilka fish oil concentration, incubation temperature and pH were found to be 3 g/ml, 35°C and 7, respectively. The lipase activity under optimum conditions was 4.59 IU/ml. By comparing this with the productivity of the olive oil medium on the basis of the cost of each medium, it was found that using Kilka fish oil is 84% more economical. Therefore, Kilka fish oil can be used as an economical and suitable substrate for lipase production and industrial usage.

Suppression of Narrowband Interference in Impulse Radio Based High Data Rate UWB WPAN Communication System Using NLOS Channel Model

The suppression of interference by time domain equalizers is studied for a high data rate impulse radio (IR) ultra-wideband communication system. Narrowband systems may interfere with UWB devices, since UWB has very low transmission power and large bandwidth. An SRake receiver improves system performance by combining signals from different paths, which makes SRake receiver techniques attractive in IR-UWB systems. However, a Rake receiver alone fails to suppress narrowband interference (NBI). A hybrid SRake-MMSE time domain equalizer is proposed to overcome this, taking into account both the effect of the number of Rake fingers and the number of equalizer taps; it also combats intersymbol interference. A semi-analytical approach and Monte Carlo simulation are used to investigate the BER performance of the SRake-MMSE receiver on IEEE 802.15.3a UWB channel models. A study of non-line-of-sight indoor channel models (both CM3 and CM4) illustrates that the bit error rate performance of the SRake-MMSE receiver with NBI is better than that of the Rake receiver without NBI. We show that for an MMSE equalizer operating at high SNRs, the number of equalizer taps plays the more significant role in suppressing interference.
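
The MMSE tap computation at the heart of such an equalizer solves the Wiener-Hopf equations w = R⁻¹p. The sketch below uses a hypothetical two-tap ISI channel with BPSK symbols, not the paper's UWB channel models:

```python
import numpy as np

# MMSE linear equalizer: R is the autocorrelation matrix of the received
# samples and p the cross-correlation with the desired symbol.
rng = np.random.default_rng(0)
N, taps, delay = 5000, 7, 3
symbols = rng.choice([-1.0, 1.0], N)              # BPSK
h = np.array([1.0, 0.4])                          # toy ISI channel
received = np.convolve(symbols, h)[:N] + 0.05 * rng.standard_normal(N)

# Training matrix of sliding windows of received samples.
X = np.array([received[i:i + taps] for i in range(N - taps)])
d = symbols[np.arange(N - taps) + taps - 1 - delay]   # aligned desired symbol

R = X.T @ X / len(X)                              # sample autocorrelation
p = X.T @ d / len(X)                              # sample cross-correlation
w = np.linalg.solve(R, p)                         # MMSE taps

decisions = np.sign(X @ w)
ber = np.mean(decisions != d)
print(ber)  # essentially zero at this SNR
```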

Histogram Slicing to Better Reveal Special Thermal Objects

In this paper, an experiment to enhance the visibility of hot objects in a thermal image acquired with an ordinary digital camera is reported. Lowpass and median filters were first applied to suppress distracting granular noise. Common thresholding and slicing techniques were then used on the histogram at different gray levels, followed by a subjective comparative evaluation. The best result was obtained with a threshold level of 115 and 3 slices.
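
The thresholding-plus-slicing step can be sketched as follows, assuming 8-bit gray levels and equal-width slices (the paper does not specify how its slices were placed):

```python
import numpy as np

def threshold_and_slice(gray, threshold=115, n_slices=3):
    """Zero out pixels below `threshold`, then quantise the remaining range
    into `n_slices` equal-width gray-level bands (histogram slicing)."""
    out = np.zeros_like(gray)
    hot = gray >= threshold
    edges = np.linspace(threshold, 255, n_slices + 1)
    # np.digitize maps each hot pixel to a band 1..n_slices
    band = np.digitize(gray[hot], edges[1:-1]) + 1
    out[hot] = (band * (255 // n_slices)).astype(gray.dtype)
    return out

# Hypothetical 8-bit "thermal" image: cool background with three hot spots.
img = np.full((8, 8), 40, dtype=np.uint8)
img[2, 2], img[5, 5], img[6, 6] = 130, 190, 250
sliced = threshold_and_slice(img)
print(sliced[2, 2], sliced[5, 5], sliced[6, 6], sliced[0, 0])  # → 85 170 255 0
```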

Effectiveness of ICT Training Workshop for Tutors of Allama Iqbal Open University, Pakistan

The purpose of the study was to investigate the effectiveness of an ICT training workshop for tutors of Allama Iqbal Open University, Pakistan. The study was delimited to tutors of the Multan region. The total sample comprised 100 tutors; all the tutors who participated in the ICT training workshop in the Multan region were taken as the sample for the study. A questionnaire in two parts, based on a five-point rating scale, was developed by the researcher. Part One concerned the competency level of computer skills, covering five levels of competency, while Part Two was based on items related to training delivery, structure and content. The questionnaire was personally administered and collected back by the researcher himself on the last day of the workshop. The collected data were analyzed using descriptive statistics. Through this study it was found that the majority of the tutors strongly agreed that the training enhanced their computer skills. The majority of the respondents considered themselves to be generally competent in the use of computers. They also agreed that there was appropriate infrastructure and technical support in the lab during the training workshop. Moreover, it was found that the training imparted knowledge of the pedagogy of using computers for distance education.

Recursive Algorithms for Image Segmentation Based on a Discriminant Criterion

In this study, a new criterion for determining the number of classes into which an image should be segmented is proposed. The criterion is based on discriminant analysis, measuring the separability among the segmented classes of pixels. Based on this new discriminant criterion, two algorithms for recursively segmenting the image into the determined number of classes are proposed. The proposed methods can automatically and correctly segment objects under various illuminations into separate images for further processing. Experiments on the extraction of text strings from complex document images demonstrate the effectiveness of the proposed methods.
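
The abstract does not give the criterion's formula; a classic discriminant criterion of this family is Otsu's between-class variance, sketched below for a single threshold (recursive multi-class segmentation would re-apply such a criterion within each class):

```python
import numpy as np

def otsu_threshold(gray):
    """Choose the threshold t maximising the between-class variance
    w0*w1*(mu0 - mu1)^2 of the two pixel classes it creates."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_sep = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        sep = w0 * w1 * (mu0 - mu1) ** 2      # between-class variance
        if sep > best_sep:
            best_t, best_sep = t, sep
    return best_t

# Bimodal toy image: dark background (~50) and bright text pixels (~200).
rng = np.random.default_rng(0)
img = np.clip(np.where(rng.random((64, 64)) < 0.2,
                       rng.normal(200, 5, (64, 64)),
                       rng.normal(50, 5, (64, 64))), 0, 255).astype(np.uint8)
t = otsu_threshold(img)
print(t)  # a threshold separating the two modes
```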

A Multi-Phase Methodology for Investigating Localisation Policies within the GCC: The Hotel Industry in the KSA and the UAE

Due to a high unemployment rate among local people and a high reliance on expatriate workers, the governments of the Gulf Co-operation Council (GCC) countries have been implementing programmes of localisation (replacing foreign workers with GCC nationals). These programmes have been successful in the public sector but much less so in the private sector. However, there are now insufficient jobs for locals in the public sector and the onus to provide employment has fallen on the private sector. This paper is concerned with a study, still a work in progress (certain elements are complete but not the whole study), investigating the effective implementation of localisation policies in four- and five-star hotels in the Kingdom of Saudi Arabia (KSA) and the United Arab Emirates (UAE). The purpose of the paper is to identify the research gap and to present the need for the research; further, it explains how the research was conducted. Studies of localisation in the GCC countries are under-represented in the scholarly literature. Currently, the hotel sectors in KSA and the UAE play an important part in the countries’ economies. However, the total proportion of Saudis working in the hotel sector in KSA is slightly under 8%, and in the UAE the hotel sector remains highly reliant on expatriates. There is therefore a need for research on strategies to enhance the implementation of localisation policies in general and in the hotel sector in particular. Despite the importance of the hotel sector to their economies, there remains a dearth of research into the implementation of localisation policies in this sector. Indeed, as far as the researchers are aware, there is no study examining localisation in the hotel sector in KSA, and few in the UAE. This represents a considerable research gap. Regarding how the research was carried out, a multiple case study strategy was used.
The four- and five-star hotel sector in KSA is one of the cases, while the four- and five-star hotel sector in the UAE is the other case. Four- and five-star hotels in KSA and the UAE were chosen as these countries have the longest established localisation policies of all the GCC states and there are more hotels of these classifications in these countries than in any of the other Gulf countries. A literature review was carried out to underpin the research. The empirical data were gathered in three phases. In order to gain a pre-understanding of the issues pertaining to the research context, Phase I involved eight unstructured interviews with officials from the Saudi Commission for Tourism and Antiquities (three interviewees); the Saudi Human Resources Development Fund (one); the Abu Dhabi Tourism and Culture Authority (three); and the Abu Dhabi Development Fund (one). In Phase II, a questionnaire was administered to 24 managers and 24 employees in four- and five-star hotels in each country to obtain their beliefs, attitudes, opinions, preferences and practices concerning localisation. Unstructured interviews were carried out in Phase III with six managers in each country in order to allow them to express opinions that may not have been explored in sufficient depth in the questionnaire. The interviews in Phases I and III were analysed using thematic analysis and SPSS will be used to analyse the questionnaire data. It is recommended that future research be undertaken on a larger scale, with a larger sample taken from all over KSA and the UAE rather than from only four cities (i.e., Riyadh and Jeddah in KSA and Abu Dhabi and Sharjah in the UAE), as was the case in this research.

Effect of Gamma Irradiation on the Microhardness of Polymer Blends of Poly(Ethyl Methacrylate) (PEMA) and Poly(Ethylene Oxide) (PEO)

The effect of gamma irradiation on the micro-hardness of polymer blends of poly(ethyl methacrylate) (PEMA) and poly(ethylene oxide) (PEO) has been investigated to detect radiation-induced crosslinking. The blend system comprises a non-crystallizable polymer, PEMA, and a crystallizable polymer, PEO. On irradiation, the overall hardness of the blend specimens at different dose levels indicates the occurrence of a crosslinking process. The radiation-induced crosslinking was greater for blends with a lower concentration of PEO. However, an increase in radiation dose causes softening of the blend system due to radiation-induced scission of the chains.