Contributory Factors to Diabetes Dietary Regimen Non-Adherence in Adults with Diabetes

A cross-sectional survey design was used to collect data from 370 diabetic patients. Two instruments were used to obtain data: an in-depth interview guide and a researcher-developed questionnaire. Fisher's exact test was used to investigate the association between the identified factors and non-adherence. The factors identified were: socio-demographic factors such as gender, age, marital status, educational level and occupation; psychosocial obstacles such as non-affordability of the prescribed diet, frustration due to the restriction, limited spousal support, feelings of deprivation, the feeling that temptation is inevitable, difficulty adhering at social gatherings and difficulty revealing to a host that one is diabetic; and health care provider obstacles, namely poor attitude of health workers, irregular diabetes education in clinics, limited number of nutrition education sessions/inability of patients to estimate the desired quantity of food, no reminder postcards or phone calls about upcoming patient appointments, and delayed start of appointments and time wasting in clinics.
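
As an illustration of the association test mentioned above, the following sketch applies Fisher's exact test to a hypothetical 2x2 table (factor present/absent versus adherent/non-adherent); the counts are illustrative and are not the study's data.

```python
# A minimal sketch of the kind of association test described above, using a
# hypothetical 2x2 contingency table; the counts are illustrative only.
from scipy.stats import fisher_exact

#                 adherent  non-adherent
table = [[35,  85],   # e.g. prescribed diet affordable
         [20, 130]]   # e.g. prescribed diet not affordable

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```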

Health Monitoring of Power Transformers by Dissolved Gas Analysis using Regression Method and Study of the Effect of Filtration on Oil

Economically, transformers constitute one of the largest investments in a power system. For this reason, transformer condition assessment and management is a high priority task. If a transformer fails, it has a significant negative impact on revenue and service reliability. Monitoring the state of health of power transformers has traditionally been carried out using laboratory Dissolved Gas Analysis (DGA) tests performed at periodic intervals on oil samples collected from the transformers. DGA of transformer oil is the single best indicator of a transformer's overall condition and is a universal practice today, which started sometime in the 1960s. Failure can occur in a transformer for different reasons. Some failures can be limited or prevented by maintenance. Oil filtration is one of the methods used to remove the dissolved gases and prevent the deterioration of the oil. In this paper, we analyse the DGA data using regression methods and predict the future gas concentrations in the oil. We present a comparative study of different traditional regression methods and the errors generated by their predictions. With the help of these data we can deduce the health of the transformer by identifying the type of fault that has occurred or may occur in the future. Additionally, the effect of filtration on transformer health is highlighted by calculating the probability of failure of a transformer with and without oil filtration.
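
As an illustration of the regression-based prediction described above, the following minimal sketch fits a simple linear least squares trend to a single dissolved gas and extrapolates it to a future sampling date; the hydrogen concentrations used are hypothetical, not measured DGA values, and the paper's comparative study covers several regression methods beyond this one.

```python
# A minimal sketch of trend prediction for a single dissolved gas, assuming a
# simple linear (least squares) regression over sampling time; the H2
# concentrations below are hypothetical, not measured DGA values.
import numpy as np

months = np.array([0, 3, 6, 9, 12, 15, 18], dtype=float)      # sampling times
h2_ppm = np.array([12, 15, 21, 26, 34, 41, 52], dtype=float)  # H2 in ppm

# Fit ppm = a * month + b and extrapolate to a future sampling date.
a, b = np.polyfit(months, h2_ppm, deg=1)
future_month = 24.0
predicted = a * future_month + b
print(f"predicted H2 at month {future_month:.0f}: {predicted:.1f} ppm")
```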

On the Analysis and a Few Optimization Issues of a New iCIM 3000 System at an Academic-Research Oriented Institution

In recent years, the world has witnessed significant work in the field of manufacturing. Special efforts have been made in the implementation of new technologies, and management and control systems, among many others, which have all advanced the field. Closely following all this, and due to the scope of new projects and the need to turn the existing flexible ideas into more autonomous and intelligent ones, i.e. to move toward more intelligent manufacturing, the present paper aims to contribute to the analysis and a few customization issues of a new iCIM 3000 system at the IPSAM. In this process, special emphasis is placed on the material flow problem. To this end, besides a description and analysis of the system and its main parts, some tips on how to define other possible alternative material flow scenarios and a partial analysis of the combinatorial nature of the problem are offered as well. All this is done with the intention of relating it to the use of simulation tools, which are briefly addressed with a special focus on the Witness simulation package. For a better comprehension, the previous elements are supported by a few figures and expressions which help in obtaining the necessary data. Such data and others will be used in the future, when simulating the scenarios in the search for the best material flow configurations.
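
As a small illustration of the combinatorial nature of the material flow problem mentioned above, the following sketch enumerates alternative visiting sequences for a set of stations; the station names are hypothetical and do not reflect the actual iCIM 3000 layout.

```python
# A minimal sketch of the combinatorial side of the material flow problem,
# assuming each scenario is an ordering in which a part visits a set of
# stations; the station names are hypothetical.
from itertools import permutations
from math import factorial

stations = ["turning", "milling", "assembly", "inspection"]

# The number of alternative visiting sequences grows factorially.
print(f"{len(stations)} stations -> {factorial(len(stations))} possible sequences")

# Enumerate a few candidate scenarios that a simulation study could compare.
for order in list(permutations(stations))[:3]:
    print(" -> ".join(order))
```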

Short Time Identification of Feed Drive Systems using Nonlinear Least Squares Method

The design and modeling of nonlinear systems require knowledge of all internally acting parameters and effects. An empirical alternative is to identify the system's transfer function from input and output data as a black box model. This paper presents a procedure using a least squares algorithm for the identification of feed drive system coefficients in the time domain, using a reduced model based on windowed input and output data. The command and response of the axis are first measured over the first 4 ms, and then least squares is applied to estimate the transfer function coefficients for this displacement segment. From the identified coefficients, the subsequent command response segments are estimated. The obtained results reveal a considerable potential of the least squares method to identify the system's time-based coefficients and to predict the command response accurately, as compared to measurements.
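
The following minimal sketch illustrates the kind of least squares identification described above, assuming a second-order discrete ARX-type model built from windowed input and output data; the command and response signals are synthetic, not measured axis data.

```python
# A minimal sketch of black-box identification from windowed input/output data,
# assuming a second-order discrete ARX-type model
#   y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + b2*u[k-2];
# the command u and response y below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(200)                     # command signal
a_true, b_true = [1.5, -0.7], [0.1, 0.05]        # "unknown" plant used for demo
y = np.zeros_like(u)
for k in range(2, len(u)):
    y[k] = (a_true[0]*y[k-1] + a_true[1]*y[k-2]
            + b_true[0]*u[k-1] + b_true[1]*u[k-2])

# Build the regression matrix from past outputs/inputs and solve in the
# least squares sense for the model coefficients.
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1], u[:-2]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
print("identified [a1, a2, b1, b2]:", np.round(theta, 3))
```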

Self-Esteem and Stress Level among Traumatic Brain Injured Adults with Mild, Moderate and Severe Injuries Attending a Day Program Rehabilitation Facility

The purpose of the study was to determine whether, among 32 brain injured adults in community rehabilitation programs, there is a statistically significant relationship between the degree of severity of brain injury and these adults' levels of self-esteem and stress. The researcher hypothesized that there would be a statistically significant difference and a statistically significant relationship in self-esteem and stress levels among TBI adults. A Pearson product moment correlational analysis was conducted, and the results showed a statistically significant relationship between self-esteem and stress levels. Recommendations for future research were suggested upon completion of the study.
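
As an illustration of the correlational analysis mentioned above, the following sketch computes a Pearson product moment correlation on hypothetical self-esteem and stress scores; the values are illustrative and are not the study's data.

```python
# A minimal sketch of the correlational analysis described above, using
# hypothetical self-esteem and stress scores (illustrative values only).
from scipy.stats import pearsonr

self_esteem = [28, 31, 22, 35, 19, 27, 30, 24, 33, 21]
stress      = [18, 15, 26, 12, 29, 20, 14, 24, 13, 27]

r, p = pearsonr(self_esteem, stress)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")
```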

A Few Descriptive and Optimization Issues on the Material Flow at a Research-Academic Institution: The Role of Simulation

Lately, significant work in the area of Intelligent Manufacturing has become public and has mainly been applied within the frame of industrial purposes. Special efforts have been made in the implementation of new technologies, and management and control systems, among many others, which have all advanced the field. Aware of all this, and due to the scope of new projects and the need to turn the existing flexible ideas into more autonomous and intelligent ones, i.e. Intelligent Manufacturing, the present paper aims to contribute to the design and analysis of the material flow in systems, cells or work stations under this new “intelligent” denomination. To this end, besides offering a conceptual basis for some of the key points to be taken into account and some general principles to consider in the design and analysis of the material flow, some tips on how to define other possible alternative material flow scenarios and a classification of the states of a system, cell or workstation are offered as well. All this is done with the intention of relating it to the use of simulation tools, which are briefly addressed with a special focus on the Witness simulation package. For a better comprehension, the previous elements are supported by a detailed layout, other figures and a few expressions which could help in obtaining the necessary data. Such data and others will be used in the future, when simulating the scenarios in the search for the best material flow configurations.

Effects of Beak Trimming on Behavior and Agonistic Activity of Thai Native Pullets Raised in Floor Pens

The effect of beak trimming on the behavior of two strains of Thai native pullets kept in floor pens was studied. Six general activities (standing, crouching, moving, comforting, roosting, and nesting), 6 beak-related activities (preening, feeding, drinking, pecking at inedible objects, feather pecking, and litter pecking), and 4 agonistic activities (head pecking, threatening, avoiding, and fighting) were measured twice a day for 15 consecutive days, starting when the pullets were 19 wk old. It was found that beak-trimmed pullets drank more frequently (P

Improved Back Propagation Algorithm to Avoid Local Minima in Multiplicative Neuron Model

The back propagation algorithm calculates the weight changes of artificial neural networks, and a common approach is to use a training algorithm consisting of a learning rate and a momentum factor. The major drawbacks of the above learning algorithm are the problems of local minima and slow convergence. The addition of an extra term, called a proportional factor, reduces the convergence time of the back propagation algorithm. We have applied this three-term back propagation to multiplicative neuron model learning. The algorithm is tested on the XOR and parity problems and compared with the standard back propagation training algorithm.
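
The following minimal sketch illustrates a three-term weight update combining a learning rate, a momentum factor and a proportional factor driven by the current error; it is a generic illustration and not necessarily the exact update rule used in the paper.

```python
# A minimal sketch of a three-term weight update (learning rate, momentum and a
# proportional factor), assuming the proportional term is driven by the current
# output error; this is a generic illustration, not the paper's exact rule.
import numpy as np

def three_term_update(w, grad, prev_dw, error, eta=0.1, alpha=0.8, gamma=0.05):
    """Return (new_weights, weight_change) for one back propagation step."""
    dw = -eta * grad + alpha * prev_dw + gamma * error
    return w + dw, dw

# One illustrative step with made-up gradient and error values.
w = np.array([0.2, -0.4])
grad = np.array([0.5, -0.1])
prev_dw = np.zeros_like(w)
w, prev_dw = three_term_update(w, grad, prev_dw, error=0.3)
print("updated weights:", np.round(w, 3))
```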

Low-Cost Pre-Treatment of Pharmaceutical Wastewater

Pharmaceutical industries and the effluents of sewage treatment plants are the main sources of residual pharmaceuticals in water resources. These emergent pollutants may adversely impact the biophysical environment. Pharmaceutical industries often generate wastewater whose characteristics and quantity change depending on the manufacturing processes used. Carbamazepine (CBZ), {5H-dibenzo[b,f]azepine-5-carboxamide, C15H12N2O}, is a significant non-biodegradable pharmaceutical contaminant in Jordanian pharmaceutical wastewater, which is not removed by the activated sludge processes in treatment plants. Activated carbon may potentially remove this pollutant from effluents, but the high cost involved suggests that more attention should be given to the potential use of low-cost materials in order to reduce cost and environmental contamination. Powders of Jordanian non-metallic raw materials, namely Azraq Bentonite (AB), Kaolinite (K), and Zeolite (Zeo), were activated (acid and thermal treatment) and evaluated for CBZ removal. The results of batch and column experiments showed around 46% and 67% removal of CBZ, respectively.
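
For reference, removal percentages of the kind reported above are commonly computed from the standard batch adsorption definition, assuming C_0 is the initial CBZ concentration and C_e the concentration after treatment:

```latex
% Standard removal-efficiency definition assumed for the figures above
% (C_0: initial CBZ concentration, C_e: concentration after treatment).
\[
  \text{Removal}\,(\%) = \frac{C_0 - C_e}{C_0} \times 100
\]
```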

Correlation of Viscosity in Nanofluids using Genetic Algorithm-Neural Network (GA-NN)

An accurate and efficient genetic algorithm-neural network (GA-NN) model is developed for predicting the viscosity of nanofluids. A genetic algorithm (GA) is used to optimize the neural network parameters by minimizing the error between the predicted viscosity and the experimental one. Experimental viscosity data for two nanofluids, Al2O3-H2O and CuO-H2O, at temperatures from 278.15 to 343.15 K and volume fractions up to 15%, were taken from the literature. The results of this study reveal that the GA-NN model outperforms conventional neural networks in predicting the viscosity of nanofluids, with mean absolute relative errors of 1.22% and 1.77% for Al2O3-H2O and CuO-H2O, respectively. Furthermore, the results of this work have also been compared with other models. The findings demonstrate that the GA-NN model is an effective method for predicting the viscosity of nanofluids and has better accuracy and simplicity compared with the other models.
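
The following sketch illustrates the GA-NN idea in its simplest form: a small genetic algorithm searches over neural network hyperparameters to minimize the cross-validated prediction error. The synthetic viscosity-like data and the choice of tuned hyperparameters (hidden units and learning rate) are assumptions for illustration, not the paper's actual setup.

```python
# A minimal sketch of a GA searching over neural network hyperparameters.
# Assumptions (not from the paper): a synthetic viscosity-like target and a
# two-gene chromosome (number of hidden units, learning rate).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
T = rng.uniform(278.15, 343.15, 200)           # temperature, K
phi = rng.uniform(0.0, 0.15, 200)              # volume fraction
X = np.column_stack([T, phi])
y = (1.0 + 2.5 * phi) * np.exp(500.0 / T)      # made-up viscosity-like target

def fitness(genes):
    hidden, lr = int(genes[0]), float(genes[1])
    model = MLPRegressor(hidden_layer_sizes=(hidden,), learning_rate_init=lr,
                         max_iter=2000, random_state=0)
    # Higher fitness = lower cross-validated mean absolute error.
    return cross_val_score(model, X, y, cv=3,
                           scoring="neg_mean_absolute_error").mean()

# Population of (hidden units, learning rate) chromosomes.
pop = [np.array([rng.integers(2, 20), rng.uniform(1e-3, 1e-1)]) for _ in range(6)]
for _ in range(5):                                         # generations
    scores = [fitness(p) for p in pop]
    parents = [pop[i] for i in np.argsort(scores)[-2:]]    # keep the two best
    children = []
    for _ in range(len(pop) - 2):
        a, b = parents
        child = np.array([rng.choice([a[0], b[0]]),        # crossover
                          0.5 * (a[1] + b[1])])
        child[0] = max(2, child[0] + rng.integers(-2, 3))  # mutation
        child[1] = abs(child[1] + rng.normal(0, 0.01)) + 1e-4
        children.append(child)
    pop = parents + children

best = pop[int(np.argmax([fitness(p) for p in pop]))]
print(f"best: {int(best[0])} hidden units, learning rate {best[1]:.4f}")
```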

Selection of Initial Modes for the Belief K-modes Method

The belief K-modes method (BKM) is a new clustering technique that handles uncertainty in the attribute values of objects in both the cluster construction task and the classification task. Like the standard version of this method, the BKM results depend on the chosen initial modes. Therefore, a method for selecting the initial modes is developed in this paper, aiming at improving the performance of the BKM approach. Experiments with several real data sets show that, with the developed initial mode selection method, the clustering algorithm produces more accurate results.
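
The following sketch shows a generic frequency-based initial mode selection heuristic for categorical clustering, in the spirit of common k-modes initializations; it is an illustration only and is not the belief-function-based selection method developed in the paper. The toy dataset is hypothetical.

```python
# A minimal sketch of a frequency-based initial mode selection heuristic for
# categorical clustering; a generic illustration, not the paper's method.
import numpy as np

data = np.array([
    ["red",   "small", "round"],
    ["red",   "small", "round"],
    ["blue",  "large", "square"],
    ["blue",  "large", "round"],
    ["green", "small", "square"],
])
k = 2

# Density of an object = average frequency of its attribute values.
def density(obj):
    return np.mean([(data[:, j] == obj[j]).mean() for j in range(data.shape[1])])

def mismatch(a, b):
    return sum(x != y for x, y in zip(a, b))

densities = np.array([density(obj) for obj in data])
modes = [data[int(np.argmax(densities))]]            # densest object first
while len(modes) < k:
    # Next mode: a dense object that is far (many mismatches) from chosen modes.
    scores = [densities[i] * min(mismatch(obj, m) for m in modes)
              for i, obj in enumerate(data)]
    modes.append(data[int(np.argmax(scores))])

print("initial modes:", [list(m) for m in modes])
```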

Visualising Energy Efficiency Landscape

This paper discusses landscape design that could increase energy efficiency in a house. By planting trees in a house compound, the tree shade prevents direct sunlight from heating up the building and helps cool the surrounding air. The requirement for air-conditioning could be minimized and the air quality could be improved. Over the lifetime of a tree, the cost savings from the mentioned benefits could be up to US$200 per tree. The project intends to visually describe a landscape design for a house compound that could enhance energy efficiency and consequently lead to energy savings. The house compound model was developed in three dimensions using AutoCAD 2005, and the animation was programmed using the LightWave 3D software, i.e. Modeler and Layout, to display the tree shading on the walls. The visualization was executed on a VRML Pad platform and implemented in a web environment.

Ensembling Adaptively Constructed Polynomial Regression Models

The subset selection approach to polynomial regression model building assumes that the chosen fixed full set of predefined basis functions contains a subset that can describe the target relation sufficiently well. However, in most cases the necessary set of basis functions is not known and needs to be guessed – a potentially non-trivial (and long) trial and error process. In our research we consider a potentially more efficient approach – Adaptive Basis Function Construction (ABFC). It lets the model building method itself construct the basis functions necessary for creating a model of arbitrary complexity with adequate predictive performance. However, two issues to some extent plague both subset selection and ABFC methods, especially when working with relatively small data samples: selection bias and selection instability. We try to correct these issues by model post-evaluation using cross-validation and model ensembling. To evaluate the proposed method, we empirically compare it to ABFC methods without ensembling, to a widely used subset selection method, as well as to some other well-known regression modeling methods, using publicly available data sets.
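
The following minimal sketch illustrates the ensembling side of the approach: several polynomial regression models built on bootstrap resamples are averaged, with cross-validation guiding the per-model complexity. It uses a fixed polynomial basis rather than ABFC itself, and the data are synthetic.

```python
# A minimal sketch of ensembling polynomial regression models: members built
# on bootstrap resamples are averaged, and cross-validation selects each
# member's degree. A generic illustration, not ABFC; the data are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, (120, 1))
y = X[:, 0] ** 3 - 2 * X[:, 0] + rng.normal(0, 0.3, 120)

def best_degree(Xs, ys, degrees=(1, 2, 3, 4, 5)):
    scores = [cross_val_score(make_pipeline(PolynomialFeatures(d), LinearRegression()),
                              Xs, ys, cv=5).mean() for d in degrees]
    return degrees[int(np.argmax(scores))]

models = []
for _ in range(10):                               # ensemble of 10 members
    idx = rng.integers(0, len(X), len(X))         # bootstrap resample
    d = best_degree(X[idx], y[idx])
    models.append(make_pipeline(PolynomialFeatures(d), LinearRegression()).fit(X[idx], y[idx]))

X_new = np.array([[1.5]])
prediction = np.mean([m.predict(X_new) for m in models])
print(f"ensemble prediction at x=1.5: {prediction:.3f}")
```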

Three Dimensional Modeling of Mixture Formation and Combustion in a Direct Injection Heavy-Duty Diesel Engine

Due to the stringent legislation on the emissions of diesel engines and the increasing demands on fuel consumption, the importance of detailed 3D simulation of fuel injection, mixing and combustion has increased in recent years. In the present work, the FIRE code has been used to study the detailed modeling of spray and mixture formation in a Caterpillar heavy-duty diesel engine. The paper provides an overview of the implemented submodels, which account for liquid spray atomization, droplet secondary break-up, droplet collision, impingement, turbulent dispersion and evaporation. The simulation was performed from intake valve closing (IVC) to exhaust valve opening (EVO). The predicted in-cylinder pressure is validated by comparison with existing experimental data. A good agreement between the predicted and experimental values confirms the accuracy of the numerical predictions obtained in the present work. Predictions of engine emissions were also performed, and good quantitative agreement between measured and predicted NOx and soot emission data was obtained with the use of the Zeldovich mechanism and the Hiroyasu model. In addition, the results reported in this paper illustrate that numerical simulation can be one of the most powerful and beneficial tools for internal combustion engine design, optimization and performance analysis.

Integrated Cultivation Technique for Microbial Lipid Production by Photosynthetic Microalgae and Locally Oleaginous Yeast

The objective of this research is to study microbial lipid production by locally isolated photosynthetic microalgae and oleaginous yeast via an integrated cultivation technique using CO2 emissions from yeast fermentation. A maximum specific growth rate of Chlorella sp. KKU-S2 of 0.284 (1/d) was obtained under integrated cultivation, and a maximum lipid yield of 1.339 g/L was found after cultivation for 5 days, while a lipid yield of 0.969 g/L was obtained after day 6 of cultivation when using CO2 from air. High values of the volumetric lipid production rate (QP, 0.223 g/L/d), specific product yield (YP/X, 0.194) and volumetric cell mass production rate (QX, 1.153 g/L/d) were found using ambient air CO2 coupled with CO2 emissions from yeast fermentation. An overall lipid yield of 8.33 g/L was obtained (1.339 g/L from Chlorella sp. KKU-S2 and 7.06 g/L from T. maleeae Y30), while a low lipid yield of 0.969 g/L was found using the non-integrated cultivation technique. To our knowledge, this is the first report on lipid production from the locally isolated microalga Chlorella sp. KKU-S2 and the yeast T. maleeae Y30 using an integrated technique to improve the biomass and lipid yields by means of CO2 emissions from yeast fermentation.
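
For reference, the rate and yield quantities reported above are assumed to follow the standard fermentation kinetics definitions, with P the lipid concentration, X the cell mass concentration and t the cultivation time:

```latex
% Standard fermentation-kinetics definitions assumed for the quantities above
% (P: lipid concentration, X: cell mass concentration, t: cultivation time).
\[
  Q_P = \frac{P}{t}, \qquad
  Q_X = \frac{X}{t}, \qquad
  Y_{P/X} = \frac{P}{X}
\]
```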

A Practical Approach for Electricity Load Forecasting

This paper is a continuation of our daily peak energy load forecasting approach using our modified network, which is part of the recurrent networks family and is called the feed forward and feed back multi context artificial neural network (FFFB-MCANN). The inputs to the network were exogenous variables, such as the previous and current change in the weather components and the previous and current status of the day, and endogenous variables, such as the past change in the loads. An endogenous variable, the current change in the load, was used as the network output. Experiments show that using both endogenous and exogenous variables as inputs to the FFFB-MCANN, rather than either exogenous or endogenous variables alone, produces better results. Experiments also show that using the change in variables such as the weather components and the change in the past load as inputs to the FFFB-MCANN, rather than the absolute values of the weather components and past load, has a dramatic impact and produces better accuracy.
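
The following sketch illustrates the change-variable idea: the model is fed differences in the weather component and the past load rather than absolute values. A plain feed forward network stands in for the FFFB-MCANN here, and the data are synthetic.

```python
# A minimal sketch of the "change" (difference) feature idea: the model is fed
# changes in weather and past load rather than absolute values. A plain MLP
# stands in for the FFFB-MCANN here, and the data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
days = 300
temperature = 20 + 10 * np.sin(np.arange(days) * 2 * np.pi / 365) + rng.normal(0, 1, days)
load = 100 + 2 * temperature + rng.normal(0, 3, days)       # made-up peak load

d_temp = np.diff(temperature)       # exogenous: change in weather component
d_load = np.diff(load)              # endogenous: change in load

# Predict tomorrow's load change from today's temperature change and load change.
X = np.column_stack([d_temp[:-1], d_load[:-1]])
y = d_load[1:]

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=0).fit(X, y)
print("predicted next change in peak load:", round(model.predict(X[-1:]).item(), 2))
```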

An Efficient and Generic Hybrid Framework for High Dimensional Data Clustering

Clustering in high dimensional space is a difficult problem which recurs in many fields of science and engineering, e.g., bioinformatics, image processing, pattern recognition and data mining. In high dimensional space some of the dimensions are likely to be irrelevant, thus hiding the possible clustering. In very high dimensions it is common for all the objects in a dataset to be nearly equidistant from each other, completely masking the clusters. Hence, the performance of the clustering algorithm decreases. In this paper, we propose an algorithmic framework which combines the reduct concept of rough set theory with the k-means algorithm to remove the irrelevant dimensions in a high dimensional space and obtain appropriate clusters. Our experiments on test data show that this framework increases the efficiency of the clustering process and the accuracy of the results.
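
The following minimal sketch illustrates the hybrid idea: dimensions judged irrelevant are discarded first, and k-means then clusters the reduced space. A simple variance filter stands in here for the rough set reduct computation, and the data are synthetic.

```python
# A minimal sketch of the hybrid idea: discard dimensions judged irrelevant,
# then cluster with k-means in the reduced space. A variance filter stands in
# for the rough set reduct computation; the data are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

rng = np.random.default_rng(0)
X_informative, _ = make_blobs(n_samples=300, centers=3, n_features=4, random_state=0)
X_noise = rng.normal(0, 0.1, (300, 16))              # irrelevant dimensions
X = np.hstack([X_informative, X_noise])              # 20-dimensional data

# "Reduct"-like step (illustrative): keep only dimensions with high variance.
keep = X.var(axis=0) > 1.0
X_reduced = X[:, keep]

clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_reduced)
print(f"kept {keep.sum()} of {X.shape[1]} dimensions; cluster sizes:",
      np.bincount(clusters))
```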

Mobile Communications Client Server System for Stock Exchange e-Services Access

Using mobile Internet access technologies and e-services, various economic agents can efficiently offer their products or services to a large number of clients. With the support of mobile communications networks, clients can have access to e-services anywhere and anytime. This is a basis for establishing a convergence of the technological and financial interests of mobile operators, software developers, mobile terminal producers and e-content providers. In this paper, a client-server system is presented, using 3G and EDGE mobile terminals, for Stock Exchange e-services access.

Indoor Mapping by Using a Smartphone Device

This paper presents the potential of the smartphone to support the mapping of indoor assets. The advantage of using the smartphone to generate the indoor map is that it can capture, store and reproduce still or video images; indeed, most of us own this powerful gadget. The captured images are usually used by the maintenance team to keep a record for future reference. Here, these images are used to generate 3D models of an object precisely and accurately, providing an efficient and effective solution for data gathering. Thus, it could be a resource for an informative database in asset management.

Accurate Visualization of Graphs of Functions of Two Real Variables

The study of a real function of two real variables can be supported by visualization using a Computer Algebra System (CAS). One type of constraint of such systems is due to the algorithms implemented, which yield continuous approximations of the given function by interpolation. This often masks discontinuities of the function and can produce strange plots that are not compatible with the mathematics. In recent years, point-based geometry has gained increasing attention as an alternative surface representation, both for efficient rendering and for flexible geometry processing of complex surfaces. In this paper we present different artifacts created by mesh surfaces near discontinuities and propose a point-based method that controls and reduces these artifacts. A least squares penalty method for the automatic generation of a mesh that controls the behavior of the chosen function is presented. The special feature of this method is the ability to improve the accuracy of the surface visualization near a set of interior points where the function may be discontinuous. The method is formulated as a minimax problem and the non-uniform mesh is generated using an iterative algorithm. Results show that for large, poorly conditioned matrices the new algorithm gives more accurate results than the classical preconditioned conjugate gradient algorithm.
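
The following sketch illustrates, in a one-dimensional analogue for brevity, the artifact discussed above: a uniform sampling grid connects points across a jump discontinuity and draws a spurious segment, while a grid split at the discontinuity avoids it. It illustrates the problem only and is not the paper's minimax mesh generation algorithm.

```python
# A minimal sketch of the artifact discussed above, shown in one dimension for
# brevity: a uniform grid connects samples across a jump discontinuity and
# draws a spurious segment, while a grid split at the jump avoids it.
import numpy as np
import matplotlib.pyplot as plt

f = lambda x: np.where(x < 0.0, np.sin(x), np.sin(x) + 2.0)   # jump at x = 0

x_uniform = np.linspace(-2, 2, 21)                            # straddles the jump
eps = 1e-3
x_refined = np.concatenate([np.linspace(-2, -eps, 40),        # stop just left of 0
                            np.linspace(eps, 2, 40)])         # restart just right of 0

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.plot(x_uniform, f(x_uniform))
ax1.set_title("uniform mesh (spurious segment)")
ax2.plot(x_refined[:40], f(x_refined[:40]))
ax2.plot(x_refined[40:], f(x_refined[40:]))
ax2.set_title("mesh split at the discontinuity")
plt.tight_layout()
plt.show()
```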