GIS-Based Simulation of Non-point Sources of Pollution in Cameron Highlands, Malaysia

Cameron Highlands is a mountainous area subjected to torrential tropical showers. It extracts 5.8 million liters of water per day for drinking supply from its rivers at several intake points. The water quality of rivers in Cameron Highlands, however, has deteriorated significantly due to land clearing for agriculture, excessive use of pesticides and fertilizers, and construction activities in rapidly developing urban areas. These pollution sources, known as non-point sources, are diverse and hard to identify, and their loads are therefore difficult to estimate. Hence, a Geographical Information System (GIS) was used to provide an extensive approach to evaluating land use and other mapping characteristics and to explain the spatial distribution of non-point sources of contamination in Cameron Highlands. A method to assess the pollution sources was developed using the Cameron Highlands Master Plan (2006-2010), integrating GIS, databases, and pollution loads in the study area. The results show that forest generates the highest annual runoff, 3.56 × 10^8 m3/yr, followed by urban development, 1.46 × 10^8 m3/yr. Furthermore, urban development causes the highest annual BOD load (1.31 × 10^6 kg BOD/yr), while agricultural activities and forest contribute the highest annual loads of phosphorus (6.91 × 10^4 kg P/yr) and nitrogen (2.50 × 10^5 kg N/yr), respectively. Best management practices (BMPs) are therefore recommended to reduce pollution levels in the area.
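
The abstract does not give the load equations, but GIS studies of this kind typically use the export-coefficient method: annual runoff per land-use polygon is rainfall times a runoff coefficient times area, and pollutant load is runoff volume times a per-land-use export concentration. A minimal sketch, with entirely illustrative coefficient values (not those of the Cameron Highlands study):

```python
# Hypothetical export-coefficient calculation; all numbers are placeholders.
ANNUAL_RAINFALL_M = 2.66  # assumed mean annual rainfall (m/yr), illustrative only

# land use: (area in m2, runoff coefficient, pollutant export in kg per m3 of runoff)
land_uses = {
    "forest":      (5.0e8, 0.30, {"BOD": 1e-4, "N": 7e-4, "P": 2e-5}),
    "urban":       (1.2e8, 0.70, {"BOD": 9e-3, "N": 4e-4, "P": 5e-5}),
    "agriculture": (2.0e8, 0.50, {"BOD": 2e-3, "N": 5e-4, "P": 3e-4}),
}

for name, (area_m2, c_runoff, export) in land_uses.items():
    runoff_m3 = c_runoff * ANNUAL_RAINFALL_M * area_m2    # annual runoff volume
    loads = {k: v * runoff_m3 for k, v in export.items()}  # annual pollutant loads
    print(f"{name}: runoff {runoff_m3:.2e} m3/yr, " +
          ", ".join(f"{k} {v:.2e} kg/yr" for k, v in loads.items()))
```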

Automatic Generation of OWL Ontologies from UML Class Diagrams Based on Meta-Modelling and Graph Grammars

The model-driven paradigm places models at the center of the development process. These models are expressed in modeling languages such as UML, the OMG-standardized language that has become essential to software development. Similarly, the ontology engineering paradigm places ontologies at the center of the development process, with OWL as the principal language for knowledge representation. Building ontologies from scratch is generally a difficult task. UML and OWL converge in several respects, such as classes and associations. In this paper, we exploit this convergence to propose an approach, based on meta-modelling and graph grammars and situated within the MDA architecture, for the automatic generation of OWL ontologies from UML class diagrams. The transformation is driven by transformation rules whose level of abstraction is kept close to the application so that the resulting ontologies are usable. We illustrate the approach with an example.
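
As an illustration of the class/association convergence the approach exploits, here is a minimal Python sketch of two such mappings (UML class to owl:Class, UML association to owl:ObjectProperty) using rdflib; the paper's actual rules are graph-grammar productions over meta-models, so the namespace and the two rules below are only assumed stand-ins:

```python
from rdflib import Graph, Namespace
from rdflib.namespace import RDF, RDFS, OWL

EX = Namespace("http://example.org/onto#")  # hypothetical target namespace
g = Graph()

uml_classes = ["Person", "Company"]                     # from the class diagram
uml_associations = [("worksFor", "Person", "Company")]  # (name, source, target)

for c in uml_classes:                    # rule 1: UML class -> owl:Class
    g.add((EX[c], RDF.type, OWL.Class))

for name, src, dst in uml_associations:  # rule 2: association -> object property
    p = EX[name]
    g.add((p, RDF.type, OWL.ObjectProperty))
    g.add((p, RDFS.domain, EX[src]))
    g.add((p, RDFS.range, EX[dst]))

print(g.serialize(format="turtle"))
```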

Determination of In Vitro Susceptibility of the Typhoid Pathogens to Synergistic Action of Euphorbia hirta, Euphorbia heterophylla and Phyllanthus niruri for Possible Development of Effective Anti-Typhoid Drugs

Studies were carried out to determine the in vitro susceptibility of the typhoid pathogens to the combined action of Euphorbia hirta, Euphorbia heterophylla and Phyllanthus niruri. Clinical isolates of the typhoid bacilli were subjected to susceptibility testing using the agar diffusion technique, and the minimum inhibitory concentration (MIC) was determined with the tube dilution technique. When challenged with doses of the extracts from the three medicinal plants, these isolates showed zones of inhibition as wide as 26±0.2 mm, 22±0.1 mm and 18±0.0 mm, respectively. The MIC determination revealed organisms inhibited at varying concentrations of the extracts: E. hirta (S. typhi 0.250 mg/ml, S. paratyphi A 0.125 mg/ml, S. paratyphi B 0.185 mg/ml and S. paratyphi C 0.225 mg/ml), E. heterophylla (S. typhi 0.280 mg/ml, S. paratyphi A 0.150 mg/ml, S. paratyphi B 0.200 mg/ml and S. paratyphi C 0.250 mg/ml) and P. niruri (S. typhi 0.150 mg/ml, S. paratyphi A 0.100 mg/ml, S. paratyphi B 0.115 mg/ml and S. paratyphi C 0.125 mg/ml). The synergy between the three plants in a ratio of 1:1:1 gave very low MICs for the test pathogens: S. typhi 0.025 mg/ml, S. paratyphi A 0.080 mg/ml, S. paratyphi B 0.015 mg/ml and S. paratyphi C 0.10 mg/ml, with diameter zones of inhibition (DZI) of 35±0.2 mm, 28±0.4 mm, 20±0.1 mm and 32±0.3 mm, respectively. The secondary metabolites were identified using simple methods and HPLC. Organic components such as anthraquinones, various alkaloids, tannins, 6-ethoxy-1,2,3,4-tetrahydro-2,2,4-trimethyl and steroids were identified. The prevalence of Salmonella infection, a deadly disease, is still very high in parts of Nigeria, and the synergistic action of these three plants is very strong. It is concluded that pharmaceutical companies should take advantage of these findings to develop new anti-typhoid drugs from these plants.

Three Dimensional Modeling of Mixture Formation and Combustion in a Direct Injection Heavy-Duty Diesel Engine

Due to stringent emission legislation for diesel engines and increasing demands on fuel consumption, the importance of detailed 3D simulation of fuel injection, mixing and combustion has increased in recent years. In the present work, the FIRE code has been used to study the detailed modeling of spray and mixture formation in a Caterpillar heavy-duty diesel engine. The paper provides an overview of the submodels implemented, which account for liquid spray atomization, droplet secondary break-up, droplet collision, impingement, turbulent dispersion and evaporation. The simulation was performed from intake valve closing (IVC) to exhaust valve opening (EVO). The predicted in-cylinder pressure is validated against existing experimental data, and the good agreement between predicted and measured values supports the accuracy of the numerical predictions obtained in the present work. Engine emissions were also predicted, and good quantitative agreement between measured and predicted NOx and soot emission data was obtained using the Zeldovich mechanism and the Hiroyasu model. In addition, the results reported in this paper illustrate that numerical simulation can be one of the most powerful and beneficial tools for internal combustion engine design, optimization and performance analysis.
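
For reference, the thermal NO formation mechanism named in the abstract consists, in its extended form, of three elementary reactions:

```latex
% Extended Zeldovich (thermal NO) mechanism
\begin{align}
\mathrm{N_2 + O}  &\rightleftharpoons \mathrm{NO + N}\\
\mathrm{N + O_2}  &\rightleftharpoons \mathrm{NO + O}\\
\mathrm{N + OH}   &\rightleftharpoons \mathrm{NO + H}
\end{align}
```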

Hardware Stream Cipher Based On LFSR and Modular Division Circuit

A proposal for a secure stream cipher based on Linear Feedback Shift Registers (LFSRs) is presented here. In this method, the shift-register structure used for polynomial modular division is combined with an LFSR keystream generator to yield a new keystream generator with much higher periodicity. Security is introduced by combining state bits of the LFSR keystream generator through a Boolean function and taking the output from that function. This introduces non-linearity and security into the structure in a way similar to the non-linear filter generator. The security and throughput of the suggested stream cipher are found to be much greater than those of known LFSR-based structures for the same key length.
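
The modular-division register itself is not reproduced here, but the filter-generator idea the abstract describes can be sketched in a few lines; the feedback taps, filter taps and Boolean function below are illustrative placeholders, not the proposed cipher:

```python
# Nonlinear filter generator: an LFSR whose state bits are combined by a
# Boolean function to produce the keystream.
def lfsr_filter_keystream(state, n_bits, taps=(0, 2), filt=(1, 4, 7, 10, 15)):
    """state: nonzero 16-bit initial state; taps: feedback positions;
    filt: state positions fed to the Boolean combining function."""
    for _ in range(n_bits):
        bits = [(state >> i) & 1 for i in filt]
        # example nonlinear Boolean function: XOR terms plus an AND term
        yield bits[0] ^ bits[1] ^ bits[2] ^ (bits[3] & bits[4])
        fb = 0
        for t in taps:                      # linear feedback from tap bits
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << 15)   # shift and insert feedback bit

ks = list(lfsr_filter_keystream(0xACE1, 32))
print("".join(map(str, ks)))
```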

Comparison of Neural Network and Logistic Regression Methods to Predict Xerostomia after Radiotherapy

To evaluate the ability to predict xerostomia after radiotherapy, we constructed and compared neural network and logistic regression models. The study included 61 patients who completed a questionnaire about their quality of life (QoL) before and after a full course of radiation therapy. From this questionnaire, statistical data about the condition of the patients' salivary glands were obtained and used as inputs to the neural network and logistic regression models to predict the probability of xerostomia. Seven variables were selected from the statistical data according to Cramer's V and point-biserial correlation values and were used to train each model. The resulting models achieved, respectively, 0.88 and 0.89 for the area under the ROC curve (AUC), 9.20 and 7.65 for the sum of squared errors (SSE), and 13.7% and 19.0% for the mean absolute percentage error (MAPE). These results demonstrate that both neural network and logistic regression methods are effective for predicting the condition of the parotid glands.
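
A rough sketch of such a comparison on synthetic stand-in data (the study's seven questionnaire-derived variables are not public), with scikit-learn's MLPClassifier playing the role of the neural network:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=61, n_features=7, n_informative=5,
                           random_state=0)          # 61 patients, 7 predictors
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = [("neural network", MLPClassifier(hidden_layer_sizes=(8,),
                                           max_iter=2000, random_state=0)),
          ("logistic regression", LogisticRegression(max_iter=1000))]
for name, model in models:
    p = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]  # predicted risk
    print(f"{name}: AUC = {roc_auc_score(y_te, p):.2f}")
```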

The Adoption of Halal Transportation Technologies for Halal Logistics Service Providers in Malaysia

The purpose of this study is i) to investigate the driving factors and barriers of the adoption of Information and Communication Technology (ICT) in Halal logistics and ii) to develop an ICT adoption framework for Halal logistics service providers. The Halal LSPs selected for the study currently use ICT service platforms, such as accounting and management systems, for their Halal logistics business. The study categorizes the factors influencing the adoption decision and process by LSPs into four groups: technology-related factors, organizational and environmental factors, Halal assurance-related factors, and government-related factors. The major contribution of this study is the finding that technology-related factors (ICT compatibility with Halal requirements) and Halal assurance-related factors are the most crucial for Halal LSPs applying ICT for Halal control in transportation operations. Among the government-related factors, the ICT requirements for monitoring Halal included in the Halal Logistics Standard on Transportation (MS2400:2010) are the most influential in the adoption of ICT with government support. In addition, government-related factors are very important in reducing the main barriers and in creating an atmosphere conducive to ICT adoption in the Halal LSP sector.

Quantitative Quality Assessment of Microscopic Image Mosaicing

The mosaicing technique has been employed in more and more application fields, from entertainment to science. In the latter case, the final evaluation is often still left to human observers, who visually assess the quality of the mosaic. A lack of objective measurements in microscopic mosaicing may prevent the mosaic from being used as a starting image for further analysis. In this work we analyze three different metrics and indexes, in the domains of signal analysis, image analysis and visual quality, to measure different aspects of the mosaicing procedure, such as registration errors and visual quality. As a case study we consider the mosaicing algorithm we developed. The experiments have been carried out on mosaics with very different features: histological samples, which consist of detailed, high-contrast images, and live stem cells, which show very low contrast and detail levels.
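
As an illustration of the signal-level and perceptual ends of such a metric suite, the following sketch computes RMSE and SSIM on the overlap strip of two registered tiles; the specific metrics and data used in the paper may differ:

```python
import numpy as np
from skimage.metrics import structural_similarity

rng = np.random.default_rng(0)
overlap_a = rng.random((128, 128))      # stand-in for tile A's overlap strip
overlap_b = overlap_a + 0.02 * rng.standard_normal((128, 128))  # noisy tile B

rmse = np.sqrt(np.mean((overlap_a - overlap_b) ** 2))  # registration error proxy
ssim = structural_similarity(overlap_a, overlap_b,
                             data_range=overlap_b.max() - overlap_b.min())
print(f"RMSE = {rmse:.4f}, SSIM = {ssim:.4f}")
```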

Robot Vision Application based on Complex 3D Pose Computation

The paper presents a technique suitable for robot vision applications where it is not possible to establish the object position from one view. Usually, single-view pose calculation methods are based on the correspondence between image features established at a training step and exactly the same image features extracted at the execution step for a different object pose. When such a correspondence is not feasible because of a lack of distinctive features, a new method is needed, and one is proposed here. In the first step, the method computes the 3D positions of feature points from two views. Subsequently, using a registration algorithm, the set of 3D feature points extracted at the execution phase is aligned with the set of 3D feature points extracted at the training phase. The result is a Euclidean transform to be used by the robot head for reorientation at the execution step.
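
The registration algorithm is not named in the abstract; one standard way to recover the Euclidean transform between two corresponding 3D point sets is the Kabsch/Procrustes method, sketched below:

```python
import numpy as np

def rigid_align(P, Q):
    """Return rotation R and translation t with Q ~ R @ P + t (P, Q are 3xN)."""
    p_bar, q_bar = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    H = (P - p_bar) @ (Q - q_bar).T                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = q_bar - R @ p_bar
    return R, t

# usage: execution-phase points P are a rigidly moved copy of training points Q
theta = 0.7
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([[0.1], [0.2], [0.3]])
Q = np.random.default_rng(1).random((3, 10))   # training-phase 3D feature points
P = R_true @ Q + t_true                        # execution-phase 3D feature points
R, t = rigid_align(P, Q)
print(np.allclose(R @ P + t, Q))               # True: alignment recovered
```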

Segmentation of Ascending and Descending Aorta in CTA Images

In this study, a new and fast algorithm for Ascending Aorta (AscA) and Descending Aorta (DesA) segmentation in Computed Tomography Angiography (CTA) images is presented. Such segmentation is particularly important for the detection of aortic plaques, aneurysms, calcification and stenosis. The method consists of four steps. First, lung segmentation is performed. Second, the Mediastinum Region (MR) is detected for use in the segmentation. Third, an optimal threshold is applied to the images and components outside the MR are removed. Finally, the AscA and DesA are identified and segmented. Radiologists judged the performance of the method to be good, and its results are medically adequate to support surgical planning.
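
A minimal sketch of the thresholding and region-filtering idea behind steps three and four, assuming Otsu's method as the "optimal threshold" and using a synthetic slice in place of the (unavailable) CTA data:

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

slice_img = np.zeros((256, 256))
yy, xx = np.ogrid[:256, :256]
slice_img[(yy - 90) ** 2 + (xx - 100) ** 2 < 20 ** 2] = 1.0   # mock ascending aorta
slice_img[(yy - 160) ** 2 + (xx - 130) ** 2 < 15 ** 2] = 1.0  # mock descending aorta
slice_img += 0.1 * np.random.default_rng(0).random((256, 256))

mask = slice_img > threshold_otsu(slice_img)   # optimal (Otsu) threshold
for region in regionprops(label(mask)):        # connected-component analysis
    if region.area > 300:                      # discard small noise components
        print(f"candidate aorta at {tuple(round(c) for c in region.centroid)}, "
              f"area {region.area}")
```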

Determination of Cd, Zn, K, pH, TNV, Organic Material and Electrical Conductivity (EC) Distribution in Agricultural Soils Using Geostatistics and GIS (Case Study: South-Western Natanz, Iran)

Soil chemical and physical properties play important roles in the environment, agricultural sustainability and human health. The objective of this research is to determine the spatial distribution patterns of Cd, Zn, K, pH, TNV, organic material and electrical conductivity (EC) in the agricultural soils of the Natanz region in Esfehan province. Geostatistical and non-geostatistical methods were used to predict the spatial distribution of these parameters. Sixty-four composite soil samples were taken at 0-20 cm depth. The study area, located in the south of the Natanz agricultural lands, covers 21,660 hectares. The spatial distribution of Cd, Zn, K, pH, TNV, organic material and EC was determined using geostatistics and a geographic information system. Results showed that the Cd, pH, TNV and K data were normally distributed, while the Zn, OC and EC data were not. Kriging, Inverse Distance Weighting (IDW), Local Polynomial Interpolation (LPI) and Radial Basis Function (RBF) methods were used for interpolation. Trend analysis showed that organic carbon had no trend in either the north-south or the east-west direction, while K and TNV showed second-degree trends. Interpolation accuracy was assessed with several error measures: the mean absolute error (MAE), the mean squared error (MSE) and the mean biased error (MBE). Ordinary kriging (exponential model), LPI, RBF and IDW were selected as the best methods for interpolating the soil parameters. Prediction maps produced by disjunctive kriging showed an intense shortage of organic matter throughout the study area, and more than 63.4 percent of the area was deficient in K.
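
A small sketch of the IDW interpolator and the three error measures on synthetic hold-out data (kriging, LPI and RBF would normally come from a dedicated geostatistics package and are omitted here):

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0):
    """Inverse Distance Weighting: weighted mean with weights 1/d^power."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-12) ** power
    return (w * z_known).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(0)
xy = rng.random((64, 2)) * 1000           # 64 sample locations (m), stand-in data
z = 7.5 + 0.3 * rng.standard_normal(64)   # e.g. soil pH at each location
test = rng.choice(64, 16, replace=False)  # hold-out samples for validation
train = np.setdiff1d(np.arange(64), test)

err = idw(xy[train], z[train], xy[test]) - z[test]
print(f"MAE={np.abs(err).mean():.3f}  MSE={(err**2).mean():.3f}  "
      f"MBE={err.mean():.3f}")
```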

Decision Trees for Predicting Risk of Mortality using Routinely Collected Data

It is well known that logistic regression is the gold-standard method for predicting clinical outcome, especially risk of mortality. In this paper, the decision tree method is proposed for problems that commonly use logistic regression as a solution. The Biochemistry and Haematology Outcome Model (BHOM) dataset, obtained from Portsmouth NHS Hospital for 1 January to 31 December 2001, was divided into four subsets. One subset of training data was used to generate a model, and the resulting model was then applied to the three testing datasets. The performance of each model from both methods was compared using calibration (the χ² test) and discrimination (area under the ROC curve, or c-index). The experiments showed that both methods give reasonable results in terms of the c-index; however, in some cases the calibration value (χ²) was quite high. After conducting the experiments and weighing the advantages and disadvantages of each method, we conclude that decision trees are a worthy alternative to logistic regression in the area of data mining.
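
One common way to compute such a calibration χ² is to group predictions into deciles of predicted risk and compare observed with expected deaths per group (a Hosmer-Lemeshow-style test; the paper's exact grouping is not specified):

```python
import numpy as np
from scipy.stats import chi2

def calibration_chi2(p_pred, y_obs, groups=10):
    """Chi-square over risk deciles: observed vs. expected deaths/survivors."""
    order = np.argsort(p_pred)
    chi_sq = 0.0
    for g in np.array_split(order, groups):
        expected = p_pred[g].sum()            # expected deaths in the group
        observed = y_obs[g].sum()
        n, e0 = len(g), len(g) - expected     # group size, expected survivors
        chi_sq += ((observed - expected) ** 2 / expected
                   + ((n - observed) - e0) ** 2 / e0)
    return chi_sq, chi2.sf(chi_sq, groups - 2)  # Hosmer-Lemeshow df convention

rng = np.random.default_rng(0)
p = rng.uniform(0.01, 0.6, 500)               # stand-in predicted mortality risks
y = (rng.random(500) < p).astype(int)         # outcomes consistent with p
print(calibration_chi2(p, y))                 # low chi2 -> good calibration
```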

Developing OMS in IHL

Managing research knowledge is one way to ensure just-in-time information and knowledge to support research strategists and activities. Unfortunately, vital research knowledge in Institutions of Higher Learning (IHL) is scattered, unstructured and unorganized. Aiming to lay conceptual foundations for understanding and developing an Organizational Memory System (OMS) to facilitate research in IHL, this study identified ten factors contributing to the need for research in IHL and seven internal challenges IHL face in promoting research among their academic members. The study then suggests comprehensive support for managing research knowledge using an OMS, and eight OMS characteristics that support research were identified. Finally, initial work on designing the OMS is presented using a knowledge taxonomy. All analysis is derived from pertinent research papers related to research in IHL and OMS. Further study can be conducted to validate and verify the results presented.

Evaluation of the Impact of Dataset Characteristics for Classification Problems in Biological Applications

The availability of high-dimensional biological datasets, such as those from gene expression, proteomic and metabolic experiments, can be leveraged for the diagnosis and prognosis of diseases. Many classification methods in this area have been studied to predict disease states and to separate predefined classes, such as patients with a particular disease versus healthy controls. However, most existing research focuses on a specific dataset, and there is a lack of generic comparisons between classifiers that might guide biologists or bioinformaticians in selecting the proper algorithm for a new dataset. In this study, we compare the performance of popular classifiers, namely Support Vector Machine (SVM), Logistic Regression, k-Nearest Neighbor (k-NN), Naive Bayes, Decision Tree and Random Forest, on mock datasets. We mimic common biological scenarios by simulating various proportions of truly discriminating biomarkers and different effect sizes. The results show that SVM performs stably and reaches a higher AUC than the other methods, which may be explained by its ability to minimize the probability of error. Moreover, Decision Tree, with its good applicability for diagnosis and prognosis, shows good performance in our experimental setup. Logistic Regression and Random Forest, however, depend strongly on the proportion of discriminating features and perform better as that number increases.
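
A sketch of how such mock datasets can be generated and compared: scikit-learn's make_classification exposes the proportion of truly informative features and their separability, loosely mirroring biomarker count and effect size (the settings below are illustrative, not the paper's):

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

for n_informative in (5, 20, 50):   # number of real "biomarkers" out of 100
    X, y = make_classification(n_samples=200, n_features=100,
                               n_informative=n_informative, n_redundant=0,
                               class_sep=0.8, random_state=0)
    for name, clf in [("SVM", SVC()),
                      ("RF", RandomForestClassifier(random_state=0))]:
        auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
        print(f"{n_informative:3d} informative features, {name}: AUC={auc:.2f}")
```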

Spatial Mapping of Dengue Incidence: A Case Study in Hulu Langat District, Selangor, Malaysia

Dengue is a mosquito-borne infection that has risen to an alarming rate in recent decades. It is found in tropical and sub-tropical climates. In Malaysia, dengue has been declared a national public health threat. This study aimed to map the spatial distribution of dengue cases in the district of Hulu Langat, Selangor, using a combination of Geographic Information System (GIS) and spatial statistics tools. Data related to dengue were gathered from various government health agencies, and the locations of dengue cases were geocoded using a handheld Trimble Juno SB GPS unit. A total of 197 dengue cases occurring in 2003 were used in this study. The data were then aggregated to sub-district level and converted into GIS format. The study also used population and demographic data as well as the boundary of Hulu Langat. To assess the spatial distribution of dengue cases, three spatial statistics methods (Moran's I, average nearest neighbor (ANN) analysis and kernel density estimation) were applied together with spatial analysis in the GIS environment. These three indices were used to analyze the spatial distribution and average distance of dengue incidence and to locate hot spots of dengue cases. The results indicated that the dengue cases were clustered (p < 0.01) when analyzed using Moran's I, with a z-score of 5.03. The ANN analysis gave an average nearest neighbor ratio of 0.518755, less than 1 (p < 0.0001), so the pattern of dengue cases in Hulu Langat district can be expected to be clustered; the corresponding z-score for dengue incidence within the district was -13.0525 (p < 0.0001). It was also found that significant spatial autocorrelation of dengue incidence occurs at an average distance of 380.81 meters (p < 0.0001). Several locations, especially residential areas, were also identified as hot spots of dengue cases in the district.
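
For reference, global Moran's I, the clustering statistic used above, can be computed as follows; the inverse-distance weights and the stand-in data are assumptions, not the study's configuration:

```python
import numpy as np

def morans_i(xy, values):
    """Global Moran's I with inverse-distance spatial weights."""
    n = len(values)
    d = np.linalg.norm(xy[:, None] - xy[None, :], axis=2)
    w = np.where(d > 0, 1.0 / np.maximum(d, 1e-12), 0.0)  # zero on the diagonal
    z = values - values.mean()
    s0 = w.sum()
    return (n / s0) * (w * np.outer(z, z)).sum() / (z ** 2).sum()

rng = np.random.default_rng(0)
xy = rng.random((197, 2)) * 10_000        # 197 case locations (m), stand-in data
cases = rng.poisson(3, 197).astype(float)
print(f"Moran's I = {morans_i(xy, cases):.4f}")  # ~ -1/(n-1) under randomness
```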

White Blood Cells Identification and Counting from Microscopic Blood Image

The counting and analysis of blood cells allow the evaluation and diagnosis of a vast number of diseases. In particular, the analysis of white blood cells (WBCs) is a topic of great interest to hematologists. Nowadays the morphological analysis of blood cells is performed manually by skilled operators, which involves numerous drawbacks, such as slowness of the analysis and non-standard accuracy dependent on the operator's skill. In the literature there are only a few examples of automated systems for analyzing white blood cells, most of which are only partial. This paper presents a complete and fully automatic method for white blood cell identification from microscopic images. The proposed method first locates the white blood cells, from which the nucleus and cytoplasm are subsequently extracted. The whole work has been developed in the MATLAB environment, in particular using the Image Processing Toolbox.
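
The paper's implementation is in MATLAB; as a language-neutral illustration of the core counting idea (WBC nuclei are the most strongly stained structures, so a threshold plus connected-component labeling yields a count), here is a small Python sketch on synthetic data:

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label
from skimage.morphology import remove_small_objects

rng = np.random.default_rng(0)
stain = 0.2 * rng.random((200, 200))                       # background / red cells
yy, xx = np.ogrid[:200, :200]
stain[(yy - 60) ** 2 + (xx - 80) ** 2 < 12 ** 2] = 0.90    # mock WBC nucleus
stain[(yy - 140) ** 2 + (xx - 50) ** 2 < 10 ** 2] = 0.85   # another mock nucleus

nuclei = stain > threshold_otsu(stain)            # strongly stained nuclei stand out
nuclei = remove_small_objects(nuclei, min_size=50)  # drop small artifacts
print("WBC count:", label(nuclei).max())          # connected components = cells
```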

Lean Changeability – Evaluation and Design of Lean and Transformable Factories

In today's turbulent environment, companies face two principal challenges. On the one hand, it is necessary to produce ever more cost-effectively to remain competitive. On the other hand, factories need to be transformable in order to manage unpredictable changes in the corporate environment. To deal with these challenges, companies apply the philosophy of lean production in the first case and the philosophy of transformability in the second. To a certain extent these two approaches pull in different directions, which can cause conflicts when designing factories. Therefore, the Institute of Production Systems and Logistics (IFA) of the Leibniz University of Hanover has developed a procedure that allows companies to evaluate and design their factories with respect to the requirements of both philosophies.

An Evaluation of Carbon Dioxide Emissions Trading among Enterprises: The Tokyo Cap and Trade Program

This study proposes three methods to evaluate the Tokyo Cap and Trade Program when emissions trading is performed virtually among enterprises, focusing on carbon dioxide (CO2), the only emitted greenhouse gas that tends to increase. The first method identifies the optimum reduction rate for the highest cost benefit, the second examines emissions trading among enterprises through market trading, and the third verifies long-term emissions trading over the term of the plan (2010-2019), partly using Geographic Information Systems (GIS) to check the validity of the trading. The findings of this study can be summarized in three points. 1. Since the total cost benefit is greatest at a 44% reduction rate, the rate can be set higher than that of the Tokyo Cap and Trade Program to obtain a greater total cost benefit. 2. At a 44% reduction rate, among 320 enterprises, 8 purchasing enterprises and 245 selling enterprises profit from emissions trading, while 67 enterprises reduce emissions voluntarily without trading. To further promote emissions trading, it is therefore necessary to increase the number of purchasing enterprises and thereby the volume of emissions traded. 3. Compared with short-term trading, few enterprises benefit in each year of the long-term emissions trading of the Tokyo Cap and Trade Program; at most 81 enterprises can profit from emissions trading in FY 2019. Therefore, by setting the reduction rate higher, it is necessary to increase the number of enterprises that participate in and benefit from emissions trading while restraining CO2 emissions.

Computer Modeling of Drug Distribution after Intravitreal Administration

Intravitreal injection (IVI) is the most common treatment for diseases of the posterior segment of the eye, such as endophthalmitis, retinitis, age-related macular degeneration, diabetic retinopathy, uveitis and retinal detachment. Most of the drugs used to treat vitreoretinal diseases have a narrow concentration range in which they are effective and may be toxic at higher concentrations. It is therefore critical to know the drug distribution within the eye following intravitreal injection. With knowledge of drug distribution, ophthalmologists can decide on the injection frequency while minimizing damage to tissues. The goal of this study was to develop a computer model to predict intraocular concentrations and pharmacokinetics of intravitreally injected drugs. A finite volume model was created to predict the distribution of two drugs with different physicochemical properties in the rabbit eye, with model parameters obtained from a literature review. To validate the numerical model, in vivo spatial concentration profiles from the lens to the retina were compared with the numerical results; the difference between the numerical and experimental data was less than 5%. This validation provides strong support for the numerical methodology and the associated assumptions of the current study.
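
A toy version of such a finite-volume calculation: a 1D explicit diffusion solve along the lens-to-retina axis with a clearance boundary at the retina. The geometry, diffusivity and clearance values are placeholders, not the rabbit-eye parameters of the study:

```python
import numpy as np

L, N = 0.015, 60                 # vitreous depth ~15 mm, 60 control volumes
dx = L / N
D = 5e-10                        # assumed drug diffusivity (m^2/s)
k_ret = 1e-7                     # assumed retinal clearance velocity (m/s)
dt = 0.4 * dx * dx / D           # stable explicit time step

c = np.zeros(N)
c[N // 2] = 1.0                  # normalized bolus near mid-vitreous

for _ in range(20000):
    flux = -D * np.diff(c) / dx          # diffusive flux at interior faces
    dc = np.zeros(N)
    dc[:-1] -= flux / dx                 # flux leaving each cell to the right
    dc[1:] += flux / dx                  # flux entering each cell from the left
    dc[-1] -= k_ret * c[-1] / dx         # drug cleared across the retina
    c += dt * dc                         # lens side (cell 0): no-flux wall
print("peak concentration now at cell", int(np.argmax(c)))
```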

Real-time Haptic Modeling and Simulation for Prosthetic Insertion

In this work a surgical simulator is produced which enables a trainee otologist to conduct a virtual, real-time prosthetic insertion. The simulator provides the Ear, Nose and Throat surgeon with real-time visual and haptic responses during virtual cochlear implantation into a 3D model of the human Scala Tympani (ST). The parametric model is derived from measured data published in the literature and accounts for human morphological variance, such as differences in cochlear shape, enabling patient-specific pre-operative assessment. Haptic modeling techniques use real physical data and insertion force measurements to develop a force model which mimics the physical behavior of an implant as it collides with the ST walls during insertion. Output force profiles acquired from the insertion studies conducted in this work are used to validate the haptic model. The simulator provides the user with real-time, quantitative insertion force information and the associated electrode position as the user inserts the virtual implant into the ST model. The information provided by this study may also be of use to implant manufacturers for design enhancements, as well as for training specialists in optimal force administration using the simulator. The paper reports on the methods for anatomical modeling and haptic algorithm development, with a focus on simulator design, development, optimization and validation. The techniques may be transferable to other medical applications that involve prosthetic device insertions where the user's vision is obstructed.
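
The paper's force model is fitted to measured insertion data; as a generic illustration of how haptic feedback is commonly rendered on wall contact, here is a toy penalty-based (spring-damper) sketch with placeholder stiffness and damping values:

```python
# Penalty-based haptic contact: restoring force proportional to penetration
# depth plus a damping term; k and b below are illustrative, not fitted values.
def contact_force(penetration_m, velocity_m_s, k=800.0, b=2.0):
    """Return feedback force (N) for wall penetration; zero when not in contact."""
    if penetration_m <= 0.0:
        return 0.0
    return k * penetration_m + b * max(velocity_m_s, 0.0)  # spring + damping

# e.g. implant tip 0.5 mm into the ST wall, advancing at 1 mm/s:
print(f"feedback force = {contact_force(5e-4, 1e-3):.3f} N")
```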