The Impact of Rehabilitation Approaches in the Sustainability of the Management of Small Tanks in Sri Lanka

Small tanks, the ancient man-made rainwater storage systems, support the peasant life and agriculture of the dry zone of Sri Lanka. Many small tanks were abandoned over time for various reasons. Such tanks, rehabilitated in the recent past, were found to be less sustainable, and most of these rehabilitation approaches have failed. The objective of this research is to assess the impact of rehabilitation approaches on the management of small tanks in the Kurunegala District of Sri Lanka, with respect to eight small tanks. A sustainability index was developed using seven indicators representing the ability and commitment of the villagers to maintain these tanks. The sustainability index of the eight tanks varied between 47.2 and 79.2 out of a total score of 100. The conclusion is that the approaches used for tank rehabilitation have a significant effect on the sustainability of the management of these small tanks.
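A composite index of this kind is typically computed as a weighted sum of normalised indicator scores scaled to 0-100. The sketch below is a minimal illustration of the idea, not the paper's actual scheme; the indicator values and equal weights are hypothetical.

```python
def sustainability_index(scores, weights):
    """Weighted sum of indicator scores (each in [0, 1]) scaled to 0-100."""
    assert len(scores) == len(weights)
    total = sum(w * s for s, w in zip(scores, weights))
    return 100.0 * total / sum(weights)

# Illustrative only: seven equally weighted indicator scores for one tank.
example = sustainability_index([0.9, 0.8, 0.7, 0.6, 0.8, 0.9, 0.85], [1.0] * 7)
```

With equal weights this reduces to the mean indicator score times 100, which keeps the index comparable across tanks regardless of how many indicators are scored.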

Super Resolution Blind Reconstruction of Low Resolution Images using Wavelet-based Fusion

Crucial information barely visible to the human eye is often embedded in a series of low resolution images taken of the same scene. Super resolution reconstruction is the process of combining several low resolution images into a single higher resolution image. The ideal algorithm should be fast, and should add sharpness and detail, both at edges and in smooth regions, without adding artifacts. In this paper we propose a super resolution blind reconstruction technique for linearly degraded images. The proposed algorithm is divided into three parts: image registration, wavelet-based fusion and image restoration. Three low resolution images are considered, which may be sub-pixel shifted, rotated, blurred or noisy. The sub-pixel shifted images are registered using an affine transformation model, a wavelet-based fusion is performed, and the noise is removed using soft thresholding. The proposed technique reduces blocking artifacts, smooths edges, and is able to restore high frequency details in an image. It is efficient and computationally fast, with a clear prospect of real-time implementation.
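The fusion step can be sketched with a one-level Haar transform: approximation coefficients of the registered images are averaged, while at each position the detail coefficient with the largest magnitude is kept. The wavelet family and fusion rule here are illustrative assumptions, not necessarily those used in the paper.

```python
import numpy as np

def haar_dwt2(x):
    """One-level 2-D Haar transform (even-sized input assumed)."""
    a = (x[0::2] + x[1::2]) / 2; d = (x[0::2] - x[1::2]) / 2          # rows
    ll = (a[:, 0::2] + a[:, 1::2]) / 2; lh = (a[:, 0::2] - a[:, 1::2]) / 2
    hl = (d[:, 0::2] + d[:, 1::2]) / 2; hh = (d[:, 0::2] - d[:, 1::2]) / 2
    return ll, lh, hl, hh

def haar_idwt2(ll, lh, hl, hh):
    """Inverse of haar_dwt2 (exact reconstruction)."""
    a = np.empty((ll.shape[0], ll.shape[1] * 2)); d = np.empty_like(a)
    a[:, 0::2] = ll + lh; a[:, 1::2] = ll - lh
    d[:, 0::2] = hl + hh; d[:, 1::2] = hl - hh
    x = np.empty((a.shape[0] * 2, a.shape[1]))
    x[0::2] = a + d; x[1::2] = a - d
    return x

def fuse(images):
    """Fuse registered images: average approximations, keep max-magnitude details."""
    bands = [haar_dwt2(im) for im in images]
    ll = np.mean([b[0] for b in bands], axis=0)
    detail = []
    for k in (1, 2, 3):
        stack = np.stack([b[k] for b in bands])
        idx = np.abs(stack).argmax(axis=0)          # pick the sharpest detail
        detail.append(np.take_along_axis(stack, idx[None], axis=0)[0])
    return haar_idwt2(ll, *detail)
```

The max-magnitude rule favours whichever input image carries the strongest edge at each location, which is what lets fusion recover detail no single input shows clearly.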

Human Body Configuration using a Bayesian Model

In this paper we present a novel approach to human body configuration based on silhouettes. We address this problem within a Bayesian framework, using an effective model-based MCMC (Markov Chain Monte Carlo) method in which the best configuration is defined as the MAP (maximum a posteriori) estimate in the Bayesian model. The model-based MCMC uses a human body model to drive the sampling of the solution space: it converts the original high-dimensional space into a restricted sub-space constructed from the human model and uses a hybrid sampling algorithm. We choose an explicit human model and carefully select the likelihood functions to represent the best configuration solution. Experiments show that this method obtains accurate configurations in a time-saving manner for different humans viewed from multiple angles.
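The MAP-by-sampling idea can be sketched with a plain Metropolis-Hastings chain that tracks the best sample seen. The toy one-dimensional posterior below stands in for the silhouette likelihood; the paper's model-driven proposal and hybrid sampler are not reproduced here.

```python
import math, random

def mcmc_map(log_post, init, proposal, n_iter=5000, seed=0):
    """Metropolis-Hastings sampling; return the highest-posterior (MAP) sample seen."""
    rng = random.Random(seed)
    x, lp = init, log_post(init)
    best, best_lp = x, lp
    for _ in range(n_iter):
        y = proposal(x, rng)
        lq = log_post(y)
        # Accept uphill moves always; downhill moves with MH probability.
        if lq >= lp or rng.random() < math.exp(lq - lp):
            x, lp = y, lq
            if lp > best_lp:
                best, best_lp = x, lp
    return best

# Toy 1-D log-posterior peaked at 2.0, standing in for the silhouette likelihood.
peak = mcmc_map(lambda t: -(t - 2.0) ** 2,
                init=0.0,
                proposal=lambda t, rng: t + rng.gauss(0, 0.5))
```

In the paper's setting the state would be a full pose vector and the proposal would be constrained by the body model, which is what keeps the chain inside the restricted sub-space.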

Correlations between Cleaning Frequency of Reservoir and Water Tower and Parameters of Water Quality

This study sampled and analyzed water quality in water reservoirs and water towers installed in two kinds of residential buildings and in school facilities. Water quality data were collected for correlation analysis with the frequency of sanitization of the water reservoirs, obtained by questioning building managers about the inspection charts recorded for the reservoir equipment. A statistical software package (SPSS) was applied to the two groups of data (cleaning frequency and water quality) for regression analysis to determine the optimal frequency of sanitization. The correlation coefficient (R) in this paper represents the degree of correlation, with values of R ranging from +1 to -1. After investigating three categories of drinking water users, this study found that the frequency of sanitization of the water reservoir significantly influenced drinking water quality: a higher frequency of sanitization (more than four times per year) implied a higher quality of drinking water. The results indicate that water reservoirs and water towers should be sanitized at least twice annually to achieve safe drinking water.
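The correlation coefficient R referred to above is the Pearson coefficient, computable directly from paired observations; a minimal sketch:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient, in [-1, +1]."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

R = +1 indicates a perfect positive linear relation (e.g. more cleaning, better quality) and R = -1 a perfect negative one; values near 0 indicate no linear association.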

Apoptosis Induced by Low-concentration Ethanol in Hepatocellular Carcinoma Cell Strains and Down-regulated AFP and Survivin Analysis by Proteomic Technology

Ethanol is widely used as a therapeutic reagent against hepatocellular carcinoma (HCC, or hepatoma), as it can induce hepatocellular carcinoma cell apoptosis at low concentrations through a multifactorial process regulated by several unknown proteins. This paper provides a simple and accessible proteomic strategy for exploring differentially expressed proteins in the apoptotic pathway. The appropriate concentrations of ethanol required to induce HepG2 cell apoptosis were first assessed by MTT assay, Giemsa staining and fluorescence staining. Next, the central proteins involved in the apoptosis pathway were determined using 2D-PAGE, SDS-PAGE and bio-software analysis. Finally, the down-regulation of two proteins, AFP and survivin, was determined by immunocytochemistry and reverse transcriptase PCR (RT-PCR). The simple, useful method demonstrated here provides a new approach to proteomic analysis of key bio-regulatory processes, including proliferation, differentiation, apoptosis, immunity and metastasis.

Advanced Neural Network Learning Applied to Pulping Modeling

This paper reports work done to improve the modeling of complex processes when only small experimental data sets are available. Neural networks are used to capture the nonlinear underlying phenomena contained in the data set and to partly eliminate the burden of having to specify completely the structure of the model. Two different types of neural networks were applied to the pulping problem; three-layer feed-forward networks trained using Preconditioned Conjugate Gradient (PCG) methods were used in this investigation. Preconditioning improves convergence by lowering the condition number and increasing the clustering of the eigenvalues: the idea is to solve the modified problem M^-1 Ax = M^-1 b, where M is a positive-definite preconditioner closely related to A. We focused on PCG-based training methods originating from optimization theory, namely Preconditioned Conjugate Gradient with Fletcher-Reeves update (PCGF), with Polak-Ribiere update (PCGP) and with Powell-Beale restarts (PCGB). In the simulations, the behavior of the PCG methods proved robust against phenomena such as oscillations due to large step sizes.
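PCG is easiest to see in its original linear-system setting. The sketch below solves Ax = b with a Jacobi (diagonal) preconditioner and the standard β update; the paper's training variants (PCGF, PCGP, PCGB) adapt this scheme to nonlinear network training with different β formulas and restart rules.

```python
import numpy as np

def pcg(A, b, M_inv_diag, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradient for SPD A with diagonal preconditioner."""
    x = np.zeros_like(b)
    r = b - A @ x                 # residual
    z = M_inv_diag * r            # preconditioned residual
    p = z.copy()                  # search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv_diag * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p   # beta = rz_new / rz (Fletcher-Reeves-style)
        rz = rz_new
    return x
```

A well-chosen M clusters the eigenvalues of M^-1 A near 1, which is exactly the condition-number improvement the abstract describes.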

Describing Learning Features of Reusable Resources: A Proposal

One of the main advantages of the LO paradigm is that it makes good-quality, shareable learning material available through the Web. The effectiveness of the retrieval process requires a formal description of the resources (metadata) that closely fits the user's search criteria; in spite of the huge international efforts in this field, educational metadata schemata often fail to fulfil this requirement. This work aims to improve the situation through the definition of a metadata model capturing specific didactic features of shareable learning resources. It classifies LOs into “teacher-oriented” and “student-oriented” categories, in order to describe the role an LO is to play when it is integrated into the educational process. This article describes the model and a first experimental validation carried out in a controlled environment.

Discovery of Human HMG-CoA Reductase Inhibitors Using Structure-Based Pharmacophore Modeling Combined with Molecular Dynamics Simulation Methodologies

3-Hydroxy-3-methylglutaryl coenzyme A reductase (HMGR) catalyzes the conversion of HMG-CoA to mevalonate using NADPH and is involved in the rate-controlling step of mevalonate biosynthesis. Inhibition of HMGR is considered an effective way to lower cholesterol levels, making the enzyme a drug target for treating hypercholesterolemia, a major risk factor for cardiovascular disease. To discover novel HMGR inhibitors, we performed structure-based pharmacophore modeling combined with molecular dynamics (MD) simulation. Four HMGR inhibitors were used for MD simulations, and a representative structure from each simulation was selected by clustering analysis. Four structure-based pharmacophore models were generated using these representative structures. The generated models were validated and used in virtual screening to find novel scaffolds for inhibiting HMGR. The screened compounds were filtered by applying drug-like property criteria and used in molecular docking. Finally, four hit compounds were obtained, and their complexes were refined using energy minimization. These compounds may be potential leads for the design of novel HMGR inhibitors.

Improved Robust Stability and Stabilization Conditions of Discrete-time Delayed System

The problem of robust stability and robust stabilization for a class of discrete-time uncertain systems with time delay is investigated. Based on the Tchebychev inequality and the construction of a new augmented Lyapunov function, improved sufficient conditions ensuring exponential stability and stabilization are established. These conditions are expressed as linear matrix inequalities (LMIs), whose feasibility can easily be checked using the Matlab LMI Toolbox. Compared with previous results in the literature, the new criteria are less conservative. Two numerical examples are provided to demonstrate the improvement and effectiveness of the proposed method.
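As a baseline illustration of an LMI stability condition (for the undelayed nominal system x_{k+1} = A x_k, not the paper's delayed uncertain system): asymptotic stability is equivalent to feasibility of the discrete-time Lyapunov inequality

```latex
\exists\, P = P^{\mathsf{T}} \succ 0 \quad \text{such that} \quad A^{\mathsf{T}} P A - P \prec 0 .
```

Delay-dependent criteria of the kind established in the paper augment P with terms over the delay interval, producing larger but still linear matrix inequalities that solvers check the same way.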

Video Data Mining based on Information Fusion for Tamper Detection

In this paper, we propose novel algorithmic models, based on information fusion and feature transformation in a cross-modal subspace, for different types of residue features extracted from several intra-frame and inter-frame pixel sub-blocks of video sequences, for detecting digital video tampering or forgery. An evaluation of the proposed residue features (the noise residue features and the quantization features), their transformation in the cross-modal subspace, and their multimodal fusion, on an emulated copy-move tamper scenario shows a significant improvement in tamper detection accuracy compared with single-mode features without transformation in the cross-modal subspace.

Forecasting Fraudulent Financial Statements using Data Mining

This paper explores the effectiveness of machine learning techniques in detecting firms that issue fraudulent financial statements (FFS) and deals with the identification of factors associated with FFS. To this end, a number of experiments were conducted using representative learning algorithms, trained on a data set of 164 fraud and non-fraud Greek firms from the period 2001-2002. Deciding which particular method to choose is a complicated problem; a good alternative to choosing only one method is to create a hybrid forecasting system incorporating a number of possible solution methods as components (an ensemble of classifiers). For this purpose, we implemented a hybrid decision support system that combines the representative algorithms using a stacking variant methodology and achieves better performance than any of the examined simple and ensemble methods. To sum up, this study indicates that the investigation of financial information can be used for the identification of FFS and underlines the importance of financial ratios.
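The stacking variant itself is not detailed in the abstract; the sketch below illustrates the standard level-0 step of stacking, building out-of-fold meta-features, with a hypothetical majority-class base learner standing in for the real algorithms.

```python
def out_of_fold_meta_features(X, y, base_fits, k=2):
    """Stacking, level-0 step: train each base learner on k-1 folds and record
    its predictions on the held-out fold, so the level-1 (meta) learner is
    trained only on predictions from models that never saw those examples."""
    n = len(X)
    folds = [list(range(i, n, k)) for i in range(k)]
    meta = [[None] * len(base_fits) for _ in range(n)]
    for fold in folds:
        train = [i for i in range(n) if i not in fold]
        Xtr, ytr = [X[i] for i in train], [y[i] for i in train]
        for j, fit in enumerate(base_fits):
            predict = fit(Xtr, ytr)        # fit returns a prediction function
            for i in fold:
                meta[i][j] = predict(X[i])
    return meta

# Toy base learner: always predicts the majority class of its training labels.
majority = lambda Xtr, ytr: (lambda x: max(set(ytr), key=ytr.count))
```

A level-1 classifier fitted on these meta-features, alongside or instead of the raw financial ratios, is what lets a stacked system outperform each component classifier.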

A New Edit Distance Method for Finding Similarity in DNA Sequences

The P-Bigram method is a string comparison method based on an internal two-character similarity measure. The edit distance between two strings is the minimal number of elementary editing operations required to transform one string into the other; the elementary editing operations are deletion, insertion and substitution of characters. In this paper, we apply the P-Bigram method to the similarity problem in DNA sequences. The method provides an efficient algorithm that locates a minimal sequence of operations between two strings. We implemented the algorithm and found that our program computes smaller distances than the single-character comparison. We develop the P-Bigram edit distance, show how it yields a similarity measure, and implement it using dynamic programming. The performance of the proposed approach is evaluated using the number of edits and percentage-similarity measures.
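The edit distance described above has a classic dynamic-programming solution; a minimal sketch, together with a percentage-similarity measure derived from it (normalising by the longer string is an assumption, not necessarily the paper's formula):

```python
def edit_distance(a, b):
    """Levenshtein distance by dynamic programming: minimal number of
    insertions, deletions and substitutions turning a into b."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i                     # delete all of a[:i]
    for j in range(n + 1):
        d[0][j] = j                     # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution / match
    return d[m][n]

def percent_similarity(a, b):
    """Similarity as a percentage of the longer sequence length."""
    return 100.0 * (1 - edit_distance(a, b) / max(len(a), len(b), 1))
```

The table d has (m+1)(n+1) cells, so the cost is O(mn) time and space; a bigram-based variant changes the per-cell cost function, not this overall recurrence.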

Health Care Ethics in Vulnerable Populations: Clinical Research through the Patient's Eyes

Chronic conditions carry with them strong emotions and often lead to charged relationships between patients and their health providers and, by extension, patients and health researchers. Persons are both autonomous and relational and a purely cognitive model of autonomy neglects the social and relational basis of chronic illness. Ensuring genuine informed consent in research requires a thorough understanding of how participants perceive a study and their reasons for participation. Surveys may not capture the complexities of reasoning that underlies study participation. Contradictory reasons for participation, for instance an initial claim of altruism as rationale and a subsequent claim of personal benefit (therapeutic misconception), affect the quality of informed consent. Individuals apply principles through the filter of personal values and lived experience. Authentic autonomy, and hence authentic consent to research, occurs within the context of patients' unique life narratives and illness experiences.

Mounting Time Reduction using Content-Based Block Management for NAND Flash File System

Flash memory has many advantages, such as low power consumption, strong shock resistance, fast I/O and non-volatility, and it is increasingly used in mobile storage devices. YAFFS, one of the NAND flash file systems, is widely used in embedded devices. However, the existing YAFFS takes a long time to mount the file system because it scans the whole spare area of every page of the NAND flash memory. To solve this problem, we propose a new content-based flash file system using a mounting time reduction technique. The proposed method scans only partial spare areas of some special pages by using content-based block management. The experimental results show that the proposed method reduces the average mounting time by 87.2% compared with JFFS2 and 69.9% compared with YAFFS.

Separation of Dissolved Gas for Breathing of a Human against Sudden Waves Using Hollow Fiber Membranes

The separation of dissolved gas, including dissolved oxygen, can be used for human breathing under water. When a person is suddenly shipwrecked or caught in a tsunami, they can drown instantly because they cannot breathe under water. To avoid this crisis, the dissolved gas separated from the water by the waves can be used while submerged, while air can be used for breathing when escaping from the water. In this work, we investigated the separation characteristics of dissolved gas using a pipe-type hollow fiber membrane made of polypropylene and a nude-type membrane made of polysulfone. Hollow fiber membranes with good characteristics under water are used to separate the dissolved gas, and membranes with good characteristics in air are used to transfer air. The combination of a membrane with good separation characteristics under water and one with good transfer characteristics in air makes it possible to breathe immediately under water and survive the crisis. The results showed that polypropylene performed better than polysulfone under both air and water conditions.

Reduction of Chlorine Dioxide in Paper Bleaching using Peroxide Activation

All around the world, pulp and paper mills are among the biggest production plants, and environmental pollution is the biggest challenge facing pulp manufacturing operations. The concern of these industries is to produce a high volume of paper of high quality standard at low cost without harming the environment. The results obtained from this bleaching study show that peroxide activation is an effective method of reducing the total applied charge of chlorine dioxide, which is harmful to the environment, and that softwood and hardwood Kraft pulps responded linearly to the peroxide treatments. During the bleaching process, the production plant generates chlorinated compounds. In the trial stages, chlorine dioxide was reduced by 3 kg/ton, reducing the brightness of the pulp from 65% ISO to 60% ISO, and the dosing point was returned to the E-stage charges by pre-treating Kraft pulps with hydrogen peroxide. The pulp and paper industry has developed elemental chlorine free (ECF) and totally chlorine free (TCF) bleaching in its quest to be environmentally friendly, and has been looking at ways to turn the ECF process into a TCF process while still remaining competitive. This prompted the research to investigate the capability of hydrogen peroxide as a catalyst to reduce chlorine dioxide.

Exploring the Application of Knowledge Management Factors in Esfahan University's Medical College

In this competitive age, one of the key tools of most successful organizations is knowledge management. Today some organizations measure their current knowledge and use it as an indicator for rating the organization in their reports. Since universities and colleges of medical science play a great role in the public health of societies, their access to the newest scientific research and the establishment of organizational knowledge management systems are very important. In order to explore the application of knowledge management factors, a national study was undertaken. The main purpose of this study was to determine the rate of application of knowledge management factors and to identify ways to establish wider application of a knowledge management system in Esfahan University's Medical College (EUMC). Esfahan is the second largest city after Tehran, the capital of Iran, and the EUMC is the biggest medical college in Esfahan. To rate the application of knowledge management, this study used a quantitative research methodology based on the Probst, Raub and Romhardt model of knowledge management. A group of 267 faculty members and staff of the EUMC were surveyed via questionnaire. The findings showed that the rate of application of knowledge management factors in the EUMC was lower than average. As a result, interviews with ten faculty members were conducted to find guidelines for establishing wider application of a knowledge management system in the EUMC.

Estimating the Absorption of Volatile Organic Compounds in Four Biodiesels Using the UNIFAC Procedure

This work considered the thermodynamic feasibility of scrubbing volatile organic compounds (VOCs) into biodiesel with a view to designing a gas treatment process with this absorbent. A detailed vapour-liquid equilibrium investigation was performed using the original UNIFAC group contribution method. The four biodiesels studied in this work are methyl oleate, methyl palmitate, methyl linolenate and ethyl stearate. The original UNIFAC procedure was used to estimate the infinite dilution activity coefficients of 13 selected volatile organic compounds in the biodiesels. The calculations were done at a VOC mole fraction of 9.213x10^-8. Ethyl stearate gave the most favourable phase equilibrium. Close agreement was found between the infinite dilution activity coefficient of toluene obtained in this work and values reported in the literature. Thermodynamic models can efficiently be used to calculate a vast amount of phase equilibrium behaviour from a limited number of experimental data.

Vulnerabilities of IEEE 802.11i Wireless LAN CCMP Protocol

IEEE has recently incorporated the CCMP protocol to provide robust security for IEEE 802.11 wireless LANs. We find that CCMP has been designed with a weak nonce construction and transmission mechanism, which leads to the exposure of the initial counter value. This weak construction of the nonce renders the protocol vulnerable to attacks by intruders. This paper shows how the initial counter can be pre-computed by an intruder. This vulnerability of the counter block value leads to a pre-computation attack on the counter mode encryption of CCMP. The failure of the counter mode results in the collapse of the whole security mechanism of the 802.11 WLAN.
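The vulnerability rests on the fact that every field of the 13-byte CCMP nonce travels in the clear in the frame header. A minimal sketch of the nonce construction (priority octet, transmitter address A2, 48-bit packet number, per IEEE 802.11i):

```python
def ccmp_nonce(priority, addr2, packet_number):
    """CCMP nonce: 1 priority octet || 6-byte transmitter MAC || 48-bit PN."""
    assert len(addr2) == 6
    return bytes([priority]) + addr2 + packet_number.to_bytes(6, "big")

# All three fields are sent unencrypted, so an eavesdropper can reconstruct
# the nonce (and hence the initial AES-CTR counter block) for any captured
# frame without knowing the key -- the basis of the pre-computation attack.
nonce = ccmp_nonce(0, b"\x00\x11\x22\x33\x44\x55", 1)
```

Because the packet number increments predictably, future counter blocks can be anticipated as well, which is what makes pre-computation (rather than mere replay) feasible.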

An Integrative Bayesian Approach to Supporting the Prediction of Protein-Protein Interactions: A Case Study in Human Heart Failure

Recent years have seen a growing trend towards the integration of multiple information sources to support large-scale prediction of protein-protein interaction (PPI) networks in model organisms. Despite advances in computational approaches, the combination of multiple “omic” datasets representing the same type of data, e.g. different gene expression datasets, has not been rigorously studied. Furthermore, there is a need to further investigate the inference capability of powerful approaches, such as fully-connected Bayesian networks, in the context of PPI network prediction. This paper addresses these limitations by proposing a Bayesian approach to integrating multiple datasets, some of which encode the same type of “omic” data, to support the identification of PPI networks. The case study reported involved the combination of three gene expression datasets relevant to human heart failure (HF). In comparison with two traditional methods, the naive Bayesian and maximum likelihood ratio approaches, the proposed technique can accurately identify known PPIs and can be applied to infer potentially novel interactions.
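For context, the naive Bayesian baseline mentioned above combines evidence by multiplying per-source likelihood ratios, i.e. summing their logarithms, under the assumption that sources are conditionally independent given the interaction; the paper's fully-connected Bayesian network relaxes exactly this assumption. A minimal sketch of the baseline:

```python
import math

def combined_log_lr(evidence_lrs):
    """Naive-Bayes evidence combination: log-likelihood ratios add when
    evidence sources are conditionally independent given the interaction."""
    return sum(math.log(lr) for lr in evidence_lrs)

def interaction_probability(prior, evidence_lrs):
    """Posterior probability of an interaction from a prior and per-source LRs."""
    post_odds = prior / (1.0 - prior) * math.exp(combined_log_lr(evidence_lrs))
    return post_odds / (1.0 + post_odds)
```

When several sources encode the same type of “omic” data (e.g. three gene expression datasets), the independence assumption is clearly violated, which motivates integration schemes that model the dependencies instead.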