Environmental Analysis of Springs in Urban Areas–A Methodological Proposal

Springs located in urban areas are points where water emerges at the surface; they can serve as water supplies, effluent receptors and important elements of local macro-drainage. Given unplanned occupation, non-compliance with environmental legislation and the importance of these water bodies, it is vital to analyze springs within urban areas in light of the Brazilian Forest Code. This paper proposes a methodology for analyzing and discussing the environmental compliance of urban springs by means of GIS (Geographic Information System) analysis combined with in situ surveys. The case study included two springs with a history of occupation along their courses and different degrees of impact. The proposed method is effective and easy to apply, representing a powerful tool for analyzing the environmental conditions of springs in urban areas.

Application of Neural Network and Finite Element for Predicting the Limiting Drawing Ratio in the Deep Drawing Process

In this paper, a back-propagation artificial neural network (BPANN) is employed to predict the limiting drawing ratio (LDR) of the deep drawing process. To prepare a training set for the BPANN, a series of finite element simulations was carried out. Die and punch radius, die arc radius, friction coefficient, sheet thickness, yield strength of the sheet and strain hardening exponent were used as the input data, and the LDR was the specified output used in training the neural network. Given these parameters, the trained network can estimate the LDR for any new condition. Comparing FEM and BPANN results, an acceptable correlation was found.
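
As a rough illustration of this setup, the sketch below trains a small feed-forward network on placeholder data standing in for the FEM results; the feature ordering, network size and synthetic values are assumptions for illustration, not the paper's configuration.

```python
# Hedged sketch of a BPANN-style regressor mapping process parameters to the LDR.
# The data below are random placeholders standing in for FEM simulation results.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Columns (assumed order): die radius, punch radius, die arc radius,
# friction coefficient, sheet thickness, yield strength, strain-hardening exponent
X_fem = np.random.rand(200, 7)           # placeholder FEM input parameters
y_ldr = 1.8 + 0.4 * np.random.rand(200)  # placeholder FEM-computed LDR values

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=5000, random_state=0),
)
model.fit(X_fem, y_ldr)

# Estimate the LDR for a new, unseen parameter combination
new_condition = np.random.rand(1, 7)
print("Predicted LDR:", model.predict(new_condition)[0])
```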

A Processor with Dynamically Reconfigurable Circuit for Floating-Point Arithmetic

This paper describes dynamic reconfiguration for miniaturizing arithmetic circuits in a general-purpose processor. Dynamic reconfiguration is a technique that realizes the required functions by changing the hardware configuration during operation. The proposed arithmetic circuit performs floating-point arithmetic, which is frequently used in science and technology. The data format is floating point based on IEEE 754. The proposed circuit is designed in VHDL, and its correct operation is verified by simulations and experiments.
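
For reference, the short sketch below (Python, not the paper's VHDL) unpacks an IEEE 754 single-precision value into the sign, exponent and fraction fields that such a floating-point datapath operates on.

```python
# Illustrative only: decompose an IEEE 754 single-precision value into its fields.
import struct

def ieee754_fields(x: float):
    bits = struct.unpack(">I", struct.pack(">f", x))[0]
    sign     = bits >> 31
    exponent = (bits >> 23) & 0xFF   # biased by 127
    fraction = bits & 0x7FFFFF       # 23-bit mantissa (implicit leading 1)
    return sign, exponent, fraction

print(ieee754_fields(3.14))  # (0, 128, 4781507)
```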

Fabrication of Microfluidic Device for Quantitative Monitoring of Algal Cell Behavior Using X-ray LIGA Technology

In this paper, a simple microfluidic device for monitoring algal cell behavior is proposed. An array of algal microwells is fabricated by PDMS soft lithography using an X-ray LIGA mold and placed on a glass substrate. The two layers, the replicated PDMS and the substrate, are attached by oxygen plasma bonding, creating a microchannel for the microfluidic system. Algal cells are loaded into the microfluidic device, which provides a positive charge on the bottom surface of the wells. The algal cells, which are negatively charged, are attracted to the bottom of the wells via electrostatic interaction. By varying the concentration of algal cells in the loading suspension, it is possible to obtain wells with a single cell. Liquid medium for cell monitoring flows continuously over the wells, providing nutrient and waste exchange between the wells and the main flow. This device could lead to uncovering the quantitative biology of algae, which is a key to effective and extensive algal utilization in biotechnology, the food industry and bioenergy research and development.

Cross Layer Optimization for Fairness Balancing Based on Adaptively Weighted Utility Functions in OFDMA Systems

Cross-layer optimization based on utility functions has recently been studied extensively, and numerous types of utility functions have been examined in the corresponding literature. However, a major drawback is that most utility functions take a fixed mathematical form or are based on simple combining, which cannot fully exploit the available information. In this paper, we formulate a framework of cross-layer optimization based on Adaptively Weighted Utility Functions (AWUF) for fairness balancing in OFDMA networks. Under this framework, a two-step allocation algorithm is provided as a sub-optimal solution, whose control parameters can be updated in real time to accommodate instantaneous QoS constraints. The simulation results show that the proposed algorithm achieves high throughput while balancing fairness among multiple users.
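
To convey the flavor of adaptively weighted allocation (this is a generic illustration, not the paper's AWUF algorithm), the sketch below greedily assigns OFDMA subcarriers using a weight that grows as a user's accumulated rate falls behind, trading throughput against fairness.

```python
# Illustrative greedy subcarrier allocation with adaptive per-user utility weights.
import numpy as np

n_users, n_subcarriers = 4, 32
rng = np.random.default_rng(0)
rates = rng.uniform(0.5, 2.0, size=(n_users, n_subcarriers))  # achievable rates (placeholder)

achieved = np.full(n_users, 1e-6)                 # accumulated rate per user
for sc in range(n_subcarriers):
    weights = 1.0 / achieved                      # adaptive weight: under-served users gain priority
    user = int(np.argmax(weights * rates[:, sc])) # weighted-utility metric for this subcarrier
    achieved[user] += rates[user, sc]

print("Per-user throughput:", np.round(achieved, 2))
```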

Dimensionality Reduction of PSSM Matrix and its Influence on Secondary Structure and Relative Solvent Accessibility Predictions

State-of-the-art methods for secondary structure (Porter, Psi-PRED, SAM-T99sec, Sable) and solvent accessibility (Sable, ACCpro) predictions use evolutionary profiles represented by the position-specific scoring matrix (PSSM). It has been demonstrated that evolutionary profiles are the most important features in the feature space for these predictions. Unfortunately, applying the PSSM matrix leads to high-dimensional feature spaces that may create problems with parameter optimization and generalization. Several recently published studies suggested that applying feature extraction to the PSSM matrix may result in improvements in secondary structure predictions. However, none of the top performing methods considered here utilizes dimensionality reduction to improve generalization. In the present study, we used simple and fast feature selection methods (t-statistics, information gain) that allow us to decrease the dimensionality of the PSSM matrix by 75% and improve generalization in the case of secondary structure prediction compared to the Sable server.
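
The sketch below illustrates the kind of filter-based feature selection mentioned above (information gain / mutual information) applied to a PSSM-derived feature matrix; the window size, random data and 25% retention rate are assumptions made only for the example.

```python
# Hedged sketch: keep the top-scoring 25% of PSSM-window features (a 75% reduction).
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif

n_samples, n_features = 1000, 20 * 15        # e.g. 20 PSSM columns x 15-residue window
X = np.random.rand(n_samples, n_features)    # placeholder PSSM-derived features
y = np.random.randint(0, 3, size=n_samples)  # helix / strand / coil labels (placeholder)

k = n_features // 4                          # retain 25% of features
selector = SelectKBest(score_func=mutual_info_classif, k=k).fit(X, y)
X_reduced = selector.transform(X)
print(X_reduced.shape)                       # (1000, 75)
```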

On Optimum Stratification

In this manuscript, we discuss the problem of determining the optimum stratification of a study (or main) variable based on an auxiliary variable that follows a uniform distribution. If the stratification of the survey variable is made using the auxiliary variable, it may lead to substantial gains in the precision of the estimates. The problem is formulated as a Nonlinear Programming Problem (NLPP), which turns out to be a multistage decision problem and is solved using the dynamic programming technique.
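
A common way of writing this optimum-stratification problem (the notation here is assumed for illustration and is not necessarily the manuscript's exact formulation) is, for L strata with boundaries x_0 < x_1 < ... < x_L on an auxiliary variable with density f(x):

```latex
% Illustrative Neyman-allocation formulation; successive boundaries form the
% stages of the multistage decision problem solved by dynamic programming.
\min_{x_1 < x_2 < \dots < x_{L-1}} \; \sum_{h=1}^{L} W_h S_h,
\qquad
W_h = \int_{x_{h-1}}^{x_h} f(x)\,dx,
\qquad
S_h^2 = \frac{1}{W_h}\int_{x_{h-1}}^{x_h} x^2 f(x)\,dx
      - \left(\frac{1}{W_h}\int_{x_{h-1}}^{x_h} x f(x)\,dx\right)^{\!2}
```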

Totally Integrated Smart Energy System through Data Acquisition via Remote Location

This paper discusses an approach to real-time control of an energy management system using the data acquisition tools of LabVIEW. The main idea was to interface the station (PC) with the system and publish the data on the Internet using LabVIEW. In this venture, controlling and switching of three-phase AC loads are done effectively and efficiently. The phases are also sensed through sensing devices, and in case of any failure the attached generator starts functioning automatically. The computer sends commands to the system, and the system responds to the requests. A modern feature is the ability to access and control the system worldwide over the Internet. This control can be exercised at any time from anywhere to use energy effectively, especially in developing countries where energy management is a major problem. The system uses totally integrated devices operated via a remote location.

Simulation of Loss-of-Flow Transient in a Radiant Steam Boiler with Relap5/Mod3.2

A loss-of-feedwater accident is one of the frequent severe accidents in steam boiler facilities. It threatens the structural integrity of the system and generates serious hazards and economic losses. The safety analysis of thermal installations is based extensively on numerical simulation. Simulation analysis using realistic computer codes like Relap5/Mod3.2 helps in understanding steam boiler thermal-hydraulic behavior during normal and abnormal conditions. In this study, we are interested in the assessment of the radiant steam boiler and its response to a loss-of-feedwater accident. Pressure, temperature and flow rate profiles are presented for the various steam boiler system components. The obtained results demonstrate the importance and capability of the Relap5/Mod3.2 code in the thermal-hydraulic analysis of steam boiler facilities.

A General Regression Test Selection Technique

This paper presents a new methodology for selecting test cases from regression test suites. The selection strategy is based on analyzing the dynamic behavior of applications written in any programming language. Methods based on dynamic analysis are safer and more efficient. We design a technique that combines the code-based and model-based techniques, allowing the object-oriented models of an application written in any programming language to be compared. We have developed a prototype tool that detects changes and selects test cases from the test suite.
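
As a minimal illustration of dynamic-behavior-based selection (an assumed example, not the paper's tool), the sketch below keeps only those regression tests whose recorded execution traces touch entities that changed between two versions.

```python
# Hedged sketch: pick tests whose dynamic traces intersect the changed methods.
def select_tests(test_traces: dict, changed: set) -> list:
    """test_traces maps a test name to the set of methods it executed at runtime."""
    return [test for test, covered in test_traces.items() if covered & changed]

traces = {
    "test_login":  {"Auth.check", "Session.open"},
    "test_report": {"Report.render", "Db.query"},
}
print(select_tests(traces, changed={"Auth.check"}))  # ['test_login']
```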

Implementation of RSA Blind Signature on CryptO-0N2 Protocol

Blind signatures were introduced by Chaum. In this scheme, a signer can “sign” a document without knowing the document's content. This is particularly important in electronic voting. CryptO-0N2 is an electronic voting protocol developed from CryptO-0N. During its development this protocol was not furnished with the blind signature requirement, so the choices of voters could be determined by the counting center. In this paper, an implementation of a blind signature using the RSA algorithm is presented.
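
The sketch below shows textbook Chaum-style RSA blind signing for reference; it is not the CryptO-0N2 implementation itself, and the toy key is far too small for real use.

```python
# Hedged illustration of RSA blind signatures: blind, sign, unblind, verify.
from math import gcd
import random

# toy RSA key (n, e, d); in practice p and q would be large primes
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))

m = 42                                    # message (already hashed/encoded in practice)

# Voter blinds the message with a random r coprime to n
while True:
    r = random.randrange(2, n)
    if gcd(r, n) == 1:
        break
m_blind = (m * pow(r, e, n)) % n

# Signer signs the blinded message without seeing m
s_blind = pow(m_blind, d, n)

# Voter unblinds to obtain a valid signature on m
s = (s_blind * pow(r, -1, n)) % n
assert pow(s, e, n) == m % n              # standard RSA verification succeeds
print("signature:", s)
```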

How to Integrate Sustainability in Technological Degrees: Robotics at UPC

Embedding sustainability in technological curricula has become a crucial factor for educating engineers with competences in sustainability. In 2008, the Technical University of Catalonia (UPC) designed the Sustainable Technology Excellence Program STEP 2015 in order to ensure successful sustainability embedding. This program takes advantage of the opportunity offered by the redesign of all Bachelor and Master degrees in Spain by 2010 under the European Higher Education Area framework. The STEP program goals are: to design compulsory courses in each degree; to develop the conceptual base and identify reference models in sustainability for all specialties at UPC; to create an internal interdisciplinary network of faculty from all the schools; to initiate new transdisciplinary research activities in technology-sustainability-education; to spread the know-how attained; to achieve international scientific excellence in technology-sustainability-education; and to graduate the first engineers/architects of the new EHEA bachelor degrees with sustainability as a generic competence. Specifically, in this paper the authors explain their experience in leading the STEP program, and two examples are presented: the Industrial Robotics course and the curriculum for the School of Architecture.

Model Discovery and Validation for the QSAR Problem Using Association Rule Mining

There are several approaches to solving the Quantitative Structure-Activity Relationship (QSAR) problem. These approaches are based either on statistical methods or on predictive data mining. Among the statistical methods, one should consider regression analysis, pattern recognition (such as cluster analysis, factor analysis and principal components analysis) or partial least squares. Predictive data mining techniques use neural networks, genetic programming or neuro-fuzzy knowledge. These approaches have low explanatory capability or none at all. This paper attempts to establish a new approach to solving QSAR problems using descriptive data mining. In this way, the relationship between the chemical properties and the activity of a substance can be comprehensibly modeled.
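
To show what descriptive rule mining over QSAR-style data can look like (an assumed toy example, not the paper's method or data), the sketch below treats discretized molecular descriptors as items and scores rules linking descriptors to an activity class by support and confidence.

```python
# Hedged sketch of association-rule style discovery: descriptor item -> activity label.
records = [
    {"logP=high", "MW=low",  "active"},
    {"logP=high", "MW=high", "active"},
    {"logP=low",  "MW=low",  "inactive"},
    {"logP=high", "MW=low",  "active"},
]

def support(itemset):
    return sum(itemset <= r for r in records) / len(records)

for item in {"logP=high", "logP=low", "MW=low", "MW=high"}:
    for label in ("active", "inactive"):
        supp = support({item, label})
        conf = supp / support({item}) if support({item}) else 0.0
        if supp >= 0.5 and conf >= 0.8:
            print(f"{item} -> {label}  (support={supp:.2f}, confidence={conf:.2f})")
```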

Experimental Analysis of a Diesel Hydrotreating Reactor to Develop a Simplified Tool for Process Real-time Optimization

In this research, a systematic investigation was carried out to determine the optimum conditions of an HDS reactor. Moreover, a suitable model was developed for a rigorous RTO (real-time optimization) loop of the HDS (hydrodesulfurization) process. A systematic experimental series was designed based on CCD (central composite design) and carried out in the related pilot plant to tune the developed model. The design variables in the experiments were temperature, LHSV and pressure, while the hydrogen-to-fresh-feed ratio was kept constant. The ranges of these variables were 320-380 °C, 1-2 1/hr and 50-55 bar, respectively. A power-law kinetic model was also developed for our further research; its rate order (the power of the reactant concentration), activation energy and frequency factor were 1.4, 92.66 kJ/mol and k0 = 2.7*10^9, respectively.
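
A conventional way of writing such a power-law model with the reported constants is shown below; the functional form and the symbol C_S (sulfur concentration) are assumptions for illustration, only the numerical values come from the abstract.

```latex
% Hedged reconstruction of the reported power-law HDS kinetics
-r_{\mathrm{HDS}} = k_0 \exp\!\left(-\frac{E_a}{RT}\right) C_S^{\,n},
\qquad n = 1.4,\quad E_a = 92.66\ \mathrm{kJ/mol},\quad k_0 = 2.7\times 10^{9}
```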

Benchmarking Cleaner Production Performance of Coal-fired Power Plants Using Two-stage Super-efficiency Data Envelopment Analysis

Benchmarking cleaner production performance is an effective way of achieving pollution control and emission reduction in the coal-fired power industry. A benchmarking method using two-stage super-efficiency data envelopment analysis (DEA) for coal-fired power plants is proposed: firstly, the cleaner production performance of DEA-inefficient or weakly DEA-efficient plants is improved; then the benchmark is selected from the performance-improved power plants. An empirical study is carried out with survey data from 24 coal-fired power plants. The results show that in the first stage the performance of 16 plants is DEA-efficient and that of 8 plants is relatively inefficient. The target values for improving the DEA-inefficient plants are obtained by projection analysis. Efficient performance of the 24 power plants and selection of the benchmarking plant are achieved in the second stage. The two-stage benchmarking method is practical for selecting the optimal benchmark in the cleaner production of the coal-fired power industry and will continuously improve the plants' cleaner production performance.
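
For orientation, the sketch below solves a basic input-oriented CCR DEA model as a linear program; it is only an illustration of the underlying DEA building block, with invented toy data, and does not reproduce the paper's two-stage super-efficiency procedure.

```python
# Hedged sketch: input-oriented CCR efficiency score for one decision-making unit.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """X: (n_plants, n_inputs), Y: (n_plants, n_outputs); returns theta for plant j0."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                    # variables: [theta, lambda_1..lambda_n]
    A_in = np.c_[-X[j0].reshape(m, 1), X.T]        # sum lambda_j x_ij - theta x_i,j0 <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y.T]          # -sum lambda_j y_rj <= -y_r,j0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[j0]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun                                  # theta <= 1, with 1 meaning DEA-efficient

# toy data: two inputs (e.g. coal use, emissions) and one output (electricity)
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])
print([round(ccr_efficiency(X, Y, j), 3) for j in range(len(X))])
```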

Flow Properties of Commercial Infant Formula Powders

The objective of this work was to investigate the flow properties of powdered infant formula samples. Samples were purchased at a local pharmacy and differed in composition. Lactose-free infant formula, gluten-free infant formula and infant formulas containing dietary fibers and probiotics were tested and compared with a regular infant formula sample which did not contain any of these supplements. Particle size and bulk density were determined and their influence on flow properties was discussed. There were no significant differences in the bulk densities of the samples; therefore, a connection between flow properties and bulk density could not be established. Lactose-free infant formula showed flow properties different from those of the standard supplement-free sample. Gluten-free infant formula with added probiotic microorganisms and dietary fiber had the narrowest particle size distribution range and exhibited the best flow properties. All the other samples exhibited the same tendency of decreasing compaction coefficient with increasing flow speed, which means they all become more free-flowing at higher flow speeds.

ISC–Intelligent Subspace Clustering, A Density Based Clustering Approach for High Dimensional Dataset

Many real-world data sets have a very high-dimensional feature space. Most clustering techniques use the distance or similarity between objects as a measure to build clusters, but in high-dimensional spaces distances between points become relatively uniform. In such cases, density-based approaches may give better results. Subspace clustering algorithms automatically identify lower-dimensional subspaces of the higher-dimensional feature space in which clusters exist. In this paper, we propose a new clustering algorithm, ISC (Intelligent Subspace Clustering), which tries to overcome three major limitations of the existing state-of-the-art techniques. First, ISC determines input parameters such as the ε-distance at various levels of subspace clustering, which helps in finding meaningful clusters. Second, since a uniform-parameter approach is not suitable for different kinds of databases, ISC implements dynamic and adaptive determination of meaningful clustering parameters based on a hierarchical filtering approach. The third and most important feature of ISC is its ability to learn incrementally and to dynamically include and exclude subspaces, which leads to better cluster formation.
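
The sketch below illustrates the general idea of density-based clustering in a candidate subspace with an adaptively determined ε, here derived from the k-distance distribution; it is an assumed illustration of the principle, not the ISC algorithm itself.

```python
# Hedged sketch: adaptive eps for density-based clustering within one subspace.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 10)), rng.normal(3, 0.3, (50, 10))])  # toy data

subspace = [0, 1, 2]                            # candidate lower-dimensional subspace
Xs = X[:, subspace]

k = 4
dists, _ = NearestNeighbors(n_neighbors=k).fit(Xs).kneighbors(Xs)
eps = float(np.percentile(dists[:, -1], 90))    # adaptive eps from the k-distance curve

labels = DBSCAN(eps=eps, min_samples=k).fit_predict(Xs)
print("clusters found:", len(set(labels) - {-1}))
```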

A Revisited View of the Paced Auditory Serial Addition Test (PASAT) in Female and Male Normal Subjects

The Paced Auditory Serial Addition Test (PASAT) has been used as a common research tool for different neurological disorders such as multiple sclerosis. Recently, technology has let researchers introduce a new version of the test, the Paced Visual Serial Addition Test (PVSAT). In this paper, a computerized version of these two tests is introduced. Besides interpreting the number of true responses, the software calculates each subject's reaction time. We hypothesize that paying attention to the reaction time may be valuable. For this purpose, sixty-eight female normal subjects and fifty-eight male normal subjects were enrolled in the study. We investigated the similarity between PASAT3 and PVSAT3 in the number of true responses and in the new criterion (the average reaction time of each subject). The similarity between the two tests was rejected (p-value = 0.000), which means that these two tests differ. An effect of sex on the tests was not confirmed, since the p-values for the difference between PASAT3 and PVSAT3 were the same in both sexes (p-value = 0.000), which means that male and female subjects performed the tests at no different level of performance. The new criterion shows a negative correlation with age, which suggests that aged normal subjects may have the same number of true responses as young subjects but give slower responses. This provides evidence for the importance of reaction time.

Motor Imagery Signal Classification Using Adaptive Recursive Bandpass Filter and Adaptive Autoregressive Models for Brain Machine Interface Designs

The noteworthy point in the advancement of Brain Machine Interface (BMI) research is the ability to accurately extract features of the brain signals and to classify them into targeted control actions with the easiest procedures, since the expected beneficiaries are disabled people. In this paper, a new feature extraction method using the combination of adaptive bandpass filters and adaptive autoregressive (AAR) modelling is proposed and applied to the classification of right and left motor imagery signals extracted from the brain. The introduction of the adaptive bandpass filter improves the characterization of the autocorrelation functions of the AAR models, as it enhances and strengthens the EEG signal, which is noisy and stochastic in nature. The experimental results on the Graz BCI data set have shown that, by implementing the proposed feature extraction method, LDA and SVM classifiers outperform other AAR approaches of the BCI 2003 competition in terms of the mutual information, the competition criterion, and the misclassification rate.
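
The sketch below gives a rough picture of this kind of pipeline on placeholder data: band-pass filtering an EEG channel and then tracking adaptive AR coefficients with a simple LMS update whose trajectory can feed an LDA or SVM classifier. The filter band, AR order, step size and data are assumptions for illustration, not the paper's exact adaptive recursive filter or AAR estimator.

```python
# Hedged sketch: bandpass filtering followed by LMS-tracked AAR coefficients as features.
import numpy as np
from scipy.signal import butter, lfilter

fs = 128.0                                  # assumed sampling rate (Hz)
b, a = butter(4, [8 / (fs / 2), 30 / (fs / 2)], btype="band")  # assumed mu/beta band

rng = np.random.default_rng(0)
eeg = rng.standard_normal(1024)             # placeholder single-channel EEG
x = lfilter(b, a, eeg)

p, mu = 6, 0.005                            # AR order and LMS step size (assumed)
w = np.zeros(p)                             # adaptive AR coefficients
features = []
for t in range(p, len(x)):
    past = x[t - p:t][::-1]                 # most recent samples first
    err = x[t] - w @ past                   # one-step prediction error
    w = w + mu * err * past                 # LMS update of the AAR coefficients
    features.append(w.copy())

features = np.asarray(features)             # (time, p) feature matrix for LDA/SVM
print(features.shape)
```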

Colour Stability of Wild Cactus Pear Juice

Prickly pear (Opuntia spp.) fruit has received renewed interest since it contains a betalain pigment that has an attractive purple colour for the production of juice. Prickly pear juice was prepared by homogenizing the fruit and treating the pulp with 48 g of pectinase from Aspergillus niger. Titratable acidity was determined by diluting 10 ml of prickly pear juice with 90 ml of deionized water and titrating to pH 8.2 with 0.1 N NaOH. Brix was measured using a refractometer, and the ascorbic acid content was assayed spectrophotometrically. Colour variation was determined colorimetrically (Hunter L.a.b.). Hunter L.a.b. analysis showed that the red-purple colour of prickly pear juice was affected by the juice treatments. This was indicated by low colour difference meter lightness (CDML*), hue, CDMa* and CDMb* values. It was observed that non-treated prickly pear juice had a higher CDML* (colour difference meter lightness) of 3.9 compared with the treated juices (range 3.29 to 2.14). The CDML* significantly (p