Water Security in Rural Areas through Solar Energy in Baja California Sur, Mexico

This study assesses the potential of solar energy technology for improving access to water, and hence the livelihood strategies of rural communities, in Baja California Sur, Mexico. It focuses on livestock ranches and on photovoltaic (PV) water-pump technology as well as other water extraction methods. The methodology combines the Sustainable Livelihoods and Appropriate Technology approaches. A household survey was applied in June 2006 to 32 ranches in the municipality, of which 22 used PV pumps, and semi-structured interviews were conducted. Findings indicate that solar pumps have in fact helped people improve their quality of life by allowing them to pursue a different livelihood strategy, and that improved access to water (not necessarily more water, but less effort to extract and collect it) does not automatically imply overexploitation of the resource; consumption is based on basic needs as well as on storage and pumping capacity. The justification for such systems lies in avoiding the logistical problems associated with fossil fuels: PV pumps proved to be most beneficial when substituting for gasoline or diesel equipment, but of dubious advantage when intended to replace wind or gravity systems. The main obstacles to the dissemination of solar water-pumping technology are high investment and repair costs, so it is not suitable for all cases even when insolation rates and water availability are adequate. Where affordability is not an obstacle, it has become an important asset that contributes, through reduced expenses, less effort and saved time, to the improvement of livestock raising, the main livelihood provider for these ranches.

Fuzzy Control of Macroeconomic Models

Optimal control is one of the possible controllers for a dynamic system, for example a linear quadratic regulator designed using Pontryagin's principle or the dynamic programming method. Stochastic disturbances may affect the coefficients (multiplicative disturbances) or the equations (additive disturbances), provided that the shocks are not too great. Nevertheless, this approach encounters difficulties when uncertainties are very important or when the probability calculus is of no help with very imprecise data. Fuzzy logic contributes a pragmatic solution to such a problem since it operates on fuzzy numbers. A fuzzy controller acts as an artificial decision maker that operates in a closed-loop system in real time. This contribution explores the tracking problem and the control of dynamic macroeconomic models using a fuzzy learning algorithm. A two-input, single-output (TISO) fuzzy model is applied to the linear fluctuation model of Phillips and to the nonlinear growth model of Goodwin.
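
A minimal sketch of a TISO fuzzy controller of the kind described: the tracking error and its change feed a Sugeno-style rule base whose output increments the control action, here closed around a generic stable first-order plant. The membership functions, rule table and plant are illustrative assumptions, not the Phillips or Goodwin models themselves.

```python
# Minimal TISO fuzzy controller sketch: inputs are the tracking error and its
# change; the output is an increment of the control action (zero-order Sugeno
# inference with singleton consequents). All design choices are illustrative.

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

LABELS = ("N", "Z", "P")                                  # negative, zero, positive
MFS = {"N": (-2.0, -1.0, 0.0), "Z": (-1.0, 0.0, 1.0), "P": (0.0, 1.0, 2.0)}
RULES = {("N", "N"): -1.0, ("N", "Z"): -0.5, ("N", "P"): 0.0,   # consequent singletons
         ("Z", "N"): -0.5, ("Z", "Z"): 0.0,  ("Z", "P"): 0.5,
         ("P", "N"): 0.0,  ("P", "Z"): 0.5,  ("P", "P"): 1.0}

def fuzzy_increment(e, de):
    """Weighted average of rule consequents (zero-order Sugeno inference)."""
    num = den = 0.0
    for le in LABELS:
        for lde in LABELS:
            w = tri(e, *MFS[le]) * tri(de, *MFS[lde])
            num += w * RULES[(le, lde)]
            den += w
    return num / den if den > 0.0 else 0.0

# Closed-loop tracking of a reference on an illustrative plant y' = -a*y + b*u
a, b, dt, ref = 0.5, 1.0, 0.1, 1.0
y = u = prev_e = 0.0
for _ in range(400):
    e = max(-1.0, min(1.0, ref - y))
    de = max(-1.0, min(1.0, e - prev_e))
    u += 0.05 * fuzzy_increment(e, de)        # incremental (PI-like) fuzzy action
    y += dt * (-a * y + b * u)
    prev_e = e
print(f"output after {400 * dt:.0f} time units: {y:.3f} (target {ref})")
```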

Analyzing CPFR Supporting Factors with Fuzzy Cognitive Map Approach

Collaborative planning, forecasting and replenishment (CPFR) coordinates the various supply chain management activities, including production and purchase planning, demand forecasting and inventory replenishment, between supply chain trading partners. This study proposes a systematic way of analyzing CPFR supporting factors using the fuzzy cognitive map (FCM) approach. FCMs have proven particularly useful for solving problems in which a number of decision variables and uncontrollable variables are causally interrelated. Hence, FCMs of CPFR are created to show the relationships between the factors that influence the effective implementation of CPFR in the supply chain.
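
A minimal sketch of FCM inference as it might be applied here: concept activations propagate through a signed causal weight matrix and a sigmoid squashing function until they stabilise. The concepts and weights below are illustrative placeholders, not the CPFR factors or causal strengths identified in the study.

```python
# Minimal fuzzy cognitive map (FCM) inference sketch with placeholder concepts.
import numpy as np

concepts = ["top-management support", "information sharing",
            "forecast accuracy", "CPFR implementation success"]
# W[i, j] = assumed causal influence of concept i on concept j, in [-1, 1]
W = np.array([[0.0, 0.6, 0.0, 0.5],
              [0.0, 0.0, 0.7, 0.4],
              [0.0, 0.0, 0.0, 0.6],
              [0.0, 0.0, 0.0, 0.0]])

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

def run_fcm(a0, W, steps=50, tol=1e-5):
    """Iterate the activation vector until it converges or steps run out."""
    a = np.asarray(a0, dtype=float)
    for _ in range(steps):
        a_next = sigmoid(a @ W + a)        # keep each concept's own activation
        if np.max(np.abs(a_next - a)) < tol:
            break
        a = a_next
    return a

final = run_fcm([1.0, 0.5, 0.5, 0.0], W)
for name, value in zip(concepts, final):
    print(f"{name}: {value:.3f}")
```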

Feature Subset Selection approach based on Maximizing Margin of Support Vector Classifier

Identification of cancer genes that might anticipate the clinical behaviour of different types of cancer is challenging due to the huge number of genes and the small number of patient samples. A new method is proposed based on supervised classification learning, in particular support vector machines (SVMs). The solution introduces the maximized margin (MM) into the subset selection criterion, which allows the selection to approach the lowest generalization error rate. In class prediction problems, gene selection is essential both to improve accuracy and to identify genes relevant to the cancer disease. The performance of the new method was evaluated in experiments with real-world data, where it gave better classification accuracy.
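
A minimal sketch of one way a maximized-margin subset criterion can drive gene selection: genes are added greedily so that the margin 1/||w|| of a linear SVM trained on the selected subset is as large as possible. This is an illustrative reading of the criterion, not the authors' exact algorithm; the synthetic data stand in for real expression profiles.

```python
# Greedy margin-driven feature (gene) selection with a linear SVM.
import numpy as np
from sklearn.svm import SVC

def margin(X, y):
    """Geometric margin 1 / ||w|| of a linear SVM fitted on the given columns."""
    clf = SVC(kernel="linear", C=1.0).fit(X, y)
    return 1.0 / np.linalg.norm(clf.coef_)

def greedy_margin_selection(X, y, n_features):
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < n_features:
        scores = [(margin(X[:, selected + [j]], y), j) for j in remaining]
        _, best_j = max(scores)
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Toy demo: 40 samples, 100 "genes", only the first 3 carry class information.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 100))
y = (X[:, :3].sum(axis=1) > 0).astype(int)
print("selected genes:", greedy_margin_selection(X, y, 3))
```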

Unrelated Parallel Machines Scheduling Problem Using an Ant Colony Optimization Approach

Total weighted tardiness is a measure of customer satisfaction; minimizing it represents satisfying the general requirement of on-time delivery. In this research, we consider an ant colony optimization (ACO) algorithm to solve the problem of scheduling unrelated parallel machines to minimize total weighted tardiness. The problem is NP-hard in the strong sense. Computational results show that the proposed ACO algorithm gives promising results compared to other existing algorithms.
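
A minimal sketch of the ingredients of such an ACO, shown on a toy instance: a pheromone-biased construction of job-to-machine assignments and a total weighted tardiness objective with earliest-due-date sequencing per machine. The instance, parameters and pheromone update are illustrative assumptions, not the paper's design.

```python
# ACO sketch for unrelated parallel machines / total weighted tardiness.
import random

def weighted_tardiness(assignment, p, d, w):
    """Sum of w[j] * tardiness, sequencing each machine's jobs by due date."""
    total = 0.0
    for m in range(len(p[0])):
        jobs = sorted((j for j, mm in enumerate(assignment) if mm == m),
                      key=lambda j: d[j])
        t = 0.0
        for j in jobs:
            t += p[j][m]
            total += w[j] * max(0.0, t - d[j])
    return total

def construct(tau, p, alpha=1.0, beta=2.0):
    """One ant assigns every job to a machine via pheromone-biased roulette."""
    n_machines = len(p[0])
    assignment = []
    for j in range(len(p)):
        scores = [(tau[j][m] ** alpha) * ((1.0 / p[j][m]) ** beta)
                  for m in range(n_machines)]
        assignment.append(random.choices(range(n_machines), weights=scores)[0])
    return assignment

# Toy instance: 6 jobs on 2 unrelated machines, p[j][m] = processing time
p = [[4, 6], [3, 2], [5, 5], [2, 7], [6, 3], [4, 4]]
d = [5, 6, 10, 4, 9, 12]              # due dates
w = [2, 1, 3, 2, 1, 1]                # tardiness weights
tau = [[1.0, 1.0] for _ in p]         # initial pheromone

best, best_cost = None, float("inf")
for _ in range(200):
    candidate = construct(tau, p)
    cost = weighted_tardiness(candidate, p, d, w)
    if cost < best_cost:
        best, best_cost = candidate, cost
    for row in tau:                   # evaporation
        row[0] *= 0.95
        row[1] *= 0.95
    for j, m in enumerate(best):      # reinforce the best-so-far assignment
        tau[j][m] += 0.05
print("best assignment (job -> machine):", best,
      "total weighted tardiness:", best_cost)
```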

An Augmented Automatic Choosing Control Designed by Extremizing a Combination of Hamiltonian and Lyapunov Functions for Nonlinear Systems with Constrained Input

In this paper we consider a nonlinear feedback control, called augmented automatic choosing control (AACC), for nonlinear systems with constrained input. Constant terms which arise from section-wise linearization of a given nonlinear system are treated as coefficients of a stable zero dynamics. Parameters included in the control are suboptimally selected by extremizing a combination of Hamiltonian and Lyapunov functions with the aid of a genetic algorithm. This approach is applied to a field excitation control problem of a power system to demonstrate the effectiveness of the AACC. Simulation results show that the new controller can improve performance remarkably well.

New Simultaneous High Performance Liquid Chromatographic Method for Determination of NSAIDs and Opioid Analgesics in Advanced Drug Delivery Systems and Human Plasma

A new and cost-effective RP-HPLC method was developed and validated for the simultaneous analysis of the non-steroidal anti-inflammatory drugs diclofenac sodium (DFS) and flurbiprofen (FLP) and the opioid analgesic tramadol (TMD) in advanced drug delivery systems (liposomes and microcapsules), marketed brands and human plasma. An isocratic system was employed for the mobile phase, consisting of 10 mM sodium dihydrogen phosphate buffer and acetonitrile in a molar ratio of 67:33 with the pH adjusted to 3.2. The stationary phase was a Hypersil ODS column (C18, 250×4.6 mm i.d., 5 μm) kept at a controlled temperature of 30 °C. DFS in liposomes, microcapsules and marketed drug products was determined in the range of 99.76-99.84%. FLP and TMD in microcapsules and branded formulations were 99.78-99.94% and 99.80-99.82%, respectively. A single-step liquid-liquid extraction procedure using a combination of acetonitrile and trichloroacetic acid (TCA) as protein precipitating agent was employed. The detection limits (at S/N ratio 3) of quality control solutions and plasma samples were 10, 20, and 20 ng/ml for DFS, FLP and TMD, respectively. The assay was acceptable over the linear dynamic range. All other validation parameters were found within the limits of FDA and ICH method validation guidelines. The proposed method is sensitive, accurate and precise, and is applicable for routine analysis in the pharmaceutical industry as well as in human plasma samples for bioequivalence and pharmacokinetic studies.

Towards a New Methodology for Developing Web-Based Systems

Web-based systems have become increasingly important because the Internet and the World Wide Web have become ubiquitous, surpassing all other technological developments in our history. The Internet, and especially company websites, has rapidly evolved in scope and extent of use, from being little more than fixed advertising material, i.e. a "web presence" with no particular influence on the company's business, to being one of the most essential parts of the company's core business. Traditional software engineering approaches with process models such as, for example, CMM and the waterfall model do not work very well, since web system development differs from traditional development in several ways: there is a large gap between traditional software engineering designs and concepts and the low-level implementation model, and many web-based system development activities are business-oriented (for example, web applications are sales-oriented, and intranets are content-oriented) rather than engineering-oriented. This paper introduces the Increment Iterative eXtreme Programming (IIXP) methodology for developing web-based systems. In contrast to existing methodologies, it is a combination of traditional and modern software engineering and web engineering principles.

A Novel Approach to EMABS and Comparison with ABS

In this paper, two different antilock braking systems are simulated and compared. One is the ordinary hydraulic system, referred to here simply as ABS, and the other is the electromagnetic antilock braking system (EMABS), whose performance is based on electromagnetic force. In the EMABS there is no need for the servo-hydraulic booster used in the hydraulic ABS. The desired force is generated by a magnetic relay driven by an input voltage across an air gap (g). The generated force is amplified by the relay arm and applied to the brake shoes, thus producing the braking torque. The braking torque is proportional to the applied electrical voltage E; to adjust the braking torque it is only necessary to regulate the voltage E, which is much faster and has a much smaller time constant T than the hydraulic ABS. The two systems are simulated with MATLAB/SIMULINK, and the results show the superiority of the EMABS.

A Novel Approach to Fault Classification and Fault Location for Medium Voltage Cables Based on Artificial Neural Network

A novel application of the neural network approach to fault classification and fault location for medium-voltage cables is demonstrated in this paper. Different faults on a protected cable should be classified and located correctly. This paper presents the use of neural networks as a pattern classification algorithm to perform these tasks. The proposed scheme is insensitive to variation of different parameters such as fault type, fault resistance, and fault inception angle. Studies show that the proposed technique is able to offer high accuracy in both the fault classification and fault location tasks.
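
A minimal sketch of the kind of pattern classifier involved: a small multilayer perceptron mapping per-phase current features to a fault class. The synthetic feature generator below is an illustrative stand-in for the simulated cable-fault records used in such studies.

```python
# MLP fault-type classifier sketch on synthetic three-phase current features.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
CLASSES = ["A-g", "B-g", "C-g", "AB", "ABC"]   # example fault types

def synth_record(fault):
    """Three-phase current magnitudes (p.u.): faulted phases carry overcurrent."""
    base = rng.normal(1.0, 0.05, 3)
    idx = {"A-g": [0], "B-g": [1], "C-g": [2], "AB": [0, 1], "ABC": [0, 1, 2]}[fault]
    base[idx] += rng.uniform(2.0, 6.0, len(idx))
    return base

X = np.array([synth_record(CLASSES[i % 5]) for i in range(1000)])
y = np.array([i % 5 for i in range(1000)])
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000,
                    random_state=0).fit(Xtr, ytr)
print("classification accuracy:", round(clf.score(Xte, yte), 3))
```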

Computing a Time Based Effective Radius-of-Curvature for Roadways

The radius-of-curvature (ROC) defines the degree of curvature along the centerline of a roadway that a travelling vehicle must follow. Roadway designs must encompass ROC in mitigating the cost of earthwork associated with construction while also allowing vehicles to travel at maximum allowable design speeds. Thus, a road will tend to follow the natural topography where possible, but curvature must also be optimized to permit fast but safe vehicle speeds. The more severe the curvature of the road, the slower the permissible vehicle speed. For route planning, whether for urban settings, emergency operations, or even parcel delivery, ROC is a necessary attribute of road arcs for computing travel time. It is extremely rare for a geo-spatial database to contain ROC. This paper presents a procedure and mathematical algorithm to calculate and assign ROC to a segment pair and/or polyline.
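
A minimal sketch of one standard way to compute a local ROC along a polyline: the circumradius R = abc / (4 * area) of each triple of consecutive vertices. This is offered as an illustration of the kind of geometric computation involved, not the paper's specific algorithm.

```python
# Local radius-of-curvature estimate along a polyline via circumradius.
import math

def circumradius(p1, p2, p3):
    """Radius of the circle through three points; inf for collinear points."""
    a = math.dist(p2, p3)
    b = math.dist(p1, p3)
    c = math.dist(p1, p2)
    # twice the signed triangle area via the cross product
    cross = (p2[0] - p1[0]) * (p3[1] - p1[1]) - (p2[1] - p1[1]) * (p3[0] - p1[0])
    area = abs(cross) / 2.0
    return float("inf") if area == 0 else (a * b * c) / (4.0 * area)

def polyline_roc(points):
    """ROC assigned to every interior vertex of the polyline."""
    return [circumradius(points[i - 1], points[i], points[i + 1])
            for i in range(1, len(points) - 1)]

# Example: a gentle curve sampled roughly every 10 m (coordinates in metres)
centerline = [(0, 0), (10, 0.5), (20, 2.0), (30, 4.5), (40, 8.0)]
print([round(r, 1) for r in polyline_roc(centerline)])
```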

A Fuzzy Predictive Filter for Sinusoidal Signals with Time-Varying Frequencies

Prediction of sinusoidal signals with time-varying frequencies has been an important research topic in power electronics systems. To solve this problem, we propose a new fuzzy predictive filtering scheme, which is based on a finite impulse response (FIR) filter bank. Fuzzy logic is introduced here to provide appropriate interpolation of the individual filter outputs. Therefore, instead of regular 'hard' switching, our method has advantageous 'soft' switching among different filters. Simulation comparisons between the fuzzy predictive filter and a conventional filter-bank approach demonstrate that the new scheme can achieve enhanced prediction performance for slowly changing sinusoidal input signals.
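
A minimal sketch of soft switching over a predictor bank: each two-tap predictor is exact for one nominal frequency, and triangular fuzzy weights on an estimated frequency blend their outputs instead of hard-switching. The frequency grid, test signal and the assumption that a frequency estimate is available are illustrative choices, not the paper's design.

```python
# Fuzzy blending of a bank of one-step sinusoid predictors.
import numpy as np

T = 1e-3                                    # sampling period [s]
bank_freqs = np.array([40.0, 50.0, 60.0])   # nominal filter frequencies [Hz]

def predict(x_n, x_nm1, f):
    """x[n+1] = 2*cos(2*pi*f*T)*x[n] - x[n-1], exact for a sinusoid at f."""
    return 2.0 * np.cos(2.0 * np.pi * f * T) * x_n - x_nm1

def fuzzy_weights(f_est, centres, width=10.0):
    w = np.maximum(1.0 - np.abs(f_est - centres) / width, 0.0)   # triangular MFs
    return w / w.sum() if w.sum() > 0 else np.full(len(centres), 1.0 / len(centres))

# Test signal whose frequency drifts slowly from 45 Hz to 55 Hz
n = np.arange(3000)
freq = 45.0 + 10.0 * n / n[-1]
phase = 2.0 * np.pi * np.cumsum(freq) * T
x = np.sin(phase)

errors = []
for k in range(1, len(x) - 1):
    weights = fuzzy_weights(freq[k], bank_freqs)   # frequency estimate assumed known
    preds = predict(x[k], x[k - 1], bank_freqs)    # one prediction per filter
    errors.append(x[k + 1] - float(np.dot(weights, preds)))
rms = float(np.sqrt(np.mean(np.square(errors))))
print(f"RMS one-step prediction error: {rms:.5f}")
```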

Method of Moments for Analysis of Multiple Crack Interaction in an Isotropic Elastic Solid

The problem of N interacting cracks in an isotropic elastic solid is decomposed into a subproblem of a homogeneous solid without cracks and N subproblems, each with a single crack subjected to unknown tractions on its two faces. The unknown tractions, namely the pseudo-tractions on each crack, are expanded into polynomials with unknown coefficients, which are determined by the consistency condition, i.e. by the equivalence of the original multiple-crack interaction problem and the superposition of the N+1 subproblems. In this paper, Kachanov's approach of average tractions is extended into a method of moments to approximately impose the consistency condition; Kachanov's method can thus be viewed as the zeroth-order method of moments. Numerical results for the stress intensity factors are presented for interactions of two collinear cracks, three collinear cracks, two parallel cracks, and three parallel cracks. As the order of the moments increases, the accuracy of the method of moments improves.
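
A schematic statement of the pseudo-traction expansion and the moment form of the consistency condition, with the notation (local coordinate ξ, half-length a_k, polynomial order M, induced tractions Δt_kl) assumed here for illustration rather than taken from the paper:

```latex
% Pseudo-traction expansion on crack k (local coordinate \xi \in [-a_k, a_k])
p_k(\xi) \;\approx\; \sum_{m=0}^{M} c_{k,m}\,\xi^{m}, \qquad k = 1,\dots,N,
% with the consistency condition imposed in moment (weighted-residual) form
\int_{-a_k}^{a_k} \xi^{\,j}\Bigl[\, p_k(\xi) \;-\; t_k^{0}(\xi)
   \;-\; \sum_{l \neq k} \Delta t_{kl}(\xi) \Bigr]\, d\xi \;=\; 0,
\qquad j = 0,\dots,M,
```

where t_k^0 is the traction induced on crack k by the homogeneous (crack-free) subproblem and Δt_kl is the traction induced on crack k by the pseudo-tractions acting on crack l; taking M = 0 reduces this to Kachanov's average-traction approximation.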

Flagging Critical Components to Prevent Transient Faults in Real-Time Systems

This paper proposes the use of metrics in design space exploration that highlight where in the structure of the model, and at what point in the behaviour, prevention against transient faults is needed. Previous approaches to tackling transient faults have focused on recovery after detection; almost no research has been directed towards preventive measures. But in real-time systems, hard deadlines are performance requirements that absolutely must be met, and a missed deadline constitutes an erroneous action and a possible system failure. The proposed metrics assess the system design to flag where transient faults may have significant impact. These tools then allow the design to be changed to minimize that impact, and they also flag where particular design techniques, such as coding of communications or memories, need to be applied in later stages of design.

An Optimal Unsupervised Satellite image Segmentation Approach Based on Pearson System and k-Means Clustering Algorithm Initialization

This paper presents an optimal, unsupervised satellite image segmentation approach based on the Pearson system and k-means clustering algorithm initialization. The method can be considered original in that it uses the k-means clustering algorithm for an optimal initialization of the number of image classes on the one hand, and exploits the Pearson system for an optimal assignment of statistical distributions to each considered class on the other hand. Satellite image exploitation requires the use of different approaches, especially those founded on the unsupervised statistical segmentation principle. Such approaches require the definition of several parameters, such as the number of image classes, the estimation of class variables and generalized mixture distributions. The use of statistical image attributes yields convincing and promising results, provided that the initialization step is optimal and appropriate statistical distributions are assigned to the classes. The Pearson system, associated with a k-means clustering algorithm and the Stochastic Expectation-Maximization (SEM) algorithm, can be adapted to this problem. For each image class, the Pearson system assigns one distribution type according to different parameters, especially the skewness (β1) and the kurtosis (β2). The adapted algorithms (the k-means clustering algorithm, the SEM algorithm and the Pearson system algorithm) are then applied to the satellite image segmentation problem. The efficiency of these combined algorithms was validated first with the mean quadratic error (MQE) and secondly by visual inspection across several comparisons of the unsupervised image segmentations.
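
A minimal sketch of how the Pearson system can assign a distribution type to a class from its skewness and kurtosis, via the classical Pearson criterion κ; in practice β1 and β2 would be estimated from the pixels that k-means assigns to the class. The boundary tolerances are an illustrative choice.

```python
# Pearson-system type selection from skewness (beta1) and kurtosis (beta2).
# In practice beta1 = (sample skewness)^2 and beta2 = (non-excess) sample
# kurtosis of the pixels assigned to the class.

def pearson_type(beta1, beta2, tol=1e-9):
    if abs(beta1) < tol:                         # symmetric distributions
        if abs(beta2 - 3.0) < tol:
            return "Normal"
        return "Type II" if beta2 < 3.0 else "Type VII"
    denom = (4.0 * beta2 - 3.0 * beta1) * (2.0 * beta2 - 3.0 * beta1 - 6.0)
    if abs(denom) < tol:
        return "Type III"                        # kappa tends to +/- infinity
    kappa = beta1 * (beta2 + 3.0) ** 2 / (4.0 * denom)
    if kappa < 0.0:
        return "Type I"
    if abs(kappa - 1.0) < tol:
        return "Type V"
    return "Type IV" if kappa < 1.0 else "Type VI"

# Examples: a Gaussian-like class and a gamma-like (skewed) class
print(pearson_type(0.0, 3.0))   # -> Normal
print(pearson_type(2.0, 6.0))   # -> Type III (gamma family)
```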

A Comparison of Exact and Heuristic Approaches to Capital Budgeting

This paper summarizes and compares approaches to solving the knapsack problem and its well-known application in capital budgeting. The first approach uses deterministic methods and can be applied to small problem instances with a single constraint; commercial software systems such as the GAMS modelling system can also be applied. However, because of the NP-completeness of the problem, more complex problem instances must be solved by means of heuristic techniques to achieve an approximation of the exact solution in a reasonable amount of time. We show the problem representation and parameter settings for a genetic algorithm framework.
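
A minimal sketch of a genetic algorithm for the 0/1 knapsack formulation of capital budgeting: bit-string representation, tournament selection, one-point crossover, bit-flip mutation and a repair step enforcing the budget constraint. The instance data and parameter values are illustrative, not the settings reported in the paper.

```python
# Genetic algorithm sketch for a 0/1 knapsack (capital budgeting) instance.
import random

profits = [92, 57, 49, 68, 60, 43, 67, 84, 87, 72]   # project returns
costs   = [23, 31, 29, 44, 53, 38, 63, 85, 89, 82]   # project outlays
budget  = 165

def repair(x):
    """Drop the worst profit/cost projects while the budget is exceeded."""
    while sum(c for c, g in zip(costs, x) if g) > budget:
        chosen = [i for i, g in enumerate(x) if g]
        worst = min(chosen, key=lambda i: profits[i] / costs[i])
        x[worst] = 0
    return x

def fitness(x):
    return sum(p for p, g in zip(profits, x) if g)

def tournament(pop, k=3):
    return max(random.sample(pop, k), key=fitness)

pop = [repair([random.randint(0, 1) for _ in profits]) for _ in range(60)]
for _ in range(200):
    new_pop = []
    while len(new_pop) < len(pop):
        a, b = tournament(pop), tournament(pop)
        cut = random.randrange(1, len(profits))           # one-point crossover
        child = a[:cut] + b[cut:]
        child = [1 - g if random.random() < 0.02 else g for g in child]  # mutation
        new_pop.append(repair(child))
    pop = new_pop

best = max(pop, key=fitness)
print("selected projects:", [i for i, g in enumerate(best) if g],
      "profit:", fitness(best),
      "cost:", sum(c for c, g in zip(costs, best) if g))
```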

Restoration of Noisy Document Images with an Efficient Bi-Level Adaptive Thresholding

An effective approach for extracting document images from a noisy background is introduced. The entire scheme is divided into three sub-techniques: initial preprocessing operations for noise-cluster tightening; a new thresholding method that maximizes the ratio of the standard deviation of the combined effect on the image to the sum of the weighted class standard deviations; and finally an image restoration phase in which the image is binarized using the proposed optimum threshold level. The proposed method is found to be efficient compared to existing schemes in terms of computational complexity as well as speed, with better noise rejection.
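
A minimal sketch of a criterion-sweep threshold selection in the spirit of the described method: every candidate level splits the histogram into two classes, and the ratio of the overall standard deviation to the sum of the weight-scaled class standard deviations is evaluated. This is an illustrative reading of the criterion, not the authors' exact formulation.

```python
# Threshold selection by sweeping a standard-deviation ratio criterion.
import numpy as np

def select_threshold(gray):
    gray = gray.ravel().astype(float)
    sigma_total = gray.std()
    best_t, best_score = None, -np.inf
    for t in range(1, 255):
        lo, hi = gray[gray < t], gray[gray >= t]
        if lo.size == 0 or hi.size == 0:
            continue
        w0, w1 = lo.size / gray.size, hi.size / gray.size
        denom = w0 * lo.std() + w1 * hi.std()
        if denom <= 0:
            continue
        score = sigma_total / denom
        if score > best_score:
            best_t, best_score = t, score
    return best_t

# Toy bimodal "document" image: dark text pixels on a bright noisy background
rng = np.random.default_rng(0)
img = rng.normal(200, 12, (64, 64))
img[20:40, 10:50] = rng.normal(60, 15, (20, 40))
t = select_threshold(np.clip(img, 0, 255))
binary = (img >= t).astype(np.uint8)      # 1 = background, 0 = text
print("selected threshold:", t)
```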

CFD Simulation and Validation of Flow Pattern Transition Boundaries during Moderately Viscous Oil-Water Two-Phase Flow through Horizontal Pipeline

In the present study, a computational fluid dynamics (CFD) simulation has been executed to investigate the transition boundaries of different flow patterns for moderately viscous oil-water two-phase flow (viscosity ratio 107, density ratio 0.89 and interfacial tension 0.032 N/m) through a horizontal pipeline with an internal diameter of 0.025 m and a length of 7.16 m. The Volume of Fluid (VOF) approach, including the effect of surface tension, has been employed to predict the flow pattern. The geometry and meshing of the present problem have been created using GAMBIT, and ANSYS FLUENT has been used for the simulation. A total of 47037 quadrilateral elements are chosen for the geometry of the horizontal pipeline. The computation has been performed by assuming unsteady flow, an immiscible liquid pair, constant liquid properties, co-axial flow and a T-junction as the entry section. The simulation correctly predicts the transition boundary from wavy stratified to stratified mixed flow; other transition boundaries are yet to be simulated. The simulated data have been validated with our own experimental results.

Multivariate High Order Fuzzy Time Series Forecasting for Car Road Accidents

In this paper, we present a new multivariate fuzzy time series forecasting method. This method assumes m factors, with one main factor of interest. The history of the past three years is used for making new forecasts. The new method is applied to forecasting the total number of car accidents in Belgium using four secondary factors. We also compare our proposed method with existing fuzzy time series forecasting methods; experimentally, it is shown that the proposed method performs better than the existing methods. Practically, actuaries are interested in analysing the patterns of causality in road accidents. Using fuzzy time series, actuaries can thus define fuzzy premiums and fuzzy underwriting for car insurance and life insurance. The National Institute of Statistics, Belgium, provides a risk classification by region for each road; using this risk classification, premium rates and underwriting for insurance policyholders can be predicted.

An Atomic-Domains-Based Approach for Attack Graph Generation

The attack graph is an integral part of modeling an overview of network security. System administrators use attack graphs to determine how vulnerable their systems are and what security measures to deploy to defend them. Previous methods for attack graph generation (AGG) consider the whole network, which makes the process of AGG complex and non-scalable. In this paper, we propose a new, simple and scalable approach to AGG that decomposes the whole network into atomic domains, where each atomic domain represents a host with a specific privilege. The process of AGG is then achieved through communications among the atomic domains. Our approach simplifies the design process for the whole network and can give attack graphs that include each attack path for each host; when the network changes, we only repeat the operations for the corresponding atomic domains, which makes the process of AGG scalable.
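
A minimal sketch of how atomic domains, modelled as (host, privilege) pairs, can be composed into an attack graph by propagating exploits over reachable hosts; the topology, privileges and exploit rules are illustrative placeholders, not the paper's model.

```python
# Atomic-domain attack graph sketch: nodes are (host, privilege) pairs,
# edges are exploit transitions discovered by breadth-first propagation.
from collections import deque

start = ("hostA", "user")                          # attacker's initial domain

connectivity = {("hostA", "hostB"), ("hostB", "hostC"), ("hostA", "hostC")}

# exploit rules per target host: (required privilege on source, privilege gained)
exploits = {
    "hostB": [("user", "user")],      # remote exploit grants user on hostB
    "hostC": [("root", "root")],      # hostC only falls to a root attacker
    "hostA": [("user", "root")],      # local privilege escalation on hostA
}

def generate_attack_graph(start):
    """Breadth-first composition of atomic-domain transitions into edges."""
    edges, visited, queue = [], {start}, deque([start])
    while queue:
        host, priv = queue.popleft()
        for target, rules in exploits.items():
            reachable = target == host or (host, target) in connectivity
            for req_priv, gained in rules:
                nxt = (target, gained)
                if reachable and priv == req_priv and nxt != (host, priv):
                    edges.append(((host, priv), nxt))
                    if nxt not in visited:
                        visited.add(nxt)
                        queue.append(nxt)
    return edges

for src, dst in generate_attack_graph(start):
    print(f"{src} -> {dst}")
```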