Computational Tool for Techno-Economic Evaluation of Steam/Oxygen Fluidized Bed Biomass Gasification Technologies

The paper presents a computational tool developed to evaluate the technical and economic advantages of an innovative cleaning and conditioning technology for the product gas leaving fluidized bed steam/oxygen gasifiers. This technology integrates the steam gasification of biomass and the hot gas cleaning and conditioning system into a single unit. Both components of the computational tool, the process flowsheet and the economic evaluator, have been developed in the IPSEpro software environment. The economic model provides information that can help potential users, especially small and medium-sized enterprises active in the renewable energy field, to decide the optimal scale of a plant and to better understand both the potential and the limits of the system when applied to a wide range of conditions.

Dynamic Feature Selection for Heart Disease Classification

The healthcare environment is generally perceived as being information rich yet knowledge poor: effective analysis tools to discover hidden relationships and trends in the data are lacking, even though valuable knowledge can be discovered by applying data mining techniques to healthcare systems. This study presents a proficient methodology for extracting significant patterns from coronary heart disease warehouses for the prediction of heart attack, which unfortunately remains a leading cause of mortality worldwide. For this purpose, we propose to enumerate dynamically the optimal subsets of reduced features of high interest by using the rough sets technique combined with dynamic programming. We then validate the classification using a Random Forest (RF) decision tree to identify risky heart disease cases. This work is based on a large amount of data collected from several clinical institutions and on the medical profiles of patients. Moreover, expert knowledge in this field has been taken into consideration to define the disease and its risk factors and to establish significant knowledge relationships among the medical factors. A computer-aided system is developed for this purpose based on a population of 525 adults. The performance of the proposed model is analyzed and evaluated against a set of benchmark techniques applied to this classification problem.
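
As a rough illustration of the classification stage only (not the authors' rough-set/dynamic-programming pipeline), the sketch below trains a Random Forest on a reduced feature subset with scikit-learn; the feature indices, cohort size and data are placeholders.

```python
# Illustrative sketch: a Random Forest applied to a reduced feature subset,
# standing in for the rough-set/dynamic-programming selection described in
# the abstract. Feature indices and data are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(525, 12))             # 525 patients, 12 candidate risk factors
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=525) > 0).astype(int)

selected = [0, 3, 5, 7]                    # indices assumed returned by the feature-selection stage
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X[:, selected], y, cv=5)
print("mean CV accuracy on reduced features:", scores.mean())
```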

A Novel Model for Simultaneously Minimising Costs and Risks in Just-in-Time Systems Using Multi-Backup Suppliers: Part 2: Results

This paper applies the inventory model developed in the first part to a simplified problem in order to simultaneously reduce costs and risks in JIT systems. The model is developed to ascertain an optimal ordering strategy for procuring raw materials by using regular suppliers alongside multiple external and local backup suppliers, so as to reduce the total cost of the products while at the same time reducing the risks that arise from this cost reduction within production systems. A comparison between the cost of using the JIT system and that of using the proposed inventory model shows the superiority of the inventory model.

Variation of Spot Price and Profits of Andhra Pradesh State Grid in Deregulated Environment

In this paper, the variation of the spot price and the total profits of generating companies trading through wholesale electricity markets are discussed with and without the Central Generating Stations (CGS) share, and seasonal variations are also considered. It demonstrates how a proper analysis of generators' efficiencies and capabilities, the types of generators owned, fuel costs, transmission losses and settling-price variation, using the solutions of Optimal Power Flow (OPF), can allow companies to maximize overall revenue under different scenarios. The study is also extended to the computation of Available Transfer Capability (ATC), which is very important for transmission system security and market forecasting. The results show how crucial it is for companies to plan their daily operations, and the approach is certainly useful in an online environment of a deregulated power system. The above tasks are demonstrated on the 124-bus real-life Indian utility power system of the Andhra Pradesh State Grid, and the results are presented and analyzed.
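
As a toy illustration of the kind of market analysis an OPF study supports (not the paper's 124-bus model), the sketch below dispatches a few hypothetical generators against a demand using a linear program and reads off a notional spot price as the marginal cost of the last dispatched unit; all cost and capacity figures are invented and no network constraints are modelled.

```python
# Toy economic-dispatch sketch with linear costs and no transmission network.
from scipy.optimize import linprog

costs  = [1.8, 2.4, 3.1, 4.0]      # assumed marginal cost of each generator (Rs/kWh)
pmax   = [300, 250, 200, 150]      # assumed generation limits (MW)
demand = 620                       # assumed system demand (MW)

res = linprog(c=costs, A_eq=[[1, 1, 1, 1]], b_eq=[demand],
              bounds=[(0, p) for p in pmax], method="highs")
dispatch = res.x
marginal = max(c for c, g in zip(costs, dispatch) if g > 1e-6)   # last dispatched unit sets the price
print("dispatch (MW):", dispatch.round(1), " notional spot price (Rs/kWh):", marginal)
```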

A Contractor for the Symmetric Solution Set

The symmetric solution set $\Sigma_{sym}$ is the set of all solutions to the linear systems $Ax = b$, where $A$ is symmetric and lies between given bounds $\underline{A}$ and $\overline{A}$, and $b$ lies between $\underline{b}$ and $\overline{b}$. We present a contractor for $\Sigma_{sym}$, that is, an iterative method that starts with some initial enclosure of $\Sigma_{sym}$ (by means of a Cartesian product of intervals) and sequentially makes the enclosure tighter. Our contractor is based on polyhedral approximation and on solving a series of linear programs. Even though it does not converge to the optimal bounds in general, it may significantly reduce the overestimation. Its efficiency is demonstrated on a number of numerical experiments.
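
The following minimal sketch (not the contractor of the paper) shows the flavour of one LP-based tightening sweep: the Oettli-Prager characterization of the general, non-symmetric solution set, which encloses $\Sigma_{sym}$, is relaxed to a polyhedron using the current box, and each coordinate is then minimised and maximised by linear programming. The interval data are made up.

```python
# One crude LP-based tightening sweep for an interval linear system.
# The Oettli-Prager inequality |A_c x - b_c| <= A_d |x| + b_d is relaxed by
# bounding |x| with the current box, giving a polyhedron over which each
# coordinate is minimised/maximised.
import numpy as np
from scipy.optimize import linprog

A_lo = np.array([[2.0, -1.0], [-1.0, 2.0]]); A_hi = np.array([[3.0, 0.0], [0.0, 3.0]])
b_lo = np.array([1.0, 1.0]);                 b_hi = np.array([2.0, 2.0])
lo, hi = np.array([-10.0, -10.0]), np.array([10.0, 10.0])    # initial enclosure

A_c, A_d = (A_lo + A_hi) / 2, (A_hi - A_lo) / 2              # midpoint and radius
b_c, b_d = (b_lo + b_hi) / 2, (b_hi - b_lo) / 2
m = np.maximum(np.abs(lo), np.abs(hi))                        # bound on |x| over the box
rhs = A_d @ m + b_d
A_ub = np.vstack([A_c, -A_c])
b_ub = np.concatenate([b_c + rhs, -(b_c - rhs)])

new_lo, new_hi = lo.copy(), hi.copy()
for i in range(len(lo)):
    c = np.zeros(len(lo)); c[i] = 1.0
    new_lo[i] = linprog(c,  A_ub=A_ub, b_ub=b_ub, bounds=list(zip(lo, hi))).fun
    new_hi[i] = -linprog(-c, A_ub=A_ub, b_ub=b_ub, bounds=list(zip(lo, hi))).fun
print("tightened enclosure:", list(zip(new_lo, new_hi)))
```

Repeating the sweep with the tightened box (and hence a smaller bound on |x|) mimics the iterative nature of a contractor.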

Correlations between Cleaning Frequency of Reservoir and Water Tower and Parameters of Water Quality

This study sampled and analyzed the water quality of water reservoirs and water towers installed in two kinds of residential buildings and in school facilities. Water-quality data were collected for correlation analysis with the frequency of sanitization of the water reservoirs, which was obtained by questioning the building managers about the inspection charts kept for the reservoir equipment. The statistical software package SPSS was applied to the two groups of data (cleaning frequency and water quality) for regression analysis to determine the optimal frequency of sanitization. The correlation coefficient (R) in this paper represents the degree of correlation, with values of R ranging from +1 to -1. After investigating three categories of drinking-water users, this study found that the frequency of sanitization of the water reservoir significantly influenced the quality of the drinking water: a higher frequency of sanitization (more than four times per year) implied a higher drinking-water quality. The results indicate that water reservoirs and water towers should be sanitized at least twice annually to achieve safe drinking water.
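
For concreteness, a minimal sketch of the correlation/regression step is given below using SciPy rather than SPSS; the cleaning frequencies and turbidity values are purely hypothetical.

```python
# Hypothetical illustration of the statistical step: correlating sanitisation
# frequency with a water-quality indicator and fitting a simple regression.
from scipy import stats

cleanings_per_year = [1, 1, 2, 2, 3, 4, 4, 6]                   # hypothetical survey answers
turbidity_ntu      = [2.8, 2.5, 1.9, 2.1, 1.4, 1.0, 1.1, 0.7]   # hypothetical measurements

r, p = stats.pearsonr(cleanings_per_year, turbidity_ntu)
slope, intercept, r_reg, p_reg, stderr = stats.linregress(cleanings_per_year, turbidity_ntu)
print(f"Pearson R = {r:.2f} (p = {p:.3f}); turbidity ≈ {intercept:.2f} {slope:+.2f}·frequency")
```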

Optimization Based Obstacle Avoidance

Based on a non-linear single-track model that describes the dynamics of the vehicle, an optimal path planning strategy is developed. Real-time optimization is used to generate reference control values that lead the vehicle along a calculated lane which is optimal with respect to different objectives such as energy consumption, run time, safety or comfort. A strict mathematical formulation of autonomous driving allows decisions to be taken in undefined situations such as lane changes or obstacle avoidance. Based on the position of the vehicle, the lane situation and the obstacle position, the optimization problem is reformulated in real time to avoid the obstacle and prevent a collision.
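
A minimal sketch of the receding-horizon idea is shown below; it replaces the paper's non-linear single-track model with a simple kinematic point, and the speed, horizon, obstacle position and cost weights are assumptions for illustration only.

```python
# Receding-horizon sketch: a kinematic point is steered over N steps, trading
# off lane keeping and control effort while a soft penalty keeps it away from
# one obstacle on the lane centre.
import numpy as np
from scipy.optimize import minimize

N, dt, v = 20, 0.1, 10.0                   # horizon, step, forward speed (assumed)
obstacle = np.array([10.0, 0.0])           # hypothetical obstacle on the lane centre

def rollout(steer):
    x, y, psi = 0.0, 0.0, 0.0
    traj = []
    for u in steer:                        # simple kinematic update
        x += v * np.cos(psi) * dt
        y += v * np.sin(psi) * dt
        psi += u * dt
        traj.append((x, y))
    return np.array(traj)

def cost(steer):
    traj = rollout(steer)
    lane = np.sum(traj[:, 1] ** 2)                      # stay near the lane centre y = 0
    effort = np.sum(np.square(steer))                   # comfort / energy surrogate
    d = np.linalg.norm(traj - obstacle, axis=1)
    avoid = 50.0 * np.sum(np.exp(-(d / 1.5) ** 2))      # soft obstacle penalty
    return lane + 0.1 * effort + avoid

res = minimize(cost, np.zeros(N), method="L-BFGS-B", bounds=[(-0.5, 0.5)] * N)
print("optimal steering sequence found:", res.success)
```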

The Development of Decision Support Systems for Waste Management: A Review

Most decision support systems (DSS) constructed for waste management (WM) are not widely marketed and lack practical application. This is due to the number of variables and the complexity of the mathematical models, which include the assumptions and constraints required in decision making. The approach taken by many researchers in DSS modelling is to isolate a few key factors that have a significant influence on the DSS. This segmented approach does not provide a thorough understanding of the complex relationships among the many elements involved. The various elements in constructing a DSS must be integrated and optimized in order to produce a viable model that is marketable and has practical application. A DSS model used to assist decision makers should be integrated with GIS, should give robust predictions despite the inherent uncertainties of waste generation and the plethora of waste characteristics, and should give an optimal allocation of the waste stream among recycling, incineration, landfill and composting.

Numerical Analysis and Experimental Validation of Detector Pressure Housing Subject to HPHT

Reservoirs with high pressures and temperatures (HPHT) that were considered atypical in the past are now frequent targets for exploration. For downhole oilfield drilling tools and components, temperature and pressure affect the mechanical strength. To address this issue, a finite element analysis (FEA) at 206.84 MPa (30 ksi) pressure and 165°C has been performed on the pressure housing of the measurement-while-drilling/logging-while-drilling (MWD/LWD) density tool. The density tool is an MWD/LWD sensor that measures the density of the formation; one of its components is the pressure housing positioned inside the tool. The FEA results are compared with an experimental test performed on the pressure housing of the density tool. The results show a close match between the numerical analysis and the experimental test. This FEA model can be used for extreme HPHT and ultra-HPHT analyses and/or optimal design changes.
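
As a hand-check that could sit alongside such an FEA, the sketch below evaluates the classical Lamé thick-walled-cylinder stress for an externally pressurised housing; the 206.84 MPa load is taken from the abstract, while the radii and yield strength are hypothetical and the effect of temperature on strength is ignored.

```python
# Lamé thick-walled cylinder under external pressure only: the hoop stress
# magnitude is largest at the bore (r = r_i). Radii and yield are assumed.
p_o = 206.84e6           # external pressure, Pa (30 ksi, from the abstract)
r_i, r_o = 0.030, 0.045  # assumed inner/outer radii of the housing, m
sigma_y = 900e6          # assumed yield strength of the housing alloy, Pa

sigma_hoop = -2.0 * p_o * r_o**2 / (r_o**2 - r_i**2)
safety_factor = sigma_y / abs(sigma_hoop)
print(f"hoop stress at bore: {sigma_hoop/1e6:.0f} MPa, safety factor vs yield: {safety_factor:.2f}")
```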

A Novel Approach to Route Choice in Stochastic Time-Varying Networks

Many existing studies use Markov decision processes (MDPs) to model optimal route choice in stochastic, time-varying networks. However, taking large amounts of variable traffic data and transforming them into optimal route decisions is a computational challenge when MDPs are employed in real transportation networks. In this paper we model finite-horizon MDPs using directed hypergraphs. It is shown that the problem of route choice in stochastic, time-varying networks can be formulated as a minimum-cost hyperpath problem and solved in linear time. We finally demonstrate the significant computational advantages of the introduced methods.
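
To make the underlying dynamic program concrete, the toy sketch below runs expected-cost backward induction on a tiny time-varying network with stochastic arc travel times; it illustrates the finite-horizon MDP being solved, not the hypergraph formulation or its linear-time algorithm, and the network data are invented.

```python
# Backward induction for route choice A -> C on a toy time-varying network.
T = 6                                              # planning horizon (time steps)
dest = "C"

# arcs(node, t) -> list of (next_node, [(travel_time, probability), ...]) per departure time t
def arcs(node, t):
    if node == "A":
        return [("B", [(1, 0.7), (2, 0.3)]), ("C", [(3, 0.5), (5 if t < 3 else 4, 0.5)])]
    if node == "B":
        return [("C", [(1, 0.8), (3, 0.2)])]
    return []

INF = float("inf")
V = {(n, t): (0.0 if n == dest else INF) for n in "ABC" for t in range(T + 1)}
policy = {}
for t in range(T - 1, -1, -1):                     # backward induction over departure times
    for n in "AB":
        best, best_arc = INF, None
        for nxt, dist in arcs(n, t):
            exp = sum(p * (tau + V[(nxt, min(t + tau, T))]) for tau, p in dist)
            if exp < best:
                best, best_arc = exp, nxt
        V[(n, t)], policy[(n, t)] = best, best_arc
print("expected time A->C departing at t=0:", V[("A", 0)], "first move:", policy[("A", 0)])
```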

Autonomously Determining the Parameters for SVDD with RBF Kernel from a One-Class Training Set

The one-class support vector machine “support vector data description” (SVDD) is an ideal approach for anomaly or outlier detection. However, for the applicability of SVDD in real-world applications, ease of use is crucial. The results of SVDD are strongly determined by the choice of the regularisation parameter C and the kernel parameter of the widely used RBF kernel. While for two-class SVMs the parameters can be tuned using cross-validation based on the confusion matrix, for a one-class SVM this is not possible, because only true positives and false negatives can occur during training. This paper proposes an approach to find the optimal set of parameters for SVDD solely based on a training set from one class and without any user parameterisation. Results on artificial and real data sets are presented, underpinning the usefulness of the approach.
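
A hedged illustration of the parameter-search problem is sketched below using scikit-learn's OneClassSVM, which is closely related to SVDD with an RBF kernel. The selection rule shown, scanning (nu, gamma) and keeping the tightest model whose training-set rejection rate stays under a target, is a generic heuristic and not the method proposed in the paper; the data are synthetic.

```python
# Scan (nu, gamma) for a one-class SVM and keep the model with the fewest
# support vectors that still rejects at most 5% of the (one-class) training set.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)
X_train = rng.normal(loc=0.0, scale=1.0, size=(300, 2))        # one-class training set

target_rejection = 0.05
best = None
for nu in (0.01, 0.05, 0.1):
    for gamma in (0.01, 0.1, 1.0, 10.0):
        model = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(X_train)
        rejected = np.mean(model.predict(X_train) == -1)        # false negatives on training data
        n_sv = model.support_vectors_.shape[0]
        if rejected <= target_rejection and (best is None or n_sv < best[0]):
            best = (n_sv, nu, gamma)
print("selected (support vectors, nu, gamma):", best)
```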

Near-Field Robust Adaptive Beamforming Based on Worst-Case Performance Optimization

The performance of adaptive beamforming degrades substantially in the presence of steering vector mismatches. This degradation is especially severe in the near field, because the 3-dimensional source location is more difficult to estimate than the 2-dimensional direction of arrival in far-field cases. As a solution, a novel approach to near-field robust adaptive beamforming (RABF) is proposed in this paper. It is a natural extension of traditional far-field RABF and belongs to the class of diagonal loading approaches, with the loading level determined based on worst-case performance optimization. However, unlike methods that solve for the optimal loading by iteration, a simple closed-form solution is suggested here after some approximations, and consequently the optimal weight vector can also be expressed in closed form. Besides simplicity and low computational cost, the proposed approach reveals how different factors affect the optimal loading as well as the weight vector. Its excellent performance in the near field is confirmed via a number of numerical examples.
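
The diagonal-loading family the method belongs to can be sketched in a few lines; the example below uses a fixed, heuristically chosen loading level and a presumed steering vector, and does not reproduce the paper's closed-form worst-case loading.

```python
# Diagonally loaded MVDR/Capon weights for an M-element array.
import numpy as np

rng = np.random.default_rng(0)
M, snapshots = 8, 200
a = np.exp(1j * np.pi * np.arange(M) * np.sin(0.3))        # presumed (possibly mismatched) steering vector
X = (rng.normal(size=(M, snapshots)) + 1j * rng.normal(size=(M, snapshots))) / np.sqrt(2)
R = X @ X.conj().T / snapshots                              # sample covariance matrix

loading = 10 * np.trace(R).real / M                         # assumed loading level (not the paper's)
Rl = R + loading * np.eye(M)
w = np.linalg.solve(Rl, a)
w /= a.conj() @ w                                           # distortionless response toward a
print("array response toward presumed direction:", abs(w.conj() @ a))
```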

Distributed Detection and Optimal Traffic-blocking of Network Worms

Despite the recent surge of research on the control of worm propagation, there is currently no effective defense system against such cyber attacks. We first design a distributed detection architecture called Detection via Distributed Blackholes (DDBH). This novel detection mechanism can be implemented via virtual honeypots or honeynets. Simulation results show that a worm can be detected with virtual honeypots on only 3% of the nodes, and that the worm is detected when less than 1.5% of the nodes are infected. We then develop two control strategies: (1) optimal dynamic traffic-blocking, for which we determine the condition that guarantees a minimum number of removed nodes when the worm is contained, and (2) predictive dynamic traffic-blocking, a realistic deployment of the optimal strategy on scale-free graphs. Predictive dynamic traffic-blocking, coupled with DDBH, ensures that more than 40% of the network is unaffected by the propagation at the time the worm is contained.
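
A simplified propagation sketch conveys the detection idea (it is not the DDBH architecture or the blocking controllers): a worm spreads over a scale-free graph in which 3% of the nodes act as virtual honeypots, and the infected fraction is recorded the first time a honeypot is reached. Graph size and infection probability are arbitrary.

```python
# Worm spread on a Barabasi-Albert graph with 3% honeypot nodes.
import random
import networkx as nx

random.seed(0)
G = nx.barabasi_albert_graph(2000, 3)
honeypots = set(random.sample(list(G.nodes), k=int(0.03 * G.number_of_nodes())))

infected = {random.choice([n for n in G.nodes if n not in honeypots])}
detected_at = None
for step in range(200):
    newly = {nbr for u in infected for nbr in G.neighbors(u) if random.random() < 0.1}
    infected |= newly
    if infected & honeypots:                     # first contact with a honeypot -> detection
        detected_at = len(infected) / G.number_of_nodes()
        break
print("infected fraction at detection:", detected_at)
```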

An Agent Based Dynamic Resource Scheduling Model with FCFS-Job Grouping Strategy in Grid Computing

Grid computing is a group of clusters connected over high-speed networks that involves coordinating and sharing computational power, data storage and network resources across dynamic and geographically dispersed locations. Resource management and job scheduling are critical tasks in grid computing, and resource selection becomes challenging due to the heterogeneity and dynamic availability of resources. Job scheduling is an NP-complete problem, and different heuristics may be used to reach an optimal or near-optimal solution. This paper proposes a model for resource selection and job scheduling in a dynamic grid environment. The main focus is to maximize resource utilization and minimize the processing time of jobs. The grid resource selection strategy is based on a Max Heap Tree (MHT), which is well suited to large-scale applications, and the root node of the MHT is selected for job submission. A job grouping concept is used to maximize resource utilization when scheduling jobs in the grid. Together, the proposed resource selection model and the job grouping concept enhance the scalability, robustness, efficiency and load-balancing ability of the grid.
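
A small sketch of the two ingredients named above is given below: a max-heap (emulated with Python's heapq by negating keys) keeps the most capable resource at the root, and FCFS jobs are grouped until the group fills that resource's capacity for one dispatch interval. Resource speeds, job lengths and the interval are hypothetical, and a full scheduler would update the root's availability and re-heapify after each dispatch.

```python
# Max-heap resource selection plus FCFS job grouping.
import heapq

resources = [("R1", 500), ("R2", 1200), ("R3", 800)]    # (name, available MIPS), assumed values
heap = [(-mips, name) for name, mips in resources]
heapq.heapify(heap)                                      # root = most powerful resource

jobs = [90, 300, 150, 700, 120, 60, 400, 250]            # FCFS queue, job lengths in MI (assumed)
interval = 1.0                                           # seconds of work per dispatched group

while jobs:
    neg_mips, name = heap[0]                             # peek at the MHT root
    capacity = -neg_mips * interval                      # MI the root can process in one interval
    group = [jobs.pop(0)]                                # FCFS: always take the head job
    while jobs and sum(group) + jobs[0] <= capacity:
        group.append(jobs.pop(0))                        # keep grouping while the group still fits
    print(f"dispatch group {group} ({sum(group)} MI) to {name}")
```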

Identifying Features and Parameters to Devise an Accurate Intrusion Detection System Using an Artificial Neural Network

The aim of this article is to explain how features of attacks can be extracted from packets, and how vectors can be built and then applied to the input of any analysis stage. For the analysis, the work deploys a feedforward back-propagation neural network to act as a misuse intrusion detection system, using ten types of attacks as examples for training and testing the network. The article explains how the packets are analyzed to extract features, and shows how selecting the right features, building correct vectors, and correctly choosing the training method and the number of nodes in the hidden layer of the neural network affect the accuracy of the system. In addition, the work shows how to obtain the optimal weight values and use them to initialize the artificial neural network.
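
A hedged sketch of the analysis stage alone is given below, using scikit-learn's MLPClassifier as a stand-in for the feedforward back-propagation network; the number of features, the ten class labels and the data are placeholders rather than the article's packet captures.

```python
# Feed-forward network trained by back-propagation on placeholder feature vectors.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((2000, 20))                           # 20 features assumed extracted per packet/flow
y = rng.integers(0, 10, size=2000)                   # 10 attack types (placeholder labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(24,), activation="logistic",
                    solver="sgd", learning_rate_init=0.01, max_iter=500, random_state=0)
net.fit(X_tr, y_tr)
print("test accuracy (random placeholder data):", net.score(X_te, y_te))
```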

The Framework of Termination Mechanism in Modern Emergency Management

A termination mechanism is an indispensable part of the emergency management mechanism. Despite its importance in both theory and practice, it is still almost a brand-new field of research. This paper first proposes the concept of a termination mechanism, and then discusses the design and implementation issues that help guarantee the effectiveness and integrity of emergency management. Starting with an introduction to the problems caused by absent or incorrect termination, the essence of the termination mechanism is analyzed, a model based on Optimal Stopping Theory is constructed, and a termination index is given; the model can be applied to find the best termination time point. The termination decision should be considered not only in the termination stage but throughout the whole emergency management process, which makes it a dynamic decision-making process. In addition, the main subjects and the procedure of termination are illustrated once the termination time point is given. Future work is discussed at the end.
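
A toy backward-induction example of the Optimal Stopping idea is sketched below (it is not the paper's termination index): at each review point the manager either terminates the response, incurring a residual-risk cost equal to the current severity, or keeps it running for a fixed per-stage cost while the severity decays stochastically. All numbers are invented.

```python
# Backward induction for a finite-horizon optimal stopping problem.
import numpy as np

T = 12                                   # maximum number of review points (assumed)
grid = np.linspace(0.0, 1.0, 101)        # discretised severity index
run_cost = 0.03                          # assumed cost of running the response for one stage

V = grid.copy()                          # at the final review we must terminate; cost = severity
for t in range(T - 1, -1, -1):           # backward induction
    # with prob. 0.7 the response cuts severity by 20%, with prob. 0.3 it has no effect
    cont = run_cost + 0.7 * np.interp(0.8 * grid, grid, V) + 0.3 * V
    stop = grid                          # terminating now leaves the current residual risk
    V = np.minimum(stop, cont)

threshold = grid[stop <= cont].max()     # largest severity at which terminating is already optimal at t = 0
print(f"terminate the response once the severity index falls below about {threshold:.2f}")
```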

Design of a Single-Phase BLDC Motor and Finite-Element Analysis of the Effect of Stator Slot Structure on Efficiency

In this paper, the effect of the stator slot structure and the switching angle on a cylindrical single-phase brushless direct current (BLDC) motor is analyzed. A BLDC motor with three different stator slot structures is designed using the RMxprt software, and the efficiency of each structure under full-load conditions is presented. The motor is then modelled under different conditions in the Maxwell 3D software and analyzed electromagnetically with the finite element method. Finally, the influence of the switching angle on motor performance is investigated using MATLAB and the optimal angle is determined. The results indicate that, with the correct choice of stator slot structure and switching angle, maximum efficiency can be achieved.

Biodiesel Production from High Iodine Number Candlenut Oil

The transesterification of candlenut (Aleurites moluccana) oil with methanol, using potassium hydroxide as catalyst, was studied. The objective of the present investigation was to produce methyl ester for use as biodiesel. The operating variables employed were the methanol-to-oil molar ratio (3:1 – 9:1), the catalyst concentration (0.50 – 1.5%) and the temperature (303 – 343 K). An oil volume of 150 mL and a reaction time of 75 min were fixed as common parameters in all experiments. The concentration of methyl ester was evaluated from a mass balance on the free glycerol formed, which was analyzed using periodic acid. The optimal triglyceride conversion was attained with a methanol-to-oil ratio of 6:1 and a potassium hydroxide concentration of 1%, at room temperature. The methyl ester formed was characterized by its density, viscosity, and cloud and pour points. The biodiesel had properties similar to those of diesel oil, except for the viscosity, which was higher.
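
The mass-balance step mentioned above can be made explicit with a short stoichiometric sketch: each mole of free glycerol released corresponds to one mole of triglyceride converted (and three moles of methyl ester formed). The molar masses, oil charge and glycerol figure below are assumed typical values, not data from the paper.

```python
# Triglyceride conversion from the measured free glycerol (stoichiometric sketch).
M_TG, M_GLY = 880.0, 92.09          # g/mol, assumed average triglyceride and glycerol molar masses
oil_mass = 138.0                    # g, assumed mass of the 150 mL oil charge
free_glycerol = 13.1                # g, hypothetical amount from the periodic-acid analysis

tg_initial = oil_mass / M_TG
tg_converted = free_glycerol / M_GLY        # 1:1 molar ratio glycerol : converted triglyceride
conversion = tg_converted / tg_initial
print(f"triglyceride conversion ≈ {conversion:.1%}")
```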

Improved Root-Mean-Square-Gain-Combining for SIMO Channels

The major problem that wireless communication systems suffer from is multipath fading caused by scattering of the transmitted signal. However, multipath propagation can be treated as multiple channels between the transmitter and the receiver in order to improve the signal-to-scattering-noise ratio. In Single Input Multiple Output (SIMO) systems, diversity receivers extract multiple branches, or copies of the same signal received over different channels, and apply gain-combining schemes such as Root Mean Square Gain Combining (RMSGC). RMSGC asymptotically yields performance identical to that of the theoretically optimal Maximum Ratio Combining (MRC) for values of the mean Signal-to-Noise Ratio (SNR) above a certain threshold, without the need for SNR estimation. This paper introduces an improvement of RMSGC based on two modifications: we find that post-detection processing and de-noising of the received signals improve the performance of RMSGC and lower the threshold SNR.
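
A hedged numerical sketch of the gain-combining comparison is given below: BPSK over a few Rayleigh SIMO branches, with MRC weights taken from the known channel and RMS-based weights estimated blindly from the received samples. Ideal co-phasing is assumed for both combiners, and the de-noising and post-detection improvements of the paper are not modelled.

```python
# BER comparison of MRC weights vs. blind RMS-based gain weights on a SIMO channel.
import numpy as np

rng = np.random.default_rng(0)
L, n_sym, snr_db = 4, 200_000, 3
noise_std = 10 ** (-snr_db / 20)

bits = rng.integers(0, 2, n_sym)
s = 2.0 * bits - 1.0                                         # BPSK symbols
h = (rng.normal(size=(L, 1)) + 1j * rng.normal(size=(L, 1))) / np.sqrt(2)
noise = noise_std * (rng.normal(size=(L, n_sym)) + 1j * rng.normal(size=(L, n_sym))) / np.sqrt(2)
r = h * s + noise                                            # received branches

w_mrc = np.conj(h)                                           # maximum ratio combining
rms = np.sqrt(np.mean(np.abs(r) ** 2, axis=1, keepdims=True))
w_rms = rms * np.exp(-1j * np.angle(h))                      # RMS gain, ideally co-phased

for name, w in (("MRC", w_mrc), ("RMS-gain", w_rms)):
    decided = (np.real(np.sum(w * r, axis=0)) > 0).astype(int)
    print(name, "BER:", np.mean(decided != bits))
```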

Finding the Pareto Optimal Front for the Multi-Mode Time-Cost-Quality Trade-off in Project Scheduling

Project managers are ultimately responsible for the overall characteristics of a project, i.e. they should deliver the project on time, with minimum cost and with maximum quality. It is vital for any manager to decide on a trade-off between these conflicting objectives, and they would benefit from any scientific decision support tool. Our work tries to determine a set of optimal solutions (rather than a single optimal solution) from which the project manager can select the desired choice to run the project. In this paper, the project scheduling problem notated as (1,T|cpm,disc,mu|curve:quality,time,cost) is studied. The problem is multi-objective and the purpose is to find the Pareto optimal front of time, cost and quality of a project (curve:quality,time,cost), whose activities belong to a start-to-finish activity relationship network (cpm) and can be carried out in different possible modes (mu) which are non-continuous or discrete (disc), each mode having a different cost, time and quality. The project is constrained by a non-renewable resource, i.e. money (1,T). Because the problem is NP-hard, a meta-heuristic is developed based on a version of the genetic algorithm specially adapted to multi-objective problems, namely FastPGA. A sample project with 30 activities is generated and then solved by the proposed method.
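
For illustration, the notion of the Pareto optimal front can be sketched in a few lines (the FastPGA machinery itself is not reproduced): among a handful of hypothetical (time, cost, quality) schedule evaluations, only the non-dominated ones are kept, with time and cost minimised and quality maximised.

```python
# Extract the non-dominated (Pareto optimal) schedules from candidate evaluations.
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly better in
    at least one (time and cost are minimised, quality is maximised)."""
    t1, c1, q1 = a
    t2, c2, q2 = b
    no_worse = t1 <= t2 and c1 <= c2 and q1 >= q2
    better = t1 < t2 or c1 < c2 or q1 > q2
    return no_worse and better

candidates = [                      # hypothetical schedule evaluations (time, cost, quality)
    (120, 9_500, 0.82), (135, 8_800, 0.85), (120, 9_900, 0.90),
    (150, 8_200, 0.80), (118, 10_400, 0.83), (135, 8_800, 0.79),
]
front = [p for p in candidates if not any(dominates(q, p) for q in candidates)]
print("Pareto-optimal schedules (time, cost, quality):", front)
```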