Hydrogen Integration in Petrochemical Complexes Using a Modified Automated Targeting Method

Owing to the extensive use of hydrogen in refining and petrochemical units, it is essential to manage the hydrogen network so as to make the most efficient use of hydrogen. At the same time, hydrogen is an important byproduct that is not properly used in petrochemical complexes and is mostly sent to the fuel system. Few works have been reported in the literature on improving hydrogen networks for petrochemical complexes. In this study, a comprehensive analysis is carried out on petrochemical units using a modified automated targeting technique, which is applied to determine the minimum hydrogen consumption. Applying the modified targeting method to two petrochemical cases showed a significant reduction in the required fresh hydrogen.
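
As an illustration of the targeting idea, the sketch below implements a plain gas cascade analysis, the construction that the automated targeting method casts as an optimization model. The paper's specific modifications are not reproduced, and the stream data are hypothetical.

```python
# Minimal sketch of gas cascade analysis for hydrogen network targeting.
# Stream data below are hypothetical; "impurity" is a mole fraction.

def min_fresh_hydrogen(sources, sinks, fresh_impurity=0.0):
    """Return the minimum fresh hydrogen flow (same units as the streams).

    sources/sinks: lists of (flowrate, impurity) tuples.
    """
    levels = sorted({fresh_impurity}
                    | {q for _, q in sources}
                    | {q for _, q in sinks})
    # Net flow at each impurity level (sources positive, sinks negative).
    net = {q: 0.0 for q in levels}
    for f, q in sources:
        net[q] += f
    for f, q in sinks:
        net[q] -= f
    # Cascade with zero fresh feed; track the cumulative impurity load.
    delta, load, ff = 0.0, 0.0, 0.0
    for k, q in enumerate(levels):
        if k > 0:
            load += delta * (q - levels[k - 1])
            if load < 0:  # infeasible interval: fresh feed must cover it
                ff = max(ff, -load / (q - fresh_impurity))
        delta += net[q]
    return max(ff, -delta)  # final net (waste) flow must be non-negative

# Hypothetical example: two off-gas sources, two consumer sinks (mol/s).
sources = [(400.0, 0.05), (350.0, 0.25)]
sinks = [(500.0, 0.10), (300.0, 0.20)]
print(min_fresh_hydrogen(sources, sinks))  # minimum fresh H2 target: 50.0
```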

An Adaptive Model for Blind Image Restoration using Bayesian Approach

Image restoration involves the elimination of noise, and filtering techniques have been used to restore images for the last five decades. In this paper, we consider the problem of restoring images degraded by a blur function and corrupted by random noise. A method for reducing additive noise in images by explicit analysis of local image statistics is introduced and compared to other noise reduction methods. The proposed method, which makes use of an a priori noise model, has been evaluated on various types of images. Bayesian-based algorithms and image processing techniques are described and substantiated with experiments in MATLAB.
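
The paper's exact Bayesian formulation is not given here; as a hedged illustration of noise reduction by explicit local image statistics under an a priori additive-noise model, the following is a minimal adaptive (Wiener/Lee-type) filter sketch.

```python
# Minimal sketch of adaptive smoothing from local image statistics, in the
# spirit of a Wiener/MAP estimate with a known additive Gaussian noise
# variance (the paper's exact Bayesian model may differ).
import numpy as np
from scipy.ndimage import uniform_filter

def local_stats_denoise(img, noise_var, win=5):
    img = img.astype(float)
    mean = uniform_filter(img, win)               # local mean
    sqmean = uniform_filter(img**2, win)          # local second moment
    var = np.maximum(sqmean - mean**2, 0.0)       # local variance
    # Shrink towards the local mean: flat areas are smoothed strongly,
    # high-variance (edge/texture) areas are left nearly untouched.
    gain = np.maximum(var - noise_var, 0.0) / np.maximum(var, 1e-12)
    return mean + gain * (img - mean)

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0, 255, 128), (128, 1))
noisy = clean + rng.normal(0, 10, clean.shape)    # known noise sigma = 10
restored = local_stats_denoise(noisy, noise_var=10**2)
```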

Integrating Big Island Layout with Pull System for Production Optimization

Lean manufacturing is a production philosophy made popular by Toyota Motor Corporation (TMC). It is known globally as the Toyota Production System (TPS) and has the ultimate aim of reducing cost by thoroughly eliminating wastes, or muda. TPS embraces Just-in-Time (JIT) manufacturing, achieving cost reduction through lead time reduction; JIT manufacturing can be achieved by implementing a pull system in production. Furthermore, TPS aims to improve productivity and create continuous flow by arranging machines and processes in cellular configurations, known as Cellular Manufacturing Systems (CMS). This paper studies the integration of CMS with the pull system to establish a Big Island-Pull system production for High-Mix Low-Volume (HMLV) products in an automotive component industry. The paper uses the built-in JIT system steps adapted from TMC to create the pull system production and also to create a shojinka line which, according to takt time, has the flexibility to adapt to demand changes simply by adding or removing manpower (a worked takt-time example is sketched below). This leads to production optimization.
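
A minimal worked example of the takt-time arithmetic that drives shojinka staffing, with hypothetical shift and demand figures:

```python
# Hypothetical worked example of takt time and shojinka headcount:
# manpower on the line is rescaled whenever customer demand changes.
import math

available_time = 8 * 3600 - 2 * 600   # seconds per shift, minus two breaks
total_work_content = 540.0            # seconds of manual work per unit

for demand_per_shift in (200, 300, 450):
    takt = available_time / demand_per_shift          # seconds per unit
    operators = math.ceil(total_work_content / takt)  # shojinka headcount
    print(f"demand {demand_per_shift}/shift -> "
          f"takt {takt:.0f}s, operators {operators}")
```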

Variance Based Component Analysis for Texture Segmentation

This paper presents a comparative analysis of a new unsupervised PCA-based technique for steel plate texture segmentation aimed at defect detection. The proposed scheme, called Variance Based Component Analysis (VBCA), employs PCA for feature extraction, applies a feature reduction algorithm based on the variance of eigenpictures, and classifies pixels as defective or normal. Whereas classic PCA uses a clusterer such as k-means for pixel clustering, VBCA employs thresholding and some post-processing operations to label pixels as defective or normal. Experimental results show that the proposed VBCA algorithm is 12.46% more accurate and 78.85% faster than classic PCA.
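
A minimal sketch of a VBCA-style pipeline is given below; the patch size, variance cut-off and residual threshold are illustrative assumptions, not the paper's exact settings.

```python
# Sketch of a VBCA-style pipeline: PCA over patches of defect-free texture,
# variance-based selection of eigenpictures, then thresholding of the
# reconstruction residual instead of k-means clustering.
import numpy as np

def extract_patches(img, p=8):
    h, w = img.shape
    return np.array([img[i:i+p, j:j+p].ravel()
                     for i in range(0, h - p + 1, p)
                     for j in range(0, w - p + 1, p)])

def fit_eigenpictures(train_img, p=8, var_keep=0.95):
    X = extract_patches(train_img, p)
    mu = X.mean(axis=0)
    _, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    var = s**2 / (s**2).sum()
    k = int(np.searchsorted(np.cumsum(var), var_keep)) + 1  # variance cut
    return mu, Vt[:k]

def defect_mask(test_img, mu, V, p=8, thresh=3.0):
    X = extract_patches(test_img, p)
    R = (X - mu) - (X - mu) @ V.T @ V          # residual off the eigenspace
    err = np.sqrt((R**2).mean(axis=1))
    # Thresholding in place of a clusterer: patches (and hence their pixels)
    # with outlying residuals are labeled defective.
    return err > err.mean() + thresh * err.std()

rng = np.random.default_rng(0)
mu, V = fit_eigenpictures(rng.random((64, 64)))     # stand-in texture
mask = defect_mask(rng.random((64, 64)), mu, V)     # True = defective patch
```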

An Improved Quality Adaptive Rate Filtering Technique Based on Level-Crossing Sampling

Most systems deal with time-varying signals, and power efficiency can be achieved by adapting system activity to the input signal variations. In this context, an adaptive rate filtering technique based on level-crossing sampling is devised. It adapts the sampling frequency and the filter order to the local variations of the input signal, thus correlating the processing activity with the signal variations. The proposed technique requires interpolation; a drastic reduction in the interpolation error is achieved by exploiting symmetry during the interpolation process. The processing error of the proposed technique is calculated, and its computational complexity is deduced and compared to that of the classical approach. The results promise a significant gain in computational efficiency and hence in power consumption.
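
A minimal sketch of level-crossing sampling itself, the mechanism that ties processing activity to signal variation (the level spacing and test signal are illustrative):

```python
# Level-crossing sampling: a sample is taken only when the signal crosses
# one of a set of uniformly spaced amplitude levels, so the local sampling
# rate follows the signal's local variations.
import numpy as np

def level_crossing_sample(t, x, q=0.1):
    levels = np.arange(x.min(), x.max() + q, q)
    ts, xs = [t[0]], [x[0]]
    for i in range(1, len(x)):
        lo, hi = sorted((x[i - 1], x[i]))
        for lev in levels[(levels > lo) & (levels <= hi)]:
            # Linear interpolation for the crossing instant of this level.
            a = (lev - x[i - 1]) / (x[i] - x[i - 1])
            ts.append(t[i - 1] + a * (t[i] - t[i - 1]))
            xs.append(lev)
    return np.array(ts), np.array(xs)

t = np.linspace(0, 1, 10_000)
x = np.sin(2 * np.pi * 3 * t) * np.exp(-2 * t)   # time-varying test signal
ts, xs = level_crossing_sample(t, x, q=0.05)
print(len(ts), "samples instead of", len(t))     # activity tracks variation
```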

Robust Clustering with Dimension Reduction

Clustering is the process of identifying homogeneous groups of objects, called clusters, and is an interesting topic in data mining; the objects within a group or class share similar characteristics. This paper discusses a robust clustering process for image data with two dimension reduction approaches: two-dimensional principal component analysis (2DPCA) and principal component analysis (PCA). A standard approach to the high dimensionality of image data is dimension reduction, which transforms high-dimensional data into a lower-dimensional space with limited loss of information, and PCA is one of the most common forms of dimensionality reduction. In 2DPCA, often called a variant of PCA, the image matrices are treated directly as 2D matrices; they do not need to be transformed into vectors, so the image covariance matrix can be constructed directly from the original image matrices. However, the decomposed classical covariance matrix is very sensitive to outlying observations. The objective of this paper is to compare the performance of the robust minimizing vector variance (MVV) estimator under the two-dimensional projection (2DPCA) and under PCA for clustering arbitrary image data when outliers are hidden in the data set. The simulation of the robustness aspects and an illustration of image clustering are discussed at the end of the paper.
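
A minimal sketch of the classical (non-robust) 2DPCA construction referred to above; the robust MVV estimator that the paper substitutes for it is not reproduced here.

```python
# 2DPCA: the image covariance is built directly from the image matrices,
# with no vectorization (classical estimate, sensitive to outliers).
import numpy as np

def fit_2dpca(images, k=5):
    """images: array (n, h, w); returns the top-k projection axes (w, k)."""
    A = np.asarray(images, dtype=float)
    mean = A.mean(axis=0)
    # Image covariance G = (1/n) * sum_i (A_i - mean)^T (A_i - mean)
    G = np.einsum('nij,nik->jk', A - mean, A - mean) / len(A)
    vals, vecs = np.linalg.eigh(G)              # eigenvalues ascending
    return vecs[:, ::-1][:, :k]                 # top-k eigenvectors

def project(images, X):
    return np.asarray(images, dtype=float) @ X  # feature matrices Y = A X

rng = np.random.default_rng(1)
imgs = rng.normal(size=(40, 32, 32))            # stand-in image set
X = fit_2dpca(imgs, k=5)
features = project(imgs, X)                     # (40, 32, 5), fed to clustering
```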

Classification of Non-Stationary Signals Using the Ben Wavelet and Artificial Neural Networks

The automatic classification of non-stationary signals is an important practical goal in several domains. An essential classification task is to allocate the incoming signal to a group associated with the kind of physical phenomenon producing it. In this paper, we present a modular system composed of three blocks: 1) representation, 2) dimensionality reduction, and 3) classification. The originality of our work lies in the use of a new wavelet, called the "Ben wavelet," in the representation stage. For dimensionality reduction, we propose a new algorithm based on random projection and principal component analysis.
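
A minimal sketch of the proposed two-stage reduction, using standard scikit-learn components as stand-ins (the sizes are illustrative assumptions, and the paper's own algorithm may combine the stages differently):

```python
# Two-stage dimensionality reduction: a random projection first compresses
# the wavelet representation cheaply, then PCA extracts the final features.
import numpy as np
from sklearn.random_projection import GaussianRandomProjection
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
coeffs = rng.normal(size=(500, 4096))   # stand-in wavelet representations

rp = GaussianRandomProjection(n_components=256, random_state=0)
pca = PCA(n_components=20)
features = pca.fit_transform(rp.fit_transform(coeffs))
print(features.shape)                   # (500, 20), passed to the classifier
```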

Evaluation on the Viability of Combined Heat and Power with Different Distributed Generation Technologies for Various Buildings in Japan

This paper examines the energy consumption characteristics of six different building types: apartments, offices, commercial buildings, hospitals, hotels and educational facilities. A 5-hectare (50,000 m2) development site for each building type is then assumed as a case study to evaluate the effect of introducing Combined Heat and Power (CHP). CHP systems with different distributed generation technologies, including Gas Turbines (GT), Gas Engines (GE), Diesel Engines (DE), Solid Oxide Fuel Cells (SOFC) and Polymer Electrolyte Fuel Cells (PEFC), have been simulated using HEATMAP, a CHP system analysis software package, and their primary energy utilization efficiency, energy saving ratio and CO2 reduction ratio have been evaluated and compared. The results can be summarized as follows. Each building type has its own characteristic heat-to-power ratio, and matching the heat-to-power ratio demanded by an individual building with that supplied by a CHP system is very important, so the distributed generation technology must be selected according to the load characteristics of the building. Distributed generation technologies with high generating efficiency and a low heat-to-power ratio, such as SOFC and PEFC, are the more reasonable selections for Building Combined Heat and Power (BCHP). CHP is an attractive option for hotels, hospitals and apartments in Japan, where users can achieve high energy savings and environmental benefits by introducing a CHP system. In other buildings, especially commercial buildings and offices, the introduction of a CHP system is unreasonable.
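
A hypothetical worked example of the two headline metrics and the heat-to-power matching argument; the figures are illustrative, not results from the HEATMAP simulations:

```python
# Illustrative arithmetic for CHP evaluation metrics (all figures assumed).
fuel_in = 100.0            # CHP fuel input (primary energy units)
elec_out = 45.0            # e.g. a SOFC-class electrical output
heat_recovered = 35.0      # usable recovered heat

# Primary energy utilization efficiency of the CHP unit.
utilization = (elec_out + heat_recovered) / fuel_in          # 0.80

# Heat-to-power ratio supplied vs. demanded: excess heat is wasted.
hpr_demand = 0.6           # e.g. an office-type load profile
usable_heat = min(heat_recovered, hpr_demand * elec_out)

# Energy saving ratio vs. separate production (grid at 40%, boiler at 90%).
conventional = elec_out / 0.40 + usable_heat / 0.90
saving_ratio = 1 - fuel_in / conventional
print(f"utilization {utilization:.2f}, saving {saving_ratio:.1%}")  # ~29.8%
```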

Joint Use of Factor Analysis (FA) and Data Envelopment Analysis (DEA) for Ranking of Decision Making Units

This article combines two techniques, data envelopment analysis (DEA) and factor analysis (FA), for data reduction in decision making units (DMUs). DEA, a popular linear programming technique, is useful for comparatively rating the operational efficiency of DMUs on the basis of their deterministic (not necessarily stochastic) input-output data. Factor analysis techniques have been proposed for data reduction and classification, and can be applied within DEA to reduce the input-output data. Numerical results reveal that the new approach shows good consistency in ranking with DEA.
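
A minimal sketch of the input-oriented CCR DEA model solved per DMU; in the combined approach, factor analysis would first reduce the input-output data. The data below are hypothetical.

```python
# Input-oriented CCR DEA model, solved as one LP per evaluated DMU:
#   min theta  s.t.  X @ lam <= theta * x0,  Y @ lam >= y0,  lam >= 0.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """X: (m, n) inputs, Y: (s, n) outputs, j0: index of the evaluated DMU."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # variables: [theta, lam]
    A_ub = np.block([[-X[:, [j0]], X],          # X lam - theta x0 <= 0
                     [np.zeros((s, 1)), -Y]])   # -Y lam <= -y0
    b_ub = np.r_[np.zeros(m), -Y[:, j0]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun                               # efficiency score theta*

X = np.array([[20, 30, 40, 20], [150, 200, 100, 180]], dtype=float)
Y = np.array([[100, 150, 160, 80]], dtype=float)
for j in range(X.shape[1]):
    print(f"DMU {j}: theta = {ccr_efficiency(X, Y, j):.3f}")
```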

Evolving Neural Networks Using the Moment Method for Handwritten Digit Recognition

This paper proposes the optimization of neural network weights and topology using genetic evolution and the backpropagation training algorithm. The proposed crossover and mutation operators aim to adapt the network architectures and weights during the evolution process. Through a specific inheritance procedure, the weights are transmitted from the parents to their offspring, which allows re-exploitation of the already trained networks and hence accelerates the global convergence of the algorithm. In the preprocessing phase, a new feature extraction method is proposed based on Legendre moments, with the maximum entropy principle (MEP) as a selection criterion. This allows a global reduction of the search space in the design of the networks. The proposed method has been applied and tested on the well-known MNIST database of handwritten digits.
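
A minimal sketch of Legendre moment feature extraction for a digit image; the normalization is a common discrete form, and the MEP-based moment selection is omitted, so both are simplifications rather than the paper's exact procedure.

```python
# Legendre moments of an image: lambda_pq is a projection of the image onto
# the product of Legendre polynomials P_p(x) P_q(y) on the [-1, 1] grid.
import numpy as np
from numpy.polynomial import legendre as leg

def legendre_moments(img, max_order=8):
    h, w = img.shape
    x = np.linspace(-1, 1, w)            # map pixel grid onto [-1, 1]
    y = np.linspace(-1, 1, h)
    eye = np.eye(max_order + 1)          # unit coefficient vectors select P_p
    Px = np.stack([leg.legval(x, eye[p]) for p in range(max_order + 1)])
    Py = np.stack([leg.legval(y, eye[q]) for q in range(max_order + 1)])
    lam = np.empty((max_order + 1, max_order + 1))
    for p in range(max_order + 1):
        for q in range(max_order + 1):
            norm = (2 * p + 1) * (2 * q + 1) / (h * w)
            lam[p, q] = norm * (Py[q][:, None] * Px[p][None, :] * img).sum()
    return lam

rng = np.random.default_rng(0)
digit = rng.random((28, 28))             # stand-in for an MNIST digit
feats = legendre_moments(digit).ravel()  # feature vector for the evolved net
```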

Reduction of MMP Using Oleophilic Chemicals

CO2 miscible displacement is not feasible in many oil fields because high reservoir temperatures push the pressure required to achieve miscibility far above the formation fracture pressure, making CO2 miscible displacement impossible. However, oleophilic chemicals can lower the minimum miscibility pressure (MMP). The main objective of this research is to find the best oleophilic chemical for MMP reduction using the slim-tube test and Vanishing Interfacial Tension (VIT) measurements. The chemicals are selected on the basis that they must be oil soluble, have low water solubility, contain 4-8 carbons, and be semi-polar, economical, and safe for human operation. The chemical families chosen are carboxylic acids, alcohols, and ketones. The whole experiment is conducted at 100°C, and the most effective chemical is the one that lowers the CO2-crude oil MMP the most. The findings of this research would have a great impact on the oil and gas industry by reducing the operating cost of CO2 EOR, applicable to both onshore and offshore operations.
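
A minimal sketch of the VIT estimate: interfacial tension measured at several pressures is extrapolated linearly to zero, and the intercept pressure is read as the MMP. The data points below are hypothetical.

```python
# VIT extrapolation: fit IFT vs. pressure and find the zero-IFT pressure.
import numpy as np

pressure = np.array([10.0, 14.0, 18.0, 22.0])   # MPa (assumed data)
ift = np.array([6.1, 4.2, 2.4, 0.7])            # mN/m, CO2-crude oil

slope, intercept = np.polyfit(pressure, ift, 1)
mmp = -intercept / slope                         # pressure where IFT -> 0
print(f"VIT-extrapolated MMP ~ {mmp:.1f} MPa")   # ~23.4 MPa here
```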

A New Face Recognition Method using PCA, LDA and Neural Network

In this paper, a new face recognition method based on PCA (Principal Component Analysis), LDA (Linear Discriminant Analysis) and neural networks is proposed. The method consists of four steps: i) preprocessing, ii) dimension reduction using PCA, iii) feature extraction using LDA, and iv) classification using a neural network. The combination of PCA and LDA improves the capability of LDA when only a few image samples are available, and the neural classifier reduces the number of misclassifications caused by non-linearly separable classes. The proposed method was tested on the Yale face database. Experimental results on this database demonstrate the effectiveness of the proposed method for face recognition, with fewer misclassifications than previous methods.
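
A minimal sketch of the four-step pipeline with scikit-learn stand-ins; the paper's preprocessing and network architecture are not specified, so those choices are assumptions.

```python
# PCA -> LDA -> neural network pipeline for face recognition (sketch).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.random((150, 64 * 64))          # stand-in: 15 subjects x 10 faces
y = np.repeat(np.arange(15), 10)

model = make_pipeline(
    StandardScaler(),                   # i) preprocessing (assumed form)
    PCA(n_components=50),               # ii) dimension reduction
    LinearDiscriminantAnalysis(n_components=14),  # iii) feature extraction
    MLPClassifier(hidden_layer_sizes=(40,), max_iter=2000, random_state=0),
)
model.fit(X, y)                         # iv) neural classification
```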

Dimension Reduction of Microarray Data Based on Local Principal Component

Analysis and visualization of microarray data are very helpful for biologists and clinicians in the diagnosis and treatment of patients, allowing clinicians to better understand the structure of microarrays and facilitating the understanding of gene expression in cells. However, a microarray dataset is a complex data set with thousands of features and a very small number of observations. Such very high-dimensional data often contain noise, non-useful information, and only a small number of features relevant to the disease or genotype. This paper proposes a non-linear dimensionality reduction algorithm, Local Principal Component (LPC), which aims to map high-dimensional data to a lower-dimensional space. The reduced data represent the most important variables underlying the original data. Experimental results and comparisons are presented to show the quality of the proposed algorithm. Moreover, the experiments also show how this algorithm reduces high-dimensional data while preserving the neighbourhoods of the points in the low-dimensional space as in the high-dimensional space.
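
The LPC algorithm itself is not reproduced here; as a related, clearly substituted illustration, LTSA likewise builds per-neighbourhood principal components and aligns them into a single neighbourhood-preserving low-dimensional embedding.

```python
# Local-PCA-style non-linear dimension reduction via LTSA (stand-in for LPC).
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2000))     # stand-in: 60 samples, 2000 genes

ltsa = LocallyLinearEmbedding(n_neighbors=10, n_components=3, method='ltsa')
Z = ltsa.fit_transform(X)           # 3-D embedding preserving neighbourhoods
print(Z.shape)                      # (60, 3)
```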

Issues and Problems of Sedimentation in Reservoirs: Siazakh Dam Case Study

Sedimentation in reservoirs lowers the quality of the stored water, reduces the volume of the reservoir, lowers the controllable amount of flood water, increases the risk of overflow during possible floods, and reduces the dam's useful life. Therefore, at all stages of dam development, such as the cognitive studies, the phase-1 design studies, control, construction and maintenance, the problem of sedimentation in the reservoir should be considered. Engineers need to examine and develop methods to maintain the effective capacity of a reservoir, while also considering the influence of these methods on flood disasters, the functioning of water use facilities, and environmental issues. This article first examines sedimentation in reservoirs and shows how to control it, and then discusses the studies of the sediments in the Siazakh Dam.

Active Control for Reduction of Noise Passing through Enclosure and Optimization of Microphone Position

In this study, the noise characteristics of a structure were analyzed in an effort to reduce the noise passing through an opening of the enclosure surrounding the structure, which generates noise. Enclosures are an essential measure for containing noise propagation from operating machinery, and their access openings are an important path of noise leakage. First, the noise characteristics of the structure were analyzed and feed-forward noise control was performed in simulation to reduce the noise passing through the opening of the enclosure. We then implemented a feed-forward controller to actively control the acoustic power through the opening. Finally, we optimized the placement of the reference sensors for several numbers of sensors. Good control performance was achieved using the minimum number of microphones arranged in an optimal placement.
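
A minimal sketch of a single-channel feed-forward (filtered-x LMS) controller, the standard algorithm for this kind of active control; the path models and sensor layout below are stand-ins, not the paper's plant.

```python
# FxLMS feed-forward active noise control (single reference, single error).
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
x = rng.normal(size=n)                      # reference (noise at the opening)
p = np.array([0.0, 0.6, 0.3, 0.1])          # primary path (assumed FIR)
s = np.array([0.5, 0.25])                   # secondary path (assumed known)
d = np.convolve(x, p)[:n]                   # disturbance at the error mic

taps, mu = 16, 0.005
w = np.zeros(taps)                          # adaptive control filter
xbuf, fbuf = np.zeros(taps), np.zeros(taps)
ybuf = np.zeros(len(s))
xf = np.convolve(x, s)[:n]                  # filtered-x signal
e = np.zeros(n)
for i in range(n):
    xbuf = np.r_[x[i], xbuf[:-1]]
    y = w @ xbuf                            # anti-noise sample
    ybuf = np.r_[y, ybuf[:-1]]
    e[i] = d[i] + s @ ybuf                  # residual at the error microphone
    fbuf = np.r_[xf[i], fbuf[:-1]]
    w -= mu * e[i] * fbuf                   # FxLMS weight update

print(np.mean(e[:1000]**2), '->', np.mean(e[-1000:]**2))  # residual drops
```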

Investigation of Anti-diabetic and Hypocholesterolemic Potential of Psyllium Husk Fiber (Plantago psyllium) in Diabetic and Hypercholesterolemic Albino Rats

The present study was conducted to observe the effect of Plantago psyllium on blood glucose and cholesterol levels in normal and alloxan-induced diabetic rats. Forty rats were included in the study, divided into four groups of ten rats each: group A was normal; group B was diabetic; group C was non-diabetic and hypercholesterolemic; and group D was diabetic and hypercholesterolemic. Groups B and D were made diabetic by intraperitoneal injection of alloxan dissolved in 1 mL of distilled water at a dose of 125 mg/kg of body weight. Groups C and D were made hypercholesterolemic by oral administration of powdered cholesterol (1 g/kg of body weight). Blood samples from all rats were collected from the coccygeal vein on the 1st, 21st and 42nd days, and all samples were analyzed for blood glucose and cholesterol levels using enzymatic kits. The blood glucose and cholesterol levels of the treated groups of rats showed a significant reduction after 7 weeks of treatment with Plantago psyllium. Statistical analysis of the results showed that Plantago psyllium has anti-diabetic and hypocholesterolemic activity in diabetic and hypercholesterolemic albino rats.

Application of a New Approach with Two Networks, Slow and Fast, to the Asynchronous Machine

In this paper, we propose a new modular approach called neuroglial, consisting of two neural networks, slow and fast, which emulates a recently discovered biological reality. The implementation is based on complex multi-time-scale systems, and validation is performed on the model of the asynchronous machine. We applied the geometric approach based on Gerschgorin circles for the decoupling of fast and slow variables, and the method of singular perturbations for the development of the reduced models. This new architecture allows for smaller networks with less complexity and better performance in terms of mean square error and convergence than the single-network model.
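
A minimal sketch of the Gerschgorin-circle test used for the slow/fast decoupling; the system matrix and the separation threshold are stand-ins, not the asynchronous machine model.

```python
# Gerschgorin discs: centers are the diagonal entries, radii the absolute
# off-diagonal row sums; well-separated disc clusters indicate slow and
# fast groups of state variables.
import numpy as np

def gerschgorin_discs(A):
    centers = np.diag(A)
    radii = np.abs(A).sum(axis=1) - np.abs(centers)
    return centers, radii

A = np.array([[-1.0,  0.2,  0.1,  0.0],
              [ 0.1, -2.0,  0.0,  0.1],
              [ 0.3,  0.0, -40.0, 2.0],
              [ 0.0,  0.5,  1.0, -55.0]])   # stand-in system matrix

c, r = gerschgorin_discs(A)
for i, (ci, ri) in enumerate(zip(c, r)):
    group = 'fast' if abs(ci) > 10 else 'slow'   # assumed separation threshold
    print(f"x{i}: disc center {ci:+.1f}, radius {ri:.1f} -> {group}")
```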

Fast Wavelet Image Denoising Based on Local Variance and Edge Analysis

The wavelet transform approach has been widely used for image denoising due to its multi-resolution nature, its ability to produce high levels of noise reduction, and the low level of distortion it introduces. However, in removing noise, high-frequency components belonging to edges are also removed, which blurs the signal features. This paper proposes a new method of image noise reduction based on local variance and edge analysis. The analysis is performed by dividing an image into 32 x 32 pixel blocks and transforming the data into the wavelet domain. A fast lifting wavelet spatial-frequency decomposition and reconstruction is developed, with the advantages of being computationally efficient and of minimizing boundary effects. Adaptive thresholding based on local variance estimation and edge strength measurement can effectively reduce image noise while preserving the features of the original image corresponding to the boundaries of objects. Experimental results demonstrate that the method performs well for images contaminated by natural and artificial noise, and that it can be adapted for different classes of images and types of noise. The proposed algorithm offers a potential solution, with parallel computation, for real-time or embedded system applications.
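
A minimal sketch of blockwise wavelet denoising with a variance-adaptive threshold; PyWavelets stands in for the lifting implementation, and the edge-strength measure is simplified to a per-band variance term.

```python
# Blockwise wavelet denoising: noise sigma is estimated per block, and the
# soft threshold is relaxed in high-variance (edge/texture) bands.
import numpy as np
import pywt

def denoise_block(block, wavelet='db4', level=2):
    coeffs = pywt.wavedec2(block, wavelet, level=level)
    # Noise sigma from the median absolute deviation of the finest diagonal band.
    sigma = np.median(np.abs(coeffs[-1][2])) / 0.6745
    out = [coeffs[0]]
    for detail in coeffs[1:]:
        out.append(tuple(
            # Softer threshold where band variance (signal activity) is high,
            # so edge content is preserved.
            pywt.threshold(band, sigma * np.sqrt(2 * np.log(band.size))
                           / (1 + band.var() / max(sigma**2, 1e-12)),
                           mode='soft')
            for band in detail))
    return pywt.waverec2(out, wavelet)

rng = np.random.default_rng(0)
img = 100.0 + rng.normal(0, 10, (256, 256))       # stand-in noisy image
blocks = [denoise_block(img[i:i+32, j:j+32])      # 32 x 32 blocks, as in paper
          for i in range(0, 256, 32) for j in range(0, 256, 32)]
```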

The Coverage of the Object-Oriented Framework Application Class-Based Test Cases

An application framework provides a reusable design and implementation for a family of software systems. Frameworks are introduced to reduce the cost of a product line (i.e., a family of products that share common features). Software testing is a time-consuming and costly ongoing activity during the application software development process. Generating reusable test cases for framework applications at the framework development stage, and providing and using these test cases to test parts of the framework application whenever the framework is used, considerably reduces application development time and cost. Framework Interface Classes (FICs) are classes introduced by the framework hooks to be implemented at the application development stage; reusable test cases can be generated for them at the framework development stage and provided with the framework to test the implementations of the FICs at the application development stage. In this paper, we conduct a case study using thirteen applications developed with three frameworks: one domain-oriented and two application-oriented. The results show that, in general, the percentage of FICs in applications developed using domain frameworks is, on average, greater than the percentage of FICs in applications developed using application frameworks. Consequently, the reduction of application unit testing time achieved with the reusable test cases generated for domain frameworks is, in general, greater than that achieved with the reusable test cases generated for application frameworks.
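
A hedged illustration with hypothetical names: a framework-stage FIC contract plus a reusable test case that any application-stage implementation can run unchanged.

```python
# Framework development stage: a FIC hook and its reusable test case.
import unittest

class ReportFormatter:                       # FIC: hook class in the framework
    def format(self, rows: list) -> str:
        raise NotImplementedError            # implemented per application

class ReusableFormatterTest(unittest.TestCase):
    """Shipped with the framework; applications plug in their FIC subclass."""
    formatter_cls = None                     # set by the application test suite

    def test_output_is_non_empty_string(self):
        if self.formatter_cls is None:
            self.skipTest("abstract reusable case")
        out = self.formatter_cls().format([{"id": 1}])
        self.assertIsInstance(out, str)
        self.assertTrue(out)

# Application development stage: implement the FIC, reuse the test unchanged.
class CsvFormatter(ReportFormatter):
    def format(self, rows):
        return "\n".join(",".join(map(str, r.values())) for r in rows)

class CsvFormatterTest(ReusableFormatterTest):
    formatter_cls = CsvFormatter

if __name__ == "__main__":
    unittest.main()
```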

Investigation of Monochromatization Light Effect at Molecular/Atomic Level in Electronegative-Electropositive Gas Mixtures Plasma

In electronegative-electropositive gas mixture plasmas, at total pressures varying in the range of tens to hundreds of Torr, the appearance of a quasi-monochromatization effect of the emitted radiation has been reported. This radiation can result from generating mechanisms at the molecular level, as in the case of excimer radiation, but also at the atomic level. In the latter case, in a (Ne+1%Ar/Xe+H2) gas mixture plasma in a dielectric barrier discharge, this effect, called the M-effect, consists of the reduction of the discharge emission spectrum practically to one single, strong spectral line at λ = 585.3 nm. The present paper is concerned with a comparative investigation of the principal reaction mechanisms involved in the quasi-monochromatization effect for excimer radiation and for the M-effect, respectively. The paper also points out the role of the metastable electronegative atoms in the appearance of the monochromatization effect at the atomic level.