Amine Solution Recovery Package and Controlling Corrosion in Regeneration Tower

The Sarkhoon gas plant, located in southern Iran, was installed to remove the H2S contained in a high-pressure natural gas stream. The solvent used for H2S removal from the gaseous stream is a 34 wt% diethanolamine (DEA) solution. Owing to the increasing concentration of heat-stable salts (HSS) in the solvent, the corrosivity of the amine solution had increased, and reports indicated corrosion on the shell of the regeneration column. Because the source of HSS formation was unknown, we decided to keep the HSS content below 3 wt% of the amine solvent. To this end, two small columns were filled with a strong anionic base and activated carbon, and the polluted amine was passed through the beds. Based on the laboratory results, a temporary amine recovery package was then built at industrial scale. From an economic point of view we were able to save $700,000, and corrosion of the stripping column has decreased dramatically.

Evaluation of Eulerian and Lagrangian Methods in the Analysis of Concrete Gravity Dams Including Dam-Water-Foundation Interaction

Because of the reservoir effect, dynamic analysis of concrete dams is more involved than that of other common structures. The difficulty stems mainly from the differences between the material behaviors of the reservoir water, the dam body and the foundation. To account for the reservoir effect in the dynamic analysis of concrete gravity dams, two methods are generally employed. The Eulerian method of reservoir modeling gives rise to a set of coupled equations, whereas in the Lagrangian method the same equations are used for the reservoir as for the dam and foundation structure. The purpose of this paper is to evaluate the possible advantages and disadvantages of both methods. Specifically, both methods are applied to the analysis of dam-foundation-reservoir systems to calculate the hydrodynamic pressure on the dam faces. Within the framework of dam-foundation-reservoir systems, dam displacements under earthquake loading are also studied for various system dimensions and characteristics. The results of the Lagrangian and Eulerian methods regarding the effects of loading frequency, boundary conditions and foundation elasticity modulus are quantitatively evaluated and compared. Our analyses show that each method has its own advantages and disadvantages; in any particular case, one of the two may prove more suitable, as presented in the results section of this study.
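As a classical point of reference for the hydrodynamic pressures compared above, Westergaard's added-mass solution for a rigid vertical dam face is often used as a baseline. A minimal sketch follows; the reservoir depth and ground acceleration are illustrative assumptions, not values from this study.

```python
import numpy as np

rho = 1000.0                # water density, kg/m^3
g = 9.81                    # gravitational acceleration, m/s^2
H = 100.0                   # reservoir depth in metres (assumed)
a = 0.2 * g                 # horizontal ground acceleration, 0.2 g (assumed)

# Height above the reservoir bottom, sampled along the upstream face
z = np.linspace(0.0, H, 11)

# Westergaard's approximate hydrodynamic pressure distribution:
# p(z) = (7/8) * rho * a * sqrt(H * (H - z))
p = (7.0 / 8.0) * rho * a * np.sqrt(H * (H - z))

print(p[0], p[-1])          # maximum at the base, zero at the free surface
```

The pressure is largest at the base, (7/8) rho a H, and vanishes at the free surface, which is the qualitative pattern both Eulerian and Lagrangian reservoir models should reproduce for a rigid dam.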

The Analysis of the Impact of Urbanization on Urban Meteorology from Urban Growth Management Perspective

To clarify the relationships between urbanization and urban meteorology, this study investigated the amount of urban artificial heat, which contributes to the urban temperature rise. A model for measuring the amount of urban artificial heat was established and tested theoretically; the calculations identifying how the urban temperature increased revealed that urban artificial heat raised the urban temperature by plus or minus 0.23 °C in 2007 compared with 1996. Statistical methods (correlation and regression analysis) were then applied to clarify the relationships between urbanization and urban weather. From an urban growth management point of view, this research suggests that new design techniques and urban growth management are necessary at the city design phase to reduce the urban temperature rise and the urban torrential rain that urbanization can produce, both of which can lead to urban disasters in terms of urban meteorology.

The Performance Analysis of Error Saturation Nonlinearity LMS in Impulsive Noise Based on Weighted-Energy Conservation

This paper introduces a new approach to the performance analysis of adaptive filters with an error saturation nonlinearity in the presence of impulsive noise. The performance analysis of adaptive filters comprises both transient analysis, which shows how fast a filter learns, and steady-state analysis, which shows how well it learns. Recursive expressions for the mean-square deviation (MSD) and the excess mean-square error (EMSE), which describe the transient behavior of the adaptive algorithm, are derived from weighted-energy conservation arguments. The steady-state behavior is analyzed for correlated input regressor data, so the approach leads to new performance results without restricting the input regressor data to be white.
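A minimal sketch of the error-saturation idea in a system-identification setting; the filter length, saturation level, and the Bernoulli-Gaussian impulse model below are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

M, N = 8, 5000              # filter length (assumed) and number of iterations
mu, level = 0.01, 1.0       # step size and saturation level (assumed)
w_true = rng.standard_normal(M)   # unknown system to identify

def saturate(e, level):
    """Error saturation nonlinearity: a hard clip that bounds the
    influence of large errors caused by impulsive noise."""
    return np.clip(e, -level, level)

w = np.zeros(M)
msd = np.empty(N)
for n in range(N):
    x = rng.standard_normal(M)            # input regressor
    v = 0.1 * rng.standard_normal()       # background Gaussian noise
    if rng.random() < 0.01:               # rare, large impulsive outlier
        v += 50.0 * rng.standard_normal()
    d = w_true @ x + v                    # desired signal
    e = d - w @ x                         # a-priori error
    w = w + mu * saturate(e, level) * x   # saturated-error LMS update
    msd[n] = np.sum((w_true - w) ** 2)    # mean-square deviation sample

print(msd[0], msd[-1])
```

Because the update uses f(e) = clip(e) rather than e itself, a single impulse can move the weights by at most mu * level * ||x||, which is what keeps the deviation recursion stable in impulsive noise.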

Information Extraction from Unstructured and Ungrammatical Data Sources for Semantic Annotation

The Internet has become an attractive avenue for global e-business, e-learning, knowledge sharing, etc. Owing to the continuous increase in the volume of web content, it is not practical for a user to extract information by browsing and integrating data from the huge number of web sources retrieved by existing search engines. Semantic web technology enables advances in information extraction by providing a suite of tools for integrating data from different sources. To take full advantage of the semantic web, existing web pages must be annotated as semantic web pages. This research develops a tool, named OWIE (Ontology-based Web Information Extraction), for semantic web annotation using domain-specific ontologies. The tool automatically extracts information from HTML pages with the help of predefined ontologies and gives it a semantic representation. Two case studies have been conducted to analyze the accuracy of OWIE.

Simulation of Large Deformations of Rubbers by the RKPM Method

In this paper, processes involving large deformations of rubber with hyperelastic material behavior are simulated by the reproducing kernel particle method (RKPM). Because meshless shape functions lack the Kronecker delta property, the imposition of essential boundary conditions consumes significant CPU time in meshfree computations. In this work the transformation method is used to impose the essential boundary conditions. RKPM material shape functions are used in the analysis; since the support of a material shape function covers the same set of particles throughout the deformation, the transformation matrix is formed only once, at the initial stage. A computer program for the simulations was developed in MATLAB.
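The loss of the Kronecker delta property mentioned above can be seen directly from the RKPM construction. A minimal 1-D sketch with a linear basis and a cubic-spline kernel follows; the node spacing and support size are illustrative assumptions.

```python
import numpy as np

nodes = np.linspace(0.0, 1.0, 11)   # particle positions (assumed)
a = 0.25                            # kernel support radius (assumed)

def kernel(z):
    """Cubic B-spline weight as a function of normalized distance |z|."""
    z = abs(z)
    if z <= 0.5:
        return 2.0 / 3.0 - 4.0 * z**2 + 4.0 * z**3
    if z <= 1.0:
        return 4.0 / 3.0 - 4.0 * z + 4.0 * z**2 - (4.0 / 3.0) * z**3
    return 0.0

def rkpm_shapes(x):
    """RKPM shape functions at x with linear basis H(z) = [1, z]."""
    phi = np.array([kernel((x - xi) / a) for xi in nodes])
    H = np.stack([np.ones_like(nodes), x - nodes])   # basis at each node
    Mmat = (H * phi) @ H.T                           # 2x2 moment matrix
    c = np.linalg.solve(Mmat, np.array([1.0, 0.0]))  # correction coefficients
    return (c @ H) * phi                             # corrected kernel values

psi = rkpm_shapes(0.37)
print(round(float(psi.sum()), 10), round(float(psi @ nodes), 10))  # 1.0 0.37
print(rkpm_shapes(0.3)[3])   # value at a node: not 1, no Kronecker delta
```

The shape functions reproduce constant and linear fields exactly, but psi_I(x_I) != 1, so nodal coefficients are not nodal values; the transformation method fixes this by mapping coefficients to nodal values through the transformation matrix, formed here only once.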

Integration of Multi-Source Data to Monitor Coral Biodiversity

This study aims at using multi-source data to monitor coral biodiversity and coral bleaching. The coral reef at the Racha Islands, Phuket, was used as the study area. Three sources of data were used: coral diversity data, sensor-based data and satellite data.

Applying Lean Principles, Tools and Techniques in Set Parts Supply Implementation

Lean, initially developed by Toyota, is widely implemented by other companies to improve competitiveness. This research attempts to identify the adoption of lean in the production system of the Malaysian car manufacturer Proton, using a case study approach. To gain in-depth information on the lean implementation, an assembly-line activity called Set Parts Supply (SPS) was studied. The results indicate that applying lean principles, tools and techniques in the implementation of SPS made it possible to achieve the goals on safety, quality, cost, delivery and morale. The implementation increased the size of the workspace, improved the quality of assembly and the delivery of parts supply, reduced manpower, achieved cost savings on electricity and also improved workers' motivation with respect to attendance at work. A framework for SPS implementation is suggested as a contribution to lean practice in production systems.

3D Face Recognition Using Modified PCA Methods

In this paper we present an approach to 3D face recognition based on extracting the principal components of range images using modified PCA methods, namely 2DPCA and bidirectional 2DPCA, also known as (2D)²PCA. A preprocessing stage smooths the images using median and Gaussian filtering. In the normalization stage we locate the nose tip, place it at the center of the image, and crop each image to a standard size of 100×100. In the recognition stage we extract the principal components of each image using both 2DPCA and (2D)²PCA, and finally use the Euclidean distance to find the training image in the database closest to a given test image. We also compare the results of the two methods. The best result achieved in experiments on a public face database is a recognition rate of 83.3% for a random facial expression.
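A minimal sketch of the 2DPCA stage of such a pipeline, on toy 20×20 arrays standing in for the 100×100 range images; the bidirectional (2D)²PCA variant would additionally project the rows with a second basis. All sizes and data here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def twodpca_basis(images, d):
    """2DPCA: top-d eigenvectors of the image covariance matrix
    G = (1/M) * sum_i (A_i - mean)^T (A_i - mean)."""
    mean = images.mean(axis=0)
    G = sum((A - mean).T @ (A - mean) for A in images) / len(images)
    _, vecs = np.linalg.eigh(G)          # eigenvalues in ascending order
    return vecs[:, ::-1][:, :d]          # keep the d leading axes

# Toy training set: 10 random "range images" of size 20x20 (assumed)
train = rng.standard_normal((10, 20, 20))
test = train[3] + 0.05 * rng.standard_normal((20, 20))   # noisy copy of image 3

X = twodpca_basis(train, d=5)
Ytrain = train @ X                        # feature matrices Y_i = A_i X
ytest = test @ X

# Nearest neighbour by Euclidean (Frobenius) distance between feature matrices
dists = [np.linalg.norm(ytest - Y) for Y in Ytrain]
print(int(np.argmin(dists)))              # 3: the matching training image
```

Unlike classical PCA, the images are never flattened to vectors: the covariance matrix is only 20×20 here (100×100 in the paper's setting), which is what makes 2DPCA cheap to train.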

A New Particle Filter Inspired by Biological Evolution: Genetic Filter

In this paper, we consider a new particle filter inspired by biological evolution. In the standard particle filter, a resampling scheme is used to reduce the degeneracy phenomenon and improve estimation performance. Unfortunately, it can also cause the undesired particle deprivation problem. To overcome this problem, we propose a novel filtering method called the genetic filter, in which a genetic algorithm is embedded into the particle filter. The validity of the proposed method is demonstrated by computer simulation.
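A minimal sketch of the idea on a 1-D random-walk tracking problem, where the plain resampling step of the standard particle filter is replaced by fitness-proportional selection, arithmetic crossover and mutation. The operators and rates are illustrative assumptions, not the paper's exact genetic filter.

```python
import numpy as np

rng = np.random.default_rng(2)
Np, T = 500, 50               # particles and time steps (assumed)
q, r = 0.1, 0.5               # process / measurement noise std (assumed)

# Simulate a 1-D random-walk state and noisy observations of it
x = np.cumsum(q * rng.standard_normal(T))
y = x + r * rng.standard_normal(T)

particles = rng.standard_normal(Np)
est = np.empty(T)
for t in range(T):
    particles = particles + q * rng.standard_normal(Np)   # propagate
    w = np.exp(-0.5 * ((y[t] - particles) / r) ** 2)      # likelihood fitness
    w /= w.sum()
    est[t] = w @ particles                                # state estimate
    # Genetic step in place of plain resampling:
    idx = rng.choice(Np, size=(Np, 2), p=w)               # selection by fitness
    parents = particles[idx]
    alpha = rng.random(Np)
    children = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]  # crossover
    mutants = rng.random(Np) < 0.05                       # mutation, rate 5%
    children[mutants] += 0.2 * rng.standard_normal(mutants.sum())
    particles = children

rmse = np.sqrt(np.mean((est - x) ** 2))
print(rmse)
```

Crossover places children between distinct parents and mutation injects fresh diversity; this is the mechanism that counteracts the particle deprivation left behind by pure copy-based resampling.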

An Automated Approach for Assembling Modular Fixtures Using SolidWorks

Modular fixtures (MFs) are very important tools in manufacturing processes in terms of reducing cost and production time. This paper introduces an automated approach to assembling MF elements that employs SolidWorks as a powerful 3D CAD package. The Visual Basic (VB) programming language was used together with SolidWorks API (Application Programming Interface) functions. This integration allowed a plug-in file to be created and new menus to be generated in the SolidWorks environment, through which the user can select, insert, and assemble MF elements.

Emission of Volatile Organic Compounds from the Residential Combustion of Pyrenean Oak and Black Poplar

Smoke from domestic wood burning has been identified as a major contributor to air pollution, motivating detailed emission measurements under controlled conditions. A series of experiments was performed to characterise the emissions from wood combustion in a fireplace and in a woodstove of two common species of trees grown in Spain: Pyrenean oak (Quercus pyrenaica) and black poplar (Populus nigra). Volatile organic compounds (VOCs) in the exhaust emissions were collected in Tedlar bags, re-sampled in sorbent tubes and analysed by thermal desorption-gas chromatography-flame ionisation detection. Pyrenean oak presented substantially higher emissions in the woodstove than in the fireplace for the majority of compounds; the opposite was observed for poplar. Among the 45 identified species, benzene and benzene-related compounds represent the most abundant group, followed by oxygenated VOCs and aliphatics. Emission factors obtained in this study are generally of the same order as those reported for residential experiments in the USA.

Analysis of Road Repairs in Undermined Areas

The article presents the results of an analysis of expected-subsidence maps of undermined areas for road repair management. The analysis was carried out in the Karvina district of the Czech Republic, covering undermined areas with ongoing or completed deep mining activities in the years 2003-2009. The article discusses how local road maintenance authorities can, with the limited data available, determine the areas that will need the most repairs in the future. From the expected-subsidence maps a new map of surface curvature was calculated; combined with road maps and historical repair data, the result was a division into five main categories of undermined areas, providing a very simple management tool.
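A minimal sketch of deriving a curvature map from an expected-subsidence raster and binning it into five categories; the synthetic subsidence trough, the cell size and the quantile bin edges are illustrative assumptions, not the article's actual data or classification.

```python
import numpy as np

h = 100.0                                     # raster cell size in metres (assumed)
xx, yy = np.meshgrid(np.arange(50), np.arange(50))
# Synthetic expected-subsidence trough in metres (negative = sinking)
s = -2.0 * np.exp(-((xx - 25.0) ** 2 + (yy - 25.0) ** 2) / 100.0)

# Surface curvature approximated by the Laplacian (second central differences)
d2x = (np.roll(s, -1, axis=1) - 2.0 * s + np.roll(s, 1, axis=1)) / h**2
d2y = (np.roll(s, -1, axis=0) - 2.0 * s + np.roll(s, 1, axis=0)) / h**2
curvature = d2x + d2y

# Split the curvature range into five categories for repair planning
edges = np.quantile(curvature, [0.2, 0.4, 0.6, 0.8])
category = np.digitize(curvature, edges)      # 0..4

print(sorted(set(category.ravel().tolist())))  # all five categories occur
```

Roads crossing cells with the strongest curvature (convex or concave) are the candidates for the most repairs, which is the kind of map-overlay step the article combines with road maps and repair history.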

A Survey on Life Science Database Citation Frequency in the Scientific Literature

Many databases covering various fields of the life sciences are available online. To find the well-used ones, a survey measuring life science database citation frequency in the scientific literature was carried out, by counting how many of the articles available in the PubMed Central archive cite a specific life science database. This paper presents and discusses the results of the survey.

Analyses of Socio-Cognitive Identity Styles among Slovak Adolescents

This contribution analyzes the identity styles of adolescents (N = 463) aged 16 to 19 (mean age 17.7 years). We used Berzonsky's Identity Style Inventory, which distinguishes three basic measured identity styles, informational, normative and diffuse-avoidant, as well as commitment. The informational identity style, which influences personal adaptability, coping strategies and quality of life, and the normative identity style, in which an individual adopts the models of authorities in self-definition, were found to have the highest representation in the studied group, with higher scores for girls than for boys. The normative identity style correlates positively with the informational identity style. The diffuse-avoidant identity style, in which the individual puts off defining his or her personality, was found to be positively associated with maladaptive decisional strategies, neuroticism and depressive reactions; in our research sample it had the lowest representation and correlated negatively with commitment, that is, with coping strategies and trust in oneself and in the surrounding world. The age of the adolescents did not significantly differentiate the representation of identity styles. We identified a model in which the informational and normative identity styles have a positive relationship, and the informational and diffuse-avoidant styles a negative relationship, with commitment, while commitment is at the same time influenced by other external factors.

FCA-based Conceptual Knowledge Discovery in Folksonomy

Tagging data, consisting of users, tags and resources, constitute a folksonomy: a user-driven, bottom-up approach to organizing and classifying information on the Web. The tagging data stored in a folksonomy contain a great deal of very useful information and knowledge. However, an appropriate approach for analyzing tagging data and discovering the hidden knowledge in them remains one of the main problems in folksonomy mining research. In this paper, we propose a folksonomy data mining approach based on Formal Concept Analysis (FCA) for easily discovering hidden knowledge in a folksonomy. We also demonstrate through an experiment how the proposed approach can be applied in a collaborative tagging system. The approach can further be applied to interesting areas such as social network analysis and semantic web mining.
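A minimal sketch of FCA on a toy folksonomy projected onto a binary resource-tag context; real folksonomies are triadic (user, tag, resource) data, and both the projection and the brute-force concept enumeration are illustrative assumptions.

```python
from itertools import combinations

# Toy folksonomy projected onto a binary (resource, tag) context (assumed)
context = {
    "photo1": {"beach", "sunset"},
    "photo2": {"beach", "surf"},
    "page1":  {"python", "tutorial"},
    "page2":  {"python", "web"},
}
objects = set(context)
attributes = set().union(*context.values())

def intent(objs):
    """Tags shared by every object in objs (all tags for the empty set)."""
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

def extent(attrs):
    """Objects carrying every tag in attrs."""
    return {o for o in objects if attrs <= context[o]}

# Enumerate all formal concepts (extent, intent) by closing every object
# subset; brute force is fine at this toy scale
concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(sorted(objects), r):
        A = extent(intent(set(objs)))          # closure of the object set
        concepts.add((frozenset(A), frozenset(intent(A))))

for ext, inte in sorted(concepts, key=lambda c: (len(c[0]), sorted(c[0]))):
    print(sorted(ext), sorted(inte))
```

Each concept pairs a maximal set of resources with the exact tag set they share, e.g. ({photo1, photo2}, {beach}); the resulting concept lattice is the "hidden knowledge" structure the mining approach works on.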

A New Stability Analysis and Stabilization of Discrete-Time Switched Linear Systems Using Vector Norms Approach

In this paper, we investigate a new stability analysis for discrete-time switched linear systems based on comparison, the overvaluing principle, the application of the Borne-Gentina criterion and the Kotelyanski conditions. These stability conditions, derived from vector norms, correspond to a vector Lyapunov function. The switched system to be controlled is represented in companion form. A comparison system relative to a regular vector norm is used to obtain the simple arrow form of the state matrix, which allows a suitable application of the Borne-Gentina criterion to establish sufficient conditions for global asymptotic stability. The proposed approach can provide a constructive solution to the state and static output feedback stabilization problems.

Universal Metadata Definition

The need for standards has always been a priority in every discipline. Today, standards such as XML and USB attempt to create a universal interface for their respective areas. The information describing every family of entities in a given discipline has much in common; this shared information is known as metadata. A great deal of work has been done in specific domains, such as IEEE LOM and MPEG-7, but these efforts do not aim at the universality of creating metadata for all entities, where an entity (object) is not restricted to software terms. This paper addresses the problem of a universal metadata definition, which may lead to increased precision in search.

Functionalization of Carbon Nanotubes Using Nitric Acid Oxidation and DBD Plasma

In this study, multiwall carbon nanotubes (MWNTs) were modified chemically with nitric acid and by dielectric barrier discharge (DBD) plasma in an oxygen-based atmosphere. The carbon nanotubes (CNTs) used were prepared by the chemical vapour deposition (CVD) floating catalyst method. To remove amorphous carbon and metal catalyst, the MWNTs were exposed to dry air and washed with hydrochloric acid. Heating the purified CNTs under a helium atmosphere eliminated acidic functional groups. Fourier transform infrared (FTIR) spectroscopy shows the formation of oxygen-containing groups such as C=O and COOH. Brunauer-Emmett-Teller (BET) analysis revealed that functionalization generates defects on the sidewalls and opens the ends of the CNTs. Results of temperature-programmed desorption (TPD) and gas chromatography (GC) indicate that the nitric acid treatment creates more acidic groups than the plasma treatment.

Semantic Modeling of Management Information: Enabling Automatic Reasoning on DMTF-CIM

CIM is the standard formalism for modeling management information developed by the Distributed Management Task Force (DMTF) in the context of its WBEM proposal, designed to provide a conceptual view of the managed environment. In this paper, we propose the inclusion of formal knowledge representation techniques, based on Description Logics (DLs) and the Web Ontology Language (OWL), in CIM-based conceptual modeling, and then we examine the benefits of such a decision. The proposal is specified as a CIM metamodel level mapping to a highly expressive subset of DLs capable of capturing all the semantics of the models. The paper shows how the proposed mapping can be used for automatic reasoning about the management information models, as a design aid, by means of new-generation CASE tools, thanks to the use of state-of-the-art automatic reasoning systems that support the proposed logic and use algorithms that are sound and complete with respect to the semantics. Such a CASE tool framework has been developed by the authors and its architecture is also introduced. The proposed formalization is not only useful at design time, but also at run time through the use of rational autonomous agents, in response to a need recently recognized by the DMTF.