Optimization of Petroleum Refinery Configuration Design with Logic Propositions

This work concerns the topological optimization problem of determining the optimal petroleum refinery configuration. We investigate, and aim to advance, existing optimization approaches and strategies that employ logic propositions in conceptual process synthesis problems. In particular, we seek to contribute to this increasingly active area of chemical process modeling by addressing two potentially important issues: (a) how design specifications formulated in a mixed logical-integer optimization model can enrich the representation of a synthesis problem by incorporating past design experience, engineering knowledge, and heuristics; and (b) how structural specifications on the interconnectivity relationships by space (states) and by function (tasks) in a superstructure should be properly formulated within a mixed-integer linear programming (MILP) model. The proposed modeling technique is illustrated on a case study involving alternative processing routes for naphtha, in which a significant improvement in solution quality is obtained.
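
As a minimal illustration of the building blocks behind issue (a), not the authors' model: a logic proposition such as "selecting unit A implies selecting unit B" (Y_A ⇒ Y_B) can be embedded in an MILP as the linear constraint y_A ≤ y_B over binary variables. A sketch using the PuLP library, with hypothetical units and profits:

```python
# Minimal sketch: embedding logic propositions over binary selection
# variables in an MILP. Units and profit figures are hypothetical.
from pulp import LpProblem, LpVariable, LpMaximize, lpSum, value

prob = LpProblem("refinery_topology_sketch", LpMaximize)

units = ["A", "B", "C"]
profit = {"A": 5.0, "B": -2.0, "C": 3.0}        # net profit of installing each unit
y = {u: LpVariable(f"y_{u}", cat="Binary") for u in units}

prob += lpSum(profit[u] * y[u] for u in units)  # objective: total net profit

# Logic proposition Y_A => Y_B as a linear constraint:
prob += y["A"] <= y["B"]

# Disjunction "at least one of B, C must be selected" (Y_B v Y_C):
prob += y["B"] + y["C"] >= 1

prob.solve()
print({u: int(value(y[u])) for u in units})
```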

Analysis of Sequence Moves in Successful Chess Openings Using Data Mining with Association Rules

Chess is an indoor game that improves human confidence, concentration, planning skills, and knowledge. The main objective of this paper is to help chess players improve their openings using data mining techniques. Budding chess players usually practice by analyzing various existing openings, but analyzing and correlating thousands of openings becomes tedious and complex. The work in this paper analyzes the best lines of the Blackmar-Diemer Gambit (BDG), which opens with White's 1.d4, using data mining. The analysis is carried out on a collection of winning games by applying association rules. The first step of this analysis is assigning a variable to each distinct move sequence. In the second step, sequence association rules are generated and their support and confidence factors are calculated, which helps us find the best move subsequences that may lead to a winning position.
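
For illustration, the support and confidence of a sequence rule X ⇒ Y over a collection of winning games can be computed as follows (a minimal sketch, not the paper's exact procedure; the games and moves below are hypothetical):

```python
# Support/confidence of a sequence rule X => Y over winning games,
# where a game is a list of move tokens. Data is hypothetical.
def contains_subsequence(game, pattern):
    """True if `pattern` occurs in `game` as an ordered
    (not necessarily contiguous) subsequence."""
    it = iter(game)
    return all(move in it for move in pattern)

def support_confidence(games, antecedent, consequent):
    n = len(games)
    n_x = sum(contains_subsequence(g, antecedent) for g in games)
    n_xy = sum(contains_subsequence(g, antecedent + consequent) for g in games)
    support = n_xy / n
    confidence = n_xy / n_x if n_x else 0.0
    return support, confidence

games = [
    ["d4", "d5", "e4", "dxe4", "Nc3", "Nf6"],   # a BDG-style move list
    ["d4", "d5", "e4", "dxe4", "f3", "exf3"],
    ["d4", "Nf6", "c4", "e6"],
]
print(support_confidence(games, ["d4", "d5", "e4"], ["Nc3"]))
```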

A Study on a Generic Development Process for BPM+SOA Design and Implementation

To optimize annual IT spending and reduce the complexity of entire system architectures, companies have started SOA trials. It is common knowledge that designing an SOA system requires a top-down approach, but in reality silo systems are still being built, so these companies cannot reuse newly designed services and cannot enjoy SOA's economic benefits. To prevent this situation, we designed a generic SOA development process referred to as the architecture of "mass customization." To define the generic detailed development processes, we conducted a case study on an imaginary company. Through the case study, we were able to define practical development processes and found that this approach could vastly reduce the cost of development updates.

Attitude Change after Taking a Virtual Global Understanding Course

A virtual collaborative classroom was created at East Carolina University, using videoconference technology over the regular Internet to bring students from 18 different countries, two at a time, into the ECU classroom in real time to learn about each other's culture. Students from two countries are partnered one on one; they meet for 4-5 weeks and submit a joint paper. The same process is then repeated with two other countries. Lectures and student discussions are managed with pre-determined topics and questions. Classes are conducted in English, and reading assignments are placed on the website. Administratively, all partners are independent: students pay fees and receive credit at their home institutions. Familiarity with technology, knowledge of cultural understanding, and attitude change were assessed; only attitude changes are reported in this paper. After taking this course, all students stated that their comfort level in working with, and their desire to interact with, culturally different others grew stronger, and that their xenophobic and isolationist attitudes decreased.

Effective Online Staff Training: Is This Possible?

The purpose of this paper is to consider the introduction of online courses to replace the current classroom-based staff training. The current training is practical, and must be completed before access to the financial computer system is authorized. The long term objective is to measure the efficacy, effectiveness and efficiency of the training, and to establish whether a transfer of knowledge back to the workplace has occurred. This paper begins with an overview explaining the importance of staff training in an evolving, competitive business environment and defines the problem facing this particular organization. A summary of the literature review is followed by a brief discussion of the research methodology and objective. The implementation of the alpha version of the online course is then described. This paper may be of interest to those seeking insights into, or new theory regarding, practical interventions of online learning in the real world.

Study on Diversified Developments Improving Environmental Values: The Case of a University Campus

This study aims to clarify which developments can improve the socio-cultural value of environments, and to obtain new knowledge for selecting development plans. The contingent valuation method (CVM) is adopted for evaluation. A university campus (hereafter, CP) is selected as the case for this research on account of its varied environments and institutions and its many users. Investigations were conducted from four points of view: the total value of the whole CP environment, its utility value, the value of each environment existing on the CP, and the value of each development plan assumed for the CP. Furthermore, respondents' attributes were also investigated. In consequence, the following results are obtained. 1) Almost all of the total value of the CP is composed of the utility value of direct use. 2) The environment and the development plan with the highest values are identified. 3) The development plan that most improves environmental value is specified.

Granularity Analysis for Spatio-Temporal Web Sensors

In recent years, many studies have actively mined the exploding Web, especially User-Generated Content (UGC) such as weblogs, for knowledge about various phenomena and events in the physical world, and Web services based on this Web-mined knowledge have begun to be offered to the public. However, there are few detailed investigations of how accurately Web-mined data reflect physical-world data. It is problematic to use Web-mined data uncritically in public Web services without sufficiently ensuring their accuracy. Therefore, this paper introduces the simplest Web sensor and a spatiotemporally-normalized Web sensor to extract spatiotemporal data about a target phenomenon from weblogs retrieved by keyword(s) representing the target phenomenon, and validates the potential and reliability of the Web-sensed spatiotemporal data through four kinds of granularity analyses of its correlation coefficients with temperature, rainfall, snowfall, and earthquake statistics per day and per region from the Japan Meteorological Agency as physical-world data: spatial granularity (a region's population density), temporal granularity (time period, e.g., per day vs. per week), representation granularity (e.g., "rain" vs. "heavy rain"), and media granularity (weblogs vs. microblogs such as Tweets).
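
For illustration, a minimal sketch of the simplest Web sensor idea (not the paper's implementation): the sensor value for a day is the count of weblog posts matching the target keyword, and its reliability is gauged by its correlation coefficient with a physical-world series. The counts and rainfall values below are hypothetical:

```python
# Simplest Web sensor: daily keyword-match counts correlated with a
# physical-world statistic. All numbers below are hypothetical.
import numpy as np

blog_counts = np.array([3, 8, 15, 6, 2, 11, 9])               # posts containing "rain" per day
rainfall_mm = np.array([0.0, 4.2, 12.5, 3.1, 0.0, 7.8, 5.5])  # observed rainfall per day

r = np.corrcoef(blog_counts, rainfall_mm)[0, 1]               # Pearson correlation
print(f"correlation: {r:.3f}")

# A spatiotemporally-normalized variant might divide counts by the total
# posts from that region and day, factoring out posting-volume effects:
total_posts = np.array([120, 130, 110, 140, 125, 150, 135])
normalized = blog_counts / total_posts
print(f"normalized correlation: {np.corrcoef(normalized, rainfall_mm)[0, 1]:.3f}")
```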

Specification of Agent Explicit Knowledge in Cryptographic Protocols

Cryptographic protocols are widely used in various applications to provide secure communications. They are usually represented as communicating agents that send and receive messages. These agents use their knowledge to exchange information and communicate with other agents involved in the protocol. An agent's knowledge can be partitioned into explicit knowledge and procedural knowledge. Explicit knowledge refers to the set of information that is either proper to the agent or directly obtained from other agents through communication. Procedural knowledge relates to the set of mechanisms used to derive new information from what is already available to the agent. In this paper, we propose a mathematical framework that specifies the explicit knowledge of an agent involved in a cryptographic protocol. Modelling this knowledge is crucial for the specification, analysis, and implementation of cryptographic protocols. We also report on a prototype tool that allows the representation and manipulation of explicit knowledge.
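
A toy sketch of the explicit/procedural distinction (not the paper's formal framework; the message atoms and agent names are hypothetical): explicit knowledge is the union of the agent's proper knowledge and the messages it has received, while deriving anything further would require procedural knowledge:

```python
# Toy model: an agent's explicit knowledge as proper knowledge plus
# directly received messages. Atoms and names are hypothetical.
class Agent:
    def __init__(self, name, proper_knowledge):
        self.name = name
        self.explicit = set(proper_knowledge)   # proper + communicated items

    def receive(self, message):
        """Directly obtained information extends explicit knowledge."""
        self.explicit.add(message)

    def knows(self, item):
        return item in self.explicit

alice = Agent("A", {"nonce_a", "key_ab"})
alice.receive("enc(nonce_b, key_ab)")        # message from another agent
print(alice.knows("enc(nonce_b, key_ab)"))   # True
print(alice.knows("nonce_b"))                # False: extracting this would
                                             # need procedural knowledge (decryption)
```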

Text Retrieval Relevance Feedback Techniques for Bag of Words Model in CBIR

The state-of-the-art Bag of Words model in Content-Based Image Retrieval (CBIR) has been used for years, but relevance feedback strategies for this model have not been fully investigated. Being inspired by text retrieval, the Bag of Words model can draw on the wealth of knowledge and practice available in text retrieval. We study relevance feedback models from text retrieval and experiment with adapting them to image retrieval. The experiments show that the techniques from text retrieval give good results for image retrieval and that further improvement is possible.
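
The classical Rocchio update is the canonical relevance feedback technique in text retrieval; a minimal sketch of applying it to Bag of Words histograms follows (whether this matches the paper's exact variant is an assumption, and the alpha/beta/gamma weights are conventional illustrative values):

```python
# Rocchio relevance feedback adapted to Bag-of-Words histograms.
# The weights and random data are illustrative, not the paper's setup.
import numpy as np

def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """query: (d,) BoW vector; relevant/nonrelevant: (n, d) feedback vectors."""
    q_new = alpha * query
    if len(relevant):
        q_new += beta * np.mean(relevant, axis=0)
    if len(nonrelevant):
        q_new -= gamma * np.mean(nonrelevant, axis=0)
    return np.maximum(q_new, 0.0)   # negative weights are usually clipped

d = 1000                            # visual vocabulary size (hypothetical)
query = np.random.rand(d)
relevant = np.random.rand(4, d)     # user-marked relevant images
nonrelevant = np.random.rand(2, d)  # user-marked non-relevant images
print(rocchio(query, relevant, nonrelevant).shape)
```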

A Scenario-Based Approach for the Air Traffic Flow Management Problem with Stochastic Capacities

In this paper, we investigate the strategic stochastic air traffic flow management problem, which seeks to balance airspace capacity and demand under weather disruptions. The goal is to reduce the need for myopic tactical decisions that do not account for probabilistic knowledge about the near-future states of the National Airspace System (NAS). We present and discuss a scenario-based modeling approach, built on a time-space stochastic process, to depict weather disruption occurrences in the NAS. A solution framework is also proposed, along with a distributed implementation aimed at overcoming scalability problems. Issues related to this implementation are also discussed.

Compression of Semistructured Documents

EGOTHOR is a search engine that indexes the Web and allows us to search Web documents. Its hit list contains the URL and title of each hit, along with a snippet that briefly shows a match. The snippet can almost always be assembled by an algorithm that has full knowledge of the original document (mostly an HTML page). This implies that the search engine must store the full text of the documents as part of the index. Such a requirement leads us to choose an appropriate compression algorithm to reduce the space demand. One solution would be to use common compression methods, for instance gzip or bzip2, but it may be preferable to develop a new method that takes advantage of the document structure, or rather, the textual character of the documents. Special text compression algorithms and methods for compressing XML documents already exist. The aim of this paper is to integrate the two approaches to achieve an optimal compression ratio.
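
For illustration, the general-purpose baseline can be reproduced with Python's standard library (zlib implements the gzip DEFLATE algorithm; bz2 is bzip2); a structure-aware method would additionally exploit the markup. The sample document is synthetic:

```python
# Baseline comparison: general-purpose compression of an HTML document.
# The sample page is synthetic.
import zlib, bz2

html = b"<html><head><title>Sample</title></head><body>"
html += b"<p>Some repetitive text.</p>" * 200
html += b"</body></html>"

for name, compress in [("zlib", zlib.compress), ("bz2", bz2.compress)]:
    out = compress(html)
    print(f"{name}: {len(html)} -> {len(out)} bytes "
          f"(ratio {len(out) / len(html):.3f})")
```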

Methane and Other Hydrocarbon Gas Emissions Resulting from Flaring in Kuwait Oilfields

Air pollution is a major environmental health problem, affecting developed and developing countries around the world. Increasing amounts of potentially harmful gases and particulate matter are being emitted into the atmosphere on a global scale, resulting in damage to human health and the environment. Petroleum-related air pollutants can have a wide variety of adverse environmental impacts. In the crude oil production sector, there is a strong need for thorough knowledge of the gaseous emissions resulting from the daily flaring of associated gas of known composition under several operating conditions. Such knowledge can help control gaseous emissions from flares and thus protect the immediate and distant surroundings against environmental degradation. The impacts of methane and non-methane hydrocarbon emissions from flaring activities at oil production facilities in the Kuwait oilfields have been assessed through a screening study using records of flaring operations taken at the gas and oil production sites, and by analyzing available meteorological and air quality data measured at stations located near anthropogenic sources. In the present study, the Industrial Source Complex Short Term (ISCST3) dispersion model is used to calculate the ground-level concentrations of methane and non-methane hydrocarbons emitted by flaring across the Kuwait oilfields. Simulating real hourly air quality in and around oil production facilities in the State of Kuwait for the year 2006, with the respective source emission data supplied to the ISCST3 software, indicates that the levels of non-methane hydrocarbons from flaring activities exceed the allowable ambient air standard set by the Kuwait EPA. There is therefore a strong need to address this acute problem and minimize the impact of methane and non-methane hydrocarbons released by flaring over the urban areas of Kuwait.
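
As background, ISCST3 is a steady-state Gaussian plume model; in its general textbook form (not the paper's site-specific setup), the ground-level concentration downwind of an elevated point source such as a flare is

```latex
C(x, y, 0) = \frac{Q}{\pi\, u\, \sigma_y(x)\, \sigma_z(x)}
             \exp\!\left(-\frac{y^2}{2\sigma_y^2(x)}\right)
             \exp\!\left(-\frac{H^2}{2\sigma_z^2(x)}\right)
```

where Q is the emission rate, u the wind speed, σy and σz the lateral and vertical dispersion coefficients, and H the effective release height.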

A Materialized Approach to the Integration of XML Documents: The OSIX System

The data exchanged on the Web are different in nature from those handled by classical database management systems; these data are called semi-structured because they lack the regular, static structure of data in a relational database: their schema is dynamic and may contain missing data or types. This raises the need for further techniques and algorithms to exploit and integrate such data and extract information relevant to the user. In this paper we present the OSIX system (Osiris-based System for Integration of XML Sources). The system uses a data warehouse model designed for the integration of semi-structured data, and more precisely for the integration of XML documents. The architecture of OSIX relies on the Osiris system, a DL-based model designed for the representation and management of databases and knowledge bases. Osiris is a view-based data model whose indexing system supports semantic query optimization. We show that query processing on an XML source is optimized by the indexing approach proposed by Osiris.

Reasoning with Dynamic Domains and Computer Security

Representing objects in a dynamic domain is essential in commonsense reasoning under some circumstances. Classical logics and their nonmonotonic consequence relations, however, are usually not able to deal with reasoning over dynamic domains, because every constant in the logical language denotes some existing object in the static domain. In this paper, we explore a logical formalization that allows us to represent nonexisting objects in commonsense reasoning. A formal system named N-theory is proposed for this purpose, and its possible application in computer security is briefly discussed.

Automatic Generation of OWL Ontologies from UML Class Diagrams Based on Meta-Modelling and Graph Grammars

The model-driven paradigm places models at the center of the development process. These models are represented in languages such as UML, the OMG-standardized language that has become essential for development. Similarly, the ontology engineering paradigm places ontologies at the center of the development process; in that paradigm, OWL is the principal language for knowledge representation. Building ontologies from scratch is generally a difficult task. Bridges between UML and OWL have appeared in several regards, such as classes and associations. In this paper, we exploit the convergence between UML and OWL to propose an approach, based on meta-modelling and graph grammars and registered within the MDA architecture, for the automatic generation of OWL ontologies from UML class diagrams. The transformation is based on transformation rules; the level of abstraction in these rules is kept close to the application in order to produce usable ontologies. We illustrate this approach with an example.
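
For illustration, the kind of rules such a transformation applies (UML class → OWL class, association → object property, generalization → subclass) can be sketched directly with rdflib, setting aside the authors' graph-grammar machinery; all names are hypothetical:

```python
# Sketch of UML-to-OWL mapping rules written directly as RDF triples.
# Class, property, and namespace names are hypothetical.
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/onto#")
g = Graph()
g.bind("ex", EX)

# Rule 1: a UML class becomes an OWL class.
for uml_class in ["Customer", "Order"]:
    g.add((EX[uml_class], RDF.type, OWL.Class))

# Rule 2: a UML association becomes an OWL object property with domain/range.
g.add((EX.places, RDF.type, OWL.ObjectProperty))
g.add((EX.places, RDFS.domain, EX.Customer))
g.add((EX.places, RDFS.range, EX.Order))

# Rule 3: a UML generalization becomes rdfs:subClassOf.
g.add((EX.PremiumCustomer, RDF.type, OWL.Class))
g.add((EX.PremiumCustomer, RDFS.subClassOf, EX.Customer))

print(g.serialize(format="turtle"))
```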

Computational Investigations of Concrete Footing Rotational Rigidity

In many buildings we rely on large footings for structural stability. Designers often compensate for the lack of knowledge about foundation-soil interaction by furnishing structures with overly large footings, which may significantly increase building expenditures when many large foundations are present. This paper describes the interface material law that governs behavior along the contact surface of adjacent materials, and the behavior of a large foundation under ultimate limit loading. A case study is chosen that represents a foundation-soil system frequently used in general practice and therefore relevant to other structures. The investigations cover compressive versus uplifting wind forces, alterations to footing size and subgrade composition, the role of slab stiffness and presence, and the effect of commonly used structural joints and connections. These investigations aim to provide the reader with an objective design approach that efficiently prevents structural instability.

Short-Time Identification of Feed Drive Systems Using the Nonlinear Least Squares Method

Design and modeling of nonlinear systems require knowledge of all internally acting parameters and effects. An empirical alternative is to identify the system's transfer function from input and output data, as a black-box model. This paper presents a procedure using a least squares algorithm to identify the coefficients of a feed drive system in the time domain, using a reduced model based on windowed input and output data. The command and response of the axis are first measured over the first 4 ms, and least squares is then applied to predict the transfer function coefficients for this displacement segment. From the identified coefficients, the subsequent command response segments are estimated. The obtained results reveal a considerable potential of the least squares method to identify the system's time-based coefficients and to predict the command response accurately, as compared to measurements.
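
A minimal sketch of this kind of time-domain least squares fit on windowed input/output data follows (an ARX-style reduced model; the model order and the synthetic data are made up, not the paper's feed drive measurements):

```python
# Time-domain least squares identification on a windowed I/O record.
# Synthetic data stands in for the feed drive measurements.
import numpy as np

def identify_arx(u, y, na=2, nb=2):
    """Fit y[k] = -a1*y[k-1]-...-a_na*y[k-na] + b1*u[k-1]+...+b_nb*u[k-nb]."""
    n = max(na, nb)
    rows, targets = [], []
    for k in range(n, len(y)):
        row = [-y[k - i] for i in range(1, na + 1)]
        row += [u[k - i] for i in range(1, nb + 1)]
        rows.append(row)
        targets.append(y[k])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta        # [a1..a_na, b1..b_nb]

# Synthetic test: simulate a known 2nd-order system, then recover it.
rng = np.random.default_rng(0)
u = rng.standard_normal(400)                  # command signal
y = np.zeros(400)
for k in range(2, 400):
    y[k] = 1.5 * y[k-1] - 0.7 * y[k-2] + 0.5 * u[k-1] + 0.2 * u[k-2]

print(identify_arx(u, y))   # approx. [-1.5, 0.7, 0.5, 0.2]
```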

Role-play Gaming Simulation for Flood Management on Cultural Heritage: A Case Study of Ayutthaya Historic City

The main aim of this research is to develop a methodology that encourages people's awareness, knowledge, and understanding of participation in flood management for cultural heritage, through cooperation and interaction among the government, private, and public sectors, based on role-play gaming simulation theory. The research develops a role-play gaming simulation from existing documents, games and role-playing from several sources, and existing data on the research site. We found that role-play gaming simulation can be implemented to improve understanding of the existing problem and of the impact of flooding on cultural heritage, and that the role-play game can be developed into a tool for improving people's knowledge, understanding, and awareness of public participation in flood management for cultural heritage. Moreover, cooperation among the government, private, and public sectors can be improved through role-play gaming simulation.

Integrated Cultivation Technique for Microbial Lipid Production by Photosynthetic Microalgae and Locally Oleaginous Yeast

The objective of this research is to study microbial lipid production by locally isolated photosynthetic microalgae and oleaginous yeast via an integrated cultivation technique that uses the CO2 emissions from the yeast fermentation. A maximum specific growth rate of Chlorella sp. KKU-S2 of 0.284 (1/d) was obtained under integrated cultivation, and a maximum lipid yield of 1.339 g/L was found after cultivation for 5 days, while a lipid yield of 0.969 g/L was obtained after day 6 of cultivation using CO2 from air. High values of the volumetric lipid production rate (QP, 0.223 g/L/d), specific product yield (YP/X, 0.194), and volumetric cell mass production rate (QX, 1.153 g/L/d) were found using ambient-air CO2 coupled with the CO2 emissions from yeast fermentation. An overall lipid yield of 8.33 g/L was obtained (1.339 g/L from Chlorella sp. KKU-S2 and 7.06 g/L from T. maleeae Y30), while a low lipid yield of 0.969 g/L was found using the non-integrated cultivation technique. To our knowledge, this is the first report of lipid production by the locally isolated microalga Chlorella sp. KKU-S2 and the yeast T. maleeae Y30 in an integrated technique that improves biomass and lipid yield by using the CO2 emissions from yeast fermentation.

Optimizing Spatial Trend Detection by Artificial Immune Systems

Spatial trends are among the valuable patterns in geographic databases. They play an important role in data analysis and knowledge discovery from spatial data. A spatial trend is a regular change of one or more non-spatial attributes when moving spatially away from a start object. Spatial trend detection is a graph search problem, so heuristic methods can be a good solution. The artificial immune system (AIS) is a special method for searching and optimization, a novel evolutionary paradigm inspired by the biological immune system. Models based on immune system principles, such as the clonal selection theory, the immune network model, and the negative selection algorithm, have been finding increasing application in science and engineering. In this paper, we develop a novel immunological algorithm based on the clonal selection algorithm (CSA) for spatial trend detection. We create a neighborhood graph and neighborhood paths, then select as antibodies the spatial trends whose affinity is high. In an evolutionary process with the artificial immune algorithm, the affinity of low-affinity trends is increased through mutation until a stopping condition is satisfied.
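
A generic clonal selection loop (CLONALG-style) can be sketched as follows; the affinity function here is a stand-in rather than the paper's spatial-trend measure, and in practice candidates would encode neighborhood paths:

```python
# Generic clonal selection loop: clone the best antibodies proportionally
# to rank, hypermutate inversely to affinity, select survivors, and
# inject fresh antibodies for diversity. Affinity is a stand-in.
import random

def affinity(candidate):
    # Stand-in: reward candidates close to an arbitrary target vector.
    target = [0.3, 0.7, 0.5]
    return -sum((c - t) ** 2 for c, t in zip(candidate, target))

def mutate(candidate, rate):
    # Higher-affinity antibodies receive smaller perturbations.
    return [min(1.0, max(0.0, c + random.gauss(0.0, rate))) for c in candidate]

population = [[random.random() for _ in range(3)] for _ in range(20)]
for generation in range(50):
    population.sort(key=affinity, reverse=True)
    clones = []
    for rank, antibody in enumerate(population[:5]):   # clone the best 5
        n_clones = 5 - rank                            # more clones for higher rank
        rate = 0.05 * (rank + 1)                       # worse rank, bigger mutation
        clones += [mutate(antibody, rate) for _ in range(n_clones)]
    population = sorted(population + clones, key=affinity, reverse=True)[:18]
    population += [[random.random() for _ in range(3)] for _ in range(2)]

print(max(population, key=affinity))
```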