Evaluating Complexity – Ethical Challenges in Computational Design Processes

Complexity, as a theoretical background, has made it easier to understand and explain the features and dynamic behavior of various complex systems. As this common theoretical background has confirmed, borrowing design terminology from the natural sciences has helped to control and understand urban complexity. Phenomena like self-organization, evolution and adaptation are appropriate for describing the formerly inaccessible characteristics of the complex environment in unpredictable bottom-up systems. Increased computing capacity has been a key element in capturing the chaotic nature of these systems. A paradigm shift in urban planning and architectural design has forced us to give up the illusion of total control over the urban environment, and consequently to seek novel methods for steering its development. New methods using dynamic modeling have offered a real option for a more thorough understanding of complexity and urban processes. At best, new approaches may renew design processes so that we get a better grip on the complex world via more flexible processes, support urban environmental diversity and respond to our needs beyond basic welfare by liberating ourselves from standardized minimalism. A complex system and its features are as such beyond human ethics. Self-organization or evolution is neither good nor bad. Their mechanisms are by nature devoid of reason. They are common features of urban dynamics and natural processes alike. They are features of a complex system, and they cannot be prevented. Yet their dynamics can be studied and supported. The paradigm of complexity and the new design approaches have been criticized for a lack of humanity and morality, but the ethical implications of scientific or computational design processes have not been much discussed. It is important to distinguish the ethics of the theory and tools from the ethics of computer-aided processes based on ethical decisions. Urban planning and architecture cannot be based on the survival of the fittest; however, the natural dynamics of the system cannot be impeded on the grounds of being "non-human". In this paper the ethical challenges of using dynamic models are contemplated in light of a few examples of new architecture, dynamic urban models and the literature. It is suggested that ethical challenges in computational design processes could be reframed under the concepts of responsibility and transparency.

Genetic Algorithm Based Optimal Control for a 6-DOF Non Redundant Stewart Manipulator

The applicability of tuning the controller gains of a Stewart manipulator using a genetic algorithm as an efficient search technique is investigated. Kinematics and dynamics models were introduced in detail for simulation purposes. A PD task-space control scheme was used. To demonstrate the feasibility of the technique, a numerical model of the Stewart manipulator was built. A genetic algorithm was then employed to search for the optimal controller gains. The controller was tested on a generic circular mission. The simulation results show that the technique converges quickly and delivers superior performance for different payloads.
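
As a hedged illustration of the approach, the sketch below tunes PD gains with a simple real-coded genetic algorithm, but on a single-axis second-order plant tracking a sinusoidal reference rather than the full 6-DOF Stewart dynamics; the plant mass, GA settings and fitness definition are assumptions for the example, not values from the study.

```python
# Minimal sketch: GA-based PD gain tuning on a simplified single-axis plant.
import numpy as np

rng = np.random.default_rng(0)

def simulate_pd(kp, kd, t_end=5.0, dt=5e-3):
    """Track a sinusoidal reference with a PD controller on m*x'' = u."""
    m = 2.0                                   # assumed payload mass [kg]
    x, v, iae = 0.0, 0.0, 0.0
    t = np.arange(0.0, t_end, dt)
    ref = np.sin(2 * np.pi * 0.5 * t)         # stand-in for a circular mission
    dref = np.pi * np.cos(2 * np.pi * 0.5 * t)
    for r, dr in zip(ref, dref):
        e, de = r - x, dr - v
        u = kp * e + kd * de                  # PD task-space control law
        v += (u / m) * dt                     # semi-implicit Euler step
        x += v * dt
        iae += abs(e) * dt                    # integral of absolute error
    return iae

def fitness(gains):
    kp, kd = gains
    return -simulate_pd(kp, kd)               # GA maximizes fitness

def ga(pop_size=30, generations=40, bounds=((1.0, 500.0), (0.1, 100.0))):
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pop = rng.uniform(lo, hi, size=(pop_size, 2))
    best, best_score = pop[0].copy(), fitness(pop[0])
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        gen_best = int(np.argmax(scores))
        if scores[gen_best] > best_score:
            best, best_score = pop[gen_best].copy(), scores[gen_best]
        # tournament selection
        idx = rng.integers(0, pop_size, size=(pop_size, 2))
        winners = np.where((scores[idx[:, 0]] > scores[idx[:, 1]])[:, None],
                           pop[idx[:, 0]], pop[idx[:, 1]])
        # arithmetic crossover with a shuffled partner + Gaussian mutation
        partners = winners[rng.permutation(pop_size)]
        alpha = rng.uniform(size=(pop_size, 1))
        children = alpha * winners + (1 - alpha) * partners
        children += rng.normal(scale=0.05 * (hi - lo), size=children.shape)
        pop = np.clip(children, lo, hi)
        pop[0] = best                          # elitism
    return best

if __name__ == "__main__":
    kp, kd = ga()
    print(f"GA-tuned gains: Kp={kp:.1f}, Kd={kd:.1f}, IAE={simulate_pd(kp, kd):.4f}")
```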

ISTER (Immune System - Tumor Efficiency Rate): An Important Key for Planning in Radiotherapic Facilities

The use of the oncologic index ISTER allows for more effective planning of radiotherapy facilities in hospitals. Any change in a radiotherapy treatment due to unexpected stops may be accommodated by recalculating the doses for the new treatment duration while keeping the optimal prognosis. The results obtained with a simulation model of millions of patients allow the definition of optimal success-probability algorithms.

Design of the Production Line Based On RFID through 3D Modeling

Radio-frequency identification (RFID) has emerged as a beneficial means, conforming to GS1 standards, of providing the best solutions in the manufacturing area. It competes with other automated identification technologies, e.g. barcodes and smart cards, with regard to high-speed scanning, reliability and accuracy. The purpose of this study is to improve a production line's performance by implementing an RFID system in the manufacturing area, using 3D modeling in Cinema 4D R13, which provides clear graphical scenes for users to portray their applications. Finally, with regard to improving system performance, it shows, in a comparison with the barcode scanner, how RFID appears as a well-suited technology for handling different kinds of raw materials in the production line based on a logical process.

Agent-based Simulation for Blood Glucose Control in Diabetic Patients

This paper employs a new approach to regulate the blood glucose level of type I diabetic patients under intensive insulin treatment. The closed-loop control scheme incorporates expert knowledge about treatment by using reinforcement learning theory to maintain the normoglycemic average of 80 mg/dl and a normal free plasma insulin concentration from a severe initial state. The insulin delivery rate is obtained off-line by using the Q-learning algorithm, without requiring an explicit model of the environment dynamics. The implementation of the insulin delivery rate therefore requires simple function evaluation and minimal online computation. Controller performance is assessed in terms of its ability to reject the effect of meal disturbances and to overcome the variability in glucose-insulin dynamics from patient to patient. Computer simulations are used to evaluate the effectiveness of the proposed technique and to show its superiority in controlling hyperglycemia over other existing algorithms.
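
The following sketch shows the flavor of the off-line Q-learning step on a deliberately crude toy glucose model; the state discretization, action set, reward and dynamics are illustrative assumptions, not the paper's patient model.

```python
# Minimal sketch: tabular Q-learning for choosing an insulin delivery rate.
import numpy as np

rng = np.random.default_rng(1)

GLUCOSE_BINS = np.array([60, 80, 100, 140, 180, 250, 400])   # mg/dl state edges
ACTIONS = np.array([0.0, 0.5, 1.0, 2.0])                     # insulin rate [U/h]
TARGET = 80.0                                                 # normoglycemic target

def step(glucose, action):
    """Toy dynamics: endogenous rise, insulin-driven decay, random meal effect."""
    meal = 30.0 if rng.random() < 0.1 else 0.0                # sporadic meal disturbance
    glucose = glucose + 2.0 + meal - 8.0 * action + rng.normal(0, 2)
    glucose = float(np.clip(glucose, 40, 400))
    reward = -abs(glucose - TARGET)                           # penalize deviation from 80 mg/dl
    return glucose, reward

def to_state(glucose):
    return int(np.digitize(glucose, GLUCOSE_BINS))

n_states, n_actions = len(GLUCOSE_BINS) + 1, len(ACTIONS)
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.95, 0.1

for episode in range(2000):                                   # offline training episodes
    glucose = rng.uniform(100, 250)
    s = to_state(glucose)
    for _ in range(200):
        a = rng.integers(n_actions) if rng.random() < eps else int(np.argmax(Q[s]))
        glucose, r = step(glucose, ACTIONS[a])
        s2 = to_state(glucose)
        Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])  # Q-learning update
        s = s2

# The learned policy is just a lookup table, so online use needs only a
# simple function evaluation, mirroring the abstract's point.
policy = ACTIONS[np.argmax(Q, axis=1)]
print("Insulin rate per glucose bin:", policy)
```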

Traffic Flow Prediction using Adaboost Algorithm with Random Forests as a Weak Learner

Traffic management and information systems, which rely on a system of sensors, aim to describe urban traffic in real time using a set of parameters and to estimate them. Though the state of the art focuses on data analysis, little is done in the sense of prediction. In this paper, we describe a machine learning system for traffic flow management and control, applied to the traffic flow prediction problem. The new algorithm is obtained by using the Random Forests algorithm as the weak learner within the AdaBoost algorithm. We show that our algorithm performs relatively well on real data and, according to the Traffic Flow Evaluation model, enables us to estimate and predict whether or not there is congestion at a given time at road intersections.
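
The core idea, boosting with Random Forests as the weak learner, can be sketched in a few lines with scikit-learn; the synthetic data below stands in for real sensor measurements, and all settings (as well as the requirement of scikit-learn >= 1.2 for the `estimator` argument) are assumptions of this example.

```python
# Minimal sketch: AdaBoost with a small Random Forest as the weak learner.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for sensor features (e.g. occupancy, mean speed, flow rate)
# with a binary "congested / not congested" label.
X, y = make_classification(n_samples=2000, n_features=8, n_informative=5,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

# Shallow, small forests keep the base learner "weak" so boosting has room to help.
weak_learner = RandomForestClassifier(n_estimators=10, max_depth=3, random_state=0)
model = AdaBoostClassifier(estimator=weak_learner, n_estimators=50,
                           learning_rate=0.5, random_state=0)
model.fit(X_train, y_train)

print("Congestion classification accuracy:",
      round(accuracy_score(y_test, model.predict(X_test)), 3))
```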

Elasto-Visco-Plastic-Damage Model for Pre-Strained 304L Stainless Steel Subjected to Low Temperature

The primary barrier of a membrane-type LNG containment system consists of corrugated 304L stainless steel. 304L is an austenitic stainless steel which shows different material behaviors owing to phase transformation during plastic work. Even though corrugated primary barriers are subjected to significant amounts of pre-strain due to press working, quantitative data on the mechanical effect of pre-straining at cryogenic temperatures are not available. In this study, tensile tests dependent on pre-strain level and pre-strain temperature are carried out to investigate the mechanical behavior. Constitutive equations with material parameters are also suggested for a verification study.

Reliability-based Selection of Wind Turbines for Large-Scale Wind Farms

This paper presents a reliability-based approach to select appropriate wind turbine types for a wind farm, considering site-specific wind speed patterns. An actual wind farm in the northern region of Iran, with one year of registered wind speeds, is studied. An analytic approach based on the total probability theorem is utilized to model the probabilistic behavior of both the turbines' availability and the wind speed. Well-known probabilistic reliability indices such as loss of load expectation (LOLE), expected energy not supplied (EENS) and incremental peak load carrying capability (IPLCC) for wind power integration in the Roy Billinton Test System (RBTS) are examined. The turbine type achieving the highest reliability level is chosen as the most appropriate for the studied wind farm.
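
A hedged sketch of the total-probability reasoning behind indices such as LOLE and EENS is given below; all capacities, probabilities and the load profile are illustrative, not the paper's RBTS or Iranian site data.

```python
# Minimal sketch: LOLE/EENS via total probability over wind states and outages.
import numpy as np
from scipy.stats import binom

n_turbines = 20
availability = 0.96            # 1 - forced outage rate per turbine
conventional = 60.0            # MW of always-available conventional capacity (assumed)

# Discretized per-turbine output implied by the site wind-speed pattern.
wind_power = np.array([0.0, 0.5, 1.2, 2.0])      # MW at each wind state
wind_prob = np.array([0.30, 0.35, 0.25, 0.10])   # probability of each state

# Hourly load stand-in (one daily profile repeated over a year).
load = np.tile(np.array([55, 52, 50, 50, 52, 58, 65, 72, 78, 80, 82, 83,
                         84, 83, 82, 80, 79, 81, 85, 88, 84, 75, 65, 58],
                        dtype=float), 365)

# Total probability: P(capacity) = P(wind state) * P(k turbines available).
k = np.arange(n_turbines + 1)
p_k = binom.pmf(k, n_turbines, availability)
lole_hours, eens_mwh = 0.0, 0.0
for p_w, prob_w in zip(wind_power, wind_prob):
    capacity = conventional + k * p_w                        # MW in each outage state
    for cap, p_state in zip(capacity, p_k * prob_w):
        shortfall = np.maximum(load - cap, 0.0)
        lole_hours += p_state * np.count_nonzero(shortfall)  # expected deficit hours
        eens_mwh += p_state * shortfall.sum()                # expected unserved energy

print(f"LOLE = {lole_hours:.1f} h/yr, EENS = {eens_mwh:.1f} MWh/yr")
```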

A Novel, Cost-effective Design to Harness Ocean Energy in the Developing Countries

The world's population continues to grow at a quarter of a million people per day, increasing the consumption of energy, and the world now faces an energy crisis. In response, the principles of renewable energy have gained popularity. Much advancement has been made in developing wind and solar energy farms across the world, but these farms are not enough to meet the world's energy requirement. This has attracted investors to procure new sources of energy as substitutes. Among these sources, the extraction of energy from waves is considered the best option; the world's oceans contain enough energy to meet global requirements. Significant advancements in design and technology are being made to turn waves into a continuous source of energy. One major hurdle in launching wave energy devices in a developing country like Pakistan is the initial cost. A simple, reliable and cost-effective wave energy converter (WEC) is required to meet the nation's energy needs. This paper presents a novel design, proposed by team SAS, for harnessing wave energy. The paper has three major sections. The first gives a brief and concise view of ocean wave creation, propagation and the energy carried by waves. The second explains the design of SAS-2; a gear-chain mechanism is used for transferring energy from the buoy to a rotary generator. The third explains the manufacturing of a scaled-down model of SAS-2, in which many modifications were made during the troubleshooting stage. The design of SAS-2 is simple and requires very little maintenance. SAS-2 is producing electricity at Clifton, and its initial cost is very low. This makes SAS-2 a cost-effective and reliable option for harnessing wave energy in developing countries.
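
For the energy carried by waves, the standard deep-water estimate of wave power per metre of crest, P = rho * g^2 * Hs^2 * Te / (64 * pi), gives a sense of the resource; the sea state in the sketch below is an assumed example, not a measurement from Clifton.

```python
# Minimal sketch: deep-water wave energy flux per metre of wave crest.
import math

def wave_power_per_metre(hs, te, rho=1025.0, g=9.81):
    """Energy flux (W per metre of crest) for significant wave height hs [m]
    and energy period te [s], using the standard deep-water approximation."""
    return rho * g**2 * hs**2 * te / (64.0 * math.pi)

if __name__ == "__main__":
    p = wave_power_per_metre(hs=1.5, te=7.0)   # assumed moderate sea state
    print(f"~{p / 1000:.1f} kW per metre of wave crest")
```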

Multi-Agent Systems Applied in the Modeling and Simulation of Biological Problems: A Case Study in Protein Folding

The multi-agent system approach has proven to be an effective and appropriate abstraction level for constructing whole models of a diversity of biological problems, integrating aspects found in both "micro" and "macro" approaches to modeling this type of phenomenon. Taking these considerations into account, this paper presents the important computational characteristics to be gathered into a novel bioinformatics framework built upon a multi-agent architecture. The version of the tool presented herein allows studying and exploring complex problems belonging principally to structural biology, such as protein folding. The bioinformatics framework is used as a virtual laboratory to explore a minimalist model of protein folding as a test case. In order to show the laboratory concept of the platform as well as its flexibility and adaptability, we studied the folding of two particular sequences, one a 45-mer and the other a 64-mer, both described by an HP model (only hydrophobic and polar residues) on a coarse-grained 2D square lattice. As discussed later in this work, these two sequences were chosen as stress cases for the platform, in order to determine which tools have to be created or improved to meet the needs of computing and analyzing a given difficult sequence. The underlying philosophy is that the continuous study of sequences itself provides important features to be added to the platform, improving its efficiency over time, as demonstrated herein.
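
A minimal sketch of the 2D-lattice HP model energy used in such a test case is given below; the short example sequence and conformation are illustrative, not the paper's 45-mer or 64-mer.

```python
# Minimal sketch: HP-model energy on the 2D square lattice, defined as minus
# the number of non-bonded hydrophobic (H-H) contacts of a self-avoiding walk.
def hp_energy(sequence, coords):
    """sequence: string of 'H'/'P'; coords: list of (x, y) lattice positions."""
    assert len(sequence) == len(coords)
    assert len(set(coords)) == len(coords), "conformation must be self-avoiding"
    occupied = {pos: i for i, pos in enumerate(coords)}
    contacts = 0
    for i, (x, y) in enumerate(coords):
        if sequence[i] != 'H':
            continue
        for nx, ny in ((x + 1, y), (x, y + 1)):      # count each pair once
            j = occupied.get((nx, ny))
            if j is not None and sequence[j] == 'H' and abs(i - j) > 1:
                contacts += 1
    return -contacts

if __name__ == "__main__":
    seq = "HPHPPHHPHH"                               # toy 10-mer
    fold = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 2),
            (1, 2), (2, 2), (2, 1), (2, 0), (3, 0)]  # a self-avoiding walk
    print("HP energy:", hp_energy(seq, fold))
```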

Financing - Scheduling Optimization for Construction Projects by using Genetic Algorithms

Investment in a constructed facility represents a cost in the short term that returns benefits only over the long-term use of the facility. Thus, the costs occur earlier than the benefits, and the owners of facilities must obtain the capital resources to finance the costs of construction. A project cannot proceed without adequate financing, and the cost of providing adequate financing can be quite large. For these reasons, attention to project finance is an important aspect of project management. Finance is also a concern to the other organizations involved in a project, such as the general contractor and material suppliers. Unless an owner immediately and completely covers the costs incurred by each participant, these organizations face financing problems of their own. At a more general level, project finance is only one aspect of the general problem of corporate finance. If numerous projects are considered and financed together, then the net cash flow requirements constitute the corporate financing problem for capital investment. Whether project finance is performed at the project or at the corporate level does not alter the basic financing problem. In this paper, we first consider facility financing from the owner's perspective, with due consideration for its interaction with the other organizations involved in a project. Later, we discuss the problems of construction financing, which are crucial to the profitability and solvency of construction contractors. The objective of this paper is to present the steps utilized to determine the best combination for minimum project financing. The proposed model considers financing, schedule and maximum net area, and is called Project Financing and Schedule Integration using Genetic Algorithms (PFSIGA). The model is intended to determine additional steps (maximum net area) for any project with a subproject. An illustrative example demonstrates the features of this technique, and model verification and testing are also addressed.
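
As a hedged illustration of the kind of encoding a finance-schedule GA such as PFSIGA might use, the sketch below evolves activity start periods and penalizes both the peak financing requirement and deadline overruns; the activities, cash flows and GA settings are assumptions of this example, and the "maximum net area" objective is not modeled.

```python
# Minimal sketch: integer-coded GA over activity start periods with a
# cash-flow-based fitness (peak overdraft plus lateness penalty).
import numpy as np

rng = np.random.default_rng(2)

# (duration in periods, cost per period, payment received at completion)
activities = [(3, 10.0, 40.0), (2, 20.0, 50.0), (4, 5.0, 30.0), (3, 15.0, 55.0)]
horizon, deadline = 16, 12

def fitness(starts):
    cash = np.zeros(horizon)
    finish = 0
    for (dur, cost, pay), s in zip(activities, starts):
        cash[s:s + dur] -= cost                  # outflows during execution
        end = s + dur
        if end < horizon:
            cash[end] += pay                     # inflow on completion
        finish = max(finish, end)
    peak_financing = -np.minimum(np.cumsum(cash), 0.0).min()  # largest overdraft
    late = max(0, finish - deadline)
    return -(peak_financing + 100.0 * late)      # GA maximizes fitness

def ga(pop_size=40, generations=60):
    max_start = horizon - max(d for d, _, _ in activities)
    pop = rng.integers(0, max_start + 1, size=(pop_size, len(activities)))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[:pop_size // 2]]                  # truncation selection
        children = parents[rng.integers(0, len(parents), pop_size)].copy()
        mutate = rng.random(children.shape) < 0.2             # per-gene mutation
        children[mutate] = rng.integers(0, max_start + 1, mutate.sum())
        children[0] = pop[order[0]]                           # elitism
        pop = children
    scores = np.array([fitness(ind) for ind in pop])
    return pop[int(np.argmax(scores))]

best = ga()
print("Start periods:", best, " fitness:", round(fitness(best), 2))
```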

Survey of Impact of Production and Adoption of Nanocrops on Food Security

The perspective on food security in the 21st century shows that food production faces the vital problem of shortage. Food security strategy is a long-term method applied to assess the food required. Meanwhile, the nanotechnology revolution is changing the face of the world. Nanotechnology is a method whose characteristics can be utilized to decrease environmental problems and possibly improve small farmers' access to food. This article examines the impact of the production and adoption of nanocrops on food security. The study population consisted of researchers at the agricultural research center of Esfahan province. The results show that there was a relationship between food security and the use, conversion, distribution and production of nanocrops, operative human resources, operative circumstances, and constraints on the usage of nanocrops. Multivariate regression analysis using the enter model shows that operative circumstances, use, production and constraints on the usage of nanocrops had a positive impact on food security and, in four steps, determine 20 percent of it.

Augmenting Use Case View for Modeling

Mathematical, graphical and intuitive models are often constructed in the development process of computational systems. The Unified Modeling Language (UML) is one of the most popular modeling languages used by practicing software engineers. This paper critically examines UML models and suggests an augmented use case view with the addition of new constructs for modeling software. It also shows how a use case diagram can be enhanced. The improved modeling constructs are presented with examples for clarifying important design and implementation issues.

Quantification of Peptides based on Isotope Dilution Surface Enhanced Raman Scattering

This study aims to demonstrate the quantification of peptides based on isotope dilution surface-enhanced Raman scattering (IDSERS). SERS spectra of phenylalanine (Phe), leucine (Leu) and two peptide sequences, TGQIFK (T13) and YSFLQNPQTSLCFSESIPTPSNR (T6), both part of the 22-kDa human growth hormone (hGH), were obtained on Ag-nanoparticle-covered substrates. On the basis of the dominant Phe and Leu vibrational modes, precise partial least squares (PLS) prediction models were built, enabling the determination of unknown T13 and T6 concentrations. Detection of hGH at its physiological concentration was achieved in order to investigate the possibility of protein quantification.
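
The calibration step can be sketched with a partial least squares regression mapping spectra to concentration; the synthetic Gaussian bands below stand in for real SERS spectra, the band positions and settings are assumptions, and the isotope-dilution internal-standard step itself is omitted.

```python
# Minimal sketch: PLS calibration from spectra to peptide concentration.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
wavenumbers = np.linspace(600, 1700, 400)         # Raman shift axis (cm^-1)

def synthetic_spectrum(conc):
    """Peptide-like spectrum: two bands whose intensity scales with concentration."""
    def band(center, width):
        return np.exp(-0.5 * ((wavenumbers - center) / width) ** 2)
    signal = conc * (1.0 * band(1003, 8) + 0.6 * band(1340, 12))  # assumed band positions
    return signal + rng.normal(0, 0.02, wavenumbers.size)         # measurement noise

concentrations = rng.uniform(0.1, 5.0, 120)       # arbitrary concentration units
X = np.vstack([synthetic_spectrum(c) for c in concentrations])
X_train, X_test, y_train, y_test = train_test_split(X, concentrations,
                                                    test_size=0.25, random_state=0)

pls = PLSRegression(n_components=3)
pls.fit(X_train, y_train)
pred = pls.predict(X_test).ravel()
rmse = np.sqrt(np.mean((pred - y_test) ** 2))
print(f"PLS prediction RMSE: {rmse:.3f} (concentration units)")
```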

Influence of the Entropic Parameter on the Flow Geometry and Morphology

The need to update the inputs of numerical models, because of geometrical and resistance variations in rivers subject to sediment transport phenomena, requires detailed control and monitoring activities. The human and financial resources these activities demand move the research towards the development of expeditious methodologies able to evaluate the outflows through the measurement of more easily acquirable quantities. Recent studies have highlighted the dependence of the entropic parameter on the kinematic and geometrical flow conditions, showing a meaningful variability according to the section shape, dimension and slope. Such dependences, even if not yet well defined, could reduce the difficulties of field activities as well as the data-elaboration time. On the basis of such evidence, the relationships between the entropic parameter and the geometrical and resistance quantities, obtained through a large and detailed laboratory campaign on steady free-surface flows under macro and intermediate homogeneous roughness conditions, are analyzed and discussed.
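
The entropic parameter in such studies is commonly tied to the ratio of mean to maximum velocity through Chiu's classical relation, u_mean/u_max = e^M/(e^M - 1) - 1/M; the sketch below inverts this relation for an assumed measured ratio and is not taken from the paper's laboratory data.

```python
# Minimal sketch: invert Chiu's entropy-based velocity-ratio relation for M.
import numpy as np
from scipy.optimize import brentq

def velocity_ratio(m):
    """u_mean / u_max as a function of the entropic parameter M."""
    return np.exp(m) / (np.exp(m) - 1.0) - 1.0 / m

def entropic_parameter(ratio):
    """Solve the relation for M given a measured u_mean / u_max (0.5 < ratio < 1)."""
    return brentq(lambda m: velocity_ratio(m) - ratio, 1e-6, 50.0)

if __name__ == "__main__":
    ratio = 0.67                      # assumed example from a gauged section
    print(f"Entropic parameter M = {entropic_parameter(ratio):.2f}")
```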

Validation and Selection between Machine Learning Technique and Traditional Methods to Reduce Bullwhip Effects: a Data Mining Approach

The aim of this paper is to present a three-step methodology to forecast supply chain demand. In the first step, various data mining techniques are applied in order to prepare the data for entry into the forecasting models. In the second, modeling step, an artificial neural network and a support vector machine are presented, after defining the Mean Absolute Percentage Error (MAPE) index for measuring error. The structure of the artificial neural network is selected based on previous researchers' results, and in this article the accuracy of the network is increased by using sensitivity analysis. The best forecast among the classical forecasting methods (Moving Average, Exponential Smoothing, and Exponential Smoothing with Trend) is obtained from the prepared data, and this forecast is compared with the results of the support vector machine and the proposed artificial neural network. The results show that the artificial neural network forecasts more precisely than the other methods. Finally, the stability of the forecasting methods is analyzed by using the raw data, and the effectiveness of clustering analysis is also measured.
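
A hedged sketch of such a comparison, exponential smoothing versus a machine learning regressor scored with MAPE, is shown below on a synthetic demand series; the data, lag features and model settings are assumptions, and the paper's own data preparation and network design are not reproduced.

```python
# Minimal sketch: classical exponential smoothing vs. SVR, compared with MAPE.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(4)

def mape(actual, forecast):
    """Mean Absolute Percentage Error in percent."""
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Synthetic monthly demand with trend, seasonality and noise.
t = np.arange(120)
demand = 200 + 1.5 * t + 25 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 8, t.size)

# Classical benchmark: simple exponential smoothing, one-step-ahead forecasts.
alpha, level = 0.3, demand[0]
ses = []
for y in demand[:-1]:
    level = alpha * y + (1 - alpha) * level
    ses.append(level)                 # ses[i] is the forecast for demand[i + 1]
ses = np.array(ses)

# Machine learning benchmark: support vector regression on the last 12 values.
lags, split = 12, 96
X = np.array([demand[i - lags:i] for i in range(lags, t.size)])
y = demand[lags:]
svr = make_pipeline(StandardScaler(), SVR(C=100.0, epsilon=1.0))
svr.fit(X[:split - lags], y[:split - lags])
pred = svr.predict(X[split - lags:])

print("Exponential smoothing MAPE: %.2f%%" % mape(demand[split:], ses[split - 1:]))
print("SVR                   MAPE: %.2f%%" % mape(demand[split:], pred))
```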

Performance Analysis of Parallel Client-Server Model Versus Parallel Mobile Agent Model

Mobile agents have motivated the creation of a new methodology for parallel computing. We introduce a methodology for the creation of parallel applications on the network. The proposed mobile-agent parallel processing framework uses multiple Java mobile agents, each of which can travel to a specified machine in the network to perform its tasks. We also introduce the concept of a master agent, which is a Java object capable of implementing a particular task of the target application; the master agent dynamically assigns tasks to the mobile agents. We have developed and tested a prototype application: Mobile Agent Based Parallel Computing. Boosted by the inherited benefits of using Java and mobile agents, our proposed methodology breaks the barriers between environments and could potentially exploit, in a parallel manner, all the available computational resources on the network. This paper elaborates the performance issues of a mobile agent for parallel computing.

The Use of Artificial Neural Network in Option Pricing: The Case of S and P 100 Index Options

Due to the increasing and varying risks that economic units face, derivative instruments have gained substantial importance, and trading volumes of derivatives have reached very significant levels. In parallel with these high trading volumes, researchers have developed many different models, some parametric and some nonparametric. In this study, the aim is to analyse the success of artificial neural networks in the pricing of options using S&P 100 index options data. Previous studies have generally covered data on European-type call options; this study includes not only European call options but also American call and put options and European put options. Three data sets are used to build three different ANN models. One includes only data that are directly observed from the economic environment, i.e. strike price, spot price, interest rate, maturity and type of contract. The others include an extra input that is not observable data but a parameter, i.e. volatility. With these detailed data, the performance of the ANN is analyzed along the put/call, American/European and moneyness dimensions, and whether the inclusion of volatility as an input improves prediction performance is examined. The most striking results revealed by the study are that the ANN performs better when pricing call options than put options, and that the use of the volatility parameter as an input does not improve performance.
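
A hedged sketch of this kind of experiment is shown below: a multilayer perceptron is trained on option features with and without the volatility input. For a self-contained example, the target prices are generated from the Black-Scholes formula rather than the S&P 100 market data used in the study, so the printed errors illustrate the workflow only, not the paper's findings.

```python
# Minimal sketch: ANN option pricing with and without a volatility input.
import numpy as np
from scipy.stats import norm
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)

def bs_call(s, k, t, r, sigma):
    """Black-Scholes European call price, used only to generate target prices."""
    d1 = (np.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * np.sqrt(t))
    d2 = d1 - sigma * np.sqrt(t)
    return s * norm.cdf(d1) - k * np.exp(-r * t) * norm.cdf(d2)

n = 5000
spot = rng.uniform(80, 120, n)
strike = rng.uniform(80, 120, n)
maturity = rng.uniform(0.05, 1.0, n)        # years
rate = rng.uniform(0.01, 0.05, n)
vol = rng.uniform(0.1, 0.4, n)
price = bs_call(spot, strike, maturity, rate, vol)

def rmse_of_ann(features):
    X = np.column_stack(features)
    X_tr, X_te, y_tr, y_te = train_test_split(X, price, test_size=0.25, random_state=0)
    net = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                                     random_state=0))
    net.fit(X_tr, y_tr)
    return np.sqrt(np.mean((net.predict(X_te) - y_te) ** 2))

# With and without the volatility input, mirroring the abstract's question.
print("RMSE without volatility:", round(rmse_of_ann([spot, strike, maturity, rate]), 3))
print("RMSE with volatility:   ", round(rmse_of_ann([spot, strike, maturity, rate, vol]), 3))
```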

Water Demand Prediction for Touristic Mecca City in Saudi Arabia using Neural Networks

Saudi Arabia is an arid country which depends on costly desalination plants to satisfy its growing residential water demand. Prediction of water demand is usually a challenging task because the forecast model should consider variations in economic progress, climate conditions and population growth. The task is further complicated by the fact that Mecca city is visited by large numbers of people during specific months of the year due to religious occasions. In this paper, a neural network model is proposed to handle the prediction of the monthly and yearly water demand for Mecca city, Saudi Arabia. The proposed model is developed based on historic records of water production and the estimated distribution of visitors. The driving variables for the model include annually-varying variables such as household income, household density and city population, and monthly-varying variables such as the expected number of visitors each month and the maximum monthly temperature.

Dynamic Modeling and Simulation of Heavy Paraffin Dehydrogenation Reactor for Selective Olefin Production in Linear Alkyl Benzene Production Plant

Modeling of a heterogeneous industrial fixed-bed reactor for the selective dehydrogenation of heavy paraffin over a Pt-Sn-Al2O3 catalyst is the subject of the current study. By applying mass and momentum balances to an appropriate element of the reactor and using pressure drop, rate and deactivation equations, a detailed model of the reactor has been obtained. Mass balance equations have been written for five different components. In order to estimate reactor production over time, the reactor model, which is a set of partial differential equations, ordinary differential equations and algebraic equations, has been solved numerically. The mole percentages of paraffins, olefins, dienes, aromatics and hydrogen as functions of time and reactor radius have been found by numerical solution of the model. The results of the model have been compared with industrial reactor data at different operation times, and the comparison successfully confirms the validity of the proposed model.
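
A drastically reduced sketch of such a model is given below: a pseudo-homogeneous plug-flow balance with a lumped paraffin to olefin to diene scheme and exponential catalyst deactivation, solved with an ODE integrator. The kinetic constants, velocity and bed length are assumptions for illustration; the paper's five-component PDE/ODE/algebraic system is far more detailed.

```python
# Minimal sketch: plug-flow dehydrogenation profile with catalyst deactivation.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 0.010, 0.002        # rate constants [1/s]: paraffin->olefin, olefin->diene
kd = 2.0e-7                  # lumped catalyst deactivation constant [1/s]
u = 0.5                      # superficial velocity [m/s]
L = 6.0                      # bed length [m]

def outlet_composition(t_onstream):
    """Integrate mole-fraction balances along the bed at a given time on stream."""
    a = np.exp(-kd * t_onstream)                 # catalyst activity at this time
    def rhs(z, y):
        yp, yo, yd = y                           # paraffin, olefin, diene fractions
        r1, r2 = a * k1 * yp, a * k2 * yo
        return [-r1 / u, (r1 - r2) / u, r2 / u]
    sol = solve_ivp(rhs, (0.0, L), [1.0, 0.0, 0.0])
    return sol.y[:, -1]                          # composition at the bed outlet

for days in (0, 20, 60):
    yp, yo, yd = outlet_composition(days * 86400.0)
    print(f"day {days:3d}: paraffin {yp:.3f}, olefin {yo:.3f}, diene {yd:.3f}")
```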