Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator

Verification and Validation of simulated process models is the most important phase of the simulator life cycle. Evaluation of simulated process models based on Verification and Validation techniques checks the closeness of each component model (in a simulated network) to the real system/process with respect to dynamic behaviour under steady-state and transient conditions. The Verification and Validation process helps qualify the process simulator for its intended purpose, whether that is comprehensive training or design verification. In general, model verification is carried out by comparing simulated component characteristics with the original requirements to ensure that each step in the model development process completely incorporates all the design requirements. Validation testing is performed by comparing the simulated process parameters with the actual plant process parameters, either in standalone mode or in integrated mode. A full-scope replica operator training simulator for the Prototype Fast Breeder Reactor (PFBR), named KALBR-SIM (Kalpakkam Breeder Reactor Simulator), has been developed at IGCAR, Kalpakkam, India, with the main participants being engineers/experts from the modelling, process design, and instrumentation and control design teams. This paper discusses the Verification and Validation process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, and the reference documents and standards used. It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on the experts' comments, final qualification of the simulator for the intended purpose, and the difficulties faced while coordinating the various activities.
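As a minimal illustration of the validation comparison described above, the hedged sketch below checks a simulated parameter trace against reference plant/design data using an assumed acceptance band; the tolerance value and the data are hypothetical and are not taken from the KALBR-SIM evaluation.

```python
import numpy as np

def validate_parameter(simulated, reference, tolerance=0.02):
    """Compare a simulated process parameter against plant/design reference data.

    Returns the maximum relative deviation and whether it stays within the
    acceptance tolerance (an assumed +/-2% band, for illustration only).
    """
    simulated = np.asarray(simulated, dtype=float)
    reference = np.asarray(reference, dtype=float)
    rel_dev = np.abs(simulated - reference) / np.maximum(np.abs(reference), 1e-9)
    return rel_dev.max(), bool((rel_dev <= tolerance).all())

# Hypothetical steady-state samples for a single parameter (e.g. a loop temperature)
sim = [547.1, 547.3, 547.0, 546.8]
ref = [547.0, 547.0, 547.0, 547.0]
max_dev, accepted = validate_parameter(sim, ref)
print(f"max relative deviation = {max_dev:.4f}, accepted = {accepted}")
```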

A β-mannanase from Fusarium oxysporum SS-25 via Solid State Fermentation on Brewer’s Spent Grain: Medium Optimization by Statistical Tools, Kinetic Characterization and Its Applications

This study concerns the optimization of fermentation parameters for the hyper-production of mannanase from Fusarium oxysporum SS-25 employing a two-step statistical strategy, together with the kinetic characterization of the crude enzyme preparation. The Plackett-Burman design used to screen the important factors in the culture medium revealed 20% (w/w) wheat bran, 2% (w/w) each of potato peels, soyabean meal and malt extract, 1% tryptone, 0.14% NH4SO4, 0.2% KH2PO4, 0.0002% ZnSO4, 0.0005% FeSO4, 0.01% MnSO4, 0.012% SDS, 0.03% NH4Cl and 0.1% NaNO3 in a brewer's spent grain based medium with 50% moisture content, inoculated with 2.8×10⁷ spores and incubated at 30 °C for 6 days, to be the main parameters influencing enzyme production. Of these factors, four variables, namely soyabean meal, FeSO4, MnSO4 and NaNO3, were chosen to study their interactive effects and optimum levels using a central composite design of response surface methodology, giving a final mannanase yield of 193 IU/gds. The kinetic characterization revealed the crude enzyme to be active over a broad temperature and pH range. When used to treat wheat straw based kraft pulp, the enzyme gave a 26.6% reduction in kappa number, a 4.93% higher tear index and a 1% increase in brightness. The hydrolytic potential of the enzyme was also demonstrated on both locust bean gum and guar gum.
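As a hedged illustration of the Plackett-Burman screening step, the sketch below estimates main effects from a two-level coded design; the design fragment, run yields and factor labels are hypothetical stand-ins, not the study's data.

```python
import numpy as np

# Each column is a factor coded at -1/+1, each row a fermentation run.  The
# main effect of a factor is the difference between the mean responses at its
# high and low levels.  Columns are a fragment of an 8-run Plackett-Burman
# design; yields are invented for illustration (e.g. IU/gds per run).
design = np.array([
    [+1, +1, +1, -1],
    [-1, +1, +1, +1],
    [-1, -1, +1, +1],
    [+1, -1, -1, +1],
    [-1, +1, -1, -1],
    [+1, -1, +1, -1],
    [+1, +1, -1, +1],
    [-1, -1, -1, -1],
])
yields = np.array([152.0, 118.0, 131.0, 97.0, 105.0, 126.0, 143.0, 88.0])
factors = ["soyabean meal", "FeSO4", "MnSO4", "NaNO3"]

for name, column in zip(factors, design.T):
    effect = yields[column == +1].mean() - yields[column == -1].mean()
    print(f"{name:14s} main effect = {effect:+.1f}")
```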

Towards an Intelligent Ontology Construction Cost Estimation System: Using BIM and New Rules of Measurement Techniques

Construction cost estimation is one of the most important aspects of construction project design. For generations, the process of cost estimating has been manual, time-consuming and error-prone. This has contributed to many cost estimates being unclear and riddled with inaccuracies that at times lead to over- or underestimation of construction cost. The development of standard sets of measurement rules that are understandable by all those involved in a construction project has not fully solved these challenges. Emerging Building Information Modelling (BIM) technologies can exploit standard measurement methods to automate the cost estimation process and improve accuracy. This requires standard measurement methods to be structured in an ontological, machine-readable format so that BIM software packages can easily read them. Most standard measurement methods are still text-based in textbooks and require manual transfer into tables or spreadsheets during cost estimation. The aim of this study is to explore the development of an ontology based on the New Rules of Measurement (NRM) commonly used in the UK for cost estimation. The methodology adopted is Methontology, one of the most widely used ontology engineering methodologies. The challenges encountered in this exploratory study are also reported and recommendations for future studies are proposed.
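To make the idea of a machine-readable measurement rule concrete, the hedged sketch below encodes one hypothetical NRM-style work item as RDF triples with rdflib; the namespace, property names and rule text are illustrative assumptions, not the ontology developed in the study.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

# Hypothetical namespace and terms used purely for illustration.
NRM = Namespace("http://example.org/nrm#")

g = Graph()
g.bind("nrm", NRM)

# A measurement rule: substructure in-situ concrete is measured in cubic metres.
g.add((NRM.SubstructureConcrete, RDF.type, NRM.WorkItem))
g.add((NRM.SubstructureConcrete, RDFS.label, Literal("Substructure: in-situ concrete")))
g.add((NRM.SubstructureConcrete, NRM.unitOfMeasurement, Literal("m3")))
g.add((NRM.SubstructureConcrete, NRM.measurementRule,
       Literal("Measure the volume of concrete placed, excluding formwork.")))

print(g.serialize(format="turtle"))
```

A BIM tool could then query such triples (e.g. with SPARQL) to pick the correct unit and rule for each model element instead of consulting a text-based rulebook.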

Retrieval of User Specific Images Using Semantic Signatures

Image search engines rely on the surrounding textual keywords for the retrieval of images. It is difficult for search engines such as Google and Bing to interpret the user's search intention and provide the desired results. Recent research also indicates that Google image search does not work well on all images. This has led to the emergence of efficient image retrieval techniques that interpret the user's search intention and show the desired results. To accomplish this task, an efficient image re-ranking framework is required, and a new image re-ranking framework is evaluated in this paper. The implemented framework retrieves images from the image dataset and re-ranks them based on the user's desired images. It operates in two stages, an offline stage and an online stage. In the offline stage, the re-ranking framework learns different semantic spaces (reference classes) for diverse user query keywords. Semantic signatures are generated by combining the textual and visual features of the images. In the online stage, images are re-ranked by comparing the semantic signatures obtained from the reference classes of the user-specified query keyword. This re-ranking methodology increases image retrieval efficiency and makes the results more relevant to the user.
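The sketch below illustrates one plausible form of the online comparison step: candidate images are re-ranked by the cosine similarity of their semantic signatures to that of a query image. The signature dimensionality, values and choice of cosine similarity are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def rerank_by_semantic_signature(query_signature, candidate_signatures):
    """Re-rank candidate images by cosine similarity of their semantic
    signatures to the signature of the user-selected query image."""
    q = np.asarray(query_signature, dtype=float)
    C = np.asarray(candidate_signatures, dtype=float)
    sims = C @ q / (np.linalg.norm(C, axis=1) * np.linalg.norm(q) + 1e-12)
    order = np.argsort(-sims)           # most similar first
    return order, sims[order]

# Hypothetical 4-dimensional signatures over reference classes
query = [0.7, 0.1, 0.1, 0.1]
candidates = [[0.6, 0.2, 0.1, 0.1],    # image 0
              [0.1, 0.1, 0.7, 0.1],    # image 1
              [0.8, 0.0, 0.1, 0.1]]    # image 2
order, scores = rerank_by_semantic_signature(query, candidates)
print(order, scores.round(3))
```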

Facebook Spam and Spam Filter Using Artificial Neural Networks

Spam is any unwanted electronic message or material, in any form, posted to many people. As the world becomes increasingly connected, social networking sites play an important role by providing people from different parts of the world a platform to meet and express their views. Among the different social networking sites, Facebook has become the leading one. With the increase in usage, some users have started abusing Facebook by posting spam or creating new ways to post it. This paper highlights the types of spam Facebook users face today. It also explains how users become victims of spam attacks. A methodology is proposed at the end that discusses how to handle the different types of spam.
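As a hedged sketch of the kind of neural-network spam filter discussed, the example below trains a small feed-forward network on bag-of-words features of a few toy posts; the data, feature choice and scikit-learn model are illustrative assumptions rather than the paper's implementation.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.neural_network import MLPClassifier

# Toy training posts and labels (1 = spam, 0 = legitimate); a real filter
# would be trained on a labelled corpus of Facebook posts.
posts = [
    "win a free iphone click this link now",
    "limited offer earn money from home fast",
    "had a great time with friends at the beach",
    "looking forward to the conference next week",
]
labels = [1, 1, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(posts)

# A small feed-forward neural network as the spam/not-spam classifier.
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X, labels)

test = vectorizer.transform(["click here to win free money"])
print("spam" if model.predict(test)[0] == 1 else "not spam")
```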

Does Material Choice Drive Sustainability of 3D Printing?

Environmental impacts of six 3D printers using various materials were compared to determine whether material choice drives sustainability, or whether other factors such as machine type, machine size, or machine utilization dominate. Cradle-to-grave life-cycle assessments were performed, comparing a commercial-scale FDM machine printing in ABS plastic, a desktop FDM machine printing in ABS, a desktop FDM machine printing in PET and PLA plastics, a polyjet machine printing in its proprietary polymer, an SLA machine printing in its polymer, and an inkjet machine hacked to print in salt and dextrose. All scenarios were scored using the ReCiPe Endpoint H methodology to combine multiple impact categories, comparing environmental impacts per part made for several scenarios per machine. Results showed that most printers' ecological impacts were dominated by electricity use, not materials, and that the changes in electricity use due to different plastics were not significant compared to the variation from one machine to another. Variation in machine idle time determined impacts per part most strongly. However, material impacts were quite important for the inkjet printer hacked to print in salt: in its optimal scenario, it had as little as 1/38th the impact per part of the worst-performing machine in the same scenario. If salt parts were infused with epoxy to make them more physically robust, much of this advantage disappeared, and material impacts dominated or equaled electricity use. Future studies should also assess DMLS and SLS processes and materials.

Pilot Scale Production and Compatibility Criteria of New Self-Cleaning Materials

The paper covers a chain of activities from synthesis and the establishment of the methodology for characterization and testing of novel protective materials, through pilot production, to application on model supports. It summarizes the results regarding the development of the pilot production protocol for newly developed self-cleaning materials. The production parameters were optimized in order to improve the most important functional properties (mineralogical characteristics, particle size, self-cleaning properties and photocatalytic activity) of the newly designed nanocomposite material.

Microwave-Assisted Alginate Extraction from Portuguese Saccorhiza polyschides – Influence of Acid Pretreatment

Brown seaweeds are abundant along the Portuguese coastline and represent an almost unexploited marine economic resource. One of the most common species, easily available for harvesting on the northwest coast, is Saccorhiza polyschides, which grows on the lower shore and coastal rocky reefs. It is almost exclusively used by local farmers as a natural fertilizer, but it contains a substantial amount of valuable compounds, particularly alginates, natural biopolymers of high interest for many industrial applications. Alginates are natural polysaccharides present in the cell walls of brown seaweed; they are highly biocompatible, with particular properties that make them of high interest to the food, biotechnology, cosmetics and pharmaceutical industries. Conventional extraction processes are based on thermal treatment; they are lengthy and consume large amounts of energy and solvents. In recent years, microwave-assisted extraction (MAE) has shown enormous potential to overcome the major drawbacks of conventional (thermal and/or solvent based) plant material extraction techniques, and has also been successfully applied to the extraction of agar, fucoidans and alginates. In the present study, the acid pretreatment of the brown seaweed Saccorhiza polyschides for subsequent MAE of alginate was optimized. Seaweeds were collected in Northwest Portuguese coastal waters of the Atlantic Ocean between May and August 2014. Experimental design was used to assess the effect of temperature and acid pretreatment time on alginate extraction. Response surface methodology allowed the determination of the optimum MAE conditions: 40 mL of 0.1 M HCl per g of dried seaweed with constant stirring at 20 °C for 14 h. The optimal acid pretreatment conditions significantly enhanced MAE of alginates from Saccorhiza polyschides, thus contributing to the development of a viable, more environmentally friendly alternative to conventional processes.
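As a hedged illustration of the response-surface step, the sketch below fits a quadratic model to a small two-factor (coded temperature and pretreatment time) central composite design and reads off the predicted optimum; the design points and alginate yields are hypothetical, not the experimental data.

```python
import numpy as np

# Coded (temperature, time) levels of a small central composite design and
# invented alginate yields (%) for illustration only.
X = np.array([
    [-1, -1], [1, -1], [-1, 1], [1, 1],
    [0, 0], [0, 0], [-1.41, 0], [1.41, 0], [0, -1.41], [0, 1.41],
])
y = np.array([12.1, 10.5, 14.8, 13.0, 15.9, 16.1, 13.2, 11.0, 12.5, 15.2])

t, h = X[:, 0], X[:, 1]
A = np.column_stack([np.ones_like(t), t, h, t * h, t**2, h**2])  # quadratic model
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

# Evaluate the fitted surface on a grid and report the predicted optimum.
grid = np.linspace(-1.5, 1.5, 61)
T, H = np.meshgrid(grid, grid)
Z = (coeffs[0] + coeffs[1]*T + coeffs[2]*H + coeffs[3]*T*H
     + coeffs[4]*T**2 + coeffs[5]*H**2)
i, j = np.unravel_index(Z.argmax(), Z.shape)
print(f"predicted optimum at coded temperature={T[i, j]:.2f}, time={H[i, j]:.2f}")
```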

Determining a Suitable Maintenance Measure for Gentelligent Components Using Case-Based Reasoning

Components with sensory properties, such as the gentelligent components developed at the Collaborative Research Centre 653, open up new possibilities for fully utilizing the remaining service life as well as for preventive maintenance. The developed methodology of component-status-driven maintenance analyzes the stress data obtained during the component's useful life and, on the basis of this knowledge, assesses the type of maintenance required. The procedure is derived from the case-based reasoning method and is explained in detail. The method's functionality is demonstrated with real-life data obtained during test runs of a racing car prototype.
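The sketch below shows the retrieval step of case-based reasoning in its simplest form: the stored case whose stress-feature vector is closest to the new component state supplies the suggested maintenance measure. The features, case base and measures are hypothetical placeholders, not the project's data.

```python
import numpy as np

# Each stored case pairs a normalized stress-feature vector (e.g. load cycles,
# peak stress, mean amplitude) with the maintenance measure chosen back then.
case_features = np.array([
    [0.2, 0.3, 0.1],
    [0.8, 0.7, 0.6],
    [0.5, 0.9, 0.4],
])
case_measures = ["continue operation", "replace component", "inspect at next pit stop"]

def retrieve_measure(new_case):
    """Return the maintenance measure of the most similar stored case."""
    d = np.linalg.norm(case_features - np.asarray(new_case, dtype=float), axis=1)
    return case_measures[int(d.argmin())]

print(retrieve_measure([0.55, 0.85, 0.45]))   # -> "inspect at next pit stop"
```

A full CBR cycle would also adapt the retrieved solution and retain the new case, extending the case base over the component's life.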

Impact of Masonry Joints on Detection of Humidity Distribution in Aerated Concrete Masonry Constructions by Electric Impedance Spectrometry Measurements

Aerated concrete is a load-bearing construction material with high heat insulation performance. Walls can be erected from aerated concrete masonry and, under ideal circumstances, additional heat insulation is not required. The most common problem affecting the heat insulation properties of aerated concrete is the humidity distribution throughout the cross section of the masonry elements, together with a properly conducted drying process of the construction, because only dry aerated concrete masonry constructions can reach high heat insulation performance. To monitor the drying process of the masonry and detect the humidity distribution throughout the cross section of an aerated concrete masonry construction, electrical impedance spectrometry is applied. The test results and the methodology of this non-destructive testing method are described in this paper.

A Collaborative Platform for Multilingual Ontology Development

Ontologies provide a common understanding of a specific domain of interest that can be communicated between people and used as background knowledge for automated reasoning in a wide range of applications. In this paper, we address the design of multilingual ontologies following well-defined knowledge engineering methodologies with the support of novel collaborative development approaches. In particular, we present a collaborative platform which allows ontologies to be developed incrementally in multiple languages. This is made possible via an appropriate mapping between language-independent concepts and one lexicalization per language (or a lexical gap in case such a lexicalization does not exist). The collaborative platform has been designed to support the development of the Universal Knowledge Core, a multilingual ontology currently available in English, Italian, Chinese, Mongolian, Hindi and Bangladeshi. Its design follows a workflow-based development methodology that models resources as a set of collaborative objects and assigns customizable workflows to build and maintain each collaborative object in a community-driven manner, with extensive support for modern Web 2.0 social and collaborative features.
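A hedged sketch of the concept-to-lexicalization mapping described above is given below: a language-independent concept holds at most one lexicalization per language, with an explicit marker for lexical gaps. The class, identifiers and example words are illustrative assumptions, not the Universal Knowledge Core's actual data model.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class Concept:
    """A language-independent concept with one lexicalization per language;
    None records an explicit lexical gap for that language."""
    concept_id: str
    gloss: str
    lexicalizations: Dict[str, Optional[str]] = field(default_factory=dict)

    def lexicalize(self, lang: str, word: Optional[str]) -> None:
        """Attach the lexicalization for `lang`, or None to mark a lexical gap."""
        self.lexicalizations[lang] = word

sibling = Concept("c:00123", "a brother or sister")
sibling.lexicalize("en", "sibling")
sibling.lexicalize("it", None)      # assumed lexical gap: no single-word equivalent
sibling.lexicalize("hi", "सहोदर")

for lang, word in sibling.lexicalizations.items():
    print(lang, word if word is not None else "<lexical gap>")
```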

Factors That Affect the Effectiveness of Enterprise Architecture Implementation Methodology

Enterprise Architecture (EA) is a strategy employed by enterprises in order to align their business and Information Technology (IT). EA is managed, developed, and maintained through an Enterprise Architecture Implementation Methodology (EAIM). The effectiveness of EA implementation is the degree to which EA helps to achieve the collective goals of the organization. This paper analyzes the results of a survey that explores the factors affecting the effectiveness of EAIM, and specifically the relationship between these factors and the effectiveness of the output and functionality of an EA project. The exploratory factor analysis highlights a specific set of five factors: alignment, adaptiveness, support, binding, and innovation. The regression analysis shows that there is a statistically significant and positive relationship between each of the five factors and the effectiveness of EAIM. Consistent with theory and practice, the most prominent factor for developing an effective EAIM is innovation. The findings contribute to measuring the effectiveness of EA implementation projects, by providing an indication of the implementation measurement approaches used by enterprise architects, and to developing an effective EAIM.
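The sketch below mirrors the regression step in spirit: effectiveness scores are regressed on the five factor scores by ordinary least squares. The survey responses are randomly generated stand-ins and the weights are invented for illustration; they are not the study's data or results.

```python
import numpy as np

# Synthetic Likert-style factor scores for 60 hypothetical respondents.
rng = np.random.default_rng(0)
n = 60
factors = rng.uniform(1, 5, size=(n, 5))
true_weights = np.array([0.3, 0.2, 0.25, 0.15, 0.45])   # invented generating weights
effectiveness = factors @ true_weights + rng.normal(0, 0.3, n)

X = np.column_stack([np.ones(n), factors])               # add intercept
coeffs, *_ = np.linalg.lstsq(X, effectiveness, rcond=None)

names = ["intercept", "alignment", "adaptiveness", "support", "binding", "innovation"]
for name, b in zip(names, coeffs):
    print(f"{name:12s} {b:+.3f}")
```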

A Framework for Evaluation of Enterprise Architecture Implementation Methodologies

Enterprise Architecture (EA) implementation methodologies have become an important part of EA projects. Several implementation methodologies have been proposed, as theoretical and practical approaches, to facilitate and support the development of EA within an enterprise. A significant question when starting an EA implementation is which methodology to utilize. In order to answer this question, a framework with several criteria is applied in this paper for the comparative analysis of existing EA implementation methodologies. Five EA implementation methodologies, namely EAP, TOGAF, DODAF, Gartner, and FEA, are selected and compared using the proposed framework. The results of the comparison indicate that these methodologies have not reached sufficient maturity as a whole, owing to a lack of consideration of requirement management, maintenance, the continuum, and the complexities in their processes. The framework can also be used to evaluate any other kind of EA implementation methodology.
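A hedged sketch of how such a criteria-based comparison can be tallied is shown below as a weighted scoring matrix; the criteria, weights and scores are illustrative placeholders and do not reproduce the paper's framework or findings.

```python
# Each methodology is scored against the framework's criteria (0 = not
# addressed .. 3 = fully addressed) and a weighted total is computed.
criteria_weights = {
    "requirement management": 0.3,
    "maintenance": 0.25,
    "continuum": 0.2,
    "process complexity handling": 0.25,
}
scores = {
    "EAP":     {"requirement management": 1, "maintenance": 1, "continuum": 1, "process complexity handling": 2},
    "TOGAF":   {"requirement management": 2, "maintenance": 2, "continuum": 2, "process complexity handling": 1},
    "DODAF":   {"requirement management": 1, "maintenance": 1, "continuum": 2, "process complexity handling": 1},
    "Gartner": {"requirement management": 1, "maintenance": 2, "continuum": 1, "process complexity handling": 2},
    "FEA":     {"requirement management": 2, "maintenance": 1, "continuum": 2, "process complexity handling": 1},
}

for method, s in scores.items():
    total = sum(criteria_weights[c] * v for c, v in s.items())
    print(f"{method:8s} weighted score = {total:.2f} / 3.00")
```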

From Traditional to Applied: A Case Study in Industrial Engineering Curriculum

Applied industrial engineering is concerned with imparting employable skills that improve the productivity of current products and services. The purpose of this case study is to present the results of an initial research study conducted to identify the desired professional characteristics of an industrial engineer with an undergraduate degree and the emerging topic areas that should be incorporated into the curriculum to prepare industrial engineering (IE) graduates for the future workforce. Conclusions and recommendations for an applied industrial engineering syllabus have been gathered and reported. A two-pronged approach was taken, which included benchmarking by comparing the applied industrial engineering curricula of various universities and an industry survey to identify job market requirements. This methodology produced an analysis of the changing nature of industrial engineering, from theoretical learning to practical education. A curriculum study for engineering is a relatively unexplored area of research in the Middle East, much less so for applied industrial engineering. This work is an effort to bridge the gap between theoretical study in the classroom and real-world applications in the industrial and service sectors.

Apoptosis Inspired Intrusion Detection System

Artificial Immune Systems (AIS), inspired by the human immune system, are self-adaptive and self-learning algorithms and mechanisms capable of recognition and classification through learning, long-term memory and association. Unlike other human-system-inspired techniques such as genetic algorithms and neural networks, AIS comprises a range of algorithms modelled on different immune mechanisms of the body. In this paper, a mechanism of the human immune system based on apoptosis is adopted to build an Intrusion Detection System (IDS) to protect computer networks. Features are selected from network traffic using the Fisher score. Based on the selected features, each record/connection is classified as either an attack or normal traffic by the proposed methodology. Simulation results demonstrate that the proposed apoptosis-based AIS performs better than existing AIS approaches for intrusion detection.
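The sketch below illustrates the Fisher-score feature selection step: for each traffic feature, the between-class scatter of the class means is divided by the within-class variance, and features are ranked by the resulting score. The toy connection records are hypothetical and are not drawn from the evaluation dataset.

```python
import numpy as np

def fisher_score(X, y):
    """Fisher score per feature: between-class scatter of the feature means
    divided by the within-class variance.  Higher scores indicate features
    that better separate attack and normal traffic."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    overall_mean = X.mean(axis=0)
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in np.unique(y):
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        den += len(Xc) * Xc.var(axis=0)
    return num / (den + 1e-12)

# Toy connection records (duration, bytes sent, failed logins) and labels
# (1 = attack, 0 = normal); real features would come from network traffic data.
X = [[0.1, 200, 0], [0.2, 180, 0], [5.0, 10, 8], [4.5, 15, 9]]
y = [0, 0, 1, 1]
scores = fisher_score(X, y)
print("feature ranking (best first):", np.argsort(-scores))
```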

On Supporting a Meta-design Approach in Socio-Technical Ontology Engineering

Many studies have revealed the complexity of the ontology building process. Therefore, there is a need for a new approach, one that addresses the socio-technical aspects of collaboration in order to reach a consensus. The meta-design approach is considered applicable as a method in the methodological model of socio-technical ontology engineering. Principles of the meta-design framework are applied in the construction phases of the ontology. A web portal is developed to support the requirements of the meta-design principles. To validate the methodological model, semantic web applications were developed, integrated into the portal, and used to show the usefulness of the ontology. The knowledge-based system will be filled with data on Indonesian medicinal plants. By showing the usefulness of the developed ontology in a semantic web application, we motivate all stakeholders to participate in the development of the knowledge-based system of medicinal plants in Indonesia.

Statistical Optimization of Medium Components for Biomass Production of Chlorella pyrenoidosa under Autotrophic Conditions and Evaluation of Its Biochemical Composition under Stress Conditions

The aim of the present work was to statistically design an autotrophic medium for maximum biomass production by Chlorella pyrenoidosa using response surface methodology. After evaluating a one-factor-at-a-time approach, K2HPO4, KNO3, MgSO4.7H2O and NaHCO3 were selected, over the other components of Fogg's medium, as the most critical autotrophic medium components. The study showed that the maximum biomass yield was achieved when the concentrations of MgSO4.7H2O, K2HPO4, KNO3 and NaHCO3 were 0.409 g/L, 0.24 g/L, 1.033 g/L, and 3.265 g/L, respectively. The biomass productivity of C. pyrenoidosa improved from 0.14 g/L in the defined Fogg's medium to 1.40 g/L in the modified Fogg's medium, a 10-fold increase. The biochemical composition of C. pyrenoidosa was altered using nitrogen-limiting stress, bringing about a 5.23-fold increase in lipid content compared with the control (cells without stress), as analyzed by the FTIR integration method.

Application of GAMS and GA in the Location and Penetration of Distributed Generation

Distributed Generation (DG) can help reduce the cost of electricity to the customer, relieve network congestion, and provide environmentally friendly energy close to load centers. Its capacity is also scalable and it provides voltage support at the distribution level. Hence, DG placement and penetration level are an important problem for both the utility and the DG owner. DG allocation and capacity determination is a nonlinear optimization problem whose objective function is the minimization of the total loss of the distribution system. High levels of DG penetration are also a new challenge for traditional electric power systems. This paper presents a new methodology for the optimal placement and penetration level of DG in distribution systems based on the General Algebraic Modeling System (GAMS) and a Genetic Algorithm (GA).
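A hedged sketch of the GA side of such a methodology is given below: each individual encodes a candidate DG bus and size, and fitness is the evaluated system loss. The loss() function is a simple stand-in for a distribution load-flow evaluation (handled with GAMS in the paper), and the bus count, limits and coefficients are hypothetical.

```python
import random

NUM_BUSES, MAX_DG_MW = 33, 2.0

def loss(bus, size_mw):
    """Placeholder total-loss surrogate: pretends bus 18 with ~1.2 MW is best.
    A real implementation would run a load flow for the candidate DG."""
    return 0.21 + 0.002 * abs(bus - 18) + 0.05 * (size_mw - 1.2) ** 2

def random_individual():
    return (random.randint(1, NUM_BUSES), random.uniform(0.1, MAX_DG_MW))

population = [random_individual() for _ in range(30)]
for _ in range(50):                                # generations
    population.sort(key=lambda ind: loss(*ind))    # lower loss = fitter
    parents = population[:10]
    children = []
    while len(children) < 20:
        a, b = random.sample(parents, 2)
        child = (a[0], b[1])                                      # crossover
        if random.random() < 0.2:                                 # mutate location
            child = (random.randint(1, NUM_BUSES), child[1])
        if random.random() < 0.2:                                 # mutate size
            child = (child[0], min(MAX_DG_MW, max(0.1, child[1] + random.gauss(0, 0.2))))
        children.append(child)
    population = parents + children

best = min(population, key=lambda ind: loss(*ind))
print(f"best DG location: bus {best[0]}, size {best[1]:.2f} MW, surrogate loss {loss(*best):.4f}")
```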

Development of a Weed Suppression Robot for Rice Cultivation: Weed Suppression and Posture Control

Weed suppression and weeding are necessary measures in rice cultivation. Weed suppression precedes the process of weeding: it means suppressing the growth of young weeds and creating a weed-free environment. If we suppress the growth of weeds, we can reduce the number of weeds in a paddy field, which would in turn reduce the weeding workload. In this paper, we show how we developed a weed suppression robot for the purpose of reducing the weeding workload. The robot has a laser range finder for autonomous mobility and a robot arm for weed suppression. It travels along the rice rows without stepping on and injuring the rice plants in a paddy field. The robot arm applies force to the weed seedlings and thereby suppresses their growth. This paper explains the methodology of autonomous mobility, the weed suppression experiments, and the method of controlling the robot's posture on uneven ground.

Energy Interaction among HVAC and Supermarket Environment

Supermarkets are the most electricity-intensive type of commercial building. An unsuitable indoor environment in a supermarket, caused by abnormal HVAC operation, leads to wasted energy consumption in the refrigeration systems. This study briefly describes the underlying background and proposes easy-to-use analysis terminology for investigating the impact of HVAC operations on refrigeration power consumption using field-test data obtained from the building automation system (BAS). Based on this background and prior knowledge, expected energy interactions between the HVAC and refrigeration systems are proposed through Pearson's correlation analysis (R value), by considering the correlations between equipment power consumption and the dominant independent variables (driving-force conditions). The R value can be conveniently used to evaluate how strongly equipment operation is related to the driving-force parameters. The R values calculated from field data are compared to the expected ranges of R values obtained from the energy interaction methodology. The comparison can separate equipment operation into faulty and normal conditions. This analysis provides a simple way to investigate the condition of equipment operations or building sensors, because equipment can fall into abnormal conditions due to routine operations or faulty commissioning processes in field tests. With the systematic, easy-to-use description of interactions provided in the present article, the procedures can be used as a tool to evaluate the proper commissioning and routine operation of HVAC and refrigeration systems, to detect simple faults (e.g. in sensors, the driving-force environment of the refrigeration systems, and equipment set-points), and to optimize power consumption in supermarket buildings. Moreover, the analysis will be used in further fault detection and diagnosis (FDD) research for supermarkets in the future.
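As a hedged sketch of the correlation check described above, the example below computes Pearson's R between refrigeration compressor power and one indoor driving-force variable taken from BAS trend logs and compares it against an assumed expected range; the samples and the band are hypothetical.

```python
import numpy as np

# Hypothetical hourly BAS samples: an indoor driving-force variable and the
# corresponding refrigeration compressor power.
indoor_dew_point = np.array([8.2, 9.1, 10.4, 11.0, 12.3, 13.1, 12.8, 11.5])    # degC
compressor_power = np.array([41.0, 43.5, 47.2, 48.1, 52.0, 54.3, 53.1, 49.7])  # kW

r = np.corrcoef(indoor_dew_point, compressor_power)[0, 1]

# Compare against an assumed expected range for normal operation.
expected_low, expected_high = 0.6, 0.95
status = "normal" if expected_low <= r <= expected_high else "possible fault"
print(f"R = {r:.2f} -> {status}")
```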