Research on the Optimization of the Facility Layout of Efficient Cafeterias for Troops

Background: The facility layout problem (FLP) is an NP-complete (non-deterministic polynomial-time) problem, for which it is hard to obtain an exact optimal solution. FLP has been widely studied across a variety of constrained spaces and workflows. Troop cafeterias, for example, house many types of equipment and can produce chaotic processes during dining. Objective: This article sought to optimize the layout of a troop cafeteria and thereby improve the overall efficiency of the dining process. Methods: First, the original cafeteria layout design was analyzed from an ergonomic perspective and two new design schemes were generated. Next, three facility layout models were designed, and simulation was applied to compare the total dining time and the density of troops across the schemes. Last, an experiment on the dining process, with video observation and analysis, verified the simulation results. Results: In simulation, the dining time under the second new layout was shortened by 2.25% and 1.89% (p
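
Although the study itself compares layouts by simulation, the combinatorial core that makes FLP NP-complete is easy to see in the classic quadratic-assignment formulation of layout cost. The sketch below is a minimal illustration under invented data; the flow and distance matrices and the four facility names are hypothetical, not taken from the study.

```python
# Illustrative sketch only: a standard quadratic-assignment-style cost
# for comparing candidate layouts. All numbers below are hypothetical.
from itertools import permutations

# flow[i][j]: hypothetical trips per meal between facilities i and j
# (e.g., entrance, serving line, seating, tray return)
flow = [
    [0, 40, 10, 0],
    [40, 0, 35, 5],
    [10, 35, 0, 30],
    [0, 5, 30, 0],
]
# dist[a][b]: hypothetical walking distance (m) between locations a and b
dist = [
    [0, 8, 15, 20],
    [8, 0, 9, 14],
    [15, 9, 0, 7],
    [20, 14, 7, 0],
]

def layout_cost(assign):
    """Total flow-weighted travel distance; assign[i] is the location of facility i."""
    n = len(assign)
    return sum(flow[i][j] * dist[assign[i]][assign[j]]
               for i in range(n) for j in range(n))

# Exhaustive search is feasible only for tiny instances; FLP is NP-complete,
# which is why larger studies fall back on heuristics or simulation.
best = min(permutations(range(4)), key=layout_cost)
print(best, layout_cost(best))
```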

Efficient Pre-Processing of Single-Cell Assay for Transposase Accessible Chromatin with High-Throughput Sequencing Data

The primary tool currently used to pre-process 10X Chromium single-cell ATAC-seq data is Cell Ranger, which can take a very long time to run on standard datasets. To facilitate rapid pre-processing that enables reproducible workflows, we present a suite of tools called scATAK for pre-processing single-cell ATAC-seq data that is 15 to 18 times faster than Cell Ranger on mouse and human samples. Our tool can also calculate chromatin interaction potential matrices and generate open chromatin signal and interaction traces for cell groups. We used scATAK to explore the chromatin regulatory landscape of a healthy adult human brain, unveiling cell-type-specific features, and show that it provides a convenient and computationally efficient approach to pre-processing single-cell ATAC-seq data.

Towards End-To-End Disease Prediction from Raw Metagenomic Data

Analysis of the human microbiome using metagenomic sequencing data has demonstrated a strong ability to discriminate various human diseases. Raw metagenomic sequencing data, however, require multiple complex and computationally heavy bioinformatics steps prior to analysis. Such data contain millions of short reads from fragmented DNA sequences, stored as FASTQ files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use and time-consuming, and they rely on a large number of parameters that often introduce variability and affect the estimation of microbiome elements. Training deep neural networks directly on raw sequencing data is a promising approach for bypassing some of the challenges associated with mainstream bioinformatics pipelines. Most such methods use word and sentence embeddings, which create a meaningful numerical representation of DNA sequences while extracting features and reducing the dimensionality of the data. In this paper we present an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads: metagenome2vec. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which each sequence is most likely to come; and (iv) training a multiple-instance-learning classifier that predicts the phenotype from the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight reflecting each genome's influence on the prediction. Using two public real-life datasets as well as a simulated one, we demonstrate that this approach reaches high performance, comparable to state-of-the-art methods applied to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that, with further work, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
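
The first two steps are straightforward to prototype. The sketch below tokenizes reads into overlapping k-mers and averages k-mer vectors into read embeddings; it is a minimal illustration, not the authors' implementation, and the random embedding table stands in for embeddings metagenome2vec would actually learn (the toy reads are likewise invented).

```python
# Minimal sketch of steps (i)-(ii): k-mer tokenization, then a read
# embedding as the mean of its k-mer vectors.
import numpy as np

K, DIM = 6, 32

def kmers(read, k=K):
    return [read[i:i + k] for i in range(len(read) - k + 1)]

reads = ["ATGCGTACGTTAGC", "GGCATACGTTAGCA"]  # toy reads, not real data

# step (i): build the k-mer vocabulary and (here, random) embeddings
vocab = {km: idx for idx, km in enumerate(sorted({km for r in reads for km in kmers(r)}))}
rng = np.random.default_rng(0)
embedding = rng.normal(size=(len(vocab), DIM))

# step (ii): a read embedding as the average of its k-mer embeddings
def read_embedding(read):
    ids = [vocab[km] for km in kmers(read) if km in vocab]
    return embedding[ids].mean(axis=0)

print(read_embedding(reads[0]).shape)  # (32,)
```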

Exploring the Perspective of Service Quality in mHealth Services during the COVID-19 Pandemic

COVID-19 has had a significant effect on all sectors of society globally. Health information technology (HIT) has become an effective health strategy in this age of distancing, and Mobile Health (mHealth) in particular plays a critical role in managing patient and provider workflows during the COVID-19 pandemic. Users' perception of mHealth service quality therefore plays a significant role in shaping their confidence and their subsequent intention to use these services. The objective of this study was to explore, through a qualitative method, the user attributes behind health practitioners' and patients' satisfaction or dissatisfaction with mHealth services, and to analyze users' intentions in the context of Taiwan during the COVID-19 pandemic. The research examines the experienced usability of mHealth services during the pandemic using qualitative methods, namely in-depth, semi-structured interviews that investigate participants' perceptions and experiences and the meanings they attribute to them. The five cases comprised the experiences of health practitioners, clinic staff, and patients using mHealth services. Participants were encouraged to discuss issues related to the research question through open-ended questions, usually in one-to-one interviews. The findings show both positive and negative attributes of mHealth service quality. Patients' and health practitioners' concerns map onto several dimensions of perceived service quality: system quality, information quality, and interaction quality. A concept map of perceptions regarding users' intention to use mHealth services in emergencies is depicted. The findings revealed that users pay more attention to "Medical care", "ease of use" and "utilitarian benefits", and attach less importance to "Admissions and Convenience" and "Social influence". To improve mHealth services, providers and health practitioners should better manage users' experiences. This research contributes to the understanding of service quality issues in mHealth services during the COVID-19 pandemic.

Risk Assessment of Building Information Modelling Adoption in Construction Projects

Building information modelling (BIM) is a new technology for enhancing the efficiency of project management in the construction industry. Alongside the potential benefits of this useful technology, there are various risks and obstacles to applying it in construction projects. In this study, a decision-making approach is presented for assessing the risks of BIM adoption in construction projects. Risk factors of applying BIM during different phases of the project lifecycle are identified with the help of the Delphi method, experts' opinions, and the related literature. Afterward, Shannon's entropy and fuzzy TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) are applied to derive priorities for the identified risk factors. Results indicated that lack of knowledge among professional engineers about BIM workflows and conflicts of opinion between different stakeholders are the risk factors with the highest priority.
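
To make the prioritization machinery concrete, the sketch below runs entropy weighting followed by the crisp (non-fuzzy) variant of TOPSIS on invented risk ratings; the four risk-factor rows, three criteria, and all scores are hypothetical, and the paper's fuzzy variant would replace the crisp numbers with fuzzy ones.

```python
# Sketch of entropy-weighted TOPSIS ranking (crisp version). Rows = risk
# factors, columns = criteria (e.g., probability, impact, detectability),
# all treated as "larger = riskier" benefit criteria. Hypothetical data.
import numpy as np

X = np.array([
    [7.0, 8.0, 6.0],   # lack of BIM workflow knowledge
    [6.0, 9.0, 5.0],   # stakeholder conflicts
    [4.0, 5.0, 7.0],   # software interoperability
    [3.0, 4.0, 4.0],   # training cost
])

# Shannon's entropy weights: criteria whose scores vary more get more weight.
P = X / X.sum(axis=0)
e = -(P * np.log(P)).sum(axis=0) / np.log(len(X))
w = (1 - e) / (1 - e).sum()

# TOPSIS: weighted vector normalization, then closeness to the ideal point.
V = w * X / np.linalg.norm(X, axis=0)
best, worst = V.max(axis=0), V.min(axis=0)
d_plus = np.linalg.norm(V - best, axis=1)
d_minus = np.linalg.norm(V - worst, axis=1)
closeness = d_minus / (d_plus + d_minus)

print(np.argsort(-closeness))  # risk factors, highest priority first
```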

Development of a Software System for Management and Genetic Analysis of Biological Samples for Forensic Laboratories

Owing to the high reliability achieved by DNA tests, since the 1980s this kind of test has supported the resolution of a growing number of criminal cases, including old unsolved cases that now have a chance of being solved with this technology. Currently, the use of genetic profile databases is a typical method of widening the scope of genetic comparison. Forensic laboratories must process, analyze, and generate genetic profiles for a growing number of samples, which requires time and substantial storage capacity. It is therefore essential to develop methodologies, supported by software tools, capable of organizing the workflow and minimizing the time spent on both biological sample processing and the analysis of genetic profiles. The present work thus aims to develop a software system for forensic genetics laboratories that provides sample, criminal case, and local database management, minimizing time spent in the workflow and helping to compare genetic profiles. For the development of this system, all data related to the storage and processing of samples, along with the workflows and requirements the system must incorporate, were considered. The system uses web technologies (HTML, CSS, and JavaScript) with the NodeJS platform as the server, which offers great efficiency in data input and output. In addition, the data are stored in a relational database (MySQL), which is free, favoring user acceptance. The software system developed here brings more agility to the sample workflow and analysis, contributing to the rapid insertion of genetic profiles into the national database and to the resolution of more crimes. The next step of this research is its validation, so that it operates in accordance with current Brazilian national legislation.

Scheduling Multiple Workflow Using De-De Dodging Algorithm and PBD Algorithm in Cloud: Detailed Study

Workflow scheduling is an important part of cloud computing; based on different criteria, it determines cost, execution time, and performance. A cloud workflow system is a platform service facilitating the automation of distributed applications on new cloud infrastructure. One aspect that differentiates cloud workflow systems from others is their market-oriented business model, an innovation that challenges conventional workflow scheduling strategies. The Time and Cost optimization algorithm for scheduling Hybrid Clouds (TCHC), which decides which resources should be chartered from public providers, is combined with a new De-De algorithm that ensures every instance of single and multiple workflows runs without deadlocks. To this end, two new concepts, the De-De Dodging Algorithm and the Priority Based Decisive (PBD) Algorithm, are combined with conventional deadlock-avoidance techniques into one algorithm that maximizes the use of active (not just allocated) resources and reduces makespan.
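
As a rough illustration of the two ingredients named above, and emphatically not the De-De or PBD algorithms themselves, the sketch below dispatches tasks from a priority queue and has every task sort its resource requests into one fixed global order, the textbook discipline that rules out circular waits in a concurrent executor. Task names, priorities, and resources are all invented.

```python
import heapq

# Hypothetical resources and their fixed global acquisition order; acquiring
# in this order in a concurrent executor prevents circular waits (deadlock).
RESOURCE_ORDER = {"cpu": 0, "gpu": 1, "storage": 2}

def schedule(tasks):
    """tasks: (priority, name, duration, resources); lower priority value runs first."""
    heap = [(p, n, d, sorted(r, key=RESOURCE_ORDER.get)) for p, n, d, r in tasks]
    heapq.heapify(heap)
    clock = 0
    while heap:
        prio, name, dur, res = heapq.heappop(heap)
        clock += dur  # this serial trace just accumulates a makespan
        print(f"t={clock:3d}  ran {name} (prio {prio}) holding {res}")
    return clock

schedule([
    (2, "align", 5, ["gpu", "cpu"]),
    (1, "ingest", 3, ["storage", "cpu"]),
    (3, "report", 2, ["cpu"]),
])
```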

Producing Graphical User Interface from Activity Diagrams

A graphical user interface (GUI) is as essential to a program as any other feature, because GUI components provide the fundamental interaction between the user and the program. We must therefore give more attention to the GUI during system building and development, and greater attention still to the user, who is the cornerstone of any interaction with the GUI. This paper introduces an approach for designing GUIs from one of the models of business workflows that describe the workflow behavior of a system, specifically Activity Diagrams (AD).
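
A hypothetical sketch of the general idea (not the paper's exact mapping rules): walk the nodes of an activity diagram and emit a GUI skeleton, with, say, an action node becoming a button and a decision node a choice control.

```python
# Invented node kinds, labels, and widget mapping, for illustration only.
from dataclasses import dataclass

@dataclass
class ADNode:
    kind: str          # "action", "decision", or "final"
    label: str
    next: list         # labels of successor nodes

WIDGET_FOR_KIND = {
    "action": "Button",
    "decision": "RadioGroup",
    "final": "CloseDialog",
}

def gui_skeleton(diagram):
    """Map each activity-diagram node to a widget description."""
    return [f'{WIDGET_FOR_KIND[n.kind]}(label="{n.label}")' for n in diagram]

diagram = [
    ADNode("action", "Submit order", ["Approve?"]),
    ADNode("decision", "Approve?", ["Confirm", "Cancel"]),
    ADNode("final", "Confirm", []),
]
for widget in gui_skeleton(diagram):
    print(widget)
```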

A Collaborative Platform for Multilingual Ontology Development

Ontologies provide a common understanding of a specific domain of interest that can be communicated between people and used as background knowledge for automated reasoning in a wide range of applications. In this paper, we address the design of multilingual ontologies following well-defined knowledge engineering methodologies with the support of novel collaborative development approaches. In particular, we present a collaborative platform which allows ontologies to be developed incrementally in multiple languages. This is made possible via an appropriate mapping between language-independent concepts and one lexicalization per language (or a lexical gap where no such lexicalization exists). The collaborative platform has been designed to support the development of the Universal Knowledge Core, a multilingual ontology currently covering English, Italian, Chinese, Mongolian, Hindi, and Bengali. Its design follows a workflow-based development methodology that models resources as a set of collaborative objects and assigns customizable workflows to build and maintain each collaborative object in a community-driven manner, with extensive support for modern Web 2.0 social and collaborative features.
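
A minimal sketch of that mapping, with invented example data: each language-independent concept carries at most one lexicalization per language, and a missing lexicalization is recorded as an explicit gap rather than silently omitted.

```python
# Hypothetical concept identifier and lexicalizations, for illustration only.
LEXICAL_GAP = None

ontology = {
    "concept:body-of-water/lake": {
        "en": "lake",
        "it": "lago",
        "zh": "湖",
        "hi": "झील",
        "mn": "нуур",
        "bn": LEXICAL_GAP,  # the gap is recorded explicitly
    },
}

def lexicalize(concept_id, lang):
    """Return the single lexicalization for a language, or report the gap."""
    term = ontology[concept_id].get(lang, LEXICAL_GAP)
    return term if term is not None else f"<gap:{lang}>"

print(lexicalize("concept:body-of-water/lake", "it"))  # lago
print(lexicalize("concept:body-of-water/lake", "bn"))  # <gap:bn>
```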

Simulation Model of an Ultra-Light Overhead Conveyor System; Analysis of the Process in the Warehouse

Ultra-light overhead conveyor systems are rope-based conveying systems with individually driven vehicles. The vehicles can move automatically along the rope, which supplies them with energy and signals. Ultra-light overhead conveyor systems must always be integrated into a logistical process by finding the best route for low-cost material flow, in order to guarantee precise and fast workflows. This paper analyzes the process of an ultra-light overhead conveyor system under the necessary assumptions. The analysis consists of three scenarios, which raise the vehicle speed in equal increments from one case to the next, and the correlation between vehicle speed and system throughput is investigated. A discrete-event simulation model of an ultra-light overhead conveyor system is constructed using the DOSIMIS-3 software to implement the three scenarios. According to the simulation results, the optimal scenario, and hence the optimal vehicle speed, is identified among the three. The simulation model demonstrates the effect of increased speed on system throughput.
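
The qualitative relationship the scenarios probe can be sketched without a full discrete-event model. The toy calculation below (not the DOSIMIS-3 model; loop length, handling time, fleet size, and speeds are all invented) shows throughput rising with speed while the fixed load/unload time increasingly dominates the cycle.

```python
# Back-of-the-envelope throughput-vs-speed sketch with invented numbers.
LOOP_LENGTH_M = 120.0      # hypothetical rope loop length
HANDLING_S = 12.0          # hypothetical load + unload time per cycle
VEHICLES = 6

for speed in (0.5, 1.0, 1.5):   # m/s, equal increments as in the scenarios
    cycle_s = LOOP_LENGTH_M / speed + HANDLING_S
    throughput_per_h = VEHICLES * 3600.0 / cycle_s
    print(f"{speed:.1f} m/s -> {throughput_per_h:.0f} items/h")
```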

Semantic Support for Hypothesis-Based Research from Smart Environment Monitoring and Analysis Technologies

Improvements in the data fusion and data analysis phases of research are imperative due to the exponential growth of sensed data. Currently, there are developments in the Semantic Sensor Web community to explore efficient methods for the reuse, correlation, and integration of web-based data sets and live data streams. This paper describes the integration of remotely sensed data with web-available static data for use in observational hypothesis testing and the analysis phase of research. The Semantic Reef system combines semantic technologies (e.g., well-defined ontologies and logic systems) with scientific workflows to enable hypothesis-based research. A framework is presented for how the data fusion concepts from the Semantic Reef architecture map to the Smart Environment Monitoring and Analysis Technologies (SEMAT) intelligent sensor network initiative. The data collected via SEMAT and the knowledge inferred by the Semantic Reef system are ingested into the Tropical Data Hub for data discovery, reuse, curation, and publication.

Adaptive Algorithm to Predict the QoS of Web Processes and Workflows

Workflow Management Systems (WfMS) allow organizations to streamline and automate business processes and reengineer their structure. One important requirement for this type of system is the management and computation of the Quality of Service (QoS) of processes and workflows. A range of Web process and workflow languages currently exist, and each language can be characterized by the set of patterns it supports. Developing and implementing a suitable, generic algorithm to compute the QoS of processes designed in different languages is a difficult task, because some patterns are specific to particular process languages and new patterns may be introduced in future versions of a language. In this paper, we describe an adaptive algorithm implemented to cope with these two problems. The algorithm is called adaptive because it can be dynamically changed as the patterns of a process language change.
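
A hedged sketch of the pattern-based reduction such an algorithm generalizes (the paper's adaptive algorithm makes the set of patterns pluggable; the reduction rules and the toy workflow below are standard illustrations, not its actual code): time aggregates as a sum over a sequence, a max over a parallel (AND) split, and a probability-weighted mean over an exclusive (XOR) choice.

```python
# Recursive QoS-time reduction over common workflow patterns.

def qos_time(node):
    kind = node["kind"]
    if kind == "task":
        return node["time"]
    children = [qos_time(c) for c in node["children"]]
    if kind == "sequence":
        return sum(children)
    if kind == "parallel":               # AND-split/join
        return max(children)
    if kind == "choice":                 # XOR-split with branch probabilities
        return sum(p * t for p, t in zip(node["probs"], children))
    raise ValueError(f"unknown pattern: {kind}")  # new patterns plug in here

workflow = {"kind": "sequence", "children": [
    {"kind": "task", "time": 2.0},
    {"kind": "parallel", "children": [
        {"kind": "task", "time": 4.0},
        {"kind": "task", "time": 3.0},
    ]},
    {"kind": "choice", "probs": [0.7, 0.3], "children": [
        {"kind": "task", "time": 1.0},
        {"kind": "task", "time": 5.0},
    ]},
]}
print(qos_time(workflow))  # 2 + 4 + (0.7*1 + 0.3*5) = 8.2
```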

Two Scenarios for Ultra-Light Overhead Conveyor System in Logistics Applications

Overhead conveyor systems are in use in many installations around the world, serving the widest possible range of applications. They are particularly favored in the automotive industry, but also at post offices. Overhead conveyor systems must always be integrated into a logistical process by finding the best route for low-cost material flow, in order to guarantee precise and fast workflows. With their help, transport can take place without wasting floor space, without excess company capacity, lost or damaged products, erroneous deliveries, or endless journeys, and without wasting time. Ultra-light overhead conveyor systems are rope-based conveying systems with individually driven vehicles. The vehicles can move automatically along the rope, which supplies them with energy and signals, and crossings are realized by switches. Ultra-light overhead conveyor systems provide optimal material flow, which produces profit and saves time. This article introduces two new ultra-light overhead conveyor designs for logistics and explains their components. Scenarios are then created from the technical characteristics of these components, assumptions are made about the application area, and the scenarios are visualized with the help of CAD software. These scenarios can help logistics companies achieve lower development costs as well as quicker market maturity.

A Model of Technological Platform for the Knowledge Management Organization

This paper describes an experience of research, development, and innovation in the naval industry at COTECMAR (the Science and Technology Corporation for the Development of the Shipbuilding Industry in Colombia), carried out through processes of research, innovation, and technological development and based on theoretical models of organizational knowledge management, technology management, human talent management, and the integration of technology platforms. It seeks ways to facilitate the initial establishment of environments rich in information, knowledge, and content, supported by collaborative strategies built on the organization's dynamic mission processes, pursuing the further development of research, development, and innovation in naval engineering in Colombia and making this a distinctive basis for generating knowledge assets at COTECMAR. The integration of information and communication technologies, supported by emerging technologies (mobile and wireless technologies, digital content via PDA, and content delivery services on Web 2.0 and Web 3.0) and viewed as a strategic thrust for any organization, facilitates the redefinition of processes for managing information and knowledge: it enables the redesign of workflows, the adoption of new forms of organization (preferably networked), and the internal creation of symbolic knowledge, and it promotes the development of new skills, knowledge, and attitudes in the knowledge worker.

The Design of the HL7 RIM-based Sharing Components for Clinical Information Systems

The American Health Level Seven (HL7) Reference Information Model (RIM) consists of six backbone classes, each with its own specialized attributes. Furthermore, to enforce semantic expression, specific mandatory vocabulary domains have been defined for representing the content values of certain attributes. Because of the variety of workflows, most hospitals spend a great deal of time and human cost developing and modifying Clinical Information Systems (CIS), a largely duplicated effort. This study attempts to design and develop shared RIM-based components of the CIS for different business processes, so that the CIS contains data of a consistent format and type. Programmers can execute transactions against the RIM-based clinical repository through the shared RIM-based components, and when developing new CIS functions, the shared components can likewise be adopted. These components not only satisfy physicians' needs in using a CIS but also reduce the time required to develop new system components. All in all, this study provides a new viewpoint: integrating data and functions with the business processes is an easy and flexible approach to building a new CIS.
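
As a simplified sketch of how such shared components might be shaped (the attribute choices are illustrative, not the full HL7 specification; four of the six backbone classes are shown and the remaining two are noted in a comment):

```python
# Simplified RIM backbone classes modeled as shared components.
from dataclasses import dataclass

@dataclass
class Entity:            # a person, place, organization, or thing
    class_code: str      # drawn from a mandatory vocabulary domain
    name: str

@dataclass
class Role:              # an Entity playing a role, e.g. patient
    player: Entity
    code: str

@dataclass
class Act:               # an action of interest, e.g. an observation
    class_code: str
    mood_code: str       # e.g. "EVN" for an event that occurred
    code: str

@dataclass
class Participation:     # links a Role to an Act, e.g. as its subject
    role: Role
    act: Act
    type_code: str

# ActRelationship (Act-to-Act) and RoleLink (Role-to-Role) complete the six
# backbone classes; they are omitted here for brevity.

patient = Role(Entity("PSN", "Jane Doe"), code="PAT")
obs = Act("OBS", "EVN", code="blood-pressure")
link = Participation(patient, obs, type_code="SBJ")
```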

The Open Knowledge Kernel

Web services are pieces of software that can be invoked via a standardized protocol, and they can be combined via formalized taskflow languages. The Open Knowledge system is a fully distributed system, built on P2P technology, that allows users to publish these taskflows and programmers to register their web services, or publish implementations of them, for the roles described in these workflows. Besides this, the system offers the functionality to select a peer that can coordinate such an interaction model and inform web services when it is their 'turn'. In this paper we describe the architecture and implementation of the Open Knowledge Kernel, which provides the core functionality of the Open Knowledge system.

Software Engineering Interoperable Environment for University Process Workflow and Document Management

The research focused on the design, development, and evaluation of a sustainable web-based network system to be used as an interoperable environment for university process workflows and document management. In this manner, most process workflows in universities can be realized entirely electronically, promoting an integrated university. Defining the most commonly used university process workflows enabled the creation of electronic workflows and their execution on standard workflow execution engines. Defining or reengineering the workflows increased work efficiency and helped standardize processes across different faculties. The concept, the process definitions, and the solution applied as a case study are evaluated, and the findings are reported.

TSM: A Design Pattern to Make Ad-hoc BPMs Easy and Inexpensive in Workflow-aware MISs

Despite many years of development, mainstream workflow solutions from the IT industry have not made ad-hoc workflow support easy or inexpensive in MISs. Moreover, most academic approaches tend to make the resulting BPM (Business Process Management) more complex and clumsy, since they usually necessitate modeling the workflow. To cope with various ad-hoc or casual workflow requirements while keeping things simple and inexpensive, the author puts forth the TSM design pattern, which provides flexible workflow control while minimizing the demand for predefinition and workflow modeling, and which introduces a generic approach to building BPM into workflow-aware MISs (Management Information Systems) with low development and running expenses.

Understanding and Designing Situation-Aware Mobile and Ubiquitous Computing Systems

Using spatial models as a shared common basis of information about the environment for different kinds of context-aware systems has been a heavily researched topic in recent years. That research has focused on how to create, update, and merge spatial models so as to enable highly dynamic, consistent, and coherent spatial models at large scale. In this paper, however, we concentrate on how context-aware applications can use this information to adapt their behavior according to the situation they are in. The main idea is to provide the spatial model infrastructure with a situation recognition component based on generic situation templates. A situation template is, as part of a much larger situation template library, an abstract, machine-readable description of a certain basic situation type, which can be used by different applications to evaluate their situation. This paper discusses different theoretical and practical issues, technical, ethical, and philosophical, that are important for understanding and developing situation-dependent systems based on situation templates. A basic system design is presented which allows reasoning over uncertain data using an improved version of a learning algorithm for the automatic adaptation of situation templates. Finally, to support the development of adaptive applications, we present a new situation-aware adaptation concept based on workflows.
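
An invented, minimal sketch of template-based recognition (a real template library would add the uncertainty handling and learning described above): a template lists sensor conditions, and a context snapshot matches when every condition holds.

```python
# Hypothetical situation template: sensor names, predicates, and the
# snapshot values below are all invented for illustration.
meeting_template = {
    "name": "in-meeting",
    "conditions": [
        ("room_occupancy", lambda v: v >= 2),
        ("noise_level_db", lambda v: v < 55),
        ("calendar_busy", lambda v: v is True),
    ],
}

def recognize(template, context):
    """True when every listed sensor is present and its condition holds."""
    return all(sensor in context and pred(context[sensor])
               for sensor, pred in template["conditions"])

snapshot = {"room_occupancy": 4, "noise_level_db": 48, "calendar_busy": True}
print(recognize(meeting_template, snapshot))  # True
```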

Virtual Learning Process Environment: Cohort Analytics for Learning and Learning Processes

Traditional higher-education classrooms allow lecturers to observe students' behaviours and responses to a particular pedagogy during learning, in a way that can prompt changes to the pedagogical approach. Within current e-learning systems it is difficult to perform continuous analysis of a cohort's behavioural tendencies, making real-time pedagogical decisions difficult. This paper presents a Virtual Learning Process Environment (VLPE) based on the Business Process Management (BPM) conceptual framework. Within the VLPE, course designers can model various educational pedagogies in the form of learning process workflows using an intuitive flow diagram interface. These diagrams are used to visually track the learning progress of a cohort of students. This helps assess the effectiveness of the chosen pedagogy and provides the information required to improve course design. A case scenario of a cohort of students is presented, and quantitative statistical analysis of their learning process performance is gathered and displayed in real time using dashboards.