Reference Architecture for Intelligent Enterprise Solutions

Data in enterprise IT systems has been growing at a phenomenal pace. This growth provides opportunities to run analytics and gather intelligence on key business parameters, enabling enterprises to provide better products and services to customers. While several Artificial Intelligence/Machine Learning (AI/ML) and Business Intelligence (BI) tools and technologies are available in the marketplace for running analytics, an integrated view is needed when developing intelligent solutions in enterprises. This paper progressively elaborates a reference model for enterprise solutions, builds an integrated view of data, information, and intelligence components, and presents a reference architecture for intelligent enterprise solutions. Finally, it applies the reference architecture to an insurance organization. The reference architecture is the outcome of experience and insights gathered from developing intelligent solutions for several organizations.

Scientific Methods in Educational Management: The Metasystems Perspective

Although scientific methods have been the subject of a large number of papers, the term 'scientific methods in educational management' is still not well defined. In this paper, we adopt the metasystems perspective to define the term and to distinguish such methods from those used in the eras of the scientific management and knowledge management paradigms. In our opinion, scientific methods in educational management rely on global phenomena, events, and processes and their influence on the educational organization. Currently, scientific methods in educational management are integrated with phenomena such as the globalization, cognitivisation, and openness of educational systems, and with global events like the COVID-19 pandemic. Concrete scientific methods are nested in a hierarchy of increasingly abstract models of educational management, which form the context of the global impact on education in general and learning outcomes in particular. However, scientific methods can be assigned to a specific mission, strategy, or tactics of educational management in a concrete organization, whether through global management, the local development of the school organization, and/or the development of the life-long successful learner. By accepting this assignment, the scientific method becomes a personal goal of each individual within the educational organization, or an option for developing the educational organization to global standards. In our opinion, scientific methods in educational management need to confine their scope to the deep analysis of concrete tasks of the educational system (i.e., teaching, learning, assessment, development), which results in concrete strategies of organizational development. More important is seeking ways toward a dynamic equilibrium between the strategy and tactics of the planetary tasks in the field of global education, which results in a need for ecological methods of learning and communication. In sum, the distinction between local and global scientific methods depends on the subjective conception of task assignment, measurement, and appraisal. Finally, we conclude that scientific methods are not holistic scientific methods, but rather the strategy and tactics implemented in the global context by an effective educational/academic manager.

1/Sigma Term Weighting Scheme for Sentiment Analysis

Large amounts of data on the web can provide valuable information. For example, product reviews help business owners measure customer satisfaction. Sentiment analysis classifies texts into two polarities: positive and negative. This paper examines movie reviews and tweets using a new term weighting scheme, called one-over-sigma (1/sigma), on benchmark datasets for sentiment classification. The proposed method aims to improve the performance of sentiment classification. The results show that 1/sigma is more accurate than popular term weighting schemes. To verify whether entropy reflects the discriminating power of terms, we report a comparison of entropy values for the different term weighting schemes.
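
As a rough illustration of how such a scheme can be wired into a standard bag-of-words pipeline, the sketch below weights each term by the reciprocal of the standard deviation of its relative frequency across training documents; this reading of '1/sigma', and all names in the snippet, are assumptions made for illustration, not the paper's exact formulation.

```python
# Hypothetical sketch of a 1/sigma-style term weighting scheme.
# Assumption: sigma_t is the standard deviation of term t's relative
# frequency across the training documents; the paper's definition may differ.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

docs = ["great movie, loved it", "terrible plot, hated it",
        "loved the acting", "hated the ending"]
labels = np.array([1, 0, 1, 0])

vec = CountVectorizer()
tf = vec.fit_transform(docs).toarray().astype(float)
rel = tf / tf.sum(axis=1, keepdims=True)      # relative term frequencies per doc

sigma = rel.std(axis=0) + 1e-9                # per-term std across docs (smoothed)
weights = 1.0 / sigma                         # the 'one-over-sigma' weights

X = tf * weights                              # weighted document vectors
clf = LogisticRegression().fit(X, labels)     # any downstream classifier
```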

Lamb Wave Wireless Communication in Healthy Plates Using Coherent Demodulation

Guided ultrasonic waves are used in Non-Destructive Testing and Structural Health Monitoring for inspection and damage detection. Recently, wireless data transmission using ultrasonic waves in solid metallic channels has gained popularity in industrial applications such as nuclear, aerospace, and smart vehicles. The idea is to find a good substitute for electromagnetic waves, since these are highly attenuated near metallic components due to Faraday shielding. The proposed solution is to use ultrasonic guided waves such as Lamb waves as an information carrier, due to their capability of propagating over long distances. In addition, valuable information about the health of the structure can be extracted simultaneously. In this work, the reliable frequency bandwidth for communication is first extracted experimentally from dispersion curves. Then, an experimental platform for wireless communication using Lamb waves is described and built. After this, the coherent demodulation algorithm used in telecommunications is tested for the Amplitude Shift Keying, On-Off Keying, and Binary Phase Shift Keying modulation techniques. Signal processing parameters such as threshold choice, number of cycles per bit, and bit rate are optimized. Experimental results are compared based on the average bit error percentage. Results show high sensitivity to threshold selection for the Amplitude Shift Keying and On-Off Keying techniques, resulting in a bit rate decrease. The Binary Phase Shift Keying technique shows the highest stability and data rate among all tested modulation techniques.
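
The following minimal sketch illustrates coherent demodulation for Binary Phase Shift Keying, the technique the experiments found most stable; the carrier frequency, bit rate, and noise level are illustrative values, not the paper's experimental settings.

```python
# Minimal coherent BPSK demodulation sketch: mix the received waveform
# with a locally generated carrier, integrate over each bit, decide by sign.
import numpy as np

fs, fc, bit_rate = 1_000_000, 100_000, 10_000   # illustrative Hz values
spb = fs // bit_rate                             # samples per bit

bits = np.random.randint(0, 2, 64)
t = np.arange(len(bits) * spb) / fs
phase = np.repeat(np.pi * bits, spb)             # 0 or pi carrier phase per bit
rx = np.cos(2 * np.pi * fc * t + phase)          # transmitted BPSK waveform
rx += 0.3 * np.random.randn(len(rx))             # additive channel noise

ref = np.cos(2 * np.pi * fc * t)                 # coherent reference carrier
mixed = rx * ref                                 # mixing stage
corr = mixed.reshape(-1, spb).sum(axis=1)        # integrate-and-dump per bit
decoded = (corr < 0).astype(int)                 # negative correlation -> bit 1

print(f"bit error rate: {np.mean(decoded != bits):.3f}")
```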

Speedup Breadth-First Search by Graph Ordering

Breadth-First Search (BFS) is a core graph algorithm that is widely used for graph analysis. As it is frequently used in many graph applications, improving BFS performance is essential. In this paper, we present a graph ordering method that reorders the graph nodes to achieve better data locality, thus improving BFS performance. Our method is based on the observation that sibling relationships dominate the cache access pattern during BFS traversal. Therefore, we propose a frequency-based model to construct the graph order. First, we optimize the graph order according to the nodes' visit frequency: nodes with high visit frequency are processed with priority. Second, we try to maximize the overlap of child nodes layer by layer. As this problem is proved to be NP-hard, we propose a heuristic method that greatly reduces the preprocessing overhead. We conduct extensive experiments on 16 real-world datasets. The results show that our method achieves performance comparable to state-of-the-art methods while incurring only about 1/15 of their graph ordering overhead.
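
A minimal sketch of the first idea, reordering by visit frequency, is given below; node degree is used as a stand-in for the measured visit frequency, and the paper's child-overlap heuristic is not reproduced.

```python
# Illustrative frequency-based reordering before BFS: high-degree
# (frequently visited) nodes receive small, contiguous IDs, which tends
# to improve the cache locality of adjacency accesses.
from collections import deque

def reorder_by_frequency(adj):
    """Relabel nodes in descending degree order; return new adjacency and map."""
    order = sorted(range(len(adj)), key=lambda u: -len(adj[u]))
    new_id = {old: new for new, old in enumerate(order)}
    new_adj = [sorted(new_id[v] for v in adj[old]) for old in order]
    return new_adj, new_id

def bfs(adj, src):
    dist = [-1] * len(adj)
    dist[src] = 0
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if dist[v] == -1:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

adj = [[1, 2], [0, 2, 3], [0, 1], [1]]     # toy undirected graph
radj, mapping = reorder_by_frequency(adj)
print(bfs(radj, mapping[0]))               # BFS from original node 0
```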

The Role of the Injured Party's Fault in the Apportionment of Damages in Tort Law: A Comparative-Historical Study between Common Law and Islamic Law

In order to understand the role of the injured party's fault in dividing liability, we studied its historical background. In common law, the traditional contributory negligence rule was a complete defense. Legislatures and judicial practice then modified that rule into one of apportionment. In Islamic law, too, the Action rule was at first applied only when the injured party was the sole cause, but jurists expanded the scope of this rule so that it was also used in cases where both the injured party's fault and that of the other party were involved. There are several common approaches to the apportionment of damages. Some common law countries, like Britain, have chosen 'the causal potency approach' and 'fixed apportionment'. Islamic countries, like Iran, have chosen both 'the relative blameworthiness' and 'equal apportionment' approaches. The article concludes that both common law and Islamic law accept the division of responsibility between a wrongdoing claimant and the defendant. In the apportionment of responsibility, however, Islamic law mostly favors equal apportionment, which is simpler and saves time and money, whereas common law legal systems have chosen the causal potency approach, which is more complicated than the rival approach but fairer.

Adaptive Few-Shot Deep Metric Learning

Currently the most prevalent deep learning methods require a large amount of data for training, whereas few-shot learning tries to learn a model from limited data without extensive retraining. In this paper, we present a loss function based on the triplet loss for solving the few-shot problem using metric-based learning. Instead of empirically setting the margin distance in the triplet loss to a constant, we propose an adaptive margin distance strategy that obtains an appropriate margin distance automatically. We implement the strategy in a deep siamese network for deep metric embedding, using an optimization approach that penalizes the worst case and rewards the best. Our experiments on image recognition and co-segmentation models demonstrate that using the proposed triplet loss with adaptive margin distance significantly improves performance.
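
The sketch below shows one hedged reading of this idea in PyTorch: the margin is derived from the batch's hardest and easiest triplets, which is an illustrative interpretation of "penalizing the worst case and rewarding the best", not the paper's exact formulation.

```python
# Triplet loss with an adaptive, batch-derived margin (illustrative).
import torch
import torch.nn.functional as F

def adaptive_triplet_loss(anchor, positive, negative):
    d_ap = F.pairwise_distance(anchor, positive)   # anchor-positive distances
    d_an = F.pairwise_distance(anchor, negative)   # anchor-negative distances
    gap = d_an - d_ap                              # per-triplet separation
    # adaptive margin: midpoint between the worst (smallest) and best
    # (largest) separation observed in the current batch
    margin = 0.5 * (gap.max() + gap.min()).detach().clamp(min=0.0)
    return F.relu(d_ap - d_an + margin).mean()

emb_a = torch.randn(32, 128)                       # e.g. siamese embeddings
loss = adaptive_triplet_loss(emb_a, torch.randn(32, 128), torch.randn(32, 128))
```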

Introduction to Electron Spectroscopy for Surfaces Characterization

Spectroscopy is the study of the spectra produced by radiation-matter interaction; it requires studying the electromagnetic radiation (or electrons) emitted, absorbed, or scattered by matter. Spectral analysis thus relies on spectrometers, which enable us to obtain curves expressing the distribution of the emitted energy (the spectrum). Analysis of emission spectra therefore gives rise to several methods, depending on the radiation energy range. The most common methods are Auger Electron Spectroscopy (AES) and Electron Energy Loss Spectroscopy (EELS), which allow the determination of the atomic structure at the surface. This paper focuses essentially on Electron Energy Loss Spectroscopy.

The Applicability of Distillation as an Alternative Nuclear Reprocessing Method

A customized two-stage model has been developed to simulate, analyse, and visualize the distillation of actinides as a useful alternative low-pressure separation method for nuclear recycling. Under the optimal conditions of idealized thermodynamic equilibrium stages and total reflux of distillate, the investigated chloride systems for the separation of such actinides are (A) UCl4-CsCl-PuCl3 and (B) ThCl4-NaCl-PuCl3. In simulation, uranium tetrachloride in case A is successfully separated in a six-stage distillation column, and thorium tetrachloride in case B in an eight-stage distillation column. For this, a permissible mole fraction of 1E-06 has been assumed for the residual impurity level. With a further separation effort of eleven to seventeen required stages, plutonium trichloride is shown in simulation to be separable from the monochlorides of both systems A and B as a high-purity distillation product.

Lagrangian Flow Skeletons Captured in the Wake of a Swimming Nematode C. elegans Using an Immersed Boundary Fluid-Structure Interaction Approach

In this paper, the Lagrangian coherent structure (LCS) concept is applied to the wake flows generated upstream and downstream of a swimming nematode C. elegans in an intermediate Reynolds number range, i.e., 250-1200. The LCS approach materializes hidden Lagrangian structures that depict flow transport barriers. To this end, the nematode swimming in a quiescent fluid environment is numerically simulated by a two-way fluid-structure interaction (FSI) approach with the aid of the immersed boundary method (IBM). The incompressible Navier-Stokes equations, fully coupled with Lagrangian deformation equations for the immersed body, are solved using the IB2d code. For all simulations, the nematode's body is modeled with a parametrized spring-fiber built-in case available in the computational code. Reverse von Kármán vortex street formation and vortex shedding characteristics are studied and discussed in detail via the LCS approach, including grid resolution, integration time, and Reynolds number effects. The results unveil the presence of flow regions with distinct fluid-particle fates in the swimming animal's wake and the formation of so-called 'mushroom-shaped' structures in the attracting LCS fields.
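
For readers unfamiliar with LCS extraction, the sketch below computes a finite-time Lyapunov exponent (FTLE) field, whose ridges mark candidate LCS; it uses the standard double-gyre test flow as a stand-in for the nematode FSI velocity fields of the paper.

```python
# FTLE sketch: advect a tracer grid, differentiate the flow map, and take
# the log of its largest stretching rate. Ridges of the FTLE field
# approximate LCS (transport barriers).
import numpy as np

A, eps, om = 0.1, 0.25, 2 * np.pi / 10   # standard double-gyre parameters

def velocity(x, y, t):
    a = eps * np.sin(om * t)
    f = a * x**2 + (1 - 2 * a) * x
    dfdx = 2 * a * x + (1 - 2 * a)
    u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
    v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
    return u, v

# advect a regular grid of tracer particles with forward-Euler steps
nx, ny, T, dt = 200, 100, 8.0, 0.05
xs, ys = np.linspace(0, 2, nx), np.linspace(0, 1, ny)
x0, y0 = np.meshgrid(xs, ys)
px, py = x0.copy(), y0.copy()
for t in np.arange(0.0, T, dt):
    u, v = velocity(px, py, t)
    px, py = px + dt * u, py + dt * v

# flow-map gradient and FTLE = (1/T) * log(max singular value)
dpx_dy, dpx_dx = np.gradient(px, ys[1] - ys[0], xs[1] - xs[0])
dpy_dy, dpy_dx = np.gradient(py, ys[1] - ys[0], xs[1] - xs[0])
ftle = np.zeros_like(px)
for i in range(ny):
    for j in range(nx):
        J = np.array([[dpx_dx[i, j], dpx_dy[i, j]],
                      [dpy_dx[i, j], dpy_dy[i, j]]])
        ftle[i, j] = np.log(np.linalg.norm(J, 2)) / T
# forward integration yields repelling LCS; integrate backward in time
# for attracting LCS such as the 'mushroom-shaped' structures above
```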

De Broglie Wavelength Defined by the Rest Energy E0 and Its Velocity

In this paper, we take a different approach to the de Broglie wavelength by relating it to relativistic physics. The quantum energy of the photon radiated by a body with de Broglie wavelength, as it moves with velocity v, can be defined within relativistic physics by the rest energy E₀. In this way, we can show the connection between the quantum of the body's radiation energy and the rest energy E₀, and thus combine what has so far been incompatible, namely relativistic and quantum physics. We therefore discuss the unification of relativistic and quantum physics by introducing a factor k that is analogous to the Lorentz factor in Einstein's theory of relativity.
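
Under standard relativistic definitions, the link between the de Broglie wavelength and the rest energy E₀ can be written as follows (the factor k itself is specific to the paper and is not reproduced here):

```latex
% de Broglie wavelength expressed through the rest energy E_0,
% using only the standard definitions of momentum and the Lorentz factor
\lambda = \frac{h}{p}, \qquad
p = \gamma m_0 v = \frac{\gamma E_0 v}{c^2}, \qquad
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}
\quad\Longrightarrow\quad
\lambda = \frac{h c^2}{\gamma E_0 v}
```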

Fast and Robust Long-term Tracking with Effective Searching Model

Kernelized Correlation Filter (KCF) based trackers have gained a lot of attention recently because of their accuracy and fast calculation speed. However, the algorithm is not robust in cases where the object is lost through a sudden change of direction, occlusion, or leaving the field of view. To improve KCF performance in long-term tracking, this paper proposes an anomaly detection method for target-loss warning, based on analyzing the response map of each frame, and a classification algorithm for a reliable target re-locating mechanism based on random ferns. Tested on the Visual Tracker Benchmark and Visual Object Tracking datasets, the proposed algorithm achieved precision and success rates 2.92 and 2.61 times higher than those of the original KCF algorithm, respectively. Moreover, the proposed tracker handles occlusion better than many state-of-the-art long-term tracking methods while running at 60 frames per second.
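
A common way to implement such a response-map anomaly check is the peak-to-sidelobe ratio (PSR) widely used in correlation-filter tracking; the sketch below is a generic PSR-based loss detector, with the threshold as an assumed value, and does not reproduce the paper's random-fern re-detection stage.

```python
# Generic target-loss warning from a correlation-filter response map.
import numpy as np

def psr(response, exclude=5):
    """Peak-to-sidelobe ratio of a 2D response map."""
    r, c = np.unravel_index(response.argmax(), response.shape)
    peak = response[r, c]
    mask = np.ones_like(response, dtype=bool)
    mask[max(r - exclude, 0):r + exclude + 1,
         max(c - exclude, 0):c + exclude + 1] = False   # exclude peak region
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-9)

def target_lost(response, threshold=5.0):
    # a flat/ambiguous response (low PSR) suggests occlusion or target loss
    return psr(response) < threshold
```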

Image Processing Approach for Detection of Three-Dimensional Tree-Rings from X-Ray Computed Tomography

Tree-ring analysis is an important part of the quality assessment and dating of (archaeological) wood samples. It provides quantitative data about the whole anatomical ring structure, which can be used, for example, to measure the impact of a fluctuating environment on tree growth, for the dendrochronological analysis of archaeological wooden artefacts, and to estimate the mechanical properties of wood. Despite advances in computer vision and edge recognition algorithms, the detection and counting of annual rings are still limited to 2D datasets and performed in most cases manually, a time-consuming, tedious task that depends strongly on the operator's experience. This work presents an image processing approach to detect the whole 3D tree-ring structure directly from X-ray computed tomography imaging data. The approach relies on a modified Canny edge detection algorithm, which captures fully connected tree-ring edges throughout the measured image stack and is validated on X-ray computed tomography data taken from six wood species.
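
As a simplified illustration of the per-slice edge-detection step, the snippet below applies OpenCV's stock Canny detector to each slice of a CT stack; the paper's modification that enforces fully connected edges across the stack is not reproduced, and the thresholds are placeholder values.

```python
# Per-slice Canny edge detection over a CT volume (simplified stand-in
# for the paper's modified 3D-connected approach).
import numpy as np
import cv2

def ring_edges(stack, low=50, high=150):
    """stack: (n_slices, H, W) uint8 CT volume -> boolean edge volume."""
    edges = np.zeros(stack.shape, dtype=bool)
    for i, sl in enumerate(stack):
        blurred = cv2.GaussianBlur(sl, (5, 5), 0)    # suppress CT noise
        edges[i] = cv2.Canny(blurred, low, high) > 0
    return edges

volume = (np.random.rand(10, 256, 256) * 255).astype(np.uint8)  # placeholder data
edge_volume = ring_edges(volume)
```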

Study of Compatibility and Oxidation Stability of Vegetable Insulating Oils

The use of vegetable oil (or natural ester) as an insulating fluid in electrical transformers is a trend that aims to contribute to environmental preservation, since it is biodegradable and non-toxic. In addition, vegetable oil has high flash and combustion points and is considered a fire-safe fluid. However, vegetable oil is usually less stable towards oxidation than mineral oil. Both insulating fluids, mineral and vegetable oils, need to be tested periodically according to specific standards. Oxidation stability can be determined from the induction period measured by the conductivity (Rancimat) method, which monitors the effectiveness of the oil's antioxidant additives, a methodology already developed for food applications and biodiesel but not yet standardized for insulating fluids. Besides adequate oxidation stability, fluids must be compatible with the transformer's construction materials under normal operating conditions to ensure that damage to the oil and to parts of the transformer does not occur. The ASTM standard and the Brazilian standard differ in the parameters evaluated, which reveals the need to regulate tests for each oil type. The aim of this study was to assess the oxidation stability and compatibility of vegetable oils in order to suggest the best way to ensure viable performance of vegetable oil as a transformer insulating fluid. The induction period of several vegetable insulating oils from the local market was determined with the Rancimat method according to the BS EN 14112 standard at different temperatures (110, 120, and 130 °C). The compatibility of vegetable oil was also assessed according to ASTM and ABNT NBR standards. The main results showed that the best temperature for the Rancimat test is 130 °C, which allows a better observation of the conductivity change. The compatibility test results revealed differences between the vegetable and mineral oil standards that should be taken into account in oil testing, since materials compatibility and oxidation stability are essential for equipment reliability.

The Canaanite Trade Network between the Shores of the Mediterranean Sea

The Canaanite civilization was one of the early great civilizations of the Near East; it influenced, and was influenced by, the civilizations of the ancient world, especially those of Egypt and Mesopotamia. Canaanite trade developed from the Chalcolithic Age to the Iron Age along the oldest trade route in the Middle East. This paper focuses on defining who the Canaanites were and where they came from, the meaning of the term Canaan, and how ancient manuscripts define the borders of the land of Canaan; it also describes the Canaanite trade route and exported goods such as cedar wood and pottery.

Platform-as-a-Service Sticky Policies for Privacy Classification in the Cloud

In this paper, we present a Platform-as-a-Service (PaaS) model for controlling the privacy enforcement mechanisms applied to user data when stored and processed in Cloud data centers. The proposed architecture establishes user-configurable 'sticky' policies on the Graphical User Interface (GUI) data-bound components during the application development phase, specifying the details of privacy enforcement on the contents of these components. Various privacy classification classes on the data components are formally defined to give the user full control over the degree and scope of privacy enforcement, including the type of execution containers used to process the data in the Cloud. This not only enhances the privacy-awareness of the developed Cloud services, but also yields major savings in performance and energy efficiency, because the privacy mechanisms are applied solely to sensitive data units rather than to all user content. The proposed design is implemented in a real PaaS cloud computing environment on the Microsoft Azure platform.
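
The snippet below sketches, purely hypothetically, what a sticky policy attached to a GUI data-bound component might look like; the class names, privacy classes, and container labels are invented for illustration and are not the paper's actual API.

```python
# Hypothetical sticky-policy sketch: the policy travels with the data-bound
# component, and only sensitive classes pay the enforcement cost.
from dataclasses import dataclass
from enum import Enum

class PrivacyClass(Enum):
    PUBLIC = 0          # no enforcement applied
    CONFIDENTIAL = 1    # e.g. encrypted at rest
    RESTRICTED = 2      # e.g. encrypted + trusted execution container only

@dataclass
class StickyPolicy:
    component_id: str            # GUI data-bound component the policy sticks to
    privacy_class: PrivacyClass  # degree of privacy enforcement
    container: str               # type of Cloud execution container required

# only the SSN field is RESTRICTED; other content skips enforcement entirely
ssn_policy = StickyPolicy("txt_ssn", PrivacyClass.RESTRICTED, "isolated-vm")
```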

Towards End-To-End Disease Prediction from Raw Metagenomic Data

Analysis of the human microbiome using metagenomic sequencing data has demonstrated high ability in discriminating various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequences read from the fragmented DNA sequences and stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use and time-consuming, and they rely on a large number of parameters that often introduce variability and impact the estimation of the microbiome elements. Training Deep Neural Networks directly on raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most such methods use the concept of word and sentence embeddings, which creates a meaningful numerical representation of DNA sequences while extracting features and reducing the dimensionality of the data. In this paper we present an end-to-end approach, metagenome2vec, that classifies patients into disease groups directly from raw metagenomic reads. The approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple instance learning classifier that predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to the influence of each genome on the prediction. Using two public real-life datasets as well as a simulated one, we demonstrate that this original approach reaches performance comparable with state-of-the-art methods applied directly to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that, with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
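
A minimal sketch of steps (i) and (ii) is shown below, using gensim's Word2Vec as a stand-in for the embedding model; the k-mer size, hyperparameters, and toy reads are illustrative values, not the paper's settings.

```python
# Steps (i)-(ii) in miniature: build a k-mer vocabulary from raw reads,
# learn k-mer embeddings, then average them into a read-level embedding.
import numpy as np
from gensim.models import Word2Vec

def kmerize(read, k=4):
    """Slide a window of length k over a read to produce its k-mer 'words'."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

reads = ["ATCGGCTAAGCT", "GGCTAATCGATC", "TTAGCCGATCGG"]   # toy fastq reads
sentences = [kmerize(r) for r in reads]                    # reads as k-mer sentences

model = Word2Vec(sentences, vector_size=16, window=5, min_count=1, epochs=50)

def read_embedding(read, k=4):
    """Average of the read's k-mer vectors: a simple read embedding."""
    return np.mean([model.wv[km] for km in kmerize(read, k)], axis=0)

print(read_embedding("ATCGGCTAAGCT").shape)   # (16,)
```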

Environmental Study on Urban Disinfection Using an On-site Generation System

In this experimental study, the behaviors of Mixed Oxidant Solution (MOS) components and sodium hypochlorite (HYPO), the most commonly applied surface disinfectant, were compared in terms of the effectiveness of chlorine disinfection as a function of contact time and residual chlorine. The variation of pH, free available chlorine (FAC) concentration, and electric conductivity (EC) of disinfection solutions at different concentrations was monitored over a 48 h contact time. In parallel, the plant stress activated by chlorine-based disinfectants was assessed by comparing MOS and HYPO. The pH and EC of the plant soil, and the environmental impacts of the disinfection solutions, were analyzed at several FAC concentrations (500 mg/L, 1000 mg/L, and 5000 mg/L) in irrigation water. All experiments were carried out at the service station of Sant Cugat, Spain. The outcomes indicated a lower pH and higher durability for MOS than for HYPO at the same FAC concentration, which points to promising stability of FAC within MOS. Furthermore, the pH and EC values of plant soil irrigated with the NaOCl solution were higher than those with the MOS solution at the same FAC concentration. On-site generation of MOS as a safe chlorination option might therefore be considered a promising prospect for future smart cities.

Effective Leadership in the Engineering, Technology, and Construction Industry

This paper explores the effective leadership being employed in the engineering, technology, and construction (ETC) industry. Organizations need to understand which character traits are being used and which leadership styles work to promote sustainability and improve the triple bottom line. This paper reviews multiple publications on leadership and on the character traits effective for managers and leaders in the ETC industry. The ETC industry is a trillion-dollar industry, and understanding ways to improve leadership is vital for organizations' successful outcomes. With improvements to management and leadership, organizations could increase profits and cut costs. Finding ways to improve motivation can help organizations improve safety, improve culture, and increase employee motivation. From the research, this paper finds that situational, transformational, and transactional leadership are the most effective styles that individuals can use in the ETC industry. The character traits that are most effective have also been identified in this research. This research contributes to the ways individuals who start in the engineering and technology industry can improve their leadership skills as they are promoted into managerial and leadership roles. Improvement in managerial positions in the ETC industry, such as project and construction managers, is vital for successful outcomes and high-level performance. The study helps fill a gap in the limited research available on improving ETC leadership for all organizations, present and future.

Mnemotopic Perspectives: Communication Design as Stabilizer for the Memory of Places

The ancestral relationship between humans and the geographical environment has long been at the center of an interdisciplinary dialogue, one of whose main research nodes is the relationship between memory and places. Given its deep complexity, this symbiotic connection continues to seek a proper definition, one that appears increasingly negotiated among different disciplines. Numerous fields of knowledge are involved, from anthropology to the semiotics of space, from photography to architecture, up to subjects traditionally far from these discussions. This is the case of Design of Communication, a young discipline, now confident in itself and its objectives, aimed at finding and investigating original forms of visualization and representation, between sedimented knowledge and new technologies. In particular, Design of Communication for the Territory offers an alternative perspective to the debate, encouraging the reactivation and reconstruction of the memory of places. Recognizing mnemotopes as cultural objects for a vertical interpretation of the memory-place relationship, design can become a real mediator of the territorial fixation of memories, making them increasingly accessible and perceptible and contributing to building a topography of memory. According to a mnemotopic vision, Communication Design can support the passage from a memory in which the observer participates only as an individual to a collective form of memory. A mnemotopic form of Communication Design can, through geolocation and map-based content systems, turn chronology into a topography rooted in the territory and open to practice; it can help us understand how the perception of the memory of places changes over time and how to insert such memories into the contemporary world. Mnemotopes can be materialized in different formats of translation, editing, and narration, and then incorporated into complex systems of communication. The memory of places, therefore, if stabilized by the tools offered by Communication Design, can make visible ruins and territorial stratifications, illuminating them with new communicative interests that can be shared and participated in.