Random Subspace Neural Classifier for Meteor Recognition in the Night Sky

This article describes the Random Subspace Neural Classifier (RSC) for the recognition of meteors in the night sky. We used images of meteors entering the atmosphere at night between 8:00 p.m. and 5:00 a.m. The objective of this project is to classify meteor and star images (with stars as the image background). Sky monitoring and meteor classification are carried out to support future scientific applications. The image database was collected from different websites. We worked with RGB images of 220x220 pixels stored in bitmap (BMP) format. Window scanning and processing were then carried out for each image. Features were extracted from a 20x20-pixel scanning window moved with a step size of 10 pixels. Brightness, contrast and contour orientation histograms were used as inputs to the RSC. The RSC distinguished two classes: 1) images with meteors and 2) images without meteors. Different tests were carried out by varying the number of training cycles and the number of images used for training and recognition. The percentage error of the neural classifier was calculated. The results show a good RSC response, with 89% correct recognition. The results of these experiments are presented and discussed.
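
As a rough illustration of the feature-extraction step described above, the sketch below computes brightness, contrast and a contour-orientation histogram for each 20x20 window with a 10-pixel step; the function names, histogram bin count and gradient-based contour estimate are assumptions for illustration, not details from the paper.

```python
# Hypothetical sketch of per-window feature extraction: brightness, contrast and a
# contour-orientation histogram for each 20x20 window of a 220x220 grayscale image,
# moving in 10-pixel steps. Names and bin counts are illustrative assumptions.
import numpy as np

def window_features(gray, win=20, step=10, n_bins=8):
    """Return one feature vector per scanning window."""
    gy, gx = np.gradient(gray.astype(float))          # simple contour (edge) gradients
    magnitude = np.hypot(gx, gy)
    orientation = np.arctan2(gy, gx)                  # in [-pi, pi]
    features = []
    h, w = gray.shape
    for y in range(0, h - win + 1, step):
        for x in range(0, w - win + 1, step):
            patch = gray[y:y + win, x:x + win].astype(float)
            brightness = patch.mean()
            contrast = patch.std()
            hist, _ = np.histogram(orientation[y:y + win, x:x + win],
                                   bins=n_bins, range=(-np.pi, np.pi),
                                   weights=magnitude[y:y + win, x:x + win])
            hist = hist / (hist.sum() + 1e-9)          # normalized orientation histogram
            features.append(np.concatenate(([brightness, contrast], hist)))
    return np.vstack(features)

# Example: a 220x220 image yields 21x21 = 441 windows of 10 features each.
image = np.random.randint(0, 256, (220, 220))
print(window_features(image).shape)                    # (441, 10)
```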

Identifying Chaotic Architecture: Origins of Nonlinear Design Theory

With the emergence of modern architecture, a strong desire for new design theories appeared in the works of architects and critics. The discourse on complexity and volumetric composition became an important and controversial issue in the discipline of architecture, discussed from a general point of view in Robert Venturi's 1966 book “Complexity and Contradiction in Architecture”. This paper attempts to identify chaos theory as a scientific model of complexity and to relate it to architectural design theory by conducting a qualitative analysis and a multidisciplinary critical approach drawing on architecture and basic science resources. Accordingly, we identify chaotic architecture as the correlation between chaos theory and the discipline of architecture, and as an independent nonlinear design theory with specific characteristics and properties.

Improving the Software Homologation Process through Peer Review: An Experience Report on Android Development Environment

In the current technological market environment, ensuring the quality of new products has become a complex challenge. In this scenario, companies have been investing in solutions that aim to reduce the execution time of software testing and lead to cost efficiency. However, companies with complex and specialized testing environments usually face barriers related to costly testing processes, especially in distributed settings. Sidia Institute of Technology works on research and development for the Android platform for mobile devices in Latin America. Working in a global software development (GSD) scope, we have faced barriers caused by failures detected late, which have delayed the homologation release process on Android projects. We therefore adopted an Internal Review process as an alternative to reduce these failures. This paper presents the experience of a homologation team adopting an Internal Review process in order to increase performance by improving test efficiency. Using this approach, it was possible to achieve a substantial improvement in the quality, reliability and timeliness of our deliveries. Quantitative analysis identified a 6% increase in homologation efficiency after adoption of the process. In addition, we performed a qualitative analysis of data collected through an online questionnaire. In particular, the results show that the association between failure reduction and adoption of the review process improves quality, which has a positive effect on project milestones. We hope this report can help other companies and the scientific community improve their processes, thereby increasing their competitive advantage.

Readiness of Intellectual Capital Measurement: A Review of the Property Development and Investment Industry

In the knowledge economy, financial indicators are not the only instruments for gauging the performance of a company. The contribution of intellectual capital to company performance is increasing. To measure company performance attributable to intellectual capital, the value-added intellectual coefficient (VAIC) model is adopted to measure the intellectual capital utilization efficiency of the subject companies. The purpose of this study is to review the readiness for measuring intellectual capital of Hong Kong-listed companies in the property development and property investment industry using the VAIC model. This study covers the financial reports of representative Hong Kong-listed property development and property investment companies for the period 2014-2019. The findings indicate that the industry is ready for IC measurement employing the VAIC framework but not yet ready for the extended VAIC model.
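
For reference, the Pulic VAIC model referred to above is commonly expressed through the following relations (a sketch of the standard formulation; the study's extended VAIC variant may add further components):

```latex
% Commonly cited form of Pulic's VAIC model (for reference; variant definitions exist):
\begin{align*}
VA  &= OUT - IN \quad\text{(value added)}\\
HCE &= \frac{VA}{HC}, \qquad SCE = \frac{VA - HC}{VA}, \qquad CEE = \frac{VA}{CE}\\
VAIC &= HCE + SCE + CEE
\end{align*}
% HC: human capital (total salaries); CE: capital employed; HCE, SCE, CEE: human,
% structural and capital-employed efficiency components.
```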

Improved BEENISH Protocol for Wireless Sensor Networks Based Upon Fuzzy Inference System

Energy consumption is the main design parameter of a wireless sensor network (WSN). To address it, hierarchical clustering is a technique that helps extend network lifetime by consuming energy efficiently. This paper focuses on WSNs and on a fuzzy inference system (FIS) deployed to enhance the BEENISH protocol. Node energy, mobility, pause time and density are considered for the selection of the cluster head (CH). The simulation outcomes show that the proposed system outperforms the traditional one with regard to energy utilization and the number of packets transmitted to the sink.
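
As a hedged illustration of how a fuzzy inference system could combine the four inputs mentioned above to rank cluster-head candidates, consider the following sketch; the membership functions, rule weights and scoring scheme are assumptions, not the protocol's actual design.

```python
# Illustrative sketch (not the authors' implementation) of ranking cluster-head
# candidates from node energy, mobility, pause time and density with fuzzy memberships.
def tri(x, a, b, c):
    """Triangular membership function."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def ch_chance(energy, mobility, pause_time, density):
    """All inputs normalized to [0, 1]; returns a cluster-head suitability score."""
    high_energy  = tri(energy,     0.4, 1.0, 1.6)   # favour high residual energy
    low_mobility = tri(mobility,  -0.6, 0.0, 0.6)   # favour static nodes
    long_pause   = tri(pause_time, 0.4, 1.0, 1.6)
    high_density = tri(density,    0.4, 1.0, 1.6)   # favour well-connected nodes
    # Weighted aggregation of rule strengths (a crisp stand-in for defuzzification).
    return 0.4 * high_energy + 0.2 * low_mobility + 0.2 * long_pause + 0.2 * high_density

nodes = {"n1": (0.9, 0.1, 0.8, 0.7), "n2": (0.5, 0.6, 0.3, 0.9)}
cluster_head = max(nodes, key=lambda n: ch_chance(*nodes[n]))
print(cluster_head)   # n1: high energy, low mobility
```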

Exploring the Challenges to Usage of Building and Construction Cost Indices in Ghana

Price fluctuation clauses in contracts are of paramount importance in the construction industry, as they provide adequate relief and cushioning against changes in the prices of input resources during construction. As a result, several methods have been devised to help arrive at fair recompense in the event of price changes. However, stakeholders often appear not to be satisfied with the existing methods of fluctuation evaluation, ostensibly because of the challenges associated with them. The aim of this study was to identify the challenges to the usage of building and construction cost indices in Ghana. Data were gathered from contractors and quantity surveying firms. The study used a survey questionnaire to elicit responses from the contractors and the consultants. The data gathered were analyzed using the Relative Importance Index (RII) to rank the problems associated with the existing methods. The findings revealed, among others, late release of data, inadequate recovery of costs, and omission of work items of interest from the published indices as the main challenges of the existing methods. The findings provide useful lessons for policy makers and practitioners in decision making on the usage and improvement of the available indices.
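
For reference, the Relative Importance Index used for the ranking is commonly computed as follows (assuming a five-point rating scale, which the abstract does not state explicitly):

```latex
% Standard form of the Relative Importance Index (five-point scale assumed):
\[
RII = \frac{\sum W}{A \times N}
    = \frac{5 n_5 + 4 n_4 + 3 n_3 + 2 n_2 + 1 n_1}{5 \times N},
\qquad 0 \le RII \le 1,
\]
% where W is the rating given by each respondent, A the highest rating (here 5),
% N the number of respondents, and n_i the number of respondents giving rating i.
```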

Wildfires Assessed by Remote Sensing Images and Burned Land Monitoring

The tools described in this paper enable the location of burned areas where natural habitats were destroyed and establish a baseline for major changes in forest ecosystems during recovery. Moreover, the results allow follow-up of surface fuel loading, supporting the evaluation and guidance of restoration measures in remote areas through phased time planning. This case study evaluates burned areas in mainland Portugal that suffered successive wildfires during the summer of 2017, which killed more than 60 people. The goal is to show that this evaluation can be done with freely available remote sensing data on an ordinary laptop using open-source software, describing the not-so-simple methodology step by step to make it accessible to local workers in the affected areas, where the availability of information is essential for the immediate planning of mitigation measures such as restoring road access, allocating funds for the recovery of human dwellings, and assessing further needs for restoration of the ecological system. Wildfires also devastate forest ecosystems, having a direct impact on vegetation cover and killing or driving away the animal population, besides the loss of crops in rural areas that are essential as local resources. Economic interests are also affected, as burned pinewood becomes useless for the noblest applications, so its value decreases, and resin extraction stops for several years.
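
The abstract does not name the spectral index used; a common way to map burned areas from freely available satellite imagery is the differenced Normalized Burn Ratio (dNBR), sketched below with assumed Sentinel-2 bands and an assumed severity threshold.

```python
# Illustrative burned-area mapping with dNBR from pre- and post-fire reflectance
# (e.g. Sentinel-2 NIR band B8A and SWIR band B12). Band choice and threshold are
# assumptions, not values taken from the paper.
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio for one acquisition."""
    return (nir - swir) / (nir + swir + 1e-9)

def burned_mask(pre_nir, pre_swir, post_nir, post_swir, threshold=0.27):
    """dNBR = NBR(pre-fire) - NBR(post-fire); higher values indicate burn severity."""
    dnbr = nbr(pre_nir, pre_swir) - nbr(post_nir, post_swir)
    return dnbr, dnbr > threshold   # ~0.27 is a commonly quoted moderate-severity cut-off

# Usage with reflectance arrays loaded from pre- and post-fire images:
# dnbr, mask = burned_mask(pre_b8a, pre_b12, post_b8a, post_b12)
# burned_area_ha = mask.sum() * (20 * 20) / 10_000   # for 20 m pixels
```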

Injury Prediction for Soccer Players Using Machine Learning

Injuries in professional sports occur on a regular basis. Some may be minor, while others can have a huge impact on a player’s career and earning potential. In soccer, there is a high risk of players picking up injuries during game time. This research work seeks to help soccer players reduce the risk of getting injured by predicting the likelihood of injury in the near future and then providing recommendations for intervention. The injury prediction tool uses a soccer player’s number of minutes played on the field, number of appearances, distance covered and performance data for the current and previous seasons as variables to conduct statistical analysis and provide injury predictions using a machine learning linear regression model.
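
A minimal sketch of the kind of model described, fitting a linear regression on workload-style features, is shown below; the feature set, synthetic data and target scale are illustrative assumptions rather than the paper's dataset.

```python
# Minimal sketch: predicting an injury-likelihood score from workload variables
# with linear regression. Data here are synthetic, for demonstration only.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.integers(0, 3500, n),        # minutes played this season
    rng.integers(0, 40, n),          # appearances
    rng.uniform(0, 400, n),          # distance covered (km)
    rng.uniform(0, 10, n),           # performance rating, previous season
])
# Synthetic target: injury likelihood grows with workload (illustrative only).
y = 0.00015 * X[:, 0] + 0.004 * X[:, 1] + 0.0005 * X[:, 2] + rng.normal(0, 0.05, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("R^2 on held-out players:", round(model.score(X_test, y_test), 3))
print("Predicted likelihood:", model.predict([[2800, 34, 310.0, 7.2]]))
```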

Timed and Colored Petri Nets for Modeling and Verifying Cloud System Elasticity

Elasticity is an essential property of cloud computing. As the name suggests, it is the ability of a cloud system to adjust resource provisioning in response to fluctuating workloads. There are two types of elasticity operations, vertical and horizontal. In this work, we are interested in horizontal scaling, which is ensured by two mechanisms: scaling in and scaling out. Depending on the sizing of the system, scaling in is adopted in the event of over-provisioning and scaling out in the event of under-provisioning. In this paper, we propose a formal model, based on timed and colored Petri nets (TdCPNs), for modeling the duplication and removal of a virtual machine on a server. The model is based on the formal Petri net (PN) modeling language. The proposed models are edited, verified, and simulated with two examples implemented in CPN Tools, a modeling tool for colored and timed PNs.
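
The horizontal elasticity behaviour that the Petri net models capture can be summarized as threshold-based scaling logic of the following kind (a sketch with assumed utilization thresholds, not the paper's formal model):

```python
# Illustrative threshold logic for horizontal elasticity: scale out (duplicate a VM)
# on under-provisioning, scale in (remove a VM) on over-provisioning.
# Threshold values and names are assumptions, not taken from the paper.
def elasticity_decision(load, capacity, upper=0.8, lower=0.3):
    """Return 'scale_out', 'scale_in' or 'steady' based on current utilization."""
    utilization = load / capacity
    if utilization > upper:
        return "scale_out"    # under-provisioned: add (duplicate) a virtual machine
    if utilization < lower:
        return "scale_in"     # over-provisioned: remove a virtual machine
    return "steady"

vms, vm_capacity = 2, 100
for load in (150, 170, 40):
    action = elasticity_decision(load, vms * vm_capacity)
    vms += {"scale_out": 1, "scale_in": -1, "steady": 0}[action]
    print(load, action, "->", vms, "VMs")
```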

Auditory Brainstem Response in Wave VI for the Detection of Learning Disabilities

The brainstem auditory evoked potential (BAEP) is a common way to study hearing function; by studying the behaviour of wave VI, it is also a way to learn about the functionality of the brain's neuronal groups that intervene in the learning process. The latest advances in neuroscience have revealed the existence of distinct brain activity in the learning process that can be highlighted through the use of innocuous, low-cost and easily accessible techniques such as, among others, the BAEP, which can help detect possible neurodevelopmental difficulties early for subsequent assessment and treatment. To date, and to the authors' best knowledge, only latency data obtained by observing waves I to V, and mainly in the left ear, were taken into account. This work shows that it is essential to consider both ears; with these additional data, it has been possible to diagnose more precisely some cases that with the previous data had been diagnosed as “normal” despite showing signs of some alteration that motivated a new consultation with the specialist.

Utilization of Schnerr-Sauer Cavitation Model for Simulation of Cavitation Inception and Super Cavitation

In this study, the Reynolds-averaged Navier-Stokes (RANS) framework is utilized to investigate the flow inside a diesel injector nozzle. The flow is treated as multiphase, since the formation of vapor due to the pressure drop is captured. For pressure-velocity coupling, the coupled algorithm is used. Although the cavitation phenomenon is inherently unsteady, a quasi-steady approach is utilized to save time and computational resources in the current study. The Schnerr-Sauer cavitation model is used, which is capable of predicting flow behavior at both the initial and final stages of the cavitation process. Two different turbulence models were used in this study to clarify which one is more capable of predicting cavitation inception and supercavitation. It was found that the k-ε model was more compatible with the Schnerr-Sauer cavitation model; therefore, it is used for the rest of this study.
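
For reference, the Schnerr-Sauer mass-transfer model is commonly written as follows (standard form shown for orientation; the coefficients and notation used in the study may differ):

```latex
% Commonly cited form of the Schnerr-Sauer mass-transfer terms:
\begin{align*}
\dot m_{e} &= \frac{\rho_v \rho_l}{\rho}\,\alpha(1-\alpha)\,\frac{3}{R_B}
              \sqrt{\frac{2}{3}\,\frac{p_v - p}{\rho_l}}, & p &\le p_v \quad\text{(evaporation)}\\
\dot m_{c} &= \frac{\rho_v \rho_l}{\rho}\,\alpha(1-\alpha)\,\frac{3}{R_B}
              \sqrt{\frac{2}{3}\,\frac{p - p_v}{\rho_l}}, & p &> p_v \quad\text{(condensation)}\\
R_B &= \left(\frac{\alpha}{1-\alpha}\,\frac{3}{4\pi n_0}\right)^{1/3}
\end{align*}
% alpha: vapor volume fraction; p_v: vapor pressure; n_0: bubble number density;
% rho, rho_l, rho_v: mixture, liquid and vapor densities; R_B: bubble radius.
```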

The Latency-Amplitude Binomial of Waves Resulting from the Application of Evoked Potentials for the Diagnosis of Dyscalculia

Recent advances in cognitive neuroscience have allowed a step forward in understanding the processes involved in learning, whether acquiring new information or modifying existing mental content. The evoked potentials technique reveals how basic brain processes interact to achieve adequate and flexible behaviours. The objective of this work is to study, using evoked potentials, whether it is possible to distinguish if a patient suffers from a specific type of learning disorder, in order to decide on the possible therapies to follow. The methodology used in this work is to analyze the dynamics of different brain areas during a cognitive activity and to find the relationships between the areas analyzed, in order to better understand the functioning of neural networks. In addition, the latest advances in neuroscience have revealed the existence of distinct brain activity in the learning process that can be highlighted through the use of non-invasive, innocuous, low-cost and easily accessible techniques such as, among others, evoked potentials, which can help detect possible neurodevelopmental difficulties early for subsequent assessment and therapy. From the study of the amplitudes and latencies of the evoked potentials, it is possible to detect brain alterations in the learning process, specifically in dyscalculia, and to define corrective measures for the application of personalized psycho-pedagogical plans that allow optimal integral development of the affected people.

Analysis of Incidences of Collapsed Buildings in the City of Douala, Cameroon from 2011-2020

This study focuses on the problem of collapsed buildings within the city of Douala over the past ten years, more precisely the period from 2011 to 2020. It was carried out in a bid to ascertain the real causes of this phenomenon, which has become recurrent in the leading economic city of Cameroon. To achieve this, it was first necessary to review works dealing with construction materials and technology as well as case histories of structural collapse within the city. Thereafter, a statistical study was carried out on the results obtained. It was found that the causes of building collapses in the city of Douala are: neglect of administrative procedures, use of poor-quality materials, poor mix design and preparation of concrete, lack of geotechnical studies, lack of structural analysis and design, corrosion of reinforcement bars, poor maintenance of buildings, and other causes. Out of the 46 cases of failure and collapse of buildings within the city of Douala, 7 were identified as having had no geotechnical study carried out, a percentage of 15.22%. It was also observed that, out of the 46 cases of structural failure, 6 resulted from a lack of proper structural analysis and design, a percentage of 13.04%. Subsequently, recommendations and suggestions are made, placing particular emphasis on the choice of materials, the mixing and casting of concrete, and the placement of the required reinforcement, all of which help guarantee the stability of a building.

Using Design Sprint for Software Engineering Undergraduate Student Projects: A Method Paper

Software engineering curriculums generally include industry-based practices such as project-based learning (PBL), which mainly focuses on efficient and innovative product development. These approaches can be tailored and used in project-based modules in software engineering curriculums. However, there have been very limited attempts in this area, especially in the Sri Lankan context. This paper describes a tailored pedagogical approach based on the design sprint, and its results, which can be used for project-based modules in software engineering (SE) curriculums. A controlled group of second-year software engineering students was selected for the study. The results indicate that all of the students agreed that the design sprint approach is effective in group-based projects, and 83% of students stated that it reduced rework compared to traditional project approaches. The tailored process was effective, easy to implement and produced the desired results at the end of the session while providing students an enjoyable experience.

Depth Estimation in DNN Using Stereo Thermal Image Pairs

Depth estimation using stereo images is a challenging problem in computer vision, and many studies have been carried out to solve it. With advances in machine learning, this problem is now often tackled with neural network-based solutions. The images used in these studies are mostly in the visible spectrum. However, the need to use the infrared (IR) spectrum for depth estimation has emerged because it gives better results than the visible spectrum under some conditions. We therefore propose using thermal-thermal (IR) image pairs for depth estimation. In this study, we used two well-known networks (PSMNet, FADNet) with minor modifications to demonstrate the viability of this idea.
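
The abstract does not detail the minor modifications; one typical adaptation when feeding single-channel thermal images into stereo networks designed for three-channel RGB input is to replace the first convolution layer, as in the hypothetical sketch below (the layer name and weight-averaging strategy are assumptions, not the authors' actual change).

```python
# Hypothetical adaptation of an RGB stereo network to 1-channel thermal input.
import torch
import torch.nn as nn

def adapt_first_conv(stereo_net: nn.Module, attr: str = "firstconv"):
    """Swap a 3-channel input conv for a 1-channel one, averaging existing weights."""
    old = getattr(stereo_net, attr)               # assumed name of the first conv layer
    new = nn.Conv2d(1, old.out_channels, old.kernel_size,
                    stride=old.stride, padding=old.padding, bias=old.bias is not None)
    with torch.no_grad():
        new.weight.copy_(old.weight.mean(dim=1, keepdim=True))   # reuse RGB weights
    setattr(stereo_net, attr, new)
    return stereo_net

class DummyStereoBackbone(nn.Module):
    """Stand-in for the first stage of a stereo network such as PSMNet or FADNet."""
    def __init__(self):
        super().__init__()
        self.firstconv = nn.Conv2d(3, 32, 3, stride=2, padding=1)

net = adapt_first_conv(DummyStereoBackbone())
left, right = torch.randn(2, 1, 256, 512), torch.randn(2, 1, 256, 512)
print(net.firstconv(left).shape)   # thermal pairs now enter as (batch, 1, H, W)
```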

Effectiveness and Performance of Spatial Communication within Composite Interior Space: The Wayfinding System in the Saudi National Museum as a Case Study

The wayfinding system affects the course of a museum journey for visitors, both directly and indirectly. The design aspects of this system play an important role, making it an effective communication system within the museum space. However, the concepts that underpin its design, which are based on integration and connectivity in museum space design, such as intelligibility, have not been translated into specific design considerations with reference to the most important approaches. These approaches link the organizational and practical aspects to the semiotic and semantic aspects related to space syntax by targeting the visual and perceived consistency of visitors. In this context, the present study aims to identify how to apply the concept of intelligibility, by employing integration and connectivity, to the design of a wayfinding system in museums as a kind of composite interior space. Using the available plans and images to extrapolate the considerations used to design the wayfinding system in the Saudi National Museum as a case study, a descriptive analytical method was used to understand the basic organizational and morphological principles of the museum space through the main aspects of space design (the morphological and the pragmatic). The study’s methodology is based on describing and analyzing the basic organizational and morphological principles of the museum space at the level of the major morphological and pragmatic design layers (based on the available pictures and diagrams), together with an inductive assessment of the level of intelligibility applied in the spatial layout of the Hall of Islam and Arabia at the Saudi National Museum, within the framework of a case study, through verification of the properties of the concepts of connectivity and integration. The results indicate that the application of the characteristics of intelligibility is weak at both the pragmatic and morphological levels. Based on the concepts of connectivity and integration, we conclude the following: (1) a high level of reflection of the properties of connectivity at the pragmatic level; (2) a weak level of reflection of the properties of connectivity at the morphological level; and (3) a weak level of reflection of the properties of integration in the sample space, as a result of weak application at both the morphological and pragmatic levels. The study’s findings will assist designers, professionals, and researchers in the field of museum design in understanding the significance of the wayfinding system in museum spaces by highlighting its most essential aspects using a clear analytical method.
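
For orientation, the connectivity and integration measures discussed above are defined in standard space-syntax form roughly as follows (the study's exact normalization may differ):

```latex
% Standard space-syntax definitions of connectivity and integration (for reference):
\begin{align*}
C_i  &= \#\{\, j : j \text{ directly connected to } i \,\} &&\text{(connectivity)}\\
MD_i &= \frac{1}{k-1}\sum_{j \ne i} d_{ij} &&\text{(mean depth over } k \text{ spaces)}\\
RA_i &= \frac{2\,(MD_i - 1)}{k - 2}, \qquad \text{Integration}_i = \frac{1}{RA_i}
\end{align*}
% d_ij is the shortest topological distance between spaces i and j; lower relative
% asymmetry RA (higher integration) means the space is easier to reach from the layout.
```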

Review of Carbon Materials: Application in Alternative Energy Sources and Catalysis

The application of carbon materials in branches of the electrochemical industry shows an increasing tendency each year due to the many interesting properties they possess. These include a well-developed specific surface, porosity, high sorption capacity, good adsorption properties, low bulk density, electrical conductivity and chemical resistance. All these properties allow for their effective use, among others, in supercapacitors, which can reach capacitances of the order of 100 F thanks to carbon electrodes constituting the capacitor plates. Carbons (including expanded graphite, carbon black, graphite carbon fibers and activated carbon) are commonly used in electrochemical methods for removing oil derivatives, e.g., phenols and their derivatives, from water after tanker disasters by electrochemical anodic oxidation. Phenol can occupy practically the entire surface of a carbon material and leave the water clean of hydrophobic impurities. Regeneration of such electrodes is also uncomplicated: it is carried out by electrochemical methods that unblock the pores and reduce resistances, thus reactivating them for subsequent adsorption processes. Graphite is commonly used as an anode material in lithium-ion cells, but due to the limited capacity it offers (372 mAh g-1), new solutions are sought that meet capacity, efficiency and economic criteria. Increasingly, biodegradable materials, green materials, biomass and waste (including agricultural waste) are used in order to reuse them, reduce the greenhouse effect and, above all, meet the biodegradability criterion necessary for the production of lithium-ion cells as chemical power sources. The most common of these materials are cellulose, starch, wheat, rice and corn waste, e.g., from agricultural, paper and pharmaceutical production. Such products are subjected to appropriate treatments depending on the desired application (including chemical, thermal and electrochemical). Starch is a biodegradable polysaccharide consisting of polymeric units, amylose and amylopectin, which build the ordered (linear) and amorphous (branched) structure of the polymer. Carbon is also used as a catalyst. Elemental carbon has become available in many nano-structured forms representing the hybridization combinations found in the primary carbon allotropes, and the materials can be enriched with a large number of surface functional groups. There are many examples of catalytic applications of carbon in the literature, but the development of this field has been hampered by the lack of a conceptual approach combining structure and function and by a lack of understanding of material synthesis. In the context of catalytic applications, carbon properties and parameters such as the range of metallic conductivity and bond ordering should be characterized. Such data, along with surface and textural information, can form the basis for the rational design of carbon catalyst supports.

An Examination of the Factors Affecting the Adoption of Cloud Enterprise Resource Planning Systems in Egyptian Companies

Enterprise resource planning (ERP) is an integrated system that helps companies manage their resources. There are two types of ERP systems: traditional ERP systems and cloud ERP systems. Cloud ERP systems were introduced after the development of cloud computing technology. This research aims to identify the factors that affect the adoption of cloud ERP in Egyptian companies. Moreover, our study aims to provide guidance to Egyptian companies in the cloud ERP adoption decision and to contribute to increasing the number of cloud ERP studies conducted in the Middle East and in developing countries. The factors influencing the adoption of cloud ERP in Egyptian organizations are discussed and explained in the research, and are examined by combining the Diffusion of Innovation (DOI) theory and the technology-organization-environment (TOE) framework. Data were collected through a survey that was developed using constructs from existing studies of cloud computing and cloud ERP technologies and then modified to fit our research. The data analysis was based on Structural Equation Modeling (SEM) using SmartPLS software for the empirical analysis of the research model.

A Modern Review of the Non-Invasive Continuous Blood Glucose Measuring Devices and Techniques for Remote Patient Monitoring System

Diabetes, a disease that arises from elevated glucose levels due to insulin deficiency or insulin resistance in the human body, has become common throughout the world. No medicine can cure it completely. However, by taking medication, maintaining a diet, and exercising regularly, a diabetic patient can keep his or her glucose level within the specified limits and in this way lead a normal life like a healthy person. To control glucose levels, however, a patient needs to monitor them regularly, and various techniques have been used for this over the last four decades. This modern review article aims to provide a comparative study of the various blood glucose monitoring techniques in a concise and organized manner. The review mainly emphasizes the working principles, cost, technology, sensors, measurement types, measurement accuracy, and advantages and disadvantages of the various techniques, and then compares them with one another. In addition, the use of algorithms and simulators in the development of this technology is presented. Finally, current research trends in this measurement technology are also discussed.

Development of Nondestructive Imaging Analysis Method Using Muonic X-Ray with a Double-Sided Silicon Strip Detector

In recent years, a nondestructive elemental analysis method based on muonic X-ray measurements has been developed and applied to various samples. Muonic X-rays are emitted after the formation of a muonic atom, which occurs when a negatively charged muon is captured into a muon atomic orbit around a nucleus. Because muonic X-rays have a higher energy than electronic X-rays due to the muon mass, they can be measured without being absorbed by the material. Thus, estimating the two-dimensional (2D) elemental distribution of a sample becomes possible using an X-ray imaging detector. In this work, we report a nondestructive imaging experiment using muonic X-rays at the Japan Proton Accelerator Research Complex (J-PARC). The irradiated target consisted of a polypropylene material, and a double-sided silicon strip detector, which was developed as an imaging detector for astronomical observation, was employed. A peak corresponding to muonic X-rays from the carbon atoms in the target was clearly observed in the energy spectrum at 14 keV, and 2D visualizations were successfully reconstructed to reveal the projection image of the target. This result demonstrates the potential of a nondestructive elemental imaging method based on muonic X-ray measurements. To obtain a higher position resolution for imaging smaller targets, a new detector system will be developed to improve the statistical analysis in further research.
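
For orientation, the statement that muonic X-rays are far more energetic than electronic X-rays follows from a Bohr-model estimate of the muonic-atom level energies (shown below as a rough scaling argument; the 14 keV carbon line reported above is consistent with a low-lying muonic-carbon transition):

```latex
% Bohr-model estimate of muonic-atom level energies (rough scaling argument):
\[
E_n \simeq -\,\frac{m_r}{m_e}\, Z^2 \,\frac{13.6\ \text{eV}}{n^2},
\qquad
m_r = \frac{m_\mu M}{m_\mu + M} \approx 205\, m_e \ \text{for carbon},
\]
% so muonic transition energies are roughly two hundred times those of the corresponding
% electronic transitions, which is why they escape the sample largely unabsorbed.
```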