Structural-Geotechnical Effects of the Foundation of a Medium-Height Structure

The interaction effects between the existing soil and the substructure of a five-story building with one underground level were evaluated, and the structural-geotechnical concepts were validated through the impedance-factor method using a finite-element program. The continuous wall-type foundation had a constant thickness and followed inclined and orthogonal directions, while the ground had homogeneous, medium-type characteristics. The soil was type C according to the Ecuadorian Construction Standard (NEC), and the foundation had a depth of 4.00 meters with a basement wall thickness of 40 centimeters. The project is part of a mid-rise building in the city of Azogues (Ecuador). The hypotheses addressed the stated objectives: the spring-based model showed a variation with respect to the embedded-base model and yielded conservative results.
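To illustrate the impedance-factor idea behind the spring model, the sketch below evaluates the classical static stiffness formulas for a rigid circular foundation on a homogeneous elastic half-space; the shear modulus, Poisson's ratio, and equivalent radius are illustrative assumptions, not values from this study.

```python
# Hedged sketch: classical static impedance (spring) constants for a rigid
# circular foundation on a homogeneous elastic half-space.
# Input values below are illustrative assumptions, not data from the study.

G = 60e6      # soil shear modulus [Pa], assumed for a medium (type C) soil
nu = 0.35     # Poisson's ratio, assumed
R = 3.0       # equivalent foundation radius [m], assumed

K_vertical   = 4 * G * R / (1 - nu)           # vertical spring [N/m]
K_horizontal = 8 * G * R / (2 - nu)           # horizontal spring [N/m]
K_rocking    = 8 * G * R**3 / (3 * (1 - nu))  # rocking spring [N*m/rad]
K_torsion    = 16 * G * R**3 / 3              # torsional spring [N*m/rad]

print(f"Kv = {K_vertical:.3e} N/m, Kh = {K_horizontal:.3e} N/m")
print(f"Kr = {K_rocking:.3e} N*m/rad, Kt = {K_torsion:.3e} N*m/rad")
```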

A Practical Construction Technique to Enhance the Performance of Rock Bolts in Tunnels

In Swedish tunnel construction, a critical and repeatedly acknowledged issue is corrosion, and consequently failure, of the rock bolts in rock support systems. Defective installation of rock bolts leads to cavities in the cement mortar that is normally used to fill the area under the dome plates. These voids allow water ingress into the rock bolt assembly, which corrodes the rock bolt components and eventually causes failure. In addition, the current installation technique consists of several labor-intensive manual steps that are usually performed in uncomfortable and exhausting conditions, e.g., under the tunnel roof. Such intensive tasks also lead to considerable material waste and execution errors. Moreover, adequate quality control of the execution is hardly possible with the current technique. To overcome these issues, this study developed a non-shrinking/expansive cement-based mortar pre-filled in paper packaging, which properly fills the area under the dome plates with few or no remaining cavities and thereby reduces the potential for corrosion. This article summarizes the development process and the experimental evaluation of this technique for rock bolt installation. In the development process, the cementitious mortar was first formulated using a specific cement and shrinkage-reducing/expansive additives. The mechanical and flow properties of the mortar were then evaluated by compressive strength, density, and slump flow measurements. In addition, isothermal calorimetry and shrinkage/expansion measurements were used to elucidate the hydration and durability attributes of the mortar. After the desired properties were obtained in both fresh and hardened conditions, the developed dry mortar was filled into specific permeable paper packaging and submerged in a water bath for specific intervals before installation. The tests were progressively refined by optimizing parameters such as the shape and size of the packaging, the characteristics of the paper, the immersion time in water, and some minor characteristics of the mortar. Finally, the developed prototype was tested in a lab-scale rock bolt assembly at various angles to analyze the efficiency of the method in a realistic scenario. The results showed that the new technique improves the performance of rock bolts by reducing material waste, improving environmental performance, facilitating and accelerating the labor, and enhancing the durability of the whole system. Accordingly, this approach provides an efficient alternative to the traditional method of tunnel bolt installation, with considerable advantages for the Swedish tunneling industry.

Morphological Characteristics and Development of the Estuary Area of Lam River, Vietnam

On the basis of the structure of alluvial sediments interpreted from echo sounding data and remote sensing images, the following results can be given. The estuary of the Lam River from Ben Thuy Bridge (original name: Bến Thủy) to Cua Hoi (Cửa Hội) is divided into three channels (locations given relative to the river bank on the Nghe An Province side, original name: Nghệ An): i) channel I (from Ben Thuy Bridge to Hung Hoa, original name: Hưng Hòa) is the branching reach of the river; ii) channel II (from Hung Hoa to Nghi Thai, original name: Nghi Thái) is a channel that develops in a meandering direction with its concave side toward Ha Tinh Province (Hà Tĩnh); iii) channel III (from Nghi Thai to Cua Hoi) is a channel that develops in a meandering direction with its concave side toward Nghe An Province. This estuary area was formed in the period from when the sea level dropped below 0 m (the current water level) to the present: i) channel II developed by moving toward Ha Tinh Province; ii) channel III developed by moving toward Nghe An Province; iii) in channel I, a second river branch formed because the river flow cut through the Hong Lam-Hong Nhat mudflat (original name: Hồng Lam-Hồng Nhất), at the same time creating an island. The morphological characteristics of the estuary area of the Lam River are mainly the result of erosion and deposition corresponding to two water levels: a level about 2 m lower than the current water level, and the current water level. The characteristics of the sediment layers on the riverbed in the estuary can be used to determine sea levels from the Late Holocene to the present.

A Real-Time Monitoring System of the Supply Chain Conditions, Products and Means of Transport

Real-time monitoring of supply chain conditions and procedures is a critical element for the optimal coordination and safety of deliveries, as well as for minimizing delivery time and cost. Real-time monitoring requires IoT data streams related to the conditions of the products and the means of transport (e.g., location, temperature/humidity, kinematic state, ambient light, etc.). These streams are generated by battery-powered IoT tracking devices equipped with appropriate sensors and are transmitted to a cloud-based back-end system. Proper handling and processing of the IoT data streams, using predictive and artificial intelligence algorithms, can provide significant and useful results, which supply chain stakeholders can exploit to enhance their financial benefits as well as the efficiency, security, transparency, coordination, and sustainability of supply chain procedures. This paper presents the technology, features, and characteristics of a complete, proprietary system, comprising hardware, firmware, and software tools, developed in the context of a co-funded R&D program.
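A minimal sketch of what such an IoT data stream and a basic cloud-side check might look like is given below; the message fields, thresholds, and alert logic are hypothetical and are not taken from the proprietary system described here.

```python
# Hedged sketch: structure of a tracking-device telemetry message and a
# simple rule-based check on the cloud side. Field names and thresholds
# are hypothetical, not taken from the proprietary system described here.

from dataclasses import dataclass

@dataclass
class TelemetryMessage:
    device_id: str
    timestamp: float        # Unix epoch seconds
    latitude: float
    longitude: float
    temperature_c: float
    humidity_pct: float
    ambient_light_lux: float
    acceleration_g: float   # peak acceleration since last report (kinematic state)

def check_cold_chain(msg: TelemetryMessage,
                     t_min: float = 2.0, t_max: float = 8.0) -> list:
    """Return a list of alert strings for a single message."""
    alerts = []
    if not (t_min <= msg.temperature_c <= t_max):
        alerts.append(f"{msg.device_id}: temperature {msg.temperature_c} degC out of range")
    if msg.acceleration_g > 3.0:
        alerts.append(f"{msg.device_id}: possible shock event ({msg.acceleration_g} g)")
    return alerts

msg = TelemetryMessage("trk-001", 1_700_000_000.0, 40.64, 22.94, 9.4, 55.0, 12.0, 0.4)
print(check_cold_chain(msg))
```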

The Client-Supplier Relationship in Managing Innovation: Delineating Defence Industry First Mover Challenges within the Government Contract Competition

All companies are confronted with the need to innovate in order to meet market demands. In doing so, they face the dilemma of whether to aim to be first to market with a new innovative product, or to deliberately wait and learn from a pioneer's mistakes, potentially avoiding higher risks. It is therefore important to critically understand, from a first-mover advantage and disadvantage perspective, the decision-making implications of defence industry transformation brought on by an innovative paradigm shift. This paper argues that industry characteristics matter, especially the role clients play in the innovation process and their level of influence. Through qualitative case study research, this inquiry focuses on first-mover advantages and disadvantages with a view to establishing practical and value-added academic findings, concentrating on industries where clients play an active role in cooperation with supplier innovation. The resulting findings will help managers mitigate risk when introducing innovative technology. A selection of defence industry innovations is specifically chosen because the client-supplier relationship typically differs from that in traditional first-mover research. In this instance, case studies referencing vertical-take-off-and-landing defence equipment innovations are used.

A Comprehensive Survey on Machine Learning Techniques and User Authentication Approaches for Credit Card Fraud Detection

With the increase in credit card usage, the volume of credit card misuse has also increased significantly, which may cause appreciable financial losses for both credit card holders and the financial organizations issuing credit cards. As a result, financial organizations are working hard on developing and deploying credit card fraud detection methods in order to adapt to ever-evolving, increasingly sophisticated defrauding strategies and to identify illicit transactions as quickly as possible, protecting themselves and their customers. Compounding the complex nature of such adverse strategies, credit card fraudulent activities are rare events compared to the number of legitimate transactions. Hence, the challenge of developing fraud detection methods that are accurate and efficient is substantially intensified and, as a consequence, credit card fraud detection has lately become a very active area of research. In this work, we provide a survey of current techniques most relevant to the problem of credit card fraud detection. We carry out our survey in two main parts. In the first part, we focus on studies utilizing classical machine learning models, which mostly employ traditional transactional features to make fraud predictions. These models typically rely on static characteristics, such as what the user knows (knowledge-based methods) or what he/she has access to (object-based methods). In the second part of our survey, we review more advanced user authentication techniques, which use behavioral biometrics to identify an individual based on his/her unique behavior while interacting with his/her electronic devices. These approaches rely on how people behave (instead of what they know or possess), which cannot be easily forged. By providing an overview of current approaches and the results reported in the literature, this survey aims to drive the future research agenda for the community in order to develop more accurate, reliable, and scalable models of credit card fraud detection.
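As an illustration of the classical, transaction-feature-based branch of the survey, the sketch below trains a simple weighted classifier on a synthetic imbalanced dataset; the data, model choice, and metric are illustrative assumptions, not drawn from any surveyed study.

```python
# Hedged sketch: a classical supervised baseline for imbalanced fraud data,
# using class weighting and a precision/recall-oriented metric. The synthetic
# data and chosen model are illustrative; the survey covers many alternatives.

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import average_precision_score

# Simulate a heavily imbalanced transaction dataset (about 0.5% "fraud").
X, y = make_classification(n_samples=20000, n_features=20, weights=[0.995],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# class_weight="balanced" re-weights the rare fraud class during training.
clf = LogisticRegression(max_iter=1000, class_weight="balanced").fit(X_tr, y_tr)
scores = clf.predict_proba(X_te)[:, 1]

# Average precision (area under the PR curve) is more informative than
# plain accuracy when legitimate transactions dominate.
print("Average precision:", round(average_precision_score(y_te, scores), 3))
```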

Fundamentals of Performance Management in the World of Public Service Organisations

The examination of public service organizations' performance evaluation involves several steps that help public organizations develop a more efficient system. Public sector organizations have different characteristics from the competitive sector, so other or new elements become more important in their performance processes. The literature in this area is diverse, so highlighting an indicator system can be useful when introducing one, but it is also worthwhile to measure the specific elements of the organization. In the case of a public service organization, the service obligation usually implies a high number of users, so compliance with user expectations is more difficult. It is an important target for the organization to place great emphasis on raising service standards and developing the related processes. In this research, the health sector is given a prominent role, as it is a sensitive area where both organizational and individual performance matter to all participants. As a primary step, the content of the strategy is decisive, since it shapes the efficient structure of the process. When designing any system, it is important to review the stakeholders' expectations, as these are primary considerations in the design. The goal of this paper is to build the foundations of a performance management and indexing framework that can help a hospital provide effective feedback, set a direction that is important in assessing and developing a service, and become a management philosophy.

The Role of Synthetic Data in Aerial Object Detection

The purpose of this study is to explore the characteristics of developing a machine learning application using synthetic data. The study is structured around developing the application for the purpose of deploying the computer vision model. The findings discuss the realities of attempting to develop a computer vision model for a practical purpose and detail the processes, tools, and techniques that were used to meet accuracy requirements. The research reveals that synthetic data represent another variable that can be adjusted to improve the performance of a computer vision model. Further, a suite of tools and tuning recommendations is provided.
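As an illustration of treating synthetic data as a tunable variable, the sketch below mixes real and synthetic samples at a chosen ratio before training; the file names, ratio values, and mixing strategy are hypothetical and not the study's actual pipeline.

```python
# Hedged sketch: treating the synthetic-to-real ratio as a tunable variable
# when assembling a training set for an aerial object detector. Dataset names
# and the mixing strategy are hypothetical illustrations, not the study's code.

import random

def build_training_set(real_samples: list, synthetic_samples: list,
                       synthetic_ratio: float, seed: int = 0) -> list:
    """Mix real and synthetic samples so that synthetic data makes up
    roughly `synthetic_ratio` of the final training set."""
    random.seed(seed)
    n_real = len(real_samples)
    n_synth = int(n_real * synthetic_ratio / (1.0 - synthetic_ratio))
    n_synth = min(n_synth, len(synthetic_samples))
    mixed = real_samples + random.sample(synthetic_samples, n_synth)
    random.shuffle(mixed)
    return mixed

real = [f"real_{i}.jpg" for i in range(100)]
synth = [f"synth_{i}.png" for i in range(1000)]
for ratio in (0.25, 0.5, 0.75):   # candidate values to sweep during tuning
    training_set = build_training_set(real, synth, ratio)
    print(f"synthetic ratio {ratio}: {len(training_set)} training samples")
```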

Software Product Quality Evaluation Model with Multiple Criteria Decision Making Analysis

This paper presents a software product quality evaluation model based on the ISO/IEC 25010 quality model. The evaluation characteristics and sub-characteristics were identified from the ISO/IEC 25010 quality model. The multidimensional structure of the quality model is based on characteristics such as functional suitability, performance efficiency, compatibility, usability, reliability, security, maintainability, and portability, and their associated sub-characteristics. Random numbers were generated to establish the decision maker's importance weights for each sub-characteristic. Random numbers were also generated to establish the decision matrix of the decision maker's final scores for each software product against each sub-characteristic. Thus, objective criteria importance weights and index scores for the datasets were obtained from the random numbers. In the proposed model, five different software product quality evaluation datasets under three different weight vectors were applied to the multiple criteria decision analysis method preference analysis for reference ideal solution (PARIS), together with a comparison and a sensitivity analysis procedure. This study contributes to a better understanding of the application of MCDMA methods and the ISO/IEC 25010 quality model guidelines in the software product quality evaluation process.
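The sketch below illustrates the kind of randomly generated weight vector and decision matrix described above, followed by a generic weighted-sum aggregation; it is not the PARIS method itself, only an indication of how such a matrix is assembled and scored.

```python
# Hedged sketch: building a weighted, normalized decision matrix for software
# product alternatives scored against ISO/IEC 25010 characteristics. The
# aggregation shown (weighted sum after vector normalization) is a generic
# MCDMA step, not the PARIS procedure used in the paper.

import numpy as np

rng = np.random.default_rng(42)
n_products, n_criteria = 5, 8          # e.g., the 8 ISO/IEC 25010 characteristics

scores = rng.uniform(1, 9, size=(n_products, n_criteria))   # decision matrix
weights = rng.uniform(0, 1, size=n_criteria)
weights /= weights.sum()                                     # importance weights sum to 1

norm = scores / np.linalg.norm(scores, axis=0)               # vector normalization
weighted = norm * weights
ranking = np.argsort(-weighted.sum(axis=1))                  # higher aggregate = better

print("Aggregate scores:", np.round(weighted.sum(axis=1), 3))
print("Ranking (best to worst):", ranking)
```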

Optimization of Hemp Fiber Reinforced Concrete for Mix Design Method

The purpose of this study is to evaluate the incorporation of hemp fibers (HF) in concrete. Hemp fiber reinforced concrete (HFRC) is becoming more popular as an alternative to regular mix designs. This study evaluated the compressive strength of HFRC with respect to the mixing procedure. HF were obtained from the manufacturer and hand processed to ensure uniformity in width and length. The fibers were added to the concrete as both wet and dry mixes to investigate and optimize the mix design process. Results indicated that the dry mix had a compressive strength of 1157 psi compared with 985 psi for the wet mix. The dry mix compressive strength was within range of the standard mix compressive strength of 1533 psi. The statistical analysis revealed that the mix design process needs further optimization and uniformity concerning the addition of HF. Regression analysis gave a coefficient of 0.9 for the standard mix design compared to 0.375 for the dry mix, indicating variation in the mixing process. During the dry mix, adding plain HF caused the fibers to intertwine, creating lumps and inconsistency. During the wet mixing process, however, combining water and HF before incorporation allowed the fibers to disperse uniformly within the mix; accordingly, the regression analysis indicated a better coefficient of 0.55. This study concludes that HFRC is a viable alternative to regular mixes; however, more research on its characteristics needs to be conducted.
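As a simple check on the reported numbers, the sketch below expresses the dry and wet HFRC compressive strengths as percentages of the standard mix; the strengths are taken from this abstract, and the script itself is only illustrative bookkeeping.

```python
# Hedged sketch: relating the reported compressive strengths (psi) to the
# standard mix as simple percentage retentions. Strength values come from
# the abstract; the calculation is illustrative only.

strengths_psi = {"standard": 1533, "dry HFRC": 1157, "wet HFRC": 985}

for mix, f_c in strengths_psi.items():
    retention = 100 * f_c / strengths_psi["standard"]
    print(f"{mix:>9}: {f_c} psi ({retention:.0f}% of standard mix)")
# Dry HFRC retains about 75% and wet HFRC about 64% of the standard strength.
```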

Physicochemical and Thermal Characterization of Starch from Three Different Plantain Cultivars in Puerto Rico

Plantain contains starch as its main component and represents a relevant source of this carbohydrate. Starches from different cultivars of plantain and banana have been studied for industrialization purposes due to their morphological and thermal characteristics and their influence on food products. This study aimed to characterize the physical, chemical, and thermal properties of starch from three different plantain cultivars grown in Puerto Rico: Maricongo, Maiden, and FHIA 20. Amylose and amylopectin content, color, granular size, morphology, and thermal properties were determined. FHIA 20 starch presented the lowest amylose content of the three cultivars studied. In terms of color, Maiden and FHIA 20 starches exhibited significantly higher whiteness indexes than Maricongo starch. Starches of the three cultivars had an elongated-ovoid morphology with a smooth surface and a non-porous appearance. Despite these morphological similarities, FHIA 20 exhibited a lower aspect ratio, since its granules tended to be more elongated. Comparison of the thermal properties showed that the initial starch gelatinization temperature was similar among cultivars. However, FHIA 20 starch presented a noticeably higher final gelatinization temperature (87.95°C) and transition enthalpy than Maricongo (79.69°C) and Maiden (77.40°C). Despite their similarities, starches from the plantain cultivars showed differences in composition and thermal behavior, which represents an opportunity to diversify the use of plantain starch in food-related applications.

Users’ Susceptibility Factors to Malware Attacks: A Systematic Literature Review

Users’ susceptibility to malware attacks has drawn attention in the past few years. Investigating the factors that make a user vulnerable to those attacks is critical because they can be used to set up proactive strategies, such as awareness and education, to mitigate the impact of such attacks. Demographic, behavioral, and cultural vulnerabilities are the main factors that make users susceptible to malware attacks. It is challenging, however, to draw general conclusions based on those factors due to the variety of user types and malware types. Therefore, we conducted a systematic literature review (SLR) of the existing research on user susceptibility factors to malware attacks. The results showed that all demographic factors are consistently associated with malware infection regardless of the user type, except for age and gender. In addition, the association of culture and personality factors with malware infection is consistent in most of the selected studies and for all types of users. Moreover, malware infection varies based on age, geographic location, and host type. We propose that future studies carefully take the type of user into consideration, because different users may be exposed to different threats or be targeted based on the characteristics of their user domains. Additionally, as different types of malware use different tactics to trick users, taking the malware type into consideration is also important.

Function of Fractals: Application of Non-linear Geometry in Continental Architecture

Since the introduction of fractal geometry in the 1970s, numerous efforts have been made by architects and researchers to transfer this area of mathematical knowledge into the discipline of architecture and postmodernist discourse. The discourse of complexity and architecture has been one of the most significant ongoing discourses in the discipline from the 1970s until today and has generated significant styles such as deconstructivism and parametricism. Over these years, several projects have been designed and presented by designers and architects using fractal geometry, but due to the lack of sufficient knowledge and an appropriate comprehension of the features and characteristics of this nonlinear geometry, none of the fractal-based designs has been successful and satisfying. Fractal geometry, as a geometric technology, has a long presence in the history of architecture. The current research attempts to identify and discover the characteristics, features, potentials, and functionality of fractals beyond their aesthetic aspect by examining case studies of pre-modern architecture in Asia and investigating the function of fractals.

Recycling of Sintered NdFeB Magnet Waste via Oxidative Roasting and Selective Leaching

Neodymium-iron-boron (NdFeB) magnets, classified as high-power magnets, are widely used in various applications such as automotive, electrical, and medical devices. Because significant amounts of rare earth metals will be subject to shortages in the future, domestic recycling of NdFeB magnet waste should be developed in order to reduce social and environmental impacts and move toward a circular economy. Each type of waste has different characteristics and compositions, which directly affect recycling efficiency as well as the types and purity of the recyclable products. This research therefore focused on recycling manufacturing NdFeB magnet waste obtained from the sintering stage of magnet production, containing 23.6% Nd, 60.3% Fe, and 0.261% B, in order to recover high-purity neodymium oxide (Nd2O3) using a hybrid metallurgical process based on oxidative roasting and selective leaching. The sintered NdFeB waste was first ground to below 70 mesh prior to oxidative roasting at 550-800 °C to enable selective leaching of neodymium in the subsequent leaching step using 2.5 M H2SO4 for 24 h. The leachate was then dried and roasted at 700-800 °C prior to precipitation with oxalic acid and calcination to obtain Nd2O3 as the recycled product. XRD analyses showed that increasing the oxidative roasting temperature led to an increasing amount of hematite (Fe2O3) as the main phase, with a smaller amount of magnetite (Fe3O4); peaks of Nd2O3 were also observed in lesser amounts. Furthermore, neodymium iron oxide (NdFeO3) was present, and its XRD peaks became more pronounced at higher oxidative roasting temperatures. After acid leaching and drying, iron sulfate and neodymium sulfate were mainly obtained. After the subsequent roasting step prior to water leaching, iron sulfate was converted to Fe2O3 as the main compound, while neodymium sulfate remained in the mixture; however, a small amount of Fe3O4 was still detected by XRD. A higher roasting temperature of 800 °C resulted in a greater Fe2O3 to Nd2(SO4)3 ratio, indicating a more effective roasting temperature. Iron oxides were subsequently removed by water leaching and filtration, while the solution contained mainly neodymium sulfate. Therefore, an oxidative roasting temperature not exceeding 600 °C, followed by acid leaching and roasting at 800 °C, gave the optimum condition for the subsequent precipitation and calcination steps to finally obtain Nd2O3.
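The main recovery steps can be summarized by the following idealized stoichiometric reactions; these are standard textbook forms added here for clarity and are not transcribed from the paper.

```latex
\begin{align}
% Oxidative roasting (idealized): iron and neodymium oxidize
4\,\mathrm{Fe} + 3\,\mathrm{O_2} &\rightarrow 2\,\mathrm{Fe_2O_3} \\
4\,\mathrm{Nd} + 3\,\mathrm{O_2} &\rightarrow 2\,\mathrm{Nd_2O_3} \\
% Selective sulfuric acid leaching of neodymium
\mathrm{Nd_2O_3} + 3\,\mathrm{H_2SO_4} &\rightarrow \mathrm{Nd_2(SO_4)_3} + 3\,\mathrm{H_2O} \\
% Oxalic acid precipitation
\mathrm{Nd_2(SO_4)_3} + 3\,\mathrm{H_2C_2O_4} &\rightarrow \mathrm{Nd_2(C_2O_4)_3}\!\downarrow + 3\,\mathrm{H_2SO_4} \\
% Calcination to the oxide product
2\,\mathrm{Nd_2(C_2O_4)_3} + 3\,\mathrm{O_2} &\rightarrow 2\,\mathrm{Nd_2O_3} + 12\,\mathrm{CO_2}
\end{align}
```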

Comparison of Conventional Control and Robust Control on Double-Pipe Heat Exchanger

A heat exchanger is a device used to transfer heat between fluids at different temperatures. In this case, temperature control becomes a critical objective. This work presents the temperature control of a double-pipe heat exchanger (a multi-input multi-output (MIMO) system), which is modeled as first-order coupled hyperbolic partial differential equations (PDEs), using conventional and advanced control techniques, and develops an appropriate robust control strategy to meet stability requirements and performance objectives. We designed a proportional-integral-derivative (PID) controller and an H-infinity controller for the heat exchanger (HE) system. The frequency characteristics of the sensitivity functions and the open-loop and closed-loop time responses are simulated using MATLAB, and the stability of the system is analyzed using Kalman's test. The simulation results demonstrate that the H-infinity controller is more efficient than PID in terms of robustness and performance.
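For reference, the sensitivity and complementary sensitivity functions whose frequency characteristics are examined, and the generic mixed-sensitivity H-infinity objective, can be written as follows, where G is the heat exchanger model, K the controller, and W1, W2, W3 weighting filters; this is the standard formulation, not necessarily the exact weighting scheme used in the paper.

```latex
\[
S(s) = \bigl(I + G(s)K(s)\bigr)^{-1}, \qquad
T(s) = G(s)K(s)\bigl(I + G(s)K(s)\bigr)^{-1} = I - S(s)
\]
\[
\min_{K\ \mathrm{stabilizing}}\;
\left\lVert \begin{bmatrix} W_1 S \\ W_2 K S \\ W_3 T \end{bmatrix} \right\rVert_{\infty} < \gamma
\]
```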

Automated 3D Segmentation System for Detecting Tumor and Its Heterogeneity in Patients with High Grade Ovarian Epithelial Cancer

High grade ovarian epithelial cancer (OEC) is the most fatal gynecological cancer, and the poor prognosis of this entity is closely related to considerable intratumoral genetic heterogeneity. By examining imaging data, it is possible to assess the heterogeneity of tumorous tissue. This study presents a methodology for aligning, segmenting, and finally visualizing information from various magnetic resonance imaging series in order to construct 3D models of heterogeneity maps of the same tumor in OEC patients. The proposed system may be used as an adjunct digital tool by health professionals for personalized medicine, as it allows for an easy visual assessment of the heterogeneity of the examined tumor.
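As a generic illustration of how a voxel-wise heterogeneity map can be derived once the MRI series are co-registered, the sketch below computes the dispersion of normalized intensities across series within a tumor mask; the toy volumes and the chosen dispersion measure are assumptions, not the study's actual algorithm.

```python
# Hedged sketch: once the MRI series are co-registered to a common grid, a
# simple voxel-wise heterogeneity map can be computed inside the tumor mask,
# e.g., as the dispersion of intensities across series. This is a generic
# illustration, not the segmentation/alignment pipeline used in the study.

import numpy as np

# Three already-aligned MRI series and a binary tumor mask (toy volumes).
rng = np.random.default_rng(1)
series = rng.normal(size=(3, 32, 32, 16))      # (n_series, x, y, z)
tumor_mask = np.zeros((32, 32, 16), dtype=bool)
tumor_mask[10:20, 10:20, 4:10] = True

# Normalize each series, then take the voxel-wise standard deviation across
# series as a crude heterogeneity index inside the tumor.
normalized = (series - series.mean(axis=(1, 2, 3), keepdims=True)) \
             / series.std(axis=(1, 2, 3), keepdims=True)
heterogeneity = normalized.std(axis=0)
heterogeneity[~tumor_mask] = np.nan            # keep only tumor voxels

print("Mean heterogeneity in tumor:", np.nanmean(heterogeneity).round(3))
```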

Learning Objects Content Presentation Adaptation Model Considering Students' Learning Styles

Learning styles (LSs) correspond to a person's individual preferences regarding the modes and forms in which he/she prefers to learn throughout the teaching/learning process. Presenting the content of learning objects (LOs) using knowledge about students' LSs offers them digital educational resources tailored to their individual learning preferences. In this context, the most relevant characteristics of the LSs and the most appropriate forms of LO content presentation were mapped and associated. This was done to define the composition of an adaptive model of LO content presentation considering the LSs, called Adaptation of Content Presentation of Learning Objects Considering Learning Styles (ACPLOLS). LO prototypes were created with interfaces adapted to students' LSs. These prototypes were based on a model created to validate the approaches used, which were established through experiments with the students. The results of subjective measures of students' emotional responses demonstrated that ACPLOLS reached the desired results regarding the adequacy of the LO interface, in accordance with the Felder-Silverman LS Model.
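A minimal sketch of the kind of mapping described above is given below: a rule table associating Felder-Silverman dimensions with presentation forms. The specific associations are illustrative placeholders rather than the ACPLOLS mapping itself.

```python
# Hedged sketch: a rule-based mapping from Felder-Silverman learning-style
# dimensions to preferred LO content presentation forms. The associations
# below are illustrative placeholders, not the ACPLOLS mapping.

FSLSM_PRESENTATION_RULES = {
    ("perception", "sensing"):       ["concrete examples", "step-by-step exercises"],
    ("perception", "intuitive"):     ["abstract concepts", "theory-first text"],
    ("input", "visual"):             ["diagrams", "videos", "infographics"],
    ("input", "verbal"):             ["written explanations", "audio narration"],
    ("processing", "active"):        ["interactive simulations", "quizzes"],
    ("processing", "reflective"):    ["summaries", "reflection prompts"],
    ("understanding", "sequential"): ["linear module navigation"],
    ("understanding", "global"):     ["concept maps", "overview pages"],
}

def presentation_plan(student_profile: dict) -> list:
    """student_profile maps each FSLSM dimension to the student's pole."""
    plan = []
    for dimension, pole in student_profile.items():
        plan.extend(FSLSM_PRESENTATION_RULES.get((dimension, pole), []))
    return plan

profile = {"perception": "sensing", "input": "visual",
           "processing": "active", "understanding": "global"}
print(presentation_plan(profile))
```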

A Review on Bearing Capacity Factor Nγ of Shallow Foundations with Different Shapes

There are several methods for calculating the bearing capacity factors of foundations and retaining walls. In this paper, the bearing capacity factor Nγ for different types of foundations, and in particular the influence of foundation shape, is investigated. The bearing capacity on c-φ-γ soil can still be expressed by Terzaghi's equation, except that the bearing capacity factor Nγ depends on the surcharge ratio and the friction angle φ. It is apparent that the value of Nγ increases irregularly with the friction angle of the subsoil, which leads to an excessive increment in Nγ for foundations with larger widths. Also, the bearing capacity factor Nγ decreases significantly with an increase in foundation width. It should also be highlighted that the effect of shape and dimension becomes less noticeable as the relative density of the soil decreases. Hence, the bearing capacity factor Nγ depends on the foundation width, surcharge, and roughness ratio. This paper presents the results of various studies conducted on the bearing capacity factor Nγ of different types of shallow foundations and foundations with irregular geometry (ring footings, triangular footings, shell foundations, etc.). Further studies on the bearing capacity factor Nγ of mat foundations, and on the characteristics of this factor with and without consideration of the friction between soil and foundation, are recommended.
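For reference, the Terzaghi-type bearing capacity expression referred to above, extended with shape factors, has the standard form (notation is generic and not taken from any specific study in the review):

```latex
\[
q_u \;=\; c\,N_c\,s_c \;+\; q\,N_q\,s_q \;+\; \tfrac{1}{2}\,\gamma\,B\,N_\gamma\,s_\gamma ,
\qquad q = \gamma D_f
\]
```

where c is the soil cohesion, q the surcharge at the foundation depth D_f, γ the soil unit weight, B the foundation width, N_c, N_q, and N_γ the bearing capacity factors (functions of the friction angle φ), and s_c, s_q, and s_γ the shape factors.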

Research on the Protection and Reuse Model of Historical Buildings in Chinese Airports

China constructed a large number of military and civilian airports before and after World War II and, after the baptism of World War I and World War II, began large-scale repair, reconstruction, or relocation of airports. Different protection and reuse strategies have been adopted for airports' historical areas and their historical buildings, such as terminals, hangars, and towers. Based on an assessment of the value of airport historical buildings, this paper studies these different protection and reuse strategies. The protection and reuse models of historical buildings are classified along three dimensions: the airport historical area, the airport historical building complex, and individual buildings. Specific examples are used to discuss and summarize the technical characteristics, protection strategies, and successful experiences of the different modes of protection and reuse of airports' historical areas and historical buildings.

Effect of Birks Constant and Defocusing Parameter on Triple-to-Double Coincidence Ratio Parameter in Monte Carlo Simulation-GEANT4

This project concerns the detection efficiency of the portable Triple-to-Double Coincidence Ratio (TDCR) system at the National Institute of Metrology of Ionizing Radiation (INMRI-ENEA), which allows direct activity measurement and radionuclide standardization for pure beta-emitting or pure electron-capture radionuclides. The dependency of the detection efficiency of the TDCR, simulated using the Monte Carlo Geant4 code, on the Birks factor (kB) and the defocusing parameter has been examined, especially for low-energy beta-emitting radionuclides such as 3H and 14C, for which this dependency is relevant. The results of this analysis can be used to select the best kB factor and defocusing parameter for computing the theoretical TDCR parameter value. The theoretical results were compared with available values measured by the ENEA portable TDCR detector for some pure beta-emitting radionuclides. This analysis improved the knowledge of the characteristics of the ENEA TDCR detector, which can be used as a traveling instrument for in-situ measurements, with particular benefits in many applications in the field of nuclear medicine and in the nuclear energy industry.
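For context, the Birks factor kB enters the simulated scintillation light yield through Birks' law, and the TDCR parameter is the ratio of the triple to double coincidence counting efficiencies (standard definitions, stated here in generic form):

```latex
\[
\frac{dL}{dx} \;=\; \frac{S\,\dfrac{dE}{dx}}{\,1 + kB\,\dfrac{dE}{dx}\,},
\qquad
\mathrm{TDCR} \;=\; \frac{\varepsilon_T}{\varepsilon_D}
\]
```

where L is the scintillation light yield, S the scintillation efficiency, dE/dx the energy loss per unit path length, and ε_T and ε_D the triple and double coincidence detection efficiencies.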