The Use of Symbolic Signs in Modern Ukrainian Monumental Church Painting: Classification and Hidden Semantics

Monumental church paintings are often perceived either as interior decoration of the temple or as a "Gospel for the illiterate," since temple painting frequently depicts scenes from Holy Scripture. In scholarship, the painting of the Orthodox Church has mainly been the subject of art criticism, whereas from the standpoint of culturology and semiotics it remains insufficiently studied and its symbolism insufficiently revealed. The aim of this paper is, on the basis of a semiotic analysis of the iconographic subjects used in monumental church painting, to describe symbolic signs, to classify them, and to give examples of each type of sign from the paintings of modern temples of Eastern Ukraine. We offer our own classification of the symbols of monumental church painting, drawing examples from the murals of modern Orthodox churches in Eastern Ukraine, mainly in the Donetsk region. When analyzing the semantics of symbolic signs, the following methods were used: semiotic, iconological, iconographic, hermeneutic, culturological, descriptive, comparative-historical, and visual-analytical. When interpreting the meanings of symbolic signs, scientific, cultural, and theological literature was consulted. Photographs taken by the author are included in the article.

Comparison between Different Classifications of Periodontal Diseases and Their Advantages

The classification of periodontal diseases has changed significantly in favor of simplifying the protocols for periodontal diagnosis and treatment. This review aims to highlight the most recent publications on the new classification of periodontal diseases, discussing the most significant differences from the old classification and weighing the advantages and disadvantages of its clinical application. The study also addresses the growing tendency to link the classification of periodontal diseases to predetermined protocols of periodontal treatment for the diagnoses it includes. The new classification of periodontal diseases is comprehensive in its subdivisions, as the disease is viewed in its entirety: the biological dimensions of the disease, its degree of severity and rate of progression, its relation to risk factors and patient susceptibility, and the impact of periodontal disease on the patient's general health status.

Controlled Vocabularies and Information Retrieval: 1918 Pandemic’s Scientific Literature as an Example

The role of controlled vocabularies in information retrieval is broadly recognized. There is also a standing demand that editors and databases effectively introduce controlled vocabularies into their procedures for indexing scientific literature. This is especially important because information retrieval is a critical step in conducting systematic literature reviews. Hence a first question emerges: are controlled vocabularies currently taken into account? On the other hand, subject searching in catalogs is complex, mainly because of the dichotomy between authors' keywords and keywords based on controlled vocabularies. Finally, there is demand to unify health-related terminology so as to ease the exploitation of medical histories and research. Considering these features, this paper focuses on controlled vocabularies related to the health field and their role in storing, classifying, and retrieving relevant literature. The objective is to determine what role health-related controlled vocabularies play in indexing and retrieving research literature in databases such as Web of Science (WoS) and Scopus. This exploratory research is therefore grounded on two research questions: 1) which terms are collected in specific controlled vocabularies of the health field; and 2) how are papers indexed in relevant databases so that they can be easily retrieved, considering authors' keywords versus specific health controlled vocabularies? The research takes as its fieldwork the controlled vocabularies related to health and the scientific interest in the 1918 flu pandemic, also known, equivocally, as the 'Spanish flu'. This interest has been fostered by the emergence, in the early 21st century, of epidemics of pneumonic diseases caused by viruses. Searches about and with controlled vocabularies were conducted on the WoS and Scopus databases. The first results of this work in progress are surprising. There are several controlled vocabularies for the health field in which the collected and preferred terms related to the '1918 pandemic' can be identified. In summary, 'Spanish influenza epidemic' and 'Spanish flu' are recorded as non-preferred terms; the preferred terms are 'influenza' and 'influenza pandemic, 1918-1919'. Although the controlled vocabularies are clear in their selection, most of the literature on the '1918 pandemic' is retrievable by either 'Spanish' or '1918' separately, and the dominant word for retrieving literature is 'Spanish' rather than '1918'. This is surprising given the existence of suitable controlled vocabularies for health topics and the modern World Health Organization guidelines on the naming of diseases, which point to other preferred terms. A first conclusion is the failure to use controlled vocabularies in a field such as health, and consequently in WoS and Scopus. This research opens further questions about the role controlled vocabularies play in the instructions that journals deliver to authors.
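To illustrate the indexing issue discussed above, the following minimal Python sketch maps author keywords onto preferred terms of a hypothetical controlled vocabulary; the term list and mapping are illustrative assumptions and do not reproduce entries from any actual MeSH or Emtree release.

```python
# Minimal sketch: normalising author keywords against a hypothetical
# controlled vocabulary. The mapping below is illustrative only and does
# not reproduce any actual MeSH/Emtree entries.
NON_PREFERRED_TO_PREFERRED = {
    "spanish flu": "influenza pandemic, 1918-1919",
    "spanish influenza epidemic": "influenza pandemic, 1918-1919",
    "1918 flu": "influenza pandemic, 1918-1919",
}

def normalise_keywords(author_keywords):
    """Replace non-preferred terms with their preferred form; keep the rest."""
    normalised = []
    for kw in author_keywords:
        key = kw.strip().lower()
        normalised.append(NON_PREFERRED_TO_PREFERRED.get(key, key))
    return sorted(set(normalised))

if __name__ == "__main__":
    print(normalise_keywords(["Spanish flu", "mortality", "1918 flu"]))
    # -> ['influenza pandemic, 1918-1919', 'mortality']
```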

The Use of Artificial Intelligence in Digital Forensics and Incident Response in a Constrained Environment

Digital investigators often have a hard time spotting evidence in digital information. It has become difficult to determine which source of evidence relates to a specific investigation. A growing concern is that the processes, technology, and specific procedures used in digital investigation are not keeping up with criminal developments, and criminals are taking advantage of these weaknesses to commit further crimes. In digital forensics investigations, artificial intelligence (AI) is invaluable in identifying crime. The goal of digital forensics and digital investigation is to provide objective data and conduct an assessment that will assist in developing a plausible theory which can be presented as evidence in court. This paper aims at developing a multiagent framework for digital investigations using specific intelligent software agents (ISAs). The agents communicate to address particular tasks jointly and keep the same objectives in mind throughout each task. The rules and knowledge contained within each agent depend on the investigation type. A criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The proposed framework is implemented using the Java Agent Development Framework, Eclipse, a Postgres repository, and a rule engine for agent reasoning. The framework was tested using the Lone Wolf image files and datasets, with experiments conducted using various sets of ISAs and VMs. There was a significant reduction in the time taken for the Hash Set Agent to execute. Loading the agents cost about 5% of the time, as the File Path Agent prescribed deleting 1,510 files, while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 s), whereas the MADIK framework accomplished it in 16 minutes (960 s). The framework is integrated with Python, allowing further integration of other digital forensic tools, such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.
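The framework itself is built on JADE with a rule engine; purely as an illustration of the case-based reasoning step it relies on, the Python sketch below ranks stored cases by feature similarity to a new investigation. The case features and past cases are invented for illustration and are not the paper's knowledge base.

```python
# Illustrative sketch of case-based reasoning (CBR) for classifying an
# investigation: a new case is matched to stored cases by feature overlap.
# Case features and labels below are invented; the actual framework in the
# paper uses JADE agents and a rule engine, not this code.

def similarity(case_a, case_b):
    """Fraction of shared (feature, value) pairs over the union of features."""
    keys = set(case_a) | set(case_b)
    shared = sum(1 for k in keys if case_a.get(k) == case_b.get(k))
    return shared / len(keys)

def classify(new_case, case_base):
    """Return the label of the most similar stored case and its score."""
    best = max(case_base, key=lambda c: similarity(new_case, c["features"]))
    return best["label"], similarity(new_case, best["features"])

case_base = [
    {"features": {"deleted_files": True, "encrypted": False, "exe_found": True},
     "label": "malware incident"},
    {"features": {"deleted_files": True, "encrypted": True, "exe_found": False},
     "label": "data exfiltration"},
]

label, score = classify({"deleted_files": True, "encrypted": False,
                         "exe_found": True}, case_base)
print(label, round(score, 2))  # -> malware incident 1.0
```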

Alignment of a Combined Groin for Flow through a Straight Open Channel

The rivers in Bangladesh are highly unstable, with loose boundaries, mild water-surface and bed slopes, and irregular siltation from the huge sediment load coming from upstream, among other factors. Groins are installed along the river bank to deflect the flowing water away from vulnerable zones, but conventional groins have proved unstable and ineffective. A combined groin, having both impermeable and permeable components in the same structure, improves the flow field and functions better than the others. The main goal of this study is to analyze the hydraulic characteristics induced by combined groins of different alignments using a 2D numerical model, iRIC Nays2DH. In the numerical simulation, the K-ε model for turbulence and the Cubic Interpolation Pseudo-particle (CIP) method for the advective terms are used. A single flow condition is applied in the channel for all sets of groins with different alignments. The simulation results reveal that the combined groins alter the flow patterns considerably, with no significant recirculation of flow in the groin field, and that the different alignments produce somewhat different effects. Based on the hydraulic features caused by the groins, the combined groin whose permeable component is aligned slightly downstream performs best.

Time Organization for Urban Mobility Decongestion: A Methodology for People’s Profile Identification

Quality of life, environmental impact, and the congestion of mobility means and infrastructures remain significant challenges for urban mobility. Solutions such as car sharing, spatial redesign, e-commerce, and autonomous vehicles are likely to increase the number of vehicle-kilometres (veh-km) and the density of cars in urban traffic rather than reduce congestion; however, the impact of such solutions is not yet clear to researchers. Congestion arises from growing populations that must travel greater distances to arrive at similar locations (e.g., workplaces, schools) during the same time frames (e.g., rush hours). This paper first reviews research and applied work on urban congestion in recent years. Rethinking the question of time, it then investigates people's willingness and flexibility to adapt their arrival at and departure times from workplaces. We use neural networks and supervised learning methods in a methodology for predicting people's intentions from their responses to a questionnaire, which we created and distributed to more than 50 companies in the Paris suburbs. The results show that our methodology can predict people's intentions to reschedule their activities (work, study, commerce, etc.).
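As a sketch of the supervised-learning step described above, the following Python example trains a small neural network on encoded questionnaire responses. The features, encoding, and labels are synthetic stand-ins for the real survey items, which are not reproduced here.

```python
# Minimal sketch of predicting willingness to reschedule from questionnaire
# answers with a small neural network (scikit-learn). The features and
# labels below are synthetic stand-ins for the real survey responses.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Hypothetical encoded answers: commute time (min), flexibility score (0-4),
# has_children (0/1), employer_allows_flex (0/1).
X = rng.integers(0, 2, size=(400, 4)).astype(float)
X[:, 0] = rng.integers(10, 90, size=400)            # commute time in minutes
X[:, 1] = rng.integers(0, 5, size=400)              # flexibility score
y = ((X[:, 1] >= 2) & (X[:, 3] == 1)).astype(int)   # synthetic "willing" label

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", round(clf.score(X_test, y_test), 3))
```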

Possibilities for Testing User Experience and User Interface Design on Mobile Devices

In an era when everything is increasingly digital, consumers are constantly looking for new solutions to their everyday needs. In this context, mobile apps are developing at an exponential pace, and one of the fastest growing segments of mobile technologies is e-commerce. Mobile commerce is predicted to grow nearly three times as fast as e-commerce across all platforms globally, which indicates its importance within the segment. The current coronavirus pandemic is also changing many existing paradigms socially, economically, and technologically, with a major impact on consumer behavior and a new emphasis on the simplification and clarity of mobile solutions. This is the area that User Experience (UX) and User Interface (UI) designers deal with. Their task is to design a solution that is attractive and interesting, available on all mobile devices, and at the same time simple enough that the customer or visitor can reach the destination or the necessary information in a few clicks. The basis for changes in UX design can now be obtained not only through online analytical tools but also through neuromarketing, especially in the case of mobile devices. The paper highlights possibilities for testing the UX design of applications on mobile devices using a special platform that combines a stationary eye-tracking camera and facial analysis (facial coding).

Cultivating Individuality and Equality in Education: Ideas on Respecting Dimensions of Diversity within the Classroom

This systematic literature review sought to explore the dimensions of diversity that can affect classroom learning. The review is significant as it can aid educators in reaching more of their diverse student population and in creating classrooms that support both teachers and students. For this study, peer-reviewed articles were found and compiled using Google Scholar. Key search terms included student individuality, classroom equality, student development, teacher development, and teacher individuality. Relevant educational standards such as Common Core and Partnership for the 21st Century were also included in this review. Student and teacher individuality and equality are discussed, as well as methods to foster both within educational settings. Embracing student and teacher individuality was found to be key, as it may affect how each person interacts with given information. One method of fostering individuality and equality in educational settings is drafting and employing revised teaching standards, including various Common Core and US state standards. Another is to use educational theories such as constructivism, cognitive learning, and Experiential Learning Theory. However, barriers to fostering individuality, such as failing to acknowledge differences in a population's dimensions of diversity, still exist. Studies found that preserving the dimensions of diversity of both teachers and students yielded more positive and beneficial classroom experiences.

Visual Odometry and Trajectory Reconstruction for UAVs

The growing popularity of systems based on Unmanned Aerial Vehicles (UAVs) is highlighting their vulnerability, particularly with respect to the positioning system used. Typically, UAV architectures use civilian GPS, which is exposed to a number of attacks, such as jamming or spoofing. It is therefore important to develop alternative methodologies that accurately estimate the actual UAV position without relying on GPS measurements alone. In this paper we propose a position estimation method for UAVs based on monocular visual odometry. We have developed a flight control system capable of keeping track of the entire trajectory travelled, with a reduced dependency on the availability of the GPS signal. Moreover, the simplicity of the developed solution makes it applicable to a wide range of commercial drones. The final goal is to allow safer flights in all conditions, even under cyber-attacks trying to deceive the drone.
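The paper's flight-control system is not reproduced here; as a rough sketch of the monocular visual-odometry core it describes, the OpenCV-based Python fragment below estimates relative camera motion between two consecutive frames. The camera intrinsic matrix K is assumed known, and monocular scale remains ambiguous, so absolute displacement would need another sensor.

```python
# Sketch of one monocular visual-odometry step with OpenCV: match ORB
# features between consecutive grayscale frames and recover the relative
# rotation R and unit-scale translation t. K is the calibrated camera
# intrinsic matrix; absolute scale cannot be recovered from one camera.
import cv2
import numpy as np

def relative_pose(frame_prev, frame_curr, K):
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(frame_prev, None)
    kp2, des2 = orb.detectAndCompute(frame_curr, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t  # t has unit norm; scale must come from another sensor

# The trajectory is obtained by chaining these relative poses frame by frame,
# injecting a scale estimate (e.g. from a barometer or rangefinder).
```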

Atherosclerosis Prevalence within Populations of the Southeastern United States

A prevalence cohort study of atherosclerotic lesions in cadavers was performed to better understand and characterize the prevalence of atherosclerosis among Georgia residents within the Philadelphia College of Osteopathic Medicine (PCOM) - Georgia body donor program. We procured specimens from cadavers used for medical student, physical therapy student, and biomedical science student anatomical dissection at PCOM - South Georgia and PCOM - Georgia. Tissues were prepared as hematoxylin and eosin (H&E)-stained histological slides by Colquitt Regional Medical Center Laboratory Services. After cadaveric dissection, one section was taken from each of the following arteries at the site of greatest grossly palpated calcification (if present): left anterior descending coronary artery, left internal carotid artery, abdominal aorta, splenic artery, and hepatic artery. All specimens were graded and categorized according to the American Heart Association's Modified and Conventional Standards for Atherosclerotic Lesions at x4, x10, and x40 microscopic magnification. Our study cohort included 22 cadavers, 16 female and 6 male. The average age was 72.54 years and the median age 72, with a range of 52 to 90 years. Vascular and/or cardiovascular causes of death were listed on 6 of the 22 death certificates. Nineteen of 22 (86%) cadavers had at least one artery graded above 5; of these, only 5 of 19 (26%) had a vascular or cardiovascular cause of death reported. Malignancy was listed as a cause of death on 7 (32%) of the death certificates. The average atherosclerosis grades of the common hepatic, splenic, and left internal carotid arteries (2.15, 3.05, and 3.36 respectively) were lower than those of the left anterior descending artery and the abdominal aorta (5.16 and 5.86 respectively). This prevalence study characterizes the atherosclerosis found in five medium and large systemic arteries in cadavers from the state of Georgia.

Design and Analysis of Low-Power, High Speed and Area Efficient 2-Bit Digital Magnitude Comparator in 90nm CMOS Technology Using Gate Diffusion Input

Digital magnitude comparators based on the Gate Diffusion Input (GDI) implementation technique are fast and area-efficient and consume less power than comparators built with other implementation techniques. However, GDI is less efficient for some logic gates and does not provide full voltage swing. In this paper, we compare the performance of the GDI implementation technique with other implementation methods, namely Static CMOS, Pass Transistor Logic (PTL), and Transmission Gate (TG), in 90 nm, 120 nm, and 180 nm CMOS technologies using the BSIM4 MOS model. We propose a hybrid implementation methodology for digital magnitude comparators that significantly improves power, speed, area, and voltage swing. Simulation results reveal that the hybrid implementation of the digital magnitude comparator achieves improvements of 10.84% in power dissipation, 41.6% in propagation delay, and 47.95% in power-delay product (PDP) compared to the standard GDI implementation. Microwind & Dsch Version 3.5 and the Tanner EDA 16.0 tools were used for the simulations.
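The comparator equations themselves are standard; as a quick functional check, independent of the implementation technology, the Python sketch below models the 2-bit magnitude comparator outputs and verifies them against integer comparison for all 16 input combinations. It says nothing about the circuit-level GDI, CMOS, PTL, or TG realizations compared in the paper.

```python
# Behavioural model of a 2-bit magnitude comparator, checked exhaustively.
# Inputs: A = A1A0, B = B1B0; outputs: (A>B, A=B, A<B).
from itertools import product

def comparator_2bit(a1, a0, b1, b0):
    eq1 = 1 - (a1 ^ b1)                               # A1 XNOR B1
    eq0 = 1 - (a0 ^ b0)                               # A0 XNOR B0
    a_eq_b = eq1 & eq0
    a_gt_b = (a1 & (1 - b1)) | (eq1 & a0 & (1 - b0))
    a_lt_b = ((1 - a1) & b1) | (eq1 & (1 - a0) & b0)
    return a_gt_b, a_eq_b, a_lt_b

for a1, a0, b1, b0 in product((0, 1), repeat=4):
    a, b = 2 * a1 + a0, 2 * b1 + b0
    assert comparator_2bit(a1, a0, b1, b0) == (int(a > b), int(a == b), int(a < b))
print("all 16 input combinations verified")
```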

Efficient Alias-free Level Crossing Sampling

This paper proposes strategies for level crossing (LC) sampling and reconstruction that provide alias-free, high-fidelity signal reconstruction for speech signals without the sample count growing exponentially with bit-depth. We introduce LC sampling methods that keep the sampling rate close to the Nyquist rate even for large bit-depths. The results indicate that larger variation in the sampling intervals leads to an alias-free sampling scheme; this is achieved either by reducing the bit-depth or by adding jitter to the system at high bit-depths. In conjunction with windowing, the signal is reconstructed from the LC samples using an efficient Toeplitz reconstruction algorithm.
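As a minimal illustration of the level-crossing sampling idea itself, and not of the paper's reconstruction algorithm, the Python sketch below records a sample whenever a test signal crosses one of the uniformly spaced quantization levels defined by the bit-depth; the test signal is an arbitrary two-tone stand-in for speech.

```python
# Sketch of level-crossing (LC) sampling: a sample (time, level) is taken
# whenever the signal crosses one of the 2**bits uniformly spaced levels.
# Crossing times are estimated by linear interpolation between the two
# bracketing points of a densely sampled reference signal.
import numpy as np

def lc_sample(t, x, bits):
    levels = np.linspace(x.min(), x.max(), 2 ** bits)
    samples = []
    for i in range(len(x) - 1):
        lo, hi = sorted((x[i], x[i + 1]))
        for level in levels[(levels > lo) & (levels <= hi)]:
            frac = (level - x[i]) / (x[i + 1] - x[i])
            samples.append((t[i] + frac * (t[i + 1] - t[i]), level))
    return np.array(samples)

t = np.linspace(0, 0.02, 20000)                      # 20 ms reference grid
x = np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 1300 * t)
samples = lc_sample(t, x, bits=4)
print(len(samples), "LC samples from", len(t), "reference points")
```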

Dielectric Recovery Characteristics of High Voltage Gas Circuit Breakers Operating with CO2 Mixture

CO₂-based gas mixtures show great potential as an interruption medium to replace SF₆ in high-voltage switchgear. In this paper, the dielectric recovery characteristics of a CO₂-O₂ mixture in the post-arc phase after current zero are presented. As representative examples, dielectric recovery curves under different gas filling pressures and short-circuit current amplitudes are shown. A series of dielectric recovery measurements suggests that the dielectric recovery rate is proportional to the mass flux of the blowing gas and that the dielectric strength recovers faster at lower short-circuit currents.

MLOps Scaling Machine Learning Lifecycle in an Industrial Setting

Machine learning has evolved from an area of academic research into an applied, real-world field. This change comes with challenges: gaps and differences exist between common practices in academic environments and those in production environments. Following the continuous integration, development, and delivery practices of software engineering, similar trends have emerged for machine learning (ML) systems under the name MLOps. In this paper we propose a framework that helps to streamline and introduce best practices facilitating the ML lifecycle in an industrial setting. The framework can be used as a template and customized to implement various machine learning experiments. It is modular and can be recomposed to suit various use cases (e.g., data versioning, remote training in the cloud). The framework inherits practices from DevOps and introduces others that are unique to machine learning systems (e.g., data versioning). Our MLOps practices automate the entire machine learning lifecycle and bridge the gap between development and operations.
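The framework is described here only at a conceptual level; as a rough illustration of the modular, recomposable pipeline idea, the Python sketch below chains interchangeable lifecycle stages. Stage names and behaviours are invented for illustration and are not the paper's implementation.

```python
# Illustrative sketch of a modular ML-lifecycle pipeline whose stages can be
# swapped or reordered (e.g. adding a data-versioning or remote-training
# stage). Stage names and behaviours are invented placeholders.
from typing import Callable, Dict, List

Stage = Callable[[Dict], Dict]

def version_data(ctx: Dict) -> Dict:
    ctx["data_version"] = "v1"           # stub: tag or hash the dataset here
    return ctx

def train(ctx: Dict) -> Dict:
    ctx["model"] = f"model trained on {ctx['data_version']}"
    return ctx

def evaluate(ctx: Dict) -> Dict:
    ctx["metrics"] = {"accuracy": 0.9}   # stub metric
    return ctx

def run_pipeline(stages: List[Stage]) -> Dict:
    ctx: Dict = {}
    for stage in stages:
        ctx = stage(ctx)                 # each stage reads and extends the context
    return ctx

print(run_pipeline([version_data, train, evaluate]))
```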

JEWEL: A Cosmological Model Due to the Geometrical Displacement of Galactic Object Like Black, White and Worm Holes

Stellar objects such as black holes, white holes, and wormholes can be the subject of speculative reasoning if they are represented in a simplified, geometric form so that they can be moved; the cosmological model is one of the most important outcomes of such speculation, which can then open the way to aspects that are practical rather than strictly speculative, precisely in the Universe as we represent it. In this work, starting from the hypothesis of a very large number of black holes, white holes, and wormholes in our Universe, we imagine that they can be moved: they are aligned on a plane and redistributed, and the boundaries of this plane are then ideally joined, giving rise to a sphere over which the stellar objects under examination are radially distributed. Thanks to geometric displacements that do not make any of these stellar objects lose its functionality in the region in which it is located, at the end of the speculative process it is possible to identify a spherical layer that allows a flow from the outside to the inside of this spherical shell, relating it to other external and internal spherical layers; this seems useful for describing the universe we live in, for example as the inside of one of the spherical shells just described. The name "Jewel" was chosen because, following the speculative process of this work to the end of its steps, the cosmological model tends to be "luminous". The model includes, for each internal part of a generic layer, different and numerous moments of our universe thanks to an eternal inward flow. Many aspects remain to be explored, one of which is the connection between the outermost part and the inside of the spherical layers.

Terrorism as a Threat to International Peace: A Study on 9/11 Terrorism

This paper is a theory-oriented study that seeks to generalize the process through which terrorism leads to the disruption of international peace. To do so, it scrutinizes the 9/11 terrorist attacks across five analytical domains of threat—security disorder, political tensions, economic adversity, socio-ideological intolerance, and the fear and cost of counterterrorism—each of which is explored in light of specific indicators. Applying a qualitative correlation method, the paper finds that terrorism immediately entails five distinct kinds of negative impacts that lead both to internal disorder caused by state weakness and to global disorder caused by international tensions, which in consequence disrupt international peace. Thus, following an inductive process, the findings of this paper support the general inference that terrorism is a threat to international peace.

Tailormade Geometric Properties of Chitosan by Gamma Irradiation

Chitosans (CSs) in solution are increasingly used, across a range of geometric properties, in various academic and industrial sectors, especially in pharmaceutical and biomedical engineering. In order to provide applicants with a guide to tailoring CSs, gamma (γ)-irradiation technology and simple viscosity measurements were used in this study. CS solid discs (0.5 cm thick and 2.5 cm in diameter) were exposed in air to Cobalt-60 γ-radiation, at room temperature and a constant 50 kGy dose, for different exposure times (tγ). Dilute solutions of native and irradiated CSs were then prepared by dissolving 1.25 mg cm⁻³ of each polymer in 0.1 M NaCl/0.2 M CH₃COOH. Single-concentration relative viscosity (η_r) measurements were employed to obtain the intrinsic viscosity ([η]) values and interrelated parameters: the molar mass (M_η), hydrodynamic radius (R_H,η), radius of gyration (R_G,η), and second virial coefficient (A_2,η) of the CSs in solution. The results show an exponential decrease of η_r, [η], M_η, R_H,η, and R_G,η with increasing tγ, suggesting random chain scission of the CS glycosidic bonds, with rate constant k_r ≈ 0.017 min⁻¹ and corresponding lifetime τ_r = k_r⁻¹ ≈ 57.14 min. The results also show an exponential decrease of A_2,η with increasing tγ, which can be attributed to the growth of the excluded-volume effect in CS segments with tγ and, hence, better solution quality. The results are summarized in the following scaling laws as a tailoring guide: R_H,η = 6.98 × 10⁻³ M_r^0.65; R_G,η = 7.09 × 10⁻⁴ M_r^0.83; A_2,η = 121.03 M_η,r^−0.19.
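To make the reported scaling laws easy to apply as a tailoring guide, the short Python sketch below evaluates them for a given viscosity-average molar mass. The prefactors and exponents are copied from the abstract; the example molar mass is illustrative only, and units follow the authors' conventions.

```python
# Evaluate the reported chitosan scaling laws for a given viscosity-average
# molar mass M (units as used by the authors). Prefactors and exponents are
# copied from the abstract; this sketch adds no new data.
def chitosan_parameters(M):
    R_H = 6.98e-3 * M ** 0.65     # hydrodynamic radius
    R_G = 7.09e-4 * M ** 0.83     # radius of gyration
    A_2 = 121.03 * M ** -0.19     # second virial coefficient
    return R_H, R_G, A_2

R_H, R_G, A_2 = chitosan_parameters(2.0e5)   # example molar mass (illustrative)
print(f"R_H = {R_H:.1f}, R_G = {R_G:.1f}, A_2 = {A_2:.2f}")
```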

Profitability and Budgeting of Kenaf Cultivation and Fiber Production in Kelantan Districts

The purpose of this analysis is to estimate the viability and profitability of kenaf farming in Kelantan State. The monetary information was gathered through interviews with kenaf growers as well as group discussions, and production statistics were collected from the kenaf factory's administrative group. The monetary data were analyzed using the Precision financial Calculator. Three productivity scenarios were adopted for kenaf production per hectare: 15, 12, and 10 tons. The results revealed that when productivity was 15 tons and the grower received financial support from the kenaf administration, the profit margin reached 37%, almost double the profitability expected without government support. The financial analysis shows that all adopted productivity scenarios are feasible when the Benefit Cost Ratio (BCR) is used as the financial indicator. Nonetheless, the 15-ton productivity scenario is the most viable, with a payback period of 5 years, a moderate length of time to recover the invested amount. The study concludes that, for farmers to increase kenaf productivity per hectare, good farming practices as well as continuous financial support for farmers are highly needed.
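As a worked illustration of the financial indicators used above (BCR and payback period), the Python sketch below computes them from a stream of annual costs and benefits. The cash-flow figures and discount rate are invented placeholders and do not reproduce the study's kenaf budget.

```python
# Illustrative computation of Benefit Cost Ratio (BCR) and simple payback
# period from annual cash flows. All numbers below are invented placeholders.
def bcr(benefits, costs, rate):
    """Ratio of discounted benefits to discounted costs."""
    pv = lambda flows: sum(f / (1 + rate) ** t for t, f in enumerate(flows, start=1))
    return pv(benefits) / pv(costs)

def payback_period(net_flows):
    """First year in which the cumulative net cash flow becomes non-negative."""
    cumulative = 0.0
    for year, flow in enumerate(net_flows, start=1):
        cumulative += flow
        if cumulative >= 0:
            return year
    return None  # not recovered within the horizon

benefits = [0, 3000, 5000, 5000, 5000, 5000]   # hypothetical revenue per hectare
costs    = [6000, 2000, 2000, 2000, 2000, 2000]
print("BCR:", round(bcr(benefits, costs, rate=0.05), 2))
print("Payback year:", payback_period([b - c for b, c in zip(benefits, costs)]))
```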

Paradigm of Digital Twin Application in Project Management in Architecture, Engineering and Construction

With the growing adoption of advanced technologies such as building information modeling, artificial intelligence, and wireless networks, the collaboration and integration of these technologies into digital twins has become more prominent in the architecture, engineering and construction (AEC) industry, given the nature and scale of an industry well placed to adopt the digital twin. Digital twins have proven effective for AEC professionals in design and project management. The digital twin concept is continuously developing, and it is vital for AEC professionals and other stakeholders to understand it along with the adoption of the various advanced building technologies related to the AEC industry. This paper reviews the application of digital twins in project management in the AEC industry and highlights the challenges AEC practitioners face amid the revolution of technologies, including digital twins and building information modelling (BIM), for further research and future study.

Using Statistical Significance and Prediction to Test Long/Short Term Public Services and Patients Cohorts: A Case Study in Scotland

Health and Social care (HSc) services planning and scheduling face unprecedented challenges due to pandemic pressure, and also suffer from unplanned spending that has been negatively impacted by the global financial crisis. Data-driven approaches can help improve policies and plan and design service provision schedules, using algorithms that assist healthcare managers in facing unexpected demand with fewer resources. This paper discusses the packing of services using statistical significance tests and machine learning (ML) to evaluate the similarity and coupling of demands. This is achieved by predicting the range of a demand (its class) using ML methods such as Classification and Regression Trees (CART), Random Forests (RF), and Logistic Regression (LGR). The Chi-squared and Student's t significance tests are applied to data spanning 39 years of services delivered in Scotland. The demands are associated using probabilities and form parts of statistical hypotheses whose null part assumes that the target demand is statistically dependent on the demands of other services; this linking is checked against the data. In addition, ML methods are used to linearly predict the target demands from the statistically identified associations and to extend the linear dependence of the target demand to independent demands, thus forming groups of services. The statistical tests confirmed the ML coupling, made the predictions statistically meaningful, and showed that a target service can be matched reliably to other services, while ML showed that such relationships can also be linear. Zero padding was used for missing yearly records and better illustrated these relationships, both for limited periods and for the entire span, offering long-term data visualizations; limited-year periods showed how well patient numbers can be related over short periods of time, or how they change over time, as opposed to behaviours across more years. The prediction performance of the associations was measured using metrics such as the Receiver Operating Characteristic (ROC), Area Under the Curve (AUC), and Accuracy (ACC), as well as the Chi-squared and Student's t statistical tests. Co-plots and comparison tables for the RF, CART, and LGR methods, together with the p-values from the tests and Information Exchange (IE/MIE) measures, show the relative performance of the ML methods and of the statistical tests, as well as the behaviour under different learning ratios. The impact of initial groupings by k-nearest-neighbours classification (k-NN), Cross-Correlation (CC), and C-Means (CM) was also studied over limited years and over the entire span. It was found that CART generally lagged behind RF and LGR, but in some interesting cases LGR reached an AUC of 0, falling below CART, while the ACC was as high as 0.912, showing that ML methods can be confused by zero padding, data irregularities, or outliers. On average, 3 linear predictors were sufficient; LGR competed well with RF, and CART followed with the same performance at higher learning ratios. Services were packed only when the significance level (p-value) of their association coefficient was greater than 0.05. Social-factor relationships were observed between home care services and the treatment of old people, low birth weights, alcoholism, drug abuse, and emergency admissions.
The work found that different HSc services can be well packed as plans of limited duration, across various service sectors and learning configurations, as confirmed by the statistical hypotheses.
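As a compact sketch of the two-step workflow described above (a statistical association test followed by ML classification of a demand class), the Python example below uses synthetic yearly demand series. The service names, binning, thresholds, and data are hypothetical stand-ins for the Scottish records analysed in the paper.

```python
# Sketch of the two-step idea: (1) test the association between a target
# demand class and another service's binned demand with a Chi-squared test,
# (2) predict the target class with RF / CART / LGR and report AUC.
# All series are synthetic; column names and thresholds are hypothetical.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
years = 39
df = pd.DataFrame({
    "home_care": rng.integers(100, 200, years),
    "emergency_admissions": rng.integers(50, 150, years),
})
df["target_class"] = (df["home_care"] + rng.normal(0, 10, years) > 150).astype(int)

# (1) association between binned demands
binned = pd.cut(df["home_care"], bins=3, labels=False)
table = pd.crosstab(binned, df["target_class"])
chi2, p, _, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p-value={p:.3f}")

# (2) classification of the target demand class
X, y = df[["home_care", "emergency_admissions"]], df["target_class"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=1, stratify=y)
for name, model in [("RF", RandomForestClassifier(random_state=1)),
                    ("CART", DecisionTreeClassifier(random_state=1)),
                    ("LGR", LogisticRegression(max_iter=1000))]:
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(name, "AUC:", round(auc, 3))
```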