Understanding the Architecture of Hindu Temples: A Philosophical Interpretation

Vedic philosophy is one of the oldest living philosophies in the world. Beginning around 6500 BC in the western Indian subcontinent, the Indus Valley Civilization developed a theology that gradually matured into a well-established system of beliefs, popularly known as the ‘Hindu religion’. In Vedic theology, the abstract concept of God was formulated largely through close observation of the dynamism and recurrence of natural and cosmic phenomena. Through the ages, this theology passed through various discursions, debates, and questionings, and the abstract concept of God was in time given more representational form through various signs and symbols. Often, these symbols were employed in subtle ways in the construction of “sacred” sculptures and structures. Two distinct philosophies evolved from the Vedic tradition, seen mostly in the northern and southern parts of the Indian subcontinent respectively. This paper summarizes the complex philosophical treatises of Hinduism in northern and southern India and seeks to understand the meanings of the various signs and symbols incorporated in the architecture of Hindu temples, including the names given to the various parts of the temples. Hindu temples are not merely places of worship or ‘houses of gods’ like Greek and Roman temples; they are also structures that symbolize the dynamism and spiritual upliftment of human beings.

Developing Improvements to Multi-Hazard Risk Assessments

This paper outlines approaches to multi-hazard risk assessment. There is currently confusion about how to assess multi-hazard impacts, so this study aims to determine which of the available options are the most useful. The paper uses an international literature search, an analysis of current multi-hazard assessments, and a case study to illustrate the effectiveness of the chosen method. Findings from this study will help those wanting to assess multi-hazards to undertake a straightforward approach. The paper is significant in that it helps to interpret the various approaches and concludes with the preferred method. Many people in the world live in hazardous environments and are susceptible to disasters. Unfortunately, when a disaster strikes, it is often compounded by additional cascading hazards, so that people confront more than one hazard simultaneously. Hazards include natural hazards (earthquakes, floods, etc.) and cascading human-made hazards (for example, natural hazards triggering technological disasters (Natech) such as fire, explosion, or toxic release). Multi-hazards have a more destructive impact on urban areas than any single hazard alone. In addition, climate change is creating links between different disasters, for example causing landslide dams and debris flows that lead to more destructive incidents. Much of the prevailing literature deals with only one hazard at a time; however, sophisticated multi-hazard assessments have recently started to appear. Given that multi-hazards occur, it is essential to take multi-hazard risk assessment into consideration. This paper reviews the multi-hazard assessment methods published to date and categorizes the strengths and weaknesses of using these methods in risk assessment. Napier City is selected as a case study to demonstrate the necessity of using multi-hazard risk assessments.
To assess multi-hazard risk assessments, the current methods were first described. Next, their drawbacks were outlined. Finally, the improvements made to current multi-hazard risk assessments to date were summarised. Generally, the main problem in multi-hazard risk assessment is making valid assumptions about the risk arising from the interactions of different hazards. Risk assessment studies have started to address multi-hazard situations, but drawbacks such as uncertainty and lack of data show the need for more precise risk assessment. It should be noted that ignoring, or only partially considering, multi-hazards in risk assessment will lead to over- or underestimates in resilience and recovery action management.
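The compounding effect described above can be illustrated with a toy probability calculation (all numbers below are hypothetical, not taken from the study): when a cascading hazard becomes more likely once a first hazard strikes, an independence assumption badly understates the chance of a compound event.

```python
# All probabilities are hypothetical, for illustration only.
p_quake = 0.02              # annual probability of a damaging earthquake
p_flood_base = 0.05         # annual flood probability in a quake-free year
p_flood_given_quake = 0.30  # flood probability once a quake has struck
                            # (e.g. a quake-triggered landslide dam)

# Probability that BOTH hazards strike in the same year:
p_both_independent = p_quake * p_flood_base          # independence assumed
p_both_cascading = p_quake * p_flood_given_quake     # interaction modelled

print(f"independent: {p_both_independent:.4f}")  # 0.0010
print(f"cascading:   {p_both_cascading:.4f}")    # 0.0060
```

With these illustrative figures the interaction-aware estimate of a compound year is six times the independent one, which is exactly the kind of gap a single-hazard or independence-based assessment overlooks.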

A Design of Anisotropic Wet Etching System to Reduce Hillocks on Etched Surface of Silicon Substrate

This research aims to design and build a wet etching system suitable for anisotropic wet etching, in order to reduce etching time, to reduce hillocks on the etched surface (and thereby reduce roughness), and to create a 45-degree wall angle (micro-mirror). The study started with the design of the wet etching system, which has four main components: an ultrasonic cleaner, a condenser, a motor, and a substrate holder. An ultrasonic machine was then modified by adding a condenser to maintain a consistent solution concentration during the etching process and by installing a motor to improve the surface roughness. Tests of the system showed that the etch rate increased and the roughness was reduced.

Exploring Physicochemical Changes during the Fermentation of Aloe Vera

Aloe vera is a short-stemmed succulent plant commonly used in Myanmar traditional medicine. A. vera gel is also used as a food additive. This study aims to develop this Myanmar folk medicine into a functional beverage. In this research, A. vera was fermented with Saccharomyces cerevisiae for 6 months. Three different processes were carried out: process I contained A. vera 10%, sugar 30%, water 50%, and starter culture 10%; process II contained A. vera 10%, sugar 15%, honey 15%, water 50%, and starter culture 10%; process III contained A. vera 10%, honey 30%, water 50%, and starter culture 10%. During wine fermentation, parameters such as alcohol content, total soluble solids (ºBrix), pH, color, and cell population were analyzed. After 30 days of fermentation, the total cell population remained at 2.8×10⁶ in P-I and P-II and 3.2×10⁶ in P-III. Total soluble solids dropped to 15.8 ºBrix in P-I and P-II and 15.7 ºBrix in P-III. After 30 days, the clear wine was transferred to other vessels for racking. After 6 months of racking, the microbial population fell below the detectable level, and the alcohol content was approximately 11%, with no significant difference among the processes. P-II showed the highest color intensity at 450 nm and received the highest taster satisfaction when sensory evaluation was carried out using a five-point hedonic scale after 6 months of racking.

Development of Requirements Analysis Tool for Medical Autonomy in Long-Duration Space Exploration Missions

Improving resources for medical autonomy of astronauts in prolonged space missions, such as a Mars mission, requires not only technology development, but also decision-making support systems. The Advanced Crew Medical System - Medical Condition Requirements study, funded by the Canadian Space Agency, aimed to create knowledge content and a scenario-based query capability to support medical autonomy of astronauts. The key objective of this study was to create a prototype tool for identifying medical infrastructure requirements in terms of medical knowledge, skills and materials. A multicriteria decision-making method was used to prioritize the highest risk medical events anticipated in a long-term space mission. Starting with those medical conditions, event sequence diagrams (ESDs) were created in the form of decision trees where the entry point is the diagnosis and the end points are the predicted outcomes (full recovery, partial recovery, or death/severe incapacitation). The ESD formalism was adapted to characterize and compare possible outcomes of medical conditions as a function of available medical knowledge, skills, and supplies in a given mission scenario. An extensive literature review was performed and summarized in a medical condition database. A PostgreSQL relational database was created to allow query-based evaluation of health outcome metrics with different medical infrastructure scenarios. Critical decision points, skill and medical supply requirements, and probable health outcomes were compared across chosen scenarios. The three medical conditions with the highest risk rank were acute coronary syndrome, sepsis, and stroke. Our efforts demonstrate the utility of this approach and provide insight into the effort required to develop appropriate content for the range of medical conditions that may arise.
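The abstract does not specify which multicriteria decision-making method was used to prioritize medical events; purely as an illustration, a simple weighted-sum ranking over hypothetical criteria, scores, and weights might look like this:

```python
# Hypothetical criteria, scores, and weights — the study's actual
# multicriteria method and data are not given in the abstract.
conditions = {
    # name: (likelihood, severity, treatability gap), each scored 1-5
    "acute coronary syndrome": (3, 5, 4),
    "sepsis":                  (3, 5, 3),
    "stroke":                  (2, 5, 4),
    "dental abscess":          (4, 2, 1),
}
weights = (0.3, 0.4, 0.3)  # hypothetical relative importance, summing to 1

def risk_score(scores, weights):
    """Simple weighted sum of criterion scores."""
    return sum(w * s for w, s in zip(weights, scores))

ranked = sorted(conditions,
                key=lambda c: risk_score(conditions[c], weights),
                reverse=True)
print(ranked)
```

Real MCDM methods (AHP, TOPSIS, etc.) add pairwise weight elicitation and normalization on top of this basic idea, but the ranking step reduces to a comparison of aggregate scores like the one above.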

The Keys to Innovation: Defining and Evaluating Attributes that Measure Innovation Capabilities

Innovation is a key driver for companies, society, and economic growth. However, assessing and measuring innovation for individuals as well as organizations remains difficult. Our i5-Score presented in this study will help to overcome this difficulty and facilitate measuring innovation potential. The score is based on a framework we call the 5Gs of innovation, which defines specific innovation attributes. These are: 1) the drive for long-term goals, 2) the audacity to generate new ideas, 3) the openness to share ideas with others, 4) the ability to grow, and 5) the ability to maintain high levels of optimism. To validate the i5-Score, we conducted a study at Florida Polytechnic University. The results show that the i5-Score is a good measure of the innovative mindset of an individual or a group. Thus, the score can be utilized for evaluating, refining, and enhancing innovation capabilities.

Analysis of Network Performance Using Aspect of Quantum Cryptography

Quantum cryptography is described as a point-to-point secure key generation technology that has emerged in recent times, often presented as providing absolute security. Researchers have begun studying innovative approaches to exploit the security of quantum key distribution (QKD) for large-scale communication systems, and a number of approaches and models for utilizing QKD for secure communication have been developed. The uncertainty principle of quantum mechanics created a new paradigm for QKD. One approach involved network-oriented security, with the main goal of a point-to-point quantum network that exploits QKD technology for end-to-end network security via high-speed QKD. Other approaches and models that employ QKD in a network fashion have also been introduced in the literature. A different approach, which this paper addresses, is using QKD within existing protocols that are widely used on the Internet to enhance security, with the main objective of unconditional security. Our work analyzes QKD in mobile ad hoc networks (MANETs).

Review of the Road Crash Data Availability in Iraq

Iraq is a middle-income country where road crashes are considered one of the leading causes of death. To control this risk, the Iraqi Ministry of Planning, General Statistical Organization, started to organise a system for collecting traffic accident data, with details related to their causes and severity. These data are published as an annual report. In this paper, a review of the available crash data in Iraq is presented. The available data represent accident rates at an aggregated level, classified by accident type, road-user details, crash severity, vehicle type, causes, and number of casualties. The review is organised according to the types of models used in road safety studies and research, and according to the road safety data required in road construction tasks. The available data are also compared with the road safety dataset published in the United Kingdom, as an example of a developed country. It is concluded that the data in Iraq are suitable for descriptive and exploratory models, aggregated-level comparison analyses, and evaluating and monitoring the progress of overall traffic safety performance. However, important traffic safety studies require disaggregated data and details related to the factors affecting the likelihood of traffic crashes. Some studies require spatial geographic details, such as the locations of accidents, which are essential for ranking roads according to their level of safety and naming the most dangerous roads in Iraq, an issue that requires a tactical plan to control. Global road safety agencies interested in solving this problem in low- and middle-income countries have designed road safety assessment methodologies based on road attribute data only. Therefore, this research recommends using one of these methodologies.

Religion versus Secularism on Women’s Liberation: The Question of Women Liberation and Modern Education

The nineteenth century was characterized by major educational reforms in the Arab world. One of the unintended outcomes of colonization in Arab countries was the initiation of women's liberation, along with the introduction of modern education and its application in sensitizing people to the rights of women and their liberation. The reforms were often attributed to various undercurrents at different levels within the Ottoman Empire, particularly the arrival and influence of Christian missionaries, who were supported by American and European governments. These trends were also significantly attributed to the growing presence of Europeans in the region, as well as the introduction of secular ideas and approaches related to the meaning of modernity. Using literary analysis as a method, this paper examines the role of important male figures such as the political activist and writer Qāsim Amīn and the religious reformer Muḥammad ʻAbduh in starting this discourse, shows their impact on the women's emancipation movement (Taḥrīr), and traces how women later led the movement through their published work. The paper explores Arab salons and the initiation of women's literary circles, begun by women from wealthy families in Egypt and Syria who had studied in Europe or interacted with European counterparts. These salons acted as central locations where people could meet and hold discussions on political, social, and literary trends as they unfolded. The paper concludes with a discussion of the current debates between the Islamist and secularist branches of the movement.
While the Islamists believe that adhering to the core of Islam, with some of its contested positions on women, is a modern ideology of liberation that fits the current culture of modern Egypt, the secularists argue that the influence of Islam on the women's liberation movement in Egypt has threatened the natural success and progress of the movement, which was initiated in the early nineteenth century, independent of the more recent trend towards religiosity in the country.

Spatial Disparity in Education and Medical Facilities: A Case Study of Barddhaman District, West Bengal, India

The economic scenario of a region alone does not give a complete picture of its overall development; economic development must be accompanied by social development for a proper assessment of the level of development. Spatial variation in social development is discussed here in terms of the quality of functioning of the social system in a specific area. In this paper, an attempt has been made to study the spatial distribution of social infrastructure facilities and to analyze the magnitude of regional disparities at the inter-block level in Barddhaman district. The paper begins with a detailed account of the selection of social infrastructure indicators and describes the methodology employed in the empirical analysis. Analyzing block-level data, it seeks to identify disparities among the blocks in their levels of social development. The results are explained using both statistical analysis and geospatial techniques. The paper reveals that social development is not proceeding at the same rate in every part of the district: health and educational facilities are concentrated at a few selected points, so overall development activity is concentrated in a few centres, and disparity is observed across the blocks.

Assessment and Uncertainty Analysis of ROSA/LSTF Test on Pressurized Water Reactor 1.9% Vessel Upper Head Small-Break Loss-of-Coolant Accident

An experiment utilizing the ROSA/LSTF (rig of safety assessment/large-scale test facility) simulated a 1.9% vessel upper head small-break loss-of-coolant accident with an accident management (AM) measure, under total failure of the high-pressure injection system of the emergency core cooling system, in a pressurized water reactor. Steam generator (SG) secondary-side depressurization, as the AM measure, was started by fully opening the relief valves in both SGs when the maximum core exit temperature rose to 623 K. A large increase took place in the cladding surface temperature of the simulated fuel rods on account of a late and slow response of the core exit thermocouples during core boil-off. The author analyzed the LSTF test with reference to the matrix of integral effect tests for the validation of a thermal-hydraulic system code. Problems remained in predicting the primary coolant distribution and the core exit temperature with the RELAP5/MOD3.3 code. The uncertainty analysis results with the RELAP5 code confirmed that the sample size chosen with respect to the order statistics influences both the value of peak cladding temperature estimated with 95% probability at a 95% confidence level and the Spearman's rank correlation coefficients.
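The 95%-probability/95%-confidence statement corresponds to the standard nonparametric (Wilks) order-statistics approach widely used in such uncertainty analyses. As a sketch of how the minimum sample size depends on the order used (assuming the usual one-sided tolerance-limit formula, not this paper's specific setup):

```python
import math

def achieved_confidence(n, coverage, order):
    """Confidence that the order-th largest of n samples bounds the
    `coverage` quantile (one-sided nonparametric tolerance limit)."""
    return 1.0 - sum(math.comb(n, i) * (1 - coverage) ** i * coverage ** (n - i)
                     for i in range(order))

def wilks_min_n(coverage=0.95, confidence=0.95, order=1):
    """Smallest sample size meeting the coverage/confidence target."""
    n = order
    while achieved_confidence(n, coverage, order) < confidence:
        n += 1
    return n

print(wilks_min_n(order=1))  # 59: the classic first-order 95/95 value
print(wilks_min_n(order=2))  # 93: second-order 95/95 value
```

Moving from the first to the second order statistic raises the required sample size from 59 to 93 code runs, which is why the choice of order directly affects the 95/95 peak cladding temperature estimate.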

Simultaneous Optimization of Design and Maintenance through a Hybrid Process Using Genetic Algorithms

In general, issues related to design and maintenance are treated independently, even though the decisions made in these two areas influence each other. Design for maintenance is an opportunity to optimize the life cycle cost of a product, particularly in the nuclear and aeronautical fields, where maintenance expenses represent more than 60% of life cycle costs. The design of large-scale systems starts with the product architecture and a choice of components in terms of cost, reliability, weight, and other attributes corresponding to the specifications. At the same time, the design must take maintenance into account, in particular by improving real-time monitoring of equipment through the integration of new technologies such as connected sensors and intelligent actuators. We observed that the approaches used in existing Design for Maintenance (DFM) methods are limited in their ability to simultaneously characterize the reliability and maintainability of a multi-component system. This article proposes a DFM method that assists designers in proposing dynamic maintenance for multi-component industrial systems. The term "dynamic" refers to the ability to integrate available monitoring data to adapt the maintenance decision in real time. The goal is to maximize the availability of the system at a given life cycle cost. This paper presents an approach for the simultaneous optimization of the design and maintenance of multi-component systems. Here the design is characterized by four decision variables for each component (reliability level, maintainability level, redundancy level, and level of monitoring data), and the maintenance by two decision variables (the dates of the maintenance stops and the maintenance operations to be performed on the system during these stops). The DFM model helps designers choose technical solutions for large-scale industrial products.
Large-scale refers to complex multi-component industrial systems with long life cycles, such as trains and aircraft. The method is based on a two-level hybrid algorithm for the simultaneous optimization of design and maintenance, using genetic algorithms. The first level selects a design solution for a given system, considering the life cycle cost and the reliability. The second level determines a dynamic, optimal maintenance plan to be deployed for that design solution. This level is based on the Maintenance Free Operating Period (MFOP) concept, which takes into account decision criteria such as total reliability, maintenance cost, and maintenance time. Depending on the life cycle duration, the desired availability, and the desired business model (sales or rental), this tool provides visibility of overall costs and the optimal product architecture.
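As a much-simplified illustration of the first (design) level only, with entirely hypothetical cost and reliability figures rather than the paper's model, a genetic algorithm over per-component reliability levels might be sketched as follows:

```python
import random

random.seed(1)

N_COMPONENTS = 5
LEVEL_COST = [1.0, 2.0, 4.0, 7.0]               # hypothetical unit costs
LEVEL_RELIABILITY = [0.90, 0.95, 0.99, 0.999]   # hypothetical reliabilities
TARGET = 0.90                                   # required system reliability

def system_reliability(design):
    """Series system: the product of component reliabilities."""
    r = 1.0
    for lvl in design:
        r *= LEVEL_RELIABILITY[lvl]
    return r

def fitness(design):
    """Minimize cost; penalize any shortfall below the reliability target."""
    cost = sum(LEVEL_COST[lvl] for lvl in design)
    shortfall = max(0.0, TARGET - system_reliability(design))
    return -(cost + 1000.0 * shortfall)

def evolve(pop_size=40, generations=80, mut_rate=0.2):
    pop = [[random.randrange(4) for _ in range(N_COMPONENTS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]                  # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, N_COMPONENTS)   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mut_rate:            # point mutation
                child[random.randrange(N_COMPONENTS)] = random.randrange(4)
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
print(best, round(system_reliability(best), 4))
```

The paper's second level would then search, for the chosen design, over maintenance stop dates and operations under the MFOP criteria; that inner loop is omitted here.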

Determination of the Thermophysical Characteristics of the Composite Material Clay Cement Paper

In Morocco, the building sector is largely responsible for the growth in energy consumption, and controlling energy use in this sector remains a major issue despite the rise of renewable energies. The design of an environmentally friendly building requires mastery and knowledge of energy and bioclimatic aspects, which implies taking into consideration all the elements making up the building and the way energy is exchanged between them. In this context, thermal insulation seems to be an ideal starting point for reducing energy consumption and greenhouse gas emissions. The aim of this work is to provide some solutions for reducing energy consumption while maintaining thermal comfort in the building. The objective of our work is to present an experimental study on the characterization of local materials used in the thermal insulation of buildings, namely recycled paper stabilized with cement and clay. The thermal conductivity of these materials, constituted from sand, clay, cement, water, and treated paper, was determined by the guarded-hot-plate method. The work involves the design of two materials that are then subjected to thermal and mechanical tests to determine their thermophysical properties. The results show that the thermal conductivity decreases in both the paper-cement and paper-clay mixtures and seems to stabilize around 40%. Measurements of mechanical properties such as flexural strength showed that enriching the studied material with paper reduces the flexural strength by 20% while optimizing the conductivity.

Extending the Flipped Classroom Approach: Using Technology in Module Delivery to Students of English Language and Literature at the British University in Egypt

Technology-enhanced teaching has been in the limelight since the 1990s, when educators started investigating and experimenting with using computers in the classroom as a means of building 21st-century skills and motivating students. The concept of technology-enhanced strategies in education is kaleidoscopic: it has meant different things to different educators. For the purposes of this paper, however, it will be used to refer to the diverse technology-based strategies used to support and enrich the flipped learning process, in the classroom and outside it. The paper investigates how technology is put in the service of teaching and learning to improve the students' learning experience, as manifested in students' attendance and engagement, achievement rates, and finally, students' projects at the end of the semester. The results are supported by a student survey about relevant aspects of their learning experience in the modules under study.

Measuring Banks’ Antifragility via Fuzzy Logic

Analysing the world banking sector, we realize that traditional risk measurement methodologies no longer reflect the actual scenario, with its uncertainty, and leave out events that can change the dynamics of markets. Considering this, regulators and financial institutions have begun to search for more realistic models, aiming to include external influences and interdependencies between agents in order to describe and measure the operation of these complex systems and their risks in a more coherent and credible way. Within this context, X-events are more frequent than assumed, and, amid uncertainty and constant change, the concept of antifragility is gaining great prominence in comparison with other risk management methodologies. It is very useful to analyse whether a system succumbs to (fragile), resists (robust), or benefits from (antifragile) disorder and stress. Thus, this work proposes the creation of the Banking Antifragility Index (BAI), which is based on the calculation of a triangular fuzzy number, to "quantify" qualitative criteria linked to antifragility.
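As an illustrative sketch only (the BAI's actual criteria and aggregation are not detailed in the abstract, and the criteria names below are invented), a triangular fuzzy number with centroid defuzzification can turn qualitative ratings into a crisp index:

```python
from dataclasses import dataclass

@dataclass
class TriangularFuzzyNumber:
    """TFN (a, b, c) with a < b < c: membership peaks at b."""
    a: float
    b: float
    c: float

    def membership(self, x: float) -> float:
        """Piecewise-linear membership function of the triangle."""
        if x == self.b:
            return 1.0
        if self.a < x < self.b:
            return (x - self.a) / (self.b - self.a)
        if self.b < x < self.c:
            return (self.c - x) / (self.c - self.b)
        return 0.0

    def centroid(self) -> float:
        """Common defuzzification: the centroid of the triangle."""
        return (self.a + self.b + self.c) / 3.0

# Hypothetical expert ratings of one bank on antifragility-related
# criteria, each expressed as a TFN on a 0-10 scale:
criteria = {
    "optionality":       TriangularFuzzyNumber(6, 8, 10),
    "redundant buffers": TriangularFuzzyNumber(4, 6, 8),
    "stress response":   TriangularFuzzyNumber(5, 7, 9),
}

bai = sum(t.centroid() for t in criteria.values()) / len(criteria)
print(round(bai, 2))  # 7.0
```

The fuzzy representation keeps the spread of each expert judgment (a to c) rather than collapsing it to a single number too early; defuzzification happens only at the final aggregation step.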

Analysis and Evaluation of the Public Responses to Traffic Congestion Pricing Schemes in Urban Streets

Traffic congestion pricing on urban streets is one of the most suitable options for addressing traffic problems and environmental pollution in the country's cities. Despite its positive outcomes, there are problems concerning the public's obligation to pay. Because public response is so influential to the success of this strategy, studying people's responses and behavior in order to obtain feedback and improve the schemes is of great importance. In this study, a questionnaire was used to examine public reactions to the traffic congestion pricing schemes in the center of the Tehran metropolis, and the factors involved in people's decisions to accept or reject congestion pricing schemes were assessed based on the questionnaire data as well as international experience. Then, by analyzing and comparing the schemes, guidelines to reduce public objections to them are discussed. The results of evaluating the public reactions show that all the pros and cons must be considered to guarantee the success of such projects. Consequently, with targeted public education and consciousness-raising advertisements prior to initiating a scheme, and by ensuring the mechanism of implementation after the start of the project, initial opposition is reduced, and, with the gradual emergence of the real and tangible benefits of implementation, users' satisfaction will increase.

Choice Experiment Approach on Evaluation of Non-Market Farming System Outputs: First Results from Lithuanian Case Study

Market and non-market outputs are produced jointly in agriculture, and their supply depends on the intensity and type of production. The role of agriculture as an economic activity and its effects are important for the Lithuanian case study, as agricultural land covers more than half of the country. Positive and negative externalities created in agriculture are not priced by the market; therefore, specific techniques such as stated preference methods, in particular choice experiments (CE), are used for the evaluation of non-market outputs in agriculture. The main aim of this paper is to present the construction of the research path for the evaluation of non-market farming system outputs in Lithuania. Conventional and organic farming, covering crop (both cereal and industrial crops) and livestock (dairy and cattle) production, were selected. The CE method and the nested logit (NL) model were selected as appropriate for the evaluation of non-market outputs of different farming systems in Lithuania. A pilot survey was implemented in October–November 2018 in order to test and improve the CE questionnaire. The results of the survey showed that the questionnaire was accepted and well understood by the respondents, and the econometric modelling showed that the selected NL model can be used for the main survey. Residents' understanding of the differences between organic and conventional farming was identified, revealing that they are more willing to choose organic farming than conventional farming.

Comparing the Efficiency of Simpson’s 1/3 and 3/8 Rules for the Numerical Solution of First Order Volterra Integro-Differential Equations

This paper compares the efficiency of Simpson's 1/3 and 3/8 rules for the numerical solution of first-order Volterra integro-differential equations. In developing the solution, a collocation approximation method was adopted, using shifted Legendre polynomials as basis functions. A block method approach was preferred to the predictor-corrector method because it is self-starting. Experimental results confirmed that Simpson's 3/8 rule is more efficient than Simpson's 1/3 rule.
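For reference, the two quadrature rules being compared can be sketched as composite formulas (shown here on a plain test integral with a known value, not embedded in the paper's collocation/block method):

```python
import math

def simpson_13(f, a, b, n):
    """Composite Simpson's 1/3 rule; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += f(a + i * h) * (4 if i % 2 else 2)
    return s * h / 3

def simpson_38(f, a, b, n):
    """Composite Simpson's 3/8 rule; n must be a multiple of 3."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += f(a + i * h) * (2 if i % 3 == 0 else 3)
    return s * 3 * h / 8

# Test integral with known value: the integral of sin(x) over [0, pi] is 2.
print(abs(simpson_13(math.sin, 0, math.pi, 12) - 2))
print(abs(simpson_38(math.sin, 0, math.pi, 12) - 2))
```

Both rules are fourth-order accurate, so the efficiency difference the paper reports comes from how each rule interacts with the collocation/block formulation rather than from the raw quadrature error alone.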

Investigation of the Relationship between Exam Anxiety and Binge Eating Disorder in High School Students Aged 15-19

Eating disorders are mental health disorders resulting from disrupted eating behavior. This study was conducted to examine the relationship between exam stress and binge eating disorder in high school students. The study was conducted in March 2018 with 60 high school students (31 females, 29 males) aged 15-19 years. Personal characteristics and eating habits were assessed using a questionnaire prepared by the researcher. Binge eating disorder status was determined by the Bulimic Investigatory Test, Edinburgh (BITE); test anxiety was determined by the Revised Test Anxiety (RTA) scale. Statistical analysis of the data was performed with the IBM SPSS Statistics 23 program. While there was no significant relationship between BITE scores and eating after dinner (p > 0.05), there was a significant relationship with the need to eat when stressed (p < 0.05). In 43.3% of individuals, there was no binge eating disorder, but abnormal eating behavior was observed; binge eating disorder (BED) was seen in 8.3% of the students. No significant difference was found between male and female students in terms of BED (p > 0.05). Of the participants with BED, 60% had a medium level of anxiety and 40% a high level. A significant relationship was found between BITE and RTA scores (p < 0.05). However, the relationships between the need for eating and the BMI and RTA scores were not significant (p > 0.05). Overall, a positive relationship was found between BED and test anxiety scores.
In adolescence, the period when dietary and other mental health problems typically begin, there should be regular training on methods of coping with anxiety and on the principles of healthy nutrition in order to prevent health problems.

Evaluation of the Weight-Based and Fat-Based Indices in Relation to Basal Metabolic Rate-to-Weight Ratio

Basal metabolic rate has been questioned as a risk factor for weight gain. The relations between basal metabolic rate and body composition have not yet been clarified, and the impact of fat mass on basal metabolic rate is also uncertain. Within this context, indices based upon total body mass as well as total body fat mass are available. In this study, the aim is to investigate the potential clinical utility of these indices in the adult population. 287 individuals, aged 18 to 79 years, were included in the study: based upon body mass index values, 10 underweight, 88 normal-weight, 88 overweight, 81 obese, and 20 morbidly obese individuals participated. Anthropometric measurements, including height (m) and weight (kg), were performed. Body mass index, diagnostic obesity notation model assessment index I, diagnostic obesity notation model assessment index II, and the basal metabolic rate-to-weight ratio were calculated. Total body fat mass (kg), fat percentage (%), basal metabolic rate, metabolic age, visceral adiposity, fat mass of the upper and lower extremities and trunk, and obesity degree were measured with a TANITA body composition monitor using bioelectrical impedance analysis technology. Statistical evaluations were performed with the statistical package SPSS for Windows, version 16.0. Scatterplots of individual measurements were drawn for the correlated parameters, and linear regression lines were displayed. The statistical significance level was accepted as p < 0.05. Strong correlations were obtained between body mass index and diagnostic obesity notation model assessment indices I and II (p < 0.001). A much stronger correlation was detected between basal metabolic rate and diagnostic obesity notation model assessment index I than between basal metabolic rate and body mass index (p < 0.001).
Upon consideration of the associations between the basal metabolic rate-to-weight ratio and these three indices, the best association was observed with diagnostic obesity notation model assessment index II. In a similar manner, this index was highly correlated with fat percentage (p < 0.001). Independently of the indices, a strong correlation was found between fat percentage and the basal metabolic rate-to-weight ratio (p < 0.001). Visceral adiposity was much more strongly correlated with metabolic age than with chronological age (p < 0.001). In conclusion, all three indices were associated with metabolic age, but not with chronological age. Diagnostic obesity notation model assessment index II values were highly correlated with body mass index values across all ranges, from underweight to morbid obesity. This index showed the best association with the basal metabolic rate-to-weight ratio, which can be interpreted as basal metabolic rate per unit weight.