Bold Headlines of Urban Eyescapes: A Computational Approach of Urban Mapping via Digital Surveying and Eye-Tracking Technologies

The sensory stimuli of the urban environment are often treated as subtle structures that derive from experiencing the city. The experience of the urban environment is also related to the social relationships and memories that complete the 'urban eyescapes' and the way individuals recall them. Although urban sensory stimuli are considered part of urban design, the visual experience is currently difficult to account for in urban studies. This article explores ways of recording how the senses mediate one's engagement with the urban environment. The study involves an experiment in the urban environment of the Copenhagen city centre, with 20 subjects performing a walking task. The aim of the experiment is to categorize the visual 'Bold Headlined Stimuli' (BHS) of the examined environment using eye-tracking techniques. The analysis allows us to identify the Headlining Stimuli Process (HSP) in the selected urban environment. HSP is significantly mediated by body mobility and perceptual memories, and it shows how urban stimuli influence the intelligibility and recall patterns of urban characteristics. The results yield a 'Bold Headline list' of stimuli related to: the spatial characteristics of highest preference; the stimuli that are relevant to livability; and the spatial dimensions that are easiest to recall. The BHS data will be used in cross-disciplinary city analysis. In the future, these results could be useful in urban design, providing information on how urban space affects human activities.

Systematic Mapping Study of Digitization and Analysis of Manufacturing Data

The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, its processes must first be digitized, that is, information must be converted from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was used to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data. Research papers were classified according to the type of research and type of contribution to the research area. Upon analyzing the 54 papers identified in this area, it was noted that 23 originated in Germany. This is an unsurprising finding, as Industry 4.0 was originally a German strategy whose implementation is backed by strong policy instruments in Germany. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing institution, with three papers published. The literature suggested future research directions and highlighted one specific gap in the area: an unresolved divide between data science experts and manufacturing process experts in industry. Data analytics expertise is of limited use unless manufacturing process knowledge is brought to bear; a genuine understanding of the data is crucial to perform accurate analytics and gain true, valuable insights into the manufacturing process. This gap between manufacturing operations and the information technology/data analytics departments within enterprises was borne out by the results of many of the case studies reviewed as part of this work. To test whether this gap exists, the researcher initiated an industrial case study in which they embedded themselves between the subject matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 contributed a framework, another 12 were based on a case study, and 11 focused on theory, but only three contributed a methodology. This provides further evidence of the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research.

Drawings as a Methodical Access to Reconstruct Children's Perspective on a Horse-Assisted Intervention

In this article, the collection and analysis of drawings are implemented and discussed as a methodological approach to reconstruct children's perspectives on horse-assisted interventions. For this purpose, drawings by three children (8-10 years old) were included in the research process in order to clarify what insights the drawings can provide into the child's perspective on the intervention. The children were asked to draw a picture of themselves at the horse stable. Practical implementation considerations are disclosed. The developed analysis steps draw on the work of two art historians (Erwin Panofsky and Max Imdahl) to capture the visual sense and to interpret the children's drawings. Relevant topics about the children's perspective can be inferred from the drawings. The following topics are important for the children: overcoming challenges and fears in handling the horse, support from an adult in handling the horse, and feeling self-confident and competent to act after completing tasks with the horse. The drawings show the main topics that are relevant for the children and can be used as a basis for conversation. All in all, children's drawings offer a useful addition to other survey methods for gaining further insights into the experiences of children in a horse-assisted setting.

Competitors’ Influence Analysis of a Retailer by Using Customer Value and Huff’s Gravity Model

Customer relationship analysis is vital for retail stores, especially for supermarkets. Point of sale (POS) systems make it possible to record the daily purchasing behavior of customers as an identification point of sale (ID-POS) database, which can be used to analyze the customer behavior of a supermarket. Customer value is an indicator, based on the ID-POS database, for detecting customer loyalty to a store. In general, there are many supermarkets in a city, and nearby competitor supermarkets significantly affect the customer value of a supermarket's customers. However, it is impossible to obtain detailed ID-POS databases of competitor supermarkets. This study first focused on customer value and the distance between a customer's home and the supermarkets in a city, and then constructed models based on logistic regression analysis to analyze the correlations between distance and purchasing behavior using only the POS database of a single supermarket chain. Three primary problems arose during the modeling process: the incomparability of customer values, multicollinearity among the customer value and distance data, and the number of valid partial regression coefficients. The improved customer value, Huff's gravity model, and the inverse attractiveness frequency are used to solve these problems. This paper presents three types of models based on these three methods for loyal customer classification and competitors' influence analysis. In numerical experiments, all types of models are useful for loyal customer classification. The model incorporating all three methods is the best for evaluating the influence of other nearby supermarkets on customers' purchasing from a supermarket chain, from the viewpoint of valid partial regression coefficients and accuracy.
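For readers unfamiliar with Huff's gravity model, the following is a minimal sketch of its store-choice probability, which relates distance and store attractiveness to purchasing behavior. The attractiveness values, distances and decay exponents below are illustrative assumptions, not the fitted values from the study.

```python
import numpy as np

def huff_probabilities(attractiveness, distances, alpha=1.0, beta=2.0):
    """Huff's gravity model: probability that a customer at a given home
    location chooses each store j, from store attractiveness (e.g., floor
    area) and home-to-store distance:
        P_j = (A_j**alpha / d_j**beta) / sum_k (A_k**alpha / d_k**beta)
    """
    attractiveness = np.asarray(attractiveness, dtype=float)
    distances = np.asarray(distances, dtype=float)
    utility = attractiveness**alpha / distances**beta
    return utility / utility.sum()

# Illustrative example: one customer, three supermarkets.
# Attractiveness (floor area in m^2) and distances (km) are assumed values.
probs = huff_probabilities(attractiveness=[2500, 1800, 3200],
                           distances=[0.8, 1.5, 2.4])
print(probs)  # choice probability for each store; sums to 1
```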

Adomian’s Decomposition Method to Functionally Graded Thermoelastic Materials with Power Law

This paper presents an iteration method for the numerical solution of a one-dimensional problem of generalized thermoelasticity with one relaxation time under given initial and boundary conditions. A thermoelastic material whose properties vary as a power-law functionally graded medium is considered. Adomian's decomposition technique is applied to the governing equations, and the numerical results are calculated with an iterative algorithm. The numerical results are presented in figures, which confirm that Adomian's decomposition method is a successful method for modeling thermoelastic problems. Moreover, the empirical power-law grading parameter and the lattice design parameter have significant effects on the temperature increment, the strain, the stress, and the displacement.
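As background, the generic form of the Adomian decomposition recursion is sketched below for an abstract operator equation; this is the standard scheme, not the paper's specific thermoelastic governing equations or notation.

```latex
% Generic Adomian decomposition scheme (illustrative; not the paper's
% specific governing equations). For Lu + Ru + Nu = g, with L an easily
% invertible linear operator, R the remaining linear part and N the
% nonlinear part:
\begin{align}
  u &= \sum_{n=0}^{\infty} u_n, \qquad
       N(u) = \sum_{n=0}^{\infty} A_n, \\
  A_n &= \frac{1}{n!}\,\frac{d^n}{d\lambda^n}
         \Big[\, N\Big(\textstyle\sum_{k=0}^{\infty}\lambda^k u_k\Big) \Big]_{\lambda=0}, \\
  u_0 &= \Phi + L^{-1} g, \qquad
  u_{n+1} = -L^{-1}\big( R\,u_n + A_n \big), \quad n \ge 0,
\end{align}
% where \Phi collects the terms arising from the initial/boundary
% conditions, and the partial sums \sum_{n=0}^{N} u_n serve as the
% numerical iterates.
```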

Deep Learning Application for Object Image Recognition and Robot Automatic Grasping

Since vision systems are in strong demand for autonomous applications in industrial environments, image recognition has become an important research topic. Here, a deep learning algorithm is employed in the vision system to recognize industrial objects, and it is integrated with a 7A6 Series manipulator for automatic object gripping tasks. A PC and a graphics processing unit (GPU) are chosen to construct the 3D vision recognition system. A depth camera (Intel RealSense SR300) is employed to capture the images for object recognition and coordinate derivation. The YOLOv2 scheme is adopted as the convolutional neural network (CNN) structure for object classification and center-point prediction. Additionally, an image processing strategy is used to find the object contour for calculating the object orientation angle. The object location and orientation information are then sent to the robot controller. Finally, a six-axis manipulator can grasp the specified object in a random environment based on the user command and the extracted image information. The experimental results show that YOLOv2 has been successfully employed to detect the object location and category with confidence near 0.9 and a 3D position error of less than 0.4 mm. This is useful for future intelligent robotic applications in Industry 4.0 environments.
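To illustrate the coordinate-derivation step, the sketch below back-projects a detected pixel center and its depth reading into a 3D camera-frame point using the pinhole model. The intrinsics and detection values are placeholders, not the paper's calibration or pipeline.

```python
import numpy as np

def pixel_to_camera_xyz(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a detected object's pixel center (u, v) and its depth
    reading (metres) into a 3D point in the camera frame using the pinhole
    model. fx, fy, cx, cy are the camera intrinsics."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Placeholder intrinsics and detection values (not the paper's calibration):
fx, fy, cx, cy = 615.0, 615.0, 320.0, 240.0
u, v = 352, 261          # bounding-box center predicted by the detector
depth_m = 0.47           # depth camera reading at (u, v)

p_cam = pixel_to_camera_xyz(u, v, depth_m, fx, fy, cx, cy)
# A hand-eye calibration transform would then map p_cam into the manipulator
# base frame before sending the grasp target to the robot controller.
print(p_cam)
```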

Classifying Turbomachinery Blade Mode Shapes Using Artificial Neural Networks

Currently, extensive signal analysis is performed in order to evaluate the structural health of turbomachinery blades. This approach is constrained by time and the availability of qualified personnel. Thus, new approaches to blade dynamics identification that provide faster and more accurate results are sought. Generally, modal analysis is employed to acquire the dynamic properties of a vibrating turbomachinery blade and is widely adopted in blade condition monitoring. The analysis provides useful information on the different modes of vibration and their natural frequencies by exploring the different shapes that can be taken up during vibration, since each mode shape has a corresponding natural frequency. Experimental modal testing and finite element analysis are the traditional methods used to evaluate mode shapes, but they have limited applicability to real-life scenarios and thus to a robust condition monitoring scheme. Real-time mode shape evaluation requires rapid evaluation and low computational cost, for which traditional techniques are unsuitable. In this study, an artificial neural network is developed to evaluate the mode shapes of a lab-scale rotating blade assembly, using results from finite element modal analysis as training data. The network performance evaluation shows that the artificial neural network (ANN) is capable of mapping the correlation between natural frequencies and mode shapes. This is achieved without the need for extensive signal analysis. The approach offers the advantages that the network can classify mode shapes in real time, is simple to implement, and gives accurate predictions. The work paves the way for further development of a robust condition monitoring system that incorporates real-time mode shape evaluation.
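A minimal sketch of the idea, mapping natural-frequency features to mode-shape classes with a small multilayer perceptron: the synthetic frequency values, class labels and network size are illustrative assumptions, not the authors' finite element data or architecture.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Illustrative training set: each row holds natural frequencies (Hz) drawn
# around nominal values from a modal analysis; labels are mode-shape classes
# (e.g., 0 = first bending, 1 = second bending, 2 = first torsion).
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal([120, 740, 1310], 5, size=(100, 3)),   # class 0 samples
    rng.normal([135, 760, 1290], 5, size=(100, 3)),   # class 1 samples
    rng.normal([128, 705, 1350], 5, size=(100, 3)),   # class 2 samples
])
y = np.repeat([0, 1, 2], 100)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```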

Machine Learning Techniques in Bank Credit Analysis

The aim of this paper is to compare and discuss classifier algorithm options for credit risk assessment by applying different machine learning techniques. Using records from a Brazilian financial institution, this study uses a database of 5,432 companies that are clients of the bank, where 2,600 clients are classified as non-defaulters, 1,551 as defaulters and 1,281 as temporarily defaulters, meaning that the clients are overdue on their payments for up to 180 days. For each case, a total of 15 attributes were considered in a one-against-all assessment using four different techniques: Artificial Neural Networks Multilayer Perceptron (ANN-MLP), Artificial Neural Networks Radial Basis Functions (ANN-RBF), Logistic Regression (LR) and Support Vector Machines (SVM). The data were first coded using thermometer coding (for numerical attributes) or dummy coding (for nominal attributes). Different parameter settings were then evaluated for each method, and the best result of each technique was compared in terms of accuracy, false positives, false negatives, true positives and true negatives. This comparison showed that the best method, in terms of accuracy, was ANN-RBF (79.20% for non-defaulter classification, 97.74% for defaulters and 75.37% for temporarily defaulters). However, the best accuracy does not always indicate the best technique. For instance, on the classification of temporarily defaulters, this technique was surpassed, in terms of false positives, by SVM, which had the lowest false positive rate (0.07%). All these details are discussed in light of the results found, and an overview of what was presented is given in the conclusion of this study.
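As a minimal sketch of the one-against-all evaluation protocol, the code below trains an SVM and logistic regression on a stand-in feature matrix and reports accuracy and confusion matrices. The generated data merely mimics the shape of the problem (15 attributes, three classes); the bank's coded attributes are not public, and the ANN variants are omitted.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

# Stand-in for the 15 coded attributes and 3 classes
# (non-defaulter / defaulter / temporarily defaulter).
X, y = make_classification(n_samples=5432, n_features=15, n_informative=8,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for name, base in [("SVM (RBF)", SVC(kernel="rbf", gamma="scale")),
                   ("Logistic Regression", LogisticRegression(max_iter=1000))]:
    model = OneVsRestClassifier(base).fit(X_tr, y_tr)   # one-against-all
    y_hat = model.predict(X_te)
    print(name, "accuracy:", round(accuracy_score(y_te, y_hat), 4))
    print(confusion_matrix(y_te, y_hat))  # rows: true class, cols: predicted
```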

Determination of Optimal Stress Locations in 2D–9 Noded Element in Finite Element Technique

In the finite element technique, nodal stresses are calculated from the displacements at the nodes. In this process, the displacements calculated at the nodes are sufficiently accurate, but the stresses calculated at the nodes are not. The accuracy of stress computation in displacement-based FEM models is therefore a matter of concern, particularly for the computational cost of shape optimization in engineering problems. The present work focuses on finding unique points within the element, as well as on its boundary, at which good accuracy in stress computation can be achieved. Generally, the major optimal stress points are located in the interior of the element, but some points have also been located on the element boundary where the stresses are fairly accurate compared with the nodal values. It is concluded that unique points exist within the element where the stresses have higher accuracy than at other points. The main aim is therefore to develop a generalized procedure for determining the optimal stress locations inside the element as well as on its boundaries, and to verify this procedure against results from numerical experimentation. Results for quadratic 9-noded serendipity elements are presented, and the locations of the distinct optimal stress points are determined inside the element as well as on the boundaries. The theoretical results indicate that, in local coordinates, the optimal stress locations are at the origin and at a distance of 0.577 from the origin in both directions. On the boundaries, the optimal stress locations are at the midpoints of the element edges and at a distance of 0.577 from the origin in both directions. These findings were verified through numerical experimentation. For the numerical experiments, five engineering problems were identified, and the numerical results of the 9-noded element were compared with those obtained using 25-noded quadratic Lagrangian elements, which were taken as the reference. Root mean square errors were then plotted with respect to various locations within the elements as well as on the boundaries, and conclusions were drawn. The numerical verification shows that, in a 9-noded element, the origin and the locations at a distance of 0.577 from the origin in both directions are the best sampling points for the stresses. It was also noted that stresses calculated along the boundary segment enclosed by the points at 0.577 on either side of the edge midpoint are very good, with very small error. As sampling points move away from these locations, the error increases rapidly. Thus, it is established that there are unique points on the element boundary where the stresses are accurate, which can be utilized in solving various engineering problems and are also useful in shape optimization.
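For orientation, the 0.577 coordinate reported above coincides with the 2x2 Gauss-Legendre abscissa 1/sqrt(3); the note below restates the reported sampling locations in local coordinates using that standard quadrature fact, without reproducing the paper's derivation.

```latex
% The reported optimal sampling coordinate corresponds to the 2x2
% Gauss--Legendre abscissa in the element's local (\xi,\eta) system:
\[
  \xi,\ \eta \;=\; \pm\frac{1}{\sqrt{3}} \;\approx\; \pm 0.577,
\]
% so the interior sampling points reported above are
\[
  (\xi,\eta) \in \Big\{ (0,0),\ \big(\pm\tfrac{1}{\sqrt{3}},\,\pm\tfrac{1}{\sqrt{3}}\big) \Big\},
\]
% while the boundary points lie at the edge midpoints and at a local
% distance of 1/\sqrt{3} from the midpoint along each edge.
```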

Digital Library Evaluation by SWARA-WASPAS Method

Since the advent of the manuscript, mechanical methods for storing, transferring and using information have evolved over time into digital methods. In this process, libraries, as centers of information, have also become digitized and, by taking on a structure without physical boundaries, have become accessible from anywhere in the world at any time. In this context, certain criteria for the information obtained from digital libraries have become more important for users. This paper evaluates, from different perspectives, the user criteria that make a digital library more useful. The Step-Wise Weight Assessment Ratio Analysis-Weighted Aggregated Sum Product Assessment (SWARA-WASPAS) method is used for the evaluation of the digital library criteria owing to its flexibility and simple calculation steps. Three digital libraries are evaluated by information technology experts according to five conflicting main criteria: 'interface design', 'effects on users', 'services', 'user engagement' and 'context'. Finally, the alternatives are ranked in descending order.
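A minimal sketch of the WASPAS aggregation step (weighted sum and weighted product combined by a parameter lambda) is given below; the normalized decision matrix, the SWARA-derived weights and lambda = 0.5 are illustrative values, not the experts' actual scores.

```python
import numpy as np

def waspas_scores(norm_matrix, weights, lam=0.5):
    """WASPAS aggregation: Q_i = lam * WSM_i + (1 - lam) * WPM_i, where
    WSM_i = sum_j w_j * x_ij  and  WPM_i = prod_j x_ij ** w_j
    for an already-normalized decision matrix x (alternatives x criteria)."""
    x = np.asarray(norm_matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    wsm = (x * w).sum(axis=1)          # weighted sum model
    wpm = np.prod(x**w, axis=1)        # weighted product model
    return lam * wsm + (1 - lam) * wpm

# Illustrative: 3 digital libraries x 5 criteria, values normalized to [0, 1];
# the weights would come from the SWARA step and sum to 1.
scores = waspas_scores(
    norm_matrix=[[0.9, 0.7, 0.8, 0.6, 0.7],
                 [0.8, 0.9, 0.7, 0.8, 0.6],
                 [0.7, 0.6, 0.9, 0.7, 0.8]],
    weights=[0.30, 0.25, 0.20, 0.15, 0.10])
print(np.argsort(scores)[::-1])  # ranking of alternatives, best first
```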

Designing of a Non-Zero Dispersion Shifted Fiber with Ultra-High Birefringence and High Non-Linearity

Photonic crystal fiber (PCF) is no longer limited to telecommunications; it is now used in many fiber-optic sensing applications, medical science, space applications and more. In this paper, the authors propose a microstructured PCF designed using finite element method (FEM) based software. Besides the design, the authors discuss which characteristics are necessary for particular applications, because it is not possible to obtain all desirable characteristics from a single PCF. The proposed PCF exhibits ultra-high birefringence (0.0262 at 1550 nm), which is particularly useful for fiber-optic sensors. The nonlinearity of this fiber is 50.86 W⁻¹km⁻¹ at a wavelength of 1550 nm, which is high enough to guide the light tightly through the core. For the perfectly matched layer (PML) boundary, a diameter of 0.6 μm is taken. The design offers the characteristics of a non-zero dispersion-shifted fiber (NZ-DSF) over a 450 nm waveband. Since this is a software-based design and no practical evaluation has been made, a 2% fabrication tolerance was checked, and the authors found only very small variations in the characteristics.
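For context, the quantities quoted above are normally extracted from FEM mode solutions with the standard definitions below; the symbols follow common usage and are not taken verbatim from the paper.

```latex
% Standard definitions commonly used with FEM mode solvers (generic
% symbols, not necessarily the paper's notation):
\[
  B \;=\; \big|\, n_{\mathrm{eff}}^{x} - n_{\mathrm{eff}}^{y} \,\big|,
  \qquad
  \gamma \;=\; \frac{2\pi\, n_2}{\lambda\, A_{\mathrm{eff}}},
  \qquad
  A_{\mathrm{eff}} \;=\;
  \frac{\Big( \iint |E|^{2}\, dx\, dy \Big)^{2}}
       {\iint |E|^{4}\, dx\, dy},
\]
% where n_eff^x and n_eff^y are the effective indices of the two orthogonal
% polarization modes, n_2 is the nonlinear refractive index of the material,
% and A_eff is the effective mode area at the operating wavelength \lambda.
```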

Tibyan Automated Arabic Correction Using Machine-Learning in Detecting Syntactical Mistakes

The Arabic language is one of the most important languages. Learning it matters to many people around the world because of its religious and economic significance, and the real challenge lies in using it without grammatical or syntactical mistakes. This research focuses on detecting and correcting syntactic mistakes in Arabic according to their position in the sentence, concentrating on two of the main syntactical rules in Arabic: dual and plural. Each sentence in the text is analyzed using the Stanford CoreNLP morphological analyzer and a machine-learning approach in order to detect syntactical mistakes and then correct them. A prototype of the proposed system was implemented and evaluated. It uses a support vector machine (SVM) algorithm to detect Arabic grammatical errors and corrects them using a rule-based approach. The prototype system achieves an accuracy of 81%. In general, it offers a set of useful grammatical suggestions for mistakes the user may overlook while writing, whether due to a lack of familiarity with grammar or the speed of writing, such as alerting the user when a plural term is used to refer to one person.
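To make the detection step concrete, here is a minimal sketch of an SVM error detector over simple number-agreement features; the features, labels and toy training examples are hypothetical and do not reproduce the Tibyan feature set or its Stanford CoreNLP pipeline.

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Hypothetical training examples: morphological features of a target token
# and its syntactic head, labeled 1 if it violates dual/plural agreement,
# else 0. In the actual system such features would come from the Stanford
# CoreNLP morphological analysis of each Arabic sentence.
train_feats = [
    {"pos": "NOUN", "number": "dual",   "head_number": "dual"},
    {"pos": "NOUN", "number": "plural", "head_number": "singular"},
    {"pos": "ADJ",  "number": "plural", "head_number": "plural"},
    {"pos": "ADJ",  "number": "dual",   "head_number": "plural"},
]
train_labels = [0, 1, 0, 1]

model = make_pipeline(DictVectorizer(), LinearSVC())
model.fit(train_feats, train_labels)

# Flag a new token whose number disagrees with its syntactic head.
print(model.predict([{"pos": "NOUN", "number": "plural",
                      "head_number": "singular"}]))  # -> [1] (error detected)
```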

Mechanical Properties of Organic Polymer and Exfoliated Graphite Reinforced Bacteria Cellulose Paper

Bacterial cellulose (BC) is a structural organic compound produced in an anaerobic process. This material can be a useful eco-friendly substitute for the commercial textiles used in industry today. BC is easily and sustainably produced and has the potential to be used as a textile replacement. However, BC is extremely fragile when it dries completely. This research was conducted to improve the mechanical properties of BC by reinforcing it with an organic polymer and exfoliated graphite (EG). The BC films were grown over a period of weeks in a green tea and kombucha solution at 30 °C, then cleaned and added to an enhancing solution. The enhancing solutions were a mixture of 2.5 wt% polymer and 2.5 wt% latex, a 5 wt% polymer solution, and a 0.20 wt% graphite solution; each was kept in a furnace for 48 h at 50 °C. Tensile test samples were prepared and tested until fracture at a strain rate of 8 mm/min. With the addition of the 5 wt% polymer solution, the flexibility of the BC improved significantly, with a maximum strain substantially larger than that of the base sample. The addition of EG also increased the modulus of elasticity of the BC by about 25%.
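For readers unfamiliar with how the quoted quantities are obtained, the sketch below shows how the modulus of elasticity and maximum strain could be extracted from raw tensile data; the specimen dimensions and load-extension values are invented for illustration only.

```python
import numpy as np

# Invented raw tensile data for one BC film specimen (illustration only).
force_N = np.array([0.0, 1.2, 2.4, 3.5, 4.4, 5.0])       # load cell readings
extension_mm = np.array([0.0, 0.4, 0.8, 1.2, 1.7, 2.3])  # crosshead extension
width_mm, thickness_mm, gauge_mm = 10.0, 0.05, 25.0       # specimen geometry

area_mm2 = width_mm * thickness_mm
stress_MPa = force_N / area_mm2           # N/mm^2 == MPa
strain = extension_mm / gauge_mm

# Modulus of elasticity: slope of the initial linear region (first 4 points).
E_MPa = np.polyfit(strain[:4], stress_MPa[:4], 1)[0]
print("E =", round(E_MPa, 1), "MPa, max strain =", round(strain.max(), 3))
```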

Multi-Objective Optimal Design of a Cascade Control System for a Class of Underactuated Mechanical Systems

This paper presents a multi-objective optimal design of a cascade control system for an underactuated mechanical system. Cascade control structures usually include two control algorithms (inner and outer). To design such a control system properly, the following conflicting objectives should be considered at the same time: 1) the inner closed-loop control must be faster than the outer one, 2) the inner loop should quickly reject any disturbance and prevent it from propagating to the outer loop, 3) the controlled system should be insensitive to measurement noise, and 4) the controlled system should be driven by optimal energy. Such a control problem can be formulated as a multi-objective optimization problem in which the optimal trade-offs among these design goals are found. To the authors' best knowledge, this problem has not been studied in a multi-objective setting so far. In this work, an underactuated mechanical system consisting of a rotary servo motor and a ball and beam is used for the computer simulations, the parameters of the inner and outer control systems are tuned by NSGA-II (Non-dominated Sorting Genetic Algorithm II), and the dominancy concept is used to find the optimal design points. The solution of this problem is not a single optimal cascade controller, but rather a set of optimal cascade controllers (called the Pareto set), which represents the optimal trade-offs among the selected design criteria; the corresponding objective function values form the Pareto front. The solution set is presented to the decision-maker, who can choose any point to implement. The simulation results, in terms of the Pareto front and time responses to external signals, show the competing nature of the design objectives. The presented study may become the basis for the multi-objective optimal design of multi-loop control systems.
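A minimal sketch of the dominancy test used to extract a Pareto set from candidate controller tunings is given below; the objective vectors (settling time, disturbance rejection, noise sensitivity, control energy) are placeholder numbers, all assumed to be minimized.

```python
import numpy as np

def pareto_front(objs):
    """Return a boolean mask of non-dominated rows in an (n_points, n_objs)
    array where every objective is to be minimized. A point is dominated if
    another point is no worse in all objectives and strictly better in one."""
    objs = np.asarray(objs, dtype=float)
    n = objs.shape[0]
    nondominated = np.ones(n, dtype=bool)
    for i in range(n):
        for j in range(n):
            if i != j and np.all(objs[j] <= objs[i]) and np.any(objs[j] < objs[i]):
                nondominated[i] = False
                break
    return nondominated

# Placeholder objective vectors for candidate cascade-controller tunings:
# [outer settling time, inner disturbance-rejection time, noise sensitivity, control energy]
candidates = np.array([[2.1, 0.30, 0.12, 5.4],
                       [1.8, 0.35, 0.20, 6.1],
                       [2.5, 0.28, 0.10, 5.0],
                       [2.0, 0.40, 0.25, 7.2]])
print(pareto_front(candidates))  # True rows form the (sampled) Pareto set
```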

Blended Learning through Google Classroom

This paper argues that good learning involves all academic groups in the school. Blended learning extends learning beyond the classroom. Google Classroom is a free learning service app for schools, non-profit organizations and anyone with a personal Google account. Its facilities, accessed through computers and mobile phones, are very useful for school teachers and students. Blended learning classrooms, which use both traditional and technology-based teaching methods, have become the norm for many educators. Using Google Classroom gives students access to online learning. Even when the teacher is not in the classroom, instruction can still be provided. This gives the teacher a form of supervision over students when they are outside the school.

The Strategic Engine Model: Redefined Strategy Structure, as per Market-and Resource-Based Theory Application, Tested in the Automotive Industry

The purpose of the paper is to redefine the levels of strategy structure (corporate, business and functional) established over the past several decades into a conceptual model consisting of corporate, business and operations strategies, reinforced by functional strategies. We propose a conceptual framework that gives strategic operations a separate strategic place and repositions the remaining functional strategies as supporting tools existing at all three levels. The proposed model is called 'the strategic engine', since the mutual relationships of its components mirror the main elements and working principle of the internal combustion engine. Drawing on the theory related to each strategic level, we show that the strategic engine model is useful for managers seeking to safeguard the competitive advantage of their companies. Each strategy level is examined through its basic elements. At the corporate level, we examine the scope of the firm's products and its vertical and geographical coverage. At the business level, the point of interest is limited to the basic elements of a SWOT analysis. At the operations level, the key research issue relates to the scope of the following performance indicators: cost, quality, speed, flexibility and dependability. In this respect, the paper provides a different view of the role of operations strategy within the overall strategy concept. We show that the theoretical essence of operations goes far beyond the scope of the traditionally accepted business functions. Exploring the applications of resource-based theory and market-based theory within the strategic levels framework, we show that there is a logical sequence to their theoretical impact on corporate, business and operations strategy: at every strategic level, the validity of one theory gives way to that of the other. The practical application of the conceptual model is tested in the automotive industry. The proposed theoretical concept is inspired by a leading global automotive group, Inchcape PLC, which is listed on the London Stock Exchange and is a constituent of the FTSE 250 Index.

The Potential Involvement of Platelet Indices in Insulin Resistance in Morbid Obese Children

The association between insulin resistance (IR) and hematological parameters has long been a matter of interest. Within this context, body mass index (BMI), red blood cells, white blood cells and platelets have been part of this discussion. Platelet-related parameters associated with IR may be useful indicators for the identification of IR. Platelet indices such as mean platelet volume (MPV), platelet distribution width (PDW) and plateletcrit (PCT) are being questioned for their possible association with IR. The aim of this study was to investigate the association between platelet (PLT) count as well as PLT indices and the surrogate indices used to determine IR in morbid obese (MO) children. A total of 167 children participated in the study, divided into three groups: 34, 97 and 36 children in the normal BMI (N-BMI), MO and metabolic syndrome (MetS) groups, respectively. Sex- and age-dependent BMI-based percentile tables prepared by the World Health Organization were used for the definition of morbid obesity, and MetS criteria were determined. BMI values, homeostatic model assessment for IR (HOMA-IR), alanine transaminase-to-aspartate transaminase ratio (ALT/AST) and diagnostic obesity notation model assessment laboratory (DONMA-lab) index values were computed. PLT count and indices were analyzed using an automated hematology analyzer. Data were collected for statistical analysis using SPSS for Windows. Arithmetic means and standard deviations were calculated. Mean values of PLT-related parameters in the control and study groups were compared by one-way ANOVA followed by Tukey post hoc tests to determine whether a significant difference exists among the groups, and correlation analyses between PLT parameters and IR indices were performed. Statistical significance was accepted at p < 0.05. Increased values were detected for PLT (p < 0.01) and PCT (p > 0.05) in the MO group compared with those observed in children with N-BMI. Significant increases in PLT (p < 0.01) and PCT (p < 0.05) were observed in the MetS group in comparison with the values obtained in children with N-BMI. Significantly lower MPV and PDW values were obtained in the MO group compared with the control group (p < 0.01). HOMA-IR (p < 0.05), DONMA-lab index (p < 0.001) and ALT/AST (p < 0.001) values in the MO and MetS groups were significantly increased compared with the N-BMI group. In addition, DONMA-lab index values differed between the MO and MetS groups (p < 0.001). In the MO group, PLT was negatively correlated with MPV and PDW values; these correlations were not observed in the N-BMI group. None of the IR indices correlated with PLT or PLT indices in the N-BMI group, whereas HOMA-IR showed significant correlations with both PLT and PCT in the MO group. All three IR indices were well correlated with each other in all groups. These findings point to a missing link between IR and PLT activation. In conclusion, PLT and PCT may be related to IR, in addition to their roles as hemostasis markers, during morbid obesity. Our findings suggest that the DONMA-lab index appears to be the best surrogate marker for IR because of its ability to discriminate between morbid obesity and MetS.
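A minimal sketch of the one-way ANOVA with Tukey post hoc comparison described above, reproduced in Python on synthetic platelet counts rather than SPSS and the study data; the group sizes match the abstract, but the values are placeholders.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Synthetic platelet counts (10^3/uL) for the three groups; the real analysis
# used the study data in SPSS, so these numbers are placeholders.
rng = np.random.default_rng(1)
plt_nbmi = rng.normal(300, 60, 34)
plt_mo   = rng.normal(345, 65, 97)
plt_mets = rng.normal(350, 70, 36)

f_stat, p_value = f_oneway(plt_nbmi, plt_mo, plt_mets)
print("one-way ANOVA: F =", round(f_stat, 2), ", p =", round(p_value, 4))

values = np.concatenate([plt_nbmi, plt_mo, plt_mets])
groups = ["N-BMI"] * 34 + ["MO"] * 97 + ["MetS"] * 36
print(pairwise_tukeyhsd(values, groups, alpha=0.05))  # pairwise group comparisons
```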

Assessment of Obesity Parameters in Terms of Metabolic Age above and below Chronological Age in Adults

The chronological age (CA) of individuals is closely related to obesity and generally affects the magnitude of obesity parameters. The close association between basal metabolic rate (BMR) and metabolic age (MA) is also a matter of concern. It has been suggested that an MA higher than the CA indicates a need to improve the metabolic rate. In this study, the aim was to assess some commonly used obesity parameters, such as obesity degree, visceral adiposity, BMR and the BMR-to-weight ratio, in several groups with varying differences between MA and CA values. The study comprises adults whose ages range from 18 to 79 years. Four groups were constituted: Groups 1, 2, 3 and 4 were composed of 55, 33, 76 and 47 adults, respectively. Individuals with MA-CA values of -1, 0 or +1 were placed in Group 1, which was considered the control group. Those with MA-CA values between -5 and -10 formed Group 2. Those whose MAs were above their real ages were divided into two groups [Group 3 (MA-CA from +5 to +10) and Group 4 (MA-CA from +11 to +12)]. Body mass index (BMI) values were calculated. A TANITA body composition monitor using bioelectrical impedance analysis technology was used to obtain values for obesity degree, visceral adiposity, BMR and the BMR-to-weight ratio. The compiled data were evaluated statistically using the SPSS statistical package. Mean ± SD values were determined and correlation analyses were performed. Statistical significance was accepted at p < 0.05. The increase in BMR was positively correlated with obesity degree. The MAs and CAs of the groups were 39.9 ± 16.8 vs 39.9 ± 16.7 years for Group 1, 45.0 ± 15.3 vs 51.4 ± 15.7 years for Group 2, 47.2 ± 12.7 vs 40.0 ± 12.7 years for Group 3, and 53.6 ± 14.8 vs 42 ± 14.8 years for Group 4. The BMI values of the groups were 24.3 ± 3.6 kg/m2, 23.2 ± 1.7 kg/m2, 30.3 ± 3.8 kg/m2, and 40.1 ± 5.1 kg/m2 for Groups 1, 2, 3 and 4, respectively. The BMR values were 1599 ± 328 kcal in Group 1, 1463 ± 198 kcal in Group 2, 1652 ± 350 kcal in Group 3, and 1890 ± 360 kcal in Group 4. A correlation was observed between BMR and MA-CA values in Group 1, but no such correlation was detected in the other groups. On the other hand, statistically significant correlations between MA-CA values and obesity degree, BMI and the BMR-to-weight ratio were found in Groups 3 and 4. It was concluded that, when these findings are considered in terms of MA-CA values, the BMR-to-weight ratio is a much more useful indicator of the severe increase in obesity than BMR. The lack of association between MA and BMR, as well as the BMR-to-weight ratio, also emphasizes the importance of considering MA-CA values rather than MA alone.
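As a brief sketch of the correlation step, the code below checks the MA-CA difference against the BMR-to-weight ratio with Pearson's r on invented paired values; the study itself used SPSS on the measured group data.

```python
from scipy.stats import pearsonr

# Invented paired observations for one group: MA-CA difference (years) and
# BMR-to-weight ratio (kcal/kg); placeholders, not the study's measurements.
ma_minus_ca = [5, 6, 7, 8, 8, 9, 10, 10, 6, 7]
bmr_per_kg  = [22.1, 21.4, 20.9, 20.2, 20.5, 19.8, 19.1, 19.3, 21.0, 20.6]

r, p = pearsonr(ma_minus_ca, bmr_per_kg)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")
```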

Automated Monitoring System to Support Investigation of Contributing Factors of Work-Related Disorders and Accidents

Work-related illnesses and disorders have been a constant aspect of work. Although their nature has changed over time, from musculoskeletal disorders to illnesses related to the psychosocial aspects of work, their impact on the lives of workers remains significant. Despite significant efforts worldwide to protect workers, the disparity between changes in work legislation and the actual benefit for workers' health has been creating a significant economic burden for social security and health systems around the world. In this context, this study aims to propose, test and validate a modular prototype that allows work environment conditions to be assessed, monitored and better controlled. A main focus is to provide a historical record of working conditions and the means for workers to obtain comprehensible and useful information regarding their work environment and the legal limits of occupational exposure to different types of environmental variables, as a means to improve the prevention of work-related accidents and disorders. We show that the developed prototype provides useful and accurate information regarding work environment conditions, validated against standard occupational hygiene equipment. We believe the proposed prototype is a cost-effective and adequate approach to work environment monitoring that could help elucidate the links between work and occupational illnesses, and that different industry sectors, as well as developing countries, could benefit from its capabilities.

Daily Site Risks Associated with Construction Projects and On-spot Corrective Measurements: Case Study of Revamping Projects in Kuwait Oil Company Fields Area

The growth and expansion of industrial facilities is proportional to the market's increasing demand for products and services. Furthermore, raw material producers such as oil companies usually undergo massive revamping projects to maintain a synchronized supply. These revamping projects are usually delivered through challenging construction projects that carry daily site risks related to the construction process. Hence, a case study of these risks and the corresponding on-spot corrective measures was conducted with a number of construction project contractors at Kuwait Oil Company (KOC) to determine the benefits and overall effectiveness of on-spot corrective measures during the construction phase of a project, and how they help avoid major incidents and ensure smooth, cost-effective and on-time delivery of the project. The findings of this case study add value to the overall risk management process by minimizing the daily site risks that may affect the project lead time, supporting an undisturbed on-site construction process.