A Review on Building Information Modelling in Nigeria and Its Potentials

The construction industry has been evolving since the development of Building Information Modelling (BIM). This technological process is unstoppable; it has reached the market with remarkable case studies addressing the industry's long history of fragmentation. The industry has been changing over time: the United States has recorded the most significant development in construction digitalization, while Australia, the United Kingdom and other developed nations are also among the promoters of the BIM process and its development. Recently, developing countries such as China and Malaysia have been keying into the industry's digital shift, while very little movement is seen in South Africa, whose development is considered higher and which is perhaps the leader in the digital transition among African countries. To the authors' best knowledge, the Nigerian construction industry has never engaged in BIM discussions, and the topic has therefore received no attention at the national level. Consequently, Nigeria has no "Noteworthy BIM publications." Decision makers and key stakeholders need to be informed about the current trend of the industry's development (BIM specifically) and the opportunities of adopting this digitalization trend in relation to the identified challenges. The BIM concept can be traced more in architectural than in engineering practices in Nigeria. Superficial BIM practice is found only at the organisational level, operating at the model-based "BIM Stage 1." Research on adopting this innovation has received very little attention. This work is a literature-based review aimed at exploring BIM in Nigeria and its prospects. The exploration reveals limited literature and a lack of extensive research on the development of BIM in the country. Numerous challenges were identified, including building collapses, inefficiency, cost overruns and late project delivery. BIM has the potential to overcome these challenges and more. A low level of BIM adoption, alongside a reasonable level of awareness, is observed. However, the lack of policies and guidelines, as well as a serious shortage of experts in the field, are among the major barriers to BIM adoption. The industry needs to embrace BIM to compete with its global counterparts.

Development of Affordable and Reliable Diagnostic Tools to Record Vital Parameters for Improving Health Care in Low-Resource Settings

In most developing countries, the vast majority of people live in rural areas, yet qualified medical doctors are not available there. Health care workers and paramedics, known as village doctors or informal healthcare providers, are largely responsible for rural medical care. Mishaps due to wrong diagnosis and inappropriate medication have been causing serious suffering that is preventable. While innovators have created many devices, the vast majority of these technologies do not find application in addressing the needs and conditions of low-resource settings. The primary motive of this work is to address the acute lack of affordable medical technologies for poor people in low-resource settings. A low-cost smart medical device that is portable, battery operated and usable at any point of care has been developed to measure breathing rate, electrocardiogram (ECG) and arterial pulse rate, to improve the diagnosis and monitoring of patients and thus improve care and safety. This simple and easy-to-use smart medical device can be used, managed and maintained effectively and safely by any health worker with some training. To empower health workers and village doctors, the device is being further developed to integrate with ICT tools such as smartphones and to connect to medical experts, wherever available, for managing serious health problems.
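As an illustration of the kind of signal processing such a device relies on, the minimal sketch below estimates pulse rate from a sampled arterial pulse waveform by simple peak detection. The sampling rate, thresholds and synthetic signal are assumptions for the example and do not describe the device's actual firmware.

```python
# Illustrative sketch: estimating pulse rate from a sampled arterial pulse
# waveform via simple peak detection. Signal values and sampling rate are
# hypothetical; the actual device firmware is not described in the abstract.
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                     # sampling frequency in Hz (assumed)
t = np.arange(0, 30, 1 / fs)   # 30 s of data
# Synthetic pulse-like waveform at ~72 beats per minute plus noise
signal = np.sin(2 * np.pi * 1.2 * t) ** 3 + 0.05 * np.random.randn(t.size)

# Require peaks to be at least 0.4 s apart (i.e. below 150 bpm)
peaks, _ = find_peaks(signal, height=0.5, distance=int(0.4 * fs))

duration_min = (t[-1] - t[0]) / 60.0
pulse_rate_bpm = len(peaks) / duration_min
print(f"Estimated pulse rate: {pulse_rate_bpm:.0f} bpm")
```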

Quantification of Aerodynamic Variables Using Analytical Technique and Computational Fluid Dynamics

Aerodynamic stability coefficients must be known before any unmanned aircraft flight is performed. This requires expertise in aerodynamics and in the stability and control of the aircraft. Efficacious aircraft performance requires that the flight path and aerodynamics be well defined beforehand. This paper presents a study on the aerodynamics of an unmanned aerial vehicle (UAV) under flight conditions. The current research comprises comparative studies of different flight aerodynamic parameters estimated using two open-source analytical software programs, DATCOM and XFLR5, which help in predicting the flight aerodynamic variables. Computational fluid dynamics (CFD) analysis was also performed, using Star-CCM+. The output trends of the study demonstrate close agreement between the two analytical programs and the CFD results. The lift coefficient (CL) obtained from DATCOM and XFLR5 is similar to the CL from the CFD simulation. In a similar manner, the other aerodynamic stability parameters obtained from the analytical software are in good agreement with CFD.
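For reference, the lift coefficient compared across DATCOM, XFLR5 and the CFD solution is the standard non-dimensional quantity

$$ C_L = \frac{L}{\tfrac{1}{2}\,\rho\,V^{2}\,S}, $$

where $L$ is the lift force, $\rho$ the freestream density, $V$ the freestream velocity and $S$ the reference wing area. The analytical tools estimate $C_L$ and the stability derivatives from geometry and flight condition, while the CFD solution obtains the forces by integrating surface pressure and shear, which is why close agreement in $C_L$ is a meaningful cross-check.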

Welding Process Selection for Storage Tank by Integrated Data Envelopment Analysis and Fuzzy Credibility Constrained Programming Approach

Selecting the most suitable welding process usually depends on experience or on common application in similar companies. However, this approach generally ignores many criteria that can affect the selection of a suitable welding process. Therefore, knowledge automation through knowledge-based systems will significantly improve the decision-making process. This research proposes an integrated data envelopment analysis (DEA) and fuzzy credibility constrained programming approach for identifying the best welding process for a stainless steel storage tank in the food and beverage industry. The proposed approach uses fuzzy concepts and a credibility measure to deal with uncertain data from expert judgment. Furthermore, 12 parameters are used to determine the most appropriate process among six competing welding processes.
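For context, the deterministic core of a DEA evaluation is the classic CCR multiplier model sketched below, with each welding process treated as a decision-making unit; the proposed approach additionally replaces the crisp inputs and outputs with fuzzy data handled through credibility-constrained programming, which is not reproduced in this sketch.

$$
\begin{aligned}
\max_{u,v}\quad & \theta_0 = \sum_{r=1}^{s} u_r\, y_{r0} \\
\text{s.t.}\quad & \sum_{i=1}^{m} v_i\, x_{i0} = 1, \\
& \sum_{r=1}^{s} u_r\, y_{rj} - \sum_{i=1}^{m} v_i\, x_{ij} \le 0, \qquad j = 1,\dots,n, \\
& u_r \ge 0,\; v_i \ge 0,
\end{aligned}
$$

where $x_{ij}$ and $y_{rj}$ denote the input and output parameters of welding process $j$ (for example, cost and defect-related measures as inputs and quality-related measures as outputs, assumed here for illustration), and process $0$ is the one under evaluation; a process is rated efficient if $\theta_0 = 1$.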

The Emerging Global Judicial Ethics: Issues and Problems

In many states around the world, actions to improve judicial ethics are developing significantly through the production of professional standards for judges. The quest to improve the ethics of judges is legitimate. However, as this development has become very prominent at the moment, some of the risks it presents must be highlighted. Indeed, if the objective of improving judges' ethics is legitimate, it can also lead to the banalization of justice, reinforce criticism of the judiciary and incidentally raise the question of the limits of judging, which is most perilous for the independence of the judiciary. This research, based on case studies, interviews with judges and an analysis of the literature on this topic (mainly from the United States of America and European Union Member States), draws attention to the fact that, as these professional standards develop, the ethical requirements of judges become ethical requirements of justice, an undesirable effect of which we must be aware in order to prevent it.

Tools and Techniques in Risk Assessment in Public Risk Management Organisations

Risk assessment, and the knowledge provided through this process, is a crucial part of any decision-making process in the management of risks and uncertainties. Failure in the assessment of risks can cause inadequacy in the entire process of risk management, which in turn can lead to failure in achieving organisational objectives as well as significant damaging consequences for the populations affected by the potential risks being assessed. The choice of tools and techniques in risk assessment can influence the degree and scope of decision-making and subsequently the risk response strategy. There are various qualitative and quantitative tools and techniques that are deployed within the broad process of risk assessment. The sheer diversity of tools and techniques available to practitioners makes it difficult for organisations to consistently employ the most appropriate methods. This adaptation of tools and techniques is rendered more difficult in public risk regulation organisations due to the sensitive and complex nature of their activities. This is particularly the case in areas relating to the environment, food, and human health and safety, where organisational goals are tied up with societal, political and individual goals at national and international levels. Hence, this study set out to recognise, analyse and evaluate the different decision-support tools and techniques employed in assessing risks in public risk management organisations. The research is part of a mixed-methods study which aimed to examine the perception of risk assessment and the extent to which organisations practise risk assessment tools and techniques. The study adopted a semi-structured questionnaire with qualitative and quantitative data analysis to include a range of public risk regulation organisations from the UK, Germany, France, Belgium and the Netherlands. The results indicated that public risk management organisations mainly use diverse tools and techniques in the risk assessment process. Primary hazard analysis, brainstorming, and hazard analysis and critical control points were described as the most practised risk identification techniques. Within qualitative and quantitative risk analysis, the participants named expert judgement, risk probability and impact assessment, sensitivity analysis, and data gathering and representation as the most practised techniques.
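As a generic illustration of one of the techniques named by participants, the sketch below scores risks with a simple probability-and-impact matrix. The scales, banding rule and example risks are hypothetical and are not drawn from the surveyed organisations.

```python
# Generic illustration of a risk probability-and-impact assessment, one of the
# techniques named by participants. Scales, bands and the example risks are
# hypothetical, not drawn from the surveyed organisations.
probability_scale = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost certain": 5}
impact_scale = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "severe": 5}

risks = [
    {"name": "contaminated food batch", "probability": "possible", "impact": "major"},
    {"name": "data-entry error in lab results", "probability": "likely", "impact": "minor"},
]

for risk in risks:
    score = probability_scale[risk["probability"]] * impact_scale[risk["impact"]]
    # A simple banding rule: scores above 12 demand an active risk response.
    band = "high" if score > 12 else "medium" if score > 6 else "low"
    print(f'{risk["name"]}: score {score} ({band})')
```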

Multimodal Database of Emotional Speech, Video and Gestures

People express emotions through different modalities. Integrating verbal and non-verbal communication channels creates a system in which the message is easier to understand. Expanding the focus to several expression forms can facilitate research on emotion recognition as well as on human-machine interaction. In this article, the authors present a Polish emotional database composed of three modalities: facial expressions, body movement and gestures, and speech. The corpus contains recordings registered in studio conditions, acted out by 16 professional actors (8 male and 8 female). The data are labeled with the six basic emotion categories defined by Ekman. To check the quality of the performances, all recordings were evaluated by experts and volunteers. The database is available to the academic community and may be useful in studies on audio-visual emotion recognition.
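One way such a perception evaluation can be summarised is sketched below: the per-emotion recognition rate computed from rater judgements. The file name and the columns "intended" and "perceived" are hypothetical placeholders; the released database may organise its evaluation metadata differently.

```python
# Illustrative sketch of summarising the perception evaluation: per-emotion
# recognition rate from rater judgements. The file name and column names
# ("intended", "perceived") are hypothetical placeholders.
import pandas as pd

ratings = pd.read_csv("evaluation_ratings.csv")  # one row per rater judgement
ratings["correct"] = ratings["intended"] == ratings["perceived"]

# Fraction of judgements matching the acted emotion, per Ekman category
recognition_rate = ratings.groupby("intended")["correct"].mean().sort_values(ascending=False)
print(recognition_rate)
```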

Utilizing the Analytic Hierarchy Process in Improving Performances of Blind Judo

Identifying, structuring, and ranking the most important factors related to improving athletes' performance could pave the way for an improved training system. The purpose of this study was to identify the relative importance of factors for improving the performance of judo athletes with visual impairments, including blindness, by using the Analytic Hierarchy Process (AHP). After reviewing the literature, factors affecting performance in blind judo were selected. A group of experts reviewed the first draft of the questionnaire, and the finally selected performance factors were classified into the major categories of technique, physical fitness, and psychology. Later, a pre-selected group of experts was asked to review the final version of the questionnaire and confirm the priorities of the performance factors. The order of priority was determined by performing pairwise comparisons using Expert Choice 2000. Results indicated that "grappling" (.303) and "throwing" (.234) were the most important lower-hierarchy factors for blind judo skills. In addition, the most important physical factor affecting performance was "muscular strength and endurance" (.238). Further, among the psychological factors, "competitive anxiety" (.393) was an important factor affecting performance. It is important to offer psychological skills training to reduce the anxiety of judo athletes with visual impairments and blindness so that they can compete in their optimal state. These findings offer insights into what should be considered when determining factors to improve the performance of judo athletes with visual impairments and blindness.
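The sketch below shows the core AHP computation behind such priorities: deriving weights from a pairwise comparison matrix with the geometric-mean approximation and checking consistency. The 3x3 matrix of judgements is hypothetical; the study itself used Expert Choice 2000 on the experts' pairwise comparisons.

```python
# Minimal AHP sketch: deriving priority weights from a pairwise comparison
# matrix with the geometric-mean approximation and checking consistency.
# The 3x3 matrix (technique vs. physical fitness vs. psychology) is
# hypothetical; the study used Expert Choice 2000 on the experts' judgements.
import numpy as np

A = np.array([
    [1.0, 3.0, 2.0],     # technique vs. others (Saaty's 1-9 scale)
    [1/3, 1.0, 1/2],
    [1/2, 2.0, 1.0],
])

# Geometric mean of each row, normalised, approximates the priority vector.
weights = np.prod(A, axis=1) ** (1 / A.shape[0])
weights /= weights.sum()

# Consistency ratio: lambda_max from A @ w; random index RI = 0.58 for n = 3.
lambda_max = float(np.mean((A @ weights) / weights))
ci = (lambda_max - A.shape[0]) / (A.shape[0] - 1)
cr = ci / 0.58
print("priorities:", np.round(weights, 3), " CR:", round(cr, 3))  # CR < 0.1 is acceptable
```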

Optimizing Usability Testing with Collaborative Method in an E-Commerce Ecosystem

Usability testing (UT) is one of the vital steps in the user-centred design (UCD) process when designing a product. In an e-commerce ecosystem, UT becomes primary, as new products, features, and services are launched very frequently, and the company incurs losses if an unusable, inefficient product is put on the market and rejected by customers. This paper tries to answer why UT is important in the product life-cycle of an e-commerce ecosystem. Secondary user research was conducted to find out the work patterns, development methods, types of stakeholders, and technology constraints of a typical e-commerce company. Qualitative user interviews were conducted with product managers and designers to understand the structure, project planning, product management method and role of the design team in a mid-level company. The paper addresses the usual apprehensions of such companies about inculcating UT within the team, and identifies factors such as limited monetary resources, the lack of a usability expert, narrow timelines, and limited understanding from higher management as some of the primary reasons. Outsourcing UT to vendors is also very prevalent among mid-level e-commerce companies, but it has its own severe repercussions, such as very little team involvement, huge cost, misinterpretation of the findings, elongated timelines, and a lack of empathy towards the customer. The shortfalls of having no UT process in place within the team, or of conducting UT only through vendors, are bad user experiences for customers interacting with the product and badly designed products that are neither useful nor utilitarian. As a result, companies see dipping conversion rates in apps and websites, huge bounce rates and increased uninstall rates. Thus, there was a need for a leaner UT system in place which could solve all these issues for the company. This paper focuses on optimizing the UT process with a collaborative method; the degree of optimization and the structure of the collaborative method are its highlights. The collaborative method of UT is one in which the centralised design team of the company takes charge of conducting and analysing the UT. The UT is usually formative, with designers taking the findings into account and using them in the ideation process. The success of the collaborative method of UT is due to its ability to sync with the product management method employed by the company or team. The collaborative method focuses on engaging various teams (design, marketing, product, administration, IT, etc.), each with its own defined roles and responsibilities, in conducting a smooth in-house UT with users. The paper finally highlights the positive results of the collaborative UT method after conducting more than 100 in-lab interviews with users across the different lines of business. These include improved interaction between stakeholders and the design team, empathy towards users, improved design iteration, better sanity checks of design solutions, optimization of time and money, and effective and efficient design solutions. The future scope of collaborative UT is to make this method leaner by reducing the number of days needed to complete the entire project, from planning between teams to publishing the UT report.

Cloud Enterprise Application Provider Selection Model for the Small and Medium Enterprise: A Pilot Study

Enterprise Applications (EAs) help organizations achieve operational excellence and competitive advantage. Over time, most Small and Medium Enterprises (SMEs), which are known to be major drivers of most thriving global economies, have used the costly on-premise versions of these applications, making it difficult for their businesses to thrive competitively in the same market environment as their large-enterprise counterparts. The advent of cloud computing presents SMEs with an affordable offer and great opportunities, as such EAs can be cloud-hosted and rented on a pay-per-use basis that does not require huge initial capital. However, as there are numerous Cloud Service Providers (CSPs) offering EAs as Software-as-a-Service (SaaS), there is the challenge of choosing a suitable provider whose Quality of Service (QoS) meets the organization's customized requirements. The proposed model addresses this and goes a step further to select the most affordable among a shortlist of CSPs. In the earlier stage, before developing the instrument and conducting the pilot test, the researchers conducted structured interviews with three experts to validate the proposed model. The validity and reliability of the instrument were then tested through experts and typical respondents, and the data were analyzed with SPSS 22. The results confirmed the validity of the proposed model and the validity and reliability of the instrument.
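As a generic illustration of QoS-based provider shortlisting, the sketch below normalises two QoS attributes, combines them with weights and breaks near-ties on price. The criteria, weights and provider figures are hypothetical and do not reproduce the authors' proposed model.

```python
# Generic illustration of ranking a shortlist of SaaS providers by weighted
# QoS score and then price. The criteria, weights and provider figures are
# hypothetical and do not reproduce the authors' proposed model.
providers = {
    "CSP-A": {"availability": 0.999, "response_ms": 220, "monthly_cost": 95},
    "CSP-B": {"availability": 0.995, "response_ms": 150, "monthly_cost": 80},
    "CSP-C": {"availability": 0.999, "response_ms": 180, "monthly_cost": 120},
}
weights = {"availability": 0.6, "response_ms": 0.4}  # response time is a cost criterion

def normalise(values, benefit=True):
    lo, hi = min(values.values()), max(values.values())
    span = (hi - lo) or 1.0
    return {k: ((v - lo) / span if benefit else (hi - v) / span) for k, v in values.items()}

avail = normalise({k: v["availability"] for k, v in providers.items()}, benefit=True)
resp = normalise({k: v["response_ms"] for k, v in providers.items()}, benefit=False)

scores = {k: weights["availability"] * avail[k] + weights["response_ms"] * resp[k] for k in providers}
# Rank by QoS score first, then prefer the cheaper provider on near-ties.
best = max(providers, key=lambda k: (round(scores[k], 2), -providers[k]["monthly_cost"]))
print(sorted(scores.items(), key=lambda kv: -kv[1]), "->", best)
```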

Innovating and Disrupting Higher Education: The Evolution of Massive Open Online Courses

A great deal has been written on Massive Open Online Courses (MOOCs) since 2012 (considered by some as the year of the MOOC). The emergence of MOOCs caused a great deal of interest amongst academics and technology experts as well as ordinary people. Some of the authors who wrote on MOOCs perceived them as the next big thing that would disrupt education. Other authors saw them as another fad that would go away once it had run its course (as most fads do). But MOOCs did not turn out to be a fad, and they are still around. Most importantly, they have evolved into something that is beginning to look like a viable business model. This paper explores this phenomenon within the theoretical frameworks of disruptive innovation and "jobs to be done" as developed by Clayton Christensen and his colleagues, and its implications for the future of higher education (HE).

Quantifying Uncertainties in an Archetype-Based Building Stock Energy Model by Use of Individual Building Models

Focus on reducing energy consumption in existing buildings at large scale, e.g. in cities or countries, has been increasing in recent years. In order to reduce energy consumption in existing buildings, political incentive schemes are put in place and large-scale investments are made by utility companies. Prioritising these investments requires a comprehensive overview of the energy consumption in the existing building stock, as well as of the potential energy savings. However, a building stock comprises thousands of buildings with different characteristics, making it difficult to model energy consumption accurately. Moreover, the complexity of the building stock makes it difficult to convey model results to policymakers and other stakeholders. In order to manage this complexity, building archetypes are often employed in building stock energy models (BSEMs). Building archetypes are formed by segmenting the building stock according to specific characteristics. Segmenting the building stock according to building type and building age is common, among other reasons because this information is often readily available. This segmentation also makes it easy to convey results to non-experts. However, using a single archetypical building to represent all buildings in a segment of the building stock is associated with a loss of detail. Thermal characteristics are aggregated, while other characteristics that could affect the energy efficiency of a building are disregarded. Thus, using a simplified representation of the building stock could come at the expense of model accuracy. The present study evaluates the accuracy of a conventional archetype-based BSEM that segments the building stock according to building type and age. The accuracy is evaluated in terms of the archetypes' ability to emulate the average energy demands of the buildings they are meant to represent. This is done for the buildings' energy demand as a whole as well as for relevant sub-demands, both evaluated in relation to the type and age of the building. This should provide researchers who use archetypes in BSEMs with an indication of the expected accuracy of the conventional archetype model, as well as of the accuracy lost in specific parts of the calculation due to the use of the archetype method.
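The comparison described above can be sketched as follows: the mean simulated energy demand of the individual buildings in each type/age segment set against the demand predicted by that segment's archetype. The file and column names are hypothetical placeholders for whatever the models actually output.

```python
# Sketch of the comparison described: average simulated energy demand of the
# individual buildings in each type/age segment versus the demand predicted
# by that segment's archetype. File and column names are hypothetical.
import pandas as pd

buildings = pd.read_csv("individual_building_results.csv")   # one row per building
archetypes = pd.read_csv("archetype_results.csv")            # one row per archetype

segment_mean = (
    buildings.groupby(["building_type", "construction_period"])["heat_demand_kwh_m2"]
    .mean()
    .rename("individual_mean")
)

comparison = archetypes.set_index(["building_type", "construction_period"]).join(segment_mean)
comparison["relative_error_pct"] = (
    100 * (comparison["heat_demand_kwh_m2"] - comparison["individual_mean"])
    / comparison["individual_mean"]
)
print(comparison[["individual_mean", "heat_demand_kwh_m2", "relative_error_pct"]])
```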

The Effects of Physical Activity and Serotonin on Depression, Anxiety, Body Image and Mental Health

Sport has found a special place as an effective phenomenon in all societies of the contemporary world. The relationship of physical activity and exercise with different sciences has provided new fields of human study. The range of issues related to exercise and physical education is such that it requires specialised sciences and special studies. In this article, the psychological and social aspects of exercise are investigated for children and adults, and the findings can apply to people of different age groups. Exercise and regular physical movement have a great impact on the mental and social health of the individual, in addition to bodily health. They affect the individual's adaptability in society and his or her personality. Exercise plays a role in the treatment of conditions such as depression and anxiety, and affects stress, body image, and memory. Exercise is a safe haven in which young people can achieve optimal human development. The effects of sensorimotor skills on mental actions and mental development are such that many psychologists and sports science experts believe these activities should be given priority in training programs. Familiarity of students and scholars with different programs and methods of sensorimotor activity not only develops their mental actions but also increases mental health and vitality and enhances self-confidence.

SeCloudBPMN: A Lightweight Extension for BPMN Considering Security Threats in the Cloud

Business processes are crucial for organizations and help businesses evaluate and optimize their performance and processes against current and future-state business goals. Outsourcing business processes to the cloud has become popular due to a wide variety of benefits and cost savings. However, cloud outsourcing raises enterprise data security concerns, which must be incorporated into Business Process Model and Notation (BPMN). This paper presents SeCloudBPMN, a lightweight extension of BPMN that explicitly supports the representation of security threats in the cloud as an outsourcing environment. SeCloudBPMN helps a business's security experts outsource business processes to the cloud while considering different threats from inside and outside the cloud. In this way, appropriate security countermeasures can be considered to preserve data security when business processes are outsourced to the cloud.
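Purely as a conceptual sketch, the snippet below shows one way a process task could carry the kind of cloud-threat annotations such an extension introduces. The class names, threat labels and fields are hypothetical and do not reproduce the graphical notation defined in the paper.

```python
# Conceptual sketch only: a BPMN-style task carrying cloud threat annotations
# of the kind an extension like SeCloudBPMN could attach. Class names, threat
# labels and fields are hypothetical, not the paper's notation.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CloudThreatAnnotation:
    threat: str            # e.g. "data breach", "insider misuse"
    origin: str            # "inside-cloud" or "outside-cloud"
    countermeasure: str    # e.g. "encrypt data at rest"

@dataclass
class AnnotatedTask:
    task_id: str
    name: str
    outsourced_to_cloud: bool = False
    annotations: List[CloudThreatAnnotation] = field(default_factory=list)

invoice_task = AnnotatedTask(
    task_id="Task_1",
    name="Store customer invoices",
    outsourced_to_cloud=True,
    annotations=[CloudThreatAnnotation("data breach", "outside-cloud", "encrypt data at rest")],
)
print([a.threat for a in invoice_task.annotations if invoice_task.outsourced_to_cloud])
```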

Development and Validation of an Instrument Measuring the Coping Strategies in Situations of Stress

Stress has deleterious effects at the physical, psychological and organizational levels, which highlights the need for effective coping strategies to deal with it. Several coping models exist, but they neither integrate the different strategies in a coherent way nor take into account recent research on emotional coping and acceptance of the stressful situation. To fill these gaps, an integrative model incorporating the main coping strategies was developed. This model arises from a review of the scientific literature on coping, from a qualitative study carried out among workers with low or high levels of stress, and from an analysis of clinical cases. The model allows one to understand under what circumstances the strategies are effective or ineffective and to learn how one might use them more wisely. It includes Specific Strategies in controllable situations (Modification of the Situation and Resignation-Disempowerment), Specific Strategies in non-controllable situations (Acceptance and Stubborn Relentlessness), as well as so-called General Strategies (Wellbeing and Avoidance). This study presents the process of developing and validating an instrument measuring coping strategies based on this model. An initial pool of items was generated from the conceptual definitions, and three expert judges validated the content. Of these, 18 items were selected for a short-form questionnaire. A sample of 300 students and employees from a Quebec university was used to validate the questionnaire. Concerning the reliability of the instrument, the indices observed for inter-rater agreement (Krippendorff's alpha) and internal consistency (Cronbach's alpha) are satisfactory. To evaluate construct validity, a confirmatory factor analysis using MPlus supports the existence of a six-factor model. The results of this analysis also suggest that this configuration is superior to alternative models. The correlations show that the factors are only loosely related to each other. Overall, the analyses suggest that the instrument has good psychometric qualities and demonstrate the relevance of further work to establish predictive validity and reconfirm its structure. This instrument will help researchers and clinicians better understand and assess coping strategies for dealing with stress and thus prevent mental health issues.
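The internal-consistency check mentioned above can be illustrated with a direct computation of Cronbach's alpha from an items-by-respondents score matrix; the small array of Likert ratings below is hypothetical and stands in for one subscale of the questionnaire.

```python
# Minimal sketch of the internal-consistency check: Cronbach's alpha computed
# directly from a respondents-by-items score matrix. The array below is
# hypothetical; the study's data are not shown in the abstract.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: shape (n_respondents, n_items), e.g. Likert ratings."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Three items of one hypothetical subscale, five respondents
subscale = np.array([
    [4, 5, 4],
    [3, 4, 3],
    [5, 5, 4],
    [2, 3, 2],
    [4, 4, 5],
])
print(round(cronbach_alpha(subscale), 2))
```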

An Analysis of Digital Forensic Laboratory Development among Malaysia’s Law Enforcement Agencies

Cybercrime is on the rise, yet many Law Enforcement Agencies (LEAs) in Malaysia have no Digital Forensics Laboratory (DFL) to assist them in the acquisition and analysis of digital evidence. Of the estimated 30 LEAs in Malaysia, only eight own a DFL. All of these DFLs are concentrated in the capital of Malaysia, with none at the state level. LEAs still depend on the national DFL (CyberSecurity Malaysia) even for simple and straightforward cases. A survey was conducted among the LEAs in Malaysia that own a DFL to understand the history of establishing their DFL, the challenges they faced and the significance of the DFL to their case investigations. The results showed that while some LEAs faced no challenge in establishing a DFL, others took seven to ten years to do so. The reason was the difficulty of convincing their management because of the high costs involved. The results also revealed that, with the establishment of a DFL, LEAs were better able to obtain faster forensic results and to meet the agency's timeline expectations. LEAs were also able to obtain more meaningful forensic results on cases requiring niche expertise, compared with sending cases to the national DFL. In addition, cases are becoming more complex, and hence a continuous stream of budget for equipment and training is inevitable. It is hoped that the results of the study can be used by other LEAs to justify to their management the benefits of establishing an in-house DFL.

Improving Health Care and Patient Safety at the ICU by Using Innovative Medical Devices and ICT Tools: Examples from Bangladesh

Innovative medical technologies offer more effective medical care, with less risk to patients and healthcare personnel. Medical technology and devices, when properly used, provide better data, precise monitoring and less invasive treatments, and can be more targeted and often less costly. The Intensive Care Unit (ICU), equipped with patient monitoring, respiratory and cardiac support, pain management, emergency resuscitation and life-support devices, is particularly prone to medical errors for various reasons. Many people in developing countries now wonder whether a visit to hospital might harm rather than help them. This is because clinicians in developing countries are required to maintain an increasing workload with limited resources and in the absence of a well-functioning safety system. A team of medical, biomedical and clinical engineering experts from Sweden and Bangladesh has worked together to study incidents and adverse events at ICUs in Bangladesh. The study included both public and private hospitals in order to provide a better understanding of the physical structure, organization and practice of the processes of care, and of the occurrence of adverse outcomes, errors, risks and accidents related to medical devices at the ICU, and to develop an ICT-based support system to reduce hazards and errors and thus improve the quality of performance, care and cost-effectiveness at the ICU. Concrete recommendations and guidelines have been made for preparing appropriate ICT-related tools and methods to improve the routines for the use of medical devices and for the reporting and analysis of incidents at the ICU, in order to reduce the number of undetected and unresolved incidents and thus improve patient safety.

A Challenge to Acquire Serious Victims’ Locations during Acute Period of Giant Disasters

In this paper, we report how to acquire the locations of seriously injured victims in the acute stage of large-scale disasters, using an emergency information network system designed by us. The background of our concept is the Great East Japan Earthquake that occurred on March 11, 2011. Through many experiences of national crises caused by earthquakes and tsunamis, we have established advanced communication systems and advanced disaster medical response systems. However, Japan was devastated by huge tsunamis that swept a vast area of Tohoku, causing a complete breakdown of all infrastructure, including telecommunications. Communication of emergency information was limited, causing a serious delay in the initial rescue and medical operations. We therefore recognized the need for interdisciplinary collaboration among experts in disaster medicine, regional administrative sociology, satellite communication technology and systems engineering. For emergency rescue and medical operations, the most important thing is to identify the number of casualties, their locations and status, and to dispatch doctors and rescue workers from multiple organizations. In the case of the Tohoku earthquake, no dispatching mechanism or decision support system existed to allocate the appropriate number of doctors and locate disaster victims. Even though doctors and rescue workers from multiple government organizations have their own dedicated communication systems, the systems are not interoperable.

Dosimetric Analysis of Intensity Modulated Radiotherapy versus 3D Conformal Radiotherapy in Adult Primary Brain Tumors: Regional Cancer Centre, India

Radiation therapy has undergone many advancements and has evolved from 2D to 3D techniques. Recently, the rapid pace of drug discovery, cutting-edge technology and clinical trials has driven innovative advancements in computer technology and treatment planning, with an upgrade to intensity modulated radiotherapy (IMRT), which delivers a homogeneous dose to the tumor while limiting the dose to normal tissues. The present study was a hospital-based experience comparing two different conformal radiotherapy techniques for brain tumors. This analytical study was conducted at the Regional Cancer Centre, India, from January 2014 to January 2015. Ten patients were selected after applying inclusion and exclusion criteria. All patients were treated on a Siemens Artiste linear accelerator. The tolerance level for maximum dose was 6.0 Gy for the lenses and 54.0 Gy for the brain stem, optic chiasm and optic nerves, as per RTOG criteria. Mean and standard deviation values of PTV98%, PTV95% and PTV2% for IMRT were 93.16±2.9, 95.01±3.4 and 103.1±1.1, respectively; for 3DCRT they were 91.4±4.7, 94.17±2.6 and 102.7±0.39, respectively. The PTV maximum dose (%) for IMRT and 3DCRT was 104.7±0.96 and 103.9±1.0, respectively. The maximum dose to the tumor can be delivered with IMRT within acceptable toxicity limits. Variables such as expertise, tumor location, patient condition, and the treatment planning system (TPS) influence the outcome of the treatment.

A Dataset of Program Educational Objectives Mapped to ABET Outcomes: Data Cleansing, Exploratory Data Analysis and Modeling

Datasets or collections are becoming important assets in their own right and can now be accepted as a primary intellectual output of research. The quality and usage of a dataset depend mainly on the context under which it has been collected, processed, analyzed, validated, and interpreted. This paper presents a collection of program educational objectives mapped to student outcomes, collected from self-study reports prepared by 32 engineering programs accredited by ABET. The manual mapping (classification) of these data is a notoriously tedious, time-consuming process. In addition, it requires experts in the area, who are mostly not available. The operational settings under which the collection was produced are described. The collection has been cleansed and preprocessed, some features have been selected, and preliminary exploratory data analysis has been performed so as to illustrate the properties and usefulness of the collection. Finally, the collection has been benchmarked using nine of the most widely used supervised multi-label classification techniques (Binary Relevance, Label Powerset, Classifier Chains, Pruned Sets, Random k-label sets, Ensemble of Classifier Chains, Ensemble of Pruned Sets, Multi-Label k-Nearest Neighbors and Back-Propagation Multi-Label Learning). The techniques have been compared with each other using five well-known measures, including Accuracy, Hamming Loss, Micro-F, and Macro-F. The Ensemble of Classifier Chains and Ensemble of Pruned Sets achieved encouraging performance compared to the other multi-label classification methods tested. The Classifier Chains method showed the worst performance. To recap, the benchmark achieved promising results, building on the preliminary exploratory data analysis performed on the collection, proposing new directions for research and providing a baseline for future studies.
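The benchmarking idea can be illustrated on toy data with two of the listed strategies, Binary Relevance and Classifier Chains, scored with Hamming Loss and micro/macro F1. The sketch uses scikit-learn for convenience and does not reproduce the paper's feature extraction, data split or toolset.

```python
# Sketch of multi-label benchmarking on synthetic data: Binary Relevance and
# Classifier Chains scored with Hamming Loss and micro/macro F1. For
# illustration only; not the paper's exact setup.
from sklearn.datasets import make_multilabel_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score, hamming_loss
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.multioutput import ClassifierChain

X, Y = make_multilabel_classification(n_samples=400, n_features=50, n_classes=7, random_state=0)
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.25, random_state=0)

models = {
    "Binary Relevance": OneVsRestClassifier(LogisticRegression(max_iter=1000)),
    "Classifier Chains": ClassifierChain(LogisticRegression(max_iter=1000), random_state=0),
}
for name, model in models.items():
    Y_pred = model.fit(X_train, Y_train).predict(X_test)
    print(name,
          "Hamming:", round(hamming_loss(Y_test, Y_pred), 3),
          "Micro-F1:", round(f1_score(Y_test, Y_pred, average="micro"), 3),
          "Macro-F1:", round(f1_score(Y_test, Y_pred, average="macro"), 3))
```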