Online Multilingual Dictionary Using Hamburg Notation for Avatar-Based Indian Sign Language Generation System

Sign Language (SL) is used by deaf people and by others who cannot speak or who have difficulty with spoken languages due to some disability. It is a visual gesture language that uses one or both hands, the arms, the face, and the body to convey meanings and thoughts. An SL automation system provides an effective computer-based interface for communication with hearing people. In this paper, an avatar-based dictionary is proposed for a text-to-Indian Sign Language (ISL) generation system. This work also presents a literature review of the SL corpora developed for various sign languages over the years. An ISL generation system requires a written form of SL, and several notation techniques are available for writing SL. The system uses the Hamburg Notation System (HamNoSys) and the Signing Gesture Markup Language (SiGML) for ISL generation. It is developed in PHP using Web Graphics Library (WebGL) technology for 3D avatar animation. A multilingual ISL dictionary is developed using HamNoSys for both English and Hindi. This dictionary serves as a database that associates signs with words or phrases of a spoken language. An admin panel provides an interface to manage the dictionary, i.e., to add, modify, or delete words. Through this interface, HamNoSys notations can be composed and stored in the database, and these notations are manually converted into their corresponding SiGML files. The system takes a natural-language input sentence in English or Hindi and generates a 3D sign animation using an avatar. SL generation systems have potential applications in many domains, such as healthcare, media, education, commerce, and transportation services. This work will help researchers understand the various techniques used for writing SL and for building SL generation systems.
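
As a rough illustration of the pipeline described above (the actual system is implemented in PHP with WebGL), the Python sketch below shows how a word could be looked up in the multilingual dictionary and resolved to its stored HamNoSys notation and manually prepared SiGML file; all table, column, and function names are hypothetical.

```python
import sqlite3

# Hypothetical in-memory dictionary: (word, language, HamNoSys string, SiGML file).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE signs (word TEXT, lang TEXT, hamnosys TEXT, sigml_path TEXT)")
conn.execute("INSERT INTO signs VALUES ('hospital', 'en', 'hamnosys-placeholder', 'signs/hospital.sigml')")

def lookup_sign(word, lang="en"):
    """Return (hamnosys, sigml_path) for a word, or None if no sign is stored."""
    return conn.execute(
        "SELECT hamnosys, sigml_path FROM signs WHERE word = ? AND lang = ?",
        (word.lower(), lang),
    ).fetchone()

def sentence_to_sigml(sentence, lang="en"):
    """Collect the SiGML files for every signed word of the input sentence."""
    return [entry[1] for word in sentence.split()
            if (entry := lookup_sign(word, lang))]

# The resulting list of SiGML files would be streamed to the WebGL avatar player.
print(sentence_to_sigml("where is the hospital"))
```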

Surface and Bulk Magnetization Behavior of Isolated Ferromagnetic NiFe Nanowires

The surface and bulk magnetization behavior of template-released, isolated ferromagnetic Ni60Fe40 nanowires of relatively thick diameter (~200 nm), deposited from a dilute suspension onto pre-patterned insulating chips, has been investigated experimentally using highly sensitive Magneto-Optical Kerr Effect (MOKE) magnetometry and Magneto-Resistance (MR) measurements, respectively. The MR data were consistent with the theoretical predictions of the anisotropic magneto-resistance (AMR) effect. The MR measurements, at all investigated angles, showed large features and a series of nonmonotonic "continuous small features" in the resistance profiles. The switching fields extracted from these features and from the MOKE loops were compared with each other and with the switching fields reported in the literature for the same analytical techniques applied to nanowires of similar composition and dimensions. A large difference between the MOKE and MR measurements was noticed. The disparity between the MOKE and MR results is attributed to the difference in the micro-magnetic structure of the surface and the bulk of such ferromagnetic nanowires. This result was corroborated by micro-magnetic simulations of individual NiFe nanowires with cylindrical and rectangular cross sections, with the same diameter/thickness as the experimental wires, using the Object Oriented Micro-magnetic Framework (OOMMF) package. The simulated loops showed different switching events, indicating that such wires pass through different magnetic states in the reversal process and that the micro-magnetic spin structures during switching are complicated. These results further support the difference between surface and bulk magnetization behavior in these nanowires. This work suggests that a combination of MOKE and MR measurements is required to fully understand the magnetization behavior of such relatively thick, isolated, cylindrical ferromagnetic nanowires.
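
For reference, the standard AMR relation against which MR data of this kind are usually checked (a textbook expression, not a formula quoted from this work) is

$$ R(\theta) = R_{\perp} + \left(R_{\parallel} - R_{\perp}\right)\cos^{2}\theta, $$

where $\theta$ is the angle between the magnetization and the current direction, and $R_{\parallel}$ and $R_{\perp}$ are the resistances for magnetization parallel and perpendicular to the current, respectively.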

Highly Accurate Target Motion Compensation Using Entropy Function Minimization

One of the shortcomings of stepped frequency radar systems is their sensitivity to target motion. In such systems, target motion causes range cell shift, false peaks, Signal to Noise Ratio (SNR) reduction, and range profile spreading, because the power spectrum of each range cell interferes with adjacent range cells; this distorts the High Resolution Range Profile (HRRP) and disrupts the target recognition process. Thus, compensation for the effects of the Target Motion Parameters (TMPs) should be employed. In this paper, a method based on entropy minimization is proposed for estimating the TMPs (velocity and acceleration) and consequently eliminating or suppressing their unwanted effects on the HRRP. The method is carried out in two major steps. In the first step, a discrete search is performed over the whole acceleration-velocity lattice within a specified interval to find a coarse minimum point of the entropy function. In the second step, a 1-D search over velocity is performed in the vicinity of that minimum along several constant-acceleration lines, in order to refine the accuracy of the minimum point found in the first step. The simulation results provided demonstrate the effectiveness of the proposed method.
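
A minimal Python sketch of this two-step entropy search is given below. It assumes a simplified stepped-frequency model in which the echo at each frequency step acquires a phase proportional to the radial motion v*t + 0.5*a*t^2; the grids, signal model, and function names are illustrative and not the authors' implementation.

```python
import numpy as np

C = 3e8  # speed of light (m/s)

def hrrp_entropy(profile):
    """Shannon entropy of a normalized range profile (lower = sharper HRRP)."""
    p = np.abs(profile) ** 2
    p /= p.sum()
    return -np.sum(p * np.log(p + 1e-12))

def compensate(echoes, freqs, times, v, a):
    """Remove the phase caused by radial velocity v and acceleration a, then
    form the HRRP by an inverse FFT over the frequency steps."""
    radial_motion = v * times + 0.5 * a * times ** 2
    correction = np.exp(1j * 4 * np.pi * freqs * radial_motion / C)
    return np.fft.ifft(echoes * correction)

def estimate_tmp(echoes, freqs, times,
                 v_grid=np.linspace(-300, 300, 61),
                 a_grid=np.linspace(-50, 50, 21)):
    # Step 1: coarse discrete search over the acceleration-velocity lattice.
    _, v0, a0 = min((hrrp_entropy(compensate(echoes, freqs, times, v, a)), v, a)
                    for v in v_grid for a in a_grid)
    # Step 2: fine 1-D search over velocity near the coarse minimum, repeated
    # along a few constant-acceleration lines.
    fine_v = np.linspace(v0 - 10, v0 + 10, 201)
    _, v_hat, a_hat = min((hrrp_entropy(compensate(echoes, freqs, times, v, a)), v, a)
                          for a in (a0 - 2.5, a0, a0 + 2.5) for v in fine_v)
    return v_hat, a_hat
```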

Adaptive E-Learning System Using Fuzzy Logic and Concept Map

This paper proposes an effective adaptive e-learning system that uses a coloured concept map to show the learner's knowledge level for each concept in the chosen subject area. A fuzzy logic system is used to evaluate the learner's knowledge level for each concept in the domain and to produce a ranked concept list of learning materials that addresses weaknesses in the learner's understanding. The system obtains information on the learner's understanding of concepts from an initial pre-test before the system is used for learning and from a post-test after using the learning system. A fuzzy logic system is used to produce a weighted concept map during the learning process. The aim of this research is to show that the proposed novel adaptive e-learning system enhances learners' performance and understanding. In addition, this research aims to increase participants' overall understanding of their learning level by providing a coloured concept map of understanding followed by a ranked list of learning materials for those concepts.
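
The fragment below is a minimal sketch (not the authors' rule base) of how fuzzy membership functions could turn a concept's test score into a colour for the concept map; the thresholds and colour scheme are assumptions.

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b and support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def concept_colour(score):
    """Fuzzify one concept score (0-100) and return the dominant map colour."""
    memberships = {
        "red": tri(score, -1, 0, 50),       # weak understanding
        "yellow": tri(score, 25, 50, 75),   # moderate understanding
        "green": tri(score, 50, 100, 101),  # strong understanding
    }
    return max(memberships, key=memberships.get), memberships

colour, degrees = concept_colour(62)  # -> 'yellow' dominates for this score
```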

Understanding the Selectional Preferences of the Twitter Mentions Network

Users in social networks either unicast or broadcast their messages. The @-mention is the popular way of unicasting on Twitter, whereas general tweeting can be considered a broadcasting method. Understanding and modeling the information flow and dynamics within a social network is a promising open research area called information diffusion. This paper seeks an answer to a fundamental question: is the @-mention network, i.e., the unicasting pattern in social media, purely random in nature, or is there a user-specific selectional preference? To answer this question, we present an empirical analysis of the sociological aspects of the Twitter mentions network within a social network community. To understand the sociological behavior, we analyze the values (Schwartz model: Achievement, Benevolence, Conformity, Hedonism, Power, Security, Self-Direction, Stimulation, Tradition, and Universalism) of all the users. Empirical results suggest that value traits are indeed a salient cue to understanding how the mention-based communication network functions. For example, we notice that individuals possessing similar values unicast among themselves more often than with people of other value types. We also observe that tradition-oriented and self-directed people do not maintain very close relationships in the network with people of different value traits.
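
One simple way to test whether mentions are random with respect to values is to compute the attribute assortativity of the mention graph, as sketched below with networkx; the toy graph and value labels are purely illustrative.

```python
import networkx as nx

# Toy mention network: an edge u -> v means that user u @-mentions user v.
G = nx.DiGraph()
G.add_edges_from([("alice", "bob"), ("bob", "alice"), ("carol", "alice"),
                  ("carol", "dave"), ("dave", "carol")])

# Hypothetical dominant Schwartz value for each user.
values = {"alice": "Benevolence", "bob": "Benevolence",
          "carol": "Tradition", "dave": "Tradition"}
nx.set_node_attributes(G, values, "value")

# A clearly positive coefficient would indicate that users preferentially
# mention people sharing their value type, i.e. the network is not purely random.
print(nx.attribute_assortativity_coefficient(G, "value"))
```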

Numerical Modeling of the Determination of In Situ Rock Mass Deformation Modulus Using the Plate Load Test

Accurate determination of the rock mass deformation modulus, as an important design parameter, is one of the most controversial issues in most engineering projects. A 3D numerical model of the standard plate load test (PLT) using the FLAC3D code was carried out to investigate the mechanism governing the test process. Five objectives were the focus of this study. The first goal was to employ 3D modeling in the interpretation of the PLTs conducted at the Bazoft dam site, Iran. The second objective was to investigate the effect of the depth at which displacements are measured below the loading plates on the calculated moduli. The magnitude of the rock mass deformation modulus calculated from the PLT depends on the anchor depth, and in practice, this may be a source of error in the selection of a realistic deformation modulus for the rock mass. The third goal was to investigate the effect of the loading plate diameter on the calculated modulus. Moreover, a comparison of the moduli calculated from the ISRM formula, from the numerical model, and from the actual PLT carried out at the right abutment of the Bazoft dam site was another objective of the study. Finally, the effect of plastic strains on the calculated moduli in each of the loading-unloading cycles for three loading plates was investigated. The geometry, material properties, and boundary conditions of the constructed 3D model were selected based on the in-situ conditions of the PLT at the Bazoft dam site. Good agreement was achieved between the numerical model results and the field test results.
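
For orientation, a commonly used elastic half-space relation for back-calculating the modulus from a rigid circular plate test (a textbook form; the ISRM suggested method additionally accounts for deflections measured at anchor depth below the plate) is

$$ E_{m} = \frac{\pi\, q\, a\,(1-\nu^{2})}{2\,\delta}, $$

where $q$ is the applied plate pressure, $a$ the plate radius, $\nu$ Poisson's ratio of the rock mass, and $\delta$ the measured plate settlement.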

Lung Parasites in Stone Martens (Martes foina L.) from Bulgaria

The present work focused on the pulmonary helminth fauna of the stone marten in Bulgaria, for which the available data are scarce. For this purpose, four stone martens were helminthologically necropsied according to the standard technique. In addition, some of the injured lung parts were examined after boiling in lactic acid and subsequent compression. Four nematode species from different families of the orders Strongylida and Trichocephalida were found in the lungs. These were Crenosoma petrowi Morosov, 1939; Eucoleus aerophilus Creplin, 1839; Filaroides martis Werner, 1782; and Sobolevingylus petrowi Romanov, 1952. Some of the parasite structures of taxonomic importance were measured and described. To the best of our knowledge, the species F. martis and S. petrowi are recorded for the first time as part of the helminth fauna of Southeast Europe, and of Bulgaria in particular.

Assessing the Impact of High Fidelity Human Patient Simulation on Teamwork among Nursing, Medicine and Pharmacy Undergraduate Students

High fidelity human patient simulation has been used for many years by health sciences education programs to foster critical thinking, engage learners, improve confidence, improve communication, and enhance psychomotor skills. Unfortunately, there is a paucity of research on the use of high fidelity human patient simulation to foster teamwork among nursing, medicine and pharmacy undergraduate students. This study compared the impact of high fidelity and low fidelity simulation education on teamwork among nursing, medicine and pharmacy students. For the purpose of this study, two innovative teaching scenarios were developed based on the care of an adult patient experiencing acute anaphylaxis: one high fidelity scenario using a human patient simulator and one low fidelity scenario using case-based discussions. A within-subjects, pretest-posttest, repeated-measures design was used with two treatment levels and random assignment of individual subjects to teams of two or more professions. A convenience sample of twenty-four (n=24) undergraduate students participated, including nursing (n=11), medicine (n=9), and pharmacy (n=4) students. The Interprofessional Teamwork Questionnaire was used to assess changes in students' perception of their functionality within the team, the importance of interprofessional collaboration, their comprehension of roles, and their confidence in communication and collaboration. Student satisfaction was also assessed. Students reported significant improvements in their understanding of the importance of interprofessional teamwork and of the roles of nursing and medicine on the team after participation in both the high fidelity and the low fidelity simulation. However, only participants in the high fidelity simulation reported a significant improvement in their ability to function effectively as a member of the team. All students reported that both simulations were a meaningful learning experience, and all would recommend both experiences to other students. These findings suggest there is merit in both high fidelity and low fidelity simulation as a teaching and learning approach to foster teamwork among undergraduate nursing, medicine and pharmacy students. However, participation in high fidelity simulation may provide a more realistic opportunity to practice and function as an effective member of the interprofessional health care team.

Analysis of the Operational Performance of Three Unconventional Arterial Intersection Designs: Median U-Turn, Superstreet and Single Quadrant

This paper aims to evaluate and compare the operational performance of three Unconventional Arterial Intersection Designs (UAIDs), namely the Median U-Turn, Superstreet, and Single Quadrant Intersection, using real traffic data. For this purpose, the heavily congested signalized intersection of Wadi Saqra in Amman was selected. The effect of implementing each of the proposed UAIDs was evaluated not only at the isolated Wadi Saqra signalized intersection but also along the arterial road including both surrounding intersections. The operational performance of the isolated intersection was based on the level of service (LOS) expressed in terms of control delay and volume-to-capacity ratio. On the other hand, the measures used to evaluate the operational performance of the arterial road included traffic progression, stopped delay per vehicle, number of stops, and travel speed. The analysis was performed using the SYNCHRO 8 software. The simulation results showed that all three selected UAIDs outperformed the conventional intersection design in terms of control delay, but only the Single Quadrant Intersection design improved the main intersection LOS from F to B. The results also indicated that the Single Quadrant Intersection design increased the average travel speed by 52% and decreased the average stopped delay by 34% on the selected corridor when compared to the corridor with the conventional intersection design. On the basis of these results, it can be concluded that the Median U-Turn and the Superstreet do not perform best under heavy traffic volumes.
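
For readers unfamiliar with the LOS grades cited above, the sketch below maps control delay to LOS using the HCM-style thresholds for signalized intersections; the delay values in the example calls are illustrative, not results from this study.

```python
def signalized_los(control_delay):
    """Map control delay (s/veh) to a level of service grade using the HCM
    thresholds for signalized intersections (A <= 10 s ... E <= 80 s, else F)."""
    for grade, limit in (("A", 10), ("B", 20), ("C", 35), ("D", 55), ("E", 80)):
        if control_delay <= limit:
            return grade
    return "F"

print(signalized_los(95.0))   # 'F' - e.g. a heavily congested base case
print(signalized_los(18.0))   # 'B' - e.g. a well-performing alternative
```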

Development and Validation of an Instrument Measuring the Coping Strategies in Situations of Stress

Stress causes deleterious effects at the physical, psychological, and organizational levels, which highlights the need for effective coping strategies to deal with it. Several coping models exist, but they do not integrate the different strategies in a coherent way, nor do they take into account recent research on emotional coping and acceptance of the stressful situation. To fill these gaps, an integrative model incorporating the main coping strategies was developed. This model arises from a review of the scientific literature on coping, from a qualitative study carried out among workers with low or high levels of stress, and from an analysis of clinical cases. The model allows one to understand under what circumstances the strategies are effective or ineffective and to learn how one might use them more wisely. It includes Specific Strategies for controllable situations (Modification of the Situation and Resignation-Disempowerment), Specific Strategies for non-controllable situations (Acceptance and Stubborn Relentlessness), as well as so-called General Strategies (Wellbeing and Avoidance). This study presents the process of development and validation of an instrument measuring coping strategies based on this model. An initial pool of items was generated from the conceptual definitions, and three expert judges validated the content. Of these, 18 items were selected for a short-form questionnaire. A sample of 300 students and employees from a Quebec university was used for the validation of the questionnaire. Concerning the reliability of the instrument, the indices observed for inter-rater agreement (Krippendorff's alpha) and for internal consistency (Cronbach's alpha) are satisfactory. To evaluate construct validity, a confirmatory factor analysis using Mplus supports the existence of a model with six factors. The results of this analysis also suggest that this configuration is superior to alternative models. The correlations show that the factors are only loosely related to each other. Overall, the analyses carried out suggest that the instrument has good psychometric qualities and demonstrate the relevance of further work to establish predictive validity and to reconfirm its structure. This instrument will help researchers and clinicians better understand and assess the strategies used to cope with stress and thus prevent mental health issues.

A Dynamic Mechanical Thermal T-Peel Test Approach to Characterize Interfacial Behavior of Polymeric Textile Composites

A basic understanding of interfacial mechanisms is important for the development of polymer composites. For this purpose, we need techniques to analyze the quality of interphases, their chemical and physical interactions, and their strength and fracture resistance. In order to investigate interfacial phenomena in detail, advanced characterization techniques are desirable. Dynamic mechanical thermal analysis (DMTA) using a rheological system is a sensitive tool. T-peel tests were performed with this system to investigate the temperature-dependent peel behavior of woven textile composites. A model system was made of polyamide (PA) woven fabric laminated with films of polypropylene (PP) or PP modified by grafting with maleic anhydride (PP-g-MAH). Firstly, control measurements were performed on the PP matrices alone. Polymer melt investigations, as well as the extensional stress, extensional viscosity, and extensional relaxation modulus at -10 °C, 100 °C, and 170 °C, demonstrate similar viscoelastic behavior for films made of PP-g-MAH and the non-modified PP control. Frequency sweeps showed that PP-g-MAH has a zero phase viscosity of around 1600 Pa·s and that the PP control has a similar zero phase viscosity of 1345 Pa·s. The gelation points are also similar, at 2.42×10^4 Pa (118 rad/s) and 2.81×10^4 Pa (161 rad/s) for the PP control and PP-g-MAH, respectively. Secondly, the textile composite was analyzed. The extensional stress of PA66 fabric laminated with either the PP control or PP-g-MAH was investigated at -10 °C, 25 °C, and 170 °C for strain rates of 0.001–1 s^-1. The laminates containing the modified PP require more stress for T-peeling. However, the strengthening effect due to the modification decreases with increasing temperature, and at 170 °C, just above the melting temperature of the matrix, the difference disappears. Independent of the matrix used in the textile composite, the extensional stress decreases with increasing temperature. It appears that the more viscous the matrix, the weaker the laminar adhesion. Possibly, the measurement is influenced by the fact that the laminate becomes stiffer at lower temperatures. Adhesive lap-shear testing at room temperature supports the findings obtained with the T-peel test. Additional analysis of the textile composite at the microscopic level confirms that the fibers are well embedded in the matrix. Atomic force microscopy (AFM) imaging of a cross section of the composite shows no gaps between the fibers and the matrix. Measurements of the water contact angle show that the MAH-grafted PP is more polar than the virgin PP, which suggests a more favorable chemical interaction of PP-g-MAH with PA compared to the non-modified PP. Overall, this study indicates that T-peel testing by DMTA is a suitable technique for gaining deeper insight into polymeric textile composites.

Investigating Elements of Identity of Traditional Neighborhoods in Isfahan and Using These Elements in the Design of Modern Neighborhoods

The process of planning, designing, and building neighborhoods is a complex and multidimensional part of urban planning. Understanding the elements that give a neighborhood a sense of identity can lead to successful city planning and result in a cohesive and functional community where people feel a sense of belonging. These factors are important in ensuring that the needs of the urban population are met and that people live in a safe, pleasant, and healthy society. This paper aims to identify the elements of the identity of traditional neighborhoods in Isfahan and to analyze ways of using these elements in the design of modern neighborhoods to increase social interaction between communities and to promote the cultural reunification of people. The neighborhood of Jolfa in Isfahan has a unique socio-cultural identity, as it dates back to the Safavid Dynasty of the 16th century and most of its inhabitants are Christian Armenians, a religious minority. The elements of the identity of Jolfa were analyzed through field observations, the distribution of questionnaires, and qualitative analysis. A qualitative research method was used to further understand the Jolfa neighborhood and to deconstruct the identity image that residents associate with their neighborhood; this was done using questionnaires in which respondents answered a series of research questions. The major finding from these qualitative data was that traditional neighborhoods with embedded elements of identity have closer-knit communities whose residents maintain strong societal ties. This area of study in urban planning is vital to ensuring that new neighborhoods are built with concepts of social cohesion, community, and inclusion in mind, as these are what lead to strong, connected, and prosperous societies.

User-Based Cannibalization Mitigation in an Online Marketplace

Online marketplaces are not only digital places where consumers buy and sell merchandise; they are also destinations for brands to connect with real consumers at the moment when those consumers are in a shopping mindset. For many marketplaces, brands have been important partners through advertising. There is, however, a risk of advertising harming a consumer's shopping journey if it hurts the user experience or takes the user away from the site; both could lead to a loss of transaction revenue for the marketplace. In this paper, we present user-based methods for cannibalization control that selectively turn off ads for users who are likely to be cannibalized by them, subject to business objectives. We present ways of measuring advertising cannibalization in the context of an online marketplace and propose novel measurement approaches based on purchase propensity and uplift modeling. A/B testing has shown that our methods can significantly improve user purchase and engagement metrics while operating within business objectives. To our knowledge, this is the first paper that addresses cannibalization mitigation at the user level in the context of advertising.
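
A minimal sketch of the uplift-modeling idea (a generic two-model T-learner, not necessarily the production method described in the paper) is shown below; the feature matrix, treatment flags, and the -0.02 serving threshold are synthetic placeholders.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def fit_uplift(X, treated, purchased):
    """Two-model uplift sketch: uplift = purchase propensity with ads minus
    purchase propensity without ads, each estimated by its own classifier."""
    with_ads = GradientBoostingClassifier().fit(X[treated == 1], purchased[treated == 1])
    no_ads = GradientBoostingClassifier().fit(X[treated == 0], purchased[treated == 0])
    return lambda X_new: (with_ads.predict_proba(X_new)[:, 1]
                          - no_ads.predict_proba(X_new)[:, 1])

# Synthetic illustration: user features, random ad exposure, purchase labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 5))
treated = rng.integers(0, 2, size=2000)
purchased = rng.integers(0, 2, size=2000)

uplift = fit_uplift(X, treated, purchased)
# Serving rule sketch: keep ads on only where predicted uplift is not strongly
# negative, i.e. switch ads off for users most likely to be cannibalized.
ads_on = uplift(X) > -0.02
```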

A Comparative Study of GTC and PSP Algorithms for Mining Sequential Patterns Embedded in Database with Time Constraints

This paper considers the problem of mining sequential patterns embedded in a database while handling the time constraints defined in the GSP algorithm (a level-wise algorithm). We compare two previous approaches, GTC and PSP, which take up the general principles of GSP. Furthermore, this paper discusses the PG-hybrid algorithm, which combines PSP and GTC. The results show that PSP and GTC are more efficient than GSP; moreover, the GTC algorithm performs better than PSP. The PG-hybrid algorithm uses the PSP algorithm for the first two passes over the database and the GTC approach for the subsequent scans. Experiments show that the hybrid approach is very efficient for short, frequent sequences.
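
To make the role of the time constraints concrete, the sketch below checks whether a timestamped data sequence supports a candidate pattern under simplified GSP-style minGap/maxGap constraints (the windowSize constraint is omitted); it is an illustrative fragment, not the GTC, PSP, or PG-hybrid implementation.

```python
def supports(sequence, candidate, min_gap=1, max_gap=30):
    """Return True if the data sequence contains the candidate pattern with every
    pair of consecutive elements separated by a time gap in [min_gap, max_gap].

    sequence:  list of (timestamp, set_of_items), sorted by timestamp
    candidate: list of set_of_items
    """
    def search(idx, last_time):
        if idx == len(candidate):
            return True
        for t, items in sequence:
            if last_time is not None:
                gap = t - last_time
                if gap < min_gap:
                    continue
                if gap > max_gap:
                    break  # later transactions only get farther away
            if candidate[idx] <= items and search(idx + 1, t):
                return True
        return False

    return search(0, None)

seq = [(1, {"a"}), (3, {"b", "c"}), (10, {"d"})]
print(supports(seq, [{"a"}, {"b"}], max_gap=5))  # True  (gap of 2)
print(supports(seq, [{"a"}, {"d"}], max_gap=5))  # False (gap of 9 exceeds maxGap)
```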

Inequalities in Higher Education and Students’ Perceptions of Factors Influencing Academic Performance

This qualitative study aims to answer the following research questions: i) What are the factors that students perceive as relevant to a) promoting and b) preventing good grades? ii) How does socio-economic status (SES) feature in those beliefs? We conducted in-depth interviews with 19 first- and second-year undergraduates of varying SES at a research-intensive university in the UK. The interviews yielded eight factors that students perceived as promoting good grades and six perceived as preventing them. The findings suggested one significant difference between the beliefs of low- and high-SES students: low-SES students perceive themselves to be at a greater disadvantage relative to their peers, while high-SES students do not hold such beliefs. This could have knock-on effects on their performance.

Effect of Needle Diameter on the Morphological Structure of Electrospun n-Bi2O3/Epoxy-PVA Nanofiber Mats

The effect of needle diameter on the morphological structure of electrospun n-Bi2O3/epoxy-PVA nanofibers has been investigated using three different needle diameters. The results were examined using two scanning electron microscopy (SEM) techniques: backscattered-electron SEM and secondary-electron SEM. The results demonstrate that there is a correlation between the needle diameter and the morphology of the electrospun nanofibers. As the internal needle diameter decreases, the average nanofiber diameter decreases and the fibers become thinner and smoother, without agglomeration or bead formation. Moreover, with a small needle diameter, the nanofibrous porosity becomes larger than with a large needle diameter.

Understanding the Programming Techniques Using a Complex Case Study to Teach Advanced Object-Oriented Programming

Teaching Object-Oriented Programming (OOP) as part of a computing-related university degree is a very difficult task; the road to ensuring that students are actually learning object-oriented concepts is unclear, as students often find it difficult to understand the concept of objects and their behavior. This problem is especially obvious in advanced programming modules where design patterns and advanced programming features such as multi-threading and animated GUIs are introduced. Looking at students' performance in their final year of a university course, it is obvious that the level of students' understanding of OOP varies considerably from one student to another. Students who aim to produce games do very well in the advanced programming module. However, the students' assessment results of the last few years were relatively low; for example, in 2016-2017, the first quartile of marks was as low as 24.5 and the third quartile was 63.5. It is obvious that many students were not confident or competent enough in their programming skills. In this paper, the reasons behind poor performance in advanced OOP modules are investigated, and a suggested practice for teaching OOP based on a complex case study is described and evaluated.

A Conversation about Inclusive Education: Revelations from Namibian Primary School Teachers

Inclusive education stems from a philosophy and vision which argues that all children should learn together at school. It is not only about treating all pupils in the same way; it is also about allowing all children to attend school without any restrictions. Ten primary school teachers in a circuit in Namibia volunteered to participate in face-to-face interviews about inclusive education. The teachers responded to three questions about (i) their understanding of inclusive education; (ii) whether inclusive education was implemented in primary schools; and (iii) whether they were able to work with learners with special needs. Findings indicated that the teachers understood what inclusive education entailed, felt that inclusive education was not implemented in their primary schools, and were unable to work with learners with special needs in their classrooms. Further, the teachers identified training and resources as important components of inclusive education. It is recommended that education authorities verify the findings reported here and ensure that the concerns raised by the teachers are addressed.

Gas Sweetening Process Simulation: Investigation on Recovering Waste Hydraulic Energy

In this research, a commercial gas sweetening unit using a methyl-diethanolamine (MDEA) solution is first simulated as an integrated model in the Aspen HYSYS software. For evaluation purposes, in the second step, the results of the simulation are compared with operating data gathered from the South Pars Gas Complex (SPGC). According to the simulation results, a considerable energy potential is associated with the pressure difference between the absorber and regenerator columns, and this driving force can be exploited in a power recovery turbine (PRT). In the last step, the amount of waste hydraulic energy is calculated, and methods for its recovery are investigated.
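
As a rough orientation for the last step (a generic relation, not a formula taken from this study), the power recoverable by letting the rich amine down through a hydraulic PRT can be estimated as

$$ P_{\mathrm{rec}} \approx \eta_{t}\, Q\, \Delta p = \eta_{t}\,\frac{\dot m\,\Delta p}{\rho}, $$

where $Q$ is the volumetric flow rate of the rich amine, $\Delta p$ the pressure drop taken across the turbine, $\dot m$ the mass flow rate, $\rho$ the liquid density, and $\eta_{t}$ the turbine efficiency.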

A Biometric Template Security Approach to Fingerprints Based on Polynomial Transformations

The use of biometric identifiers in information security, access control to resources, authentication in ATMs, and banking, among other fields, raises great concern about the safety of biometric data. Eight vulnerabilities have been identified in the general architecture of a biometric system; six of them allow the minutiae template to be obtained in plain text. The main consequence of obtaining minutiae templates is the loss of the biometric identifier for life. To mitigate these vulnerabilities, several models for protecting minutiae templates have been proposed; however, weaknesses in the cryptographic security of these models still allow biometric data to be obtained in plain text. In order to increase cryptographic security and ease of reversibility, a minutiae template protection model is proposed. The model aims to provide cryptographic protection and to facilitate the reversibility of the data using two levels of security. The first level is the data transformation level, which generates data invariant to rotation and translation; furthermore, this transformation is irreversible. The second level is the evaluation level, where the encryption key is generated and the data are evaluated using a defined evaluation function. The proposed model aims to mitigate the known vulnerabilities of existing models, basing its security on the infeasibility of polynomial reconstruction.
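
The fragment below is a hypothetical sketch of the second (evaluation) level only: each rotation/translation-invariant value derived from a minutia is evaluated with a secret key polynomial over a prime field, so that recovering the key from the protected values would require polynomial reconstruction; the modulus, coefficients, and input values are placeholders, not the paper's parameters.

```python
P = 2**31 - 1  # prime modulus of the evaluation field (illustrative choice)

def eval_poly(coeffs, x, p=P):
    """Horner evaluation of the key polynomial at x modulo p."""
    acc = 0
    for c in coeffs:
        acc = (acc * x + c) % p
    return acc

def protect(invariant_values, key_coeffs):
    """Map each invariant minutia-derived value to its protected evaluation."""
    return [eval_poly(key_coeffs, v) for v in invariant_values]

# Placeholder invariant values and key coefficients.
protected_template = protect([10492, 20871, 33310], key_coeffs=[7, 121, 3331])
print(protected_template)
```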