Effect of Merger on Efficiencies: A Study of Taiwanese Higher Education

This study applies nonparametric data envelopment analysis (DEA) to investigate two cases of university mergers in Taiwan's higher education sector. By comparing performance before and after merger, it aims to provide a reference for policy makers and university management seeking to address the higher education crisis in Taiwan. So far, neither of the two post-merger cases shows significant merger synergies in the form of efficiency improvement. National Pingtung University (NPTU) remains technically efficient after the merger, with efficiency scores of 1.0 in every year from 2012 to 2017 except 2014. National Tsing Hua University (NTHU), in contrast, shows declining efficiency scores: its technical efficiency, pure technical efficiency and scale efficiency all dropped after the merger.
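
As a pointer for readers unfamiliar with DEA, the sketch below computes an input-oriented, constant-returns-to-scale (CCR) technical efficiency score with a standard linear program. The universities, inputs and outputs are hypothetical placeholders, not the study's data, and the decomposition into pure technical and scale efficiency (BCC model) is not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR (constant returns to scale) efficiency of DMU o.
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs)."""
    n, m = X.shape
    _, s = Y.shape
    # decision variables: [theta, lambda_1 .. lambda_n]
    c = np.zeros(n + 1)
    c[0] = 1.0                                # minimise theta
    A_ub, b_ub = [], []
    for i in range(m):                        # sum_j lambda_j * x_ij <= theta * x_io
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):                        # sum_j lambda_j * y_rj >= y_ro
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun                            # theta = technical efficiency score

# Hypothetical data: 4 universities, 2 inputs (staff, budget), 1 output (graduates)
X = np.array([[50, 8.0], [70, 9.5], [40, 6.0], [90, 12.0]])
Y = np.array([[1200], [1500], [900], [1600]])
print([round(ccr_efficiency(X, Y, o), 3) for o in range(len(X))])  # 1.0 marks an efficient DMU
```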

State of Emergency in Turkey (July 2016 – July 2018): A Case of Utilization of Law as a Political Instrument

In this study, we aim to analyze how the period of the state of emergency in Turkey led to gaps in the law and to areas in which supervision was entirely lacking. The state of emergency proclaimed after the coup attempt of July 15, 2016, continued until July 18, 2018, that is to say for two years, without regard to whether the initial circumstances persisted. As part of this work, we claim that the state of emergency provided the executive with important tools for governing, of which it made constant use. We highlight how the concern for security, central to people's basic considerations, was exploited by the military power in Turkey as a foundation for interfering in the political, legal and social spheres. The constitutions of 1924, 1961 and 1982 entrusted the army with the role of protector of the integrity of the state. This became an instrument in the hands of the military to legitimize its interventions in the name of public security, interventions in the political field that are in fact politically motivated. The constitution and the legislative and regulatory systems were modified and monopolized by a military power dominating the legislative, regulatory and judicial branches, leading to a state of exception. With the political convulsions of the past decade, the government was able to appropriate the instrument called the state of exception. In particular, the emergency decree-laws, of which the executive made frequent and generally abusive use, became instruments in the government's hands for taking measures it wished to keep outside pre-established rules and control mechanisms. The struggle against the political opposition thus became more unbalanced and destructive. To this must also be added the ineffectiveness of ex-post controls and domestic remedies. This research allows us to stress how a legal concept such as "the state of emergency" can be politically exploited and turned into a legal weapon that continues to produce victims.

Image Haze Removal Using Scene Depth Based Spatially Varying Atmospheric Light in Haar Lifting Wavelet Domain

This paper presents a method for single image dehazing based on the dark channel prior (DCP). The property that the intensity of the dark channel gives an approximate thickness of the haze is used to estimate the transmission and the atmospheric light. Instead of a constant atmospheric light, the proposed method employs scene depth to estimate spatially varying atmospheric light, as truly occurs in nature. The haze imaging model together with the soft matting method is used to produce a high-quality haze-free image. Experimental results demonstrate that the proposed approach produces better results than the classic DCP approach: color fidelity and contrast of the haze-free image are improved, and no over-saturation is observed in the sky region. Further, the lifting Haar wavelet transform is employed to reduce overall execution time by a factor of two to three compared to the conventional approach.
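
For orientation, the following is a minimal sketch of the dark channel prior computation and a coarse, constant-airlight transmission estimate using numpy/scipy. The patch size and omega value are common defaults rather than this paper's settings, and the spatially varying atmospheric light and soft matting refinements described above are not reproduced.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(img, patch=15):
    """Dark channel: per-pixel minimum over RGB, then a local minimum filter."""
    min_rgb = img.min(axis=2)
    return minimum_filter(min_rgb, size=patch)

def estimate_airlight(img, dark, top=0.001):
    """Pick the brightest colors among the haziest (highest dark-channel) pixels."""
    flat = dark.ravel()
    idx = np.argsort(flat)[-max(1, int(top * flat.size)):]
    return img.reshape(-1, 3)[idx].max(axis=0)

def transmission(img, A, patch=15, omega=0.95):
    """Coarse transmission t(x) = 1 - omega * dark_channel(I / A)."""
    return 1.0 - omega * dark_channel(img / A, patch)

# img: HxWx3 float array in [0, 1]
# A = estimate_airlight(img, dark_channel(img)); t = transmission(img, A)
# J = (img - A) / np.maximum(t, 0.1)[..., None] + A   # recovered scene radiance
```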

Modeling Engagement with Multimodal Multisensor Data: The Continuous Performance Test as an Objective Tool to Track Flow

Engagement is one of the most important factors in determining successful outcomes and deep learning in students. Existing approaches to detecting student engagement involve periodic human observations that are subject to inter-rater reliability issues. Our solution uses real-time multimodal multisensor data labeled by objective performance outcomes to infer the engagement of students. The study involves four students with a combined diagnosis of cerebral palsy and a learning disability who took part in a 3-month trial over 59 sessions. Multimodal multisensor data were collected while they participated in a continuous performance test. Eye gaze, electroencephalogram, body pose, and interaction data were used to create a model of student engagement through objective labeling from the continuous performance test outcomes. To achieve this, a new type of continuous performance test, the Seek-X type, is introduced. Nine features were extracted, including high-level handpicked compound features. Using leave-one-out cross-validation, a series of machine learning approaches were evaluated. Overall, the random forest classifier achieved the best results: 93.3% classification accuracy for engagement and 42.9% accuracy for disengagement. We compared these results to outcomes from other models: AdaBoost, decision tree, k-nearest neighbor, naïve Bayes, neural network, and support vector machine. We showed that the multisensor approach achieved higher accuracy than features from any reduced set of sensors, and that using high-level handpicked features improves classification accuracy in every sensor mode. Our approach is robust to both sensor fallout and occlusions. Eye gaze was shown to be the single most important sensor feature for classifying engagement and distraction. We have shown that we can accurately predict the level of engagement of students with learning disabilities in real time, in a way that is not subject to inter-rater reliability issues or human observation and is not reliant on a single sensor mode. This will help teachers design interventions for a heterogeneous group of students whose individual needs they cannot possibly attend to one by one. Our approach can be used to identify those with the greatest learning challenges so that all students are supported to reach their full potential.
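
The evaluation protocol described here, leave-one-out cross-validation with a random forest classifier, can be sketched with scikit-learn as follows. The feature matrix and labels are random placeholders standing in for the nine extracted multimodal features and the CPT-derived engagement labels.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

# X: one row per session window, 9 multimodal features (gaze, EEG, pose, interaction)
# y: objective engagement label derived from continuous performance test outcomes
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 9))            # placeholder features
y = rng.integers(0, 2, size=60)         # placeholder labels: 1 = engaged, 0 = disengaged

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())   # one held-out sample per fold
print("LOOCV accuracy:", scores.mean())
```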

Eco-Design of Multifunctional System Based on a Shape Memory Polymer and ZnO Nanoparticles for Sportswear

Since the beginning of the 20th century, sportswear has made a major contribution to the impact of fashion on our lives. Nowadays, the embrace of sportswear fashion is clearly noticeable, as modern consumers search for high comfort and linear aesthetics in their clothes. This compromise led to the rise of the athleisure trend. Athleisure emerges as a new style area that combines wearability and fashion sense, differentiated from archetypal sportswear, usually associated with "gym clothes". Additionally, the possibility of functionalizing textiles and implementing new technologies has progressively strengthened the connection between physical activity and well-being, allowing clothing to be more interactive and responsive to its surroundings. In this study, a design inspired by retro and urban lifestyles was envisioned, engineering textile structures that can respond to external stimuli. These structures are made responsive to heat, water vapor and humidity by integrating shape memory polymers (SMP), which improve the breathability and heat-responsive behavior of the textiles, and zinc oxide nanoparticles (ZnO NPs), which heighten the surface hydrophobic properties. The best hydrophobic results exhibited superhydrophobic behavior, with a water contact angle (WCA) of more than 150 degrees. For breathability and heat response, SMP-coated samples showed an increase in water vapour permeability of about 50% compared with non-SMP-coated samples. These technological approaches were harnessed to design innovative clothing, in line with circular economy and eco-design principles, by assigning a substantial degree of mutability and versatility to the garments. The development of a coat and shirt, whose different parts can be purchased separately to create multiple products, aims to combine the technicality of the fabrics used with the making of the garments. This concept translates into a real constructive mechanism through the symbiosis of high-tech functionalities and the timeless design that follows the athleisure aesthetic.

Normal and Peaberry Coffee Beans Classification from Green Coffee Bean Images Using Convolutional Neural Networks and Support Vector Machine

The aim of this study is to develop a system that can identify and sort peaberries automatically at low cost for coffee producers in developing countries. In this paper, the focus is on the classification of peaberries and normal coffee beans using image processing and machine learning techniques. A peaberry is neither a defective bean nor a normal bean: it forms when a coffee cherry produces only a single, relatively round seed instead of the usual flat-sided pair of beans, and it has a distinct value and flavor. To preserve the taste of the coffee, peaberries and normal beans must be separated before the green beans are roasted; otherwise the flavors mix and the overall taste suffers. During roasting, bean shape, size, and weight should be uniform, since larger beans take longer to roast through. Peaberries differ in size and shape from normal beans even when they have the same weight, and they roast more slowly, so sorting by size or weight alone is not a good option for selecting them. Defective beans, e.g., sour, broken, black, and faded beans, are easy to check and pick out manually. Picking out peaberries, on the other hand, is very difficult even for trained specialists, because their shape and color are similar to normal beans. In this study, we use image processing and machine learning techniques to discriminate between normal beans and peaberries as part of the sorting system. As a first step, we applied deep Convolutional Neural Networks (CNN) and a Support Vector Machine (SVM) to this discrimination task, and better performance was obtained with the CNN than with the SVM. The artificial neural network, trained on a high-performance CPU and GPU, will then be installed on an inexpensive, computationally limited Raspberry Pi system, since we assume the system will be used in developing countries. The study evaluates and compares the feasibility of the methods in terms of classification accuracy and processing speed.
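
A minimal sketch of the CNN-versus-SVM comparison is shown below, using scikit-learn and Keras. The image size, network architecture, and hyperparameters are illustrative assumptions, not the models or dataset used in the paper.

```python
import numpy as np
import tensorflow as tf
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Placeholder data: 64x64 grayscale bean images with labels 0 = normal, 1 = peaberry.
rng = np.random.default_rng(0)
X = rng.random((400, 64, 64, 1), dtype=np.float32)
y = rng.integers(0, 2, size=400)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# SVM baseline on flattened pixels (in practice, hand-crafted shape/colour features).
svm = SVC(kernel="rbf").fit(X_tr.reshape(len(X_tr), -1), y_tr)
print("SVM accuracy:", svm.score(X_te.reshape(len(X_te), -1), y_te))

# Small CNN: two conv/pool blocks and a dense head, binary output.
cnn = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
cnn.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
cnn.fit(X_tr, y_tr, epochs=5, batch_size=32, verbose=0)
print("CNN accuracy:", cnn.evaluate(X_te, y_te, verbose=0)[1])
```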

Designing of a Non-Zero Dispersion Shifted Fiber with Ultra-High Birefringence and High Non-Linearity

Photonic crystal fiber (PCF) is no longer limited to telecommunication; it is now used in many sensor-based fiber-optic applications, medical science, space applications and more. In this paper, the authors propose a microstructured PCF designed using Finite Element Method (FEM) based software. Besides the design, the authors discuss which characteristics matter for particular applications, since it is not possible to obtain all desirable characteristics from a single PCF. The proposed PCF exhibits ultra-high birefringence (0.0262 at 1550 nm), which is useful for fiber-optic sensors. Its nonlinearity is 50.86 W-1km-1 at 1550 nm, high enough to confine light tightly within the core. For the Perfectly Matched Layer (PML) boundary, a diameter of 0.6 μm is taken. The design offers the characteristics of a Nonzero-Dispersion-Shifted Fiber (NZ-DSF) over a 450 nm waveband. Since this is a software-based design with no practical evaluation, a 2% fabrication tolerance was checked, and only very small variations in the characteristics were found.
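
For context, the two headline figures quoted above follow from standard relations applied to the FEM mode solution: birefringence B = |n_x - n_y| and nonlinear coefficient gamma = 2*pi*n2 / (lambda * A_eff). The sketch below evaluates them with an assumed nonlinear-index coefficient, assumed effective indices and an assumed effective mode area, chosen only to illustrate the order of magnitude rather than taken from the paper's simulation.

```python
import math

# Standard relations used to characterise a PCF from FEM mode solutions.
# All numerical values below are illustrative assumptions.
n2 = 2.6e-20                # nonlinear-index coefficient of silica, m^2/W (typical)
lam = 1550e-9               # wavelength, m
A_eff = 2.1e-12             # effective mode area from the mode solver, m^2 (assumed)
n_x, n_y = 1.4441, 1.4179   # effective indices of the two polarisation modes (assumed)

birefringence = abs(n_x - n_y)                 # B = |n_x - n_y|
gamma = 2 * math.pi * n2 / (lam * A_eff)       # nonlinear coefficient, W^-1 m^-1

print(f"B     = {birefringence:.4f}")
print(f"gamma = {gamma * 1e3:.2f} W^-1 km^-1")
```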

Uplink Throughput Prediction in Cellular Mobile Networks

Current and future cellular mobile communication networks generate enormous amounts of data. Networks have become extremely complex, with an extensive space of parameters, features and counters. They are unmanageable with legacy methods, and an enhanced design and optimization approach, increasingly reliant on machine learning, is necessary. This paper proposes machine learning as a viable approach for uplink throughput prediction. LTE radio metrics such as Reference Signal Received Power (RSRP), Reference Signal Received Quality (RSRQ), and Signal to Noise Ratio (SNR) are used to train models that estimate the expected uplink throughput. A high coefficient of determination of 91.2% is obtained on measurements collected with a simple smartphone application.
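
A sketch of the prediction setup, with the three LTE metrics as features and a regressor scored by the coefficient of determination, is shown below. The data are synthetic placeholders and the random forest regressor is an illustrative model choice, not necessarily the one used in the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Placeholder drive-test samples: columns are RSRP (dBm), RSRQ (dB), SNR (dB);
# target is measured uplink throughput in Mbit/s. Real data would come from the app.
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(-120, -70, 1000),   # RSRP
    rng.uniform(-20, -3, 1000),     # RSRQ
    rng.uniform(-5, 30, 1000),      # SNR
])
y = 0.8 * (X[:, 2] + 5) + 0.1 * (X[:, 0] + 120) + rng.normal(0, 2, 1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("R^2:", r2_score(y_te, model.predict(X_te)))   # coefficient of determination
```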

Mistranslation in Cross Cultural Communication: A Discourse Analysis on Former President Bush’s Speech in 2001

The differences between languages play a big role in cross-cultural communication. If meanings are not translated accurately, the risk can be crucial not only on an interpersonal level, but also on the international and political levels. The use of metaphorical language by politicians can cause great confusion, often leading to statements being misconstrued. In these situations, it is the translators who struggle to put forward the intended meaning with clarity, and this makes translation an important field to study and analyze when it comes to cross-cultural communication. Owing to the growing importance of language and the power of translation in politics, this research analyzes part of President Bush's speech in 2001 in which he used the word "crusade", which caused his statement to be misconstrued. The research uses a discourse analysis of the cross-cultural communication literature, which provides answers supported by historical, linguistic, and communicative perspectives. The first finding indicates that the word "crusade" carries a different meaning and significance in the narratives of the Western world when compared to the Middle East. The second is that, linguistically, maintaining cultural meanings through translation is difficult and challenging. Third, from the cross-cultural communication perspective, the common and frequent use of literal translation is a sign of poor strategies being followed in translation training. Based on the example of Bush's speech, this paper hopes to highlight the weak practices in translation in cross-cultural communication that are still commonly used across the world. Translation studies have to take such issues seriously and attempt to find a solution. In every language, there are words and phrases with cultural, historical and social meanings woven into the language. Literal translation is not the solution to this problem because that strategy is unable to convey these meanings in the target language.

Energy Recovery Potential from Food Waste and Yard Waste in New York and Montréal

Landfilling of organic waste is still the predominant waste management method in the USA and Canada. Strategic plans for diverting waste from landfills are needed to increase material recovery and energy generation from waste. In this paper, we carried out a statistical survey of waste flows in the two cities of New York and Montréal and estimated the energy recovery potential in each case. Data collection and analysis for organic waste (food waste, yard waste, etc.), paper and cardboard, metal, glass, plastic, carton, textile, electronic products and other materials were based on reports published by the Department of Sanitation in New York and the Service de l'Environnement in Montréal. To calculate the gas generation potential of the organic waste, the Buswell equation was used, in which the molar mass of the elements was calculated from their atomic weights and the amount of organic waste in New York and Montréal. The higher and lower calorific values of the organic waste (solid basis) and of the biogas (gas basis) were also calculated. According to the results, only 19% (598 kt) and 45% (415 kt) of New York and Montréal waste, respectively, were diverted from landfills in 2017. The biogas generation potential of the generated food waste and yard waste amounted to 631 million m3 in New York and 173 million m3 in Montréal. The higher and lower calorific values of food waste were 3482 and 2792 GWh in New York and 441 and 354 GWh in Montréal, respectively; for yard waste, they were 816 and 681 GWh in New York and 636 and 531 GWh in Montréal. Considering the higher calorific value, this would mean an energy contribution of around 2.5% in these cities.
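
The Buswell calculation referred to above can be sketched as follows: given an elemental composition C_a H_h O_o N_n, the equation yields the moles of CH4 and CO2 produced per mole of substrate. The composition and tonnage in the example are illustrative assumptions, not the study's measured values.

```python
# Minimal sketch of the Buswell(-Boyle) equation used to estimate the biogas
# generation potential of organic waste. The elemental composition below is a
# hypothetical food-waste formula, not the study's data.
MOLAR_MASS = {"C": 12.011, "H": 1.008, "O": 15.999, "N": 14.007}

def buswell(a, h, o, n, mass_kg):
    """Moles of CH4 and CO2 from `mass_kg` of organic matter C_a H_h O_o N_n."""
    molar_mass = (a * MOLAR_MASS["C"] + h * MOLAR_MASS["H"]
                  + o * MOLAR_MASS["O"] + n * MOLAR_MASS["N"])
    moles = mass_kg * 1000 / molar_mass
    ch4 = moles * (a / 2 + h / 8 - o / 4 - 3 * n / 8)
    co2 = moles * (a / 2 - h / 8 + o / 4 + 3 * n / 8)
    return ch4, co2

# Example: 1 tonne of waste with a hypothetical composition C18H26O10N.
ch4, co2 = buswell(18, 26, 10, 1, mass_kg=1000)
molar_volume = 0.0224                      # m^3/mol at standard conditions
print(f"CH4: {ch4 * molar_volume:.0f} m^3, CO2: {co2 * molar_volume:.0f} m^3")
print(f"Methane fraction: {ch4 / (ch4 + co2):.0%}")
```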

Evaluation of Numerical Modeling of Jet Grouting Design Using in situ Loading Test

Jet grouting (JG) is a method of improving and increasing the strength and bearing capacity of soil in which high-pressure water or grout is injected into the soil through nozzles. During this process, part of the soil and grout comes back out of the drill borehole while the rest is mixed with the grout in place; the result is a mass of modified soil. The purpose of the method is to turn the soil into a mixture of soil and cement, commonly known as "soil-cement". In this paper, the principles of high-pressure injection and the effective parameters of the JG method are first described. Then, tests on samples taken from columns exposed by excavating around the soil-cement columns, as well as a static loading test on a created column, are discussed. In another part of the paper, the soil behavior models used for numerical modeling in the PLAXIS software are presented. The purpose of this paper is to evaluate the results of the numerical modeling against the in-situ static loading tests. The results indicate acceptable agreement between the tests and the modeling. Modeling with this software can therefore be used as an appropriate option for assessing the technical feasibility of soil improvement using JG.

Survey of Epidemiology and Mechanisms of Badminton Injury Using Medical Check-Up and Questionnaire of School Age Badminton Players

Badminton is a racket sport that requires repetitive overhead motion, with the shoulder in abduction/external rotation, and requires players to perform jumps, lunges, and quick directional changes. These characteristics can place stress on several body regions and cause badminton injuries. Among racket-sport players, including badminton players, no studies have used medical check-ups to evaluate the epidemiology and mechanisms of injuries. In addition, the epidemiology of badminton injury in school-age badminton players is unknown. The first purpose of this study was to investigate badminton injuries, physical fitness parameters, and the intensity of shoulder pain using medical check-ups, so that the mechanisms of shoulder injuries might be revealed. The second purpose was to survey the distribution of badminton injuries in elementary-school-age players so that injury prevention can be implemented as early as possible. The results revealed that shoulder pain occurred in all players, and players with present shoulder pain had lower body weight, greater shoulder external rotation (ER) gain, significantly smaller upper-limb circumference and greater trunk extension. Identifying players with these specific factors may enhance the prevention of badminton injury. This study also shows that there are high incidences of knee, ankle, plantar, and shoulder injury or pain in elementary-school-age badminton players. An injury prevention program might therefore be implemented for elementary-school-age players.

Effect of Plant Nutrients on Anthocyanin Content and Yield Component of Black Glutinous Rice Plants

The cultivation of black glutinous rice rich in anthocyanins can provide great benefits to both farmers and consumers. The total anthocyanin content and yield components of a black glutinous rice cultivar (KHHK) grown with the addition of mineral elements (Ca, Mg, Cu, Cr, Fe and Se) under soilless conditions were studied. Ca application increased seed anthocyanin content threefold compared to controls. Cu application gave the highest number of grains per panicle, the greatest panicle length and, consequently, high panicle weight. Se application had the largest effect on leaf anthocyanin content, the number of tillers, the number of panicles and 100-grain weight. These findings show that the addition of mineral elements had a positive effect on increasing anthocyanin content in black rice plants and seeds as well as on enhancing the growth of black glutinous rice plants.

Secured Mutual Authentication Protocol for Radio Frequency Identification Systems

Radio Frequency Identification (RFID) is a booming technology that uses radio frequency to track objects. It transmits signals between tag and reader to fetch information from a tag with a unique serial identity. The general drawbacks of RFID technology are high cost, high power consumption and weak authentication between reader and tag. The proposed mutual authentication protocol uses less dynamic power by employing reversible truncated multipliers, implemented in the RFID tag and reader, to reduce both leakage and dynamic power consumption. The proposed system was simulated using Xilinx and Cadence tools.

Fatal Road Accident Causer's Driving Aptitude in Hungary

Those causing fatal traffic accidents are traumatized, which negatively influences their cognitive functions and their personality. To clarify how much the trauma of causing a fatal accident affects driving skills and personality traits, the results of a psychological aptitude test and a personality test were compared between drivers who carelessly caused fatal accidents and drivers who did not cause any accidents. The sample (N = 354) consists of randomly selected drivers from the Transportation Aptitude and Examination Centre database who caused fatal accidents (Fatal group, n = 177) or did not cause accidents (Control group, n = 177). The aptitude tests were taken between 2014 and 2019. The two groups were compared in 3 respects: 1. categories of aptitude (suitable, restricted, unsuited); 2. categories of causes (ability, personality, ability and personality) within the restricted or unsuited (together: non-suitable) subgroups; 3. categories of ability and personality within the non-suitable subgroups regardless of the cause category. Within ability deficiency, the two groups include those whose ability factor is impaired or limited; the same holds for personality failure. Compared to the control group, the number of restricted drivers causing fatal accidents is significantly higher (p < .001) and the number of unsuited drivers is higher at a tendency level (p = .06). Compared to the control group, in the fatal non-suitable subgroup the proportion of restricted suitability and unsuitability due exclusively to ability factors is significantly lower (p < .001), while restricted suitability and unsuitability due to personality factors are significantly more frequent (p < .001). Incapacity due to the combination of ability and personality factors is also significantly higher in the fatal group (p = .002). Compared to the control group, both ability and personality factors are also significantly higher in the fatal non-suitable subgroup (p < .001). Overall, the control group is more eligible for driving than drivers who have caused fatalities. Ability and personality factors are significantly more frequent among fatal-accident causers who are non-suitable for driving. Moreover, the concomitance of ability and personality factors occurs almost exclusively among drivers who caused fatal accidents. Further investigation is needed to understand the causes and whether the aptitude test results of the fatal group could improve over time.
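
The group comparisons reported above amount to tests on contingency tables of aptitude categories. A minimal, purely illustrative chi-square test of independence is sketched below; the counts are hypothetical and are not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Illustrative 2x3 contingency table (hypothetical counts, not the study's):
# rows = Fatal group, Control group; columns = suitable, restricted, unsuited.
table = np.array([[120, 45, 12],
                  [155, 18,  4]])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```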

Computational Model for Prediction of Soil-Gas Radon-222 Concentration in Soil-Depths and Soil Grain Size Particles

The percentage of soil-gas radon-222 (222Rn) from different soil depths that contributes to the outdoor atmospheric radon level depends largely on physical parameters of the soil. To determine this dependence, survey tests were carried out on soil depth and grain size using in-situ measurements of soil-gas radon-222 concentration at different depths. The measurements were made with an electronic active radon detector (RAD-7) manufactured by Durridge Company, USA. Radon-222 concentrations in soil gas were measured at four soil depths of 20, 40, 60 and 100 cm in five feasible locations. At each depth, soil samples were collected with a soil grasp sampler for grain-size analysis. The results showed that the highest radon-222 concentration (24,680 ± 1960 Bqm-3) was measured at 100 cm depth with the largest grain-size fraction of 17.64%, while the lowest concentration (7370 ± 1139 Bqm-3) was measured at 100 cm depth with the smallest grain-size fraction of 10.75%. A computational model was then derived using the SPSS regression package. This model could serve as a yardstick for predicting soil-gas radon concentration with reference to soil grain size at different soil depths.
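
The SPSS regression step can be mirrored with an ordinary least-squares fit of radon concentration on depth and grain-size fraction, as sketched below. The data points are placeholders loosely echoing the reported extremes, not the study's measurements, so the fitted coefficients are for illustration only.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative measurements (depth in cm, grain-size fraction in %, 222Rn in Bqm-3);
# these numbers are placeholders, not the study's field data.
depth = np.array([20, 40, 60, 100, 20, 40, 60, 100])
grain = np.array([11.2, 12.5, 14.0, 17.6, 10.8, 11.5, 12.9, 10.8])
radon = np.array([8200, 11500, 16800, 24680, 7600, 9900, 13100, 7370])

X = np.column_stack([depth, grain])
model = LinearRegression().fit(X, radon)
print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("R^2:", model.score(X, radon))
print("predicted 222Rn at 80 cm, 15% grain fraction:", model.predict([[80, 15.0]])[0])
```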

Infrastructure Change Monitoring Using Multitemporal Multispectral Satellite Images

The main objective of this study is to find a suitable approach to monitoring land infrastructure growth over time using multispectral satellite images. Bi-temporal change detection methods are unable to capture continuous change occurring over a long period. To achieve this objective, the approach used here estimates a statistical model from a series of multispectral images acquired over a long period, assuming there is no considerable change during that period, and then compares it with a multispectral image obtained at a later time. The change is estimated pixel-wise. A statistical composite hypothesis testing technique is used for pixel-based change detection in a defined region: the generalized likelihood ratio test (GLRT) detects changed pixels from the probabilistic model estimated for each corresponding pixel. The changed pixels are detected under the assumption that the images have been co-registered prior to estimation; to minimize co-registration error, the 8-neighborhood pixels around the pixel under test are also considered. Multispectral images from Sentinel-2 and Landsat-8 from 2015 to 2018 are used for this purpose. There are several challenges in this method. The first and foremost is obtaining a sufficiently large number of datasets for multivariate distribution modelling, since many images must be discarded due to cloud coverage. Imperfect modelling also leads to a high probability of false alarms. The overall conclusion that can be drawn from this work is that the probabilistic method described in this paper has given some promising results, which need to be pursued further.
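
A simplified, per-pixel version of the described test can be sketched as follows: the reference period is modelled as a multivariate Gaussian per pixel, and the plug-in GLRT statistic then reduces to a squared Mahalanobis distance thresholded against a chi-square quantile. The 8-neighborhood handling and cloud screening used in the paper are omitted, and the data are random placeholders.

```python
import numpy as np
from scipy.stats import chi2

def change_map(stack, new_img, alpha=0.01):
    """Pixel-wise change detection against a multivariate Gaussian reference model.

    stack:   (T, H, W, B) multispectral time series assumed change-free
    new_img: (H, W, B) later acquisition, co-registered to the stack
    Returns a boolean (H, W) map where the test statistic exceeds the
    chi-square threshold at significance level alpha.
    """
    T, H, W, B = stack.shape
    mu = stack.mean(axis=0)                              # per-pixel mean spectrum
    diff = stack - mu
    cov = np.einsum("thwi,thwj->hwij", diff, diff) / (T - 1)  # per-pixel band covariance
    cov += 1e-6 * np.eye(B)                              # regularise for invertibility
    inv = np.linalg.inv(cov)
    d = new_img - mu
    stat = np.einsum("hwi,hwij,hwj->hw", d, inv, d)      # squared Mahalanobis distance
    return stat > chi2.ppf(1 - alpha, df=B)

# Example with random placeholder data (20 dates, 64x64 pixels, 4 bands)
rng = np.random.default_rng(0)
stack = rng.normal(size=(20, 64, 64, 4))
later = rng.normal(size=(64, 64, 4))
later[10:20, 10:20] += 5.0                               # inject a synthetic change
print("changed pixels:", change_map(stack, later).sum())
```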

Digestibility in Yankasa Rams Fed Brachiaria ruziziensis – Centrosema pascuorum Hay Mixtures with Concentrate

This study investigated the digestibility of Brachiaria ruziziensis and Centrosema pascuorum hay mixtures at varying proportions in Yankasa rams. Twelve Yankasa rams with an average initial weight of 10.25 ± 0.1 kg were assigned to three dietary treatments of B. ruziziensis and C. pascuorum hay at different mixtures (75BR:25CP, 50BR:50CP and 25BR:75CP, respectively) in a Completely Randomized Design (CRD) for a period of 14 days. A concentrate diet was given to the experimental animals as a supplement at a fixed proportion, while the forage mixture (basal diet) was fed at 3% of body weight. Animals on 50BR:50CP had better nutrient digestibility (crude protein, acid and neutral detergent fibre, ether extract and nitrogen-free extract) than the other treatment diets, except for dry matter digestibility (87.35%), which was comparable with the 87.54% obtained on the 25BR:75CP diet, and organic matter digestibility. All nitrogen balance parameters except nitrogen retained were significantly higher (P < 0.05) in animals fed the 25BR:75CP diet, but were statistically similar to the values obtained for animals on the 50BR:50CP diet. From the results obtained in this study, it is concluded that the 25BR:75CP mixture gave the best nutrient digestibility and nitrogen balance in Yankasa rams. It is therefore recommended that B. ruziziensis and C. pascuorum be fed at a 50:50 mixture ratio for enhanced animal growth and performance in Nigeria.

The Forensic Swing of Things: The Current Legal and Technical Challenges of IoT Forensics

The inability of organizations to put in place management control measures for Internet of Things (IoT) complexities continues to be a risk concern, and policy makers have been left scrambling for measures to combat these security and privacy concerns. IoT forensics is a cumbersome process, as there is no standardization of IoT products and little or no historical data is stored on the devices. This paper highlights why IoT forensics is a unique adventure and brings out the legal challenges encountered in the investigation process. A quadrant model is presented to study the conflicting aspects of IoT forensics. The model analyses the effectiveness of the forensic investigation process against the admissibility and integrity of the evidence, taking into account user privacy and the providers' compliance with laws and regulations. Our analysis concludes that a semi-automated forensic process using machine learning could eliminate the human factor from the profiling and surveillance processes and hence resolve the issues of data protection (privacy and confidentiality).

Lexicon-Based Sentiment Analysis for Stock Movement Prediction

Sentiment analysis is a broad and expanding field that aims to extract and classify opinions from textual data. Lexicon-based approaches are based on the use of a sentiment lexicon, i.e., a list of words each mapped to a sentiment score, to rate the sentiment of a text chunk. Our work focuses on predicting stock price change using a sentiment lexicon built from financial conference call logs. We introduce a method to generate a sentiment lexicon based upon an existing probabilistic approach. By using a domain-specific lexicon, we outperform traditional techniques and demonstrate that domain-specific sentiment lexicons provide higher accuracy than generic sentiment lexicons when predicting stock price change.
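
A minimal sketch of lexicon-based scoring is shown below; the lexicon entries and the up/down decision rule are illustrative only, not the domain-specific lexicon learned from conference-call logs in this work.

```python
import re

# Tiny illustrative domain lexicon: word -> sentiment score. A real financial
# lexicon built from conference-call logs would contain thousands of entries.
LEXICON = {"growth": 1.0, "beat": 0.8, "strong": 0.6,
           "miss": -0.8, "decline": -1.0, "weak": -0.6}

def sentiment_score(text, lexicon=LEXICON):
    """Average lexicon score over the matched words in a text chunk."""
    tokens = re.findall(r"[a-z']+", text.lower())
    hits = [lexicon[t] for t in tokens if t in lexicon]
    return sum(hits) / len(hits) if hits else 0.0

call_excerpt = "Revenue growth was strong this quarter despite a decline in margins."
score = sentiment_score(call_excerpt)
print(score, "-> predict up" if score > 0 else "-> predict down/flat")
```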