Readiness of Military Professionals for Challenging Situations

The article deals with the readiness of military professionals for challenging situations. It discusses the heightened demands on the psychological endurance of military professionals arising from the specific nature of the military occupation, in which it is notoriously difficult to maintain the regular alternation of work and rest that occupational hygiene prescribes. The soldier must be able to sustain, over the long term, a constantly intense level of performance that approaches the limits of human tolerance to stress. A challenging situation is always associated with overcoming difficulties, obstacles and complicated circumstances, or with using unusual methods, ways and means to achieve the desired (expected) objectives, perform a given task or satisfy an important need. This paper describes the categories of challenging situations, their classification and characteristics. Attention is also paid to the formation of personality in challenging situations, coping with stress, the phases of resolving stressful situations, and resistance to challenging life situations and its factors. Finally, the article focuses on increasing the readiness of military professionals for challenging situations.

Contention Window Adjustment in IEEE 802.11-Based Industrial Wireless Networks

The use of wireless technology in industrial networks has gained considerable attention in recent years. In this paper, we thoroughly analyze the effect of contention window (CW) size on the performance of IEEE 802.11-based industrial wireless networks (IWNs) from the delay and reliability perspectives. Results show that the default values of CWmin, CWmax, and retry limit (RL) are far from optimal for industrial applications, whose characteristics include short packets and noisy environments. We therefore propose an adaptive, payload-dependent CW algorithm that minimizes the average delay. Finally, a simple but effective CW and RL setting is proposed for industrial applications; it outperforms the minimum-average-delay solution in terms of maximum delay and jitter, at the cost of a slightly higher average delay. Simulation results show improvements of up to 20%, 25%, and 30% in average delay, maximum delay, and jitter, respectively.
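The CWmin/CWmax/RL interplay the abstract tunes follows the standard 802.11 DCF binary exponential backoff, which can be sketched as below. The sketch is illustrative only: it uses the common 802.11 default values, not the optimized settings the paper derives, and the success probability is an invented stand-in for channel noise.

```python
import random

# Illustrative IEEE 802.11 DCF-style binary exponential backoff.
# CW_MIN, CW_MAX and RETRY_LIMIT mirror the CWmin/CWmax/RL parameters
# the paper tunes; these are common defaults, not the proposed settings.
CW_MIN = 15
CW_MAX = 1023
RETRY_LIMIT = 7

def backoff_slots(attempt):
    """Draw a backoff count after `attempt` failed transmissions (0-based).

    The window doubles on each failure and is capped at CW_MAX."""
    cw = min((CW_MIN + 1) * (2 ** attempt) - 1, CW_MAX)
    return random.randint(0, cw)

def transmit(p_success):
    """Return (delivered, attempts, total_backoff_slots) for one frame."""
    total_slots = 0
    for attempt in range(RETRY_LIMIT + 1):
        total_slots += backoff_slots(attempt)
        if random.random() < p_success:
            return True, attempt + 1, total_slots
    return False, RETRY_LIMIT + 1, total_slots  # frame dropped after RL tries

delivered, attempts, slots = transmit(p_success=0.7)
```

With short industrial packets, the backoff slots dominate the frame time, which is why tuning CWmin/CWmax/RL has such a direct effect on delay and jitter.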

Logistics Model for Improving Quality in Railway Transport

This contribution focuses on a methodology for identifying levels of quality and improving quality through a new logistics model in railway transport. It is oriented towards the application of dynamic quality models, which represent an innovative method of evaluating service quality. Through this conception, the time factor and the expected and perceived quality at each moment of the transportation process within the logistics chain can be taken into account. Various models describe quality improvement with an emphasis on the time factor throughout the whole transportation logistics chain. The quality of services in railway transport can be determined from the existing level of service quality by detecting the causes of dissatisfaction among employees as well as customers, thereby uncovering strengths and weaknesses. The new logistics model is able to recognize critical processes in the logistics chain. It includes a service quality rating that must respect the specific properties of services, namely their unrepeatability, intangibility, consumption at the moment they are provided and, in particular, their changeability, which is a significant factor in the conditions of rail transport as well. These peculiarities influence service quality in the face of constantly increasing requirements, and they call for new, progressive approaches to service quality rating.

Dynamic Model Conception of Improving Services Quality in Railway Transport

This article describes the results of research focused on the quality of railway freight transport services. Improving these services is of crucial importance when customers consider the future use of railway transport. Processes for fulfilling customer demands and assessing output quality were defined as part of the research. This contribution introduces the quality planning map and the algorithm of the applied methodology. It characterizes a model that takes the nature of transportation into account by linking the perceived service quality in ordinary and extraordinary operation. Despite the fact that rail freight transport has a solid position in the transport market, many carriers worldwide have been experiencing stagnation for several years. The specific results of the research are therefore of significant importance and belong to the numerous initiatives aimed at developing and supporting railway transport, not only by creating a single railway area or reducing noise but also by promoting railway services. This contribution also focuses on the application of dynamic quality models, which represent an innovative method of evaluating service quality. Through this conception, the time factor and the expected and perceived quality at each moment of the transportation process can be taken into account.

Reading against the Grain: Transcodifying Stimulus Meaning

The paper shows that in transferring sense from the SL to the TL, the translator’s reading against the grain determines the creation of a faulty pattern of rendering the original meaning in the receiving culture, one which reflects the use of misleading transformative codes. In this case, the translator is a writer per se who decides what goes in and out of the book, how the style is to be ciphered and what elements of ideology are to be highlighted. The paper also proves that figurative language must not be flattened for the sake of clarity or naturalness. The missing figurative elements make the translated text less interesting, less challenging and less vivid, which reflects poorly on the writer. There is a close connection between style and the writer’s person; if the writer’s style is much altered in a translation, the translation is useless, as the original writer and his/her imaginative world can no longer be discovered. The purpose of the paper is to prove that adaptation is a dangerous tool which leads to variants that sometimes reflect the original less than the reader would wish. It contradicts the very essence of the process of translation, which is that of making an original work available in a foreign language. If the adaptive transformative codes are so flexible that they encourage the translator to repeatedly leave out parts of the original work, then a subversive pattern emerges which changes the entire book. In conclusion, as a result of using adaptation, manipulative or subversive effects are created in the translated work. This is generally achieved by adding new words or connotations, creating new figures of speech or using explicitations. The additional meanings of the original work are neglected, and the translator creates new meanings, implications, emphases and contexts. Again, s/he turns into a new author who enjoys the freedom of expressing his/her own ideas without the constraints of the original text.
Reading against the grain is inadvisable during the process of translation; consequently, following common sense becomes as essential in the field of translation as everywhere else, so that translation does not become a source of fantasy.

Predicting Bridge Pier Scour Depth with SVM

Prediction of the maximum local scour is necessary for the safe and economical design of bridges. A number of equations have been developed over the years to predict local scour depth using laboratory data, and a few pier equations have also been proposed using field data. Most of these equations are empirical in nature, as past publications indicate. In this paper, attempts have been made to compute the local depth of scour around a bridge pier, in dimensional and non-dimensional form, using linear regression, simple regression and support vector machines (SVM, with polynomial and radial basis function (RBF) kernels), along with a few conventional empirical equations. The outcome of this study suggests that SVM-based modeling can be employed as an alternative to linear regression, simple regression and the conventional empirical equations in predicting the scour depth of bridge piers. The results of the present study further indicate that SVM performs better on the non-dimensional form of the bridge pier scour data than on the dimensional form.
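For reference, the two SVM kernel functions compared in the study (polynomial and RBF) can be written out directly. The hyperparameter values below are illustrative placeholders, not the values tuned in the paper, and the sample feature vectors are invented:

```python
import math

# The two kernels behind SVM (Poly & Rbf), written out explicitly.
# gamma, degree and coef0 are the usual kernel hyperparameters.

def poly_kernel(x, z, degree=3, gamma=1.0, coef0=1.0):
    """Polynomial kernel: K(x, z) = (gamma * <x, z> + coef0) ** degree."""
    dot = sum(a * b for a, b in zip(x, z))
    return (gamma * dot + coef0) ** degree

def rbf_kernel(x, z, gamma=1.0):
    """RBF kernel: K(x, z) = exp(-gamma * ||x - z||**2)."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * sq_dist)

# Example: similarity between two hypothetical non-dimensional
# scour-predictor points (the feature choice is illustrative only).
x, z = [0.4, 1.2], [0.5, 1.0]
k_p, k_r = poly_kernel(x, z), rbf_kernel(x, z)
```

An SVM regressor replaces raw dot products with one of these kernels, which is what lets it fit the non-linear scour relationships that the linear models miss.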

Analysis of the Key Indicators of Sustainable Tourism: A Case Study in Lagoa da Confusão, Brazil

From the start, the importance of having a plan to sustain tourism was acknowledged, and the correct methods to monitor that type of tourism have been researched. Thus, in this work we propose to analyze the applicability of a monitoring and assistance method to the understanding of tourism sustainability in a small-sized destination or getaway. In this study, the subject is Lagoa da Confusão, in the state of Tocantins, and the analysis was carried out through the efficiency of the local indicators, according to the SWOT approach. We concluded that the sustainable tourism key indicators analyzed proved to be important evaluation and quantification tools for the tasks proposed for the destination in question. This is a study of an interdisciplinary character, and the deductive method was chosen as its guiding line.

A Review on Applications of Evolutionary Algorithms to Reservoir Operation for Hydropower Production

Evolutionary Algorithms (EAs), rooted in the theory of evolution, have been used widely to discover acceptable solutions to challenges such as natural resources management, as well as to solve varied problems in the real world. EAs have been rapidly adopted for their ease in handling multi-objective problems. Reservoir operation is a vital research area that has been studied over the last few decades owing to the limited nature of water resources, found mostly in the semi-arid regions of the world. In some developing economies that depend for their overall development on electricity from hydropower, a renewable form of energy, the situation is appalling due to water scarcity. This paper presents a review of the applications of evolutionary algorithms to reservoir operation for hydropower production. The review covers areas such as genetic algorithms, differential evolution and reservoir operation, and it identifies the research gaps in these areas. The results of this study should be an eye-opener for researchers and decision makers to think deeply about the adverse effects of water scarcity and drought on the economic development of a nation. It thus becomes imperative to identify evolutionary algorithms that can address this issue, which can otherwise hamper effective hydropower generation.
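One of the surveyed algorithm families, differential evolution, fits in a few lines. In the sketch below the toy sphere objective merely stands in for a real reservoir-release model, all parameter values are illustrative, and the usual guarantee that at least one gene is mutated during crossover is omitted for brevity:

```python
import random

# Minimal DE/rand/1/bin differential evolution sketch on a toy objective.
# F is the mutation factor and CR the crossover rate (standard DE names).

def sphere(x):
    """Placeholder objective; a reservoir model would supply its own cost."""
    return sum(v * v for v in x)

def differential_evolution(obj, dim=3, pop_size=20, F=0.5, CR=0.9,
                           generations=200, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct donors other than the target vector i.
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            # Mutation v = a + F*(b - c), then binomial crossover with target.
            trial = [
                a[k] + F * (b[k] - c[k]) if rng.random() < CR else pop[i][k]
                for k in range(dim)
            ]
            if obj(trial) <= obj(pop[i]):  # greedy one-to-one selection
                pop[i] = trial
    return min(pop, key=obj)

best = differential_evolution(sphere)
```

In a reservoir setting, each vector would encode a release schedule and the objective would score hydropower output against storage and demand constraints.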

Upon Further Reflection: More on the History, Tripartite Role, and Challenges of the Professoriate

This paper expands on the role of the professor by detailing the origins of the profession and adding some of the unique contributions of North American universities, as well as some best-practice recommendations for the unique tripartite role of the professor. It describes current challenges to the profession, including the ever-controversial student rating of professors. It continues with the significance of empowerment to the role of the professor. It concludes with a predictive prescription for the future of the professoriate and the role of the university-level educational administrator toward that end.

Spatio-Temporal Data Mining with Association Rules for Lake Van

Throughout history, people have made estimates and inferences about the future by using their past experiences. Developing information technologies and improvements in database management systems make it possible to extract useful information from the knowledge at hand for strategic decisions, and different methods have been developed to do so. Data mining by association rule learning is one such method. The Apriori algorithm, one of the well-known association rule learning algorithms, is not commonly used on spatio-temporal data sets. However, it is possible to embed time and space features into the data sets and make Apriori a suitable data mining technique for learning spatio-temporal association rules. Lake Van, the largest lake in Turkey, is a closed basin. This feature causes the volume of the lake to increase or decrease as the amount of water it holds changes. In this study, the evaporation, humidity, lake altitude, rainfall and temperature parameters recorded in the Lake Van region over the years are used by the Apriori algorithm, and a spatio-temporal data mining application is developed to identify overflows and newly formed soil regions (underflows) occurring in the coastal parts of Lake Van. Identifying the possible causes of overflows and underflows can be used to alert experts to take precautions and make the necessary investments.
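The Apriori procedure described above can be sketched over discretized spatio-temporal records, where each "transaction" collects the binned observations for one period. The bin labels and support threshold below are hypothetical, not the paper's actual coding of the Lake Van data:

```python
# Minimal Apriori sketch: find all itemsets meeting a support threshold
# by growing frequent k-itemsets into (k+1)-itemset candidates.

def apriori(transactions, min_support):
    n = len(transactions)

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t) / n

    items = {frozenset([i]) for t in transactions for i in t}
    frequent = {}
    level = {s for s in items if support(s) >= min_support}
    k = 1
    while level:
        frequent.update({s: support(s) for s in level})
        # Candidate generation: join frequent k-itemsets into (k+1)-itemsets.
        candidates = {a | b for a in level for b in level if len(a | b) == k + 1}
        level = {c for c in candidates if support(c) >= min_support}
        k += 1
    return frequent

# Hypothetical binned records for four periods (labels are invented).
records = [
    {'rain=high', 'humidity=high', 'overflow'},
    {'rain=high', 'humidity=high', 'overflow'},
    {'rain=low', 'humidity=low'},
    {'rain=high', 'humidity=low'},
]
freq = apriori(records, min_support=0.5)
```

Rules such as {rain=high, humidity=high} → {overflow} would then be read off the frequent itemsets by comparing their supports.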

Bridging Consumer-Farmer Mobile Application Divide

Electronic media such as websites, feeds, blogs and social media sites influence our decision making on a daily basis, improve our productivity and shape the futures of many consumers and service/product providers. This research identifies that both customers and business providers rely heavily on smartphone applications. On this basis, mobile applications available on the iTunes store were studied. It was identified that the fruit- and vegetable-related applications used by consumers can broadly be categorized into purchase applications, diaries, health tracking applications, farm trip location applications and cooking applications. On the other hand, the applications used by farmers can broadly be classified as weather tracking, pest/fertilizer applications and general social media applications such as Facebook. To blur this farmer-consumer application divide, our research utilizes the Context Specific eTransformation Framework and, based on it, identifies the characteristics that future consumer-farmer applications will need to have so that the current divide can be narrowed and, consequently, a better farmer-consumer supply chain link established.

An Institutional Analysis of IFRS Adoption in Poor Jurisdictions

The last two decades witnessed a movement towards the harmonization of International Financial Reporting Standards (IFRS) throughout the global economy. This investigation seeks to identify the factors that could explain the adoption of IFRS by poor jurisdictions. While a considerable amount of literature has been published on the effects and key drivers of IFRS adoption in both developed and developing countries, little attention has been paid exclusively to jurisdictions with less developed capital markets and low income levels. Drawing upon institutional isomorphism theory and analyzing a sample of 45 poor jurisdictions between 2008 and 2013, the study empirically shows that poor jurisdictions are driven to adopt an international accounting perspective by legitimacy concerns rather than by economic reasoning. This in turn has implications for the IASB, as it should seek to influence institutional pressures within a particular jurisdiction in order to promote IFRS adoption.

Sorption of Charged Organic Dyes from Anionic Hydrogels

Hydrogels are three-dimensional, hydrophilic, polymeric networks composed of homopolymers or copolymers; they are insoluble in water due to the presence of chemical or physical cross-links. When hydrogels come into contact with aqueous solutions, they can effectively sorb and retain the dissolved substances, depending on the nature of the monomeric units comprising the hydrogel. For this reason, hydrogels have been proposed in several studies as water purification agents. In the present work, anionic hydrogels bearing negatively charged –COO⁻ groups were prepared and investigated. These gels are based on sodium acrylate (ANa), either homopolymerized (poly(sodium acrylate), PANa) or copolymerized (P(DMAM-co-ANa)) with N,N-dimethylacrylamide (DMAM). The hydrogels were used to extract some model organic dyes from water. It was found that cationic dyes are strongly sorbed and retained by the hydrogels, while the sorption of anionic dyes was negligible. In all cases, both the maximum sorption capacity and the equilibrium binding constant varied from one dye to another, depending on the chemical structure of the dye, the presence of functional chemical groups and the hydrophobic-hydrophilic balance. Finally, a nonionic hydrogel of the homopolymer poly(N,N-dimethylacrylamide), PDMAM, was also used for comparison.
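The maximum sorption capacity and equilibrium binding constant mentioned above are commonly estimated by fitting a Langmuir isotherm to equilibrium sorption data. The sketch below uses the linearized Langmuir form Ce/qe = Ce/q_max + 1/(K_L·q_max) on synthetic data; the q_max and K_L values are invented for illustration, not measured results from the study:

```python
# Estimating Langmuir parameters (q_max, K_L) from sorption data
# via the linearized form and an ordinary least-squares line fit.

def linear_fit(xs, ys):
    """Ordinary least squares: return (slope, intercept) of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Synthetic isotherm generated from assumed "true" parameters.
q_max_true, K_true = 200.0, 0.05          # mg/g and L/mg (invented)
Ce = [10.0, 25.0, 50.0, 100.0, 200.0]     # equilibrium dye conc., mg/L
qe = [q_max_true * K_true * c / (1 + K_true * c) for c in Ce]  # mg/g

# Fit Ce/qe against Ce: slope = 1/q_max, intercept = 1/(K_L * q_max).
slope, intercept = linear_fit(Ce, [c / q for c, q in zip(Ce, qe)])
q_max = 1.0 / slope
K_L = slope / intercept
```

Because the synthetic points obey the isotherm exactly, the fit recovers the assumed parameters; with real sorption data, scatter in the fit would indicate how well the Langmuir model describes the dye-hydrogel system.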

Advantages of Fuzzy Control Application in Fast and Sensitive Technological Processes

This paper presents the advantages of using fuzzy control in the control of technological processes. It presents a real application of Linguistic Fuzzy-Logic Control (LFLC), developed at the University of Ostrava for the control of physical models in the Intelligent Systems Laboratory. The paper gives an example of a sensitive non-linear model, a magnetic levitation model, and the obtained results, which show how modern information technologies can help to solve actual technical problems. A special method based on the LFLC controller with partial components is presented, followed by a method of automatic context change, which is very helpful in achieving more accurate control results. The main advantage of the system used is its robustness under changing conditions, demonstrated by comparison with a conventional PID controller. This technology and the real models are also used as a background for problem-oriented teaching, realized at the department for master students and their collaborative as well as individual final projects.

MFCA: An Environmental Management Accounting Technique for Optimal Resource Efficiency in Production Processes

Revenue leakage is one of the major challenges manufacturers face in production processes, as much of the input material that should emerge from the lines as product is lost as waste. Rather than generating income from material input that is meant to end up as products, further losses are incurred as costs in order to manage the waste generated. In addition, due to the lack of a clear view of the flow of resources on the lines from the input to the output stage, acquiring information on the true cost of the waste generated has become a challenge. This has given birth to the conceptualization and implementation of waste minimization strategies by several manufacturing industries. This paper reviews the principles and applications of three environmental management accounting tools, namely Activity-Based Costing (ABC), Life-Cycle Assessment (LCA) and Material Flow Cost Accounting (MFCA), in the manufacturing industry, and their effectiveness in curbing revenue leakage. The paper unveils the strengths and limitations of each of the tools, beaming a searchlight on the tool that could allow for optimal resource utilization, transparency in the production process and improved cost efficiency. Findings from this review reveal that MFCA may offer superior advantages with regard to the provision of more detailed information (in both physical and monetary terms) on the flow of material inputs throughout the production process compared to the other environmental accounting tools. This paper therefore makes a case for the adoption of MFCA as a viable technique for the identification and reduction of waste in production processes, and for effective decision making by production managers, financial advisors and other relevant stakeholders.

Accrual Based Scheduling for Cloud in Single and Multi Resource System: Study of Three Techniques

This paper evaluates accrual-based scheduling for the cloud in single- and multi-resource systems. Numerous organizations benefit from cloud computing by hosting their applications there: the cloud model provides on-demand access to computing with potentially unlimited resources. Scheduling is the mapping of tasks to resources towards a given optimality goal; it assigns tasks to virtual machines in accordance with adaptable time, in sequence, under transaction logic constraints. A good scheduling algorithm improves CPU utilization, turnaround time and throughput. In this paper, three real-time cloud service scheduling algorithms for single and multiple resources are investigated. Experimental results show the performance of the resource matching algorithm to be superior for both single- and multi-resource scheduling when compared to the benefit-first scheduling, migration and checkpoint algorithms.

Identification and Classification of Gliadin Genes in Iranian Diploid Wheat

Wheat is the first and most important grain in the world, and its baking quality is due to the qualities of its glutenins and gliadins. Wheat seed proteins are divided into four groups according to solubility: albumins, globulins, glutenins and prolamins (gliadins). Gliadins are major components of the storage proteins in the wheat endosperm. Little information appears to be available about gliadin genes in the Iranian wild relatives of wheat. Thus, the aim of this study was to evaluate wheat wild relatives, collected from different origins in the Zagros Mountains of Iran, for gliadin-coding genes using specific primers. Forty accessions of Triticum boeoticum and Triticum urartu were selected for this study. For each accession, genomic DNA was extracted and PCRs were performed in total volumes of 15 μl. The amplification products were separated on 1.5% agarose gels. For the Gli-2A locus, three allelic variants were detected with the Gli-2As primer pairs; the sizes of the PCR products for these alleles were 210, 490 and 700 bp. Only five accessions (13%) and two accessions (5%) produced the 700 and 490 bp fragments, respectively, when their DNA was amplified with the Gli.As.2 primer pairs. By contrast, 93% of the accessions carried the 210 bp allele, and only 8% produced no product for this marker. This germplasm could therefore be used as a rich gene pool to broaden the genetic base of bread wheat.

Enhanced Imperialist Competitive Algorithm for the Cell Formation Problem Using Sequence Data

The Imperialist Competitive Algorithm (ICA) is a recent meta-heuristic, inspired by socio-political evolution, for solving NP-hard problems. ICA is a population-based algorithm that has achieved great performance in comparison to other metaheuristics. This study develops an enhanced ICA approach to solve the Cell Formation Problem (CFP) using sequence data. In addition to the conventional ICA, an enhanced version, EICA, applies local search techniques to add more intensification aptitude and to embed the features of exploration and intensification more successfully. Suitable performance measures are used to compare the proposed algorithms with some other powerful solution approaches in the literature. To check the proficiency of the algorithms, forty test problems are presented: five benchmark problems have sequence data, and the others are based on 0-1 matrices modified into sequence-based problems. Computational results elucidate the efficiency of the EICA in solving CFP instances.
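The core ICA mechanic, colonies being assimilated toward their imperialists, can be sketched on a toy continuous objective. This is not the paper's CFP encoding: the sphere objective and all parameter values are invented for illustration, and the revolution and inter-empire competition steps of a full ICA are omitted:

```python
import random

# Minimal sketch of ICA initialization and the assimilation step.
# beta is the usual assimilation coefficient.

def sphere(x):
    """Placeholder objective; the paper's CFP uses sequence-based measures."""
    return sum(v * v for v in x)

def ica_sketch(obj, dim=2, n_countries=30, n_imperialists=3,
               beta=2.0, iterations=150, seed=0):
    rng = random.Random(seed)
    countries = [[rng.uniform(-5, 5) for _ in range(dim)]
                 for _ in range(n_countries)]
    for _ in range(iterations):
        countries.sort(key=obj)             # best countries become imperialists
        imperialists = countries[:n_imperialists]
        # Assimilation: each colony takes a random step toward its imperialist.
        for i in range(n_imperialists, n_countries):
            imp = imperialists[i % n_imperialists]
            colony = countries[i]
            countries[i] = [c + beta * rng.random() * (p - c)
                            for c, p in zip(colony, imp)]
    return min(countries, key=obj)

best = ica_sketch(sphere)
```

The EICA of the paper additionally runs local search around promising countries, which strengthens exactly this intensification phase.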

Transmission Performance Analysis for Live Broadcasting over IPTV Service in Telemedicine Applications

Health care must be a right for people around the world, but in order to guarantee access for all, it is necessary to overcome geographical barriers. Telemedicine takes advantage of information and communication technologies to deploy health care services around the world. To achieve those goals, it is necessary to use existing last-mile solutions to create access for home users, which is why the channel characteristics for those kinds of services must be established. This paper presents an analysis of the network performance of a last-mile solution for IPTV broadcasting, with streaming applied to telemedicine apps.

MCDM Spectrum Handover Models for Cognitive Wireless Networks

Spectrum handover is a significant topic in cognitive radio networks for assuring efficient data transmission in the cognitive radio users’ communications. This paper proposes a comparison between three spectrum handover models: VIKOR, SAW and MEW. Four evaluation metrics are used: the accumulative averages of failed handovers, of handovers performed, of transmission bandwidth and of transmission delay. In contrast to related work, the performance of the three spectrum handover models was validated with captured data on spectrum occupancy from experiments performed in the GSM frequency band (824 MHz - 849 MHz). These data represent the actual behavior of the licensed users of this wireless frequency band. The results of the comparison show that the VIKOR algorithm provides a 15.8% performance improvement compared to the SAW algorithm and is 12.1% better than the MEW algorithm.
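Of the three compared models, SAW (Simple Additive Weighting) is the simplest to illustrate: each candidate channel's metric values are normalized and combined as a weighted sum, and channels are ranked by that score. The channel data and weights below are invented for illustration, not the measured GSM-band occupancy data:

```python
# SAW (Simple Additive Weighting) ranking sketch for handover candidates.

def saw_rank(alternatives, weights, benefit):
    """Rank alternatives (dicts of criterion -> value) by weighted sum.

    benefit[c] is True when larger is better (e.g. bandwidth) and False
    when smaller is better (e.g. delay); values are max/min normalized."""
    criteria = list(weights)
    scores = []
    for alt in alternatives:
        s = 0.0
        for c in criteria:
            vals = [a[c] for a in alternatives]
            if benefit[c]:
                norm = alt[c] / max(vals)      # benefit criterion
            else:
                norm = min(vals) / alt[c]      # cost criterion
            s += weights[c] * norm
        scores.append(s)
    # Indices of alternatives, best score first.
    return sorted(range(len(alternatives)), key=lambda i: -scores[i])

# Hypothetical candidate channels (values and weights are illustrative).
channels = [
    {'bandwidth': 2.0, 'delay': 30.0},   # channel 0
    {'bandwidth': 1.0, 'delay': 10.0},   # channel 1
    {'bandwidth': 1.8, 'delay': 12.0},   # channel 2
]
order = saw_rank(channels,
                 weights={'bandwidth': 0.5, 'delay': 0.5},
                 benefit={'bandwidth': True, 'delay': False})
```

VIKOR and MEW differ mainly in the aggregation step (compromise ranking and weighted product, respectively), but consume the same normalized decision matrix.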