Understanding Grip Choice and Comfort Whilst Hoovering

The hand is one of the essential parts of the body for carrying out Activities of Daily Living (ADLs). Individuals use their hands and fingers in everyday activities in both the workplace and the home. Hand-intensive tasks require diverse and sometimes extreme levels of exertion, depending on the action, movement or manipulation involved. The authors have undertaken several studies looking at grip choice and comfort, in the hope that an improved understanding of discomfort during ADLs will aid the design of consumer products. Previous work by the authors outlined a methodology for calculating pain frequency and pain level for a range of tasks. In an online survey undertaken by the authors on manipulating objects during everyday tasks, tasks involving gripping were seen to produce the highest levels of pain and discomfort. Questioning of the participants showed that cleaning tasks were the ADLs that produced the highest levels of discomfort, with women feeling higher levels of discomfort than men. This paper examines the methodology for calculating pain frequency and pain level with particular regard to gripping activities. The methodology shows that activities such as mopping, sweeping and hoovering produce the highest pain frequency and pain level, at 3112.5 occurrences per month with a pain level per person of 0.78. The study then uses thin-film force sensors to analyse the force distribution in the hand whilst hoovering, comparing grip styles and genders. Women were seen to have more of their hand under higher pressure than men when hoovering. This suggests that women may feel greater discomfort than men, since their hand is under higher pressure more of the time.
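To make the calculation concrete, the following is a minimal sketch of how pain frequency and pain level could be aggregated from survey responses; the field names and the aggregation scheme are illustrative assumptions, not the authors' exact methodology.

    from dataclasses import dataclass

    @dataclass
    class SurveyResponse:
        task: str             # e.g. "hoovering"
        times_per_month: int  # how often the respondent performs the task
        pain_rating: float    # reported pain on the survey's scale

    def pain_statistics(responses, task):
        rows = [r for r in responses if r.task == task]
        if not rows:
            return 0.0, 0.0
        # Pain frequency: total monthly occurrences of the task across respondents.
        frequency = sum(r.times_per_month for r in rows)
        # Pain level: mean reported pain per person performing the task.
        level = sum(r.pain_rating for r in rows) / len(rows)
        return frequency, level

    data = [SurveyResponse("hoovering", 12, 0.9), SurveyResponse("hoovering", 8, 0.6)]
    print(pain_statistics(data, "hoovering"))   # (20, 0.75)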

Multi-stage Directional Median Filter

The median filter is widely used to remove impulse noise without blurring sharp edges. However, when the noise level increases, or in the presence of thin edges, the median filter may perform poorly. This paper proposes a new filter that detects edges along four possible directions and then replaces each noise-corrupted pixel with the median value estimated along the noise-free edge direction. Simulations show that the proposed multi-stage directional median filter suppresses impulse noise effectively in all the situations tested.
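A minimal sketch of the filtering idea, assuming 3x3 windows along the four principal directions; the impulse detection rule below (smallest spread around the centre pixel, plus a fixed threshold) is an illustrative stand-in for the paper's detection stage, not its exact rule.

    import numpy as np

    DIRECTIONS = [
        [(-1, 0), (0, 0), (1, 0)],    # vertical
        [(0, -1), (0, 0), (0, 1)],    # horizontal
        [(-1, -1), (0, 0), (1, 1)],   # diagonal
        [(-1, 1), (0, 0), (1, -1)],   # anti-diagonal
    ]

    def directional_median_filter(img, thresh=40):
        padded = np.pad(img.astype(float), 1, mode="edge")
        out = img.astype(float).copy()
        h, w = img.shape
        for i in range(h):
            for j in range(w):
                best_spread, best_median = None, None
                for d in DIRECTIONS:
                    vals = [padded[i + 1 + di, j + 1 + dj] for di, dj in d]
                    spread = max(vals) - min(vals)   # small spread ~ clean edge direction
                    if best_spread is None or spread < best_spread:
                        best_spread, best_median = spread, float(np.median(vals))
                # Replace only pixels that look impulse-corrupted relative to the
                # median taken along the most edge-like direction.
                if abs(padded[i + 1, j + 1] - best_median) > thresh:
                    out[i, j] = best_median
        return out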

Changes in Subjective and Objective Measures of Performance in Ramadan

The Muslim faith requires individuals to fast between the hours of sunrise and sunset during the month of Ramadan. Our recent work has concentrated on some of the changes that take place during the daytime when fasting. A questionnaire was developed to assess subjective estimates of physical, mental and social activities, and fatigue. Four days were studied: in the weeks before and after Ramadan (control days) and during the first and last weeks of Ramadan (experimental days). On each of these four days, the questionnaire was given several times during the daytime and once after the fast had been broken, just before individuals retired at night. During Ramadan, daytime mental, physical and social activities all decreased below control values but then increased to above control values in the evening. The desires to perform physical and mental activities showed very similar patterns; that is, individuals tried to conserve energy during the daytime in preparation for the evenings, when they ate and drank, often with friends. During Ramadan, individuals were also more fatigued in the daytime and napped more often than on control days. This extra fatigue probably reflected decreased sleep, individuals often having risen earlier (before sunrise, to prepare for fasting) and retired later (to enable recovery from the fast). Some physiological measures and objective measures of performance (including the response to a bout of exercise) have also been investigated. Urine osmolality fell during the daytime on control days as subjects drank, but rose in Ramadan to reach values at sunset indicative of dehydration. Exercise performance was also compromised, particularly late in the afternoon when the fast had lasted several hours. Self-chosen exercise work-rates fell, and a set amount of exercise felt more arduous. There were also changes in heart rate and lactate accumulation in the blood, indicative of greater cardiovascular and metabolic stress caused by exercise in subjects who had been fasting. Daytime fasting in Ramadan produces widespread effects that probably reflect the combined effects of sleep loss and restricted intake of water and food.

Implementation of Neural Network Based Electricity Load Forecasting

This paper proposes a novel model for short-term load forecasting (STLF) in the electricity market. The model is composed of several neural networks whose input data are processed using a wavelet technique, and it is implemented as a simulation program written in MATLAB. The prior electricity demand data are treated as time series and decomposed into several wavelet coefficient series using the wavelet transform technique known as the Non-decimated Wavelet Transform (NWT), chosen in the expectation of extracting hidden patterns from the time series data. The wavelet coefficient series are used to train the neural networks (NNs) and serve as the NN inputs for electricity load prediction. The Scaled Conjugate Gradient (SCG) algorithm is used as the learning algorithm for the NNs. To obtain the final forecast, the outputs from the NNs are recombined using the same wavelet technique. The model was evaluated with the electricity load data of the Electronic Engineering Department at Mandalay Technological University in Myanmar. The simulation results showed that the model was capable of producing reasonable forecasting accuracy in STLF.
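A minimal sketch of the pipeline in Python, assuming an hourly load series: pywt.swt is the non-decimated (stationary) wavelet transform, and since scikit-learn ships no Scaled Conjugate Gradient trainer, the 'lbfgs' solver stands in. Unlike the paper, which recombines per-band network outputs with the inverse transform, this sketch maps the stacked wavelet features directly to the next load value to stay short.

    import numpy as np
    import pywt
    from sklearn.neural_network import MLPRegressor

    def fit_and_forecast(load, level=2, window=24):
        load = np.asarray(load, dtype=float)
        n = len(load) - len(load) % 2**level          # swt needs a multiple of 2**level
        bands = [c for pair in pywt.swt(load[:n], "db4", level=level) for c in pair]

        # Features: a sliding window from every wavelet coefficient series, stacked.
        X = np.hstack([[band[i:i + window] for i in range(n - window)] for band in bands])
        y = load[window:n]

        net = MLPRegressor(hidden_layer_sizes=(32,), solver="lbfgs", max_iter=2000)
        net.fit(X[:-1], y[:-1])                       # hold out the most recent point
        return net.predict(X[-1:])[0], y[-1]          # (forecast, actual)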

SWARM: A Meta-Scheduler to Minimize Job Queuing Times on Computational Grids

Some meta-schedulers query the information systems of individual supercomputers in order to submit jobs to the least busy supercomputer on a computational Grid. However, this information can become outdated by the time a job starts because of changes in scheduling priorities. The MSR scheme, based on Multiple Simultaneous Requests, can take advantage of the opportunities that result from these priority changes. This paper presents the SWARM meta-scheduler, which can speed up the execution of large sets of tasks by minimizing job queuing time through the submission of multiple requests. Performance tests have shown that this new meta-scheduler is faster than an implementation of the MSR scheme and than the gLite meta-scheduler. SWARM has been used through the GridQTL project beta-testing portal during the past year; usage statistics are provided and demonstrate its capacity to reliably achieve a substantial reduction of execution time under production conditions.
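A minimal sketch of the multiple-simultaneous-requests idea: the same task is submitted to several sites, and as soon as one copy starts, the redundant requests are cancelled. The Scheduler interface here is hypothetical; a real deployment would wrap a batch system or Grid middleware API.

    import time

    class Scheduler:
        """Hypothetical per-site interface; implement against the real middleware."""
        def submit(self, task): ...         # returns a job id
        def has_started(self, job_id): ...  # True once the job leaves the queue
        def cancel(self, job_id): ...

    def run_with_msr(task, schedulers, poll_seconds=10):
        jobs = [(s, s.submit(task)) for s in schedulers]   # one request per site
        while True:
            for sched, job_id in jobs:
                if sched.has_started(job_id):
                    # Winner found: withdraw every other pending copy.
                    for other, other_id in jobs:
                        if (other, other_id) != (sched, job_id):
                            other.cancel(other_id)
                    return sched, job_id
            time.sleep(poll_seconds)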

An Evaluation of Carbon Dioxide Emissions Trading among Enterprises -The Tokyo Cap and Trade Program-

This study proposes three methods for evaluating the Tokyo Cap and Trade Program when emissions trading is performed virtually among enterprises, focusing on carbon dioxide (CO2), the only emitted greenhouse gas that tends to increase. The first method identifies the optimum reduction rate for the highest cost benefit; the second examines emissions trading among enterprises through market trading; and the third verifies long-term emissions trading over the term of the plan (2010-2019), checking the validity of emissions trading partly using Geographic Information Systems (GIS). The findings of this study can be summarized in three points. 1. Since the total cost benefit is greatest at a 44% reduction rate, the rate can be set higher than that of the Tokyo Cap and Trade Program to obtain a greater total cost benefit. 2. At a 44% reduction rate, among 320 enterprises, 8 purchasing enterprises and 245 selling enterprises gain profits from emissions trading, and 67 enterprises perform voluntary reduction without trading. To further promote emissions trading, it is therefore necessary to increase the number of purchasing enterprises, and with it the traded volume, in addition to the number of selling enterprises. 3. Compared to short-term emissions trading, few enterprises benefit in each year through the long-term emissions trading of the Tokyo Cap and Trade Program: at most 81 enterprises can gain profits from emissions trading in FY 2019. Setting the reduction rate higher is therefore necessary to increase the number of enterprises that participate in emissions trading and benefit from the restraint of CO2 emissions.
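A minimal sketch of the first evaluation method: scan candidate reduction rates and pick the one with the highest total cost benefit. The convex abatement cost form and the carbon price below are hypothetical stand-ins for the enterprise-level data used in the study.

    def total_cost_benefit(rate, enterprises):
        total = 0.0
        for e in enterprises:
            abated = e["baseline_co2"] * rate             # tonnes of CO2 abated
            cost = e["unit_cost"] * abated**2             # convex abatement cost (assumed form)
            benefit = e["carbon_price"] * abated          # value of allowances freed up
            total += benefit - cost
        return total

    def optimum_rate(enterprises, step=0.01):
        rates = [i * step for i in range(int(1 / step) + 1)]
        return max(rates, key=lambda r: total_cost_benefit(r, enterprises))

    firms = [{"baseline_co2": 1000.0, "unit_cost": 0.004, "carbon_price": 5.0},
             {"baseline_co2": 400.0, "unit_cost": 0.010, "carbon_price": 5.0}]
    print(optimum_rate(firms))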

System-Level Energy Estimation for SoC based on the Dynamic Behavior of Embedded Software

This paper describes a system-level method for estimating SoC energy consumption from the dynamic behavior of embedded software in the early stages of SoC development. A major problem in SoC development is rework caused by unreliable energy consumption estimates at the early stages. The energy consumption of an SoC used in embedded systems is strongly affected by the dynamic behavior of the software. At the early stages of SoC development, modeling with a high level of abstraction is required both for the dynamic behavior of the software and for the behavior of the SoC. We estimate the energy consumption by a UML model-based simulation. The proposed method is applied to an actual embedded system in an MFP. The resulting SoC energy estimates are more accurate than those of conventional methods, and the proposed method promises to reduce the chance of rework in SoC development.
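A minimal sketch of the underlying idea, assuming a behavior trace of (state, duration) pairs produced by the model-based simulation and a per-state power table for the SoC; the states and power figures are illustrative, not taken from the paper's UML model.

    # Power drawn by the SoC in each software-visible state (illustrative values).
    POWER_MW = {"idle": 12.0, "cpu_active": 180.0, "dma_transfer": 95.0, "sleep": 1.5}

    def estimate_energy_mj(trace):
        """trace: iterable of (state, duration_ms) pairs from the behavior model."""
        return sum(POWER_MW[state] * ms / 1000.0 for state, ms in trace)  # mW*s = mJ

    trace = [("sleep", 500), ("cpu_active", 120), ("dma_transfer", 40), ("idle", 300)]
    print(f"{estimate_energy_mj(trace):.1f} mJ")   # 29.8 mJ for this trace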

Weed Classification using Histogram Maxima with Threshold for Selective Herbicide Applications

Information on weed distribution within a field is necessary to implement spatially variable herbicide application. Since hand labor is costly, an automated weed control system could be feasible. This paper deals with the development of an algorithm for a real-time weed recognition system based on the histogram maxima of an image with thresholding, used for weed classification. The algorithm is specifically developed to classify images into broad-leaf and narrow-leaf classes for real-time selective herbicide application. The developed system has been tested on weeds in the laboratory, and the tests have shown it to be very effective in weed identification. Further, the results show very reliable performance on images of weeds taken under varying field conditions. The analysis shows over 95 percent classification accuracy on 140 sample images (broad and narrow), with 70 samples from each category of weeds.
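A minimal sketch of one plausible reading of histogram-maxima classification, assuming a binarized image in which vegetation pixels are 1: broad leaves concentrate their mass into tall column-histogram peaks, while narrow leaves spread it across many low ones. The feature and threshold are illustrative, not the paper's exact rule.

    import numpy as np

    def classify_weed(binary_img, thresh=0.5):
        # Column histogram: count of vegetation pixels in each image column.
        hist = binary_img.sum(axis=0).astype(float)
        peak = hist.max() / binary_img.shape[0]   # tallest column, normalized by image height
        return "broad" if peak > thresh else "narrow"

    img = np.zeros((100, 100), dtype=np.uint8)
    img[20:90, 40:60] = 1                         # one wide, solid blob
    print(classify_weed(img))                     # 'broad'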

Project Selection by Using Fuzzy AHP and TOPSIS Technique

In this article we propose a new method for the project selection problem using fuzzy AHP and the TOPSIS technique. After reviewing four common methods of comparing investment alternatives (net present value, rate of return, benefit-cost analysis and payback period), we use them as criteria in the AHP tree. The Analytic Hierarchy Process, improved with fuzzy set theory, is first used to calculate the weight of each criterion; the TOPSIS algorithm is then applied to assess the projects. The method is demonstrated on a numerical example.
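A minimal sketch of the TOPSIS ranking step, assuming the criterion weights have already been obtained (in the paper, from fuzzy AHP). Rows of the decision matrix are projects and columns are the four criteria above; all values and weights below are illustrative.

    import numpy as np

    def topsis(matrix, weights, benefit):
        """benefit[j] is True for criteria to maximize (e.g. NPV), False otherwise."""
        m = matrix / np.linalg.norm(matrix, axis=0)      # vector-normalize each column
        v = m * weights                                  # weighted normalized matrix
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_plus = np.linalg.norm(v - ideal, axis=1)       # distance to ideal solution
        d_minus = np.linalg.norm(v - anti, axis=1)       # distance to anti-ideal solution
        return d_minus / (d_plus + d_minus)              # closeness: higher is better

    projects = np.array([[120.0, 0.14, 1.6, 4.0],        # NPV, rate of return, B/C, payback
                         [95.0, 0.18, 1.4, 3.0],
                         [140.0, 0.11, 1.8, 6.0]])
    weights = np.array([0.35, 0.25, 0.25, 0.15])         # e.g. from fuzzy AHP
    scores = topsis(projects, weights, benefit=np.array([True, True, True, False]))
    print(scores.argsort()[::-1])                        # project ranking, best first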

Hiding Data in Images Using PCP

In recent years everything has been trending toward digitization, and with the rapid development of Internet technologies digital media must be transmitted conveniently over networks. Attacks, misuse and unauthorized access to information are of great concern today, making the protection of documents transmitted through digital media a priority. This urges us to devise new data hiding techniques to protect and secure data of vital significance. In this respect, steganography often comes to the fore as a tool for hiding information. Steganography is a process that involves hiding a message in an appropriate carrier such as an image or audio file. The word is of Greek origin and means "covered or hidden writing". The goal of steganography is covert communication: the carrier can be sent to a receiver without anyone except the authenticated receiver knowing that the information exists. A considerable amount of work has been carried out by different researchers on steganography. In this work the authors propose a novel steganographic method for hiding information within the spatial domain of a grayscale image. The proposed approach selects the embedding pixels using a mathematical function, finds the 8-neighborhood of each selected pixel, and maps each bit of the secret message onto a neighboring pixel coordinate position in a specified manner. Before embedding, a check is made as to whether the selected pixel or its neighbors lie on the boundary of the image. The solution is independent of the nature of the data to be hidden and produces a stego image with minimal degradation.
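An illustrative sketch in the spirit of the scheme described: anchor pixels are chosen by a simple mathematical function, message bits are written into the least significant bits of each anchor's 8-neighborhood, and anchors whose neighborhood crosses the image boundary are skipped. The selection function and LSB mapping are assumptions; the paper's PCP mapping differs.

    import numpy as np

    NEIGHBORS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                 (0, 1), (1, -1), (1, 0), (1, 1)]

    def embed(img, bits, key=7):
        # key should be coprime with the pixel count so the walk visits every
        # pixel once; a real scheme must also avoid reusing neighbors of
        # overlapping anchors, which is omitted here for brevity.
        stego = img.copy()
        h, w = img.shape
        k = 0
        for t in range(h * w):
            i, j = divmod((t * key) % (h * w), w)        # anchor selection (illustrative)
            if not (1 <= i < h - 1 and 1 <= j < w - 1):  # boundary check, as in the paper
                continue
            for di, dj in NEIGHBORS:
                if k == len(bits):
                    return stego
                stego[i + di, j + dj] = (stego[i + di, j + dj] & 0xFE) | bits[k]
                k += 1
        raise ValueError("message does not fit in this cover image")

    cover = np.random.default_rng(1).integers(0, 256, (64, 64), dtype=np.uint8)
    stego = embed(cover, [1, 0, 1, 1, 0, 0, 1, 0])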

Transmission Performance of Millimeter Wave Multiband OFDM UWB Wireless Signal over Fiber System

The performance of millimeter-wave (mm-wave) multiband orthogonal frequency division multiplexing (MB-OFDM) ultra-wideband (UWB) signal generation using a frequency quadrupling technique, and its transmission over fiber, is experimentally investigated. Frequency quadrupling is achieved using only one Mach-Zehnder modulator (MZM), biased at the maximum transmission (MATB) point. The frequency-quadrupled output is then sent to a second MZM, which is used for MB-OFDM UWB signal modulation. In this work we demonstrate 30-GHz mm-wave wireless transmission carrying three-band OFDM UWB signals, with error vector magnitude (EVM) used to analyze the transmission quality. Our proposed technique yields a 3.5 dB improvement in EVM at 40% local oscillator (LO) modulation in comparison with the technique using two cascaded MZMs biased at the minimum transmission (MITB) point.
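A small numerical illustration of why MATB biasing quadruples the drive frequency: the modulated optical field cos(m sin(wt)) contains only even-order sidebands, so square-law photodetection produces its strongest tone at four times the LO. The 10-GHz drive and modulation index below are illustrative, not the experiment's values.

    import numpy as np

    f_lo = 10e9                                   # LO drive frequency (illustrative)
    fs = 32 * f_lo                                # sample rate
    t = np.arange(4096) / fs
    m = 2.4                                       # modulation index (illustrative)

    # MZM biased at maximum transmission: only even-order optical sidebands survive.
    field = np.cos(m * np.sin(2 * np.pi * f_lo * t))
    current = field**2                            # square-law photodetection
    current -= current.mean()                     # drop the DC term

    spectrum = np.abs(np.fft.rfft(current * np.hanning(len(current))))
    freqs = np.fft.rfftfreq(len(current), 1 / fs)
    print(f"strongest tone: {freqs[np.argmax(spectrum)] / 1e9:.0f} GHz")  # ~4 x f_LO = 40 GHz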

A Field Research for Investigating the Effect of Strategic Management on Institutionalization Levels of Enterprises

The aim of this study is to determine the effect of strategic management implementations on institutionalization levels. To this end, a field study of 31 stone quarry enterprises in the cement-producing sector in Konya was carried out using the survey method. Institutionalization levels of the enterprises were evaluated along three dimensions: professionalization; management approach; and participation in decisions and delegation of authority. According to the results of the survey, there is a highly positive and statistically significant relationship between the strategic management implementations and the institutionalization levels of the enterprises. Additionally, the regression analysis conducted to establish this relationship shows that the strategic management implementations of the enterprises can be used as a variable to explain their institutionalization levels, and that strategic management implementations increase institutionalization levels.

Evaluation Framework for Agent-Oriented Methodologies

Many agent-oriented software engineering methodologies have been proposed for software development; however, their application is still limited by their lack of maturity. Evaluating the strengths and weaknesses of these methodologies plays an important role in improving them and in developing new, stronger methodologies. This paper presents an evaluation framework for agent-oriented methodologies that addresses six major areas: concepts, notation, process, pragmatics, support for software engineering, and marketability. The framework is then used to evaluate the Gaia methodology, identifying its strengths and weaknesses and demonstrating the framework's ability to advance agent-oriented methodologies by detecting their weaknesses in detail.

Comparative Study of Complexity in Streetscape Composition

This research is a comparative study of complexity, as a multidimensional concept, in the context of streetscape composition in Algeria and Japan. 80 streetscape visual arrays were collected and presented to 20 participants with different cultural backgrounds, to be categorized and classified according to their degree of complexity. Three analysis methods were used: cluster analysis, the ranking method, and the Hayashi Quantification method (Method III). The results showed that complexity, disorder, irregularity and disorganization are often conflicting concepts in the urban context. Algerian daytime streetscapes appear balanced, ordered and regular, whereas Japanese daytime streetscapes appear unbalanced, regular and vivid. Variety, richness and irregularity, with some aspects of order and organization, seem to characterize Algerian night streetscapes, while Japanese night streetscapes seem more related to balance, regularity, order and organization, with some aspects of confusion and ambiguity. Complexity mainly characterized Algerian avenues with green infrastructure. Accordingly, for Japanese participants, Japanese traditional night streetscapes were complex, and for foreign participants, Algerian and Japanese avenue nightscapes were the most complex visual arrays.

The Role of Knowledge Management in Enterprise 2.0

The term Enterprise 2.0 (E2.0) describes a collection of organizational and IT practices that help organizations establish flexible work models, visible knowledge-sharing practices, and higher levels of community participation. E2.0 parallels and builds on another term commonly used in the industry – Web 2.0. E2.0 also represents new packaging for strategic collaboration and Knowledge Management (KM). Organizations rely on collaboration and KM initiatives to attain innovation, growth, productivity, and performance goals.

Use of Novel Algorithms MAJE4 and MACJER-320 for Achieving Confidentiality and Message Authentication in SSL and TLS

Extensive use of the Internet, coupled with the tremendous growth in e-commerce and m-commerce, has created a huge demand for information security. The Secure Socket Layer (SSL) protocol is the most widely used security protocol on the Internet that meets this demand, providing protection against eavesdropping, tampering and forgery. The cryptographic algorithms RC4 and HMAC have been in use for achieving security services such as confidentiality and authentication in SSL, but recent attacks against RC4 and HMAC have raised questions about the confidence placed in these algorithms. Hence two novel cryptographic algorithms, MAJE4 and MACJER-320, have been proposed as substitutes. The focus of this work is to demonstrate the performance of these new algorithms and to suggest them as dependable alternatives for the security services needed in SSL. The performance evaluation has been carried out through a practical implementation.
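A minimal sketch of the kind of timing harness such a practical evaluation uses; MAJE4 itself is not reproduced here, so a trivial XOR keystream serves purely as a placeholder that lets the harness run.

    import os
    import time

    def xor_stream_placeholder(key, data):
        # Placeholder only: not a real cipher, just something to time.
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    def throughput_mbps(cipher, key, size_mb=1, repeats=5):
        data = os.urandom(size_mb * 1024 * 1024)
        best = float("inf")
        for _ in range(repeats):
            start = time.perf_counter()
            cipher(key, data)
            best = min(best, time.perf_counter() - start)
        return size_mb * 8 / best   # megabits per second, best of N runs

    print(f"{throughput_mbps(xor_stream_placeholder, os.urandom(16)):.1f} Mbit/s")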

Effect of Enzyme and Heat Pretreatment on Sunflower Oil Recovery Using Aqueous and Hexane Extractions

The effects of enzyme action and heat pretreatment on the oil extraction yield from sunflower kernels were analysed using hexane extraction in a Soxhlet apparatus and aqueous extraction in an incubator shaker. Ground raw and heat-treated kernels, each with and without Viscozyme treatment, were used. Microscopic images of the kernels were taken to analyse the visible effects of each treatment on the cotyledon cell structure. Heat pretreatment before both extraction processes produced higher oil extraction yields than the control, with steam explosion being the most efficient. In hexane extraction, combining steam explosion and Viscozyme treatments before extraction gave the maximum oil extractable in 1 hour, while for aqueous extraction, raw kernels treated with Viscozyme gave the highest oil extraction yield. Remarkable cotyledon cell disruption was evident in kernels treated with Viscozyme, whereas steam explosion and conventional heat treatment had similar effects.

Flow and Heat Transfer of a Nanofluid over a Shrinking Sheet

The problem of laminar fluid flow resulting from the shrinking of a permeable surface in a nanofluid is investigated numerically. The model used for the nanofluid incorporates the effects of Brownian motion and thermophoresis. A similarity solution is presented which depends on the mass suction parameter S, Prandtl number Pr, Lewis number Le, Brownian motion number Nb and thermophoresis number Nt. It was found that the reduced Nusselt number is a decreasing function of each dimensionless number.
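A minimal numerical sketch, assuming the widely used similarity system of Kuznetsov and Nield for this class of nanofluid boundary-layer problems (the paper's exact formulation may differ), solved with scipy's collocation BVP solver; the parameter values are illustrative.

    import numpy as np
    from scipy.integrate import solve_bvp

    S, Pr, Le, Nb, Nt = 2.5, 1.0, 2.0, 0.5, 0.5   # illustrative parameter values

    def odes(eta, y):
        # y = [f, f', f'', theta, theta', phi, phi']
        f, fp, fpp, th, thp, ph, php = y
        fppp = fp**2 - f * fpp                                   # momentum
        thpp = -Pr * (f * thp + Nb * thp * php + Nt * thp**2)    # energy
        phpp = -Le * f * php - (Nt / Nb) * thpp                  # nanoparticle fraction
        return np.vstack([fp, fpp, fppp, thp, thpp, php, phpp])

    def bc(ya, yb):
        # f(0)=S (suction), f'(0)=-1 (shrinking), theta(0)=phi(0)=1,
        # and f', theta, phi all decay to 0 far from the sheet.
        return np.array([ya[0] - S, ya[1] + 1, yb[1],
                         ya[3] - 1, yb[3], ya[5] - 1, yb[5]])

    eta = np.linspace(0, 10, 200)
    guess = np.vstack([S - 1 + np.exp(-eta), -np.exp(-eta), np.exp(-eta),
                       np.exp(-eta), -np.exp(-eta), np.exp(-eta), -np.exp(-eta)])
    sol = solve_bvp(odes, bc, eta, guess)
    print("reduced Nusselt number -theta'(0) =", -sol.y[4, 0])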

DEA Method for Evaluation of EU Performance

The paper applies a quantitative analysis technique, the Data Envelopment Analysis (DEA) method, to the performance evaluation of the European Union Member States in the reference years 2000 and 2011. The main aim is to measure efficiency changes over the reference years, to analyze the level of productivity in individual countries based on the DEA method, and to classify the EU Member States into homogeneous units (clusters) according to the efficiency results. The theoretical part covers the fundamentals of performance theory and the methodology of DEA. The empirical part measures the degree of productivity and the level of efficiency changes of the evaluated countries using a basic DEA model, the CCR CRS model, and a specialized DEA approach, the Malmquist Index, which measures the change in technical efficiency and the movement of the production possibility frontier. DEA is a suitable tool for establishing the competitive (or uncompetitive) position of each country because it evaluates not a single factor but a set of factors that determine the degree of economic development.
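A minimal sketch of the input-oriented CCR (constant returns to scale) envelopment model solved as a linear program; the input and output data below are placeholders for the macroeconomic indicators used in the study.

    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, k):
        """Efficiency of unit k. X: (n_units, n_inputs), Y: (n_units, n_outputs)."""
        n = X.shape[0]
        c = np.r_[1.0, np.zeros(n)]                    # variables: [theta, lambda_1..n]
        # Inputs: sum_j lambda_j * x_ij <= theta * x_ik
        A_in = np.hstack([-X[[k]].T, X.T])
        # Outputs: sum_j lambda_j * y_rj >= y_rk
        A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
        b = np.r_[np.zeros(X.shape[1]), -Y[k]]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b,
                      bounds=[(0, None)] * (n + 1))
        return res.fun                                 # theta* in (0, 1]

    X = np.array([[20.0, 300], [30, 200], [40, 100], [20, 200], [10, 400]])  # inputs
    Y = np.array([[100.0], [80], [90], [96], [85]])                          # outputs
    print([round(ccr_efficiency(X, Y, k), 3) for k in range(len(X))])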

A State Aggregation Approach to Singularly Perturbed Markov Reward Processes

In this paper we propose a single-sample-path algorithm with state aggregation to optimize the average reward of singularly perturbed Markov reward processes (SPMRPs) with large-scale state spaces, where the reward process is assumed to depend on a set of parameters. Unlike general Markov chains, SPMRPs have a hierarchical structure, and our algorithm exploits this structure to reduce the computational load of performance optimization. Moreover, the method can be applied online because it evolves along with the simulated sample path. Compared with the original algorithms for general MRPs, we introduce a new gradient formula for the average reward performance metric in SPMRPs, which is proved in the Appendix. Based on these gradients, we present the schedule of the single-sample-path iteration algorithm, analyze a special case in which the parameters dominate only the disturbance matrices, and give a precise comparison between our algorithm and earlier algorithms aimed at these problems in general Markov reward processes; in these cases our method converges faster. Furthermore, to illustrate the practical value of SPMRPs, a simple example of multiprogramming in computer systems is presented and simulated, and the physical meaning of SPMRPs in networks of queues is clarified with reference to a practical model.
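A minimal sketch of the single-sample-path starting point: estimating the average reward of a Markov reward process from one simulated trajectory. The nearly decoupled transition matrix hints at the singularly perturbed structure; the paper's gradient estimation and state aggregation are not reproduced here.

    import numpy as np

    def average_reward(P, r, steps=200_000, seed=0):
        """Estimate the long-run average reward from one simulated sample path."""
        rng = np.random.default_rng(seed)
        n = len(r)
        state, total = 0, 0.0
        for _ in range(steps):
            total += r[state]
            state = rng.choice(n, p=P[state])   # follow a single sample path
        return total / steps

    P = np.array([[0.90, 0.10, 0.00],   # two strongly coupled states plus a
                  [0.10, 0.88, 0.02],   # weakly coupled third: the nearly
                  [0.00, 0.05, 0.95]])  # decomposable structure SPMRPs exploit
    r = np.array([1.0, 0.5, 2.0])
    print(average_reward(P, r))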