Usability and Affordances: Examinations of Object-Naming and Object-Task Performance in Haptic Interfaces

The introduction of haptic elements into graphical user interfaces (GUIs) is becoming more widespread. Since haptics are being introduced rapidly into computational tools, investigating how these models affect human-computer interaction would help define how to integrate and model new modes of interaction. The interest of this paper is to discuss and investigate the issues surrounding haptic interface and GUI design as separate systems, as well as to understand how the two work in tandem. The development of these systems is explored from a psychological perspective, based on how usability is addressed through learning and affordances, as defined by J.J. Gibson. Haptic design can be a powerful tool, aiding intuitive learning. The problem discussed in the text is how haptic interfaces can be integrated within a GUI without a sense of frivolity. Juxtaposing haptic interfaces and GUIs raises issues of motivation: GUIs tend to rely on a performatory process, while haptic interfaces use affordances to learn tool use. In a deeper view, it is noted that two modes of vision, foveal and ambient, dictate perception. These two modes were once thought to work in tandem; however, it has been discovered that they work independently of each other. The foveal mode interprets orientation in space, which provides for posture, locomotion, and motor skills with variations of the sensory information, and this instructs perceptions of object-task performance. It is contended here that object-task performance is a key element in the use of haptic interfaces, because exploratory learning uses affordances in order to use an object without mediating the experience cognitively. It is a direct experience that, through iteration, can lead to skill sets. It is also indicated that object-task performance will not work as efficiently without exploratory or kinesthetic learning practices. Therefore, object-task performance is not explored as thoroughly in GUIs as it is practiced in haptic interfaces.

Highly Sensitive Label-Free Biosensor for Tumor Necrosis Factor

We present a label-free biosensor based on electrochemical impedance spectroscopy for the detection of the proinflammatory cytokine Tumor Necrosis Factor (TNF-α). Secretion of TNF-α has been correlated with the onset of various diseases, including rheumatoid arthritis and Crohn's disease. Gold electrodes were patterned on a silicon substrate, and a self-assembled monolayer of dithiobis-succinimidyl propionate was used to develop the biosensor, which achieved a detection limit of approximately 57 fM. A linear relationship was also observed between increasing TNF-α concentration and charge-transfer resistance within a dynamic range of 1 pg/ml to 1 ng/ml.
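Because the abstract reports a linear response over a three-decade dynamic range, a minimal calibration sketch is given below; it assumes, as is common for impedance immunosensors, that charge-transfer resistance scales with the logarithm of concentration, and all readings are hypothetical, not the paper's data.

    # Calibration sketch (hypothetical readings): fit charge-transfer
    # resistance R_ct against log10 of TNF-alpha concentration.
    import numpy as np

    conc_pg_ml = np.array([1.0, 10.0, 100.0, 1000.0])  # 1 pg/ml to 1 ng/ml
    r_ct_ohm = np.array([410.0, 520.0, 655.0, 790.0])  # illustrative values

    slope, intercept = np.polyfit(np.log10(conc_pg_ml), r_ct_ohm, 1)

    def concentration_from_rct(r_ct):
        # Invert the linear model to estimate an unknown sample's concentration.
        return 10.0 ** ((r_ct - intercept) / slope)

    print(concentration_from_rct(600.0))  # estimated concentration in pg/ml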

Genetic Programming Approach for Multi-Category Pattern Classification Applied to Network Intrusion Detection

This paper describes a new approach to classification using genetic programming. The proposed technique consists of genetically coevolving a population of non-linear transformations of the input data to be classified, mapping the data to a new space of reduced dimension in order to obtain maximum inter-class discrimination. The classification of new samples is then performed on the transformed data and thus becomes much easier. Contrary to existing GP classification techniques, the proposed one uses a dynamic partition of the transformed data into separate intervals; the quality of a given interval partition is evaluated by the fitness criterion, which rewards maximum class discrimination. Experiments were first performed using Fisher's Iris dataset, and then the KDD-99 Cup dataset was used to study the intrusion detection and classification problem. The results obtained demonstrate that the proposed genetic approach outperforms the existing GP classification methods [1], [2], and [3], and gives very acceptable results compared to other existing techniques proposed in [4], [5], [6], [7], and [8].
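To make the interval idea concrete, the following minimal sketch (not the authors' GP engine; the candidate transformation and data are hypothetical) projects samples to one dimension and scores a partition by how little the class intervals overlap, which is the role the fitness criterion plays.

    # Sketch of interval-based class discrimination in a transformed space.
    import numpy as np

    def transform(x):
        # A hypothetical candidate transformation that a GP run might evolve.
        return np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2

    def interval_fitness(z, labels):
        # Higher (less negative) when class projections occupy disjoint intervals.
        classes = np.unique(labels)
        spans = [(z[labels == c].min(), z[labels == c].max()) for c in classes]
        overlap = 0.0
        for i in range(len(spans)):
            for j in range(i + 1, len(spans)):
                lo = max(spans[i][0], spans[j][0])
                hi = min(spans[i][1], spans[j][1])
                overlap += max(0.0, hi - lo)
        return -overlap

    X = np.random.rand(60, 2)
    y = (X[:, 0] > 0.5).astype(int)
    print(interval_fitness(transform(X), y))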

Improving Classification Accuracy with Discretization on Datasets Including Continuous Valued Features

This study analyzes the effect of discretization on the classification of datasets containing continuous-valued features. Six UCI datasets with continuous-valued features are discretized with an entropy-based discretization method. The performance difference between the datasets with original features and with discretized features is compared using the k-nearest neighbors, Naive Bayes, C4.5, and CN2 data mining classification algorithms. As a result, the classification accuracies of the six datasets improve on average by 1.71% to 12.31%.
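For illustration, the core step of an entropy-based method is sketched below: choosing the cut point on a single feature that minimizes the weighted class entropy of the two resulting bins (the stopping rule of Fayyad-Irani style methods is omitted for brevity, and the data are toy values).

    # Entropy-minimizing cut point for one continuous feature.
    import numpy as np

    def entropy(labels):
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def best_cut(values, labels):
        order = np.argsort(values)
        v, y = values[order], labels[order]
        best_e, cut = np.inf, None
        for i in range(1, len(v)):
            if v[i] == v[i - 1]:
                continue  # no valid cut between equal values
            w = i / len(v)
            e = w * entropy(y[:i]) + (1 - w) * entropy(y[i:])
            if e < best_e:
                best_e, cut = e, (v[i] + v[i - 1]) / 2
        return cut

    x = np.array([1.2, 3.4, 0.8, 5.1, 4.9, 2.2])
    y = np.array([0, 1, 0, 1, 1, 0])
    print(best_cut(x, y))  # cut cleanly separating the two classes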

On-line Handwritten Character Recognition: An Implementation of Counterpropagation Neural Net

On-line handwriting is usually captured as pen-tip traces from pen-down to pen-up positions. The time evolution of the pen coordinates is also considered along with the trajectory information. However, the data obtained needs a lot of preprocessing, including filtering, smoothing, slant removal, and size normalization, before the recognition process. Instead of such lengthy preprocessing, this paper presents a simple approach to extracting the useful character information. This work evaluates the use of the counter-propagation neural network (CPN) and presents the feature extraction mechanism in full detail for on-line handwriting recognition. The recognition rates obtained with the CPN were 60% to 94% for different sets of character samples. This paper also describes a performance study in which a recognition mechanism with multiple thresholds is evaluated for the counter-propagation architecture. The results indicate that the application of multiple thresholds has a significant effect on the recognition mechanism. The method is applicable to off-line character recognition as well. The technique is tested on upper-case English letters in a number of different styles from different writers.
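A minimal counter-propagation network sketch follows (layer sizes and data are illustrative; the paper's feature extraction and multiple-threshold mechanism are not reproduced): a winner-take-all Kohonen layer clusters feature vectors, and a Grossberg layer learns the class output attached to each winning unit.

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hidden, n_out = 16, 12, 26      # illustrative sizes; 26 letters
    W = rng.random((n_hidden, n_in))        # Kohonen (clustering) weights
    V = np.zeros((n_hidden, n_out))         # Grossberg (output) weights

    def train_step(x, target, alpha=0.1, beta=0.1):
        j = int(np.argmin(np.linalg.norm(W - x, axis=1)))  # winning unit
        W[j] += alpha * (x - W[j])          # move winner toward the input
        V[j] += beta * (target - V[j])      # teach the winner its output
        return j

    def predict(x):
        j = int(np.argmin(np.linalg.norm(W - x, axis=1)))
        return int(np.argmax(V[j]))

    x = rng.random(n_in)                    # stand-in feature vector
    train_step(x, np.eye(n_out)[3])         # one-hot target, letter 'D'
    print(predict(x))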

Resistive RAM Based on HfOx and Its Temperature Instability Study

A high-performance resistive random access memory (RRAM) based on HfOx has been prepared, and its temperature instability has been investigated in this work. With increasing temperature, it is found that: the leakage current in the high-resistance state increases, which can be explained by a higher density of traps inside the dielectric (related to trap-assisted tunneling) and leads to a smaller On/Off ratio; the set and reset voltages decrease, which may be attributed to higher oxygen-ion mobility in addition to a reduced potential barrier to creating/recovering oxygen ions (or oxygen vacancies); and the impact of temperature on RRAM retention degradation is more serious than that of electrical bias.

A Study on the Condition Monitoring of Transmission Line by On-line Circuit Parameter Measurement

An on-line condition monitoring method for transmission lines is proposed in this paper using electrical circuit theory and IT technology. It is reasonable to expect that the circuit parameters of a transmission line, such as resistance (R), inductance (L), conductance (g), and capacitance (C), expose the electrical condition and physical state of the line. These parameters can be calculated from the linear equations composed of voltages and currents measured by synchro-phasor measurement techniques at both ends of the line. A set of linear voltage-drop equations containing the four terminal constants (A, B, C, D) is the mathematical model of the transmission line circuit. If at least two sets of these linear equations are established from different operating conditions of the line, they mathematically yield the circuit parameters of the line. The condition of line connectivity, including the state of connecting or contacting parts of the switching devices, may be monitored through resistance variations during operation. The insulation condition of the line can be monitored through conductance (g) and capacitance (C) measurements. Together with other condition monitoring devices such as partial discharge sensors and visual sensing devices, these measurements give useful information for detecting incipient symptoms of faults. A prototype hardware system has been developed and tested on laboratory-level simulated transmission lines. The tests have provided sufficient evidence to put the proposed method to practical use.
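The identification step can be illustrated as follows: the two-port relations Vs = A·Vr + B·Ir and Is = C·Vr + D·Ir, written for two operating conditions, give four linear equations in the four terminal constants. The sketch below uses illustrative phasors, not field data; for a short line, B approximates the series impedance R + jωL and C the shunt admittance g + jωC.

    import numpy as np

    # Sending- and receiving-end phasors for two operating conditions.
    Vs = np.array([132e3 + 0j, 131e3 + 500j])
    Is = np.array([210 + 30j, 180 + 25j])
    Vr = np.array([128e3 - 900j, 127e3 - 700j])
    Ir = np.array([205 + 20j, 176 + 18j])

    # Rows 0-1: A*Vr + B*Ir = Vs; rows 2-3: C*Vr + D*Ir = Is.
    M = np.zeros((4, 4), dtype=complex)
    M[0:2, 0], M[0:2, 1] = Vr, Ir
    M[2:4, 2], M[2:4, 3] = Vr, Ir
    rhs = np.concatenate([Vs, Is])

    A, B, C, D = np.linalg.solve(M, rhs)
    print(A, B, C, D)  # circuit parameters R, L, g, C follow from B and C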

Automatic Iterative Methods for the Multivariate Solution of Nonlinear Algebraic Equations

Most real-world systems are formally expressed as a set of nonlinear algebraic equations. As applications grow, the size and complexity of these equations also increase. In this work, we highlight the key concepts in using the homotopy analysis method as a methodology for constructing efficient iteration formulas for solving nonlinear equations. The proposed method is characterized experimentally over a set of chosen parameters that affect the systems. The experimental results show the potential and limitations of the new method and suggest directions for future work.
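As a generic illustration of the homotopy idea (not the paper's derived iteration formulas), the sketch below deforms H(x, t) = f(x) - (1 - t)·f(x0) from t = 0, where the starting guess x0 solves it trivially, to t = 1, where H reduces to f, applying Newton corrections at each step.

    import numpy as np

    def f(x):
        # Example system: a circle and a hyperbola.
        return np.array([x[0] ** 2 + x[1] ** 2 - 4.0,
                         x[0] * x[1] - 1.0])

    def jacobian(x):
        return np.array([[2 * x[0], 2 * x[1]],
                         [x[1], x[0]]])

    x = np.array([1.0, 0.5])            # starting guess x0
    f0 = f(x)
    for t in np.linspace(0.0, 1.0, 21):
        for _ in range(5):              # Newton corrections at this t
            r = f(x) - (1.0 - t) * f0
            x = x - np.linalg.solve(jacobian(x), r)
    print(x, f(x))                      # f(x) should now be near zero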

Mining Implicit Knowledge to Predict Political Risk by Providing a Novel Framework Using Bayesian Networks

Nowadays, predicting the political risk level of a country has become a critical issue for investors who want accurate information concerning the stability of business environments. Since investors are often laymen rather than professional IT personnel, this paper proposes a framework named GECR to help non-expert persons discover political risk stability over time based on political news and events. To achieve this goal, the Bayesian network approach was applied to a sample dataset of 186 political news items from Pakistan. Bayesian networks, an artificial intelligence approach, are employed in the presented framework because they are a powerful technique for modeling uncertain domains. The results showed that the framework, with Bayesian networks as the decision-support tool, predicted the political risk level with a high degree of accuracy.
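The Bayesian reasoning behind such a framework can be sketched as a single belief update (the probabilities below are hypothetical, not GECR's learned model): the belief in a high-risk state is revised after observing the category of a news event.

    # Posterior over risk level given one observed news-event category.
    priors = {"high": 0.3, "low": 0.7}
    likelihood = {"high": {"unrest": 0.6, "economy": 0.4},   # P(event | risk)
                  "low":  {"unrest": 0.1, "economy": 0.9}}

    def posterior(event):
        unnorm = {r: priors[r] * likelihood[r][event] for r in priors}
        z = sum(unnorm.values())
        return {r: p / z for r, p in unnorm.items()}

    print(posterior("unrest"))  # belief in high risk rises after unrest news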

Intelligent Speaker Verification-Based Biometric System for Electronic Commerce Applications

Electronic commerce is growing rapidly, with on-line sales already heading for hundreds of billions of dollars per year. Due to the huge amount of money transferred every day, an increased security level is required. In this work we present the architecture of an intelligent speaker verification system that can accurately verify the registered users of an e-commerce service using only their voices as input. According to the proposed architecture, a transaction-based e-commerce application should be complemented by a biometric server where each customer's unique set of speech models (voiceprint) is stored. The verification procedure asks the user to pronounce a personalized sequence of digits; after speech capture and voice feature extraction at the client side, the features are sent to the biometric server. The biometric server uses pattern recognition to decide whether the received features match the stored voiceprint of the customer the user claims to be, and grants verification accordingly. The proposed architecture can provide e-commerce applications with a higher degree of certainty regarding the identity of a customer and prevent impostors from executing fraudulent transactions.
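As a stand-in for the server-side pattern-recognition engine (the paper does not specify its matching algorithm, so the cosine-similarity rule and the threshold here are assumptions), the sketch below scores incoming feature vectors against a stored voiceprint and grants or denies verification.

    import numpy as np

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def verify(features, voiceprint, threshold=0.85):
        # features: vectors extracted client-side from the spoken digit sequence.
        score = float(np.mean([cosine(f, voiceprint) for f in features]))
        return score >= threshold, score

    voiceprint = np.random.rand(20)     # enrolled model (illustrative)
    claim = [voiceprint + 0.05 * np.random.randn(20) for _ in range(5)]
    print(verify(claim, voiceprint))    # (True, score) for a genuine user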

A New Similarity Measure on Intuitionistic Fuzzy Sets

Intuitionistic fuzzy sets (IFSs), as proposed by Atanassov, have gained much attention from researchers for applications in various fields, and similarity measures between IFSs were developed afterwards. However, these measures do not cater for the conflicting behavior of each element evaluated. We therefore make some modifications to an IFS similarity measure by incorporating the concept of conflict into the model. In this paper, we concentrate on Zhang and Fu's similarity measures for IFSs, and some examples are given to validate these measures. A simple modification to Zhang and Fu's similarity measures for IFSs is proposed to find the best result according to the use of the degree of indeterminacy. Finally, we conclude with an application to real decision-making problems.
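For reference, a widely used distance-based IFS similarity measure is sketched below (a generic textbook form, not Zhang and Fu's specific formula): each element carries a membership mu, a non-membership nu, and an implied degree of indeterminacy pi = 1 - mu - nu.

    def ifs_similarity(A, B):
        # A, B: lists of (mu, nu) pairs over the same universe of elements.
        total = 0.0
        for (mu_a, nu_a), (mu_b, nu_b) in zip(A, B):
            pi_a, pi_b = 1 - mu_a - nu_a, 1 - mu_b - nu_b
            total += abs(mu_a - mu_b) + abs(nu_a - nu_b) + abs(pi_a - pi_b)
        return 1.0 - total / (2.0 * len(A))

    A = [(0.7, 0.2), (0.5, 0.3)]
    B = [(0.6, 0.3), (0.5, 0.4)]
    print(ifs_similarity(A, B))  # 1.0 would mean identical sets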

Sedimentation and Its Challenges for the Operation and Maintenance of Hydraulic Structures Using SHARC Software: A Case Study of the Eastern Intake of the Dez Diversion Dam in Iran

Analytical investigation of sedimentation processes in river engineering and hydraulic structures is of vital importance, as sedimentation can affect the water supply for the cultivated lands in the command area; gradual sediment accumulation behind a reservoir reduces the nominal capacity of the dam. The aim of the present paper is to analytically investigate the sedimentation process along the river course and behind storage reservoirs in general, and at the Eastern Intake of the Dez Diversion weir in particular, using the SHARC software. The model indicated a water level of 115.97 m, whereas the real-time measurement from the river cross-section was 115.98 m, which suggests very close agreement. The average diameter of the transported sediment in the river was measured at 0.25 mm, from which it can be concluded that nearly 100% of the suspended load in the river is moving, suggesting no sediment settling and indicating that almost all of the sediment load enters the intake. It was further shown that the average diameter of sediment entering the intake is 0.293 mm, which in turn suggests that about 85% of the suspended sediments in the river enter the intake. Comparison of the results from the SHARC model with those obtained from the SSIIM software shows quite similar outputs, but distinguishes the SHARC model as more appropriate for the analysis of simpler problems.

An Evaluation Model for Semantic Enablement of Virtual Research Environments

The Tropical Data Hub (TDH) is a virtual research environment that provides researchers with an e-research infrastructure to congregate significant tropical data sets for data reuse, integration, searching, and correlation. However, researchers often require data and metadata synthesis across disciplines for cross-domain analyses and knowledge discovery. A triplestore offers a semantic layer to achieve a more intelligent method of search that supports these synthesis requirements by automating latent linkages in the data and metadata. At present, the benchmarks available to aid in deciding which triplestore is best suited for an application environment like the TDH are limited to performance. This paper describes a new evaluation tool developed to analyze both features and performance. The tool comprises a weighted decision matrix that evaluates the interoperability, functionality, performance, and support availability of a range of integrated and native triplestores and ranks them according to the requirements of the TDH.
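The core of such a tool is a weighted decision matrix; a minimal sketch follows with hypothetical weights and scores (the paper's actual criteria weighting and evaluation data are not reproduced).

    # Rank triplestores by a weighted sum over criterion scores (0-10 scale).
    weights = {"interoperability": 0.25, "functionality": 0.30,
               "performance": 0.30, "support": 0.15}

    scores = {
        "StoreA": {"interoperability": 8, "functionality": 7,
                   "performance": 9, "support": 6},
        "StoreB": {"interoperability": 6, "functionality": 9,
                   "performance": 7, "support": 8},
    }

    def rank(scores, weights):
        totals = {name: sum(weights[c] * s[c] for c in weights)
                  for name, s in scores.items()}
        return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

    print(rank(scores, weights))  # highest weighted total ranks first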

Operating Room Capacity Planning Decisions

Operating rooms are important assets for hospitals, as they generate the largest revenue and, at the same time, incur the largest cost. The model presented in this paper helps make capacity planning decisions on the combination of open operating rooms (ORs) and estimated overtime needed to satisfy the OR time allocated to each specialty. The model combines the decisions on how much OR time to open and how to allocate it to the different surgical specialties. The decisions are based on OR costs, overutilization and underutilization costs, and the contribution margins from allocating OR time. The results show the importance of having a good estimate of each specialty's usage of OR time when determining the amount of capacity needed, and highlight the tradeoff the OR manager faces between opening more ORs and extending the working time of the ORs already in use.
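The tradeoff can be illustrated with a toy cost calculation (all costs and the demand figure below are hypothetical, not the paper's model): choose the number of open ORs that minimizes opening cost plus overtime (overutilization) and idle-time (underutilization) costs.

    def total_cost(n_rooms, demand_h, regular_h=8.0,
                   open_cost=2000.0, overtime_cost=450.0, idle_cost=150.0):
        capacity = n_rooms * regular_h
        overtime = max(0.0, demand_h - capacity)   # overutilization hours
        idle = max(0.0, capacity - demand_h)       # underutilization hours
        return n_rooms * open_cost + overtime * overtime_cost + idle * idle_cost

    demand = 52.0  # allocated OR hours for the day (illustrative)
    best = min(range(1, 11), key=lambda n: total_cost(n, demand))
    print(best, total_cost(best, demand))  # 6 rooms with 4 h of overtime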

Properties of Composite Nanofiber Produced by Single and Coaxial Nozzle Methods Used in the Electrospinning Technique

In this study, the single-nozzle electrospinning method, in which a composite polymer solution containing cellulose nanowhiskers (CNW) is treated with an ultrasonic sonicator, is compared with the coaxial (double) nozzle method in terms of the mechanical, thermal, and morphological properties of the composite nanofiber. The effect of the water content of the composite polymer solution on the nanofiber properties has also been examined. The single-nozzle method with a water-free polymer solution gives better results than the coaxial method in terms of the mechanical, thermal, and morphological properties of the nanofiber. However, an optimization study of the ultrasonic treatment conditions is needed to achieve better dispersion of the CNW in the composite nanofiber and thereby better mechanical and thermal properties.

The Relationship between Internal Corporate Social Responsibility and Organizational Commitment within the Banking Sector in Jordan

This study investigates the relationship between internal CSR practices and organizational commitment based on social exchange theory (SET). Specifically, we examine the impact of five dimensions of internal CSR practices on organizational commitment: health and safety, human rights, training and education, work-life balance, and workplace diversity. The proposed model was tested on a sample of 336 frontline employees in the banking sector in Jordan. The results showed that all internal CSR dimensions are significantly and positively related to affective and normative commitment. In addition, the findings indicate that none of the internal CSR dimensions had a significant relationship with continuance commitment. Limitations of the study, directions for future research, and implications of the findings are discussed.

Certain Important Aspects of Cost Contribution Arrangements in Financial Management

Cost contribution arrangements (CCAs) and cost sharing agreements (CSAs) belong to the tools of modern financial management. The costs spent by associated enterprises on developing, producing, or obtaining assets, services, or rights (in general, benefits) are also used for tax optimization. The main purpose of joint research and development, production, or acquisition of benefits is to lower these costs as much as possible or to maximize the benefits. This article discusses the issues of transfer pricing and the arm's length principle in connection with CCAs and CSAs. It then describes how to settle participation shares of the total costs and benefit contributions with respect to the OECD Transfer Pricing Guidelines for MNEs and other significant regulations.
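As a purely illustrative arithmetic example (the allocation key and figures are hypothetical; the OECD guidelines require shares to reflect expected benefits but do not prescribe this formula), participation shares can be settled in proportion to each participant's expected benefit:

    def participation_shares(total_cost, expected_benefits):
        # Allocate the shared cost in proportion to expected benefits.
        total_benefit = sum(expected_benefits.values())
        return {p: total_cost * b / total_benefit
                for p, b in expected_benefits.items()}

    benefits = {"EnterpriseA": 5.0e6, "EnterpriseB": 3.0e6, "EnterpriseC": 2.0e6}
    print(participation_shares(1.0e6, benefits))
    # EnterpriseA bears 50%, B 30%, C 20% of the shared cost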

The New AIMD Congestion Control Algorithm

Congestion control is one of the fundamental issues in computer networks. Without proper congestion control mechanisms there is the possibility of inefficient utilization of resources, ultimately leading to network collapse. Hence, congestion control is an effort to adapt the performance of a network to changes in the traffic load without adversely affecting users' perceived utilities. AIMD (Additive Increase Multiplicative Decrease) is the best algorithm among the set of linear algorithms because it provides both good efficiency and good fairness. Our control model is based on the assumptions of the original AIMD algorithm; we show that both the efficiency and the fairness of AIMD can be improved. We call our approach New AIMD. We present experimental results with TCP that match the expectations of our theoretical analysis.
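The classic dynamics the paper builds on can be shown in a few lines (this sketches the original AIMD rule, not the authors' New AIMD): two flows sharing a link of capacity C converge toward a fair share under additive increase and multiplicative decrease.

    C, a, b = 100.0, 1.0, 0.5    # capacity, additive step, decrease factor
    w1, w2 = 10.0, 70.0          # deliberately unequal starting windows

    for _ in range(200):
        if w1 + w2 > C:          # congestion signal: both flows back off
            w1, w2 = w1 * b, w2 * b
        else:                    # otherwise both probe for more bandwidth
            w1, w2 = w1 + a, w2 + a

    print(round(w1, 1), round(w2, 1))  # the two windows end up nearly equal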

Novel Use of a Quality Assurance Tool for Integrating Technology to HSE

The product development process (PDP) in the Technology group plays a very important role in the launch of any product. While a manufacturing process encourages the use of certain measures to reduce health, safety, and environmental (HSE) risks on the shop floor, the PDP concentrates on the use of Geometric Dimensioning and Tolerancing (GD&T) to develop a flawless design. Furthermore, the PDP distributes and coordinates activities between different departments such as marketing, purchasing, and manufacturing. However, it is seldom realized that the PDP makes a significant contribution to developing a product that reduces HSE risks by encouraging the Technology group to use effective GD&T. GD&T is a precise communication tool that uses a set of symbols, rules, and definitions to mathematically define parts to be manufactured. It is a quality assurance method widely used in the oil and gas sector, traditionally employed to ensure the interchangeability of a part without affecting its form, fit, and function; parts that do not meet these requirements are rejected during quality audits. This paper discusses how the Technology group integrates this quality assurance tool into the PDP and how the tool plays a major role in helping the HSE department toward its goal of eliminating HSE incidents. The PDP involves a thorough risk assessment and establishes a method to address those risks during the design stage. An illustration shows how GD&T helped reduce safety risks by ergonomically improving assembly operations, and a brief discussion explains how the tolerances provided on a part help prevent finger injury. This tool has equipped the Technology group to produce fixtures that are used daily in operations as well as in manufacturing. By applying GD&T to create good fits, HSE risks are mitigated for operating personnel, and both customers and service providers benefit from the reduced safety risks.

Automatic Detection of Mass Type Breast Cancer using Texture Analysis in Korean Digital Mammography

In this study, we present an advanced detection technique for mass-type breast cancer based on the texture information of organs. The proposed method detects the cancerous areas in three stages. In the first stage, the midpoints of the mass areas are determined based on adaptive histogram equalization (AHE). In the second stage, we set the threshold coefficient of homogeneity by using maximum likelihood estimation (MLE) to compute the uniformity of the texture. Finally, mass-type cancer tissues are extracted from the original image. It was observed that the proposed method shows improved detection performance on the dense breast tissue of Korean women compared with existing methods. The proposed method may thus provide additional diagnostic information for the detection of mass-type breast cancer.
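The first two stages can be sketched as follows (an illustrative pipeline, not the paper's exact estimator: skimage's adaptive histogram equalization stands in for the AHE step, and a percentile threshold on local variance stands in for the MLE-derived homogeneity coefficient).

    import numpy as np
    from skimage import exposure

    img = np.random.rand(128, 128)            # stand-in mammogram patch
    eq = exposure.equalize_adapthist(img)     # stage 1: AHE

    # Stage 2: homogeneity map from local variance over 8x8 blocks.
    h, w = eq.shape
    blocks = eq[: h - h % 8, : w - w % 8].reshape(h // 8, 8, w // 8, 8)
    local_var = blocks.var(axis=(1, 3))

    threshold = np.percentile(local_var, 20)  # stand-in for the MLE threshold
    mask = local_var < threshold              # most uniform (mass-like) blocks
    print(int(mask.sum()), "candidate homogeneous blocks")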