Visualization of Code Clone Detection Results and the Implementation with Structured Data

This paper describes a code clone visualization method, called FC graph, and its implementation issues. Code clone detection tools usually present their results in a textual representation. When the results are large, software maintainers have difficulty understanding them. One approach to overcoming this situation is visualization of the code clone detection results. A scatter plot is a popular visualization approach; however, it represents only one-to-one correspondence, and it is difficult to find correspondence of code clones across multiple files. FC graph represents correspondence among files, code clones, and packages in Java. All nodes in an FC graph are positioned using a force-directed graph layout, which is dynamically calculated to adjust the distances between nodes until they stabilize. We applied FC graph to some open source programs and visualized the results. In the author's experience, FC graph is helpful for grasping the correspondence of code clones across multiple files as well as code clones within a file.
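As a rough illustration of the layout step, the following Python sketch runs one common force-directed scheme (spring-embedder-style repulsion between all nodes, attraction along edges) until node displacements fall below a threshold; the constant `k`, the step cap, and the tolerance are illustrative assumptions, not values from the paper.

```python
import math
import random

def force_directed_layout(nodes, edges, k=1.0, steps=500, tol=1e-3):
    """Iteratively adjust node positions until displacements stabilize.

    nodes: list of node ids; edges: list of (u, v) pairs.
    k is the ideal edge length (an illustrative constant).
    """
    pos = {n: [random.random(), random.random()] for n in nodes}
    for _ in range(steps):
        disp = {n: [0.0, 0.0] for n in nodes}
        # Repulsive force between every pair of nodes.
        for i, u in enumerate(nodes):
            for v in nodes[i + 1:]:
                dx, dy = pos[u][0] - pos[v][0], pos[u][1] - pos[v][1]
                d = max(math.hypot(dx, dy), 1e-9)
                f = k * k / d  # repulsion magnitude
                disp[u][0] += f * dx / d; disp[u][1] += f * dy / d
                disp[v][0] -= f * dx / d; disp[v][1] -= f * dy / d
        # Attractive force along edges.
        for u, v in edges:
            dx, dy = pos[u][0] - pos[v][0], pos[u][1] - pos[v][1]
            d = max(math.hypot(dx, dy), 1e-9)
            f = d * d / k  # attraction magnitude
            disp[u][0] -= f * dx / d; disp[u][1] -= f * dy / d
            disp[v][0] += f * dx / d; disp[v][1] += f * dy / d
        # Apply capped displacements and test convergence.
        moved = 0.0
        for n in nodes:
            dx, dy = disp[n]
            d = max(math.hypot(dx, dy), 1e-9)
            step = min(d, 0.05)
            pos[n][0] += dx / d * step
            pos[n][1] += dy / d * step
            moved = max(moved, step)
        if moved < tol:
            break
    return pos

# Toy usage: two file nodes sharing one clone node.
layout = force_directed_layout(["f1", "f2", "clone1"],
                               [("f1", "clone1"), ("f2", "clone1")])
```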

Hardware Approach to Solving the Password Exposure Problem through Keyboard Sniffing

This paper introduces a hardware solution to the password exposure problem caused by direct access to keyboard hardware interfaces, through which a possible attacker is able to grab a user's password even where existing countermeasures are deployed. Several studies have proposed reasonable software-based solutions to the problem over the years. However, recently introduced hardware vulnerabilities have neutralized the software approaches, and no effective software solution to the vulnerability has yet been proposed. The hardware approach in this paper is expected to be the only solution to the vulnerability.

Experimental and Theoretical Study of Melt Viscosity in Injection Process

The state of melt viscosity in the injection process is significantly influenced by the setting parameters because the shear rate of the injection process is higher than that of other processes. Determining plastic melt viscosity during the injection process is therefore important for understanding the influence of setting parameters on the melt viscosity. An apparatus named the pressure sensor bushing (PSB) module, used to evaluate the melt viscosity during the injection process, is developed in this work. Formulations coupling the melt viscosity with fill time and injection pressure are derived, and the melt viscosity is then determined. A test mold is prepared to compare the accuracy of viscosity calculations between the PSB module and conventional approaches. The influence of melt viscosity on the tensile strength of the molded part is examined to study the consistency of injection quality.
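The paper's exact formulations are not reproduced in the abstract; as a rough illustration of how viscosity can be coupled to injection pressure and fill time, the following Python sketch uses a textbook slit-flow approximation, with all geometry and the example numbers as illustrative assumptions.

```python
def apparent_viscosity(delta_p, fill_time, volume, width, height, length):
    """Estimate apparent melt viscosity from injection pressure and fill time.

    A textbook slit-flow approximation (not the paper's exact derivation):
      shear stress  tau   = delta_p * height / (2 * length)
      flow rate     Q     = volume / fill_time
      shear rate    gamma = 6 * Q / (width * height**2)
      viscosity     eta   = tau / gamma
    All quantities in SI units (Pa, s, m).
    """
    tau = delta_p * height / (2.0 * length)      # wall shear stress
    q = volume / fill_time                       # volumetric flow rate
    gamma = 6.0 * q / (width * height ** 2)      # wall shear rate (slit)
    return tau / gamma                           # apparent viscosity (Pa.s)

# Example: 40 MPa pressure drop filling a 2e-5 m^3 cavity in 0.8 s.
eta = apparent_viscosity(4.0e7, 0.8, 2.0e-5, 0.02, 0.002, 0.1)
print(f"apparent viscosity: {eta:.1f} Pa.s")
```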

Real-Time Vision-Based Korean Finger Spelling Recognition System

Finger spelling is an art of communicating by signs made with the fingers, and it has been introduced into sign language to serve as a bridge between sign language and verbal language. Previous approaches to finger spelling recognition fall into two categories: glove-based and vision-based. The glove-based approach is simpler and more accurate at recognizing hand posture than the vision-based one, yet its interfaces require the user to wear a cumbersome device and carry a load of cables connecting it to a computer. In contrast, vision-based approaches provide an attractive alternative to this cumbersome interface and promise more natural and unobtrusive human-computer interaction. Vision-based approaches generally consist of two steps, hand extraction and recognition, and the two steps are processed independently. This paper proposes a real-time vision-based Korean finger spelling recognition system that integrates hand extraction into recognition. First, we tentatively detect a hand region using the CAMShift algorithm. Then the fill factor and aspect ratio, estimated from the width and height computed by CAMShift, are used to choose candidates from the database, which reduces the number of matches in the recognition step. To recognize the finger spelling, we use DTW (dynamic time warping) based on modified chain codes, to be robust to scale and orientation variations. In this procedure, since accurate hand regions, without holes and noise, should be extracted to improve precision, we use the graph cuts algorithm, which globally minimizes an energy function elegantly expressed by Markov random fields (MRFs). In the experiments, the computation times are less than 130 ms, and they are not related to the number of finger spelling templates in the database, as candidate templates are selected in the extraction step.
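The paper's modified chain codes are not fully specified in the abstract; as a minimal sketch of the matching step, the following Python code runs plain DTW over 8-direction chain codes with a cyclic symbol distance. The template names and sequences are made up for illustration.

```python
import numpy as np

def chain_code_distance(a, b, n_dirs=8):
    """Cyclic distance between two chain-code symbols (0..n_dirs-1)."""
    d = abs(a - b) % n_dirs
    return min(d, n_dirs - d)

def dtw(seq_a, seq_b, n_dirs=8):
    """Dynamic time warping distance between two chain-code sequences.

    Warping absorbs length (scale) differences between contours; the
    cyclic symbol distance keeps costs small under rotation-like shifts.
    """
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = chain_code_distance(seq_a[i - 1], seq_b[j - 1], n_dirs)
            cost[i, j] = c + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Toy usage: compare a contour code against two candidate templates.
query = [0, 1, 1, 2, 3, 3, 4]
templates = {"A": [0, 1, 2, 2, 3, 4], "B": [5, 6, 6, 7, 0, 1]}
best = min(templates, key=lambda k: dtw(query, templates[k]))
print("closest template:", best)
```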

Folksonomy-Based Recommender Systems with User's Recent Preferences

Social bookmarking is an environment in which the user's interests gradually change over time, so the tag data associated with the current temporal period is usually more important than tag data temporally far from the current period. This implies that in a social tagging system, items newly tagged by the user are more relevant than older items. This study proposes a novel recommender system that considers the user's recent tag preferences. The proposed system includes the following stages: grouping similar users into clusters using an EM clustering algorithm, finding similar resources based on the user's bookmarks, and recommending the top-N items to the target user. The study examines the system's information retrieval performance using a dataset from del.icio.us, a well-known social bookmarking website. Experimental results show that the proposed system is better and more effective than traditional approaches.
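The abstract does not specify how recency is weighted; one plausible illustration, sketched below in Python, is exponential time decay over a user's tagging history, so newer tags contribute more to the preference profile. The half-life value and the sample bookmarks are assumptions.

```python
import math
from datetime import datetime

def recency_weighted_tags(bookmarks, now=None, half_life_days=30.0):
    """Aggregate a user's tags with exponential time decay.

    bookmarks: list of (tag, timestamp) pairs; half_life_days controls
    how quickly old tagging activity loses influence (an assumed value).
    Returns {tag: weight} with newer tags weighted more heavily.
    """
    now = now or datetime.now()
    decay = math.log(2.0) / half_life_days
    weights = {}
    for tag, ts in bookmarks:
        age_days = (now - ts).total_seconds() / 86400.0
        weights[tag] = weights.get(tag, 0.0) + math.exp(-decay * age_days)
    return weights

# Toy usage: the recent "python" tag outweighs the year-old one.
bm = [("python", datetime(2011, 5, 1)), ("python", datetime(2010, 1, 1)),
      ("java", datetime(2011, 5, 20))]
print(recency_weighted_tags(bm, now=datetime(2011, 6, 1)))
```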

Improvement in Power Transformer Intelligent Dissolved Gas Analysis Method

Non-destructive evaluation of the condition of in-service power transformers is necessary for avoiding catastrophic failures, and Dissolved Gas Analysis (DGA) is one of the important methods. Traditional, statistical, and intelligent DGA approaches have been adopted for accurate classification of incipient fault sources. Unfortunately, there are often not enough fault patterns for sufficient training of intelligent systems. Bootstrapping is expected to alleviate this shortcoming and to yield algorithms with better classification success rates. In this paper, the performance of artificial neural network (ANN), K-nearest neighbour, and support vector machine methods using bootstrapped data is detailed, and it is shown that while the success rate of the ANN algorithms improves remarkably, the outcomes of the others do not benefit as much from the enlarged data space. For assessment, two databases are employed: IEC TC10 and a dataset collected from data reported in papers. The high average test success rate exhibits the remarkable outcome.
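As a minimal sketch of the bootstrapping step, the following Python code draws resamples with replacement to enlarge the effective training set; the gas-value tuples and fault labels are made-up placeholders, and the classifier training itself is omitted.

```python
import random

def bootstrap_resample(dataset, n_samples=None, seed=None):
    """Draw a bootstrap replicate: sample with replacement from dataset.

    dataset: list of (features, label) pairs; n_samples defaults to
    len(dataset). Repeated draws enlarge the training space available
    to classifiers short on faulty patterns.
    """
    rng = random.Random(seed)
    n_samples = n_samples or len(dataset)
    return [rng.choice(dataset) for _ in range(n_samples)]

# Toy usage: five bootstrap replicates of a tiny DGA-style dataset.
data = [((200, 50, 10), "thermal"), ((5, 300, 20), "discharge"),
        ((40, 60, 80), "normal")]
replicates = [bootstrap_resample(data, seed=s) for s in range(5)]
```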

Left Ventricular Model to Study the Combined Viscoelastic, Heart Rate, and Size Effects

It is known that the heart interacts with and adapts to its venous and arterial loading conditions. Various experimental studies and modeling approaches have been developed to investigate the underlying mechanisms. This paper presents a model of the left ventricle derived from nonlinear stress-length myocardial characteristics integrated over a truncated ellipsoidal geometry, and a second-order dynamic mechanism for the excitation-contraction coupling system. The results of the model describe the effects of the viscoelastic damping element of the electromechanical coupling system on the hemodynamic response. Different heart rates are considered to study the pacing effects on the performance of the left ventricle against constant preload and afterload conditions under various damping conditions. The results indicate that the pacing process of the left ventricle has to take into account, among other things, the viscoelastic damping conditions of the myofilament excitation-contraction process. The effects of left ventricular dimensions on the hemodynamic response have also been examined and are found to differ under different viscoelastic and pacing conditions.
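To illustrate what a second-order coupling mechanism with viscoelastic damping implies, the Python sketch below evaluates the textbook step response of an underdamped second-order system for several damping ratios; the natural frequency, damping values, and time window are illustrative, not the paper's fitted parameters.

```python
import numpy as np

def second_order_activation(t, omega_n=30.0, zeta=0.5):
    """Step response of a second-order system, used here as a stand-in
    for the excitation-contraction coupling activation; omega_n (rad/s)
    and the damping ratio zeta are illustrative values.
    """
    wd = omega_n * np.sqrt(1.0 - zeta ** 2)     # damped natural frequency
    phi = np.arccos(zeta)                       # phase of the response
    return 1.0 - np.exp(-zeta * omega_n * t) * np.sin(wd * t + phi) \
        / np.sqrt(1.0 - zeta ** 2)

t = np.linspace(0.0, 0.5, 200)                  # one beat window (s)
for zeta in (0.3, 0.5, 0.9):                    # vary viscoelastic damping
    a = second_order_activation(t, zeta=zeta)
    print(f"zeta={zeta}: peak activation {a.max():.3f}")
```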

Evaluating and Selecting Optimization Software Packages: A Framework for Business Applications

Owing to the fact that optimization of business processes is a crucial requirement to navigate, survive, and even thrive in today's volatile business environment, this paper presents a framework for selecting a best-fit optimization package for solving complex business problems. The complexity level of the problem and/or the use of incorrect optimization software can lead to biased solutions of the optimization problem. Accordingly, the proposed framework identifies a number of relevant factors (e.g., decision variables, objective functions, and modeling approach) to be considered during the evaluation and selection process. The application domain, problem specifications, and available accredited optimization approaches are also to be considered. The output of the framework is a recommendation of one or two optimization packages believed to provide the best results for the underlying problem. In addition, a set of guidelines and recommendations on how managers can conduct an effective optimization exercise is discussed.
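The abstract describes the framework qualitatively; one plausible way to operationalize a multi-factor evaluation, sketched below in Python, is a weighted-sum scoring of candidate packages. The criteria, weights, package names, and scores are illustrative assumptions, not the paper's actual factor list.

```python
def score_packages(packages, weights):
    """Weighted-sum scoring of candidate optimization packages.

    packages: {name: {criterion: score 1..5}}; weights: {criterion: w}.
    """
    return {name: sum(weights[c] * s for c, s in scores.items())
            for name, scores in packages.items()}

# Illustrative criteria and candidates (hypothetical names and scores).
weights = {"decision_variables": 0.3, "objective_support": 0.4,
           "modeling_approach": 0.3}
packages = {
    "SolverA": {"decision_variables": 4, "objective_support": 3,
                "modeling_approach": 5},
    "SolverB": {"decision_variables": 5, "objective_support": 4,
                "modeling_approach": 3},
}
ranked = sorted(score_packages(packages, weights).items(),
                key=lambda kv: kv[1], reverse=True)
print("recommended:", ranked[0][0])
```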

Shift-Invariant Support Vector Machines Face Recognition System

In this paper, we present a new method for incorporating global shift invariance in support vector machines. Unlike other approaches, which incorporate a feature extraction stage, we first scale the image and then classify it using the modified support vector machine classifier. Shift invariance is achieved by replacing the dot products between patterns used by the SVM classifier with the maximum cross-correlation value between them. Unlike the usual approach, in which the patterns are treated as vectors, in our approach the patterns are treated as matrices (or images). Cross-correlation is computed using computationally efficient techniques such as the fast Fourier transform. The method has been tested on the ORL face database. The tests indicate that this method can improve the recognition rate of an SVM classifier.
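A minimal Python sketch of the kernel substitution the abstract describes: the dot product is replaced by the maximum cross-correlation, computed via the FFT. This sketch assumes circular cross-correlation (zero-padded linear correlation is an alternative), and the toy images are illustrative.

```python
import numpy as np

def max_xcorr(a, b):
    """Maximum 2-D circular cross-correlation between two same-size
    images, computed via the FFT; a drop-in replacement for the dot
    product in the SVM kernel.
    """
    # Correlation theorem: corr = IFFT(FFT(a) * conj(FFT(b))).
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    return corr.max()

def gram_matrix(images):
    """Kernel (Gram) matrix with max cross-correlation replacing dots."""
    n = len(images)
    k = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):
            k[i, j] = k[j, i] = max_xcorr(images[i], images[j])
    return k

# Toy usage: a pattern and its circular shift give the same kernel value.
img = np.zeros((8, 8)); img[2:4, 2:4] = 1.0
shifted = np.roll(img, shift=3, axis=1)
print(max_xcorr(img, img), max_xcorr(img, shifted))  # equal values
```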

New Approaches for Seismic Signal Discrimination

The automatic discrimination of seismic signals is an important practical goal for earth-science observatories due to the large amount of information they receive continuously. An essential discrimination task is to allocate the incoming signal to a group associated with the kind of physical phenomenon producing it. In this paper, we present new techniques for seismic signal classification: local, regional, and global discrimination. These techniques were tested on seismic signals from the database of the National Geophysical Institute of the Centre National pour la Recherche Scientifique et Technique (Morocco) using the Moroccan software for seismic signal analysis.

Universities Strategic Evaluation Using Balanced Scorecard

Defining the strategic position of an organization within its industry environment is one of the basic and most important phases of strategic planning, to the extent that one of the fundamental schools of strategic planning is the strategic positioning school. In today's knowledge-based economy and dynamic environment, strategic positioning is essential for universities as centers of education, knowledge creation, and knowledge-worker development. To date, various models with different approaches to strategic positioning have been deployed for defining strategic position within various industries. The Balanced Scorecard (BSC), as one of the powerful models for strategic positioning, analyzes all aspects of the organization evenly. In this paper, in consideration of the BSC's strength in strategic evaluation, it is used to analyze the environmental position of the best Iranian business schools. The results could be used in developing strategic plans for these schools as well as other Iranian management and business schools.

Public R&D Risk and Risk Management Policy

R&D risk management has been suggested as one of the management approaches for accomplishing the goals of public R&D investment. Investment in basic science and core technology development is an essential role of government in securing the social base needed for continuous economic growth. It is also an important role of the science and technology policy sector to create a positive environment in which the outcomes of public R&D can be diffused in a stable fashion, by controlling in advance the uncertainties and risk factors that may arise when such achievements are applied to society and industry. Various policies have already been implemented to manage the uncertainties and variables that may negatively impact the accomplishment of public R&D investment goals. However, new policy measures for complementing the existing policies and for exploring directions for progress may be derived by analyzing them as a policy package from the viewpoint of R&D risk management.

Trust Management for Pervasive Computing Environments

Trust is essential for the further and wider acceptance of contemporary e-services. It was first addressed almost thirty years ago in the Trusted Computer System Evaluation Criteria standard by the US DoD. However, this and other approaches proposed in that period were actually addressing security. Roughly ten years ago, methodologies followed that addressed the trust phenomenon at its core; they were based on Bayesian statistics and its derivatives, while some approaches were based on game theory. However, trust is a manifestation of judgment and reasoning processes. It has to be dealt with in accordance with this fact and adequately supported in the cyber environment. On the basis of results in the field of psychology and our own findings, a methodology called qualitative algebra has been developed, which deals with so far overlooked elements of the trust phenomenon. It complements existing methodologies and provides a basis for a practical technical solution that supports the management of trust in contemporary computing environments. Such a solution is also presented at the end of this paper.

A Short Glimpse at Environmental Management in the Alborz Integrated Land and Water Management Project, Iran

Environmental considerations have become an integral part of developmental thinking and decision making in many countries, and environmental management is growing rapidly in importance as a discipline of its own. Preventive approaches have been used throughout the evolution of environmental management as a broad and dynamic system for dealing with pollution and environmental degradation. In this regard, Environmental Assessment, as an activity for the identification and prediction of a project's impacts, has been carried out worldwide, and its legal significance dates back to the late 1960s. In Iran, according to Article 2 of the Environmental Protection Act, an Environmental Impact Assessment (EIA) should be prepared for seven categories of projects; this article has been actively implemented by the Department of Environment since 1997. In 1989, the World Bank introduced the application of Environmental Assessment to decision making about projects requiring financial assistance in developing countries, so preparing an EIA became obligatory for obtaining a World Bank loan. The Alborz Project is one of the World Bank projects in Iran that is environmentally significant; seven out of ten World Bank safeguard policies were considered in this project. In this paper, the Alborz Project, its objectives, the safeguard policies, and the role of environmental management are elaborated.

On the Noise Distance in Robust Fuzzy C-Means

In recent decades, a number of robust fuzzy clustering algorithms have been proposed to partition data sets affected by noise and outliers. Robust fuzzy C-means (robust-FCM) is certainly one of the best known among these algorithms. In robust-FCM, noise is modeled as a separate cluster characterized by a prototype that has a constant distance δ from all data points. The distance δ determines the boundary of the noise cluster and is therefore a critical parameter of the algorithm. Though some approaches have been proposed to automatically determine the most suitable δ for a specific application, to date no efficient and fully satisfactory solution exists. The aim of this paper is to propose a novel method to compute the optimal δ based on the analysis of the distribution of the percentage of objects assigned to the noise cluster in repeated executions of robust-FCM with decreasing values of δ. The extremely encouraging results obtained on some data sets from the literature are shown and discussed.
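As a minimal Python sketch of the quantity the method analyzes, the code below implements one membership update of noise-clustering FCM (following Davé's formulation, which robust-FCM builds on) and computes the percentage of objects assigned mainly to the noise cluster; the fuzzifier m, the assignment threshold, and the toy data are illustrative, and the paper's full δ-selection procedure is not reproduced.

```python
import numpy as np

def noise_fcm_memberships(X, centers, delta, m=2.0):
    """One membership-update step of robust (noise-clustering) FCM.

    X: (n, d) data; centers: (c, d) prototypes; delta: the constant
    noise distance. Each point gets memberships over the real clusters
    plus an implicit noise cluster at distance delta.
    """
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)  # (n, c)
    d2 = np.maximum(d2, 1e-12)
    p = 1.0 / (m - 1.0)
    inv = (1.0 / d2) ** p
    denom = inv.sum(axis=1) + (1.0 / delta ** 2) ** p
    u = inv / denom[:, None]            # memberships in real clusters
    u_noise = 1.0 - u.sum(axis=1)       # membership in the noise cluster
    return u, u_noise

def noise_fraction(X, centers, delta, threshold=0.5):
    """Percentage of objects assigned mainly to the noise cluster."""
    _, u_noise = noise_fcm_memberships(X, centers, delta)
    return 100.0 * (u_noise > threshold).mean()

# Toy usage: the percentage grows as delta decreases.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(8, 4, (5, 2))])
centers = np.array([[0.0, 0.0]])
for delta in (8.0, 4.0, 2.0, 1.0):
    print(delta, noise_fraction(X, centers, delta))
```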

Mass Customization in Supply Chain Management Environment: A Review

In supply chain management, the customer is the most significant component, and mass customization is closely related to customers because it is the capability of an industry or organization to deliver highly customized products and services to its customers with flexibility and integration, providing such a variety of products that nearly everyone can find what they want. Today, companies and markets all over the world face a twofold situation: on the one hand, customers demand that their orders be completed as quickly as possible, while on the other hand they require highly customized products and services. By applying mass customization, some companies face unwanted cost and complexity, and they are now realizing that they should thoroughly examine what kind of customization would be best suited to their company. In this paper, the authors review approaches and principles affecting supply chain management that companies can adopt to meet customer orders quickly at reduced cost, with a minimum amount of inventory and maximum efficiency.

Networks with Unreliable Nodes and Edges: Monte Carlo Lifetime Estimation

Estimating the lifetime distribution of computer networks in which nodes and links exist in time and are bound to fail is very useful in various applications. This problem is known to be NP-hard. In this paper we present efficient combinatorial approaches to the Monte Carlo estimation of the network lifetime distribution. We also present some simulation results.
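For orientation, the Python sketch below shows the crude Monte Carlo baseline that the paper's combinatorial approaches improve upon: draw failure times for every node and edge, then record the time at which two terminals first become disconnected. The exponential failure model, the perfect terminals, and the four-node example network are illustrative assumptions.

```python
import random
from collections import defaultdict

def sample_lifetime(nodes, edges, rate=1.0, s="s", t="t", rng=None):
    """One Monte Carlo replicate of the s-t network lifetime.

    Draws i.i.d. exponential failure times for every node and edge,
    then returns the time at which s and t first become disconnected.
    """
    rng = rng or random
    fail = {x: rng.expovariate(rate) for x in nodes + edges}
    fail[s] = fail[t] = float("inf")  # terminals assumed perfect here

    def connected(now):
        alive = {n for n in nodes if fail[n] > now}
        adj = defaultdict(list)
        for e in edges:
            u, v = e
            if fail[e] > now and u in alive and v in alive:
                adj[u].append(v); adj[v].append(u)
        seen, stack = {s}, [s]
        while stack:                      # DFS over surviving elements
            for w in adj[stack.pop()]:
                if w not in seen:
                    seen.add(w); stack.append(w)
        return t in seen

    # Scan element failures in time order; lifetime = first disconnect.
    for time in sorted(v for v in fail.values() if v < float("inf")):
        if not connected(time):
            return time
    return float("inf")

nodes = ["s", "a", "b", "t"]
edges = [("s", "a"), ("a", "t"), ("s", "b"), ("b", "t")]
samples = [sample_lifetime(nodes, edges) for _ in range(1000)]
print("mean estimated lifetime:", sum(samples) / len(samples))
```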

Financial Regulations in the Process of Global Financial Crisis and Macroeconomics Impact of Basel III

Basel III (or the Third Basel Accord) is a global regulatory standard on bank capital adequacy, stress testing, and market liquidity risk, agreed upon by the members of the Basel Committee on Banking Supervision in 2010-2011 and scheduled to be introduced from 2013 until 2018. Basel III is a comprehensive set of reform measures. These measures aim to: (1) improve the banking sector's ability to absorb shocks arising from financial and economic stress, whatever the source; (2) improve risk management and governance; and (3) strengthen banks' transparency and disclosures. Similarly, the reforms target: (1) bank-level, or micro-prudential, regulation, which will help raise the resilience of individual banking institutions in periods of stress; and (2) macro-prudential regulation of system-wide risks that can build up across the banking sector, as well as the pro-cyclical amplification of these risks over time. These two approaches to supervision are complementary, as greater resilience at the individual bank level reduces the risk of system-wide shocks. Regarding the macroeconomic impact of Basel III, the OECD estimates that the medium-term impact of Basel III implementation on GDP growth is in the range of -0.05 to -0.15 percent per year. Economic output is mainly affected by an increase in bank lending spreads, as banks pass a rise in bank funding costs, due to higher capital requirements, on to their customers. The estimated effects on GDP growth assume no active response from monetary policy; the impact of Basel III on economic output could be offset by a reduction (or delayed increase) in monetary policy rates of about 30 to 80 basis points. The aim of this paper is to create a framework based on the recent regulations in order to prevent financial crises, so that the measures taken to overcome the global financial crisis may help prevent financial crises in future periods. The first part of the paper examines the effects of the global crisis on the banking system and the concept of financial regulation. The second part analyzes financial regulations, and Basel III in particular. The last section explores the possible macroeconomic impacts of Basel III.

Simulation of Dynamics of a Permanent Magnet Linear Actuator

A comparison of two approaches to simulating the dynamic behaviour of a permanent magnet linear actuator is presented. These are a fully coupled model, in which the electromagnetic field, electric circuit, and mechanical motion problems are solved simultaneously, and a decoupled model, in which first a set of static magnetic field analyses is carried out, and then the electric circuit and mechanical motion equations are solved employing bi-cubic spline approximations of the field analysis results. The results show that the proposed decoupled model is of satisfactory accuracy and gives more flexibility when the actuator response has to be estimated for different external conditions, e.g., external circuit parameters or mechanical loads.
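A minimal Python sketch of the decoupled idea: a force table over a position-current grid (here a made-up analytic surrogate standing in for the static field analyses) is wrapped in a bi-cubic spline, and the circuit and motion equations are then integrated against the spline instead of the field solver. All circuit and mechanical parameters are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Stand-in for the static field analyses: tabulate force F(x, i) on a grid.
xs = np.linspace(0.0, 0.02, 21)               # stroke positions (m)
cur = np.linspace(-6.0, 6.0, 25)              # coil currents (A)
F_tab = 10.0 * np.cos(np.pi * xs / 0.04)[:, None] * cur[None, :]
force = RectBivariateSpline(xs, cur, F_tab)   # bi-cubic by default (kx=ky=3)

# Decoupled simulation: circuit and motion ODEs query the spline only.
R, L, m, U = 2.0, 0.01, 0.1, 12.0             # ohm, H, kg, V (illustrative)
x, v, i, dt = 0.0, 0.0, 0.0, 1e-5
for _ in range(20000):                        # up to 0.2 s, explicit Euler
    F = force(x, i)[0, 0]                     # spline in place of the FEM
    i += (U - R * i) / L * dt                 # simplified circuit (no back-EMF)
    v += F / m * dt
    x += v * dt
    if x >= xs[-1]:                           # mover reached end of stroke
        break
print(f"x = {x*1e3:.2f} mm, v = {v:.3f} m/s, i = {i:.2f} A")
```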

Evaluating Sinusoidal Functions by a Low Complexity Cubic Spline Interpolator with Error Optimization

We present a novel scheme to evaluate sinusoidal functions with low complexity and high precision using cubic spline interpolation. To this end, two different approaches are proposed to find the interpolating polynomial of sin(x) within the range [-π, π]. The first deals with only a single data point, while the other uses two, to keep the realization cost as low as possible. An approximation error optimization technique for cubic spline interpolation is introduced next and is shown to increase the interpolator's accuracy without increasing the complexity of the associated hardware. The architectures for the proposed approaches are also developed, and they exhibit flexibility of implementation with low power requirements.
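As a quick baseline for the kind of accuracy at stake, the Python sketch below builds a cubic-spline interpolant of sin(x) on [-π, π] and measures its worst-case error on a dense grid; the knot count is an assumption, not the paper's single/two data-point segment scheme or its error optimization.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Cubic-spline interpolant of sin(x) on [-pi, pi] with evenly spaced knots.
knots = np.linspace(-np.pi, np.pi, 9)
spline = CubicSpline(knots, np.sin(knots))

# Worst-case approximation error over a dense evaluation grid.
x = np.linspace(-np.pi, np.pi, 10001)
err = np.abs(spline(x) - np.sin(x))
print(f"max abs error with {len(knots)} knots: {err.max():.2e}")
```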