Towards Compliance Reporting Using a Balanced Scorecard

Compliance requires effective communication within an enterprise as well as towards a company's external environment. This requirement commences with the implementation of compliance in large-scale compliance projects and persists in compliance reporting during standard operations. On the one hand, it promotes the understanding of compliance necessities within the organization; on the other hand, it reduces asymmetric information between the company and its compliance stakeholders. To reach this goal, a central reporting must provide a consolidated view of the statuses of the different compliance efforts. A concept that could be adapted for this purpose is the balanced scorecard by Kaplan and Norton. This concept has not yet been analyzed in detail concerning its adequacy for a holistic compliance reporting that starts in compliance projects and continues into regular compliance operations. This paper first evaluates whether a holistic compliance reporting can be designed using the balanced scorecard concept. The current status of compliance reporting clearly shows that scorecards are generally accepted as a compliance reporting tool and are already used for corporate governance reporting; additional specialized compliance IT solutions exist in the market. After the scorecard's adequacy is thoroughly examined and proven, an example strategy map is defined as the basis for deriving a compliance balanced scorecard. This definition answers the question of how to proceed in designing a compliance reporting tool.

Scope and Application of Collaborative Tools and Digital Manufacturing in Dentistry

It is necessary to incorporate technological advances achieved in the field of engineering into dentistry in order to enhance the processes of diagnosis and treatment planning and to enable doctors to render better treatment to their patients. To achieve this goal, long-distance collaborations are often necessary. This paper discusses various collaborative tools and their application to solving some of the pressing problems confronted by dentists. Customization is often the solution to most of these problems, but rapid design, development, and cost-effective manufacturing are difficult to achieve. This problem can be solved using digital manufacturing. Cases from six major branches of dentistry are discussed, and possible solutions based on state-of-the-art rapid digital manufacturing are proposed. The paper also details the use of existing tools in the collaborative and digital manufacturing areas.

Synthesis and Electrochemical Characterization of Iron Oxide/Activated Carbon Composite Electrode for Symmetrical Supercapacitor

In the present work, we have developed a symmetric electrochemical capacitor based on nanostructured iron oxide (Fe3O4)-activated carbon (AC) nanocomposite materials. The physical properties of the nanocomposites were characterized by Scanning Electron Microscopy (SEM) and Brunauer-Emmett-Teller (BET) analysis. The electrochemical performance of the composite electrode in 1.0 M Na2SO3 and 1.0 M Na2SO4 aqueous solutions was evaluated using cyclic voltammetry (CV) and electrochemical impedance spectroscopy (EIS). The composite electrode with 4 wt% iron oxide nanomaterials exhibits the highest capacitance of 86 F/g. The experimental results clearly indicate that incorporating iron oxide nanomaterials at low concentration can improve the capacitive performance of the composite, mainly attributable to the contribution of the pseudocapacitive charge storage mechanism and the enhancement of the effective surface area of the electrode. Nevertheless, there is an optimum threshold on the amount of iron oxide that can be incorporated into the composite system. When this threshold is exceeded, the capacitive performance of the electrode starts to deteriorate as a result of undesired particle aggregation, which is clearly visible in the SEM analysis. The electrochemical performance of the composite electrode is found to be superior when Na2SO3 is used as the electrolyte, compared to the Na2SO4 solution. It is believed that Fe3O4 nanoparticles provide favourable surface adsorption sites for sulphite (SO3^2-) anions, which act as catalysts for subsequent redox and intercalation reactions.
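
As a rough illustration of where a figure such as 86 F/g comes from, the sketch below applies the standard relation C = (integral of |I| dV) / (2 m v dV) for extracting gravimetric capacitance from a CV curve; the scan rate, electrode mass, and idealized rectangular current response are all invented for the example (and chosen so the result lands near the reported value).

```python
import numpy as np

scan_rate = 0.005          # V/s (assumed)
mass = 0.004               # g of active material (assumed)
potential = np.linspace(0.0, 1.0, 500)              # potential window (V)
current_fwd = 1.7e-3 * np.ones_like(potential)      # A, idealized rectangular CV
current_rev = -1.7e-3 * np.ones_like(potential)     # reverse sweep

# Integrate |I| dV over the full cycle (forward + reverse sweeps).
charge_area = np.trapz(np.abs(current_fwd), potential) + \
              np.trapz(np.abs(current_rev), potential)

delta_v = potential.max() - potential.min()
# Gravimetric capacitance: C = (integral of |I| dV) / (2 * m * scan_rate * dV)
c_specific = charge_area / (2.0 * mass * scan_rate * delta_v)
print(f"Specific capacitance: {c_specific:.1f} F/g")
```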

A Bi-Objective Preventive Healthcare Facility Network Design Incorporating Cost and Time Savings

The main goal of preventive healthcare is to decrease the likelihood and severity of potentially life-threatening illnesses through protection and early detection. In this paper, we propose a bi-objective nonlinear integer programming model for designing a network of preventive healthcare facilities. The two objective functions, minimized simultaneously, are the establishment and staffing costs and the total travel and waiting time that clients spend. Each facility acts as an M/M/1 queuing system. The main determinants of the healthcare facility network are the number of facilities to be established, the location of each facility, and the level of technology chosen for each facility. Finally, to demonstrate the performance of the proposed model, four multi-objective decision-making techniques are presented to solve the model.
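
As a minimal sketch of how the two objectives interact (our illustration, with all numbers assumed, not the authors' model), the snippet below evaluates one candidate design: facility j is an M/M/1 queue, so the expected sojourn time per client is W_j = 1 / (mu_j - lambda_j), and the time objective sums travel plus sojourn time over the assigned demand.

```python
import numpy as np

fixed_cost   = np.array([120.0, 150.0])   # establishment + staffing cost (assumed units)
service_rate = np.array([10.0, 12.0])     # mu_j, clients per hour
arrivals     = np.array([7.0, 9.0])       # lambda_j, demand assigned to each open facility
travel_time  = np.array([0.4, 0.6])       # mean travel time per client (hours, assumed)

assert np.all(arrivals < service_rate), "M/M/1 stability requires lambda < mu"

# Objective 1: total establishment and staffing cost of the open facilities.
cost_objective = fixed_cost.sum()

# Objective 2: total travel plus waiting (sojourn) time across all clients.
sojourn = 1.0 / (service_rate - arrivals)          # W_j for each facility
time_objective = np.sum(arrivals * (travel_time + sojourn))

print(f"cost objective: {cost_objective:.0f}, "
      f"time objective: {time_objective:.2f} client-hours")
```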

A Cumulative Learning Approach to Data Mining Employing Censored Production Rules (CPRs)

Knowledge is indispensable, but voluminous knowledge becomes a bottleneck for efficient processing. A great challenge for data mining activity is the generation of a large number of potential rules as a result of the mining process; in fact, the result is sometimes comparable in size to the original data. Traditional data mining pruning activities, such as support, do not sufficiently reduce the huge rule space. Moreover, many practical applications are characterized by continual change of data and knowledge, making knowledge more voluminous with each change. The most predominant representation of the discovered knowledge is the standard Production Rule (PR) in the form If P Then D. Michalski and Winston proposed Censored Production Rules (CPRs) as an extension of production rules that exhibits variable precision and supports an efficient mechanism for handling exceptions. A CPR is an augmented production rule of the form If P Then D Unless C, where C (the censor) is an exception to the rule. Such rules are employed in situations in which the conditional statement 'If P Then D' holds frequently and the assertion C holds rarely. By using a rule of this type, we are free to ignore the exception conditions when the resources needed to establish their presence are tight, or when there is simply no information available as to whether they hold or not. Thus the 'If P Then D' part of the CPR expresses important information, while the Unless C part acts only as a switch that changes the polarity of D to ~D. In this paper a scheme based on the Dempster-Shafer Theory (DST) interpretation of a CPR is suggested for discovering CPRs from the discovered flat PRs. The discovery of CPRs from flat rules results in a considerable reduction of the already discovered rules. The proposed scheme incrementally incorporates new knowledge and also reduces the size of the knowledge base considerably with each episode. Examples are given to demonstrate the behaviour of the proposed scheme. The suggested cumulative learning scheme would be useful in mining data streams.
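
A minimal sketch of the CPR form itself (our own construction, not the paper's DST scheme; all names and certainty values are hypothetical): the rule fires with full belief when the censor is known, and with a hedged base certainty when checking the censor is not affordable.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class CensoredProductionRule:
    premise: Callable[[dict], bool]           # P
    decision: str                             # D
    censor: Callable[[dict], Optional[bool]]  # C; returns None if unknown
    certainty: float = 0.9                    # how often "If P Then D" holds

    def fire(self, facts: dict):
        if not self.premise(facts):
            return None
        c = self.censor(facts)
        if c is True:                  # exception holds: polarity flips to ~D
            return ("not " + self.decision, 1.0)
        if c is False:                 # exception ruled out: full confidence in D
            return (self.decision, 1.0)
        return (self.decision, self.certainty)  # censor unknown: hedged D

rule = CensoredProductionRule(
    premise=lambda f: f.get("bird", False),
    decision="flies",
    censor=lambda f: f.get("penguin"),        # None when not recorded
)
print(rule.fire({"bird": True}))                   # ('flies', 0.9)
print(rule.fire({"bird": True, "penguin": True}))  # ('not flies', 1.0)
```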

Slovenian Text-to-Speech Synthesis for Speech User Interfaces

The paper presents the design concept of a unit-selection text-to-speech synthesis system for the Slovenian language. Due to its modular and upgradable architecture, the system can be used in a variety of speech user interface applications, ranging from carrier-grade server voice portal applications and desktop user interfaces to specialized embedded devices. Since memory and processing power requirements are important factors for a possible implementation in embedded devices, lexica and speech corpora need to be reduced. We describe a simple and efficient implementation of a greedy subset selection algorithm that extracts a compact subset of high-coverage text sentences. An experiment on a reference text corpus showed that the subset selection algorithm produced a compact sentence subset with small redundancy. The adequacy of the spoken output was evaluated by several subjective tests, as recommended by the International Telecommunication Union (ITU).
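
The greedy subset selection can be sketched as follows (our illustration; the coverage unit here is a simple character bigram standing in for the diphone or other unit inventory the real system would use): at each step the sentence covering the most not-yet-covered units is added.

```python
def units(sentence):
    """Coverage units of a sentence; character bigrams stand in for diphones."""
    s = sentence.lower()
    return {s[i:i + 2] for i in range(len(s) - 1)}

def greedy_subset(corpus, target_coverage=1.0):
    universe = set().union(*(units(s) for s in corpus))
    covered, subset = set(), []
    while len(covered) / len(universe) < target_coverage:
        # Pick the sentence adding the most not-yet-covered units.
        best = max(corpus, key=lambda s: len(units(s) - covered))
        gain = units(best) - covered
        if not gain:               # no remaining sentence adds anything new
            break
        subset.append(best)
        covered |= gain
    return subset

corpus = ["dober dan", "hvala lepa", "dobro jutro", "na svidenje"]
print(greedy_subset(corpus))
```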

Single and Multiple Sourcing in the Auto-Manufacturing Industry

This article outlines a hybrid method, incorporating multiple techniques into an evaluation process, for selecting competitive suppliers in a supply chain. It enables a purchaser to perform single sourcing or multiple sourcing by calculating a combined supplier score, which accounts for both the qualitative and the quantitative factors that have an impact on supply chain performance.
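
One common way to obtain such a combined score (a hedged sketch; the article's actual factors and weighting scheme are not given here, so all factor names, weights, and values below are assumptions) is to normalize each factor to a common scale and take a weighted sum:

```python
import numpy as np

suppliers = ["A", "B", "C"]
# Quantitative factors (lower is better): unit price, defect rate (%)
quantitative = np.array([[10.2, 0.8],
                         [ 9.7, 1.5],
                         [10.9, 0.4]])
# Qualitative factors (1-10, higher is better): service, flexibility
qualitative = np.array([[8, 7],
                        [6, 9],
                        [9, 6]])

# Normalize so every column maps to [0, 1] with "higher is better".
quant_norm = (quantitative.max(axis=0) - quantitative) / np.ptp(quantitative, axis=0)
qual_norm = (qualitative - qualitative.min(axis=0)) / np.ptp(qualitative, axis=0)

weights = np.array([0.35, 0.25, 0.25, 0.15])   # assumed importance weights
scores = np.hstack([quant_norm, qual_norm]) @ weights
for name, score in sorted(zip(suppliers, scores), key=lambda t: -t[1]):
    print(f"Supplier {name}: {score:.2f}")
```

A purchaser doing single sourcing would take the top-ranked supplier; for multiple sourcing, order quantities could be allocated in proportion to the scores.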

Scale-Space Volume Descriptors for Automatic 3D Facial Feature Extraction

An automatic method for the extraction of feature points for face-based applications is proposed. The system is based upon volumetric feature descriptors, which in this paper have been extended to incorporate scale space. The method is robust to noise and has the ability to extract local and holistic features simultaneously from faces stored in a database. Extracted features are stable over a range of faces, with results indicating that, in terms of intra-ID variability, the technique has the ability to outperform manual landmarking.

Average Switching Thresholds and Average Throughput for Adaptive Modulation Using a Markov Model

The motivation for adaptive modulation and coding is to adjust the method of transmission to ensure that maximum efficiency is achieved over the link at all times. The receiver estimates the channel quality and reports it back to the transmitter, which then maps the reported quality into a link mode. This mapping, however, is not one-to-one. In this paper we investigate a method for selecting the proper modulation scheme that can dynamically adapt the mapping of the Signal-to-Noise Ratio (SNR) into a link mode. By incorporating errors observed in the received data, it enables the use of the right modulation scheme irrespective of changes in the channel conditions. We propose a Markov model for this method and use it to derive the average switching thresholds and the average throughput. We show that the average throughput of this method outperforms that of the conventional threshold method.
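
To illustrate the kind of quantity such a Markov model yields (a toy construction of ours, not the paper's chain; the transition probabilities, mode rates, and error rates are invented), the sketch below treats link modes as states, finds the stationary distribution, and weights each mode's rate by its occupancy:

```python
import numpy as np

P = np.array([[0.7, 0.3, 0.0],     # transition matrix between 3 link modes
              [0.2, 0.6, 0.2],
              [0.0, 0.4, 0.6]])
bits_per_symbol = np.array([1.0, 2.0, 4.0])     # e.g. BPSK, QPSK, 16-QAM
packet_error    = np.array([0.01, 0.05, 0.15])  # assumed error rates per mode

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi = pi / pi.sum()

avg_throughput = np.sum(pi * bits_per_symbol * (1.0 - packet_error))
print(f"Stationary distribution: {np.round(pi, 3)}")
print(f"Average throughput: {avg_throughput:.2f} bits/symbol")
```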

A Hybrid CamShift and l1-Minimization Video Tracking Algorithm

The Continuously Adaptive Mean-Shift (CamShift) algorithm, incorporating scene depth information, is combined with the l1-minimization sparse-representation-based method to form a hybrid kernel- and state-space-based tracking algorithm. We take advantage of the increased efficiency of the former and the robustness to occlusion of the latter. A simple interchange scheme transfers control between the algorithms based upon drift and occlusion likelihood, which are quantified by the projection of target candidates onto a depth map of the 2D scene obtained with a low-cost stereo vision webcam. The result is improved tracking, in terms of drift, over each algorithm individually in a challenging practical outdoor multiple-occlusion test case.
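
Schematically, the interchange logic can be as simple as the following (a hypothetical sketch; the thresholds and score names are ours, and the real scheme derives its scores from the depth-map projection described above):

```python
def select_tracker(drift_score, occlusion_likelihood,
                   drift_thresh=0.4, occ_thresh=0.5):
    """Return which tracker should own the next frame."""
    if occlusion_likelihood > occ_thresh or drift_score > drift_thresh:
        return "l1_minimization"   # robust to occlusion, slower
    return "camshift"              # efficient kernel tracker

# Example: depth projection suggests the target is partially occluded.
print(select_tracker(drift_score=0.2, occlusion_likelihood=0.7))
```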

Identifying and Adopting the Latest Instruments Determining Sustainable Company Competitiveness

Nowadays, companies in all sectors are looking for sources of competitive advantage. The holistic marketing approach searches for their emergence in the integration of all components and elements across the organization. Modern marketing sees the sources of competitive advantage in implementing the latest managerial practices, motivation, intelligent project management, knowledge management, collaborative marketing, CSR and, in recent years, also in business process optimization. With the use of modern tools, including business process management and business process modelling, a company can markedly increase its internal efficiency, which can lead not only to lowering costs but also to creating the environment for optimal customer care, a positive corporate culture and the origination of innovations. In this article the authors analyze the recent trends in this area and introduce suggestions to help companies identify and optimize the key processes that have a significant impact on the company's competitiveness.

Combining LCIA and Fuzzy Risk Assessment for Environmental Impact Assessment

Environmental impact assessment (EIA) is a procedural tool of environmental management for identifying, predicting, evaluating and mitigating the adverse effects of development proposals. EIA reports usually analyze whether the amounts or concentrations of pollutants obey the relevant standards. In fact, many analytical tools can deepen the analysis of environmental impacts in EIA reports, such as life cycle assessment (LCA) and environmental risk assessment (ERA). Life cycle impact assessment (LCIA) is the step in LCA that introduces the causal relationships between environmental hazards and damage. Incorporating the LCIA concept into ERA as an integrated tool for EIA can extend the focus from the regulatory compliance of environmental impacts to the determination of their significance. When using integrated tools, it is sometimes necessary to consider fuzzy situations due to insufficient information; therefore, ERA should be generalized to fuzzy risk assessment (FRA). Finally, the use of the proposed methodology is demonstrated through a case study of the expansion plan of the world's largest plastics processing factory.
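
A minimal example of what "fuzzy" adds to a risk estimate (our illustration, not the paper's method; all numbers are assumed): likelihood and consequence are triangular fuzzy numbers, and alpha-cut interval arithmetic propagates their uncertainty into the risk product.

```python
def alpha_cut(tfn, alpha):
    """Interval of a triangular fuzzy number (low, mode, high) at level alpha."""
    low, mode, high = tfn
    return (low + alpha * (mode - low), high - alpha * (high - mode))

likelihood  = (0.1, 0.2, 0.4)    # assumed fuzzy probability of exceedance
consequence = (2.0, 5.0, 9.0)    # assumed fuzzy impact score

for alpha in (0.0, 0.5, 1.0):
    l_lo, l_hi = alpha_cut(likelihood, alpha)
    c_lo, c_hi = alpha_cut(consequence, alpha)
    # Product of positive intervals: endpoints multiply directly.
    print(f"alpha={alpha:.1f}: risk in [{l_lo * c_lo:.2f}, {l_hi * c_hi:.2f}]")
```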

Using Mean-Shift Tracking Algorithms for Real-Time Tracking of Moving Images on an Autonomous Vehicle Testbed Platform

This paper describes new computer vision algorithms that have been developed to track moving objects as part of a long-term study into the design of (semi-)autonomous vehicles. We present the results of a study exploiting variable kernels for tracking in video sequences. The basis of our work is the mean-shift object-tracking algorithm: for a moving target, it is usual to define a rectangular target window in an initial frame, and then process the data within that window to separate the tracked object from the background by the mean-shift segmentation algorithm. Rather than use the standard Epanechnikov kernel, we have used a kernel weighted by the Chamfer distance transform to improve the accuracy of target representation and localization, minimising the distance between the two distributions in RGB color space using the Bhattacharyya coefficient. Experimental results show the improved tracking capability and versatility of the algorithm in comparison with results using the standard kernel. These algorithms are incorporated as part of a robot test-bed architecture which has been used to demonstrate their effectiveness.
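
The similarity measure at the heart of the localization step is straightforward to state (a sketch, with random stand-ins for the kernel-weighted histograms the tracker would actually build): the Bhattacharyya coefficient between the normalized target and candidate color histograms.

```python
import numpy as np

def bhattacharyya(p, q):
    """Bhattacharyya coefficient; 1.0 means identical distributions."""
    p = p / p.sum()
    q = q / q.sum()
    return np.sum(np.sqrt(p * q))

rng = np.random.default_rng(0)
target_hist    = rng.random(16 ** 3)   # 16x16x16 RGB histogram, flattened
candidate_hist = target_hist + 0.1 * rng.random(16 ** 3)  # perturbed candidate

print(f"Bhattacharyya coefficient: {bhattacharyya(target_hist, candidate_hist):.4f}")
```

Mean-shift localization then moves the candidate window so as to maximize this coefficient, which is equivalent to minimizing the Bhattacharyya distance between the two distributions.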

Techniques for Video Mosaicing

Video mosaicing is the stitching of selected frames of a video by estimating the camera motion between the frames and thereby registering successive frames of the video to arrive at the mosaic. Different techniques have been proposed in the literature for video mosaicing. Despite the large number of papers dealing with techniques to generate mosaics, only a few authors have investigated the conditions under which these techniques generate good estimates of the motion parameters. In this paper, these techniques are studied on different videos, and the reasons for their failures are identified. We propose algorithms that incorporate outlier removal for better estimation of the motion parameters.
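
Outlier removal in motion estimation is typically RANSAC-style (a minimal sketch under that assumption; the paper's exact scheme may differ, and the matched points below are synthetic): hypothesize a motion from a minimal sample, count inliers, and keep the best hypothesis.

```python
import numpy as np

def ransac_translation(src, dst, n_iters=200, tol=2.0, seed=1):
    """Estimate a 2D translation between matched points, rejecting mismatches."""
    rng = np.random.default_rng(seed)
    best_t, best_inliers = None, 0
    for _ in range(n_iters):
        i = rng.integers(len(src))           # one match fixes a translation
        t = dst[i] - src[i]
        inliers = np.sum(np.linalg.norm(src + t - dst, axis=1) < tol)
        if inliers > best_inliers:
            best_t, best_inliers = t, inliers
    return best_t, best_inliers

src = np.array([[10.0, 10.0], [50.0, 30.0], [80.0, 60.0], [20.0, 70.0]])
dst = src + np.array([5.0, -3.0])            # true camera motion
dst[2] = [0.0, 0.0]                          # one gross mismatch (outlier)

t, n = ransac_translation(src, dst)
print(f"Estimated translation {t} from {n} inliers")
```

Real mosaicing would estimate a full homography from four-point samples, but the hypothesize-count-keep structure is the same.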

Traffic Density Estimation for Multiple Segment Freeways

Traffic density, an indicator of traffic conditions, is one of the most critical characteristics for Intelligent Transport Systems (ITS). This paper investigates recursive traffic density estimation using the information provided by inductive loop detectors. On the basis of the phenomenological relationship between speed and density, the existing studies incorporate a state space model and update the density estimate using vehicular speed observations via the extended Kalman filter, where an approximation is made by linearizing the nonlinear observation equation. In practice, this may lead to substantial estimation errors. This paper incorporates a suitable transformation to deal with the nonlinear observation equation, so that the approximation is avoided when using the Kalman filter to estimate the traffic density. A numerical study is conducted, and it is shown that the developed method outperforms the existing methods for traffic density estimation.
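
To make the transformation idea concrete (our illustration; the paper's speed-density law and parameters are not given here, so we assume an Underwood law v = v_f exp(-k/k_0) and invented values): taking logarithms gives ln v = ln v_f - k/k_0, an observation exactly linear in the density k, so a plain Kalman filter applies with no linearization error.

```python
import numpy as np

v_f, k_0 = 110.0, 35.0        # free-flow speed (km/h), model parameter (veh/km)
F, Q, R = 1.0, 0.5, 0.01      # random-walk state model and noise variances (assumed)
H, offset = -1.0 / k_0, np.log(v_f)   # transformed observation: z = ln(v)

k_est, P = 10.0, 4.0          # initial density estimate and its variance
for speed in [78.0, 74.0, 69.0, 66.0]:        # loop-detector speeds (km/h)
    # Predict (random-walk density dynamics).
    k_pred, P_pred = F * k_est, F * P * F + Q
    # Update with the log-transformed speed observation.
    z = np.log(speed)
    S = H * P_pred * H + R
    K = P_pred * H / S
    k_est = k_pred + K * (z - (offset + H * k_pred))
    P = (1.0 - K * H) * P_pred
    print(f"speed {speed:5.1f} km/h -> density estimate {k_est:5.1f} veh/km")
```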

Immobilization of Simulated High Level Nuclear Wastes with Li2O-CeO2-Fe2O3-P2O5 Glasses

The leaching behavior and structure of Li2O-CeO2-Fe2O3-P2O5 glasses incorporating simulated high-level nuclear wastes (HLW) were studied. The leach rates, both gross and for each constituent element, were determined from the total weight loss of the specimen and from leachate analyses by inductively coupled argon plasma spectroscopy (ICP). The gross leach rate of the 4.5Li2O-9.7CeO2-34.7Fe2O3-51.5P2O5 glass waste form containing 45 mass% simulated HLW is of the order of 10

Application of the Balanced Scorecard into the Formulation of the Firm Strategy

In the contemporary global and dynamically developing environment, strategic planning is fundamental. It is a complicated but important process from the point of view of continually maintaining a competitive advantage. The aim of the paper is the formulation of strategic goals for the needs of small enterprises. The Balanced Scorecard, a balanced system of indicators, is used for clarifying the vision and translating it into particular goals. Within the individual perspectives, the focus is on strategic goals. Finally, the concept of competitiveness IDINMOSU, which connects to the Balanced Scorecard, is mentioned.

Globally Convergent Edge-Preserving Reconstruction with Contour-Line Smoothing

The standard approach to image reconstruction is to stabilize the problem by including an edge-preserving roughness penalty in addition to faithfulness to the data. However, this methodology produces noisy object boundaries and creates a staircase effect. The existing attempts to favor the formation of smooth contour lines take the edge field explicitly into account; they either are computationally expensive or produce disappointing results. In this paper, we propose to incorporate the smoothness of the edge field in an implicit way by means of an additional penalty term defined in the wavelet domain. We also derive an efficient half-quadratic algorithm to solve the resulting optimization problem, including the case when the data fidelity term is non-quadratic and the cost function is nonconvex. Numerical experiments show that our technique preserves edge sharpness while smoothing contour lines; it produces visually pleasing reconstructions which are quantitatively better than those obtained without wavelet-domain constraints.
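
To give a flavour of the half-quadratic machinery in one dimension (a toy simplification of ours; the paper works in 2D with the additional wavelet-domain penalty, which is omitted here): an edge-preserving potential phi(t) = sqrt(t^2 + eps) is handled by alternating between updating auxiliary weights b = phi'(d) / (2d) and solving the resulting linear system.

```python
import numpy as np

def half_quadratic_denoise(y, lam=2.0, eps=1e-3, n_iters=30):
    """Minimize ||x - y||^2 + lam * sum(phi(x[i+1] - x[i])), phi(t) = sqrt(t^2 + eps)."""
    x = y.copy()
    n = len(y)
    for _ in range(n_iters):
        d = np.diff(x)
        b = 1.0 / (2.0 * np.sqrt(d ** 2 + eps))   # half-quadratic weights
        # Inner step: solve (I + lam * D^T diag(b) D) x = y for fixed weights.
        A = np.eye(n)
        for i in range(n - 1):
            w = lam * b[i]
            A[i, i] += w
            A[i + 1, i + 1] += w
            A[i, i + 1] -= w
            A[i + 1, i] -= w
        x = np.linalg.solve(A, y)
    return x

rng = np.random.default_rng(2)
clean = np.concatenate([np.zeros(25), np.ones(25)])   # an ideal step edge
noisy = clean + 0.1 * rng.standard_normal(50)
print(np.round(half_quadratic_denoise(noisy)[20:30], 2))  # edge stays sharp
```

Each inner step is a simple weighted-Laplacian solve: large differences get small weights, so smoothing is suppressed across edges while flat regions are strongly regularized.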

An Empirical Study of Taiwan's Hospital Foundation Investment in Corporate Social Responsibility and Financial Performance

Corporate Social Responsibility (CSR) has become a new trend in business governance. Since few research studies on CSR have been published in Taiwanese academia, especially for medical settings, we were interested in probing the relationship between CSR and financial performance in medical settings in Taiwan. The results illustrate that: (1) a time delay exists between CSR effort and its performance effect in the hospital foundation, (2) input into the internal domains of CSR helps improve employee productivity in the hospital foundation, and (3) input into the external domains of CSR helps improve financial performance in the hospital foundation. This study overviews CSR in the medical industry in Taiwan and the relationship between CSR and financial performance. Possible implications of the study results are discussed so that organization managers can translate the CSR concept into business strategy.

Analysis of Phosphate in Wastewater Using an Autonomous Microfluidics-Based Analyser

A portable sensor for the analysis of phosphate in aqueous samples has been developed. The sensor incorporates microfluidic technology, colorimetric detection, and wireless communications into a compact and rugged portable device. The detection method used is the molybdenum yellow method, in which a phosphate-containing sample is mixed with a reagent containing ammonium metavanadate and ammonium molybdate in an acidic medium. A yellow-coloured compound is generated and the absorption of this compound is measured using a light emitting diode (LED) light source and a photodiode detector. The absorption is directly proportional to the phosphate concentration in the original sample. In this paper we describe the application of this phosphate sensor to the analysis of wastewater at a municipal wastewater treatment plant in Co. Kildare, Ireland.
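
The colorimetric readout maps to concentration through the Beer-Lambert law, in which absorbance is proportional to concentration, so a linear calibration against standards suffices to invert the photodiode measurement (a sketch; all absorbance and concentration values below are invented for illustration):

```python
import numpy as np

standards_mg_l = np.array([0.0, 2.0, 5.0, 10.0, 20.0])    # phosphate standards
absorbance     = np.array([0.01, 0.11, 0.26, 0.52, 1.03]) # readings at the LED wavelength

# Linear calibration A = slope * c + intercept (Beer-Lambert).
slope, intercept = np.polyfit(standards_mg_l, absorbance, 1)

sample_absorbance = 0.34
sample_conc = (sample_absorbance - intercept) / slope
print(f"Estimated phosphate concentration: {sample_conc:.1f} mg/L")
```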