Towards Compliance Reporting Using a Balanced Scorecard

Compliance requires effective communication within an enterprise as well as toward a company's external environment. This requirement commences with the implementation of compliance in large-scale compliance projects and persists in the compliance reporting of standard operations. On the one hand, it promotes an understanding of compliance necessities within the organization; on the other hand, it reduces asymmetric information between the company and its compliance stakeholders. To reach this goal, a central reporting must provide a consolidated view of the statuses of different compliance efforts. A concept that could be adapted for this purpose is the balanced scorecard by Kaplan and Norton. This concept has not been analyzed in detail concerning its adequacy for holistic compliance reporting, from its use in compliance projects through its later use in regular compliance operations. This paper first evaluates whether a holistic compliance reporting can be designed using the balanced scorecard concept. The current status of compliance reporting clearly shows that scorecards are generally accepted as a compliance reporting tool and are already used for corporate governance reporting; in addition, specialized compliance IT solutions exist on the market. After the scorecard's adequacy is thoroughly examined and proven, an example strategy map is defined as the basis from which to derive a compliance balanced scorecard. This definition answers the question of how to proceed in designing a compliance reporting tool.

Combining the Description Features of UML-RT and CSP+T Specifications Applied to a Complete Design of Real-Time Systems

UML is a collection of notations for capturing a software system specification. These notations have a specific syntax defined by the Object Management Group (OMG), but many of their constructs only have informal semantics. They are primarily graphical, with textual annotation. The inadequacies of standard UML as a vehicle for the complete specification and implementation of real-time embedded systems have led to a variety of competing and complementary proposals. The Real-Time UML profile (UML-RT), developed and standardized by the OMG, defines a unified framework to express the time, scheduling and performance aspects of a system. In this paper we present a framework approach aimed at deriving a complete specification of a real-time system. To this end, we combine two methods: a semiformal one, UML-RT, which allows the visual modeling of a real-time system, and a formal one, CSP+T, which is a design language that includes the specification of real-time requirements. To show the applicability of the approach, a correct design of a real-time system with hard real-time constraints is obtained by applying a set of mapping rules.

A New Mitigation Technique to Overcome DDoS Attacks

In this paper, we explore a new scheme for filtering spoofed packets (DDoS attacks) that combines the path fingerprint and client puzzle concepts. In this scheme, a unique fingerprint is embedded in each IP packet that represents the route the packet has traversed. The server maintains a mapping table that contains each client's IP address and its corresponding fingerprint. A client puzzle is placed at the ingress router: for each request, the puzzle issuer provides a puzzle that the source has to solve. Our design has the following advantages over prior approaches: 1) network traffic is reduced, since the client puzzle is placed at the ingress router; 2) the mapping table at the server is lightweight and of moderate size.
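
As a rough illustration of the two building blocks, the sketch below combines a hash-based path fingerprint with a hash-preimage client puzzle. The fingerprint derivation, table contents and puzzle difficulty are our assumptions for illustration, not the authors' exact design.

```python
import hashlib
import os

def path_fingerprint(router_ids):
    """Derive a compact fingerprint from the routers a packet traversed."""
    digest = hashlib.sha256("-".join(router_ids).encode()).hexdigest()
    return digest[:8]  # short fingerprint carried in the packet

# Server-side mapping table: client IP -> expected path fingerprint.
mapping_table = {"192.0.2.7": path_fingerprint(["r1", "r4", "r9"])}

def server_accepts(src_ip, fingerprint):
    """Drop packets whose fingerprint disagrees with the stored path."""
    return mapping_table.get(src_ip) == fingerprint

# Ingress-router client puzzle: find a nonce such that the hash of
# (challenge + nonce) starts with a given number of zero hex digits.
def make_puzzle(difficulty=4):
    return os.urandom(8).hex(), difficulty

def solve_puzzle(challenge, difficulty):
    nonce = 0
    while not hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest().startswith("0" * difficulty):
        nonce += 1
    return nonce

def verify_puzzle(challenge, difficulty, nonce):
    return hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest().startswith("0" * difficulty)

challenge, d = make_puzzle()
assert verify_puzzle(challenge, d, solve_puzzle(challenge, d))
```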

Fuzzy T-Neighborhood Groups Acting on Sets

In this paper, the T-G-action topology on a set acted on by a fuzzy T-neighborhood (T-neighborhood, for short) group is defined as a final T-neighborhood topology with respect to a set of maps. We mainly prove that this topology is a T-regular T-neighborhood topology.
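
For orientation, the classical final-topology construction that the final T-neighborhood topology generalizes is sketched below; the fuzzy version replaces the open-set systems with T-neighborhood systems (our simplification, not the paper's definition).

```latex
% Classical final topology: given maps f_i : X_i \to X with topologies
% \tau_i on the domains, the final topology on X is the finest topology
% making every f_i continuous:
\[
  \tau_{\mathrm{final}} \;=\; \{\, U \subseteq X \;:\; f_i^{-1}(U) \in \tau_i
  \ \text{for all } i \in I \,\}.
\]
```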

Evolutionary Algorithms for Learning Primitive Fuzzy Behaviors and Behavior Coordination in Multi-Objective Optimization Problems

Evolutionary robotics is concerned with the design of intelligent systems with life-like properties by means of simulated evolution. Approaches in evolutionary robotics can be categorized according to the control structures that represent the behavior and the parameters of the controller that undergo adaptation. The basic idea is to automatically synthesize behaviors that enable the robot to perform useful tasks in complex environments. The evolutionary algorithm searches through the space of parameterized controllers that map sensory perceptions to control actions, thus realizing a specific robotic behavior. Further, the evolutionary algorithm maintains and improves a population of candidate behaviors by means of selection, recombination and mutation. A fitness function evaluates the performance of the resulting behavior according to the robot's task or mission. In this paper, the focus is on the use of genetic algorithms to solve a multi-objective optimization problem representing robot behaviors; in particular, the A-Compander Law is employed in selecting the weight of each objective during the optimization process. Results using an adaptive fitness function show that this approach can efficiently react to complex tasks under variable environments.
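
A minimal sketch of such a genetic algorithm with a weighted multi-objective fitness follows. The two objective functions and the fixed weight are placeholders; the paper's A-Compander-based weight selection is not reproduced here.

```python
import random

def objectives(x):
    # stand-ins for e.g. "reach the goal" vs. "avoid obstacles"
    return (-(x - 2.0) ** 2, -(x + 1.0) ** 2)

def fitness(x, w):
    f1, f2 = objectives(x)
    return w * f1 + (1.0 - w) * f2   # weighted aggregation of objectives

def evolve(pop_size=30, generations=50, w=0.7):
    pop = [random.uniform(-5, 5) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda x: fitness(x, w), reverse=True)
        parents = pop[: pop_size // 2]            # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = 0.5 * (a + b)                 # recombination
            child += random.gauss(0, 0.1)         # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda x: fitness(x, w))

print(evolve())  # best controller parameter under the chosen weight
```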

Hand Gesture Recognition Based on Combined Feature Extraction

Hand gesture is an active area of research in the vision community, mainly for the purposes of sign language recognition and human-computer interaction. In this paper, we propose a system to recognize alphabet characters (A-Z) and numbers (0-9) in real time from stereo color image sequences using Hidden Markov Models (HMMs). Our system is based on three main stages: automatic segmentation and preprocessing of the hand regions, feature extraction, and classification. In the automatic segmentation and preprocessing stage, color and a 3D depth map are used to detect the hands, whose trajectory is then tracked using the mean-shift algorithm and a Kalman filter. In the feature extraction stage, 3D combined features of location, orientation and velocity with respect to Cartesian coordinate systems are used; k-means clustering is then employed to generate the HMM codewords. In the final stage, classification, the Baum-Welch algorithm is used to fully train the HMM parameters, and the gestures of alphabet characters and numbers are recognized using a left-right banded model in conjunction with the Viterbi algorithm. Experimental results demonstrate that our system can successfully recognize hand gestures with a 98.33% recognition rate.
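
To illustrate the recognition step, the following is a minimal discrete-HMM Viterbi decoder over codeword indices. The 3-state left-right banded transition matrix and the emission probabilities are illustrative assumptions, not the paper's trained parameters.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely state path for a sequence of codeword indices `obs`."""
    with np.errstate(divide="ignore"):          # log(0) -> -inf is intended
        log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)
    T, n = len(obs), A.shape[0]
    delta = np.zeros((T, n))                    # best log-score per state
    psi = np.zeros((T, n), dtype=int)           # back-pointers
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A  # all predecessor scores
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

# 3-state left-right banded model: each state may stay or advance one step.
A = np.array([[0.6, 0.4, 0.0],
              [0.0, 0.7, 0.3],
              [0.0, 0.0, 1.0]])
pi = np.array([1.0, 0.0, 0.0])
B = np.array([[0.7, 0.2, 0.1],   # emission probabilities over 3 codewords
              [0.1, 0.8, 0.1],
              [0.1, 0.2, 0.7]])
print(viterbi([0, 0, 1, 1, 2], pi, A, B))   # e.g. [0, 0, 1, 1, 2]
```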

Average Switching Thresholds and Average Throughput for Adaptive Modulation Using a Markov Model

The motivation for adaptive modulation and coding is to adjust the method of transmission to ensure that the maximum efficiency is achieved over the link at all times. The receiver estimates the channel quality and reports it back to the transmitter, which then maps the reported quality into a link mode. This mapping, however, is not one-to-one. In this paper we investigate a method for selecting the proper modulation scheme that can dynamically adapt the mapping of the Signal-to-Noise Ratio (SNR) into a link mode. It enables the use of the right modulation scheme irrespective of changes in the channel conditions by incorporating errors in the received data. We propose a Markov model for this method and use it to derive the average switching thresholds and the average throughput. We show that the average throughput of this method outperforms that of the conventional threshold method.
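
A minimal sketch of such an adaptive SNR-to-mode mapping follows. The modes, initial thresholds and adaptation step are assumptions; the paper's Markov-model derivation of the average thresholds is not reproduced here.

```python
MODES = ["BPSK", "QPSK", "16QAM", "64QAM"]
thresholds = [6.0, 12.0, 18.0]   # dB boundaries between consecutive modes

def select_mode(snr_db):
    """Map a reported SNR to a link mode via the current thresholds."""
    for i, t in enumerate(thresholds):
        if snr_db < t:
            return i
    return len(MODES) - 1

def adapt(mode_idx, packet_error, step=0.25):
    """Shift the local threshold up on errors, down on successes,
    so the mapping tracks the actual channel conditions."""
    if packet_error and mode_idx > 0:
        thresholds[mode_idx - 1] += step    # be more conservative
    elif not packet_error and mode_idx < len(MODES) - 1:
        thresholds[mode_idx] -= step        # be more aggressive

m = select_mode(13.5)          # -> 16QAM with the initial thresholds
adapt(m, packet_error=True)    # observed errors raise its lower boundary
print(MODES[m], thresholds)
```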

A Hybrid CamShift and l1-Minimization Video Tracking Algorithm

The Continuously Adaptive Mean-Shift (CamShift) algorithm, incorporating scene depth information, is combined with the l1-minimization sparse-representation-based method to form a hybrid kernel- and state-space-based tracking algorithm. We take advantage of the increased efficiency of the former and the robustness to occlusion of the latter. A simple interchange scheme transfers control between the algorithms based upon drift and occlusion likelihood, which is quantified by the projection of target candidates onto a depth map of the 2D scene obtained with a low-cost stereo vision webcam. The result is improved tracking, in terms of drift, over each algorithm individually in a challenging practical outdoor multiple-occlusion test case.
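
A minimal sketch of such an interchange scheme follows. The hysteresis thresholds and the likelihood values are placeholders for the real depth-map-based quantities.

```python
# Control passes between the fast CamShift tracker and the occlusion-robust
# l1 tracker based on estimated drift and occlusion likelihood.
OCCLUSION_ON, OCCLUSION_OFF = 0.6, 0.3   # hysteresis thresholds (assumed)
DRIFT_MAX = 0.5

def choose_tracker(current, occlusion_likelihood, drift):
    """Return which tracker should handle the next frame."""
    if current == "camshift":
        # CamShift drifts under occlusion: hand over to the l1 tracker.
        if occlusion_likelihood > OCCLUSION_ON or drift > DRIFT_MAX:
            return "l1"
    else:
        # l1-minimization is robust but slower: return when the scene clears.
        if occlusion_likelihood < OCCLUSION_OFF and drift <= DRIFT_MAX:
            return "camshift"
    return current

tracker = "camshift"
for occ, drift in [(0.1, 0.2), (0.7, 0.3), (0.5, 0.2), (0.2, 0.1)]:
    tracker = choose_tracker(tracker, occ, drift)
    print(tracker)
```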

Predicting DHF Incidence in Northern Thailand Using a Time Series Analysis Technique

This study aimed at developing a forecasting model for the number of Dengue Haemorrhagic Fever (DHF) incidences in Northern Thailand using time series analysis. We developed Seasonal Autoregressive Integrated Moving Average (SARIMA) models on the data collected between 2003 and 2006 and then validated the models using the data collected between January and September 2007. The results showed that the regressive forecast curves were consistent with the pattern of actual values. The most suitable model was the SARIMA(2,0,1)(0,2,0)12 model, with an Akaike Information Criterion (AIC) of 12.2931 and a Mean Absolute Percent Error (MAPE) of 8.91713. The SARIMA(2,0,1)(0,2,0)12 model fitted the data adequately, with the Portmanteau statistic Q(20) = 8.98644 (χ² critical value 27.5871, P > 0.05). This indicated that there was no significant autocorrelation between residuals at different lag times in the SARIMA(2,0,1)(0,2,0)12 model.
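
A model of this form can be fitted, for example, with statsmodels; the sketch below uses placeholder monthly counts in place of the 2003-2006 DHF data.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Stand-in for 48 monthly DHF case counts (2003-2006).
monthly_cases = np.random.poisson(lam=100, size=48).astype(float)

# SARIMA(2,0,1)(0,2,0)12 as identified in the paper.
model = SARIMAX(monthly_cases, order=(2, 0, 1), seasonal_order=(0, 2, 0, 12))
result = model.fit(disp=False)

print(result.aic)                     # model selection criterion used above
forecast = result.forecast(steps=9)   # Jan-Sep of the validation year
print(forecast)
```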

Analysis of Secondary School Students’ Perceptions about Information Technologies through a Word Association Test

The aim of this study is to discover secondary school students’ perceptions of information technologies and the connections between concepts in their cognitive structures. A word association test consisting of six concepts related to information technologies is used to collect data from 244 secondary school students. Concept maps that present students’ cognitive structures are drawn with the help of frequency data, and the data are analyzed and interpreted according to the connections obtained from the concept maps. Of the given concepts, students associate most with computer, Internet, and communication, and least with computer-assisted education and information technologies. These results show that the concepts Internet, communication, and computer are an important part of students’ cognitive structures. In addition, students most often answer with computer, phone, game, Internet, and Facebook as the key concepts. These answers show that students regard information technologies as a means of entertainment and a free-time activity, not as a means of education.

High-Level Synthesis of Kahn Process Networks (KPN) for Streaming Applications

Streaming applications usually consist of stages, running in parallel or in series, that incrementally transform a stream of input data. It poses a design challenge to break such an application into distinguishable blocks and then map them onto independent hardware processing elements. This requires a generic controller that automatically maps such a data stream onto independent processing elements without dependencies or manual intervention. In this paper, a Kahn Process Network (KPN) for such streaming applications is designed and developed to be mapped onto an MPSoC. The design includes a generic C-based compiler that takes the mapping specifications as input from the user, automates these design constraints, and automatically generates synthesized, optimized RTL code for the specified application.
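
The Kahn semantics underlying such a network (processes connected by FIFO channels with blocking reads, giving deterministic behavior regardless of scheduling) can be sketched as follows; the three-stage pipeline is an illustrative placeholder for a real streaming application.

```python
import threading
import queue

def producer(out_ch):
    for x in range(5):
        out_ch.put(x)          # non-blocking write to the FIFO channel
    out_ch.put(None)           # end-of-stream token

def square(in_ch, out_ch):
    while (x := in_ch.get()) is not None:   # blocking read (Kahn semantics)
        out_ch.put(x * x)
    out_ch.put(None)

def consumer(in_ch):
    while (x := in_ch.get()) is not None:
        print(x)

c1, c2 = queue.Queue(), queue.Queue()
threads = [threading.Thread(target=producer, args=(c1,)),
           threading.Thread(target=square, args=(c1, c2)),
           threading.Thread(target=consumer, args=(c2,))]
for t in threads: t.start()
for t in threads: t.join()
```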

Edge Detection in Digital Images Using a Fuzzy Logic Technique

The fuzzy technique is an operator introduced in order to simulate, at a mathematical level, the compensatory behavior in processes of decision making or subjective evaluation. This paper introduces such operators by way of a computer vision application: a novel method based on a fuzzy logic reasoning strategy is proposed for edge detection in digital images without determining a threshold value. The proposed approach begins by segmenting the images into regions using a floating 3x3 binary matrix. The edge pixels are mapped to a range of values distinct from each other. To assess robustness, the results of the proposed method for different captured images are compared to those obtained with the linear Sobel operator. The method yields consistent smoothness and straightness for straight lines and good roundness for curved lines. At the same time, corners become sharper and can be defined easily.
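
A minimal sketch of a 3x3 sliding-window edge detector with a fuzzy membership in place of a crisp threshold follows. The membership function and the local-contrast rule are our illustrative assumptions, not the paper's rule base.

```python
import numpy as np

def edge_membership(diff, low=10.0, high=60.0):
    """Fuzzy degree to which a local intensity difference indicates an edge."""
    return np.clip((diff - low) / (high - low), 0.0, 1.0)

def fuzzy_edges(img):
    img = img.astype(float)
    out = np.zeros_like(img)
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            window = img[i - 1:i + 2, j - 1:j + 2]     # floating 3x3 window
            diff = window.max() - window.min()         # local contrast
            out[i, j] = edge_membership(diff)
    return (out * 255).astype(np.uint8)   # edge pixels mapped to distinct values

test = np.zeros((8, 8)); test[:, 4:] = 200   # vertical step edge
print(fuzzy_edges(test))
```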

Object Tracking Using a MACH Filter and Optical Flow in Cluttered Scenes and Variable Lighting Conditions

The vision-based tracking problem is solved through a combination of optical flow, a MACH filter and log r-θ mapping. Optical flow is used for detecting regions of movement in video frames acquired under variable lighting conditions. Each region of movement is segmented and then searched for the target: a template is applied to the segmented regions to detect the region of interest. The template is trained offline on a sequence of target images created using the MACH filter and log r-θ mapping. The template is applied to areas of movement in successive frames, and strong correlation is seen for in-class targets. Correlation peaks above a certain threshold indicate the presence of the target, and the target is tracked over successive frames.
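
The motion-detection front end can be sketched with dense optical flow, for example as below; the video path is a placeholder, and the MACH filter training and log r-θ mapping are not shown.

```python
import cv2
import numpy as np

def movement_regions(prev_gray, gray, mag_thresh=1.0):
    """Binary mask of moving pixels between two consecutive frames."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2)          # per-pixel motion magnitude
    return (mag > mag_thresh).astype(np.uint8)

cap = cv2.VideoCapture("input.avi")             # placeholder video path
ok, prev = cap.read()
if not ok:
    raise SystemExit("no video")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    mask = movement_regions(prev_gray, gray)
    # ...correlate the trained template over the masked regions...
    prev_gray = gray
cap.release()
```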

Mapping Paddy Rice Agriculture Using Multi-Temporal FORMOSAT-2 Images

Most paddy rice fields in East Asia are small parcels, and the weather conditions during the growing season are usually cloudy. FORMOSAT-2 multi-spectral images have an 8-meter resolution and one-day recurrence, ideal for mapping paddy rice fields in East Asia. To map rice fields, this study first determined the transplanting and most active tillering stages of paddy rice and then used multi-temporal images to distinguish the different growing characteristics of paddy rice and other ground covers. Both the unsupervised ISODATA (iterative self-organizing data analysis technique) and the supervised maximum likelihood method were used to discriminate paddy rice fields, with training areas automatically derived from ten years of cultivation parcels in Taiwan. Besides the original bands of the multi-spectral images, we also generated the normalized difference vegetation index (NDVI) and experimented with object-based pre-classification and post-classification. This paper discusses the results of the different image classification methods in an attempt to find a precise and automatic solution to mapping paddy rice in Taiwan.
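
For reference, the vegetation index used alongside the original bands is computed per pixel as NDVI = (NIR - Red) / (NIR + Red); a minimal sketch with placeholder band values:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red bands."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red + 1e-9)   # epsilon avoids division by zero

nir = np.array([[0.45, 0.50], [0.10, 0.12]])  # placeholder reflectances
red = np.array([[0.10, 0.08], [0.09, 0.11]])
print(ndvi(nir, red))   # vegetated pixels high, bare soil near zero
```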

Development of EPID-Based Real-Time Dose Verification for Dynamic IMRT

An electronic portal imaging device (EPID) has become a method of patient-specific IMRT dose verification for radiotherapy. Research studies have focused on pre- and post-treatment verification; however, there are currently no interventional procedures using EPID dosimetry that measure the dose in real time, as a mechanism to ensure that overdoses do not occur and that underdoses are detected as soon as is practically possible. As a result, an EPID-based real-time dose verification system for dynamic IMRT was developed and implemented with MATLAB/Simulink. EPID image acquisition was set to continuous acquisition mode at 1.4 images per second. The system defined the time constraint gap, or execution gap, at the image acquisition time, so that every calculation must be completed before the next image capture is completed. In addition, the
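
The execution-gap constraint can be sketched as a deadline-checked processing loop: at 1.4 images per second the per-frame budget is roughly 0.714 s. The dose calculation below is a placeholder, and the sketch is in Python rather than the MATLAB/Simulink implementation used above.

```python
import time

FRAME_PERIOD = 1.0 / 1.4   # seconds between EPID image acquisitions

def process_frame(frame):
    time.sleep(0.1)        # stand-in for the real-time dose calculation

frame_id, deadline = 0, time.monotonic() + FRAME_PERIOD
while frame_id < 10:                  # placeholder acquisition loop
    process_frame(frame_id)
    if time.monotonic() > deadline:   # calculation overran the execution gap
        print(f"frame {frame_id}: deadline missed")
    time.sleep(max(0.0, deadline - time.monotonic()))  # wait for next capture
    deadline += FRAME_PERIOD
    frame_id += 1
```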

Finding Approximate Tandem Repeats with the Burrows-Wheeler Transform

Approximate tandem repeats in a genomic sequence are two or more contiguous, similar copies of a pattern of nucleotides. They are used in DNA mapping, the study of molecular evolution mechanisms, forensic analysis, and research into the diagnosis of inherited diseases. Their functions are still under investigation and not well defined, but growing biological databases, together with tools for identifying these repeats, may lead to the discovery of their specific roles or correlations with particular features. This paper presents a new approach for finding approximate tandem repeats in a given sequence, where the similarity between consecutive repeats is measured using the Hamming distance. It is an enhancement of a method for finding exact tandem repeats in DNA sequences based on the Burrows-Wheeler transform.
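
For reference, a naive Burrows-Wheeler transform, the building block of the method rather than the repeat-finding enhancement itself, can be sketched as follows; real implementations use suffix arrays instead of sorting full rotations.

```python
def bwt(s, sentinel="$"):
    """Burrows-Wheeler transform via sorted rotations (naive, O(n^2 log n))."""
    s += sentinel                                  # unique end-of-string marker
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

print(bwt("ACGTACGTACGA"))   # repeated patterns cluster identical characters
```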

Two-States-Mapping-Based Neural Network Model for Decreasing Prediction Residual Error

The objective of this paper is to design a model of human vital sign prediction that decreases prediction error by using a two-states-mapping-based time series back-propagation (BP) neural network model. Many industries have been applying neural network models, training them in a supervised manner with the error back-propagation algorithm, for time series prediction. However, a residual error remains between the real value and the prediction output. Therefore, we designed a two-states neural network model that compensates for this residual error, which can be used in the prevention of sudden death and metabolic syndrome diseases such as hypertension and obesity. We found that in most simulation cases the two-states-mapping-based time series prediction model outperformed normal BP. In particular, for time series with small sample sizes it was more accurate than the standard MLP model. We expect that this algorithm can be applied to sudden-death prevention and a monitoring AGENT system in a ubiquitous homecare environment.
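
Our reading of the two-states idea can be sketched as a residual-compensation pair of networks: a first BP network predicts the series, a second network is trained on the first network's residuals, and the final output is their sum. The data, network sizes and training settings below are placeholders, not the paper's configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.arange(300, dtype=float)
y = np.sin(t / 10.0) + 0.1 * rng.standard_normal(300)   # stand-in vital sign

# Lagged windows as inputs: predict y[t] from the previous 5 samples.
X = np.stack([y[i:i + 5] for i in range(295)])
target = y[5:]

state1 = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(X, target)
residual = target - state1.predict(X)                   # first-state error

state2 = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(X, residual)
final = state1.predict(X) + state2.predict(X)           # compensated prediction

print(np.mean((target - state1.predict(X)) ** 2),       # one-state MSE
      np.mean((target - final) ** 2))                    # two-states MSE
```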

A Blind SLM Scheme for Reduction of PAPR in OFDM Systems

In this paper we propose a blind algorithm for peak-to-average power ratio (PAPR) reduction in OFDM systems, based on the selected mapping (SLM) algorithm as a distortionless method. The main drawback of the conventional SLM technique is the need to transmit several side information bits for each data block, which results in a loss in data rate. In the proposed method, a special set of carriers in the OFDM frame is reserved and rotated with one of the possible phases, according to the number of phase-sequence blocks in the SLM algorithm. Reserving a limited number of carriers won't affect the PAPR reduction of the OFDM signal. Simulation results show that using an ML criterion at the receiver leads to the same system performance as the conventional SLM algorithm, while there is no need to send any side information to the receiver.
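
For reference, conventional SLM, the baseline the proposed blind scheme builds on, can be sketched as follows: rotate the data block by each candidate phase sequence, take the IFFT, and transmit the candidate with the lowest PAPR. The blind embedding of the phase index into reserved carriers is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
N, U = 64, 8                       # carriers per OFDM block, phase sequences

def papr_db(x):
    """Peak-to-average power ratio of a time-domain block, in dB."""
    return 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))

data = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=N)  # QPSK symbols
phases = np.exp(1j * rng.choice([0, np.pi / 2, np.pi, 3 * np.pi / 2],
                                size=(U, N)))
phases[0] = 1.0                    # include the unrotated block as a candidate

candidates = np.fft.ifft(data * phases, axis=1)   # one OFDM signal per sequence
paprs = [papr_db(c) for c in candidates]
best = int(np.argmin(paprs))
print(f"original {paprs[0]:.2f} dB -> best {paprs[best]:.2f} dB (sequence {best})")
```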

A Fragile Watermarking Scheme for Color Image Authentication

In this paper, a fragile watermarking scheme is proposed for the authentication of specified objects in color images. The color image is first transformed from the RGB to the YST color space, which is suitable for watermarking color media. The T channel corresponds to the chrominance component of the color image and YS ⊥ T; it is therefore selected for embedding the watermark. The T channel is first divided into 2×2 non-overlapping blocks and the two LSBs are set to zero. The object to be authenticated is also divided into 2×2 non-overlapping blocks, and each block's intensity mean is computed, followed by eight-bit encoding. The generated watermark is then embedded into the LSBs of randomly selected 2×2 blocks of the T channel using a 2D torus automorphism. The selection of block size is paramount for exact localization and recovery of the work. The proposed scheme is blind, efficient and secure, with the ability to detect and locate even minor tampering applied to the image, with full recovery of the original work. The quality of the watermarked media is quite high, both subjectively and objectively. The technique is suitable for the class of images with formats such as GIF, TIF or bitmap.
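
The LSB embedding step can be sketched on a stand-in channel as below; the YST transform, the eight-bit block encoding and the 2D torus block permutation are not reproduced, and the per-pixel layout here is a simplification of the paper's 2×2-block scheme.

```python
import numpy as np

rng = np.random.default_rng(2)
T = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)       # stand-in T channel
bits = rng.integers(0, 2, size=T.size * 2, dtype=np.uint8)  # watermark bits

def embed(channel, bits):
    """Clear the two LSBs of every pixel and write two watermark bits each."""
    out = channel & 0b11111100
    flat = out.flatten()
    for i in range(flat.size):
        flat[i] |= (bits[2 * i] << 1) | bits[2 * i + 1]
    return flat.reshape(channel.shape)

def extract(channel):
    """Blind extraction: read the two LSBs back out of every pixel."""
    flat = channel.flatten()
    return np.stack([(flat >> 1) & 1, flat & 1], axis=1).ravel()

marked = embed(T, bits)
assert np.array_equal(extract(marked), bits)   # watermark recovered exactly
```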

Concurrency without Locking in Parallel Hash Structures Used for Data Processing

Various mechanisms providing mutual exclusion and thread synchronization can be used to support parallel processing within a single computer. Instead of using locks, semaphores, barriers or other traditional approaches, in this paper we focus on alternative ways of making better use of modern multithreaded architectures and preparing hash tables for concurrent accesses. Hash structures are used to demonstrate and compare two entirely different approaches (rule-based cooperation and hardware synchronization support) against an efficient parallel implementation using traditional locks. The comparison includes implementation details, performance ranking and scalability issues. We aim at understanding the effects the parallelization schemes have on the execution environment, with special focus on the memory system and memory access characteristics.
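
One way to read the rule-based cooperation idea is sketched below: each key is owned by exactly one worker thread (the rule), so each partition of the hash table is accessed by a single thread and needs no lock; the work queues are the only synchronized objects. This is our illustrative interpretation, not the paper's implementation.

```python
import threading
import queue

N_WORKERS = 4
partitions = [{} for _ in range(N_WORKERS)]       # one private table per worker
inboxes = [queue.Queue() for _ in range(N_WORKERS)]

def owner(key):
    return hash(key) % N_WORKERS                  # the cooperation rule

def worker(wid):
    while (item := inboxes[wid].get()) is not None:
        key, value = item
        partitions[wid][key] = value              # unsynchronized: sole owner

threads = [threading.Thread(target=worker, args=(w,)) for w in range(N_WORKERS)]
for t in threads: t.start()
for k in range(100):
    key = f"key{k}"
    inboxes[owner(key)].put((key, k))             # route to the owning thread
for box in inboxes: box.put(None)                 # shutdown sentinel
for t in threads: t.join()
print(sum(len(p) for p in partitions))            # all 100 entries stored
```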