Quality-Driven Business Process Refactoring

The appropriate description of business processes in standard notations has become one of the most important assets for organizations. Organizations must therefore deal with quality faults in business process models, such as a lack of understandability and modifiability. These quality faults may be exacerbated when business process models are mined by reverse engineering, e.g., from the existing information systems that support those processes. Hence, business process refactoring is often used, which changes the internal structure of business processes while preserving their external behavior. This paper aims to choose the most appropriate set of refactoring operators through quality assessment concerning understandability and modifiability. These quality features are assessed through well-proven measures proposed in the literature. Additionally, a set of measure thresholds is heuristically established for applying the most promising refactoring operators, i.e., those that achieve the highest quality improvement according to the selected measures in each case.

An Efficient and Generic Hybrid Framework for High Dimensional Data Clustering

Clustering in high-dimensional space is a difficult problem that recurs in many fields of science and engineering, e.g., bioinformatics, image processing, pattern recognition, and data mining. In a high-dimensional space, some of the dimensions are likely to be irrelevant, thus hiding the possible clustering. In very high dimensions it is common for all the objects in a dataset to be nearly equidistant from each other, completely masking the clusters; hence, the performance of clustering algorithms degrades. In this paper, we propose an algorithmic framework that combines the reduct concept of rough set theory with the k-means algorithm to remove the irrelevant dimensions of a high-dimensional space and obtain appropriate clusters. Our experiments on test data show that this framework increases the efficiency of the clustering process and the accuracy of the results.
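A minimal sketch of the kind of reduct-plus-k-means hybrid the abstract describes, under stated assumptions: the abstract does not specify how the reduct is computed for unlabelled data, so a provisional k-means labelling is used as the decision attribute, the reduct is found by a QuickReduct-style greedy search over quantile-discretised features, and k-means is re-run on the reduced space. The data, bin counts, and cluster counts are illustrative only.

```python
# Illustrative sketch of reduct-based attribute selection followed by k-means.
# The decision attribute, discretisation, and greedy heuristic are assumptions,
# not the paper's exact procedure.
import numpy as np
from sklearn.cluster import KMeans

def discretize(X, bins=4):
    """Quantile-discretise each column (rough-set reducts operate on categorical data)."""
    edges = [np.quantile(X[:, j], np.linspace(0, 1, bins + 1)[1:-1]) for j in range(X.shape[1])]
    return np.stack([np.digitize(X[:, j], edges[j]) for j in range(X.shape[1])], axis=1)

def dependency_degree(X_disc, subset, decision):
    """Fraction of objects in the positive region: how well `subset` determines `decision`."""
    blocks = {}
    for i, key in enumerate(map(tuple, X_disc[:, subset])):
        blocks.setdefault(key, []).append(i)
    consistent = sum(len(idx) for idx in blocks.values()
                     if len({decision[i] for i in idx}) == 1)
    return consistent / len(decision)

def greedy_reduct(X_disc, decision):
    """QuickReduct-style greedy search: add the attribute with the largest dependency gain."""
    remaining, reduct, best = list(range(X_disc.shape[1])), [], 0.0
    while remaining:
        gain, attr = max((dependency_degree(X_disc, reduct + [a], decision), a)
                         for a in remaining)
        if gain <= best:
            break
        best, reduct = gain, reduct + [attr]
        remaining.remove(attr)
    return reduct or list(range(X_disc.shape[1]))

# toy data: 2 informative dimensions plus 8 pure-noise dimensions
rng = np.random.default_rng(0)
labels = rng.integers(0, 3, 300)
centers = rng.normal(scale=5.0, size=(3, 2))
X = np.hstack([centers[labels] + rng.normal(size=(300, 2)), rng.normal(size=(300, 8))])

coarse = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)  # provisional decision
reduct = greedy_reduct(discretize(X), coarse)                            # drop irrelevant dims
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X[:, reduct])
```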

Motivated Support Vector Regression using Structural Prior Knowledge

It is known that incorporating prior knowledge into support vector regression (SVR) can help to improve approximation performance. Most research has been concerned with incorporating knowledge in the form of numerical relationships. Little work, however, has been done to incorporate prior knowledge about the structural relationships among the variables (referred to as Structural Prior Knowledge, SPK). This paper explores the incorporation of SPK in SVR by constructing an appropriate admissible support vector kernel (SV kernel) based on the properties of the reproducing kernel (R.K.). Three levels of SPK, together with their sub-levels, are studied: Hierarchical SPK (HSPK); Interactional SPK (ISPK), consisting of independence, global interaction, and local interaction; and Functional SPK (FSPK), composed of exterior-FSPK and interior-FSPK. A convenient tool for describing the SPK, namely the Description Matrix of SPK, is introduced. Subsequently, a new SVR, namely Motivated Support Vector Regression (MSVR), whose structure is motivated in part by SPK, is proposed. Synthetic examples show that it is possible to incorporate a wide variety of SPK and that doing so helps to improve the approximation performance in complex cases. The benefits of MSVR are finally shown on a real-life military application, an air-to-ground battle simulation, which shows the great potential of MSVR for complex military applications.
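The abstract does not give MSVR's exact kernel construction, but one standard way to encode structural knowledge, consistent with the closure properties of reproducing kernels it invokes, is to assemble the SV kernel from sub-kernels defined on variable groups; the grouping below stands in for a description matrix and is purely hypothetical.

```python
# Sketch only: an admissible structured kernel built from closure properties of
# reproducing kernels (sums of positive-definite kernels are positive definite).
# The variable grouping is a hypothetical stand-in for a description matrix.
import numpy as np
from sklearn.svm import SVR

def rbf(Xa, Xb, cols, gamma=1.0):
    """RBF sub-kernel restricted to the variable subset `cols`."""
    A, B = Xa[:, cols], Xb[:, cols]
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def structured_kernel(Xa, Xb, groups):
    """Additive kernel: one sub-kernel per interacting group of variables."""
    return sum(rbf(Xa, Xb, g) for g in groups)

# assumed structure: x0 and x1 interact locally, x2 contributes independently
groups = [[0, 1], [2]]

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (80, 3))
y = np.sin(3 * X[:, 0] * X[:, 1]) + 0.5 * X[:, 2]

model = SVR(kernel="precomputed", C=10.0).fit(structured_kernel(X, X, groups), y)
X_new = rng.uniform(-1, 1, (5, 3))
y_pred = model.predict(structured_kernel(X_new, X, groups))
```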

An Agent Oriented Architecture to Supply Integration in ERP Systems

One of the most important aspects expected from ERP systems is the integration of the various operations existing in the administrative, financial, commercial, human resources, and production departments of the consumer organization. It is also often necessary to integrate the new ERP system with the organization's legacy systems when implementing the ERP package. Without relying on an appropriate software architecture to realize the required integration, ERP implementation processes become error-prone and time-consuming; in some cases, the ERP implementation may even encounter serious risks. In this paper, we propose a new architecture that is based on the agent-oriented vision and supplies the integration expected from ERP systems using several independent but cooperating agents. Besides integration, which is the main issue of this paper, the presented architecture also addresses some aspects of the intelligence and learning capabilities found in ERP systems.

Metadata Update Mechanism Improvements in Data Grid

Grid environments aggregate geographically distributed resources. Grids fall into three types: computational, data, and storage grids. This paper presents research on data grids. A data grid provides and secures access to data drawn from many heterogeneous sources; users need not be concerned with where the data is located, provided they can access it. Metadata is used to access data in a data grid. Currently, application metadata catalogues and the SRB middleware package are used in data grids to manage metadata. In this paper, updating, streamlining, and searching are supported simultaneously and rapidly by classifying the table that preserves the metadata and splitting each table into several tables. The most appropriate partitioning is then determined for the specific application. As a result of this technique, some requests can be executed concurrently and in a pipelined fashion.

A Mathematical Representation for Mechanical Model Assessment: Numerical Model Qualification Method

This article illustrates a model selection management approach for virtual prototypes in interactive simulations. In these numerical simulations, the virtual prototype and its environment are modelled as a multi-agent system, where every entity (prototype, human, etc.) is modelled as an agent. In particular, virtual prototyping agents that provide mathematical models of mechanical behaviour in the form of computational methods are considered. This work argues that selecting an appropriate model in a changing environment, supported by the models' characteristics, can be managed by determining a priori specific exploitation and performance measures of the virtual prototype models. As different models exist to represent a single phenomenon, it is not always possible to select the best one under all possible circumstances of the environment. Instead, the most appropriate one should be selected according to the use case. The proposed approach consists in identifying relevant metrics or indicators for each group of models (e.g. entity models, the global model), formulating their qualification, analysing the performance, and applying the qualification criteria. A model can then be selected based on the performance prediction obtained from its qualification. The authors hope that this approach will not only inform engineers and researchers about another approach for selecting virtual prototype models, but also assist virtual prototype engineers in systematic or automatic model selection.

Performance Analysis of Energy-Efficient Home Femto Base Stations

The energy consumption of home femto base stations (BSs) can be reduced by turning off the Wi-Fi radio interface when there is no mobile station (MS) under the coverage of the BS, or when MSs do not transmit or receive data packets for a long time, especially late at night. In such energy-efficient home femto BSs, if an MS has a data packet to transmit while the Wi-Fi radio interface is in the off state, the MS wakes up the Wi-Fi radio interface of the home femto BS by using an additional low-power radio interface. In this paper, we analyze the performance of energy-efficient home femto BSs in terms of energy consumption and cumulative average delay, and show the effect of various parameters on both. The results reveal a tradeoff between energy consumption and cumulative average delay; an appropriate operating point is therefore needed to balance this tradeoff.

A Design of Electronically Tunable Voltage-mode Universal Filter with High Input Impedance

This article presents a voltage-mode universal biquadratic filter that simultaneously performs three standard functions, low-pass, high-pass, and band-pass, employing the differential difference current conveyor (DDCC) and the current controlled current conveyor (CCCII) as active elements. The features of the circuit are that the quality factor and pole frequency can be tuned independently via the input bias currents, and that the circuit is very simple, consisting of 1 DDCC, 2 CCCIIs, 2 electronic resistors, and 2 grounded capacitors. Since no component matching conditions are required, the proposed circuit is very suitable for development into an integrated circuit. PSPICE simulation results are presented and agree well with the theoretical predictions.

Artificial Neural Networks Modeling in Water Resources Engineering: Infrastructure and Applications

The use of artificial neural network (ANN) modeling for predicting and forecasting variables in water resources engineering is increasing rapidly. Infrastructural aspects of ANN applications, in terms of the selection of inputs, network architecture, training algorithms, and selection of training parameters in the different types of neural networks used in water resources engineering, have been reported. ANN modeling of water resources engineering variables (river sediment and discharge) published in high-impact journals from 2002 to 2011 has been examined and is presented in this review. ANN is a powerful technique for developing relationships between input and output variables and is able to capture complex behavior among water resources variables such as river sediment and discharge. It can produce robust predictions for many water resources engineering problems by learning appropriately from a set of examples. It is important to gain a good understanding of the input and output variables from a statistical analysis of the data before network modeling, which can facilitate the design of an efficient network. A properly trained ANN model is able to capture the physical relationships between the variables and may generate more effective results than conventional prediction techniques.

Implementing Knowledge Transfer Solution through Web-based Help Desk System

Knowledge management (KM) is the process of taking whatever steps are needed to get the most out of available knowledge resources. KM involves several steps: capturing knowledge, discovering new knowledge, sharing knowledge, and applying knowledge in the decision-making process. In applying knowledge, it is not necessary for the individual who uses it to fully comprehend it, as long as the available knowledge is used to guide decisions and actions. When an expert is called and provides a step-by-step procedure for solving a problem, the expert is transferring knowledge, or giving direction, to the caller, and the caller is 'applying' the knowledge by following the instructions given by the expert. An appropriate mechanism is needed to ensure effective knowledge transfer, which in this case takes place by telephone or email. The problem with email and telephone is that the knowledge is not fully circulated and disseminated to all users. In this paper, drawing on experience with a local university Help Desk, we propose the use of Information Technology (IT) to effectively support knowledge transfer in the organization. The issues covered include the existing knowledge, related work, the methodology used in defining the knowledge management requirements, and an overview of the prototype.

The Use of Information for Inventory Decision in the Healthcare Industry

In this study, we explore the use of information for inventory decisions in a healthcare organization (HO). We consider the scenario in which the HO can make use of information collected from correlated products to enhance its inventory planning. Motivated by our real-world observations that HOs adopt RFID and bar-coding systems for information collection, we examine the effectiveness of these systems for inventory planning with Bayesian information updating. We derive the optimal ordering decision and study the issue of Pareto improvement in the supply chain. Our analysis demonstrates that the RFID system will outperform the bar-coding system when the RFID installation cost and tag cost fall to a level comparable with that of the bar-coding system. We also show how an appropriately set wholesale pricing contract can achieve Pareto improvement in the HO supply chain.
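The abstract does not spell out the demand model, so the sketch below uses a standard newsvendor setting with normal-normal conjugate updating merely to illustrate the mechanism: a cleaner signal (e.g., RFID-grade data versus noisier bar-code data) tightens the posterior demand belief and shifts the critical-fractile order quantity. All parameter values are illustrative.

```python
# Illustrative Bayesian-updating newsvendor sketch; not the paper's exact model.
import numpy as np
from scipy.stats import norm

def posterior_demand(mu0, tau0, signal, sigma_s):
    """Normal-normal conjugate update of the demand-rate belief from one observed
    signal (e.g., sales of a correlated product captured by RFID or bar-coding)."""
    prec0, prec_s = 1 / tau0**2, 1 / sigma_s**2
    tau1 = np.sqrt(1 / (prec0 + prec_s))
    mu1 = (prec0 * mu0 + prec_s * signal) / (prec0 + prec_s)
    return mu1, tau1

def newsvendor_order(mu1, tau1, sigma_d, underage, overage):
    """Critical-fractile order quantity under the posterior predictive demand."""
    critical_ratio = underage / (underage + overage)
    pred_std = np.sqrt(sigma_d**2 + tau1**2)   # demand noise + remaining belief uncertainty
    return mu1 + norm.ppf(critical_ratio) * pred_std

# a low-noise (RFID-like) signal versus a noisier (bar-code-like) signal
mu1, tau1 = posterior_demand(mu0=100, tau0=20, signal=120, sigma_s=5)
q_clean = newsvendor_order(mu1, tau1, sigma_d=15, underage=8, overage=2)
mu1n, tau1n = posterior_demand(mu0=100, tau0=20, signal=120, sigma_s=25)
q_noisy = newsvendor_order(mu1n, tau1n, sigma_d=15, underage=8, overage=2)
```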

Hiding Data in Images Using PCP

In recent years, everything is trending toward digitalization, and with the rapid development of Internet technologies, digital media need to be transmitted conveniently over the network. Attacks on, misuse of, or unauthorized access to information is of great concern today, which makes the protection of documents transmitted through digital media a priority. This urges us to devise new data hiding techniques to protect and secure data of vital significance. In this respect, steganography often comes to the fore as a tool for hiding information. Steganography is a process that involves hiding a message in an appropriate carrier, such as an image or audio file. The term is of Greek origin and means "covered or hidden writing". The goal of steganography is covert communication: the carrier can be sent to a receiver without anyone except the authenticated receiver knowing of the existence of the information. A considerable amount of work has been carried out by different researchers on steganography. In this work, the authors propose a novel steganographic method for hiding information within the spatial domain of a gray-scale image. The proposed approach works by selecting the embedding pixels using a mathematical function, finding the 8-neighborhood of each selected pixel, and mapping each bit of the secret message onto a neighbor pixel coordinate position in a specified manner. Before embedding, a check is performed to determine whether the selected pixel or its neighbors lie at the boundary of the image. This solution is independent of the nature of the data to be hidden and produces a stego image with minimal degradation.
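The abstract names the embedding steps but not the exact pixel-selection function or bit-mapping rule, so the sketch below is only a hedged illustration of that outline: a hypothetical block-centre selection function, LSB substitution in the 8-neighborhood, and the boundary check mentioned above.

```python
# Illustrative sketch of the embedding outline above; the selection function
# (block centres) and LSB mapping are assumptions, not the paper's PCP rule.
import numpy as np

NEIGHBORS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def selected_pixels(h, w, step=3, offset=1):
    """Hypothetical selection function: centres of disjoint step-by-step blocks."""
    for r in range(offset, h - 1, step):
        for c in range(offset, w - 1, step):
            yield r, c

def embed(cover, bits, step=3, offset=1):
    """Write one message bit into the LSB of each neighbor of every selected pixel."""
    stego, (h, w) = cover.copy(), cover.shape
    it = iter(bits)
    for r, c in selected_pixels(h, w, step, offset):
        if r in (0, h - 1) or c in (0, w - 1):   # boundary check from the abstract
            continue
        for dr, dc in NEIGHBORS:
            try:
                bit = next(it)
            except StopIteration:
                return stego
            stego[r + dr, c + dc] = (stego[r + dr, c + dc] & 0xFE) | bit
    return stego                                  # message longer than capacity: truncated

def extract(stego, n_bits, step=3, offset=1):
    """Revisit the same pixels in the same order and read the LSBs back."""
    h, w = stego.shape
    bits = []
    for r, c in selected_pixels(h, w, step, offset):
        if r in (0, h - 1) or c in (0, w - 1):
            continue
        for dr, dc in NEIGHBORS:
            if len(bits) == n_bits:
                return bits
            bits.append(int(stego[r + dr, c + dc]) & 1)
    return bits

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in gray-scale image
message = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed(cover, message)
assert extract(stego, len(message)) == message
```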

Efficient System for Speech Recognition using General Regression Neural Network

In this paper we present an efficient system for speaker-independent speech recognition based on a neural network approach. The proposed architecture comprises two phases: a preprocessing phase, which consists of segmental normalization and feature extraction, and a classification phase, which uses a neural network based on nonparametric density estimation, namely the general regression neural network (GRNN). The performance of the proposed model is compared to similar recognition systems based on the Multilayer Perceptron (MLP), the Recurrent Neural Network (RNN), and the well-known Discrete Hidden Markov Model (HMM-VQ), which we have also implemented. Experimental results obtained with Arabic digits have shown that the use of nonparametric density estimation with an appropriate smoothing factor (spread) improves the generalization power of the neural network. The word error rate (WER) is reduced significantly over the baseline HMM method. GRNN computation is a successful alternative to the other neural networks and the DHMM.
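A minimal sketch of the GRNN estimator itself (the one-pass kernel-regression form), showing where the smoothing factor sigma enters; the toy features and one-hot targets below are placeholders for the real speech features and word classes.

```python
# Minimal GRNN sketch; the data here are illustrative placeholders only.
import numpy as np

class GRNN:
    """General regression neural network: stores training patterns and predicts a
    kernel-weighted average of their targets; sigma is the smoothing factor (spread)."""
    def __init__(self, sigma=0.5):
        self.sigma = sigma

    def fit(self, X, Y):
        self.X, self.Y = np.asarray(X, float), np.asarray(Y, float)  # one pattern unit per sample
        return self

    def predict(self, Xq):
        Xq = np.asarray(Xq, float)
        d2 = ((Xq[:, None, :] - self.X[None, :, :]) ** 2).sum(axis=-1)   # squared distances
        w = np.exp(-d2 / (2.0 * self.sigma ** 2))                         # pattern activations
        return (w @ self.Y) / (w.sum(axis=1, keepdims=True) + 1e-12)      # normalised average

# toy usage: two word classes encoded as one-hot targets
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)), rng.normal(2.0, 0.3, (20, 2))])
Y = np.vstack([np.tile([1.0, 0.0], (20, 1)), np.tile([0.0, 1.0], (20, 1))])
model = GRNN(sigma=0.4).fit(X, Y)
predicted = model.predict([[0.1, -0.2], [1.9, 2.1]]).argmax(axis=1)   # -> array([0, 1])
```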

Hybrid Modeling and Optimal Control of a Two-Tank System as a Switched System

In the past decade, because of the wide range of applications of hybrid systems, many researchers have considered the modeling and control of these systems. Since switched systems constitute an important class of hybrid systems, this paper describes a method for the optimal control of linear switched systems. The method is applied to the two-tank system, which is a well-suited benchmark for analyzing different modeling and control techniques for hybrid systems. Simulation results show that, with this method, the control goals and the problem constraints can be satisfied by an appropriate choice of cost function.
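The abstract does not detail the model or cost function, so the following is only a toy version of the benchmark under stated assumptions: a single pump is switched from one linearised tank to the other at a switching instant chosen by minimising a quadratic level-tracking cost over a grid. All parameter values are illustrative.

```python
# Toy switched two-tank sketch: pick a switching instant by grid search over a
# quadratic tracking cost; dynamics and parameters are illustrative assumptions.
import numpy as np

def simulate(t_switch, T=20.0, dt=0.01, u=1.0, a1=0.3, a2=0.3, href=(0.6, 0.4)):
    """Euler simulation of two tanks whose single pump feeds tank 1 before
    t_switch and tank 2 afterwards; returns the accumulated quadratic cost."""
    h, cost = np.array([0.0, 0.0]), 0.0
    for k in range(int(T / dt)):
        t = k * dt
        inflow = np.array([u, 0.0]) if t < t_switch else np.array([0.0, u])
        dh = inflow - np.array([a1, a2]) * h          # linearised outflow per tank
        h = np.clip(h + dt * dh, 0.0, None)
        cost += dt * ((h[0] - href[0]) ** 2 + (h[1] - href[1]) ** 2)
    return cost

# grid search over the single switching instant
candidates = np.linspace(0.0, 20.0, 201)
best_t_switch = min(candidates, key=simulate)
```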