Business Rules for Data Warehouse

Business rules and data warehouses are concepts and technologies that impact a wide variety of organizational tasks. In general, each area has evolved independently, impacting application development and decision-making. Generating knowledge from a data warehouse is a complex process. This paper outlines an approach to ease the import of information and knowledge from a data warehouse star schema through an inference class of business rules. The paper uses the Oracle database to illustrate how the concepts work. The star schema structure and the business rules are stored within a relational database. The approach is explained through a prototype built with Oracle's PL/SQL Server Pages.
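
As a rough illustration of the idea, the following sketch stores a toy star schema and one inference-class business rule in a relational database and fires the rule against the fact table. Python with sqlite3 stands in for Oracle here, and all table, column, and rule names are hypothetical.

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (product_id INTEGER, qty INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'laptop'), (2, 'phone');
INSERT INTO fact_sales  VALUES (1, 3, 3000.0), (2, 10, 5000.0), (1, 1, 900.0);
""")

# An inference-class rule stored as data: IF the aggregate condition holds
# over the star schema THEN a piece of knowledge is derived.
rules = [("high_value_category",
          "SELECT d.category FROM fact_sales f JOIN dim_product d "
          "USING (product_id) GROUP BY d.category HAVING SUM(f.amount) > 2500")]

for name, condition_sql in rules:
    for (category,) in con.execute(condition_sql):
        print(f"rule {name} fired for category: {category}")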

Visualization of Code Clone Detection Results and the Implementation with Structured Data

This paper describes a code clone visualization method, called FC graph, and its implementation issues. Code clone detection tools usually present their results in a textual representation. When the results are large, software maintainers have difficulty understanding them. One approach to overcoming this situation is visualization of code clone detection results. A scatter plot is a popular approach to such visualization. However, it represents only one-to-one correspondence, and it is difficult to find correspondence of code clones across multiple files. FC graph represents correspondence among files, code clones, and packages in Java. All nodes in FC graph are positioned using a force-directed graph layout, which is dynamically calculated to adjust the distances of nodes until they stabilize. We applied FC graph to some open source programs and visualized the results. In the author's experience, FC graph is helpful for grasping correspondence of code clones across multiple files as well as code clones within a file.
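
A minimal sketch of the force-directed placement that FC graph relies on, using the spring (Fruchterman-Reingold) layout from networkx; the node names are hypothetical stand-ins for the file, clone, and package nodes of a real FC graph.

import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("pkg:util", "file:A.java"), ("pkg:util", "file:B.java"),
    ("file:A.java", "clone:1"),  ("file:B.java", "clone:1"),  # clone spans two files
    ("file:B.java", "clone:2"),
])

# Iteratively adjusts node distances until the layout stabilizes.
pos = nx.spring_layout(G, iterations=100, seed=42)
for node, (x, y) in pos.items():
    print(f"{node}: ({x:+.2f}, {y:+.2f})")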

Analytical Solution for Free Vibration of Rectangular Kirchhoff Plate from Wave Approach

In this paper, an analytical approach for free vibration analysis of rectangular Kirchhoff plates simply supported on all four edges is presented. The method is based on the wave approach. From the wave standpoint, vibrations propagate, reflect, and transmit in a structure. First, the propagation and reflection matrices for a plate with simply supported boundary conditions are derived. Then, these matrices are combined to provide a concise and systematic approach to free vibration analysis of a simply supported rectangular Kirchhoff plate. Subsequently, the eigenvalue problem for free vibration of plates is formulated and the equation for the plate natural frequencies is constructed. Finally, the effectiveness of the approach is shown by comparing the results with the existing classical solution.
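
For reference, the classical Navier solution for the natural frequencies of a fully simply supported rectangular Kirchhoff plate, against which wave-approach results are typically compared (a, b are the plate dimensions, h the thickness, E Young's modulus, \nu Poisson's ratio, and \rho the density):

\[
\omega_{mn} = \pi^{2}\left[\left(\frac{m}{a}\right)^{2} + \left(\frac{n}{b}\right)^{2}\right]\sqrt{\frac{D}{\rho h}},
\qquad D = \frac{E h^{3}}{12\,(1-\nu^{2})}, \qquad m, n = 1, 2, \dots
\]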

Fuzzy Control of the Air Conditioning System at Different Operating Pressures

The present work demonstrates the design and simulation of a fuzzy controller for an air conditioning system at different pressures. A first-order Sugeno fuzzy inference system is utilized to model the system and create the controller. In addition, the heat transfer rate and the water mass flow rate injected into or withdrawn from the air conditioning system are estimated by the fuzzy IF-THEN rules. The approach starts by generating the input/output data. Then, the subtractive clustering algorithm along with least squares estimation (LSE) generates the fuzzy rules that describe the relationship between the input/output data. The fuzzy rules are tuned by an Adaptive Neuro-Fuzzy Inference System (ANFIS). The results show that as the pressure increases, the water flow rate and heat transfer rate decrease within the lower range of inlet dry-bulb temperatures, whereas they increase within the higher range of inlet dry-bulb temperatures. The inflection in the pressure effect trend occurs at lower temperatures as the inlet air humidity increases.
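
The following is a minimal first-order Sugeno inference sketch in Python/NumPy. The membership centers and consequent coefficients are hypothetical placeholders, and the paper's subtractive clustering, LSE, and ANFIS tuning stages are not reproduced.

import numpy as np

def gauss(x, c, s):
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def sugeno(t_in, p_in):                      # inlet dry-bulb temperature, pressure
    # Rule premises: Gaussian membership grades over (temperature, pressure).
    w = np.array([gauss(t_in, 20, 5) * gauss(p_in, 90, 20),
                  gauss(t_in, 35, 5) * gauss(p_in, 110, 20)])
    # First-order consequents: linear functions of the inputs.
    z = np.array([0.04 * t_in - 0.002 * p_in + 0.5,
                  0.07 * t_in - 0.004 * p_in + 0.2])
    return np.sum(w * z) / np.sum(w)         # weighted-average defuzzification

print(sugeno(25.0, 101.3))                   # e.g. an estimated heat transfer rate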

A Monte Carlo Method for Data Stream Analysis

Data stream analysis is the process of computing various summaries and derived values from large amounts of data which are continuously generated at a rapid rate. The nature of a stream does not allow revisiting each data element. Furthermore, data processing must be fast to produce timely analysis results. These requirements impose constraints on the design of the algorithms, which must balance correctness against timely responses. Several techniques have been proposed over the past few years to address these challenges. These techniques can be categorized as either data-oriented or task-oriented. The data-oriented approach analyzes a subset of data or a smaller transformed representation, whereas the task-oriented scheme solves the problem directly via approximation techniques. We propose a hybrid approach to tackle the data stream analysis problem: the data stream is both statistically transformed to a smaller size and its characteristics computationally approximated. We adopt a Monte Carlo method in the approximation step. The data reduction is performed horizontally and vertically through our EMR sampling method. The proposed method is analyzed by a series of experiments. We apply our algorithm to clustering and classification tasks to evaluate the utility of our approach.
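
As an illustration of the horizontal (row-wise) reduction step only, the sketch below applies classic reservoir sampling to a stream and forms a Monte Carlo estimate from the retained sample; it is not the paper's EMR method.

import random

def reservoir(stream, k, rng=random.Random(0)):
    sample = []
    for i, x in enumerate(stream):
        if i < k:
            sample.append(x)
        else:
            j = rng.randrange(i + 1)         # keep element with probability k/(i+1)
            if j < k:
                sample[j] = x
    return sample

stream = (x * x % 97 for x in range(1_000_000))   # stand-in for a live stream
s = reservoir(stream, k=1000)
print(sum(s) / len(s))                       # Monte Carlo estimate of the stream mean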

Faults Forecasting System

This paper presents a Faults Forecasting System (FFS) that utilizes statistical forecasting techniques to analyze process variable data in order to forecast fault occurrences. FFS proposes a new idea in fault detection. Current fault detection techniques are based on analyzing the current status of the system variables to check whether that status is faulty or not. FFS instead uses forecasting techniques to predict the timing of future faults before they happen. The proposed model applies a subset modeling strategy and a Bayesian approach in order to decrease the dimensionality of the process variables and improve fault forecasting accuracy. A practical experiment was designed and implemented at Okayama University, Japan, and the comparison shows that our proposed model achieves high forecasting accuracy and predicts faults ahead of time (BEFORE-TIME).
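
A toy illustration of forecasting a fault before it happens: fit an AR(1) model with drift to a trending process variable and report the first future step, if any, at which the forecast crosses a fault threshold. The data, threshold, and model are hypothetical; the paper's subset modeling and Bayesian machinery are not reproduced.

import numpy as np

rng = np.random.default_rng(1)
x = 0.05 * np.arange(200) + 0.2 * rng.standard_normal(200)   # trending variable

# Least-squares AR(1) with drift: x[t+1] ~ a * x[t] + b
a, b = np.polyfit(x[:-1], x[1:], 1)

threshold, level = x[-1] + 1.0, x[-1]        # hypothetical fault threshold
for step in range(1, 101):
    level = a * level + b                    # iterate the forecast forward
    if level > threshold:
        print(f"fault forecast {step} steps ahead")
        break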

A Multiresolution Approach for Noisy Texture Classification based on the Co-occurrence Matrix and First Order Statistics

The wavelet transform provides several important characteristics which can be used in texture analysis and classification. In this work, an efficient texture classification method, which combines concepts from wavelets and co-occurrence matrices, is presented. A Euclidean distance classifier is used to evaluate the various classification methods. A comparative study is essential to determine the best method. Building on this idea, we developed a novel feature set for texture classification and demonstrate its effectiveness.
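
A sketch of such a combined feature set, assuming PyWavelets and scikit-image are available: wavelet sub-band statistics are concatenated with co-occurrence (GLCM) properties and compared with a Euclidean distance classifier. The particular statistics kept here are an assumption, not the paper's exact feature list.

import numpy as np
import pywt
from skimage.feature import graycomatrix, graycoprops

def features(img):                           # img: 2-D uint8 texture patch
    cA, (cH, cV, cD) = pywt.dwt2(img.astype(float), "db2")
    wavelet_stats = [float(np.mean(np.abs(c))) for c in (cA, cH, cV, cD)]
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    cooc = [float(graycoprops(glcm, p)[0, 0]) for p in ("contrast", "energy")]
    return np.array(wavelet_stats + cooc)

def classify(patch, prototypes):             # prototypes: {label: feature vector}
    f = features(patch)
    return min(prototypes, key=lambda lbl: np.linalg.norm(f - prototypes[lbl]))

rng = np.random.default_rng(0)
patch = (rng.random((32, 32)) * 255).astype(np.uint8)
print(classify(patch, {"noise": features(patch)}))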

Neuro-Fuzzy Network Based On Extended Kalman Filtering for Financial Time Series

A neural network's performance can be measured by its efficiency and accuracy. The major disadvantages of the neural network approach are that the generalization capability of neural networks is often significantly low, and it may take a very long time to tune the weights in the net to generate an accurate model for highly complex and nonlinear systems. This paper presents a novel neuro-fuzzy architecture based on the Extended Kalman filter. To test the performance and applicability of the proposed neuro-fuzzy model, a simulation study of a complex nonlinear dynamic system is carried out. The proposed method can be applied to on-line incremental adaptive learning for the prediction of financial time series. A benchmark case study is used to demonstrate that the proposed model is a superior neuro-fuzzy modeling technique.
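
A minimal sketch of the Extended-Kalman-filter style of parameter update used in such training schemes: the model parameters act as the EKF state, and the Jacobian of the output with respect to the parameters plays the role of the measurement matrix. The one-neuron "network" and the data are purely illustrative.

import numpy as np

def predict(theta, x):                       # toy model: y = tanh(w*x + b)
    w, b = theta
    return np.tanh(w * x + b)

def jacobian(theta, x):                      # dy/dtheta for the toy model
    w, b = theta
    s = 1.0 - np.tanh(w * x + b) ** 2
    return np.array([s * x, s])

theta, P = np.zeros(2), np.eye(2)
R, Q = 0.01, 1e-6 * np.eye(2)                # measurement / process noise
for x, y in [(0.1, 0.2), (0.5, 0.8), (0.9, 1.0)] * 50:   # streamed samples
    H = jacobian(theta, x)
    S = H @ P @ H + R                        # innovation variance (scalar)
    K = P @ H / S                            # Kalman gain
    theta = theta + K * (y - predict(theta, x))
    P = P - np.outer(K, H @ P) + Q           # covariance update
print(theta)                                 # fitted weights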

Motion Recognition Based On Fuzzy WP Feature Extraction Approach

This paper is concerned with motion recognition based on a fuzzy WP (Wavelet Packet) feature extraction approach applied to Vicon physical data sets. For this purpose, we use an efficient fuzzy mutual-information-based WP transform for feature extraction. This method estimates the required mutual information using a novel approach based on a fuzzy membership function. The physical action data set includes 10 normal and 10 aggressive physical actions that measure human activity. The data have been collected from 10 subjects using the Vicon 3D tracker. The experiments cover running, sitting, and walking as the physical activity motions among the various activities. The experimental results revealed that the presented feature extraction approach shows good recognition performance.
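
A sketch of plain wavelet packet (WP) energy feature extraction with PyWavelets; the paper's fuzzy mutual-information-based node selection is not reproduced, and the synthetic signal is a stand-in for a Vicon channel.

import numpy as np
import pywt

def wp_energy_features(sig, level=3, wavelet="db4"):
    wp = pywt.WaveletPacket(data=sig, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    return np.array([np.sum(np.square(n.data)) for n in nodes])

t = np.linspace(0, 1, 1024)
sig = np.sin(2 * np.pi * 12 * t) + 0.1 * np.random.default_rng(0).standard_normal(t.size)
print(wp_energy_features(sig))               # 2**level energy features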

Hardware Approach to Solving Password Exposure Problem through Keyboard Sniff

This paper introduces a hardware solution to the password exposure problem caused by direct accesses to the keyboard hardware interfaces, through which a possible attacker is able to grab a user's password even where existing countermeasures are deployed. Several studies have proposed reasonable software-based solutions to the problem over the years. However, recently introduced hardware vulnerabilities have neutralized the software approaches, and no effective software solution to the vulnerability has yet been proposed. The hardware approach in this paper is expected to be the only solution to the vulnerability.

Proposition for a New Approach of Version Control System Based On ECA Active Rules

We aim to provide a solution for version control of documents in web services; to that end, we propose a new approach intended specifically for XML documents. The new approach is applied in a centralized repository, which coexists with other repositories in a decentralized system. To achieve the activities of this approach in a standard model, we use ECA active rules. We also show how Event-Condition-Action rules (ECA rules) have been incorporated as a mechanism for the version control of documents. The motivation for integrating ECA rules is that they provide clear declarative semantics and induce an immediate operational realization in the system without the need for human intervention.
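
A minimal sketch of an ECA rule driving version control of an XML document in a repository; the class, event, and rule structure below are hypothetical rather than the paper's schema.

import hashlib

class Repository:
    def __init__(self):
        self.versions, self.rules = {}, []

    def on(self, event, condition, action):          # register an ECA rule
        self.rules.append((event, condition, action))

    def emit(self, event, doc_id, xml):
        for ev, cond, act in self.rules:
            if ev == event and cond(self, doc_id, xml):
                act(self, doc_id, xml)

repo = Repository()
repo.on("update",
        # Condition: content actually changed since the last stored version.
        lambda r, d, x: not r.versions.get(d) or r.versions[d][-1][1] != x,
        # Action: append a new version automatically, no human intervention.
        lambda r, d, x: r.versions.setdefault(d, []).append(
            (hashlib.sha1(x.encode()).hexdigest()[:8], x)))

repo.emit("update", "doc.xml", "<a>v1</a>")
repo.emit("update", "doc.xml", "<a>v1</a>")          # unchanged: no new version
repo.emit("update", "doc.xml", "<a>v2</a>")
print([h for h, _ in repo.versions["doc.xml"]])      # two version hashes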

Stress Ratio and Notch Effect on Fatigue Crack Initiation and Propagation in 2024 Al-alloy

This study reports an empirical investigation of fatigue crack initiation and propagation in 2024 T351 aluminium alloy under constant amplitude loading. In the initiation stage, the local strain approach at the notch was used, and in the stable propagation stage the NASGRO model was applied. In this investigation, a flat plate with a double through-crack at a hole is used. Based on experimental results (AFGROW database), the effect of the stress ratio R on fatigue initiation life (FIL) and fatigue crack growth rate (FCGR) is highlighted. Increasing the hole dimension, which characterizes the notch effect, decreases the fatigue life.
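
For reference, the NASGRO crack-growth relation applied in the stable propagation stage, where f is the crack-opening function, R the stress ratio, and C, n, p, q empirical material constants:

\[
\frac{da}{dN} = C\left[\left(\frac{1-f}{1-R}\right)\Delta K\right]^{n}
\frac{\left(1 - \dfrac{\Delta K_{th}}{\Delta K}\right)^{p}}
     {\left(1 - \dfrac{K_{\max}}{K_{crit}}\right)^{q}}
\]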

Stochastic Modeling and Combined Spatial Pattern Analysis of Epidemic Spreading

We present an analysis of spatial patterns of generic disease spread simulated by a stochastic long-range correlation SIR model, where individuals can be infected at long distance following a power law distribution. We integrated various tools, namely perimeter, circularity, fractal dimension, and aggregation index, to characterize and investigate spatial pattern formations. Our primary goal was to understand, for a given model of interest, which tool has an advantage over the others and to what extent. We found that perimeter and circularity give information only in the case of strong correlation, while the fractal dimension and aggregation index exhibit the growth rule of pattern formation, depending on the degree of the correlation exponent (β). The aggregation index is used as an alternative method to describe the degree of the pathogenic ratio (α). This study may provide a useful approach to characterize and analyze the pattern formation of epidemic spreading.
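
A compact sketch of one step of such a model on a lattice, where an infected site attempts to infect a target at distance r drawn from a distribution proportional to r**(-beta); the lattice size and parameters are illustrative, not the paper's settings.

import numpy as np

rng = np.random.default_rng(0)
L, beta, attempts = 128, 2.0, 5          # lattice size, correlation exponent
state = np.zeros((L, L), np.int8)        # 0 = S, 1 = I, 2 = R
state[L // 2, L // 2] = 1

def step(state):
    for i, j in zip(*np.where(state == 1)):
        for _ in range(attempts):        # each infective makes a few attempts
            r = rng.pareto(beta - 1) + 1          # jump length ~ r**(-beta)
            phi = rng.uniform(0, 2 * np.pi)
            ti = int(i + r * np.cos(phi)) % L
            tj = int(j + r * np.sin(phi)) % L
            if state[ti, tj] == 0:
                state[ti, tj] = 1        # long-range infection
        state[i, j] = 2                  # recovery
    return state

for _ in range(20):
    state = step(state)
print((state == 2).mean())               # fraction of sites ever infected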

Experimental and Theoretical Study of Melt Viscosity in Injection Process

The melt viscosity in the injection process is significantly influenced by the setting parameters because the shear rate in injection molding is higher than in other processes. Determining the plastic melt viscosity during the injection process is therefore important for understanding the influence of setting parameters on the melt viscosity. An apparatus called the pressure sensor bushing (PSB) module, used to evaluate the melt viscosity during the injection process, is developed in this work. Formulations coupling the melt viscosity with fill time and injection pressure are derived, and the melt viscosity is then determined. A test mold is prepared to evaluate the accuracy of the viscosity calculations of the PSB module against conventional approaches. The influence of melt viscosity on the tensile strength of the molded part is examined to study the consistency of injection quality.
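
As a point of reference, a standard capillary-style estimate relates melt viscosity to injection pressure and fill time in the same spirit as the PSB formulation (the paper's exact derivation may differ). For a runner of radius R and length L filled with volume V in time t_fill, with flow rate Q = V / t_fill and pressure drop \Delta P:

\[
\tau_w = \frac{\Delta P\, R}{2L}, \qquad
\dot{\gamma}_w = \frac{4Q}{\pi R^{3}}, \qquad
\eta = \frac{\tau_w}{\dot{\gamma}_w}
     = \frac{\pi\, \Delta P\, R^{4}\, t_{\mathrm{fill}}}{8\, L\, V}
\]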

A Novel Approach for Coin Identification using Eigenvalues of Covariance Matrix, Hough Transform and Raster Scan Algorithms

In this paper we present a new method for coin identification. The proposed method adopts a hybrid scheme using eigenvalues of the covariance matrix, the Circular Hough Transform (CHT), and Bresenham's circle algorithm. The statistical and geometrical properties of the small and large eigenvalues of the covariance matrix of a set of edge pixels over a connected region of support are exploited for circular object detection. A sparse matrix technique is used to perform the CHT: since sparse matrices omit zero elements and contain only a small number of non-zero elements, they save matrix storage space and computational time. A neighborhood suppression scheme is used to find the valid Hough peaks. The accurate positions of the circumference pixels are identified using a raster scan algorithm which exploits the geometrical symmetry property. After finding the circular objects, the proposed method uses textons, the texture on the surface of the coins; a texton refers to the fundamental micro-structure in generic natural images and is a unique property of each coin. This method has been tested on several real world images, including coin and non-coin images. The performance is also evaluated based on the noise withstanding capability.
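
A sketch of the covariance-eigenvalue test for circular regions: for edge pixels sampled from a full circle, the two eigenvalues of the coordinate covariance matrix are nearly equal, so their ratio separates circles from elongated shapes. The CHT and raster scan stages are not reproduced here.

import numpy as np

def eigen_circularity(edge_points):          # edge_points: (N, 2) array of (x, y)
    cov = np.cov(edge_points.T)              # 2x2 covariance of the coordinates
    small, large = np.linalg.eigvalsh(cov)   # eigenvalues in ascending order
    return small / large                     # ~1 for a circle, ~0 for a line

theta = np.linspace(0, 2 * np.pi, 200)
circle = np.c_[np.cos(theta), np.sin(theta)]
ellipse = np.c_[3 * np.cos(theta), np.sin(theta)]
print(eigen_circularity(circle), eigen_circularity(ellipse))   # ~1.0 vs ~0.11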

Aquatic Modeling: An Interplay between Scales

This paper presents an integrated knowledge-based approach to multi-scale modeling of aquatic systems, with a view to enhancing predictive power and aiding environmental management and policy-making. The basic phases of this approach are exemplified in the case of a bay in Saronicos Gulf (Attiki, Greece). The results showed a significant problem with rising phytoplankton blooms linked to excessive microbial growth, arising mostly from increased nitrogen inflows; therefore, the nitrification/denitrification processes of the benthic and water column sub-systems provided the quality variables to be monitored for assessing environmental status. It is thereby demonstrated that the proposed approach facilitates modeling choices and implementation option decisions, while providing substantial support for the capitalization of knowledge and experience in long-term water management.

Data Migration between Document-Oriented and Relational Databases

Current tools for data migration between document-oriented and relational databases have several disadvantages. We propose a new approach for data migration between document-oriented and relational databases. During data migration, the relational schema of the target (relational) database is automatically created from a collection of XML documents. The proposed approach is verified on data migration between the document-oriented database IBM Lotus Notes/Domino and a relational database implemented in the relational database management system (RDBMS) MySQL.
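
A minimal sketch of the schema-derivation idea: the union of leaf-element names over a collection of flat XML documents becomes the column set of the target table. sqlite3 stands in for MySQL here, and the sample documents and typing (TEXT columns only) are simplified assumptions.

import sqlite3
import xml.etree.ElementTree as ET

docs = ["<person><name>Ann</name><age>31</age></person>",
        "<person><name>Bob</name><city>Brno</city></person>"]

records = [{child.tag: child.text for child in ET.fromstring(d)} for d in docs]
columns = sorted({k for r in records for k in r})          # derived schema

con = sqlite3.connect(":memory:")
con.execute(f"CREATE TABLE person ({', '.join(c + ' TEXT' for c in columns)})")
for r in records:
    con.execute(f"INSERT INTO person ({', '.join(columns)}) VALUES "
                f"({', '.join('?' for _ in columns)})",
                [r.get(c) for c in columns])
print(con.execute("SELECT * FROM person").fetchall())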

MIMO System Order Reduction Using Real-Coded Genetic Algorithm

In this paper, a real-coded genetic algorithm (RCGA) optimization technique is applied to large-scale linear dynamic multi-input multi-output (MIMO) systems. The method is based on an error minimization technique in which the integral square error between the transient responses of the original and reduced order models is minimized by the RCGA. The reduction procedure is simple and computer-oriented, and the approach is comparable in quality with other well-known reduction techniques. Moreover, the proposed method guarantees stability of the reduced model if the original high-order MIMO system is stable. The proposed approach to MIMO system order reduction is illustrated with the help of an example, and the results are compared with recently published well-known reduction techniques to show its superiority.
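
A compact sketch of the ISE-minimizing idea for a single channel (the paper treats the full MIMO case): a small real-coded GA searches the coefficients of a first-order reduced model so that its step response matches the original's. The example plant and GA settings are illustrative assumptions.

import numpy as np
from scipy import signal

t = np.linspace(0, 10, 500)
_, y_full = signal.step(signal.TransferFunction([8], [1, 6, 11, 6]), T=t)

def ise(params):                     # integral square error of step responses
    b, a = params
    if a <= 0:                       # reject unstable reduced models
        return np.inf
    _, y_red = signal.step(signal.TransferFunction([b], [1, a]), T=t)
    return float(np.sum((y_full - y_red) ** 2) * (t[1] - t[0]))

rng = np.random.default_rng(3)
pop = rng.uniform(0.1, 10.0, size=(40, 2))               # real-coded individuals
for _ in range(60):
    fit = np.array([ise(p) for p in pop])
    parents = pop[np.argsort(fit)[:20]]                  # truncation selection
    children = parents + rng.normal(0, 0.1, parents.shape)   # Gaussian mutation
    pop = np.vstack([parents, children])
best = pop[np.argmin([ise(p) for p in pop])]
print(best, ise(best))               # reduced model: numerator [b], denominator [1, a]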

Color Image Segmentation and Multi-Level Thresholding by Maximization of Conditional Entropy

In this work, a novel approach for color image segmentation using higher order entropy as a textural feature for the determination of thresholds over a two-dimensional image histogram is discussed. A similar approach is applied to achieve multi-level thresholding in both grayscale and color images. The paper discusses two methods of color image segmentation using RGB space as the standard processing space. The threshold for segmentation is decided by maximizing the conditional entropy in the two-dimensional histogram of the color image separated into the three grayscale images of R, G, and B. The features are first developed independently for the three (R, G, B) spaces and then combined to obtain the segmentation of the different color components. Considering local maxima instead of the global maximum of the conditional entropy yields multiple thresholds for the same image, which forms the basis for multi-level thresholding.
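
A sketch of 2-D-histogram entropy thresholding for one grayscale component (applied per R, G, B channel in this setting): the joint histogram of pixel value versus local-average value is split at (t, t), and the t maximizing the summed entropies of the two diagonal blocks is kept. This maximum-entropy variant stands in for the paper's exact conditional entropy criterion.

import numpy as np
from scipy.ndimage import uniform_filter

def block_entropy(block):
    s = block.sum()
    if s == 0:
        return 0.0
    p = block[block > 0] / s
    return float(-np.sum(p * np.log(p)))

def threshold_2d(img):                        # img: 2-D uint8 channel
    local = uniform_filter(img.astype(float), size=3).astype(np.uint8)
    hist, _, _ = np.histogram2d(img.ravel(), local.ravel(),
                                bins=256, range=[[0, 256], [0, 256]])
    hist /= hist.sum()
    scores = [block_entropy(hist[:t, :t]) + block_entropy(hist[t:, t:])
              for t in range(1, 256)]
    return 1 + int(np.argmax(scores))

rng = np.random.default_rng(0)
img = np.clip(np.hstack([rng.normal(60, 12, (64, 32)),
                         rng.normal(180, 12, (64, 32))]), 0, 255).astype(np.uint8)
print(threshold_2d(img))                      # lands roughly between the two modes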

Further Thoughts on a Sequential Life Testing Approach Using an Inverse Weibull Model

In this paper we develop further the sequential life testing approach presented in a previous article by [1], using an underlying two-parameter Inverse Weibull sampling distribution. The location parameter, or minimum life, will be considered equal to zero. Once again we provide rules for making one of the three possible decisions as each observation becomes available, that is: accept the null hypothesis H0; reject the null hypothesis H0; or obtain additional information by making another observation. The product being analyzed is a new electronic component. There is little information available about the possible values the parameters of the corresponding Inverse Weibull underlying sampling distribution could have. To estimate the shape and scale parameters of the underlying Inverse Weibull model, we use a maximum likelihood approach for censored failure data. A new example further develops the proposed sequential life testing approach.
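
For reference, the two-parameter Inverse Weibull sampling distribution with zero location parameter, in one common parameterization with shape \beta and scale \theta, together with the censored-data likelihood that the maximum likelihood step maximizes:

\[
F(t) = \exp\!\left[-\left(\frac{\theta}{t}\right)^{\beta}\right], \qquad
f(t) = \frac{\beta\,\theta^{\beta}}{t^{\beta+1}}
       \exp\!\left[-\left(\frac{\theta}{t}\right)^{\beta}\right], \qquad t > 0
\]

\[
L(\beta,\theta) = \prod_{i\,\in\,\mathrm{failures}} f(t_i)
                  \prod_{j\,\in\,\mathrm{censored}} \bigl[1 - F(t_j)\bigr]
\]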