Syntactic Recognition of Distorted Patterns

In syntactic pattern recognition a pattern can be represented by a graph. Given an unknown pattern represented by a graph g, the problem of recognition is to determine whether g belongs to the language L(G) generated by a graph grammar G. The so-called IE graphs have been defined in [1] for the description of patterns. IE graphs are generated by the so-called ETPL(k) graph grammars, also defined in [1], and an efficient parsing algorithm for ETPL(k) graph grammars for the syntactic recognition of patterns represented by IE graphs has been presented there. In practice, structural descriptions may contain pattern distortions, so that the assignment of a graph g, representing an unknown pattern, to a graph language L(G) generated by an ETPL(k) graph grammar G is rejected by the ETPL(k)-type parsing. Therefore, effective parsing algorithms for the recognition of distorted patterns are needed. The purpose of this paper is to present a new approach to the syntactic recognition of distorted patterns. To take into account all variations of a distorted pattern under study, a probabilistic description of the pattern is needed, and a random IE graph approach is proposed here for such a description [2].
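As a minimal sketch of the probabilistic idea (not the parser of [1] or the exact random IE graph formalism of [2]), one can attach label distributions to indexed nodes and edges and score an observed, possibly distorted graph by the product of its label probabilities; all names and distributions below are illustrative assumptions.

    # Hypothetical label distributions of a "random IE graph": node i carries
    # a distribution over node labels, and each indexed edge carries a
    # distribution over edge labels. These numbers are made up for illustration.
    node_dist = [{"A": 0.9, "B": 0.1},
                 {"C": 0.7, "D": 0.3}]
    edge_dist = {(0, 1): {"r": 0.8, "u": 0.2}}   # edge from node 0 to node 1

    def graph_probability(node_labels, edge_labels):
        """Probability of an observed (possibly distorted) labeled graph,
        assuming independent node and edge labels."""
        p = 1.0
        for i, label in enumerate(node_labels):
            p *= node_dist[i].get(label, 0.0)   # unseen label -> probability 0
        for edge, label in edge_labels.items():
            p *= edge_dist[edge].get(label, 0.0)
        return p

    # A distorted pattern ("D" instead of the more likely "C") still receives
    # a nonzero probability instead of being rejected outright by the parser.
    print(graph_probability(["A", "D"], {(0, 1): "r"}))   # 0.9 * 0.3 * 0.8 = 0.216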

An Efficient Run Time Interface for Heterogeneous Architecture of Large Scale Supercomputing System

In this paper we propose a novel Run Time Interface (RTI) technique that provides an efficient environment for MPI jobs on the heterogeneous architecture of PARAM Padma. It offers an innovative, unified framework for the job management interface system in parallel and distributed computing, employing a proxy scheme. The implementation shows that the proposed RTI is highly scalable and stable. Moreover, the RTI provides storage access for MPI jobs on various operating system platforms and improves data access performance through the high-performance C-DAC Parallel File System (C-PFS). The performance of the RTI is evaluated using standard HPC benchmark suites, and the simulation results show that the proposed RTI performs well on a large-scale supercomputing system.

Wavelet Transform and Support Vector Machine Approach for Fault Location in Power Transmission Line

This paper presents a wavelet transform and Support Vector Machine (SVM) based algorithm for estimating fault location on transmission lines. The discrete wavelet transform (DWT) is used for data pre-processing, and the resulting data are used for training and testing the SVM. Five mother wavelets are applied in the signal processing to identify the wavelet family most appropriate for estimating fault location. The results demonstrate the ability of the SVM to generalize from the provided patterns and to accurately estimate the location of faults with varying fault resistance.
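A minimal sketch of such a pipeline, assuming synthetic fault signals and illustrative parameters (the paper's actual feature set, wavelet choices, and SVM settings are not reproduced here): energies of the DWT sub-bands serve as features for a support vector regressor that outputs the fault distance.

    import numpy as np
    import pywt
    from sklearn.svm import SVR

    def dwt_features(signal, wavelet="db4", level=4):
        # Energy of each DWT sub-band as a compact feature vector.
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        return np.array([np.sum(c ** 2) for c in coeffs])

    rng = np.random.default_rng(0)
    # Hypothetical training set: fault waveforms and known fault distances
    # (km); real data would come from transmission line simulations.
    distances = rng.uniform(1, 100, 200)
    signals = [np.sin(np.linspace(0, 20, 512)) * d + rng.normal(0, 1, 512)
               for d in distances]
    X = np.array([dwt_features(s) for s in signals])

    svm = SVR(kernel="rbf", C=100.0, epsilon=0.5)
    svm.fit(X, distances)
    print(svm.predict(X[:3]), distances[:3])   # estimated vs. true locations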

Big Bang – Big Crunch Learning Method for Fuzzy Cognitive Maps

Modeling complex dynamic systems, for which it is very complicated to establish mathematical models, requires new and modern methodologies that exploit existing expert knowledge, human experience and historical data. Fuzzy cognitive maps are very suitable, simple, and powerful tools for the simulation and analysis of such dynamic systems. However, human experts are subjective and can handle only relatively simple fuzzy cognitive maps; therefore, there is a need to develop new approaches for the automated generation of fuzzy cognitive maps from historical data. In this study, a new learning algorithm, called Big Bang-Big Crunch, is proposed for the first time in the literature for the automated generation of fuzzy cognitive maps from data. Two real-world examples, namely a process control system and a radiation therapy process, and one synthetic model are used to demonstrate the effectiveness and usefulness of the proposed methodology.
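The sketch below illustrates the Big Bang-Big Crunch idea on a toy fuzzy cognitive map; the sigmoid threshold function, the fitness measure, and all sizes are illustrative assumptions, not the paper's exact setup. Candidate weight matrices are scattered around a center of mass with a radius that shrinks each iteration (Big Bang), then contracted to a fitness-weighted center of mass (Big Crunch).

    import numpy as np

    def fcm_simulate(W, a0, steps=20):
        # Iterate the FCM state equation A(t+1) = sigmoid(W @ A(t)).
        a = a0.copy()
        for _ in range(steps):
            a = 1.0 / (1.0 + np.exp(-(W @ a)))
        return a

    def bb_bc_learn(target, a0, n_concepts, pop=50, iters=100, seed=0):
        rng = np.random.default_rng(seed)
        center = rng.uniform(-1, 1, (n_concepts, n_concepts))
        best, best_err = center, np.inf
        for k in range(1, iters + 1):
            # Big Bang: scatter candidates around the center; radius ~ 1/k.
            cands = center + rng.normal(0, 1, (pop, n_concepts, n_concepts)) / k
            cands = np.clip(cands, -1, 1)
            errs = np.array([np.linalg.norm(fcm_simulate(W, a0) - target)
                             for W in cands])
            # Big Crunch: fitness-weighted center of mass (weight = 1/error).
            w = 1.0 / (errs + 1e-12)
            center = np.tensordot(w, cands, axes=1) / w.sum()
            i = errs.argmin()
            if errs[i] < best_err:
                best, best_err = cands[i], errs[i]
        return best, best_err

    # Toy target: the steady state produced by a hidden "true" weight matrix.
    n = 4
    rng0 = np.random.default_rng(1)
    W_true = rng0.uniform(-1, 1, (n, n))
    a0 = np.full(n, 0.5)
    target = fcm_simulate(W_true, a0)
    W_hat, err = bb_bc_learn(target, a0, n)
    print("final fitness error:", err)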

Intelligent Agents for Distributed Intrusion Detection System

This paper presents a distributed intrusion detection system (IDS) based on the concept of a community of specialized distributed agents, i.e., agents that share the same purpose of detecting distributed attacks. The semantics of intrusion events occurring in a given network is defined. Correlation rules describe the process by which the proposed IDS combines captured events that are distributed both spatially and temporally; the IDS then tries to extract significant and broad patterns for a set of well-known attacks. The primary goal of our work is to provide intrusion detection and real-time prevention capability against insider attacks in distributed and fully automated environments.
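A minimal sketch of spatio-temporal event correlation of the kind described (the rule, thresholds, and event schema below are hypothetical, not the paper's rule language): events captured by agents on different hosts are grouped within a sliding time window, and an alert is raised when the same attack signature appears on enough distinct hosts.

    from collections import defaultdict

    # Hypothetical captured events: (timestamp_sec, host, signature).
    events = [
        (10.0, "10.0.0.1", "port_scan"),
        (11.5, "10.0.0.2", "port_scan"),
        (12.1, "10.0.0.3", "port_scan"),
        (50.0, "10.0.0.1", "ssh_bruteforce"),
    ]

    def correlate(events, window=30.0, min_hosts=3):
        """Report signatures seen on >= min_hosts distinct hosts within the
        time window -- a simple spatially and temporally distributed pattern."""
        alerts = []
        by_sig = defaultdict(list)
        for t, host, sig in sorted(events):
            by_sig[sig].append((t, host))
        for sig, hits in by_sig.items():
            for t0, _ in hits:
                hosts = {h for t, h in hits if t0 <= t <= t0 + window}
                if len(hosts) >= min_hosts:
                    alerts.append((sig, sorted(hosts)))
                    break
        return alerts

    print(correlate(events))   # [('port_scan', ['10.0.0.1', '10.0.0.2', '10.0.0.3'])]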

Improved Modulo 2^n + 1 Adder Design

Efficient modulo 2^n + 1 adders are important for several applications, including residue number systems, digital signal processors and cryptography algorithms. In this paper we present a novel modulo 2^n + 1 addition algorithm for a recently presented number system, introduced to reduce the power dissipated. In a conventional modulo 2^n + 1 adder, all operands have (n+1)-bit length. To avoid using (n+1)-bit circuits, the diminished-1 and carry-save diminished-1 number systems can be used effectively in applications. We also derive two new architectures for designing a modulo 2^n + 1 adder based on an n-bit ripple-carry adder: the first is a faster design, whereas the second uses less hardware. In the proposed method, the special treatment required for zero operands in the diminished-1 number system is removed. Furthermore, the fastest modulo 2^n + 1 adders for the normal binary system require 3-operand adders; this problem is also resolved in this paper. The proposed architectures are compared with efficient adders based on ripple-carry and high-speed adders, and it is shown that hardware overhead and power consumption are reduced. Besides the power reduction, in some cases the power-delay product is also reduced.
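The diminished-1 arithmetic can be checked in software before committing to hardware. The sketch below is a behavioral model, not the proposed architectures: operands are stored as v - 1, the n-bit sum uses the complemented end-around carry, and zero operands and results are excluded, which is exactly the special case the paper's method removes.

    def dim1_add(a_star, b_star, n):
        # Diminished-1 modulo 2^n + 1 addition (zero operands excluded):
        # operands are stored as v - 1, and the true sum S = (A + B) mod (2^n + 1)
        # is recovered as S* = (A* + B* + !carry) mod 2^n, using the
        # complemented end-around carry instead of an (n+1)-bit datapath.
        t = a_star + b_star
        carry = t >> n
        return (t + (1 - carry)) & ((1 << n) - 1)

    n = 4
    mod = (1 << n) + 1
    for a in range(1, mod):
        for b in range(1, mod):
            s = (a + b) % mod
            if s == 0:      # zero results need the special handling
                continue
            assert dim1_add(a - 1, b - 1, n) == s - 1
    print("diminished-1 addition verified for all nonzero operands, n =", n)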

Distinguishing Playing Pattern between Winning and Losing Field Hockey Team in Delhi FIH Road to London 2012 Tournament

The aim of the present study was to analyze and distinguish the playing patterns of winning and losing field hockey teams in the Delhi 2012 tournament. The analysis focuses on D penetration (right, center, left) and on linking each D penetration to the end shot made from it. Twelve matches from the tournament were recorded and analyzed using the Sportscode Elite computer software. Two groups of performance indicators were used: D penetration (right, center, and left) and shot type (hit, push, flick, drag, drag flick, deflect sweep, deflect push, scoop, sweep, and reverse hit). In distinguishing the pattern of play between winning and losing teams, only two performance indicators showed highly significant differences, starting from the right (Z=-2.87, p=.004, p

Graph-Based Text Similarity Measurement by Exploiting Wikipedia as Background Knowledge

Text similarity measurement is a fundamental issue in many textual applications such as document clustering, classification, summarization and question answering. However, prevailing approaches based on the Vector Space Model (VSM) suffer, to a greater or lesser extent, from the limitations of the Bag of Words (BOW) representation, which ignores the semantic relationships among words. Enriching document representation with background knowledge from Wikipedia has proven to be an effective way to address this problem, but most existing methods still cannot avoid similar flaws of BOW in the new vector space. In this paper, we propose a novel text similarity measurement that goes beyond the VSM and can find semantic affinity between documents. Specifically, it is a unified graph model that exploits Wikipedia as background knowledge and synthesizes both document representation and similarity computation. Experimental results on two different datasets show that our approach significantly improves on VSM-based methods in both text clustering and classification.
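As an illustrative simplification (not the authors' unified graph model), documents and Wikipedia concepts can be placed in a bipartite graph and document affinity measured through overlapping concept neighborhoods; the concept annotations and relatedness scores below are hard-coded assumptions standing in for the output of a real entity linker.

    import numpy as np

    # Hypothetical doc -> Wikipedia-concept edges with weights; a real system
    # would produce these links from the document text.
    doc_concepts = {
        "d1": {"Automobile": 0.9, "Engine": 0.7},
        "d2": {"Car":        0.8, "Engine": 0.6},   # no word overlap with d1
        "d3": {"Banking":    0.9, "Loan":   0.8},
    }
    # Concept-concept relatedness (from Wikipedia link structure, here assumed).
    related = {("Automobile", "Car"): 0.95}

    def expand(vec):
        # Propagate weight to related concepts so "Automobile" meets "Car".
        out = dict(vec)
        for (a, b), r in related.items():
            if a in vec: out[b] = max(out.get(b, 0.0), vec[a] * r)
            if b in vec: out[a] = max(out.get(a, 0.0), vec[b] * r)
        return out

    def sim(d1, d2):
        v1, v2 = expand(doc_concepts[d1]), expand(doc_concepts[d2])
        keys = sorted(set(v1) | set(v2))
        x = np.array([v1.get(k, 0.0) for k in keys])
        y = np.array([v2.get(k, 0.0) for k in keys])
        return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

    print(sim("d1", "d2"), sim("d1", "d3"))   # semantically close vs. unrelated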

Customer-Supplier Collaboration in Casting Industry: a Review on Organizational and Human Aspects

Customer-supplier collaboration enables firms to achieve greater success than acting independently. Nevertheless, not many firms have fully utilized the potential of collaboration. This paper presents organizational and human-related success factors for collaboration in manufacturing supply chains in the casting industry. Our research approach was a case study comprising multiple cases. Data were gathered through interviews and group discussions in two research projects; in the first we studied seven firms and in the second five. It was found that the success factors are interrelated: organizational and human factors together enable success, but none of them does alone. Among the identified success factors are a culture of honoring agreements and the speed of informing the partner about changes affecting the product or the delivery chain.

Internet Purchases in European Union Countries: Multiple Linear Regression Approach

This paper examines the influence of economic and Information and Communication Technology (ICT) development on the recently increasing Internet purchases by individuals in European Union member states. After a growing trend in Internet purchases across the EU27 was observed, an all-possible-regressions analysis was applied using nine independent variables for 2011. Finally, two linear regression models were studied in detail. The simple linear regression analysis confirmed the research hypothesis that Internet purchases in the analyzed EU countries are positively correlated with the statistically significant variable Gross Domestic Product per capita (GDPpc). The multiple linear regression model with four regressors capturing the level of ICT development likewise indicates that ICT development is crucial for explaining Internet purchases by individuals, confirming the research hypothesis.
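A minimal sketch of the modeling step with statsmodels; the country values below are fabricated placeholders, and the regressor names only approximate the kind of ICT indicators the paper uses.

    import pandas as pd
    import statsmodels.api as sm

    # Fabricated illustrative data: one row per country, 2011-style indicators.
    df = pd.DataFrame({
        "internet_purchases": [66, 64, 45, 13, 17, 43],   # % of individuals
        "gdp_pc":             [43, 37, 31, 12, 13, 24],   # kEUR, hypothetical
        "internet_use":       [91, 90, 81, 54, 58, 73],
        "broadband":          [83, 86, 77, 49, 55, 68],
        "ict_skills":         [70, 72, 60, 35, 40, 55],
    })

    # Simple linear regression: Internet purchases vs. GDP per capita.
    simple = sm.OLS(df["internet_purchases"],
                    sm.add_constant(df[["gdp_pc"]])).fit()

    # Multiple linear regression with four ICT-development regressors.
    multi = sm.OLS(df["internet_purchases"],
                   sm.add_constant(df[["internet_use", "broadband",
                                       "ict_skills", "gdp_pc"]])).fit()
    print(simple.params, multi.rsquared)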

Identifying Relationships between Technology-based Services and ICTs: A Patent Analysis Approach

A variety of new technology-based services have emerged with the development of Information and Communication Technologies (ICTs). Since technology-based services have technology-driven characteristics, identifying the relationships between technology-based services and ICTs would yield meaningful implications. Thus, this paper proposes an approach for identifying these relationships by analyzing patent documents. First, business model (BM) patents are classified into relevant service categories. Second, patent citation analysis is conducted to investigate the technological linkage and impacts between technology-based services and ICTs at the macro level. Third, as a micro-level analysis, patent co-classification analysis is employed to identify the technological linkage and coverage. The proposed approach can guide and help managers and designers of technology-based services to discover opportunities for developing new technology-based services in emerging service sectors.
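A minimal sketch of the two analysis steps on fabricated patent records (classification codes, citation pairs, and category names are hypothetical): citation counts between service categories and ICT classes approximate the macro-level linkage, while co-occurrence of classification codes on the same patent gives the micro-level co-classification view.

    from collections import Counter
    from itertools import combinations

    # Fabricated BM patents: service category and ICT classification codes.
    patents = {
        "P1": {"service": "mobile payment", "codes": ["G06Q20", "H04W12"]},
        "P2": {"service": "mobile payment", "codes": ["G06Q20", "H04L9"]},
        "P3": {"service": "telehealth",     "codes": ["G16H40", "H04W4"]},
    }
    # Fabricated citations: (citing BM patent, cited ICT patent class).
    citations = [("P1", "H04W12"), ("P2", "H04L9"),
                 ("P3", "H04W4"), ("P3", "H04W4")]

    # Macro level: how strongly each service category draws on each ICT class.
    macro = Counter((patents[p]["service"], c) for p, c in citations)
    print(macro.most_common())

    # Micro level: co-classification -- ICT code pairs on the same patent.
    micro = Counter(pair for p in patents.values()
                    for pair in combinations(sorted(p["codes"]), 2))
    print(micro.most_common())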

Risk Classification of SMEs by Early Warning Model Based on Data Mining

One of the biggest problems of SMEs is their tendency toward financial distress due to an insufficient financial background. In this study, an Early Warning System (EWS) model for financial risk detection based on data mining is presented; the CHAID algorithm is used for its development. Thanks to its automated nature, the developed EWS can serve as a tailor-made financial advisor in the decision-making processes of firms whose owners have an inadequate financial background. In addition, the model was applied to 7,853 SMEs using Turkish Central Bank (TCB) data from 2007. Using the EWS model, 31 risk profiles, 15 risk indicators, 2 early warning signals, and 4 financial road maps were determined for financial risk mitigation.
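scikit-learn has no CHAID implementation, so the sketch below substitutes an ordinary decision tree classifier to show the shape of such an early warning model; the financial ratios and risk labels are fabricated stand-ins for the TCB 2007 variables.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(0)
    n = 500
    # Fabricated SME indicators: liquidity, leverage, profitability ratios.
    X = rng.normal(0, 1, (n, 3))
    # Fabricated rule generating "distressed" labels for the demo.
    y = ((X[:, 0] < -0.5) & (X[:, 1] > 0.5)).astype(int)

    # Stand-in for CHAID: a shallow tree whose root-to-leaf paths read as
    # risk profiles and whose split conditions act as warning indicators.
    tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20).fit(X, y)
    print(export_text(tree, feature_names=["liquidity", "leverage", "profit"]))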

Automated Particle Picking based on Correlation Peak Shape Analysis and Iterative Classification

Cryo-electron microscopy (CEM) in combination with single particle analysis (SPA) is a widely used technique for elucidating structural details of macromolecular assemblies at close-to-atomic resolutions. However, the development of automated software for SPA processing remains vital, since thousands to millions of individual particle images need to be processed. Here, we present our workflow for automated particle picking. Our approach integrates peak shape analysis into the classical correlation and applies an iterative approach to separate macromolecules from background by classification. This particle selection workflow furthermore provides a robust means for SPA with little user interaction. The performance of the presented tools is assessed by processing simulated and experimental data.
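A minimal sketch of correlation-based picking with a peak shape check (thresholds and the sharpness measure are illustrative, not the paper's criteria): template matching produces a correlation map, and candidate peaks are kept only if they fall off steeply enough, which suppresses the broad maxima typical of background.

    import numpy as np
    from scipy.ndimage import maximum_filter
    from skimage.feature import match_template

    rng = np.random.default_rng(0)
    image = rng.normal(0, 1, (128, 128))
    template = np.zeros((9, 9)); template[2:7, 2:7] = 1.0   # toy "particle"
    image[40:49, 60:69] += 3 * template                     # plant one particle

    cc = match_template(image, template, pad_input=True)    # correlation map

    def pick(cc, min_cc=0.4, min_sharpness=0.1, radius=4):
        peaks = (cc == maximum_filter(cc, size=9)) & (cc > min_cc)
        picks = []
        for y, x in zip(*np.nonzero(peaks)):
            ring = cc[max(y - radius, 0):y + radius + 1,
                      max(x - radius, 0):x + radius + 1]
            # Peak shape: a true particle peak stands well above the mean
            # correlation in its neighborhood; broad background blobs fail.
            if cc[y, x] - ring.mean() > min_sharpness:
                picks.append((y, x))
        return picks

    print(pick(cc))   # approximately [(44, 64)]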

A Wavelet Based Object Watermarking System for Image and Video

Efficient storage, transmission and use of video information are key requirements in many multimedia applications currently being addressed by MPEG-4. To fulfill these requirements, a new approach for representing video information, which relies on an object-based representation, has been adopted. Object-based watermarking schemes are therefore needed for copyright protection. This paper proposes a novel blind object watermarking scheme for images and video using the in-place lifting shape-adaptive discrete wavelet transform (SA-DWT). In order to make the watermark robust and transparent, it is embedded in the average of wavelet blocks using a visual model based on the human visual system, and the n least significant bits (LSBs) of the wavelet coefficients are adjusted in concert with the average. Simulation results show that the proposed watermarking scheme is perceptually invisible and robust against many attacks, such as lossy image/video compression (e.g. JPEG, JPEG2000 and MPEG-4), scaling, noise addition, filtering, etc.
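A minimal sketch of embedding by block averages, using a plain rectangular DWT rather than the SA-DWT and a simple quantization of the block mean instead of the paper's HVS model; all parameters are illustrative.

    import numpy as np
    import pywt

    def embed(image, bits, block=8, step=12.0):
        """Embed bits by quantizing the mean of 8x8 blocks of the LL band."""
        cA, detail = pywt.dwt2(image.astype(float), "haar")
        k = 0
        for i in range(0, cA.shape[0] - block + 1, block):
            for j in range(0, cA.shape[1] - block + 1, block):
                if k >= len(bits):
                    break
                blk = cA[i:i + block, j:j + block]
                m = blk.mean()
                # Quantization index modulation: even/odd bin encodes the bit.
                q = np.floor(m / step)
                target = (q + (0 if (q % 2) == bits[k] else 1) + 0.5) * step
                blk += target - m      # shift every coefficient in the block
                k += 1
        return pywt.idwt2((cA, detail), "haar")

    def extract(image, n_bits, block=8, step=12.0):
        cA, _ = pywt.dwt2(image.astype(float), "haar")
        out = []
        for i in range(0, cA.shape[0] - block + 1, block):
            for j in range(0, cA.shape[1] - block + 1, block):
                if len(out) >= n_bits:
                    return out
                out.append(int(np.floor(cA[i:i + block, j:j + block].mean()
                                        / step)) % 2)
        return out

    img = np.random.default_rng(0).uniform(0, 255, (64, 64))
    marked = embed(img, [1, 0, 1, 1])
    print(extract(marked, 4))   # [1, 0, 1, 1]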

Optimizing Mobile Agents Migration Based on Decision Tree Learning

Mobile agents are a powerful approach to developing distributed systems, since they migrate to hosts on which they have the resources to execute individual tasks. In a dynamic environment such as a peer-to-peer network, agents have to be generated frequently and dispatched into the network, so they inevitably consume a certain amount of bandwidth on every link they traverse. If too many agents migrate through one or several links at the same time, they introduce excessive transfer overhead; eventually these links become congested and indirectly block network traffic. There is therefore a need to develop routing algorithms that take the traffic load into account. In this paper we seek to create cooperation between a probabilistic mechanism, driven by a quality measure of the network traffic situation, and the agent's decision making about the next hop, based on decision tree learning algorithms.
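A minimal sketch of such a combination (the features, labels, and epsilon-greedy blend are illustrative assumptions): a decision tree trained on past migrations predicts the best next hop from link measurements, and a probabilistic component occasionally explores other hops so that traffic does not pile onto one link.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    # Hypothetical training log: [bandwidth_mbps, load_pct, latency_ms] per
    # candidate link, labeled with the hop that proved best for past agents.
    X = rng.uniform(0, 100, (300, 3))
    y = (X[:, 1] < 50).astype(int)       # toy rule: prefer hop 1 if load low

    tree = DecisionTreeClassifier(max_depth=4).fit(X, y)

    def next_hop(link_features, epsilon=0.1):
        """Tree-predicted hop most of the time; random exploration otherwise."""
        if rng.random() < epsilon:
            return int(rng.integers(0, 2))        # probabilistic exploration
        return int(tree.predict(np.asarray(link_features)[None, :])[0])

    print(next_hop([80.0, 20.0, 5.0]))   # lightly loaded link -> hop 1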

The Effects of an Information Delivery Modality on Psychology of E-learning Students

Does the communication modality matter in delivering e-learning information? With the recent growth of broadcasting systems, media technologies and e-learning content, various systems with different communication modalities have been introduced. In line with these trends, this study examines the effects of the information delivery modality on the psychology of students. Findings from an experiment indicated that information delivered with a video modality elicited higher degrees of credibility, quality, representativeness of content, and perceived suitability for delivering information than auditory information did. However, no difference was found in content liking and attitude. The implications of the findings and the limitations are discussed.

Error-Robust Nature of Genome Profiling Applied for Clustering of Species Demonstrated by Computer Simulation

Genome profiling (GP), a genotype-based technology that exploits random PCR and temperature gradient gel electrophoresis, has been successful in the identification and classification of organisms. In this technology, spiddos (species identification dots) and PaSS (pattern similarity score) are employed for measuring the closeness (or distance) between genomes. Based on this closeness (PaSS), phylogenetic trees of organisms can be built. We noticed that the topology of such a tree is rather robust against the experimental fluctuation conveyed by spiddos. In this study, this observation was confirmed quantitatively by computer simulation, establishing the limits of reliability of this highly powerful methodology. As a result, we could demonstrate the effectiveness of the GP approach for the identification and classification of organisms.
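A sketch of the robustness experiment under simplifying assumptions: spiddos are reduced to 2-D points, and a PaSS-like score is defined as one minus the mean normalized distance between matched spiddos, which is not the exact published formula. Genomes are perturbed with Gaussian noise and the similarity ranking is checked for stability.

    import numpy as np

    rng = np.random.default_rng(0)

    def pass_like(g1, g2):
        # Simplified PaSS-style score: 1 - mean normalized spiddo distance.
        d = np.linalg.norm(g1 - g2, axis=1)
        s = np.linalg.norm(g1, axis=1) + np.linalg.norm(g2, axis=1)
        return 1.0 - np.mean(d / s)

    # Three "genomes" as sets of 20 spiddos; B is close to A, C is distant.
    A = rng.uniform(0.2, 1.0, (20, 2))
    B = A + rng.normal(0, 0.02, A.shape)
    C = rng.uniform(0.2, 1.0, (20, 2))

    flips = 0
    for _ in range(1000):
        # Experimental fluctuation: jitter every measured spiddo.
        nA, nB, nC = (g + rng.normal(0, 0.05, g.shape) for g in (A, B, C))
        if pass_like(nA, nB) <= pass_like(nA, nC):
            flips += 1
    print("similarity ranking flipped in", flips, "of 1000 trials")  # expect few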

Design of Robust Fuzzy Logic Power System Stabilizer

Power system stabilizers (PSS) must be capable of providing appropriate stabilization signals over a broad range of operating conditions and disturbances. Traditional PSS rely on robust linear design methods in an attempt to cover a wide range of operating conditions; expert or rule-based controllers have also been proposed. Recently, fuzzy logic (FL), as a novel robust control design method, has shown promising results. The emphasis in fuzzy control design centers on uncertainties in the system parameters and operating conditions. In this paper a novel Robust Fuzzy Logic Power System Stabilizer (RFLPSS) design is proposed. The RFLPSS utilizes only one measurable signal, the generator shaft speed deviation Δω, as input. The speed signal is discretized, resulting in three inputs to the RFLPSS. There are six rules for the fuzzification and two rules for the defuzzification. To provide robustness, additional speed-derived signals are used as inputs to the RFLPSS, enabling appropriate gain adjustments for the three RFLPSS inputs. Simulation studies show the superior performance of the RFLPSS compared with an optimally designed conventional PSS and a discrete-mode FLPSS.
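A minimal sketch of one fuzzy stabilizer stage (the memberships, rule consequents, and gains are illustrative, not the paper's tuned RFLPSS): the speed deviation Δω is fuzzified with triangular sets, rules map it to a stabilizing signal, and the crisp output is a weighted average.

    def tri(x, a, b, c):
        # Triangular membership function with peak at b.
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def rflpss_output(d_omega):
        """Map speed deviation (pu) to a stabilizing signal via fuzzy rules."""
        mu = {
            "neg":  tri(d_omega, -0.02, -0.01, 0.0),
            "zero": tri(d_omega, -0.01,  0.0,  0.01),
            "pos":  tri(d_omega,  0.0,   0.01, 0.02),
        }
        # Rule consequents (singleton outputs, pu): oppose the deviation.
        out = {"neg": 0.05, "zero": 0.0, "pos": -0.05}
        num = sum(mu[k] * out[k] for k in mu)
        den = sum(mu.values()) or 1.0
        return num / den          # weighted-average defuzzification

    for w in (-0.015, 0.0, 0.008):
        print(w, rflpss_output(w))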

Broadcasting to Handheld Devices: The Challenges

Digital Video Broadcasting - Terrestrial (DVB-T) allows broadcasting, telephone and data services to be combined in one network, and it has facilitated mobile TV broadcasting. Mobile TV broadcasting is dominated by a fragmentation of standards across continents: in Asia, T-DMB and ISDB-T are used; Europe mainly uses DVB-H; and in the USA it is MediaFLO. Royalty issues for the developers of these incompatible technologies, the investments already made, and differing local conditions will make it difficult to agree on a unified standard in the very near future. Despite this shortcoming, mobile TV has shown very good market potential. A number of challenges still exist for regulators, investors and technology developers, but the future looks bright. There is a need for mobile telephone operators to cooperate with content providers and with those operating terrestrial digital broadcasting infrastructure for mutual benefit.

An Engineering Approach to Forecast Volatility of Financial Indices

By systematically applying different engineering methods, difficult financial problems become approachable. Using a combination of theory and techniques such as the wavelet transform, time series data mining, Markov chain based discrete stochastic optimization, and evolutionary algorithms, this work formulated a strategy to characterize and forecast non-linear time series. It attempted to extract typical features from the volatility data sets of the S&P100 and S&P500 indices, which include abrupt drops, jumps and other non-linearities. As a result, forecasting accuracy has reached an average of over 75%, surpassing other publicly available results on the forecasting of any financial index.
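A minimal sketch of the wavelet feature-extraction step on a synthetic volatility series with planted jumps (thresholds and the wavelet choice are illustrative, and the Markov chain optimization and evolutionary components are not reproduced): the detail coefficients localize abrupt drops and jumps, separating them from the smooth trend used for a naive forecast.

    import numpy as np
    import pywt

    rng = np.random.default_rng(0)
    # Synthetic volatility series with two planted jumps.
    vol = 0.2 + 0.05 * np.sin(np.linspace(0, 8, 512)) + rng.normal(0, 0.01, 512)
    vol[200] += 0.15
    vol[390] -= 0.12

    coeffs = pywt.wavedec(vol, "db4", level=4)
    d1 = coeffs[-1]                      # finest detail band
    jumps = np.nonzero(np.abs(d1) > 5 * np.median(np.abs(d1)))[0] * 2
    print("abrupt moves near indices:", jumps)

    # Smooth trend: zero out all detail bands and reconstruct, then use the
    # last trend value as a naive one-step forecast.
    smooth = pywt.waverec([coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]],
                          "db4")
    print("one-step trend forecast:", smooth[-1])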