Testing Visual Abilities of Machines - Visual Intelligence Tests

Intelligence tests are a series of tasks designed to measure the capacity to make abstractions, to learn, and to deal with novel situations. Testing of the visual abilities of the shape understanding system (SUS) is performed using visual intelligence tests. In this paper the progressive matrices tests are formulated as tasks given to SUS. These tests require good visual problem-solving abilities of the human subject. SUS solves these tests by performing complex visual reasoning that transforms the visual forms (tests) into string forms. The experiment showed that the proposed method, which is part of SUS's visual understanding abilities, can solve a test that is very difficult for human subjects.

Evaluation of Electronic Payment Systems Using Fuzzy Multi-Criteria Decision Making Approach

Global competitiveness has recently become the biggest concern of both manufacturing and service companies. Electronic commerce, as a key technology, enables firms to reach all potential consumers from all over the world. In this study, we have presented commonly used electronic payment systems and then evaluated these systems with respect to different criteria. The payment systems included in this research are the credit card, the virtual credit card, electronic money, mobile payment, the credit transfer and debit instruments. We have carried out a systematic comparison of these systems with respect to three main criteria: technical, economical and social. We have conducted a fuzzy multi-criteria decision making procedure to deal with the multi-attribute nature of the problem. The subjectivity and imprecision of the evaluation process are modeled using triangular fuzzy numbers.
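
As a rough illustration of the fuzzy evaluation step, the sketch below scores one payment system with triangular fuzzy numbers; the weights and ratings are hypothetical placeholders, not the study's data.

```python
# A minimal sketch of scoring a payment system with triangular fuzzy numbers.
# All weights and ratings below are hypothetical, not the paper's data.

def tfn_mul(a, b):
    """Approximate product of two triangular fuzzy numbers (l, m, u)."""
    return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

def tfn_add(a, b):
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def defuzzify(t):
    """Centroid defuzzification of a triangular fuzzy number."""
    return sum(t) / 3.0

# Hypothetical fuzzy weights for the three main criteria.
weights = {"technical": (0.3, 0.4, 0.5),
           "economical": (0.2, 0.3, 0.4),
           "social": (0.2, 0.3, 0.4)}

# Hypothetical fuzzy ratings of one payment system per criterion.
ratings = {"technical": (0.6, 0.7, 0.8),
           "economical": (0.5, 0.6, 0.7),
           "social": (0.7, 0.8, 0.9)}

score = (0.0, 0.0, 0.0)
for c in weights:
    score = tfn_add(score, tfn_mul(weights[c], ratings[c]))

print("crisp score:", defuzzify(score))
```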

A Distributed Group Mutual Exclusion Algorithm for Soft Real Time Systems

The group mutual exclusion (GME) problem is an interesting generalization of the mutual exclusion problem. Several solutions to the GME problem have been proposed for message-passing distributed systems. However, none of these solutions is suitable for real-time distributed systems. In this paper, we propose a token-based distributed algorithm for the GME problem in soft real-time distributed systems. The algorithm uses the concepts of a priority queue, a dynamic request set and the process state. It uses a first-come-first-served approach in selecting the next session type among requests of the same priority level and satisfies the concurrent occupancy property. The algorithm allows all n processes to be inside their critical sections (CS) provided they request the same session. The performance analysis and correctness proof of the algorithm are also included in the paper.
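
The following simplified sketch illustrates the flavor of the selection rule only (it is not the paper's full token-based protocol): pending requests are ordered by priority, ties are broken first-come-first-served, and every request for the chosen session is admitted concurrently.

```python
import heapq

# Illustrative selection step only: the token holder keeps a priority queue of
# requests and picks the next session by priority, breaking ties
# first-come-first-served; all requests for that session enter concurrently.

class Request:
    def __init__(self, priority, arrival, process, session):
        self.key = (-priority, arrival)   # higher priority first, then FCFS
        self.process, self.session = process, session

def select_next_session(requests):
    heap = [(r.key, i, r) for i, r in enumerate(requests)]
    heapq.heapify(heap)
    _, _, head = heapq.heappop(heap)
    chosen = head.session
    # every pending request for the same session is admitted concurrently
    admitted = [head.process] + [r.process for _, _, r in heap
                                 if r.session == chosen]
    return chosen, admitted

reqs = [Request(1, 10, "p1", "A"), Request(2, 12, "p2", "B"),
        Request(2, 11, "p3", "B"), Request(1, 9, "p4", "A")]
print(select_next_session(reqs))   # session "B", processes p3 and p2 admitted
```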

Analysis and Research of Two-Level Scheduling Profile for Open Real-Time System

In an open real-time system environment, the coexistence of different kinds of real-time and non-real-time applications confronts the system scheduling mechanism with new requirements and challenges. An existing two-level scheduling scheme for open real-time systems is reviewed, and it is pointed out that because hard and soft real-time applications are scheduled indiscriminately as the same type of real-time application, Quality of Service (QoS) cannot be guaranteed. The scheme has two flaws. First, it cannot differentiate the scheduling priorities of hard and soft real-time applications; that is, it neglects the characteristic differences between hard and soft real-time applications, so it does not suit a more complex real-time environment. Second, the worst-case execution time of soft real-time applications cannot be predicted exactly, so it is not worthwhile to spend significant overhead to ensure that no soft real-time application misses its deadline, and doing so may waste resources. To solve this problem, a novel two-level real-time scheduling mechanism (comprising a scheduling profile and a scheduling algorithm) that adds explicit handling of soft real-time applications is proposed. Finally, we verify the real-time scheduling mechanism both theoretically and experimentally. The results indicate that our scheduling mechanism achieves the following objectives. (1) It reflects the priority difference when scheduling hard and soft real-time applications. (2) It ensures the schedulability of hard real-time applications, that is, their deadline-miss rate is 0. (3) The overall deadline-miss rate of soft real-time applications can be kept below 1. (4) Although no deadline is set for non-real-time applications, the scheduling algorithm used by server S0 can avoid the "starvation" of jobs and increase QoS. In this way, our scheduling mechanism is more compatible with different types of applications and can be applied more widely.
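
A minimal dispatcher sketch of the general idea is given below; the queues, the reserved non-real-time budget and the job data are illustrative assumptions, not the paper's exact scheduling profile.

```python
from collections import deque

# Illustrative dispatcher (assumptions, not the paper's exact profile): hard
# real-time jobs are served earliest-deadline-first and always ahead of soft
# ones; a small reserved budget lets non-real-time jobs run periodically so
# they never starve.

def pick_next(hard, soft, non_rt, nrt_budget):
    """hard/soft: lists of (deadline, job); non_rt: FIFO deque of jobs."""
    if nrt_budget > 0 and non_rt:
        return non_rt.popleft(), 0                  # spend the reserved slot
    if hard:
        hard.sort()
        return hard.pop(0)[1], nrt_budget           # earliest-deadline hard job
    if soft:
        soft.sort()
        return soft.pop(0)[1], nrt_budget           # soft jobs only when no hard
    if non_rt:
        return non_rt.popleft(), nrt_budget
    return None, nrt_budget

hard = [(12, "h1"), (8, "h2")]
soft = [(20, "s1")]
non_rt = deque(["n1"])
job, budget = pick_next(hard, soft, non_rt, nrt_budget=0)
print(job)   # "h2": the hard job with the earliest deadline runs first
```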

A Robust Method for Encrypted Data Hiding Technique Based on Neighborhood Pixels Information

This paper presents a novel method for data hiding based on neighborhood pixel information, which is used to calculate the number of bits available for substitution, together with a modified Least Significant Bit (LSB) technique for data embedding. The modified solution is independent of the nature of the data to be hidden and gives correct results with unnoticeable image degradation. The technique for finding the number of bits available for data hiding uses the green component of the image, as it is less sensitive to the human eye, so it is practically impossible for the human eye to predict whether the image is encrypted or not. The application further encrypts the data using a custom-designed algorithm before embedding the bits into the image for additional security. The overall process consists of three main modules, namely embedding, encryption and extraction.
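
The sketch below illustrates the general idea under simplifying assumptions (it is not the exact published algorithm, and the custom encryption step is omitted): the local contrast of the green-channel neighborhood decides how many least significant bits of each pixel may carry hidden bits.

```python
import numpy as np

# Simplified sketch, not the exact published algorithm: busier 3x3
# neighborhoods in the green channel tolerate more LSB substitution.

def capacity(green, y, x):
    """Number of LSBs (1..4) usable at (y, x), from the 3x3 neighborhood range."""
    block = green[max(0, y-1):y+2, max(0, x-1):x+2].astype(int)
    spread = block.max() - block.min()
    return 1 + min(3, spread // 32)

def embed(green, bits):
    out, i = green.copy(), 0
    h, w = green.shape
    for y in range(h):
        for x in range(w):
            if i >= len(bits):
                return out
            chunk = bits[i:i + capacity(green, y, x)]
            value = int(chunk, 2)
            # clear the lowest len(chunk) bits, then write the payload bits
            out[y, x] = (int(out[y, x]) & ~((1 << len(chunk)) - 1)) | value
            i += len(chunk)
    return out

green = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
stego = embed(green, "1011001110")
```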

A Framework for Data Mining Based Multi-Agent: An Application to Spatial Data

Data mining is an extraordinarily demanding field concerned with the extraction of implicit knowledge and relationships that are not explicitly stored in databases. A wide variety of data mining methods have been introduced (classification, characterization, generalization, ...). Each of these methods includes more than one algorithm. A data mining system involves different user categories, which means that the user's behavior must be a component of the system. The problem at this level is to know which algorithm of which method to employ for an exploratory end, which one for a decisional end, and how they can collaborate and communicate. The agent paradigm presents a new way of conceiving and realizing data mining systems. The purpose is to combine different data mining algorithms to prepare elements for decision-makers, benefiting from the possibilities offered by multi-agent systems. In this paper the agent framework for data mining is introduced, and its overall architecture and functionality are presented. The validation is done on spatial data, and the principal results are presented.
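
A toy sketch of the agent idea follows; the agent and method names are hypothetical and only illustrate how a coordinator could route a user's exploratory or decisional goal to wrapped mining algorithms.

```python
# Hypothetical names, not the paper's architecture: each mining algorithm is
# wrapped in an agent, and a coordinator picks agents according to the goal.

class MiningAgent:
    def __init__(self, name, method):
        self.name, self.method = name, method
    def run(self, data):
        return f"{self.name}: {self.method(data)}"

def classification(data):       # placeholder algorithms
    return f"classified {len(data)} records"

def characterization(data):
    return f"characterized {len(data)} records"

class Coordinator:
    def __init__(self, agents):
        self.agents = agents
    def handle(self, goal, data):
        # exploratory goals get characterization, decisional goals classification
        wanted = {"exploratory": "characterizer", "decisional": "classifier"}[goal]
        return [a.run(data) for a in self.agents if a.name == wanted]

agents = [MiningAgent("classifier", classification),
          MiningAgent("characterizer", characterization)]
print(Coordinator(agents).handle("decisional", data=[1, 2, 3]))
```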

A Fuzzy Approach for Delay Proportion Differentiated Service

There are two paradigms proposed to provide QoS for Internet applications: Integrated Services (IntServ) and Differentiated Services (DiffServ). IntServ is not appropriate for a large network like the Internet because it is very complex. Therefore, to reduce the complexity of QoS management, DiffServ was introduced to provide QoS within a domain using aggregation of flows and per-class service. In these networks the QoS spacing between classes is fixed, which allows low-priority traffic to be affected by high-priority traffic; this is not desirable. In this paper, we propose a fuzzy controller that reduces the effect of the higher-priority classes on the low-priority class. Our simulations show that our approach reduces the latency dependency of the low-priority class on the higher-priority ones in an effective manner.
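
The sketch below gives a minimal fuzzy-style controller of this kind, under assumed membership functions and rule outputs; it adjusts the low-priority scheduling weight so the measured inter-class delay ratio tracks a target proportion.

```python
# Assumed memberships and rules, not the paper's controller: a fuzzy rule base
# adjusts the low-priority weight so the delay ratio tracks a target value.

def tri(x, a, b, c):
    """Triangular membership function."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_adjust(error):
    """error = target_delay_ratio - measured_delay_ratio (low/high delays)."""
    neg, zero, pos = tri(error, -2, -1, 0), tri(error, -1, 0, 1), tri(error, 0, 1, 2)
    # negative error: low-priority delay is above target -> raise its weight
    num = neg * 0.1 + zero * 0.0 + pos * (-0.1)
    den = neg + zero + pos
    return num / den if den else 0.0

weight_low = 0.2
target_ratio, measured_ratio = 2.0, 3.5     # low-priority delay is too large
weight_low += fuzzy_adjust(target_ratio - measured_ratio)
print(round(weight_low, 3))                 # 0.3: more service for low priority
```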

Adaptive Anisotropic Diffusion for Ultrasonic Image Denoising and Edge Enhancement

By exploiting the echo intensity and distribution of different organs and local details of the human body, ultrasonic images can capture important pathological changes, but they are unfortunately affected by ultrasonic speckle noise. A feature-preserving ultrasonic image denoising and edge enhancement scheme is put forth, which includes two terms, anisotropic diffusion and edge enhancement, controlled by the optimum smoothing time. In this scheme, the anisotropic diffusion is governed by a local coordinate transformation and the first- and second-order normal derivatives of the image, while the edge enhancement is performed by a hyperbolic tangent function. Experiments on real ultrasonic images indicate that our scheme effectively preserves edges, local details and ultrasonic echoic bright strips during denoising.
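
For illustration, a standard Perona-Malik style anisotropic diffusion step is sketched below; the paper's scheme additionally uses a local coordinate transformation, normal derivatives and a hyperbolic-tangent edge enhancement term that are not reproduced here.

```python
import numpy as np

# Standard Perona-Malik diffusion, shown only as a generic reference point.

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, lam=0.2):
    u = img.astype(float)
    for _ in range(n_iter):
        # differences to the four neighbours
        dn = np.roll(u, -1, 0) - u
        ds = np.roll(u, 1, 0) - u
        de = np.roll(u, -1, 1) - u
        dw = np.roll(u, 1, 1) - u
        # edge-stopping conductance: small across strong edges
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += lam * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

noisy = np.random.rand(64, 64) * 255
smoothed = anisotropic_diffusion(noisy)
```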

Standard Deviation of Mean and Variance of Rows and Columns of Images for CBIR

This paper describes a novel and effective approach to content-based image retrieval (CBIR) that represents each image in the database by a vector of feature values called the "standard deviation of mean vectors of color distribution of rows and columns of images for CBIR". In many areas of commerce, government, academia, and hospitals, large collections of digital images are being created. This paper describes an approach that uses image contents as the feature vector for retrieval of similar images. Several classes of features are used to specify queries: colour, texture, shape and spatial layout. Colour features are often easily obtained directly from the pixel intensities. In this paper, feature extraction is done for the texture descriptors 'variance' and 'variance of variances'. First, the standard deviation of the row means and of the column means is calculated for the R, G and B planes; these six values form one feature vector per image. Second, the variance of each row and each column of the R, G and B planes is calculated, and the six standard deviations of these variance sequences form a second feature vector of dimension six. We applied our approach to a database of 300 BMP images and assessed the capability of automatic indexing by analyzing image content, using color and texture as features and the Euclidean distance as the similarity measure.
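
A direct sketch of the described feature extraction, assuming an (H, W, 3) RGB array, is given below.

```python
import numpy as np

# Feature vector 1: std of row means and column means per R, G, B plane.
# Feature vector 2: std of the row-variance and column-variance sequences.
# Each vector has dimension six (2 directions x 3 planes).

def features(img):
    f1, f2 = [], []
    for c in range(3):                          # R, G, B planes
        plane = img[:, :, c].astype(float)
        f1.append(plane.mean(axis=1).std())     # std of row means
        f1.append(plane.mean(axis=0).std())     # std of column means
        f2.append(plane.var(axis=1).std())      # std of row variances
        f2.append(plane.var(axis=0).std())      # std of column variances
    return np.array(f1), np.array(f2)

def distance(fa, fb):
    return np.linalg.norm(fa - fb)              # Euclidean similarity measure

query = np.random.randint(0, 256, (128, 128, 3))
db_img = np.random.randint(0, 256, (128, 128, 3))
print(distance(features(query)[0], features(db_img)[0]))
```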

VLSI Design of 2-D Discrete Wavelet Transform for Area-Efficient and High-Speed Image Computing

This paper presents a VLSI design approach for high-speed, real-time 2-D Discrete Wavelet Transform computing. The proposed architecture, based on a new and fast convolution approach, reduces the hardware complexity and shortens the critical path to the multiplier delay. Furthermore, an advanced two-dimensional (2-D) discrete wavelet transform (DWT) implementation, with an efficient memory area, is designed to produce one output in every clock cycle. As a result, very high speed is attained. The system is verified, using the JPEG2000 coefficient filters, on a Xilinx Virtex-II Field Programmable Gate Array (FPGA) device without accessing any external memory. The resulting computing rate is up to 270 Msamples/s, and the (9,7) 2-D wavelet filter uses only 18 kb of memory (16 kb of first-in-first-out memory) for a 256×256 image size. In this way, the developed design requires less memory and provides very high-speed processing as well as high PSNR quality.
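
As a purely software illustration of one separable 2-D DWT level (using the simple Haar filter pair rather than the JPEG2000 (9,7) filters or the paper's fast convolution hardware), consider the following sketch.

```python
import numpy as np

# One 2-D DWT level by separable filtering and downsampling with Haar filters;
# shown only to illustrate the computation the hardware accelerates.

lo = np.array([1.0, 1.0]) / np.sqrt(2)        # Haar low-pass
hi = np.array([1.0, -1.0]) / np.sqrt(2)       # Haar high-pass

def analyze_1d(signal, filt):
    return np.convolve(signal, filt, mode="full")[1::2]   # filter, then downsample

def dwt2_level(img):
    rows_lo = np.apply_along_axis(analyze_1d, 1, img, lo)
    rows_hi = np.apply_along_axis(analyze_1d, 1, img, hi)
    ll = np.apply_along_axis(analyze_1d, 0, rows_lo, lo)
    lh = np.apply_along_axis(analyze_1d, 0, rows_lo, hi)
    hl = np.apply_along_axis(analyze_1d, 0, rows_hi, lo)
    hh = np.apply_along_axis(analyze_1d, 0, rows_hi, hi)
    return ll, lh, hl, hh

img = np.random.rand(256, 256)
ll, lh, hl, hh = dwt2_level(img)
print(ll.shape)   # (128, 128)
```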

Radiation Effect on Unsteady MHD Flow over a Stretching Surface

Unsteady magnetohydrodynamic (MHD) boundary layer flow and heat transfer over a continuously stretching surface in the presence of radiation is examined. By similarity transformation, the governing partial differential equations are transformed into a set of ordinary differential equations. Numerical solutions are obtained by employing the Runge-Kutta-Fehlberg scheme with a shooting technique in the Maple software environment. The effects of the unsteadiness parameter, radiation parameter, magnetic parameter and Prandtl number on the heat transfer characteristics are obtained and discussed. It is found that the heat transfer rate at the surface increases as the Prandtl number and unsteadiness parameter increase but decreases with the magnetic and radiation parameters.
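
The following sketch illustrates the shooting technique on the classical Blasius boundary-layer equation as a stand-in; the paper's transformed equations also contain unsteadiness, magnetic, radiation and Prandtl-number terms that are not shown here.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Shooting on f''' + 0.5 f f'' = 0 with f(0) = f'(0) = 0, f'(inf) = 1:
# the unknown wall value f''(0) is tuned until the far-field condition holds.

def rhs(eta, y):                      # y = [f, f', f'']
    f, fp, fpp = y
    return [fp, fpp, -0.5 * f * fpp]

def far_field_error(fpp0, eta_inf=10.0):
    sol = solve_ivp(rhs, [0, eta_inf], [0.0, 0.0, fpp0], rtol=1e-8)
    return sol.y[1, -1] - 1.0         # f'(eta_inf) should approach 1

fpp0 = brentq(far_field_error, 0.1, 1.0)
print(round(fpp0, 4))                 # approx. 0.3321 for the Blasius problem
```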

Designing a Novel General Sorting Network Constructor Using Artificial Evolution

A method is presented for the construction of arbitrary even-input sorting networks exhibiting better properties than networks created using a conventional technique of the same type. The method was discovered by means of a genetic algorithm combined with an application-specific development. As with human inventions in the area of theoretical computer science, the evolved invention was analyzed: its generality was proven, and its area and time complexities were determined.
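
For context, the sketch below checks a candidate comparator network with the zero-one principle; the network shown is the standard five-comparator four-input network, not the evolved constructor.

```python
from itertools import product

# Zero-one principle: a comparator network sorts every input sequence iff it
# sorts all binary sequences of the same length.

network_4 = [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]   # comparator pairs

def apply_network(network, values):
    v = list(values)
    for i, j in network:
        if v[i] > v[j]:
            v[i], v[j] = v[j], v[i]
    return v

def is_sorting_network(network, n):
    return all(apply_network(network, bits) == sorted(bits)
               for bits in product([0, 1], repeat=n))

print(is_sorting_network(network_4, 4))   # True
```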

Data Mining Using Learning Automata

In this paper a data miner based on learning automata, called LA-miner, is proposed. The LA-miner extracts classification rules from data sets automatically. The proposed algorithm is built on function optimization using learning automata. The experimental results on three benchmarks indicate that the performance of the proposed LA-miner is comparable with (and sometimes better than) that of Ant-miner (a data mining algorithm based on Ant Colony Optimization) and CN2 (a well-known data mining algorithm for classification).
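
The sketch below shows the linear reward-inaction update of a variable-structure learning automaton, the kind of building block such a miner rests on; the environment here is a toy stand-in, not a classification data set.

```python
import random

# Linear reward-inaction (L_RI) learning automaton against a toy environment.

def lri_update(p, chosen, reward, a=0.1):
    if reward:                                   # reinforce the chosen action
        p = [pi + a * (1 - pi) if i == chosen else pi * (1 - a)
             for i, pi in enumerate(p)]
    return p                                     # inaction on penalty

def environment(action):
    # toy environment: action 2 is rewarded most often
    return random.random() < [0.2, 0.4, 0.8][action]

p = [1 / 3, 1 / 3, 1 / 3]
for _ in range(2000):
    action = random.choices(range(3), weights=p)[0]
    p = lri_update(p, action, environment(action))
print([round(x, 2) for x in p])   # probability mass typically concentrates on action 2
```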

Density Clustering Based On Radius of Data (DCBRD)

Clustering algorithms are attractive for the task of class identification in spatial databases. However, their application to large spatial databases raises the following requirements: minimal domain knowledge needed to determine the input parameters, discovery of clusters with arbitrary shape, and good efficiency on large databases. The well-known clustering algorithms offer no solution to the combination of these requirements. In this paper, a density-based clustering algorithm (DCBRD) is presented that relies on knowledge acquired from the data by dividing the data space into overlapped regions. The proposed algorithm discovers arbitrarily shaped clusters, requires no input parameters and uses the same definitions as the DBSCAN algorithm. We performed an experimental evaluation of its effectiveness and efficiency and compared the results with those of DBSCAN. The results of our experiments demonstrate that the proposed algorithm is effective and efficient in discovering clusters of arbitrary shape and size.
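
For comparison purposes, the DBSCAN baseline can be run as follows; note that, unlike DCBRD, it requires eps and min_samples as input parameters. The data here are synthetic.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# DBSCAN baseline on synthetic two-blob data; DCBRD itself is not shown here.

rng = np.random.default_rng(0)
blob_a = rng.normal(loc=(0, 0), scale=0.3, size=(100, 2))
blob_b = rng.normal(loc=(4, 4), scale=0.3, size=(100, 2))
data = np.vstack([blob_a, blob_b])

labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(data)
print(set(labels))   # two clusters (0 and 1); a label of -1 would mark noise
```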

Interactive PTZ Camera Control System Using Wii Remote and Infrared Sensor Bar

This paper proposes an alternative control mechanism for an interactive Pan/Tilt/Zoom (PTZ) camera control system. Instead of using a mouse or a joystick, the proposed mechanism utilizes a Nintendo Wii remote and an infrared (IR) sensor bar. The Wii remote has buttons that allow the user to control the movement of a PTZ camera through Bluetooth connectivity. In addition, the Wii remote has a built-in motion sensor that allows the user to give control signals to the PTZ camera through pitch and roll movements. A stationary IR sensor bar, placed at some distance away opposite the Wii remote, enables the detection of yaw movement. Furthermore, the Wii remote's built-in IR camera can detect its spatial position, and thus generates a control signal when the user moves the Wii remote. Some experiments are carried out and their performance is compared with an industry-standard PTZ joystick.
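
An illustrative mapping from accelerometer readings to PTZ commands is sketched below; the helper names are hypothetical, and the actual Wii Bluetooth and camera control APIs are not shown.

```python
import math

# Hypothetical mapping only: pitch drives tilt speed, roll drives pan speed,
# with a dead zone so small hand tremors do not move the camera.

def pitch_roll(ax, ay, az):
    pitch = math.degrees(math.atan2(ay, math.hypot(ax, az)))
    roll = math.degrees(math.atan2(ax, math.hypot(ay, az)))
    return pitch, roll

def to_ptz_command(ax, ay, az, dead_zone=5.0, gain=0.5):
    pitch, roll = pitch_roll(ax, ay, az)
    tilt = gain * pitch if abs(pitch) > dead_zone else 0.0
    pan = gain * roll if abs(roll) > dead_zone else 0.0
    return {"pan_speed": pan, "tilt_speed": tilt}

# accelerometer sample in g units: remote tilted forward and slightly rolled
print(to_ptz_command(ax=0.15, ay=0.40, az=0.90))
```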

Iterative Process to Improve Simple Adaptive Subdivision Surfaces Method with Butterfly Scheme

Subdivision surfaces are applied to entire meshes in order to produce smooth surface refinements from a coarse mesh. Several schemes have been introduced in this area that provide sets of rules converging to smooth surfaces. However, computing and rendering all the vertices is really inconvenient in terms of memory consumption and runtime during the subdivision process, and leads to a heavy computational load, especially at higher levels of subdivision. Adaptive subdivision is a method that subdivides only certain areas of the mesh while the rest retains fewer polygons. Although adaptive subdivision occurs only in the selected areas, the smoothness of the produced surfaces can be preserved comparably to regular subdivision. Nevertheless, adaptive subdivision is burdened by two costs: calculations are needed to define the areas that require subdivision and to remove the cracks created by the subdivision-depth difference between the selected and unselected areas. Unfortunately, at higher levels of subdivision, adaptive subdivision still suffers from memory consumption problems. This research introduces an iterative adaptive subdivision process that improves the previous adaptive method and reduces the memory consumption for triangular meshes. The resulting iterative process gives acceptably better memory usage and appearance, producing fewer polygons while preserving smooth surfaces.
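
For reference, the classical eight-point butterfly stencil for a new edge vertex on a regular interior edge is sketched below; the adaptive selection and crack removal steps discussed above are not included.

```python
import numpy as np

# Butterfly stencil weights: edge endpoints 1/2, the two opposite vertices of
# the adjacent triangles 1/8, and the four outer "wing" vertices -1/16.

def butterfly_edge_point(endpoints, adjacent, wings):
    p = 0.5 * (endpoints[0] + endpoints[1])
    p += 0.125 * (adjacent[0] + adjacent[1])
    p += -0.0625 * sum(wings)
    return p

endpoints = [np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])]
adjacent = [np.array([0.5, 1.0, 0.0]), np.array([0.5, -1.0, 0.0])]
wings = [np.array([-0.5, 1.0, 0.0]), np.array([1.5, 1.0, 0.0]),
         np.array([-0.5, -1.0, 0.0]), np.array([1.5, -1.0, 0.0])]
print(butterfly_edge_point(endpoints, adjacent, wings))   # [0.5 0.  0. ]
```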

A Logic Based Framework for Planning for Mobile Agents

The objective of the paper is twofold: first, to develop a formal framework for planning for mobile agents, proposing a logical language based on a temporal logic that can express a type of task that often arises in network management; second, to design a planning algorithm for such tasks. The aim of this paper is to study the importance of finding plans for mobile agents. Although there has been a lot of research on mobile agents, not much work has been done to incorporate planning ideas for such agents. This paper makes an attempt in this direction. A theoretical study of finding plans for mobile agents is undertaken. A planning algorithm (based on the paradigm of mobile computing) is proposed, and its space, time, and communication complexities are analyzed. The algorithm is illustrated by working through an example in detail.

Design Techniques and Implementation of Low Power High-Throughput Discrete Wavelet Transform Filters for JPEG 2000 Standard

In this paper, the implementation of low-power, high-throughput convolutional filters for the one-dimensional Discrete Wavelet Transform and its inverse is presented. The analysis filters have already been used for the implementation of a high-performance DWT encoder [15] with minimum memory requirements for the JPEG 2000 standard. This paper presents the design techniques and the implementation of the convolutional filters included in the JPEG 2000 standard for the forward and inverse DWT, achieving low-power operation, high performance and reduced memory accesses. Moreover, the filters are able to perform progressive computations so as to minimize the buffering between the decomposition and reconstruction phases. The experimental results illustrate the filters' low-power, high-throughput characteristics as well as their memory-efficient operation.

DRE - A Quality Metric for Component based Software Products

The overriding goal of software engineering is to provide a high-quality system, application or product. To achieve this goal, software engineers must apply effective methods coupled with modern tools within the context of a mature software process [2]. In addition, it is also essential to assure that high quality is actually achieved. Although many quality measures can be collected at the project level, the important measures are errors and defects. Deriving a quality measure for reusable components has proven to be a challenging task nowadays. The results obtained from the study are based on empirical evidence of reuse practices, as it emerged from the analysis of industrial projects. Both large and small companies, working in a variety of business domains and using object-oriented and procedural development approaches, contributed to this study. This paper proposes a quality metric that provides benefits at both the project and process level, namely defect removal efficiency (DRE).
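
The standard definition of DRE from the software engineering literature is sketched below: E errors found before delivery against D defects found after delivery.

```python
# DRE = E / (E + D), where E is the number of errors found before delivery
# and D is the number of defects found after delivery.

def dre(errors_before_delivery, defects_after_delivery):
    e, d = errors_before_delivery, defects_after_delivery
    return e / (e + d)

print(dre(errors_before_delivery=45, defects_after_delivery=5))   # 0.9
```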

A Web Text Mining Flexible Architecture

Text Mining is an important step of the Knowledge Discovery process. It is used to extract hidden information from unstructured or semi-structured data. This aspect is fundamental because much of the Web information is semi-structured due to the nested structure of HTML code, much of the Web information is linked, and much of the Web information is redundant. Web Text Mining supports the whole knowledge mining process through the mining, extraction and integration of useful data, information and knowledge from Web page contents. In this paper, we present a Web Text Mining process able to discover knowledge in a distributed and heterogeneous multi-organization environment. The Web Text Mining process is based on a flexible architecture and is implemented in four steps able to examine web content and to extract useful hidden information through mining techniques. Our Web Text Mining prototype starts from the retrieval of Web job offers, from which a Text Mining process extracts information useful for their fast classification, essentially the job offer location and the required skills.
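
As a small illustration of the final extraction step, the sketch below pulls a job offer location and required skills out of plain text using hypothetical keyword lists, not the prototype's actual rules.

```python
import re

# Hypothetical keyword-based extraction of job offer place and skills.

SKILLS = {"java", "sql", "html", "python", "xml"}

def extract(offer_text):
    location = None
    m = re.search(r"(?:location|place)\s*[:\-]\s*([A-Za-z ]+)", offer_text, re.I)
    if m:
        location = m.group(1).strip()
    words = set(re.findall(r"[a-zA-Z+#]+", offer_text.lower()))
    return {"place": location, "skills": sorted(words & SKILLS)}

offer = "Web developer wanted. Location: Milan. Required skills: Java, SQL and HTML."
print(extract(offer))   # {'place': 'Milan', 'skills': ['html', 'java', 'sql']}
```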