An Implementation of MacMahon's Partition Analysis in Ordering the Lower Bound of Processing Elements for the Algorithm of LU Decomposition

Many scientific and engineering problems require the efficient solution of large systems of linear equations of the form Ax = b. LU decomposition is a good choice for solving this problem. Our approach is to find the lower bound on the number of processing elements needed for this purpose. We use the so-called Omega (Ω) calculus as a computational method for solving problems via their corresponding Diophantine relations. From the corresponding algorithm, a system of linear Diophantine equalities is formed using the domain of computation, which is given by the set of lattice points inside a polyhedron. The Mathematica program DiophantineGF.m is then run. This program calculates the generating function from which the number of solutions to the system of Diophantine equalities can be found; this number in fact gives the lower bound on the number of processors needed for the corresponding algorithm. A mathematical explanation of the problem is given as well.

Keywords: generating function, lattice points in polyhedron, lower bound of processor elements, system of Diophantine equations, Ω calculus.
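
As a rough illustration of the counting task (not the authors' Omega-calculus method, which derives the count symbolically via generating functions), a brute-force enumeration of the lattice points satisfying a small system of linear Diophantine equalities could look like the following Python sketch; the example system is hypothetical.

    # Brute-force count of lattice points x in [0, bound]^n with A.x = b.
    # Only a sketch: Omega calculus / DiophantineGF.m obtain this count
    # symbolically rather than by enumeration.
    from itertools import product

    def count_lattice_points(A, b, bound):
        n = len(A[0])
        return sum(
            1
            for x in product(range(bound + 1), repeat=n)
            if all(sum(aij * xj for aij, xj in zip(row, x)) == bi
                   for row, bi in zip(A, b))
        )

    # Hypothetical system: x0 + x1 + x2 = 4 and x0 - x2 = 0.
    print(count_lattice_points([[1, 1, 1], [1, 0, -1]], [4, 0], bound=4))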

A General Framework for Modeling Replicated Real-Time Database

There are many issues that affect the modeling and design of real-time databases. One of these issues is maintaining consistency between the actual state of a real-time object in the external environment and its images as reflected by all of its replicas distributed over multiple nodes. The need to improve scalability is another important issue. In this paper, we present a general framework for designing a replicated real-time database for small to medium scale systems while maintaining all timing constraints. To extend the idea to modeling a large scale database, we present a general outline that improves scalability by applying an existing static segmentation algorithm to the whole database with the intent of lowering the degree of replication, and that enables segments to have individual degrees of replication in order to avoid excessive resource usage; together, these measures contribute to solving the scalability problem for distributed real-time database systems (DRTDBS).
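
A minimal sketch of the per-segment replication idea follows, assuming a simple round-robin placement; the segment names, degrees, and node identifiers are illustrative and are not the paper's segmentation algorithm.

    # Each segment carries its own replication degree; replicas are then
    # placed round-robin over the nodes (placement policy is assumed).
    from dataclasses import dataclass

    @dataclass
    class Segment:
        name: str
        replication_degree: int  # number of nodes holding a replica

    def assign_replicas(segments, nodes):
        placement = {}
        for i, seg in enumerate(segments):
            placement[seg.name] = [nodes[(i + k) % len(nodes)]
                                   for k in range(seg.replication_degree)]
        return placement

    segments = [Segment("hot_data", 3), Segment("cold_data", 1)]  # hypothetical
    print(assign_replicas(segments, ["n1", "n2", "n3", "n4"]))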

Liveness Detection for Embedded Face Recognition System

To increase the reliability of a face recognition system, the system must be able to distinguish a real face from a copy of a face such as a photograph. In this paper, we propose a fast and memory-efficient method of live face detection for embedded face recognition systems, based on the analysis of the movement of the eyes. We detect eyes in sequential input images and calculate the variation of each eye region to determine whether the input face is a real face or not. Experimental results show that the proposed approach is competitive and promising for live face detection.
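
A minimal OpenCV sketch of the eye-variation idea: detect an eye region per frame and check how much it changes across the sequence. The Haar cascade, patch size, and threshold below are assumptions, not the paper's tuned method.

    # Liveness check sketch: a static photograph yields near-identical
    # eye patches across frames, while a live face (blinking, movement)
    # produces measurable variation.
    import cv2
    import numpy as np

    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    def first_eye_patch(frame):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        boxes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1,
                                             minNeighbors=5)
        if len(boxes) == 0:
            return None
        x, y, w, h = boxes[0]
        return cv2.resize(gray[y:y + h, x:x + w], (24, 24)).astype(float)

    def is_live(frames, threshold=15.0):  # threshold is a guess
        patches = [p for p in map(first_eye_patch, frames) if p is not None]
        if len(patches) < 2:
            return False
        diffs = [np.mean(np.abs(a - b)) for a, b in zip(patches, patches[1:])]
        return float(np.mean(diffs)) > threshold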

Design and Implementation of Project Time Management Risk Assessment Tool for SME Projects using Oracle Application Express

The Risk Assessment Tool (RAT) is an expert system that assesses, monitors, and gives preliminary treatments automatically based on the project plan. In this paper, we review current project time management risk assessment tools for SME software development projects; analyze risk assessment parameters, conditions, and scenarios; and finally propose a Risk Assessment Tool (RAT) model to assess, treat, and monitor risks. A prototype system is implemented to validate the model.
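
By way of illustration, a single time-risk rule of the kind such an expert system might encode is sketched below; the rule and its thresholds are invented for this sketch and are not taken from the RAT knowledge base.

    # One hypothetical schedule-risk rule applied to a task in the plan.
    def assess_schedule_risk(task):
        slack = task["deadline_days"] - task["estimate_days"]
        if slack < 0:
            return "high", "re-plan: estimate already exceeds deadline"
        if slack < 0.1 * task["estimate_days"]:
            return "medium", "add buffer or reduce scope"
        return "low", "monitor"

    print(assess_schedule_risk({"estimate_days": 20, "deadline_days": 21}))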

Nuclear Medical Image Treatment System Based On FPGA in Real Time

We present in this paper an acquisition and treatment system designed for a semi-analog gamma camera. It consists of a nuclear medical Image Acquisition, Treatment and Display (IATD) chain ensuring the acquisition, the treatment of the signals resulting from the gamma camera detection head (DH), and the scintigraphic image construction in real time. The chain is composed of an analog treatment board and a digital treatment board, and it interfaces with semi-analog cameras of Sopha Medical Vision (SMVi), taking the SOPHY DS7 as an example. An earlier version of the chain was designed around a DSP [2]. In this paper we present the architecture of a new version of the IATD chain, in which the digital treatment algorithms, whose performance and flexibility we have improved, are implemented in a reprogrammable FPGA (Field Programmable Gate Array).
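
The real-time image construction step amounts to accumulating detected events into a 2-D count image; a minimal sketch, assuming events arrive as (x, y) positions and a 256x256 image, is given below.

    # Scintigraphic image construction sketch: one count per detected
    # scintillation event (event format and image size are assumptions).
    import numpy as np

    def build_image(events, size=(256, 256)):
        img = np.zeros(size, dtype=np.uint32)
        for x, y in events:
            if 0 <= x < size[1] and 0 <= y < size[0]:
                img[y, x] += 1
        return img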

Palmprint Recognition by Wavelet Transform with Competitive Index and PCA

This manuscript presents palmprint recognition that combines different texture extraction approaches with high accuracy. The Region of Interest (ROI) is decomposed into different frequency-time sub-bands by wavelet transform up to two levels, and only the level-two approximation image is selected; this is known as the Approximate Image ROI (AIROI). The AIROI carries the information of the principal lines of the palm. The Competitive Index is used as the palmprint feature: six Gabor filters of different orientations are convolved with the palmprint image to extract orientation information, and a winner-take-all strategy selects the dominant orientation for each pixel, which is known as the Competitive Index. Further, PCA is applied to select highly uncorrelated Competitive Index features, to reduce the dimensions of the feature vector, and to project the features onto the eigenspace. The similarity of two palmprints is measured by the Euclidean distance metric. The algorithm is tested on the Hong Kong PolyU palmprint database. AIROIs from different wavelet filter families are also tested with the Competitive Index and PCA. The AIROI of the db7 wavelet filter achieves an Equal Error Rate (EER) of 0.0152% and a Genuine Acceptance Rate (GAR) of 99.67% on the Hong Kong PolyU palm database.
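
A sketch of the Competitive Index computation, assuming db7 wavelet decomposition via PyWavelets and OpenCV Gabor kernels; the filter parameters are plausible guesses rather than the paper's tuned values.

    # Per pixel, take the index of the strongest of six Gabor orientation
    # responses on the level-2 wavelet approximation (winner-take-all).
    import numpy as np
    import cv2
    import pywt

    def competitive_index(roi):
        airoi = pywt.wavedec2(roi.astype(float), "db7", level=2)[0]
        responses = []
        for k in range(6):  # six orientations, pi/6 apart
            kern = cv2.getGaborKernel((31, 31), 4.0, k * np.pi / 6,
                                      10.0, 0.5, 0)
            responses.append(cv2.filter2D(airoi.astype(np.float32),
                                          cv2.CV_32F, kern))
        return np.argmax(np.stack(responses), axis=0)

PCA on the flattened index maps (e.g. scikit-learn's PCA) and Euclidean matching would then follow.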

Simulating a Single-Server Queue using the Q-Simulator

This paper introduces a technique for simulating a single-server exponential queuing system. The technique, called the Q-Simulator, is a computer program that can simulate the effect of traffic intensity on all system average quantities given the arrival and/or service rates. The Q-Simulator has three phases, namely the formula-based method, the uncontrolled simulation, and the controlled simulation. The Q-Simulator generates graphs (crystal solutions) for all results of the simulation or calculation and can be used to estimate desirable average quantities such as waiting times, queue lengths, etc.
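
For reference, the simulation phase of such a tool reduces, in its simplest form, to an event-driven M/M/1 simulation like the sketch below; the rates are illustrative and this is not the Q-Simulator's own code.

    # Average waiting time in queue for an M/M/1 system by simulation.
    import random

    def simulate_mm1(lam, mu, n_customers=100_000, seed=1):
        random.seed(seed)
        arrival = depart_prev = total_wait = 0.0
        for _ in range(n_customers):
            arrival += random.expovariate(lam)       # next arrival epoch
            start = max(arrival, depart_prev)        # wait if server busy
            total_wait += start - arrival
            depart_prev = start + random.expovariate(mu)
        return total_wait / n_customers

    # Formula check: Wq = rho/(mu - lam) = 0.8/0.2 = 4.0 for lam=0.8, mu=1.
    print(simulate_mm1(lam=0.8, mu=1.0))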

Search Engine Module in Voice Recognition Browser to Facilitate the Visually Impaired in Virtual Learning (MGSYS VISI-VL)

Nowadays, web-based technologies influence people's daily lives in areas such as education and business. Many web developers are eager to build web applications full of animated graphics, forgetting about accessibility for their users; their purpose is to make their applications look impressive. This paper therefore highlights the usability and accessibility of a voice recognition browser as a tool to facilitate visually impaired and blind learners in accessing a virtual learning environment. More specifically, the objectives of the study are (i) to explore the challenges faced by visually impaired learners in accessing a virtual learning environment and (ii) to determine suitable guidelines for developing a voice recognition browser that is accessible to the visually impaired. The study is based on observations conducted with Malaysian visually impaired learners. Finally, the results of this study inform the development of an accessible voice recognition browser for the visually impaired.

A Design of Fractional-Order PI Controller with Error Compensation

Fractional-order controllers have been shown to perform better than integer-order controllers. However, the absence of a pole at the origin produces a marginal steady-state error in a fractional-order control system. This study demonstrates the advantage of the fractional-order PI over the integer-order PI in steam temperature control. The fractional-order controller is cascaded with an error compensator comprising a very small zero and a pole at the origin to produce zero steady-state error for the closed-loop system. Modifications to the error compensator are suggested for fractional integrators of different orders to improve the overall phase margin.
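
For concreteness, the cascaded structure described above can be written as follows, with λ the fractional order and ε the very small zero (our notation, not necessarily the paper's):

    % Sketch of the compensated fractional-order PI controller;
    % epsilon marks the small zero, the 1/s factor the added origin pole.
    \[
      C(s) = \left( K_p + \frac{K_i}{s^{\lambda}} \right) \cdot \frac{s + \epsilon}{s},
      \qquad 0 < \lambda < 1, \quad 0 < \epsilon \ll 1 .
    \]

The pole at the origin raises the system type so the steady-state step error vanishes, while the nearby zero limits the compensator's phase contribution at mid frequencies.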

B-VIS Service-oriented Middleware for RFID Sensor Network

One of the most important intelligent in-car and roadside systems is the cooperative vehicle-infrastructure system. In Thailand, ITS technologies are growing rapidly, and real-time vehicle information is in considerable demand for ITS applications, for example vehicle fleet tracking and control and road traffic monitoring systems. This paper defines the communication protocols and software design for the middleware components of B-VIS (Burapha Vehicle-Infrastructure System). The proposed B-VIS middleware architecture serves the needs of a distributed RFID sensor network and simplifies some intricate details of several communication standards.
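
As a hypothetical sketch of one middleware endpoint (the actual B-VIS protocols are defined in the paper and not reproduced here), a line-delimited JSON relay for RFID reads could look like this:

    # Hypothetical TCP endpoint accepting one JSON-encoded RFID read per
    # line; the field names and port are assumptions.
    import json
    import socketserver

    class RfidHandler(socketserver.StreamRequestHandler):
        def handle(self):
            for line in self.rfile:
                event = json.loads(line)
                print(f"tag={event['tag_id']} reader={event['reader_id']}")

    if __name__ == "__main__":
        with socketserver.TCPServer(("0.0.0.0", 9000), RfidHandler) as srv:
            srv.serve_forever()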

A Decision Boundary based Discretization Technique using Resampling

Many supervised induction algorithms require discrete data, even though real data often comes in both discrete and continuous formats. Quality discretization of continuous attributes is an important problem that affects the speed, accuracy, and understandability of induction models. Usually, discretization and other statistical processes are applied to subsets of the population, as the entire population is practically inaccessible. For this reason, we argue that a discretization performed on a sample of the population is only an estimate for the entire population. Most existing discretization methods partition the attribute range into two or more intervals using a single cut point or a set of cut points. In this paper, we introduce a technique that uses resampling (such as the bootstrap) to generate a set of candidate discretization points, thus improving discretization quality by providing a better estimate with respect to the entire population. The goal of this paper is to observe whether resampling can lead to better discretization points, which would open up a new paradigm for the construction of soft decision trees.
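
A minimal sketch of the resampling idea, assuming a deliberately simple base splitter (a class-boundary heuristic, not the paper's discretizer): each bootstrap replicate of the sample yields a candidate cut point, and the collection estimates the population cut point more stably.

    # Bootstrap a set of candidate cut points for one continuous attribute.
    import numpy as np

    def best_cut(x, y):
        """Class-boundary midpoint nearest the median (toy base splitter)."""
        order = np.argsort(x)
        xs, ys = x[order], y[order]
        cuts = [(xs[i] + xs[i + 1]) / 2
                for i in range(len(xs) - 1) if ys[i] != ys[i + 1]]
        return min(cuts, key=lambda c: abs(c - np.median(xs)))

    def bootstrap_cuts(x, y, n_boot=200, seed=0):
        rng = np.random.default_rng(seed)
        return np.array([best_cut(x[idx], y[idx])
                         for idx in (rng.integers(0, len(x), len(x))
                                     for _ in range(n_boot))])

    rng = np.random.default_rng(1)
    x = np.r_[rng.normal(0, 1, 100), rng.normal(3, 1, 100)]
    y = np.r_[np.zeros(100), np.ones(100)]
    print(bootstrap_cuts(x, y).mean())   # stabilized cut point estimate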

A Design-Based Cohesion Metric for Object-Oriented Classes

Class cohesion is an important object-oriented software quality attribute: it indicates how closely the members of a class are related. Assessing class cohesion and improving class quality accordingly during the object-oriented design phase allows for cheaper management of the later phases. In this paper, the notion of distance between pairs of methods and pairs of attribute types in a class is introduced and used as a basis for a novel class cohesion metric. The metric considers method-method, attribute-attribute, and attribute-method direct interactions. It is shown that the metric gives more sensitive values than other well-known design-based class cohesion metrics.
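
To make the flavor of such a metric concrete, here is a simplified stand-in (not the paper's metric): each method is represented by the set of attributes it uses, and cohesion is one minus the average pairwise Jaccard distance.

    # Toy distance-based cohesion: 1 - mean Jaccard distance over method pairs.
    from itertools import combinations

    def cohesion(method_attrs):
        pairs = list(combinations(method_attrs.values(), 2))
        if not pairs:
            return 1.0
        def dist(a, b):
            return 1 - len(a & b) / len(a | b) if (a | b) else 0.0
        return 1 - sum(dist(a, b) for a, b in pairs) / len(pairs)

    print(cohesion({"open": {"path", "mode"},
                    "read": {"path", "buf"},
                    "close": {"path"}}))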

An Adaptive Fuzzy Clustering Approach for the Network Management

Chiu's method, which generates a Takagi-Sugeno Fuzzy Inference System (FIS), is a fuzzy rule extraction method. The rule outputs are linear functions of the inputs, and in addition such rules are not explicit for the expert. In this paper, we develop a method that generates a Mamdani FIS, where the rule outputs are fuzzy. The method proceeds in two steps: first, it uses the subtractive clustering principle to estimate both the number of clusters and the initial locations of the cluster centers, where each obtained cluster corresponds to a Mamdani fuzzy rule; then, it optimizes the fuzzy model parameters by applying a genetic algorithm. The method is illustrated on a traffic network management application. We also suggest a Mamdani fuzzy rule generation method for the case where the expert wants to classify the output variables into fuzzy predefined classes.
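
A sketch of the first step, standard subtractive clustering, follows; the radii and stopping ratio are common defaults, not necessarily the paper's settings.

    # Each point's potential sums Gaussian contributions from all points;
    # the highest-potential point becomes a center (one Mamdani rule) and
    # nearby potential is subtracted before the next pick.
    import numpy as np

    def subtractive_clustering(X, ra=0.5, rb=0.75, stop_ratio=0.15):
        alpha, beta = 4 / ra**2, 4 / rb**2
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        P = np.exp(-alpha * d2).sum(axis=1)
        centers, p_first = [], P.max()
        while P.max() > stop_ratio * p_first:
            i = int(P.argmax())
            centers.append(X[i])
            P = P - P[i] * np.exp(-beta * d2[:, i])
        return np.array(centers)

    X = np.random.default_rng(0).random((200, 2))
    print(len(subtractive_clustering(X)), "initial Mamdani rules")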

Simulating Discrete Time Model Reference Adaptive Control System with Great Initial Error

This article is based on the Discrete Parameter Tracking (DPT) technique, first introduced by A. A. Azab [8], which is applicable to a lower-order reference model: the order of the reference model is (n-1), where n is the number of adjustable parameters in the physical plant. The technique utilizes a modified gradient method [9] in which knowledge of the exact order of the non-adaptive system is not required, so as to eliminate the identification problem. The applicability of the DPT technique has been examined through the solution of several problems. This article presents the solution of a third-order system with three adjustable parameters, controlled according to a second-order reference model. The adjustable parameters have a large initial error, which represents a demanding condition. Computer simulations for the solution and analysis are provided to demonstrate the simplicity and feasibility of the technique.
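
For orientation only, a textbook discrete gradient (MIT-rule style) parameter update is sketched below; it stands in for, and is not identical to, the DPT update of [8].

    # Generic gradient step driving the plant output toward the model output.
    def adapt_step(theta, y_plant, y_model, phi, gamma=0.01):
        """theta: adjustable parameters, phi: sensitivity/regressor vector."""
        error = y_plant - y_model
        return [t - gamma * error * p for t, p in zip(theta, phi)]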

Target Concept Selection by Property Overlap in Ontology Population

An ontology is widely used in many kinds of applications as a knowledge representation tool for domain knowledge. However, even when an ontology schema is well prepared by domain experts, it is tedious and cost-intensive to add instances to the ontology. The most reliable and trustworthy way to add instances to an ontology is to gather them from tables in related Web pages. In automatic instance population, the primary task is to find the most appropriate concept, among all possible concepts within the ontology, for a given table. This paper proposes a novel method for this problem that defines the similarity between a table and a concept using the overlap of their properties. In a series of experiments, the proposed method achieves 76.98% accuracy, which implies that it is a plausible way to populate an ontology automatically from Web tables.
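
A minimal sketch of property-overlap scoring, assuming table column headers stand in for table properties; the ontology fragment below is hypothetical.

    # Pick the concept whose property set best overlaps the table columns.
    def overlap_score(table_cols, concept_props):
        a, b = set(table_cols), set(concept_props)
        return len(a & b) / len(a | b)

    def select_concept(table_cols, ontology):
        return max(ontology, key=lambda c: overlap_score(table_cols, ontology[c]))

    ontology = {"Person": {"name", "birthdate", "nationality"},
                "Company": {"name", "founded", "headquarters"}}
    print(select_concept({"name", "birthdate"}, ontology))  # -> Person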

Semantic Web Agent Communication Capable of Reasoning with Ontology and Agent Locations

Multi-agent communication of Semantic Web information cannot be realized without reasoning about ontology and agent locations. For an agent to be able to reason with an external Semantic Web ontology, it must know where and how to access that ontology; similarly, for an agent to be able to communicate with another agent, it must know where and how to send a message to that agent. In this paper we propose a framework for an agent that can reason with ontology and agent locations in order to perform reasoning with multiple distributed ontologies and communicate with other agents on the Semantic Web. The agent framework and its communication mechanism are formulated entirely in meta-logic.

Ground System Software for Unmanned Aerial Vehicles on Android Device

A Ground Control System (GCS), which controls Unmanned Aerial Vehicles (UAVs) and monitors their mission-related data, is one of the major components of a UAV system. Traditional GCSs have often been built on expensive, complicated hardware infrastructures with workstations and PCs. In contrast, a GCS on a portable device, such as an Android phone or tablet, takes advantage of light-weight hardware and the rich user interface supported by the Android operating system. We implemented such a GCS and call it the Ground System Software (GSS) in this paper. In operation, our GSS communicates with UAVs or other GSS instances via TCP/IP connections to obtain mission-related data, visualizes it on the device's screen, and saves the data in its own database. Our study suggests that this kind of system has the potential to become a useful instrument in UAV-related systems and will appear in many research studies in the near future.
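
A hypothetical sketch of the GSS data path in Python (the real GSS is an Android application; host, port, message format, and schema are all assumptions): read telemetry over TCP and store it locally.

    # Read comma-separated "ts,lat,lon" telemetry lines and persist them.
    import socket
    import sqlite3

    db = sqlite3.connect("gss.db")
    db.execute("CREATE TABLE IF NOT EXISTS telemetry (ts REAL, lat REAL, lon REAL)")

    with socket.create_connection(("uav.local", 5760)) as s:  # hypothetical host
        for line in s.makefile():
            ts, lat, lon = map(float, line.split(","))
            db.execute("INSERT INTO telemetry VALUES (?, ?, ?)", (ts, lat, lon))
            db.commit()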

Parametric Modeling Approach for Call Holding Times for IP based Public Safety Networks via EM Algorithm

This paper presents parametric probability density models for call holding times (CHTs) in an emergency call center, based on actual data collected for over a week in the public Emergency Information Network (EIN) in Mongolia. When a set of chosen candidates from the Gamma distribution family is fitted to the call holding time data, the whole area of the CHT empirical histogram is underestimated, owing to spikes of higher probability and long tails of lower probability in the histogram. We therefore propose a parametric model based on a mixture of lognormal distributions, with explicit analytical expressions, for modeling the CHTs of public safety networks (PSNs). Finally, we show that the CHTs for PSNs are fitted reasonably well by a mixture of lognormal distributions via the expectation-maximization (EM) algorithm. This result is significant in that it provides a useful mathematical tool, a mixture of lognormal distributions, in an explicit form.
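
Because the logarithm of a lognormal variable is Gaussian, fitting the mixture reduces to Gaussian-mixture EM on log-transformed holding times; a sketch with synthetic data and scikit-learn follows (the component count and parameters are illustrative).

    # EM fit of a 2-component lognormal mixture via GMM on log(CHT).
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    cht = np.concatenate([rng.lognormal(3.0, 0.4, 2000),   # bulk of calls
                          rng.lognormal(4.5, 0.6, 1000)])  # long-tail calls

    gmm = GaussianMixture(n_components=2, random_state=0)
    gmm.fit(np.log(cht).reshape(-1, 1))
    print("weights:", gmm.weights_)
    print("lognormal mu's:", gmm.means_.ravel())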

A Hybrid Feature Selection by Resampling, Chi squared and Consistency Evaluation Techniques

In this paper, a combined feature selection method is proposed that takes advantage of sample domain filtering, resampling, and feature subset evaluation to reduce the dimensions of huge datasets and select reliable features. The method utilizes both the feature space and the sample domain to improve feature selection and uses a combination of chi-squared and consistency attribute evaluation to seek reliable features. It consists of two phases: the first phase filters and resamples the sample domain, and the second phase adopts a hybrid procedure to find the optimal feature space by applying chi-squared scoring, consistency subset evaluation, and genetic search. Experiments on datasets of various sizes from the UCI Repository of Machine Learning databases show that the performance of five classifiers (Naïve Bayes, Logistic, Multilayer Perceptron, Best-First Decision Tree, and JRip) improves simultaneously and that the classification error for these classifiers decreases considerably. The experiments also show that this method outperforms other feature selection methods.
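
As one concrete piece of such a pipeline, the chi-squared scoring stage can be realized with scikit-learn as below; the dataset is a stand-in, and the resampling, consistency evaluation, and genetic search stages are not reproduced.

    # Chi-squared feature scoring and top-k selection.
    from sklearn.datasets import load_iris
    from sklearn.feature_selection import SelectKBest, chi2

    X, y = load_iris(return_X_y=True)
    selector = SelectKBest(chi2, k=2).fit(X, y)
    print("chi2 scores:", selector.scores_)
    print("kept features:", selector.get_support(indices=True))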

Image Magnification Using Adaptive Interpolation by Pixel Level Data-Dependent Geometrical Shapes

The world has entered the 21st century. The technology of computer graphics and digital cameras is prevalent, and high-resolution displays and printers are available. High-resolution images are therefore needed in order to produce high-quality displayed images and high-quality prints. However, since high-resolution images are not usually available, there is a need to magnify the original images. One common difficulty in previous magnification techniques is preserving details, i.e., edges, while at the same time smoothing the data so as not to introduce spurious artefacts; a definitive solution to this is still an open issue. In this paper, an image magnification method using adaptive interpolation by pixel-level data-dependent geometrical shapes is proposed that takes into account information about edges (sharp luminance variations) and the smoothness of the image. It calculates a threshold, classifies the interpolation region into geometrical shapes, and then assigns suitable values to the undefined pixels while preserving sharp luminance variations and smoothness at the same time. The results of the proposed technique have been compared qualitatively and quantitatively with five other techniques. The qualitative results show that the proposed method clearly outperforms nearest-neighbour (NN), bilinear (BL), and bicubic (BC) interpolation, while the quantitative results are competitive and consistent with NN, BL, BC, and the others.
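
A much-simplified sketch of the edge-aware idea for 2x magnification follows; it is a stand-in for the paper's data-dependent geometrical shapes, with an arbitrary gradient threshold.

    # Where the local gradient is large, fill the center pixel from the
    # smoother diagonal of its 2x2 block; elsewhere use a plain average.
    import numpy as np

    def magnify2x(img, thresh=20.0):
        img = img.astype(float)
        h, w = img.shape
        out = np.zeros((2 * h - 1, 2 * w - 1))
        out[::2, ::2] = img
        gy, gx = np.gradient(img)
        for i in range(h - 1):
            for j in range(w - 1):
                b = img[i:i + 2, j:j + 2]
                if max(abs(gx[i, j]), abs(gy[i, j])) > thresh:
                    d1 = abs(b[0, 0] - b[1, 1])    # main diagonal contrast
                    d2 = abs(b[0, 1] - b[1, 0])    # anti-diagonal contrast
                    out[2*i + 1, 2*j + 1] = ((b[0, 0] + b[1, 1]) / 2
                                             if d1 < d2
                                             else (b[0, 1] + b[1, 0]) / 2)
                else:
                    out[2*i + 1, 2*j + 1] = b.mean()
        out[::2, 1::2] = (img[:, :-1] + img[:, 1:]) / 2   # row midpoints
        out[1::2, ::2] = (img[:-1, :] + img[1:, :]) / 2   # column midpoints
        return out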