Model Inversion of a Two Degrees of Freedom Linearized PUMA from Bicausal Bond Graphs

A bond graph model of a two-degrees-of-freedom PUMA manipulator is described. System inversion yields the system input required to generate a given system output. To obtain the inversion of the PUMA manipulator, a linearization of the nonlinear bond graph is first obtained; bicausality is then applied to the linearized bond graph. The resulting bicausal bond graph provides a systematic way of generating the equations of the inverse system. Simulation results verifying the calculated input for a given output are shown.
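For intuition, inversion can be illustrated on a generic linear state-space model (a hedged sketch, not the authors' bond-graph derivation; A, B, C, D, x, u, y are the usual generic symbols): for \dot{x} = Ax + Bu, y = Cx + Du with an invertible feedthrough matrix D, the inverse system driven by the desired output y is

    \dot{x} = (A - B D^{-1} C)\,x + B D^{-1}\,y, \qquad
    u = -D^{-1} C\,x + D^{-1}\,y .

When D = 0, as is typical when joint positions are the outputs of manipulator dynamics, the desired output must be differentiated before such an inverse exists; propagating this requirement through the model is precisely what bicausal causality assignment systematizes.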

Forces Association-Based Active Contour

A welded structure must be inspected to guarantee that the weld quality meets the design requirements for safety and reliability. However, X-ray image analysis and defect recognition with computer vision techniques are very complex. Most difficulties lie in finding the small, irregular defects in poor-contrast images, which requires preprocessing the image to extract and classify features against strong background noise. This paper addresses the design of a methodology to extract defects from the noisy background of radiographs using image processing. Based on the use of active contours, this methodology appears to give good results.
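As a rough illustration of the contour-based extraction step (a minimal sketch using scikit-image, not the authors' implementation; the file name and contour initialization are assumptions):

    # Fit an active contour (snake) around a suspected defect in a radiograph.
    import numpy as np
    from skimage import io, filters, segmentation

    img = io.imread("radiograph.png", as_gray=True)    # hypothetical input image
    smoothed = filters.gaussian(img, sigma=3)          # suppress background noise

    # Initial contour: a circle placed roughly around the suspected defect.
    s = np.linspace(0, 2 * np.pi, 200)
    init = np.column_stack([100 + 40 * np.sin(s), 120 + 40 * np.cos(s)])  # (row, col)

    # Internal stiffness (alpha, beta) balances the image forces pulling
    # the snake towards the defect boundary.
    snake = segmentation.active_contour(smoothed, init,
                                        alpha=0.015, beta=10, gamma=0.001)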

Image Contrast Enhancement based Sub-histogram Equalization Technique without Over-equalization Noise

To enhance the contrast in regions where pixels have similar intensities, this paper presents a new histogram equalization scheme. Conventional global equalization schemes over-equalize such regions, producing excessively bright or dark pixels, while local equalization schemes produce unexpected discontinuities at block boundaries. The proposed algorithm segments the original histogram into sub-histograms with reference to brightness level and equalizes each sub-histogram within a limited extent of equalization determined by its mean and variance. The final image is the weighted sum of the equalized images obtained from the sub-histogram equalizations. By limiting the maximum and minimum ranges of the equalization operations on individual sub-histograms, the over-equalization effect is eliminated. The resulting image also preserves feature information in low-density histogram regions, since these regions are equalized separately. The paper also describes how to determine the segmentation points in the histogram. The proposed algorithm has been tested on more than 100 images with various contrasts, and the results are compared with conventional approaches to show its superiority.
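A minimal sketch of the core idea (assuming a single split point; the paper derives the split points, per-part limits, and weights from the mean and variance of each sub-histogram, which is not reproduced here):

    import numpy as np

    def sub_histogram_equalize(img, split):
        # img: uint8 grayscale array; split: brightness level dividing the histogram.
        out = img.copy()
        for lo, hi in [(0, split), (split + 1, 255)]:
            mask = (img >= lo) & (img <= hi)
            if not mask.any():
                continue
            hist, _ = np.histogram(img[mask], bins=hi - lo + 1, range=(lo, hi + 1))
            cdf = np.cumsum(hist) / hist.sum()
            # Each sub-histogram is mapped onto its own range only, which is
            # what bounds the equalization and prevents over-enhancement.
            out[mask] = (lo + cdf[img[mask] - lo] * (hi - lo)).astype(np.uint8)
        return out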

Modeling of Reusability of Object Oriented Software System

Automatic reusability appraisal is helpful in evaluating the quality of developed or developing reusable software components and in identifying reusable components in existing legacy systems, which can save the cost of developing software from scratch. However, the issue of how to identify reusable components in existing systems has remained relatively unexplored. In this work, structural attributes of software components are measured using software metrics, and software quality is inferred by different neural-network-based approaches taking the metric values as input. The calculated reusability value makes it possible to identify good-quality code automatically. The reusability values obtained are found to be close to those of the manual analysis traditionally performed by programmers or repository managers, so the developed system can be used to enhance the productivity and quality of software development.
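A hedged sketch of the metrics-to-quality inference (the actual metric suite, network architectures, and score scale in the paper are not reproduced; synthetic data stands in for real components):

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.random((60, 4))   # stand-in metric values (e.g. complexity, coupling, ...)
    y = X @ np.array([0.4, -0.3, 0.5, 0.2]) + 0.05 * rng.standard_normal(60)

    # Metric values in, reusability score out.
    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16, 8),
                                       max_iter=2000, random_state=0))
    model.fit(X, y)
    reusability = model.predict(X[:5])   # flag high scorers as reuse candidates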

Fusion of Colour and Depth Information to Enhance Wound Tissue Classification

Patients with diabetes are susceptible to chronic foot wounds, which may be difficult to manage and slow to heal. Diagnosis and treatment currently rely on the subjective judgement of experienced professionals, so an objective method of tissue assessment is required. In this paper, a data fusion approach was taken to wound tissue classification. The supervised Maximum Likelihood and unsupervised Multi-Modal Expectation Maximisation algorithms were used to classify tissues within simulated wound models by weighting the contributions of both colour and 3D depth information. It was found that, at low weightings, depth information yielded significant improvements in classification accuracy compared with classification by colour alone, particularly with the Maximum Likelihood method. Larger weightings, however, had an entirely negative effect on accuracy.
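A condensed sketch of one plausible fusion rule (the paper's exact weighting scheme is not reproduced; here the colour and depth log-likelihoods of each tissue class are simply mixed with a depth weight w):

    import numpy as np
    from scipy.stats import multivariate_normal, norm

    def ml_classify(rgb, depth, colour_models, depth_models, w):
        # rgb: (N, 3) pixel colours; depth: (N,) depth values; w: depth weight in [0, 1].
        # colour_models: tissue -> (mean, cov); depth_models: tissue -> (mu, sigma),
        # both fitted on labelled training pixels.
        names = list(colour_models)
        score = np.column_stack([
            (1 - w) * multivariate_normal.logpdf(rgb, *colour_models[n])
            + w * norm.logpdf(depth, *depth_models[n])
            for n in names])
        return np.asarray(names)[score.argmax(axis=1)]   # ML class per pixel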

Combinatorial Optimisation of Worm Propagation on an Unknown Network

Worm propagation profiles have changed significantly since 2003-2004: sudden worldwide outbreaks like Blaster or Slammer have progressively disappeared, replaced by slower but stealthier worms, most of them spread for botnet dissemination. Decreased worm virulence makes detection more difficult. In this paper, we describe a stealth worm propagation model that has been extensively simulated and analysed on a huge virtual network. The main feature of this model is its ability to infect any Internet-like network in a few seconds, whatever its size, while greatly limiting the overhead of reinfection attempts on already infected hosts. The main simulation results show that the combinatorial topology of routing can have a huge impact on worm propagation, so that some servers play a more essential role than others. The capability to identify them in real time may be essential to greatly hinder worm propagation.

Secure Block-Based Video Authentication with Localization and Self-Recovery

Because of the great advances in multimedia technology, digital multimedia is vulnerable to malicious manipulation. In this paper, a public-key, self-recovery, block-based video authentication technique is proposed which can not only precisely localize alterations but also recover the missing data with high reliability. In the proposed block-based technique, multiple description coding (MDC) is used to generate two codes (two descriptions) for each block. Although one block description is enough to rebuild an altered block, the altered block is rebuilt with better quality from the two descriptions, so using MDC increases the reliability of the recovered data. A block signature is computed using a cryptographic hash function, and a doubly linked chain is used to embed the block signature copies and the block descriptions into the LSBs of distant blocks and of the block itself. The doubly linked chain gives the proposed technique the capability to thwart vector quantization attacks. In the proposed technique, anyone can check the authenticity of a given video using the public key. The experimental results show that the proposed technique is reliable for detecting, localizing, and recovering alterations.
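Two of the ingredients can be sketched as follows (hedged: a keyed SHA-256 digest stands in for the public-key signature, and the MDC descriptions and doubly linked chain layout are omitted):

    import hashlib
    import numpy as np

    def block_signature(block: np.ndarray, key: bytes) -> bytes:
        # Digest the block content excluding the LSB plane, so the signature
        # survives its own embedding. A real deployment would sign this digest
        # with the private key rather than key the hash.
        msb = (block & 0xFE).tobytes()
        return hashlib.sha256(key + msb).digest()

    def embed_lsb(block: np.ndarray, payload: bytes) -> np.ndarray:
        # Write the payload bits into the block's least significant bits
        # (assumes a uint8 block with at least 8 * len(payload) pixels).
        bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
        flat = block.reshape(-1).copy()
        flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
        return flat.reshape(block.shape)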

Design and Simulation of an Interface Circuit for Piezoresistive Accelerometers with Offset Cancellation Ability

This paper presents a new method for reading out piezoresistive accelerometer sensors. The circuit is based on an instrumentation amplifier and is useful for reducing offset in the Wheatstone bridge. The obtained gain is 645, with 1 μV/°C equivalent drift and 1.58 mW power consumption. A Schmitt trigger and a multiplexer circuit control the output node, and a high-speed counter is also designed in this work. The proposed circuit is designed and simulated in 0.18 μm CMOS technology with a 1.8 V power supply.

The Problem of Using the Calculation of the Critical Path to Solve Instances of the Job Shop Scheduling Problem

A procedure commonly used in the Job Shop Scheduling Problem (JSSP) to evaluate the neighborhood functions employed by non-deterministic algorithms is the calculation of the critical path in a digraph. This paper presents an experimental study of the computational cost incurred when the critical path is calculated in the solutions of large JSSP instances. The results indicate that if the critical path is used to generate neighborhoods in the meta-heuristics applied to the JSSP, a high computational cost is incurred, even though calculating the critical path in any digraph is of polynomial complexity.
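The polynomial-time step being profiled is a longest-path computation in the schedule digraph; a minimal sketch (node weights standing for operation processing times):

    from collections import defaultdict
    from graphlib import TopologicalSorter

    def critical_path_length(edges, weight):
        # edges: (u, v) precedence arcs of the solution digraph;
        # weight: dict mapping each operation to its processing time.
        preds = defaultdict(set)
        for u, v in edges:
            preds[v].add(u)
        dist = {}
        for u in TopologicalSorter(preds).static_order():
            dist[u] = weight[u] + max((dist[p] for p in preds[u]), default=0)
        return max(dist.values())      # makespan = length of the critical path

    # e.g. critical_path_length([("a", "b"), ("a", "c"), ("c", "b")],
    #                           {"a": 3, "b": 2, "c": 4})  -> 9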

Visual Object Tracking in 3D with Color Based Particle Filter

This paper addresses the problem of determining the current 3D location of a moving object and robustly tracking it from a sequence of camera images. The approach presented here uses a particle filter and does not perform any explicit triangulation. Only the color of the object to be tracked is required, not a precise motion model. The observation model we have developed avoids color filtering of the entire image; this, together with the Monte Carlo techniques inside the particle filter, provides real-time performance. Experiments with two real cameras are presented and lessons learned are discussed. The approach scales easily to more than two cameras and to new sensor cues.
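A condensed sketch of one filter iteration (illustrative; the paper's observation model evaluates colour similarity only at the pixels each particle projects to, which the observe callback stands in for):

    import numpy as np

    def pf_step(particles, observe, motion_noise=0.05, rng=None):
        # particles: (N, 3) candidate 3D object positions.
        # observe(p): colour-similarity likelihood of position p in the cameras.
        rng = rng or np.random.default_rng()
        # 1. Predict: diffuse the particles (no precise motion model required).
        particles = particles + rng.normal(0, motion_noise, particles.shape)
        # 2. Weight: colour likelihood at each particle's projections.
        weights = np.array([observe(p) for p in particles])
        weights /= weights.sum()
        estimate = weights @ particles          # weighted-mean 3D location
        # 3. Resample proportionally to the weights.
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        return particles[idx], estimate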

Robust Regression and its Application in Financial Data Analysis

This research describes the application of robust regression and its advantages over the least squares regression method in analyzing financial data. To this end, the relationship between earnings per share, book value of equity per share, and share price (the price model), and between earnings per share, the annual change in earnings per share, and stock return (the return model), are examined using both robust and least squares regression, and the outcomes are compared. The comparison shows that robust regression provides a better and more realistic analysis by eliminating or reducing the contribution of outliers and influential data points. Robust regression is therefore recommended for more precise results in financial data analysis.
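A minimal sketch of the comparison on synthetic data with gross outliers (variable names are illustrative; statsmodels' Huber M-estimator stands in for the robust method):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    x = rng.standard_normal(100)            # e.g. earnings per share
    eps = rng.standard_normal(100)
    eps[:5] += 15                           # a few gross outliers
    y = 1.0 + 2.0 * x + eps                 # e.g. share price
    X = sm.add_constant(x)

    ols = sm.OLS(y, X).fit()
    rlm = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()
    print(ols.params, rlm.params)   # the robust fit sits closer to (1.0, 2.0)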

The Incorporation of In in GaAsN as a Means of N Fraction Calibration

InGaAsN and GaAsN epitaxial layers with similar nitrogen compositions were successfully grown in the same sample on a GaAs (001) substrate by solid-source molecular beam epitaxy. An electron cyclotron resonance nitrogen plasma source was used to generate atomic nitrogen during the growth of the nitride layers. The indium composition was changed from sample to sample to give compressively and tensilely strained InGaAsN layers. Layer characteristics were assessed by high-resolution X-ray diffraction to determine the relationship between the lattice constant of the GaAs1-yNy layer and the In fraction x. The objective was to determine the In fraction x in an InxGa1-xAs1-yNy epitaxial layer which exactly cancels the strain present in a GaAs1-yNy epitaxial layer with the same nitrogen content when grown on a GaAs substrate.
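Under hedged textbook assumptions (linear Vegard interpolation between the binary lattice constants and layers of equal thickness; the paper's calibrated relationship may differ), the strain-balance condition and its solution read:

    a_{\text{InGaAsN}}(x,y) - a_{\text{GaAs}} \;=\; a_{\text{GaAs}} - a_{\text{GaAsN}}(y),
    \qquad a_{\text{GaAsN}}(y) = a_{\text{GaAs}} + y\,(a_{\text{GaN}} - a_{\text{GaAs}}),

    a_{\text{InGaAsN}}(x,y) = a_{\text{GaAs}} + y\,(a_{\text{GaN}} - a_{\text{GaAs}})
      + x\,\bigl[(1-y)(a_{\text{InAs}} - a_{\text{GaAs}}) + y\,(a_{\text{InN}} - a_{\text{GaN}})\bigr],

    \Longrightarrow\quad
    x \;=\; \frac{2\,y\,(a_{\text{GaAs}} - a_{\text{GaN}})}
                 {(1-y)(a_{\text{InAs}} - a_{\text{GaAs}}) + y\,(a_{\text{InN}} - a_{\text{GaN}})}.

For small y this is approximately linear in y, x ≈ 2y (a_GaAs - a_GaN)/(a_InAs - a_GaAs), which is what makes the In incorporation usable as an N-fraction calibration.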

Transmitter Macrodiversity in Multihopping: SFN-Based Algorithm for Improved Node Reachability and Robust Routing

A novel idea presented in this paper is to combine multihop routing with single-frequency networks (SFNs) for a broadcasting scenario. An SFN is a set of multiple nodes that transmit the same data simultaneously, resulting in transmitter macrodiversity. Two of the most important performance factors of multihop networks, node reachability and routing robustness, are analyzed. Simulation results show that our proposed SFN-D routing algorithm improves node reachability by 37 percentage points compared with non-SFN multihop routing. It shows a diversity gain of 3.7 dB, meaning that 3.7 dB lower transmission powers are required for the same reachability. Even better results are possible for larger networks. If an important node becomes inactive, the algorithm can find new routes that a non-SFN scheme would not be able to find. Thus, two of the major problems in multihopping are addressed: achieving robust routing as well as improving node reachability or reducing transmission power.

Clustering Multivariate Empiric Characteristic Functions for Multi-Class SVM Classification

A dissimilarity measure between the empiric characteristic functions of the subsamples associated with the different classes in a multivariate data set is proposed. This measure can be computed efficiently and depends on all the cases of each class. It may be used to find groups of similar classes, which could be joined for further analysis, or it may be employed to perform an agglomerative hierarchical cluster analysis of the set of classes. The resulting tree can serve to build a family of binary classification models, offering an alternative approach to the multi-class SVM problem. We have tested this dendrogram-based SVM approach against the one-against-one SVM approach on four publicly available data sets, three of them microarray data. The two performances were found to be equivalent, but the first solution requires a smaller number of binary SVM models.
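One way to realize such a measure (a hedged sketch; the paper's exact definition may differ) is to average the squared modulus of the ECF difference over sampled frequencies; the resulting class-by-class matrix can then be fed to scipy.cluster.hierarchy.linkage to build the dendrogram:

    import numpy as np

    def ecf(X, T):
        # Empirical characteristic function of sample X (n, d) at frequencies
        # T (m, d): phi(t) = mean_j exp(i <t, x_j>).
        return np.exp(1j * T @ X.T).mean(axis=1)

    def ecf_dissimilarity(XA, XB, n_freq=256, scale=1.0, seed=0):
        # Monte Carlo average of |phi_A(t) - phi_B(t)|^2 over Gaussian frequencies.
        rng = np.random.default_rng(seed)
        T = rng.normal(0.0, scale, size=(n_freq, XA.shape[1]))
        return np.mean(np.abs(ecf(XA, T) - ecf(XB, T)) ** 2)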

Trust and Reliability for Public Sector Data

The public sector holds large amounts of data covering various areas such as social affairs, economy, or tourism. Various initiatives such as Open Government Data or the EU Directive on public sector information aim to make these data available to public and private service providers. Requirements for the provision of public sector data are defined by legal and organizational frameworks. Surprisingly, the defined requirements hardly cover security aspects such as integrity or authenticity. In this paper we discuss the importance of these missing requirements and present a concept, based on electronic signatures, to assure the integrity and authenticity of the provided data. We show that our concept is well suited for the provisioning of unaltered data, and that it can be extended, by incorporating redactable signatures, to data that needs to be anonymized before provisioning. Our proposed concept enhances the trust and reliability of provided public sector data.
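The signature layer itself can be sketched as follows (illustrative only, using the Python cryptography package with ECDSA; the redactable-signature extension is not shown, and the record content is hypothetical):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec

    private_key = ec.generate_private_key(ec.SECP256R1())   # held by the data provider
    public_key = private_key.public_key()                   # published to consumers

    record = b"tourism;2011;overnight-stays;12345"          # hypothetical data record
    signature = private_key.sign(record, ec.ECDSA(hashes.SHA256()))

    try:    # any consumer can verify integrity and authenticity
        public_key.verify(signature, record, ec.ECDSA(hashes.SHA256()))
    except InvalidSignature:
        print("record was altered")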

Coherent and Incoherent Scattering Cross Sections for Elements with 13 ≤ Z ≤ 50

Coherent and incoherent scattering cross section measurements have been carried out with an HPGe detector on elements in the range Z = 13-50 using 241Am gamma rays. The cross sections have been derived by comparing the net count rate obtained from the Compton peak of aluminium with the corresponding peak of each target. The measured cross sections for the coherent and incoherent processes are compared with theoretical values and earlier reported values; our results agree with the theoretical values.
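The relative method reduces to a count-rate ratio in which detector efficiency and geometry largely cancel (a generic form; the paper's exact correction factors are not reproduced):

    \sigma_{\text{target}} \;=\; \sigma_{\text{Al}}\,
      \frac{R_{\text{target}}}{R_{\text{Al}}}\,
      \frac{n_{\text{Al}}}{n_{\text{target}}},

where R is the net count rate under the corresponding peak, n the number of target atoms intercepting the beam, and \sigma_{\text{Al}} the known aluminium cross section; self-attenuation in the targets still requires correction.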

Use of Ecommerce Websites in Developing Countries

The purpose of this study is to investigate the use of ecommerce websites in Indonesia as a developing country. Ecommerce websites have been identified as having a significant impact on business activities, in particular in overcoming the geographical challenges of island nations like Indonesia; the website is also recognized as a crucial marketing tool. This study examines the effect of quality and features on the use of, and user satisfaction with, ecommerce websites. A survey of 115 undergraduate students of the Management Department at Andalas University attending the Management Information Systems (SIM) class was undertaken. The data obtained were analyzed with Structural Equation Modeling (SEM) using the SmartPLS program. The results show that system and information quality, features, and satisfaction influence the use of ecommerce websites in the Indonesian context.

Objective Assessment of Psoriasis Lesion Thickness for PASI Scoring using 3D Digital Imaging

Psoriasis is a chronic inflammatory skin condition which affects 2-3% of the population around the world. The Psoriasis Area and Severity Index (PASI) is the gold standard for assessing psoriasis severity as well as treatment efficacy. Although a gold standard, PASI is rarely used because it is tedious and complex. In practice, the PASI score is determined subjectively by dermatologists, so inter- and intra-observer variations in assessment can occur even among experts. This research develops an algorithm to assess psoriasis lesions for objective PASI scoring. The focus of this research is thickness assessment, one of the four PASI parameters alongside area, erythema, and scaliness. Psoriasis lesion thickness is measured by averaging the total elevation from the lesion base to the lesion surface. Thickness values of 122 3D images taken from 39 patients are grouped into the 4 PASI thickness scores using K-means clustering. Validation of the lesion base reconstruction is performed using twelve body curvature models and shows good results, with a coefficient of determination (R2) equal to 1.
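The scoring step can be sketched as follows (a minimal sketch; extracting the per-image mean thickness from the 3D surfaces is the paper's contribution and is not reproduced, and the input file is hypothetical):

    import numpy as np
    from sklearn.cluster import KMeans

    # One mean thickness value per 3D image, e.g. 122 values from 39 patients.
    thickness = np.loadtxt("lesion_thickness.txt").reshape(-1, 1)

    km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(thickness)
    # Order clusters by centre so labels map onto PASI thickness scores 1..4.
    order = np.argsort(km.cluster_centers_.ravel())
    to_score = {cluster: rank + 1 for rank, cluster in enumerate(order)}
    scores = np.array([to_score[c] for c in km.labels_])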

Clustering Methods Applied to the Tracking of User Traces Interacting with an e-Learning System

Much research has been carried out on the analysis of traces in digital learning environments. These studies produce large volumes of usage traces from the various actions performed by users. However, exploiting these data to compare and improve performance raises several issues, which a number of recent works have begun to address. This research studied a series of questions about the format and description of the data to be shared. Our goal is to share thoughts on these issues by presenting our experience in the analysis of trace-based log files, comparing several automatic classification approaches applied to e-learning platforms. Finally, the obtained results are discussed.

Forecasting Malaria Cases in Bujumbura

The focus of this work is to assess which method allows better forecasting of malaria cases in Bujumbura (Burundi) when taking into account the association between climatic factors and the disease. For the period 1996-2007, real monthly data on both malaria epidemiology and climate in Bujumbura are described and analyzed. We propose a hierarchical approach: we first fit a Generalized Additive Model to the malaria cases to obtain an accurate predictor, which is then used to predict future observations. Various well-known forecasting methods are compared, leading to different results. Based on the in-sample mean absolute percentage error (MAPE), the exponential smoothing state space model with multiplicative error and multiplicative seasonality performed best.
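The winning model class can be fitted as follows (a hedged sketch with statsmodels; the file name and exact model settings are assumptions):

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.exponential_smoothing.ets import ETSModel

    cases = pd.read_csv("malaria_monthly.csv", index_col=0,
                        parse_dates=True)["cases"]    # hypothetical monthly series
    fit = ETSModel(cases, error="mul", seasonal="mul",
                   seasonal_periods=12).fit()         # ETS(M, N, M)

    mape = np.mean(np.abs((cases - fit.fittedvalues) / cases)) * 100  # in-sample MAPE
    forecast = fit.forecast(12)                       # next 12 months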