Automatic Segmentation of Dermoscopy Images Using Histogram Thresholding on Optimal Color Channels

Automatic segmentation of skin lesions is the first step towards the development of computer-aided diagnosis of melanoma. Although numerous segmentation methods have been developed, few studies have focused on determining the most discriminative and effective color space for melanoma applications. This paper proposes a novel automatic segmentation algorithm using color space analysis and clustering-based histogram thresholding, which determines the optimal color channel for segmenting skin lesions. To demonstrate the validity of the algorithm, it is tested on a set of 30 high-resolution dermoscopy images, and a comprehensive evaluation of the results is provided in which borders manually drawn by four dermatologists are compared to the automated borders detected by the proposed algorithm. The evaluation applies three previously used metrics (accuracy, sensitivity, and specificity) and a new similarity metric. Through ROC analysis and ranking of the metrics, it is shown that the best results are obtained with the X and XoYoR color channels, which yield an accuracy of approximately 97%. The proposed method is also compared with two state-of-the-art skin lesion segmentation methods, demonstrating the effectiveness and superiority of the proposed segmentation method.
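
Since the abstract does not spell out the thresholding step, a minimal sketch of clustering-based histogram thresholding on the CIE-XYZ X channel (one of the channels reported as optimal) is given below; the use of Otsu's criterion as the clustering rule and the assumption that lesions are darker than the surrounding skin are ours, not the paper's.

```python
# Minimal sketch: histogram thresholding of a dermoscopy image on the
# CIE-XYZ X channel. Otsu's method is one clustering-based threshold;
# the paper's exact clustering rule and preprocessing are not stated.
from skimage import io, color
from skimage.filters import threshold_otsu

def segment_lesion(path):
    rgb = io.imread(path)
    xyz = color.rgb2xyz(rgb)       # convert RGB to CIE-XYZ
    x_channel = xyz[:, :, 0]       # the X channel
    t = threshold_otsu(x_channel)  # clustering-based histogram threshold
    return x_channel < t           # assume lesions are the darker class
```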

A Novel Machining Signal Filtering Technique: Z-notch Filter

A filter is used to remove undesirable frequency information from a dynamic signal. This paper shows that the Z-notch filtering technique can be applied to remove noise from a machining signal. In machining, the noise components are identified from the sound produced by the operation of the machine components themselves, such as the hydraulic system, the motor, and the machine environment. By correlating the noise components with the measured machining signal, the components of interest, which are less contaminated by noise, can be extracted. The filtered signal is thus more reliable to analyse in terms of noise content than the unfiltered signal. Significantly, the I-kaz method, comprising a three-dimensional graphical representation and the I-kaz coefficient Z∞, could differentiate between the filtered and the unfiltered signal: a larger scattering space and a higher value of Z∞ indicate that the signal is heavily contaminated by noise. This method can be utilised as a proactive tool for evaluating the noise content of a signal. Evaluating the noise content is as important as eliminating it, especially for machining-operation fault diagnosis. The Z-notch filtering technique reliably extracted the noise components from the measured machining signal with high efficiency. Even though the measured signal was exposed to strong noise disruption, the signal generated by the interaction between the cutting tool and the workpiece could still be acquired. Therefore, noise interruptions that could change the original signal features and consequently degrade the useful sensory information can be eliminated.
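
The abstract does not give the Z-notch design itself, so the sketch below only illustrates the general idea with a standard IIR notch filter from SciPy, removing an assumed 50 Hz noise line identified from the machine sound; the sampling rate and test signal are placeholders.

```python
# Illustrative notch filtering of a machining signal; the paper's
# Z-notch design is not specified in the abstract, so a generic IIR
# notch at an assumed noise frequency is shown instead.
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 10_000.0                 # sampling rate in Hz (assumed)
f_noise = 50.0                # identified noise component (assumed)
t = np.arange(0, 1, 1 / fs)
signal = (np.sin(2 * np.pi * 800 * t)          # tool/workpiece component
          + 0.5 * np.sin(2 * np.pi * f_noise * t))  # machine noise

b, a = iirnotch(w0=f_noise, Q=30.0, fs=fs)  # narrow notch at f_noise
filtered = filtfilt(b, a, signal)           # zero-phase filtering
```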

Human Verification in a Video Surveillance System Using Statistical Features

A human verification system is presented in this paper. The system consists of several steps: background subtraction, thresholding, line connection, region growing, morphology, star skeletonization, feature extraction, feature matching, and decision making. The proposed system combines the advantages of star skeletonization and simple statistical features. Correlation matching and probability voting are used for verification, followed by a logical operation in the decision-making stage. The proposed system uses a small number of features, and its reliability is convincing.
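
Star skeletonization is a known silhouette descriptor (distances from the blob centroid to its contour, with local maxima marking extremities); the sketch below shows that step under assumed smoothing settings, since the paper's exact parameters are not stated.

```python
# Sketch of star-skeleton extraction: distances from the blob centroid
# to its contour, with local maxima as skeleton end points (head,
# limbs). Smoothing window and peak rule are assumptions.
import numpy as np

def star_skeleton(contour, smooth=5):
    """contour: (N, 2) array of ordered boundary points of a blob."""
    centroid = contour.mean(axis=0)
    d = np.linalg.norm(contour - centroid, axis=1)  # centroid distances
    kernel = np.ones(smooth) / smooth
    d_s = np.convolve(d, kernel, mode="same")       # suppress noise
    # local maxima of the distance function mark the extremities
    peaks = [i for i in range(1, len(d_s) - 1)
             if d_s[i] > d_s[i - 1] and d_s[i] > d_s[i + 1]]
    return contour[peaks], centroid
```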

Identification of Wideband Sources Using Higher Order Statistics in Noisy Environment

This paper deals with the localization of wideband sources. We develop a new approach for estimating wideband source parameters. The method is based on higher-order statistics of the recorded data in order to eliminate the Gaussian components from the signals received on the various hydrophones; in fact, sea-bottom noise is regarded as Gaussian. Thanks to the coherent signal subspace algorithm based on the cumulant matrix of the received data, instead of the cross-spectral matrix, the correlated wideband sources are accurately located even in a very noisy environment. We demonstrate the performance of the proposed algorithm on real data recorded during an underwater acoustics experiment.
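
One common definition of a fourth-order cumulant matrix, for which Gaussian components vanish, is sketched below; the paper's exact cumulant slice and focusing procedure are not given in the abstract, so this is only an illustrative building block.

```python
# Sketch: a fourth-order cumulant matrix of zero-mean array snapshots,
# usable in place of the cross-spectral matrix because cumulants of
# Gaussian noise above order two are zero. The cumulant slice chosen
# here is one common convention, assumed rather than taken from the
# paper.
import numpy as np

def cumulant_matrix(X):
    """X: (n_sensors, n_snapshots) complex, zero-mean data."""
    n, T = X.shape
    R = X @ X.conj().T / T                        # covariance matrix
    C = np.zeros((n, n), dtype=complex)
    for i in range(n):
        for j in range(n):
            # cum(x_i, x_i*, x_j, x_j*) for zero-mean complex data
            C[i, j] = (np.mean(X[i] * X[i].conj() * X[j] * X[j].conj())
                       - R[i, i] * R[j, j]
                       - R[i, j] * R[j, i]
                       - np.mean(X[i] * X[j])
                         * np.mean(X[i].conj() * X[j].conj()))
    return C
```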

MC and IC – What Is the Relationship?

MC (Management Control) and IC (Internal Control): what is the relationship? This paper is an empirical study of how the two terms are defined. In the wider interpretation of Internal Control and Management Control, attention is focused not only on the financial aspects of a business but also on its soft aspects, such as culture, behaviour, standards, and values; the limited interpretation of Management Control concentrates mainly on the hard, financial aspects of business operation. The two terms are often used interchangeably, yet the results of this empirical study reveal that Management Control is part of Internal Control and that there is no causal link between the two concepts. Based on the respondents' interpretations, the term Management Control has moved from a broad term to a more limited one centred on the soft aspects of influencing behaviour, performance measurement, incentives, and culture. This paper is an exploratory study based on qualitative research and on a qualitative matrix-method analysis of the thematic definitions of the terms Management Control and Internal Control.

Effect of Salt Solution and Plasticity Index on Undrained Shear Strength of Clays

Compacted clay liners (CCLs) are the main materials used in waste disposal landfills because of their low permeability. In this study, the effect of inorganic salt solutions, used as the permeant fluid, on the shear strength of clays was experimentally investigated. For this purpose, NaCl solutions at concentrations of 2, 5, and 10% and deionized water were used. Laboratory direct shear and vane shear tests were conducted on three compacted clays of low, medium, and high plasticity. The results indicated that the solution type and its concentration affect the shear properties of the mixture. In light of this study, the magnitude of the influence of these inorganic salts at various concentrations on different clays was determined, and the most suitable compacted clay was identified by comparing plasticities.

Enhancing Performance of Bluetooth Piconets Using Priority Scheduling and Exponential Back-Off Mechanism

Bluetooth is a personal wireless communication technology being applied in many scenarios. It is an emerging standard for short-range, low-cost, low-power wireless access. Existing MAC (Medium Access Control) scheduling schemes provide only best-effort service for all master-slave connections, and providing QoS (Quality of Service) support for different connections is challenging because of the master-driven TDD (Time Division Duplex) scheme. Moreover, no existing solution supports both the delay and bandwidth guarantees required by real-time applications. This paper addresses how to enhance QoS support in a Bluetooth piconet. The Bluetooth specification proposes a Round Robin scheduler as a possible solution for scheduling transmissions in a piconet. We propose an algorithm that reduces bandwidth waste and enhances network efficiency. We define token counters to estimate the traffic of real-time slaves. To increase bandwidth utilization, a back-off mechanism is then introduced for best-effort slaves to decrease the frequency of polling idle slaves. Simulation results demonstrate that our scheme achieves better performance than Round Robin scheduling.
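
A rough sketch of the polling policy described above follows: token counters give precedence to real-time slaves, while idle best-effort slaves back off exponentially. The class layout, field names, and back-off limit are assumptions, not the paper's specification.

```python
# Sketch of the proposed piconet polling idea: real-time slaves with
# estimated traffic (tokens) are served first; idle best-effort slaves
# are polled with exponentially increasing intervals.
class Slave:
    def __init__(self, real_time=False):
        self.real_time = real_time
        self.tokens = 0    # estimated queued packets (real-time slaves)
        self.backoff = 1   # current polling interval (best-effort)
        self.skip = 0      # slots to skip before the next poll

def next_slave(slaves):
    # Serve real-time slaves that hold tokens first.
    for s in slaves:
        if s.real_time and s.tokens > 0:
            s.tokens -= 1
            return s
    # Otherwise poll a best-effort slave, honouring its back-off.
    for s in slaves:
        if not s.real_time:
            if s.skip > 0:
                s.skip -= 1
                continue
            return s
    return None            # slot stays idle this round

def after_poll(s, had_data, max_backoff=32):
    """Double an idle best-effort slave's back-off; reset on activity."""
    if not s.real_time:
        s.backoff = 1 if had_data else min(s.backoff * 2, max_backoff)
        s.skip = s.backoff - 1
```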

The Effects of Biomass Parameters on the Dissolved Organic Carbon Removal in a Sponge Submerged Membrane Bioreactor

A novel sponge submerged membrane bioreactor (SSMBR) was developed to effectively remove organics and nutrients from wastewater. Sponge is introduced within the SSMBR as a medium for the attached growth of biomass. This paper evaluates the effects of new and acclimatized sponges on dissolved organic carbon (DOC) removal from wastewater at different mixed liquor suspended solids (MLSS) concentrations of the sludge. A series of experimental studies showed that the acclimatized sponge performed better than the new sponge, and that the optimum DOC removal was achieved at 10 g/L of MLSS with the acclimatized sponge. Moreover, the paper analyses the relationship between the MLSS(sponge)/MLSS(sludge) ratio and the DOC removal efficiency of the SSMBR. The results showed a non-linear relationship between the biomass parameters of the sponge and the sludge and the DOC removal efficiency of the SSMBR, which a second-order polynomial function could reasonably represent.
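
As an illustration of the reported second-order polynomial relationship, the sketch below fits such a function to (MLSS ratio, DOC removal) pairs; the numbers shown are placeholders, not the paper's measurements.

```python
# Sketch: fitting a second-order polynomial to observations of DOC
# removal versus the MLSS(sponge)/MLSS(sludge) ratio, as the abstract
# suggests. The data values below are placeholders, not measured data.
import numpy as np

ratio = np.array([0.2, 0.4, 0.6, 0.8, 1.0])         # placeholder ratios
removal = np.array([78.0, 85.0, 90.0, 92.0, 91.0])  # placeholder DOC (%)

coeffs = np.polyfit(ratio, removal, deg=2)   # a*x**2 + b*x + c
predict = np.poly1d(coeffs)
print(predict(0.7))                          # estimated removal at 0.7
```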

Validation of Automation Systems using Temporal Logic Model Checking and Groebner Bases

Validation of an automation system is an important issue. The goal is to check that the system under investigation, modeled by a Petri net, never enters undesired states. Usually, tools dedicated to Petri nets, such as DESIGN/CPN, are used for reachability analysis. The biggest problem with this approach is that generating the full occurrence graph of the system is often impossible because it is too large. In this paper, we show how computational methods such as temporal logic model checking and Groebner bases can be used to verify the correctness of the design of an automation system. We report our experimental results with two automation systems: an Automated Guided Vehicle (AGV) system and a traffic light system. Validation of these two systems took from 10 to 30 seconds on a PC, depending on the optimization parameters.
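
The following toy sketch hints at how a Groebner basis can decide state consistency: a two-place, one-transition net is encoded as polynomial equations, and a basis equal to {1} signals that the undesired marking is inconsistent with the state equation. The tiny net and this particular encoding are our assumptions, not the paper's case studies.

```python
# Sketch of the Groebner-basis idea: encode Petri net markings as
# polynomial equations and test whether an undesired state is
# consistent with the state equation.
from sympy import symbols, groebner

m1, m2, t = symbols("m1 m2 t")
# State equation of a 2-place net whose single transition moves one
# token from place 1 to place 2 (initial marking: m1 = 1, m2 = 0).
eqs = [m1 - (1 - t), m2 - t]
# Ask whether the undesired marking m1 = 1, m2 = 1 is reachable:
g = groebner(eqs + [m1 - 1, m2 - 1], m1, m2, t)
print(g)   # basis {1} means the system is inconsistent: unreachable
```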

Modeling Language for Machine Learning

For a given specific problem, the design of an efficient algorithm has traditionally been the object of study. However, an alternative approach, orthogonal to this one, has emerged: reduction. In general, for a given specific problem, the reduction approach studies how to convert the original problem into subproblems. This paper proposes a formal modeling language to support this reduction approach. We show three examples from the wide area of learning problems. The benefit is fast prototyping of algorithms for a given new problem.
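
A classic instance of such a reduction is converting multiclass classification into several binary subproblems (one-vs-rest), sketched below; the paper's own modeling language is not reproduced here, and the helper functions fit_binary and score are hypothetical parameters.

```python
# Illustrative reduction in the spirit described above: a multiclass
# learning problem converted into one binary subproblem per class.
import numpy as np

def one_vs_rest_fit(X, y, fit_binary):
    """Train one binary model per class; fit_binary is any learner."""
    return {c: fit_binary(X, (y == c).astype(int)) for c in np.unique(y)}

def one_vs_rest_predict(models, X, score):
    """Predict the class whose binary model scores highest."""
    classes = list(models)
    scores = np.column_stack([score(models[c], X) for c in classes])
    return np.array(classes)[scores.argmax(axis=1)]
```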

Coupling Compensation of 6-DOF Parallel Robot Based on Screw Theory

In order to improve control performance and eliminate steady-state error, a coupling compensation for a 6-DOF parallel robot is presented. Taking a dynamic-load tank simulator as the research object, this paper analyzes the coupling of the 6-DOF parallel robot in light of the degrees of freedom of the 6-DOF parallel manipulator. The coupling angle and coupling velocity are derived from the inverse kinematics model. The study uses a combined mechanism-model method whose input is the practical motion trajectory, taking into account the performance of the motion controller and motor. Experimental results show that the coupling compensation improves motion stability as well as accuracy. In addition, it decreases the dither amplitude of the dynamic-load tank simulator.
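
The inverse-kinematics step from which the coupling quantities are derived can be sketched as the standard leg-length computation of a Stewart-type platform; the geometry arrays and ZYX Euler parameterization below are assumptions.

```python
# Sketch of 6-DOF parallel (Stewart-type) platform inverse kinematics:
# actuator leg lengths for a given pose of the moving platform.
import numpy as np

def rot_zyx(yaw, pitch, roll):
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def leg_lengths(pose, base_pts, plat_pts):
    """pose = (x, y, z, yaw, pitch, roll); *_pts are (6, 3) arrays of
    joint positions in the base and platform frames respectively."""
    p = np.asarray(pose[:3])
    R = rot_zyx(*pose[3:])
    legs = p + plat_pts @ R.T - base_pts   # vector along each actuator
    return np.linalg.norm(legs, axis=1)
```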

Design and Implementation of Cricket-based Location Tracking System

In this paper, we present a novel approach to indoor location systems. The key idea of our work is accurate distance estimation with a Cricket-based location system using the A* algorithm. We also use a magnetic sensor for detecting obstacles in the indoor environment. Finally, we suggest how this system can be used in various applications such as asset tracking and monitoring.
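
Since the abstract names the A* algorithm for path and distance estimation, a minimal grid-based A* sketch follows; how the paper couples it with Cricket ranging and the magnetic sensor is not specified, so the grid model is an assumption.

```python
# Minimal grid A* sketch with a Manhattan-distance heuristic; returns
# the shortest obstacle-free path, or None if the goal is unreachable.
import heapq

def astar(grid, start, goal):
    """grid: 2D list, 0 = free, 1 = obstacle; start/goal: (row, col)."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, [start])]
    seen = set()
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                heapq.heappush(open_set, (g + 1 + h((nr, nc)), g + 1,
                                          (nr, nc), path + [(nr, nc)]))
    return None
```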

Signals from the Rocks

There is increasing evidence that earthquakes produce electromagnetic signals observable at the surface in the extremely low to very low frequency (ELF-VLF) range, often in advance of the main event. These precursors are candidates for prediction purposes. Laboratory experiments confirm that material under load emits an electromagnetic signature; however, the detailed generation mechanisms are not yet well understood.

Hybrid Method Using Wavelets and Predictive Method for Compression of Speech Signal

Signal compression algorithms are developing at an impressive pace. These algorithms are continuously improved by new tools and aim to reduce, on average, the number of bits necessary for signal representation while minimizing the reconstruction error. This article proposes the compression of Arabic speech signals by a hybrid method combining the wavelet transform and linear prediction. The adopted approach rests, on the one hand, on decomposing the original signal with analysis filters, followed by the compression stage, and, on the other hand, on applying order-5 linear prediction to the compressed signal coefficients. The aim of this approach is the estimation of the prediction error, which is coded and transmitted. The decoding operation is then used to reconstruct the original signal. An adequate choice of the filter bank for the transform is therefore necessary to increase the compression rate while keeping the distortion imperceptible from an auditory point of view.
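
A sketch of the hybrid pipeline as we read it follows: one level of wavelet analysis, then an order-5 linear predictor fitted to the approximation coefficients so that only the (smaller) prediction error need be coded. The db4 wavelet, framing, and least-squares LPC fit are assumptions.

```python
# Sketch of the hybrid wavelet + linear-prediction encoder outlined
# above; wavelet choice and LPC fitting method are assumptions.
import numpy as np
import pywt

def lpc(x, order=5):
    """Least-squares linear-prediction coefficients of given order."""
    A = np.column_stack([x[order - k - 1:len(x) - k - 1]
                         for k in range(order)])
    return np.linalg.lstsq(A, x[order:], rcond=None)[0]

def hybrid_encode(frame, wavelet="db4", order=5):
    approx, detail = pywt.dwt(frame, wavelet)   # analysis filter bank
    a = lpc(approx, order)                      # order-5 predictor
    pred = np.convolve(approx, np.r_[0, a], mode="full")[:len(approx)]
    residual = approx - pred                    # error to be coded
    return a, residual, detail
```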

Goal-Based Request Cloud Resource Broker in Medical Application

In this paper, a cloud resource broker using goal-based requests in medical applications is proposed. To handle the recent massive production of digital images and data in medical informatics applications, the cloud resource broker can be used by medical practitioners to properly discover and select the correct information and applications. This paper summarizes several reviewed articles relating medical informatics applications to current broker technology and presents research on applying goal-based requests in a cloud resource broker to optimize the use of resources in a cloud environment. The objective of proposing a new kind of resource broker is to enhance the current resource scheduling, discovery, and selection procedures. We believe that it can help to maximize resource allocation in medical informatics applications.

Parallel Image Compression and Analysis with Wavelets

This paper presents image compression with a wavelet-based method. The wavelet transform divides an image into low-pass and high-pass filtered parts. The traditional JPEG compression technique requires less computation power with acceptable losses when only compression is needed. However, there is an obvious need for wavelet-based methods in certain circumstances. These methods are intended for applications in which image analysis is done in parallel with compression. Furthermore, the high-frequency bands can be used to detect changes or edges. Wavelets enable hierarchical analysis of the low-pass filtered sub-images: the first analysis can be done on a small image, and only if anything interesting is found is the whole image processed or reconstructed.
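
The hierarchical analyze-then-reconstruct idea can be sketched as follows: inspect the bands of one decomposition level and reconstruct the full image only when something interesting appears. The edge-energy test and threshold below are placeholders.

```python
# Sketch of hierarchical wavelet analysis: examine the small sub-images
# of one decomposition level first; reconstruct fully only if the
# high-frequency bands suggest edges or changes worth a closer look.
import numpy as np
import pywt

def analyze(image, threshold=10.0):
    cA, (cH, cV, cD) = pywt.dwt2(image, "haar")   # one level of analysis
    edge_energy = np.mean(np.abs(cH) + np.abs(cV) + np.abs(cD))
    if edge_energy < threshold:
        return None                        # nothing of interest: stop
    return pywt.idwt2((cA, (cH, cV, cD)), "haar")  # full reconstruction
```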

Approximate Range-Sum Queries over Data Cubes Using Cosine Transform

In this research, we propose to use the discrete cosine transform to approximate the cumulative distributions of data cube cells' values. The cosine transform is known to have a good energy-compaction property and can thus approximate data distribution functions easily with a small number of coefficients. The derived estimator is accurate and easy to update. We perform experiments to compare its performance with a well-known technique, the (Haar) wavelet. The experimental results show that the cosine transform performs much better than the wavelet in estimation accuracy, speed, space efficiency, and ease of update.
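
A minimal sketch of the core idea follows for a one-dimensional cube: keep a few DCT coefficients of the cumulative distribution and answer range sums from the reconstruction. The cube contents, size, and retained-coefficient count are illustrative assumptions.

```python
# Sketch: approximate range-sum queries from a truncated DCT of the
# cumulative distribution of a (toy, 1-D) data cube.
import numpy as np
from scipy.fft import dct, idct

counts = np.random.default_rng(0).poisson(5.0, size=256)  # toy cube
cdf = np.cumsum(counts).astype(float)

k = 16                            # coefficients to retain (assumed)
coef = dct(cdf, norm="ortho")
coef[k:] = 0.0                    # energy compaction: drop the tail
cdf_hat = idct(coef, norm="ortho")

def range_sum(lo, hi):
    """Approximate sum of cells lo..hi from the reconstructed CDF."""
    return cdf_hat[hi] - (cdf_hat[lo - 1] if lo > 0 else 0.0)
```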

Fractal-Wavelet Based Techniques for Improving Artificial Neural Network Models

Natural resources management, including water resources, requires reliable estimation of time-variant environmental parameters. Small improvements in the estimation of environmental parameters can have large effects on management decisions. Noise reduction using wavelet techniques is an effective approach to preprocessing practical data sets. The predictability enhancement of river flow time series is assessed using fractal approaches before and after applying wavelet-based preprocessing. Time-series correlation and persistence, the minimum length sufficient for training the prediction model, and the maximum valid prediction horizon were also investigated through fractal assessment.
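
A common form of the wavelet preprocessing step is sketched below, with soft thresholding of the detail coefficients; the db4 wavelet, three decomposition levels, and the universal threshold rule are standard defaults assumed here, not the paper's stated choices.

```python
# Sketch of wavelet-based noise reduction of a river-flow series via
# soft thresholding of detail coefficients (universal threshold rule).
import numpy as np
import pywt

def wavelet_denoise(series, wavelet="db4", level=3):
    coeffs = pywt.wavedec(series, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise estimate
    t = sigma * np.sqrt(2 * np.log(len(series)))     # universal rule
    coeffs[1:] = [pywt.threshold(c, t, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(series)]
```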

Nonlinear Effects in Stiffness Modeling of Robotic Manipulators

The paper focuses on enhanced stiffness modeling of robotic manipulators that takes into account the influence of the external force/torque acting upon the end point. It implements the virtual joint technique, which describes the compliance of manipulator elements by a set of localized six-dimensional springs separated by rigid links and perfect joints. In contrast to the conventional formulation, which is valid only for the unloaded mode and small displacements, the proposed approach implicitly assumes that the loading leads to non-negligible changes of the manipulator posture and a corresponding amendment of the Jacobian. The developed numerical technique allows computing the static equilibrium and the relevant force/torque reaction of the manipulator for any given displacement of the end-effector. This enables the designer to detect essentially nonlinear effects in the elastic behavior of the manipulator, similar to the buckling of beam elements. A linearization procedure is also proposed, based on the inversion of a dedicated matrix composed of the stiffness parameters of the virtual springs and the Jacobians/Hessians of the active and passive joints. The developed technique is illustrated by an application example dealing with the stiffness analysis of a parallel manipulator of the Orthoglide family.
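
For reference, the conventional (unloaded, small-displacement) stiffness map that the paper generalizes can be written as K_C = (J K_q^{-1} J^T)^{-1}; a minimal sketch, assuming diagonal virtual-spring stiffness, follows.

```python
# Sketch of the conventional Cartesian stiffness computation from
# virtual-joint stiffness via the Jacobian: K_C = (J K_q^-1 J^T)^-1.
# This is the baseline formulation; the paper's loaded, nonlinear
# analysis goes beyond it.
import numpy as np

def cartesian_stiffness(J, k_joint):
    """J: (6, n) manipulator Jacobian; k_joint: (n,) spring stiffnesses."""
    compliance = J @ np.diag(1.0 / np.asarray(k_joint)) @ J.T
    return np.linalg.inv(compliance)
```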

Navigation Patterns Mining Approach based on Expectation Maximization Algorithm

Web usage mining algorithms have been widely utilized for modeling users' web navigation behavior. In this study, we propose a model for mining users' navigation patterns. The model builds a user model based on the expectation-maximization (EM) algorithm. The EM algorithm is used in statistics for finding maximum likelihood estimates of parameters in probabilistic models that depend on unobserved latent variables. The experimental results show that as the number of clusters decreases, the log-likelihood converges toward lower values, and that the probability of the largest cluster decreases as the number of clusters increases in each treatment.
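
To make the E-step/M-step alternation concrete, a minimal EM sketch for a two-component one-dimensional Gaussian mixture follows; the paper clusters navigation sessions, which is a higher-dimensional analogue of this toy setup.

```python
# Minimal EM for a two-component 1-D Gaussian mixture: alternate the
# E-step (posterior responsibilities) and M-step (parameter updates)
# to climb the data log-likelihood.
import numpy as np

def em_gmm(x, iters=50):
    mu = np.array([x.min(), x.max()])   # crude initialization
    sigma = np.ones(2)
    pi = np.full(2, 0.5)
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        resp = pi * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities
        nk = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
        pi = nk / len(x)
    return mu, sigma, pi
```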