Segmentation of Images through Clustering to Extract Color Features: An Application for Image Retrieval

This paper deals with content-based image retrieval, extracting color features from natural images stored in an image database by segmenting each image through clustering. We employ a class of non-parametric techniques in which the data points are regarded as samples from an unknown probability density. Explicit computation of the density is avoided by using the mean shift procedure, a robust clustering technique which does not require prior knowledge of the number of clusters and does not constrain their shape. A non-parametric technique for the recovery of significant image features is presented, and a segmentation module is developed using the mean shift algorithm to segment each image. In these algorithms, the only user-set parameter is the resolution of the analysis, and either gray-level or color images are accepted as input. Extensive experimental results illustrate excellent performance.
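To make the procedure concrete, here is a minimal mean shift sketch in Python/NumPy (illustrative only, not the authors' implementation): each data point, e.g. a pixel's color vector, is repeatedly shifted to the mean of its neighbors within a bandwidth h, the single resolution parameter the abstract refers to, until the points settle on density modes.

```python
import numpy as np

def mean_shift(points, h=8.0, max_iter=50, tol=1e-3):
    """Shift every point toward its local density mode (flat kernel)."""
    modes = points.astype(float).copy()
    for _ in range(max_iter):
        shifted = np.empty_like(modes)
        for i, x in enumerate(modes):
            # neighbors within bandwidth h define the local mean
            near = np.linalg.norm(points - x, axis=1) < h
            shifted[i] = points[near].mean(axis=0) if near.any() else x
        if np.linalg.norm(shifted - modes) < tol:
            return shifted
        modes = shifted
    return modes  # points sharing a mode form one cluster/segment
```

Pixels whose trajectories converge to the same mode are assigned to the same segment; neither the number of clusters nor their shape is fixed in advance.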

Machine Learning Techniques for Short-Term Rain Forecasting System in the Northeastern Part of Thailand

This paper presents a machine learning methodology for a short-term rain forecasting system. Decision Tree, Artificial Neural Network (ANN), and Support Vector Machine (SVM) techniques were applied to develop classification and prediction models for rainfall forecasts. The goals of this work are to demonstrate (1) how feature selection can be used to identify the relationships between rainfall occurrences and other weather conditions and (2) what models can be developed and deployed to predict accurate rainfall estimates in support of decisions to launch cloud seeding operations in the northeastern part of Thailand. Datasets were collected during 2004-2006 from the Chalermprakiat Royal Rain Making Research Center at Hua Hin, Prachuap Khiri Khan, the Chalermprakiat Royal Rain Making Research Center at Pimai, Nakhon Ratchasima, and the Thai Meteorological Department (TMD). A total of 179 records with 57 features were merged and matched by unique date. There are three main parts in this work. Firstly, a decision tree induction algorithm (C4.5) was used to classify the rain status as either rain or no-rain. The overall accuracy of the classification tree reaches 94.41% under five-fold cross-validation. The C4.5 algorithm was also used to classify the rain amount into three classes, no-rain (0-0.1 mm), few-rain (0.1-10 mm), and moderate-rain (>10 mm), for which the overall accuracy of the classification tree reaches 62.57%. Secondly, an ANN was applied to predict the rainfall amount, and the root mean square error (RMSE) was used to measure the training and testing errors of the ANN. The ANN yields a lower RMSE of 0.171 for daily rainfall estimates than for next-day and next-2-day estimation. Thirdly, the ANN and SVM techniques were also used to classify the rain amount into the same three classes of no-rain, few-rain, and moderate-rain. The results reach 68.15% and 69.10% overall accuracy of same-day prediction for the ANN and SVM models, respectively. The obtained results illustrate the comparative predictive power of different methods for rainfall estimation.
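As an illustration of this three-part pipeline, here is a hedged scikit-learn sketch with stand-in models and synthetic placeholder data (CART in place of C4.5, an MLP for the ANN); the dataset shape matches the abstract, but the values, features and tuning are placeholders, not the authors' data.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVC

X = np.random.rand(179, 57)                # placeholder: 179 records, 57 features
rain_mm = np.random.rand(179) * 20         # placeholder daily rainfall amounts
y_binary = (rain_mm > 0.1).astype(int)     # rain vs no-rain
y_three = np.digitize(rain_mm, [0.1, 10])  # no-rain / few-rain / moderate-rain

# (1) classification trees, scored by five-fold cross-validation
print(cross_val_score(DecisionTreeClassifier(), X, y_binary, cv=5).mean())
print(cross_val_score(DecisionTreeClassifier(), X, y_three, cv=5).mean())

# (2) ANN regression of the rainfall amount, measured by RMSE
ann = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000).fit(X, rain_mm)
rmse = np.sqrt(np.mean((ann.predict(X) - rain_mm) ** 2))

# (3) ANN/SVM classification into the same three classes
print(cross_val_score(SVC(), X, y_three, cv=5).mean())
```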

Determination of Volatile Organic Compounds in Human Breath by Optical Fiber Sensing

This work proposes an optical fiber (OF) system for sensing various volatile organic compounds (VOCs) in human breath, as a non-invasive methodology for the diagnosis of some metabolic disorders. The analyzed VOCs are alkanes (i.e., ethane, pentane, heptane, octane, and decane) and aromatic compounds (i.e., benzene, toluene, and styrene). The OF sensor displays high analytical performance: it provides near real-time responses, rapid analysis, and low instrumentation costs, and it exhibits a useful linear range and detection limits; the developed OF sensor is also comparable to a reference methodology (gas chromatography-mass spectrometry) for the eight tested VOCs.
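The abstract does not give the calibration procedure, but a sensor's linear range and detection limit are conventionally characterized as in the sketch below, where the concentrations and responses are made-up placeholders and the 3.3 sigma/slope rule is the common IUPAC-style LOD estimate.

```python
import numpy as np

conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])         # assumed concentrations (ppm)
signal = np.array([0.11, 0.21, 0.40, 0.82, 1.59])  # assumed sensor responses

slope, intercept = np.polyfit(conc, signal, 1)     # linear calibration curve
residuals = signal - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                      # residual std (n-2 dof)
lod = 3.3 * sigma / slope                          # common LOD estimate
print(f"LOD = {lod:.3f} ppm")
```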

Space Vector PWM Simulation for Three-Phase DC/AC Inverter

Space Vector Pulse Width Modulation (SVPWM) is one of the most widely used techniques to generate sinusoidal voltage and current, owing to its simplicity and efficiency with low harmonic distortion. The algorithm is used especially in power electronic applications. This paper describes the simulation of the SVPWM and SPWM algorithms in the MATLAB/Simulink environment. It also implements a closed-loop three-phase DC-AC converter in MATLAB, controlling the amplitude and frequency of its output voltages. A comparison between the SVPWM and SPWM results is also given.
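For reference, the core SVPWM computation can be sketched in a few lines of Python (this mirrors the textbook algorithm, not the paper's Simulink model): locate the sector of the reference voltage vector, then compute the dwell times of the two adjacent active vectors and the zero vectors within one switching period.

```python
import math

def svpwm_times(v_ref, theta, v_dc, ts):
    """Dwell times for reference vector (v_ref, theta) over period ts."""
    m = math.sqrt(3) * v_ref / v_dc              # modulation index
    sector = int(theta // (math.pi / 3)) + 1     # sectors 1..6
    alpha = theta - (sector - 1) * math.pi / 3   # angle inside the sector
    t1 = ts * m * math.sin(math.pi / 3 - alpha)  # first adjacent active vector
    t2 = ts * m * math.sin(alpha)                # second adjacent active vector
    t0 = ts - t1 - t2                            # shared by the zero vectors
    return sector, t1, t2, t0

# assumed example values: 311 V peak reference, 600 V DC link, 100 us period
print(svpwm_times(v_ref=311.0, theta=math.radians(50), v_dc=600.0, ts=1e-4))
```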

WLAN Positioning Based on Joint TOA and RSS Characteristics

WLAN positioning has been addressed by many approaches in the literature, using the characteristics of Received Signal Strength (RSS), Time of Arrival (TOA) or Time Difference of Arrival (TDOA), Angle of Arrival (AOA), and cell ID. Among these, the RSS approach is the simplest to implement because it requires no modification of either access points or client devices, but its accuracy is poor due to physical environments. For the TOA or TDOA approach, the accuracy is quite acceptable, but most studies have had to modify either software or hardware on the existing WLAN infrastructure; the scale of modification ranges from the access card alone up to changes in the WLAN protocol. Hence, TOA or TDOA alone is an unattractive approach for a positioning system. In this paper, a new concept of merging the RSS and TOA positioning techniques is proposed, together with a method to obtain the TOA characteristic for positioning a WLAN user without appending any extra modification to the existing system. The measurement results confirm that the proposed technique using both RSS and TOA characteristics provides better accuracy than using either the RSS or the TOA approach alone.
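A minimal sketch of the joint idea follows (illustrative only, not the authors' algorithm): the distance to each access point is estimated both from RSS via a log-distance path-loss model and from TOA via the speed of light, the two estimates are fused by simple averaging, and the position is solved by nonlinear least squares. The AP coordinates and model constants here are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

APS = np.array([[0.0, 0.0], [20.0, 0.0], [10.0, 15.0]])  # assumed AP positions (m)
C = 3e8  # speed of light, m/s

def dist_from_rss(rss_dbm, p0=-40.0, n=3.0):
    # log-distance model RSS = p0 - 10*n*log10(d); p0 and n are assumptions
    return 10 ** ((p0 - rss_dbm) / (10 * n))

def dist_from_toa(toa_s):
    return C * toa_s

def locate(rss, toa):
    d = 0.5 * (dist_from_rss(rss) + dist_from_toa(toa))  # naive fusion
    res = least_squares(
        lambda p: np.linalg.norm(APS - p, axis=1) - d, x0=APS.mean(axis=0))
    return res.x

print(locate(rss=np.array([-62.0, -70.0, -66.0]),
             toa=np.array([4.2e-8, 6.5e-8, 5.3e-8])))
```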

Investigation of VMAT Algorithms and Dosimetry

Purpose: Planning and dosimetry of different VMAT algorithms (SmartArc, Ergo++, Autobeam) are compared with IMRT for head and neck cancer (HNC) patients. Modelling was performed to rule out the causes of discrepancies between planned and delivered dose. Methods: Five HNC patients previously treated with IMRT were re-planned with SmartArc (SA), Ergo++ and Autobeam. Plans were compared with each other and against IMRT and evaluated using DVHs for PTVs and OARs, delivery time, monitor units (MU) and dosimetric accuracy. Modelling of control point (CP) spacing, leaf-end separation and MLC/aperture shape was performed to rule out causes of discrepancies between planned and delivered doses. Additionally, estimated arc delivery times, overall plan generation times and the effect of CP spacing and number of arcs on plan generation times were recorded. Results: Single-arc SmartArc plans (SA4d) were generally better than IMRT and double-arc plans (SA2Arcs) in terms of homogeneity and target coverage. Double-arc plans seemed to have a positive role in achieving an improved Conformity Index (CI) and better sparing of some organs at risk (OARs) compared to step-and-shoot IMRT (ss-IMRT) and SA4d. Overall, Ergo++ plans achieved the best CI for both PTVs. Dosimetric validation of all VMAT plans without modelling was found to be lower than for ss-IMRT. Total MUs required for delivery were on average 19%, 30%, 10.6% and 6.5% lower than ss-IMRT for SA4d, SA2d (single arc with 2° gantry spacing), SA2Arcs and Autobeam plans, respectively. Autobeam was the most efficient in terms of actual treatment delivery times, whereas Ergo++ plans took the longest to deliver. Conclusion: Overall, SA single-arc plans on average achieved the best target coverage and homogeneity for both PTVs. SA2Arcs plans showed improved CI and sparing of some OARs. Very good dosimetric results were achieved with modelling. Ergo++ plans achieved the best CI. Autobeam resulted in the fastest treatment delivery times.

Fast Dummy Sequence Insertion Method for PAPR Reduction in WiMAX Systems

In the literature, many studies have proposed various methods to reduce the Peak-to-Average Power Ratio (PAPR). Among these, Dummy Sequence Insertion (DSI) is one of the most attractive methods for WiMAX systems because it does not require side information to be transmitted along with the user data. However, conventional DSI methods find the dummy sequence by an iterative procedure that repeats until the PAPR falls under a desired threshold. This causes a significant delay in finding the dummy sequence and also affects the overall performance of WiMAX systems. In this paper, a new DSI-based method is proposed that finds the dummy sequence without the need for an iterative procedure. The fast DSI method can reduce PAPR without either delays or required side information. The simulation results confirm that the proposed method achieves PAPR performance similar to the other methods without any delays. In addition, simulations of a WiMAX system with adaptive modulation are also investigated to assess the use of the proposed method under various fading schemes. The results suggest that WiMAX designers should adopt a new Signal-to-Noise Ratio (SNR) criterion for adaptation.
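For context, the metric being minimized can be computed as in the sketch below (this is the PAPR definition itself, not the proposed DSI method): the ratio of peak to mean instantaneous power of the oversampled time-domain OFDM signal. The subcarrier count and constellation are assumed examples.

```python
import numpy as np

def papr_db(freq_symbols, oversample=4):
    """PAPR in dB of one OFDM symbol given its frequency-domain symbols."""
    n = len(freq_symbols)
    # zero-padded IFFT yields an oversampled time-domain OFDM signal
    x = np.fft.ifft(freq_symbols, n * oversample)
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# e.g. one 256-subcarrier QPSK symbol (sizes/constellation are assumptions)
qpsk = np.random.choice([-1, 1], 256) + 1j * np.random.choice([-1, 1], 256)
print(f"PAPR = {papr_db(qpsk):.2f} dB")
```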

Effect of Network Communication Overhead on the Performance of Adaptive Speculative Locking Protocol

The speculative locking (SL) protocol extends the two-phase locking (2PL) protocol to allow for parallelism among conflicting transactions. The adaptive speculative locking (ASL) protocol provides further enhancements and outperforms SL under most conditions. Neither of these protocols considers the impact of network latency on the performance of distributed database systems. We have studied the performance of the ASL protocol taking communication overhead into account. The results indicate that although system load can counter network latency, latency can still become a bottleneck in many situations. The impact of latency on performance depends on many factors, including the system resources. A flexible discrete event simulator was used as the testbed for this study.
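As a toy illustration of how such a simulator can expose latency effects (this is not the paper's testbed), the sketch below keeps a priority queue of timestamped events and delivers every lock-related message one network latency later.

```python
import heapq

NETWORK_LATENCY = 0.005  # seconds per message; an assumed parameter

events = []  # min-heap of (arrival_time, message)

def send(now, msg):
    # a message sent now arrives at the remote site one latency later
    heapq.heappush(events, (now + NETWORK_LATENCY, msg))

send(0.0, "lock-request txn1 -> siteB")
send(0.001, "lock-grant siteB -> txn1")
while events:
    t, msg = heapq.heappop(events)
    print(f"t={t:.4f}s  {msg}")
```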

Dynamic Routing to Multiple Destinations in IP Networks using Hybrid Genetic Algorithm (DRHGA)

In this paper we propose a novel dynamic least-cost multicast routing protocol using a hybrid genetic algorithm for IP networks. Our protocol finds the multicast tree with minimum cost subject to delay, degree, and bandwidth constraints. The proposed protocol has the following features: (i) a heuristic local search function has been devised and embedded within the normal genetic operations to increase speed and obtain an optimized tree; (ii) it efficiently handles the dynamic situations that arise due to changes in multicast group membership or node/link failures; (iii) two different crossover and mutation probabilities are used to maintain the diversity of solutions and to achieve quick convergence, as sketched below. The simulation results show that our proposed protocol generates dynamic multicast trees with lower cost. Results also show that the proposed algorithm has a better convergence rate, a better dynamic request success rate and less execution time than other existing algorithms. The effects of the degree and delay constraints on the multicast tree have also been analyzed in terms of search success rate.
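The hybrid loop highlighted in features (i) and (iii) might look like the following hedged sketch (a generic stand-in, not the DRHGA code): a phase-dependent pair of crossover/mutation probabilities plus a local-search hook applied to each offspring. The tree encoding, cost function and local search are stubs the caller supplies.

```python
import random

def evolve(pop, cost, local_search, generations=100):
    """pop: list of equal-length 0/1 lists; cost, local_search: user stubs."""
    for g in range(generations):
        # feature (iii): early generations favor diversity, later ones convergence
        pc, pm = (0.9, 0.10) if g < generations // 2 else (0.6, 0.02)
        pop.sort(key=cost)
        nxt = pop[:2]                              # keep the two best (elitism)
        while len(nxt) < len(pop):
            a, b = random.sample(pop[:10], 2)      # parents from the fittest
            child = list(a)
            if random.random() < pc:               # one-point crossover
                cut = random.randrange(1, len(a))
                child = a[:cut] + b[cut:]
            if random.random() < pm:               # bit-flip mutation
                i = random.randrange(len(child))
                child[i] ^= 1
            nxt.append(local_search(child))        # feature (i): heuristic repair
        pop = nxt
    return min(pop, key=cost)
```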

In vivo Histomorphometric and Corrosion Analysis of Ti-Ni-Cr Shape Memory Alloys in Rabbits

A series of Ti-based shape memory alloys with compositions of Ti50Ni49Cr1, Ti50Ni47Cr3 and Ti50Ni45Cr5 were developed by vacuum arc melting under a purified argon atmosphere. The histomorphometric and corrosion evaluation of these Ti-Ni-Cr shape memory alloys was considered in this research work. The alloys were implanted subcutaneously in rabbits for 4, 8 and 12 weeks. The metallic implants were embedded in order to determine the outcome of implantation on the histomorphometric and corrosion behaviour of the Ti-Ni-Cr metallic strips. Encapsulating membrane formation around the alloys was minimal for all materials. Histomorphometric analyses demonstrated that there were no statistically significant differences between the materials. The corrosion rate, also determined in this study, is within the acceptable range. The results showed that the Ti-Ni-Cr alloys were neither cytotoxic nor caused any systemic reaction in the living system in any of the tests performed. Implantation shows good compatibility and the potential for direct use in an in vivo system.

On the Move to Semantic Web Services

Semantic Web services will enable the semi-automatic and automatic annotation, advertisement, discovery, selection, composition, and execution of inter-organization business logic, making the Internet a common global platform where organizations and individuals communicate with each other to carry out various commercial activities and to provide value-added services. There is a growing consensus that Web services alone will not be sufficient to develop valuable solutions, due to the degree of heterogeneity, autonomy, and distribution of the Web. This paper deals with two of the hottest R&D and technology areas currently associated with the Web: Web services and the Semantic Web. It presents the synergies that can be created between Web services and Semantic Web technologies to provide a new generation of e-services.

Context Generation with Image Based Sensors: An Interdisciplinary Enquiry on Technical and Social Issues and their Implications for System Design

Image data holds a large amount of varied context information. However, as of today, these resources remain largely untapped. It is thus the aim of this paper to present a basic technical framework which allows for quick and easy exploitation of context information from image data, especially by non-expert users. Furthermore, the proposed framework is discussed in detail with respect to important social and ethical issues which impose special requirements on system design. Finally, a first sensor prototype is presented which meets the identified requirements. Additionally, the necessary implications for the software and hardware design of the system are discussed, rendering a sensor system which could be regarded as a good, acceptable and justifiable technical solution, thereby enabling the extraction of context information from image data.

Selection of Best Band Combination for Soil Salinity Studies using ETM+ Satellite Images (A Case Study: Neyshaboor Region, Iran)

Soil salinity is one of the main environmental problems affecting extensive areas of the world. Traditional data collection methods are neither adequate for addressing this important environmental problem nor accurate enough for soil studies. Remote sensing data can overcome most of these problems. Although satellite images are commonly used in such studies, there is still a need to find the best calibration between the data and the real situation in each specific area. The Neyshaboor area, in the northeast of Iran, was selected as the field study of this research. Landsat satellite images of this area were used to prepare suitable training samples for processing and classifying the images. A total of 300 locations were selected randomly in the area for soil sampling, and 273 locations were finally retained for further laboratory work and image processing analysis. The electrical conductivity of all samples was measured. Six reflective bands of ETM+ satellite images of the study area taken in 2002 were used for soil salinity classification. The classification was carried out using common algorithms based on the best band compositions, as illustrated below. The results showed that reflective bands 7, 3, 4 and 1 form the best band composition for preparing color composite images. We also found that hybrid classification is a suitable method for identifying and delineating the different salinity classes in the area.
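The abstract does not name its ranking criterion, so purely as an illustration, the sketch below computes the Optimum Index Factor (OIF), a widely used score for choosing the most informative three-band color composite: it rewards high per-band variance and penalizes inter-band correlation.

```python
import numpy as np
from itertools import combinations

def best_oif(bands):
    """bands: array of shape (n_bands, n_pixels); returns (score, combo)."""
    best = None
    for i, j, k in combinations(range(len(bands)), 3):
        stds = bands[[i, j, k]].std(axis=1).sum()      # total variability
        r = np.corrcoef(bands[[i, j, k]])              # 3x3 correlation matrix
        corr = abs(r[0, 1]) + abs(r[0, 2]) + abs(r[1, 2])
        score = stds / corr                            # high = informative combo
        if best is None or score > best[0]:
            best = (score, (i, j, k))
    return best

# e.g. six reflective ETM+ bands flattened to pixels (synthetic stand-in)
print(best_oif(np.random.rand(6, 10000)))
```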

Learning and Evaluating Possibilistic Decision Trees using Information Affinity

This paper investigates the issue of building decision trees from data with imprecise class values, where imprecision is encoded in the form of possibility distributions. The Information Affinity similarity measure is introduced into the well-known gain ratio criterion in order to assess the homogeneity of a set of possibility distributions representing the classes of the instances belonging to a given training partition. For the experimental study, we propose an information affinity based performance criterion, which we use to show the performance of the approach on well-known benchmarks.
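For reference, the Information Affinity between two possibility distributions is commonly formulated (following Jenhani et al.) as one minus a combination of a normalized Manhattan distance and an inconsistency degree; the sketch below assumes equal weights for the two components.

```python
import numpy as np

def information_affinity(pi1, pi2):
    """Affinity between two possibility distributions over the same labels."""
    d = np.abs(pi1 - pi2).mean()             # normalized Manhattan distance
    inc = 1.0 - np.minimum(pi1, pi2).max()   # inconsistency of their conjunction
    return 1.0 - (d + inc) / 2.0             # 1 = identical, 0 = fully conflicting

# two possibility distributions over three class labels
print(information_affinity(np.array([1.0, 0.4, 0.0]),
                           np.array([0.9, 1.0, 0.2])))
```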

Evaluating Complexity – Ethical Challenges in Computational Design Processes

Complexity, as a theoretical background, has made it easier to understand and explain the features and dynamic behavior of various complex systems. As the common theoretical background has confirmed, borrowing terminology for design from the natural sciences has helped to control and understand urban complexity. Phenomena like self-organization, evolution and adaptation are appropriate for describing the formerly inaccessible characteristics of the complex environment in unpredictable bottom-up systems. Increased computing capacity has been a key element in capturing the chaotic nature of these systems. A paradigm shift in urban planning and architectural design has forced us to give up the illusion of total control of the urban environment, and consequently to seek novel methods for steering its development. New methods using dynamic modeling have offered a real option for a more thorough understanding of complexity and urban processes. At best, new approaches may renew design processes so that we get a better grip on the complex world via more flexible processes, support urban environmental diversity, and respond to our needs beyond basic welfare by liberating ourselves from standardized minimalism. A complex system and its features are as such beyond human ethics. Self-organization or evolution is neither good nor bad; their mechanisms are by nature devoid of reason. They are common in urban dynamics as in natural processes. They are features of a complex system, and they cannot be prevented; yet their dynamics can be studied and supported. The paradigm of complexity and the new design approaches have been criticized for a lack of humanity and morality, but the ethical implications of scientific or computational design processes have not been much discussed. It is important to distinguish the (unexciting) ethics of the theory and tools from the ethics of computer-aided processes based on ethical decisions. Urban planning and architecture cannot be based on the survival of the fittest; however, the natural dynamics of the system cannot be impeded on the grounds of being "non-human". In this paper the ethical challenges of using dynamic models are contemplated in light of a few examples of new architecture, dynamic urban models and the literature. It is suggested that ethical challenges in computational design processes could be reframed under the concepts of responsibility and transparency.

A Multi-Criteria Evaluation Incorporating Linguistic Computing for Service Innovation Performance

The growing influence of service industries has prompted greater attention to service operations management. However, service managers often have difficulty articulating the true effects of their service innovation. In particular, the performance evaluation process for service innovation problems generally involves uncertain and imprecise data. This paper presents a 2-tuple fuzzy linguistic computing approach to dealing with heterogeneous information and with the information loss problems that arise during the integration of subjective evaluations. The proposed method, based on a group decision-making scenario to assist business managers in measuring the performance of service innovation, manages the heterogeneity integration processes and effectively avoids information loss.
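For readers unfamiliar with the model, the following minimal sketch shows the 2-tuple linguistic representation of Herrera and Martinez that such approaches build on: a numeric value beta in [0, g] is stored as a label plus a symbolic translation alpha, so averaging evaluations loses no information. The five-label scale is an assumed example.

```python
LABELS = ["very poor", "poor", "fair", "good", "very good"]  # assumed scale

def to_two_tuple(beta):
    """Convert beta in [0, len(LABELS)-1] to a (label, alpha) 2-tuple."""
    i = round(beta)                 # closest linguistic label index
    return LABELS[i], beta - i      # symbolic translation alpha in [-0.5, 0.5)

def aggregate(betas):
    """Average several evaluations without losing the fractional part."""
    return to_two_tuple(sum(betas) / len(betas))

# three evaluators rate a service innovation criterion on the 0..4 scale
print(aggregate([3.0, 2.0, 4.0]))   # -> ('good', 0.0)
```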

Continuity Planning in Supply Chain Networks: Degrees of Freedom and Application in the Risk Management Process

Supply chain networks are frequently hit by unplanned events which lead to disruptions and cause operational and financial consequences. It is neither possible to avoid disruption risk entirely, nor are network members able to prepare for every possible disruptive event. Therefore, continuity planning should be set up to support effective operational responses in supply chain networks in times of emergency. In this research, the network-related degrees of freedom which determine the options for responsive actions are derived from interview data. The findings are then embedded into a common risk management process. The paper provides support for researchers and practitioners in identifying the network-related options for responsive actions and determining the need to improve reaction capabilities.

Transformation of Vocal Characteristics: A Review of Literature

The transformation of vocal characteristics aims at modifying a voice such that the intelligibility of an aphonic voice is increased or the voice of a speaker (the source speaker) is perceived as if another speaker (the target speaker) had uttered it. In this paper, the current state of the art in voice characteristics transformation is reviewed. Special emphasis is placed on voice transformation methodology, and issues in improving the intelligibility and naturalness of the transformed speech are discussed. In particular, it is suggested to use the modulation theory of speech as a basis for research on high-quality voice transformation. This approach allows one to separate the linguistic, expressive, organic and perspective information of speech, based on an analysis of how they are fused when speech is produced. The theory therefore provides the fundamentals not only for manipulating non-linguistic, extra-/paralinguistic and intra-linguistic variables for voice transformation, but also for paving the way to easily transposing existing voice transformation methods to emotion-related voice quality transformation and speaking style transformation. From the perspectives of human speech production and perception, the popular voice transformation techniques are described and classified according to whether their underlying principles derive from the speech production mechanism, the perception mechanism, or both. In addition, the advantages and limitations of voice transformation techniques and the experimental manipulation of vocal cues are discussed through examples from past and present research. Finally, a conclusion and a road map toward more natural voice transformation algorithms are given.

Comparative Approach of Measuring Price Risk on Romanian and International Wheat Market

This paper aims to present the main instruments used in the economic literature for measuring price risk, pointing out the advantages brought by the conditional variance in this respect. The theoretical approach is exemplified by estimating an EGARCH model for wheat price returns on both the Romanian and the international market. To our knowledge, no previous empirical research has been conducted either on price risk measurement for the Romanian markets or using the ARIMA-EGARCH methodology on them. After estimating the corresponding models, the paper compares the estimated conditional variance on the two markets, as sketched below.
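A hedged sketch of such an estimation in Python using the `arch` package follows; the EGARCH(1,1) order with a leverage term, the AR(1) mean equation, and the CSV layout are illustrative assumptions, not the paper's exact specification.

```python
import pandas as pd
from arch import arch_model

# assumed input file: one price series indexed by date
prices = pd.read_csv("wheat_prices.csv", index_col=0)["price"]
returns = 100 * prices.pct_change().dropna()   # percentage price returns

# EGARCH with a leverage term; the mean equation is kept simple (AR(1))
model = arch_model(returns, mean="AR", lags=1, vol="EGARCH", p=1, o=1, q=1)
result = model.fit(disp="off")
print(result.summary())
print(result.conditional_volatility.tail())    # estimated conditional volatility path
```

Fitting the same specification to the Romanian and international return series yields the two conditional variance paths the paper compares.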

Qualitative Parametric Comparison of Load Balancing Algorithms in Parallel and Distributed Computing Environment

Decreasing hardware costs and advances in computer networking technologies have led to increased interest in the use of large-scale parallel and distributed computing systems. One of the biggest issues in such systems is the development of effective techniques/algorithms for distributing the processes/load of a parallel program across multiple hosts to achieve goals such as minimizing execution time, minimizing communication delays, maximizing resource utilization and maximizing throughput. Substantial research using queuing analysis, assuming job arrivals following a Poisson pattern, has shown that in a multi-host system the probability of one host being idle while another host has multiple jobs queued up can be very high. Such imbalances in system load suggest that performance can be improved either by transferring jobs from the currently heavily loaded hosts to the lightly loaded ones or by distributing the load evenly/fairly among the hosts. The algorithms that achieve this goal are known as load balancing algorithms. These algorithms fall into two basic categories: static and dynamic. Whereas static load balancing (SLB) algorithms make decisions about assigning tasks to processors at compile time, based on average estimated values of process execution times and communication delays, dynamic load balancing (DLB) algorithms are adaptive to changing situations and make decisions at run time; the sketch below contrasts the two. The objective of this paper is to identify qualitative parameters for the comparison of these algorithms. In future, this work can be extended to develop an experimental environment to study these load balancing algorithms quantitatively, based on the comparative parameters.
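The contrast between the two categories can be made concrete with a toy sketch (not from the paper): a static round-robin policy fixes the assignment pattern in advance, while a dynamic least-loaded policy consults current queue lengths at run time. Host names and job costs are arbitrary placeholders.

```python
from itertools import cycle

HOSTS = ["h0", "h1", "h2"]
JOBS = [5, 1, 1, 7, 2]             # arbitrary job costs

# SLB: the assignment pattern is fixed in advance, blind to current load
rr = cycle(HOSTS)
static_load = {h: 0 for h in HOSTS}
for cost in JOBS:
    static_load[next(rr)] += cost

# DLB: each job goes to whichever host is least loaded right now
dynamic_load = {h: 0 for h in HOSTS}
for cost in JOBS:
    h = min(dynamic_load, key=dynamic_load.get)
    dynamic_load[h] += cost

print("static :", static_load)     # {'h0': 12, 'h1': 3, 'h2': 1}
print("dynamic:", dynamic_load)    # {'h0': 5, 'h1': 8, 'h2': 3} -- more even
```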