FEM Analysis of the Interaction between a Piezoresistive Tactile Sensor and Biological Tissues

This paper presents a finite element model and analysis of the interaction between a piezoresistive tactile sensor and biological tissues. The tactile sensor is proposed for use in minimally invasive surgery to deliver tactile information about biological tissues to surgeons. The proposed sensor measures the relative hardness of soft contact objects as well as the contact force. Silicone rubbers were used as phantoms of biological tissues. Finite element analysis of the silicone rubbers and of the mechanical structure of the sensor was performed in the COMSOL Multiphysics (v3.4) environment. The simulation results verify that the sensor can be used to differentiate between different kinds of silicone rubber materials.

Evaluating Complexity – Ethical Challenges in Computational Design Processes

Complexity, as a theoretical background, has made it easier to understand and explain the features and dynamic behavior of various complex systems. As this common theoretical background has confirmed, borrowing terminology for design from the natural sciences has helped to control and understand urban complexity. Phenomena like self-organization, evolution, and adaptation are appropriate for describing the formerly inaccessible characteristics of the complex environment in unpredictable bottom-up systems. Increased computing capacity has been a key element in capturing the chaotic nature of these systems. A paradigm shift in urban planning and architectural design has forced us to give up the illusion of total control in the urban environment, and consequently to seek novel methods for steering its development. New methods using dynamic modeling have offered a real option for a more thorough understanding of complexity and urban processes. At best, new approaches may renew design processes so that we get a better grip on the complex world via more flexible processes, support urban environmental diversity, and respond to our needs beyond basic welfare by liberating ourselves from standardized minimalism. A complex system and its features are as such beyond human ethics. Self-organization or evolution is neither good nor bad; their mechanisms are by nature devoid of reason. They are common both in urban dynamics and in natural processes. They are features of a complex system, and they cannot be prevented; yet their dynamics can be studied and supported. The paradigm of complexity and the new design approaches have been criticized for a lack of humanity and morality, but the ethical implications of scientific or computational design processes have not been much discussed. It is important to distinguish the (unexciting) ethics of the theory and tools from the ethics of computer-aided processes based on ethical decisions. Urban planning and architecture cannot be based on the survival of the fittest; however, the natural dynamics of the system cannot be impeded on the grounds of being "non-human". In this paper, the ethical challenges of using dynamic models are contemplated in light of a few examples of new architecture, dynamic urban models, and the literature. It is suggested that the ethical challenges in computational design processes could be reframed under the concepts of responsibility and transparency.

Automatically Driven Vector for Guidewire Segmentation in 2D and Biplane Fluoroscopy

It has been shown in the literature that the segmentation of endovascular tools in fluoroscopy images can be performed accurately, either automatically or with minimal user intervention, using known modern techniques; however, no clinical implementation exists so far because the computational time requirements of such technology have not yet been met. A classical segmentation scheme is composed of edge enhancement filtering, line detection, and segmentation. A new method is presented here that consists of a vector that propagates in the image to track an edge as it advances. The filtering is performed progressively along the projected path of the vector, whose orientation allows for oriented edge detection, so only a minimal area of the image is ever filtered. Such an algorithm is fast to compute and can be implemented in real-time applications. It was tested on medical fluoroscopy images from an endovascular cerebral intervention. Experiments showed that the 2D tracking was limited to guidewires without intersection crosspoints, while the 3D (biplane) implementation was able to cope with such planar difficulties.
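
A minimal sketch of this kind of vector-driven tracking is given below, assuming a grayscale image array in which the guidewire is darker than the background; the seed point, window size, and smoothing weights are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def track_guidewire(image, seed, direction, steps=200, step_len=2.0, window=7):
        """Propagate a tracking vector along a dark curvilinear structure.
        image: 2D grayscale array; seed: (row, col) start point on the wire;
        direction: initial unit vector (drow, dcol). Only a small window ahead
        of the tip is ever examined, so a minimal part of the image is processed."""
        path = [np.asarray(seed, dtype=float)]
        d = np.asarray(direction, dtype=float)
        d /= np.linalg.norm(d)
        half = window // 2
        for _ in range(steps):
            tip = path[-1] + step_len * d                 # predicted next position
            r, c = np.round(tip).astype(int)
            if not (half <= r < image.shape[0] - half and
                    half <= c < image.shape[1] - half):
                break                                     # reached the image border
            roi = image[r - half:r + half + 1, c - half:c + half + 1]
            dr, dc = np.unravel_index(np.argmin(roi), roi.shape)   # darkest pixel ahead
            new_tip = np.array([r - half + dr, c - half + dc], dtype=float)
            step = new_tip - path[-1]
            if np.linalg.norm(step) < 1e-6:
                break                                     # no progress, stop
            # smooth the orientation so the vector cannot turn abruptly
            d = 0.7 * d + 0.3 * step / np.linalg.norm(step)
            d /= np.linalg.norm(d)
            path.append(new_tip)
        return np.array(path)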

ISTER (Immune System - Tumor Efficiency Rate): An Important Key for Planning in Radiotherapy Facilities

The use of the oncologic index ISTER allows for more effective planning of radiotherapy facilities in hospitals. Any change in the radiotherapy treatment due to unexpected stops may be accommodated by recalculating the doses for the new treatment duration while keeping the optimal prognosis. The results obtained with a simulation model on millions of patients allow the definition of optimal success-probability algorithms.

Design of the Production Line Based On RFID through 3D Modeling

Radio-frequency identification (RFID) has emerged as a beneficial means, conforming to GS1 standards, of providing solutions in the manufacturing area. It competes with other automated identification technologies, e.g. barcodes and smart cards, with regard to high-speed scanning, reliability, and accuracy. The purpose of this study is to improve a production line's performance by implementing an RFID system in the manufacturing area, using 3D modeling in the program Cinema 4D R13, which provides clear graphical scenes for users to portray their applications. Finally, with regard to improving system performance, the study shows how RFID proves to be a well-suited technology, in comparison with the barcode scanner, for handling different kinds of raw materials in the production line based on a logical process.

Thermal Diffusivity Measurement of Cadmium Sulphide Nanoparticles Prepared by γ-Radiation Technique

In this study, the thermal lens (TL) technique was applied to study the effect of size on the thermal diffusivity of a cadmium sulphide (CdS) nanofluid prepared by the γ-radiation method and containing particles of different sizes. In the TL experimental setup, a diode laser of wavelength 514 nm and an intensity-stabilized He-Ne laser were used as the excitation source and the probe beam, respectively. The experimental results showed that the thermal diffusivity value of the CdS nanofluid increases as the particle size increases.
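
The abstract does not state the working relation; in the standard analysis of the thermal lens transient, the diffusivity is usually obtained from the characteristic time constant of the TL signal (the symbols below are the conventional ones, assumed here):

    D = w^2 / (4 * t_c)

where w is the excitation beam radius at the sample and t_c is the characteristic thermal time constant extracted by fitting the time-resolved TL signal.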

A Multi-Criteria Evaluation Incorporating Linguistic Computing for Service Innovation Performance

The growing influence of service industries has prompted greater attention to service operations management. However, service managers often have difficulty articulating the veritable effects of their service innovation. In particular, the performance evaluation of service innovation generally involves uncertain and imprecise data. This paper presents a 2-tuple fuzzy linguistic computing approach to dealing with heterogeneous information and with the information loss that occurs while subjective evaluations are being integrated. The proposed method, based on a group decision-making scenario, assists business managers in measuring the performance of service innovation; it handles the heterogeneity integration process and effectively avoids information loss.
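
The abstract names the 2-tuple fuzzy linguistic representation; the sketch below shows how that standard model is usually applied. The label set, evaluator ratings, and weights are illustrative assumptions, not values from the paper.

    # Standard 2-tuple linguistic model (Herrera & Martinez): a value beta in [0, g]
    # is represented as (s_i, alpha) with i = round(beta) and alpha = beta - i,
    # so aggregating assessments loses no information.

    LABELS = ["very poor", "poor", "fair", "good", "very good"]   # assumed label set, g = 4

    def to_two_tuple(beta):
        i = int(round(beta))
        return LABELS[i], beta - i            # (linguistic label, symbolic translation)

    def weighted_two_tuple(assessments, weights):
        """assessments: label indices given by the evaluators; weights sum to 1."""
        beta = sum(w * a for a, w in zip(assessments, weights))
        return to_two_tuple(beta)

    # Example: three evaluators rate an innovation 3 ("good"), 2 ("fair"), 4 ("very good")
    print(weighted_two_tuple([3, 2, 4], [0.5, 0.3, 0.2]))   # -> ('good', ~-0.1)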

Agent-based Simulation for Blood Glucose Control in Diabetic Patients

This paper employs a new approach to regulate the blood glucose level of a type I diabetic patient under intensive insulin treatment. The closed-loop control scheme incorporates expert knowledge about the treatment by using reinforcement learning to maintain a normoglycemic average of 80 mg/dl and a normal free plasma insulin concentration from a severe initial state. The insulin delivery rate is obtained off-line using the Q-learning algorithm, without requiring an explicit model of the environment dynamics. Implementing the insulin delivery rate therefore requires only simple function evaluation and minimal online computation. Controller performance is assessed in terms of its ability to reject the effect of meal disturbances and to overcome the variability in glucose-insulin dynamics from patient to patient. Computer simulations are used to evaluate the effectiveness of the proposed technique and to show its superiority in controlling hyperglycemia over other existing algorithms.
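
A minimal sketch of the tabular Q-learning update the abstract refers to is shown below; the state/action discretization, reward, learning parameters, and patient simulator interface are assumptions for illustration, not the paper's design.

    import numpy as np

    N_STATES, N_ACTIONS = 20, 10        # assumed discretization of glucose / insulin rate
    alpha, gamma, epsilon = 0.1, 0.95, 0.1
    Q = np.zeros((N_STATES, N_ACTIONS))

    def reward(glucose_mg_dl):
        # penalize deviation from the normoglycemic target of 80 mg/dl
        return -abs(glucose_mg_dl - 80.0)

    def q_learning_step(state, env_step):
        """One off-line training step; env_step is a patient simulator returning
        (next_glucose, next_state) for a chosen insulin-rate action."""
        if np.random.rand() < epsilon:              # epsilon-greedy exploration
            action = np.random.randint(N_ACTIONS)
        else:
            action = int(np.argmax(Q[state]))
        glucose, next_state = env_step(state, action)
        # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s',a')
        target = reward(glucose) + gamma * Q[next_state].max()
        Q[state, action] += alpha * (target - Q[state, action])
        return next_state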

Stress Analysis for Two Fitted Thin-Walled Cylinders with High Angular Velocity

In this paper, the stresses and strains in two rotating thin-walled cylinders fitted together with an initial interference (overlap) are computed. The stress for different values of the initial interference is also calculated. The problem is first considered without rotation; the angular velocity is then increased from 0 to 50,000 rev/min, and the stress is calculated at each stage. The important point is that when the stress becomes very small in magnitude, the angular velocity is critical and the two cylinders separate. This critical speed, i.e. the speed of separation, is calculated at each step.
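
The governing relations are not given in the abstract; for orientation, the centrifugal hoop stress in a thin rotating ring (a common thin-walled approximation, assumed here) is

    σ_θ = ρ ω² r²

where ρ is the density, r the mean radius, and ω the angular speed; in an interference fit, the cylinders separate at the critical speed ω_crit at which the contact pressure produced by the initial interference drops to zero.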

Speaker Identification using Neural Networks

The speech signal conveys information about the identity of the speaker. The area of speaker identification is concerned with extracting the identity of the person speaking an utterance. As speech interaction with computers becomes more pervasive in activities such as telephone use, financial transactions, and information retrieval from speech databases, the utility of automatically identifying a speaker based solely on vocal characteristics grows. This paper focuses on text-dependent speaker identification, which deals with detecting a particular speaker from a known population. The system prompts the user to provide a speech utterance and identifies the user by comparing the codebook of that utterance with those stored in the database, listing the speakers most likely to have given it. The speech signal is recorded for N speakers, and features are then extracted. Feature extraction is done by means of LPC coefficients, calculation of the AMDF, and the DFT. A neural network is trained by applying these features as input parameters, and the features are stored in templates for further comparison. The features of the speaker to be identified are extracted and compared with the stored templates using the backpropagation algorithm: the input to the trained network is the extracted features of the speaker to be identified, and the output corresponds to the speaker. The network adjusts its weights and the best match is found to identify the speaker. The number of epochs required to reach the target decides the network performance.
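
As a small illustration of one of the features named above, a minimal AMDF (average magnitude difference function) computation is sketched below; the frame and lag range are assumptions for the sketch, not the paper's settings.

    import numpy as np

    def amdf(frame, max_lag):
        """Average Magnitude Difference Function of one speech frame:
        AMDF(k) = (1/(N-k)) * sum_n |x[n] - x[n+k]|, for k = 1..max_lag
        (max_lag must be smaller than the frame length).
        Pronounced dips mark the pitch period of voiced speech."""
        x = np.asarray(frame, dtype=float)
        N = len(x)
        return np.array([np.mean(np.abs(x[:N - k] - x[k:]))
                         for k in range(1, max_lag + 1)])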

Multi-Agent Systems Applied in the Modeling and Simulation of Biological Problems: A Case Study in Protein Folding

The multi-agent system approach has proven to be an effective and appropriate abstraction level for constructing whole models of a diversity of biological problems, integrating aspects found both in "micro" and "macro" approaches to modeling this type of phenomena. Taking these considerations into account, this paper presents the important computational characteristics to be gathered into a novel bioinformatics framework built upon a multi-agent architecture. The version of the tool presented herein allows studying and exploring complex problems belonging principally to structural biology, such as protein folding. The bioinformatics framework is used as a virtual laboratory to explore a minimalist model of protein folding as a test case. In order to show the laboratory concept of the platform, as well as its flexibility and adaptability, we studied the folding of two particular sequences, one 45-mer and one 64-mer, both described by an HP model (only hydrophobic and polar residues) on a coarse-grained 2D square lattice. As discussed in this work, these two sequences were chosen as stress tests for the platform, in order to determine which tools should be created or improved to meet the needs of the computation and analysis of a given difficult sequence. The underlying philosophy is that the continuous study of sequences itself yields important additions to the platform that improve its efficiency over time, as is demonstrated herein.
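
For readers unfamiliar with the model named above, a minimal sketch of the energy evaluation in the 2D HP lattice model follows: the energy is minus the number of hydrophobic-hydrophobic contacts between residues that are adjacent on the lattice but not consecutive in the chain. The example sequence and conformation are made up for illustration only.

    def hp_energy(sequence, coords):
        """sequence: string of 'H'/'P'; coords: list of (x, y) lattice sites,
        one per residue, forming a self-avoiding walk on the 2D square lattice.
        Energy = -1 per H-H topological contact (adjacent on the lattice,
        not consecutive along the chain)."""
        positions = {c: i for i, c in enumerate(coords)}
        energy = 0
        for i, (x, y) in enumerate(coords):
            if sequence[i] != 'H':
                continue
            for nb in ((x + 1, y), (x, y + 1)):        # each contact counted once
                j = positions.get(nb)
                if j is not None and sequence[j] == 'H' and abs(i - j) > 1:
                    energy -= 1
        return energy

    # toy example: a 4-mer folded into a square has one H-H contact
    print(hp_energy("HPPH", [(0, 0), (1, 0), (1, 1), (0, 1)]))   # -> -1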

Reconfigurable Autonomous Mini Robot Design using CPLDs

This paper explains a project-based learning method in which autonomous mini-robots are developed for research, education, and entertainment purposes. In the case of remote systems, wireless sensors are deployed in critical areas; they collect data at specific time intervals, send the data to a central wireless node, and, based on certain preferred information, decisions are made to turn a switch or control unit on or off. Such information transfers hardly amount to more than a few bytes, and hence low data rates suffice for such implementations. As a robot is a multidisciplinary platform, the interfacing issues involved are discussed in this paper. The paper is mainly focused on power supply, grounding, and decoupling issues.

Augmenting Use Case View for Modeling

Mathematical, graphical and intuitive models are often constructed in the development process of computational systems. The Unified Modeling Language (UML) is one of the most popular modeling languages used by practicing software engineers. This paper critically examines UML models and suggests an augmented use case view with the addition of new constructs for modeling software. It also shows how a use case diagram can be enhanced. The improved modeling constructs are presented with examples for clarifying important design and implementation issues.

A Novel Estimation Method for Integer Frequency Offset in Wireless OFDM Systems

Ren et al. presented an efficient carrier frequency offset (CFO) estimation method for orthogonal frequency division multiplexing (OFDM), which has an estimation range as large as the bandwidth of the OFDM signal and achieves high accuracy without any constraint on the structure of the training sequence. However, its detection probability for the integer frequency offset (IFO) varies rapidly with changes in the fractional frequency offset (FFO). In this paper, we first analyze Ren's method and define two criteria suitable for detection of the IFO. Then, we propose a novel method for IFO estimation based on the maximum-likelihood (ML) principle and the detection criteria defined in this paper. The simulation results demonstrate that the proposed method outperforms Ren's method in terms of the IFO detection probability, irrespective of the FFO value.
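
The paper's specific detection criteria are not given in the abstract; the following is only a generic sketch of the kind of ML-style integer-offset search involved, in which candidate integer offsets are applied to a local training-symbol replica and the candidate with the largest correlation magnitude is chosen. All names and the metric are illustrative assumptions.

    import numpy as np

    def estimate_ifo(received_sym, training_sym, search_range):
        """Generic ML-style integer-CFO search (illustrative, not the paper's
        detection criteria): try each candidate integer offset d, frequency
        shift the local training replica by d subcarrier spacings, and keep
        the candidate with the largest correlation magnitude."""
        N = len(received_sym)
        n = np.arange(N)
        best_d, best_metric = 0, -np.inf
        for d in range(-search_range, search_range + 1):
            replica = training_sym * np.exp(2j * np.pi * d * n / N)
            metric = abs(np.vdot(replica, received_sym))
            if metric > best_metric:
                best_d, best_metric = d, metric
        return best_d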

Influence of the Entropic Parameter on the Flow Geometry and Morphology

The necessity of updating numerical model inputs, because of geometrical and resistive variations in rivers subject to solid transport phenomena, requires detailed control and monitoring activities. The human and financial resources these activities demand move the research towards the development of expeditious methodologies able to evaluate the outflows through the measurement of more easily acquirable quantities. Recent studies have highlighted the dependence of the entropic parameter on the kinematic and geometrical flow conditions, showing a meaningful variability according to the section shape, dimension, and slope. Such dependences, even if not yet well defined, could reduce the difficulties encountered during field activities as well as the data elaboration time. On the basis of this evidence, the relationships between the entropic parameter and the geometrical and resistive quantities, obtained through a large and detailed laboratory investigation of steady free-surface flows under conditions of macro and intermediate homogeneous roughness, are analyzed and discussed.
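
For context (the abstract itself does not define it), the entropic parameter M is commonly the one appearing in Chiu's entropy-based velocity distribution, in which the velocity profile and the mean-to-maximum velocity ratio are usually written as

    u / u_max = (1/M) * ln[ 1 + (e^M - 1) * ξ / ξ_max ]
    ū / u_max = e^M / (e^M - 1) - 1/M

where ξ is the normalized coordinate along which the velocity grows from the bed to its maximum; estimating M from easily measured quantities is what makes the expeditious discharge assessment mentioned above possible.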

Turbulent Mixing and its Effects on Thermal Fatigue in Nuclear Reactors

The turbulent mixing of coolant streams of different temperature and density can cause severe temperature fluctuations in the piping systems of nuclear reactors. Through periodic contraction cycles, these conditions lead to thermal fatigue. The resulting aging effect prompts investigation into how the mixing of flows across a sharp temperature/density interface evolves. To study the fundamental turbulent mixing phenomena in the presence of density gradients, isokinetic (shear-free) mixing experiments are performed in a square channel at Reynolds numbers ranging from 2,500 to 60,000. Sucrose is used to create the density difference. A Wire Mesh Sensor (WMS) is used to determine the concentration map of the flow in the cross section. The mean interface width as a function of velocity, density difference, and distance from the mixing point is analyzed using traditional methods chosen for the purposes of atmospheric/oceanic stratification analyses. A definition of the mixing layer thickness more appropriate to thermal fatigue and based on mixedness is devised. This definition shows that the thermal fatigue risk assessed using simple mixing layer growth can be misleading, and why an approach that separates the effects of large-scale (turbulent) and small-scale (molecular) mixing is necessary.

Performance Analysis of Parallel Client-Server Model Versus Parallel Mobile Agent Model

The mobile agent paradigm has motivated the creation of a new methodology for parallel computing. We introduce a methodology for the creation of parallel applications on a network. The proposed mobile-agent parallel processing framework uses multiple Java mobile agents, each of which can travel to a specified machine in the network to perform its tasks. We also introduce the concept of a master agent, a Java object capable of implementing a particular task of the target application; the master agent dynamically assigns tasks to the mobile agents. We have developed and tested a prototype application, Mobile Agent Based Parallel Computing. Boosted by the inherited benefits of using Java and mobile agents, the proposed methodology breaks the barriers between environments and could potentially exploit, in a parallel manner, all the available computational resources on the network. This paper elaborates on the performance issues of mobile agents for parallel computing.
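
The framework described above is built on Java mobile agents; as a rough analogue only, the sketch below illustrates the master/worker task-assignment idea using Python's standard library, with stand-in local workers instead of agents that migrate between machines. All names and the toy task are illustrative assumptions.

    from concurrent.futures import ProcessPoolExecutor

    def agent_task(chunk):
        """Work a mobile agent would carry to a remote machine; here it simply
        runs locally in a separate process as a stand-in."""
        return sum(x * x for x in chunk)

    def master_agent(data, n_agents=4):
        """The master splits the target application's task, dynamically assigns
        one piece to each (stand-in) agent, and merges the partial results."""
        chunks = [data[i::n_agents] for i in range(n_agents)]
        with ProcessPoolExecutor(max_workers=n_agents) as pool:
            return sum(pool.map(agent_task, chunks))

    if __name__ == "__main__":
        print(master_agent(range(1_000_000)))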

Effects of Microwave Heating on Biogas Production, Chemical Oxygen Demand and Volatile Solids Solubilization of Food Residues

This paper presents the results of a preliminary investigation of microwave (MW) irradiation pretreatments on the anaerobic digestion of food residues using biochemical methane potential (BMP) assays. Low-solids systems with a total solids (TS) content ranging from 5.0-10.0% were analyzed. The inoculum to bulk mass of substrates to water ratio was 1:2:2 (mass basis). The experimental conditions for the pretreatments were as follows: a control (no MW irradiation), two runs with MW irradiation for 15 and 30 minutes at 320 W, and another two runs with MW irradiation at 528 W for 30 and 60 minutes. The cumulative biogas production was 6.3 L and 8.7 L for the 15 min/320 W and 30 min/320 W MW irradiation conditions, respectively, and 10.5 L and 11.4 L for 30 min/528 W and 60 min/528 W, respectively, compared with 5.8 L of biogas for the control. Both an increase in the exposure time and in the power of MW irradiation increased the rate and yield of biogas. Single-factor ANOVA tests (p

The Use of Artificial Neural Network in Option Pricing: The Case of S&P 100 Index Options

Due to the increasing and varied risks that economic units face, derivative instruments have gained substantial importance, and trading volumes of derivatives have reached very significant levels. In parallel with these high trading volumes, researchers have developed many different pricing models, some parametric and some nonparametric. In this study, the aim is to analyse the success of artificial neural networks in pricing options, using S&P 100 index options data. Previous studies generally cover data on European-type call options; this study includes not only European call options but also American call and put options and European put options. Three data sets are used to build three different ANN models. One includes only data directly observed from the economic environment, i.e. strike price, spot price, interest rate, maturity, and type of contract. The others include an extra input that is not observable data but a parameter, i.e. volatility. With these detailed data, the performance of the ANN along the put/call, American/European, and moneyness dimensions is analyzed, and whether the use of volatility as an input improves prediction performance is examined. The most striking results revealed by the study are that the ANN performs better when pricing call options than put options, and that the use of the volatility parameter as an input does not improve performance.
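
A minimal sketch of this kind of ANN option-pricing setup (not the authors' model) is shown below: the inputs are the observable variables named in the abstract, with volatility as an optional extra feature. The toy data rows, architecture, and column ordering are assumptions.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    # X columns: spot, strike, time to maturity (years), interest rate,
    # contract type (1 = call, 0 = put), implied volatility (optional extra input)
    X = np.array([[560.0, 550.0, 0.25, 0.05, 1, 0.18],
                  [560.0, 570.0, 0.25, 0.05, 0, 0.21]])   # toy rows, not real data
    y = np.array([22.4, 18.9])                            # observed option premiums (toy)

    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16, 16),
                                       max_iter=5000, random_state=0))
    model.fit(X, y)
    print(model.predict(X))   # in-sample check on the toy data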

Applications of Trigonometric Measures of Fuzzy Entropy to Geometry

In the literature on fuzzy measures, there exist many well-known parametric and non-parametric measures, each with its own merits and limitations. Our main emphasis, however, is on the applications of these measures to a variety of disciplines. To extend the scope of application of fuzzy measures to geometry, some special fuzzy measures are needed. In this communication, we introduce two new fuzzy measures involving trigonometric functions and provide their applications for obtaining basic results already existing in the literature of geometry.
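
The two measures themselves are not given in the abstract; purely as an illustration of what a trigonometric fuzzy entropy looks like (this is not the paper's pair of measures), one admissible form satisfying the usual axioms is sketched below.

    import math

    def trig_fuzzy_entropy(memberships):
        """Illustrative trigonometric fuzzy entropy (NOT the paper's measures):
        H(A) = (1/n) * sum_i sin(pi * mu_A(x_i)).
        It is 0 for crisp sets (mu in {0, 1}), maximal when every mu = 0.5,
        and symmetric under complementation mu -> 1 - mu."""
        mus = list(memberships)
        return sum(math.sin(math.pi * mu) for mu in mus) / len(mus)

    print(trig_fuzzy_entropy([0.0, 1.0]))   # crisp set -> ~0 (up to float round-off)
    print(trig_fuzzy_entropy([0.5, 0.5]))   # maximally fuzzy -> 1.0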