Breast Cancer Treatment Evaluation based on Mammographic and Echographic Distance Computing

Accurate assessment of the primary tumor's response to treatment is important in the management of breast cancer. This paper introduces a new set of treatment evaluation indicators for breast cancer cases based on the computation of three well-known metrics: the Euclidean, Hamming, and Levenshtein distances. The distance principles are applied to pairs of mammograms and/or echograms recorded before and after treatment, providing a reference point for judging the evolution of the studied carcinoma. The resulting numerical values are transparent and indicate not only the evolution or involution of the tumor under treatment, but also give a quantitative measure of the benefit of the selected treatment method.
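As a sketch of how such pairwise distances can be computed, the following fragment implements the three metrics over toy intensity vectors; the pre-/post-treatment vectors and the binarization threshold are illustrative assumptions, not the paper's actual image representation.

```python
import numpy as np

def euclidean(a, b):
    # Root of the summed squared pixel differences.
    return float(np.sqrt(np.sum((a - b) ** 2)))

def hamming(a, b, threshold=128):
    # Binarize both images at a threshold and count differing positions.
    return int(np.sum((a >= threshold) != (b >= threshold)))

def levenshtein(s, t):
    # Classic dynamic-programming edit distance between two sequences.
    m, n = len(s), len(t)
    d = np.zeros((m + 1, n + 1), dtype=int)
    d[:, 0] = np.arange(m + 1)
    d[0, :] = np.arange(n + 1)
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            d[i, j] = min(d[i - 1, j] + 1,          # deletion
                          d[i, j - 1] + 1,          # insertion
                          d[i - 1, j - 1] + cost)   # substitution
    return int(d[m, n])

# Hypothetical pre-/post-treatment pixel rows (not the paper's data).
pre = np.array([10, 200, 180, 30], dtype=float)
post = np.array([12, 150, 100, 25], dtype=float)
change = euclidean(pre, post)
```

A small distance between the pre- and post-treatment images would then be read as little change, a large one as marked evolution or involution, to be interpreted together with the clinical context.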

Distributed Architecture of an Autonomous Four Rotor Mini-Rotorcraft based on Multi-Agent System

In this paper, we present a recently implemented approach that allows dynamic systems to plan their actions, taking into account changes in environment perception, and to control their execution when uncertainty and incomplete knowledge are the major characteristics of the situated environment [1],[2],[3],[4]. The distributed control architecture has three modules, and the approach is related to hierarchical planning: the plan produced by the planner is further refined at the control layer, which in turn supervises its execution by a functional level. We propose a new intelligent distributed architecture constituted by a multi-agent subsystem with agents for sensing, for the interpretation and representation of the environment [9], for dynamic localization, and for action. We tested this distributed architecture with a dynamic system in a known environment. The task of the autonomous four-rotor mini-rotorcraft is described by primitive actions. The distributed control based on the multi-agent system is in charge of achieving each task in the best possible way, taking into account the context and sensory feedback.

Extended Study on Removing Gaussian Noise in Mechanical Engineering Drawing Images using Median Filters

In this paper, an extended study is performed on the effect of different factors on the quality of vector data, building on a previous study. For the noise factor, one kind of noise that appears in document images, namely Gaussian noise, is studied, whereas the previous study involved only salt-and-pepper noise. High and low noise levels are studied. For the noise cleaning methods, algorithms not covered in the previous study are used, namely median filters and their variants. For the vectorization factor, one of the best available commercial raster-to-vector software packages, namely VPstudio, is used to convert raster images into vector format. The performance of line detection is judged using an objective performance evaluation method. The output of the performance evaluation is then analyzed statistically to highlight the factors that affect vector quality.
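The denoising step can be sketched as follows, assuming SciPy's `median_filter` and a synthetic constant-intensity image as a stand-in for a scanned drawing (the study's actual images and filter variants are not reproduced here):

```python
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)

# Synthetic stand-in for a drawing image: a constant background.
clean = np.full((64, 64), 100.0)
noisy = clean + rng.normal(0.0, 20.0, clean.shape)  # additive Gaussian noise

# 3x3 median filter, the simplest of the studied variants.
denoised = median_filter(noisy, size=3)

mse_noisy = float(np.mean((noisy - clean) ** 2))
mse_denoised = float(np.mean((denoised - clean) ** 2))
```

On such a flat image the filtered MSE drops well below the noisy MSE; on real drawings the filter must also preserve thin lines, which is why the study evaluates the resulting vector quality rather than pixel error alone.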

An Implementation of EURORADIO Protocol for ERTMS Systems

The European Rail Traffic Management System (ERTMS) is the European reference for interoperable and safer signaling systems to efficiently manage running trains. Once implemented, it allows trains to cross intra-European national borders seamlessly. ERTMS defines a secure communication protocol, EURORADIO, based on open communication networks. Its RadioInfill function can improve the reaction of the signaling system to changes in line conditions, avoiding unnecessary braking; its advantages in terms of power saving and travel time have been analyzed. In this paper, a software implementation of the EURORADIO protocol with RadioInfill for ERTMS Level 1 using GSM-R is illustrated as part of the SR-Secure Italian project. In this building-block architecture, the EURORADIO layers communicate with each other through modular Application Programming Interfaces. Secure coding rules and the railway industry requirements specified by the EN 50128 standard have been respected. The proposed implementation has successfully passed conformity tests and has been tested on a computer-based simulator.

King Bhumibol Adulyadej’s “Learn Wisely” Concept: An Application to Instructional Design

This study concerns an application of King Bhumibol Adulyadej's "Learn Wisely" (LW) concept to the instructional design and management process at the Faculty of Education, Suan Sunahdha Rajabhat University. The concept suggests four strategies for true learning. Related literature and significant LW methods in teaching and learning are reviewed and then applied in designing a pedagogy learning module. The design was implemented in three classrooms with a total of 115 sophomore student teachers. After one semester of managing and adjusting the process by instructors and experts, using data collected from minutes, assessment of learning management, and the satisfaction and learning achievement of the students, it was found that the effective SSRU model of the LW instructional method comprises five steps.

Climate Change Finger Prints in Mountainous Upper Euphrates Basin

Climate change leading to global warming affects the earth in many different ways, such as weather (temperature, precipitation, humidity, and other weather parameters), snow coverage and ice melting, sea level rise, hydrological cycles, water quality, agriculture, forests, ecosystems, and health. Hydrology and water resources are among the areas most affected by climate change, and regions where the majority of runoff consists of snowmelt are especially sensitive. The first step of climate change studies is to establish trends of significant climate variables, including precipitation, temperature, and flow data, to detect any potential climate change impacts that have already occurred. Two popular non-parametric trend analysis methods, Mann-Kendall and Spearman's Rho, were applied to the Upper Euphrates Basin (Turkey) to detect trends in precipitation, temperatures (maximum, minimum, and average), and streamflow.
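A minimal sketch of the two tests is given below: the Mann-Kendall statistic with its normal approximation (no tie correction, a simplification) alongside SciPy's Spearman's Rho, applied to a hypothetical annual flow series rather than the basin's data.

```python
import numpy as np
from scipy.stats import norm, spearmanr

def mann_kendall_s(x):
    # S = sum of the signs of all pairwise forward differences.
    x = np.asarray(x, dtype=float)
    s = 0
    for i in range(len(x) - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    return int(s)

def mann_kendall_z(x):
    # Normal approximation of the test statistic (no tie correction).
    n = len(x)
    s = mann_kendall_s(x)
    var = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / np.sqrt(var)
    if s < 0:
        return (s + 1) / np.sqrt(var)
    return 0.0

# Hypothetical annual series with a downward trend (illustrative only).
years = np.arange(1970, 2000)
flow = 50.0 - 0.3 * (years - 1970) + np.random.default_rng(1).normal(0, 1, 30)

z = mann_kendall_z(flow)           # negative z suggests a decreasing trend
p_mk = 2 * norm.sf(abs(z))         # two-sided p-value
rho, p_rho = spearmanr(years, flow)  # Spearman's Rho companion test
```

In a trend study, both tests would be run on each station's precipitation, temperature, and streamflow series, with significance judged at a chosen confidence level.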

Osteogenesis by Dextran Coating on and among Fibers of a Polyvinyl Formal Sponge

A scaffold is necessary for tooth regeneration because of its three-dimensional geometry. For restoration of a defect, the scaffold must be prepared in the shape of the defect. Sponges made from polyvinyl alcohol with formalin cross-linking (PVF sponges) have been used as scaffolds for bone formation in vivo. To induce osteogenesis within the sponge, methods of growing rat bone marrow cells (rBMCs) among the fiber structures in the sponge were considered. Storage of rBMCs among the fibers of a sponge coated with dextran (10 kDa) was attempted. After seeding rBMCs onto a PVF sponge immersed in dextran solution at 2 g/dl concentration, osteogenesis was recognized in the subcutaneously implanted PVF sponge used as a scaffold in vivo. The level of osteocalcin was 25.28±5.71 ng/scaffold and that of Ca was 129.20±19.69 µg/scaffold. These values were significantly higher than those in sponges without dextran coating (p

Board Members' Financial Education and Firms' Performance: Empirical Evidence for Bucharest Stock Exchange Companies

After the accounting scandals and the financial crisis, regulators have stressed the need for more financial experts on boards. Several studies conducted in countries with developed capital markets report positive effects of board financial competencies. As each country offers a different context and specific institutional factors, this paper addresses the subject in the context of Romania. The Romanian capital market offers an interesting research field because of the heterogeneity of listed firms. After analyzing board members' education based on public information posted on listed companies' websites and in their annual reports, we found a positive association between the proportion of board members holding a postgraduate degree in a financial field and market-based performance measured by Tobin's Q. We also found that the proportion of board members holding degrees in financial fields is higher in larger firms and in firms with more concentrated ownership.
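For illustration only, the sketch below uses one common simplified proxy for Tobin's Q (the abstract does not state the exact formula used) and a toy board-composition ratio; all names and figures are hypothetical.

```python
def tobins_q(market_cap, total_debt, total_assets):
    # Common simplified proxy: (market value of equity + debt) / book assets.
    return (market_cap + total_debt) / total_assets

def finance_share(board):
    # board: list of booleans, True if the member holds a postgraduate
    # degree in a financial field (the coding itself is hypothetical).
    return sum(board) / len(board)

q = tobins_q(market_cap=800.0, total_debt=200.0, total_assets=1000.0)
share = finance_share([True, False, False, True])
```

The study's association would then be tested by regressing Q on the finance share together with the usual firm-level controls.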

Pentachlorophenol Removal via Adsorption and Biodegradation

The removal of PCP by a system combining biodegradation by biofilm with adsorption was investigated. Three studies were conducted, employing batch tests, a sequencing batch reactor (SBR), and a continuous biofilm activated carbon column reactor (BACCOR). The combined biofilm-GAC batch process removed about 30% more PCP than GAC adsorption alone. In the SBR processes, both the suspended and attached biomass could remove more than 90% of the PCP after acclimatisation. BACCOR was able to remove more than 98% of PCP-Na at concentrations ranging from 10 to 100 mg/L, at empty bed contact times (EBCT) ranging from 0.75 to 4 hours. Pure and mixed cultures from BACCOR were tested for the use of PCP as sole carbon and energy source under aerobic conditions. The isolates were able to degrade up to 42% of the PCP in pure cultures under aerobic conditions. However, mixed cultures were found to degrade more than 99% of the PCP, indicating interdependence of species.

Interstate Comparison of Environmental Performance using Stochastic Frontier Analysis: The United States Case Study

The environmental performance of the U.S. states is investigated for the period 1990–2007 using Stochastic Frontier Analysis (SFA). SFA accounts for both an efficiency measure and the stochastic noise affecting the frontier. The frontier is formed using indicators of GDP, energy consumption, population, and CO2 emissions. For comparability, all indicators are expressed as ratios to totals. Statistical information from the U.S. Energy Information Administration is used. The results reveal bell-shaped dynamics of the environmental efficiency scores: the average efficiency score rises from 97.6% in 1990 to 99.6% in 1999, and then falls to 98.4% in 2007. The main factor is an insufficient decrease in the rate of growth of CO2 emissions relative to the growth of GDP, population, and energy consumption. Data for 2008, following the research period, allow for the assumption that the environmental performance of the U.S. states has improved in recent years.

Segmentation of Lungs from CT Scan Images for Early Diagnosis of Lung Cancer

Segmentation is an important step in medical image analysis and classification for radiological evaluation or computer-aided diagnosis. Computer-aided diagnosis (CAD) of lung CT generally first segments the area of interest (the lungs) and then analyzes the obtained area separately for nodule detection in order to diagnose the disease. For normal lungs, segmentation can be performed by exploiting the excellent contrast between air and the surrounding tissues. However, this approach fails when a lung is affected by high-density pathology. Dense pathologies are present in approximately a fifth of clinical scans, and for computer analysis such as the detection and quantification of abnormal areas it is vital that the entire lung region of the image be provided intact, with no part present in the original image eradicated. In this paper we propose a lung segmentation technique which accurately segments the lung parenchyma from lung CT scan images. The algorithm was tested against 25 datasets of different patients received from Akron University, USA and Aga Khan Medical University, Karachi, Pakistan.
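The contrast-based baseline described above can be sketched as a threshold-plus-morphology pipeline; the -400 HU threshold and the synthetic slice below are illustrative assumptions, and this baseline is precisely the approach that breaks down on dense pathology, which the paper's technique addresses.

```python
import numpy as np
from scipy import ndimage

def segment_lungs(ct_slice, air_threshold=-400):
    # Contrast-based baseline (not the paper's algorithm): threshold the
    # slice in Hounsfield units, clean speckle, keep the air regions that
    # do not touch the image border, and fill internal holes.
    air = ct_slice < air_threshold
    air = ndimage.binary_opening(air, iterations=2)
    labels, n = ndimage.label(air)
    if n == 0:
        return np.zeros_like(air)
    border = (set(labels[0, :]) | set(labels[-1, :]) |
              set(labels[:, 0]) | set(labels[:, -1]))
    keep = [l for l in range(1, n + 1) if l not in border]
    return ndimage.binary_fill_holes(np.isin(labels, keep))

# Hypothetical slice: tissue (+40 HU) with two air-filled "lungs" (-800 HU).
slice_ = np.full((100, 100), 40.0)
slice_[20:80, 15:45] = -800.0
slice_[20:80, 55:85] = -800.0
mask = segment_lungs(slice_)
```

A dense pathology would raise part of the lung above the threshold and be excluded from this mask, which is exactly the failure the proposed technique avoids.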

Price Quoting Method for Contract Manufacturer

This is an applied research study proposing a price quotation method for a contract electronics manufacturer. The company had a precise price quoting method, but that method could not provide a result as quickly as customers required, reducing the company's ability to compete in this kind of business. The cause of the long quotation process was therefore analyzed. Many product features are demanded by customers, and an examination of the routine processes showed that a large fraction of the quoting time was spent estimating production time, which in turn determines the manufacturing or production cost. The historical data of products, including types, number of components, assembly method, and assembly times, were then used to identify the key components affecting production time. A price quoting model was then proposed. The implementation of the proposed model was able to reduce quoting time remarkably while retaining acceptable precision.
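As a hedged sketch of the modeling idea, a production-time estimate can be fitted to historical records by least squares; the single feature (component count) and the numbers below are invented for illustration, since the study's actual data and model form are not given in the abstract.

```python
import numpy as np

# Hypothetical historical records: (number of components, assembly minutes).
components = np.array([10, 25, 40, 60, 80], dtype=float)
minutes = np.array([12, 30, 47, 70, 93], dtype=float)

# Fit time ~ a * components + b by ordinary least squares.
A = np.vstack([components, np.ones_like(components)]).T
(a, b), *_ = np.linalg.lstsq(A, minutes, rcond=None)

def quote_minutes(n_components):
    # Quick production-time estimate feeding the price quotation.
    return a * n_components + b
```

In practice the model would also encode product type and assembly method, the other key factors identified in the study.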

Realignment of f-actin Cytoskeleton in Osteocytes after Mechanical Loading

F-actin fibrils form the cytoskeleton of osteocytes. They react in a dynamic manner to mechanical loading, strengthening and repositioning themselves to reinforce the cell's structure. We hypothesize that f-actin is temporarily disrupted after loading and repolymerizes in a new orientation to oppose the applied load. In vitro studies were conducted to determine f-actin disruption after varying the mechanical stimulus parameters that are known to affect bone formation. Results indicate that the f-actin cytoskeleton is disrupted in vitro as a function of the applied mechanical stimulus parameters and that the f-actin bundles reassemble after loading-induced disruption within 3 minutes of the cessation of loading. The disruption of the f-actin cytoskeleton depends on the magnitude of stretch, the number of loading cycles, the frequency, the insertion of rest between loading cycles, and extracellular calcium. In vivo studies also demonstrate disruption of the f-actin cytoskeleton in cells embedded in the bone matrix immediately after mechanical loading. These studies suggest that adaptation of the f-actin fiber bundles of the cytoskeleton in response to applied loads occurs by disruption and subsequent repolymerization.

Self-evolving Neural Networks Based On PSO and JPSO Algorithms

A self-evolution algorithm for optimizing neural networks using a combination of PSO and JPSO is proposed. The algorithm optimizes both the network topology and the parameters simultaneously, with the aim of achieving the desired accuracy with less complicated networks. The performance of the proposed approach is compared with that of conventional back-propagation networks on several synthetic functions, with better results in the case of the former. The proposed algorithm is also applied to a slope stability problem to estimate the critical factor of safety. Based on the results obtained, the proposed self-evolving network produced a better estimate of the critical safety factor than the conventional BPN.
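A minimal standard PSO is sketched below, shown minimizing a synthetic sphere function; the paper's hybrid PSO/JPSO additionally evolves the network topology, which this sketch omits, and all parameter values are illustrative.

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=200, seed=0):
    # Standard global-best PSO minimizing f over R^dim.
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))    # particle positions
    v = np.zeros_like(x)                          # particle velocities
    pbest = x.copy()                              # personal bests
    pbest_f = np.array([f(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()            # global best
    w, c1, c2 = 0.7, 1.5, 1.5                     # inertia, cognitive, social
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, float(f(g))

best, best_f = pso(lambda p: np.sum(p ** 2), dim=3)  # sphere test function
```

In the self-evolving network, the position vector would encode both connection weights and topology choices, with JPSO handling the discrete part.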

Effect of Temperature on the Performance of Multi-Stage Distillation

The tray/multi-tray distillation process has been investigated in great detail over the last decade by many teams, such as Jubran et al. [1], Adhikari et al. [2], Mowla et al. [3], Shatat et al. [4], and Fath [5], to name a few. A significant amount of work and effort was spent on the modeling and/or simulation of specific distillation hardware designs. In this work, we have focused our efforts on investigating and gathering experimental data on several engineering and design variables to quantify their influence on the yield of the multi-tray distillation process. Our goal is to generate experimental performance data to bridge some existing gaps in the design, engineering, optimization, and theoretical modeling aspects of the multi-tray distillation process.

A Novel Approach of Power Transformer Diagnostic Using 3D FEM Parametrical Model

This paper deals with a novel approach to power transformer diagnostics. The approach identifies the exact location and extent of a fault in the transformer and helps reduce the operating costs related to handling a faulty transformer, its disassembly, and repair. Its advantage is the possibility of simulating a healthy transformer as well as all the faults which can occur in a transformer during operation, without disassembly, which is very expensive in practice. The approach is based on obtaining the frequency-dependent impedance of the transformer by sweep frequency response analysis measurements and by 3D FE parametric modeling of the fault in the transformer. The parameters of the 3D FE model are the position and extent of the axial short circuit. By comparing the frequency-dependent impedances of the parametric models with the measured ones, the location and extent of the fault are identified. The approach was tested on a real transformer and showed close agreement between the real fault and the simulated one.
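The comparison step can be sketched as picking, from a family of parametric-model impedance curves, the one closest to the measured curve in the least-squares sense; the curves below are synthetic placeholders, not FE results, and the single fault-position parameter stands in for the model's position-and-extent pair.

```python
import numpy as np

# Hypothetical frequency-dependent impedance curves: one measured on the
# faulty transformer, several produced by parametric models indexed by a
# fault position 0..2. Values are synthetic placeholders.
freqs = np.logspace(2, 6, 50)                       # 100 Hz .. 1 MHz
simulated = {pos: np.log10(freqs) * (1 + 0.1 * pos) for pos in range(3)}
measured = np.log10(freqs) * 1.21                   # closest to pos = 2

def best_match(measured, simulated):
    # Pick the parametric model whose curve minimizes the squared error.
    return min(simulated, key=lambda pos: np.sum((simulated[pos] - measured) ** 2))

fault_position = best_match(measured, simulated)
```

In the actual approach the candidate curves come from the 3D FE parametric model swept over fault position and extent.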

Inter-frame Collusion Attack in SS-N Video Watermarking System

Video watermarking is usually treated as the watermarking of a set of still images. In the frame-by-frame watermarking approach, each video frame is seen as a single watermarked image, so collusion attacks are more critical in video watermarking. If the same or a redundant watermark is embedded in every frame of a video, the watermark can be estimated and then removed by the watermark estimation remodulation (WER) attack. Conversely, if uncorrelated watermarks are used in every frame, these watermarks can be washed out by frame temporal filtering (FTF). The switching watermark system, or so-called SS-N system, has better performance against WER and FTF attacks: for each frame, the watermark is randomly picked from a finite pool of watermark patterns. The SS-N system is first surveyed, and then a new collusion attack against it is proposed, using a new algorithm for separating video frames based on their watermark pattern. N sets are thus built, each containing the frames carrying the same watermark. Then, by applying the WER attack within each set, the N different watermark patterns are estimated and subsequently removed.
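The frame-separation idea can be sketched as follows; for simplicity this toy version assumes the pattern pool is known and assigns each frame by correlation, whereas the proposed attack must separate frames blindly, without access to the patterns.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical SS-N setup: N = 3 watermark patterns, 30 flattened frames,
# each frame = host noise + one pattern at a small embedding strength.
N, n_frames, dim = 3, 30, 256
patterns = rng.standard_normal((N, dim))
labels = rng.integers(0, N, n_frames)        # which pattern each frame carries
frames = rng.standard_normal((n_frames, dim)) * 0.2 + patterns[labels]

def group_frames(frames, patterns):
    # Assign each frame to the pattern with maximal correlation, building
    # the N sets from which per-set WER estimation would then proceed.
    corr = frames @ patterns.T
    return corr.argmax(axis=1)

assigned = group_frames(frames, patterns)
```

Once the N sets are formed, averaging the frames within each set yields a watermark estimate that can be remodulated and subtracted, as in the WER attack.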

Study and Design of Patient Flow at the Medicine Department of a University Hospital

Most, if not all, public hospitals in Thailand have encountered a common problem: increasing demand for medical services. The increasing number of patients places great strain on the hospital's services, causing overcrowding, overloaded working hours, staff fatigue, medical errors, and long waiting times. This research studied the characteristics of the operational processes of the medical care services at the medicine department of a large public university hospital. The research focuses on details regarding methods, procedures, processes, resources, and time management across the overall process. A simulation model is used as a tool to analyze the impact of various improvement strategies.
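A toy discrete-event model of the kind used in such studies is sketched below: an M/M/c clinic with exponential inter-arrival and consultation times; all rates and counts are invented for illustration, not the hospital's measured data.

```python
import heapq
import random

def simulate_clinic(n_patients=500, arrival_rate=1 / 6.0,
                    service_rate=1 / 5.0, n_doctors=2, seed=42):
    # Exponential inter-arrivals (mean 6 min) and consultations (mean 5 min),
    # n_doctors servers, earliest-free-doctor assignment.
    rng = random.Random(seed)
    t = 0.0
    doctors_free_at = [0.0] * n_doctors   # time at which each doctor is free
    heapq.heapify(doctors_free_at)
    total_wait = 0.0
    for _ in range(n_patients):
        t += rng.expovariate(arrival_rate)           # next patient arrives
        free_at = heapq.heappop(doctors_free_at)     # earliest-free doctor
        start = max(t, free_at)
        total_wait += start - t                      # time spent queueing
        heapq.heappush(doctors_free_at, start + rng.expovariate(service_rate))
    return total_wait / n_patients                   # mean wait in minutes

mean_wait = simulate_clinic()
```

Improvement strategies (more staff, different scheduling) can then be compared by rerunning the model with changed parameters, which is the role the simulation plays in the study.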

Benchmarking: Performance on ALPS and Formosa Clusters

This paper presents the benchmarking results and performance evaluation of different clusters built at the National Center for High-Performance Computing in Taiwan. The performance of the processor, memory subsystem, and interconnect is a critical factor in the overall performance of high-performance computing platforms. The evaluation compares different system architectures and software platforms. Most supercomputers use HPL to benchmark their system performance, in accordance with the requirements of the TOP500 list. In this paper we consider the system memory access factors that affect benchmark performance, such as processor and memory performance. We hope this work will provide useful information for the future development and construction of cluster systems.

Influence of IMV on Space Station

To study the impact of inter-module ventilation (IMV) on the space station, a Computational Fluid Dynamics (CFD) model under the influence of IMV, together with the mathematical model, boundary conditions, and calculation method, is first established to analyze the influence of IMV on cabin air flow characteristics and velocity distribution; an integrated overall thermal mathematical model of the space station is then used to assess the impact of IMV on thermal management. The results show that IMV has a significant influence on the cabin air flow: an IMV flow rate within a certain range can effectively improve the air velocity distribution in the cabin, while an excessive flow rate may degrade it. IMV also affects the heat distribution among the different modules of the space station, and thus its thermal management; the use of IMV can effectively maintain the temperature levels of the different modules and help the space station dissipate its waste heat.