Breast Skin-Line Estimation and Breast Segmentation in Mammograms using Fast-Marching Method

Breast skin-line estimation and breast segmentation are important pre-processing steps in mammogram image processing and in computer-aided diagnosis of breast cancer. Limiting the area to be processed to a specific target region in an image increases the accuracy and efficiency of processing algorithms. In this paper we present a new algorithm for skin-line estimation and breast segmentation using the fast marching method. Fast marching is a partial-differential-equation-based numerical technique for tracking the evolution of interfaces. We introduce some modifications to the traditional fast marching method, specifically to improve the accuracy of skin-line estimation and breast tissue segmentation. The proposed modifications ensure that the evolving front stops near the desired boundary. We evaluated the performance of the algorithm on 100 mammogram images taken from the mini-MIAS database. The experimental results indicate that the algorithm covers 98.6% of the ground-truth breast region, and the accuracy of the segmentation is 99.1%. The algorithm is also capable of partially extracting the nipple when it is visible in the profile.
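The core fast-marching idea, a front whose arrival time grows from a seed and slows where a speed map is small, can be sketched with a Dijkstra-style first-order approximation. This is a generic illustration, not the authors' modified algorithm; the toy speed map and seed are invented.

```python
import heapq

def fast_march(speed, seed):
    """Dijkstra-style first-order approximation of fast marching:
    arrival time T grows from the seed, and the front slows (large T
    increments) where the speed map is small, e.g. near an intensity
    edge such as a breast skin-line."""
    rows, cols = len(speed), len(speed[0])
    INF = float("inf")
    T = [[INF] * cols for _ in range(rows)]
    T[seed[0]][seed[1]] = 0.0
    heap = [(0.0, seed)]
    while heap:
        t, (r, c) = heapq.heappop(heap)
        if t > T[r][c]:
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and speed[nr][nc] > 0:
                cand = t + 1.0 / speed[nr][nc]  # |grad T| ~ 1/F
                if cand < T[nr][nc]:
                    T[nr][nc] = cand
                    heapq.heappush(heap, (cand, (nr, nc)))
    return T

# Toy speed map: high inside the "breast", near zero at the "skin-line"
# column, so the evolving front effectively stops at the boundary.
speed = [[1.0, 1.0, 0.01, 1.0],
         [1.0, 1.0, 0.01, 1.0],
         [1.0, 1.0, 0.01, 1.0]]
T = fast_march(speed, (1, 0))
```

Arrival times stay small inside the high-speed region and jump sharply across the low-speed column, which is the behaviour a stopping criterion can exploit.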

Model to Support Synchronous and Asynchronous Activities in the Learning Process with an Adaptive Hypermedia System

In blended learning environments, the Internet can be combined with other technologies. The aim of this research was to design, introduce and validate a model to support synchronous and asynchronous activities by managing content domains in an Adaptive Hypermedia System (AHS). The application is based on information recovery techniques, clustering algorithms and adaptation rules to adjust the user's model to contents and objects of study. This system was applied to blended learning in higher education. The research strategy used was the case study method. Empirical studies were carried out on courses at two universities to validate the model. The results of this research show that the model had a positive effect on the learning process. The students indicated that the synchronous and asynchronous scenario is a good option, as it involves a combination of work with the lecturer and the AHS. In addition, they gave positive ratings to the system and stated that the contents were adapted to each user profile.

Problem Solving Techniques with an Extensive Computational Network and Their Application in Educational Software

Knowledge bases are basic components of expert systems and intelligent computational programs. They provide the knowledge and facts that support deduction, computation and control. Therefore, the research and development of models for knowledge representation play an important role in computer science, especially in artificial intelligence and intelligent educational software. In this paper, an extensive deduction computational model is proposed for designing knowledge bases whose attributes can take real or functional values. The system can also solve problems based on these knowledge bases. Moreover, the models and algorithms are applied to produce educational software for automatically solving alternating-current problems and sets of equations.

Immobilization of Aspergillus awamori 1-8 for Subsequent Pectinase Production

The overall objective of this research is a strain-improvement technology for efficient pectinase production. A novel cell cultivation technology based on the immobilization of fungal cells was studied in long-term continuous fermentations. Immobilization was achieved using a new carrier material for adsorption of the cultures, applied here for the first time to the immobilization of microorganisms. The effects of various nitrogen and carbon nutrition conditions on the biosynthesis of pectolytic enzymes in the Aspergillus awamori 1-8 strain were studied. The proposed cultivation technology, together with optimization of the media components for pectinase overproduction, increased pectinase productivity in Aspergillus awamori 1-8 by 7 to 8 times. The proposed technology can also be applied successfully to the production of major industrial enzymes such as α-amylase, protease and collagenase.

Building Gabor Filters from Retinal Responses

Starting from a biologically inspired framework, Gabor filters were built up from retinal filters via LMSE algorithms. A subset of retinal filter kernels was chosen to form a particular Gabor filter as a weighted sum. One-dimensional optimization approaches were shown to be inappropriate for the problem. All model parameters were fixed by biological or image-processing constraints. Detailed analysis of the optimization procedure led to the introduction of a minimization constraint. Finally, quantization of the weighting factors was investigated. This resulted in an optimized cascaded structure for a Gabor filter bank implementation with lower computational cost.
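The weighted-sum idea can be sketched in one dimension: approximate a Gabor kernel by a least-mean-square-error (LMSE) fit over a basis of centre-surround "retinal" kernels. The difference-of-Gaussians basis, kernel widths and centre positions below are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np

def gabor(x, sigma, freq):
    # 1-D even Gabor: Gaussian envelope times cosine carrier
    return np.exp(-x**2 / (2 * sigma**2)) * np.cos(2 * np.pi * freq * x)

def gaussian(x, sigma):
    return np.exp(-x**2 / (2 * sigma**2))

x = np.linspace(-8, 8, 161)
target = gabor(x, sigma=2.0, freq=0.25)

# "Retinal" basis: centre-surround difference-of-Gaussians kernels at
# several positions (positions and widths here are illustrative).
centres = np.arange(-6, 7, 1.0)
basis = np.stack([gaussian(x - c, 0.8) - 0.5 * gaussian(x - c, 1.6)
                  for c in centres], axis=1)

# LMSE fit of the weights in a single linear least-squares solve.
w, *_ = np.linalg.lstsq(basis, target, rcond=None)
approx = basis @ w
rmse = np.sqrt(np.mean((approx - target) ** 2))
```

The single linear solve replaces iterative one-dimensional searches, which is consistent with the abstract's observation that 1-D optimization approaches were inappropriate.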

Enhanced Coagulation of Disinfection By-Products Precursors in Porsuk Water Resource, Eskisehir

Natural organic matter (NOM) is a heterogeneous mixture of organic compounds that enters the water environment from animal and plant remains and from domestic and industrial wastes. Research has shown that NOM is a likely precursor of disinfection by-products (DBPs). Chlorine is very commonly used for disinfection; when NOM and chlorine react, trihalomethanes (THMs) and haloacetic acids (HAAs), which are carcinogenic to humans, are produced. The aim of this study is to investigate NOM removal by enhanced coagulation from the drinking water source of Eskisehir, which is supplied from the Porsuk Dam. In recent years the Porsuk Dam water has become highly polluted, and the NOM concentration is therefore increasing. Enhanced coagulation was evaluated by measuring dissolved organic carbon (DOC), UV absorbance at 254 nm (UV254), and different trihalomethane formation potential (THMFP) tests. Jar-test experiments showed that enhanced coagulation can remove NOM from water with an efficiency of about 40-50%. The optimum coagulant type and dosages were determined using FeCl3 and alum.

Interoperability in Component Based Software Development

Interoperability is the ability of information systems to operate in conjunction with each other, encompassing communication protocols, hardware, software, application and data-compatibility layers. There has been considerable work in industry on the development of component interoperability models, such as CORBA, (D)COM and JavaBeans. These models are intended to reduce the complexity of software development and to facilitate the reuse of off-the-shelf components. The focus of these models is syntactic interface specification, component packaging, inter-component communication, and bindings to a runtime environment. What these models lack is a consideration of architectural concerns: specifying systems of communicating components, explicitly representing loci of component interaction, and exploiting architectural styles that provide well-understood global design solutions. The development of complex business applications is now focused on the assembly of components available on a local area network or on the Internet. These components must be located and identified in terms of available services and communication protocols before any request is made. The first part of the article introduces the basic concepts of components and middleware, the following sections describe the different up-to-date models of communication and interaction, and the last section shows how the different models can communicate among themselves.

Tsunami Modelling using the Well-Balanced Scheme

A well-balanced numerical scheme based on stationary waves for shallow water flows with arbitrary topography was introduced by Thanh et al. [18]. The scheme was constructed so that it maintains equilibrium states, and tests indicate that it is stable and fast. Applying the well-balanced scheme to the one-dimensional shallow water equations, we study the early propagation of shock waves towards the Phuket coast in Southern Thailand during a hypothetical tsunami. The initial tsunami wave is generated in the deep ocean with a strength equal to that of the 2004 Indonesian tsunami.

Extension of a Smart Piezoelectric Ceramic Rod

This paper presents an exact solution and a finite element method (FEM) for a piezoceramic rod under static load. The cylindrical rod is made from polarized ceramics (piezoceramics) with axial poling. The lateral surface of the rod is traction-free and unelectroded. The two end faces are under a uniform normal traction. Electrically, the two end faces are electroded with a circuit between the electrodes that can be switched on or off. Two cases, open-circuit and short-circuit electrodes, are considered. Finally, a finite element model is used to compare the results with the exact solution. The study uses the ABAQUS (v6.7) software to derive the finite element model of the ceramic rod.

Action Recognition in Video Sequences using a Mealy Machine

In this paper the use of sequential machines is proposed for recognizing actions taken by objects detected by a general tracking algorithm. The system can deal with the uncertainty inherent in medium-level vision data. For this purpose, the input data are fuzzified. Moreover, this transformation allows the data to be managed independently of the chosen tracking application and enables characteristics of the analyzed scenario to be added. The representation of actions by means of an automaton and the generation of the input symbols for the finite automaton, depending on the object and action being compared, are described. The output of the comparison process between an object and an action is a numerical value that represents the degree of membership of the object to the action. This value is computed according to how similar the object and the action are. The work concludes with the application of the proposed technique to identifying the behavior of vehicles in road traffic scenes.
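The combination of fuzzified input symbols and an automaton that outputs a membership value can be sketched as follows. The states, membership breakpoints and the "vehicle halts" action here are invented for illustration; the paper's automata and symbol-generation scheme are more elaborate.

```python
def fuzzify_speed(v):
    """Map a raw speed measurement to fuzzy memberships of the two
    input symbols used by the automaton (triangular memberships;
    the 5.0 breakpoint is an illustrative assumption)."""
    stopped = max(0.0, min(1.0, (5.0 - v) / 5.0))
    return {"stopped": stopped, "moving": 1.0 - stopped}

# Toy Mealy-style machine for a hypothetical "vehicle halts" action:
# (state, input symbol) -> next state
TRANS = {("start", "moving"): "start",
         ("start", "stopped"): "halted",
         ("halted", "stopped"): "halted",
         ("halted", "moving"): "start"}

def action_membership(speeds):
    """Run the automaton over a speed track; the output is the degree
    of membership of the object to the action, taken here as the max
    over time of the 'stopped' membership while in the halted state."""
    state, degree = "start", 0.0
    for v in speeds:
        memberships = fuzzify_speed(v)
        symbol = max(memberships, key=memberships.get)
        state = TRANS[(state, symbol)]
        if state == "halted":
            degree = max(degree, memberships["stopped"])
    return degree
```

A track that ends with a near-zero speed yields a high membership to the "halts" action, while a track that never slows yields zero.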

Spreading Dynamics of a Viral Infection in a Complex Network

We report a computational study of the spreading dynamics of a viral infection in a complex (scale-free) network. The final epidemic size distribution (FESD) was found to be unimodal or bimodal depending on the value of the basic reproductive number R0. The FESDs occurred on time-scales long enough for intermediate-time epidemic size distributions (IESDs) to be important for control measures. The usefulness of R0 for deciding on the timeliness and intensity of control measures was found to be limited by the multimodal nature of the IESDs and by its inability to inform on the speed at which the infection spreads through the population. A reduction of the transmission probability at the hubs of the scale-free network decreased the occurrence of the larger-sized epidemic events of the multimodal distributions. For effective epidemic control, an early reduction in transmission at the index cell and its neighbors was essential.
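The kind of spreading process studied can be sketched with a discrete-time SIR simulation on a preferential-attachment (scale-free) graph. The graph generator, network size and transmission probability below are illustrative assumptions, not the paper's model.

```python
import random

def barabasi_albert(n, m, rng):
    """Preferential-attachment (scale-free) graph: each new node
    attaches to up to m existing nodes chosen proportionally to degree
    (duplicate target picks are collapsed, so occasionally fewer)."""
    targets = list(range(m))
    repeated = []                       # node list weighted by degree
    edges = {i: set() for i in range(n)}
    for new in range(m, n):
        for t in set(targets):
            edges[new].add(t)
            edges[t].add(new)
            repeated.extend([new, t])
        targets = [rng.choice(repeated) for _ in range(m)]
    return edges

def sir_epidemic(edges, p_transmit, rng, index_case=0):
    """Discrete-time SIR spread (one step infectious); returns the
    final epidemic size, i.e. the total number ever infected."""
    infected, recovered = {index_case}, set()
    while infected:
        new_infected = set()
        for node in infected:
            for nb in edges[node]:
                if nb not in infected and nb not in recovered:
                    if rng.random() < p_transmit:
                        new_infected.add(nb)
        recovered |= infected
        infected = new_infected - recovered
    return len(recovered)

rng = random.Random(42)
g = barabasi_albert(200, 3, rng)
size = sir_epidemic(g, p_transmit=0.3, rng=rng)
```

Repeating the simulation many times and histogramming `size` gives an empirical final-epidemic-size distribution of the kind the abstract describes; lowering `p_transmit` only at high-degree nodes mimics the hub intervention.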

Adsorption of Crystal Violet onto BTEA- and CTMA-bentonite from Aqueous Solutions

CTMA-bentonite and BTEA-bentonite were prepared from Na-bentonite by cation exchange with cetyltrimethylammonium (CTMA) and benzyltriethylammonium (BTEA) ions. The products were characterized by XRD and IR techniques. The d001 spacing values of CTMA-bentonite and BTEA-bentonite at 100% cation exchange capacity are 7.54 Å and 3.50 Å larger, respectively, than that of Na-bentonite. The IR spectra showed that the intensities of the OH stretching and bending vibrations of the two organoclays decreased greatly compared to untreated Na-bentonite. Batch experiments were carried out at 303 K, 318 K and 333 K to obtain the sorption isotherms of crystal violet onto the two organoclays. The results show that the sorption isotherm data are well described by the Freundlich model. The kinetic data for the two organoclays fit the pseudo-second-order model well. The adsorption capacity of CTMA-bentonite was found to be higher than that of BTEA-bentonite. Thermodynamic parameters such as the changes in free energy (ΔG°), enthalpy (ΔH°) and entropy (ΔS°) were also evaluated. The overall adsorption of crystal violet onto the two organoclays was a spontaneous, endothermic physisorption process. CTMA-bentonite and BTEA-bentonite could be employed as low-cost alternatives to activated carbon in wastewater treatment for the removal of color originating from textile dyes.
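Fitting the Freundlich model is typically done on the linearized form log qe = log Kf + (1/n) log Ce. The sketch below uses synthetic data with illustrative Kf and n values, not the paper's measurements.

```python
import numpy as np

# Synthetic equilibrium data following qe = Kf * Ce**(1/n) with a
# little multiplicative noise (Kf_true, n_true are illustrative).
rng = np.random.default_rng(0)
Kf_true, n_true = 4.0, 2.5
Ce = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])   # mg/L
qe = Kf_true * Ce ** (1 / n_true) * (1 + 0.02 * rng.standard_normal(Ce.size))

# Linearized Freundlich: log qe = log Kf + (1/n) log Ce,
# so a straight-line fit in log-log space recovers both parameters.
slope, intercept = np.polyfit(np.log10(Ce), np.log10(qe), 1)
n_fit = 1 / slope
Kf_fit = 10 ** intercept
```

A high coefficient of determination for this straight line is what "well described by the Freundlich model" amounts to in practice.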

Surface Roughness Optimization in End Milling Operation with Damper Inserted End Milling Cutters

This paper presents a study of the application of Taguchi design to optimize surface quality in a damper-inserted end milling operation. Maintaining good surface quality usually involves additional manufacturing cost or loss of productivity. The Taguchi design is an efficient and effective experimental method in which a response variable can be optimized, given various factors, using fewer resources than a factorial design. This study included spindle speed, feed rate and depth of cut as control factors, and the use of different tools of the same specification, which introduced tool-condition and dimensional variability. An orthogonal array L9(3^4) was used; ANOVA analyses were carried out to identify the significant factors affecting surface roughness, and the optimal cutting combination was determined by seeking the best surface roughness (response) and signal-to-noise ratio. Finally, confirmation tests verified that the Taguchi design was successful in optimizing the milling parameters for surface roughness.
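For a smaller-the-better response such as surface roughness, the standard Taguchi signal-to-noise ratio is SN = -10 log10(mean of y^2). The roughness readings below are hypothetical, not the paper's data.

```python
import math

def sn_smaller_is_better(measurements):
    """Taguchi signal-to-noise ratio for a smaller-the-better
    response (e.g. surface roughness Ra): SN = -10 log10(mean(y^2))."""
    mean_sq = sum(y * y for y in measurements) / len(measurements)
    return -10.0 * math.log10(mean_sq)

# Hypothetical Ra readings (um) for two parameter settings of the
# L9 array; the setting with the larger SN ratio is preferred.
run_a = [0.82, 0.79, 0.85]
run_b = [0.61, 0.66, 0.58]
```

Ranking the nine L9 runs by this SN ratio, factor level by factor level, is how the optimal cutting combination is identified before the confirmation test.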

A Microstrip Antenna Design and Performance Analysis for RFID High Bit Rate Applications

Lately, interest has grown greatly in the use of RFID in an unprecedented range of applications. This is shown by the adoption of RFID capabilities by major software companies such as Microsoft, IBM and Oracle in their major software products. For example, the Microsoft SharePoint 2010 workflow is now fully compatible with the RFID platform. In addition, Microsoft BizTalk Server is capable of data acquisition from all RFID sensors. This will lead to applications that require a high bit rate, long range and multimedia content. Higher frequencies of operation have been designated for RFID tags, among them 2.45 and 5.8 GHz. A higher frequency means longer range and a higher bit rate, but the drawback is greater cost. In this paper we present a single-layer, low-profile patch antenna that operates at 5.8 GHz with a purely resistive input impedance of 50 and close-to-directive radiation. We also propose a modification to the design in order to improve the operating bandwidth from 8.7 to 13.8.

The Effect of Different Nozzle Configurations on Airflow Behaviour and Yarn Quality

The nozzle is the main part of various spinning systems such as air-jet and Murata air-vortex systems. Recently, many researchers have worked on the use of nozzles in other spinning systems such as conventional ring and compact spinning systems. In these applications, the primary purpose is to improve yarn quality. In the present study, yarns were produced with two different nozzle types and the changes in yarn properties were determined. In order to explain the effect of the nozzle, the airflow structure in the nozzle was modelled and the airflow variables were determined. For the numerical simulation, the ANSYS 12.1 package and the Fluid Flow (CFX) analysis method were used. In contrast to the literature, the Shear Stress Transport (SST) turbulence model was preferred. In addition, the air pressure at the nozzle inlet was measured by an electronic mass flow meter, and these values were used for the simulation of the airflow. Finally, the yarn was modelled, and the area through which the yarn passes was included in the numerical analysis.

Low Air Velocity Measurement Characteristics- Variation Due to Flow Regime

The paper compares the air velocity values reproduced by a laser Doppler anemometer (LDA) and an ultrasonic anemometer (UA) with values calculated from flow-rate measurements using a gas meter whose calibration uncertainty is ±(0.15-0.30)%. The investigation was performed in a channel installed in an aerodynamic facility used as part of the national air-velocity standard. The relations defined in this research confirm that the LDA and UA are the most advantageous instruments for air velocity reproduction. The results affirm that the ultrasonic anemometer is a reliable and favourable instrument for measuring mean velocity, or for monitoring velocity stability, in the range 0.05 m/s - 10 (15) m/s when used alongside the LDA. The main aim of this research is to investigate low-velocity behaviour, starting from 0.05 m/s, covering the turbulent, laminar and transitional flow regimes. Theoretical and experimental results and a brief analysis of them are given in the paper. Relations between the maximum and mean velocity for transitional air flow, which has a unique velocity distribution, are presented. Transitional flow, whose characteristics are distinct from those of both laminar and turbulent flow, has not previously been analysed experimentally.
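The flow regimes mentioned are conventionally separated by the Reynolds number Re = vD/ν. The sketch below uses an assumed 0.1 m channel dimension, air viscosity at room temperature, and textbook pipe-flow thresholds; the facility's actual geometry and regime boundaries may differ.

```python
def reynolds_number(velocity, diameter, nu=1.5e-5):
    """Re = v*D/nu; nu defaults to the kinematic viscosity of air at
    roughly 20 C (assumed value)."""
    return velocity * diameter / nu

def flow_regime(re):
    # Conventional pipe-flow thresholds; the transitional band is
    # where anemometer behaviour is hardest to characterise.
    if re < 2300:
        return "laminar"
    if re <= 4000:
        return "transitional"
    return "turbulent"

# For an assumed 0.1 m channel, the study's 0.05 m/s lower bound
# sits deep in the laminar regime (Re about 333).
re_low = reynolds_number(0.05, 0.1)
```

Sweeping the velocity from 0.05 m/s upward moves the flow through all three regimes, which is why the study can compare laminar, transitional and turbulent behaviour in one facility.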

A Renovated Cook's Distance Based On The Buckley-James Estimate In Censored Regression

Various methods based on regression ideas have been created to resolve the problem of data sets containing censored observations, e.g. the Buckley-James method, Miller's method, the Cox method, and the Koul-Susarla-Van Ryzin estimators. Even though comparison studies show that the Buckley-James method performs better than several other methods, it is still rarely used by researchers, mainly because of the limited diagnostic analysis developed for it thus far. Therefore, a diagnostic tool for the Buckley-James method is proposed in this paper. It is called the renovated Cook's distance (RD*i) and has been developed based on Cook's idea. The renovated Cook's distance (RD*i) has advantages (depending on the analyst's demands) over (i) the change in the fitted value for a single case, DFIT*i, as it measures the influence of case i on all n fitted values Ŷ* (not just the fitted value for case i, as DFIT*i does), and (ii) the change in the coefficient estimate when the ith case is deleted, DBETA*i, since DBETA*i corresponds to the number of variables p, so it is usually easier to look at a diagnostic measure such as RD*i, in which information from the p variables is considered simultaneously. Finally, an example using the Stanford Heart Transplant data is provided to illustrate the proposed diagnostic tool.
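For context, Cook's idea in the classical uncensored OLS setting can be sketched as below; the paper's RD*i adapts this to Buckley-James estimates, which this generic sketch does not implement. The outlier example is invented.

```python
import numpy as np

def cooks_distance(X, y):
    """Classical Cook's distance for OLS:
    D_i = r_i^2 / (p * s^2) * h_ii / (1 - h_ii)^2,
    summarising the influence of case i on all n fitted values."""
    n, p = X.shape
    H = X @ np.linalg.inv(X.T @ X) @ X.T          # hat matrix
    resid = y - H @ y
    s2 = resid @ resid / (n - p)                  # residual variance
    h = np.diag(H)
    return resid**2 / (p * s2) * h / (1 - h) ** 2

# A single gross outlier at a high-leverage point should dominate.
x = np.linspace(0.0, 9.0, 10)
X = np.column_stack([np.ones(10), x])
y = 2 + 3 * x
y[9] += 20.0                                      # contaminated case
D = cooks_distance(X, y)
```

The appeal noted in the abstract carries over: one number per case, regardless of how many coefficients p the model has, unlike a DBETA-style diagnostic.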

Efficient Use of Energy through Incorporation of a Gas Turbine in Methanol Plant

A techno-economic evaluation of the efficient use of energy in a large-scale industrial methanol plant is carried out. The assessment is based on the integration of a gas turbine with an existing methanol plant, in which the outlet gas of the exothermic reactor is expanded for power generation. The methanol production rate is kept constant after the addition of the power generation system to the existing plant. With the gas turbine incorporated into the existing plant, the economic results showed a total investment of MUSD 16.9 and an energy saving of 3.6 MUSD/yr, with a payback period of approximately 4.7 years.
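The reported payback period follows directly from the stated figures (simple payback, ignoring discounting):

```python
# Simple payback check for the reported figures.
investment = 16.9            # total investment, MUSD
saving = 3.6                 # energy saving, MUSD/yr
payback_years = investment / saving   # ~4.69, i.e. approximately 4.7
```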

The Semantic Web: a New Approach for Future World Wide Web

The purpose of Semantic Web research is to transform the Web from a linked document repository into a distributed knowledge base and application platform, thus allowing the vast range of available information and services to be exploited more efficiently. As a first step in this transformation, languages such as OWL have been developed. Although fully realizing the Semantic Web still seems some way off, OWL has already been very successful and has rapidly become a de facto standard for ontology development in fields as diverse as geography, geology, astronomy, agriculture, defence and the life sciences. The aim of this paper is to classify the key concepts of the Semantic Web and to introduce a new practical approach that uses these concepts to outperform the World Wide Web.

Genetic Programming Approach to Hierarchical Production Rule Discovery

The automated discovery of hierarchical structures in large data sets has been an active research area in the recent past. This paper focuses on the issue of mining generalized rules with a crisp hierarchical structure using a Genetic Programming (GP) approach to knowledge discovery. The post-processing scheme presented in this work uses flat rules as the initial individuals of the GP and discovers a hierarchical structure. Suitable genetic operators are proposed for the suggested encoding. Based on the Subsumption Matrix (SM), an appropriate fitness function is suggested. Finally, Hierarchical Production Rules (HPRs) are generated from the discovered hierarchy. Experimental results are presented to demonstrate the performance of the proposed algorithm.