The Resource Description Framework (RDF) as a Modern Structure for Medical Data

The amount and heterogeneity of data in biomedical research, notably in interdisciplinary fields, require new methods for the collection, presentation and analysis of information. Important data from laboratory experiments as well as patient trials are available but originate from distributed resources. The Charité - University Hospital Berlin, together with the German Research Foundation (DFG), has established a new information service centre for kidney diseases and transplantation (Open European Nephrology Science Centre - OpEN.SC). Besides the collaborative aspect of creating new research groups, every partner or institution of this science information centre that makes its own data available is allowed to search the whole data pool of the various involved centres. A core task is the implementation of a non-restrictive, open data structure for the various data sources. We decided to use a modern RDF model and, in a first phase, transformed original data coming from the web-based Electronic Patient Record database TBase©.
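
As an illustration, the following minimal sketch shows how such heterogeneous clinical data can be expressed as RDF triples with the Python library rdflib; the namespace, property names, and values are hypothetical and do not reproduce the actual OpEN.SC schema.

```python
# Sketch: clinical data as RDF triples (hypothetical schema, not OpEN.SC's).
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/opensc/")  # illustrative namespace

g = Graph()
patient = EX["patient/42"]
g.add((patient, RDF.type, EX.Patient))
g.add((patient, EX.diagnosis, Literal("chronic kidney disease")))
g.add((patient, EX.creatinine, Literal(2.4)))  # lab value, mg/dl

# Any participating centre could then query the shared pool with SPARQL:
query = "SELECT ?p ?v WHERE { ?p <http://example.org/opensc/creatinine> ?v }"
for row in g.query(query):
    print(row.p, row.v)
```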

Automatic 3D Reconstruction of Coronary Artery Centerlines from Monoplane X-ray Angiogram Images

We present a new method for the fully automatic 3D reconstruction of coronary artery centerlines, using two X-ray angiogram projection images from a single rotating monoplane acquisition system. During the first stage, the input images are smoothed using curve evolution techniques. Next, a simple yet efficient multiscale method, based on the information of the Hessian matrix, is introduced for the enhancement of the vascular structure. Hysteresis thresholding, using different image quantiles, is applied to segment the arteries. This stage is followed by a thinning procedure to extract the centerlines. The resulting skeleton image is then pruned using morphological and pattern recognition techniques to remove non-vessel-like structures. Finally, edge-based stereo correspondence is solved using a parallel evolutionary optimization method based on symbiosis. The detected 2D centerlines combined with disparity map information allow the reconstruction of the 3D vessel centerlines. The proposed method has been evaluated on patient data sets.
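
The 2D stages described above can be approximated with standard scikit-image building blocks, as in the following sketch; the smoothing stand-in, scales, and quantiles are illustrative choices, not the paper's tuned pipeline.

```python
# Sketch: smoothing, multiscale Hessian-based vessel enhancement, quantile
# hysteresis thresholding, and thinning (illustrative parameters).
import numpy as np
from skimage import filters, morphology

def extract_centerlines(angiogram: np.ndarray) -> np.ndarray:
    smoothed = filters.gaussian(angiogram, sigma=1.0)  # stand-in for curve evolution
    vesselness = filters.frangi(smoothed, sigmas=range(1, 6))  # Hessian multiscale
    low, high = np.quantile(vesselness, [0.90, 0.98])  # quantile-based thresholds
    arteries = filters.apply_hysteresis_threshold(vesselness, low, high)
    return morphology.skeletonize(arteries)            # thinning to centerlines
```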

Assessment of Performance Measures of Large-Scale Power Systems

In a recent major industry-supported research and development study, a novel framework was developed and applied for the assessment of reliability and quality performance levels in real-life power systems of practical large-scale size. The new assessment methodology is based on three metaphors (dimensions) representing the relationship between available generation capacities and required demand levels. The paper shares the results of the successfully completed study and describes the implementation of the new methodology on practical zones in the Saudi electricity system.

Simplex Method for Solving Linear Programming Problems with Fuzzy Numbers

Fuzzy set theory has been applied in many fields, such as operations research, control theory, and management sciences. In particular, one application of this theory to decision-making problems is linear programming with fuzzy numbers. In this study, we present a new method for solving fuzzy number linear programming problems using a linear ranking function. In fact, our method is similar to the simplex method used for solving linear programming problems in a crisp environment.
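
A common choice of linear ranking function for triangular fuzzy numbers is a weighted average of their breakpoints, as in the sketch below; this particular ranking is illustrative and not necessarily the exact function used in the paper.

```python
# Sketch: a linear ranking function mapping a triangular fuzzy number
# (a_l, a_m, a_u) to a crisp value used in simplex comparisons.
def rank(a_l: float, a_m: float, a_u: float) -> float:
    return (a_l + 2.0 * a_m + a_u) / 4.0

# The fuzzy simplex method compares fuzzy reduced costs via their ranks,
# e.g. a column may enter the basis when its ranked reduced cost is negative.
print(rank(1.0, 2.0, 4.0))  # 2.25
```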

Environmental Inspection using WSANs Based on Multi-agent Coordination Method

In this paper, we focus on the problem of driving and herding a collection of autonomous actors to a given area, and we propose a new method based on multi-agent coordination for solving it. In our proposed method, we assume that the environment is covered by sensors. When an event occurs, sensors forward information to a sink node. Based on the received information, the sink node estimates the direction and speed of movement of the actors and announces the obtained values to them. The actors then coordinate to reach the target location.
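
The sink's estimation step can be illustrated as follows; the two-detection event format is a hypothetical simplification of the protocol described above.

```python
# Sketch: estimating heading and speed from two timestamped detections.
import math

def estimate_motion(p1, t1, p2, t2):
    """Return (heading in radians, speed) from positions p = (x, y) at times t."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    heading = math.atan2(dy, dx)
    speed = math.hypot(dx, dy) / (t2 - t1)
    return heading, speed

heading, speed = estimate_motion((0.0, 0.0), 0.0, (3.0, 4.0), 2.0)
print(heading, speed)  # ~0.93 rad, 2.5 units per time step
```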

Preliminary Study on Fixture Layout Optimization Using Element Strain Energy

The objective of positioning the fixture elements in a fixture is to make the workpiece stiff, so that geometric errors in the manufacturing process can be reduced. Most previous work on optimal fixture layout uses minimization of the sum of the nodal deflections normal to the surface as the objective function; deflections in all other directions are neglected. In this paper we propose a new method for fixture layout optimization that uses the element strain energy, so that deformations in all directions are taken into account. The objective is to minimize the sum of the squares of the element strain energies; since strain energy and stiffness are inversely proportional to each other, this maximizes workpiece stiffness. The optimization problem is solved by the sequential quadratic programming method. Three different case studies are presented, and the results are compared with those of the method using nodal deflections as the objective function to verify the proposed method.
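
The formulation can be sketched with SciPy's SLSQP routine standing in for the SQP solver; the strain-energy evaluation below is a toy surrogate for the finite-element analysis used in the paper.

```python
# Sketch: minimize the sum of squared element strain energies over fixture
# positions using sequential quadratic programming (SLSQP).
import numpy as np
from scipy.optimize import minimize

def element_strain_energies(layout: np.ndarray) -> np.ndarray:
    # Hypothetical surrogate; in practice an FE solver returns U_e per element.
    return np.array([np.sin(layout[0]) ** 2 + 0.1,
                     (layout[1] - 0.5) ** 2 + 0.05])

def objective(layout: np.ndarray) -> float:
    return float(np.sum(element_strain_energies(layout) ** 2))

result = minimize(objective, x0=[0.2, 0.2], method="SLSQP",
                  bounds=[(0.0, 1.0), (0.0, 1.0)])
print(result.x, result.fun)
```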

The Applicability of the Zipper Strut to Seismic Rehabilitation of Steel Structures

Chevron frames (inverted-V-braced or V-braced frames) have seismic disadvantages: they do not exhibit good force redistribution capability, and the compression brace buckles early. Researchers have developed new design provisions to increase both the ductility and the lateral resistance of these structures in seismic areas. One of these new methods is adding zipper columns, as proposed by Khatib et al. (1988) [2]. Zipper columns are vertical members connecting the intersection points of the braces above the first floor. In this paper, the applicability of the suspended zipper system to the seismic rehabilitation of steel structures is investigated. The models are 3-, 6-, 9-, and 12-story inverted-V-braced frames. It is assumed that the structures must be rehabilitated, and zipper columns are used for this purpose. The results show that the suspended zipper system is effective for the 3-, 6-, and 9-story inverted-V-braced frames and increases the lateral resistance of the structure up to the life safety level. In the case of high-rise buildings (such as the 12-story frame), however, it does not show good performance. To solve this problem, the braced bay can be composed of small "units" over the height of the entire structure, each of which is a zipper-braced bay of a few stories. Using this method, the lateral resistance of the 12-story inverted-V-braced frame is increased up to the life safety level.

Construct Pairwise Test Suites Based on the Bak-Sneppen Model of Biological Evolution

Pairwise testing, which requires that every combination of valid values of each pair of system factors be covered by at least one test case, plays an important role in software testing, since many faults are caused by unexpected 2-way interactions among system factors. Although meta-heuristic strategies like simulated annealing can generally discover smaller pairwise test suites, they may require more search time than greedy algorithms. We propose a new method, an improved Extremal Optimization (EO) based on the Bak-Sneppen (BS) model of biological evolution, for constructing pairwise test suites, and define a fitness function according to the requirements of the improved EO. Experimental results show that the improved EO produces pairwise test suites of similar size while yielding an 85% reduction in solution time compared with simulated annealing.
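
The Bak-Sneppen flavour of extremal optimization can be sketched as follows: the worst-fitness component of a candidate test case is unconditionally replaced, rather than the whole solution being accepted or rejected. The coverage bookkeeping and fitness definition here are simplified assumptions, not the paper's exact formulation.

```python
# Sketch: one extremal-optimization step on a candidate test case.
import itertools, random

def pairs_covered(test, uncovered):
    """Per-factor fitness: uncovered pairs that each current value covers."""
    pairs = {((i, test[i]), (j, test[j]))
             for i, j in itertools.combinations(range(len(test)), 2)}
    covered = pairs & uncovered
    return [sum(1 for p in covered if (f, test[f]) in p)
            for f in range(len(test))]

def eo_step(test, domains, uncovered):
    fitness = pairs_covered(test, uncovered)
    worst = min(range(len(test)), key=fitness.__getitem__)
    test[worst] = random.choice(domains[worst])  # Bak-Sneppen replacement
    return test

domains = [[0, 1], [0, 1], [0, 1]]
uncovered = {((0, 0), (1, 1)), ((1, 1), (2, 0))}
print(eo_step([0, 1, 0], domains, uncovered))
```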

Weighted Harmonic Arnoldi Method for Large Interior Eigenproblems

The harmonic Arnoldi method can be used to find interior eigenpairs of large matrices. However, it has been shown that this method may converge erratically and may even fail to converge. In this paper, we present a new method for computing interior eigenpairs of large nonsymmetric matrices, called the weighted harmonic Arnoldi method. The implementation of the method has been tested on numerical examples; the results show that the method converges quickly and works with high accuracy.
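
For reference, a standard (unweighted) harmonic Ritz extraction for eigenvalues near a target sigma can be sketched with NumPy/SciPy as below; the weighting that defines the paper's variant modifies this projection and is not reproduced here.

```python
# Sketch: Arnoldi basis + harmonic Ritz extraction near a shift sigma.
import numpy as np
from scipy.linalg import eig

def arnoldi_basis(A, v0, k):
    n = A.shape[0]
    V = np.zeros((n, k + 1), dtype=complex)
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(k):
        w = A @ V[:, j]
        for i in range(j + 1):                    # modified Gram-Schmidt
            w -= (V[:, i].conj() @ w) * V[:, i]
        V[:, j + 1] = w / np.linalg.norm(w)
    return V[:, :k]

def harmonic_ritz(A, sigma, V):
    W = A @ V - sigma * V                         # (A - sigma I) V
    # Harmonic projection: W^H W y = (theta - sigma) W^H V y
    shifts, Y = eig(W.conj().T @ W, W.conj().T @ V)
    return sigma + shifts, V @ Y                  # approximate eigenpairs

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 100))
V = arnoldi_basis(A, rng.standard_normal(100), 20)
vals, vecs = harmonic_ritz(A, 0.5 + 0.0j, V)
```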

Optimization of Distribution Network Configuration for Loss Reduction Using Artificial Bee Colony Algorithm

Network reconfiguration in a distribution system is realized by changing the status of sectionalizing switches to reduce the power loss in the system. This paper presents a new method that applies an artificial bee colony (ABC) algorithm for determining the sectionalizing switches to be operated in order to solve the distribution system loss minimization problem. The ABC algorithm is a new population-based metaheuristic approach inspired by the intelligent foraging behavior of honeybee swarms. One advantage of the ABC algorithm is that it does not require external parameters such as the crossover and mutation rates used in genetic algorithms and differential evolution, which are hard to determine in advance. Another advantage is that global search ability is implemented by introducing a neighborhood source production mechanism, which is similar to a mutation process. To demonstrate the validity of the proposed algorithm, computer simulations are carried out on 14-, 33-, and 119-bus systems and compared with different approaches available in the literature. The proposed method outperforms the other methods in terms of solution quality and computational efficiency.
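
The neighborhood source production step referred to above is sketched below on a toy continuous loss; in the paper the solutions encode sectionalizing-switch states and the loss comes from a power flow calculation.

```python
# Sketch: ABC neighbourhood production v_ij = x_ij + phi * (x_ij - x_kj),
# phi ~ U(-1, 1), applied greedily in the employed-bee phase.
import random

def neighbour(solution, population):
    partner = random.choice([s for s in population if s is not solution])
    j = random.randrange(len(solution))
    phi = random.uniform(-1.0, 1.0)
    candidate = list(solution)
    candidate[j] = solution[j] + phi * (solution[j] - partner[j])
    return candidate

def loss(x):  # toy stand-in for total feeder power loss
    return sum(v * v for v in x)

population = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(10)]
for _ in range(200):
    for i, bee in enumerate(population):
        cand = neighbour(bee, population)
        if loss(cand) < loss(bee):               # greedy selection
            population[i] = cand
print(min(loss(b) for b in population))
```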

A New Method for Detection of Artificial Objects and Materials from Long Distance Environmental Images

The article presents a new method for the detection of artificial objects and materials in images of environmental (non-urban) terrain. Our approach uses the hue and saturation (or Cb and Cr) components of the image as the input to a segmentation module based on the mean shift method. The clusters obtained as the output of this stage are processed by a decision-making module in order to find the regions of the image with a significant possibility of representing a human. Although this method can detect various non-natural objects, it is primarily intended and optimized for the detection of humans, i.e., for search and rescue purposes in non-urban terrain where, under normal circumstances, non-natural objects should not be present. Real-world images are used for the evaluation of the method.
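
The segmentation stage can be sketched with scikit-learn's mean shift implementation; the bandwidth and the YCbCr input convention are illustrative assumptions.

```python
# Sketch: mean shift clustering on the chroma (Cb, Cr) components only.
import numpy as np
from sklearn.cluster import MeanShift

def segment_chroma(ycbcr_image: np.ndarray) -> np.ndarray:
    """ycbcr_image: H x W x 3 array; returns an H x W label map."""
    h, w, _ = ycbcr_image.shape
    chroma = ycbcr_image[:, :, 1:3].reshape(-1, 2).astype(float)
    labels = MeanShift(bandwidth=10.0, bin_seeding=True).fit_predict(chroma)
    return labels.reshape(h, w)  # handed to the decision-making module
```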

EEG Spikes Detection, Sorting, and Localization

This study introduces a new method for detecting, sorting, and localizing spikes from multiunit EEG recordings. The method combines the wavelet transform, which localizes distinctive spike features, with the Super-Paramagnetic Clustering (SPC) algorithm, which allows automatic classification of the data without assumptions such as low variance or Gaussian distributions. Moreover, the method is capable of setting amplitude thresholds for spike detection. The method was applied to several real EEG data sets, in which the spikes were detected, clustered, and their occurrence times identified.
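
The wavelet detection stage can be sketched with PyWavelets; the classical robust threshold thr = 4 * median(|d|) / 0.6745 stands in for the paper's automatic amplitude threshold, and the SPC clustering stage is not reproduced.

```python
# Sketch: wavelet denoising followed by amplitude-threshold spike detection.
import numpy as np
import pywt

def detect_spikes(signal: np.ndarray, wavelet: str = "db4") -> np.ndarray:
    coeffs = pywt.wavedec(signal, wavelet, level=4)
    thr = 4.0 * np.median(np.abs(coeffs[-1])) / 0.6745   # robust noise estimate
    denoised = pywt.waverec(
        [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]],
        wavelet)
    return np.flatnonzero(np.abs(denoised[: len(signal)]) > thr)  # spike times
```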

A New Method of Concealed Data Aggregation in Wireless Sensor Networks: A Case Study

A wireless sensor network (WSN) consists of many sensor nodes placed in unattended environments, such as military sites, in order to collect important information. Implementing a secure protocol that prevents the forwarding of forged data and the modification of aggregated data, while keeping the delay and the communication, computation, and storage overhead low, is very important. This paper presents a new protocol for concealed data aggregation (CDA). In this protocol, the network is divided into virtual cells, and the nodes within each cell produce a shared key to send and receive concealed data among themselves. Because data aggregation within each cell is performed locally and a secure authentication mechanism is implemented, the data aggregation delay is very low and malicious nodes cannot inject false data into the network. To evaluate the performance of our proposed protocol, we present computational models that demonstrate its performance and low overhead.
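
The concealment idea can be sketched as an additively homomorphic masking scheme within one cell: every node masks its reading with a keystream derived from the cell's shared key, the aggregator sums ciphertexts without seeing plaintexts, and the sink removes the masks. The key handling below is deliberately simplified and illustrative, not the paper's exact protocol.

```python
# Sketch: CMT-style additive concealment and in-network aggregation.
import hashlib

M = 2**32                                    # modulus for readings and sums

def keystream(shared_key: bytes, node_id: int, epoch: int) -> int:
    digest = hashlib.sha256(shared_key + node_id.to_bytes(4, "big")
                            + epoch.to_bytes(4, "big")).digest()
    return int.from_bytes(digest[:4], "big")

def conceal(reading: int, key: bytes, node_id: int, epoch: int) -> int:
    return (reading + keystream(key, node_id, epoch)) % M

key = b"cell-7-shared-key"                   # hypothetical cell key
readings = {1: 20, 2: 23, 3: 19}
cipher_sum = sum(conceal(r, key, n, 0) for n, r in readings.items()) % M
mask_sum = sum(keystream(key, n, 0) for n in readings) % M
print((cipher_sum - mask_sum) % M)           # 62, the true aggregate
```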

A New Fuzzy Decision Support Method for Analysis of Economic Factors of Turkey's Construction Industry

Imperfect knowledge cannot always be avoided. Imperfections may take several forms: uncertainty, imprecision, and incompleteness. Among the methods for the management of imperfect knowledge are fuzzy set-based techniques. The choice of a method to process data is linked to the choice of knowledge representation, which can be numerical, symbolic, logical or semantic, and depends on the nature of the problem to be solved, for example decision support, which is the subject of this study. Fuzzy logic is used for its ability to manage imprecise knowledge, but it can also take advantage of the ability of neural networks to learn coefficients or functions; such an association of methods is typical of so-called soft computing. In this study, a new method is used to manage the imprecision of collected knowledge related to the economic analysis of the construction industry in Turkey. Sudden changes in economic factors decrease the competitive strength of construction companies. A better evaluation of these changes from the construction industry's point of view will positively influence the decisions of companies engaged in construction.
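
As a small illustration of managing imprecision, an economic factor such as inflation can be graded by a triangular fuzzy membership function; the breakpoints below are hypothetical.

```python
# Sketch: triangular membership function for a fuzzy economic factor.
def triangular(x: float, a: float, b: float, c: float) -> float:
    """Membership of x in a fuzzy set with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Degree to which 9% inflation is "high" (hypothetical set 5%-10%-15%).
print(triangular(9.0, 5.0, 10.0, 15.0))  # 0.8
```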

A New Method for Multiobjective Optimization Based on Learning Automata

The need to solve complicated multidimensional scientific problems, together with the need to optimize several objective functions, is a major motivation behind artificial intelligence and heuristic methods. In this paper, we introduce a new method for multiobjective optimization based on learning automata. In the proposed method, the search space is divided into separate hypercubes, and each cube is considered an action. After combining all objective functions with separate weights, the resulting cumulative function is taken as the fitness function. By evaluating the cumulative function over all the cubes, we calculate the reinforcement of each action, and the algorithm continues its search for the best solutions. In this method, a lateral memory is used to gather the significant points of each iteration of the algorithm. Finally, by considering the domination factor, the Pareto front is estimated. Results of several experiments show the effectiveness of this method in comparison with a genetic-algorithm-based method.
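
The core update can be sketched as a linear reward-inaction automaton over hypercube actions; the weighted-sum fitness below is a toy example, and the lateral memory and dominance check are omitted.

```python
# Sketch: learning automaton choosing among hypercubes of the search space.
import random

def lri_update(probs, chosen, rewarded, a=0.1):
    """Linear reward-inaction: grow the reinforced action's probability."""
    if rewarded:
        for i in range(len(probs)):
            probs[i] = probs[i] + a * (1.0 - probs[i]) if i == chosen \
                       else probs[i] * (1.0 - a)
    return probs

def fitness(x, weights=(0.5, 0.5)):          # weighted sum of two objectives
    return weights[0] * x ** 2 + weights[1] * (x - 1.0) ** 2

cubes = [(i / 4.0, (i + 1) / 4.0) for i in range(4)]   # 1-D "hypercubes"
probs, best = [0.25] * 4, float("inf")
for _ in range(500):
    k = random.choices(range(4), weights=probs)[0]
    f = fitness(random.uniform(*cubes[k]))
    probs = lri_update(probs, k, rewarded=f < best)
    best = min(best, f)
print(probs, best)
```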

Automatic Extraction of Roads from High Resolution Aerial and Satellite Images with Heavy Noise

Aerial and satellite images are rich in information but complex to analyze. Many GIS applications require fast and reliable extraction of roads and intersections. In this paper, we study efficient and reliable automatic extraction algorithms that address some difficult issues commonly seen in high resolution aerial and satellite images but not well addressed by existing solutions, such as blurring, broken or missing road boundaries, lack of road profiles, heavy shadows, and interfering surrounding objects. The new scheme is based on a new method, namely the reference circle, to properly identify the pixels that belong to the same road and to use this information to recover the whole road network. This feature is invariant to the shape and direction of roads and tolerates heavy noise and disturbances. Road extraction based on reference circles is much more noise-tolerant and flexible than previous edge-detection-based algorithms. The scheme is able to extract roads reliably from images with complex contents and heavy obstructions, such as the high resolution aerial/satellite images available from Google Maps.

Determination of Moisture Diffusivity of AAC in Drying Phase using Genetic Algorithm

The current practice of determining the moisture diffusivity of building materials under laboratory conditions is predominantly aimed at the absorption phase, mainly because of the simplicity of the inverse analysis of measured moisture profiles. However, liquid moisture transport may exhibit significant hysteresis, so the moisture diffusivity should differ between the absorption (wetting) and desorption (drying) phases. In order to bring computer simulations of the hygrothermal performance of building materials closer to reality, it is therefore necessary to find new methods of inverse analysis that can be used in the desorption phase as well. In this paper we present a genetic algorithm as a possible method for solving the inverse problem of moisture transport in the desorption phase. Its application is demonstrated for autoclaved aerated concrete (AAC) as a typical building material.
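
The inverse analysis can be sketched as a genetic-algorithm search over parameters of an assumed diffusivity law; the simulate function below is a placeholder for the actual moisture-transport solver, and all numbers are illustrative.

```python
# Sketch: GA fitting of diffusivity parameters to measured drying data.
import random

MEASURED = [0.30, 0.22, 0.15, 0.10]          # hypothetical moisture profile

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def simulate(d0, gamma):
    # Placeholder for the drying simulation with D(w) = d0 * exp(gamma * w).
    return [0.30 * (1.0 - d0) ** (gamma * (t + 1)) for t in range(4)]

def misfit(params):
    return sum((s - m) ** 2 for s, m in zip(simulate(*params), MEASURED))

pop = [(random.uniform(0.0, 0.9), random.uniform(0.1, 2.0)) for _ in range(30)]
for _ in range(100):
    pop.sort(key=misfit)
    parents = pop[:10]                        # elitist selection
    children = [(clamp((a[0] + b[0]) / 2 + random.gauss(0, 0.02), 0.0, 0.9),
                 clamp((a[1] + b[1]) / 2 + random.gauss(0, 0.02), 0.1, 2.0))
                for a, b in (random.sample(parents, 2) for _ in range(20))]
    pop = parents + children
pop.sort(key=misfit)
print(pop[0], misfit(pop[0]))
```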

Oncogene Identification using Filter based Approaches between Various Cancer Types in Lung

Lung cancer accounts for the most cancer-related deaths among both men and women. The identification of cancer-associated genes and the related pathways is essential and offers an important possibility for the prevention of many types of cancer. In this work, two filter approaches, namely information gain and the biomarker identifier (BMI), are used for the identification of different types of small-cell and non-small-cell lung cancer. A new method to determine the BMI thresholds is proposed that prioritizes genes (i.e., primary, secondary, and tertiary) using a k-means clustering approach. Sets of key genes were identified that can be found in several pathways. It turned out that the modified BMI is well suited for microarray data, and BMI is therefore proposed as a powerful tool in the search for new and so far undiscovered genes related to cancer.
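
The threshold-determination idea can be sketched as k-means over BMI scores with k = 3, ranking the clusters into priority tiers; the scores below are random stand-ins for real microarray-derived values.

```python
# Sketch: deriving primary/secondary/tertiary gene tiers from BMI scores.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
bmi_scores = rng.gamma(shape=2.0, scale=1.5, size=500).reshape(-1, 1)

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(bmi_scores)
order = np.argsort(-km.cluster_centers_.ravel())   # highest centre = primary
tier = np.empty(3, dtype=int)
tier[order] = np.arange(3)
gene_tiers = tier[km.labels_]                      # 0 = primary, 2 = tertiary
print(np.bincount(gene_tiers))                     # genes per priority tier
```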

Iris Localization using Circle and Fuzzy Circle Detection Method

Iris localization is a very important step in biometric identification systems. The identification process is usually implemented in three stages: iris localization, feature extraction, and finally pattern matching. The accuracy of iris localization, as the first step, affects all subsequent stages, which shows its importance in an iris-based biometric system. In this paper, we take Daugman's iris localization method as the standard, propose a new method in this field, and then analyze and compare the results of both on a standard set of iris images. The proposed method is based on the detection of the circular edge of the iris, improved by fuzzy circles and surface energy difference contexts. The method is easy to implement and, compared with other methods, achieves rather high accuracy and speed. Test results show that the accuracy of our proposed method is comparable to that of Daugman's method, while its computation speed is 10 times higher.
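
The circular-edge detection step can be sketched with a Hough-transform search for the iris boundary; the fuzzy-circle and surface-energy refinements of the proposed method are not reproduced, and the radius range is illustrative.

```python
# Sketch: locating the iris as the strongest circular edge in a gray image.
import numpy as np
from skimage.feature import canny
from skimage.transform import hough_circle, hough_circle_peaks

def locate_iris(gray: np.ndarray):
    edges = canny(gray, sigma=2.0)
    radii = np.arange(30, 80, 2)                 # plausible iris radii (pixels)
    accumulator = hough_circle(edges, radii)
    _, cx, cy, r = hough_circle_peaks(accumulator, radii, total_num_peaks=1)
    return int(cx[0]), int(cy[0]), int(r[0])     # iris centre and radius
```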