Image Segmentation Based on Graph Theoretical Approach to Improve the Quality of Image Segmentation

Graph-based image segmentation techniques are considered among the most efficient segmentation techniques and are mainly used as time- and space-efficient methods for real-time applications. However, there is a need to focus on improving the quality of the segmented images obtained from earlier graph-based methods. This paper proposes an improvement to the graph-based image segmentation methods already described in the literature. We contribute to the existing method by proposing the use of a weighted Euclidean distance to calculate the edge weight, which is the key element in building the graph. We also propose a slight modification of the segmentation method already described in the literature, which results in the selection of more prominent edges in the graph. The experimental results show an improvement in segmentation quality compared to existing methods, with a slight compromise in efficiency.
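The key quantity in the proposed method is the edge weight between neighbouring pixels. The sketch below (Python/NumPy, not the authors' code) illustrates how a weighted Euclidean distance over the RGB channels could be used when building a 4-connected pixel graph; the per-channel weights are hypothetical placeholders, since the actual weighting is not given in the abstract.

```python
import numpy as np

def edge_weight(p, q, w=(0.5, 0.3, 0.2)):
    """Weighted Euclidean distance between two RGB pixels p and q.

    p, q : array-like of shape (3,), e.g. (R, G, B) values in [0, 255].
    w    : illustrative per-channel weights (hypothetical values).
    """
    p, q, w = np.asarray(p, float), np.asarray(q, float), np.asarray(w, float)
    return float(np.sqrt(np.sum(w * (p - q) ** 2)))

def build_grid_edges(image, w=(0.5, 0.3, 0.2)):
    """Build 4-connected graph edges (u, v, weight) over an H x W x 3 image."""
    h, wd, _ = image.shape
    edges = []
    for y in range(h):
        for x in range(wd):
            u = y * wd + x
            if x + 1 < wd:   # right neighbour
                edges.append((u, u + 1, edge_weight(image[y, x], image[y, x + 1], w)))
            if y + 1 < h:    # bottom neighbour
                edges.append((u, u + wd, edge_weight(image[y, x], image[y + 1, x], w)))
    return edges
```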

The Survey and the Comparison of Maximum Likelihood, Mahalanobis Distance and Minimum Distance Methods in Preparing Landuse Map in the Western Part of Isfahan Province

In this research, the three methods of Maximum Likelihood, Mahalanobis Distance and Minimum Distance were analyzed for the western part of Isfahan province in Iran. For this purpose, IRS satellite images were used, and the various land uses in the region, including rangelands, irrigated farming, dry farming, gardens and urban areas, were separated and identified. For these methods, the error (confusion) matrix and the Kappa index were calculated, and the accuracies of the methods were obtained as 53.13%, 56.64% and 48.44%, respectively. Considering the low accuracy of these methods in separating land uses, owing to the spread of the land uses, visual interpretation is suggested for preparing the land use map of this region. A map prepared by visual interpretation can achieve high accuracy if it is accompanied by field visits to the region.
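The reported accuracies rest on the error (confusion) matrix and the Kappa index. As a brief illustration, not the authors' code, Cohen's Kappa and the overall accuracy can be computed from a confusion matrix as follows; the matrix entries are hypothetical.

```python
import numpy as np

def overall_accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's Kappa from a square confusion matrix."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                                 # observed agreement (overall accuracy)
    pe = np.sum(cm.sum(axis=0) * cm.sum(axis=1)) / n**2   # chance agreement
    return po, (po - pe) / (1.0 - pe)

# Hypothetical 3-class confusion matrix (rows: reference, columns: classified)
cm = [[50, 10, 5],
      [12, 40, 8],
      [7,  9, 44]]
acc, kappa = overall_accuracy_and_kappa(cm)
print(f"overall accuracy = {acc:.3f}, kappa = {kappa:.3f}")
```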

Standard Deviation of Mean and Variance of Rows and Columns of Images for CBIR

This paper describes a novel and effective approach to content-based image retrieval (CBIR) that represents each image in the database by a vector of feature values called "standard deviation of mean vectors of color distribution of rows and columns of images for CBIR". In many areas of commerce, government, academia, and hospitals, large collections of digital images are being created. This paper describes an approach that uses image content as the feature vector for retrieval of similar images. There are several classes of features that are used to specify queries: color, texture, shape, and spatial layout. Color features are often easily obtained directly from the pixel intensities. In this paper, feature extraction is performed for the texture descriptors 'variance' and 'variance of variances'. First, the standard deviation of the row means and of the column means is calculated for the R, G, and B planes; these six values form a feature vector for the image. Secondly, we calculate the variance of each row and column of the R, G and B planes of an image, and the six standard deviations of these variance sequences form a second feature vector of dimension six. We applied our approach to a database of 300 BMP images. We determined the capability of automatic indexing by analyzing image content, using color and texture as features and the Euclidean distance as the similarity measure.
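A minimal sketch of the two six-dimensional descriptors described above, assuming an H x W x 3 RGB array; it follows the abstract's wording rather than the authors' actual implementation.

```python
import numpy as np

def mean_std_features(image):
    """Six-dimensional vector: std of row means and of column means
    for each of the R, G, B planes (3 planes x 2 directions = 6 values)."""
    feats = []
    for c in range(3):                           # R, G, B planes
        plane = image[:, :, c].astype(float)
        feats.append(plane.mean(axis=1).std())   # std of row means
        feats.append(plane.mean(axis=0).std())   # std of column means
    return np.array(feats)

def variance_std_features(image):
    """Second six-dimensional vector: std of the row-variance and
    column-variance sequences for each plane ('variance of variances')."""
    feats = []
    for c in range(3):
        plane = image[:, :, c].astype(float)
        feats.append(plane.var(axis=1).std())    # std of per-row variances
        feats.append(plane.var(axis=0).std())    # std of per-column variances
    return np.array(feats)

def euclidean(a, b):
    """Similarity measure used for retrieval: Euclidean distance."""
    return float(np.linalg.norm(np.asarray(a) - np.asarray(b)))
```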

On Quantum BCH Codes and Their Duals

Classical Bose-Chaudhuri-Hocquenghem (BCH) codes C that contain their dual codes can be used to construct quantum stabilizer codes; this work studies the properties of such codes. It has been shown that a BCH code of length n which contains its dual code satisfies a bound on the weight of any non-zero codeword in C, and the converse is also true. One significant difficulty in quantum communication and computation is protecting information-carrying quantum states against undesired interactions with the environment. To address this difficulty, many good quantum error-correcting codes have been derived as binary stabilizer codes. We are able to shed more light on the structure of dual-containing BCH codes. These results make it possible to determine the parameters of quantum BCH codes in terms of the weight of non-zero dual codewords.
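For context, the standard CSS construction (a well-known result, not one specific to this work) shows how a dual-containing classical code yields a quantum stabilizer code:

```latex
% CSS construction from a dual-containing classical code (standard result)
\[
  C^{\perp} \subseteq C,\qquad C = [n,\,k,\,d]
  \;\Longrightarrow\;
  \text{there exists a quantum stabilizer code } [[\,n,\; 2k-n,\; \geq d\,]].
\]
```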

The Measurement of Latvian and Russian Ethnic Attitudes, Using Evaluative Priming Task and Self-Report Methods

The purposes of this research are to estimate implicit ethnic attitudes by direct and indirect methods, to determine the correspondence between the two types of measurement, to investigate the influence of the type of task used in the experiment on the measurement results, and to determine the presence of a connection between recent episodic events and chronological correlations of ethnic attitudes. The implicit measurement method is the evaluative priming task (EPT), carried out with different SOA intervals; the explicit methods are G. Soldatova's types of ethnic identity, G. Soldatova's index of tolerance, and E. Bogardus's social distance scale. The results obtained during five stages of research reveal some aspects of implicit measurement: its correlation with self-report results at different SOA intervals, the connection of implicit measurement with the emotional valence of participants' episodic events, and other indexes, contributing to resolving the problem of applying implicit measurement to the study of different social constructs.

Salient Points Reduction for Content-Based Image Retrieval

Salient points are frequently used to represent local properties of an image in content-based image retrieval. In this paper, we present a reduction algorithm that extracts the locally most salient points such that they not only give a satisfying representation of an image but also make the image retrieval process efficient. The algorithm recursively reduces the point set according to the corresponding saliency values in a top-down fashion. The resulting salient points are evaluated with an image retrieval system using the Hausdorff distance. The experiments show that our method is robust and that the extracted salient points provide better retrieval performance compared with other point detectors.
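Retrieval quality here is scored with the Hausdorff distance between point sets. A brief illustration using SciPy (an assumed dependency, not the authors' implementation), with hypothetical point coordinates:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two 2-D point sets a and b."""
    return max(directed_hausdorff(a, b)[0], directed_hausdorff(b, a)[0])

# Hypothetical salient-point sets (x, y) extracted from two images
p1 = np.array([[10, 12], [40, 35], [80, 90], [120, 60]])
p2 = np.array([[11, 14], [42, 33], [78, 92]])
print(hausdorff(p1, p2))
```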

Fast Facial Feature Extraction and Matching with Artificial Face Models

Facial features are frequently used to represent local properties of a human face image in computer vision applications. In this paper, we present a fast algorithm that can extract the facial features online such that they give a satisfying representation of a face image. It includes one step for coarse detection of each facial feature by AdaBoost and another to increase the accuracy of the found points by Active Shape Models (ASM) within the regions of interest. The resulting facial features are evaluated by matching them with artificial face models in physiognomy applications. The distance between the features and those in the face models from the database is measured by means of the Hausdorff distance. In the experiments, the proposed method shows efficient performance in facial feature extraction and in an online physiognomy system.

Mental Illness Stigma and Causal Beliefs: Among Potential Mental Health Professionals

Mental health professionals' views about mental illness are an important issue that has not received enough attention. The negative stigma associated with mental illness can have many negative consequences. Unfortunately, health professionals working with the mentally ill can also exhibit stigma. It has been suggested that causal explanations or beliefs about the causes of mental illness may influence stigma. This study aims to gain greater insight into stigma by examining stigma among potential mental health professionals. Firstly, the results found that potential mental health professionals had relatively low social distance, t(205) = -3.62, p

Effect of Swirl on Gas-Fired Combustion Behavior in a 3-D Rectangular Combustion Chamber

The objective of this work is to investigate the turbulent reacting flow in a three-dimensional combustor, with emphasis on the effect of inlet swirl flow, through numerical simulation. The flow field is analyzed using the SIMPLE method, which is known to be stable as well as accurate in combustion modeling, and the finite volume method is adopted to solve the radiative transfer equation. In this work, the thermal and flow characteristics of a three-dimensional combustor are investigated by changing parameters such as the equivalence ratio and the inlet swirl angle. As the equivalence ratio increases, which means that more fuel is supplied due to a larger inlet fuel velocity, the flame temperature increases and the location of maximum temperature moves downstream. Meanwhile, the presence of inlet swirl velocity makes the fuel and combustion air mix and burn more completely within a short distance. Therefore, the locations of the maximum reaction rate and maximum temperature are shifted in the forward direction compared with the no-swirl case.
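For reference, the equivalence ratio varied above is the standard ratio of the actual fuel-air ratio to the stoichiometric one:

```latex
\[
  \phi \;=\; \frac{(m_{\mathrm{fuel}}/m_{\mathrm{air}})_{\mathrm{actual}}}
                  {(m_{\mathrm{fuel}}/m_{\mathrm{air}})_{\mathrm{stoich}}},
  \qquad \phi > 1 \ \text{fuel-rich},\quad \phi < 1 \ \text{fuel-lean}.
\]
```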

Evaluation of Handover Latency in Intra-Domain Mobility

Mobile IPv6 (MIPv6) describes how a mobile node can change its point of attachment from one access router to another. As the demand for wireless mobile devices increases, many enhancements for macro-mobility (inter-domain) protocols have been proposed, designed and implemented in Mobile IPv6. Hierarchical Mobile IPv6 (HMIPv6) is one of them; it is designed to reduce the amount of signaling required and to improve handover speed for mobile connections. This is achieved by introducing a new network entity called the Mobility Anchor Point (MAP). This report presents a comparative study of the Hierarchical Mobile IPv6 and Mobile IPv6 protocols, with the scope narrowed down to micro-mobility (intra-domain). The architecture and operation of each protocol are studied, and they are evaluated based on the Quality of Service (QoS) parameter of handover latency. The simulation was carried out using Network Simulator-2, and its outcomes are discussed. The results show that HMIPv6 performs better under intra-domain mobility than MIPv6, which suffers large handover latency. As an enhancement to HMIPv6, we propose locating the MAP in the middle of the domain with respect to all Access Routers. This gives approximately the same, and possibly shorter, distance between the MAP and the Mobile Node (MN) regardless of the MN's new location, thereby reducing the delay. As future work, a performance analysis is to be carried out for the proposed enhancement and compared with standard HMIPv6.

A Perceptually Optimized Foveation Based Wavelet Embedded Zero Tree Image Coding

In this paper, we propose a Perceptually Optimized Foveation-based Embedded ZeroTree Image Coder (POEFIC) that introduces a perceptual weighting of the wavelet coefficients prior to the SPIHT encoding algorithm, in order to reach a targeted bit rate with improved perceptual quality for a given bit rate and a fixation point that determines the region of interest (ROI). The paper also introduces a new objective quality metric based on a psychovisual model that integrates the properties of the human visual system (HVS) and plays an important role in our POEFIC quality assessment. Our POEFIC coder is based on a vision model that incorporates various masking effects of HVS perception. Thus, our coder weights the wavelet coefficients based on that model and attempts to increase the perceptual quality for a given bit rate and observation distance. The perceptual weights for all wavelet subbands are computed based on 1) foveation masking, to remove or reduce considerable high frequencies in peripheral regions, 2) luminance and contrast masking, and 3) the contrast sensitivity function (CSF), to achieve the perceptual decomposition weighting. The new perceptually optimized codec has the same complexity as the original SPIHT technique. Moreover, the experimental results show that our coder demonstrates very good performance in terms of quality measurement.
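As a rough illustration of per-subband perceptual weighting, here is a sketch assuming the PyWavelets library, with made-up level weights rather than the paper's CSF/foveation/masking-derived model:

```python
import numpy as np
import pywt

def weight_subbands(image, weights, wavelet="bior4.4", levels=3):
    """Scale each detail subband by a perceptual weight before zerotree coding.

    weights[level] is a hypothetical scalar for decomposition level `level`
    (1 = finest, `levels` = coarsest); the CSF / foveation / masking weights
    of the paper are not reproduced here.
    """
    coeffs = pywt.wavedec2(np.asarray(image, dtype=float), wavelet, level=levels)
    out = [coeffs[0]]                              # approximation band kept as-is
    for i, (cH, cV, cD) in enumerate(coeffs[1:]):  # pywt orders coarse -> fine
        level = levels - i
        w = weights[level]
        out.append((w * cH, w * cV, w * cD))
    return out

# Hypothetical weights that attenuate fine (high-frequency) subbands
weights = {1: 0.4, 2: 0.7, 3: 1.0}
# img = ...  # 2-D grayscale array; weighted = weight_subbands(img, weights)
```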

Sustainable Design of Impinging Premixed Slot Jets

Cooktop burners are widely used nowadays. In cooktop burner design, nozzle efficiency and greenhouse gas (GHG) emissions mainly depend on heat transfer from the premixed flame to the impinging surface. This is a complicated issue depending on the individual and combined effects of various input combustion variables. Optimal operating conditions for sustainable burner design have rarely been addressed, especially in the case of multiple slot-jet burners. By evaluating the optimal combination of combustion conditions for a premixed slot-jet array, this paper develops a practical approach for the sustainable design of gas cooktop burners. Efficiency and CO and NOx emissions for an array of slot jets using premixed flames were analysed. A response surface experimental design was applied to three controllable factors of the combustion process, viz. Reynolds number, equivalence ratio and jet-to-vessel distance. The Desirability Function Approach (DFA) is the analytic technique used for the simultaneous optimization of the efficiency and emission responses.
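For background, a common form of the desirability function (the Derringer-Suich formulation for a larger-is-better response, shown as general context rather than the exact form used in the paper) is:

```latex
\[
  d_i(y_i) \;=\;
  \begin{cases}
    0, & y_i \le L_i,\\[4pt]
    \left(\dfrac{y_i - L_i}{T_i - L_i}\right)^{r_i}, & L_i < y_i < T_i,\\[8pt]
    1, & y_i \ge T_i,
  \end{cases}
  \qquad
  D \;=\; \Bigl(\prod_{i=1}^{k} d_i\Bigr)^{1/k},
\]
```

where $L_i$ and $T_i$ are the lower bound and target for response $i$, $r_i$ controls the shape, and the overall desirability $D$ is maximised; smaller-is-better responses such as CO and NOx emissions use the mirrored transformation.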

An Examination and Validation of the Theoretical Resistivity-Temperature Relationship for Conductors

Electrical resistivity is a fundamental parameter of metals and other electrical conductors. Since resistivity is a function of temperature, a temperature-dependent theoretical model is needed in order to completely understand the behavior of metals. A model based on physics principles has recently been developed to obtain an equation that relates electrical resistivity to temperature. This equation depends on a parameter associated with the electron travel time before scattering, and a parameter that relates the energy of the atoms and their separation distance. Analysis of the energy parameter reveals that the equation is optimized if the proportionality term in the equation is not constant but varies over the temperature range. Additional analysis reveals that the theoretical equation can be used to determine the mean free path of conduction electrons, the number of defects in the atomic lattice, and the 'equivalent' charge associated with the metallic bonding of the atoms. All of this analysis validates the theoretical model and provides insight into the behavior of metals whose performance is affected by temperature (e.g., integrated circuits and temperature sensors).
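The model's equation itself is not reproduced in the abstract; for orientation, the familiar first-order temperature dependence against which such models are usually compared is:

```latex
\[
  \rho(T) \;\approx\; \rho_0\bigl[\,1 + \alpha\,(T - T_0)\,\bigr],
\]
```

where $\rho_0$ is the resistivity at a reference temperature $T_0$ and $\alpha$ is the temperature coefficient of resistivity.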

The Fatigue Damage Accumulation on Systems of Concentrators

Fatigue tests of specimens with numerous holes are presented. The tests were continued until fatigue cracks had formed on both sides of a hole. Crack extension was arrested by pressed plastic deformation at the mouth of the detected crack. It is shown that the moments at which cracks appear at the holes are stochastically dependent. This dependence exhibits both positive and negative correlation relations. It is shown that positive correlation forms transverse to the applied force, while negative correlation forms along it. The negative relationship extends over a greater distance. A mathematical model of the formation of the dependence area is presented, together with the estimation of the model parameters. The positive correlation of fatigue crack initiation can be considered as the extension of one main crack. With negative correlation, the first crack localizes the place of its origin, leading to the appearance of multiple cracks that do not merge with each other.

Matrix Based Synthesis of EXOR dominated Combinational Logic for Low Power

This paper discusses a new, systematic approach to the synthesis of an NP-hard class of non-regenerative Boolean networks, described by FON[FOFF]={mi}[{Mi}], where for every mj[Mj]∈{mi}[{Mi}] there exists another mk[Mk]∈{mi}[{Mi}] such that their Hamming distance HD(mj, mk)=HD(Mj, Mk)=O(n), where 'n' represents the number of distinct primary inputs. The method automatically ensures exact minimization for certain important self-dual functions with 2^(n-1) points in their one-set. The elements meant for grouping are determined from a newly proposed weighted incidence matrix. The binary value corresponding to a candidate pair is then correlated with the proposed binary value matrix to enable direct synthesis. We recommend algebraic factorization operations as a post-processing step to enable a reduction in literal count. The algorithm can be implemented in any high-level language and achieves the best cost optimization for the problem dealt with, irrespective of the number of inputs. For other cases, the method is iterated to reduce the problem to one of O(n-1), O(n-2), ..., which is then solved. In addition, it leads to optimal results for problems exhibiting a higher degree of adjacency, with a different interpretation of the heuristic, and the results are comparable with other methods. In terms of literal cost, at the technology-independent stage, the circuits synthesized using our algorithm enabled net savings over AOI (AND-OR-Invert) logic, AND-EXOR logic (EXOR Sum-of-Products or ESOP forms) and AND-OR-EXOR logic of 45.57%, 41.78% and 41.78%, respectively, for the various problems. Circuit-level simulations were performed for a wide variety of case studies at 3.3 V and 2.5 V supply to validate the performance of the proposed method and the quality of the resulting synthesized circuits at two different voltage corners. Power estimation was carried out for a 0.35-micron TSMC CMOS process technology. In comparison with AOI logic, the proposed method enabled mean power savings of 42.46%. With respect to AND-EXOR logic, the proposed method yielded power savings of 31.88%, while in comparison with AND-OR-EXOR level networks, average power savings of 33.23% were obtained.
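The pairing criterion above rests on the Hamming distance between minterms. A minimal sketch (illustrative only, not the proposed weighted-incidence-matrix algorithm) of finding candidate pairs at a given Hamming distance:

```python
def hamming_distance(a, b, n_bits):
    """Hamming distance between two minterms given as integers over n_bits inputs."""
    return bin((a ^ b) & ((1 << n_bits) - 1)).count("1")

def candidate_pairs(minterms, n_bits, target_distance):
    """Return minterm pairs whose Hamming distance equals target_distance."""
    pairs = []
    for i, mj in enumerate(minterms):
        for mk in minterms[i + 1:]:
            if hamming_distance(mj, mk, n_bits) == target_distance:
                pairs.append((mj, mk))
    return pairs

# Hypothetical 4-input one-set: pairs at distance n = 4 are EXOR-friendly groupings
print(candidate_pairs([0b0000, 0b1111, 0b0011, 0b1100], n_bits=4, target_distance=4))
```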

Grouping and Indexing Color Features for Efficient Image Retrieval

Content-based Image Retrieval (CBIR) aims at searching image databases for specific images that are similar to a given query image, based on matching features derived from the image content. This paper focuses on a low-dimensional color-based indexing technique for achieving efficient and effective retrieval performance. In our approach, the color features are extracted using the mean shift algorithm, a robust clustering technique. The cluster (region) mode is then used as the representative of the image in 3-D color space. The feature descriptor consists of the representative color of a region and is indexed using a spatial indexing method that uses an R*-tree, thus avoiding the high-dimensional indexing problems associated with the traditional color histogram. Alternatively, the images in the database are clustered based on region feature similarity using the Euclidean distance, and only the representative (centroid) features of these clusters are indexed using the R*-tree, thus improving efficiency. For similarity retrieval, each representative color in the query image or region is used independently to find regions containing that color. The results of these methods are compared. A Java-based query engine supporting query-by-example is built to retrieve images by color.
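A brief sketch of the color-extraction step using mean shift clustering, shown here with scikit-learn as an assumed dependency; the paper's own implementation details are not given in the abstract.

```python
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

def representative_colors(image, quantile=0.2, sample=1000):
    """Cluster pixels in 3-D color space with mean shift and return the cluster modes."""
    pixels = image.reshape(-1, 3).astype(float)
    bw = estimate_bandwidth(pixels, quantile=quantile, n_samples=sample)
    ms = MeanShift(bandwidth=bw, bin_seeding=True)
    ms.fit(pixels)
    return ms.cluster_centers_        # one representative color per region

# img = ...  # H x W x 3 array; feature descriptor = representative_colors(img)
```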

Comparison of Phylogenetic Trees of Multiple Protein Sequence Alignment Methods

Multiple sequence alignment is a fundamental part of many bioinformatics applications, such as phylogenetic analysis. Many alignment methods have been proposed. Each method gives a different result for the same data set and consequently generates a different phylogenetic tree. Hence, the chosen alignment method affects the resulting tree. However, in the literature there is no evaluation of multiple alignment methods based on the comparison of their phylogenetic trees. This work evaluates the following eight aligners: ClustalX, T-Coffee, SAGA, MUSCLE, MAFFT, DIALIGN, ProbCons and Align-m, based on the phylogenetic trees (test trees) they produce on a given data set. The Neighbor-Joining method is used to estimate the trees. Three criteria, namely the dNNI, the dRF and the Id_Tree, are established to test the ability of the different alignment methods to produce test trees close to the reference one (true tree). Results show that the method which produces the most accurate alignment gives the test tree nearest to the reference tree. MUSCLE outperforms all aligners with respect to the three criteria and for all datasets, performing particularly well when sequence identities are within 10-20%; it is followed by T-Coffee at lower sequence identities. Above about 30% identity, the tree scores of all methods become similar.
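One of the criteria above, dRF, is the Robinson-Foulds distance between a test tree and the reference tree. A minimal sketch of computing it, assuming the DendroPy library and hypothetical Newick trees (not part of the paper):

```python
import dendropy
from dendropy.calculate import treecompare

# Hypothetical reference (true) tree and a test tree estimated from an alignment
tns = dendropy.TaxonNamespace()
true_tree = dendropy.Tree.get(data="((A,B),(C,(D,E)));", schema="newick",
                              taxon_namespace=tns)
test_tree = dendropy.Tree.get(data="((A,C),(B,(D,E)));", schema="newick",
                              taxon_namespace=tns)

# Robinson-Foulds (symmetric difference) distance between the two topologies
rf = treecompare.symmetric_difference(true_tree, test_tree)
print("dRF =", rf)
```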

Model Development for Allocation of Raw Material in Timber Processing Industry in Indonesia

This research develops a raw material allocation model for the timber processing industry at Perum Perhutani Unit I, Central Java, Indonesia. The model can be used to determine the quantity of timber allocated between chains in the supply chain and to select suppliers, considering log price and distance as factors. In determining the quantity of timber allocated between chains, the model considers the optimal inventory in each chain, where the optimal inventory is determined based on the demand forecast, capacity and safety stock. The allocation problem is solved by developing a linear programming model that aims to minimize the total purchase, transportation and storage costs at each chain. The results of numerical examples show that the proposed model can generate purchase cost savings of 20.84% and select suppliers at shorter distances.
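An illustrative toy version of such an allocation model, assuming SciPy and entirely hypothetical prices, distances, capacities and demand (not the paper's data or formulation):

```python
import numpy as np
from scipy.optimize import linprog

# Two suppliers, one mill: minimize purchase + transport cost of allocated timber.
price = np.array([50.0, 47.0])        # hypothetical log price per m^3
dist = np.array([30.0, 80.0])         # hypothetical distance (km)
transport_rate = 0.4                  # hypothetical cost per m^3 per km
cost = price + transport_rate * dist  # objective coefficients per supplier

capacity = [400.0, 600.0]             # hypothetical supplier capacities (m^3)
demand = 700.0                        # hypothetical mill demand (m^3)

res = linprog(
    c=cost,
    A_eq=[[1.0, 1.0]], b_eq=[demand],             # allocations must meet demand
    bounds=[(0, capacity[0]), (0, capacity[1])],  # respect supplier capacities
    method="highs",
)
print(res.x, res.fun)   # optimal allocation per supplier and total cost
```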

Sequence Relationships Similarity of Swine Influenza A (H1N1) Virus

In April 2009, a new variant of influenza A virus subtype H1N1 emerged in Mexico and spread all over the world. Influenza has three subtypes circulating in humans (H1N1, H1N2 and H3N2); types B and C influenza tend to be associated with local or regional epidemics. Preliminary genetic characterization of the influenza viruses identified them as swine influenza A (H1N1) viruses. Nucleotide sequence analysis shows that the haemagglutinin (HA) and neuraminidase (NA) genes are similar to each other, that the majority of the genes resemble those of swine influenza viruses, and that the two genes coding for the neuraminidase (NA) and matrix (M) proteins are similar to the corresponding genes of swine influenza. Sequence similarity between the 2009 A (H1N1) virus and its nearest relatives indicates that its gene segments have been circulating undetected for an extended period. Nucleic acid sequences were analysed using the Maximum Composite Likelihood (MCL) method and DNA empirical base frequencies; the phylogenetic relationship among the HA genes of H1N1 viruses obtained from GenBank shows high nucleotide sequence homology. In this paper, we used 16 HA nucleotide sequences from NCBI to compute the sequence relationship similarity of swine influenza A virus; using the MCL method the result is 28%, with 36.64% for the optimal tree with the sum of branch lengths, 35.62% for the interior-branch Neighbor-Joining tree, 1.85% for the overall transition/transversion ratio, and 8.28% for the overall mean distance.

Decision Tree-based Feature Ranking using Manhattan Hierarchical Cluster Criterion

Feature selection is gaining importance due to its contribution to saving classification cost in terms of time and computation load. One of the methods of searching for essential features is via a decision tree, which acts as an intermediate feature-space inducer for choosing essential features. In decision tree-based feature selection, some studies use the decision tree as a feature ranker with a direct threshold measure, while others retain the decision tree but utilize a pruning condition that acts as a threshold mechanism to choose features. This paper proposes a threshold measure using the Manhattan hierarchical cluster distance, to be utilized in feature ranking in order to choose relevant features as part of the feature selection process. The results are promising, and this method can be improved in the future by including test cases with a higher number of attributes.
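A rough sketch of the two ingredients involved, decision-tree-based feature ranking and hierarchical clustering under the Manhattan (city-block) metric, assuming scikit-learn and SciPy; the paper's exact threshold criterion is not reproduced, and the cut value below is hypothetical.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
from scipy.cluster.hierarchy import linkage, fcluster

X, y = load_iris(return_X_y=True)

# 1) Rank features with a decision tree (impurity-based importances).
tree = DecisionTreeClassifier(random_state=0).fit(X, y)
ranking = np.argsort(tree.feature_importances_)[::-1]
print("feature ranking:", ranking)

# 2) Hierarchically cluster the feature columns with the Manhattan metric;
#    a distance threshold over this dendrogram could then decide how many of
#    the top-ranked features to keep (hypothetical cut value below).
Z = linkage(X.T, method="average", metric="cityblock")
groups = fcluster(Z, t=50.0, criterion="distance")
print("feature groups:", groups)
```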