New Graph Similarity Measurements based on Isomorphic and Nonisomorphic Data Fusion and their Use in the Prediction of the Pharmacological Behavior of Drugs

New graph similarity methods are proposed in this work with the aim of refining the chemical information extracted from molecule matching. For this purpose, data fusion of the isomorphic and nonisomorphic subgraphs into a new similarity measure, the Approximate Similarity, was carried out using several approaches. The application of the proposed method to the development of quantitative structure-activity relationships (QSAR) has provided reliable tools for predicting several pharmacological parameters: the binding of steroids to the globulin-corticosteroid receptor, the activity of benzodiazepine receptor compounds, and blood-brain barrier permeability. Acceptable results were obtained for the models presented here.
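
The paper's exact fusion formula is not reproduced in the abstract, so the following is only a minimal sketch of the general idea: combine a classical Tanimoto term on the isomorphic (maximum common subgraph) part with a term that penalizes the nonisomorphic remainders. The weighting scheme and the penalty term are illustrative assumptions, not the authors' Approximate Similarity definition.

```python
# Hedged sketch: one way to fuse isomorphic and nonisomorphic graph information
# into a single similarity score. The weights and the penalty term are
# illustrative assumptions, not the paper's formula.

def tanimoto_similarity(mcs_size: int, size_a: int, size_b: int) -> float:
    """Classical Tanimoto index on the maximum common (isomorphic) subgraph."""
    return mcs_size / (size_a + size_b - mcs_size)

def approximate_similarity(mcs_size: int, size_a: int, size_b: int,
                           weight: float = 0.7) -> float:
    """Convex combination of the isomorphic term and a term that rewards
    small nonisomorphic remainders (the fragments outside the common subgraph)."""
    iso = tanimoto_similarity(mcs_size, size_a, size_b)
    leftover_a = size_a - mcs_size          # nonisomorphic part of molecule A
    leftover_b = size_b - mcs_size          # nonisomorphic part of molecule B
    noniso = 1.0 - (leftover_a + leftover_b) / (size_a + size_b)
    return weight * iso + (1.0 - weight) * noniso

# Example: two molecules with 20 and 24 heavy atoms sharing a 15-atom common subgraph.
print(round(approximate_similarity(15, 20, 24), 3))
```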

Gauteng's Waste Outlook: A Reflection

Gauteng, the province with the greatest industrial and population density and the economic hub of South Africa, also generates the greatest amount of waste, both general and hazardous. The province therefore has a significant need to develop and apply appropriate integrated waste management policies that ensure waste is recognised as a serious problem and is managed in an effective, integrated manner to protect human health and the environment, both now and in the future. This paper reflects on Gauteng's waste outlook, in particular the province's General Waste Minimisation Plan and its Integrated Waste Management Policy. The paper also looks at general waste generation, recyclable waste streams, and recycling and separation-at-source initiatives in the province. Both the quantity and the nature of solid waste differ considerably across the socio-economic spectrum. People in informal settlements generate an average of 0.16 kg per person per day, whereas 2 kg per day is not unusual in affluent areas. For example, the amount of waste generated in Johannesburg is approximately 1.2 kg per person per day.

The Influence of Heat Treatment on Antimicrobial Proteins in Milk

Thermal treatment is an obligatory step in the process of concentrating immunoglobulins and lysozyme. The combination of temperature and time used in processing can affect the structure of the proteins and cause unfolding and aggregation. The aim of the present study was to evaluate the heat stability of total Igs, the particular immunoglobulin classes, and lysozyme in milk. Milk samples were obtained from a conventional dairy herd in Latvia. Raw milk samples were pasteurized under different regimes: 63 °C for 30 min, and 72 °C, 78 °C, 85 °C, and 95 °C for 15-20 s each. The concentrations of Igs (IgA, IgG, IgM) and lysozyme were determined by a turbidimetric method. The study established that the activities of the antimicrobial proteins decrease to different extents; the smallest reduction in concentration was observed for lysozyme.

Automatic Reusability Appraisal of Software Components using Neuro-fuzzy Approach

Automatic reusability appraisal could be helpful in evaluating the quality of developed or developing reusable software components and in identifying reusable components from existing legacy systems, which can save the cost of developing software from scratch. However, the issue of how to identify reusable components from existing systems has remained relatively unexplored. In this paper, we present a two-tier approach that studies the structural attributes of a component as well as its usability or relevancy to a particular domain. Latent semantic analysis is used for the feature-vector representation of various software domains. It exploits the fact that feature-vector codes can be seen as documents containing terms (the identifiers present in the components), so text-modeling methods that capture co-occurrence information in low-dimensional spaces can be used. Further, we devised a neuro-fuzzy hybrid inference system that takes structural metric values as input and calculates the reusability of the software component. A decision tree algorithm is used to decide the initial set of fuzzy rules for the neuro-fuzzy system. The results obtained are convincing enough to propose the system for the economical identification and retrieval of reusable software components.
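
As a rough illustration of the latent-semantic-analysis step only, the sketch below treats each component as a document whose "terms" are the identifiers it contains and projects the components into a low-dimensional concept space via a truncated SVD. The toy identifier lists and the chosen rank are assumptions; the paper's actual corpus, feature encoding, and neuro-fuzzy stage are not reproduced.

```python
# Hedged sketch of LSA on identifier "documents"; data and rank k are illustrative.
import numpy as np

components = {
    "stack_component":  ["push", "pop", "peek", "size"],
    "queue_component":  ["enqueue", "dequeue", "peek", "size"],
    "matrix_component": ["multiply", "transpose", "invert", "size"],
}

terms = sorted({t for ids in components.values() for t in ids})
names = list(components)

# Term-document matrix of identifier counts (rows = terms, columns = components).
A = np.array([[ids.count(t) for ids in components.values()] for t in terms],
             dtype=float)

# Truncated SVD: keep the k strongest latent concepts.
k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
doc_vectors = (np.diag(s[:k]) @ Vt[:k]).T     # one k-dimensional vector per component

for name, vec in zip(names, doc_vectors):
    print(name, np.round(vec, 3))
```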

Fiber Optic Sensors

Fiber optic sensor technology offers the possibility of sensing different parameters, such as strain, temperature, and pressure, in harsh environments and remote locations. These sensors either modulate some feature of the light wave in an optical fiber, such as its intensity or phase, or use the optical fiber as a medium for transmitting the measurement information. The advantages of fiber optic sensors over conventional electrical ones make them popular in many applications, and nowadays they are considered a key component in improving industrial processes, quality control systems, and medical diagnostics, and in preventing and controlling general process abnormalities. This paper is an introduction to fiber optic sensor technology and some of the applications that make this branch of optics, which is still in its infancy, an interesting field.

The Association between Firm Characteristics and Corporate Mandatory Disclosure: The Case of Greece

The main thrust of this paper is to assess the level of disclosure in the annual reports of non-financial Greek firms and to empirically investigate the hypothesized impact of several firm characteristics on the extent of mandatory disclosure. A disclosure checklist consisting of 100 mandatory items was developed to assess the level of disclosure in the 2009 annual reports of 43 Greek companies listed on the Athens Stock Exchange. The association between the level of disclosure and several firm characteristics was examined using multiple linear regression analysis. The study reveals that Greek companies have, in general, responded adequately to the mandatory disclosure requirements of the regulatory bodies. The findings also indicate that firm size is significantly and positively associated with the level of disclosure. The remaining variables, such as age, profitability, liquidity, and board composition, were found to be insignificant in explaining the variation in mandatory disclosure. The outcome of this study is of interest to the investment community at large, as it helps in evaluating the extent of mandatory disclosure by Greek firms and in explaining the variation in disclosure in light of firm-specific characteristics.
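
A minimal sketch of the regression step follows: a disclosure index (the share of the 100 checklist items disclosed) is regressed on firm characteristics by ordinary least squares. The tiny data set below is fabricated purely for illustration and is not the study's sample; the variable set and coding are assumptions.

```python
# Hedged sketch of multiple linear regression of a disclosure index on firm characteristics.
import numpy as np

# Columns: log(total assets), firm age, profitability (ROA), liquidity ratio.
X = np.array([
    [18.2, 25, 0.07, 1.4],
    [19.5, 40, 0.03, 1.1],
    [17.8, 12, 0.10, 2.0],
    [20.1, 55, 0.05, 0.9],
    [18.9, 30, 0.08, 1.6],
])
y = np.array([0.62, 0.74, 0.58, 0.81, 0.69])   # disclosure index per firm (toy values)

# Add an intercept column and solve the least-squares problem.
X1 = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)

for name, b in zip(["intercept", "size", "age", "profitability", "liquidity"], coef):
    print(f"{name:>14}: {b:+.4f}")
```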

Adaptation of State/Transition-Based Methods for Embedded System Testing

In this paper, test generation methods and appropriate fault models for the testing and analysis of embedded systems described as (extended) finite state machines ((E)FSMs) are presented. Compared to simple FSMs, EFSMs specify not only the control flow but also the data flow. Thus, we define a two-level fault model to cover both aspects. The goal of this paper is to reuse well-known FSM-based test generation methods for the automation of embedded system testing. These methods have been widely used in the testing and validation of protocols and communicating systems. In particular, (E)FSM-based specification and testing is advantageous because (E)FSMs support the formal semantics of already standardised formal description techniques (FDTs), in addition to being popular in the design of hardware and software systems.
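
To make the reuse idea concrete, here is a minimal sketch of one classical FSM-based test generation scheme of the kind the abstract alludes to: covering every transition of a deterministic FSM with input sequences found by breadth-first search. The tiny machine and the transition-coverage criterion are illustrative assumptions, not the paper's fault model or case study.

```python
# Hedged sketch: transition-coverage test generation for a small deterministic FSM.
from collections import deque

# transitions[state][input] = next_state
transitions = {
    "idle":    {"start": "running"},
    "running": {"pause": "paused", "stop": "idle"},
    "paused":  {"resume": "running", "stop": "idle"},
}

def shortest_input_sequence(start, target):
    """BFS over the FSM graph to find an input sequence reaching `target`."""
    queue, seen = deque([(start, [])]), {start}
    while queue:
        state, path = queue.popleft()
        if state == target:
            return path
        for inp, nxt in transitions[state].items():
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [inp]))
    return None

# One test per transition: reach the source state, then fire the transition.
tests = []
for src, edges in transitions.items():
    for inp in edges:
        tests.append(shortest_input_sequence("idle", src) + [inp])

for t in tests:
    print(" -> ".join(t))
```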

Royal Mound "Baygetobe" from the Shilikty Burial Ground

Mounds are one of the most valuable sources of information on various aspects of life, household skills, rituals, and beliefs of the ancient peoples of Kazakhstan. Moreover, the objects associated with the cult of burying the dead are the most informative, and often the only, source of knowledge about past eras. The present study is devoted to some results of the excavations carried out at the "Baygetobe" mound of the Shilikti burial ground. The purpose of the work is associated with certain categories of grave goods and with reading the "fine text" of the Shilikti graves, whose structure is the same for the burials of nobles and for ordinary graves. The preservation of a royal burial mound, together with the integrity and completeness of the source, is of particular value for study.

ASLT Method for Beer Accelerated Shelf-Life Determination

The aim of the current research was to investigate the suitability of the ASLT method for accelerated determination of beer shelf life. The research was carried out on popular Latvian beers: light filtered and unfiltered pasteurized beer with an alcohol content of 5.2%, and dark filtered pasteurized beer with an alcohol content of 4.2% and a shelf life of five months. Beer samples bottled in dark glass were stored for 20 weeks at several temperature regimes: +10±1 °C, +20±1 °C, +30±1 °C, and +40±1 °C. Physico-chemical and microbiological quality parameters of the samples were tested every two weeks using standard methods. Storage at +30±1 °C made it possible to determine shelf life faster by a factor of 2.5 for filtered pasteurized light beer, 1.4 for unfiltered pasteurized light beer, and 1.7 for filtered pasteurized dark beer. The present experiments showed that beer shelf life can be determined rapidly using the ASLT method if the storage temperature is increased by 10±1 °C.
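
For context, accelerated shelf-life testing is often analysed with a simple Q10 temperature-acceleration model. The sketch below shows only that generic arithmetic; the Q10 value of 2 and the observed failure time are textbook-style assumptions, not results of this study, which reports empirical acceleration factors of 1.4 to 2.5 at +30 °C.

```python
# Hedged sketch of the generic Q10 arithmetic commonly used in ASLT.

def acceleration_factor(t_accel_c: float, t_normal_c: float, q10: float = 2.0) -> float:
    """Acceleration factor AF = Q10 ** ((T_accelerated - T_normal) / 10)."""
    return q10 ** ((t_accel_c - t_normal_c) / 10.0)

# Shelf life observed at the accelerated temperature, scaled back to normal storage.
af = acceleration_factor(t_accel_c=30.0, t_normal_c=20.0)
accelerated_weeks = 8.0                      # hypothetical observed failure time
print(f"AF = {af:.1f}, estimated real shelf life = {af * accelerated_weeks:.0f} weeks")
```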

Integrated Energy-Aware Mechanism for MANETs using On-demand Routing

Mobile Ad Hoc Networks (MANETs) are multi-hop wireless networks in which all nodes cooperatively maintain network connectivity. In such a multi-hop wireless network, every node may be required to perform routing in order to achieve end-to-end communication among nodes. These networks are energy constrained, as most ad hoc mobile nodes today operate with limited battery power. Hence, it is important to minimize the energy consumption of the entire network in order to maximize the lifetime of ad hoc networks. In this paper, a mechanism integrating a load balancing approach and a transmission power control approach is introduced to maximize the lifespan of MANETs. The mechanism is applied to the Ad hoc On-demand Distance Vector (AODV) protocol to make it an energy-aware AODV (EA_AODV). The simulation is carried out using the GloMoSim 2.03 simulator. The results show that the proposed mechanism reduces the average required transmission energy per packet compared to the standard AODV.
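
As a rough illustration of how load balancing and transmission power control can be combined in a route metric, the sketch below scores candidate routes by total per-hop transmission energy plus a penalty for routing through nodes with low residual battery. The cost formula, weights, and topology are illustrative assumptions, not the EA_AODV metric from the paper.

```python
# Hedged sketch of an energy-aware route cost combining transmission energy
# and residual battery of intermediate nodes; values are illustrative.

def route_cost(route, tx_energy, residual_battery, alpha=0.5):
    """Lower is better: total transmission energy plus a penalty for routing
    through nodes whose remaining battery charge is low."""
    energy = sum(tx_energy[(a, b)] for a, b in zip(route, route[1:]))
    weakest = min(residual_battery[n] for n in route[1:-1]) if len(route) > 2 else 1.0
    return alpha * energy + (1 - alpha) * (1.0 / weakest)

tx_energy = {("S", "A"): 1.0, ("A", "D"): 1.2, ("S", "B"): 0.8,
             ("B", "C"): 0.9, ("C", "D"): 0.7}
residual_battery = {"A": 0.2, "B": 0.9, "C": 0.8}   # fraction of full charge

candidates = [["S", "A", "D"], ["S", "B", "C", "D"]]
best = min(candidates, key=lambda r: route_cost(r, tx_energy, residual_battery))
print("selected route:", " -> ".join(best))
```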

Environmental Management System for Tourist Accommodations in Amphawa, Samut Songkram, Thailand

Amphawa is the most popular weekend destination for both domestic and international tourists in Thailand. More than 112 homestays and resorts have been developed along its waterways. This research aims to initiate an appropriate environmental management system for riverside tourist accommodations in Amphawa by investigating their current environmental characteristics. Eighty-eight riverside tourist accommodations were surveyed using a dedicated questionnaire, and GPS data were also gathered for spatial analysis. The results revealed that the accommodations are well managed with regard to some environmental aspects. To reduce economic costs, energy-efficient equipment is utilized. A substantial number of tourist accommodations encouraged waste separation, followed by transfer to the local administrative organization. Grease traps are also used to decrease the discharge of chemicals, grease, and oil from canteens and restaurants into the natural environment. The most notable proposed mitigation is to introduce environmentally friendly cleansers for tourist accommodations along the riverside in tourism destinations.

Measurement of the Bipolarization Events

We intend to point out the differences between the classical Gini concentration coefficient and a proposed bipolarization index defined for an arbitrary random variable with finite support. Gini's index measures only the "poverty degree" of the individuals in a given population, taking into consideration their wages. The Gini coefficient is not very sensitive to significant income variations within the "rich class". In practice there are multiple interdependent relations between pauperization and socio-economic polarization phenomena. The presence of a strong pauperization effect within a population often induces a polarization effect in that society. But pauperization and polarization are not identical phenomena. For this reason it is not always adequate to use a Gini-type coefficient, based on the Lorenz order, to estimate the bipolarization level of the individuals in the studied population. The present paper emphasizes these ideas by considering two families of random variables having linear or triangular type distributions. In addition, the continuous variation of the chosen distributions, depending on a "time" parameter, can simulate a real dynamical evolution of the population.
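
For reference, the classical Gini coefficient that the abstract contrasts with the proposed bipolarization index can be computed from the mean absolute difference of incomes. The sketch below shows only this standard definition; the bipolarization index itself is not reproduced, and the wage vectors are placeholders.

```python
# Hedged sketch of the classical Gini concentration coefficient.
import numpy as np

def gini(incomes):
    """G = sum_i sum_j |x_i - x_j| / (2 * n^2 * mean(x))."""
    x = np.asarray(incomes, dtype=float)
    diff_sum = np.abs(x[:, None] - x[None, :]).sum()
    return diff_sum / (2.0 * len(x) ** 2 * x.mean())

# Two illustrative wage vectors; the second is visibly more bipolarized.
print(round(gini([10, 20, 30, 40, 50]), 3))
print(round(gini([10, 10, 30, 50, 50]), 3))
```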

Dual Pyramid of Agents for Image Segmentation

Mammographic screening is an effective method for the early detection of breast cancer. One of the most important signs of early breast cancer is the presence of microcalcifications. For the detection of microcalcifications in a mammography image, we propose a multi-agent system based on a dual irregular pyramid. An initial segmentation is obtained by an incremental approach; the result represents level zero of the pyramid. The edge information obtained by applying the Canny filter is taken into account to refine the segmentation. The edge agents and region agents cooperate level by level within the pyramid, exploiting its various characteristics to ensure convergence of the segmentation process.
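
The sketch below illustrates only the Canny edge-information step on a synthetic image that stands in for a mammogram; the agent and pyramid machinery of the paper is not reproduced, and the image, noise level, and sigma parameter are assumptions.

```python
# Hedged sketch: Canny edge map of a synthetic image with a small bright spot
# (a crude stand-in for a microcalcification).
import numpy as np
from skimage import feature

image = np.zeros((64, 64))
image[30:34, 30:34] = 1.0
image += 0.05 * np.random.default_rng(0).standard_normal(image.shape)

edges = feature.canny(image, sigma=1.0)       # boolean edge map
print("edge pixels found:", int(edges.sum()))
```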

Food Deserts and the Sociology of Space: Distance to Food Retailers and Food Insecurity in an Urban American Neighborhood

Recent changes in the structure of food retailing have led to the development of large supercenters in suburban areas of the United States. These changes have led some authors to suggest that there are food deserts in some urban areas, where food is difficult to access, especially for disadvantaged consumers. This study tests the food desert hypothesis by comparing the distances from food retailers to food-secure and food-insecure households in one urban, Midwestern neighborhood. The study uses GIS to compare the locations of household survey respondents with the locations of various types of area food retailers. The results indicate no apparent difference between food-secure and food-insecure households in the reported importance of distance in the decision to shop at various retailers. However, there were differences in the spatial relationship between households and retailers. Food-insecure households tended to be located slightly farther from large food retailers and slightly closer to convenience stores. Furthermore, food-insecure households reported traveling slightly farther to their primary food retailer. The differences between the two groups were, however, relatively small.
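
A minimal sketch of the kind of spatial comparison described here is the great-circle (haversine) distance from a household to nearby retailers of different types. The coordinates below are placeholders, not the study's data, and the study's actual GIS workflow may use network rather than straight-line distance.

```python
# Hedged sketch: haversine distance from a household to two retailer types.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two latitude/longitude points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

household = (41.2565, -95.9345)
retailers = {"supercenter": (41.2300, -96.0100), "convenience": (41.2590, -95.9400)}

for kind, (lat, lon) in retailers.items():
    print(kind, round(haversine_km(household[0], household[1], lat, lon), 2), "km")
```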

Similarity Measure Functions for Strategy-Based Biometrics

The functioning of a biometric system depends in large part on the performance of its similarity measure function. Frequently, a generalized distance measure such as the Euclidean or Mahalanobis distance is applied to the task of matching biometric feature vectors. However, the accuracy of a biometric system can often be greatly improved by designing a customized matching algorithm optimized for a particular biometric application. In this paper we propose a tailored similarity measure function for behavioral biometric systems based on expert knowledge of the feature-level data in the domain. We compare the performance of the proposed matching algorithm to that of other well-known similarity distance functions and demonstrate its superiority with respect to the chosen domain.
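
To show the baseline measures the abstract names, the sketch below computes Euclidean and Mahalanobis distances between a probe sample and an enrolled template built from toy feature vectors; the paper's customized similarity function itself is not shown.

```python
# Hedged sketch contrasting Euclidean and Mahalanobis distances on toy
# behavioral-biometric feature vectors.
import numpy as np
from scipy.spatial.distance import euclidean, mahalanobis

rng = np.random.default_rng(0)
enrolled = rng.normal(size=(50, 4))           # enrolled feature vectors (toy data)
probe = enrolled.mean(axis=0) + 0.3           # a probe sample to be matched

template = enrolled.mean(axis=0)
VI = np.linalg.inv(np.cov(enrolled, rowvar=False))   # inverse covariance matrix

print("Euclidean:  ", round(euclidean(probe, template), 3))
print("Mahalanobis:", round(mahalanobis(probe, template, VI), 3))
```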

DIFFER: A Propositionalization Approach for Learning from Structured Data

Logic-based methods for learning from structured data are limited with respect to handling large search spaces, preventing large substructures from being considered by the resulting classifiers. A novel approach to learning from structured data is introduced that employs a structure transformation method, called finger printing, to address these limitations. The method, which generates features corresponding to arbitrarily complex substructures, is implemented in a system called DIFFER. The method is demonstrated to perform comparably to an existing state-of-the-art method on several benchmark data sets without requiring restrictions on the search space. Furthermore, learning from the union of the features generated by finger printing and by the previous method outperforms learning from each individual set of features on all benchmark data sets, demonstrating the benefit of developing complementary, rather than competing, methods for structure classification.

Frequency-Energy Characteristics of Local Earthquakes Using the Discrete Wavelet Transform (DWT)

The wavelet transform is one of the most important methods used in signal processing. In this study, we introduce the frequency-energy characteristics of local earthquakes using the discrete wavelet transform. The frequency-energy characteristic was analyzed depending on the difference between the P and S wave arrival times and on the noise within the records. We found that local earthquakes have similar characteristics. If the frequency-energy characteristics can be determined accurately, they give a hint for calculating the P and S wave arrival times, and the wavelet transform provides a successful approximation for this. In this study, approximately 100 earthquakes with 500 records were analyzed.
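
A minimal sketch of a frequency-energy profile via the discrete wavelet transform follows, using PyWavelets on a synthetic trace that stands in for a seismogram. The wavelet choice ('db4'), decomposition depth, and signal content are assumptions, not the study's processing chain.

```python
# Hedged sketch: energy per DWT decomposition level of a synthetic trace.
import numpy as np
import pywt

fs = 100.0                                         # sampling rate in Hz
t = np.arange(0, 20, 1 / fs)
signal = (np.sin(2 * np.pi * 1.5 * t)              # low-frequency component
          + 0.5 * np.sin(2 * np.pi * 12.0 * t)     # higher-frequency component
          + 0.1 * np.random.default_rng(0).standard_normal(t.size))

coeffs = pywt.wavedec(signal, "db4", level=5)      # [cA5, cD5, cD4, ..., cD1]
energies = [float(np.sum(c ** 2)) for c in coeffs]
total = sum(energies)

for name, e in zip(["A5", "D5", "D4", "D3", "D2", "D1"], energies):
    print(f"{name}: {100 * e / total:5.1f} % of signal energy")
```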

Engineering Study and Equipment Design: Effects of Temperature and Design Variables on the Yield of a Multi-Stage Distillator

The distillation process is, in the general sense, a relatively simple technique from the standpoint of its principles. When distillation is applied to water treatment, and specifically to producing fresh water from sea, ocean, or briny waters, it is interesting to notice that distillation has no limitations or domains of applicability regarding the nature or type of the feedstock water. This is not the case, however, for other techniques that are technologically quite complex, require larger capital investments, and are limited in their usability. In a previous paper we explored some of the effects of temperature on yield. In this paper, we continue building on that knowledge base and focus on the effects of several additional engineering and design variables on productivity.

Evaluating the Effectiveness of Memory Overcommit Techniques on KVM-based Hosting Platform

Determining how many virtual machines a Linux host can run is a challenge. One of the toughest tasks is finding the balance among performance, density, and usability. The KVM hypervisor has become the most popular open-source full-virtualization solution, and it supports several ways of running guests with more memory than the host actually has. Given the large differences between minimum and maximum guest memory requirements, this paper presents initial results on same-page merging, ballooning, and live migration techniques aimed at optimal memory usage on a KVM-based cloud platform. Given the design of the initial experiments, the resulting data provide a useful reference for system administrators. The experiments show that each method offers a different reliability tradeoff.

Kinematic Analysis of Roll Motion for a Strut/SLA Suspension System

The roll center is one of the key parameters in suspension design. Several driving characteristics are affected significantly by the migration of the roll center during the suspension's motion. The strut/SLA (strut/short-long-arm) suspension, which is widely used in production cars, combines the space-saving characteristics of a MacPherson strut suspension with some of the preferred handling characteristics of an SLA suspension. In this study, a front strut/SLA suspension is modeled in ADAMS/Car software. Kinematic roll analysis is then employed to investigate how the rolling characteristics change under wheel travel and steering input. The related parameters, including the roll center height, roll camber gain, toe change, scrub radius, and wheel track width change, are analyzed and discussed. It is found that the strut/SLA suspension clearly has a higher roll center than either the strut or the SLA suspension does. The variations in roll center height under roll analysis differ considerably when wheel travel displacement and steering angle are added. The results for roll camber gain, scrub radius, and wheel track width change are considered satisfactory. However, the toe change is too large and needs fine-tuning through a sensitivity analysis.