Recent Advances in Computational Proteomics

In this work we report the progress achieved by our group over the last half decade in the field of computational proteomics. Specifically, we discuss the application of Molecular Dynamics Simulations and Electronic Structure Calculations in drug design, in the clarification of the structural and dynamic properties of proteins and enzymes, and in the understanding of the catalytic and inhibition mechanisms of cancer-related enzymes. A set of examples illustrates the concepts and helps to introduce the reader to this important and fast-moving field.

Finding Fuzzy Association Rules Using FWFP-Growth with Linguistic Supports and Confidences

In data mining, association rules are used to discover relations among the items in a transaction database. Once the data have been collected and stored, valuable rules can be found through association rule mining, helping managers to devise marketing strategies and plan the market framework. In this paper, we apply fuzzy partition methods and determine the membership functions of the quantitative values of each transaction item. In addition, managers can express the importance of items as linguistic terms, which are transformed into fuzzy sets of weights. Fuzzy weighted frequent pattern growth (FWFP-Growth) is then used to complete the data mining process. The proposed method is expected to improve on the Apriori algorithm by mining the complete set of association rules more efficiently. An example is given to clearly illustrate the proposed approach.
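
A minimal sketch of the two ingredients the abstract describes, fuzzifying quantitative purchase amounts into linguistic regions and weighting items by manager-supplied importance: the triangular regions, the weights and the weighted-minimum support definition below are illustrative assumptions, not the paper's exact FWFP-Growth procedure.

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

REGIONS = {"Low": (0, 2, 4), "Middle": (3, 5, 7), "High": (6, 8, 10)}   # hypothetical fuzzy regions
ITEM_WEIGHT = {"bread": 0.9, "milk": 0.6, "beer": 0.3}                  # hypothetical linguistic weights

def fuzzify(qty):
    return {region: tri(qty, *abc) for region, abc in REGIONS.items()}

def weighted_fuzzy_support(db, itemset):
    """Average over transactions of the weighted minimum membership of the itemset."""
    total = 0.0
    for txn in db:                                  # txn maps item -> purchased quantity
        degrees = [fuzzify(txn[item])[region] * ITEM_WEIGHT[item] if item in txn else 0.0
                   for item, region in itemset]     # itemset entries look like ("bread", "High")
        total += min(degrees)
    return total / len(db)

db = [{"bread": 7, "milk": 5}, {"bread": 9, "beer": 2}, {"milk": 6}]
print(weighted_fuzzy_support(db, [("bread", "High"), ("milk", "Middle")]))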

Public Participation in Sustainable Urban Planning

Urban planning, particularly in protected landscape areas, demands an increasing role for public participation if the sustainable planning process is to be efficient. The development of urban planning actions in protected landscape areas, such as the Sintra-Cascais Natural Park, should follow a methodological process structured in distinct sequential stages, supporting continuous, interactive, integrated and participative planning. From the start of the Malveira da Serra and Janes Plan process, several public participation actions were promoted in order to involve local agents, stakeholders and the population in deciding specific local key issues and in defining appropriate priorities within the goals and strategies previously established. As a result, public participation encouraged an innovative process that guarantees the efficiency of sustainable urban planning and promotes a new, sustainable way of living in community.

Regular Generalized Star Star Closed Sets in Bitopological Spaces

The aim of this paper is to introduce the concepts of τ1τ2-regular generalized star star closed sets and τ1τ2-regular generalized star star open sets, and to study their basic properties in bitopological spaces.

Dynamic Metrics for Polymorphism in Object Oriented Systems

Measurement is the process by which numbers or symbols are assigned to attributes of real-world entities in such a way as to describe them according to clearly defined rules. Software metrics are instruments for measuring aspects of a software product. These metrics are used throughout a software project to assist in estimation, quality control, productivity assessment, and project control. Object-oriented software metrics focus on measurements applied to classes and other object-oriented characteristics. These measurements give the software engineer insight into the behavior of the software and into changes that can reduce complexity and improve its continuing capability. Object-oriented software metrics can be classified into two types: static and dynamic. Static metrics are concerned with measurement through static analysis of the software, whereas dynamic metrics are concerned with measuring the software at run time. Most earlier work focused on static metrics; some work has addressed the dynamic nature of software measurement, but this area still demands further research. In this paper we present a set of dynamic metrics specifically for polymorphism in object-oriented systems.
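
A minimal sketch of how one runtime polymorphism measurement could be collected: the number of distinct receiver classes actually observed at a dispatched method during execution. The decorator-based instrumentation and the metric name are illustrative assumptions, not the specific metrics proposed in the paper.

from collections import defaultdict

observed_receivers = defaultdict(set)   # method name -> set of concrete receiver classes

def track_dispatch(method):
    """Decorator that records the concrete class on every dynamic dispatch."""
    def wrapper(self, *args, **kwargs):
        observed_receivers[method.__name__].add(type(self).__name__)
        return method(self, *args, **kwargs)
    return wrapper

class Shape:
    @track_dispatch
    def area(self):
        raise NotImplementedError

class Circle(Shape):
    def __init__(self, r): self.r = r
    @track_dispatch
    def area(self): return 3.14159 * self.r ** 2

class Square(Shape):
    def __init__(self, s): self.s = s
    @track_dispatch
    def area(self): return self.s ** 2

for shape in [Circle(1.0), Square(2.0), Circle(3.0)]:
    shape.area()

# "Runtime polymorphism degree": distinct receiver classes seen per method.
for name, classes in observed_receivers.items():
    print(name, len(classes), sorted(classes))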

Using Exponential Lévy Models to Study Implied Volatility Patterns for Electricity Options

European options on German electricity futures are examined, using Lévy processes for the underlying asset. Implied volatility evolution under each of the considered models is discussed after calibrating the Merton jump diffusion (MJD), variance gamma (VG), normal inverse Gaussian (NIG), Carr-Geman-Madan-Yor (CGMY) and Black-Scholes (B&S) models. Implied volatility is examined over the entire sample period, revealing some curious features of market evolution, and the data-fitting performance of the five models is compared. It is shown that variance gamma processes provide relatively better results and that implied volatility shows significant differences through time, appearing to have increased over the period. Volatility changes with changing uncertainty, or alternatively with increasing futures prices, and there is evidence of the need to account for seasonality when modelling both electricity spot/futures prices and volatility.
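
For background, a hedged recap of the two standard definitions the abstract relies on (textbook forms, not the paper's specific procedure): implied volatility is obtained by inverting the Black-Scholes formula quote by quote, and each Lévy model is calibrated by minimizing the pricing error over the observed option quotes.

\[
  C_{\mathrm{mkt}}(K,T) \;=\; C_{\mathrm{BS}}\bigl(F_0,K,T;\,\sigma_{\mathrm{imp}}(K,T)\bigr)
  \quad\text{(defines the implied volatility } \sigma_{\mathrm{imp}}(K,T)\text{)},
\]
\[
  \hat{\theta} \;=\; \arg\min_{\theta}\;\sum_{i}\Bigl(C_{\mathrm{model}}(K_i,T_i;\theta)-C_{\mathrm{mkt}}(K_i,T_i)\Bigr)^{2}
  \quad\text{(calibration of a given L\'evy model)},
\]
where $F_0$ is the current futures price and $\theta$ collects the model parameters, e.g. $(\sigma,\lambda,\mu_J,\delta)$ for MJD or $(C,G,M,Y)$ for CGMY.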

Validation and Application of a New Optimized RP-HPLC-Fluorescent Detection Method for Norfloxacin

A new reverse-phase high-performance liquid chromatography (RP-HPLC) method with fluorescence detection (FLD) was developed and optimized for Norfloxacin determination in human plasma. Mobile phase specifications, the extraction method, and the excitation and emission wavelengths were varied for optimization. The HPLC system used a reverse-phase C18 (5 μm, 4.6 mm × 150 mm) column with the FLD operated at an excitation wavelength of 330 nm and an emission wavelength of 440 nm. The optimized mobile phase consisted of 14% acetonitrile in buffer solution. The aqueous phase, prepared by mixing 2 g of citric acid, 2 g of sodium acetate and 1 mL of triethylamine in 1 L of Milli-Q water, was run at a flow rate of 1.2 mL/min. The standard curve was linear over the range tested (0.156–20 μg/mL), with a coefficient of determination of 0.9978. Aceclofenac sodium was used as the internal standard. A detection limit of 0.078 μg/mL was achieved. The run time was set at 10 minutes; the retention time of norfloxacin was 0.99 min, which shows how rapid this method of analysis is. The present assay showed good accuracy, precision and sensitivity for Norfloxacin determination in human plasma with a new internal standard, and can be applied to the pharmacokinetic evaluation of Norfloxacin tablets after oral administration in humans.

Automata Theory Approach for Solving Frequent Pattern Discovery Problems

The various types of frequent pattern discovery problems, namely the frequent itemset, sequence and graph mining problems, are solved in different ways which are, however, similar in certain respects. The main approaches to discovering such patterns can be classified into two classes: the level-wise methods and the database projection-based methods. The level-wise algorithms generally use clever indexing structures to discover the patterns. In this paper a new approach is proposed for efficiently discovering frequent sequences and tree-like patterns that is based on the level-wise scheme. Because level-wise algorithms spend a lot of time on the subpattern testing problem, the new approach introduces the idea of using automata theory to solve this problem.
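
A minimal sketch of the subpattern-testing step for sequences, assuming the simplest possible automaton: the state is the number of pattern items matched so far, and the pattern is accepted if the final state is reached while scanning a database sequence. The encoding and the support count are illustrative, not the paper's exact construction.

def matches(pattern, sequence):
    """Accepting run of the pattern automaton over one database sequence."""
    state = 0                      # states 0..len(pattern); state len(pattern) accepts
    for item in sequence:
        if state < len(pattern) and item == pattern[state]:
            state += 1             # take the transition labelled with pattern[state]
    return state == len(pattern)

def support(pattern, database):
    """Number of database sequences containing the pattern as a subsequence."""
    return sum(matches(pattern, seq) for seq in database)

database = [list("abcab"), list("acbd"), list("bbca")]
for candidate in [list("ab"), list("bc"), list("abd")]:
    print("".join(candidate), support(candidate, database))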

Assembly and Alignment of Ship Power Plants in Modern Shipbuilding

Fine alignment of the main ship power plant mechanisms and shaft lines provides long-term, failure-free performance of the propulsion system, while fast, high-quality installation of mechanisms and shaft lines reduces overall labor intensity. To check the allowable stresses in the shaft line and to set its alignment, calculations must be performed for the various stages of the life cycle. In 2012, JSC SSTC developed the special software complex “Shaftline” for shaft line alignment calculations, with its own I/O interface and a 3D model display of the shaft line. Alignment of the shaft line according to bearing loads is a rather labor-intensive procedure. In order to decrease its duration, JSC SSTC developed an automated alignment system for ship power plant mechanisms. The system's operating principle is based on automatic simulation of the design load on the bearings. Initial data for shaft line alignment can be exported to the automated alignment system from the “Shaftline” software complex.

An Efficient Algorithm for Computing all Program Forward Static Slices

Program slicing is the task of finding all statements in a program that directly or indirectly influence the value of a variable occurrence. The set of statements that can affect the value of a variable at some point in a program is called a backward slice. In several software engineering applications, such as program debugging and measuring program cohesion and parallelism, several slices are computed at different program points. Existing algorithms for computing program slices were designed to compute a slice at a single program point. In these algorithms, the program, or the model that represents the program, is traversed completely or partially once. To compute more than one slice, the same algorithm is applied for every point of interest in the program; thus, the same program, or program representation, is traversed several times. In this paper, an algorithm is introduced that computes all forward static slices of a computer program by traversing the program representation graph once. The introduced algorithm is therefore useful for software engineering applications that require computing program slices at different points of a program. The program representation used in this paper is the Program Dependence Graph (PDG).
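
A minimal sketch of the one-traversal idea, assuming an acyclic dependence graph (cyclic dependences would first be condensed into strongly connected components): each PDG edge u -> v means v depends on u, so the forward slice of u is the set of nodes reachable from u, and a memoized depth-first traversal expands every node exactly once. The adjacency-list encoding and the toy graph are illustrative, not the paper's algorithm.

def all_forward_slices(pdg):
    """pdg: dict node -> nodes that depend on it. Returns node -> forward slice."""
    slices = {}

    def visit(node):
        if node not in slices:                 # each node is expanded exactly once
            result = {node}
            for succ in pdg.get(node, []):
                result |= visit(succ)
            slices[node] = result
        return slices[node]

    for node in pdg:
        visit(node)
    return slices

# Toy PDG: 1: x=0; 2: y=x+1; 3: if y>0; 4: z=y*2 (dependent on 2 and 3).
pdg = {1: [2], 2: [3, 4], 3: [4], 4: []}
for node, s in sorted(all_forward_slices(pdg).items()):
    print(node, sorted(s))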

Optimization Using Simulation of the Vehicle Routing Problem

A key element of many distribution systems is the routing and scheduling of vehicles servicing a set of customers. A wide variety of exact and approximate algorithms have been proposed for solving the vehicle routing problem (VRP). Exact algorithms can only solve relatively small instances of the VRP, which is classified as NP-hard, while several approximate algorithms have proven successful in finding feasible, though not necessarily optimal, solutions. Although different parts of the problem are stochastic in nature, only limited work has applied discrete event system simulation to it. Presented here is optimization using simulation of the VRP: a simplified problem has been developed in the ExtendSim™ simulation environment, and the ExtendSim™ evolutionary optimizer is used to minimize the total transportation cost of the problem. Results obtained from the model are very satisfactory. Further complexities of the problem are proposed for consideration in future work.
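
A minimal sketch of the optimization-via-simulation idea for a single-vehicle case: the expected route cost is estimated by Monte Carlo simulation of stochastic travel times, and a simple swap-based search looks for a cheaper visiting order. The coordinates, noise model and search operator are illustrative assumptions, not the ExtendSim model or its evolutionary optimizer.

import math, random

DEPOT = (0.0, 0.0)
CUSTOMERS = [(2, 3), (5, 1), (6, 4), (1, 6), (4, 7)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def simulate_cost(order, replications=200, noise=0.2):
    """Mean simulated route length; each leg gets multiplicative travel-time noise."""
    total = 0.0
    for _ in range(replications):
        route = [DEPOT] + [CUSTOMERS[i] for i in order] + [DEPOT]
        total += sum(dist(a, b) * random.uniform(1 - noise, 1 + noise)
                     for a, b in zip(route, route[1:]))
    return total / replications

def optimize(iterations=500):
    best = list(range(len(CUSTOMERS)))
    random.shuffle(best)
    best_cost = simulate_cost(best)
    for _ in range(iterations):
        cand = best[:]
        i, j = random.sample(range(len(cand)), 2)    # swap two customer visits
        cand[i], cand[j] = cand[j], cand[i]
        cost = simulate_cost(cand)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best, best_cost

order, cost = optimize()
print("best visiting order:", order, "estimated cost:", round(cost, 2))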

An Automated Test Setup for the Characterization of Antennas in a CATR

This paper describes the development of fully automated measurement software for antenna radiation pattern measurements in a Compact Antenna Test Range (CATR). The CATR has a frequency range of 2-40 GHz, and the measurement hardware includes a network analyzer for transmitting and receiving the microwave signal and a positioner controller to control the motion of the Styrofoam column. The measurement process includes calibration of the CATR with a standard gain horn (SGH) antenna, followed by gain-versus-angle measurement of the antenna under test (AUT). The software is designed to control a variety of microwave transmitters/receivers and two-axis positioner controllers through the standard General Purpose Interface Bus (GPIB). New network analyzers can be added through a slight modification of the hardware control module. Time-domain gating is implemented to remove unwanted signals and obtain the isolated response of the AUT. The gated response of the AUT is compared with the calibration data in the frequency domain to obtain the desired results. Data acquisition and processing are implemented in Agilent VEE and MATLAB. A variety of experimental measurements with SGH antennas were performed to validate the accuracy of the software. A comparison with results from existing commercial software is presented, and the measured results are found to be within 0.2 dBm.
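
A minimal numeric sketch of the time-domain gating step: the swept frequency response is transformed to the time domain, a gate keeps only the direct-path region, and the gated response is transformed back. The synthetic data, the rectangular gate and the bin-aligned delays (chosen so the illustration stays clean) are assumptions; this is not the Agilent VEE / MATLAB implementation described in the abstract.

import numpy as np

freqs = np.linspace(2e9, 18e9, 801)       # swept measurement frequencies
df = freqs[1] - freqs[0]
bin_t = 1.0 / (len(freqs) * df)           # time resolution of the IFFT grid

# Delays placed exactly on the IFFT time grid for a leakage-free illustration.
direct_delay, reflect_delay = 80 * bin_t, 288 * bin_t

# Synthetic "measured" S21: direct path plus a weaker delayed reflection.
direct = np.exp(-2j * np.pi * freqs * direct_delay)
s21 = direct + 0.3 * np.exp(-2j * np.pi * freqs * reflect_delay)

# Frequency domain -> time domain, gate around the direct arrival, and back.
time_response = np.fft.ifft(s21)
t = np.fft.fftfreq(len(freqs), d=df)      # time axis of the IFFT bins
gate = (np.abs(t - direct_delay) <= 5e-9).astype(float)   # rectangular gate
gated = np.fft.fft(time_response * gate)

print("deviation from ideal direct path before gating:", np.abs(s21 - direct).max())
print("deviation from ideal direct path after gating: ", np.abs(gated - direct).max())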

A Hybrid Fuzzy AGC in a Competitive Electricity Environment

This paper presents a new Hybrid Fuzzy (HF) PID-type controller based on Genetic Algorithms (GAs) for the solution of the Automatic Generation Control (AGC) problem in a deregulated electricity environment. In order for a fuzzy rule-based control system to perform well, the fuzzy sets must be carefully designed. A major problem plaguing the effective use of this method is the difficulty of accurately constructing the membership functions, because doing so is a computationally expensive combinatorial optimization problem. GAs, on the other hand, are a technique that emulates biological evolution to solve complex optimization problems by using directed random searches to derive a set of optimal solutions. For this reason, the membership functions are tuned automatically using a modified GA based on the hill-climbing method. The motivation for using the modified GA is to reduce the fuzzy system design effort and to take large parametric uncertainties into account. With the proposed method the global optimum value is guaranteed, and the speed of the algorithm's convergence is greatly improved as well. This newly developed control strategy combines the advantages of GAs and fuzzy system control techniques and leads to a flexible controller with a simple structure that is easy to implement. The proposed GA-based HF (GAHF) controller is tested on a three-area deregulated power system under different operating conditions and contract variations. The results of the proposed GAHF controller are compared with those of a Multi-Stage Fuzzy (MSF) controller, a robust mixed H2/H∞ controller and classical PID controllers through several performance indices to illustrate its robust performance for a wide range of system parameters and load changes.
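
A minimal sketch of GA-based membership-function tuning in general, under stated assumptions: a one-input fuzzy system with three triangular sets is tuned so its output matches a target curve standing in for the desired controller surface. The target function, the encoding (three membership-function centres) and the GA operators are illustrative; they are not the paper's modified hill-climbing GA or its AGC fuzzy PID controller.

import random

TARGET = lambda x: x ** 2                 # stand-in for the desired controller surface
OUTPUT_LEVELS = [0.0, 0.5, 1.0]           # consequent singletons of the three rules
SAMPLES = [i / 20 for i in range(21)]

def tri(x, a, b, c):
    if b - a < 1e-9 or c - b < 1e-9 or x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_out(x, centres):
    c1, c2, c3 = sorted(centres)
    mf = [tri(x, c1 - 0.5, c1, c2), tri(x, c1, c2, c3), tri(x, c2, c3, c3 + 0.5)]
    s = sum(mf)
    if s == 0:
        return 0.0
    return sum(m * y for m, y in zip(mf, OUTPUT_LEVELS)) / s   # weighted-average defuzzification

def fitness(centres):
    return -sum((fuzzy_out(x, centres) - TARGET(x)) ** 2 for x in SAMPLES)

def ga(pop_size=30, generations=60):
    pop = [[random.random() for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]                # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)         # blend crossover + Gaussian mutation
            child = [(x + y) / 2 + random.gauss(0, 0.05) for x, y in zip(a, b)]
            children.append([min(1.0, max(0.0, g)) for g in child])
        pop = parents + children
    return max(pop, key=fitness)

best = ga()
print("tuned MF centres:", [round(c, 3) for c in sorted(best)], "error:", round(-fitness(best), 4))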

Digital Automatic Gain Control Integrated on WLAN Platform

In this work we present a solution for DAGC (Digital Automatic Gain Control) in WLAN receivers compliant with the IEEE 802.11a/g standards. These standards define communication in the 5/2.4 GHz bands using the Orthogonal Frequency Division Multiplexing (OFDM) modulation scheme. The WLAN transceiver that we have used enables gain control over a Low Noise Amplifier (LNA) and a Variable Gain Amplifier (VGA). The control of these signals is performed in our digital baseband processor using a dedicated DAGC hardware block. The DAGC is used to automatically control the VGA and LNA in order to achieve a better signal-to-noise ratio, decrease the FER (Frame Error Rate) and hold the average power of the baseband signal close to the desired set point. The DAGC function in the baseband processor is performed in a few steps: measuring the power levels of the baseband samples of an RF signal, accumulating the differences between the measured power level and the actual gain setting, adjusting a gain factor from the accumulation, and applying the adjusted gain factor to the baseband values. Based on measurements of the dependence of the RSSI signal on input power, we have concluded that this digital AGC can be implemented by applying a simple linearization of the RSSI. This solution is very simple yet effective, and it reduces the complexity and power consumption of the DAGC. The DAGC has been implemented and tested both in FPGA and in ASIC as part of our WLAN baseband processor. Finally, we have integrated this circuit into a compact WLAN PCMCIA board based on MAC and baseband ASIC chips designed by us.
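
A minimal sketch of the listed steps as a block-by-block loop: measure the power of a block of baseband samples, accumulate the error against the desired set point, derive the gain factor from the accumulator, and apply it to the following samples. The block size, loop gain and dB conversions are illustrative assumptions, not the parameters of the hardware block.

import math, random

SET_POINT_DB = 0.0      # desired average baseband power (arbitrary reference)
LOOP_GAIN = 0.3         # fraction of the accumulated error applied as gain
BLOCK = 64

def block_power_db(samples):
    mean_sq = sum(abs(s) ** 2 for s in samples) / len(samples)
    return 10 * math.log10(mean_sq + 1e-12)

def dagc(samples):
    acc, gain_db, out = 0.0, 0.0, []
    for start in range(0, len(samples), BLOCK):
        block = [s * 10 ** (gain_db / 20) for s in samples[start:start + BLOCK]]
        out.extend(block)
        acc += SET_POINT_DB - block_power_db(block)   # accumulate the power error
        gain_db = LOOP_GAIN * acc                     # gain factor from the accumulation
    return out, gain_db

# Weak noisy input around -20 dB; the loop should raise the gain toward +20 dB.
rx = [random.gauss(0, 0.1) for _ in range(4096)]
out, final_gain = dagc(rx)
print("final gain (dB):", round(final_gain, 1),
      "output power (dB):", round(block_power_db(out[-BLOCK:]), 1))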

Effect of Non-Uniformity Factors and Assignment Factors on Errors in the Charge Simulation Method with a Point Charge Model

The Charge Simulation Method (CSM) is one of the most widely used numerical field computation techniques in High Voltage (HV) engineering. High-voltage fields of varying degrees of non-uniformity are encountered in practice. Since CSM programs are case specific, simulation accuracy depends heavily on the experience of the user (programmer). This work is an effort to understand CSM errors and to evolve guidelines for setting up accurate CSM models, relating non-uniformity to assignment factors. The results are for the six-point-charge model of the sphere-plane gap geometry. Using a genetic algorithm (GA) as a tool, optimum assignment factors at different non-uniformity factors for this model have been evaluated and analyzed. It is shown that symmetrically placed six-point-charge models can be good enough to set up CSM programs with potential errors of less than 0.1% when the field non-uniformity factor is greater than 2.64 (field utilization factor less than 52.76%).
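
For readers unfamiliar with CSM, the standard point-charge formulation assumed in the abstract (textbook form, not the specific six-charge arrangement): n fictitious charges placed inside the electrode must reproduce the known potential at n contour points on its surface,

\[
  \sum_{j=1}^{n} P_{ij}\,Q_j = V_i, \qquad
  P_{ij} = \frac{1}{4\pi\varepsilon_0\, r_{ij}}, \qquad i = 1,\dots,n,
\]

where $r_{ij}$ is the distance from charge $j$ to contour point $i$ and $V_i$ is the electrode potential. After solving for the $Q_j$, the potential error is evaluated at separate check points; the assignment factor is usually defined as the ratio of the distance between a charge and its contour point to the spacing of adjacent contour points, and it is this factor that the GA optimizes here.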

Analysis of Physicochemical Properties on Prediction of R5, X4 and R5X4 HIV-1 Coreceptor Usage

Bioinformatics methods for predicting T-cell coreceptor usage from the membrane protein of HIV-1 are investigated. In this study, we aim to propose an effective prediction method for the three-class classification problem of CXCR4 (X4), CCR5 (R5) and CCR5/CXCR4 (R5X4) usage. We made the following efforts in investigating the coreceptor prediction problem: 1) proposing a feature set of informative physicochemical properties which, used with an SVM, achieves a high prediction test accuracy of 81.48%, compared with an existing method with an accuracy of 70.00%; 2) establishing a large, up-to-date data set, increasing its size from 159 to 1225 sequences, to verify the proposed prediction method, for which the mean test accuracy is 88.59%; and 3) analyzing the set of 14 informative physicochemical properties to further understand the characteristics of HIV-1 coreceptors.
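
A minimal sketch of the general pipeline: encode each sequence as averaged per-residue physicochemical property values and train an SVM on the three-class R5 / X4 / R5X4 problem. The two-property table, the toy sequences and their labels are illustrative placeholders, not the paper's 14 selected properties or its data set.

import numpy as np
from sklearn.svm import SVC

# Hypothetical property table: a hydrophobicity-like and a charge-like value per residue.
PROPS = {
    "A": (1.8, 0.0), "R": (-4.5, 1.0), "N": (-3.5, 0.0), "D": (-3.5, -1.0),
    "C": (2.5, 0.0), "Q": (-3.5, 0.0), "E": (-3.5, -1.0), "G": (-0.4, 0.0),
    "H": (-3.2, 0.1), "I": (4.5, 0.0), "L": (3.8, 0.0), "K": (-3.9, 1.0),
    "M": (1.9, 0.0), "F": (2.8, 0.0), "P": (-1.6, 0.0), "S": (-0.8, 0.0),
    "T": (-0.7, 0.0), "W": (-0.9, 0.0), "Y": (-1.3, 0.0), "V": (4.2, 0.0),
}

def encode(seq):
    vals = np.array([PROPS[a] for a in seq if a in PROPS])
    return vals.mean(axis=0)            # one averaged value per property

# Toy training data (sequence, coreceptor label); real data would be HIV-1 envelope regions.
train = [("CTRPNNNTRKSI", "R5"), ("CTRPNKKIRKRI", "X4"),
         ("CTRPGNNTRKGI", "R5"), ("CTRPNRKSIRKR", "R5X4"),
         ("CIRPNNNTRKSI", "R5"), ("CTRKRRQIRKRI", "X4")]

X = np.array([encode(s) for s, _ in train])
y = [label for _, label in train]

clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)
print(clf.predict([encode("CTRPNNKTRKRI")]))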

A Novel Method to Evaluate Line Loadability for Distribution Systems with Realistic Loads

This paper presents a simple method for estimating the additional load, as a factor of the existing load, that may be drawn before reaching the point of maximum line loadability of a radial distribution system (RDS) with different realistic load models at different substation voltages. The proposed method involves a simple line loadability index (LLI) that gives a measure of the proximity of the present state of a line in the distribution system to its loadability limit. The LLI can be used to assess voltage instability and the line loading margin. The proposed method is also compared with the existing maximum loadability index method [10]. The simulation results show that the LLI can identify not only the weakest line/branch causing system instability but also the system voltage collapse point, as the index approaches one. This feature enables us to set an index threshold to monitor and predict system stability on-line so that proper action can be taken to prevent the system from collapsing. To demonstrate the validity of the proposed algorithm, computer simulations are carried out on two-bus and 69-bus RDSs.
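
As background only (the abstract does not give the index, and the relation below is not necessarily the paper's LLI): line loadability limits in radial systems are usually derived from the standard two-bus relation, in which a line of impedance R + jX fed from a sending-end voltage V_s and supplying a load P + jQ has a receiving-end voltage magnitude V_r satisfying

\[
  V_r^{4} + \bigl[\,2(PR + QX) - V_s^{2}\,\bigr]V_r^{2}
    + (P^{2}+Q^{2})(R^{2}+X^{2}) = 0,
\]
\[
  \bigl[\,2(PR + QX) - V_s^{2}\,\bigr]^{2}
    - 4\,(P^{2}+Q^{2})(R^{2}+X^{2}) \;\ge\; 0 ,
\]

where real solutions exist only while the discriminant in the second line is non-negative; an index built from the ratio of its two terms tends to one at maximum loadability, which is consistent with the behaviour of the LLI described above.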

Quality of Non-Point Source Pollutant Identification using Digital Image and Remote Sensing Image

The integration of remote sensing technology, information from digital image data, and modeling technology for water quality simulation makes it easier to observe changes in water quality at the river surface. An example is the Ciliwung River, which is contaminated with non-point source pollutants from household waste, particularly along its downstream reach. This indicates that the water quality in this river is getting worse. Land use for settlements and housing ranges between 62.84% and 81.26% in the downstream area of the Ciliwung River, giving a significant picture of the factors that affect its water quality.

OCR for Script Identification of Hindi (Devnagari) Numerals using Feature Sub Selection by Means of End-Point with Neuro-Memetic Model

Recognition of Indian language scripts is a challenging problem. In Optical Character Recognition (OCR), the characters or symbols to be recognized can be machine-printed or handwritten characters/numerals. There are several approaches to the recognition of numerals/characters, depending on the type of features extracted and the different ways of extracting them. This paper proposes a recognition scheme for handwritten Hindi (Devnagari) numerals, one of the most widely used numeral scripts in the Indian subcontinent. Our work focuses on a feature extraction technique, namely a global approach using end-point information extracted from images of isolated numerals. These feature vectors are fed to a neuro-memetic model [18] that has been trained to recognize Hindi numerals. The prototype system has been tested on a variety of numeral images. In the proposed scheme, data sets are fed to the neuro-memetic algorithm, which identifies the rule with the highest fitness value (nearly 100%); the template associated with this rule is the identified numeral. Experimental results show a recognition rate of 92-97%, compared with other models.
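
A minimal sketch of end-point extraction from a thinned (skeletonized) numeral image: an end point is a skeleton pixel with exactly one 8-connected skeleton neighbour. The tiny hand-made skeleton and the zone-wise count feature are illustrative; the paper's exact feature vector and the neuro-memetic classifier are not reproduced here.

import numpy as np

# A 7x7 toy skeleton roughly resembling the digit "1" with a foot stroke.
skeleton = np.array([
    [0,0,0,1,0,0,0],
    [0,0,0,1,0,0,0],
    [0,0,0,1,0,0,0],
    [0,0,0,1,0,0,0],
    [0,0,0,1,0,0,0],
    [0,0,0,1,0,0,0],
    [0,1,1,1,1,1,0]], dtype=int)

def end_points(img):
    """Coordinates of skeleton pixels having exactly one 8-connected neighbour."""
    pts = []
    padded = np.pad(img, 1)                      # zero border so 3x3 windows always fit
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            if img[r, c]:
                neighbourhood = padded[r:r + 3, c:c + 3]
                if neighbourhood.sum() - 1 == 1:  # exactly one neighbour besides itself
                    pts.append((r, c))
    return pts

def zone_features(img, zones=2):
    """Count end points in each (zones x zones) block -> small feature vector."""
    h, w = img.shape
    counts = np.zeros((zones, zones), dtype=int)
    for r, c in end_points(img):
        counts[min(r * zones // h, zones - 1), min(c * zones // w, zones - 1)] += 1
    return counts.flatten()

print("end points:", end_points(skeleton))
print("zone feature vector:", zone_features(skeleton))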

Application of a Similarity Measure for Graphs to Web-based Document Structures

Due to the tremendous amount of information provided by the World Wide Web (WWW), developing methods for mining the structure of web-based documents is of considerable interest. In this paper we present a similarity measure for graphs representing web-based hypertext structures. Our similarity measure is mainly based on a novel representation of a graph as linear integer strings whose components represent structural properties of the graph. The similarity of two graphs is then defined as the optimal alignment of the underlying property strings. In this paper we apply the well-known technique of sequence alignment to solve a novel and challenging problem: measuring the structural similarity of generalized trees. In other words, we first transform our graphs, considered as high-dimensional objects, into linear structures. Then we derive similarity values from the alignments of the property strings in order to measure the structural similarity of generalized trees. Hence, we transform a graph similarity problem into a string similarity problem in order to develop an efficient graph similarity measure. We demonstrate that our similarity measure captures important structural information by applying it to two different test sets consisting of graphs representing web-based document structures.
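
A minimal sketch of the overall idea: represent each rooted (generalized) tree by a linear integer property string, here simply the number of nodes per level, and score the similarity of two trees by globally aligning those strings with a Needleman-Wunsch-style dynamic program. The property choice and scoring parameters are illustrative assumptions, not the paper's exact representation or alignment scheme.

def level_string(tree, root):
    """tree: dict node -> children. Returns [#nodes on level 0, level 1, ...]."""
    counts, level = [], [root]
    while level:
        counts.append(len(level))
        level = [child for node in level for child in tree.get(node, [])]
    return counts

def align_score(a, b, gap=-1.0):
    """Global alignment of two integer strings; matching a pair costs -|x - y|."""
    n, m = len(a), len(b)
    dp = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = dp[i - 1][0] + gap
    for j in range(1, m + 1):
        dp[0][j] = dp[0][j - 1] + gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            dp[i][j] = max(dp[i - 1][j - 1] - abs(a[i - 1] - b[j - 1]),
                           dp[i - 1][j] + gap,
                           dp[i][j - 1] + gap)
    return dp[n][m]

# Two toy hypertext structures rooted at page "home".
t1 = {"home": ["a", "b"], "a": ["c", "d"], "b": ["e"]}
t2 = {"home": ["a", "b", "c"], "a": ["d"], "b": []}
s1, s2 = level_string(t1, "home"), level_string(t2, "home")
print(s1, s2, "similarity score:", align_score(s1, s2))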