Sequence-based Prediction of Gamma-turn Types using a Physicochemical Property-based Decision Tree Method

γ-turns play important roles in protein folding and molecular recognition. Predicting and analyzing γ-turn types is therefore important both for protein structure prediction and for better understanding the characteristics of the different turn types. This study proposes a physicochemical property-based decision tree (PPDT) method to predict γ-turn types interpretably. In addition to PPDT's good prediction performance, three simple and human-interpretable IF-THEN rules are extracted from the decision tree it constructs. The identified informative physicochemical properties and concise rules provide a simple way to discriminate and understand γ-turn types.
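As an illustration of the kind of rule extraction described above, the sketch below trains a shallow decision tree on hypothetical physicochemical property features and prints its IF-THEN rules; the feature names and data are placeholders, not the paper's PPDT feature set or dataset.

```python
# Minimal sketch: train a shallow decision tree on hypothetical physicochemical
# property features and print its IF-THEN rules. The feature names and data
# below are placeholders, not the PPDT feature set used in the paper.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
X = rng.random((200, 3))                  # 200 sequence windows x 3 property averages
y = rng.integers(0, 2, size=200)          # 0 = classic gamma-turn, 1 = inverse (illustrative labels)

feature_names = ["hydrophobicity", "flexibility", "polarity"]
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# export_text renders the learned tree as human-readable IF-THEN style rules
print(export_text(tree, feature_names=feature_names))
```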

What Have Banks Done Wrong?

This paper aims to provide a conceptual framework for examining the competitive disadvantage of banks that suffer from poor performance. Banks generate revenue mainly from the interest rate spread on taking deposits and making loans, while collecting fees in the process. To maximize firm value, banks seek loan growth and expense control while managing the risks associated with loans, namely non-performing borrowers and a narrowing interest spread between assets and liabilities. Competitive disadvantage refers to the failure to access imitable resources and to build management capabilities to gain sustainable returns under appropriate risk management. A four-quadrant framework of organizational typology is subsequently proposed to examine the features of competitive disadvantage in the banking sector, together with a resource configuration model extracted from CAMEL indicators to examine the underlying features of bank failures.

Network Intrusion Detection Design Using Feature Selection of Soft Computing Paradigms

Network traffic data provided for the design of intrusion detection systems are typically large, contain much irrelevant information, and offer only limited and ambiguous information about users' activities. We study these problems and propose a two-phase approach for intrusion detection design. In the first phase, we develop a correlation-based feature selection algorithm to remove worthless information from the original high-dimensional database. In the second phase, we design an intrusion detection method to handle the uncertainty caused by limited and ambiguous information. In the experiments, we use six UCI databases and the DARPA KDD99 intrusion detection data set for evaluation. Empirical studies indicate that our feature selection algorithm is capable of reducing the size of the data set, and our intrusion detection method achieves better performance than the participating intrusion detectors.
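A minimal sketch of the correlation-based idea, not the authors' exact algorithm or thresholds: keep features that correlate with the class label while discarding those highly correlated with features already selected.

```python
# Simplified sketch of correlation-based feature selection: greedily keep
# features well correlated with the label but weakly correlated with features
# already selected. Thresholds and data are illustrative only.
import numpy as np

def select_features(X, y, label_thresh=0.1, redundancy_thresh=0.8):
    n_features = X.shape[1]
    # absolute Pearson correlation of each feature with the label
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_features)])
    selected = []
    for j in np.argsort(-relevance):              # strongest features first
        if relevance[j] < label_thresh:
            break
        redundant = any(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > redundancy_thresh
                        for k in selected)
        if not redundant:
            selected.append(int(j))
    return selected

X = np.random.rand(500, 20)
y = (X[:, 0] + 0.5 * X[:, 3] > 1.0).astype(float)
print(select_features(X, y))
```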

Numerical Analysis of Thermal Conductivity of Non-Charring Ablation Materials (Carbon-Carbon and Graphite) Considering Chemical Reaction Effects, Mass Transfer and Surface Heat Transfer

Little information is currently available on heat shield systems, and what exists is not fully reliable for many cases; for example, precise calculations cannot be carried out for various materials. In addition, full-scale testing has two disadvantages, high cost and low flexibility, and a new test must be performed for each case. Hence, a numerical modeling program that calculates the surface recession rate and interior temperature distribution is needed. A numerical solution of the governing equation for non-charring material ablation is presented in order to predict the recession rate and the thermal response of non-charring heat shields. The governing equation is nonlinear, and the Newton-Raphson method together with the TDMA algorithm is used to solve the resulting nonlinear equation system. Using the Newton-Raphson method is an advantage of the solution procedure because it is simple and can easily be generalized to more difficult problems. The obtained results are compared with reliable sources in order to examine the accuracy of the compiled code.
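For readers unfamiliar with the linear-system step, the following sketch shows the TDMA (Thomas) algorithm that solves the tridiagonal systems arising at each Newton-Raphson iteration; the coefficients are illustrative, not taken from the ablation model.

```python
# Minimal sketch of the TDMA (Thomas) algorithm used to solve the tridiagonal
# linear systems that arise at each Newton-Raphson iteration of a 1-D heat
# conduction/ablation discretization. Coefficients here are illustrative only.
import numpy as np

def tdma(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b,
    super-diagonal c and right-hand side d (all length n; a[0], c[-1] unused)."""
    n = len(d)
    cp, dp = np.zeros(n), np.zeros(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# small check against a dense solver
n = 5
a = np.full(n, -1.0); b = np.full(n, 2.0); c = np.full(n, -1.0); d = np.ones(n)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(tdma(a, b, c, d), np.linalg.solve(A, d)))   # True
```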

Improved Torque Control of Electrical Load Simulator with Parameter and State Estimation

An electrical load simulator (ELS) is an important ground-based hardware-in-the-loop simulator used for aerodynamic torque loading experiments on actuators under test. This work focuses on improving the transient response of the torque controller under parameter uncertainty of the ELS. The parameters of the load simulator are estimated online and the model is updated, eliminating the model error and improving the steady-state torque tracking response of the torque controller. To improve transient control performance, the gain of the robust term of the sliding mode controller (SMC) is updated online by a fuzzy logic system based on the amount of uncertainty in the load simulator parameters. The states of the load simulator that cannot be measured directly are estimated using a Luenberger observer updated with the newly estimated parameters. The stability of the control scheme is verified using Lyapunov theory, and its validity is demonstrated in simulations.
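A minimal sketch of a discrete-time Luenberger observer of the kind used to reconstruct unmeasured states; the second-order model, gains, and input below are illustrative placeholders, not the identified ELS parameters.

```python
# Toy discrete-time Luenberger observer: only the first state is measured,
# the second is reconstructed. Model matrices and observer gain are
# hypothetical, not the load simulator's identified parameters.
import numpy as np

A = np.array([[1.0, 0.01],
              [-0.5, 0.95]])        # hypothetical discrete state matrix
B = np.array([[0.0], [0.01]])
C = np.array([[1.0, 0.0]])          # only the first state (torque) is measured
L = np.array([[0.6], [5.0]])        # observer gain (would come from pole placement)

x = np.array([[0.2], [-1.0]])       # true state
x_hat = np.zeros((2, 1))            # observer state
for _ in range(100):
    u = np.array([[1.0]])
    y = C @ x                                        # measurement
    x_hat = A @ x_hat + B @ u + L @ (y - C @ x_hat)  # observer update
    x = A @ x + B @ u                                # plant update
print(np.round((x - x_hat).ravel(), 6))              # estimation error shrinks toward zero
```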

A New Approach to Polynomial Neural Networks based on Genetic Algorithm

Recently, much attention has been devoted to advanced techniques for system modeling. The polynomial neural network (PNN) is a GMDH-type (Group Method of Data Handling) algorithm and one of the useful methods for modeling nonlinear systems, but PNN performance depends strongly on the number of input variables and the polynomial order, which are usually determined by trial and error. In this paper, we introduce the genetic polynomial neural network (GPNN) to improve the performance of PNN. GPNN determines the number of input variables and the order of each neuron with a genetic algorithm (GA), which searches over all admissible values of the number of input variables and the polynomial order. GPNN performance is evaluated on two nonlinear systems: a quadratic equation and the Dow Jones stock index time series.
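A toy sketch of the GA idea, not the full GPNN: a chromosome encodes which inputs a polynomial neuron uses and the polynomial order, and fitness is the least-squares fit error of the resulting polynomial.

```python
# Toy GA over (input selection, polynomial order); illustrative target and
# GA settings, not the paper's GPNN architecture or data.
import itertools, random
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (200, 4))
y = 1 + 2 * X[:, 0] * X[:, 2] + X[:, 2] ** 2            # unknown target function

def fitness(chrom):
    """chrom = [use_x0, use_x1, use_x2, use_x3, polynomial_order]; lower is better."""
    inputs, order = [i for i in range(4) if chrom[i]], chrom[4]
    if not inputs:
        return 1e9
    terms = [np.ones(len(X))]                            # build polynomial terms
    for d in range(1, order + 1):
        for combo in itertools.combinations_with_replacement(inputs, d):
            terms.append(np.prod(X[:, list(combo)], axis=1))
    A = np.column_stack(terms)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.mean((A @ coef - y) ** 2))

def mutate(chrom):
    c = list(chrom)
    g = random.randrange(5)
    if g < 4:
        c[g] = 1 - c[g]                                  # flip an input-selection bit
    else:
        c[4] = random.randint(1, 3)                      # redraw the polynomial order
    return c

pop = [[random.randint(0, 1) for _ in range(4)] + [random.randint(1, 3)] for _ in range(20)]
for _ in range(30):                                      # crude GA loop
    pop.sort(key=fitness)
    parents = pop[:10]                                   # elitist selection
    children = []
    for _ in range(10):
        p, q = random.sample(parents, 2)
        cut = random.randint(1, 4)                       # one-point crossover
        child = p[:cut] + q[cut:]
        if random.random() < 0.3:
            child = mutate(child)
        children.append(child)
    pop = parents + children
best = min(pop, key=fitness)
print("inputs:", [i for i in range(4) if best[i]], "order:", best[4], "mse:", round(fitness(best), 6))
```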

Study of Chest Pain and its Risk Factors in Over 30 Year-Old Individuals

Chest pain is one of the most prevalent complaints among adults and a common reason for attending medical centers. The aim of this study was to determine the prevalence and risk factors of chest pain among people over 30 years old in Tehran. In this cross-sectional study, 787 adults took part from April 2005 until April 2006. Random cluster sampling was used, with 25 clusters; in each cluster, interviews were conducted with 32 people over 30 years old living in the selected households. In cases with chest pain, additional questions were asked. The prevalence of chest pain (CP) was 9% (71 cases). Of these, 21 cases (6.5%) were in the 41-60 year age range and the remainder were over 61 years old. Nineteen cases (26.8%) reported CP at rest, and all of the cases had exertion-onset CP. The CP duration was 10 minutes or less in all cases, and in most of them (84.5%) the location of pain was the left anterior part of the chest, the left anterior part of the sternum, and/or the left arm. There was a positive history of myocardial infarction in 12 cases (17%). There were significant relations between CP and age and sex, and between history of myocardial infarction and marital status. Our results are similar to those of other studies in most respects; however, supplementary tests and follow-up studies are necessary to differentiate exactly between cardiac and non-cardiac CP.

A Type of Urban Genesis in Romanian Outer-Carpathian Area: the Genoan Cities

The Mongol expansion in the West and the political and commercial interests arising from antagonisms between the Golden Horde and the Persian Ilkhanate transformed the Black Sea into an international trade hub beginning in the last third of the 13th century. As the Volga Khanate drew the maritime power of Genoa into the transcontinental project of diverting the Silk Road to its own benefit, the latter took full advantage of the new historical conjuncture, to the detriment of its rival, Venice. As a consequence, Genoa established important urban centers on the Pontic shores, with a mainly commercial role. In the Romanian outer-Carpathian area, Vicina, Cetatea Albă, and Chilia are notable, representing distinct, important types of cities within the broader context of the Romanian medieval urban genesis typology.

The Analysis of Two-Phase Jet in Pneumatic Powder Injection into Liquid Alloys

This paper presents an analysis of the two-phase gas-solid jet in the pneumatic powder injection process. The research was conducted on a model set-up in which the jet movement was recorded with a high-speed camera. The recorded material was then analyzed to estimate the main parameters of particle movement. The values obtained from this direct measurement were compared with those calculated using well-known formulas for two-phase flows (pneumatic conveying), as well as with experimental results previously obtained by the authors. The analysis led to conclusions that to some extent changed the assumptions previously used, even by the authors, regarding the two-phase jet in the pneumatic powder injection process. Additionally, visual analysis of the recorded clips provided data for a more complete evaluation of the jet behavior at the lance outlet than was previously possible.

EPR Hiding in Medical Images for Telemedicine

Medical image data hiding has strict constraints such as high imperceptibility, high capacity, and high robustness, and achieving these three requirements simultaneously is difficult. Several works on data hiding, watermarking, and steganography suitable for telemedicine applications have been reported in the literature, but none is reliable in all respects. Electronic Patient Report (EPR) data hiding for telemedicine demands that the scheme be blind and reversible. This paper proposes a novel approach to blind reversible data hiding based on the integer wavelet transform. Experimental results show that this scheme outperforms the prior art, achieving zero BER (Bit Error Rate), higher PSNR (Peak Signal to Noise Ratio), and large EPR embedding capacity with a WPSNR (Weighted Peak Signal to Noise Ratio) of around 53 dB, compared with existing reversible data hiding schemes.
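The reversibility that such schemes rely on can be illustrated with a one-dimensional integer (lifting-based) Haar wavelet transform and its exact inverse; the sketch below shows only this property, not the paper's EPR embedding and extraction steps.

```python
# Minimal sketch of the lossless property a reversible scheme relies on: an
# integer (lifting-based) Haar wavelet transform and its exact inverse.
import numpy as np

def int_haar_forward(x):
    x = x.astype(np.int64)
    s, d = x[0::2], x[1::2]
    detail = d - s                       # integer detail coefficients
    approx = s + (detail >> 1)           # integer approximation (floored average)
    return approx, detail

def int_haar_inverse(approx, detail):
    s = approx - (detail >> 1)
    d = detail + s
    out = np.empty(2 * len(s), dtype=np.int64)
    out[0::2], out[1::2] = s, d
    return out

row = np.array([120, 121, 118, 130, 90, 95, 200, 198])   # one image row (illustrative)
a, d = int_haar_forward(row)
print(np.array_equal(int_haar_inverse(a, d), row))        # True: perfectly reversible
```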

Analysis of DNA-Recognizing Enzyme Interaction using Deaminated Lesions

Deaminated lesions are produced via nitrosative oxidation of natural nucleobases: uracil (Ura, U) from cytosine (Cyt, C), hypoxanthine (Hyp, H) from adenine (Ade, A), and xanthine (Xan, X) and oxanine (Oxa, O) from guanine (Gua, G). Such damaged nucleobases may cause mutagenic problems, so much attention and effort have been devoted to revealing their mechanisms in vivo and in vitro. In this study, we employed these deaminated lesions as probes for analyzing DNA-binding and DNA-recognizing proteins and enzymes. Since the purine lesions Hyp, Oxa, and Xan can be employed as analogues of guanine, their comparative use is informative for analyzing the role of Gua in a DNA sequence in DNA-protein interactions. Several DNA oligomers containing Hyp, Oxa, or Xan substituted for Gua were designed to reveal the molecular interaction between DNA and protein. From this approach, we obtained useful information for understanding the molecular mechanisms of DNA-recognizing enzymes that had not been observed using conventional DNA oligomers composed only of natural nucleobases.

Application of the Data Distribution Service for Flexible Manufacturing Automation

This paper discusses the applicability of the Data Distribution Service (DDS) to the development of automated and modular manufacturing systems, which require a flexible and robust communication infrastructure. DDS is an emerging standard for data-centric publish/subscribe middleware that provides an infrastructure for platform-independent, many-to-many communication. It particularly addresses the needs of real-time systems that require deterministic data transfer, have low memory footprints, and have high robustness requirements. After an overview of the standard, several aspects of DDS are related to current challenges in the development of modern manufacturing systems with distributed architectures. Finally, an example application based on a modular active fixturing system is presented to illustrate the described aspects.
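For readers unfamiliar with the pattern, the sketch below shows topic-based, many-to-many publish/subscribe at the language level; it is not the DDS API itself, and a real system would use a DDS implementation (e.g., OpenDDS or RTI Connext) with its QoS policies.

```python
# Language-level sketch of topic-based, many-to-many publish/subscribe, the
# communication pattern DDS standardizes. Topic and sample names are made up.
from collections import defaultdict
from dataclasses import dataclass
from typing import Callable

@dataclass
class FixtureState:          # a data sample published on a topic (hypothetical type)
    module_id: str
    clamp_force_n: float

class Bus:
    def __init__(self):
        self._subs: dict[str, list[Callable]] = defaultdict(list)
    def subscribe(self, topic: str, handler: Callable) -> None:
        self._subs[topic].append(handler)
    def publish(self, topic: str, sample) -> None:
        for handler in self._subs[topic]:    # delivered to every subscriber of the topic
            handler(sample)

bus = Bus()
bus.subscribe("FixtureState", lambda s: print("controller saw", s))
bus.subscribe("FixtureState", lambda s: print("monitor saw", s))
bus.publish("FixtureState", FixtureState("clamp-3", 412.5))
```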

Comanche – A Compiler-Driven I/O Management System

Most scientific programs have large input and output data sets that require out-of-core programming or the use of virtual memory management (VMM). Out-of-core programming is very error-prone and tedious; as a result, it is generally avoided. However, in many instances, VMM is not an effective approach because it often results in a substantial performance reduction. In contrast, compiler-driven I/O management allows a program's data sets to be retrieved in parts, called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a compiler combined with a user-level runtime system that can be used to replace standard VMM for out-of-core programs. We describe Comanche and demonstrate on a number of representative problems that it substantially outperforms VMM. Significantly, our system does not require any special services from the operating system and does not require modification of the operating system kernel.
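A minimal sketch of block/tile-wise access to an out-of-core array, the kind of retrieval a compiler-managed cache automates; the file name and tile size are illustrative, and numpy.memmap merely stands in for explicit file I/O.

```python
# Sketch of tiled access to an out-of-core matrix: only one tile is brought
# into core at a time. File name, sizes, and the memmap stand-in are illustrative.
import numpy as np

N, TILE = 4096, 512
data = np.memmap("matrix.dat", dtype=np.float64, mode="w+", shape=(N, N))

total = 0.0
for i in range(0, N, TILE):            # walk the array one tile at a time
    for j in range(0, N, TILE):
        tile = np.asarray(data[i:i + TILE, j:j + TILE])   # bring the tile in core
        total += tile.sum()                               # operate only on the tile
print(total)
```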

Entropy Based Spatial Design: A Genetic Algorithm Approach (Case Study)

We study spatial design of experiments, where the goal is to select the most informative subset of prespecified size from a set of correlated random variables. The problem arises in many applied domains, such as meteorology, environmental statistics, and statistical geology, where observations can be collected at different locations and possibly at different times. In spatial design, when the design region and the set of interest are discrete, the covariance matrix completely determines the objective function, and the goal is to choose a feasible design that minimizes the resulting uncertainty. The problem is recast as that of maximizing the determinant of the covariance matrix of the chosen subset, which is NP-hard. When such designs are used in computer experiments, the design space is often very large and the exact optimal solution cannot be computed. Heuristic optimization methods can discover efficient experimental designs in situations where traditional designs cannot be applied, exchange methods are ineffective, and an exact solution is not possible. We developed a genetic algorithm (GA) to take advantage of its exploratory power, and we demonstrate its successful application in a large design space on a real design-of-experiments case.
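A toy sketch of the optimization described above, selecting a fixed-size subset of locations that maximizes the log-determinant of its covariance submatrix with a crude genetic algorithm; the covariance model, sizes, and GA settings are illustrative placeholders, not those of the case study.

```python
# Toy GA maximizing the log-determinant of the covariance submatrix of a
# chosen subset of locations. Covariance model and GA parameters are made up.
import random
import numpy as np

rng = np.random.default_rng(0)
pts = rng.random((60, 2))                              # candidate locations
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
SIGMA = np.exp(-5.0 * D)                               # toy spatial covariance
K = 8                                                  # required subset size

def fitness(subset):
    idx = sorted(subset)
    sign, logdet = np.linalg.slogdet(SIGMA[np.ix_(idx, idx)])
    return logdet if sign > 0 else -np.inf

def mutate(subset):
    s = set(subset)
    s.remove(random.choice(list(s)))                   # swap one site for another
    s.add(random.choice([i for i in range(60) if i not in s]))
    return frozenset(s)

pop = [frozenset(random.sample(range(60), K)) for _ in range(30)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:15] + [mutate(random.choice(pop[:15])) for _ in range(15)]
best = max(pop, key=fitness)
print(sorted(best), round(fitness(best), 3))
```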

Influence of Atmospheric Physical Effects on Static Behavior of Building Plate Components Made of Fiber-Cement-Based Materials

The paper presents selected results of an experimental study focused on the behavior of structural plate components made of fiber-cement-based materials used in building construction and exposed to atmospheric physical effects caused by weather changes in the summer period. Weather changes, represented mainly by temperature and rain, also change the temperature and moisture of the investigated structural components. This can affect their static behavior, i.e. the stresses and deformations, which were monitored as the main outputs of the tests performed. The experimental verification is based on simulating the influence of temperature and rain using a defined procedure of warming and water sprinkling corresponding to summer weather conditions in the South Moravian region of the Czech Republic, where the application of these structural components is mainly planned. Two types of components were tested: (i) glass-fiber-concrete panels used for building façades and (ii) fiber-cement slabs used mainly for cladding, but also as part of floor structures, lost shuttering, and so on.

Evaluation on Recent Committed Crypt Analysis Hash Function

This paper describes a study of cryptographic hash functions, one of the most important classes of primitives in modern cryptography. We present different approaches to defining security properties more formally, describe basic attacks on hash functions, and recall the Merkle-Damgård security properties of iterated hash functions. The main aim of this paper is the development of recent cryptanalysis techniques applicable to hash functions, mainly from the SHA family. Recently proposed attacks on MD5 and SHA motivate a new hash function design, intended not only to offer higher security but also to be faster than SHA-256. The performance of the new hash function is at least 30% better than that of SHA-256 in software, and it is secure against the known cryptographic attacks on hash functions.
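A minimal sketch of the Merkle-Damgård iteration shared by the MD5/SHA family: the message is padded, split into fixed-size blocks, and folded through a compression function. The toy compression function below is not secure; it only shows the structure.

```python
# Sketch of the Merkle-Damgard construction: pad, split into 512-bit blocks,
# then chain a compression function over the blocks. The compression function
# here is a toy and offers NO security; it only illustrates the iteration.
import struct

BLOCK = 64                                   # 512-bit blocks, as in MD5/SHA-1

def pad(msg: bytes) -> bytes:
    length = struct.pack(">Q", 8 * len(msg))   # message length in bits, appended at the end
    msg += b"\x80"
    msg += b"\x00" * ((-len(msg) - 8) % BLOCK)
    return msg + length

def compress(state: int, block: bytes) -> int:     # toy compression function
    for i in range(0, BLOCK, 8):
        word = int.from_bytes(block[i:i + 8], "big")
        state = ((state * 0x100000001B3) ^ word) & (2**64 - 1)
    return state

def md_hash(msg: bytes, iv: int = 0xCBF29CE484222325) -> int:
    data, state = pad(msg), iv
    for i in range(0, len(data), BLOCK):           # fold the compression function over blocks
        state = compress(state, data[i:i + BLOCK])
    return state

print(hex(md_hash(b"abc")))
```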

An Enhanced Artificial Neural Network for Air Temperature Prediction

The mitigation of crop loss due to damaging freezes requires accurate air temperature prediction models. An improved model for temperature prediction in Georgia was developed by including information on seasonality and modifying the parameters of an existing artificial neural network model. Alternative models were compared by instantiating and training multiple networks for each model. The inclusion of up to 24 hours of prior weather information and of inputs reflecting the day of year was among the improvements that reduced the average four-hour prediction error by 0.18°C compared to the prior model. The results strongly suggest that model developers should instantiate and train multiple networks with different initial weights to establish appropriate model parameters.
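A sketch of the "instantiate and train multiple networks" recommendation using scikit-learn's MLPRegressor on synthetic data; the features (24 hours of prior readings plus day-of-year terms) and network size are placeholders, not the Georgia weather data or the paper's topology.

```python
# Several networks trained from different initial weights on synthetic data
# with 24 h of prior readings and day-of-year seasonality inputs (all made up).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 2000
prior_temps = rng.normal(15, 8, (n, 24))            # last 24 hourly readings
doy = rng.integers(1, 366, n)
season = np.c_[np.sin(2 * np.pi * doy / 365), np.cos(2 * np.pi * doy / 365)]
X = np.hstack([prior_temps, season])
y = prior_temps[:, -1] - 1.5 + 3 * season[:, 1] + rng.normal(0, 0.5, n)

errors = []
for seed in range(5):                               # different random initial weights
    net = MLPRegressor(hidden_layer_sizes=(40,), max_iter=500, random_state=seed)
    net.fit(X[:1500], y[:1500])
    errors.append(np.mean(np.abs(net.predict(X[1500:]) - y[1500:])))
print([round(e, 3) for e in errors])                # spread across initializations
```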

Layered Multiple Description Coding For Robust Video Transmission Over Wireless Ad-Hoc Networks

This paper presents a video transmission system using layered multiple description coding (MDC) and multi-path transport for reliable video communications in wireless ad-hoc networks. The proposed MDC extends a quality-scalable H.264/AVC video coding algorithm to generate two independent descriptions. The two descriptions are transmitted over different paths to the receiver in order to alleviate the effect of the unstable channel conditions of wireless ad-hoc networks. If one description is lost due to transmission errors, the correctly received description is used to estimate the lost information of the corrupted description. The proposed MD coder maintains adequate video quality as long as both descriptions are not lost simultaneously. Simulation results show that the proposed MD coding combined with multi-path transport is largely immune to packet losses and can therefore be a promising solution for robust video communications over wireless ad-hoc networks.
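A generic illustration of the multiple-description idea, not the paper's H.264/AVC-based coder: frames are split into two descriptions by temporal subsampling, and if one description is lost the missing frames are estimated from the surviving one by interpolation.

```python
# Generic MDC illustration: two descriptions by even/odd temporal split,
# with simple interpolation-based concealment when one description is lost.
import numpy as np

frames = np.arange(10, dtype=float)                 # stand-in for a frame sequence
desc_even, desc_odd = frames[0::2], frames[1::2]    # two independent descriptions

# description 2 lost in transit: conceal odd frames from neighboring even frames
recovered = np.empty_like(frames)
recovered[0::2] = desc_even
recovered[1::2] = [(desc_even[i] + desc_even[min(i + 1, len(desc_even) - 1)]) / 2
                   for i in range(len(desc_odd))]
print(recovered)      # degraded but usable reconstruction from a single description
```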

Managing the Information System Life Cycle in Construction and Manufacturing

In this paper we present the information life cycle and analyze the importance of managing the corporate application portfolio across this life cycle. The approach presented here is not just an extension of the traditional information system development life cycle; it is based on the generic life cycle. We propose a model of the information system life cycle built on the assumption that a system has a limited life, which may nevertheless be extended. The model has been applied in several cases; two examples of its application are reported here, one in a construction enterprise and one in a manufacturing enterprise.

Performance Modeling for Web based J2EE and .NET Applications

When architecting an application, key nonfunctional requirements such as performance, scalability, availability, and security, which influence the architecture of the system, are sometimes not adequately addressed. Performance of the application may not be examined until it becomes a concern. This reactive approach has several problems: if the system does not meet its performance objectives, the application is unlikely to be accepted by the stakeholders. This paper suggests an approach to performance modeling for web-based J2EE and .NET applications that addresses performance issues early in the development life cycle. It also includes a performance modeling case study, with a proof-of-concept (PoC) and implementation details for the .NET and J2EE platforms.
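As an example of the kind of analytic estimate used in early performance modeling, the sketch below applies basic operational laws and an open-queueing approximation to hypothetical per-tier service demands; the numbers are not measurements of any J2EE or .NET system.

```python
# Back-of-the-envelope performance model: Utilization Law per tier plus an
# M/M/1 residence-time approximation. Demands and arrival rate are hypothetical.
service_demand = {"web": 0.004, "app": 0.012, "db": 0.020}   # seconds per request
arrival_rate = 40.0                                           # requests per second

response_time = 0.0
for tier, demand in service_demand.items():
    utilization = arrival_rate * demand                       # Utilization Law: U = X * D
    if utilization >= 1.0:
        raise ValueError(f"{tier} tier saturated (U = {utilization:.2f})")
    residence = demand / (1.0 - utilization)                  # M/M/1 approximation
    print(f"{tier}: U = {utilization:.2f}, R = {residence * 1000:.1f} ms")
    response_time += residence
print(f"end-to-end response time approx {response_time * 1000:.1f} ms")
```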