Enhanced Character Based Algorithm for Small Parsimony

A phylogenetic tree is a graphical representation of the evolutionary relationships among three or more genes or organisms. Such trees show the relatedness of data sets, the divergence times of species or genes, and the nature of their common ancestors. The quality of a phylogenetic tree is commonly judged by the parsimony criterion, and various approaches have been proposed for constructing most parsimonious trees. This paper is concerned with calculating and minimizing the number of state changes required, the task addressed by small parsimony algorithms. We propose an enhanced small parsimony algorithm that yields a better score, based on the number of evolutionary changes needed to produce the observed sequence changes in the tree, and that also reconstructs the ancestors of the given input.
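
The abstract does not spell out the enhanced algorithm itself; as a point of reference for the small parsimony problem it targets, the sketch below implements the classical Fitch procedure on a rooted binary tree. The toy tree, node names and character states are illustrative assumptions, not data from the paper.

```python
# Classical Fitch small parsimony on a rooted binary tree (illustrative sketch).
# The tree, leaf states, and node labels are made-up examples.

def fitch(tree, states, node):
    """Return (candidate ancestral state set, parsimony score) for `node`."""
    if node not in tree:                      # leaf: its state set is the observed state
        return {states[node]}, 0
    left, right = tree[node]
    lset, lcost = fitch(tree, states, left)
    rset, rcost = fitch(tree, states, right)
    common = lset & rset
    if common:                                # intersection: no extra change needed
        return common, lcost + rcost
    return lset | rset, lcost + rcost + 1     # union: one additional state change

if __name__ == "__main__":
    # Internal nodes map to their two children; leaves carry observed characters.
    tree = {"root": ("n1", "n2"), "n1": ("A_leaf", "B_leaf"), "n2": ("C_leaf", "D_leaf")}
    states = {"A_leaf": "A", "B_leaf": "C", "C_leaf": "C", "D_leaf": "G"}
    ancestral, score = fitch(tree, states, "root")
    print("candidate root states:", ancestral, "minimum changes:", score)
```

Running the example reports the candidate ancestral states at the root together with the minimum number of state changes for the toy character.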

Comparison of Current Chinese and Japanese Design Specifications for Bridge Piles in Liquefied Ground

This study first briefly presents the current situation: there is a vast gap between the current Chinese and Japanese seismic design specifications for bridge pile foundations in liquefiable and liquefaction-induced lateral spreading ground. The Chinese and Japanese seismic design methods and technical details for bridge pile foundations in liquefying and laterally spreading ground are then described and compared systematically and comprehensively; in particular, the methods for determining the coefficient of subgrade reaction and its reduction factor, as well as the computing mode of the force applied to pile foundations by liquefaction-induced laterally spreading soil in the Japanese design specification, are introduced. The comparison indicates that the content of the Chinese seismic design specification for bridge pile foundations in liquefiable and laterally spreading ground, which presents only some qualitative items, is too general and lacks systematic structure and operability. Finally, some defects of the Chinese seismic design specification are summarized, showing that improvement and revision of the specification in this field are imperative for China; some key problems of the current Chinese specifications are generalized and corresponding improvement suggestions are proposed.

PID Controller Design for Following Control of Hard Disk Drive by Characteristic Ratio Assignment Method

The authors present a PID controller design for the following control of a hard disk drive by the characteristic ratio assignment method. The study concerns the design of a PID controller that is sufficiently robust to disturbances and plant perturbations in the following control of a hard disk drive. Characteristic Ratio Assignment (CRA) is shown to be an efficient control technique for this requirement. Controller design by CRA is based on choosing the coefficients of the characteristic polynomial of the closed-loop system according to convenient performance criteria such as the equivalent time constant and the ratios of the characteristic coefficients. Hence, in this study, the CRA method is applied to PID controller design for the following control of a hard disk drive. Matlab simulation results show that the CRA design is fairly stable and robust while giving convenient adjustment of the controller's parameters.
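
The CRA step referred to above, choosing the closed-loop characteristic polynomial coefficients from an equivalent time constant and characteristic ratios, can be sketched as follows. The recursion uses the usual CRA definitions (alpha_i = a_i^2 / (a_{i-1} a_{i+1}), tau = a_1 / a_0); the ratio values and time constant below are placeholders, not the hard disk drive design from the paper.

```python
# Characteristic Ratio Assignment (CRA) sketch: build target closed-loop
# characteristic polynomial coefficients from chosen characteristic ratios
# alpha_i and an equivalent time constant tau.  Values below are placeholders.

def cra_coefficients(alphas, tau, a0=1.0):
    """Return [a0, a1, ..., an] with alpha_i = a_i**2 / (a_{i-1} * a_{i+1})."""
    coeffs = [a0, tau * a0]                   # a1 = tau * a0 by definition of tau
    for alpha in alphas:
        a_prev, a_cur = coeffs[-2], coeffs[-1]
        coeffs.append(a_cur ** 2 / (alpha * a_prev))
    return coeffs

if __name__ == "__main__":
    alphas = [2.5, 2.0, 2.0]                  # example ratios (alpha_1 >= 2 is a common guideline)
    tau = 1e-3                                # equivalent time constant [s], assumed
    print(cra_coefficients(alphas, tau))
```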

Analyzing the Factors Affecting Passenger Car Breakdowns using Com-Poisson GLM

The number of breakdowns experienced by a machine is a highly under-dispersed count random variable, and its value can be attributed to factors related to the mechanical input and output of that machine. Analyzing such under-dispersed count observations as a function of explanatory factors has been a challenging problem. In this paper, we aim to estimate the effects of various factors on the number of breakdowns experienced by a passenger car, based on a study performed in Mauritius over one year. We observe that the number of passenger car breakdowns is highly under-dispersed. These data are therefore modelled and analyzed using the Com-Poisson regression model, and a quasi-likelihood estimation approach is used to estimate the model parameters. The under-dispersion parameter is estimated to be 2.14, justifying the appropriateness of the Com-Poisson distribution for modelling the under-dispersed count responses recorded in this study.
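
For readers unfamiliar with the count model named above, the sketch below evaluates the Com-Poisson probability mass function by truncating its normalizing series. The rate and dispersion values are illustrative only (the paper's quasi-likelihood fit is not reproduced); a dispersion parameter nu > 1 corresponds to under-dispersion.

```python
# Conway-Maxwell-Poisson (COM-Poisson) probability mass function.
# lam and nu below are illustrative values; the paper reports nu approx. 2.14.

from math import exp, lgamma, log

def com_poisson_pmf(y, lam, nu, truncation=200):
    """P(Y = y) for the COM-Poisson(lam, nu) distribution (series truncated)."""
    z = sum(exp(j * log(lam) - nu * lgamma(j + 1)) for j in range(truncation))
    return exp(y * log(lam) - nu * lgamma(y + 1)) / z   # lam**y / (y!)**nu / Z(lam, nu)

if __name__ == "__main__":
    lam, nu = 2.0, 2.14
    mean = sum(y * com_poisson_pmf(y, lam, nu) for y in range(50))
    var = sum((y - mean) ** 2 * com_poisson_pmf(y, lam, nu) for y in range(50))
    print(f"mean={mean:.3f}  variance={var:.3f}  (variance < mean: under-dispersed)")
```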

In vivo Histomorphometric and Corrosion Analysis of Ti-Ni-Cr Shape Memory Alloys in Rabbits

A series of Ti-based shape memory alloys with compositions of Ti50Ni49Cr1, Ti50Ni47Cr3 and Ti50Ni45Cr5 were developed by vacuum arc-melting under a purified argon atmosphere. The histomorphometric and corrosion evaluation of these Ti-Ni-Cr shape memory alloys was considered in this work. The alloys were implanted subcutaneously in rabbits for 4, 8 and 12 weeks; the metallic strips were embedded in order to determine the outcome of implantation on the histomorphometric and corrosion behaviour of Ti-Ni-Cr. Encapsulating membrane formation around the alloys was minimal for all materials. Histomorphometric analysis showed no statistically significant differences between the materials. The corrosion rate was also determined in this study and lies within the acceptable range. The results showed that the Ti-Ni-Cr alloys were neither cytotoxic nor produced any systemic reaction in the living system in any of the tests performed. Implantation shows good compatibility and the potential for direct use in an in vivo system.

Correlation of Microstructure and Corrosion Behavior of Martensitic Stainless Steel Surgical Grade AISI 420A Exposed to 980–1035 °C

Martensitic stainless steels are used extensively for their good corrosion resistance and mechanical properties. Heat treatment is one of the most effective ways to tailor these properties, since it affects the microstructure, mechanical behaviour and corrosion behaviour of the steel. In the present work, the microstructural changes and corrosion behaviour of an AISI 420A stainless steel exposed to temperatures in the 980–1035 °C range were investigated. The heat treatment was carried out in a vacuum furnace within this temperature range, and the samples were quenched in oil, brine and water. The formation and stability of the passive film were studied by open circuit potential, potentiodynamic polarization and electrochemical scratch tests. The electrochemical impedance spectroscopy results, fitted with an equivalent electrical circuit, suggested a bilayer structure of an outer porous oxide film and an inner barrier oxide film. The quantitative data showed that a thick inner barrier oxide film retarded the electrochemical reactions. Micrographs of the quenched samples showed sigma and chromium carbide phases, which account for the corrosion resistance of the steel alloy.
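
The bilayer interpretation mentioned above is commonly modelled with a two-time-constant equivalent circuit; the sketch below computes the impedance of one such circuit (a series resistance plus porous-film and barrier-film R||C elements). The element values are arbitrary placeholders, not the fitted parameters from this work.

```python
# Impedance of a simple two-time-constant equivalent circuit often used for
# bilayer (porous outer / barrier inner) oxide films:
#   Z(w) = Rs + Rp/(1 + j*w*Rp*Cp) + Rb/(1 + j*w*Rb*Cb)
# Element values below are arbitrary placeholders, not fitted data from the paper.

import numpy as np

def bilayer_impedance(freq_hz, Rs, Rp, Cp, Rb, Cb):
    w = 2 * np.pi * np.asarray(freq_hz)
    z_porous = Rp / (1 + 1j * w * Rp * Cp)    # outer porous film (R parallel C)
    z_barrier = Rb / (1 + 1j * w * Rb * Cb)   # inner barrier film (R parallel C)
    return Rs + z_porous + z_barrier

if __name__ == "__main__":
    f = np.logspace(-2, 5, 8)                 # 10 mHz to 100 kHz
    Z = bilayer_impedance(f, Rs=20, Rp=5e3, Cp=2e-5, Rb=5e5, Cb=1e-5)
    for fi, zi in zip(f, Z):
        print(f"{fi:10.3g} Hz  |Z| = {abs(zi):12.1f} ohm  phase = {np.degrees(np.angle(zi)):6.1f} deg")
```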

Numerical and Experimental Investigations on Jet Impingement Cooling

Effective cooling of electronic equipment has emerged as a challenging and constraining problem of the new century. In the present work, the feasibility and effectiveness of jet impingement cooling of electronics were investigated numerically and experimentally. Studies were conducted on the effect of geometrical parameters such as the jet diameter (D), the jet-to-target spacing (Z) and the ratio of jet spacing to jet diameter (Z/D) on the heat transfer characteristics. The Reynolds numbers considered are in the range of 7000 to 42000. The results obtained from the numerical studies were validated by experiments. The studies show that the optimum value of the Z/D ratio is 5. For a given Reynolds number, the Nusselt number increases by about 28% when the nozzle diameter is increased from 1 mm to 2 mm. Correlations for the Nusselt number in terms of the Reynolds number are proposed, valid for air as the cooling medium.
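
The abstract proposes Nusselt number correlations in terms of the Reynolds number but does not report the fitted constants; the sketch below shows how a generic power-law correlation of the form Nu = C Re^m Pr^n would be evaluated and converted into a heat transfer coefficient. The constants C, m and n are assumed placeholder values, not the paper's correlation.

```python
# Generic power-law form commonly used for jet impingement heat transfer
# correlations, Nu = C * Re**m * Pr**n.  C, m, n are placeholders for illustration.

def nusselt(re, pr=0.71, C=0.1, m=0.7, n=1 / 3):
    """Average Nusselt number from a power-law correlation (air: Pr ~ 0.71)."""
    return C * re ** m * pr ** n

def heat_transfer_coefficient(nu, k_air=0.026, d_jet=2e-3):
    """h = Nu * k / D for jet diameter D [m] and air conductivity k [W/m.K]."""
    return nu * k_air / d_jet

if __name__ == "__main__":
    for re in (7000, 21000, 42000):           # Reynolds number range from the study
        nu = nusselt(re)
        print(f"Re={re:6d}  Nu={nu:6.1f}  h={heat_transfer_coefficient(nu):7.1f} W/m^2.K")
```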

The Study on Service-oriented Encapsulating Methods of Legacy Systems

At present, Web Services are the first choice for reusing legacy systems when implementing SOA. According to the status of SOA implementations and of the legacy systems themselves, we propose four encapsulation strategies. Based on these strategies, we propose a service-oriented encapsulation framework in which a legacy system can be encapsulated by the service-oriented encapsulation layer in three aspects: communication protocols, data and programs. The reuse rate of legacy systems can be increased by using this framework.
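
As a minimal illustration of the program-encapsulation aspect, the sketch below wraps a stand-in legacy routine behind an HTTP endpoint so that SOA clients can invoke it without touching the legacy code. The routine name, URL path and port are hypothetical and not taken from the paper.

```python
# Minimal sketch of the "program" encapsulation aspect: a legacy routine is
# exposed through an HTTP endpoint.  `legacy_lookup` and the /legacy/customer
# path are hypothetical names used only for illustration.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def legacy_lookup(customer_id):
    """Stand-in for an existing legacy routine (e.g. an RPC into the old system)."""
    return {"customer_id": customer_id, "status": "active"}

class EncapsulationHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/legacy/customer/"):
            customer_id = self.path.rsplit("/", 1)[-1]
            body = json.dumps(legacy_lookup(customer_id)).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), EncapsulationHandler).serve_forever()
```

A client could then call, for example, `http://localhost:8080/legacy/customer/42` without any knowledge of the legacy implementation behind the service layer.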

Target Tracking in Sensor Networks: A Distributed Constraint Satisfaction Approach

In distributed resource allocation, a set of agents must assign their resources to a set of tasks. This problem arises in many real-world domains such as distributed sensor networks, disaster rescue, hospital scheduling and others. Despite the variety of approaches proposed for distributed resource allocation, a systematic formalization of the problem that explains the different sources of difficulty, together with a formal account of the strengths and limitations of key approaches, is missing. We take a step towards this goal by using a formalization of distributed resource allocation that represents both the dynamic and the distributed aspects of the problem. In this paper we present a new idea for target tracking in sensor networks and compare it with previous approaches. The central contribution of the paper is a generalized mapping from distributed resource allocation to DDCSP. This mapping is proven to correctly solve resource allocation problems of specified difficulty. This theoretical result is verified in practice by a simulation of a real-world distributed sensor network.
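
As a toy illustration of viewing resource allocation as constraint satisfaction (not the paper's DDCSP mapping itself), the sketch below assigns sensors to targets by backtracking search under the constraint that no sensor tracks two targets. The coverage sets and the two-sensors-per-target requirement are assumptions made for the example.

```python
# Toy constraint-satisfaction view of sensor-to-target allocation: each target
# is a variable whose value is a set of sensors that can cover it, and sensors
# may not be shared between targets.

from itertools import combinations

def allocate(targets, coverage, sensors_per_target=2, used=frozenset(), assignment=None):
    """Backtracking search returning a dict target -> tuple of sensors, or None."""
    assignment = assignment or {}
    if len(assignment) == len(targets):
        return assignment
    target = targets[len(assignment)]
    for combo in combinations(coverage[target], sensors_per_target):
        if used.isdisjoint(combo):            # constraint: no sensor tracks two targets
            result = allocate(targets, coverage, sensors_per_target,
                              used | set(combo), {**assignment, target: combo})
            if result is not None:
                return result
    return None                               # dead end: backtrack

if __name__ == "__main__":
    coverage = {"T1": ["s1", "s2", "s3"], "T2": ["s2", "s3", "s4"]}
    print(allocate(["T1", "T2"], coverage))
```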

RFU-Based Computational Unit Design for Reconfigurable Processors

Fully customized hardware provides high performance and low power consumption by specializing tasks in hardware, but it lacks design flexibility, since any change requires re-design and re-fabrication. Software-based solutions operate with software instructions, so great flexibility is achieved through easy development and maintenance of the software code; however, instruction execution introduces a high overhead in performance and area. In the past few decades, the reconfigurable computing domain has emerged, which overcomes the traditional trade-off between flexibility and performance and achieves high performance while maintaining good flexibility. The dramatic gains in chip performance and design flexibility achieved by reconfigurable computing systems depend greatly on the design of their computational units integrated with reconfigurable logic resources; the computational unit of any reconfigurable system plays a vital role in defining its strength. In this paper, an RFU-based computational unit design is presented using tightly coupled, multi-threaded reconfigurable cores. The proposed design has been simulated for VLIW-based architectures, and a high gain in performance has been observed compared to conventional computing systems.

Mechanical and Thermal Properties Characterisation of Vinyl Ester Matrix Nanocomposites Based On Layered Silicate

The mechanical properties, including the flexural and tensile properties, of neat vinyl ester and of layered silicate nanocomposite materials based on this polymer are discussed. The addition of layered silicate into the polymer matrix increased the tensile and flexural moduli up to 1 wt.% clay loading. Incorporating more clay decreased the mechanical properties, which was traced to the existence of aggregated layers. Likewise, the thermal behaviour showed significant improvements up to 1 wt.% clay loading, while at higher clay loadings the thermal performance declined. The aggregated layers had a negative impact on the overall mechanical and thermal properties. Wide-angle X-ray diffraction, scanning electron microscopy and transmission electron microscopy were utilised to characterise the interlamellar structure of the nanocomposites.

Measuring Process Component Design on Achieving Managerial Goals

Process-oriented software development is a new software development paradigm in which the software design is modeled by a business process, which is in turn translated into a process execution language for execution. The building blocks of this paradigm are software units that are composed so that they work according to the flow of the business process. This new paradigm still exhibits the characteristics of applications built with traditional software component technology. This paper discusses an approach that applies a traditional technique for software component fabrication to the design of process-oriented software units, called process components. These process components result from decomposing a business process of a particular application domain into subprocesses, and they can be reused to design the business processes of other application domains. The decomposition considers five managerial goals, namely cost effectiveness, ease of assembly, customization, reusability, and maintainability. The paper presents how to design or decompose process components from a business process model and how to measure technical features of the design that affect the managerial goals. A comparison of the measurement values from different designs can tell which process component design is more appropriate for the managerial goals that have been set. The proposed approach can be applied in a Web Services environment, which accommodates process-oriented software development.

Corporate Social Responsibility Practices of the Textile Firms Quoted in Istanbul Stock Exchange

Corporate social responsibility (CSR) can be defined as the management of social, environmental, economic and ethical concepts and of firms' sensitivity to the expectations of social stakeholders. CSR is seen as an important competitive advantage in the textile sector, because this sector has a significant impact on the environment and is labor-intensive. The textile sector in Turkey has a strong advantage over other sectors due to its low labor costs and abundance of raw materials. Turkey was a producer and exporter of cotton and an importer of fiber, clothes and dresses until the 1950s; after the 1950s, Turkey began to export fiber and ready-made clothes and has recently become one of the most important textile producers in the world. This study presents the CSR practices of the textile firms quoted on the Istanbul Stock Exchange and these firms' sensitivity to their internal and external stakeholders and to the environment.

Numerical Analysis of Wave and Hydrodynamic Models for Energy Balance and Primitive Equations

A numerical analysis of wave and hydrodynamic models is used to investigate the influence of WAve and Storm Surge (WASS) in regional and coastal zones. The analyzed system consists of the WAve Model Cycle 4 (WAMC4) and the Princeton Ocean Model (POM), which solve the energy balance and primitive equations, respectively. The results of both models showed that the incorporated surface waves in the regional zone affected the coastal storm surge zone. Specifically, the results indicated that the WASS under this approximation involves not only the peak surge but also the coastal water level drop, which can also have a substantial impact on the coastal environment. Accounting for the wave-induced surface stress acting on the storm surge can significantly improve storm surge prediction. Finally, calibrating the wave module to minimize the error in the significant wave height (Hs) does not necessarily yield the optimum wave module of the analyzed system for WASS prediction.

Blending Processing of Industrial Residues: A Specific Case of an Enterprise Located in the Municipality of Belo Horizonte, MG, Brazil

Residues are produced at all stages of human activity, and their composition and volume vary according to consumption practices and production methods. Significant harm to the environment is associated with the volume of generated material as well as with the improper disposal of solid wastes, whose negative effects are noticed more frequently in the long term. Solving this problem is a challenge for government, industry and society, because it involves economic, social and environmental issues and, especially, the awareness of the population in general. The main concerns focus on the impact on human health and on the environment (soil, water, air and landscape). Hazardous wastes, produced mainly by industry, are particularly worrisome because, when improperly managed, they become a serious threat to the environment. In view of this issue, this study aimed to evaluate the solid waste management system of an industrial waste co-processing company and to propose improvements to the management of rejects generated in a specific step of the blending production process.

Analysis of a TBM Tunneling Effect on Surface Subsidence: A Case Study from Tehran, Iran

The development and expansion of large cities have induced a need for shallow tunnels in the soft ground of built-up areas, and estimating the ground settlement caused by tunnel excavation is an important engineering issue. In this paper, the prediction of surface subsidence caused by tunneling in one section of line seven of the Tehran subway is considered. On the basis of the geotechnical conditions of the region, a tunnel with a length of 26.9 km has been excavated by a mechanized method using an EPB-TBM with a diameter of 9.14 m. Settlement is estimated using both analytical methods and the numerical finite element method. The numerical method gives a settlement of 5 cm in this section, while the analytical results (Bobet and Loganathan-Poulos) are 5.29 cm and 12.36 cm, respectively. According to the results of this study, and due to the saturation of this section, there is good agreement between the Bobet and numerical methods. Therefore, the tunneling process in this section needs special consolidation measures and a support system before the passage of the tunnel boring machine.
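
The Bobet and Loganathan-Poulos closed-form solutions cited above are not reproduced here; as a simple analytical point of comparison, the sketch below evaluates the classical Gaussian (Peck-type) settlement trough with an assumed volume loss, trough-width factor and tunnel depth. It is a different analytical estimate, shown only to illustrate the kind of calculation involved.

```python
# Classical Gaussian settlement trough (Peck-type), shown only as a simple
# analytical point of comparison -- NOT the Bobet or Loganathan-Poulos solution
# used in the paper.  Volume loss, trough-width factor and depth are assumed.

import math

def gaussian_settlement(x, depth_m, diameter_m, volume_loss=0.01, k=0.5):
    """Surface settlement [m] at horizontal offset x [m] from the tunnel axis."""
    i = k * depth_m                                   # trough width parameter
    v_loss = volume_loss * math.pi * diameter_m ** 2 / 4.0
    s_max = v_loss / (i * math.sqrt(2.0 * math.pi))   # settlement above the axis
    return s_max * math.exp(-x ** 2 / (2.0 * i ** 2))

if __name__ == "__main__":
    for x in (0.0, 5.0, 10.0, 20.0):
        s = gaussian_settlement(x, depth_m=18.0, diameter_m=9.14)
        print(f"x = {x:5.1f} m   settlement = {s * 100:5.2f} cm")
```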

Variable Step-Size APA with Decorrelation of AR Input Process

This paper introduces a new variable step-size affine projection algorithm (APA) with decorrelation of the AR input process, based on mean-square deviation (MSD) analysis. To achieve a fast convergence rate and a small steady-state estimation error, the proposed algorithm uses a variable step size that is determined by minimizing the MSD. Experimental results show that the proposed algorithm achieves better performance than the other algorithms.
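
A basic affine projection update with a placeholder variable step size is sketched below for context. The error-energy step-size rule and the AR(1) input parameters are assumptions; the sketch does not reproduce the MSD-minimizing step-size rule or the decorrelation step of the proposed algorithm.

```python
# Basic affine projection algorithm (APA) skeleton.  The step-size rule is a
# simple error-energy heuristic used only as a placeholder.

import numpy as np

def apa(d, u, filter_len=8, proj_order=4, mu_max=1.0, delta=1e-3):
    """Identify a filter from desired signal d and input u."""
    w = np.zeros(filter_len)
    for n in range(filter_len + proj_order, len(d)):
        # Rows of A are the last proj_order input regressors.
        A = np.array([u[n - k - filter_len + 1:n - k + 1][::-1] for k in range(proj_order)])
        e = d[n - proj_order + 1:n + 1][::-1] - A @ w
        mu = mu_max * min(1.0, np.dot(e, e))          # placeholder variable step size
        w += mu * A.T @ np.linalg.solve(A @ A.T + delta * np.eye(proj_order), e)
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = rng.standard_normal(8)
    u = np.zeros(5000)                                # AR(1) input (rho assumed 0.9)
    for n in range(1, len(u)):
        u[n] = 0.9 * u[n - 1] + rng.standard_normal()
    d = np.convolve(u, true_w)[:len(u)] + 0.01 * rng.standard_normal(len(u))
    w_hat = apa(d, u)
    print("MSD:", np.sum((true_w - w_hat) ** 2))
```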

Design and Implementation of a WiFi Based Home Automation System

This paper presents the design and prototype implementation of a new home automation system that uses WiFi technology as the network infrastructure connecting its parts. The proposed system consists of two main components. The first is the server (web server), which represents the system core and manages, controls and monitors the users' home; users and the system administrator can manage and control the system locally (LAN) or remotely (Internet). The second is the hardware interface module, which provides an appropriate interface to the sensors and actuators of the home automation system. Unlike most home automation systems available on the market, the proposed system is scalable: one server can manage many hardware interface modules as long as they are within WiFi network coverage. The system supports a wide range of home automation devices, such as power management and security components, and is better in terms of scalability and flexibility than commercially available home automation systems.
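
A minimal sketch of the hardware-interface-module side is given below: a small TCP listener on the WiFi network that toggles named actuators and reports their states over a simple text protocol. The device names, port number and command format are illustrative assumptions, not the paper's protocol.

```python
# Sketch of a hardware interface module: toggles named actuators on
# "SET <device> <state>" commands and reports states on "GET <device>".

import socketserver

DEVICES = {"living_room_lamp": "off", "door_lock": "locked"}  # mock actuator states

class DeviceHandler(socketserver.StreamRequestHandler):
    def handle(self):
        parts = self.rfile.readline().decode().split()
        if len(parts) == 3 and parts[0] == "SET" and parts[1] in DEVICES:
            DEVICES[parts[1]] = parts[2]              # a real module would drive the actuator here
            self.wfile.write(b"OK\n")
        elif len(parts) == 2 and parts[0] == "GET" and parts[1] in DEVICES:
            self.wfile.write(f"{DEVICES[parts[1]]}\n".encode())
        else:
            self.wfile.write(b"ERROR unknown command or device\n")

if __name__ == "__main__":
    with socketserver.TCPServer(("0.0.0.0", 5050), DeviceHandler) as server:
        server.serve_forever()
```

The server could then issue a command such as `SET living_room_lamp on` over a TCP connection to the module and read back the confirmation.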

Development of Genetic-based Machine Learning for Network Intrusion Detection (GBML-NID)

Society has grown to rely on Internet services, and the number of Internet users increases every day. As more and more users become connected to the network, the window of opportunity for malicious users to do damage becomes ever greater and more lucrative. The objective of this paper is to incorporate different techniques into a classifier system to detect and classify intrusions from normal network packets. Among several techniques, a Steady State Genetic-based Machine Learning Algorithm (SSGBML) is used to detect intrusions; the Steady State Genetic Algorithm (SSGA), the Simple Genetic Algorithm (SGA), a Modified Genetic Algorithm and the Zeroth Level Classifier System are investigated in this research. SSGA is used as the discovery mechanism instead of SGA, because SGA replaces all old rules with newly produced rules, preventing good old rules from participating in the next rule generation. The Zeroth Level Classifier System plays the role of the detector by matching the incoming environment message against the classifiers to determine whether the current message is normal or an intrusion, and by receiving feedback from the environment. Finally, in order to attain the best results, a modified SSGA enhances the discovery engine by using fuzzy logic to optimize the crossover and mutation probabilities. The experiments and evaluations of the proposed method were performed with the KDD 99 intrusion detection dataset.
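
The steady-state replacement strategy described above (new rules replace only the worst individuals, so good old rules survive) is illustrated by the sketch below. The bit-string rules and the toy fitness function are placeholders and do not represent the KDD 99 classifier rules or the fuzzy-tuned crossover and mutation of the paper.

```python
# Minimal steady-state GA step: only the worst individual is replaced each
# iteration, unlike a generational SGA that discards the whole population.

import random

RULE_LEN = 16

def fitness(rule):
    """Toy fitness: number of 1-bits (stands in for detection accuracy)."""
    return sum(rule)

def tournament(pop, k=3):
    return max(random.sample(pop, k), key=fitness)

def steady_state_step(pop, p_cross=0.8, p_mut=0.02):
    p1, p2 = tournament(pop), tournament(pop)
    if random.random() < p_cross:                     # one-point crossover
        cut = random.randrange(1, RULE_LEN)
        child = p1[:cut] + p2[cut:]
    else:
        child = list(p1)
    child = [b ^ (random.random() < p_mut) for b in child]   # bit-flip mutation
    pop.remove(min(pop, key=fitness))                 # replace only the worst rule
    pop.append(child)

if __name__ == "__main__":
    random.seed(1)
    population = [[random.randint(0, 1) for _ in range(RULE_LEN)] for _ in range(20)]
    for _ in range(500):
        steady_state_step(population)
    print("best fitness:", fitness(max(population, key=fitness)))
```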

A Diffusion Least-Mean Square Algorithm for Distributed Estimation over Sensor Networks

In this paper we consider the issue of distributed adaptive estimation over sensor networks. To address a more realistic scenario, a different observation noise variance is assumed for each sensor in the network. To handle these differing variances, the proposed method is divided into two phases: (I) estimating each sensor's observation noise variance and (II) using the estimated variances to obtain the desired parameter. The proposed algorithm is based on a diffusion least mean square (LMS) implementation with a linear combiner model, in which the step-size parameter and the coefficients of the linear combiner are adjusted according to the estimated observation noise variances. As the simulation results show, the proposed algorithm considerably improves on the diffusion LMS algorithm given in the literature.
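
For reference, the sketch below implements a plain adapt-then-combine diffusion LMS over a small fully connected network with different observation noise variances per node. The uniform combination weights and fixed step size are simplifying assumptions; the paper's variance-dependent adjustment of the step size and combiner coefficients is not reproduced here.

```python
# Adapt-then-combine diffusion LMS over a small fully connected network.
# Noise variances differ per node, as in the abstract; weights and step size
# are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
N, M, T = 5, 4, 3000                       # nodes, filter length, iterations
w_true = rng.standard_normal(M)
noise_std = np.linspace(0.05, 0.5, N)      # different observation noise per sensor
A = np.full((N, N), 1.0 / N)               # uniform combination weights
mu = 0.01

W = np.zeros((N, M))                       # local estimates, one row per node
for _ in range(T):
    # Adaptation step: each node runs one LMS update on its own data.
    U = rng.standard_normal((N, M))
    d = U @ w_true + noise_std * rng.standard_normal(N)
    psi = W + mu * (d - np.einsum("nm,nm->n", U, W))[:, None] * U
    # Combination step: each node averages its neighbours' intermediate estimates.
    W = A @ psi

print("per-node MSD:", np.sum((W - w_true) ** 2, axis=1))
```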