Combinatorial Optimisation of Worm Propagation on an Unknown Network

Worm propagation profiles have changed significantly since 2003-2004: sudden worldwide outbreaks like Blaster or Slammer have progressively disappeared, and slower but stealthier worms have appeared since, most of them for botnet dissemination. This decreased virulence makes detection more difficult. In this paper, we describe a stealth worm propagation model which has been extensively simulated and analysed on a huge virtual network. The main feature of this model is its ability to infect any Internet-like network in a few seconds, whatever its size, while greatly limiting the overhead of reinfection attempts on already infected hosts. The main simulation results show that the combinatorial topology of routing may have a huge impact on worm propagation, and thus some servers play a more essential role than others. The capability to identify them in real time may be essential to greatly hinder worm propagation.
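To make the topology effect concrete, the following is a minimal, hypothetical sketch (not the authors' model; the graph generator, fan-out, and step count are illustrative assumptions) of simulating worm spread on a scale-free, Internet-like graph while skipping already infected targets:

    import random
    import networkx as nx

    def simulate(n=10_000, fanout=3, steps=20, seed=1):
        # Scale-free graph as a stand-in for an Internet-like topology.
        g = nx.barabasi_albert_graph(n, 3, seed=seed)
        infected = {0}
        for _ in range(steps):
            newly = set()
            for host in infected:
                neighbours = list(g.neighbors(host))
                # Each infected host probes a few neighbours per step; already
                # infected targets are skipped, limiting reinfection overhead.
                for target in random.sample(neighbours, min(fanout, len(neighbours))):
                    if target not in infected:
                        newly.add(target)
            infected |= newly
        return len(infected)

    print(simulate())

In sketches of this kind, removing or hardening the highest-degree nodes slows propagation sharply, which is consistent with the claim that some servers play a more critical role than others.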

Algorithm Design and Performance Evaluation of Equivalent CMOS Model

This work proposes an equivalent CMOS model, for which an algorithm has been created and a performance evaluation of the proposition has been carried out. In this context, another commonly used model, the ZSTT (Zero Switching Time Transient) model, is chosen for comparison of all the vital features, and the results for the proposed equivalent CMOS model are promising. In the end, excerpts of the created algorithm are also included.

Viscosity Reduction and Upgrading of Athabasca Oilsands Bitumen by Natural Zeolite Cracking

Oilsands bitumen is an extremely important source of energy for North America. However, due to the presence of large molecules such as asphaltenes, the density and viscosity of the bitumen recovered from these sands are much higher than those of conventional crude oil. As a result, the extracted bitumen has to be diluted with expensive solvents, or thermochemically upgraded in large, capital-intensive conventional upgrading facilities prior to pipeline transport. This study demonstrates that globally abundant natural zeolites, such as clinoptilolite from Saint Clouds, New Mexico and Ca-chabazite from Bowie, Arizona, can be used as very effective reagents for cracking and visbreaking of oilsands bitumen. The products of natural zeolite cracking of oilsands bitumen are highly recoverable (up to ~83%) using light hydrocarbons such as pentane, which indicates substantial conversion of heavier fractions to lighter components. The resultant liquid products are much less viscous and have a lighter product distribution compared to those produced by pure thermal treatment. These natural minerals impart a similar effect on industrially extracted Athabasca bitumen.

Investigation of Scour Depth at Bridge Piers Using the BRI-STARS Model in Iran

The BRI-STARS (BRIdge Stream Tube model for Alluvial River Simulation) program was used to investigate the scour depth around bridge piers in some of the major river systems in Iran. Model calibration was performed by collecting different sets of field data. The field data were catalogued into three categories: bridges whose river beds are formed of fine material, bridges whose river beds are formed of sand, and bridges whose river beds are formed of gravel or cobble materials. Verification was performed with field data from Fars Province. Results show that for wide piers, the computed scour depth is greater than the measured one. In gravel-bed streams, the computed scour depth is also greater than the measured scour depth; the reason is the formation of an armor layer on the channel bed. Once this layer is eroded, the computed scour depth is close to the measured one.

Automated Particle Picking Based on Correlation Peak Shape Analysis and Iterative Classification

Cryo-electron microscopy (CEM) in combination with single particle analysis (SPA) is a widely used technique for elucidating structural details of macromolecular assemblies at close-to-atomic resolutions. However, the development of automated software for SPA processing remains vital, since thousands to millions of individual particle images need to be processed. Here, we present our workflow for automated particle picking. Our approach integrates peak shape analysis with the classical correlation approach, and separates macromolecules from background by iterative classification. This particle selection workflow thereby provides a robust means for SPA with little user interaction. The performance of the presented tools is assessed on simulated and experimental data.
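As a rough illustration of the general idea (a minimal sketch, not the authors' implementation; the function names, thresholds, and the crude ring-based shape test are assumptions), correlation-based picking with a simple peak-shape check could look like this:

    import numpy as np
    from skimage.feature import match_template, peak_local_max

    def pick_particles(micrograph, template, min_distance=20,
                       corr_thresh=0.4, shape_ratio=0.8):
        # Normalized cross-correlation of the micrograph with the template.
        corr = match_template(micrograph, template, pad_input=True)
        peaks = peak_local_max(corr, min_distance=min_distance,
                               threshold_abs=corr_thresh)
        picked = []
        for y, x in peaks:
            # Crude peak-shape test: a genuine particle peak should dominate
            # its immediate surroundings, so compare the peak height with the
            # mean of a small patch around it.
            patch = corr[max(y - 3, 0):y + 4, max(x - 3, 0):x + 4]
            if corr[y, x] > 0 and patch.mean() / corr[y, x] < shape_ratio:
                picked.append((y, x))
        return picked

Candidates surviving such a shape filter would then be passed to the iterative classification stage that separates true particles from background.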

Investigation of Tool Temperature and Surface Quality in Hot Machining of Hard-to-Cut Materials

Turning hard-to-cut materials with uncoated carbide cutting tools not only reduces tool life but also impairs the surface roughness of the product. In this paper, the influence of the hot machining method was studied and is presented in two cases. Case 1: workpiece surface roughness quality with constant cutting parameters and an initial workpiece surface temperature of 300 ºC. Case 2: tool temperature variation when cutting at two speeds, 78.5 m/min and 51 m/min. The workpiece material and tool used in this study were AISI 1060 steel (45 HRC) and an uncoated carbide TNNM 120408-SP10 (SANDVIK Coromant) insert, respectively. A gas flame heating source was used to preheat the workpiece surface up to 300 ºC, reducing the yield stress by about 15%. The experimental results show that the method used can considerably improve the surface quality of the workpiece.

Stability of Discrete Linear Systems with Periodic Coefficients under Parametric Perturbations

This paper studies the problem of exponential stability of perturbed discrete linear systems with periodic coefficients. Assuming that the unperturbed system is exponentially stable, we obtain conditions on the perturbations under which the perturbed system is exponentially stable.
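For illustration, a standard setting of this kind (the notation below is assumed, not taken from the paper) is the periodic system and its perturbed counterpart:

    \[
      x_{k+1} = A_k x_k, \qquad A_{k+T} = A_k \quad (\text{period } T),
    \]
    \[
      y_{k+1} = (A_k + B_k)\, y_k .
    \]

If the unperturbed transition matrix satisfies \(\|\Phi(k,j)\| \le c\,\lambda^{k-j}\) with \(c \ge 1\) and \(0 < \lambda < 1\), a typical textbook-style sufficient condition is \(\sup_k \|B_k\| < (1-\lambda)/c\), under which a variation-of-constants and discrete Gronwall argument gives decay at rate \(\lambda + c\,\sup_k \|B_k\| < 1\). This bound is given only to illustrate the flavor of such perturbation conditions, not as the paper's precise result.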

U.S. Nuclear Regulatory Commission Training for Research and Training Reactor Inspectors

Currently, a large number of licensing activities (Early Site Permits, Combined Operating Licenses, reactor certifications, etc.) are pending review before the United States Nuclear Regulatory Commission (US NRC). Much of the senior staff at the NRC is now committed to these review and licensing actions. To address this additional workload, the NRC has recruited a large number of new regulatory staff to deal with these and other regulatory actions, such as the US fleet of Research and Test Reactors (RTRs). Although few in number (32 licensed RTRs as of 2010), these reactors place unusual demands on regulatory staff, since they represent a broad range of reactor types, operations, and research and training aspects that nuclear power plants (such as the 104 LWRs) do not. The NRC must inspect and regulate all these facilities. This paper addresses selected training topics and regulatory activities provided to NRC inspectors for RTRs.

Physicochemical Properties of Microemulsions and Their Uses in Enhanced Oil Recovery

The use of microemulsions in enhanced oil recovery has become more attractive in recent years because of their high extraction efficiency. Experimental investigations have been made on the characterization of microemulsions of an oil-brine-surfactant/cosurfactant system for use in enhanced oil recovery (EOR). Sodium dodecyl sulfate, propan-1-ol and heptane were selected as surfactant, cosurfactant and oil, respectively, for preparation of the microemulsion. The effects of salinity on the relative phase volumes and solubilization parameters have also been studied. As salinity changes from low to high values, a phase transition takes place from Winsor I to Winsor II via Winsor III. A suitable microemulsion composition was selected based on its stability and its ability to reduce interfacial tension. A series of flooding experiments was performed using the selected microemulsion. The flooding experiments were performed in a core flooding apparatus using a uniform sand pack. The core holder was tightly packed with uniform sands (60-100 mesh) and saturated with brines of different salinities. It was flooded with brine at 25 psig, and the absolute permeability was calculated from the flow rate through the sand pack. The sand pack was then flooded with crude oil at 800 psig to irreducible water saturation. The initial water saturation was determined on the basis of mass balance. Waterflooding was conducted by placing the core holder horizontally at a constant injection pressure of 200 psig. After water flooding, when the water-cut reached above 95%, around 0.5 pore volume (PV) of the above microemulsion slug was injected, followed by chasing water. The experiments were repeated using different compositions of the microemulsion slug. The additional recoveries were calculated by material balance. Encouraging results have been observed, with additional recovery of more than 20% of the original oil in place above conventional water flooding.
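As a rough illustration of the material-balance bookkeeping involved (the symbols below are assumed for this sketch, not taken from the paper), the initial water saturation and the additional recovery factor can be written as:

    \[
      S_{wi} = \frac{V_p - V_o}{V_p}, \qquad
      RF_{\text{add}} = \frac{N_{p,\text{slug}}}{N} \times 100\%,
    \]

where \(V_p\) is the pore volume of the sand pack, \(V_o\) the oil volume retained at irreducible water saturation, \(N_{p,\text{slug}}\) the oil produced by the microemulsion slug and chase water, and \(N = V_o\) the original oil in place.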

High Level Synthesis of Digital Filters Based On Sub-Token Forwarding

High level synthesis (HLS) is a process which generates a register-transfer-level design for digital systems from a behavioral description. There are many HLS algorithms and commercial tools. However, most of these algorithms consider a behavioral description of the system when a single token is presented to the system. This approach does not exploit extra hardware efficiently, especially in the design of digital filters, where common operations may exist between successive tokens. In this paper, we modify the behavioral description to process multiple tokens in parallel. Unlike fully parallel processing, however, this approach does not require full hardware replication; instead, it exploits the presence of common operations between successive tokens. The performance of the proposed approach is better than that of sequential processing and approaches that of fully parallel processing as the hardware resources are increased.
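To illustrate the kind of sharing between successive tokens being exploited (a hypothetical software analogy, not the paper's synthesis algorithm), consider a 3-tap moving-average filter: each product contributes to three successive outputs, so it can be computed once and forwarded rather than recomputed per output:

    def moving_average_shared(x, b=1/3):
        # Each product b*x[k] is computed exactly once...
        products = [b * v for v in x]
        y = []
        for n in range(2, len(x)):
            # ...and reused here by three successive outputs, mirroring the
            # sharing of common operations between successive tokens.
            y.append(products[n] + products[n - 1] + products[n - 2])
        return y

In hardware terms, the shared multiplications correspond to functional units whose results are forwarded across token boundaries instead of being replicated.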

Preliminary Investigation on Combustion Characteristics of Rice Husk in FBC

The experimental results on combustion of rice husk in a conical fluidized bed combustor (referred to as the conical FBC) using silica sand as the bed material are presented in this paper. The effects of excess combustion air and combustor loading, as well as the sand bed height, on the combustion pattern in the FBC were investigated. Temperatures and gas concentrations (CO and NO) along the combustor height, as well as in the flue gas downstream of the ash-collecting cyclone, were measured. The results showed that the axial temperature profiles in the FBC were explicitly affected by the combustor loading, whereas the excess air and bed height were found to have minor influences on the temperature pattern. Meanwhile, the combustor loading and the excess air significantly affected the axial CO and NO concentration profiles; however, these profiles were almost independent of the bed height. The combustion and thermal efficiencies of this FBC were quantified for different operating conditions.

A Genetic Algorithm Based Classification Approach for Finding Fault Prone Classes

Fault-proneness of a software module is the probability that the module contains faults. A correlation exists between the fault-proneness of the software and measurable attributes of the code (i.e. the static metrics) and of the testing (i.e. the dynamic metrics). Early detection of fault-prone software components enables verification experts to concentrate their time and resources on the problem areas of the software system under development. This paper introduces genetic algorithm based software fault prediction models built on object-oriented metrics. The contribution of this paper is the use of metric values from the JEdit open source software to generate rules for classifying software modules as faulty or non-faulty, followed by empirical validation. The results show that the genetic algorithm approach can be used to identify fault-prone components in object-oriented software.
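A minimal sketch of the general technique (not the paper's model; the threshold-rule encoding, metric ranges, and GA settings are illustrative assumptions) evolving classification rules over object-oriented metrics might look like this:

    import random

    def fitness(thresholds, modules):
        # A module is predicted faulty if any metric exceeds its threshold;
        # fitness is classification accuracy on labeled training data.
        correct = 0
        for metrics, faulty in modules:
            predicted = any(m > t for m, t in zip(metrics, thresholds))
            correct += (predicted == faulty)
        return correct / len(modules)

    def evolve(modules, n_metrics, pop=30, gens=50):
        population = [[random.uniform(0, 100) for _ in range(n_metrics)]
                      for _ in range(pop)]
        for _ in range(gens):
            population.sort(key=lambda t: fitness(t, modules), reverse=True)
            parents = population[:pop // 2]          # truncation selection
            children = []
            for _ in range(pop - len(parents)):
                a, b = random.sample(parents, 2)
                child = [random.choice(pair) for pair in zip(a, b)]  # crossover
                i = random.randrange(n_metrics)
                child[i] += random.gauss(0, 5)                       # mutation
                children.append(child)
            population = parents + children
        return max(population, key=lambda t: fitness(t, modules))

Here each chromosome is a vector of metric thresholds, and the surviving rule set classifies a module as faulty when any metric exceeds its evolved threshold.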

Fingerprint Verification System Using Minutiae Extraction Technique

Most fingerprint recognition techniques are based on minutiae matching and have been well studied. However, this technology still suffers from problems associated with the handling of poor quality impressions. One problem besetting fingerprint matching is distortion. Distortion changes both geometric position and orientation, and leads to difficulties in establishing a match among multiple impressions acquired from the same fingertip. Marking all the minutiae accurately while rejecting false minutiae is another issue still under research. Our work combines many methods to build a minutia extractor and a minutia matcher. The combination of multiple methods comes from a wide investigation of research papers. The work also incorporates some novel changes, such as segmentation using morphological operations, improved thinning, false-minutiae removal methods, minutia marking with special consideration of triple branch counting, minutia unification by decomposing a branch into three terminations, and matching in a unified x-y coordinate system after a two-step transformation.
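For context, the standard way to mark minutiae on a thinned ridge map is the crossing-number method; the sketch below (a generic illustration under assumed inputs, not this paper's extractor, which adds triple-branch handling on top) flags a ridge ending when the crossing number is 1 and a bifurcation when it is 3:

    import numpy as np
    from skimage.morphology import thin

    def mark_minutiae(binary_ridges):
        skeleton = thin(binary_ridges).astype(np.uint8)
        endings, bifurcations = [], []
        rows, cols = skeleton.shape
        for y in range(1, rows - 1):
            for x in range(1, cols - 1):
                if not skeleton[y, x]:
                    continue
                # 8-neighbors of (y, x) in circular order.
                p = [skeleton[y-1, x], skeleton[y-1, x+1], skeleton[y, x+1],
                     skeleton[y+1, x+1], skeleton[y+1, x], skeleton[y+1, x-1],
                     skeleton[y, x-1], skeleton[y-1, x-1]]
                # Crossing number: half the count of 0/1 transitions around the pixel.
                cn = sum(abs(int(p[i]) - int(p[(i + 1) % 8])) for i in range(8)) // 2
                if cn == 1:
                    endings.append((y, x))
                elif cn == 3:
                    bifurcations.append((y, x))
        return endings, bifurcations

False-minutiae removal then typically prunes pairs of minutiae that lie closer together than a ridge width, which is the class of cleanup steps the abstract refers to.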

A Pairwise-Gaussian-Merging Approach: Towards Genome Segmentation for Copy Number Analysis

Segmentation, filtering out of measurement errors and identification of breakpoints are integral parts of any analysis of microarray data for the detection of copy number variation (CNV). Existing algorithms designed for these tasks have had some success in the past, but they tend to be O(N^2) in computation time or memory requirement, or both, and the rapid advance of microarray resolution has practically rendered such algorithms useless. Here we propose an algorithm, SAD, that is much faster and much less memory-hungry -- O(N) in both computation time and memory requirement -- and offers higher accuracy. The two key ingredients of SAD are the fundamental assumption in statistics that measurement errors are normally distributed and the mathematical relation that the product of two Gaussians is another Gaussian (function). We have produced a computer program for analyzing CNV based on SAD. In addition to being fast and small, it offers two important features: quantitative statistics for predictions and, with only two user-decided parameters, ease of use. Its speed shows little dependence on the genomic profile. Running on an average modern computer, it completes CNV analyses for a 262 thousand-probe array in ~1 second and for a 1.8 million-probe array in 9 seconds.
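The merging step rests on the standard Gaussian-product identity (notation assumed here for illustration):

    \[
      \mathcal{N}(x;\mu_1,\sigma_1^2)\,\mathcal{N}(x;\mu_2,\sigma_2^2)
      \;\propto\;
      \mathcal{N}(x;\mu_{12},\sigma_{12}^2),
      \qquad
      \sigma_{12}^2 = \left(\frac{1}{\sigma_1^2}+\frac{1}{\sigma_2^2}\right)^{-1},
      \quad
      \mu_{12} = \sigma_{12}^2\left(\frac{\mu_1}{\sigma_1^2}+\frac{\mu_2}{\sigma_2^2}\right).
    \]

Because merging two adjacent Gaussian-modeled segments by this rule takes constant work per merge, a single pairwise-merging pass over the probes is consistent with the stated O(N) time and memory behavior.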

Analysis of Modified Heap Sort Algorithm in Different Environments

In the fields of computer science and mathematics, a sorting algorithm is an algorithm that puts the elements of a list in a certain order, i.e. ascending or descending. Sorting is perhaps the most widely studied problem in computer science and is frequently used as a benchmark of a system's performance. This paper presents a comparative performance study of four sorting algorithms on different platforms. For each machine, it is found that algorithm performance depends upon the number of elements to be sorted. In addition, as expected, the results show that the relative performance of the algorithms differed across the various machines. Thus, algorithm performance depends on data size, and hardware also has an impact.
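A minimal sketch of such a cross-size benchmark (illustrative sizes and a simple heapq-based heap sort, not the paper's modified variant) run on each target machine could look like this:

    import heapq, random, time

    def heap_sort(items):
        # Classic heap sort via a binary min-heap.
        heap = list(items)
        heapq.heapify(heap)
        return [heapq.heappop(heap) for _ in range(len(heap))]

    for n in (1_000, 10_000, 100_000):
        data = [random.random() for _ in range(n)]
        for name, fn in (("heap sort", heap_sort), ("built-in sort", sorted)):
            start = time.perf_counter()
            fn(data)
            print(f"n={n:>7} {name}: {time.perf_counter() - start:.4f} s")

Repeating the same script on different hardware exposes exactly the two effects the abstract describes: dependence on data size and on the machine itself.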

Characterization of Carbon Based Nanometer Scale Coil Growth

Carbon-based coils at the nanometer scale have a three-dimensional helical geometry. We synthesized the carbon nano-coils by chemical vapor deposition with iron and tin as the catalysts. The fabricated coils have external diameters ranging from a few hundred nm to a few thousand nm. Scanning electron microscopy (SEM) and tunneling electron microscopy have provided detailed images of the coils' structure. The carbon nano-coils can be grown on metal and non-metal substrates, such as stainless steel and silicon. Besides growth on flat substrates, they can also be grown on stainless steel wires. After synthesis of the coils, their mechanical and electromechanical properties were measured, and the experimental results are reported.

Impact of Faults in Different Software Systems: A Survey

Software maintenance is an extremely important activity in the software development life cycle. It involves a lot of human effort, cost and time. Software maintenance may be further subdivided into different activities such as fault prediction, fault detection, fault prevention, fault correction etc. This topic has gained substantial attention due to sophisticated and complex applications, commercial hardware, clustered architectures and artificial intelligence. In this paper, we survey the work done in the field of software maintenance. Software fault prediction has been studied in the context of fault-prone modules, self-healing systems, developer information, maintenance models etc. Still, much remains to be explored in the field of fault severity, such as modeling and weighting the impact of different kinds of faults in the various types of software systems.

Protein Secondary Structure Prediction

Protein structure determination and prediction has been a focal research subject in the field of bioinformatics due to the importance of protein structure in understanding the biological and chemical activities of organisms. The experimental methods used by biotechnologists to determine the structures of proteins demand sophisticated equipment and time. A host of computational methods have been developed to predict the location of secondary structure elements in proteins, complementing or providing insights into experimental results. However, the prediction accuracies of these methods rarely exceed 70%.

TRS: System for Recommending Semantic Web Service Composition Approaches

A large number of semantic web service composition approaches have been developed by the research community, and one is more efficient than another depending on the particular situation of use. So a close look at the requirements of one's particular situation is necessary to find a suitable approach to use. In this paper, we present a Technique Recommendation System (TRS) which, using a classification of state-of-the-art semantic web service composition approaches, can provide the user of the system with recommendations regarding which service composition approach to use, based on some parameters describing the situation of use. TRS has a modular architecture and uses production rules for knowledge representation.
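To illustrate the production-rule style of knowledge representation (a minimal sketch with hypothetical rules, parameters, and approach names; TRS's actual rule base is not given in the abstract), a recommender of this kind might map situation parameters to approaches as follows:

    RULES = [
        (lambda s: s["qos_aware"] and s["dynamic_environment"],
         "agent-based composition"),
        (lambda s: s["formal_verification_required"],
         "model-checking-based composition"),
        (lambda s: True, "template/workflow-based composition"),  # default rule
    ]

    def recommend(situation):
        # Fire the first production rule whose condition matches the
        # user's situation of use.
        for condition, approach in RULES:
            if condition(situation):
                return approach

    print(recommend({"qos_aware": True, "dynamic_environment": True,
                     "formal_verification_required": False}))

Keeping the rules as data, separate from the matching logic, reflects the modular architecture the abstract describes: the rule base can grow as new composition approaches are classified.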

Gluten-Free Cookies Enriched with Blueberry Pomace: Optimization of Baking Process

With the aim of improving the nutritional profile and antioxidant capacity of gluten-free cookies, blueberry pomace, a by-product of juice production, was processed into a new food ingredient by drying and grinding and used in a gluten-free cookie formulation. Since the quality of a baked product is highly influenced by the baking conditions, the objective of this work was to optimize the baking time and the thickness of the dough pieces by applying Response Surface Methodology (RSM), in order to obtain the best technological quality of the cookies. The experiments were carried out according to a Central Composite Design (CCD) by selecting the dough thickness and baking time as independent variables, while hardness, color parameters (L*, a* and b* values), water activity, diameter and short/long ratio were the response variables. According to the results of the RSM analysis, a baking time of 13.74 min and a dough thickness of 4.08 mm were found to be optimal for a baking temperature of 170°C. As similar optimal parameters were obtained in a previously conducted experiment based on sensory analysis, RSM can be considered a suitable approach to optimizing the baking process.
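For readers unfamiliar with RSM, the core step is fitting a second-order polynomial to the CCD responses by least squares; the sketch below uses entirely synthetic, illustrative numbers (not the paper's measurements) for one response variable:

    import numpy as np

    # x1 = dough thickness (mm), x2 = baking time (min); y = a response
    # such as hardness. All values here are made up for illustration only.
    X_raw = np.array([[3, 12], [5, 12], [3, 16], [5, 16], [4, 14],
                      [2.6, 14], [5.4, 14], [4, 11.2], [4, 16.8]])
    y = np.array([21.0, 25.5, 24.2, 30.1, 23.0, 20.5, 28.7, 22.1, 27.9])

    x1, x2 = X_raw[:, 0], X_raw[:, 1]
    # Full quadratic model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    print("fitted coefficients:", coeffs)

The fitted surface is then searched (analytically or numerically) for the settings that optimize each response, which is how optima such as the reported 13.74 min and 4.08 mm are obtained in an RSM study.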