Distributed Detection and Optimal Traffic-blocking of Network Worms

Despite the recent surge of research in controlling worm propagation, there is currently no effective defense system against such cyber attacks. We first design a distributed detection architecture called Detection via Distributed Blackholes (DDBH). Our novel detection mechanism can be implemented via virtual honeypots or honeynets. Simulation results show that a worm can be detected with virtual honeypots on only 3% of the nodes, and that detection occurs when less than 1.5% of the nodes are infected. We then develop two control strategies: (1) optimal dynamic traffic-blocking, for which we determine the condition that guarantees the minimum number of removed nodes when the worm is contained, and (2) predictive dynamic traffic-blocking, a realistic deployment of the optimal strategy on scale-free graphs. Predictive dynamic traffic-blocking, coupled with DDBH, ensures that more than 40% of the network is unaffected by the propagation at the time the worm is contained.
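The detection mechanism can be illustrated with a toy simulation: a small fraction of addresses is instrumented as honeypots, and the worm is flagged the first time a honeypot receives a scan. This is a minimal sketch of random-scanning propagation under illustrative parameters, not the DDBH architecture itself; the function name and all numeric values are assumptions for the example.

```python
import random

def simulate_detection(n=1000, honeypot_frac=0.03, probes_per_step=3, seed=1):
    """SI-style worm spread by random scanning; honeypots flag the first
    probe they receive. Returns the fraction of nodes infected at the
    moment of detection. Illustrative sketch, not the DDBH protocol."""
    rng = random.Random(seed)
    honeypots = set(rng.sample(range(n), int(honeypot_frac * n)))
    # Seed the infection at one ordinary (non-honeypot) node.
    infected = {next(v for v in range(n) if v not in honeypots)}
    while True:
        for _ in list(infected):
            for _ in range(probes_per_step):
                target = rng.randrange(n)        # random-scanning worm
                if target in honeypots:
                    return len(infected) / n     # detected
                infected.add(target)

print(simulate_detection())
```

With 3% honeypot coverage, each random-scanning probe hits a honeypot with probability 0.03, so detection typically occurs after a few dozen probes, while only a small fraction of nodes is infected.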

Experimental Evaluation of Drilling Damage on the Strength of Cores Extracted from RC Buildings

Concrete strength evaluated from compression tests on cores is affected by several factors that cause differences from the in-situ strength at the location from which the core specimen was extracted. Among these factors is the damage that may occur during the drilling phase, which generally leads to underestimation of the actual in-situ strength. To quantify this effect, this study examines two large datasets: (i) about 500 core specimens extracted from existing reinforced concrete structures, and (ii) about 600 cube specimens taken during the construction of new structures in the framework of routine acceptance control. The two experimental datasets were compared in terms of compressive strength and specific weight, accounting for the main factors affecting concrete properties: type and amount of cement, aggregate grading, type and maximum size of aggregates, water/cement ratio, placing and curing modality, and concrete age. The results show that the magnitude of the strength reduction due to drilling damage is strongly affected by the actual properties of the concrete, being inversely proportional to its strength. Therefore, the application of a single value of the correction coefficient, as generally suggested in the technical literature and in structural codes, appears inappropriate. A set of values of the drilling damage coefficient is suggested as a function of the strength obtained from compressive tests on cores.
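As a sketch of how a strength-dependent (rather than fixed) drilling-damage coefficient would be applied, consider the following; the piecewise coefficient values and function names are illustrative placeholders, not the calibrated set proposed in the study.

```python
def corrected_strength(f_core, coeff):
    """Apply a drilling-damage correction: f_corrected = C_d * f_core,
    with C_d chosen from the core strength itself (a single fixed C_d
    is what the study argues against). Units: MPa."""
    return coeff(f_core) * f_core

def example_coeff(f_core):
    # Illustrative only: damage matters more for weak concrete,
    # so the correction is larger at low strength.
    if f_core < 20:
        return 1.20
    if f_core < 40:
        return 1.10
    return 1.05

print(corrected_strength(15.0, example_coeff), corrected_strength(50.0, example_coeff))
```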

Some Algebraic Properties of Universal and Regular Covering Spaces

Let X be a connected space, let X̃ be a space, let p : X̃ → X be a continuous map, and let (X̃, p) be a covering space of X. In the first section we give some preliminaries on covering spaces and their automorphism groups. In the second section we derive some algebraic properties of both universal and regular covering spaces (X̃, p) of X, as well as of their automorphism groups A(X̃, p).
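For context, the classical results that such properties build on (for a path-connected, locally path-connected base X with suitable basepoints) can be stated as follows; this is standard covering-space theory, not a statement of the paper's new results:

```latex
% Automorphism (deck transformation) group of a covering (X~, p):
A(\tilde{X},p) \;\cong\; N_{\pi_1(X,x_0)}\!\bigl(p_*\pi_1(\tilde{X},\tilde{x}_0)\bigr)\,/\,p_*\pi_1(\tilde{X},\tilde{x}_0)

% If (X~, p) is regular, p_*\pi_1(\tilde{X},\tilde{x}_0) is normal in \pi_1(X,x_0), so
A(\tilde{X},p) \;\cong\; \pi_1(X,x_0)\,/\,p_*\pi_1(\tilde{X},\tilde{x}_0)

% If (X~, p) is universal (\tilde{X} simply connected), then
A(\tilde{X},p) \;\cong\; \pi_1(X,x_0)
```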

Nigerian Bread Contributes One Half of the Recommended Vitamin A Intake in Poor-Urban Lagosian Preschoolers

Nigerian bread is baked with vitamin A-fortified wheat flour. This study aimed to determine its contribution to preschoolers' vitamin A nutriture. A cross-sectional/experimental study was carried out in four poor-urban Local Government Areas (LGAs) of Metropolitan Lagos, Nigeria. A pretested food frequency questionnaire was administered to randomly selected mothers of 1600 preschoolers (24-59 months). The retinyl palmitate content of fourteen bread samples randomly collected from bakeries in all LGAs was analyzed at 0 and 5 days at 25°C using High Performance Liquid Chromatography. Data analysis was done at p

Analysis of the Structural Fluctuation of the Permitted Building Areas and Housing Distribution Ratios - Focused on 5 Cities Including Bucheon

The purpose of this study was to analyze the correlation between permitted building areas and housing distribution ratios and their fluctuation, and to test a distribution model across three successive governments in five cities including Bucheon, with reference to time-series administrative data. The results of the analysis are then interpreted in association with the policies pursued by the successive governments, in order to examine the structural fluctuation of permitted building areas and housing distribution ratios. Spectral analysis was performed to analyze the fluctuation of permitted building areas and housing distribution ratios during the three governments and to examine the cycles of the time-series data; tabulation was used to describe the correlation between permitted building areas and housing distribution ratios statistically; and a goodness-of-fit test was conducted to explain the differences in the fluctuation distributions of permitted building areas and housing distribution ratios among the three governments.
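The spectral analysis step amounts to computing the power spectrum of a time series and reading off the dominant cycle. A minimal discrete Fourier transform sketch on a synthetic series (the administrative data themselves are not reproduced here):

```python
import cmath

def power_spectrum(series):
    """Power spectrum |X_k|^2, k = 0..N-1, via a direct DFT.
    Used here only to illustrate cycle detection in a time series."""
    n = len(series)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t, x in enumerate(series))) ** 2 for k in range(n)]

# Synthetic 12-point series with a period-4 cycle; the spectrum peaks at k = 3.
series = [1, 0, -1, 0] * 3
spec = power_spectrum(series)
peak = max(range(1, len(series) // 2 + 1), key=lambda k: spec[k])
print(peak, len(series) / peak)   # dominant frequency index and its period
```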

Simulation of 3D Flow Using a Numerical Model at Open-Channel Confluences

This paper numerically investigates the 3D flow pattern at the confluence of two rectangular channels meeting at a 90° angle, using the Navier-Stokes equations with the Reynolds Stress Turbulence Model (RSM). The equations are solved by the Finite-Volume Method (FVM), and the flow is analyzed under steady-state (single-phase) conditions. The Shumate experimental findings were used to test the validity of the model. Comparison of the simulated flow patterns with the experimental ones indicated close agreement between the two sets. The effect of the discharge ratio on the dimensions of the separation zone created in the main channel downstream of the confluence showed an inverse relation: a decrease in the discharge ratio entails an increase in the length and width of the separation zone. The study also found the model to be a powerful analytical tool for feasibility studies of hydraulic engineering projects.

Design of Folded Cascode OTA in Different Regions of Operation through gm/ID Methodology

This paper presents an optimized methodology for the design of a folded cascode operational transconductance amplifier (OTA). The design is carried out in different regions of operation (weak, moderate, and strong inversion) using the gm/ID methodology in order to optimize MOS transistor sizing. Using a 0.35 μm CMOS process, the designed folded cascode OTA achieves a DC gain of 77.5 dB and a unity-gain frequency of 430 MHz in strong inversion. In moderate inversion, it has a 92 dB DC gain and provides a gain-bandwidth product of around 69 MHz. In weak inversion, the OTA has a DC gain of 75.5 dB and a unity-gain frequency limited to 19.14 MHz.
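The gm/ID sizing flow can be sketched as follows: the gain-bandwidth target and load capacitance fix gm, the chosen inversion level (gm/ID) fixes the bias current, and a current-density characteristic gives the device width. All numeric values below, including the load capacitance and current density, are illustrative assumptions rather than 0.35 μm process data.

```python
import math

def size_input_pair(gbw_hz, c_load_f, gm_over_id, id_density_a_per_um):
    """Back-of-envelope gm/ID sizing for an OTA input pair.
    gm = 2*pi*GBW*CL (single-stage approximation); the chosen gm/ID
    sets the bias current, and a measured ID/W current density
    (here a single illustrative number) sets the width."""
    gm = 2 * math.pi * gbw_hz * c_load_f
    i_d = gm / gm_over_id
    width_um = i_d / id_density_a_per_um
    return gm, i_d, width_um

# Moderate inversion: gm/ID around 15 S/A (a typical textbook value).
gm, i_d, w = size_input_pair(gbw_hz=69e6, c_load_f=2e-12,
                             gm_over_id=15, id_density_a_per_um=5e-7)
print(f"gm = {gm*1e3:.3f} mS, ID = {i_d*1e6:.1f} uA, W = {w:.0f} um")
```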

Construction of Recombinant E. coli Expressing a Fusion Protein to Produce 1,3-Propanediol

In this study, a synthetic pathway was created by assembling genes from Clostridium butyricum and Escherichia coli in different combinations. Among the genes were dhaB1 and dhaB2 from C. butyricum VPI1718, coding for glycerol dehydratase (GDHt) and its activator (GDHtAc), respectively, which are involved in the conversion of glycerol to 3-hydroxypropionaldehyde (3-HPA). The yqhD gene from E. coli BL21 was also included, which codes for an NADPH-dependent 1,3-propanediol oxidoreductase isoenzyme (PDORI) reducing 3-HPA to 1,3-propanediol (1,3-PD). Molecular modeling analysis indicated that the conformation of the fusion protein of YQHD and DHAB1 was favorable for direct molecular channeling of the intermediate 3-HPA. According to the simulation results, the yqhD and dhaB1 genes were assembled upstream of dhaB2 to express a fusion protein, yielding the recombinant strain E. coli BL21(DE3)//pET22b+::yqhD-dhaB1_dhaB2 (strain BP41Y3). Strain BP41Y3 gave a 10-fold higher 1,3-PD concentration than strain BP31Y2, which expresses the recombinant enzymes simultaneously but in a non-fusion mode. This is the first report of a gene fusion approach used to enhance the biological conversion of glycerol to the value-added compound 1,3-PD.

The Effect of Harmonic Power Fluctuation on Estimating Flicker

Voltage flicker problems have long existed in several of the distribution areas served by the Taiwan Power Company. Past research results indicate that the ΔV10 value estimated by the conventional method is significantly smaller than the surveyed value. This paper studies the relationship between voltage flicker problems and harmonic power variation in power systems with electric arc furnaces, and discusses the effect of harmonic power fluctuation on the estimated flicker value. Field measurement, statistics, and simulation are used. The survey results demonstrate that ΔV10 estimates must account for the effect of harmonic power variation.
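The ΔV10 index itself is a perceptibility-weighted RMS of voltage-fluctuation components. A minimal sketch of the computation, with placeholder weighting coefficients rather than the standard 10 Hz-referred perceptibility curve:

```python
import math

def delta_v10(components):
    """Voltage-flicker index: delta_v10 = sqrt(sum_f (a_f * dV_f)^2),
    where a_f is the perceptibility coefficient of fluctuation frequency f
    relative to 10 Hz and dV_f the fluctuation amplitude (percent).
    `components` is a list of (a_f, dV_f) pairs; the values used below
    are illustrative placeholders, not measured arc-furnace data."""
    return math.sqrt(sum((a * dv) ** 2 for a, dv in components))

print(delta_v10([(0.5, 0.2), (1.0, 0.3), (0.8, 0.1)]))
```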

Awareness of Reading Strategies among EFL Learners at Bangkok University

This questionnaire-based study aimed to measure and compare the awareness of English reading strategies among EFL learners at Bangkok University (BU), classified by gender, field of study, and English learning experience. Proportional stratified random sampling was employed to formulate a sample of 380 BU students. The data were statistically analyzed in terms of the mean and standard deviation. A t-test was used to find differences in awareness of reading strategies between two groups (male and female; science and social science students). In addition, one-way analysis of variance (ANOVA) was used to compare reading strategy awareness among BU students with different lengths of English learning experience. The results indicated that the overall awareness of reading strategies of EFL learners at BU was at a high level (x̄ = 3.60) and that there was no statistically significant difference between males and females, or among students with different lengths of English learning experience, at the 0.05 significance level. However, significant differences among students from different fields of study were found at the same level of significance.
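The two-group comparison can be sketched with a two-sample t statistic; Welch's form, shown here, does not assume equal variances. The awareness scores below are made-up values for illustration, not the survey data:

```python
import math

def welch_t(sample_a, sample_b):
    """Two-sample t statistic (Welch's form):
    t = (mean_a - mean_b) / sqrt(s_a^2/n_a + s_b^2/n_b)."""
    def mean_var(s):
        m = sum(s) / len(s)
        v = sum((x - m) ** 2 for x in s) / (len(s) - 1)  # sample variance
        return m, v
    ma, va = mean_var(sample_a)
    mb, vb = mean_var(sample_b)
    return (ma - mb) / math.sqrt(va / len(sample_a) + vb / len(sample_b))

males = [3.4, 3.7, 3.5, 3.9, 3.6]      # illustrative awareness scores
females = [3.6, 3.8, 3.5, 3.7, 3.4]
print(round(welch_t(males, females), 3))
```

A |t| value well below the critical value is consistent with the study's finding of no significant gender difference.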

Evaluation of Guaiacol and Syringol Emission upon Wood Pyrolysis for some Fast Growing Species

Wood of Casuarina glauca, Casuarina cunninghamiana, Eucalyptus camaldulensis, and Eucalyptus microtheca was pyrolyzed at 450°C at a heating rate of 2.5°C/min in a flowing N2 atmosphere. The Eucalyptus species gave higher values of specific gravity, ash, total extractives, lignin, N2-liquid trap distillate (NLTD) and water trap distillate (WSP) than the Casuarina species. The GHC of the NLTD was higher for the Casuarina species than for the Eucalyptus species, with the highest value for Casuarina cunninghamiana. Guaiacol, 4-ethyl-2-methoxyphenol, and syringol were observed in the NLTD of all four wood species, reflecting their parent hardwood lignin origin. Eucalyptus camaldulensis wood had the highest lignin content (28.89%) and pyrolyzed to the highest contents of phenolics (73.01%), guaiacol (11.2%) and syringol (32.28%) in the methylene chloride fraction (MCF) of the NLTD. Accordingly, recovery of syringol and guaiacol from Eucalyptus camaldulensis may become economically attractive.

An Agent Based Dynamic Resource Scheduling Model with FCFS-Job Grouping Strategy in Grid Computing

Grid computing is a group of clusters connected over high-speed networks; it involves coordinating and sharing computational power, data storage and network resources operating across dynamic and geographically dispersed locations. Resource management and job scheduling are critical tasks in grid computing. Resource selection becomes challenging due to the heterogeneity and dynamic availability of resources. Job scheduling is an NP-complete problem, and different heuristics may be used to reach an optimal or near-optimal solution. This paper proposes a model for resource and job scheduling in a dynamic grid environment. The main focus is to maximize resource utilization and minimize the processing time of jobs. The grid resource selection strategy is based on a Max Heap Tree (MHT), which is well suited to large-scale applications; the root node of the MHT is selected for job submission. A job grouping concept is used to maximize resource utilization when scheduling jobs in grid computing. The proposed resource selection model and job grouping concept are used to enhance the scalability, robustness, efficiency and load balancing ability of the grid.
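The resource-selection and job-grouping ideas can be sketched as follows: a max-heap stands in for the MHT (its root is the most capable resource), and jobs are grouped in FCFS order up to that resource's capacity. The resource names, capacities, and job lengths are invented for the example; this is not the paper's full model.

```python
import heapq

def group_jobs_fcfs(job_lengths, capacity):
    """Group jobs in arrival (FCFS) order; close a group when adding
    the next job would exceed the resource's processing capacity."""
    groups, current, total = [], [], 0
    for length in job_lengths:
        if current and total + length > capacity:
            groups.append(current)
            current, total = [], 0
        current.append(length)
        total += length
    if current:
        groups.append(current)
    return groups

def schedule(jobs, resources):
    """Pick the most capable resource via a max-heap (heapq is a min-heap,
    so capacities are negated) and size FCFS job groups to its capacity.
    `resources` maps resource name -> capacity (e.g. MIPS * granularity)."""
    heap = [(-cap, name) for name, cap in resources.items()]
    heapq.heapify(heap)
    neg_cap, best = heap[0]            # root of the max-heap
    return best, group_jobs_fcfs(jobs, -neg_cap)

best, groups = schedule([30, 50, 40, 20, 60], {"r1": 100, "r2": 80, "r3": 120})
print(best, groups)
```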

Role of Personnel Planning in Business Continuity Management

Business continuity management (BCM) identifies potential external and internal threats to an organization and their impacts on business operations. The goal of this article is to identify the role of personnel planning in BCM, based on an analysis of employee turnover in organizations in the Czech Republic. The article is organized as follows. The first part concentrates on the theoretical background of the topic. The second part is dedicated to the evaluation of the outcomes of the questionnaire survey conducted, focusing on the analysis of employee turnover in organizations in the Czech Republic. The final part underlines the role of personnel planning in BCM, since poor planning of staff needs in an organization can represent a future threat to ensuring business continuity.

Environmental Analysis of the Zinc Oxide Nanophotocatalyst Synthesis

Nanophotocatalysts such as titanium (TiO2), zinc (ZnO), and iron (Fe2O3) oxides can be used in the oxidation of organic pollutants, among many other applications. However, among the challenges for technological application (scale-up) of nanotechnology research, two aspects remain little explored: the environmental risk of nanomaterial preparation methods, and the variability of nanomaterial properties and/or performance. An environmental analysis was performed for six different methods of ZnO nanoparticle synthesis, showing that it is possible to identify the most environmentally compatible process even in laboratory-scale research. The obtained ZnO nanoparticles were tested as photocatalysts and increased the degradation rate of the Rhodamine B dye by up to 30 times.

Modeling "Web of Trust" with Web 2.0

“Web of Trust" is one of the recognized goals for Web 2.0. It aims to make it possible for the people to take responsibility for what they publish on the web, including organizations, businesses and individual users. These objectives, among others, drive most of the technologies and protocols recently standardized by the governing bodies. One of the great advantages of Web infrastructure is decentralization of publication. The primary motivation behind Web 2.0 is to assist the people to add contents for Collective Intelligence (CI) while providing mechanisms to link content with people for evaluations and accountability of information. Such structure of contents will interconnect users and contents so that users can use contents to find participants and vice versa. This paper proposes conceptual information storage and linking model, based on decentralized information structure, that links contents and people together. The model uses FOAF, Atom, RDF and RDFS and can be used as a blueprint to develop Web 2.0 applications for any e-domain. However, primary target for this paper is online trust evaluation domain. The proposed model targets to assist the individuals to establish “Web of Trust" in online trust domain.

Bi-Criteria Latency Optimization of Intra- and Inter-Autonomous System Traffic Engineering

Traffic Engineering (TE) is the process of controlling how traffic flows through a network in order to facilitate efficient and reliable network operations while simultaneously optimizing network resource utilization and traffic performance. TE improves the management of data traffic within a network and provides better utilization of network resources. Many research works consider intra-AS and inter-AS Traffic Engineering separately, but in reality each influences the other; hence the network performance of both inter- and intra-Autonomous System (AS) traffic is not properly optimized. To achieve better joint optimization of intra- and inter-AS TE, we propose a joint optimization technique that considers intra-AS features during inter-AS TE and vice versa. This work considers an important criterion, latency, both within an AS and between ASes, and proposes a bi-criteria latency optimization model. Overall network performance can thus be improved, in terms of latency, by this joint optimization technique.
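In its simplest scalarized form, a bi-criteria formulation reduces to a weighted sum of the two latency criteria. A toy sketch follows; the candidate paths, latency values, and the weight alpha are invented for illustration and do not represent the paper's optimization model:

```python
def best_path(paths, alpha=0.5):
    """Bi-criteria path selection by weighted-sum scalarization:
    score(path) = alpha * intra_AS_latency + (1 - alpha) * inter_AS_latency.
    `paths` maps path name -> (intra_ms, inter_ms)."""
    def score(p):
        intra, inter = paths[p]
        return alpha * intra + (1 - alpha) * inter
    return min(paths, key=score)

# Illustrative candidates: (intra-AS latency, inter-AS latency) in ms.
candidates = {"via_AS7": (10.0, 40.0), "via_AS9": (25.0, 20.0), "via_AS12": (15.0, 35.0)}
print(best_path(candidates, alpha=0.5))
```

Sweeping alpha from 0 to 1 traces out the trade-off between the two criteria; a path that is never selected for any alpha is weighted-sum dominated.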

Injuries Related to Kitesurfing

Participation in sporting activities can lead to injury. Sport injuries have been widely studied in many sports, including the more extreme categories of aquatic board sports. Kitesurfing is a relatively new water-surface action sport and has not yet been widely studied in terms of injuries and stress on the body. The aim of this study was to obtain information about which injuries are most common among kitesurfing participants, where they occur, and their causes. Injuries were studied using an international open web questionnaire (n=206). Many respondents reported injuries: in total 251 injuries to the knee (24%), ankle (17%), trunk (16%) and shoulders (10%), often sustained while doing jumps and tricks (40%). Among the reported injuries were joint injuries (n=101), muscle/tendon damage (n=47), wounds and cuts (n=36) and bone fractures (n=28). Environmental factors and equipment can also influence the risk of injury, or the extent of injury in a hazardous situation. In conclusion, the information from this retrospective study supports earlier studies in terms of the prevalence and site of injuries. This information could be used to build a foundation of knowledge about the sport for the development of physical training applications and for product development.

Identifying Features and Parameters to Devise an Accurate Intrusion Detection System Using an Artificial Neural Network

The aim of this article is to explain how features of attacks can be extracted from packets, how feature vectors can be built, and how they are then applied to the input of an analysis stage. For analysis, the work deploys a feedforward backpropagation neural network acting as a misuse intrusion detection system, using ten types of attacks as examples for training and testing the network. It explains how packets are analyzed to extract features, and shows how selecting the right features, building correct vectors, and correctly choosing the training method and the number of nodes in the hidden layer affect the accuracy of the system. In addition, the work shows how to obtain optimal weight values and use them to initialize the Artificial Neural Network.
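The analysis stage can be sketched as a small feedforward network trained by backpropagation on packet-derived feature vectors. Everything below (the three features, the labels, and the network size) is a toy illustration, not the article's dataset or configuration:

```python
import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyNet:
    """Minimal feedforward net (one hidden layer) trained by backpropagation,
    standing in for the misuse-detection classifier described in the text."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = random.Random(seed)
        r = lambda: rng.uniform(-0.5, 0.5)
        self.w1 = [[r() for _ in range(n_in)] for _ in range(n_hidden)]
        self.b1 = [r() for _ in range(n_hidden)]
        self.w2 = [r() for _ in range(n_hidden)]
        self.b2 = r()

    def forward(self, x):
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        self.o = sigmoid(sum(w * h for w, h in zip(self.w2, self.h)) + self.b2)
        return self.o

    def train_step(self, x, y, lr=0.5):
        o = self.forward(x)
        d_o = (o - y) * o * (1 - o)                    # output-layer delta
        for j, h in enumerate(self.h):
            d_h = d_o * self.w2[j] * h * (1 - h)       # hidden-layer delta
            self.w2[j] -= lr * d_o * h
            self.b1[j] -= lr * d_h
            for i, xi in enumerate(x):
                self.w1[j][i] -= lr * d_h * xi
        self.b2 -= lr * d_o
        return (o - y) ** 2

# Toy "packet feature" vectors: [syn_rate, payload_entropy, port_scan_score]
data = [([0.9, 0.8, 0.9], 1.0), ([0.1, 0.3, 0.0], 0.0),
        ([0.8, 0.9, 0.7], 1.0), ([0.2, 0.1, 0.1], 0.0)]
net = TinyNet(n_in=3, n_hidden=4)
for epoch in range(2000):
    for x, y in data:
        net.train_step(x, y)
print(net.forward([0.9, 0.9, 0.8]), net.forward([0.1, 0.2, 0.0]))
```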

Evaluating Performance of Quality-of-Service Routing in Large Networks

The performance and complexity of QoS routing depend on the complex interaction between a large set of parameters. This paper investigates the scaling properties of source-directed link-state routing in large core networks. The simulation results show that the routing algorithm, network topology, and link cost function each have a significant impact on the probability of successfully routing new connections. The experiments confirm and extend the findings of other studies, and also lend new insight into designing efficient quality-of-service routing policies in large networks.
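Source-directed link-state routing computes the path at the source over its view of the topology, typically with Dijkstra's algorithm and a pluggable link-cost function. A minimal sketch with an inverse-bandwidth cost (the topology and costs are invented for the example):

```python
import heapq

def route(graph, src, dst, cost):
    """Dijkstra over the source's link-state view.
    `graph` maps node -> {neighbor: available_bandwidth};
    `cost` maps available bandwidth to a link cost."""
    dist, prev = {src: 0.0}, {}
    heap, visited = [(0.0, src)], set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        if u == dst:
            break
        for v, bw in graph[u].items():
            nd = d + cost(bw)
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    if dst not in dist:
        return None
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]

topo = {"A": {"B": 100, "C": 10}, "B": {"A": 100, "D": 100},
        "C": {"A": 10, "D": 10}, "D": {"B": 100, "C": 10}}
print(route(topo, "A", "D", cost=lambda bw: 1.0 / bw))
```

Swapping the cost function (hop count, inverse bandwidth, load-dependent cost) is exactly the knob whose impact on routing success the paper measures.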

The Fundamental Reliance of Iterative Learning Control on Stability Robustness

Iterative learning control aims to achieve zero tracking error of a specific command. This is accomplished by iteratively adjusting the command given to a feedback control system, based on the tracking error observed in the previous iteration. One would like the iterations to converge to zero tracking error in spite of any error present in the model used to design the learning law. First, this need for stability robustness is discussed, followed by the need for robustness of the property that the transients are well behaved. Methods of producing the needed robustness to parameter variations and to singular perturbations are presented. Then a method involving reverse-time runs is given that lets real-world behavior produce the ILC gains in such a way as to eliminate the need for a mathematical model. Since the real world produces the gains, there is no issue of model error. Provided the world behaves linearly, the approach gives an ILC law with both stability robustness and good transient robustness, without the need to generate a model.
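The core ILC update can be sketched in a few lines: after each trial, the command is corrected by a learning gain times the previous trial's tracking error, u_{k+1} = u_k + L e_k. The first-order plant and all gains below are illustrative stand-ins, not models or laws from the paper:

```python
def ilc_run(plant, reference, iterations=50, gain=1.0):
    """Basic iterative learning control: u_{k+1} = u_k + L * e_k,
    where e_k is the tracking error of trial k. `plant` maps a command
    sequence to an output sequence. Returns the learned command and the
    last trial's error."""
    u = [0.0] * len(reference)
    for _ in range(iterations):
        y = plant(u)
        e = [r - yi for r, yi in zip(reference, y)]
        u = [ui + gain * ei for ui, ei in zip(u, e)]
    return u, e

def lag_plant(u, a=0.2):
    # y[t] = a*y[t-1] + (1-a)*u[t]: a stable first-order system with
    # direct feedthrough, chosen so the learning iteration contracts.
    y, prev = [], 0.0
    for ut in u:
        prev = a * prev + (1 - a) * ut
        y.append(prev)
    return y

u, e = ilc_run(lag_plant, [1.0] * 10)
print(max(abs(ei) for ei in e))
```

Convergence here follows because the error iteration matrix I - L*P has small norm for this plant; with significant model error in L, that contraction, and hence stability robustness, is exactly what is at stake in the paper.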