Analytical Prediction of Seismic Response of Steel Frames with Superelastic Shape Memory Alloy

Superelastic shape memory alloy (SMA) is well suited for use in connections in steel structures. This study assesses the seismic behaviour of steel frames with SMA elements. Three eight-storey steel frames with different SMA systems are proposed: the first is braced with a diagonal bracing system, the second with a knee bracing system, while in the third the SMA is used as connections at the plastic hinge regions of the beams. Nonlinear time history analyses of the frames subjected to two different ground motion records have been performed using the SeismoStruct software. To evaluate the efficiency of the suggested systems, the dynamic responses of the frames were compared. From the comparison it can be concluded that using SMA elements is an effective way to improve the dynamic response of structures subjected to earthquake excitations. Implementing SMA braces can reduce the residual roof displacement. The shape memory alloy is effective in reducing the maximum displacement at the frame top and provides a large elastic deformation range. SMA connections are very effective in dissipating energy and reducing the total input energy of the whole frame under severe seismic ground motion. The SMA connection system is more effective in controlling the reaction forces at the frame base than the bracing systems, while using SMA as bracing is more effective in reducing displacements. The efficiency of SMA depends on the input ground motions as well as the construction system.

Method of Moments Applied to a Cuboidal Cavity Resonator: Effect of Gravitational Field Produced by a Black Hole

This paper deals with the formulation of Maxwell's equations in a cavity resonator in the presence of the gravitational field produced by a black hole. The space-time metric due to the black hole is the Schwarzschild metric, which is conventionally expressed in spherical polar coordinates. In order to adapt this metric to our problem, we consider it in a small region close to the black hole and express it locally in a Cartesian system.
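For reference, the Schwarzschild metric in its conventional spherical polar form (in geometrized units $G = c = 1$, with $M$ the black hole mass) is

```latex
ds^2 = -\left(1 - \frac{2M}{r}\right)dt^2
       + \left(1 - \frac{2M}{r}\right)^{-1}dr^2
       + r^2\left(d\theta^2 + \sin^2\theta\, d\varphi^2\right)
```

It is this line element that is re-expressed in local Cartesian coordinates over a small region near the black hole.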

Behavioral Signature Generation using Shadow Honeypot

A novel behavioral detection framework is proposed to detect zero-day buffer overflow vulnerabilities (based on network behavioral signatures) using zero-day exploits, instead of the signature-based or anomaly-based detection solutions currently available for IDPS techniques. First we present the detection model, which uses a shadow honeypot. Our system performs online processing of network attacks and generates a behavior detection profile. The detection profile is a dataset of 112 types of metrics describing the exact behavior of malware in the network. In this paper we present examples of generating behavioral signatures for two attacks: a buffer overflow exploit on an FTP server and the well-known Conficker worm. We visualize the important aspects by showing the differences between valid behavior and the attacks. Based on these metrics we can detect attacks with a very high probability of success; the detection process is, however, computationally expensive.

Effective Defect Prevention Approach in Software Process for Achieving Better Quality Levels

Defect prevention is the most vital but habitually neglected facet of software quality assurance in any project. If practised at all stages of software development, it can reduce the time, overheads and resources needed to engineer a high-quality product. The key challenge for the IT industry is to engineer a software product with minimum post-deployment defects. This effort is an analysis based on data obtained for five selected projects from leading software companies of varying software production competence. The main aim of this paper is to provide information on various methods and practices supporting defect detection and prevention, leading to successful software production. The defect prevention techniques unearth 99% of defects. Inspection is found to be an essential technique for producing near-ideal software through enhanced methodologies of aided and unaided inspection schedules. On average, 13%–15% of inspection and 25%–30% of testing out of the whole project effort time is required for 99%–99.75% defect elimination. A comparison of the end results for the five selected projects between the companies is also presented, throwing light on how a particular company may position itself with an appropriate complementary ratio of inspection to testing.

Design of a Carbon Silicon Electrode for Iontophoresis Treatment towards Alopecia

This study presents the design of a carbon-silicon electrode for iontophoresis treatment of alopecia. Alopecia is the medical term for loss of hair from the body. To treat it, the drug needs to be delivered into the scalp, so iontophoresis was chosen for this treatment. However, most common electrodes of iontophoresis devices are made of metal; such electrodes can hurt patients during use, and it is hard to keep the hair from attaching to them. For this reason, an electrode was made of silicon material to reduce the discomfort, with carbon material mixed in to increase conductance. Several stainless steel cones on the electrode enable it to part the hair and contact the affected area. According to the results of an in vivo experiment, the carbon-silicon electrode showed good performance and was comfortable during treatment.

Testing Loaded Programs Using Fault Injection Technique

Fault tolerance is critical in many of today's large computer systems. This paper focuses on improving fault tolerance through testing. In particular, it concentrates on memory faults: how to access the editable part of a process memory space and how this part is affected. A special Software Fault Injection Technique (SFIT) is proposed for this purpose. The technique sequentially scans the memory of the target process and tries to edit the maximum number of bytes inside that memory. It was implemented and tested on a group of programs in software packages such as jetAudio, Notepad, Microsoft Word, Microsoft Excel, and Microsoft Outlook. The results from the test sample processes indicate that the size of the scanned area depends on several factors: process size, process type, and the virtual memory size of the machine under test. The results show that increasing the process size increases the scanned memory space. They also show that input-output processes have a larger scanned area than other processes. Increasing the virtual memory size also affects the size of the scanned area, but only up to a certain limit.
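The core idea of injecting faults by editing bytes in a memory region can be sketched as follows. This is an illustrative simulation over a plain byte buffer, not the authors' SFIT implementation; the function and variable names are our own.

```python
import random

def inject_faults(memory: bytearray, n_faults: int, seed: int = 0) -> list:
    """Flip one random bit in each of n_faults randomly chosen bytes.

    Returns a list of (offset, original_byte, faulty_byte) records so the
    effect of each injected fault can be traced afterwards.
    """
    rng = random.Random(seed)
    log = []
    for _ in range(n_faults):
        offset = rng.randrange(len(memory))
        bit = 1 << rng.randrange(8)
        original = memory[offset]
        memory[offset] = original ^ bit   # single-bit fault
        log.append((offset, original, memory[offset]))
    return log

# A toy "process memory" region standing in for the editable segment.
region = bytearray(b"\x00" * 64)
faults = inject_faults(region, n_faults=4)
```

A real injector would instead attach to the target process and edit its address space through OS debugging interfaces; the bookkeeping of which bytes were edited, however, looks much the same.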

Encrypter Information Software Using Chaotic Generators

This paper presents software that implements several chaotic generators, both continuous-time and discrete-time. The software provides options for obtaining the different signals using various parameters and initial condition values, and it displays the critical parameters of each model. All of these models are capable of encrypting information, which the software also demonstrates.
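As an illustration of how a chaotic generator can encrypt information (a generic sketch, not the software described here), a discrete-time logistic map can be quantized into a keystream and XORed with the message; the parameter values below are only example choices in the chaotic regime:

```python
def logistic_keystream(x0: float, r: float, n: int) -> bytes:
    """Iterate the logistic map x -> r*x*(1-x) and quantize each state to a byte."""
    x, out = x0, bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def xor_crypt(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, key))

msg = b"secret message"
ks = logistic_keystream(x0=0.4, r=3.99, n=len(msg))
ct = xor_crypt(msg, ks)   # encrypt
pt = xor_crypt(ct, ks)    # decrypt: XOR is its own inverse
```

Both sides must share the initial condition and parameter; sensitivity to these values is exactly what makes chaotic generators attractive for this use.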

Prospective Mathematics Teachers' Views about Using Flash Animations in Mathematics Lessons

The purpose of the study is to determine secondary prospective mathematics teachers' views on using Flash animations in mathematics lessons and to reveal how sample presentations of different mathematical concepts altered those views. This is a case study involving three secondary prospective mathematics teachers from a state university in Turkey. The data were gathered from two semi-structured interviews. Findings revealed that these animations help learners understand mathematics meaningfully, relate mathematics to the real world, support visualization, and convey the importance of mathematics. The analysis of the data indicated that the sample presentations enhanced the participants' views about using Flash animations in mathematics lessons.

A Preliminary Study on the Suitability of Data Driven Approach for Continuous Water Level Modeling

Reliable water level forecasts are particularly important for warning against dangerous floods and inundation. The current study investigates the suitability of the adaptive network-based fuzzy inference system (ANFIS) for continuous water level modeling. A hybrid learning algorithm, which combines the least squares method and the back propagation algorithm, is used to identify the parameters of the network. For this study, water level data are available for the hydrological year 2002 with a sampling interval of 1 hour. The number of antecedent water levels that should be included in the input variables is determined by two statistical methods, namely the autocorrelation function and the partial autocorrelation function of the variables. Forecasting was done for 1 hour up to 12 hours ahead in order to compare the models' generalization at higher horizons. The results demonstrate that the ANFIS model can be applied successfully and provides high accuracy and reliability for river water level estimation. In general, ANFIS provides accurate and reliable water level predictions 1 hour ahead, achieving MAPE = 1.15% and correlation = 0.98. Up to 12 hours ahead, the model still shows relatively good performance, with a prediction error of less than 9.65%. The information gathered from these preliminary results provides useful guidance for the design of flood early warning systems in which the magnitude and timing of a potential extreme flood are indicated.
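The lag-selection step based on the autocorrelation function can be sketched as follows; the hourly water level series below is synthetic and the significance threshold of 0.2 is an illustrative choice, not a value from the study:

```python
def autocorr(series, lag):
    """Sample autocorrelation of a series at a given lag."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[t] - mean) * (series[t - lag] - mean) for t in range(lag, n))
    return cov / var

# Synthetic hourly water levels with a daily cycle: a slowly varying signal
# is strongly autocorrelated at short lags, so recent readings are good
# candidates for the antecedent inputs.
levels = [2.0 + 0.5 * (t % 24) / 24 for t in range(240)]
candidate_lags = [k for k in range(1, 13) if abs(autocorr(levels, k)) > 0.2]
```

In practice the same computation (plus the partial autocorrelation function) is run on the observed series, and the lags with significant correlation become the model's input variables.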

IVE: Virtual Humans AI Prototyping Toolkit

The IVE toolkit has been created to facilitate research, education and development in the field of virtual storytelling and computer games. Primarily, the toolkit is intended for modelling action selection mechanisms of virtual humans, investigating level-of-detail AI techniques for large virtual environments, and exploring joint behaviour and the role-passing technique (Sec. V). Additionally, the toolkit can be used as AI middleware without any changes. The main facility of IVE is that it serves for prototyping both the AI and the virtual worlds themselves. The purpose of this paper is to describe IVE's features in general and to present our current work, including an educational game, on this platform. Keywords: AI middleware, simulation, virtual world.

On the Properties of Pseudo Noise Sequences with a Simple Proposal of Randomness Test

Maximal length sequences (m-sequences) are also known as pseudo random or pseudo noise sequences because they closely satisfy Golomb's popular randomness postulates: (P1) balance, (P2) run, and (P3) ideal autocorrelation. Apart from these, there also exist certain other, less known properties of such sequences, all of which are discussed in this tutorial paper. Comprehensive proofs of each of these properties are provided for a better understanding of such sequences. A simple test is also proposed at the end of the paper to distinguish pseudo noise sequences from truly random sequences such as Bernoulli sequences.
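As a small illustration of these properties (not part of the paper itself), an m-sequence can be generated with a linear feedback shift register and checked against the balance postulate (P1); the 4-stage register below uses the primitive polynomial x^4 + x + 1, giving a period of 2^4 - 1 = 15:

```python
def lfsr_msequence(taps, seed, length):
    """Generate a binary sequence from a Fibonacci LFSR with the given
    feedback tap positions; primitive taps yield an m-sequence."""
    state = seed
    n = max(taps)
    out = []
    for _ in range(length):
        out.append(state & 1)                 # output the lowest stage
        fb = 0
        for t in taps:
            fb ^= (state >> (t - 1)) & 1      # XOR of tapped stages
        state = (state >> 1) | (fb << (n - 1))
    return out

# Taps (4, 1) correspond to x^4 + x + 1; any nonzero seed works.
seq = lfsr_msequence(taps=[4, 1], seed=0b1000, length=15)
ones, zeros = seq.count(1), seq.count(0)  # balance: ones exceed zeros by one
```

Over one full period the sequence contains 8 ones and 7 zeros, exactly the one-excess that the balance property (P1) predicts for n = 4.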

A UML Statechart Diagram-Based MM-Path Generation Approach for Object-Oriented Integration Testing

MM-Path, an acronym for Method/Message Path, describes the dynamic interactions between methods in object-oriented systems. This paper discusses the classification of MM-Paths based on the characteristics of object-oriented software: we categorize them according to their generation reasons, effect scope and composition. A formalized representation of MM-Path is also proposed, which takes into account the influence of state on the response method sequences of messages. Moreover, an automatic MM-Path generation approach based on UML Statechart diagrams is presented, which solves the difficulties in identifying and generating MM-Paths. As a result, it provides a solid foundation for further research on test case generation based on MM-Path.

A Comparative Analysis of Fuzzy, Neuro-Fuzzy and Fuzzy-GA Based Approaches for Software Reusability Evaluation

Software reusability is a primary attribute of software quality. There are metrics for identifying the quality of reusable components, but the function that uses these metrics to determine the reusability of software components is still not clear. If identified in the design phase, or even in the coding phase, these metrics can help to reduce rework by improving the quality of reuse of the component, and hence improve productivity through a probabilistic increase in the reuse level. In this paper, we have devised a framework of metrics that takes McCabe's cyclomatic complexity measure for complexity, the regularity metric, the Halstead software science indicator for volume, the reuse frequency metric and the coupling metric of the software component as input attributes and calculates the reusability of the component. A comparative analysis of fuzzy, neuro-fuzzy and fuzzy-GA approaches to evaluating the reusability of software components is performed, and the fuzzy-GA results outperform the other approaches. The developed reusability model has produced high-precision results in line with the expectations of human experts.
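A minimal sketch of the fuzzy part of such an evaluation, assuming triangular membership functions and min as the fuzzy AND; the input values, membership parameters and single rule below are hypothetical, not the paper's actual rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical normalized inputs: cyclomatic complexity, Halstead volume,
# and reuse frequency of a component, each scaled to [0, 1].
complexity, volume, reuse_freq = 0.3, 0.4, 0.8

# Degree to which each input is "favourable" for reuse (low complexity,
# low volume, high frequency); the single rule uses min as the fuzzy AND.
low_complexity = tri(complexity, -0.5, 0.0, 0.6)
low_volume     = tri(volume, -0.5, 0.0, 0.6)
high_frequency = tri(reuse_freq, 0.4, 1.0, 1.5)
reusability = min(low_complexity, low_volume, high_frequency)
```

A full system would aggregate many such rules and defuzzify the result; the GA part then tunes the membership parameters, which is what lets the fuzzy-GA variant outperform the hand-set fuzzy model.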

High Efficiency, Selectivity against Cancer Cell Line of Purified L-Asparaginase from Pathogenic Escherichia coli

L-asparaginase was extracted from pathogenic Escherichia coli isolated from urinary tract infection patients. The enzyme was purified 96-fold by ultrafiltration, ion exchange and gel filtration, giving a 39.19% yield with a final specific activity of 178.57 IU/mg. L-asparaginase showed a molecular weight of 138,356 ± 1,000 Da and a molecular mass of 31,024 ± 100 Da. Kinetic analysis of the enzyme gave a Km of 1.25×10⁻⁵ mM and a Vmax of 2.5×10⁻³ M/min. L-asparaginase showed maximum activity at pH 7.5 when incubated at 37 °C for 30 min, retained its full activity (100%) after 15 min of incubation at 20–37 °C, and lost 70% of its activity when incubated at 60 °C. L-asparaginase showed cytotoxicity towards the U937 cell line with an IC50 of 0.5 ± 0.19 IU/ml and a selectivity index (SI = 7.6) about 8-fold higher than for lymphocyte cells. Therefore, local pathogenic E. coli strains may be used as a source of a high yield of L-asparaginase to produce an anticancer agent with high selectivity.
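Kinetic constants such as Km and Vmax are typically estimated from initial-rate data; a minimal sketch using the Lineweaver–Burk linearization of the Michaelis–Menten equation (with synthetic, noise-free data rather than the study's measurements) is:

```python
def lineweaver_burk(S, v):
    """Estimate Vmax and Km by least-squares fit of the double-reciprocal
    form 1/v = (Km/Vmax)*(1/S) + 1/Vmax."""
    xs = [1.0 / s for s in S]
    ys = [1.0 / x for x in v]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    vmax = 1.0 / intercept
    km = slope * vmax
    return vmax, km

# Synthetic, noise-free rates from v = Vmax*S/(Km + S) with Vmax=2.0, Km=0.5.
S = [0.1, 0.25, 0.5, 1.0, 2.0, 4.0]
v = [2.0 * s / (0.5 + s) for s in S]
vmax, km = lineweaver_burk(S, v)   # recovers Vmax ≈ 2.0, Km ≈ 0.5
```

With real, noisy rate data a direct nonlinear fit of the Michaelis–Menten equation is usually preferred, since the reciprocal transform amplifies errors at low substrate concentrations.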

Feature Based Unsupervised Intrusion Detection

The goal of a network-based intrusion detection system is to classify network traffic activities into two major categories: normal and attack (intrusive) activities. Nowadays, data mining and machine learning play an important role in many sciences, including intrusion detection systems (IDS), using both supervised and unsupervised techniques. One of the essential steps of data mining is feature selection, which helps to improve the efficiency, performance and prediction rate of the proposed approach. This paper applies the unsupervised K-means clustering algorithm with information gain (IG) for feature selection and reduction to build a network intrusion detection system. For our experimental analysis, we have used the new NSL-KDD dataset, a modified version of the KDD Cup 1999 intrusion detection benchmark dataset. With a split of 60.0% for the training set and the remainder for the testing set, a two-class classification (Normal, Attack) was implemented. The Weka framework, a Java-based open-source collection of machine learning algorithms for data mining tasks, was used in the testing process. The experimental results show that the proposed approach is very accurate, with a low false positive rate and a high true positive rate, and takes less learning time than using the full feature set of the dataset with the same algorithm.
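The information gain criterion used for feature selection can be sketched as follows; the toy records below are illustrative, not drawn from NSL-KDD:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """IG(class; feature) = H(class) - sum_v p(v) * H(class | feature = v)."""
    n = len(labels)
    h = entropy(labels)
    for value in set(feature_values):
        subset = [l for f, l in zip(feature_values, labels) if f == value]
        h -= len(subset) / n * entropy(subset)
    return h

# Toy records (protocol feature vs. class label). A feature that separates
# the classes perfectly has IG equal to the class entropy; an uninformative
# feature has IG 0.
protocol = ["tcp", "tcp", "udp", "udp"]
labels   = ["attack", "attack", "normal", "normal"]
ig = information_gain(protocol, labels)   # 1.0 bit for this perfect split
```

Ranking all features by IG and keeping only the top-scoring ones is what shrinks the input to the K-means clustering step.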

Improving Location Management in Mobile IPv4 Networks

The Mobile IP standard has been developed to support mobility over the Internet. This standard has several drawbacks: packets may be routed via sub-optimal paths, and a significant number of signaling messages is generated by the home registration procedure, which keeps the network aware of the current location of the mobile nodes. Recently, a dynamic hierarchical mobility management strategy for Mobile IP networks (DHMIP) has been proposed to reduce home registration costs. However, this strategy induces a packet delivery delay and increases the risk of packet loss. In this paper, we propose an enhanced version of the dynamic hierarchical strategy that reduces the packet delivery delay and minimizes the risk of packet loss. Preliminary simulation results are promising: they show that the enhanced version outperforms the original DHMIP strategy.

A Face-to-Face Education Support System Capable of Lecture Adaptation and Q&A Assistance Based On Probabilistic Inference

Keys to high-quality face-to-face education are ensuring flexibility in the way lectures are given, and providing care and responsiveness to learners. This paper describes a face-to-face education support system designed to raise the satisfaction of learners and reduce the workload on instructors. The system consists of a lecture adaptation assistance part, which assists instructors in adapting teaching content and strategy, and a Q&A assistance part, which provides learners with answers to their questions. The core component of the former part is a "learning achievement map", which is built on a Bayesian network (BN). From learners' performance in exercises on relevant past lectures, the lecture adaptation assistance part obtains the information required to appropriately adapt the presentation of the next lecture. The core component of the Q&A assistance part is a case base, which accumulates cases consisting of questions expected from learners and answers to them. The Q&A assistance part is a case-based search system equipped with a search index that performs probabilistic inference. A prototype face-to-face education support system intended for the teaching of Java programming has been built, and the approach was evaluated using this system. The expected degree of understanding of each learner for a future lecture was derived from his or her performance in exercises on past lectures, and this expected degree was used to select one of three adaptation levels. A model for determining the adaptation level most suitable for the individual learner has been identified. An experimental case base was built to examine the search performance of the Q&A assistance part, and it was found that the rate of successfully finding an appropriate case was 56%.

Effective Collaboration in Product Development via a Common Sharable Ontology

To achieve competitive advantage nowadays, most industrial companies consider that success rests on effective product development, that is, on managing the product throughout its entire lifetime, from design and manufacture through operation to disposal. Achieving this goal requires tight collaboration between partners from a wide variety of domains, involving various product data types and formats as well as different software tools. So far, the lack of a meaningful unified representation of product data semantics has slowed down efficient product development. This paper proposes an ontology-based approach to enable such semantic interoperability. A generic and extensible product ontology is described, gathering the main concepts pertaining to the mechanical field and the relations that hold among them. The ontology is not exhaustive; nevertheless, it shows that such a unified representation is possible and easily exploitable. This is illustrated through a case study with an example product and some semantic requests to which the ontology responds quite easily. The study proves the efficiency of ontologies as a support for product data exchange and information sharing, especially in product development environments where collaboration is not just a choice but a mandatory prerequisite.

Pathological Truth: The Use of Forensic Science in Kenya’s Criminal Justice System

Assassinations of politicians, school mass murders, purported suicides, aircraft crashes, mass shootings by police, sinking of sea ferries, mysterious car accidents, mass fire deaths and horrific terror attacks are some of the cases that bring forth scientific and legal conflicts. Questions about truth, justice and human rights are raised by both victims and perpetrators/offenders as they seek to understand why and how it happened to them. This kind of questioning manifests itself in medical, criminological, legal, psychological and scientific realms. An agreement towards truth investigations for possible legal, political and psychological transitory issues such as prosecution, victim-offender mediation, healing, reconciliation, amnesty, reparation, restitution, and policy formulation is seen as one way of transforming these conflicts. Forensic scientists, and pathologists in particular, have formed professional groups where the complexities between legal truth and scientific truth are dramatized and elucidated within the anatomy of courtrooms. This paper focuses on how pathological truth and legal truth interact with each other in Kenya's criminal justice system.

Optimal Location of Multi Type Facts Devices for Multiple Contingencies Using Particle Swarm Optimization

In the deregulated operating regime, power system security is an issue that needs due attention from researchers in the context of the unbundling of generation and transmission. Electric power systems are exposed to various contingencies, which often contribute to overloading of branches and violation of voltage limits, leading to security and stability problems. To maintain the security of such systems, it is desirable to estimate the effect of contingencies so that pertinent control measures can be taken to improve system security. This paper presents the application of the particle swarm optimization algorithm to find the optimal locations of multiple types of FACTS devices in a power system in order to eliminate or alleviate line overloads. The optimization is performed over the locations of the devices, their types, their settings and the installation cost of the FACTS devices, for single and multiple contingencies. TCSC, SVC and UPFC devices are considered and modeled for steady-state analysis. Suitable locations for the UPFC and TCSC are selected on the basis of improved system security. The effectiveness of the proposed method is tested on the IEEE 6-bus and IEEE 30-bus test systems.
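A minimal particle swarm optimizer of the kind applied here can be sketched as follows; the inertia and acceleration coefficients and the stand-in objective are illustrative choices, not the paper's actual security objective:

```python
import random

def pso(objective, dim, n_particles=20, iters=100, lo=-5.0, hi=5.0, seed=1):
    """Minimal particle swarm: each particle tracks its personal best and
    the swarm shares a global best (inertia 0.7, c1 = c2 = 1.5)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:                  # update personal best
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:                 # update global best
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in objective: sum of squares. A real study would minimize a security
# index combining line overloads, voltage violations and device cost, with
# some dimensions encoding discrete device locations and types.
best, best_val = pso(lambda x: sum(xi * xi for xi in x), dim=3)
```

For FACTS placement, the continuous positions are typically decoded into bus/line indices and device settings before the objective is evaluated.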