Traffic Signal Design and Simulation for Vulnerable Road User Safety and Bus Preemption

Most pedestrian-vehicle accidents at signalized intersections occur because pedestrians cannot cross the intersection safely within the green interval. From the pedestrian's viewpoint, there are two main reasons. The first is that some pedestrians, such as the elderly, cannot walk fast enough to clear the intersection. The other is that pedestrians do not sense that the signal phase is about to change and that they are about to lose their right-of-way. The first purpose of this study is to develop signal logic that protects pedestrians who are crossing an intersection. Another purpose is to improve the reliability and reduce the delay of public transportation service; therefore, bus preemption is also incorporated into the designed signal logic. In this study, traffic data from the intersection of Chong-Qing North Road and Min-Zu West Road in Taipei, Taiwan, are employed to calibrate and validate the signal logic by simulation, using VISSIM 5.20, a microscopic traffic simulation package. The simulation results show that the signal logic presented in this study successfully protects pedestrians crossing the intersection, and that the bus preemption design reduces average bus delay. However, the pedestrian-safety and bus-preemption signals substantially increase the average delay of cars, so whether to apply this signal logic at an isolated intersection should be evaluated carefully.
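
As a rough, hypothetical sketch of the kind of controller logic the study describes (the actual logic is implemented and evaluated inside VISSIM 5.20 and is not reproduced in the abstract), the following shows a pedestrian green-extension rule combined with a bus preemption hand-off; all detector inputs and timing parameters are illustrative assumptions.

```python
# Hypothetical signal-controller sketch: extend the pedestrian green while a
# crossing pedestrian is still detected, then hand right-of-way to a detected
# bus. Detector names and timings are illustrative, not from the paper.

MIN_GREEN = 15       # minimum pedestrian green (s)
MAX_GREEN = 35       # upper bound after extensions (s)
EXTEND_STEP = 2      # green added per pedestrian detection (s)

def pedestrian_phase_decision(elapsed: float, ped_in_crosswalk: bool) -> str:
    """Decide whether to HOLD, EXTEND, or END the pedestrian green phase."""
    if elapsed < MIN_GREEN:
        return "HOLD"                     # never cut the minimum green
    if ped_in_crosswalk and elapsed + EXTEND_STEP <= MAX_GREEN:
        return "EXTEND"                   # protect slow pedestrians
    return "END"

def next_phase(normal_sequence: list, bus_detected: bool) -> str:
    # Bus preemption: once the pedestrian phase has safely ended, a detected
    # bus promotes its approach to the front of the normal phase sequence.
    return "BUS_APPROACH" if bus_detected else normal_sequence[0]
```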

Heterogeneous Artifacts Construction for Software Evolution Control

Software evolution control requires a deep understanding of changes and of their impact on the system's heterogeneous artifacts. An understanding of the descriptive knowledge embodied in the developed software artifacts is a prerequisite for the success of the evolutionary process. Implementing an evolutionary process means making changes, more or less significant, to many heterogeneous software artifacts such as source code, analysis and design models, unit tests, XML deployment descriptors, user guides, and others. These changes can degrade the modified software in functional, qualitative, or behavioral terms. Hence the need for a unified approach to extracting and representing the different heterogeneous artifacts, in order to provide a unified and detailed description of them that is exploitable by several software tools and that allows those responsible for the evolution to reason about the change concerned.

Static Voltage Stability Assessment Considering Power System Contingencies Using the Continuation Power Flow Method

Owing to the increasing utilization of the power system, transmission lines and power plants often operate close to their stability boundaries, and the system may lose its stable condition through overloading or the occurrence of a disturbance. For these reasons, the prediction and recognition of voltage instability in the power system is of particular importance and strengthens network security. This paper analyzes static voltage stability using the continuation power flow method, considering power system contingencies through their effects on the Megawatt Margin (MWM) and the maximum loading point. The study has been carried out on the IEEE 14-Bus Test System using Matlab and the PSAT toolbox, and the results are presented.
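
For reference, the continuation power flow on which the assessment rests augments the power flow equations with a loading parameter and alternates predictor and corrector steps; in its standard textbook form (summarized here, since the abstract does not restate it):

\[
f(x,\lambda) = 0, \qquad P_d(\lambda) = (1+\lambda)\,P_{d0}, \quad Q_d(\lambda) = (1+\lambda)\,Q_{d0},
\]

where $x$ collects the bus voltage magnitudes and angles and $\lambda$ scales the base-case load. A predictor step follows the tangent vector,

\[
\begin{bmatrix} f_x & f_\lambda \end{bmatrix}
\begin{bmatrix} dx \\ d\lambda \end{bmatrix} = 0,
\qquad
e_k^{\top}\begin{bmatrix} dx \\ d\lambda \end{bmatrix} = 1,
\]

and a corrector step re-solves $f(x,\lambda)=0$ by Newton's method with one state component held fixed, which keeps the Jacobian well conditioned near the nose of the PV curve. The maximum loading point is reached where $d\lambda$ changes sign, and the MW margin is the distance from the base-case load to that point.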

Fuel Consumption and a Non-Linear Model of the Metropolitan and Large City Transportation System

National economic development affects vehicle ownership, which ultimately increases fuel consumption. The rise in vehicle ownership is dominated by the increasing number of motorcycles. This research aims to identify and analyze the characteristics of fuel consumption and of the city transportation system, and to analyze the relationship between the city transportation system and fuel consumption and the effect of the former on the latter. A multivariable analysis is used in this study; the data are analyzed with a multivariate multivariable technique using the R software. More than 84% of the fuel on Java is consumed in metropolitan and large cities. The city transportation system variables that most strongly affect fuel consumption are population, public vehicles, private vehicles, and private buses. The method can be developed to control fuel consumption by taking the urban transport system and city typology into account; the resulting effect can reduce the fuel subsidy and improve the state economy.

Modeling Ecological Responses of Some Forage Legumes in Iran

Grasslands of Iran face vast desertification and destruction. Some legumes are plants of forage importance with high palatability. The legumes studied in this project are Onobrychis, Medicago sativa (alfalfa), and Trifolium repens. Seeds were cultivated in the research field of Kaboutarabad (33 km east of Isfahan, Iran), which receives an average of 80 mm annual rainfall. Plants were cultivated in a split-plot design with three replicates and two water treatments (weekly irrigation, and water stress with the same amount applied at 15-day intervals). Water entering each plot was measured by a Parshall flume. The project lasted 20 weeks. Destructive samplings (1 m2 each time) were done weekly. At each sampling, plants were gathered and each vegetative part was weighed separately. An area meter (Vista) was used to measure root surface and leaf area. Total shoot and root fresh and dry weight, leaf area index, and soil coverage were also evaluated. Dry weight was obtained in a 75°C oven after 24 hours. Statgraphics and Harvard Graphics software were used to formulate and present the parameter curves over time. Our results show that Trifolium repens was affected 60% and Medicago sativa 18% by water stress, while Onobrychis total fresh weight was reduced by 45%. Dry weight, or biomass, of alfalfa is not greatly affected by water shortage, which means that in alfalfa fields the irrigation amount can be decreased while roughly the same biomass is maintained; Onobrychis, by contrast, shows a drastic decrease in biomass. The increases in total dry matter over time in the studied plants are formulated. For Trifolium repens, if removal or cattle entrance to the meadows does not occur at the right time, the palatability and water content of the shoots will decrease. Short-term water stress can develop the root system of Trifolium repens, but if it lasts longer, other ecological and soil factors will affect the growth of this plant. A low level of soil water is not very important for the studied forage legumes, but water shortage affects the palatability and water content of the aerial parts. Leaf area over time in the studied legumes is formulated; in fact, leaf area decreases with a shortage of available water, and higher leaf area means higher forage and biomass production. Medicago and Onobrychis reach maximum leaf area sooner than Trifolium and are able to produce an optimum soil cover and inhibit the loss of soil water from meadows. The correlation of root surface to total biomass in the studied plants is formulated. Medicago under water stress shows a 40% decrease in crown cover, whereas under optimum conditions crown cover reaches 100%. In order to produce forage without soil erosion, Medicago is the best choice even with a shortage of water resources. This work attempts to represent the growth simulation of three well-known forage legumes. With growth simulation, farmers and range managers can better decide which plant is best adapted to the available water, without designing time- and labor-consuming field experiments.

Analytical Prediction of Seismic Response of Steel Frames with Superelastic Shape Memory Alloy

Superelastic Shape Memory Alloy (SMA) is well accepted for use in connections in steel structures. This study assesses the seismic behaviour of steel frames with SMA. Three eight-storey steel frames with different SMA systems are considered: the first is braced with a diagonal bracing system, the second with a knee bracing system, and in the third the SMA is used as connections at the plastic hinge regions of the beams. Nonlinear time-history analyses of the steel frames with SMA subjected to two different ground motion records have been performed using the Seismostruct software. To evaluate the efficiency of the suggested systems, the dynamic responses of the frames were compared. From the comparison, it can be concluded that using SMA elements is an effective way to improve the dynamic response of structures subjected to earthquake excitations. Implementing SMA braces can reduce the residual roof displacement. The shape memory alloy is effective in reducing the maximum displacement at the frame top, and it provides a large elastic deformation range. SMA connections are very effective in dissipating energy and reducing the total input energy of the whole frame under severe seismic ground motion. The SMA connection system is more effective in controlling the reaction forces at the frame base than the other bracing systems, while SMA bracing is more effective in reducing displacements. The efficiency of SMA depends on the input ground motions and on the construction system as well.

Effective Defect Prevention Approach in Software Process for Achieving Better Quality Levels

Defect prevention is the most vital but habitually neglected facet of software quality assurance in any project. If applied at all stages of software development, it can reduce the time, overheads, and resources required to engineer a high-quality product. The key challenge for an IT industry is to engineer a software product with a minimum of post-deployment defects. This effort is an analysis based on data obtained for five selected projects from leading software companies of varying software production competence. The main aim of this paper is to provide information on the various methods and practices supporting defect detection and prevention, leading to successful software generation. The defect prevention technique unearths 99% of defects. Inspection is found to be an essential technique for generating ideal software through enhanced methodologies of aided and unaided inspection schedules. On average, 13-15% of the whole project effort time spent on inspection and 25-30% spent on testing is required to eliminate 99-99.75% of defects. A comparison of the end results for the five selected projects between the companies is also presented, throwing light on the possibility for a particular company to position itself with an appropriate complementary ratio of inspection to testing.

Testing Loaded Programs Using Fault Injection Technique

Fault tolerance is critical in many of today's large computer systems. This paper focuses on improving fault tolerance through testing, concentrating on memory faults: how to access the editable part of a process's memory space and how that part is affected. A special Software Fault Injection Technique (SFIT) is proposed for this purpose. It works by sequentially scanning the memory of the target process and trying to edit the maximum number of bytes inside that memory. The technique was implemented and tested on a group of programs from software packages such as jetAudio, Notepad, Microsoft Word, Microsoft Excel, and Microsoft Outlook. The results from the test sample processes indicate that the size of the scanned area depends on several factors: process size, process type, and the virtual memory size of the machine under test. The results show that increasing the process size increases the scanned memory space, and that input-output processes have a larger scanned area than other processes. Increasing the virtual memory size also affects the size of the scanned area, but only up to a certain limit.
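
As a hedged sketch of the scanning step (Windows-only, and not the paper's actual SFIT code), the following walks a target process's address space and totals the committed, writable bytes that a fault injector could edit:

```python
# Sketch only: walk the target process's address space with VirtualQueryEx and
# total the committed, writable bytes, i.e. the candidate "editable" area. An
# actual injector would then flip bytes there with WriteProcessMemory, which
# additionally requires the PROCESS_VM_WRITE and PROCESS_VM_OPERATION rights.
import ctypes
from ctypes import wintypes

kernel32 = ctypes.windll.kernel32
PROCESS_QUERY_INFORMATION = 0x0400
PROCESS_VM_READ = 0x0010
MEM_COMMIT = 0x1000
WRITABLE = {0x04, 0x08, 0x40, 0x80}  # PAGE_READWRITE, PAGE_WRITECOPY,
                                     # PAGE_EXECUTE_READWRITE, PAGE_EXECUTE_WRITECOPY

class MEMORY_BASIC_INFORMATION(ctypes.Structure):
    _fields_ = [("BaseAddress", ctypes.c_void_p),
                ("AllocationBase", ctypes.c_void_p),
                ("AllocationProtect", wintypes.DWORD),
                ("RegionSize", ctypes.c_size_t),
                ("State", wintypes.DWORD),
                ("Protect", wintypes.DWORD),
                ("Type", wintypes.DWORD)]

def editable_bytes(pid: int) -> int:
    """Total size of committed, writable memory regions in process `pid`."""
    handle = kernel32.OpenProcess(
        PROCESS_QUERY_INFORMATION | PROCESS_VM_READ, False, pid)
    mbi = MEMORY_BASIC_INFORMATION()
    address, total = 0, 0
    while kernel32.VirtualQueryEx(handle, ctypes.c_void_p(address),
                                  ctypes.byref(mbi), ctypes.sizeof(mbi)):
        if mbi.State == MEM_COMMIT and mbi.Protect in WRITABLE:
            total += mbi.RegionSize
        address = (mbi.BaseAddress or 0) + mbi.RegionSize
    kernel32.CloseHandle(handle)
    return total
```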

Information Encryption Software Using Chaotic Generators

This paper presents a software tool that implements several chaotic generators, both continuous-time and discrete-time. The software offers the option of obtaining the different signals for different parameter and initial-condition values, and it displays the critical parameters for each model. All of these models are capable of encrypting information, and the software demonstrates this as well.
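
As a concrete illustration of the encryption principle (a minimal sketch under assumed parameter values, not the tool's actual implementation), a discrete-time chaotic generator such as the logistic map can drive a simple XOR stream cipher:

```python
# Minimal sketch of chaotic encryption with a discrete-time generator, the
# logistic map x_{k+1} = r * x_k * (1 - x_k); parameters (x0, r) act as the key.

def logistic_keystream(x0: float, r: float, n: int) -> bytes:
    """Iterate the logistic map and quantize each state to one byte."""
    x, out = x0, bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def encrypt(message: bytes, x0: float = 0.3141, r: float = 3.9999) -> bytes:
    # XOR with the chaotic keystream; decryption is the same operation,
    # since identical (x0, r) regenerate an identical keystream.
    ks = logistic_keystream(x0, r, len(message))
    return bytes(m ^ k for m, k in zip(message, ks))

ciphertext = encrypt(b"secret data")
assert encrypt(ciphertext) == b"secret data"   # round trip with the same key
```

The sensitivity to initial conditions is what makes such generators attractive here: a tiny change in x0 or r yields a completely different keystream.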

A UML Statechart Diagram-Based MM-Path Generation Approach for Object-Oriented Integration Testing

MM-Path, an acronym for Method/Message Path, describes the dynamic interactions between methods in object-oriented systems. This paper discusses classifications of MM-Paths based on the characteristics of object-oriented software; we categorize them according to the generation reasons, the effect scope, and the composition of the MM-Path. A formalized representation of MM-Path is also proposed, which takes into account the influence of state on the response method sequences of messages. Moreover, an automatic MM-Path generation approach based on UML Statechart diagrams is presented, which resolves the difficulties in identifying and generating MM-Paths. As a result, it provides a solid foundation for further research on test case generation based on MM-Paths.
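
As a rough illustration of statechart-based generation (a simplification of the paper's approach, with a hypothetical example object), the sketch below models a statechart as a transition graph labelled with triggering methods and enumerates the method sequences from an initial to a final state:

```python
# Illustrative sketch only: enumerate method-sequence paths from a statechart
# modelled as {state: [(trigger_method, next_state), ...]}. The paper's actual
# MM-Path generation approach is richer than this simple path enumeration.
from typing import Dict, List, Tuple

Statechart = Dict[str, List[Tuple[str, str]]]

def mm_paths(chart: Statechart, state: str, final: str,
             path=None) -> List[List[str]]:
    """Depth-first enumeration of trigger-method sequences from state to final."""
    path = path or []
    if state == final:
        return [path]
    results = []
    for method, nxt in chart.get(state, []):
        if method not in path:        # avoid re-firing the same method in a path
            results += mm_paths(chart, nxt, final, path + [method])
    return results

# Hypothetical account object: Idle --open--> Active --close--> Closed
chart = {"Idle": [("open", "Active")],
         "Active": [("deposit", "Active"), ("close", "Closed")]}
print(mm_paths(chart, "Idle", "Closed"))
# [['open', 'deposit', 'close'], ['open', 'close']]
```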

A Comparative Analysis of Fuzzy, Neuro-Fuzzy and Fuzzy-GA Based Approaches for Software Reusability Evaluation

Software reusability is a primary attribute of software quality. There are metrics for identifying the quality of reusable components, but the function that combines these metrics into a reusability estimate for a software component is still not clear. If these metrics are identified in the design phase, or even in the coding phase, they can help to reduce rework by improving the quality of reuse of the component, and hence improve productivity due to a probabilistic increase in the reuse level. In this paper, we devise a framework of metrics that takes McCabe's Cyclomatic Complexity Measure for complexity measurement, the Regularity Metric, Halstead's Software Science Indicator for volume indication, the Reuse Frequency metric, and the Coupling Metric values of the software component as input attributes, and calculates the reusability of the component. A comparative analysis of fuzzy, neuro-fuzzy, and fuzzy-GA approaches is performed to evaluate the reusability of software components, and the fuzzy-GA results outperform the other approaches. The developed reusability model has produced high-precision results, as expected by the human experts.
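
As a toy illustration of the fuzzy variant (the paper's calibrated model uses all five metrics and tunes the system, e.g. with a GA), the following sketch maps two normalized input metrics to a reusability score through two hand-written rules; all membership functions and rule weights are illustrative assumptions:

```python
# Hand-rolled fuzzy sketch: map two of the five input metrics (cyclomatic
# complexity and coupling, both normalized to [0, 1]) to a reusability score.
# The memberships and rules below are assumptions, not the paper's model.

def low(x):  return max(0.0, 1.0 - 2.0 * x)     # membership of "low"
def high(x): return max(0.0, 2.0 * x - 1.0)     # membership of "high"

def reusability(complexity: float, coupling: float) -> float:
    # Rule 1: IF complexity is low AND coupling is low THEN reusability is high.
    r_high = min(low(complexity), low(coupling))
    # Rule 2: IF complexity is high OR coupling is high THEN reusability is low.
    r_low = max(high(complexity), high(coupling))
    # Defuzzify by a weighted average of rule centroids (high=0.9, low=0.1).
    if r_high + r_low == 0.0:
        return 0.5
    return (0.9 * r_high + 0.1 * r_low) / (r_high + r_low)

print(reusability(0.2, 0.1))   # simple, loosely coupled component -> 0.9
print(reusability(0.9, 0.8))   # complex, tightly coupled component -> 0.1
```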

Feature Based Unsupervised Intrusion Detection

The goal of a network-based intrusion detection system is to classify network traffic activities into two major categories: normal and attack (intrusive). Nowadays, data mining and machine learning play an important role in many sciences, including intrusion detection systems (IDS), using both supervised and unsupervised techniques. One of the essential steps of data mining is feature selection, which helps to improve the efficiency, performance, and prediction rate of the proposed approach. This paper applies the unsupervised K-means clustering algorithm, with information gain (IG) used for feature selection and reduction, to build a network intrusion detection system. For our experimental analysis, we have used the NSL-KDD dataset, a modified version of the KDDCup 1999 intrusion detection benchmark dataset. With a split of 60.0% for the training set and the remainder for the testing set, a two-class classification (Normal, Attack) has been implemented. The Weka framework, a Java-based open-source collection of machine learning algorithms for data mining tasks, has been used in the testing process. The experimental results show that the proposed approach is very accurate, with a low false positive rate and a high true positive rate, and that it takes less learning time than using the full feature set of the dataset with the same algorithm.
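
The paper's experiments were run in Weka; purely as an illustration of the same pipeline shape, the following scikit-learn sketch ranks features by mutual information (an information-gain criterion), keeps the top k, clusters with K-means, and labels each cluster by majority vote:

```python
# Sketch of the pipeline described above, in scikit-learn rather than the Weka
# setup reported in the paper. Labels are assumed encoded as ints (0 = Normal,
# 1 = Attack), as in a preprocessed NSL-KDD feature matrix.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_selection import mutual_info_classif

def ig_kmeans_ids(X_train, y_train, X_test, k_features=10):
    # Feature selection: information gain of each feature w.r.t. the label.
    ig = mutual_info_classif(X_train, y_train, random_state=0)
    top = np.argsort(ig)[::-1][:k_features]
    # Unsupervised clustering on the reduced feature set (2 clusters).
    km = KMeans(n_clusters=2, n_init=10, random_state=0)
    train_clusters = km.fit_predict(X_train[:, top])
    # Map each cluster to the majority class of its training members.
    cluster_label = {c: np.bincount(y_train[train_clusters == c]).argmax()
                     for c in (0, 1)}
    test_clusters = km.predict(X_test[:, top])
    return np.array([cluster_label[c] for c in test_clusters])
```

Reducing to the top-ranked features is what yields the reported drop in learning time relative to running the same algorithm on the full feature set.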

Effective Collaboration in Product Development via a Common Sharable Ontology

To achieve competitive advantage nowadays, most industrial companies consider that success rests on great product development, that is, on managing the product throughout its entire lifetime, ranging from design through manufacture and operation to destruction. Achieving this goal requires tight collaboration between partners from a wide variety of domains, resulting in various product data types and formats, as well as different software tools. So far, the lack of a meaningful unified representation for product data semantics has slowed down efficient product development. This paper proposes an ontology-based approach to enable such semantic interoperability. A generic and extensible product ontology is described, gathering the main concepts pertaining to the mechanical field and the relations that hold among them. The ontology is not exhaustive; nevertheless, it shows that such a unified representation is possible and easily exploitable. This is illustrated through a case study with an example product and some semantic requests to which the ontology responds quite easily. The study proves the efficiency of ontologies as a support for product data exchange and information sharing, especially in product development environments where collaboration is not just a choice but a mandatory prerequisite.
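
As a small illustration of what such a sharable ontology looks like in practice (the class and property names here, such as Part, Assembly, hasPart, and madeOf, are hypothetical stand-ins for the paper's mechanical-domain vocabulary), a fragment can be expressed and queried with rdflib:

```python
# Illustrative sketch only: a tiny product-ontology fragment in rdflib, with a
# "semantic request" answered over it. Names are hypothetical assumptions.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

PROD = Namespace("http://example.org/product#")
g = Graph()
g.bind("prod", PROD)

# Concepts and relations of the (assumed) mechanical product ontology.
g.add((PROD.Assembly, RDFS.subClassOf, PROD.Part))
g.add((PROD.hasPart, RDFS.domain, PROD.Assembly))
g.add((PROD.hasPart, RDFS.range, PROD.Part))

# A concrete example product.
g.add((PROD.Gearbox, RDF.type, PROD.Assembly))
g.add((PROD.Shaft, RDF.type, PROD.Part))
g.add((PROD.Gearbox, PROD.hasPart, PROD.Shaft))
g.add((PROD.Shaft, PROD.madeOf, Literal("steel")))

# Semantic request: which parts make up the Gearbox, and of what material?
q = """SELECT ?part ?mat WHERE {
         prod:Gearbox prod:hasPart ?part .
         OPTIONAL { ?part prod:madeOf ?mat } }"""
for part, mat in g.query(q):
    print(part, mat)
```

Because the vocabulary is shared, any partner tool that understands the ontology can pose such requests without format-specific converters.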

Numerical Analysis and Sensitivity Study of Non-Premixed Combustion Using LES

Non-premixed turbulent combustion Computational Fluid Dynamics (CFD) simulations have been carried out for a simplified methane-fuelled coaxial jet combustor employing Large Eddy Simulation (LES). The objective of this study is to evaluate the performance of LES in modelling non-premixed combustion using the commercial software FLUENT, and to investigate the effects of the grid density and the chemistry models employed on the accuracy of the simulation results. A comparison has also been made between LES and Reynolds-Averaged Navier-Stokes (RANS) predictions. For the LES grid sensitivity test, grids of 2.3 and 6.2 million cells are employed with the equilibrium model. The chemistry model sensitivity analysis is achieved by comparing the simulation results from the equilibrium chemistry and steady flamelet models. The LES predictions of the mixture fraction, axial velocity, species mass fractions, and temperature are in good agreement with the experimental data. The LES results are similar for the two chemistry models but are influenced considerably by the grid resolution in the inner flame and near-wall regions.

Design of a Low Cost Motion Data Acquisition Setup for Mechatronic Systems

Motion sensors are commonly used as valuable components in mechatronic systems; however, many mechatronic designs and applications that need motion sensors cost an enormous amount of money, especially high-tech systems. The design of software for the communication protocol between the data acquisition card and the motion sensor is another issue that has to be solved. This study presents how to design a low-cost motion data acquisition setup consisting of an MPU 6050 motion sensor (3-axis gyroscope and 3-axis accelerometer) and an Arduino Mega2560 microcontroller. The design steps are calibration of the sensor, identification of and communication between the sensor and the data acquisition card, and interpretation of the data collected by the sensor.
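
On the host side, a few lines of Python suffice to capture and calibrate the sensor stream; this is a hedged sketch (not the study's actual software) assuming the Arduino prints one comma-separated sample per line, with an illustrative port name and baud rate:

```python
# Host-side logging sketch, assuming the Arduino Mega2560 streams one
# "ax,ay,az,gx,gy,gz" line per sample over USB serial. Port and baud rate
# are illustrative assumptions.
import serial  # pyserial

def log_samples(port="COM3", baud=115200, n_samples=1000):
    rows = []
    with serial.Serial(port, baud, timeout=1.0) as link:
        while len(rows) < n_samples:
            line = link.readline().decode("ascii", errors="ignore").strip()
            fields = line.split(",")
            if len(fields) == 6:                 # ax, ay, az, gx, gy, gz
                rows.append([float(v) for v in fields])
    return rows

samples = log_samples()
# Simple static calibration: with the sensor at rest, the mean of each
# channel estimates its bias offset, to be subtracted from later readings.
bias = [sum(col) / len(col) for col in zip(*samples)]
```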

Time-Cost-Quality Trade-off Software Using a Simplified Genetic Algorithm for Typical Repetitive Construction Projects

Time-cost optimization (TCO) is one of the greatest challenges in construction project planning and control, since optimizing either time or cost usually comes at the expense of the other. Because there is a hidden trade-off relationship between project time and cost, it can be difficult to predict whether the total cost will increase or decrease as a result of schedule compression. Recently, a third dimension, project quality, has been taken into consideration in trade-off analysis, yet few existing algorithms have been applied to construction projects with a three-dimensional time-cost-quality trade-off analysis. The objective of this paper is to present the development of a practical software system named the Automatic Multi-objective Typical Construction Resource Optimization System (AMTCROS). This system incorporates the basic concepts of the Line of Balance (LOB) and Critical Path Method (CPM) in a multi-objective Genetic Algorithm (GA) model. Its main objective is to provide practical support for planners of typical construction projects who need to optimize resource utilization in order to minimize project cost and duration while simultaneously maximizing quality. Applying these research developments to the planning of typical construction projects holds a strong promise to: 1) increase the efficiency of resource use in typical construction projects; 2) reduce the construction duration; 3) minimize construction cost (direct plus indirect cost); and 4) improve the quality of new construction projects. A general description of the proposed time-cost-quality trade-off (TCQTO) software is presented; its main inputs and outputs are outlined, its main subroutines and inference engine are detailed, and its complexity is analyzed. In addition, the proposed software is verified and tested on a real case study.
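
Purely as an illustration of the underlying GA mechanics (AMTCROS itself layers LOB and CPM scheduling on top of this), the sketch below encodes one execution mode per activity and evolves mode assignments against a scalarized time-cost-quality fitness; all activities, modes, and weights are hypothetical:

```python
# Illustrative GA sketch of the trade-off idea above, not AMTCROS itself: each
# activity can run in one of several modes (duration_days, cost, quality); a
# chromosome picks one mode per activity. All numbers are hypothetical.
import random

MODES = [[(10, 5000, 0.90), (7, 7000, 0.80), (5, 9500, 0.70)],
         [(8, 4000, 0.95), (6, 6500, 0.85)],
         [(12, 3000, 0.90), (9, 4500, 0.80)]]

def fitness(chromosome, w=(1.0, 0.001, 50.0)):
    picks = [MODES[i][g] for i, g in enumerate(chromosome)]
    duration = sum(p[0] for p in picks)      # activities in series, for brevity
    cost = sum(p[1] for p in picks)
    quality = sum(p[2] for p in picks) / len(picks)
    # Scalarize: penalize duration and cost, reward quality.
    return -(w[0] * duration + w[1] * cost) + w[2] * quality

def evolve(pop_size=30, generations=60):
    pop = [[random.randrange(len(m)) for m in MODES] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(MODES))       # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(len(MODES))            # point mutation
            child[i] = random.randrange(len(MODES[i]))
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

print(evolve())   # best mode assignment found, one gene per activity
```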

A Dynamic Model of Air Pollution, Health, and Population Growth Using System Dynamics: A Study on Tehran, Iran (with Computer Simulation by the Vensim Software)

The significance of environmental protection is well known in today's world, and the execution of any program depends on sufficient knowledge of, and familiarity with, the environment and its pollutants. Taking advantage of a systematic method such as system dynamics in environmental planning can solve many problems. In this article, air pollution in Tehran and its relationship with health and population growth are analyzed using system dynamics. First, causal loops were used to capture the relationships between the parameters affecting air pollution in Tehran; these causal loops were then turned into stock-and-flow diagrams [6] and finally simulated using the Vensim software [16], in order to determine the effect of each parameter on air pollution in Tehran over the next 10 years, how changing one or more parameters influences the other parameters, and which parameter requires the most control.
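
As a minimal sketch of the stock-and-flow mechanics behind such a model (all coefficients here are hypothetical, not the paper's calibrated Vensim model), Euler integration of two coupled stocks already reproduces the pollution-health-population feedback loop:

```python
# Minimal stock-and-flow sketch with hypothetical coefficients: population
# grows, per-capita emissions accumulate into a pollution stock, and pollution
# feeds back on the death rate through a health effect. Euler integration.

DT, YEARS = 0.25, 10
population, pollution = 8.5e6, 100.0     # initial stocks (people, index units)
birth_rate, base_death_rate = 0.016, 0.006
emission_per_capita, absorption_rate = 2.0e-5, 0.30
health_factor = 1.0e-5                   # extra death rate per pollution unit

t = 0.0
while t < YEARS:
    # Flows evaluated from the current stocks (the causal-loop structure).
    births = population * birth_rate
    deaths = population * (base_death_rate + health_factor * pollution)
    emissions = population * emission_per_capita
    absorbed = pollution * absorption_rate
    # Stock updates (one Euler step of length DT).
    population += (births - deaths) * DT
    pollution += (emissions - absorbed) * DT
    t += DT

print(f"after {YEARS} years: population={population:,.0f}, "
      f"pollution index={pollution:.1f}")
```

Tools such as Vensim automate exactly this translation from causal loops to integrated stocks and flows, plus sensitivity runs over the parameters.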

Towards a New Methodology for Developing Web-Based Systems

Web-based systems have become increasingly important because the Internet and the World Wide Web have become ubiquitous, surpassing all other technological developments in our history. The Internet, and especially company websites, have rapidly evolved in their scope and extent of use: from being little more than fixed advertising material, i.e. a "web presence" with no particular influence on the company's business, to being one of the most essential parts of the company's core business. Traditional software engineering approaches with process models such as, for example, CMM and the Waterfall model do not work well, since web system development differs from traditional development in several ways: there is a large gap between traditional software engineering designs and concepts and the low-level implementation model, and many web-based system development activities are business-oriented (for example, web applications are sales-oriented, and web applications and intranets are content-oriented) rather than engineering-oriented. This paper introduces the Increment Iterative eXtreme Programming (IIXP) methodology for developing web-based systems. In contrast to existing methodologies, this methodology combines different traditional and modern software engineering and web engineering principles.

A Novel Approach to EMABS and Comparison with ABS

In this paper, two different antilock braking systems (ABS) are simulated and compared. One is the ordinary hydraulic ABS, which we call ABS, and the other is the electromagnetic antilock braking system, called EMABS, whose performance is based on electromagnetic force. In this system there is no need for the servo-hydraulic booster used in the ABS system. In the EMABS, the desired force is generated by a magnetic relay driven by an input voltage across an air gap g. The generated force is amplified by the relay arm and applied to the brake shoes, thus generating the braking torque. The braking torque is proportional to the applied electrical voltage E; to adjust the braking torque it is only necessary to regulate E, which is much faster and has a much smaller time constant than the hydraulic ABS system. The simulations of these two ABS systems were performed with MATLAB/SIMULINK software, and the superiority of the EMABS is shown.
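
For orientation, the attraction force of such an electromagnet across an air gap g is commonly modelled by the standard reluctance-force relation (a textbook approximation, not necessarily the paper's actuator model):

\[
F \;=\; \frac{\mu_0\,N^2 I^2 A}{2\,g^2},
\qquad
T_b \;=\; k\,F,
\]

where $N$ is the number of coil turns, $I = E/R$ the coil current for applied voltage $E$ and coil resistance $R$, $A$ the pole face area, and $k$ a lever ratio representing the amplification of the relay arm. Note that in this simple magnetostatic model the force scales with $E^2$; the proportional torque-voltage behaviour stated above would correspond to a linearized or feedback-regulated operating range of the actuator.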

Removal of CO2 and H2S Using Aqueous Alkanolamine Solutions

This work presents a theoretical investigation of the simultaneous absorption of CO2 and H2S into aqueous solutions of MDEA and DEA. In this process the acid components react with the basic alkanolamine solution via an exothermic, reversible reaction in a gas/liquid absorber. The use of amine solvents for gas sweetening has been investigated using the process simulation programs HYSYS and ASPEN, with the Electrolyte NRTL model and the Amine Package and Amines (experimental) equations of state. The effects of temperature, circulation rate, amine concentration, packed column height, and Murphree efficiency on the rate of absorption were studied. When the lean amine flow and concentration increase, CO2 and H2S absorption increase as well. As the inlet amine temperature in the absorber rises, CO2 and H2S penetrate to the upper stages of the absorber and the absorption of acid gases decreases. The CO2 concentration in the clean gas can be greatly influenced by the packing height, whereas for the H2S concentration in the clean gas the packing height plays a minor role. The HYSYS software cannot estimate the Murphree efficiency correctly, and it applies the same contribution in all the HYSYS diagrams. As the Murphree efficiency improves, the maximum temperature of the absorber decreases, the reaction zone shifts to the bottom stages of the absorber, and the absorption of acid gases increases.