Furthering the Effectiveness of Software Testability Measures

Software testability was proposed to address the problems of the increasing cost of testing and of software quality. A testability measure provides a quantified way to denote the testability of software. Since the 1990s, many testability measurement models have been proposed for this purpose. By discussing the contradiction between domain testability and the domain/range ratio (DRR), a new testability measure, semantic fault distance, is proposed, and its validity is discussed.
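
For illustration only (not from the paper), the DRR mentioned above is the ratio of the cardinality of a function's input domain to that of its output range; a high DRR implies information loss that can hide faults and lower testability. A minimal Python sketch estimating it empirically:

    # Illustrative sketch: empirically estimating the domain/range ratio
    # (DRR) of a function by sampling its inputs. A DRR well above 1
    # signals information loss, which tends to mask faults.

    def estimate_drr(func, sample_inputs):
        """Approximate DRR = |observed domain| / |observed range|."""
        distinct_inputs = set(sample_inputs)
        distinct_outputs = {func(x) for x in distinct_inputs}
        return len(distinct_inputs) / len(distinct_outputs)

    # A clamping function collapses many inputs onto few outputs.
    clamp = lambda x: max(0, min(10, x))
    print(estimate_drr(clamp, range(-100, 101)))             # high DRR
    print(estimate_drr(lambda x: x + 1, range(-100, 101)))   # DRR = 1.0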

Finite Element Analysis of Full Ceramic Crowns with and without Zirconia Framework

Simulation of occlusal function during laboratory materials testing is essential for predicting long-term performance before clinical usage. The aim of the study was to assess the influence of chamfer preparation depth on the failure risk of heat-pressed ceramic crowns with and without a zirconia framework by means of finite element analysis. 3D models of a maxillary central incisor, prepared for full ceramic crowns with different depths of the chamfer margin (between 0.8 and 1.2 mm) and 6-degree tapered walls, together with the overlying crowns, were generated using literature data. The crowns were designed with and without a zirconia framework with a thickness of 0.4 mm. For all preparations and crowns, stresses in the pressed ceramic crown, zirconia framework, pressed ceramic veneer, and dentin were evaluated separately. The highest stresses were registered in the dentin. For the studied cases, the depth of the preparations had no significant influence on the stress values in the teeth and pressed ceramics; it influenced only the zirconia framework. The zirconia framework decreases the stress values in the veneer.

Towards Modeling for Crashes: A Low-Cost Adaptive Methodology for Karachi

The aim of this paper is to discuss a low-cost methodology that can predict traffic flow conflicts and quantitatively rank crash expectancies (based on relative probability) for various traffic facilities. The paper focuses on the application of statistical distributions to model traffic flow and of Monte Carlo techniques to simulate it, and discusses how to create a tool to predict the possibility of a traffic crash. A low-cost data collection methodology is discussed for the heterogeneous traffic flow found in Karachi, and a GIS platform is proposed to thematically represent the traffic flow from simulations and the probability of a crash. Furthermore, the dynamism of the model is discussed with reference to its adaptability, adequacy, economy, and efficiency, to ensure its adoption.
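
To make the Monte Carlo idea concrete, here is a minimal sketch of the kind of simulation the paper describes: headways sampled from an exponential distribution, speeds from a normal distribution, and a crude time-to-collision criterion for a rear-end conflict. The distributions, parameters, and threshold are illustrative assumptions, not the paper's fitted values.

    # Hypothetical Monte Carlo conflict estimator; all parameters are
    # illustrative assumptions, not calibrated to Karachi data.
    import random

    def conflict_probability(flow_veh_per_hr, trials=100_000,
                             mean_speed=14.0, speed_sd=3.0,
                             ttc_threshold=1.5):
        mean_headway = 3600.0 / flow_veh_per_hr   # mean time headway, s
        conflicts = 0
        for _ in range(trials):
            headway = random.expovariate(1.0 / mean_headway)
            v_lead = max(1.0, random.gauss(mean_speed, speed_sd))
            v_follow = max(1.0, random.gauss(mean_speed, speed_sd))
            closing = v_follow - v_lead           # closing speed, m/s
            spacing = headway * v_lead            # crude spacing estimate, m
            if closing > 0 and spacing / closing < ttc_threshold:
                conflicts += 1                    # time-to-collision too short
        return conflicts / trials

    for q in (600, 1200, 1800):                   # flows in veh/h
        print(q, conflict_probability(q))

Such relative probabilities could then be ranked across facilities and mapped thematically on the proposed GIS platform.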

A Formulation of the Latent Class Vector Model for Pairwise Data

In this research, a latent class vector model for pairwise data is formulated. Compared with the basic vector model, this model yields consistent parameter estimates, since the number of parameters to be estimated does not increase with the number of subjects. The analysis reveals that the model is stable and can classify each subject into the latent classes representing the typical scales used by those subjects.

Deflection Control in Composite Buildings Using Belt Truss and Outrigger Systems

The design of high-rise buildings is more often dictated by serviceability than by strength. Structural engineers constantly strive to control lateral deflection and storey drift as well as the self-weight of the structure imposed on the foundation. One of the most effective techniques is the use of an outrigger and belt truss system, which can astutely solve both issues in high-rise composite construction. This paper investigates deflection control through effective utilisation of a belt truss and outrigger system on a 60-storey composite building subjected to wind loads. A three-dimensional finite element analysis is performed with one, two, and three outrigger levels. The reductions in lateral deflection are 34%, 42%, and 51% respectively, compared with a model without any outrigger system. There is also an appreciable decline in storey drift with the introduction of these stiffer arrangements.

Study of Aero-thermal Effects with Heat Radiation in Optical Side Window

In hypersonic environments, aerothermal effects make it difficult for the optical side windows of optically guided missiles to withstand the high heat, producing cracking or breaking and resulting in an inability to function. This study used computational fluid dynamics to investigate external cooling jet conditions for optical side windows. The k-ε and k-ω turbulence models were simulated. To better match actual aerothermal environments, a thermal radiation model was added to examine suitable amounts of external coolant and the aerothermodynamic problems of the optical window. The simulation results indicate that, without external cooling jets, vortices produced by the airflow on the optical window and in the tail groove drive the temperatures in these two locations to a peak of approximately 1600 K. With the external cooling jets operating at 0.15 kg/s, the surface temperature of the optical windows dropped to approximately 280 K. When thermal radiation conditions were added, faster heat flux dissipation lowered the surface temperature of the optical windows from 280 K to approximately 260 K. The difference in influence of the k-ε and k-ω turbulence models on optical window surface temperature was not significant.

Using Structural Equation Modeling in Causal Relationship Design for the Balanced Scorecard's Strategy Map

Through the 1980s, management accounting researchers described the increasing irrelevance of traditional control and performance measurement systems. The Balanced Scorecard (BSC) is a critical business tool for many organizations. It is a performance measurement system that translates mission and strategy into objectives. The strategy map approach is a development of the BSC in which the necessary causal relations must be established. To recognize these relations, experts usually rely on experience; it is also possible to use regression for the same purpose. Structural Equation Modeling (SEM), one of the most powerful methods of multivariate data analysis, obtains more appropriate results than traditional methods such as regression. In the present paper, we propose SEM for the first time to identify the relations between objectives in the strategy map, together with a test to measure the importance of the relations. In SEM, factor analysis and hypothesis testing are performed within the same analysis. SEM is known to be better than other techniques at supporting analysis and reporting. Our approach provides a framework that permits experts to design the strategy map by combining a comprehensive, scientific method with their experience, and is therefore more reliable than previously established methods.
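
As a minimal sketch of this idea (the paper does not prescribe a tool; the semopy Python library and the synthetic indicator data below are assumptions), one could specify hypothetical strategy-map paths between BSC perspectives and inspect the estimated path coefficients and their significance:

    # Minimal SEM sketch, assuming the semopy library and hypothetical
    # data for three BSC perspectives: internal processes drive customer
    # results, which drive financial results.
    import numpy as np
    import pandas as pd
    from semopy import Model

    rng = np.random.default_rng(0)
    n = 300
    internal = rng.normal(size=n)
    customer = 0.7 * internal + rng.normal(scale=0.5, size=n)
    financial = 0.6 * customer + rng.normal(scale=0.5, size=n)
    data = pd.DataFrame({"internal": internal,
                         "customer": customer,
                         "financial": financial})

    desc = """
    customer ~ internal
    financial ~ customer
    """
    model = Model(desc)
    model.fit(data)
    print(model.inspect())   # path estimates with p-values per relation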

An Agent-Based Simulation of Network Formation with Heterogeneous Agents

We investigate an asymmetric connections model with a dynamic network formation process, using an agent-based simulation. We permit heterogeneity of agents' values. Valuable persons appear to have many links in real social networks. We focus on this point and examine whether valuable agents change the structures of the terminal networks. Simulation reveals that valuable agents diversify the terminal networks. We cannot find evidence that valuable agents increase the possibility that star networks survive the dynamic process. We find that valuable agents disperse the degrees of agents in each terminal network on average.
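
For illustration, here is a simplified sketch of connections-model dynamics with heterogeneous values (a symmetric joint-utility variant kept short for exposition; the paper studies an asymmetric model, and all parameters below are assumptions):

    # Illustrative sketch, not the paper's exact model: link-toggling
    # dynamics for a connections model where agent i earns
    # sum_j delta**dist(i,j) * v[j] minus a per-link cost.
    import random
    import networkx as nx

    def utility(G, i, v, delta=0.6, cost=0.8):
        dists = nx.single_source_shortest_path_length(G, i)
        gain = sum(delta ** d * v[j] for j, d in dists.items() if j != i)
        return gain - cost * G.degree(i)

    def simulate(n=10, steps=2000, seed=1):
        random.seed(seed)
        v = [random.choice([1.0, 1.0, 3.0]) for _ in range(n)]  # some valuable agents
        G = nx.empty_graph(n)
        for _ in range(steps):
            i, j = random.sample(range(n), 2)
            before = utility(G, i, v) + utility(G, j, v)
            toggle = G.remove_edge if G.has_edge(i, j) else G.add_edge
            toggle(i, j)
            after = utility(G, i, v) + utility(G, j, v)
            if after < before:       # revert changes that lower joint utility
                (G.remove_edge if G.has_edge(i, j) else G.add_edge)(i, j)
        return G, v

    G, v = simulate()
    print(sorted(dict(G.degree()).items()), v)

Comparing terminal degree distributions across runs with and without high-value agents is the kind of experiment the paper reports.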

Design and Simulation of Air-Fuel Ratio Control System for Distributorless CNG Engine

This paper puts forward an air-fuel ratio control method using a PI controller. With the help of MATLAB/SIMULINK software, a mathematical model of the air-fuel ratio control system for a distributorless CNG engine is constructed. The objective is to maintain the cylinder-to-cylinder air-fuel ratio at a prescribed set point, determined primarily by the state of the Three-Way Catalyst (TWC), so that the pollutants in the exhaust are removed with the highest efficiency. Concurrent control of the air-fuel ratio under transient conditions is implemented by the proportional-integral (PI) controller. The simulation results indicate that the control method can eliminate air-fuel maldistribution and maintain the air-fuel ratio at stoichiometry within a minimum number of engine events.
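
A minimal sketch of the PI idea on a crude first-order stand-in for the fueling dynamics (not the paper's SIMULINK model; gains, lag, and the approximate 17.2:1 stoichiometric ratio for methane/CNG are assumptions):

    # Toy PI loop: the measured AFR is driven to the stoichiometric set
    # point through a first-order mixture-path lag. All values illustrative.
    AFR_SET = 17.2             # approx. stoichiometric AFR for methane/CNG
    KP, KI = 0.8, 4.0          # illustrative PI gains
    DT, TAU = 0.01, 0.15       # time step and assumed lag, s

    afr = 15.0                 # start from a rich mixture
    integral = 0.0
    for k in range(300):
        error = AFR_SET - afr
        integral += KI * error * DT
        u = KP * error + integral    # PI correction to the fuel command
        afr += DT / TAU * u          # leaning the command raises measured AFR
        if k % 60 == 0:
            print(f"t = {k * DT:.2f} s, AFR = {afr:.3f}")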

An Investigation into Turbine Blade Tip Leakage Flows at High Speeds

The effect of blade tip geometry on high-speed leakage flows in a high-pressure gas turbine is studied experimentally and computationally. Two simplified models are constructed: one models a flat blade tip and the second a cavity blade tip. Experimental results obtained in a transonic wind tunnel show the static pressure distribution along the tip wall and provide flow visualization. RANS computations were carried out to provide further insight into the mean flow behavior and to calculate the discharge coefficient, a measure of the flow leaking over the tip. In both tip geometries the flow separates over the tip to form a separation bubble. The bubble is taller for the cavity tip, while a complete shock wave system of oblique waves ending with a normal wave can be seen for the flat tip. The discharge coefficient for the flat tip shows less dependence on the pressure ratio across the blade tip than that for the cavity tip. However, the discharge coefficient for the cavity tip is lower than that of the flat tip, showing a better ability to reduce the leakage flow and thus increase turbine efficiency.
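
For context, the discharge coefficient compares the leakage mass flow with the ideal isentropic value at the same pressure ratio. A sketch of that reference calculation (standard compressible-flow relations; gas properties for air and all numbers assumed):

    # Ideal isentropic (St. Venant) mass flow through the tip gap;
    # Cd = measured flow / this ideal flow. Inputs are placeholders.
    from math import sqrt

    GAMMA, R = 1.4, 287.0      # air

    def ideal_mass_flow(area, p0, T0, p_exit):
        """Isentropic mass flow (kg/s) from stagnation p0 (Pa), T0 (K)."""
        pr_crit = (2.0 / (GAMMA + 1.0)) ** (GAMMA / (GAMMA - 1.0))
        pr = max(p_exit / p0, pr_crit)    # clamp: choked below critical ratio
        term = pr ** (2.0 / GAMMA) - pr ** ((GAMMA + 1.0) / GAMMA)
        return area * p0 * sqrt(2.0 * GAMMA / ((GAMMA - 1.0) * R * T0) * term)

    m_ideal = ideal_mass_flow(area=1e-4, p0=2.0e5, T0=400.0, p_exit=1.0e5)
    m_measured = 0.75 * m_ideal           # placeholder measurement
    print("Cd =", m_measured / m_ideal)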

Pattern Classification of Back-Propagation Algorithm Using Exclusive Connecting Network

The objective of this paper is to design a pattern classification model based on the back-propagation (BP) algorithm for decision support systems. The standard BP model fully connects every node between adjacent layers from input to output. It therefore requires considerable computing time and many iterations to achieve good performance and an acceptable error rate during pattern generation or network training. The proposed model instead uses exclusive connections between hidden-layer nodes and output nodes. Its advantages are fewer iterations and better performance compared with the standard back-propagation model. We simulated several classification datasets and different network settings (e.g., number of hidden layers and nodes, number of classes, and number of iterations). In most of the simulated cases, the exclusive-connection network model outperformed standard BP. We expect this algorithm to be applicable to face identification, data analysis, and mapping between environmental data and information.
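
One way to read "exclusive connection" (an interpretation, not the paper's code) is a binary mask on the hidden-to-output weights so each hidden node feeds exactly one output; the mask is applied in the forward pass and to the gradient. A minimal numpy sketch:

    # Masked one-hidden-layer network trained by backprop; the mask keeps
    # each hidden node exclusively connected to one output class.
    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hidden, n_out = 4, 6, 3

    mask = np.zeros((n_hidden, n_out))
    mask[np.arange(n_hidden), np.arange(n_hidden) % n_out] = 1.0

    W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
    W2 = rng.normal(scale=0.5, size=(n_hidden, n_out)) * mask

    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    # toy separable 3-class data
    X = rng.normal(size=(150, n_in))
    labels = (X @ rng.normal(size=(n_in, n_out))).argmax(axis=1)
    Y = np.eye(n_out)[labels]

    lr = 1.0
    for epoch in range(2000):
        H = sigmoid(X @ W1)
        O = sigmoid(H @ W2)
        dO = (O - Y) * O * (1 - O)                # squared-error deltas
        dH = (dO @ W2.T) * H * (1 - H)
        W2 -= lr * (H.T @ dO) / len(X) * mask     # masked gradient step
        W1 -= lr * (X.T @ dH) / len(X)

    print("training accuracy:", (O.argmax(axis=1) == labels).mean())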

Helicopter Adaptive Control with Parameter Estimation Based on Feedback Linearization

This paper presents an adaptive feedback linearization approach to helicopter control. Ideal feedback linearization is defined for the case when the system model is known. Adaptive feedback linearization is employed to obtain asymptotically exact cancellation of the inherent uncertainty in the knowledge of the system parameters. The control algorithm is implemented using the feedback linearization technique together with an adaptive method. The controller parameters are initially unknown; an adaptive control law drives them toward the ideal values that provide perfect model matching between the reference model and the closed-loop plant. The converged controller parameters then provide good estimates of the unknown plant parameters.
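
The adaptation idea can be illustrated on a scalar toy system (a stand-in, far simpler than helicopter dynamics): a plant with one unknown parameter, a feedback-linearizing gain that is adjusted online, and a Lyapunov-based update that drives the gain to its ideal value.

    # Toy adaptive feedback linearization: plant xdot = a*x + u, a unknown.
    # Ideal gain is theta* = a + am; update thetadot = gamma*e*x recovers it.
    a_true, am, gamma, dt = 2.0, 3.0, 5.0, 1e-3
    x = xm = 1.0
    theta = 0.0                       # controller parameter estimate
    r = 1.0                           # constant reference
    for k in range(int(20 / dt)):
        u = -theta * x + am * r       # adaptive feedback-linearizing control
        e = x - xm                    # tracking error vs. reference model
        x += dt * (a_true * x + u)
        xm += dt * (-am * xm + am * r)
        theta += dt * gamma * e * x   # Lyapunov-based parameter update

    print(f"theta -> {theta:.3f} (ideal {a_true + am:.3f}), x -> {x:.3f}")

As the tracking error decays, theta converges toward a_true + am, mirroring the paper's point that converged controller parameters estimate the unknown plant parameters.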

A Model to Determine Atmospheric Stability and its Correlation with CO Concentration

Atmospheric stability plays the most important role in the transport and dispersion of air pollutants. Different methods, with varying degrees of complexity, are used to determine stability. Most of these methods are based on the relative magnitudes of convective and mechanical turbulence in atmospheric motions. The Richardson number, the Monin-Obukhov length, the Pasquill-Gifford stability classification, and the Pasquill-Turner stability classification are the most common parameters and methods. The Pasquill-Turner Method (PTM), employed in this study, uses observations of wind speed, insolation, and time of day to classify atmospheric stability into distinguishable indices. In this study, a model is presented to determine atmospheric stability conditions using PTM. As a case study, meteorological data from Mehrabad station in Tehran from 2000 to 2005 are applied to the model. Three categories are considered to deduce the pattern of stability conditions. First, the overall pattern of stability classification is obtained; the results show the atmosphere in stable, neutral, and unstable conditions 38.77%, 27.26%, and 33.97% of the time, respectively. Days are mostly unstable (66.50%) while nights are mostly stable (72.55%). Second, monthly and seasonal patterns are derived; the relative frequency of stable conditions decreases from January to June and increases from June to December, while the results for unstable conditions behave in exactly the opposite manner. Autumn is the most stable season, with a relative frequency of stable conditions of 50.69%, versus 42.79%, 34.38%, and 27.08% for winter, summer, and spring, respectively. The third category, the hourly stability pattern, shows that unstable conditions dominate from approximately 03-15 GMT in warm seasons and 04-12 GMT in cold seasons. Finally, the correlation between atmospheric stability and CO concentration is derived.
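
A minimal sketch of the PTM lookup described above, using the commonly published Pasquill-Gifford table (classes A through F); split classes such as "A-B" are collapsed to single letters here for brevity, a simplification rather than the paper's exact index:

    # Simplified Pasquill stability lookup: wind speed plus daytime
    # insolation or nighttime cloud cover selects a class A (most
    # unstable) to F (most stable).
    def pasquill_class(wind_ms, daytime, insolation=None, cloudy_night=None):
        """insolation: 'strong' | 'moderate' | 'slight' (daytime);
        cloudy_night: True if cloud cover >= 4/8 (nighttime)."""
        day = {  # wind bands: <2, 2-3, 3-5, 5-6, >=6 m/s
            "strong":   ["A", "A", "B", "C", "C"],
            "moderate": ["A", "B", "B", "C", "D"],
            "slight":   ["B", "C", "C", "D", "D"],
        }
        night = {True:  ["F", "E", "D", "D", "D"],   # cloudy
                 False: ["F", "F", "E", "D", "D"]}   # clear
        band = (0 if wind_ms < 2 else 1 if wind_ms < 3 else
                2 if wind_ms < 5 else 3 if wind_ms < 6 else 4)
        return day[insolation][band] if daytime else night[cloudy_night][band]

    print(pasquill_class(1.5, daytime=True, insolation="strong"))   # 'A'
    print(pasquill_class(1.5, daytime=False, cloudy_night=False))   # 'F'

Counting the classes assigned to each observation hour yields relative frequencies of the kind reported above.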

Photo Mosaic Smartphone Application in Client-Server Based Large-Scale Image Databases

In this paper, we present a photo mosaic smartphone application for client-server based large-scale image databases. Photo mosaics are not a new concept, but very few smartphone applications handle a huge number of images in a client-server environment. To support large-scale image databases, we first propose an overall framework based on a client-server model. We then present the concept of image-PAA features to efficiently handle a huge number of images and discuss their lower bounding property. We also present a best-match algorithm that exploits the lower bounding property of image-PAA. Finally, we implement an efficient Android-based application and demonstrate its feasibility.
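
The lower bounding property referenced above follows the standard Piecewise Aggregate Approximation (PAA) bound from time-series indexing; the paper's image-PAA specifics are not reproduced here. A minimal sketch of the bound on feature vectors:

    # PAA distance never exceeds the true Euclidean distance, so PAA can
    # prune best-match candidates without false dismissals.
    import numpy as np

    def paa(x, m):
        """Piecewise Aggregate Approximation: m segment means of x."""
        return x.reshape(m, -1).mean(axis=1)

    def paa_dist(xa, ya, seg_len):
        return np.sqrt(seg_len * np.sum((xa - ya) ** 2))

    rng = np.random.default_rng(0)
    n, m = 64, 8                      # feature length, number of segments
    x, y = rng.random(n), rng.random(n)
    d_true = np.linalg.norm(x - y)
    d_paa = paa_dist(paa(x, m), paa(y, m), n // m)
    assert d_paa <= d_true            # the lower-bounding property
    print(d_paa, "<=", d_true)

In a best-match search, candidates whose PAA distance already exceeds the current best true distance can be skipped outright.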

Hybridized Technique to Analyze Work-Stress-Related Data via the StressCafé

This paper presents an approach that hybridizes two or more artificial intelligence (AI) techniques to fuzzify the work-stress level ranking and categorize the rating accordingly. A hybrid approach was considered because combining different techniques can neutralize each other's weaknesses, generating a superior hybrid solution. Recent research has shown the need for more valid and reliable tools for assessing work stress; artificial intelligence techniques have therefore been applied to provide a solution to a psychological application. This paper also gives an overview of the novel, autonomous interactive model for analysing work stress that has been developed using multi-agent systems, and describes the establishment of the intelligent multi-agent decision analyser (IMADA), which uses a hybridized technique of neural networks and fuzzy logic within the multi-agent based framework.
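
As a micro-sketch of the fuzzification step (the membership shapes, scale, and labels below are illustrative assumptions, not IMADA's actual rule base):

    # Triangular membership functions map a numeric work-stress score to
    # overlapping linguistic categories; the top category is reported.
    def triangle(x, a, b, c):
        """Triangular membership: rises a->b, falls b->c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def fuzzify_stress(score):          # score on a 0-10 scale
        memberships = {
            "low":      triangle(score, -1, 0, 4),
            "moderate": triangle(score, 2, 5, 8),
            "high":     triangle(score, 6, 10, 11),
        }
        return memberships, max(memberships, key=memberships.get)

    print(fuzzify_stress(6.5))   # overlapping degrees, then top category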

A Study of Classification Models to Predict Drill-Bit Breakage Using Degradation Signals

Cutting tools are widely used in manufacturing, and drilling is the most commonly used machining process. Although the drill-bits used in drilling may not be expensive, their breakage can damage the expensive workpiece being drilled and has a major impact on productivity. Predicting drill-bit breakage is therefore important for reducing cost and improving productivity. This study uses twenty features extracted from two degradation signals, viz., thrust force and torque. The methodology involves developing and comparing decision tree, random forest, and multinomial logistic regression models for classifying and predicting drill-bit breakage from the degradation signals.
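
A minimal sketch of this model comparison, assuming scikit-learn and synthetic stand-in data with twenty features (the paper's thrust-force and torque features are not reproduced here):

    # Fit and compare the three classifier families on placeholder data.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=600, n_features=20,
                               n_informative=8, n_classes=3, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    models = {
        "decision tree": DecisionTreeClassifier(random_state=0),
        "random forest": RandomForestClassifier(random_state=0),
        "multinomial logistic": LogisticRegression(max_iter=1000),
    }
    for name, model in models.items():
        print(name, model.fit(X_tr, y_tr).score(X_te, y_te))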

Dimensional Modeling of HIV Data Using Open Source

The choice of data modeling technique for an information system is determined by the objective of the resulting data model. Dimensional modeling is the preferred technique for data destined for data warehouses and data mining, producing data models that ease analysis and querying, in contrast with entity-relationship modeling. The establishment of data warehouses as components of information system landscapes in many organizations has subsequently driven the development of dimensional modeling. This development has been significantly greater, and better reported, for commercial database management systems than for open-source ones, making dimensional modeling less affordable for those in resource-constrained settings. This paper presents dimensional modeling of HIV patient information using open-source modeling tools. It takes advantage of the fact that the regions most affected by HIV (sub-Saharan Africa) are also heavily resource constrained, yet hold large quantities of HIV data. Two HIV data source systems were studied to identify appropriate dimensions and facts; these were then modeled using two open-source dimensional modeling tools. Use of open source would reduce the software costs of dimensional modeling and in turn make data warehousing and data mining more feasible, even for those in resource-constrained settings who have the data available.

Environmental Efficiency of Electric Power Industry of the United States: A Data Envelopment Analysis Approach

The importance of the environmental efficiency of the electric power industry stems from high demand for energy combined with global warming concerns. It is especially essential for the world's largest economies, such as that of the United States. The paper introduces a Data Envelopment Analysis (DEA) model of environmental efficiency using indicators of fossil fuel utilization, emissions rate, and electric power losses. DEA is advantageous here over other approaches due to its nonparametric nature. The paper analyzes data for the period 1990-2006 by comparing actual yearly levels in each dimension with the best values of the partial indicators over the period. Positive factors of efficiency include the decline in emission rates starting in 2000 and in electric power losses starting in 2004, together with the increasing trend of fuel utilization starting in 1999. As a result, the dynamics of environmental efficiency are positive starting in 2002. The main concern is the decline in fossil fuel utilization in 2006; this negative change should be reversed to comply with ecological and economic requirements.
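
For illustration of the general technique (not the paper's exact formulation or data), here is an input-oriented CCR DEA envelopment model solved per decision-making unit with scipy; the indicator values are placeholders, not the 1990-2006 series:

    # Input-oriented CCR DEA: minimize theta s.t. a convex combination of
    # peers uses at most theta times DMU k's inputs and at least its outputs.
    import numpy as np
    from scipy.optimize import linprog

    # rows: inputs (e.g., emissions rate, power losses); columns: DMUs (years)
    X = np.array([[4.0, 3.0, 5.0, 2.5],
                  [2.0, 2.5, 3.0, 1.5]])
    # rows: outputs (e.g., fuel utilization); columns: DMUs
    Y = np.array([[1.0, 1.2, 1.1, 1.3]])

    def dea_efficiency(k):
        n = X.shape[1]
        c = np.r_[1.0, np.zeros(n)]            # variables: theta, lambdas
        A_in = np.c_[-X[:, [k]], X]            # sum lam*x_ij <= theta*x_ik
        A_out = np.c_[np.zeros((Y.shape[0], 1)), -Y]  # sum lam*y_rj >= y_rk
        res = linprog(c, A_ub=np.r_[A_in, A_out],
                      b_ub=np.r_[np.zeros(X.shape[0]), -Y[:, k]],
                      bounds=[(None, None)] + [(0, None)] * n,
                      method="highs")
        return res.x[0]

    for k in range(X.shape[1]):
        print(f"DMU {k}: efficiency = {dea_efficiency(k):.3f}")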

Optimal Water Allocation: Sustainable Management of Dam Reservoir

Scarcity of water resources and the huge cost of establishing new hydraulic installations necessitate optimal exploitation of existing reservoirs. Sustainable management and efficient exploitation of existing finite water resources are important factors in water resource management, particularly in periods of water insufficiency, in dry regions, and where allocations are competitive from an exploitation-management point of view. This study aims to minimize reservoir water release while meeting a specified rate of demand. A numerical model for optimal water exploitation has been developed using GAMS, introduced by the World Bank, and applied to the case of the Meijaran dam in northern Iran. The results indicate that the model can optimize reservoir exploitation while supplying the water required by the lower parts of the region. Furthermore, in allocating water optimally from the reservoir, the optimal rate of water allocated to each group of users was specified to increase the benefits of dam exploitation.
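
A minimal sketch of the LP core such a GAMS model would solve (the mass-balance structure is standard; inflows, demands, and capacities below are illustrative, not Meijaran data):

    # Reservoir operation LP: maximize delivered water subject to
    # mass balance S_t = S_{t-1} + I_t - R_t and storage bounds.
    import numpy as np
    from scipy.optimize import linprog

    inflow = np.array([30.0, 10.0, 5.0, 20.0])    # per period
    demand = np.array([15.0, 15.0, 15.0, 15.0])
    S0, S_max = 20.0, 50.0
    T = len(inflow)

    # variables: releases R_1..R_T, then end-of-period storages S_1..S_T
    c = np.r_[-np.ones(T), np.zeros(T)]           # maximize total release
    A_eq = np.zeros((T, 2 * T))
    b_eq = inflow.copy()
    for t in range(T):
        A_eq[t, t] = 1.0                          # R_t
        A_eq[t, T + t] = 1.0                      # S_t
        if t > 0:
            A_eq[t, T + t - 1] = -1.0             # -S_{t-1}
    b_eq[0] += S0                                 # S_1 + R_1 = I_1 + S0
    bounds = [(0.0, d) for d in demand] + [(0.0, S_max)] * T
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    print("releases:", res.x[:T], "storages:", res.x[T:])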