An Examination of the Factors Influencing Software Development Effort

Effective estimation of software development effort is an important aspect of successful project management. Based on a large database of 4106 projects, this study statistically examines the factors that influence development effort. The factors found to be significant are project size, the average number of developers who worked on the project, the type of development, the development language, the development platform, and the use of rapid application development. Among these factors, project size is the most critical cost driver. Unsurprisingly, this study found that the use of CASE tools does not necessarily reduce development effort, which adds support to the claim that the effect of tool use is subtle. As many current estimation models are rarely or unsuccessfully used, this study proposes a parsimonious parametric model for effort prediction that is both simple and more accurate than previous models.
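
As a rough illustration of a parsimonious parametric effort model of this kind, the sketch below fits a log-log regression of effort on size, team size, and categorical cost drivers; the data, column names, and functional form are hypothetical, not the study's.

```python
# Minimal sketch of a parsimonious log-log effort model (hypothetical data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative records only; the study's database has 4106 projects.
df = pd.DataFrame({
    "effort_ph": [1200, 340, 5600, 880, 2100, 430, 760, 3100, 150, 980],   # person-hours
    "size_fp":   [250, 80, 1100, 190, 520, 95, 160, 700, 40, 210],          # function points
    "team_size": [6, 2, 18, 4, 9, 3, 4, 12, 2, 5],                          # avg developers
    "dev_type":  ["new", "enhance", "new", "enhance", "new",
                  "enhance", "new", "new", "enhance", "enhance"],
    "language":  ["java", "cobol", "java", "c", "java",
                  "c", "cobol", "c", "java", "cobol"],
})

# Effort ~ a * Size^b * Team^c * exp(categorical adjustments), fitted in log space.
model = smf.ols(
    "np.log(effort_ph) ~ np.log(size_fp) + np.log(team_size)"
    " + C(dev_type) + C(language)",
    data=df,
).fit()
print(model.params)
```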

Verification of Protocol Design using UML-SMV

In the recent past, the Unified Modeling Language (UML) has become the de facto industry standard for object-oriented modeling of software systems. The syntactically and semantically rich UML has encouraged industry to develop several supporting tools, including tools capable of generating deployable code from UML models. As a consequence, ensuring the correctness of the model/design has become a challenging and extremely important task. In this paper, we present an approach for the automatic verification of protocol models/designs. As a case study, a Session Initiation Protocol (SIP) design is verified for the property “the CALLER will not converse with the CALLEE before the connection is established between them”. The SIP is modeled using UML statechart diagrams and the desired properties are expressed in temporal logic. Our prototype verifier “UML-SMV” is used to carry out the verification. When we subjected an erroneous SIP model to UML-SMV, the verifier successfully detected the error (in 76.26 ms) and generated the error trace.
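
The property checked is a safety property; the toy sketch below illustrates the idea of such a check as a reachability search over a hand-written, hypothetical SIP-like state graph (it does not reproduce UML-SMV, which translates statecharts to SMV and checks temporal-logic formulas).

```python
# Toy reachability check of "no conversation before the connection is established"
# on a hypothetical SIP-like state graph (illustrative, not the paper's model).
from collections import deque

transitions = {
    "idle":       ["inviting"],
    "inviting":   ["ringing", "idle"],
    "ringing":    ["connected", "idle"],
    "connected":  ["conversing", "idle"],
    "conversing": ["idle"],
}

def violates_property(start="idle"):
    """Return a path that reaches 'conversing' without passing 'connected', else None."""
    queue, seen = deque([(start, [start], False)]), set()
    while queue:
        state, path, connected = queue.popleft()
        connected = connected or state == "connected"
        if state == "conversing" and not connected:
            return path                      # counterexample trace
        if (state, connected) in seen:
            continue
        seen.add((state, connected))
        for nxt in transitions.get(state, []):
            queue.append((nxt, path + [nxt], connected))
    return None

print(violates_property())  # None: the property holds in this toy graph
```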

The Impact of Financial Risks on Profitability of Malaysian Commercial Banks: 1996-2005

This paper examines the relationship between financial risks and the profitability of conventional and Islamic banks in Malaysia for the period 1996 to 2005. The measures of profitability used in the study are return on equity (ROE) and return on assets (ROA), while the financial risks are credit risk, interest rate risk, and liquidity risk. The study employs panel data regression analysis using Generalised Least Squares fixed effects and random effects models. Credit risk was found to have a significant impact on ROA and ROE for both the conventional and the Islamic banks. The relationship between interest rate risk and ROE was found to be weakly significant for the conventional banks and insignificant for the Islamic banks, while the effect of interest rate risk on ROA is significant for the conventional banks. Liquidity risk was found to have an insignificant impact on both profitability measures.
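
A minimal sketch of the kind of panel regression involved, here a least-squares dummy-variable (fixed-effects) fit on synthetic data; the study itself uses Generalised Least Squares fixed and random effects estimators, and all variable names and values below are illustrative.

```python
# Sketch of a fixed-effects panel regression of ROA on risk measures (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for bank in ["A", "B", "C", "D"]:
    for year in range(1996, 2006):
        credit = rng.normal(2.0, 0.5)   # e.g. loan-loss provisions / total loans
        ir     = rng.normal(0.5, 0.1)   # interest rate risk proxy
        liq    = rng.normal(0.3, 0.1)   # liquidity risk proxy
        roa    = 1.5 - 0.2 * credit - 0.1 * ir + rng.normal(0, 0.05)
        rows.append((bank, year, roa, credit, ir, liq))
panel = pd.DataFrame(rows, columns=["bank", "year", "roa",
                                    "credit_risk", "ir_risk", "liq_risk"])

# Least-squares dummy-variable (within) estimator: bank fixed effects.
fe = smf.ols("roa ~ credit_risk + ir_risk + liq_risk + C(bank)", data=panel).fit()
print(fe.params)
```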

A Computational Stochastic Modeling Formalism for Biological Networks

Stochastic models of biological networks are well established in systems biology, where the computational treatment of such models often focuses on solving the so-called chemical master equation via stochastic simulation algorithms. In contrast, the development of storage-efficient model representations that are directly suitable for computer implementation has received significantly less attention. Instead, a model is usually described in terms of a stochastic process or a "higher-level paradigm" with a graphical representation, such as a stochastic Petri net. A serious problem then arises from the exponential growth of the model's state space, which is in fact a main reason for the popularity of stochastic simulation, since simulation suffers less from state space explosion than non-simulative numerical solution techniques. In this paper we present transition class models for the representation of biological network models, a compact mathematical formalism that circumvents state space explosion. Transition class models can also serve as an interface between different higher-level modeling paradigms, stochastic processes, and the implementation coded in a programming language. Moreover, the compact model representation makes it possible to apply non-simulative solution techniques while preserving the possible use of stochastic simulation. Illustrative examples of transition class representations are given for an enzyme-catalyzed substrate conversion and part of the bacteriophage λ lysis/lysogeny pathway.
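
A minimal sketch of the idea, assuming a transition class is represented as a propensity function paired with a state-update vector, applied to the enzyme-catalyzed conversion E + S ⇌ ES → E + P with illustrative rate constants and simulated with Gillespie's algorithm.

```python
# Transition-class sketch for E + S <-> ES -> E + P; each class is a propensity
# function plus a state-update vector (rate constants are illustrative only).
import numpy as np

# State vector: (E, S, ES, P)
transition_classes = [
    (lambda x: 0.01 * x[0] * x[1], np.array([-1, -1, +1, 0])),  # binding
    (lambda x: 0.1  * x[2],        np.array([+1, +1, -1, 0])),  # unbinding
    (lambda x: 0.05 * x[2],        np.array([+1, 0, -1, +1])),  # conversion
]

def gillespie(x0, t_end, classes, seed=1):
    rng = np.random.default_rng(seed)
    t, x, path = 0.0, np.array(x0, dtype=int), [(0.0, tuple(x0))]
    while t < t_end:
        rates = np.array([a(x) for a, _ in classes])
        total = rates.sum()
        if total == 0:
            break
        t += rng.exponential(1.0 / total)
        k = rng.choice(len(classes), p=rates / total)
        x = x + classes[k][1]
        path.append((t, tuple(x)))
    return path

trajectory = gillespie((100, 500, 0, 0), t_end=50.0, classes=transition_classes)
print(trajectory[-1])  # final time and (E, S, ES, P) copy numbers
```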

Experimental Investigation of Drying Behavior of Rosehip in a Cyclone-Type Dryer

This paper describes an experimental investigation of the drying behavior and conditions of rosehip in a convective cyclone-type dryer. Drying experiments were conducted at air inlet temperatures of 50, 60, and 70 °C and air velocities of 0.5, 1, and 1.5 m/s. The parameter values obtained from the experiments were fitted to the Newton mathematical model, and the fitted Newton model showed good agreement with the experimental data. It was concluded that (i) temperature has the major effect on the drying process, (ii) air velocity has little effect on the drying of rosehip, and (iii) the vitamin C content changes with temperature, moisture, drying time, and flow type; the change ratio was found to be in the range 0.70-0.74.
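
A minimal sketch of fitting the Newton (Lewis) thin-layer drying model MR = exp(-k t) to moisture-ratio data; the values below are illustrative, not the measured rosehip data.

```python
# Fit the Newton (Lewis) thin-layer drying model MR = exp(-k t) to illustrative data.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0, 60, 120, 180, 240, 300, 360], dtype=float)   # drying time, min
mr = np.array([1.0, 0.72, 0.53, 0.38, 0.28, 0.21, 0.15])      # moisture ratio

def newton_model(t, k):
    return np.exp(-k * t)

(k_hat,), _ = curve_fit(newton_model, t, mr, p0=[0.005])
residuals = mr - newton_model(t, k_hat)
r_squared = 1 - np.sum(residuals**2) / np.sum((mr - mr.mean())**2)
print(f"k = {k_hat:.4f} 1/min, R^2 = {r_squared:.3f}")
```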

A Decision Boundary based Discretization Technique using Resampling

Many supervised induction algorithms require discrete data, even though real data often comes in both discrete and continuous formats. Quality discretization of continuous attributes is an important problem that affects the speed, accuracy, and understandability of the induction models. Usually, discretization and other types of statistical processing are applied to subsets of the population, as the entire population is practically inaccessible. For this reason we argue that a discretization performed on a sample of the population is only an estimate for the entire population. Most existing discretization methods partition the attribute range into two or more intervals using a single cut point or a set of cut points. In this paper, we introduce a technique that uses resampling (such as the bootstrap) to generate a set of candidate discretization points, thereby improving the discretization quality by providing a better estimate with respect to the entire population. The goal of this paper is thus to observe whether the resampling technique can lead to better discretization points, which opens up a new paradigm for the construction of soft decision trees.
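
A minimal sketch of the resampling idea, assuming a simple one-split entropy criterion per bootstrap sample (the paper's decision-boundary criterion is more elaborate); the data are synthetic.

```python
# Bootstrap resampling to generate candidate cut points for one continuous attribute.
import numpy as np

def entropy(y):
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def best_cut(x, y):
    """Single cut point minimizing the weighted class entropy of the two sides."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    cuts = (x[:-1] + x[1:]) / 2
    def split_entropy(c):
        left, right = y[x <= c], y[x > c]
        return (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
    return min(cuts, key=split_entropy)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
y = np.concatenate([np.zeros(100), np.ones(100)])

candidates = []
for _ in range(200):                       # bootstrap resamples
    idx = rng.integers(0, len(x), len(x))
    candidates.append(best_cut(x[idx], y[idx]))

print(np.median(candidates))               # aggregated estimate of the cut point
```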

AGENTMAP: A Conceptual Meta-Model of Interacting Simulations

A straightforward and intuitive combination of single simulations into an aggregated master simulation is not trivial: many problems trigger specific difficulties during the modeling and execution of such a simulation. In this paper we identify these problems and aim to solve them by mapping the task to the field of multi-agent systems. The solution is a new meta-model named AGENTMAP, which is able to mitigate most of the problems while supporting intuitive modeling at the same time. This meta-model is introduced and explained on the basis of an example from the e-commerce domain.

Ratio-Dependent Food Chain Models with Three Trophic Levels

In this paper we study a food chain model with three trophic levels and a Michaelis-Menten type ratio-dependent functional response. A distinctive feature of this model is the sensitive dependence of its dynamical behavior on the initial populations and on the real-world parameters. The stability of the equilibrium points is also investigated.
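
For illustration, the sketch below integrates one common Michaelis-Menten ratio-dependent three-trophic-level formulation; the parameter values and exact functional form shown are assumptions, not necessarily those of the paper.

```python
# One common ratio-dependent three-species food chain, integrated numerically
# (parameter values are illustrative only).
import numpy as np
from scipy.integrate import solve_ivp

a, K = 1.0, 10.0                        # prey growth rate, carrying capacity
c1, m1, d1, e1 = 1.0, 1.0, 0.4, 0.8     # prey -> middle predator
c2, m2, d2, e2 = 0.8, 1.0, 0.3, 0.6     # middle predator -> top predator
eps = 1e-12                             # avoid 0/0 at the origin

def food_chain(t, u):
    x, y, z = u
    f1 = x * y / (x + m1 * y + eps)     # ratio-dependent functional response
    f2 = y * z / (y + m2 * z + eps)
    dx = a * x * (1 - x / K) - c1 * f1
    dy = -d1 * y + e1 * f1 - c2 * f2
    dz = -d2 * z + e2 * f2
    return [dx, dy, dz]

sol = solve_ivp(food_chain, (0, 200), [5.0, 2.0, 1.0])
print(sol.y[:, -1])   # populations at t = 200; varying the initial state
                      # exhibits the sensitive dependence noted in the abstract
```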

Validation and Selection between Machine Learning Technique and Traditional Methods to Reduce Bullwhip Effects: A Data Mining Approach

The aim of this paper is to present a three-step methodology for forecasting supply chain demand. In the first step, various data mining techniques are applied to prepare the data for entry into the forecasting models. In the second step, the modeling step, an artificial neural network and a support vector machine are presented, after defining the Mean Absolute Percentage Error (MAPE) index for measuring error. The structure of the artificial neural network is selected based on previous researchers' results, and in this article the accuracy of the network is increased using sensitivity analysis. The best forecast among the classical forecasting methods (Moving Average, Exponential Smoothing, and Exponential Smoothing with Trend) is obtained from the prepared data and is compared with the results of the support vector machine and the proposed artificial neural network. The results show that the artificial neural network forecasts more precisely than the other methods. Finally, the stability of the forecasting methods is analyzed using the raw data, and the effectiveness of the clustering analysis is measured.
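
A minimal sketch of the comparison step, assuming simple exponential smoothing as the classical baseline and a small multilayer perceptron on lagged demand, scored with MAPE; the demand series and network settings are synthetic placeholders.

```python
# Compare a classical forecast with a small neural network on synthetic demand, by MAPE.
import numpy as np
from sklearn.neural_network import MLPRegressor

def mape(actual, forecast):
    return 100 * np.mean(np.abs((actual - forecast) / actual))

rng = np.random.default_rng(0)
t = np.arange(200)
demand = 50 + 0.1 * t + 5 * np.sin(t / 6) + rng.normal(0, 1.5, len(t))

# Classical baseline: simple exponential smoothing (alpha chosen arbitrarily).
alpha, ses = 0.3, [demand[0]]
for d in demand[:-1]:
    ses.append(alpha * d + (1 - alpha) * ses[-1])
ses = np.array(ses)

# ANN: predict demand from the three previous observations.
lags, split = 3, 150
X = np.column_stack([demand[i:len(demand) - lags + i] for i in range(lags)])
y = demand[lags:]
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
ann.fit(X[:split], y[:split])

print("SES MAPE:", round(mape(demand[lags + split:], ses[lags + split:]), 2))
print("ANN MAPE:", round(mape(y[split:], ann.predict(X[split:])), 2))
```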

The Use of Artificial Neural Network in Option Pricing: The Case of S&P 100 Index Options

Due to the increasing and varying risks that economic units face, derivative instruments have gained substantial importance, and trading volumes of derivatives have reached very significant levels. In parallel with these high trading volumes, researchers have developed many different models, some parametric and some nonparametric. In this study, the aim is to analyse the success of artificial neural networks in the pricing of options using S&P 100 index options data. Previous studies generally cover data on European-type call options; this study includes not only European call options but also American call and put options and European put options. Three data sets are used to build three different ANN models. One includes only data that are directly observed from the economic environment, i.e. strike price, spot price, interest rate, maturity, and type of contract. The others include an extra input that is not observable data but a parameter, namely volatility. With these detailed data, the performance of the ANN along the put/call, American/European, and moneyness dimensions is analyzed, and whether including volatility as an input to the neural network improves prediction performance is examined. The most striking results revealed by the study are that the ANN performs better when pricing call options than put options, and that the use of the volatility parameter as an input does not improve performance.
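
A minimal sketch of the with/without-volatility comparison using a small MLP; the training prices here are synthetic Black-Scholes call values used only as a stand-in for the S&P 100 market data, so (unlike the paper's market-data finding) volatility necessarily helps on this synthetic set.

```python
# MLP option-pricing sketch with and without volatility as an input (synthetic data).
import numpy as np
from scipy.stats import norm
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

def bs_call(S, K, r, T, sigma):
    """Black-Scholes European call price, used only to generate stand-in data."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

rng = np.random.default_rng(0)
n = 4000
S, K = rng.uniform(80, 120, n), rng.uniform(80, 120, n)
r, T = rng.uniform(0.01, 0.06, n), rng.uniform(0.05, 1.0, n)
sigma = rng.uniform(0.1, 0.5, n)
price = bs_call(S, K, r, T, sigma)

X_obs = np.column_stack([S, K, r, T])          # directly observable inputs
X_vol = np.column_stack([S, K, r, T, sigma])   # plus the volatility parameter

for name, X in [("without volatility", X_obs), ("with volatility", X_vol)]:
    Xtr, Xte, ytr, yte = train_test_split(X, price, random_state=0)
    net = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(32, 32),
                                     max_iter=2000, random_state=0))
    net.fit(Xtr, ytr)
    rmse = np.sqrt(np.mean((net.predict(Xte) - yte) ** 2))
    print(f"{name}: test RMSE = {rmse:.3f}")
```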

Investigation of Inert Gas Injection in Steam Reforming of Methane: Energy

The manufacture of synthesis gas by steam reforming of hydrocarbons is an important industrial process. The highly endothermic nature of the process makes it one of the most cost- and heat-intensive processes. In the present work, the composite effect of different inert gases on synthesis gas yield, feed gas conversion, and temperature distribution along the reactor length has been studied using a heterogeneous model. The mathematical model was developed as a first stage and validated against existing process models. With the addition of inert gases, a higher yield of synthesis gas is observed, while the reactor outlet temperature drops to as low as 810 K. It was found that xenon gives the highest yield and conversion while helium gives the lowest temperature. With xenon as the inert gas, a 20 percent reduction in outlet temperature was observed compared to the traditional case.
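
As a toy illustration of why inert dilution raises equilibrium methane conversion at fixed total pressure (the mole number rises on the product side), the sketch below solves a lumped equilibrium balance with an assumed Kp; the paper's heterogeneous reactor model is far more detailed than this calculation.

```python
# Toy equilibrium sketch for CH4 + H2O <-> CO + 3 H2 showing the effect of inert
# dilution at fixed total pressure (Kp and feed values are assumed, not the paper's).
from scipy.optimize import brentq

def conversion(Kp, steam_ratio=3.0, n_inert=0.0, P=1.0):
    """Equilibrium extent x per mole CH4 fed (basis: 1 mol CH4)."""
    def residual(x):
        n_tot = 1 + steam_ratio + n_inert + 2 * x
        q = (x * (3 * x) ** 3) / ((1 - x) * (steam_ratio - x)) * (P / n_tot) ** 2
        return q - Kp
    return brentq(residual, 1e-9, 1 - 1e-9)

Kp_assumed = 5.0     # hypothetical equilibrium constant at the operating temperature
for n_inert in [0.0, 1.0, 2.0, 4.0]:
    print(n_inert, round(conversion(Kp_assumed, n_inert=n_inert), 3))
# Conversion increases monotonically with the amount of inert added.
```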

Neuro-fuzzy Model and Regression Model: A Comparison Study of MRR in Electrical Discharge Machining of D2 Tool Steel

In the current research, a neuro-fuzzy model and a regression model were developed to predict the Material Removal Rate (MRR) in the Electrical Discharge Machining of AISI D2 tool steel with a copper electrode. Extensive experiments were conducted with various levels of discharge current, pulse duration, and duty cycle. The experimental data were split into two sets, one for training and the other for validation. The training data were used to develop the models, and the test data, which had not been used to develop the models, were used for their validation. Subsequently, the models were compared. The predicted and experimental results were in good agreement, with coefficients of correlation of 0.999 and 0.974 for the neuro-fuzzy and regression models, respectively.
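
A minimal sketch of the regression side of such a comparison, fitting a linear model of MRR against discharge current, pulse duration, and duty cycle; the response values and the linear form are placeholders, not the experimental data or the paper's exact regression model.

```python
# Linear regression sketch of MRR vs. machining parameters (placeholder data).
import numpy as np
from sklearn.linear_model import LinearRegression

# Columns: discharge current (A), pulse-on time (us), duty cycle (%)
X = np.array([[8, 100, 40], [8, 200, 60], [16, 100, 60],
              [16, 200, 40], [24, 100, 40], [24, 200, 60],
              [8, 150, 50], [16, 150, 40], [24, 150, 60]], dtype=float)
mrr = np.array([4.1, 5.0, 9.8, 8.7, 15.2, 18.9, 4.6, 9.1, 17.4])  # mm^3/min

reg = LinearRegression().fit(X, mrr)
print("R^2 on training data:", round(reg.score(X, mrr), 3))
print("Predicted MRR at 20 A, 150 us, 50 %:",
      round(reg.predict([[20, 150, 50]])[0], 2))
```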

Application of a Systemic Soft Domain-Driven Design Framework

This paper proposes a “soft systems” approach to the domain-driven design of computer-based information systems. We propose a systemic framework combining techniques from Soft Systems Methodology (SSM), the Unified Modelling Language (UML), and an implementation pattern known as “Naked Objects”. We have used this framework in action research projects that have involved the investigation and modelling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. Within the proposed framework, SSM is used as a guiding methodology to explore the problem situation and to generate a ubiquitous language (soft language) which can serve as the basis for developing an object-oriented domain model. The domain model is further developed using techniques based on the UML and is implemented in software following the “Naked Objects” implementation pattern. We argue that there are advantages in combining and using techniques from different methodologies in this way. The proposed systemic framework is overviewed and justified as a multimethodology using Mingers' multimethodology ideas. This multimethodology approach is being evaluated through a series of action research projects based on real-world case studies. A peer-tutoring case study is presented here as a sample of the framework evaluation process.

Assessment of the Channel Unavailability Effect on Wireless Network Teletraffic Modeling and Analysis

Although cellular wireless communication systems are subject to short- and long-term fading, the effect of the wireless channel has largely been ignored in most teletraffic assessment research. In this paper, a mathematical teletraffic model is proposed to estimate the blocking and forced termination probabilities of cellular wireless networks as a result of teletraffic behavior as well as outages of the propagation channel. To evaluate the proposed teletraffic model, gamma inter-arrival and general service time distributions are considered based on the wireless channel fading effect. The performance is evaluated and compared with the classical model, and the proposed model is investigated under different operational conditions. These conditions consider not only the arrival process but also different faded channel models.
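
As a simplified classical baseline (not the paper's gamma/general-service model), the sketch below shows how channel unavailability can be folded into an Erlang-B blocking calculation by reducing the number of usable channels.

```python
# Erlang-B blocking with the usable channel count reduced by an outage probability.
import math

def erlang_b(offered_load, channels):
    """Erlang-B blocking probability via the standard recursion."""
    b = 1.0
    for n in range(1, channels + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

channels, load = 30, 20.0          # channel pool and offered traffic (Erlangs)
for p_outage in [0.0, 0.1, 0.2]:
    usable = math.floor(channels * (1 - p_outage))
    print(f"outage={p_outage:.1f}  blocking={erlang_b(load, usable):.4f}")
```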

Evaluation of Newly Developed Dot-ELISA Test for Identification of Naja naja sumatrana and Calloselasma rhodostoma Venom Antigens

Snake bite cases in Malaysia most often involve the species Naja naja sumatrana and Calloselasma rhodostoma. In keeping with the need for a rapid snake venom detection kit in a clinical setting, plate and dot-ELISA tests for the venoms of Naja naja sumatrana, Calloselasma rhodostoma, and the cobra venom fraction V antigen were developed. Polyclonal antibodies were raised and used to prepare the reagents for the dot-ELISA test kit, which was tested in mouse, rabbit, and virtual human models. The newly developed dot-ELISA kit was able to detect a minimum venom concentration of 244 ng/ml with cross-reactivity of one antibody type. The dot-ELISA system was sensitive and specific for all three snake venom types in all tested animal models. The lowest minimum detectable venom concentration, 244 ng/ml, was in the rabbit model for the cobra venom fraction V antigen; the highest minimum venom concentration, 1953 ng/ml, was in mice against a multitude of venoms. The developed dot-ELISA system for the detection of the three snake venom types was successful, with a sensitivity of 95.8% and a specificity of 97.9%.
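
Sensitivity and specificity are computed from assay outcome counts as in this minimal sketch; the counts below are invented for illustration, not the study's data.

```python
# Sensitivity and specificity from a confusion matrix (invented counts).
true_pos, false_neg = 95, 5      # venom present: detected / missed
true_neg, false_pos = 96, 4      # venom absent: correctly negative / false alarm

sensitivity = true_pos / (true_pos + false_neg)
specificity = true_neg / (true_neg + false_pos)
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
```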

A Comparison of the Nonparametric Regression Models using Smoothing Spline and Kernel Regression

This paper studies the use of nonparametric regression models for Gross National Product data from Turkey and the Stanford heart transplant data. Two nonparametric techniques, smoothing spline and kernel regression, are discussed. The main goal is to compare the prediction performance of these nonparametric regression techniques. According to the results of the numerical studies, the smoothing spline regression estimators perform better than the kernel regression estimators.
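
A minimal sketch of such a comparison on synthetic data, using a smoothing spline and a Nadaraya-Watson kernel estimator; the paper's GNP and heart transplant analyses are not reproduced.

```python
# Smoothing spline vs. Nadaraya-Watson kernel regression on synthetic data.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 120))
y = np.sin(x) + 0.3 * rng.normal(size=x.size)

# Smoothing spline: s controls the roughness penalty (value chosen by hand here).
spline = UnivariateSpline(x, y, s=len(x) * 0.3**2)

def nadaraya_watson(x0, x, y, h=0.5):
    """Kernel regression estimate at points x0 with a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

grid = np.linspace(0, 10, 200)
truth = np.sin(grid)
print("spline MSE:", round(float(np.mean((spline(grid) - truth) ** 2)), 4))
print("kernel MSE:", round(float(np.mean((nadaraya_watson(grid, x, y) - truth) ** 2)), 4))
```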

An Investigation into the Role of Market Beta in Asset Pricing: Evidence from the Romanian Stock Market

In this paper, we apply the FM methodology to the cross-section of Romanian-listed common stocks and investigate the explanatory power of market beta on the cross-section of common stock returns from the Bucharest Stock Exchange. Various assumptions are empirically tested, such as linearity, market efficiency, the “no systematic effect of non-beta risk” hypothesis, and the positive expected risk-return trade-off hypothesis. We find that the Romanian stock market shows the same properties as other emerging markets in terms of efficiency and the significance of the linear risk-return models. Our analysis covers weekly returns from January 2002 until May 2010, and portfolio formation, estimation, and testing were performed in a rolling manner using 51 observations (one year) for each stage of the analysis.
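
Assuming the FM methodology refers to the Fama-MacBeth two-pass procedure, the sketch below illustrates it on synthetic weekly returns: time-series beta estimation followed by period-by-period cross-sectional regressions (the paper adds rolling portfolio formation over 51-week windows).

```python
# Two-pass Fama-MacBeth sketch on synthetic weekly returns.
import numpy as np

rng = np.random.default_rng(0)
n_stocks, n_weeks = 25, 200
market = rng.normal(0.002, 0.03, n_weeks)
true_beta = rng.uniform(0.5, 1.5, n_stocks)
returns = true_beta[None, :] * market[:, None] + rng.normal(0, 0.04, (n_weeks, n_stocks))

# Pass 1: estimate each stock's beta from a time-series regression on the market.
betas = np.array([np.polyfit(market, returns[:, i], 1)[0] for i in range(n_stocks)])

# Pass 2: cross-sectional regression of returns on beta each week, then average slopes.
gammas = np.array([np.polyfit(betas, returns[t, :], 1)[0] for t in range(n_weeks)])
t_stat = gammas.mean() / (gammas.std(ddof=1) / np.sqrt(n_weeks))
print(f"mean risk premium = {gammas.mean():.5f}, t-statistic = {t_stat:.2f}")
```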

Unsupervised Texture Classification and Segmentation

An unsupervised classification algorithm is derived by modeling observed data as a mixture of several mutually exclusive classes that are each described by linear combinations of independent non-Gaussian densities. The algorithm estimates the data density in each class by using parametric nonlinear functions that fit to the non-Gaussian structure of the data. This improves classification accuracy compared with standard Gaussian mixture models. When applied to textures, the algorithm can learn basis functions for images that capture the statistically significant structure intrinsic in the images. We apply this technique to the problem of unsupervised texture classification and segmentation.
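
A simplified stand-in for the approach: per-class ICA models with a Laplacian source prior, scored by log-likelihood on synthetic data. The paper fits parametric nonlinear densities to the non-Gaussian structure; this sketch only conveys the likelihood-based class assignment.

```python
# Likelihood-based class assignment with per-class ICA models (synthetic data).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
dim = 8
mix_a, mix_b = rng.normal(size=(dim, dim)), rng.normal(size=(dim, dim))
train_a = rng.laplace(size=(500, dim)) @ mix_a
train_b = rng.laplace(size=(500, dim)) @ mix_b
test = np.vstack([rng.laplace(size=(50, dim)) @ mix_a,
                  rng.laplace(size=(50, dim)) @ mix_b])

def fit_class_model(X):
    return FastICA(n_components=X.shape[1], random_state=0, max_iter=1000).fit(X)

def log_likelihood(ica, X):
    # log p(x) = sum_i log p(s_i) + log|det W|, with a Laplacian prior on the sources.
    S = ica.transform(X)
    log_det = np.linalg.slogdet(ica.components_)[1]
    return (-np.abs(S) - np.log(2)).sum(axis=1) + log_det

models = [fit_class_model(train_a), fit_class_model(train_b)]
scores = np.column_stack([log_likelihood(m, test) for m in models])
pred = scores.argmax(axis=1)
print("accuracy:", (pred == np.repeat([0, 1], 50)).mean())
```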

Agent-Based Simulation and Analysis of Network-Centric Air Defense Missile Systems

Network-Centric Air Defense Missile Systems (NCADMS) represent a major advance in air defense missile systems and have been regarded as one of the major research issues in the military domain at present. Due to the lack of knowledge of and experience with NCADMS, modeling and simulation is an effective approach for operational analysis compared with equation-based approaches. However, the complex dynamic interactions among entities and the flexible architectures of NCADMS put forward new requirements and challenges for the simulation framework and models. Agent-based simulation (ABS) explicitly addresses the modeling of the behaviors of heterogeneous individuals. Agents have the capability to sense and understand things, make decisions, and act on the environment; they can also cooperate with others dynamically to perform the tasks assigned to them. ABS thus proves to be an effective approach for exploring the new operational characteristics emerging in NCADMS. In this paper, based on an analysis of the network-centric architecture and new cooperative engagement strategies for NCADMS, an agent-based simulation framework was designed by expanding the simulation framework of the so-called System Effectiveness Analysis Simulation (SEAS). The simulation framework specifies the components, the relationships and interactions between them, and the structure and behavior rules of an agent in NCADMS. Based on scenario simulations, information and decision superiority and operational advantages in NCADMS were analyzed, and some suggestions were provided for its future development.
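
A minimal sense-decide-act sketch of cooperating agents (a sensor agent sharing tracks and launcher agents claiming assignments); this is purely illustrative and far simpler than the SEAS-based framework described.

```python
# Minimal sense-decide-act loop for cooperating defense agents (illustrative only).
class Blackboard:
    """Shared situation picture the agents read from and write to."""
    def __init__(self):
        self.tracks, self.assignments = [], {}

class SensorAgent:
    def __init__(self, detections):
        self.detections = detections
    def step(self, board):
        board.tracks = list(self.detections)          # sense, then share

class LauncherAgent:
    def __init__(self, name, capacity):
        self.name, self.capacity = name, capacity
    def step(self, board):
        for track in board.tracks:                    # decide, then act
            if track not in board.assignments and self.capacity > 0:
                board.assignments[track] = self.name
                self.capacity -= 1

board = Blackboard()
agents = [SensorAgent(["T1", "T2", "T3"]),
          LauncherAgent("L1", 2), LauncherAgent("L2", 2)]
for agent in agents:                                  # one simulation tick
    agent.step(board)
print(board.assignments)   # {'T1': 'L1', 'T2': 'L1', 'T3': 'L2'}
```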

Defining a Semantic Web-based Framework for Enabling Automatic Reasoning on CIM-based Management Platforms

CIM is the standard formalism for modeling management information, developed by the Distributed Management Task Force (DMTF) in the context of its WBEM proposal and designed to provide a conceptual view of the managed environment. In this paper, we propose the inclusion of formal knowledge representation techniques, based on Description Logics (DLs) and the Web Ontology Language (OWL), in CIM-based conceptual modeling, and we examine the benefits of such a decision. The proposal is specified as a mapping at the CIM metamodel level to a highly expressive subset of DLs capable of capturing all the semantics of the models. The paper shows how the proposed mapping provides CIM diagrams with precise semantics and can be used for automatic reasoning about management information models, as a design aid, by means of new-generation CASE tools, thanks to the use of state-of-the-art automatic reasoning systems that support the proposed logic and use algorithms that are sound and complete with respect to the semantics. Such a CASE tool framework has been developed by the authors, and its architecture is also introduced. The proposed formalization is useful not only at design time but also at run time, through the use of rational autonomous agents, in response to a need recently recognized by the DMTF.
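
A minimal sketch of mapping a CIM class fragment to OWL with rdflib; the namespace, class, and property names are hypothetical, and the paper's mapping targets a much more expressive DL fragment than this toy example.

```python
# Toy mapping of a CIM class fragment to OWL using rdflib (hypothetical names).
from rdflib import Graph, Namespace, RDF, RDFS
from rdflib.namespace import OWL, XSD

CIM = Namespace("http://example.org/cim#")   # hypothetical namespace
g = Graph()
g.bind("cim", CIM)

# A managed-element class hierarchy with one datatype property.
g.add((CIM.ManagedElement, RDF.type, OWL.Class))
g.add((CIM.LogicalDevice, RDF.type, OWL.Class))
g.add((CIM.LogicalDevice, RDFS.subClassOf, CIM.ManagedElement))

g.add((CIM.deviceID, RDF.type, OWL.DatatypeProperty))
g.add((CIM.deviceID, RDFS.domain, CIM.LogicalDevice))
g.add((CIM.deviceID, RDFS.range, XSD.string))

print(g.serialize(format="turtle"))
```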