A Linear Use Case Based Software Cost Estimation Model

Software development is moving towards agility, with use cases and scenarios being used as requirements stories. Estimates of software cost are becoming even more important than before, since the effect of delays is much larger in the context of the successive short releases of agile development. This paper therefore reports on the development of a new linear use case based software cost estimation model that is applicable in the very early stages of software development and is based on a simple metric. Evaluation showed that the accuracy of the estimates varies between 43% and 55% of the actual effort of historical test projects. These results outperformed those of well-known models when applied in the same context. Further work is being carried out to improve the performance of the proposed model when considering the effect of non-functional requirements.
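
As a rough illustration of how such a linear model can be calibrated and applied, the sketch below fits effort against a simple use-case-based size metric on a handful of historical projects; the metric, the example data and the resulting coefficients are hypothetical assumptions for illustration, not the model reported in the paper.

```python
# Minimal sketch of a linear, use-case-based effort model fitted to historical
# projects. The size metric (use cases plus half-weighted scenarios) and the
# project data are illustrative assumptions.
import numpy as np

# Hypothetical historical projects: use cases, scenarios, actual effort (person-hours).
history = np.array([
    [12.0, 30.0,  950.0],
    [20.0, 55.0, 1700.0],
    [35.0, 90.0, 3100.0],
    [ 8.0, 18.0,  600.0],
])

size = history[:, 0] + 0.5 * history[:, 1]        # assumed simple size metric
effort = history[:, 2]

# Fit effort = a * size + b by ordinary least squares.
A = np.vstack([size, np.ones_like(size)]).T
(a, b), *_ = np.linalg.lstsq(A, effort, rcond=None)

new_size = 25 + 0.5 * 60                          # a new project: 25 use cases, 60 scenarios
print(f"Estimated effort: {a * new_size + b:.0f} person-hours")
```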

Micro-aerobic, Anaerobic and Two-stage Conditions for Ethanol Production by Enterobacter aerogenes from Biodiesel-derived Crude Glycerol

The microbial production of ethanol from biodiesel-derived crude glycerol by Enterobacter aerogenes TISTR1468, under micro-aerobic and anaerobic conditions, was investigated. The experimental results showed that micro-aerobic conditions were more favorable for cellular growth (4.0 g/L DCW), ethanol production (20.7 g/L) and ethanol yield (0.47 g/g glycerol) than anaerobic conditions (1.2 g/L DCW, 6.3 g/L ethanol and 0.72 g/g glycerol, respectively). Crude glycerol (100 g/L) was consumed completely at a rate of 1.80 g/L/h. Two-stage fermentation (a combination of the micro-aerobic and anaerobic conditions) exhibited higher ethanol production (24.5 g/L) than one-stage fermentation (either micro-aerobic or anaerobic). The two-stage configuration also exhibited a slightly higher crude glycerol consumption rate (1.81 g/L/h) and ethanol yield (0.56 g/g) than the one-stage configuration. Therefore, the two-stage process was selected for ethanol production by E. aerogenes TISTR1468 in scale-up studies.

A Compact Pi Network for Reducing Bit Error Rate in Dispersive FIR Channel Noise Model

During signal transmission, the combined effect of the transmitter filter, the transmission medium, and additive white Gaussian noise (AWGN) is included in the channel, which distorts and adds noise to the signal. This causes the well-defined signal constellation to spread, leading to errors in bit detection. A compact pi neural network with a minimum number of nodes is proposed. The replacement of summation at each node by multiplication results in a more powerful mapping. The resulting pi network is tested on six different channels.
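
To make the distinction concrete, the sketch below contrasts a conventional sigma (summation) node with a pi (product) node in which the weighted inputs are multiplied before the activation; the exact node structure and training of the compact pi network are not reproduced here.

```python
# Minimal sketch contrasting a sigma (sum) node with a pi (product) node,
# where the weighted inputs are multiplied instead of summed before the
# activation. Illustrative only; not the paper's compact pi network.
import numpy as np

def sigma_node(x, w, b):
    """Conventional neuron: activation of the weighted sum."""
    return np.tanh(np.dot(w, x) + b)

def pi_node(x, w, b):
    """Pi (product) neuron: activation of the product of weighted inputs."""
    return np.tanh(np.prod(w * x) + b)

x = np.array([0.8, -0.3, 0.5])   # e.g. received samples from the dispersive channel
w = np.array([1.2, 0.7, -0.9])
print(sigma_node(x, w, 0.1), pi_node(x, w, 0.1))
```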

Quality of Service Evaluation using a Combination of Fuzzy C-Means and Regression Model

In this study, a network quality of service (QoS) evaluation system was proposed. The system used a combination of fuzzy C-means (FCM) and a regression model to analyse and assess the QoS in a simulated network. Network QoS parameters of multimedia applications were intelligently analysed by the FCM clustering algorithm. The QoS parameters for each FCM cluster centre were then input to a regression model in order to quantify the overall QoS. The proposed QoS evaluation system provided valuable information about the network's QoS patterns and, based on this information, the overall network QoS was effectively quantified.
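
A minimal sketch of this FCM-plus-regression pipeline is given below, assuming synthetic QoS samples (delay, jitter, loss), an illustrative cluster count and hypothetical per-cluster QoS scores; it shows the flow from clustering to regression rather than the paper's actual system.

```python
# QoS samples (delay, jitter, loss) are clustered with a small fuzzy C-means
# implementation, and the cluster centres are fed to a linear regression that
# maps them to an overall QoS score. Data and scores are illustrative.
import numpy as np

def fuzzy_cmeans(X, c=3, m=2.0, iters=100, seed=0):
    """Return (centres, memberships) for samples X of shape (N, d)."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)
    for _ in range(iters):
        Um = U ** m
        centres = (Um @ X) / Um.sum(axis=1, keepdims=True)
        dist = np.linalg.norm(X[None, :, :] - centres[:, None, :], axis=2) + 1e-12
        U = 1.0 / (dist ** (2 / (m - 1)))
        U /= U.sum(axis=0)
    return centres, U

# Simulated QoS samples: columns are delay (ms), jitter (ms), loss (%).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc, 1.0, (50, 3))
               for loc in ([20, 2, 0.1], [60, 8, 1.0], [120, 20, 5.0])])

centres, _ = fuzzy_cmeans(X, c=3)

# Hypothetical overall-QoS scores assigned to the cluster centres (e.g. expert ratings).
scores = np.array([0.9, 0.6, 0.2])

# Fit a linear model score = w . centre + b on the cluster centres.
A = np.hstack([centres, np.ones((len(centres), 1))])
coeffs, *_ = np.linalg.lstsq(A, scores, rcond=None)

sample = np.array([45.0, 5.0, 0.5, 1.0])          # new measurement plus bias term
print("Predicted overall QoS:", float(sample @ coeffs))
```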

A New Predictor of Coding Regions in Genomic Sequences using a Combination of Different Approaches

Identifying protein coding regions in DNA sequences is a basic step in the location of genes. Several approaches based on signal processing tools have been applied to this problem in an attempt to achieve more accurate predictions. This paper presents a new predictor that improves the efficacy of three techniques that use the Fourier Transform to predict coding regions, and that can be computed using an algorithm that reduces the computational load. Some ideas about combining the predictor with other methods are discussed. ROC curves, based on the processing of 25 DNA sequences from three different organisms, are used to demonstrate the efficacy of the proposed predictor.
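
One of the classical Fourier-based measures that such predictors build on is the period-3 spectral content of the four base-indicator sequences evaluated in a sliding window. The sketch below illustrates that measure on a toy sequence; it is not the combined predictor proposed in the paper.

```python
# Period-3 spectral content of the four binary indicator sequences, computed
# in a sliding window. Toy sequence and window length are illustrative.
import numpy as np

def period3_measure(seq, window=351, step=3):
    """Return (start, score) per window; score is |X_b(N/3)|^2 summed over bases."""
    seq = seq.upper()
    scores = []
    k = window // 3                               # DFT bin corresponding to period 3
    for start in range(0, len(seq) - window + 1, step):
        win = seq[start:start + window]
        total = 0.0
        for base in "ACGT":
            x = np.array([1.0 if ch == base else 0.0 for ch in win])
            total += abs(np.fft.fft(x)[k]) ** 2
        scores.append((start, total))
    return scores

# Toy sequence: random "intergenic" stretch followed by a period-3 "coding-like" stretch.
rng = np.random.default_rng(0)
seq = "".join(rng.choice(list("ACGT"), 600)) + "GCA" * 200
for start, s in period3_measure(seq)[::50]:
    print(start, round(s, 1))
```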

Expelling Policy Based Buffer Control during Congestion in Differentiated Service Routers

In this paper a special kind of buffer management policy is studied in which packets are preempted even when sufficient space is available in the buffer for incoming packets. This is done to relieve congestion for future incoming packets and to improve QoS for certain types of packets. Such studies have been carried out in the past for ATM-type scenarios. We extend them to heterogeneous traffic, where the data rates and packet sizes vary widely. A typical example of this scenario is buffer management in a Differentiated Service router. Two aspects are of interest. The first is the packet size: whether all packets have the same or different sizes. The second is the value, or space priority, of the packets: whether all packets have the same space priority or different packets have different space priorities. We present two types of policies to achieve QoS goals for packets with different priorities: the push-out scheme and the expelling scheme. In this work the scenario of variable-length packets with two space priorities is considered, and the main goal is to minimize the total weighted packet loss. Simulation and analytical studies show that expelling policies can outperform push-out policies when it comes to offering variable QoS for packets of two different priorities, and that expelling policies also help improve the amount of admissible load. Some further comparisons of push-out and expelling policies are also presented using simulations.
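
The sketch below gives a minimal picture of a push-out admission rule for a two-priority buffer with variable-size packets, together with a crude occupancy threshold used to mimic proactive expelling; the threshold rule is an illustrative assumption, not the expelling policy analyzed in the paper.

```python
# Push-out admission for a two-priority, variable-packet-size buffer: a
# high-priority arrival may evict queued low-priority packets when space is
# insufficient. The expel_threshold that rejects low-priority traffic before
# the buffer is full is only a crude stand-in for an expelling policy.
from collections import deque

class TwoPriorityBuffer:
    def __init__(self, capacity, expel_threshold=None):
        self.capacity = capacity
        self.expel_threshold = expel_threshold    # occupancy limit applied to low priority
        self.queue = deque()                      # items: (priority, size); 0 = high, 1 = low
        self.used = 0

    def _evict_low(self, needed):
        """Drop queued low-priority packets until `needed` bytes are free (push-out)."""
        for pkt in [p for p in self.queue if p[0] == 1]:
            if self.capacity - self.used >= needed:
                break
            self.queue.remove(pkt)
            self.used -= pkt[1]

    def admit(self, priority, size):
        if priority == 0 and self.used + size > self.capacity:
            self._evict_low(size)                 # push-out on behalf of high priority
        if priority == 1 and self.expel_threshold is not None \
                and self.used + size > self.expel_threshold:
            return False                          # proactive rejection of low priority
        if self.used + size > self.capacity:
            return False                          # no room: drop the arrival
        self.queue.append((priority, size))
        self.used += size
        return True

buf = TwoPriorityBuffer(capacity=3000, expel_threshold=2400)
print([buf.admit(p, s) for p, s in [(1, 1500), (1, 1000), (0, 1200), (1, 400)]])
```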

Analysis and Prototyping of Biological Systems: the Abstract Biological Process Model

The aim of a biological model is to understand the integrated structure and behavior of complex biological systems as a function of the underlying molecular networks, in order to simulate and forecast their operation. Although several approaches have been introduced to take into account structural and environment-related features, relatively little attention has been given to representing the behavior of biological systems. The Abstract Biological Process (ABP) model illustrated in this paper is an object-oriented model based on UML (the standard object-oriented modeling language). Its main objective is to bring into focus the functional aspects of the biological system under analysis.

Depth Control of an Autonomous Underwater Vehicle by Neurocontrollers for Enhanced Situational Awareness

This paper focuses on a critical component of situational awareness (SA): the neural control of autonomous constant-depth flight of an autonomous underwater vehicle (AUV). Autonomous constant-depth flight is a challenging but important task for AUVs to achieve a high level of autonomy under adverse conditions. The fundamental requirements for constant-depth flight are knowledge of the depth and a properly designed controller to govern the process. The AUV named VORAM is used as a model for the verification of the proposed hybrid control algorithm. Three neural network controllers, namely NARMA-L2 controllers, are designed for fast and stable diving maneuvers of the chosen AUV model. This hybrid control strategy has been verified by simulation of diving maneuvers using the Simulink software package and demonstrated good performance for fast SA in real-time search-and-rescue operations.
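
For reference, the standard NARMA-L2 control law computes the input from two neural approximations of the plant model, u(k) = (y_ref(k+d) - f_hat) / g_hat. The sketch below shows that computation with placeholder networks; the trained controllers for the VORAM model are not reproduced.

```python
# NARMA-L2 control law sketch: given neural approximations f_hat and g_hat of
#   y(k+d) ~= f[y(k..), u(k-1..)] + g[y(k..), u(k-1..)] * u(k),
# the control input is u(k) = (y_ref(k+d) - f_hat) / g_hat.
# The placeholder "networks" below are illustrative stand-ins only.
def f_hat(past_depths, past_controls):
    # placeholder for the trained f-network (free response from past I/O)
    return 0.9 * past_depths[-1] + 0.05 * past_controls[-1]

def g_hat(past_depths, past_controls):
    # placeholder for the trained g-network (input gain); must stay away from zero
    return 0.12

def narma_l2_control(depth_ref, past_depths, past_controls):
    return (depth_ref - f_hat(past_depths, past_controls)) / g_hat(past_depths, past_controls)

past_depths, past_controls = [10.0, 10.4], [0.0, 0.2]
u = narma_l2_control(depth_ref=15.0, past_depths=past_depths, past_controls=past_controls)
print(f"Commanded dive-plane input: {u:.2f}")
```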

Balancing of Quad Tree using Point Pattern Analysis

The point quad tree is considered one of the most common data organizations for dealing with spatial data and can be used to increase the efficiency of searching for point features. Since the efficiency of the search depends on the height of the tree, arbitrary insertion of point features may leave the tree unbalanced and lead to longer search times. This paper presents an algorithm for building a nearly balanced quad tree. A point pattern analysis technique has been applied for this purpose, which yields a significant enhancement of performance; the results are also included in the paper for the sake of completeness.
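
The sketch below illustrates why insertion order matters for a point quad tree: the same points inserted in arbitrary order versus a centroid-first order produce very different heights. The centroid-first heuristic is only an illustration, not the point pattern analysis technique proposed in the paper.

```python
# Point quad tree built with two insertion orders: arbitrary order versus a
# centroid-first order that tends to keep the tree shallow. Illustrative only.
class Node:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.child = {}                           # quadrant label -> Node

def insert(root, x, y):
    if root is None:
        return Node(x, y)
    quad = ("N" if y >= root.y else "S") + ("E" if x >= root.x else "W")
    root.child[quad] = insert(root.child.get(quad), x, y)
    return root

def height(node):
    return 0 if node is None else 1 + max([height(c) for c in node.child.values()] or [0])

def build(points):
    root = None
    for x, y in points:
        root = insert(root, x, y)
    return root

def centroid_first(points):
    """Recursively insert the point closest to the centroid of each region first."""
    if not points:
        return []
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    pivot = min(points, key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)
    quads = {"NE": [], "NW": [], "SW": [], "SE": []}
    for x, y in (p for p in points if p != pivot):
        quads[("N" if y >= pivot[1] else "S") + ("E" if x >= pivot[0] else "W")].append((x, y))
    out = [pivot]
    for q in quads.values():
        out += centroid_first(q)
    return out

pts = [(i, i * i % 17) for i in range(40)]        # skewed insertion pattern
print("arbitrary order height:", height(build(pts)))
print("centroid-first height: ", height(build(centroid_first(pts))))
```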

Project Portfolio Management Phases: A Technique for Strategy Alignment

This paper seeks to give a general idea of the universe of project portfolio management, from its multidisciplinary nature to the many challenges it raises, passing through the different techniques, models and tools used to solve the many known problems. It is intended to contribute, in depth, to the clarification of the impacts and relationships involved in managing the project portfolio. It proposes a technique for aligning projects with the organisational strategy, in order to select the projects that will later be considered in the analysis and selection of the portfolio. We consider the development of a methodology for assessing a project alignment index to be very relevant in the global market scenario. It can help organisations gain a greater awareness of market dynamics, speed up the decision process and increase its consistency, thus enabling strategic alignment and the improvement of organisational performance.

Managing Handheld Devices in Ad-Hoc Collaborative Computing Environments

The noticeable advance in the area of computer technology has paved the way for the invention of powerful mobile devices. However, limited storage, short battery life, and relatively low computational power are the major limitations of such devices. Due to ever-increasing computational requirements, such devices may fail to process the needed tasks under certain constraints. One proposed solution to this drawback is the introduction of Collaborative Computing, a new concept dealing with the distribution of computational tasks amongst several handhelds. This paper introduces the basics of Collaborative Computing and proposes a new protocol that aims at managing and optimizing computing tasks in Ad-Hoc Collaborative Computing Environments.

A Weighted Least Square Algorithm for Low-Delay FIR Filters with Piecewise Variable Stopbands

Variable digital filters are useful for various signal processing and communication applications in which frequency characteristics, such as fractional delays and cutoff frequencies, need to be varied. In this paper, we propose a design method for variable FIR digital filters with an approximately linear phase characteristic in the passband. The proposed variable FIR filters have large attenuation in the stopband, and this attenuation can be varied by spectrum parameters. In the proposed design method, a quasi-equiripple characteristic is obtained by using an iterative weighted least squares method. The usefulness of the proposed design method is verified through several examples.
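
A minimal sketch of the weighted least squares core, with Lawson-style reweighting to approach a quasi-equiripple response, is shown below for a fixed low-delay lowpass specification; the variable-stopband parametrisation of the proposed filters is not reproduced, and the band edges, filter length and delay are illustrative assumptions.

```python
# Weighted-least-squares FIR design on a frequency grid with Lawson-style
# iterative reweighting. Fixed lowpass specification for illustration only.
import numpy as np

N = 31                                    # filter length
delay = 12                                # chosen group delay (< (N-1)/2 gives a low-delay design)
grid = np.linspace(0, np.pi, 512)
passband = grid <= 0.3 * np.pi
stopband = grid >= 0.4 * np.pi
band = passband | stopband                # exclude the transition band from the fit

D = np.where(passband, np.exp(-1j * delay * grid), 0.0)   # desired frequency response
E = np.exp(-1j * np.outer(grid, np.arange(N)))            # A(w) = E @ h
W = np.ones(grid.size)

for _ in range(20):
    sw = np.sqrt(W * band)[:, None]
    # Solve min || sqrt(W) (E h - D) ||^2 over real h (real/imag parts stacked).
    A = np.vstack([(sw * E).real, (sw * E).imag])
    b = np.concatenate([(sw[:, 0] * D).real, (sw[:, 0] * D).imag])
    h, *_ = np.linalg.lstsq(A, b, rcond=None)
    err = np.where(band, np.abs(E @ h - D), 0.0)
    W *= 0.5 + err / (err[band].max() + 1e-12)            # Lawson-type reweighting

att = -20 * np.log10(np.abs(E @ h)[stopband].max() + 1e-12)
print(f"{N}-tap low-delay filter, worst-case stopband attenuation ~ {att:.1f} dB")
```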

A New Traffic Pattern Matching for DDoS Traceback Using Independent Component Analysis

Recently, Denial of Service (DoS) attacks and Distributed DoS (DDoS) attacks, a stronger form of DoS attack launched from multiple hosts, have become security threats on the Internet. Identifying the attack source and blocking the attack traffic are important countermeasures against these attacks. In general, it is difficult to identify the attack source because information about it is falsified. Therefore, a method of identifying the attack source by tracing the route of the attack traffic is necessary. A traceback method that uses traffic patterns, i.e., changes in the number of packets over time, as the criterion for the traceback has been proposed. This method can trace the attack by matching the shapes of the input traffic patterns against the shape of the output traffic pattern observed at a network branch point such as a router. A traffic pattern describes the shape of the traffic and is information that cannot be falsified. However, the traceback methods proposed to date cannot achieve sufficient tracing accuracy, because they directly use traffic patterns that are influenced by non-attack traffic. In this paper, a new traffic pattern matching method using Independent Component Analysis (ICA) is proposed.
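
The sketch below illustrates the underlying idea on synthetic data: per-interval packet counts observed on two upstream links are treated as mixtures of an attack pattern and background traffic, FastICA recovers the independent components, and correlation with the downstream pattern indicates the attack-carrying component. The traffic model is an illustrative assumption, not the paper's matching method.

```python
# ICA-based traffic-pattern matching sketch on synthetic per-interval packet
# counts. The mixing coefficients and traffic shapes are illustrative.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.arange(600)

attack = (np.sin(t / 15.0) > 0.3).astype(float) * 800     # bursty attack pattern (pkts/interval)
background = rng.poisson(300, t.size).astype(float)       # legitimate cross traffic

# Observed per-interval packet counts on two upstream links (mixtures).
obs = np.column_stack([
    0.9 * attack + 0.4 * background + rng.normal(0, 10, t.size),
    0.1 * attack + 1.0 * background + rng.normal(0, 10, t.size),
])

components = FastICA(n_components=2, random_state=0).fit_transform(obs)

# Downstream (victim-side) pattern to match against each recovered component.
downstream = attack + rng.normal(0, 20, t.size)
scores = [abs(np.corrcoef(downstream, components[:, k])[0, 1]) for k in range(2)]
print("Correlation of downstream pattern with each component:", np.round(scores, 2))
```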

Processing and Assessment of Quality Characteristics of Composite Baby Foods

The usefulness of weaning foods in meeting the nutrient needs of children is well recognized; most of them are precooked, roller-dried mixtures of cereal and/or legume flours, which possess a high viscosity and bulk when reconstituted. The objective of this study was to formulate composite weaning foods using cereals, malted legumes and vegetable powders and to analyze them for nutrients, functional properties and sensory attributes. Selected legumes (green gram and lentil) were germinated, dried and dehulled. Roasted wheat, rice, carrot powder and skim milk powder were also used. All the ingredients were mixed in different proportions to obtain four formulations, made into a 30% slurry and dried in a roller drier. The products were analyzed for proximate principles, mineral content, and functional and sensory qualities. The analysis showed the following ranges of constituents per 100 g of formulation on a dry weight basis: protein, 18.1-18.9 g; fat, 0.78-1.36 g; iron, 5.09-6.53 mg; calcium, 265-310 mg. The lowest water absorption capacity was found in the wheat-green gram based sample and the highest in the rice-lentil based sample. The overall sensory qualities of all foods were graded as "good" or "very good", with no significant differences. The results confirm that the formulated weaning foods were nutritionally superior, functionally appropriate and organoleptically acceptable.

Internet Bandwidth Network Quality Management: The Case Study of Telecom Organization of Thailand

This paper addresses a current problem among Thai internet service providers with regard to bandwidth network quality management. The IPSTAR department of the Telecom Organization of Thailand public company (TOT), the largest internet service provider in Thailand, is the case study used to analyze the problem. The internet bandwidth network quality management (iBWQM) framework is applied to the problem that has been found. Bandwidth management policy (BMP) and quality of service (QoS) are the two antecedents of iBWQM. This paper investigates internet user behavior, marketing demand and network operation views in order to determine bandwidth management policy (e.g. quota management, scheduling and malicious management). Bandwidth congestion is also analyzed in order to enhance quality of service (QoS). Moreover, the iBWQM framework is able to improve quality of service, increase bandwidth utilization, minimize the rate of complaints about slow speed, and provide network planning guidelines for Thai internet service providers.

Object-Oriented Simulation of Simulating Anticipatory Systems

The present paper addresses problems in the simulation of anticipatory systems, namely those that use simulation models to aid anticipation. A certain analogy between the use of simulation and imagining will be applied to make the explanation more comprehensible. The paper is completed by notes on open problems and on some existing applications. The difficulty lies in the fact that simulating the mentioned anticipatory systems amounts to simulating simulating systems, i.e. computer models handling two or more modeled time axes that should be mapped to the real time flow in a non-descending manner. Languages oriented to objects, processes and blocks can be used to surmount these problems.
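
A toy sketch of the two-time-axis situation is given below: an outer simulation whose agent, at points of the outer simulated time, runs an inner simulation over a second time axis to anticipate the consequence of a decision before acting. The queueing model and the look-ahead horizon are illustrative assumptions, not an application from the paper.

```python
# "Simulation of a simulating system": an outer model whose agent runs an
# inner simulation (a second, imagined time axis) to anticipate a backlog
# before deciding to add capacity. The toy queue parameters are illustrative.
def inner_simulation(queue_length, servers, horizon):
    """Anticipatory model: project the queue over the inner (imagined) time axis."""
    q = queue_length
    for _ in range(int(horizon)):
        q = max(0.0, q + 1.5 - servers)           # 1.5 arrivals per step minus service capacity
    return q

def outer_simulation(end_time):
    outer_time, queue, servers = 0.0, 0.0, 1.0
    while outer_time < end_time:
        outer_time += 1.0                         # the real (outer) simulated time axis advances
        queue = max(0.0, queue + 1.5 - servers)
        # Anticipation step: simulate the near future before deciding in the present.
        predicted = inner_simulation(queue, servers, horizon=10)
        if predicted > 5.0:                       # act on the anticipated, not the current, state
            servers += 1.0
            print(f"t={outer_time:4.1f}: anticipated backlog {predicted:.1f}, adding capacity")
    return queue

print("final queue length:", outer_simulation(end_time=30.0))
```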

Comprehensive Characteristics of the Municipal Solid Waste Generated in the Faculty of Engineering, UKM

The main aims of this research are to study solid waste generation in the Faculty of Engineering and Built Environment at UKM and, at the same time, to determine the composition and some characteristics of the waste, namely moisture content, density, pH and C/N ratio. For this purpose, multiple campaigns were conducted from 24 February to 2 March 2009 to collect the waste produced in all hostels, faculties, offices and so on, and to measure and investigate its physical and chemical characteristics in order to highlight the necessary management policies. The research locations were the Faculty of Engineering and the canteen nearby. From the results gained, the most suitable solid waste management solution will be proposed to UKM. The average solid waste generation rate at UKM is 203.38 kg/day. The solid waste generated is composed of glass, plastic, metal, aluminum, organic and inorganic waste, and other waste. From the laboratory results, the average moisture content, density, pH and C/N ratio of the solid waste generated are 49.74%, 165.1 kg/m3, 5.3, and 7:1, respectively. Since food waste (organic waste) was the most dominant component, at around 62% of the total waste generated, the most suitable solid waste management solution is composting.

Domain-based Key Management Scheme for Active Network

Active networks were developed to solve the problems of the current sharing-based network: the difficulty of applying new technologies, services or standards, and duplicated operations at several protocol layers. An active network can transport packets loaded with executable code, which makes it possible to change the state of a network node. However, if the network node is placed in a sharing-based network, security and safety issues must be resolved. To satisfy this requirement, various security aspects are required, such as authentication, authorization, confidentiality and integrity. Among these security components, the core factor is the encryption key. Accordingly, this study proposes a scheme that manages, on a domain basis, the encryption keys used to provide security across the active network.

Babbitt Casting and Babbitt Spraying Processes: A Case Study

In this paper, the babbitting of a bearing in the boiler feed pump of an electromotor has been studied. These bearings play an important role in reducing shutdown times of pumps, compressors and turbines. The most conventional babbitting method is casting, a melting-based process. A comparison between the thermal spray and casting methods shows that the thermally sprayed babbitt layer has better performance and tribological behavior. Metallurgical and tribological analyses, such as SEM, EDS and wet chemical analysis, were carried out on the babbitt alloys and worn surfaces. Two types of babbitt material were used: tin-based and lead-based babbitt. The benefits of thermally sprayed babbitt layers are clear, especially in large bearings.