Comparison of the Existing Methods in Determination of the Characteristic Polynomial

This paper presents a comparison of methods for determining the coefficients of the characteristic polynomial. First, the resultant systems from the methods are compared on frequency-domain criteria such as closed-loop bandwidth and gain and phase margins. Then the step responses of the resultant systems are compared on transient-behavior criteria, including overshoot, rise time, settling time and error (via the IAE, ITAE, ISE and ITSE integral indices). The relative stability of the systems is also compared. Finally, the best choices with respect to these diverse criteria are presented.
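The four integral error indices named above have standard definitions (IAE = ∫|e|dt, ITAE = ∫t|e|dt, ISE = ∫e²dt, ITSE = ∫te²dt). As a minimal illustration, not the paper's actual tooling, they can be approximated from a sampled error signal with a simple rectangle rule:

```python
import math

def error_indices(t, e):
    """Return (IAE, ITAE, ISE, ITSE) for samples e[i] at times t[i],
    using a right-endpoint rectangle rule."""
    iae = itae = ise = itse = 0.0
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        iae  += abs(e[i]) * dt            # integral of |e|
        itae += t[i] * abs(e[i]) * dt     # time-weighted |e|
        ise  += e[i] ** 2 * dt            # integral of e^2
        itse += t[i] * e[i] ** 2 * dt     # time-weighted e^2
    return iae, itae, ise, itse

# Illustrative signal: error of a first-order step response, e(t) = exp(-t)
ts = [i * 0.001 for i in range(10001)]    # 0..10 s at 1 ms resolution
es = [math.exp(-t) for t in ts]
iae, itae, ise, itse = error_indices(ts, es)
```

For this signal the exact values are IAE ≈ 1, ITAE ≈ 1, ISE ≈ 0.5 and ITSE ≈ 0.25, which the numerical sums reproduce closely.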

Flagging Critical Components to Prevent Transient Faults in Real-Time Systems

This paper proposes the use of metrics in design space exploration that highlight where in the structure of the model, and at what point in the behaviour, prevention is needed against transient faults. Previous approaches to transient faults focused on recovery after detection; almost no research has been directed towards preventive measures. Yet in real-time systems, hard deadlines are performance requirements that absolutely must be met: a missed deadline constitutes an erroneous action and a possible system failure. The proposed metrics assess the system design to flag where transient faults may have significant impact. These tools then allow the design to be changed to minimize that impact, and they also flag where particular design techniques, such as coding of communications or memories, need to be applied in later stages of design.

Relational Representation in XCSF

Generalization is one of the most challenging issues in Learning Classifier Systems. This feature depends on the representation method the system uses. Considering the representation schemes proposed for Learning Classifier Systems, it can be concluded that many of them are designed to describe the shape of the region to which the environmental states belong, while other relations of the environmental state to that region are ignored. In this paper, we propose a new representation scheme designed to express various relationships between the environmental state and the region specified by a particular classifier.

A Comparison of Exact and Heuristic Approaches to Capital Budgeting

This paper summarizes and compares approaches to solving the knapsack problem and its well-known application in capital budgeting. The first approach uses deterministic methods and can be applied to small problem instances with a single constraint. We can also apply commercial software systems such as the GAMS modelling system. However, because of the NP-completeness of the problem, more complex problem instances must be solved by heuristic techniques to obtain an approximation of the exact solution in a reasonable amount of time. We show the problem representation and parameter settings for a genetic algorithm framework.
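For the single-constraint case, the exact deterministic approach mentioned above is classically dynamic programming over the budget. The sketch below (with made-up project data; the paper's own instances and GAMS models are not reproduced here) shows the standard 0/1 knapsack recurrence applied to capital budgeting:

```python
def knapsack(values, costs, budget):
    """Exact 0/1 knapsack via dynamic programming.
    Returns the best total value achievable within the budget."""
    best = [0] * (budget + 1)                # best[b] = max value using budget b
    for v, c in zip(values, costs):
        for b in range(budget, c - 1, -1):   # backwards: each item used at most once
            best[b] = max(best[b], best[b - c] + v)
    return best[budget]

# Hypothetical example: four candidate projects (NPVs vs. capital required)
npvs = [60, 100, 120, 40]
caps = [10, 20, 30, 15]
print(knapsack(npvs, caps, 50))   # → 220 (fund the 20- and 30-unit projects)
```

The running time is O(n·budget), which is why the exact method only suits small instances and heuristics such as genetic algorithms take over for larger ones.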

Wavelet-Based Despeckling of Synthetic Aperture Radar Images Using Adaptive and Mean Filters

In this paper we introduce a new wavelet-based algorithm for speckle reduction in synthetic aperture radar images, which combines the undecimated wavelet transform, a Wiener filter (an adaptive filter) and a mean filter. Furthermore, instead of using existing thresholding techniques such as SURE shrinkage, Bayesian shrinkage, universal thresholding, normal thresholding, VisuShrink, and soft and hard thresholding, we use brute-force thresholding, which reruns the whole algorithm for each candidate threshold value, saves each result in an array, and finally selects the threshold value that gives the best result. It is therefore slower than the existing thresholding techniques, but gives the best results achievable under the given speckle-reduction algorithm.
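The brute-force search described above can be sketched schematically. Here `quality` and the thresholding step are hypothetical stand-ins for the paper's full wavelet/Wiener/mean-filter chain and its image-quality metric; only the search structure (rerun per candidate, keep the best) matches the abstract:

```python
def soft_threshold(x, t):
    """Soft-threshold a single wavelet coefficient."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def brute_force_threshold(coeffs, candidates, quality):
    """Rerun the denoising step for every candidate threshold and
    keep the threshold whose output scores best."""
    best_t, best_score, best_out = None, float("-inf"), None
    for t in candidates:
        out = [soft_threshold(c, t) for c in coeffs]  # stand-in for the chain
        score = quality(out)
        if score > best_score:
            best_t, best_score, best_out = t, score, out
    return best_t, best_out

# Toy usage: a made-up quality score rewarding sparsity, penalizing distortion
coeffs = [5.0, -0.2, 3.1, 0.05, -4.2, 0.3]
def quality(out):
    sparsity = sum(1 for c in out if c == 0.0)
    distortion = sum((a - b) ** 2 for a, b in zip(coeffs, out))
    return sparsity - 0.1 * distortion

t, denoised = brute_force_threshold(coeffs, [0.1, 0.3, 0.5, 1.0], quality)
```

The cost is one full denoising pass per candidate, which is exactly the slowdown the abstract acknowledges.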

Multivalued Knowledge-Base based on Multivalued Datalog

The basic aim of our study is to give a possible model for handling uncertain information. This model is worked out in the framework of DATALOG. The concept of a multivalued knowledge-base will be defined as a quadruple of background knowledge, a deduction mechanism, a connecting algorithm, and the function set of the program, which helps us determine the uncertainty levels of the results. First, the concept of fuzzy Datalog is summarized; then its extensions to intuitionistic and interval-valued fuzzy logic are given, and the concept of bipolar fuzzy Datalog is introduced. Based on these extensions, the concept of a multivalued knowledge-base is defined. This knowledge-base can serve as the background of a future agent model.

New Recursive Representations for the Favard Constants with Application to the Summation of Series

In this study, an integral form and new recursive formulas for the Favard constants, and for some numeric and Fourier series connected with them, are obtained. The method is based on preliminary integration of the Fourier series, which allows finite recursive representations for the sums to be established. It is shown that the derived recursive representations are numerically more efficient than the known representations of the considered objects.
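For context, the classical series representation of the Favard constants is K_r = (4/π) Σ_{k≥0} ((−1)^k/(2k+1))^{r+1}, with closed forms such as K_1 = π/2 and K_2 = π²/8. The naive partial sum below is the slow baseline that recursive representations like the paper's improve upon; it is illustrative only, not the paper's method:

```python
import math

def favard_partial_sum(r, n_terms):
    """Naive partial sum of the classical series for the Favard constant
    K_r = (4/pi) * sum_{k>=0} ((-1)**k / (2k+1))**(r+1)."""
    s = sum(((-1) ** k / (2 * k + 1)) ** (r + 1) for k in range(n_terms))
    return 4.0 / math.pi * s

# Even-index constants give alternating series (fast); odd give positive
# series (slow) -- note how many terms r=1 needs for modest accuracy.
k1 = favard_partial_sum(1, 200_000)   # target: pi/2
k2 = favard_partial_sum(2, 20_000)    # target: pi**2 / 8
```

The r = 1 sum still misses π/2 by roughly 1/(πN) after N terms, which is precisely the kind of slow convergence that makes finite recursive representations attractive.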

Application of SDS/LABS in Recovery Improvement from Fractured Models

This work concerns an experimental investigation of surfactant flooding in fractured porous media. A series of water and surfactant injection processes were performed on micromodels initially saturated with a heavy crude oil. Eight fractured glass micromodels were used to illustrate the effects of surfactant type and concentration on oil recovery efficiency in the presence of fractures with different properties, i.e., fracture orientation, length and number. Two different surfactants at different concentrations were tested. The results showed that surfactant flooding is more efficient when using an aqueous solution of SDS surfactant and when the injection well is located in a proper position with respect to the fracture properties. This study demonstrates the different physical and chemical conditions that affect the efficiency of this method of enhanced oil recovery.

Monte Carlo Simulation of Copolymer Heterogeneity in Atom Transfer Radical Copolymerization of Styrene and N-Butyl Acrylate

A high-performance Monte Carlo simulation, which simultaneously takes diffusion-controlled and chain-length-dependent bimolecular termination reactions into account, is developed to simulate the atom transfer radical copolymerization of styrene and n-butyl acrylate. As expected, increasing the initial feed fraction of styrene raises the fraction of styrene-styrene dyads (fAA) and reduces that of n-butyl acrylate dyads (fBB). The trend of variation in the randomness parameter (fAB) during the copolymerization also varies significantly. There is also a drift in copolymer heterogeneity, and the highest drift occurs in initial feeds containing lower percentages of styrene, i.e. 20% and 5%.
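The qualitative dyad-fraction trend can be reproduced with a far simpler terminal-model Monte Carlo than the paper's full ATRP kinetics. In this sketch the reactivity ratios are illustrative assumptions, the feed is held constant (so no composition drift), and diffusion control and termination are omitted entirely:

```python
import random

def dyad_fractions(fA, rA, rB, n_steps, seed=1):
    """Terminal-model Monte Carlo: grow one long copolymer chain at a
    fixed monomer feed and count dyad fractions fAA, fAB, fBB."""
    rng = random.Random(seed)
    fB = 1.0 - fA
    # terminal-model propagation probabilities
    pAA = rA * fA / (rA * fA + fB)   # add A when the chain end is A
    pBB = rB * fB / (rB * fB + fA)   # add B when the chain end is B
    last = 'A' if rng.random() < fA else 'B'
    counts = {'AA': 0, 'AB': 0, 'BB': 0}
    for _ in range(n_steps):
        if last == 'A':
            nxt = 'A' if rng.random() < pAA else 'B'
        else:
            nxt = 'B' if rng.random() < pBB else 'A'
        counts[''.join(sorted(last + nxt))] += 1
        last = nxt
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

# Styrene (A) / n-butyl acrylate (B), illustrative reactivity ratios only
d = dyad_fractions(fA=0.2, rA=0.76, rB=0.18, n_steps=200_000)
```

Comparing runs at fA = 0.2 and fA = 0.8 shows the abstract's trend: a richer styrene feed raises fAA and lowers fBB.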

A Survey on Performance Tools for OpenMP

Advances in processor architecture, such as multi-core designs, increase the complexity of parallel computer systems. With multi-core architectures there are different parallel languages that can be used to run parallel programs. One of these languages is OpenMP, which is embedded in C/C++ or Fortran. Because of this new architecture and its complexity, it is very important to evaluate the performance of OpenMP constructs, kernels, and application programs on multi-core systems. Performance analysis is the activity of collecting information about the execution characteristics of a program. Performance tools consist of at least three interfacing software layers: instrumentation, measurement, and analysis. The instrumentation layer defines the measured performance events. The measurement layer determines which performance events are actually captured and how they are measured by the tool. The analysis layer processes the performance data and summarizes it into a form that can be displayed by the tool. In this paper, a number of OpenMP performance tools are surveyed, explaining how each is used to collect, analyse, and display performance data.
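The three-layer structure described above can be illustrated with a toy profiler, here in Python rather than a real OpenMP tool: a decorator plays the instrumentation layer, a timer call the measurement layer, and a summary function the analysis layer. All names are hypothetical:

```python
import time
from collections import defaultdict

records = defaultdict(list)            # raw data consumed by the analysis layer

def instrument(fn):
    """Instrumentation layer: wrap a function so each call becomes an event."""
    def wrapper(*args, **kwargs):
        start = time.perf_counter()    # measurement layer: capture the event
        result = fn(*args, **kwargs)
        records[fn.__name__].append(time.perf_counter() - start)
        return result
    return wrapper

def summarize():
    """Analysis layer: reduce raw timings to a displayable summary."""
    return {name: (len(ts), sum(ts) / len(ts))   # (call count, mean seconds)
            for name, ts in records.items()}

@instrument
def kernel(n):
    return sum(i * i for i in range(n))

for _ in range(5):
    kernel(10_000)
summary = summarize()
```

Real OpenMP tools hook compiler- or runtime-level events (parallel regions, barriers) instead of function calls, but the layering is the same.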

A Detailed Timber Harvest Simulator Coupled with 3-D Visualization

In today's world, the efficient utilization of wood resources is increasingly on the minds of forest owners. Ensuring an efficient harvest of wood resources is a very complex challenge, and it is one of the areas the project "Virtual Forest II" addresses. Its core is a database about forests containing approximately 260 million trees located in North Rhine-Westphalia (NRW). Based on this data, tree growth simulations and wood mobilization simulations can be conducted. This paper focuses on the latter. It describes a discrete-event simulation with an attached 3-D real-time visualization which simulates timber harvest using trees from the database with different harvesting resources. This simulation can be displayed in 3-D to show the progress of the harvest. All the data gathered during the simulation is afterwards presented as a detailed summary. This summary includes cost-benefit calculations and can be compared with those of previous runs to optimize the financial outcome of the timber harvest by exchanging harvesting resources or modifying their parameters.
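The core of any discrete-event simulation like the one described is a time-ordered event queue. A minimal skeleton, with entirely hypothetical harvest events standing in for the paper's detailed machine models, looks like this:

```python
import heapq

def run_simulation(events):
    """Minimal discrete-event loop: pop the earliest event, execute its
    handler, and let the handler schedule follow-up events on the queue."""
    queue = list(events)                   # (time, seq, handler) tuples
    heapq.heapify(queue)
    log, seq = [], len(queue)
    while queue:
        t, _, handler = heapq.heappop(queue)
        for dt, follow_up in handler(t):   # handler may schedule more events
            heapq.heappush(queue, (t + dt, seq, follow_up))
            seq += 1
        log.append(t)
    return log

# Toy harvest chain: felling a tree takes 4 time units, then forwarding
def fell(t):
    return [(4, forward)]                  # after felling, haul the log
def forward(t):
    return []                              # terminal event

timeline = run_simulation([(0, 0, fell), (1, 1, fell)])
```

The `seq` counter breaks ties between simultaneous events deterministically; cost-benefit bookkeeping would hang off the handlers in the same way the `log` does here.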

Degradability Studies of Photodegradable Plastic Film

Polypropylene blended with natural oil and pigment additives has been studied. Different formulations of each compound were made into polybags used for the cultivation of oil palm seedlings, for strength and mechanical-property studies. One group of samples was exposed to normal sunlight to initiate degradation, and another group was placed in a shaded area for five months. All samples were tested for tensile strength to determine the degradation effects. The tensile strength of the samples directly exposed to sunlight and of those in the shaded area showed up to 50% and 25% degradation, respectively. However, a similar reduction of Young's modulus was found for both exposures. Structural investigations were done using FTIR to detect deformation. The additives used in the studies were all natural and environmentally friendly.

The Performance of Disbursement Procedure of Public Works in Thailand

This paper analyses the performance of the disbursement procedure for public works projects in Thailand. The results were summarised based on contracts, submitted invoices, inspection dates, copies of disbursement dates between clients and their main contractors, and interviews with persons involved in central and local government projects in Thailand during 1994-2008. The data collection investigated the disbursement procedure in relation to disbursement performance during the construction period (planned duration of the contract against the actual execution date in each month). A graphical duration analysis of the projects illustrated significant disbursement patterns in each project. It was established that shortage of staff, the financial stability of clients, bureaucracy, the method of disbursement, and the economic situation play a major role in the performance of disbursement to main contractors.

Analysis and Design Business Directory for Micro, Small and Medium Enterprises using Google Maps API and Multimedia

This paper describes the analysis and design of a business directory for micro, small and medium enterprises (SMEs). The business directory, if implemented, will facilitate and optimize SMEs' access to suppliers and to marketing. It will be equipped with geocoding, so the location of each SME can easily be viewed on a map. The map is constructed using the functionality of the web-based Google Maps API. The information is presented in multimedia form, which is more interesting and interactive. The methods used to achieve this goal are observation, interviews, and modeling and classification of the business directory for SMEs.

Enhancing the e-Government Functionality using Knowledge Management

The primary aim of e-government applications is fast citizen service and the accomplishment of governmental functions. This paper discusses the needs for, and role of, knowledge management in e-government development. The paper focuses on analyzing the advantages of using knowledge management with existing IT technologies to maximize the efficiency of government functions. The proposed new approach to providing government services is based on using knowledge management as part of the e-government system.

Design Analysis of a Slotted Microstrip Antenna for Wireless Communication

In this paper, a new bandwidth-enhancement design technique that improves the performance of a conventional microstrip patch antenna is proposed: a novel wideband probe-fed inverted slotted microstrip patch antenna. The design adopts contemporary techniques: coaxial probe feeding, an inverted patch structure and a slotted patch. The composite effect of integrating these techniques with the proposed patch offers a low-profile, broadband, high-gain antenna with a low cross-polarization level. Results for the VSWR, gain and co- and cross-polarization patterns are presented. The antenna, operating in the 1.80-2.36 GHz band, shows an impedance bandwidth (2:1 VSWR) of 27% and a gain of 10.18 dBi with a gain variation of 1.12 dBi. Good radiation characteristics, including a cross-polarization level in the xz-plane of less than -42 dB, have been obtained.
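The quoted 27% figure follows directly from the band edges via the usual fractional-bandwidth formula, bandwidth = (f_high − f_low)/f_center:

```python
# Fractional impedance bandwidth from the 2:1 VSWR band edges
f_low, f_high = 1.80, 2.36            # GHz
f_center = (f_low + f_high) / 2       # 2.08 GHz
bandwidth = (f_high - f_low) / f_center
print(f"{bandwidth:.0%}")             # → 27%
```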

Processor Scheduling on Parallel Computers

Many problems in computer vision and image processing offer potential for parallel implementation through one of three major paradigms: geometric parallelism, algorithmic parallelism and processor farming. Static process scheduling techniques are used successfully to exploit geometric and algorithmic parallelism, while dynamic process scheduling is better suited to dealing with the independent processes inherent in the processor farming paradigm. This paper considers the application of parallel multi-computers to a class of problems exhibiting the spatial data characteristic of the geometric paradigm. However, by using the processor farming paradigm, a dynamic scheduling technique is developed to suit the MIMD structure of the multi-computers. A hybrid scheduling scheme is also developed and compared with the other schemes. The specific problem chosen for the investigation is the Hough transform for line detection.
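For reference, the sequential kernel being parallelized is the Hough transform: each edge point votes for every line ρ = x·cos θ + y·sin θ passing through it, and peaks in the (θ, ρ) accumulator correspond to detected lines. A minimal single-process sketch (the farming/scheduling machinery of the paper is omitted):

```python
import math
from collections import Counter

def hough_lines(points, theta_steps=180, rho_res=1.0):
    """Accumulate votes in (theta, rho) space; each point votes for every
    line rho = x*cos(theta) + y*sin(theta) through it."""
    acc = Counter()
    for x, y in points:
        for i in range(theta_steps):
            theta = math.pi * i / theta_steps
            rho = x * math.cos(theta) + y * math.sin(theta)
            acc[(i, round(rho / rho_res))] += 1
    return acc

# Ten collinear points on y = x; the line's normal is at theta = 135 deg
pts = [(i, i) for i in range(10)]
acc = hough_lines(pts)
peak = max(acc, key=acc.get)   # strongest (theta_index, rho_bin) cell
```

In a processor-farm decomposition, each worker would vote for a subset of points (or a subset of θ values) and the accumulators would be merged, which is why the independent-task dynamic scheduling discussed above fits this problem.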

Aspect Oriented Software Architecture

Natural language processing systems pose a unique challenge for software architectural design as system complexity has increased continually and systems cannot be easily constructed from loosely coupled modules. Lexical, syntactic, semantic, and pragmatic aspects of linguistic information are tightly coupled in a manner that requires separation of concerns in a special way in design, implementation and maintenance. An aspect oriented software architecture is proposed in this paper after critically reviewing relevant architectural issues. For the purpose of this paper, the syntactic aspect is characterized by an augmented context-free grammar. The semantic aspect is composed of multiple perspectives including denotational, operational, axiomatic and case frame approaches. Case frame semantics matured in India from deep thematic analysis. It is argued that lexical, syntactic, semantic and pragmatic aspects work together in a mutually dependent way and their synergy is best represented in the aspect oriented approach. The software architecture is presented with an augmented Unified Modeling Language.

The Impact of Self-Phase Modulation on Dispersion Compensated Mapping Multiplexing Technique (MMT)

This paper explores the ability of the optical multilevel Mapping Multiplexing Technique (MMT) system to tolerate the impact of nonlinearities such as Self-Phase Modulation (SPM) in the presence of dispersion compensation methods. High-energy pulses cause deterioration in the chirp compression process attained by SPM, which introduces an upper power boundary limit. An evaluation of the post-compensation and asymmetric pre-post fiber compensation methods has been carried out on the MMT system and compared with other modulation formats of the same bit rate. The 40 Gb/s MMT post-compensation system shows a 1.4 dB enhancement over the 40 Gb/s 4-ary system and less than a 3.9 dB penalty compared to the 40 Gb/s OOK-RZ system. However, the optimized asymmetric pre-post compensation achieves a 4.6 dB enhancement over the post-compensation MMT configuration for 30% pre-compensation of the dispersion.

Using Exponential Lévy Models to Study Implied Volatility patterns for Electricity Options

European options on German electricity futures, with Lévy processes for the underlying asset, are examined. The evolution of implied volatility under each of the considered models is discussed after calibrating the Merton jump diffusion (MJD), variance gamma (VG), normal inverse Gaussian (NIG), Carr-Geman-Madan-Yor (CGMY) and Black and Scholes (B&S) models. Implied volatility is examined for the entire sample period, revealing some curious features of market evolution, and the data-fitting performance of the five models is compared. It is shown that variance gamma processes provide relatively better results and that implied volatility shows significant differences through time. Volatility changes with changing uncertainty, or with increasing futures prices, and there is evidence of the need to account for seasonality when modelling both electricity spot/futures prices and volatility.
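The implied volatilities discussed above are obtained by inverting an options-on-futures pricing formula; for the B&S benchmark this is the Black (1976) model, and since the price is monotone in σ, bisection suffices. A self-contained sketch with made-up market parameters (the paper's calibrated Lévy models are not reproduced):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black76_call(F, K, T, sigma, r=0.0):
    """Black (1976) price of a European call on a futures contract."""
    d1 = (math.log(F / K) + 0.5 * sigma**2 * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return math.exp(-r * T) * (F * norm_cdf(d1) - K * norm_cdf(d2))

def implied_vol(price, F, K, T, r=0.0, lo=1e-4, hi=5.0, tol=1e-8):
    """Invert Black-76 by bisection (the price is increasing in sigma)."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if black76_call(F, K, T, mid, r) < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Round-trip check at an assumed 40% volatility
p = black76_call(F=50.0, K=55.0, T=0.5, sigma=0.40)
sigma_hat = implied_vol(p, F=50.0, K=55.0, T=0.5)
```

Repeating this inversion across strikes and dates, with model prices in place of market prices, is what produces the implied-volatility patterns compared in the paper.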