Sovereign Credit Risk Measures

This paper focuses on sovereign credit risk, a hot topic in the context of the current Eurozone crisis. In the light of the recent financial crisis, market perception of the creditworthiness of individual sovereigns has changed significantly. Before the outbreak of the crisis, market participants did not differentiate between the credit risk borne by individual states, despite differing levels of public indebtedness. As the crisis unfolded, market participants became aware of the worsening fiscal situation in European countries and started to discriminate among government issuers. Concerns about increasing sovereign risk were reflected in surging sovereign risk premiums. The main aim of this paper is to shed light on the characteristics of sovereign risk, with special attention paid to the mutual relation between the credit spread and the CDS premium as the main measures of the sovereign risk premium.

Learning to Order Terms: Supervised Interestingness Measures in Terminology Extraction

Term extraction, a key data-preparation step in text mining, extracts terms, i.e. relevant collocations of words, attached to specific concepts (e.g. genetic-algorithms and decision-trees are terms associated with the concept “Machine Learning”). In this paper, the task of extracting interesting collocations is achieved through a supervised learning algorithm, exploiting a few collocations manually labelled as interesting/not interesting. From these examples, the ROGER algorithm learns a numerical function inducing a ranking on the collocations. This ranking is optimized using genetic algorithms, maximizing the trade-off between the false positive and true positive rates (Area Under the ROC Curve, AUC). The approach uses a particular representation for word collocations, namely the vector of values of the standard statistical interestingness measures attached to each collocation. As this representation is general (across corpora and natural languages), generality tests were performed by applying the ranking function learned from an English corpus in biology to a French corpus of curricula vitae, and vice versa, showing good robustness of the approach compared to the state-of-the-art Support Vector Machine (SVM).
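The AUC criterion that ROGER maximizes can be computed directly as the Wilcoxon-Mann-Whitney statistic: the probability that a randomly chosen positive example is ranked above a randomly chosen negative one. A minimal sketch (the scores are illustrative, not ROGER's actual learned function):

```python
def auc(scores_pos, scores_neg):
    """Area Under the ROC Curve via the Wilcoxon-Mann-Whitney statistic:
    fraction of positive/negative pairs ranked correctly (ties count 0.5)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Toy scores assigned by a ranking function to collocations labelled
# interesting (positive) and not interesting (negative):
example_auc = auc([0.9, 0.8, 0.4], [0.7, 0.3, 0.1])
```

A genetic algorithm can then evolve the parameters of the ranking function to maximize this quantity, which is insensitive to any monotonous rescaling of the scores.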

Stability Analysis for a Multicriteria Problem with Linear Criteria and Parameterized Principle of Optimality “from Lexicographic to Slater”

A multicriteria linear programming problem with integer variables and the parameterized optimality principle "from lexicographic to Slater" is considered. A situation in which the initial coefficients of the penalty cost functions are not fixed but may be subject to variations is studied. For any efficient solution, appropriate measures of quality are introduced that incorporate information about variations of the penalty cost function coefficients. These measures correspond to the so-called stability and accuracy functions defined earlier for efficient solutions of a generic multicriteria combinatorial optimization problem with Pareto and lexicographic optimality principles. Various properties of these functions are studied, and the maximum norms of perturbations for which an efficient solution preserves the property of being efficient are calculated.

Change Detector Combination in Remotely Sensed Images Using Fuzzy Integral

Decision fusion is one of the hot research topics in the classification area; it aims to achieve the best possible performance for the task at hand. In this paper, we investigate the usefulness of this concept for improving change detection accuracy in remote sensing. Outputs of two fuzzy change detectors, based respectively on simultaneous and comparative analysis of multitemporal data, are fused using fuzzy integral operators. This method fuses the objective evidence produced by the change detectors with respect to fuzzy measures that express the difference in performance between them. The proposed fusion framework is evaluated against several ordinary fuzzy aggregation operators. Experiments carried out on two SPOT images showed that the fuzzy integral performed best: it improves change detection accuracy while tending to equalize the accuracy rate in both the change and no-change classes.
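The abstract does not specify which fuzzy integral is used; the discrete Sugeno integral is one standard choice for fusing detector evidence against a fuzzy measure that encodes detector reliability. A minimal sketch under that assumption (the detector names and measure values are hypothetical):

```python
def sugeno_integral(h, g):
    """Discrete Sugeno fuzzy integral.
    h: dict mapping each source to its evidence value in [0, 1]
    g: fuzzy measure, dict mapping frozenset(sources) to a value in [0, 1]
    Sorts sources by decreasing evidence and returns
    max_i min(h(x_(i)), g({x_(1), ..., x_(i)}))."""
    items = sorted(h.items(), key=lambda kv: kv[1], reverse=True)
    best, prefix = 0.0, []
    for src, val in items:
        prefix.append(src)
        best = max(best, min(val, g[frozenset(prefix)]))
    return best

# Two hypothetical change detectors 'a' and 'b'; the measure states that
# detector 'a' alone is more trustworthy (0.7) than 'b' alone (0.4).
g = {frozenset(['a']): 0.7, frozenset(['b']): 0.4, frozenset(['a', 'b']): 1.0}
fused = sugeno_integral({'a': 0.6, 'b': 0.9}, g)
```

Here the less reliable detector's strong evidence (0.9) is capped by its measure, so the fused confidence settles at 0.6, dominated by the more reliable source.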

Study on the Effect of Road Infrastructure, Socio-Economic and Demographic Features on Road Crashes in Bangladesh

Road crashes not only claim lives and inflict injuries but also impose an economic burden on society through lost productivity. The problem of deaths and injuries resulting from road traffic crashes is now acknowledged to be a global phenomenon, with authorities in virtually all countries concerned about the growth in the number of people killed and seriously injured on their roads. However, the road crash situation of a developing country like Bangladesh is much worse than that of developed countries. To develop proper countermeasures, it is necessary to identify the factors affecting crash occurrence. The objective of this study is to examine the effect of district-wise road infrastructure, socio-economic and demographic features on crash occurrence. The unit of analysis is the individual district, which has not been explored much in the past. Reported crash data obtained from the Bangladesh Road Transport Authority (BRTA) for the years 2004 to 2010 are used to develop a negative binomial model. The model results reveal the effect of road length (both paved and unpaved), road infrastructure and several socio-economic characteristics on district-level crash frequency in Bangladesh.
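Negative binomial models are preferred over Poisson for crash counts because they allow over-dispersion (variance exceeding the mean). The distribution itself can be sketched in its textbook (r, p) parameterization; this is independent of the paper's actual regression specification:

```python
from math import comb

def neg_binomial_pmf(k, r, p):
    """P(K = k) for a negative binomial random variable: the number of
    failures k observed before the r-th success, with success probability p.
    Its variance exceeds its mean, which suits over-dispersed crash counts."""
    return comb(k + r - 1, k) * (1 - p) ** k * p ** r

# Sanity check: probabilities over a generously truncated support sum to ~1.
total = sum(neg_binomial_pmf(k, r=3, p=0.5) for k in range(60))
```

In the regression setting, the mean of this distribution is linked to district-level covariates (road length, population, etc.) through a log link, with the dispersion parameter estimated from the data.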

On Reversal and Transposition Medians

In recent years, the genomes of more and more species have been sequenced, providing data for phylogenetic reconstruction based on genome rearrangement measures. A main task in all phylogenetic reconstruction algorithms is to solve the median-of-three problem. Although this problem is NP-hard even for the simplest distance measures, there are exact algorithms for the breakpoint median and the reversal median that are fast enough for practical use. In this paper, this approach is extended to the transposition median as well as to the weighted reversal and transposition median. Although no exact polynomial algorithm is known even for the pairwise distances, we show that it is in most cases possible to solve these problems exactly within reasonable time by using a branch-and-bound algorithm.
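The breakpoint distance, the simplest of the rearrangement measures mentioned, can be sketched for unsigned permutations (gene orders); the helper names are illustrative:

```python
def breakpoints(perm):
    """Number of breakpoints of an unsigned permutation of 1..n relative to
    the identity: frame it with 0 and n+1, then count adjacent pairs whose
    values are not consecutive integers."""
    framed = [0] + list(perm) + [len(perm) + 1]
    return sum(1 for a, b in zip(framed, framed[1:]) if abs(a - b) != 1)

def breakpoint_distance(p, q):
    """Breakpoint distance between two gene orders over the same gene set:
    relabel q as the identity and count the breakpoints of p in that
    coordinate system."""
    pos = {gene: i + 1 for i, gene in enumerate(q)}
    return breakpoints([pos[g] for g in p])
```

The median-of-three problem then asks for a gene order minimizing the sum of such distances to three given genomes, which is where the branch-and-bound search comes in.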

A New Edit Distance Method for Finding Similarity in DNA Sequences

The P-Bigram method is a string comparison method based on an internal two-character similarity measure. The edit distance between two strings is the minimal number of elementary editing operations required to transform one string into the other; the elementary operations are deletion, insertion and substitution of characters. In this paper, we apply the P-Bigram method to solve the similarity problem for DNA sequences. The method provides an efficient algorithm that locates all minimal-cost operations on a string. We implemented the algorithm and found that our program computes smaller distances than the single-character approach. We develop the P-Bigram edit distance, relate it to similarity, and implement it using dynamic programming. The performance of the proposed approach is evaluated using the number of edits and percentage-similarity measures.
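The underlying single-character edit distance and the percentage-similarity measure can be sketched with the standard dynamic program; the two-character P-Bigram extension itself is not specified in the abstract, so only the baseline is shown:

```python
def edit_distance(s, t):
    """Classic Levenshtein distance via dynamic programming:
    dp[i][j] = minimal edits turning s[:i] into t[:j]."""
    m, n = len(s), len(t)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i            # delete all of s[:i]
    for j in range(n + 1):
        dp[0][j] = j            # insert all of t[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution/match
    return dp[m][n]

def percent_similarity(s, t):
    """Similarity as a percentage derived from the edit distance."""
    return 100.0 * (1 - edit_distance(s, t) / max(len(s), len(t), 1))
```

Both evaluation measures used in the paper (number of edits and percentage similarity) fall out of the same DP table.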

Dependence of the Reflection of Virtual Subjects on Features of Students' Coping Behavior

In the process of globalization, when a struggle for the minds and values of people is taking place, the impact of virtual space can cause unexpected effects and consequences in young people's adjustment to the world. Its special significance lies in its unconscious influence on the underlying process of meaning-making; the values it promotes are therefore much more effective and affect both personal characteristics and the adjustment process. The challenge, then, is to identify the factors influencing how virtual subjects are reflected and to measure their impact on the personal characteristics of students.

Closing the Achievement Gap Within Reading and Mathematics Classrooms by Fostering Hispanic Students' Educational Resilience

While many studies have examined the achievement gap between groups of students in school districts, few have used resilience research to investigate achievement gaps within classrooms. This paper summarizes and discusses recent studies by Waxman, Padrón, and their colleagues, in which they examined learning environment differences between resilient and nonresilient students in reading and mathematics classrooms. The classes consisted predominantly of Hispanic elementary school students from low-income families. These studies all incorporated learning environment questionnaires and systematic observation methods. Significant differences were found between resilient and nonresilient students in their classroom learning environments and classroom behaviors. The observation results indicate that the amount and quality of teacher-student academic interaction are two of the most influential variables promoting student outcomes. The paper concludes by suggesting the following teacher practices to promote resiliency in schools: (a) using feedback from classroom observation and learning environment measures, (b) employing explicit teaching practices, and (c) understanding students on a social and personal level.

Publishing Curriculum Vitae using Weblog: An Investigation on its Usefulness, Ease of Use, and Behavioral Intention to Use

In this cyber age, the job market is rapidly being transformed and digitalized. Submitting a paper-based curriculum vitae (CV) no longer grants a job seeker a high employability rate. This paper calls attention to the creation of the mobile curriculum vitae, or m-CV (http://mcurriculumvitae.blogspot.com), a sample individual CV developed using a weblog, which can enhance a job hunter's, and especially a fresh graduate's, marketability. The study is designed to identify the perceptions held by Malaysian university students regarding the m-CV, grounded in a modified Technology Acceptance Model (TAM). It measures the strength and direction of the relationships among three major variables: Perceived Ease of Use (PEOU), Perceived Usefulness (PU) and Behavioral Intention (BI) to use. The findings show that university students generally accepted the m-CV, perceiving it as more useful than easy to use. Additionally, this study confirms TAM to be a useful theoretical model for understanding and explaining the behavioral intention to use a Web 2.0 application, the weblog, for publishing a CV. The results underline another significant positive value of using a weblog to create a personal CV. Directions for further research on the m-CV are also highlighted.

Numerical Analysis and Experimental Validation of Detector Pressure Housing Subject to HPHT

Reservoirs with high pressures and high temperatures (HPHT), considered atypical in the past, are now frequent targets for exploration. For downhole oilfield drilling tools and components, temperature and pressure affect mechanical strength. To address this issue, a finite element analysis (FEA) at 206.84 MPa (30 ksi) pressure and 165°C has been performed on the pressure housing of the measurement-while-drilling/logging-while-drilling (MWD/LWD) density tool. The density tool is an MWD/LWD sensor that measures the density of the formation; one of its components is the pressure housing positioned in the tool. The FEA results are compared with an experimental test performed on the pressure housing of the density tool, and show a close match between the numerical results and the experiment. This FEA model can be used for extreme HPHT and ultra-HPHT analyses and/or optimal design changes.

Landslide, Earthquake and Flood Hazard Risks of Izmir Metropolitan City, A Case: Altindag Landslide Areas

Urban disaster risks and vulnerabilities are great problems for Turkey, and the annual loss of life and property through disasters in the world's major metropolitan areas is increasing. Urban concentrations of the poor and less-informed in environmentally fragile locations suffer the impact of disasters disproportionately. Gecekondu (squatter) developments compound the inherent risks associated with high-density environments, inappropriate technologies and inadequate infrastructure. Turkey also has many geological disadvantages, such as sitting on top of active tectonic plate boundaries, which is why it has avalanche-, flood-, landslide- and drought-prone areas. Since this natural setting is inevitable, the only way to survive in such a harsh geography is to be aware of the importance of these natural events and to take political and physical measures. The main aim of this research is to bring up the magnitude of the natural hazard risks in the Izmir built-up zone, which are not being taken into consideration adequately. Because the dimensions of the peril are not taken seriously enough, natural hazard risks, although commonly well known, are either not considered important or are forgotten after some time passes. Within this research, the magnitude of the natural hazard risks for Izmir is presented through concrete, local studies of Izmir's risky areas.

Contourlet versus Wavelet Transform for a Robust Digital Image Watermarking Technique

In this paper, a watermarking algorithm that uses the wavelet transform with Multiple Description Coding (MDC) and Quantization Index Modulation (QIM) concepts is introduced. The paper also investigates the role of the Contourlet Transform (CT) versus the Wavelet Transform (WT) in providing robust image watermarking. Two measures are used in the comparison between the wavelet-based and the contourlet-based methods: Peak Signal-to-Noise Ratio (PSNR) and Normalized Cross-Correlation (NCC). Experimental results reveal that the introduced algorithm is robust against different attacks and performs well compared to the contourlet-based algorithm.
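The two comparison measures have standard definitions and can be sketched directly; the flat pixel lists stand in for images and the variable names are illustrative:

```python
import math

def psnr(original, distorted, peak=255.0):
    """Peak Signal-to-Noise Ratio between two equal-size images
    (given here as flat lists of pixel values), in dB."""
    mse = sum((a - b) ** 2 for a, b in zip(original, distorted)) / len(original)
    return float('inf') if mse == 0 else 10 * math.log10(peak ** 2 / mse)

def ncc(w, w_hat):
    """Normalized Cross-Correlation between the embedded watermark w and
    the extracted watermark w_hat; 1.0 means perfect recovery."""
    num = sum(a * b for a, b in zip(w, w_hat))
    den = math.sqrt(sum(a * a for a in w)) * math.sqrt(sum(b * b for b in w_hat))
    return num / den
```

PSNR quantifies the imperceptibility of the embedded watermark (fidelity of the marked image), while NCC quantifies robustness: how well the watermark survives attacks.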

A Hybrid Approach for Quantification of Novelty in Rule Discovery

Rule discovery is an important technique for mining knowledge from large databases. The use of objective measures for discovering interesting rules leads to another data mining problem, although one of reduced complexity. Data mining researchers have studied subjective measures of interestingness to reduce the volume of discovered rules and ultimately improve the overall efficiency of the KDD process. In this paper we study the novelty of discovered rules as a subjective measure of interestingness. We propose a hybrid approach that uses objective and subjective measures to quantify the novelty of discovered rules in terms of their deviations from known rules. We analyze the types of deviation that can arise between two rules and categorize the discovered rules according to a user-specified threshold. We implement the proposed framework and experiment with some public datasets; the experimental results are quite promising.

The Correlation between Peer Aggression and Peer Victimization: Are Aggressors Victims Too?

To investigate the possible correlation between peer aggression and peer victimization, 148 sixth-graders were asked to respond to the Reduced Aggression and Victimization Scales (RAVS). The RAVS measures the frequency of reported aggressive behaviors or of being victimized during the week prior to the survey. The scales are composed of six items each, and each point represents one instance of aggression or victimization. The Pearson product-moment correlation coefficient (PMCC) was used to determine the correlations between the scores of the sixth-graders on the two scales, both for individual items and for total scores. Positive correlations were established, and the correlations were significant at the 0.01 level.
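The PMCC itself is a standard formula and can be sketched as follows; the score vectors below are made-up illustrations, not the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between paired scores:
    covariance of x and y divided by the product of their standard deviations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical aggression vs. victimization totals for five students:
r = pearson_r([0, 1, 2, 4, 5], [1, 1, 3, 3, 6])
```

A strongly positive r, as in this toy example, is the pattern the study reports: students who aggress more also tend to report being victimized more.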

An Algorithm of Finite Capacity Material Requirement Planning System for Multi-stage Assembly Flow Shop

This paper develops an algorithm for a finite capacity material requirement planning (FCMRP) system for a multi-stage assembly flow shop. The developed FCMRP system has two main stages. The first stage allocates operations to the first- and second-priority work centers and determines the sequence of the operations on each work center. The second stage determines the optimal start time of each operation using a linear programming model. Real data from a factory are used to analyze and evaluate the effectiveness of the proposed FCMRP system and to guarantee a practical solution for the user. Five performance measures are considered: total tardiness, number of tardy orders, total earliness, number of early orders, and average flow time. The proposed FCMRP system offers an adjustable solution that compromises among these conflicting performance measures; the user can adjust the weight of each measure to obtain the desired performance. The results show that the combination of FCMRP NP3 and EDD outperforms the other combinations in terms of the overall performance index. The calculation time for the proposed FCMRP system is about 10 minutes, which is practical for the planners of the factory.
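The five performance measures follow directly from a schedule's release, completion and due times. A minimal sketch with hypothetical job data (the tuple layout is an assumption for illustration):

```python
def performance_measures(jobs):
    """Schedule performance measures of the kind used to compare FCMRP
    variants. jobs: list of (release, completion, due) time tuples."""
    tard = [max(0, c - d) for _, c, d in jobs]    # lateness past due date
    early = [max(0, d - c) for _, c, d in jobs]   # finish ahead of due date
    flow = [c - r for r, c, _ in jobs]            # time spent in the shop
    return {
        'total_tardiness': sum(tard),
        'tardy_orders': sum(t > 0 for t in tard),
        'total_earliness': sum(early),
        'early_orders': sum(e > 0 for e in early),
        'avg_flow_time': sum(flow) / len(jobs),
    }

m = performance_measures([(0, 5, 4), (1, 6, 8), (2, 9, 9)])
```

A weighted sum of these quantities yields an overall performance index, and adjusting the weights trades tardiness against earliness and flow time, which is exactly the tuning knob the FCMRP system exposes.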

Assessing Nutrient Concentration and Trophic Status of Brahma Sarover at Kurukshetra, India

Eutrophication of surface water is one of the most widespread environmental problems at present. Large numbers of pilgrims and tourists visit the sacred artificial tank known as “Brahma Sarover”, located at Kurukshetra, India, to take a holy dip and perform religious ceremonies. The sources of pollutants include impurities in the feed water, mass bathing, religious offerings and windblown particulate matter. Studies so far have focused mainly on assessing water quality for bathing purposes using physico-chemical and bacteriological parameters; no effort has been made to assess the nutrient concentration and trophic status of the tank so that more appropriate long-term measures for improving water quality can be taken. In the present study, total nitrogen, total phosphorus and chlorophyll a were measured to assess the nutrient level and trophic status of the tank. The results show high concentrations of nutrients and chlorophyll a, indicating a mesotrophic to eutrophic state of the tank, with phosphorus observed to be the limiting nutrient.
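The abstract does not state which trophic classification scheme was applied; Carlson's Trophic State Index is a common choice for exactly these parameters. A sketch assuming the standard Carlson (1977) coefficients (verify against the original reference before use):

```python
import math

def carlson_tsi_chl(chl_ug_per_l):
    """Carlson's Trophic State Index from chlorophyll a in micrograms/L.
    Roughly: TSI 40-50 is read as mesotrophic, above 50 as eutrophic."""
    return 9.81 * math.log(chl_ug_per_l) + 30.6

def carlson_tsi_tp(tp_ug_per_l):
    """Carlson's TSI from total phosphorus in micrograms/L."""
    return 14.42 * math.log(tp_ug_per_l) + 4.15
```

Computing such an index from the measured chlorophyll a and total phosphorus gives a single comparable number for tracking the tank's trophic state over time.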

Effects of Capacitor Bank Defects on Harmonic Distortion and Park's Pattern Analysis in Induction Motors

Properly sized capacitor banks are connected across induction motors for several reasons, including power factor correction, reducing distortion and increasing capacity. Total harmonic distortion (THD) and power factor (PF) are used in such cases to quantify the improvements obtained by connecting the external capacitor banks. On the other hand, one method for assessing the internal condition of the motor is Park's pattern analysis. Despite adequate precautionary measures, capacitor banks may sometimes malfunction, and such a minor fault in a capacitor bank is often not readily discernible. It may, however, substantially degrade power factor correction performance and may also damage the supply profile. The situation is made more severe by the fact that Park's pattern is distorted by such external capacitor faults and can therefore give anomalous results in analyses of internal motor faults. The aim of this paper is to present simulation and hardware laboratory test results that provide an understanding of the anomalies in harmonic distortion and Park's pattern analyses in induction motors due to capacitor bank defects.
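The THD figure used throughout such assessments has a standard definition: the RMS of the harmonic components relative to the fundamental. A minimal sketch with made-up harmonic amplitudes:

```python
import math

def thd(harmonics):
    """Total harmonic distortion of a voltage/current waveform, as a ratio:
    RMS of harmonics 2..N divided by the fundamental amplitude.
    harmonics: amplitudes [h1, h2, h3, ...] with h1 the fundamental."""
    fundamental, rest = harmonics[0], harmonics[1:]
    return math.sqrt(sum(h * h for h in rest)) / fundamental

# e.g. a 10 A fundamental with 3rd and 5th harmonics of 1 A each:
ratio = thd([10.0, 0.0, 1.0, 0.0, 1.0])
```

Multiplying by 100 gives THD as a percentage; a rise in this figure after a capacitor bank fault is one of the anomalies the paper examines.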

Variation of CONWIP Systems

This paper describes the workings of four models of CONWIP systems used to date: the basic CONWIP system, the hybrid CONWIP system, the multi-product CONWIP system, and the parallel CONWIP system. A final, novel model is introduced in this paper in a general form. These models may be adopted both for simulation studies and for implementation on the shop floor. For each model, the input parameters of interest are highlighted and their impacts on several system performance measures are addressed.

The Establishment of the Cause System of Poor Construction Site Safety and Priority Analysis from Different Perspectives

Construction site safety in China has aroused comprehensive concern all over the world, and it is imperative to investigate the main causes of poor construction site safety. This paper divides all the causes into four aspects, namely worker, object, environment and management factors, and sets up the accident-cause element system based on the Delphi method. This is followed by the application of structural equation modeling to examine the importance of each aspect of the causes from the standpoints of the different roles involved in construction. The results indicate that all four aspects of the factors are in need of improvement, and that different roles have different ideas concerning the priority of those factors. The paper provides instructive guidance for practitioners taking measures to improve construction site safety in China.