Statistical Computation of Volatility in Financial Time Series Data

Amid ongoing developments in the economic sector and the financial crises occurring throughout the world, volatility measurement has become one of the most important concepts in financial time series analysis. In this paper we therefore discuss the volatility of the Amman stock market (Jordan) over a certain period of time. Since the wavelet transform is one of the best-known filtering methods and has grown rapidly in popularity over the last decade, we compare it with the traditional technique, the Fast Fourier transform, to decide which method is better for analyzing volatility. The comparison is carried out on several statistical properties using a Matlab program.
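
The abstract does not give the filtering details, but the comparison it describes can be sketched as follows: smooth a return series once with a wavelet filter and once with an FFT low-pass filter, then contrast a simple statistical property of the two volatility estimates. This is an illustrative Python sketch (the paper uses Matlab); the data, wavelet choice and cutoff are invented placeholders.

```python
# Hypothetical sketch: wavelet vs. FFT filtering of a return series,
# comparing the residual standard deviation as a crude volatility proxy.
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, 1024)   # stand-in for market log-returns

# Wavelet smoothing: zero the finest-scale detail coefficients.
coeffs = pywt.wavedec(returns, "db4", level=4)
coeffs[-1][:] = 0.0
wavelet_smooth = pywt.waverec(coeffs, "db4")[: len(returns)]

# FFT low-pass: zero all but the lowest-frequency bins.
spectrum = np.fft.rfft(returns)
spectrum[len(spectrum) // 8 :] = 0.0
fft_smooth = np.fft.irfft(spectrum, n=len(returns))

for name, smooth in [("wavelet", wavelet_smooth), ("fft", fft_smooth)]:
    resid = returns - smooth
    print(name, "residual std (volatility proxy):", resid.std())
```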

A Follow-up Study on the Elderly Survivors' Mental Health Two Years after the Wenchuan Earthquake

Background: This study investigated the mental health of elderly survivors six months, ten months and two years after the "5.12" Wenchuan earthquake. Methods: Two hundred and thirty-two physically healthy older survivors from earthquake-affected Mianyang County were interviewed. The measures included the Revised Impact of Event Scale (IES-R, Chinese version, for PTSD) and a Chinese Mental Health Inventory for the Elderly (MHIE). A repeated-measures ANOVA was used for statistical analysis. Results: Ten months after the earthquake, the follow-up group had statistically significantly lower IES-R and MHIE scores than the initial group. Two years later, the IES-R score of the follow-up group was still lower than that of the non-follow-up group, but the difference in MHIE scores between the groups was no longer significant. Furthermore, a negative relationship was found between IES-R and MHIE scores. Conclusion: The earthquake had a persistent negative impact on older survivors' mental health within the two-year period; although the PTSD level declined significantly with time, it did not disappear completely.
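
For readers unfamiliar with the test named above, this is a minimal sketch of a repeated-measures ANOVA in Python using statsmodels' AnovaRM. The subject counts, score means and time labels are synthetic placeholders, not the study's data.

```python
# Hypothetical repeated-measures ANOVA on synthetic IES-R-like scores
# measured at three time points for the same subjects.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
n = 30  # synthetic subjects; the study followed 232 survivors
scores = rng.normal([40.0, 32.0, 28.0], 6.0, size=(n, 3)).ravel()
data = pd.DataFrame({
    "subject": np.repeat(np.arange(n), 3),
    "time": np.tile(["6mo", "10mo", "2yr"], n),
    "ies_r": scores,
})
print(AnovaRM(data, depvar="ies_r", subject="subject", within=["time"]).fit())
```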

Compiler-Based Architecture for Context-Aware Frameworks

Computers are being integrated into various aspects of everyday human life, in different shapes and with different abilities. This fact has intensified the need for software development technologies that are: 1) portable, 2) adaptable, and 3) simple to develop. This is also known as the Pervasive Computing Problem (PCP), which can be addressed in different ways, each with its own pros and cons; Context Oriented Programming (COP) is one method of addressing the PCP. In this paper a design for a COP framework, a context-aware framework, is presented which eliminates the weak points of a previous design based on interpreted languages, while bringing the power of compiled languages to the implementation of such frameworks. The key point of this improvement is combining COP with Dependency Injection (DI) techniques. Both the old and new frameworks are analyzed to show their advantages and disadvantages. Finally, a simulation of both designs is presented, indicating that the practical results agree with the theoretical analysis, with the new design running almost 8 times faster.
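
As a rough illustration of the COP-plus-DI idea (not the paper's framework, which targets compiled languages), the sketch below shows a container that injects a context-specific implementation, so call sites stay context-unaware. The service names and contexts are invented.

```python
# Hypothetical sketch: a DI container keyed by (service, context), so the
# active context selects which implementation gets injected.
from typing import Callable, Dict, Tuple

class Container:
    def __init__(self) -> None:
        self._bindings: Dict[Tuple[str, str], Callable] = {}

    def bind(self, service: str, context: str, factory: Callable) -> None:
        self._bindings[(service, context)] = factory

    def resolve(self, service: str, context: str):
        return self._bindings[(service, context)]()

container = Container()
container.bind("renderer", "desktop", lambda: "full-resolution renderer")
container.bind("renderer", "mobile", lambda: "lightweight renderer")

current_context = "mobile"  # e.g. detected from the device at runtime
print(container.resolve("renderer", current_context))
```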

An Evaluation of Carbon Dioxide Emissions Trading among Enterprises: The Tokyo Cap and Trade Program

This study proposes three methods for evaluating the Tokyo Cap and Trade Program when emissions trading is performed virtually among enterprises, focusing on carbon dioxide (CO2), which is the only emitted greenhouse gas that tends to increase. The first method identifies the optimum reduction rate yielding the highest cost benefit, the second examines emissions trading among enterprises through market trading, and the third verifies long-term emissions trading during the term of the plan (2010-2019), checking the validity of emissions trading partly using Geographic Information Systems (GIS). The findings of this study can be summarized in the following three points. 1. Since the total cost benefit is greatest at a 44% reduction rate, the rate can be set higher than that of the Tokyo Cap and Trade Program to obtain a greater total cost benefit. 2. At a 44% reduction rate, among 320 enterprises, 8 purchasing enterprises and 245 selling enterprises gain profits from emissions trading, while 67 enterprises perform voluntary reduction without conducting emissions trading. Therefore, to further promote emissions trading, it is necessary to increase the sales volume of emissions trading, in addition to the number of selling enterprises, by increasing the number of purchasing enterprises. 3. Compared to short-term emissions trading, few enterprises benefit in each year of the long-term emissions trading of the Tokyo Cap and Trade Program: at most 81 enterprises can gain profits from emissions trading in FY 2019. Therefore, it is necessary to set the reduction rate higher so as to increase the number of enterprises that participate in emissions trading and benefit from the restraint of CO2 emissions.
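
The first evaluation method amounts to a one-dimensional optimization: scan candidate reduction rates and pick the one maximizing total cost benefit. The sketch below illustrates only that scan; the cost and benefit curves are invented placeholders, not the paper's estimated functions.

```python
# Illustrative scan for the reduction rate with the highest net benefit.
import numpy as np

rates = np.linspace(0.0, 1.0, 101)        # candidate reduction rates
benefit = 100.0 * rates                    # placeholder: benefit grows with reduction
cost = 120.0 * rates ** 2                  # placeholder: cost grows faster at high rates
net = benefit - cost

best = rates[np.argmax(net)]
print(f"reduction rate with highest net benefit: {best:.2f}")
```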

System-Level Energy Estimation for SoC based on the Dynamic Behavior of Embedded Software

This paper describes a system-level SoC energy consumption estimation method based on the dynamic behavior of embedded software in the early stages of SoC development. A major problem in SoC development is rework caused by unreliable energy consumption estimates at the early stages. The energy consumption of an SoC used in embedded systems is strongly affected by the dynamic behavior of the software. At the early stages of SoC development, modeling with a high level of abstraction is required both for the dynamic behavior of the software and for the behavior of the SoC. We estimate the energy consumption through UML model-based simulation. The proposed method was applied to an actual embedded system in an MFP. The resulting energy consumption estimates are more accurate than those of conventional methods, and the method promises to reduce the chance of rework in SoC development.
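
The core accounting in such state-based estimation can be sketched very simply: accumulate (power of the active state) x (time spent in it) along a behavior trace. The state names and power figures below are invented for illustration; the paper drives this from a UML model simulation.

```python
# Hypothetical state-based energy accounting over a software behavior trace.
power_mw = {"idle": 5.0, "cpu_active": 120.0, "dma_transfer": 80.0}

# A trace of (state, duration in ms), e.g. produced by a model simulation.
trace = [("idle", 40.0), ("cpu_active", 15.0), ("dma_transfer", 5.0), ("idle", 40.0)]

energy_uj = sum(power_mw[state] * dt for state, dt in trace)  # mW * ms = uJ
print(f"estimated energy: {energy_uj:.1f} uJ")
```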

Project Selection by Using Fuzzy AHP and TOPSIS Technique

In this article we propose a new method for the project selection problem using the fuzzy AHP and TOPSIS techniques. After reviewing four common methods of comparing investment alternatives (net present value, rate of return, benefit-cost analysis and payback period), we use them as criteria in an AHP tree. In this methodology, the Analytical Hierarchy Process, improved by fuzzy set theory, is first used to calculate the weight of each criterion. Then, by implementing the TOPSIS algorithm, the projects are assessed. The results are tested on a numerical example.
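
For readers unfamiliar with TOPSIS, this is a minimal sketch of the standard algorithm (the paper's fuzzy-AHP weights would replace the hard-coded weights). Rows are projects, columns are the four criteria; all criteria are treated as benefit criteria here, and the score matrix is an invented example.

```python
# Minimal TOPSIS: rank alternatives by closeness to the ideal solution.
import numpy as np

scores = np.array([[0.8, 0.6, 0.7, 0.5],   # project A
                   [0.6, 0.9, 0.5, 0.7],   # project B
                   [0.7, 0.7, 0.8, 0.6]])  # project C
weights = np.array([0.4, 0.3, 0.2, 0.1])   # e.g. from fuzzy AHP

norm = scores / np.linalg.norm(scores, axis=0)  # vector normalization
v = norm * weights
ideal, anti = v.max(axis=0), v.min(axis=0)
d_pos = np.linalg.norm(v - ideal, axis=1)       # distance to ideal
d_neg = np.linalg.norm(v - anti, axis=1)        # distance to anti-ideal
closeness = d_neg / (d_pos + d_neg)
print("ranking (best first):", np.argsort(-closeness))
```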

Comparative Study of Complexity in Streetscape Composition

This research is a comparative study of complexity, as a multidimensional concept, in the context of streetscape composition in Algeria and Japan. Eighty streetscape visual arrays were collected and presented to 20 participants with different cultural backgrounds, to be categorized and classified according to their degrees of complexity. Three analysis methods were used: cluster analysis, the ranking method and the Hayashi Quantification method (Method III). The results showed that complexity, disorder, irregularity and disorganization are often conflicting concepts in the urban context. Algerian daytime streetscapes appear balanced, ordered and regular, while Japanese daytime streetscapes appear unbalanced, regular and vivid. Variety, richness and irregularity, with some aspects of order and organization, characterize Algerian night streetscapes, whereas Japanese night streetscapes are more related to balance, regularity, order and organization, with some aspects of confusion and ambiguity. Complexity mainly characterized Algerian avenues with green infrastructure. For Japanese participants, Japanese traditional night streetscapes were complex, while for foreign participants, Algerian and Japanese avenue nightscapes were the most complex visual arrays.
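
Of the three methods named above, the cluster-analysis step is easy to sketch: group the streetscape images by participants' complexity ratings. The rating matrix below is random placeholder data with the study's dimensions (80 scenes, 20 raters); the paper's actual procedure may differ.

```python
# Hypothetical hierarchical clustering of streetscape complexity ratings.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
ratings = rng.integers(1, 8, size=(80, 20)).astype(float)  # 80 scenes x 20 raters

z = linkage(ratings, method="ward")
labels = fcluster(z, t=4, criterion="maxclust")  # cut into 4 complexity groups
print("scenes per cluster:", np.bincount(labels)[1:])
```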

On-line Speech Enhancement by Time-Frequency Masking under Prior Knowledge of Source Location

This paper presents a source extraction system that extracts only target signals, under constraints on source location, in on-line systems. The proposed system is a method for enhancing a target signal while suppressing other interfering signals; its performance is superior to that of the other methods considered, and the extraction of the target source is comparatively complete. The method is based on a beamforming concept and uses an improved time-frequency (TF) mask-based BSS algorithm to separate a target signal from multiple noise sources. The target sources are assumed to be in front, and the test data were recorded in a reverberant room. The proposed method was evaluated with PESQ scores on real-recorded sentences and showed noticeable speech enhancement.
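
To make the TF-masking idea concrete, here is a generic sketch (not the paper's BSS algorithm): build a binary mask over STFT bins and keep only those where the target dominates. The mask below is an oracle mask computed from a known toy target, purely for illustration.

```python
# Illustrative binary time-frequency masking on a toy mixture.
import numpy as np
from scipy.signal import stft, istft

fs = 16000
rng = np.random.default_rng(3)
target = np.sin(2 * np.pi * 440 * np.arange(fs) / fs)  # toy target signal
mixture = target + rng.normal(0, 0.5, fs)               # toy interference added

f, t, s_mix = stft(mixture, fs=fs, nperseg=512)
_, _, s_tgt = stft(target, fs=fs, nperseg=512)          # oracle reference

mask = (np.abs(s_tgt) > np.abs(s_mix - s_tgt)).astype(float)  # target-dominant bins
_, enhanced = istft(s_mix * mask, fs=fs, nperseg=512)
print("enhanced signal samples:", enhanced.shape)
```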

Curing Methods Yield Multiple Refractive Index of Benzocyclobutene Polymer Film

Refractive index control of benzocyclobutene (BCB 4024-40) is achieved by applying different conditions during the thermal curing of the BCB film. A refractive index (RI) change of 1.49% is obtained when the BCB film is cured in an oven, while the change is 0.1% when it is cured on a hotplate. The two curing methods thus exhibit a temperature-dependent refractive index change of the BCB photosensitive polymer. By carefully controlling the curing conditions, multiple layers of BCB with different RIs can be fabricated, which can then be applied in the fabrication of optical waveguides.

Active Contours with Prior Corner Detection

Deformable active contours are widely used in computer vision and image processing applications for image segmentation, especially in biomedical image analysis. The active contour, or "snake", deforms towards a target object under the control of internal, image and constraint forces. However, if the contour is initialized with a small number of control points, there is a high probability of missing the sharp corners of the object as the contour deforms. In this paper, a new technique is proposed to construct the initial contour by incorporating prior knowledge of the object's significant corners, detected using the Harris operator. This reconstructed contour then deforms, attracting the snake towards the targeted object without missing the corners. Experimental results on several synthetic images show that the new technique handles sharp corners with higher accuracy than traditional methods.
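
The corner-priming step can be sketched with off-the-shelf tools: detect Harris corners on a synthetic shape and use them to seed the initial contour's control points. This uses scikit-image and does not reproduce the paper's snake formulation.

```python
# Hypothetical sketch: Harris corners of a synthetic square, to be used as
# prior control points for the initial contour.
import numpy as np
from skimage import draw
from skimage.feature import corner_harris, corner_peaks

img = np.zeros((100, 100))
rr, cc = draw.rectangle((25, 25), (75, 75))  # square with four sharp corners
img[rr, cc] = 1.0

corners = corner_peaks(corner_harris(img), min_distance=5)
print("detected corners (row, col):\n", corners)  # seed the initial snake here
```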

Real-time Haptic Modeling and Simulation for Prosthetic Insertion

In this work a surgical simulator is produced which enables a training otologist to conduct a virtual, real-time prosthetic insertion. The simulator provides the Ear, Nose and Throat surgeon with real-time visual and haptic responses during virtual cochlear implantation into a 3D model of the human Scala Tympani (ST). The parametric model is derived from measured data published in the literature and accounts for human morphological variance, such as differences in cochlear shape, enabling patient-specific pre-operative assessment. Haptic modeling techniques use real physical data and insertion force measurements to develop a force model which mimics the physical behavior of an implant as it collides with the ST walls during insertion. Output force profiles acquired from the insertion studies conducted in this work are used to validate the haptic model. The simulator provides the user with real-time, quantitative insertion force information and the associated electrode position as the user inserts the virtual implant into the ST model. The information provided by this study may also be of use to implant manufacturers for design enhancements, as well as for training specialists in optimal force administration using the simulator. The paper reports on the methods for anatomical modeling and haptic algorithm development, with a focus on simulator design, development, optimization and validation. The techniques may be transferable to other medical applications that involve prosthetic device insertion where the user's vision is obstructed.
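
As background for the force-model idea, here is a generic penalty-based contact force of the kind commonly used in haptic rendering (not the paper's measured force model): when the virtual implant penetrates a wall, push back with a spring-damper force. The gains are invented.

```python
# Illustrative spring-damper contact force for haptic feedback.
def contact_force(penetration_mm: float, velocity_mm_s: float,
                  k: float = 0.6, b: float = 0.01) -> float:
    """Return restoring force in newtons; k and b are invented gains."""
    if penetration_mm <= 0.0:
        return 0.0  # no contact, no force
    return k * penetration_mm + b * velocity_mm_s

print(contact_force(0.2, 1.5))  # small penetration -> small restoring force
```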

Border Limited Adaptive Subdivision Based On Triangle Meshes

Subdivision is a method for creating a smooth surface from a coarse mesh by repeatedly refining it. Conventional approaches to computing and rendering such surfaces are costly in both memory and computational time, as the number of faces increases exponentially. Adaptive subdivision reduces computational time and memory by subdividing only selected areas. In this paper, a new adaptive subdivision method for triangle meshes is introduced. The method defines new adaptive subdivision rules by considering the properties of each triangle's neighbors and is embedded in traditional Loop subdivision. It prevents some undesirable side effects that appear in conventional adaptive schemes. Models subdivided by our method are compared with those of other adaptive subdivision methods.
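
The selective-refinement idea can be sketched with a simple 1-to-4 midpoint split gated by a selection test (this is not the paper's border-limited Loop rules, just the adaptive skeleton):

```python
# Simplified adaptive triangle subdivision: refine only selected triangles.
import numpy as np

def midpoint(a, b):
    return tuple((np.asarray(a) + np.asarray(b)) / 2.0)

def subdivide(tri, select):
    """tri: three vertex tuples; select: predicate deciding refinement."""
    if not select(tri):
        return [tri]                 # leave unselected triangles coarse
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

tri = ((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))
refined = subdivide(tri, select=lambda t: True)  # e.g. test neighbor properties here
print(len(refined), "triangles after one adaptive step")
```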

Artificial Neural Networks for Classifying Magnetic Measurements in Tokamak Reactors

This paper is mainly concerned with applying a novel data interpretation technique to the characterization and classification of plasma column measurements in Tokamak reactors for nuclear fusion applications. The proposed method exploits several concepts from soft computing theory. In particular, Artificial Neural Networks are used to classify magnetic variables that determine the shape and position of the plasma, with reduced computational complexity. The technique is used to analyze simulated databases of plasma equilibria based on the ITER geometry configuration. As well as demonstrating the successful recovery of scalar equilibrium parameters, we show that the technique can yield practical advantages compared with earlier methods.
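
The classification step can be sketched generically with a small multilayer perceptron; the probe readings and class rule below are synthetic stand-ins (the paper uses simulated ITER equilibria, and its network architecture is not reproduced here).

```python
# Hypothetical MLP classifying simulated magnetic measurements into classes.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(4)
X = rng.normal(size=(500, 40))                # 40 magnetic probe readings per sample
y = (X[:, :10].sum(axis=1) > 0).astype(int)   # toy rule standing in for shape class

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X[:400], y[:400])
print("held-out accuracy:", clf.score(X[400:], y[400:]))
```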

Efficient Tools for Managing Uncertainties in Design and Operation of Engineering Structures

Actual loads, material characteristics and other quantities often differ from their design values. This can cause impaired function, shorter life or failure of a civil engineering structure, machine, vehicle or other appliance. The paper identifies the main causes of these uncertainties and deviations and presents a systematic approach and efficient tools for eliminating them or mitigating their consequences. Emphasis is put on the design stage, which is the most important for ensuring reliability. Principles of robust design and important tools are explained, including FMEA, sensitivity analysis and probabilistic simulation methods. The lifetime prediction of long-life objects can be improved by long-term monitoring of the load response and damage accumulation in operation. The condition evaluation of engineering structures, such as bridges, is often based on visual inspection and verbal description; here, methods based on fuzzy logic can reduce subjective influences.
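
A minimal sketch of the probabilistic simulation mentioned above: sample load and strength from assumed distributions and estimate the probability that load exceeds strength (Monte Carlo). The distribution parameters are invented for illustration.

```python
# Monte Carlo estimate of failure probability P(load > strength).
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
load = rng.normal(100.0, 15.0, n)      # assumed load distribution, e.g. kN
strength = rng.normal(150.0, 20.0, n)  # assumed strength distribution, e.g. kN

p_failure = np.mean(load > strength)
print(f"estimated failure probability: {p_failure:.4f}")
```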

Issues in Spectral Source Separation Techniques for Plant-wide Oscillation Detection and Diagnosis

In the last few years, three multivariate spectral analysis techniques, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA) and Non-negative Matrix Factorization (NMF), have emerged as effective tools for oscillation detection and isolation. While the first method is used to determine the number of oscillatory sources, the latter two identify source signatures by formulating the detection problem as a source identification problem in the spectral domain. In this paper, we present a critical drawback of the underlying linear (mixing) model which strongly limits the ability of the associated source separation methods to determine the number of sources and/or identify the physical source signatures. It is shown that the assumed mixing model is only valid if each unit of the process gives equal weighting (an all-pass filter) to all oscillatory components in its inputs. This is in contrast to the fact that each unit, in general, acts as a filter with a non-uniform frequency response. Thus, the model can only facilitate correct identification of a source with a single frequency component, which is again unrealistic. To overcome this deficiency, an iterative post-processing algorithm that correctly identifies the physical source(s) is developed. A further issue with the existing methods is that they lack a procedure for pre-screening non-oscillatory or noisy measurements, which obscure the identification of oscillatory sources. In this regard, a pre-screening procedure based on the notion of a sparseness index is prescribed to eliminate noisy and non-oscillatory measurements from the data set used for analysis.
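
To illustrate the pre-screening idea, here is one common sparseness measure (Hoyer's index; the paper's exact definition may differ): it evaluates to 1 for a single-peak spectrum and 0 for a flat one, so low values flag noisy, non-oscillatory measurements.

```python
# Sketch of a spectral sparseness index for pre-screening measurements.
import numpy as np

def sparseness(spectrum: np.ndarray) -> float:
    x = np.abs(spectrum)
    n = x.size
    return (np.sqrt(n) - x.sum() / np.linalg.norm(x)) / (np.sqrt(n) - 1.0)

peaky = np.zeros(128)
peaky[10] = 1.0                 # oscillatory: energy in one frequency bin
flat = np.ones(128)             # noisy: energy spread over all bins
print(sparseness(peaky), sparseness(flat))  # ~1.0 vs. 0.0
```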

Study on Applying Fuzzy AHP and GRA in Selection of Agent Construction Enterprise

To help the client select a competent agent construction enterprise (ACE), this study investigates the selection standards using the Fuzzy Analytic Hierarchy Process (FAHP) and builds a mathematical evaluation model with Grey Relational Analysis (GRA). Based on the findings of a literature review, four ordered levels are established within the model, taking into consideration the various agent construction models used in practice. The process of applying FAHP and GRA is then discussed in detail. Finally, through a case study, this paper illustrates how to apply these methods to obtain the weight of each standard and the final assessment result.
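
A minimal GRA sketch under common textbook conventions (the paper's exact normalization may differ): each candidate ACE is compared against an ideal reference series, and FAHP-style weights combine the per-standard grey relational coefficients. The scores and weights are invented.

```python
# Minimal grey relational analysis: grade candidates against an ideal series.
import numpy as np

scores = np.array([[0.9, 0.7, 0.8],    # candidate 1, three standards
                   [0.6, 0.9, 0.7],    # candidate 2
                   [0.8, 0.8, 0.6]])   # candidate 3
weights = np.array([0.5, 0.3, 0.2])    # e.g. from FAHP

norm = scores / scores.max(axis=0)     # larger-is-better normalization
ref = norm.max(axis=0)                 # ideal reference series
delta = np.abs(norm - ref)
rho = 0.5                              # distinguishing coefficient
xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
grades = (xi * weights).sum(axis=1)    # weighted grey relational grades
print("grey relational grades:", grades)
```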

Consideration of Criteria of Vibration Comfort of People in Diagnosis and Design of Buildings

The increasing influence of traffic on buildings and the people residing in them should be taken into account in diagnosis and design. Users of buildings expect that vibrations occurring in their environment will neither damage the building or accelerate its wear, nor compromise the comfort required in rooms designed to accommodate people. This article describes methods and principles useful in the design and diagnostics of buildings located near transportation routes, with particular emphasis on the impact of traffic vibration on people in buildings. It also describes the procedures used to obtain vibration parameters in different diagnostic and design cases. A universal algorithm for the diagnostics and design of buildings that ensures the vibration comfort of people residing in them is also presented.

Project Selection by Using a Fuzzy TOPSIS Technique

Selecting a project from a set of possible alternatives is a difficult task that the decision maker (DM) has to face. In this paper we propose a new method for the project selection problem using a fuzzy TOPSIS technique. After reviewing four common methods of comparing investment alternatives (net present value, rate of return, benefit-cost analysis and payback period), we use them as criteria in a TOPSIS technique. We first calculate the weight of each criterion through pairwise comparison and then apply the improved TOPSIS assessment to the project selection.
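
The pairwise-comparison weighting step can be sketched with the standard principal-eigenvector method (the paper's fuzzy variant is not reproduced). The comparison matrix below is an invented example over the four criteria; a TOPSIS ranking sketch appears after the earlier fuzzy AHP and TOPSIS abstract.

```python
# Criterion weights from a pairwise comparison matrix (principal eigenvector).
import numpy as np

# A[i, j] = how much more important criterion i is than criterion j.
A = np.array([[1.0, 3.0, 5.0, 7.0],
              [1/3, 1.0, 3.0, 5.0],
              [1/5, 1/3, 1.0, 3.0],
              [1/7, 1/5, 1/3, 1.0]])

vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()                        # normalize to get criterion weights
print("criterion weights:", np.round(w, 3))
```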

Closed-Form Delay Model for On-Chip VLSI RLCG Interconnects with Ramp Input under Different Damping Conditions

Fast delay estimation methods, as opposed to simulation techniques, are needed for incremental performance-driven layout synthesis. On-chip inductive effects are becoming predominant in deep submicron interconnects due to increasing clock speeds and circuit complexity. Inductance causes noise in signal waveforms, which can adversely affect circuit performance and signal integrity. Several approaches have been put forward that consider inductance in on-chip interconnect modelling. However, at even higher frequencies, of the order of a few GHz, the shunt dielectric loss component becomes comparable to the other electrical parameters in high-speed VLSI design. To cope with this effect, the on-chip interconnect has to be modelled as a distributed RLCG line. Elmore-delay-based methods, although efficient, cannot accurately estimate the delay for an RLCG interconnect line. In this paper, an accurate analytical delay model is derived, based on the first and second moments of RLCG interconnection lines. The proposed model considers the effects of both the inductance and conductance matrices. We performed simulations at the 0.18 μm technology node, and an error as low as 5% is achieved with the proposed model when compared to SPICE. The importance of the conductance matrices in interconnect modelling is also discussed, and it is shown that neglecting G in interconnect line modelling results in a delay error as high as 6% compared to SPICE.
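
As background, here is the classic Elmore delay for a lumped RC ladder, the baseline the paper improves on (its closed-form RLCG model is not reproduced here). Each capacitor contributes its capacitance times the resistance on the path from the source to it; the segment values are illustrative.

```python
# Elmore delay for an RC ladder: sum of (upstream resistance) x (node cap).
def elmore_delay(rs, cs):
    """rs[i], cs[i]: resistance/capacitance of segment i, source to load."""
    total, upstream_r = 0.0, 0.0
    for r, c in zip(rs, cs):
        upstream_r += r
        total += upstream_r * c   # each cap sees the resistance upstream of it
    return total

# Three identical segments of 10 ohm / 5 fF each:
print(elmore_delay([10.0] * 3, [5e-15] * 3), "seconds")
```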

A New Algorithm for Determining the Leading Coefficient in the Parabolic Equation

This paper investigates the inverse problem of determining the unknown time-dependent leading coefficient in the parabolic equation using the usual conditions of the direct problem and one additional condition. An algorithm is developed for solving the inverse problem numerically using the technique of space decomposition in a reproducing kernel space. The leading coefficient is obtained by solving a lower triangular linear system. Numerical experiments are presented to show the efficiency of the proposed method.
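
The final solve step named above is cheap because lower triangular systems yield to forward substitution. This is a generic sketch with SciPy; the matrix is an invented example, not the paper's kernel-space system.

```python
# Forward substitution via scipy: solve L x = b for lower triangular L, O(n^2).
import numpy as np
from scipy.linalg import solve_triangular

L = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.5, 1.0, 4.0]])
b = np.array([2.0, 5.0, 6.5])

x = solve_triangular(L, b, lower=True)
print("coefficients:", x)
```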