Dynamic Behaviour of Earth Dams for Variation of Earth Material Stiffness

This paper presents a numerical analysis of the seismic behaviour of earth dams. The analysis is conducted for the solid phase alone, which may correspond to the response of the dam before water filling, and concerns a simple case: the elastic response of the dam. Numerical analyses are conducted using the FLAC3D program. The behaviour of the shell, the core, and the foundation is assumed to be elastic. Results show the influence of the variation of the shear modulus of the core and shell on the seismic amplification of the dam. It can be observed that increasing the shear modulus of the core leads to a moderate increase in the dynamic amplification, whereas increasing the shear modulus of the shell leads to a significant increase in the dynamic amplification.

Corporate Fraud: An Analysis of Malaysian Securities Commission Enforcement Releases

Economic crime (i.e. corporate fraud) has a significant impact on business. This study analyzes the fraud cases reported by the Malaysian Securities Commission. Frauds involving market manipulation and/or illegal share trading are the most common types of fraud reported over the six years analyzed. The highest number of frauds reported involved investment and fund holding companies. Alarmingly, the results indicate that quite a high number of fraud cases are committed by management. The higher number of Chinese perpetrators may be due to the fact that they are the dominant group in Malaysian business. The results also show that more than half of the companies involved in fraud are privately held companies in the investment/fund/finance sector. The results of this study highlight general characteristics of the perpetrators (persons and companies) that commit fraud, which could help regulators in their monitoring and enforcement activities. For investors, this would help in analyzing their business investment or portfolio risk.

SMCC: Self-Managing Congestion Control Algorithm

Transmission Control Protocol (TCP) Vegas detects network congestion at an early stage and successfully prevents the periodic packet loss that usually occurs in TCP Reno. It has been demonstrated that TCP Vegas outperforms TCP Reno in many aspects. However, TCP Vegas suffers from several problems that affect its congestion avoidance mechanism. One of the most important weaknesses of TCP Vegas is that its thresholds alpha and beta depend on a good expected-throughput estimate, which in turn depends on a good minimum-RTT estimate. In order to make the system more robust, alpha and beta must be made responsive to network conditions (they are currently chosen statically). This paper proposes a modified Vegas algorithm that can be adjusted to give good performance compared to other transmission control protocols (TCPs). To do this, we use a particle swarm optimization (PSO) algorithm to tune alpha and beta. The simulation results validate the advantages of the proposed algorithm in terms of performance.
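The PSO tuning step can be sketched as follows. The fitness function below is a hypothetical stand-in (in the paper the fitness would be measured from simulated Vegas flows), and the parameter ranges and PSO coefficients are illustrative assumptions, not the authors' actual setup:

```python
import random

def fitness(alpha, beta):
    # Hypothetical surrogate objective: a smooth bowl with its minimum at
    # (1.5, 3.5) stands in for the simulated network performance measure.
    if alpha >= beta:
        return float('inf')  # Vegas requires alpha < beta
    return (alpha - 1.5) ** 2 + (beta - 3.5) ** 2

def pso(n_particles=20, iters=100, bounds=((0.5, 4.0), (1.0, 8.0))):
    random.seed(0)
    pos = [[random.uniform(*b) for b in bounds] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=lambda p: fitness(*p))[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(2):
                # Standard update: inertia + cognitive pull + social pull.
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * random.random() * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            if fitness(*pos[i]) < fitness(*pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest + [gbest], key=lambda p: fitness(*p))[:]
    return gbest

alpha, beta = pso()
```

The infeasibility penalty on alpha >= beta keeps the swarm inside the region where the Vegas thresholds are meaningful.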

Into the Bank Lending Channel of SEE: Greek Banks' Buffering Effects

This paper tries to shed light on the existence of a bank lending channel (BLC) in the South Eastern European (SEE) countries. Based on a VAR framework, we test the responsiveness of credit supply to monetary policy shocks. By compiling a new data set and using the reserve requirement ratio, among others, as the policy instrument, we measure the effectiveness of the BLC and the buffering effect of the banks in the SEE countries. The results indicate that loan supply is significantly affected by shifts in monetary policy once demand factors are controlled for. Furthermore, by analyzing the effect of the Greek banks in the region, we conclude that Greek banks do buffer the negative effects of monetary policy transmission. Since Greek banks hold a significant share of the SEE banking markets, we argue that they positively influence the economic growth of the SEE countries.

Analysis of Heart Beat Dynamics through Singularity Spectrum

The detection of arrhythmias and life-threatening conditions is highly essential in today's world, and it can be accomplished by advanced non-linear processing methods for the accurate analysis of the complex signals of heartbeat dynamics. In this perspective, recent developments in the field of multiscale information content have led to the Microcanonical Multiscale Formalism (MMF). We show that this framework provides several signal analysis techniques that are especially adapted to the study of heartbeat dynamics. In this paper, we present first-hand results on whether the considered heartbeat dynamics signals have multiscale properties, by computing local predictability exponents (LPEs) and the Unpredictable Points Manifold (UPM), and thereby computing the singularity spectrum.

An Efficient Obstacle Detection Algorithm Using Colour and Texture

This paper presents a new classification algorithm using colour and texture for obstacle detection. Colour information is computationally cheap to learn and process. However, in many cases colour alone does not provide enough information for classification. Texture information can improve classification performance but usually comes at a high computational cost. Our algorithm uses both colour and texture features, but texture is only consulted when colour is unreliable. During the training stage, texture features are learned specifically to improve the performance of a colour classifier. The algorithm learns a set of simple texture features, and only the most effective features are used in the classification stage. Our algorithm therefore has a very good classification rate while still being fast enough to run on a limited computing platform. The proposed algorithm was tested on a challenging outdoor image set. Test results show that the algorithm achieves a much better trade-off between classification performance and efficiency than a typical colour classifier.
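The colour-first, texture-fallback idea can be sketched as below. The colour prototypes, the ambiguity margin, and the use of local intensity variance as the texture feature are all illustrative assumptions, not the paper's learned features:

```python
import math

# Assumed mean RGB prototypes for the two classes (hypothetical values).
GROUND = (120, 110, 90)
OBSTACLE = (60, 80, 60)

def local_variance(patch):
    # Cheap texture feature: variance of grey levels in a neighbourhood.
    grey = [sum(px) / 3 for px in patch]
    mean = sum(grey) / len(grey)
    return sum((g - mean) ** 2 for g in grey) / len(grey)

def classify(pixel, patch, margin=15.0, var_threshold=200.0):
    d_ground = math.dist(pixel, GROUND)
    d_obstacle = math.dist(pixel, OBSTACLE)
    if abs(d_ground - d_obstacle) > margin:
        # Colour alone is decisive: take the nearer prototype.
        return 'obstacle' if d_obstacle < d_ground else 'ground'
    # Ambiguous colour: fall back on texture (obstacles assumed rough here).
    return 'obstacle' if local_variance(patch) > var_threshold else 'ground'

smooth_patch = [(118, 112, 91)] * 9
label = classify((121, 109, 92), smooth_patch)
```

The texture computation only runs inside the ambiguous band, which is what keeps the average per-pixel cost close to that of a pure colour classifier.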

Unsupervised Segmentation by Hidden Markov Chain with Bi-dimensional Observed Process

In the context of unsupervised segmentation, we propose a bi-dimensional hidden Markov chain model (X, Y) that we adapt to the image segmentation problem. The bi-dimensional observed process Y = (Y1, Y2) is such that Y1 represents the noisy image and Y2 represents noisy supplementary information on the image, for example a noisy proportion of pixels of the same type in a neighborhood of the current pixel. The proposed model can be seen as a competitive alternative to the Hilbert-Peano scan. We propose a Bayesian algorithm to estimate the parameters of the considered model. The performance of this algorithm is globally favorable compared to the bi-dimensional EM algorithm, as shown by numerical and visual results.
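A toy sketch of segmenting with such a chain is given below, using marginal posterior mode (MPM) estimation via forward-backward. Here Y1 and Y2 are assumed conditionally independent given the hidden class X, and all parameters are fixed illustrative values (the paper estimates them with a Bayesian algorithm):

```python
import math

STATES = (0, 1)
trans = [[0.9, 0.1], [0.1, 0.9]]   # assumed class-transition matrix
prior = [0.5, 0.5]

def emission(y1, y2, x, noise=1.0):
    # Gaussian likelihood per channel, class means 0 and 2 (illustrative).
    mu = 2.0 * x
    g = lambda y: math.exp(-0.5 * ((y - mu) / noise) ** 2)
    return g(y1) * g(y2)

def mpm_segment(obs):
    n = len(obs)
    # Forward pass with per-step normalisation.
    alpha = [[prior[x] * emission(*obs[0], x) for x in STATES]]
    for t in range(1, n):
        a = [sum(alpha[-1][xp] * trans[xp][x] for xp in STATES)
             * emission(*obs[t], x) for x in STATES]
        s = sum(a)
        alpha.append([v / s for v in a])
    # Backward pass.
    beta = [[1.0, 1.0] for _ in range(n)]
    for t in range(n - 2, -1, -1):
        b = [sum(trans[x][xn] * emission(*obs[t + 1], xn) * beta[t + 1][xn]
                 for xn in STATES) for x in STATES]
        s = sum(b)
        beta[t] = [v / s for v in b]
    # Marginal posterior mode at each site.
    return [max(STATES, key=lambda x: alpha[t][x] * beta[t][x])
            for t in range(n)]

obs = [(0.1, -0.2), (0.2, 0.1), (1.9, 2.2), (2.1, 1.8)]
segmentation = mpm_segment(obs)
```

In the image setting, the sites of the chain would be pixels ordered by a space-filling scan, with Y2 carrying the neighborhood-proportion information.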

On the Continuous Service of Distributed e-Learning System

In this paper, we propose a backup and recovery technique for Peer-to-Peer applications, such as the distributed asynchronous Web-Based Training system that we have previously proposed. In order to improve the scalability and robustness of this system, all contents and functions are realized as mobile agents. These agents are distributed among computers and can be located using a Peer-to-Peer network based on a modified Content-Addressable Network. In the proposed system, the entire service does not become unavailable even if some computers break down; however, contents disappear when their agents are lost. As a solution to this issue, backups of the agents are distributed among the computers. If a failure of a computer is detected, the other computers continue the service using the backups of the agents that belonged to that computer.

Optimizing Materials Cost and Mechanical Properties of PVC Electrical Cable's Insulation by Using a Mixture Experimental Design Approach

With the development of polyvinyl chloride (PVC) products in many applications, the challenges of investigating the raw material composition and reducing the cost have both become more and more important. Considerable research has been done on the effect of additives on PVC products. Most PVC composites research investigates the effect of only a single factor, or a few factors, at a time. This isolated consideration of the input factors does not take into account the interaction effects between the different factors. This paper implements a mixture experimental design approach to find a cost-effective PVC composition for the production of electrical insulation cables that satisfies ASTM Designation D 6096. The analysis of the results showed that a minimum cost can be achieved using 20% virgin PVC, 18.75% recycled PVC, 43.75% CaCO3 with a particle size of 10 microns, 14% DOP plasticizer, and 3.5% CPW plasticizer. For maximum UTS, the compound should consist of 17.5% DOP, 62.5% virgin PVC, and 20.0% CaCO3 with a particle size of 5 microns. Finally, for the highest ductility, the compound should be made of 35% virgin PVC, 20% CaCO3 with a particle size of 5 microns, and 45.0% DOP plasticizer.

From Experiments to Numerical Modeling: A Tool for Teaching Heat Transfer in Mechanical Engineering

In this work, the numerical simulation of transient heat transfer in a cylindrical probe is carried out. An experiment was conducted by introducing a steel cylinder into a heating chamber and registering its surface temperature over time for one hour. In parallel, a mathematical model was solved for one-dimensional transient heat transfer in cylindrical coordinates, considering the boundary conditions of the test. The model was solved using the finite difference method, because the thermal conductivity of the cylindrical steel bar and the convection heat transfer coefficient used in the model are considered temperature-dependent functions, and both conditions prevent the use of the analytical solution. The comparison between theoretical and experimental results showed that the average deviation is below 2%. It was concluded that numerical methods are useful for solving complex engineering problems. For constant k and h, the experimental methodology used here can serve as a tool for teaching heat transfer in mechanical engineering, using simplified mathematical models with analytical solutions.
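An explicit finite-difference scheme of the kind described can be sketched as follows. The material data and the k(T), h(T) functions below are illustrative assumptions, not the fitted values from the experiment:

```python
# Explicit finite differences for 1-D transient radial conduction in a
# cylinder, rho*c*dT/dt = (1/r) d/dr(k r dT/dr), with convection at the
# surface and temperature-dependent k and h.

def k(T): return 50.0 + 0.01 * T      # W/(m.K), assumed k(T) of the steel
def h(T): return 25.0 + 0.05 * T      # W/(m2.K), assumed h(T) in the chamber

R, N = 0.02, 20                       # cylinder radius (m), radial cells
rho, c = 7850.0, 490.0                # density (kg/m3), specific heat (J/kg.K)
T_inf, T_init = 600.0, 25.0           # chamber and initial temperatures (C)
dr = R / N
dt = 0.01                             # time step (s), chosen for stability

T = [T_init] * (N + 1)
for _ in range(int(60.0 / dt)):       # simulate the first 60 s
    Tn = T[:]
    # Centreline node: symmetry condition dT/dr = 0.
    Tn[0] = T[0] + 4 * k(T[0]) * dt / (rho * c * dr ** 2) * (T[1] - T[0])
    for i in range(1, N):
        r = i * dr
        # Interior nodes: conservative central differences, k at cell faces.
        ke = k(0.5 * (T[i] + T[i + 1]))
        kw = k(0.5 * (T[i] + T[i - 1]))
        flux = ((r + dr / 2) * ke * (T[i + 1] - T[i])
                - (r - dr / 2) * kw * (T[i] - T[i - 1])) / (r * dr ** 2)
        Tn[i] = T[i] + dt / (rho * c) * flux
    # Surface node: half-cell energy balance, convection in minus conduction
    # out (curvature of the thin surface cell neglected).
    q_conv = h(T[N]) * (T_inf - T[N])
    q_cond = k(0.5 * (T[N] + T[N - 1])) * (T[N] - T[N - 1]) / dr
    Tn[N] = T[N] + 2 * dt / (rho * c * dr) * (q_conv - q_cond)
    T = Tn

surface_T = T[N]                      # surface temperature after 60 s
```

Because the scheme is explicit, the time step must satisfy the usual stability bound dt < dr^2 * rho * c / (2 k); the values above respect it.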

Towards Compliance Reporting Using a Balanced Scorecard

Compliance requires effective communication within an enterprise as well as towards a company's external environment. This requirement begins with the implementation of compliance in large-scale compliance projects and persists in compliance reporting during standard operations. On the one hand, understanding of compliance necessities within the organization is promoted; on the other hand, asymmetric information between the company and its compliance stakeholders is reduced. To reach this goal, central reporting must provide a consolidated view of the statuses of the different compliance efforts. A concept that could be adapted for this purpose is the balanced scorecard by Kaplan and Norton. This concept has not yet been analyzed in detail concerning its adequacy for holistic compliance reporting, from compliance projects through to later use in regular compliance operations. This paper first evaluates whether holistic compliance reporting can be designed using the balanced scorecard concept. The current state of compliance reporting clearly shows that scorecards are generally accepted as a compliance reporting tool and are already used for corporate governance reporting; additionally, specialized compliance IT solutions exist on the market. After the scorecard's adequacy is thoroughly examined and proven, an example strategy map is defined as the basis from which to derive a compliance balanced scorecard. This definition answers the question of how to proceed in designing a compliance reporting tool.

Niksic in the Context of Visual Urban Culture

Of all the visual arts, including painting, sculpture, graphics, photography, architecture, and others, architecture is by far the most complex, because the art category is only one of its determinants. Architecture to some extent includes the other arts, which can significantly influence the shaping of an urban space (artistic interventions). These arts largely shape visual culture in combination with other categories: film, TV, the Internet, information technologies that are "changing the world", etc. In the area of architecture and urbanism, visual culture is achieved through the aspects of visual spatial effects. In this context, complex visual deliberation about designing urban areas in order to contribute to the urban visual culture, and with it restore the cultural identity of the city, is becoming almost the primary concern of contemporary urban and architectural practice. The research in this paper relates to the city of Niksic and its place in visual urban culture. We examine the city's existing visual effects and determine the directions of transformability of its physical structure in order to achieve the visual realization of an urban area and the renewal of the cultural identity of a modern city.

PZ: A Z-based Formalism for Modeling Probabilistic Behavior

Probabilistic techniques in computer programs are becoming more and more widely used. Therefore, there is great interest in the formal specification, verification, and development of probabilistic programs. In our work-in-progress project, we are attempting to build a constructive framework for developing probabilistic programs formally. The main contribution of this paper is to introduce an intermediate artifact of our work, a Z-based formalism called PZ, by which one can build set-theoretical models of probabilistic programs. We propose to use a constructive set theory, called CZ set theory, to interpret the specifications written in PZ. Since CZ has an interpretation in Martin-Löf's theory of types, this idea enables us to derive probabilistic programs from correctness proofs of their PZ specifications.

The Efficacy of Self-Administered Danger Ideation Reduction Therapy for a 50-Year-Old Woman with a 20-Year History of Obsessive-Compulsive Disorder: A Case Study

Obsessive-Compulsive Disorder (OCD) is a common and disabling condition. Therapist-delivered treatments that use exposure and response prevention have been found to be very effective in treating OCD, although they are costly and associated with high rates of attrition. Effective treatments that can be made widely available without the need for therapist contact are urgently needed. This case study represents the first published investigation of a self-administered cognitive treatment for OCD, in a 50-year-old female with a 20-year history of OCD. The treatment evaluation occurred over 27 weeks, including 12 weeks of self-administration of the Danger Ideation Reduction Therapy (DIRT) program. Decreases of between 23% and 33% on measures from pre-treatment to follow-up were observed. Bearing in mind the methodological limitations associated with a case study, we conclude that the results reported here are encouraging and indicate that further research evaluating the effectiveness of self-administered DIRT is warranted.

The Self-Energy of an Electron Bound in a Coulomb Field

Recent progress in the calculation of the one-loop self-energy of an electron bound in the Coulomb field is summarized. The relativistic multipole expansion is introduced. This expansion is based on a single assumption: except for the part of the time component of the electron four-momentum corresponding to the electron rest mass, the exchange of four-momentum between the virtual electron and photon can be treated perturbatively. For non-S states and the normalized difference n³Eₙ − E₁ of the S-states, this by itself yields very accurate results after taking the method to the third order. For the ground state, the perturbative treatment of the electron virtual states with very high three-momentum is to be avoided. For these states one can always rearrange the pertinent expression in such a way that the free-particle approximation is allowed. The combination of the relativistic multipole expansion and the free-particle approximation yields a very accurate result after taking the method to the ninth order. These results are in very good agreement with previous results obtained by the partial wave expansion, and they definitely exclude the possibility that the uncertainty in the determination of the proton radius comes from the uncertainty in the calculation of the one-loop self-energy.

Interaxial Distance and Convergence Control for Efficient Stereoscopic Shooting using Horizontal Moving 3D Camera Rig

The proper assessment of interaxial distance and convergence control is an important factor in stereoscopic imaging technology for producing an effective 3D image. To control interaxial distance and convergence for efficient 3D shooting, a horizontal 3D camera rig was designed using hardware components such as an LM guide, a goniometer, and a rotation stage. The horizontal 3D camera rig system can be properly aligned by moving the two cameras horizontally in the same or opposite directions, by adjusting the camera angle, and finally by considering horizontal as well as vertical swing. In this paper, the relationship between interaxial distance and convergence angle control is discussed, and intensive experiments are performed in order to demonstrate easy and effective 3D shooting.
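The basic geometric link between interaxial distance and convergence angle for a toed-in rig can be illustrated as follows; this is standard stereo geometry, and the numeric values are illustrative, not taken from the paper's experiments:

```python
import math

def convergence_angle_deg(interaxial_m, convergence_dist_m):
    # For a toed-in rig, each camera rotates by half the total convergence
    # angle; the two optical axes meet at the convergence distance.
    return math.degrees(2 * math.atan(interaxial_m / (2 * convergence_dist_m)))

b = 0.065          # interaxial distance (m), roughly human eye separation
d = 2.0            # convergence distance to the subject (m)
angle = convergence_angle_deg(b, d)
```

Increasing the interaxial distance or shortening the convergence distance both enlarge this angle, which is why the two controls must be adjusted together on the rig.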

Stealthy Network Transfer of Data

Users of computer systems may often require the private transfer of messages/communications between parties across a network. Information warfare, and the protection and dominance of information in the military context, is a prime example of an application area in which the confidentiality of data needs to be maintained. The safe transportation of critical data is therefore often a vital requirement for many private communications. However, unwanted interception/sniffing of communications is also a possibility. The authors therefore propose an elementary stealthy transfer scheme. This scheme makes use of encoding, the splitting of a message, and a hashing algorithm to verify the correctness of the reconstructed message. As a proof of concept, the authors have experimented with sending encoded parts of a message in random order and reconstructing the message, to demonstrate how data can be stealthily transferred across a network so as to prevent its obvious retrieval.
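The split-encode-verify idea can be sketched as below. The base64 encoding, fragment format, and part count are illustrative choices, not the authors' exact scheme:

```python
import base64
import hashlib
import random

def split_message(message, n_parts=4):
    # Keep a SHA-256 digest of the whole message so the receiver can
    # verify the reconstruction.
    digest = hashlib.sha256(message.encode()).hexdigest()
    size = -(-len(message) // n_parts)  # ceiling division
    # Tag each fragment with its index, then encode it.
    parts = [(i, base64.b64encode(message[i * size:(i + 1) * size].encode()).decode())
             for i in range(n_parts)]
    random.shuffle(parts)               # fragments are sent in random order
    return parts, digest

def reassemble(parts, digest):
    ordered = sorted(parts)             # restore order from the index tags
    message = ''.join(base64.b64decode(p).decode() for _, p in ordered)
    if hashlib.sha256(message.encode()).hexdigest() != digest:
        raise ValueError('reconstructed message failed hash check')
    return message

parts, digest = split_message('meet at dawn, grid ref 31U')
recovered = reassemble(parts, digest)
```

Sending the fragments out of order (and, in practice, over separate routes or channels) is what obscures the message from a casual sniffer, while the hash check guarantees the receiver detects any missing or corrupted fragment.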

Determination of the Proper Quality Costs Parameters via Variable Step Size Steepest Descent Algorithm

This paper presents the determination of the quality cost parameters that provide the optimum return. System dynamics simulation was applied. The simulation model was constructed using real data from an electronic devices manufacturer in Thailand. The steepest descent algorithm was employed for the optimisation. The experimental results show that the company should spend 850 and 10 Baht/day on prevention and appraisal activities, respectively. This provides the minimum cumulative total quality cost, which is 258,000 Baht over twelve months. The effect of the step size used when improving the variables towards the optimum was also investigated. It can be stated that a smaller step size provides a better result but requires more experimental runs. However, the difference in yield in this case is not significant in practice. Therefore, the greater step size is recommended, because the region of the optimum can be reached more easily and rapidly.
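A variable-step-size steepest descent of the kind described can be sketched as follows. In the paper the objective is evaluated by the system dynamics simulation; here a toy quadratic with its minimum at the reported optimum (850 and 10 Baht/day, cost 258,000 Baht) stands in for it:

```python
def total_cost(p, a):
    # Hypothetical surrogate cost surface: minimum 258,000 Baht at
    # prevention p = 850 and appraisal a = 10 Baht/day.
    return 258000 + 0.5 * (p - 850) ** 2 + 40 * (a - 10) ** 2

def gradient(f, x, eps=1e-4):
    # Central finite-difference gradient, since the simulation gives no
    # analytic derivatives.
    g = []
    for d in range(len(x)):
        xp, xm = x[:], x[:]
        xp[d] += eps
        xm[d] -= eps
        g.append((f(*xp) - f(*xm)) / (2 * eps))
    return g

def steepest_descent(f, x, step=1.0, shrink=0.5, iters=2000):
    for _ in range(iters):
        g = gradient(f, x)
        trial = [xi - step * gi for xi, gi in zip(x, g)]
        if f(*trial) < f(*x):
            x = trial            # accept the improving move
        else:
            step *= shrink       # halve the step when no improvement
    return x

p, a = steepest_descent(total_cost, [500.0, 50.0])
```

The step is only reduced when a trial move fails to improve the cost, which mirrors the trade-off discussed above: a large initial step moves quickly into the region of the optimum, and the shrinking step then refines the result.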

Scope and Application of Collaborative Tools and Digital Manufacturing in Dentistry

It is necessary to incorporate the technological advances achieved in the field of engineering into dentistry in order to enhance the processes of diagnosis and treatment planning and to enable doctors to render better treatment to their patients. To achieve this ultimate goal, long-distance collaborations are often necessary. This paper discusses various collaborative tools and their applications to solving a few pressing problems confronting dentists. Customization is often the solution to most of these problems, but rapid design, development, and cost-effective manufacturing are difficult to achieve. This problem can be solved using digital manufacturing techniques. Cases from six major branches of dentistry are discussed, and possible solutions with the help of state-of-the-art rapid digital manufacturing are proposed in the present paper. The paper also covers the usage of existing tools in the collaborative and digital manufacturing area.

A Retrospective Analysis of a Professional Learning Community: How Teachers' Capacities Shaped It

The purpose of this paper is to describe the process of setting up a learning community within an elementary school in Ontario, Canada. The description is provided through reflection on, and examination of, field notes taken during the year-long training and implementation process. Specifically, the impact of teachers' capacity on the creation of a learning community was of interest. This paper is intended to inform and add to the debate around the tensions that exist in implementing a bottom-up professional development model, like the learning community, in a top-down organizational structure. My reflections on the process illustrate that implementation of the learning community professional development model may be difficult and yet transformative in the professional lives of the teachers, students, and administrators involved in the change process. I conclude by suggesting the need for a new model of professional development that requires a transformative shift in power dynamics and a shift in the view of what constitutes effective professional learning.