Web Page Watermarking: XML Files Using Synonyms and Acronyms

Advances in the field of computing have led to massive use of web-based electronic documents. Current copyright protection laws are inadequate for proving ownership of electronic documents and do not provide strong safeguards against copying and manipulation of information on the web. This has opened many avenues for securing information, and significant advances have been made in the area of information security. Digital watermarking has developed into a very dynamic area of research and has addressed challenging issues for digital content. Watermarking can be visible (logos or signatures) or invisible (encoding and decoding). Many visible watermarking techniques have been studied for text documents, but there are very few for web-based text. XML files are used to exchange information on the internet and contain important information. In this paper, two invisible watermarking techniques using synonyms and acronyms are proposed for XML files to prove intellectual ownership and achieve security. Different attacks are analyzed, and the capacity that can be embedded in the XML file is measured. A comparative capacity analysis of the two methods is also presented. The system has been implemented in C#, and all tests were carried out in practice to obtain the results.
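
As a rough illustration of the synonym-based technique, the following Python sketch encodes watermark bits by choosing between a word and its synonym inside XML text nodes; extraction reads the bits back from whichever variant appears. The synonym table, bit ordering, and XML handling are assumptions made for illustration, and the paper's actual scheme (implemented in C#) may differ.

```python
# Minimal sketch of synonym-substitution watermarking in XML text nodes.
# The synonym pairs and encoding convention are illustrative assumptions.
import xml.etree.ElementTree as ET

SYNONYMS = {"big": "large", "quick": "fast", "buy": "purchase"}  # hypothetical pairs
REVERSE = {v: k for k, v in SYNONYMS.items()}

def embed(xml_text: str, bits: str) -> str:
    """Encode one bit per substitutable word: 0 keeps the base word, 1 uses its synonym."""
    root = ET.fromstring(xml_text)
    bit_iter = iter(bits)
    for elem in root.iter():
        if not elem.text:
            continue
        words = elem.text.split()
        for i, word in enumerate(words):
            base = word.lower()
            if base in SYNONYMS:
                bit = next(bit_iter, None)
                if bit is None:
                    break
                words[i] = SYNONYMS[base] if bit == "1" else base
        elem.text = " ".join(words)
    return ET.tostring(root, encoding="unicode")

def extract(xml_text: str) -> str:
    """Recover the bit string by checking which variant of each word pair appears."""
    bits = []
    for elem in ET.fromstring(xml_text).iter():
        for word in (elem.text or "").lower().split():
            if word in SYNONYMS:
                bits.append("0")
            elif word in REVERSE:
                bits.append("1")
    return "".join(bits)

marked = embed("<order><note>please buy the big quick item</note></order>", "101")
print(marked, extract(marked))
```

An acronym-based variant would work analogously, substituting between a phrase and its acronym to carry each bit.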

Italians' Social and Emotional Loneliness: The Results of Five Studies

Subjective loneliness describes people who feel a disagreeable or unacceptable lack of meaningful social relationships, at both the quantitative and the qualitative level. The studies presented here tested an Italian 18-item self-report loneliness measure that included items adapted from previously developed scales, namely a short version of the UCLA Loneliness Scale (Russell, Peplau and Cutrona, 1980) and the 11-item Loneliness Scale by De Jong Gierveld and Kamphuis (JGLS; 1985). The studies aimed at testing the developed scale and at verifying whether loneliness is better conceptualized as a unidimensional construct (so-called 'general loneliness') or as a bidimensional construct comprising the distinct facets of social and emotional loneliness. The loneliness questionnaire included two single-item criterion measures of sad mood and social contact, and asked participants to supply information on a number of socio-demographic variables. Factorial analyses of responses obtained in two preliminary studies, with 59 and 143 Italian participants respectively, showed good factor loadings and subscale reliability and confirmed that perceived loneliness clearly has two components, a social one and an emotional one, the latter measured by two subscales: a 7-item 'general' loneliness subscale derived from the UCLA scale and a 6-item 'emotional' scale included in the JGLS. Results further showed that type and amount of loneliness are related negatively to frequency of social contacts and positively to sad mood. In a third study, data were obtained from a nationwide sample of 9,097 Italian subjects, aged 12 to about 70, who completed the test online on the Italian website of a large-audience magazine, Focus. The results again confirmed the reliability of the component subscales, namely social, emotional, and 'general' loneliness, and showed that they were highly correlated with each other, especially the latter two. Loneliness scores were significantly predicted by sex, age, education level, sad mood, and social contact, and, less so, by other variables such as geographical area and profession. The scale's validity was confirmed by the results of a fourth study with elderly men and women (N = 105) living at home or in residential care units. The three subscales were significantly related, among others, to depression and to various measures of the extent of, and satisfaction with, social contacts with relatives and friends. Finally, a fifth study with 315 career-starters showed that social and emotional loneliness correlate with life satisfaction and with measures of emotional intelligence. Altogether, the results showed good validity and reliability of the entire scale and of its components in the tested samples.

Factorial Structure and Psychometric Validation of Ecotourism Experiential Value Construct: Insights from Taman Negara National Park, Malaysia

The purpose of this research is to disentangle and validate the underlying factorial structure of the Ecotourism Experiential Value (EEV) measurement scale and subsequently investigate its psychometric properties. The analysis was based on a sample of 225 eco-tourists, collected in the vicinity of Taman Negara National Park (TNNP) via an interviewer-administered questionnaire. Exploratory factor analysis (EFA) was performed to determine the factorial structure of EEV. Subsequently, to confirm and validate the factorial structure and assess the psychometric properties of EEV, confirmatory factor analysis (CFA) was executed. In addition, to establish the nomological validity of EEV, a structural model was developed to examine the effect of EEV on Total Eco-tourist Experience Quality (TEEQ). The results reveal that EEV is a second-order construct with a six-factor structure and that its scale adequately meets the psychometric criteria, thus permitting confident interpretation of the results. The findings have important implications for future research directions and for the management of ecotourism destinations.

Corporate Fraud: An Analysis of Malaysian Securities Commission Enforcement Releases

Economic crime (i.e. corporate fraud) has a significant impact on business. This study analyzes the fraud cases reported by the Malaysian Securities Commission. Frauds involving market manipulation and/or illegal share trading are the most common types of fraud reported over the six years analyzed. The highest number of frauds reported involved investment and fund holding companies. Alarmingly, the results indicate that quite a high number of fraud cases are committed by management. The higher number of Chinese perpetrators may be due to the fact that they are the dominant group in Malaysian business. The results also show that more than half of the companies involved in fraud are privately held companies in the investment/fund/finance sector. The results of this study highlight the general characteristics of perpetrators (individuals and companies) that commit fraud, which could help regulators in their monitoring and enforcement activities. For investors, this would help in analyzing their business investment or portfolio risk.

Into the Bank Lending Channel of SEE: Greek Banks' Buffering Effects

This paper tries to shed light on the existence of a bank lending channel (BLC) in South Eastern European (SEE) countries. Based on a VAR framework, we test the responsiveness of credit supply to monetary policy shocks. By compiling a new data set and using the reserve requirement ratio, among other instruments, as the policy instrument, we measure the effectiveness of the BLC and the buffering effect of the banks in the SEE countries. The results indicate that loan supply is significantly affected by shifts in monetary policy once demand factors are controlled for. Furthermore, by analyzing the effect of Greek banks in the region, we conclude that Greek banks do buffer the negative effects of monetary policy transmission. Since they hold a significant share of the SEE banking markets, we argue that Greek banks positively influence the economic growth of SEE countries.
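
As a hedged illustration of such a VAR exercise, the sketch below fits a small VAR on a policy instrument, a demand control, and credit growth, and then inspects the impulse response of credit to a policy shock. The variable names, data file, and lag choice are assumptions made for illustration and do not reflect the paper's data set or specification.

```python
# Minimal VAR/impulse-response sketch (statsmodels); data and variables are hypothetical.
import pandas as pd
from statsmodels.tsa.api import VAR

# Hypothetical quarterly series for one SEE country: policy instrument,
# demand control, and bank credit growth.
df = pd.read_csv("see_country.csv", parse_dates=["date"], index_col="date")
data = df[["reserve_req", "gdp_growth", "credit_growth"]]

model = VAR(data)
results = model.fit(maxlags=4, ic="aic")   # lag order selected by AIC
irf = results.irf(periods=12)              # impulse responses over 12 quarters

# Orthogonalized response of credit growth to a shock in the reserve requirement ratio
resp = data.columns.get_loc("credit_growth")
shock = data.columns.get_loc("reserve_req")
print(irf.orth_irfs[:, resp, shock])
```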

An Efficient Obstacle Detection Algorithm Using Colour and Texture

This paper presents a new classification algorithm using colour and texture for obstacle detection. Colour information is computationally cheap to learn and process. However, in many cases colour alone does not provide enough information for classification. Texture information can improve classification performance but usually comes at a high computational cost. Our algorithm uses both colour and texture features, but texture is only needed when colour is unreliable. During the training stage, texture features are learned specifically to improve the performance of a colour classifier. The algorithm learns a set of simple texture features, and only the most effective features are used in the classification stage. Therefore, our algorithm achieves a very good classification rate while still being fast enough to run on a limited computing platform. The proposed algorithm was tested on a challenging outdoor image set. Test results show that the algorithm achieves a much better trade-off between classification performance and efficiency than a typical colour classifier.
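
The colour-first, texture-on-demand idea can be pictured with the short Python sketch below: a cheap colour classifier is consulted first, and texture features are computed and combined only when the colour decision is not confident. The feature functions, the confidence threshold, and the assumption of scikit-learn-style classifiers exposing predict_proba are illustrative, not the paper's actual models.

```python
# Minimal sketch of a colour classifier with a texture fallback.
# colour_clf and texture_clf are assumed pre-trained classifiers with predict_proba.
import numpy as np

def colour_features(patch: np.ndarray) -> np.ndarray:
    """Mean RGB of an HxWx3 patch: a cheap colour descriptor."""
    return patch.reshape(-1, 3).mean(axis=0)

def texture_features(patch: np.ndarray) -> np.ndarray:
    """Gradient-energy descriptor as a stand-in for learned texture features."""
    gray = patch.mean(axis=2)
    gx, gy = np.gradient(gray)
    return np.array([np.mean(gx ** 2), np.mean(gy ** 2)])

def classify_patch(patch, colour_clf, texture_clf, threshold=0.8):
    """Use colour alone when it is confident; otherwise bring in texture."""
    p_colour = colour_clf.predict_proba([colour_features(patch)])[0]
    if p_colour.max() >= threshold:              # colour is reliable here
        return int(p_colour.argmax())
    p_texture = texture_clf.predict_proba([texture_features(patch)])[0]
    return int((p_colour * p_texture).argmax())  # combine both opinions
```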

Enhanced Ant Colony Based Algorithm for Routing in Mobile Ad Hoc Network

A mobile ad hoc network consists of a set of mobile nodes. It is a dynamic network with no fixed topology. The network has no infrastructure or central administration, hence it is called an infrastructure-less network. Because the topology changes, the route from source to destination is not fixed and varies over time. The nature of the network requires the algorithm to perform route discovery, maintain routes, and detect failures along the path between two nodes [1]. This paper presents enhancements of ARA [2] to improve the performance of the routing algorithm. ARA [2] finds routes between nodes in a mobile ad hoc network. It is an on-demand, source-initiated routing algorithm based on the principles of swarm intelligence. The algorithm is adaptive, scalable, and favors load balancing. The improvements suggested in this paper are the handling of lost ants and resource reservation.
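
To make the swarm-intelligence idea concrete, the sketch below shows probabilistic next-hop selection and pheromone reinforcement/evaporation in the spirit of ARA. The parameter values and update rules are assumptions for illustration; they do not reproduce the paper's enhancements for lost ants or resource reservation.

```python
# Minimal sketch of ant-colony route selection and pheromone maintenance.
import random

class AntRoutingTable:
    def __init__(self, evaporation=0.1, reinforcement=1.0):
        self.pheromone = {}                  # (destination, next_hop) -> pheromone level
        self.evaporation = evaporation
        self.reinforcement = reinforcement

    def choose_next_hop(self, destination, neighbours):
        """Pick a neighbour with probability proportional to its pheromone level."""
        weights = [self.pheromone.get((destination, n), 1e-6) for n in neighbours]
        return random.choices(neighbours, weights=weights, k=1)[0]

    def reinforce(self, destination, next_hop, hop_count):
        """Backward ant: strengthen the link, rewarding shorter paths more."""
        key = (destination, next_hop)
        self.pheromone[key] = self.pheromone.get(key, 0.0) + self.reinforcement / hop_count

    def evaporate(self):
        """Periodic decay so that stale routes fade after topology changes."""
        for key in list(self.pheromone):
            self.pheromone[key] *= 1.0 - self.evaporation
```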

Location Management in Cellular Networks

Cellular networks provide voice and data services to users with mobility. To deliver services to mobile users, the cellular network must be able to track the locations of users and to allow user movement during conversations. These capabilities are achieved by location management. Location management in mobile communication systems is concerned with the network functions necessary to allow users to be reached wherever they are in the network coverage area. In a cellular network, the service coverage area is divided into smaller areas of hexagonal shape, referred to as cells. The cellular concept was introduced to reuse radio frequencies. Continued expansion of cellular networks, coupled with an increasingly restricted mobile spectrum, has made the reduction of communication overhead a highly important issue. Much of this traffic is used to determine the precise location of individual users when relaying calls, and the field of location management aims to reduce this overhead through prediction of user location. This paper describes and compares various location management schemes in cellular networks.

Analysis of Production Loss on a Linear Walking Worker Line

This paper mathematically analyses the varying magnitude of production loss that may occur due to idle time (in-process waiting time and traveling time) on a linear walking worker assembly line. Within this flexible and reconfigurable assembly system, each worker travels down the line carrying out every assembly task at each station; each worker accomplishes the assembly of a unit from start to finish and then travels back to the first station to start the assembly of a new product. This system design strategy attempts to combine the flexibility of the U-shaped moving worker assembly cell with the efficiency of the conventional fixed worker assembly line. The paper aims to evaluate the effect of the idle time that may offset the labor efficiency of each walking worker, providing insight into the mechanism of such a flexible and reconfigurable assembly system.
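
As a rough, hedged illustration of the idle-time accounting involved, the sketch below computes the fraction of a walking worker's tour lost to waiting and walking under a simple assumed model (waiting approximated by the gap to the slowest station); the paper's actual mathematical analysis is more detailed.

```python
# Illustrative production-loss estimate for a walking worker line.
# The decomposition and the waiting-time approximation are assumptions.
def production_loss(cycle_times, walk_time_per_station):
    """Fraction of a worker's tour lost to idle time (waiting plus walking)."""
    bottleneck = max(cycle_times)
    waiting = sum(bottleneck - t for t in cycle_times)   # in-process waiting time
    walking = walk_time_per_station * len(cycle_times)   # traveling time between stations
    busy = sum(cycle_times)
    return (waiting + walking) / (busy + waiting + walking)

print(f"loss = {production_loss([50, 55, 60, 52], walk_time_per_station=5):.1%}")
```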

Optimizing Materials Cost and Mechanical Properties of PVC Electrical Cable's Insulation by Using Mixture Experimental Design Approach

With the development of polyvinyl chloride (PVC) products for many applications, investigating the raw-material composition and reducing the cost have both become more and more important. Considerable research has been done investigating the effect of additives on PVC products. Most PVC composites research investigates the effect of only one or a few factors at a time. This isolated consideration of the input factors does not take into account the interaction effects between the different factors. This paper implements a mixture experimental design approach to find a cost-effective PVC composition for the production of electrical-insulation cables, considering ASTM Designation D 6096. The results analysis showed that minimum cost can be achieved using 20% virgin PVC, 18.75% recycled PVC, 43.75% CaCO3 with particle size 10 microns, 14% DOP plasticizer, and 3.5% CPW plasticizer. For maximum UTS, the compound should consist of 17.5% DOP, 62.5% virgin PVC, and 20.0% CaCO3 of particle size 5 microns. Finally, for the highest ductility, the compound should be made of 35% virgin PVC, 20% CaCO3 of particle size 5 microns, and 45.0% DOP plasticizer.

From Experiments to Numerical Modeling: A Tool for Teaching Heat Transfer in Mechanical Engineering

In this work, the numerical simulation of transient heat transfer in a cylindrical probe is carried out. An experiment was conducted by introducing a steel cylinder into a heating chamber and recording its surface temperature over one hour. In parallel, a mathematical model of one-dimensional transient heat transfer in cylindrical coordinates was solved, considering the boundary conditions of the test. The model was solved using the finite difference method, because the thermal conductivity of the cylindrical steel bar and the convection heat transfer coefficient used in the model are treated as temperature-dependent functions, and both conditions prevent the use of an analytical solution. The comparison between theoretical and experimental results showed that the average deviation is below 2%. It was concluded that numerical methods are useful for solving complex engineering problems. For constant k and h, the experimental methodology used here can serve as a tool for teaching heat transfer in mechanical engineering, using simplified mathematical models with analytical solutions.
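
For readers who want to reproduce the flavour of the numerical model, the sketch below solves one-dimensional transient conduction in a cylinder with an explicit finite-difference scheme, a symmetry condition at the axis, and a convective surface, using temperature-dependent k(T) and h(T). All material properties, dimensions, and the forms of k(T) and h(T) are assumptions for illustration, not the paper's experimental values.

```python
# Explicit finite-difference sketch of 1-D radial transient conduction in a cylinder.
# Geometry, properties, k(T) and h(T) are illustrative assumptions.
import numpy as np

R, N = 0.025, 25                  # cylinder radius [m] and number of radial nodes
dr = R / (N - 1)
r = np.linspace(0.0, R, N)
rho, cp = 7850.0, 490.0           # assumed steel density [kg/m3] and specific heat [J/kg K]
T_inf = 900.0                     # assumed heating-chamber temperature [C]

def k(T): return 54.0 - 0.03 * T  # assumed temperature-dependent conductivity [W/m K]
def h(T): return 25.0 + 0.05 * T  # assumed temperature-dependent convection coeff. [W/m2 K]

T = np.full(N, 25.0)              # initial temperature [C]
dt = 0.02                         # explicit scheme: keep dt below dr^2 / (2*alpha)
for _ in range(int(3600 / dt)):   # simulate one hour of heating
    Tn = T.copy()
    k_p = k(0.5 * (Tn[1:-1] + Tn[2:]))    # conductivity at the i+1/2 faces
    k_m = k(0.5 * (Tn[1:-1] + Tn[:-2]))   # conductivity at the i-1/2 faces
    flux_p = k_p * (r[1:-1] + dr / 2) * (Tn[2:] - Tn[1:-1]) / dr
    flux_m = k_m * (r[1:-1] - dr / 2) * (Tn[1:-1] - Tn[:-2]) / dr
    T[1:-1] = Tn[1:-1] + dt * (flux_p - flux_m) / (rho * cp * r[1:-1] * dr)
    T[0] = T[1]                                            # symmetry at the axis
    T[-1] = T[-2] + dr * h(Tn[-1]) * (T_inf - Tn[-1]) / k(Tn[-1])  # convective surface

print(f"surface temperature after 1 h: {T[-1]:.1f} C")
```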

The Effects of Misspecification of Stochastic Processes on Investment Appraisal

For decades, financial economists have attempted to determine the optimal investment policy by recognizing the option value embedded in irreversible investment whose project value evolves as a geometric Brownian motion (GBM). This paper aims to examine the effects of the optimal investment trigger and of the misspecification of stochastic processes on investment in real options applications. Specifically, the former explores the consequence of adopting optimal investment rules on the distributions of corporate value under the correct assumption about the stochastic process, while the latter analyzes the influence on the distributions of corporate value of misspecifying the stochastic process, i.e., mistaking an alternative process for a GBM. It is found that adopting the correct optimal investment policy may increase corporate value by shifting the value distribution rightward, while the misspecification effect may decrease corporate value by shifting the value distribution leftward. The adoption of the optimal investment trigger has a major impact on investment to such an extent that the downside risk of investment is truncated at a project value of zero, thereby moving the value distributions rightward. The analytical framework is also extended to situations where collection lags are in place, and the results indicate that collection lags reduce the effects of the investment trigger and of misspecification on investment in opposite ways.
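
The effect of an investment trigger on the value distribution can be illustrated by Monte Carlo simulation of a GBM project value: invest the first time the value crosses the trigger, discount the payoff, and look at the resulting distribution. The parameters and the trigger level below are assumptions for illustration, not the paper's calibration.

```python
# Monte Carlo sketch: GBM project value with an investment trigger (parameters assumed).
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, dt, T = 0.03, 0.20, 1 / 50, 10     # GBM drift, volatility, time step, horizon [years]
r, I = 0.05, 1.0                              # discount rate and sunk investment cost
V0, V_star = 1.0, 1.6                         # initial project value and investment trigger

def simulate_value(n_paths=5_000):
    """Discounted payoff (V - I) at the first time each path hits the trigger; 0 if never."""
    v = np.full(n_paths, V0)
    payoff = np.zeros(n_paths)
    alive = np.ones(n_paths, dtype=bool)      # paths that have not yet invested
    for step in range(int(T / dt)):
        z = rng.standard_normal(n_paths)
        growth = np.exp((mu - 0.5 * sigma ** 2) * dt + sigma * np.sqrt(dt) * z)
        v = np.where(alive, v * growth, v)
        hit = alive & (v >= V_star)           # trigger crossed: invest now
        payoff[hit] = np.exp(-r * step * dt) * (v[hit] - I)
        alive &= ~hit
    return payoff

values = simulate_value()
print(f"mean value {values.mean():.3f}, never invested {np.mean(values == 0):.1%}")
```

Replacing the GBM step with an alternative process while keeping the GBM-derived trigger would give a rough picture of the misspecification effect discussed above.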

PZ: A Z-based Formalism for Modeling Probabilistic Behavior

Probabilistic techniques are becoming more and more widely used in computer programs. Therefore, there is great interest in the formal specification, verification, and development of probabilistic programs. In our work-in-progress project, we are attempting to build a constructive framework for developing probabilistic programs formally. The main contribution of this paper is to introduce an intermediate artifact of our work, a Z-based formalism called PZ, by which one can build set-theoretical models of probabilistic programs. We propose to use a constructive set theory, called CZ set theory, to interpret the specifications written in PZ. Since CZ has an interpretation in Martin-Löf's theory of types, this idea enables us to derive probabilistic programs from correctness proofs of their PZ specifications.

The Self-Energy of an Electron Bound in a Coulomb Field

Recent progress in the calculation of the one-loop self-energy of an electron bound in a Coulomb field is summarized. The relativistic multipole expansion is introduced. This expansion is based on a single assumption: except for the part of the time component of the electron four-momentum corresponding to the electron rest mass, the exchange of four-momentum between the virtual electron and photon can be treated perturbatively. For non-S states and the normalized difference n^3 E_n - E_1 of the S-states, this by itself yields very accurate results after taking the method to the third order. For the ground state, the perturbative treatment of the electron virtual states with very high three-momentum is to be avoided. For these states one can always rearrange the pertinent expression in such a way that the free-particle approximation is allowed. The combination of the relativistic multipole expansion and the free-particle approximation yields a very accurate result after taking the method to the ninth order. These results are in very good agreement with previous results obtained by the partial wave expansion and definitely exclude the possibility that the uncertainty in the determination of the proton radius comes from the uncertainty in the calculation of the one-loop self-energy.

Radio Frequency Identification Technology Applied in High-Voltage Power Transmission Lines for Sag Measurement

High-voltage power transmission lines are the backbone of electrical power utilities. The stability and continuous monitoring of this critical infrastructure is pivotal. Nine-Sigma, representing Eskom Holding SOC Limited, South Africa, faces a major problem with the proactive detection of fallen power lines and the real-time measurement of sag, together with slipping of such conductors. The main objective of this research is to apply RFID technology innovatively to solve this challenge. Various options and technologies, such as GPS, PLC, image processing, and MR sensors, have been reviewed and their drawbacks identified. The potential of RFID to provide precise measurement is examined and presented. Future research will look at magnetic and electrical interference as well as the corona effect on the technology.

Coordinated Q–V Controller for Multi-machine Steam Power Plant: Design and Validation

This paper discusses coordinated reactive power-voltage (Q-V) control in a multi-machine steam power plant. The drawbacks of manual Q-V control are briefly listed, and the design requirements for a coordinated Q-V controller are specified. The theoretical background and mathematical model of the new controller are presented next, followed by validation of the developed Matlab/Simulink model through comparison with responses recorded in a real steam power plant, and by a description of the practical realisation of the controller. Finally, the performance of the commissioned controller is illustrated with several examples of coordinated Q-V control in a real steam power plant and compared with manual control.

Determination of the Proper Quality Costs Parameters via Variable Step Size Steepest Descent Algorithm

This paper presents the determination of the proper quality cost parameters that provide the optimum return. A system dynamics simulation was applied. The simulation model was constructed from real data from the case of an electronic devices manufacturer in Thailand. The steepest descent algorithm was employed for the optimisation. The experimental results show that the company should spend 850 and 10 Baht/day on prevention and appraisal activities, respectively. This provides the minimum cumulative total quality cost, which is 258,000 Baht over twelve months. The effect of the step size used when moving the variables towards the optimum was also investigated. A smaller step size provided a better result but required more experimental runs; however, the difference in yield in this case is not significant in practice. Therefore, the larger step size is recommended, because the region of the optimum can be reached more easily and rapidly.
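
The variable-step-size idea can be sketched as follows: follow the negative numerical gradient of the cost, keep the step while it improves the objective, and shrink it when it overshoots. The cost function below is a made-up stand-in for the system dynamics simulation, and the parameter values are assumptions for illustration.

```python
# Variable-step-size steepest descent over two decision variables
# (daily prevention and appraisal spending); the cost model is hypothetical.
import numpy as np

def total_quality_cost(x):
    """Hypothetical yearly quality cost: spending plus failure costs that fall as spending rises."""
    prevention, appraisal = x
    failure = 500_000 / (1 + 0.004 * prevention + 0.02 * appraisal)
    return 365 * (prevention + appraisal) + failure

def steepest_descent(f, x0, step=50.0, shrink=0.5, tol=1e-3, max_iter=1_000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        eps = 1e-3
        grad = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                         for e in np.eye(len(x))])           # central-difference gradient
        norm = np.linalg.norm(grad)
        if norm < 1e-12:
            break
        candidate = np.maximum(x - step * grad / norm, 0.0)  # spending cannot go negative
        if f(candidate) < f(x):
            x = candidate            # keep the step size while it still helps
        else:
            step *= shrink           # variable step size: shrink when overshooting
            if step < tol:
                break
    return x

best = steepest_descent(total_quality_cost, [100.0, 100.0])
print(best, round(total_quality_cost(best)))
```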

Word Reordering Based on a Statistical Language Model

There are multiple reasons to expect that detecting word order errors in a text will be a difficult problem, and the detection rates reported in the literature are in fact low. Although grammatical rules constructed by computational linguists improve the performance of grammar checkers in word order diagnosis, the repair task is still very difficult. This paper presents an approach to repairing word order errors in English text by reordering the words in a sentence and choosing the version that maximizes the number of trigram hits according to a language model. The novelty of this method lies in the use of an efficient confusion-matrix technique for reordering the words. The comparative advantage of this method is that it works with a large set of words and avoids the laborious and costly process of collecting word order errors to create error patterns.
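
The scoring step can be illustrated with a few lines of Python: generate candidate orderings of a sentence and keep the one with the most trigram hits. The toy trigram set and the exhaustive permutation search are simplifications for illustration; the paper restricts the candidate set with a confusion-matrix technique rather than enumerating all permutations.

```python
# Toy sketch of choosing a word order by maximizing trigram hits (language model is made up).
from itertools import permutations

TRIGRAMS = {("the", "cat", "sat"), ("cat", "sat", "on"),
            ("sat", "on", "the"), ("on", "the", "mat")}   # stand-in language model

def trigram_hits(words):
    """Count consecutive word triples that appear in the language model."""
    return sum((words[i], words[i + 1], words[i + 2]) in TRIGRAMS
               for i in range(len(words) - 2))

def best_reordering(words):
    """Return the permutation of the sentence with the most trigram hits."""
    return max(permutations(words), key=trigram_hits)

print(best_reordering(["cat", "the", "sat", "the", "on", "mat"]))
```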

Power System with PSS and FACTS Controller: Modelling, Simulation and Simultaneous Tuning Employing Genetic Algorithm

This paper presents a systematic procedure for modelling and simulation of a power system equipped with a power system stabilizer (PSS) and a flexible AC transmission system (FACTS)-based controller. For the design, the model of an example power system, a single-machine infinite-bus power system equipped with the proposed controllers, is developed in MATLAB/SIMULINK. In the developed model, the synchronous generator is represented by Model 1.1, which includes both the generator main field winding and the damper winding on the q-axis, so as to evaluate the impact of the PSS and the FACTS-based controller on power system stability. The model can be used for teaching power system stability phenomena, and also for research work, especially for developing generator controllers using advanced technologies. Further, to avoid adverse interactions, the PSS and the FACTS-based controller are designed simultaneously employing a genetic algorithm (GA). Non-linear simulation results are presented for the example power system under various disturbance conditions to validate the effectiveness of the proposed modelling and simultaneous design approach.

The Wijma Delivery Expectancy/Experience Questionnaire (W-DEQ) with a Turkish Sample: Confirmatory and Exploratory Factor Analysis

The purpose of this study is to investigate whether the factor structures of the W-DEQ, originally developed on UK and Swedish women, are confirmed in Turkish samples, and to obtain a new modified factor structure appropriate to Turkish culture. Statistical analyses of the data obtained were performed using SPSS for Windows version 13.0 and SAS Version 9.1. Both confirmatory and exploratory factor analyses of the W-DEQ were performed in the study. Factor analysis yielded four factors related to hope, fear, lack of positive anticipation, and riskiness. The alpha estimates of the total W-DEQ score were somewhat higher, being 0.92 for the parous and 0.90 for the nulliparous sample. These are well above the accepted limit of 0.70 and indicate excellent levels of internal reliability, thus showing that the questions were appropriate to Turkish culture and that the scale is useful for the evaluation of fear of childbirth in Turkish pregnant women.