Ontology Population via NLP Techniques in Risk Management

In this paper we propose an NLP-based method for Ontology Population from texts and apply it to semi-automatically instantiate a Generic Knowledge Base (Generic Domain Ontology) in the risk management domain. The approach is semi-automatic and relies on domain-expert intervention for validation. It is built on a set of Instance Recognition Rules based on syntactic structures, and on the predicative power of verbs in the instantiation process. Since it relies chiefly on linguistic knowledge, it is not domain dependent. A description of an experiment performed on part of the ontology of the PRIMA project (supported by the European Community) is given. A first validation of the method is done by populating this ontology with Chemical Fact Sheets from the Environmental Protection Agency. The results of this experiment conclude the paper and support the hypothesis that exploiting the predicative power of verbs in the instantiation process improves performance.
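As an illustration of the kind of syntax-driven rule the method relies on, the sketch below implements a single, hypothetical Instance Recognition Rule: a predicative verb ("causes") signals which ontology concepts its subject and object instantiate. The rule format, the regex, and the concept names (Hazard, Consequence) are illustrative assumptions, not the PRIMA ontology's actual rules.

```python
import re

# Hypothetical verb-driven Instance Recognition Rules: each pattern maps
# the subject and object of a predicative verb to ontology concepts.
# (Rule and concept names are illustrative, not taken from PRIMA.)
RULES = [
    # "X causes Y" -> X instantiates Hazard, Y instantiates Consequence
    (re.compile(r"(\w[\w ]*?) causes (\w[\w ]*)", re.IGNORECASE),
     ("Hazard", "Consequence")),
]

def extract_instances(sentence):
    """Return (concept, instance) pairs found by the recognition rules."""
    pairs = []
    for pattern, (subj_concept, obj_concept) in RULES:
        for m in pattern.finditer(sentence):
            pairs.append((subj_concept, m.group(1).strip()))
            pairs.append((obj_concept, m.group(2).strip()))
    return pairs

print(extract_instances("Chlorine exposure causes respiratory irritation"))
```

In the actual method such candidate instances would then be shown to the domain expert for validation before being added to the knowledge base.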

Classification Algorithms in Human Activity Recognition using Smartphones

Rapid advances in computing technology promise to integrate computers and humans seamlessly in the future. The emergence of the smartphone has driven the computing era towards ubiquitous and pervasive computing. Recognizing human activity has garnered much interest and has raised significant research concerns in identifying contextual information useful for human activity recognition. Besides being unobtrusive to users in daily life, smartphones have built-in sensors capable of sensing the contextual information of their users, supported by a wide range of network connections. In this paper, we discuss the classification algorithms used in smartphone-based human activity recognition. Existing technologies pertaining to smartphone-based research in human activity recognition are highlighted and discussed. The paper also presents our findings and opinions to formulate ideas for improving current research trends. Understanding research trends will give researchers a clearer research direction and a common vision of the latest smartphone-based human activity recognition area.

Evaluating Content Based Image Retrieval Techniques with the One Million Images CLIC Test Bed

Pattern recognition and image recognition methods are commonly developed and tested using testbeds, which contain known responses to a query set. Until now, testbeds available for image analysis and content-based image retrieval (CBIR) have been scarce and small-scale. Here we present the one-million-image CEA-List Image Collection (CLIC) testbed that we have produced, and report on our use of this testbed to evaluate image analysis merging techniques. This testbed will soon be made publicly available through the EU MUSCLE Network of Excellence.

Revealing Casein Micelle Dispersion under Various Ranges of NaCl: Evolution of Particle Size and Structure

Dispersions of casein micelles (CM) were studied at a constant protein concentration of 5 wt% in NaCl environments ranging from 0% to 12% by dynamic light scattering (DLS) and Fourier-transform infrared (FTIR) spectroscopy. The rehydration profiles obtained were interpreted in terms of wetting, swelling and dispersion stages using a turbidity method. Two behaviours were observed depending on the salt concentration. The first (low salt concentration) presents a typical rehydration profile, with a significant change between 3 and 6% NaCl indicating quick wetting, swelling and a long dispersion stage. By contrast, the dispersion stage of the second behaviour (high salt concentration) was significantly shortened, indicating a strong modification of the protein backbone. Increasing the salt concentration destabilizes the micelle and forms more or less aggregated mini-micelles, with an average micelle size ranging from 100 to 200 nm. For the first time, estimates of the secondary structural elements (irregular, β-sheet, α-helix and turn) from the Amide III assignments were correlated with results from Amide I.

WPRiMA Tool: Managing Risks in Web Projects

Risk management is an essential part of project management and plays a significant role in project success. Many failures associated with Web projects are the consequences of poor awareness of the risks involved and the lack of process models that can serve as guidelines for the development of Web-based applications. To circumvent this problem, contemporary process models have been devised for the development of conventional software. This paper introduces WPRiMA (Web Project Risk Management Assessment), a tool that implements RIAP, the risk identification architecture pattern model, which focuses on data from the proprietor's and vendor's perspectives. The paper also illustrates how the WPRiMA tool works and how it can be used to calculate the risk level of a given Web project, to generate recommendations that facilitate risk avoidance, and to improve the prospects of early risk management.

Analyzing the Selection of Promotion Activities and Destination Attributes in the Tourism Industry in Vietnam - From the Perspective of the Tourism Industrial Service Network (TISN)

In order to explore the relationships among promotion activities, destination attributes and the destination image of Vietnam, and to find possible solutions, this study uses the decision system analysis (DSA) method to develop flowcharts based on three rounds of expert interviews. The interviews were conducted with experts confirmed to participate directly in, or influence, the decision making that drives the promotion of Vietnam tourism. This study identifies three models and describes specific decisions on promotion activities, destination attributes and destination images. It finally derives a general model for promoting the Tourism Industrial Service Network (TISN) in Vietnam. The study finds that coordinating all sectors and industries of tourism to create favorable conditions, and improving destination attributes in conjunction with efficient promotion activities, is highly recommended in order to satisfy visitors and improve the destination image.

Determinants of Capital Structure in the Malaysian Electrical and Electronic Sector

Capital structure is one of the most important financial decisions in corporate financing strategy. It involves the choice of debt and equity levels in financing a company's operations. This study investigates whether the capital structure choices of Malaysian electrical and electronic manufacturing companies listed on Bursa Malaysia can be explained by the factors that most studies have found to be dominant determinants of capital structure (company size, profitability, asset tangibility, liquidity and growth). Using the debt ratio as the proxy for capital structure and applying pooled ordinary least squares multiple regression, the results showed that, on average, Malaysian electrical and electronic manufacturing companies used less debt in funding their business operations. The findings also showed that size and asset tangibility have a significant positive relationship with debt level, while liquidity has a significant negative relationship with leverage.
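The pooled OLS estimation the study applies can be sketched as follows. The firm-year data below are synthetic, and the coefficient signs are chosen merely to mirror the reported findings (positive for size and tangibility, negative for liquidity); this is not the Bursa Malaysia sample.

```python
import numpy as np

# Pooled OLS sketch: stack all firm-year observations and regress the
# debt ratio on firm characteristics. Data are synthetic stand-ins.
rng = np.random.default_rng(0)
n = 500
size = rng.normal(10, 2, n)          # log total assets
tangibility = rng.uniform(0, 1, n)   # fixed assets / total assets
liquidity = rng.uniform(0.5, 3, n)   # current ratio
# Assumed true relationship mirrors the reported signs.
debt = (0.05 + 0.02 * size + 0.15 * tangibility - 0.04 * liquidity
        + rng.normal(0, 0.02, n))

X = np.column_stack([np.ones(n), size, tangibility, liquidity])
beta, *_ = np.linalg.lstsq(X, debt, rcond=None)
print(beta)  # intercept, size, tangibility, liquidity coefficients
```

A production analysis would add profitability and growth regressors and report standard errors, which plain `lstsq` does not provide.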

Terminal Velocity of a Bubble Rise in a Liquid Column

As is well known, buoyancy and drag forces govern a bubble's rise velocity in a liquid column. These forces depend strongly on the fluid properties and gravity, as well as on the bubble's equivalent diameter. This study reports a set of bubble rise velocity experiments in a liquid column using water or glycerol, in which several records of terminal velocity were obtained. The results show that a bubble's terminal rise velocity depends strongly on dynamic viscosity. The data set covers terminal velocities of 8.0-32.9 cm/s, with Reynolds numbers ranging from 1.3 to 7490. The bubble's movement was recorded with a video camera. The main goal is to present an original data set, discussed on the basis of two-phase flow theory. The prediction of the terminal velocity of a single bubble in liquid, as well as the range of its applicability, is also discussed. In conclusion, this study presents general expressions for determining the terminal velocity of isolated gas bubbles over a range of Reynolds numbers, when the fluid properties are known.
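For the low-Reynolds-number end of the reported range, the classical Stokes balance of buoyancy against drag gives a closed-form terminal velocity, U_t = g d^2 (rho_l - rho_g) / (18 mu). The sketch below evaluates it for an air bubble in glycerol; the property values are typical textbook figures, not the paper's measurements.

```python
# Stokes-regime terminal velocity: buoyancy = Stokes drag for a small
# sphere (valid only when the resulting Reynolds number is << 1).
g = 9.81                      # gravity, m/s^2
d = 0.005                     # bubble equivalent diameter, m
rho_l, rho_g = 1260.0, 1.2    # glycerol and air densities, kg/m^3
mu = 1.41                     # dynamic viscosity of glycerol, Pa.s

u_t = g * d**2 * (rho_l - rho_g) / (18 * mu)   # terminal velocity, m/s
re = rho_l * u_t * d / mu                      # Reynolds number check
print(u_t, re)
```

The computed Reynolds number confirms the Stokes assumption holds for this case; the higher-Re part of the experimental range requires the drag correlations the paper discusses.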

An Empirical Study of the Expectation-Perception Gap of I.S. Development

This paper treats the expectation-perception gap of systems users as a notion of information systems (IS) failure. Problems leading to the expectation-perception gap are identified and modelled as five interrelated discrepancies, or gaps, throughout the process of information systems development (ISD). It describes an empirical study of how systems developers and users perceive the size of each gap and the extent to which each problematic issue contributes to it. The key to achieving success in ISD is to keep the expectation-perception gap closed by closing all five constituent gaps. The gap model suggests that most factors in IS failure are related to the organizational, cognitive and social aspects of information systems design. Organizational requirements analysis, being the weakest link of IS development, is particularly worthy of investigation.

A Cohesive Lagrangian Swarm and Its Application to Multiple Unicycle-like Vehicles

Swarm principles are increasingly being used to design controllers for the coordination of multi-robot systems or, in general, multi-agent systems. This paper proposes a two-dimensional Lagrangian swarm model that enables planar agents, modeled as point masses, to swarm whilst effectively avoiding each other and obstacles in the environment. A novel method, based on an extended Lyapunov approach, is used to construct the model. Importantly, the Lyapunov method ensures a form of practical stability that guarantees an emergent behavior, namely, a cohesive and well-spaced swarm with a constant arrangement of individuals about the swarm centroid. Computer simulations illustrate this basic feature of collective behavior. As an application, we show how multiple planar mobile unicycle-like robots swarm to eventually form patterns in which their velocities and orientations stabilize.
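The cohesive, well-spaced behavior described above rests on a balance of centroid attraction and inter-agent repulsion. The sketch below is a generic gradient-flow illustration of that mechanism for point-mass agents; it is not the paper's Lyapunov-based controller, and the gains are arbitrary assumptions.

```python
import numpy as np

# Generic attraction-repulsion swarm: each agent moves toward the swarm
# centroid (cohesion) and away from nearby agents (spacing).
rng = np.random.default_rng(1)
pos = rng.uniform(0, 1, (10, 2))       # 10 planar point-mass agents
dt, attract, repel = 0.02, 1.0, 0.01   # illustrative gains
for _ in range(1000):
    centroid = pos.mean(axis=0)
    vel = -attract * (pos - centroid)             # linear attraction
    for i in range(len(pos)):
        diff = pos[i] - np.delete(pos, i, axis=0)
        d = np.linalg.norm(diff, axis=1, keepdims=True)
        vel[i] += repel * (diff / d**3).sum(axis=0)  # inverse-square repulsion
    pos += dt * vel
dists = np.linalg.norm(pos - pos.mean(axis=0), axis=1)
print(dists.min(), dists.max())  # all agents near, but not on, the centroid
```

The simulation settles into the qualitative emergent behavior the paper proves formally: a bounded cluster with nonzero minimum spacing about the centroid.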

MEGSOR Iterative Scheme for the Solution of 2D Elliptic PDEs

Recent findings on the MEG iterative scheme have demonstrated that it accelerates the convergence rate in solving systems of linear equations generated by approximation equations of boundary value problems. Based on the same scheme, this paper investigates the capability of a family of four-point block iterative methods with a weighted parameter ω, namely 4-Point EGSOR, 4-Point EDGSOR, and 4-Point MEGSOR, in solving two-dimensional elliptic partial differential equations using the second-order finite difference approximation. The formulation and implementation of the three four-point block iterative methods are also presented. Finally, the experimental results show that the 4-Point MEGSOR iterative scheme is superior to the existing four-point block schemes.
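The point SOR iteration that these block schemes generalize can be sketched as follows; this is the standard single-point method, not the 4-point block MEGSOR formulation itself. The test problem uses Dirichlet data from u(x, y) = xy, which the 5-point stencil reproduces exactly, so the remaining error measures pure iterative convergence.

```python
import numpy as np

def sor(u, omega, sweeps):
    """Point SOR for the 5-point Laplace stencil with fixed boundary values."""
    n = u.shape[0]
    for _ in range(sweeps):
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                gs = 0.25 * (u[i-1, j] + u[i+1, j] + u[i, j-1] + u[i, j+1])
                u[i, j] += omega * (gs - u[i, j])   # over-relaxed update
    return u

n = 22
x = np.linspace(0, 1, n)
X, Y = np.meshgrid(x, x, indexing="ij")
exact = X * Y                     # discrete harmonic: exact fixed point
u = np.zeros((n, n))
u[0, :], u[-1, :], u[:, 0], u[:, -1] = exact[0, :], exact[-1, :], exact[:, 0], exact[:, -1]
u = sor(u, omega=1.8, sweeps=500)
print(np.abs(u - exact).max())
```

The block variants (EGSOR, EDGSOR, MEGSOR) update groups of four points per step, which reduces the iteration count relative to this single-point baseline.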

Reversible, Embedded and Highly Scalable Image Compression System

In this work a new method for low-complexity image coding is presented that permits different settings and great scalability in the generation of the final bit stream. The coder is a continuous-tone still image compression system that combines lossy and lossless compression by making use of finite-arithmetic reversible transforms: both the color-space transform and the wavelet transform are reversible. The transformed coefficients are coded by a system based on a subdivision into smaller components (CFDS), similar to bit-importance codification. The subcomponents so obtained are reordered by a highly configurable alignment system that, depending on the application, makes it possible to rearrange the elements of the image and obtain different importance levels from which the bit stream is generated. The subcomponents of each importance level are coded using a variable-length entropy coding system (VBLm) that permits the generation of an embedded bit stream; this bit stream itself encodes a compressed still image. Furthermore, applying a packing system to the bit stream after the VBLm stage yields a final, highly scalable bit stream composed of a basic image level and one or several refinement levels.
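A reversible color-space transform of the kind the coder depends on can be sketched with the classical integer RCT used in lossless JPEG 2000; the abstract does not specify which reversible transform the system actually uses, so this is a stand-in.

```python
# Reversible Color Transform (RCT, as in lossless JPEG 2000):
# integer-to-integer, so the inverse recovers RGB exactly.
def rct_forward(r, g, b):
    y = (r + 2 * g + b) >> 2   # floor((R + 2G + B) / 4)
    u = r - g
    v = b - g
    return y, u, v

def rct_inverse(y, u, v):
    g = y - ((u + v) >> 2)     # exact because of the matching floor
    r = u + g
    b = v + g
    return r, g, b

print(rct_inverse(*rct_forward(120, 200, 37)))  # prints (120, 200, 37)
```

The exact round trip is what lets a single code path serve both the lossless mode and, after coefficient truncation, the lossy mode.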

Power System Contingency Analysis Using Multiagent Systems

Modern power systems place demands on energy management systems (EMS) that require them to be fast, and contingency analysis is among the most time-consuming EMS functions. To handle this limitation, this paper introduces agent-based technology into contingency analysis, where the main function of the agents is to speed up performance. The negotiation process in decision making is explained, with the objective being the minimization of operating costs. The IEEE 14-bus system and its line outages have been used in the research, and simulation results are presented.

Gas Flow Rate Identification in Biomass Power Plants by Response Surface Method

The use of renewable energy sources is becoming increasingly crucial, and the widening application of renewable energy devices at domestic, commercial and industrial levels reflects not only stronger awareness but also significantly increased installed capacity. Biomass, principally in the form of wood, has long been converted into energy for human use. Gasification is a process that converts solid carbonaceous fuel into combustible gas by partial combustion. Gasifier models operate under various conditions because the parameters kept in each model differ. This study used experimental data with three input variables (biomass consumption, temperature at the combustion zone, and ash discharge rate) and gas flow rate as the single output variable. Response surface methods were applied to identify the gasifier system equation best suited to the experimental data. The results showed that the linear model gave the best fit.
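A first-order response-surface fit of the kind the study selects can be sketched as a least-squares regression of gas flow rate on the three inputs. The data below are synthetic stand-ins with assumed coefficients; the abstract does not tabulate the gasifier measurements.

```python
import numpy as np

# First-order response surface: flow ~ b0 + b1*biomass + b2*temp + b3*ash.
rng = np.random.default_rng(0)
n = 60
biomass = rng.uniform(10, 50, n)    # biomass consumption (assumed kg/h)
temp = rng.uniform(700, 1000, n)    # combustion-zone temperature (assumed C)
ash = rng.uniform(0.5, 3.0, n)      # ash discharge rate (assumed kg/h)
flow = 1.2 * biomass + 0.05 * temp - 2.0 * ash + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), biomass, temp, ash])
coef, *_ = np.linalg.lstsq(X, flow, rcond=None)
pred = X @ coef
r2 = 1 - ((flow - pred)**2).sum() / ((flow - flow.mean())**2).sum()
print(coef, r2)
```

A full RSM study would also fit second-order (quadratic and interaction) terms and compare goodness of fit, which is how a linear model can be judged "best" for a given data set.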

The Statistical Properties of Filtered Signals

In this paper, the statistical properties of filtered or convolved signals are considered by deriving the resulting density functions, as well as the exact mean and variance expressions, given prior knowledge of the statistics of the individual signals in the filtering or convolution process. It is shown that the density function after linear convolution is a mixture density, where the number of density components is equal to the number of observations of the shortest signal. For circular convolution, the observed samples are characterized by a single density function, which is a sum of products.
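The mixture-density claim for linear convolution can be checked numerically: with an iid input and an M-tap filter, the edge samples of the full convolution are sums of fewer input terms than the interior samples, so different output positions follow different density components. A Monte Carlo sketch with assumed toy signals:

```python
import numpy as np

# Edge vs interior output samples of a linear (full) convolution with a
# 3-tap all-ones filter: y[0] = x[0] (one term, variance ~1) while
# y[2] = x[0]+x[1]+x[2] (three terms, variance ~3) for iid N(0,1) input.
rng = np.random.default_rng(0)
h = np.ones(3)
trials = np.array([np.convolve(rng.standard_normal(100), h, mode="full")
                   for _ in range(20000)])
var_edge = trials[:, 0].var()      # single-term component
var_interior = trials[:, 2].var()  # three-term component
print(var_edge, var_interior)
```

Circular convolution has no such edges: every output sample involves the same number of terms, consistent with the single density function stated above.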

Examining Corporate Tax Evaders: Evidence from the Finalized Audit Cases

This paper aims to (1) analyze the profiles of transgressors (detected evaders); (2) examine the reason(s) that triggered a tax audit, the causes of tax evasion, the audit timeframe and the tax penalty charged; and (3) assess whether tax auditors followed the guidelines stated in the 'Tax Audit Framework' when conducting tax audits. In 2011, the Inland Revenue Board Malaysia (IRBM) audited and finalized 557 company cases. With official permission, data on all 557 cases were obtained from the IRBM. Of these, a total of 421 cases with complete information were analyzed. About 58.1% were small and medium corporations, and 32.8% were from the construction industry. Selection for tax audit was based on risk analysis (66.8%), information from third parties (11.1%), and firms with low profitability or fluctuating profit patterns (7.8%). The three persistent causes of tax evasion by firms were over-claimed expenses (46.8%), fraudulent reporting of income (38.5%) and overstated purchases (10.5%). These findings are consistent with past literature. Results showed that tax auditors took six to 18 months to close audit cases. More than half of the tax evaders were fined 45% of the additional tax raised during audit for a first offence. The study found that tax auditors did follow the guidelines in the 'Tax Audit Framework' in audit selection, settlement and penalty imposition.

Organizational Culture and Innovation Adoption/Generation: A Proposed Model for Architectural Firms

Organizational culture fosters innovation, and innovation is the main engine for sustaining an organization in an uncertain market. As in other countries, the construction industry contributes significantly to the economy, society and technology of Malaysia, yet its innovation is still considered slow compared to other industries such as manufacturing. Given the important role of the architect as a key player and contributor of new ideas in the construction industry, there is a call to identify the issues and improve the current situation by focusing on architectural firms. In addition, existing studies tend to focus on only a few dimensions of organizational culture, and very few consider whether innovation is being generated or adopted. Hence, the present research fills this gap by identifying the organizational cultures that foster or hinder innovation generation and/or innovation adoption, and proposes a model of organizational culture and innovation generation and/or adoption.

Estimating the Absorption of Volatile Organic Compounds in Four Biodiesels Using the UNIFAC Procedure

This work considered the thermodynamic feasibility of scrubbing volatile organic compounds (VOCs) into biodiesel with a view to designing a gas treatment process with this absorbent. A detailed vapour-liquid equilibrium investigation was performed using the original UNIFAC group contribution method. The four biodiesels studied in this work are methyl oleate, methyl palmitate, methyl linolenate and ethyl stearate. The original UNIFAC procedure was used to estimate the infinite dilution activity coefficients of 13 selected volatile organic compounds in the biodiesels. The calculations were done at a VOC mole fraction of 9.213x10^-8. Ethyl stearate gave the most favourable phase equilibrium. Close agreement was found between the infinite dilution activity coefficient of toluene found in this work and those reported in the literature. Thermodynamic models can thus efficiently predict a vast amount of phase equilibrium behaviour from a limited number of experimental data.

Harmonic Elimination of Hybrid Multilevel Inverters Using Particle Swarm Optimization

This paper presents harmonic elimination for hybrid multilevel inverters (HMI), which increase the number of output voltage levels. Total harmonic distortion (THD) is one of the most important performance indices. Because an HMI has many output levels, the set of nonlinear equations for eliminating undesired individual harmonics and minimizing THD contains numerous unknown variables. The optimized harmonic stepped waveform (OHSW) is the conventional method for solving the switching angles, but it becomes increasingly complicated as levels are added. Artificial intelligence techniques are therefore considered for this problem. This paper presents a Particle Swarm Optimization (PSO) technique for solving the switching angles of a 15-level hybrid multilevel inverter so as to minimize THD and eliminate undesired individual harmonics. The combination of a high-level inverter and PSO thus provides a powerful tool for harmonic elimination, handling many variables and eliminating numerous harmonics.
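A minimal PSO in the standard global-best form can be sketched as follows. The objective here is a smooth stand-in (a sphere function) because the abstract does not give the actual THD expression over the switching angles; the inertia and acceleration constants are common textbook choices.

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=200, seed=0):
    """Global-best PSO: minimize f over R^dim."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))   # positions
    v = np.zeros((n_particles, dim))             # velocities
    pbest = x.copy()
    pbest_val = np.array([f(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    w, c1, c2 = 0.7, 1.5, 1.5                    # inertia, cognitive, social
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Stand-in objective; in the paper the decision vector would be the
# switching angles and f would evaluate THD plus harmonic constraints.
angles, best = pso(lambda p: np.sum(p**2), dim=4)
print(best)
```

For the inverter problem, the fitness function would additionally penalize violations of the selected-harmonic elimination equations and enforce angle ordering constraints.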

Reconstituting Information about Discontinued Water Quality Variables in the Nile Delta Monitoring Network Using Two Record Extension Techniques

The world economic crisis and budget constraints have caused authorities, especially those in developing countries, to rationalize water quality monitoring activities. Rationalization consists of reducing the number of monitoring sites, the number of samples, and/or the number of water quality variables measured. The reduction in water quality variables is usually based on correlation: if two variables exhibit high correlation, some of the information produced may be redundant. Consequently, one variable can be discontinued while the other continues to be measured. Later, the ordinary least squares (OLS) regression technique is employed to reconstitute information about the discontinued variable, using the continuously measured one as an explanatory variable. In this paper, two record extension techniques are employed to reconstitute information about discontinued water quality variables: OLS and the Line of Organic Correlation (LOC). An empirical experiment is conducted using water quality records from the Nile Delta water quality monitoring network in Egypt. The record extension techniques are compared for their ability to predict different statistical parameters of the discontinued variables. Results show that OLS is better at estimating individual water quality records; however, it underestimates the variance in the extended records. The LOC technique is superior in preserving the characteristics of the entire distribution and avoids underestimating the variance. It is concluded that OLS can be used for substituting missing values, while LOC is preferable for inferring statements about the probability distribution.
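The difference between the two techniques comes down to the slope: OLS uses r*sy/sx while LOC uses sign(r)*sy/sx, so only LOC preserves the variance of the extended record (OLS shrinks it by a factor r^2). A synthetic sketch (the data are stand-ins, not the Nile Delta records):

```python
import numpy as np

# Compare OLS and LOC record extension on a synthetic variable pair.
rng = np.random.default_rng(0)
x = rng.normal(0, 1, 2000)                 # continuously measured variable
y = 2.0 * x + rng.normal(0, 1.5, 2000)     # variable to be "discontinued"
r = np.corrcoef(x, y)[0, 1]
sx, sy = x.std(), y.std()

b_ols = r * sy / sx                # OLS slope
b_loc = np.sign(r) * sy / sx       # LOC slope (variance-preserving)
y_ols = y.mean() + b_ols * (x - x.mean())
y_loc = y.mean() + b_loc * (x - x.mean())
print(y.var(), y_ols.var(), y_loc.var())
```

The OLS extension visibly underestimates the variance, while the LOC extension reproduces it, matching the paper's conclusion about which technique suits distributional inference.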