Control-flow Complexity Measurement of Processes and Weyuker's Properties

Process measurement is the task of empirically and objectively assigning numbers to the properties of business processes in such a way as to describe them. Desirable attributes to study and measure include complexity, cost, maintainability, and reliability. In our work we focus on investigating process complexity, which we define as the degree to which a business process is difficult to analyze, understand, or explain. One way to analyze a process's complexity is to use a process control-flow complexity measure. In this paper, an attempt has been made to evaluate the control-flow complexity measure in terms of Weyuker's properties, which must be satisfied by any complexity measure to qualify as a good and comprehensive one.
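
Although the abstract does not reproduce the measure itself, a widely used process control-flow complexity (CFC) measure of this kind is Cardoso's, which charges each split node according to the number of control-flow states it can induce. The sketch below assumes a simplified process representation (a flat list of typed splits) purely for illustration; it is not the paper's implementation.

```python
# Illustrative sketch of a Cardoso-style control-flow complexity (CFC)
# measure. The process representation (a list of (split_type, fan_out)
# pairs) is a simplifying assumption made for this example.

def cfc(splits):
    """Sum the CFC contribution of every split node in the process."""
    total = 0
    for split_type, fan_out in splits:
        if split_type == 'XOR':      # exactly one of n branches is taken
            total += fan_out
        elif split_type == 'OR':     # any non-empty subset of branches
            total += 2 ** fan_out - 1
        elif split_type == 'AND':    # all branches taken: a single state
            total += 1
        else:
            raise ValueError(f"unknown split type: {split_type}")
    return total

# Example: one XOR-split over 3 branches plus one OR-split over 2
# branches gives CFC = 3 + (2**2 - 1) = 6.
print(cfc([('XOR', 3), ('OR', 2)]))
```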

Binary Phase-Only Filter Watermarking with Quantized Embedding

Binary phase-only filter digital watermarking embeds the phase information of the discrete Fourier transform of an image into the corresponding magnitudes for better image authentication. This paper proposes an approach to implementing watermark embedding by quantizing the magnitudes, discusses how to regulate the quantization steps based on the frequencies of the magnitude coefficients carrying the embedded watermark, and shows how to embed the watermark with low-frequency quantization. Theoretical analysis and simulation results show that quantization-based watermark embedding effectively improves the flexibility, security, watermark imperceptibility, and detection performance of binary phase-only filter digital watermarking, and also increases robustness against JPEG compression to some extent.
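
The abstract does not give the embedding rule; a minimal QIM-style sketch of quantization-based embedding of one bit into a single DFT magnitude coefficient, assuming a per-band quantization step `delta`, might look as follows. This is an illustration of the general technique, not the paper's algorithm.

```python
# Illustrative QIM-style sketch (not the paper's algorithm): embed one
# watermark bit into a DFT magnitude by quantization. The step `delta`
# would, as the paper discusses, be regulated per frequency band.
import numpy as np

def embed_bit(magnitude, bit, delta):
    """Move the magnitude into an even (bit=0) or odd (bit=1) cell of width delta."""
    q = int(np.floor(magnitude / delta))
    if q % 2 != bit:
        q += 1
    return q * delta + delta / 2   # centre of the chosen quantization cell

def extract_bit(magnitude, delta):
    """Recover the bit from the parity of the quantization cell."""
    return int(np.floor(magnitude / delta)) % 2

m_marked = embed_bit(123.7, bit=1, delta=8.0)
assert extract_bit(m_marked, delta=8.0) == 1
```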

Effect of Flying Capacitors on Improving the Four-Level Three-Cell Inverter

With the rapid advance of technology, industrial processes have become increasingly demanding in terms of power quality and controllability. The advent of multilevel inverters partially meets these requirements, and the new generation of multi-cell inverters achieves still better performance, since it offers more voltage levels. The drawback of increasing the number of voltage levels through cascaded cells is the loss of synchronisation among the series IGBTs, which limits the number of cascaded cells to four. In view of these constraints, a new topology is proposed in this paper that increases the voltage levels of the three-cell inverter from 4 to 8 with the same number of IGBTs, while storing less energy in the flying capacitors. The details of operation and modelling of this new inverter structure are also presented, then tested on a three-phase induction motor.

Keywords: Flying capacitors, multi-cell inverter, PWM, switches, modelling.

Optimal Algorithm for Constructing the Delaunay Triangulation in E^d

In this paper we propose a new approach to constructing the Delaunay triangulation, together with an optimal algorithm, for the case of multidimensional spaces (d ≥ 2). Analysing the state of the art, one can conclude that the ideas behind the existing efficient algorithms developed for the planar case are not easy to generalise to the multidimensional case without a loss of efficiency. We offer an effective algorithm for this problem that satisfies all the given requirements. The theoretical complexity of the problem, however, cannot be improved upon, since the worst-case optimality of algorithms solving it has been proved.
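
The paper's algorithm itself is not reproduced in the abstract; as a practical point of reference, a d-dimensional Delaunay triangulation (d ≥ 2) can be computed with SciPy's Qhull wrapper, as sketched below.

```python
# Reference usage only (not the paper's algorithm): Delaunay
# triangulation of a point set in E^d via SciPy's Qhull bindings.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
points = rng.random((30, 4))        # 30 points in E^4
tri = Delaunay(points)
print(tri.simplices.shape)          # (n_simplices, d + 1) vertex indices
```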

Sufficiency Economy: A Contribution to Economic Development

The Philosophy of Sufficiency Economy, bestowed by His Majesty King Bhumibol Adulyadej on the people of Thailand, highlights a balanced way of living. Its three principles, moderation, reasonableness, and immunity, along with the conditions of morality and knowledge, can be applied at any level of society, from the individual to the nation. The Philosophy of Sufficiency Economy helps address current development challenges, including institutions, environmental sustainability, human well-being, and the role of government.

Weak Measurement Theory for Discrete Scales

With the increasing spread of computers and the internet among culturally, linguistically, and geographically diverse communities, issues of internationalization and localization are becoming increasingly important. For some of these issues, such as the different scales for length and temperature, there is a well-developed measurement theory; for others, such as date formats, no such theory is possible. This paper fills a gap by developing a measurement theory for a class of scales previously overlooked: discrete and interval-valued scales such as spanner and shoe sizes. The paper thereby gives a theoretical foundation for a class of data representation problems.
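
To make the class of scales concrete, the sketch below models a discrete, interval-valued scale as intervals over an underlying continuous quantity; converting between two such scales is itself interval-valued, which is exactly the kind of representation problem the theory must handle. All sizes and intervals here are invented for illustration.

```python
# Hypothetical illustration: two discrete shoe-size scales, each size
# covering an interval of foot lengths (cm). Values are invented.
UK_SHOE = {6: (24.4, 25.0), 7: (25.0, 25.8), 8: (25.8, 26.7)}
EU_SHOE = {40: (25.0, 25.7), 41: (25.7, 26.3), 42: (26.3, 27.0)}

def convert(size, src, dst):
    """Map a size to every size in dst whose interval overlaps it."""
    lo, hi = src[size]
    return [s for s, (a, b) in dst.items() if a < hi and b > lo]

print(convert(7, UK_SHOE, EU_SHOE))   # -> [40, 41]: conversion is interval-valued
```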

Solid-State Bioconversion of Pineapple Residues into Kojic Acid by Aspergillus flavus: A Prospective Study

Kojic acid is an organic acid that is widely used as an ingredient in dermatological products, a precursor for flavor enhancers, and an anti-inflammatory drug. The present study was undertaken to test the feasibility of pineapple residues as a substrate for kojic acid production by Aspergillus flavus Link 44-1 via solid-state fermentation. The effects of initial moisture content, pH, and incubation time on kojic acid fermentation were investigated. The best initial moisture content for kojic acid production from pineapple residues was 70% (v/w), while an initial culture pH of 2.5 was identified as giving high production of kojic acid. The optimal incubation time was identified as between 8 and 14 days, which corresponded to the highest amounts of kojic acid produced. The results of this study demonstrate the promise of pineapple residues as an alternative substrate for kojic acid production by A. flavus Link 44-1.

Anomaly Detection Using a Neuro-Fuzzy System

As network-based technologies become omnipresent, demands to secure networks and systems against threats increase. One of the effective ways to achieve higher security is through the use of intrusion detection systems (IDS), software tools that detect anomalous activity in a computer or network. In this paper, an IDS has been developed using an improved machine-learning-based algorithm, the Locally Linear Neuro-Fuzzy Model (LLNF), for classification, although this model was originally used for system identification. A key technical challenge in IDS and LLNF learning is the curse of high dimensionality; therefore, a feature selection phase applicable to any IDS is proposed. By investigating the use of three feature selection algorithms in this model, it is shown that adding a feature selection phase reduces the computational complexity of our model. Feature selection algorithms require a feature goodness measure; the use of both a linear measure, the linear correlation coefficient, and a non-linear one, mutual information, is investigated.
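
As an illustration of the two feature-goodness measures named above, the sketch below scores synthetic features against a label using the absolute Pearson correlation (linear) and estimated mutual information (non-linear) from scikit-learn. The data and dependence on feature 3 are purely illustrative, not from the paper.

```python
# Illustrative comparison of a linear and a non-linear feature-goodness
# measure on synthetic data (not the paper's dataset or pipeline).
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
X = rng.random((500, 10))            # 500 samples, 10 candidate features
y = (X[:, 3] + 0.1 * rng.standard_normal(500) > 0.5).astype(int)

# Linear measure: absolute Pearson correlation of each feature with the label.
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])

# Non-linear measure: estimated mutual information between feature and label.
mi = mutual_info_classif(X, y, random_state=0)

print("top feature by correlation:", corr.argmax())   # expected: 3
print("top feature by mutual info:", mi.argmax())     # expected: 3
```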

Artificial Intelligence for Software Quality Improvement

This paper presents a software quality support tool: a Java source code evaluator and code profiler based on computational intelligence techniques. It is a Java software prototype developed by the AI Group [1] at the Research Laboratories of Universidad de Palermo: an Intelligent Java Analyzer (in Spanish: Analizador Java Inteligente, AJI). It represents a new approach to evaluating and identifying inaccurate source code usage and, transitively, the software product itself. The aim of this project is to provide the software development industry with a new tool to increase software quality by extending the value of source code metrics through computational intelligence.

Estimation of the Park-Ang Damage Index for Floating Column Building with Infill Wall

Buildings with floating columns are highly undesirable in seismically active areas. Many urban multi-storey buildings today adopt floating columns to accommodate parking at the ground floor or reception lobbies in the first storey. The earthquake forces developed at different floor levels in a building need to be brought down along the height to the ground by the shortest path; any deviation or discontinuity in this load transfer path results in poor performance of the building, and floating column buildings are severely damaged during earthquakes. Damage to such structures can be reduced by taking the effect of infill walls into account. This paper presents the effect of infill wall stiffness on the damage occurring in a floating column building when the ground shakes. Modelling and analysis are carried out with the nonlinear analysis programme IDARC-2D. Damage occurring in beams, columns, and storeys is studied by formulating a modified Park & Ang model to evaluate damage indices, and overall structural damage indices of the buildings due to ground shaking are also obtained. Dynamic response parameters, i.e. lateral floor displacement, storey drift, time period, and base shear of the buildings, are obtained, and the results are compared with ordinary moment-resisting frame buildings. The formation of cracks, yielding, and plastic hinges is also observed during the analysis.
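
For reference, the classical (unmodified) Park & Ang damage index for a member, on which the paper's modified model builds, combines maximum deformation with dissipated hysteretic energy:

```latex
DI = \frac{\delta_m}{\delta_u} + \frac{\beta}{Q_y\,\delta_u} \int dE_h
```

where \(\delta_m\) is the maximum deformation under the earthquake, \(\delta_u\) the ultimate deformation under monotonic loading, \(Q_y\) the yield strength, \(\int dE_h\) the absorbed hysteretic energy, and \(\beta\) a strength-degradation parameter. The paper's modification of this index is not reproduced here.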

Development and Evaluation of a Dynamic Cardiac Phantom for use in Nuclear Medicine

The aim of this study was to develop a dynamic cardiac phantom for quality control in myocardial scintigraphy. The dynamic heart phantom constructed contained only the left ventricle, made of an elastic material (latex) and comprising two cavities: one internal and one external. The data showed a non-significant variation in the values of left ventricular ejection fraction (LVEF) obtained by varying the heart rate. It was also possible to evaluate LVEF through different image acquisition matrices and to perform an intercomparison of LVEF between two different scintillation cameras. The results of the quality control tests were satisfactory, showing that they can be used as parameters in future assessments. The new dynamic heart phantom was demonstrated to be effective for use in LVEF measurements and is therefore useful for the quality control of scintigraphic cameras.

A Model for Application of Knowledge Management in Public Organizations in Iran

This study examines knowledge management in public organizations in Iran. The purpose of this article is to provide a conceptual framework for the application of knowledge management in public organizations. The study indicates that an increasing tendency toward implementation of knowledge management in organizations is emerging. Nonetheless, knowledge management in public organizations is in its infancy, and little has been done to bring the subject into use in the public sector. The globalization of change and the popularization of values such as participation, citizen-orientation, and knowledge-orientation in the new theories of public administration require that knowledge management be considered and attended to in the public sector. This study holds that a knowledge management framework for public organizations differs from that for the private sector, because the public sector is stakeholder-dependent while the private sector is shareholder-dependent. Based on the research, we provide a conceptual model involving three factors: organizational factors, knowledge citizens, and contextual factors. The study results indicate that these factors affect knowledge management in public organizations in Iran.

Production of Hydrogen and Carbon Nanofiber via Methane Decomposition

High-purity hydrogen and the valuable by-product carbon nanotubes (CNTs) can be produced by catalytic methane decomposition. Methane conversion and the properties of the CNTs are determined by the choice of catalyst and the conditions of the decomposition reaction. In this paper, Ni/MgO and Ni/O-D (oxidized diamond) catalysts were prepared by the wetness impregnation method, and the effects of reaction temperature and methane space velocity on methane conversion were investigated in a fixed bed. The surface area, structure, and micrography were characterized by BET, XPS, SEM, and EDS techniques. The results showed that methane conversion was above 8% within 150 min (T = 500 °C) for the 33Ni/O-D catalyst and higher than 25% within 120 min (T = 650 °C) for the 41Ni/MgO catalyst. The initial conversion increased with increasing decomposition temperature, but catalytic activity decayed rapidly at too high a temperature. Decreasing the methane space velocity promoted methane conversion but did not favor hydrogen yields. The morphology of the carbon resulting from methane decomposition depended on the support type and the catalytic reaction conditions: it appeared as fibers on the surface of Ni/O-D at relatively low temperatures such as 500 and 550 °C, but as grains stacked on and overlaying the surface of the metallic nickel at 650 °C. Carbon fibers could form on the Ni/MgO surface at 650 °C, and the diameter of the fibers increased with decreasing space velocity.

A Multi-Level GA Search with Application to the Resource-Constrained Re-Entrant Flow Shop Scheduling Problem

Re-entrant scheduling is an important search problem with many constraints in the flow shop. In the literature, a number of approaches have been investigated, from exact methods to meta-heuristics. This paper presents a genetic algorithm that encodes the problem as multi-level chromosomes to reflect the dependent relationship between re-entrant possibility and resource consumption. This novel encoding preserves the information in the data intact and speeds up convergence to near-optimal solutions. To test the effectiveness of the method, it has been applied to the resource-constrained re-entrant flow shop scheduling problem. Computational results show that the proposed GA outperforms a simulated annealing algorithm in terms of makespan.
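
The paper's exact encoding is not given in the abstract; the following is a hypothetical sketch of what a two-level chromosome for a resource-constrained re-entrant flow shop might look like, with one level holding the operation sequence (jobs re-enter, so they appear more than once) and a dependent level holding the resource chosen for each operation. All problem data are invented.

```python
# Hypothetical two-level chromosome for a re-entrant flow shop
# (illustrative only, not the paper's encoding).
import random

JOBS = [0, 1, 2]
REENTRIES = {0: 2, 1: 1, 2: 2}       # passes each job makes through the shop
RESOURCES = [0, 1]                    # interchangeable resources per stage

def random_chromosome():
    level1 = [j for j in JOBS for _ in range(REENTRIES[j])]
    random.shuffle(level1)                                   # operation order
    level2 = [random.choice(RESOURCES) for _ in level1]      # resource per op
    return level1, level2

def mutate(chrom):
    """Swap two operations and re-draw one dependent resource gene."""
    level1, level2 = chrom
    i, j = random.sample(range(len(level1)), 2)
    level1[i], level1[j] = level1[j], level1[i]
    level2[i] = random.choice(RESOURCES)
    return level1, level2

print(mutate(random_chromosome()))
```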

Sidecooler Flow Field Investigation

One aim of this paper is to compare experimental results with a numerical simulation of a side cooler, specifically the amount of air delivered by the side cooler with its fans running at 100%. This integral value was measured and evaluated in a plane parallel to the front side of the side cooler, at a distance of 20 mm from it. The flow field extending from the side cooler into the surrounding space was also evaluated. Another objective was to assess the contribution of the evaluated values to the increase in data center energy consumption.

How Team Efficacy Beliefs Impact Project Performance: An Empirical Investigation of Team Potency in Capital Projects in the Process Industries

Team efficacy beliefs show promise in enhancing team performance. Using a model-based quantitative research design, we investigated the antecedents and performance consequences of generalized team efficacy (potency) in a sample of 56 capital projects executed by 15 Fortune 500 companies in the process industries. Empirical analysis of our field survey showed that generalized team efficacy beliefs were positively associated with an objective measure of project cost performance. Regression analysis revealed that team competence, empowering leadership, and performance feedback all predicted generalized team efficacy beliefs, and tests of mediation revealed that generalized team efficacy fully mediated the relationship between these three inputs and project cost performance.
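
The mediation logic described can be illustrated schematically (Baron-Kenny style) on synthetic data; the variable names and effect sizes below are ours, not the study's.

```python
# Schematic of a full-mediation test on synthetic data (illustrative
# only; not the study's data or exact procedure).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 56
leadership = rng.standard_normal(n)                   # input (antecedent)
potency = 0.6 * leadership + rng.standard_normal(n)   # mediator
cost_perf = 0.5 * potency + rng.standard_normal(n)    # outcome

X = sm.add_constant(np.column_stack([leadership, potency]))
fit = sm.OLS(cost_perf, X).fit()
# Full mediation: the leadership coefficient shrinks toward zero once
# the mediator (potency) is controlled for.
print(fit.params)
```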

Computer-aided Sequence Planning of Shearing Operations in Progressive Dies

This paper aims to study a methodology for building the knowledge needed to plan adequate punches to complete the task of strip layout for shearing processes using progressive dies. The proposed methodology uses die design rules and the characteristics of different types of punches to classify them into five groups: prior use (punches that must be used first), posterior use (must be used last), compatible use (may be used together), sequential use (certain punches must precede others), and simultaneous use (must be used together). With these five groups of punches, the search space of feasible designs is greatly reduced, and superimposition becomes a more effective method of punch layout. Since the superimposition scheme generates many feasible solutions, an evaluation function based on the number of stages, moment balancing, and strip stability is developed to help designers find better solutions.
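
As an illustration of how such groups prune the search space, the sketch below (hypothetical punch names and relations, not from the paper) encodes four of the five relations as constraints and checks a candidate stage assignment against them; compatible use simply permits punches to share a stage.

```python
# Hypothetical encoding of the punch-group constraints and a feasibility
# check over a candidate stage assignment (illustrative only).
PRIOR = {'pilot'}                      # must be used first
POSTERIOR = {'cutoff'}                 # must be used last
SEQUENTIAL = [('pierce_A', 'trim_B')]  # pierce_A must precede trim_B
SIMULTANEOUS = [('notch_L', 'notch_R')]

def feasible(stages):
    """stages: list of sets of punch names, one set per die stage."""
    pos = {p: i for i, s in enumerate(stages) for p in s}
    ok = all(pos[p] == 0 for p in PRIOR if p in pos)
    ok &= all(pos[p] == len(stages) - 1 for p in POSTERIOR if p in pos)
    ok &= all(pos[a] < pos[b] for a, b in SEQUENTIAL if a in pos and b in pos)
    ok &= all(pos[a] == pos[b] for a, b in SIMULTANEOUS if a in pos and b in pos)
    return ok

print(feasible([{'pilot'}, {'pierce_A', 'notch_L', 'notch_R'},
                {'trim_B'}, {'cutoff'}]))   # -> True
```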

Megalopolisation: An Effect of Large Scale Urbanisation in Post-Reform China

A megalopolis is a group of densely populated metropolitan areas that combine to form an urban complex. Since China introduced economic reforms in the late 1970s, the Chinese urban system has experienced unprecedented growth. The process of urbanisation prevailed in the 1980s, and predominantly large-city growth appeared to continue through the 1990s and 2000s. In this study, the magnitude and pattern of urbanisation in China during the 1990s were examined using remotely sensed imagery acquired by the TM/ETM+ sensors onboard the Landsat satellites. The development of megalopolis areas in China was also studied based on GIS analysis of the increases in urban and built-up area from 1990 to 2000. The analysis suggests that in the traditional agricultural zones of China, e.g. the Huang-Huai-Hai Plains, Changjiang River Delta, Pearl River Delta, and Sichuan Basin, urban and built-up areas increased by 1.76 million hectares, of which 0.82 million hectares were expansion of urban areas, an increase of 24.78% over 1990 at the national scale. The Yellow River Delta, Changjiang River Delta, and Pearl River Delta also saw increases in urban and built-up area of 63.9%, 66.2%, and 83.0% respectively. As a result, three major megalopolises developed in China: the Guangzhou-Shenzhen-Hong Kong-Macau (Pearl River Delta: PRD) megalopolis, the Shanghai-Nanjing-Hangzhou (Changjiang River Delta: CRD) megalopolis, and the Beijing-Tianjin-Tangshan-Qinhuangdao (Yellow River Delta-Bohai Sea Ring: YRD) megalopolis. The relationship between the process of megalopolisation and inter-provincial population flows was also explored in the context of socio-economic and transport infrastructure development in post-reform China.

A Simulation for Estimation of Blood Pressure Using an Arterial Pressure-Volume Model

An analysis of the conventional blood pressure estimation method using an oscillometric sphygmomanometer was performed through a computer simulation using an arterial pressure-volume (APV) model. Traditionally, the maximum amplitude algorithm (MAA) is applied to the oscillation waveforms of the APV model to obtain the mean arterial pressure and the characteristic ratios. The estimation of the mean arterial pressure and characteristic ratios was significantly affected by the shape of the blood pressure waveforms and the cutoff frequency of the high-pass filter (HPF) circuitry, and these effects produce experimental errors when estimating blood pressure. To find an algorithm independent of the influence of waveform shapes and HPF parameters, the volume oscillation of the APV model and the phase shift of the oscillation, obtained with the fast Fourier transform (FFT), were examined while increasing the cuff pressure from 1 mmHg to 200 mmHg (1 mmHg per second). A phase shift in the volume oscillation was then observed only between the systolic and diastolic blood pressures. The same results were also obtained from simulations performed on two different arterial blood pressure waveforms and one hypertension waveform.
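
For concreteness, the maximum amplitude algorithm under analysis can be sketched as follows: the cuff pressure at which the oscillation envelope peaks is taken as the mean arterial pressure, and systolic/diastolic pressures are read where the envelope falls to fixed characteristic ratios of that peak. The ratios 0.55 and 0.85 below are typical literature values, not the paper's, and the envelope is synthetic.

```python
# Simplified illustration of the maximum amplitude algorithm (MAA);
# characteristic ratios are assumed typical values, not the paper's.
import numpy as np

def maa(cuff_pressure, envelope, ks=0.55, kd=0.85):
    """cuff_pressure increasing; envelope = oscillation amplitude at each step."""
    i_max = int(np.argmax(envelope))
    mean_ap = cuff_pressure[i_max]
    # Systolic: the high-cuff-pressure side of the peak.
    i_sys = i_max + int(np.argmin(np.abs(envelope[i_max:] - ks * envelope[i_max])))
    # Diastolic: the low-cuff-pressure side of the peak.
    i_dia = int(np.argmin(np.abs(envelope[:i_max + 1] - kd * envelope[i_max])))
    return cuff_pressure[i_sys], mean_ap, cuff_pressure[i_dia]

cuff = np.arange(1, 201)                       # 1..200 mmHg, 1 mmHg per second
env = np.exp(-((cuff - 95.0) / 30.0) ** 2)     # synthetic bell-shaped envelope
print(maa(cuff, env))                          # (systolic, MAP, diastolic)
```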

Redefining Field Experiences: Virtual Environments in Teacher Education

The explosion of interest in online gaming and virtual worlds is leading many universities to investigate possible educational applications of these new environments. In this paper we explore the possibilities of 3D online worlds for teacher education, particularly the field experience component. Drawing upon two pedagogical examples, we suggest that virtual simulations may, within certain limitations, create safe spaces that allow preservice teachers to adopt alternate identities and interact safely with the “other”. In so doing, they may become aware of the constructed nature of social categories and gain the essential pedagogical skill of perspective-taking. We suggest that, ultimately, the ability to be the principal creators of themselves in virtual environments can increase preservice teachers' ability to do the same in the real world.