Re-Thinking Knowledge-Based Management

This paper challenges the relevance of knowledge-based management research by arguing that the majority of the literature emphasizes the provision of information and knowledge rather than their business use. As a result, the related processes are considered valuable in their own right, which has led to the overlapping nature of knowledge-based management disciplines. As a solution, this paper turns the focus to information use. The value of knowledge and the respective management tasks are then defined by the business need, and the knowledge user becomes the main actor. The paper analyses the prevailing literature streams and recognizes the need for a more focused and robust understanding of knowledge-based value creation. The paper contributes by synthesizing the existing literature and pinpointing the essence of the knowledge-based management disciplines.

Network Reconfiguration for Load Balancing in Distribution System with Distributed Generation and Capacitor Placement

This paper presents an efficient algorithm for the optimization of radial distribution systems through network reconfiguration to balance feeder loads and eliminate overload conditions. A system load-balancing index is used to determine the loading conditions of the system and the maximum system loading capacity; the optimal network reconfiguration for load balancing is the one that minimizes this index. A method based on the Tabu search algorithm is employed to search for the optimal network reconfiguration. The basic idea behind the search is to move from a current solution to a neighboring one while effectively utilizing memory to make the search for the optimum efficient. The method requires low computational effort and is able to find good-quality configurations. Simulation results are presented for a radial 69-bus system with distributed generation and capacitor placement. The results show that the optimal on/off patterns of the switches can be identified to give the best network reconfiguration, balancing feeder loads while respecting all constraints.
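
As a rough illustration of the search loop described above, here is a minimal, hypothetical Python sketch; the load-balancing objective is a stand-in (a real implementation would evaluate the index from a power-flow solution and check radiality and operating constraints for every candidate configuration):

```python
# Minimal Tabu search sketch for on/off switch reconfiguration (illustrative only).
import random

def load_balancing_index(config):
    # Placeholder objective: deterministic toy "power flow" that rewards
    # configurations whose simulated feeder loads are even.
    random.seed(hash(config))
    loads = [random.uniform(0.2, 1.0) * s for s in config]
    mean = sum(loads) / len(loads)
    return sum((l - mean) ** 2 for l in loads)

def neighbors(config):
    # Neighborhood move: toggle a single switch.
    for i in range(len(config)):
        yield config[:i] + (1 - config[i],) + config[i + 1:]

def tabu_search(initial, iters=200, tenure=7):
    current = best = tuple(initial)
    best_cost = load_balancing_index(best)
    tabu = {}                                   # configuration -> expiry iteration
    for it in range(iters):
        candidates = [(load_balancing_index(n), n) for n in neighbors(current)
                      if tabu.get(n, -1) < it]  # skip moves still on the tabu list
        if not candidates:
            break
        cost, current = min(candidates)         # best admissible neighbor, even if worse
        tabu[current] = it + tenure             # forbid revisiting for a while
        if cost < best_cost:
            best, best_cost = current, cost
    return best, best_cost

print(tabu_search((1, 0, 1, 1, 0, 1, 0, 1)))
```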

Two-Phase Optimization for Selecting Materialized Views in a Data Warehouse

A data warehouse (DW) is a system whose value lies in supporting decision-making through queries. Queries to a DW are critical because of their complexity and length: they often access millions of tuples and involve joins between relations as well as aggregations. Materialized views can provide much better performance for DW queries. However, such views incur maintenance costs, so materializing all views is not feasible. An important challenge in a DW environment is therefore materialized view selection, since a trade-off must be struck between query performance and view maintenance cost. In this paper, we introduce a new approach to this challenge based on Two-Phase Optimization (2PO), a combination of Simulated Annealing (SA) and Iterative Improvement (II), used together with the Multiple View Processing Plan (MVPP). Our experiments show that 2PO outperforms the original algorithms in terms of query processing cost and view maintenance cost.
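
A minimal sketch of 2PO, assuming the classical scheme (II to reach a good local minimum quickly, then SA started from it); `cost` and `neighbor` are placeholders, whereas in the paper's setting the state would encode which MVPP nodes are materialized and the cost would combine query processing and view maintenance:

```python
import math, random

def iterative_improvement(state, cost, neighbor, steps=500):
    c = cost(state)
    for _ in range(steps):
        n = neighbor(state)
        nc = cost(n)
        if nc < c:                      # phase 1: accept strict improvements only
            state, c = n, nc
    return state, c

def simulated_annealing(state, cost, neighbor, t=0.5, alpha=0.95, steps=2000):
    c = cost(state)
    best, best_c = state, c
    for _ in range(steps):
        n = neighbor(state)
        nc = cost(n)
        # phase 2: occasionally accept a worse move to escape local minima
        if nc < c or random.random() < math.exp((c - nc) / t):
            state, c = n, nc
            if c < best_c:
                best, best_c = state, c
        t *= alpha                      # cool down
    return best, best_c

def two_phase_optimization(initial, cost, neighbor):
    local_opt, _ = iterative_improvement(initial, cost, neighbor)
    return simulated_annealing(local_opt, cost, neighbor)

# Toy usage: choose a bit vector (view materialized or not) minimizing a dummy cost.
cost = lambda s: sum(s) + 10 * (s[0] ^ s[1])
neighbor = lambda s: (lambda i: s[:i] + (1 - s[i],) + s[i + 1:])(random.randrange(len(s)))
print(two_phase_optimization((1, 1, 0, 1), cost, neighbor))
```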

Experimental Investigation of a Novel Reaction in Reduction of Sulfates by Natural Gas as a Reducing Agent

The reduction of sodium sulfate by natural gas, used as the reducing agent, was investigated in a pilot-scale fluidized bed reactor. Feed density, feed mass flow rate, and natural gas and air flow rates (independent parameters), as well as bed temperature and CO concentration at the reactor inlet and outlet (dependent parameters), were monitored and recorded at steady state. The residence time was adjusted close to the value of the traditional reaction [1]. An artificial neural network (ANN) was established to study the dependency of yield and carbon gradient on the operating parameters. The 97% accuracy of the applied ANN is good evidence that natural gas can be used as a reducing agent. The ANN model predicting the carbon gradient from other sources (accuracy 74%) indicates that there is no meaningful relation between carbon variation from other sources and the reduction process, which means that the carbon in the granule does not have a significant effect on the reaction yield.
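
A rough sketch of this kind of dependency modelling (the network architecture, data and relation here are synthetic placeholders, not the paper's):

```python
# Hypothetical sketch: a small feed-forward ANN mapping operating parameters
# (feed density, feed mass flow rate, gas flow, air flow) to reaction yield.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 4))                  # 200 runs x 4 operating parameters
y = X @ np.array([0.4, 0.3, 0.2, 0.1])          # toy yield relation

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X_tr, y_tr)
print("held-out R^2:", model.score(X_te, y_te)) # accuracy of the fitted model
```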

The Optimal Equilibrium Capacity of Information Hiding Based on Game Theory

Game theory can be used to analyze conflicting interests in the field of information hiding. In this paper, a two-phase game is used to model the embedder-attacker system and analyze the limits on the hiding capacity of embedding algorithms: the embedder minimizes the expected damage and the attacker maximizes it. In the system, the embedder first consumes its resource to build embedded units (EU) and inserts the secret information into the EU. The attacker then distributes its resource evenly over the attacked EU. The expected equilibrium damage, which the attacker seeks to maximize and the embedder to minimize, is evaluated for the case in which the attacker attacks a subset of all the EU. Furthermore, the optimal equilibrium capacity of hidden information is calculated through the optimal number of EU carrying the embedded secret information. Finally, illustrative examples of the optimal equilibrium capacity are presented.
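
Schematically, the structure described above can be written as a min-max problem (a generic formulation for orientation only; the abstract does not give the paper's exact payoff functions):

\[
D^{*} \;=\; \min_{n}\;\max_{S \subseteq \{1,\dots,n\}} \sum_{i \in S} d_i\!\left(\frac{R_a}{|S|}\right),
\]

where \(n\) is the number of EU built by the embedder, \(R_a\) is the attacker's resource spread evenly over the attacked subset \(S\), and \(d_i\) is the damage inflicted on unit \(i\); the optimal equilibrium capacity then follows from the optimal \(n\).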

Tourist Satisfaction and Repeat Visitation: Toward a New Comprehensive Model

Tourism researchers have recently focused on repeat visitation as a component of destination loyalty. Different models have considered satisfaction the main determinant of revisit intention, although findings across many studies show the question remains unsettled. This conceptual paper evaluates recent empirical studies on satisfaction and revisit intention. Based on the limitations and gaps in these studies, it proposes a new model intended to be more comprehensive than its predecessors. The new model offers new relationships between the antecedents (destination image, perceived value, specific novelty seeking, and distance to destination) and both satisfaction and revisit intention. Revisit intention, in turn, is suggested to be measured using a temporal approach.

Optimization of Parametric Studies Using Strategies of Sampling Techniques

To improve the efficiency of parametric studies and test planning, a method is proposed that takes all input parameters into account while performing only a few simulation runs to assess the relative importance of each input parameter. For K input parameters with N input values each, the total number of possible combinations of input values equals N^K. To limit the number of runs, only some (N in total) of the possible combinations are taken into account. The Updated Latin Hypercube Sampling procedure is used to choose the optimal combinations. To measure the relative importance of each input parameter, the Spearman rank correlation coefficient is proposed. The sensitivity and influence of all parameters are analyzed within one procedure, and the key parameters with the largest influence are immediately identified.
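
A minimal sketch of the sampling-plus-ranking idea, assuming standard SciPy tooling (SciPy's stock Latin Hypercube stands in for the paper's Updated Latin Hypercube Sampling, and the simulation is a toy function):

```python
# N Latin Hypercube runs instead of N**K full-factorial combinations,
# with Spearman rank correlation as the importance measure per parameter.
import numpy as np
from scipy.stats import qmc, spearmanr

K, N = 4, 32                                       # K input parameters, N runs
X = qmc.LatinHypercube(d=K, seed=0).random(n=N)    # N combinations in [0, 1)^K

def simulate(x):
    # Placeholder for the real simulation: parameter 0 dominates, 2 is weak,
    # 1 and 3 are irrelevant by construction.
    return 3.0 * x[0] + 0.5 * x[2]

y = np.array([simulate(x) for x in X])
for k in range(K):
    rho, _ = spearmanr(X[:, k], y)
    print(f"parameter {k}: Spearman rho = {rho:+.2f}")
```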

Predictive Rehabilitation Software for Cerebral Palsy Patients

Young patients suffering from cerebral palsy face difficult choices concerning major surgeries. The diagnosis settled by surgeons can be complex, and the patient's decision about whether or not to undergo such a surgery demands considerable reflection. The proposed software, which combines predictions of suitable surgeries and post-surgery kinematic values with a 3D model representing the patient, is an innovative tool helpful to both patients and medical professionals. Beginning with the analysis and classification of kinematic values from a gait-analysis database into three separate clusters, it is possible to determine close similarity between patients. The surgery best adapted to improve a patient's gait is then predicted by a suitably preconditioned neural network. Finally, the patient's 3D model, built from the analysis of kinematic values, is animated using the post-surgery kinematic vectors of the closest patient selected from the patient clusters.
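
A minimal sketch of the cluster-then-match step, assuming scikit-learn and invented data (the real system works on a database of gait-analysis kinematics):

```python
# Cluster gait kinematic vectors into 3 groups, then find the closest patient
# in the same cluster whose post-surgery kinematics can drive the 3D animation.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
kinematics = rng.normal(size=(60, 12))       # 60 patients x 12 kinematic values

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(kinematics)

def closest_in_cluster(new_patient):
    label = km.predict(new_patient[None, :])[0]
    members = np.where(km.labels_ == label)[0]
    dists = np.linalg.norm(kinematics[members] - new_patient, axis=1)
    return members[np.argmin(dists)]         # index of the most similar patient

print(closest_in_cluster(rng.normal(size=12)))
```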

Renewal of the Swedish Million Dwelling Program, the Public Housing Company and the Local Community: Hindrances and Mutual Aid

Public housing is a vital factor in community development. Successful design of city, housing and ecosystem regeneration is essential for positive community development. This concerns workplaces, pleasant dwellings, premises for child care and care of the elderly, qualitative premises for different kinds of commercial service, an attractive built environment and housing areas, and not least the activation of tenants. Public housing companies give value to society by stimulating people, renovating in socially and economically sustainable ways, and acting as partners to local business and authorities. Through their activities the housing companies contribute to sustainable local and regional growth and to the identity and reputation of cities. A Social, Economic and Ecological Reputation Effect (SEERE) model for actions to promote housing and community reputation is presented. The model emphasizes regenerative actions that restore natural ecosystems as part of housing renewal strategies and strengthen municipal reputation.

Towards a Unified Approach of Social Justice: Merging Tradition and Modernity in Public Policy Making in India

This paper explores the social and political imperatives in the sphere of public policy relating to social justice. In India, the colonial legacy and post-colonial social and political pressures sustained the use of the 'caste' category in allocating public resources to the backward classes of citizens. For several reasons, the 'economic' category could not be adopted for allocating resources. This paper examines the reasoning behind these deliberative exercises and policy formulations, and seeks an alternative framework for realizing social justice in terms of a unified category. The attempt can be viewed as a reconciliation of traditional and modern values in pursuit of a viable alternative in public policy making.

Target and Kaizen Costing

Increased competition and rising design costs have made it important for firms to identify the right products and the right methods for manufacturing them. Firms should focus on customers and identify customer demands directly in order to design the right products. Many of the management methods and techniques currently available improve one or more functions or processes in an industry but do not take the complete product life cycle into consideration. Target costing, by contrast, is a method and philosophy that takes financial, manufacturing and customer aspects into consideration during the design phase and helps firms make product design decisions that increase the profit and value of the company. It uses various techniques to identify customer demands, to decrease manufacturing costs and, ultimately, to achieve strategic goals. Target costing forms an integral part of total product design and redesign based on strategic plans.
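
The core relation commonly used in target costing (a standard formulation rather than anything specific to this paper) is:

\[
\text{Target cost} \;=\; \text{Target selling price} \;-\; \text{Target profit}
\]

For example, if market analysis fixes a competitive selling price of 100 and the strategic plan requires a 20% margin, the product must be engineered to a cost of at most 80.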

Weak Measurement Theory for Discrete Scales

With the increasing spread of computers and the internet among culturally, linguistically and geographically diverse communities, issues of internationalization and localization are becoming increasingly important. For some of these issues, such as the different scales for length and temperature, there is a well-developed measurement theory. For others, such as date formats, no such theory is possible. This paper fills a gap by developing a measurement theory for a previously overlooked class of scales: discrete, interval-valued scales such as spanner and shoe sizes. The paper thereby gives a theoretical foundation for a class of data representation problems.
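
A toy sketch of such a scale (the size-to-interval table is invented for illustration): each discrete size stands for an interval of an underlying continuous quantity, so lookups and conversions must operate on intervals rather than points.

```python
# Hypothetical discrete, interval-valued scale: each size denotes an interval
# of foot length in centimetres.
SHOE_EU = {40: (24.7, 25.3), 41: (25.3, 26.0), 42: (26.0, 26.7)}

def sizes_for_length(length_cm, scale):
    # A length can be consistent with more than one discrete size when it
    # falls on an interval boundary.
    return [s for s, (lo, hi) in scale.items() if lo <= length_cm <= hi]

print(sizes_for_length(26.0, SHOE_EU))   # -> [41, 42]: a boundary case
```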

Artificial Intelligence for Software Quality Improvement

This paper presents a software quality support tool: a Java source code evaluator and code profiler based on computational intelligence techniques. It is a Java prototype developed by the AI Group [1] of the Research Laboratories at Universidad de Palermo: the Intelligent Java Analyzer (in Spanish: Analizador Java Inteligente, AJI). It represents a new approach to evaluating and identifying inaccurate source code usage and, transitively, the software product itself. The aim of the project is to provide the software development industry with a new tool that increases software quality by extending the value of source code metrics through computational intelligence.

Development and Evaluation of a Dynamic Cardiac Phantom for use in Nuclear Medicine

The aim of this study was to develop a dynamic cardiac phantom for quality control in myocardial scintigraphy. The dynamic heart phantom contained only the left ventricle, made of an elastic material (latex) and comprising two cavities: one internal and one external. The data showed a non-significant variation in the values of left ventricular ejection fraction (LVEF) obtained by varying the heart rate. It was also possible to evaluate the LVEF with different image acquisition matrices and to perform an intercomparison of LVEF between two different scintillation cameras. The results of the quality control tests were satisfactory, showing that they can be used as parameters in future assessments. The new dynamic heart phantom proved effective for LVEF measurements and is therefore useful for the quality control of scintigraphic cameras.
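
For reference, LVEF is conventionally defined from the end-diastolic and end-systolic volumes (in gated scintigraphy, cavity counts stand in for volumes):

\[
\mathrm{LVEF} \;=\; \frac{\mathrm{EDV} - \mathrm{ESV}}{\mathrm{EDV}} \times 100\%
\]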

A Model for Application of Knowledge Management in Public Organizations in Iran

This study examines knowledge management in public organizations in Iran. The purpose of the article is to provide a conceptual framework for the application of knowledge management in public organizations. The study indicates an increasing tendency toward the implementation of knowledge management in organizations. Nonetheless, knowledge management in public organizations is still in its infancy, and little has been done to bring the subject into use in the public sector. The globalization of change and the popularization of values such as participation, citizen orientation and knowledge orientation in the new theories of public administration require that knowledge management be considered and attended to in the public sector. This study holds that a knowledge management framework for public organizations differs from that for the private sector, because the public sector is stakeholder-dependent while the private sector is shareholder-dependent. Based on the research, we provide a conceptual model involving three factors: organizational factors, knowledge citizens and contextual factors. The study results indicate that these factors affect knowledge management in public organizations in Iran.

Sidecooler Flow Field Investigation

One aim of the paper is to compare experimental results with a numerical simulation of a side cooler. Specifically, the quantity compared was the amount of air delivered by the side cooler with the fans running at 100%. This integral value was measured and evaluated in a plane parallel to the front side of the side cooler at a distance of 20 mm from it. The flow field extending from the side cooler into the surrounding space was also evaluated. Another objective was to assess the contribution of the evaluated values to the increase in data center energy consumption.
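
The integral value in question is the volumetric flow rate through the measurement plane, obtained in the usual way by integrating the normal velocity component over the plane (in practice, summing over the measurement grid):

\[
Q \;=\; \int_{A} v_n \, \mathrm{d}A \;\approx\; \sum_{i} v_{n,i}\, \Delta A_i
\]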

A New Weighted LDA Method in Comparison to Some Versions of LDA

Linear Discriminant Analysis (LDA) is a linear method for classifying two classes. In this paper, we propose a variant LDA method for the multi-class problem that redefines the between-class and within-class scatter matrices by incorporating a weight function into each of them. The aim is to separate the classes as much as possible; when one class is already well separated from the others, it should have little influence on the classification. The influence of well-separated classes is alleviated by introducing a weight into the between-class and within-class scatter matrices. To obtain a simple and effective weight function, ordinary LDA is applied between every pair of classes to find the Fisher discrimination value, which is passed as input to the two weight functions used to redefine the between-class and within-class scatter matrices. Experimental results show that our new LDA method improves the classification rate on the Glass, Iris and Wine datasets in comparison to different versions of LDA.
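
For context, the classical scatter matrices and one common pairwise-weighted variant of the between-class matrix are shown below (the authors' exact weight functions are not given in the abstract):

\[
S_w=\sum_{i=1}^{c}\sum_{x\in C_i}(x-\mu_i)(x-\mu_i)^{T},\qquad
S_b=\sum_{i=1}^{c}N_i\,(\mu_i-\mu)(\mu_i-\mu)^{T},
\]
\[
\tilde S_b=\sum_{i<j} p_i\,p_j\,w(d_{ij})\,(\mu_i-\mu_j)(\mu_i-\mu_j)^{T},
\]

where \(d_{ij}\) is the pairwise Fisher discrimination value and \(w\) is a decreasing function, so pairs of classes that are already well separated contribute less.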

Experimental Determination of Large Strain Localization in Cut Steel Chips

Metal cutting is a severe plastic deformation process involving large strains, high strain rates, and high temperatures. Conventional analysis of the chip formation process is based on bulk material deformation, disregarding the inhomogeneous nature of the material microstructure. A series of orthogonal cutting tests of AISI 1045 and 1144 steel was conducted, yielding similar process characteristics and chip formations. With similar shear angles and cut chip thicknesses, the shear strains for both chips were found to range from 2.0 up to 2.8. The manganese sulfide (MnS) precipitate in the 1144 steel has a very distinct and uniform shape, which allows comparison before and after chip formation. Close observation of the MnS precipitates in the cut chips shows that the conventional approach underestimates plastic strains in metal cutting: the experiments revealed local shear strains around a value of 6. These findings and their implications are presented and discussed.
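
The conventional bulk estimate referred to above is the classical shear-plane strain, computed from the rake angle \(\alpha\) and the shear angle \(\phi\):

\[
\gamma \;=\; \frac{\cos\alpha}{\sin\phi\,\cos(\phi-\alpha)},
\]

which gives the values of about 2.0 to 2.8 reported here, well below the local strains of about 6 observed at the MnS precipitates.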

Geochemical Assessment of Heavy Metals Concentration in Surface Sediment of West Port, Malaysia

One year (November 2009 to October 2010) of sediment monitoring was used to evaluate the pollution status, concentration and distribution of heavy metals (As, Cu, Cd, Cr, Hg, Ni, Pb and Zn) in the West Port of Malaysia. Sediment samples were collected from nine stations every four months. The geo-accumulation index and the Pollution Load Index (PLI) were estimated to better understand the pollution level in the study area. The heavy metal concentrations (µg/g dry weight) ranged from 20.2 to 162 for As, 7.4 to 27.6 for Cu, 0.244 to 3.53 for Cd, 11.5 to 61.5 for Cr, 0.11 to 0.409 for Hg, 7.2 to 22.2 for Ni, 22.3 to 80 for Pb and 23 to 98.3 for Zn. In general, the concentrations of some metals (As, Cd, Hg and Pb) were higher than background values, a serious concern for aquatic life and human health.
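
The two indices are standard; in their usual form (the paper's exact variants are not shown in the abstract):

\[
I_{geo} = \log_2\!\left(\frac{C_n}{1.5\,B_n}\right),\qquad
\mathrm{PLI} = \Bigl(\prod_{i=1}^{n}\mathrm{CF}_i\Bigr)^{1/n},\quad
\mathrm{CF}_i = \frac{C_i}{B_i},
\]

where \(C\) is the measured concentration of a metal, \(B\) its geochemical background value, and 1.5 a factor compensating for lithogenic variability.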

Metoprolol Tartrate-Ethylcellulose Tabletted Microparticles: Development of a Validated In-vitro In-vivo Correlation

This study describes the methodology for the development of a validated in-vitro in-vivo correlation (IVIVC) for metoprolol tartrate modified-release dosage forms with distinctive release rate characteristics. The modified-release dosage forms were formulated by microencapsulating metoprolol tartrate in different amounts of ethylcellulose using a non-solvent addition technique. In-vitro and in-vivo studies were then conducted to develop and validate a level A IVIVC for metoprolol tartrate. The values of the regression coefficient (R² values) for the IVIVC of the T2 and T3 formulations were not significantly (p
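
A level A IVIVC is typically built by correlating the fraction dissolved in vitro with the fraction absorbed in vivo; a standard way to estimate the latter (not confirmed by the abstract as the authors' choice) is Wagner-Nelson deconvolution:

\[
F_a(t) \;=\; \frac{C(t) + k_e \,\mathrm{AUC}_{0\text{-}t}}{k_e\,\mathrm{AUC}_{0\text{-}\infty}},
\]

where \(C(t)\) is the plasma concentration, \(k_e\) the elimination rate constant, and AUC the area under the concentration-time curve.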