Dignity and Suffering: A Reading of Human Rights in Anand's Untouchable

Cultural stories are political. They register cultural phenomena and their relations with the world and society in terms of their existence, function and characteristics across different contexts. This paper provides a new way of rethinking the relationship between fiction and politics. It discusses the theme of human rights and shows the relevance of art to politics by studying civil society through a literary framework. The reasons for establishing a relationship between fiction and politics are the relevant themes and universal issues shared by the two disciplines. Both disciplines are sets of views and ideas formulated by the human mind to explain political or cultural phenomena. Other reasons are the complexity and depth of the author's vision, and the need to explain violations of human rights within a more active structure that can relate to emotional and social existence.

Sensitivity Analysis of Real-Time Systems

Verification of real-time software systems can be expensive in terms of time and resources. Testing is the main method of demonstrating correctness, but it has been shown to be a long and time-consuming process. Practising engineers are usually unwilling to adopt formal approaches to correctness because of the overhead of developing their knowledge of such techniques. Performance modelling techniques allow systems to be evaluated with respect to timing constraints. This paper describes PARTES, a framework which guides the extraction of performance models from programs written in an annotated subset of C.

A Reusability Evaluation Model for OO-Based Software Components

The need to improve software productivity has promoted research on software metric technology. There are metrics for identifying the quality of reusable components, but the function that makes use of these metrics to determine the reusability of software components is still not clear. If identified in the design phase, or even in the coding phase, these metrics can help to reduce rework by improving the quality of the reused component and hence improve productivity, owing to the probable increase in the reuse level. The CK metric suite is the most widely used set of metrics for object-oriented (OO) software; we critically analyzed the CK metrics, tried to remove the inconsistencies, and devised a framework of metrics to obtain a structural analysis of OO-based software components. A neural network can learn new relationships from new input data and can be used to refine fuzzy rules to create a fuzzy adaptive system. Hence, a neuro-fuzzy inference engine can be used to evaluate the reusability of an OO-based component using its structural attributes as inputs. In this paper, an algorithm is proposed in which the tuned WMC, DIT, NOC, CBO and LCOM values of the OO software component are given as inputs to the neuro-fuzzy system, and the output is obtained in terms of reusability. The developed reusability model has produced high-precision results, as expected by the human experts.
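As a rough illustration of the inference step only, the sketch below implements a tiny Mamdani-style fuzzy evaluator over the five CK metrics named above. The normalization bounds, membership functions and rules are invented assumptions, not the tuned neuro-fuzzy model from the paper, which would learn such rules from data.

```python
# Minimal fuzzy-inference sketch for component reusability.
# The five metric names (WMC, DIT, NOC, CBO, LCOM) come from the abstract;
# the bounds, membership functions and rules below are illustrative only.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def low(x):  return tri(x, -0.5, 0.0, 0.5)
def high(x): return tri(x, 0.5, 1.0, 1.5)

def reusability(wmc, dit, noc, cbo, lcom):
    # Normalize each metric to [0, 1] with assumed upper bounds.
    bounds = {"wmc": 50, "dit": 8, "noc": 10, "cbo": 20, "lcom": 100}
    wmc, dit, noc, cbo, lcom = (min(v / bounds[k], 1.0) for k, v in
                                zip(bounds, (wmc, dit, noc, cbo, lcom)))
    # Two toy Mamdani-style rules (NOC is left out of this rule base):
    # R1: low coupling (CBO) and low lack-of-cohesion (LCOM) -> high reusability
    # R2: high complexity (WMC) or deep inheritance (DIT)    -> low reusability
    r_high = min(low(cbo), low(lcom))
    r_low = max(high(wmc), high(dit))
    if r_high + r_low == 0:
        return 0.5
    # Weighted-average defuzzification over {low: 0.2, high: 0.9}.
    return (0.9 * r_high + 0.2 * r_low) / (r_high + r_low)

print(reusability(wmc=12, dit=2, noc=3, cbo=4, lcom=10))
```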

Modeling the Symptom-Disease Relationship by Using Rough Set Theory and Formal Concept Analysis

Medical Decision Support Systems (MDSSs) are sophisticated, intelligent systems that can provide inference under incomplete information and uncertainty. In such systems, various soft computing methods, such as Bayesian networks, rough sets, artificial neural networks, fuzzy logic, inductive logic programming and genetic algorithms, as well as hybrid methods formed by combining them, are used to model the uncertainty. In this study, symptom-disease relationships are presented within a framework modeled with formal concept analysis and rough set theory, with diseases as objects and symptoms as attributes. After a concept lattice is formed, Bayes' theorem can be used to determine the relationships between attributes and objects. A discernibility relation, which forms the basis of rough sets, can be applied to the attribute data sets in order to reduce the attributes and decrease the computational complexity.
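To make the discernibility idea concrete, the following minimal sketch builds a toy symptom-disease decision table (the symptoms and diseases are invented, not the study's data) and finds a reduct, i.e., a minimal subset of symptoms that still distinguishes every pair of cases with different diseases.

```python
from itertools import combinations

# Toy decision table: rows are cases, attribute vectors are symptom
# indicators (fever, cough, rash), decisions are diseases.
rows = [
    ((1, 1, 0), "flu"),
    ((1, 0, 1), "measles"),
    ((0, 1, 0), "cold"),
    ((1, 1, 1), "measles"),
]
attrs = range(3)

def discerns(subset):
    """True if the attribute subset distinguishes every pair of rows
    with different decisions (the rough-set discernibility test)."""
    for (x, dx), (y, dy) in combinations(rows, 2):
        if dx != dy and all(x[a] == y[a] for a in subset):
            return False
    return True

# A reduct: the smallest symptom subset preserving discernibility.
reduct = next(set(c) for k in range(1, 4)
              for c in combinations(attrs, k) if discerns(c))
print("reduct:", reduct)   # {0, 2}: fever and rash suffice here
```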

Quantification of Technology Innovation Using a Risk-Based Framework

There is significant interest in achieving technology innovation through new product development activities. It is recognized, however, that traditional project management practices, focused only on performance, cost, and schedule attributes, can often lead to risk mitigation strategies that limit new technology innovation. In this paper, a new approach is proposed for formally managing and quantifying technology innovation. This approach uses a risk-based framework that simultaneously optimizes innovation attributes along with traditional project management and systems engineering attributes. To demonstrate the efficacy of the new risk-based approach, a comprehensive product development experiment was conducted. This experiment simultaneously managed the innovation risks and the product delivery risks through the proposed risk-based framework. Quantitative metrics for technology innovation were tracked, and the experimental results indicate that the risk-based approach can simultaneously achieve both project deliverable and innovation objectives.
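As a loose illustration of what jointly weighing delivery and innovation risks might look like, the sketch below scores hypothetical design options on the two risk axes and picks the one minimizing a weighted combination. The option names, risk values and weights are all invented; they do not come from the experiment described above.

```python
# Hypothetical design options scored on delivery risk (cost/schedule/
# performance) and innovation risk (failing to deliver novel capability).
options = {
    "incremental": {"delivery_risk": 0.10, "innovation_risk": 0.70},
    "novel_tech":  {"delivery_risk": 0.45, "innovation_risk": 0.15},
    "hybrid":      {"delivery_risk": 0.25, "innovation_risk": 0.30},
}
w_delivery, w_innovation = 0.5, 0.5   # assumed equal weighting

def combined_risk(o):
    return w_delivery * o["delivery_risk"] + w_innovation * o["innovation_risk"]

best = min(options, key=lambda k: combined_risk(options[k]))
print(best, round(combined_risk(options[best]), 3))
```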

Component-Based Framework for Authoring and Multimedia Training in Mathematics

The new programming technologies allow for the creation of components which can be automatically or manually assembled to reach a new experience in understanding and mastering knowledge, or in acquiring skills for a specific knowledge area. The project proposes an interactive framework that permits the creation, combination and utilization of components that are specific to mathematical training in high schools. The framework's main objectives are:
• authoring lessons by the teacher or the students; all they need are simple operating skills for Equation Editor (or something similar, or LaTeX); the rest are just drag & drop operations, inserting data into a grid, or navigating through menus
• allowing audio presentations of mathematical texts and solving hints (more easily understood by the students)
• offering graphical representations of a mathematical function edited in Equation Editor
• storing learning objects in a database
• storing predefined lessons (efficient for expressions and commands, the rest being calculations; this allows high compression)
• viewing and/or modifying predefined lessons, according to the curricula
The whole framework is centered on a mini-compiler for mathematical expressions, which stores code that is later used for different purposes (tables, graphics, and optimisations); a sketch follows below. As for programming technologies, a Visual C# .NET implementation is proposed. New and innovative digital learning objects for mathematics will be developed; they are capable of interpreting, contextualizing and reacting depending on the architecture in which they are assembled.
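As a sketch of the expression mini-compiler idea (the project itself targets Visual C# .NET; Python is used here only to keep the example short), the snippet below parses a formula once into an AST and returns a reusable function, e.g., for tabulating values or plotting graphs. The supported grammar is an assumption, not the project's actual one.

```python
import ast, math, operator as op

OPS = {ast.Add: op.add, ast.Sub: op.sub, ast.Mult: op.mul,
       ast.Div: op.truediv, ast.Pow: op.pow, ast.USub: op.neg}

def compile_expr(src):
    """Compile a math expression string into a callable f(**vars)."""
    tree = ast.parse(src, mode="eval")
    def ev(node, env):
        if isinstance(node, ast.Expression):
            return ev(node.body, env)
        if isinstance(node, ast.Constant):
            return node.value
        if isinstance(node, ast.Name):       # variable, or math constant
            return env[node.id] if node.id in env else getattr(math, node.id)
        if isinstance(node, ast.BinOp):
            return OPS[type(node.op)](ev(node.left, env), ev(node.right, env))
        if isinstance(node, ast.UnaryOp):
            return OPS[type(node.op)](ev(node.operand, env))
        if isinstance(node, ast.Call):       # e.g. sin(x)
            return getattr(math, node.func.id)(*(ev(a, env) for a in node.args))
        raise ValueError("unsupported syntax")
    return lambda **env: ev(tree, env)

f = compile_expr("sin(x) + x**2 / 4")        # compile once, reuse many times
print([round(f(x=v), 3) for v in (0.0, 1.0, 2.0)])   # e.g. for a value table
```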

Organizational Dimensions as Determinant Factors of KM Approaches in SMEs

In the current economy of increasing global competition, many organizations are attempting to use knowledge as one of the means to gain sustainable competitive advantage. Besides large organizations, the success of SMEs can be linked to how well they manage their knowledge. Despite the profusion of research on knowledge management within large organizations, few studies have tried to analyze KM in SMEs. This research proposes a new framework showing the determinant role of organizational dimensions in KM approaches. The paper and its propositions are based on a literature review and analysis. In this research, personalization versus codification, individualization versus institutionalization, and IT-based versus non-IT-based are highlighted as three distinct dimensions of knowledge management approaches. The study contributes to research by providing a more nuanced classification of KM approaches, and it provides guidance to managers about the types of KM approaches that should be adopted based on the size, geographical dispersion and task nature of SMEs. To the author's knowledge, the paper is the first of its kind to examine whether there are suitable configurations of KM approaches for SMEs with different dimensions. It gives valuable information which will hopefully help the SME sector to accomplish KM.

Identifying Corruption in Legislation Using Risk Analysis Methods

The objective of this article is to discuss the potential of economic analysis as a tool for the identification and evaluation of corruption in legislative acts. We propose that corruption be perceived as a risk variable within the legislative process. We therefore find it appropriate to employ risk analysis methods, used in various fields of economics, for the evaluation of corruption in legislation. Furthermore, we propose the incorporation of these methods into the so-called corruption impact assessment (CIA), a general framework for the detection of corruption in legislative acts. The application of the risk analysis methods is demonstrated on examples from the implementation of the proposed CIA in the Czech Republic.
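A minimal sketch of one generic risk analysis method, a probability-times-impact risk matrix applied to legislative provisions, is given below. The provisions and numbers are invented for illustration and are not drawn from the Czech CIA.

```python
# Illustrative risk-matrix scoring of legislative provisions:
# risk = probability of corrupt exploitation * impact (1-5 scale).
provisions = [
    ("broad discretionary licensing power", 0.6, 5),
    ("vague public-procurement exception",  0.4, 4),
    ("mandatory publication of contracts",  0.1, 2),
]
# Rank provisions from highest to lowest corruption risk.
for name, p, impact in sorted(provisions, key=lambda r: -(r[1] * r[2])):
    print(f"{name}: risk score = {p * impact:.2f}")
```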

eCollaborative Decisions – A DSS for the Academic Environment

This paper presents an innovative approach within the area of Group Decision Support Systems (GDSS) using tools based on intelligent agents. It introduces iGDSS, a software platform for decision support and collaboration, and an application of this platform, eCollaborative Decisions, for the academic environment, all developed within the framework of a research project.

South African MNEs' Entry Strategies in Africa

This cross-cultural study determines the entry strategies of South African multinational enterprises (MNEs) as they invest in Africa. An integrated theoretical framework comprising transaction cost theory, the Uppsala model, the eclectic paradigm and the distance framework was adopted. A sample of 40 South African MNEs with 415 existing FDI entries in Africa was drawn. Using an ordered logistic regression model, the impact of culture on the degree of control chosen by South African MNEs in Africa was determined. Cultural distance was one of the significant factors that influenced South African MNEs' choice of degree of control. Furthermore, South African MNEs are risk averse in all countries in Africa but minimize the risks differently across sectors: service-sector firms choose to own their subsidiaries 100% and avoid dealing with the locals, while manufacturing, resources and construction firms choose to have a local partner to share the risk.
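A minimal sketch of the estimation step is shown below, using statsmodels' ordered logit on synthetic data. The data-generating numbers are invented, and cultural distance is the only covariate retained for brevity; the study's actual variables and data are not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Synthetic stand-in: entry-mode control is ordered (low < shared < full)
# and driven negatively by cultural distance, plus logistic noise.
rng = np.random.default_rng(0)
n = 415
cultural_distance = rng.normal(size=n)
latent = -1.2 * cultural_distance + rng.logistic(size=n)
control = pd.Categorical.from_codes(np.digitize(latent, [-1.0, 1.0]),
                                    ["low", "shared", "full"], ordered=True)

model = OrderedModel(control, cultural_distance[:, None], distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.params)   # negative slope: more distance -> less control
```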

Issues in Travel Demand Forecasting

Travel demand forecasting, which comprises four travel choices, i.e., trip generation, trip distribution, modal split and traffic assignment, constitutes the core of transportation planning. In its current application, travel demand forecasting is associated with three important issues, i.e., interface inconsistencies among the four travel choices, the inefficiency of commonly used solution algorithms, and undesirable multiple-path solutions. In this paper, each of the three issues is extensively elaborated. An ideal unified framework for the combined model, consisting of the four travel choices and variable demand functions, is also suggested. A few remarks are then provided at the end of the paper.
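To illustrate one of the four travel choices, the sketch below runs a doubly-constrained gravity model for trip distribution, balanced by Furness-style iterative proportional fitting. The zones, productions, attractions and cost matrix are invented, and the paper's combined-model formulation is not reproduced.

```python
import numpy as np

P = np.array([400.0, 600.0])          # trip productions per origin zone
A = np.array([500.0, 500.0])          # trip attractions per destination zone
cost = np.array([[1.0, 2.0],
                 [2.0, 1.0]])         # interzonal travel costs
T = np.exp(-0.5 * cost)               # deterrence function exp(-beta * cost)

for _ in range(50):                   # Furness/IPF balancing
    T *= (P / T.sum(axis=1))[:, None] # match row sums to productions
    T *= A / T.sum(axis=0)            # match column sums to attractions

print(T.round(1))                     # trip matrix: rows sum to P, cols to A
```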

Joint Optimization of Pricing and Advertisement for Seasonal Branded Products

The goal of this paper is to develop a model that integrates "pricing" and "advertisement" for short life cycle products, such as branded fashion clothing products. To achieve this goal, we apply the concept of "dynamic pricing". There are two classes of advertisements: for the brand (regardless of product) and for a particular product. Advertising the brand affects the demand and price of all the products. Thus, the model considers all these products in relation to each other. We develop two different methods to integrate both types of advertisement with pricing. The first model is developed within the framework of dynamic programming. However, due to the complexity of the model, this method is not applicable to large-size problems. Therefore, we develop another method, called the hierarchical approach, which is capable of handling real-world problems. Finally, we show the accuracy of this method, both theoretically and by simulation.
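A toy version of the dynamic programming formulation is sketched below for a single product with discrete price and advertising levels. The demand function, costs and horizon are invented, and the brand-level advertising coupling across products is omitted for brevity; this shows the backward-induction mechanism, not the paper's model.

```python
from functools import lru_cache

PRICES = [40, 60, 80]
ADS = [0, 1]                 # 0 = no product ad, 1 = run a product ad
AD_COST = 50

def demand(price, ad):
    """Invented linear demand response to price and advertising."""
    return max(0, 10 - price // 10 + 4 * ad)

@lru_cache(maxsize=None)
def value(t, stock):
    """Max profit with t selling periods left and `stock` units on hand."""
    if t == 0 or stock == 0:
        return 0.0
    best = 0.0
    for p in PRICES:
        for a in ADS:          # jointly choose price and advertising
            sold = min(stock, demand(p, a))
            best = max(best,
                       p * sold - AD_COST * a + value(t - 1, stock - sold))
    return best

print(value(4, 30))            # 4 periods, 30 units of seasonal stock
```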

A Conceptual Framework for Supply Chain Competitiveness

The purpose of this paper is to highlight the importance of the concept of competitiveness in the supply chain and to present a conceptual framework for Supply Chain Competitiveness (SCC). The framework is based on supply chain activities, which are the inputs necessary for SCC, and the benefits, which are the outputs of SCC. A literature review is conducted on key supply chain competitiveness issues, its determinants and its various dimensions, followed by an exploration of SCC. Based on the insights gained, a conceptual framework for SCC is presented, built on activities for SCC, the SCC environment and the outcomes of SCC. The information flow in the conceptual framework is bi-directional at all levels, and the activities are interrelated in a global competitive environment. The activities include those of suppliers, manufacturers and distributors, with more emphasis on manufacturers' activities. Further, the implications of various factors, such as economic, politico-legal, technical, socio-cultural, competitive and demographic factors, are also highlighted. The SCC framework is an attempt to cover the relatively less explored area of supply chain competitiveness. It is expected that this work will further motivate researchers, academicians and practitioners to work in this area, and it offers conceptual help by providing a direction for supply chain competitiveness that leads to improvement in the supply chain and in supply chain performance.

Color Image Segmentation Using an Adaptive Spatial Gaussian Mixture Model

An adaptive spatial Gaussian mixture model is proposed for clustering-based color image segmentation. A new clustering objective function which incorporates spatial information is introduced in the Bayesian framework. The weighting parameter controlling the importance of the spatial information is made adaptive to the image content, to augment smoothness towards piecewise-homogeneous regions and diminish the edge-blurring effect; hence the name adaptive spatial finite mixture model. The proposed approach is compared with the spatially variant finite mixture model for pixel labeling. Experimental results with synthetic images and the Berkeley dataset demonstrate that the proposed method is effective in improving segmentation, and it can be employed in various practical image content understanding applications.
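A simplified stand-in for the spatial regularization idea is sketched below: an ordinary GMM is fitted to pixel colors, and each class posterior is then smoothed over a local window before taking the arg-max label. The fixed window replaces the paper's adaptive, content-dependent weighting, so this illustrates the mechanism, not the proposed model.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.mixture import GaussianMixture

# Synthetic two-region color image with additive noise.
rng = np.random.default_rng(0)
h, w = 64, 64
img = np.zeros((h, w, 3))
img[:, :32] = [0.9, 0.2, 0.2]
img[:, 32:] = [0.2, 0.2, 0.9]
img += rng.normal(scale=0.15, size=img.shape)

# Fit a plain GMM to pixel colors and get per-pixel class posteriors.
gmm = GaussianMixture(n_components=2, random_state=0)
gmm.fit(img.reshape(-1, 3))
post = gmm.predict_proba(img.reshape(-1, 3)).reshape(h, w, 2)

# Spatial step: average each class posterior over a 5x5 neighborhood.
for k in range(2):
    post[:, :, k] = uniform_filter(post[:, :, k], size=5)
labels = post.argmax(axis=2)
print("label counts:", np.bincount(labels.ravel()))
```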

E-Government in Transition Economies

This paper deals with e-government issues at several levels. Initially, we look at the concept of e-government itself in order to give it a sound framework. Then we look at e-government issues at three levels: first we analyse them at the global level, second at the level of transition economies, and finally we take a closer look at developments in Croatia. The analysis includes the actual progress being made in selected transition economies relative to euro-area averages, along with e-government potential in the demanding period ahead.

Revisiting the Concept of Risk Analysis within the Context of Geospatial Database Design: A Collaborative Framework

The aim of this research is to design a collaborative framework that integrates risk analysis activities into the geospatial database design (GDD) process. Risk analysis is rarely undertaken iteratively as part of present GDD methods, in conformance with requirements engineering (RE) guidelines and risk standards. Consequently, when risk analysis is performed during GDD, some foreseeable risks may be overlooked and fail to reach the output specifications, especially when user intentions are not systematically collected. This may lead to ill-defined requirements and, ultimately, to higher risks of geospatial data misuse. The adopted approach consists of 1) reviewing the risk analysis process within the scope of RE and GDD, 2) analyzing the challenges of risk analysis within the context of GDD, and 3) presenting the components of a risk-based collaborative framework that improves the collection of the intended/forbidden usages of the data and helps geo-IT experts to discover implicit requirements and risks.

A Microcontroller Implementation of Model Predictive Control

Model Predictive Control (MPC) is increasingly being proposed for real-time applications and embedded systems. However, compared to the PID controller, implementation of MPC in miniaturized devices such as Field Programmable Gate Arrays (FPGAs) and microcontrollers has historically been very limited, due to its implementation complexity and computation time requirements. At the same time, such embedded technologies have become an enabler for future manufacturing enterprises as well as a transformer of organizations and markets. Recently, advances in microelectronics and software have allowed such techniques to be implemented in embedded systems. In this work, we take advantage of these recent advances in the deployment of one of the most studied and applied control techniques in industrial engineering. Specifically, we propose an efficient framework for the implementation of Generalized Predictive Control (GPC) on the STM32 microcontroller. The STM32 Keil starter kit, based on a JTAG interface and the STM32 board, was used to implement the proposed GPC firmware. Besides the GPC, an anti-windup PID algorithm was also implemented using Keil development tools designed for ARM processor-based microcontroller devices, working with the C/C++ language. A performance comparison study was conducted between the two firmwares, showing good execution speed and a low computational burden. These results encourage the development of simple predictive algorithms to be programmed on industry-standard hardware. The main features of the proposed framework are illustrated through two examples and compared with the anti-windup PID controller.
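The arithmetic such GPC firmware must perform each sample is sketched below in Python (offline, for clarity, rather than as embedded C). The plant model, horizons and weighting are invented for illustration; this is a minimal unconstrained control law in the GPC family, not the paper's firmware.

```python
import numpy as np

a, b = 0.9, 0.1            # first-order plant: y(k+1) = a*y(k) + b*u(k)
N, Nu, lam = 10, 3, 0.1    # prediction horizon, control horizon, weight

# Step-response coefficients g_i and the dynamic (Toeplitz) matrix G.
g = np.array([b * sum(a**j for j in range(i + 1)) for i in range(N)])
G = np.zeros((N, Nu))
for i in range(N):
    for j in range(min(i + 1, Nu)):
        G[i, j] = g[i - j]
# Unconstrained GPC gain: du = K (w - f), K = (G'G + lam*I)^-1 G'.
K = np.linalg.solve(G.T @ G + lam * np.eye(Nu), G.T)

# Closed-loop simulation tracking a unit setpoint w = 1.
y = u = 0.0
for k in range(30):
    free = np.array([a**(i + 1) * y + b * u * sum(a**j for j in range(i + 1))
                     for i in range(N)])    # free response with du = 0
    du = (K @ (np.ones(N) - free))[0]       # apply first move only
    u += du                                 # receding-horizon update
    y = a * y + b * u
print(round(y, 3))                          # approaches the setpoint 1.0
```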

COTT – A Testability Framework for Object-Oriented Software Testing

Testable software has two inherent properties: observability and controllability. Observability facilitates observation of the internal behavior of software to the required degree of detail. Controllability allows the creation of difficult-to-achieve states prior to the execution of various tests. In this paper, we describe COTT, a Controllability and Observability Testing Tool, for creating testable object-oriented software. COTT provides a framework that helps the user instrument object-oriented software to build in the required controllability and observability. During testing, the tool facilitates the creation of difficult-to-achieve states required for testing difficult-to-test conditions, and the observation of internal details of execution at the unit, integration and system levels. The execution observations are logged in a test log file, which is used for post-test analysis and to generate test coverage reports.
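COTT's actual instrumentation API is not shown in the abstract; the sketch below merely illustrates the two properties with hypothetical Python hooks: a logging decorator for observability (writing to a test log file) and a state-injection method for controllability.

```python
import logging

# Execution observations go to a test log file for post-test analysis.
logging.basicConfig(level=logging.INFO, filename="test.log")

def observe(method):
    """Observability hook: log each call and its result."""
    def wrapper(self, *args, **kwargs):
        result = method(self, *args, **kwargs)
        logging.info("%s.%s%r -> %r", type(self).__name__,
                     method.__name__, args, result)
        return result
    return wrapper

class Account:
    def __init__(self):
        self.balance = 0

    @observe
    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        return self.balance

    def force_state(self, **fields):   # controllability hook for tests
        self.__dict__.update(fields)

acct = Account()
acct.force_state(balance=100)          # reach a difficult state directly
acct.withdraw(30)                      # observed call logged to test.log
```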

Analyzing Convergence of IT and Energy Industry Based on Social System Framework

The purpose of this study is to analyze the Green IT industry in major developed countries and to suggest overall directions for the IT-Energy convergence industry. Recently, the IT industry has been associated with problems such as environmental pollution, energy exhaustion, and high energy consumption, and Green IT has therefore come into focus as a solution to these problems. However, since this convergence area is at an early stage, there are only a few studies of the IT-Energy convergence industry. Accordingly, this study examined the major developed countries in terms of institutional arrangements, resources, markets and companies, based on Van de Ven's (1999) social system framework, which shows the relationships among the key components of an industrial infrastructure. Subsequently, directions for future study of convergence in the IT and Energy industries are proposed.

Efficient and Extensible Data Processing Framework in Ubiquitous Sensor Networks

This paper presents the design and prototype implementation of an intelligent data processing framework for ubiquitous sensor networks. Much focus is put on how to handle the sensor data stream, as well as on interoperability between low-level sensor data and application clients. Our framework first provides systematic middleware that mediates the interaction between the application layer and low-level sensors, analyzing a great volume of sensor data through filtering and integration to create value-added context information. Then, an agent-based architecture is proposed for real-time data distribution, to efficiently forward a specific event to the appropriate application registered in the directory service via an open interface. The prototype implementation demonstrates that our framework can host sophisticated applications on a ubiquitous sensor network and can autonomously evolve into new middleware, taking advantage of promising technologies such as software agents, XML, cloud computing, and the like.
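A toy version of the directory-service routing described above is sketched below: applications register interest in event types, and the middleware filters raw readings and forwards qualifying events to the registered handlers. The class, event names and threshold are invented for illustration; this is not the framework's actual interface.

```python
class DirectoryService:
    """Minimal registry mapping event types to application handlers."""
    def __init__(self):
        self.registry = {}

    def register(self, event_type, handler):
        self.registry.setdefault(event_type, []).append(handler)

    def publish(self, event_type, payload):
        # Forward the event to every application registered for it.
        for handler in self.registry.get(event_type, []):
            handler(payload)

directory = DirectoryService()
directory.register("temperature.high",
                   lambda e: print("HVAC app notified:", e))

# Middleware filtering step: only forward readings above a threshold.
for reading in [21.0, 35.5, 19.8]:
    if reading > 30.0:
        directory.publish("temperature.high", {"celsius": reading})
```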