On the Noise Distance in Robust Fuzzy C-Means

In recent decades, a number of robust fuzzy clustering algorithms have been proposed to partition data sets affected by noise and outliers. Robust fuzzy C-means (robust-FCM) is certainly one of the best known among these algorithms. In robust-FCM, noise is modeled as a separate cluster characterized by a prototype that has a constant distance δ from all data points. The distance δ determines the boundary of the noise cluster and is therefore a critical parameter of the algorithm. Although some approaches have been proposed to automatically determine the most suitable δ for a specific application, to date no efficient and fully satisfactory solution exists. The aim of this paper is to propose a novel method to compute the optimal δ based on the analysis of the distribution of the percentage of objects assigned to the noise cluster in repeated executions of robust-FCM with decreasing values of δ. The extremely encouraging results obtained on several data sets from the literature are shown and discussed.
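
For reference, in the standard noise-clustering formulation that robust-FCM builds on (a sketch that may differ from the paper's exact notation), the membership of data point x_k in good cluster i, with fuzzifier m and distances d_ik, and the residual membership in the noise cluster are

\[
u_{ik} = \left[ \sum_{j=1}^{c} \left( \frac{d_{ik}^{2}}{d_{jk}^{2}} \right)^{\frac{1}{m-1}} + \left( \frac{d_{ik}^{2}}{\delta^{2}} \right)^{\frac{1}{m-1}} \right]^{-1},
\qquad
u_{\ast k} = 1 - \sum_{i=1}^{c} u_{ik},
\]

so points that lie far from all cluster prototypes relative to δ receive high noise membership, which is why the choice of δ directly controls how many objects are declared noise.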

On Solution of Interval Valued Intuitionistic Fuzzy Assignment Problem Using Similarity Measure and Score Function

The primary objective of this paper is to propose a new method for solving the assignment problem under uncertainty. In the classical assignment problem (AP), z_pq denotes the cost of assigning the qth job to the pth person, and this cost is deterministic in nature. In uncertain situations, we instead assign a cost in the form of a composite relative degree F_pq in place of z_pq, and this replaced cost is in maximization form. A new mathematical formulation of the interval-valued intuitionistic fuzzy (IVIF) assignment problem is presented, in which the cost is considered to be an interval-valued intuitionistic fuzzy number (IVIFN) and the membership of elements in the set is described by positive and negative evidence; the problem is then solved and validated by the two proposed algorithms. The concept of similarity measure is used to determine the composite relative degree of similarity of IVIFSs, and the score function is used to validate the solution obtained by the composite relative similarity degree method. Further, a hypothetical numerical illustration is provided to demonstrate the effectiveness and feasibility of the method developed in the study. Finally, conclusions and suggestions for future work are given.
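
As background (one widely used convention, which may differ from the definitions adopted in the paper), an IVIFN A = ([a, b], [c, d]) consists of a membership interval [a, b] and a non-membership interval [c, d] with b + d ≤ 1, and a common score function for ranking such numbers is

\[
S(A) = \frac{a + b - c - d}{2} \in [-1, 1],
\]

where larger scores indicate stronger positive evidence; a score function of this kind is the type of tool used to validate the assignment obtained from the composite relative similarity degrees.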

The Relationship between the Ramadan Bazaar and the Attraction and Dissemination of Information: A Case of International Tourists

Many people regard food events as part of gastronomic tourism and as important in enhancing visitors' experiences. Realizing the importance and contribution of food events to the country's economy, the Malaysian government is undertaking greater efforts to promote such tourism activities to international tourists. Among other food events, the Ramadan bazaar is a unique food culture event that receives significant attention from the Malaysian Ministry of Tourism. This study reports an empirical investigation into international tourists' perceptions of the Ramadan bazaar, their attraction to it, and their willingness to disseminate information about it. Using the Ramadan bazaar at Kampung Baru, Kuala Lumpur as the data collection setting, the results revealed that the Ramadan bazaar attributes (food and beverages, events and culture) significantly influenced international tourists' attraction to such a bazaar. Their high level of experience and satisfaction positively influenced their willingness to disseminate information. The positive response among international tourists indicates that the Ramadan bazaar, as gastronomic tourism, can be used alongside other tourism products as a catalyst to generate and boost the local economy. The authorities closely associated with the tourism industry should therefore not ignore this indicator but continue to take proactive action in promoting the gastronomic event as one of the major tourist attractions.

Equal Sharing Solutions for Bicooperative Games

In this paper, we discuss the egalitarian solution (ES) and the center-of-gravity of the imputation-set value (CIV) for bicooperative games, which can be seen as extensions of the solutions for traditional games given by Dutta and Ray [1] and Driessen and Funaki [2]. Furthermore, axiomatic systems for the given values are proposed. Finally, a numerical example is offered to illustrate the ES and CIV.
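
For orientation (this is the classical definition that the bicooperative CIV extends; the bicooperative formulas themselves are given in the paper), the center-of-gravity of the imputation-set value of Driessen and Funaki [2] assigns to each player i in a TU game (N, v)

\[
\mathrm{CIS}_i(v) = v(\{i\}) + \frac{1}{|N|}\Big( v(N) - \sum_{j \in N} v(\{j\}) \Big),
\]

i.e. each player first receives his individual worth and the remaining surplus is then shared equally.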

Application of the Improved QFD Method Case Study: Kitchen Utensils Rack Design

This paper presents an application of the improved QFD method for determining the specifications of a kitchen utensils rack. By using the improved method, the subjectivity of the original QFD was reduced, particularly in defining the relationships between customer requirements and engineering characteristics. The regression analysis used to obtain the relationship functions between customer requirements and engineering characteristics also accommodated the inaccuracy of the competitive assessment results. The improved method, represented in the form of a mathematical model, provided formal guidance for allocating resources to improve the specifications of the kitchen utensils rack. The specifications obtained led to the highest feasible customer satisfaction.
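
A minimal sketch of the kind of regression step described above, with entirely hypothetical data and names (the paper's actual model and variables are not reproduced here): a least-squares fit relating one engineering characteristic to a customer-satisfaction rating taken from a competitive assessment, replacing the subjective 1/3/9 relationship scores of classical QFD.

```python
# Hypothetical sketch: estimate the relationship between an engineering
# characteristic (e.g. shelf load capacity) and a customer-satisfaction
# rating using ordinary least squares.
import numpy as np

# Competitive assessment data (hypothetical): measured characteristic
# values of benchmark products and the corresponding satisfaction ratings.
char_values = np.array([5.0, 8.0, 10.0, 12.0])   # e.g. load capacity in kg
satisfaction = np.array([2.1, 3.0, 3.8, 4.5])    # 1-5 customer rating

# Fit satisfaction ≈ a * characteristic + b.
A = np.column_stack([char_values, np.ones_like(char_values)])
(a, b), *_ = np.linalg.lstsq(A, satisfaction, rcond=None)

# The slope 'a' serves as a data-driven relationship weight, and the fitted
# function predicts the satisfaction gained from a proposed target value.
target = 11.0
print(f"slope={a:.3f}, predicted satisfaction at {target} kg: {a*target + b:.2f}")
```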

Development of Wind Turbine Simulator for Generator Torque Control

A wind turbine should be controlled to capture maximum wind energy and to prevent the turbine from stalling. To achieve these two goals, the wind turbine controller controls the torque on the generator and limits the input torque from the wind by pitching the blades. Usually, the generator torque is controlled through the inverter torque set point. However, verifying a control algorithm on an actual wind turbine requires a great deal of testing effort, and the actual turbine could be damaged while the algorithm is being tested. For this reason, several software packages have been developed and commercialized, such as GH Bladed by Garrad Hassan and FAST by NREL. These programs can simulate the control system through user-supplied subroutines or DLLs; however, they are not able to emulate the generator, such as a PMSG, in detail. In this paper, a small-scale wind turbine simulator is developed with an induction motor and a small drive train. The developed system can simulate wind turbine control algorithms in the region below rated power.
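
As an illustration of generator torque control below rated power (a common optimal-torque law, not necessarily the exact controller implemented in this paper, and with hypothetical turbine parameters), the torque set point can be made proportional to the square of the rotor speed:

```python
# Illustrative sketch: the widely used optimal-torque law for the
# below-rated-power region, where the generator torque set point is
# proportional to the square of the rotor speed.
import math

# Hypothetical turbine parameters.
RHO = 1.225        # air density [kg/m^3]
R = 2.5            # rotor radius [m]
CP_MAX = 0.45      # maximum power coefficient
LAMBDA_OPT = 7.0   # optimal tip-speed ratio

# Torque gain k = 0.5 * rho * pi * R^5 * Cp_max / lambda_opt^3
K_OPT = 0.5 * RHO * math.pi * R**5 * CP_MAX / LAMBDA_OPT**3

def torque_set_point(rotor_speed_rad_s: float) -> float:
    """Generator torque command T = k * omega^2 (referred to the rotor side)."""
    return K_OPT * rotor_speed_rad_s**2

# Example: torque command at a rotor speed of 20 rad/s.
print(f"k = {K_OPT:.3f} N*m*s^2, T(20 rad/s) = {torque_set_point(20.0):.1f} N*m")
```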

Scheduling Maintenance Actions for Gas Turbines Aircraft Engines

This paper considers the problem of scheduling maintenance actions for identical aircraft gas turbine engines. Each of the turbines consists of parts which frequently require replacement. A finite inventory of spare parts is available, and all parts are ready for replacement at any time. The inventory consists of both new and refurbished parts; hence, these parts have different field lives. The goal is to find a replacement-part sequencing that maximizes the time during which the aircraft keeps functioning before the inventory is replenished. The problem is formulated as an identical parallel machine scheduling problem in which the minimum completion time has to be maximized. Two models have been developed. The first is an optimization model based on a 0-1 linear programming formulation, while the second is an approximate procedure which consists of decomposing the problem into several two-machine subproblems, each of which is optimally solved using the first model. Both models have been implemented using Lingo and have been tested on two sets of randomly generated data with up to 150 parts and 10 turbines. Experimental results show that the optimization model is able to solve only instances with no more than 4 turbines, while the decomposition procedure often provides near-optimal solutions within a maximum CPU time of 3 seconds.
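
A standard 0-1 formulation of this max-min assignment problem (given here as a generic sketch; the paper's exact model may include additional constraints) is

\[
\max \; C_{\min} \quad \text{s.t.} \quad
\sum_{i=1}^{m} x_{ij} = 1 \;\; (j = 1,\dots,n), \qquad
\sum_{j=1}^{n} p_j\, x_{ij} \ge C_{\min} \;\; (i = 1,\dots,m), \qquad
x_{ij} \in \{0,1\},
\]

where p_j is the field life of part j, x_ij = 1 if part j is assigned to turbine i, and C_min is the earliest time at which some turbine runs out of usable parts.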

A Cognitive Architectural Approach to the Institutional Roles of Agent Societies

This paper concerns a formal model intended to help the simulation of agent societies in which institutional roles and institutional links can be specified operationally. That is, the paper concerns institutional roles that can be specified in terms of a minimal behavioral capability that an agent should have in order to enact that role and, thus, to perform the set of institutional functions that the role is responsible for. Correspondingly, the paper concerns institutional links that can be specified in terms of a minimal interactional capability that two agents should have in order to perform for each other, while enacting the two institutional roles joined by a given institutional link, the institutional functions supported by that link. The paper proposes a cognitive architecture approach to institutional roles and institutional links, that is, an approach in which an institutional role is seen as an abstract cognitive architecture that should be implemented by any concrete agent (or set of concrete agents) that enacts the institutional role, and in which institutional links are seen as interactions between the two abstract cognitive agents that model the two linked institutional roles. We introduce a cognitive architecture for this purpose, called the Institutional BCC (IBCC) model, which lifts Yoav Shoham's BCC (Beliefs-Capabilities-Commitments) agent architecture to social contexts. We show how the resulting model can be taken as a means for a cognitive architecture account of institutional roles and institutional links of agent societies. Finally, we present an example of a generic scheme for certain fragments of the social organization of agent societies, where institutional roles and institutional links are given in terms of the model.
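
The following is a purely hypothetical illustration, not the paper's IBCC formalism: a minimal BCC-style skeleton in which an institutional role is an abstract specification of the beliefs, capabilities and commitments that any enacting agent must provide, and an institutional link is an interaction between two role instances. All class and method names are invented for this sketch.

```python
# Hypothetical illustration of a BCC-style role specification.
from abc import ABC, abstractmethod

class InstitutionalRole(ABC):
    """Abstract 'cognitive architecture' view of an institutional role."""

    def __init__(self):
        self.beliefs: set[str] = set()                 # facts the role-holder maintains
        self.commitments: list[tuple[str, str]] = []   # (to_whom, institutional function)

    @abstractmethod
    def capabilities(self) -> set[str]:
        """Minimal behavioral capability required to enact the role."""

    def commit(self, counterpart: str, function: str) -> None:
        """Take on an institutional function toward a linked role."""
        self.commitments.append((counterpart, function))

class Auctioneer(InstitutionalRole):
    def capabilities(self) -> set[str]:
        return {"open_auction", "collect_bids", "declare_winner"}

# An institutional link is then an interaction between two role instances,
# e.g. an Auctioneer committing to a Bidder to declare a winner.
role = Auctioneer()
role.commit("bidder-1", "declare_winner")
print(role.capabilities(), role.commitments)
```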

The Effect of Combining Real Experimentation with Virtual Experimentation on Students' Success

The purpose of this study was to investigate the effect of combining Real Experimentation (RE) with Virtual Experimentation (VE) on students' conceptual understanding of the photoelectric effect. To achieve this, a pre-post comparison study design was used that involved 46 undergraduate students. Two groups were set up for this study. Participants in the control group used RE to learn the photoelectric effect, whereas participants in the experimental group used RE in the first part of the curriculum and VE in the other part. An achievement test was given to both groups before and after the intervention as a pre-test and post-test. The independent-samples t-test, one-way ANOVA and Tukey HSD test were used to analyze the data obtained from the study. According to the results of the analyses, the experimental group was found to be more successful than the control group.
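
A brief sketch of the statistical tests named above, applied to made-up post-test scores (the study's actual data are not reproduced here; group sizes and score distributions are assumptions for illustration only):

```python
# Hypothetical sketch of the independent-samples t-test, one-way ANOVA and
# Tukey HSD comparison on simulated post-test scores.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
control = rng.normal(60, 10, 23)        # RE-only group (hypothetical scores)
experimental = rng.normal(68, 10, 23)   # RE + VE group (hypothetical scores)

# Independent-samples t-test between the two groups.
t_stat, p_val = stats.ttest_ind(experimental, control)
print(f"t = {t_stat:.2f}, p = {p_val:.4f}")

# One-way ANOVA (equivalent to the t-test with two groups, shown for completeness).
f_stat, p_anova = stats.f_oneway(experimental, control)
print(f"F = {f_stat:.2f}, p = {p_anova:.4f}")

# Tukey HSD post-hoc comparison.
scores = np.concatenate([control, experimental])
groups = ["control"] * len(control) + ["experimental"] * len(experimental)
print(pairwise_tukeyhsd(scores, groups))
```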

Reliability Analysis of Press Unit using Vague Set

In conventional reliability assessment, the reliability data of system components are treated as crisp values. The collected data have some uncertainties due to errors by human beings or machines or from other sources. These uncertainty factors limit the understanding of system component failure because the data are incomplete. In such situations, classical methods need to be generalized to a fuzzy environment for studying and analyzing the systems of interest. Fuzzy set theory has been proposed to handle such vagueness by generalizing the notion of membership in a set. Essentially, in a Fuzzy Set (FS) each element is associated with a point value selected from the unit interval [0, 1], which is termed the grade of membership in the set. A Vague Set (VS), like an Intuitionistic Fuzzy Set (IFS), is a further generalization of an FS: instead of the point-based membership used in an FS, interval-based membership is used in a VS. The interval-based membership in a VS is more expressive in capturing the vagueness of data. In the present paper, vague set theory coupled with the conventional Lambda-Tau method is presented for the reliability analysis of repairable systems. The methodology uses Petri nets (PN) to model the system instead of a fault tree because they allow the efficient simultaneous generation of minimal cut and path sets. The presented method is illustrated with the press unit of a paper mill.
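
In the usual formulation due to Gau and Buehrer (a standard definition; the paper's notation may differ slightly), a vague set V on a universe X is characterized by a truth-membership function t_V and a false-membership function f_V with

\[
t_V(x),\; f_V(x) \in [0,1], \qquad t_V(x) + f_V(x) \le 1,
\]

so that the grade of membership of x is bounded by the subinterval [t_V(x), 1 - f_V(x)] of [0, 1] rather than by a single point value.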

Sustainability Management for Wine Production: A Case of Thailand

At present, increased concern about global environmental problems has magnified the importance of sustainability management. To move towards sustainability, companies need to look at everything from a holistic perspective in order to understand the interconnections between economic growth and environmental and social sustainability. This paper aims to gain an understanding of the key determinants that drive sustainability management and the barriers that hinder its development. It employs semi-structured interviews with key informants, site observation and documentation. The informants are the production, marketing and environmental managers of a leading wine producer, which aims to become Asia's leader in wine and wine-based products. It is found that corporate image and top management leadership are the primary factors influencing the adoption of sustainability management, while a lack of environmental knowledge and inefficient communication are identified as barriers.

Fuzzy Modeling Tool for Creating a Component Model of Information System

This paper focuses on creating a component model of an information system under uncertainty. The paper identifies problems in the current approach to component modeling and proposes a fuzzy tool that works with vague customer requirements and proposes components of the resulting component model. The proposed tool is verified on a specific information system and the results are presented in the paper. After suitable sub-components of the resulting component model are found, the component model is visualized by the tool.
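
A purely hypothetical sketch of the general idea of ranking candidate components from vague requirements (the paper's actual tool, membership functions and inference rules are not specified here; all requirement and component names are invented):

```python
# Hypothetical sketch: rank candidate components by a fuzzy relevance score
# obtained from vague customer requirements via a simple max-min composition.

# Degree to which each (vague) requirement applies, elicited from the customer.
requirement_degree = {"fast search": 0.8, "offline mode": 0.4, "reporting": 0.9}

# Fuzzy relation: how strongly each candidate component supports each requirement.
supports = {
    "IndexingService": {"fast search": 0.9, "offline mode": 0.1, "reporting": 0.2},
    "LocalCache":      {"fast search": 0.5, "offline mode": 0.9, "reporting": 0.0},
    "ReportEngine":    {"fast search": 0.1, "offline mode": 0.0, "reporting": 0.95},
}

def relevance(component: str) -> float:
    """Max-min composition of requirement degrees with the support relation."""
    return max(min(requirement_degree[r], s) for r, s in supports[component].items())

for c in sorted(supports, key=relevance, reverse=True):
    print(f"{c}: relevance {relevance(c):.2f}")
```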

Privacy Issues in Pervasive Healthcare Monitoring System: A Review

Privacy issues are commonly discussed among researchers, practitioners, and end-users of pervasive healthcare. Pervasive healthcare systems are applications that can support patients' needs anytime and anywhere. However, pervasive healthcare raises privacy concerns, since it can lead to situations in which patients are not aware that their private information is being shared and thus become vulnerable to threats. We have systematically analyzed the privacy issues and present a summary in tabular form to show the relationships among them. The six issues identified are medical information misuse, prescription leakage, medical information eavesdropping, social implications for the patient, patient difficulties in managing privacy settings, and lack of support in designing privacy-sensitive applications. We narrow down the issues and choose to focus on the lack of support in designing privacy-sensitive applications by proposing a privacy-sensitive architecture specifically designed for pervasive healthcare monitoring systems.

Financial Regulations in the Process of the Global Financial Crisis and the Macroeconomic Impact of Basel III

Basel III (or the Third Basel Accord) is a global regulatory standard on bank capital adequacy, stress testing and market liquidity risk, agreed upon by the members of the Basel Committee on Banking Supervision in 2010-2011 and scheduled to be introduced from 2013 until 2018. Basel III is a comprehensive set of reform measures. These measures aim to (1) improve the banking sector's ability to absorb shocks arising from financial and economic stress, whatever the source, (2) improve risk management and governance, and (3) strengthen banks' transparency and disclosures. Correspondingly, the reforms target (1) bank-level, or micro-prudential, regulation, which will help raise the resilience of individual banking institutions to periods of stress, and (2) macro-prudential regulation of system-wide risks that can build up across the banking sector, as well as the pro-cyclical amplification of these risks over time. These two approaches to supervision are complementary, as greater resilience at the individual bank level reduces the risk of system-wide shocks. Regarding the macroeconomic impact of Basel III, the OECD estimates that the medium-term impact of Basel III implementation on GDP growth is in the range of -0.05 percent to -0.15 percent per year. Economic output is mainly affected by an increase in bank lending spreads, as banks pass the rise in funding costs caused by higher capital requirements on to their customers. These estimated effects on GDP growth assume no active response from monetary policy; the impact of Basel III on economic output could be offset by a reduction (or delayed increase) in monetary policy rates of about 30 to 80 basis points. The aim of this paper is to create a framework based on the recent regulations in order to prevent financial crises; the lessons drawn from overcoming the global financial crisis can thus help to avert financial crises that may occur in future periods. The first part of the paper examines the effects of the global crisis on the banking system together with the concept of financial regulation. The second part analyzes financial regulations, and Basel III in particular. The last section explores the possible macroeconomic impacts of Basel III.

Enhancing Multi-Frame Images Using Self-Delaying Dynamic Networks

This paper presents the use of a newly created network structure, known as a Self-Delaying Dynamic Network (SDN), to create a high-resolution image from a set of time-stepped input frames. SDNs are non-recurrent temporal neural networks which can process time-sampled data; they can store input data over a lifecycle and feature dynamic, logic-based connections between layers. Several low-resolution images and one high-resolution image of a scene were presented to the SDN during training by a genetic algorithm. The SDN was trained to process the input frames in order to recreate the high-resolution image. The trained SDN was then used to enhance a number of unseen noisy image sets. The quality of the high-resolution images produced by the SDN is compared to that of high-resolution images generated using bicubic interpolation, and the SDN-produced images are shown to be superior in several respects.
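
For context, a sketch of the comparison baseline only (the SDN itself is not reproduced here; file names and the upscaling factor are assumptions for illustration):

```python
# Baseline sketch: generating a high-resolution image from a low-resolution
# frame by bicubic interpolation, the reference method against which the
# SDN output is compared.
import cv2  # OpenCV

low_res = cv2.imread("frame_0.png")   # hypothetical low-resolution input frame
scale = 4                              # assumed upscaling factor
high_res = cv2.resize(low_res, None, fx=scale, fy=scale,
                      interpolation=cv2.INTER_CUBIC)
cv2.imwrite("frame_0_bicubic.png", high_res)
```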

Simulation of Dynamics of a Permanent Magnet Linear Actuator

A comparison of two approaches to simulating the dynamic behaviour of a permanent magnet linear actuator is presented. These are a fully coupled model, in which the electromagnetic field, electric circuit and mechanical motion problems are solved simultaneously, and a decoupled model, in which a set of static magnetic field analyses is first carried out and then the electric circuit and mechanical motion equations are solved employing bi-cubic spline approximations of the field analysis results. The results show that the proposed decoupled model is of satisfactory accuracy and gives more flexibility when the actuator response has to be estimated for different external conditions, e.g. external circuit parameters or mechanical loads.
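
The following is an illustrative, simplified sketch of the decoupled approach (not the paper's exact model): the force obtained from static field analyses is tabulated over a (position, current) grid, approximated by a bicubic spline, and then used inside lumped circuit and motion ODEs. All parameter values and the placeholder force table are hypothetical, and the circuit here uses a constant back-EMF constant instead of a splined flux linkage.

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline
from scipy.integrate import solve_ivp

# Hypothetical static-FEA results: force [N] over position [m] x current [A].
x_grid = np.linspace(0.0, 0.02, 9)
i_grid = np.linspace(0.0, 5.0, 9)
force_table = 8.0 * i_grid[None, :] * (1.0 - (x_grid[:, None] / 0.04) ** 2)  # placeholder
force = RectBivariateSpline(x_grid, i_grid, force_table)  # default kx=ky=3 -> bicubic

# Simplified lumped parameters (the full model would also spline the flux linkage).
R, L, KE = 2.0, 5e-3, 8.0      # resistance [ohm], inductance [H], EMF constant [V*s/m]
M, C, V_SUP = 0.5, 3.0, 24.0   # moving mass [kg], damping [N*s/m], supply voltage [V]

def rhs(t, y):
    x, v, i = y
    dx = v
    dv = (float(force.ev(x, i)) - C * v) / M      # mechanical motion equation
    di = (V_SUP - R * i - KE * v) / L             # simplified circuit equation
    return [dx, dv, di]

sol = solve_ivp(rhs, (0.0, 0.01), [0.0, 0.0, 0.0], max_step=1e-4)
print(f"final position: {sol.y[0, -1]*1000:.2f} mm, final current: {sol.y[2, -1]:.2f} A")
```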

Performance Evaluation of Neural Network Prediction for Data Prefetching in Embedded Applications

Embedded systems need to respect stringent real-time constraints. Various hardware components included in such systems, such as cache memories, exhibit variability and therefore affect execution time. Indeed, a cache memory access from an embedded microprocessor might result in a cache hit, where the data is available, or a cache miss, where the data needs to be fetched from an external memory with an additional delay. It is therefore highly desirable to predict future memory accesses during execution in order to prefetch data appropriately without incurring delays. In this paper, we evaluate the potential of several artificial neural networks for the prediction of instruction memory addresses. Neural networks have the potential to tackle the nonlinear behavior observed in memory accesses during program execution, and their numerous demonstrated hardware implementations favor this choice over traditional forecasting techniques for inclusion in embedded systems. However, embedded applications execute millions of instructions and therefore produce millions of addresses to be predicted. This very challenging problem of neural-network-based prediction of large time series is approached in this paper by evaluating various neural network architectures based on the recurrent neural network paradigm, with pre-processing based on the Self-Organizing Map (SOM) classification technique.
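
An illustrative sketch only, not the paper's exact architecture: a small recurrent network that predicts the next instruction-address class from a window of previous classes. The SOM pre-processing step is approximated here by a plain quantization of address deltas into a fixed number of classes; all sizes and names are assumptions.

```python
import torch
import torch.nn as nn

NUM_CLASSES = 64      # assumed number of SOM/quantization classes
SEQ_LEN = 16          # window of previous accesses fed to the recurrent net

class AddressPredictor(nn.Module):
    def __init__(self, num_classes=NUM_CLASSES, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(num_classes, 16)   # class index -> vector
        self.rnn = nn.LSTM(16, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)   # scores for the next class

    def forward(self, x):                 # x: (batch, SEQ_LEN) of class indices
        h, _ = self.rnn(self.embed(x))
        return self.head(h[:, -1, :])     # predict from the last time step

# Toy training step on random data, just to show the intended usage.
model = AddressPredictor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
x = torch.randint(0, NUM_CLASSES, (8, SEQ_LEN))   # fake class sequences
y = torch.randint(0, NUM_CLASSES, (8,))           # fake next-class targets
loss = loss_fn(model(x), y)
loss.backward()
opt.step()
print(f"toy loss: {loss.item():.3f}")
```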

The Analysis of Printing Quality of Offset Printing Ink with Coconut Oil Base

The objectives of this research are to produce prototype coconut-oil-based solvent offset printing inks and to analyze the basic print quality obtained with them, by using coconut oil to produce a varnish and then using that varnish to produce black offset printing inks. The qualities of prints made with the coconut-oil-based inks on gloss-coated woodfree paper weighing 130 grams, namely the CIELAB values, density values, and dot gain values, were then analyzed. The results indicated that the suitable varnish formulation uses 51% coconut oil, 36% phenolic resin, and 14% solvent oil, while the suitable black offset ink formula uses the varnish mixed with 20% coconut oil. For the prints made on paper with the coconut-oil-based offset inks, the CIELAB values of the black ink were L* = 31.90, a* = 0.27, and b* = 1.86, the density value was 1.27, and the dot gain value was high in the mid-tone area of the image.

Medical Image Segmentation Using Deformable Model and Local Fitting Binary: Thoracic Aorta

This paper presents an application of level sets to the segmentation of abdominal and thoracic aortic aneurysms in CTA datasets. An important challenge in reliably detecting the aorta is the need to overcome problems associated with intensity inhomogeneities. Level sets belong to an important class of methods that utilize partial differential equations (PDEs) and have been extensively applied in image segmentation. A kernel function in the level set formulation aids the suppression of noise in the extracted regions of interest and then guides the motion of the evolving contour for the detection of weak boundaries. The speed of curve evolution has been significantly improved, with a resulting decrease in segmentation time compared with previous implementations of level sets, and the method is shown to be more effective than other approaches in coping with intensity inhomogeneities. We have applied the Courant-Friedrichs-Lewy (CFL) condition as the stability criterion for our algorithm.
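
In its simplest form for an explicit level-set update with speed function F on a grid with spacing Δx (a generic statement of the criterion, not the paper's specific scheme), the CFL condition requires

\[
\Delta t \;\le\; \frac{\Delta x}{\max_{\Omega} |F|},
\]

i.e. the evolving front may not move more than one grid cell per time step, which bounds the time step used in the curve evolution.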

An Adaptive Memetic Algorithm With Dynamic Population Management for Designing HIV Multidrug Therapies

In this paper, a mathematical model of the human immunodeficiency virus (HIV) is utilized and an optimization problem is proposed, with the final goal of designing an optimal 900-day structured treatment interruption (STI) protocol. Two types of drugs commonly used in highly active antiretroviral therapy (HAART), reverse transcriptase inhibitors (RTIs) and protease inhibitors (PIs), are considered. In order to solve the proposed optimization problem, an adaptive memetic algorithm with population management (AMAPM) is proposed. The AMAPM uses a distance measure to control the diversity of the population in genotype space, thus preventing stagnation and premature convergence. Moreover, the AMAPM uses a diversity parameter in phenotype space to dynamically set the population size and the number of crossovers during the search process. Three crossover operators diversify the population simultaneously, and their respective progress is used to set the number of applications of each crossover per generation. In order to escape local optima and introduce new search directions toward the global optimum, two local searchers assist the evolutionary process. In contrast to traditional memetic algorithms, the activation of these local searchers is not random and depends on the diversity parameters in both genotype space and phenotype space. The capability of the AMAPM to find optimal solutions is demonstrated in comparison with three popular metaheuristics.
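
A hypothetical sketch of the general idea of diversity-driven population management (not the paper's exact rules or thresholds): measure genotype-space diversity as the average pairwise Hamming distance of a binary-encoded population and use it to adapt the population size, growing it when the population starts to converge and shrinking it when it is diverse.

```python
import itertools
import random

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def diversity(population):
    """Average pairwise Hamming distance, normalized by genotype length."""
    pairs = list(itertools.combinations(population, 2))
    return sum(hamming(a, b) for a, b in pairs) / (len(pairs) * len(population[0]))

def adapt_population_size(current_size, div, low=0.1, high=0.4,
                          min_size=20, max_size=100):
    """Grow the population when diversity is low, shrink it when diversity is high."""
    if div < low:
        return min(max_size, current_size + 10)
    if div > high:
        return max(min_size, current_size - 10)
    return current_size

# Toy usage with a random binary population of 40 individuals of length 64.
pop = [[random.randint(0, 1) for _ in range(64)] for _ in range(40)]
d = diversity(pop)
print(f"diversity = {d:.3f}, next population size = {adapt_population_size(40, d)}")
```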