Enhanced Character-Based Algorithm for Small Parsimony

A phylogenetic tree is a graphical representation of the evolutionary relationships among three or more genes or organisms. Such trees show the relatedness of data sets, the divergence times of species or genes, and the nature of their common ancestors. Assessing the quality of a phylogenetic tree requires a parsimony criterion, and various approaches have been proposed for constructing most-parsimonious trees. This paper is concerned with calculating and optimizing the number of state changes required, the task addressed by small parsimony algorithms. It proposes an enhanced small parsimony algorithm that yields a better score, based on the number of evolutionary changes needed to produce the observed sequence changes in the tree, and also reconstructs the ancestors of the given input.
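
The core of classical small parsimony is Fitch's algorithm, which enhanced methods of this kind build upon. The sketch below is a minimal illustration of that baseline for a single character, assuming a rooted binary tree encoded as nested tuples; it is not the authors' implementation.

    # Minimal Fitch small-parsimony sketch (single character, binary tree).
    # Leaves are state strings; internal nodes are (left, right) tuples.
    def fitch(tree):
        """Return (candidate ancestral states, parsimony score) for the subtree."""
        if isinstance(tree, str):                  # leaf: observed state
            return {tree}, 0
        left, right = tree
        lset, lcost = fitch(left)
        rset, rcost = fitch(right)
        common = lset & rset
        if common:                                 # no substitution needed here
            return common, lcost + rcost
        return lset | rset, lcost + rcost + 1      # union costs one substitution

    states, score = fitch((("A", "C"), ("A", "G")))
    print(states, score)                           # {'A'} 2: two changes suffice

The candidate state sets also support the ancestral reconstruction mentioned in the abstract: a top-down pass choosing one state from each set yields a most-parsimonious labelling.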

Target Tracking in Sensor Networks: A Distributed Constraint Satisfaction Approach

In distributed resource allocation, a set of agents must assign their resources to a set of tasks. This problem arises in many real-world domains such as distributed sensor networks, disaster rescue, and hospital scheduling. Despite the variety of approaches proposed for distributed resource allocation, a systematic formalization of the problem that explains its different sources of difficulty, together with a formal account of the strengths and limitations of key approaches, is still missing. We take a step towards this goal by using a formalization of distributed resource allocation that captures both the dynamic and the distributed aspects of the problem. In this paper we present a new approach to target tracking in sensor networks and compare it with previous approaches. The central contribution of the paper is a generalized mapping from distributed resource allocation to dynamic distributed constraint satisfaction problems (DDCSP). This mapping is proven to correctly solve resource allocation problems of specific difficulty. The theoretical result is verified in practice by a simulation of a real-world distributed sensor network.
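
The mapping itself targets dynamic distributed CSPs; as a rough, centralized illustration of the encoding idea, one can treat each task (target) as a variable whose domain is the set of resources (sensors) able to cover it, with a mutual-exclusion constraint between tasks. The names and data below are hypothetical, and the distributed machinery of DDCSP is not shown.

    # Toy CSP encoding of resource allocation: variables = tasks,
    # domains = capable resources, constraint = one task per resource.
    def solve(tasks, domains, assignment=None):
        assignment = assignment or {}
        if len(assignment) == len(tasks):
            return assignment
        task = next(t for t in tasks if t not in assignment)
        for resource in domains[task]:
            if resource not in assignment.values():      # mutual exclusion
                result = solve(tasks, domains, {**assignment, task: resource})
                if result:
                    return result
        return None                                      # dead end: backtrack

    domains = {"target1": ["s1", "s2"], "target2": ["s2", "s3"]}
    print(solve(list(domains), domains))  # {'target1': 's1', 'target2': 's2'}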

Antecedent Factors of Ethical Ideologies in Moral Judgment: Evidence from a Mixed-Method Study

This research investigates the factors that influence moral judgments when dealing with ethical dilemmas in an organizational context, as well as the antecedents of individual ethical ideology (idealism and relativism). A mixed-method design, combining qualitative (field study) and quantitative (survey) approaches, was used. An initial model was developed first and then fine-tuned based on the field studies. Data were collected from managers in large Malaysian organizations. The results reveal that in-group collectivism culture, power distance culture, parental values, and religiosity were significant antecedents of ethical ideology; however, the direct effects of these variables on moral judgment were not significant. Furthermore, the results confirm the significant effects of ethical ideology on moral judgment. This study provides valuable insight for evaluating the validity of existing theory as proposed in the literature and offers significant practical implications.

Object Speed Estimation Using Fuzzy Sets

Speed estimation is one of the important practical tasks in machine vision, robotics and mechatronics. The availability of high-quality, inexpensive video cameras and the increasing need for automated video analysis have generated a great deal of interest in machine vision algorithms. Numerous approaches to speed estimation have been proposed, so a classification and survey of these methods can be very useful. The goal of this paper is first to review and assess these methods. We then propose a novel algorithm for estimating the speed of a moving object using fuzzy concepts. There is a direct relation between motion blur parameters and object speed. In our new approach we use the Radon transform to find the direction of the blurred image, and fuzzy sets to estimate the motion blur length. The main benefit of this algorithm is its robustness and precision on noisy images. Our method was tested on many images spanning a wide range of SNR values, with satisfactory results.
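
The direction-estimation step lends itself to a short sketch: uniform motion blur leaves near-parallel stripes in the log power spectrum, and the angle at which the Radon projection varies most strongly estimates the blur direction. The snippet assumes scikit-image and a grayscale float image, and is an illustration rather than the authors' exact pipeline.

    # Estimate motion-blur direction from the spectrum via the Radon transform.
    import numpy as np
    from skimage.transform import radon

    def blur_direction(image):
        # Log power spectrum: blur stripes become strong oriented structure.
        spectrum = np.log1p(np.abs(np.fft.fftshift(np.fft.fft2(image))))
        angles = np.arange(180)
        sinogram = radon(spectrum, theta=angles, circle=False)
        return angles[np.argmax(sinogram.var(axis=0))]   # most structured angle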

Evaluating Complexity – Ethical Challenges in Computational Design Processes

Complexity, as a theoretical background, has made it easier to understand and explain the features and dynamic behavior of various complex systems. As this common theoretical background has confirmed, borrowing terminology from the natural sciences for design has helped to control and understand urban complexity. Phenomena like self-organization, evolution and adaptation are appropriate for describing the formerly inaccessible characteristics of the complex environment in unpredictable bottom-up systems. Increased computing capacity has been a key element in capturing the chaotic nature of these systems. A paradigm shift in urban planning and architectural design has forced us to give up the illusion of total control over the urban environment, and consequently to seek novel methods for steering its development. New methods using dynamic modeling have offered a real option for a more thorough understanding of complexity and urban processes. At best, new approaches may renew design processes so that we get a better grip on the complex world through more flexible processes, support urban environmental diversity, and respond to our needs beyond basic welfare by liberating ourselves from standardized minimalism. A complex system and its features are, as such, beyond human ethics. Self-organization and evolution are neither good nor bad; their mechanisms are by nature devoid of reason. They are common in urban dynamics, in natural processes and human activity alike. They are features of a complex system, and they cannot be prevented; yet their dynamics can be studied and supported. The paradigm of complexity and the new design approaches have been criticized for a lack of humanity and morality, but the ethical implications of scientific or computational design processes have not been much discussed. It is important to distinguish the (unexciting) ethics of the theory and tools from the ethics of computer-aided processes based on ethical decisions. Urban planning and architecture cannot be based on the survival of the fittest; however, the natural dynamics of the system cannot be impeded on the grounds of being "non-human". In this paper the ethical challenges of using dynamic models are contemplated in light of a few examples of new architecture, dynamic urban models and the literature. It is suggested that the ethical challenges in computational design processes could be reframed under the concepts of responsibility and transparency.

Multi-Agent Systems Applied in the Modeling and Simulation of Biological Problems: A Case Study in Protein Folding

The multi-agent system approach has proven to be an effective and appropriate abstraction level for constructing whole models of a variety of biological problems, integrating aspects found in both "micro" and "macro" approaches to modeling such phenomena. Taking these considerations into account, this paper presents the key computational characteristics gathered into a novel bioinformatics framework built on a multi-agent architecture. The version of the tool presented here allows studying and exploring complex problems belonging principally to structural biology, such as protein folding. The framework is used as a virtual laboratory to explore a minimalist model of protein folding as a test case. To demonstrate the laboratory concept of the platform, as well as its flexibility and adaptability, we studied the folding of two particular sequences, one 45-mer and one 64-mer, both described by an HP model (hydrophobic and polar residues only) on a coarse-grained 2D square lattice. As discussed in the paper, these two sequences were chosen to stress the platform, in order to determine which tools should be created or improved to meet the computational and analytical demands of a given difficult sequence. The underlying philosophy is that the continuous study of sequences itself yields improvements to be added to the platform, steadily increasing its efficiency, as demonstrated herein.
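
For readers unfamiliar with the HP model, its energy function is simple enough to state in a few lines: every topological contact between two hydrophobic (H) residues that are not sequence neighbours contributes -1. The sketch below assumes a self-avoiding walk on the 2D square lattice; the agent-based machinery of the platform is not shown.

    # HP-model energy on a 2D square lattice: count non-bonded H-H contacts.
    def hp_energy(sequence, coords):
        """sequence: string over 'H'/'P'; coords: list of (x, y) lattice points."""
        position = {xy: i for i, xy in enumerate(coords)}
        energy = 0
        for i, (x, y) in enumerate(coords):
            if sequence[i] != "H":
                continue
            for nb in ((x + 1, y), (x, y + 1)):      # each contact counted once
                j = position.get(nb)
                if j is not None and sequence[j] == "H" and abs(i - j) > 1:
                    energy -= 1
        return energy

    print(hp_energy("HPPH", [(0, 0), (1, 0), (1, 1), (0, 1)]))   # -1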

The Relationship between Spatial Planning and Transportation Planning in Southern Africa and its Consequences for Human Settlement

The paper reviews the relationship between spatial and transportation planning in the Southern African Development Community (SADC) region of Sub-Saharan Africa. It argues that most urbanisation in the region has occurred since the 1950s and that, accordingly, urban development has been profoundly and negatively affected by the (misguided) spatial and institutional tenets of modernism; much of the poor performance of these settlements can be attributed directly to this. Two features of the planning systems are emphasized: the way in which programmatic land-use planning lies at the heart of both spatial and transportation planning, and the way in which transportation and spatial planning have been separated into independent processes. In the final section, the paper identifies ways of improving the planning system. Firstly, it identifies the performance qualities which Southern African settlements should seek to achieve. Secondly, it focuses on two necessary arenas of change: the need to replace programmatic land-use planning practices with structural-spatial approaches, and the case for making urban corridors a spatial focus of integrated planning, as a way of beginning the restructuring and intensification of settlements currently characterised by sprawl, fragmentation and separation.

Paradigm of Relocation of Urban Poor Habitats (Slums): Case Study of Nagpur City

Developing countries face a problem of slums, and there appears to be no foolproof solution for eradicating them. Of the three approaches to slum development aimed at improving quality of life, in-situ upgradation has been found to be the best, while the relocation approach has proved to be a failure. The factors responsible for the failure of relocation projects need to be assessed, which is the basic aim of this paper. These factors are loss of livelihood, lack of security of tenure, and inefficiency of the government; they are traced and mapped through examples from Western and Indian cities. The National Habitat and Resettlement Policy emphasizes the relationship between shelter and workplace. The SRA has identified 55 slums for relocation owing to land-use reservations, security of tenure and the non-notified status of slums. Policy guidelines are suggested for successful relocation projects. Keywords: Livelihood, Relocation, Slums, Urban poor.

A Perceptually Optimized Foveation-Based Wavelet Embedded Zero Tree Image Coding

In this paper, we propose a Perceptually Optimized Foveation-based Embedded ZeroTree Image Coder (POEFIC) that applies perceptual weighting to wavelet coefficients prior to SPIHT encoding, in order to reach a targeted bit rate with improved perceptual quality, given a fixation point that determines the region of interest (ROI). The paper also introduces a new objective quality metric based on a psychovisual model that integrates the properties of the human visual system (HVS); this metric plays an important role in our POEFIC quality assessment. Our coder is based on a vision model that incorporates various masking effects of HVS perception: it weights the wavelet coefficients according to that model and attempts to increase the perceptual quality for a given bit rate and observation distance. The perceptual weights for all wavelet subbands are computed based on 1) foveation masking, to remove or reduce considerable high frequencies in peripheral regions, 2) luminance and contrast masking, and 3) the contrast sensitivity function (CSF), to achieve the perceptual decomposition weighting. The new perceptually optimized codec has the same complexity as the original SPIHT technique, yet the experimental results show that our coder performs very well in terms of quality measurement.
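
As a rough illustration of the foveation component, the weight applied to a coefficient can be made to decay with eccentricity from the fixation point. The falloff constant below is a made-up assumption; the paper derives its weights from an HVS model combining foveation, luminance/contrast masking and the CSF.

    # Toy foveation weight map: 1 at the fixation point, decaying outwards.
    import numpy as np

    def foveation_weights(shape, fixation, falloff=0.01):
        rows, cols = np.indices(shape)
        ecc = np.hypot(rows - fixation[0], cols - fixation[1])   # eccentricity
        return 1.0 / (1.0 + falloff * ecc)

    w = foveation_weights((512, 512), fixation=(256, 256))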

A Unified Approach to Thermodynamics of Power Yield in Thermal, Chemical and Electrochemical Systems

This paper unifies power optimization approaches for various energy converters, such as thermal, solar, chemical, and electrochemical engines, in particular fuel cells. Thermodynamics leads to the converters' efficiency and limiting power. Efficiency equations serve to solve problems of upgrading and downgrading of resources. While the optimization of steady systems applies differential calculus and Lagrange multipliers, dynamic optimization involves variational calculus and dynamic programming. In reacting systems chemical affinity constitutes a prevailing component of the overall efficiency, so power is analyzed in terms of the active part of the chemical affinity. The main novelty of the present paper in the energy-yield context consists in showing that the generalized heat flux Q (the traditional heat flux q plus the product of temperature and the sum of the products of partial entropies and species fluxes) plays, in complex cases (solar, chemical and electrochemical), the same role as the traditional heat q in pure heat engines. The presented methodology is also applied to power limits in fuel cells, regarded as electrochemical flow engines propelled by chemical reactions. The performance of fuel cells is determined by the magnitudes and directions of the participating streams and by the mechanism of electric current generation. Voltage lowering below the reversible voltage is a proper measure of a cell's imperfection. The voltage losses, called polarization, include contributions from three main sources: activation, ohmic and concentration. Examples show power maxima in fuel cells and prove the relevance of extending thermal machine theory to chemical and electrochemical systems. The main novelty of the present paper in the fuel-cell context consists in introducing an effective or reduced Gibbs free energy change between products p and reactants s which takes into account the decrease of voltage and power caused by the incomplete conversion of the overall reaction.
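
A plausible rendering of the generalized heat flux described above, with q the traditional heat flux, T the temperature, s_k the partial entropy and n_k the flux of species k (the symbols are assumptions; the paper's notation may differ):

    \[
      Q \;=\; q + T \sum_{k} s_k\, n_k
    \]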

Conventional and PSO Based Approaches for Model Reduction of SISO Discrete Systems

The reduction of Single Input Single Output (SISO) discrete systems to lower-order models, using a conventional and an evolutionary technique, is presented in this paper. The conventional technique combines the advantages of the Modified Cauer Form (MCF) and differentiation. In this method the original discrete system is first converted into an equivalent continuous system by applying the bilinear transformation. The denominator of the equivalent continuous system and its reciprocal are differentiated successively, and the reduced denominator of the desired order is obtained by combining the differentiated polynomials. The numerator is obtained by matching the quotients of the MCF. The reduced continuous system is then converted back into a discrete system using the inverse bilinear transformation. In the evolutionary technique, Particle Swarm Optimization (PSO) is employed to reduce the higher-order model. The PSO method is based on minimizing the Integral Squared Error (ISE) between the transient responses of the original higher-order model and the reduced-order model for a unit step input. Both methods are illustrated through a numerical example.
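
The PSO route reduces, in essence, to minimizing an ISE objective over the reduced model's coefficients. The sketch below illustrates this with a continuous-time third-order system reduced to second order for brevity (the paper works with discrete systems); the transfer functions, swarm size and bounds are illustrative assumptions.

    # PSO minimizing the ISE between step responses (continuous-time toy case).
    import numpy as np
    from scipy import signal

    t = np.linspace(0, 10, 500)
    _, y_full = signal.step(signal.TransferFunction([1], [1, 6, 11, 6]), T=t)

    def ise(params):                     # reduced model: b0 / (s^2 + a1*s + a0)
        b0, a1, a0 = params
        _, y_red = signal.step(signal.TransferFunction([b0], [1, a1, a0]), T=t)
        return np.trapz((y_full - y_red) ** 2, t)

    rng = np.random.default_rng(0)
    pos = rng.uniform(0.1, 10, (30, 3))            # 30 particles, 3 coefficients
    vel = np.zeros_like(pos)
    pbest, pcost = pos.copy(), np.array([ise(p) for p in pos])
    for _ in range(50):
        gbest = pbest[pcost.argmin()]
        vel = (0.7 * vel
               + 1.5 * rng.random(pos.shape) * (pbest - pos)
               + 1.5 * rng.random(pos.shape) * (gbest - pos))
        pos = pos + vel
        cost = np.array([ise(p) for p in pos])
        better = cost < pcost                      # NaN/inf costs never win here
        pbest[better], pcost[better] = pos[better], cost[better]
    print(pbest[pcost.argmin()], pcost.min())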

The Design and Analysis of Learning Effects for a Game-based Learning System

The major purpose of this study is to use network and multimedia technologies to build a game-based learning system through which junior high school students learn "World Geography" via a role-playing game approach. This study first investigated the motivations and habits of junior high school students in using the Internet and online games, and then designed a game-based learning system according to situated and game-based learning theories. A teaching experiment was conducted to analyze the learning effectiveness of students using the system and the major factors affecting their learning. A questionnaire survey was used to understand the students' attitudes towards game-based learning. The results showed that the game-based learning system can enhance students' learning, but that gender and Internet-usage habits have no significant impact on learning. Game experience has a significant impact on students' learning: the higher the experience value, the better the learning effectiveness. The questionnaire results also revealed that the system can increase students' motivation and interest in learning "World Geography".

Value of Sharing: Viral Advertisement

Consumers' motivations for sharing viral advertisements, and the impact of these advertisements on brand perceptions, are questioned in this study. Three fundamental questions are answered: individuals' motivations for watching and sharing advertisements, the criteria for liking a viral advertisement, and the impact of individuals' attitudes towards a viral advertisement on brand perception. The study is carried out using a viral advertisement that ran in Turkey. Data are collected by an online survey of individuals who experienced the sample advertisement and are analyzed with the SPSS statistical package. Recently, the traditional advertising mindset has been changing, and new advertising approaches with significant impacts on consumers are being debated. Viral advertising is a modern advertising approach that offers brands significant advantages beyond traditional channels such as television, radio and magazines. Also known as electronic word-of-mouth (eWOM), it consists of the free spread of persuasive brand messages through interpersonal communication. Compared to traditional advertising, it takes a more provocative thematic approach, founded on creating advertisements that consumers find worth sharing. In a manner of speaking, viral advertising can therefore be described as media engineering: content worth sharing turns people into volunteer spokespersons for a brand and strengthens the emotional bond between brand and consumer. Especially in sectors and countries where traditional advertising channels are limited, viral advertising creates vital advantages.

Changes in the Research of Crisis

Owing to the interdisciplinary nature of crises, the position of researchers in this field is rather difficult, and traditional research methods often cannot be applied. The article is aimed at the changes in crisis research: it describes the substance of the individual changes and emphasizes the shift in research approaches to the crisis.

Probabilistic Method of Wind Generation Placement for Congestion Management

Wind farms (WFs) with high penetration levels are being established in power systems worldwide more rapidly than other renewable resources. The Independent System Operator (ISO), as a policy maker, should propose appropriate places for WF installation in order to maximize the benefits for investors. New WF installations may also relieve congestion, which the ISO should take into account when proposing locations. In this context, an efficient WF placement method is proposed in order to reduce the burden on congested lines. Since wind speed is a random variable and load forecasts also contain uncertainties, probabilistic approaches are used for this type of study. An AC probabilistic optimal power flow (P-OPF) is formulated and solved using Monte Carlo simulation (MCS). To reduce the computation time, point estimate methods (PEM) are introduced as an efficient alternative to time-demanding MCS. Subsequently, the optimal WF placement is determined using generation shift distribution factors (GSDF) together with a new parameter, the wind availability factor (WAF). To obtain more realistic results, N-1 contingency analysis is employed to find the optimal WF size by means of line outage distribution factors (LODF). The IEEE 30-bus test system is used to demonstrate and compare the accuracy of the proposed methodology.
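
The probabilistic ingredients can be illustrated compactly: wind speed sampled from a Weibull distribution, converted to turbine output by a piecewise power curve, and combined with Gaussian load-forecast error. All parameters below (Weibull shape/scale, cut-in/rated/cut-out speeds) are assumptions for illustration; the paper embeds such sampling in an AC P-OPF.

    # Monte Carlo sampling of net load under wind and load uncertainty.
    import numpy as np

    rng = np.random.default_rng(1)
    v = rng.weibull(2.0, 10_000) * 8.0               # wind speeds [m/s]

    def turbine_power(v, cut_in=3.0, rated_v=12.0, cut_out=25.0, rated_p=2.0):
        ramp = rated_p * (v - cut_in) / (rated_v - cut_in)
        p = np.where((v >= cut_in) & (v < rated_v), ramp, 0.0)
        return np.where((v >= rated_v) & (v <= cut_out), rated_p, p)   # MW

    load = rng.normal(50.0, 2.0, v.size)             # 50 MW forecast, sigma 2 MW
    net = load - turbine_power(v)
    print(net.mean(), np.percentile(net, 95))        # inputs to line-flow studies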

Evaluating Refactoring with a Quality Index

The aim of every software product is to achieve an appropriate level of software quality. Developers and designers try to produce readable, reliable, maintainable, reusable and testable code, and several approaches have been devised to help achieve these goals. In this paper, refactoring is evaluated with a quality index composed of different metric sets, each describing a particular quality aspect.
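
As a hedged sketch of what such an index can look like, consider a weighted sum of normalized metric values compared before and after refactoring. The metric names, weights and normalization below are assumptions; the paper composes its index from its own metric sets.

    # Toy quality index: weighted sum of normalized metrics in [0, 1].
    def quality_index(metrics, weights):
        return sum(weights[m] * v for m, v in metrics.items())

    before = {"readability": 0.6, "maintainability": 0.5, "testability": 0.4}
    after  = {"readability": 0.8, "maintainability": 0.7, "testability": 0.6}
    w = {"readability": 0.4, "maintainability": 0.4, "testability": 0.2}
    print(quality_index(after, w) - quality_index(before, w))   # > 0: improvement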

Low-Dimensional Representation of Dorsal Hand Vein Features Using Principal Component Analysis (PCA)

The quest to provide more secure identification systems has led to a rise in the development of biometric systems. The dorsal hand vein pattern is an emerging biometric that has lately attracted the attention of many researchers. Different approaches have been used to extract vein patterns and match them. In this work, Principal Component Analysis (PCA), a method that has been successfully applied to human faces and hand geometry, is applied to the dorsal hand vein pattern. PCA is used to obtain eigenveins, a low-dimensional representation of the vein pattern features. Low-cost CCD cameras were used to obtain the vein images. The vein pattern was extracted by applying morphology, and noise reduction filters were applied to enhance the patterns. The system has been successfully tested on a database of 200 images using a threshold value of 0.9, with encouraging results.
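
The eigenvein idea can be sketched compactly: flatten the vein images, project them onto the leading principal components, and declare a match when the correlation of two projections exceeds the 0.9 threshold mentioned above. The use of scikit-learn, the image shapes and the component count are assumptions for illustration, not the authors' implementation.

    # Eigenvein sketch: PCA projection plus correlation-based matching.
    import numpy as np
    from sklearn.decomposition import PCA

    def fit_eigenveins(images, n_components=20):     # images: (n, h, w) array
        X = images.reshape(len(images), -1).astype(float)
        pca = PCA(n_components=n_components).fit(X)
        return pca, pca.transform(X)                 # model + gallery features

    def match(pca, gallery, probe, threshold=0.9):
        f = pca.transform(probe.reshape(1, -1).astype(float))[0]
        sims = [np.corrcoef(f, g)[0, 1] for g in gallery]
        best = int(np.argmax(sims))
        return (best if sims[best] >= threshold else None), sims[best]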

A Comparison between Heuristic and Meta-Heuristic Methods for Solving the Multiple Traveling Salesman Problem

The multiple traveling salesman problem (mTSP) can be used to model many practical problems. The mTSP is more complicated than the traveling salesman problem (TSP) because it requires determining which cities to assign to each salesman, as well as the optimal ordering of the cities within each salesman's tour. Previous studies have proposed genetic algorithms (GA), integer programming (IP) and several neural network (NN) approaches for solving the mTSP. This paper compares results for the mTSP solved with a genetic algorithm (GA) and with the nearest neighbour algorithm (NNA). The cities are first clustered into groups using the k-means clustering technique, with the number of groups equal to the number of salesmen; each group is then solved as an independent TSP with the NNA and the GA. It is found that k-means clustering with the NNA is superior to the GA in terms of both performance (evaluated by the fitness function) and computing time.
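
The cluster-first strategy is easy to sketch: k-means splits the cities into one group per salesman, and each group is toured greedily with the nearest-neighbour rule. The use of scikit-learn's KMeans and the common depot are assumed conveniences; the GA baseline from the comparison is not shown.

    # Cluster-first mTSP sketch: k-means grouping + nearest-neighbour tours.
    import numpy as np
    from sklearn.cluster import KMeans

    def mtsp_nna(cities, n_salesmen, depot=0):
        labels = KMeans(n_clusters=n_salesmen, n_init=10).fit_predict(cities)
        tours = []
        for k in range(n_salesmen):
            todo = list(np.flatnonzero(labels == k))
            tour, here = [], cities[depot]
            while todo:                              # greedy nearest neighbour
                nxt = min(todo, key=lambda i: np.hypot(*(cities[i] - here)))
                tour.append(nxt)
                here = cities[nxt]
                todo.remove(nxt)
            tours.append(tour)
        return tours

    cities = np.random.default_rng(2).random((30, 2))
    print(mtsp_nna(cities, n_salesmen=3))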

A Game Design Framework for Vocational Education

Serious games have proven to be a useful instrument for engaging learners and increasing motivation. Nevertheless, a broadly accepted, practical instructional design approach to serious games does not exist. In this paper, we introduce the use of an instructional design model that has not yet been applied to serious games and that has some advantages over other design approaches. We present a case from mechanics and mechatronics education to illustrate the close match between the timing and role of knowledge and information that the instructional design model prescribes and how this has been translated into a rigidly structured game design. The structured approach answers the target group's need for applicable knowledge. It combines the advantages of simulations with the strengths of entertainment games to foster learner motivation in the best possible way. A prototype of the game will be evaluated with a well-respected evaluation method in an advanced test setting that includes test and control groups.

A Quality-Oriented Approach toward Strategic Positioning in Higher Education Institutions

Positioning the organization within the strategic environment of its industry is one of the first and most important phases of organizational strategic planning, and in today's knowledge-based economy its importance is amplified for higher education institutes, the centers of education, knowledge creation and knowledge-worker training. To date, various models with diverse approaches have been applied to investigate organizations' strategic positions in different industries. Given the essential importance and strategic role of quality in higher education institutes, this study suggests a quality-oriented approach to positioning them in their strategic environment. The European Foundation for Quality Management (EFQM) model is then adopted to position the top Iranian business schools in their strategic environment. The results of this study can be used in the strategic planning of these institutes as well as of other Iranian business schools.