Topology Preservation in SOM

The SOM has several beneficial features that make it a useful method for data mining. One of the most important is its ability to preserve topology in the projection. Several measures can be used to quantify the goodness of the map and to obtain an optimal projection, including the average quantization error and various topological errors. Much research has addressed how topology preservation should be measured. One option is the topographic error, which considers the ratio of data vectors for which the first and second best matching units (BMUs) are not adjacent. In this work we present a study of the behaviour of the topographic error on different kinds of maps. We have found that this error penalizes rectangular maps, and we have studied the reasons why this happens. Finally, we suggest a new topological error that remedies this deficiency of the topographic error.
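The topographic error named above has a direct computational form: for each data vector, find its two best matching units and check whether they are adjacent on the map lattice. A minimal Python sketch under that definition (function and variable names, and the 8-neighbour adjacency convention, are our own illustration, not from the paper):

```python
import numpy as np

def topographic_error(data, weights, grid):
    """Fraction of data vectors whose two best matching units
    are not adjacent on the map lattice.

    data    : (n_samples, dim) input vectors
    weights : (n_units, dim) codebook vectors
    grid    : (n_units, 2) lattice coordinates of each unit
    """
    errors = 0
    for x in data:
        # distances from x to every codebook vector
        d = np.linalg.norm(weights - x, axis=1)
        first, second = np.argsort(d)[:2]
        # adjacent here means within one lattice step (8-neighbourhood)
        if np.abs(grid[first] - grid[second]).max() > 1:
            errors += 1
    return errors / len(data)
```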

LFC Design of a Deregulated Power System with TCPS Using PSO

In the LFC problem, the interconnections among areas are the path through which disturbances enter, and it is therefore important to suppress these disturbances through the coordination of governor systems. In contrast, tie-line power flow control by a TCPS located between two areas makes it possible to actively stabilize system frequency oscillations through the interconnection, and is also expected to provide a new ancillary service for future power systems. Thus, a control strategy based on controlling the phase angle of the TCPS is proposed in this paper to provide active control of the system frequency. In addition, the robust adjustment of the PID controller's parameters under a bilateral contracted scenario following large step load demands and disturbances, with and without the TCPS, is investigated using Particle Swarm Optimization (PSO), which has a strong ability to find near-optimal solutions. The newly developed control strategy combines the advantages of PSO and the TCPS and has a simple structure that is easy to implement and tune. To demonstrate the effectiveness of the proposed control strategy, a three-area restructured power system is considered as a test system under different operating conditions and system nonlinearities. The analysis reveals that the TCPS is quite capable of suppressing frequency and tie-line power oscillations effectively, compared with the response obtained without the TCPS, for a wide range of plant parameter changes, area load demands and disturbances, even in the presence of system nonlinearities.
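As a rough sketch of how PSO can tune PID gains in this setting, assume a user-supplied cost function (for example, an integral-of-absolute-frequency-error score returned by a simulation of the test system); the `cost` callable, the bounds and all hyperparameters below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def pso_tune_pid(cost, bounds, n_particles=30, n_iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Tune [Kp, Ki, Kd] by minimizing cost(gains) with standard PSO."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds).T                      # bounds: [(lo, hi)] * 3
    x = rng.uniform(lo, hi, (n_particles, 3))          # positions (gain triples)
    v = np.zeros_like(x)                               # velocities
    pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
    g = pbest[pbest_cost.argmin()]                     # global best gains
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, 3))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        c = np.array([cost(p) for p in x])
        better = c < pbest_cost
        pbest[better], pbest_cost[better] = x[better], c[better]
        g = pbest[pbest_cost.argmin()]
    return g, pbest_cost.min()
```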

Dynamic Variational Multiscale LES of Bluff Body Flows on Unstructured Grids

The effects of dynamic subgrid-scale (SGS) models are investigated in variational multiscale (VMS) LES simulations of bluff-body flows. The spatial discretization is based on a mixed finite element/finite volume formulation on unstructured grids. In the VMS approach used in this work, the separation between the largest and the smallest resolved scales is obtained through a variational projection operator and finite volume cell agglomeration. Dynamic versions of the Smagorinsky and WALE SGS models are used to account for the effects of the unresolved scales; in the VMS approach, these effects are modeled only in the smallest resolved scales. The dynamic VMS-LES approach is applied to the simulation of the flow around a circular cylinder at Reynolds numbers 3900 and 20000 and of the flow around a square cylinder at Reynolds numbers 22000 and 175000. As in previous studies, it is observed that the dynamic SGS procedure has a smaller impact on the results within the VMS approach than in classical LES. However, improvements are demonstrated for important features such as the recirculating part of the flow. The global prediction is improved at a small extra computational cost.
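For orientation, the base Smagorinsky model referred to above closes the SGS stress with an eddy viscosity nu_t = (Cs * Delta)^2 |S|; the dynamic procedure replaces the fixed constant Cs by a locally computed one, and the VMS restriction applies the result only to the smallest resolved scales. A minimal sketch of the static form on a uniform 2D grid (the dynamic Germano-identity computation and the VMS scale restriction are omitted; all names are illustrative):

```python
import numpy as np

def smagorinsky_nu_t(u, v, dx, Cs=0.17):
    """Static Smagorinsky eddy viscosity on a uniform 2D grid:
    nu_t = (Cs * dx)**2 * |S|, with |S| = sqrt(2 S_ij S_ij).
    Axis 0 is taken as x, axis 1 as y."""
    dudx, dudy = np.gradient(u, dx, dx)
    dvdx, dvdy = np.gradient(v, dx, dx)
    S11, S22 = dudx, dvdy
    S12 = 0.5 * (dudy + dvdx)          # symmetric part of the velocity gradient
    S_mag = np.sqrt(2.0 * (S11**2 + S22**2 + 2.0 * S12**2))
    return (Cs * dx) ** 2 * S_mag
```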

Hierarchies Based On the Number of Cooperating Systems of Finite Automata on Four-Dimensional Input Tapes

In theoretical computer science, the Turing machine has played a number of important roles in understanding and exploiting basic concepts and mechanisms in computing and information processing [20]. It is a simple mathematical model of computers [9]. Subsequently, M. Blum and C. Hewitt first proposed two-dimensional automata as a computational model of two-dimensional pattern processing, and investigated their pattern recognition abilities in 1967 [7]. Since then, many researchers in this field have investigated the properties of automata on two- or three-dimensional tapes. On the other hand, the question of whether processing four-dimensional digital patterns is much more difficult than processing two- or three-dimensional ones is of great interest from both theoretical and practical standpoints. Thus, the study of four-dimensional automata as a computational model of four-dimensional pattern processing has been meaningful [8]-[19],[21]. This paper introduces a cooperating system of four-dimensional finite automata as one model of four-dimensional automata. Such a system consists of a finite number of four-dimensional finite automata and a four-dimensional input tape on which these finite automata work independently (in parallel). Finite automata whose input heads scan the same cell of the input tape can communicate with each other; that is, every finite automaton is allowed to know the internal states of the other finite automata on the cell it is scanning at the moment. In this paper, we mainly investigate the accepting powers of cooperating systems of eight-way and seven-way four-dimensional finite automata. A seven-way four-dimensional finite automaton is an eight-way four-dimensional finite automaton whose input head can move east, west, south, north, up, or down, or into the future, but not into the past, on a four-dimensional input tape.
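To make the communication rule concrete, here is a toy Python sketch of the model as described: several finite automata share one 4D tape, and each transition may depend on the states of the automata currently scanning the same cell. All names and the transition-function signature are our own illustration of the definition, not notation from the paper:

```python
from collections import defaultdict

# The eight moves of an eight-way automaton: +/- along x, y, z and time;
# a seven-way automaton simply never uses the (0, 0, 0, -1) move.
MOVES = [(1,0,0,0), (-1,0,0,0), (0,1,0,0), (0,-1,0,0),
         (0,0,1,0), (0,0,-1,0), (0,0,0,1), (0,0,0,-1)]

def step(tape, automata, delta):
    """One synchronous step of a cooperating system.

    tape     : dict mapping 4D positions to input symbols
    automata : list of (state, position) pairs
    delta    : transition function
               (index, state, symbol, co_located_states) -> (state, move)
    """
    # Group automata by the cell their heads currently scan.
    by_cell = defaultdict(list)
    for i, (q, pos) in enumerate(automata):
        by_cell[pos].append((i, q))
    new = list(automata)
    for i, (q, pos) in enumerate(automata):
        others = tuple(s for j, s in by_cell[pos] if j != i)  # visible states
        q2, move = delta(i, q, tape.get(pos), others)
        new[i] = (q2, tuple(p + m for p, m in zip(pos, move)))
    return new
```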

Mining Frequent Patterns with Functional Programming

Frequent patterns are patterns, such as sets of features or items, that appear in data frequently. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a dataset. Most of the proposed frequent pattern mining algorithms have been implemented in imperative programming languages such as C, C++ and Java. The imperative paradigm becomes significantly inefficient when the itemset is large and the frequent patterns are long. We suggest a high-level declarative style of programming using a functional language. Our supposition is that the problem of frequent pattern discovery can be efficiently and concisely implemented via a functional paradigm, since pattern matching is a fundamental feature supported by most functional languages. Our frequent pattern mining implementation in the Haskell language confirms our hypothesis about the conciseness of the program. Performance studies on speed and memory usage support our intuition about the efficiency of a functional language.
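The paper's implementation is in Haskell; as a language-neutral illustration of the underlying task, here is a compact Apriori-style level-wise sketch in Python (names and structure are ours, and the candidate pruning step is omitted for brevity):

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Level-wise (Apriori-style) frequent itemset mining.

    transactions : list of sets of items
    min_support  : minimum number of transactions an itemset must occur in
    """
    def support(itemset):
        return sum(1 for t in transactions if itemset <= t)

    items = {frozenset([i]) for t in transactions for i in t}
    level = {s for s in items if support(s) >= min_support}
    result, k = set(level), 2
    while level:
        # join step: size-k candidates from frequent sets of size k-1
        candidates = {a | b for a, b in combinations(level, 2) if len(a | b) == k}
        level = {c for c in candidates if support(c) >= min_support}
        result |= level
        k += 1
    return result
```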

Mathematical Modeling of Storm Surge in Three Dimensional Primitive Equations

Mathematical modeling of storm surge in sea and coastal regions such as the South China Sea (SCS) and the Gulf of Thailand (GoT) is important for studying typhoon characteristics. Storm surge causes inundation at the lateral boundary in coastal zones, particularly in the GoT and parts of the SCS. Model simulations with the three-dimensional primitive equations at high resolution are important for protecting local property and human life from typhoon surges. In the present study, the mathematical model is used to simulate the typhoon-induced surges in three case studies of Typhoon Linda (1997). The simulation results at the tide gauge stations can describe the characteristics of the storm surges in the coastal zones.
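For reference, a standard textbook form of the hydrostatic primitive equations on which such surge models are built is given below; the paper's exact formulation, forcing terms and boundary conditions may differ:

```latex
\begin{aligned}
\frac{\partial u}{\partial t} + \mathbf{u}\cdot\nabla u - fv
  &= -\frac{1}{\rho_0}\frac{\partial p}{\partial x}
     + \frac{\partial}{\partial z}\!\left(K_m \frac{\partial u}{\partial z}\right)\\
\frac{\partial v}{\partial t} + \mathbf{u}\cdot\nabla v + fu
  &= -\frac{1}{\rho_0}\frac{\partial p}{\partial y}
     + \frac{\partial}{\partial z}\!\left(K_m \frac{\partial v}{\partial z}\right)\\
\frac{\partial p}{\partial z} &= -\rho g \quad\text{(hydrostatic balance)}\\
\frac{\partial u}{\partial x} + \frac{\partial v}{\partial y}
  + \frac{\partial w}{\partial z} &= 0 \quad\text{(continuity)}
\end{aligned}
```

In this standard setup, the typhoon enters through the surface wind-stress boundary condition, K_m du/dz = tau_s / rho_0 at the free surface, so that the wind field of the storm drives the surge.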

Drum-Buffer-Rope: The Technique to Plan and Control the Production Using Theory of Constraints

The Theory of Constraints (TOC) has been emerging as an important tool for the optimization of manufacturing and service systems. Goldratt, in his first book "The Goal", introduced the Theory of Constraints and its applications in a factory scenario. A large number of production managers around the globe read the book, but only a few could implement it in their plants because the book did not explain the steps needed to implement TOC in a factory. To overcome this limitation, Goldratt wrote another book explaining TOC, Drum-Buffer-Rope (DBR) and the method to implement them. In this paper, an attempt has been made to summarize the salient features of TOC and DBR described in that book and the correct approach to implementing TOC in a factory setting. The simulator supplied with the book was used by the authors, and Goldratt's claim that DBR and buffer management ease the work of production managers was tested and found to be correct.

Photograph Based Pair-matching Recognition of Human Faces

In this paper, a novel system for the pair-matching recognition of human faces from different color photographs is proposed. It mainly consists of face detection, normalization and recognition. First, a method combining Haar-like face detection with region-based histogram stretching (RHST) segmentation is proposed to achieve more accurate performance than using Haar-like detection alone. Apart from an effective angle normalization, a side-face (pose) normalization, which might be important and beneficial for the preprocessing stage, is introduced. Then histogram-based and photometric normalization methods are investigated, and adaptive single-scale retinex (ASR) is selected for its satisfactory illumination normalization. Finally, a weighted multi-block local binary pattern (LBP) with three distance measures is applied for pair-matching recognition. Experimental results show its advantageous performance compared with PCA and multi-block LBP.
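A minimal sketch of the basic LBP descriptor underlying the multi-block variant named above: each pixel is coded by thresholding its 8 neighbours against it, and the image is described by concatenated histograms of these codes over blocks. In the weighted multi-block scheme, per-block weights would multiply the per-block histogram distances; all names below are illustrative, not the paper's:

```python
import numpy as np

def lbp_codes(img):
    """8-neighbour LBP code for every interior pixel of a grayscale image."""
    c = img[1:-1, 1:-1]
    code = np.zeros_like(c, dtype=np.uint8)
    # offsets of the 8 neighbours, each contributing one bit of the code
    offsets = [(-1,-1), (-1,0), (-1,1), (0,1), (1,1), (1,0), (1,-1), (0,-1)]
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1+dy : img.shape[0]-1+dy, 1+dx : img.shape[1]-1+dx]
        code |= ((nb >= c).astype(np.uint8) << bit)
    return code

def block_histograms(codes, grid=(4, 4)):
    """Concatenate normalized 256-bin LBP histograms over a grid of blocks."""
    H, W = codes.shape
    hs = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            block = codes[i*H//grid[0]:(i+1)*H//grid[0],
                          j*W//grid[1]:(j+1)*W//grid[1]]
            h = np.bincount(block.ravel(), minlength=256).astype(float)
            hs.append(h / max(h.sum(), 1.0))
    return np.concatenate(hs)
```

Two photographs are then pair-matched by comparing their descriptor vectors with a histogram distance such as chi-square, histogram intersection or Euclidean distance.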

Visual Hull with Imprecise Input

Imprecision is a long-standing problem in CAD design and in high-accuracy image-based reconstruction applications. The visual hull, which is the closed silhouette-equivalent shape of the objects of interest, is an important concept in image-based reconstruction. We extend the domain-theoretic framework, a robust geometric model that captures imprecision, to analyze the imprecision in the output shape when the input vertices are given imprecisely. Within this framework, we give an efficient algorithm to generate the 2D partial visual hull, which represents the exact information about the visual hull under only basic imprecision assumptions. We also show how the visual-hull-from-polyhedra problem can be efficiently solved in the context of imprecise input.
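To recall the basic construction (with precise input and none of the paper's imprecision handling): a point belongs to the visual hull iff its projection lies inside every silhouette. A grid-based 2D sketch, with illustrative names and a caller-supplied projection function:

```python
import numpy as np

def visual_hull_2d(silhouettes, project, xs, ys):
    """Approximate a 2D visual hull on a grid by silhouette carving.

    silhouettes : list of boolean 1D arrays, one per camera (image masks)
    project     : project(cam_index, points) -> integer pixel index per point
    xs, ys      : 1D arrays defining the sampling grid
    A grid point survives iff it projects inside every silhouette.
    """
    X, Y = np.meshgrid(xs, ys)
    pts = np.stack([X.ravel(), Y.ravel()], axis=1)
    inside = np.ones(len(pts), dtype=bool)
    for cam, sil in enumerate(silhouettes):
        px = project(cam, pts)                      # pixel index per point
        valid = (px >= 0) & (px < len(sil))
        inside &= np.where(valid, sil[np.clip(px, 0, len(sil) - 1)], False)
    return inside.reshape(X.shape)                  # boolean hull mask
```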

Development of Coronal Field and Solar Wind Components for MHD Interplanetary Simulations

The connection between solar activity and adverse phenomena in the Earth's environment that can affect space- and ground-based technologies has spurred interest in Space Weather (SW) research. A great effort has been put into the development of suitable models that can provide advance forecasts of SW events. With the progress in computational technology, it is becoming possible to develop operational large-scale physics-based models which incorporate the most important physical processes and domains of the Sun-Earth system. In order to enhance our SW prediction capabilities, we are developing advanced numerical tools. With operational requirements in mind, our goal is to develop a modular simulation framework for the propagation of disturbances from the Sun through interplanetary space to the Earth. Here we report and discuss the development of the coronal field and solar wind components for a large-scale MHD code. The model for these components is based on a potential field source surface (PFSS) model and an empirical Wang-Sheeley-Arge (WSA) solar wind relation.
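As a pointer to what the empirical component computes: Wang-Sheeley-type relations map the flux-tube expansion factor f_s of a field line traced through the PFSS model to a solar wind speed at the source surface. One early published form is v(f_s) = 267.5 + 410 / f_s^{2/5} km/s (Arge and Pizzo, 2000); the coefficients and functional form actually used in the paper's component may differ, so the sketch below is only indicative:

```python
def wsa_speed(fs):
    """Empirical Wang-Sheeley-type solar wind speed (km/s) from the
    flux-tube expansion factor fs of a PFSS-traced field line.
    Coefficients follow one published form (Arge & Pizzo, 2000);
    operational implementations typically retune them."""
    return 267.5 + 410.0 / fs ** (2.0 / 5.0)
```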

Application of Lattice Boltzmann Methods in Heat and Moisture Transfer in Frozen Soil

Although water accounts for only a small percentage of the total mass of soil, it plays an important role in the strength of the soil structure. Moisture transfer can occur through many different mechanisms, which may involve heat and mass transfer, thermodynamic phase change, and the interplay of various forces such as viscous, buoyancy, and capillary forces. Continuum models are not well suited to describing phenomena in which the connectivity of the pore space or the fracture network, or that of a fluid phase, plays a major role. Lattice Boltzmann methods (LBMs), however, are especially well suited to simulating flows around complex geometries. Lattice Boltzmann methods were initially developed for solving fluid flows; more recently, multicomponent fluids and phase change have also been incorporated into the equations. By comparing the numerical results with experimental results, the Lattice Boltzmann method with phase change will be optimized.
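A minimal sketch of the single-relaxation-time (BGK) lattice Boltzmann update on the standard D2Q9 lattice, which is the usual starting point before the multicomponent and phase-change extensions mentioned above are added (plain isothermal flow with periodic boundaries; all names are illustrative):

```python
import numpy as np

# D2Q9 lattice: discrete velocities and their weights
c = np.array([[0,0], [1,0], [0,1], [-1,0], [0,-1],
              [1,1], [-1,1], [-1,-1], [1,-1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, ux, uy):
    """Discrete Maxwellian equilibrium distributions f_eq_i(rho, u)."""
    cu = c[:, 0, None, None]*ux + c[:, 1, None, None]*uy     # c_i . u
    usq = ux**2 + uy**2
    return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def bgk_step(f, tau):
    """One collide-and-stream step; f has shape (9, Nx, Ny)."""
    rho = f.sum(axis=0)                                       # density
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho          # momentum / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f = f - (f - equilibrium(rho, ux, uy)) / tau              # BGK collision
    for i in range(9):                                        # streaming
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
    return f
```

Heat transfer and phase change are typically handled by coupling a second distribution function for temperature (or enthalpy) to the same lattice.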

Sustainability Policies and Corporate Social Responsibility (CSR): Ergonomics Contribution Regarding Work in Companies

The growing importance of sustainability in corporate policies represents a great opportunity for workers to gain more consideration, with great benefits to their well-being. Sustainable work is believed to be work which improves the organization's performance and fosters professional development as well as workers' health. In a multiple case study based on document research, information was sought about work activities and their sustainability or corporate social responsibility (CSR) policies, as disseminated by corporations. All the companies devoted attention to work activities and delivered a good amount of information about them. Nevertheless, the information presented was generic; all the actions developed were top-down and there was no information about the impact of changes aimed at sustainability on the workers' activities. It was found that the companies seemed to be at an early stage. In the future, they need to show more commitment through concrete goals: they must be aware that workers contribute directly to the corporations' sustainability. This would allow room for Ergonomics and Work Psychodynamics to be incorporated and to be useful for both companies and society, so as to promote and ensure work sustainability.

Modeling the Effects of Type and Intensity of Selective Logging on Forests of the Amazon

The aim of the work presented here was either to use existing forest dynamics simulation models or to calibrate a new one, in both cases within the SYMFOR framework, with the purpose of examining changes in stand-level basal area and functional composition in response to selective logging, considering trees > 10 cm d.b.h., for two areas of undisturbed Amazonian non-flooded tropical forest in Brazil and one in Peru. The biological realism of the models was evaluated for forest in the undisturbed and selectively logged states, and it was concluded that forest dynamics were realistically represented. The results of the logging simulation experiments showed that, relative to a simulation of undisturbed forest subject to no harvesting intervention, there was a significant amount of change over a 90-year simulation period, positively proportional to the intensity of logging. Areas which, in the dynamic equilibrium of undisturbed forest, had a greater proportion of the ecological guild of trees known as the light hardwoods (LHWs) seemed to respond more favourably, in the sense of deviating less, but only within a specific range of baseline forest composition, beyond which compositional diversity became more important. These findings are partially in line with practical management experience and partially with basic systematics theory, respectively.

Measuring the Comprehensibility of a UML-B Model and a B Model

Software maintenance, which involves making enhancements, modifications and corrections to existing software systems, consumes more than half of developer time. Specification comprehensibility plays an important role in software maintenance, as it permits the properties of the system to be understood more easily and quickly. The use of a formal notation such as B increases a specification's precision and consistency. However, the notation is regarded as difficult to comprehend. A semi-formal notation such as the Unified Modelling Language (UML) is perceived as more accessible, but it lacks formality. Perhaps combining both notations could produce a specification that is not only accurate and consistent but also accessible to users. This paper presents an experiment conducted on a model that integrates the use of both UML and B notations, namely UML-B, versus a B model alone. The objective of the experiment was to evaluate the comprehensibility of a UML-B model compared to a traditional B model. The measurement used in the experiment focused on the efficiency in performing the comprehension tasks. The experiment employed a cross-over design and was conducted on forty-one subjects, including undergraduate and master's students. The results show that the notation used in the UML-B model is more comprehensible than that of the B model.

Investigating Daylight Quality in Malaysian Government Office Buildings Through Daylight Factor and Surface Luminance

In recent years, there has been increasing interest in using daylight to save energy in buildings. In tropical regions, daylighting is always an energy saver, and daylight also provides visual comfort. Standards show that many criteria should be taken into consideration in order to achieve both daylight utilization and visual comfort. The current standard in Malaysia, MS 1525, does not provide sufficient guidance; hence, more research is needed on daylight performance. If architects do not consider daylight in their designs, this not only causes inconvenience in working spaces but also leads to higher energy consumption and environmental pollution. This research surveyed daylight performance in five selected office buildings from different areas of Malaysia using an experimental method. Several parameters of daylight quality, such as the daylight factor, surface luminance and surface luminance ratio, were measured in different rooms in each building. The results demonstrate that most of the buildings were not designed for daylight utilization. It is therefore very important that architects follow daylight design recommendations to reduce the consumption of electric power for artificial lighting while sufficient daylight quality is available.
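The key measured quantities here have simple definitions: the daylight factor is the ratio of the indoor illuminance at a point to the simultaneous outdoor illuminance under an unobstructed overcast sky, expressed as a percentage, and the surface luminance ratio compares the luminance of a task area to that of its surroundings. A trivial sketch with illustrative names:

```python
def daylight_factor(e_indoor_lux, e_outdoor_lux):
    """Daylight factor DF (%) = 100 * E_indoor / E_outdoor,
    both illuminances measured simultaneously under an overcast sky."""
    return 100.0 * e_indoor_lux / e_outdoor_lux

def luminance_ratio(l_task, l_surround):
    """Surface luminance ratio between a task area and its surroundings."""
    return l_task / l_surround
```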

Method for Determining the Probing Points for Efficient Measurement of Freeform Surface

In inspection and workpiece localization, the sampling of point data is an important issue. Since measurement devices sample only discrete points, not the complete surface, the number and location of the sampled points must be taken into consideration. In this paper a method is presented for determining the number and location of the sampled points to achieve efficient sampling. Firstly, an uncertainty analysis of the localization parameters is carried out, and a localization uncertainty model is developed to predict the uncertainty of the localization process. Using this model, the minimum number of sampled points is predicted. Secondly, based on algebraic theory, an eigenvalue-optimal optimization is proposed. A freeform surface is then used in a simulation in which the proposed optimization is implemented. The simulation results show its effectiveness.
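One common way to realize an eigenvalue-based criterion of this kind, sketched here as a general idea rather than the paper's exact formulation: each candidate point with position p and unit surface normal n contributes a row [n, p x n] to the rigid-localization Jacobian, and points are chosen greedily to maximize the smallest eigenvalue of J^T J (all names are illustrative):

```python
import numpy as np

def greedy_eigen_optimal(points, normals, k):
    """Pick k probing points that greedily maximize the minimum eigenvalue
    of J^T J, where each point contributes the row [n, p x n].
    The trace is used as a tie-break while J^T J is still rank-deficient."""
    rows = np.hstack([normals, np.cross(points, normals)])   # (N, 6)
    chosen, M = [], np.zeros((6, 6))
    for _ in range(k):
        best, best_val = None, (-np.inf, -np.inf)
        for i in range(len(rows)):
            if i in chosen:
                continue
            Mi = M + np.outer(rows[i], rows[i])
            val = (np.linalg.eigvalsh(Mi)[0], np.trace(Mi))  # (min eig, trace)
            if val > best_val:
                best, best_val = i, val
        chosen.append(best)
        M += np.outer(rows[best], rows[best])
    return chosen
```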

A Large-Eddy Simulation of Vortex Cell flow with Incoming Turbulent Boundary Layer

We present a large-eddy simulation of a vortex cell of circular shape with an incoming turbulent boundary layer. The results show that the flow field can be subdivided into four important zones: the shear layer above the cavity, the stagnation zone, the vortex core in the cavity and the boundary layer along the wall of the cavity. It is shown that the vortex core consists of solid-body rotation without much turbulence activity. The vortex is mainly driven by high-energy packets that enter the cavity from the stagnation point region and by entrainment of fluid from the cavity into the shear layer. The physics in the boundary layer along the cavity's wall seems to be far from that of a canonical boundary layer, which might be a crucial point for modelling this flow.

Complex-Valued Neural Network in Image Recognition: A Study on the Effectiveness of Radial Basis Function

A complex-valued neural network is a neural network which consists of complex-valued inputs and/or weights and/or thresholds and/or activation functions. Complex-valued neural networks have been widening the scope of applications not only in electronics and informatics, but also in social systems. One of the most important applications of the complex-valued neural network is in image and vision processing. In neural networks, radial basis functions are often used for interpolation in multidimensional space. A radial basis function is a function which has a distance criterion with respect to a centre built into it. Radial basis functions have often been applied in neural networks, where they may be used as a replacement for the sigmoid transfer characteristic of the hidden layer in multi-layer perceptrons. This paper aims to present exhaustive results of using RBF units in a complex-valued neural network model that uses the back-propagation algorithm (called 'Complex-BP') for learning. Our experimental results demonstrate the effectiveness of a radial basis function in a complex-valued neural network for image recognition over a real-valued neural network. We have studied and reported various observations, such as the effect of learning rates, the ranges of the randomly selected initial weights, the error functions used and the number of iterations needed for the convergence of error in the neural network model with RBF units. Some inherent properties of this complex back-propagation algorithm are also studied and discussed.
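A minimal sketch of a forward pass through an RBF hidden layer with complex-valued inputs, centres and output weights, as one plausible reading of the architecture described; the paper's exact unit definition and the Complex-BP learning rule may differ:

```python
import numpy as np

def complex_rbf_forward(x, centres, widths, w_out):
    """Forward pass of one complex-valued RBF network.

    x       : (d,) complex input vector
    centres : (m, d) complex RBF centres
    widths  : (m,) real spreads
    w_out   : (m, k) complex output weights
    """
    # Gaussian activation of the complex distance |x - c| (a real number)
    dist2 = np.sum(np.abs(x - centres) ** 2, axis=1)
    h = np.exp(-dist2 / (2.0 * widths ** 2))    # (m,) real hidden activations
    return h @ w_out                            # (k,) complex outputs
```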

A Soft Set based Group Decision Making Method with Criteria Weight

Molodtsov's soft set theory was originally proposed as a general mathematical tool for dealing with uncertainty problems. The matrix form has been introduced for soft sets and some of its properties have been discussed. However, existing formulations of soft matrices in group decision making problems assume equal importance weights for the criteria, which does not reflect the true opinion of the decision maker on each criterion. The aim of this paper is to propose a method for solving group decision making problems that incorporates the importance of the criteria by using soft matrices in a more objective manner. The weight of each criterion is calculated using the Analytic Hierarchy Process (AHP). An example of a house selection process is given to illustrate the effectiveness of the proposed method.
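The AHP step referred to above has a standard computation: the criteria weights are the normalized principal eigenvector of a reciprocal pairwise-comparison matrix. A minimal sketch (the example matrix is our own illustration, not from the paper):

```python
import numpy as np

def ahp_weights(pairwise):
    """Criteria weights = normalized principal eigenvector of the
    reciprocal pairwise-comparison matrix (standard AHP)."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    principal = vecs[:, vals.real.argmax()].real   # Perron eigenvector
    w = np.abs(principal)
    return w / w.sum()

# Example: three criteria compared on Saaty's 1-9 scale (illustrative numbers)
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
print(ahp_weights(A))   # weights sum to 1, largest for the first criterion
```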

General Process Control for Intelligent Systems

The development of the intelligent assembly cell concept includes new kinds of solutions for how to create the structures of automated and flexible assembly systems. The current trend of increasing final product quality is supported by time analysis of the entire manufacturing process. The primary requirement of manufacturing is to produce as many products as quickly as possible, at the lowest possible cost, but of course with the highest quality. Such requirements can be satisfied only if all the elements entering and affecting the production cycle are in a fully functional condition. These elements consist of sensory equipment and intelligent control elements that are essential for building intelligent manufacturing systems. The intelligent behaviour of the system as a control system will rest on the monitoring of important system parameters in real time. An intelligent manufacturing system should itself be a system that can respond flexibly to changes in everything entering and exiting the process, in interaction with its surroundings.