A Thought on Exotic Statistical Distributions

Statistical distributions are used to model and explain the nature of various types of data sets. Although these distributions are mostly unimodal, it is quite common to observe multiple modes in the distribution of the underlying variables, which makes precise modeling unrealistic. Observed data may lack smoothness not only because of randomness, but also because of non-randomness, which produces zigzag curves, oscillations, humps, and so on. The present paper argues that trigonometric functions, which have so far not been used in the probability functions of distributions, have the potential to address this if incorporated into the distribution appropriately. A simple distribution involving trigonometric functions, named the Sinoform distribution, is illustrated in the paper with a data set. The paper demonstrates the importance of trigonometric functions, which have the characteristics to make statistical distributions exotic: densities with multiple modes, oscillations, and zigzag curves become possible, which may be suitable for explaining the underlying nature of selected data sets.
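
The paper's exact density is not reproduced here, but as a hedged sketch of how a trigonometric factor can produce such behavior, consider a sinusoidally modulated normal density (the symbols a and ω and the choice of the standard normal base φ are illustrative assumptions, not the paper's Sinoform form):

\[
f(x) \;=\; \varphi(x)\,\bigl(1 + a\sin(\omega x)\bigr), \qquad |a| \le 1,
\]

Since \(\varphi\) is even and \(\sin(\omega x)\) is odd, \(\int \varphi(x)\sin(\omega x)\,dx = 0\), so \(f\) integrates to one; the modulation factor stays nonnegative for \(|a| \le 1\), and for suitable \(\omega\) it creates the oscillations and multiple modes described above.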

Towards Growing Self-Organizing Neural Networks with Fixed Dimensionality

Competitive learning is an adaptive process in which the neurons in a neural network gradually become sensitive to different input pattern clusters. Competitive learning is the basic idea behind Kohonen's Self-Organizing Feature Maps (SOFM). SOFM can generate mappings from high-dimensional signal spaces to lower-dimensional topological structures. The main features of such mappings are topology preservation, feature mapping, and approximation of the probability distribution of the input patterns. To overcome some limitations of SOFM, e.g., a fixed number of neural units and a topology of fixed dimensionality, Growing Self-Organizing Neural Networks (GSONN) can be used. A GSONN can change its topological structure during learning: it grows by learning and shrinks by forgetting. To speed up training and convergence, a new variant of GSONN, twin growing cell structures (TGCS), is presented here. This paper first gives an introduction to competitive learning, SOFM, and its variants. Then we discuss GSONN with fixed dimensionality, including growing cell structures, their variants, and the author's model, TGCS. The paper ends with a comparison of test results and conclusions.
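
As background, a minimal sketch of the competitive learning step underlying SOFM is given below; the map size, learning rate, and neighborhood width are illustrative values, and the TGCS growth and forgetting rules themselves are not reproduced here.

```python
import numpy as np

# Minimal sketch of the competitive learning step underlying SOFM.
# Map size, learning rate and neighborhood width are illustrative values.

rng = np.random.default_rng(0)
data = rng.random((1000, 3))            # input patterns in a 3-D signal space
weights = rng.random((10, 10, 3))       # 10x10 grid of neurons, one weight vector each

def train_step(x, weights, lr=0.1, sigma=2.0):
    # Competition: the winner is the neuron whose weights are closest to the input.
    dists = np.linalg.norm(weights - x, axis=2)
    wi, wj = np.unravel_index(np.argmin(dists), dists.shape)
    # Cooperation: neighbors of the winner adapt too, which preserves topology.
    ii, jj = np.meshgrid(np.arange(10), np.arange(10), indexing="ij")
    h = np.exp(-((ii - wi) ** 2 + (jj - wj) ** 2) / (2 * sigma ** 2))
    # Adaptation: move weights toward the input, scaled by the neighborhood.
    weights += lr * h[..., None] * (x - weights)

for x in data:
    train_step(x, weights)
```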

Learning Factory for Changeability

Amid the constantly fluctuating conditions prevailing today, changeability represents a strategic key factor for a manufacturing company seeking success on international markets. In order to cope with turbulence and the increasing level of unpredictability, the focus here is not only on the flexible design of production systems but in particular on the employee as an enabler of change. It is important to enable employees of manufacturing companies to participate actively in change events and change decisions. To this end, the learning factory has been created; it is intended to serve the development of change-promoting competences and to sensitize employees to the necessity of change.

Problems and Possible Solutions with the Development of a Computer Model of Quantum Theory

A computer model of Quantum Theory (QT) has been developed by the author. The major goal of the computer model was to support and demonstrate as large a scope of QT as possible. This includes simulations of the major QT (Gedanken) experiments such as, for example, the famous double-slit experiment. Besides the anticipated difficulties with (1) transforming exacting mathematics into a computer program, two further types of problems showed up, namely (2) areas where QT provides a complete mathematical formalism, but when it comes to concrete applications the equations are not solvable at all, or only with extremely high effort; and (3) QT rules which are formulated in natural language and which do not seem to be translatable to precise mathematical expressions, nor to a computer program. The paper lists problems in all three categories and also describes the possible solutions or circumventions developed for the computer model.
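
As a hedged illustration of the kind of simulation mentioned above, the following is a toy amplitude-sum computation of double-slit interference; the geometry and wavelength are arbitrary illustrative values, not taken from the author's model.

```python
import numpy as np

# Toy amplitude-sum simulation of the double-slit experiment (point slits,
# monochromatic source). Geometry and wavelength are illustrative values.

wavelength = 500e-9                    # 500 nm light
d = 10e-6                              # slit separation
L = 1.0                                # slit-to-screen distance
x = np.linspace(-0.1, 0.1, 2001)       # positions on the screen

# Path lengths from each slit to every screen point.
r1 = np.sqrt(L ** 2 + (x - d / 2) ** 2)
r2 = np.sqrt(L ** 2 + (x + d / 2) ** 2)
k = 2 * np.pi / wavelength

# Core quantum rule: add the complex amplitudes, then square the magnitude.
amplitude = np.exp(1j * k * r1) + np.exp(1j * k * r2)
intensity = np.abs(amplitude) ** 2     # interference fringes across the screen
```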

Fabricating Protruded Micro-features on AA6061 Substrates by Hot Embossing Method

Metallic micro parts play an important role in the micro-fabrication industry. Recently, we demonstrated a new deformation mechanism for the micro-formability of polycrystalline materials. Different depressed micro-features smaller than the grain size have been successfully fabricated on 6061 aluminum alloy (AA6061) substrates with good fidelity. To further verify the proposed deformation mechanism, in which grain size is not a limiting factor, we demonstrate here that, in addition to depressed features, protruded micro-features can similarly be fabricated on a polycrystalline substrate.

Multi-Objective Optimization for Performance-based Seismic Retrofit using Connection Upgrade

Unanticipated brittle fractures of connections in steel moment resisting frames (SMRFs) occurred in the 1994 Northridge earthquake. Since then, research has been conducted on the vulnerability of connections in existing SMRFs and on the rehabilitation of those buildings. This paper suggests a performance-based optimal seismic retrofit technique using connection upgrades. For optimal design, a multi-objective genetic algorithm (NSGA-II) is used. One of the two objective functions is to minimize the initial cost, and the other is to minimize the lifetime seismic damage cost. The optimization proposed in this paper satisfies specified performance objectives based on FEMA 356. Nonlinear static analysis is performed for structural seismic performance evaluation. A numerical example of the SAC benchmark SMRF is provided using the proposed performance-based optimal seismic retrofit technique.
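
At the core of NSGA-II is the Pareto-dominance test between the two objectives above. A minimal sketch follows; the cost figures are invented placeholders, not the paper's results.

```python
# Pareto dominance for (initial_cost, lifetime_seismic_damage_cost), both minimized.
# The three retrofit designs and their costs are illustrative placeholders.

def dominates(a, b):
    """True if design a is at least as good as b on both objectives, better on one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

designs = {
    "retrofit_A": (1.0e6, 4.0e6),   # cheap up front, high expected damage cost
    "retrofit_B": (2.5e6, 1.5e6),   # costlier connection upgrade, lower damage cost
    "retrofit_C": (2.6e6, 3.9e6),   # dominated by B: worse on both objectives
}

pareto_front = [
    name for name, obj in designs.items()
    if not any(dominates(other, obj)
               for other_name, other in designs.items() if other_name != name)
]
print(pareto_front)   # ['retrofit_A', 'retrofit_B']
```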

Reversible, Embedded and Highly Scalable Image Compression System

In this work, a new method for low-complexity image coding is presented that permits different settings and great scalability in the generation of the final bit stream. This coding scheme is a continuous-tone still-image compression system that combines lossy and lossless compression by making use of finite-arithmetic reversible transforms. Both the color-space transform and the wavelet transform are reversible. The transformed coefficients are coded by a system based on a subdivision into smaller components (CFDS), similar to bit-importance codification. The subcomponents so obtained are reordered by a highly configurable alignment system that, depending on the application, makes it possible to rearrange the elements of the image and to obtain different importance levels from which the bit stream will be generated. The subcomponents of each importance level are coded using a variable-length entropy coding system (VBLm) that permits the generation of an embedded bit stream. This bit stream is itself a bit stream that codes a compressed still image. However, applying a packing system to the bit stream after the VBLm stage yields a final, highly scalable bit stream consisting of a base image level and one or several improvement levels.
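
As one concrete example of a finite-arithmetic reversible color transform of the kind referred to above, the integer RCT used in JPEG 2000 lossless coding is sketched below; the paper's own transform may differ.

```python
import numpy as np

# Sketch of one finite-arithmetic reversible color transform: the integer RCT
# from JPEG 2000 lossless coding. The paper's own transform may differ.

def rct_forward(r, g, b):
    y = (r + 2 * g + b) // 4           # floor division: integer arithmetic only
    cb = b - g
    cr = r - g
    return y, cb, cr

def rct_inverse(y, cb, cr):
    g = y - (cb + cr) // 4             # exactly recovers g despite the floor
    b = cb + g
    r = cr + g
    return r, g, b

orig = (np.array([200, 3]), np.array([100, 250]), np.array([50, 17]))
back = rct_inverse(*rct_forward(*orig))
assert all(np.array_equal(o, v) for o, v in zip(orig, back))   # lossless round trip
```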

COTT – A Testability Framework for Object-Oriented Software Testing

Testable software has two inherent properties: observability and controllability. Observability facilitates observation of the internal behavior of software to the required degree of detail. Controllability allows the creation of difficult-to-achieve states prior to the execution of various tests. In this paper, we describe COTT, a Controllability and Observability Testing Tool, for creating testable object-oriented software. COTT provides a framework that helps the user instrument object-oriented software to build in the required controllability and observability. During testing, the tool facilitates the creation of difficult-to-achieve states required for testing difficult-to-test conditions and the observation of internal details of execution at the unit, integration, and system levels. The execution observations are logged in a test log file, which is used for post-test analysis and to generate test coverage reports.
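
The following is a hypothetical sketch of the controllability/observability idea; the class and method names are invented for illustration and are not COTT's actual API.

```python
import logging

# Hypothetical instrumentation sketch: a controllability hook forces a
# difficult-to-achieve state, and observability hooks log internal behavior
# to a test log file for post-test analysis. Names are invented, not COTT's API.

logging.basicConfig(filename="test_log.txt", level=logging.DEBUG)

class InstrumentedAccount:
    def __init__(self, balance=0):
        self._balance = balance

    def _test_set_state(self, balance):
        # Controllability hook: force an internal state directly before a test.
        self._balance = balance
        logging.debug("state forced: balance=%s", balance)

    def withdraw(self, amount):
        # Observability hook: log internal decisions for later analysis.
        logging.debug("withdraw(%s) with balance=%s", amount, self._balance)
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount
        return self._balance

acct = InstrumentedAccount()
acct._test_set_state(-1)       # difficult-to-achieve state created directly
try:
    acct.withdraw(10)
except ValueError:
    pass                       # behavior in the forced state is now in the log
```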

Quasilinearization–Barycentric Approach for Numerical Investigation of the Boundary Value Fin Problem

In this paper, we improve the quasilinearization method using barycentric Lagrange interpolation, chosen for its numerical stability and computational speed, to achieve a stable semi-analytical solution. We then apply the improved method to solve the fin problem, a nonlinear equation that arises in heat transfer. In the quasilinearization approach, the nonlinear differential equation is treated by approximating the nonlinear terms with a sequence of linear expressions. The modified QLM is iterative but not perturbative and gives stable semi-analytical solutions to nonlinear problems without depending on the existence of a smallness parameter. Comparison with numerical solutions shows that the present solution is applicable.
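
For reference, the standard quasilinearization iteration for a nonlinear boundary value problem \(y'' = f(x, y, y')\) replaces the nonlinear equation by the linear sequence

\[
y_{k+1}'' \;=\; f(x, y_k, y_k')
 \;+\; \frac{\partial f}{\partial y}(x, y_k, y_k')\,\bigl(y_{k+1} - y_k\bigr)
 \;+\; \frac{\partial f}{\partial y'}(x, y_k, y_k')\,\bigl(y_{k+1}' - y_k'\bigr),
\]

each linear problem being solved, in the modified method above, on a barycentric Lagrange interpolation grid; the specific fin equation treated in the paper is not restated here.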

A P-SPACE Algorithm for Groebner Bases Computation in Boolean Rings

The theory of Groebner bases, which has recently been honored with the ACM Paris Kanellakis Theory and Practice Award, has become a crucial building block of computer algebra and is widely used in science, engineering, and computer science. It is well known that Groebner bases computation is EXP-SPACE in the general setting. In this paper, we give an algorithm showing that Groebner bases computation is P-SPACE in Boolean rings. We also show that, with this discovery, the Groebner bases method can theoretically be as efficient as other methods for automated verification of hardware and software. Additionally, Groebner bases have many useful and interesting properties, including the ability to efficiently convert bases between different variable orders, which makes the Groebner bases method promising for automated verification.
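
A brief sketch of why Boolean rings are special: in GF(2)[x1,...,xn]/(xi^2 - xi) every polynomial collapses to a multilinear form, which bounds the size of intermediate objects. The representation below (a polynomial as a set of monomials, each a frozenset of variables) is illustrative, not the paper's algorithm.

```python
from itertools import product

# Arithmetic in a Boolean ring: exponents never exceed 1 and coefficients
# live in GF(2), so equal monomials cancel in pairs.

def multiply(p, q):
    result = set()
    for m1, m2 in product(p, q):
        m = m1 | m2                    # x_i * x_i = x_i: exponents collapse to 1
        result ^= {m}                  # GF(2) coefficients: 1 + 1 = 0
    return result

x, y = frozenset({"x"}), frozenset({"y"})
p = {x, y}                             # p = x + y
print(multiply(p, p) == p)             # True: (x + y)^2 = x + y in a Boolean ring
```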

Lower-Energy Gait Pattern Generation in a 5-Link Biped Robot Using Image Processing

The purpose of this study is to find a natural gait for a biped robot, similar to that of a human being, by analyzing the COG (Center of Gravity) trajectory of human gait. It has been observed that human gait naturally maintains stability while using minimum energy. This paper aims to find a natural gait pattern for a biped robot that uses minimum energy while maintaining stability, by analyzing the human gait pattern as measured from gait images in the sagittal plane and the COG trajectory in the frontal plane. It is not possible to apply the torques of human joints directly to those of a biped robot, because they have different degrees of freedom. Nonetheless, humans and 5-link biped robots are kinematically similar. Therefore, we generate the gait pattern of the 5-link biped robot using a genetic algorithm (GA) to adapt the gait pattern, utilizing the human ZMP (Zero Moment Point) and the joint torques measured from the human gait pattern. The proposed algorithm creates a gait pattern for the biped robot that is as fluent as a human's and minimizes energy consumption, because the gait pattern of the 5-link biped robot model is derived by considering the torque of each human joint in the sagittal plane and the ZMP trajectory in the frontal plane. This paper demonstrates the superiority of the proposed algorithm by evaluating the 5-link biped robot under two kinds of gait patterns: one generated in the conventional way using inverse kinematics, and one generated in the proposed way, considering visual naturalness and efficiency.
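
The paper's GA operates on measured human torque and ZMP data, which are not reproduced here. The following is a hypothetical sketch of the GA loop only, with stand-in fitness terms, bounds, and settings; it is not the paper's fitness function.

```python
import numpy as np

# Hypothetical GA sketch for gait-parameter search: minimize an energy proxy
# under a stability penalty. All fitness terms and settings are illustrative.

rng = np.random.default_rng(1)
POP, GENS, DIM = 40, 100, 10           # population size, generations, gait parameters

def fitness(params):
    energy = np.sum(params ** 2)                        # stand-in for joint-torque effort
    zmp_violation = max(0.0, abs(params.sum()) - 1.0)   # stand-in ZMP stability margin
    return energy + 100.0 * zmp_violation               # penalize unstable gaits heavily

pop = rng.uniform(-1.0, 1.0, (POP, DIM))
for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[: POP // 2]]       # keep the fitter half
    pick = rng.integers(0, len(parents), (POP, 2))      # random parent pairs
    alpha = rng.random((POP, 1))
    pop = alpha * parents[pick[:, 0]] + (1 - alpha) * parents[pick[:, 1]]  # blend crossover
    pop += rng.normal(0.0, 0.05, pop.shape)             # Gaussian mutation

best = pop[np.argmin([fitness(ind) for ind in pop])]    # lowest-energy stable gait found
```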

Density, Strength, Thermal Conductivity and Leachate Characteristics of Light-Weight Fired Clay Bricks Incorporating Cigarette Butts

The several trillion cigarettes produced worldwide annually lead to many thousands of kilograms of toxic waste. Cigarette butts (CBs) accumulate in the environment due to the poor biodegradability of their cellulose acetate filters. This paper presents some of the results from a continuing study on recycling CBs into fired clay bricks. The physico-mechanical properties of fired clay bricks manufactured with different percentages of CBs are reported and discussed. The results show that the density of the fired bricks was reduced by up to 30%, depending on the percentage of CBs incorporated into the raw materials. Similarly, the compressive strength of the bricks tested decreased according to the percentage of CBs included in the mix. The thermal conductivity performance of the bricks improved by 51% and 58% for 5% and 10% CB content, respectively. Leaching tests were carried out to investigate the levels of possible heavy-metal leachates from the manufactured clay-CB bricks. The results revealed only trace amounts of heavy metals.

Development of a Fragility Curve for a Two-Span Simply Supported Concrete Bridge in a Near-Fault Area

Bridges are one of the main components of transportation networks. They should remain functional before and after an earthquake for emergency services. Therefore, we need to assess the seismic performance of bridges under different seismic loadings. The fragility curve is one of the most popular tools in seismic evaluation. Fragility curves are conditional probability statements, giving the probability of a bridge reaching or exceeding a particular damage level for a given intensity level. In this study, the seismic performance of a two-span simply supported concrete bridge is assessed. Due to the usual lack of empirical data, an analytical fragility curve was developed from the results of dynamic analyses of the bridge subjected to different near-fault time histories.
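
A commonly used analytical form for such a curve, given here as a hedged sketch (the paper's exact fitting procedure is not restated), is

\[
P\bigl[D \ge d_i \mid IM = x\bigr] \;=\; \Phi\!\left(\frac{\ln(x/\theta_i)}{\beta_i}\right),
\]

where \(\Phi\) is the standard normal CDF, \(\theta_i\) is the median intensity causing damage state \(d_i\), and \(\beta_i\) is the lognormal dispersion, both of which would be estimated from the dynamic analysis results.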

Efficient and Extensible Data Processing Framework in Ubiquitous Sensor Networks

This paper presents the design and prototype implementation of an intelligent data processing framework for ubiquitous sensor networks. Much of the focus is on how to handle the sensor data stream, as well as on interoperability between low-level sensor data and application clients. Our framework first provides systematic middleware that mediates the interaction between the application layer and low-level sensors, analyzing large volumes of sensor data by filtering and integration to create value-added context information. Then, an agent-based architecture is proposed for real-time data distribution, efficiently forwarding a specific event to the appropriate applications registered in the directory service via an open interface. The prototype implementation demonstrates that our framework can host sophisticated applications on a ubiquitous sensor network and can autonomously evolve into new middleware, taking advantage of promising technologies such as software agents, XML, cloud computing, and the like.
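
A minimal sketch of the register-and-forward pattern described above follows; the class and method names are illustrative, not the framework's actual interface.

```python
from collections import defaultdict

# Illustrative directory service: applications register interest in an event
# type, and the middleware forwards matching value-added events to them.

class DirectoryService:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def register(self, event_type, callback):
        self._subscribers[event_type].append(callback)

    def dispatch(self, event_type, payload):
        for callback in self._subscribers[event_type]:
            callback(payload)

directory = DirectoryService()
directory.register("temperature.high", lambda e: print("alert:", e))

# Middleware step: filter raw readings into a value-added context event.
raw_readings = [21.5, 22.0, 39.8]
for reading in raw_readings:
    if reading > 35.0:
        directory.dispatch("temperature.high", {"celsius": reading})
```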

Energy Efficient Reliable Cooperative Multipath Routing in Wireless Sensor Networks

In this paper, a reliable cooperative multipath routing algorithm is proposed for data forwarding in wireless sensor networks (WSNs). In this algorithm, data packets are forwarded towards the base station (BS) through a number of paths, using a set of relay nodes. The Rayleigh fading model is used to calculate the evaluation metric of links. Reliability is guaranteed by selecting an optimal relay set for which the probability of correct packet reception at the BS exceeds a predefined threshold; the proposed scheme therefore ensures reliable packet transmission to the BS. Furthermore, the algorithm achieves energy efficiency at the same time through energy balancing, i.e., minimizing the energy consumption of the bottleneck node of the routing path. This work also demonstrates that the proposed algorithm outperforms existing algorithms in extending the longevity of the network with respect to reliability. The obtained results thus make reliable path selection with minimum energy consumption possible in real time.
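
A hedged sketch of the reliability bookkeeping implied above: under Rayleigh fading the received SNR is exponentially distributed, so a link succeeds with probability exp(-threshold/mean SNR); a path succeeds if all its hops do, and multipath delivery succeeds if at least one path does. The SNR figures and the target below are illustrative.

```python
import math

# Per-link success under Rayleigh fading, path success as a product over hops,
# and multipath delivery as "at least one path succeeds". Numbers are illustrative.

def link_success(mean_snr, snr_threshold=1.0):
    # Rayleigh fading: P(SNR >= threshold) = exp(-threshold / mean_snr).
    return math.exp(-snr_threshold / mean_snr)

def path_success(link_mean_snrs):
    p = 1.0
    for snr in link_mean_snrs:
        p *= link_success(snr)
    return p

def multipath_reliability(paths):
    fail = 1.0
    for path in paths:
        fail *= 1.0 - path_success(path)
    return 1.0 - fail

paths = [[8.0, 6.0, 10.0], [5.0, 5.0], [12.0, 4.0, 9.0]]   # mean SNR per hop
target = 0.9
print(multipath_reliability(paths) >= target)   # does this relay set meet the threshold?
```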

Analytical Model Based Evaluation of Human Machine Interfaces Using Cognitive Modeling

Cognitive models allow predicting some aspects of the utility and usability of human machine interfaces (HMI) and simulating the interaction with these interfaces. Prediction is based on a task analysis, which investigates what a user is required to do, in terms of actions and cognitive processes, to achieve a task. Task analysis facilitates the understanding of the system's functionalities. Cognitive models belong to the analytical approaches, which do not involve users during the development process of the interface. This article presents a study on the evaluation of human machine interaction with a contextual assistant's interface using the ACT-R and GOMS cognitive models. The present work shows how these techniques may be applied in HMI evaluation, design, and research, emphasizing firstly the task analysis and secondly the task execution time. In order to validate and support our results, an experimental study of user performance was conducted at the DOMUS laboratory during interaction with the contextual assistant's interface. The results show that the GOMS and ACT-R models give good and excellent predictions, respectively, of users' performance at the task level as well as at the object level; the simulated results are very close to those obtained in the experimental study.
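
As a hedged illustration of a GOMS-family prediction, the sketch below uses the Keystroke-Level Model with the standard operator times of Card, Moran and Newell; the paper's GOMS and ACT-R models are richer than this, and the sample task is invented.

```python
# Keystroke-Level Model (KLM) operator times in seconds:
# K = keystroke, P = point with mouse, H = home hands, M = mental preparation.
KLM = {"K": 0.28, "P": 1.10, "H": 0.40, "M": 1.35}

def predict_time(ops):
    """ops: space-separated KLM operator codes, e.g. 'M P K'."""
    return sum(KLM[op] for op in ops.split())

# Invented task: think, move hand to mouse, point at a menu item, click,
# then type a 4-letter command.
task = "M H P K " + " ".join("K" * 4)
print(f"predicted execution time: {predict_time(task):.2f} s")
```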

Effect of Na2O Content on Durability of Geopolymer Mortars in Sulphuric Acid

This paper presents the findings of an experimental investigation into the effect of alkali content on geopolymer mortar specimens exposed to sulphuric acid. Geopolymer mortar specimens were manufactured from Class F fly ash by activation with a mixture of sodium hydroxide and sodium silicate solution containing 5% to 8% Na2O. The durability of the specimens was assessed by immersing them in 10% sulphuric acid solution and periodically monitoring surface deterioration, depth of dealkalization, changes in weight, and residual compressive strength over a period of 24 weeks. Microstructural changes in the specimens were studied with scanning electron microscopy (SEM) and EDAX. The alkali content of the activator solution significantly affects the durability of fly-ash-based geopolymer mortars in sulphuric acid: specimens manufactured with higher alkali content performed better than those manufactured with lower alkali content. After 24 weeks in sulphuric acid, the specimens with 8% alkali still recorded a residual strength as high as 55%.

Free Convection in an Infinite Porous Dusty Medium Induced by Pulsating Point Heat Source

Free convection effects and heat transfer due to a pulsating point heat source embedded in an infinite, fluid-saturated, porous dusty medium are studied analytically. Both the velocity and temperature fields are discussed in the form of series expansions in the Rayleigh number, for both the fluid and particle phases, based on the mean heat generation rate from the source and on the permeability of the porous dusty medium. This study is carried out by assuming a small Rayleigh number and the validity of Darcy's law. Analytical expressions for both phases are obtained for the second-order mean of both the velocity and temperature fields, and the evolution of different wave patterns is observed in the fluctuating part. It is observed that, in the vicinity of the origin, the second-order mean flow is influenced only by the relaxation time of the dust particles and not by the dust concentration.
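
The structure of such an expansion, sketched here in hedged generic form (the paper's exact scalings and phase-coupling terms are not reproduced), is

\[
\mathbf{q} \;=\; \sum_{n \ge 1} Ra^{\,n}\,\mathbf{q}_n, \qquad
T \;=\; \sum_{n \ge 0} Ra^{\,n}\,T_n,
\]

with each order split into a time-averaged (mean) part and a part fluctuating at the source's pulsation frequency; "second-order mean" above refers to the time-averaged part of the \(O(Ra^2)\) terms.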

A Forward Automatic Censored Cell-Averaging Detector for Multiple Target Situations in Log-Normal Clutter

A challenging problem in radar signal processing is achieving reliable target detection in the presence of interference. In this paper, we propose a novel algorithm for the automatic censoring of interfering radar targets in log-normal clutter. The proposed algorithm, termed the forward automatic censored cell-averaging detector (F-ACCAD), consists of two steps: removing the corrupted reference cells (censoring) and the actual detection. Both steps are performed dynamically, using a suitable set of ranked cells to estimate the unknown background level and set the adaptive thresholds accordingly. The F-ACCAD algorithm requires neither prior information about the clutter parameters nor knowledge of the number of interfering targets. The effectiveness of the F-ACCAD algorithm is assessed by computing, using Monte Carlo simulations, the probability of censoring and the probability of detection in different background environments.
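
A sketch of the censor-then-detect idea follows: rank the reference cells, estimate the background from the lower-ranked cells only (so interfering targets are censored out), then apply a cell-averaging threshold. The scale factor and window sizes are illustrative; the actual F-ACCAD thresholds are derived for log-normal clutter.

```python
import numpy as np

# Censored cell-averaging detection sketch. Parameters are illustrative only.

def censored_ca_detect(cell_under_test, reference_cells, keep=16, scale=4.0):
    ranked = np.sort(reference_cells)
    background = ranked[:keep].mean()          # censored background estimate
    return cell_under_test > scale * background

rng = np.random.default_rng(2)
reference = rng.lognormal(0.0, 0.5, 24)        # log-normal clutter window
reference[[3, 7]] = 40.0                       # two interfering targets in the window
print(censored_ca_detect(35.0, reference))     # True: detection survives the interferers
```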

Implementation of a Watchdog Timer for Fault-Tolerant Computing on a Cluster Server

In today's new technology era, clusters have become a necessity for modern computing and data applications, since many applications take a long time (even days or months) for computation. Although parallelization speeds up computation, the time required by many applications can still be substantial. Thus, the reliability of the cluster becomes a very important issue, and the implementation of a fault-tolerance mechanism becomes essential. The difficulty of designing a fault-tolerant cluster system increases with the variety of possible failures. The most important consideration is that an algorithm that handles a simple failure in a system must also tolerate more severe failures. In this paper, we implement the watchdog-timer concept in a parallel environment to take care of failures. Implementing this simple algorithm in our project helps us handle different types of failures; consequently, we found that the reliability of the cluster improves.
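
A minimal watchdog-timer sketch of the idea above: a monitored worker must "kick" the timer periodically, and if it stalls, the expiry handler fires a recovery action. The timeout values and the recovery hook are illustrative, not the paper's cluster implementation.

```python
import threading
import time

# Minimal watchdog timer: restart the countdown on every kick; if the monitored
# process stops kicking (a failure), the expiry callback runs a recovery action.

class Watchdog:
    def __init__(self, timeout, on_expire):
        self.timeout, self.on_expire = timeout, on_expire
        self._timer = None

    def kick(self):                       # heartbeat from the monitored process
        if self._timer:
            self._timer.cancel()
        self._timer = threading.Timer(self.timeout, self.on_expire)
        self._timer.daemon = True
        self._timer.start()

    def stop(self):
        if self._timer:
            self._timer.cancel()

def recover():
    print("node unresponsive: restart task / migrate to a healthy node")

wd = Watchdog(timeout=1.0, on_expire=recover)
wd.kick()
for _ in range(3):
    time.sleep(0.5)
    wd.kick()                             # healthy node keeps kicking
time.sleep(2.0)                           # simulated hang: watchdog fires recover()
wd.stop()
```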