New pth Moment Stability Criteria for Stochastic Neural Networks

In this paper, the pth moment stability of a class of stochastic neural networks with mixed delays is investigated. By establishing two integro-differential inequalities, some new sufficient conditions ensuring pth moment exponential stability are obtained. Our results generalize some earlier works reported in the literature and remove some strict constraints on the time delays and kernel functions. Two numerical examples are presented to illustrate the validity of the main results.
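
For reference, pth moment exponential stability is commonly defined as follows (a standard formulation; the paper's precise constants may differ):

```latex
\exists\, M \ge 1,\ \lambda > 0:\qquad
\mathbb{E}\,\|x(t;\phi)\|^{p} \le M\,\|\phi\|^{p}\,e^{-\lambda (t - t_0)},
\qquad t \ge t_0,
```

for every admissible initial function φ.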

Action Functional of the Electromagnetic Field: Effect of Gravitation

The scalar wave equation for a potential in a curved space-time, i.e., the Laplace–Beltrami equation, is studied in this work. An action principle is used to derive a finite element algorithm for determining the modes of propagation inside a waveguide of arbitrary shape. Generalizing this idea, Maxwell theory in a curved space-time determines a set of linear partial differential equations for the four electromagnetic potentials, given the metric of space-time. As in Einstein's formulation of the field equations of gravitation, these equations are also derived from an action principle. In this paper, expressions for the action functional of the electromagnetic field are derived in the presence of a gravitational field.
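
For orientation, the standard action of the electromagnetic field on a curved background, from which such field equations follow by variation of A_\mu (a reference expression; the paper's conventions and source terms may differ):

```latex
S[A] = -\frac{1}{4}\int \sqrt{-g}\;
  g^{\mu\alpha} g^{\nu\beta} F_{\mu\nu} F_{\alpha\beta}\, d^{4}x,
\qquad F_{\mu\nu} = \partial_{\mu} A_{\nu} - \partial_{\nu} A_{\mu}.
```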

Mamdani Model based Adaptive Neural Fuzzy Inference System and Its Application

Hybrid algorithms are a hot issue in Computational Intelligence (CI) research. Based on an in-depth discussion of the Simulation Mechanism Based (SMB) classification method and composite patterns, this paper presents the Mamdani model based Adaptive Neural Fuzzy Inference System (M-ANFIS) and its weight-updating formula, taking into account the qualitative representation of inference consequents in fuzzy neural networks. The M-ANFIS model adopts the Mamdani fuzzy inference system, which has advantages in its consequent part. Experimental results of applying M-ANFIS to evaluate traffic level of service show that M-ANFIS, as a new hybrid algorithm in computational intelligence, has great advantages in nonlinear modeling, consequent-part membership functions, training-data scale, and the number of adjustable parameters.
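
As background on the Mamdani side of the hybrid, a minimal sketch of Mamdani inference with min implication, max aggregation and centroid defuzzification (the membership functions and the two rules below are invented for illustration, not taken from the paper):

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

y = np.linspace(0.0, 10.0, 501)   # output universe of discourse

def mamdani(x):
    # Rule 1: IF x is LOW THEN y is SMALL;  Rule 2: IF x is HIGH THEN y is LARGE.
    w1 = tri(x, -5.0, 0.0, 5.0)    # firing strength of rule 1
    w2 = tri(x, 5.0, 10.0, 15.0)   # firing strength of rule 2
    # Mamdani implication (min) and aggregation (max) over clipped consequents.
    agg = np.maximum(np.minimum(w1, tri(y, 0.0, 2.0, 4.0)),
                     np.minimum(w2, tri(y, 6.0, 8.0, 10.0)))
    return (agg * y).sum() / agg.sum()   # discrete centroid defuzzification

print(mamdani(3.0))   # -> a value near the SMALL consequent's centroid
```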

Simulation of Large Deformations of Rubbers by the RKPM Method

In this paper, processes involving large deformations of rubber with hyperelastic material behavior are simulated by the RKPM method. Due to the loss of the Kronecker delta property in meshless shape functions, the imposition of essential boundary conditions consumes significant CPU time in meshfree computations. In this work, the transformation method is used for the imposition of essential boundary conditions. An RKPM material shape function is used in this analysis. The support of the material shape functions covers the same set of particles throughout the material deformation, and hence the transformation matrix is formed only once, at the initial stage. A computer program developed in MATLAB is used for the simulations.
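
For context, the transformation method in its common form (a sketch of the standard construction, not necessarily the paper's exact notation): because the shape functions N_J lack the Kronecker delta property, physical nodal values u_I and generalized coefficients d_J are related through a transformation matrix Λ, and essential boundary conditions prescribed on u are mapped onto d:

```latex
u_I = \sum_J N_J(x_I)\,d_J = \sum_J \Lambda_{IJ}\,d_J,
\qquad \Lambda_{IJ} = N_J(x_I),
\qquad d = \Lambda^{-1} u .
```

Since the material shape functions keep the same particle support during deformation, Λ and its inverse can indeed be computed once.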

An Automated Approach for Assembling Modular Fixtures Using SolidWorks

Modular fixtures (MFs) are very important tools in manufacturing processes in terms of reducing cost and production time. This paper introduces an automated approach for assembling MF elements by employing SolidWorks, a powerful 3D CAD package. The Visual Basic (VB) programming language was used together with SolidWorks API (Application Programming Interface) functions. This integration allowed creating a plug-in file and generating new menus in the SolidWorks environment. The menus allow the user to select, insert, and assemble MF elements.
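
By way of illustration, the same COM-automation idea driven from Python rather than VB (a minimal sketch; assumes Windows with SolidWorks and pywin32 installed, and the file path is a placeholder):

```python
import win32com.client

# Attach to (or start) the SolidWorks application via its COM ProgID.
swApp = win32com.client.Dispatch("SldWorks.Application")
swApp.Visible = True

# Open a fixture-element part; 1 = swDocPART (path is a placeholder).
part = swApp.OpenDoc(r"C:\MF_library\locator_pin.sldprt", 1)
print(part.GetTitle())   # confirm the document loaded
```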

ψ-exponential Stability for Non-linear Impulsive Differential Equations

In this paper, we shall present sufficient conditions for the ψ-exponential stability of a class of nonlinear impulsive differential equations. We use the Lyapunov method with functions that are not necessarily differentiable. In the last section, we give some examples to support our theoretical results.
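
For orientation, one common formulation of the notion (our assumption of the standard definition; the paper's version may differ in details): with ψ a positive continuous weight (or diagonal matrix) function, the zero solution is ψ-exponentially stable if

```latex
\|\psi(t)\,x(t)\| \le K\,\|\psi(t_0)\,x(t_0)\|\,e^{-\lambda (t - t_0)},
\qquad t \ge t_0,\quad K \ge 1,\ \lambda > 0 .
```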

Design of Auto Exposure Unit Based on 2-Way Histogram Equalization

Histogram equalization is often used in image enhancement, but it can also be used in auto exposure. However, conventional histogram equalization does not work well when many pixels are concentrated in a narrow luminance range. This paper proposes an auto exposure method based on 2-way histogram equalization. Two cumulative distribution functions are used, one running from dark to bright and the other from bright to dark. The proposed auto exposure method is also designed and implemented for image signal processors with full-HD images.
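
A minimal sketch of the two cumulative distributions involved and one way they can drive an exposure decision (the anchor quantiles, target and adjustment rule below are our assumptions; the paper's exact combination is not reproduced):

```python
import numpy as np

def auto_exposure_step(gray, target=128.0, gain_step=0.1):
    """One auto-exposure iteration driven by two CDFs. gray: uint8 image.
    Returns a multiplicative exposure adjustment (illustrative sketch)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    cdf_fwd = np.cumsum(p)          # accumulated from dark to bright
    cdf_bwd = np.cumsum(p[::-1])    # accumulated from bright to dark
    # Shadow anchor: level below which 5% of pixels lie;
    # highlight anchor: level above which 5% of pixels lie.
    lo = int(np.searchsorted(cdf_fwd, 0.05))
    hi = 255 - int(np.searchsorted(cdf_bwd, 0.05))
    # Assumed rule: steer the midpoint of the occupied range to the target.
    midpoint = 0.5 * (lo + hi)
    return 1.0 + gain_step if midpoint < target else 1.0 - gain_step
```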

Validation Testing for Temporal Neural Networks for RBF Recognition

A neuron can emit spikes on an irregular time basis, and by averaging over a certain time window one would ignore a lot of information. It is known that in the context of fast information processing there is not sufficient time to sample an average firing rate of the spiking neurons. The present work shows that spiking neurons are capable of computing the radial basis functions by storing the relevant information in the neurons' delays. One of the fundamental findings of this research is that using overlapping receptive fields to encode the data patterns increases the network's clustering capacity. The clustering algorithm discussed here is interesting from both a computer science and a neuroscience point of view.
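
A minimal sketch of the encoding idea, mapping a scalar input to spike times through overlapping Gaussian receptive fields so that stronger activation fires earlier (the field count, overlap factor and time scale are illustrative choices, not the paper's parameters):

```python
import numpy as np

def encode_spike_times(x, n_fields=8, lo=0.0, hi=1.0, t_max=10.0):
    """Map a scalar x to firing times of n_fields neurons whose Gaussian
    receptive fields overlap and tile [lo, hi]; stronger activation fires
    earlier (illustrative parameterization)."""
    centers = np.linspace(lo, hi, n_fields)
    sigma = (hi - lo) / (n_fields - 1) * 1.5   # overlap between neighbors
    activation = np.exp(-0.5 * ((x - centers) / sigma) ** 2)
    return t_max * (1.0 - activation)          # activation 1 -> fires at t=0

print(np.round(encode_spike_times(0.3), 2))
```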

A Formal Approach for Proof Constructions in Cryptography

In this article we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of cryptographic algorithms are of great interest. Besides some basic lemmata, we explore an implementation of a complex function that is used in cryptography. More precisely, we describe formal properties of this implementation that we prove by computer. We describe formalized probability distributions (σ-algebras, probability spaces and conditional probabilities), given in the formal language of the formal proof system Isabelle/HOL, and we give a computer proof of Bayes' formula. We then describe an application of the presented formalized probability distributions to cryptography. Furthermore, this article shows that computer proofs of complex cryptographic functions are possible by presenting an implementation of the Miller–Rabin primality test that admits formal verification. Our achievements are a step towards computer verification of cryptographic primitives and describe a basis for computer verification in cryptography. Computer verification can be applied to further problems in cryptographic research if the corresponding basic mathematical knowledge is available in a database.
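
For reference, the classical Miller–Rabin test that such a formalization targets (a standard textbook version, not the Isabelle/HOL formalization itself):

```python
import random

def miller_rabin(n, rounds=20):
    """Probabilistic primality test: False for composites, True for
    probable primes (error probability <= 4**-rounds)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7):
        if n % p == 0:
            return n == p
    # Write n - 1 = d * 2**s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False        # a witnesses the compositeness of n
    return True
```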

An Algebra for Protein Structure Data

This paper presents an algebraic approach to query optimization in a domain-specific database management system for protein structure data. The approach introduces several protein-structure-specific algebraic operators to query the complex data stored in an object-oriented database system. The Protein Algebra provides an extensible set of high-level Genomic Data Types and Protein Data Types along with a comprehensive collection of appropriate genomic and protein functions. The paper also presents a query translator that converts high-level query specifications in the algebra into low-level query specifications in Protein-QL, a query language designed to query protein structure data. The query transformation process uses a Protein Ontology that serves as a dictionary.
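
Purely to illustrate the translation step, a toy lowering of an algebra expression tree to a query string (the operator names, the expression tree and the Protein-QL-like syntax below are all hypothetical; the paper's actual operators are not reproduced):

```python
from dataclasses import dataclass

@dataclass
class Scan:                 # hypothetical leaf operator: read a collection
    collection: str

@dataclass
class Select:               # hypothetical selection operator
    predicate: str
    child: object

def translate(node):
    """Recursively lower an algebra tree to a query-language string."""
    if isinstance(node, Scan):
        return f"FROM {node.collection}"
    if isinstance(node, Select):
        return f"{translate(node.child)} WHERE {node.predicate}"
    raise TypeError(type(node))

print(translate(Select("helix_count > 3", Scan("ProteinStructures"))))
```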

Constitutive Equations for Human Saphenous Vein Coronary Artery Bypass Graft

Coronary artery bypass grafts (CABG) are widely studied with respect to hemodynamic conditions, which play an important role in the presence of restenosis. However, papers concerned with constitutive modeling of CABG are lacking in the literature. The purpose of this study is to find a constitutive model for CABG tissue. A sample of the CABG obtained during an autopsy was subjected to an inflation–extension test. Displacements were recorded by CCD cameras and subsequently evaluated by digital image correlation. Pressure–radius and axial force–elongation data were used to fit the material model. The tissue was modeled as a one-layered composite reinforced by two families of helical fibers. The material is assumed to be locally orthotropic, nonlinear, incompressible and hyperelastic. Material parameters are estimated for two strain energy functions (SEF). The first is the classical exponential form. The second SEF is a logarithmic form, which allows an interpretation in terms of limiting (finite) strain extensibility. The presented material parameters are estimated by optimization based on the radial and axial equilibrium equations of a thick-walled tube. Both material models fit the experimental data successfully. The exponential model fits the relationship between axial force and axial strain significantly better than the logarithmic one.
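
For orientation, representative forms of the two SEF families mentioned: an exponential fiber-reinforced SEF of Holzapfel–Gasser–Ogden type and a logarithmic limiting-extensibility SEF of Gent type (shown as reference expressions; the paper's exact SEFs and parameters may differ):

```latex
% Exponential (HGO-type), two helical fiber families; I_4, I_6 are the
% squared fiber stretches:
\Psi_{\exp} = \frac{c}{2}(I_1 - 3)
  + \sum_{i=4,6} \frac{k_1}{2k_2}\Bigl[e^{\,k_2 (I_i - 1)^2} - 1\Bigr],
% Logarithmic (Gent-type), with J_m the limiting value of I_1 - 3:
\Psi_{\log} = -\frac{\mu J_m}{2}\,
  \ln\!\Bigl(1 - \frac{I_1 - 3}{J_m}\Bigr).
```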

Negative Emotions and Ways of Overcoming Them in Prison

The aim of this paper is to describe the notion of death among prisoners and the ways they deal with it. Prisoners express indifference and coldness, an inability to accept blame, and show no shame and no empathy, even to the point of performing acts verging on death. In this paper we describe the mechanisms and regularities of self-destructive behaviour in the light of the relevant literature. The explanation of the phenomenon is of a biological and socio-psychological nature. It must be clearly stated that all forms of self-destructive behaviour result from various impulses, conflicts and deficits; that is why they should be treated differently in terms of the motivation and functions they perform in a given group of people. Behind self-destruction there seems to be a motivational mechanism which drives prisoners to rebel and fight against the hated law and penitentiary systems. The imprisoned believe that pain and suffering inflicted on themselves are better than passive acceptance of repression. The variety of self-destructive acts is wide, and some take strange forms. We assume that the life-death barrier is a kind of game for them: if they cannot change the degrading situation, their life loses sense.

EGCL: An Extended G-Code Language with Flow Control, Functions and Mnemonic Variables

In the context of computer numerical control (CNC) and computer aided manufacturing (CAM), capabilities of programming languages such as symbolic and intuitive programming, program portability and geometrical portfolio have special importance. They save time, help avoid errors during part programming, and permit code re-use. Our updated literature review indicates that the current state of the art presents gaps in parametric programming, program portability and programming flexibility. In response to this situation, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language which allows the use of descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit program re-use. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status of having them as built-in library functions.
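
To illustrate the kind of lowering such a compiler performs, the Python sketch below unrolls a loop with mnemonic variables into elementary ISO G-code moves (EGCL's concrete syntax is not reproduced; the part program is invented for the example):

```python
def compile_grooves(groove_count, spacing_mm, depth_mm, feed_mm_min):
    """Unroll a high-level loop over named parameters into plain ISO
    G-code lines, the way a part program with mnemonic variables would
    be lowered (illustrative, not actual EGCL output)."""
    lines, y = ["G21 ; millimeters", "G90 ; absolute positioning"], 0.0
    for _ in range(groove_count):          # the high-level 'while' loop
        lines.append(f"G0 X0 Y{y:.3f} Z5.000")               # rapid to start
        lines.append(f"G1 Z{-depth_mm:.3f} F{feed_mm_min}")  # plunge
        lines.append(f"G1 X50.000 F{feed_mm_min}")           # cut the groove
        lines.append("G0 Z5.000")                            # retract
        y += spacing_mm
    lines.append("M30 ; end of program")
    return "\n".join(lines)

print(compile_grooves(groove_count=3, spacing_mm=10.0,
                      depth_mm=1.5, feed_mm_min=300))
```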

Periodic Solutions of Recurrent Neural Networks with Distributed Delays and Impulses on Time Scales

In this paper, by using the continuation theorem of coincidence degree theory, M-matrix theory and suitably constructed Lyapunov functions, some sufficient conditions are obtained for the existence and global exponential stability of periodic solutions of recurrent neural networks with distributed delays and impulses on time scales. Since the boundedness of the activation functions g_j, h_j is not assumed, these results are less restrictive than those given in earlier references.
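
A typical member of the class studied, written on a time scale \mathbb{T} (our reconstruction of the standard form; the paper's system and impulse conditions may differ in notation):

```latex
x_i^{\Delta}(t) = -a_i(t)\,x_i(t)
  + \sum_{j=1}^{n} b_{ij}(t)\,g_j\bigl(x_j(t)\bigr)
  + \sum_{j=1}^{n} c_{ij}(t)\!\int_{0}^{\infty}\! k_{ij}(s)\,
      h_j\bigl(x_j(t-s)\bigr)\,\Delta s + I_i(t),
\quad t \in \mathbb{T},\ t \ne t_k,
\qquad
\Delta x_i(t_k) = e_{ik}\bigl(x_i(t_k)\bigr).
```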

Surrogate based Evolutionary Algorithm for Design Optimization

Optimization is often a critical issue for most system design problems. Evolutionary Algorithms are population-based, stochastic search techniques widely used as efficient global optimizers. However, finding optimal solutions to complex, high-dimensional, multimodal problems often requires highly computationally expensive function evaluations and hence is practically prohibitive. The Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model presented in our earlier work [14] reduced computation time by controlled use of meta-models to partially replace the actual function evaluation by approximate function evaluation. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model. Situations like model formation involving variable input dimensions and noisy data certainly cannot be covered by this assumption. In this paper we present an enhanced version of DAFHEA that incorporates a multiple-model based learning approach for the SVM approximator. DAFHEA-II, the enhanced version of the DAFHEA framework, also overcomes the high computational expense involved with the additional clustering requirements of the original DAFHEA framework. The proposed framework has been tested on several benchmark functions, and the empirical results illustrate the advantages of the proposed technique.
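
A minimal sketch of the surrogate-assisted evaluation idea, with scikit-learn's SVR standing in for the SVM approximator (a generic illustration of the mechanism, not the DAFHEA-II algorithm itself; the objective, budget split and operators are assumptions):

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def expensive(x):
    """Stand-in for a costly simulation (Sphere function)."""
    return float(np.sum(x ** 2))

dim, pop_size, n_gen = 5, 30, 40
pop = rng.uniform(-5.0, 5.0, (pop_size, dim))
fitness = np.array([expensive(x) for x in pop])
X_hist, y_hist = list(pop), list(fitness)

for _ in range(n_gen):
    # Refit the surrogate on everything evaluated exactly so far.
    model = SVR(kernel="rbf", C=10.0).fit(np.array(X_hist), np.array(y_hist))
    children = pop + rng.normal(0.0, 0.3, pop.shape)   # Gaussian mutation
    screened = model.predict(children)                 # cheap approximation
    # Spend the expensive budget only on the most promising children.
    elite = np.argsort(screened)[: pop_size // 5]
    child_fit = np.array([expensive(children[i]) for i in elite])
    X_hist += [children[i] for i in elite]
    y_hist += child_fit.tolist()
    # Survivor selection over parents plus exactly evaluated children.
    all_x = np.vstack([pop, children[elite]])
    all_f = np.concatenate([fitness, child_fit])
    keep = np.argsort(all_f)[:pop_size]
    pop, fitness = all_x[keep], all_f[keep]

print("best exact fitness:", fitness.min())
```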

Codebook Generation for Vector Quantization on Orthogonal Polynomials based Transform Coding

In this paper, a new algorithm for generating a codebook is proposed for vector quantization (VQ) in image coding. The significant features of the training image vectors are extracted using the proposed Orthogonal Polynomials based transformation. We propose to generate the codebook by partitioning these feature vectors into a binary tree. Each feature vector at a non-terminal node of the binary tree is directed to one of the two descendants by comparing a single feature associated with that node to a threshold. The binary tree codebook is used for encoding and decoding the feature vectors. In the decoding process, the feature vectors are subjected to inverse transformation with the help of the basis functions of the proposed Orthogonal Polynomials based transformation to recover the approximated input image training vectors. The results of the proposed coding are compared with VQ using the Discrete Cosine Transform (DCT) and the Pairwise Nearest Neighbor (PNN) algorithm. The new algorithm results in a considerable reduction in computation time and provides better reconstructed picture quality.
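
A compact sketch of the tree-structured codebook described, splitting on a single feature against a threshold at each node (the choices of split feature, the most spread-out coordinate, and of threshold, the median, are illustrative assumptions):

```python
import numpy as np

def build_tree(vectors, max_leaf=16):
    """Recursively split training feature vectors on a single feature
    versus a threshold; each leaf stores its centroid as a codeword."""
    if len(vectors) <= max_leaf:
        return {"codeword": vectors.mean(axis=0)}
    feat = int(np.argmax(vectors.var(axis=0)))   # most spread-out feature
    thr = float(np.median(vectors[:, feat]))
    left = vectors[:, feat] <= thr
    if left.all() or (~left).all():              # degenerate split -> leaf
        return {"codeword": vectors.mean(axis=0)}
    return {"feat": feat, "thr": thr,
            "lo": build_tree(vectors[left], max_leaf),
            "hi": build_tree(vectors[~left], max_leaf)}

def encode(tree, v):
    """Walk the tree comparing one feature per node; return the codeword."""
    while "codeword" not in tree:
        tree = tree["lo"] if v[tree["feat"]] <= tree["thr"] else tree["hi"]
    return tree["codeword"]
```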

Computable Function Representations Using Effective Chebyshev Polynomial

We show that Chebyshev polynomials are a practical representation of computable functions on the computable reals. The paper presents error estimates for common operations and demonstrates that Chebyshev polynomial methods would be more efficient than Taylor series methods for the evaluation of transcendental functions.
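
A quick illustration of the representation, using NumPy's Chebyshev module as a stand-in for the paper's effective-arithmetic framework:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Interpolate exp at Chebyshev points of the first kind on [-1, 1].
n = 16
xs = C.chebpts1(n)
coeffs = C.chebfit(xs, np.exp(xs), n - 1)

x = np.linspace(-1.0, 1.0, 1001)
err = np.abs(C.chebval(x, coeffs) - np.exp(x)).max()
print(f"max error with {n} coefficients: {err:.2e}")
# Chebyshev coefficients of analytic functions decay geometrically,
# which makes truncation-error estimates straightforward.
print("tail coefficient magnitude:", abs(coeffs[-1]))
```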

A Study on Applying 3D Reconstruction to 3D Last Morphing

The last is regarded as the critical foundation of shoe design and development. A computer aided methodology for various last form designs is proposed in this study. Reverse engineering is applied to scan the last form. Then, with minimum-energy revision of surface continuity, the last surface is reconstructed from the feature curves of the scanned last. When the surface reconstruction is completed, the weighted arithmetic mean method is applied to compute the shape morphing of the last's control mesh, so that 3D last forms of different sizes are generated from the original form while its features and functions are retained. Finally, the results of this study are applied in a 3D last reconstruction system, and the practicability of the proposed methodology is verified through case studies.
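
The morphing step then reduces to a weighted arithmetic mean of corresponding control-mesh vertices, sketched below under the assumption of one-to-one vertex correspondence between lasts (the sample coordinates are hypothetical):

```python
import numpy as np

def morph_control_mesh(meshes, weights):
    """Weighted arithmetic mean of corresponding control-mesh vertices.
    meshes:  list of (n_vertices, 3) arrays with 1-to-1 correspondence.
    weights: nonnegative weights, one per mesh (normalized here)."""
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    return sum(wi * m for wi, m in zip(w, np.asarray(meshes, dtype=float)))

# Example: blend a size-8 last toward a size-9 last (hypothetical data).
size8 = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
size9 = np.array([[0.0, 0.0, 0.0], [10.8, 0.0, 0.0]])
print(morph_control_mesh([size8, size9], weights=[0.25, 0.75]))
```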

Induction Motor Speed Control Using Fuzzy Logic Controller

Because of their low maintenance and robustness, induction motors have many applications in industry. Speed control of the induction motor is important for achieving maximum torque and efficiency. Various speed control techniques, such as Direct Torque Control, Sensorless Vector Control and Field Oriented Control, are discussed in this paper. A soft computing technique, fuzzy logic, is applied for the speed control of the induction motor to achieve maximum torque with minimum loss. The fuzzy logic controller is implemented using the Field Oriented Control technique, as it provides better control of motor torque with high dynamic performance. The motor model is designed and membership functions are chosen according to the parameters of the motor model. The simulated design is tested using various toolboxes in MATLAB. The results show that the efficiency and reliability of the proposed speed controller are good.
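
A schematic fuzzy speed controller, reduced to a 3x3 rule table on normalized speed error and its change, with weighted-average defuzzification (a simplified Sugeno-style sketch; the paper's membership functions are tied to the motor model and are not reproduced):

```python
def tri(x, a, b, c):
    """Triangular membership on a normalized universe [-1, 1]."""
    return max(min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

# Linguistic terms for error e and change-of-error de: N, Z, P.
MF = {"N": (-1.0, -1.0, 0.0), "Z": (-0.5, 0.0, 0.5), "P": (0.0, 1.0, 1.0)}
# Rule table: (e-term, de-term) -> crisp torque-command increment.
RULES = {("N", "N"): -1.0, ("N", "Z"): -0.5, ("N", "P"): 0.0,
         ("Z", "N"): -0.5, ("Z", "Z"): 0.0,  ("Z", "P"): 0.5,
         ("P", "N"): 0.0,  ("P", "Z"): 0.5,  ("P", "P"): 1.0}

def fuzzy_torque_step(e, de):
    """Weighted-average defuzzification over the 9 rules."""
    num = den = 0.0
    for (te, tde), out in RULES.items():
        w = min(tri(e, *MF[te]), tri(de, *MF[tde]))   # rule firing strength
        num += w * out
        den += w
    return num / den if den else 0.0

print(fuzzy_torque_step(e=0.4, de=-0.1))
```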

A New Source Code Auditing Algorithm for Detecting LFI and RFI in PHP Programs

Static analysis of source code is used for auditing web applications to detect vulnerabilities. In this paper, we propose a new algorithm to analyze PHP source code for detecting potential LFI and RFI vulnerabilities. In our approach, we first define some patterns for finding functions which have the potential to be abused because of unhandled user inputs. More precisely, we use regular expressions as a fast and simple method to define patterns for the detection of vulnerabilities. As inclusion functions can also be used in a safe way, many false positives (FPs) can occur. The first cause of these FPs is that the function may not use a user-supplied variable as an argument. So, we extract a list of user-supplied variables to be used for detecting vulnerable lines of code. On the other hand, as vulnerability can spread among variables, e.g., by multi-level assignment, we also try to extract the hidden user-supplied variables. We use the resulting list to decrease the false positives of our method. Finally, as there exist some ways to prevent the vulnerability of inclusion functions, we also define some patterns to detect them and further decrease our false positives.
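
A minimal flavor of such patterns, with two illustrative regular expressions over PHP source (a real auditor additionally needs the taint tracking across assignments described above):

```python
import re

# Inclusion functions called with a variable or raw user input
# (illustrative patterns only; real rules need taint tracking).
INCLUDE_CALL = re.compile(
    r"\b(include|include_once|require|require_once)\s*\(?\s*"
    r"(\$\w+|\$_(GET|POST|REQUEST|COOKIE)\[[^\]]*\])", re.IGNORECASE)
SANITIZED = re.compile(r"\b(basename|realpath)\s*\(")   # common mitigations

def audit(php_source):
    """Return (line number, line) pairs flagged as potential LFI/RFI."""
    findings = []
    for lineno, line in enumerate(php_source.splitlines(), 1):
        if INCLUDE_CALL.search(line) and not SANITIZED.search(line):
            findings.append((lineno, line.strip()))
    return findings

code = '<?php $page = $_GET["p"]; include($page . ".php"); ?>'
print(audit(code))
```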