A Deterministic Dynamic Programming Approach for Optimization Problem with Quadratic Objective Function and Linear Constraints

This paper presents a novel deterministic dynamic programming approach for solving optimization problems with a quadratic objective function and linear equality and inequality constraints. The proposed method employs backward recursion, in which computation proceeds from the last stage to the first stage of a multi-stage decision problem. A generalized recursive equation that gives the exact solution of the optimization problem is derived. The method is purely analytical and avoids the need for an initial solution. The feasibility of the proposed method is demonstrated with a practical example. The numerical results show that the proposed method provides the global optimum solution with negligible computation time.
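As a generic illustration of a backward recursion of this type (not the exact recursive equation derived in the paper), consider allocating a resource b over stages k = 1, ..., N with quadratic stage costs c_k x_k^2 coupled by the linear constraint a_1 x_1 + ... + a_N x_N = b. The stage value functions then satisfy

    f_N(s) = \min_{x_N \ge 0} \{\, c_N x_N^2 : a_N x_N = s \,\}, \qquad
    f_k(s) = \min_{x_k \ge 0} \{\, c_k x_k^2 + f_{k+1}(s - a_k x_k) \,\}, \quad k = N-1, \dots, 1,

and f_1(b) gives the global optimum, with the minimizing x_k recovered in a forward pass.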

Tidal River Sediment Management–A Case Study in Southwestern Bangladesh

The problems of severe drainage congestion and waterlogging in southwestern Bangladesh have been addressed by an innovative concept, Tidal River Management (TRM). TRM involves the uniform raising of the land inside a tidal basin (beel) while simultaneously maintaining adequate drainage capacity in the river. The present practice of TRM is to link the river with the selected beel by constructing a link canal, at the entrance of which most of the sedimentation takes place. This localized sedimentation also creates drainage congestion and waterlogging, making the program unattractive to the landowners who participate in it. In this paper, a functional sediment management plan is presented to overcome this problem.

Anthropometric Correlates of Balance Performance in Non-Institutionalized Elderly

Purpose: The fear of falling is a major concern among the elderly. Sixty-five percent of individuals older than 60 years of age experience loss of balance, often on a daily basis. Therefore, balance assessment in the elderly deserves special attention due to its importance in functional mobility and safety. This study aimed at assessing balance performance and comparing some anthropometric parameters in a Nigerian non-institutionalized elderly population. Methods: Sixty-one elderly subjects (31 males and 30 females) participated in this study. Their ages ranged between 62 and 84 years. The ability to maintain balance was assessed using the Functional Reach Test (FRT) and the Sharpened Romberg Test (SRT). Anthropometric data, including age, weight, height, arm length, leg length, bi-acromial breadth, foot length and trunk length, were also collected. Analysis was done using Pearson's Product Moment Correlation Coefficient and the independent t-test, with the level of significance set at p
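A minimal sketch of the kind of analysis described, using SciPy with hypothetical data and variable names (not the study's actual dataset or results):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    height_cm = rng.normal(165, 8, size=61)                      # anthropometric variable
    frt_cm = 25 + 0.1 * height_cm + rng.normal(0, 3, size=61)    # Functional Reach Test score
    sex = np.array([0] * 31 + [1] * 30)                          # 0 = male, 1 = female

    # Pearson's product-moment correlation between an anthropometric
    # parameter and balance performance
    r, p_corr = stats.pearsonr(height_cm, frt_cm)

    # independent t-test comparing balance performance between males and females
    t, p_ttest = stats.ttest_ind(frt_cm[sex == 0], frt_cm[sex == 1])

    print(f"r = {r:.2f} (p = {p_corr:.3f}), t = {t:.2f} (p = {p_ttest:.3f})")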

Impact of Liquidity Crunch on Interbank Network

Most empirical studies have analyzed how liquidity risks faced by individual institutions turn into systemic risk. The recent banking crisis has highlighted the importance of grasping and controlling systemic risk, and the willingness of central banks to ease their monetary policies to save defaulting or illiquid banks. This last point suggests that banks may pay less attention to liquidity risk, which, in turn, can become an important new channel of loss. Financial regulation focuses on the most important and "systemic" banks in the global network. However, to quantify the expected loss associated with liquidity risk, it is worth analyzing the sensitivity of the various elements of the global bank network to this channel. A small bank is not considered potentially systemic; however, the interaction of many small banks together can become a systemic element. This paper analyzes the impact of the interaction of medium and small banks on a set of banks considered the core of the network. The proposed method uses the structure of an agent-based model in a two-class environment. In the first class, data from the actual balance sheets of 22 large and systemic banks (such as BNP Paribas or Barclays) are collected. In the second, to model a network as close as possible to the actual interbank market, 578 fictitious banks smaller than those in the first class are split into two groups of small and medium banks. All banks are active on the European interbank network and have deposit and market activity. A simulation of twelve three-month periods, representing a medium-term horizon of three years, is run. In each period, there is a set of behavioral rules: repayment of matured loans, liquidation of deposits, income from securities, collection of new deposits, new demands for credit, and securities sales. The last two actions are part of the refunding process developed in this paper. To strengthen the reliability of the proposed model, the dynamics of the random parameters are governed by stochastic equations, such as rates whose variations are generated by the Vasicek model. The Central Bank is considered the lender of last resort, which allows banks to borrow at the REPO rate, and some conditions for ejecting banks from the system are introduced. A liquidity crunch due to an exogenous crisis is simulated in the first class, and the loss impact on the other bank classes is analyzed through aggregate values representing the aggregate of loans and/or the aggregate of borrowing between classes. It is mainly shown that the three groups of the European interbank network do not have the same response, and that intermediate banks are the most sensitive to liquidity risk.
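As an illustration of the rate dynamics mentioned above, the Vasicek model dr_t = a(b - r_t) dt + sigma dW_t can be simulated over twelve quarterly periods with an Euler-Maruyama scheme (illustrative parameters only, not the calibration used in the paper):

    import numpy as np

    def simulate_vasicek(r0=0.02, a=0.5, b=0.03, sigma=0.01,
                         n_periods=12, dt=0.25, seed=0):
        """Euler-Maruyama simulation of dr = a*(b - r)*dt + sigma*dW
        over twelve three-month periods."""
        rng = np.random.default_rng(seed)
        rates = [r0]
        for _ in range(n_periods):
            dw = rng.normal(0.0, np.sqrt(dt))
            rates.append(rates[-1] + a * (b - rates[-1]) * dt + sigma * dw)
        return np.array(rates)

    print(simulate_vasicek())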

A Distance Function for Data with Missing Values and Its Application

Missing values in data are common in real-world applications. Since the performance of many data mining algorithms depends critically on being given a good metric over the input space, we define in this paper a distance function for unlabeled datasets with missing values. We use the Bhattacharyya distance, which measures the similarity of two probability distributions, to define our new distance function. Under this distance, the distance between two points without missing attribute values is simply the Mahalanobis distance. When, on the other hand, one of the coordinates is missing, the distance is computed according to the distribution of the missing coordinate. Our distance is general and can be used as part of any algorithm that computes the distance between data points. Because its performance depends strongly on the chosen distance measure, we opted for the k-nearest-neighbor classifier to evaluate the ability of our distance to accurately reflect object similarity. We experimented on standard numerical datasets from different fields in the UCI repository. On these datasets we simulated missing values and compared the performance of the kNN classifier using our distance to three other basic methods. Our experiments show that kNN using our distance function outperforms kNN using the other methods. Moreover, the runtime of our method is only slightly higher than that of the other methods.
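For the fully observed case the distance reduces to the Mahalanobis distance, which can be sketched as follows (toy data standing in for the UCI datasets; the Bhattacharyya-based handling of missing coordinates proposed in the paper is not reproduced here):

    import numpy as np

    def mahalanobis(x, y, cov_inv):
        """Mahalanobis distance between two fully observed points,
        given the inverse covariance matrix of the dataset."""
        d = np.asarray(x) - np.asarray(y)
        return float(np.sqrt(d @ cov_inv @ d))

    toy_data = np.random.default_rng(0).normal(size=(100, 3))
    cov_inv = np.linalg.inv(np.cov(toy_data, rowvar=False))
    print(mahalanobis(toy_data[0], toy_data[1], cov_inv))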

An Effective Genetic Algorithm for a Complex Real-World Scheduling Problem

We address a complex scheduling problem arising in the wood panel industry, with the objective of minimizing a quadratic function of job tardiness. The proposed solution strategy, which is based on an effective genetic algorithm, has been coded and implemented within a major Tunisian company, a leader in wood panel manufacturing. Preliminary experimental results indicate a significant decrease in delivery times.
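A simplified, single-machine stand-in for an objective of this general form (hypothetical processing times and due dates; the shop model treated in the paper is more complex):

    def quadratic_tardiness(sequence, processing_times, due_dates):
        """Sum of squared tardiness for a job sequence on a single machine,
        the kind of quadratic tardiness objective a GA chromosome would be
        evaluated against."""
        t, total = 0, 0.0
        for job in sequence:
            t += processing_times[job]               # completion time of this job
            tardiness = max(0, t - due_dates[job])   # lateness clipped at zero
            total += tardiness ** 2
        return total

    # hypothetical 4-job instance
    print(quadratic_tardiness([2, 0, 3, 1], [4, 3, 2, 5], [6, 5, 9, 12]))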

Improved Robust Stability Criteria of a Class of Neutral Lur’e Systems with Interval Time-Varying Delays

This paper addresses the robust stability problem of a class of delayed neutral Lur'e systems. Combining the property of convex functions with the double-integral Jensen inequality, a new triple-integral Lyapunov functional is constructed to derive some new stability criteria. Compared with some related results, the new criteria established in this paper are less conservative. Finally, two numerical examples are presented to illustrate the validity of the main results.
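For reference, one commonly used form of the double-integral Jensen inequality invoked in such derivations states that, for a matrix R > 0 and a delay \tau > 0,

    \frac{\tau^2}{2} \int_{-\tau}^{0}\!\!\int_{t+\theta}^{t} \dot{x}^{T}(s) R \dot{x}(s)\, ds\, d\theta
    \;\ge\;
    \left( \int_{-\tau}^{0}\!\!\int_{t+\theta}^{t} \dot{x}(s)\, ds\, d\theta \right)^{T} R
    \left( \int_{-\tau}^{0}\!\!\int_{t+\theta}^{t} \dot{x}(s)\, ds\, d\theta \right),

which bounds the triple-integral terms appearing in the Lyapunov functional; the specific bounding steps used in the paper may differ.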

Technique for Voltage Control in Distribution System

This paper presents techniques for voltage control in a distribution system, integrated into the distribution management system. Voltage is an important parameter for the control of electrical power systems, and distribution network operators have the responsibility to keep the voltage supplied to consumers within statutory limits. Traditionally, the On-Load Tap Changer (OLTC) transformer equipped with automatic voltage control (AVC) relays has been the most popular and effective voltage control device. A static synchronous compensator (STATCOM) may be equipped with several controllers to perform multiple control functions, while Static Var Compensation (SVC) provides regulation slopes and available margins for var dispatch. The voltage control in distribution networks is established as a centralized analytical function in this paper.
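A minimal sketch of the kind of rule an AVC relay applies to an OLTC transformer (hypothetical deadband and tap step size; not the centralized scheme proposed in the paper):

    def avc_tap_adjustment(measured_v_pu, target_v_pu=1.0,
                           deadband_pu=0.0125, step_pu=0.0125):
        """Return the number of tap steps (positive = raise voltage) needed to
        bring the measured per-unit voltage back inside the deadband."""
        error = target_v_pu - measured_v_pu
        if abs(error) <= deadband_pu:
            return 0                         # within the deadband: no action
        return round(error / step_pu)        # otherwise move enough taps to correct

    print(avc_tap_adjustment(0.96))          # low voltage -> raise taps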

Application of Particle Swarm Optimization for Economic Load Dispatch and Loss Reduction

This paper proposes a particle swarm optimization (PSO) technique to solve economic load dispatch (ELD) problems. For the ELD problem in this work, the objective is to minimize the total fuel cost of all generating units for a given daily load pattern, while the main constraints are the power balance and the generation output limits of each unit. A case study on a test system of 40 generating units with 6 load patterns is presented to demonstrate the performance of PSO in solving the ELD problem. The optimal solution given by PSO provides the minimum total generation cost while satisfying all the constraints and yields significant savings through power loss reduction.
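A compact sketch of PSO applied to ELD with quadratic fuel costs and a power-balance penalty, on a hypothetical three-unit system (illustrative coefficients, not the paper's 40-unit data or its exact constraint handling):

    import numpy as np

    # hypothetical 3-unit system: cost_i = a_i + b_i*P_i + c_i*P_i^2
    a = np.array([500.0, 400.0, 200.0])
    b = np.array([5.3, 5.5, 5.8])
    c = np.array([0.004, 0.006, 0.009])
    p_min = np.array([100.0, 100.0, 50.0])
    p_max = np.array([450.0, 350.0, 225.0])
    demand = 800.0

    def total_cost(p):
        fuel = np.sum(a + b * p + c * p ** 2)
        penalty = 1e4 * abs(np.sum(p) - demand)   # power-balance penalty
        return fuel + penalty

    def pso_eld(n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
        rng = np.random.default_rng(seed)
        pos = rng.uniform(p_min, p_max, size=(n_particles, 3))
        vel = np.zeros_like(pos)
        pbest = pos.copy()
        pbest_val = np.array([total_cost(x) for x in pos])
        gbest = pbest[np.argmin(pbest_val)].copy()
        for _ in range(n_iter):
            r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, p_min, p_max)   # enforce generation limits
            vals = np.array([total_cost(x) for x in pos])
            improved = vals < pbest_val
            pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
            gbest = pbest[np.argmin(pbest_val)].copy()
        return gbest, total_cost(gbest)

    dispatch, cost = pso_eld()
    print(dispatch, cost)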

Properties of a Stochastic Predator-Prey System with Holling II Functional Response

In this paper, a stochastic predator-prey system with Holling type II functional response is studied. First, we show that there is a unique positive solution to the system for any given positive initial value. Then, the stochastic boundedness of the positive solution to the stochastic system is derived. Moreover, sufficient conditions for global asymptotic stability are also established. Finally, some simulation figures are presented to support the analytical findings.
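For orientation, one common formulation of such a system (the exact coefficients and noise structure studied in the paper may differ) perturbs a Holling type II predator-prey model with white noise:

    dx(t) = x(t)\left[ r_1 - b_1 x(t) - \frac{a_1 y(t)}{1 + m x(t)} \right] dt + \sigma_1 x(t)\, dB_1(t),
    dy(t) = y(t)\left[ -r_2 + \frac{a_2 x(t)}{1 + m x(t)} \right] dt + \sigma_2 y(t)\, dB_2(t),

where x and y denote prey and predator densities and B_1, B_2 are independent Brownian motions.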

Reliability Approximation through the Discretization of Random Variables using Reversed Hazard Rate Function

Sometimes it is difficult to determine the exact reliability of complex systems by analytical procedures. An approximate solution to this problem can be obtained through the discretization of random variables. In this paper we describe the usefulness of discretizing a random variable using the reversed hazard rate function of its continuous version. Discretization of the exponential distribution is demonstrated, and applications of this approach are also cited. Numerical calculations indicate that the proposed approach gives a very good approximation of the reliability of complex systems under the stress-strength set-up, and that its performance is better than that of the existing discrete concentration method of discretization. The approach is conceptually simple, handles analytic intractability and reduces computational time. It can be applied in manufacturing industries for producing highly reliable items.
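For reference, the reversed hazard rate of a continuous random variable with density f and distribution function F is

    \tilde{r}(x) = \frac{f(x)}{F(x)},

so that for the exponential distribution with rate \lambda, where F(x) = 1 - e^{-\lambda x},

    \tilde{r}(x) = \frac{\lambda e^{-\lambda x}}{1 - e^{-\lambda x}}, \qquad x > 0.

These are standard definitions; the specific discretization rule built on \tilde{r} and its reliability approximation are developed in the paper itself.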

Future Logistics - Challenges, Requirements and Solutions for Logistics Networks

The importance of logistics has changed enormously in the last few decades. While logistics was formerly one of the core functions of most companies, logistics, or at least parts of its functions, is nowadays outsourced to external logistics service providers on a contractual basis. As a result of this shift, new business models such as the fourth-party logistics provider have emerged, which designs, plans and monitors the resulting logistics networks. This new business model and topics such as Synchromodality or Big Data impose new requirements on the underlying IT, which cannot be met with conventional concepts and approaches. In this paper, the challenges of logistics network monitoring are outlined using a scenario. The most common layers in a logical multi-layered architecture for an information system are used to point out the resulting challenges for IT. In addition, first appropriate solution approaches are introduced.

Particle Swarm Optimization with Interval-valued Genotypes and Its Application to Neuroevolution

The author proposes an extension of particle swarm optimization (PSO) for solving interval-valued optimization problems and applies the extended PSO to the evolutionary training of neural networks (NNs) with interval weights. In the proposed PSO, the values in the genotypes are not real numbers but intervals. Experimental results show that interval-valued NNs trained by the proposed method could approximate hidden target functions well, despite the fact that no training data was explicitly provided.
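One plausible way to carry a standard PSO update over to interval-valued genotypes, sketched here only for illustration (not necessarily the author's actual update rule), is to apply the velocity update to both interval bounds and then re-sort each pair:

    import numpy as np

    def interval_pso_step(pos, vel, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
        """One PSO step where each genotype entry is an interval [lo, hi].
        pos, vel, pbest have shape (n_particles, n_dims, 2); gbest has shape
        (n_dims, 2) and broadcasts. Bounds are re-sorted so lo <= hi."""
        if rng is None:
            rng = np.random.default_rng()
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.sort(pos + vel, axis=-1)   # keep each interval well-ordered
        return pos, vel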

Grid–SVC: An Improvement in SVC Algorithm, Based On Grid Based Clustering

Support vector clustering (SVC) is an important kernel-based clustering algorithm used in many applications. It has two main bottlenecks: the high computational cost and the labeling step. In this paper, we present a modified SVC method, named Grid–SVC, to improve the computational behavior of the original algorithm. We first normalize the data and then partition the domain on which SVC operates using a novel grid-based clustering algorithm. The algorithm partitions the intervals based on the density function of the data set and then forms multi-dimensional grid cells by taking their Cartesian product. Having eliminated many outliers and much of the noise in this preprocessing step, we apply an improved SVC method to each grid cell in parallel. The experimental results show improvements in both time complexity and accuracy.
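A minimal sketch of the normalize-and-partition preprocessing step described above (illustrative grid resolution and density threshold; not the paper's density-based partitioning rule):

    import numpy as np

    def grid_partition(data, cells_per_dim=10, min_density=5):
        """Normalize data to [0, 1], assign each point to a grid cell, and keep
        only cells containing at least `min_density` points; sparse cells are
        treated as outliers or noise."""
        lo, hi = data.min(axis=0), data.max(axis=0)
        normed = (data - lo) / np.where(hi > lo, hi - lo, 1.0)
        cell_ids = np.minimum((normed * cells_per_dim).astype(int), cells_per_dim - 1)
        cells = {}
        for idx, cell in enumerate(map(tuple, cell_ids)):
            cells.setdefault(cell, []).append(idx)
        return {cell: idxs for cell, idxs in cells.items() if len(idxs) >= min_density}

    # each surviving cell's point indices could then be clustered by SVC in parallel
    dense_cells = grid_partition(np.random.default_rng(0).normal(size=(500, 2)))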

Evaluation of Hydrogen Particle Volume on Surfaces of Selected Nanocarbons

This paper describes an approach to modeling adsorption phenomena aimed at identifying the adsorption mechanisms on localized or non-localized adsorbent sites when applied to nanocarbons. The concept follows from the fundamental thermodynamic description of adsorption equilibrium and is based on numerical calculations of the volume of hydrogen particles adsorbed on the surface of selected nanocarbons: a single-walled nanotube and a nanocone. This approach makes it possible to obtain information on the adsorption mechanism and, consequently, to select an appropriate mathematical adsorption model, thus allowing for a more reliable identification of the porous structure of the material. The theoretical basis of the approach is discussed, and newly derived results of the numerical calculations are presented for the selected nanocarbons.

GMDH Modeling Based on Polynomial Spline Estimation and Its Applications

The GMDH algorithm can describe the internal structure of objects well. In the modeling process, automatic screening of the model structure and variables ensures the convergence rate. This paper studies a new GMDH model based on polynomial spline estimation. The polynomial spline function is used in place of the transfer function of GMDH to characterize the relationship between the input variables and the output variables. It is proved that the algorithm has the optimal convergence rate under certain conditions. The empirical results show that the algorithm can forecast the Consumer Price Index (CPI) well.
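A minimal sketch of fitting a polynomial spline of the kind that could stand in for a GMDH transfer function, using SciPy on hypothetical data (not the paper's GMDH implementation or its CPI series):

    import numpy as np
    from scipy.interpolate import UnivariateSpline

    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 50)
    y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, size=x.size)   # hypothetical series

    # cubic smoothing spline relating an input variable to the output variable
    spline = UnivariateSpline(x, y, k=3, s=0.5)
    y_hat = spline(np.linspace(0, 1, 200))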

Kinetic Theory Based CFD Modeling of Particulate Flows in Horizontal Pipes

The numerical simulation of fully developed gas-solid flow in a horizontal pipe is carried out using the Eulerian-Eulerian approach, also known as two-fluid modeling, since both phases are treated as interpenetrating continua. The solid-phase stresses are modeled using the kinetic theory of granular flow (KTGF). The computed results for velocity profiles and pressure drop are compared with experimental data. We observe that the convection and diffusion terms in the granular temperature equation cannot be neglected in the simulation of gas-solid flow along a horizontal pipe. Particle-wall collisions and lift also play an important role in Eulerian modeling. We also investigated the effect of flow parameters such as gas velocity, particle properties and particle loading on the pressure drop predicted for different pipe diameters. The pressure drop increases with gas velocity and particle loading. The gas velocity has the same effect on the predicted pressure drop as in single-phase flow (proportional to U²). With respect to particle diameter, the pressure drop first increases, reaches a peak and then decreases. The peak is a strong function of pipe bore.

Septic B-Spline Collocation Method for Numerical Solution of the Kuramoto-Sivashinsky Equation

In this paper the Kuramoto-Sivashinsky equation is solved numerically by a collocation method. The solution is approximated as a linear combination of septic B-spline functions. Applying the von Neumann stability analysis technique, we show that the method is unconditionally stable. The method is applied to some test examples, and the numerical results are compared with the exact solutions. The global relative error and the L∞ error norm of the solutions show the computational efficiency of the method.
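For reference, a commonly used form of the Kuramoto-Sivashinsky equation (the paper's test problems may use particular coefficients) is

    u_t + u u_x + u_{xx} + \nu u_{xxxx} = 0,

and in a B-spline collocation method the approximate solution is sought as

    U_N(x, t) = \sum_{j} \delta_j(t)\, B_j(x),

where B_j are septic B-spline basis functions and the time-dependent coefficients \delta_j(t) are determined by enforcing the equation at the collocation points.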

Remote Control Software for Rohde and Schwarz Instruments

The paper describes software for remote control and measurement with a new graphical user interface for Rohde & Schwarz instruments. The software allows remote control over Ethernet and supports basic and advanced functions for controlling various types of instruments, such as network and spectrum analyzers, power meters, signal generators and oscilloscopes. Standard Commands for Programmable Instruments (SCPI) and the Virtual Instrument Software Architecture (VISA) are used for the remote control and setup of the instruments. The developed software is modular, with a user-friendly graphical user interface for each instrument and automatic identification of instruments.
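As a minimal illustration of the SCPI-over-VISA combination described above, using the open-source PyVISA library and a placeholder IP address (the paper's software is a separate implementation):

    import pyvisa

    rm = pyvisa.ResourceManager()
    # placeholder VISA resource string for an instrument reachable over Ethernet
    inst = rm.open_resource("TCPIP::192.168.0.10::INSTR")

    # standard SCPI identification query, of the kind used for
    # automatic instrument identification
    print(inst.query("*IDN?"))

    inst.close()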