Dynamic Safety-Stock Calculation

To ensure a high service level, industrial enterprises have to maintain safety stock, which at the same time directly influences economic efficiency. This paper analyses established mathematical methods for calculating safety stock. The performance, measured in terms of stock and service level, is appraised and the limits of several methods are depicted. Afterwards, a new dynamic approach is presented that yields a comprehensive method for calculating safety stock, one that also takes knowledge of future volatility into account.
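The classical static formula behind such calculations sets safety stock to a service-level quantile times the demand standard deviation over the lead time; a dynamic variant re-estimates that deviation from a rolling window of recent demand. The sketch below illustrates both under these standard textbook assumptions; the function names and the window parameter are illustrative and do not reproduce the paper's own method.

```python
from math import sqrt
from statistics import NormalDist

def safety_stock(service_level: float, sigma_demand: float, lead_time: float) -> float:
    """Classical static safety stock: z * sigma_d * sqrt(L),
    where z is the standard normal quantile of the target service level."""
    z = NormalDist().inv_cdf(service_level)
    return z * sigma_demand * sqrt(lead_time)

def dynamic_safety_stock(service_level: float, demand_history: list,
                         lead_time: float, window: int = 12) -> float:
    """Dynamic variant: re-estimate the demand deviation from a rolling
    window so the buffer tracks recent (or forecast) volatility."""
    recent = demand_history[-window:]
    mean = sum(recent) / len(recent)
    sigma = sqrt(sum((d - mean) ** 2 for d in recent) / (len(recent) - 1))
    return safety_stock(service_level, sigma, lead_time)
```

For a 95% service level, a demand deviation of 20 units and a lead time of 4 periods, the static formula gives roughly 1.645 × 20 × 2 ≈ 66 units.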

A New Floating Point Implementation of Base 2 Logarithm

Logarithms reduce products to sums and powers to products; they play an important role in signal processing, communication, and information theory. They are primarily used for hardware calculations, handling multiplications, divisions, powers, and roots effectively. There are three commonly used bases for logarithms: the logarithm with base 10 is called the common logarithm, the logarithm with base e the natural logarithm, and the logarithm with base 2 the binary logarithm. This paper demonstrates different methods of calculating log2, showing the complexity of each, identifies the most accurate and efficient among them, and gives insights into their hardware design. We present a new method called Floor Shift for fast calculation of log2, and then combine this algorithm with a Taylor series to improve the accuracy of the output, which we illustrate using two examples. We finally compare the algorithms and conclude with our remarks.
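The abstract does not give the Floor Shift algorithm itself; as a rough sketch of the general idea, the integer part of log2 can be obtained by shifting the argument into [1, 2) while counting the shifts, and the fractional part refined with the Taylor series of ln(1+f). The function name and term count below are illustrative assumptions, not the paper's design.

```python
import math

def log2_floor_shift_taylor(x: float, terms: int = 12) -> float:
    """Approximate log2(x) for x > 0: normalize x into [1, 2) by shifts
    (the count is the floor, i.e. integer, part of log2), then refine the
    fractional part with the series ln(1+f) = f - f^2/2 + f^3/3 - ..."""
    k = 0
    while x >= 2.0:          # shift right: divide by 2
        x /= 2.0
        k += 1
    while x < 1.0:           # shift left: multiply by 2
        x *= 2.0
        k -= 1
    f = x - 1.0              # now 0 <= f < 1 and log2(x0) = k + ln(1+f)/ln(2)
    s, term = 0.0, 1.0
    for n in range(1, terms + 1):
        term *= f
        s += term / n if n % 2 else -term / n
    return k + s / math.log(2.0)
```

With 12 series terms the result is very accurate when the mantissa is close to 1; near f ≈ 1 the alternating series converges slowly, which is exactly the accuracy/cost trade-off a hardware design must weigh.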

Bio-Surfactant Production and Its Application in Microbial EOR

There are various sources of energy available worldwide and among them, crude oil plays a vital role. Oil recovery is achieved using conventional primary and secondary recovery methods. In order to recover the remaining residual oil, technologies like Enhanced Oil Recovery (EOR), also known as tertiary recovery, are utilized. Among EOR techniques, Microbial Enhanced Oil Recovery (MEOR) improves oil recovery through the injection of bio-surfactant produced by microorganisms. Bio-surfactant can retrieve unrecoverable oil from the cap rock, where it is held by high capillary force. Bio-surfactant is a surface-active agent which can reduce the interfacial tension and the viscosity of oil, so that oil can be recovered to the surface as its mobility is increased. Research in this area has shown promising results; besides, the method is eco-friendly and cost-effective compared with other EOR techniques. In our research, we produced bio-surfactant on a laboratory scale using the strain Pseudomonas putida (MTCC 2467) and injected it into a simple designed sand-packed column which resembles an actual petroleum reservoir. The experiment was conducted to determine the efficiency of the produced bio-surfactant in oil recovery. The column was made of plastic, 10 cm in length and 2.5 cm in diameter, and was packed with fine sand. The sand was saturated with brine initially, followed by oil saturation. Water flooding followed by bio-surfactant injection was done to determine the amount of oil recovered. Further, the injected bio-surfactant volume was varied to check how effectively oil recovery can be achieved. A comparative study was also done by injecting Triton X 100, a chemical surfactant. Since the bio-surfactant reduced surface and interfacial tension, oil could be easily recovered from the porous sand-packed column.

Derivation of Monotone Likelihood Ratio Using Two Sided Uniformly Normal Distribution Techniques

In this paper, two-sided uniformly normal distribution techniques were used in the derivation of the monotone likelihood ratio. The approach mainly employed the parameters of the distribution for a class of all size-α tests. The derivation technique is fast, direct, and less burdensome when compared to some existing methods.
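For reference, the standard definition of the monotone likelihood ratio property being derived is the following (textbook convention; the paper's own notation may differ):

```latex
% Standard MLR definition: the family \{f_\theta\} has a monotone
% likelihood ratio in a statistic T(x) if
\text{for all } \theta_1 < \theta_2:\quad
\frac{f_{\theta_2}(x)}{f_{\theta_1}(x)} \text{ is nondecreasing in } T(x).
% By the Karlin-Rubin theorem, MLR yields uniformly most powerful
% size-\alpha tests for one-sided hypotheses.
```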

Residential Self-Selection and Its Effects on Urban Commute Travels in Iranian Cities Compared to US, UK, and Germany

Residential self-selection has gained increasing attention in Western travel behavior research during the past decade. Many studies in the US, UK, and Germany conclude that the role of individuals’ residential location choice in commute travel behavior is more important than that of the built environment, or at least that it has considerable effects. However, the effectiveness of location choice in many countries and cultures, such as Iran, is unclear. This study examines self-selection in two neighborhoods in Tehran. As part of a research project on the influences of land use on travel behavior, information about people’s location preferences was collected by direct questioning. The findings show that the main reasons for selecting the location of residential units are related to socio-economic factors such as the rise of house prices and housing affordability. Transportation has little impact on location decisions. Moreover, residential self-selection accounts for only 3 to 7.5 percent of pedestrian, public transport (PT), and car trips.

Adaptive Score Normalization: A Novel Approach for Multimodal Biometric Systems

Multimodal biometric systems integrate the data presented by multiple biometric sources, hence offering better performance than systems based on a single biometric modality. Although the coupling of biometric systems can be done at different levels, fusion at the score level is the most common, since it has proven more effective than the other fusion levels. However, the scores from different modalities are generally heterogeneous; a normalization step is needed to transform these scores into a common domain before combining them. In this paper, we study the performance of several normalization techniques with various fusion methods in the context of fusing three unimodal systems based on the face, the palmprint, and the fingerprint. We also propose a new adaptive normalization method that takes into account the distributions of client scores and impostor scores. Experiments conducted on a database of 100 people show that the performance of a multimodal system depends on the choice of the normalization method and the fusion technique. The proposed normalization method gave the best results.
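The adaptive method itself is not specified in the abstract; as background, two of the standard normalization techniques such studies compare, plus simple sum-rule fusion, can be sketched as follows (pure Python, illustrative names):

```python
def min_max_normalize(scores: list) -> list:
    """Map scores linearly into [0, 1]."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) for s in scores]

def z_score_normalize(scores: list) -> list:
    """Center scores at zero mean and scale by the (population) std."""
    n = len(scores)
    mean = sum(scores) / n
    std = (sum((s - mean) ** 2 for s in scores) / n) ** 0.5
    return [(s - mean) / std for s in scores]

def sum_rule_fusion(per_modality: list) -> list:
    """Average each sample's normalized scores across modalities."""
    return [sum(col) / len(col) for col in zip(*per_modality)]
```

An adaptive scheme would additionally estimate the client and impostor score distributions per modality and choose or parameterize the mapping accordingly.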

Mechanical Properties of Pea Pods (Pisum sativum Var. Shamshiri)

Knowledge of the mechanical resistance of pea pods against dynamic forces is important for the design of combine harvesters. In pea combine harvesters, threshing is accomplished by two mechanical actions: impact and friction forces. In this research, the effects of initial moisture content and of the impact and friction energy required on the threshing of pea pods were studied. An impact device was built based on a pendulum mechanism. The experiments were done at three initial moisture content levels of 12.1, 23.5 and 39.5% (w.b.) for both the impact and friction methods. Three energy levels of 0.088, 0.126 and 0.202 J were used for the impact method, and three energy levels of 0.784, 0.930 and 1.351 J for the friction method. The threshing percentage was measured for each method. Using a frictional device, the kinetic friction coefficients at the above moisture contents were measured as 0.257, 0.303 and 0.336, respectively. The analysis of variance for the two methods showed that moisture content and energy have significant effects on the threshing percentage.

Dimensional Variations of Cement Matrices in the Presence of Metal Fibers

The objective of this study is to present and analyze the feasibility of using steel fibers as reinforcement in the cementitious matrix, to minimize the effect of free shrinkage, which is a major cause of the cracks that can be observed on concrete structures, and also to improve the mechanical resistance of the reinforced concrete. The experimental study was performed on specimens with geometric characteristics adapted to the testing. The shrinkage tests were applied to prismatic specimens equipped with rods fixed at the ends, with different dosages of fibers; it should be noted that the fibers used are hooked-end fibers of 50 mm length and a slenderness of 67. The results show that the compressive strength and flexural strength increase as the degree of incorporation of fibers increases, and that the shrinkage deformations are generally less important for fiber-reinforced concrete than those appearing in concrete without fibers.

The Evaluation of Load-Bearing Capacity of the Planar CHS Joint Using Finite Modeling

The subject of this paper is to verify the behavior of a truss-type CHS joint which is beyond the scope of use of EN 1993-1-8. This is performed using numerical modeling in the ANSYS program and the analytical methods recommended in the CIDECT publication. Recommendations for the numerical modeling of such types of joints, as well as for the evaluation of the load-bearing capacity of the joint, are given in this paper. The results from the analytical and numerical models are compared.

Anthropometric Correlates of Balance Performance in Non-Institutionalized Elderly

Purpose: The fear of falling is a major concern among the elderly. Sixty-five percent of individuals older than 60 years of age experience loss of balance, often on a daily basis. Therefore, balance assessment in the elderly deserves special attention due to its importance in functional mobility and safety. This study aimed to assess balance performance and compare some anthropometric parameters in a Nigerian non-institutionalized elderly population. Methods: Sixty-one elderly subjects (31 males and 30 females) participated in this study. Their ages ranged between 62 and 84 years. Ability to maintain balance was assessed using the Functional Reach Test (FRT) and the Sharpened Romberg Test (SRT). Anthropometric data including age, weight, height, arm length, leg length, bi-acromial breadth, foot length and trunk length were also collected. Analysis was done using Pearson’s Product Moment Correlation Coefficient and the Independent t-test, while the level of significance was set at p

Impact of Liquidity Crunch on Interbank Network

Most empirical studies have analyzed how liquidity risks faced by individual institutions turn into systemic risk. The recent banking crisis has highlighted the importance of grasping and controlling systemic risk, and the willingness of central banks to ease their monetary policies to save defaulting or illiquid banks. This last point suggests that banks may pay less attention to liquidity risk, which, in turn, can become a new important channel of loss. Financial regulation focuses on the most important and “systemic” banks in the global network. However, to quantify the expected loss associated with liquidity risk, it is worth analyzing the sensitivity to this channel of the various elements of the global bank network. A small bank is not considered potentially systemic; however, the interaction of many small banks together can become a systemic element. This paper analyzes the impact of the interaction of medium and small banks on a set of banks considered the core of the network. The proposed method uses the structure of an agent-based model in a two-class environment. In the first class, data from the actual balance sheets of 22 large and systemic banks (such as BNP Paribas or Barclays) are collected. In the second, to model a network as close as possible to the actual interbank market, 578 fictitious banks, smaller than those belonging to the first class, have been split into two groups of small and medium banks. All banks are active on the European interbank network and have deposit and market activity. A simulation of 12 three-month periods, representing a mid-term time interval of three years, is projected. In each period, there is a set of behavioral descriptions: repayment of matured loans, liquidation of deposits, income from securities, collection of new deposits, new demands for credit, and securities sales. The last two actions are part of the refunding process developed in this paper.
To strengthen the reliability of the proposed model, the dynamics of the random parameters are managed with stochastic equations, as rates whose variations are generated by a Vasicek model. The Central Bank is considered the lender of last resort, which allows banks to borrow at the REPO rate, and some conditions for the ejection of banks from the system are introduced. A liquidity crunch due to an exogenous crisis is simulated in the first class, and the loss impact on the other bank classes is analyzed through aggregate values representing the aggregate of loans and/or the aggregate of borrowing between classes. It is mainly shown that the three groups of the European interbank network do not have the same response, and that intermediate banks are the most sensitive to liquidity risk.
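The rate dynamics named above follow the Vasicek mean-reverting model dr = κ(θ − r)dt + σ dW; a minimal Euler-Maruyama simulation over the paper's horizon of 12 quarterly steps might look as follows (parameter values are illustrative, not those of the study):

```python
import math
import random

def vasicek_path(r0, kappa, theta, sigma, dt, steps, seed=None):
    """Euler-Maruyama discretization of dr = kappa*(theta - r) dt + sigma dW:
    each step pulls the rate toward the long-run mean theta at speed kappa,
    plus a Gaussian shock of standard deviation sigma*sqrt(dt)."""
    rng = random.Random(seed)
    r, path = r0, [r0]
    for _ in range(steps):
        shock = rng.gauss(0.0, math.sqrt(dt))
        r += kappa * (theta - r) * dt + sigma * shock
        path.append(r)
    return path

# 12 quarterly steps (dt = 0.25 years) covering a three-year horizon
rates = vasicek_path(r0=0.02, kappa=0.5, theta=0.03, sigma=0.01, dt=0.25,
                     steps=12, seed=42)
```

Setting σ = 0 recovers pure mean reversion toward θ, a quick sanity check on the discretization.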

A Distance Function for Data with Missing Values and Its Application

Missing values in data are common in real-world applications. Since the performance of many data mining algorithms depends critically on being given a good metric over the input space, we decided in this paper to define a distance function for unlabeled datasets with missing values. We use the Bhattacharyya distance, which measures the similarity of two probability distributions, to define our new distance function. Under this distance, the distance between two points without missing attribute values is simply the Mahalanobis distance. When, on the other hand, one of the coordinates has a missing value, the distance is computed according to the distribution of the missing coordinate. Our distance is general and can be used as part of any algorithm that computes the distance between data points. Because its performance depends strongly on the chosen distance measure, we opted for the k-nearest-neighbor classifier to evaluate the distance function's ability to accurately reflect object similarity. We experimented on standard numerical datasets from different fields in the UCI repository. On these datasets we simulated missing values and compared the performance of the kNN classifier using our distance to three other basic methods. Our experiments show that kNN using our distance function outperforms kNN using the other methods. Moreover, the runtime of our method is only slightly higher than that of the other methods.
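The construction can be illustrated roughly as follows. For complete vectors it reduces to the Mahalanobis distance; for a missing coordinate, one plausible simplification (not the paper's exact Bhattacharyya-based formula) substitutes the coordinate's expected value under the data distribution before measuring:

```python
import numpy as np

def missing_aware_distance(x, y, mean, cov):
    """Distance for vectors that may contain np.nan entries.
    Complete vectors yield the plain Mahalanobis distance; a missing
    coordinate is imputed with its marginal mean, a crude stand-in for
    integrating over that coordinate's distribution."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    x = np.where(np.isnan(x), mean, x)
    y = np.where(np.isnan(y), mean, y)
    d = x - y
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))
```

Such a function can then be plugged in as the metric of, for example, a kNN classifier, matching the evaluation setup described above.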

A Study of Priority Evaluation and Resource Allocation for Revitalization of Cultural Heritages in the Urban Development

Proper maintenance and preservation of significant cultural heritage sites or historic buildings is necessary. It not only enhances environmental benefits and a sense of community, but also preserves a city's history and people’s memory, allowing the next generation to get a glimpse of our past and achieving the goal of sustainably preserved cultural assets. However, the management of maintenance work has so far not been appropriate for many designated heritage sites or historic buildings, and the planning and implementation of reuse has yet to achieve a breakthrough specification. This leads heritage sites to the mere formality of being “reserved”, instead of the real meaning of “conservation”. The study of the restoration and preservation of cultural heritage is very important, given considerations of historical significance, symbolism, and economic benefits. However, decision makers, such as officials from the public sector, often face the question of which heritage site should be prioritized for restoration under the available limited budgets. Only very few techniques are available today to determine appropriate restoration priorities for diverse historical heritage sites, perhaps because no systematized decision-making aids have been proposed before. In the past, discussions of the management and maintenance of cultural assets were limited to the selection of reuse alternatives rather than the allocation of resources. In view of this, this research adopts integrated research methods to solve the problems that decision-makers encounter when allocating resources for the management and maintenance of heritage sites and historic buildings. The purpose of this study is to develop a sustainable decision-making model for local governments to resolve these problems. We propose an alternative decision support model to prioritize restoration needs within limited budgets.
The model is constructed based on the fuzzy Delphi method, the fuzzy analytic network process (FANP) and goal programming (GP). In order to avoid misallocating resources, this research proposes a precise procedure that takes the views of multiple stakeholders, limited costs and resources into consideration. The combination of many factors and goals has also been taken into account to find the highest-priority and feasible solution. To illustrate the proposed approach, seven cultural heritage sites in Taipei City have been used as an empirical study, and the results are analyzed in depth to explain the application of our approach.
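The full fuzzy Delphi/FANP/GP pipeline is beyond the scope of an abstract, but the allocation step it ultimately supports can be caricatured as ranking sites by priority weight per unit cost and funding down the list until the budget is exhausted. The data and field names below are hypothetical, and this greedy ranking is only an illustrative stand-in for the actual goal program:

```python
def allocate_budget(projects, budget):
    """Greedy stand-in for the model's allocation stage: fund heritage
    restoration projects in descending weight-per-cost order while the
    remaining budget allows."""
    ranked = sorted(projects, key=lambda p: p["weight"] / p["cost"], reverse=True)
    funded, remaining = [], budget
    for p in ranked:
        if p["cost"] <= remaining:
            funded.append(p["name"])
            remaining -= p["cost"]
    return funded, remaining

sites = [
    {"name": "Site A", "weight": 0.9, "cost": 50},  # hypothetical priority weights
    {"name": "Site B", "weight": 0.6, "cost": 30},
    {"name": "Site C", "weight": 0.3, "cost": 40},
]
funded, left = allocate_budget(sites, budget=80)
```

The real model instead solves a goal program over multiple stakeholder goals; the sketch only shows the shape of the input (weights, costs, a budget) and output (a funded subset).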

Comparison of Two Types of Preconditioners for Stokes and Linearized Navier-Stokes Equations

To solve saddle point systems efficiently, several preconditioners have been published. There are many methods for constructing preconditioners for linear systems arising from saddle point problems; for instance, the relaxed dimensional factorization (RDF) preconditioner and the augmented Lagrangian (AL) preconditioner are used for both steady and unsteady Navier-Stokes equations. In this paper we compare the RDF preconditioner with the modified AL (MAL) preconditioner to show which is more effective for solving Navier-Stokes equations. Numerical experiments indicate that the MAL preconditioner is more efficient and robust, especially for moderate viscosities and stretched grids in steady problems. For unsteady cases, the convergence rate of the RDF preconditioner is slightly faster than that of the MAL preconditioner in some circumstances, but the parameter of the RDF preconditioner is more sensitive than that of the MAL preconditioner. Moreover, the convergence rate of the MAL preconditioner is still quite acceptable. Therefore we conclude that the MAL preconditioner is more competitive than the RDF preconditioner. These experiments are implemented with the IFISS package.

An Application of the Data Mining Methods with Decision Rule

Rankings for the output of China's main agricultural commodities in the world for 1978, 1980, 1990, 2000, 2006, 2007 and 2008 have been released in the United Nations FAO Database. Unfortunately, the ranking of the output of Chinese cotton lint in the world for 2008 was missing. This paper uses sequential data mining methods with decision rules to fill this gap. This new data mining method will help to further improve the United Nations FAO Database.

Effect of Processing Methods on Texture Evolution in AZ31 Mg Alloy Sheet

The textures of AZ31 Mg alloy sheets were evaluated using the neutron diffraction method in this study. The AZ31 sheets were fabricated either by conventional casting and subsequent hot rolling or by strip casting. The effect of warm rolling was investigated using the AZ31 Mg alloy sheet produced by conventional casting. Warm rolling with a 30% thickness reduction per pass was possible without any side-cracks at temperatures as low as 200 °C at a roll speed of 30 m/min. The initial microstructure of the conventionally cast specimen was found to be partially recrystallized. Grain refinement was found to occur actively during warm rolling. The (0002), (10-10), (10-11), and (10-12) complete pole figures were measured using the HANARO FCD (Neutron Four Circle Diffractometer) and the ODFs were calculated. The major texture of all specimens can be expressed as an ND//(0001) fiber texture. The texture of the hot-rolled specimen showed the strongest fiber component, while that of the strip-cast sheet appeared close to a random distribution.

Mechanical Quadrature Methods for Solving First Kind Boundary Integral Equations of Stationary Stokes Problem

By means of Sidi-Israeli’s quadrature rules, mechanical quadrature methods (MQMs) for solving the first kind boundary integral equations (BIEs) of the steady-state Stokes problem are presented. The convergence of the numerical solutions by MQMs is proved based on Anselone’s collectively compact and asymptotically compact theory, and asymptotic expansions of the errors with odd powers are provided, which implies that the approximations by MQMs possess the high accuracy order O(h^3). Finally, numerical examples show the efficiency of our methods.

Some Preconditioners for Block Pentadiagonal Linear Systems Based on New Approximate Factorization Methods

In this paper, to obtain a high-efficiency parallel algorithm for solving sparse block pentadiagonal linear systems suitable for vector and parallel processors, stair matrices are used to construct parallel polynomial approximate inverse preconditioners. These preconditioners are appropriate when the desired target is to maximize parallelism. Moreover, some theoretical results about these preconditioners are presented, and how to construct preconditioners effectively for any nonsingular block pentadiagonal H-matrix is also described. In addition, the effectiveness of these preconditioners is illustrated with some numerical experiments arising from the two-dimensional biharmonic equation.

Evaluation of Energy and Environmental Aspects of Reduced Tillage Systems Applied in Maize Cultivation

In maize growing technologies, tillage operations are the most time-consuming and require the greatest fuel input. Substituting conventional tillage, involving deep ploughing, with other reduced tillage methods can reduce technological production costs, diminish soil degradation and environmental pollution from greenhouse gas emissions, and improve the economic competitiveness of agricultural produce. Experiments designed to assess the energy and environmental aspects associated with different reduced tillage systems applied in maize cultivation were conducted at Aleksandras Stulginskis University, taking into account Lithuania’s economic and climate conditions. The study involved 5 tillage treatments: deep ploughing (DP, control), shallow ploughing (SP), deep cultivation (DC), shallow cultivation (SC) and no-tillage (NT). Our experimental evidence suggests that with the application of reduced tillage systems it is feasible to reduce fuel consumption by 13-58% and working time input by 8.4% to nearly 3-fold, to reduce the cost price of maize cultivation operations, and to decrease environmental pollution with CO2 gas by 30 to 146 kg ha-1, compared with deep ploughing.

Problem Based Learning in B. P. Koirala Institute of Health Sciences

Problem-based learning has been one of the most highly acclaimed learning methods in medical education since its first introduction at McMaster University in Canada in the 1960s. It has now been adopted as a teaching-learning method in many medical colleges of Nepal. B. P. Koirala Institute of Health Sciences (BPKIHS), a health sciences deemed university, is the second institute in Nepal to establish a problem-based learning academic program and a need-based teaching approach, minimizing teaching through lectures, since its inception. During the first two years of the MBBS course, the curriculum is divided into various organ systems, each incorporating a problem-based learning exercise of one week’s duration.