Enhancing Cache Performance Based on Improved Average Access Time

A high performance computer includes a fast processor and millions of bytes of memory. During data processing, huge amounts of information are shuffled between the memory and the processor. Because of its small size and high speed, the cache has become a common feature of high performance computers. Enhancing cache performance has proved essential to speeding up cache-based computers. Most enhancement approaches can be classified as either software based or hardware controlled. Cache performance is quantified in terms of the hit ratio or miss ratio. In this paper, we optimize cache performance by enhancing the cache hit ratio. Optimum cache performance is obtained by modifying the cache hardware so that missed lines' tags are rejected quickly from the hit-or-miss comparison stage, and thus a low hit time for the wanted line in the cache is achieved. In the proposed technique, which we call Even-Odd Tabulation (EOT), the cache lines brought from main memory into the cache are classified into two types, even line tags and odd line tags, depending on their least significant bit (LSB). The EOT technique exploits this division to reject mismatched line tags in very little time compared to the time spent by the main comparator in the cache, giving an optimum hit time for the wanted cache line. The simulation results show the high performance of the EOT technique against the conventional FAM mapping technique.
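
As a rough illustration of the tag-filtering idea (not the authors' hardware design), the sketch below pre-screens a fully associative tag store by the least significant bit of the requested tag, so that only tags of matching parity reach the full-width comparison; the tag values and widths are made up for the example.

```python
# Illustrative sketch (not the authors' hardware design): pre-filtering cache
# tag comparisons by the least significant bit of the tag, in the spirit of EOT.

def eot_lookup(requested_tag, stored_tags):
    """Return the index of a matching tag, or None on a miss.

    Tags whose LSB differs from the requested tag's LSB are rejected
    immediately, so only about half of the stored tags reach the
    full-width comparison stage.
    """
    lsb = requested_tag & 1
    for index, tag in enumerate(stored_tags):
        if (tag & 1) != lsb:          # quick even/odd rejection
            continue
        if tag == requested_tag:      # full comparison only for same-parity tags
            return index
    return None

# Example: a fully associative set of 8 tag entries.
tags = [0b1010, 0b0111, 0b1100, 0b0001, 0b1111, 0b0010, 0b1001, 0b0110]
print(eot_lookup(0b1001, tags))   # hit  -> 6
print(eot_lookup(0b0100, tags))   # miss -> None
```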

Promoting Mental and Spiritual Health among Postpartum Mothers to Extend Breastfeeding Period

The purpose of this study was to examine postpartum breastfeeding mothers to determine the role their psychosocial and spiritual dimensions play in promoting full-term (6-month duration) breastfeeding of their infants. Purposive and snowball sampling methods were used to identify and recruit the study's participants. A total of 23 postpartum mothers, who were breastfeeding within 6 weeks after giving birth, participated in this study. In-depth interviews combined with observations, participant focus groups, and ethnographic records were used for data collection. The data were then analyzed using content analysis and typology. The results of this study illustrate that postpartum mothers experienced fear and worry that they would lack support from their spouse, family and peers, and that their infant would not get enough milk. It was found that the main barrier mothers faced in breastfeeding to full term was the difficulty of continuing to breastfeed when returning to work. 81.82% of the primiparous mothers and 91.67% of the non-primiparous mothers were able to breastfeed for the desired full term of 6 months. Factors found to be related to breastfeeding for six months included 1) belief and faith in breastfeeding, 2) support from spouse and family members, and 3) counseling from public health nurses and friends. The sample also provided evidence that religious principles such as tolerance, effort, love and compassion toward their infant, and positive thinking, were used in solving their physical, mental and spiritual problems.

An Interval-Based Multi-Attribute Decision Making Approach for Electric Utility Resource Planning

This paper presents an interval-based multi-attribute decision making (MADM) approach that supports the decision process with imprecise information. The proposed decision methodology is based on the model of a linear additive utility function but extends the problem formulation with the measure of composite utility variance. A sample study concerning the evaluation of electric generation expansion strategies is provided, showing how imprecise data may affect the choice of the best solution and how a set of alternatives acceptable to the decision maker (DM) may be identified with a certain confidence.
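
To make the formulation concrete, the sketch below evaluates a linear additive utility function with interval-valued attribute utilities, returning a composite utility interval for each alternative together with a composite utility variance under an assumed independent-uniform model; the weights, alternatives and bounds are illustrative, not taken from the sample study.

```python
# Minimal sketch of an interval-valued linear additive utility model; the data
# and the uniform-distribution assumption for the variance are illustrative.
import numpy as np

weights = np.array([0.5, 0.3, 0.2])            # attribute weights, sum to 1

# Each row: one expansion alternative; each cell: (lower, upper) utility bound.
alternatives = {
    "plan_A": np.array([(0.6, 0.8), (0.4, 0.7), (0.5, 0.9)]),
    "plan_B": np.array([(0.7, 0.9), (0.3, 0.5), (0.4, 0.6)]),
}

for name, intervals in alternatives.items():
    lo, hi = intervals[:, 0], intervals[:, 1]
    u_low = float(weights @ lo)                # worst-case composite utility
    u_high = float(weights @ hi)               # best-case composite utility
    # Composite utility variance, assuming independent uniform single-attribute utilities.
    var = float(np.sum(weights**2 * (hi - lo) ** 2 / 12.0))
    print(f"{name}: U in [{u_low:.3f}, {u_high:.3f}], variance {var:.4f}")
```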

On the Dynamic Behaviour of a Four-Bar Linkage Driven by a Velocity Controlled DC Motor

The dynamic behaviour of a four-bar linkage driven by a velocity-controlled DC motor is discussed in this paper. In particular, the author presents the results obtained by means of specifically developed software which implements the mathematical models of all components of the system (linkage, transmission, electric motor, control devices). The use of this software enables a more efficient design approach, since it allows the designer to check, in a simple and immediate way, the dynamic behaviour of the mechanism resulting from different values of the system parameters.
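
As a minimal illustration of one component of such a simulation, the sketch below integrates a standard velocity-controlled DC motor model (armature circuit, rotor inertia and a PI speed loop) with an explicit Euler scheme; the parameter values and controller gains are assumptions, and the paper's software additionally models the linkage and transmission.

```python
# Minimal sketch of one system component only: a velocity-controlled DC motor
# (armature circuit + rotor inertia + PI speed controller), integrated with an
# explicit Euler step. Parameter values are illustrative, not the paper's.
R, L = 1.0, 0.01          # armature resistance [ohm] and inductance [H]
Kt, Ke = 0.05, 0.05       # torque and back-EMF constants
J, B = 1e-4, 1e-5         # rotor inertia [kg m^2] and viscous friction
Kp, Ki = 0.5, 5.0         # PI gains for the speed loop

omega_ref = 100.0         # speed set point [rad/s]
i = omega = integral = 0.0
dt, T_load = 1e-4, 0.0    # time step [s] and external load torque [N m]

for step in range(int(1.0 / dt)):            # simulate 1 s
    err = omega_ref - omega
    integral += err * dt
    V = Kp * err + Ki * integral             # controller output voltage
    di = (V - R * i - Ke * omega) / L        # armature current dynamics
    domega = (Kt * i - B * omega - T_load) / J
    i += di * dt
    omega += domega * dt

print(f"speed after 1 s: {omega:.1f} rad/s (set point {omega_ref} rad/s)")
```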

An Effective Hybrid Genetic Algorithm for Job Shop Scheduling Problem

The job shop scheduling problem (JSSP) is well known as one of the most difficult combinatorial optimization problems. This paper presents a hybrid genetic algorithm for the JSSP with the objective of minimizing the makespan. The efficiency of the genetic algorithm is enhanced by integrating it with a local search method. The chromosome representation of the problem is based on operations. Schedules are constructed using a procedure that generates full active schedules. In each generation, a local search heuristic based on Nowicki and Smutnicki's neighborhood is applied to improve the solutions. The approach is tested on a set of standard instances taken from the literature and compared with other approaches. The computational results validate the effectiveness of the proposed algorithm.
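
A compact sketch of the overall scheme is given below: an operation-based chromosome, a decoder that computes the makespan, and a genetic loop in which each offspring is improved by a local search step. For brevity the decoder builds semi-active schedules and the local search is a simple adjacent-swap improvement, simpler than the full active-schedule generator and the Nowicki and Smutnicki neighborhood used in the paper; the small instance is invented for illustration.

```python
# Compact sketch of a GA with an operation-based chromosome for the JSSP.
import random

# jobs[j] = list of (machine, processing_time) in technological order.
jobs = [[(0, 3), (1, 2), (2, 2)],
        [(0, 2), (2, 1), (1, 4)],
        [(1, 4), (2, 3), (0, 1)]]
n_jobs, n_ops = len(jobs), len(jobs[0])

def makespan(chrom):
    """Decode an operation-based chromosome into a schedule's makespan."""
    job_step = [0] * n_jobs
    job_ready = [0] * n_jobs
    mach_ready = [0] * (max(m for job in jobs for m, _ in job) + 1)
    for j in chrom:
        machine, p = jobs[j][job_step[j]]
        start = max(job_ready[j], mach_ready[machine])
        job_ready[j] = mach_ready[machine] = start + p
        job_step[j] += 1
    return max(job_ready)

def local_search(chrom):
    """First-improvement adjacent swap (stand-in for a richer neighbourhood)."""
    best = makespan(chrom)
    for i in range(len(chrom) - 1):
        if chrom[i] != chrom[i + 1]:
            chrom[i], chrom[i + 1] = chrom[i + 1], chrom[i]
            if makespan(chrom) < best:
                return chrom
            chrom[i], chrom[i + 1] = chrom[i + 1], chrom[i]
    return chrom

def crossover(p1, p2):
    """Keep genes of p1 at random positions; fill the rest in p2's order."""
    mask = [random.random() < 0.5 for _ in p1]
    counts = [0] * n_jobs
    for g, keep in zip(p1, mask):
        if keep:
            counts[g] += 1
    remaining = [n_ops - counts[j] for j in range(n_jobs)]
    rest = []
    for g in p2:
        if remaining[g] > 0:
            rest.append(g)
            remaining[g] -= 1
    it = iter(rest)
    return [g if keep else next(it) for g, keep in zip(p1, mask)]

pop = [random.sample([j for j in range(n_jobs) for _ in range(n_ops)],
                     n_jobs * n_ops) for _ in range(30)]
for gen in range(100):
    pop.sort(key=makespan)
    nxt = pop[:5]                                   # elitism
    while len(nxt) < len(pop):
        a, b = random.sample(pop[:15], 2)           # selection from the best half
        child = crossover(a, b)
        if random.random() < 0.2:                   # mutation: swap two genes
            i, k = random.sample(range(len(child)), 2)
            child[i], child[k] = child[k], child[i]
        nxt.append(local_search(child))
    pop = nxt
print("best makespan:", makespan(min(pop, key=makespan)))
```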

CFD Analysis of Two Phase Flow in a Horizontal Pipe – Prediction of Pressure Drop

In the design of condensers, the prediction of pressure drop is as important as the prediction of the heat transfer coefficient. Modeling two-phase flow, particularly liquid-vapor flow under diabatic conditions inside a horizontal tube, using CFD analysis is difficult with the two-phase models available in FLUENT because of the continuously changing flow patterns. In the present work, a CFD analysis of two-phase flow of refrigerants inside a horizontal tube of 0.0085 m inner diameter and 1.2 m length is carried out using the homogeneous model under adiabatic conditions. The refrigerants considered are R22, R134a and R407C. The analysis is performed at different saturation temperatures and different flow rates to evaluate the local frictional pressure drop. Using the homogeneous model, average properties are obtained for each refrigerant, which is then treated as a single-phase pseudo-fluid. The pressure drop data so obtained are compared with the separated flow models available in the literature.
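
The following sketch shows the homogeneous-model calculation of the local frictional pressure gradient; the McAdams mixture viscosity and the Blasius friction factor are common choices assumed here, and the mass flux, quality and property values are rough placeholders rather than data from the study.

```python
# Sketch of the homogeneous two-phase model used to estimate the frictional
# pressure gradient; correlations and property values below are assumptions.
D = 0.0085                      # tube inner diameter [m]
G = 300.0                       # mass flux [kg/(m^2 s)], illustrative
x = 0.3                         # vapour quality, illustrative

rho_l, rho_g = 1100.0, 50.0     # liquid / vapour density [kg/m^3], placeholders
mu_l, mu_g = 1.6e-4, 1.2e-5     # liquid / vapour viscosity [Pa s], placeholders

rho_m = 1.0 / (x / rho_g + (1.0 - x) / rho_l)      # homogeneous mixture density
mu_m = 1.0 / (x / mu_g + (1.0 - x) / mu_l)         # McAdams mixture viscosity
Re = G * D / mu_m                                  # mixture Reynolds number
f = 0.079 * Re ** -0.25                            # Blasius friction factor
dpdz = 2.0 * f * G ** 2 / (D * rho_m)              # frictional gradient [Pa/m]
print(f"Re = {Re:.0f}, dP/dz = {dpdz:.0f} Pa/m, over 1.2 m: {1.2 * dpdz:.0f} Pa")
```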

Improvement in Performance and Emission Characteristics of a Single Cylinder S.I. Engine Operated on Blends of CNG and Hydrogen

This paper presents the experimental results of a single cylinder Enfield engine using an electronically controlled fuel injection system, which was developed to carry out exhaustive tests using neat CNG and mixtures of hydrogen in compressed natural gas (HCNG) of 0, 5, 10, 15 and 20% by energy. Experiments were performed at 2000 and 2400 rpm with wide open throttle and varying equivalence ratio. Hydrogen, which has a fast burning rate, enhances the flame propagation rate of compressed natural gas when added to it. The emissions of HC and CO decreased with increasing percentage of hydrogen, but NOx was found to increase. The results indicated a marked improvement in the brake thermal efficiency with the increase in the percentage of hydrogen added. The improvement in thermal efficiency was clearly more pronounced in the lean region than in the rich region. This study is expected to reduce vehicular emissions along with an increase in thermal efficiency and thus help limit further environmental degradation.

Image Segmentation Based on Graph Theoretical Approach to Improve the Quality of Image Segmentation

Graph based image segmentation techniques are considered to be among the most efficient segmentation techniques and are mainly used as time- and space-efficient methods for real time applications. However, there is a need to focus on improving the quality of the segmented images obtained from the earlier graph based methods. This paper proposes an improvement to the graph based image segmentation methods already described in the literature. We contribute to the existing method by proposing the use of a weighted Euclidean distance to calculate the edge weight, which is the key element in building the graph. We also propose a slight modification of the segmentation method already described in the literature, which results in the selection of more prominent edges in the graph. The experimental results show an improvement in segmentation quality compared to the existing methods, with a slight compromise in efficiency.
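
The sketch below illustrates the edge-weight idea on a 4-connected pixel grid, using a weighted Euclidean distance between neighbouring RGB pixels; the channel weights are illustrative assumptions and the merging predicate of the underlying graph-based method is not reproduced here.

```python
# Minimal sketch of the edge-weight idea: a weighted Euclidean distance between
# neighbouring pixels in an RGB image used as the graph edge weight.
import numpy as np

def edge_weight(p, q, channel_weights=(0.30, 0.59, 0.11)):
    """Weighted Euclidean distance between two RGB pixels p and q."""
    w = np.asarray(channel_weights, dtype=float)
    d = np.asarray(p, dtype=float) - np.asarray(q, dtype=float)
    return float(np.sqrt(np.sum(w * d ** 2)))

def build_grid_edges(image):
    """Yield (weight, (r, c), (r2, c2)) for 4-connected neighbouring pixels."""
    rows, cols, _ = image.shape
    for r in range(rows):
        for c in range(cols):
            if c + 1 < cols:
                yield edge_weight(image[r, c], image[r, c + 1]), (r, c), (r, c + 1)
            if r + 1 < rows:
                yield edge_weight(image[r, c], image[r + 1, c]), (r, c), (r + 1, c)

# Example on a tiny random image; in a graph-based segmentation the edges are
# then sorted by weight and regions are merged under a comparison predicate.
img = np.random.randint(0, 256, size=(4, 4, 3))
edges = sorted(build_grid_edges(img), key=lambda e: e[0])
print(len(edges), "edges; smallest weight:", round(edges[0][0], 2))
```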

Decomposition Method for Neural Multiclass Classification Problem

In this article we discuss an improvement to the multiclass classification problem using a multilayer perceptron. The considered approach consists in breaking down the n-class problem into two-class subproblems. Each two-class subproblem is trained independently; in the test phase, the vector to be classified is confronted with all the two-class models, and the elected class is the strongest one, i.e. the one that does not lose any competition against the other classes. The recognition rates obtained with the multiclass approach by two-class decomposition are clearly better than those obtained by the simple multiclass approach.
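
A minimal sketch of this one-against-one decomposition is given below, training one two-class multilayer perceptron per pair of classes and electing the class that loses no pairwise competition; the library, network size and dataset are assumptions made purely for illustration.

```python
# Sketch of one-versus-one decomposition with a multilayer perceptron per pair
# of classes; the "undefeated class" rule follows the idea in the abstract.
from itertools import combinations
import numpy as np
from sklearn.datasets import load_iris
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)
classes = np.unique(y)

# Train one two-class MLP per pair of classes.
pair_models = {}
for a, b in combinations(classes, 2):
    mask = np.isin(y, [a, b])
    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    clf.fit(X[mask], y[mask])
    pair_models[(a, b)] = clf

def classify(x):
    """Return the class that loses no pairwise competition, else the top-voted one."""
    wins = {c: 0 for c in classes}
    for (a, b), clf in pair_models.items():
        winner = clf.predict(x.reshape(1, -1))[0]
        wins[winner] += 1
    n_opponents = len(classes) - 1
    undefeated = [c for c, w in wins.items() if w == n_opponents]
    return undefeated[0] if undefeated else max(wins, key=wins.get)

preds = np.array([classify(x) for x in X])
print("training accuracy:", np.mean(preds == y))
```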

Non-Invasive Capillary Blood Flow Measurement: Laser Speckle and Laser Doppler

Microcirculation is essential for the proper supply of oxygen and nutritive substances to biological tissue and for the removal of the waste products of metabolism. The determination of blood flow in the capillaries is therefore of great interest to clinicians. A comparison is presented here between the developed non-invasive, non-contact, whole-field laser speckle contrast imaging (LSCI) technique and a commercially available laser Doppler blood flowmeter (LDF) for evaluating blood flow at the fingertip and elbow. The LSCI technique gives more quantitative information on the velocity of blood than the perfusion values obtained using the LDF. Measurement of blood flow in capillaries can be of great interest to clinicians in the diagnosis of vascular diseases of the upper extremities.
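
For reference, the basic LSCI computation reduces to a local speckle contrast map, K = sigma/mean over a small sliding window of the raw speckle image, with lower contrast corresponding to higher flow; the sketch below uses a synthetic frame and an assumed window size.

```python
# Sketch of the basic LSCI computation: local speckle contrast K = sigma / mean
# over a small sliding window of a raw speckle image.
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw, window=7):
    """Return the spatial speckle-contrast map of a raw speckle image."""
    raw = raw.astype(float)
    mean = uniform_filter(raw, size=window)
    mean_sq = uniform_filter(raw ** 2, size=window)
    var = np.clip(mean_sq - mean ** 2, 0.0, None)
    return np.sqrt(var) / (mean + 1e-12)

# Synthetic stand-in for a raw speckle frame.
rng = np.random.default_rng(0)
frame = rng.exponential(scale=100.0, size=(128, 128))
K = speckle_contrast(frame)
print("mean contrast:", round(float(K.mean()), 3))   # ~1 for fully developed static speckle
```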

Identifying the Objectives of Outsourcing Logistics Services as a Basis for Measuring Its Financial and Operational Performance

Logistics outsourcing is a growing trend and measuring its performance, a challenge. Performance measurement must be consistent with the objectives set for logistics outsourcing, but we have found no objective-based performance measurement system. We have conducted a comprehensive review of the specialist literature to cover this gap, which has led us to identify and define these objectives. The outcome is a list of the most relevant objectives and their descriptions. This will enable us to analyse in a future study whether the indicators used for measuring logistics outsourcing performance are consistent with the objectives pursued with the outsourcing. If this is not the case, a proposal will be made for a set of financial and operational indicators to measure performance in logistics outsourcing that take the goals being pursued into account.

Modified Fast and Exact Algorithm for Fast Haar Transform

Wavelet transform, or wavelet analysis, is a recently developed mathematical tool in applied mathematics. In numerical analysis, wavelets also serve as a Galerkin basis for solving partial differential equations. The Haar transform, or Haar wavelet transform, is the simplest and earliest example of an orthonormal wavelet transform. Owing to its popularity in wavelet analysis, there are several definitions and various generalizations of, and algorithms for calculating, the Haar transform. The fast Haar transform, FHT, is one of the algorithms that reduce the tedious calculations involved in the Haar transform. In this paper, we present a modified fast and exact algorithm for the FHT, namely the Modified Fast Haar Transform, MFHT. The proposed algorithm allows certain calculations in the decomposition process to be skipped without affecting the results.
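
For orientation, the sketch below implements the classical FHT by repeated averaging and differencing together with its inverse; the 1/2 normalisation is one common convention, and this is the baseline algorithm rather than the proposed MFHT.

```python
# Minimal sketch of the classical fast Haar transform by repeated averaging and
# differencing; the 1/2 normalisation is one common convention.
import numpy as np

def fht(signal):
    """One-dimensional fast Haar transform of a length-2^k signal."""
    out = np.asarray(signal, dtype=float).copy()
    n = len(out)
    while n > 1:
        half = n // 2
        avg = (out[0:n:2] + out[1:n:2]) / 2.0     # approximation coefficients
        dif = (out[0:n:2] - out[1:n:2]) / 2.0     # detail coefficients
        out[:half], out[half:n] = avg, dif
        n = half
    return out

def ifht(coeffs):
    """Inverse of fht, reconstructing the original signal."""
    out = np.asarray(coeffs, dtype=float).copy()
    n = 1
    while n < len(out):
        avg, dif = out[:n].copy(), out[n:2 * n].copy()
        out[0:2 * n:2] = avg + dif
        out[1:2 * n:2] = avg - dif
        n *= 2
    return out

x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
c = fht(x)
print(c)            # first entry is the overall average (7.0)
print(ifht(c))      # recovers the original signal
```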

A Model for Estimation of Efforts in Development of Software Systems

Software effort estimation is the process of predicting the most realistic amount of effort required to develop or maintain software based on incomplete, uncertain and/or noisy input. Effort estimates may be used as input to project plans, iteration plans and budgets. Various models, such as the Halstead, Walston-Felix, Bailey-Basili, Doty and GA based models, have already been used to estimate the software effort for projects. In this study, statistical models, a Fuzzy-GA model and a Neuro-Fuzzy (NF) inference system are used to estimate the software effort for projects. The performance of the developed models was tested on NASA software project datasets, and the results are compared with the Halstead, Walston-Felix, Bailey-Basili, Doty and genetic algorithm based models mentioned in the literature. The NF model shows the best results, with the lowest MMRE and RMSE values, compared with the Fuzzy-GA based hybrid inference system and the other existing models used for effort prediction.
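
The comparison criteria can be made explicit as below: MMRE and RMSE computed for a couple of the classical size-based models in their commonly quoted forms; the project sizes and efforts in the example are placeholders, not the NASA datasets used in the study.

```python
# Sketch of the evaluation criteria used for comparison (MMRE and RMSE), applied
# to two classical size-based effort models as commonly quoted in the
# effort-estimation literature (effort in person-months, size in KLOC).
import numpy as np

def walston_felix(kloc):
    return 5.2 * kloc ** 0.91

def bailey_basili(kloc):
    return 5.5 + 0.73 * kloc ** 1.16

def mmre(actual, predicted):
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean(np.abs(actual - predicted) / actual))

def rmse(actual, predicted):
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.sqrt(np.mean((actual - predicted) ** 2)))

kloc = np.array([2.1, 5.0, 10.4, 46.5])              # placeholder project sizes
actual_effort = np.array([5.0, 24.0, 50.0, 240.0])   # placeholder actual efforts

for name, model in [("Walston-Felix", walston_felix), ("Bailey-Basili", bailey_basili)]:
    pred = model(kloc)
    print(f"{name}: MMRE = {mmre(actual_effort, pred):.2f}, "
          f"RMSE = {rmse(actual_effort, pred):.1f}")
```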

The Impact of an Air-Supply Guide Vane on the Indoor Air Distribution

Indoor air distribution has a great impact on people's thermal sensation. Therefore, how to remove the indoor excess heat becomes an important issue in creating a thermally comfortable indoor environment. To expel the extra indoor heat effectively, this paper uses a dynamic CFD approach to study the effect of an air-supply guide vane swinging periodically on the indoor air distribution within a model room. The numerical results reveal that the indoor heat transfer performance caused by the swinging guide vane is closely related to the number of vortices developing under the inlet cold jet. At larger swing amplitudes, two smaller vortices continued to shed outward under the cold jet and removed the indoor heat load more effectively. As a result, the average Nusselt number on the floor increased with increasing swing amplitude of the guide vane.

Algerian Irrigation in Transition: Effects on Irrigation Profitability in Irrigation Schemes - The Case of the East Mitidja Scheme

In Algeria, the liberalization reforms undertaken since the 1990s have had negative effects on the development and management of irrigation schemes, as well as on the conditions of farmers. Reforms have been undertaken to improve the performance of irrigation schemes, such as the national plan of agricultural development (PNDA) in 2000 and the water pricing policy of 2005. However, after the implementation of these policies, questions have arisen with regard to irrigation performance and its suitability for agricultural development. Hence, the aim of this paper is to provide insight into the profitability of irrigation during the transition period under the current irrigation agricultural policies in Algeria. Using the method of farm crop budget analysis in the East Mitidja irrigation scheme, the returns from using surface water resources, based on farm typology, were found to vary among crops and farmers' groups within the scheme. Irrigation under the current situation is profitable for all farmers, including both those who benefit from subsidies and those who do not. However, the returns to water were found to be very sensitive to crop price fluctuations, particularly for the non-subsidized groups and less so for those whose farming is based on orchards. Moreover, the socio-economic environment of the farmers contributed to the limited impact of the PNDA policy. In fact, the limiting factor is not only water, but also the lack of land ownership title. Market access constraints led to less agricultural investment and therefore to low intensification and low water productivity. It is financially feasible to recover the annual O&M costs in the irrigation scheme. Comparing the irrigation water price, the returns to water, and the O&M costs of water delivery shows that irrigation can be profitable in the future. However, water productivity must be improved by enhancing farmers' income through farming investment, improving access to assets, and allocating activities and crops that bring high returns to water; this could allow the farmers to pay more for water and allow cost recovery for the water systems.

On the Use of Ionic Liquids for CO2 Capturing

In this work, ionic liquids (ILs) for CO2 capture in a typical absorption/stripper process are considered. The use of ionic liquids is considered to be cost-effective because it requires less energy for solvent recovery compared to other conventional processes. A mathematical model is developed for the process based on the Peng-Robinson (PR) equation of state (EoS), which is validated with experimental data for various solutions involving CO2. The model is used to study the sorbent and energy demand for three types of ILs at specific CO2 capture rates. The energy demand is manifested by the vapor-liquid equilibrium temperature necessary to remove the captured CO2 from the used solvent in the regeneration step. It is found that a higher recovery temperature is required for solvents with a higher solubility coefficient. For all ILs, the temperature requirement is less than that required by the typical monoethanolamine (MEA) solvent. The effect of the CO2 loading in the sorbent stream on the process performance is also examined.
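
For reference, the standard Peng-Robinson equation of state on which such a model is typically built has the form:

```latex
% Standard Peng-Robinson EoS (pure-component form) used as the thermodynamic basis.
\begin{aligned}
P &= \frac{RT}{V_m - b} \;-\; \frac{a\,\alpha(T)}{V_m^2 + 2bV_m - b^2},\\[4pt]
a &= 0.45724\,\frac{R^2 T_c^2}{P_c}, \qquad
b = 0.07780\,\frac{R T_c}{P_c},\\[4pt]
\alpha(T) &= \Bigl[1 + \kappa\bigl(1 - \sqrt{T/T_c}\bigr)\Bigr]^2, \qquad
\kappa = 0.37464 + 1.54226\,\omega - 0.26992\,\omega^2 .
\end{aligned}
```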

Computing Center Conditions for Non-analytic Vector Fields with Constant Angular Speed

We investigate planar quasi-septic non-analytic systems which have a center-focus equilibrium at the origin and whose angular speed is constant. The system can be changed into an analytic system by two transformations; with the help of the computer algebra system MATHEMATICA, the conditions for a uniform isochronous center are obtained.

Digital Image Watermarking in the Wavelet Transform Domain

In this paper, we start by characterizing the most important and distinguishing features of wavelet-based watermarking schemes. We studied the overwhelming number of algorithms proposed in the literature. The application scenario of copyright protection is considered and, building on the experience that was gained, two distinctive watermarking schemes were implemented. A detailed comparison and the obtained results are presented and discussed. We conclude that Joo's [1] technique is more robust to standard noise attacks than Dote's [2] technique.
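
As a generic illustration of wavelet-domain embedding (not a reimplementation of Joo's or Dote's schemes), the sketch below adds a pseudo-random watermark to the horizontal detail band of a one-level Haar DWT and checks it by correlation; the embedding strength and the image are assumptions.

```python
# Generic illustration of additive watermark embedding in the wavelet domain.
import numpy as np
import pywt

rng = np.random.default_rng(42)
image = rng.integers(0, 256, size=(256, 256)).astype(float)  # stand-in image

alpha = 2.0                                    # embedding strength (assumed)
cA, (cH, cV, cD) = pywt.dwt2(image, "haar")    # one-level 2-D Haar DWT
watermark = rng.choice([-1.0, 1.0], size=cH.shape)
cH_marked = cH + alpha * watermark             # additive embedding in the detail band
marked = pywt.idwt2((cA, (cH_marked, cV, cD)), "haar")

# Non-blind detection: correlate the change in the detail band with the mark.
_, (cH2, _, _) = pywt.dwt2(marked, "haar")
correlation = float(np.mean((cH2 - cH) * watermark)) / alpha
print("normalised correlation:", round(correlation, 3))   # ~1 when the mark is present
```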

An Experimental Design Approach to Determine the Effects of Operating Parameters on the Rate of Ru-Promoted Ir Carbonylation of Methanol

Carbonylation of methanol in the homogeneous phase is one of the major routes for the production of acetic acid. Amongst the group VIII metal catalysts used in this process, iridium has displayed the best capabilities. To investigate the effects of operating parameters such as temperature, pressure, and the concentrations of methyl iodide, methyl acetate, iridium, ruthenium, and water on the reaction rate, an experimental design for this system based on a central composite design (CCD) was utilized. The statistical rate equation developed by this method contains the individual, interaction and curvature effects of the parameters on the reaction rate. The model, with a p-value less than 0.0001 and R2 values greater than 0.9, confirms a satisfactory fit between the experimental and theoretical studies. In other words, the developed model and the experimental data obtained passed all diagnostic tests, establishing the model as statistically significant.
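
A toy version of the design-and-fit procedure is sketched below with only two coded factors instead of seven: a central composite design is generated and a full quadratic model (individual, interaction and curvature terms) is fitted by least squares; the design, response and noise are invented for illustration.

```python
# Toy sketch of the CCD / response-surface idea with two coded factors only.
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
alpha = np.sqrt(2.0)
factorial = np.array(list(product([-1.0, 1.0], repeat=2)))
axial = np.array([[alpha, 0.0], [-alpha, 0.0], [0.0, alpha], [0.0, -alpha]])
center = np.zeros((5, 2))
X = np.vstack([factorial, axial, center])          # coded design matrix (13 runs)

def true_rate(x1, x2):                             # hypothetical response surface
    return 10 + 3 * x1 + 2 * x2 + 1.5 * x1 * x2 - 1.2 * x1**2 - 0.8 * x2**2

y = true_rate(X[:, 0], X[:, 1]) + rng.normal(0, 0.2, len(X))

# Full quadratic model: 1, x1, x2, x1*x2, x1^2, x2^2
terms = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                         X[:, 0] * X[:, 1], X[:, 0]**2, X[:, 1]**2])
coef, *_ = np.linalg.lstsq(terms, y, rcond=None)
ss_res = np.sum((y - terms @ coef) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print("fitted coefficients:", np.round(coef, 2))
print("R^2 =", round(1 - ss_res / ss_tot, 3))
```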

Laplace Decomposition Approximation Solution for a System of Multi-Pantograph Equations

In this work we adopt a combination of the Laplace transform and the decomposition method to find numerical solutions of a system of multi-pantograph equations. The procedure leads to rapid convergence of the series to the exact solution after computing a few terms. The effectiveness of the method is demonstrated in some examples by obtaining the exact solution, and in others by computing the absolute error, which decreases as the number of terms of the series increases.
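
One common way of writing a single multi-pantograph equation and the resulting Laplace decomposition recursion is sketched below; the paper treats systems of such equations, so this is only the scalar template.

```latex
% Scalar multi-pantograph equation and one common Laplace decomposition recursion
% (assumed template; the paper handles systems of such equations).
\begin{aligned}
u'(t) &= \lambda u(t) + \sum_{j=1}^{l} \mu_j\, u(q_j t) + f(t),
\qquad 0 < q_j < 1,\quad u(0)\ \text{given},\\[4pt]
u(t) &= \sum_{n=0}^{\infty} u_n(t), \qquad
u_0(t) = \mathcal{L}^{-1}\!\Bigl[\tfrac{u(0)}{s} + \tfrac{1}{s}\,\mathcal{L}\{f\}(s)\Bigr],\\[4pt]
u_{n+1}(t) &= \mathcal{L}^{-1}\!\Bigl[\tfrac{1}{s}\,
\mathcal{L}\Bigl\{\lambda u_n(t) + \sum_{j=1}^{l}\mu_j\, u_n(q_j t)\Bigr\}(s)\Bigr],
\qquad n \ge 0 .
\end{aligned}
```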