Open Source Library Management System Software: A Review

Library management systems are widely used in educational institutions. Many commercial products are available; however, many institutions cannot afford them, and open source software is an alternative in such situations. This paper reviews the open source library management system packages currently available. The review focuses on their ability to provide four basic components: traditional services, interlibrary loan management, management of electronic materials, and common management functions such as security, alerts and statistical reports. In addition, the environment, basic requirements and support aspects of each open source package are discussed.

DJess: A Knowledge-Sharing Middleware to Deploy Distributed Inference Systems

This paper presents DJess, a novel distributed production system that provides an infrastructure for sharing factual and procedural knowledge. DJess is a Java package that gives programmers a lightweight middleware through which inference systems implemented in Jess and running on different nodes of a network can communicate. Communication and coordination among the inference systems (agents) are achieved by letting each agent transparently and asynchronously reason over inferred knowledge (facts) that may have been collected and asserted by other agents, on the basis of inference code (rules) that may be local or transmitted from any node to any other node.
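As a point of reference only, the sketch below shows a single Jess inference node of the kind DJess interconnects, assuming the standard Jess API (jess.Rete with eval and run); the rule and fact names are purely illustrative and not taken from the paper.

```java
import jess.JessException;
import jess.Rete;

// One local Jess engine; in DJess the asserted facts and the rule could just as
// well originate from other agents on remote nodes (names below are illustrative).
public class LocalInferenceNode {
    public static void main(String[] args) throws JessException {
        Rete engine = new Rete();

        // Inference code (rule) over ordered facts.
        engine.eval("(defrule alert-on-fault (fault ?src) => " +
                    "(printout t \"fault reported by \" ?src crlf))");

        // Factual knowledge; a remote agent could have asserted this instead.
        engine.eval("(assert (fault node-42))");

        engine.run();   // fire the activated rules
    }
}
```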

A Refined Application of QFD in SCM: A New Approach

In the new century customers express globally increasing demands, networks of interconnected businesses have been established, and the management of such networks appears to be a major key to gaining competitive advantage. Supply chain management encompasses these managerial activities, and within a supply chain quality plays a critical role. QFD is a widely used tool that not only serves to bring quality to the final products or service packages required by the end customer or the retailer, but can also establish a satisfactory relationship with the initial customer, that is, the wholesaler. However, the wholesalers' cooperation depends considerably on capabilities that are heavily tied to their locations and circumstances. It is therefore undeniable that, for any company, each wholesaler carries a specific importance ratio which can heavily influence the figures calculated in the House of Quality (HOQ) in QFD. Moreover, given today's competitive marketplace, it is widely recognized that consumers' expressed demands are highly volatile over the production period. Such instability is tangible, and taking it into account during the analysis of the HOQ is influential and doubtless required for a more reliable outcome. This article demonstrates the viability of applying the Analytic Network Process (ANP) to account for the wholesalers' reputation and simultaneously introduces a mortality coefficient for the reliability and stability of the consumers' expressed demands over time. The paper then elaborates on the contributory factors and approaches for calculating such coefficients. It concludes that an empirical application is needed to achieve broader validity.
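Purely as an illustration of the adjustment described above, and in our own notation rather than the paper's, the HOQ importance of each demand could be scaled by the wholesaler's ANP-derived importance ratio and by a time-decaying mortality coefficient:

\[ \tilde{r}_i(t) = w \, e^{-\lambda_i t} \, r_i, \]

where \(r_i\) is the originally stated importance of demand \(i\), \(w\) is the importance ratio of the wholesaler under consideration obtained from the ANP, and \(e^{-\lambda_i t}\) models the decay in the reliability of that demand over the production period \(t\); the exponential form is only one plausible choice for such a coefficient.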

Application of Java-based Pointcuts in Aspect Oriented Programming (AOP) for Data Race Detection

The wide use of concurrent programming in software development leads to various concurrency errors, among which data races are the most important. Java provides strong support for concurrent programming through its concurrency packages. Aspect-oriented programming (AOP) is a modern programming paradigm that facilitates the runtime interception of events of interest and can be used effectively to handle concurrency problems. AspectJ, an aspect-oriented extension to Java, makes it possible to apply AOP concepts to data race detection. Volatile variables are usually considered thread safe, but they can become candidates for data races if non-atomic operations are performed on them concurrently. Various data race detection algorithms have been proposed in the past, but this issue of volatility and atomicity remains unaddressed. The aim of this research is to propose conditions for detecting data races on volatile fields in Java programs, taking into account the support for atomicity in the Java concurrency packages and making use of pointcuts. Two simple test programs demonstrate the results, which are verified on two different Java Development Kits (JDKs) for comparison.
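The following sketch (not taken from the paper) illustrates the general idea in AspectJ: a pointcut intercepts writes to a volatile field so that a detector can observe the interleaving of non-atomic updates. The class and field names are hypothetical.

```java
// Hypothetical shared state: the volatile field guarantees visibility, but the
// read-modify-write in increment() is not atomic and can therefore race.
class SharedCounter {
    public static volatile int value;
    public static void increment() { value++; }
}

// AspectJ aspect that intercepts every write to the volatile field and logs the
// writing thread; a race detector could correlate these events across threads.
public aspect VolatileWriteMonitor {
    pointcut volatileWrite(): set(int SharedCounter.value) && !within(VolatileWriteMonitor);

    before(): volatileWrite() {
        System.out.println("write to " + thisJoinPointStaticPart.getSignature()
                + " by thread " + Thread.currentThread().getName());
    }
}
```

Replacing the field with a java.util.concurrent.atomic.AtomicInteger and calling incrementAndGet() removes the race; this is the kind of atomicity support in the Java concurrency packages that the proposed detection conditions take into account.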

Motion Control of a TUAV Having Eight Rotors for Enhanced Situational Awareness

This paper focuses on a critical component of situational awareness (SA): the control of autonomous vertical flight for a tactical unmanned aerial vehicle (TUAV). Following the SA strategy, we propose a two-stage flight control procedure that uses two autonomous control subsystems to address the differences in dynamics and performance requirements between the initial and final stages of the flight trajectory for a nontrivial nonlinear eight-rotor helicopter model. This control strategy for the chosen mini-TUAV model has been verified by simulating hovering maneuvers in the Simulink software package and shows good performance in quickly stabilizing the engines in hover; consequently, fast SA with economical use of battery energy can be achieved during search-and-rescue operations.

Experimental Study of the Metal Foam Flow Conditioner for Orifice Plate Flowmeters

The sensitivity of orifice plate metering to disturbed flow (either asymmetric or swirling) is of great concern to flow meter users and manufacturers. The distortions caused by pipe fittings and pipe installations upstream of the orifice plate are major sources of this type of non-standard flow and can degrade metering accuracy to an unacceptable degree. In this work, a multi-scale object known as metal foam was used to generate a predetermined turbulent flow upstream of the orifice plate. The experimental results showed that the combination of an orifice plate and a metal foam flow conditioner is broadly insensitive to upstream disturbances. The metal foam performed well in removing swirl and producing a repeatable flow profile within a short distance downstream of the device. Results for non-standard flow conditions, including swirling and asymmetric flow, show that this package can preserve metering accuracy at the level required by the standards.

Electrical Performance of a Solid Oxide Fuel Cell Unit with Non-Uniform Inlet Flow and High Fuel Utilization

This study investigates the electrical performance of a planar solid oxide fuel cell unit with a cross-flow configuration at high fuel utilization and with a non-uniform fuel inlet flow. The software package used in this study solves the two-dimensional, simultaneous partial differential equations of mass, energy and electrochemistry, without considering variation along the stack direction. The results show that fuel utilization increases as the molar flow rate decreases, while the average current density drops with the molar flow rate. In addition, the non-uniform inlet Pattern A induces a larger non-reacting region in the corner between the fuel exit and the air inlet. This non-reacting region lowers the average current density and degrades the electrical performance by about 7%.
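For reference (this is the standard definition for a hydrogen-fed cell, not a formula quoted from the paper), fuel utilization can be written as

\[ U_f = \frac{I/(2F)}{\dot n_{\mathrm{H_2,in}}}, \]

where \(I\) is the total cell current, \(F\) is Faraday's constant and \(\dot n_{\mathrm{H_2,in}}\) is the inlet molar flow rate of hydrogen; this makes explicit why, at a given current, utilization rises as the inlet molar flow rate drops.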

The Effect of Geometry Dimensions on the Earthquake Response in the Finite Element Method

In this paper, the effect of the width and height of the model on the earthquake response computed with the finite element method is discussed. For this purpose an earth dam, as a soil structure under earthquake loading, has been considered. Various dam-foundation models are analyzed with Plaxis, a finite element package for solving geotechnical problems. The results indicate considerable differences among the seismic responses.

Improving the Reusability and Interoperability of E-Learning Material

A key requirement for e-learning materials is reusability and interoperability, that is, the possibility of using at least part of the content in different courses and of delivering it through different platforms. These features make it possible to limit the cost of new packages, but they require the material to be developed according to proper specifications. SCORM (Sharable Content Object Reference Model) is a set of guidelines suitable for this purpose. A specific adaptation project has been started to make it possible to reuse existing materials. The paper describes the main characteristics of the SCORM specification and the procedure used to modify the existing material.

Real-Time Physics Simulation Packages: An Evaluation Study

This paper reviews three physics simulation packages that can provide researchers with a virtual ground for modeling, implementing and simulating complex models, as well as testing their control methods at lower development cost and time. The inverted pendulum model was used as a test bed for comparing ODE, DANCE and Webots, while Linear State Feedback was used to control its behavior. The packages were compared with respect to model creation, solving systems of differential equations, data storage, setting system variables, controlling the experiment, and ease of use. The purpose of this paper is to give an overview of our experience with these environments and to demonstrate some of the benefits and drawbacks of each package in practice.
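For completeness (the particular state ordering is our assumption, not necessarily the one used in the study), the Linear State Feedback law applied to a cart-mounted inverted pendulum computes the control input as

\[ u = -Kx, \qquad x = \begin{bmatrix} x_c & \dot{x}_c & \theta & \dot{\theta} \end{bmatrix}^{\mathsf T}, \]

where \(x_c\) is the cart position, \(\theta\) the pendulum angle, and \(K\) a constant gain vector chosen so that the closed-loop system is stable.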

Correlations between Cleaning Frequency of Reservoir and Water Tower and Parameters of Water Quality

This study sampled and analyzed the water quality of water reservoirs and water towers installed in two kinds of residential buildings and in school facilities. Water quality data were collected for correlation analysis with the sanitization frequency of the water reservoirs, obtained by asking building managers about the inspection charts recorded for the reservoir equipment. The statistical software package SPSS was applied to the two groups of data (cleaning frequency and water quality) for regression analysis to determine the optimal sanitization frequency. The correlation coefficient (R) in this paper represents the degree of correlation, with values of R ranging from +1 to -1. After investigating three categories of drinking water users, this study found that the sanitization frequency of the water reservoir significantly influenced the quality of the drinking water: a higher sanitization frequency (more than four times per year) implied a higher drinking water quality. The results indicate that water reservoirs and water towers should be sanitized at least twice annually to achieve safe drinking water.
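The coefficient referred to is presumably the standard Pearson correlation coefficient reported by SPSS,

\[ R = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_i (x_i - \bar{x})^2}\,\sqrt{\sum_i (y_i - \bar{y})^2}}, \]

where, in this setting, \(x_i\) would be the cleaning frequency and \(y_i\) the water quality parameter of sample \(i\).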

Simulated Annealing Algorithm for Data Aggregation Trees in Wireless Sensor Networks and Comparison with Genetic Algorithm

In ad hoc networks the main issue in protocol design is quality of service, whereas in wireless sensor networks the main constraint is the limited energy of the sensors. Protocols that minimize the power consumption of the sensors therefore receive the most attention in wireless sensor networks. One approach to reducing energy consumption is to reduce the number of packets transmitted in the network. Data aggregation, which combines related data and prevents the transmission of redundant packets, can be effective in reducing the number of transmitted packets. Because processing information consumes less power than transmitting it, data aggregation is of great importance and is used in many protocols [5]. One data aggregation technique is the use of a data aggregation tree, but finding an optimal data aggregation tree for collecting data in a network with one sink is an NP-hard problem. In the data aggregation technique, related information packets are combined at intermediate nodes into a single packet, so the number of packets transmitted in the network is reduced, less energy is consumed, and the lifetime of the network improves. Heuristic methods are used to solve this NP-hard problem, and one such optimization method is Simulated Annealing. In this article we propose a new method for building the data collection tree in wireless sensor networks using the Simulated Annealing algorithm and evaluate its efficiency against a Genetic Algorithm.
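A generic simulated annealing skeleton of the kind such a method builds on is sketched below; the tree encoding, neighbor move and energy function are deliberately left abstract, because the abstract does not specify them (the energy could, for example, be the estimated transmission cost of a candidate aggregation tree).

```java
import java.util.Random;

// Generic simulated-annealing loop; Problem.neighbor() and Problem.energy()
// stand in for the paper's tree-mutation move and cost function.
public class SimulatedAnnealing<S> {
    public interface Problem<S> {
        S neighbor(S current, Random rnd); // small random change, e.g. re-parent one node
        double energy(S candidate);        // lower is better, e.g. estimated energy cost
    }

    public S optimize(Problem<S> p, S initial, double t0, double cooling, int steps) {
        Random rnd = new Random();
        S current = initial, best = initial;
        double t = t0;
        for (int i = 0; i < steps; i++) {
            S next = p.neighbor(current, rnd);
            double delta = p.energy(next) - p.energy(current);
            // Accept improvements always; accept worse moves with probability exp(-delta/t).
            if (delta < 0 || rnd.nextDouble() < Math.exp(-delta / t)) {
                current = next;
                if (p.energy(current) < p.energy(best)) best = current;
            }
            t *= cooling;  // geometric cooling schedule
        }
        return best;
    }
}
```

The key line is the Metropolis acceptance test, which accepts worsening moves with probability exp(-Δ/T) and thus lets the search escape local optima while the temperature is gradually lowered.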

Active Packaging Influence on Shelf Life Extension of Sliced Wheat Bread

The research object was wheat bread. Experiments were carried out at the Faculty of Food Technology of the Latvia University of Agriculture. Active packaging in combination with a modified atmosphere (MAP, 60% CO2 and 40% N2) was examined and compared with traditional packaging in ambient air. Polymer Multibarrier 60, PP and OPP bags were used. The influence of an iron-based oxygen absorber in 100 cc sachets (Mitsubishi Gas Chemical Europe Ageless®) on quality during the shelf life of wheat bread was tested. Samples of 40±4 g were packaged in polymer pouches (110 mm x 120 mm), hermetically sealed by a MULTIVAC C300 vacuum chamber machine, and stored at room temperature, +21.0±0.5 °C. The physicochemical properties (weight losses, moisture content, hardness, pH, colour, and changes in the atmosphere content (CO2 and O2) in the headspace of the packs) and the microbial condition were analysed before packaging and on the 7th, 14th, 21st and 28th days of storage.

The Effect of Mixture Velocity and Droplet Diameter on an Oil-Water Separator Using Computational Fluid Dynamics (CFD)

The characteristics of fluid flow and phase separation in an oil-water separator were numerically analysed as part of the work presented herein. Simulations were performed for different velocities and droplet diameters, and the way these parameters can influence the separator geometry was studied. The simulations were carried out using the software package Fluent 6.2, which is designed for numerical simulation of fluid flow and mass transfer. The model consisted of a cylindrical horizontal separator, and a tetrahedral mesh was employed in the computational domain. The two-phase flow was simulated with the two-fluid model, taking turbulence effects into consideration with the k-ε model. The results showed a strong dependency of phase separation on mixture velocity and droplet diameter. An increase in mixture velocity slows phase separation and consequently requires a weir of greater height, while an increase in droplet diameter produces better phase separation. The simulations agree with results reported in the literature and show that CFD can be a useful tool for studying a horizontal oil-water separator.
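For context, the k-ε closure referenced above computes the turbulent (eddy) viscosity from the turbulent kinetic energy \(k\) and its dissipation rate \(\varepsilon\) as

\[ \mu_t = \rho\, C_\mu \frac{k^2}{\varepsilon}, \qquad C_\mu \approx 0.09, \]

which is the standard formulation rather than a relation specific to this study.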

A New Approach for Prioritization of Failure Modes in Design FMEA using ANOVA

The traditional Failure Mode and Effects Analysis (FMEA) uses the Risk Priority Number (RPN) to evaluate the risk level of a component or process. The RPN index is determined by calculating the product of the severity, occurrence and detection indexes. The most critically debated disadvantage of this approach is that various sets of these three indexes may produce an identical value of RPN. This paper seeks to address the drawbacks of traditional FMEA and proposes a new approach to overcome these shortcomings. The Risk Priority Code (RPC) is used to prioritize failure modes when two or more failure modes have the same RPN, and a new method is proposed to prioritize failure modes when there is a disagreement in the ranking scale for severity, occurrence and detection. An Analysis of Variance (ANOVA) is used to compare the means of the RPN values, and the SPSS (Statistical Package for the Social Sciences) statistical analysis package is used to analyze the data. The results presented are based on two case studies and show that the proposed methodology resolves the limitations of the traditional FMEA approach.
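The index in question is \( \mathrm{RPN} = S \times O \times D \), the product of the severity, occurrence and detection rankings, so distinct assessments can collide: for example, \((S, O, D) = (9, 2, 4)\) and \((4, 9, 2)\) both yield \(\mathrm{RPN} = 72\) while describing very different risks (the particular numbers are illustrative, not taken from the case studies).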

Applying Complex Network Theory to Software Structure Analysis

Complex networks have been intensively studied across many fields, especially Internet technology, biological engineering, and nonlinear science. Software is built from many interacting components at various levels of granularity, such as functions, classes, and packages, and thus represents another important class of complex networks that can be studied with complex network theory. Over the last decade, many papers on interdisciplinary research between software engineering and complex networks have been published. This research provides a different dimension to our understanding of software and is also very useful for the design and development of software systems. This paper explores how complex network theory can be used to analyze software structure and briefly reviews the main advances in the corresponding areas.
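As a minimal illustration of the underlying idea (the class below is ours, not taken from any particular study or tool), a software system can be encoded as a directed dependency graph on which network measurements such as degree distributions are then computed:

```java
import java.util.*;

// Represent a software system as a directed dependency graph and compute node
// out-degrees, a first step toward degree-distribution and other
// complex-network measurements on code structure.
public class DependencyGraph {
    private final Map<String, Set<String>> edges = new HashMap<>();

    public void addDependency(String from, String to) {
        edges.computeIfAbsent(from, k -> new HashSet<>()).add(to);
        edges.computeIfAbsent(to, k -> new HashSet<>()); // ensure the target node exists
    }

    public Map<String, Integer> outDegrees() {
        Map<String, Integer> deg = new HashMap<>();
        edges.forEach((node, deps) -> deg.put(node, deps.size()));
        return deg;
    }

    public static void main(String[] args) {
        DependencyGraph g = new DependencyGraph();
        g.addDependency("app.ui", "app.core");   // package-level edges (illustrative)
        g.addDependency("app.core", "app.util");
        g.addDependency("app.ui", "app.util");
        System.out.println(g.outDegrees());      // e.g. {app.ui=2, app.core=1, app.util=0}
    }
}
```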

High Performance in Parallel Data Integration: An Empirical Evaluation of the Ratio Between Processing Time and Number of Physical Nodes

Many studies have shown that parallelization decreases efficiency [1], [2], and there are many reasons for these decrements. This paper investigates those that appear in the context of parallel data integration. Integration processes generally cannot be allocated to packages of identical size (i.e., tasks of identical complexity); the reason is unknown, heterogeneous input data, which result in variable task lengths. Process delay is determined by the slowest processing node and has a detrimental effect on the total processing time. With a real-world example, this study shows that while process delay initially increases as more nodes are introduced, it ultimately decreases again after a certain point. The example makes use of the cloud computing platform Hadoop and is run inside Amazon's EC2 compute cloud. A stochastic model is set up which can explain this effect.
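In the sense used above, and in our own notation rather than the paper's, if the \(n\) nodes need times \(t_1, \dots, t_n\) for their packages, the total processing time and the process delay can be written as

\[ T(n) = \max_{1 \le i \le n} t_i, \qquad D(n) = \max_{1 \le i \le n} t_i - \frac{1}{n}\sum_{i=1}^{n} t_i, \]

where \(D(n)\) is one plausible formalization of the delay caused by waiting for the slowest node; how this quantity behaves as \(n\) grows is what the stochastic model addresses.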

Stability Analysis of Single Inverter Fed Two Induction Motors in Parallel

This paper discusses a novel graphical approach for the stability analysis of a multi-induction-motor drive controlled by a single inverter. Stability issues arise in parallel-connected induction motors under unbalanced load conditions. Two powerful, widely accepted modeling and simulation software packages, MATLAB and LabVIEW, are selected to perform the stability analysis. The stability is investigated for different load conditions and for differences in stator and rotor resistances between the two motors. The approach is simpler and more effective than previously presented techniques for assessing the stability of a parallel-connected induction motor drive under unbalanced load conditions. Approximate transfer functions are used to model the induction motors, load dynamics, speed controllers and inverter. Simulink library tools are used to model the entire drive scheme in MATLAB, and the stability study is carried out in LabVIEW using its control design and simulation toolkits. Simulation results are presented for various running conditions to demonstrate the effectiveness of the transfer function method.

Impact of a Proposed Pier on Tidal Currents: Koa Kood Island, Thailand

The impact of a proposed pier on tidal current alteration was evaluated. The proposed pier location was in Salad Bay on Koa Kood Island, Trat province, Thailand, and was designed to accommodate passenger ships with a draft of less than 2 m. The study began with collecting necessary data, including bathymetric, water elevation and tidal current characteristics. The impact was assessed using a software package (MIKE21). Although the results showed that the pier would affect the existing current pattern, the change was determined to be insignificant, as the design of the piles for the pier provided sufficient spacing to let the current flow as freely as possible. Consequences of the altered current, such as seabed erosion, water stagnation, sediment deposition and navigational risk were assessed. Environmental mitigation measures might be necessary if the impacts were considered unacceptable.

Structural Study of a Boron-Nitride Nanotube with Nuclear Magnetic Resonance (NMR) Parameter Calculation via the Density Functional Theory (DFT) Method

A model of a (4, 4) single-walled boron-nitride nanotube, as a representative of armchair boron-nitride nanotubes, was studied. First the structure was optimized, and then the Nuclear Magnetic Resonance (NMR) parameters were calculated at the 11B and 15N nuclei with the Density Functional Theory (DFT) method. Evaluation of the resulting parameters reveals heterogeneity of the electrostatic environment along the nanotube, especially at the ends, whereas the nuclei within a given layer experience the same electrostatic environment. All calculations were carried out using the Gaussian 98 software package.
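For reference, the NMR shielding parameters typically extracted from such calculations are defined, in one common convention, from the principal components \(\sigma_{11} \le \sigma_{22} \le \sigma_{33}\) of the shielding tensor as

\[ \sigma_{\mathrm{iso}} = \frac{\sigma_{11} + \sigma_{22} + \sigma_{33}}{3}, \qquad \Delta\sigma = \sigma_{33} - \frac{\sigma_{11} + \sigma_{22}}{2}, \]

the isotropic shielding and the shielding anisotropy, respectively; these are standard definitions rather than expressions quoted from the paper.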