Applications of Entropy Measures in the Field of Queueing Theory

In the present communication, we study how entropy measures vary across the different states of queueing processes. For a steady-state queueing process, it is shown that the uncertainty increases as the arrival rate increases, whereas for a non-steady birth-death process the uncertainty follows a different pattern: it first increases to a maximum and then, with the passage of time, decreases to a minimum.
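As a concrete illustration (not from the paper itself), the sketch below computes the Shannon entropy of the steady-state M/M/1 queue-length distribution P(n) = (1 - rho) * rho^n and shows the uncertainty growing with the arrival rate; the closed form follows directly from the geometric distribution.

    # A minimal sketch, assuming an M/M/1 queue: entropy of the
    # steady-state queue-length distribution P(n) = (1 - rho) * rho**n.
    import math

    def mm1_entropy(arrival_rate, service_rate):
        """Entropy (nats) of the M/M/1 queue-length distribution."""
        rho = arrival_rate / service_rate   # utilization, must be < 1
        assert 0 < rho < 1, "steady state requires arrival_rate < service_rate"
        # Closed form of -sum p_n log p_n for p_n = (1 - rho) * rho**n
        return -math.log(1 - rho) - (rho / (1 - rho)) * math.log(rho)

    for lam in (0.2, 0.5, 0.8, 0.95):       # increasing arrival rate, mu = 1
        print(f"lambda = {lam:.2f}  H = {mm1_entropy(lam, 1.0):.3f}")

Running this prints entropies of roughly 0.63, 1.39, 2.50 and 3.97 nats, consistent with the abstract's claim that uncertainty grows with the arrival rate.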

Construction of a Water Electrolyzer for a Single-Slice O2/H2 Polymer Electrolyte Membrane Fuel Cell

In the first part of the research work, an electrolyzer (10.16 cm in diameter and 24.13 cm in height) was constructed to produce hydrogen and oxygen for a single-slice O2/H2 fuel cell using a cation exchange membrane. The electrolyzer performance was tested with 23% NaOH, 30% NaOH, 30% KOH and 35% KOH electrolyte solutions (1.5 L) at a current input of 4 A and 2.84 V from the rectifier. The hydrogen production rates were 0.159, 0.155, 0.169 and 0.163 cm3/sec from the 23% NaOH, 30% NaOH, 30% KOH and 35% KOH solutions, respectively; the corresponding oxygen production rates were 0.212, 0.201, 0.227 and 0.219 cm3/sec. Although higher electrolyte concentrations were tested, the gas production rate did not change significantly, so the inexpensive 23% NaOH solution was chosen as the electrolyte. In the second part of the research work, graphite serpentine flow plates, fiberglass end plates, stainless steel screen electrodes and silicone rubber components were fabricated to assemble the single-slice O2/H2 polymer electrolyte membrane fuel cell (PEMFC).

Performance Improvement of Moving Object Recognition and Tracking Algorithm Using Parallel Processing of SURF and Optical Flow

This paper proposes a method for moving object recognition and tracking based on parallel processing of SURF and optical flow. Object recognition and tracking is one of the most important tasks in computer vision, but its heavy computational load slows processing and prevents real-time operation. The proposed method combines SURF, a standard feature extraction technique, with optical flow for tracking moving objects, reducing this disadvantage and enabling real-time recognition and tracking, and applies parallel processing to improve speed. First, an image from a database and an image acquired through the camera are analyzed with SURF to recognize the same object; a region of interest (ROI) is then set and the movement of the feature points is tracked using optical flow. Second, multi-threading is used to improve processing speed and recognition through parallel processing. Finally, the performance of the algorithm is evaluated and its efficiency verified through experiments.
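A rough sketch of such a SURF-then-optical-flow pipeline is given below using OpenCV's Python bindings (SURF lives in the opencv-contrib package and may require a non-free build); the file name and parameters are placeholders, and the multi-threaded variant would run the SURF matching and the Lucas-Kanade tracking in separate threads.

    # Illustrative sketch, not the paper's code: recognize an object with
    # SURF, then track the matched feature points with optical flow.
    import cv2
    import numpy as np

    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)

    ref = cv2.imread("object_db.png", cv2.IMREAD_GRAYSCALE)  # hypothetical DB image
    kp_ref, des_ref = surf.detectAndCompute(ref, None)

    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    kp, des = surf.detectAndCompute(prev_gray, None)

    # Match DB descriptors against the camera frame to recognize the object.
    matches = cv2.BFMatcher(cv2.NORM_L2).match(des_ref, des)
    pts = np.float32([kp[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Track the matched feature points with pyramidal Lucas-Kanade optical flow.
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        pts = pts[status == 1].reshape(-1, 1, 2)   # keep successfully tracked points
        prev_gray = gray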

Moving Vehicle Detection Using Automatic Background Extraction

Vehicle detection is a critical step in highway monitoring. In this paper we propose a technique that combines background subtraction and edge detection for vehicle detection, exploiting the advantages of both approaches. The method consists of two procedures: first, an automatic background extraction procedure, in which the background is extracted automatically from successive frames; second, a vehicle detection procedure based on edge detection and background subtraction. Experimental results confirm the effectiveness of the algorithm, with a vehicle detection rate higher than 91%.
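The sketch below (illustrative, not the authors' code) shows one plausible reading of the two procedures in OpenCV: a running-average background estimate, followed by background subtraction combined with Canny edge detection; the video name and thresholds are assumptions.

    # A minimal sketch of the two procedures described above.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("highway.avi")       # hypothetical input video
    ok, frame = cap.read()
    background = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Procedure 1: update the background estimate from successive frames.
        cv2.accumulateWeighted(gray, background, 0.02)
        # Procedure 2: subtract the background and keep edges of moving regions.
        diff = cv2.absdiff(gray, cv2.convertScaleAbs(background))
        _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
        edges = cv2.Canny(gray, 100, 200)
        vehicle_edges = cv2.bitwise_and(edges, mask)  # edges inside moving regions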

Generalized Differential Quadrature Nonlinear Consolidation Analysis of Clay Layer with Time-Varied Drainage Conditions

In this article, the phenomenon of nonlinear consolidation in a saturated and homogeneous clay layer is studied. Considering a time-varied drainage model, the excess pore water pressure throughout the layer depth is calculated. The Generalized Differential Quadrature (GDQ) method is used for the modeling and numerical analysis. For the purpose of analysis, the domain of independent variables (i.e., time and clay layer depth) is first discretized using Chebyshev-Gauss-Lobatto grid points, and the nonlinear system of equations obtained from the GDQ method is then solved by means of the Newton-Raphson approach. The results indicate that the GDQ method, in addition to being simple to apply, achieves very high accuracy in the calculation of excess pore water pressure.
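For readers unfamiliar with GDQ, the sketch below shows its two named ingredients, the Chebyshev-Gauss-Lobatto grid and the first-order GDQ weighting matrix built from Lagrange polynomials; the consolidation equations and the Newton-Raphson loop themselves are omitted.

    # A minimal sketch of the GDQ building blocks named in the abstract.
    import numpy as np

    def cgl_points(n):
        """n+1 Chebyshev-Gauss-Lobatto points mapped to [0, 1]."""
        k = np.arange(n + 1)
        return 0.5 * (1.0 - np.cos(k * np.pi / n))

    def gdq_weights(x):
        """First-derivative GDQ weights a[i, j] on an arbitrary grid x."""
        n = len(x)
        # P[i] = prod over m != i of (x_i - x_m), from the Lagrange form.
        diff = x[:, None] - x[None, :]
        np.fill_diagonal(diff, 1.0)
        P = diff.prod(axis=1)
        a = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                if i != j:
                    a[i, j] = P[i] / ((x[i] - x[j]) * P[j])
            a[i, i] = -a[i].sum()        # each row of weights sums to zero
        return a

    x = cgl_points(8)
    A = gdq_weights(x)
    print(np.allclose(A @ x**2, 2 * x))  # derivative of x^2 reproduced exactly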

3D Definition for Human Smiles

The study explored various types of human smiles and extracted the key factors affecting them. These key factors were then converted into a set of control points which could serve the needs of 3D animators creating facial expressions and could further be applied to face simulation for robots in the future. First, hundreds of pictures of human smiles were collected and analyzed to identify the key factors of facial expression. Then, the factors were converted into a set of control points, with sizing parameters calculated proportionally. Finally, two different faces were constructed to validate the parameters by simulating smiles of the same type as the originals.

A Study on the Developing Method of the BIM (Building Information Modeling) Software Based on Cloud Computing Environment

As Architecture, Engineering and Construction (AEC) industry projects have grown larger and more complex, the use of BIM for 3D design and simulation has increased significantly. Accordingly, typical applications of BIM such as clash detection and alternative measures based on three-dimensional planning have expanded to process management, cost and quantity management, structural analysis, regulation checking, and various other domains of virtual design and construction. At present, commercial BIM software operates in a single-user environment, so the initial cost is high and the investment is frequently wasted. Cloud computing, a next-generation internet technology, enables simple internet devices (such as PCs, tablets and smartphones) to use the services and resources of BIM software. In this paper, we suggest a development method for BIM software based on a cloud computing environment in order to expand the utilization of BIM and reduce the cost of BIM software. First, as benchmarking, we surveyed successful cases of BIM and cloud computing. We then analyzed the needs and opportunities for BIM and cloud computing in the AEC industry. Finally, we suggest the main functions of BIM software based on a cloud computing environment and present a simple prototype of cloud computing BIM software for basic BIM model viewing.

White Blood Cell Identification and Counting from Microscopic Blood Images

The counting and analysis of blood cells allow the evaluation and diagnosis of a vast number of diseases. In particular, the analysis of white blood cells (WBCs) is a topic of great interest to hematologists. Nowadays the morphological analysis of blood cells is performed manually by skilled operators. This involves numerous drawbacks, such as slowness of the analysis and a nonstandard accuracy, dependent on the operator's skills. In the literature there are only a few examples of automated systems for analyzing white blood cells, most of which are only partial. This paper presents a complete and fully automatic method for white blood cell identification from microscopic images. The proposed method first identifies the white blood cells, from which the nucleus and cytoplasm are subsequently extracted. The whole work has been developed in the MATLAB environment, in particular with the Image Processing Toolbox.
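As a rough Python counterpart to this approach (the paper itself uses MATLAB's Image Processing Toolbox), the sketch below segments the darkly stained WBC nuclei by Otsu thresholding and counts them by connected-component labeling; the file name and size thresholds are illustrative.

    # Not the authors' MATLAB code: a scikit-image sketch of the same idea.
    from skimage import io, color, filters, morphology, measure

    rgb = io.imread("blood_smear.png")            # hypothetical input image
    gray = color.rgb2gray(rgb)

    # WBC nuclei are darker than red cells and plasma in standard stains.
    mask = gray < filters.threshold_otsu(gray)
    mask = morphology.remove_small_objects(mask, min_size=100)
    mask = morphology.binary_closing(mask, morphology.disk(3))

    labels = measure.label(mask)
    print("white blood cells found:", labels.max())
    for region in measure.regionprops(labels):
        print("area:", region.area, "centroid:", region.centroid)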

The Development and Examination of a Teaching Commitment Scale for Elementary School Health and Physical Education Teachers

The purpose of this study was to develop and examine a Teaching Commitment Scale of Health and Physical Education (TCS-HPE) for Taiwanese elementary school teachers. First, an original 40-item scale was developed based on teaching commitment theory and the related literature; both stratified random sampling and cluster sampling were then used to recruit participants. During the first stage, 300 teachers were sampled and 251 valid scales (83.7%) were returned. The data were analyzed by exploratory factor analysis, which explained 74.30% of the total variance, supporting construct validity. The Cronbach's alpha coefficient for full-scale reliability was 0.94, and the subscale coefficients were between 0.80 and 0.96. In the second stage, 400 teachers were sampled and 318 valid scales (79.5%) were returned. Finally, this study used confirmatory factor analysis to test the validity and reliability of the TCS-HPE. The results showed that the fit indexes reached acceptable criteria (χ²(246) = 557.64, p
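For reference, the Cronbach's alpha values quoted above follow the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), which the small function below computes from a respondents-by-items score matrix (a generic sketch, not the authors' analysis code).

    # Standard Cronbach's alpha from an items matrix; not from the paper.
    import numpy as np

    def cronbach_alpha(items):
        """items: (respondents x items) array of scale scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1)      # per-item variance
        total_variance = items.sum(axis=1).var(ddof=1)  # variance of sum scores
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)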

Robust Detection of R-Wave Using Wavelet Technique

The electrocardiogram (ECG) is considered the backbone of cardiology. The ECG is composed of P, QRS and T waves, and information related to cardiac diseases can be extracted from the intervals and amplitudes of these waves. The first step in extracting ECG features is the accurate detection of the R peaks in the QRS complex. We have developed a robust R-wave detector using wavelets, namely the Daubechies and Symlet (symmetric) wavelets. The method requires no preprocessing and needs only correctly recorded ECG signals. The test signals were taken from the MIT-BIH arrhythmia database, and the Lead-II recordings were analyzed. MATLAB 7.0 was used to develop the algorithm. The ECG signal under test is decomposed to the required level using the selected wavelet, and the detail coefficient d4 is selected based on energy, frequency and cross-correlation analysis of the decomposition structure of the ECG signal. The robustness of the method is apparent from the obtained results.
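A hedged sketch of this d4-based detection is given below using PyWavelets and SciPy rather than MATLAB; the wavelet name, threshold and refractory period are illustrative choices, not the authors' exact parameters.

    # A sketch of R-peak detection from the level-4 wavelet detail band.
    import numpy as np
    import pywt
    from scipy.signal import find_peaks

    def detect_r_peaks(ecg, fs=360, wavelet="db4", level=4):
        """Locate R peaks from the level-4 detail of a Lead-II ECG signal."""
        coeffs = pywt.wavedec(ecg, wavelet, level=level)
        # Keep only the d4 detail band and reconstruct it to signal length.
        kept = [np.zeros_like(c) for c in coeffs]
        kept[1] = coeffs[1]                  # coeffs = [a4, d4, d3, d2, d1]
        d4 = pywt.waverec(kept, wavelet)[: len(ecg)]
        # R waves dominate this band; threshold relative to its maximum.
        envelope = np.abs(d4)
        peaks, _ = find_peaks(envelope, height=0.4 * envelope.max(),
                              distance=int(0.2 * fs))  # 200 ms refractory period
        return peaks

Here fs=360 matches the MIT-BIH sampling rate mentioned above.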

Lean Changeability – Evaluation and Design of Lean and Transformable Factories

In today's turbulent environment, companies face two principal challenges. On the one hand, it is necessary to produce ever more cost-effectively to remain competitive. On the other hand, factories need to be transformable in order to manage unpredictable changes in the corporate environment. Companies address the first challenge with the philosophy of lean production and the second with the philosophy of transformability. To a certain extent these two approaches pull in different directions, which can cause conflicts when designing factories. Therefore, the Institute of Production Systems and Logistics (IFA) of the Leibniz University of Hanover has developed a procedure that allows companies to evaluate and design their factories with respect to the requirements of both philosophies.

Quantitative Evaluation of Frameworks for Web Applications

An empirical study of web applications that use software frameworks is presented here. The analysis is based on two approaches. In the first, developers using such frameworks are required, based on their experience, to assign weights to parameters such as database connection. In the second approach, a performance testing tool, OpenSTA, is used to compute start time and other such measures. From such an analysis, it is concluded that open source software is superior to proprietary software. The motivation behind this research is to examine ways in which a quantitative assessment can be made of software in general and frameworks in particular. Concepts such as metrics and architectural styles are discussed along with previously published research.

Changes in Subjective and Objective Measures of Performance in Ramadan

The Muslim faith requires individuals to fast between the hours of sunrise and sunset during the month of Ramadan. Our recent work has concentrated on some of the changes that take place during the daytime when fasting. A questionnaire was developed to assess subjective estimates of physical, mental and social activities, and fatigue. Four days were studied: in the weeks before and after Ramadan (control days) and during the first and last weeks of Ramadan (experimental days). On each of these four days, this questionnaire was given several times during the daytime and once after the fast had been broken and just before individuals retired at night. During Ramadan, daytime mental, physical and social activities all decreased below control values but then increased to above-control values in the evening. The desires to perform physical and mental activities showed very similar patterns. That is, individuals tried to conserve energy during the daytime in preparation for the evenings when they ate and drank, often with friends. During Ramadan also, individuals were more fatigued in the daytime and napped more often than on control days. This extra fatigue probably reflected decreased sleep, individuals often having risen earlier (before sunrise, to prepare for fasting) and retired later (to enable recovery from the fast). Some physiological measures and objective measures of performance (including the response to a bout of exercise) have also been investigated. Urine osmolality fell during the daytime on control days as subjects drank, but rose in Ramadan to reach values at sunset indicative of dehydration. Exercise performance was also compromised, particularly late in the afternoon when the fast had lasted several hours. Self-chosen exercise work-rates fell and a set amount of exercise felt more arduous. There were also changes in heart rate and lactate accumulation in the blood, indicative of greater cardiovascular and metabolic stress caused by the exercise in subjects who had been fasting. Daytime fasting in Ramadan produces widespread effects which probably reflect the combined effects of sleep loss and restricted intakes of water and food.

An Evaluation of Carbon Dioxide Emissions Trading among Enterprises: The Tokyo Cap and Trade Program

This study proposes three methods to evaluate the Tokyo Cap and Trade Program when emissions trading is performed virtually among enterprises, focusing on carbon dioxide (CO2), the only emitted greenhouse gas that tends to increase. The first method determines the optimum reduction rate for the highest cost benefit, the second examines emissions trading among enterprises through market trading, and the third verifies long-term emissions trading during the term of the plan (2010-2019), checking the validity of emissions trading partly using Geographic Information Systems (GIS). The findings can be summarized in three points. 1. Since the total cost benefit is greatest at a 44% reduction rate, the rate can be set higher than that of the Tokyo Cap and Trade Program to obtain a greater total cost benefit. 2. At a 44% reduction rate, among 320 enterprises, 8 purchasing enterprises and 245 selling enterprises profit from emissions trading, while 67 enterprises perform voluntary reduction without trading. Therefore, to further promote emissions trading, it is necessary to increase the trading volume by increasing the number of purchasing enterprises. 3. Compared to short-term trading, few enterprises benefit in each year of the long-term emissions trading of the Tokyo Cap and Trade Program; at most 81 enterprises can profit from emissions trading in FY 2019. Therefore, by setting the reduction rate higher, it is necessary to increase the number of enterprises that participate in emissions trading and benefit from the restraint of CO2 emissions.

Project Selection Using Fuzzy AHP and TOPSIS Techniques

In this article, we propose a new method for the project selection problem based on the fuzzy AHP and TOPSIS techniques. After reviewing four common methods for comparing investment alternatives (net present value, rate of return, benefit-cost analysis and payback period), we use them as criteria in an AHP tree. In this methodology, the Analytic Hierarchy Process, improved by fuzzy set theory, is first used to calculate the weight of each criterion. Then, the TOPSIS algorithm is applied to assess the projects. The method is illustrated with a numerical example.
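The TOPSIS step can be sketched compactly as below (the fuzzy-AHP weights are taken as given); the example projects, weights and the benefit/cost labeling of the criteria are hypothetical.

    # A generic TOPSIS implementation; not the paper's exact code.
    import numpy as np

    def topsis(decision_matrix, weights, benefit=None):
        """Rank alternatives (rows) against criteria (columns)."""
        X = np.asarray(decision_matrix, dtype=float)
        w = np.asarray(weights, dtype=float)
        benefit = np.ones(X.shape[1], bool) if benefit is None else np.asarray(benefit)
        V = w * X / np.linalg.norm(X, axis=0)        # weighted normalized matrix
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
        d_pos = np.linalg.norm(V - ideal, axis=1)
        d_neg = np.linalg.norm(V - anti, axis=1)
        return d_neg / (d_pos + d_neg)               # closeness; higher is better

    # Hypothetical example: 3 projects x 4 criteria (NPV, rate of return,
    # benefit-cost ratio, payback period in years).
    scores = topsis([[120, 0.15, 1.3, 4],
                     [100, 0.18, 1.1, 3],
                     [140, 0.12, 1.5, 5]],
                    weights=[0.4, 0.3, 0.2, 0.1],
                    benefit=[True, True, True, False])  # payback: lower is better
    print(scores.argsort()[::-1])                    # project ranking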

Cryptography over Elliptic Curves of the Ring Fq[ε], ε^4 = 0

Groups in which the discrete logarithm problem (DLP) is believed to be intractable have proved to be invaluable building blocks for cryptographic applications. They are at the heart of numerous protocols such as key agreement, public-key cryptosystems, digital signatures, identification schemes, publicly verifiable secret sharing, hash functions and bit commitments. The search for new groups with intractable DLP is therefore of great importance. The goal of this article is to study elliptic curves over the ring Fq[ε], with Fq a finite field of order q and with the relation ε^n = 0, n ≥ 3. The motivation for this work came from the observation that several practical discrete logarithm-based cryptosystems, such as ElGamal and the elliptic curve cryptosystems, are built on such groups. First, we describe these curves defined over a ring. Then, we study their algorithmic properties, proposing effective implementations for representing the elements and the group law. In another article we study their cryptographic properties, an attack on the elliptic discrete logarithm problem, and a new cryptosystem over these curves.
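As a minimal illustration of the element representation (not the paper's implementation), the sketch below implements arithmetic in Fq[ε]/(ε^n) for q = p prime: elements are coefficient lists reduced mod p, and products are truncated because ε^n = 0.

    # Arithmetic in Fp[eps]/(eps^n); parameters here are illustrative.
    p, n = 101, 4        # the title's case is n = 4

    def add(a, b):
        return [(x + y) % p for x, y in zip(a, b)]

    def mul(a, b):
        c = [0] * n
        for i, ai in enumerate(a):
            for j, bj in enumerate(b):
                if i + j < n:            # eps^(i+j) vanishes once i + j >= n
                    c[i + j] = (c[i + j] + ai * bj) % p
        return c

    # eps itself is nilpotent: eps^n = 0 in this ring.
    eps = [0, 1] + [0] * (n - 2)
    x = eps
    for _ in range(n - 1):
        x = mul(x, eps)
    print(x)             # [0, 0, 0, 0], i.e. eps**4 == 0

The elliptic curve group law over this ring would then be built on top of such coefficient-vector arithmetic.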

The Nanobiotechnology of Obtaining Collagen Gels from Marine Fish Skin and Their Rheological Properties for Use as New Materials in Dental Medicine

This paper presents the biotechnology used to obtain collagen-based gels from the skin of shark (Squalus acanthias) and brill, marine fish living in the Black Sea. Due to the structure of its micro-fibres, collagen can be considered a nanomaterial; in order to use collagen-based matrices as biomaterials, rheological studies must be performed first to establish whether they are stable. For the triple-helix structure to remain stable within these gels at room or human body temperature, they must be stabilized by reticulation.

Biospeckle Techniques in Quality Evaluation of Indian Fruits

In this study, spatial-temporal speckle correlation techniques have been applied for the first time to the quality evaluation of three Indian fruits, namely apple, pear and tomato. The method is based on the analysis of variations in laser light scattered from biological samples, with the biospeckle activity determined by means of the cross-correlation functions of the intensity fluctuations. The results showed that the cross-correlation coefficients of the biospeckle patterns change with freshness and storage conditions: significant changes in biospeckle activity were observed over the fruits' shelf lives, the activity decreasing with storage time and varying with the respiration rates of the fruits.
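A minimal sketch of such a correlation measure is shown below: the correlation coefficient between the first speckle frame and each later frame decays faster for more active (fresher) tissue; the frame stack here is a synthetic placeholder.

    # Cross-correlation decay of a biospeckle image stack; generic sketch.
    import numpy as np

    def correlation_decay(frames):
        """frames: (T, H, W) stack of speckle images; returns T coefficients."""
        ref = frames[0].ravel().astype(float)
        return np.array([np.corrcoef(ref, f.ravel().astype(float))[0, 1]
                         for f in frames])

    # Hypothetical stack of 100 frames from the laser setup.
    frames = np.random.rand(100, 256, 256)      # placeholder for recorded data
    activity = 1.0 - correlation_decay(frames)  # higher value = more bioactivity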

Prioritizing Service Quality Dimensions: A Neural Network Approach

One of the determinants of a firm's prosperity is customers' perceived service quality and satisfaction. While service quality is wide in scope and consists of various dimensions, these dimensions may differ in their relative importance in shaping customers' overall satisfaction with service quality. Identifying the relative ranking of the service quality dimensions is very important, as it can help managers find out which dimensions have the greatest effect on overall satisfaction. Such insight leads to more effective resource allocation and, ultimately, higher levels of customer satisfaction. This issue, despite its criticality, has not received enough attention so far. Therefore, using a sample of 240 bank customers in Iran, an artificial neural network is developed to address this gap in the literature. As customers' evaluation of service quality is a subjective process, artificial neural networks, as a brain metaphor, appear well suited to model such a complicated process. Proposing a neural network able to predict customers' overall satisfaction with service quality with a promising level of accuracy is the first contribution of this study. In addition, prioritizing the service quality dimensions by their effect on overall satisfaction, using a sensitivity analysis of the neural network, is the second important finding of this paper.
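The two steps can be sketched as below with scikit-learn (not the authors' network): an MLP is fitted to map dimension scores to overall satisfaction, and a simple perturbation-based sensitivity analysis then ranks the dimensions; the data here are synthetic.

    # Sketch: MLP prediction plus perturbation sensitivity analysis.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(1, 5, size=(240, 5))        # 240 customers x 5 dimensions
    y = X @ [0.35, 0.25, 0.2, 0.1, 0.1] + rng.normal(0, 0.1, 240)  # synthetic

    net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
    net.fit(X, y)

    # Sensitivity: change in prediction when one dimension is perturbed.
    base = net.predict(X)
    sensitivity = [np.abs(net.predict(X + np.eye(5)[j] * 0.5) - base).mean()
                   for j in range(5)]
    print(np.argsort(sensitivity)[::-1])        # dimensions ranked by importance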

Hierarchical PSO-Adaboost Based Classifiers for Fast and Robust Face Detection

We propose a fast and robust hierarchical face detection system which finds and localizes face images with a cascade of classifiers. Three modules contribute to the efficiency of our detector. First, heterogeneous feature descriptors are exploited to enrich the feature types and feature numbers for face representation. Second, a PSO-Adaboost algorithm is proposed to efficiently select discriminative features from a large pool of available features and reinforce them into the final ensemble classifier. Compared with standard exhaustive Adaboost feature selection, the new PSO-Adaboost algorithm reduces the training time by up to a factor of 20. Finally, a three-stage hierarchical classifier framework is developed for rapid background removal. In particular, candidate face regions are detected more quickly by using a large window size in the first stage, and nonlinear SVM classifiers are used instead of decision stumps in the last stage to remove the remaining complex non-face patterns that cannot be rejected in the previous two stages. Experimental results show that our detector achieves superior performance on the CMU+MIT frontal face dataset.
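A toy sketch of the PSO-Adaboost idea is given below: within one boosting round, a small particle swarm searches the (feature, threshold) space of decision stumps instead of testing every feature exhaustively; all constants and the synthetic data are illustrative, not the paper's settings.

    # Toy PSO search for a weighted decision stump inside one boosting round.
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 200))             # 500 samples, 200 features
    y = np.where(X[:, 7] > 0.1, 1.0, -1.0)      # hidden discriminative feature
    w = np.ones(500) / 500                      # AdaBoost sample weights

    def stump_error(feat, thresh):
        pred = np.where(X[:, int(feat) % X.shape[1]] > thresh, 1.0, -1.0)
        return min(w[pred != y].sum(), w[pred == y].sum())  # allow flipped polarity

    # Particles encode (feature, threshold); standard velocity/position update.
    pos = np.column_stack([rng.uniform(0, 200, 30), rng.normal(0, 1, 30)])
    vel = np.zeros_like(pos)
    pbest, pbest_err = pos.copy(), np.array([stump_error(*p) for p in pos])
    gbest = pbest[pbest_err.argmin()]

    for _ in range(40):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos += vel
        err = np.array([stump_error(*p) for p in pos])
        improved = err < pbest_err
        pbest[improved], pbest_err[improved] = pos[improved], err[improved]
        gbest = pbest[pbest_err.argmin()]

    print("selected feature:", int(gbest[0]) % 200, "error:", pbest_err.min())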