Thermogravimetry Study on Pyrolysis of Various Lignocellulosic Biomass for Potential Hydrogen Production

This paper studies the decomposition behavior, in a pyrolytic environment, of four lignocellulosic biomasses (oil palm shell, oil palm frond, rice husk and paddy straw) and two commercial biomass components (pure cellulose and lignin), using a thermogravimetric analyzer (TGA). The unit consists of a microbalance and a furnace purged with 100 cc (STP) min-1 of nitrogen (N2) as the inert gas. The heating rate was set at 20°C min-1 and the temperature was ramped from 50 to 900°C. Hydrogen production during pyrolysis was monitored using an Agilent 7890A gas chromatograph. Oil palm shell, oil palm frond, paddy straw and rice husk were found to be sufficiently reactive in a pyrolytic environment of up to 900°C, since pyrolysis of these biomasses starts at temperatures as low as 200°C and the maximum weight loss is reached at about 500°C. Since the cellulose, hemicellulose and lignin fractions of oil palm shell, oil palm frond, paddy straw and rice husk do not differ greatly, the T-50 and R-50 values obtained are very similar. Hydrogen production also started rapidly at this temperature, owing to the decomposition of the biomass inside the TGA. Biomass with a higher lignin content, such as oil palm shell, was found to sustain hydrogen production for longer than materials with high cellulose and hemicellulose contents.
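As an illustration of how the T-50 and R-50 characteristics can be read off a TG curve, the sketch below interpolates the temperature at 50% of the total weight loss and the weight-loss rate at that temperature. The definitions used here (T-50 as the half-conversion temperature, R-50 as the loss rate there) are common conventions and an assumption on our part; the function and array names are hypothetical.

```python
import numpy as np

def t50_r50(temps_c, weights_pct):
    """Estimate T-50 (temperature at 50% of total weight loss) and
    R-50 (weight-loss rate at that temperature) from a TG curve.
    Inputs: temperature [deg C] and residual weight [%] arrays."""
    w0, wf = weights_pct[0], weights_pct[-1]
    w50 = w0 - 0.5 * (w0 - wf)               # weight at half of total loss
    # weights decrease with temperature, so reverse both for np.interp
    t50 = np.interp(w50, weights_pct[::-1], temps_c[::-1])
    dwdT = np.gradient(weights_pct, temps_c)  # %/deg C (negative on loss)
    r50 = -np.interp(t50, temps_c, dwdT)      # loss rate, positive
    return t50, r50
```

A single-step sigmoidal weight-loss curve centred at, say, 350 °C would return T-50 close to that temperature.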

Instability Problem of Turbo-Machines with Radial Distortion

A ring segment was placed upstream and rotated at 83 Hz, 166 Hz, 333 Hz and 666 Hz to examine the effect of periodic distortion. This type of perturbation cannot be allowed in a physical experiment, since mechanical failure of any upstream part would destroy the blade system; such a study is therefore only possible by CFD. Two machines were used: the NS32 pump (ENSAM) and a three-bladed pump (Tamagawa University). Benchmark computations performed without the perturbation parts confirmed that the computed head-flow rate characteristics agree well with experiment. We obtained the growth rate of the pressure fluctuation, which represents the global instability of the turbo-system. The fluctuating torque components were 0.01 Nm (5000 rpm), 0.1 Nm (10000 rpm), 0.04 Nm (20000 rpm) and 0.15 Nm (40000 rpm), respectively. Only at 10000 rpm (166 Hz) was the output torque random, which implies that unsteady flow is created by separation on the blades and will reduce the pressure loss significantly.

Leave of Absence and Job Morale in the ICU

Leave of absence is important in maintaining the quality of human resources. Allowing employees to be temporarily free from routine assignments can revitalize workers' morale and productivity. This is particularly critical for securing satisfactory service quality among healthcare professionals, whose work is typically labor-intensive and complicated. As one of the veterans' hospitals founded and operated by the Veteran Department of Taiwan, the case hospital had seen its nursing staff squeezed to an extreme minimum under tight budget pressure. Taking leave on schedule became extremely difficult, especially in the intensive care units (ICU), which require close monitoring of patients, a situation that easily made the ICU nurses nervous. Even worse, because of fluctuating occupancy, deferred leave in the ICU exceeded 10 days at any given time. These conditions were a serious setback for this nursing team and consequently undermined job performance and service quality. To solve the problem and strengthen morale, a cross-departmental project team was organized. Detailed information on job and position requirements, labor resources, and actual working hours was collected and analyzed in team meetings. Several measures were finalized, including job rotation, job combination, impromptu leave and cross-departmental redeployment. Consequently, deferred leave was sharply reduced by 70%, to three days or fewer. This improvement not only sheltered the ICU nurses, improving their job performance and patient safety, but also encouraged them to participate actively in the project and to learn problem-solving skills with their colleagues.

Necessity of Risk Management of Various Industry-Associated Pollutants (Case Study of Gavkhoni Wetland Ecosystem)

Since the beginning of human history, human activities have caused many changes in the environment. Today, particular attention should be paid to the water quality of wetlands, which are pristine natural environments rich in genetic reserves. If the qualitative conditions of industrial areas (both physicochemical and biological) are not addressed properly, they can disrupt natural ecosystems, especially rivers. With regard to the quality of water resources, identification of pollutant sources plays a pivotal role in engineering projects as well as in the design of water quality control systems. Thus, using methods such as flow duration curves, a discharge-pollution load model and frequency analysis with the HYFA software package, the risk posed by various industrial pollutants to the internationally and ecologically important Gavkhoni wetland is analyzed. In this study, a station located at Varzaneh City is used as the last station on the Zayanderud River before the river water discharges into the wetland. Results showed that element concentrations often exceeded the permitted levels, so the river water can endanger the regional ecosystem. In addition, if the river discharge is managed on a Q25 basis, the element concentrations can be lowered and kept within normal limits.
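For illustration, a flow duration curve can be built by ranking the observed discharges and assigning each an exceedance probability; Q25 is then the flow equalled or exceeded 25% of the time. The sketch below is a minimal, generic implementation: the Weibull plotting position and the function name are our assumptions, and the paper's exact procedure (and the HYFA internals) may differ.

```python
import numpy as np

def flow_duration_quantile(daily_flows, exceedance_pct):
    """Flow equalled or exceeded `exceedance_pct` percent of the time,
    read off the flow duration curve (Weibull plotting positions)."""
    q = np.sort(np.asarray(daily_flows, dtype=float))[::-1]  # descending flows
    n = len(q)
    p = 100.0 * np.arange(1, n + 1) / (n + 1)  # exceedance probability [%]
    return np.interp(exceedance_pct, p, q)     # e.g. 25 -> Q25
```

Managing releases on a Q25 basis then amounts to using `flow_duration_quantile(flows, 25)` as the reference discharge.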

A Simulator for Robot Navigation Algorithms

A robot simulator was developed to measure and investigate the performance of a robot navigation system based on the relative position of the robot with respect to random obstacles in any two-dimensional environment. The simulator focuses on investigating the ability of a fuzzy-neural system to avoid obstacles. A navigation algorithm is proposed that allows a robot to navigate randomly among obstacles and to react when it faces one. The simulator can be used to evaluate the performance of any system that provides the position of the robot with respect to the obstacles in the environment, allowing a robot developer to investigate and analyze navigation performance without implementing the physical robot.

Comparison of Different Neural Network Approaches for the Prediction of Kidney Dysfunction

This paper presents the prediction of kidney dysfunction using different neural network (NN) approaches: Self-Organizing Maps (SOM), Probabilistic Neural Networks (PNN) and Multi-Layer Perceptron Neural Networks (MLPNN) trained with the Back-Propagation Algorithm (BPA). Six hundred and sixty-three sets of analytical laboratory tests were collected from one of the private clinical laboratories in Baghdad. For each subject, serum urea and serum creatinine levels were analyzed using clinical laboratory measurements. The collected urea and creatinine levels were then used as inputs to the three NN models, each trained with a different neural approach: SOM is an unsupervised network, whereas PNN and MLPNN are supervised networks. These networks are used as classifiers to predict whether a kidney is normal or dysfunctional. The prediction accuracy, sensitivity and specificity were determined for each of the proposed networks. We conclude that PNN gives the fastest and most accurate prediction of kidney dysfunction and is a promising tool for routine prediction of kidney dysfunction from clinical laboratory data.
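The accuracy, sensitivity and specificity reported for each network follow the standard confusion-matrix definitions, which can be sketched as follows (the 0/1 encoding of normal/dysfunction labels is our assumption):

```python
def classifier_metrics(y_true, y_pred):
    """Accuracy, sensitivity and specificity for a binary
    normal (0) / dysfunction (1) classifier."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "accuracy":    (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
    }
```

A classifier that misses one of two dysfunctional cases, for instance, has sensitivity 0.5 even if its specificity is perfect.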

A Positioning Matrix to Assess and to Develop CSR Strategies

Is a company's CSR commitment, as stated in its Social Report, actually perceived by its stakeholders? And to what extent? Moreover, are stakeholders satisfied with the company's CSR efforts? Indeed, business returns from Corporate Social Responsibility (CSR) practices, such as company reputation and customer loyalty, depend heavily on how stakeholders perceive the company's social conduct. In this paper, we propose a methodology to assess a company's CSR commitment based on Global Reporting Initiative (GRI) indicators, content analysis and a CSR positioning matrix. We evaluate three aspects of CSR: the commitment the company discloses through its Social Report; the commitment perceived by its stakeholders; and the commitment stakeholders require of the company. The positioning of the company under study in the CSR matrix is based on the comparison among the three commitment aspects (disclosed, perceived, required), and it allows CSR strategies to be assessed and developed.

Numerical Analysis of Concrete Crash Barriers

Reinforced concrete crash barriers used in road traffic must meet a number of criteria. The barriers are laid lengthwise, one behind another, and joined using specially designed steel locks. While developing the BSV reinforced concrete crash barriers (type ŽPSV), experiments and calculations were carried out to optimize the shape of a newly designed lock and the quantity and distribution of reinforcement in a barrier. The tensile load-bearing capacity of two locks joined in parallel was determined experimentally. Based on these experiments, the nonlinear properties of steel were adjusted in the calculations. The results served as a basis for optimizing the lock design using a computational model that takes into account the plastic behaviour of steel and the influence of the surrounding concrete [6]. The response to vehicle impact was analyzed using a specially developed complex computational model comprising both a nonlinear model of the damping wall or crash barrier and a detailed model of the vehicle [7].

A New Precautionary Method for Measuring and Improving Data Quality

Data quality is a complex and unstructured concept of concern to information systems managers. The reason for this attention is the high cost of maintaining and cleaning inefficient data. Beyond these direct costs, data lacking quality leads to wrong statistics, analyses and decisions in organizations. Managers therefore seek to improve the quality of their information systems' data, and one of the basic prerequisites of quality improvement is evaluating its extent. In this paper, we present a precautionary method whose application gives the data of information systems better quality. Our method covers different dimensions of data quality and therefore has the necessary integrity. The presented method has been tested on three dimensions (accuracy, value-added and believability), and the results confirm the improvement achieved and the integrity of the method.

Concept of Automation in Management of Electric Power Systems

An electric power system includes generation, transmission, distribution, and consumer subsystems. The electrical power network in Tanzania keeps growing larger and more complex by the day, so most utilities have long wished for real-time monitoring and remote control of electrical power system elements such as substations, intelligent devices, power lines, capacitor banks, feeder switches, fault analyzers and other physical facilities. In this paper, the concept of automating the management of power systems, from the generation level to end-user levels, was examined using the Power System Simulator for Engineering (PSS/E) version 30.3.2.

Entrepreneurship, Innovation, Incubator and Economic Development: A Case Study

The objective of this paper is twofold: (1) to discuss and analyze successful incubator case studies worldwide, and (2) to identify their similarities and differences. Design/methodology/approach: The nature of this research is mainly qualitative (multi-case study, literature review). The investigation uses ten case studies, with data collected mainly from organizational documents in the countries concerned. Findings: The findings of this research can help incubator managers, policy makers and government bodies achieve successful implementation. Originality/value: This paper contributes to the current literature review on best practices worldwide. Additionally, it presents future perspectives for academics and practitioners.

Significance of the Splitting Method in a Non-linear Grid System for the Solution of the Navier-Stokes Equations

A solution of the unsteady Navier-Stokes equations by the splitting method in a physical, orthogonal, algebraic curvilinear coordinate system, also termed a 'non-linear grid system', is presented. The linear terms in the Navier-Stokes equations are solved by the Crank-Nicolson method, while the non-linear term is solved by the second-order Adams-Bashforth method. This work combines the advantage of the splitting method as a pressure-velocity solver of higher efficiency with the advantage of a non-linear grid system, which produces more accurate results with roughly the same number of grid points as a Cartesian grid. The splitting method as a solution of the Navier-Stokes equations in a non-linear grid system is validated by comparison with the benchmark results of Ghia for lid-driven cavity flow and with further case studies, including the backward-facing step flow problem.
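The time discretization described — Crank-Nicolson for the linear (viscous) terms and second-order Adams-Bashforth for the non-linear term — can be illustrated on a 1D model problem. The sketch below applies it to the viscous Burgers equation on a periodic grid; the 1D setting, the dense matrix solve and the names are simplifications for illustration, not the paper's full curvilinear solver.

```python
import numpy as np

def cn_ab2_step(u, n_prev, nu, dt, dx):
    """One time step for u_t + u u_x = nu u_xx on a periodic grid:
    Adams-Bashforth 2 for the nonlinear advection term,
    Crank-Nicolson for the linear diffusion term."""
    ux = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)   # central d/dx
    n_curr = -u * ux                                    # nonlinear term now
    rhs_adv = dt * (1.5 * n_curr - 0.5 * n_prev)        # AB2 extrapolation
    m = len(u)
    eye = np.eye(m)
    lap = (np.roll(eye, -1, 1) - 2 * eye + np.roll(eye, 1, 1)) / dx**2
    a = eye - 0.5 * dt * nu * lap                       # CN implicit side
    b = (eye + 0.5 * dt * nu * lap) @ u + rhs_adv       # CN explicit side
    return np.linalg.solve(a, b), n_curr
```

Stepping a sine wave forward with this scheme shows the expected slow viscous decay without the severe time-step restriction of a fully explicit diffusion treatment.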

An Advanced Hybrid P2P Botnet 2.0

Recently, malware attacks over the Internet by e-mail, denial of service (DoS) or distributed denial of service (DDoS) have become more serious, and botnets have become a significant part of these attacks. A traditional botnet has three parts: the botmaster, command and control (C&C) servers and bots. The C&C servers receive commands from the botmaster and remotely control the compromised computers, while bots use DNS to find the positions of the C&C servers. In this paper, we propose an advanced hybrid peer-to-peer (P2P) botnet 2.0 (AHP2P botnet 2.0) that uses Web 2.0 technology to hide the botmaster's instructions in social sites, which serve as C&C servers. Servent bots act as sub-C&C servers that retrieve the instructions from the social sites. AHP2P botnet 2.0 can evaluate the performance of servent bots, reduces DNS traffic from bots to C&C servers, and makes bot actions harder to detect than in IRC-based botnets.

New Scheme in Determining nth Order Diagrams for Cross Multiplication Method via Combinatorial Approach

In this paper, a new recursive strategy is proposed for determining the $\frac{(n-1)!}{2}$ diagrams of $n$th order. A generalization of the $n$th-order diagram for the cross multiplication method was proposed by Pavlovic and Bankier, but a specific rule for determining the $\frac{(n-1)!}{2}$ diagrams of $n$th order for a square matrix had yet to be discovered. Thus, using a combinatorial approach, the $\frac{(n-1)!}{2}$ diagrams of $n$th order are presented as $\frac{(n-1)!}{2}$ starter sets, which are generated by exchanging one element at a time. The advantages of this new strategy are that the discarding process is eliminated and that the signs of the starter sets simply alternate.

Effect of Natural Animal Fillers on Polymer Rheology Behaviour

This paper deals with the evaluation of the flow properties of a polymeric matrix filled with natural animal fillers. The Technical University of Liberec participates in the long-term development of "green materials" intended to replace conventionally used materials (especially in the automotive industry). Natural fibres (of animal and plant origin) from all over the world are collected and prepared (drying, cutting, etc.) for extrusion processing. Inside the extruder, these natural additives are blended with a polymeric (synthetic or biodegradable PLA) matrix, and the resulting compound is subsequently cut into pellets by the wet method. These green materials with unique recipes are then studied, and their mechanical, physical and processing properties are determined. The main goal of this research is to develop new ecological materials very similar to unfilled polymers. In this article, the rheological behaviour of the chosen natural animal fibres is presented, considering their shape and surface as observed by SEM microscopy.

Approximating Maximum Weighted Independent Set Using Vertex Support

The Maximum Weighted Independent Set (MWIS) problem is a classic NP-hard graph optimization problem. Given an undirected graph G = (V, E) and a weighting function defined on the vertex set, the MWIS problem is to find a vertex set S ⊆ V of maximum total weight subject to the constraint that no two vertices in S are adjacent. This paper presents a novel approach that approximates the MWIS of a graph using a minimum weighted vertex cover of the graph. Computational experiments were designed and conducted to study the performance of the proposed algorithm. Extensive simulation results show that the proposed algorithm yields better solutions than other existing algorithms in the literature for solving the MWIS problem.
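To illustrate the idea of obtaining an independent set from a weighted vertex cover, the sketch below builds a cover with a generic greedy rule (minimum weight per uncovered edge) and returns its complement, which is always independent. This greedy rule is our stand-in for illustration and is not necessarily the paper's vertex-support criterion.

```python
def approx_mwis(n, edges, weight):
    """Approximate MWIS on vertices 0..n-1: greedily build a weighted
    vertex cover, then return its complement (an independent set)."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    cover = set()
    while any(adj[v] for v in adj):
        # pick the vertex with the smallest weight per edge it covers
        v = min((v for v in adj if adj[v]),
                key=lambda v: weight[v] / len(adj[v]))
        cover.add(v)
        for w in adj[v]:                 # its edges are now covered
            adj[w].discard(v)
        adj[v].clear()
    return set(range(n)) - cover
```

On a unit-weight path 0-1-2, the middle vertex covers both edges, leaving {0, 2} as the (optimal) independent set.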

Application of Extreme Learning Machine Method for Time Series Analysis

In this paper, we study the application of the Extreme Learning Machine (ELM) algorithm for single-hidden-layer feedforward neural networks to non-linear chaotic time series problems. In this algorithm, the input weights and the hidden layer biases are chosen randomly. The ELM formulation then reduces to solving a system of linear equations for the unknown weights connecting the hidden layer to the output layer, whose solution is obtained using the Moore-Penrose generalized pseudoinverse. To study the application of the method, we consider time series generated by the Mackey-Glass delay differential equation with different time delays, together with the Santa Fe A and UCR heart beat rate ECG time series. For the sigmoid, sine and hardlim activation functions, the optimal values of the memory order and the number of hidden neurons that give the best prediction performance, in terms of root mean square error, are determined. The results obtained are in close agreement with the exact solutions of the problems considered, which clearly shows that ELM is a very promising alternative method for time series prediction.
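The ELM training procedure described — random input weights and biases, with hidden-to-output weights obtained from a Moore-Penrose pseudoinverse — can be sketched in a few lines. The tanh activation and the function names here are illustrative choices on our part (the paper uses sigmoid, sine and hardlim activations).

```python
import numpy as np

def elm_train(x, y, n_hidden, seed=0):
    """ELM: random input weights/biases, closed-form output weights."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((x.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random hidden biases
    h = np.tanh(x @ w + b)                           # hidden layer outputs
    beta = np.linalg.pinv(h) @ y                     # Moore-Penrose solve
    return w, b, beta

def elm_predict(x, w, b, beta):
    return np.tanh(x @ w + b) @ beta
```

For time series prediction, each row of `x` would hold the last few observed values (the memory order) and `y` the next value; no iterative weight tuning is needed.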

Predicting the Minimum Free Energy RNA Secondary Structures using Harmony Search Algorithm

The physical methods for RNA secondary structure prediction are time-consuming and expensive, so computational prediction is a suitable alternative. Various algorithms have been used for RNA structure prediction, including dynamic programming and metaheuristic algorithms. Harmony search, a recently developed metaheuristic inspired by musicians' improvisation behavior, has been successful in a wide variety of complex optimization problems. This paper proposes a harmony search algorithm (HSRNAFold) to find RNA secondary structures of minimum free energy that are similar to the native structure. HSRNAFold is compared with the dynamic programming benchmark mfold and with the metaheuristic algorithms RnaPredict, SetPSO and HelixPSO. The results show that HSRNAFold is comparable to mfold and better than the metaheuristics in finding the minimum free energies and the number of correct base pairs.
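A basic harmony search, of the kind HSRNAFold builds on, maintains a harmony memory and improvises new solutions via memory consideration and pitch adjustment. The sketch below minimizes a generic continuous objective; the parameters (HMS, HMCR, PAR, bandwidth) follow the standard algorithm, but applying it to RNA folding would additionally require a free-energy model and a discrete structure encoding, which are beyond this sketch.

```python
import random

def harmony_search(f, dim, lo, hi, hms=10, hmcr=0.9, par=0.3,
                   bw=0.1, iters=2000, seed=1):
    """Minimise f over [lo, hi]^dim with basic harmony search."""
    rnd = random.Random(seed)
    hm = [[rnd.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    costs = [f(h) for h in hm]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rnd.random() < hmcr:              # memory consideration
                x = hm[rnd.randrange(hms)][d]
                if rnd.random() < par:           # pitch adjustment
                    x += rnd.uniform(-bw, bw)
            else:                                # random improvisation
                x = rnd.uniform(lo, hi)
            new.append(min(hi, max(lo, x)))
        c = f(new)
        worst = max(range(hms), key=lambda i: costs[i])
        if c < costs[worst]:                     # replace worst harmony
            hm[worst], costs[worst] = new, c
    best = min(range(hms), key=lambda i: costs[i])
    return hm[best], costs[best]
```

In an RNA setting, `f` would be the free energy of a candidate structure rather than a continuous test function.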

An Improved QRS Complex Detection for Online Medical Diagnosis

This paper presents work on signal discrimination, specifically for the electrocardiogram (ECG) waveform. An ECG signal comprises P, QRS and T waves in each normal heart beat, describing a pattern of heart rhythm specific to an individual; further medical diagnosis can then be carried out to detect heart-related disease from the ECG information. The emphasis on QRS complex classification is further discussed to illustrate its importance. The widely known Pan-Tompkins algorithm has been adapted to realize the QRS complex classification process. It involves eight steps: sampling, normalization, low-pass filtering, high-pass filtering (together forming a band-pass filter), differentiation, squaring, averaging and, finally, QRS detection. The simulation results are presented in a graphical user interface (GUI) developed in MATLAB.
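The eight-step chain above can be sketched compactly. The version below is a simplified illustration: the band-pass stage is replaced by a crude difference of moving averages, and the fixed threshold and refractory period are our assumptions rather than Pan and Tompkins' adaptive thresholds.

```python
import numpy as np

def moving_avg(x, k):
    return np.convolve(x, np.ones(k) / k, mode="same")

def qrs_detect(ecg, fs):
    """Simplified Pan-Tompkins chain: normalise, crude band-pass
    (short minus long moving average), differentiate, square,
    moving-window integrate, then threshold with a refractory period."""
    x = (ecg - np.mean(ecg)) / (np.max(np.abs(ecg)) + 1e-12)  # normalise
    bp = moving_avg(x, int(0.04 * fs)) - moving_avg(x, int(0.4 * fs))
    sq = np.gradient(bp) ** 2                       # derivative, then squaring
    mwi = moving_avg(sq, int(0.15 * fs))            # ~150 ms integration window
    thr = 0.5 * np.max(mwi)                         # simple fixed threshold
    peaks, last = [], -fs
    for i in range(1, len(mwi) - 1):
        if (mwi[i] > thr and mwi[i] >= mwi[i - 1]
                and mwi[i] >= mwi[i + 1] and i - last > 0.25 * fs):
            peaks.append(i)                         # enforce ~250 ms refractory
            last = i
    return peaks
```

On a clean synthetic trace with one sharp beat per second, this chain recovers one detection per beat.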

An Optimized Multi-block Method for Turbulent Flows

In many turbulent flows, a major part of the flow field exhibits no complicated turbulent behavior. In this work, in order to reduce the required memory and CPU time, the flow field was decomposed into several blocks, each with its own turbulence treatment. A two-dimensional backward-facing step was considered. Four combinations of the Prandtl mixing length and standard k-ε models were implemented. Computer memory and CPU time consumption, as well as numerical convergence and the accuracy of the results, were investigated. The observations showed that a suitable combination of turbulence models in the different blocks led to results of the same accuracy as using the high-order turbulence model in all blocks, while reducing memory and CPU time consumption.