Evaluating the Effect of Domestic Price on Rice Production in an African Setting: Evidence from Sierra Leone

Rice, the staple food in Sierra Leone, is consumed on a daily basis. It is the most important food crop, grown extensively by farmers across all ecologies in the country. Although much attention is now given to rice grain production through the Smallholder Commercialization Programme (SHCP), little attention has been paid to the limitations faced by rice producers. This paper contributes to efforts to overcome the development challenges caused by food insecurity. Its objective is to analyze the relationship between rice production and the domestic retail price of rice. The study employed a log-linear model in which the quantity of rice produced is the dependent variable, with the quantity of rice imported, the price of imported rice, and the price of domestic rice as explanatory variables. Findings showed that locally produced rice is more expensive per ton than imported rice, and that almost all inhabitants of the capital city, which hosts about 65% of the country's population, favor imported rice because it is free from stones and other impurities. To control prices and simultaneously increase rice production, the government should purchase rice from the farmers and then sell it to private retailers.
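A minimal sketch of estimating such a log-linear specification by ordinary least squares on synthetic data (the variable names, ranges, and coefficient values are illustrative assumptions, not the paper's data or estimates):

```python
import numpy as np

# Hedged sketch of fitting a log-linear production model of the form
#   ln(Q_prod) = b0 + b1*ln(Q_imp) + b2*ln(P_imp) + b3*ln(P_dom) + e
# on synthetic data; all numbers below are illustrative only.
rng = np.random.default_rng(0)
n = 200
q_imp = rng.uniform(50, 150, n)   # quantity of rice imported
p_imp = rng.uniform(200, 400, n)  # price of imported rice
p_dom = rng.uniform(250, 450, n)  # price of domestic rice
true_b = np.array([2.0, 0.3, -0.5, 0.8])
X = np.column_stack([np.ones(n), np.log(q_imp), np.log(p_imp), np.log(p_dom)])
ln_q = X @ true_b + rng.normal(0, 0.01, n)  # simulated ln(Q_prod)

# OLS fit; in a log-linear model each slope is an elasticity
b_hat, *_ = np.linalg.lstsq(X, ln_q, rcond=None)
print(b_hat)
```

Because both sides are in logs, each estimated slope reads directly as an elasticity, which is presumably why the log-linear form was chosen.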

Determination and Preconcentration of Iron(II) in Aqueous Solution with Amberlite XAD-4 Functionalized with 1-Amino-2-naphthol by Flame Atomic Absorption Spectrometry

A new chelating resin was prepared by coupling Amberlite XAD-4 with 1-amino-2-naphthol through an azo spacer. The resulting sorbent was characterized by FT-IR, elemental analysis, and thermogravimetric analysis (TGA), and studied for the preconcentration of Fe(II), with flame atomic absorption spectrometry (FAAS) used for metal monitoring. The optimum pH for sorption of the iron ions was 6.5. The resin was evaluated through batch binding of the metal ion. Quantitative desorption occurs instantaneously with 0.5 M HNO3. The sorption capacity was found to be 4.1 mmol g-1 of resin for Fe(II) in aqueous solution. The chelating resin can be reused for 10 sorption-desorption cycles without any significant change in sorption capacity. A recovery of 97% was obtained for the metal ions with 0.5 M HNO3 as the eluting agent. The method was applied to the determination of metal ions in an industrial wastewater sample.

A Proposal of an Automatic Formatting Method for Transforming XML Data

PPX (Pretty Printer for XML) is a query language that offers a concise way to describe the formatting of XML data into HTML. In this paper, we propose a simple formatting specification that combines automatic layout operators and variables in the layout expression of the GENERATE clause of PPX. This method can automatically format irregular XML data embedded in an XML document, using layout decision rules derived from the DTD. In our experiment, a quick comparison shows that PPX requires far less description than XSLT or XQuery programs performing the same tasks.

FT-IR Study of Stabilized PAN Fibers for Fabrication of Carbon Fibers

In this investigation, commercial and special polyacrylonitrile (PAN) fibers containing sodium 2-methyl-2-acrylamidopropane sulfonate (SAMPS) and itaconic acid (IA) comonomers were studied by Fourier transform infrared (FT-IR) spectroscopy. The FT-IR spectra of PAN fiber samples with different comonomers show that during stabilization, the peaks related to C≡N and CH2 bonds are sharply reduced. These reductions reflect the cyclization of nitrile groups during the stabilization procedure. The reduction is much more intense in PAN fibers containing the IA comonomer than in those containing the SAMPS comonomer, indicating that cyclization and stabilization proceed more completely in the IA-containing sample. The carbon fibers produced from this material therefore have higher tensile strength, owing to more suitable stabilization.

Impact of Environmental Factors on Profit Efficiency of Rice Production: A Study in Vietnam's Red River Delta

Environmental factors affect agricultural productivity and efficiency, and thereby profit efficiency. This paper estimates the impact of environmental factors on the profitability of rice farmers in the Red River Delta of Vietnam. The dataset was collected from 349 rice farmers through personal interviews. Both OLS and MLE trans-log profit functions were used, with five production inputs and four environmental factors included. Profitability was measured by estimating a stochastic profit frontier with a two-stage approach. The results showed that profit efficiency was about 75% on average, and that environmental factors change profit efficiency significantly, alongside farm-specific characteristics. Plant disease, soil fertility, irrigation application, and water pollution were the four environmental factors causing profit loss in rice production. The results indicate that farmers should reduce household size and the number of farm plots, apply the row-seeding technique, and improve environmental factors to obtain high profit efficiency, with special consideration given to improving irrigation water quality.
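For context, a trans-log stochastic profit frontier of the general form used in such studies can be written as below; the indices match the five inputs and four environmental factors mentioned above, but the exact specification and normalization in the paper may differ:

```latex
\ln \pi = \alpha_0 + \sum_{i=1}^{5} \alpha_i \ln p_i
        + \frac{1}{2} \sum_{i=1}^{5} \sum_{j=1}^{5} \gamma_{ij} \ln p_i \ln p_j
        + \sum_{k=1}^{4} \delta_k z_k + v - u
```

where π is (normalized) profit, the p_i are input prices, the z_k are environmental factors, v is symmetric random noise, and u ≥ 0 is the profit-inefficiency term; profit efficiency is then estimated as E[exp(−u)], consistent with the roughly 75% average reported above.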

A New Approach for Mobile Agent Security

A mobile agent is software that performs actions autonomously and independently on behalf of a person or organization. Mobile agents are used for information search and retrieval, filtering, intruder recognition in networks, and so on. One of the important issues with mobile agents is their security: several security concerns must be addressed for their effective and secure use. One of these is the protection of mobile agent integrity. In this paper, after reviewing the existing methods, the advantages and disadvantages of each are examined. Since each method has its own strengths and weaknesses, combining them promises a better method for protecting the integrity of mobile agents. Such a combined method is therefore proposed and evaluated against the existing methods. Finally, the method is simulated, and the results indicate an improved ability to protect the integrity of mobile agents.

A New Approach for Image Segmentation using Pillar-Kmeans Algorithm

This paper presents a new approach to image segmentation based on the Pillar K-means algorithm. The segmentation process includes a new mechanism for clustering the elements of high-resolution images that improves precision and reduces computation time. The system applies K-means clustering to image segmentation after optimization by the Pillar algorithm. The Pillar algorithm treats centroid placement among the data distribution like the placement of a roof's pillars, which should be located as far from each other as possible to withstand the pressure distribution of the roof. It designates the initial centroid positions by calculating the accumulated distance metric between each data point and all previously chosen centroids, then selecting the data point with the maximum accumulated distance as the next initial centroid. In this way, all initial centroids are distributed according to the maximum accumulated distance metric, optimizing K-means clustering for image segmentation in terms of precision and computation time. We evaluate the proposed approach by comparing it with the K-means and Gaussian mixture model algorithms across the RGB, HSV, HSL, and CIELAB color spaces. The experimental results confirm that our approach improves segmentation quality in terms of precision and computation time.
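The seeding step described above can be sketched as follows; this is a simplified reading of the Pillar seeding idea (the published algorithm also includes refinements such as outlier checks), not the authors' exact implementation:

```python
import numpy as np

def pillar_initial_centroids(X, k):
    """Illustrative Pillar-style seeding: each new centroid is the data
    point with the maximum accumulated distance to all previously chosen
    centroids (simplified sketch, not the full published algorithm)."""
    X = np.asarray(X, dtype=float)
    # First centroid: the point farthest from the grand mean
    d = np.linalg.norm(X - X.mean(axis=0), axis=1)
    centroids = [X[np.argmax(d)]]
    acc = np.zeros(len(X))  # accumulated distance to chosen centroids
    for _ in range(1, k):
        acc += np.linalg.norm(X - centroids[-1], axis=1)
        centroids.append(X[np.argmax(acc)])
    return np.array(centroids)

# Two well-separated blobs: the two seeds land in different blobs,
# so K-means starts with one centroid per cluster.
pts = np.array([[0.0, 0.0], [0.1, 0.0], [10.0, 10.0], [10.1, 10.0]])
seeds = pillar_initial_centroids(pts, 2)
print(seeds)
```

The accumulated-distance criterion is what spreads the seeds apart, which is the property the paper relies on to speed up and stabilize the subsequent K-means iterations.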

Taiwan Public Corporation's Participation in the Mechanism of Payment for Environmental Services

The Taiwan government has promoted the "Plain Landscape Afforestation and Greening Program" since 2002. A key task of the program was payment for environmental services (PES) under the "Plain Landscape Afforestation Policy" (PLAP), which was certificated by the Executive Yuan on August 31, 2001 and enacted on January 1, 2002. According to the policy, the total area of afforestation was projected to reach 25,100 hectares by December 31, 2007. By the end of 2007, after six years of implementation, the actual afforested area was 8,919.18 hectares. Of this, Taiwan Sugar Corporation (TSC) accounted for 7,960 hectares (including 2,450.83 hectares of public service area), or 86.22% of the total afforestation area, while private farmland promoted by local governments accounted for 869.18 hectares, or 9.75%. Most of the afforestation under this policy is thus executed by TSC, and TSC's achievement ratio is better than that of the other participants, implying that the success of the PLAP depends heavily on TSC's execution. The objective of this study is to analyze the policy planning behind TSC's participation in the PLAP, suggest complementary measures, and draw up effective adjustment mechanisms so as to improve the effectiveness of the policy. Our main conclusions and suggestions are as follows:
1. TSC's participation in the PLAP stems mainly from passive cooperation with the central government or from company policy. Prior to participating, TSC's lands were mainly used for growing sugarcane.
2. TSC's selection of tree species is based mainly on the suitability of land and species. The largest proportion of tree species is allocated to economic forests, and a lack of technical instruction was the main problem during afforestation. How TSC can develop its leisure agriculture and landscape business in the future is a further key topic.
3. TSC has developed short- and long-term plans for future participation in the PLAP; however, there is little willingness or incentive to budget for such detailed planning.
4. Most TSC interviewees consider the PLAP requirements unreasonable; among these, the requirement on the number of trees was cited most often. Furthermore, most interviewees suggested that the government should continue to provide incentives even after 20 years.
5. Since the government shares the same goals as TSC, there should be sufficient cooperation and communication to support technical instruction and reduce afforestation costs, which would also improve the effectiveness of the policy.

Laser Surface Hardening Considering Coupled Thermoelasticity Using an Eulerian Formulation

Thermoelastic temperature, displacement, and stress during laser surface hardening are solved in an Eulerian formulation, in which the heat flux is fixed in space and the workpiece moves through a control volume. For uniform velocity and a uniform heat flux distribution, the Eulerian formulation leads to a steady-state problem, whereas the Lagrangian formulation remains transient; this reduction to a steady-state problem increases computational efficiency. An analytical solution is also developed for the uncoupled transient heat conduction equation for a plane slab heated by a laser beam. The thermal results of the numerical model are compared with this analytical model, and the comparison shows that the numerical solution of the uncoupled equations agrees well with the analytical solution.
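For reference, in the semi-infinite limit the uncoupled conduction problem with a constant absorbed surface flux has a classical closed-form temperature rise; the paper's slab solution generalizes this with a series, and the symbols below follow standard convention rather than the paper's notation:

```latex
T(x,t) - T_0 = \frac{2 q_0}{k} \sqrt{\alpha t}\,
\operatorname{ierfc}\!\left(\frac{x}{2\sqrt{\alpha t}}\right),
\qquad
\operatorname{ierfc}(\xi) = \frac{1}{\sqrt{\pi}} e^{-\xi^2}
 - \xi \operatorname{erfc}(\xi)
```

where q_0 is the absorbed heat flux, k the thermal conductivity, and α the thermal diffusivity; such a closed form is the natural benchmark for the numerical thermal solution mentioned above.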

Impacts of Project-Overload on Innovation inside Organizations: Agent-Based Modeling

Market competition and the desire to gain advantage in a globalized market drive companies toward innovation efforts. Project overload is an unpleasant phenomenon experienced by employees in organizations that try to make the most efficient use of their resources in order to be innovative. But what are the impacts of project overload on an organization's innovation capabilities? Advanced engineering (AE) teams inside a major heavy equipment manufacturer are suffering from project overload in their quest for innovation. In this paper, agent-based modeling (ABM) is used to examine the current reality of the company context and of the AE team, identifying opportunities and challenges for reducing the risk of project overload and moving toward innovation. Project overload is likely to stifle innovation and creativity inside teams; on the other hand, motivation through properly challenging goals can help individuals alleviate the negative effects of a low level of project overload.

Metadata Update Mechanism Improvements in Data Grid

Grid environments aggregate geographically distributed resources. Grids fall into three types: computational, data, and storage grids. This paper presents research on data grids, which are used to cover and secure access to data from many heterogeneous sources. Users need not be concerned with where data is located, provided they can access it; metadata is used to gain access to data in the data grid. Currently, an application metadata catalogue and the SRB middleware package are used in data grids for metadata management. In this paper, simultaneous and rapid updating, streamlining, and searching are made possible by classifying the table that preserves the metadata and splitting each table into numerous tables. The most appropriate division is then determined for the specific application. As a result of this technique, some requests can be executed concurrently and in a pipelined fashion.

Multi-Font Farsi/Arabic Isolated Character Recognition Using Chain Codes

Nowadays, OCR systems have many applications and are increasingly employed in daily life. Much research has been done on the recognition of Latin, Japanese, and Chinese characters, but very little on Farsi/Arabic character recognition, probably because of the difficulty and complexity of recognizing these characters compared to the others, and because of the limited IT activity in Farsi- and Arabic-speaking countries. In this paper, a technique for recognizing isolated Farsi/Arabic characters is presented. It combines a chain-code-based algorithm with other significant features, such as the number and location of dots and auxiliary parts and the number of holes in the isolated character. Experimental results show the relatively high accuracy of the developed method when tested on several standard Farsi fonts.
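The chain-code idea underlying the algorithm can be sketched as below: a contour is encoded as the sequence of Freeman directions between successive boundary pixels. This is a minimal illustration of the representation, not the paper's full recognizer (which also uses dots, auxiliary parts, and holes):

```python
# Freeman 8-direction chain coding for an ordered contour of pixel
# coordinates. Directions (x right, y up):
#   0=E, 1=NE, 2=N, 3=NW, 4=W, 5=SW, 6=S, 7=SE
DIRS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
        (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def chain_code(contour):
    """Encode an ordered list of adjacent (x, y) points as Freeman codes."""
    codes = []
    for (x0, y0), (x1, y1) in zip(contour, contour[1:]):
        codes.append(DIRS[(x1 - x0, y1 - y0)])
    return codes

# A unit square traversed counter-clockwise
square = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
print(chain_code(square))  # → [0, 2, 4, 6]
```

The resulting code sequence is translation-invariant, which is one reason chain codes are a popular shape feature for isolated-character recognition.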

An Experimental Comparison of Unsupervised Learning Techniques for Face Recognition

Face recognition has always been a fascinating research area. It has drawn the attention of many researchers because of its potential applications, such as security systems, entertainment, and criminal identification. Many supervised and unsupervised learning techniques have been reported so far. Principal Component Analysis (PCA), Self-Organizing Maps (SOM), and Independent Component Analysis (ICA) are three of the unsupervised techniques proposed by different researchers for face recognition. This paper proposes integrating two of them, SOM and PCA, for dimensionality reduction and feature selection. Simulation results show that although the individual techniques give excellent performance on their own, their combination can also be utilized for face recognition. Experimental results also indicate that, for the given face database and classifier, SOM performs better than the other unsupervised learning techniques. A comparison of two proposed SOM methodologies, local and global processing, shows the superiority of the latter, but at the cost of more computation time.
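The PCA dimensionality-reduction step feeding the SOM can be sketched as below; the data here are synthetic random "images", not the paper's face database, and the dimensions are illustrative:

```python
import numpy as np

# Minimal PCA-by-SVD sketch of the dimensionality-reduction step:
# project mean-centered images onto the top-k principal axes
# ("eigenfaces"), producing compact feature vectors for the SOM.
rng = np.random.default_rng(1)
faces = rng.normal(size=(40, 64))    # 40 synthetic "images", 64 pixels each
mean_face = faces.mean(axis=0)
centered = faces - mean_face

# Rows of Vt are the principal axes, ordered by singular value
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
k = 10
projected = centered @ Vt[:k].T      # 40 x 10 feature vectors (SOM input)

# Reconstruction from k components approximates the originals
recon = projected @ Vt[:k] + mean_face
```

Reducing each image to k coefficients before clustering is what keeps the SOM stage tractable while preserving most of the variance.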

Optimization of Process Parameters for a Diester Biolubricant Using D-optimal Design

An optimization study of the diester biolubricant oleyl 9(12)-hydroxy-10(13)-oleioxy-12(9)-octadecanoate (OLHYOOT), synthesized in the presence of sulfuric acid (SA) as catalyst, was carried out. The optimum experimental conditions for a high yield of OLHYOOT were predicted to be an OL/HYOOA ratio of 1:1 g/g, an SA/HYOOA ratio of 0.20:1 g/g, a reaction temperature of 110 °C, and a reaction time of 4.5 h. Under these conditions, the yield of OLHYOOT was 88.7%. FTIR showed the disappearance of the carboxylic acid C=O peak and the appearance of an ester C=O peak at 1738 cm-1. 1H NMR analysis confirmed the formation of OLHYOOT through the appearance of the ester (-CHOCOR) signal at 4.05 ppm, and 13C NMR confirmed it through the ester C=O peak at 173.93 ppm.

Physical and Chemical Investigation of Polycaprolactone, Nanohydroxyapatite and Poly (Vinyl Alcohol) Nanocomposite Scaffolds

Aligned and random nanofibrous scaffolds of PVA/PCL/nHA were fabricated by electrospinning. The composite nanofibrous scaffolds were subjected to detailed analysis. Morphological investigations revealed that the prepared nanofibers have uniform morphology, with average fiber diameters of 135.5 and 290 nm for the aligned and random scaffolds, respectively. The scaffolds have a porous structure, with porosities of 88% and 76% for the random and aligned nanofibers, respectively. Furthermore, FTIR analysis demonstrated strong intermolecular interactions between the PVA, PCL, and nHA molecules. Mechanical characterization showed that aligning the nanofibers significantly improves the rigidity of the resulting biocomposite nanofibrous scaffolds.

Modeling of Catalyst Deactivation in Catalytic Wet Air Oxidation of Phenol in Fixed Bed Three-Phase Reactor

Modeling and simulation of fixed-bed three-phase catalytic reactors are considered for the catalytic wet air oxidation of phenol, in order to perform a comparative numerical analysis of trickle-bed and packed-bubble column reactors. The model comprises material balances both for the catalyst particle and for the different fluid phases. Catalyst deactivation is also included in a transient reactor model to investigate the effects of various parameters, including reactor temperature, on deactivation. The simulation results indicated that packed-bubble columns perform slightly better than trickle beds, and that reaction temperature is the most influential parameter in catalyst deactivation.
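The abstract does not state the deactivation law used; a commonly assumed separable form, shown here only to illustrate why temperature dominates, is:

```latex
\frac{da}{dt} = -k_d\, a^{n},
\qquad
k_d = k_{d0} \exp\!\left(-\frac{E_d}{R T}\right),
\qquad
r_{\mathrm{obs}} = a \, r_{\mathrm{intr}}(C, T)
```

where a is the catalyst activity and n the deactivation order. The Arrhenius dependence of k_d makes the deactivation rate exponentially sensitive to reactor temperature, consistent with the finding that temperature is the most effective parameter.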

Compiler-Based Architecture for Context Aware Frameworks

Computers are being integrated into various aspects of everyday human life, in different shapes and with different abilities. This has intensified the need for software development technologies that are (1) portable, (2) adaptable, and (3) simple to develop. This problem, also known as the Pervasive Computing Problem (PCP), can be addressed in different ways, each with its own pros and cons; Context-Oriented Programming (COP) is one of them. In this paper, a design for a COP framework, a context-aware framework, is presented that eliminates the weak points of a previous design based on interpreted languages while introducing the power of compiled languages to the implementation of such frameworks. The key point of the improvement is combining COP with Dependency Injection (DI) techniques. Both the old and new frameworks are analyzed to show their advantages and disadvantages. Finally, a simulation of both designs is presented, indicating that the practical results agree with the theoretical analysis, with the new design running almost 8 times faster.
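The COP-plus-DI combination can be illustrated with a toy sketch: a registry (the "injector") binds an abstract service to a context-specific implementation chosen at runtime. The class and context names below are hypothetical illustrations, not the paper's framework API:

```python
# Toy sketch of Context-Oriented Programming via Dependency Injection:
# the injector resolves (service, context) pairs to implementations,
# so the same call site adapts to the active context.
class Injector:
    def __init__(self):
        self._bindings = {}  # (service, context) -> implementation class

    def bind(self, service, context, impl):
        self._bindings[(service, context)] = impl

    def resolve(self, service, context):
        return self._bindings[(service, context)]()  # instantiate on demand

class Renderer:  # abstract service
    def render(self, text):
        raise NotImplementedError

class ScreenRenderer(Renderer):
    def render(self, text):
        return f"<screen>{text}</screen>"

class PrinterRenderer(Renderer):
    def render(self, text):
        return f"<print>{text}</print>"

injector = Injector()
injector.bind(Renderer, "screen", ScreenRenderer)
injector.bind(Renderer, "printer", PrinterRenderer)

# Context switch requires no change to the call site
out = injector.resolve(Renderer, "screen").render("hello")
print(out)  # → <screen>hello</screen>
```

In a compiled-language setting the same wiring can be resolved ahead of time, which is plausibly where the reported speedup over interpreter-based designs comes from.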

Numerical Solution of Infinite Boundary Integral Equation by Using Galerkin Method with Laguerre Polynomials

In this paper, the exact solution of an infinite boundary integral equation (IBIE) of the second kind with a degenerate kernel is presented. Moreover, the Galerkin method with Laguerre polynomials is applied to obtain an approximate solution of the IBIE. Numerical examples are given to show the validity of the presented method.
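A numerical sketch in the same spirit is shown below. It solves a hand-picked second-kind equation on [0, ∞) with a degenerate kernel whose exact solution is known, but evaluates the infinite-range integrals by Gauss-Laguerre quadrature rather than the paper's full Laguerre-Galerkin scheme, so it illustrates only the degenerate-kernel reduction:

```python
import numpy as np
from numpy.polynomial.laguerre import laggauss

# Solve  u(x) = e^{-x} + ∫_0^∞ e^{-x} e^{-t} u(t) dt.
# The degenerate kernel a(x)b(t) = e^{-x}e^{-t} gives, with u = f + a*c
# and c = ∫ b u dt:
#   c = (∫ b f dt) / (1 - ∫ b a dt),
# and the exact solution here is u(x) = 2 e^{-x}.
nodes, weights = laggauss(20)  # ∫_0^∞ e^{-t} g(t) dt ≈ Σ w_i g(t_i)
f = lambda t: np.exp(-t)
a = lambda t: np.exp(-t)
b = lambda t: np.exp(-t)

def glq(integrand):
    """∫_0^∞ integrand(t) dt via Gauss-Laguerre (weight e^{-t} factored out)."""
    return np.sum(weights * np.exp(nodes) * integrand(nodes))

bf = glq(lambda t: b(t) * f(t))  # ∫ b f dt = 1/2
ba = glq(lambda t: b(t) * a(t))  # ∫ b a dt = 1/2
c = bf / (1.0 - ba)              # = 1
x = np.linspace(0.0, 5.0, 6)
u = f(x) + a(x) * c              # ≈ 2 e^{-x}
```

A Galerkin variant would instead expand u in a Laguerre basis and project the residual onto the same basis; the quadrature shortcut above exploits the degenerate kernel directly.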

Evaluation Framework for Agent-Oriented Methodologies

Many agent-oriented software engineering methodologies have been proposed; however, their application is still limited by their lack of maturity. Evaluating the strengths and weaknesses of these methodologies plays an important role in improving them and in developing new, stronger methodologies. This paper presents an evaluation framework for agent-oriented methodologies that addresses six major areas: concepts, notation, process, pragmatics, support for software engineering, and marketability. The framework is then used to evaluate the Gaia methodology, identifying its strengths and weaknesses and demonstrating the framework's ability to improve agent-oriented methodologies by detecting their weaknesses in detail.

Prioritizing Service Quality Dimensions: A Neural Network Approach

One of the determinants of a firm's prosperity is its customers' perceived service quality and satisfaction. While service quality is wide in scope and consists of various dimensions, these dimensions may differ in their relative importance in affecting customers' overall satisfaction with service quality. Identifying the relative rank of the different service quality dimensions is important because it can help managers find out which dimensions have a greater effect on overall satisfaction. Such insight leads to more effective resource allocation and, ultimately, higher levels of customer satisfaction. Despite its criticality, this issue has not received enough attention so far. Therefore, using a sample of 240 bank customers in Iran, an artificial neural network is developed to address this gap in the literature. As customers' evaluation of service quality is a subjective process, artificial neural networks, as a brain metaphor, appear well suited to modeling such a complicated process. The first contribution of this study is a neural network that predicts customers' overall satisfaction with service quality with a promising level of accuracy. The second important finding is the prioritization of the service quality dimensions affecting overall satisfaction, obtained through sensitivity analysis of the neural network.
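The sensitivity-analysis idea can be sketched in a few lines: perturb one input dimension at a time and measure the average change in the network's output. The dimension names, weights, and data below are illustrative assumptions (an untrained toy network), not the study's model or survey data:

```python
import numpy as np

# Toy fixed-weight MLP standing in for the trained satisfaction model
rng = np.random.default_rng(42)
dims = ["tangibles", "reliability", "responsiveness", "empathy"]
W1 = rng.normal(size=(4, 8))
b1 = rng.normal(size=8)
W2 = rng.normal(size=(8, 1))

def mlp(x):
    h = np.tanh(x @ W1 + b1)          # hidden layer
    return (h @ W2).ravel()           # predicted overall satisfaction

# Perturbation-based sensitivity: mean absolute output change per unit
# nudge of each input, averaged over a sample of random "customers"
X = rng.uniform(0, 1, size=(500, 4))
eps = 1e-3
base = mlp(X)
sens = []
for i in range(4):
    Xp = X.copy()
    Xp[:, i] += eps
    sens.append(np.mean(np.abs(mlp(Xp) - base)) / eps)
sens = np.array(sens)
ranking = [dims[i] for i in np.argsort(-sens)]  # most influential first
print(dict(zip(dims, sens.round(3))))
```

Ranking the dimensions by this sensitivity score is what yields the prioritization described above; on a trained network the ranking would reflect the learned input-output relationship rather than random weights.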