Synthesis and Characterization of Chromium (III) Complexes with L-Glutamic Acid, Glycine and L-Cysteine

Some chromium(III) complexes were synthesized with three amino acids, L-glutamic acid, glycine, and L-cysteine, as ligands, in order to provide a new Cr(III)-containing supplement for patients with type 2 diabetes mellitus. The complexes were prepared by refluxing an aqueous solution of chromium(III) chloride with L-glutamic acid, glycine, or L-cysteine after pH adjustment with sodium hydroxide. The complexes were characterized by infrared and UV-Vis spectrophotometry and elemental analysis. The yields of the four products were 87.50% and 56.76% for the two Cr-Glu complexes, 46.70% for the Cr-Gly complex, and 40.08% for the Cr-Cys complex, respectively. The predicted structures of the complexes are [Cr(glu)2(H2O)2]·xH2O, Cr(gly)3·xH2O and Cr(cys)3·xH2O, respectively.

A Novel VLSI Architecture of Hybrid Image Compression Model based on Reversible Blockade Transform

Image compression can improve the performance of digital systems by reducing the time and cost of image storage and transmission without significant loss of image quality. The discrete cosine transform has emerged as a state-of-the-art standard for image compression. In this paper, a hybrid image compression technique based on reversible blockade transform coding is proposed. The technique, implemented over regions of interest (ROIs), is based on selecting coefficients that belong to different transforms depending on the region. This method allows: (1) codification of multiple kernels at various degrees of interest, (2) arbitrarily shaped spectra, and (3) flexible adjustment of the compression quality of the image and the background. No modification of the standard JPEG2000 decoder is required. The method was applied to different types of images. Results show better performance for the selected regions than when image coding methods were employed for the whole image. We believe this method is an excellent tool for future image compression research, mainly for images where region-based coding is of interest, such as medical imaging modalities and several multimedia applications. Finally, a VLSI implementation of the proposed method is presented. It is also shown that the Hartley and cosine transform kernels give better performance than the other models.
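
As a rough illustration of the transform-coding idea discussed above (not the authors' reversible blockade transform), the sketch below builds orthonormal DCT-II and Hartley kernels, transforms an 8x8 block, and keeps only the largest-magnitude coefficients; the block size, the random test block and the keep_largest selection rule are illustrative assumptions.

```python
import numpy as np

def dct2_kernel(n=8):
    """Orthonormal 1-D DCT-II basis matrix of size n x n."""
    k = np.arange(n)
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c

def hartley_kernel(n=8):
    """Discrete Hartley transform basis matrix (cas = cos + sin), used here as an
    alternative separable kernel."""
    i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return (np.cos(2 * np.pi * i * j / n) + np.sin(2 * np.pi * i * j / n)) / np.sqrt(n)

def block_transform(block, kernel):
    """Separable 2-D transform of a square image block."""
    return kernel @ block @ kernel.T

def keep_largest(coeffs, k):
    """Retain only the k largest-magnitude coefficients (a simple ROI-style selection)."""
    flat = np.abs(coeffs).ravel()
    thresh = np.sort(flat)[-k]
    return np.where(np.abs(coeffs) >= thresh, coeffs, 0.0)

# Example: transform an 8x8 block, keep 10 coefficients, reconstruct.
block = np.random.rand(8, 8) * 255
C = dct2_kernel(8)
coeffs = block_transform(block, C)
sparse = keep_largest(coeffs, 10)
recon = C.T @ sparse @ C          # inverse of the orthonormal separable transform
print("reconstruction error:", np.linalg.norm(block - recon))
```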

Fusion of Colour and Depth Information to Enhance Wound Tissue Classification

Patients with diabetes are susceptible to chronic foot wounds which may be difficult to manage and slow to heal. Diagnosis and treatment currently rely on the subjective judgement of experienced professionals. An objective method of tissue assessment is required. In this paper, a data fusion approach was taken to wound tissue classification. The supervised Maximum Likelihood and unsupervised Multi-Modal Expectation Maximisation algorithms were used to classify tissues within simulated wound models by weighting the contributions of both colour and 3D depth information. It was found that, at low weightings, depth information could show significant improvements in classification accuracy when compared to classification by colour alone, particularly when using the maximum likelihood method. However, larger weightings were found to have an entirely negative effect on accuracy.
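
A minimal sketch of the kind of weighted colour/depth fusion described above, assuming a per-class Gaussian maximum-likelihood model and a single scalar weight w applied to the depth channel; the feature layout, the weight value and the synthetic data are assumptions, not the study's wound models.

```python
import numpy as np

def fit_gaussian_classes(X, y):
    """Fit a mean and covariance per class (Gaussian maximum-likelihood model)."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1]))
    return params

def classify_ml(X, params):
    """Assign each sample to the class with the highest Gaussian log-likelihood."""
    classes = list(params)
    ll = np.empty((X.shape[0], len(classes)))
    for j, c in enumerate(classes):
        mu, cov = params[c]
        d = X - mu
        inv = np.linalg.inv(cov)
        ll[:, j] = -0.5 * (np.einsum("ij,jk,ik->i", d, inv, d) + np.log(np.linalg.det(cov)))
    return np.array(classes)[np.argmax(ll, axis=1)]

def fuse_features(colour, depth, w):
    """Concatenate colour and depth features, scaling depth by weight w."""
    return np.hstack([colour, w * np.atleast_2d(depth).T])

# Hypothetical example: two tissue classes described by RGB + depth features.
rng = np.random.default_rng(0)
colour = rng.normal([[120, 60, 50]], 10, (200, 3))
depth = rng.normal(2.0, 0.3, 200)
labels = (depth > 2.0).astype(int)
X = fuse_features(colour, depth, w=0.2)
model = fit_gaussian_classes(X, labels)
pred = classify_ml(X, model)
print("training accuracy:", (pred == labels).mean())
```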

Effect of Silver Nanoparticle Size Prepared by Photoreduction Method on Optical Absorption Spectra of TiO2/Ag/N719 Dye Composite Films

TiO2/Ag composite films were prepared by incorporating Ag in the pores of mesoporous TiO2 films using a photoreduction method. The Ag nanoparticle sizes were in the range of 3.66-38.56 nm. The TiO2/Ag composite films were characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM) and transmission electron microscopy (TEM). The TiO2 films and TiO2/Ag composite films were immersed in a 0.3 mM N719 dye solution and characterized by UV-Vis spectrophotometry. The TiO2/Ag/N719 composite films showed that the optimal Ag nanoparticle size was 19.12 nm, which gave the maximum optical absorption. The improved absorption was attributed to surface plasmon resonance induced by the Ag nanoparticles, which enhances the absorption coefficient of the dye.

Enthalpies of Dissociation of Pure Methane and Carbon Dioxide Gas Hydrate

In this study, the enthalpies of dissociation of pure methane and pure carbon dioxide hydrates were calculated using hydrate equilibrium data obtained in this study. The enthalpy of dissociation was determined using the Clausius-Clapeyron equation. The results were compared with values reported in the literature that were obtained using various techniques.
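
The Clausius-Clapeyron step amounts to fitting ln P against 1/T along the hydrate equilibrium curve, the slope giving -ΔH_d/(zR). The sketch below shows that fit; the equilibrium points are placeholder values and the compressibility factor z = 1 is an assumption, so measured data and a proper z from an equation of state should be substituted.

```python
import numpy as np

# Placeholder equilibrium points (T in K, P in MPa) used only to show the fit;
# substitute the measured hydrate-equilibrium data here.
T = np.array([274.2, 276.2, 278.2, 280.2, 282.2])
P = np.array([2.9, 3.6, 4.4, 5.4, 6.7])

R = 8.314   # J mol-1 K-1
z = 1.0     # gas-phase compressibility factor (assumed; use a value from a real EoS)

# Clausius-Clapeyron: d(ln P)/d(1/T) = -dH_d / (z R)
slope, _ = np.polyfit(1.0 / T, np.log(P), 1)
dH_dissociation = -z * R * slope
print(f"estimated enthalpy of dissociation: {dH_dissociation / 1000:.1f} kJ/mol")
```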

Influence of Apo E Polymorphism on Coronary Artery Disease

The ε4 allele of the ε2, ε3 and ε4 protein isoform polymorphism in the gene encoding apolipoprotein E (Apo E) has previously been associated with increased coronary artery disease (CAD); we therefore investigated the significance of this polymorphism in the pathogenesis of CAD in Iranian patients with stenosis and control subjects. To investigate the association between Apo E polymorphism and coronary artery disease, we performed a comparative case-control study of the frequency of Apo E polymorphism in one hundred CAD patients with stenosis who underwent coronary angiography (>50% stenosis) and 100 control subjects (

Secure Block-Based Video Authentication with Localization and Self-Recovery

Because of the great advances in multimedia technology, digital multimedia is vulnerable to malicious manipulation. In this paper, a public-key, self-recovery, block-based video authentication technique is proposed which can not only precisely localize detected alterations but also recover the missing data with high reliability. In the proposed block-based technique, multiple description coding (MDC) is used to generate two codes (two descriptions) for each block. Although one block code (one description) is enough to rebuild an altered block, the altered block is rebuilt with better quality from the two block descriptions, so using MDC increases the reliability of data recovery. A block signature is computed using a cryptographic hash function, and a doubly linked chain is used to embed the block signature copies and the block descriptions into the LSBs of distant blocks and of the block itself. The doubly linked chain scheme gives the proposed technique the capability to thwart vector quantization attacks. In the proposed technique, anyone can check the authenticity of a given video using the public key. The experimental results show that the proposed technique is reliable for detecting, localizing and recovering alterations.
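
A minimal sketch of two ingredients mentioned above: computing a cryptographic block signature and embedding a payload in a block's LSB plane. It omits the MDC descriptions, the public-key signing and the doubly linked chain, and the 16x16 block size and SHA-256 choice are assumptions rather than the paper's exact scheme.

```python
import hashlib
import numpy as np

def block_signature(block):
    """SHA-256 digest of a block's most significant bits (the LSB plane is excluded
    because it will carry the embedded payload)."""
    msb = (block & 0xFE).astype(np.uint8)        # drop the LSB plane
    return hashlib.sha256(msb.tobytes()).digest()

def embed_bits(block, payload):
    """Write the payload bits into the LSB plane of a block (illustrative only)."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))[: block.size]
    flat = (block.ravel() & 0xFE) | bits
    return flat.reshape(block.shape).astype(np.uint8)

# Example on one 16x16 luminance block of a hypothetical frame.
rng = np.random.default_rng(1)
block = rng.integers(0, 256, (16, 16), dtype=np.uint8)
sig = block_signature(block)
watermarked = embed_bits(block, sig)

# Verification: recompute the signature from the MSBs and compare.
print("authentic:", block_signature(watermarked) == sig)
```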

Taiwan Sugar Corporation's Participation in the Mechanism of Payment for Environmental Services (PES)

The Taiwan government has promoted the “Plain Landscape Afforestation and Greening Program" since 2002. A key task of the program was the payment for environmental services (PES), entitled the “Plain Landscape Afforestation Policy" (PLAP), which was certified by the Executive Yuan on August 31, 2001 and enacted on January 1, 2002. According to the policy, the total area of afforestation was estimated to reach 25,100 hectares by December 31, 2007. By the end of 2007, the policy had been in force for six years in total and the actual area of afforestation was 8,919.18 hectares. Of this, Taiwan Sugar Corporation (TSC) accounted for 7,960 hectares (with 2,450.83 hectares as public service area), or 86.22% of the total afforestation area, while private farmland promoted by local governments accounted for 869.18 hectares, or 9.75% of the total afforestation area. Based on the above, we observe that most of the afforestation area under this policy is executed by TSC, and the achievement ratio of TSC is better than that of others. This implies that the success of the PLAP strongly depends on TSC's execution. The objective of this study is to analyze the relevant policy planning of TSC's participation in the PLAP, suggest complementary measures, and draw up effective adjustment mechanisms, so as to improve the effectiveness of executing the policy. Our main conclusions and suggestions are summarized as follows: 1. The main reason for TSC’s participation in the PLAP is passive cooperation with the central government or with company policy. Prior to TSC’s participation in the PLAP, its lands were mainly used for growing sugarcane. 2. The main factors in TSC's selection of tree species are the suitability of land and species. The largest proportion of tree species is allocated to economic forests, and the lack of technical instruction was the main problem during afforestation. Moreover, how to improve TSC’s future development in leisure agriculture and the landscape business becomes a key topic. 3. TSC has developed short- and long-term plans for future participation in the PLAP. However, there is little willingness or incentive to budget for such detailed planning. 4. Most TSC interviewees consider the PLAP requirements unreasonable. Among these, an unreasonable requirement on the number of trees accounted for the greatest proportion; furthermore, most interviewees suggested that the government should continue to provide incentives even after 20 years. 5. Since the government shares the same goals as TSC, there should be sufficient cooperation and communication to support technical instruction and the reduction of afforestation cost, which will also help to improve the effectiveness of the policy.

An Empirical Analysis of Arabic Web Pages Classification using Fuzzy Operators

In this study, a fuzzy similarity approach for Arabic web page classification is presented. The approach uses a fuzzy term-category relation by manipulating the membership degrees of the training data and the degree value of a test web page. Six measures are used and compared in this study: the Einstein, Algebraic, Hamacher, MinMax, Special case fuzzy and Bounded Difference operators. These measures are applied and compared using 50 different Arabic web pages. The Einstein measure gave the best performance among the measures considered. An analysis of these measures and concluding remarks are presented.
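
For reference, the sketch below implements the standard fuzzy conjunction operators that share the names listed above (the paper's exact definitions, and its "special case fuzzy" measure, may differ) together with a hypothetical max-T aggregation of term-category membership degrees; the page and category profiles are illustrative only.

```python
def algebraic(a, b):
    return a * b

def einstein(a, b):
    return (a * b) / (2.0 - (a + b - a * b))

def hamacher(a, b):
    return 0.0 if a == b == 0.0 else (a * b) / (a + b - a * b)

def min_max(a, b):
    return min(a, b)

def bounded_difference(a, b):
    return max(0.0, a + b - 1.0)

def fuzzy_score(term_weights, category_memberships, t_norm):
    """Aggregate a page's term degrees against a category's fuzzy term memberships:
    apply the t-norm per shared term, then take the maximum (a max-T composition)."""
    shared = set(term_weights) & set(category_memberships)
    if not shared:
        return 0.0
    return max(t_norm(term_weights[t], category_memberships[t]) for t in shared)

# Hypothetical page/category profiles with membership degrees in [0, 1].
page = {"اقتصاد": 0.8, "سوق": 0.6, "بنك": 0.4}
economy_cat = {"اقتصاد": 0.9, "بنك": 0.7, "تجارة": 0.5}
for name, op in [("Einstein", einstein), ("Algebraic", algebraic),
                 ("Hamacher", hamacher), ("MinMax", min_max),
                 ("Bounded diff.", bounded_difference)]:
    print(name, round(fuzzy_score(page, economy_cat, op), 3))
```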

A Fast HRRP Synthesis Algorithm with Sensing Dictionary in GTD Model

In this paper, a fast high-resolution range profile synthesis algorithm called orthogonal matching pursuit with sensing dictionary (OMP-SD) is proposed. It formulates traditional HRRP synthesis as a sparse approximation problem over a redundant dictionary. Since it exploits the prior knowledge that the synthetic range profiles (SRPs) of targets are sparse, SRP synthesis can be accomplished even in the presence of data loss. Moreover, the computational complexity decreases from O(MNDK) flops for OMP to O(M(N + D)K) flops for OMP-SD by introducing a sensing dictionary (SD). Simulation experiments illustrate its advantages in both additive white Gaussian noise (AWGN) and noiseless situations.
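
A minimal sketch of orthogonal matching pursuit with an optional sensing dictionary used only in the atom-selection step, which is the general OMP-SD idea; the GTD-model dictionary construction and the paper's actual sensing dictionary are not reproduced, and the random dictionary and 3-sparse test signal are assumptions.

```python
import numpy as np

def omp(D, y, k, S=None):
    """Orthogonal matching pursuit: greedily select k atoms of dictionary D to
    approximate y. If a sensing dictionary S is given, atom selection uses the
    correlations S.T @ residual instead of D.T @ residual (the OMP-SD idea);
    the least-squares fit still uses the original atoms."""
    S = D if S is None else S
    residual = y.copy()
    support = []
    for _ in range(k):
        # pick the atom most correlated with the current residual
        idx = int(np.argmax(np.abs(S.T @ residual)))
        if idx not in support:
            support.append(idx)
        # re-fit the coefficients on the selected atoms and update the residual
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coeffs
    x = np.zeros(D.shape[1])
    x[support] = coeffs
    return x

# Hypothetical example: recover a 3-sparse vector from noisy projections.
rng = np.random.default_rng(2)
D = rng.standard_normal((64, 256))
D /= np.linalg.norm(D, axis=0)
x_true = np.zeros(256); x_true[[10, 80, 200]] = [1.5, -2.0, 0.7]
y = D @ x_true + 0.01 * rng.standard_normal(64)
x_hat = omp(D, y, k=3)
print("recovered support:", np.nonzero(x_hat)[0])
```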

Biodegradation of Lignocellulosic Residues of Water Hyacinth (Eichhornia crassipes) and Response Surface Methodological Approach to Optimize Bioethanol Production Using Fermenting Yeast Pachysolen tannophilus NRRL Y-2460

The objective of this research was to investigate the biodegradation of water hyacinth (Eichhornia crassipes) to produce bioethanol, using dilute-acid pretreatment (1% sulfuric acid), which results in high hemicellulose decomposition, and the yeast Pachysolen tannophilus as the bioethanol-producing strain. A maximum ethanol yield of 1.14 g/L (yield coefficient 0.24 g g-1; productivity 0.015 g L-1 h-1) was comparable to the predicted value of 32.05 g/L obtained by Central Composite Design (CCD). The maximum ethanol yield coefficient was comparable to those obtained through enzymatic saccharification and fermentation of acid hydrolysate using a fully equipped fermentor. Although the maximum ethanol concentration was low at laboratory scale, improvement of the lignocellulosic ethanol yield is necessary for large-scale production.

Evaluation of Bacterial Composition of the Aerosol of Selected Abattoirs in Akure, South Western Nigeria

This study was carried out to reveal the bacterial composition of the aerosol in the studied abattoirs. The bacteria isolated were characterized according to microbiological standards. Factors such as temperature and distance were considered as variables in this study. Isolation was carried out at different temperatures (27°C, 31°C and 29°C) and at distances of 100 meters and 200 meters from the slaughter sites. The results showed that strains of Staphylococcus aureus, Escherichia coli, Bacillus subtilis, Lactobacillus alimentarius and Micrococcus sp. were identified. The total viable counts showed that more microorganisms were present in the morning, while the lowest viable count of 388 cfu was recorded in the evening period of this study. This study also showed that higher microbial loads were recorded the farther the sampling point was from the slaughter site. Conclusively, the array of bacteria isolated suggests that abattoir sites may be a potential source of pathogenic organisms to commuters if located within residential environments.

The Problem of Using the Calculation of the Critical Path to Solve Instances of the Job Shop Scheduling Problem

A procedure commonly used in the Job Shop Scheduling Problem (JSSP) to evaluate the neighborhood functions employed by non-deterministic algorithms is the calculation of the critical path in a digraph. This paper presents an experimental study of the computational cost incurred when the critical path is calculated in solutions of large JSSP instances. The results indicate that if the critical path is used to generate neighborhoods in the meta-heuristics applied to the JSSP, a high computational cost is incurred, even though calculating the critical path in any digraph is of polynomial complexity.
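
For concreteness, the critical path in a digraph can be computed as the longest path of an acyclic precedence graph in O(V + E) time, as in the sketch below; the toy operation graph and durations are illustrative, not a benchmark JSSP instance.

```python
from collections import defaultdict, deque

def critical_path(n, edges, duration):
    """Longest (critical) path in a DAG with n nodes.
    edges: list of (u, v) precedence arcs; duration[v]: processing time of node v.
    Returns (length, path). Runs in O(V + E), i.e. polynomial as noted above."""
    succ, indeg = defaultdict(list), [0] * n
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
    dist = list(duration)                 # best completion time ending at each node
    pred = [-1] * n
    queue = deque(i for i in range(n) if indeg[i] == 0)
    while queue:                          # Kahn's topological sort
        u = queue.popleft()
        for v in succ[u]:
            if dist[u] + duration[v] > dist[v]:
                dist[v] = dist[u] + duration[v]
                pred[v] = u
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    end = max(range(n), key=dist.__getitem__)
    path, node = [], end
    while node != -1:
        path.append(node)
        node = pred[node]
    return dist[end], path[::-1]

# Toy disjunctive-graph fragment: 6 operations with fixed machine orientations.
edges = [(0, 1), (1, 2), (0, 3), (3, 4), (4, 2), (3, 5)]
durations = [3, 2, 4, 5, 1, 6]
print(critical_path(6, edges, durations))   # (14, [0, 3, 5])
```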

Application of Artificial Neural Network for the Prediction of Pressure Distribution of a Plunging Airfoil

A series of experimental tests was conducted on a section of a 660 kW wind turbine blade to measure the pressure distribution of this model oscillating in plunging motion. In order to minimize the amount of data required to predict the aerodynamic loads of the airfoil, a General Regression Neural Network (GRNN) was trained using the measured experimental data. The network, once proved to be sufficiently accurate, was used to predict the flow behavior of the airfoil for the desired conditions. Results showed that, using only a few of the acquired data points, the trained neural network was able to predict accurate results with minimal error when compared with the corresponding measured values. Therefore, with this trained network, the aerodynamic coefficients of the plunging airfoil are predicted accurately at different oscillation frequencies, amplitudes, and angles of attack, reducing the cost of tests while achieving acceptable accuracy.
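
A minimal sketch of a General Regression Neural Network in the sense of Specht (a kernel-weighted average of the stored training targets), which is the model family named above; the input features, the smoothing parameter sigma and the surrogate pressure-coefficient data are assumptions standing in for the measured plunging-airfoil dataset.

```python
import numpy as np

class GRNN:
    """Minimal General Regression Neural Network (Specht, 1991):
    a Gaussian-kernel-weighted average of the stored training targets."""
    def __init__(self, sigma=0.1):
        self.sigma = sigma

    def fit(self, X, y):
        self.X, self.y = np.asarray(X, float), np.asarray(y, float)
        return self

    def predict(self, Xq):
        Xq = np.atleast_2d(Xq)
        d2 = ((Xq[:, None, :] - self.X[None, :, :]) ** 2).sum(-1)
        w = np.exp(-d2 / (2.0 * self.sigma ** 2))
        return (w @ self.y) / w.sum(axis=1)

# Hypothetical inputs (reduced frequency, amplitude, angle of attack, chord position)
# and a surrogate pressure coefficient; replace with the measured dataset.
rng = np.random.default_rng(3)
X = rng.uniform([0.05, 2.0, -5.0, 0.0], [0.15, 8.0, 15.0, 1.0], (300, 4))
y = -1.5 * X[:, 3] + 0.1 * X[:, 2] + 0.3 * np.sin(6 * X[:, 3])   # placeholder Cp surface
model = GRNN(sigma=0.3).fit(X, y)
print(model.predict(X[:3]), y[:3])
```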

Coordinated Design of TCSC Controller and PSS Employing Particle Swarm Optimization Technique

This paper investigates the application of the Particle Swarm Optimization (PSO) technique for the coordinated design of a Power System Stabilizer (PSS) and a Thyristor Controlled Series Compensator (TCSC)-based controller to enhance power system stability. The design problem of the PSS and TCSC-based controllers is formulated as a time-domain optimization problem. The PSO algorithm is employed to search for the optimal controller parameters. By minimizing the time-domain objective function, in which the deviation in the oscillatory rotor speed of the generator is involved, the stability performance of the system is improved. To compare the capability of the PSS and the TCSC-based controller, both are first designed independently and then in a coordinated manner. The proposed controllers are tested on a weakly connected power system. Eigenvalue analysis and non-linear simulation results are presented to show the effectiveness of the coordinated design approach over individual design. The simulation results show that the proposed controllers are effective in damping low-frequency oscillations resulting from various small disturbances such as changes in mechanical power input and reference voltage setting.
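
A minimal global-best PSO sketch of the kind of search described above; the swarm settings are generic defaults, and the surrogate_itae objective is a stand-in, since the actual design evaluates candidate PSS/TCSC parameters by time-domain simulation of the rotor-speed deviation.

```python
import numpy as np

def pso(objective, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best PSO minimising `objective` over the box `bounds` (D x 2 array)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, (n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()

# Stand-in objective: in the actual design this would run a time-domain simulation of
# the machine with the candidate PSS/TCSC gains and time constants and return, e.g.,
# an ITAE-type integral of the rotor-speed deviation.
def surrogate_itae(params):
    return np.sum((params - np.array([20.0, 0.2, 0.05, 0.3, 0.1])) ** 2)

bounds = np.array([[1, 100], [0.01, 1], [0.01, 1], [0.01, 1], [0.01, 1]], float)
best, best_f = pso(surrogate_itae, bounds)
print("optimised controller parameters:", np.round(best, 3), "objective:", round(best_f, 4))
```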

Robust Regression and its Application in Financial Data Analysis

This research aims to describe the application of robust regression and its advantages over the least squares regression method in analyzing financial data. To do this, the relationship between earnings per share, book value of equity per share and share price (the price model), and between earnings per share, annual change in earnings per share and stock return (the return model), is discussed using both robust and least squares regression, and the outcomes are compared. Comparing the results from the robust regression and the least squares regression shows that the former provides a better and more realistic analysis by eliminating or reducing the contribution of outliers and influential data points. Therefore, robust regression is recommended for obtaining more precise results in financial data analysis.
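
A minimal sketch of one common robust-regression scheme, Huber-weighted iteratively reweighted least squares, contrasted with ordinary least squares on contaminated data; the paper does not specify this particular estimator, and the price-model data below are hypothetical.

```python
import numpy as np

def huber_irls(X, y, delta=1.345, iters=50, tol=1e-8):
    """Robust linear regression with Huber weights via iteratively reweighted
    least squares; outliers get down-weighted instead of dominating the fit."""
    X1 = np.column_stack([np.ones(len(X)), X])          # add intercept
    beta = np.linalg.lstsq(X1, y, rcond=None)[0]        # OLS start
    for _ in range(iters):
        r = y - X1 @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12   # robust scale (MAD)
        u = np.abs(r / s)
        w = np.where(u <= delta, 1.0, delta / u)        # Huber weights
        Xw = X1 * w[:, None]
        beta_new = np.linalg.solve(X1.T @ Xw, Xw.T @ y) # weighted normal equations
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# Hypothetical price-model data: share price vs EPS and book value, with outliers.
rng = np.random.default_rng(4)
eps = rng.normal(2.0, 0.5, 200); bv = rng.normal(10.0, 2.0, 200)
price = 5 + 4 * eps + 1.5 * bv + rng.normal(0, 1, 200)
price[:10] += 60                                        # contaminate with outliers
X = np.column_stack([eps, bv])
ols = np.linalg.lstsq(np.column_stack([np.ones(200), X]), price, rcond=None)[0]
print("OLS:   ", np.round(ols, 2))
print("Robust:", np.round(huber_irls(X, price), 2))
```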

The Incorporation of In in GaAsN as a Means of N Fraction Calibration

InGaAsN and GaAsN epitaxial layers with similar nitrogen compositions in a sample were successfully grown on a GaAs (001) substrate by solid source molecular beam epitaxy. An electron cyclotron resonance nitrogen plasma source has been used to generate atomic nitrogen during the growth of the nitride layers. The indium composition changed from sample to sample to give compressive and tensile strained InGaAsN layers. Layer characteristics have been assessed by high-resolution x-ray diffraction to determine the relationship between the lattice constant of the GaAs1-yNy layer and the fraction x of In. The objective was to determine the In fraction x in an InxGa1-xAs1-yNy epitaxial layer which exactly cancels the strain present in a GaAs1-yNy epitaxial layer with the same nitrogen content when grown on a GaAs substrate.
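
A rough back-of-the-envelope version of the strain-cancellation condition, assuming linear Vegard-law interpolation between binary lattice constants (literature values, including the cubic GaN and InN constants, which should be checked) and ignoring bowing; the experiment instead extracts the relationship from high-resolution x-ray diffraction.

```python
# Assumed zinc-blende lattice constants in angstroms (literature values; verify
# against the calibration actually used in the experiment).
A_GAAS, A_INAS = 5.6533, 6.0583
A_GAN, A_INN = 4.50, 4.98      # cubic (zinc-blende) nitrides

def lattice_constant(x, y):
    """Vegard-law interpolation for relaxed In(x)Ga(1-x)As(1-y)N(y)."""
    return ((1 - x) * (1 - y) * A_GAAS + x * (1 - y) * A_INAS
            + (1 - x) * y * A_GAN + x * y * A_INN)

def strain_compensating_indium(y):
    """In fraction x for which the quaternary is lattice-matched to GaAs,
    i.e. the tensile strain of GaAs(1-y)N(y) is exactly cancelled."""
    a_no_in = (1 - y) * A_GAAS + y * A_GAN
    slope = (1 - y) * (A_INAS - A_GAAS) + y * (A_INN - A_GAN)
    return (A_GAAS - a_no_in) / slope

for y in (0.01, 0.02, 0.03):
    x = strain_compensating_indium(y)
    print(f"y = {y:.2f}:  x = {x:.3f}  (a = {lattice_constant(x, y):.4f} A)")
```

Under these assumed lattice constants the compensating In fraction comes out close to x ≈ 3y, the usual rule of thumb, but the measured calibration should take precedence.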

Adaptive Algorithm to Predict the QoS of Web Processes and Workflows

Workflow Management Systems (WfMS) allow organizations to streamline and automate business processes and reengineer their structure. One important requirement for this type of system is the management and computation of the Quality of Service (QoS) of processes and workflows. Currently, a range of Web process and workflow languages exist. Each language can be characterized by the set of patterns it supports. Developing and implementing a suitable and generic algorithm to compute the QoS of processes that have been designed using different languages is a difficult task. This is because some patterns are specific to particular process languages and new patterns may be introduced in future versions of a language. In this paper, we describe an adaptive algorithm implemented to cope with these two problems. The algorithm is called adaptive since it can be dynamically changed as the patterns of a process language also change.
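
A minimal sketch of the pattern-based QoS aggregation idea, in which each workflow pattern contributes its own reduction rule and supporting a new pattern only means registering another rule; the pattern names, the single time dimension and the example workflow are assumptions, not the authors' algorithm.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Task:
    name: str
    time: float                     # expected execution time (one QoS dimension)

@dataclass
class Pattern:
    kind: str                       # e.g. "sequence", "parallel", "xor-choice"
    children: List                  # Tasks or nested Patterns
    probs: List[float] = field(default_factory=list)   # branch probabilities for choices

# Registry of reduction rules: adding support for a new workflow-language pattern
# only means registering one more function, which is the adaptive idea sketched here.
RULES: Dict[str, Callable[[List[float], Pattern], float]] = {
    "sequence":   lambda ts, p: sum(ts),
    "parallel":   lambda ts, p: max(ts),                       # AND-split / AND-join
    "xor-choice": lambda ts, p: sum(w * t for w, t in zip(p.probs, ts)),
}

def expected_time(node) -> float:
    if isinstance(node, Task):
        return node.time
    child_times = [expected_time(c) for c in node.children]
    return RULES[node.kind](child_times, node)

# Hypothetical workflow: A ; (B || C) ; choice{D: 0.7, E: 0.3}
wf = Pattern("sequence", [
    Task("A", 2.0),
    Pattern("parallel", [Task("B", 4.0), Task("C", 3.0)]),
    Pattern("xor-choice", [Task("D", 1.0), Task("E", 5.0)], probs=[0.7, 0.3]),
])
print("expected workflow time:", expected_time(wf))   # 2 + 4 + (0.7*1 + 0.3*5) = 8.2
```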

Malaysia Folk Literature in Early Childhood Education

Malay Folk Literature in early childhood education serves as an important agent in child development, involving emotional, thinking and language aspects. Up to now, not much research has been carried out in Malaysia, particularly on the teaching and learning aspects, nor has there been an effort to publish “big books." Hence, this article discusses the stance taken by university undergraduate students, teachers and parents in evaluating Malay Folk Literature in early childhood education for use as big books. The data collated and analyzed were taken from 646 respondents comprising 347 undergraduates and 299 teachers. Results of the study indicated that Malay Folk Literature can be absorbed into teaching and learning for early childhood with a mean of 4.25, while it can be used in big books with a mean of 4.14. Meanwhile, the highest mean values for placing Malay Folk Literature genres as big books in early childhood education were for exemplary stories among undergraduates, with a mean of 4.47, and animal fables among teachers, with a mean of 4.38. The lowest mean value of 3.57 was given to lipurlara stories. The most popular Malay Folk Literature found suitable for young children is Sang Kancil and the Crocodile, followed by Bawang Putih Bawang Merah. Pak Padir, Legends of Mahsuri, Origin of Malacca, and Origin of Rainbow are among the popular stories as well. Overall, the undergraduates showed a more positive attitude toward all the items compared to the teachers. The t-test analysis revealed a non-significant relationship between the undergraduate students and teachers on all the items for the teaching and learning of Malay Folk Literature.

Transmitter Macrodiversity in Multihopping-SFN Based Algorithm for Improved Node Reachability and Robust Routing

A novel idea presented in this paper is to combine multihop routing with single-frequency networks (SFNs) for a broadcasting scenario. An SFN is a set of multiple nodes that transmit the same data simultaneously, resulting in transmitter macrodiversity. Two of the most important performance factors of multihop networks, node reachability and routing robustness, are analyzed. Simulation results show that our proposed SFN-D routing algorithm improves node reachability by 37 percentage points compared to non-SFN multihop routing. It shows a diversity gain of 3.7 dB, meaning that 3.7 dB lower transmission powers are required for the same reachability. Even better results are possible for larger networks. If an important node becomes inactive, this algorithm can find new routes that a non-SFN scheme would not be able to find. Thus, two of the major problems in multihopping are addressed: achieving robust routing as well as improving node reachability or reducing transmission power.