Improvement of Lipase Catalytic Properties by Immobilization in Hybrid Matrices

Lipases are particularly amenable to immobilization by entrapment, as they work equally well in aqueous and non-conventional media, and long-term stability of enzyme activity and enantioselectivity is needed to develop more efficient bioprocesses. The improvement of Pseudomonas fluorescens (Amano AK) lipase characteristics was investigated by optimizing the immobilization procedure in hybrid organic-inorganic matrices using ionic liquids as additives. Ionic liquids bearing a more hydrophobic alkyl group in the cationic moiety were beneficial for the activity of the immobilized lipase. Silanes with non-hydrolyzable alkyl or aryl groups, used as precursors in combination with tetramethoxysilane, generated composites with higher enantioselectivity than the native enzyme in acylation reactions of secondary alcohols. The optimal effect on both activity and enantioselectivity was achieved for the composite made from octyltrimethoxysilane and tetramethoxysilane at a 1:1 molar ratio (a 60% increase in total activity upon immobilization and an enantiomeric ratio of 30). Ionic liquids also proved valuable as reaction media for the studied reactions, performing comparably to hexane, the organic solvent commonly used.
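
For reference, the enantiomeric ratio E quoted above is conventionally obtained from the conversion c and the product enantiomeric excess ee_P via Chen's equation,

    E = \frac{\ln\left[ 1 - c\,(1 + ee_P) \right]}{\ln\left[ 1 - c\,(1 - ee_P) \right]},

so the reported value E = 30 indicates a synthetically useful kinetic resolution (values above about 20 are generally considered practical).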

Improvement of Semen Quality in Holstein Bulls during Heat Stress by Supplementing Omega-3 Fatty Acids

The aim of the current study was to investigate changes in the quality parameters of Holstein bull semen during heat stress, and the effect of feeding a source of omega-3 fatty acids in this period. Samples were obtained from 19 Holstein bulls during the expected period of heat stress in Iran (June to September 2009). The control group (n=10) was fed a standard concentrate feed, while the treatment group (n=9) received the same feed top-dressed with 100 g of an omega-3 enriched nutraceutical. Semen quality was assessed on ejaculates collected after 1, 5, 9, and 12 weeks of supplementation. Computer-assisted assessment of sperm motility, viability assessment (eosin-nigrosin staining), and the hypo-osmotic swelling test (HOST) were conducted. Heat stress significantly affected sperm quality parameters by weeks 5 and 9.

A Perceptually Optimized Wavelet Embedded Zero Tree Image Coder

In this paper, we propose a Perceptually Optimized Embedded ZeroTree Image Coder (POEZIC) that applies perceptual weighting to wavelet transform coefficients prior to SPIHT encoding, in order to reach a target bit rate with improved perceptual quality relative to SPIHT coding alone. The paper also introduces a new objective quality metric based on a psychovisual model that integrates properties of the human visual system (HVS) and plays an important role in POEZIC quality assessment. The POEZIC coder is based on a vision model that incorporates various masking effects of HVS perception; it weights the wavelet coefficients according to this model and attempts to increase the perceptual quality for a given bit rate and observation distance. The perceptual weights for all wavelet subbands are computed based on 1) luminance masking and contrast masking, 2) the contrast sensitivity function (CSF), to achieve the perceptual decomposition weighting, and 3) the wavelet error sensitivity (WES), used to reduce perceptual quantization errors. The new perceptually optimized codec has the same complexity as the original SPIHT technique, yet experimental results show that it performs very well in terms of quality measurement.
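
A minimal sketch of the coefficient-weighting step, assuming the PyWavelets library; the weight values are placeholders rather than the calibrated CSF/WES weights derived in the paper, and the SPIHT encoding stage itself is omitted:

    import numpy as np
    import pywt  # PyWavelets

    def weight_subbands(image, n_levels=3):
        # Decompose, then scale each detail subband by a perceptual weight
        # before embedded (SPIHT-style) encoding. The 1/level weights below
        # are illustrative placeholders, not the paper's CSF/WES tables.
        coeffs = pywt.wavedec2(image, "bior4.4", level=n_levels)
        weighted = [coeffs[0]]  # approximation subband left unweighted
        for level, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
            w = 1.0 / level  # level 1 = coarsest detail -> largest weight
            weighted.append((cH * w, cV * w, cD * w))
        return weighted

    img = np.random.rand(256, 256)  # stand-in for a test image
    coeffs = weight_subbands(img)

At the decoder, the inverse weights would be divided out before the inverse wavelet transform, so the weighting biases the embedded bit allocation rather than the reconstruction itself.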

The Riemann Barycenter Computation and Means of Several Matrices

This article gives an iterative definition of n-variable mean functions that repeatedly applies the corresponding two-variable mean. This extension method avoids recursion, an important improvement over the recursive formulas given earlier by Ando-Li-Mathias and Petz-Temesi. Furthermore, it is conjectured here that this iterative algorithm coincides with the solution of the Riemannian centroid minimization problem. Simulations are presented to compare the convergence rates of the different algorithms given in the literature, namely the gradient and Newton methods for Riemannian centroid computation.
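
For context, the Riemannian centroid (Karcher mean) referred to above is the minimizer

    X^{*} = \arg\min_{X \succ 0} \sum_{i=1}^{n} \left\| \log\!\left( X^{-1/2} A_i X^{-1/2} \right) \right\|_{F}^{2},

and the gradient method mentioned in the simulations is usually implemented as the fixed-point iteration

    X_{k+1} = X_k^{1/2} \exp\!\left( \frac{t}{n} \sum_{i=1}^{n} \log\!\left( X_k^{-1/2} A_i X_k^{-1/2} \right) \right) X_k^{1/2},

with step size t > 0 (a standard formulation of the algorithm, not quoted from the paper).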

An Experimental Study on Development of the Connection System of Concrete Barriers Applicable to Modular Bridge

Although many studies on the assembly technology of bridge construction have dealt mostly with the pier, girder, or deck of the bridge, studies on prefabricated barriers have rarely been performed. To understand the structural characteristics and applicability of the concrete barrier in the modular bridge, which is an assembly of structural members, a static loading test was performed. The structural performance as a road barrier of three methods, conventional cast-in-place (ST), vertical bolt connection (BVC), and horizontal bolt connection (BHC), was evaluated and compared through analyses of load-displacement curves, steel strain curves, concrete strain curves, and visual inspection of crack patterns. The vertical bolt connection (BVC) method demonstrated performance comparable to conventional cast-in-place (ST) while providing all the advantages of prefabrication technology. The need for future improvements in nut fastening as well as in legal standards and regulations is also addressed.

Ensemble Learning with Decision Tree for Remote Sensing Classification

In recent years, a number of works proposing the combination of multiple classifiers to produce a single classification have been reported in the remote sensing literature. The resulting classifier, referred to as an ensemble classifier, is generally found to be more accurate than any of the individual classifiers making up the ensemble. As accuracy is the primary concern, much of the research in the field of land cover classification focuses on improving classification accuracy. This study compares the performance of four ensemble approaches (boosting, bagging, DECORATE, and random subspace) with a univariate decision tree as the base classifier. Two training datasets, one noise-free and the other with 20 percent noise, were used to judge the performance of the different ensemble approaches. Results with the noise-free dataset suggest an improvement of about 4% in classification accuracy for all ensemble approaches compared with the univariate decision tree classifier. The highest classification accuracy, 87.43%, was achieved by the boosted decision tree. A comparison of results with the noisy dataset suggests that the bagging, DECORATE, and random subspace approaches work well with these data, whereas the performance of the boosted decision tree degrades to a classification accuracy of 79.7%, lower even than the 80.02% achieved by the unboosted decision tree classifier.
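
A minimal sketch of three of the four ensemble approaches, assuming scikit-learn (version >= 1.2 parameter names; the study does not state its implementation, and DECORATE has no scikit-learn counterpart). Random subspace is emulated by bagging over feature subsets without row bootstrapping:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic stand-in for the land-cover training data used in the study.
    X, y = make_classification(n_samples=1000, n_features=20, n_informative=8,
                               n_classes=4, random_state=0)
    base = DecisionTreeClassifier(random_state=0)

    ensembles = {
        "boosting": AdaBoostClassifier(estimator=base, n_estimators=50,
                                       random_state=0),
        "bagging": BaggingClassifier(estimator=base, n_estimators=50,
                                     random_state=0),
        # random subspace: resample features, keep all training rows
        "random subspace": BaggingClassifier(estimator=base, n_estimators=50,
                                             bootstrap=False, max_features=0.5,
                                             random_state=0),
    }
    for name, clf in ensembles.items():
        print(name, cross_val_score(clf, X, y, cv=5).mean())

Label noise of the kind studied can be emulated by randomly flipping 20 percent of the entries of y before training.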

Comparative Study on Recent Integer DCTs

This paper presents a comparative study of recent integer DCTs and a new method to construct a low-sensitivity structure of the integer DCT for colored input signals. The method uses the sensitivity of the multiplier coefficients to finite word length as an indicator of how word-length truncation affects the quality of the output signal. This sensitivity is also evaluated theoretically as a function of the auto-correlation and covariance matrix of the input signal. The structure of the integer DCT algorithm is optimized by combining the lower-sensitivity lifting structure types of IRT, evaluated through the word-length sensitivity of the multiplier coefficients expressed as a function of the input covariance matrix. The effectiveness of the optimum combination of IRT in the integer DCT algorithm is confirmed by the quality improvement over the existing case. As a result, the optimum combination of IRT in each integer DCT algorithm clearly improves output signal quality while remaining compatible with the existing one.
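
A small illustration of the kind of sensitivity being measured, as a Python/NumPy sketch: a plane rotation (the basic building block of lifting-based integer transforms) is run with its multiplier coefficients truncated to a given number of fractional bits, and the output error is evaluated on a colored (correlated) input, in the spirit of the paper's covariance-based analysis. The lifting factorization used is the standard three-step one, not the specific IRT structures compared in the paper:

    import numpy as np

    def lifted_rotation(x, y, theta, frac_bits):
        # Integer-friendly rotation via three lifting steps; multiplier
        # coefficients are quantized to `frac_bits` fractional bits to
        # expose their sensitivity to finite word length.
        q = lambda c: np.round(c * 2**frac_bits) / 2**frac_bits
        p, u = q((np.cos(theta) - 1) / np.sin(theta)), q(np.sin(theta))
        x = x + np.round(p * y)
        y = y + np.round(u * x)
        x = x + np.round(p * y)
        return x, y  # approximates (cos(t)*x - sin(t)*y, sin(t)*x + cos(t)*y)

    rng = np.random.default_rng(0)
    y0 = np.round(64 * rng.normal(size=10000))               # 8-bit-range signal
    x0 = np.round(0.95 * y0 + 10 * rng.normal(size=10000))   # correlated pair
    ref = np.cos(np.pi / 8) * x0 - np.sin(np.pi / 8) * y0    # exact rotation
    for bits in (4, 8, 12):
        xr, _ = lifted_rotation(x0, y0, np.pi / 8, bits)
        print(bits, "fractional bits -> RMS error",
              np.sqrt(np.mean((xr - ref) ** 2)))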

An Efficient Approach to Mining Frequent Itemsets on Data Streams

The increasing importance of data streams arising in a wide range of advanced applications has led to extensive study of mining frequent patterns. Mining data streams poses many new challenges, among which are the one-scan requirement, the unbounded memory demand, and the high arrival rate of data streams. In this paper, we propose a new approach for mining itemsets on data streams. Our approach, SFIDS, was developed from the FIDS algorithm. The main goals were to keep some advantages of the previous approach while resolving some of its drawbacks, and consequently to improve run time and memory consumption. Our approach has the following advantages: it uses a lattice-like data structure for keeping frequent itemsets; it separates regions from each other by deleting common nodes, which decreases search space, memory consumption, and run time; and finally, considering the CPU constraint, when an increasing data arrival rate would overload the system, SFIDS automatically detects this situation and discards some of the unprocessed data. We guarantee, based on a probabilistic technique, that the error of the results is bounded by a user-specified threshold. Final results show that the SFIDS algorithm attains about 50% run-time improvement over the FIDS approach.
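
For flavor, here is a minimal sketch of the classic Lossy Counting scheme (Manku and Motwani), which provides the same style of user-specified error bound over a one-pass stream; this is an assumed illustration of the bounding idea, not the SFIDS algorithm itself:

    from math import ceil

    def lossy_counting(stream, epsilon=0.01):
        # One-pass frequency estimates with undercount at most epsilon * N.
        # Illustrative only; SFIDS mines itemsets with a lattice-like structure.
        width = ceil(1 / epsilon)          # bucket width
        counts, bucket = {}, 1             # item -> (count, max undercount)
        for n, item in enumerate(stream, start=1):
            f, delta = counts.get(item, (0, bucket - 1))
            counts[item] = (f + 1, delta)
            if n % width == 0:             # end of bucket: prune rare entries
                counts = {i: (f, d) for i, (f, d) in counts.items()
                          if f + d > bucket}
                bucket += 1
        return counts  # estimate f undercounts the truth by at most delta

    print(lossy_counting("abracadabra" * 100, epsilon=0.05))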

Comparison of Three Meta Heuristics to Optimize Hybrid Flow Shop Scheduling Problem with Parallel Machines

This study compares three meta-heuristics for minimizing makespan (Cmax) in the Hybrid Flow Shop (HFS) Scheduling Problem with Parallel Machines, a problem known to be NP-hard. Three improvement heuristic searches are proposed: Genetic Algorithm (GA), Simulated Annealing (SA), and Tabu Search (TS). TS is a deterministic improvement heuristic search, whereas GA and SA are stochastic, since they rely on randomized operators and acceptance rules. A comprehensive comparison of these three improvement heuristic searches is presented. The results of the experiments conducted show that TS is effective and efficient at solving HFS scheduling problems.
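
A minimal SA sketch for this problem class, under assumed simplifications (permutation encoding, swap neighborhood, geometric cooling, earliest-available-machine dispatching at each stage); the paper's tuned GA/SA/TS implementations are not described here:

    import math
    import random

    def makespan(perm, jobs, machines_per_stage):
        # Cmax of a permutation schedule in a hybrid flow shop: at every
        # stage each job is dispatched to the earliest-available machine.
        ready = {j: 0.0 for j in perm}
        for stage, m in enumerate(machines_per_stage):
            free = [0.0] * m
            for j in perm:
                k = min(range(m), key=lambda i: free[i])
                start = max(free[k], ready[j])
                free[k] = ready[j] = start + jobs[j][stage]
        return max(ready.values())

    def simulated_annealing(jobs, machines, iters=5000, t0=10.0, alpha=0.999):
        perm = list(jobs)
        random.shuffle(perm)
        best = cur = makespan(perm, jobs, machines)
        t = t0
        for _ in range(iters):
            i, k = random.sample(range(len(perm)), 2)
            perm[i], perm[k] = perm[k], perm[i]          # propose a swap
            cand = makespan(perm, jobs, machines)
            if cand <= cur or random.random() < math.exp(-(cand - cur) / t):
                cur, best = cand, min(best, cand)
            else:
                perm[i], perm[k] = perm[k], perm[i]      # reject: undo swap
            t *= alpha
        return best

    jobs = {j: [random.uniform(1, 10) for _ in range(2)] for j in range(8)}
    print(simulated_annealing(jobs, machines=[2, 3]))    # two-stage HFS

The random acceptance step is what makes SA, like GA, a stochastic search, in contrast to the deterministic neighborhood rules of TS.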

Towards a Measurement-Based E-Government Portals Maturity Model

The emerging concept of e-government transforms the way citizens deal with their governments: citizens can execute the intended services online, anytime and anywhere. This yields great benefits for both governments (a reduced number of officers) and citizens (more flexibility and time savings). Building a maturity model to assess e-government portals is therefore desirable to support the improvement of such portals. This paper proposes an e-government maturity model based on measuring the presence of best practices. The main benefit of such a maturity model is that it provides a way to rank an e-government portal based on the best practices used, and also gives a set of recommendations for reaching the next stage of the maturity model.

Quality-Driven Business Process Refactoring

Appropriate description of business processes through standard notations has become one of the most important assets for organizations. Organizations must therefore deal with quality faults in business process models, such as a lack of understandability and modifiability. These quality faults may be exacerbated when business process models are mined by reverse engineering, e.g., from existing information systems that support those business processes. Hence, business process refactoring is often used, which changes the internal structure of business processes while preserving their external behavior. This paper aims to choose the most appropriate set of refactoring operators through quality assessment concerning understandability and modifiability. These quality characteristics are assessed through well-proven measures proposed in the literature. Additionally, a set of measure thresholds is heuristically established for applying the most promising refactoring operators, i.e., those that achieve the highest quality improvement according to the selected measures in each case.
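
A minimal sketch of threshold-driven operator selection; the measure names and threshold values below are hypothetical illustrations, not the measures or heuristically established thresholds of the paper:

    # Hypothetical measures and thresholds, for illustration only.
    THRESHOLDS = {
        "gateway_count":   (10, "extract_subprocess"),
        "sequence_length": (15, "extract_subprocess"),
        "unused_elements": (0,  "remove_dead_path"),
    }

    def suggest_refactorings(measures):
        # Return the operators whose triggering measure exceeds its threshold.
        return sorted({op for name, (limit, op) in THRESHOLDS.items()
                       if measures.get(name, 0) > limit})

    print(suggest_refactorings({"gateway_count": 14, "sequence_length": 9,
                                "unused_elements": 2}))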

Improvement of MLLR Speaker Adaptation Using a Novel Method

This paper presents a speaker adaptation method called WMLLR, which is based on maximum likelihood linear regression (MLLR). In MLLR, a linear regression-based transform that adapts the HMM mean vectors is calculated so as to maximize the likelihood of the adaptation data. In this paper, the prior knowledge of the initial model is adequately incorporated into the adaptation. A series of speaker adaptation experiments is carried out on a database of 30 famous city names to investigate the efficiency of the proposed method. Experimental results show that the WMLLR method outperforms the conventional MLLR method, especially when only a few utterances from a new speaker are available for adaptation.
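
For reference, the standard MLLR mean transform (the textbook formulation on which WMLLR builds) adapts each Gaussian mean vector \mu through an affine regression matrix W estimated from the adaptation data \mathcal{O}:

    \hat{\mu} = W\xi = A\mu + b, \qquad \xi = [\,1,\ \mu^{\top}]^{\top},
    W^{*} = \arg\max_{W}\; p(\mathcal{O} \mid \lambda, W),

where \lambda denotes the initial HMM parameters; WMLLR additionally incorporates prior knowledge of this initial model into the estimation.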

Impact of Environmental Factors on Profit Efficiency of Rice Production: A Study in Vietnam's Red River Delta

Environmental factors affect agricultural productivity and efficiency, resulting in changes in profit efficiency. This paper attempts to estimate the impact of environmental factors on the profitability of rice farmers in the Red River Delta of Vietnam. The dataset was collected from 349 rice farmers through personal interviews. Both OLS and MLE translog profit functions were used in this study. Five production inputs and four environmental factors were included in these functions. Estimation of a stochastic profit frontier with a two-stage approach was used to measure profitability. The results showed that profit efficiency was about 75% on average and that, besides farm-specific characteristics, environmental factors change profit efficiency significantly. Plant disease, soil fertility, irrigation practice, and water pollution were the four environmental factors causing profit loss in rice production. The results indicate that farmers should reduce household size and the number of farm plots, apply the row-seeding technique, and improve environmental conditions to obtain high profit efficiency, with special consideration given to improving irrigation water quality.
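
For reference, a stochastic profit frontier of the kind estimated here is conventionally specified as

    \ln \pi_i = f(p_i, Z_i; \beta) + v_i - u_i, \qquad u_i \ge 0,

where f is the translog profit function of input prices p_i and fixed factors Z_i, v_i is symmetric random noise, and u_i captures inefficiency; profit efficiency is then PE_i = \exp(-u_i), so the reported average of about 75% corresponds to a mean u_i of roughly 0.29 (a standard formulation; the paper's exact specification may differ).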

The Application of Six Sigma to Integration of Computer Based Systems

This paper introduces a process for the module-level integration of computer-based systems. It is based on the Six Sigma Process Improvement Model, with the goal of improving the overall quality of the system under development. We also present a conceptual framework that shows how this process can be implemented as an integration solution. Finally, we provide a partial implementation of the key components of the conceptual framework.

Are XBRL-based Financial Reports Better than Non-XBRL Reports? A Quality Assessment

Using a scoring system, this paper provides a comparative assessment of data quality between XBRL-formatted financial reports and non-XBRL financial reports. It shows a major improvement in the data quality of XBRL-formatted financial reports. Although XBRL-formatted financial reports did not show much quality advantage at the beginning, they have lately displayed a large improvement in data quality in almost all aspects. With improved XBRL web applications for data management, presentation, and analysis, XBRL-formatted financial reports have much better accessibility, are more accurate, and are better in timeliness.

Automatic Generation Control of an Interconnected Power System with Capacitive Energy Storage

This paper is concerned with the application of small-rating Capacitive Energy Storage (CES) units for the improvement of Automatic Generation Control of a multi-unit, multi-area power system. Generation Rate Constraints are also considered in the investigations. The Integral Squared Error (ISE) technique is used to obtain the optimal integral gain settings by minimizing a quadratic performance index. Simulation studies reveal that with CES units, the deviations in area frequencies and inter-area tie-line power are considerably improved in terms of peak deviations and settling time, compared with those obtained without CES units.
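
For reference, the quadratic performance index minimized by the ISE technique is conventionally written, for a two-area system, as

    J = \int_0^{\infty} \left( \Delta f_1^{2} + \Delta f_2^{2} + \Delta P_{tie}^{2} \right) dt,

where \Delta f_i are the area frequency deviations and \Delta P_{tie} is the tie-line power deviation; the multi-area index used here generalizes this form (a standard AGC formulation, not quoted from the paper).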

Advanced Micromanufacturing for Ultra Precision Part by Soft Lithography and Nano Powder Injection Molding

Recently, advanced technologies that offer high-precision products together with relatively easy, economical, and rapid production have been needed to meet the high demand for ultra-precision micro parts. In our research, micromanufacturing based on soft lithography and nano powder injection molding was investigated. An ultra-thick, high-aspect-ratio metal master pattern was successfully used to fabricate a polydimethylsiloxane (PDMS) micro mold. The process was followed by nano powder injection molding (PIM) using a simple vacuum hot press. The 17-4PH nano powder, with a diameter of 100 nm, was successfully injected, forming a green micro-bearing sample with a thickness of 700 μm, a microchannel of 60 μm, and an aspect ratio of 12. Sintering was performed at 1200 °C for 2 hours with a heating rate of 0.83 °C/min. Since a low powder loading (45% PL) was applied to achieve green sample fabrication, about 15% shrinkage occurred at 86% relative density. Several improvements are still needed to produce a highly accurate and fully dense sintered part.

A Study of Development to Take for the Enterprise of the Critical Success Factors in the Taiwan Szuchung Creek Hot Springs

The purpose of this study was to investigate the critical success factors for the enterprise development of the Taiwan Szuchung Creek Hot Springs. The research used in-depth interviews, document analysis, and a Modified-Delphi technique survey, with nine experts taking part in the in-depth interviews and 14 experts answering the Modified-Delphi questionnaire; the Szuchung Creek Hot Springs formed the scope of the study. The results show that the critical success factors for the enterprise development of the Szuchung Creek Hot Springs are: 1. government; 2. opportunities; 3. factors of production; 4. demand conditions; 5. corporate structure and the degree of competition; and 6. related and supporting industries. Furthermore, the Szuchung Creek Hot Springs already possesses a number of these critical success factors. Any shortcomings or inadequacies relative to these critical success factors can serve as a basis for correction, and improvement strategies can be planned for local use to achieve the objective of sustainable management.

Impact of ISO 9000 on Time-based Performance: An Event Study

ISO 9000 is the most popular and widely adopted meta-standard for quality and operational improvements. However, only limited empirical research has examined the impact of ISO 9000 on operational performance using objective and longitudinal data. To reveal any causal relationship between the adoption of ISO 9000 and operational performance, we examined the timing and magnitude of changes in time-based performance resulting from ISO 9000 adoption. We analyzed changes in operating cycle, inventory days, and accounts receivable days before and after the implementation of ISO 9000 in 695 publicly listed manufacturing firms. We found that ISO 9000 certified firms shortened their operating cycle time by 5.28 days one year after the implementation of ISO 9000. In the long run (three years after certification), certified firms showed continuous improvement in time-based efficiency and achieved operating cycle times 11 days shorter than those of non-certified firms. On average, ISO 9000 certified firms improved operating cycle time by 6.5%. Both inventory days and accounts receivable days showed similar significant improvements after the implementation of ISO 9000.

Metadata Update Mechanism Improvements in Data Grid

Grid environments aggregate geographically distributed resources. Grids are put forward in three types: computational, data, and storage. This paper presents research on the data grid. A data grid is used to cover and secure access to data from many heterogeneous sources; users need not worry about where the data are located, provided they can access them. Metadata are used to access data in a data grid. Presently, application metadata catalogues and the SRB middleware package are used in data grids for metadata management. In this paper, updating, streamlining, and searching are made possible simultaneously and rapidly through a classified table for preserving metadata and the conversion of each table into numerous tables. Moreover, with regard to the specific application, the most appropriate division is determined. Concurrent execution of some requests and pipelined execution become attainable as a result of this technique.