Development of Molecularly Imprinted Polymers (MIPs) for the Selective Removal of Carbamazepine from Aqueous Solution

The occurrence and removal of trace organic contaminants in the aquatic environment has become a focus of environmental concern. For the selective removal of carbamazepine from loaded waters, molecularly imprinted polymers (MIPs) were synthesized with carbamazepine as the template. The parameters varied were the type of monomer, crosslinker, and porogen, the ratio of starting materials, and the synthesis temperature. The best results were obtained with a template-to-crosslinker ratio of 1:20, toluene as the porogen, and methacrylic acid (MAA) as the monomer. These MIPs recovered 93% of the carbamazepine from a 10⁻⁵ M landfill leachate solution that also contained caffeine and salicylic acid. By comparison, a nonimprinted polymer (NIP) synthesized under the same conditions but without the template achieved carbamazepine recoveries of 75%. In solutions containing landfill leachate, 93-96% of the carbamazepine was adsorbed, compared with an uptake of 73% by activated carbon. Acetonitrile proved to be the best desorption solvent, and the required solvent volume and the effect of dilution with water were tested. Selected MIPs were tested for reusability and performed well for at least five cycles. Adsorption isotherms were recorded for carbamazepine solutions in the concentration range of 5×10⁻⁶ M to 0.01 M. The heterogeneity index indicated a comparatively homogeneous binding-site distribution.
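
As a hedged illustration only (not the authors' procedure), the sketch below shows how such isotherm data are often analyzed for MIPs: a Freundlich isotherm B = a·F^m is fitted, and the exponent m is reported as the heterogeneity index, with values approaching 1 indicating a more homogeneous binding-site distribution. All numerical values are hypothetical.

    # Hypothetical sketch: fit a Freundlich isotherm B = a * F**m; the
    # exponent m is the heterogeneity index (m -> 1 means more homogeneous
    # binding sites). Data below are invented, not the paper's measurements.
    import numpy as np
    from scipy.optimize import curve_fit

    def freundlich(F, a, m):
        return a * F**m

    F = np.array([5e-6, 1e-5, 5e-5, 1e-4, 1e-3, 1e-2])   # free conc., mol/L
    B = np.array([2.1e-6, 3.9e-6, 1.6e-5, 2.9e-5, 2.2e-4, 1.5e-3])  # bound

    (a, m), _ = curve_fit(freundlich, F, B, p0=(1.0, 1.0))
    print(f"Freundlich a = {a:.3g}, heterogeneity index m = {m:.2f}")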

Comparison of Treatment Methods for Industrial Tempeh Waste: Constructed Wetland versus Activated Sludge

Ever since the industrial revolution began, our ecosystem has changed, and the negatives arguably outweigh the positives. Industrial waste is usually released into bodies of water such as rivers or the sea. Tempeh waste is one example of a waste stream that carries many hazardous and unwanted substances that affect the surrounding environment. Tempeh is a popular fermented food in Asia, rich in nutrients and active substances. Tempeh liquid waste in particular can cause air pollution, and if it penetrates the soil it will contaminate the groundwater, making the water unfit for consumption. Moreover, bacteria thrive in polluted water and are often responsible for many kinds of diseases. The treatments used for this waste are biological treatments such as constructed wetlands and activated sludge. These treatments can reduce physical and chemical parameters such as temperature, TSS, pH, BOD, COD, NH3-N, NO3-N, and PO4-P, and they are applied before the waste is released into receiving waters. The result is a comparison between constructed wetland and activated sludge treatment, determining which method is better suited to reducing the physical and chemical loads of the waste.
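
As a hedged illustration of how the two methods might be compared on the listed parameters, the sketch below computes percent removal for each; every influent and effluent value is invented for the example.

    # Hypothetical sketch: comparing removal efficiencies of the two
    # treatments for some of the measured parameters; all values invented.
    influent = {"TSS": 850, "BOD": 1200, "COD": 2400, "NH3-N": 45}   # mg/L
    wetland  = {"TSS": 120, "BOD": 180,  "COD": 520,  "NH3-N": 12}   # mg/L
    sludge   = {"TSS": 90,  "BOD": 150,  "COD": 600,  "NH3-N": 18}   # mg/L

    def removal(c_in, c_out):
        return 100.0 * (c_in - c_out) / c_in  # percent removed

    for p in influent:
        print(f"{p}: wetland {removal(influent[p], wetland[p]):.0f}% "
              f"vs activated sludge {removal(influent[p], sludge[p]):.0f}%")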

Assembly and Alignment of Ship Power Plants in Modern Shipbuilding

Fine alignment of the main mechanisms of a ship power plant and of its shaft lines ensures long-term, failure-free performance of the propulsion system, while fast, high-quality installation of the mechanisms and shaft lines reduces overall labor intensity. Checking the allowable shaft line stresses and setting the alignment requires calculations that consider the various stages of the ship's life cycle. In 2012, JSC SSTC developed the special software complex "Shaftline" for shaft line alignment calculations, with its own I/O interface and a 3D display of the shaft line model. Aligning a shaft line by bearing loads is a rather labor-intensive procedure. To shorten it, JSC SSTC developed an automated alignment system for ship power plant mechanisms. The system's operating principle is based on automatic simulation of the design loads on the bearings. Initial data for shaft line alignment can be exported to the automated alignment system from the "Shaftline" software complex.
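
A common formulation of load-based alignment (assumed here for illustration; not necessarily the one implemented in "Shaftline") linearizes the bearing reactions in the vertical bearing offsets through an influence-coefficient matrix and solves for the offsets that reproduce the design loads. All numbers are hypothetical.

    # Hedged sketch of influence-coefficient shaft alignment (an assumed,
    # generic formulation): bearing loads depend ~linearly on vertical
    # bearing offsets, R = R0 + C @ y; solve for offsets y that bring the
    # loads to their design values.
    import numpy as np

    R0 = np.array([52.0, 41.0, 63.0])        # loads at zero offsets, kN
    C = np.array([[ 8.0, -5.0,  1.0],        # influence coefficients,
                  [-5.0, 11.0, -6.0],        # kN per mm of offset
                  [ 1.0, -6.0,  9.0]])
    R_design = np.array([50.0, 45.0, 60.0])  # target bearing loads, kN

    y, *_ = np.linalg.lstsq(C, R_design - R0, rcond=None)
    print("bearing offsets (mm):", np.round(y, 3))
    print("resulting loads (kN):", np.round(R0 + C @ y, 2))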

Pharmaceutical Microencapsulation Technology for Development of Controlled Release Drug Delivery Systems

This article describes the development of a controlled release system for the NSAID diclofenac sodium using different ratios of ethyl cellulose. Diclofenac sodium and ethyl cellulose in different proportions were processed into microcapsules by a phase-separation microencapsulation technique. The prepared microcapsules were then compressed into tablets to obtain controlled release oral formulations. For in-vitro evaluation, a dissolution test of each preparation was conducted in 900 ml of phosphate buffer solution at pH 7.2, maintained at 37 ± 0.5 °C and stirred at 50 rpm, with samples collected at predetermined time intervals (0, 0.5, 1.0, 1.5, 2, 3, 4, 6, 8, 10, 12, 16, 20, and 24 h). The drug concentration in the collected samples was determined by UV spectrophotometry at 276 nm. The physical characteristics of the diclofenac sodium microcapsules were within the accepted range: they were off-white, free flowing, and spherical in shape. The degree of release retardation of diclofenac sodium from the microcapsules was found to be directly proportional to the proportion of ethyl cellulose and the coat thickness. The in-vitro release pattern showed that at drug-to-polymer ratios of 1:1 and 1:2, the percentage of drug released in the first hour was 16.91% and 11.52%, respectively, compared with only 6.87% at 1:3. The release pattern followed the Higuchi model. Tablet formulation F2 of the present study was found comparable in release profile to the marketed brand Phlogin-SR, and the microcapsules showed extended release beyond 24 h. Further, a good correlation was found between drug release and the proportion of ethyl cellulose in the microcapsules. Microencapsulation based on coacervation was found to be a good technique for controlling the release of diclofenac sodium in controlled release formulations.
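
Since the abstract reports Higuchi-model kinetics (cumulative release Q proportional to the square root of time), a minimal sketch of fitting the Higuchi constant is shown below; the data points are invented, loosely consistent with the reported 6.87% release at 1 h for the 1:3 ratio.

    # Minimal sketch: fitting the Higuchi model Q(t) = kH * sqrt(t) to
    # cumulative-release data. The data points are hypothetical.
    import numpy as np

    t = np.array([0.5, 1, 2, 4, 8, 12, 24])                # hours
    Q = np.array([4.8, 6.9, 9.7, 13.8, 19.6, 24.0, 33.9])  # % released

    kH = np.sum(np.sqrt(t) * Q) / np.sum(t)  # LS slope through the origin
    pred = kH * np.sqrt(t)
    r2 = 1 - np.sum((Q - pred) ** 2) / np.sum((Q - Q.mean()) ** 2)
    print(f"kH = {kH:.2f} %/sqrt(h), R^2 = {r2:.3f}")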

An Efficient Algorithm for Computing All Program Forward Static Slices

Program slicing is the task of finding all statements in a program that directly or indirectly influence the value of a variable occurrence. The set of statements that can affect the value of a variable at some point in a program is called a backward program slice. In several software engineering applications, such as program debugging and measuring program cohesion and parallelism, slices are computed at many different program points. Existing algorithms for computing program slices are designed to compute a slice at a single program point; the program, or the model that represents it, is traversed completely or partially once per slice. To compute more than one slice, the same algorithm is applied at every point of interest, so the same program, or program representation, is traversed several times. In this paper, an algorithm is introduced that computes all forward static slices of a computer program by traversing the program representation graph once. The introduced algorithm is therefore useful for software engineering applications that require program slices at many different points of a program. The program representation graph used in this paper is the Program Dependence Graph (PDG).
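
The single-traversal idea can be sketched as follows (a hedged illustration, not necessarily the paper's exact algorithm): on a PDG, the forward slice of a node is the set of nodes reachable from it along dependence edges, so visiting nodes in reverse topological order lets every slice be assembled from the already-final slices of its successors in one pass. The sketch assumes an acyclic PDG; dependence cycles would first be collapsed into strongly connected components.

    # Hedged sketch: compute ALL forward static slices in one traversal of
    # a Program Dependence Graph, via
    #   forward_slice(n) = {n} | union of forward_slice(s) over successors s.
    # Nodes are processed in reverse topological order, so each successor's
    # slice is final when used. Assumes an acyclic PDG.
    from graphlib import TopologicalSorter

    pdg = {            # node -> dependence successors (hypothetical PDG)
        1: {2, 3},
        2: {4},
        3: {4, 5},
        4: set(),
        5: set(),
    }

    # TopologicalSorter treats values as predecessors, so feeding it the
    # successor map directly yields the reverse topological order we need.
    order = list(TopologicalSorter(pdg).static_order())

    slices = {}
    for n in order:
        s = {n}
        for succ in pdg[n]:
            s |= slices[succ]      # successors already finalized
        slices[n] = s

    for n in sorted(slices):
        print(f"forward slice of {n}: {sorted(slices[n])}")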

Stabilizing Busbar Voltage under Motor Loading during Induction Motor Starting by Using STATCOM

In this paper we study the capability of a static synchronous compensator (STATCOM) to stabilize busbar voltage under motor loading in a power network. After discussing the need for FACTS devices for compensation in power networks, we describe the structure and main function of the STATCOM and a control method that uses the STATCOM transformer current for simultaneous prediction. We then study the topology and the control system used to stabilize the voltage during induction motor starting. Simulation results obtained with MATLAB support the control concept and system presented in the paper.
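
For orientation only (hypothetical numbers, not the paper's MATLAB model), the sketch below uses the common approximation ΔV ≈ (R·P + X·Q)/V for the voltage sag at a bus feeding a starting induction motor, and computes the STATCOM reactive injection that would cancel the sag.

    # Hedged sketch: approximate per-unit voltage sag at a motor bus during
    # starting, dV ~ (R*P + X*Q)/V, and the STATCOM reactive injection Qc
    # that cancels it. All network and motor data are hypothetical.
    R, X = 0.02, 0.10          # source impedance, per unit
    P, Q = 0.8, 2.4            # motor starting demand, per unit (low PF)
    V = 1.0                    # pre-start bus voltage, per unit

    sag = (R * P + X * Q) / V
    print(f"voltage sag without compensation: {sag:.3f} pu")

    # STATCOM injects Qc so the net reactive flow is Q - Qc; solve sag = 0:
    Qc = Q + R * P / X
    print(f"STATCOM injection for full restoration: {Qc:.2f} pu")
    print(f"sag with STATCOM: {(R * P + X * (Q - Qc)) / V:.3f} pu")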

Delay-dependent Stability Analysis for Uncertain Switched Neutral System

This paper considers robust exponential stability for a class of uncertain switched neutral systems whose delays switch according to the switching rule. The system under consideration includes both stable and unstable subsystems. The uncertainties considered are norm bounded and possibly time varying. Based on a multiple Lyapunov functional approach and the dwell-time technique, a time-dependent switching rule is designed that depends on the so-called average dwell time of the stable subsystems as well as on the ratio of the total activation times of the stable and unstable subsystems. It is shown that by suitably controlling the switching between the stable and unstable modes, robust stabilization of the switched uncertain neutral system can be achieved. Two simulation examples are given to demonstrate the effectiveness of the proposed method.
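
For readers unfamiliar with this machinery, a standard form of such conditions is sketched below in LaTeX; it is assumed for illustration and is not necessarily the authors' exact set of inequalities.

    % Assumed standard form, for illustration only. Let stable modes decay
    % at rate \lambda^-, unstable modes grow at rate \lambda^+, and pick a
    % margin \lambda^* \in (0, \lambda^-). Writing T^-(t), T^+(t) for the
    % total activation times of stable and unstable subsystems on [0, t]:
    \[
      \frac{T^+(t)}{T^-(t)} \le \frac{\lambda^- - \lambda^*}{\lambda^+ + \lambda^*},
      \qquad
      \tau_a > \tau_a^* = \frac{\ln \mu}{\lambda^*},
    \]
    % where \tau_a is the average dwell time and \mu \ge 1 bounds the jumps
    % between the multiple Lyapunov functionals at switching instants;
    % together these yield robust exponential stability.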

Modeling Directional Thermal Radiance Anisotropy for Urban Canopy

One of the significant factors for improving the accuracy of Land Surface Temperature (LST) retrieval is a correct understanding of the directional anisotropy of thermal radiance. In this paper, the multiple scattering between heterogeneous non-isothermal surfaces is described rigorously using the concept of the configuration factor; a directional thermal radiance model is built on this basis, and the directional radiance characteristics of an urban canopy are analyzed. The model is applied to a simple urban canopy with a row structure to simulate the variation of the Directional Brightness Temperature (DBT). The results show that the DBT is increased by the multiple scattering effects, whereas its range of variation is smoothed. The temperature differences, spatial distribution, and emissivities of the components can all change the DBT. A "hot spot" phenomenon occurs when the proportion of high-temperature components in the field of view reaches its maximum, and a "cool spot" phenomenon occurs when the proportion of low-temperature components does. The "spot" effects disappear only when the proportion of every component remains constant with view angle. The model built in this paper can be used to study the directional effects on emissivity, LST retrieval over urban areas, and the adjacency effect in thermal remote sensing pixels.
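
The basic aggregation behind DBT can be illustrated as follows (a single-scattering sketch with invented component data; the paper's configuration-factor multiple-scattering terms are not reproduced): the canopy radiance in a view direction is an emissivity-weighted sum over the components visible in that direction, inverted to a brightness temperature.

    # Hedged sketch: broadband directional brightness temperature (DBT) of
    # a canopy of components, keeping only direct emission. Component
    # fractions, emissivities, and temperatures are hypothetical.
    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    def dbt(fractions, emissivities, temperatures):
        # view-direction exitance as an emissivity-weighted component sum
        M = sum(f * e * SIGMA * T**4
                for f, e, T in zip(fractions, emissivities, temperatures))
        return (M / SIGMA) ** 0.25  # invert Stefan-Boltzmann

    # sunlit roof / sunlit wall / shaded wall / road, from one direction
    f = [0.35, 0.25, 0.15, 0.25]      # fractions in the field of view
    e = [0.92, 0.90, 0.90, 0.95]
    T = [318.0, 312.0, 301.0, 308.0]  # kelvin
    print(f"DBT = {dbt(f, e, T):.1f} K")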

Implementation of a Geo-knowledge Based Geographic Information System for Estimating Earthquake Hazard Potential in a Metropolitan Area, Gwangju, Korea

In this study, an inland metropolitan area, Gwangju, Korea, was selected to assess the amplification potential of earthquake motion and to provide information for regional seismic countermeasures. A geographic information system-based expert system was implemented for reliably predicting the spatial geotechnical layers over the entire region of interest by building a geo-knowledge database. The database consists of existing boring data gathered from prior geotechnical projects and surface geo-knowledge data acquired from site visits. For practical application of the geo-knowledge database to estimating the earthquake hazard potential related to site amplification effects in the study area, seismic zoning maps of geotechnical parameters, such as bedrock depth and site period, were created within the GIS framework. In addition, seismic zonation of site classification was performed to determine the site amplification coefficients for seismic design at any site in the study area. Keywords: earthquake hazard, geo-knowledge, geographic information system, seismic zonation, site period.
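
The site period mapped in such zonations is commonly computed from a boring-derived soil column as T = 4·Σ(h_i/Vs_i) over the layers above bedrock; that relation is assumed here for illustration (the study's exact formula is not stated), and the layer data are hypothetical.

    # Hedged sketch: characteristic site period from a boring-derived soil
    # column, T = 4 * sum(h_i / Vs_i), a common relation assumed here.
    layers = [  # (thickness m, shear-wave velocity m/s) - hypothetical
        (3.0, 180.0),   # fill
        (7.0, 260.0),   # alluvial sand
        (5.0, 420.0),   # weathered residual soil
    ]

    site_period = 4.0 * sum(h / vs for h, vs in layers)
    bedrock_depth = sum(h for h, _ in layers)
    print(f"bedrock depth = {bedrock_depth:.0f} m, "
          f"site period = {site_period:.2f} s")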

Data Structures and Algorithms of Intelligent Web-Based System for Modular Design

In recent years, new product development has become increasingly competitive and globalized, and the design phase is critical to product success. The concept of modularity can provide the necessary foundation for organizations to design products that respond rapidly to market needs. The paper describes the data structures and algorithms of an intelligent Web-based system for modular design that takes into account module compatibility relationships and given design requirements. The system's intelligence is realized by the developed algorithms for the choice of modules, which reflect all system restrictions and requirements. The proposed data structures and algorithms are illustrated by a case study of personal computer configuration. The applicability of the proposed approach is tested through a prototype Web-based system.
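
The kind of compatibility-constrained choice such a system performs can be sketched as follows (an assumed illustration reusing the paper's PC-configuration example; the actual data structures and algorithms are not reproduced): backtracking over the module alternatives of each slot, pruning any partial configuration that violates a pairwise incompatibility.

    # Hedged sketch (not the paper's algorithm): choose one module per slot
    # subject to pairwise compatibility, by backtracking with pruning.
    # The PC-configuration data are hypothetical.
    slots = ["cpu", "board", "ram"]
    options = {
        "cpu":   ["cpu_a", "cpu_b"],
        "board": ["board_x", "board_y"],
        "ram":   ["ram_ddr4", "ram_ddr5"],
    }
    incompatible = {("cpu_a", "board_y"), ("board_x", "ram_ddr5")}

    def compatible(chosen, candidate):
        return all((m, candidate) not in incompatible and
                   (candidate, m) not in incompatible for m in chosen)

    def configure(i=0, chosen=()):
        if i == len(slots):                  # all slots filled
            yield chosen
            return
        for mod in options[slots[i]]:
            if compatible(chosen, mod):
                yield from configure(i + 1, chosen + (mod,))

    for config in configure():
        print(config)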

Modeling and Identification of a Hammerstein System by Using Triangular Basis Functions

This paper deals with the modeling and parameter identification of nonlinear systems described by a Hammerstein model with piecewise nonlinear characteristics, such as a dead-zone characteristic. The combined use of an easy decomposition technique and triangular basis functions leads to a particular form of the Hammerstein model. Approximating the static nonlinear block with triangular basis functions leads to a linear regression model, so that least squares techniques can be used for parameter estimation. The Singular Value Decomposition (SVD) technique is applied to separate the coupled parameters. The proposed approach has been tested efficiently on simulated academic examples.
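
The overall scheme can be sketched end to end as below (a hedged illustration with assumed details and invented data, not the paper's exact formulation): the static nonlinearity is expanded on triangular (hat) basis functions, the coupled products b_i·c_j enter a linear regression solved by least squares, and an SVD rank-1 factorization separates the linear-block and nonlinearity parameters.

    # Hedged sketch of Hammerstein identification with triangular bases.
    import numpy as np

    def hat_basis(u, knots):
        """Triangular basis: phi_j(u) is 1 at knots[j], 0 at neighbors."""
        phi = np.zeros((len(u), len(knots)))
        for j, kj in enumerate(knots):
            left = knots[j - 1] if j > 0 else kj - 1.0
            right = knots[j + 1] if j < len(knots) - 1 else kj + 1.0
            phi[:, j] = np.clip(np.minimum((u - left) / (kj - left),
                                           (right - u) / (right - kj)),
                                0.0, None)
        return phi

    rng = np.random.default_rng(0)
    u = rng.uniform(-2, 2, 600)
    # dead-zone nonlinearity (threshold 0.5) feeding a 2-tap FIR block
    v = np.where(np.abs(u) < 0.5, 0.0, u - 0.5 * np.sign(u))
    b_true = np.array([0.8, -0.3])
    y = b_true[0] * np.roll(v, 1) + b_true[1] * np.roll(v, 2)
    y += 0.01 * rng.standard_normal(y.size)

    knots = np.linspace(-2, 2, 9)
    phi = hat_basis(u, knots)
    # regressor for y(k) = sum_i sum_j b_i c_j phi_j(u(k-i));
    # first two rows dropped because np.roll wraps around
    X = np.hstack([np.roll(phi, 1, axis=0), np.roll(phi, 2, axis=0)])[2:]
    theta, *_ = np.linalg.lstsq(X, y[2:], rcond=None)

    Theta = theta.reshape(2, len(knots))     # ~ rank-1 matrix b c^T
    U, s, Vt = np.linalg.svd(Theta)
    b_hat, c_hat = U[:, 0] * s[0], Vt[0]     # separated factors (up to scale)
    print("b estimate:", np.round(b_hat / b_hat[0] * b_true[0], 3))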

E-government Adoption in Romania

The Romanian government has been making significant attempts to make its services and information available on the Internet. According to the UN e-government survey conducted in 2008, Romania ranks among the mid-range countries in e-government utilization (41%). Romania's national portal www.e-guvernare.ro aims at progressively making all services and information accessible through the portal. However, the success of these efforts depends, to a great extent, on how well the targeted users of such services, citizens in general, make use of them. For this reason, the purpose of the present study was to identify which factors affect citizens' adoption of e-government services. The study is an extension of the Technology Acceptance Model. The proposed model was validated using data collected from 481 citizens. The results provided substantial support for all proposed hypotheses and showed the significance of the extended constructs.

Understanding and Measuring Trust Evolution Effectiveness in Peer-to-Peer Computing Systems

In any trust model, the two information sources that a peer relies on to predict the trustworthiness of another peer are direct experience and reputation. These two vital components evolve over time. Trust evolution is an important issue, where the objective is to observe a sequence of past values of a trust parameter and determine the future estimates. Unfortunately, trust evolution algorithms have received little attention, and the algorithms proposed in the literature do not comply with the conditions and the nature of trust. This paper contributes to this important problem in the following ways: (a) it presents an algorithm that manages and models trust evolution in a P2P environment, (b) it devises new mechanisms for effectively maintaining trust values based on the conditions that influence trust evolution, and (c) it introduces a new methodology for incorporating trust-nurture incentives into the trust evolution algorithm. Simulation experiments are carried out to evaluate the trust evolution algorithm.
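
One plausible shape for such an update rule is sketched below (hypothetical weights and logic, not the paper's algorithm): direct experience and reputation are blended, trust builds slowly but collapses quickly on bad evidence, and a small nurture incentive rewards sustained good behavior.

    # Hedged sketch (not the paper's algorithm): a trust-evolution update
    # combining direct experience and reputation with slow build-up, fast
    # erosion, and a small nurture incentive. Weights are hypothetical.
    def evolve_trust(trust, experience, reputation,
                     alpha=0.7, decay=0.9, incentive=0.02):
        """All values in [0, 1]; alpha weights experience over reputation."""
        evidence = alpha * experience + (1 - alpha) * reputation
        updated = decay * trust + (1 - decay) * evidence  # slow build-up
        if evidence < trust:          # bad behavior erodes trust quickly
            updated = min(updated, evidence)
        elif evidence > 0.8:          # nurture incentive for good peers
            updated += incentive
        return max(0.0, min(1.0, updated))

    t = 0.5
    for exp, rep in [(0.9, 0.8), (0.9, 0.9), (0.2, 0.3), (0.9, 0.8)]:
        t = evolve_trust(t, exp, rep)
        print(f"trust -> {t:.3f}")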

Management and Control of Industrial Effluents Discharged to Public Sewers: A Case Study

An overview of the important aspects of managing and controlling industrial effluent discharges to public sewers, namely sampling, characterization, quantification, and legislative controls, is presented. The findings have been validated by means of a case study covering three industrial sectors: tanning, textile finishing, and food processing. Industrial effluent discharges were found to be best monitored by systematic and automatic sampling and quantified using water meter readings corrected for evaporative and consumptive losses. Based on the treatment processes employed in the publicly owned treatment works and the chemical oxygen demand and biochemical oxygen demand levels obtained, the effluents from all three industrial sectors studied were found to lie in the toxic zone. Thus, physico-chemical treatment of these effluents is required to bring them into the biodegradable zone. KL values (to base e) were greater than 0.50 day⁻¹, compared with 0.39 day⁻¹ for typical municipal wastewater.
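
The KL values quoted are first-order BOD rate constants; as a minimal worked illustration (with a hypothetical ultimate BOD), the sketch below compares the BOD exertion implied by the two rates via BODt = L0·(1 − e^(−KL·t)).

    # Minimal sketch: first-order BOD exertion, BOD_t = L0*(1 - exp(-kL*t)),
    # comparing the kL values quoted above. The ultimate BOD L0 is
    # hypothetical.
    import math

    L0 = 300.0  # ultimate BOD, mg/L (hypothetical)
    for kL, label in [(0.50, "industrial effluent (>0.50/day)"),
                      (0.39, "typical municipal wastewater")]:
        bod5 = L0 * (1 - math.exp(-kL * 5))
        print(f"{label}: 5-day BOD = {bod5:.0f} mg/L "
              f"({100 * bod5 / L0:.0f}% of ultimate)")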

Ontology of Collaborative Supply Chain for Quality Management

In the highly competitive and rapidly changing global marketplace, independent organizations and enterprises often come together and form a temporary alignment, a virtual enterprise in a supply chain, to better provide products or services. As firms adopt the systems approach implicit in supply chain management, they must manage quality through both internal process control and external control of supplier quality and customer requirements. How to incorporate the quality management of upstream and downstream supply chain partners into one's own quality management system has recently received a great deal of attention from both academia and practice. This paper investigates the collaborative features and the entities' relationships in a supply chain, and presents an ontology of the collaborative supply chain based on an approach that aligns a service-oriented framework with service-dominant logic. This perspective facilitates the segregation of material flow management from manufacturing capability management, which provides a foundation for coordinating and integrating the business processes to measure, analyze, and continually improve the quality of products, services, and processes. Further, this approach characterizes the different interests of supply chain partners, providing an innovative way to analyze the collaborative features of a supply chain. Furthermore, this ontology is the foundation for developing a quality management system that internalizes the quality management of upstream and downstream supply chain partners and manages quality in the supply chain systematically.

Towards an Enhanced Stochastic Simulation Model for Risk Analysis in Highway Construction

Over the years, there has been a growing trend towards quality-based specifications in highway construction. In many Quality Control/Quality Assurance (QC/QA) specifications, the contractor is primarily responsible for quality control of the process, whereas the highway agency is responsible for acceptance testing of the product. A cooperative investigation was conducted in Illinois over several years to develop a prototype End-Result Specification (ERS) for asphalt pavement construction. The final characteristics of the product are stipulated in the ERS, and the contractor is given considerable freedom in achieving them. The risk to the contractor or the agency depends on how the acceptance limits and processes are specified. Stochastic simulation models are very useful in estimating and analyzing payment risk in ERS systems, and they form an integral part of Illinois' prototype ERS system. This paper describes the development of an innovative methodology to estimate the variability components in in-situ density, air voids, and asphalt content data from ERS projects. The information gained would be crucial in simulating these ERS projects for the estimation and analysis of the payment risks associated with asphalt pavement construction. However, these methods require at least two parties to conduct tests on all the split samples obtained according to the sampling scheme prescribed in the ERS currently implemented in Illinois.
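
To make the payment-risk idea concrete, the sketch below runs a toy Monte Carlo (all parameters hypothetical, not the Illinois ERS): lot test values are sampled with separate process and measurement variance components, percent within limits (PWL) is estimated per lot, and a linear pay schedule converts PWL into a pay factor.

    # Hedged sketch (hypothetical parameters and pay schedule): Monte Carlo
    # estimate of payment risk under an ERS-style acceptance scheme.
    import numpy as np

    rng = np.random.default_rng(1)
    mu, sd_process, sd_test = 92.5, 0.8, 0.4   # in-situ density, % of Gmm
    lsl, n_tests, n_lots = 91.0, 5, 20000
    sd_total = np.hypot(sd_process, sd_test)   # combined variance components

    tests = rng.normal(mu, sd_total, size=(n_lots, n_tests))
    pwl = 100 * (tests > lsl).mean(axis=1)     # crude per-lot PWL estimate
    pay = np.clip(55 + 0.5 * pwl, 0, 102)      # hypothetical pay schedule, %
    print(f"mean pay factor: {pay.mean():.1f}%")
    print(f"P(pay < 100%): {(pay < 100).mean():.2%}")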

Optimization Using Simulation of the Vehicle Routing Problem

A key element of many distribution systems is the routing and scheduling of vehicles servicing a set of customers. A wide variety of exact and approximate algorithms have been proposed for solving the vehicle routing problem (VRP). Exact algorithms can only solve relatively small instances of the VRP, which is classified as NP-hard. Several approximate algorithms have proven successful in finding feasible, though not necessarily optimal, solutions. Although different parts of the problem are stochastic in nature, limited work on the application of discrete-event system simulation has addressed the problem. Presented here is optimization using simulation of the VRP: a simplified problem has been developed in the ExtendSim™ simulation environment, and the ExtendSim™ evolutionary optimizer is used to minimize the total transportation cost of the problem. Results obtained from the model are very satisfactory. Further complexities of the problem are proposed for consideration in the future.
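
As a hedged illustration of the kind of candidate solution a simulation-based optimizer evaluates (invented coordinates and demands; not the ExtendSim™ model), the sketch below builds routes for a small capacitated VRP with a nearest-neighbour heuristic and prices them by Euclidean travel cost.

    # Hedged sketch: nearest-neighbour construction for a small capacitated
    # VRP. Assumes every demand <= capacity, so each route is non-empty and
    # the loop terminates. All data are hypothetical.
    import math

    depot = (0, 0)
    customers = {1: (2, 4), 2: (5, 1), 3: (6, 5), 4: (1, 7), 5: (8, 3)}
    demand = {1: 4, 2: 3, 3: 5, 4: 4, 5: 2}
    capacity = 8

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    unserved, routes = set(customers), []
    while unserved:
        load, pos, route = 0, depot, []
        while True:
            feasible = [c for c in unserved if load + demand[c] <= capacity]
            if not feasible:
                break
            nxt = min(feasible, key=lambda c: dist(pos, customers[c]))
            route.append(nxt)
            load += demand[nxt]
            pos = customers[nxt]
            unserved.discard(nxt)
        routes.append(route)

    cost = sum(dist(depot, customers[r[0]]) + dist(customers[r[-1]], depot) +
               sum(dist(customers[a], customers[b]) for a, b in zip(r, r[1:]))
               for r in routes)
    print("routes:", routes, f"total cost: {cost:.2f}")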

DNA Computing for an Absolute 1-Center Problem: An Evolutionary Approach

Deoxyribonucleic acid (DNA) computing has emerged as an interdisciplinary field that draws together chemistry, molecular biology, computer science, and mathematics. In this paper, the possibility of DNA-based computing to solve an absolute 1-center problem by molecular manipulations is presented. This is the first attempt to solve such a problem by a DNA-based computing approach. Since part of the procedure involves shortest path computation, research on DNA computing for the shortest-path and Traveling Salesman Problems (TSP) is reviewed. These approaches are studied, and the appropriate one is adapted in designing the computation procedures. The DNA-based computation is designed in such a way that every path is encoded by oligonucleotides and the path's length is directly proportional to the length of the oligonucleotides. Using these properties, gel electrophoresis is performed to separate the respective DNA molecules according to their length. One expectation arising from this paper is that it is possible to verify an instance of the absolute 1-center problem using DNA computing in laboratory experiments.
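
For comparison, such an instance can also be verified in silico with a classical computation (a sketch on an invented graph): Floyd-Warshall all-pairs shortest paths followed by choosing the vertex of minimum eccentricity, i.e., the vertex 1-center. The absolute 1-center may additionally lie in the interior of an edge, which this sketch omits.

    # Hedged sketch: classical verification of a 1-center instance.
    # Floyd-Warshall all-pairs shortest paths, then the vertex minimizing
    # the maximum distance (vertex 1-center). Graph weights are invented.
    INF = float("inf")
    n = 5
    w = [[0, 4, INF, 3, INF],
         [4, 0, 2, INF, INF],
         [INF, 2, 0, 1, 6],
         [3, INF, 1, 0, 5],
         [INF, INF, 6, 5, 0]]

    d = [row[:] for row in w]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]

    ecc = [max(row) for row in d]            # eccentricity of each vertex
    center = min(range(n), key=lambda v: ecc[v])
    print(f"vertex 1-center: {center}, radius: {ecc[center]}")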

Analysis of Sonogram Images of Thyroid Gland Based on Wavelet Transform

Sonogram images of normal and lymphocytic thyroid tissues overlap considerably, which makes them difficult to interpret and distinguish. Classification from sonogram images of the thyroid gland is tackled in a semiautomatic way. When making a manual diagnosis from images, some relevant information may not be recognized by the human visual system. Quantitative image analysis can therefore assist the diagnostic process performed by the physician. Two classes are considered: normal tissue and chronic lymphocytic thyroiditis (Hashimoto's thyroiditis). The data structure is analyzed using K-nearest-neighbor classification. This paper shows that, unlike the wavelet subbands' energy, histograms and Haralick features are not appropriate for distinguishing between normal tissue and Hashimoto's thyroiditis.
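
The feature pipeline described, wavelet subband energies fed to a K-nearest-neighbor classifier, can be sketched as below (assuming the PyWavelets and scikit-learn packages; random arrays stand in for real sonogram ROI patches).

    # Hedged sketch of the described pipeline: wavelet subband energies as
    # features, KNN classification. Random textures stand in for sonogram
    # ROI patches; wavelet and level choices are assumptions.
    import numpy as np
    import pywt
    from sklearn.neighbors import KNeighborsClassifier

    def subband_energies(patch, wavelet="db4", level=2):
        coeffs = pywt.wavedec2(patch, wavelet, level=level)
        feats = [np.sum(coeffs[0] ** 2)]              # approximation energy
        for cH, cV, cD in coeffs[1:]:                 # detail energies
            feats += [np.sum(cH**2), np.sum(cV**2), np.sum(cD**2)]
        return np.log1p(feats)                        # compress dynamic range

    rng = np.random.default_rng(0)
    # stand-in "normal" vs "Hashimoto" patches with different textures
    normal = [rng.normal(0.5, 0.10, (64, 64)) for _ in range(20)]
    lympho = [rng.normal(0.4, 0.25, (64, 64)) for _ in range(20)]
    X = np.array([subband_energies(p) for p in normal + lympho])
    y = np.array([0] * 20 + [1] * 20)

    # train on even-indexed patches, test on odd-indexed ones
    knn = KNeighborsClassifier(n_neighbors=3).fit(X[::2], y[::2])
    print("held-out accuracy:", knn.score(X[1::2], y[1::2]))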