On Generalizing Rough Set Theory via a Filter

The theory of rough sets is generalized by means of a filter. The filter is induced by binary relations and is used to generalize the basic rough set concepts. Knowledge representation and processing of binary relations in the style of rough set theory are investigated.
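
For context, a minimal sketch of the classical Pawlak approximations that such a construction generalizes; these are the standard definitions, not the paper's filter-based formulation, which is only assumed to refine the membership test via a filter F:

\[
\underline{R}(X)=\{x\in U : R(x)\subseteq X\},\qquad
\overline{R}(X)=\{x\in U : R(x)\cap X\neq\emptyset\},
\]
where \(R(x)=\{y\in U:(x,y)\in R\}\) is the neighborhood of \(x\) induced by the binary relation \(R\) on the universe \(U\).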

Model Based Monitoring Using Integrated Data Validation, Simulation and Parameter Estimation

Efficient and safe plant operation can only be achieved if the operators are able to monitor all key process parameters. Instrumentation is used to measure many process variables, such as temperatures, pressures, flow rates, compositions or other product properties. Performance monitoring is therefore a suitable tool for operators. In this paper, we integrate a rigorous simulation model, data reconciliation and parameter estimation to monitor process equipment and determine its key performance indicators (KPIs). The method has been applied in two case studies.
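
As an illustration of the data-reconciliation step, measurements can be adjusted by weighted least squares subject to a process constraint such as a mass balance. The following is a minimal sketch with hypothetical measurements, uncertainties and a single balance equation; the paper's rigorous model and solver are not reproduced here.

# Minimal data-reconciliation sketch: adjust measurements x to satisfy
# a linear mass balance A x = 0 while staying close to the raw values.
import numpy as np
from scipy.optimize import minimize

x_meas = np.array([100.2, 60.5, 40.1])   # hypothetical flow measurements
sigma = np.array([1.0, 0.8, 0.6])        # assumed measurement standard deviations
A = np.array([[1.0, -1.0, -1.0]])        # mass balance: F1 = F2 + F3

def objective(x):
    # weighted least-squares distance from the raw measurements
    return np.sum(((x - x_meas) / sigma) ** 2)

constraints = {"type": "eq", "fun": lambda x: A @ x}
reconciled = minimize(objective, x_meas, constraints=constraints).x
print(reconciled)                        # reconciled flows satisfying the balance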

A New IT-Convergence Service Design Framework

In many countries, digital city or ubiquitous city (u-City) projects have been initiated to provide digitalized economic environments to cities. Recently in Korea, Kangwon Province has started the u-Kangwon project to boost the local economy with digitalized tourism services. We analyze the limitations of the ubiquitous IT approach through the u-Kangwon case. We have found that travelers are more interested in the quality than in the speed of information access. To improve service quality, we develop an IT-convergence service design framework (ISDF). The ISDF is based on service engineering techniques and is composed of three parts: Service Design, Service Simulation, and the Service Platform.

Fabrication of Nanoporous Template of Aluminum Oxide with High Regularity Using Hard Anodization Method

Anodizing is an electrochemical process that converts a metal surface into a decorative, durable, corrosion-resistant anodic oxide finish. Aluminum is ideally suited to anodizing, although other nonferrous metals, such as magnesium and titanium, can also be anodized. The anodic oxide structure originates from the aluminum substrate and is composed entirely of aluminum oxide. This aluminum oxide is not applied to the surface like paint or plating but is fully integrated with the underlying aluminum substrate, so it cannot chip or peel. It has a highly ordered, porous structure that allows for secondary processes such as coloring and sealing. In this experimental paper, we focus on a reliable method for fabricating nanoporous alumina with high regularity. Starting from a study of synthesis methods for nanostructured materials, porous alumina is then fabricated in the laboratory by anodization of aluminum. Hard anodization is employed to fabricate the nanoporous alumina using 0.3 M oxalic acid at anodization voltages of 90, 120 and 140 V. The nanoporous templates were characterized by SEM and FFT analysis, and the templates anodized at 140 V show the highest degree of ordering. The pore formation, the influence of the experimental conditions on it, the structural characteristics of the pores and the oxide chemical reactions involved in pore growth are discussed.

Prevalence and Antimicrobial Susceptibility Patterns of Enteric Bacteria Isolated from Water and Fish in Lake Victoria Basin of Western Kenya

A cross-sectional study design and standard microbiological procedures were used to determine the prevalence and antimicrobial susceptibility patterns of Escherichia coli, Salmonella enterica serovar Typhimurium and Vibrio cholerae O1 isolated from water and two fish species, Rastrineobola argentea and Oreochromis niloticus, collected from fish landing beaches and markets in the Lake Victoria Basin of western Kenya. Out of 162 samples analyzed, 133 (82.1%) were contaminated, with S. Typhimurium the most prevalent (49.6%), followed by E. coli (46.6%) and V. cholerae (2.8%). All bacterial isolates were sensitive to ciprofloxacin. E. coli isolates were resistant to ampicillin, tetracycline, cotrimoxazole, chloramphenicol and gentamicin, while S. Typhimurium isolates exhibited resistance to ampicillin, tetracycline and cotrimoxazole. The V. cholerae O1 isolates were resistant to tetracycline and ampicillin. The high prevalence of drug-resistant enteric bacteria in water and fish from the study region calls for public health intervention by the local government.

Molecular Dynamic Simulation and Receptor-based Pharmacophore Modeling on Human Renin for Discovery of Novel Inhibitors

Hypertension is characterized by stress on the heart and blood vessels, increasing the risk of heart attack and renal disease. The renin-angiotensin system (RAS) plays a major role in blood pressure control, and renin is the enzyme that controls the RAS at its rate-limiting step. Our aim is to develop new drug-like leads that can inhibit renin and thereby serve as therapeutics for hypertension. To achieve this, molecular dynamics (MD) simulation and receptor-based pharmacophore modeling were employed, and three renin-inhibitor complex structures were selected based on IC50 values and inhibitor scaffolds. Three pharmacophore models were generated considering the conformations induced by the inhibitors. The compounds mapped to these models were selected and subjected to drug-likeness screening. The identified hits were docked into the active site of renin. Finally, Hit 1, which satisfied the binding-mode and interaction-energy criteria, was selected as a possible lead candidate for novel renin inhibitors.

Philosophy of Education: The Challenges of Globalization and Innovation in the Information Society

The information society is a fundamentally new social formation in which the infrastructure and social relations correspond to the socialized essence of mankind's «information genotype». The information society is a natural social environment that allows a person to fully reveal his or her informational nature and to use intelligence to create, jointly with other people, new information on the basis of the knowledge accumulated by previous generations.

Role of Customers in Stakeholders' Approach in Company Corporate Governance

The purpose of this paper is to explore the relationship between customers' issues in company corporate governance and financial performance. First, the theoretical background, consisting of stakeholder theory and corporate governance, is presented. On this basis, the empirical research was built, collecting data on the boards of 60 Czech joint-stock companies with respect to their relationships with customers. Correlation analysis and multivariate regression analysis were employed to test the sample against two hypotheses. A weak positive correlation between the stakeholder approach and company size was identified. However, neither hypothesis was supported, because no significant relation was found between the independent variables and financial performance.
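
A minimal sketch of the kind of analysis described, correlation followed by multivariate regression, using synthetic data with hypothetical variable names; the actual dataset of 60 Czech companies is not reproduced here.

# Pearson correlation and multivariate OLS regression on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n = 60                                          # sample size used in the paper
company_size = rng.normal(size=n)               # assumed proxy, e.g. log assets
customer_score = 0.3 * company_size + rng.normal(size=n)  # stakeholder-approach score
performance = rng.normal(size=n)                # financial performance, e.g. ROA

# correlation between the stakeholder approach and company size
r = np.corrcoef(customer_score, company_size)[0, 1]

# multivariate regression: performance ~ customer_score + company_size
X = np.column_stack([np.ones(n), customer_score, company_size])
beta, *_ = np.linalg.lstsq(X, performance, rcond=None)
print(r, beta)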

Mapping Soil Fertility at Different Scales to Support Sustainable Brazilian Agriculture

Most agricultural crops cultivated in Brazil are highly nutrient-demanding, while Brazilian soils are generally acidic, with low base saturation and low available nutrients. Demand for fertilizer application has increased because of the expansion of the national agricultural sector. To improve productivity without environmental impact, novel procedures and techniques are needed to optimize fertilizer application, including digital soil mapping and GIS applied to mapping at different scales. This paper is based on research conducted from 2005 to 2010 by the Brazilian Corporation for Agricultural Research (EMBRAPA) and its partners, whose purpose was to map soil fertility at national and regional scales. A soil profile data set at the national scale (1:5,000,000) was constructed from the soil archives of Embrapa Soils, Rio de Janeiro, and at the regional scale (1:250,000) from the COMIGO Cooperative soil data set, Rio Verde, Brazil. The mapping was done using ArcGIS 9.1 tools from ESRI.

Maximizer of the Posterior Marginal Estimate for Noise Reduction of JPEG-compressed Image

We constructed a method of noise reduction for JPEG-compressed images based on Bayesian inference using the maximizer of the posterior marginal (MPM) estimate. In this method, we tried the MPM estimate using two kinds of likelihood, both applied to grayscale images converted into JPEG-compressed form through lossy JPEG compression. One is a deterministic model of the likelihood and the other is a probabilistic one expressed by the Gaussian distribution. Then, using Monte Carlo simulation for grayscale images, such as the 256-grayscale standard image "Lena" with 256 × 256 pixels, we examined the performance of the MPM estimate based on the mean square error. We clarified that the MPM estimate via the Gaussian probabilistic model of the likelihood is effective for reducing noise, such as blocking artifacts and mosquito noise, if the parameters are set appropriately. On the other hand, we found that the MPM estimate via the deterministic model of the likelihood is not effective for noise reduction due to the low acceptance ratio of the Metropolis algorithm.
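
A minimal sketch of how an MPM estimate can be computed with the Metropolis algorithm: sample from a posterior combining a smoothness prior with a Gaussian likelihood around the degraded image, accumulate per-pixel marginal counts, and return the per-pixel maximizer. All parameter values, and the use of synthetic data in place of a JPEG-degraded image, are illustrative assumptions rather than the paper's exact model.

# Sketch of the MPM estimate via the Metropolis algorithm (illustrative
# prior/likelihood parameters; synthetic data stands in for a JPEG image).
import numpy as np

rng = np.random.default_rng(0)
y = rng.integers(0, 256, size=(16, 16)).astype(float)  # placeholder degraded image

def local_energy(x, i, j, v, y, beta, h):
    # smoothness prior over the 4-neighborhood plus a Gaussian likelihood term
    H, W = x.shape
    neigh = [x[i2, j2] for i2, j2 in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
             if 0 <= i2 < H and 0 <= j2 < W]
    return beta * sum((v - n) ** 2 for n in neigh) + h * (v - y[i, j]) ** 2

def mpm_denoise(y, beta=0.01, h=0.05, sweeps=200, burn_in=50):
    H, W = y.shape
    x = y.copy()
    counts = np.zeros((H, W, 256))
    for t in range(sweeps):
        for i in range(H):
            for j in range(W):
                v_new = float(rng.integers(0, 256))        # Metropolis proposal
                dE = (local_energy(x, i, j, v_new, y, beta, h)
                      - local_energy(x, i, j, x[i, j], y, beta, h))
                if dE <= 0 or rng.random() < np.exp(-dE):  # accept/reject step
                    x[i, j] = v_new
        if t >= burn_in:                                   # accumulate marginal counts
            counts[np.arange(H)[:, None], np.arange(W)[None, :], x.astype(int)] += 1
    return counts.argmax(axis=2)   # per-pixel maximizer of the posterior marginal

restored = mpm_denoise(y)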

Corporate Information System Educational Center

This work describes the corporate information system of the Educational Center created and successfully maintained at the Institute of Information Technologies of the National Academy of Sciences (NAS) of Azerbaijan. By decision of the board of the Supreme Attestation Commission under the President of the Republic of Azerbaijan and of the Presidium of the National Academy of Sciences of Azerbaijan, the Institute of Information Technologies was entrusted with organizing training courses in computer science for all postgraduate students and dissertators of the republic and with administering the candidate-minimum examinations. Accordingly, the Educational Center teaches computer science to postgraduate students and dissertators, provides scientific and methodological guidance on the effective application of new information technologies in their research work, and administers the candidate-minimum examinations. Information and communication technologies offer new opportunities and prospects for teaching and training. The new level of literacy demands the creation of essentially new technologies for obtaining scientific knowledge. Methods of training and development, social and professional requirements, and the globalization of the communicative, economic and political projects connected with building a new society all depend on the level of application of information and communication technologies in the educational process. Computer technologies develop the ideas of programmed learning and open completely new, as yet unexplored technological approaches to training based on the unique capabilities of modern computers and telecommunications. Computer-based training technologies are processes of preparing and transferring information to the trainee by means of a computer. Scientific and technical progress, as well as the global spread of technologies created in the most developed countries of the world, is the main proof of the leading role of education in the 21st century. The information society needs individuals with modern knowledge. In practice, all technologies that use special technical information means (computer, audio, video) are called educational information technologies.

Optimized Data Fusion in an Intelligent Integrated GPS/INS System Using Genetic Algorithm

Most integrated inertial navigation system (INS) and global positioning system (GPS) solutions have been implemented using the Kalman filtering technique, with its drawbacks of requiring a predefined INS error model and observability of at least four satellites. More recently, a method using a hybrid adaptive network-based fuzzy inference system (ANFIS) has been proposed that is trained while the GPS signal is available to map the error between the GPS and the INS, and is then used to predict the error of the INS position components during GPS signal blockage. This paper introduces a genetic optimization algorithm that updates the ANFIS parameters with the INS/GPS error function as the objective to be minimized. The results demonstrate the advantages of the genetically optimized ANFIS for INS/GPS integration in comparison with a conventional ANFIS, especially in the case of satellite outages. Coping with this problem plays an important role in assessing the fusion approach for land navigation.
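
A minimal sketch of the optimization loop: a simple genetic algorithm (truncation selection, one-point crossover, Gaussian mutation) minimizing a generic error objective over a parameter vector that stands in for the ANFIS parameters. The objective, dimensions and settings are illustrative assumptions, not the paper's configuration.

# Minimal genetic-algorithm sketch for tuning a parameter vector against an
# error objective (stand-in for the INS/GPS position-error function).
import numpy as np

rng = np.random.default_rng(0)

def error_function(params):
    # placeholder objective; in the paper this would be the INS/GPS error
    return np.sum((params - 0.5) ** 2)

def genetic_minimize(dim=8, pop_size=40, generations=100,
                     mutation_rate=0.1, mutation_scale=0.05):
    pop = rng.uniform(0.0, 1.0, size=(pop_size, dim))
    for _ in range(generations):
        fitness = np.array([error_function(ind) for ind in pop])
        order = np.argsort(fitness)            # lower error = fitter
        parents = pop[order[:pop_size // 2]]   # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, dim)         # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            mask = rng.random(dim) < mutation_rate
            child[mask] += rng.normal(scale=mutation_scale, size=mask.sum())
            children.append(child)
        pop = np.vstack([parents, children])
    fitness = np.array([error_function(ind) for ind in pop])
    return pop[np.argmin(fitness)]

best_params = genetic_minimize()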

Discovery of Quantified Hierarchical Production Rules from Large Set of Discovered Rules

Automated rule discovery is, due to its wide applicability, one of the most fundamental and important methods in KDD and has been an active research area in the recent past. Hierarchical representation allows us to easily manage the complexity of knowledge, to view the knowledge at different levels of detail, and to focus our attention on the interesting aspects only. One such efficient and easy-to-understand system is the Hierarchical Production Rule (HPR) system. An HPR, a standard production rule augmented with generality and specificity information, has the form: Decision If <condition> Generality <generality information> Specificity <specificity information>. HPR systems are capable of handling the taxonomical structures inherent in knowledge about the real world. This paper focuses on mining quantified rules with a crisp hierarchical structure using a Genetic Programming (GP) approach to knowledge discovery. The post-processing scheme presented in this work uses quantified production rules as the initial individuals of the GP and discovers the hierarchical structure. In the proposed approach, rules are quantified using Dempster-Shafer theory. Suitable genetic operators are proposed for the suggested encoding, and an appropriate fitness function based on the Subsumption Matrix (SM) is suggested. Finally, Quantified Hierarchical Production Rules (QHPRs) are generated from the discovered hierarchy using Dempster-Shafer theory. Experimental results are presented to demonstrate the performance of the proposed algorithm.
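
Since the abstract names Dempster-Shafer theory as the quantification mechanism, a minimal sketch of Dempster's rule of combination may help fix ideas; the focal elements and mass values below are illustrative, not drawn from the paper.

# Minimal sketch of Dempster's rule of combination for two mass functions
# given as {frozenset_of_hypotheses: mass} dictionaries.
def dempster_combine(m1, m2):
    combined, conflict = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc          # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: masses cannot be combined")
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

m1 = {frozenset({"rule_a"}): 0.6, frozenset({"rule_a", "rule_b"}): 0.4}
m2 = {frozenset({"rule_a"}): 0.7, frozenset({"rule_b"}): 0.3}
print(dempster_combine(m1, m2))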

Practical Applications and Connectivity Algorithms in Future Wireless Sensor Networks

Like any sentient organism, a smart environment relies first and foremost on sensory data captured from the real world. The sensory data come from sensor nodes of different modalities deployed at different locations, forming a Wireless Sensor Network (WSN). Embedding smart sensors in humans has been a research challenge due to the limitations imposed by these sensors, from limited computational capability to limited power. In this paper, we first propose a practical WSN application that would enable blind people to see what their neighboring partners can see; the challenge is that the actual mapping between input images and brain patterns is too complex and not well understood. We also study the connectivity problem in 3D/2D wireless sensor networks and propose efficient distributed algorithms to achieve the required connectivity of the system. We present a new connectivity algorithm, CDCA, which connects disconnected parts of a network using cooperative diversity. Through simulations, we analyze the connectivity gains and energy savings provided by this novel form of cooperative diversity in WSNs.
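
For background, a minimal sketch of how connectivity of a 2D sensor deployment can be checked under a unit-disk radio model (a breadth-first search over the communication graph). This is only a baseline illustration, not the CDCA algorithm itself, and the field size, node count and radio range are assumptions.

# Connectivity check for a 2D sensor deployment under a unit-disk radio model.
import numpy as np
from collections import deque

rng = np.random.default_rng(0)
nodes = rng.uniform(0, 100, size=(50, 2))   # 50 sensors in a 100 x 100 field
radio_range = 20.0

def is_connected(nodes, r):
    n = len(nodes)
    dists = np.linalg.norm(nodes[:, None, :] - nodes[None, :, :], axis=2)
    adj = dists <= r                         # unit-disk communication graph
    seen, queue = {0}, deque([0])
    while queue:                             # breadth-first search from node 0
        u = queue.popleft()
        for v in np.flatnonzero(adj[u]):
            if v not in seen:
                seen.add(int(v))
                queue.append(int(v))
    return len(seen) == n

print(is_connected(nodes, radio_range))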

Image Restoration in Non-Linear Filtering Domain using MDB approach

This paper proposes a new technique based on the nonlinear Minmax Detector Based (MDB) filter for image restoration. The aim of image enhancement is to reconstruct the true image from the corrupted image. The process of image acquisition frequently leads to degradation, and the quality of the digitized image becomes inferior to the original. Image degradation can be due to the addition of different types of noise to the original image. Image noise can be modeled in many ways, and impulse noise is one of them. Impulse noise generates pixels with gray values not consistent with their local neighborhood; it appears as a sprinkle of both light and dark, or only light, spots in the image. Filtering is a technique for enhancing the image. In linear filtering, the value of an output pixel is a linear combination of neighborhood values, which can blur the image, so a variety of nonlinear smoothing techniques have been developed. The median filter is one of the most popular nonlinear filters. It is highly effective for small neighborhoods, but for large windows and high noise densities it introduces more blurring. The Centre Weighted Mean (CWM) filter has a better average performance than the median filter; however, original pixels can still be altered, and under high-noise conditions this technique also has a blurring effect on the image. To illustrate the superiority of the proposed approach, the proposed scheme has been simulated along with the standard ones, and various restoration performance measures have been compared.
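
A minimal sketch of a min-max impulse detector with median replacement, in the spirit of the approach described; the window size and detection rule are illustrative assumptions, not the paper's exact MDB scheme.

# Min-max impulse detection with median replacement (illustrative sketch).
import numpy as np

def minmax_median_filter(img, window=3):
    pad = window // 2
    padded = np.pad(img, pad, mode="edge")
    out = img.copy()
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            block = padded[i:i + window, j:j + window]
            center = img[i, j]
            # flag the pixel as an impulse only if it equals the local extreme
            if center == block.min() or center == block.max():
                out[i, j] = np.median(block)
    return out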

Self Compensating ON Chip LDO Voltage Regulator in 180nm

An on-chip low-dropout (LDO) voltage regulator that employs an elegant compensation scheme is presented in this paper. The novelty of this design is that the device parasitic capacitances are exploited for compensation at different loads. The proposed LDO is designed to provide a constant voltage of 1.2 V and is implemented in UMC 180 nm CMOS technology. The presented voltage regulator improves stability even at lighter loads and enhances line and load regulation.

Texture Feature Extraction using Slant-Hadamard Transform

Classification of random and natural textures is still one of the biggest challenges in the fields of image processing and pattern recognition. In this paper, texture feature extraction using the Slant-Hadamard Transform (SHT) was studied and compared to other signal-processing-based texture classification schemes. A parametric SHT was also introduced and employed for natural texture feature extraction. We showed that a subtly modified parametric SHT can outperform the ordinary Walsh-Hadamard transform and the discrete cosine transform. Experiments were carried out on a subset of random natural texture images from the VisTex database using a kNN classifier.
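
As a rough illustration of transform-based texture features, the sketch below computes block-wise coefficient magnitudes with the ordinary Walsh-Hadamard transform, which the abstract uses as a baseline; the parametric Slant-Hadamard matrix itself is not reproduced, and the block size and pooling are assumptions.

# Block-wise Walsh-Hadamard energy features for texture description.
import numpy as np
from scipy.linalg import hadamard

def wht_features(img, block=8):
    H = hadamard(block) / np.sqrt(block)      # orthonormal Hadamard matrix
    feats = []
    for i in range(0, img.shape[0] - block + 1, block):
        for j in range(0, img.shape[1] - block + 1, block):
            patch = img[i:i + block, j:j + block].astype(float)
            coeffs = H @ patch @ H.T          # 2-D transform of the block
            feats.append(np.abs(coeffs))
    feats = np.mean(feats, axis=0)            # average coefficient magnitudes
    return feats.ravel()                      # feature vector for a kNN classifier

img = np.random.default_rng(0).random((64, 64))   # placeholder texture patch
features = wht_features(img)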

Constraint Based Frequent Pattern Mining Technique for Solving GCS Problem

The Generalized Center String (GCS) problem generalizes the Common Approximate Substring and Common Substring problems. GCS is known to be NP-hard; the difficulty lies in the explosion of potential candidates. Finding the longest center string is complicated because which sequences may not contain any motif is not known in advance in any particular biological gene process. GCS can be solved by frequent-pattern-mining techniques and is known to be fixed-parameter tractable with respect to the input sequence length and symbol-set size. Efficient methods known as Bpriori algorithms can solve GCS with reasonable time/space complexity; the Bpriori 2 and Bpriori 3-2 algorithms have been proposed to find center strings of any length together with the positions of all their instances in the input sequences. In this paper, we reduce the time/space complexity of the Bpriori algorithm with a Constraint-Based Frequent Pattern mining (CBFP) technique which integrates the ideas of constraint-based mining and FP-tree mining. The CBFP technique solves the GCS problem not only for center strings of any length, but also for the positions of all their mutated copies in the input sequences. CBFP constructs a TRIE-like FP-tree to represent the mutated copies of center strings of any length, along with constraints to restrain growth of the consensus tree. The complexity of the CBFP technique and of the Bpriori algorithm is analyzed for the worst case and the average case, and the correctness of the algorithm is demonstrated by comparison with the Bpriori algorithm on artificial data.

Phenolic Content and Antioxidant Activity Determination in Broccoli and Lamb’s Lettuce

Broccoli has been widely recognized as a vegetable rich in multiple nutrients with potent anti-cancer properties. Lamb’s lettuce has been used as food for many centuries but only recently became commercially available, and the literature concerning this vegetable is therefore scarce. The aim of this work was to evaluate the influence of the extraction conditions on the yield of phenolic compounds and the corresponding antioxidant capacity of broccoli and lamb’s lettuce. The results indicate that lamb’s lettuce, compared to broccoli, contains both a large amount of total polyphenols and high antioxidant activity. It is clearly demonstrated that the extraction solvent significantly influences the antioxidant activity, with methanol being the solvent that globally maximizes the antioxidant extraction yield. The results presented herein show lamb’s lettuce to be a very interesting source of polyphenols and thus a potential health-promoting food.

Efficient Preparation and Characterization of Carbohydrate Based Monomers. D-mannose Derivatives

The field of polymeric biomaterials is very important from the socio-economic viewpoint. Synthetic carbohydrate polymers are being increasingly investigated as biodegradable, biocompatible and biorenewable materials. The aim of this study was to synthesize and characterize derivatives based on D-mannose. D-mannose was chemically modified to obtain 1-O-allyl-2,3:5,6-di-O-isopropylidene-D-mannofuranose and 1-O-(2',3'-epoxypropyl)-2,3:5,6-di-O-isopropylidene-D-mannofuranose. The chemical structure of the resulting compounds was characterized by FT-IR and NMR spectroscopy, and by HPLC-MS.