EDULOGIC+ - Knowledge Management through Data Analysis in Education

This paper outlines the application of Knowledge Management (KM) principles in the context of educational institutions. It caters to the needs of engineering institutions for imparting quality education by delineating the instruction delivery process in a highly structured, controlled and quantified manner, using a software tool, EDULOGIC+. The central idea is based on the engineering education pattern in Indian universities and institutions. The data, contents and results produced over successive years build the necessary ground for managing the related accumulated knowledge. The application of KM is explained through examples of data analysis and knowledge extraction.

Application of LSB Based Steganographic Technique for 8-bit Color Images

Steganography is the process of hiding one file inside another such that others can neither identify the meaning of the embedded object nor even recognize its existence. Current trends favor using digital image files as the cover file to hide another digital file that contains the secret message or information. One of the most common methods of implementation is Least Significant Bit (LSB) Insertion, in which the least significant bit of every byte is altered to form the bit-string representing the embedded file. Altering the LSB causes only minor changes in color and is therefore usually not noticeable to the human eye. While this technique works well for 24-bit color image files, steganography has not been as successful with 8-bit color image files, due to limitations in color variations and the use of a colormap. This paper presents the results of research investigating the combination of image compression and steganography. The technique developed starts with a 24-bit color bitmap file and compresses it by organizing and optimizing an 8-bit colormap. After compression, a text message is hidden in the final, compressed image. Results indicate that the technique has the potential to be useful in the steganographic world.
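The following minimal sketch illustrates only the classic LSB-insertion principle described above, applied to a raw byte stream; it is not the colormap-compression variant developed in the paper, and all names and the stand-in cover data are illustrative.

```python
# Minimal sketch of classic LSB insertion on a raw byte stream (e.g. 24-bit
# RGB pixel data). Illustrative only; not the paper's colormap technique.

def embed_message(cover_bytes: bytearray, message: bytes) -> bytearray:
    """Hide `message` in the least significant bit of each cover byte."""
    bits = []
    for byte in message:
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))
    if len(bits) > len(cover_bytes):
        raise ValueError("cover image too small for this message")
    stego = bytearray(cover_bytes)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # overwrite only the LSB
    return stego

def extract_message(stego_bytes: bytes, length: int) -> bytes:
    """Recover `length` bytes by reading back the LSBs."""
    out = bytearray()
    for i in range(length):
        byte = 0
        for bit_index in range(8):
            byte = (byte << 1) | (stego_bytes[i * 8 + bit_index] & 1)
        out.append(byte)
    return bytes(out)

if __name__ == "__main__":
    cover = bytearray(range(256)) * 4          # stand-in for raw pixel data
    secret = b"hidden"
    stego = embed_message(cover, secret)
    assert extract_message(stego, len(secret)) == secret
```

Because only the lowest bit of each byte changes, each affected color component shifts by at most one level, which is the source of the visual imperceptibility mentioned above.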

Does Training in the Use of a Magnifier Improve Efficiency?

Provision of optical devices without proper instruction and training may cause frustration, resulting in rejection or incorrect use of the magnifiers. However, training in the use of magnifiers increases the cost of providing these devices. This study compared the efficacy of providing instruction alone and instruction plus training in the use of magnifiers. Twenty-four participants were randomly assigned to two groups: 15 received instruction and training, and 9 received instruction only. Repeated measures of print size and reading speed were taken at pre-training, post-training and follow-up. Print size decreased in both groups between pre- and post-training and was maintained at follow-up. Reading speed increased in both groups over time, with the training group demonstrating more rapid improvement. Whilst overall outcomes were similar, training decreased the time required to increase reading speed, supporting the use of training for increased efficiency. A cost-effective form of training is suggested.

Utilization of Laser-Ablation Based Analytical Methods for Obtaining Complete Chemical Information of Algae

The main goal of this article is to find efficient methods for elemental and molecular analysis of living microorganisms (algae) under defined environmental conditions and cultivation processes. The overall knowledge of chemical composition is obtained using laser-based techniques: Laser-Induced Breakdown Spectroscopy (LIBS) for acquiring information about elemental composition and Raman Spectroscopy for gaining molecular information. Algal cells were suspended in liquid media and characterized using their spectra. Results obtained with LIBS and Raman Spectroscopy will help to elucidate algae biology (nutrition dynamics depending on cultivation conditions) and to identify algal strains with potential applications in metal-ion absorption (bioremediation) and the biofuel industry. Moreover, bioremediation can be readily combined with the production of third-generation biofuels. In order to use algae for efficient fuel production, the optimal cultivation parameters have to be determined, leading to high oil production in selected cells without significant inhibition of the photosynthetic activity and the culture growth rate; for example, it is necessary to identify conditions for algal strains containing a high amount of highly unsaturated fatty acids. Measurements employing LIBS and Raman Spectroscopy were used to characterize the alga Trachydiscus minutus, with emphasis on the lipid content inside the algal cells (molecular information) and on the ability of algae to withdraw nutrients from their environment for bioremediation (elemental composition). This article can serve as a reference for further efforts in describing the complete chemical composition of algal samples employing laser-ablation techniques.

Sensor Network Based Emergency Response and Navigation Support Architecture

In an emergency, combining data from a Wireless Sensor Network with knowledge gathered from various other information sources and with navigation algorithms could help safely guide people to a building exit while avoiding the risky areas. This paper presents an emergency response and navigation support architecture for data gathering, knowledge manipulation and navigational support in an emergency situation. In the normal state, the system monitors the environment. When an emergency event is detected, the system sends messages to first responders and immediately separates the risky areas from the safe areas in order to establish escape paths. The main functionalities of the system include gathering data from a wireless sensor network deployed in a multi-story indoor environment, processing it with the information available in a knowledge base, and sharing the decisions made with first responders and people in the building. The proposed architecture helps reduce the risk of losing human lives by evacuating people much faster and with minimal congestion in an emergency environment.
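As a rough illustration of the escape-path idea (not the paper's architecture), the sketch below runs Dijkstra over a small building graph while excluding nodes flagged as risky; the graph, node names and risk set are hypothetical.

```python
# Illustrative sketch only: computing an escape path over a building graph
# while avoiding nodes flagged as risky by the sensor network. The graph,
# node names and risk set below are hypothetical examples.
import heapq

def escape_path(graph, start, exits, risky):
    """Dijkstra from `start` to the nearest exit, skipping risky nodes."""
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node in exits:                       # nearest safe exit reached
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return list(reversed(path))
        if d > dist.get(node, float("inf")):
            continue
        for neighbor, cost in graph.get(node, []):
            if neighbor in risky:               # never route through risky areas
                continue
            nd = d + cost
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                prev[neighbor] = node
                heapq.heappush(queue, (nd, neighbor))
    return None                                  # no safe route found

corridor = {
    "room_301": [("corridor_3A", 1.0)],
    "corridor_3A": [("stairs_A", 2.0), ("corridor_3B", 1.5)],
    "corridor_3B": [("stairs_B", 2.0)],
    "stairs_A": [("exit_main", 3.0)],
    "stairs_B": [("exit_rear", 3.0)],
}
print(escape_path(corridor, "room_301", {"exit_main", "exit_rear"}, {"stairs_A"}))
```

With stairs_A marked risky, the route is diverted through stairs_B to the rear exit, which mirrors how sensor readings would reshape the escape paths at run time.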

Faster FPGA Routing Solution using DNA Computing

There are many classical algorithms for FPGA routing, but using DNA computing the routes can be found efficiently and quickly. The run-time complexity of DNA algorithms is much lower than that of the classical algorithms used for FPGA routing. Research in DNA computing is still at an early stage. The high information density of DNA molecules and the massive parallelism involved in DNA reactions make DNA computing a powerful tool. Many research results have shown that any procedure that can be programmed on a silicon computer can be realized as a DNA computing procedure. In this paper we propose a two-tier approach to the FPGA routing problem. First, the geometric FPGA detailed routing task is solved by transforming it into a Boolean satisfiability equation with the property that any assignment of input variables that satisfies the equation specifies a valid routing: a satisfying assignment for a particular route results in a valid routing, and the absence of a satisfying assignment implies that the layout is unroutable. In the second step, a DNA search algorithm is applied to this Boolean equation to solve for routing alternatives, exploiting the properties of DNA computation. The simulation results are satisfactory and indicate the applicability of DNA computing to the FPGA routing problem.
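A toy sketch of the first tier only is given below: a tiny detailed-routing instance is encoded as CNF clauses (each net uses exactly one track, conflicting nets must not share a track) and a satisfying assignment is searched for exhaustively in software. The nets, tracks and conflicts are hypothetical, and the DNA-based search of the second tier is not modeled.

```python
# Toy SAT encoding of a tiny routing instance; brute-force search stands in
# for the DNA search step. All instance data below is hypothetical.
from itertools import product

NETS = ["n1", "n2", "n3"]
TRACKS = ["t1", "t2"]
CONFLICTS = [("n1", "n2"), ("n2", "n3")]       # nets that overlap in the channel

# One Boolean variable per (net, track) pair: "net n is routed on track t".
VARS = [(n, t) for n in NETS for t in TRACKS]

def clauses():
    """Build CNF clauses: each net on exactly one track, no shared tracks."""
    cnf = []
    for n in NETS:
        cnf.append([((n, t), True) for t in TRACKS])              # at least one track
        for t1, t2 in product(TRACKS, repeat=2):
            if t1 < t2:
                cnf.append([((n, t1), False), ((n, t2), False)])  # at most one track
    for a, b in CONFLICTS:
        for t in TRACKS:
            cnf.append([((a, t), False), ((b, t), False)])        # no shared track
    return cnf

def solve():
    cnf = clauses()
    for values in product([False, True], repeat=len(VARS)):
        assignment = dict(zip(VARS, values))
        if all(any(assignment[var] == sign for var, sign in clause) for clause in cnf):
            return sorted(var for var, val in assignment.items() if val)  # valid routing
    return None                                                   # layout is unroutable

print(solve())
```

Any assignment satisfying all clauses corresponds to a valid routing; if none exists, the instance is unroutable, exactly as stated for the full-scale formulation above.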

Performance Evaluation of the OCDM/WDM Technique for Optical Packet Switches

The performance of the Optical Code Division Multiplexing/Wavelength Division Multiplexing (OCDM/WDM) technique for optical packet switches is investigated. The impact on performance of the impairments due to both Multiple Access Interference and beat noise is studied. The Packet Loss Probability due to output packet contentions is evaluated as a function of the main switch and traffic parameters when coherent optical Gold codes are adopted. The Packet Loss Probability of the OCDM/WDM switch can reach 10⁻⁹ when M=16 wavelengths, Gold codes of length L=511 and only 24 wavelength converters are used in the switch.
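The toy Monte Carlo sketch below only illustrates the idea of evaluating packet loss from output contention with a limited number of wavelength converters; it ignores the OCDM dimension and the MAI/beat-noise impairments analysed in the paper, and all parameters and the traffic model are illustrative assumptions, not the paper's analytical model.

```python
# Toy Monte Carlo sketch of output-port contention in a WDM switch with a
# limited converter pool. Illustrative assumptions only.
import random

def packet_loss_probability(M=16, inputs=16, converters=4, load=0.8,
                            slots=200_000, seed=1):
    rng = random.Random(seed)
    lost = sent = 0
    for _ in range(slots):
        # Packets arriving from each input fibre on a random wavelength,
        # all destined to the same tagged output fibre in this time slot.
        arrivals = [rng.randrange(M) for _ in range(inputs) if rng.random() < load]
        sent += len(arrivals)
        free = set(range(M))      # wavelengths still free on the output fibre
        pool = converters         # converters available for this output
        for w in arrivals:
            if w in free:
                free.remove(w)    # forwarded on its own wavelength
            elif free and pool > 0:
                free.pop()        # shifted to any free wavelength
                pool -= 1
            else:
                lost += 1         # contention cannot be resolved: packet dropped
    return lost / sent if sent else 0.0

print(packet_loss_probability())
```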

Formation and Evaluation of Lahar/HDPE Hybrid Composite as a Structural Material for Household Biogas Digester

This study investigated the suitability of a Lahar/HDPE composite as the primary material for low-cost, small-scale biogas digesters. While sources of raw materials for biogas are abundant in the Philippines, the cost of the technology has made widespread utilization of this resource an indefinite proposition. Aside from capital economics, another problem arises with the space requirements of current digester designs. These problems may be simultaneously addressed by fabricating digesters on a smaller, household scale to reach a wider market, and by using materials that allow the overall design and fabrication cost to be optimized without sacrificing operational efficiency. The study involved the actual fabrication of the Lahar/HDPE composite at varying compositions and geometries, subsequent mechanical and thermal characterization, and statistical analysis to find intrinsic relationships between the variables. From the results, the Lahar/HDPE composite was found to be feasible for use as a digester material from both mechanical and economic standpoints.

Grid Computing in Physics and Life Sciences

Certain sciences, such as physics, chemistry or biology, have a strong computational aspect and use computing infrastructures to advance their scientific goals. Often, high-performance and/or high-throughput computing infrastructures such as clusters and computational Grids are applied to satisfy computational needs. In addition, these sciences are sometimes characterised by scientific collaborations requiring resource sharing, which is typically provided by Grid approaches. In this article, I discuss Grid computing approaches in High Energy Physics as well as in bioinformatics and highlight some of my experience in both scientific domains.

Action Potential Propagation in Inhomogeneous 2D Mouse Ventricular Tissue Model

Heterogeneous repolarization causes dispersion of the T-wave and has been linked to arrhythmogenesis. Such heterogeneities appear due to differential expression of ionic currents in different regions of the heart, in both healthy and diseased animals and humans. Mice are important animals for the study of heart diseases because of the ability to create transgenic animals. We used our previously reported model of mouse ventricular myocytes to develop a 2D mouse ventricular tissue model consisting of 14,000 cells (apical or septal ventricular myocytes) and to study the stability of action potential propagation and Ca2+ dynamics. The 2D tissue model was implemented as a FORTRAN program for high-performance multiprocessor computers and runs on 36 processors. Our tissue model is able to simulate heterogeneities not only in action potential repolarization, but also in intracellular Ca2+ transients. The multicellular model reproduced experimentally observed action potential propagation velocities and demonstrated the importance of incorporating realistic Ca2+ dynamics for action potential propagation. The simulations show that relatively sharp repolarization gradients are predicted to exist in 2D mouse tissue models and that they are primarily determined by the cellular properties of the ventricular myocytes. Abrupt local gradients of channel expression can cause alternans at longer pacing basic cycle lengths than gradual changes, and the development of alternans depends on the site of stimulation.
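For readers unfamiliar with 2D tissue simulation, the minimal sketch below shows the general structure of such a model: a reaction-diffusion update over a grid of coupled cells. It uses simple FitzHugh-Nagumo kinetics in NumPy as a stand-in for the detailed mouse ventricular myocyte model and the parallel FORTRAN code described above; grid size, parameters and stimulus are illustrative assumptions.

```python
# Minimal 2D monodomain reaction-diffusion sketch with FitzHugh-Nagumo
# kinetics as a simple stand-in for a detailed myocyte model. Illustrative
# parameters only; dimensionless units.
import numpy as np

nx = ny = 100                    # tissue grid (cells)
dt, dx, D = 0.05, 0.025, 0.001   # time step, space step, diffusion coefficient
a, eps = 0.1, 0.005              # excitability threshold, recovery rate

v = np.zeros((ny, nx))           # "membrane potential" variable
w = np.zeros((ny, nx))           # recovery variable
v[:, :3] = 1.0                   # stimulate the left edge to launch a planar wave

def laplacian(u):
    """5-point Laplacian with no-flux (replicated-edge) boundaries."""
    padded = np.pad(u, 1, mode="edge")
    return (padded[:-2, 1:-1] + padded[2:, 1:-1] +
            padded[1:-1, :-2] + padded[1:-1, 2:] - 4.0 * u) / dx**2

for step in range(2000):
    # Diffusive coupling between cells plus local (cellular) kinetics.
    dv = D * laplacian(v) + v * (1.0 - v) * (v - a) - w
    dw = eps * (v - w)
    v += dt * dv
    w += dt * dw

print("fraction of tissue currently excited:", float((v > 0.5).mean()))
```

Regional heterogeneity of the kind studied in the paper would correspond to making the local kinetics (here a and eps) vary across the grid, while the diffusion term carries the propagating wave between neighbouring cells.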

On the Parameter Optimization of Fuzzy Inference Systems

Nowadays, more and more engineering systems use some kind of Artificial Intelligence (AI) in their processes. Well-known AI techniques include artificial neural networks, fuzzy inference systems and neuro-fuzzy inference systems, among others. Many decision-making applications base their intelligent processes on Fuzzy Logic, owing to the capability of Fuzzy Inference Systems (FIS) to deal with problems that are based on user knowledge and experience. Moreover, since users have widely varying characteristics and generally provide uncertain data, such information can be used and properly processed by a FIS. To properly handle uncertainty and inexact system input values, FIS normally use Membership Functions (MF) that represent a degree of user satisfaction with certain conditions and/or constraints. The knowledge of experts in the field is very important for defining the parameters of the MFs: this knowledge defines the MF shapes used to process the user inputs, and through fuzzy reasoning and inference mechanisms the FIS can provide an "appropriate" output. However, an important issue immediately arises: how can it be assured that the obtained output is the optimum solution? How can it be guaranteed that each MF has an optimum shape? A viable answer to these questions is MF parameter optimization. In this paper a novel parameter optimization process is presented. The process for FIS parameter optimization consists of five simple steps that can easily be carried out off-line. The proposed process is demonstrated by its implementation in an Intelligent Interface section dealing with the on-line customization/personalization of internet portals applied to e-commerce.
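To make the notion of MF parameter optimization concrete, the sketch below tunes the three parameters of a triangular membership function by random search so that its output best matches a few example satisfaction ratings. This is only an illustration of the general idea, not the five-step process proposed in the paper; the sample data and parameter ranges are hypothetical.

```python
# Illustrative membership-function parameter tuning by random search.
# Data, ranges and the single triangular MF are hypothetical stand-ins.
import random

def triangular(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical examples: (input value, desired degree of satisfaction).
samples = [(0.1, 0.0), (0.3, 0.4), (0.5, 1.0), (0.7, 0.5), (0.9, 0.0)]

def error(params):
    a, b, c = params
    return sum((triangular(x, a, b, c) - target) ** 2 for x, target in samples)

def random_search(trials=5000, seed=0):
    rng = random.Random(seed)
    best, best_err = None, float("inf")
    for _ in range(trials):
        a = rng.uniform(0.0, 0.5)
        b = rng.uniform(a + 0.01, 0.9)       # keep a < b < c
        c = rng.uniform(b + 0.01, 1.0)
        err = error((a, b, c))
        if err < best_err:
            best, best_err = (a, b, c), err
    return best, best_err

print(random_search())
```

In a full FIS, the same idea extends to all MFs of all inputs and outputs, with the fitness measured on the inference result rather than on a single membership curve.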

Robust Stability in Multivariable Neural Network Control using Harmonic Analysis

Robust stability and performance are the two most basic features of feedback control systems. The harmonic balance analysis technique makes it possible to analyze the stability of limit cycles arising in neural-network-based control systems operating on nonlinear plants. In this work a robust stability analysis based on the harmonic balance is presented and applied to a neural-based control of a non-linear binary distillation column with unstructured uncertainty. We develop ways to describe uncertainty in the form of neglected nonlinear dynamics for the plant and high harmonics for the controller. Finally, conclusions about the performance of the neural control system are drawn using the Nyquist stability margin together with the structured singular values of the uncertainty as a robustness measure.
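As background, the sketch below shows the basic harmonic balance (describing function) calculation for predicting a limit cycle: solving G(jω)N(A) = -1 for a textbook relay nonlinearity in feedback with a simple third-order plant. The plant, nonlinearity and parameters are generic illustrations and are not the neural controller, distillation column or uncertainty description studied in the paper.

```python
# Generic describing-function harmonic-balance sketch: predict the limit
# cycle of a relay in feedback with G(s) = K / (s (s+1) (s+2)).
# Textbook example only; not the paper's system.
import numpy as np

K, M = 6.0, 1.0                      # plant gain and relay output level

def G(w):
    """Frequency response of G(s) = K / (s (s+1) (s+2))."""
    s = 1j * w
    return K / (s * (s + 1.0) * (s + 2.0))

def describing_function(A):
    """Describing function of an ideal relay of amplitude M."""
    return 4.0 * M / (np.pi * A)

# 1) Phase-crossover frequency where Im G(jw) = 0 (phase = -180 deg).
ws = np.linspace(0.1, 10.0, 200_000)
w_osc = ws[np.argmin(np.abs(np.imag(G(ws))))]

# 2) Harmonic balance: Re G(j w_osc) = -1 / N(A)  =>  A = -Re G * 4 M / pi.
A_osc = -np.real(G(w_osc)) * 4.0 * M / np.pi

print(f"predicted limit cycle: frequency ~ {w_osc:.3f} rad/s, amplitude ~ {A_osc:.3f}")
```

The robustness question addressed in the paper then amounts to asking how far this balance condition is from being violated when the plant and controller descriptions are perturbed by the stated uncertainty.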

The Influence of Ancient Artifacts on Contemporary Culture (exemplified by the Painting and Sculpture of Kazakhstan)

Petroglyphs, stone sculptures, burial mounds, and other memorial religious structures are ancient artifacts that find reflection in contemporary world culture, including the culture of Kazakhstan. In this article, the problem of the influence of ancient artifacts on contemporary culture is researched, using Kazakhstan's sculpture and painting as an example. The practice of creating petroglyphs, stone sculptures, and memorial religious structures was closely connected to all fields of human existence; it fostered the formation of a traditional worldview and became an inseparable part of it. The ancient roots of Saka-Scythian and Turkic nomadic culture have been studied and integrated into the foundations of the contemporary art of Kazakhstan. The study of the ancient cultural heritage of Kazakhstan by contemporary artists, sculptors and architects, as well as the influence of European art and cultures on the art of Kazakhstan, are furthering the development of a new national art.

Measuring the Structural Similarity of Web-based Documents: A Novel Approach

Most known methods for measuring the structural similarity of document structures are based on, e.g., tag measures, path metrics and tree measures in terms of their DOM-trees. Other methods measure the similarity within the framework of the well-known vector space model. In contrast, we present a new approach to measuring the structural similarity of web-based documents represented by so-called generalized trees, which are more general than DOM-trees, since DOM-trees represent only directed rooted trees. We design a new similarity measure for graphs representing web-based hypertext structures. Our similarity measure is mainly based on a novel representation of a graph as linear integer strings whose components represent structural properties of the graph. The similarity of two graphs is then defined via the optimal alignment of the underlying property strings. In this paper we apply the well-known technique of sequence alignment to solve a novel and challenging problem: measuring the structural similarity of generalized trees. More precisely, we first transform our graphs, considered as high-dimensional objects, into linear structures. Then we derive similarity values from the alignments of the property strings in order to measure the structural similarity of generalized trees. Hence, we transform a graph similarity problem into a string similarity problem. We demonstrate that our similarity measure captures important structural information by applying it to two different test sets consisting of graphs representing web-based documents.
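The alignment step can be illustrated with a standard global alignment of two integer property strings, from which a normalised similarity value is derived. The scoring scheme and the example strings below are hypothetical stand-ins for the structural property sequences derived from generalized trees in the paper.

```python
# Sketch of the string-alignment step: Needleman-Wunsch style global
# alignment of two integer "property strings" and a normalised similarity.
# Scoring scheme and example strings are hypothetical.

def align_score(p, q, match=1.0, mismatch=-1.0, gap=-1.0):
    """Optimal global alignment score of two integer sequences."""
    rows, cols = len(p) + 1, len(q) + 1
    score = [[0.0] * cols for _ in range(rows)]
    for i in range(1, rows):
        score[i][0] = i * gap
    for j in range(1, cols):
        score[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            diag = score[i - 1][j - 1] + (match if p[i - 1] == q[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    return score[-1][-1]

def structural_similarity(p, q):
    """Normalise by the score of the longer string aligned perfectly with itself."""
    best = max(len(p), len(q))
    raw = align_score(p, q)
    return max(0.0, raw / best) if best else 1.0

# Hypothetical degree-sequence-like property strings of two documents.
doc_a = [3, 2, 2, 1, 1, 1, 0]
doc_b = [3, 2, 1, 1, 1, 0]
print(structural_similarity(doc_a, doc_b))
```

In the approach described above, several such property strings per graph would be aligned and their scores combined, but the reduction from a graph comparison to a string comparison is the same.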

Ultra-Wideband Slot Antenna with Notched Band for Worldwide Interoperability for Microwave Access

In this paper a novel ultra-wideband (UWB) slot antenna with band-notch characteristics for worldwide interoperability for microwave access (WiMAX) is proposed. The designed antenna consists of a rectangular radiating patch and a ground plane with a tapered-shape slot. To realize the notch band, a curved parasitic element has been etched out along with the radiating patch. It is observed that by adjusting the length, thickness and position of the parasitic element, the proposed antenna can achieve an impedance bandwidth of 8.01 GHz (2.84 to 10.85 GHz) with a notched band of 3.28-3.85 GHz. Compared to recently reported band-notch antennas, the proposed antenna has a simple configuration for realizing band-notch characteristics in order to mitigate the potential interference between WiMAX and UWB systems. Furthermore, a stable radiation pattern and moderate gain, except in the notched band, make the proposed antenna suitable for various UWB applications.

Portable Continuous Aerosol Concentrator for the Determination of NO2 in the Air

The paper deals with the development of a portable aerosol concentrator and its application to the determination of nitrites and nitrates. The device enables continuous trapping of pollutants in the air. An extensive literature search was carried out on the development of samplers and the possibilities of their application to the continuous determination of volatile organic compounds. The practical part of the paper is focused on the development of the portable aerosol concentrator. The device, based on an Aerosol Enrichment Unit, was experimentally verified and subsequently realized. It operates on the principle of equilibrium accumulation of pollutants from the gaseous phase using a polydisperse aerosol of absorption liquid. The device has been applied to monitoring nitrites and nitrates in the air. A chemiluminescence detector was used for detection; the achieved detection limit was 28 ng/m3 for nitrites and 78 ng/m3 for nitrates.

Efficient Method for ECG Compression Using Two Dimensional Multiwavelet Transform

In this paper we introduce an effective ECG compression algorithm based on the two-dimensional multiwavelet transform. Multiwavelets offer simultaneous orthogonality, symmetry and short support, which is not possible with scalar two-channel wavelet systems. These features are known to be important in signal processing, so multiwavelets offer the possibility of superior performance for image processing applications. The SPIHT algorithm has achieved notable success in still image coding. We propose applying the SPIHT algorithm to the 2-D multiwavelet transform of 2-D arranged ECG signals. Experiments on selected ECG records from the MIT-BIH arrhythmia database reveal that the proposed algorithm is significantly more efficient than previously proposed ECG compression schemes.
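The 2-D arrangement and transform stage can be sketched as follows: the 1-D ECG is cut into equal-length segments, stacked into a 2-D array, transformed, and only the largest coefficients are kept. A scalar wavelet from PyWavelets and plain thresholding stand in here for the multiwavelet transform and SPIHT coder used in the paper; the signal is synthetic and all parameters are illustrative (requires numpy and PyWavelets).

```python
# Sketch of the 2-D arrangement + transform + crude coefficient selection.
# Scalar wavelet and thresholding stand in for multiwavelets and SPIHT.
import numpy as np
import pywt

fs, beats, beat_len = 360, 64, 256
t = np.arange(beats * beat_len) / fs
ecg = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.default_rng(0).standard_normal(t.size)

# 1-D signal -> 2-D array: one segment per row, so beat-to-beat redundancy
# becomes vertical correlation that the 2-D transform can exploit.
image = ecg.reshape(beats, beat_len)

coeffs = pywt.wavedec2(image, wavelet="db4", level=3)
arr, slices = pywt.coeffs_to_array(coeffs)

# Keep only the largest 5% of coefficients (a crude stand-in for SPIHT).
threshold = np.quantile(np.abs(arr), 0.95)
arr_compressed = np.where(np.abs(arr) >= threshold, arr, 0.0)

restored_coeffs = pywt.array_to_coeffs(arr_compressed, slices, output_format="wavedec2")
reconstructed = pywt.waverec2(restored_coeffs, wavelet="db4")
prd = np.linalg.norm(image - reconstructed[:beats, :beat_len]) / np.linalg.norm(image) * 100
print(f"kept 5% of coefficients, PRD ~ {prd:.2f}%")
```

Arranging the ECG in 2-D is what allows an image coder such as SPIHT to exploit correlation both along a beat and across consecutive beats.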

Identification of Non-Lexicon Non-Slang Unigrams in Body-enhancement Medicinal UBE

Email has become a fast and cheap means of online communication. The main threat to email is Unsolicited Bulk Email (UBE), commonly called spam. The current work aims at identifying unigrams in more than 2700 UBE messages that advertise body-enhancement drugs. The identification is based on the requirement that a unigram is neither present in the dictionary nor a slang term. The motives of the paper are manifold: it is an attempt to analyze spamming behaviour and the use of the word-mutation technique, and along the way we attempt to better understand spam, slang and their interplay. The problem has been addressed by employing a tokenization technique and a unigram bag-of-words (BOW) model. We found that non-lexicon words constitute nearly 66% of the total lexis of the corpus, whereas non-slang words constitute nearly 2.4% of the non-lexicon words. Further, non-lexicon non-slang unigrams composed of two lexicon words form more than 71% of the total number of such unigrams. To the best of our knowledge, this is the first attempt to analyze the usage of non-lexicon non-slang unigrams in any kind of UBE.
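The tokenization-and-filtering pipeline can be sketched as follows: extract unigrams from message bodies, keep those found neither in a dictionary nor in a slang list, and report their share of the vocabulary. The corpus, dictionary and slang list below are tiny illustrative stand-ins for the 2700+ UBE corpus and the full word lists used in the paper.

```python
# Sketch of the tokenization/filtering pipeline with tiny stand-in data.
import re
from collections import Counter

DICTIONARY = {"get", "your", "pills", "now", "cheap", "offer", "strong", "results"}
SLANG = {"gonna", "meds"}

corpus = [
    "Get your str0ng p1lls now!!! Cheap 0ffer, gonna see resultz fast",
    "Cheap meds, strooong results guaranteed",
]

def unigrams(text):
    """Lower-case word tokens (letters/digits only)."""
    return re.findall(r"[a-z0-9]+", text.lower())

counts = Counter(token for message in corpus for token in unigrams(message))
vocabulary = set(counts)
non_lexicon = {w for w in vocabulary if w not in DICTIONARY}
non_lexicon_non_slang = {w for w in non_lexicon if w not in SLANG}

print(f"vocabulary size: {len(vocabulary)}")
print(f"non-lexicon share: {len(non_lexicon) / len(vocabulary):.0%}")
print(f"non-lexicon non-slang unigrams: {sorted(non_lexicon_non_slang)}")
```

Mutated spellings such as "str0ng" or "resultz" survive both filters, which is exactly the word-mutation behaviour the analysis above sets out to quantify.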

Good Practices in the Development of the Erasmus Mundus Master program in Color in Informatics and Media Technology

The main objective of this paper is to identify and disseminate good practice in quality assurance and enhancement as well as in teaching and learning at master level. This paper focuses on the experience of the Erasmus Mundus Master program CIMET (Color in Informatics and Media Technology). Amongst topics covered, we discuss the adjustments necessary to a curriculum designed for excellent international students and their preparation for a global labor market.

Stroke Extraction and Approximation with Interpolating Lagrange Curves

This paper proposes a stroke extraction method for use in off-line signature verification. After a brief overview of current research, an algorithm is introduced for detecting and following strokes in static images of signatures. Problems such as the handling of junctions and variations in line width and line intensity are discussed in detail. Results are validated both by using an existing on-line signature database and by employing image registration methods.
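To illustrate the approximation step named in the title, the sketch below fits an interpolating Lagrange curve through a few points sampled along a stroke; the stroke points are hypothetical, standing in for the output of the stroke-following stage on a static signature image.

```python
# Sketch of approximating an extracted stroke with an interpolating
# Lagrange curve. Stroke points below are hypothetical.
import numpy as np

def lagrange_curve(points, samples=100):
    """Evaluate the Lagrange curve interpolating the given (x, y) points.

    The curve is parameterised by t in [0, 1]; both coordinates are
    interpolated as polynomials in t.
    """
    points = np.asarray(points, dtype=float)
    n = len(points)
    t_nodes = np.linspace(0.0, 1.0, n)
    t = np.linspace(0.0, 1.0, samples)
    curve = np.zeros((samples, 2))
    for i in range(n):
        # Lagrange basis polynomial L_i(t) = prod_{j != i} (t - t_j) / (t_i - t_j)
        basis = np.ones_like(t)
        for j in range(n):
            if j != i:
                basis *= (t - t_nodes[j]) / (t_nodes[i] - t_nodes[j])
        curve += np.outer(basis, points[i])
    return curve

# Hypothetical points sampled along one stroke of a signature.
stroke = [(12, 40), (18, 35), (26, 33), (35, 37), (41, 45)]
curve = lagrange_curve(stroke)
print(curve[[0, 49, -1]])     # start, midpoint and end of the fitted curve
```

Because Lagrange interpolation oscillates for many nodes, a practical pipeline would keep the number of points per stroke small or split long strokes into shorter segments before fitting.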