Removal of Copper (II) from Aqueous Solutions Using Teak (Tectona grandis L.f.) Leaves

The experiments were performed in a batch set-up at different Cu(II) concentrations (0.2–0.9 g L⁻¹), pH values (4–6) and temperatures (20–40 °C), with the teak leaf powder (biosorbent) dosage varied from 0.3 to 0.5 g L⁻¹. The kinetics of the interaction were tested with the pseudo-first-order Lagergren equation, and the value of k1 was found to be 6.909 × 10⁻³ min⁻¹. The biosorption data gave a good fit with the Langmuir and Freundlich isotherms; the Langmuir monolayer capacity (qm) was found to be 166.78 mg g⁻¹, and the Freundlich adsorption capacity (Kf) was estimated as 2.49 L g⁻¹. The mean values of the thermodynamic parameters ΔH, ΔS and ΔG were −62.42 kJ mol⁻¹, −0.219 kJ mol⁻¹ K⁻¹ and −1.747 kJ mol⁻¹ at 293 K for a solution containing 0.4 g L⁻¹ of Cu(II), showing the biosorption to be thermodynamically favourable. These results show the good potential of teak leaves as a biosorbent for the removal of Cu(II) from aqueous solutions.
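As a rough illustration of how such isotherm constants are typically estimated, the sketch below fits the Langmuir and Freundlich models to made-up equilibrium data by nonlinear least squares; the data values, initial guesses and use of SciPy are assumptions for illustration, not the paper's procedure.

    # Isotherm-fitting sketch with illustrative (not measured) equilibrium data.
    import numpy as np
    from scipy.optimize import curve_fit

    def langmuir(Ce, qm, KL):
        return qm * KL * Ce / (1.0 + KL * Ce)

    def freundlich(Ce, Kf, n):
        return Kf * Ce ** (1.0 / n)

    Ce = np.array([10.0, 25.0, 50.0, 100.0, 200.0])   # residual Cu(II), mg/L (assumed)
    qe = np.array([35.0, 70.0, 105.0, 135.0, 155.0])  # uptake at equilibrium, mg/g (assumed)

    (qm, KL), _ = curve_fit(langmuir, Ce, qe, p0=[170.0, 0.03])
    (Kf, n), _ = curve_fit(freundlich, Ce, qe, p0=[10.0, 2.0])
    print(qm, KL, Kf, n)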

The Conceptualization of an Integrated Consumer Health Informatics Utilization Framework

The purpose of this paper is to propose an integrated consumer health informatics utilization framework that can be used to gauge the online health information needs and usage patterns of Malaysian women. The proposed framework was developed based on four different theories/models: the Uses and Gratifications Theory, the Technology Acceptance Model 3, the Health Belief Model, and the Multi-level Model of Information Seeking. The relevant constructs and research hypotheses are also presented in this paper. The framework will be tested so that it can be used to identify Malaysian women's preferences for online health information resources and their health information seeking activities.

An Ontology for Spatially Relevant Objects in a Location-Aware System. Case Study: A Tourist Guide System

Location-aware computing is a type of pervasive computing that uses the user's location as a dominant factor in providing urban services and related applications. One of the important urban services is navigation instruction for wayfinders in a city, especially when the user is a tourist. The services presented to tourists should provide adapted, location-aware instructions. To achieve this goal, the main challenge is to find spatially relevant objects and location-dependent information. The aim of this paper is the development of a reusable location-aware model that handles spatial relevancy parameters in urban location-aware systems. To this end, we use an ontology as an approach that can manage spatial relevancy by defining a generic model. Our contribution is the introduction of an ontological model based on the principles of directed interval algebra. Indeed, it is assumed that the basic elements of our ontology are the spatial intervals for the user and his/her related contexts, and that the relationships between them model the spatial relevancy parameters. The implementation language for the model is OWL, a web ontology language. The achieved results show that our proposed location-aware model and the application adaptation strategies provide appropriate services for the user.
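As a minimal sketch of how such a model can be expressed in OWL (the class and property names below are assumed for illustration; the paper's directed-interval-algebra model is richer), spatial-interval classes and one directed relation are declared with rdflib:

    # Tiny OWL sketch: hypothetical spatial-interval classes and one directed relation.
    from rdflib import Graph, Namespace
    from rdflib.namespace import OWL, RDF, RDFS

    EX = Namespace("http://example.org/tourist-guide#")   # assumed namespace
    g = Graph()
    g.bind("ex", EX)

    g.add((EX.SpatialInterval, RDF.type, OWL.Class))
    g.add((EX.UserInterval, RDFS.subClassOf, EX.SpatialInterval))
    g.add((EX.PoiInterval, RDFS.subClassOf, EX.SpatialInterval))

    g.add((EX.precedes, RDF.type, OWL.ObjectProperty))    # stands in for a directed interval relation
    g.add((EX.precedes, RDFS.domain, EX.SpatialInterval))
    g.add((EX.precedes, RDFS.range, EX.SpatialInterval))

    print(g.serialize(format="turtle"))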

A Semantic Recommendation Procedure for Electronic Product Catalog

To overcome the product overload faced by Internet shoppers, we introduce a semantic recommendation procedure that is more efficient when applied to Internet shopping malls. The suggested procedure recommends semantically related products to customers and is based on Web usage mining, product classification, association rule mining, and purchase frequency. We applied the procedure to the MovieLens data set for performance evaluation, and some experimental results are provided. The experimental results show superior performance in terms of coverage and precision.
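As a small illustration of the rule-mining building block (the transactions and item names below are made up, not the evaluation data), the support and confidence of a candidate association rule are computed directly:

    # Support/confidence of an association rule over hypothetical purchase transactions.
    transactions = [
        {"camera", "memory_card"},
        {"camera", "memory_card", "tripod"},
        {"camera", "tripod"},
        {"memory_card", "battery"},
    ]

    def support(itemset):
        return sum(itemset <= t for t in transactions) / len(transactions)

    def confidence(antecedent, consequent):
        return support(antecedent | consequent) / support(antecedent)

    rule = (frozenset({"camera"}), frozenset({"memory_card"}))
    print(support(rule[0] | rule[1]))     # 0.5
    print(confidence(rule[0], rule[1]))   # 0.666...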

A Metric Framework for Analysis of Quality of Object Oriented Design

This paper examines the impact of object-oriented (OO) design on software quality characteristics such as defect density and rework by means of experimental validation. Encapsulation, inheritance, polymorphism, reusability, data hiding and message passing are the major attributes of an object-oriented system, and these attributes can act as indicators when evaluating the quality of such a system. Metrics are the well-known quantifiable approach for expressing any attribute. Hence, in this paper we formulate a framework of metrics representing the attributes of an object-oriented system. Empirical data are collected from three different projects based on the object-oriented paradigm to calculate the metrics.
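As a small illustration of metrics for the inheritance attribute (an assumed example using two classic metrics, not the paper's framework), Depth of Inheritance Tree and Number of Children can be computed by introspection:

    # Two classic OO metrics computed by introspection on a toy class hierarchy.
    def dit(cls):
        # depth of the class in the inheritance tree, counting 'object' as depth 0
        return len(cls.__mro__) - 1

    def noc(cls, universe):
        # number of classes in 'universe' that list cls as a direct base
        return sum(cls in c.__bases__ for c in universe)

    class Shape: pass
    class Polygon(Shape): pass
    class Triangle(Polygon): pass

    classes = [Shape, Polygon, Triangle]
    print(dit(Triangle))          # 3
    print(noc(Shape, classes))    # 1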

NEAR: Visualizing Information Relations in the Multimedia Repository A•VI•RE

This paper describes the NEAR (Navigating Exhibitions, Annotations and Resources) panel, a novel interactive visualization technique designed to help people navigate and interpret groups of resources, exhibitions and annotations by revealing hidden relations such as similarities and references. NEAR is implemented on A•VI•RE, an extended online information repository. A•VI•RE supports a semi-structured collection of exhibitions containing various resources and annotations. Users are encouraged to contribute, share, annotate and interpret resources in the system by building their own exhibitions and annotations. However, it is hard to navigate A•VI•RE smoothly and efficiently because of its size and complexity. We present a visual panel that implements new navigation and communication approaches supporting the discovery of implied relations. By quickly scanning and interacting with NEAR, users can see not only implied relations but also potential connections among different data elements. NEAR was tested by several users of the A•VI•RE system and shown to be a supportive navigation tool. In this paper, we further analyze the design, report the evaluation and consider its usage in other applications.

Intelligent Network-Based Stepping Stone Detection Approach

This research introduces a new use of Artificial Intelligence (AI) approaches in the Stepping Stone Detection (SSD) field of research. Counting the number of connection chains is one of the important steps of stepping stone detection and is a current research focus, and this research has chosen the Self-Organizing Map (SOM) as the AI technique because of its capabilities in this respect. Using SOM approaches as the engine, the experiments show that SOM is able to detect the number of connection chains involved in a stepping stone, i.e. in Network-based Stepping Stone Detection (NSSD).
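As a rough illustration of the engine itself, the sketch below trains a tiny one-dimensional SOM on made-up per-connection feature vectors and uses the set of winning units as a crude estimate of the number of clusters; the data, map size and parameter schedule are assumptions for illustration and are not the paper's NSSD setup.

    # Minimal 1-D self-organizing map over hypothetical connection features.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))          # hypothetical per-connection features
    n_units, n_iter, sigma0, lr0 = 8, 1000, 2.0, 0.5
    W = rng.normal(size=(n_units, 3))      # codebook vectors

    for t in range(n_iter):
        x = X[rng.integers(len(X))]
        bmu = np.argmin(np.linalg.norm(W - x, axis=1))   # best-matching unit
        frac = t / n_iter
        sigma = sigma0 * (1 - frac) + 0.5 * frac         # shrinking neighbourhood
        lr = lr0 * (1 - frac)                            # decaying learning rate
        dist = np.abs(np.arange(n_units) - bmu)          # grid distance to the BMU
        h = np.exp(-(dist ** 2) / (2 * sigma ** 2))      # neighbourhood function
        W += lr * h[:, None] * (x - W)                   # pull units toward the sample

    # Units that win at least one sample give a rough estimate of the cluster count,
    # which is how a SOM could hint at the number of distinct connection chains.
    wins = np.unique([np.argmin(np.linalg.norm(W - x, axis=1)) for x in X])
    print(len(wins))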

Analysis of the Bit Error Rate (BER) in Cognitive Radio Using the PSO Algorithm

The electromagnetic spectrum is a natural resource, and well-organized usage of this limited resource is a necessity for better communication. The present static frequency allocation schemes cannot accommodate the demands of the rapidly increasing number of higher data rate services. Therefore, dynamic usage of the spectrum must be distinguished from static usage to increase the availability of the frequency spectrum. Cognitive radio is not a single piece of apparatus but a technology that can incorporate components spread across a network. It offers great promise for improved system efficiency, spectrum utilization, more effective applications, reduced interference and reduced complexity of usage for users. A cognitive radio is aware of its environment, internal state and location, and autonomously adjusts its operation to achieve designed objectives. It first senses its spectral environment over a wide frequency band, and then adapts its parameters to maximize spectrum efficiency with high performance. This paper focuses on the analysis of the bit error rate (BER) in cognitive radio using the Particle Swarm Optimization (PSO) algorithm. The problem is analyzed and interpreted both theoretically and practically, in terms of advantages and drawbacks and of how the BER affects the efficiency and performance of the communication system.
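For orientation, the sketch below shows the bare PSO update loop minimising a smooth stand-in objective; the objective function, swarm size and coefficients are assumptions for illustration and do not reproduce the paper's radio model.

    # Bare particle swarm optimization loop on a toy BER-like objective.
    import numpy as np

    def ber_proxy(p):
        # hypothetical smooth stand-in for a BER curve, minimal at (2, -1)
        return (p[0] - 2.0) ** 2 + (p[1] + 1.0) ** 2

    rng = np.random.default_rng(1)
    n, dim, iters, w, c1, c2 = 20, 2, 100, 0.7, 1.5, 1.5
    pos = rng.uniform(-5, 5, (n, dim))
    vel = np.zeros((n, dim))
    pbest = pos.copy()
    pbest_val = np.array([ber_proxy(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()

    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([ber_proxy(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[np.argmin(pbest_val)].copy()

    print(gbest)   # should approach (2, -1)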

Distributed 2-Vertex Connectivity Test of Graphs Using Local Knowledge

The vertex connectivity of a graph is the smallest number of vertices whose deletion separates the graph or makes it trivial. This work is devoted to the problem of testing the vertex connectivity of graphs in a distributed environment, based on a general and constructive approach. The contribution of this paper is threefold. First, using a pre-constructed spanning tree of the considered graph, we present a protocol that tests whether a given graph is 2-connected using only local knowledge. Second, we present an encoding of this protocol using graph relabeling systems. The last contribution is the implementation of this protocol in the message passing model. For a given graph G, where M is the number of its edges, N the number of its nodes and Δ its maximum degree, our algorithms have the following requirements: the first one uses O(Δ×N²) steps and O(Δ×log Δ) bits per node; the second one uses O(Δ×N²) messages, O(N²) time and O(Δ×log Δ) bits per node. Furthermore, the studied network is semi-anonymous: only the root of the pre-constructed spanning tree needs to be identified.
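The paper's protocol is distributed and uses only local knowledge; as a centralised reference for the property being decided, the sketch below applies the classical DFS low-link test: a graph on at least three vertices is 2-connected iff it is connected and has no articulation point. The adjacency-list encoding and example graphs are assumed for illustration.

    # Centralised 2-vertex-connectivity check via articulation points (DFS low-link).
    def is_biconnected(adj):
        n = len(adj)
        if n < 3:
            return False
        disc, low = [-1] * n, [0] * n
        timer = 0
        articulation = False

        def dfs(u, parent):
            nonlocal timer, articulation
            disc[u] = low[u] = timer
            timer += 1
            children = 0
            for v in adj[u]:
                if disc[v] == -1:
                    children += 1
                    dfs(v, u)
                    low[u] = min(low[u], low[v])
                    if parent != -1 and low[v] >= disc[u]:
                        articulation = True      # u separates the subtree of v
                elif v != parent:
                    low[u] = min(low[u], disc[v])
            if parent == -1 and children > 1:
                articulation = True              # root with several DFS children
        dfs(0, -1)
        return all(d != -1 for d in disc) and not articulation

    cycle = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}   # 4-cycle: 2-connected
    path = {0: [1], 1: [0, 2], 2: [1]}                     # path: vertex 1 is a cut vertex
    print(is_biconnected(cycle), is_biconnected(path))     # True False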

Implementing an Authentication Protocol for Exchanging Encrypted Messages via an Authentication Server Based on Elliptic Curve Cryptography with ElGamal's Algorithm

In this paper the authors propose a protocol, which uses Elliptic Curve Cryptography (ECC) based on ElGamal's algorithm, for sending small amounts of data via an authentication server. The innovation of this approach is that there is no need for a symmetric algorithm or a secure communication channel such as SSL. The reason that ECC has been chosen instead of RSA is that it provides a methodology for obtaining high-speed implementations of authentication protocols and encrypted mail techniques while using fewer bits for the keys; this means that ECC systems require smaller chip sizes and lower power consumption. The proposed protocol has been implemented in Java to analyse its features and vulnerabilities in the real world.
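The paper's implementation is in Java and its server-based protocol is not reproduced here; purely to illustrate ElGamal encryption over an elliptic curve, the sketch below uses a tiny textbook curve (y² = x³ + 2x + 2 over F₁₇) with illustrative key and nonce values, and offers no security.

    # Toy EC-ElGamal sketch on a textbook curve; all parameters are illustrative only.
    p, a = 17, 2                      # curve y^2 = x^3 + 2x + 2 over F_17
    G = (5, 1)                        # base point on that curve

    def inv(x):
        return pow(x, p - 2, p)       # modular inverse (p is prime)

    def add(P, Q):                    # point addition; None is the point at infinity
        if P is None: return Q
        if Q is None: return P
        if P[0] == Q[0] and (P[1] + Q[1]) % p == 0:
            return None
        if P == Q:
            lam = (3 * P[0] * P[0] + a) * inv(2 * P[1]) % p
        else:
            lam = (Q[1] - P[1]) * inv(Q[0] - P[0]) % p
        x = (lam * lam - P[0] - Q[0]) % p
        return (x, (lam * (P[0] - x) - P[1]) % p)

    def mul(k, P):                    # double-and-add scalar multiplication
        R = None
        while k:
            if k & 1:
                R = add(R, P)
            P, k = add(P, P), k >> 1
        return R

    d = 7                             # receiver's private key (illustrative)
    Qpub = mul(d, G)                  # receiver's public key
    M = mul(3, G)                     # toy "message" encoded as a curve point
    k = 4                             # sender's ephemeral nonce (illustrative)
    C1, C2 = mul(k, G), add(M, mul(k, Qpub))
    S = mul(d, C1)                    # shared secret recomputed by the receiver
    recovered = add(C2, (S[0], (-S[1]) % p))   # C2 - d*C1
    print(recovered == M)             # True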

A Genetic-Algorithm-Based Approach for Audio Steganography

In this paper, we present a novel, principled approach to resolving the remaining problems of the substitution technique of audio steganography. Using the proposed genetic algorithm, message bits are embedded into multiple, vague and higher LSB layers, resulting in increased robustness. In particular, robustness is increased against intentional attacks that try to reveal the hidden message, as well as against unintentional attacks such as noise addition.
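As a baseline for the substitution technique (without the paper's genetic optimisation, which chooses the layers and samples), the sketch below writes message bits into one chosen bit layer of 16-bit audio samples and reads them back; the sample data and layer index are assumptions for illustration.

    # Plain bit-layer substitution in 16-bit samples (baseline, no GA involved).
    import numpy as np

    def embed(samples, bits, layer=0):
        out = samples.astype(np.int32).copy()
        out[:len(bits)] &= ~(1 << layer)                              # clear the target layer
        out[:len(bits)] |= np.asarray(bits, dtype=np.int32) << layer  # write the message bits
        return out.astype(np.int16)

    def extract(samples, n_bits, layer=0):
        return (samples[:n_bits].astype(np.int32) >> layer) & 1

    audio = np.random.default_rng(0).integers(-2000, 2000, 64).astype(np.int16)
    msg = [1, 0, 1, 1, 0, 0, 1, 0]
    stego = embed(audio, msg, layer=3)                  # a "higher" LSB layer
    print(extract(stego, len(msg), layer=3).tolist())   # [1, 0, 1, 1, 0, 0, 1, 0]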

Bridging the Mental Gap between Convolution Approach and Compartmental Modeling in Functional Imaging: Typical Embedding of an Open Two-Compartment Model into the Systems Theory Approach of Indicator Dilution Theory

Functional imaging procedures for the non-invasive assessment of tissue microcirculation are in high demand, but they require a mathematical approach describing the trans- and intercapillary passage of tracer particles. Up to now, two theoretical concepts, so far treated as distinct, have been established for tracer kinetic modeling of contrast agent transport in tissues: pharmacokinetic compartment models, which are usually written as coupled differential equations, and the indicator dilution theory, which can be generalized, in accordance with the theory of linear time-invariant (LTI) systems, by using a convolution approach. Based on mathematical considerations, it can be shown that, also in the case of an open two-compartment model well known from functional imaging, the concentration-time course in tissue is given by a convolution, which allows a separation of the arterial input function from a system function (the impulse response function) summarizing the available information on tissue microcirculation. For this reason, it is possible to integrate the open two-compartment model into the system-theoretic concept of indicator dilution theory (IDT), and thus results known from IDT remain valid for the compartment approach. Given the large number of applications of compartmental analysis, similar solutions of the so-called forward problem, even in a more general context, can already be found in the extensive literature of the seventies and early eighties. Nevertheless, to this day, within the field of biomedical imaging (though not from the mathematical point of view) there seems to be a trench between the two approaches, which the author would like to bridge by an exemplary analysis of the well-known model.
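In generic notation (the symbols A, B, α, β below are assumed for illustration and are not taken from the paper), the statement is that the tissue concentration-time course is the convolution of the arterial input function c_a with a biexponential impulse response whose amplitudes and rate constants are algebraic combinations of the compartmental exchange rates:

    C_T(t) = (c_a * I)(t) = \int_0^{t} c_a(\tau)\, I(t-\tau)\, d\tau, \qquad I(t) = A\, e^{-\alpha t} + B\, e^{-\beta t}.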

Bureau Management Technologies and Information Systems in Developing Countries

This study focuses on bureau management technologies and information systems in developing countries. Developing countries use such systems, which facilitate executive and organizational functions through the utilization of bureau management technologies and provide the executive staff with the necessary information. The concepts of data and information differ from each other, and thus the concepts of data processing and information processing are also different. Symbols represent ideas, objects, figures, letters and numbers. A data processing system is an integrated system, composed of both human beings and machines, which deals with the processing of data related to the internal and external environment of the organization in order to make decisions, create plans and develop strategies. Information is obtained through the acquisition and processing of data, whereas data are raw communicative messages. Within this framework, data processing amounts to producing meaningful information out of raw data. Organizations in developing countries need to obtain information relevant to them, because rapid changes in the organizational arena require rapid access to accurate information. The most significant role of the directors and managers who work in the organizational arena is to make decisions, and making a correct decision is possible only when they are equipped with sound ideas and appropriate information. Therefore, the acquisition, organization and distribution of information gain significance. Today's organizations make use of computer-assisted "Management Information Systems" in order to obtain and distribute information. A Decision Support System, which is closely related to practice, is an information system that facilitates the director's task of making decisions: it integrates human intelligence, information technology and software in order to solve complex problems, and, with the support of computer technology and software systems, it produces information relevant to the decision to be made and provides the executive staff with supportive ideas about the decision. Artificial Intelligence programs that transfer the studies and experiences of people to the computer are called expert systems; an expert system stores expert knowledge in a limited domain and can solve problems by deriving rational consequences. Bureau management technologies and information systems in developing countries create a kind of information society and information economy which help those countries take their place in the global socio-economic structure and enable them to play a reasonable and fruitful role; it is therefore of crucial importance to make use of information and management technologies in order to work together with innovative and enterprising individuals, and it is also important to create "scientific policies" based on information and technology in the fields of economy, politics, law and culture.

Static Single Point Positioning Using the Extended Kalman Filter

Global Positioning System (GPS) technology is widely used today in the areas of geodesy and topography, as well as in aeronautics, mainly for military purposes. Due to the military usage of GPS, full access to and use of this technology is denied to the civilian user, who must then work with a less accurate version. In this paper we focus on the estimation of the receiver coordinates (X, Y, Z) and its clock bias (δtr) for a fixed point, based on pseudorange measurements of a single GPS receiver. Using the instantaneous coordinates of just four satellites and their clock offsets, and taking the atmospheric delays into account, we derive a set of pseudorange equations. The estimation of the four unknowns (X, Y, Z, δtr) is achieved by introducing an extended Kalman filter that processes, off-line, all the data collected from the receiver. Higher position accuracy is attained by appropriate tuning of the filter noise parameters and by including other forms of biases.
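In standard GPS notation (satellite index i with position (x_i, y_i, z_i) and clock offset δt⁽ⁱ⁾, speed of light c, tropospheric and ionospheric delays T_i and I_i, and measurement noise ε_i; these symbols are assumed here rather than taken from the paper), each pseudorange equation has the form

    \rho_i = \sqrt{(X - x_i)^2 + (Y - y_i)^2 + (Z - z_i)^2} + c\,(\delta t_r - \delta t^{(i)}) + T_i + I_i + \varepsilon_i ,

and the extended Kalman filter linearises the square-root term around the current state estimate at every measurement update.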

Intelligent Multi-Agent Middleware for Ubiquitous Home Networking Environments

The next stage of the home networking environment is expected to be ubiquitous, where every object is equipped with an RFID (Radio Frequency Identification) tag. To fully support the ubiquitous environment, home networking middleware should be able to recommend home services based on a user's interests and to efficiently manage information on the users' service usage profiles. Therefore, USN (Ubiquitous Sensor Network) technology, which recognizes and manages an appliance's state information (location, capabilities, and so on) by connecting RFID tags, is considered. The Intelligent Multi-Agent Middleware (IMAM) architecture is proposed to intelligently manage mobile RFID-based home networking and to automatically supply information about home services that match a user's interests. Evaluation results for IMAM's personalization services using Bayesian networks and decision trees are presented.
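As a minimal illustration of the decision-tree side of the personalization services (the features, encodings and labels below are hypothetical and not the IMAM data), a small classifier maps usage-context features to a recommended home service:

    # Toy decision tree: hypothetical usage context -> recommended home service.
    from sklearn.tree import DecisionTreeClassifier

    # columns: hour of day, room id, user activity id (all assumed encodings)
    X = [[7, 0, 0], [8, 0, 0], [19, 1, 1], [20, 1, 1], [22, 2, 2], [23, 2, 2]]
    y = ["coffee_maker", "coffee_maker", "tv", "tv", "dim_lights", "dim_lights"]

    model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    print(model.predict([[23, 2, 2]]))   # ['dim_lights']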

Going beyond Social Maternage: The Principle of Brotherhood in Community Psychology Intervention

The aim of this paper is to study in depth some methodological aspects of social intervention, focusing on the desirable passage from the social maternage method to the peer advocacy method. For this purpose, we intend to analyze the social and organizational components that affect the operator's professional action and that are part of his psychological environment, besides the physical and social ones. In fact, the operator's intervention should not be limited to a pure supply of techniques, nor should it take the shape of improvised action, however "full of good purposes".

On the Analysis and Modelling of the Open Message Switching System

Modern queueing theory is one of the powerful tools for the quantitative and qualitative analysis of communication systems, computer networks, transportation systems, and many other technical systems. This paper is devoted to the analysis of queueing systems arising in network theory and communications theory (so-called open queueing networks). The authors of this research in the field of queueing theory present a theorem on the law of the iterated logarithm (LIL) for the queue length of customers in an open queueing network, and its application to the mathematical model of the open message switching system.
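For orientation, a law of the iterated logarithm for a queue-length process typically takes the following generic form (the notation below is assumed for illustration and is not the paper's statement): with \bar{q}_k t the fluid approximation of the k-th queue length Q_k(t) and \sigma_k > 0 a variability constant,

    \limsup_{t \to \infty} \frac{Q_k(t) - \bar{q}_k\, t}{\sigma_k \sqrt{2\, t \ln \ln t}} = 1 \quad \text{almost surely.}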

A Fair Non-transfer Exchange Protocol

Network exchange is now widely used; however, it still cannot avoid the problems arising from it. For example, a buyer may not receive the ordered item even if he/she makes the payment; conversely, the seller may get nothing even when the merchandise is sent. Some studies on fair exchange have proposed protocols designed for efficiency and have exploited the signature property to specify that the two parties agree on the exchange; however, information about the purchased item and its price is disclosed in this way. This paper proposes a new fair network payment protocol with an off-line trusted third party. The proposed protocol can protect the buyer's purchase message from being traced. In addition, the proposed protocol can meet the stated requirements. Its most significant feature is the non-transfer property we achieve.

Tracking Activity of Real Individuals in Web Logs

This paper describes an enhanced cookie-based method for counting the visitors of web sites, using a web log processing system that aims to cope with the ambitious goal of creating countrywide statistics about the browsing practices of real human individuals. The focus is put on describing a new, more efficient way of detecting the human beings behind web users by placing different identifiers on the client computers. We briefly introduce our processing system, designed to handle the massive amount of data records continuously gathered from the most important content providers of Hungary. We conclude by showing statistics for different time spans that compare the efficiency of multiple visitor counting methods to the one presented here; some interesting charts about content providers and web usage based on real data recorded in 2007 are also presented.
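As a toy illustration of why the choice of identifier matters (the log format and records below are hypothetical, not the described system), distinct visitors are counted once by a persistent cookie identifier and once by the cruder IP and user-agent pair:

    # Counting distinct visitors by two different identifiers over a toy log.
    from collections import namedtuple

    Hit = namedtuple("Hit", "ip user_agent cookie_id url")

    log = [
        Hit("1.2.3.4", "UA-1", "c-aaa", "/news"),
        Hit("1.2.3.4", "UA-1", "c-bbb", "/sport"),   # two people behind one NAT/IP
        Hit("5.6.7.8", "UA-2", "c-aaa", "/news"),    # same person from another network
        Hit("9.9.9.9", "UA-2", "c-bbb", "/sport"),   # second person from yet another network
    ]

    by_cookie = {h.cookie_id for h in log}
    by_ip_ua = {(h.ip, h.user_agent) for h in log}
    print(len(by_cookie), len(by_ip_ua))   # 2 3: the cookie id merges each person across networks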

Finite Element Analysis of Full Ceramic Crowns with and without Zirconia Framework

Simulating occlusal function during laboratory material testing is essential for predicting long-term performance before clinical usage. The aim of the study was to assess the influence of chamfer preparation depth on the failure risk of heat-pressed ceramic crowns with and without a zirconia framework by means of finite element analysis. 3D models of a maxillary central incisor, prepared for full ceramic crowns with different depths of the chamfer margin (between 0.8 and 1.2 mm) and 6-degree tapered walls, together with the overlying crowns, were generated using literature data (Fig. 1, 2). The crowns were designed with and without a zirconia framework with a thickness of 0.4 mm. For all preparations and crowns, the stresses in the pressed ceramic crown, zirconia framework, pressed ceramic veneer and dentin were evaluated separately. The highest stresses were registered in the dentin. For the studied cases, the depth of the preparation had no significant influence on the stress values in the teeth and the pressed ceramics, but only on those in the zirconia framework. The zirconia framework decreases the stress values in the veneer.