Ontology-based Query System for UNITEN Postgraduate Students

This paper proposes a new model to support user queries on postgraduate research information at Universiti Tenaga Nasional. The ontology to be developed will contribute towards shareable and reusable domain knowledge that makes knowledge assets intelligently accessible to both people and software. This work adapts a methodology for ontology development based on the framework proposed by Uschold and King. The concepts and relations in this domain are represented in a class diagram using the Protégé software. The ontology will be used to support a menu-driven query system for assisting students in searching for information related to postgraduate research at the university.

Balancing Strategies for Parallel Content-based Data Retrieval Algorithms in a k-tree Structured Database

The paper proposes a unified model for multimedia data retrieval which includes data representatives, content representatives, an index structure, and search algorithms. The multimedia data are defined as k-dimensional signals indexed in a multidimensional k-tree structure. The benefits of the unified k-tree model were demonstrated by running the data retrieval application on a test-bed cluster of six networked nodes. The tests were performed with two retrieval algorithms: the first allows parallel searching on a single feature, while the second performs a weighted cascade search for multi-feature queries. The experiments show a significant reduction in retrieval time while maintaining the quality of the results.
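
To make the weighted cascade idea concrete, here is a minimal sketch, not the paper's implementation, of how a multi-feature cascade can be organized: a coarse ranking on one feature produces a shortlist, which is then re-ranked with a weighted sum of distances over all features. The function names, the Euclidean distance, and all parameter values are illustrative assumptions.

    import numpy as np

    def cascade_search(query_feats, db_feats, weights, shortlist=100, k=10):
        """Weighted cascade sketch: rank by the first feature, keep a
        shortlist, then re-rank it by a weighted sum of distances over
        all features (illustrative, not the paper's algorithm)."""
        # Stage 1: coarse ranking on the first feature only
        d0 = np.linalg.norm(db_feats[0] - query_feats[0], axis=1)
        candidates = np.argsort(d0)[:shortlist]
        # Stage 2: weighted combination of all feature distances on the shortlist
        total = np.zeros(len(candidates))
        for w, q, db in zip(weights, query_feats, db_feats):
            total += w * np.linalg.norm(db[candidates] - q, axis=1)
        return candidates[np.argsort(total)[:k]]

    # Toy usage: two feature spaces, 200 database items
    rng = np.random.default_rng(0)
    db = [rng.normal(size=(200, 8)), rng.normal(size=(200, 4))]
    q = [db[0][17], db[1][17]]
    print(cascade_search(q, db, weights=[0.7, 0.3]))  # item 17 should rank first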

Join and Meet Block Based Default Definite Decision Rule Mining from IDT and an Incremental Algorithm

Using maximal consistent blocks of the tolerance relation on the universe of an incomplete decision table, the concepts of join block and meet block are introduced and studied. Other blocks of an object, including the tolerance class, tolerant kernel, and compatible kernel, are discussed at the same time. Upper and lower approximations based on these blocks are also defined. Default definite decision rules acquired from an incomplete decision table are proposed in the paper. An incremental algorithm to update default definite decision rules is suggested for effective mining from an incomplete decision table to which data are appended. Through an example, we demonstrate how default definite decision rules based on maximal consistent blocks, join blocks, and meet blocks are acquired, and how optimization is carried out with the support of the discernibility matrix and discernibility function of the incomplete decision table.
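
For readers unfamiliar with the underlying rough-set machinery, the standard tolerance relation on an incomplete decision table and the generic block-based approximations take the following form; this is background notation only, and the paper's join-block and meet-block constructions refine the block B(x) used here.

    % Tolerance relation on an incomplete decision table (U, A), with * a missing value
    T(x,y) \iff \forall a \in A:\; a(x) = a(y) \ \lor\ a(x) = * \ \lor\ a(y) = *
    % Block-based lower and upper approximations of a concept X \subseteq U,
    % where B(x) denotes the block assigned to object x
    \underline{apr}(X) = \{\, x \in U \mid B(x) \subseteq X \,\}, \qquad
    \overline{apr}(X)  = \{\, x \in U \mid B(x) \cap X \neq \emptyset \,\}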

Carbon Dioxide Capture and Storage: A General Review on Adsorbents

CO2 is the primary anthropogenic greenhouse gas, accounting for 77% of the human contribution to the greenhouse effect in 2004. In recent years, the global concentration of CO2 in the atmosphere has been increasing rapidly, and CO2 emissions have an impact on global climate change. Anthropogenic CO2 is emitted primarily from fossil fuel combustion. Carbon capture and storage (CCS) is one option for reducing CO2 emissions. There are three major approaches to CCS: post-combustion capture, pre-combustion capture, and the oxyfuel process. Post-combustion capture offers some advantages, as existing combustion technologies can still be used without radical changes. Several post-combustion gas separation and capture technologies are being investigated, namely (a) absorption, (b) cryogenic separation, (c) membrane separation, (d) microalgal biofixation, and (e) adsorption. Apart from establishing new techniques, the exploration of capture materials with high separation performance and low capital cost is of paramount importance. Adsorption-based capture, in particular, requires easily regenerable and durable adsorbents with a high CO2 adsorption capacity, and it has recently been reported that the cost of CO2 capture can be reduced by using this technology. In this paper, the research progress (based on experimental results) in adsorbents for CO2 adsorption, storage, and separation is reviewed, and future research directions are suggested.

Sequential Straightforward Clustering for Local Image Block Matching

Duplicated-region detection is a technique for exposing copy-paste forgeries in digital images. Copy-paste is a common type of forgery in which a portion of an image is cloned in order to conceal or duplicate a particular object. In this type of forgery detection, extracting robust block features and the high time complexity of the matching step are the two main open problems. This paper concentrates on computational time and proposes a local block matching algorithm based on block clustering to reduce the time complexity. The time complexity of the proposed algorithm is formulated, and the effects of two parameters, block size and number of clusters, on its efficiency are considered. The experimental results and mathematical analysis demonstrate that this algorithm is more cost-effective, in terms of time complexity, than lexicographic sorting algorithms when the image is complex.
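
As a rough illustration of the clustering idea, and not the authors' code, the sketch below groups image blocks by a simple statistic before matching, so that comparisons happen only inside each cluster rather than over all block pairs; the block statistic, cluster count, and distance threshold are illustrative assumptions.

    import numpy as np

    def cluster_block_matching(image, block=16, n_clusters=8, thresh=5.0):
        """Group non-overlapping blocks into clusters by mean intensity,
        then match only within each cluster, so far fewer pairs are
        compared than with exhaustive or lexicographic matching."""
        h, w = image.shape
        blocks, coords = [], []
        for i in range(0, h - block + 1, block):
            for j in range(0, w - block + 1, block):
                blocks.append(image[i:i + block, j:j + block].astype(float).ravel())
                coords.append((i, j))
        blocks = np.array(blocks)
        means = blocks.mean(axis=1)
        # Simple 1-D binning on mean intensity (a stand-in for the paper's clustering)
        edges = np.linspace(means.min(), means.max() + 1e-9, n_clusters + 1)
        labels = np.digitize(means, edges[1:-1])
        matches = []
        for c in range(n_clusters):
            idx = np.where(labels == c)[0]
            for a in range(len(idx)):
                for b in range(a + 1, len(idx)):
                    if np.linalg.norm(blocks[idx[a]] - blocks[idx[b]]) < thresh:
                        matches.append((coords[idx[a]], coords[idx[b]]))
        return matches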

Promoting Collaborative Learning in Software Engineering by Adapting the PBL Strategy

Software engineering education not only embraces the technical skills of software development but also necessitates communication and interaction among learners. In this paper, we propose adapting the PBL methodology, specifically designed to be integrated into the software engineering classroom, in order to promote a collaborative learning environment. This approach helps students better understand the significance of social aspects and provides a systematic framework to enhance teamwork skills. The adaptation of PBL facilitates the transition to an innovative software development environment where cooperative learning can be actualized.

A New Stabilizing GPC for Nonminimum Phase LTI Systems Using Time Varying Weighting

In this paper, we show that stability cannot be achieved with current stabilizing MPC methods for some unstable processes, and we present a new method for stabilizing them. The main idea is to use a new time-varying weighted cost function for traditional GPC. This stabilizes the closed-loop system without adding soft or hard constraints to the optimization problem. Several examples show that the proposed method achieves closed-loop stability for unstable nonminimum phase processes.
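
For context, the cost function of classical GPC has the standard form below; the paper's contribution, as described, is to let the weighting sequences vary with time rather than remain constant. The notation is the textbook one and not necessarily the paper's.

    J = \sum_{j=N_1}^{N_2} \delta(j)\,\bigl[\hat{y}(t+j \mid t) - w(t+j)\bigr]^2
      + \sum_{j=1}^{N_u} \lambda(j)\,\bigl[\Delta u(t+j-1)\bigr]^2

Here \hat{y} is the predicted output, w the reference, \Delta u the control increment, N_1, N_2 and N_u the prediction and control horizons, and \delta(j), \lambda(j) the weighting sequences that the proposed method makes time varying.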

Strategies for Securing Safety Messages with Fixed Key Infrastructure in Vehicular Network

Vehicular communications play a substantial role in providing safety in transportation by means of safety message exchange. Researchers have proposed several solutions for securing safety messages. Protocols based on a fixed key infrastructure are more efficient to implement and maintain stronger security in comparison with dynamic structures. These protocols use zone partitioning to establish a distinct key infrastructure under Certificate Authority (CA) supervision in each region. Secure anonymous broadcasting (SAB) is one such protocol; it preserves most security aspects but has some deficiencies in practice. A particularly important issue is a vehicle's region change due to its mobility. Changing regions leads to a change of CA and the need for a new key set to resume communication. In this paper, we propose solutions for informing vehicles about region changes so that they can obtain the new key set before entering the next region. This hinders attackers' intrusion, reduces packet loss, and lessens time delay. We also secure key request messages by binding the old CA's public key to the message, so stronger security for safety message broadcasting is attained.
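
To illustrate the message flow only, the hypothetical handler below sketches a key request carrying an authenticator tied to the previous region; the real protocol relies on public-key certificates under CA supervision, and the HMAC stand-in, the field names, and the issue_key_set callback are all assumptions made for this sketch.

    import hashlib
    import hmac
    from dataclasses import dataclass

    @dataclass
    class KeyRequest:
        vehicle_id: str
        old_region: str
        payload: bytes   # request body, e.g. pseudonym and timestamp
        tag: bytes       # authenticator produced under the old region's key material

    def make_request(vehicle_id, old_region, payload, old_region_key):
        tag = hmac.new(old_region_key, payload, hashlib.sha256).digest()
        return KeyRequest(vehicle_id, old_region, payload, tag)

    def handle_region_change(req, region_keys, issue_key_set):
        """The new CA accepts the request only if the authenticator checks out
        against the neighbouring region's key material, then issues the new
        key set so the vehicle is ready before it crosses the border."""
        key = region_keys.get(req.old_region)
        if key is None:
            return None                                   # unknown origin region
        expected = hmac.new(key, req.payload, hashlib.sha256).digest()
        if not hmac.compare_digest(expected, req.tag):
            return None                                   # reject forged requests
        return issue_key_set(req.vehicle_id)

    # Toy usage with shared demo key material
    keys = {"R1": b"region-one-demo-key"}
    req = make_request("V42", "R1", b"pseudonym|t=1700000000", keys["R1"])
    print(handle_region_change(req, keys, lambda vid: {"vehicle": vid, "keys": ["k1", "k2"]}))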

Daemon-Based Distributed Deadlock Detection and Resolution

Detecting deadlocks is one of the important problems in distributed systems, and different solutions have been proposed for it. Among the many deadlock detection algorithms, edge chasing has been the most widely used. In an edge-chasing algorithm, a special message called a probe is created and sent along dependency edges. When the initiator of a probe receives its own probe back, the existence of a deadlock is revealed. These algorithms, however, are not problem-free: they miss some deadlocks and even report false ones. A key point not addressed in the literature is how a process whose execution is blocked while waiting for required resources can actually respond to probe messages in the system. The question of which process should be victimized in order to achieve better performance when multiple cycles pass through a single process has also received little attention. In this paper, one of the basic concepts of operating systems, the daemon, is used to solve these problems. The proposed algorithm sends probe messages to the required daemons and collects enough information to effectively identify and resolve multi-cycle deadlocks in distributed systems.
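
As background on the probe mechanism the paper builds on, the generic sketch below shows edge chasing on a wait-for graph; it is not the daemon-based algorithm itself, and the graph and process names are illustrative.

    def edge_chasing(wait_for, initiator):
        """Minimal edge-chasing sketch: a probe (initiator, sender, receiver)
        is forwarded along wait-for edges; if it returns to the initiator, a
        deadlock cycle exists. In the paper this forwarding is delegated to
        per-node daemons, since blocked processes cannot answer probes."""
        stack = [(initiator, initiator, dep) for dep in wait_for.get(initiator, [])]
        visited = set()
        while stack:
            init, sender, receiver = stack.pop()
            if receiver == init:
                return True                     # probe came back: deadlock
            if receiver in visited:
                continue
            visited.add(receiver)
            for dep in wait_for.get(receiver, []):
                stack.append((init, receiver, dep))
        return False

    # Example wait-for graph: P1 -> P2 -> P3 -> P1 forms a cycle
    print(edge_chasing({"P1": ["P2"], "P2": ["P3"], "P3": ["P1"]}, "P1"))  # True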

Personal Knowledge Management among Adult Learners: Behind the Scene of Social Network

The burst of Web 2.0 technologies and social networking tools has given rise to different styles of learning and managing knowledge among both knowledge workers and adult learners. In Western countries, the open-learning concept has become popular owing to the ease of use and reach that the technology provides. In Malaysia, there are still gaps between learners' acceptance of technology and the full implementation of the technology in the education system. There is a need to understand how adult learners, who are knowledge workers, manage their personal knowledge via social networking tools, especially in their learning process. Four processes of personal knowledge management (PKM) and four cognitive enablers are proposed, supported by analysed data on adult learners at a university. The model derived from these processes and enablers is tested and presented, with recommendations on features to be included in adult learners' learning environments.

New Graph Similarity Measurements based on Isomorphic and Nonisomorphic Data Fusion and their Use in the Prediction of the Pharmacological Behavior of Drugs

New graph similarity methods are proposed in this work with the aim of refining the chemical information extracted from molecule matching. For this purpose, data fusion of the isomorphic and nonisomorphic subgraphs into a new similarity measure, the Approximate Similarity, was carried out using several approaches. The application of the proposed method to the development of quantitative structure-activity relationships (QSAR) has provided reliable tools for predicting several pharmacological parameters: the binding of steroids to the corticosteroid-binding globulin receptor, the activity of benzodiazepine receptor compounds, and blood-brain barrier permeability. Acceptable results were obtained for the models presented here.
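
Although the paper evaluates several fusion approaches, one simple way to picture the combination of isomorphic and nonisomorphic contributions is a convex blend of a similarity term and a dissimilarity penalty; this is an illustrative form, not necessarily the authors' exact definition.

    AS(G_1, G_2) = \alpha\, S_{\mathrm{iso}}(G_1, G_2)
                 - (1 - \alpha)\, D_{\mathrm{noniso}}(G_1, G_2), \qquad 0 \le \alpha \le 1

Here S_{\mathrm{iso}} is a similarity derived from the common (isomorphic) subgraphs, D_{\mathrm{noniso}} a penalty derived from the nonisomorphic remainder, and \alpha a fusion weight.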

Faculty-Industry R&D Joint Ventures: Barriers vs. Incentives for Developing Nations

This research article targets the gains of university-industry (U-I) collaboration and explores the hurdles that stand in the way of attaining these gains. University-industry collaboration has attained great importance in the USA since 1980 due to its application in all fields of life. U-I collaboration is a bilateral process in which academia is a proactive partner in forming such alliances, as universities seek to strengthen their academic base with practical technical expertise. U-I collaboration is becoming an essential avenue for achieving innovation goals in this century. Many developed nations have set successful examples showing that such collaboration acts as a catalyst to reduce the costs, effort, and personnel required for R&D projects. This study explores the incentives for U-I collaboration in light of the success stories of developed countries. Many universities in the USA, the UK, Canada, and various European countries have engaged with enterprises in numerous collaborative agreements, and a long list of strategic and short-term R&D projects has been executed in developed countries to accomplish their intended purposes. Due to a lack of commitment, genuine research, and a research-oriented environment, this field has not grown well in developing countries. During the last decade, however, a new wave of research has induced institutes in developing countries, especially Pakistan, to promote an R&D culture. The Higher Education Commission (HEC) has initiated many projects and funding schemes for universities that intend to collaborate with industry. The findings show that rapid innovation, overcoming technological complexities, and an articulated intellectual base are the major incentives that steer both partners to establish faculty-industry alliances. Ever-changing technologies, concerns about intellectual property, differences in research environment and culture, research relevance (basic or applied), differences in exposure, and diversity of knowledge (theoretical versus practical) are the main barriers to establishing and retaining joint ventures. The findings also indicate a dire need to support and enhance cooperation between academia and industry in order to promote highly coordinated research behaviour. A roadmap is proposed for developing countries to promote R&D clusters between faculty and industry to deal with technological challenges and innovation complexities. Based on the research findings, a model for R&D collaboration in developing countries is also proposed to promote an articulated R&D environment. If developing countries follow this approach, rapid innovation can be achieved with limited R&D budgets.

Audio Watermarking Based on Compression-expansion Technique

A novel robust audio watermarking scheme is proposed in this paper. In the proposed scheme, the host audio signal is segmented into frames. Two consecutive frames are assessed to determine whether they are suitable to represent a watermark bit. If so, a frequency transform is performed on the two frames. The compression-expansion technique is adopted to generate a distortion over the two frames, and this distortion is used to represent one watermark bit. A psychoacoustic model is applied to calculate the local auditory mask and ensure that the distortion is not audible. The watermarking schemes for mono and stereo audio signals are designed differently. A correlation-based detection method is used to detect the distortion and extract the embedded watermark bits. The experimental results show that the quality degradation caused by the embedded watermarks is perceptually transparent and that the proposed schemes are very robust against different types of attacks.
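
The sketch below shows, in a much-simplified form, how one bit could be hidden as a relative distortion between two frames and later recovered; the actual scheme additionally applies the psychoacoustic mask and correlation-based detection described above, and all parameter values here are illustrative.

    import numpy as np

    def embed_bit(frame_a, frame_b, bit, alpha=0.05):
        """Compression-expansion style embedding of one bit: one frame's
        spectrum is slightly expanded and the other compressed (or vice
        versa), so their relative distortion encodes the bit."""
        A, B = np.fft.rfft(frame_a), np.fft.rfft(frame_b)
        if bit:
            A, B = A * (1 + alpha), B * (1 - alpha)
        else:
            A, B = A * (1 - alpha), B * (1 + alpha)
        return np.fft.irfft(A, len(frame_a)), np.fft.irfft(B, len(frame_b))

    def detect_bit(frame_a, frame_b):
        # Decide the bit from the relative spectral energy of the two frames
        ea = np.sum(np.abs(np.fft.rfft(frame_a)) ** 2)
        eb = np.sum(np.abs(np.fft.rfft(frame_b)) ** 2)
        return int(ea > eb)

    # Toy usage: embed and recover one bit from two random frames
    rng = np.random.default_rng(2)
    fa, fb = rng.normal(size=1024), rng.normal(size=1024)
    wa, wb = embed_bit(fa, fb, bit=1)
    print(detect_bit(wa, wb))  # expected 1 when the two frames start with similar energy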

A New Approach to Design Low Power Continuous-Time Sigma-Delta Modulators

This paper presents the design of a second-order continuous-time sigma-delta modulator for low power applications. The loop filter of this modulator is implemented as a nonlinear transconductance-capacitor (Gm-C) structure employing a current-mode technique. The nonlinear transconductance uses floating-gate MOS (FG-MOS) transistors operating in the weak inversion region. The proposed modulator features low power consumption (

Tipover Stability Enhancement of Wheeled Mobile Manipulators Using an Adaptive Neuro-Fuzzy Inference Controller System

In this paper, an algorithm based on an adaptive neuro-fuzzy controller is proposed to enhance the tipover stability of mobile manipulators subjected to predefined trajectories for the end-effector and the vehicle. The controller creates proper configurations for the manipulator to prevent the robot from overturning. The optimal configuration, and thus the most favorable control, is obtained through soft computing approaches combining a genetic algorithm, neural networks, and fuzzy logic. In the proposed algorithm, a look-up table is built from the values obtained by the genetic algorithm so as to minimize the performance index; from this database, rule bases are designed for the ANFIS controller, whose output is applied to the actuators to enhance the tipover stability of the mobile manipulator. A numerical example is presented to demonstrate the effectiveness of the proposed algorithm.

Evaluation of Urban Development Proposals: An ANP Approach

In this paper, a new approach to prioritize urban planning projects in an efficient and reliable way is presented. It is based on environmental pressure indices and multicriteria decision methods. The paper introduces a rigorous method, of acceptable complexity, for rank-ordering urban development proposals according to their environmental pressure. The technique combines the use of Environmental Pressure Indicators, the aggregation of the indicators into an Environmental Pressure Index by means of the Analytic Network Process (ANP), and the interpretation of the information obtained from experts during the decision-making process. The ANP method allows the experts' judgments on each of the indicators to be aggregated into one Environmental Pressure Index. In addition, ANP is based on utility ratio functions, which are the most appropriate for the analysis of uncertain data such as experts' estimations. Finally, unlike other multicriteria techniques, ANP allows the decision problem to be modelled using the relationships among dependent criteria. The method has been applied to the proposal for the urban development of La Carlota airport in Caracas (Venezuela). The Venezuelan government would like to see a recreational project developed on the abandoned area that would mean a significant improvement for the capital. Three options are currently under evaluation: a health club, a residential area, and a theme park. The participating experts agreed that the method proposed in this paper is useful and an improvement over traditional techniques such as environmental impact studies and life-cycle analysis. They found the results coherent, the process sufficiently rigorous and precise, and the use of resources significantly lower than in other methods.
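
To give a flavour of the ANP aggregation step, the generic sketch below derives global priorities from the limit of a column-stochastic supermatrix; it is not the paper's full model, and the toy matrix is purely illustrative.

    import numpy as np

    def limit_priorities(supermatrix, tol=1e-9, max_iter=1000):
        """Raise a column-stochastic supermatrix to increasing powers until
        it converges; any column of the limit matrix then gives the global
        priorities used to aggregate the pressure indicators."""
        W = np.array(supermatrix, dtype=float)
        W = W / W.sum(axis=0, keepdims=True)   # make columns stochastic
        P = W.copy()
        for _ in range(max_iter):
            P_next = P @ W
            if np.max(np.abs(P_next - P)) < tol:
                return P_next[:, 0]
            P = P_next
        return P[:, 0]

    # Toy 3x3 supermatrix of mutual influences between indicators (illustrative numbers)
    M = [[0.2, 0.5, 0.3],
         [0.5, 0.2, 0.4],
         [0.3, 0.3, 0.3]]
    print(limit_priorities(M))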

Language and Retrieval Accuracy

One of the major challenges in the Information Retrieval field is handling the massive amount of information available to Internet users. Existing ranking techniques and strategies that govern the retrieval process fall short of the expected accuracy; relevant documents are often buried deep in the list of documents returned by the search engine. To improve retrieval accuracy, we examine the effect of language on the retrieval process and then propose a solution for a more user-biased, user-centric notion of relevance for retrieved data. The results demonstrate that using indices based on variations of the same language enhances the accuracy of search engines for individual users.

Automatic Reusability Appraisal of Software Components using Neuro-fuzzy Approach

Automatic reusability appraisal can help evaluate the quality of developed or developing reusable software components and identify reusable components in existing legacy systems, which can save the cost of developing software from scratch. However, the issue of how to identify reusable components in existing systems has remained relatively unexplored. In this paper, we present a two-tier approach that studies the structural attributes of a component as well as its usability or relevancy to a particular domain. Latent semantic analysis is used for the feature-vector representation of various software domains. It exploits the fact that feature-vector codes can be seen as documents containing terms (the identifiers present in the components), so text modeling methods that capture co-occurrence information in low-dimensional spaces can be used. Further, we devise a neuro-fuzzy hybrid inference system that takes structural metric values as input and calculates the reusability of the software component. A decision tree algorithm is used to derive the initial set of fuzzy rules for the neuro-fuzzy system. The results obtained are convincing enough to propose the system for economical identification and retrieval of reusable software components.
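
The first tier can be pictured with a small LSA sketch, illustrative only and not the paper's implementation: identifiers extracted from components form a term-by-component matrix, a truncated SVD embeds each component in a low-dimensional space, and cosine similarity then serves as a relevancy score against a domain.

    import numpy as np

    def lsa_embedding(term_doc, k=2):
        """Reduce a term-by-component matrix (counts of identifiers per
        software component) to a k-dimensional space so that components
        and domains can be compared by cosine similarity."""
        U, s, Vt = np.linalg.svd(term_doc, full_matrices=False)
        return (np.diag(s[:k]) @ Vt[:k]).T      # one k-dimensional vector per component

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    # Toy term-by-component matrix (rows: identifiers, columns: components)
    X = np.array([[3, 0, 1],
                  [2, 0, 0],
                  [0, 4, 1],
                  [0, 3, 2]], dtype=float)
    vecs = lsa_embedding(X, k=2)
    print(cosine(vecs[0], vecs[1]))  # near zero: components 0 and 1 share no identifiers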

The Association between Firm Characteristics and Corporate Mandatory Disclosure: The Case of Greece

The main thrust of this paper is to assess the level of disclosure in the annual reports of non-financial Greek firms and to empirically investigate the hypothesized impact of several firm characteristics on the extent of mandatory disclosure. A disclosure checklist consisting of 100 mandatory items was developed to assess the level of disclosure in the 2009 annual reports of 43 Greek companies listed on the Athens Stock Exchange. The association between the level of disclosure and several firm characteristics was examined using multiple linear regression analysis. The study reveals that Greek companies have, in general, responded adequately to the mandatory disclosure requirements of the regulatory bodies. The findings also indicate that firm size is significantly and positively associated with the level of disclosure. The remaining variables, namely age, profitability, liquidity, and board composition, were found to be insignificant in explaining the variation in mandatory disclosure. The outcome of this study is of considerable interest to the investment community at large, assisting in evaluating the extent of mandatory disclosure by Greek firms and in explaining the variation in disclosure in light of firm-specific characteristics.
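
The regression model implied by the abstract has the usual cross-sectional form; the variable names are illustrative stand-ins for the disclosure score and the firm characteristics tested.

    DSCORE_i = \beta_0 + \beta_1\, SIZE_i + \beta_2\, AGE_i + \beta_3\, PROF_i
             + \beta_4\, LIQ_i + \beta_5\, BOARD_i + \varepsilon_i

Here DSCORE_i is the disclosure index built from the 100-item checklist for firm i and \varepsilon_i the error term; the reported results correspond to a significant positive \beta_1 and insignificant coefficients on the remaining characteristics.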

Centre of Mass Selection Operator Based Meta-Heuristic for the Unbounded Knapsack Problem

In this paper, a new genetic algorithm based on a heuristic operator and a centre-of-mass selection operator (CMGA) is designed for the unbounded knapsack problem (UKP), an NP-hard combinatorial optimization problem. The proposed genetic algorithm is based on a heuristic operator that utilizes problem-specific knowledge. This centre-of-mass operator, when combined with other genetic operators, forms an algorithm competitive with existing ones. Computational results show that the proposed algorithm is capable of obtaining high-quality solutions for standard randomly generated knapsack instances. A comparative study of CMGA against a simple GA on unbounded knapsack instances of size up to 200 shows the superiority of CMGA. Thus CMGA is an efficient tool for solving the UKP and is also competitive with other genetic algorithms.
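
For orientation, a plain GA baseline for the UKP looks like the sketch below; the paper's centre-of-mass selection and problem-specific heuristic operator are not reproduced here (roulette selection and a random-drop repair stand in for them), and all parameter values are illustrative.

    import random

    def ga_ukp(values, weights, capacity, pop=30, gens=200, pmut=0.1):
        """Baseline GA for the unbounded knapsack problem: integer copy
        counts per item, roulette selection, one-point crossover, random
        mutation, and a repair step that drops copies until feasible."""
        n = len(values)
        max_copies = [capacity // w for w in weights]

        def total_weight(x):
            return sum(w * c for w, c in zip(weights, x))

        def fitness(x):
            return sum(v * c for v, c in zip(values, x))

        def repair(x):
            while total_weight(x) > capacity:     # drop random copies until feasible
                i = random.choice([j for j in range(n) if x[j] > 0])
                x[i] -= 1
            return x

        popn = [repair([random.randint(0, m) for m in max_copies]) for _ in range(pop)]
        for _ in range(gens):
            fits = [fitness(x) for x in popn]
            total = sum(fits) or 1

            def pick():                            # roulette-wheel selection
                r, acc = random.uniform(0, total), 0.0
                for x, f in zip(popn, fits):
                    acc += f
                    if acc >= r:
                        return x
                return popn[-1]

            children = []
            for _ in range(pop):
                a, b = pick(), pick()
                cut = random.randrange(1, n) if n > 1 else 0
                child = a[:cut] + b[cut:]
                if random.random() < pmut:         # mutate one gene
                    i = random.randrange(n)
                    child[i] = random.randint(0, max_copies[i])
                children.append(repair(child))
            popn = children
        return max(popn, key=fitness)

    # Toy instance: item values, item weights, knapsack capacity
    print(ga_ukp([10, 7, 4], [5, 4, 3], 17))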