Applying GQM Approach towards Development of Criterion-Referenced Assessment Model for OO Programming Courses

The most influential programming paradigm today is object-oriented (OO) programming, and it is widely used in both education and industry. Recognizing the importance of equipping students with OO knowledge and skills, it is not surprising that most Computer Science degree programs offer OO-related courses. How do we assess whether students have acquired the right object-oriented skills after completing their OO courses? What are object-oriented skills? Currently, none of the existing assessment techniques can answer these questions. Traditional forms of OO programming assessment provide a way of assigning numerical scores to determine letter grades, but this rarely reveals how students actually understand OO concepts. A better understanding of how to define and assess OO skills is therefore needed, which this work pursues by developing a criterion-referenced model. This is especially critical in the context of Malaysia, where there is currently growing concern over the level of competency of Malaysian IT graduates in object-oriented programming. This paper discusses the approach used to develop the criterion-referenced assessment model, which can serve as a guideline when conducting OO programming assessment. The proposed model is derived using the Goal Question Metric (GQM) methodology, which helps formulate the metrics of interest. The paper concludes with a few suggestions for further study.
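The GQM methodology works top-down: each assessment goal is refined into questions, and each question into measurable metrics. A minimal sketch of that hierarchy is below; the goal, question and metric names are invented illustrations, not the paper's actual model.

```python
# Hypothetical sketch of a Goal-Question-Metric (GQM) hierarchy for OO
# assessment; all names and criteria here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    description: str

@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)

@dataclass
class Goal:
    purpose: str
    questions: list = field(default_factory=list)

goal = Goal(purpose="Assess students' OO programming skills")
q1 = Question("Can the student design an appropriate class hierarchy?")
q1.metrics.append(Metric("inheritance_use",
                         "Correct use of inheritance in submitted code"))
goal.questions.append(q1)

# Walk the hierarchy: each goal yields (question, metric) pairs.
pairs = [(q.text, m.name) for q in goal.questions for m in q.metrics]
```

Traversing the goal tree this way yields the flat list of metrics an assessor would actually collect for each student.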

Adaptive Filtering of Heart Rate Signals for an Improved Measure of Cardiac Autonomic Control

In order to provide accurate heart rate variability indices of sympathetic and parasympathetic activity, the low frequency and high frequency components of an RR heart rate signal must be adequately separated. This is not always possible by applying spectral analysis alone, as power from the high and low frequency components often leaks into adjacent bands. Furthermore, without the respiratory spectrum it is not obvious that the low frequency component is not simply another respiratory component, which can appear in the lower band. This paper describes an adaptive filter that aids the separation of the low frequency sympathetic and high frequency parasympathetic components of an ECG R-R interval signal, enabling more accurate heart rate variability measures. The algorithm is applied to simulated signals and to heart rate and respiratory signals acquired from an ambulatory monitor incorporating a single-lead ECG and inductive plethysmography sensors embedded in a garment. The results show an improvement over standard heart rate variability spectral measurements.
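One common way to use a respiration signal to separate the respiratory (HF) component from an RR series is adaptive noise cancellation. The paper does not specify its filter design, so the following numpy-only LMS sketch, with assumed tap count, step size and synthetic signals, only illustrates the general idea.

```python
# Minimal LMS adaptive-noise-cancellation sketch (numpy only). The tap
# count, step size and synthetic signals are assumptions, not the paper's.
import numpy as np

def lms_cancel(primary, reference, n_taps=8, mu=0.05):
    """Remove the reference-correlated component from the primary signal."""
    w = np.zeros(n_taps)
    out = np.zeros(len(primary))
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps:n][::-1]   # tap-delay line of the reference
        y = w @ x                           # current estimate of the interference
        e = primary[n] - y                  # error = primary minus estimate
        w += 2 * mu * e * x                 # LMS weight update
        out[n] = e
    return out

# Mean-removed RR series sampled at 4 Hz: LF (0.1 Hz) plus respiratory HF (0.25 Hz).
t = np.arange(0, 60, 0.25)
lf = 0.05 * np.sin(2 * np.pi * 0.1 * t)
hf = 0.05 * np.sin(2 * np.pi * 0.25 * t)
resp = np.sin(2 * np.pi * 0.25 * t)        # respiration used as the reference input
cleaned = lms_cancel(lf + hf, resp)        # HF cancelled, LF retained
```

After convergence the output is nearly orthogonal to the respiration reference, so the LF band can be measured without respiratory leakage.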

Megalopolisation: An Effect of Large Scale Urbanisation in Post-Reform China

A megalopolis is a group of densely populated metropolitan areas that combine to form an urban complex. Since China introduced economic reforms in the late 1970s, the Chinese urban system has experienced unprecedented growth. The process of urbanisation prevailed in the 1980s, and predominantly large-city growth appeared to continue through the 1990s and 2000s. In this study, the magnitude and pattern of urbanisation in China during the 1990s were examined using remotely sensed imagery acquired by the TM/ETM+ sensors onboard the Landsat satellites. The development of megalopolis areas in China was also studied based on GIS analysis of the increases in urban and built-up area from 1990 to 2000. The analysis suggests that in the traditional agricultural zones of China, e.g., the Huang-Huai-Hai Plains, Changjiang River Delta, Pearl River Delta and Sichuan Basin, urban and built-up areas increased by 1.76 million hectares, of which 0.82 million hectares were expansion of urban areas, an increase of 24.78% compared with 1990 at the national scale. The Yellow River Delta, Changjiang River Delta and Pearl River Delta also saw increases in urban and built-up area of 63.9%, 66.2% and 83.0% respectively. As a result, three major megalopolises developed in China: the Guangzhou-Shenzhen-Hong Kong-Macau (Pearl River Delta: PRD) megalopolis, the Shanghai-Nanjing-Hangzhou (Changjiang River Delta: CRD) megalopolis and the Beijing-Tianjin-Tangshan-Qinhuangdao (Yellow River Delta-Bohai Sea Ring: YRD) megalopolis. The relationship between the process of megalopolisation and inter-provincial population flow was also explored in the context of socio-economic and transport infrastructure development in post-reform China.

Join and Meet Block Based Default Definite Decision Rule Mining from IDT and an Incremental Algorithm

Using maximal consistent blocks of the tolerance relation on the universe of an incomplete decision table, the concepts of join block and meet block are introduced and studied. Alongside the tolerance class, other blocks such as the tolerant kernel and compatible kernel of an object are also discussed. Upper and lower approximations based on these blocks are then defined. Default definite decision rules acquired from an incomplete decision table are proposed in the paper. An incremental algorithm to update default definite decision rules is suggested for effective mining from an incomplete decision table to which data is appended. Through an example, we demonstrate how default definite decision rules based on maximal consistent blocks, join blocks and meet blocks are acquired, and how optimization is performed with the support of the discernibility matrix and discernibility function of the incomplete decision table.

Train the Trainer: The Bricks in the Learning Community Scaffold of Professional Development

Professional development is the focus of this study, which reports on questionnaire data examining the perceived effectiveness of the Train the Trainer model of technology professional development for elementary teachers. Eighty-three selected teachers, called Information Technology Coaches, received four half-day and one after-school in-service sessions. Subsequently, the coaches shared the information and skills acquired during training with colleagues. Results indicated that participants felt comfortable as Information Technology Coaches and felt well prepared because of their technology professional development. Overall, participants perceived the Train the Trainer model to be effective. The outcomes of this study suggest that the Train the Trainer model, a well-known professional development model, can be an integral and interdependent component of the newer, more comprehensive learning-community professional development model.

Identification of Arousal and Relaxation by using SVM-Based Fusion of PPG Features

In this paper, we propose a new method to distinguish between arousal and relaxation states by using multiple features acquired from a photoplethysmogram (PPG) and a support vector machine (SVM). To induce arousal and relaxation states in subjects, two kinds of sound stimuli are used, and the corresponding biosignals are obtained using the PPG sensor. Two features, the pulse-to-pulse interval (PPI) and the pulse amplitude (PA), are extracted from the acquired PPG data, and nonlinear classification between arousal and relaxation is performed using the SVM. This methodology has several advantages compared with previous similar studies. Firstly, we extract two separate features from the PPG, i.e., PPI and PA. Secondly, to improve classification accuracy, SVM-based nonlinear classification is performed. Thirdly, to avoid the classification problems caused by features generalized across all subjects, we define a threshold for each individual's features. Experimental results showed an average classification accuracy of 74.67%. The proposed method also showed better identification performance than single-feature-based methods. From these results, we confirm that arousal and relaxation can be classified using SVM and PPG features.
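The feature-extraction step described above can be sketched with numpy alone: detect pulse peaks in a PPG-like waveform, then derive PPI (inter-peak times) and PA (peak heights). The peak detector, sampling rate and synthetic signal below are assumptions for illustration; the resulting features would then feed an SVM classifier.

```python
# Sketch of PPI/PA extraction from a synthetic PPG-like signal; the peak
# detector and the signal itself are illustrative assumptions.
import numpy as np

def detect_peaks(sig):
    """Indices of simple local maxima that rise above the signal mean."""
    return [i for i in range(1, len(sig) - 1)
            if sig[i] > sig[i - 1] and sig[i] >= sig[i + 1]
            and sig[i] > sig.mean()]

fs = 100.0                         # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
ppg = np.sin(2 * np.pi * 1.2 * t)  # ~72 bpm synthetic pulse waveform

peaks = np.array(detect_peaks(ppg))
ppi = np.diff(peaks) / fs          # pulse-to-pulse intervals in seconds
pa = ppg[peaks]                    # pulse amplitudes at the detected peaks
```

For a 1.2 Hz pulse the recovered PPI values cluster around 1/1.2 ≈ 0.83 s, which is the kind of per-beat feature the classifier operates on.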

Relative Radiometric Correction of Cloudy Multitemporal Satellite Imagery

Repeated observation of a given area over time yields potential for many forms of change detection analysis. Such repeated observations are confounded in terms of radiometric consistency by changes in sensor calibration over time, differences in illumination and observation angles, and variation in atmospheric effects. This paper demonstrates the applicability of an empirical relative radiometric normalization method to a set of multitemporal cloudy images acquired by the Resourcesat-1 LISS-III sensor. The objective of this study is to detect and remove cloud cover and normalize the images radiometrically. Cloud detection is achieved using the Average Brightness Threshold (ABT) algorithm. The detected cloud is removed and replaced with data from other images of the same area. After cloud removal, the proposed normalization method is applied to reduce the radiometric influence of non-surface factors. This process identifies landscape elements whose reflectance values are nearly constant over time, i.e., the subset of non-changing pixels, using a frequency-based correlation technique. The quality of the radiometric normalization is statistically assessed by the R2 value and the mean square error (MSE) between each pair of analogous bands.
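Empirical relative normalization of this kind is typically a per-band linear mapping (gain and offset) fitted over the non-changing pixels, then scored with R2 and MSE. The snippet below sketches that step with synthetic digital numbers; the pixel values are invented, and the paper's frequency-based selection of invariant pixels is replaced by an assumed pre-selected set.

```python
# Sketch of linear relative radiometric normalization over pseudo-invariant
# pixels: fit reference = a * subject + b, then score with MSE and R2.
# All pixel values are synthetic illustrations.
import numpy as np

ref = np.array([30., 60., 90., 120., 150.])    # reference-date DNs (invariant pixels)
sub = 0.8 * ref + 5 + np.array([0.5, -0.3, 0.2, -0.4, 0.1])  # subject-date DNs

a, b = np.polyfit(sub, ref, 1)                 # least-squares gain and offset
normalized = a * sub + b                       # subject band mapped to reference scale

mse = np.mean((normalized - ref) ** 2)
r2 = 1 - np.sum((normalized - ref) ** 2) / np.sum((ref - ref.mean()) ** 2)
```

A high R2 and low MSE over the invariant pixels indicate the two dates are now radiometrically comparable, which is exactly the quality check the abstract describes.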

Building an Inferential Model between Caregivers and Patients by using RFID

Nosocomial (i.e., hospital-acquired) infections (NIs) are a major cause of morbidity and mortality in hospitals. The NI rate is higher in intensive care units (ICUs) than in the general ward because ICU patients have severe symptoms and poor immunity and undergo many invasive therapies. Contact behavior between health caregivers and patients is one of the infection factors, and it is difficult to obtain complete contact records by the traditional method of retrospective analysis of medical records. This paper establishes a contact history inferential model (CHIM) intended to extend the use of proximity sensing with radio frequency identification (RFID) technology to translate all proximity events between health caregivers and patients into clinical events (close-in events, contact events and invasive events). The results of the study indicate that the CHIM can infer close-in events and contact events from proximity care activities. The infection control team could then redesign and build an optimal workflow in the ICU according to the patient-specific contact history provided by our automatic tracing system.

Visualization of Sediment Thickness Variation for Sea Bed Logging using Spline Interpolation

This paper discusses the use of spline interpolation and the mean square error (MSE) as tools to process data acquired from a developed simulator that replicates the seabed logging environment. Seabed logging (SBL) is a new technique that uses the marine controlled-source electromagnetic (CSEM) sounding technique and has proven very successful in detecting and characterizing hydrocarbon reservoirs in deep-water areas by using resistivity contrasts. It uses very low frequencies of 0.1 Hz to 10 Hz to obtain greater wavelengths. In this work, the in-house-built simulator was provided with predefined parameters, and the transmitted frequency was varied for sediment thicknesses of 1000 m to 4000 m for environments with and without hydrocarbon. From a series of simulations, synthetic data were generated. These data were interpolated using a spline interpolation technique (degree three), and the MSE was calculated between the original and interpolated data. Comparisons were made by studying the trends and the relationship between frequency and sediment thickness based on the calculated MSE. The MSE showed an increasing trend in setups with hydrocarbon present compared to those without. The MSE also showed a decreasing trend as sediment thickness increased and as transmitted frequency rose.
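The interpolate-then-score workflow can be sketched in a few lines: fit a degree-three spline through coarse samples, evaluate it on a finer grid, and compute an MSE. The sediment-thickness and field values below are invented stand-ins for the simulator's output.

```python
# Sketch of cubic-spline interpolation of coarse synthetic "seabed" data
# followed by an MSE check; the data values are assumptions, not the
# simulator's output.
import numpy as np
from scipy.interpolate import CubicSpline

depth = np.array([1000., 2000., 3000., 4000.])  # sediment thickness, m
field = np.array([1.8, 1.2, 0.9, 0.7])          # assumed received-field magnitudes

cs = CubicSpline(depth, field)                  # degree-3 spline through the samples
fine = np.linspace(1000, 4000, 31)
interp = cs(fine)                               # densified curve for trend analysis

# MSE between the original samples and the spline at the sample points
# (zero up to round-off, since a spline interpolates its knots exactly).
mse = np.mean((cs(depth) - field) ** 2)
```

In the paper's setting the MSE is instead computed between original and interpolated series over the whole sweep, and its trend with frequency and thickness carries the hydrocarbon signature.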

Differentiation of Cancerous Prostate tissue from Non-Cancerous Prostate tissue by using Elastic Light Single-Scattering Spectroscopy: A Feasibility Study

An elastic light single-scattering spectroscopy (ELSSS) system with a single optical fiber probe was employed to differentiate cancerous from non-cancerous prostate tissue ex vivo immediately after radical prostatectomy. First, ELSSS spectra were acquired from cancerous prostate tissue to define its spectral features. Then, spectra were acquired from normal prostate tissue to define the differences in spectral features between the cancerous and normal tissues. A total of 66 tissue samples from nine patients were evaluated with the ELSSS system. Comparing histopathology results and ELSSS measurements revealed that, in the wavelength range from 450 to 750 nm, the sign of the spectral slope is negative for cancerous prostate tissue and positive for non-cancerous tissue. Based on the correlation between histopathology results and the sign of the spectral slope, the ELSSS system differentiates cancerous from non-cancerous prostate tissue with a sensitivity of 0.95 and a specificity of 0.94.
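The slope-sign criterion reduces to fitting a line to each spectrum over the 450-750 nm window and reading off the sign. A minimal sketch with synthetic spectra (not measured ELSSS data) follows.

```python
# Sketch of the slope-sign criterion: least-squares line over 450-750 nm,
# classification by the sign of the slope. Spectra are synthetic.
import numpy as np

def slope_sign(wavelength_nm, intensity):
    """Sign of the fitted spectral slope within the 450-750 nm band."""
    mask = (wavelength_nm >= 450) & (wavelength_nm <= 750)
    slope, _ = np.polyfit(wavelength_nm[mask], intensity[mask], 1)
    return np.sign(slope)

wl = np.linspace(400, 800, 100)
cancer_like = 2.0 - 0.002 * wl   # decreasing spectrum: negative slope
normal_like = 0.5 + 0.002 * wl   # increasing spectrum: positive slope

neg = slope_sign(wl, cancer_like)
pos = slope_sign(wl, normal_like)
```

A negative sign flags the cancer-like spectral shape and a positive sign the normal-like one, mirroring the decision rule the abstract reports.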

Detection and Correction of Ectopic Beats for HRV Analysis Applying Discrete Wavelet Transforms

The clinical usefulness of heart rate variability is limited by the range of Holter monitoring software available. These software algorithms require a normal sinus rhythm to accurately acquire heart rate variability (HRV) measures in the frequency domain. Premature ventricular contractions (PVCs), more commonly referred to as ectopic beats, are frequent in heart failure; they hinder this analysis and introduce ambiguity. This investigation demonstrates an algorithm to automatically detect ectopic beats by analyzing discrete wavelet transform coefficients. Two techniques for filtering and replacing ectopic beats in the RR signal are compared: one applies wavelet hard thresholding and the other applies linear interpolation to replace ectopic cycles. The results demonstrate, through simulation and signals acquired from a 24-hour ambulatory recorder, that these techniques can accurately detect PVCs and remove the noise and leakage effects produced by ectopic cycles, retaining smooth spectra with minimal error.
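The linear-interpolation replacement strategy can be illustrated compactly. In this sketch a simple running-median deviation test stands in for the paper's wavelet-coefficient analysis (the thresholds and RR values are assumptions): flagged beats are dropped and rebuilt from their neighbours.

```python
# Simplified sketch: flag RR intervals that deviate strongly from the median
# (a stand-in for the paper's DWT-coefficient test) and replace them by
# linear interpolation. Threshold and RR values are assumptions.
import numpy as np

def correct_ectopic(rr, tol=0.3):
    rr = np.asarray(rr, dtype=float)
    med = np.median(rr)
    bad = np.abs(rr - med) > tol * med      # ectopic beat + compensatory pause
    good = ~bad
    idx = np.arange(len(rr))
    fixed = rr.copy()
    fixed[bad] = np.interp(idx[bad], idx[good], rr[good])  # linear replacement
    return fixed, bad

rr = [0.80, 0.82, 0.45, 1.20, 0.81, 0.79]   # PVC: short beat then long pause
fixed, bad = correct_ectopic(rr)
```

The short ectopic interval and its compensatory pause are both flagged and replaced by values interpolated between the surrounding normal beats, which is what keeps the downstream spectrum free of leakage.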

The Experiences of South-African High-School Girls in a Fab Lab Environment

This paper reports on an effort to address the issue of inequality in girls' and women's access to science, engineering and technology (SET) education and careers by raising awareness of SET among secondary school girls in South Africa. The girls participated in the hands-on, high-tech rapid prototyping environment of a fabrication laboratory, aimed at stimulating creativity and innovation, as part of a Fab Kids initiative. The Fab Kids intervention is about creating a SET pipeline as part of the Young Engineers and Scientists of Africa Initiative. The methodology was based on a real-world situation and a hands-on approach. In the process, participants acquired a number of skills, including computer-aided design, research, communication, teamwork, technical drawing, writing and problem-solving skills. Exposure to technology enhanced the girls' confidence in being able to handle technology-related tasks.

Land Surface Temperature and Biophysical Factors in Urban Planning

Land surface temperature (LST) is an important parameter in urban climate studies, and understanding the influence of biophysical factors could improve the modeling of the urban thermal landscape. It is well established that climate holds great influence over the urban landscape; however, it is recognized that climate has a low priority in the urban planning process because of the complex nature of its influence. This study focuses on a relatively cloud-free Landsat Thematic Mapper image of the study area acquired on 2 March 2006. Correlation analyses were conducted to identify the relationship of LST to the biophysical factors (vegetation indices, impervious surface and albedo) and to investigate the variation of LST. We suggest that the results can be considered by stakeholders during the decision-making process to create a cooler, more comfortable environment in the urban landscape for city dwellers.
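The correlation step amounts to computing Pearson coefficients between per-pixel LST and each biophysical factor. The snippet below illustrates it with a handful of synthetic pixel values (not the study's data): LST typically correlates negatively with vegetation and positively with impervious surface.

```python
# Sketch of the LST-vs-biophysical-factor correlation step; the pixel
# values are synthetic illustrations, not the Landsat-derived data.
import numpy as np

lst = np.array([34.1, 31.5, 29.8, 27.2, 25.0])    # surface temperature, deg C
ndvi = np.array([0.10, 0.25, 0.40, 0.55, 0.70])   # vegetation index
impervious = np.array([0.9, 0.7, 0.5, 0.3, 0.1])  # impervious-surface fraction

r_ndvi = np.corrcoef(lst, ndvi)[0, 1]             # expected strongly negative
r_imp = np.corrcoef(lst, impervious)[0, 1]        # expected strongly positive
```

The signs of these coefficients are what planners would read: more vegetation cools the surface, more sealed surface warms it.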

The Water Level Detection Algorithm Using the Accumulated Histogram with Band Pass Filter

In this paper, we propose a robust water level detection method based on an accumulated histogram for the slowly changing images acquired from a water level surveillance camera. A general surveillance system detects and recognizes intrusion by searching for regions of large change across sequential images. In a water level detection system, however, these general techniques are not suitable because changes on the water surface are small. The algorithm therefore introduces an accumulated histogram that emphasizes changes of the water surface across sequential images. The accumulated histogram is based on the current image frame and cumulates the differences between previous images and the current image. Since such differences also appear in the land region, a band-pass filter is used to remove this noise from the accumulated histogram. Finally, the algorithm clearly separates the water and land regions. The algorithm then converts the water level in image space to the real water level in real space using a calibration table. The detected water level is sent to the host computer together with the current image. To evaluate the proposed algorithm, we use test images from various situations.
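The core accumulation idea can be shown with a toy example: summing per-row absolute frame differences over time makes the rippling water rows stand out against the static land rows. The frame contents below are synthetic, and the paper's band-pass filtering step is omitted for brevity.

```python
# Toy sketch of the accumulated-difference idea: row-wise sums of absolute
# frame differences over time highlight the water region. Frames are
# synthetic; the band-pass filtering step is omitted.
import numpy as np

def accumulate_rows(frames):
    """Row-wise accumulated absolute differences between consecutive frames."""
    acc = np.zeros(frames[0].shape[0])
    for prev, cur in zip(frames, frames[1:]):
        acc += np.abs(cur.astype(float) - prev).sum(axis=1)
    return acc

# 8x8 frames: water occupies the bottom three rows and flickers each frame.
rng = np.random.default_rng(0)
frames = []
for _ in range(10):
    f = np.zeros((8, 8))
    f[5:, :] = 100 + rng.normal(0, 2, (3, 8))   # "water" rows with ripple noise
    frames.append(f)

acc = accumulate_rows(frames)
water_rows = acc > acc.mean()                   # rows with above-mean change
```

The boundary between the False (land) and True (water) rows is the image-space water line that the calibration table would then map to a real water level.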

Hand Vein Image Enhancement With Radon Like Features Descriptor

Hand vein recognition has recently attracted increasing attention in biometric identification systems. Generally, hand vein images are acquired with low contrast and irregular illumination; with good preprocessing, however, features can be extracted easily even with simple binarization. In this paper, an approach is proposed to improve the quality of hand vein images. First, a brief survey of existing enhancement methods is given. Then a Radon-like features method is applied to preprocess the hand vein image. Finally, experimental results show that the proposed method is more effective and reliable in improving hand vein images.

Bode Stability Analysis for Single Wall Carbon Nanotube Interconnects Used in 3D-VLSI Circuits

Bode stability analysis based on transmission line modeling (TLM) for single-wall carbon nanotube (SWCNT) interconnects used in 3D-VLSI circuits is investigated for the first time. In this analysis, the dependence of the degree of relative stability of SWCNT interconnects on the geometry of each tube is obtained. It is shown that increasing the length and diameter of each tube makes SWCNT interconnects more stable.
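The "degree of relative stability" in a Bode analysis is typically read off as a phase margin at the gain crossover. As a hedged illustration only, the sketch below runs that computation on an assumed second-order RLC transfer function standing in for the TLM interconnect model; the element values are arbitrary, not SWCNT parameters.

```python
# Bode-style relative-stability check on an assumed RLC stand-in for the
# interconnect model; element values are illustrative, not SWCNT data.
import numpy as np

R, L, C = 10.0, 1e-6, 1e-9                  # assumed lumped element values
w = np.logspace(4, 9, 2000)                 # frequency sweep, rad/s
s = 1j * w
H = 1 / (L * C * s**2 + R * C * s + 1)      # series-RLC low-pass response

mag_db = 20 * np.log10(np.abs(H))
phase_deg = np.degrees(np.unwrap(np.angle(H)))

# Gain crossover: first frequency where the magnitude drops below 0 dB;
# the phase margin is 180 degrees plus the phase at that frequency.
idx = np.argmax(mag_db < 0)
phase_margin = 180 + phase_deg[idx]
```

A larger phase margin means greater relative stability; in the paper's setting this margin is tracked as tube length and diameter vary.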

Enhanced-Delivery Overlay Multicasting Scheme by Optimizing Bandwidth and Latency Discrepancy Ratios

With optimized bandwidth and latency discrepancy ratios, Node Gain Scores (NGSs) are determined and used as a basis for shaping the max-heap overlay. The NGSs, determined as the respective bandwidth-latency products, govern the construction of max-heap-form overlays. Each NGS is earned as a synergy of the discrepancy ratio of the bandwidth requested with respect to the estimated available bandwidth, and the latency discrepancy ratio between the node and the source node. The tree leads to enhanced-delivery overlay multicasting, increasing packet delivery that could otherwise be hindered by the packet loss induced in schemes that do not consider the synergy of these parameters when placing nodes on the overlays. The NGS is a function of four main parameters: the estimated available bandwidth, Ba; the individual node's requested bandwidth, Br; the proposed node latency to its prospective parent, Lp; and the suggested best latency as advised by the source node, Lb. The bandwidth discrepancy ratio (BDR) and latency discrepancy ratio (LDR) carry weights of α and (1,000 - α) respectively, with α arbitrarily chosen between 0 and 1,000 to ensure that the NGS values, used as node IDs, maintain a good possibility of uniqueness and a balance between the BDR and the LDR. A max-heap-form tree is constructed under the assumption that all nodes possess an NGS less than that of the source node. To maintain load balance, the children of each level's siblings are evenly distributed: a node cannot accept a second child until all of its siblings able to do so have already acquired the same number of children, proceeding logically from left to right in the conceptual overlay tree. Records of the pair-wise approximate available bandwidths, as measured by a pathChirp scheme at individual nodes, are maintained.
Evaluations comparing the scheme with others - Bandwidth Aware multicaSt architecturE (BASE), Tree Building Control Protocol (TBCP) and Host Multicast Tree Protocol (HMTP) - have been conducted. The new scheme generally performs better in terms of the trade-off between packet delivery ratio, link stress, control overhead and end-to-end delays.
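The NGS computation and max-heap ordering described above can be sketched as follows. The exact direction of each discrepancy ratio, the α value and the node figures are assumptions for illustration; Python's min-heap is turned into a max-heap by negating the score.

```python
# Sketch of Node Gain Scores and max-heap ordering; alpha, the ratio
# directions and all node figures are illustrative assumptions.
import heapq

ALPHA = 700  # assumed weight in [0, 1000] balancing BDR against LDR

def ngs(ba, br, lp, lb):
    """Node Gain Score from bandwidth and latency discrepancy ratios."""
    bdr = ba / br              # headroom: estimated available vs requested
    ldr = lb / lp              # advised best latency vs proposed latency
    return ALPHA * bdr + (1000 - ALPHA) * ldr

nodes = {"n1": (8.0, 4.0, 50.0, 20.0),   # (Ba, Br, Lp, Lb)
         "n2": (6.0, 2.0, 40.0, 30.0),
         "n3": (3.0, 3.0, 80.0, 20.0)}

# heapq is a min-heap, so negate the score for max-heap behaviour.
heap = [(-ngs(*v), k) for k, v in nodes.items()]
heapq.heapify(heap)
best = heapq.heappop(heap)[1]            # highest-gain node surfaces first
```

Popping the heap yields the node with the most bandwidth headroom and best latency, i.e. the node the overlay would place closest to the source.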

EHW from Consumer Point of View: Consumer-Triggered Evolution

Evolvable Hardware (EHW) has been regarded as an adaptive system suited to a wide application market. The consumer market for any good requires diversity to satisfy consumers' preferences, and the adaptability of EHW is a key technology that could provide an individual approach to every particular user. This situation raises a question: how should the target for the evolutionary algorithm be set? Existing techniques do not allow the consumer to influence the evolutionary process; at present, only the designer can influence the evolution. The proposed consumer-triggered evolution overcomes this problem by introducing new features to EHW that help the adaptive system obtain targets during the consumer stage. A classification of EHW is given according to responsiveness, imitation of human behavior and target circuit response. A home intelligent water heating system is considered as an example.

Pilot-scale Study of Horizontal Anaerobic Digester for Biogas Production using Food Waste

A horizontal anaerobic digester was developed and tested at pilot scale for Korean food waste with high water content (>80%). Hydrogen sulfide in the biogas was removed by biological desulfurization equipment integrated into the horizontal digester. The mixer of the horizontal digester was designed to easily remove sediment at the bottom and scum layers on the surface of the digester. Experimental results over 120 days of operation of the pilot plant showed that a high removal efficiency of 81.2% for organic substances and high stability throughout the operation period were achieved. Food waste was also treated at high organic loading rates of over 4 kg VS/m³·day, and a methane gas production rate of 0.62 m³/kg VS removed was accomplished. The biological desulfurization equipment inside the horizontal digester proved to be an economic and effective way to reduce biogas desulfurization costs, removing more than 90% of the hydrogen sulfide without external desulfurization equipment.

Risk Evaluation of Information Technology Projects Based on Fuzzy Analytic Hierarchy Process

Information technology (IT) projects are always accompanied by various risks, and because of the high failure rate of such projects, managing risks to neutralize or at least reduce their effects on project success is essential. In this paper, the fuzzy analytic hierarchy process (FAHP) is exploited as a risk evaluation methodology to prioritize and organize the risk factors faced in IT projects. A real case, the design and implementation of an integrated information system at a vehicle-manufacturing company in Iran, is studied. The related risk factors are identified, and expert qualitative judgments about these factors are acquired. These judgments are translated into fuzzy numbers and used as input to FAHP, which then ranks and prioritizes the risk factors, making project managers aware of the more important risks and enabling them to adopt suitable measures against these highly damaging risks.
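A common ingredient of FAHP is representing expert judgments as triangular fuzzy numbers (l, m, u) and defuzzifying them to rank alternatives. The toy sketch below averages fuzzy scores per risk and ranks by the centroid (l+m+u)/3; a full FAHP uses pairwise comparison matrices and extent analysis, and all numbers here are invented.

```python
# Tiny sketch of ranking risks from triangular fuzzy judgments (l, m, u);
# a full FAHP with pairwise matrices is more involved. Numbers invented.
judgments = {
    "schedule_slip":  [(2, 3, 4), (3, 4, 5)],
    "scope_creep":    [(4, 5, 6), (3, 5, 7)],
    "staff_turnover": [(1, 2, 3), (2, 3, 4)],
}

def centroid(tfn):
    """Defuzzify a triangular fuzzy number by its centroid."""
    l, m, u = tfn
    return (l + m + u) / 3.0

def fuzzy_mean(tfns):
    """Component-wise mean of several triangular fuzzy numbers."""
    n = len(tfns)
    return tuple(sum(t[i] for t in tfns) / n for i in range(3))

ranking = sorted(judgments,
                 key=lambda r: centroid(fuzzy_mean(judgments[r])),
                 reverse=True)
```

The resulting ordered list is the kind of prioritized risk register the abstract describes handing to project managers.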