Evaluating Performance of Quality-of-Service Routing in Large Networks

The performance and complexity of QoS routing depend on the complex interaction between a large set of parameters. This paper investigates the scaling properties of source-directed link-state routing in large core networks. The simulation results show that the routing algorithm, network topology, and link cost function each have a significant impact on the probability of successfully routing new connections. The experiments confirm and extend the findings of other studies, and also lend new insight into designing efficient quality-of-service routing policies in large networks.
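As an illustration of the class of routing policies being evaluated, the following minimal sketch (not the paper's implementation) prunes links that cannot carry the requested bandwidth and then runs Dijkstra over a load-sensitive link cost function; the graph structure, the particular cost function, and all names are assumptions made for illustration.

```python
import heapq

def qos_route(graph, src, dst, bandwidth_needed):
    """Source-directed QoS routing sketch: discard links whose available
    bandwidth is below the request, then run Dijkstra on a link cost
    function (here 1/available_bandwidth; the paper compares several)."""
    # graph: {node: {neighbour: available_bandwidth}}
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, bw in graph[u].items():
            if bw < bandwidth_needed:          # constraint pruning
                continue
            nd = d + 1.0 / bw                  # cost favours lightly loaded links
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    if dst not in dist:
        return None                            # connection blocked
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

Blocking probability in such a simulation is simply the fraction of connection requests for which this search returns no feasible path.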

Visualisation and Navigation in Large Scale P2P Service Networks

In Peer-to-Peer service networks, where peers offer any kind of publicly available services or applications, intuitive navigation through all services in the network becomes more difficult as the number of services increases. In this article, a concept is discussed that enables users to intuitively browse and use large-scale P2P service networks. The concept extends the idea of creating virtual 3D environments based solely on Peer-to-Peer technologies. Aside from browsing, users should be able to emphasize services of interest using their own semantic criteria, and the appearance of the virtual world should intuitively reflect network properties that may be of interest to the user. Additionally, the concept comprises options for load and traffic balancing. In this article, the requirements concerning the underlying infrastructure and the graphical user interface are defined. First impressions of the appearance of future systems are presented, and the next steps towards a prototypical implementation are discussed.

The Correlation between Peer Aggression and Peer Victimization: Are Aggressors Victims Too?

To investigate the possible correlation between peer aggression and peer victimization, 148 sixth-graders were asked to respond to the Reduced Aggression and Victimization Scales (RAVS). The RAVS measures how frequently respondents report aggressive behaviors or being victimized during the week prior to the survey. The scales are composed of six items each, and each point represents one instance of aggression or victimization. The Pearson Product-Moment Correlation Coefficient (PMCC) was used to determine the correlations between the sixth-graders' scores on the two scales, both for individual items and for total scores. Positive correlations were established, and the correlations were significant at the 0.01 level.
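For readers unfamiliar with the statistic, the sketch below shows how such a correlation is computed; the score values are invented purely for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical weekly RAVS totals for a handful of students (illustrative only)
aggression    = np.array([0, 2, 1, 4, 3, 0, 5, 2])
victimization = np.array([1, 2, 0, 5, 2, 1, 4, 3])

r, p_value = pearsonr(aggression, victimization)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")   # "significant" here means p < 0.01
```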

Diffusion of Mobile Entertainment in Malaysia: Drivers and Barriers

This research aims to examine the key success factors for the diffusion of mobile entertainment services in Malaysia. The drivers and barriers examined include perceived benefit; concerns pertaining to pricing, product and technological standardization, and privacy and security; as well as influences from peers and community. An analysis of a Malaysian survey of 384 respondents aged 18 to 25 shows that subscribers placed greater importance on the perceived benefit of mobile entertainment services than on the other factors. The survey results also show strong positive correlations between all the factors, with the correlation between pricing concerns and perceived benefit being the strongest. This paper aims to provide an extensive study of the drivers and barriers that could be used to derive an architecture for entertainment service provision, serving as a guide for telcos in outlining suitable approaches to encourage mass-market adoption of mobile entertainment services in Malaysia.

A Gnutella-based P2P System Using Cross-Layer Design for MANET

The ubiquitous computing era is expected to arrive soon. A ubiquitous environment has peer-to-peer and nomadic characteristics, which can be represented by peer-to-peer systems and mobile ad-hoc networks (MANETs). The similarity between the features of P2P systems and MANETs makes implementing P2P systems in a MANET environment appealing. It has been shown, however, that P2P systems designed for wired networks do not perform satisfactorily in a mobile ad-hoc environment. Accordingly, this paper proposes a method to improve P2P performance using a cross-layer design and the goodness of a node as a peer. The proposed method uses a routing metric as well as a P2P metric to choose favorable peers to connect to, and it takes a proactive approach to distributing peer information. According to the simulation results, the proposed method provides a higher query success rate, shorter query response time, and lower energy consumption by constructing an efficient overlay network.
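A minimal sketch of the cross-layer idea is shown below: a routing-layer metric and a P2P-layer metric are combined into one score used to pick connection candidates. The specific metrics (hop count, shared files), the weighting, and all field names are assumptions; the paper's exact metrics are not reproduced here.

```python
def rank_peers(candidates, alpha=0.5, k=3):
    """Cross-layer peer ranking sketch: blend a routing metric (hop count,
    lower is better) with a P2P metric (shared resources, higher is better)
    and keep the best k peers."""
    # candidates: list of dicts like {"id": "n7", "hops": 3, "shared_files": 120}
    max_shared = max(c["shared_files"] for c in candidates)

    def score(c):
        route_goodness = 1.0 / (1 + c["hops"])
        p2p_goodness = c["shared_files"] / (1 + max_shared)
        return alpha * route_goodness + (1 - alpha) * p2p_goodness

    return sorted(candidates, key=score, reverse=True)[:k]
```

With a proactive exchange of such peer records, each node can re-rank its neighbours as mobility changes the routing metric.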

Sun, Salon, and Cosmetic Tanning: Predictors and Motives

The appearance management behavior of tanning by gay men is examined through the lens of Impression Formation Theory. The study proposes that body image, self-esteem, and internalized homophobia are connected and affect the motives for engaging in sun, salon, and cosmetic tanning. The motives examined were: to look masculine, to look attractive to (potential) partners, to look attractive in general, to socialize, to meet a peer standard, and for personal satisfaction. Using regression analysis on data from 103 gay men who engage in at least one method of tanning, the results reveal that components of body image and internalized homophobia, but not self-esteem, are linked to various motives and methods of tanning. These findings support and extend the literature on Impression Formation Theory and provide practitioners in health and health-related fields new avenues to pursue when dealing with diseases related to tanning.

A Framework for Scalable Autonomous P2P Resource Discovery for the Grid Implementation

Recently, there have been considerable efforts towards the convergence of P2P and Grid computing in order to reach a solution that takes the best of both worlds by exploiting the advantages each offers. Bringing the peer-to-peer model to the services of the Grid promises to eliminate bottlenecks and ensure greater scalability, availability, and fault tolerance. The Grid Information Service (GIS) directly influences the quality of service of grid platforms. Most of the proposed solutions for decentralizing the GIS are based on completely flat overlays. The main contributions of this paper are the investigation of a novel resource discovery framework for Grid implementations based on a hierarchy of structured peer-to-peer overlay networks, and the introduction of a discovery algorithm that utilizes the proposed framework. The framework's performance is validated via simulation. Experimental results show that the proposed organization has the advantage of being scalable while providing fault isolation, effective bandwidth utilization, and hierarchical access control. In addition, it leads to a reliable, guaranteed sub-linear search that returns results within a bounded interval of time and with a smaller amount of generated traffic within each domain.
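The toy sketch below conveys the hierarchical idea only: each domain answers queries from a local index and escalates misses to a parent overlay, so most traffic stays inside the domain. The class names, the flat dictionaries standing in for structured overlays, and the escalation rule are all illustrative assumptions, not the paper's protocol.

```python
class DomainOverlay:
    """Two-level resource discovery sketch: local index first, parent overlay second."""

    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.index = {}            # resource type -> list of providers (or domains)

    def register(self, resource_type, provider):
        self.index.setdefault(resource_type, []).append(provider)
        if self.parent is not None:                 # advertise the domain upward
            self.parent.index.setdefault(resource_type, []).append(self.name)

    def discover(self, resource_type):
        local = self.index.get(resource_type)
        if local:
            return ("local", local)                 # resolved without leaving the domain
        if self.parent is not None:
            remote = self.parent.index.get(resource_type)
            if remote:
                return ("remote-domains", remote)   # forward the query to these domains
        return ("not-found", [])
```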

PRO-Teaching – Sharing Ideas to Develop Capabilities

In this paper, the action-research-driven design of a context-relevant, developmental peer review of teaching model, its implementation strategy, and its impact at an Australian university are presented. PRO-Teaching realizes an innovative process that triangulates contemporaneous teaching quality data from a range of stakeholders, including students, discipline academics, learning and teaching expert academics, and teacher reflection, to create reliable evidence of teaching quality. Data collected over multiple classroom observations allows objective reporting on development differentials in constructive alignment, peer, and student evaluations. Further innovation is realized in the application of this highly structured developmental process to provide summative evidence of sufficient validity to support claims for professional advancement and learning and teaching awards. Design decision points and contextual triggers are described within the operating domain. Academics and developers seeking to introduce structured peer review of teaching into their organization will find this paper a useful reference.

Hybrid Prefix Adder Architecture for Minimizing the Power Delay Product

Parallel prefix addition is a technique for improving the speed of binary addition. Due to ever-increasing integration density and the growing needs of portable devices, low-power and high-performance designs are of prime importance. The classical parallel prefix adder structures presented in the literature over the years optimize for logic depth, area, fan-out, and interconnect count of logic circuits. In this paper, a new architecture for performing 8-bit, 16-bit, and 32-bit parallel prefix addition is proposed. The proposed prefix adder structures are compared with several classical adders of the same bit width in terms of power, delay, and number of computational nodes. The results reveal that the proposed structures have the lowest power-delay product compared with existing peer prefix adder structures. The Tanner EDA tool was used for simulating the adder designs in the TSMC 180 nm and TSMC 130 nm technologies.
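To make the underlying technique concrete, the sketch below simulates a classical Kogge-Stone parallel prefix adder at the bit level (generate/propagate signals combined over doubling distances). It illustrates parallel prefix addition in general, not the paper's proposed hybrid structure.

```python
def kogge_stone_add(a: int, b: int, width: int = 8) -> int:
    """Bit-level simulation of a Kogge-Stone parallel prefix adder.
    The result wraps modulo 2**width, as in a fixed-width hardware adder."""
    g = [((a >> i) & 1) & ((b >> i) & 1) for i in range(width)]   # generate bits
    p = [((a >> i) & 1) ^ ((b >> i) & 1) for i in range(width)]   # propagate bits
    dist = 1
    while dist < width:                      # log2(width) prefix stages
        g_new, p_new = g[:], p[:]
        for i in range(dist, width):
            g_new[i] = g[i] | (p[i] & g[i - dist])   # group generate (prefix operator)
            p_new[i] = p[i] & p[i - dist]            # group propagate
        g, p = g_new, p_new
        dist *= 2
    carries = [0] + g[:width - 1]            # carry into bit i = group generate of bits 0..i-1
    s = 0
    for i in range(width):
        s |= ((((a >> i) & 1) ^ ((b >> i) & 1)) ^ carries[i]) << i
    return s

assert kogge_stone_add(5, 3) == 8
```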

WLAN Positioning Based on Joint TOA and RSS Characteristics

WLAN positioning has been addressed by many approaches in the literature using the characteristics of Received Signal Strength (RSS), Time of Arrival (TOA) or Time Difference of Arrival (TDOA), Angle of Arrival (AOA), and cell ID. Among these, the RSS approach is the simplest to implement because no modification of either access points or client devices is needed, but its accuracy is poor due to physical environment effects. For the TOA or TDOA approach, the accuracy is quite acceptable, but most research has had to modify either software or hardware in the existing WLAN infrastructure; the scale of modification ranges from the access card alone up to changes in the WLAN protocol. Hence, using TOA or TDOA for a positioning system is unattractive. In this paper, a new concept of merging both RSS and TOA positioning techniques is proposed. In addition, a method to obtain the TOA characteristic for positioning a WLAN user without any extra modification to the existing system is presented. The measurement results confirm that the proposed technique using both RSS and TOA characteristics provides better accuracy than using either the RSS or the TOA approach alone.
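The sketch below shows one simple way the two range estimates can be merged: an RSS range from a log-distance path-loss model, a TOA range from time of flight, a weighted average of the two, and linear least-squares trilateration against known access-point positions. The path-loss parameters, the weighting, and the fusion rule are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

C = 3e8  # speed of light (m/s)

def rss_to_distance(rss_dbm, p0_dbm=-40.0, n=3.0):
    """Log-distance path-loss model: RSS = P0 - 10*n*log10(d)."""
    return 10 ** ((p0_dbm - np.asarray(rss_dbm)) / (10 * n))

def toa_to_distance(toa_s):
    """One-way time of flight converted to distance."""
    return C * np.asarray(toa_s)

def fuse_and_position(anchors, rss_dbm, toa_s, w_rss=0.3):
    """Weighted fusion of RSS- and TOA-based ranges, then least-squares trilateration.
    anchors: (m, 2) access-point coordinates; rss_dbm, toa_s: length-m measurements."""
    d = w_rss * rss_to_distance(rss_dbm) + (1 - w_rss) * toa_to_distance(toa_s)
    anchors = np.asarray(anchors, dtype=float)
    # Linearize ||p - a_i||^2 = d_i^2 against the first anchor
    A = 2 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```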

Fast Dummy Sequence Insertion Method for PAPR Reduction in WiMAX Systems

In the literature, many studies have proposed various methods to reduce the PAPR (Peak-to-Average Power Ratio). Among these, DSI (Dummy Sequence Insertion) is one of the most attractive methods for WiMAX systems because it does not require side information to be transmitted along with the user data. However, the conventional DSI methods find the dummy sequence by performing an iterative procedure until the PAPR falls below a desired threshold. This causes a significant delay in finding the dummy sequence and also affects the overall performance of WiMAX systems. In this paper, a new DSI-based method is proposed that finds the dummy sequence without the need for an iterative procedure. The fast DSI method can reduce the PAPR without either delay or side information. The simulation results confirm that the proposed method achieves PAPR performance similar to the other methods without any delay. In addition, simulations of a WiMAX system with adaptive modulation are carried out to assess the use of the proposed method under various fading schemes. The results suggest that WiMAX designers should adopt a new Signal-to-Noise Ratio (SNR) criterion for adaptation.
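For context, the quantity that any DSI variant tries to drive down is computed as in the sketch below: the oversampled time-domain OFDM symbol's peak power divided by its mean power, in dB. The oversampling factor and zero-padding layout are conventional choices assumed for illustration.

```python
import numpy as np

def papr_db(freq_symbols, oversample=4):
    """Peak-to-average power ratio (dB) of one OFDM symbol given its
    frequency-domain subcarrier values."""
    n = len(freq_symbols)
    padded = np.zeros(n * oversample, dtype=complex)   # zero-pad in the middle
    padded[:n // 2] = freq_symbols[:n // 2]
    padded[-(n - n // 2):] = freq_symbols[n // 2:]
    x = np.fft.ifft(padded) * oversample               # oversampled time-domain signal
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())
```

A conventional DSI scheme would repeatedly try candidate dummy subcarrier values and re-evaluate this metric; the paper's contribution is choosing the dummy sequence directly, without that loop.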

CoSP2P: A Component-Based Service Model for Peer-to-Peer Systems

The increasing complexity of software development based on peer-to-peer networks makes the creation of new frameworks necessary in order to simplify the developer's task. Additionally, some applications, e.g. fire detection or security alarms, may have real-time constraints, and a high-level definition of these features eases application development. In this paper, a service model based on a component model with real-time features is proposed. The high-level model abstracts developers from implementation tasks such as discovery, communication, security, and real-time requirements. The model is oriented towards deploying services on small mobile devices, such as sensors, mobile phones, and PDAs, where computation is lightweight. Services can be composed with one another by means of the port concept to form complex ad-hoc systems, and their implementation is carried out using a component language called UM-RTCOM. To apply our proposals, a fire detection application is described.
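A rough Python sketch of the port-based composition idea follows: components expose named ports, carry a soft real-time attribute, and are wired together to form a service. This is loosely in the spirit of the described model, not actual UM-RTCOM syntax; all class and field names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Port:
    """Named connection point; components are composed by wiring ports together."""
    name: str
    peer: "Port" = None

    def connect(self, other: "Port"):
        self.peer, other.peer = other, self

@dataclass
class Component:
    """Illustrative service component with a soft real-time deadline attribute."""
    name: str
    deadline_ms: int
    ports: dict = field(default_factory=dict)

    def add_port(self, name: str) -> Port:
        port = Port(name)
        self.ports[name] = port
        return port

# Compose a tiny fire-detection service out of two components
sensor = Component("SmokeSensor", deadline_ms=50)
alarm = Component("Alarm", deadline_ms=20)
sensor.add_port("out").connect(alarm.add_port("in"))
```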

Application of a Systemic Soft Domain-Driven Design Framework

This paper proposes a "soft systems" approach to domain-driven design of computer-based information systems. We propose a systemic framework combining techniques from Soft Systems Methodology (SSM), the Unified Modelling Language (UML), and an implementation pattern known as "Naked Objects". We have used this framework in action research projects that have involved the investigation and modelling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. Within the proposed framework, Soft Systems Methodology (SSM) is used as a guiding methodology to explore the problem situation and to generate a ubiquitous language (soft language) which can be used as the basis for developing an object-oriented domain model. The domain model is further developed using techniques based on the UML and is implemented in software following the "Naked Objects" implementation pattern. We argue that there are advantages to combining and using techniques from different methodologies in this way. The proposed systemic framework is overviewed and justified as a multimethodology using Mingers' multimethodology ideas. This multimethodology approach is being evaluated through a series of action research projects based on real-world case studies. A peer-tutoring case study is presented here as a sample of the framework evaluation process.

Enhancing the Peer-To-Peer Architecture with a Roaming Service and OWL

This paper addresses the problem of building a unified structure to describe a peer-to-peer system. Our approach uses the well-known notations in the P2P area and provides a global architecture that separates the platform-specific characteristics from the logical ones. In order to enable the navigation of a peer across platforms, a roaming layer is added. This layer provides the ability to define a unique identification for each peer and assures the mapping between this identification and those used in each platform; the mapping task is handled by a special wrapper. In addition, an ontology is proposed to give a clear presentation of the structure of the P2P system without concern for the content and the resources managed by the peer. The ontology is created according to the semantic web paradigm using the OWL language, so the structure of the system is itself treated as a web resource.

Enhancing the Connectedness in Ad-hoc Mesh Networks using the Terranet Technology

This paper simulates ad-hoc mesh networks in rural areas, where such networks receive great attention because installing the infrastructure for regular networks in these areas is prohibitively expensive. The distance between communicating nodes is the main obstacle that an ad-hoc mesh network faces. In Terranet technology, for example, two nodes can communicate directly only if they are within one kilometer of each other; if the distance between them is greater, intermediate nodes in the ad-hoc mesh network have to act as routers that forward the data they receive to other nodes. In this paper, we try to find the critical number of nodes that makes the network fully connected in a particular area, and then propose a method to encourage intermediate nodes to accept the role of router and forward data from the sender to the receiver. Much work has been done on technological changes to peer-to-peer networks, but the focus of this paper is on another aspect: finding the minimum number of nodes needed for a particular area to be fully connected, and encouraging users to switch on their phones and agree to act as routers for other nodes. Our method raises the rate of successful calls to 81.5% of attempted calls.
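The "critical number of nodes" question maps onto connectivity of a random geometric graph: place nodes uniformly in the area, link any pair within the 1 km range, and test whether the graph is one connected component. The Monte Carlo sketch below illustrates this; the area size, trial count, and 95% connectivity target are illustrative assumptions, not the paper's parameters.

```python
import random

def is_fully_connected(positions, radius_km=1.0):
    """True if nodes form one connected component when links exist only
    between nodes within radius_km of each other."""
    n = len(positions)
    adj = [[j for j in range(n) if j != i and
            ((positions[i][0] - positions[j][0]) ** 2 +
             (positions[i][1] - positions[j][1]) ** 2) ** 0.5 <= radius_km]
           for i in range(n)]
    seen, stack = {0}, [0]
    while stack:                      # depth-first search from node 0
        u = stack.pop()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return len(seen) == n

def critical_node_count(area_km=3.0, trials=100, target=0.95):
    """Smallest n for which random placements in an area_km x area_km square
    are fully connected in at least `target` of the trials."""
    n = 2
    while True:
        ok = sum(is_fully_connected(
                    [(random.uniform(0, area_km), random.uniform(0, area_km))
                     for _ in range(n)])
                 for _ in range(trials))
        if ok / trials >= target:
            return n
        n += 1
```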

A P2P File Sharing Technique by Indexed-Priority Metric

Recent improvements in computer processing performance and in high-speed optical-fiber communication have greatly increased the amount of data processed by computers and carried over networks. In a client-server system, however, the server receives and processes the data from all clients over the network, so the load on the server keeps increasing; this calls for a server with high processing ability and a line with high bandwidth. In this paper, to relieve the load on a specific server by means of P2P networks, a criterion called the Indexed-Priority Metric is proposed and its performance is evaluated. The proposed metric allocates files to each node so that the load of a specific server is distributed evenly among the nodes. A P2P file sharing system using the proposed metric is implemented. Simulation results show that the proposed metric can distribute the files held by a specific server across the nodes.
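The sketch below illustrates the general idea of metric-driven file allocation for load balancing: files are assigned in order of expected demand, always to the currently least-loaded node. This greedy reading of the metric is a hypothetical stand-in; the paper's exact definition of the Indexed-Priority Metric may differ.

```python
import heapq

def allocate_files(files, nodes):
    """Greedy load-balancing sketch: assign the most-requested files first,
    each to the currently least-loaded node.
    files: list of (file_id, expected_requests); nodes: list of node ids."""
    heap = [(0.0, node, []) for node in nodes]          # (load, node, assigned files)
    heapq.heapify(heap)
    for file_id, demand in sorted(files, key=lambda f: -f[1]):
        load, node, assigned = heapq.heappop(heap)
        assigned.append(file_id)
        heapq.heappush(heap, (load + demand, node, assigned))
    return {node: assigned for _, node, assigned in heap}
```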

Comparing Academically Gifted and Non-Gifted Students' Supportive Environments in Jordan

Jordan has exerted many efforts to nurture its academically gifted students in special schools since 2001. During the nine years since these schools were launched, their learning and excellence environments have been believed to be distinguished compared to public schools. This study investigated the environments of gifted students compared with non-gifted students, using a survey instrument that measures the dimensions of family, peers, teachers, school support, society, and resources, dimensions rooted deeply in supporting gifted education, learning, and achievement. A total of 109 students were selected from excellence schools for academically gifted students, and 119 non-gifted students were selected from public schools. Around 8.3% of the non-gifted students reported that they "Never" received any support from their surrounding environments, 14.9% reported "Seldom" support, 23.7% reported "Often" support, 26.0% reported "Frequent" support, and 32.8% reported "Very frequent" support. The gifted students reported "Never" receiving support more often than the non-gifted did, at 11.3%, "Seldom" support at 15.4%, "Often" support at 26.6%, and "Frequent" support at 29.0%, and reported "Very frequent" support less often than the non-gifted students, at 23.6%. Unexpectedly, statistical differences were found between the two groups favoring non-gifted students in the perception of their surrounding environments in specific dimensions, namely school support, teachers, and society. No statistical differences were found in the other dimensions of the survey, namely family, peers, and resources. As the differences were found in teachers, school support, and society, the nurturing environments of the excellence schools need to be revised to adopt more creative teaching styles, richer school atmosphere and infrastructure, interactive guidance for students and their parents, promotion of the excellence environments, and rebuilt identification models. Thus, families, schools, and society should increase their cooperation, communication, and awareness of supportive environments for the gifted. More studies investigating other aspects of promoting academic giftedness and excellence are recommended.

GridNtru: High Performance PKCS

Cryptographic algorithms play a crucial role in the information society by providing protection from unauthorized access to sensitive data. It is clear that information technology will become increasingly pervasive; hence we can expect the emergence of ubiquitous or pervasive computing and ambient intelligence. These new environments and applications will present new security challenges, and there is no doubt that cryptographic algorithms and protocols will form part of the solution. The efficiency of a public key cryptosystem is mainly measured in computational overhead, key size, and bandwidth. In particular, the RSA algorithm is used in many applications to provide security. Although the security of RSA is beyond doubt, the evolution in computing power has caused a growth in the necessary key length. The fact that most smart-card chips cannot process keys exceeding 1024 bits shows that there is a need for an alternative. NTRU is such an alternative: it is a collection of mathematical algorithms based on manipulating lists of very small integers and polynomials, which allows NTRU to achieve high speeds with minimal computing power. NTRU (Nth-degree Truncated Polynomial Ring Unit) is the first secure public key cryptosystem not based on the factorization or discrete logarithm problem, meaning that even with substantial computational resources and time, an adversary should not be able to break the key. Multi-party communication and the requirement of optimal resource utilization have created a present-day demand for applications that need security enforcement techniques and can be enhanced with high-end computing. This has prompted us to develop high-performance NTRU schemes using approaches such as high-end computing hardware. Peer-to-peer (P2P) or enterprise grids are a proven approach to building high-end computing systems, and by utilizing them one can improve the performance of NTRU through parallel execution. In this paper we propose and develop an application for NTRU using the enterprise grid middleware Alchemi. An analysis and comparison of its performance for various text files is presented.
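The arithmetic core that makes NTRU fast, and the natural unit of work to farm out to grid workers, is convolution ("star") multiplication of polynomials in the ring Z_q[x]/(x^N - 1). The sketch below shows that operation only; how the workload is split across Alchemi executors is not reproduced here.

```python
def convolution_mod(a, b, q):
    """Star multiplication in Z_q[x]/(x^N - 1), the core NTRU operation.
    a and b are coefficient lists of length N with small integer entries."""
    n = len(a)
    c = [0] * n
    for i in range(n):
        if a[i] == 0:              # NTRU polynomials are sparse, so skip zeros
            continue
        for j in range(n):
            c[(i + j) % n] = (c[(i + j) % n] + a[i] * b[j]) % q
    return c
```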

Combination of Different Classifiers for Cardiac Arrhythmia Recognition

This paper describes a new supervised fusion (hybrid) electrocardiogram (ECG) classification solution consisting of a new QRS complex geometrical feature extraction method as well as a new version of the learning vector quantization (LVQ) classification algorithm aimed at overcoming the stability-plasticity dilemma. Toward this objective, after detection and delineation of the major events of the ECG signal via an appropriate algorithm, each QRS region and its corresponding discrete wavelet transform (DWT) are treated as virtual images, and each of them is divided into eight polar sectors. Then, the curve length of each excerpted segment is calculated and used as an element of the feature space. To increase the robustness of the proposed classification algorithm against noise, artifacts, and arrhythmic outliers, a fusion structure consisting of five different classifiers, namely a Support Vector Machine (SVM), a Modified Learning Vector Quantization (MLVQ) classifier, and three Multi-Layer Perceptron-Back Propagation (MLP-BP) neural networks with different topologies, was designed and implemented. The new algorithm was applied to all 48 MIT-BIH Arrhythmia Database records (within-record analysis), and the discrimination power of the classifier in separating the different beat types of each record was assessed; the average accuracy obtained was Acc = 98.51%. The proposed method was also applied to six arrhythmia types (Normal, LBBB, RBBB, PVC, APB, PB) belonging to 20 different records of the aforementioned database (between-record analysis), and an average accuracy of Acc = 95.6% was achieved. To evaluate the performance of the new hybrid learning machine, the obtained results were compared with similar peer-reviewed studies in this area.
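One plausible reading of the polar-sector curve-length feature is sketched below: the QRS excerpt is viewed as a planar curve, samples are binned into eight angular sectors around the curve's centroid, and the curve length falling in each sector becomes one feature. The axis scaling and sector assignment rule are assumptions; the paper's exact geometry may differ.

```python
import numpy as np

def polar_sector_curve_lengths(qrs, n_sectors=8):
    """Curve length of a QRS excerpt accumulated in n_sectors polar sectors
    around the centroid of the (time, amplitude) curve."""
    t = np.linspace(-1.0, 1.0, len(qrs))              # normalized time axis
    x, y = t, np.asarray(qrs, dtype=float)
    cx, cy = x.mean(), y.mean()
    angles = np.arctan2(y - cy, x - cx)               # angle of each sample
    sector = ((angles + np.pi) / (2 * np.pi) * n_sectors).astype(int) % n_sectors
    seg_len = np.hypot(np.diff(x), np.diff(y))        # length of each curve segment
    features = np.zeros(n_sectors)
    for s, l in zip(sector[:-1], seg_len):            # credit each segment to its start sector
        features[s] += l
    return features
```

The same computation applied to the DWT of the excerpt would give the second eight-element feature group described in the abstract.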

Extended Dynamic Source Routing Protocol for the Non Co-Operating Nodes in Mobile Adhoc Networks

In this paper, a new approach based on the extent of friendship between nodes is proposed, which makes the nodes co-operate in an ad hoc environment. The extended DSR protocol is tested under different scenarios by varying the number of malicious nodes and the node moving speed. It is also tested by varying the number of nodes used in the simulation. The results indicate that the throughput achieved by the extended DSR is greater than that of the standard DSR, and that the percentage of malicious drops over total drops is lower for the extended DSR than for the standard DSR.
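A hypothetical sketch of the friendship idea follows: each node scores its neighbours by observed co-operation and prefers cached routes whose next hop is a trusted friend. The scoring rule, thresholds, and class names are illustrative assumptions; the paper's actual update rule for the extent of friendship is not specified here.

```python
class FriendshipTable:
    """Friendship-weighted route selection for an extended-DSR-style node."""

    def __init__(self, reward=0.1, penalty=0.3, threshold=0.4):
        self.scores = {}          # neighbour id -> friendship score in [0, 1]
        self.reward = reward
        self.penalty = penalty
        self.threshold = threshold

    def observe(self, neighbour, forwarded_ok):
        """Raise the score when a neighbour forwards our packet, lower it otherwise."""
        s = self.scores.get(neighbour, 0.5)            # neutral prior for unknown nodes
        s = s + self.reward if forwarded_ok else s - self.penalty
        self.scores[neighbour] = min(1.0, max(0.0, s))

    def best_route(self, routes):
        """Pick the cached route (list of hops) whose first hop is the most trusted friend."""
        usable = [r for r in routes if self.scores.get(r[0], 0.5) >= self.threshold]
        return max(usable, key=lambda r: self.scores.get(r[0], 0.5), default=None)
```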