Laboratory Experimentation for Supporting Collaborative Working in Engineering Education over the Internet

Collaborative working environments for distance education can be considered a more generic form of contemporary remote labs. At present, the majority of existing real laboratories are not constructed to allow the involved participants to collaborate in real time. To make this revolutionary learning environment possible, we must allow different users to carry out an experiment simultaneously. In recent times, multi-user environments have been successfully applied in many areas such as air traffic control systems, team-oriented military systems, text chat tools, and multi-player games. Thus, understanding the ideas and techniques behind these systems can contribute valuable ideas to our e-learning environment for collaborative working. In this investigation, collaborative working environments are considered from theoretical and practical perspectives in order to build an effective collaborative real laboratory that allows two or more students to conduct remote experiments at the same time as a team. To achieve this goal, we have implemented a distributed system architecture that enables students to obtain help from either a human tutor or a rule-based e-tutor.

An Advanced Time-Frequency Domain Method for PD Extraction with Non-Intrusive Measurement

Partial discharge (PD) detection is an important method for evaluating the insulation condition of metal-clad apparatus. Non-intrusive sensors, which are easy to install and do not interrupt operation, are preferred in on-site PD detection. However, such detection often lacks accuracy due to interference in the PD signals. In this paper, a novel PD extraction method that combines frequency analysis and entropy-based time-frequency (TF) analysis is introduced. The repetitive pulses from the converter are first removed via frequency analysis. Then, the relative entropy and relative peak frequency of each pulse (i.e., its time-indexed TF spectrum vector) are calculated, and all pulses with similar parameters are grouped. According to the characteristics of the non-intrusive sensor and the frequency distribution of PDs, the pulses of PD and interference are separated. Finally, the PD signal and the interference are recovered via the inverse TF transform. The de-noised result on noisy PD data demonstrates that the combination of frequency and time-frequency techniques can discriminate PDs from interference with various frequency distributions.
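
The grouping step can be pictured with a minimal sketch: compute a spectral-entropy and peak-frequency pair for each detected pulse, then cluster pulses with similar parameters. The window length, the k-means grouping and the synthetic pulses below are illustrative assumptions, not the paper's exact procedure.

```python
# Minimal sketch: group pulses by spectral entropy and peak frequency.
# Pulse waveforms, window length and the number of groups are illustrative.
import numpy as np
from scipy.signal import stft
from sklearn.cluster import KMeans

def pulse_features(pulse, fs):
    """Relative spectral entropy and peak-frequency bin of one pulse."""
    _, _, Z = stft(pulse, fs=fs, nperseg=64)
    p = np.abs(Z) ** 2
    p /= p.sum()                                   # normalised TF energy distribution
    entropy = -np.sum(p * np.log2(p + 1e-12))
    peak_bin = np.unravel_index(np.argmax(np.abs(Z)), Z.shape)[0]
    return entropy, peak_bin

def group_pulses(pulses, fs, n_groups=3):
    """Cluster pulses with similar (entropy, peak-frequency) parameters."""
    feats = np.array([pulse_features(p, fs) for p in pulses])
    return KMeans(n_clusters=n_groups, n_init=10).fit_predict(feats)

# usage with synthetic pulses
fs = 1e6
pulses = [np.random.randn(256) * np.hanning(256) for _ in range(20)]
labels = group_pulses(pulses, fs)
```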

Designing a Framework for Network Security Protection

As the Internet continues to grow at a rapid pace as the primary medium for communications and commerce, and as telecommunication networks and systems continue to expand their global reach, digital information has become the most popular and important information resource, and our dependence upon the underlying cyber infrastructure has increased significantly. Unfortunately, as our dependency has grown, so has the threat to the cyber infrastructure from spammers, attackers and criminal enterprises. In this paper, we propose a new machine learning based network intrusion detection framework for cyber security. The detection process of the framework consists of two stages: model construction and intrusion detection. In the model construction stage, a semi-supervised machine learning algorithm is applied to a collected set of network audit data to generate a profile of normal network behavior. In the intrusion detection stage, input network events are analyzed and compared with the patterns gathered in the profile, and events are flagged as anomalies if they are sufficiently far from the expected normal behavior. The proposed framework is particularly applicable to situations where only a small amount of labeled network training data is available, which is very typical in real-world network environments.
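
As a rough illustration of the two-stage flow, the sketch below builds a simple per-feature profile of normal traffic and flags events whose normalised deviation exceeds a threshold. The centroid/spread profile and the threshold are stand-in assumptions; the paper's semi-supervised algorithm itself is not reproduced here.

```python
# Illustrative two-stage flow: profile construction, then anomaly flagging.
import numpy as np

def build_profile(normal_events):
    """Summarise normal behaviour as per-feature centre and spread."""
    X = np.asarray(normal_events, dtype=float)
    return X.mean(axis=0), X.std(axis=0) + 1e-9

def detect(events, profile, threshold=3.0):
    """Flag events whose normalised deviation from the profile is large."""
    centre, spread = profile
    z = np.abs((np.asarray(events, dtype=float) - centre) / spread)
    return z.max(axis=1) > threshold               # True -> flagged as anomaly

# usage with synthetic audit records (rows = events, columns = features)
normal = np.random.normal(0.0, 1.0, size=(500, 6))
incoming = np.vstack([np.random.normal(0.0, 1.0, size=(5, 6)),
                      np.random.normal(8.0, 1.0, size=(2, 6))])
alerts = detect(incoming, build_profile(normal))
```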

Comparison between Higher-Order SVD and Third-Order Orthogonal Tensor Product Expansion

In digital signal processing it is important to approximate multi-dimensional data by rank reduction, in which the rank of multi-dimensional data is reduced from higher to lower. For 2-dimensional data, singular value decomposition (SVD) is one of the best-known rank reduction techniques. In addition, an outer product expansion extending SVD to multi-dimensional data has been proposed and implemented, and it has been widely applied to image processing and pattern recognition. However, the multi-dimensional outer product expansion has high computational complexity and lacks orthogonality between the expansion terms. We have therefore proposed an alternative method, the Third-Order Orthogonal Tensor Product Expansion (3-OTPE), which uses the power method instead of a nonlinear optimization method to reduce computing time. Around the same time, the group of De Lathauwer proposed Higher-Order SVD (HOSVD), which also extends SVD to multi-dimensional data. 3-OTPE and HOSVD take similar approaches to the rank reduction of multi-dimensional data; the two methods produce results that agree in some cases and differ slightly in others. In this paper, we compare 3-OTPE with HOSVD in terms of calculation accuracy and computing time, and clarify the differences between the two methods.
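
For orientation, a compact HOSVD sketch for a third-order tensor is given below: each factor matrix comes from the SVD of a mode unfolding, and the core tensor is obtained by projecting onto those factors. The tensor size and ranks are illustrative, and the 3-OTPE power-method procedure is not shown.

```python
# Minimal truncated HOSVD for a third-order tensor (illustrative only).
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a 3rd-order tensor."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    """Factor matrices from each mode's SVD, then the core tensor."""
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    core = np.einsum('ijk,ia,jb,kc->abc', T, U[0], U[1], U[2])
    return core, U

T = np.random.rand(8, 8, 8)
core, U = hosvd(T, ranks=(3, 3, 3))
approx = np.einsum('abc,ia,jb,kc->ijk', core, U[0], U[1], U[2])  # rank-reduced data
```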

Connectionist Approach to Generic Text Summarization

As the enormous amount of on-line text on the World-Wide Web grows, the development of methods for automatically summarizing this text becomes more important. The primary goal of this research is to create an efficient tool that is able to summarize large documents automatically. We propose an Evolving Connectionist System: an adaptive, incremental learning and knowledge representation system that evolves its structure and functionality. In this paper, we propose a novel approach to part-of-speech disambiguation using a recurrent neural network, a paradigm capable of dealing with sequential data. We observed that the connectionist approach to text summarization has a natural way of learning grammatical structures through experience. Experimental results show that our approach achieves acceptable performance.
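
A toy recurrent tagger sketch (PyTorch) illustrates how a recurrent network can assign part-of-speech tags over a word sequence; the vocabulary size, tag set and hyper-parameters are placeholders rather than the configuration used in this work.

```python
# Toy recurrent part-of-speech tagger: per-token tag scores from an RNN.
import torch
import torch.nn as nn

class RNNTagger(nn.Module):
    def __init__(self, vocab_size, tagset_size, emb=32, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.rnn = nn.RNN(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tagset_size)

    def forward(self, word_ids):                  # (batch, seq_len)
        h, _ = self.rnn(self.embed(word_ids))     # (batch, seq_len, hidden)
        return self.out(h)                        # per-token tag scores

model = RNNTagger(vocab_size=1000, tagset_size=12)
scores = model(torch.randint(0, 1000, (1, 7)))    # tag scores for a 7-word sentence
```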

The Effect of Stress Biaxiality on Crack Shape Development

The development of the shape and size of a crack in a pressure vessel under uniaxial and biaxial loadings is important in fitness-for-service evaluations such as leak-before-break. In this work, finite element modelling was used to evaluate the mean stress and the J-integral around the front of a surface-breaking crack. A procedure based on the ductile tearing resistance curves of high- and low-constraint fracture mechanics geometries was developed to estimate the amount of ductile crack extension for surface-breaking cracks and to show the evolution of the initial crack shape. The results showed non-uniform constraint levels and crack driving forces around the crack front at large deformation levels. It was also shown that initially semi-elliptical surface cracks under biaxial load developed higher constraint levels around the crack front than under uniaxial tension. However, similar crack shapes were observed, with greater extension associated with cracks under biaxial loading.

Decision Tree-based Feature Ranking using Manhattan Hierarchical Cluster Criterion

Feature selection is gaining importance due to its contribution to saving classification cost in terms of time and computation load. One way to search for essential features is via a decision tree, which acts as an intermediate feature-space inducer for choosing essential features. In decision tree-based feature selection, some studies use the decision tree as a feature ranker with a direct threshold measure, while others retain the decision tree but use a pruning condition that acts as a threshold mechanism for choosing features. This paper proposes a threshold measure based on Manhattan hierarchical cluster distance, to be used in feature ranking to choose relevant features as part of the feature selection process. The results are promising, and the method can be improved in future work by including test cases with a higher number of attributes.
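
One possible reading of the proposed criterion is sketched below: rank features with a decision tree, then split the importance scores into keep/drop groups by hierarchical clustering under the Manhattan (cityblock) distance. The dataset and the two-cluster cut are illustrative choices, and the paper's exact threshold rule may differ.

```python
# Sketch: decision-tree feature ranking with a Manhattan hierarchical-cluster cut.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier
from scipy.cluster.hierarchy import linkage, fcluster

X, y = load_breast_cancer(return_X_y=True)
importances = DecisionTreeClassifier(random_state=0).fit(X, y).feature_importances_

# Cluster the 1-D importance scores with Manhattan distance, cut into 2 groups.
Z = linkage(importances.reshape(-1, 1), method='average', metric='cityblock')
groups = fcluster(Z, t=2, criterion='maxclust')

# Keep the group whose mean importance is higher.
keep_group = max(set(groups), key=lambda g: importances[groups == g].mean())
selected = np.where(groups == keep_group)[0]
print("selected feature indices:", selected)
```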

Similarity Detection in Collaborative Development of Object-Oriented Formal Specifications

The complexity of today's software systems makes collaborative development necessary to accomplish tasks. Frameworks are necessary to allow developers to perform their tasks independently yet collaboratively. Similarity detection is one of the major issues to consider when developing such frameworks. It allows developers to mine existing repositories when developing their own views of a software artifact, and it is necessary for identifying the correspondences between the views in order to merge them and check their consistency. Given the importance of the requirements specification stage in software development, this paper proposes a framework for collaborative development of object-oriented formal specifications, along with a similarity detection approach to support the creation, merging and consistency checking of specifications. The paper also explores the impact of using additional concepts on improving the matching results. Finally, the proposed approach is empirically evaluated.

Reduction of Search Space by Applying Controlled Genetic Operators for Weight Constrained Shortest Path Problem

The weight constrained shortest path problem (WCSPP) is one of the best-known basic problems in combinatorial optimization. Because of its importance in many areas of application such as computer science, engineering and operations research, many researchers have studied the WCSPP extensively. This paper concentrates on reducing the total search space for solving the WCSPP using an existing Genetic Algorithm (GA). For this purpose, controlled schemes of genetic operators are applied to a list chromosome representation. This approach gives a near-optimum solution in fewer generations than the classical GA technique. From further analysis, a new generalized schema theorem is also developed from the philosophy of Holland's theorem.
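
The idea of a controlled operator on a list (path) chromosome can be sketched as follows: offspring are spliced only at a node common to both parents and are discarded if they violate the weight bound, which keeps the search inside the feasible region. The toy graph, costs and bound are assumptions for illustration.

```python
# Sketch of a controlled crossover on list (path) chromosomes for the WCSPP.
import random

# toy directed graph: node -> {neighbour: (cost, weight)}
G = {0: {1: (2, 1), 2: (1, 3)}, 1: {3: (2, 2), 2: (1, 1)},
     2: {3: (3, 1)}, 3: {}}
W_MAX = 5                                   # weight constraint

def path_cost_weight(path):
    c = w = 0
    for a, b in zip(path, path[1:]):
        edge = G[a].get(b)
        if edge is None:
            return None                     # infeasible chromosome
        c, w = c + edge[0], w + edge[1]
    return c, w

def controlled_crossover(p1, p2):
    """Splice parents at a shared intermediate node; keep only feasible children."""
    common = [n for n in p1[1:-1] if n in p2[1:-1]]
    if not common:
        return None
    cut = random.choice(common)
    child = p1[:p1.index(cut)] + p2[p2.index(cut):]
    cw = path_cost_weight(child)
    return child if cw and cw[1] <= W_MAX else None

child = controlled_crossover([0, 1, 2, 3], [0, 2, 3])
```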

Machine Vision for the Inspection of Surgical Tasks: Applications to Robotic Surgery Systems

The use of machine vision to inspect the outcome of surgical tasks is investigated, with the aim of incorporating this approach into robotic surgery systems. Machine vision is a non-contact form of inspection, i.e., no part of the vision system is in direct contact with the patient, and it is therefore well suited to surgery, where sterility is an important consideration. As a proof of concept, three primary surgical tasks for a common neurosurgical procedure were inspected using machine vision. Experiments were performed on cadaveric pig heads to simulate the two possible outcomes, i.e., satisfactory or unsatisfactory, for the tasks involved in making a burr hole, namely incision, retraction, and drilling. We identify low-level image features that distinguish the two outcomes, and report results that validate our proposed approach. The potential of using machine vision in a surgical environment, and the challenges that must be addressed, are identified and discussed.

Cultural Effect on Using New Technologies

One of the main concerns in the Information Technology field is the adoption of new technologies in organizations, which may increase the pace of their usage. This study looks at the role of culture in accepting and using new technologies in organizations, and examines the effect of culture on acceptance of and intention to use new technology. Studies show that culture is one of the most important barriers to adopting new technologies. The model used for accepting and using new technology is the Technology Acceptance Model (TAM), while for culture and its dimensions the well-known theory by Hofstede was used. The results show a significant effect of culture on the intention to use new technologies. All four dimensions of culture were tested to find the strength of their relationship with behavioral intention to use new technologies. The findings indicate the important role of culture in the level of intention to use new technologies and the different role each dimension plays in improving the adoption process. The study suggests that technology transfer efforts are most likely to be successful if the parties are culturally aligned.

Simulation Games in Business Process Management Education

Business process management (BPM) has become widely accepted within the business community as a means of improving business performance. However, it is of the highest importance to incorporate BPM into the curriculum at the university level in order to achieve appropriate acceptance of the method. The goal of the paper is to determine the current state of education in business process management (BPM) at Croatian universities and abroad. It investigates the applied forms of instruction and teaching methods and gives several proposals for improving BPM courses. Since the majority of undergraduate and postgraduate students have a limited understanding of business processes and lack practical experience, there is a need to introduce new teaching approaches. Therefore, we offer suggestions for further improvement, among which the introduction of a simulation games environment into BPM education is strongly recommended.

Sensory Evaluation of the Selected Coffee Products Using Fuzzy Approach

Knowing consumers' preferences and perceptions from the sensory evaluation of drink products is very significant to manufacturers and retailers alike. Without appropriate sensory analysis, there is a high risk of market disappointment. This paper aims to rank selected coffee products and to determine the best quality attribute through sensory evaluation using a fuzzy decision making model. Three coffee drink products were used for the sensory evaluation. Data were collected from thirty judges at a hypermarket in Kuala Terengganu, Malaysia. The judges were asked to give their sensory evaluation in linguistic terms for the quality attributes of colour, smell, taste and mouth feel for each product, as well as the weight of each quality attribute. Five fuzzy linguistic terms representing the quality attributes were introduced prior to the analysis. The judgment membership functions and the weights were compared to rank the products and to determine the best quality attribute. The Indoc product was ranked first, and 'taste' was judged the best quality attribute. These results underline the importance of sensory evaluation in identifying consumers' preferences and the competency of the fuzzy approach in decision making.
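
A minimal sketch of the fuzzy ranking mechanics follows, assuming triangular fuzzy numbers for the five linguistic terms, attribute weights, and centroid defuzzification; the term scale and the sample ratings are illustrative, not the collected data.

```python
# Sketch: linguistic ratings -> triangular fuzzy numbers -> weighted aggregate -> centroid score.
import numpy as np

TERMS = {                                    # five linguistic terms -> TFNs (low, mid, high)
    "very poor": (0.0, 0.0, 0.25), "poor": (0.0, 0.25, 0.5),
    "fair": (0.25, 0.5, 0.75), "good": (0.5, 0.75, 1.0),
    "excellent": (0.75, 1.0, 1.0),
}

def fuzzy_score(ratings, weights):
    """Weighted aggregate of TFN ratings, defuzzified by the centroid."""
    tfns = np.array([TERMS[r] for r in ratings], dtype=float)
    w = np.asarray(weights, dtype=float)
    agg = (tfns * w[:, None]).sum(axis=0) / w.sum()
    return agg.mean()                        # centroid of a triangular fuzzy number

# one judge rating colour, smell, taste, mouth feel for two products
weights = [0.2, 0.2, 0.4, 0.2]
print(fuzzy_score(["good", "fair", "excellent", "good"], weights))
print(fuzzy_score(["fair", "good", "good", "fair"], weights))
```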

Web Page Watermarking: XML Files Using Synonyms and Acronyms

Recent enhancements in the field of computing have led to massive use of web-based electronic documents. Current copyright protection laws are inadequate for proving ownership of electronic documents and do not provide strong protection against copying and manipulating information from the web. This has opened many channels for securing information, and significant evolutions have been made in the area of information security. Digital watermarking has developed into a very dynamic area of research and has addressed challenging issues for digital content. Watermarking can be visible (logos or signatures) or invisible (encoding and decoding). Many visible watermarking techniques have been studied for text documents, but there are very few for web-based text. XML files are used to exchange information on the Internet and contain important information. In this paper, two invisible watermarking techniques using synonyms and acronyms are proposed for XML files to prove intellectual ownership and achieve security. The techniques are analyzed against different attacks, and the capacity that can be embedded in the XML file is measured. A comparative analysis of capacity is also made for both methods. The system was implemented in C# and all tests were carried out practically to obtain the results.
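
The synonym-based embedding can be pictured with a short sketch: a hidden bit string is encoded by keeping a watermarkable word for bit 0 and substituting its synonym for bit 1. The synonym table and the XML fragment are hypothetical, and the acronym variant and attack analysis are not shown.

```python
# Sketch: encode a bit string in XML text by synonym substitution.
import xml.etree.ElementTree as ET

SYNONYMS = {"big": "large", "fast": "quick", "buy": "purchase"}

def embed(xml_text, bits):
    root, i = ET.fromstring(xml_text), 0
    for el in root.iter():
        if not el.text:
            continue
        words = el.text.split()
        for j, w in enumerate(words):
            if w.lower() in SYNONYMS and i < len(bits):
                if bits[i] == "1":
                    words[j] = SYNONYMS[w.lower()]   # bit 1 -> synonym, bit 0 -> keep word
                i += 1
        el.text = " ".join(words)
    return ET.tostring(root, encoding="unicode")

marked = embed("<offer><desc>big discount buy fast</desc></offer>", "101")
print(marked)   # "large discount buy quick" encodes the bits 1, 0, 1
```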

Factorial Structure and Psychometric Validation of Ecotourism Experiential Value Construct: Insights from Taman Negara National Park, Malaysia

The purpose of this research is to disentangle and validate the underlying factorial structure of the Ecotourism Experiential Value (EEV) measurement scale and subsequently investigate its psychometric properties. The analysis was based on a sample of 225 eco-tourists, collected in the vicinity of Taman Negara National Park (TNNP) via an interviewer-administered questionnaire. Exploratory factor analysis (EFA) was performed to determine the factorial structure of EEV. Subsequently, to confirm and validate the factorial structure and assess the psychometric properties of EEV, confirmatory factor analysis (CFA) was executed. In addition, to establish the nomological validity of EEV, a structural model was developed to examine the effect of EEV on Total Eco-tourist Experience Quality (TEEQ). The results reveal that EEV is a second-order construct with a six-factor structure, and its scale adequately meets the psychometric criteria, thus permitting confident interpretation of the results. The findings have important implications for future research directions and for the management of ecotourism destinations.

Communicating a Mega Sporting Event in a Social Network Environment

Arguments on a popular microblogging site were analysed by means of a methodological approach to business rhetoric focusing on the logos communication technique. The focus of the analysis was the 100-day countdown to the 2011 Rugby World Cup as advanced by the organisers. Big sporting events provide an attractive medium for sport event marketers in that they have become important strategic communication tools directed at sport consumers. Sport event marketing is understood here as using a microblogging site as a communication tool whose purpose is to disseminate a company's marketing messages by involving the target audience in experiential activities. Sport creates a universal language in that it excites and increases the spread of information by word of mouth and other means. The findings highlight the limitations of a microblogging site in terms of marketing messages, which can inform better practices. This study can also serve as a heuristic tool for other researchers analysing sports marketing messages in social network environments.

SMCC: Self-Managing Congestion Control Algorithm

Transmission control protocol (TCP) Vegas detects network congestion at an early stage and successfully prevents the periodic packet loss that usually occurs in TCP Reno. It has been demonstrated that TCP Vegas outperforms TCP Reno in many respects. However, TCP Vegas suffers from several problems that affect its congestion avoidance mechanism. One of the most important weaknesses of TCP Vegas is that alpha and beta depend on a good expected throughput estimate, which in turn depends on a good minimum RTT estimate. To make the system more robust, alpha and beta must be made responsive to network conditions (they are currently chosen statically). This paper proposes a modified Vegas algorithm that can be adjusted to deliver good performance compared to other transmission control protocols (TCPs). To do this, we use a particle swarm optimization (PSO) algorithm to tune alpha and beta. The simulation results validate the advantages of the proposed algorithm in terms of performance.
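
A sketch of the tuning step follows: a standard PSO loop searches over (alpha, beta), with a toy objective standing in for the network-simulation score that would actually evaluate a Vegas parameter pair. Bounds, swarm size and PSO constants are illustrative assumptions.

```python
# Sketch: tune (alpha, beta) with particle swarm optimisation on a toy objective.
import numpy as np

def objective(ab):                           # placeholder for a simulation run
    alpha, beta = ab
    penalty = 100.0 if alpha >= beta else 0.0
    return (alpha - 1.5) ** 2 + (beta - 3.5) ** 2 + penalty

def pso(f, bounds, n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    lo, hi = np.array(bounds).T
    x = np.random.uniform(lo, hi, (n, len(lo)))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[pval.argmin()].copy()
    for _ in range(iters):
        r1, r2 = np.random.rand(n, 1), np.random.rand(n, 1)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([f(p) for p in x])
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        gbest = pbest[pval.argmin()].copy()
    return gbest

alpha, beta = pso(objective, bounds=[(0.5, 3.0), (1.0, 6.0)])
```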

Location Management in Cellular Networks

Cellular networks provide voice and data services to mobile users. To deliver these services, the cellular network must be capable of tracking the locations of the users and allowing user movement during conversations. These capabilities are achieved by location management. Location management in mobile communication systems is concerned with those network functions necessary to allow users to be reached wherever they are in the network coverage area. In a cellular network, the service coverage area is divided into smaller areas of hexagonal shape, referred to as cells. The cellular concept was introduced to reuse radio frequencies. Continued expansion of cellular networks, coupled with an increasingly restricted mobile spectrum, has made the reduction of communication overhead a highly important issue. Much of this traffic is used in determining the precise location of individual users when relaying calls, and the field of location management aims to reduce this overhead through prediction of user location. This paper describes and compares various location management schemes in cellular networks.

Reflections of Utopia and the Ideal City in the Development of the Physical Structure of Nikšić: The Aspect of Visual Perception

The aspect of visual perception occupies a central position in shaping the physical structure of a city. This paper discusses the visual characteristics of utopian cities and their impact on the shaping of real urban structures. Utopian examples of cities are not discussed in terms of social and sociological conditions; rather, the emphasis is on urban utopias and ideal cities that have had, or could have had, an impact on the shape of the physical structure of Nikšić. The period considered is the Renaissance-Baroque, with a touch of classicism. The paper's emphasis is on the physical dimension, without excluding the importance of social equilibrium, studies of which date back to Aristotle, Plato, Thomas More, Robert Owen, Tommaso Campanella and others. The emphasis is on urban utopias and their impact on the development of a sustainable physical structure of a real city in the context of visual perception. In the case of Nikšić, this paper identifies the common features of a real city and a utopian city, as well as criteria for sustainable urban development in the context of visual achievement.

Optimizing Materials Cost and Mechanical Properties of PVC Electrical Cable Insulation by Using a Mixture Experimental Design Approach

With the development of polyvinyl chloride (PVC) products in many applications, the challenge of investigating the raw material composition and reducing cost has become more and more important. Considerable research has investigated the effect of additives on PVC products. However, most PVC composites research investigates only the effect of a single factor, or a few factors, at a time. This isolated consideration of the input factors does not take into account the interaction effects between the different factors. This paper implements a mixture experimental design approach to find a cost-effective PVC composition for the production of electrical-insulation cables that meets ASTM D 6096. The analysis showed that a minimum cost can be achieved using 20% virgin PVC, 18.75% recycled PVC, 43.75% CaCO3 with a particle size of 10 microns, 14% DOP plasticizer, and 3.5% CPW plasticizer. For maximum UTS, the compound should consist of 17.5% DOP, 62.5% virgin PVC, and 20.0% CaCO3 with a particle size of 5 microns. Finally, for the highest ductility, the compound should be made of 35% virgin PVC, 20% CaCO3 with a particle size of 5 microns, and 45.0% DOP plasticizer.
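
For readers unfamiliar with mixture designs, the sketch below fits a first-order Scheffé mixture model by least squares and predicts a response for a new blend. The proportions and response values are made up purely to show the mechanics; they are not the measured data of this study.

```python
# Sketch: fit a first-order Scheffé mixture model and predict a blend response.
import numpy as np

# columns: virgin PVC, recycled PVC, CaCO3, DOP, CPW (proportions sum to 1)
X = np.array([[0.20, 0.19, 0.44, 0.14, 0.03],
              [0.62, 0.00, 0.20, 0.18, 0.00],
              [0.35, 0.00, 0.20, 0.45, 0.00],
              [0.40, 0.10, 0.30, 0.15, 0.05],
              [0.50, 0.05, 0.25, 0.15, 0.05]])
uts = np.array([18.0, 25.0, 12.0, 20.0, 22.0])      # hypothetical responses

# Scheffé linear model: y = sum_i b_i * x_i (no intercept, since proportions sum to 1)
coef, *_ = np.linalg.lstsq(X, uts, rcond=None)

blend = np.array([0.45, 0.05, 0.25, 0.20, 0.05])
print("predicted UTS:", blend @ coef)
```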