The Advent of Electronic Logbook Technology - Reducing Cost and Risk to Both Marine Resources and the Fishing Industry

Fisheries management around the world is hampered by the lack, or poor quality, of critical data on fish resources and fishing operations. The main reasons for the chronic inability to collect good-quality data during fishing operations are the culture of secrecy common among fishers and the lack of modern data-gathering technology onboard most fishing vessels. In response, OLRAC-SPS, a South African company, developed fisheries data-logging software (eLog for short) named Olrac. The Olrac eLog solution is capable of collecting, analysing, plotting, mapping, reporting, tracing and transmitting all data related to fishing operations. Olrac can be used by skippers, fleet/company managers, offshore mariculture farmers, scientists, observers, compliance inspectors and fisheries management authorities. The authors believe that using an eLog onboard fishing vessels has the potential to revolutionise the entire process of data collection and reporting during fishing operations and, if properly deployed and utilised, could transform the entire commercial fleet into a provider of good-quality data and forever change the way fish resources are managed. In addition, it would make it possible to trace catches back to the individual fishing operation, to improve fishing efficiency and to dramatically improve control of fishing operations and enforcement of fishing regulations.

Investigation of Time Delay Factors in Global Software Development

Global Software Development (GSD) projects span the boundaries of companies, countries and even continents, so that the collaborating sites often work in different time zones. Besides the many benefits of such development, research has reported plenty of negative impacts on GSD projects. It is important to understand the problems which may arise during the execution of a GSD project across different time zones. This paper discusses various issues related to time delays in GSD projects. The authors investigate some of the time delay factors which commonly arise in GSD projects spanning different time zones. The investigation is carried out through a systematic review of the literature. Furthermore, the practices for overcoming these delay factors which have already been reported in the literature and by GSD organizations are also explored through a literature survey and case studies.

Theoretical Considerations for Software Component Metrics

We have defined two suites of metrics, which cover static and dynamic aspects of component assembly. The static metrics measure the complexity and criticality of component assembly, wherein complexity is measured using the Component Packing Density and Component Interaction Density metrics. Further, four criticality conditions, namely Link, Bridge, Inheritance and Size criticalities, have been identified and quantified. The complexity and criticality metrics are combined to form a Triangular Metric, which can be used to classify the type and nature of applications. Dynamic metrics are collected during the runtime of a complete application; they are useful for identifying super-components and for evaluating the degree of utilisation of various components. In this paper both static and dynamic metrics are evaluated using Weyuker's set of properties. The results show that the metrics provide a valid means to measure issues in component assembly. We relate our metrics suite to McCall's Quality Model and illustrate its impact on product quality and on the management of component-based product development.
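As a rough illustration of the density-style metrics named above, one natural formulation (the authors' precise definitions are in the full paper) expresses each as a ratio:

\[
\mathrm{CPD}_{\langle\text{constituent}\rangle} = \frac{\#\langle\text{constituent}\rangle}{\#\,\text{components}}, \qquad
\mathrm{CID} = \frac{\#I_{\text{actual}}}{\#I_{\text{available}}},
\]

where the constituent count may be, for example, the lines of code, classes or interfaces packed into the assembly, and the interaction density compares the interactions a component actually uses with those available to it.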

Expressive Modes and Species of Language

Computer languages are usually lumped together into broad 'paradigms', leaving us in want of a finer classification of kinds of language. Theories distinguishing between 'genuine differences' in language have been called for, and we propose that such differences can be observed through a notion of expressive mode. We outline this concept, propose how it could be operationalized and indicate a possible context for the development of a corresponding theory. Finally, we consider a possible application in connection with the evaluation of language revision. We illustrate this with a case study, investigating possible revisions of the relational algebra in order to overcome weaknesses of the division operator in connection with universal queries.
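For readers unfamiliar with the operator under discussion, the following minimal Python sketch (the relation and attribute names are illustrative, not the authors' proposed revision) implements classic relational division, the operator used to answer universal queries:

```python
# Relational division: find values of the quotient attributes that are
# paired with EVERY tuple of the divisor relation.
def divide(dividend, divisor, quotient_attrs, divisor_attrs):
    """dividend: list of dicts; divisor: set of tuples over divisor_attrs."""
    candidates = {tuple(row[a] for a in quotient_attrs) for row in dividend}
    paired = {}
    for row in dividend:
        key = tuple(row[a] for a in quotient_attrs)
        paired.setdefault(key, set()).add(tuple(row[a] for a in divisor_attrs))
    required = set(divisor)
    return {c for c in candidates if required <= paired[c]}

# Universal query: which students took ALL required courses?
takes = [
    {"student": "ann", "course": "db"},
    {"student": "ann", "course": "os"},
    {"student": "bob", "course": "db"},
]
required = {("db",), ("os",)}
print(divide(takes, required, ["student"], ["course"]))  # {('ann',)}
```

A well-known quirk motivating revision is the empty-divisor case: with no required courses, every candidate qualifies vacuously.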

Hydrogen Rich Fuel Gas Production from 2-Propanol Using Pt/Al2O3 and Ni/Al2O3 Catalysts in Supercritical Water

Hydrogen is an important chemical in many industries and is expected to become one of the major fuels for energy generation in the future. Unfortunately, hydrogen does not exist in its elemental form in nature and therefore has to be produced from hydrocarbons, hydrogen-containing compounds or water. Above its critical point (374.8 °C and 22.1 MPa), water has a lower density and viscosity, and a higher heat capacity, than ambient water. Mass transfer in supercritical water (SCW) is enhanced due to its increased diffusivity and transport ability. The reduced dielectric constant makes supercritical water a better solvent for organic compounds and gases. Hence, owing to these desirable properties, there is growing interest in studies of the gasification of organic matter, including biomass or model biomass solutions, in supercritical water. In this study, hydrogen and biofuel production by the catalytic gasification of 2-propanol under supercritical water conditions was investigated. Pt/Al2O3 and Ni/Al2O3 were the catalysts used in the gasification reactions. All of the experiments were performed under a constant pressure of 25 MPa. The effects of five reaction temperatures (400, 450, 500, 550 and 600 °C) and five reaction times (10, 15, 20, 25 and 30 s) on the gasification yield and flammable component content were investigated.
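For reference, the ideal (complete) steam reforming of 2-propanol, which sets the theoretical upper bound on hydrogen yield, balances as

\[
\mathrm{C_3H_8O} + 5\,\mathrm{H_2O} \longrightarrow 3\,\mathrm{CO_2} + 9\,\mathrm{H_2},
\]

i.e. at most 9 mol of H2 per mole of 2-propanol. Real product gases also contain flammable components such as CO and CH4, whose distribution depends on the catalyst, temperature and residence time studied here.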

Nutrient Removal from Municipal Wastewater Treatment Plant Effluent Using Eichhornia crassipes

Water hyacinth has been used in aquatic systems for wastewater purification for many years worldwide. The role of the water hyacinth (Eichhornia crassipes) in polishing nitrate and phosphorus from municipal wastewater treatment plant effluent by phytoremediation was evaluated. The objective of this project was to determine the removal efficiency of water hyacinth in polishing nitrate and phosphorus, as well as chemical oxygen demand (COD) and ammonia. Water hyacinth is considered the most efficient aquatic plant for removing a vast range of pollutants such as organic matter, nutrients and heavy metals. The water hyacinths, a type of macrophyte, were cultivated in the treatment house in a reactor tank of approximately 90 (L) x 40 (W) x 25 (H) in dimension, built with three compartments. Three water hyacinths were placed in each compartment, and a water sample from each compartment was collected every two days. Plant observation was conducted through weight measurement, plant uptake and new young shoot development. Water hyacinth effectively removed approximately 49% of COD, 81% of ammonia, 67% of phosphorus and 92% of nitrate. It also showed a significant growth rate, starting from day 6 at 0.33 shoots/day and developing up to 0.38 shoots/day by the end of day 24. The studies conducted show that water hyacinth is capable of polishing municipal wastewater effluent containing undesirable concentrations of nitrate and phosphorus.
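The removal percentages reported above are conventionally computed from the influent and effluent concentrations as

\[
\text{Removal efficiency}\,(\%) = \frac{C_{\text{in}} - C_{\text{out}}}{C_{\text{in}}} \times 100,
\]

so, for example, the 92% nitrate figure means the final nitrate concentration was 8% of its initial value.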

Constructing Approximate and Exact Solutions for Boussinesq Equations using Homotopy Perturbation Padé Technique

Based on the homotopy perturbation method (HPM) and Padé approximants (PA), approximate and exact solutions are obtained for the cubic Boussinesq and modified Boussinesq equations. The obtained solutions include solitary wave and rational solutions. HPM is used for the analytic treatment of these equations, and PA for enlarging the convergence region of the HPM analytical solution. The results reveal that HPM enhanced with PA is very effective, convenient and quite accurate for such types of partial differential equations.
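For context, HPM in its standard He-type form (the authors' exact deformation for these equations may differ in detail) embeds a problem \(L(u) + N(u) = f(r)\), with \(L\) linear and \(N\) nonlinear, in a homotopy

\[
H(v,p) = (1-p)\,\bigl[L(v) - L(u_0)\bigr] + p\,\bigl[L(v) + N(v) - f(r)\bigr] = 0, \qquad p \in [0,1],
\]

\[
v = v_0 + p\,v_1 + p^2 v_2 + \cdots, \qquad u = \lim_{p \to 1} v = \sum_{k \ge 0} v_k .
\]

The Padé step then replaces the truncated partial sum by a rational [m/n] approximant, which typically enlarges the region of convergence of the series solution.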

Laboratory Experimentation for Supporting Collaborative Working in Engineering Education over the Internet

Collaborative working environments for distance education can be considered a more generic form of contemporary remote labs. At present, the majority of existing real laboratories are not constructed to allow the participants to collaborate in real time. To make this revolutionary learning environment possible, we must allow different users to carry out an experiment simultaneously. In recent times, multi-user environments have been successfully applied in many applications such as air traffic control systems, team-oriented military systems, chat-text tools and multi-player games. Thus, understanding the ideas and techniques behind these systems can contribute greatly to our e-learning environment for collaborative working. In this investigation, collaborative working environments are considered from theoretical and practical perspectives in order to build an effective collaborative real laboratory that allows two or more students to conduct remote experiments at the same time as a team. To achieve this goal, we have implemented a distributed system architecture enabling students to obtain help from either a human tutor or a rule-based e-tutor.
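A minimal sketch of the shared-session mechanics such an architecture needs (all class, rule and parameter names here are hypothetical, not the authors' implementation): every student's command is serialized through one session object, broadcast to all participants, and inspected by a rule-based e-tutor hook.

```python
# Hypothetical shared-experiment session: commands from several students
# are serialized, broadcast to every participant, and checked against
# simple tutoring rules.
class ExperimentSession:
    def __init__(self, tutor_rules):
        self.participants = []          # callables that receive events
        self.tutor_rules = tutor_rules  # list of (predicate, advice)

    def join(self, callback):
        self.participants.append(callback)

    def submit(self, student, command):
        event = {"student": student, "command": command}
        for notify in self.participants:      # everyone sees every action
            notify(event)
        for predicate, advice in self.tutor_rules:
            if predicate(command):            # rule-based e-tutor hook
                for notify in self.participants:
                    notify({"student": "e-tutor", "command": advice})

rules = [(lambda c: "voltage=99" in c,
          "Voltage too high; try a value below 24 V.")]
session = ExperimentSession(rules)
session.join(lambda e: print(f"[{e['student']}] {e['command']}"))
session.submit("alice", "set voltage=99")
```

A real deployment would replace the in-process callbacks with network messaging, but the serialize-then-broadcast pattern is what keeps all collaborators' views of the experiment consistent.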

A Hybrid Search Algorithm for Solving Constraint Satisfaction Problems

In this paper we present a hybrid search algorithm for solving constraint satisfaction and optimization problems. This algorithm combines ideas from two basic approaches: complete and incomplete algorithms, also known as systematic search and local search algorithms. The characteristics of systematic search and local search methods are complementary, so the presented algorithm tries to obtain the advantages of both approaches. Its major advantage is finding a sound partial solution for complicated problems whose complete solution cannot be found in a reasonable time. The algorithm's results are compared with those of other algorithms using the well-known n-queens problem.
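A minimal sketch of this kind of hybridization (the budget, repair scheme and all names are illustrative assumptions, not the paper's exact algorithm): a budgeted backtracking phase keeps its deepest consistent partial assignment, and a min-conflicts local search then repairs and extends it, here on the n-queens benchmark mentioned above.

```python
import random

def hybrid_nqueens(n, budget=20000, repair_steps=100000):
    """Hybrid sketch: budgeted backtracking (systematic search) keeps its
    deepest consistent partial assignment; min-conflicts (local search)
    then repairs and extends it into a full solution."""
    best, steps = [], 0

    def ok(cols, c):                      # is column c safe in next row?
        r = len(cols)
        return all(c != cc and abs(c - cc) != r - rr
                   for rr, cc in enumerate(cols))

    def bt(cols):                         # systematic phase with budget
        nonlocal best, steps
        if len(cols) > len(best):
            best = cols[:]                # remember deepest sound prefix
        if len(cols) == n:
            return True
        for c in range(n):
            steps += 1
            if steps > budget:
                return False              # budget exhausted
            if ok(cols, c) and bt(cols + [c]):
                return True
        return False

    if bt([]):
        return best                       # solved by systematic search

    # Local-search phase: extend the sound partial assignment randomly,
    # then repair with min-conflicts until no queen is attacked.
    cols = best + [random.randrange(n) for _ in range(n - len(best))]

    def conflicts(r, c):
        return sum(1 for rr in range(n)
                   if rr != r and (cols[rr] == c or
                                   abs(cols[rr] - c) == abs(rr - r)))

    for _ in range(repair_steps):
        bad = [r for r in range(n) if conflicts(r, cols[r]) > 0]
        if not bad:
            return cols                   # consistent full solution
        r = random.choice(bad)
        scores = [conflicts(r, c) for c in range(n)]
        m = min(scores)
        cols[r] = random.choice([c for c in range(n) if scores[c] == m])
    return cols                           # best effort within limits

print(hybrid_nqueens(30))
```

Note how the design realizes the claimed advantage: even if neither phase completes, the returned assignment contains a sound prefix found systematically plus a locally improved remainder.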

An Advanced Time-Frequency Domain Method for PD Extraction with Non-Intrusive Measurement

Partial discharge (PD) detection is an important method for evaluating the insulation condition of metal-clad apparatus. Non-intrusive sensors, which are easy to install and cause no interruption to operation, are preferred in on-site PD detection. However, non-intrusive detection often lacks accuracy due to the interferences present in PD signals. In this paper a novel PD extraction method that uses frequency analysis and entropy-based time-frequency (TF) analysis is introduced. The repetitive pulses from the converter are first removed via frequency analysis. Then, the relative entropy and relative peak frequency of each pulse (i.e. of its time-indexed TF spectrum vectors) are calculated, and all pulses with similar parameters are grouped. According to the characteristics of the non-intrusive sensor and the frequency distribution of PDs, the PD pulses and interferences are separated. Finally, the PD signal and interferences are recovered via the inverse TF transform. The de-noising results on noisy PD data demonstrate that the combination of frequency and time-frequency techniques can discriminate PDs from interferences with various frequency distributions.
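A rough sketch of the per-pulse feature-and-grouping stage (the window length, the use of Shannon entropy of the normalized TF spectrum as a stand-in for the paper's relative-entropy feature, and k-means grouping are all assumptions of this sketch):

```python
import numpy as np
from scipy.signal import stft
from scipy.stats import entropy
from scipy.cluster.vq import kmeans2

def pulse_features(pulse, fs):
    """Entropy and peak frequency of one pulse's TF spectrum.
    (Shannon entropy is a stand-in for the paper's relative entropy.)"""
    f, t, Z = stft(pulse, fs=fs, nperseg=64)   # pulse should span >=64 samples
    power = np.abs(Z).ravel()
    p = power / power.sum()                    # normalize to a distribution
    tf_entropy = entropy(p)                    # spread of TF energy
    peak_freq = f[np.unravel_index(np.argmax(np.abs(Z)), Z.shape)[0]]
    return tf_entropy, peak_freq

def group_pulses(pulses, fs, n_groups=2):
    """Group pulses with similar (entropy, peak frequency) parameters."""
    feats = np.array([pulse_features(p, fs) for p in pulses])
    feats = (feats - feats.mean(0)) / feats.std(0)   # scale both features
    _, labels = kmeans2(feats, n_groups, minit="++", seed=0)
    return labels        # pulses sharing a label have similar TF signatures
```

Once the groups are formed, the labels corresponding to PD-like frequency content are kept and the rest are treated as interference, mirroring the separation step described above.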

An Improved Method to Watermark Images Sensitive to Blocking Artifacts

A new digital watermarking technique for images that are sensitive to blocking artifacts is presented. Experimental results show that the proposed MDCT-based approach produces highly imperceptible watermarked images and is robust to attacks such as compression, noise, filtering and geometric transformations. The proposed MDCT watermarking technique is applied to fingerprints to ensure security. The face image and demographic text data of an individual are used as multiple watermarks. An automated fingerprint identification system (AFIS) was used to quantitatively evaluate the matching performance of the MDCT-based watermarked fingerprints. The high fingerprint matching scores show that the MDCT approach is resilient to blocking artifacts. The quality of the extracted face and text images was computed using two human visual system metrics, and the results show that the image quality was high.
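To illustrate the general transform-domain embedding idea (this is not the authors' scheme: scipy's 2-D DCT is used below as a stand-in for MDCT, and the coefficient band, strength and quantization style are illustrative choices):

```python
import numpy as np
from scipy.fft import dctn, idctn

def embed_watermark(image, bits, strength=8.0, band=(32, 64)):
    """Sketch: hide watermark bits in mid-frequency transform coefficients
    of an image (at least 64x64 pixels for this band choice)."""
    coeffs = dctn(image.astype(float), norm="ortho")
    lo, hi = band
    idx = [(r, c) for r in range(lo, hi) for c in range(lo, hi)]
    for (r, c), bit in zip(idx, bits):
        # Quantization-style embedding: snap the coefficient to a
        # multiple of `strength` whose parity encodes the bit.
        q = np.round(coeffs[r, c] / strength)
        if int(q) % 2 != bit:
            q += 1
        coeffs[r, c] = q * strength
    return idctn(coeffs, norm="ortho")

def extract_watermark(image, n_bits, strength=8.0, band=(32, 64)):
    coeffs = dctn(image.astype(float), norm="ortho")
    lo, hi = band
    idx = [(r, c) for r in range(lo, hi) for c in range(lo, hi)]
    return [int(np.round(coeffs[r, c] / strength)) % 2
            for (r, c) in idx[:n_bits]]
```

Because embedding only nudges each selected coefficient to the nearest parity-matching multiple of the step size, extraction needs nothing beyond the same band and step, and the spatial-domain change stays small and spread out, which is what keeps blocking artifacts from appearing.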

Designing a Framework for Network Security Protection

As the Internet continues to grow at a rapid pace as the primary medium for communications and commerce, and as telecommunication networks and systems continue to expand their global reach, digital information has become the most popular and important information resource, and our dependence upon the underlying cyber infrastructure has increased significantly. Unfortunately, as our dependency has grown, so has the threat to the cyber infrastructure from spammers, attackers and criminal enterprises. In this paper, we propose a new machine learning based network intrusion detection framework for cyber security. The detection process of the framework consists of two stages: model construction and intrusion detection. In the model construction stage, a semi-supervised machine learning algorithm is applied to a collected set of network audit data to generate a profile of normal network behavior. In the intrusion detection stage, input network events are analyzed and compared with the patterns gathered in the profile, and events are flagged as anomalies if they are sufficiently far from the expected normal behavior. The proposed framework is particularly applicable to situations where only a small amount of labeled network training data is available, which is typical of real-world network environments.
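A minimal sketch of the profile-then-flag pattern described above (the paper's semi-supervised algorithm is not specified here; Mahalanobis distance to a fitted normal-traffic profile is used purely as a stand-in, and the feature layout and threshold are illustrative):

```python
import numpy as np

class NormalProfile:
    """Learn the distribution of normal traffic features, then flag
    events whose Mahalanobis distance from the profile is too large."""
    def fit(self, X):
        self.mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
        self.inv_cov = np.linalg.inv(cov)       # regularized for stability
        return self

    def score(self, X):
        d = X - self.mu
        return np.sqrt(np.einsum("ij,jk,ik->i", d, self.inv_cov, d))

    def detect(self, X, threshold=3.0):
        return self.score(X) > threshold        # True = flagged anomaly

# audit_data: rows of numeric features (e.g. duration, bytes, packet rate)
audit_data = np.random.default_rng(0).normal(size=(500, 4))
profile = NormalProfile().fit(audit_data)
events = np.vstack([audit_data[:3], [[9, 9, 9, 9]]])
print(profile.detect(events))                   # last event is flagged
```

The appeal of this shape of detector matches the abstract's claim: the profile is built from the (cheap, abundant) unlabeled traffic itself, so only a small labeled sample is needed, e.g. to calibrate the threshold.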

An Efficient and Optimized Multi-Constrained Path Computation for Real Time Interactive Applications in Packet Switched Networks

Quality of Service (QoS) routing aims to find paths between source and destination that satisfy the QoS requirements while efficiently using network resources; in particular, it seeks low-cost paths that satisfy the given QoS constraints. One of the key issues in providing end-to-end QoS guarantees in packet networks is determining a feasible path that satisfies a number of QoS constraints. We present an Optimized Multi-Constrained Routing (OMCR) algorithm for the computation of constrained paths for QoS routing in computer networks. OMCR applies distance-vector routing to construct a shortest path to each destination with respect to a given optimization metric, from which a set of feasible paths is derived at each node. OMCR is able to find feasible paths as well as optimize the utilization of network resources. OMCR operates with the hop-by-hop, connectionless routing model of the IP Internet and does not create any loops while finding the feasible paths. Nodes running OMCR need not maintain a global view of network state such as topology and resource information, and routing updates are sent only to neighboring nodes, whereas the counterpart link-state routing methods depend on complete network state for constrained path computation, which incurs excessive communication overhead.
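A toy sketch of distance-vector style constrained path computation in this spirit (the graph, metrics and pruning rule are illustrative; OMCR's actual message exchange and per-node path sets are richer): Bellman-Ford relaxation minimizes one cost metric while pruning extensions that violate an additive delay constraint.

```python
def constrained_paths(nodes, edges, src, max_delay):
    """edges: {(u, v): (cost, delay)}; returns feasible least-cost paths."""
    best = {src: (0.0, 0.0, [src])}           # node -> (cost, delay, path)
    for _ in range(len(nodes) - 1):           # Bellman-Ford rounds
        for (u, v), (cost, delay) in edges.items():
            if u not in best:
                continue
            c, d, path = best[u]
            nc, nd = c + cost, d + delay
            if nd > max_delay:
                continue                      # prune infeasible extension
            if v not in best or nc < best[v][0]:
                best[v] = (nc, nd, path + [v])
    return best

nodes = ["A", "B", "C", "D"]
edges = {("A", "B"): (1, 5), ("B", "D"): (1, 5),
         ("A", "C"): (5, 1), ("C", "D"): (5, 1)}
print(constrained_paths(nodes, edges, "A", max_delay=8))
# the cheap A-B-D path (delay 10) is pruned; A-C-D remains feasible
```

Keeping a single best label per node, as here, can miss feasible paths in general (multi-constrained routing is NP-hard); a fuller version keeps a set of non-dominated (cost, delay) paths per node, in line with the set of feasible paths the abstract describes.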

Comparison between Higher-Order SVD and Third-Order Orthogonal Tensor Product Expansion

In digital signal processing it is important to approximate multi-dimensional data by rank reduction, in which the rank of the data is reduced from higher to lower. For 2-dimensional data, singular value decomposition (SVD) is one of the best-known rank reduction techniques. In addition, an outer product expansion extending SVD to multi-dimensional data has been proposed and implemented, and has been widely applied to image processing and pattern recognition. However, the multi-dimensional outer product expansion is computationally expensive and lacks orthogonality between the expansion terms. We have therefore proposed an alternative method, the Third-order Orthogonal Tensor Product Expansion (3-OTPE), which uses the power method instead of a nonlinear optimization method to reduce computing time. Meanwhile, the group of De Lathauwer proposed the Higher-Order SVD (HOSVD), which likewise extends SVD to multi-dimensional data. 3-OTPE and HOSVD are similar in their approach to the rank reduction of multi-dimensional data; computation results obtained with the two methods sometimes agree and sometimes differ slightly. In this paper, we compare 3-OTPE to HOSVD in terms of calculation accuracy and computing time, and clarify the difference between the two methods.
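For reference, HOSVD itself is compact enough to sketch with numpy (a minimal truncated version; the paper's evaluation settings are not reproduced here): each mode's factor matrix comes from the SVD of the corresponding unfolding, and the core tensor is the original tensor contracted with the factor transposes.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: the chosen axis becomes the rows."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    """Truncated HOSVD: one SVD per mode gives the factor matrices;
    the core is T contracted with each factor's transpose."""
    U = []
    for mode, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        U.append(u[:, :r])
    core = T
    for mode, u in enumerate(U):
        core = np.moveaxis(
            np.tensordot(u.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, U

T = np.random.default_rng(0).normal(size=(6, 5, 4))
core, U = hosvd(T, ranks=(3, 3, 3))
approx = core                      # rebuild the rank-reduced approximation
for mode, u in enumerate(U):
    approx = np.moveaxis(
        np.tensordot(u, np.moveaxis(approx, mode, 0), axes=1), 0, mode)
print(np.linalg.norm(T - approx) / np.linalg.norm(T))  # relative error
```

The orthogonality of each mode's factor matrix comes for free from the per-mode SVDs, which is exactly the property the outer product expansion criticized above lacks.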

Connectionist Approach to Generic Text Summarization

As the enormous amount of on-line text on the World-Wide Web grows, the development of methods for automatically summarizing this text becomes more important. The primary goal of this research is to create an efficient tool that is able to summarize large documents automatically. We propose an Evolving Connectionist System: an adaptive, incremental learning and knowledge representation system that evolves its structure and functionality. In this paper, we propose a novel approach for part-of-speech disambiguation using a recurrent neural network, a paradigm capable of dealing with sequential data. We observed that the connectionist approach to text summarization has a natural way of learning grammatical structures through experience. Experimental results show that our approach achieves acceptable performance.

GPT Onto: A New Beginning for Malaysia Gross Pollutant Trap Ontology

Ontologies are widely used as a tool for organizing information, creating relations between subjects within a defined knowledge domain. Various fields such as civil engineering, biology and management have successfully integrated ontologies into decision support systems for managing domain knowledge and assisting their decision makers. Gross pollutant traps (GPTs) are devices used to trap large items or hazardous particles and prevent them from polluting and entering our waterways. However, choosing and specifying a GPT is a challenge in Malaysia, as too few GPT data repositories are captured and shared. Hence an ontology is needed to capture, organize and represent this knowledge as meaningful information, which can contribute to the efficiency of GPT selection in Malaysian urbanization. A GPT ontology framework is therefore built as the first step towards capturing GPT knowledge, which will then be integrated into the decision support system. This paper provides several examples from the GPT ontology and explains how it is constructed using the Protégé tool.
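To give a flavour of what such an ontology fragment looks like (the class and property names below are hypothetical illustrations, not the authors' actual Protégé schema, and rdflib stands in for Protégé's editor):

```python
from rdflib import Graph, Namespace, Literal, RDF, RDFS

# Hypothetical fragment of a GPT ontology: a trap type as a subclass of
# the general GrossPollutantTrap class, with illustrative properties.
GPT = Namespace("http://example.org/gpt#")
g = Graph()
g.add((GPT.GrossPollutantTrap, RDF.type, RDFS.Class))
g.add((GPT.TrashRack, RDFS.subClassOf, GPT.GrossPollutantTrap))
g.add((GPT.TrashRack, GPT.trapsPollutant, Literal("large debris")))
g.add((GPT.TrashRack, GPT.suitableCatchment, Literal("small urban")))
print(g.serialize(format="turtle"))
```

Relations of this kind (trap type, pollutant trapped, suitable catchment) are what would let a decision support system query for the GPT best matching a given site.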

The Effect of Stress Biaxiality on Crack Shape Development

The development of the shape and size of a crack in a pressure vessel under uniaxial and biaxial loadings is important in fitness-for-service evaluations such as leak-before-break. In this work, finite element modelling was used to evaluate the mean stress and the J-integral around the front of a surface-breaking crack. A procedure based on the ductile tearing resistance curves of high- and low-constraint fracture mechanics geometries was developed to estimate the amount of ductile crack extension for surface-breaking cracks and to show the evolution of the initial crack shape. The results showed non-uniform constraint levels and crack driving forces around the crack front at large deformation levels. It was also shown that initially semi-elliptical surface cracks under biaxial load developed higher constraint levels around the crack front than in uniaxial tension. However, similar crack shapes were observed, with greater crack extension associated with biaxial loading.

Plug and Play Interferometer Configuration using Single Modulator Technique

We demonstrate single-photon interference over 10 km using a plug and play system for quantum key distribution. The quality of the interferometer is measured by its visibility. The signal is phase coded, and the visibility is determined from the interference effect, which manifests as photon counts. The setup gives full control of the polarization inside the interferometer. The quality of the interferometer is assessed from the number of counts per second, and the system achieves 94% visibility in one of the detectors.
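The visibility quoted here is conventionally defined from the count rates at the interference maxima and minima,

\[
V = \frac{N_{\max} - N_{\min}}{N_{\max} + N_{\min}},
\]

so 94% visibility corresponds to a strong interference contrast between the constructive and destructive detector outputs.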

Semi-Automatic Approach for Semantic Annotation

The third phase of the web, the Semantic Web, requires many web pages to be annotated with metadata. Thus, a crucial question is where to acquire these metadata. In this paper we propose a semi-automatic method that annotates the text of documents and web pages, employing a comprehensive knowledge base to categorize instances with regard to an ontology. The approach is evaluated against manual annotations and against one of the most popular annotation tools of the same kind. The approach is implemented in the .NET framework as an annotation tool for the Semantic Web and uses WordNet as its knowledge base.
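As a toy illustration of dictionary-backed annotation of this kind (the heuristic of taking the first WordNet sense and its hypernym as a category is this sketch's assumption, not necessarily the paper's method, and the tool itself is .NET rather than Python):

```python
# Requires: pip install nltk; then nltk.download("wordnet") once.
from nltk.corpus import wordnet as wn

def annotate(term):
    """Look a term up in WordNet and attach a category from its hypernym."""
    synsets = wn.synsets(term, pos=wn.NOUN)
    if not synsets:
        return None
    sense = synsets[0]                  # naive: take the most frequent sense
    hypernyms = sense.hypernyms()
    category = hypernyms[0].name() if hypernyms else sense.name()
    return {"term": term, "sense": sense.name(), "category": category}

print(annotate("jaguar"))   # e.g. tags 'jaguar' with a feline category
```

A semi-automatic tool would present such candidate categories to the user for confirmation rather than committing to the first sense, which is where the manual effort it saves is spent.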

Decision Tree-based Feature Ranking using Manhattan Hierarchical Cluster Criterion

Feature selection is gaining importance due to its contribution to saving classification cost in terms of time and computational load. One method of searching for essential features is via a decision tree, which acts as an intermediate feature-space inducer. In decision tree-based feature selection, some studies use the decision tree as a feature ranker with a direct threshold measure, while others retain the decision tree but utilize a pruning condition as the threshold mechanism for choosing features. This paper proposes a threshold measure based on Manhattan hierarchical cluster distance, to be utilized in feature ranking in order to choose relevant features as part of the feature selection process. The results are promising, and the method can be improved in the future by including test cases with larger numbers of attributes.
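A condensed sketch of the pipeline this describes (the dataset, the two-cluster split and the keep-the-top-feature's-cluster rule are illustrative assumptions): a decision tree ranks features by importance, and hierarchical clustering of those scores under the Manhattan (cityblock) metric supplies the threshold instead of a hand-picked cut-off.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier
from scipy.cluster.hierarchy import linkage, fcluster

# 1) Decision tree as feature ranker: importance score per feature.
X, y = load_breast_cancer(return_X_y=True)
imp = DecisionTreeClassifier(random_state=0).fit(X, y).feature_importances_

# 2) Hierarchical clustering of the scores under the Manhattan metric;
#    the cluster containing the top-ranked feature is kept, replacing a
#    manually chosen importance threshold.
Z = linkage(imp.reshape(-1, 1), method="average", metric="cityblock")
labels = fcluster(Z, t=2, criterion="maxclust")     # split into 2 groups
keep = labels == labels[np.argmax(imp)]             # top feature's group
print("selected feature indices:", np.flatnonzero(keep))
```

The design choice being illustrated is that the cut-off between relevant and irrelevant features falls where the cluster structure of the importance scores puts it, rather than at an arbitrary value.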