SIP Authentication Scheme using ECDH

SIP (Session Initiation Protocol), which uses simple and efficient text-based (HTTP-like) call control messaging, has recently been widely adopted in VoIP networks. For authentication and authorization purposes there are many approaches and considerations for securing SIP against forgery and for protecting the integrity of SIP messages. Elliptic Curve Cryptography (ECC), on the other hand, has significant advantages over other public key cryptography (PKC) systems, such as smaller key sizes and faster computations, which make data transmission more secure and efficient. In this work a new approach is proposed for secure SIP authentication, using an ECC-based public key exchange mechanism. By adopting an elliptic-curve-based key exchange, the total execution time and memory requirements of the proposed scheme are improved in comparison with non-elliptic approaches.
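
As a minimal illustration of the elliptic-curve key agreement the scheme builds on (a sketch, not the authors' full SIP authentication protocol), the following Python example derives a shared session key with ECDH; the curve choice and HKDF parameters are assumptions made for the example.

```python
# Minimal ECDH key-agreement sketch (illustrative only, not the paper's full SIP scheme).
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Each SIP endpoint generates an ephemeral elliptic-curve key pair.
ua_private = ec.generate_private_key(ec.SECP256R1())
server_private = ec.generate_private_key(ec.SECP256R1())

# Public keys are exchanged inside the SIP handshake (handshake details omitted here).
ua_shared = ua_private.exchange(ec.ECDH(), server_private.public_key())
server_shared = server_private.exchange(ec.ECDH(), ua_private.public_key())
assert ua_shared == server_shared  # both sides derive the same secret

# Derive a session key for authenticating subsequent SIP messages.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"sip-auth").derive(ua_shared)
```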

A Video-Based Algorithm for Moving Object Detection at Signalized Intersections

Mixed-traffic (e.g., pedestrian, bicycle, and vehicle) data at an intersection is one of the essential factors for intersection design and traffic control. However, some data, such as pedestrian volume, cannot be directly collected by common detectors (e.g., inductive loop, sonar, and microwave sensors). In this paper, a video-based detection algorithm is proposed for mixed-traffic data collection at intersections using surveillance cameras. The algorithm is derived from the Gaussian Mixture Model (GMM) and uses a mergence time adjustment scheme to improve on the traditional algorithm. Real-world video data were selected to test the algorithm. The results show that the proposed algorithm has faster processing speed and higher accuracy than the traditional algorithm. This indicates that the improved algorithm can be applied to detect mixed traffic at signalized intersections, even when conflicts occur.
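
As a baseline illustration of GMM-based foreground extraction (the mergence time adjustment is the paper's own contribution and is not reproduced here), a sketch using OpenCV's MOG2 background subtractor might look as follows; the file name and parameter values are placeholders.

```python
# Baseline GMM background subtraction with OpenCV (illustrative sketch only).
import cv2

cap = cv2.VideoCapture("intersection.avi")        # hypothetical input file
gmm = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                         detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg_mask = gmm.apply(frame)                    # per-pixel foreground mask
    # Clean the mask and extract moving objects (pedestrians, bicycles, vehicles).
    fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_OPEN,
                               cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3)))
    contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    blobs = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 200]
cap.release()
```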

e-Service Innovation within Open Innovation Networks

Service innovation is a central concern in a fast-changing environment. Due to shifting customer demands and advances in information technology (IT) in service management, an expanded conceptualization of e-service innovation is required. In particular, innovation practices have become increasingly challenging, driving managers to employ a different, open innovation model to maintain competitive advantages. At the same time, firms need to interact with external and internal customers in innovative environments, such as open innovation networks, to co-create value. Based on these issues, a conceptual framework of e-service innovation is developed. This paper aims to examine the factors contributing to e-service innovation and firm performance, including financial and non-financial aspects. The study concludes by showing how e-service innovation can play a significant role in growing the overall value of the firm. The discussion and conclusion lead to a stronger understanding of e-service innovation and of co-creating value with customers within open innovation networks.

The Self-Propelled Model of a Boat, Based on the Wave Thrust

We investigate a boat model based on the conversion of surface-wave energy into a sequence of unidirectional jet pulses; in other words, a model of a boat that is thrust forward by the wave field on the water surface. These pulses form an average reactive stream from the output nozzle at the stern of the boat. The suggested model provides the conversion of its oscillatory motions (both pitching and rolling) into a jet flow. This becomes possible due to the special construction of the boat and to several components that are sensitive to the local wave field. The boat model constitutes a uniflow jet engine without slow conversions of mechanical energy into intermediate forms and without any external sources of energy (besides surface waves). The motion of the boat is characterized by fast jerks and an average forward velocity that exceeds the velocities of the liquid particles in the wave.

Public Transport Prospects of People with Reduced Mobility in Hungary

To comply with international human rights legislation concerning freedom of movement, transport systems are required to be made accessible so that all citizens, regardless of their physical condition, have equal possibilities to use them. In Hungary, there is apparently a considerable shortfall in the development of accessible public transport. This study aims to provide an overview of the current Hungarian situation and to reveal the reasons for the deficiency. The results show that, despite the relatively favourable legal background regarding accessibility needs and the rights of persons with disabilities, there is a significant delay in putting it into practice in the field of public transport. The main reasons are the lack of financial resources and, related to this, the absence of mandatory regulations. In addition, the ownership rights related to public transport are varied, which further limits the possibilities for improvement. Consequently, an accurate and detailed regulatory procedure is needed first of all to change the present unfavourable situation and to create the conditions for rapid implementation, which is already overdue.

Digital Hypertexts vs. Traditional Books: An Inquiry into Non-Linearity

The current study begins with an awareness that today's media environment is characterized by technological development and a new way of reading brought about by the introduction of the Internet. The researcher conducted a meta-analysis framed within Technological Determinism to investigate the process of hypertext reading, its differences from linear reading, and the effects such differences can have on people's ways of mentally structuring their world. The relationship between literacy and the comprehension achieved by reading hypertexts is also investigated. The results show that hypertexts are not always user friendly. People experience hyperlinks as interruptions that distract their attention, generating comprehension problems and disorientation. On the one hand, the jumping style of hypertext reading generates interruptions that finally make people lose their concentration. On the other hand, hypertexts fascinate people, who would rather read a document in such a format even though the outcome is often frustrating and affects their ability to elaborate and retain information.

Fast Algorithm of Infrared Point Target Detection in Fluctuant Background

A background estimation approach using a small-window median filter is presented on the basis of analyzing the IR point target, noise, and clutter models. After simplifying the two-dimensional filter, a simple method adopting a one-dimensional median filter is illustrated to estimate the background according to the characteristics of the IR scanning system. An adaptive threshold is then used to segment the background-cancelled image. Experimental results show that the algorithm achieves good performance and satisfies the requirement of real-time processing of large images.
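
A minimal sketch of the described pipeline, assuming a row-wise one-dimensional median window and a mean-plus-k-sigma threshold (the window size and threshold rule are assumptions, not the paper's exact parameters):

```python
# 1-D median-filter background estimation along the scan direction, followed by
# adaptive thresholding of the background-cancelled image (illustrative sketch).
import numpy as np
from scipy.ndimage import median_filter

def detect_point_targets(image, window=7, k=4.0):
    # Estimate the fluctuating background row by row with a small 1-D median window.
    background = median_filter(image.astype(np.float64), size=(1, window))
    residual = image - background          # background-cancelled image
    # Adaptive threshold: mean plus k standard deviations of the residual.
    threshold = residual.mean() + k * residual.std()
    return residual > threshold            # boolean map of candidate point targets
```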

Fault Localization and Alarm Correlation in Optical WDM Networks

For many high-speed networks, providing resilience against failures is an essential requirement. A main goal in designing next-generation optical networks is protecting and restoring high-capacity WDM networks from failures. Quick detection, identification, and restoration make networks more robust and reliable, even though failures cannot be avoided. Hence, it is necessary to develop fast, efficient, and dependable fault localization and detection mechanisms. In this paper we propose a new fault localization algorithm for WDM networks which can identify the location of a failure on a failed lightpath. Our algorithm detects the failed connection and then attempts to reroute the data stream through an alternate path. In addition, we develop an algorithm to analyze the information in the alarms generated by the components of an optical network in the presence of a fault. It uses alarm correlation to reduce the list of suspected components shown to network operators. Our simulation results show that the proposed algorithms achieve lower blocking probability and delay while obtaining higher throughput.
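
One simple way to realize the alarm-correlation idea, sketched here under the assumption that each alarm is associated with the set of components whose failure could have triggered it, is to intersect those sets; this is an illustration, not the paper's exact algorithm.

```python
# Simplified alarm correlation: intersect the component sets covered by the alarms
# that fired, shrinking the suspect list shown to the operator (a sketch only).
def correlate_alarms(raised_alarms, alarm_domains):
    """alarm_domains maps an alarm id to the set of components whose failure
    could have triggered it; raised_alarms is the set of alarm ids received."""
    suspects = None
    for alarm in raised_alarms:
        domain = set(alarm_domains[alarm])
        suspects = domain if suspects is None else suspects & domain
    return suspects or set()

# Example: a fault on link L2 triggers alarms on both lightpaths that traverse it.
domains = {"A1": {"L1", "L2"}, "A2": {"L2", "L3"}}
print(correlate_alarms({"A1", "A2"}, domains))   # -> {'L2'}
```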

An Approach of Control System for Automated Storage and Retrieval System (AS/RS)

Automated storage and retrieval systems (AS/RS) have become frequently used systems in warehouses. There has been a transition from human-operated forklift applications to fast and safe AS/RS applications in firms' warehouse systems. In this study, the basic components and automation systems of an AS/RS are examined. The proposed system's automation components and their tasks in the system control algorithm are described. From this control algorithm, the control system structure was obtained.

Removal of Chlorinated Resin and Fatty Acids from Paper Mill Wastewater through a Constructed Wetland

This study evaluates the performance of a horizontal subsurface flow constructed wetland (HSSF-CW) for the removal of chlorinated resin and fatty acids (RFAs) from pulp and paper mill wastewater. The dimensions of the treatment system were 3.5 m x 1.5 m x 0.28 m, with a surface area of 5.25 m², filled with fine sand and gravel. The cell was planted with the ornamental plant species Canna indica. The removal efficiency of chlorinated RFAs was in the range of 92-96% at a hydraulic retention time (HRT) of 5.9 days. Plant biomass and soil (sand and gravel) were analyzed for chlorinated RFA content. No chlorinated RFAs were detected in the plant biomass, but they were detected in soil samples. Mass balance studies of chlorinated RFAs in the HSSF-CW were also carried out.
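
For reference, the removal efficiency and the mass balance referred to above follow the usual definitions (a standard formulation, not taken from the paper), where C_in and C_out are influent and effluent concentrations, Q is the flow rate, and the M terms are the masses retained in soil, taken up by plants, or degraded:

```latex
\eta = \frac{C_{in} - C_{out}}{C_{in}} \times 100\%, \qquad
Q\,C_{in} = Q\,C_{out} + M_{soil} + M_{plant} + M_{degraded}
```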

Using Support Vector Machines for Prediction of Dynamic Voltage Collapse in an Actual Power System

This paper presents dynamic voltage collapse prediction on an actual power system using support vector machines (SVMs). Dynamic voltage collapse prediction is first determined based on the power transfer stability index (PTSI) calculated from information in the dynamic simulation output. Simulations were carried out on a practical 87-bus test system, considering load increase as the contingency. The data collected from the time-domain simulation are then used as input to the SVM, in which support vector regression is used as a predictor to determine the dynamic voltage collapse indices of the power system. To reduce training time and improve the accuracy of the SVM, the kernel function type and kernel parameters are considered. To verify the effectiveness of the proposed SVM method, its performance is compared with that of a multi-layer perceptron neural network (MLPNN). Studies show that the SVM gives faster and more accurate results for dynamic voltage collapse prediction than the MLPNN.
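
A sketch of the regression step using scikit-learn's SVR as a stand-in (the kernel type, parameter values, and synthetic data below are placeholders, not the paper's tuned configuration or the 87-bus data):

```python
# Support vector regression sketch for predicting a voltage-collapse index such as
# the PTSI from time-domain simulation features (illustrative stand-in only).
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# X: features extracted from dynamic simulation output (e.g. bus voltages, loadings);
# y: the corresponding collapse index values. Random data stands in for the real case.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(200, 10)), rng.normal(size=200)

model = make_pipeline(StandardScaler(),
                      SVR(kernel="rbf", C=10.0, gamma=0.1, epsilon=0.01))
model.fit(X, y)
predicted_index = model.predict(X[:5])
```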

Project Management Success for Contractors

The aim of this paper is to provide a better understanding of the implementation of project management practices by UiTM contractors to ensure project success. A questionnaire survey was administered to 120 UiTM contractors in Malaysia. The purpose of this method was to gather information on the contractors' project background and project management skills. It was found that all of the contractors had a basic knowledge and understanding of project management skills. It is suggested that a reasonable project plan and an appropriate organizational structure are influential factors for project success. The need for contractors to have an effective program of work and an up-to-date information system is also emphasized.

Comparison of Methods of Testing Composite Slabs

Composite steel-concrete slabs using thin-walled corrugated steel sheets with embossments represent a modern and effective combination of steel and concrete. However, the design of new types of sheeting is conditional on the execution of expensive and time-consuming laboratory testing. The effort to develop a cheaper and faster method has led to many investigations all over the world. In this paper we compare the results from our experiments involving vacuum loading, four-point bending, and small-scale shear tests.

Practical Issues for Real-Time Video Tracking

In this paper we present an algorithm that allows object tracking close to real time in Full HD videos. The frame rate (FR) of a video stream is considered to be between 5 and 30 frames per second. Real-time track building is achieved if the algorithm can process 5 or more frames per second. The principal idea is to use fast algorithms during preprocessing to obtain the key points and then track them. The procedure of matching points during assignment is strongly dependent on the number of points. Because of this, we have to limit the number of points, using the most informative of them.
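
A sketch of capping the keypoint count at the most informative points, here using OpenCV's ORB as one possible fast detector (the cap of 500 points is an assumption):

```python
# Fast keypoint extraction with an explicit cap, keeping the strongest-response
# points so the matching cost per frame stays bounded (illustrative sketch).
import cv2

orb = cv2.ORB_create(nfeatures=500)

def extract_keypoints(frame, max_points=500):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    keypoints, descriptors = orb.detectAndCompute(gray, None)
    if descriptors is None:
        return [], None
    # Keep only the most informative (strongest-response) keypoints.
    order = sorted(range(len(keypoints)), key=lambda i: keypoints[i].response,
                   reverse=True)[:max_points]
    return [keypoints[i] for i in order], descriptors[order]
```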

Assessing Extension of Meeting System Performance in Information Technology in Defense and Aerospace Project

The Ministry of Defense (MoD) spends hundreds of millions of dollars on software to support its infrastructure, operate its weapons, and provide command, control, communications, computing, intelligence, surveillance, and reconnaissance (C4ISR) functions. These and all other new advanced systems have a common critical component: information technology. The defense and aerospace environment is continuously striving to keep up with increasingly sophisticated information technology (IT) in order to remain effective in today's dynamic and unpredictable threat environment. This makes IT one of the largest and fastest growing expenses of defense. Hundreds of millions of dollars are spent each year on IT projects, but too many of those millions are wasted on costly mistakes: systems that do not work properly, new components that are not compatible with old ones, trendy new applications that do not really satisfy defense needs, or money lost through poorly managed contracts. This paper investigates and compiles effective strategies that aim to end exasperation with the low returns and high cost of information technology acquisition for defense; it tries to show how to maximize value while reducing time and expenditure.

Block Cipher Based on Randomly Generated Quasigroups

Quasigroups are algebraic structures closely related to Latin squares and have many different applications. The construction of the block cipher is based on quasigroup string transformations. This article describes a block cipher based on a quasigroup of order 256, suitable for fast software encryption of messages written in universal ASCII code. The novelty of this cipher lies in the fact that every time the cipher is invoked, a new set of two randomly generated quasigroups is used, which in turn is used to create a pair of quasigroups with dual operations. The cryptographic strength of the block cipher is examined by calculating the XOR-distribution tables. In this approach, certain algebraic operations allow quasigroups of huge order to be used without any requirement that they be stored.
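
A toy sketch of a quasigroup string e-transformation and its inverse, using order 16 for readability and a quasigroup obtained by permuting the Cayley table of addition mod n (the paper's randomly generated quasigroups of order 256 and its pair of dual operations are not reproduced):

```python
# Toy quasigroup string e-transformation (illustrative, not the paper's cipher).
import random

def random_quasigroup(n, seed=0):
    rnd = random.Random(seed)
    rows, cols = list(range(n)), list(range(n))
    rnd.shuffle(rows); rnd.shuffle(cols)
    # Row/column permutations of a Latin square yield another Latin square (quasigroup).
    return [[(rows[a] + cols[b]) % n for b in range(n)] for a in range(n)]

def left_inverse(Q):
    n = len(Q)
    inv = [[0] * n for _ in range(n)]
    for a in range(n):
        for b in range(n):
            inv[a][Q[a][b]] = b          # parastrophe: solves Q(a, x) = c for x
    return inv

def encrypt(Q, leader, message):
    out, prev = [], leader
    for m in message:
        prev = Q[prev][m]                # c_i = Q(c_{i-1}, m_i)
        out.append(prev)
    return out

def decrypt(Q, leader, cipher):
    inv, out, prev = left_inverse(Q), [], leader
    for c in cipher:
        out.append(inv[prev][c])         # m_i recovered from c_{i-1} and c_i
        prev = c
    return out

Q = random_quasigroup(16)
msg = [3, 7, 0, 12, 5]
assert decrypt(Q, 9, encrypt(Q, 9, msg)) == msg
```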

Emotion Recognition Using Neural Network: A Comparative Study

Emotion recognition is an important research field with many applications nowadays. This work focuses on recognizing different emotions from the speech signal. The extracted features are related to statistics of pitch, formant, and energy contours, as well as spectral, perceptual, and temporal features, jitter, and shimmer. An artificial neural network (ANN) was chosen as the classifier. Our concern is finding a robust and fast ANN classifier suitable for different real-life applications. Several experiments were carried out on different ANNs to investigate the factors that impact the classification success rate. Using a database containing 7 different emotions, it is shown that with proper and careful adjustment of the feature format, training data sorting, number of features selected, and even the ANN type and architecture used, a success rate of 85% or more can be achieved without increasing the system complexity or the computation time.
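
A sketch of such a feature-plus-ANN pipeline, with pitch, energy, and spectral statistics extracted via librosa feeding a multilayer perceptron (the feature set and network size are illustrative assumptions, not the paper's tuned configuration):

```python
# Speech emotion recognition sketch: per-utterance feature statistics + MLP classifier.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

def features(wav_path):
    y, sr = librosa.load(wav_path, sr=None)
    f0 = librosa.yin(y, fmin=65, fmax=400, sr=sr)        # pitch contour
    energy = librosa.feature.rms(y=y)[0]                  # energy contour
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)    # spectral shape

    def stats(v):
        return [np.mean(v), np.std(v), np.min(v), np.max(v)]

    return np.array(stats(f0) + stats(energy) + mfcc.mean(axis=1).tolist())

# X: one feature vector per utterance, y: one of the 7 emotion labels.
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500)
# clf.fit(X_train, y_train); accuracy = clf.score(X_test, y_test)
```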

On the Efficient Implementation of a Serial and Parallel Decomposition Algorithm for Fast Support Vector Machine Training Including a Multi-Parameter Kernel

This work deals with aspects of support vector machine learning for large-scale data mining tasks. Based on a decomposition algorithm for support vector machine training that can be run in serial as well as shared-memory parallel mode, we introduce a transformation of the training data that allows for the usage of an expensive generalized kernel without additional costs. We present experiments for the Gaussian kernel, but other kernel functions can be used as well. In order to further speed up the decomposition algorithm, we analyze the critical problem of working set selection for large training data sets. In addition, we analyze the influence of the working set sizes on the scalability of the parallel decomposition scheme. Our tests and conclusions led to several modifications of the algorithm and to improved overall support vector machine learning performance. Our method allows for using extensive parameter search methods to optimize classification accuracy.
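
One plausible reading of the data transformation mentioned above, assuming the generalized kernel is a Gaussian with per-feature widths: rescaling each feature once lets the cheap standard Gaussian kernel reproduce the multi-parameter kernel, as the following sketch (not the authors' code) verifies.

```python
# Reducing a multi-parameter (per-feature width) Gaussian kernel to the standard one
# by a single up-front rescaling of the data (illustrative check of the identity).
import numpy as np

def generalized_gaussian(x, z, sigmas):
    return np.exp(-np.sum((x - z) ** 2 / (2.0 * sigmas ** 2)))

def standard_gaussian(x, z):
    return np.exp(-0.5 * np.sum((x - z) ** 2))

sigmas = np.array([0.5, 2.0, 1.0])
x, z = np.array([1.0, 2.0, 3.0]), np.array([0.0, 1.0, -1.0])

# Scale each feature by 1/sigma_i once; every later kernel evaluation then uses
# the cheap standard Gaussian on the transformed data.
x_t, z_t = x / sigmas, z / sigmas
assert np.isclose(generalized_gaussian(x, z, sigmas), standard_gaussian(x_t, z_t))
```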

An Improved Greedy Routing Algorithm for Grid using Pheromone-Based Landmarks

This paper aims to extend Jon Kleinberg's research. He introduced the small-world structure in a grid and showed that a greedy algorithm using only local information is able to find a route between source and target with delivery time O(log^2 n). His fundamental model of a distributed system uses a two-dimensional grid with long-range random links added between any two nodes u and v with probability proportional to d(u,v)^-2. We propose that, with additional information about nearby long links, we can find a shorter path. We apply the ant colony system as a messenger that distributes its pheromone, the long-link details, in the surrounding area. Subsequent forwarding decisions then have more options: select among local neighbors, or send to a node whose long link is closer to the target. Our experimental results support our approach: the average routing time with Color Pheromone is faster than with the greedy method.
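
A minimal sketch of the underlying greedy routing step on Kleinberg's grid, where a node forwards to whichever local or long-range neighbor is closest to the target in Manhattan distance (the pheromone-informed selection of nearby long links is not reproduced here):

```python
# Greedy grid routing in the spirit of Kleinberg's small-world model (sketch only).
def manhattan(u, v):
    return abs(u[0] - v[0]) + abs(u[1] - v[1])

def greedy_route(source, target, long_links, n):
    """long_links maps a node to its long-range contact; n is the grid side length."""
    path, current = [source], source
    while current != target:
        x, y = current
        neighbors = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                     if 0 <= x + dx < n and 0 <= y + dy < n]
        if current in long_links:
            neighbors.append(long_links[current])
        # Forward to the neighbor closest to the target (local or long-range).
        current = min(neighbors, key=lambda v: manhattan(v, target))
        path.append(current)
    return path

# Example on a 6x6 grid with one long-range link that shortcuts toward the target.
print(greedy_route((0, 0), (5, 5), {(1, 0): (4, 4)}, 6))
```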

Fast Wavelength Calibration Algorithm for Optical Spectrum Analyzers

In this paper, an algorithm for fast wavelength calibration of optical spectrum analyzers (OSAs) using low-power reference gas spectra is proposed. Existing OSAs need a reference spectrum with low noise for precise detection of the reference extreme values; to generate this spectrum, costly hardware with high optical power is necessary. With this new wavelength calibration algorithm it is possible to use a noisy reference spectrum, and therefore hardware costs can be cut. The algorithm filters the reference spectrum and extracts the key information by segmenting it and finding the local minima and maxima. Afterwards, the slope and offset of a linear correction function that best matches the measured and theoretical spectra are found by correlating the measured minima with the stored ones. With this algorithm, reliable wavelength referencing of an OSA can be implemented on a microcontroller with a calculation time of less than one second.
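
A sketch of these calibration steps, smoothing the noisy spectrum, locating the line minima, matching them to stored theoretical line positions, and fitting the linear correction (filter and peak-detection parameters are assumptions):

```python
# Wavelength calibration sketch: smooth, find absorption-line minima, match to
# stored reference lines, fit a linear slope/offset correction (illustrative only).
import numpy as np
from scipy.signal import savgol_filter, find_peaks

def wavelength_correction(measured_wl, measured_power, reference_lines):
    # Filter the low-power, noisy reference spectrum before extracting extrema.
    smoothed = savgol_filter(measured_power, window_length=11, polyorder=3)
    # Absorption lines appear as local minima -> peaks of the negated signal.
    minima, _ = find_peaks(-smoothed, prominence=0.5)
    measured_lines = measured_wl[minima]
    # Pair each detected minimum with the nearest stored theoretical line.
    matched_ref = [reference_lines[np.argmin(np.abs(reference_lines - wl))]
                   for wl in measured_lines]
    # Linear correction: lambda_true = slope * lambda_measured + offset.
    slope, offset = np.polyfit(measured_lines, matched_ref, 1)
    return slope, offset
```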