A Model for Collaborative COTS Software Acquisition (COSA)

Acquiring commercial off-the-shelf (COTS) software applications is becoming routine in organizations. However, eliciting user requirements, finding candidate COTS products and making the acquisition decision is a complex task, especially for SMEs that lack the time and knowledge needed to do it properly. The existing models intended to help decision makers were originally designed for professional use, so SMEs are forced to rely on the software vendor’s ability to solve the problem with the systems it provides. In this paper, we develop a model for SMEs for the acquisition of COTS software products. The guiding idea of the model is that an ICT investment is fundamentally a change initiative and should therefore also be treated as a process of organizational learning. The model is designed with three objectives in mind: 1) business orientation, 2) agility, and 3) learning and knowledge management orientation. It can be applied to ICT investments in SMEs that have a professional team leader with basic business and IT knowledge.

Motion Control of an Autonomous Surface Vessel for Enhanced Situational Awareness

This paper focuses on critical components of situational awareness (SA): the control of position and orientation of an autonomous surface vessel (ASV). Moving a vessel into a desired sea area is a challenging but important task for ASVs aiming at a high level of autonomy under adverse conditions. Within the SA strategy, we propose neural control of the initial, approach stage of the ASV trajectory using a neural network predictive controller, and control of the yaw moment for the circular motion in the final stage of the trajectory. The control system has been demonstrated and evaluated by simulating maritime maneuvers in the Simulink software package. The simulation results indicate that fast SA with economical use of energy can be achieved by similar ASVs during maritime search-and-rescue missions.
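As a rough illustration of the circular-motion phase only (not the paper’s Simulink model), the sketch below simulates yaw-moment control of a first-order Nomoto yaw model at constant surge speed; all parameters, gains and the desired yaw rate are assumed values.

```python
import numpy as np

# Sketch only: circular motion of a surface vessel via yaw-moment control of
# first-order Nomoto yaw dynamics at constant surge speed. All parameters
# (K, T, u, r_ref, kp) are assumed values, not taken from the paper.
K, T = 0.2, 5.0            # Nomoto gain and time constant (assumed)
u = 2.0                    # constant surge speed, m/s (assumed)
r_ref = 0.05               # desired yaw rate for the circular phase, rad/s
kp = 40.0                  # proportional gain on the yaw-rate error (assumed)

dt, steps = 0.1, 3000
x = y = psi = r = 0.0      # position, heading, yaw rate

for _ in range(steps):
    delta = kp * (r_ref - r)           # commanded yaw moment / rudder action
    r += dt * (-r + K * delta) / T     # Nomoto yaw dynamics
    psi += dt * r
    x += dt * u * np.cos(psi)          # planar kinematics at constant speed
    y += dt * u * np.sin(psi)

# The steady turning radius is approximately u / r_ref.
print(f"final position ({x:.1f}, {y:.1f}) m, turn radius ~ {u / r_ref:.0f} m")
```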

Measuring the Cognitive Abilities of Teenage Basketball Players in Singapore

This paper discusses the use of a computerized test to measure the decision-making abilities of teenage basketball players in Singapore. The test has five sections: the Competitive State Anxiety Inventory-2 (CSAI-2) questionnaire (measuring the player’s cognitive anxiety, somatic anxiety and self-confidence), the Corsi block-tapping task (measuring the player’s short-term spatial memory), the Situation Awareness Global Assessment Technique (SAGAT) (measuring the player’s situation awareness in a basketball game), multiple-choice questions on basketball knowledge (measuring the player’s knowledge of basketball rules and concepts), and lastly a learning test that requires participants to recall and recognize basketball set plays (measuring the player’s ability to learn and recognize set plays). A total of 25 basketball players, aged 14 to 16 years old, from three secondary school teams participated in this experiment. The results of this cognitive test were then compared with the players’ physical fitness and basketball performance.

Video-Based Face Recognition Based On State-Space Model

This paper proposes a video-based framework for face recognition that identifies which faces appear in a video sequence. Our basic idea resembles a tracking task: we track a set of candidate identities over time according to the observed visual features of face images in the video frames. Hence, we employ a state-space model to formulate video-based face recognition, dividing the problem into two parts: the likelihood measure and the transition measure. The likelihood measure recognizes whose face is currently being observed in a video frame, for which two-dimensional linear discriminant analysis is employed. The transition measure estimates the probability of changing from an incorrect recognition at the previous stage to the correct person at the current stage. Moreover, extra nodes associated with head nodes are incorporated into the proposed state-space model. Experimental results are provided to demonstrate the robustness and efficiency of the proposed approach.
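A minimal sketch of the state-space recognition idea follows, assuming a simple forward (predict-and-update) recursion over candidate identities; the placeholder likelihood scores stand in for the paper’s 2D-LDA matching, and the transition-matrix values are assumptions.

```python
import numpy as np

# Sketch: maintain a belief over candidate identities and update it each frame
# with a likelihood term (random placeholder scores standing in for 2D-LDA
# matching) and a transition term that allows correcting an earlier wrong
# identity. All numbers below are assumptions for illustration.
rng = np.random.default_rng(0)
n_ids, n_frames = 5, 20
stay = 0.9  # probability of keeping the current identity hypothesis (assumed)
transition = np.full((n_ids, n_ids), (1 - stay) / (n_ids - 1))
np.fill_diagonal(transition, stay)

belief = np.full(n_ids, 1.0 / n_ids)            # uniform prior over identities
true_id = 2
for t in range(n_frames):
    # Placeholder per-frame likelihoods; a real system would score the face
    # crop against each identity using discriminant features.
    lik = rng.uniform(0.1, 0.5, n_ids)
    lik[true_id] += 0.5
    belief = lik * (transition.T @ belief)       # forward (predict + update)
    belief /= belief.sum()

print("recognized identity:", int(np.argmax(belief)))
```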

Automatic Tuning for a Systemic Model of Banking Originated Losses (SYMBOL) Tool on Multicore

Nowadays, mathematical and statistical applications are developed with ever greater complexity and accuracy. This precision and complexity means that applications need more computational power in order to execute quickly. Multicore environments play an important role in optimizing the execution time of these applications, as they allow more parallelism to be exploited inside a node. However, taking advantage of this parallelism is not easy, because we have to deal with problems such as inter-core communication, data locality, memory sizes (cache and RAM), synchronization, and data dependencies in the model. These issues become more important when we wish to improve the application’s performance and scalability. Hence, this paper describes an optimization method developed for the Systemic Model of Banking Originated Losses (SYMBOL) tool of the European Commission, based on analyzing the application’s weaknesses in order to exploit the advantages of multicore hardware. All improvements are made in an automatic and transparent manner with the aim of improving the performance metrics of the tool. Finally, experimental evaluations show the effectiveness of the new optimized version, which achieves a considerable improvement in execution time: around 96% in the best case tested, comparing the original serial version with the automatic parallel version.
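The sketch below illustrates the general kind of node-level parallelization described, splitting independent Monte Carlo replications across cores with a Python process pool; the toy loss model is a stand-in and is not the SYMBOL model or the authors’ automatic tuning mechanism.

```python
# Sketch: distribute independent simulation replications across the cores of
# a node. The "loss" computed per replication is a toy placeholder.
from concurrent.futures import ProcessPoolExecutor
import os
import random

def simulate_chunk(args):
    seed, n_runs = args
    rng = random.Random(seed)
    # Toy loss per replication (assumed placeholder for the real model).
    return [sum(rng.gauss(0, 1) for _ in range(1000)) for _ in range(n_runs)]

def run_parallel(total_runs=2_000, workers=None):
    workers = workers or os.cpu_count()
    chunk = total_runs // workers
    tasks = [(seed, chunk) for seed in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(simulate_chunk, tasks))
    return [loss for part in results for loss in part]

if __name__ == "__main__":
    losses = run_parallel()
    print(len(losses), "replications simulated on", os.cpu_count(), "cores")
```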

Optical Flow Based Moving Object Detection and Tracking for Traffic Surveillance

Automated motion detection and tracking is a challenging task in traffic surveillance. In this paper, a system is developed to gather useful information from stationary cameras for detecting moving objects in digital videos. The motion detection and tracking system is based on optical flow estimation, combined with several relevant computer vision and image processing techniques to enhance the process. A median filter is used to remove noise, and unwanted objects are removed by applying thresholding and morphological operations. Object-type restrictions are then set using blob analysis. The results show that the proposed system successfully detects and tracks moving objects in urban videos.
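A minimal sketch of such a pipeline, assuming OpenCV as the implementation library (the abstract does not state the toolchain): dense optical flow, median filtering, thresholding, morphological clean-up and blob analysis. The input file name and all thresholds are assumptions.

```python
import cv2
import numpy as np

# Sketch: detect moving objects in a traffic video via dense optical flow.
# "traffic.mp4", the flow-magnitude threshold and the blob-size limit are
# hypothetical choices for illustration.
cap = cv2.VideoCapture("traffic.mp4")
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
kernel = np.ones((5, 5), np.uint8)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    mag = cv2.medianBlur(mag.astype(np.float32), 5)        # suppress noise
    mask = (mag > 1.0).astype(np.uint8) * 255              # threshold (assumed)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove specks
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    for i in range(1, n):                                   # blob analysis
        x, y, w, h, area = stats[i]
        if area > 200:                                      # object-size limit
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    prev_gray = gray
cap.release()
```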

Comparisons of Fine Motor Functions in Subjects with Parkinson’s Disease and Essential Tremor

This study explores the clinical features of neurodegenerative disease patients with tremor, examining motor impairments in patients with Parkinson’s disease (PD) and essential tremor (ET). Since it is uncertain whether PD and ET patients show similar degrees of impairment during motor tasks, this study uses a self-developed computerized handwriting movement analysis to characterize the motor functions of the two conditions. The recruited subjects had a confirmed diagnosis of one of these neurodegenerative diseases and underwent general clinical evaluations by physicians in the first year. We recruited 8 participants with PD and 10 with ET; an additional 12 participants without any neuromuscular dysfunction were recruited as a control group. Fine motor control of penmanship on a digital tablet was used for the sensorimotor function tests. Movement speed in the PD and ET groups was found to be significantly slower than in the normal control group. In movement intensity and speed, subjects with ET showed clinical features similar to those of the PD subjects: the ET group showed smaller and slower movements than the control group, but not to the same extent as the PD group. The results of this study contribute to the early screening and detection of these diseases and to the evaluation of disease progression.

Amplitude and Phase Analysis of EEG Signal by Complex Demodulation

Analysis of the amplitude and phase characteristics of the delta, theta, and alpha bands at localized time instants in EEG signals is important for characterizing information processing in the brain. In this paper, the complex demodulation method is used to analyze electroencephalographic (EEG) signals, particularly auditory evoked potential responses, with the required time resolution and designated frequency bandwidth. Complex demodulation decomposes the raw EEG signal into the three designated delta, theta, and alpha bands, giving a complex-valued representation at each sampled time instant from which the amplitude envelope and phase information can be extracted. On simulated test data and on real EEG signals acquired during an auditory attention task, the method extracts the phase offset, the instants at which phase and frequency change, and the decomposed amplitude envelope for the delta, theta, and alpha bands. The complex demodulation technique can thus be used efficiently in brain signal analysis whenever phase and amplitude information is required.
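The following sketch shows complex demodulation of a single band (a 10 Hz alpha component) on synthetic data; the sampling rate, filter order and low-pass cutoff are assumptions, not the authors’ settings.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Sketch: complex demodulation of one band. Shift the band of interest to
# 0 Hz by multiplying with a complex exponential, low-pass filter, and read
# off the amplitude envelope and phase.
fs = 250.0
t = np.arange(0, 4, 1 / fs)
eeg = 1.5 * np.cos(2 * np.pi * 10 * t + 0.8) + 0.3 * np.random.randn(t.size)

f0 = 10.0                                     # band center frequency (alpha)
z = eeg * np.exp(-1j * 2 * np.pi * f0 * t)    # shift the band to 0 Hz
b, a = butter(4, 2.0 / (fs / 2))              # ~2 Hz low-pass = band half-width
z = filtfilt(b, a, z.real) + 1j * filtfilt(b, a, z.imag)

amplitude = 2 * np.abs(z)                     # instantaneous amplitude envelope
phase = np.angle(z)                           # phase relative to the reference
print(f"mean alpha amplitude ~ {amplitude.mean():.2f}")
```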

On the Computation of a Common n-finger Robotic Grasp for a Set of Objects

Industrial robotic arms utilize multiple end-effectors, each for a specific part and a specific task. We propose a novel algorithm that defines a single end-effector configuration able to grasp a given set of objects with different geometries. The algorithm would greatly benefit production lines by allowing a single robot to grasp various parts, thereby reducing the number of end-effectors needed; it would also reduce end-effector design and manufacturing time and final product cost. The algorithm searches for a common grasp over the set of objects. It maps all possible grasps for each object that satisfy a quality criterion, taking into account possible external wrenches (forces and torques) applied to the object. The mapped grasps are represented by high-dimensional feature vectors that describe the shape of the gripper, and a database of all possible grasps is generated for each object in this feature space. A search and classification algorithm then intersects the possible grasps over all parts to find a single common grasp suitable for all objects. We present simulations of planar and spatial objects to validate the feasibility of the approach.
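A minimal sketch of the intersection step, assuming per-object databases of grasp feature vectors and a simple nearest-neighbour tolerance test; the random databases and the tolerance are invented for illustration, and the authors’ classification algorithm is not reproduced.

```python
import numpy as np
from scipy.spatial import cKDTree

# Sketch: given per-object databases of grasp feature vectors (gripper-shape
# descriptors), find a configuration that lies within a tolerance of at least
# one grasp of every object.
rng = np.random.default_rng(1)
databases = [rng.uniform(0, 1, size=(500, 6)) for _ in range(3)]  # 3 objects
tol = 0.3                     # matching tolerance (assumed, loose for toy data)

trees = [cKDTree(db) for db in databases[1:]]
common = None
for candidate in databases[0]:
    # Accept the candidate only if every other object has a nearby grasp.
    if all(tree.query(candidate)[0] <= tol for tree in trees):
        common = candidate
        break

if common is not None:
    print("common grasp feature vector:", np.round(common, 3))
else:
    print("no common grasp found within tolerance")
```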

A Consideration of the Achievement of Productive Level Parallel Programming Skills

This paper considers the achievement of productive-level parallel programming skills, based on data from graduation studies at the Polytechnic University of Japan. The data show that most students can acquire basic parallel programming skills during the graduation study (about 600 to 700 hours) if the programming environment is limited to GPGPUs. However, the data also show that achieving productive-level parallel programming skills within the graduation study alone is a very demanding task for a student. In addition, the data suggest that parallel programming environments for GPGPUs, such as CUDA and OpenCL, may be more suitable for parallel computing education than other environments such as MPI on a cluster system and the Cell B.E. These results should be useful not only for software development but also for the development of hardware products based on computer technologies.

Investments Attractiveness via Combinatorial Optimization Ranking

The paper proposes an approach to ranking a set of candidate countries for investment, taking into account the investor’s point of view on the importance of different economic indicators. For this goal, a ranking algorithm that contributes to rational decision making is proposed. The algorithm is based on combinatorial optimization modeling and the repeated solution of multi-criteria tasks. The final result is a list of countries ranked according to the investor’s preferences about the importance of economic indicators for investment attractiveness. Different scenarios corresponding to different investor preferences are simulated, and a numerical example with a real dataset of indicators is solved. The numerical testing shows the applicability of the described algorithm. The proposed approach can be used with any set of indicators as ranking criteria, reflecting different investor points of view.
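For illustration only, the sketch below ranks candidate countries by a plain weighted aggregation of normalized indicators; this weighted-sum stand-in is not the paper’s combinatorial optimization model, and the countries, indicator values and weights are invented.

```python
import numpy as np

# Sketch: rank countries by a weighted aggregation of normalized economic
# indicators. Data and weights are hypothetical.
countries = ["A", "B", "C", "D"]
indicators = np.array([   # rows: countries; cols: GDP growth %, inflation %, debt/GDP %
    [3.1, 2.0, 60.0],
    [1.2, 0.8, 95.0],
    [4.5, 6.5, 40.0],
    [2.3, 1.5, 70.0],
])
benefit = np.array([True, False, False])   # higher is better only for growth
weights = np.array([0.5, 0.3, 0.2])        # investor preferences (assumed)

norm = (indicators - indicators.min(0)) / np.ptp(indicators, axis=0)
norm[:, ~benefit] = 1 - norm[:, ~benefit]  # invert "cost" criteria
scores = norm @ weights
for rank, i in enumerate(np.argsort(-scores), 1):
    print(rank, countries[i], round(float(scores[i]), 3))
```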

A New Classification of Risk-Reduction Options to Improve the Risk-Reduction Readiness of the Railway Industry

The gap between the selection of risk-reduction options in the railway industry and their effective implementation results in compromised safety and substantial losses. Effective risk management must integrate the evaluation phases with the implementation phase. This paper proposes an essential categorisation of risk-reduction measures that best addresses a standard railway industry portfolio. By categorising the risk-reduction options into design, operational, procedural and technical options, it is ensured that the efforts of the implementation facilitators (people, processes and supporting systems) are systematically harmonised. The classification is based on integrating fundamental principles of risk reduction in the railway industry with the systems engineering approach. This paper argues that the use of such a classification approach is an attribute of organisations possessing a superior level of risk-reduction readiness, and that integrating the proposed rational classification structure provides solid ground for effective risk reduction.

A Robust Diverged Localization and Recognition of License Registration Characters

Localization and recognition of license registration characters on a moving vehicle is a computationally complex task in the field of machine vision, and is of substantial interest because of diverse applications such as cross-border security, law enforcement and various other intelligent transportation applications. Previous research used plate-specific details such as the aspect ratio, character style, color or dimensions of the plate in the complex task of plate localization. In this paper, license registration characters are localized by the Enhanced Weight-Based Density Map (EWBDM) method, which is independent of such constraints. Building on our previous method, this paper proposes a method that relaxes the constraints on lighting conditions, on the fonts of the characters appearing on the plate, and on plates with hand-drawn characters in various aspect ratios. The robustness of this method makes it well suited to applications where the appearance of plates varies widely. Experimental results show that the approach is suited to recognizing license plates in different external environments.

Linguistic Phenomena in Men and Women - TOT, FOK, Verbal Fluency

The aim of this study is to describe the differences between women and men in the phenomena of feeling of knowing (FOK), tip of the tongue (TOT), and verbal fluency. Two studies are presented. The first included a group of 60 participants and focused on the analysis of FOK and TOT in men and women. The second study described the performance of 302 participants in verbal fluency tasks. Both studies showed that sex is not a significant predictor of linguistic abilities; rather, the main factors influencing linguistic ability were vocabulary and education. This study enriches our knowledge of the mechanisms of memory and verbal production.

Comparison of Router Intelligent and Cooperative Host Intelligent Algorithms in a Continuous Model of Fixed Telecommunication Networks

The performance of state-of-the-art worldwide telecommunication networks strongly depends on the efficiency of the applied routing mechanism, and game-theoretical approaches to this problem offer new solutions. In this paper a new continuous network routing model is defined to describe data transfer in fixed telecommunication networks with multiple hosts. The nodes of the network correspond to routers whose latency is assumed to be traffic dependent. We propose that the whole traffic of the network can be decomposed into a finite number of tasks belonging to the various hosts. To describe their different latency sensitivity, a utility function is defined for each task. The model is used to compare router-intelligent and host-intelligent types of routing methods, corresponding to various data transfer protocols. We analyze host-intelligent routing as a transferable utility cooperative game with externalities. The main aim of the paper is to provide a framework in which the efficiency of various routing algorithms can be compared and the transferable utility game arising in the cooperative case can be analyzed.

A Rigid Point Set Registration of Remote Sensing Images Based on Genetic Algorithms and Hausdorff Distance

Image registration is the process of establishing point-by-point correspondence between images of the same scene, and is very useful in remote sensing, medicine, cartography, computer vision, and other fields. The task of registration is to place the data in a common reference frame by estimating the transformations between the data sets. In this work, we develop a rigid point registration method based on genetic algorithms and the Hausdorff distance. First, we extract feature points from both images using a global and local curvature corner detection algorithm. After refining the feature points, we use the Hausdorff distance as the similarity measure between the two data sets, and we use genetic algorithms to optimize the search over the transformation space, achieving high computation speed thanks to their inherent parallelism. The results show the efficiency of this method for the registration of satellite images.
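A minimal sketch of the optimization step, assuming a simple elitist genetic algorithm over a 2D rigid transform (rotation angle and translation) scored by the symmetric Hausdorff distance; the synthetic point sets and GA settings are assumptions rather than the authors’ operators.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

# Sketch: evolve (theta, tx, ty) to align a source point set with a target
# point set under the Hausdorff distance.
rng = np.random.default_rng(0)
src = rng.uniform(0, 100, size=(60, 2))
theta_true, t_true = 0.3, np.array([12.0, -7.0])
R = np.array([[np.cos(theta_true), -np.sin(theta_true)],
              [np.sin(theta_true),  np.cos(theta_true)]])
dst = src @ R.T + t_true                          # target = transformed source

def fitness(p):
    th, tx, ty = p
    Rp = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
    moved = src @ Rp.T + [tx, ty]
    return max(directed_hausdorff(moved, dst)[0],
               directed_hausdorff(dst, moved)[0])  # symmetric Hausdorff

pop = rng.uniform([-np.pi, -50, -50], [np.pi, 50, 50], size=(60, 3))
for _ in range(150):
    costs = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(costs)[:20]]          # selection (elitism)
    children = parents[rng.integers(0, 20, 40)] \
        + rng.normal(0, [0.05, 1.0, 1.0], (40, 3)) # mutation
    pop = np.vstack([parents, children])

best = pop[np.argmin([fitness(p) for p in pop])]
print("estimated (theta, tx, ty):", np.round(best, 2))
```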

Novel Security Strategy for Real Time Digital Videos

Nowadays, video data embedding is a challenging and interesting approach to keeping real-time video data secure, and the technique can be implemented and used in high-level applications. The rate-distortion behavior of an image is not guaranteed, because the gain provided by accurate image frame segmentation is balanced by the inefficiency of coding objects of arbitrary shape, along with losses that depend on both the coding scheme and the object structure; by using a rate controller in association with the encoder, the target bitrate can be adjusted dynamically. This paper discusses how to secure videos by mixing signature data into the original video with negligible distortion, keeping the quality of the steganographic video as close as possible to that of the original. We propose a method for embedding the signature data into individual video frames using the block Discrete Cosine Transform (DCT). These frames are then encoded in real time using the H.264 scheme. Finally, recovery of the original video and the signature data at the receiver end is proposed.
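The sketch below illustrates block-DCT embedding of signature bits in a single toy frame and their recovery; the coefficient position, embedding strength and synthetic frame are assumptions, and the real-time H.264 encoding stage is not shown.

```python
import numpy as np
from scipy.fft import dctn, idctn

# Sketch: embed one signature bit per 8x8 block of a toy luma frame by
# writing the bit into the sign of a mid-frequency DCT coefficient, then
# recover the bits from the stego frame.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(64, 64)).astype(float)   # toy luma frame
bits = rng.integers(0, 2, size=(8, 8))                      # signature bits
POS, STRENGTH = (3, 4), 20.0                                 # assumed choices

stego = frame.copy()
for bi in range(8):
    for bj in range(8):
        block = frame[bi*8:(bi+1)*8, bj*8:(bj+1)*8]
        coeffs = dctn(block, norm="ortho")
        coeffs[POS] = STRENGTH if bits[bi, bj] else -STRENGTH   # embed bit
        stego[bi*8:(bi+1)*8, bj*8:(bj+1)*8] = idctn(coeffs, norm="ortho")

# Receiver side: read the coefficient sign back from each block.
recovered = np.array([[1 if dctn(stego[i*8:(i+1)*8, j*8:(j+1)*8],
                                 norm="ortho")[POS] > 0 else 0
                       for j in range(8)] for i in range(8)])
print("bits recovered correctly:", bool((recovered == bits).all()))
```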

Shot Transition Detection with Minimal Decoding of MPEG Video Streams

Digital libraries increasingly need to support users with powerful and easy-to-use tools for searching, browsing and retrieving media information. The starting point for these tasks is the segmentation of video content into shots. To segment MPEG video streams into shots, this study develops a fully automatic procedure to detect both abrupt and gradual transitions (dissolves and fade groups) with minimal decoding in real time. Each transition type is detected in two phases: analysis of macroblock types in B-frames, and on-demand analysis of intensity information. The experimental results show remarkable performance in detecting gradual transitions for some kinds of input data and comparable results for the rest of the examined video streams. Almost all abrupt transitions could be detected with very few false positive alarms.

Faster FPGA Routing Solution using DNA Computing

There are many classical algorithms for FPGA routing, but using DNA computing the routes can be found efficiently and quickly; the run-time complexity of DNA algorithms is much lower than that of the classical algorithms used for FPGA routing. Research in DNA computing is still at an early stage, yet the high information density of DNA molecules and the massive parallelism involved in DNA reactions make DNA computing a powerful tool, and many research results have shown that any procedure that can be programmed on a silicon computer can be realized as a DNA computing procedure. In this paper we propose a two-tier approach to the FPGA routing problem. First, the geometric FPGA detailed routing task is solved by transforming it into a Boolean satisfiability formula with the property that any assignment of input variables that satisfies the formula specifies a valid routing: a satisfying assignment yields a valid routing, and the absence of a satisfying assignment implies that the layout is un-routable. In the second step, a DNA search algorithm is applied to this Boolean formula to explore the routing alternatives, utilizing the properties of DNA computation. The simulation results are satisfactory and indicate the applicability of DNA computing to solving the FPGA routing problem.
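As a toy illustration of the first tier, the sketch below encodes a tiny routing instance as a Boolean constraint ("conflicting nets must not share a track") and exhaustively enumerates assignments, which conceptually stands in for the massively parallel search of the second tier; the nets, tracks and conflicts are invented.

```python
from itertools import product

# Sketch: each assignment maps a net to a track; an assignment is a valid
# routing iff no two conflicting nets share a track. Exhaustive enumeration
# plays the role that a massively parallel search would in DNA computing.
nets, tracks = ["n0", "n1", "n2"], [0, 1]
conflicts = {("n0", "n1"), ("n1", "n2")}   # nets that overlap in the channel

def valid(assign):
    # Conflicting nets must not share a track.
    return all(assign[a] != assign[b] for a, b in conflicts)

solutions = [dict(zip(nets, choice))
             for choice in product(tracks, repeat=len(nets))
             if valid(dict(zip(nets, choice)))]

if solutions:
    print("valid routing found:", solutions[0])
else:
    print("layout is un-routable with", len(tracks), "tracks")
```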

Intelligent Neural Network Based STLF

Short-Term Load Forecasting (STLF) plays an important role in the economic and secure operation of power systems. In this paper, a Continuous Genetic Algorithm (CGA) is employed to evolve the optimal structure and connection weights of large neural networks for the one-day-ahead electric load forecasting problem. The study describes the process of developing three-layer feed-forward large neural networks for load forecasting and then presents a heuristic search algorithm for an important task in this process, namely optimal network structure design. The proposed method is applied to STLF for a local utility. Data are clustered according to differences in their characteristics, and special days are extracted from the normal training sets and handled separately. In this way, a solution is provided for all load types, including working days, weekends and special days. We find good performance for the large neural networks, and the proposed methodology consistently gives lower percentage errors. Thus, it can be applied to automatically design an optimal load forecaster based on historical data.
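A minimal sketch of the weight-evolution part of the idea, using a continuous GA to train a tiny three-layer feed-forward network on synthetic hourly load data; the data, network size and GA settings are assumptions, and the paper’s structure search and special-day handling are not shown.

```python
import numpy as np

# Sketch: evolve the weights of a small 24-in / 24-out feed-forward network
# that predicts the next day's hourly load from the previous day's.
rng = np.random.default_rng(0)
hours = np.arange(24)

def day_load():
    # Toy daily load curve: same profile each day plus small noise (assumed).
    return 100 + 20 * np.sin(2 * np.pi * (hours - 7) / 24) + rng.normal(0, 1, 24)

X = np.array([day_load() for _ in range(30)])   # inputs: previous day's load
Y = np.array([day_load() for _ in range(30)])   # targets: next day's load
X, Y = X / 150.0, Y / 150.0                     # simple scaling

H = 8                                            # hidden units (assumed)
n_w = 24 * H + H + H * 24 + 24                   # total number of weights

def predict(w, x):
    W1 = w[:24 * H].reshape(24, H); b1 = w[24 * H:24 * H + H]
    W2 = w[24 * H + H:24 * H + H + H * 24].reshape(H, 24); b2 = w[-24:]
    return np.tanh(x @ W1 + b1) @ W2 + b2

def mse(w):
    return float(np.mean((predict(w, X) - Y) ** 2))

pop = rng.normal(0, 0.5, size=(60, n_w))         # continuous GA population
for _ in range(200):
    costs = np.array([mse(w) for w in pop])
    elite = pop[np.argsort(costs)[:15]]          # selection (elitism)
    kids = elite[rng.integers(0, 15, 45)] + rng.normal(0, 0.05, (45, n_w))
    pop = np.vstack([elite, kids])               # mutation of selected parents

best = pop[np.argmin([mse(w) for w in pop])]
print("training MSE (scaled units):", round(mse(best), 5))
```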