The Link between Money Market and Economic Growth in Nigeria: Vector Error Correction Model Approach

The paper examines the impact of the money market on economic growth in Nigeria using data for the period 1980-2012. Econometric techniques such as the Ordinary Least Squares method, Johansen's cointegration test and the Vector Error Correction Model were used to examine both the long-run and short-run relationship. Evidence from the study suggests that although a long-run relationship exists between the money market and economic growth, the present state of the Nigerian money market is significantly and negatively related to economic growth. The link between the money market and the real sector of the economy remains very weak. This implies that the market is not yet developed enough to produce the growth needed to propel the Nigerian economy, owing to several challenges. It was therefore recommended that the government create appropriate macroeconomic policies and a legal framework, and sustain the present reforms, with a view to developing the market so as to promote productive activities, investments, and ultimately economic growth.
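
A minimal sketch (not the authors' code) of the estimation steps named in the abstract: Johansen's cointegration test followed by a VECM, here using statsmodels on stand-in series; the variable names and lag order are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

# Stand-in annual series for 1980-2012 (33 observations); in the paper these
# would be GDP and money market instrument volumes.
rng = np.random.default_rng(1)
n = 33
common = rng.normal(size=n).cumsum()                        # shared stochastic trend
endog = pd.DataFrame({
    "gdp":              common + rng.normal(scale=0.5, size=n),
    "treasury_bills":   common + rng.normal(scale=0.5, size=n),
    "commercial_paper": rng.normal(size=n).cumsum(),
}, index=pd.RangeIndex(1980, 1980 + n, name="year"))

# Johansen cointegration test with a constant term and one lagged difference.
jres = coint_johansen(endog, det_order=0, k_ar_diff=1)
print("trace statistics:", jres.lr1)
print("5% critical values:", jres.cvt[:, 1])

# If the test indicates one cointegrating relation, fit a VECM of rank 1.
vecm = VECM(endog, k_ar_diff=1, coint_rank=1, deterministic="ci").fit()
print(vecm.summary())
```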

Automatic Tuning for a Systemic Model of Banking Originated Losses (SYMBOL) Tool on Multicore

Nowadays, mathematical and statistical applications are developed with greater complexity and accuracy. However, this precision and complexity mean that applications need more computational power in order to execute quickly. In this sense, multicore environments play an important role in improving and optimizing the execution time of these applications. These environments allow the inclusion of more parallelism inside the node. However, taking advantage of this parallelism is not an easy task, because we have to deal with problems such as core communication, data locality, memory sizes (cache and RAM), synchronization, and data dependencies in the model. These issues become more important when we wish to improve the application's performance and scalability. Hence, this paper describes an optimization method developed for the Systemic Model of Banking Originated Losses (SYMBOL) tool developed by the European Commission, which is based on analyzing the application's weaknesses in order to exploit the advantages of the multicore architecture. All these improvements are done in an automatic and transparent manner with the aim of improving the performance metrics of our tool. Finally, experimental evaluations show the effectiveness of our new optimized version, in which we have achieved a considerable improvement in execution time. In the best case tested, the execution time was reduced by around 96% between the original serial version and the automatic parallel version.
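
This is not the SYMBOL code itself, only a hedged sketch of the general pattern the abstract describes: independent repetitions of a simulation model distributed over the cores of a single node so that they run in parallel. The simulation body, number of repetitions and seeds are assumptions for illustration.

```python
import multiprocessing as mp
import random

def one_simulation(seed):
    """Stand-in for one independent run of a banking-losses simulation."""
    rng = random.Random(seed)
    return sum(rng.gauss(0.0, 1.0) for _ in range(100_000))

if __name__ == "__main__":
    seeds = range(64)
    # Distribute the independent repetitions over all available cores.
    with mp.Pool(processes=mp.cpu_count()) as pool:
        losses = pool.map(one_simulation, seeds)
    print("aggregated result:", sum(losses) / len(losses))
```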

Application of Scientific Metrics to Evaluate Academic Reputation in Different Research Areas

In this paper, we address the problem of identifying the academic reputation of researchers using scientific metrics in different research areas. Owing to the characteristics of each area, researchers can exhibit different behaviors. In previous work, we defined the Rep-Index, which makes use of a profile template to individually identify the reputation of a researcher. The Rep-Index is comprehensive and adaptive because it involves the whole trajectory of the researcher built throughout his or her career and can be used in different areas and in different contexts. Here, we compare our metric (Rep-Index) with the h-index and the g-index through extensive experiments with researchers in the fields of Economics, Dentistry and Computer Science. We analyze the trajectory of 830 Brazilian researchers from the National Council of Technological and Scientific Development (CNPq) who receive research productivity grants. The grants are aimed at researchers who stand out among their peers in scientific productivity, according to normative criteria established by CNPq. Of the 830 researchers, 210 are in Economics, 216 in Dentistry and 404 in Computer Science. The experiments show that our metric is strongly correlated with the h-index, the g-index and the CNPq ranking, and they support our hypothesis that the Rep-Index can be used to evaluate research in several areas.
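
For reference, a small sketch of the two baseline indicators the abstract compares against (not the Rep-Index itself): the h-index and g-index computed from a researcher's citation counts. The toy citation list is illustrative.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cites, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

def g_index(citations):
    """Largest g such that the top g papers have at least g**2 citations in total."""
    cites = sorted(citations, reverse=True)
    total, g = 0, 0
    for i, c in enumerate(cites, start=1):
        total += c
        if total >= i * i:
            g = i
    return g

papers = [42, 18, 12, 9, 7, 4, 2, 1, 0]
print(h_index(papers), g_index(papers))  # 5 and 9 for this toy list
```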

Multimodal Biometric Authentication Using Choquet Integral and Genetic Algorithm

The Choquet integral is a tool for information fusion that is very effective when the fuzzy measures associated with it are well chosen. In this paper, we propose a new approach for calculating the fuzzy measures associated with the Choquet integral in a context of data fusion in multimodal biometrics. The proposed approach is based on genetic algorithms. It has been validated on two databases: the first consists of synthetic scores and the second contains biometric scores relating to the face, fingerprint and palmprint. The results achieved attest to the robustness of the proposed approach.
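
A minimal sketch, under illustrative assumptions, of the discrete Choquet integral used as the fusion operator: the fuzzy measure is given here as a hand-picked dictionary over coalitions of modalities, whereas in the paper it would be the measure learned by the genetic algorithm.

```python
def choquet_integral(scores, mu):
    """scores: dict modality -> matching score in [0, 1];
    mu: dict frozenset of modalities -> fuzzy measure value,
    with mu(empty set) = 0 and mu(all modalities) = 1."""
    order = sorted(scores, key=scores.get)   # modalities by increasing score
    result, prev = 0.0, 0.0
    for i, m in enumerate(order):
        coalition = frozenset(order[i:])     # modalities with score >= current one
        result += (scores[m] - prev) * mu[coalition]
        prev = scores[m]
    return result

# Illustrative fuzzy measure over the three modalities named in the abstract.
mu = {
    frozenset(): 0.0,
    frozenset({"face"}): 0.3, frozenset({"finger"}): 0.4, frozenset({"palm"}): 0.35,
    frozenset({"face", "finger"}): 0.7, frozenset({"face", "palm"}): 0.6,
    frozenset({"finger", "palm"}): 0.75,
    frozenset({"face", "finger", "palm"}): 1.0,
}
print(choquet_integral({"face": 0.8, "finger": 0.6, "palm": 0.9}, mu))
```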

Adaptive Score Normalization: A Novel Approach for Multimodal Biometric Systems

Multimodal biometric systems integrate the data presented by multiple biometric sources, hence offering better performance than systems based on a single biometric modality. Although the coupling of biometric systems can be done at different levels, fusion at the score level is the most common, since it has proven more effective than the other fusion levels. However, the scores from different modalities are generally heterogeneous, so a score normalization step is needed to transform these scores into a common domain before combining them. In this paper, we study the performance of several normalization techniques with various fusion methods in the context of merging three unimodal systems based on the face, the palmprint and the fingerprint. We also propose a new adaptive normalization method that takes into account the distribution of client scores and impostor scores. Experiments conducted on a database of 100 people show that the performance of a multimodal system depends on the choice of the normalization method and the fusion technique. The proposed normalization method gave the best results.
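
A sketch of two standard normalization baselines of the kind compared in such studies (min-max and z-score), followed by simple-sum fusion; the adaptive method proposed by the authors is not reproduced here, and all score values and parameters are illustrative.

```python
import numpy as np

def min_max(scores, lo, hi):
    """Map raw matcher scores into [0, 1] using training-set min/max."""
    return (np.asarray(scores, dtype=float) - lo) / (hi - lo)

def z_score(scores, mean, std):
    """Center and scale scores using training-set statistics."""
    return (np.asarray(scores, dtype=float) - mean) / std

face = [412.0, 380.0, 455.0]   # raw, heterogeneous matcher outputs
palm = [0.62, 0.55, 0.71]

face_n = min_max(face, lo=300.0, hi=500.0)
palm_n = min_max(palm, lo=0.0, hi=1.0)
print(0.5 * face_n + 0.5 * palm_n)   # simple-sum fusion after normalization
print(z_score(palm, mean=0.5, std=0.1))  # alternative z-score normalization
```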

Designing Software Quality Measurement System for Telecommunication Industry Using Object-Oriented Technique

A number of software quality measurement systems have been implemented over the past few years, but none of them focuses on the telecommunication industry. A software quality measurement system for the telecommunication industry is a system that can calculate the quality value of the measured software with a focus entirely on that industry. Before designing the system, quality factors, quality attributes and quality metrics were identified based on a literature review and a survey. Then, using the identified quality factors, quality attributes and quality metrics, a quality model for the telecommunication industry was constructed. Each identified quality metric has its own formula. The quality value of the software is measured using the quality metrics and aggregated by referring to the quality model. The system then classifies the quality level of the software based on the Net Satisfaction Index (NSI). The system was designed using an object-oriented approach in a web-based environment. Such a software quality measurement system is important to both developers and users in order to produce high-quality software products for the telecommunication industry.

Efficient Iris Recognition Method for Human Identification

In this paper, an efficient method for personal identification based on the pattern of the human iris is proposed. It is composed of image acquisition, image preprocessing to produce a flattened iris, conversion into an eigeniris, and a decision stage carried out using only a one-dimensional reduction of the iris. By comparing the eigenirises it is determined whether two irises are similar. The results show that the proposed method is quite effective.
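
A minimal sketch, under assumptions, of the "eigeniris" idea described in the abstract: flattened iris images are projected onto the principal components of a training set, and a probe is matched to the closest enrolled template in that reduced space. The image size, random stand-in data and number of components are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA

# Each row is one flattened (unwrapped) iris image, e.g. 64 x 512 pixels.
train = np.random.rand(100, 64 * 512)   # stand-in for real enrolled images
probe = np.random.rand(64 * 512)

pca = PCA(n_components=50)               # keep the leading "eigenirises"
gallery = pca.fit_transform(train)       # project the gallery into eigen-space
probe_vec = pca.transform(probe.reshape(1, -1))

# Identification: nearest enrolled template in the reduced space.
distances = np.linalg.norm(gallery - probe_vec, axis=1)
print("best match: enrolled image", distances.argmin())
```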

Measuring the Structural Similarity of Web-based Documents: A Novel Approach

Most known methods for measuring the structural similarity of document structures are based on, e.g., tag measures, path metrics and tree measures in terms of their DOM-Trees. Other methods measure the similarity in the framework of the well-known vector space model. In contrast to these, we present a new approach to measuring the structural similarity of web-based documents represented by so-called generalized trees, which are more general than DOM-Trees, which represent only directed rooted trees. We design a new similarity measure for graphs representing web-based hypertext structures. Our similarity measure is mainly based on a novel representation of a graph as linear integer strings, whose components represent structural properties of the graph. The similarity of two graphs is then defined as the optimal alignment of the underlying property strings. In this paper we apply the well-known technique of sequence alignment to solve a novel and challenging problem: measuring the structural similarity of generalized trees. More precisely, we first transform our graphs, considered as high-dimensional objects, into linear structures. Then we derive similarity values from the alignments of the property strings in order to measure the structural similarity of generalized trees. Hence, we transform a graph similarity problem into a string similarity problem. We demonstrate that our similarity measure captures important structural information by applying it to two different test sets consisting of graphs representing web-based documents.
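
An illustrative sketch of the core reduction: once each generalized tree is encoded as a string of integers (a "property string"), structural similarity reduces to a global sequence alignment. The scoring scheme and the toy property strings below are simple assumptions, not the measure defined in the paper.

```python
import numpy as np

def align_score(a, b, gap=-1, match=2, mismatch=-1):
    """Needleman-Wunsch global alignment score of two integer sequences."""
    n, m = len(a), len(b)
    dp = np.zeros((n + 1, m + 1))
    dp[:, 0] = gap * np.arange(n + 1)   # cost of aligning a prefix against gaps
    dp[0, :] = gap * np.arange(m + 1)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            dp[i, j] = max(dp[i - 1, j - 1] + s,   # align a[i-1] with b[j-1]
                           dp[i - 1, j] + gap,     # gap in b
                           dp[i, j - 1] + gap)     # gap in a
    return dp[n, m]

# Hypothetical property strings (e.g. out-degrees per level) of two documents.
doc1 = [3, 2, 2, 1, 0]
doc2 = [3, 2, 1, 1, 0, 0]
print(align_score(doc1, doc2))
```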

Offline Handwritten Signature Recognition

Biometrics, which refers to identifying an individual based on his or her physiological or behavioral characteristics, has the capability to reliably distinguish between an authorized person and an impostor. Signature verification systems can be categorized as offline (static) and online (dynamic). This paper presents a neural-network-based offline handwritten signature recognition system that is trained with low-resolution scanned signature images.
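
A rough sketch of the kind of pipeline the abstract describes: low-resolution scanned signatures flattened into feature vectors and fed to a small neural network classifier. The image size, network shape and random stand-in data are assumptions, not the authors' configuration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

n_signers, per_signer, h, w = 5, 20, 32, 96
X = np.random.rand(n_signers * per_signer, h * w)   # stand-in for scanned signature images
y = np.repeat(np.arange(n_signers), per_signer)     # one class label per signer

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X, y)
print("predicted signer for the first sample:", clf.predict(X[:1]))
```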

Behavioral Signature Generation using Shadow Honeypot

A novel behavioral detection framework is proposed to detect zero-day buffer overflow vulnerabilities and exploits based on network behavioral signatures, instead of the signature-based or anomaly-based detection solutions currently available for IDPS techniques. We first present the detection model, which uses a shadow honeypot. Our system is used for the online processing of network attacks and the generation of a behavior detection profile. The detection profile represents a dataset of 112 types of metrics describing the exact behavior of malware in the network. In this paper we present examples of generating behavioral signatures for two attacks: a buffer overflow exploit on an FTP server and the well-known Conficker worm. We demonstrate the visualization of important aspects by showing the differences between valid behavior and the attacks. Based on these metrics we can detect attacks with a very high probability of success; the process of detection is, however, very expensive.

A Comparative Analysis of Fuzzy, Neuro-Fuzzy and Fuzzy-GA Based Approaches for Software Reusability Evaluation

Software reusability is a primary attribute of software quality. There are metrics for identifying the quality of reusable components, but the function that uses these metrics to determine the reusability of software components is still not clear. If identified in the design phase or even in the coding phase, these metrics can help us reduce rework by improving the quality of reuse of a component and hence improve productivity due to a probabilistic increase in the reuse level. In this paper, we have devised a framework of metrics that uses McCabe's Cyclomatic Complexity measure for complexity measurement, a Regularity metric, the Halstead Software Science Indicator for volume, a Reuse Frequency metric and a Coupling metric as input attributes of the software component, and calculates the reusability of the component. A comparative analysis of the fuzzy, neuro-fuzzy and fuzzy-GA approaches is performed to evaluate the reusability of software components, and the fuzzy-GA results outperform the other approaches. The developed reusability model has produced high-precision results, as expected by the human experts.
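
A toy sketch of the fuzzy part of such a comparison: each input metric (here only cyclomatic complexity and reuse frequency) is fuzzified with triangular membership functions and a small rule base yields a reusability score. The membership ranges, rules and output weights are illustrative assumptions, not the authors' model.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def reusability(complexity, reuse_freq):
    low_cx  = tri(complexity, 0, 0, 10)    # low cyclomatic complexity
    high_cx = tri(complexity, 5, 20, 20)
    high_rf = tri(reuse_freq, 2, 10, 10)   # frequently reused component
    low_rf  = tri(reuse_freq, 0, 0, 4)
    # Two Mamdani-style rules, combined by weighted-average defuzzification:
    #   IF complexity is low  AND reuse is high THEN reusability is high (0.9)
    #   IF complexity is high AND reuse is low  THEN reusability is low  (0.2)
    w_high = min(low_cx, high_rf)
    w_low  = min(high_cx, low_rf)
    return (0.9 * w_high + 0.2 * w_low) / (w_high + w_low + 1e-9)

print(reusability(complexity=4, reuse_freq=8))   # close to 0.9: reusable component
```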

Flagging Critical Components to Prevent Transient Faults in Real-Time Systems

This paper proposes the use of metrics in design space exploration that highlight where in the structure of the model, and at what point in the behaviour, prevention against transient faults is needed. Previous approaches to tackling transient faults have focused on recovery after detection; almost no research has been directed towards preventive measures. But in real-time systems, hard deadlines are performance requirements that absolutely must be met, and a missed deadline constitutes an erroneous action and a possible system failure. This paper proposes the use of metrics to assess the system design and flag where transient faults may have significant impact. These tools then allow the design to be changed to minimize that impact, and they also flag where particular design techniques, such as coding of communications or memories, need to be applied in later stages of design.

Confidence Intervals for the Difference of Two Normal Population Variances

Motivated by the recent work of Herbert, Hayen, Macaskill and Walter [Interval estimation for the difference of two independent variances. Communications in Statistics, Simulation and Computation, 40: 744-758, 2011], we investigate in this paper new confidence intervals for the difference between two normal population variances based on the generalized confidence interval of Weerahandi [Generalized Confidence Intervals. Journal of the American Statistical Association, 88(423): 899-905, 1993] and the closed-form method of variance estimation of Zou, Huo and Taleban [Simple confidence intervals for lognormal means and their differences with environmental applications. Environmetrics, 20: 172-180, 2009]. Monte Carlo simulation results indicate that our proposed confidence intervals give better coverage probability than the existing confidence interval. The two new confidence intervals also perform similarly in terms of their coverage probabilities and average lengths.
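
A small Monte Carlo sketch of the generalized-confidence-interval idea the paper builds on: the difference of two normal variances is bounded by simulating the standard generalized pivotal quantity (n-1)s²/χ²_{n-1} for each sample and taking percentiles of the difference. The sample data, sizes and replication count are made up, and this is not the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def gci_var_diff(x, y, alpha=0.05, reps=100_000):
    n, m = len(x), len(y)
    s2x, s2y = np.var(x, ddof=1), np.var(y, ddof=1)
    gx = (n - 1) * s2x / rng.chisquare(n - 1, reps)   # pivotal quantity for sigma_x^2
    gy = (m - 1) * s2y / rng.chisquare(m - 1, reps)   # pivotal quantity for sigma_y^2
    diff = gx - gy
    return np.percentile(diff, [100 * alpha / 2, 100 * (1 - alpha / 2)])

x = rng.normal(0, 2.0, size=25)
y = rng.normal(0, 1.0, size=30)
print(gci_var_diff(x, y))   # approximate 95% interval for sigma_x^2 - sigma_y^2
```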

Evaluation of Energy-Aware QoS Routing Protocol for Ad Hoc Wireless Sensor Networks

Many advanced routing protocols for wireless sensor networks have been implemented for the effective routing of data. Energy awareness is an essential design issue, and almost all of these routing protocols are considered energy efficient, with the ultimate objective of maximizing the lifetime of the whole network. However, the introduction of video and imaging sensors has posed additional challenges. Transmission of video and imaging data requires both energy-aware and QoS-aware routing in order to ensure efficient usage of the sensors and effective access to the gathered measurements. In this paper, the performance of the energy-aware QoS routing protocol is analyzed using performance metrics such as average lifetime of a node, average delay per packet and network throughput. The parameters considered in this study are end-to-end delay, real-time data generation/capture rates, packet drop probability and buffer size. The network throughput for real-time and non-real-time data has also been analyzed. The simulation was done in the NS2 simulation environment and the results were analyzed with respect to the different metrics.

Innovative Activity of Virtual Firm

The strengthening of competitive advantage, combined with the transformation of business strategy, is necessary for a company to succeed in times of market change. In this sense, the innovation activities of the firm are especially significant. Virtual firms are a specific form of enterprise in which we cannot assume all the regularities that hold for other forms. The aim of the paper is to evaluate factors influencing the innovation activity of a virtual firm, determining their importance and influence on the basis of selected metrics.

Dynamic Metrics for Polymorphism in Object Oriented Systems

Measurement is the process by which numbers or symbols are assigned to attributes of entities in the real world in such a way as to describe them according to clearly defined rules. Software metrics are instruments for measuring aspects of a software product. These metrics are used throughout a software project to assist in estimation, quality control, productivity assessment, and project control. Object-oriented software metrics focus on measurements that are applied to classes and other object-oriented characteristics. These measurements convey to the software engineer how the software behaves and how changes can be made that will reduce complexity and improve the continuing capability of the software. Object-oriented software metrics can be classified into two types, static and dynamic. Static metrics are concerned with measurement by static analysis of the software, while dynamic metrics are concerned with measuring aspects of the software at run time. Most earlier work focused on static metrics; some work has also been done on the dynamic nature of software measurement, but research in this area demands more effort. In this paper we give a set of dynamic metrics specifically for polymorphism in object-oriented systems.
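
A toy illustration (an assumption about one possible dynamic polymorphism metric, not the set defined in the paper): counting, at run time, how many distinct concrete receiver classes each polymorphic call site actually dispatches to.

```python
from collections import defaultdict

dispatch_targets = defaultdict(set)

def record(call_site, obj):
    """Record the concrete class reached by a polymorphic call at run time."""
    dispatch_targets[call_site].add(type(obj).__name__)

class Shape:
    def area(self): raise NotImplementedError

class Circle(Shape):
    def area(self): return 3.14159

class Square(Shape):
    def area(self): return 1.0

def total_area(shapes):
    s = 0.0
    for sh in shapes:
        record("total_area.area", sh)   # instrument the polymorphic call site
        s += sh.area()
    return s

total_area([Circle(), Square(), Circle()])
# Distinct runtime targets per call site: a simple dynamic polymorphism count.
print({site: len(targets) for site, targets in dispatch_targets.items()})
```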

Hutchinson-Barnsley Operator in Intuitionistic Fuzzy Metric Spaces

The main purpose of this paper is to prove the intuitionistic fuzzy contraction properties of the Hutchinson-Barnsley operator on the intuitionistic fuzzy hyperspace with respect to the Hausdorff intuitionistic fuzzy metrics. We also discuss the relationships between the Hausdorff intuitionistic fuzzy metrics on the intuitionistic fuzzy hyperspaces. Our theorems generalize and extend some recent results related to the Hutchinson-Barnsley operator from metric spaces to intuitionistic fuzzy metric spaces.

Q-Net: A Novel QoS Aware Routing Algorithm for Future Data Networks

The expectations of network performance have changed significantly from the early days of ARPANET until now. Every day, new advances in technological infrastructure open the doors for better quality of service, and accordingly the level of perceived quality of network services has increased over time. Nowadays, for many applications, late information has no value or may even result in financial or catastrophic loss; on the other hand, demands for some level of guarantee in providing and maintaining quality of service are ever increasing. Given this history, a QoS-aware routing system that is able to provide today's required level of quality of service and to adapt effectively to future needs appears to be a key requirement for the future Internet. In this work we have extended the traditional AntNet routing system to support QoS with multiple metrics, such as bandwidth and delay; the resulting system is named Q-Net. This novel, scalable QoS routing system aims to provide different types of services in the network simultaneously. Each type of service can be provided for a period of time in the network, and network nodes do not need to have any previous knowledge about it. When a type of quality of service is requested, Q-Net will allocate the required resources for the service and will guarantee the QoS requirements of the service, based on the target objectives.

A Novel Approach to Iris Localization for Iris Biometric Processing

Iris-based biometric systems are gaining importance in several applications. However, processing iris biometrics is a challenging and time-consuming task. Detection of the iris region in an eye image poses a number of challenges, such as inferior image quality and occlusion by eyelids and eyelashes. Due to these problems it is not possible to achieve a 100% accuracy rate in any iris-based biometric authentication system. Furthermore, iris detection is a computationally intensive task in the overall iris biometric processing. In this paper, we address these two problems and propose a technique to localize the iris region efficiently and accurately. We propose scaling and a color level transform followed by thresholding and the finding of pupil boundary points for pupil boundary detection, and dilation, thresholding, vertical edge detection and removal of unnecessary edges in the eye images for iris boundary detection. Scaling reduces the search space significantly, and the intensity level transform is helpful for image thresholding. Experimental results show that our approach is comparable with existing approaches. Following our approach it is possible to detect the iris region with 95-99% accuracy, as substantiated by our experiments on the CASIA Ver-3.0, ICE 2005, UBIRIS, Bath and MMU iris image databases.
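
A minimal OpenCV sketch of the kind of steps the abstract lists for pupil boundary detection: scaling, a simple intensity-level transform (used here in place of the paper's color level transform), thresholding and blob extraction. The synthetic eye image and all threshold values are assumptions for illustration only.

```python
import cv2
import numpy as np

# Synthetic stand-in for a grayscale eye image: bright background with a
# mid-grey iris and a dark circular pupil (the paper uses CASIA, ICE, etc.).
eye = np.full((280, 320), 180, dtype=np.uint8)
cv2.circle(eye, (160, 140), 70, 90, -1)   # iris (mid grey)
cv2.circle(eye, (160, 140), 30, 20, -1)   # pupil (dark)

small = cv2.resize(eye, None, fx=0.5, fy=0.5)          # scaling shrinks the search space
stretched = cv2.convertScaleAbs(small, alpha=1.2, beta=0)  # simple linear intensity transform
_, binary = cv2.threshold(stretched, 40, 255, cv2.THRESH_BINARY_INV)  # pupil is darkest

# The largest dark blob is taken as the pupil; its enclosing circle is the boundary.
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
pupil = max(contours, key=cv2.contourArea)
(cx, cy), r = cv2.minEnclosingCircle(pupil)
print("pupil centre and radius on the scaled image:", (cx, cy), r)
```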

Sentiment Analysis: Popularity of Candidates for the President of the United States

This article deals with the popularity of candidates for the presidency of the United States of America. The popularity is assessed according to public comments on Web 2.0 platforms. Social networking, blogging and online forums (collectively, Web 2.0) are the easiest way for common Internet users to share their personal opinions, thoughts, and ideas with the entire world. However, the diversity of web content, the variety of technologies and the differences in website structure make Web 2.0 a network of heterogeneous data, where things are difficult for common users to find. The introductory part of the article describes a methodology for gathering and processing data from Web 2.0. The next part of the article focuses on the evaluation and content analysis of the obtained information about the presidential candidates.
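
A toy sketch of the content-analysis step: scoring the average polarity of collected Web 2.0 comments mentioning each candidate. The comments, the placeholder candidate names and the use of TextBlob are illustrative assumptions, not the article's actual pipeline.

```python
from textblob import TextBlob

comments = {
    "Candidate A": ["Great debate performance!", "I don't trust his economic plan."],
    "Candidate B": ["Her speech was inspiring.", "Terrible stance on healthcare."],
}

for candidate, texts in comments.items():
    # Average polarity: > 0 means predominantly positive, < 0 negative.
    polarity = sum(TextBlob(t).sentiment.polarity for t in texts) / len(texts)
    print(candidate, round(polarity, 3))
```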