IT Management: How IT Managers Gain IT Knowledge

It is no secret that IT management has become an increasingly integrated part of almost all organizations. IT managers possess an enormous amount of knowledge, both organizational knowledge and general IT knowledge. This article investigates how IT managers keep themselves updated on IT knowledge in general, and looks into how much time IT managers spend on a weekly basis searching the net for new or problem-solving IT knowledge. The theory used in this paper serves to investigate the current role of IT managers and the issues they are facing. Furthermore, a study is conducted in which seven IT managers at medium-sized and large Danish companies are interviewed, to add further focus on the role of the IT manager and on how they keep themselves updated. Besides identifying a substantial need for more research, we find that IT managers, whether generalists or specialists, have only limited knowledge resources at hand for updating their own knowledge, leaving much of the initiative to vendors.

User Acceptance of Educational Games: A Revised Unified Theory of Acceptance and Use of Technology (UTAUT)

Educational games (EG) seem to have considerable potential, given the popularity of digital games and the preferences of our younger generations of learners. However, most studies focus on game design and effectiveness, while little is known about the factors that lead users to accept or reject EG for their learning. User acceptance research tries to understand the determinants of information systems (IS) adoption among users by investigating both system factors and user factors. Given the lack of knowledge on acceptance factors for educational games, we seek to understand this issue. This study proposes a model of acceptance factors based on the Unified Theory of Acceptance and Use of Technology (UTAUT). We use the determinants of the original model (performance expectancy, effort expectancy, and social influence) together with two new determinants (learning opportunities and enjoyment). We also investigate gender and gaming experience as moderators of the proposed factors.

Modelling of Soil Erosion by Non-Conventional Methods

Soil erosion is one of the most serious problems faced at both global and local levels, so the planning of soil conservation measures has become a prominent item on the agenda of water basin managers. To plan soil conservation measures, information on soil erosion is essential. The Universal Soil Loss Equation (USLE), the Revised Universal Soil Loss Equation (RUSLE1 or RUSLE), the Modified Universal Soil Loss Equation (MUSLE), RUSLE 1.06, RUSLE 1.06c, and RUSLE2 are the most widely used conventional erosion estimation methods. The essential drawback of the USLE and RUSLE1 equations is that they are based on average annual values of their parameters, so their applicability at small temporal scales is questionable. These equations also do not estimate runoff-generated soil erosion, so their applicability to that task is questionable as well. The data used in developing the USLE and RUSLE1 equations came from plot studies, so applying them at larger spatial scales requires the introduction of scale correction factors. MUSLE, on the other hand, is unsuitable for predicting the sediment yield of small and large events. Although the newer revisions of USLE, such as RUSLE 1.06, RUSLE 1.06c, and RUSLE2, are land-use independent and have resolved almost all the drawbacks of the earlier versions, they are based on regional data from specific areas, and their applicability to other areas with different climate, soil, and land use is questionable. These conventional equations apply only to sheet and rill erosion and cannot predict gully erosion or the spatial pattern of rills. Research has therefore focused on the development of non-conventional methods of soil erosion estimation. When combined with GIS and remote sensing (RS), these non-conventional methods yield the spatial distribution of soil erosion. The present paper reviews the literature on non-conventional methods of soil erosion estimation supported by GIS and RS.
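
For reference, the conventional baseline these methods aim to improve on is the multiplicative USLE formula A = R·K·LS·C·P. A minimal sketch (the parameter values are illustrative, not taken from any cited study):

```python
def usle_soil_loss(R, K, LS, C, P):
    """Average annual soil loss A via the USLE: A = R * K * LS * C * P,
    where R is rainfall erosivity, K soil erodibility, LS the combined
    slope length-steepness factor, C cover management, and P support
    practice (units of A depend on the unit system chosen for R and K)."""
    return R * K * LS * C * P

# Illustrative values only:
print(usle_soil_loss(R=550.0, K=0.32, LS=1.4, C=0.25, P=0.8))
```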

ANP-based Intra and Inter-industry Analysis for Measuring Spillover Effect of ICT Industries

Interaction among information and communication technology (ICT) industries has recently become a ubiquitous phenomenon through fixed-mobile integration. To monitor the impact of this interaction, previous research has mainly focused on measuring the spillover effect among ICT industries using various methods. Among these, inter-industry analysis is a useful method for examining the spillover effect between industries. However, the more complex ICT industries become, the more important the impact within an industry is, and inter-industry analysis is limited in mirroring such intra-relationships within an industry. This study therefore applies the analytic network process (ANP) to measure the spillover effect, capturing all of the intra- and inter-relationships. Using ANP-based intra- and inter-industry analysis, the spillover effect is effectively measured, mirroring the complex structure of ICT industries. A main ICT industry and its linkages are also explored to show the current structure of ICT industries. The proposed approach is expected to allow policy makers to understand the interactions of ICT industries and their impact.
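
As a rough sketch of the ANP mechanics assumed here (not the paper's actual model or data), the limit supermatrix can be obtained by raising a column-stochastic supermatrix to successive powers; the sectors and weights below are hypothetical:

```python
import numpy as np

def anp_limit_priorities(supermatrix, tol=1e-9, max_iter=1000):
    """Raise a column-stochastic supermatrix to successive powers (here by
    repeated squaring) until convergence; each column of the limit matrix
    then holds the same global priority vector."""
    W = np.asarray(supermatrix, dtype=float)
    assert np.allclose(W.sum(axis=0), 1.0), "columns must sum to 1"
    for _ in range(max_iter):
        W_next = W @ W
        if np.max(np.abs(W_next - W)) < tol:
            return W_next[:, 0]
        W = W_next
    raise RuntimeError("supermatrix did not converge")

# Hypothetical 3-sector ICT supermatrix: entry (i, j) is the relative
# influence of sector j on sector i (diagonal = intra-industry effects).
W = np.array([[0.5, 0.3, 0.2],
              [0.3, 0.5, 0.3],
              [0.2, 0.2, 0.5]])
print(anp_limit_priorities(W))  # spillover-adjusted sector priorities
```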

Query Optimization Techniques for XML Databases

Over the past few years, XML (eXtensible Mark-up Language) has emerged as the standard for information representation and data exchange over the Internet. This paper provides a starting point for new researchers venturing into the field of XML databases. We survey storage representations for XML documents and review XML query processing and optimization techniques with respect to each particular storage scheme. Various optimization techniques have been developed to solve the query retrieval and updating problems. In recent years, most researchers have proposed hybrid optimization techniques; hybrid systems open the possibility of covering each technique's weaknesses with another's strengths. This paper reviews the advantages and limitations of these optimization techniques.
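
To ground the discussion, here is a minimal sketch of the kind of path-expression query whose efficient evaluation these optimization techniques target (toy document; Python's standard-library XPath subset, not any particular XML database engine):

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<library>
  <book genre="db"><title>XML Storage Schemes</title></book>
  <book genre="ir"><title>Query Processing</title></book>
</library>
""")

# A structural path query with a predicate: the basic unit an XML
# query optimizer must evaluate efficiently over the chosen storage.
for title in doc.findall("./book[@genre='db']/title"):
    print(title.text)  # -> XML Storage Schemes
```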

Achieving Performance in an Organization through Marketing Innovation

Innovation is becoming more and more important in modern society. There is a great deal of research on different kinds of innovation, but marketing innovation is one kind that has rarely been studied before. Marketing innovation is defined as a new way in which companies can market themselves to potential or existing customers. The study identifies some key elements of marketing innovation that are worth paying attention to when implementing marketing innovation projects, such as: paying attention to neglected markets, suitable market segmentation, reliable market information, public relations, increased customer value, combination of market factors, exploration of different marketing channels, and the use of technology. Besides the key elements of marketing innovation, we also present some risks that may occur, such as cost, market uncertainty, information leakage, imitation, and overdependence on experience. By proposing a set of indicators to measure marketing innovation, the article offers solutions for marketing innovation implementation so that any organization can achieve optimal results.

Meta-reasoning for Multi-agent Communication of Semantic Web Information

Meta-reasoning is essential for multi-agent communication. In this paper we propose a framework of multi-agent communication in which agents employ meta-reasoning to reason about agent and ontology locations, in order to communicate semantic information with other agents on the Semantic Web and also to reason with multiple distributed ontologies. We argue that multi-agent communication of Semantic Web information cannot be realized without reasoning about agent and ontology locations: for an agent to be able to communicate with another agent, it must know where and how to send a message to that agent, and, similarly, for an agent to be able to reason with an external Semantic Web ontology, it must know where and how to access that ontology. The agent framework and its communication mechanism are formulated entirely in meta-logic.
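
A minimal sketch of the location meta-reasoning step assumed here (hypothetical registries and message format; the paper itself formulates this in meta-logic rather than code):

```python
# Hypothetical registries: before communicating, an agent must resolve the
# location of its peer and of any external ontology it needs to consult.
AGENT_LOCATIONS = {"agent_b": "http://example.org/agents/b/inbox"}
ONTOLOGY_LOCATIONS = {"transport": "http://example.org/ontologies/transport.owl"}

def send_message(sender, receiver, content):
    location = AGENT_LOCATIONS.get(receiver)
    if location is None:
        raise LookupError(f"{sender} cannot locate agent {receiver}")
    print(f"{sender} -> {location}: {content}")

# Ask agent_b a question phrased against a located external ontology:
send_message("agent_a", "agent_b",
             ("ask", ONTOLOGY_LOCATIONS["transport"], "Route(x, berlin)"))
```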

Performance Study on Audio Codec and Session Transfer of Open Source VoIP Applications

Voice over Internet Protocol (VoIP) applications, commonly known as softphones, have been developing an increasingly large market in today's telecommunication world, and the trend is expected to continue with the enhancement of additional features. This includes leveraging existing presence services, location, and contextual information to enable more ubiquitous and seamless communications. In this paper, we discuss the concept of seamless session transfer for real-time applications such as VoIP and IPTV, and our prototype implementation of this concept on a selected open source VoIP application. The first part of this paper presents performance evaluations and assessments of several common open source VoIP applications, namely Ekiga, Kphone, Linphone, and Twinkle, in order to identify one of them for implementing our design of seamless session transfer. Subjective testing was carried out to evaluate the audio performance of these VoIP applications and rank them according to their Mean Opinion Score (MOS) results. The second part of this paper discusses the performance evaluation of our prototype implementation of session transfer using Linphone.
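
As background on the scoring used, MOS is simply the arithmetic mean of listener ratings on the standard five-point scale (1 = bad, 5 = excellent); a minimal sketch with made-up ratings, not the study's data:

```python
# Made-up listener ratings per application, for illustration only.
ratings = {
    "Ekiga":    [4, 3, 4, 4, 3],
    "Kphone":   [3, 3, 2, 3, 3],
    "Linphone": [4, 4, 5, 4, 4],
    "Twinkle":  [4, 3, 4, 3, 4],
}

# MOS = arithmetic mean of the opinion scores; rank applications by it.
mos = {app: sum(r) / len(r) for app, r in ratings.items()}
for app, score in sorted(mos.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{app}: MOS = {score:.2f}")
```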

A Comparative Study of Page Ranking Algorithms for Information Retrieval

This paper gives an introduction to Web mining, then describes Web structure mining in detail, and explores the data structure used by the Web. It also explores different page ranking algorithms and compares those algorithms as used for information retrieval. The basics of Web mining and the Web mining categories are explained. Different PageRank-based algorithms, namely PageRank (PR), Weighted PageRank (WPR), HITS (Hyperlink-Induced Topic Search), DistanceRank, and DirichletRank, are discussed and compared. Ranks are calculated with the PageRank and Weighted PageRank algorithms for a given hyperlink structure. A simulation program is developed for the PageRank algorithm, since PageRank is the ranking algorithm implemented in the Google search engine. The outputs are shown in table and chart format.
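
A minimal sketch of the classic PageRank power iteration on a toy hyperlink structure (the graph and damping factor are illustrative, not the paper's test case):

```python
import numpy as np

def pagerank(adj, d=0.85, tol=1e-8, max_iter=100):
    """Power-iteration PageRank: PR = (1 - d)/N + d * M @ PR, where M is
    the column-stochastic link matrix built from the adjacency matrix."""
    n = adj.shape[0]
    out_deg = adj.sum(axis=0)
    out_deg[out_deg == 0] = 1.0        # guard against dangling pages
    M = adj / out_deg                  # column j: out-links of page j
    pr = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        pr_next = (1 - d) / n + d * (M @ pr)
        if np.abs(pr_next - pr).sum() < tol:
            break
        pr = pr_next
    return pr

# Toy 4-page web: adj[i, j] = 1 if page j links to page i.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [1, 0, 0, 1],
                [0, 1, 0, 0]], dtype=float)
print(pagerank(adj))
```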

Automated Thickness Measurement of Retinal Blood Vessels for Implementation of Clinical Decision Support Systems in Diagnostic Diabetic Retinopathy

The structure of the retinal vessels is a prominent feature that reveals information on the state of disease, reflected in the form of measurable abnormalities in thickness and colour. An analysis of the retinal vascular structure for the implementation of a clinical diabetic retinopathy decision-making system is presented in this paper. The retinal vascular structure consists of thin blood vessels, so measurement accuracy is highly dependent on the vessel segmentation. In this paper the blood vessel thickness is detected automatically using preprocessing techniques and a vessel segmentation algorithm. First the captured image is binarized to bring out the blood vessel structure clearly; it is then skeletonized to obtain the overall structure of all the terminal and branching nodes of the blood vessels. By identifying the terminal nodes and the branching points automatically, the thickness of the main and branching blood vessels is estimated. Results are presented and compared with those provided by clinical classification on 50 vessels collected from Bejan Singh Eye Hospital.
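
One plausible realization of the binarize/skeletonize/thickness pipeline, sketched with scikit-image and SciPy (the paper does not specify its toolchain, and the Otsu threshold here is an assumption):

```python
import numpy as np
from scipy.ndimage import distance_transform_edt
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def vessel_skeleton(gray_image):
    """Binarize a grayscale fundus image (vessels assumed darker than the
    background) and reduce the vessels to one-pixel-wide centerlines."""
    binary = gray_image < threshold_otsu(gray_image)
    return binary, skeletonize(binary)

def thickness_along_skeleton(binary, skeleton):
    """Estimate local vessel thickness: the Euclidean distance transform
    gives the radius at each centerline pixel, so diameter = 2 * radius."""
    radius = distance_transform_edt(binary)
    return 2.0 * radius[skeleton]
```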

The Model of Blended Learning and Its Use at Foreign Language Teaching

This article considers the model of Blended Learning, its advantages in foreign language teaching, and some problems that can arise during its use. Blended Learning is a special organization of learning that combines classroom work with modern technologies in an electronic distance-teaching environment. Nowadays many European educational institutions and companies use this technology. Through this method, students get the opportunity to learn in a group (classroom) with a teacher and additionally at home at a convenient time; students set their own optimal speed and intensity for the learning process; and the method helps students discipline themselves and learn to work independently.

A Text Mining Technique Using Association Rules Extraction

This paper describes a text mining technique for automatically extracting association rules from collections of textual documents. The technique is called Extracting Association Rules from Text (EART). It relies on keyword features to discover association rules amongst the keywords labeling the documents. The EART system ignores the order in which the words occur, focusing instead on the words and their statistical distributions in documents. The main contributions of the technique are that it integrates XML technology with an information retrieval scheme (TF-IDF) for keyword/feature selection, which automatically selects the most discriminative keywords for use in association rule generation, and that it uses a data mining technique for association rule discovery. It consists of three phases: a text preprocessing phase (transformation, filtration, stemming, and indexing of the documents), an association rule mining (ARM) phase (applying our algorithm for Generating Association Rules based on a Weighting scheme, GARW), and a visualization phase (visualization of results). Experiments were applied to web-page news documents related to the outbreak of the bird flu disease. The extracted association rules contain important features and describe the informative news included in the document collection. The performance of the EART system is compared with that of a system using the Apriori algorithm, in terms of execution time and the evaluation of the extracted association rules.
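
A minimal sketch of the TF-IDF keyword-selection step (scikit-learn shown for illustration; EART's own weighted scheme, GARW, differs in detail, and the documents are invented):

```python
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "bird flu outbreak spreads among poultry farms",
    "vaccine research targets the bird flu virus",
    "poultry trade halted after new flu cases",
]

vec = TfidfVectorizer(stop_words="english")
tfidf = vec.fit_transform(docs)
terms = vec.get_feature_names_out()

# Keep the two most discriminative keywords per document as candidate
# items for association rule generation:
for row in tfidf.toarray():
    top = sorted(zip(terms, row), key=lambda t: t[1], reverse=True)[:2]
    print([term for term, score in top if score > 0])
```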

Causes of Rotor Distortions and Applicable Common Straightening Methods for Turbine Rotors and Shafts

Different problems may cause distortion of the rotor, and hence vibration, which is the most severe damage to turbine rotors. Over many years, different techniques have been developed for the straightening of bent rotors. The straightening method can be selected according to initial information from preliminary inspections and tests, such as non-destructive tests, chemical analysis, and run-out tests, together with knowledge of the shaft material. This article covers the various causes of excessive bends, and then some applicable common straightening methods are reviewed. Finally, hot spotting is chosen for a particular bent rotor. A 325 MW steam turbine rotor is modeled and finite element analyses are carried out to investigate this straightening process. Experimental results show that performing the hot-spot straightening process precisely reduced the bending of the rotor significantly.

The Resource Description Framework (RDF) as a Modern Structure for Medical Data

The amount and heterogeneity of data in biomedical research, notably in interdisciplinary fields, require new methods for the collection, presentation, and analysis of information. Important data from laboratory experiments as well as patient trials are available but come from distributed resources. The Charité University Hospital Berlin, together with the German Research Foundation (DFG), has established a new information service centre for kidney diseases and transplantation (Open European Nephrology Science Centre, OpEN.SC). Besides the collaborative aspect of creating new research groups, every partner or institution of this science information centre that makes its own data available is allowed to search the whole data pool of the various involved centres. A core task is the implementation of a non-restrictive, open data structure for the various different data sources. We decided to use the modern RDF model and, in a first phase, transformed original data coming from the web-based electronic patient record database TBase©.
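
A minimal sketch of expressing a patient-record fact as RDF triples with rdflib (the namespace and properties are hypothetical, not the OpEN.SC or TBase© schema):

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/nephrology/")  # hypothetical vocabulary
g = Graph()

patient = EX["patient/0001"]
g.add((patient, RDF.type, EX.Patient))
g.add((patient, EX.diagnosis, Literal("chronic kidney disease")))
g.add((patient, EX.creatinine, Literal(2.4, datatype=XSD.decimal)))

# Serialize the graph in Turtle, a common RDF exchange format:
print(g.serialize(format="turtle"))
```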

Pulsed Multi-Layered Image Filtering: A VLSI Implementation

Image convolution similar to the receptive fields found in mammalian visual pathways has long been used in conventional image processing in the form of Gabor masks. However, no VLSI implementation of parallel, multi-layered pulsed processing has been put forward that would emulate this property. We present a technical realization of such a pulsed image processing scheme. The discussed IC also serves as a general testbed for VLSI-based pulsed information processing, which is of particular interest with regard to the robustness of representing an analog signal in the phase or duration of a pulsed, quasi-digital signal, as well as the possibility of direct digital manipulation of such an analog signal. The network connectivity and processing properties are reconfigurable so as to allow adaptation to various processing tasks.
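
For reference, the conventional Gabor-mask convolution the pulsed IC emulates can be sketched as follows (scikit-image used for illustration; the image and filter parameters are arbitrary):

```python
import numpy as np
from skimage.filters import gabor

rng = np.random.default_rng(0)
image = rng.random((64, 64))   # stand-in for an input frame

# Real and imaginary Gabor responses at one spatial frequency and angle,
# analogous to an oriented receptive field in the visual pathway:
real, imag = gabor(image, frequency=0.25, theta=np.pi / 4)
magnitude = np.hypot(real, imag)
print(magnitude.shape)          # (64, 64): one response per pixel
```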

A Fitted Random Sampling Scheme for Load Distribution in Grid Networks

Grid networks provide the ability to perform higher-throughput computing by taking advantage of many networked computers' resources to solve large-scale computation problems. As the popularity of Grid networks has increased, there is a need to efficiently distribute the load among the resources accessible on the network. In this paper, we present a stochastic network system that provides a distributed load-balancing scheme by generating almost regular networks. This network system is self-organized and depends only on local information for load distribution and resource discovery. The in-degree of each node reflects its free resources, and the job assignment and resource discovery processes required for load balancing are accomplished by fitted random sampling. Simulation results show that the generated network system provides an effective, scalable, and reliable load-balancing scheme for the distributed resources accessible on Grid networks.
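
A minimal sketch of the sampling idea assumed here (hypothetical overlay; a short random walk over the overlay tends to end at nodes with higher in-degree, i.e. nodes advertising more free resources, making them likelier job targets):

```python
import random

# Hypothetical overlay: node -> list of out-neighbours. A node's in-degree
# (how many others point at it) encodes its advertised free resources.
adjacency = {
    "n1": ["n2", "n3"],
    "n2": ["n3"],
    "n3": ["n1", "n2"],
    "n4": ["n3", "n2"],
}

def sample_target(start, walk_length=10):
    """Random-walk sampling using only local neighbour information;
    the end node serves as the sampled job-assignment target."""
    node = start
    for _ in range(walk_length):
        node = random.choice(adjacency[node])
    return node

print(sample_target("n1"))
```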

Automatic 3D Reconstruction of Coronary Artery Centerlines from Monoplane X-ray Angiogram Images

We present a new method for the fully automatic 3D reconstruction of coronary artery centerlines, using two X-ray angiogram projection images from a single rotating monoplane acquisition system. During the first stage, the input images are smoothed using curve evolution techniques. Next, a simple yet efficient multiscale method, based on the information in the Hessian matrix, is introduced for the enhancement of the vascular structure. Hysteresis thresholding using different image quantiles is used to threshold the arteries. This stage is followed by a thinning procedure to extract the centerlines. The resulting skeleton image is then pruned using morphological and pattern recognition techniques to remove non-vessel-like structures. Finally, edge-based stereo correspondence is solved using a parallel evolutionary optimization method based on symbiosis. The detected 2D centerlines, combined with disparity map information, allow the reconstruction of the 3D vessel centerlines. The proposed method has been evaluated on patient data sets.
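
One plausible sketch of the enhancement, hysteresis-thresholding, and thinning stages using scikit-image (the Frangi filter is one well-known Hessian-based vesselness measure, not necessarily the paper's; the quantile choices are illustrative):

```python
import numpy as np
from skimage.filters import frangi, apply_hysteresis_threshold
from skimage.morphology import skeletonize

def extract_centerlines(angiogram):
    """Hessian-based vessel enhancement, quantile-driven hysteresis
    thresholding, then thinning to one-pixel-wide centerlines."""
    vesselness = frangi(angiogram)                     # enhance tubular structures
    low, high = np.quantile(vesselness, [0.90, 0.98])  # image-quantile thresholds
    arteries = apply_hysteresis_threshold(vesselness, low, high)
    return skeletonize(arteries)
```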

Estimation of Time-Varying Linear Regression with Unknown Time-Volatility via Continuous Generalization of the Akaike Information Criterion

The problem of estimating time-varying regression inevitably involves choosing the appropriate level of model volatility, ranging from the full stationarity of instant regression models to their absolute independence of each other. In the stationary case, the number of regression coefficients to be estimated equals that of the regressors, whereas the absence of any smoothness assumptions augments the dimension of the unknown vector by a factor of the time-series length. The Akaike Information Criterion (AIC) is a commonly adopted means of adjusting a model to a given data set within a succession of nested parametric model classes, but its crucial restriction is that the classes are rigidly defined by the growing integer-valued dimension of the unknown vector. To make the Kullback information maximization principle underlying the classical AIC applicable to the problem of time-varying regression estimation, we extend it to a wider class of data models in which the dimension of the parameter is fixed, but the freedom of its values is softly constrained by a family of continuously nested a priori probability distributions.
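
For reference, the classical criterion being generalized is the standard AIC, which selects, among the nested classes, the model minimizing

```latex
\mathrm{AIC} = 2k - 2\ln\hat{L},
```

where k is the integer-valued dimension of the unknown parameter vector and \hat{L} is the maximized likelihood; the paper's continuous extension replaces this rigid integer k with softly constrained parameter freedom.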

A Novel Machining Signal Filtering Technique: Z-notch Filter

A filter is used to remove undesirable frequency information from a dynamic signal. This paper shows that the Z-notch filtering technique can be applied to remove the noise nuisance from a machining signal. In machining, the noise components were identified from the sound produced by the operation of the machine's own components, such as the hydraulic system and motor, and by the machine environment. By correlating the noise components with the measured machining signal, the components of interest in the measured machining signal, which are less affected by the noise, can be extracted. Thus, the filtered signal is more reliable to analyse in terms of noise content than the unfiltered signal. Significantly, the I-kaz method, which comprises a three-dimensional graphical representation and the I-kaz coefficient Z∞, could differentiate between the filtered and the unfiltered signal. A larger scattering space and a higher value of Z∞ demonstrated that the signal was highly contaminated by noise. This method can be utilised as a proactive tool for evaluating the noise content of a signal; evaluating and eliminating noise content is very important, especially for machining fault diagnosis. The Z-notch filtering technique was reliable in extracting the noise components from the measured machining signal with high efficiency. Even though the measured signal was exposed to high noise disruption, the signal generated by the interaction between the cutting tool and workpiece could still be acquired. Therefore, noise that could change the original signal features and consequently degrade the useful sensory information can be eliminated.
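
To illustrate notch filtering in general, a standard SciPy IIR notch is sketched below (this is not the paper's Z-notch design, and the frequencies are invented):

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 10_000.0                                       # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
clean = np.sin(2 * np.pi * 120 * t)                 # cutting-process component
noisy = clean + 0.5 * np.sin(2 * np.pi * 50 * t)    # e.g. motor/hydraulic hum

# Design a narrow notch at the identified 50 Hz noise line and apply it
# forward-backward so the filtered signal has zero phase distortion:
b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)
filtered = filtfilt(b, a, noisy)
print(np.max(np.abs(filtered - clean)))             # residual vs. clean component
```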

Goal-Based Request Cloud Resource Broker in Medical Applications

In this paper, a cloud resource broker using goal-based requests in medical applications is proposed. To handle the recent huge production of digital images and data in medical informatics applications, the cloud resource broker can be used by medical practitioners to discover and select the correct information and applications. This paper summarizes several reviewed articles relating medical informatics applications to current broker technology, and presents research on applying goal-based requests in a cloud resource broker to optimize the use of resources in a cloud environment. The objective of proposing a new kind of resource broker is to enhance the current resource scheduling, discovery, and selection procedures. We believe it can help maximize resource allocation in medical informatics applications.
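
A hypothetical sketch of goal-based matching as assumed here (the resource catalogue, goal fields, and cheapest-fit selection policy are all invented for illustration):

```python
# Invented resource catalogue a broker might maintain:
RESOURCES = [
    {"name": "vm-a", "cpus": 8,  "ram_gb": 32, "cost": 1.2},
    {"name": "vm-b", "cpus": 16, "ram_gb": 64, "cost": 2.5},
    {"name": "vm-c", "cpus": 4,  "ram_gb": 16, "cost": 0.6},
]

def broker_select(goal):
    """Select the cheapest resource satisfying the practitioner's goal,
    stated as capacity constraints rather than a specific machine."""
    candidates = [r for r in RESOURCES
                  if r["cpus"] >= goal["min_cpus"]
                  and r["ram_gb"] >= goal["min_ram_gb"]]
    if not candidates:
        raise LookupError("no resource satisfies the goal")
    return min(candidates, key=lambda r: r["cost"])

# Goal: enough capacity to process a batch of medical images.
print(broker_select({"min_cpus": 8, "min_ram_gb": 32}))  # -> vm-a
```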