Abstract: In this paper we present an algorithm that enables
near-real-time object tracking in Full HD videos. The frame rate
(FR) of a video stream is considered to be between 5 and 30 frames
per second, so real-time track building is achieved if the algorithm
can process 5 or more frames per second. The principal idea is to
use fast algorithms during preprocessing to obtain the key points
and then to track them. Because the cost of matching points during
assignment depends strongly on the number of points, we limit that
number by keeping only the most informative points.
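A minimal sketch of the point-limiting step described above, assuming each key point carries a detector response score that serves as its "informativeness" (the tuple layout and scores are illustrative, not the paper's):

```python
# Hedged sketch: keep only the n most informative key points before the
# matching/assignment stage. The (x, y, response) layout and the use of a
# detector response score as "informativeness" are illustrative assumptions.

def select_informative(keypoints, n):
    """Return the n key points with the highest response score."""
    return sorted(keypoints, key=lambda kp: kp[2], reverse=True)[:n]

points = [(10, 20, 0.9), (15, 22, 0.2), (40, 80, 0.7), (5, 5, 0.4)]
best_points = select_informative(points, 2)
```

Capping the point count this way keeps the assignment-based matching stage tractable, since its cost grows quickly with the number of points.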
Abstract: In this paper we report a study aimed at determining
the most effective animation technique for representing ASL
(American Sign Language) finger-spelling. Specifically, in the study
we compare two commonly used 3D computer animation methods
(keyframe animation and motion capture) in order to ascertain which
technique produces the most 'accurate', 'readable', and 'close to
actual signing' (i.e. realistic) rendering of ASL finger-spelling. To
accomplish this goal we have developed 20 animated clips of fingerspelled
words and we have designed an experiment consisting of a
web survey with rating questions. 71 subjects aged 19-45 participated
in the study. Results showed that recognition of the words was
correlated with the method used to animate the signs. In particular,
the keyframe technique produced the most accurate representation of
the signs (i.e., participants were more likely to identify the words
correctly in keyframed sequences than in motion-captured ones).
Further, findings showed that the animation method had an
effect on the reported scores for readability and closeness to actual
signing; the estimated marginal mean readability and closeness was
greater for keyframed signs than for motion captured signs. To our
knowledge, this is the first study aimed at measuring and comparing
accuracy, readability and realism of ASL animations produced with
different techniques.
Abstract: Technological innovation capability (TIC) is
defined as a comprehensive set of characteristics of a firm that
facilitates and supports its technological innovation strategies.
An audit to evaluate the TICs of a firm may trigger
improvement in its future practices. Such an audit can be used
by the firm for self assessment or third-party independent
assessment to identify problems of its capability status. This
paper attempts to develop such an auditing framework that
can help to determine the subtle links between innovation
capabilities and business performance; and to enable the
auditor to determine whether good practice is in place. The
seven TICs in this study include learning, R&D, resource
allocation, manufacturing, marketing, organization and
strategic planning capabilities. Empirical data was acquired
through a survey study of 200 manufacturing firms in the
Hong Kong/Pearl River Delta (HK/PRD) region. Structural
equation modelling was employed to examine the
relationships among TICs and various performance indicators:
sales performance, innovation performance, product
performance, and sales growth. The results revealed that
different TICs have different impacts on different
performance measures. Organization capability was found to
have the most influential impact. Hong Kong manufacturers
are now facing the challenge of high-mix-low-volume
customer orders. In order to cope with this change, good
capability in organizing different activities among various
departments is critical to the success of a company.
Abstract: The paper presents new results of a recent
industry-supported research and development study in which an efficient
framework for evaluating practical and meaningful power system
reliability and quality indices was applied. The system-wide
integrated performance indices are capable of addressing and
revealing areas of deficiencies and bottlenecks as well as
redundancies in the composite generation-transmission-demand
structure of large-scale power grids. The technique utilizes a linear
programming formulation, which simulates practical operating
actions and offers a general and comprehensive framework to assess
the harmony and compatibility of generation, transmission and
demand in a power system. Practical applications to a reduced
system model as well as a portion of the Saudi power grid are also
presented in the paper for demonstration purposes.
Abstract: This paper studies the effect of different compression
constraints and schemes presented in a new and flexible paradigm to
achieve high compression ratios and acceptable signal to noise ratios
of Arabic speech signals. Compression parameters are computed for
variable frame sizes of a level 5 to 7 Discrete Wavelet Transform
(DWT) representation of the signals for different analyzing mother
wavelet functions. Results are obtained and compared for global
thresholding and level-dependent thresholding techniques. The results
obtained also include comparisons of Signal to Noise Ratio, Peak
Signal to Noise Ratio and Normalized Root Mean Square Error.
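As a hedged illustration of threshold-based wavelet compression, the sketch below uses a single-level Haar transform with a global hard threshold and the SNR metric; the paper itself uses level-5 to level-7 DWTs with various mother wavelets, so this only shows the general idea on an invented signal:

```python
# Single-level Haar DWT, global hard thresholding of detail coefficients,
# and SNR in dB. A stand-in for the deeper multi-wavelet DWTs in the paper.
import math

def haar_fwd(x):
    s = math.sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return approx, detail

def haar_inv(approx, detail):
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s, (a - d) / s])
    return out

def compress(x, threshold):
    approx, detail = haar_fwd(x)
    # global hard threshold: zero out small detail coefficients
    detail = [c if abs(c) >= threshold else 0.0 for c in detail]
    return haar_inv(approx, detail)

def snr_db(original, reconstructed):
    signal = sum(v * v for v in original)
    noise = sum((v - w) ** 2 for v, w in zip(original, reconstructed))
    return float("inf") if noise == 0 else 10.0 * math.log10(signal / noise)

x = [1.0, 1.1, 5.0, 5.2, 2.0, 2.05, 9.0, 1.0]
reconstructed = compress(x, threshold=0.5)
quality = snr_db(x, reconstructed)
```

Zeroed coefficients need not be stored, which is where the compression ratio comes from; the threshold trades ratio against SNR.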
Abstract: Multi-Agent Systems (MAS) emerged from the pursuit of improving our standard of living, and hence can manifest complex human behaviors such as communication, decision making, negotiation and self-organization. Social Network Services (SNSs) have attracted millions of users, many of whom have integrated these sites into their daily practices. The domains of MAS and SNS share many similarities in architecture, features and functions. Our research focus is therefore to explore social network users' behavior through a multi-agent model, in order to generate more accurate and meaningful information for SNS users. An application of MAS is the e-Auction and e-Rental services of the Universiti Cyber AgenT (UniCAT), a social network for students in Universiti Tunku Abdul Rahman (UTAR), Kampar, Malaysia, built around the Belief-Desire-Intention (BDI) model. However, in spite of its various advantages, the BDI model has also been found to have some shortcomings. This paper therefore proposes a multi-agent framework utilizing a modified BDI model, Belief-Desire-Intention in Dynamic and Uncertain Situations (BDIDUS), using the UniCAT system as a case study.
Abstract: This research elaborates decision models for product
innovation in the early phases, focusing on one of the most widely
implemented methods in marketing research: conjoint analysis and the
related conjoint-based models, with special focus on heuristic
programming techniques for the development of optimal product
innovations. The concept, potential, requirements and limitations of
conjoint analysis and its conjoint-based heuristic successors are
analysed, and the development of a conceptual framework based on the
Genetic Algorithm (GA), one of the most widely implemented heuristic
methods for developing product innovations, is discussed.
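The GA part of such conjoint-based heuristics can be sketched as follows; the attribute levels and part-worth utilities are invented for illustration, and the fitness of a candidate product profile is simply its summed part-worth utility:

```python
# Hedged sketch: a minimal genetic algorithm searching for the product
# profile that maximises total conjoint part-worth utility. All numbers
# below are illustrative assumptions, not estimates from real data.
import random

random.seed(42)

# part_worths[attribute][level] = utility, e.g. from a conjoint study
part_worths = [
    [0.1, 0.8, 0.4],        # e.g. price level
    [0.5, 0.2],             # e.g. packaging
    [0.3, 0.9, 0.6, 0.1],   # e.g. brand
]

def fitness(profile):
    return sum(pw[lvl] for pw, lvl in zip(part_worths, profile))

def random_profile():
    return [random.randrange(len(pw)) for pw in part_worths]

def mutate(profile):
    child = profile[:]
    i = random.randrange(len(child))
    child[i] = random.randrange(len(part_worths[i]))
    return child

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def ga(pop_size=20, generations=50):
    # seed the population with a baseline profile; elitism keeps the best
    pop = [[0] * len(part_worths)] + [random_profile() for _ in range(pop_size - 1)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = ga()
```

With elitism the best profile found never gets worse between generations, which is what makes such heuristics usable on the much larger attribute spaces of real conjoint studies.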
Abstract: Grid computing provides a virtual framework for
controlled sharing of resources across institutional boundaries.
Recently, trust has been recognised as an important factor for
selection of optimal resources in a grid. We introduce a new method
that provides a quantitative trust value, based on the past interactions
and present environment characteristics. This quantitative trust value
is used to select a suitable resource for a job and eliminates run time
failures arising from incompatible user-resource pairs. The proposed
work acts as a tool to calculate the trust values of the various
components of the grid and thereby improves the success rate of the
jobs submitted to a resource on the grid. Access to a resource
depends not only on the identity and behaviour of the resource but
also on its transaction context, time of transaction, connectivity
bandwidth, availability of the resource and load on the resource. The
quality of the recommender is also evaluated based on the accuracy
of the feedback provided about a resource. The jobs are submitted for
execution to the selected resource after finding the overall trust value
of the resource. The overall trust value is computed with respect to
the subjective and objective parameters.
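A minimal sketch of combining past-interaction history with present environment characteristics into one quantitative trust value; the weights and context-factor names below are assumptions for illustration, not the paper's exact formulation:

```python
# Hedged sketch: overall trust as a weighted blend of past success rate
# (subjective/history) and averaged present-environment scores (objective).
# Weights and factor names are illustrative assumptions.

def trust_value(past_success_rate, context_factors, w_history=0.6, w_context=0.4):
    """past_success_rate: fraction of past jobs that succeeded, in 0..1.
    context_factors: dict of present-environment scores in 0..1,
    e.g. bandwidth, availability, load headroom."""
    context = sum(context_factors.values()) / len(context_factors)
    return w_history * past_success_rate + w_context * context

score = trust_value(0.9, {"bandwidth": 0.8, "availability": 1.0, "load_headroom": 0.5})
```

A scheduler would then submit a job to the resource with the highest such score, avoiding the run-time failures the abstract attributes to incompatible user-resource pairs.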
Abstract: One problem in evaluating recent computational models of human category learning is that there is no standardized method for systematically comparing the models' assumptions or hypotheses. In the present study, a flexible general model (called GECLE) is introduced that can be used as a framework to systematically manipulate and compare the effects and descriptive validities of a limited number of assumptions at a time. Two example simulation studies are presented to show how the GECLE framework can be useful in research on human higher-order cognition.
Abstract: As far as the latest technological improvements are concerned, digital systems have become more popular than in the past. Alongside this growing demand for digital systems, content copying and attacks against digital cinema content have become a serious problem. To solve this security problem, we propose “traceable watermarking” using hash functions for a digital cinema system. Digital cinema is a good application for traceable watermarking since it uses watermarking technology during content play as well as content transmission. The watermark is embedded into randomly selected movie frames using the CRC-32 technique, a hash function. The embedding positions are distributed by the hash function so that no party can remove or alter the watermark. Finally, our experimental results show that the proposed DWT watermarking method using CRC-32 outperforms conventional watermarking techniques in terms of robustness, image quality and its simple but hard-to-break algorithm.
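The frame-selection idea can be sketched with Python's standard CRC-32: embedding positions are derived from a secret key, so without the key the marked frames cannot be located (the key and frame counts here are illustrative assumptions):

```python
# Hedged sketch: derive key-dependent, pseudo-random watermark-embedding
# frame indices from CRC-32, in the spirit of the abstract. The key and
# the frame counts are illustrative, not from the paper.
import zlib

def embedding_frames(secret_key: bytes, total_frames: int, n_marks: int):
    """Return n_marks frame indices scattered over the movie by CRC-32."""
    frames = []
    for i in range(n_marks):
        digest = zlib.crc32(secret_key + i.to_bytes(4, "big"))
        frames.append(digest % total_frames)
    return frames

# e.g. a 2-hour film at 24 fps has 172800 frames
positions = embedding_frames(b"studio-key", total_frames=172800, n_marks=5)
```

Because CRC-32 is deterministic, the detector holding the key recomputes the same positions, while an attacker without the key must treat every frame as potentially marked.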
Abstract: B2E portals represent a new class of web-based
information technologies that many organisations have been introducing
in recent years to stay in touch with their distributed workforces and
enable them to perform value added activities for organisations.
However, actual usage of these emerging systems (measured using
suitable instruments) has not been reported in the contemporary
scholarly literature. We argue that many of the instruments to
measure usage of various types of IT-enabled information systems
are not directly applicable for B2E portals because they were
developed for the context of traditional mainframe and PC-based
information systems. It is therefore important to develop a new
instrument for web-based portal technologies aimed at employees. In
this article, we report on the development and initial qualitative
evaluation of an instrument that seeks to operationalise a set of
independent factors affecting the usage of portals by employees. The
proposed instrument is useful to IT/e-commerce researchers and
practitioners alike as it enhances their confidence in predicting
employee usage of portals in organisations.
Abstract: The magneto-rheological (MR) fluid damper is a semi-active
control device that has recently received more attention from the
vibration control community. However, the inherent hysteretic and
highly nonlinear dynamics of the MR fluid damper are one of the
challenging aspects of employing its unique characteristics. The
combination of artificial neural network (ANN) and fuzzy logic system
(FLS) has been used to imitate more precisely the behavior of this device.
However, the derivative-based nature of adaptive networks causes
some deficiencies. Therefore, in this paper, a novel approach that
employs a genetic algorithm, as a derivative-free method, to enhance
the capability of fuzzy systems is proposed. The proposed method is
used to model the MR damper. The results are compared with the
adaptive neuro-fuzzy inference system (ANFIS) model, which is one
of the well-known approaches in the soft computing framework, and
with two of the best parametric models of the MR damper. Data are
generated based on a benchmark program by applying a number of
well-known earthquake records.
Abstract: A master plan is a tool to guide and manage the growth of cities in a planned manner. The soul of a master plan lies in its implementation framework. If not implemented, people are trapped in a mess of urban problems and laissez-faire development with serious long-term repercussions. Unfortunately, master plans prepared for several major cities of Pakistan could not be fully implemented due to a host of reasons, and Lahore is no exception. Being the second largest city of Pakistan with a population of over 7 million people, Lahore holds the distinction that the first ever master plan in the country was prepared for this city in 1966. Recently, in 2004, a new plan titled 'Integrated Master Plan for Lahore-2021' was approved for implementation. This paper provides a comprehensive account of the weaknesses and constraints in the plan preparation process and implementation strategies of the master plans prepared for Lahore. It also critically reviews the new master plan, particularly with respect to the proposed implementation framework. The paper discusses the prospects and pre-conditions for successful implementation of the new plan in the light of historical analysis, interviews with stakeholders and the new institutional context under the devolution plan.
Abstract: The use of the human hand as a natural interface for human-computer interaction (HCI) serves as the motivation for research in hand gesture recognition. Vision-based hand gesture recognition involves visual analysis of hand shape, position and/or movement. In this paper, we use the concept of object-based video abstraction for segmenting the frames into video object planes (VOPs), as used in MPEG-4, with each VOP corresponding to one semantically meaningful hand position. Next, the key VOPs are selected on the basis of the amount of change in hand shape: for a given key frame in the sequence, the next key frame is the one in which the hand changes its shape significantly. Thus, an entire video clip is transformed into a small number of representative frames that are sufficient to represent a gesture sequence. Subsequently, we model a particular gesture as a sequence of key frames, each bearing information about its duration. These constitute a finite state machine (FSM). For recognition, the states of the incoming gesture sequence are matched with the states of all the different FSMs contained in the database of the gesture vocabulary. The core idea of our proposed representation is that redundant frames of the gesture video sequence bear only the temporal information of a gesture and hence can be discarded for computational efficiency. Experimental results obtained demonstrate the effectiveness of our proposed scheme for key frame extraction, subsequent gesture summarization and finally gesture recognition.
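The key-frame (key-VOP) selection rule can be sketched as follows, with scalar shape descriptors standing in for the hand-shape features of the segmented VOPs (the descriptor values and threshold are illustrative assumptions):

```python
# Hedged sketch of the key-frame rule: a frame becomes a key frame when
# its shape descriptor differs from the last key frame's by more than a
# threshold. Real descriptors would be VOP shape features, not scalars.

def extract_key_frames(descriptors, threshold):
    keys = [0]  # the first frame always starts the sequence
    for i, d in enumerate(descriptors[1:], start=1):
        if abs(d - descriptors[keys[-1]]) > threshold:
            keys.append(i)
    return keys

# runs of near-duplicate frames collapse onto a few representatives
shape = [1.0, 1.02, 1.01, 2.5, 2.52, 2.48, 0.4, 0.41]
keys = extract_key_frames(shape, threshold=0.5)
```

The resulting key-frame sequence is what gets matched, state by state, against the FSMs in the gesture vocabulary; the dropped in-between frames carry only duration information.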
Abstract: Vibration characteristics of subcooled flow boiling on
thin and long structures such as a heating rod were recently
investigated by the author. The results show that the intensity of the
subcooled boiling-induced vibration (SBIV) was influenced strongly
by the conditions of the subcooling temperature, linear power density
and flow velocity. Implosive bubble formation and collapse are the
essential nature of subcooled boiling, and their behavior is the sole
source of SBIV. Therefore, in order to explain the
phenomenon of SBIV, it is essential to obtain reliable information
about bubble behavior in subcooled boiling conditions. This was
investigated at different conditions of coolant subcooling
temperatures of 25 to 75°C, coolant flow velocities of 0.16 to
0.53m/s, and linear power densities of 100 to 600 W/cm. High speed
photography at 13,500 frames per second was performed at these
conditions. The results show that even at the highest subcooling
condition, the absolute majority of bubbles collapse very close to the
surface after detaching from the heating surface. Based on these
observations, a simple model of surface tension and momentum
change is introduced to offer a rough quantitative estimate of the
force exerted on the heating surface during the bubble ebullition. The
formation of a typical bubble in subcooled boiling is predicted to
exert an excitation force on the order of 10⁻⁴ N.
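As a hedged order-of-magnitude check only (the surface-tension value and bubble diameter below are assumed for illustration, not taken from the paper's model), a surface-tension force scale of sigma multiplied by pi and the bubble diameter does land around 10⁻⁴ N:

```python
# Back-of-the-envelope scale check, not the paper's model: force scale
# F ~ sigma * pi * d with assumed values for hot water and a ~1 mm bubble.
import math

sigma = 0.059   # N/m, assumed surface tension of water near saturation
d = 1.0e-3      # m, assumed bubble diameter

force = sigma * math.pi * d  # ~1.9e-4 N
```

This is consistent with the 10⁻⁴ N excitation force the abstract reports for a typical bubble ebullition.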
Abstract: Understanding the system level of biological behavior and phenomena requires various elements such as gene sequences, protein structures, gene functions and metabolic pathways. Challenging problems are representing, learning and reasoning about these biochemical reactions, gene and protein structures, the relation between genotype and phenotype, and the expression system based on those interactions. The goal of our work is to understand the behaviors of the interaction networks and to model their evolution in time and in space. We propose in this study an ontological meta-model for the knowledge representation of genetic regulatory networks. In artificial intelligence, an ontology means the fundamental categories and relations that provide a framework for knowledge models. Domain ontologies are now commonly used to enable heterogeneous information resources, such as knowledge-based systems, to communicate with each other. The interest of our model is to represent spatial, temporal and spatio-temporal knowledge. We validated our propositions on the genetic regulatory network of the Arabidopsis thaliana flower.
Abstract: This paper draws a methodological framework adopted within an internal Telecomitalia project aimed at identifying, on a user-centred basis, the potential interest in a technological scenario that extends the typical home environment for communication and media fruition into a personal bubble. The problem is that involving users in the early stages of the development of such a disruptive technology scenario, and asking their opinions on something they do not yet manage even in a rough manner, could lead to wrong or distorted results. For that reason we chose an approach that indirectly aims to understand users' hidden needs, in order to obtain a meaningful picture of the possible interest in a technological proposition that is not yet easily understandable.
Abstract: This study empirically examines the long-run equilibrium relationship between South Africa’s exports and imports using quarterly data from 1985 to 2012. The theoretical framework used for the study is based on Johansen’s Maximum Likelihood cointegration technique, which tests for both the existence and the number of cointegrating vectors. The study finds that both series are integrated of order one and are cointegrated. A statistically significant cointegrating relationship is found to exist between exports and imports. The study models this unique linear and lagged relationship using a Vector Error Correction Model (VECM). The findings of the study confirm the existence of a long-run equilibrium relationship between exports and imports.
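Johansen's ML procedure itself is involved; as a much simpler stand-in, the sketch below shows the related two-step Engle-Granger idea on synthetic data (not the South African series): regress one series on the other, then keep the residual as the error-correction term a VECM-style model would use:

```python
# Hedged sketch of the Engle-Granger two-step idea, a simpler relative of
# the Johansen procedure the paper uses. Data are synthetic.

def ols_slope_intercept(x, y):
    """Ordinary least squares fit y = alpha + beta * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))
    return beta, my - beta * mx

# synthetic cointegrated pair: imports track exports plus small noise
exports = [float(t) for t in range(1, 21)]
noise = [0.1, -0.2, 0.05, 0.15, -0.1, 0.0, 0.2, -0.05, 0.1, -0.15,
         0.05, 0.1, -0.1, 0.2, -0.2, 0.0, 0.15, -0.05, 0.1, -0.1]
imports = [0.9 * e + 2.0 + u for e, u in zip(exports, noise)]

beta, alpha = ols_slope_intercept(exports, imports)
# stationary residual = error-correction term for the VECM step
resid = [m - (alpha + beta * e) for e, m in zip(exports, imports)]
```

In the full procedure, the lagged residual enters a regression on first differences, and its coefficient measures the speed of adjustment back to the long-run equilibrium.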
Abstract: In data mining, association rules are used to find
associations between the different items in a transaction
database. As data are collected and stored, valuable rules can be
found through association rules, which can be applied to help managers
execute marketing strategies and establish sound market frameworks.
This paper aims to use Fuzzy Frequent Pattern growth (FFP-growth)
to derive fuzzy association rules. At first, we apply fuzzy
partition methods and decide a membership function of the quantitative
value for each transaction item. Next, we implement FFP-growth
to deal with the process of data mining. In addition, in order to
understand the impact of the Apriori algorithm and the FFP-growth
algorithm on the execution time and the number of generated association
rules, the experiment is performed using different sizes of
databases and thresholds. Lastly, the experimental results show that
the FFP-growth algorithm is more efficient than other existing methods.
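The fuzzy-partition step can be sketched with triangular membership functions that map a quantitative item value onto linguistic terms before mining; the term names and boundaries here are illustrative assumptions:

```python
# Hedged sketch of fuzzifying a quantitative transaction value (e.g. a
# purchase quantity) into membership degrees for linguistic terms, the
# preprocessing step before FFP-growth. Boundaries are illustrative.

def triangular(x, a, b, c):
    """Triangular membership with peak at b and feet at a and c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzify(quantity):
    return {
        "low": triangular(quantity, -1, 0, 5),
        "middle": triangular(quantity, 0, 5, 10),
        "high": triangular(quantity, 5, 10, 16),
    }

degrees = fuzzify(4)
```

Each transaction item thus contributes fractional support to several fuzzy itemsets at once, which is what distinguishes FFP-growth mining from crisp FP-growth.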
Abstract: In many countries, digital city or ubiquitous city
(u-City) projects have been initiated to provide digitalized economic
environments to cities. Recently in Korea, Kangwon Province has
started the u-Kangwon project to boost local economy with digitalized
tourism services. We analyze the limitations of the ubiquitous IT
approach through the u-Kangwon case. We have found that travelers
are more interested in quality over speed in access of information. For
improved service quality, we are looking to develop an
IT-convergence service design framework (ISDF). The ISDF is based
on the service engineering technique and composed of three parts:
Service Design, Service Simulation, and the Service Platform.