Abstract: Digital libraries increasingly need to provide users with powerful, easy-to-use tools for searching, browsing and retrieving media information. The starting point for these tasks is the segmentation of video content into shots. This study develops a fully automatic, real-time procedure that segments MPEG video streams into shots, detecting both abrupt and gradual transitions (dissolves and fade groups) with minimal decoding. Detection proceeds in two phases: analysis of macro-block types in B-frames, followed by on-demand analysis of intensity information.
The experimental results show remarkable performance in detecting gradual transitions for some kinds of input data and comparable results for the remaining video streams examined. Almost all abrupt transitions were detected with very few false alarms.
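The general idea of flagging an abrupt transition by thresholding a frame-to-frame dissimilarity signal can be sketched as follows. This toy histogram-difference detector is only an illustrative stand-in, not the paper's macro-block-based method, and all function names and threshold values are assumptions.

```python
# Toy sketch of abrupt-cut detection via histogram differences.
# This is NOT the paper's macro-block method; it only illustrates
# thresholding a frame-to-frame dissimilarity signal.

def histogram(frame, bins=8, levels=256):
    """Coarse intensity histogram of a frame (a flat list of pixel values)."""
    h = [0] * bins
    for p in frame:
        h[p * bins // levels] += 1
    return h

def detect_cuts(frames, threshold=0.5):
    """Flag frame indices where the normalized histogram distance to the
    previous frame exceeds `threshold` (a hypothetical value)."""
    cuts = []
    prev = histogram(frames[0])
    n = len(frames[0])
    for i, f in enumerate(frames[1:], start=1):
        h = histogram(f)
        # L1 distance, normalized to [0, 1]
        dist = sum(abs(a - b) for a, b in zip(prev, h)) / (2 * n)
        if dist > threshold:
            cuts.append(i)
        prev = h
    return cuts

# Two synthetic "shots": dark frames, then bright frames -> one cut at index 3.
dark = [10] * 64
bright = [200] * 64
print(detect_cuts([dark, dark, dark, bright, bright]))  # [3]
```

A gradual transition (dissolve or fade) would instead produce a sustained run of moderate distances, which is why the paper needs the second, intensity-based phase.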
Abstract: Code mobility technologies attract more and more developers and consumers. Numerous domains are concerned, many platforms have been developed, and interesting applications have been realized. However, developing good software products requires modeling, analyzing and proving steps, and the choice of models and modeling languages is critical in these steps. Formal tools are powerful in the analyzing and proving steps; however, the inadequacy of classical modeling languages for modeling mobility calls for new models. The objective of this paper is to provide a specific formalism, "Coloured Reconfigurable Nets", and to show how it is adequate for modeling different kinds of code mobility.
Abstract: Controlling software evolution requires a deep understanding of changes and of their impact on the different heterogeneous artifacts of a system, and an understanding of descriptive knowledge of the developed software artifacts is a prerequisite for the success of the evolutionary process. Implementing an evolutionary process means making more or less important changes to many heterogeneous software artifacts, such as source code, analysis and design models, unit tests, XML deployment descriptors, user guides, and others. These changes can be a source of degradation of the modified software in functional, qualitative or behavioral terms. Hence the need for a unified approach to extracting and representing the different heterogeneous artifacts, in order to ensure a unified and detailed description of them that is exploitable by several software tools and that allows those responsible for the evolution to carry out the reasoning about the change concerned.
Abstract: This contribution grew out of research conducted within a doctoral thesis, whose object was to create multimedia materials for sport gymnastics. We then surveyed the influence of their practical application on the efficiency of schooling at a university. We verified the stated hypothesis about the efficiency of the teaching process using a single-factor experiment, in which the input independent variable was the change in the system of tuition and the output dependent variable was the change in the level
of acquired motor skills. The results confirmed the positive impact of
using multimedia materials on the efficiency of the teaching process.
Further, with the aid of questionnaires, we evaluated how the tested
subjects perceive the innovative methods in sport gymnastics. The
responses showed that the students rate the application of multimedia
materials very positively.
Abstract: A knowledge sharing culture contributes to a positive working environment. Currently, there is no platform for the academic staff of the Faculty of Industrial Information Technology (FIIT), Unisel, to share knowledge among themselves. Sharing is done manually, through common meetings or offline discussions, and there is no repository for future retrieval. With an open source solution, however, the development of a knowledge-based application may reduce the cost tremendously. In this paper we discuss the domain for which this knowledge portal is being developed, as well as the deployment of open source tools such as JOOMLA, the PHP programming language and MySQL. The knowledge portal is evidence that open source tools are also reliable for developing knowledge-based portals. These recommendations will be useful to the open source community for producing more open source products in the future.
Abstract: To achieve competitive advantage nowadays, most industrial companies consider that success rests on great product development, that is, on managing the product throughout its entire lifetime, from design and manufacture to operation and disposal. Achieving this goal requires tight collaboration between partners from a wide variety of domains, resulting in various product data types and formats, as well as different software tools. So far, the lack of a meaningful unified representation of product data semantics has slowed down efficient product development. This paper proposes an ontology-based approach to enable such semantic interoperability. A generic and extendible product ontology is described, gathering the main concepts pertaining to the mechanical field and the relations that hold among them. The ontology is not exhaustive; nevertheless, it shows that such a unified representation is possible and easily exploitable. This is illustrated through a case study with an example product and some semantic requests to which the ontology responds quite easily. The study proves the efficiency of ontologies as a support for product data exchange and information sharing, especially in product development environments where collaboration is not just a choice but a mandatory prerequisite.
Abstract: With a growing number of digital libraries and other
open education repositories being made available throughout the
world, effective search and retrieval tools are necessary to access the
desired materials that surpass the effectiveness of traditional, all-inclusive search engines. This paper discusses the design and use of
Folksemantic, a platform that integrates OpenCourseWare search,
Open Educational Resource recommendations, and social network
functionality into a single open source project. The paper describes
how the system was originally envisioned, its goals for users, and
data that provides insight into how it is actually being used. Data
sources include website click-through data, query logs, web server
log files and user account data. Based on a descriptive analysis of its
current use, modifications to the platform's design are recommended
to better address goals of the system, along with recommendations
for additional phases of research.
Abstract: The Field Programmable Gate Array (FPGA) technology offers the potential of designing high performance systems at low cost. The discrete wavelet transform has gained the reputation of being a very effective signal analysis tool for many practical applications. However, due to its computation-intensive nature, current implementations of the transform fall short of meeting the real-time processing requirements of most applications. The objective of this paper is to implement the Haar and Daubechies wavelets using FPGA technology. In addition, the Bit Error Rate (BER) between the input audio signal and the reconstructed output signal is calculated for each wavelet. From the BER, it is seen that the implementations execute the wavelet transform correctly and satisfy the perfect reconstruction conditions. The design procedure has been explained and carried out using state-of-the-art Electronic Design Automation (EDA) tools for system design on FPGA. Simulation, synthesis and implementation on the FPGA target technology have been carried out.
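The perfect-reconstruction property being checked can be mirrored in software. The sketch below is only a hedged software model of the underlying math, not the paper's FPGA design: it applies one level of the orthonormal Haar transform and reports a crude sample-error rate standing in for the BER between input and reconstructed signals.

```python
# Minimal pure-Python sketch of a one-level Haar wavelet transform and a
# bit-error-rate-style check of perfect reconstruction.  The paper targets
# FPGA hardware; this software model only mirrors the arithmetic.
import math

def haar_forward(x):
    """One level of the (orthonormal) Haar DWT: approximations and details."""
    s = math.sqrt(2.0)
    approx = [(x[2*i] + x[2*i+1]) / s for i in range(len(x) // 2)]
    detail = [(x[2*i] - x[2*i+1]) / s for i in range(len(x) // 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Invert one Haar level; with exact arithmetic this is lossless."""
    s = math.sqrt(2.0)
    x = []
    for a, d in zip(approx, detail):
        x.append((a + d) / s)
        x.append((a - d) / s)
    return x

def bit_error_rate(orig, recon, tol=1e-9):
    """Fraction of samples whose reconstruction error exceeds `tol`
    (a crude stand-in for the paper's BER between I/O audio signals)."""
    errors = sum(1 for o, r in zip(orig, recon) if abs(o - r) > tol)
    return errors / len(orig)

signal = [1.0, 3.0, -2.0, 4.0, 0.5, 0.5, 7.0, -1.0]
a, d = haar_forward(signal)
recon = haar_inverse(a, d)
print(bit_error_rate(signal, recon))  # 0.0 -> perfect reconstruction
```

A Daubechies wavelet would replace the two-tap Haar filters with longer filter banks, but the forward/inverse/BER structure of the check stays the same.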
Abstract: This research is aimed at studying the nature of
problems and demands of the training for community leaders in the
upper northeastern region of Thailand. The population comprises 360 community leaders in the region who have received prior training from Udonthani Rajabhat University; stratified random sampling yielded 186 participants. The research instrument is a questionnaire, and frequency, percentage and standard deviation are employed in the data analysis. The findings indicate that most of the community leaders are males and
senior adults. The problems in training are associated with the
inconveniences of long-distance travelling to training locations,
inadequacy of learning centers and training sites and high training
costs. The demand for training is basically motivated by a desire for self-development and modern knowledge, to keep up to date with the changing world, and by the need for technological applications that facilitate shortening the distance to training locations and limiting expensive training costs.
Abstract: This paper proposes the use of metrics in design space exploration that highlight where in the structure of the model, and at what point in the behaviour, prevention against transient faults is needed. Previous approaches to tackling transient faults focused on recovery after detection; almost no research has been directed towards preventive measures. But in real-time systems, hard deadlines are performance requirements that absolutely must be met, and a missed deadline constitutes an erroneous action and a possible system failure. This paper proposes the use of metrics to assess the system design and flag where transient faults may have significant impact. These tools then allow the design to be changed to minimize that impact, and they also flag where particular design techniques, such as coding of communications or memories, need to be applied in later stages of design.
Abstract: Schools today face ever-increasing demands in their attempts to ensure that students are well equipped to enter the workforce and navigate a complex world. Research indicates that computer technology can help support learning and the implementation of various experiments or learning games, and that it is especially useful in developing the higher-order skills of critical thinking, observation, comprehension, implementation, comparison, analysis and active attention in activities such as research, field work, simulations and scientific inquiry. ICT in education supports the learning procedure by making it more flexible and effective, creating a rich and attractive training environment, and equipping students with knowledge and potential useful for the competitive social environment in which they live. This paper presents the design, the development, and the results of the evaluation analysis of an interactive educational game which uses real electric toy vehicles (material) on a toy race track. When the game starts, each student selects a specific toy vehicle. The students then answer questionnaires on a computer, and each vehicle's speed is tied to the percentage of right answers in a multiple-choice questionnaire (software). Every question carries its own weight depending on its level in the questionnaire. Via the developed software, each right or wrong answer increases or decreases the real-time speed of the corresponding toy vehicle; moreover, the rate of the speed increase or decrease depends on the difficulty level of each question. The aim of the work is to attract students' interest in the learning process and also to improve their scores. The developed real-time game was tested on independent populations of students in the age groups 8-10, 11-14 and 15-18 years. Standard educational and statistical analysis tools were used for the evaluation analysis of the game.
Results reveal that students using the developed real-time control game scored much higher (60%) than students using a traditional simulation game on the same questionnaire. Results further indicate that the students' interest in repeating the developed real-time control game was far higher (70%) than the interest of students using a traditional simulation game.
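The speed-update rule described above, a per-answer change scaled by question difficulty, can be sketched in a few lines. All names and constants below are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of the described rule: each answered question raises
# or lowers the toy vehicle's speed by an amount scaled by the question's
# difficulty level.  step and max_speed are assumed constants.

def update_speed(speed, correct, difficulty, step=1.0, max_speed=10.0):
    """Return the new speed after one answered question.

    difficulty -- integer weight: harder questions move the speed more.
    """
    delta = step * difficulty
    speed = speed + delta if correct else speed - delta
    # Clamp to the physically meaningful range of the toy vehicle.
    return max(0.0, min(max_speed, speed))

# A student answers an easy question right, then a hard one wrong.
s = update_speed(5.0, correct=True, difficulty=1)   # 6.0
s = update_speed(s, correct=False, difficulty=3)    # 3.0
print(s)
```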
Abstract: Urban problems are problems of organized complexity; thus, many models and scientific methods for resolving urban problems have failed. This study proposes a fuzzy-system-driven approach to classifying and solving urban problems. The study mainly investigated the selection of the inputs and outputs of urban systems for the classification of urban problems. In this research, five categories of urban problems were recognized with respect to the fuzzy system approach: control, polytely, optimizing, open, and decision-making problems. Grounded Theory techniques were then applied to analyze the data and develop a new solving method for each category. The findings indicate that fuzzy system methods are powerful processes and analytic tools for helping planners resolve complex urban problems. These tools can succeed where others have failed because they incorporate and address uncertainty and risk, complexity, and systems interacting with other systems.
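The mechanics of fuzzy classification can be illustrated generically: membership functions map a crisp indicator to degrees of membership in overlapping classes. The indicator, category names and breakpoints below are illustrative assumptions only; the paper's actual input/output selection is far richer.

```python
# Generic sketch of fuzzy classification with triangular membership
# functions.  "complexity" on a 0-10 scale and the three class ranges
# are made-up stand-ins, not the study's real inputs or categories.

def triangular(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking (1.0) at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def classify(complexity):
    """Degrees of membership in three illustrative problem classes."""
    return {
        "control":    triangular(complexity, -1, 0, 5),
        "optimizing": triangular(complexity, 2, 5, 8),
        "open":       triangular(complexity, 5, 10, 11),
    }

degrees = classify(6.0)
print(max(degrees, key=degrees.get))  # 'optimizing'
```

The point of the fuzzy formulation is visible even in this toy: a problem with complexity 6.0 belongs partially to both "optimizing" and "open", rather than being forced into a single crisp category.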
Abstract: Video streaming over lossy IP networks is a very important issue, owing to the heterogeneous structure of networks. The infrastructure of the Internet exhibits variable bandwidths, delays, congestion and time-varying packet losses. Because of these variable attributes of the Internet, video streaming applications should not only have good end-to-end transport performance but also robust rate control and, furthermore, a multipath rate allocation mechanism. So, to provide video streaming service quality, further components such as Bandwidth Estimation and an Adaptive Rate Controller should be taken into consideration. This paper gives an overview of the video streaming concept and of bandwidth estimation tools
and then introduces special architectures for bandwidth-adaptive video streaming. A bandwidth estimation algorithm (pathChirp), Optimized Rate Controllers, and a Multipath Rate Allocation Algorithm are considered as an all-in-one solution to the video streaming problem.
This solution is directed and optimized by a decision center which is
designed for obtaining the maximum quality at the receiving side.
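The interaction between a bandwidth estimator and an adaptive rate controller can be sketched abstractly: the sender tracks the estimate with a safety margin and bounded step changes. This is an illustrative control loop, not the paper's optimized controllers; the margin and step constants are assumptions.

```python
# Illustrative sketch of a bandwidth-adaptive rate controller: the sending
# rate tracks an external bandwidth estimate (as a pathChirp-like probe
# would supply) with a safety margin and a bounded per-interval change.
# All constants are assumptions for illustration only.

def adapt_rate(current_rate, estimated_bw, margin=0.85, max_step=0.25):
    """Move the sending rate toward margin * estimated bandwidth,
    changing by at most max_step (25%) per control interval."""
    target = margin * estimated_bw
    lo = current_rate * (1.0 - max_step)
    hi = current_rate * (1.0 + max_step)
    return max(lo, min(hi, target))

# Bandwidth estimates (kbit/s) arriving at the decision center.
rate = 1000.0
for bw in [1200.0, 800.0, 800.0, 2000.0]:
    rate = adapt_rate(rate, bw)
print(round(rate))  # 850
```

The margin keeps the stream below the estimated capacity (absorbing estimation error), while the bounded step smooths the encoder's rate changes, which matters for perceived video quality at the receiver.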
Abstract: Advances in processor architecture, such as multi-core, increase the size and complexity of parallel computer systems. With multi-core architectures there are different parallel languages that can be used to write parallel programs. One of these languages is OpenMP, which is embedded in C/C++ or FORTRAN. Because of this new architecture and its complexity, it is very important to evaluate the performance of OpenMP constructs, kernels, and application programs on multi-core systems. Performance analysis is the activity of collecting information about the execution characteristics of a program. Performance tools consist of at least three interfacing
software layers, including instrumentation, measurement, and
analysis. The instrumentation layer defines the measured
performance events. The measurement layer determines what
performance event is actually captured and how it is measured by the
tool. The analysis layer processes the performance data and
summarizes it into a form that can be displayed in performance tools.
In this paper, a number of OpenMP performance tools are surveyed, explaining how each is used to collect, analyse, and display performance data.
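The three layers named above can be illustrated by analogy for a plain function: an instrumentation wrapper defines the event, a shared store captures measurements, and an analysis step summarizes them. Real OpenMP tools hook compiler- or runtime-level events; this Python sketch is only a conceptual analogy with assumed names throughout.

```python
# Minimal analogy for the three layers: instrumentation, measurement,
# analysis.  Real OpenMP tools instrument parallel regions and runtime
# events, not Python functions.
import time
from functools import wraps

measurements = []          # the measurement layer's event store

def instrument(fn):
    """Instrumentation layer: define which performance event is measured
    (here, the wall-clock duration of each call)."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        # Measurement layer: capture the event as (name, duration).
        measurements.append((fn.__name__, time.perf_counter() - start))
        return result
    return wrapper

def analyse():
    """Analysis layer: summarize raw events into a displayable form:
    {function name: (call count, total time)}."""
    summary = {}
    for name, dt in measurements:
        calls, total = summary.get(name, (0, 0.0))
        summary[name] = (calls + 1, total + dt)
    return summary

@instrument
def work(n):
    return sum(i * i for i in range(n))

work(10_000)
work(10_000)
print(analyse()["work"][0])  # 2 calls recorded
```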
Abstract: This paper presents the implementation of a QoS policy-based system that utilizes rules in an Access Control List (ACL) on a Layer 3 (L3) switch. Also presented are the architecture of that implementation, the tools used, and the results gathered. The system architecture has the ability to control ACL rules installed inside an external L3 switch. The ACL rules instruct how access control is executed in order to handle all traffic passing through that particular switch. The main advantage of this approach is that a single point of failure can be avoided when there are any changes to ACL rules inside L3 switches. Another advantage is that the agent can apply ACL rules automatically, based on changes occurring in the policy database, without configuring them one by one. Furthermore, when the QoS policy-based system is implemented in a distributed environment, the monitoring process can be synchronized easily thanks to the automated process run by the agent over external policy devices.
Abstract: In the present paper, the extreme shear stresses and the corresponding planes are established using freely available computer tools such as Gnuplot, Sage, R, Python and Octave. To promote these freely available tools, their strong symbolic and graphical abilities are illustrated. The nature of the stationary points obtained by the Method of Lagrangian Multipliers can be determined using freely available symbolic tools such as Sage, while the character of the stationary points is explained most easily using freely available graphical tools such as Gnuplot, Sage, R, Python and Octave. The presented figures improve the understanding of the problem and of the obtained solutions for the majority of students of civil or mechanical engineering.
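The classical result that the extreme shear stress equals (sigma_1 - sigma_3)/2, on planes at 45 degrees to the extreme principal directions, can be cross-checked numerically. The sketch below is a coarse grid search over plane normals for an assumed principal stress state, a student-level numerical complement to the paper's analytical Lagrangian-multiplier treatment.

```python
# Numerical cross-check: the maximum shear traction over all planes
# equals (sigma_1 - sigma_3) / 2.  The principal stresses below are an
# assumed example state, not data from the paper.
import math

def shear_on_plane(principal, theta, phi):
    """Shear traction magnitude on the plane with unit normal n(theta, phi),
    for a stress state given by its principal values (tensor is diagonal
    in the principal frame)."""
    n = (math.sin(theta) * math.cos(phi),
         math.sin(theta) * math.sin(phi),
         math.cos(theta))
    t = tuple(s * ni for s, ni in zip(principal, n))   # traction vector
    normal = sum(ti * ni for ti, ni in zip(t, n))      # normal component
    shear2 = sum(ti * ti for ti in t) - normal * normal
    return math.sqrt(max(shear2, 0.0))

principal = (100.0, 40.0, -20.0)   # sigma_1 > sigma_2 > sigma_3 (MPa, assumed)
best = max(shear_on_plane(principal, math.pi * i / 200, math.pi * j / 200)
           for i in range(201) for j in range(401))
print(round(best, 1))   # close to (100 - (-20)) / 2 = 60.0
```

The maximizing grid point lies at theta = pi/4, phi = 0, i.e. the plane whose normal bisects the sigma_1 and sigma_3 axes, which is exactly the plane the Lagrangian-multiplier analysis identifies.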
Abstract: In this paper, different nonlinear dynamics analysis techniques are employed to unveil the rich nonlinear phenomena of an electromagnetic system. In particular, bifurcation diagrams, time responses, phase portraits, Poincaré maps, power spectrum analysis, and the construction of basins of attraction are all powerful and effective tools for nonlinear dynamics problems. We also employ the method of Lyapunov exponents to show the occurrence of chaotic motion and to verify the numerical simulation results. Finally, two cases are presented in which a chaotic electromagnetic system is effectively controlled by a reference signal or synchronized to another nonlinear electromagnetic system.
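The Lyapunov-exponent test for chaos can be sketched on a simple stand-in system. The logistic map below is only an illustrative substitute for the paper's electromagnetic system; the principle, a positive largest exponent indicating exponential divergence of nearby orbits, is the same.

```python
# Sketch of estimating a largest Lyapunov exponent, using the logistic
# map f(x) = r x (1 - x) as a stand-in system.  A positive exponent
# indicates chaotic motion.
import math

def lyapunov_logistic(r, x0=0.4, n=100_000, burn=1_000):
    """Average log |f'(x)| along an orbit of the logistic map."""
    x = x0
    for _ in range(burn):                    # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1.0 - 2.0 * x)))   # log |f'(x)|
        x = r * x * (1.0 - x)
    return total / n

lam = lyapunov_logistic(4.0)
print(lam > 0.0)   # positive exponent -> chaotic regime
```

For r = 4 the estimate converges to about ln 2 (approximately 0.693), the known analytic value, which is the kind of cross-verification of simulation results the abstract describes.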
Abstract: The primary aim of e-government applications is fast citizen service and the efficient accomplishment of governmental functions. This paper discusses the needs and role of knowledge management in e-government development. The paper focuses on analyzing the advantages of using knowledge management with existing IT technologies to maximize the efficiency of government functions. The proposed new approach to providing government services is based on using knowledge management as a part of the e-government system.
Abstract: The characterization and modeling of the dynamic
behavior of many built-up structures under vibration conditions is still
a subject of current research. The present study emphasizes the
theoretical investigation of slip damping in layered and jointed welded cantilever structures using a finite element approach. The application of the finite element method to damping analysis is relatively recent; as such, some problems, particularly slip damping analysis, have not received enough attention. To validate the finite element model developed, experiments have been conducted on a number of mild steel specimens under different initial conditions of vibration. The model affirms that the damping capacity of such structures is influenced by a number of vital parameters, such as the pressure distribution, the kinematic coefficient of friction and micro-slip at the interfaces, the amplitude and frequency of vibration, and the length and thickness of the specimen. The model can be utilized effectively in the design of machine tools, automobiles, aerodynamic and space structures, frames and machine members to enhance their damping capacity.
Abstract: Recently, the Spherical Motion Models (SMM-s) were introduced [1]. These new models were developed for 3D local landmark-based Autonomous Navigation (AN). This paper presents new arguments and experimental results that support the characteristics of the SMM-s. The accuracy and robustness in performing a specific task are the main concerns of the new investigations. To analyze the performance of the SMM-s, the most powerful tools of estimation theory, the extended Kalman filter (EKF) and the unscented Kalman filter (UKF), which give the best estimations in noisy environments, have been employed. Monte Carlo validation runs have been employed as well to test the stability and robustness of the models.
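The predict/update cycle underlying both the EKF and UKF can be shown in its simplest, scalar linear form. This sketch is only the linear special case with an assumed constant-position model and assumed noise variances; the paper applies the extended and unscented variants to nonlinear spherical motion models.

```python
# Minimal scalar Kalman filter sketch (linear special case).  The state is
# a single position; q and r are assumed process and measurement noise
# variances, not values from the paper.
import random

def kalman_step(x, p, z, q=0.01, r=0.5):
    """One predict/update cycle for a constant-position model.
    x, p: state estimate and its variance; z: noisy measurement."""
    p = p + q                 # predict: uncertainty grows with process noise
    k = p / (p + r)           # Kalman gain: trust in the new measurement
    x = x + k * (z - x)       # update the estimate toward z
    p = (1.0 - k) * p         # update: uncertainty shrinks
    return x, p

random.seed(0)
true_pos = 3.0
x, p = 0.0, 1.0               # poor initial guess, high uncertainty
for _ in range(200):
    z = true_pos + random.gauss(0.0, 0.5)   # noisy landmark measurement
    x, p = kalman_step(x, p, z)
print(round(x, 1))   # converges near the true position 3.0
```

The EKF replaces the scalar model with a linearization of a nonlinear motion model at each step, and the UKF propagates a small set of sigma points instead; both keep this same predict/update structure.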