Abstract: The Common Platform for Automated Programming
(CPAP) is defined in detail. Two versions of CPAP are described: a
cloud-based version (including a set of components for classic
programming and a set of components for combined programming)
and a Knowledge Based Automated Software Engineering (KBASE)
based version (including a set of components for automated
programming and a set of components for ontology programming).
Four KBASE products (Module for Automated Programming of
Robots, Intelligent Product Manual, Intelligent Document Display,
and Intelligent Form Generator) are analyzed, and CPAP's
contributions to automated programming are presented.
Abstract: The paper focuses on the benefits of business process
modeling. Although this discipline has been developing for many
years, there is still a need to create new opportunities to meet
ever-increasing user needs. Because one of these needs is the
conversion of business process models from one standard to another,
the authors have developed a converter between the BPMN and EPC
standards that uses workflow patterns as an intermediate tool. Today
there are many systems for business process modeling, and the output
formats are almost as varied as the systems themselves. This diversity
further hampers the conversion of models. The presented study
discusses the problems caused by differences in the output formats of
various modeling environments.
Abstract: Opportunistic Routing (OR) increases transmission
reliability and network throughput. Traditional routing protocols
preselect one or more nodes before transmission starts and use a
predetermined neighbor to forward a packet at each hop.
Opportunistic routing overcomes the drawback of unreliable wireless
transmission by broadcasting, so that a single transmission can be
overheard by multiple neighbors. The first cooperation-optimal
protocol for Multirate OR (COMO) is used to achieve social
efficiency and prevent selfish behavior of the nodes. The novel
link-correlation-aware OR improves performance by exploiting
diverse forward links with low correlation. Context-aware Adaptive
OR (CAOR) uses an active suppression mechanism to reduce packet
duplication. Context-aware OR (COR) can provide efficient routing
in mobile networks. Cooperative Opportunistic Routing in Mobile
Ad hoc Networks (CORMAN) tackles the problem of opportunistic
data transfer. Compared with the other protocols, COMO performs
best, as it achieves social efficiency and prevents selfish behavior of
the nodes.
Abstract: This paper presents the development of a mobile
application for students at the Faculty of Information Technology,
Rangsit University (RSU), Thailand. RSU has upgraded its enrollment
process by improving its information systems. Students can easily
download the RSU APP in order to access essential RSU
information; the purpose of the mobile application is to let students
access the system regardless of time and place. The objectives of this
paper are: 1. to develop an application on the iOS platform for
students at the Faculty of Information Technology, Rangsit
University, Thailand; and 2. to obtain the students' perception of the
new mobile app. The target group is students from the freshman year
to the senior year of the Faculty of Information Technology, Rangsit
University. The new mobile application, called RSU APP, is
developed by the Department of Information Technology, Rangsit
University. It contains useful features and various functionalities,
particularly those that support students. The core contents of the app
consist of RSU's announcements, calendar, events, activities, and
e-books. The app is developed on the iOS platform. User satisfaction
is analyzed from interview data from 81 interviewees as well as from
a Google Form questionnaire involving 122 respondents. The results
show that users are satisfied with the application, with an overall
satisfaction level of 4.67 (SD 0.52). The score for the question of
whether users can learn and use the application quickly is high, at
4.82 (SD 0.71). The lowest satisfaction rating concerns the app's
forms and app lists, with a satisfaction level of 4.01 (SD 0.45).
Abstract: This study aims to establish a function point process
based on stochastic distributions. To demonstrate the effectiveness of
the approach, we present a case study that applies the suggested
method to automotive electrical and electronics system software
development using Monte Carlo simulation. The results of this paper
are expected to serve as guidance for establishing a function point
process in organizations and as a tool to help project managers make
correct decisions.
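The Monte Carlo step can be sketched as follows: sample a function point count from an assumed distribution, convert it to effort with an assumed productivity rate, and read percentiles off the simulated outcomes. All distributions and parameter values below are illustrative placeholders, not the paper's calibrated inputs.

```python
import random

def simulate_effort(n_trials=10_000, seed=42):
    """Monte Carlo sketch: sample a function point (FP) count from a
    triangular distribution, convert to effort hours via an assumed
    productivity range, and summarize the outcome distribution."""
    rng = random.Random(seed)
    efforts = []
    for _ in range(n_trials):
        # Optimistic / pessimistic / most-likely FP counts (assumed values).
        fp = rng.triangular(300, 600, 420)
        hours_per_fp = rng.uniform(6.0, 10.0)  # assumed productivity range
        efforts.append(fp * hours_per_fp)
    efforts.sort()
    return {
        "mean": sum(efforts) / n_trials,
        "p10": efforts[int(0.10 * n_trials)],  # optimistic estimate
        "p90": efforts[int(0.90 * n_trials)],  # conservative estimate
    }

stats = simulate_effort()
```

The p10/p90 spread, rather than a single point estimate, is what gives a project manager a decision range.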
Abstract: Since advances in digital imaging technologies have led to the
development of high-quality digital devices, many illegal copies of
copyrighted video content circulate on the Internet, and unauthorized
editing occurs frequently. We therefore propose an editing prevention
technique for high-quality (HQ) video that can prevent such illegally
edited copies from spreading. The proposed technique applies spatial and
temporal gradient methods to improve fidelity and detection performance.
The scheme also duplicates the embedded signal temporally to alleviate the
signal reduction caused by geometric and signal-processing distortions.
Experimental results show that the proposed scheme outperforms previously
proposed schemes and has high fidelity. The proposed scheme can be used
in unauthorized-access prevention for visual communication or in
traitor-tracking applications that need a fast detection process to prevent
illegally edited video content from spreading.
Abstract: Prior literature on adaptive and personalized learning
sequences in e-learning has proposed and implemented various
mechanisms to improve the learning process, such as individualization
and personalization, but these are complex to implement because of
expensive algorithms and the need for extensive prior data. The main
objective of personalizing a learning sequence is to maximize learning
by dynamically selecting the closest teaching operation in order to
achieve the learner's competency. In this paper, a technique is
proposed and tested that performs individualization and
personalization using a modified reversed roulette wheel selection
algorithm that runs in O(n) time. The technique is simpler to
implement and algorithmically less expensive than other evolutionary
algorithms, since it collects dynamic real-time performance metrics
such as examinations, reviews, and study activity to form a single
numerical RWSA fitness value. Results show that the implemented
system is capable of recommending new learning sequences that
lessen study time based on a student's prior knowledge and real
performance metrics.
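The baseline mechanism being modified can be sketched in a few lines. This is standard roulette-wheel selection, which runs in O(n); the paper's "reversed" modification is not specified here, so only the classic form is shown.

```python
import random

def roulette_select(fitness, rng):
    """Standard roulette-wheel selection in O(n): one pass to sum the
    fitness values, one pass to locate the sampled slot. Individuals
    are picked with probability proportional to their fitness."""
    total = sum(fitness)
    pick = rng.uniform(0, total)
    acc = 0.0
    for i, f in enumerate(fitness):
        acc += f
        if pick <= acc:
            return i
    return len(fitness) - 1  # guard against floating-point round-off

# Higher-fitness items are selected proportionally more often.
rng = random.Random(1)
fitness = [0.1, 0.2, 0.7]
counts = [0, 0, 0]
for _ in range(10_000):
    counts[roulette_select(fitness, rng)] += 1
```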
Abstract: Recently, job recommender systems have gained
much attention in industry, since they address the problem of
information overload on recruiting websites. We therefore propose an
Extended Personalized Job System that can provide appropriate jobs
for job seekers and recommend suitable information to them using
data mining techniques and a dynamic user profile. Companies, in
turn, can interact with the system to publish and update job
information. The system supports multiple platforms, including a web
application and an Android mobile application. In this paper, user
profiles, implicit user actions, user feedback, and clustering
techniques from the WEKA libraries were applied and implemented.
In addition, open-source tools such as the Yii Web Application
Framework, the Bootstrap front-end framework, and Android mobile
technology were used.
Abstract: The Internet of Things (IoT) has been applied in
industries for different purposes. Sensing Enterprise (SE) is an
attribute of an enterprise or a network that allows it to react to
business stimuli originating on the Internet. These fields have
recently come into focus for enterprises, and there is some evidence
of their use and implications in supply chain management, which
makes them an interesting area to work on. This paper presents a
review of, and proposals for, IoT applications in supply chain
management.
Abstract: Social networking sites such as Twitter and Facebook
attract over 500 million users across the world; for those users, social
and even practical life has become intertwined with these platforms,
and their interaction with social networking has changed their lives.
Accordingly, social networking sites have become among the main
channels responsible for the vast dissemination of different kinds of
information during real-time events. This popularity has led to
various problems, including the possibility of exposing users to
incorrect information through fake accounts, which results in the
spread of malicious content during live events. This can cause huge
real-world damage to society in general, including citizens, business
entities, and others. In this paper, we present a classification method
for detecting fake accounts on Twitter. The study determines a
minimized set of the main factors that influence the detection of fake
accounts on Twitter, and these factors are then applied using different
classification techniques. The results of these techniques are
compared, and the most accurate algorithm is selected according to
the accuracy of its results. The study has been compared with recent
research in the same area, and this comparison demonstrates the
accuracy of the proposed approach. We claim that this study can be
applied continuously on the Twitter social network to automatically
detect fake accounts; moreover, the study can be applied to other
social network sites such as Facebook with minor changes, according
to the nature of the social network, as discussed in this paper.
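The selection step (train several classifiers, keep the most accurate) can be sketched with toy single-rule classifiers. The feature names and records below are invented for illustration and are not the paper's factor set or data.

```python
def accuracy(clf, rows):
    """Fraction of rows where the classifier's prediction matches the label."""
    return sum(clf(r) == r["fake"] for r in rows) / len(rows)

# Hand-made toy account records (illustrative features, not real data).
rows = [
    {"followers": 2,   "posts": 0,  "default_pic": True,  "fake": True},
    {"followers": 5,   "posts": 1,  "default_pic": True,  "fake": True},
    {"followers": 300, "posts": 80, "default_pic": False, "fake": False},
    {"followers": 150, "posts": 40, "default_pic": False, "fake": False},
    {"followers": 8,   "posts": 90, "default_pic": False, "fake": False},
    {"followers": 3,   "posts": 2,  "default_pic": True,  "fake": True},
]

# Two candidate one-rule "classifiers"; pick the one with best accuracy.
classifiers = {
    "few_followers": lambda r: r["followers"] < 10,
    "default_pic":   lambda r: r["default_pic"],
}
best = max(classifiers, key=lambda name: accuracy(classifiers[name], rows))
```

A real pipeline would compare trained models (e.g. decision trees, SVMs) under cross-validation, but the selection logic is the same.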
Abstract: This article discusses the migration of an RDB to XML
documents (schema and data) based on metadata and semantic
enrichment, which takes the RDB in its flattened form and enriches it
with the object concept. The integration and exploitation of the
object concept in XML uses a syntax that allows the conformity of
the XML document to be verified at creation time. The information
extracted from the RDB is analyzed and filtered in order to be
adjusted to the structure of the XML files and the associated object
model; the elements implemented in the XML document through an
SQL query are built dynamically. A prototype was implemented to
perform automatic migration, demonstrating the effectiveness of this
approach.
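The core migration step (rows queried from the RDB becoming dynamically built XML elements) can be sketched with the standard library. The table and column names are illustrative, not taken from the paper's prototype.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Build a toy in-memory RDB, then migrate one table to an XML document.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")
conn.executemany("INSERT INTO employee VALUES (?, ?, ?)",
                 [(1, "Ana", "IT"), (2, "Omar", "HR")])

# Each SQL result row becomes an <employee> element built dynamically.
root = ET.Element("employees")
for row_id, name, dept in conn.execute("SELECT id, name, dept FROM employee"):
    emp = ET.SubElement(root, "employee", id=str(row_id))
    ET.SubElement(emp, "name").text = name
    ET.SubElement(emp, "dept").text = dept

xml_text = ET.tostring(root, encoding="unicode")
```

In the paper's setting, a schema (e.g. XSD) derived from the RDB metadata would additionally let the document's conformity be checked at creation time.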
Abstract: Small, low-power sensors with sensing, signal
processing, and wireless communication capabilities are suitable for
wireless sensor networks. Due to limited resources and battery
constraints, the complex routing algorithms used for ad-hoc networks
cannot be employed in sensor networks. In this paper, we propose
node-disjoint multi-path hexagon-based routing algorithms for
wireless sensor networks. We present the details of the algorithm and
compare it with other work. Simulation results show that the
proposed scheme achieves better performance in terms of efficiency
and message delivery ratio.
Abstract: A reconfigurable Wilkinson power divider is
proposed in this paper. In existing designs, only a limited bandwidth
is available at the output ports; in the proposed Wilkinson power
divider, different frequency bands are obtained using PIN diodes. By
tuning the PIN diodes, different frequencies are achieved. The
proposed design reduces the size of the power divider for the
operating frequency and increases the fractional bandwidth.
Abstract: Web usage mining is the application of data mining
techniques to find usage patterns in web log data, so as to extract the
required patterns and serve the requirements of web-based
applications. Users' experience on the Internet may be improved by
minimizing web access latency. This can be done by predicting the
next page a user will visit and prefetching and caching it. Therefore,
to enhance the quality of web services, it is necessary to research user
web navigation behavior, which is analyzed by modeling web
navigation history. We propose a technique that clusters user sessions
based on the K-medoids algorithm.
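K-medoids differs from K-means in that each cluster center is an actual data point, so any session distance measure can be used. A minimal sketch under an assumed toy representation (sessions reduced to single numbers; a real system would use a session-similarity distance):

```python
import random

def k_medoids(points, k, dist, n_iter=100, seed=0):
    """Minimal K-medoids: alternate between assigning points to the
    nearest medoid and re-picking each cluster's medoid as the member
    with the lowest total distance to the rest of its cluster."""
    rng = random.Random(seed)
    medoids = rng.sample(range(len(points)), k)
    for _ in range(n_iter):
        clusters = {m: [] for m in medoids}
        for i, p in enumerate(points):
            nearest = min(medoids, key=lambda m: dist(p, points[m]))
            clusters[nearest].append(i)
        new_medoids = [
            min(members, key=lambda c: sum(dist(points[c], points[j]) for j in members))
            for members in clusters.values()
        ]
        if sorted(new_medoids) == sorted(medoids):
            break  # converged
        medoids = new_medoids
    return medoids, clusters

# Toy 1-D "sessions" forming two obvious groups.
sessions = [1, 2, 3, 50, 51, 52]
medoids, clusters = k_medoids(sessions, 2, dist=lambda a, b: abs(a - b))
```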
Abstract: Background modeling and subtraction in video
analysis has been widely used as an effective method for moving
object detection in many computer vision applications. Recently, a
large number of approaches have been developed to tackle different
types of challenges in this field. However, dynamic backgrounds and
illumination variations are the most frequently occurring problems in
practice. This paper presents a two-layer model based on the
codebook algorithm, incorporating the local binary pattern (LBP)
texture measure and targeted at handling dynamic background and
illumination variation problems. More specifically, the first layer is a
block-based codebook combining an LBP histogram with the mean
value of each RGB color channel. Because LBP features are invariant
to monotonic gray-scale changes, this layer produces block-wise
detection results with considerable tolerance of illumination
variations. A pixel-based codebook is then employed to refine the
output of the first layer and further eliminate false positives. As a
result, the proposed approach greatly improves accuracy under
dynamic background and illumination changes. Experimental results
on several popular background subtraction datasets demonstrate very
competitive performance compared with previous models.
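The illumination-invariance property that the first layer relies on is easy to see in code: an LBP code depends only on whether each neighbor is brighter or darker than the center, so any monotonic gray-scale change leaves it unchanged. A minimal 3x3 sketch:

```python
def lbp_code(patch):
    """LBP code of the center pixel of a 3x3 patch: each of the 8
    neighbors contributes one bit, set when the neighbor is >= the
    center. Neighbor order (clockwise from top-left) is one common
    convention; other orders just permute the bits."""
    center = patch[1][1]
    neighbors = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
                 patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    code = 0
    for bit, n in enumerate(neighbors):
        if n >= center:
            code |= 1 << bit
    return code

patch = [[10, 50, 10],
         [50, 30, 50],
         [10, 50, 10]]
# A monotonic gray-scale change (here 2*v + 5) preserves all the
# brighter/darker comparisons, hence the same LBP code.
brighter = [[2 * v + 5 for v in row] for row in patch]
```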
Abstract: The purpose of this study is to analyze the relationship
between trust and people's social capital using Social Network
Analysis. The study focuses on two aspects of social capital: bonding,
homophilous social capital (BoSC) and bridging, heterophilous social
capital (BrSC). These two aspects diverge from each other according
to social theories. The other concept of the study is trust (Tr), namely
interpersonal trust: the willingness to ascribe good intentions to, and
have confidence in, the words and actions of other people. The
sample group of 61 people was selected from a private firm in the
defense industry. The relation between BoSC/BrSC and Tr is
examined using Social Network Analysis (SNA) and statistical
analysis of a Likert-type questionnaire. The results show that the
Cronbach's alpha value is 0.756 and that the social capital values
(BoSC/BrSC) are not correlated with people's Tr values.
Abstract: Currently, few user-friendly Weigh-in-Motion (WIM)
data analysis software packages are available that can produce traffic
input data for the recently developed AASHTOWare pavement
Mechanistic-Empirical (ME) design software, and these packages
have only rudimentary Quality Control (QC) processes; therefore,
they cannot properly deal with erroneous WIM data. As pavement
performance is highly sensitive to the quality of WIM data, it is
highly recommended to apply a more refined QC process to raw
WIM data to obtain good results. This study develops user-friendly
software that can produce traffic input for the ME design software.
The software takes raw data (class and weight data) collected from a
WIM station and processes it with a sophisticated QC procedure.
Traffic data such as traffic volume, traffic distribution, and axle load
spectra can be obtained from this software and used directly in the
ME design software.
Abstract: In this paper, an attempt has been made to design a
robotic library using an intelligent system. The robot is built on an
ARM microprocessor and a motor driver circuit, has 5 degrees of
freedom, and uses Wi-Fi- and GPS-based communication protocols.
The authenticity of the library books is controlled by RFID. The
proposed robotic library system is implemented with an embedded
system based on ARM. In this book-issuance system, the authentic
review reports of previous readers are taken into consideration for
recommending suitable books to deserving new users, and the
issuance of books or periodicals is based on the users' decisions. We
conjecture that the Wi-Fi-based robotic library management system
would allow fast book-issuance transactions and also produce quality
readers.
Abstract: In a wireless sensor network, sensor nodes periodically
transmit sensed data to the sink node over multi-hop communication.
This heavy traffic induces congestion at the nodes located one hop
from the sink node: the packet transmission and reception rates of
these nodes are very high compared with other sensor nodes in the
network, so their energy consumption is very high. This effect is
known as the "funneling effect". A tree-based data aggregation
technique (TBDA) is used to reduce the energy consumption of these
nodes. The results show a considerable decrease in the number of
packet transmissions to the sink node. The proposed scheme, TBDA,
avoids the funneling effect and extends the lifetime of the wireless
sensor network. The average-case time complexity for inserting a
node into the tree is O(n log n), and the worst-case time complexity
is O(n²).
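The saving that tree-based aggregation buys can be counted on a toy topology: without aggregation every reading is forwarded hop by hop (costing depth(node) transmissions), whereas with in-network aggregation each node sends exactly one merged packet to its parent. The tree below is an invented example, not the paper's topology.

```python
def hops_without_aggregation(parent, root):
    """Each node's reading is forwarded hop-by-hop to the sink,
    costing depth(node) transmissions per reading."""
    def depth(n):
        return 0 if n == root else 1 + depth(parent[n])
    return sum(depth(n) for n in parent if n != root)

def hops_with_aggregation(parent, root):
    """With tree-based aggregation, each non-sink node merges its
    children's data with its own and transmits exactly once."""
    return sum(1 for n in parent if n != root)

# Toy tree as parent pointers: sink <- a <- b <- c, and sink <- d.
parent = {"sink": None, "a": "sink", "b": "a", "c": "b", "d": "sink"}
```

The gap widens with depth, which is exactly why the nodes one hop from the sink (the funnel) benefit the most.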
Abstract: Digital images are widely used in computer
applications. To store or transmit the uncompressed images
requires considerable storage capacity and transmission bandwidth.
Image compression is a means to perform transmission or storage of
visual data in the most economical way. This paper explains how
images can be encoded for transmission over a multiplexed
time-frequency domain channel. Multiplexing involves packing
signals together whose representations are compact in the working
domain. In order to optimize transmission resources each 4 × 4
pixel block of the image is transformed by a suitable polynomial
approximation, into a minimal number of coefficients. Using fewer
than 4 × 4 coefficients per block saves a significant amount of
transmitted information, but some information is lost. Different
approximations for the image transformation have been evaluated:
polynomial representation (Vandermonde matrix), least squares with
gradient descent, 1-D Chebyshev polynomials, 2-D Chebyshev
polynomials, and singular value decomposition (SVD). Results have
been compared in terms of nominal compression rate (NCR),
compression ratio (CR) and peak signal-to-noise ratio (PSNR)
in order to minimize the error function defined as the difference
between the original pixel gray levels and the approximated
polynomial output. The polynomial coefficients are then encoded and
used to generate chirps at a target rate of about two chirps per
4 × 4 pixel block, and finally submitted to a multiplexing operation
in the time-frequency domain.
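The block-approximation idea can be sketched with the simplest case: a least-squares plane fit z ≈ a + b·x + c·y over a 4 × 4 block, which compresses 16 pixel values into 3 coefficients. This is an illustrative stand-in for the paper's higher-order polynomial and Chebyshev bases; on a regular grid the centered x and y coordinates are orthogonal, so the coefficients have closed forms.

```python
def fit_plane(block):
    """Least-squares fit of z ~ a + b*x + c*y over a 4x4 pixel block.
    On a regular grid the centered coordinates are orthogonal, so each
    coefficient is a simple projection."""
    n = 4
    xs = [x - 1.5 for x in range(n)]               # centered coordinates
    a = sum(sum(row) for row in block) / (n * n)   # mean gray level
    sxx = sum(x * x for x in xs) * n               # sum of x^2 over the grid
    b = sum(xs[x] * block[y][x] for y in range(n) for x in range(n)) / sxx
    c = sum(xs[y] * block[y][x] for y in range(n) for x in range(n)) / sxx
    return a, b, c

def eval_plane(coef, n=4):
    """Reconstruct the block from the 3 coefficients."""
    a, b, c = coef
    return [[a + b * (x - 1.5) + c * (y - 1.5) for x in range(n)] for y in range(n)]

# A perfectly planar block is recovered exactly from 3 coefficients
# instead of 16 pixel values; real blocks incur approximation error.
block = [[10 + 2 * x + 3 * y for x in range(4)] for y in range(4)]
coef = fit_plane(block)
recon = eval_plane(coef)
```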