Abstract: Gradual patterns have been studied for many years as
they convey valuable information. They have been integrated into
many expert systems and rule-based systems, for instance to reason
on knowledge such as “the greater the number of turns, the greater
the number of car crashes”. In many cases, this knowledge has been
treated as a rule: “the greater the number of turns → the greater
the number of car crashes”. Historically, work has thus focused
on the representation of such rules, studying how implication,
especially fuzzy implication, could be defined. These rules were
written by experts who were in charge of describing the systems
they worked on so that those systems could operate automatically.
More recently, approaches have been proposed for mining databases
to discover such knowledge automatically. Several approaches
have been studied, the main scientific questions being how to
determine what a relevant gradual pattern is, and how to discover
such patterns as efficiently as possible (in terms of both memory
and CPU usage). However, in some cases end-users are not interested
in raw, low-level knowledge, but rather in trends. Moreover, it may
be the case that no relevant pattern can be discovered at a low level
of granularity (e.g. city), whereas some can be discovered at a higher
level (e.g. county). In this paper, we therefore extend gradual pattern
approaches to consider multi-level gradual patterns. For
this purpose, we consider two aggregation policies, namely
horizontal and vertical.
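To make the notion concrete, here is a minimal Python sketch (names and data are illustrative, not from the paper) of one common way the literature measures a gradual pattern: the fraction of object pairs that can be ordered consistently with every attribute's direction.

```python
from itertools import combinations

def respects(a, b, attrs):
    """True when going from object a to object b follows every attribute's
    direction: '+' means strictly increasing, '-' strictly decreasing."""
    return all((b[attr] > a[attr]) if d == '+' else (b[attr] < a[attr])
               for attr, d in attrs)

def gradual_support(rows, attrs):
    """Fraction of object pairs orderable consistently with the pattern."""
    pairs = list(combinations(rows, 2))
    if not pairs:
        return 0.0
    ok = sum(respects(a, b, attrs) or respects(b, a, attrs) for a, b in pairs)
    return ok / len(pairs)

roads = [  # hypothetical records
    {"turns": 2, "crashes": 5},
    {"turns": 4, "crashes": 9},
    {"turns": 7, "crashes": 14},
    {"turns": 9, "crashes": 11},
]
# "the greater the number of turns, the greater the number of crashes"
support = gradual_support(roads, [("turns", "+"), ("crashes", "+")])
```

Here 5 of the 6 record pairs respect the pattern, giving a support of about 0.83; a mining algorithm would keep the pattern if this exceeds a user-chosen threshold.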
Abstract: The present study was done primarily to address two major research gaps: first, to develop an empirical measure of life meaningfulness for substance users and, second, to determine the psychosocial determinants of life meaningfulness among substance users. The study comprises two phases: the first dealt with the development of the Life Meaningfulness Scale, and the second examined the relationship between life meaningfulness and social support, abstinence self-efficacy, and depression. Both qualitative and quantitative approaches were used for framing items. A Principal Component Analysis yielded three components: Overall Goal Directedness, Striving for a Healthy Lifestyle, and Concern for Loved Ones, which collectively accounted for 42.06% of the total variance. The scale and its subscales were also found to be highly reliable. Multiple regression analyses in the second phase of the study revealed that social support and abstinence self-efficacy significantly predicted life meaningfulness among 48 recovering inmates of a de-addiction center, while level of depression failed to predict life meaningfulness.
Abstract: Social interest in and demand for the Home-Network have
been increasing greatly. Although various services are being
introduced to respond to such demands, they can cause serious
security problems when linked to an open network such as the Internet.
This paper reviews the security requirements for protecting service
users under the assumption that the Home-Network environment is
connected to the Internet, and then proposes a security model based on
these requirements. The proposed security model satisfies most of the
requirements and, furthermore, can be dynamically applied to future
ubiquitous Home-Networks.
Abstract: A new paradigm for software design and development models software by its business process, translates the model into a process execution language, and has it run by a supporting execution engine. This process-oriented paradigm promotes modeling of software by less technical users or business analysts as well as rapid development. Since business process models may be shared by different organizations and sometimes even by different business domains, it is interesting to apply a technique used in traditional software component technology to design reusable business processes. This paper discusses an approach that applies a technique for software component fabrication to the design of process-oriented software units, called process components. These process components result from decomposing a business process of a particular application domain into subprocesses, with the aim that the process components be reusable in different process-based software models. The approach is quantitative because the quality of a process component design is measured from technical features of the process components. The approach is also strategic because the measured quality is assessed against business-oriented component management goals. A software tool has been developed to measure how good a process component design is, according to the required managerial goals and in comparison with other designs. We also discuss how we benefit from reusable process components.
Abstract: Recent developments in computing and
communication technology permit users to access multimedia
documents with a variety of devices (PCs, PDAs, mobile phones...)
having heterogeneous capabilities. This diversification of devices
has created the need to adapt multimedia documents according to
their execution contexts. A semantic framework for multimedia
document adaptation based on conceptual neighborhood graphs
was previously proposed. In this framework, adaptation consists in
finding another specification that satisfies the target constraints and
is as close as possible to the initial document. In this paper, we
propose a new way of building the conceptual neighborhood graphs
to better preserve the proximity between the adapted and the original
documents and to deal with more elaborate relation models by
integrating relation relaxation graphs, which make it possible to
handle the delays and distances defined within the relations.
Abstract: In this paper, we address the problem of adaptive radio
resource allocation (RRA) and packet scheduling in the downlink of a
cellular OFDMA system, and propose a downlink multi-carrier
proportional fair (MPF) scheduler as well as its joint scheme with an
adaptive RRA algorithm to distribute radio resources among multiple
users according to their individual QoS requirements. The allocation
and scheduling objective is to maximize the total throughput while
maintaining fairness among users. The simulation results
demonstrate that the presented methods provide users with more
explicit fairness than the RRA algorithm alone, while the joint scheme
achieves a higher sum-rate capacity, with flexible parameter settings,
than the MPF scheduler.
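The proportional fair criterion that the MPF scheduler builds on can be sketched in a few lines of Python. This is a textbook single-carrier version with invented rates, not the paper's multi-carrier or joint-RRA scheme: each slot, serve the user maximizing instantaneous rate over exponentially smoothed average throughput.

```python
def pf_schedule(rates_over_time, tc=10.0):
    """Proportional fair scheduling: each slot, pick the user maximizing
    instantaneous rate / smoothed average throughput (time constant tc)."""
    n = len(rates_over_time[0])
    avg = [1e-6] * n              # tiny positive start avoids division by zero
    chosen = []
    for slot_rates in rates_over_time:
        k = max(range(n), key=lambda u: slot_rates[u] / avg[u])
        chosen.append(k)
        for u in range(n):
            served = slot_rates[u] if u == k else 0.0
            avg[u] = (1 - 1 / tc) * avg[u] + served / tc
    return chosen

# With equal, constant rates the scheduler alternates between the two users,
# illustrating the fairness side of the throughput/fairness trade-off.
schedule = pf_schedule([[1.0, 1.0]] * 4)
```

When users' channels differ, the same rule opportunistically favors whoever is currently well above their own average rate, which is where the sum-rate gain comes from.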
Abstract: With data centers, end-users can realize the pervasiveness of services that will one day be the cornerstone of our lives. However, data centers are often classified among the computing systems that consume the largest amounts of power. To circumvent this problem, we propose a self-adaptive weighted sum methodology that jointly optimizes the performance and power consumption of any given data center. Compared to traditional methodologies for multi-objective optimization problems, the proposed self-adaptive weighted sum technique does not rely on a systematic change of weights during the optimization procedure. The proposed technique is compared with the greedy and LR heuristics for large-scale problems, and with the optimal solution, implemented in LINDO, for small-scale problems. The experimental results revealed that the proposed self-adaptive weighted sum technique outperforms both heuristics and shows competitive performance compared to the optimal solution.
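The self-adaptive weight-update rule itself is not described in the abstract; the following Python sketch shows only the underlying weighted-sum scalarization of the two objectives (illustrative numbers: performance is maximized, power enters with a negative weight):

```python
def weighted_sum(configs, w_perf):
    """Scalarize (performance, power) into a single score and pick the best
    configuration; performance is to be maximized and power minimized."""
    w_power = 1.0 - w_perf

    def score(c):
        perf, power = c
        return w_perf * perf - w_power * power

    return max(configs, key=score)

# (throughput, watts) for three hypothetical data-center configurations
configs = [(100.0, 80.0), (140.0, 120.0), (90.0, 50.0)]
fast = weighted_sum(configs, w_perf=0.8)   # performance-heavy weighting
green = weighted_sum(configs, w_perf=0.2)  # power-heavy weighting
```

Sweeping `w_perf` traces out (convex parts of) the performance/power Pareto front; the paper's contribution is choosing the weights adaptively rather than by such a systematic sweep.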
Abstract: Reducing the risk of information leaks is one of
the most important functions of identity management systems. To
achieve this purpose, Dey et al. have already proposed an account
management method for a federated login system using a blind
signature scheme. In order to ensure account anonymity for the
authentication provider, referred to as an IDP (identity provider),
a blind signature scheme is utilized to generate an authentication
token on an authentication service and the token is sent to an IDP.
However, the proposed system has a problem: malicious users
can establish multiple accounts on an IDP simply by requesting
them. To solve this problem, in this paper the
authors propose an account checking method that is performed before
account generation.
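As a rough illustration of the blind-signature building block this scheme relies on, here is Chaum-style RSA blinding in Python. The parameters are toy-sized and insecure, and this is not the authors' exact protocol; the point is only that the signer (the IDP) signs a token it cannot read.

```python
import random
from math import gcd

# Toy RSA key (known primes; far too small for real use, illustration only).
p, q = 104729, 1299709
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

def blind(m):
    """Blind message m with a random factor r coprime to n."""
    while True:
        r = random.randrange(2, n)
        if gcd(r, n) == 1:
            break
    return (m * pow(r, e, n)) % n, r

def sign_blinded(m_blind):
    """Performed by the signer, who never sees the underlying message."""
    return pow(m_blind, d, n)

def unblind(s_blind, r):
    return (s_blind * pow(r, -1, n)) % n

def verify(m, s):
    return pow(s, e, n) == m % n

token = 123456789                    # stand-in for a hashed auth token
m_blind, r = blind(token)
s = unblind(sign_blinded(m_blind), r)
```

The unblinded signature `s` verifies against the original token even though the signer only ever processed the blinded value, which is what gives the IDP-side anonymity described above.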
Abstract: Due to the complex network architecture, the multi-hop
feature of mobile ad hoc networks poses additional problems for
users. When the traffic load at each node increases, the
additional contention due to its traffic pattern may cause nodes
close to the destination to starve nodes farther away from it;
moreover, the network capacity may be unable to satisfy the
total user demand, which results in an unfairness problem. In this
paper, we propose an algorithm to compute the optimal
MAC-layer bandwidth assigned to each flow in the network. The
contention area of the bottleneck links determines the fair time share,
which is necessary to calculate the maximum allowed transmission
rate used by each flow. To fully utilize the network resources, we
compute two optimal rates, namely the maximum fair share and the
minimum fair share. We use the maximum fair share to limit the
input rate of flows that cross the bottleneck links' contention area
when they are not allocated the optimal transmission rate, and then
calculate the next-highest fair share. Through simulation results, we
show that the proposed protocol achieves an improved fair share and
throughput with reduced delay.
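The fair-share computation alluded to above can be illustrated with the classic progressive-filling procedure for max-min fairness over a single shared capacity. This is a simplification of the per-contention-area problem; flow names and numbers are invented.

```python
def max_min_fair(capacity, demands):
    """Progressive filling: repeatedly split the leftover capacity equally
    among unsatisfied flows, freezing any flow whose demand is met."""
    alloc = {f: 0.0 for f in demands}
    remaining = capacity
    active = {f for f, d in demands.items() if d > 0}
    while active and remaining > 1e-12:
        share = remaining / len(active)
        satisfied = {f for f in active if demands[f] - alloc[f] <= share}
        if not satisfied:
            for f in active:            # everyone gets the equal share
                alloc[f] += share
            remaining = 0.0
        else:
            for f in satisfied:         # cap satisfied flows at their demand
                remaining -= demands[f] - alloc[f]
                alloc[f] = demands[f]
            active -= satisfied
    return alloc

alloc = max_min_fair(10.0, {"f1": 2.0, "f2": 4.0, "f3": 8.0})
```

With 10 units of capacity and demands of 2, 4 and 8, the small flows are fully served and the large flow receives the leftover 4 units, which plays the role of the "maximum fair share" limiting rates in the bottleneck contention area.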
Abstract: This paper describes how the correct endian mode of
the TMS320C6713 DSK board can be identified. It also explains how
the TMS320C6713 DSK board can be used in the little endian and in
the big endian modes for assembly language programming in
particular and for signal processing in general. Similarly, it discusses
how crucially important it is for a user of the TMS320C6713 DSK
board to identify the mode of operation and then use it correctly
during the development stages of the assembly language
programming; otherwise, unnecessary confusion and erroneous
results will arise when storing data into and loading data from
the memory. Furthermore, it highlights and
strongly recommends to the users of the TMS320C6713 DSK board
to be aware of the availability and importance of various display
options in the Code Composer Studio (CCS) for correctly
interpreting and displaying the desired data in the memory. The
information presented in this paper will be of great importance and
interest to those practitioners and developers who want to use the
TMS320C6713 DSK board for assembly language programming as
well as input-output signal processing manipulations. Finally,
examples that clearly illustrate the concept are presented.
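Board-specific details aside, the general endianness pitfall warned about above can be reproduced on any host in a few lines of Python (`struct` and `sys` are standard library; this is a generic illustration, not TMS320C6713 code):

```python
import struct
import sys

# Host byte order as Python reports it ('little' or 'big').
print(sys.byteorder)

# The same 32-bit word stored under each convention.
word = 0x12345678
little = struct.pack('<I', word)   # b'\x78\x56\x34\x12'
big = struct.pack('>I', word)      # b'\x12\x34\x56\x78'
native = struct.pack('=I', word)   # whichever the host actually uses

# Reading the bytes back with the wrong convention scrambles the value,
# which is exactly the store/load confusion described above.
wrong = struct.unpack('<I', big)[0]
```

Here `wrong` comes out as 0x78563412: the data in memory is unchanged, but interpreting it in the wrong mode yields an erroneous result, mirroring what happens when the DSK board's endian mode is misidentified.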
Abstract: In this paper, we study techniques for scheduling users
for resource allocation in multiple-input multiple-output (MIMO)
packet transmission systems. In
these systems, transmit antennas are assigned to one user or
dynamically to different users using spatial multiplexing. The
allocation of all transmit antennas to one user cannot take full
advantages of multi-user diversity. Therefore, we developed the case
when resources are allocated dynamically. At each time slot users
have to feed back their channel information on an uplink feedback
channel. The channel information assumed available to the schedulers
is the zero-forcing (ZF) post-detection signal-to-interference-plus-
noise ratio. Our analysis concerns the round-robin and the
opportunistic schemes.
In this paper, we present an overview and a complete capacity
analysis of these schemes. The main result of our study is an
analytical form of the system capacity using the ZF receiver at the
user terminal. Simulations have been carried out to validate all
proposed analytical solutions and to compare the performance of
these schemes.
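A minimal numeric sketch contrasting the two schemes (invented SINR values; the ZF post-detection SINR computation itself is not shown): round robin cycles through users regardless of channel state, while the opportunistic scheme serves the user with the best reported SINR, which can only increase the sum capacity Σ log2(1 + SINR).

```python
import math

def round_robin(sinr_per_slot):
    """Serve users cyclically, ignoring the fed-back channel state."""
    n = len(sinr_per_slot[0])
    return [t % n for t in range(len(sinr_per_slot))]

def opportunistic(sinr_per_slot):
    """Serve, in each slot, the user with the best post-detection SINR."""
    return [max(range(len(s)), key=lambda u: s[u]) for s in sinr_per_slot]

def sum_capacity(sinr_per_slot, schedule):
    """Shannon-style sum capacity of a schedule, in bits/s/Hz."""
    return sum(math.log2(1 + sinr_per_slot[t][u])
               for t, u in enumerate(schedule))

# hypothetical fed-back SINR values for two users over three slots
sinrs = [[1.0, 3.0], [3.0, 1.0], [0.5, 0.5]]
```

On these numbers the opportunistic schedule always rides the better channel, capturing the multi-user diversity gain that allocating all antennas to one fixed user forgoes.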
Abstract: 4G communication networks provide heterogeneous
wireless technologies to mobile subscribers through IP-based
networks, and users can enjoy high-speed access while roaming across
multiple wireless channels; this is made possible by an organized way
of managing the Quality of Service (QoS) functionalities in these
networks. This paper proposes a novel QoS optimization
architecture that assesses user requirements and, by knowing the peak
times of service utilization, can save bandwidth and cost. The
proposed architecture can be customized according to network
usage priorities so as to considerably improve a network's QoS
performance.
Abstract: Nowadays, HPC, Grid and Cloud systems are evolving
very rapidly. However, the development of infrastructure solutions
related to HPC is lagging behind. While the existing infrastructure is
sufficient for simple cases, many computational problems have more
complex requirements. Such computational experiments use different
resources simultaneously to start a large number of computational
jobs. These resources are heterogeneous: they have different
purposes, architectures, performance and software. Users need a
convenient tool that allows them to describe and run complex
computational experiments under the conditions of an HPC
environment. This paper introduces a modular workflow system
called SEGL, which makes it possible to run complex computational
experiments under the conditions of a real HPC organization. The
system can be used in a great number of organizations that provide
HPC power. Key requirements for this system are high efficiency and
interoperability with the organization's existing HPC infrastructure,
without any changes to that infrastructure.
Abstract: Recently, many web services providing information for public transport have been developed and released. They are optimized for mobile devices such as smartphones. We are also developing a better path-planning system for route buses and trains called “Bus-Net” [1]. However, these systems only provide paths and related information before the user starts moving. We therefore propose context-aware navigation to change the way public transport users are supported. When we travel somewhere using several kinds of public transport, we have to know how to use each of them. In addition, public transport is a dynamic system whose characteristics differ by type, so we need information in real time. We therefore propose a system that provides support based on the user's state, with a variety of ways to help public transport users in each state, such as turn-by-turn navigation. Context-aware navigation should be able to reduce the anxiety of using public transport.
Abstract: The rapid growth of e-Commerce services has been
significant over the past decade. However, the methods used to
verify authenticated users still widely depend on numeric
approaches. The search for other verification methods suitable for
online e-Commerce is therefore an interesting issue. In this paper, a
new online signature-verification method using angular
transformation is presented. Delay shifts existing in online
signatures are estimated by an estimation method relying on angle
representation. In the proposed signature-verification algorithm, all
components of the input signature are extracted by considering the
discontinuous break points in the stream of angular values. The
estimated delay shift is then captured by comparison with the
selected reference signature, and the matching error can be
computed as the main feature used in the verification process. The
threshold offsets are calculated from the two types of error
characteristic of the signature-verification problem, the False
Rejection Rate (FRR) and the False Acceptance Rate (FAR). The
level of these two error rates depends on the chosen decision
threshold, whose value is set so as to realize the Equal Error Rate
(EER; FAR = FRR). The experimental results show that, through a
simple program deployed on the Internet to demonstrate e-Commerce
services, the proposed method provides 95.39% correct verifications,
7% better than a DP-matching-based signature-verification method.
In addition, signature verification with extracted components
provides more reliable results than making a decision on the whole
signature.
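The FAR/FRR/EER machinery described above is standard; a small Python sketch with invented dissimilarity scores (accept when the matching error is at or below the threshold) shows how the EER operating point is located:

```python
def far_frr(genuine, impostor, threshold):
    """FRR: fraction of genuine attempts rejected; FAR: fraction of
    impostor attempts accepted. Scores are matching errors, so
    acceptance means score <= threshold."""
    frr = sum(s > threshold for s in genuine) / len(genuine)
    far = sum(s <= threshold for s in impostor) / len(impostor)
    return far, frr

def equal_error_rate(genuine, impostor):
    """Scan candidate thresholds for the point where FAR and FRR meet."""
    best = None
    for t in sorted(set(genuine + impostor)):
        far, frr = far_frr(genuine, impostor, t)
        if best is None or abs(far - frr) < abs(best[1] - best[2]):
            best = (t, far, frr)
    return best

# invented matching errors for genuine and impostor signatures
genuine = [0.10, 0.20, 0.30, 0.40]
impostor = [0.35, 0.50, 0.60, 0.70]
threshold, far, frr = equal_error_rate(genuine, impostor)
```

On this toy data the EER threshold falls at 0.35, where both error rates equal 25%; a real system would compute the same curves over many signatures per user.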
Abstract: Social networking is one of the most successful and popular tools to emerge from the Web 2.0 era. However, the increased interconnectivity and access to people's personal lives and information has created a plethora of opportunities for the nefarious side of human nature to manifest. This paper categorizes and describes the major types of anti-social behavior and criminal activity that can arise through undisciplined use and/or misuse of social media. We specifically address identity theft, misrepresentation of posted information, cyber bullying, children and social networking, and social networking in the workplace. Recommendations are provided for how to reduce the risk of being the victim of a crime or engaging in embarrassing behavior that could irrevocably harm one's reputation either professionally or personally. We also discuss what responsibilities social networking companies have to protect their users, and what law enforcement and policy makers can do to help alleviate the problems.
Abstract: Cloud computing is the innovative and leading
information technology model for enabling convenient, on-demand
network access to a shared pool of configurable computing resources
that can be rapidly provisioned and released with minimal
management effort. This paper presents our development on enabling
an individual user's desktop in a virtualized environment, which is
stored on a remote virtual machine rather than locally. We present the
initial work on the integration of virtual desktop and application
sharing with virtualization technology. Given the development of
remote desktop virtualization, this proposed effort has the potential to
provide an efficient, resilient and elastic environment for
online cloud services. Users no longer need to bear the cost of
software licenses and platform maintenance. Moreover, this
development also helps boost user productivity by promoting a
flexible model that lets users access their desktop environments from
virtually anywhere.
Abstract: The empirical mode decomposition (EMD) represents any time series in a finite set of basis functions. The bases are termed intrinsic mode functions (IMFs); they are mutually orthogonal and contain a minimum amount of cross-information. The EMD successively extracts the IMFs with the highest local frequencies in a recursive way, which effectively yields a set of low-pass filters based entirely on the properties exhibited by the data. In this paper, EMD is applied to explore the properties of multi-year air temperature data and to observe its effects on climate change under global warming. This method decomposes the original time series into intrinsic time scales. It is capable of analyzing nonlinear, non-stationary climatic time series that cause problems for many linear statistical methods and their users. The analysis results show that the EMD modes present seasonal variability. Most of the IMFs have a normal distribution, and the energy density distribution of the IMFs follows a chi-square distribution. The IMFs are effective in isolating physical processes of various time scales and are also statistically significant. The analysis results also show that the EMD method does a good job of revealing many characteristics of interannual climate. The results suggest that climate fluctuations of every single element, such as temperature, are the result of variations in the global atmospheric circulation.
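For readers unfamiliar with EMD, the core sifting idea can be sketched in a few lines of numpy. This is a deliberately simplified version: linear rather than cubic-spline envelopes, a fixed number of sifting passes, no stopping criterion, and a synthetic two-tone signal standing in for temperature data.

```python
import numpy as np

def sift_once(x):
    """One sifting pass: subtract the mean of the upper and lower envelopes.
    Linear interpolation is used for brevity; real EMD uses cubic splines."""
    t = np.arange(len(x))
    maxima = [i for i in range(1, len(x) - 1) if x[i - 1] < x[i] > x[i + 1]]
    minima = [i for i in range(1, len(x) - 1) if x[i - 1] > x[i] < x[i + 1]]
    if len(maxima) < 2 or len(minima) < 2:
        return None                      # monotonic residue: stop sifting
    upper = np.interp(t, maxima, x[maxima])
    lower = np.interp(t, minima, x[minima])
    return x - (upper + lower) / 2.0

t = np.linspace(0, 1, 500)
signal = np.sin(2 * np.pi * 25 * t) + 0.5 * np.sin(2 * np.pi * 3 * t)

imf = signal.copy()
for _ in range(10):                      # a few sifting iterations
    nxt = sift_once(imf)
    if nxt is None:
        break
    imf = nxt
residue = signal - imf                   # the slower oscillation remains here
```

The first IMF captures the fast 25-cycle component while the residue retains the slow trend; iterating the whole procedure on the residue would extract the remaining, progressively lower-frequency modes, which is the low-pass-filter behavior described above.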
Abstract: This paper aims to address the new trend of social
commerce as electronic commerce leverages Web 2.0 technologies
and online social media. The infusion of new technologies into the
World Wide Web connects users in their homes and workplaces,
thus transforming social formations and business transactions. An
in-depth study of the growth and success of a social commerce site,
Facebook, was conducted. The investigation concludes with a triad
relational model which reflects socioeconomic life in the Internet
today. The following three concepts work jointly to form a global
community that has already started to take the place of traditional
commerce and socialization: Web 2.0 technology, E-commerce,
and online social media. A discussion of the research findings
indicates that social commerce networks are sustainable because of
the various incentives given to users as they collaborate with others
regardless of their identity and location. The focus of this article is
to increase understanding of the quickly developing Web 2.0-based
social media and their subsequent effects on the emerging social
commerce.
Abstract: Cybercrime is now becoming a big challenge in Nigeria, alongside traditional crime. The inability to identify perpetrators is one of the reasons for the growing menace. This paper proposes a design for monitoring internet users’ activities in order to curb cybercrime. It requires redefining the operations of Internet Service Providers (ISPs), which would mandate that users be authenticated before accessing the internet. In implementing this work, which can be adapted to a larger scale, a virtual router application is developed and configured to mimic a real router device. A sign-up portal is developed to allow users to register with the ISP. The portal asks for identification information, including bio-data and government-issued identification data such as the National Identity Card number, et cetera. A unique username and password are chosen by the user to enable access to the internet; these are used to link him to the Internet Protocol address (IP address) of any system he uses on the internet, thereby associating him with any criminal act related to that IP address at that particular time. Questions such as “What happens when another user knows the password and uses it to commit crime?” and other pertinent issues are addressed.