Abstract: In this paper, a method to detect multiple ellipses is presented. The technique is efficient and robust against incomplete ellipses caused by partial occlusion, noise, missing edges, and outliers. It is an iterative technique that finds and removes the best ellipse until no reasonable ellipse remains. At each run, the best ellipse is extracted from randomly selected edge patches, and its fitness is calculated and compared to a fitness threshold. The RANSAC algorithm is applied as the sampling process, together with the Direct Least Square (DLS) fitting of ellipses as the fitting algorithm. In our experiments, the method performs very well and is robust against noise and spurious edges on both synthetic and real-world image data.
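The iterative find-and-remove loop described above can be sketched as follows. This is only a minimal illustration, not the authors' implementation: DLS fitting is replaced by an exact five-point conic solve, fitness is taken as the inlier count against an algebraic-distance tolerance, and the parameter values (`iters`, `tol`, `min_inliers`) are assumptions.

```python
import math
import random

def fit_conic(pts):
    # Exact conic through 5 points: A x^2 + B xy + C y^2 + D x + E y + 1 = 0.
    # (A stand-in for DLS ellipse fitting; solved by Gaussian elimination.)
    M = [[x * x, x * y, y * y, x, y] for x, y in pts]
    b = [-1.0] * 5
    for col in range(5):
        piv = max(range(col, 5), key=lambda r: abs(M[r][col]))
        if abs(M[piv][col]) < 1e-12:
            return None                      # degenerate sample
        M[col], M[piv] = M[piv], M[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 5):
            f = M[r][col] / M[col][col]
            for c in range(col, 5):
                M[r][c] -= f * M[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * 5
    for r in range(4, -1, -1):
        s = b[r] - sum(M[r][c] * coef[c] for c in range(r + 1, 5))
        coef[r] = s / M[r][r]
    A, B, C, D, E = coef
    if B * B - 4 * A * C >= 0:               # keep only ellipse-type conics
        return None
    return (A, B, C, D, E, 1.0)

def algebraic_dist(conic, p):
    A, B, C, D, E, F = conic
    x, y = p
    return abs(A * x * x + B * x * y + C * y * y + D * x + E * y + F)

def detect_ellipses(points, iters=300, tol=0.05, min_inliers=10):
    # Iteratively find the best ellipse by RANSAC sampling, remove its
    # inliers, and repeat until no ellipse passes the fitness threshold.
    points = list(points)
    found = []
    while len(points) >= 5:
        best, best_in = None, []
        for _ in range(iters):
            conic = fit_conic(random.sample(points, 5))
            if conic is None:
                continue
            inliers = [p for p in points if algebraic_dist(conic, p) < tol]
            if len(inliers) > len(best_in):
                best, best_in = conic, inliers
        if best is None or len(best_in) < min_inliers:
            break                            # no reasonable ellipse left
        found.append(best)
        points = [p for p in points if p not in best_in]
    return found
```

On edge points of one ellipse plus scattered outliers, the loop recovers the ellipse in the first pass and then stops, since the remaining outliers never reach the inlier threshold.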
Abstract: The usual correctness condition for a schedule of
concurrent database transactions is some form of serializability of
the transactions. For general forms, the problem of deciding whether
a schedule is serializable is NP-complete. In those cases other approaches
to proving correctness, using proof rules that allow the steps
of the proof of serializability to be guided manually, are desirable.
Such an approach is possible in the case of conflict serializability
which is proved algebraically by deriving serial schedules using
commutativity of non-conflicting operations. However, conflict serializability
can be an unnecessarily strong form of serializability restricting
concurrency and thereby reducing performance. In practice,
weaker, more general, forms of serializability for extended models of
transactions are used. Currently, there are no known methods using
proof rules for proving those general forms of serializability. In this
paper, we define serializability for an extended model of partitioned
transactions, which we show to be as expressive as serializability
for general partitioned transactions. An algebraic method for proving
general serializability is obtained by giving an initial-algebra specification
of serializable schedules of concurrent transactions in the
model. This demonstrates that it is possible to conduct algebraic
proofs of correctness of concurrent transactions in general cases.
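Conflict serializability, mentioned above as the form that admits mechanical proof, is classically decided with the precedence (conflict) graph: a schedule is conflict serializable iff the graph is acyclic. The sketch below illustrates that standard test, not the paper's algebraic initial-algebra method; the operation-tuple encoding and helper names are our own.

```python
def conflicts(a, b):
    # Two operations conflict if they belong to different transactions,
    # access the same item, and at least one of them is a write.
    return a[0] != b[0] and a[2] == b[2] and 'w' in (a[1], b[1])

def is_conflict_serializable(schedule):
    # schedule: list of (txn, 'r' | 'w', item) tuples in execution order.
    # Build the precedence graph: an edge Ti -> Tj whenever an operation
    # of Ti conflicts with a LATER operation of Tj.
    edges = {}
    for i, a in enumerate(schedule):
        for b in schedule[i + 1:]:
            if conflicts(a, b):
                edges.setdefault(a[0], set()).add(b[0])
    # Conflict serializable iff the precedence graph is acyclic (DFS).
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {}

    def acyclic_from(u):
        color[u] = GRAY
        for v in edges.get(u, ()):
            if color.get(v, WHITE) == GRAY:
                return False                 # back edge: cycle found
            if color.get(v, WHITE) == WHITE and not acyclic_from(v):
                return False
        color[u] = BLACK
        return True

    for t in {op[0] for op in schedule}:
        if color.get(t, WHITE) == WHITE and not acyclic_from(t):
            return False
    return True
```

An acyclic precedence graph yields a serial order by topological sort, which is exactly the serial schedule the algebraic derivation via commutativity would produce.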
Abstract: The growth of open networks has created interest in commercialising them. The establishment of an electronic business mechanism must be accompanied by a digital electronic payment system to transfer the value of transactions. Financial organizations are asked to offer a secure e-payment scheme with levels of trust and security equivalent to those of conventional paper-based payment transactions. The paper addresses the first-trade problem in e-commerce, provides a brief literature review on electronic payment, and attempts to explain the underlying concept and method of trust in relation to electronic payment.
Abstract: Knowledge-based e-mail systems focus on
incorporating knowledge management approach in order to enhance
the traditional e-mail systems. In this paper, we present a knowledge-based
e-mail system called KS-Mail, where users not only send
and receive e-mail conventionally but can also create a sense
of knowledge flow. We introduce semantic processing on the e-mail
contents by automatically assigning categories and providing links to
semantically related e-mails. This is done to enrich the knowledge
value of each e-mail as well as to ease the organization of the e-mails
and their contents. At the application level, we have also built
components like the service manager, evaluation engine and search
engine to handle the e-mail processes efficiently by providing the
means to share and reuse knowledge. For this purpose, we present the
KS-Mail architecture, and elaborate on the details of the e-mail
server and the application server. We present the ontology mapping
technique used to achieve the categorization of e-mail contents, as well
as the protocols that we have developed to handle the transactions in
the e-mail system. Finally, we discuss the implementation of the
modules presented in the KS-Mail architecture in further detail.
Abstract: This paper examines the factors that determine
R&D outsourcing behaviour at Japanese firms since the latter half of the
1990s, from the viewpoints of transaction cost and strategic management.
The study uses empirical analysis of large-sample
data. The principal findings of this paper
are listed below. Firms that belong to a wider corporate group are more
active in executing R&D outsourcing activities. Diversification
strategies such as the expansion of product and sales markets have a
positive effect on the R&D outsourcing behaviour of firms. Moreover,
while quantitative R&D resources have positive influences on R&D
outsourcing, qualitative indices have no effect. These facts suggest
that the R&D outsourcing behaviour of Japanese firms is consistent with
the two perspectives of transaction cost and strategic management.
Specifically, a conventional corporate group network plays an
important role in R&D outsourcing behaviour. Firms that execute
R&D outsourcing leverage 'old' networks to construct 'new' networks
and use both networks properly.
Abstract: Trends in business intelligence, e-commerce and
remote access make it necessary and practical to store data in
different ways on multiple systems with different operating systems.
As businesses evolve and grow, they require an efficient computerized
solution to update data and to access data from diverse
enterprise business applications. The objective of this paper is to
demonstrate the capability of DTS [1] as a database solution for
automatic data transfer and update in solving business problems. The
DTS package was developed for the sale of a variety of plants and
eventually expanded into a commercial supply and landscaping
business. Dimensional data modeling is used in the DTS package to
extract, transform, and load data from heterogeneous database
systems such as MySQL, Microsoft Access, and Oracle, consolidating
it into a data mart residing in SQL Server. The data transfer from
the various databases is scheduled to run automatically every quarter
of the year to support timely sales analysis. DTS is therefore an
attractive solution for automatic data transfer and update that meets
today's business needs.
Abstract: In an era of knowledge explosion, data grows
rapidly day by day. Since data storage is a limited resource,
how to reduce the space that data occupy becomes a challenging issue.
Data compression provides a good solution which can lower the
required space. Data mining has many useful applications in recent
years because it can help users discover interesting knowledge in large
databases. However, existing compression algorithms are not
appropriate for data mining. In [1, 2], two different approaches were
proposed to compress databases and then perform the data mining
process. However, both lack the ability to decompress the data to
their original state and to improve data mining performance. In this
research, a new approach called Mining Merged Transactions with the
Quantification Table (M2TQT) is proposed to solve these problems.
M2TQT uses the relationship of transactions to merge related
transactions and builds a quantification table to prune the candidate
itemsets which are impossible to become frequent in order to improve
the performance of mining association rules. The experiments show
that M2TQT performs better than existing approaches.
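The abstract does not give M2TQT's internal details, but its core idea, merging related transactions and using a quantification table to prune candidates that cannot become frequent, can be illustrated with a simplified sketch: duplicate transactions are merged with a multiplicity count, and a table of per-item counts prunes any candidate containing an infrequent item (downward closure). The function names and the restriction to pair candidates are our own assumptions.

```python
from collections import Counter
from itertools import combinations

def merge_transactions(db):
    # Merge duplicate transactions into (itemset -> multiplicity) --
    # a simplified stand-in for the paper's merged-transaction idea.
    return Counter(frozenset(t) for t in db)

def frequent_itemsets(db, minsup, k=2):
    merged = merge_transactions(db)
    # "Quantification table": per-item counts over the merged transactions,
    # used to prune candidates that contain any infrequent item.
    item_count = Counter()
    for items, mult in merged.items():
        for i in items:
            item_count[i] += mult
    frequent = sorted(i for i, c in item_count.items() if c >= minsup)
    result = {frozenset([i]): item_count[i] for i in frequent}
    # Candidate k-itemsets built only from frequent items (downward closure),
    # support-counted in one pass over the merged transactions.
    for cand in combinations(frequent, k):
        sup = sum(m for t, m in merged.items() if set(cand) <= t)
        if sup >= minsup:
            result[frozenset(cand)] = sup
    return result
```

Merging means each distinct transaction is scanned once regardless of how often it occurs, which is where a compressed representation can speed up mining.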
Abstract: In the past, there has been considerable research on recommendation systems in applied electronic commerce. However, as all sectors actively promote the integration of information technology into instruction, the number of instruction resource websites on the Internet keeps increasing, yet few websites offer a recommendation service, especially for teachers. This study established an instruction resource recommendation website that analyzed teachers' teaching styles and then immediately provided appropriate instruction resources for them. We used a questionnaire survey to collect teachers' suggestions about and satisfaction with the instruction resource contents and recommendation results. The study shows: (1) The website used the "Transactional Ability Inventory" to identify a teacher's style and provide appropriate instruction resources in a short time, reducing the data filtering step. (2) According to the content satisfaction results of the questionnaire survey, teachers of all four styles were largely satisfied with the contents of the recommended instruction resources; thus, the concept of developing instruction resources for different teaching styles is accepted. (3) According to the recommendation satisfaction results, teachers of all four styles were largely satisfied with the website's recommendation service; thus, the strategy of providing different results for teachers with different teaching styles is accepted.
Abstract: The increasing development of wireless networks and
the widespread popularity of handheld devices such as Personal
Digital Assistants (PDAs), mobile phones and wireless tablets
represent an incredible opportunity to establish mobile devices as a
universal payment method, involving daily financial transactions.
Unfortunately, several issues hamper the widespread acceptance of
mobile payment, such as accountability properties, privacy protection,
and the limitations of wireless networks and mobile devices. Recently,
many mobile payment protocols based on public-key cryptography have been
proposed. However, the limited capabilities of mobile devices and
wireless networks make these protocols unsuitable for mobile
networks. Moreover, these protocols were designed to preserve the
traditional flow of payment data, which is vulnerable to attack and
increases the user's risk. In this paper, we propose a private mobile
payment protocol based on a client-centric model and
employing symmetric-key operations. The proposed mobile payment
protocol not only minimizes the computational operations and
communication passes between the engaging parties, but also
achieves complete privacy protection for the payer. Future
work will concentrate on improving the verification solution to
support mobile user authentication and authorization for mobile
payment transactions.
Abstract: In recent years, real estate prediction or valuation has
been a topic of discussion in many developed countries. Improper
hype created by investors leads to fluctuating real estate prices,
hindering many consumers from purchasing their own homes. Therefore,
scholars in various countries have conducted research on real estate
valuation and prediction. Combining the back-propagation neural
network, which has been popular in recent years, with the orthogonal
array of the Taguchi method, this study aimed to find the optimal
parameter combination among the levels of the orthogonal array,
so that the artificial
neural network obtained the most accurate results. The experimental
results also demonstrated that the method presented in the study had a
better result than traditional machine learning. Finally, it also showed
that the model proposed in this study had the optimal predictive effect,
and could significantly reduce the cost of time in simulation operation.
The best predictive results could be found more efficiently with
fewer experiments. Thus, users can predict a real estate
transaction price that is not far from the current actual prices.
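The Taguchi-style search the abstract describes, evaluating parameter combinations only at the rows of an orthogonal array and then picking each factor's best level by main-effects averaging, can be sketched for four three-level factors with the standard L9 array. The response function and level coding below are illustrative, not the study's network-training setup.

```python
# Standard L9(3^4) orthogonal array: 9 runs cover 4 three-level factors
# (instead of 3^4 = 81 full-factorial experiments).
L9 = [
    (1, 1, 1, 1), (1, 2, 2, 2), (1, 3, 3, 3),
    (2, 1, 2, 3), (2, 2, 3, 1), (2, 3, 1, 2),
    (3, 1, 3, 2), (3, 2, 1, 3), (3, 3, 2, 1),
]

def taguchi_best_levels(response):
    # Main-effects analysis: average the response over the runs at each
    # level of each factor, then pick the best level per factor.
    best = []
    for factor in range(4):
        means = {
            lvl: sum(response(r) for r in L9 if r[factor] == lvl) / 3
            for lvl in (1, 2, 3)
        }
        best.append(max(means, key=means.get))
    return tuple(best)
```

For a separable response, the orthogonality of the array makes the per-level averages differ only through each factor's own contribution, so 9 runs recover the optimum that an 81-combination full-factorial search would find.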
Abstract: ebXML (Electronic Business using eXtensible
Markup Language) is an e-business standard, sponsored by
UN/CEFACT and OASIS, which enables enterprises to exchange
business messages, conduct trading relationships, communicate
data in common terms and define and register business
processes. While there is tremendous e-business value in
ebXML, security remains an unsolved problem and one of the
largest barriers to adoption. Recently emerging XML security
technologies offer the extensibility and flexibility suitable for
implementing security features such as encryption, digital signatures,
access control and authentication.
In this paper, we propose ebXML business transaction models
that allow trading partners to securely exchange XML based
business transactions by employing XML security technologies.
We show how each XML security technology meets the ebXML
standard by constructing the test software and validating messages
between the trading partners.
Abstract: With users' demand for mobility, wireless
technologies have become a hotspot of development. The Internet
Engineering Task Force (IETF) working group has developed Mobile
IP to support node mobility. The concept of node mobility indicates
that in spite of the movement of the node, it is still connected to the
Internet and all its data transactions are preserved, providing
location-independent access to the Internet. Since the incorporation of
host mobility, network mobility has undergone intense research.
Several intricacies are faced in the real-world implementation
of network mobility, notably the problem of nested networks and
its consequences. This article addresses a nested-network problem
called the pinball route problem and proposes a solution
to eliminate it. The proposed mechanism is
implemented using the NS2 simulation tool, and it is found to
efficiently reduce the overhead caused by the
pinball route problem.
Abstract: Recently, RFID (Radio Frequency
Identification) technology has attracted worldwide market attention as
an essential technology for the ubiquitous environment. The RFID
market has focused on transponder and reader development,
but that concern has shifted to RFID software such as
high-value e-business applications, RFID middleware and
related development tools. However, due to the high sensitivity
of data and service transactions within the RFID network,
security considerations must be addressed. In order to guarantee
trusted e-business based on RFID technology, we propose a
security-enhanced RFID middleware system. Our proposal is
compliant with EPCglobal ALE (Application Level Events),
which is the standard interface between middleware and its clients. We
show how to provide strengthened security and trust by
protecting both the data transported between the middleware and its
clients and the data stored in the middleware. Moreover, we provide
identification and service access control against illegal service
abuse. Our system enables secure RFID middleware services
and trusted e-business services.
Abstract: The problem of frequent itemset mining is considered in this paper. A new technique is proposed to generate frequent patterns in large databases without time-consuming candidate generation. The technique is based on focusing on transactions instead of concentrating on itemsets: intersections between one transaction and the other transactions are taken, and the maximum shared items between transactions are computed, instead of creating itemsets and computing their frequencies. Applied to real-life transaction data, the technique achieves significant efficiency in generating association rules.
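The transaction-intersection idea can be sketched as follows: candidate patterns come from pairwise intersections of transactions (the items they share) rather than from itemset enumeration, and only those candidates are support-counted. This is our simplified reading of the approach, restricted to pairwise intersections; names and thresholds are illustrative.

```python
from itertools import combinations

def intersection_patterns(db, minsup):
    # Candidate patterns are the item sets shared by pairs of
    # transactions, instead of enumerated itemsets.
    sets = [frozenset(t) for t in db]
    candidates = set()
    for a, b in combinations(sets, 2):
        shared = a & b
        if shared:
            candidates.add(shared)
    # Count the support of each candidate with one pass over the database
    # and keep only the frequent ones.
    support = {c: sum(1 for t in sets if c <= t) for c in candidates}
    return {c: s for c, s in support.items() if s >= minsup}
```

Because every pattern that occurs in two or more transactions appears as some pairwise intersection, the candidate set is far smaller than the full itemset lattice while still covering all patterns of support at least two.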
Abstract: In this paper, a novel method using the Bees Algorithm is proposed to determine the optimal allocation of FACTS devices for maximizing the Available Transfer Capability (ATC) of power transactions between source and sink areas in a deregulated power system. The algorithm simultaneously searches the FACTS locations, FACTS parameters and FACTS types. Two types of FACTS are simulated in this study, namely the Thyristor Controlled Series Compensator (TCSC) and the Static Var Compensator (SVC). A Repeated Power Flow with FACTS devices is used to evaluate the feasible ATC value within real and reactive power generation limits, line thermal limits, voltage limits and FACTS operation limits. An IEEE 30-bus system is used to demonstrate the effectiveness of the algorithm as an optimization tool to enhance ATC. A Genetic Algorithm technique is used for validation purposes. The results clearly indicate that the introduction of FACTS devices in the right combination of location and parameters could enhance ATC, and that the Bees Algorithm can be used efficiently for this kind of nonlinear integer optimization.
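The Bees Algorithm itself follows a standard scheme: scout bees sample the search space, the best sites recruit more bees for neighbourhood search (with elite sites recruiting the most), and the remaining bees keep scouting randomly. A one-dimensional sketch is below; the expensive repeated-power-flow ATC evaluation from the paper is replaced by an arbitrary `objective`, and all parameter values are illustrative defaults, not the paper's settings.

```python
import random

def bees_algorithm(objective, bounds, n=20, m=5, e=2, nep=7, nsp=3,
                   ngh=0.1, iters=50):
    # Standard Bees Algorithm skeleton (maximisation):
    #   n scouts, m selected sites, e elite sites,
    #   nep/nsp recruited bees per elite/other site, ngh patch radius.
    lo, hi = bounds
    scouts = [random.uniform(lo, hi) for _ in range(n)]
    best = max(scouts, key=objective)
    for _ in range(iters):
        scouts.sort(key=objective, reverse=True)
        new = []
        for rank, site in enumerate(scouts[:m]):
            recruits = nep if rank < e else nsp   # more bees on elite sites
            patch = [min(hi, max(lo, site + random.uniform(-ngh, ngh)))
                     for _ in range(recruits)]
            new.append(max(patch + [site], key=objective))
        # The remaining bees scout the space at random.
        new += [random.uniform(lo, hi) for _ in range(n - m)]
        scouts = new
        best = max(scouts + [best], key=objective)
    return best
```

In the paper's setting each candidate would encode FACTS location, parameters and type, with the objective evaluated by the repeated power flow under the stated operating limits.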
Abstract: The American Health Level Seven (HL7) Reference Information Model (RIM) consists of six backbone classes with different specialized attributes. Furthermore, to enforce semantic expression, specific mandatory vocabulary domains have been defined for representing the content values of certain attributes. Owing to the variety of workflows, most hospitals duplicate effort, spending considerable time and human cost to develop and modify Clinical Information Systems (CIS). This study therefore attempts to design and develop shared RIM-based components of the CIS for different business processes, so that the CIS contains data of a consistent format and type. Programmers can perform transactions with the RIM-based clinical repository through the shared RIM-based components, and when developing functions of the CIS, the shared components can also be adopted in the system. These components not only satisfy physicians' needs in using a CIS but also reduce the time needed to develop new components of a system. All in all, this study provides a new viewpoint: integrating data and functions with business processes is an easy and flexible approach to building a new CIS.
Abstract: Grid computing provides a virtual framework for
controlled sharing of resources across institutional boundaries.
Recently, trust has been recognised as an important factor for
selection of optimal resources in a grid. We introduce a new method
that provides a quantitative trust value, based on the past interactions
and present environment characteristics. This quantitative trust value
is used to select a suitable resource for a job and eliminates run time
failures arising from incompatible user-resource pairs. The proposed
work acts as a tool to calculate the trust values of the various
components of the grid, thereby improving the success rate of
jobs submitted to resources on the grid. Access to a resource
depends not only on the identity and behaviour of the resource but
also on its transaction context, transaction time, connectivity
bandwidth, availability, and load. The
quality of the recommender is also evaluated based on the accuracy
of the feedback provided about a resource. The jobs are submitted for
execution to the selected resource after finding the overall trust value
of the resource. The overall trust value is computed with respect to
the subjective and objective parameters.
Abstract: In data mining, association rules are used to find
associations between the different items of a transaction
database. As data are collected and stored, valuable rules can be found
through association rules, which can help managers
execute marketing strategies and establish sound market frameworks.
This paper aims to use Fuzzy Frequent Pattern growth (FFP-growth)
to derive fuzzy association rules. First, we apply fuzzy
partition methods and determine a membership function for the
quantitative value of each transaction item. Next, we implement FFP-growth
to deal with the process of data mining. In addition, in order to
understand the impact of Apriori algorithm and FFP-growth algorithm
on the execution time and the number of generated association
rules, the experiment will be performed by using different sizes of
databases and thresholds. Lastly, the experimental results show that the
FFP-growth algorithm is more efficient than other existing methods.
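The fuzzy partition step, mapping each quantitative purchase value to membership degrees in linguistic regions before mining, can be sketched with triangular membership functions; the fuzzy support of an item in a region is then the sum of membership degrees over all transactions. The region boundaries and names here are illustrative assumptions, not the paper's partitioning.

```python
def triangular(x, a, b, c):
    # Triangular membership function: rises on [a, b], falls on [b, c].
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_support(db, item, region):
    # Fuzzy support: sum the membership degree of the item's quantity
    # in the region over every transaction that contains the item.
    a, b, c = region
    return sum(triangular(qty, a, b, c)
               for txn in db for it, qty in txn if it == item)
```

For example, with a region (0, 2, 6), quantities 3 and 5 contribute degrees 0.75 and 0.25, giving a fuzzy support of 1.0; an item/region pair is treated as frequent when this sum meets a minimum fuzzy support threshold.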
Abstract: This paper investigates the problem of sampling from transactional data streams. We introduce CFISDS, a content-based sampling algorithm that works on a landmark window model of data streams and preserves a more informative sample in the sample space. Based on closed frequent itemset mining, the algorithm first initiates a concept lattice from the initial data and then updates the lattice structure with an incremental mechanism that inserts, updates and deletes nodes in the concept lattice in a batch manner. The algorithm extracts the final samples on user demand. Experimental results show the accuracy of CFISDS on synthetic and real datasets, although CFISDS is not faster than existing sampling algorithms such as Z and DSS.