Abstract: This research seeks to investigate the frequency and
profitability of index arbitrage opportunities involving the SET50
futures, SET50 component stocks, and the ThaiDEX SET50 ETF
(ticker symbol: TDEX). In particular, the frequency and profitability of
arbitrage are measured in three arbitrage tests: (1)
SET50 futures vs. the ThaiDEX SET50 ETF, (2) SET50 futures vs.
SET50 component stocks, and (3) the ThaiDEX SET50 ETF vs. SET50
component stocks. For tests (2) and (3), the problems
involve conic optimization and quadratic programming as subproblems.
This research is the first to apply conic optimization and
quadratic programming techniques in the context of index arbitrage
and the first to investigate such index arbitrage in the Thai equity and
derivatives markets. Thus, the contribution of this study is twofold.
First, its results help in understanding the contribution of
derivative securities to the efficiency of the Thai markets. Second,
the methodology employed in this study can be applied to other
geographical markets, with minor adjustments.
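A heavily simplified version of the optimization underlying tests (2) and (3) is a quadratic program: choose weights on the component stocks that minimize the squared tracking error against the index. A minimal sketch with synthetic data (the names and numbers below are illustrative assumptions, not the paper's actual conic formulation):

```python
import numpy as np

rng = np.random.default_rng(0)
R = rng.normal(0.0, 0.01, size=(250, 5))            # synthetic daily returns, 5 stocks
true_w = np.array([0.30, 0.25, 0.20, 0.15, 0.10])   # hypothetical index weights
index = R @ true_w                                  # index returns implied by the weights

# The unconstrained QP  min_w ||R w - index||^2  reduces to least squares.
w, *_ = np.linalg.lstsq(R, index, rcond=None)
tracking_error = float(np.linalg.norm(R @ w - index))
```

Once position limits or transaction-cost terms are added, the problem is no longer solvable in closed form and a conic or QP solver is needed, which is where subproblems of the kind studied in the paper arise.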
Abstract: A big organization may have multiple branches spread across different locations. Processing data from these branches becomes a huge task when innumerable transactions take place. Also, branches may be reluctant to forward their raw data for centralized processing but may be ready to pass on their association rules. Local mining may also generate a large number of rules, and, in practice, local data sources rarely have the same size. A model is proposed for discovering valid rules from different-sized data sources, where the valid rules are the high-weighted rules. These rules can be obtained from the high-frequency rules generated at each of the data sources. A data source selection procedure is considered in order to synthesize rules efficiently. Support Equalization is another proposed method, which focuses on eliminating low-frequency rules at the local sites themselves, thus reducing the number of rules by a significant amount.
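A minimal sketch of weighted rule synthesis in this spirit (the branch names, sizes, rules, and the 0.6 threshold are all hypothetical): each source is weighted by its relative size, and a rule's global weight sums the weights of the sources that report it as high-frequency.

```python
from collections import defaultdict

# Hypothetical rule sets mined locally at three branches of different sizes.
local_rules = {
    "branch_A": {("bread", "butter"), ("milk", "bread")},
    "branch_B": {("bread", "butter"), ("tea", "sugar")},
    "branch_C": {("bread", "butter")},
}
sizes = {"branch_A": 50_000, "branch_B": 30_000, "branch_C": 20_000}

# Weight each data source by its relative size.
total = sum(sizes.values())
weight = {src: n / total for src, n in sizes.items()}

# A rule's global weight sums the weights of the sources reporting it.
rule_weight = defaultdict(float)
for src, rules in local_rules.items():
    for rule in rules:
        rule_weight[rule] += weight[src]

# Keep only high-weighted ("valid") rules; 0.6 is an arbitrary threshold.
valid = {r for r, w in rule_weight.items() if w >= 0.6}
```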
Abstract: A potentially serious problem with current payment systems is that their underlying hard problems from number theory may be solved by either a quantum computer or unanticipated future advances in algorithms and hardware. A new quantum payment system is proposed in this paper. The suggested system makes use of fundamental principles of quantum mechanics to ensure unconditional security without prior arrangements between customers and vendors. More specifically, the new system uses Greenberger-Horne-Zeilinger (GHZ) states and Quantum Key Distribution to authenticate the vendors and guarantee transaction integrity.
Abstract: A research study was conducted with the objective of proposing a collaborative business strategy for an oil and gas trading company, PPT Energy Trading Co., Ltd., with its shareholders, especially electricity and power supply companies, for LNG in the form of coal bed methane in B2B transactions. A collaborative business strategy is a strategy of collaborating with other organizations in order to gain future benefits for both parties, or to achieve business objectives through the collaboration of businesses, their strategies and partners. A structured interview was conducted to collect the required primary data from the company. In addition to the interviews, the company's business plan and annual report were collected and analyzed to assess the company's current condition. As a result, this research recommends a new collaborative strategy based on limiting the target market, diversifying products, adopting a new business model, and considering other stakeholders.
Abstract: Insider abuse has recently been reported as one of
the more frequently occurring security incidents, suggesting that
more security is required for detecting and preventing unauthorised
financial transactions entered by authorised users. To address the
problem, and based on the observation that all authorised interbanking
financial transactions trigger or are triggered by other
transactions in a workflow, we have developed a security solution
based on a redefined understanding of an audit workflow: a log file
containing the complete workflow activity of the financial transactions
directly related to one financial transaction (an electronic deal
recorded in an e-trading system). The
new security solution contemplates any two parties interacting on
the basis of financial transactions recorded by their users in related
but distinct automated financial systems. Under the new definition,
inter-organizational and intra-organizational interactions can be described
in one unique audit trail. This concept expands the current ideas of
audit trails by adapting them to actual e-trading workflow activity, i.e.
intra-organizational and inter-organizational activity. With the above,
a security auditing service is designed to detect integrity drifts within
and between organizations in order to detect unauthorised financial
transactions entered by authorised users.
Abstract: Sequential pattern mining is a challenging task in the data mining area with broad applications. One of those applications is mining patterns from weblogs. In recent times, weblogs have become highly dynamic, and some of their contents may become obsolete over time. In addition, users may frequently change the threshold value during the data mining process until the required output is acquired or interesting rules are mined. Some recently proposed algorithms for mining weblogs build the tree with two scans and consume considerable time and space. In this paper, we build a Revised PLWAP with Non-frequent Items (RePLNI-tree) with a single scan for all items. While mining sequential patterns, the links related to the non-frequent items are not considered. Hence, it is not required to delete or maintain the information of nodes while revising the tree for mining updated transactions. The algorithm supports both incremental and interactive mining. It is not required to re-compute the patterns each time the weblog is updated or the minimum support is changed. The performance of the proposed tree is better even when the size of the incremental database is more than 50% of the existing one. For evaluation purposes, we have used the benchmark weblog dataset and found that the performance of the proposed tree is encouraging compared to some recently proposed approaches.
Abstract: Recent scientific investigations indicate that
multimodal biometrics overcome the technical limitations of
unimodal biometrics, making them ideally suited for everyday life
applications that require a reliable authentication system. However,
for a successful adoption of multimodal biometrics, such systems
would require large heterogeneous datasets with complex multimodal
fusion and privacy schemes spanning various distributed
environments. From experimental investigations of current
multimodal systems, this paper reports the various issues related to
speed, error-recovery and privacy that impede the diffusion of such
systems in real life. This calls for a robust mechanism that caters to
the desired real-time performance, robust fusion schemes,
interoperability and adaptable privacy policies.
The main objective of this paper is to present a framework that
addresses the abovementioned issues by leveraging the
heterogeneous resource sharing capacities of Grid services and the
efficient machine learning capabilities of artificial neural networks
(ANN). Hence, this paper proposes a Grid-based neural network
framework for adopting multimodal biometrics with the view of
overcoming the barriers of performance, privacy and risk issues that
are associated with shared heterogeneous multimodal data centres.
The framework combines the concept of Grid services for reliable
brokering and privacy policy management of shared biometric
resources along with a momentum back propagation ANN (MBPANN)
model of machine learning for efficient multimodal fusion and
authentication schemes. Real-life applications would be able to adopt
the proposed framework to cater to the varying business requirements
and user privacies for a successful diffusion of multimodal
biometrics in various day-to-day transactions.
Abstract: Numerical analysis naturally finds applications in all
fields of engineering and the physical sciences, but in the
21st century, the life sciences and even the arts have adopted
elements of scientific computation. Numerical data analysis has
become a key process in research and development across all these fields [6].
In this paper, we attempt to analyze the specified
numerical patterns using association rule mining
techniques with minimum confidence and minimum support
criteria. The extracted rules and analyzed results are graphically
demonstrated. Association rules are a simple but very useful form of
data mining that describe the probabilistic co-occurrence of certain
events within a database [7]. They were originally designed to
analyze market-basket data, in which the likelihood of items being
purchased together within the same transaction is analyzed.
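Support and confidence, the two mining criteria mentioned above, can be illustrated on a toy market-basket example (the transactions below are hypothetical):

```python
# Four hypothetical market-basket transactions.
transactions = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "tea"},
    {"bread", "milk"},
]

def support(itemset):
    """Fraction of transactions containing every item of the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """Conditional frequency of the consequent given the antecedent."""
    return support(antecedent | consequent) / support(antecedent)

# Rule {bread} -> {butter}: how often butter co-occurs with bread.
s = support({"bread", "butter"})
c = confidence({"bread"}, {"butter"})
```

A rule is reported only when both values clear the chosen minimum support and minimum confidence thresholds.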
Abstract: The advances in multimedia and networking technologies
have created opportunities for Internet pirates, who can easily
copy multimedia contents and illegally distribute them on the Internet,
thus violating the legal rights of content owners. This paper describes
how a simple and well-known watermarking procedure based on a
spread spectrum method and a watermark recovery by correlation can
be improved to effectively and adaptively protect MPEG-2 videos
distributed on the Internet. In fact, the procedure, in its simplest
form, is vulnerable to a variety of attacks. However, its security
and robustness have been increased, and its behavior has been
made adaptive with respect to the video terminals used to open
the videos and the network transactions carried out to deliver them
to buyers. In fact, such an adaptive behavior enables the proposed
procedure to efficiently embed watermarks, and this characteristic
makes the procedure well suited to be exploited in web contexts,
where watermarks usually generated from fingerprinting codes have
to be inserted into the distributed videos "on the fly", i.e. during the
purchase web transactions.
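The basic spread spectrum embed-and-correlate scheme that the paper builds on can be sketched in a few lines (a toy one-dimensional version with hypothetical parameters, not the adaptive MPEG-2 procedure itself):

```python
import numpy as np

rng = np.random.default_rng(1)
host = rng.normal(0.0, 10.0, size=4096)      # stand-in for transform coefficients
key = rng.choice([-1.0, 1.0], size=4096)     # secret pseudo-random +/-1 sequence
alpha = 0.5                                  # embedding strength

# Additive spread spectrum embedding.
marked = host + alpha * key

def detect(signal, wm):
    """Normalized correlation of the signal with the secret sequence."""
    return float(signal @ wm) / len(wm)

score_marked = detect(marked, key)
score_clean = detect(host, key)
# The watermark shifts the correlation by exactly alpha, since key @ key == len(key);
# a threshold between 0 and alpha then separates marked from unmarked content.
```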
Abstract: In the last decade digital watermarking procedures have
become increasingly applied to implement the copyright protection
of multimedia digital contents distributed on the Internet. To this
end, it is worth noting that many watermarking procedures
for images and videos proposed in the literature are based on spread
spectrum techniques. However, some scepticism about the robustness
and security of such watermarking procedures has arisen because
of some documented attacks which claim to render the inserted
watermarks undetectable. On the other hand, web content providers
wish to exploit watermarking procedures characterized by flexible and
efficient implementations and which can be easily integrated in their
existing web services frameworks or platforms. This paper presents
how a simple spread spectrum watermarking procedure for MPEG-2
videos can be modified to be exploited in web contexts. To this end,
the proposed procedure has been made secure and robust against some
well-known and dangerous attacks. Furthermore, its basic scheme
has been optimized by making the insertion procedure adaptive with
respect to the terminals used to open the videos and the network transactions
carried out to deliver them to buyers. Finally, two different
implementations of the procedure have been developed: the former
is a high-performance parallel implementation, whereas the latter is
a portable Java and XML based implementation. Thus, the paper
demonstrates that a simple spread spectrum watermarking procedure,
with limited and appropriate modifications to the embedding scheme,
can still represent a valid alternative to many other well-known and
more recent watermarking procedures proposed in the literature.
Abstract: Relational databases are often used as a basis for persistent storage of ontologies to facilitate rapid operations such as search and retrieval, and to utilize the benefits of relational database management systems such as transaction management, security and integrity control. On the other hand, more and more OWL files containing ontologies appear. Therefore, this paper proposes to extract ontologies from OWL files and then store them in relational databases. A prerequisite for this storage is the transformation of ontologies into relational form, which is the purpose of this paper.
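The class-to-table flavour of such a transformation can be sketched as follows (the mapping rules and the tiny ontology are illustrative assumptions; a real scheme must also handle object properties, class hierarchies, and restrictions):

```python
# Hypothetical tiny ontology: classes with datatype properties and SQL types.
ontology = {
    "Person": {"hasName": "TEXT", "hasAge": "INTEGER"},
    "Company": {"hasName": "TEXT"},
}

def to_ddl(onto):
    """Map each class to a table and each datatype property to a column."""
    stmts = []
    for cls, props in sorted(onto.items()):
        cols = ", ".join(["id INTEGER PRIMARY KEY"] +
                         [f"{p} {t}" for p, t in sorted(props.items())])
        stmts.append(f"CREATE TABLE {cls} ({cols});")
    return stmts

ddl = to_ddl(ontology)
```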
Abstract: Sharing consistent and correct master data among
disparate applications in a reverse-logistics chain has long been
recognized as an intricate problem. Although a master data
management (MDM) system can surely assume that responsibility,
applications that need to co-operate with it must comply with
proprietary query interfaces provided by the specific MDM system. In
this paper, we present an RFID-ready MDM system which makes
master data readily available to any participating application in a
reverse-logistics chain. We propose an RFID-wrapper as part of our
MDM system. It acts as a gateway between data retrieval requests and
the query interfaces that process them. With the RFID-wrapper, any
participating application in a reverse-logistics chain can easily
retrieve master data in a way that is analogous to retrieval of any other
RFID-based logistics transactional data.
Abstract: This paper presents a watermarking protocol able to
solve the well-known "customer's right problem" and "unbinding
problem". In particular, the protocol has been purposely designed
to be adopted in a web context, where users wanting to buy digital
contents are usually neither provided with digital certificates issued
by certification authorities (CAs) nor able to autonomously perform
specific security actions. Furthermore, the protocol enables users to
keep their identities unexposed during web transactions as well as
allows guilty buyers, i.e. those responsible for distributing illegal
replicas, to be unambiguously identified. Finally, the protocol has
been designed so that web content providers (CPs) can exploit
copyright protection services supplied by web service providers (SPs)
in a security context. Thus, CPs can take advantage of complex
services without having to directly implement them.
Abstract: Mobile marketing through mobile messaging services
has shown impressive growth, as it enables e-business firms to
communicate with their customers effectively. Educational
institutions have hence started using this service to enhance communication
with their students. Previous studies, however, provide limited
understanding of the application of mobile messaging services in education.
This study proposes a theoretical model to understand the drivers of
students' intentions to use the university's mobile messaging service.
The model indicates that social influence, perceived control and
attitudes affect students' intention to use the university's mobile
messaging service. It also provides five antecedents of students'
attitudes: perceived utility (information utility, entertainment utility,
and social utility), innovativeness, information seeking, transaction
specificity (content specificity, sender specificity, and time
specificity) and privacy concern. The proposed model enables
universities to understand students' concerns about the use of a
mobile messaging service and to handle the service more
effectively. The paper discusses the model development and
concludes with limitations and implications of the proposed model.
Abstract: While financial institutions have faced difficulties
over the years for a multitude of reasons, the major cause of serious
banking problems continues to be directly related to lax credit
standards for borrowers and counterparties, poor portfolio risk
management, or a lack of attention to changes in economic or other
circumstances that can lead to a deterioration in the credit standing of
a bank's counterparties. Credit risk is most simply defined as the
potential that a bank borrower or counterparty will fail to meet its
obligations in accordance with agreed terms. The goal of credit risk
management is to maximize a bank's risk-adjusted rate of return by
maintaining credit risk exposure within acceptable parameters. Banks
need to manage the credit risk inherent in the entire portfolio as well
as the risk in individual credits or transactions. Banks should also
consider the relationships between credit risk and other risks. The
effective management of credit risk is a critical component of a
comprehensive approach to risk management and essential to the
long-term success of any banking organization. In this research we
also study the relationship between credit risk indices and borrowers'
timely payback in Karafarin Bank.
Abstract: This is a cross-cultural study that determines South
African multinational enterprises (MNEs) entry strategies as they
invest in Africa. An integrated theoretical framework comprising the
transaction cost theory, Uppsala model, eclectic paradigm and the
distance framework was adopted. A sample of 40 South African
MNEs with 415 existing FDI entries in Africa was drawn. Using an
ordered logistic regression model, the impact of culture on the choice
of degree of control by South African MNEs in Africa was
determined. Cultural distance was one of the significant factors that
influenced South African MNEs' choice of degree of control.
Furthermore, South African MNEs are risk averse in all countries in
Africa but minimize the risks differently across sectors. Service-sector
firms choose to own their subsidiaries 100% and avoid dealing
with locals, while manufacturing, resources and construction firms
choose to have a local partner to share the risk.
Abstract: In mobile environments, unspecified numbers of transactions
arrive in continuous streams. To prove the correctness of their
concurrent execution, a method of modelling an infinite number of
transactions is needed. Standard database techniques model fixed
finite schedules of transactions. Lately, techniques based on temporal
logic have been proposed as suitable for modelling infinite schedules.
The drawback of these techniques is that proving the basic
serializability correctness condition is impractical, as encoding (the
absence of) conflict cyclicity within large sets of transactions results
in prohibitively large temporal logic formulae. In this paper, we show
that, under certain common assumptions on the graph structure of
data items accessed by the transactions, conflict cyclicity need only
be checked within all possible pairs of transactions. This results in
formulae of considerably reduced size in any temporal-logic-based
approach to proving serializability, and scales to arbitrary numbers
of transactions.
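The pairwise check the paper arrives at mirrors the classical conflict-graph test for conflict serializability, sketched here over a finite toy schedule (the schedule and helper names are hypothetical; the paper's actual encoding is in temporal logic):

```python
# A schedule as ordered (transaction, operation, data item) triples.
schedule = [
    ("T1", "r", "x"), ("T2", "w", "x"),
    ("T2", "r", "y"), ("T1", "w", "y"),
]

def conflict_edges(sched):
    """Edge Ti -> Tj when an op of Ti precedes a conflicting op of Tj."""
    edges = set()
    for i, (ti, oi, xi) in enumerate(sched):
        for tj, oj, xj in sched[i + 1:]:
            if ti != tj and xi == xj and "w" in (oi, oj):
                edges.add((ti, tj))
    return edges

def has_cycle(edges):
    """DFS cycle check over the conflict graph."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
    def visit(node, stack):
        if node in stack:
            return True
        return any(visit(n, stack | {node}) for n in graph.get(node, ()))
    return any(visit(n, frozenset()) for n in graph)

# Conflict-serializable iff the conflict graph is acyclic.
serializable = not has_cycle(conflict_edges(schedule))
```

Note that every edge arises from a conflict between exactly two transactions, which is why, under suitable assumptions on the data-access structure, cyclicity can be ruled out by examining pairs.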
Abstract: The objective of this research is to calculate the
optimal inventory lot-sizing for each supplier and minimize the total
inventory cost which includes joint purchase cost of the products,
transaction cost for the suppliers, and holding cost for remaining
inventory. Genetic algorithms (GAs) are applied to the multi-product
and multi-period inventory lot-sizing problem with supplier
selection under storage space constraints. A maximum storage space
available to the decision maker in each period is considered. The decision maker
needs to determine what products to order in what quantities with
which suppliers in which periods. It is assumed that demand of
multiple products is known over a planning horizon. The problem is
formulated as a mixed integer program and is solved with the
GAs. The detailed computation results are presented.
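The objective against which a GA chromosome is scored can be sketched as follows (toy data and parameter names are hypothetical; the GA search loop itself is omitted). Infeasible plans, i.e. those exceeding storage space or failing to meet demand, receive infinite cost:

```python
# Toy instance: 2 products, 2 suppliers, 3 periods (hypothetical data).
products, suppliers, periods = ["p1", "p2"], ["s1", "s2"], [1, 2, 3]
demand = {("p1", 1): 40, ("p1", 2): 30, ("p1", 3): 50,
          ("p2", 1): 20, ("p2", 2): 60, ("p2", 3): 10}
price = {("p1", "s1"): 2.0, ("p1", "s2"): 2.2,
         ("p2", "s1"): 3.1, ("p2", "s2"): 2.9}
TRANSACTION = 10.0   # fixed cost per supplier ordered from in a period
HOLDING = 0.1        # cost per unit held at the end of a period
CAPACITY = 150       # storage space limit per period

def total_cost(plan):
    """Score a plan mapping (product, supplier, period) -> order quantity."""
    cost, stock = 0.0, {p: 0 for p in products}
    for t in periods:
        used = set()
        for p in products:
            for s in suppliers:
                q = plan.get((p, s, t), 0)
                if q > 0:
                    cost += q * price[(p, s)]
                    used.add(s)
                stock[p] += q
        cost += TRANSACTION * len(used)
        if sum(stock.values()) > CAPACITY:
            return float("inf")      # infeasible: storage space exceeded
        for p in products:
            stock[p] -= demand[(p, t)]
            if stock[p] < 0:
                return float("inf")  # infeasible: demand not met
        cost += HOLDING * sum(stock.values())
    return cost

# Baseline chromosome: order each period's demand from the cheaper supplier.
baseline = {(p, min(suppliers, key=lambda s: price[(p, s)]), t): demand[(p, t)]
            for p in products for t in periods}
base_cost = total_cost(baseline)
```

A GA would evolve a population of such plans, trading extra holding cost against fewer per-period transaction charges.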
Abstract: This paper presents the work we have accomplished in implementing a
Mobile Payment mechanism that enables customers to pay bills for
groceries and other purchased items in a store through the means of a
mobile phone, specifically a Smartphone. As the mode of transaction,
as far as communication between the customer's handset and the
merchant's POS is concerned, we have decided upon NFC (Near
Field Communication). This is because, for the most part,
Pakistani Smartphone users have handsets running the Android mobile
OS, which supports the aforementioned platform; iOS, on the other
hand, does not.
Abstract: Based on assumptions of neo-classical economics and
rational choice / public choice theory, this paper investigates the
regulation of industrial land use in Taiwan by homeowners
associations (HOAs) as opposed to traditional government
administration. The comparison, which applies the transaction cost
theory and a polynomial regression analysis, showed that HOAs
are superior to conventional government administration in terms of
transaction costs and overall efficiency. A case study comparing
Taiwan's commonhold industrial park, NangKang Software Park, to
traditional government counterparts was analyzed using limited data
on costs and returns. This empirical study on the relative
efficiency of governmental and private institutions justified the
important theoretical proposition. Numerical results prove the
efficiency of the established model.