Abstract: The Com-Poisson (Conway-Maxwell-Poisson) distribution can model count responses irrespective of their mean-variance relation, and its parameters, when fitted to simple cross-sectional data, can be efficiently estimated by the maximum likelihood (ML) method. In the regression setup, however, ML estimation of the parameters of a Com-Poisson-based generalized linear model is computationally intensive. In this paper, we propose a quasi-likelihood (QL) approach to estimate the effect of covariates on Com-Poisson counts and investigate the performance of this method relative to the ML method. QL estimates are consistent and almost as efficient as ML estimates. The simulation studies show that the efficiency loss from using the QL approach instead of the ML approach is negligible for all parameters, whereas the QL approach is computationally much less involved than the ML approach.
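For concreteness, the Com-Poisson pmf P(Y = y) ∝ λ^y / (y!)^ν can be evaluated as in the sketch below; truncating the normalizing series at a fixed number of terms is an illustrative choice, not the authors' estimation code:

```python
import math

def cmp_pmf(y, lam, nu, truncation=150):
    """Com-Poisson pmf: P(Y = y) = lam**y / (y!)**nu / Z(lam, nu).

    Z is an infinite series; it is truncated here after `truncation`
    terms, accumulated via the term ratio to avoid factorial overflow.
    """
    z, term = 0.0, 1.0
    for j in range(truncation):
        z += term
        term *= lam / (j + 1) ** nu  # term_{j+1} = term_j * lam / ((j+1)**nu)
    return lam ** y / math.factorial(y) ** nu / z
```

With ν = 1 this reduces to the ordinary Poisson distribution; ν > 1 gives under-dispersion and ν < 1 over-dispersion relative to Poisson, which is the flexibility the abstract refers to.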
Abstract: In medical therapy, lasers are widely used in cosmetic, tumor, and other treatments. During laser irradiation, excessive exposure may cause thermal damage, so the establishment of a complete thermal analysis model provides clinically useful reference data for physicians. In this study, porcine liver was used in place of human tissue and subjected to laser irradiation to generate experimental data on the surface thermal field and the thermally damaged region under different conditions of power, laser irradiation time, and distance between the laser and the porcine liver. During the experiments, the surface temperature distribution of the porcine liver was measured with an infrared thermal imager. In the simulation, the Pennes bioheat transfer equation was solved with SYSWELD, a software package normally applied to welding processes. A double-ellipsoid function is considered for the first time as the laser source term in predicting the surface thermal field and internal tissue damage. The simulation results are compared with the experimental data to validate the mathematical model established herein.
Abstract: This article deals with the three-dimensional numerical simulation of a floor heating convector. The presented convector can operate in two modes, a cooling mode and a heating mode. This initial numerical simulation focuses on the cooling mode of the convector. Models with different fin temperatures are compared, and three different fin shapes are examined as well. The objective of the work is to predict the air flow and heat transfer inside the convector for further optimization of these devices. The commercial software Ansys Fluent was used for the numerical simulation.
Abstract: Program slicing is the task of finding all statements in a program that directly or indirectly influence the value of a variable occurrence. The set of statements that can affect the value of a variable at some point in a program is called a program slice. In several software engineering applications, such as program debugging and measuring program cohesion and parallelism, several slices are computed at different program points. In this paper, algorithms are introduced to compute all backward and forward static slices of a computer program by traversing the program representation graph once. The program representation graph used in this paper is the Program Dependence Graph (PDG). We have conducted an experimental comparison study using 25 software modules to show the effectiveness of the introduced algorithm for computing all backward static slices over single-point slicing approaches when computing the parallelism and functional cohesion of program modules. The effectiveness of the algorithm is measured in terms of execution time and the number of traversed PDG edges. The results of the comparison study indicate that the introduced algorithm considerably reduces the slicing time and effort required to measure module parallelism and functional cohesion.
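For a single slicing criterion, a backward static slice over a PDG is reverse reachability along dependence edges. The adjacency-map representation below is a hypothetical sketch of that single-point baseline, not the paper's one-traversal algorithm for all slices:

```python
from collections import deque

def backward_slice(pdg, criterion):
    """Backward slice: all nodes the criterion transitively depends on.

    pdg maps each node to the set of its data/control dependence
    predecessors (the nodes it depends on).
    """
    seen = {criterion}
    work = deque([criterion])
    while work:
        node = work.popleft()
        for dep in pdg.get(node, ()):
            if dep not in seen:
                seen.add(dep)
                work.append(dep)
    return seen
```

Computing one slice per program point this way repeats graph traversals; the paper's contribution is precisely to amortize this into a single traversal of the PDG.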
Abstract: In this paper, a solution is presented for a robotic manipulation problem in industrial settings: sensing objects on a conveyor belt, identifying the target, and planning and tracking an interception trajectory between the end effector and the target. Such a problem can be formulated as a combination of object recognition, tracking, and interception. For this purpose, we integrated a vision system into the manipulation system and employed tracking algorithms. The control approach is implemented on a real industrial manipulation setup, which consists of a conveyor belt, objects moving on it, a robotic manipulator, and a visual sensor above the conveyor. The trajectory for robotic interception at a rendezvous point on the conveyor belt is calculated analytically. Test results show that tracking the target along this trajectory results in interception and grasping of the target object.
Abstract: Selecting, from a set of target language words, the word translation that conveys the correct sense of the source word and produces more fluent target language output is one of the core problems in machine translation. In this paper we compare three methods of estimating word translation probabilities for selecting the translation word in Thai-English machine translation: (1) a method based on the frequency of the word translation, (2) a method based on the collocation of the word translation, and (3) a method based on the Expectation-Maximization (EM) algorithm. For evaluation, we used Thai-English parallel sentences generated by NECTEC. The method based on the EM algorithm outperforms the other methods and gives satisfying results.
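EM estimation of word translation probabilities can be sketched in the style of IBM Model 1, where probabilities are re-estimated from expected alignment counts. The uniform initialization and the toy sentence pairs in the test are illustrative assumptions, not NECTEC data or the paper's exact model:

```python
from collections import defaultdict

def em_translation_probs(pairs, iterations=10):
    """Estimate t(target_word | source_word) from parallel sentence pairs.

    pairs: list of (source_words, target_words) tuples.
    Starts from a uniform table (IBM Model 1 style) and alternates
    E-step (expected alignment counts) and M-step (renormalization).
    """
    t = defaultdict(lambda: 1.0)  # uniform initialization
    for _ in range(iterations):
        count = defaultdict(float)   # expected counts for (tgt, src)
        total = defaultdict(float)   # normalizer per source word
        for src, tgt in pairs:
            for tw in tgt:
                norm = sum(t[(tw, sw)] for sw in src)
                for sw in src:
                    frac = t[(tw, sw)] / norm
                    count[(tw, sw)] += frac
                    total[sw] += frac
        t = defaultdict(float, {k: v / total[k[1]] for k, v in count.items()})
    return t
```

On the classic two-sentence example below, the shared context pushes probability mass toward the correct pairing even without any dictionary.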
Abstract: Clustering techniques have received attention in many areas, including engineering, medicine, biology, and data mining. The purpose of clustering is to group together data points that are close to one another. The K-means algorithm is one of the most widely used clustering techniques. However, K-means has two shortcomings: it depends on the initial state and converges to local optima, and global solutions of large problems cannot be found with a reasonable amount of computational effort. Many studies in clustering have addressed the local optima problem. This paper presents an efficient hybrid evolutionary optimization algorithm based on combining Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO), called PSO-ACO, for optimally clustering N objects into K clusters. The new PSO-ACO algorithm is tested on several data sets, and its performance is compared with that of ACO, PSO, and K-means clustering. The simulation results show that the proposed evolutionary optimization algorithm is robust and suitable for handling data clustering.
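As a point of reference, the K-means baseline that PSO-ACO is compared against can be sketched as follows; random initialization from the data points is an illustrative choice and is exactly the initial-state sensitivity the hybrid aims to overcome:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain K-means on a list of coordinate tuples.

    Alternates assignment of points to their nearest center with
    recomputation of each center as its cluster mean.
    """
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # sensitive to this initial choice
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        centers = [tuple(sum(col) / len(cl) for col in zip(*cl)) if cl
                   else centers[i]  # keep old center if cluster emptied
                   for i, cl in enumerate(clusters)]
    return centers, clusters
```

Evolutionary hybrids such as PSO-ACO search over many candidate center configurations instead of refining a single one, which is how they escape the local optima this loop converges to.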
Abstract: An iterative algorithm is proposed and tested on Cournot game models. It is based on the convergence of sequential best responses and the use of a genetic algorithm to determine each player's best response to a given strategy profile of its opponents. An extra outer loop is used to address the problem of finite accuracy, which is inherent in genetic algorithms, since the set of feasible values in such an algorithm is finite. The algorithm is tested on five Cournot models: three with a convergent sequence of best replies, one with a divergent sequence of best replies, and one with "local NE traps" [14], where classical local search algorithms fail to identify the Nash equilibrium. After a series of simulations, we conclude that the proposed algorithm converges to the Nash equilibrium, to any level of accuracy needed, in all cases except the one where the sequential best replies process diverges.
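The sequential best-response process the algorithm builds on can be illustrated on a linear Cournot duopoly, where the best reply is available in closed form. The demand and cost parameters below are illustrative, and the paper uses a genetic algorithm rather than this analytic reply:

```python
def best_response(q_other, a=100.0, c=10.0):
    """Best reply in a linear Cournot duopoly.

    Inverse demand P = a - (q_i + q_other), constant marginal cost c.
    Maximizing profit (a - q_i - q_other - c) * q_i over q_i >= 0
    gives q_i = (a - c - q_other) / 2.
    """
    return max(0.0, (a - c - q_other) / 2)

def sequential_best_replies(q0=(0.0, 0.0), steps=100):
    """Iterate alternating best responses from an initial profile q0."""
    q1, q2 = q0
    for _ in range(steps):
        q1 = best_response(q2)
        q2 = best_response(q1)
    return q1, q2
```

For these parameters the symmetric Nash equilibrium is q* = (a - c) / 3 = 30, and each update halves the deviation from it, so the sequence converges; the paper's genetic-algorithm inner loop replaces the closed-form reply in models where no analytic best response exists.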
Abstract: This paper presents the stabilization potential of Class F pond ash (PA) from a coal-fired thermal power station on tropical peat soil. Peat, or highly organic soil, is well known for its high compressibility, high natural moisture content, low shear strength, and long-term settlement. This study investigates the effect of different amounts (i.e., 5, 10, 15, and 20%) of PA on peat soil collected from Sarawak, Malaysia, focusing on its compaction and unconfined compressive strength (UCS) properties. The amounts of PA were added to the peat soil sample as a percentage of the dry peat soil mass. With increasing PA content, the maximum dry density (MDD) of the peat soil increases, while the optimum moisture content (OMC) decreases. The UCS value of the peat soil increases significantly with increasing PA content and with curing period. This improvement in the compressive strength of tropical peat soil indicates that PA has the potential to be used as a stabilizer for tropical peat soil. In addition, the use of PA in soil stabilization helps reduce ash pond volume, contributing to environmentally friendly and sustainable development of natural resources.
Abstract: The purpose of this study was to investigate the effectiveness of a recreational workout program for adults with disabilities over two semesters. The investigation was an action study conducted in a naturalistic setting. Participants included equal numbers of adults with severe cognitive impairments (n = 35) and adults without disabilities (n = 35). Over the two semesters, the adults with severe cognitive impairments were trained in six self-initiated workout activities by the adults without disabilities. The number of task-analyzed steps of each activity performed correctly by each participant in the first and last weeks of each semester was used for data analysis. Results of paired t-tests indicate that, across the two semesters, significant differences between the first and last weeks were found on 4 of the 6 task-analyzed workout activities at a significance level of p < .05. The recreational workout program developed in this study was effective.
Abstract: With the advent of emerging personal computing paradigms such as ubiquitous and mobile computing, Web contents are becoming accessible from a wide range of mobile devices. Since these devices do not have the same rendering capabilities, Web contents need to be adapted for transparent access from a variety of client agents. Such content adaptation results in better rendering and faster delivery to the client device. Nevertheless, Web content adaptation sets new challenges for semantic markup. This paper presents an advanced components platform, called MorfeoSMC, enabling the development of mobility applications and services according to a channel model based on Service-Oriented Architecture (SOA) principles. It then goes on to describe the potential for integration with the Semantic Web through a novel framework for the external semantic annotation of mobile Web contents. The role of semantic annotation in this framework is to describe the contents of the individual documents themselves, ensuring the preservation of their semantics while the content rendering is adapted, and to exploit these semantic annotations in a novel user profile-aware content adaptation process. Semantic Web content adaptation is a way of adding value to Web contents and facilitating their repurposing (enhanced browsing, Web Services location and access, etc.).
Abstract: The lack of an inherent "natural" dissimilarity measure between objects in a categorical dataset presents special difficulties for clustering analysis. However, each categorical attribute of a given dataset provides natural probability and information in the sense of Shannon. In this paper, we propose a novel method that heuristically converts categorical attributes to numerical values by exploiting this associated information. We conduct an experimental study with real-life categorical datasets, and the experiments demonstrate the effectiveness of our approach.
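One simple instance of such a conversion, not necessarily the authors' exact mapping, replaces each category by its Shannon information content derived from the empirical frequencies of the attribute's values:

```python
import math
from collections import Counter

def information_encode(column):
    """Map each categorical value v to -log2 p(v).

    p(v) is the empirical frequency of v in the column, so rare
    categories receive larger numerical values (more information).
    """
    freq = Counter(column)
    n = len(column)
    return [-math.log2(freq[v] / n) for v in column]
```

The resulting numbers are on a common information scale across attributes, which makes ordinary numerical distance measures applicable to the encoded data.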
Abstract: This research investigates the frequency and profitability of index arbitrage opportunities involving the SET50 futures, the SET50 component stocks, and the ThaiDEX SET50 ETF (ticker symbol: TDEX). In particular, the frequency and profit of arbitrage are measured in the following three arbitrage tests: (1) SET50 futures vs. ThaiDEX SET50 ETF, (2) SET50 futures vs. SET50 component stocks, and (3) ThaiDEX SET50 ETF vs. SET50 component stocks. For tests (2) and (3), the problems involve conic optimization and quadratic programming as subproblems. This research is the first to apply conic optimization and quadratic programming techniques in the context of index arbitrage, and the first to investigate such index arbitrage in the Thai equity and derivatives markets. The contribution of this study is therefore twofold. First, its results help clarify the contribution of derivative securities to the efficiency of the Thai markets. Second, the methodology employed in this study can be applied to other geographical markets with minor adjustments.
Abstract: This paper discusses the site selection process for biological soil conservation planning, supported by a value-focused approach and spatial multi-criteria evaluation techniques. A first set of spatial criteria was used to design a number of potential sites. Next, a new set of spatial and non-spatial criteria was employed, including natural factors and financial costs, to determine the degree of suitability of the Bonkuh watershed for biological soil conservation planning and to recommend the most acceptable program. The whole process was facilitated by a new software tool that supports spatial multiple criteria evaluation (SMCE) in the GIS software ILWIS. The application of this tool, combined with continual feedback from the public, has provided an effective methodology for solving complex decision problems in biological soil conservation planning.
Abstract: Reversible logic is becoming more and more prominent as technology sets higher demands on heat, power, scaling, and stability. Reversible gates are able at any time to "undo" the current step or function. Multiple-valued logic has the advantage of transporting and evaluating more bits per clock cycle than binary logic. In this paper, we demonstrate that by combining these disciplines we can construct powerful multiple-valued reversible logic structures. In particular, a reversible block implemented with pseudo floating-gate technology can perform an AD function, with a DA function as its reverse application.
Abstract: An alternative to the use of the Discrete Fourier Transform (DFT) for Magnetic Resonance Imaging (MRI) reconstruction is the use of parametric modeling techniques. This method is suitable for problems in which the image can be modeled by explicit known source functions with a few adjustable parameters. Despite the success reported in the use of modeling as an alternative MRI reconstruction technique, two important problems pose challenges to the applicability of this method: estimation of the model order and determination of the model coefficients. In this paper, five of the suggested methods of evaluating the model order have been assessed: the Final Prediction Error (FPE), the Akaike Information Criterion (AIC), the Residual Variance (RV), the Minimum Description Length (MDL), and the Hannan-Quinn (HNQ) criterion. These criteria were evaluated on MRI data sets using the Transient Error Reconstruction Algorithm (TERA). The result for each criterion is compared to the result obtained with a fixed-order technique, and three measures of similarity were evaluated. The results obtained show that the use of MDL gives the highest measure of similarity to the result produced by the fixed-order technique.
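For two of the criteria compared, the order-selection scores can be sketched as follows under a Gaussian-error assumption, where rss is the residual sum of squares of a model with k parameters fitted to n observations; the exact variants used with TERA may differ:

```python
import math

def aic(n, k, rss):
    """Akaike Information Criterion (Gaussian-error form).

    Goodness-of-fit term n*log(rss/n) plus a penalty of 2 per parameter.
    """
    return n * math.log(rss / n) + 2 * k

def mdl(n, k, rss):
    """Minimum Description Length criterion.

    Same fit term, but a heavier penalty of log(n) per parameter,
    so MDL favors lower model orders than AIC for n > e**2.
    """
    return n * math.log(rss / n) + k * math.log(n)
```

In model order selection, each candidate order k is fitted, its rss recorded, and the order minimizing the criterion is chosen; MDL's stronger penalty is one reason it tends to pick more parsimonious (often more stable) orders than AIC.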
Abstract: Inspired by recent experiments [1]-[3] indicating an unusual doubly magic nucleus 24O, which lies just at the neutron drip line, and encouraged by the success of our relativistic mean-field (RMF) plus state-dependent BCS approach for describing the ground state properties of drip-line nuclei [23]-[27], we have further employed this approach, across the entire periodic table, to explore unusual shell closures in exotic nuclei. In our RMF+BCS approach, the single-particle continuum corresponding to the RMF is replaced by a set of discrete positive-energy states for the calculation of the pairing energy. Detailed analysis of the single-particle spectra, pairing energies, and densities of the nuclei predicts unusual proton shell closures at Z = 6, 14, 16, and 34, and unusual neutron shell closures at N = 6, 14, 16, 34, 40, 70, and 112.
Abstract: We depend upon explanation in order to "make sense" of our world, and making sense is all the more important when dealing with change. But what happens if our explanations are wrong? This question is examined with respect to two types of explanatory model. Models based on labels and categories we refer to as "representations." More complex models involving stories, multiple algorithms, rules of thumb, questions, and ambiguity we refer to as "compressions." Both compressions and representations are reductions, but representations are far more reductive than compressions. Representations can be treated as a set of defined meanings: coherence with regard to a representation is the degree of fidelity between the item in question and the definition of the representation, of the label. By contrast, compressions contain enough degrees of freedom and ambiguity to allow us to make internal predictions so that we may determine our potential actions in the possibility space. Compressions are explanatory via mechanism; representations are explanatory via category. Managers often confuse their evocation of a representation (category inclusion) with the creation of a context of compression (description of mechanism). When this type of explanatory error occurs, more errors follow. In the drive for efficiency, such substitutions are all too often proclaimed, at the manager's peril.
Abstract: A new automatic system for the recognition and reconstruction of rescaled and/or rotated partially occluded objects is presented. The objects to be recognized are described by 2D views, and each view is occluded by several half-planes. The whole object views and their visible parts (linear cuts) are then stored in a database. To establish whether a region R of an input image represents a possibly occluded object, the system generates a set of linear cuts of R and compares them with the elements in the database. Each linear cut of R is associated with the most similar linear cut in the database. R is recognized as an instance of the object O if the majority of the linear cuts of R are associated with linear cuts of views of O. In the case of recognition, the system reconstructs the occluded part of R and determines the scale factor and the orientation in the image plane of the recognized object view. The system has been tested on two different datasets of objects, showing good performance in terms of both recognition and reconstruction accuracy.
Abstract: Resource discovery is one of the chief services of a grid. This article proposes a new approach to discovering resources in a grid through learning automata. The objective of this resource-discovery service is to select a resource based on the user's application and on economic criteria, that is, to choose a resource that can accomplish the user's tasks in the most economical manner. The service operates in two phases. First, we propose an application-based categorization by means of a neural network: the user sets his or her application as the input vector of the network, and the output vector describes the suitability of each resource for the presented job. Second, the most economical of the options put forward in the first phase that can fulfill the task in question is selected. This resource choice is carried out by the presented algorithm based on learning automata.
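The learning-automaton component can be illustrated with the standard linear reward-inaction (L_RI) update, a common scheme for this kind of selection problem; the article does not specify its exact reinforcement scheme, so this is an assumption:

```python
def lri_update(probs, chosen, reward, a=0.1):
    """Linear reward-inaction (L_RI) update of action probabilities.

    On reward: move probability mass toward the chosen action with
    learning rate a. On penalty: leave the probabilities unchanged
    (the "inaction" in reward-inaction).
    """
    if not reward:
        return list(probs)
    return [p + a * (1.0 - p) if i == chosen else p * (1.0 - a)
            for i, p in enumerate(probs)]
```

Applied to resource discovery, each action corresponds to a candidate resource; resources that repeatedly complete the user's tasks economically accumulate probability and are selected more often.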