Abstract: In this paper, algorithms for the automatic localisation
of two anatomical soft tissue landmarks of the head, the medial
canthus (inner corner of the eye) and the tragus (a small, pointed,
cartilaginous flap of the ear), in CT images are described. These
landmarks are to be used as a basis for an automated image-to-patient
registration system we are developing. The landmarks are localised
on a surface model extracted from CT images, based on surface
curvature and a rule based system that incorporates prior knowledge
of the landmark characteristics. The approach was tested on a dataset
of near isotropic CT images of 95 patients. The position of the
automatically localised landmarks was compared to the position of
the manually localised landmarks. The average difference was 1.5
mm and 0.8 mm for the medial canthus and tragus, with a maximum
difference of 4.5 mm and 2.6 mm, respectively. The medial canthus
and tragus can be automatically localised in CT images, with
performance comparable to manual localisation.
Abstract: In many applications, it is a priori known that the
target function should satisfy certain constraints imposed by, for
example, economic theory or a human decision maker. Here we
consider partially monotone problems, where the target variable
depends monotonically on some of the predictor variables but not all.
We propose an approach to build partially monotone models based
on the convolution of monotone neural networks and kernel
functions. The results from simulations and a real case study on
house pricing show that our approach has significantly better
performance than partially monotone linear models. Furthermore, the
incorporation of partial monotonicity constraints not only leads to
models that are in accordance with the decision maker's expertise,
but also reduces considerably the model variance in comparison to
standard neural networks with weight decay.
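The core constraint described above, that the output be non-decreasing in selected inputs, can be enforced in an ordinary feed-forward network by keeping all weights on the monotone paths positive. The sketch below is a plain NumPy illustration of that basic idea, not the authors' convolution of monotone networks and kernel functions; the sizes and seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-input, one-hidden-layer network whose output must be
# non-decreasing in x_mono while x_free stays unconstrained.  The
# monotone-path weights are passed through exp(.), keeping them
# strictly positive.
w_mono = rng.normal(size=4)   # raw weights on the monotone input
w_free = rng.normal(size=4)   # weights on the unconstrained input
b = rng.normal(size=4)        # hidden biases
v = rng.normal(size=4)        # raw output-layer weights

def forward(x_mono, x_free):
    # tanh is non-decreasing and exp(w_mono), exp(v) > 0, so the
    # output is non-decreasing in x_mono for any parameter values.
    h = np.tanh(np.exp(w_mono) * x_mono + w_free * x_free + b)
    return float(h @ np.exp(v))

# Numerical check along a grid at fixed x_free.
xs = np.linspace(-2.0, 2.0, 50)
ys = [forward(x, 0.3) for x in xs]
```

Because every weight from x_mono to the output is positive, monotonicity holds for any random initialisation; the paper's kernel-based construction adds flexibility on top of this basic mechanism.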
Abstract: In this paper, we propose an adaptation of the Patricia tree for sparse datasets to generate non-redundant association rules. Using this adaptation, we can generate frequent closed itemsets, which are more compact than the frequent itemsets used in the Apriori approach. This adaptation has been evaluated experimentally on a set of benchmark datasets.
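The compactness claim rests on the definition of a closed itemset: a frequent itemset none of whose proper supersets has the same support. A brute-force illustration (not the Patricia-tree method itself; the toy transactions and threshold are invented) makes the reduction visible:

```python
from itertools import combinations

# Brute-force illustration of why closed itemsets are more compact
# than plain frequent itemsets.
transactions = [
    {"a", "b", "c"},
    {"a", "b"},
    {"a", "c"},
    {"a", "b", "c"},
]
min_support = 2

def support(itemset):
    # Number of transactions containing every item of the itemset.
    return sum(itemset <= t for t in transactions)

items = sorted(set().union(*transactions))
candidates = [frozenset(c) for n in range(1, len(items) + 1)
              for c in combinations(items, n)]
frequent = {s: support(s) for s in candidates if support(s) >= min_support}

# A frequent itemset is closed iff no proper superset has equal support.
closed = {s: sup for s, sup in frequent.items()
          if not any(s < t and frequent[t] == sup for t in frequent)}
```

Here seven frequent itemsets collapse to four closed ones, yet the closed sets still determine the support of every frequent itemset.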
Abstract: In this paper, the optimum weight and cost of a laminated composite plate are sought while it undergoes the heaviest load prior to complete failure. Various failure criteria are defined for such structures in the literature; in this work, the Tsai-Hill theory is used as the failure criterion. The analysis is based on the Classical Lamination Theory (CLT). A new type of Genetic Algorithm (GA), operating directly on real variables, was employed as the optimization technique. Yet, since optimization via GAs is a long process in which most of the time is consumed by the analysis, Radial Basis Function Neural Networks (RBFNN) were employed to predict the output of the analysis. Thus, the optimization is carried out in a hybrid neuro-GA environment, and the procedure continues until a predicted optimum solution is achieved.
Abstract: Today, the advantage of biotechnology over other technologies, especially in environmental issues, is indisputable. Kimia Gharb Gostar Industries Company, the largest producer of citric acid in the Middle East, applies biotechnology to this end. Citrogypsum is a by-product of citric acid production and is considered a significant residue of this company. In this paper, a summary of citric acid production and the conditions of Citrogypsum production in the company are introduced, in addition to a definition of Citrogypsum and its applications worldwide. Based on this information and an evaluation of present conditions regarding Iran's need for Citrogypsum, the best priority was introduced, with emphasis on strategy selection and proper programming for self-sufficiency. The Delphi technique was used to elicit expert opinions about criteria for evaluating the usages. The criteria identified by the experts were profitability, production capacity, degree of investment, marketability, ease of production and production time. The Analytical Hierarchy Process (AHP) and Expert Choice software were used to compare the alternatives on the criteria derived from the Delphi process.
Abstract: Various formal and informal brand alliances are being formed in professional service firms. A professional service corporate brand depends heavily on the brands of the professional employees who comprise the firm, and professional employee brands in turn depend on the corporate brand. Prior work provides limited scientific evidence of brand alliance effects in the professional service area – i.e., how professional service corporate-employee brand allies are affected by an alliance, what the brand attitude effects are after alliance formation, and how these effects vary with different strengths of an ally. Scientific literature analysis and theoretical modeling are the main methods of the current study. As a result, a theoretical model is constructed for estimating spillover effects of professional service corporate-employee brand alliances and for comparison among different professional service firm expertise practice models – from the "brains" to the "procedure" model. The resulting theoretical model lays the basis for future experimental studies.
Abstract: To provide a better understanding of the fair share policies supported by current production schedulers and their impact on scheduling performance, a relative fair share policy supported in four well-known production job schedulers is evaluated in this study. The experimental results show that fair share indeed prevents heavy-demand users from dominating the system resources. However, the detailed per-user performance analysis shows that some types of users may suffer unfairness under fair share, possibly due to the priority mechanisms used by current production schedulers. These users typically are not heavy-demand users, but they have a mixture of jobs that do not spread out.
Abstract: Among various HLM techniques, the Multivariate Hierarchical Linear Model (MHLM) is desirable to use, particularly when multivariate criterion variables are collected and the covariance structure carries information valuable for data analysis. In order to reflect prior information or to obtain stable results when the sample size and the number of groups are not sufficiently large, the Bayes method has often been employed in hierarchical data analysis. In these cases, although the Markov Chain Monte Carlo (MCMC) method is a rather powerful tool for parameter estimation, MCMC procedures have not been formulated for the MHLM. For this reason, this research presents concrete procedures for parameter estimation through the use of Gibbs samplers. Lastly, several future topics for the use of the MCMC approach for HLM are discussed.
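For intuition, a Gibbs sampler for hierarchical models alternates draws from the full conditional of each block of parameters. The sketch below is a heavily simplified univariate special case, not the MHLM procedure of the abstract: one criterion variable, known variances, a flat prior on the grand mean, and simulated data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy two-level model: y_ij ~ N(theta_j, sigma^2), theta_j ~ N(mu, tau^2),
# with sigma and tau known and a flat prior on mu.  All data simulated.
J, n = 8, 30                     # groups, observations per group
mu_true, tau, sigma = 5.0, 1.0, 2.0
theta_true = rng.normal(mu_true, tau, size=J)
y = rng.normal(theta_true[:, None], sigma, size=(J, n))
ybar = y.mean(axis=1)

theta, mu = ybar.copy(), ybar.mean()
mu_draws = []
for it in range(2000):
    # theta_j | mu : precision-weighted combination of ybar_j and mu.
    prec = n / sigma**2 + 1 / tau**2
    mean = (n * ybar / sigma**2 + mu / tau**2) / prec
    theta = rng.normal(mean, np.sqrt(1 / prec))
    # mu | theta : normal around the mean of the group effects.
    mu = rng.normal(theta.mean(), tau / np.sqrt(J))
    if it >= 500:                # discard burn-in
        mu_draws.append(mu)

mu_hat = float(np.mean(mu_draws))
```

The multivariate case replaces these scalar normal conditionals with matrix-normal and inverse-Wishart draws, but the alternating structure is the same.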
Abstract: The Safety Health and Environment Code of Practice (SHE
COP) was developed to help road transportation operators manage
their operations in a systematic and safe manner. A study was conducted
to determine the effectiveness of SHE COP implementation during
non-OPS period. The objective of the study is to evaluate the
implementation of SHE COP among bus operators during wee-hour
operations. The data were collected by completing a set of checklists
after observing activities before departure, during the trip, and
upon arrival. The results show that there are seven widely practiced
SHE COP elements. 22% of the buses had an average speed exceeding
the maximum permissible speed on highways (90 km/h), with
13% of the buses travelling at speeds of more than 100 km/h.
The statistical analysis shows only one significant association,
which relates speeding to the prior presence of enforcement
officers.
Abstract: The main aim of this study was to examine whether
people understand indicative conditionals on the basis of syntactic
factors or on the basis of subjective conditional probability. The
second aim was to investigate whether the conditional probability of
q given p depends on the antecedent and consequent sizes or derives
from inductive processes leading to the establishment of a link of
plausible co-occurrence between semantically or experientially
associated events.
These competing hypotheses have been tested through a 3 x 2 x 2 x 2
mixed design involving the manipulation of four variables: type of
instructions ("Consider the following statement to be true", "Read the
following statement" and a condition with no conditional statement);
antecedent size (high/low); consequent size (high/low); statement
probability (high/low). The first variable was between-subjects, the
others were within-subjects. The inferences investigated were Modus
Ponens and Modus Tollens. Ninety undergraduates of the Second
University of Naples, without any prior knowledge of logic or
conditional reasoning, participated in this study.
Results suggest that people understand conditionals in a syntactic
way rather than in a probabilistic way, even though the perception of
the conditional probability of q given p is at least partially involved in
the comprehension of conditionals. They also showed that, in the
presence of a conditional syllogism, inferences are not affected by
the antecedent or consequent sizes. From a theoretical point of view,
these findings suggest that it would be inappropriate to abandon the
idea that conditionals are naturally understood in a syntactic way in
favour of the idea that they are understood in a probabilistic way.
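The two inference forms tested, Modus Ponens and Modus Tollens, are exactly the patterns that come out valid under the material (syntactic) reading of the conditional; a small truth-table check makes this concrete (the probabilistic reading studied in the abstract is not modelled here):

```python
from itertools import product

# Truth-table check under the material conditional: p -> q is false
# only when p is true and q is false.
def implies(p, q):
    return (not p) or q

rows = list(product([False, True], repeat=2))

# Modus Ponens: from (p -> q) and p, infer q.
mp_valid = all(q for p, q in rows if implies(p, q) and p)
# Modus Tollens: from (p -> q) and not-q, infer not-p.
mt_valid = all(not p for p, q in rows if implies(p, q) and not q)
# Affirming the consequent (a fallacy): from (p -> q) and q, infer p.
ac_valid = all(p for p, q in rows if implies(p, q) and q)
```

Only the first two patterns hold in every row, which is the sense in which a purely syntactic reasoner should endorse MP and MT while rejecting the fallacy.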
Abstract: In this paper, we study FPGA implementation of a
novel supra-optimal receiver diversity combining technique,
generalized maximal ratio combining (GMRC), for wireless
transmission over fading channels in SIMO systems. Prior
published results using ML-detected GMRC diversity signal
driven by BPSK showed superior bit error rate performance to
the widely used MRC combining scheme in an imperfect
channel estimation (ICE) environment. Under perfect channel
estimation conditions, the performance of GMRC and MRC
were identical. The main drawback of the GMRC study was
that it was theoretical; thus, a successful FPGA implementation
using pipeline techniques is needed as a wireless
communication test-bed for practical real-life situations.
Simulation results showed that the hardware implementation
was efficient both in terms of speed and area. Since diversity
combining is especially effective in small femto- and picocells,
internet-associated wireless peripheral systems stand to
benefit most from GMRC. As a result, many spinoff
applications can be made to the hardware of IP-based 4th
generation networks.
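For reference, the baseline MRC scheme that GMRC generalises weights each receive branch by the conjugate of its channel estimate before summing. A minimal single-symbol NumPy sketch follows; the branch count, per-branch SNR and seed are arbitrary, and GMRC itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Maximal ratio combining (MRC) for one BPSK symbol over L receive
# branches: weight each branch by the conjugate of its (here,
# perfectly known) channel gain, sum, and slice the real part.
L = 4
snr_db = 10.0
noise_std = 10 ** (-snr_db / 20)
symbol = -1.0                          # transmitted BPSK symbol

h = (rng.normal(size=L) + 1j * rng.normal(size=L)) / np.sqrt(2)       # Rayleigh gains
noise = noise_std * (rng.normal(size=L) + 1j * rng.normal(size=L)) / np.sqrt(2)
r = h * symbol + noise                 # received sample on each branch

combined = np.sum(np.conj(h) * r)      # MRC combiner output
decision = 1.0 if combined.real >= 0 else -1.0
```

Under perfect channel estimation the combiner output SNR is the sum of the branch SNRs, which is consistent with the abstract's observation that GMRC and MRC coincide in that regime.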
Abstract: This paper presents a new method which applies an
artificial bee colony algorithm (ABC) for capacitor placement in
distribution systems with an objective of improving the voltage profile
and reduction of power loss. The ABC algorithm is a new population-based
metaheuristic approach inspired by the intelligent foraging behavior
of honeybee swarms. The advantage of the ABC algorithm is that
it does not require external parameters such as the crossover rate and
mutation rate used in genetic algorithms and differential evolution,
which are hard to determine beforehand. The other
advantage is that the global search ability of the algorithm is implemented
by introducing a neighborhood source production mechanism,
which is similar to the mutation process. To demonstrate the validity
of the proposed algorithm, computer simulations are carried out on a
69-bus system and the results are compared with the other approaches
available in the literature. The proposed method has outperformed the
other methods in terms of the quality of solution and computational
efficiency.
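A stripped-down version of the ABC loop, greedy neighbourhood moves plus the scout rule that abandons exhausted food sources, can be sketched as follows. The sphere function stands in for the paper's loss-and-voltage objective, the employed and onlooker phases are collapsed into one greedy pass for brevity, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal artificial bee colony (ABC) sketch minimising a toy objective.
def f(x):
    return float(np.sum(x**2))          # sphere function stand-in

dim, n_food, limit, iters = 2, 10, 20, 200
lo, hi = -5.0, 5.0
food = rng.uniform(lo, hi, size=(n_food, dim))
fit = np.array([f(x) for x in food])
trials = np.zeros(n_food, dtype=int)
best_x, best_f = None, np.inf           # global best survives scout resets

def neighbour(i):
    # Neighbourhood source production: perturb one dimension of source i
    # toward or away from a randomly chosen partner k.
    k = rng.choice([j for j in range(n_food) if j != i])
    d = rng.integers(dim)
    x = food[i].copy()
    x[d] += rng.uniform(-1, 1) * (food[i, d] - food[k, d])
    return np.clip(x, lo, hi)

for _ in range(iters):
    for i in range(n_food):             # greedy employed/onlooker pass
        cand = neighbour(i)
        if f(cand) < fit[i]:
            food[i], fit[i], trials[i] = cand, f(cand), 0
        else:
            trials[i] += 1
    for i in np.where(trials > limit)[0]:   # scout phase: abandon source
        food[i] = rng.uniform(lo, hi, size=dim)
        fit[i], trials[i] = f(food[i]), 0
    i = int(np.argmin(fit))
    if fit[i] < best_f:
        best_f, best_x = float(fit[i]), food[i].copy()
```

The neighbourhood move is the mechanism the abstract compares to mutation: its step size shrinks automatically as the food sources converge, so no external mutation rate is needed.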
Abstract: In this paper, multi-processor job shop scheduling problems are solved by a heuristic algorithm based on a hybrid of priority dispatching rules within an ant colony optimization algorithm. The objective is to minimize the makespan, i.e. the total completion time, while the simultaneous presence of various kinds of pheromones is allowed. By using a suitable hybrid of priority dispatching rules, the process of finding the best solution is improved. The ant colony optimization algorithm not only strengthens the proposed algorithm, but also decreases the total working time by reducing setup times and modifying the production line, so that similar work shares the same production lines. Another advantage of this algorithm is that similar (but not identical) machines can be considered, so these machines are able to process a job with different processing and setup times. To evaluate this capability of the algorithm, a number of test problems are solved and the associated results analyzed. The results show a significant decrease in throughput time. They also show that this algorithm is able to recognize the bottleneck machine and to schedule jobs in an efficient way.
Abstract: Reliability Centered Maintenance (RCM) is one of the
most widely used methods in modern power systems to schedule a
maintenance cycle and determine the priority of inspection. In order
to apply the RCM method to the Smart Grid, a precedence study of
the new structure of the rearranged system should be performed,
owing to the introduction of additional installations such as renewable
and sustainable energy resources, energy storage devices and advanced
metering infrastructure. This paper proposes a new method to
evaluate the priority of maintenance and inspection of the power
system facilities in the Smart Grid using the Risk Priority Number. In
order to calculate this risk index, the reliability block diagram of
the Smart Grid system must be analyzed. Finally,
the feasible technical method is discussed to estimate the risk
potential as part of the RCM procedure.
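The Risk Priority Number used above is conventionally the product of severity, occurrence and detection ratings, as in FMEA practice. A minimal sketch follows; the facility names and ratings are invented for illustration and are not taken from the paper.

```python
# Risk Priority Number as used in FMEA-style analyses:
# RPN = severity * occurrence * detection, each rated on a 1-10 scale.
facilities = {
    "transformer":     (8, 4, 3),   # (severity, occurrence, detection)
    "storage_battery": (6, 5, 4),
    "smart_meter":     (3, 6, 2),
}

rpn = {name: s * o * d for name, (s, o, d) in facilities.items()}
ranking = sorted(rpn, key=rpn.get, reverse=True)   # inspect highest risk first
```

Ranking facilities by RPN yields the maintenance and inspection priority order the abstract describes.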
Abstract: This paper aims to clarify the relationship between individual investor types, risk tolerance and herding bias. A questionnaire survey was conducted to collect 389 valid, voluntary responses from individual investors and to examine how risk tolerance acts as a mediator between four personality types and herding bias. Drawing on the BB&K model and the prior psychology literature, a linear structural model is constructed and used to evaluate the path of herding formation through Structural Equation Modeling (SEM). The results show that more impetuous investors are directly prone to herding bias and exhibit higher risk tolerance. However, risk tolerance fully mediates between the level of confidence (i.e., confident or anxious) and herding bias, but does not mediate the method of action (careful or impetuous) for individual investors.
Abstract: This research studied waste recycled by the Recyclable Material Bank Project at 4 universities in the central region of Thailand, evaluating the reduction of greenhouse gas emissions compared with landfilling, from July 2012 to June 2013. The results showed that the projects collected a total of about 911,984.80 kilograms of recyclable waste. Office paper made up the largest share (50.68% of total recycled waste). Groups of recycled waste can be prioritized from high to low amount as paper, plastic, glass, mixed recyclables, and metal, respectively. The project reduced greenhouse gas emissions by the equivalent of about 2814.969 metric tons of carbon dioxide. The recycled waste contributing most to the reduction of greenhouse gas emissions is office paper, accounting for 70.16% of the total reduction. By amount of reduced greenhouse gas emissions, groups of recycled waste can be prioritized from high to low as paper, plastic, metals, mixed recyclables, and glass, respectively.
Abstract: Segmentation of Magnetic Resonance Imaging (MRI) images is one of the most challenging problems in medical imaging. This paper compares the performance of Seed-Based Region Growing (SBRG), the Adaptive Network-Based Fuzzy Inference System (ANFIS) and Fuzzy c-Means (FCM) in segmenting brain abnormalities. Controlled experimental data are used, designed so that prior knowledge of the size of the abnormalities is available. This is done by cutting abnormalities of various sizes and pasting them onto normal brain tissue. The normal tissues, or background, are divided into three different categories, and the segmentation is performed on fifty-seven images for each category. The known abnormality sizes, in numbers of pixels, are then compared with the segmentation results of the three techniques. ANFIS returned the best segmentation performance for light abnormalities, whereas SBRG performed best for dark abnormalities.
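Of the three techniques compared, SBRG is the simplest to sketch: starting from a seed pixel, the region absorbs neighbouring pixels whose intensity stays within a tolerance of the seed value. A minimal 4-connected version on a toy image follows; the image values, seed and tolerance are invented for illustration.

```python
from collections import deque

import numpy as np

# Minimal seed-based region growing (SBRG): grow a 4-connected region
# from a seed, accepting neighbours within a fixed intensity tolerance.
img = np.array([
    [10, 10, 10, 80],
    [10, 90, 85, 80],
    [10, 88, 92, 10],
    [10, 10, 10, 10],
])

def region_grow(img, seed, tol=15):
    h, w = img.shape
    ref = int(img[seed])               # seed intensity as the reference
    region = {seed}
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and (ny, nx) not in region
                    and abs(int(img[ny, nx]) - ref) <= tol):
                region.add((ny, nx))
                queue.append((ny, nx))
    return region

lesion = region_grow(img, seed=(1, 1))   # bright "abnormality" pixels
```

The pixel count of the grown region is exactly the quantity the study compares against the known pasted-abnormality size.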
Abstract: As the use of registration packages spreads, the number of the aligned image pairs in image databases (either by manual or automatic methods) increases dramatically. These image pairs can serve as a set of training data. Correspondingly, the images that are to be registered serve as testing data. In this paper, a novel medical image registration method is proposed which is based on the a priori knowledge of the expected joint intensity distribution estimated from pre-aligned training images. The goal of the registration is to find the optimal transformation such that the distance between the observed joint intensity distribution obtained from the testing image pair and the expected joint intensity distribution obtained from the corresponding training image pair is minimized. The distance is measured using the divergence measure based on Tsallis entropy. Experimental results show that, compared with the widely-used Shannon mutual information as well as Tsallis mutual information, the proposed method is computationally more efficient without sacrificing registration accuracy.
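The quantity minimised during registration is a Tsallis-entropy-based divergence between the observed and expected joint intensity distributions. The sketch below shows the standard Tsallis relative entropy on discrete distributions with positive bins; the toy distributions and the choice q = 1.5 are illustrative, and the paper's exact divergence form may differ.

```python
import numpy as np

# Tsallis relative entropy between two discrete distributions with
# strictly positive bins:
#   D_q(p || r) = (sum_i p_i^q * r_i^(1-q) - 1) / (q - 1),
# which recovers the Kullback-Leibler divergence as q -> 1.
def tsallis_divergence(p, r, q=1.5):
    p = np.asarray(p, dtype=float)
    r = np.asarray(r, dtype=float)
    p, r = p / p.sum(), r / r.sum()        # normalise to distributions
    return float((np.sum(p**q * r**(1 - q)) - 1) / (q - 1))

# Toy stand-ins for the observed vs. expected joint intensity histograms.
p = [0.5, 0.3, 0.2]
r = [0.4, 0.4, 0.2]
```

In the registration loop this value would be minimised over candidate transformations; identical distributions give zero divergence.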
Abstract: In data mining, association rules are used to find
associations between the different items in a transaction
database. As data are collected and stored, valuable rules can be found
through association rules, which can be applied to help managers
execute marketing strategies and establish sound market frameworks.
This paper aims to use Fuzzy Frequent Pattern growth (FFP-growth)
to derive fuzzy association rules. First, we apply fuzzy
partition methods and determine a membership function over the
quantitative values of each transaction item. Next, we implement FFP-growth
to deal with the process of data mining. In addition, in order to
understand the impact of Apriori algorithm and FFP-growth algorithm
on the execution time and the number of generated association
rules, the experiment will be performed by using different sizes of
databases and thresholds. Lastly, the experimental results show that
the FFP-growth algorithm is more efficient than other existing methods.
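The fuzzy-partition step described above maps each quantitative item value to membership degrees in linguistic sets before mining begins. A common choice is triangular membership functions; the sketch below uses invented breakpoints for a low/medium/high partition of a purchase quantity.

```python
# Triangular membership functions partitioning a quantitative item
# (e.g. purchase quantity) into "low"/"medium"/"high" fuzzy sets, as a
# pre-processing step before fuzzy FP-growth style mining.
def triangular(x, a, b, c):
    # Membership rises linearly on [a, b] and falls on [b, c].
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(qty):
    # Breakpoints are illustrative, not from the paper.
    return {
        "low":    triangular(qty, -1, 0, 5),
        "medium": triangular(qty, 2, 6, 10),
        "high":   triangular(qty, 7, 12, 100),
    }
```

A quantity of 4, for example, belongs partly to "low" and partly to "medium"; these degrees, rather than raw counts, feed the fuzzy support computation.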
Abstract: The main issue of interest here is whether individuals
who differ in arithmetical reasoning ability and levels of imagery ability display different brain activity during the conduct of mental
arithmetical reasoning tasks. This was a case study of four
participants (Ps) who represented four extreme combinations of
maths-imagery abilities: i.e., low-low, high-high, high-low and
low-high respectively. As the Ps performed a series of 60 arithmetical
reasoning tasks, 128-channel EEG recordings were taken and the
pre-response interval subsequently analysed using EGI Geosource™
software. The P who was high in both imagery and maths ability
showed peak activity prior to response in BA7 (superior parietal
cortex), but the other Ps did not show peak activity in this region. The
results are considered in terms of the diverse routes that may be employed by individuals during the conduct of arithmetical reasoning
tasks and the possible implications of this for mathematics education.