Abstract: A brain MR imaging-based clinical research and analysis system was built specifically for large-scale data, and generally available clinical data were used to assemble the large-scale dataset. A region-growing algorithm was used to select the lesion ROI, and a mesh-warp algorithm was implemented for registration; matching errors could be corrected individually to improve accuracy. Large amounts of ROI research data can be accumulated with our compression method, which supports sound decision criteria for the research results. The experimental groups, defined by age, sex, MR type, patient ID, and smoking status, can easily be queried. The resulting data were visualized as overlapped images using a color table, and the statistics were calculated with a statistical package. The utility of the system was evaluated on chronic ischemic damage in patients with acute cerebral infarction; the damage associated with the neurologic disability index was located in the region facing the central portion of the lateral ventricle, corresponding to the corona radiata. Finally, system reliability was measured by both inter-user and intra-user registration correlation.
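As a rough illustration of the ROI-selection step, the sketch below implements a minimal intensity-based region-growing pass on a 2D slice; the seed point, tolerance, and 4-connectivity are illustrative assumptions, not settings taken from the paper.

```python
# Minimal region-growing sketch for lesion ROI selection (illustrative only;
# seed, tolerance, and connectivity are assumptions, not the paper's settings).
import numpy as np
from collections import deque

def region_grow(image, seed, tol=10.0):
    """Grow a region from `seed`, adding 4-connected pixels whose
    intensity stays within `tol` of the seed intensity."""
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    seed_val = float(image[seed])
    queue = deque([seed])
    mask[seed] = True
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                if abs(float(image[ny, nx]) - seed_val) <= tol:
                    mask[ny, nx] = True
                    queue.append((ny, nx))
    return mask
```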
Abstract: The evolutionary design of electronic circuits, or
evolvable hardware, is a discipline that allows the user to
automatically obtain the desired circuit design. The circuit
configuration is under the control of evolutionary algorithms. Several
researchers have used evolvable hardware to design electrical
circuits. Whenever a particular algorithm is selected to carry out the evolution, all of its parameters, such as mutation rate, population size, and selection mechanism, must be tuned to achieve the best results during the evolution process. This paper investigates the ability of an evolution strategy to evolve digital logic circuits based on programmable logic array structures when different mutation rates are used. Several mutation rates (fixed and variable) are analyzed and compared with each other to identify the most appropriate choice for use during the evolution of combinational logic circuits. The experimental results outlined in this paper are important, as they can serve any researcher who needs to use evolutionary algorithms to design digital logic circuits.
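To make the setup concrete, the following sketch shows a (1 + lambda) evolution strategy over a binary circuit-configuration string, with either a fixed mutation rate or a simple decaying variable rate; the encoding, fitness function, and rate schedule are illustrative assumptions, not the ones evaluated in the paper.

```python
# (1 + lambda) evolution strategy sketch over a binary configuration string;
# the fitness function, encoding, and variable-rate schedule are assumptions.
import random

def evolve(fitness, n_bits, lam=4, gens=200, rate=None):
    """If `rate` is None, use a variable mutation rate decaying from 0.1
    toward 1/n_bits; otherwise use the fixed rate given."""
    parent = [random.randint(0, 1) for _ in range(n_bits)]
    best = fitness(parent)
    for g in range(gens):
        p = rate if rate is not None else max(1.0 / n_bits, 0.1 * (1 - g / gens))
        for _ in range(lam):
            child = [b ^ (random.random() < p) for b in parent]  # bit-flip mutation
            f = fitness(child)
            if f >= best:              # (1 + lambda): keep the best individual
                parent, best = child, f
    return parent, best

# Toy fitness: number of bits matching a target configuration.
target = [random.randint(0, 1) for _ in range(64)]
print(evolve(lambda c: sum(a == b for a, b in zip(c, target)), 64, rate=0.02))
```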
Abstract: The Ant Colony Optimization (ACO) is a metaheuristic inspired by the behavior of real ants in their search for the shortest paths to food sources. It has recently attracted a lot of attention and has been successfully applied to a number of different optimization problems. Due to the importance of the feature selection problem and the potential of ACO, this paper presents a novel method that utilizes the ACO algorithm to implement a feature subset search procedure. Initial results obtained using the classification of speech segments are very promising.
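A minimal sketch of how ACO can drive a feature-subset search is given below; the pheromone update rule, parameter values, and the evaluation function (assumed to return a nonnegative score such as classification accuracy) are illustrative assumptions.

```python
# Minimal ACO feature-subset search sketch; pheromone update rule, parameter
# values, and the evaluation function are illustrative assumptions.
import random

def aco_select(n_features, evaluate, n_ants=10, n_iter=30,
               subset_size=5, rho=0.1):
    tau = [1.0] * n_features                      # pheromone per feature
    best_subset, best_score = None, float("-inf")
    for _ in range(n_iter):
        for _ in range(n_ants):
            # Each ant picks features with probability proportional to pheromone.
            weights = tau[:]
            subset = []
            for _ in range(subset_size):
                total = sum(weights)
                r, acc = random.uniform(0, total), 0.0
                for j, wj in enumerate(weights):
                    acc += wj
                    if r <= acc:
                        subset.append(j)
                        weights[j] = 0.0          # sample without replacement
                        break
            score = evaluate(subset)              # assumed nonnegative score
            if score > best_score:
                best_subset, best_score = subset, score
        tau = [(1 - rho) * t for t in tau]        # evaporation
        for j in best_subset:                     # reinforce the best subset
            tau[j] += best_score
    return best_subset, best_score
```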
Abstract: Shippers are concentrating on their core competencies to stay competitive and outsourcing logistics activities to third parties that are experts in this field. This third-party logistics (3PL) is drawing due attention at the government, industry, academic, and practitioner levels. If the logistics cost in India can be brought down from the current level of 13% of GDP to 9% (the level in the U.S.), the savings would be approximately Rs 3 lakh crore per annum. The problem for shippers, however, is selecting a suitable 3PL provider. Various criteria for the selection of a 3PL have been listed in the literature and are discussed in the present literature review. Every shipper will select the criteria suited to its own requirements, and these have to be reviewed from time to time to fit the ever-changing environment.
Abstract: Management is required to understand all information security risks within an organization and to decide which risks should be treated, to what level, and at what cost. However, such decision-making is not usually easy, because various measures for risk treatment must be selected together with suitable application levels. In addition, some measures may have objectives that conflict with each other, which makes the selection even more difficult. Therefore, this paper provides a model that supports the selection of measures by applying multi-objective analysis to find an optimal solution. Additionally, a list of measures is provided to make the selection easier and more effective, without omitting any measures.
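As a small illustration of the multi-objective flavor of the problem, the sketch below enumerates combinations of candidate measures and keeps the Pareto-optimal ones under two conflicting objectives, total cost versus residual risk; the measures and numbers are invented for the example.

```python
# Sketch: enumerate measure combinations and keep the Pareto-optimal ones
# under two conflicting objectives (total cost vs. residual risk).
# The measures, costs, and risk-reduction values are made-up examples.
from itertools import combinations

measures = {"firewall": (10, 0.30), "training": (5, 0.15),
            "encryption": (8, 0.25), "audit": (6, 0.10)}  # (cost, risk cut)

def objectives(subset):
    cost = sum(measures[m][0] for m in subset)
    residual = max(0.0, 1.0 - sum(measures[m][1] for m in subset))
    return cost, residual

candidates = [objectives(s) for r in range(len(measures) + 1)
              for s in combinations(measures, r)]
# Keep points not dominated by any other point (lower is better on both axes).
pareto = [c for c in candidates
          if not any(o[0] <= c[0] and o[1] <= c[1] and o != c
                     for o in candidates)]
print(sorted(set(pareto)))
```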
Abstract: Grid scheduling is the process of mapping grid jobs to resources over multiple administrative domains. Traditionally, application-level schedulers have been tightly integrated with the application itself and were not easily applied to other applications. This paper presents a generic design that decouples the scheduler core (the search procedure) from the application-specific components (e.g. application performance models) and platform-specific components (e.g. collection of resource information) used by the search procedure. In this decoupled approach, the application details are not revealed completely to the broker; instead, the customer gives the application to the resource provider for execution. Apart from scheduling, the resource selection can be performed independently in order to achieve scalability.
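A minimal sketch of such a decoupled design is shown below: the scheduler core depends only on two narrow interfaces, one application-specific and one platform-specific. All class and method names here are hypothetical.

```python
# Sketch of the decoupled design: the scheduler core depends only on two
# narrow interfaces; class and method names are hypothetical.
from abc import ABC, abstractmethod

class PerformanceModel(ABC):            # application-specific component
    @abstractmethod
    def predict_runtime(self, job, resource): ...

class ResourceInfo(ABC):                # platform-specific component
    @abstractmethod
    def available_resources(self): ...

class SchedulerCore:
    """Generic search procedure: knows nothing about the application
    or the platform beyond the two interfaces above."""
    def __init__(self, model: PerformanceModel, info: ResourceInfo):
        self.model, self.info = model, info

    def schedule(self, jobs):
        plan = {}
        for job in jobs:  # greedy: pick the resource with lowest predicted runtime
            plan[job] = min(self.info.available_resources(),
                            key=lambda r: self.model.predict_runtime(job, r))
        return plan
```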
Abstract: Location selection is one of the most important decision-making processes, requiring the consideration of several criteria based on the mission and the strategy. The objective of this study is to provide a decision support model to help a bank select the most appropriate location for a branch, considering a case study in Turkey. The bank's objective is to select the most appropriate city for opening a branch among six alternatives in south-eastern Turkey. The model in this study consists of five main criteria (Demographic, Socio-Economic, Sectoral Employment, Banking, and Trade Potential) and twenty-one sub-criteria that represent the bank's mission and strategy. Because of the multi-criteria structure of the problem and the fuzziness in the comparisons of the criteria, fuzzy AHP is used, and the TOPSIS method is used to rank the alternatives.
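For the ranking step, a sketch of crisp TOPSIS is given below; the paper derives criterion weights with fuzzy AHP, so the weight vector and decision matrix here are made-up inputs.

```python
# Crisp TOPSIS ranking sketch (the paper derives the weights with fuzzy AHP;
# here the weight vector and decision matrix are made-up inputs).
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j] True if larger is better."""
    norm = matrix / np.linalg.norm(matrix, axis=0)        # vector normalization
    v = norm * weights                                    # weighted matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)             # distance to ideal
    d_neg = np.linalg.norm(v - anti, axis=1)              # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                        # closeness coefficient

scores = topsis(np.array([[0.7, 120.0, 3.0],
                          [0.9,  90.0, 5.0],
                          [0.6, 150.0, 4.0]]),
                weights=np.array([0.5, 0.3, 0.2]),
                benefit=np.array([True, False, True]))
print(scores.argsort()[::-1])   # ranking, best alternative first
```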
Abstract: An evolutionary method whose selection and recombination
operations are based on generalization error-bounds of
support vector machine (SVM) can select a subset of potentially informative genes for an SVM classifier very efficiently [7]. In this
paper, we will use the derivative of error-bound (first-order criteria)
to select and recombine gene features in the evolutionary process,
and compare the performance of the derivative of error-bound with
the error-bound itself (zero-order) in the evolutionary process. We
also investigate several error-bounds and their derivatives to compare
the performance, and find the best criteria for gene selection
and classification. We use 7 cancer-related human gene expression
datasets to evaluate the performance of the zero-order and first-order
criteria of error-bounds. Though both criteria follow the same strategy in theory, the experimental results identify the best criterion for microarray gene expression data.
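As a much-simplified illustration of the zero-order/first-order contrast, the sketch below uses the linear-SVM quantity ||w||^2 in place of a full error bound: the zero-order route re-evaluates the quantity for each candidate subset, while the first-order route ranks genes by a w_j^2 sensitivity term (the idea underlying SVM-RFE). This is not the paper's actual bounds or derivatives.

```python
# Much-simplified contrast between a zero-order and a first-order criterion.
# For a linear SVM, ||w||^2 stands in for the error bound, and the sensitivity
# of ||w||^2 to feature j reduces to a w_j^2 term (the SVM-RFE idea).
import numpy as np
from sklearn.svm import SVC

def zero_order(X, y, subset):
    """Evaluate the criterion itself on a candidate gene subset."""
    clf = SVC(kernel="linear", C=1.0).fit(X[:, subset], y)
    w = clf.coef_.ravel()
    return float(w @ w)                    # smaller ||w||^2 = larger margin

def first_order(X, y):
    """Rank all genes at once by the derivative-style criterion w_j^2."""
    clf = SVC(kernel="linear", C=1.0).fit(X, y)
    return np.argsort(clf.coef_.ravel() ** 2)[::-1]   # most informative first
```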
Abstract: Texture information plays an increasingly important role in remotely sensed imagery classification and many pattern recognition applications. However, the selection of relevant textural features to improve classification accuracy is not a straightforward task. This work investigates the effectiveness of two Mutual Information Feature Selector (MIFS) algorithms in selecting salient textural features that contain highly discriminatory information for multispectral imagery classification. The input candidate features are extracted from a SPOT High Resolution Visible (HRV) image using the Wavelet Transform (WT) at levels l = 1, 2. The experimental results show that the textural features selected by the MIFS algorithms contribute more to improving classification accuracy than classical approaches such as Principal Components Analysis (PCA) and Linear Discriminant Analysis (LDA).
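A greedy, Battiti-style MIFS step can be sketched as follows: each new feature maximizes its mutual information with the class minus a redundancy penalty against already-selected features. The discretization and the beta value are assumptions.

```python
# Greedy MIFS sketch (Battiti-style): select the feature maximizing
# I(f; C) - beta * sum of I(f; s) over already-selected features s.
# Features are discretized first; beta and the binning are assumptions.
import numpy as np
from sklearn.metrics import mutual_info_score

def mifs(X, y, k=5, beta=0.5, bins=16):
    Xd = np.stack([np.digitize(c, np.histogram_bin_edges(c, bins))
                   for c in X.T])                     # discretized features
    relevance = np.array([mutual_info_score(f, y) for f in Xd])
    selected = [int(relevance.argmax())]              # most relevant feature first
    while len(selected) < k:
        scores = [relevance[j]
                  - beta * sum(mutual_info_score(Xd[j], Xd[s]) for s in selected)
                  if j not in selected else -np.inf
                  for j in range(Xd.shape[0])]
        selected.append(int(np.argmax(scores)))
    return selected
```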
Abstract: This research was conducted in cooperation with the Emergency Department of a medical center in Taiwan. A predictive model of the triage system is constructed, covering the construction procedure, the selection of parameters, and sample screening. 2,000 patient records are chosen randomly by computer. After applying three data mining classification methods (Multi-group Discriminant Analysis, Multinomial Logistic Regression, and Back-propagation Neural Networks), it is found that back-propagation neural networks can best distinguish the patients' degree of emergency, with an accuracy rate as high as 95.1%. The back-propagation neural network with the highest accuracy rate is built into the triage acuity expert system in this research. With data mining, the predictive model of the triage acuity expert system can be updated regularly, both to improve the system and for educational training, and is not affected by subjective factors.
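A minimal back-propagation classifier sketch in the spirit of the described model is shown below; the synthetic data stands in for the patient records, and the network size is an assumption, not the paper's configuration.

```python
# Back-propagation neural network sketch for triage classification;
# the synthetic 5-class data stands in for patient records, and the
# network size is an assumption, not the paper's configuration.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Placeholder for 2,000 records with a 5-level acuity label.
X, y = make_classification(n_samples=2000, n_features=10, n_informative=6,
                           n_classes=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=0)
model = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                      random_state=0).fit(X_train, y_train)
print("accuracy:", model.score(X_test, y_test))
```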
Abstract: In this paper the Analytic Network Process (ANP) is
applied to the selection of photovoltaic (PV) solar power projects.
These projects follow a long management and execution process
from plant site selection to plant start-up. As a consequence, there are
many risks of time delays and even of project stoppage.
In the case study presented in this paper, a top manager of an important Spanish company operating in the power market has to decide which of four alternative PV projects to invest in, based on risk minimization. The manager identified 50 project execution delay and/or stoppage risks.
The influences among elements of the network (groups of risks
and alternatives) were identified and analyzed using the ANP
multicriteria decision analysis method. After analyzing the results, the main conclusion is that the network model can manage all the information of the real-world problem, and it is therefore a decision analysis model recommended by the authors. The strengths and weaknesses of ANP as a multicriteria decision analysis tool are also described in the paper.
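For reference, ANP priorities come from the limit of the weighted, column-stochastic supermatrix; the sketch below raises a toy supermatrix to successive powers until its columns converge (the matrix is a made-up example, not the case-study network).

```python
# ANP limit-supermatrix sketch: raise the weighted (column-stochastic)
# supermatrix to successive powers until it converges; the final columns
# give the limiting priorities. The matrix here is a toy example.
import numpy as np

def limit_supermatrix(W, tol=1e-9, max_iter=1000):
    M = W.copy()
    for _ in range(max_iter):
        N = M @ W                      # next power of W
        if np.abs(N - M).max() < tol:
            return N
        M = N
    return M

W = np.array([[0.0, 0.5, 0.3],         # each column sums to 1
              [0.6, 0.0, 0.7],
              [0.4, 0.5, 0.0]])
print(limit_supermatrix(W)[:, 0])      # limiting priorities
```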
Abstract: Pattern matching based on regular tree grammars has been widely used in many areas of computer science. In this paper, we propose a pattern matcher within the framework of code generation, based on a generic and formalized approach. According to this approach, parsers for regular tree grammars are adapted to a general pattern matching solution, rather than adapting the pattern matching to their parsing behavior. Hence, we first formalize the construction of the pattern matches with respect to input trees drawn from a regular tree grammar, in the form of so-called match trees. Then, we adopt a recently developed generic parser and tightly couple its parsing behavior with this construction. In addition to its generality, the resulting pattern matcher is characterized by its soundness and efficient implementation, as demonstrated by the proposed theory and by the algorithms derived for its implementation. A comparison with similar and well-known approaches, such as those based on tree automata and LR parsers, has shown that our pattern matcher can be applied to a broader class of grammars and achieves a better approximation of pattern matches in one pass. Furthermore, its use as a machine code selector incurs minimal overhead, due to the balanced distribution of the cost computations into static ones, performed at parser generation time, and dynamic ones, performed at parsing time.
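For intuition only, a naive matcher over pattern trees with wildcard leaves is sketched below; the grammar-driven, one-pass construction formalized in the paper is considerably more involved than this traversal.

```python
# Naive tree pattern matcher for intuition only: a pattern with wildcard
# leaves ('_') is tried at every node of a subject tree. The paper's
# grammar-driven one-pass construction is considerably more involved.
def matches(pattern, tree):
    if pattern == "_":                               # wildcard leaf
        return True
    if isinstance(pattern, str) or isinstance(tree, str):
        return pattern == tree
    op_p, *kids_p = pattern
    op_t, *kids_t = tree
    return (op_p == op_t and len(kids_p) == len(kids_t)
            and all(matches(p, t) for p, t in zip(kids_p, kids_t)))

def match_sites(pattern, tree):
    """All subtrees of `tree` where `pattern` matches (naive traversal)."""
    hits = [tree] if matches(pattern, tree) else []
    if not isinstance(tree, str):
        for child in tree[1:]:
            hits += match_sites(pattern, child)
    return hits

# e.g. find the add sites in an expression tree
expr = ("add", ("mul", "a", "b"), ("add", "c", ("const", "1")))
print(match_sites(("add", "_", "_"), expr))
```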
Abstract: Network coding has recently attracted attention as an efficient technique for multicast/broadcast services. The problem of finding the optimal network coding mechanism that maximizes bandwidth efficiency is hard to solve and hard to approximate. Many network coding-based schemes have been proposed in the literature to improve bandwidth efficiency, especially network coding-based automatic repeat request (NCARQ) schemes. However, existing schemes have several limitations that cause performance degradation in resource-limited systems. To improve performance in resource-limited systems, we propose an NCARQ with overlapping selection (OS-NCARQ) scheme. The advantages of the OS-NCARQ scheme over the traditional ARQ scheme and existing NCARQ schemes are shown through analysis and simulations.
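The core idea underlying NCARQ schemes can be illustrated in a few lines; this is the generic XOR-retransmission idea, not the OS-NCARQ scheme itself.

```python
# Toy illustration of the core NCARQ idea (not the OS-NCARQ scheme itself):
# receiver 1 lost packet A, receiver 2 lost packet B; a single broadcast of
# A XOR B lets each receiver recover its missing packet.
def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

A, B = b"packet-A", b"packet-B"
coded = xor(A, B)                  # one retransmission instead of two

recovered_at_rx1 = xor(coded, B)   # rx1 already has B, recovers A
recovered_at_rx2 = xor(coded, A)   # rx2 already has A, recovers B
assert recovered_at_rx1 == A and recovered_at_rx2 == B
```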
Abstract: Relay-based communication has gained considerable importance in recent years. In this paper we derive the end-to-end statistics of a two-hop non-regenerative relay branch, each hop being Nakagami-m faded. Closed-form expressions for the probability density functions of the signal envelope at the output of a selection combiner and a maximal ratio combiner at the destination node are also derived, and the analytical formulations are verified through computer simulation. These density functions are useful in evaluating the system performance in terms of bit error rate and outage probability.
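A Monte Carlo cross-check of such closed-form results can be sketched as follows; the standard amplify-and-forward end-to-end SNR model and all parameter values below are assumptions, not the paper's settings.

```python
# Monte Carlo sketch of a dual-branch relay system: each branch is a two-hop
# non-regenerative (amplify-and-forward) link with Nakagami-m fading per hop.
# The end-to-end SNR model and all parameter values are assumptions.
import numpy as np

rng = np.random.default_rng(0)
m, omega, n = 2.0, 10.0, 1_000_000        # Nakagami shape, mean SNR, samples

def hop_snr():
    # The squared Nakagami-m envelope (i.e., SNR) is Gamma(m, omega/m).
    return rng.gamma(shape=m, scale=omega / m, size=n)

def branch_snr():
    g1, g2 = hop_snr(), hop_snr()
    return g1 * g2 / (g1 + g2 + 1.0)      # standard AF end-to-end SNR

b1, b2 = branch_snr(), branch_snr()
gamma_sc  = np.maximum(b1, b2)            # selection combining
gamma_mrc = b1 + b2                       # maximal ratio combining
thr = 1.0                                 # outage threshold
print("P_out SC :", np.mean(gamma_sc  < thr))
print("P_out MRC:", np.mean(gamma_mrc < thr))
```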
Abstract: This paper presents the design trade-offs and performance impact of the number of pipeline phases of control-path signals in a wormhole-switched network-on-chip (NoC). The number of pipeline phases of the control path varies between one and two cycles. The control paths consist of the routing request paths for output selection and the arbitration paths for input selection. Data communications between on-chip routers are implemented synchronously and, for quality of service, the inter-router data transports are controlled using link-level congestion control to avoid loss of data due to overflow. The trade-off between the area (logic cell area) and the performance (bandwidth gain) of two proposed NoC router microarchitectures is presented in this paper. The performance evaluation uses a traffic scenario with different numbers of workloads on a 2D mesh NoC topology with a static routing algorithm. Using a 130-nm CMOS standard-cell technology, our NoC routers can be clocked at 1 GHz, resulting in a high-speed network link and a high router bandwidth capacity of about 320 Gbit/s. Based on our experiments, the number of control-path pipeline stages has a more significant impact on NoC performance than on the logic area of the NoC router.
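As a back-of-the-envelope check of the quoted capacity: the 320 Gbit/s figure is consistent with, for example, a five-port 2D-mesh router (north/south/east/west plus local) with 64-bit links; the port count and link width are assumptions, since the abstract only states the 1 GHz clock and the total capacity.

```python
# Back-of-the-envelope check of the quoted router bandwidth. Five ports and
# 64-bit links are assumptions; the abstract states only 1 GHz and ~320 Gbit/s.
ports, link_width_bits, clock_hz = 5, 64, 1e9
print(ports * link_width_bits * clock_hz / 1e9, "Gbit/s")   # -> 320.0 Gbit/s
```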
Abstract: Throughout this paper, a relatively new technique, the Tabu search variable selection model, is elaborated, showing how it can be efficiently applied within the financial world whenever researchers come across the selection of a subset of variables from a whole set of descriptive variables under analysis. In the field of financial prediction, researchers often have to select a subset of variables from a larger set to solve different types of problems, such as corporate bankruptcy prediction, personal bankruptcy prediction, mortgage and credit scoring, and the Arbitrage Pricing Model (APM). Consequently, to demonstrate how the method operates and to illustrate its usefulness as well as its superiority over other commonly used methods, the Tabu search algorithm for variable selection is compared to two main alternative search procedures, namely stepwise regression and the maximum R² improvement method. The Tabu search is then implemented in finance, where it attempts to predict corporate bankruptcy by selecting the most appropriate financial ratios, thus creating its own prediction score equation. Compared to other methods, most notably the Altman Z-Score model, the Tabu search model produces a higher success rate in correctly predicting the failure of firms or the continued operation of existing entities.
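A minimal Tabu search loop for variable selection looks roughly as follows; the neighborhood (flip one variable in or out), the tenure, and the evaluation function are illustrative assumptions.

```python
# Minimal Tabu search sketch for variable selection: the neighborhood flips
# one variable, recently flipped variables are tabu for `tenure` iterations,
# and a better-than-best move overrides the tabu status (aspiration).
import random

def tabu_select(n_vars, evaluate, iters=100, tenure=7):
    current = [random.random() < 0.5 for _ in range(n_vars)]
    best, best_score = current[:], evaluate(current)
    tabu = {}                                  # var index -> expiry iteration
    for it in range(iters):
        candidates = []
        for j in range(n_vars):
            neighbor = current[:]
            neighbor[j] = not neighbor[j]      # flip variable j in/out
            score = evaluate(neighbor)
            allowed = tabu.get(j, -1) < it or score > best_score  # aspiration
            if allowed:
                candidates.append((score, j, neighbor))
        if not candidates:
            continue
        score, j, current = max(candidates)    # best admissible move
        tabu[j] = it + tenure
        if score > best_score:
            best, best_score = current[:], score
    return best, best_score
```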
Abstract: In this paper, a pipelined version of genetic algorithm,
called PLGA, and a corresponding hardware platform are described.
The basic operations of conventional GA (CGA) are made pipelined
using an appropriate selection scheme. The selection operator used here is stochastic in nature and is called SA-selection. This helps maintain the basic generational nature of the proposed pipelined
GA (PLGA). A number of benchmark problems are used to compare
the performances of conventional roulette-wheel selection and the
SA-selection. These include unimodal and multimodal functions with
dimensionality varying from very small to very large. The SA-selection scheme is seen to give performance comparable to the classical roulette-wheel selection scheme for all instances when solution quality and rate of convergence are considered.
The speedups obtained by PLGA for different benchmarks
are found to be significant. It is shown that a complete hardware
pipeline can be developed using the proposed scheme, if parallel
evaluation of the fitness expression is possible. In this connection
a low-cost but very fast hardware evaluation unit is described.
Results of simulation experiments show that in a pipelined hardware
environment, PLGA will be much faster than CGA. In terms of efficiency, PLGA is also found to outperform the parallel GA (PGA).
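The abstract does not spell out the SA-selection rule, so the sketch below is a plausible reading based on the simulated-annealing acceptance criterion. One reason such a local, stochastic rule suits a pipeline is that, unlike roulette-wheel selection, it needs no population-wide fitness sum.

```python
# Plausible sketch of a simulated-annealing-style selection operator: a random
# candidate replaces the incumbent if it is fitter, and otherwise is accepted
# with Boltzmann probability exp(-delta/T). The exact SA-selection rule and
# temperature schedule of the paper may differ.
import math, random

def sa_select(population, fitness, temperature):
    incumbent = random.choice(population)
    candidate = random.choice(population)
    delta = fitness(incumbent) - fitness(candidate)   # > 0 if candidate worse
    if delta <= 0 or random.random() < math.exp(-delta / temperature):
        return candidate
    return incumbent
```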
Abstract: Liver segmentation is the first significant step in liver diagnosis from computed tomography (CT). It separates the liver structure from the other abdominal organs. Sophisticated filtering techniques are indispensable for proper segmentation. In this paper, we employ 3D anisotropic diffusion as a preprocessing step. While removing image noise, this technique preserves the significant parts of the image, typically edges, lines, or other details that are important for interpreting the image. The segmentation task is performed by thresholding with automatic threshold value selection, and finally the false liver regions are eliminated using 3D connected-component analysis. The results show that by employing 3D anisotropic filtering, better liver segmentation can be achieved even though a simple segmentation technique is used.
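The described pipeline can be sketched end-to-end as follows, with a Perona-Malik-style 3D diffusion, Otsu's method as one possible automatic threshold choice, and retention of the largest 3D connected component; the parameter values are assumptions, not the paper's settings.

```python
# Pipeline sketch mirroring the described steps: 3D Perona-Malik-style
# anisotropic diffusion, automatic thresholding (Otsu here), and keeping
# the largest 3D connected component. Parameters are assumptions.
import numpy as np
from scipy import ndimage
from skimage.filters import threshold_otsu

def anisotropic_diffusion_3d(vol, n_iter=10, kappa=30.0, lam=0.1):
    u = vol.astype(np.float64)
    for _ in range(n_iter):
        total = np.zeros_like(u)
        for axis in range(3):
            grad = np.roll(u, -1, axis) - u          # forward difference
            c = np.exp(-(grad / kappa) ** 2)         # edge-stopping function
            flux = c * grad
            total += flux - np.roll(flux, 1, axis)   # divergence term
        u += lam * total                             # (np.roll wraps at borders)
    return u

def segment_liver(ct_volume):
    smoothed = anisotropic_diffusion_3d(ct_volume)
    mask = smoothed > threshold_otsu(smoothed)       # automatic threshold
    labels, n = ndimage.label(mask)                  # 3D connected components
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return labels == (1 + int(np.argmax(sizes)))     # keep largest component
```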
Abstract: In this paper, we propose a morphing method by which face color images can be freely transformed. The main focus of this work is the transformation of one face image into another. The method is fully automatic in that it can morph two face images by automatically detecting all the control points necessary to perform the morph. A face detection neural network, edge detection, and median filters are employed to detect the face position and features. Five control points, for both the source and target images, are then extracted based on the facial features. A triangulation method is then used to match and warp the source image to the target image using the control points. Finally, color interpolation is performed using a color Gaussian model that calculates the color for each particular frame depending on the number of frames used. A real-coded genetic algorithm is used in both the image warping and color blending steps to assist in step-size decisions and to speed up the morphing. This method results in very smooth morphs and is fast to process.
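A simplified version of the warp-and-blend core can be sketched as follows, using piecewise-affine warps driven by linearly interpolated control points followed by a cross-dissolve; the five-point detection, the Gaussian color model, and the GA-based step-size control of the paper are not reproduced here.

```python
# Simplified morph-frame sketch: piecewise-affine warps driven by linearly
# interpolated control points, followed by a cross-dissolve. The paper's
# automatic feature detection and GA-assisted blending are not reproduced.
import numpy as np
from skimage.transform import PiecewiseAffineTransform, warp

def morph_frame(src_img, dst_img, src_pts, dst_pts, t):
    """t in [0, 1]: 0 returns the source image, 1 the target image."""
    mid_pts = (1 - t) * src_pts + t * dst_pts     # interpolated control points
    tf_src = PiecewiseAffineTransform()
    tf_src.estimate(mid_pts, src_pts)             # frame coords -> source coords
    tf_dst = PiecewiseAffineTransform()
    tf_dst.estimate(mid_pts, dst_pts)             # frame coords -> target coords
    warped_src = warp(src_img, tf_src)            # align source to frame points
    warped_dst = warp(dst_img, tf_dst)            # align target to frame points
    return (1 - t) * warped_src + t * warped_dst  # cross-dissolve
```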
Abstract: A robust AUSM+ upwind discretisation scheme has been developed to simulate multiphase flow using consistent spatial discretisation schemes and a modified low-Mach-number diffusion term. The impact of the selection of an interfacial pressure model has also been investigated. Three representative test cases have been simulated to evaluate the accuracy of the commonly used stiffened-gas equation of state with respect to the IAPWS-IF97 equation of state for water. The algorithm demonstrates a combination of robustness and accuracy over a range of flow conditions, with the stiffened-gas equation tending to overestimate liquid temperature and density profiles.
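For reference, the stiffened-gas equation of state being compared against IAPWS-IF97 takes the form p = (gamma - 1) * rho * e - gamma * p_inf; the sketch below uses common textbook water-like parameters, which are not necessarily the values used in the paper.

```python
# Stiffened-gas equation of state: p = (gamma - 1) * rho * e - gamma * p_inf.
# The water-like parameters below are common textbook choices, not
# necessarily the ones used in the paper.
def stiffened_gas_pressure(rho, e, gamma=4.4, p_inf=6.0e8):
    """Pressure [Pa] from density rho [kg/m^3] and specific internal
    energy e [J/kg]."""
    return (gamma - 1.0) * rho * e - gamma * p_inf
```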