Abstract: The purpose of this study is to compare and analyse the financial characteristics of development methods for urban development projects in established areas, focusing on multi-level replotting. The analysis showed that the lowest-expenditure type was the 'combination type of group-land and multi-level replotting', and the most profitable type was the 'multi-level replotting type'. The 'multi-level replotting type', however, still carries the risk of additional construction costs. In addition, we subdivided the standard amount for liquidation of replotting and analysed the income-expenditure flow. The analysis showed that both the 'multi-level replotting type' and the 'combination type of group-land and multi-level replotting' improved project profitability and the property change ratio. However, when the standard fell below a certain amount, the amount of original property required for replotting increased exponentially, with a corresponding effect on project profitability.
Abstract: Project selection problems in management information systems (MIS) are often treated as multi-criteria decision-making (MCDM) problems. These problems involve two aspects: interdependencies among criteria and candidate projects, and the qualitative and quantitative factors of projects. However, most existing methods reported in the literature consider these aspects separately, even though the two should be incorporated simultaneously. For this reason, we propose a hybrid method using the analytic network process (ANP) and fuzzy logic to represent both aspects. We then propose a goal programming model to optimize the project selection problem as interpreted by the hybrid concept. Finally, a numerical example is presented for verification purposes.
Abstract: The purpose of this paper is to develop models that would enable predicting student success. These models could improve the allocation of students among colleges and optimize the newly introduced model of government subsidies for higher education. For the purpose of collecting data, an anonymous survey was carried out among the last-year undergraduate student population using a random sampling method. Decision trees were created, of which the two most successful at predicting student success were chosen, based on two criteria: Grade Point Average (GPA) and the time a student needs to finish the undergraduate program (time-to-degree). Decision trees have been shown to be a good method of classifying student success, and they could be further improved by increasing the survey sample and developing specialized decision trees for each type of college. These types of methods have great potential for use in decision support systems.
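The abstract does not give the exact tree-induction algorithm or survey attributes, but the general approach can be sketched with a minimal ID3-style decision tree on categorical features; the attribute names and records below are purely hypothetical stand-ins for survey answers:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def build_tree(rows, labels, features):
    """Recursively grow an ID3-style tree on categorical features."""
    if len(set(labels)) == 1:          # pure node -> leaf
        return labels[0]
    if not features:                   # no splits left -> majority vote
        return Counter(labels).most_common(1)[0][0]
    base = entropy(labels)
    def info_gain(f):
        rem = 0.0
        for v in {r[f] for r in rows}:
            sub = [l for r, l in zip(rows, labels) if r[f] == v]
            rem += len(sub) / len(labels) * entropy(sub)
        return base - rem
    best = max(features, key=info_gain)
    rest = [f for f in features if f != best]
    children = {}
    for v in {r[best] for r in rows}:
        idx = [i for i, r in enumerate(rows) if r[best] == v]
        children[v] = build_tree([rows[i] for i in idx],
                                 [labels[i] for i in idx], rest)
    return (best, children)

def predict(tree, row):
    while isinstance(tree, tuple):
        feature, children = tree
        tree = children.get(row[feature])
    return tree

# Hypothetical survey records: attendance and weekly study hours vs. GPA class.
rows = [
    {"attendance": "high", "hours": "many"},
    {"attendance": "high", "hours": "few"},
    {"attendance": "low",  "hours": "many"},
    {"attendance": "low",  "hours": "few"},
]
labels = ["high_gpa", "high_gpa", "high_gpa", "low_gpa"]
tree = build_tree(rows, labels, ["attendance", "hours"])
```

A production study would of course use a larger sample and pruning, as the abstract's suggestion to grow the survey implies.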
Abstract: Over the last two decades, owing to the hostile environment of the internet, concerns about the confidentiality of information have increased at a phenomenal rate. To safeguard information from attacks, a number of data/information hiding methods have evolved, mostly in the spatial and transform domains. In spatial-domain data hiding techniques, the information is embedded directly on the image plane itself. In transform-domain techniques, the image is first converted from the spatial domain to some other domain, and the secret information is then embedded so that it remains more secure against attack. Information hiding algorithms in the time or spatial domain have high capacity but relatively low robustness. In contrast, algorithms in transform domains such as the DCT and DWT have a certain robustness against some multimedia processing. In this work the authors propose a novel steganographic method for hiding information in the transform domain of a grayscale image. The proposed approach converts the gray-level image to the transform domain using a discrete integer wavelet transform through the lifting scheme. It performs a 2-D lifting wavelet decomposition of the cover image with the lifted Haar wavelet and computes the approximation coefficients matrix CA and the detail coefficients matrices CH, CV, and CD. The next step is to apply the PMM (pixel mapping method) technique to those coefficients to form the stego image. The aim of this paper is to propose a high-capacity image steganography technique that uses the pixel mapping method in the integer wavelet domain with acceptable levels of imperceptibility and distortion in the cover image and a high level of overall security. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
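The decomposition step can be sketched as a one-level integer Haar lifting transform; the PMM embedding itself is not reproduced here, and the CA/CH/CV/CD naming follows one common convention. Integer lifting with a floor division is exactly invertible, which is what makes it suitable for lossless stego extraction:

```python
import numpy as np

def haar_lift_1d(x):
    """One lifting step of the integer Haar wavelet along the last axis."""
    s, d = x[..., ::2].copy(), x[..., 1::2].copy()
    d = d - s            # predict: detail = odd - even
    s = s + d // 2       # update: integer approximation (floor keeps it invertible)
    return s, d

def haar_unlift_1d(s, d):
    """Exact inverse of haar_lift_1d."""
    even = s - d // 2
    odd = d + even
    x = np.empty(s.shape[:-1] + (s.shape[-1] * 2,), dtype=s.dtype)
    x[..., ::2], x[..., 1::2] = even, odd
    return x

def haar_lift_2d(img):
    """One-level 2-D integer Haar decomposition via lifting: CA, CH, CV, CD."""
    swap = lambda a: np.swapaxes(a, -1, -2)
    lo, hi = haar_lift_1d(img)          # along rows
    ca, ch = haar_lift_1d(swap(lo))     # along columns
    cv, cd = haar_lift_1d(swap(hi))
    return swap(ca), swap(ch), swap(cv), swap(cd)

def haar_unlift_2d(ca, ch, cv, cd):
    swap = lambda a: np.swapaxes(a, -1, -2)
    lo = swap(haar_unlift_1d(swap(ca), swap(ch)))
    hi = swap(haar_unlift_1d(swap(cv), swap(cd)))
    return haar_unlift_1d(lo, hi)

img = np.arange(64).reshape(8, 8)       # stand-in for a grayscale cover image
ca, ch, cv, cd = haar_lift_2d(img)
rec = haar_unlift_2d(ca, ch, cv, cd)
```

In the paper's pipeline, the secret bits would be embedded into the detail coefficients before the inverse transform reassembles the stego image.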
Abstract: Recent advances in both the testing and verification of software based on formal specifications of the system to be built have reached a point where the ideas can be applied powerfully in the design of agent-based systems. Software engineering research has highlighted a number of important issues: the importance of the type of modeling technique used; the careful design of the model to enable powerful testing techniques to be used; the automated verification of the behavioural properties of the system; and the need for a mechanism that translates the formal models into executable software in a simple and transparent way. This paper introduces the X-machine formalism as a tool for modeling biology-inspired agents, proposing the use of the techniques built around X-machine models for the construction of effective and reliable agent-based software systems.
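The abstract does not spell out the formalism, but as a rough illustrative sketch, a stream X-machine is a finite-state machine whose transitions are labelled with functions operating on a shared memory; every name in the toy foraging-agent example below is hypothetical:

```python
class XMachine:
    """A stream X-machine: states plus a memory, with transitions labelled
    by partial functions func(memory, inp) -> (output, new_memory) or None."""
    def __init__(self, initial_state, initial_memory, transitions):
        # transitions: {state: [(func, next_state), ...]}
        self.state, self.memory = initial_state, initial_memory
        self.transitions = transitions

    def step(self, inp):
        for func, nxt in self.transitions.get(self.state, []):
            result = func(self.memory, inp)
            if result is not None:           # first applicable function fires
                output, self.memory = result
                self.state = nxt
                return output
        raise ValueError(f"no transition from {self.state} on {inp!r}")

# Hypothetical foraging agent: collects food until its memory is full.
CAPACITY = 3

def collect(memory, inp):
    if inp == "food" and memory < CAPACITY:
        return ("collected", memory + 1)
    return None

def refuse(memory, inp):
    if inp == "food" and memory >= CAPACITY:
        return ("full", memory)
    return None

agent = XMachine("searching", 0, {
    "searching": [(collect, "searching"), (refuse, "resting")],
    "resting": [],
})
outputs = [agent.step("food") for _ in range(4)]
```

Because the transition functions are explicit and finite, models like this lend themselves to the systematic test-set generation the paper advocates.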
Abstract: This paper proposes new enhancement models to the
methods of nonlinear anisotropic diffusion to greatly reduce speckle
and preserve image features in medical ultrasound images. By
incorporating local physical characteristics of the image, in this case
scatterer density, in addition to the gradient, into existing tensor-based
image diffusion methods, we were able to greatly improve the
performance of the existing filtering methods, namely edge-enhancing
(EE) and coherence-enhancing (CE) diffusion. The new
enhancement methods were tested using various ultrasound images,
including phantom and some clinical images, to determine the
amount of speckle reduction, edge, and coherence enhancements.
Scatterer density weighted nonlinear anisotropic diffusion
(SDWNAD) for ultrasound images consistently outperformed its
traditional tensor-based counterparts that use gradient only to weight
the diffusivity function. SDWNAD is shown to greatly reduce
speckle noise while preserving image features such as edges, orientation
coherence, and scatterer density. SDWNAD's superior performance
over nonlinear coherent diffusion (NCD), speckle reducing
anisotropic diffusion (SRAD), adaptive weighted median filtering
(AWMF), wavelet shrinkage (WS), and wavelet shrinkage with
contrast enhancement (WSCE) makes it an ideal
preprocessing step for automatic segmentation in ultrasound
imaging.
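The tensor-based, scatterer-density-weighted diffusion itself is beyond an abstract, but the underlying idea can be sketched with the classical gradient-only Perona-Malik scheme that these methods extend; SDWNAD would additionally modulate the diffusivity by local scatterer density:

```python
import numpy as np

def perona_malik(img, n_iter=30, kappa=30.0, lam=0.2):
    """Classical gradient-based anisotropic diffusion (Perona-Malik).
    lam <= 0.25 keeps the explicit scheme stable."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # finite differences to the four neighbours (replicated border)
        dN = np.vstack([u[:1], u[:-1]]) - u
        dS = np.vstack([u[1:], u[-1:]]) - u
        dW = np.hstack([u[:, :1], u[:, :-1]]) - u
        dE = np.hstack([u[:, 1:], u[:, -1:]]) - u
        # diffusivity shrinks near strong gradients, preserving edges
        g = lambda d: np.exp(-(d / kappa) ** 2)
        u += lam * (g(dN) * dN + g(dS) * dS + g(dW) * dW + g(dE) * dE)
    return u

rng = np.random.default_rng(0)
noisy = rng.normal(0.0, 10.0, (32, 32))   # stand-in for a speckled image
smoothed = perona_malik(noisy)
```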
Abstract: Automatic segmentation of skin lesions is the first step
towards the automated analysis of malignant melanoma. Although
numerous segmentation methods have been developed, few studies
have focused on determining the most effective color space for
melanoma application. This paper proposes an automatic segmentation
algorithm based on color space analysis and clustering-based histogram
thresholding, a process which is able to determine the optimal
color channel for detecting the borders in dermoscopy images. The
algorithm is tested on a set of 30 high resolution dermoscopy images.
A comprehensive evaluation of the results is provided, in which borders
manually drawn by four dermatologists are compared to the automated
borders detected by the proposed algorithm, applying three previously
used metrics of accuracy, sensitivity, and specificity and a new metric
of similarity. By performing ROC analysis and ranking the metrics,
it is demonstrated that the best results are obtained with the X and
XoYoR color channels, resulting in an accuracy of approximately
97%. The proposed method is also compared with two state-of-the-art
skin lesion segmentation methods.
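The paper's exact clustering-based histogram thresholding is not specified in the abstract; Otsu's method is a standard instance of the idea, treating lesion and background as two clusters in the histogram of the chosen color channel and picking the threshold that maximizes between-class variance:

```python
import numpy as np

def otsu_threshold(channel):
    """Threshold maximizing between-class variance of the histogram."""
    hist, _ = np.histogram(channel, bins=256, range=(0, 256))
    p = hist / hist.sum()
    omega = np.cumsum(p)                      # class-0 probability
    mu = np.cumsum(p * np.arange(256))        # class-0 cumulative mean
    mu_t = mu[-1]                             # global mean
    # between-class variance for every candidate threshold
    sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega) + 1e-12)
    return int(np.argmax(sigma_b))

def segment(channel):
    return channel > otsu_threshold(channel)

# Synthetic "dermoscopy channel": bright lesion square on a darker background.
img = np.full((20, 20), 50)
img[5:15, 5:15] = 200
t = otsu_threshold(img)
mask = segment(img)
```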
Abstract: In this paper a novel approach for generalized image retrieval based on semantic contents is presented. It combines three feature extraction methods, namely color, texture, and the edge histogram descriptor, with provision to add new features in the future for better retrieval efficiency. Any combination of these methods that is most appropriate for the application can be used for retrieval; this is provided through the User Interface (UI) in the form of relevance feedback. The image properties are analyzed using computer vision and image processing algorithms. For color, image histograms are computed; for texture, co-occurrence-matrix-based features such as entropy and energy are calculated; and for edge density, the Edge Histogram Descriptor (EHD) is found. For the retrieval of images, a novel idea based on a greedy strategy is developed to reduce the computational complexity. The entire system was developed using AForge.Imaging (an open source product), MATLAB .NET Builder, C#, and Oracle 10g. The system was tested with the Corel image database containing 1000 natural images and achieved good retrieval results.
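The color and texture descriptors mentioned above can be sketched as follows (bin counts and the horizontal-neighbour offset are illustrative choices, not the paper's):

```python
import numpy as np

def color_histogram(img_rgb, bins=8):
    """Per-channel intensity histogram, concatenated and normalized."""
    feats = [np.histogram(img_rgb[..., c], bins=bins, range=(0, 256))[0]
             for c in range(3)]
    h = np.concatenate(feats).astype(float)
    return h / h.sum()

def glcm_energy_entropy(gray, levels=8):
    """Energy and entropy of a horizontal-neighbour co-occurrence matrix."""
    q = (gray.astype(int) * levels) // 256            # quantize to `levels` bins
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)
    p = glcm / glcm.sum()
    energy = float((p ** 2).sum())
    nz = p[p > 0]
    entropy = float(-(nz * np.log2(nz)).sum())
    return energy, entropy

# A perfectly uniform patch has maximal energy and zero entropy.
flat = np.full((10, 10), 100)
energy, entropy = glcm_energy_entropy(flat)
hist = color_histogram(np.zeros((4, 4, 3)))
```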
Abstract: A low bit rate still image compression scheme by
compressing the indices of Vector Quantization (VQ) and generating
residual codebook is proposed. The indices of VQ are compressed by
exploiting correlation among image blocks, which reduces the bit per
index. A residual codebook similar to VQ codebook is generated that
represents the distortion produced in VQ. Using this residual
codebook the distortion in the reconstructed image is removed,
thereby increasing the image quality. Our scheme combines these two
methods. Experimental results on the standard Lena image show that our
scheme yields a reconstructed image with a PSNR of 31.6 dB
at 0.396 bits per pixel. Our scheme is also faster than the existing VQ
variants.
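The residual-codebook idea can be sketched on toy block vectors: train a VQ codebook, quantize, then train a second codebook on the residuals and add its codewords back. Including a zero codeword in the residual codebook (an implementation choice here, not necessarily the paper's) guarantees the correction never increases distortion:

```python
import numpy as np

def kmeans_codebook(vectors, k, iters=15, seed=0):
    """Plain Lloyd's k-means; returns the codebook (k centroids)."""
    rng = np.random.default_rng(seed)
    book = vectors[rng.choice(len(vectors), k, replace=False)].astype(float)
    for _ in range(iters):
        d = ((vectors[:, None, :] - book[None]) ** 2).sum(-1)
        assign = d.argmin(1)
        for j in range(k):
            if (assign == j).any():
                book[j] = vectors[assign == j].mean(0)
    return book

def quantize(vectors, book):
    d = ((vectors[:, None, :] - book[None]) ** 2).sum(-1)
    return d.argmin(1)

# Deterministic toy "image blocks" (rows are 4-dim block vectors).
rng = np.random.default_rng(1)
blocks = rng.normal(0, 10, (200, 4))

vq_book = kmeans_codebook(blocks, k=8)
idx = quantize(blocks, vq_book)
residuals = blocks - vq_book[idx]

# Residual codebook; the zero codeword makes the correction never hurt.
res_book = np.vstack([kmeans_codebook(residuals, k=4), np.zeros(4)])
res_idx = quantize(residuals, res_book)

recon_res = vq_book[idx] + res_book[res_idx]
mse_vq = ((blocks - vq_book[idx]) ** 2).mean()
mse_res = ((blocks - recon_res) ** 2).mean()
```

The index-compression half of the scheme (exploiting inter-block correlation) is an entropy-coding step layered on top of `idx` and is omitted here.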
Abstract: Implicit equations play a crucial role in Engineering.
Based on this importance, several techniques have been applied to
solve this particular class of equations. When it comes to practical
applications, in general, iterative procedures are taken into account.
On the other hand, with the improvement of computers, other
numerical methods have been developed to provide a more
straightforward methodology of solution. Analytical exact approaches
seem to have been continuously neglected due to the difficulty
inherent in their application; notwithstanding, they are indispensable
for validating numerical routines. Lagrange's Inversion Theorem is a
simple mathematical tool which has proved to be widely applicable to
engineering problems. In short, it provides the solution to implicit
equations by means of an infinite series. To show the validity of this
method, the three-parameter infiltration equation is, for the first time,
analytically and exactly solved. After manipulating these series,
closed-form solutions are presented as H-functions.
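The infiltration equation itself is not reproduced in the abstract, so as a sketch the theorem is applied here to the simpler implicit equation x = a + w*x^2, whose Lagrange series has Catalan-number coefficients and a quadratic closed form to check against:

```python
from math import comb, sqrt

def lagrange_root(a, w, n_terms=40):
    """Solve x = a + w*x**2 by Lagrange's inversion theorem.
    With phi(x) = x**2, the general term
    w**n/n! * d^{n-1}/da^{n-1}[a**(2n)] reduces to C_n * a**(n+1) * w**n,
    where C_n = comb(2n, n)/(n+1) is the n-th Catalan number.
    Converges for |4*a*w| < 1."""
    return sum(comb(2 * n, n) // (n + 1) * a ** (n + 1) * w ** n
               for n in range(n_terms))

a, w = 0.5, 0.3
x = lagrange_root(a, w)
exact = (1 - sqrt(1 - 4 * a * w)) / (2 * w)   # smaller root of w*x^2 - x + a = 0
```

The series picks out the root that tends to a as w tends to 0, which is exactly the branch Lagrange inversion describes.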
Abstract: The aim of this paper is to rank the impact of Object
Oriented (OO) metrics in fault prediction modeling using Artificial
Neural Networks (ANNs). Past studies on empirical validation of
object oriented metrics as fault predictors using ANNs have focused
on the predictive quality of neural networks versus standard
statistical techniques. In this empirical study we turn our attention to
the capability of ANNs in ranking the impact of these explanatory
metrics on fault proneness. In the ANN data analysis approach, there is
no clear method of ranking the impact of individual metrics. Five
ANN-based techniques that rank object oriented metrics in predicting
the fault proneness of classes are studied: i) the overall connection
weights method; ii) Garson's method; iii) the partial derivatives
method; iv) the input perturbation method; and v) the classical
stepwise method. We develop and evaluate different
prediction models based on the ranking of the metrics by the
individual techniques. The models based on overall connection
weights and partial derivatives methods have been found to be most
accurate.
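Garson's method, one of the five ranking techniques named above, can be sketched for a single-hidden-layer network: each hidden node's contribution is partitioned among the inputs in proportion to the absolute input-to-hidden weights. The weight values below are hypothetical, standing in for a trained fault-proneness model over three OO metrics:

```python
import numpy as np

def garson_importance(w_ih, w_ho):
    """Garson's algorithm for one output unit.
    w_ih: (n_inputs, n_hidden) input-to-hidden weights,
    w_ho: (n_hidden,) hidden-to-output weights."""
    contrib = np.abs(w_ih) * np.abs(w_ho)              # (n_inputs, n_hidden)
    share = contrib / (contrib.sum(axis=0) + 1e-12)    # split per hidden node
    importance = share.sum(axis=1)
    return importance / importance.sum()               # normalize to sum 1

# Hypothetical trained weights: 3 OO metrics, 2 hidden nodes, 1 output.
w_ih = np.array([[2.0, 0.5],
                 [1.0, 1.5],
                 [0.0, 0.0]])    # the third metric carries no weight at all
w_ho = np.array([1.0, -1.0])
ranks = garson_importance(w_ih, w_ho)
```

A metric whose weights are all zero receives zero importance, which is the sanity check one expects from any of the five ranking schemes.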
Abstract: This paper examines the impact of OO design on software
quality characteristics, such as defect density and rework, by means of
experimental validation. Encapsulation, inheritance, polymorphism,
reusability, data hiding, and message passing are the major attributes
of an object oriented system, and in order to evaluate the quality of
such a system these attributes can act as indicators.
Metrics are the well-known quantifiable approach to expressing any
attribute. Hence, in this paper we formulate a framework of
metrics representing the attributes of an object oriented system.
Empirical data is collected from three different projects based on
the object oriented paradigm to calculate the metrics.
Abstract: Computer aided design counts on the support of
parametric software in the design of machine components as well as
of any other pieces of interest. The complexity of the element under
study sometimes poses certain difficulties for computer design, or
may even generate mistakes in the final body conception. Reverse
engineering techniques are based on the transformation of images of an
already conceived body into a matrix of points which can be
visualized by the design software. The literature exhibits several
techniques for obtaining the dimensional fields of machine components,
such as contact instruments (MMC), calipers, and optical methods such
as laser scanners, holography, and moiré methods. The objective of this
research work was to analyze the moiré technique as an instrument of
reverse engineering, applied to bodies of non-complex geometry such as
simple solid figures, creating matrices of points. These matrices were
forwarded to the parametric software SolidWorks to generate
the virtual object. The volume obtained by mechanical means, i.e.,
by caliper, the volume obtained through the moiré method, and the
volume generated by the SolidWorks software were compared and
found to be in close agreement. This research work suggests the
application of phase-shifting moiré methods as an instrument of reverse
engineering, serving also to support farm machinery element designs.
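The core computation of four-step phase-shifting moiré can be sketched directly: four fringe images shifted by 90° each determine the wrapped surface phase through an arctangent, independent of background intensity and fringe contrast. The synthetic tilted-plane phase below is illustrative:

```python
import numpy as np

def phase_from_four_steps(i1, i2, i3, i4):
    """Wrapped phase from four fringe images shifted by 90 degrees:
    I_k = A + B*cos(phi + k*pi/2), k = 0..3.
    Then i4 - i2 = 2B*sin(phi) and i1 - i3 = 2B*cos(phi)."""
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic fringes over a tilted-plane "surface" phase.
x = np.linspace(-1.0, 1.0, 64)
phi_true = 2.0 * x                      # stays inside (-pi, pi): no unwrapping
A, B = 100.0, 50.0                      # background and modulation
frames = [A + B * np.cos(phi_true + k * np.pi / 2) for k in range(4)]
phi = phase_from_four_steps(*frames)
```

In the reverse-engineering pipeline, the recovered phase map would then be unwrapped and scaled into the matrix of surface points handed to SolidWorks.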
Abstract: Flood management is one of the important fields in urban storm water management. Floods are influenced by the increase in large storm events and by improper planning of the area. This study mainly addresses flood protection in four stages: planning, flood event, response, and evaluation. Flood protection is most effective, however, when it is considered in the planning/design and evaluation stages, since both stages represent the land development of the area. Structural adjustments are often more reliable than non-structural adjustments in providing flood protection; however, structural adjustments are constrained by numerous factors, such as political constraints and cost. Therefore it is important to balance both kinds of adjustment against the situation. The technical decisions provided will have to be approved by the higher-ups who have the power to decide on the final solution, and costs are the biggest factor in determining the final decision. This study therefore recommends that a flood protection system should be integrated and enforced more in the early stages (planning and design) as part of the storm water management plan. Factors influencing the technical decisions provided should be reduced as far as possible to avoid a reduction in the expected performance of the proposed adjustments.
Abstract: S-boxes (Substitution boxes) are keystones of modern
symmetric cryptosystems (block ciphers, as well as stream ciphers).
S-boxes bring nonlinearity to cryptosystems and strengthen their
cryptographic security. They are used for confusion in data security.
An S-box satisfies the strict avalanche criterion (SAC), if and only if
for any single input bit of the S-box, the inversion of it changes each
output bit with probability one half. If a function (cryptographic
transformation) is complete, then each output bit depends on all of
the input bits. Thus, if it were possible to find the simplest Boolean
expression for each output bit in terms of the input bits, each of these
expressions would have to contain all of the input bits if the function
is complete. Among the important properties of an S-box, the most
interesting property, the SAC (Strict Avalanche Criterion), is presented,
and three analysis methods are proposed to analyze this property.
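The three analysis methods are not detailed in the abstract; the most direct check, counting the flip probability of every output bit for every flipped input bit, can be sketched as follows (the 4-bit PRESENT cipher S-box serves as a concrete example):

```python
def sac_matrix(sbox, n_bits):
    """sac[i][j] = probability that flipping input bit i flips output bit j.
    The SAC holds exactly when every entry equals 1/2."""
    size = 1 << n_bits
    counts = [[0] * n_bits for _ in range(n_bits)]
    for x in range(size):
        for i in range(n_bits):
            diff = sbox[x] ^ sbox[x ^ (1 << i)]   # output change on bit-i flip
            for j in range(n_bits):
                if (diff >> j) & 1:
                    counts[i][j] += 1
    return [[c / size for c in row] for row in counts]

# The 4-bit S-box of the PRESENT block cipher.
PRESENT = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
           0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]
sac = sac_matrix(PRESENT, 4)
```

Completeness can be read off the same matrix: every entry being nonzero means every output bit depends on every input bit.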
Abstract: In recent years, high dynamic range (HDR) imaging has gained popularity with the advancement of digital photography. In this contribution we present a subjective evaluation, by a number of participants, of various tone reproduction and tone mapping techniques. Firstly, standard HDR images were used, and the participants were asked to rate them based on a given rating scheme. After that, the participants were asked to rate HDR images generated using linear and nonlinear combinations of multiple exposure images. The experimental results showed that linearly generated HDR images have better visualization than the nonlinearly combined ones. In addition, the Reinhard et al. and exponential tone mapping operators showed better results compared to the logarithmic and Garrett et al. tone mapping operators.
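The best-rated operator above, Reinhard et al.'s global operator, can be sketched in a few lines: scale luminance by its log-average, then compress with L/(1+L) into displayable range (the key value 0.18 is the operator's conventional default):

```python
import numpy as np

def reinhard_global(luminance, key=0.18, eps=1e-6):
    """Reinhard et al.'s global tone mapping operator."""
    log_avg = np.exp(np.mean(np.log(luminance + eps)))  # log-average luminance
    scaled = key * luminance / log_avg
    return scaled / (1.0 + scaled)                      # compress into [0, 1)

# Synthetic HDR luminance spanning five orders of magnitude.
hdr = np.geomspace(0.01, 1000.0, 256)
ldr = reinhard_global(hdr)
```

The mapping is strictly monotone, so relative brightness ordering is preserved while the dynamic range is compressed.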
Abstract: In the automatic manufacturing and assembly of mechanical, electrical, and electronic parts, one needs to reliably identify the position of components and to extract the information they carry. Data Matrix Codes (DMC) are well established these days in many areas of industrial manufacturing thanks to their concentration of information in small spaces. In today's typically order-related industry, where increased tracing requirements prevail, they offer further advantages over other identification systems. This impressively underlines the necessity of a robust code reading system for detecting DMC on components in factories. This paper compares two methods for estimating the angle of orientation of Data Matrix Codes: one based on the Hough Transform and the other based on the Mean Shift Algorithm. We concentrate on Data Matrix Codes in industrial environments, punched, milled, lasered, or etched on different materials in arbitrary orientations.
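The Hough-based branch can be sketched as follows, assuming edge detection has already produced a set of edge points (the Mean Shift variant is not shown); the dominant accumulator peak gives the orientation of the code's strongest line, here the normal of a synthetic horizontal stroke:

```python
import numpy as np

def dominant_angle_deg(edge_ys, edge_xs, n_theta=180):
    """Dominant line orientation of edge points via a standard
    (rho, theta) Hough accumulator; returns theta in whole degrees."""
    thetas = np.deg2rad(np.arange(n_theta))
    # rho = x*cos(theta) + y*sin(theta) for every point/angle pair
    rho = np.outer(edge_xs, np.cos(thetas)) + np.outer(edge_ys, np.sin(thetas))
    bins = np.round(rho - rho.min()).astype(int)
    acc = np.zeros((n_theta, bins.max() + 1), dtype=int)
    for t in range(n_theta):
        np.add.at(acc[t], bins[:, t], 1)   # vote (theta, rho) cells
    return int(np.unravel_index(np.argmax(acc), acc.shape)[0])

# Edge points of a horizontal stroke, as from one edge of an axis-aligned DMC.
xs = np.arange(50)
ys = np.full(50, 10)
angle = dominant_angle_deg(ys, xs)   # normal of a horizontal line: 90 degrees
```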
Abstract: A direct search approach to determining optimal reservoir operation is proposed using ant colony optimization for continuous domains (ACOR). The model is applied to a single-reservoir system to determine the optimum releases over 42 years of monthly steps. A disadvantage of ant-colony-based methods, and of ACOR in particular, is the great amount of computer run time they consume. In this study a highly effective procedure for decreasing run time has been developed. The results are compared to those of a GA-based model.
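A minimal ACOR sketch is shown below; the parameter values (q, xi) are typical illustrative choices, and the quadratic objective is a toy stand-in for the actual release-schedule cost function, which the abstract does not give:

```python
import numpy as np

def acor_minimize(f, dim, lo, hi, n_ants=10, archive_size=15,
                  q=0.1, xi=0.85, iterations=150, seed=0):
    """Minimal ACOR: keep an archive of good solutions, sample new ones
    from Gaussians centred on archive members, keep the best."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (archive_size, dim))
    F = np.array([f(x) for x in X])
    order = np.argsort(F)
    X, F = X[order], F[order]
    ranks = np.arange(archive_size)
    w = np.exp(-(ranks ** 2) / (2 * (q * archive_size) ** 2))
    p = w / w.sum()                          # rank-based selection weights
    for _ in range(iterations):
        samples = []
        for _ in range(n_ants):
            l = rng.choice(archive_size, p=p)
            # per-dimension spread: mean distance to the chosen solution
            sigma = xi * np.abs(X - X[l]).sum(axis=0) / (archive_size - 1)
            samples.append(np.clip(rng.normal(X[l], sigma + 1e-12), lo, hi))
        Fs = np.array([f(x) for x in samples])
        X = np.vstack([X, samples])
        F = np.concatenate([F, Fs])
        order = np.argsort(F)[:archive_size]  # elitist truncation
        X, F = X[order], F[order]
    return X[0], F[0]

# Toy stand-in for a release schedule: minimize a quadratic cost.
best_x, best_f = acor_minimize(lambda x: float((x ** 2).sum()), dim=3,
                               lo=-5.0, hi=5.0)
```

The elitist truncation guarantees the best solution never worsens between iterations, which is also where run-time-reduction procedures like the paper's would hook in.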
Abstract: This evaluation of land supply system performance in
China shall examine the combination of government functions and
national goals in order to perform a cost-benefit analysis of system
results. From the author's point of view, it is most productive to
evaluate land supply system performance at moments of system
transformation, for the following reasons. When the system or policy
changes, the behavior and the input-output change in beneficial results
at different times can be observed, and system performance
can be evaluated through a cost-benefit analysis during the process of
system transformation. Moreover, this evaluation method can avoid
the influence of land resource endowment: different land resource
endowments and different economic development periods
result in different systems. This essay studies the contents, principles
and methods of land supply system performance evaluation. Taking
Beijing as an example, this essay optimizes and classifies the land
supply index, makes a quantitative evaluation of land supply system
performance through principal component analysis (PCA), and finally
analyzes the factors that influence land supply system performance at
times of system transformation.
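The PCA step can be sketched with a covariance eigendecomposition; the yearly indicator matrix below is purely hypothetical, standing in for the optimized land supply index:

```python
import numpy as np

def pca(X, n_components=2):
    """PCA via eigendecomposition of the covariance matrix;
    returns component scores and explained-variance ratios."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)           # ascending eigenvalues
    order = np.argsort(vals)[::-1][:n_components]
    scores = Xc @ vecs[:, order]
    ratios = vals[order] / vals.sum()
    return scores, ratios

# Hypothetical yearly land-supply indicators (rows: years, cols: indices);
# the first two indices are strongly correlated, the third is independent.
rng = np.random.default_rng(0)
base = rng.normal(0, 1, (12, 1))
X = np.hstack([3 * base + rng.normal(0, 0.1, (12, 1)),
               3 * base + rng.normal(0, 0.1, (12, 1)),
               rng.normal(0, 0.5, (12, 1))])
scores, ratios = pca(X)
```

The first principal component absorbs the correlated pair, which is exactly the dimensionality reduction that makes a composite performance score from many land supply indices.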
Abstract: A number of automated shot-change detection
methods for indexing a video sequence to facilitate browsing and
retrieval have been proposed in recent years. This paper emphasizes
the simulation of video shot boundary detection using the color
histogram method, with scaling of the histogram metrics as an added
feature. The difference between the histograms of two consecutive
frames is evaluated, resulting in the metrics. The metrics are then
scaled to avoid ambiguity and to enable the choice of an apt threshold
for any type of video that involves minor errors due to flashlights,
camera motion, etc. Two sample videos with a resolution of 352 x 240
pixels are used here with the color histogram approach on
uncompressed media, and retrieval of color video is attempted. The
simulation is performed for abrupt changes in video and yields 90%
recall and precision values.
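The histogram-difference pipeline with the added scaling step can be sketched as follows; the bin count, the max-based scaling, and the threshold value are illustrative choices, not necessarily the paper's:

```python
import numpy as np

def shot_boundaries(frames, bins=16, threshold=0.5):
    """Detect abrupt shot changes from scaled histogram differences.
    Differences are scaled by the largest one so the threshold lies in [0, 1]."""
    hists = [np.histogram(f, bins=bins, range=(0, 256))[0].astype(float)
             for f in frames]
    diffs = np.array([np.abs(h2 - h1).sum()
                      for h1, h2 in zip(hists, hists[1:])])
    scaled = diffs / (diffs.max() + 1e-12)      # the scaling step
    return [i + 1 for i, d in enumerate(scaled) if d > threshold]

# Two synthetic "shots": dark frames followed by bright frames.
dark = [np.full((24, 24), 20) for _ in range(5)]
bright = [np.full((24, 24), 220) for _ in range(5)]
cuts = shot_boundaries(dark + bright)
```

Scaling by the maximum difference makes a single threshold usable across videos with very different absolute histogram dynamics, which is the ambiguity the abstract refers to.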