Abstract: This paper discusses the results obtained in predicting the
reinforcement of singly reinforced beams using Neural Networks (NN),
Support Vector Machines (SVMs), and tree-based models. A major
advantage of SVMs over NNs is that they minimize a bound on the
generalization error of the model rather than a bound on the mean
square error over the data set, as NNs do. The tree-based approach
divides the problem into a small number of sub-problems to reach a
conclusion. A data set covering different beam parameters was
created, with the reinforcement calculated by the limit state method,
for model building and validation. The results of this study suggest
remarkably good performance of the tree-based and SVM models.
Further, this study found that these two techniques work as well as,
and even better than, neural network methods. A comparison of
predicted values with actual values shows a very good correlation
coefficient for all four techniques.
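The three model families named above can be compared with off-the-shelf scikit-learn regressors. The sketch below is illustrative only: the feature set (width, depth, concrete grade, design moment) and the synthetic target formula (a simplified limit-state-style expression with assumed fy = 415 MPa) are stand-ins for the paper's actual data, not a reproduction of it.

```python
# Hedged sketch: SVM vs. tree vs. NN regression on a synthetic
# beam-reinforcement data set (all data-generating choices are assumptions).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 400
X = np.column_stack([
    rng.uniform(200, 400, n),   # beam width b (mm)
    rng.uniform(300, 700, n),   # effective depth d (mm)
    rng.uniform(20, 40, n),     # concrete grade (MPa)
    rng.uniform(50, 300, n),    # design moment Mu (kNm)
])
# Illustrative target: As ~ Mu / (0.87 * fy * 0.9 * d), fy = 415 MPa assumed.
y = X[:, 3] * 1e6 / (0.87 * 415 * 0.9 * X[:, 1])   # steel area (mm^2)

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
models = {
    "SVM": make_pipeline(StandardScaler(), SVR(C=1e4)),
    "Tree": DecisionTreeRegressor(max_depth=8, random_state=0),
    "NN": make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(32,),
                                     max_iter=3000, random_state=0)),
}
scores = {}
for name, model in models.items():
    model.fit(Xtr, ytr)
    scores[name] = model.score(Xte, yte)   # R^2 on held-out data
    print(name, round(scores[name], 3))
```

Scaling inputs matters for the SVM and NN here; the tree is scale-invariant, which is one practical reason tree-based models are easy to apply to raw beam parameters.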
Abstract: This paper addresses a stock-cutting problem with rotation of items and without the guillotine cutting constraint. In order to solve the large-scale problem effectively and efficiently, we propose a simple but fast heuristic algorithm. It is shown that this heuristic outperforms the latest published algorithms for large-scale problem instances.
Abstract: In this paper a numerical algorithm is described for solving the boundary value problem associated with axisymmetric, inviscid, incompressible, rotational (and irrotational) flow, in order to obtain duct wall shapes from prescribed wall velocity distributions. The governing equations are formulated in terms of the stream function ψ(x, y) and the function φ(x, y) as independent variables, where for irrotational flow φ(x, y) can be recognized as the velocity potential function; for rotational flow φ(x, y) ceases to be the velocity potential function but does remain orthogonal to the streamlines. A numerical method based on a finite difference scheme on a uniform mesh is employed. The technique described is capable of tackling the so-called inverse problem, where the wall velocity distributions are prescribed and the duct wall shape is calculated, as well as the direct problem, where the velocity distribution on the duct walls is calculated from prescribed duct geometries. The two cases outlined in this paper are in fact boundary value problems with Neumann and Dirichlet boundary conditions respectively. Even though both approaches are discussed, only numerical results for the case of Dirichlet boundary conditions are given. A downstream condition is prescribed such that cylindrical flow, that is, flow which is independent of the axial coordinate, exists.
Abstract: In this paper, a new approach is introduced to solve the
Blasius equation using parameter identification of a nonlinear
function which serves as the approximation function. The Bees
Algorithm (BA) is applied to find the adjustable parameters of the
approximation function by minimizing a fitness function involving
these parameters. The parameters are determined so that the
approximation function satisfies the boundary conditions. To
demonstrate the presented method, the results obtained are compared
with another numerical method. The present method can easily be
extended to solve a wide range of problems.
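The Bees Algorithm mentioned above is a population-based search that concentrates "recruit" bees around the best sites found by scouts. The sketch below shows the core BA mechanics on a toy two-parameter fitness function standing in for the paper's boundary-condition fitness; all BA parameter values (scouts n, selected sites m, elite sites e, recruits, neighbourhood size) are illustrative choices, not the paper's settings.

```python
# Minimal Bees Algorithm sketch on a toy fitness function.
# The toy fitness (minimum at (1.5, -0.5)) stands in for the paper's
# fitness built from the approximation function's boundary conditions.
import numpy as np

rng = np.random.default_rng(1)

def fitness(p):
    a, b = p
    return (a - 1.5) ** 2 + (b + 0.5) ** 2

def bees_algorithm(fit, lo, hi, n=30, m=10, e=3, nep=7, nsp=3,
                   ngh=0.5, iters=100):
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    dim = len(lo)
    sites = rng.uniform(lo, hi, (n, dim))           # initial scout bees
    for _ in range(iters):
        sites = sites[np.argsort([fit(s) for s in sites])]
        new = []
        for i in range(m):                          # best m sites
            recruits = nep if i < e else nsp        # more bees on elite sites
            cand = sites[i] + rng.uniform(-ngh, ngh, (recruits, dim))
            cand = np.clip(cand, lo, hi)
            # keep the best of the site and its local recruits
            new.append(min(list(cand) + [sites[i]], key=fit))
        new += list(rng.uniform(lo, hi, (n - m, dim)))  # fresh scouts
        sites = np.array(new)
        ngh *= 0.95                                 # shrink neighbourhoods
    return min(sites, key=fit)

best = bees_algorithm(fitness, [-5, -5], [5, 5])
print(best, fitness(best))
```

Because the best candidate at each selected site is always retained, the top sites improve monotonically while the shrinking neighbourhood refines the estimate.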
Abstract: Rule Discovery is an important technique for mining
knowledge from large databases. Use of objective measures for
discovering interesting rules leads to another data mining problem,
although of reduced complexity. Data mining researchers have
studied subjective measures of interestingness to reduce the volume
of discovered rules and ultimately improve the overall efficiency of
the KDD process.
In this paper we study novelty of the discovered rules as a
subjective measure of interestingness. We propose a hybrid approach
based on both objective and subjective measures to quantify novelty
of the discovered rules in terms of their deviations from the known
rules (knowledge). We analyze the types of deviation that can arise
between two rules and categorize the discovered rules according to
a user-specified threshold. We implement the proposed framework
and experiment with some public datasets. The experimental results
are promising.
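One plausible reading of "deviation between two rules" is a set-overlap distance between their antecedents and consequents, thresholded to decide novelty. The sketch below is an illustrative interpretation, not the paper's exact formula: the Jaccard-based score, the equal weighting of the two parts, and the threshold value are all assumptions.

```python
# Hedged sketch: scoring a discovered rule's deviation from a known rule.
# Rules are (antecedent, consequent) pairs of item sets.
def deviation(rule_a, rule_b):
    """Return a score in [0, 1]: 0 = identical rules, 1 = fully disjoint."""
    def jaccard_distance(x, y):
        union = x | y
        return 1 - len(x & y) / len(union) if union else 0.0
    ant = jaccard_distance(rule_a[0], rule_b[0])
    con = jaccard_distance(rule_a[1], rule_b[1])
    return 0.5 * (ant + con)   # equal weights: an illustrative choice

known = ({"bread", "butter"}, {"milk"})
discovered = ({"bread"}, {"milk"})
score = deviation(discovered, known)

threshold = 0.4   # hypothetical user-specified novelty threshold
label = "novel" if score > threshold else "conforming"
print(score, label)
```

A rule whose deviation from every known rule exceeds the threshold would be flagged as novel; rules below it are treated as conforming variants of existing knowledge.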
Abstract: A retrospective study was undertaken to record the
occurrence and pattern of fractures in small animals (dogs and cats)
from 2005 to 2010. A total of 650 cases were presented to the small
animal surgery unit, of which 116 (dogs and cats) were presented
with a history of fractures of different bones. Fracture cases thus
constituted 17.8% (116/650) of the total, of which 67% were dogs
and 23% cats. The majority of animals were intact. Trauma in the
form of roadside accidents was the principal cause of fractures in
dogs, whereas in cats it was falls from height. The ages of fractured
dogs ranged from 4 months to 12 years, and of cats from 4 weeks to
10 years. Femoral fractures represented 37.5% and 25% of cases in
dogs and cats respectively. The diaphysis and the distal metaphyseal
and supracondylar regions were the most affected sites in dogs and
cats. Tibial fractures in dogs and cats represented 21.5% and 10%,
while humeral fractures were 7.9% and 14% in dogs and cats
respectively. Humeral condylar fractures were most commonly seen
in puppies aged 4 to 6 months. The incidence of radius-ulna
fractures was 19% and 14% in dogs and cats respectively. Other
fractures recorded involved the lumbar vertebrae, mandible,
metacarpals, etc. Management comprised external and internal
fixation in both species. The most common internal fixation
technique employed was intramedullary fixation in long bones,
followed by other methods such as stack or cross pinning and
wiring, as indicated by the findings in each case. Cast bandages
were the main means of external coaptation. The paper discusses the
outcome of the cases according to the technique employed.
Abstract: The aim of this study is to analyze the influence of
different heat insulation methods on the indoor thermal environment and comfort of apartment buildings.
The study analyzes the indoor thermal environment and comfort of apartment building units using the calculation software "THERB" and
compares three different heat insulation methods: outside insulation
on outside walls, inside insulation on outside walls, and interior insulation. In terms of indoor thermal environment, outside insulation is the best at stabilizing room temperature. In winter, the room temperature with
outside insulation after heating is higher than with the other methods and is kept 3-5 degrees higher throughout the night. However, the surface temperature with
outside insulation did not increase dramatically when heating was used, remaining 3 to 5 °C lower than the temperature with the other
insulation methods. The PMV with interior insulation falls nearly within the comfort range when heating and cooling are used.
Abstract: A large number of semantic web service composition
approaches have been developed by the research community, and
each is more efficient than the others in a particular situation of use.
A close look at the requirements of one's particular situation is
therefore necessary to find a suitable approach. In this paper, we
present a Technique Recommendation System (TRS) which, using a
classification of state-of-the-art semantic web service composition
approaches, provides the user with recommendations on which
service composition approach to use, based on parameters describing
the situation of use. TRS has a modular architecture and uses
production rules for knowledge representation.
Abstract: Effective employee selection is a critical component
of a successful organization. Many important criteria for personnel
selection such as decision-making ability, adaptability, ambition, and
self-organization are naturally vague and imprecise to evaluate. The
rough sets theory (RST), as a new mathematical approach to
vagueness and uncertainty, is a well-suited tool for dealing with
qualitative data and various decision problems. This paper provides
conceptual, descriptive, and simulation results, concentrating chiefly
on human resources and personnel selection factors. The current
research derives certain decision rules which are able to facilitate
personnel selection and identifies several significant features based
on an empirical study conducted in an IT company in Iran.
Abstract: Switched-mode converters now play a significant role in
modern society. Their operation is often crucial in various electrical
applications affecting everyday life. Therefore, the quality of the
converters needs to be reliably verified. Recent studies have shown
that the converters can be fully characterized by a set of frequency
responses, which can be used efficiently to validate their proper
operation. Consequently, several methods have been proposed to
measure the frequency responses quickly and accurately, most often
based on correlation techniques. These measurement methods are,
however, highly sensitive to external errors and system
nonlinearities. This fact has often been overlooked, and the
necessary uncertainty analysis of the measured responses has been
neglected. This paper presents a simple approach to analyzing the
noise and nonlinearities in frequency-response measurements of
switched-mode converters. Coherence analysis is applied to form a
confidence interval characterizing the noise and nonlinearities
involved in the measurements. The presented method is verified by
practical measurements from a high-frequency switched-mode
converter.
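Coherence analysis of the kind described can be sketched with SciPy's magnitude-squared coherence estimator. The simulated single-pole "converter" plant, the noise level, and the excitation below are illustrative assumptions; the point is that coherence near 1 indicates the response is dominated by the linear path, while low coherence flags noise or nonlinear distortion at that frequency.

```python
# Sketch: coherence between a broadband excitation and the measured
# response of a stand-in first-order plant with additive noise.
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 10_000.0
u = rng.standard_normal(200_000)            # broadband excitation
b, a = signal.butter(1, 500, fs=fs)         # stand-in converter dynamics
y = signal.lfilter(b, a, u) + 0.05 * rng.standard_normal(u.size)

# Welch-based magnitude-squared coherence estimate
f, Cxy = signal.coherence(u, y, fs=fs, nperseg=1024)

# In the passband the linear path dominates, so coherence stays near 1.
low_band = Cxy[f < 400]
print(low_band.mean())
```

A confidence interval on the measured frequency response can then be derived from the coherence estimate, since (1 - coherence) quantifies the fraction of output power not explained by the linear input-output path.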
Abstract: The values of managers and employees in organizations are phenomena that have captured the interest of researchers at large. Despite this attention, there continues to be a lack of agreement on what values are, how they influence individuals, or how they are constituted in individuals' minds. In this article, a content-based approach is presented as an alternative frame of reference for exploring values. In the content-based approach, human thinking in different contexts is set at the focal point. Differences in valuations can be explained through the information contents of mental representations. In addition to the information contents, attention is devoted to the cognitive processes through which mental representations of values are constructed. Such informational contents play a decisive role in understanding human behavior. By applying content-based analysis to an examination of values as mental representations, it is possible to reach deeper into the motivational foundation of behaviors, such as decision making in organizational procedures, through understanding the structure and meanings of the specific values at play.
Abstract: Clustering is the process of identifying homogeneous
groups of objects, called clusters; objects within a group or class
share similar characteristics. Clustering is an interesting topic in
data mining. This paper discusses a robust clustering process for
image data using two dimension-reduction approaches: two-dimensional
principal component analysis (2DPCA) and principal component
analysis (PCA). A standard approach to the high-dimensionality
problem is dimension reduction, which transforms high-dimensional
data into a lower-dimensional space with limited loss of information.
One of the most common forms of dimensionality reduction is
principal component analysis (PCA). 2DPCA, often called a variant
of PCA, treats the image matrices directly as 2D matrices; they do
not need to be transformed into vectors, so the image covariance
matrix can be constructed directly from the original image matrices.
The decomposed classical covariance matrix is, however, very
sensitive to outlying observations. The objective of this paper is to
compare the performance of the robust minimizing vector variance
(MVV) estimator in the two-dimensional projection (2DPCA) and in
PCA for clustering arbitrary image data when outliers are hidden in
the data set. The simulation aspects of robustness and an illustration
of image clustering are discussed at the end of the paper.
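The 2DPCA construction described above, building the covariance directly from image matrices without vectorization, can be sketched as follows. Note the sketch uses the classical covariance, not the paper's robust MVV estimator, and the synthetic two-class "images" and k-means clustering step are illustrative assumptions.

```python
# Sketch: classical 2DPCA projection followed by k-means clustering
# on synthetic images (the robust MVV step is omitted for brevity).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
h, w, n = 16, 16, 60
# Two synthetic image classes differing in mean pattern.
base = np.zeros((h, w)); base[:, :8] = 1.0
imgs = np.concatenate([
    base + 0.3 * rng.standard_normal((n // 2, h, w)),
    base.T + 0.3 * rng.standard_normal((n // 2, h, w)),
])
labels_true = np.array([0] * (n // 2) + [1] * (n // 2))

# 2DPCA: image covariance built directly from the image matrices,
# with no vectorization of individual images.
mean_img = imgs.mean(axis=0)
G = sum((A - mean_img).T @ (A - mean_img) for A in imgs) / n
eigvals, eigvecs = np.linalg.eigh(G)
V = eigvecs[:, -2:]                         # top-2 projection axes
feats = np.array([(A @ V).ravel() for A in imgs])

pred = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(feats)
acc = max((pred == labels_true).mean(), ((1 - pred) == labels_true).mean())
print(acc)
```

The robust variant would replace `G` with an MVV-based covariance estimate so that a few outlying images cannot dominate the projection axes.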
Abstract: Writer identification is one of the areas in pattern
recognition that attracts many researchers, particularly in forensic
and biometric applications, where writing style can be used as a
biometric feature for authenticating an identity. The challenging
task in writer identification is the extraction of unique features by
which the individuality of handwriting styles can be adopted into a
bio-inspired generalized global shape for writer identification. In
this paper, the feasibility of the generalized global shape concept of
complementary binding in the Artificial Immune System (AIS) for
writer identification is explored. An experiment based on the
proposed framework has been conducted to prove the validity and
feasibility of the proposed approach for off-line writer identification.
Abstract: The need for an appropriate system of evaluating students'
educational development is a key problem in achieving predefined
educational goals. The intensity of papers in recent years that try to
prove or disprove the necessity and adequacy of student assessment
corroborates this. Some of these studies have tried to increase the
precision of determining question weights in scientific examinations,
but all of them attempt to adjust the initial question weights while
the accuracy and precision of those initial weights remain in
question. Thus, in order to increase the precision of assessing
students' educational development, the present study proposes a
new method for determining the initial question weights by
considering factors of the questions such as difficulty, importance,
and complexity, and by implementing a combined method of
PROMETHEE and fuzzy analytic network process using a data
mining approach to improve the model's inputs. The results of the
implemented case study demonstrate the improved performance and
precision of the proposed model.
Abstract: This paper presents an investigation of how exploiting multiple transmit antennas by OFDM-based wireless LAN subscribers can reduce the physical-layer error rate. By comparing wireless LANs that utilize spatial diversity techniques with conventional ones, it reveals how PHY- and TCP-throughput behavior is improved. It then assesses the same issues in a cellular context, which is introduced as an innovative solution that, besides a multi-cell operation scenario, benefits from spatio-temporal signaling schemes as well. The presented simulations shed light on the improved performance of the wide-range, high-quality wireless LAN services provided by the proposed approach.
Abstract: The building life cycle is never free of defects and deterioration; these are common problems, found in newly built and aged buildings alike. Buildings constructed from wood are particularly affected by deteriorating agents, and serious defects and damage can reduce a building's value. In repair works, it is important to identify the causes and the repair techniques that best suit the condition. This paper reviews the conservation of traditional timber mosques in Malaysia, comprising the concept, principles, and approaches of mosque conservation in general. In conservation practice, wood in historic buildings can be conserved using various restoration and conservation techniques, which can be grouped as full and partial replacement, mechanical reinforcement, consolidation by impregnation and reinforcement, paint removal, and preservation of wood and control of insect invasion, so as to prolong and extend the function of timber in a building. The review finds that the techniques commonly adopted in timber mosque conservation are conventional ones, and that a sound understanding of the repair technique requires the use of preserved wood only, to prevent future premature defects.
Abstract: With the aim of improving the nutritional profile and antioxidant capacity of gluten-free cookies, blueberry pomace, a by-product of juice production, was processed into a new food ingredient by drying and grinding and used in a gluten-free cookie formulation. Since the quality of a baked product is highly influenced by the baking conditions, the objective of this work was to optimize the baking time and the thickness of the dough pieces by applying Response Surface Methodology (RSM) in order to obtain the best technological quality of the cookies. The experiments were carried out according to a Central Composite Design (CCD), with dough thickness and baking time as independent variables, and hardness, color parameters (L*, a* and b* values), water activity, diameter and short/long ratio as response variables. According to the results of the RSM analysis, a baking time of 13.74 min and a dough thickness of 4.08 mm were found to be optimal for a baking temperature of 170 °C. As similar optimal parameters were obtained in a previously conducted experiment based on sensory analysis, RSM can be considered a suitable approach to optimizing the baking process.
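The RSM/CCD workflow described, fitting a second-order model over a central composite design and locating its stationary point, can be sketched numerically. The two-factor CCD layout in coded units is standard; the synthetic "hardness" response with an optimum at (0.3, -0.2) is an illustrative assumption, not the paper's measurements.

```python
# Sketch: fit a full quadratic response surface to CCD data and
# solve for its stationary point (the fitted optimum).
import numpy as np

alpha = 1.414
# Two-factor CCD in coded units: factorial corners, axial points,
# and replicated center points.
pts = [(-1, -1), (-1, 1), (1, -1), (1, 1),
       (-alpha, 0), (alpha, 0), (0, -alpha), (0, alpha),
       (0, 0), (0, 0), (0, 0)]
X1 = np.array([p[0] for p in pts])
X2 = np.array([p[1] for p in pts])
# Illustrative response with an optimum near (0.3, -0.2) in coded units.
y = 5 + (X1 - 0.3) ** 2 + 0.5 * (X2 + 0.2) ** 2

# Full quadratic model: b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
A = np.column_stack([np.ones_like(X1), X1, X2, X1 * X2, X1**2, X2**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
b0, b1, b2, b12, b11, b22 = coef

# Stationary point: set the gradient of the fitted quadratic to zero.
H = np.array([[2 * b11, b12], [b12, 2 * b22]])
opt = np.linalg.solve(H, -np.array([b1, b2]))
print(opt)   # coded coordinates of the fitted optimum
```

In practice the coded optimum is mapped back to natural units (minutes, millimetres) via the coding used to set up the design.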
Abstract: The simulation of the extrusion process is widely studied
in order both to increase output and to improve quality, with broad
application in wire coating. The annular tube-tooling extrusion was
modeled by the Navier-Stokes equations together with a rheological
model of differential form based on the single-mode exponential
Phan-Thien/Tanner constitutive equation, in a two-dimensional
cylindrical coordinate system, for predicting the contraction point of
the polymer melt beyond the die. Numerical solutions are sought
through a semi-implicit Taylor-Galerkin pressure-correction finite
element scheme. The investigation focused on incompressible
creeping flow with long relaxation time, in terms of Weissenberg
numbers up to 200. The isothermal case was considered, with
surface tension effects on the free surface in extrudate flow and no
slip at the die wall. The Streamline Upwind Petrov-Galerkin method
was employed to stabilize the solution. The structure of the mesh
after the die exit was adjusted following the prediction of both the
top and bottom free surfaces, so as to keep the location of the
contraction point around one
unit length, which is close to experimental results.
Abstract: This paper focuses on operational risk measurement
techniques and on economic capital estimation methods. A data
sample of operational losses provided by an anonymous Central
European bank is analyzed using several approaches. Loss
Distribution Approach and scenario analysis method are considered.
Custom plausible loss events defined in a particular scenario are
merged with the original data sample and their impact on capital
estimates and on the financial institution is evaluated. Two main
questions are assessed: what is the most appropriate statistical
method to measure and model the operational loss data distribution,
and what is the impact of hypothetical plausible events on the
financial institution? The g&h distribution was found to be the most
suitable for operational risk modeling. The method based on the
combination of historical loss events modeling and scenario analysis
provides reasonable capital estimates and allows for the measurement
of the impact of extreme events on banking operations.
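A Loss Distribution Approach capital estimate of the kind discussed can be sketched with Monte Carlo simulation. The sketch uses a Poisson frequency and a lognormal severity, a common textbook stand-in for the paper's g&h severity, and all parameter values are illustrative, not calibrated to the bank's data.

```python
# Hedged LDA sketch: simulate annual aggregate losses and take the
# 99.9% quantile as the economic capital estimate.
import numpy as np

rng = np.random.default_rng(0)
n_years = 100_000
lam = 25                 # expected loss events per year (assumed)
mu, sigma = 10.0, 2.0    # lognormal severity parameters (assumed)

counts = rng.poisson(lam, n_years)               # frequency per year
annual = np.array([rng.lognormal(mu, sigma, k).sum() for k in counts])

expected = annual.mean()                          # expected annual loss
capital = np.quantile(annual, 0.999)              # 99.9% quantile capital
print(f"expected annual loss {expected:,.0f}, capital {capital:,.0f}")
```

Plausible scenario events would be merged into the loss sample before the quantile is taken, which is how their impact on the capital estimate can be measured.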
Abstract: Speckled images arise when coherent microwave,
optical, and acoustic imaging techniques are used to image an object, surface or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar
systems, and medical ultrasound systems. Speckle noise is a form of object or target induced noise that results when the surface of the object is Rayleigh rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted
by speckle noise is complicated by the nature of the noise and is not
as straightforward as detection and estimation in additive noise. In
this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The
motivation for this work is the problem of segmentation and tissue classification using ultrasound imaging. Modeling of speckle in this
context involves a partially developed speckle model in which an underlying Poisson point process modulates a Gram-Charlier series
of Laguerre weighted exponential functions, resulting in a doubly
stochastic filtered Poisson point process. The statistical distribution of partially developed speckle is derived in a closed canonical form.
It is observed that as the mean number of scatterers in a resolution cell is increased, the probability density function approaches an
exponential distribution. This is consistent with fully developed speckle noise as demonstrated by the Central Limit theorem.
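The fully developed limit stated above, intensity tending to an exponential distribution as the number of scatterers per resolution cell grows, can be checked with a random-phasor-sum simulation. The scatterer count and sample size below are illustrative choices.

```python
# Sketch: fully developed speckle as a random-phasor sum. For an
# exponential intensity distribution, mean and standard deviation are
# equal, so the intensity SNR (mean/std) tends to 1.
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_scatterers = 50_000, 200

phases = rng.uniform(0, 2 * np.pi, (n_cells, n_scatterers))
field = np.exp(1j * phases).sum(axis=1) / np.sqrt(n_scatterers)
intensity = np.abs(field) ** 2     # normalized so mean intensity -> 1

snr = intensity.mean() / intensity.std()
print(round(snr, 3))
```

An SNR of 1 is the classic hallmark of fully developed speckle; partially developed speckle (few scatterers per cell) departs from this value, which is what the Poisson-modulated model in the abstract captures.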