Abstract: This paper compares the Hilditch, Rosenfeld, Zhang-Suen, and Nagendraprasad-Wang-Gupta (NWG) thinning algorithms for Javanese character image recognition. Thinning is effective when the focus is not on the size of the pattern but on the relative position of the strokes in the pattern. The research analyzes the thinning of 60 Javanese characters. In terms of processing time, the Zhang-Suen algorithm gives the best results, with an average process time of 0.00455188 seconds. By the percentage of pixels that meet one-pixel thickness, the Rosenfeld algorithm gives the best results, with a 99.98% success rate. By the number of pixels erased, the NWG algorithm gives the best results, erasing 84.12% of pixels on average. It can be concluded that the Hilditch algorithm performs least successfully of the four algorithms.
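As a concrete point of reference for one of the compared methods, the two-subiteration structure of the Zhang-Suen algorithm can be sketched as follows. This is the generic textbook formulation in Python with NumPy, not the authors' implementation, and the convention 1 = foreground is an assumption.

```python
import numpy as np

def zhang_suen_thin(img):
    """Iteratively thin a binary image (1 = foreground) toward one-pixel strokes."""
    img = img.copy().astype(np.uint8)
    changed = True
    while changed:
        changed = False
        for step in (0, 1):                      # the two Zhang-Suen subiterations
            to_delete = []
            for i in range(1, img.shape[0] - 1):
                for j in range(1, img.shape[1] - 1):
                    if img[i, j] != 1:
                        continue
                    # 8-neighbours p2..p9, clockwise from the pixel above
                    p2, p3, p4, p5, p6, p7, p8, p9 = (
                        img[i-1, j], img[i-1, j+1], img[i, j+1], img[i+1, j+1],
                        img[i+1, j], img[i+1, j-1], img[i, j-1], img[i-1, j-1])
                    n = [p2, p3, p4, p5, p6, p7, p8, p9]
                    B = sum(n)                   # number of foreground neighbours
                    # A = number of 0 -> 1 transitions in the circular sequence
                    A = sum(n[k] == 0 and n[(k+1) % 8] == 1 for k in range(8))
                    if step == 0:
                        cond = p2*p4*p6 == 0 and p4*p6*p8 == 0
                    else:
                        cond = p2*p4*p8 == 0 and p2*p6*p8 == 0
                    if 2 <= B <= 6 and A == 1 and cond:
                        to_delete.append((i, j))
            for i, j in to_delete:               # delete simultaneously per subiteration
                img[i, j] = 0
                changed = True
    return img
```

A one-pixel-thick stroke is already a fixed point of the rules, which is exactly the "one-pixel thickness" criterion the comparison measures.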
Abstract: Protection and proper management of archaeological heritage are essential to its study and interpretation by present and future generations, and protecting archaeological heritage depends on multidisciplinary professional collaboration. This study gathers data from different sources (photogrammetry and a Geographic Information System (GIS)), integrated for the purpose of documenting one of the significant archaeological sites of Jordan (Ahl-Alkahf). 3D modeling deals with the actual image of the features, shapes and texture, representing reality as realistically as possible through texturing. The 3D coordinates resulting from the photogrammetric adjustment procedures are used to create 3D models of the study area, and adding textures to the 3D model surfaces gives the displayed models a 'real world' appearance. The GIS combines all the data, including boundary maps indicating the locations of archaeological sites, a transportation layer, a digital elevation model and orthoimages. For a realistic representation of the study area, a 3D GIS model was prepared, in which such spatial data can be efficiently generated, managed and visualized.
Abstract: We consider linear regression models where both the input data (the values of the independent variables) and the output data (the observations of the dependent variable) are interval-censored. We introduce a possibilistic generalization of the least squares estimator, the so-called OLS-set for the interval model. This set captures the impact on the OLS estimator of the loss of information caused by interval censoring and provides a tool for quantifying this effect. We study complexity-theoretic properties of the OLS-set. We also deal with restricted versions of the general interval linear regression model, in particular the crisp input – interval output model. We give an argument that natural descriptions of the OLS-set in the crisp input – interval output case cannot be computed in polynomial time. We then derive easily computable approximations of the OLS-set which can be used instead of the exact description, and we illustrate the approach with an example.
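To make the object concrete: an inner approximation of the OLS-set can be obtained by sampling realizations from the intervals and collecting the resulting OLS estimates. The sketch below does this for a one-regressor model with intercept; the interval data are illustrative, not from the paper, and this Monte Carlo approach is a generic device, not the approximation derived by the authors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical interval data: each observation is a [lower, upper] interval.
X_lo = np.array([[1.0], [2.0], [3.0], [4.0]])
X_hi = X_lo + 0.2
y_lo = np.array([2.0, 3.9, 6.1, 7.8])
y_hi = y_lo + 0.4

def ols(X, y):
    """Ordinary least squares with intercept."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta

# Each sampled realization inside the intervals yields one OLS estimate;
# the collection is an inner (sampling) approximation of the OLS-set.
estimates = []
for _ in range(500):
    X = rng.uniform(X_lo, X_hi)
    y = rng.uniform(y_lo, y_hi)
    estimates.append(ols(X, y))
estimates = np.array(estimates)

# Componentwise bounds quantify how much information interval censoring destroys.
lo, hi = estimates.min(axis=0), estimates.max(axis=0)
```

The spread between `lo` and `hi` grows with the interval widths, which is the quantification effect the OLS-set is designed to capture.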
Abstract: This paper proposes a methodology for analyzing the dynamic behavior of a robotic manipulator in continuous time. The nonlinear system is first decomposed into linear submodels and analyzed in the context of Linear Parameter Varying (LPV) systems. The obtained linear submodels, which represent the local dynamic behavior of the robotic manipulator at selected operating points, are grouped into a Takagi-Sugeno fuzzy structure. The resulting fuzzy model is analyzed and validated through analog simulation as a universal approximator of the robotic manipulator.
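Grouping local linear submodels into a Takagi-Sugeno structure amounts to a membership-weighted blend of the local dynamics. A minimal sketch with two rules, where the matrices are illustrative values for a hypothetical one-link manipulator linearized at two operating points (not the paper's identified submodels):

```python
import numpy as np

# Hypothetical local linear submodels x' = A_i x + B_i u, identified at
# two operating points (illustrative numbers only).
A1, B1 = np.array([[0.0, 1.0], [-9.8, -0.5]]), np.array([0.0, 1.0])
A2, B2 = np.array([[0.0, 1.0], [-6.2, -0.5]]), np.array([0.0, 1.0])

def memberships(theta, lo=0.0, hi=np.pi / 2):
    """Triangular membership grades of the scheduling variable (joint angle)."""
    w2 = np.clip((theta - lo) / (hi - lo), 0.0, 1.0)
    return 1.0 - w2, w2          # grades sum to 1

def ts_dynamics(x, u):
    """Takagi-Sugeno model: membership-weighted sum of the local dynamics."""
    w1, w2 = memberships(x[0])
    A = w1 * A1 + w2 * A2
    B = w1 * B1 + w2 * B2
    return A @ x + B * u
```

At each operating point the blend reproduces the corresponding linear submodel exactly, and between them it interpolates smoothly, which is what makes the structure a universal approximator of the nonlinear plant.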
Abstract: The main purpose of this research is the calculation of implicit prices of the environmental level of air quality in the city of Moscow on the basis of housing property prices. The database used contains records of approximately 20 thousand apartments and has been provided by a leading real estate agency operating in Russia. The explanatory variables include physical characteristics of the houses, environmental data (industry emissions), neighbourhood sociodemographic data, and geographic data (the GPS coordinates of each house). The hedonic regression results for the ecological variables show negative implicit prices as the level of air contamination from substances such as carbon monoxide, nitrogen dioxide, sulphur dioxide, and particles (CO, NO2, SO2, TSP) increases. The marginal willingness to pay for higher environmental quality is presented for linear and log-log models.
Abstract: Most real queuing systems have special properties and constraints that cannot be analyzed directly using the results of solved classical queuing models; the absence of Markov-chain features, non-exponential patterns, and service constraints are among these conditions. This paper presents an applied general algorithm for analyzing and optimizing queuing systems. The stages of the algorithm are described through a real case study: an almost completely non-Markov system with a limited number of customers and limited capacities, as well as many of the exceptions common in real queuing networks. Simulation is used to optimize this system. The stages presented in this article include primary modeling, determining the kind of queuing system, index definition, statistical analysis and goodness-of-fit testing, model validation, and simulation-based optimization of the system.
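For a single-server FIFO station with non-exponential service, the simulation stage can be as simple as iterating the Lindley recursion. The sketch below is an illustrative M/G/1 stand-in, not the case-study system; the arrival and service distributions are assumptions chosen so the result can be checked against the Pollaczek-Khinchine formula.

```python
import random

def simulate_queue(n=50000, seed=7):
    """Mean waiting time of a single-server FIFO queue via the Lindley
    recursion W_{k+1} = max(0, W_k + S_k - A_{k+1}).

    Poisson arrivals (rate 1) with uniform, i.e. non-exponential,
    service times. The Pollaczek-Khinchine value for this setup is
    lambda * E[S^2] / (2 * (1 - rho)) = 0.28, a useful validation check.
    """
    rng = random.Random(seed)
    wait, total = 0.0, 0.0
    for _ in range(n):
        service = rng.uniform(0.2, 0.8)   # E[S] = 0.5, E[S^2] = 0.28
        inter = rng.expovariate(1.0)      # exponential interarrival times
        wait = max(0.0, wait + service - inter)
        total += wait
    return total / n
```

Comparing the simulated mean against a known analytic special case is one way to carry out the validation stage the abstract lists before optimization.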
Abstract: Innovations in technology have created new ethical challenges. Essential use of electronic communication in the workplace has escalated at an astronomical rate over the past decade. As such, legal and ethical dilemmas confronted by both the employer and the employee concerning managerial control and ownership of e-information have increased dramatically in the USA. From the employer's perspective, ownership and control of all information created for the workplace is an undeniable source of economic advantage and must be monitored zealously. From the perspective of the employee, individual rights such as privacy, freedom of speech, and freedom from unreasonable search and seizure continue to be stalwart legal guarantees that employers are not legally or ethically entitled to abridge in the workplace. These issues have been the source of great debate and the catalyst for legal reform. The fine line between the ethical and the legal has been complicated by emerging technologies. This manuscript identifies and discusses a number of specific legal and ethical issues raised by the dynamic electronic workplace, and concludes with suggestions that employers should follow to respect the delicate balance between employees' legal right to privacy and the employer's right to protect its knowledge systems and infrastructure.
Abstract: This study investigated the antidiabetic and antioxidant potential of Pseuduvaria macrophylla bark extract in streptozotocin–nicotinamide-induced type 2 diabetic rats. LC-MS Q-TOF and NMR experiments were performed to determine the chemical composition of the methanolic bark extract. For the in vivo experiments, the STZ-induced diabetic rats (60 mg/kg b.w. STZ, 15 min after 120 mg/kg nicotinamide, i.p.) were treated with the methanolic extract of Pseuduvaria macrophylla (200 and 400 mg/kg b.w.) or with glibenclamide (2.5 mg/kg) as the positive control. Biochemical parameters were assayed in blood samples of all groups of rats. Pro-inflammatory cytokines, antioxidant status, and plasma transforming growth factor beta-1 (TGF-β1) were evaluated. Pancreas histology was examined, and insulin expression was assessed by immunohistochemistry. In addition, the expression of the glucose transporters GLUT-1, GLUT-2 and GLUT-4 in pancreatic tissue was assessed by western blot analysis. The outcomes of the study showed that the methanolic bark extract of Pseuduvaria macrophylla normalized the elevated blood glucose levels and improved serum insulin and C-peptide levels, with a significant increase in antioxidant enzymes and reduced glutathione (GSH) and a decrease in the level of lipid peroxidation (LPO). Additionally, the extract markedly decreased the levels of serum pro-inflammatory cytokines and of TGF-β1. Histopathological analysis demonstrated that Pseuduvaria macrophylla has the potential to protect the pancreas of diabetic rats against peroxidative damage by downregulating oxidative stress and hyperglycaemia. Furthermore, the expression of insulin, GLUT-1, GLUT-2 and GLUT-4 in pancreatic cells was enhanced. The findings of this study support the antidiabetic claims for Pseuduvaria macrophylla bark.
Abstract: In image processing, image compression can improve the performance of digital systems by reducing the cost and time of image storage and transmission without significant reduction of image quality. This paper describes a hardware architecture for a low-complexity Discrete Cosine Transform (DCT) for image compression [6]. In this DCT architecture, common computations are identified and shared to remove redundant computations in the DCT matrix operation, and vector processing is used to implement the DCT. The resulting reduction in the computational complexity of the 2D DCT reduces power consumption. The 2D DCT is performed on an 8x8 matrix using two 1-dimensional DCT blocks and a transposition memory [7]. The inverse discrete cosine transform (IDCT) is performed to obtain the image matrix and reconstruct the original image. The proposed image compression algorithm is implemented in MATLAB, and the VLSI design of the architecture is implemented in Verilog HDL. The proposed hardware architecture for image compression employing the DCT was synthesized using RTL Compiler and mapped using 180 nm standard cells. Simulation was done using ModelSim, and the simulation results from MATLAB and Verilog HDL were compared. Detailed power and area analysis was done using RTL Compiler from Cadence. The power consumption of the DCT core is reduced to 1.027 mW with minimum area [1].
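The row-column structure described above (two 1-D DCT blocks with a transposition in between) can be modeled in a few lines of NumPy for reference. This is the textbook orthonormal DCT-II, a golden model one might compare simulation output against, not the synthesized hardware datapath:

```python
import numpy as np

def dct_matrix(N=8):
    """Orthonormal DCT-II basis matrix."""
    C = np.zeros((N, N))
    for k in range(N):
        for n in range(N):
            C[k, n] = np.cos(np.pi * (2 * n + 1) * k / (2 * N))
    C[0, :] *= np.sqrt(1.0 / N)
    C[1:, :] *= np.sqrt(2.0 / N)
    return C

def dct2(block):
    """2-D DCT as two 1-D passes with a transposition in between,
    mirroring the row-column hardware structure."""
    C = dct_matrix(block.shape[0])
    return C @ block @ C.T        # columns pass, transpose, rows pass

def idct2(coeffs):
    """Inverse 2-D DCT (C is orthonormal, so its transpose inverts it)."""
    C = dct_matrix(coeffs.shape[0])
    return C.T @ coeffs @ C
```

Round-tripping an 8x8 block through `dct2` and `idct2` reconstructs it exactly (up to floating-point error), which is the IDCT reconstruction step the abstract describes.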
Abstract: The design of technological procedures for manufacturing certain products demands the definition and optimization of technological process parameters. Their determination depends on the model of the process itself and on its complexity. Certain processes do not have an adequate mathematical model and are thus modeled using heuristic methods. The first part of this paper presents the state of the art of using soft computing techniques in manufacturing processes, from the perspective of their applicability in modern CAx systems, and analyzes the methods of artificial intelligence that can be used for this purpose. The second part of this paper shows some of the developed models of certain processes, as well as their applicability in the actual calculation of parameters of some technological processes within the design system, from the viewpoint of productivity.
Abstract: Raman spectroscopy is used to characterize the chemical changes induced by radiation in a normoxic polyhydroxyethylacrylate (PHEA) gel dosimeter. Irradiations in the low-dose region are performed, and the polymerization of the PHEA gels is monitored by observing the changes in the Raman shift intensity of the carbon covalent bonds of PHEA, originating from both the monomer and the cross-linker. The variation in peak intensities with absorbed dose was observed: as the dose increases, the peak intensities of the carbon covalent bonds in the polymer gels decrease. This indicates that the absorbed dose affects the polymerization of the polymer gels: as the absorbed dose increases, the degree of polymerization also increases. The results verify that PHEA gel dosimeters are sensitive even in the low-dose region.
Abstract: The rapidly increasing costs of power line extensions and fossil fuel, combined with the desire to reduce carbon dioxide emissions, have pushed the development of hybrid power systems suited for remote locations, with the goal being autonomous local power systems. The paper presents the suggested solution for a "high penetration" hybrid power system, determined by the location of the settlement and its "zero policy" on carbon dioxide emissions. The paper focuses on the technical solution and the power flow management algorithm of the system, taking local development conditions into consideration.
Abstract: Efficient classification methods are necessary for automatic fingerprint recognition systems. This paper introduces a new structural approach to fingerprint classification that uses the directional image of the fingerprint to increase the number of subclasses. In this method, the directional image of the fingerprint is segmented into regions consisting of pixels with the same direction. The relational graph of the segmented image is then constructed, and from it the super graph, containing the prominent information of the relational graph, is formed. Finally, a matching technique with a cost function is applied to compare the obtained graph with the model graphs in order to classify the fingerprint. By increasing the number of subclasses with acceptable classification accuracy and providing faster processing in fingerprint recognition, this system is superior.
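As a rough illustration of the starting point of such a pipeline, a directional image can be formed by quantizing block-averaged ridge orientations estimated with the doubled-angle gradient method. The block size and the number of direction classes below are arbitrary choices, and the segmentation, relational graph, and matching steps of the paper are not reproduced:

```python
import numpy as np

def directional_image(img, block=4, n_dirs=8):
    """Quantized local ridge orientation per block -- a sketch of the
    'directional image' that the classification starts from."""
    gy, gx = np.gradient(img.astype(float))
    h, w = img.shape
    H, W = h // block, w // block

    def pool(a):
        # Sum each non-overlapping block x block window.
        return a[:H*block, :W*block].reshape(H, block, W, block).sum(axis=(1, 3))

    # Doubled-angle averaging gives a stable orientation estimate per block.
    theta = 0.5 * np.arctan2(pool(2 * gx * gy), pool(gx**2 - gy**2))
    # Quantize [-pi/2, pi/2) into n_dirs direction classes.
    return ((theta + np.pi / 2) / (np.pi / n_dirs)).astype(int) % n_dirs
```

Pixels (here, blocks) sharing the same direction class are exactly what the method groups into regions before building the relational graph.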
Abstract: Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of regression models using the same learning algorithm as the base learner. Boosting algorithms are considered stronger than bagging on noise-free data; however, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in this work we built an ensemble that averages a bagging ensemble and a boosting ensemble, with 10 sub-learners in each. We performed a comparison with simple bagging and boosting ensembles with 25 sub-learners on standard benchmark datasets, and the proposed ensemble gave better accuracy.
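The proposed combination is simply an average of the two ensembles' predictions. A sketch with scikit-learn on synthetic data (not the benchmark datasets), assuming decision trees as the base learner, since the abstract does not name one:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import AdaBoostRegressor, BaggingRegressor
from sklearn.model_selection import train_test_split

# Synthetic regression task standing in for the benchmark datasets.
X, y = make_regression(n_samples=400, n_features=10, n_informative=5,
                       noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One bagging and one boosting ensemble, 10 sub-learners each
# (decision trees by default -- an assumption about the base learner).
bag = BaggingRegressor(n_estimators=10, random_state=0).fit(X_tr, y_tr)
boost = AdaBoostRegressor(n_estimators=10, random_state=0).fit(X_tr, y_tr)

# The proposed ensemble: average the two sets of predictions.
y_pred = 0.5 * (bag.predict(X_te) + boost.predict(X_te))
```

The total model count (10 + 10) stays below the 25-sub-learner baselines it is compared against, so the gain is not simply from a larger ensemble.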
Abstract: A cognitive collaborative reinforcement learning algorithm (CCRL) that incorporates an advisor into the learning process is developed to improve supervised learning. An autonomous learner is endowed with a self-awareness cognitive skill for deciding when to solicit instructions from the advisor. The learner can also assess the value of advice and accept or reject it. The method is evaluated for robotic motion planning using simulation, with tests conducted for advisors whose skill levels range from expert to novice. The CCRL algorithm, and a combined method integrating its logic with Clouse's Introspection Approach, outperformed a baseline fully autonomous learner and demonstrated robust performance across various advisor skill levels, learning to accept advice received from an expert while rejecting that of less skilled collaborators. Although the CCRL algorithm is based on RL, it fits other machine learning methods, since the advisor's actions are only added to the outer layer.
Abstract: This paper presents a model of case-based corporate memory named ReCaRo (REsource, CAse, ROle). The approach suggested in ReCaRo decomposes the domain to be modeled into a set of components. These components represent the objects developed by the company during its activity; they are reused, sometimes with adaptations, and are enriched with knowledge after each reuse. ReCaRo builds the corporate memory on the basis of these components. It models two types of knowledge: 1) business knowledge, which constitutes the main knowledge capital of the company and refers to its basic skill, and thus directly to the components, and 2) experience knowledge, a specialised knowledge that represents the experience gained during the handling of business knowledge. ReCaRo builds corporate memories which are made up of five communicating memories.
Abstract: Optimizing equipment selection in heavy earthwork operations is a critical key to the success of any construction project. The objective of this research was to develop a computer model to assist contractors and construction managers in estimating the cost of heavy earthwork operations. An economical operation analysis was conducted for an equipment fleet, taking into consideration the owning and operating costs involved in earthwork operations. The model is being developed in a Microsoft environment and is capable of being integrated with other estimating and optimization models. In this study, the Caterpillar® Performance Handbook [5] was the main resource used to obtain the specifications of the selected equipment. The implementation of the model gives an optimum selection of the equipment fleet based not only on cost effectiveness but also on versatility. To validate the model, a case study of an actual dam construction project was selected to quantify its degree of accuracy.
Abstract: This paper presents a way of hiding a text message in a gray image (steganography). The method first finds the binary value of each character of the text message; it then finds the dark (black) places of the gray image by converting the original image to a binary image and labeling each object of the image using 8-connectivity. These images are then converted to RGB images in order to locate the dark places, because in this way each sequence of gray levels turns into an RGB color and the dark level of the gray image can be found; if the gray image is very light, the histogram must be adjusted manually so that only dark places are found. In the final stage, each group of 8 dark-place pixels is treated as a byte, and the binary value of each character is put into the low bit of each byte formed from the dark-place pixels, increasing the security of the basic LSB steganography method.
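A simplified NumPy sketch of dark-region LSB embedding: here the "dark places" are just pixels below a hypothetical intensity threshold, rather than the labeled 8-connected objects and RGB conversion described above. Because flipping a least significant bit changes a pixel by at most 1, a pixel never crosses the threshold, so the extractor recovers the same set of dark pixels.

```python
import numpy as np

def embed(img, message, dark_thresh=64):
    """Hide `message` in the LSBs of 'dark' pixels of a grayscale image."""
    bits = np.unpackbits(np.frombuffer(message.encode(), dtype=np.uint8))
    stego = img.copy()
    flat = stego.ravel()                       # view into the copy
    dark = np.flatnonzero(flat < dark_thresh)  # indices of dark pixels
    if len(bits) > len(dark):
        raise ValueError("message too long for the dark region")
    idx = dark[:len(bits)]
    flat[idx] = (flat[idx] & 0xFE) | bits      # overwrite the low bit
    return stego

def extract(stego, n_chars, dark_thresh=64):
    """Read the message back from the LSBs of the dark pixels."""
    flat = stego.ravel()
    dark = np.flatnonzero(flat < dark_thresh)
    bits = flat[dark[:8 * n_chars]] & 1
    return np.packbits(bits).tobytes().decode()
```

Non-dark pixels are left untouched, so the visible (light) parts of the cover image are bit-for-bit identical to the original.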
Abstract: The effects of dynamic subgrid-scale (SGS) models are investigated in variational multiscale (VMS) LES simulations of bluff-body flows. The spatial discretization is based on a mixed finite element/finite volume formulation on unstructured grids. In the VMS approach used in this work, the separation between the largest and the smallest resolved scales is obtained through a variational projection operator and finite volume cell agglomeration. The dynamic versions of the Smagorinsky and WALE SGS models are used to account for the effects of the unresolved scales; in the VMS approach, these effects are modeled only in the smallest resolved scales. The dynamic VMS-LES approach is applied to the simulation of the flow around a circular cylinder at Reynolds numbers 3900 and 20000 and to the flow around a square cylinder at Reynolds numbers 22000 and 175000. As in previous studies, it is observed that the dynamic SGS procedure has a smaller impact on the results within the VMS approach than in LES, but improvements are demonstrated for important features such as the recirculating part of the flow. The global prediction is improved at a small additional computational cost.
Abstract: Speckle noise affects all coherent imaging systems, including medical ultrasound. In medical images, noise suppression is a particularly delicate and difficult task: a tradeoff between noise reduction and the preservation of actual image features has to be made in a way that enhances the diagnostically relevant image content. Even though wavelets have been used extensively for denoising speckle images, we have found that denoising using contourlets gives much better performance in terms of SNR, PSNR, MSE, variance, and correlation coefficient. The objective of the paper is to determine the number of levels of Laplacian pyramidal decomposition, the number of directional decompositions to perform on each pyramidal level, and the thresholding schemes that yield optimal despeckling of medical ultrasound images in particular. In the proposed method, the log-transformed original ultrasound image is subjected to the contourlet transform to obtain contourlet coefficients, and the transformed image is denoised by applying thresholding techniques to the individual band-pass subbands using a Bayes shrinkage rule. We quantify the achieved performance improvement.
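The Bayes shrinkage rule applied to each subband computes a data-driven threshold T = σ_n²/σ_x, where the noise level σ_n is typically estimated from the median absolute deviation of the finest subband. A NumPy sketch of the rule itself (the contourlet decomposition, which is not available in standard libraries, is omitted):

```python
import numpy as np

def bayes_shrink_threshold(subband):
    """BayesShrink threshold T = sigma_n^2 / sigma_x for one subband.

    sigma_n is the robust MAD noise estimate; sigma_x is the signal
    standard deviation recovered from sigma_y^2 = sigma_x^2 + sigma_n^2.
    """
    sigma_n = np.median(np.abs(subband)) / 0.6745
    sigma_y2 = np.mean(subband ** 2)
    sigma_x = np.sqrt(max(sigma_y2 - sigma_n ** 2, 1e-12))
    return sigma_n ** 2 / sigma_x

def soft_threshold(coeffs, t):
    """Soft thresholding: shrink coefficients toward zero by t."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)
```

When a subband is almost pure noise, σ_x is tiny and the threshold becomes large, wiping the subband out; strong directional features keep σ_x large and survive shrinkage, which is the feature-preserving behavior the despeckling tradeoff requires.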