Abstract: In recent years, high dynamic range (HDR) imaging has gained popularity with the advancement of digital photography. In this contribution we present a subjective evaluation of various tone reproduction and tone mapping techniques by a number of participants. First, standard HDR images were used and the participants were asked to rate them according to a given rating scheme. The participants were then asked to rate HDR images generated from multiple-exposure images using linear and nonlinear combination approaches. The experimental results showed that linearly generated HDR images have better visualization than the nonlinearly combined ones. In addition, the Reinhard et al. and exponential tone mapping operators showed better results compared to the logarithmic and Garrett et al. tone mapping operators.
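As an illustration of the kinds of operators compared, the following Python sketch implements the well-known Reinhard et al. global operator and a simple exponential operator; the key value and the luminance input are generic placeholders, not the settings used in the study.

    import numpy as np

    def reinhard_global(lum, key=0.18, eps=1e-6):
        """Reinhard et al. (2002) global tone mapping of an HDR luminance map."""
        log_avg = np.exp(np.mean(np.log(lum + eps)))   # log-average luminance
        scaled = (key / log_avg) * lum                 # map scene to the chosen key
        return scaled / (1.0 + scaled)                 # compress highlights toward 1

    def exponential(lum, eps=1e-6):
        """Simple exponential tone mapping operator."""
        return 1.0 - np.exp(-lum / (np.mean(lum) + eps))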
Abstract: Functional imaging procedures for the non-invasive assessment of tissue microcirculation are in high demand, but they require a mathematical approach describing the trans- and intercapillary passage of tracer particles. Up to now, two theoretically distinct concepts have been established for tracer kinetic modeling of contrast agent transport in tissue: pharmacokinetic compartment models, which are usually written as coupled differential equations, and the indicator dilution theory (IDT), which can be generalized in accordance with the theory of linear time-invariant (LTI) systems by using a convolution approach. Based on mathematical considerations, it can be shown that, also in the case of an open two-compartment model well known from functional imaging, the concentration-time course in tissue is given by a convolution, which allows a separation of the arterial input function from a system function, the impulse response function, summarizing the available information on tissue microcirculation. For this reason it is possible to integrate the open two-compartment model into the system-theoretic concept of indicator dilution theory, and thus results known from IDT remain valid for the compartment approach. Given the large number of applications of compartmental analysis, similar solutions of the so-called forward problem, even in a more general context, can already be found in the extensive literature of the seventies and early eighties. Nevertheless, to this day there seems to be a divide between the two approaches within the field of biomedical imaging, though not from the mathematical point of view, which the author would like to bridge by an exemplary analysis of this well-known model.
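As a minimal numerical sketch of this separation, the tissue concentration can be computed as the discrete convolution of an arterial input function with a biexponential impulse response, the functional form that arises from an open two-compartment model; all curve shapes and parameter values below are illustrative, not taken from the paper.

    import numpy as np

    dt = 0.1                               # sampling interval [s]
    t = np.arange(0, 120, dt)              # time axis [s]

    aif = t * np.exp(-t / 8.0)             # hypothetical gamma-variate AIF
    h = 0.05 * np.exp(-0.05 * t) + 0.02 * np.exp(-0.01 * t)  # biexponential impulse response

    # Discrete convolution approximates the continuous integral
    # C_t(t) = (AIF * h)(t); the dt factor keeps the units consistent.
    c_tissue = np.convolve(aif, h)[: len(t)] * dt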
Abstract: Despite the various methods that exist for software risk management, software projects have a high rate of failure. As the complexity and size of projects increase, managing software development becomes more difficult, and in these projects the need for more analysis and risk assessment is vital. In this paper, a classification of software risks is specified. The relations between these risks are then presented using a risk tree structure. Analysis and assessment of these risks are done using probabilistic calculations. This analysis supports qualitative and quantitative assessment of the risk of failure, and it can aid the software risk management process. This classification and risk tree structure can be applied in software tools.
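A minimal sketch of the probabilistic calculations over such a risk tree, assuming independent risks combined through AND/OR nodes (the node structure and probabilities below are illustrative, not from the paper):

    def p_or(probs):
        """Probability that at least one independent child risk occurs."""
        p = 1.0
        for q in probs:
            p *= (1.0 - q)
        return 1.0 - p

    def p_and(probs):
        """Probability that all independent child risks occur together."""
        p = 1.0
        for q in probs:
            p *= q
        return p

    # Example: project failure if either a requirements risk occurs, or
    # schedule and staffing risks materialize together.
    p_failure = p_or([0.10, p_and([0.30, 0.20])])   # = 0.154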
Abstract: Hearing impairment is the number one chronic disability affecting many people in the world. Background noise is particularly damaging to speech intelligibility for people with hearing loss, especially for patients with sensorineural loss. Several investigations on speech intelligibility have demonstrated that sensorineural loss patients need a 5-15 dB higher SNR than normal hearing subjects. This paper describes a Discrete Hartley Transform power-normalized Least Mean Square (DHT-LMS) algorithm to improve the SNR and the convergence behavior of the Least Mean Square (LMS) algorithm for sensorineural loss patients. The DHT transforms n real numbers to n real numbers and has the convenient property of being its own inverse; it can be used effectively for noise cancellation with less convergence time. The simulated results show superior characteristics, improving the SNR by at least 9 dB for an input SNR of 0 dB and giving a faster convergence rate (eigenvalue ratio 12) compared to the time-domain method and DFT-LMS.
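The self-inverse property of the DHT mentioned above can be verified directly; the following Python sketch implements the transform via its cas kernel (cos + sin) and checks that applying it twice recovers n times the input.

    import numpy as np

    def dht(x):
        """Discrete Hartley transform via the cas kernel cos + sin."""
        n = len(x)
        k = np.arange(n)
        arg = 2.0 * np.pi * np.outer(k, k) / n
        cas = np.cos(arg) + np.sin(arg)
        return cas @ x

    x = np.random.randn(64)
    # Self-inverse up to a factor of n: DHT(DHT(x)) = n * x.
    assert np.allclose(dht(dht(x)) / len(x), x)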
Abstract: The knowledge of an organization does not merely reside in structured information and data; it is also embedded in unstructured form. The discovery of such knowledge is particularly difficult because it is dynamic, scattered, massive, and multiplying at high speed. Conventional methods of managing unstructured information are considered too resource-demanding and time-consuming to cope with this rapid information growth.
In this paper, a Multi-faceted and Automatic Knowledge Elicitation System (MAKES) is introduced for the discovery and capture of organizational knowledge. A trial implementation has been conducted in a public organization to achieve decision capture and navigation across a number of meeting minutes, which are autonomously organized, classified, and presented in a multi-faceted taxonomy map at both the document and content levels. Key concepts such as critical decisions made, key knowledge workers, knowledge flows, and the relationships among them are elicited and displayed in predefined knowledge models and maps. Hence, the structured knowledge can be retained, shared, and reused.
Conducting knowledge management with MAKES reduces the work of searching for and retrieving target decisions, saves a great deal of time and manpower, and enables an organization to keep pace with the knowledge life cycle. This is particularly important when the amount of unstructured information and data grows extremely quickly. This system approach to knowledge management can accelerate the value extraction and creation cycles of organizations.
Abstract: Water quality is a subject of ongoing concern. Deterioration of water quality has initiated serious management efforts in many countries. This study endeavors to automatically classify water quality. The water quality classes are evaluated using six factor indices: pH value (pH), Dissolved Oxygen (DO), Biochemical Oxygen Demand (BOD), Nitrate Nitrogen (NO3N), Ammonia Nitrogen (NH3N), and Total Coliform (TColiform). The methodology involves applying data mining techniques using multilayer perceptron (MLP) neural network models. The data cover 11 canal sites in Dusit district, Bangkok, Thailand, obtained from the Department of Drainage and Sewerage, Bangkok Metropolitan Administration, during 2007-2011. The multilayer perceptron neural network achieves a high classification accuracy of 96.52% on the water quality of the Dusit district canals. This encouraging result could subsequently be applied to the planning and management of water quality sources.
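A minimal sketch of such an MLP classifier, here using scikit-learn with synthetic placeholder data standing in for the study's six measured indices (the layer sizes, class count, and split are illustrative choices, not those of the paper):

    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    # Placeholder data: six features per sample (pH, DO, BOD, NO3N,
    # NH3N, TColiform) and a hypothetical quality class per sample.
    rng = np.random.default_rng(0)
    X = rng.random((200, 6))
    y = rng.integers(0, 4, size=200)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)

    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    clf.fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))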
Abstract: The Artificial Bee Colony (ABC) algorithm is a relatively new swarm intelligence technique for clustering. It produces higher quality clusters compared to other population-based algorithms, but with poor energy efficiency and cluster quality consistency, and it is typically slower in convergence. Inspired by the energy-saving foraging behavior of natural honey bees, this paper presents a Quality and Quantity Aware Artificial Bee Colony (Q2ABC) algorithm to improve the cluster quality, energy efficiency, and convergence speed of the original ABC. To evaluate the performance of the Q2ABC algorithm, experiments were conducted on a suite of ten benchmark UCI datasets. The results demonstrate that Q2ABC outperformed the ABC and K-means algorithms in the quality of the clusters delivered.
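For reference, the core ingredients of ABC-based clustering that Q2ABC builds on can be sketched as follows: a candidate solution (food source) encodes a set of cluster centroids, its fitness decreases with the clustering error, and new candidates are generated by the standard ABC neighborhood move. This sketches the original ABC formulation only; the Q2ABC refinements are not detailed in the abstract.

    import numpy as np

    def sse(centroids, data):
        """Sum of squared distances from each point to its nearest centroid."""
        d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        return np.sum(np.min(d, axis=1) ** 2)

    def fitness(centroids, data):
        """ABC-style fitness: higher for lower clustering error."""
        return 1.0 / (1.0 + sse(centroids, data))

    def neighbor(x, partner, rng):
        """Standard ABC neighborhood move: v = x + phi * (x - partner)."""
        phi = rng.uniform(-1.0, 1.0, size=x.shape)
        return x + phi * (x - partner)

    rng = np.random.default_rng(0)
    data = rng.random((100, 2))
    centroids = rng.random((3, 2))     # one food source = one set of centroids
    print(fitness(centroids, data))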
Abstract: To support user mobility in a wireless network, new fundamental mechanisms are needed, such as paging, location updating, routing, and handover. Another key feature is the mobile QoS offered by wireless ATM (WATM). Several ATM network protocols should be updated to implement mobility management and to maintain the existing ATM QoS over wireless ATM networks. A survey of the various schemes and types of handover is provided. The handover procedure guarantees re-establishment of a terminal's connection when it moves between areas covered by different base stations; it serves to transfer the user's radio link without interrupting a connection. However, failure to offer efficient solutions will result in significant packet loss, severe delays, and degradation of the QoS offered to applications. This paper reviews the requirements, characteristics, and open issues of wireless ATM, particularly with regard to handover. It introduces key aspects of WATM and the mobility extensions that are added to the fixed ATM network. We propose a flexible approach to handover management that minimizes QoS deterioration. The functional entities of this flexible approach are discussed in order to achieve minimum impact on connection quality when a mobile terminal (MT) moves between base stations (BS).
Abstract: Measurements of the capacitance C and the dissipation factor tan δ of a stator insulation system provide useful information about internal defects within the insulation. The index k is defined as the proportionality constant between the changes at high voltage of the capacitance, ΔC, and of the dissipation factor, Δtan δ. ΔC and Δtan δ values were highly correlated when small flat defects were present within the insulation, and that correlation was lost in the presence of large narrow defects such as electrical treeing. The discrimination between small and large defects is made by resorting to partial discharge (PD) phase angle analysis. For the validation of the results, C and tan δ measurements were carried out on a 15 MVA, 4160 V steam turbine turbogenerator located in a sugar mill. In addition, laboratory test results obtained by other authors were analyzed jointly. In those laboratory tests, model coil bars subjected to thermal cycling became highly degraded, and the ΔC and Δtan δ values were not correlated; thus, the index k could not be calculated.
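Under one plausible reading of this definition (the orientation of the ratio is an assumption, as the abstract does not state it), the relation is Δtan δ = k · ΔC, so that k = Δtan δ / ΔC; the loss of correlation between ΔC and Δtan δ then directly explains why no meaningful k can be computed for the degraded bars.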
Abstract: The article touches upon questions of information security in the Russian economy. It covers the theoretical bases of information security and the causes of its development. The theory is supported by an analysis of business activities and the main tendencies of information security development. The Perm region has been chosen as the basis for the analysis, being the fastest-developing region that uses information security methods in managing its economy. As a result of the study, the authors have formulated their own vision of the problem of information security in various branches of the economy and stated the prospects for information security development and its growing role in the Russian economy.
Abstract: In this paper, we propose a new framework that incorporates an intelligent software agent into a crisis communication portal (CCNet) in order to send alert news to subscribed users via email and other mobile services such as the Short Message Service (SMS), Multimedia Messaging Service (MMS), and General Packet Radio Service (GPRS). The content of the mobile services can be delivered through either a mobile phone or a Personal Digital Assistant (PDA). This research has shown that, with our proposed framework, the embodied conversational agent system can handle questions intelligently through our multilayer architecture. At the same time, the extended framework can handle content delivery through a more humanoid interface on mobile devices.
Abstract: The scattering of light in fog reduces visibility, disturbing transport in urban and industrial areas and causing fatal accidents or public harassment. Developing an enhanced fog vision system based on radio waves to mitigate these severe problems is therefore a real challenge for researchers. A series of experimental studies has already been conducted, and more are in progress, to determine the weather effect on different ranges of radio frequencies. According to the Rayleigh scattering law, the propagating wavelength should be greater than the diameter of the particles present in the penetrating medium. Direct-wave RF signals thus have a high chance of failing to detect objects in such weather. Therefore, an extensive study was required to find a suitable region of the RF band that can help in detecting objects with their proper shape. This paper presents results on object detection using the 912 MHz band, with successful detection of the presence of any object entering the trajectory of a vehicle navigating in indoor and outdoor environments. The developed images are finally transformed to a video signal to enable continuous monitoring.
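A quick back-of-the-envelope check of the Rayleigh criterion cited above (the droplet size is a typical literature value, not a measurement from the paper):

    C = 299_792_458.0          # speed of light [m/s]
    f = 912e6                  # carrier frequency [Hz]
    wavelength = C / f         # about 0.33 m at 912 MHz

    fog_droplet_diameter = 20e-6   # typical fog droplet, ~20 micrometres
    # Wavelength exceeds the droplet diameter by roughly four orders of
    # magnitude, so fog scattering is negligible in this band.
    print(wavelength / fog_droplet_diameter)   # ~1.6e4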
Abstract: This paper proposes an efficient method to classify inverse synthetic aperture radar (ISAR) images. Because ISAR images can be translated and rotated in the two-dimensional image plane, invariance to these two factors is indispensable for successful classification. The proposed method achieves invariance to translation and rotation of ISAR images using a combination of the two-dimensional Fourier transform, polar mapping, and correlation-based alignment of the image. Classification is conducted using a simple matching score classifier. In simulations using real ISAR images of five scaled models measured in a compact range, the proposed method yields classification rates higher than 97%.
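A minimal sketch of the invariance chain described above: the magnitude of the 2-D Fourier transform removes translation, and polar mapping converts rotation into a circular shift along the angle axis (grid sizes are illustrative choices, not the paper's).

    import numpy as np

    def invariant_signature(img, n_r=64, n_theta=128):
        """Translation- and rotation-robust signature of an image."""
        # Magnitude of the centered 2-D Fourier transform removes translation.
        mag = np.abs(np.fft.fftshift(np.fft.fft2(img)))
        cy, cx = np.array(mag.shape) / 2.0
        r_max = min(cy, cx) - 1
        # Polar mapping: image rotation becomes a circular shift in theta.
        r = np.linspace(0, r_max, n_r)
        theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
        rows = (cy + np.outer(r, np.sin(theta))).astype(int)
        cols = (cx + np.outer(r, np.cos(theta))).astype(int)
        return mag[rows, cols]

The remaining circular shift along theta can then be removed by correlating signatures over the angle axis before applying the matching-score classifier.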
Abstract: Quantitative methods of economic decision-making, the methodological base of so-called operational research, represent an important set of tools for managing complex economic systems, both at the microeconomic level and on the macroeconomic scale. Mathematical models of controlled and controlling processes allow, by means of artificial experiments, obtaining information for optimal or near-optimal managerial decision-making. The quantitative methods of economic decision-making usually include a methodology known as structural analysis: an analysis of inter-industry production-consumption relations.
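Structural analysis in this sense is commonly identified with Leontief input-output models; assuming that reading, a minimal sketch of the core computation is as follows (the coefficients and demands are illustrative, not from the article): given a technical-coefficient matrix A and a final-demand vector d, gross output x solves (I - A) x = d.

    import numpy as np

    A = np.array([[0.2, 0.3],
                  [0.1, 0.4]])        # inter-industry input coefficients
    d = np.array([100.0, 50.0])       # final demand per sector

    # Gross output per sector required to satisfy final demand.
    x = np.linalg.solve(np.eye(2) - A, d)
    print(x)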
Abstract: This work concerns the evolution and maintenance of an ontological resource in relation to the evolution of the corpus of texts from which it was built.
The knowledge contained in a text corpus, especially in dynamic domains, is in continuous evolution. When a change in the corpus occurs, the domain ontology must evolve accordingly. Most methods manage ontology evolution independently of the corpus from which the ontology is built; in addition, they treat evolution merely as a process of knowledge addition, without considering other kinds of knowledge change. We propose a methodology for managing an evolving ontology from a text corpus that evolves over time, while preserving the consistency and persistence of this ontology.
Our methodology is based on the changes made to the corpus to reflect the evolution of the considered domain (augmented surgery, in our case). In this context, the results of text mining techniques, as well as a slightly modified version of the ARCHONTE method, are used to support the evolution process.
Abstract: Bacterial cellulose, a biopolysaccharide, is produced by the bacterium Gluconacetobacter xylinus. Static batch fermentation for bacterial cellulose production was studied in sucrose and date syrup solutions (10% Bx) at 28 °C using G. xylinus (PTCC 1734). Results showed that the maximum yields of bacterial cellulose (BC) after a 336-hour fermentation period were 4.35 and 1.69 g/100 ml for the date syrup and sucrose media, respectively. Comparison of the FTIR spectrum of cellulose with that of BC showed close agreement, confirming that the component produced by G. xylinus was cellulose. Determination of the area under the X-ray diffractometry patterns demonstrated that the crystallinity of cellulose (83.61%) was higher than that of BC (60.73%). Scanning electron microscopy imaging of BC and cellulose was carried out at two magnifications, 1K and 6K. The results showed that the diameter ratio of BC to cellulose was approximately 1/30, indicating that BC fibers are much finer than cellulose fibers.
Abstract: The use and management of projects has risen to a new prominence, with projects seen as critical to economic success in both the private and public sectors due to the challenging and dynamic business environment. However, failures in managing projects are encountered regularly, wasting company resources. The impact of projects that fail to meet stakeholders' expectations leaves behind long-lasting negative consequences in an organization. Therefore, this research aims to investigate the key success factors of project management in an organization. It is believed that recognizing the important factors that contribute to successful projects will help companies to increase their overall profitability. 150 questionnaires were distributed to respondents, of which 110 were collected and used in the data analysis. The results strongly support the relationship between the independent variables and project performance.
Abstract: The importance of good requirements engineering is well documented. Agile practices, promoting collaboration and communication, facilitate the elicitation and management of volatile requirements. However, current Agile practices work only in a well-defined environment: it is necessary to have a co-located customer. With distributed development it is not always possible to realize this co-location. In this environment a suitable process, possibly supported by tools, is required to support changing requirements. This paper introduces the issues of concern when managing requirements in a distributed environment and describes work done at the Software Technology Research Centre as part of the NOMAD project.
Abstract: Using Dynamic Bayesian Networks (DBNs) to model genetic regulatory networks from gene expression data is one of the major paradigms for inferring the interactions among genes. Averaging a collection of models for network prediction is preferable to relying on a single high-scoring model. In this paper, two kinds of model-searching approaches are compared: Greedy hill-climbing Search with Restarts (GSR) and Markov Chain Monte Carlo (MCMC) methods. GSR is preferred in many papers, but there has been no comparative study of which is better for DBN models. Different types of experiments have been carried out to provide a benchmark test of these approaches. Our experimental results demonstrate that, on average, the MCMC methods outperform GSR in the accuracy of the predicted network while having comparable time efficiency. By proposing different variations of MCMC and employing a simulated annealing strategy, the MCMC methods become more efficient and stable. Apart from the comparison between these approaches, another objective of this study is to investigate the feasibility of using DBN modeling approaches to infer gene networks from a few snapshots of high-dimensional gene profiles. Through synthetic data experiments as well as systematic data experiments, the results reveal how the performance of these approaches is influenced as the target gene network varies in network size, data size, and system complexity.
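A minimal sketch of the MCMC side of this comparison: a Metropolis-Hastings walk over DBN transition structures, where each candidate is a binary matrix of gene(t) -> gene(t+1) edges (always acyclic across time slices) and the score function is a placeholder, not the paper's scoring metric. Averaging the sampled structures yields posterior edge probabilities, the kind of model averaging the abstract advocates.

    import numpy as np

    def mcmc_structures(score, n_genes, n_steps, rng=np.random.default_rng(0)):
        adj = np.zeros((n_genes, n_genes), dtype=bool)
        current = score(adj)
        samples = []
        for _ in range(n_steps):
            # Propose flipping one edge (add or delete); symmetric proposal.
            i, j = rng.integers(n_genes, size=2)
            proposal = adj.copy()
            proposal[i, j] = ~proposal[i, j]
            proposed = score(proposal)
            # Metropolis acceptance on log-scores.
            if np.log(rng.random()) < proposed - current:
                adj, current = proposal, proposed
            samples.append(adj.copy())
        # Model averaging: posterior edge probabilities across samples.
        return np.mean(samples, axis=0)

    # Placeholder score favouring sparse graphs, for demonstration only.
    edge_probs = mcmc_structures(lambda a: -a.sum(), n_genes=5, n_steps=2000)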
Abstract: This article presents the development of efficient algorithms for comparing tablet copies. Image recognition has specialized uses in digital systems such as medical imaging, computer vision, defense, and communication. Comparison between two images that look indistinguishable is a formidable task: two images taken from different sources might look identical, but due to different digitizing properties they are not, while small variations in image information such as cropping, rotation, and slight photometric alteration defeat naive matching techniques. In this paper we introduce different matching algorithms designed to help art centers identify real painting images from fake ones. Different vision algorithms for local image features are implemented using MATLAB. In this framework, a Table Comparison Computer Tool (TCCT) is designed to facilitate our research. The TCCT is a Graphical User Interface (GUI) tool used to identify images by their shapes and objects. The parameters of the vision system are fully accessible to the user through this graphical user interface. For matching, it then applies different description techniques that can identify the exact figures of objects.
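For illustration, a local-feature matching pipeline of the kind described can be sketched in Python with OpenCV (the paper's implementation is in MATLAB, and the abstract does not name a specific descriptor, so ORB, the filenames, and the score threshold below are stand-ins):

    import cv2

    img1 = cv2.imread("original.png", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("candidate.png", cv2.IMREAD_GRAYSCALE)

    # Detect keypoints and compute binary local descriptors.
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    # Hamming-distance brute-force matching with cross-checking.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    # A simple matching score: the share of close matches; the distance
    # threshold is an illustrative choice, not the tool's criterion.
    score = sum(m.distance < 40 for m in matches) / max(len(matches), 1)
    print("match score:", score)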