Abstract: In current research reports, salient regions are usually defined as the regions that present the main meaningful or semantic content of an image. However, there is no uniform saliency metric that can describe the saliency of the implicit regions of an image. Most common metrics treat as salient those regions that contain many abrupt changes or unpredictable characteristics, but such metrics fail to detect salient, useful regions with flat textures. In fact, according to human semantic perception, color and texture distinctions are the main characteristics that distinguish different regions. We therefore present a novel saliency metric based on color and texture features, together with the corresponding salient-region extraction method. To evaluate the saliency of the implicit regions in an image, three main colors and multi-resolution Gabor features are used as the color and texture features, respectively. The saliency value of each region is computed as the sum of its Euclidean distances to the other regions in the color and texture feature spaces. A synthesized image and several practical images with clear salient regions are used to compare the performance of the proposed saliency metric with several other common metrics, i.e., scale saliency, wavelet transform modulus maxima point density, and importance-index based metrics. Experimental results verify that the proposed saliency metric achieves more robust performance than these common metrics.
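A minimal sketch of the distance-based saliency evaluation described above, assuming each region has already been segmented and described by a color feature vector (e.g., its three main colors) and a texture feature vector (e.g., multi-resolution Gabor responses); the function name and the equal weighting of the two spaces are illustrative, not taken from the paper.

```python
import numpy as np

def region_saliency(color_feats, texture_feats, w_color=0.5, w_texture=0.5):
    """Saliency of each region = sum of its Euclidean distances to all
    other regions in the color and texture feature spaces.

    color_feats   : (n_regions, d_color) array of per-region color features
    texture_feats : (n_regions, d_texture) array of per-region Gabor features
    """
    n = color_feats.shape[0]
    saliency = np.zeros(n)
    for i in range(n):
        d_color = np.linalg.norm(color_feats - color_feats[i], axis=1)
        d_texture = np.linalg.norm(texture_feats - texture_feats[i], axis=1)
        saliency[i] = np.sum(w_color * d_color + w_texture * d_texture)
    return saliency

# Regions whose features differ most from all others receive the highest
# saliency values; the most salient region can then be extracted.
```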
Abstract: The growing influence of service industries has prompted greater attention to service operations management. However, service managers often have difficulty articulating the true effects of their service innovation. In particular, the performance evaluation of service innovation generally involves uncertain and imprecise data. This paper presents a 2-tuple fuzzy linguistic computing approach for dealing with heterogeneous information and with information loss during the integration of subjective evaluations. The proposed method, built on a group decision-making scenario to assist business managers in measuring the performance of service innovation, handles the integration of heterogeneous information and effectively avoids information loss.
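For readers unfamiliar with the 2-tuple linguistic model, the sketch below shows the standard Herrera-Martinez translation between a numeric aggregation result and a linguistic 2-tuple (s_i, alpha); the seven-term label set and the example ratings are illustrative and not taken from the paper.

```python
# Standard 2-tuple linguistic representation: a value beta in [0, g] is
# translated into (s_i, alpha) with i = round(beta) and alpha = beta - i.
LABELS = ["none", "very low", "low", "medium", "high", "very high", "perfect"]

def to_two_tuple(beta):
    i = int(round(beta))
    return LABELS[i], beta - i          # symbolic translation Delta(beta)

def from_two_tuple(label, alpha):
    return LABELS.index(label) + alpha  # inverse translation Delta^-1

# Example: averaging three expert ratings given on the label scale; the
# fractional part alpha is kept, so no information is lost in aggregation.
ratings = [from_two_tuple("high", 0.0),
           from_two_tuple("high", 0.0),
           from_two_tuple("medium", 0.0)]
print(to_two_tuple(sum(ratings) / len(ratings)))   # ('high', -0.333...)
```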
Abstract: Today, Internet-based communication has widened the opportunities for event monitoring systems in the medical field. There is a constant need to analyze and design secure and reliable mobile communication between the hospital and the mobile units of biomedical engineers. This study was carried out to find a possible solution using SIP-based event notification for alerting the technical staff about Biomedical Device (BMD) status and patients' treatment sessions. The Session Initiation Protocol (SIP) can be used to create a medical event notification system. SIP can work on a variety of devices, and its adoption as the protocol of choice for third-generation wireless networks allows for a robust and scalable environment. One of the advantages of SIP is that it supports personal mobility through the separation of user addressing and device addressing. The solution for the Telemed alert notification system is based on SIP-Specific Event Notification. The aim of this project is to extend mobility services to the hospital technicians who use the Telemedicine system.
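As an illustration of SIP-Specific Event Notification (RFC 3265), the block below shows what a SUBSCRIBE request from the Telemed alert server to a technician's SIP address might look like; the addresses, tags and the "bmd-status" event package name are hypothetical and are not defined by the paper or by SIP itself.

```
SUBSCRIBE sip:technician@hospital.example SIP/2.0
Via: SIP/2.0/UDP alert.hospital.example;branch=z9hG4bK776asdhds
Max-Forwards: 70
To: <sip:technician@hospital.example>
From: <sip:telemed-alert@hospital.example>;tag=1928301774
Call-ID: a84b4c76e66710@alert.hospital.example
CSeq: 1 SUBSCRIBE
Contact: <sip:telemed-alert@alert.hospital.example>
Event: bmd-status
Expires: 3600
Accept: application/xml
Content-Length: 0
```

The notifier would answer with 200 OK and then deliver device-status or treatment-session changes in NOTIFY requests carrying the same Event header.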
Abstract: This work focuses on the study of unburned carbon in ash from the combustion of coal (and wastes) in eight combustion tests: at three fluidised-bed power stations, at co-combustion of coal and wastes (also in a fluidised bed), and at a bench-scale unit simulating coal combustion in small domestic furnaces. Attention is paid to the unburned carbon content of bottom ashes and fly ashes in these eight combustion tests and to the morphology of the unburned carbon. The specific surface areas of the coals, unburned carbons and ashes, and the relation between the specific surface area of unburned carbon and the content of volatile combustibles in the coal, were studied as well.
Abstract: Information is growing in volume, and companies are so overloaded with information that they may lose track of the information they actually need. Scanning through every lengthy document is time consuming, so a shorter version containing only the gist is preferable for most information seekers. In this paper, we therefore implement a text summarization system that produces summaries containing the gist of oil and gas news articles. The summaries are intended to provide important information that helps oil and gas companies monitor their competitors' behaviour and formulate business strategies. The system integrates a statistical approach with three underlying features: keyword occurrences, the title of the news article, and the location of each sentence. The generated summaries were compared with human-generated summaries from an oil and gas company, and precision and recall ratios were used to evaluate the accuracy of the generated summaries. Based on the experimental results, the system is able to produce an effective summary with an average recall of 83% at a compression rate of 25%.
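A minimal sketch of the kind of sentence scoring the system combines, assuming simple regular-expression tokenisation; the feature weights and helper names are illustrative, not the system's actual parameters.

```python
import re
from collections import Counter

def summarize(text, title, compression=0.25,
              w_keyword=1.0, w_title=1.0, w_location=1.0):
    """Score sentences by keyword occurrences, overlap with the title,
    and position in the article, then keep the top fraction."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    freq = Counter(re.findall(r'[a-z]+', text.lower()))
    title_words = set(re.findall(r'[a-z]+', title.lower()))

    scores = []
    for idx, sent in enumerate(sentences):
        sent_words = re.findall(r'[a-z]+', sent.lower())
        keyword_score = sum(freq[w] for w in sent_words) / max(len(sent_words), 1)
        title_score = len(set(sent_words) & title_words)
        location_score = 1.0 / (idx + 1)      # earlier sentences score higher
        scores.append(w_keyword * keyword_score
                      + w_title * title_score
                      + w_location * location_score)

    n_keep = max(1, int(len(sentences) * compression))
    keep = sorted(sorted(range(len(sentences)), key=lambda i: -scores[i])[:n_keep])
    return ' '.join(sentences[i] for i in keep)
```

The selected sentences are re-emitted in their original order so the summary still reads coherently.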
Abstract: In this paper, a new approach to face recognition is presented that achieves a double dimension reduction, making the system computationally efficient, improving recognition results, and outperforming the common DCT technique for face recognition. In pattern recognition, the discriminative information in an image increases with resolution only up to a certain extent; consequently, face recognition results change with the face image resolution and are optimal at a particular resolution level. In the proposed model, an image decimation algorithm is first applied to the face image to reduce its dimension to the resolution level that provides the best recognition results. Owing to its computational speed and feature extraction capability, the Discrete Cosine Transform (DCT) is then applied to the decimated face image, and a subset of DCT coefficients from low to mid frequencies, which represents the face adequately and provides the best recognition results, is retained. A trade-off between the decimation factor, the number of retained DCT coefficients, and the recognition rate with minimum computation is obtained. Preprocessing of the image is carried out to increase robustness against variations in pose and illumination level. This new model has been tested on different databases, including the ORL, Yale and EME color databases.
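A minimal sketch of the two-stage dimension reduction described above, assuming a grayscale face image stored as a NumPy array; the decimation factor and the number of retained DCT coefficients are illustrative, and the low-to-mid frequency subset is simplified here to a square top-left block rather than a zig-zag selection.

```python
import numpy as np
from scipy.fftpack import dct

def face_features(image, decimation=2, n_block=8):
    """Decimate a grayscale face image, apply a 2-D DCT, and keep the
    low-to-mid frequency coefficients as the feature vector."""
    # Stage 1: image decimation (simple block averaging) for the first
    # dimension reduction.
    h, w = image.shape
    h, w = h - h % decimation, w - w % decimation
    small = image[:h, :w].reshape(h // decimation, decimation,
                                  w // decimation, decimation).mean(axis=(1, 3))

    # Stage 2: 2-D DCT and retention of a small block of coefficients
    # (approximating the low-to-mid frequency subset).
    coeffs = dct(dct(small, axis=0, norm='ortho'), axis=1, norm='ortho')
    return coeffs[:n_block, :n_block].ravel()

# Feature vectors of gallery and probe images can then be compared with
# a simple nearest-neighbour classifier.
```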
Abstract: Supply chain networks are frequently hit by
unplanned events which lead to disruptions and cause operational and
financial consequences. It is neither possible to avoid disruption risk
entirely, nor are network members able to prepare for every possible
disruptive event. Therefore, continuity planning should be established to support effective operational responses in supply chain networks in times of emergency. In this research, the network-related degrees of freedom that determine the options for responsive actions are derived from interview data. The findings are then embedded into a common risk management process. The paper supports researchers and practitioners in identifying the network-related options for responsive actions and in determining the need to improve reaction capabilities.
Abstract: The notion of Next Generation Network (NGN) is
based on the Network Convergence concept which refers to
the integration of services (such as IT and communication services) over the IP layer. As the most popular implementation of Service Oriented
Architecture (SOA), Web Services technology is known to be the
base for service integration. In this paper, we present a platform to
deliver communication services as web services. We also implement
a sample service to show the simplicity of making composite web
and communication services using this platform. A Service Logic
Execution Environment (SLEE) is used to implement the
communication services. The proposed architecture is in agreement with Service Oriented Architecture (SOA) principles and can also be integrated with an Enterprise Service Bus to form the basis of an NGN Service Delivery Platform (SDP).
Abstract: HSDPA is a new feature introduced in the Release 5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. Moreover, the HSDPA concept offers an outstanding improvement in packet throughput and also significantly reduces the packet call transfer delay compared to the Release 99 DSCH. To date, the HSDPA system uses turbo coding, one of the best coding techniques for approaching the Shannon limit. However, the main drawbacks of turbo coding are its high decoding complexity and high latency, which make it unsuitable for some applications such as satellite communications, where the transmission distance itself introduces latency due to the limited speed of light. Hence, this paper proposes using LDPC coding in place of turbo coding for the HSDPA system, which decreases the latency and decoding complexity, although LDPC coding increases the encoding complexity. Though the transmitter complexity increases at the NodeB, the end user gains in terms of receiver complexity and bit error rate. In this paper, the LDPC encoder uses a sparse parity-check matrix H to generate a codeword, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that with LDPC coding the BER drops sharply as the number of iterations increases with a small increase in Eb/No, which is not possible with turbo coding. The same BER was also achieved using fewer iterations, and hence the latency and receiver complexity are reduced for LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better-quality, more reliable and more robust data services; in other words, while realistic data rates are only a few Mbps, the actual quality and number of users achieved will improve significantly.
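A toy sketch of systematic LDPC encoding with a sparse parity-check matrix H and a syndrome check, to make the encoder and decoder roles concrete; the tiny matrix is illustrative and far smaller than the codes actually used in HSDPA, and belief-propagation decoding is omitted for brevity.

```python
import numpy as np

# Toy sparse parity-check matrix in systematic form H = [A | I], so a
# message m is encoded as c = [m | p] with parity p = A m (mod 2).
A = np.array([[1, 1, 0, 1],
              [1, 0, 1, 1],
              [0, 1, 1, 1]])
H = np.hstack([A, np.eye(3, dtype=int)])

def encode(m):
    p = A.dot(m) % 2
    return np.concatenate([m, p])

def syndrome(c):
    # An all-zero syndrome means every parity check is satisfied.
    return H.dot(c) % 2

m = np.array([1, 0, 1, 1])
c = encode(m)
print(c, syndrome(c))          # valid codeword, syndrome all zeros

c[2] ^= 1                      # flip one bit to emulate a channel error
print(syndrome(c))             # non-zero syndrome flags the error; belief
                               # propagation would iterate on it to decode
```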
Abstract: Results are presented from a combined experimental
and modeling study undertaken to understand the effect of fuel spray
angle on soot production in turbulent liquid spray flames. The
experimental work was conducted in a cylindrical laboratory furnace
at fuel spray cone angles of 30°, 45° and 60°. Soot concentrations inside the combustor are measured by the filter paper technique. The soot concentration is modeled using the soot particle number density and the soot mass density based on acetylene concentrations, with soot oxidation by both hydroxyl radicals and oxygen molecules. The comparison of calculated results against experimental measurements shows good agreement. Both the numerical and experimental results show that the peak soot value and its location in the furnace depend on the fuel spray cone angle. An increase in spray angle enhances the evaporation rate and the peak temperature near the nozzle. Although the peak soot concentration increases with increasing fuel spray angle, soot emission from the furnace decreases.
Abstract: This paper proposes a "soft systems" approach to
domain-driven design of computer-based information systems. We
propose a systemic framework combining techniques from Soft
Systems Methodology (SSM), the Unified Modelling Language
(UML), and an implementation pattern known as "Naked Objects".
We have used this framework in action research projects that have
involved the investigation and modelling of business processes using
object-oriented domain models and the implementation of software
systems based on those domain models. Within the proposed
framework, Soft Systems Methodology (SSM) is used as a guiding
methodology to explore the problem situation and to generate a
ubiquitous language (soft language) which can be used as the basis
for developing an object-oriented domain model. The domain model
is further developed using techniques based on the UML and is
implemented in software following the "Naked Objects"
implementation pattern. We argue that there are advantages from
combining and using techniques from different methodologies in this
way.
The proposed systemic framework is overviewed and justified as a multimethodology using Mingers' multimethodology ideas. This multimethodology approach is being evaluated through a series of action research projects based on real-world case studies. A peer-tutoring case study is presented here as a sample of the framework evaluation process.
Abstract: With the development of the Internet, E-commerce is growing at an exponential rate, and many online stores have been built to sell goods online. A major factor influencing the successful adoption of E-commerce is consumers' trust. For new or unknown Internet businesses, consumers' lack of trust has been cited as a major barrier to proliferation. Since web sites provide the key interface for consumer use of E-commerce, we investigate web site design for building trust in E-commerce from a design science approach. A conceptual model is proposed in this paper to describe the ontology of online transactions and human-computer interaction. Based on this conceptual model, we provide a personalized webpage design approach using a Bayesian network learning method. An experimental evaluation is designed to show the effectiveness of web personalization in improving consumers' trust in new or unknown online stores.
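A minimal sketch of the underlying idea of learning which trust-building page elements to emphasise for a given visitor profile; it uses a simple conditional probability table estimated from counts rather than a full Bayesian network library, and all variable names and data are hypothetical placeholders rather than the paper's model.

```python
from collections import defaultdict

# Hypothetical interaction log: (visitor_segment, page_variant, trusted?)
log = [("first_time", "security_seal", 1), ("first_time", "security_seal", 1),
       ("first_time", "reviews", 0),       ("returning", "reviews", 1),
       ("returning", "reviews", 1),        ("returning", "security_seal", 0)]

# Estimate P(trusted | segment, variant) from counts with add-one smoothing;
# this small table plays the role of the learned (here, tiny) network.
counts = defaultdict(lambda: [1, 2])        # [trusted, shown] pseudo-counts
for segment, variant, trusted in log:
    counts[(segment, variant)][0] += trusted
    counts[(segment, variant)][1] += 1

def best_variant(segment, variants=("security_seal", "reviews")):
    return max(variants,
               key=lambda v: counts[(segment, v)][0] / counts[(segment, v)][1])

print(best_variant("first_time"))   # emphasise the security seal
print(best_variant("returning"))    # emphasise customer reviews
```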
Abstract: Since the Cloud environment emerged as the most powerful keyword in the computing industry, the growth of VDI (Virtual Desktop Infrastructure) has become remarkable in the domestic market. In recent years, as mobile devices such as smartphones and tablets have spread rapidly, the strength of VDI in allowing people to access systems and conduct business on the move, together with companies' office needs, has accelerated its adoption. In this paper, a mobile OTP (One-Time Password) authentication method is proposed to support the portability of mobile devices through rapid and secure authentication on devices such as mobile phones or tablets, without requiring users to purchase or carry additional OTP tokens. For services to be used widely and in diverse ways in the future, they must be continuous and stable and, above all, secure, so as to preserve the advanced portability and user accessibility that are the strengths of VDI.
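A minimal sketch of a time-based OTP of the kind that could be generated on a user's phone instead of a hardware token; it follows the standard HOTP/TOTP construction (RFC 4226/6238), and the shared secret and step length are illustrative rather than taken from the paper.

```python
import hmac, hashlib, struct, time

def totp(secret: bytes, step: int = 30, digits: int = 6) -> str:
    """Standard TOTP: HMAC-SHA1 over the time-step counter, dynamic
    truncation, then reduction to the requested number of digits."""
    counter = int(time.time()) // step
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Both the VDI authentication server and the user's phone derive the same
# code from the shared secret, so no separate OTP token is needed.
print(totp(b"shared-secret-provisioned-at-enrolment"))
```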
Abstract: With the movement of power systems toward restructuring, together with factors such as environmental pollution, the difficulties of transmission expansion, and advances in the construction technology of small generation units, it is expected that small units such as wind turbines, fuel cells and photovoltaic units, which mostly connect to distribution networks, will play an essential role in the electric power industry. With the growing use of small generation units, the management of distribution networks should be reviewed. The aim of this paper is to present a new method for the optimal management of active and reactive power in distribution networks, taking into account the costs of the various types of dispersed generation, of capacitors, and of the electric energy obtained from the network. In other words, the method endeavors to select the optimal sources of active and reactive power generation and the controlling equipment, such as dispersed generation units, capacitors, under-load tap-changer transformers and substations, so that, first, the related costs are minimized and, second, the technical and physical constraints are respected. Because the optimal management of distribution networks is an optimization problem with continuous and discrete variables, a new evolutionary method based on the Ant Colony Algorithm has been applied. The method is tested on two cases containing 23 and 34 buses, and the simulation results are presented in later sections.
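A generic sketch of the kind of ant colony search used here, assuming the distribution-network problem has been reduced to choosing one discrete setting per controllable element (DG output level, capacitor step, tap position) and that a cost function over a complete choice vector is available; the cost function, option lists and parameters below are placeholders, not the paper's actual model.

```python
import random

def ant_colony(options, cost, n_ants=20, n_iter=100, rho=0.1, q=1.0):
    """options : list of lists, options[k] = discrete settings for element k
       cost    : function mapping one chosen setting per element to a cost"""
    tau = [[1.0] * len(opts) for opts in options]      # pheromone trails
    best, best_cost = None, float("inf")
    for _ in range(n_iter):
        for _ in range(n_ants):
            # Each ant builds a solution, picking options with probability
            # proportional to their pheromone level.
            choice = [random.choices(range(len(opts)), weights=tau[k])[0]
                      for k, opts in enumerate(options)]
            c = cost([options[k][i] for k, i in enumerate(choice)])
            if c < best_cost:
                best, best_cost = choice, c
            # Deposit pheromone inversely proportional to the solution cost.
            for k, i in enumerate(choice):
                tau[k][i] += q / (1.0 + c)
        # Evaporation keeps the search from converging prematurely.
        tau = [[(1 - rho) * t for t in trail] for trail in tau]
    return [options[k][i] for k, i in enumerate(best)], best_cost

# Placeholder usage: three controllable elements, quadratic placeholder cost.
opts = [[0, 50, 100], [0, 150, 300], list(range(1, 6))]
print(ant_colony(opts, cost=lambda x: (sum(x) - 200) ** 2))
```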
Abstract: In this work we study the reflection of circularly
polarised light from a nano-structured biological material found in
the exocuticle of scarab beetles. This material is made of a stack
of ultra-thin (~5 nm) uniaxial layers arranged in a left-handed
helicoidal stack, which resonantly reflects circularly polarized light.
A chirp in the layer thickness combined with a finite absorption coefficient produces a broad, smooth reflectance spectrum. By
comparing model calculations and electron microscopy with
measured spectra, we can explain our observations and quantify the most relevant structural parameters.
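As background (not stated in the abstract): for an unchirped left-handed helicoidal stack, the circular Bragg resonance reflecting the co-handed circular polarisation is commonly characterised by

\[
\lambda_0 \approx \bar{n}\,p, \qquad \Delta\lambda \approx \Delta n\,p,
\]

where $p$ is the helical pitch, $\bar{n}$ the mean in-plane refractive index and $\Delta n$ the birefringence of the uniaxial layers; chirping the pitch (layer thickness) broadens this band, consistent with the broad, smooth spectra reported here.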
Abstract: With today's fast lifestyles and busy schedules, nuclear families are becoming common, and the elderly members of these families are often neglected. This has led to the popularity of the concept of community living for the aged. The elders reside at a centre controlled by a manager, who takes responsibility for the functioning of the centre; this includes taking care of the residents along with managing the daily chores of the centre, which he accomplishes with the help of a number of staff members and volunteers. Often the manager is not an employee but a volunteer, and in such cases especially, time is an important constraint. A system that provides an easy and efficient way of managing the working of an old age home in detail will therefore prove to be of great benefit. We have developed a PC-based organizer used to monitor the various activities of an old age home. It is an effective and easy-to-use system that enables the manager to keep an account of all the residents, their accounts, staff members, volunteers, the centre's logistic requirements, and so on. It is thus a comprehensive 'Organizer' for old age homes.
Abstract: The objective of the presented work is to implement the Kalman Filter in an application that reduces the influence of environmental changes on a robot expected to navigate over terrain with varying friction properties. The Discrete Kalman Filter is used to estimate the robot position: it projects the estimated current state ahead in time through the time update and adjusts the projected estimate with an actual measurement at that time via the measurement update, using data coming from the infrared sensors, ultrasonic sensors and the visual sensor, respectively. The navigation test has been performed in a real-world environment and the approach has been found to be robust.
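A minimal sketch of the discrete Kalman filter time-update/measurement-update loop described above, for a 2-D constant-velocity position estimate; the state layout, noise covariances and measurement model are illustrative, not the robot's actual parameters.

```python
import numpy as np

dt = 0.1                                   # illustrative sampling period
F = np.array([[1, 0, dt, 0],               # constant-velocity motion model
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]])
H = np.array([[1, 0, 0, 0],                # sensors observe position only
              [0, 1, 0, 0]])
Q = 0.01 * np.eye(4)                       # process noise (terrain/friction changes)
R = 0.25 * np.eye(2)                       # measurement noise (IR/ultrasonic/vision)

x = np.zeros(4)                            # state: [px, py, vx, vy]
P = np.eye(4)

def kalman_step(x, P, z):
    # Time update: project the state and covariance ahead.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Measurement update: correct the prediction with the fused sensor reading z.
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new

x, P = kalman_step(x, P, z=np.array([0.12, 0.05]))   # one measurement step
```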
Abstract: The Least Significant Bit (LSB) technique is the earliest watermarking technique, and it is also the simplest, most direct and most common one. It essentially involves embedding the watermark by replacing the least significant bit of the image data with a bit of the watermark data. The disadvantage of LSB is that it is not robust against attacks. In this study, the intermediate significant bit (ISB) has been used in order to improve the robustness of the watermarking system. The aim of this model is to replace the watermarked image pixels with new pixels that protect the watermark data against attacks while keeping the new pixels very close to the original pixels in order to preserve the quality of the watermarked image. The technique is based on testing the value of the watermark pixel according to the range of each bit-plane.
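A minimal sketch contrasting plain LSB embedding with the bit-plane idea described above: a watermark bit is written into a chosen (intermediate) bit-plane, and the pixel is then moved to the nearest value that still carries that bit so it stays close to the original; the plane index and brute-force search are illustrative, not the paper's exact procedure.

```python
import numpy as np

def embed_bit(pixel, bit, plane):
    """Write `bit` into bit-plane `plane` of an 8-bit pixel, choosing the
    value closest to the original pixel that carries that bit."""
    candidates = [v for v in range(256) if (v >> plane) & 1 == bit]
    return min(candidates, key=lambda v: abs(v - int(pixel)))

def embed(image, watermark_bits, plane=3):
    """plane=0 is classic LSB embedding; an intermediate plane (e.g. 3)
    trades a little image quality for robustness against attacks."""
    out = image.flatten().copy()
    for i, bit in enumerate(watermark_bits):
        out[i] = embed_bit(out[i], bit, plane)
    return out.reshape(image.shape)

def extract(image, n_bits, plane=3):
    return [(int(p) >> plane) & 1 for p in image.flatten()[:n_bits]]
```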
Abstract: Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of classifiers using the same learning algorithm for the base classifiers. Boosting algorithms are considered stronger than bagging on noise-free data; however, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in this work we build an ensemble using a voting methodology over bagging and boosting ensembles with 10 sub-classifiers each. We performed a comparison with simple bagging and boosting ensembles with 25 sub-classifiers, as well as other well-known combining methods, on standard benchmark datasets, and the proposed technique was the most accurate.
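A minimal sketch of the proposed combination using scikit-learn, with decision trees as the default base classifiers; the dataset, hyper-parameters and soft-voting choice are illustrative stand-ins for the paper's experimental setup.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Bagging and boosting ensembles with 10 sub-classifiers each, combined by
# voting, versus plain bagging and boosting with 25 sub-classifiers.
bag = BaggingClassifier(n_estimators=10, random_state=0)
boost = AdaBoostClassifier(n_estimators=10, random_state=0)
combined = VotingClassifier(estimators=[("bag", bag), ("boost", boost)],
                            voting="soft")

for name, clf in [("bagging(25)", BaggingClassifier(n_estimators=25, random_state=0)),
                  ("boosting(25)", AdaBoostClassifier(n_estimators=25, random_state=0)),
                  ("bagging+boosting vote", combined)]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```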
Abstract: The knowledge base of welding defect recognition is essentially incomplete. This characteristic means that the recognition results do not reflect the actual situation, and it further influences the classification of welding quality. This paper is concerned with a rough-set-based method to reduce this influence and improve the classification accuracy. First, a rough set model for the intelligent classification of welding quality is built, and both condition and decision attributes are specified. Then, groups of representative compound defects are chosen from the defect library and classified correctly to form the decision table. Finally, the redundant information in the decision table is removed by attribute reduction and the optimal decision rules are obtained. With this method, we are able to reclassify the misclassified defects to the right quality level. Compared with ordinary methods, this method has higher accuracy and better robustness.
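A minimal sketch of the rough set machinery the method relies on (indiscernibility classes and lower approximations over a decision table); the attributes and defect records below are made-up placeholders, not the paper's actual defect library.

```python
from collections import defaultdict

# Toy decision table: condition attributes describing a weld defect,
# plus a decision attribute (quality level).
table = [
    {"type": "pore",  "size": "small", "count": "few",  "quality": "II"},
    {"type": "pore",  "size": "small", "count": "few",  "quality": "II"},
    {"type": "crack", "size": "small", "count": "few",  "quality": "IV"},
    {"type": "pore",  "size": "large", "count": "many", "quality": "III"},
]

def partition(rows, attrs):
    """Indiscernibility classes: rows identical on the given attributes."""
    classes = defaultdict(list)
    for i, r in enumerate(rows):
        classes[tuple(r[a] for a in attrs)].append(i)
    return list(classes.values())

def lower_approximation(rows, attrs, decision, value):
    """Objects certainly belonging to the decision class `value`."""
    target = {i for i, r in enumerate(rows) if r[decision] == value}
    return {i for block in partition(rows, attrs)
            for i in block if set(block) <= target}

# Dropping an attribute without changing the lower approximations means it
# is redundant; the remaining attributes form a reduct used for the rules.
print(lower_approximation(table, ["type", "size", "count"], "quality", "II"))
print(lower_approximation(table, ["type", "size"], "quality", "II"))
```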