Abstract: Nowadays, video data embedding is a challenging and interesting approach to keeping real-time video data secure, and the technique can be used in high-level applications. The rate-distortion of an image is not guaranteed, because the gains provided by accurate image-frame segmentation are offset by the inefficiency of coding objects of arbitrary shape, with losses that depend on both the coding scheme and the object structure. By using a rate controller in association with the encoder, the target bitrate can be adjusted dynamically. This paper discusses how to secure videos by mixing signature data into the original video with negligible distortion, keeping the quality of the steganographic video as close as possible to that of the original. We propose a method for embedding the signature data into separate video frames using the block Discrete Cosine Transform (DCT). These frames are then encoded with the real-time H.264 encoding scheme. Finally, recovery of the original video and the signature data at the receiver end is described.
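The block-DCT embedding idea can be sketched as follows. This is a minimal, hypothetical illustration using quantization-index modulation of one mid-frequency coefficient per 8x8 block, not the paper's exact algorithm; the coefficient position `(4, 3)` and step size `q` are assumptions.

```python
import math

def dct2(block):
    """Orthonormal 8x8 2-D DCT-II of a block given as a list of lists."""
    N = 8
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = 0.0
            for x in range(N):
                for y in range(N):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            cu = math.sqrt(1 / N) if u == 0 else math.sqrt(2 / N)
            cv = math.sqrt(1 / N) if v == 0 else math.sqrt(2 / N)
            out[u][v] = cu * cv * s
    return out

def idct2(coef):
    """Inverse of dct2 (orthonormal, so the round trip is exact)."""
    N = 8
    out = [[0.0] * N for _ in range(N)]
    for x in range(N):
        for y in range(N):
            s = 0.0
            for u in range(N):
                for v in range(N):
                    cu = math.sqrt(1 / N) if u == 0 else math.sqrt(2 / N)
                    cv = math.sqrt(1 / N) if v == 0 else math.sqrt(2 / N)
                    s += (cu * cv * coef[u][v]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            out[x][y] = s
    return out

def embed_bit(block, bit, q=16, pos=(4, 3)):
    """Hide one signature bit by forcing a mid-frequency DCT coefficient
    to an even (bit 0) or odd (bit 1) multiple of the step q."""
    c = dct2(block)
    u, v = pos
    k = round(c[u][v] / q)
    if k % 2 != bit:
        k += 1
    c[u][v] = k * q
    return idct2(c)

def extract_bit(block, q=16, pos=(4, 3)):
    """Recover the hidden bit from the parity of the quantized coefficient."""
    c = dct2(block)
    return round(c[pos[0]][pos[1]] / q) % 2
```

A real embedder would iterate this over all luminance blocks of selected frames before H.264 encoding; the parity trick survives the exact DCT round trip shown here, but robustness to lossy compression would need a larger `q`.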
Abstract: Building a service-centric business model requires new knowledge and capabilities in companies. This paper highlights the challenges small and medium-sized enterprises (SMEs) face when developing service-centric business models, and examines the premises for the knowledge transfer and capability development required. The objective of the paper is to increase knowledge about SMEs' transformation to service-centric business models. The paper reports an action-research-based case study and provides empirical evidence from three case companies; the empirical data was collected through multiple methods. The findings are: first, a model developed to analyze the current state of companies; second, a process for building service-centric business models; third, the selection of suitable service development methods. The lack of a holistic understanding of service logic suggests that SMEs need practical, easy-to-use methods to improve their business.
Abstract: A reconfigurable manufacturing system (RMS) is an advanced system designed at the outset for rapid changes in its hardware and software components, so that its production capacity and functionality can be adjusted quickly. Among various operational decisions, this study considers the scheduling problem that determines the input sequence and the schedule simultaneously for a given set of parts. In particular, we consider the practical constraint that the numbers of pallets/fixtures are limited, so a part can be released into the system only when the fixture it requires is available. To solve the integrated input sequencing and scheduling problem, we suggest a priority-rule-based approach in which the two sub-problems are solved using a combination of priority rules. To show the effectiveness of various rule combinations, a simulation experiment was conducted on data from a real RMS, and the test results are reported.
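The interaction between fixture-constrained release and priority-rule dispatching can be sketched as below. This is a toy model, not the paper's rule set: it assumes parallel identical machines, uses shortest processing time (SPT) as the single priority rule for both sub-problems, and seizes a fixture at release and returns it at completion.

```python
import heapq

def schedule_spt(parts, fixtures, machines=2):
    """Integrated input sequencing and scheduling under limited fixtures.
    parts: list of (part_id, fixture_type, proc_time).
    fixtures: dict fixture_type -> number of pallets/fixtures available.
    Returns [(part_id, completion_time), ...] in completion order."""
    avail = dict(fixtures)                       # free fixtures per type
    pending = sorted(parts, key=lambda p: p[2])  # SPT priority order
    running = []                                 # heap of (finish_time, part)
    free = machines
    t = 0.0
    completed = []
    while pending or running:
        started = True
        while started:  # release every part whose fixture and a machine are free
            started = False
            for p in pending:
                if free > 0 and avail[p[1]] > 0:
                    avail[p[1]] -= 1             # fixture seized at release
                    free -= 1
                    heapq.heappush(running, (t + p[2], p))
                    pending.remove(p)
                    started = True
                    break
        finish, p = heapq.heappop(running)       # advance to next completion
        t = finish
        avail[p[1]] += 1                         # fixture returned
        free += 1
        completed.append((p[0], finish))
    return completed
```

With two machines, parts `[(1,'A',3), (2,'A',2), (3,'B',4)]` and one fixture of each type, the single 'A' fixture forces part 1 to wait even though a machine is idle, which is exactly the coupling the abstract describes.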
Abstract: Ontology-based case-based reasoning (CBR) systems have become a hot topic as a way to integrate knowledge in heterogeneous CBR systems. To address their open problems (nonstandard architectures, poor reuse of knowledge in legacy CBR systems, difficult ontology construction, etc.), we propose a novel approach for semi-automatically constructing an ontology-based CBR system whose architecture rests on a two-layer ontology. Domain knowledge implied in legacy case bases is mapped automatically from relational database schemas and knowledge items to the relevant OWL local ontology by a mapping algorithm with low time complexity. Through concept clustering based on formal concept analysis, computing concept equation and concept inclusion measures, suggestions for enriching or amending the concept hierarchy of the OWL local ontologies are generated automatically, which helps designers achieve semi-automatic construction of the OWL domain ontology. The approach is validated with an application example.
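The schema-to-ontology mapping step can be illustrated with a toy version: each table becomes an OWL class, each plain column a datatype property, and each foreign key an object property whose range is the referenced table's class. This is a minimal sketch of the general idea, not the paper's mapping algorithm; the input format is an assumption.

```python
def schema_to_ontology(tables):
    """Map a relational schema to OWL-style triples.
    tables: {table_name: {'columns': [col, ...], 'fks': {col: target_table}}}
    Returns a list of (subject, predicate, object) triples."""
    triples = []
    for name, t in tables.items():
        triples.append((name, 'rdf:type', 'owl:Class'))
        for col in t['columns']:
            prop = f'{name}.{col}'
            if col in t.get('fks', {}):
                # foreign key -> object property linking two classes
                triples.append((prop, 'rdf:type', 'owl:ObjectProperty'))
                triples.append((prop, 'rdfs:domain', name))
                triples.append((prop, 'rdfs:range', t['fks'][col]))
            else:
                # ordinary column -> datatype property
                triples.append((prop, 'rdf:type', 'owl:DatatypeProperty'))
                triples.append((prop, 'rdfs:domain', name))
    return triples
```

A real implementation would also map rows to individuals and emit OWL/RDF syntax; the triple list here just makes the structural correspondence concrete.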
Abstract: This paper presents a customized deformable model for the segmentation of abdominal and thoracic aortic aneurysms in CTA datasets. An important challenge in reliably detecting aortic aneurysms is overcoming the problems associated with intensity inhomogeneities and image noise. Level sets belong to an important class of methods that utilize partial differential equations (PDEs) and have been applied extensively in image segmentation. A Gaussian kernel function in the level-set formulation, which extracts local intensity information, helps suppress noise in the extracted regions of interest and then guides the motion of the evolving contour toward weak boundaries. The speed of curve evolution has been improved significantly, with a resulting decrease in segmentation time compared with previous level-set implementations. The results indicate that the method copes with intensity inhomogeneities more effectively than other approaches.
Abstract: The research was conducted on fresh wild cranberries and cranberry cultivars grown in Latvia. The aim of the study was to evaluate the effect of pre-treatment method and drying conditions on the composition of volatile compounds in cranberries. The berry pre-treatment methods were perforation, halving, and steam blanching. Before drying in a cabinet drier, the berries were pre-treated with all three methods; before drying in a microwave vacuum drier, steam blanching and halving were used. Volatile compounds in cranberries were analysed by GC-MS of extracts obtained by SPME. In the present research, 21 different volatile compounds were detected in fresh cranberries: 15 in the cultivar 'Steven', 13 in 'Bergman' and 'Early black', 11 in 'Ben Lear' and 'Pilgrim', and 14 in wild cranberries. In dried cranberries, 20 volatile compounds were detected. Mathematical data processing supports the conclusion that cranberry cultivar, pre-treatment method, and drying conditions significantly influence the volatile compounds in berries and the formation of new volatile compounds.
Abstract: Pore water pressure normally arises from consolidation, compaction, and water-level fluctuation in the reservoir. Measuring, controlling, and analyzing pore water pressure are of significant importance during both the construction and the operation period. Since the end of 2002 (dam start-up), the behaviour of KARKHEH dam has been analyzed using information gathered from the dam's instrumentation system. In this paper, the condition of the dam after start-up is analyzed using data gathered from the piezometers located in the dam core. According to the TERZAGHI equation and the piezometer records, consolidation lasted around five years during the early years of the construction stage, and the current pore water pressure in the dam core is caused by water-level fluctuation in the reservoir. There is, however, a time lag between water-level fluctuation and the piezometer readings. These time lags have been examined, and the results clearly show that one of their most important causes is the distance between the piezometer and the reservoir.
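The TERZAGHI consolidation analysis referred to above rests on classical one-dimensional theory, in which the average degree of consolidation U depends on the time factor Tv = cv·t/H². The sketch below uses the standard textbook series solution; any cv and H values supplied to it are placeholders, not KARKHEH dam parameters.

```python
import math

def degree_of_consolidation(Tv, terms=100):
    """Average degree of consolidation U for 1-D vertical drainage
    (Terzaghi theory): U = 1 - sum 2/M^2 * exp(-M^2 Tv), M = pi(2m+1)/2."""
    s = 0.0
    for m in range(terms):
        M = math.pi * (2 * m + 1) / 2
        s += (2.0 / M ** 2) * math.exp(-M ** 2 * Tv)
    return 1.0 - s

def time_for_U(U_target, cv, H):
    """Time to reach a target U, inverting U(Tv) by bisection.
    cv: coefficient of consolidation [m^2/yr]; H: drainage path [m]."""
    lo, hi = 1e-6, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if degree_of_consolidation(mid) < U_target:
            lo = mid
        else:
            hi = mid
    return lo * H ** 2 / cv  # t = Tv * H^2 / cv
```

The well-known anchor points Tv ≈ 0.197 for U = 50% and Tv ≈ 0.848 for U = 90% fall out of the series directly, which is a quick sanity check on any piezometer-based back-analysis.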
Abstract: This paper outlines the application of Knowledge Management (KM) principles in the context of educational institutions. It caters to the needs of engineering institutions for imparting quality education by delineating the instruction-delivery process in a highly structured, controlled, and quantified manner, using the software tool EDULOGIC+. The central idea is based on the engineering-education pattern of Indian universities and institutions. The data, contents, and results produced over contiguous years build the necessary ground for managing the accumulated knowledge. The application of KM is explained using examples of data analysis and knowledge extraction.
Abstract: Digital libraries are becoming increasingly necessary to support users with powerful, easy-to-use tools for searching, browsing, and retrieving media information. The starting point for these tasks is the segmentation of video content into shots. To segment MPEG video streams into shots, this study develops a fully automatic procedure that detects both abrupt and gradual transitions (dissolves and fade groups) in real time with minimal decoding. Each transition type is handled in two phases: macro-block type analysis in B-frames, and on-demand intensity-information analysis. The experimental results show remarkable performance in detecting gradual transitions for some kinds of input data and comparable results for the rest of the examined video streams. Almost all abrupt transitions could be detected with very few false-positive alarms.
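A minimal stand-in for abrupt-transition detection is a normalized histogram difference between consecutive frames; the paper's macro-block and intensity analysis is more sophisticated, so the bin count and threshold below are illustrative assumptions only.

```python
def gray_hist(frame, bins=16):
    """Histogram of a frame given as a flat list of 0-255 intensities."""
    h = [0] * bins
    for px in frame:
        h[min(px * bins // 256, bins - 1)] += 1
    return h

def detect_cuts(frames, threshold=0.5):
    """Flag frame index i as an abrupt transition (cut) when the
    normalized L1 histogram distance to frame i-1 exceeds threshold.
    The distance is in [0, 1]: 0 for identical histograms, 1 for disjoint."""
    cuts = []
    for i in range(1, len(frames)):
        h1, h2 = gray_hist(frames[i - 1]), gray_hist(frames[i])
        n = sum(h1)
        d = sum(abs(a - b) for a, b in zip(h1, h2)) / (2 * n)
        if d > threshold:
            cuts.append(i)
    return cuts
```

Gradual transitions (dissolves, fades) spread the histogram change over many frames, which is why they need the multi-phase evidence the abstract describes rather than a single-frame threshold.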
Abstract: Thousands of masters athletes participate quadrennially in the World Masters Games (WMG), yet this cohort of athletes remains proportionately under-investigated. Given the growing global obesity pandemic and the benefits of physical activity across the lifespan, the prevalence of obesity in this unique population was of particular interest. Data gathered on a sub-sample of 535 football-code athletes, aged 31-72 yrs (mean = 47.4, s = ±7.1), competing at the Sydney World Masters Games (2009) demonstrated a significantly (p
Abstract: In an emergency, combining a wireless sensor network's data with knowledge gathered from various other information sources and with navigation algorithms could help guide people safely to a building exit while avoiding risky areas. This paper presents an emergency-response and navigation-support architecture for data gathering, knowledge manipulation, and navigational support in an emergency situation. In the normal state, the system monitors the environment. When an emergency event is detected, the system sends messages to first responders and immediately separates risky areas from safe areas to establish escape paths. The main functionalities of the system include gathering data from a wireless sensor network deployed in a multi-storey indoor environment, processing it with information available in a knowledge base, and sharing the resulting decisions with first responders and people in the building. The proposed architecture reduces the risk of losing human lives by evacuating people much faster and with less congestion in an emergency environment.
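The escape-path idea can be sketched as risk-weighted shortest-path search over a graph of the building, where sensor readings inflate the cost of edges entering risky areas. This is a generic Dijkstra sketch under assumed data structures, not the paper's architecture; the risk-weighting formula is a hypothetical choice.

```python
import heapq

def safest_escape_path(graph, start, exits, risk):
    """Dijkstra over a building graph. Edge cost = distance * (1 + risk
    of the destination node), so risky corridors are avoided when a
    safe detour is cheap enough.
    graph: {node: [(neighbor, distance), ...]}
    exits: set of exit nodes; risk: {node: r >= 0} from sensor fusion."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u in exits:  # reconstruct path back to start
            path = [u]
            while u in prev:
                u = prev[u]
                path.append(u)
            return path[::-1], d
        if d > dist.get(u, float('inf')):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w * (1.0 + risk.get(v, 0.0))
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    return None, float('inf')
```

In the example below, the geometrically shorter corridor through node 'B' is rejected because a sensed hazard there makes its weighted cost higher than the detour through 'C'.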
Abstract: Short-Term Load Forecasting (STLF) plays an important role in the economic and secure operation of power systems. In this paper, a Continuous Genetic Algorithm (CGA) is employed to evolve the optimum structure and connection weights of large neural networks for the one-day-ahead electric load forecasting problem. The study describes the process of developing three-layer feed-forward large neural networks for load forecasting and then presents a heuristic search algorithm for an important task in this process: optimal network structure design. The proposed method is applied to STLF for a local utility. Data are clustered according to differences in their characteristics, and special days are extracted from the normal training sets and handled separately. In this way, a solution is provided for all load types, including working days, weekends, and special days. The large neural networks perform well, and the proposed methodology consistently yields lower percentage errors. It can therefore be applied to automatically design an optimal load forecaster based on historical data.
Abstract: The benefits of eco-roofs are quite well known; however, very little research has been conducted on the implementation of eco-roofs in subtropical climates such as Australia's. Australia faces many challenges as it moves into the future, and climate change is proving to be one of the leading ones. To move forward with the mitigation of climate change, the impacts of rapid urbanization need to be offset, and eco-roofs are one way to achieve this. This study presents the energy savings and environmental benefits of implementing eco-roofs in subtropical climates. An experimental set-up was installed at the Rockhampton campus of Central Queensland University, where two shipping containers were converted into small offices, one with an eco-roof and one without; these were used to collect temperature, humidity, and energy-consumption data. In addition, a computational model was developed using the Design Builder software (state-of-the-art building-energy simulation software) to simulate the energy consumption of the shipping containers and the environmental parameters, allowing comparison between simulated and real-world data. The study found that eco-roofs are very effective in subtropical climates, providing an energy saving of about 13%, which agrees well with the simulated results.
Abstract: This paper presents a new compact analytical model of the gate leakage current in high-k based nanoscale MOSFETs, assuming a two-step inelastic trap-assisted tunneling (ITAT) process as the conduction mechanism. The model combines the ITAT mechanism with a semi-empirical gate leakage current formulation from the BSIM 4 model. The gate tunneling currents are calculated as a function of gate voltage for different gate dielectric structures such as HfO2, Al2O3, and Si3N4 with an equivalent oxide thickness (EOT) of 1.0 nm. The proposed model is compared with Sentaurus simulation results to verify its accuracy, and excellent agreement is found between the analytical and simulated data. The proposed analytical model is suitable for different high-k gate dielectrics simply by adjusting two fitting parameters. It is also shown that gate leakage is reduced by introducing a high-k gate dielectric in place of SiO2.
Abstract: Most pedestrian-car accidents at signalized intersections occur because pedestrians cannot cross the intersection safely within the green phase. From the pedestrian's viewpoint, there may be two reasons. The first is that some pedestrians, such as the elderly, cannot speed up to cross the intersection. The other is that pedestrians do not sense that the signal phase is about to change and that their right-of-way is about to be lost. The first purpose of this study is to develop signal logic to protect pedestrians crossing an intersection. The second purpose is to improve the reliability and reduce the delay of public transportation services; therefore, bus preemption is also considered in the designed signal logic. In this study, traffic data for the intersection of Chong-Qing North Road and Min-Zu West Road, Taipei, Taiwan, are employed to calibrate and validate the signal logic by simulation. VISSIM 5.20, a microscopic traffic simulation package, is used to simulate the signal logic. The simulation results show that the signal logic presented in this study can successfully protect pedestrians crossing the intersection, and the bus-preemption design can reduce the average delay. However, the pedestrian-safety and bus-preemption signals strongly influence the average delay of cars. Thus, whether to apply the pedestrian-safety and bus-preemption signal logic to an isolated intersection should be evaluated carefully.
Abstract: An accurate prediction of the minimum fluidization velocity is a crucial hydrodynamic aspect of the design of fluidized bed reactors. Common approaches for predicting the minimum fluidization velocities of binary-solid fluidized beds are first discussed here. Data from our own careful experimental investigation of a binary-solid pair fluidized with water are then presented. The effect of the relative composition of the two solid species comprising the fluidized bed on the bed void fraction at the incipient fluidization condition is reported, and its influence on the minimum fluidization velocity is discussed. In this connection, the capability of packing models to predict the bed void fraction is also examined.
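For a single solid species, one of the common prediction approaches alluded to above is the Wen and Yu (1966) correlation, which binary-bed methods typically build on. A minimal sketch of that standard correlation follows; the particle and fluid properties used in the example are illustrative, not this paper's experimental values.

```python
import math

def u_mf_wen_yu(dp, rho_p, rho_f, mu, g=9.81):
    """Minimum fluidization velocity from the Wen & Yu correlation:
        Re_mf = sqrt(33.7^2 + 0.0408 * Ar) - 33.7,
    where Ar is the Archimedes number.
    dp: particle diameter [m]; rho_p, rho_f: solid/fluid density [kg/m^3];
    mu: fluid viscosity [Pa s]. Returns U_mf [m/s]."""
    Ar = rho_f * (rho_p - rho_f) * g * dp ** 3 / mu ** 2
    Re_mf = math.sqrt(33.7 ** 2 + 0.0408 * Ar) - 33.7
    return Re_mf * mu / (rho_f * dp)
```

For 1 mm glass beads (density 2500 kg/m^3) fluidized with water, the correlation gives U_mf of roughly 8 mm/s, a typical order of magnitude for liquid fluidization; correlations of this family differ mainly in the constants, which absorb the void fraction and sphericity at incipient fluidization.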
Abstract: In this paper we propose a new classification method for automatic sleep scoring using an artificial-neural-network-based decision tree. It treats sleep scoring as a series of two-class problems and solves them with a decision tree made up of a group of neural-network classifiers, each of which uses a special feature set and is aimed at one specific sleep stage in order to maximize the classification effect. A single electroencephalogram (EEG) signal is used for the analysis rather than multiple biological signals, which greatly simplifies the data acquisition process. Experimental results demonstrate that the average epoch-by-epoch agreement between the visual scoring and the proposed method in separating 30 s wakefulness+S1, REM, S2, and SWS epochs was 88.83%. The study shows that the proposed method performs well in all four stages and can effectively limit error propagation at the same time. It could therefore be an efficient method for automatic sleep scoring. Additionally, since it requires only a small volume of data, it could be suited to pervasive applications.
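The tree-of-binary-classifiers structure can be sketched as a cascade: each node is a one-vs-rest decision aimed at a single stage, and the first node that fires assigns the label, so later nodes never see epochs already classified. The thresholds and feature names below are toy stand-ins for the paper's trained neural-network classifiers and per-stage feature sets.

```python
def cascade_classify(features, stages, default='S2'):
    """Walk an ordered cascade of (label, predicate) binary deciders;
    the first predicate that accepts the epoch assigns its label,
    which is how error propagation is confined to later nodes."""
    for label, is_stage in stages:
        if is_stage(features):
            return label
    return default  # leaf reached: remaining epochs get the default stage

# Hypothetical EEG features per 30 s epoch: (delta_power, alpha_power, rem_density)
STAGES = [
    ('SWS',     lambda f: f[0] > 0.6),  # dominant delta -> slow-wave sleep
    ('Wake+S1', lambda f: f[1] > 0.5),  # dominant alpha -> wakefulness / S1
    ('REM',     lambda f: f[2] > 0.5),  # high eye-movement density -> REM
]
```

In the real method each predicate would be a neural network trained on a feature set chosen for its stage; the cascade order itself is a design choice that decides which stage absorbs ambiguous epochs.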
Abstract: Nowadays, more engineering systems use some kind of Artificial Intelligence (AI) in their processes. Well-known AI techniques include artificial neural nets, fuzzy inference systems, and neuro-fuzzy inference systems, among others. Many decision-making applications base their intelligent processes on fuzzy logic because of the capability of Fuzzy Inference Systems (FIS) to deal with problems based on user knowledge and experience. Moreover, since users have widely varying characteristics and generally provide uncertain data, this information can be used and properly processed by a FIS. To handle uncertainty and inexact system input values, a FIS normally uses Membership Functions (MFs) that represent a degree of user satisfaction with certain conditions and/or constraints. Knowledge from experts in the field is very important for defining the MF parameters; this knowledge defines the MF shape used to process the user inputs, and through fuzzy reasoning and inference mechanisms the FIS can provide an "appropriate" output. However, an important issue immediately arises: how can it be assured that the obtained output is the optimum solution, and how can it be guaranteed that each MF has an optimum shape? A viable answer to these questions is optimization of the MF parameters. In this paper a novel parameter-optimization process is presented. The process for FIS parameter optimization consists of five simple steps that can easily be carried out off-line. The proposed process is demonstrated by its implementation in an intelligent interface section dealing with the on-line customization/personalization of internet portals applied to e-commerce.
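What "optimizing an MF's parameters" means can be made concrete with a triangular MF and a grid search for the peak that best matches expert-rated samples. This is a toy stand-in for the paper's five-step off-line process; the sample format and error criterion are assumptions.

```python
def tri_mf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fit_peak(samples, a, c, steps=50):
    """Grid-search the peak b in (a, c) that minimizes squared error
    against expert-rated memberships.
    samples: list of (input_value, expert_membership) pairs."""
    best_b, best_err = None, float('inf')
    for i in range(1, steps):
        b = a + (c - a) * i / steps
        err = sum((tri_mf(x, a, b, c) - mu) ** 2 for x, mu in samples)
        if err < best_err:
            best_b, best_err = b, err
    return best_b
```

A full FIS optimizer would tune all three parameters of every MF jointly against system-level output quality rather than fitting each MF in isolation, but the same idea (adjust the shape to minimize an error measure off-line) applies.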
Abstract: National economic development affects vehicle ownership, which ultimately increases fuel consumption, and the rise in vehicle ownership is dominated by the increasing number of motorcycles. This research aims to identify the characteristics of fuel consumption and of the city transportation system, and to analyze the relationship between the city transportation system and fuel consumption and the effect of the former on the latter. A multivariable analysis is used in this study; the data-analysis techniques include a multivariate multivariable analysis using the R software. More than 84% of the fuel on Java is consumed in metropolitan and large cities. The city-transportation-system variables that strongly affect fuel consumption are population, public vehicles, private vehicles, and private buses. This method can be developed to control fuel consumption by considering the urban transport system and city typology, which can reduce fuel subsidies and improve the state economy.