Abstract: Tracing methods determine the contribution that power
system sources make to supplying the loads. The methods can be used
to assess transmission prices, but also to recover the fixed
transmission cost. This paper presents the influence that modifying
the commons structure has on the specific transfer price. The
operator must make use of a few basic allocation principles. Most
tracing methods are based on the proportional sharing principle; in
this paper, Kirschen's method is used. To illustrate the method, the
25-bus test system elaborated within the Electrical Power
Engineering Department in Timisoara, Romania, is used.
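The proportional sharing principle at the core of most tracing methods can be illustrated with a small sketch (Python is used purely for illustration; the bus data and the function name are invented for the example): each outflow of a bus is attributed to the bus inflows in proportion to their share of the total inflow.

```python
def proportional_shares(inflows, outflow):
    """Split one outflow of a bus among its inflows in proportion
    to each inflow's share of the total inflow (proportional sharing)."""
    total = sum(inflows.values())
    return {src: outflow * p / total for src, p in inflows.items()}

# A bus supplied with 60 MW from source A and 40 MW from source B:
# a 50 MW outgoing line is attributed 30 MW to A and 20 MW to B.
shares = proportional_shares({"A": 60.0, "B": 40.0}, 50.0)
```

Applying this rule bus by bus, from the sources downstream, is what lets a tracing method attribute every flow (and hence every cost) to individual sources.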
Abstract: Human-friendly interaction is the key function of a human-centered system. Over the years, much attention has been devoted to developing convenient interaction through intention recognition. Intention recognition processes multimodal inputs, including speech, face images, and body gestures. In this paper, we suggest a novel approach to intention recognition using a graph representation called the Intention Graph. A concept of valid intention is proposed as the target of intention recognition. Our approach has two phases: a goal recognition phase and an intention recognition phase. In the goal recognition phase, we generate an action graph based on the observed actions, and then the candidate goals and their plans are recognized. In the intention recognition phase, the intention is recognized using the relevant goals and the user profile. We show that the algorithm has polynomial time complexity. The Intention Graph is applied to a simple briefcase domain to test our model.
Abstract: Humans perceive color in categories, which may be
identified using color names such as red, blue, etc. The
categorization is unique to each human being. However, despite the
individual differences, the categorization is shared among the
members of a society. This allows communication among them,
especially when using color names. A sociable robot, in order to
coexist with humans and become part of human society, must also
have this shared color categorization, which can be achieved
through learning. Much work has been done to enable the computer,
as the brain of a robot, to learn color categorization. Most of it
relies on modeling human color perception and on considerable
mathematical complexity. In contrast, in this work the computer
learns color categorization through interaction with humans. This
work aims at developing the innate ability of the computer to learn
human-like color categorization. It focuses on the representation
of the color categorization and on how it is built and developed
without much mathematical complexity.
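As a rough illustration of learning color categories through interaction rather than through perceptual modeling, a minimal sketch (in Python; the class and the nearest-centroid rule are our own assumptions, not the paper's actual representation) can keep one running-mean prototype per color name taught by a human and name new colors by the nearest prototype:

```python
class ColorLearner:
    """Learn color names from human-given examples by keeping a
    running-mean prototype per named category; classify a new color
    by the nearest prototype (squared RGB distance)."""

    def __init__(self):
        self.centroids = {}  # name -> (mean_rgb, sample_count)

    def teach(self, rgb, name):
        """A human shows an RGB sample and says its name."""
        mean, n = self.centroids.get(name, ((0.0, 0.0, 0.0), 0))
        new_mean = tuple((m * n + c) / (n + 1) for m, c in zip(mean, rgb))
        self.centroids[name] = (new_mean, n + 1)

    def name_of(self, rgb):
        """Return the learned name whose prototype is nearest."""
        def dist2(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return min(self.centroids,
                   key=lambda k: dist2(self.centroids[k][0], rgb))

learner = ColorLearner()
learner.teach((255, 0, 0), "red")
learner.teach((245, 10, 5), "red")
learner.teach((0, 0, 255), "blue")
```

The representation grows only through teaching episodes, with no explicit model of human color perception.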
Abstract: The paper presents an analytical solution for the dispersion
of a solute in the peristaltic motion of a couple stress fluid
through a porous medium with a slip condition, in the presence of both
homogeneous and heterogeneous chemical reactions. The average
effective dispersion coefficient has been found using Taylor's limiting
condition and the long wavelength approximation. The effects of the
various relevant parameters on the average dispersion coefficient have
been studied. The average effective dispersion coefficient tends to
increase with the permeability parameter but tends to decrease with
the homogeneous chemical reaction rate parameter, the couple stress
parameter, the slip parameter and the heterogeneous reaction rate
parameter.
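For orientation, the classical Taylor limiting result that analyses of this kind generalize, stated here for laminar flow in a circular tube of radius a rather than for the paper's peristaltic couple stress flow, gives the average effective dispersion coefficient as

    D_eff = D_m + (a^2 u_mean^2) / (48 D_m),

where D_m is the molecular diffusivity and u_mean the mean axial velocity.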
Abstract: This paper describes a segmentation algorithm based
on the cooperation of an optical flow estimation method with edge
detection and region growing procedures.
The proposed method has been developed as a pre-processing
stage to be used in methodologies and tools for video/image indexing
and retrieval by content. The addressed problem consists of
extracting whole objects from the background in order to produce
images of single complete objects from videos or photos. The
extracted images are used to calculate the object visual features
needed by both the indexing and retrieval processes.
The first task of the algorithm exploits the cues from motion
analysis for moving area detection. Objects and background are then
refined using respectively edge detection and region growing
procedures. These tasks are iteratively performed until objects and
background are completely resolved.
The developed method has been applied to a variety of indoor and
outdoor scenes in which objects of different types and shapes
appear against variously textured backgrounds.
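A region growing procedure of the kind used to refine the background can be sketched in a few lines (pure Python on a plain intensity grid; the 4-connectivity and the fixed tolerance are simplifying assumptions, not the paper's exact procedure):

```python
from collections import deque

def region_grow(image, seed, tol):
    """Grow a region from a seed pixel: repeatedly absorb 4-connected
    neighbors whose intensity lies within `tol` of the seed intensity."""
    h, w = len(image), len(image[0])
    base = image[seed[0]][seed[1]]
    region, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w and (nr, nc) not in region
                    and abs(image[nr][nc] - base) <= tol):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

# A bright 2x2 object on a dark background grows from one seed pixel.
img = [[0, 0, 0, 0],
       [0, 9, 9, 0],
       [0, 9, 9, 0],
       [0, 0, 0, 0]]
obj = region_grow(img, (1, 1), tol=2)
```

In the cooperative scheme, a region grown this way is iterated against the edge and motion cues until object and background stop changing.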
Abstract: In this era of competitiveness, there is a growing need for supply chains to become competitive enough to handle pressures such as varying customer expectations, the demand for low-cost, high-quality products delivered in minimum time and, most importantly, cut-throat competition on a worldwide scale. In recent years, supply chain competitiveness has therefore been accepted as one of the most important philosophies in the supply chain literature. Various researchers and practitioners have tried to identify and implement strategies that can bring competitiveness to supply chains, i.e. supply chain competitiveness. The purpose of this paper is to suggest selected strategies for supply chain competitiveness in the Indian manufacturing sector, using an integrated approach of literature review and exploratory interviews with eminent professionals from the supply chain area in industry, academia and research. The aim of the paper is to highlight the important area of competitiveness in the supply chain and to offer recommendations to the industry and to managers in the manufacturing sector.
Abstract: Wireless sensor networks (WSNs) have gained
tremendous attention in recent years due to their numerous
applications. Because energy resources are limited, the energy-efficient
operation of sensor nodes is a key issue in wireless sensor networks.
Cooperative caching which ensures sharing of data among various
nodes reduces the number of communications over the wireless
channels and thus enhances the overall lifetime of a wireless sensor
network. In this paper, we propose a cooperative caching scheme
called ZCS (Zone Cooperation at Sensors) for wireless sensor
networks. In ZCS scheme, one-hop neighbors of a sensor node form a
cooperative cache zone and share the cached data with each other.
Simulation experiments show that the ZCS caching scheme achieves
significant improvements in byte hit ratio and average query latency
in comparison with other caching strategies.
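The zone cooperation idea can be sketched as follows (Python used only for illustration; the class, the statistics names and the returned data are invented for the example): a node answers a query first from its own cache, then from the caches of its one-hop neighbors, and only on a zone-wide miss fetches from the data source.

```python
class SensorNode:
    """Sketch of zone cooperation: a node's one-hop neighbors form its
    cache zone, and their caches are consulted before the data source."""

    def __init__(self):
        self.cache = {}      # item id -> data
        self.neighbors = []  # one-hop neighbors forming the zone

    def query(self, item, stats):
        if item in self.cache:
            stats["local_hits"] += 1
            return self.cache[item]
        for nb in self.neighbors:          # ask the cache zone
            if item in nb.cache:
                stats["zone_hits"] += 1
                return nb.cache[item]
        stats["misses"] += 1               # fetch from the data source
        self.cache[item] = f"data-{item}"
        return self.cache[item]

a, b = SensorNode(), SensorNode()
a.neighbors, b.neighbors = [b], [a]
stats = {"local_hits": 0, "zone_hits": 0, "misses": 0}
a.query(1, stats)   # miss: fetched from the source, cached at a
b.query(1, stats)   # zone hit: served from neighbor a's cache
a.query(1, stats)   # local hit
```

Every zone hit replaces a multi-hop fetch from the source with a one-hop exchange, which is where the energy saving comes from.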
Abstract: A study of the attainable watermark data rate for information hiding algorithms is presented in this paper. As the perceptual entropy of wideband monophonic audio signals is in the range of four to five bits per sample, a significant amount of additional information can be inserted into the signal without causing any perceptual distortion. Experimental results showed that transform-domain watermark embedding considerably outperforms watermark embedding in the time domain, and that signal decompositions with a high transform coding gain, such as the wavelet transform, are the most suitable for high data rate information hiding. Keywords: Digital watermarking, information hiding, audio watermarking, watermark data rate.
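A back-of-envelope calculation shows why the headroom is significant (the sampling rate and per-sample payload below are illustrative assumptions, not the paper's measured rates): even one embedded bit per sample of CD-rate monophonic audio already corresponds to tens of kilobits per second.

```python
# Illustrative arithmetic only; values are assumed, not experimental.
sample_rate = 44_100      # samples per second (CD-rate monophonic audio)
bits_per_sample = 1       # assumed embedded payload per sample
rate_kbps = sample_rate * bits_per_sample / 1000  # watermark data rate
```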
Abstract: In this paper, we propose a novel FinFET with an
extended body under the poly gate, called the EB-FinFET, and
demonstrate its characteristics using three-dimensional (3-D)
numerical simulation. We have analyzed it and compared it with the
conventional FinFET. The dependence of drain induced barrier
lowering (DIBL) and subthreshold swing (S.S.) on the extended body
height has also been investigated. According to the 3-D numerical
simulation, the proposed device has a firm structure, an acceptable
short channel effect (SCE), a reduced series resistance, an
increased on-state drain current (I_on) and a large normalized
I_DS. Furthermore, the structure can also mitigate the corner
effect and reduce the self-heating effect thanks to the extended
body. Our results show that the EB-FinFET is excellent for
nanoscale devices.
Abstract: The ability to recognize humans and their activities by computer vision is a very important task with many potential applications. The study of human motion analysis is related to several research areas of computer vision, such as motion capture and the detection, tracking and segmentation of people. In this paper, we describe a segmentation method for extracting the human body contour in a modified HLS color space. To estimate the background, the modified HLS color space is proposed, and the background features are estimated using the HLS color components. A large human dataset, collected with DV cameras, is pre-processed. The human body and its contour are successfully extracted from the image sequences.
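A minimal sketch of background labelling in HLS space (using Python's standard colorsys module; the tolerance test, and the decision to ignore hue for near-gray backgrounds, are our simplifications rather than the paper's modified color space):

```python
import colorsys

def to_hls(rgb):
    """Convert an 8-bit RGB pixel to (hue, lightness, saturation)."""
    r, g, b = (c / 255.0 for c in rgb)
    return colorsys.rgb_to_hls(r, g, b)

def is_background(pixel, bg_hls, tol=0.1):
    """Label a pixel as background when its lightness and saturation
    lie within `tol` of the estimated background's components (hue is
    unstable for near-gray pixels, so this sketch ignores it)."""
    _, l, s = to_hls(pixel)
    _, bl, bs = bg_hls
    return abs(l - bl) <= tol and abs(s - bs) <= tol

bg = to_hls((30, 30, 30))          # estimated dark-gray background
```

Pixels rejected by such a test form the foreground mask from which the body contour is then traced.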
Abstract: In this paper we propose a new traffic simulation
package, TDMSim, which supports both macroscopic and
microscopic simulation on free-flowing and regulated traffic systems.
Both simulators are based on travel demands, which specify the
numbers of vehicles departing from origins to arrive at different
destinations. The microscopic simulator implements the car-following
model given the pre-defined routes of the vehicles, but also
supports the rerouting of vehicles. We also propose a macroscopic
simulator which is built in integration with the microscopic simulator
to allow the simulation to be scaled for larger networks without
sacrificing the precision achievable through the microscopic
simulator. The macroscopic simulator also enables the reuse of
previous simulation results when simulating traffic on the same
networks at a later time. Validations have been conducted to show the
correctness of both simulators.
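A car-following update of the general kind such a microscopic simulator implements can be sketched as below (Python for illustration; the specific speed rule and the v_max and min_gap values are assumptions, since the abstract does not specify the model):

```python
def step(positions, speeds, v_max=30.0, min_gap=5.0, dt=1.0):
    """One tick of a simple single-lane car-following update: each
    vehicle drives at most v_max but never closes within min_gap of
    its leader. positions[0] is the unobstructed lead vehicle."""
    new_pos, new_spd = [], []
    for i, (x, v) in enumerate(zip(positions, speeds)):
        if i == 0:
            v_new = v_max                        # free-flowing leader
        else:
            gap = new_pos[i - 1] - x - min_gap   # room behind the leader
            v_new = max(0.0, min(v_max, gap / dt))
        new_spd.append(v_new)
        new_pos.append(x + v_new * dt)
    return new_pos, new_spd

pos, spd = step([100.0, 90.0], [0.0, 0.0])
```

A macroscopic layer can then aggregate such per-vehicle updates into link-level flows, which is what lets the combined simulator scale to larger networks.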
Abstract: The aim of this study was to compare the
sensitometric properties of commonly used radiographic films
processed with chemical solutions in different workload hospitals.
The effect of different processing conditions on induced densities on
radiologic films was investigated. Two accessible double emulsions
Fuji and Kodak films were exposed with 11-step wedge and
processed with Champion and CPAC processing solutions. The
films were processed in centers with both high and low workloads.
Our findings show that the speed and contrast of the Kodak
film-screen combination in both workloads (high and low) are higher
than those of the Fuji film-screen combination for both processing
solutions. However, there were significant differences in film
contrast for both workloads when the CPAC solution was used
(p=0.000 and 0.028). The results showed that the base-plus-fog
density of the Kodak film was lower than that of the Fuji film.
Generally, the Champion processing solution produced more speed and
contrast for the investigated films under the different conditions,
and there were significant differences at the 95% confidence level
between the two processing solutions (p=0.01). The low base-plus-fog
density of the Kodak films provides more visibility and accuracy,
and the higher contrast allows lower exposure factors to be used to
obtain better quality in the resulting radiographs. In this study we
found an economic advantage in using the Champion solution and
Kodak film, while also lowering the patient dose. Thus, in a
radiologic facility, any change in the film processor, processing
cycle or chemistry should be carefully investigated before being
applied to patients' radiological procedures.
Abstract: Two commercial proteases from Bacillus
licheniformis (Alcalase 2.4 L FG and Alcalase 2.5 L, Type DX) were
screened for the production of Z-Ala-Phe-NH2 in batch reaction.
Alcalase 2.4 L FG was the most efficient enzyme for the C-terminal
amidation of Z-Ala-Phe-OMe using ammonium carbamate as
ammonium source. Immobilization of protease has been achieved by
the sol-gel method, using dimethyldimethoxysilane (DMDMOS) and
tetramethoxysilane (TMOS) as precursors (unpublished results). In
batch production, about 95% of Z-Ala-Phe-NH2 was obtained at
30°C after 24 hours of incubation. Reproducibility of different
batches of commercial Alcalase 2.4 L FG preparations was also
investigated by evaluating the amidation activity and the entrapment
yields in the case of immobilization. A packed-bed reactor (0.68 cm
ID, 15.0 cm long) was operated successfully for the continuous
synthesis of peptide amides. The immobilized enzyme retained its
initial activity over 10 cycles of repeated use in the continuous
reactor at ambient temperature. At a 0.75 mL/min flow rate of the substrate
mixture, the total conversion of Z-Ala-Phe-OMe was achieved after 5
hours of substrate recycling. The product contained about 90%
peptide amide and 10% hydrolysis byproduct.
Abstract: The current study begins with an awareness that
today's media environment is characterized by technological
development and by a new way of reading brought about by the
introduction of the Internet. The researcher conducted a
meta-analysis framed within Technological Determinism to
investigate the process of hypertext reading, its differences from
linear reading, and the effects such differences can have on
people's ways of mentally structuring their world. The relationship
between literacy and the comprehension achieved by reading
hypertexts is also investigated. The results show that hypertexts
are not always user friendly. People experience hyperlinks as
interruptions that distract their attention, generating
comprehension problems and disorientation. On the one hand, the
jumping style of hypertext reading generates interruptions that
ultimately make people lose their concentration. On the other hand,
hypertexts fascinate people, who would rather read a document in
such a format even though the outcome is often frustrating and
affects their ability to elaborate and retain information.
Abstract: The plastic forming of sheet plate plays an important
role in metal forming. The traditional tool design techniques for
sheet forming operations used in industry are experimental and
expensive. Predicting the forming results and determining the
punching force, the blank holder forces and the thickness
distribution of the sheet metal will decrease the production cost
and the forming time. In this paper, a multi-stage deep drawing
simulation of an industrial part is presented using the finite
element method. All the production steps, together with additional
operations such as intermediate annealing and springback, have been
simulated with the ABAQUS software under axisymmetric conditions.
Simulation results such as the sheet thickness distribution, the
punch force and the residual stresses have been extracted at every
stage, and the sheet thickness distribution was compared with
experimental results. The comparison shows that the FE model is in
close agreement with the experiment.
Abstract: Spatial and mobile computing are evolving. This paper
describes a smart modeling platform called "GeoSEMA". The approach
models multidimensional GeoSpatial Evolutionary and Mobile Agents.
Beyond 3D and location-based issues, there are other dimensions
that may characterize spatial agents, e.g. discrete-continuous time
and agent behaviors. GeoSEMA is conceived as a dedicated design
pattern motivating temporal geographic-based applications; it is a
firm foundation for multipurpose and multidimensional spatial-based
applications. It deals with multipurpose smart objects (buildings,
shapes, missiles, etc.) by simulating geospatial agents.
Formally, GeoSEMA refers to geospatial, spatio-evolutive and
mobile space constituents; a conceptual geospatial space model
is given in this paper. In addition to modeling and categorizing
geospatial agents, the model incorporates the concept of
inter-agent event-based protocols. Finally, a rapid
software-architecture prototype of the GeoSEMA platform is also
given. It will be implemented and validated in the next phase of
our work.
Abstract: In a competitive production environment, critical
decisions are based on data obtained by random sampling of product
units. The efficiency of these decisions depends on the quality of
the data and on its reliability. This points to the necessity of a
reliable measurement system. The process of examining and analysing
measurement errors is therefore known as Measurement System
Analysis (MSA). The aim of this research is to establish the
necessity of, and support for, the extensive development of
measurement system analysis, particularly the use of Gage
Repeatability and Reproducibility (GR&R) studies to improve
physical measurements. Although repeatability and reproducibility
gage studies are well publicised in manufacturing industries, they
are not applied as widely as other measurement system analysis
methods. To introduce this method and provide feedback for
improving measurement systems, this survey focuses on the ANOVA
method as the most widespread way of calculating Repeatability and
Reproducibility (R&R).
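The ANOVA route to R&R can be sketched for a balanced crossed study (Python for illustration; the function and data layout are our own, and negative variance estimates are simply clipped to zero, as is common practice): repeatability is estimated from the error mean square, and reproducibility from the operator and part-by-operator components.

```python
def grr_anova(data):
    """Gage R&R variance components from a balanced crossed study via
    two-way ANOVA. data[i][j] is the list of r repeated measurements
    of part i by operator j. Returns (repeatability, reproducibility)
    variance estimates; negative component estimates are clipped to 0."""
    p, o, r = len(data), len(data[0]), len(data[0][0])
    grand = sum(x for part in data for cell in part for x in cell) / (p * o * r)
    part_mean = [sum(x for cell in part for x in cell) / (o * r)
                 for part in data]
    op_mean = [sum(data[i][j][k] for i in range(p) for k in range(r)) / (p * r)
               for j in range(o)]
    cell_mean = [[sum(cell) / r for cell in part] for part in data]

    ss_o = p * r * sum((m - grand) ** 2 for m in op_mean)
    ss_po = r * sum((cell_mean[i][j] - part_mean[i] - op_mean[j] + grand) ** 2
                    for i in range(p) for j in range(o))
    ss_e = sum((x - cell_mean[i][j]) ** 2
               for i in range(p) for j in range(o) for x in data[i][j])

    ms_o = ss_o / (o - 1)                    # operator mean square
    ms_po = ss_po / ((p - 1) * (o - 1))      # interaction mean square
    ms_e = ss_e / (p * o * (r - 1))          # error (repeatability)

    repeatability = ms_e
    reproducibility = (max(0.0, (ms_o - ms_po) / (p * r))
                       + max(0.0, (ms_po - ms_e) / r))
    return repeatability, reproducibility

# Two parts, two operators, two replicates; operator 2 reads +1 high.
rep, repro = grr_anova([[[5, 5], [6, 6]], [[7, 7], [8, 8]]])
```

With perfectly repeatable readings and a constant +1 operator bias, all of the gage variation shows up as reproducibility, as expected.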
Abstract: The acidity of different raw Jordanian clays
containing zeolite, bentonite, red and white kaolinite and diatomite
was characterized by means of temperature programmed desorption
(TPD) of ammonia, conversion of 2-methyl-3-butyn-2-ol (MBOH),
FTIR and BET measurements. FTIR spectra proved the presence of
silanol and bridged hydroxyls on the clay surface. The number of
acidic sites was calculated from the experimental TPD profiles. We
observed that the surface acidity decreases with decreasing Si/Al
ratio, except for diatomite. On the TPD plot for zeolite, two
maxima were registered, owing to the different strengths of the
surface acidic sites. The MBOH conversion, product yields and
selectivity were calculated for catalysis on the Jordanian clays.
We found that all the clay samples are able to convert MBOH into a
major product, 3-methyl-3-buten-1-yne (MBYNE), catalyzed by acidic
surface sites with a selectivity close to 70%. A correlation was
found between the MBOH conversion and the acidity of the clays
determined by TPD-NH3, i.e. the higher the acidity, the higher the
conversion of MBOH. However, diatomite provided the lowest
conversion of MBOH as a result of the poor polarization of its
silanol groups. Comparison of the surface areas and conversions
revealed the highest density of active sites for red kaolinite and
the lowest for zeolite and diatomite.
Abstract: The problem of frequent pattern discovery is defined
as the process of searching for patterns such as sets of features or items that appear in data frequently. Finding such frequent patterns
has become an important data mining task because it reveals associations, correlations, and many other interesting relationships
hidden in a database. Most of the proposed frequent pattern mining
algorithms have been implemented with imperative programming
languages. Such a paradigm is inefficient when the set of patterns
is large and the frequent patterns are long. We suggest applying a
high-level declarative style of programming to the problem of
frequent pattern discovery. We consider two languages: Haskell and
Prolog. Our
intuitive idea is that the problem of finding frequent patterns should
be efficiently and concisely implemented via a declarative paradigm
since pattern matching is a fundamental feature supported by most
functional languages and Prolog. Our frequent pattern mining
implementation using the Haskell and Prolog languages confirms our
hypothesis about the conciseness of the programs. Comparative
studies of the lines of code, speed and memory usage of the
declarative versus the imperative implementations are reported in
the paper.
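To make the task concrete, a level-wise frequent pattern miner can be sketched in a functional style (shown here in Python rather than the paper's Haskell or Prolog; transactions are sets and min_support is an absolute count):

```python
from itertools import combinations

def frequent_patterns(transactions, min_support):
    """Level-wise (Apriori-style) frequent itemset mining: build
    k-item candidates whose (k-1)-item subsets are all frequent, and
    keep those contained in at least min_support transactions."""
    items = sorted({i for t in transactions for i in t})

    def support(itemset):
        return sum(1 for t in transactions if set(itemset) <= t)

    result, level = {}, [(i,) for i in items]
    while level:
        frequent = [c for c in level if support(c) >= min_support]
        result.update({c: support(c) for c in frequent})
        seeds = sorted({i for c in frequent for i in c})
        k = len(level[0]) + 1
        level = [c for c in combinations(seeds, k)
                 if all(s in result for s in combinations(c, k - 1))]
    return result

fp = frequent_patterns([{'a', 'b'}, {'a', 'c'}, {'a', 'b', 'c'}], 2)
```

The candidate-pruning step is a direct use of pattern containment, which is the kind of matching that declarative languages express especially concisely.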
Abstract: Nowadays computer worms, viruses and Trojan horses,
collectively called malware, have become widespread. A decade ago,
such malware merely damaged computers by deleting or rewriting
important files; recent malware, however, seems to be born to earn
money. Some malware collects personal information so that malicious
people can obtain secrets such as online banking passwords,
evidence for a scandal, or contact addresses related to a target.
Moreover, the relationship between money and malware has become
more complex: many kinds of malware spawn bots to obtain
springboards for further attacks. Meanwhile, for ordinary Internet
users, countermeasures against malware have come up against a blank
wall. Pattern matching wastes too many computer resources, since
matching tools have to deal with a large number of patterns derived
from subspecies, and virus-making tools can automatically generate
such subspecies of malware. Moreover, metamorphic and polymorphic
malware are no longer special. Recently, malware-checking sites
have appeared that check content in place of the users' PCs;
however, a new type of malicious site has emerged that evades
checking by these malware-checking sites. In this paper, existing
protocols and methods related to the web are reconsidered in terms
of protection from current attacks, and a new protocol and method
are presented for the purpose of web security.