Abstract: Tracing methods determine the contribution that power
system sources make to supplying individual loads. The methods can be used
to assess transmission prices, but also to recover the fixed transmission
cost. This paper presents the influence that modifying the
commons structure has on the specific price of transfer. The operator
must apply a few basic allocation principles. Most
tracing methods are based on the proportional sharing principle. In this
paper, Kirschen's method is used. To illustrate this method, the 25-
bus test system is used, elaborated within the Electrical Power
Engineering Department in Timisoara, Romania.
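The proportional sharing principle the abstract refers to can be sketched as follows: each outflow from a bus is assumed to carry the same mix of sources as the bus's total inflow. A minimal, hypothetical three-bus illustration (not the 25-bus test system, and not Kirschen's full commons algorithm):

```python
# Minimal sketch of the proportional sharing principle behind most
# tracing methods. The 3-bus network below is hypothetical.

def trace_contributions(gen, flows, order):
    """gen: {bus: MW injected}; flows: {(from, to): MW}; order: buses in
    flow (topological) order. Returns, per bus, the fraction of its
    through-flow supplied by each generator."""
    share = {b: {} for b in order}
    for b in order:
        inflow = gen.get(b, 0.0) + sum(f for (u, v), f in flows.items() if v == b)
        if inflow == 0:
            continue
        mix = {}
        if b in gen:
            mix[b] = gen[b] / inflow
        for (u, v), f in flows.items():
            if v == b:
                # inflow from u carries u's source mix, proportionally
                for g, frac in share[u].items():
                    mix[g] = mix.get(g, 0.0) + frac * f / inflow
        share[b] = mix
    return share

gen = {"A": 100.0, "B": 50.0}
flows = {("A", "C"): 100.0, ("B", "C"): 50.0}
shares = trace_contributions(gen, flows, ["A", "B", "C"])
print(shares["C"])  # generator A supplies 2/3 of bus C's inflow, B supplies 1/3
```

The same bookkeeping, applied to a real load-flow solution, is what lets tracing methods allocate transmission usage, and hence prices, to individual sources.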
Abstract: Human-friendly interaction is the key function of a human-centered system. Over the years, much attention has been devoted to developing convenient interaction through intention recognition. Intention recognition processes multimodal inputs including speech, face images, and body gestures. In this paper, we suggest a novel approach to intention recognition using a graph representation called the Intention Graph. A concept of valid intention is proposed as the target of intention recognition. Our approach has two phases: a goal recognition phase and an intention recognition phase. In the goal recognition phase, we generate an action graph based on the observed actions, and then the candidate goals and their plans are recognized. In the intention recognition phase, the intention is recognized using the relevant goals and the user profile. We show that the algorithm has polynomial time complexity. The intention graph is applied to a simple briefcase domain to test our model.
Abstract: Humans perceive color in categories, which may be
identified using color names such as red, blue, etc. The categorization
is unique to each human being. However, despite the individual
differences, the categorization is shared among members of a society.
This allows communication among them, especially when using
color names. A sociable robot, to coexist with humans and become
part of human society, must also have this shared color
categorization, which can be achieved through learning. Much
work has been done to enable a computer, as the brain of a robot, to learn
color categorization. Most of it relies on modeling human color
perception and involves considerable mathematical complexity. In contrast, in this work
the computer learns color categorization through interaction with
humans. This work aims at developing the innate ability of the
computer to learn human-like color categorization. It focuses on
the representation of color categorization and how it is built and
developed without much mathematical complexity.
Abstract: Even though most researchers would agree that in
symbiotic relationships, like the one between parent and child,
influences become reciprocal over time, empirical evidence
supporting this claim is limited. The aim of the current study was to
develop and test a model describing the reciprocal influence between
characteristics of the parent-child relationship, such as closeness and
conflict, and the child's bullying and victimization experiences at
school. The study used data from the longitudinal Study of Early
Child-Care, conducted by the National Institute of Child Health and
Human Development. The participants were dyads of early
adolescents (5th and 6th graders during the two data collection waves)
and their mothers (N=1364). Supporting our hypothesis, the findings
suggested a reciprocal association between bullying and positive
parenting, although this association was only significant for boys.
Victimization and positive parenting were not significantly
interrelated.
Abstract: In this era of competitiveness, there is a growing need for supply chains to become competitive enough to handle pressures such as varying customer expectations, the demand for low-cost, high-quality products delivered in minimum time and, most importantly, cut-throat competition on a worldwide scale. In recent years, supply chain competitiveness has therefore been accepted as one of the most important concepts in the supply chain literature. Various researchers and practitioners have tried to identify and implement strategies that can bring competitiveness to supply chains, i.e. supply chain competitiveness. The purpose of this paper is to suggest selected strategies for supply chain competitiveness in the Indian manufacturing sector, using an integrated approach of literature review and exploratory interviews with eminent professionals from the supply chain area in various industries, academia and research. The aim of the paper is to highlight the important area of competitiveness in the supply chain and to offer recommendations to the industry and to managers in the manufacturing sector.
Abstract: Wireless sensor networks (WSNs) have gained
tremendous attention in recent years due to their numerous
applications. Given their limited energy resources, energy-efficient
operation of sensor nodes is a key issue in wireless sensor networks.
Cooperative caching which ensures sharing of data among various
nodes reduces the number of communications over the wireless
channels and thus enhances the overall lifetime of a wireless sensor
network. In this paper, we propose a cooperative caching scheme
called ZCS (Zone Cooperation at Sensors) for wireless sensor
networks. In ZCS scheme, one-hop neighbors of a sensor node form a
cooperative cache zone and share the cached data with each other.
Simulation experiments show that the ZCS caching scheme achieves
significant improvements in byte hit ratio and average query latency
in comparison with other caching strategies.
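The zone idea can be illustrated with a toy lookup sequence: local cache first, then the caches of the one-hop zone neighbours, and only then the remote data source. The class and method names below are illustrative, not from the paper:

```python
# Toy sketch of zone-based cooperative caching (the ZCS idea): a node
# checks its own cache, then its one-hop zone neighbours, and only then
# fetches from the data source. Names are hypothetical.

class SensorNode:
    def __init__(self, node_id):
        self.node_id = node_id
        self.cache = {}
        self.zone = []          # one-hop neighbours forming the cache zone

    def get(self, key, source):
        if key in self.cache:                 # local hit
            return self.cache[key], "local"
        for peer in self.zone:                # zone hit: cheap, one hop
            if key in peer.cache:
                return peer.cache[key], "zone"
        value = source[key]                   # remote fetch (expensive)
        self.cache[key] = value
        return value, "remote"

source = {"temp": 21.5}
a, b = SensorNode("a"), SensorNode("b")
a.zone, b.zone = [b], [a]
print(a.get("temp", source))   # first access goes to the source
print(b.get("temp", source))   # served from a's cache: a zone hit
```

Every zone hit replaces a multi-hop transmission to the data source, which is where the byte hit ratio and latency gains come from.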
Abstract: A study of the obtainable watermark data rate for information hiding algorithms is presented in this paper. As the perceptual entropy of wideband monophonic audio signals is in the range of four to five bits per sample, a significant amount of additional information can be inserted into the signal without causing any perceptual distortion. Experimental results showed that transform-domain watermark embedding considerably outperforms watermark embedding in the time domain, and that signal decompositions with a high transform coding gain, such as the wavelet transform, are the most suitable for high-data-rate information hiding. Keywords: Digital watermarking, information hiding, audio watermarking, watermark data rate.
Abstract: In this paper, we have proposed a novel FinFET with
extended body under the poly gate, which is called EB-FinFET, and
its characteristic is demonstrated by using three-dimensional (3-D)
numerical simulation. We have analyzed it and compared it with the
conventional FinFET. The dependence of drain-induced barrier
lowering (DIBL) and subthreshold swing (S.S.) on the extended body
height has also been investigated. According to the 3-D numerical
simulation, the proposed structure is mechanically firm and exhibits an acceptable
short channel effect (SCE), a reduced series resistance, an increased
on-state drain current (Ion) and a large normalized IDS. Furthermore,
the structure can also mitigate the corner effect and reduce the self-heating
effect thanks to the extended body. Our results show that the EB-FinFET
is excellent for nanoscale devices.
Abstract: 3G mobile networks are currently experiencing a data
traffic explosion due to the large increase in the number of smartphone
users. Unlike a traditional wired infrastructure, 3G mobile networks
have limited wireless resources and complex signaling procedures for
wireless resource management. Moreover, mobile network security
technologies for handling various kinds of abnormal and malicious traffic were not ready.
Malicious or potentially malicious traffic originating from
malware-infected smart devices can therefore cause serious problems in 3G
mobile networks, analogous to DoS and scanning attacks in wired networks.
This paper describes the DoS security threat in the 3G mobile network
and proposes a detection technique.
Abstract: Linear stochastic estimation and quadratic stochastic
estimation techniques were applied to estimate the entire velocity
flow-field of an open cavity with a length to depth ratio of 2. The
estimations were done through the use of instantaneous velocity
magnitude as estimators. These measurements were obtained by
Particle Image Velocimetry. The predicted flow was compared
against the original flow-field in terms of the Reynolds stresses and
turbulent kinetic energy. Quadratic stochastic estimation proved
superior to linear stochastic estimation in resolving the shear
layer flow. When the velocity fluctuations were scaled up in the
quadratic estimate, both the time-averaged quantities and the
instantaneous cavity flow could be predicted to a rather accurate extent.
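The two estimation techniques can be sketched on synthetic data: linear stochastic estimation fits û = A·e to minimize the mean-square error, while quadratic stochastic estimation adds a quadratic term, û = B₁·e + B₂·e². The data below are synthetic; the paper uses instantaneous PIV velocity magnitudes as estimators:

```python
# Sketch of linear vs. quadratic stochastic estimation (LSE/QSE) of a
# field quantity u from a single estimator signal e, on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
e = rng.normal(size=5000)                                  # estimator samples
u = 1.5 * e + 0.8 * e**2 + 0.1 * rng.normal(size=e.size)   # synthetic "field"

# LSE: u_hat = A e, with A chosen to minimise <(u - A e)^2>
A = np.mean(u * e) / np.mean(e * e)
u_lse = A * e

# QSE: u_hat = B1 e + B2 e^2, coefficients from a least-squares fit
X = np.column_stack([e, e**2])
B, *_ = np.linalg.lstsq(X, u, rcond=None)
u_qse = X @ B

err_lse = np.mean((u - u_lse) ** 2)
err_qse = np.mean((u - u_qse) ** 2)
print(err_lse, err_qse)   # the quadratic estimate leaves a smaller residual
```

Because the synthetic field contains a genuine quadratic dependence on the estimator, the linear estimate cannot capture it, mirroring why QSE resolves the nonlinear shear-layer dynamics better in the paper.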
Abstract: The aim of this study was to compare the
sensitometric properties of commonly used radiographic films
processed with chemical solutions in different workload hospitals.
The effect of different processing conditions on induced densities on
radiologic films was investigated. Two accessible double emulsions
Fuji and Kodak films were exposed with 11-step wedge and
processed with Champion and CPAC processing solutions. The
films were processed in centers with both high and low workloads.
Our findings show that the speed and contrast of the Kodak film-screen
in both workloads (high and low) are higher than those of the Fuji film-screen
for both processing solutions. However, there were significant
differences in film contrast for both workloads when the CPAC solution
was used (p=0.000 and 0.028). The results showed that the base-plus-fog
density for the Kodak film was lower than for the Fuji film. Generally, the Champion
processing solution produced higher speed and contrast for the investigated
films under different conditions, and there was a significant difference at the
95% confidence level between the two processing solutions
(p=0.01). The low base-plus-fog density of Kodak films provides more
visibility and accuracy, and the higher contrast allows the use of lower
exposure factors to obtain better quality in the resulting radiographs. In
this study we found an economic advantage in using the Champion
solution and Kodak film, which also lowers patient dose.
Thus, in a radiologic facility, any change in the film processor/processing
cycle or chemistry should be carefully investigated before
radiological procedures on patients are performed.
Abstract: The current study begins with an awareness that
today's media environment is characterized by technological
development and a new way of reading caused by the introduction of
the Internet. The researcher conducted a meta-analysis framed within
Technological Determinism to investigate the process of hypertext
reading, its differences from linear reading, and the effects such
differences can have on people's ways of mentally structuring their
world. The relationship between literacy and the comprehension
achieved by reading hypertexts is also investigated. The results show
that hypertexts are not always user friendly. People experience hyperlinks
as interruptions that distract their attention, generating comprehension
problems and disorientation. On the one hand, the jumping style of hypertext reading
generates interruptions that ultimately make people lose their
concentration. On the other hand, hypertexts fascinate people, who
would rather read a document in such a format even though the
outcome is often frustrating and affects their ability to elaborate on and
retain information.
Abstract: The backpropagation algorithm generally employs the quadratic error function. In fact, most problems that involve minimization employ the quadratic error function. With alternative error functions, the performance of the optimization scheme can be improved. The alternative error functions help in suppressing the ill effects of outliers and have shown good robustness to noise. In this paper we evaluate and compare the relative performance of complex-valued neural networks using different error functions. In a first simulation, for the complex XOR gate, it is observed that some error functions, such as the absolute and Cauchy error functions, can replace the quadratic error function. In a second simulation, it is observed that for some error functions the performance of the complex-valued neural network depends on the architecture of the network, whereas with a few other error functions the convergence speed is independent of the architecture of the neural network.
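The outlier-suppressing behaviour of the alternative error functions can be seen directly from their gradients with respect to the residual. A small real-valued sketch (the paper works with complex-valued networks; the constant c in the Cauchy function is illustrative):

```python
# Sketch comparing the quadratic error function with the two
# alternatives named in the abstract (absolute and Cauchy). The Cauchy
# function grows only logarithmically, so its gradient saturates for
# large residuals: this is what suppresses outliers during training.
import numpy as np

def quadratic(r):          # E = r^2 / 2,  dE/dr = r
    return r ** 2 / 2, r

def absolute(r):           # E = |r|,      dE/dr = sign(r)
    return np.abs(r), np.sign(r)

def cauchy(r, c=1.0):      # E = (c^2/2) log(1 + (r/c)^2)
    return (c**2 / 2) * np.log1p((r / c) ** 2), r / (1 + (r / c) ** 2)

r = np.array([0.1, 1.0, 10.0])       # small, moderate, outlier residual
for name, fn in [("quadratic", quadratic), ("absolute", absolute),
                 ("cauchy", cauchy)]:
    loss, grad = fn(r)
    print(name, grad)
# the quadratic gradient grows linearly with the residual, while the
# Cauchy gradient for the outlier (r = 10) is smaller than for r = 1
```

A single outlying training sample therefore dominates the weight update under the quadratic function but barely moves the weights under the Cauchy function.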
Abstract: A background estimation approach using a small-window
median filter is presented, on the basis of an analysis of IR point-target,
noise and clutter models. After simplifying the two-dimensional
filter, a simple method adopting a one-dimensional median filter is
illustrated for estimating the background according to the
characteristics of an IR scanning system. An adaptive threshold is used
to segment the background-cancelled image. Experimental results
show that the algorithm achieves good performance and satisfies the
real-time processing requirement for large images.
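The pipeline the abstract describes, median-filter background estimation, cancellation, then adaptive thresholding, can be sketched on a single synthetic scan line. The window size and threshold factor below are illustrative choices, not the paper's parameters:

```python
# Sketch of point-target detection on one IR scan line: a small 1-D
# median filter tracks the slowly varying background but rejects a
# point target, so subtracting the estimate cancels the background;
# an adaptive threshold (mean + k*sigma of the residual) segments
# the target. Window size and k are illustrative.
import numpy as np

def median_background(line, window=5):
    pad = window // 2
    padded = np.pad(line, pad, mode="edge")
    return np.array([np.median(padded[i:i + window])
                     for i in range(line.size)])

line = np.linspace(10, 20, 64) + 0.2 * np.sin(np.arange(64) / 5.0)
line[32] += 8.0                       # a bright point target

background = median_background(line)
residual = line - background          # background-cancelled line
thresh = residual.mean() + 4.0 * residual.std()
targets = np.flatnonzero(residual > thresh)
print(targets)                        # only the target sample survives
```

Because the median of a window is cheap and the filter is separable into 1-D passes, this style of estimation scales to the real-time, large-image requirement mentioned in the abstract.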
Abstract: The principal objective of this study is to be able to
extract niobium oxide from columbite-tantalite concentrate of Thayet
Kon Area in Nay Phi Taw. Niobium oxide is recovered from a columbite-tantalite
concentrate which contains 19.29 % Nb2O5. The recovery of niobium
oxide from the concentrate can be divided into three
main sections, namely digestion of the concentrate, recovery from
the leached solution, and precipitation and calcination. The
concentrate was digested with hydrofluoric acid and sulfuric acid. Among
the various parameters that affect digestion, acidity and time were studied. In
the recovery section, a solvent extraction process using methyl isobutyl
ketone was investigated. Ammonium hydroxide was used as the
precipitating agent and the precipitate was later calcined. The
percentage of niobium oxide obtained is 74%.
Abstract: Nowadays, computer worms, viruses and Trojan horses
have become widespread; collectively they are called malware. A decade ago,
malware merely spoiled computers by deleting or rewriting important
files. However, recent malware seems to be designed to earn
money. Some malware collects personal information so
that malicious people can obtain secrets such as passwords for
online banking, evidence of a scandal, or contact addresses related
to a target. Moreover, the relation between money and malware
has become more complex: many kinds of malware spawn bots to serve as
springboards. Meanwhile, for ordinary Internet users,
countermeasures against malware have come up against a blank wall.
Pattern matching wastes too many computer resources,
since matching tools have to deal with a large number of patterns derived from
subspecies, and virus-making tools can automatically generate subspecies of
malware. Moreover, metamorphic and polymorphic malware are no
longer exceptional. Recently, malware-checking sites have appeared that
check contents in place of users' PCs. However, a new
type of malicious site has appeared that evades checks by such malware-checking sites. In
this paper, existing protocols and methods related to the web are
reconsidered in terms of protection against current attacks, and a new
protocol and method are proposed for the security of the
web.
Abstract: Independent spanning trees (ISTs) provide a number of advantages in data broadcasting; examples include their use in fault-tolerant network protocols for distributed computing and in increasing bandwidth. However, the problem of constructing multiple ISTs is considered hard for arbitrary graphs. In this paper we present an efficient algorithm that constructs ISTs on hypercubes with minimum resources.
Abstract: Chaiyaphum Starch Co. Ltd. is one of many starch
manufacturers that has introduced machinery to aid in manufacturing.
Even though machinery has replaced many elements and is now a
significant part in manufacturing processes, problems that must be
solved with respect to current process flow to increase efficiency still
exist. The paper's aim is to increase productivity while maintaining
the desired starch quality by redesigning the flipping machine's
mechanical control system, which has a grossly low functional lifetime.
The problems stem from the mechanical control system's bearings,
as fluids and humidity can enter the bearings directly, in
tandem with vibrations from the machine's own operation. The wheel
used to sense starch thickness occasionally falls from its
shaft due to high-speed rotation during operation, while the shaft
may bend from impact when processing dried bread. Redesigning the
mechanical control system has increased its efficiency, allowing
quality thickness measurement while extending the functional lifetime
by an additional 62 days.
Abstract: The availability of water in adequate quantity and
quality is imperative for sustainable development. Worldwide,
significant imbalance exists with regards to sustainable development
particularly from a water and sanitation perspective. Water is a
critical component of public health, and failure to supply safe water
will place a heavy burden on the entire population. Although the 21st
century has witnessed wealth and advanced development, it has not
been realized everywhere. Billions of people are still striving to
access the most basic human needs which are food, shelter, safe
drinking water and adequate sanitation. The global picture conceals
various inequalities particularly with regards to sanitation coverage in
rural and urban areas. Currently, water scarcity, and in particular
water governance, is the main challenge threatening the
sustainable development goals. Within the context of water,
sanitation and health, sustainable development is a confusing concept
primarily when examined from the viewpoint of policy options for
developing countries. This perspective paper aims to summarize and
critically evaluate evidence of published studies in relation to water,
sanitation and health and to identify relevant solutions to reduce
public health impacts. Evidently, improving water and sanitation
services will result in significant and lasting gains in health and
economic development.
Abstract: This paper reports an analysis of the outdoor air pollution of the urban centre of the city of Messina. The variations in the concentrations of the most critical pollutants (PM10, O3, CO, C6H6) and their trends with respect to climatic parameters and vehicular traffic have been studied. Linear regressions were performed to represent the relations among the pollutants; the differences between pollutant concentrations on weekends and weekdays were also analyzed. In order to evaluate air pollution and its effects on human health, a method for calculating a pollution index was implemented and applied in the urban centre of the city. This index is based on the weighted mean of the concentrations of the most detrimental air pollutants relative to their limit values for the protection of human health. The analyzed data on the polluting substances were collected by the Assessorship of the Environment of the Regional Province of Messina in the year 2004. A statistical analysis of the air quality index trends is also reported.
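An index of this kind, a weighted mean of concentrations normalized by their limit values, can be sketched as follows. The concentrations, limit values and (uniform) weights below are illustrative, not the Messina data or the paper's exact weighting scheme:

```python
# Sketch of a pollution index built as a weighted mean of pollutant
# concentrations normalised by their limit values for the protection
# of human health. All numbers below are illustrative.

def pollution_index(concentrations, limits, weights):
    """Each pollutant contributes (C / C_limit), weighted and averaged;
    an index above 1 means limits are exceeded on average."""
    total_w = sum(weights.values())
    return sum(weights[p] * concentrations[p] / limits[p]
               for p in concentrations) / total_w

conc    = {"PM10": 45.0, "O3": 90.0, "CO": 4.0, "C6H6": 3.0}
limits  = {"PM10": 50.0, "O3": 120.0, "CO": 10.0, "C6H6": 5.0}
weights = {"PM10": 1.0, "O3": 1.0, "CO": 1.0, "C6H6": 1.0}

idx = pollution_index(conc, limits, weights)
print(idx)   # below 1: no pollutant exceeds its limit on average here
```

Normalizing by the limit value makes pollutants measured in different units (µg/m³ for PM10, mg/m³ for CO) directly comparable before averaging.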