Abstract: Ferroresonance is a nonlinear electrical phenomenon that frequently occurs in power systems due to transmission line faults, single- or multi-phase switching on the lines, and the use of saturable transformers. In this study, the ferroresonance phenomenon is investigated using a model of the 380 kV West Anatolian Electric Power Network in Turkey. The ferroresonance event is observed as a result of removing the loads at the end of the lines. In this sense, two different cases are considered. First, the switching is applied at the 2nd second, and the ferroresonance effects are observed between the 2nd and 4th seconds in the voltage variations of phase R. The ferroresonance and non-ferroresonance parts of the overall data are then compared with each other using Fourier transform techniques to show the ferroresonance effects.
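Ferroresonant waveforms are distinguished by the sub- and higher-harmonic content that Fourier analysis exposes. As a minimal illustration of the comparison described above (using synthetic signals, not the West Anatolian network data), the sketch below contrasts the spectra of a clean 50 Hz phase voltage and a distorted one:

```python
import numpy as np

def dominant_harmonics(signal, fs, thresh=0.1):
    """Return the frequencies (Hz) whose amplitude exceeds the threshold."""
    amps = np.abs(np.fft.rfft(signal)) / (len(signal) / 2)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return [float(f) for f, a in zip(freqs, amps) if a > thresh]

fs = 10_000                      # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)

# Synthetic stand-ins for the recorded phase-R voltage: a clean 50 Hz wave,
# and one distorted with the sub- and higher-harmonic content that is
# characteristic of ferroresonance.
normal = np.sin(2 * np.pi * 50 * t)
ferro = (np.sin(2 * np.pi * 50 * t)
         + 0.6 * np.sin(2 * np.pi * 150 * t)    # 3rd harmonic
         + 0.4 * np.sin(2 * np.pi * 25 * t))    # subharmonic

print("normal segment:", dominant_harmonics(normal, fs))
print("ferro segment: ", dominant_harmonics(ferro, fs))
```

On real measurements the same comparison would be applied to the window between the 2nd and 4th seconds versus a pre-switching window.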
Abstract: Most of the losses in a power system arise in the distribution sector, which has therefore always received attention. One of the important factors that increases losses in a distribution system is the presence of reactive power flows. The most common way to compensate reactive power in the system is to use shunt (parallel) capacitors. Besides reducing losses, the advantages of capacitor placement include the release of network capacity at peak load and the improvement of the voltage profile. The key issues in capacitor placement are the optimal locations and sizes of the capacitors, chosen so as to maximize these advantages.
In this paper, a new technique based on the Genetic Algorithm (GA) is offered for the placement and sizing of fixed capacitors in a radial distribution network. The existing optimal methods for capacitor placement mostly aim to reduce the losses and improve the voltage profile simultaneously, but the capacitor cost and load changes have not been considered as influential terms in the objective function. In this article, a holistic approach is taken to find the optimal solution to this problem, one that includes all the parameters of the distribution network: cost, phase voltage, and load changes. Such a problem requires a vast search over all possible solutions, so in this article we use the Genetic Algorithm (GA) as a powerful method for optimal search.
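As an illustration of how a GA can search capacitor locations and sizes, here is a minimal sketch with a toy objective (a stand-in for the paper's loss/cost function; the bus data, candidate size set and cost coefficients are invented for the example):

```python
import random
random.seed(42)

SIZES = [0, 150, 300, 450, 600]      # candidate fixed-capacitor sizes, kvar
Q_LOAD = [220, 480, 130, 350]        # reactive load per bus, kvar (toy data)
KP, KC = 0.5, 0.05                   # penalty per residual kvar; cost per kvar installed

def cost(ch):
    """Toy objective: residual (un- or over-compensated) reactive flow
    plus the purchase cost of the installed capacitors."""
    residual = sum(abs(q - c) for q, c in zip(Q_LOAD, ch))
    return KP * residual + KC * sum(ch)

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(ch, rate=0.2):
    return [random.choice(SIZES) if random.random() < rate else g for g in ch]

def ga(pop_size=30, generations=60):
    pop = [[random.choice(SIZES) for _ in Q_LOAD] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        elite = pop[: pop_size // 2]                 # keep the better half
        pop = elite + [mutate(crossover(random.choice(elite),
                                        random.choice(elite)))
                       for _ in range(pop_size - len(elite))]
    return min(pop, key=cost)

best = ga()
print("sizes per bus:", best, "-> cost:", round(cost(best), 1))
```

A real implementation would replace `cost` with a load-flow evaluation of losses, voltage profile and capacitor cost under the changing load.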
Abstract: Wireless sensor networks (WSNs) have gained tremendous attention in recent years because of their numerous applications. Given the limited energy resources, energy-efficient operation of sensor nodes is a key issue in wireless sensor networks.
Cooperative caching which ensures sharing of data among various
nodes reduces the number of communications over the wireless
channels and thus enhances the overall lifetime of a wireless sensor
network. In this paper, we propose a cooperative caching scheme
called ZCS (Zone Cooperation at Sensors) for wireless sensor
networks. In ZCS scheme, one-hop neighbors of a sensor node form a
cooperative cache zone and share the cached data with each other.
Simulation experiments show that the ZCS caching scheme achieves
significant improvements in byte hit ratio and average query latency
in comparison with other caching strategies.
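The zone-cooperation idea can be sketched as a three-level lookup: local cache, then one-hop neighbors, then the remote source. The class below is a hypothetical illustration of that lookup order, not the paper's ZCS implementation:

```python
class SensorNode:
    """Minimal sketch of zone cooperation: a node answers a query from its
    own cache, then from its one-hop neighbors' caches, and only then from
    the remote data source (the expensive wireless path)."""

    def __init__(self, node_id, cache_size=3):
        self.node_id = node_id
        self.cache = {}           # data_id -> value
        self.cache_size = cache_size
        self.neighbors = []       # one-hop zone members

    def put(self, data_id, value):
        if len(self.cache) >= self.cache_size:
            self.cache.pop(next(iter(self.cache)))   # naive FIFO eviction
        self.cache[data_id] = value

    def get(self, data_id, source):
        if data_id in self.cache:                    # local hit
            return self.cache[data_id], "local"
        for n in self.neighbors:                     # zone hit
            if data_id in n.cache:
                return n.cache[data_id], "zone"
        value = source[data_id]                      # miss: fetch remotely
        self.put(data_id, value)
        return value, "remote"

source = {"d1": 10, "d2": 20, "d3": 30}
a, b = SensorNode("a"), SensorNode("b")
a.neighbors, b.neighbors = [b], [a]
print(a.get("d1", source))   # remote fetch, then cached at a
print(b.get("d1", source))   # served from zone neighbor a
```

The zone hit is what saves energy: it replaces a multi-hop fetch from the source with a single-hop exchange.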
Abstract: 3G mobile networks have recently experienced a data traffic explosion due to the large increase in the number of smartphone users. Unlike a traditional wired infrastructure, 3G mobile networks have limited wireless resources and complex signaling procedures for wireless resource management. Moreover, mobile network security technologies for handling various kinds of abnormal and malicious traffic are not yet mature. Malicious or potentially malicious traffic originating from malware-infected smart devices can therefore cause serious problems in 3G mobile networks, analogous to DoS and scanning attacks in wired networks. This paper describes the DoS security threat in the 3G mobile network and proposes a detection technology.
Abstract: In this paper we propose a new traffic simulation
package, TDMSim, which supports both macroscopic and
microscopic simulation on free-flowing and regulated traffic systems.
Both simulators are based on travel demands, which specify the
numbers of vehicles departing from origins to arrive at different
destinations. The microscopic simulator implements a car-following
model given the pre-defined routes of the vehicles, and also
supports the rerouting of vehicles. We also propose a macroscopic
simulator which is built in integration with the microscopic simulator
to allow the simulation to be scaled for larger networks without
sacrificing the precision achievable through the microscopic
simulator. The macroscopic simulator also enables the reuse of
previous simulation results when simulating traffic on the same
networks at a later time. Validations have been conducted to show the
correctness of both simulators.
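The abstract does not specify which car-following model TDMSim implements; as a generic illustration of microscopic simulation, the sketch below steps two vehicles along one lane with the well-known Intelligent Driver Model (all parameter values are illustrative):

```python
import math

# Illustrative IDM parameters: desired speed, headway time, max acceleration,
# comfortable braking, and minimum standstill gap.
V0, T, A, B, S0 = 30.0, 1.5, 1.0, 2.0, 2.0   # m/s, s, m/s^2, m/s^2, m

def idm_accel(gap, v, v_lead):
    """Follower acceleration from its gap to the leader and both speeds."""
    dv = v - v_lead
    s_star = S0 + v * T + v * dv / (2 * math.sqrt(A * B))
    return A * (1 - (v / V0) ** 4 - (s_star / gap) ** 2)

def step(positions, speeds, dt=0.5):
    """Advance one time step on a single lane; the lead car keeps its speed."""
    accels = [0.0] + [idm_accel(positions[i - 1] - positions[i], speeds[i],
                                speeds[i - 1]) for i in range(1, len(positions))]
    speeds = [max(0.0, v + a * dt) for v, a in zip(speeds, accels)]
    positions = [x + v * dt for x, v in zip(positions, speeds)]
    return positions, speeds

pos, vel = [100.0, 50.0], [20.0, 25.0]       # a faster follower closes in
for _ in range(60):
    pos, vel = step(pos, vel)
print("gap:", round(pos[0] - pos[1], 1), "m; follower:", round(vel[1], 1), "m/s")
```

The follower brakes as the gap shrinks and settles near the leader's speed; a macroscopic simulator instead aggregates such per-vehicle dynamics into flow and density variables.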
Abstract: The aim of this study was to compare the
sensitometric properties of commonly used radiographic films
processed with chemical solutions in different workload hospitals.
The effect of different processing conditions on the densities induced on radiographic films was investigated. Two readily available double-emulsion films, Fuji and Kodak, were exposed with an 11-step wedge and processed with Champion and CPAC processing solutions. The films were obtained from both high- and low-workload centers. Our findings show that the speed and contrast of the Kodak film-screen system are higher than those of the Fuji film-screen system for both workloads (high and low) and both processing solutions. However, there were significant differences in film contrast for both workloads when the CPAC solution was used (p = 0.000 and 0.028). The results showed that the base-plus-fog density of the Kodak film was lower than that of the Fuji film. In general, the Champion processing solution produced higher speed and contrast for the investigated films under the different conditions, and there was a significant difference at the 95% confidence level between the two processing solutions (p = 0.01). The low base-plus-fog density of the Kodak films provides better visibility and accuracy, and their higher contrast allows lower exposure factors to be used to obtain better quality radiographs. In this study we found an economic advantage in using the Champion solution and Kodak film, which also lowers the patient dose. Thus, in a radiologic facility, any change in the film processor, processing cycle or chemistry should be carefully investigated before radiological procedures of patients are acquired.
Abstract: The I/O workload is a critical factor in analyzing I/O patterns and maximizing file system performance. However, measuring the I/O workload of a running distributed parallel file system is non-trivial due to the collection overhead and the large volume of data. In this paper, we measured and analyzed file system activities on two large-scale cluster systems with TFlops-level high-performance computation resources. By comparing the file system activities of 2009 with those of 2006, we analyzed how I/O workloads have changed with the development of system performance and high-speed network technology.
Abstract: The business scenario is an important technique that may be used at various stages of enterprise architecture development to derive its characteristics from the high-level requirements of the business. In wireless deployments, business scenarios are used to help identify and understand business needs involving wireless services, and thereby to derive the business requirements that the architecture development has to address, taking into account various wireless challenges. This study assesses the deployment of Wireless Local Area Network (WLAN) and Broadband Wireless Access (BWA) solutions for several business scenarios in the Asia Pacific region. The paper gives an overview of the business and technology environments and discusses examples of existing or suggested wireless solutions adopted, or to be adopted, in the Asia Pacific region. The interactions of several players, enabling technologies, and key processes in the wireless environments are studied. The analysis and discussion associated with this study are divided into two areas, healthcare and education, where the merits of wireless solutions in improving quality of life are highlighted.
Abstract: The backpropagation algorithm generally employs the quadratic error function; in fact, most problems that involve minimization employ it. With alternative error functions, the performance of the optimization scheme can be improved: the new error functions help to suppress the ill effects of outliers and have shown good robustness to noise. In this paper we evaluate and compare the relative performance of complex-valued neural networks using different error functions. In the first simulation, for a complex XOR gate, it is observed that error functions such as the absolute error and the Cauchy error function can replace the quadratic error function. In the second simulation, it is observed that for some error functions the performance of the complex-valued neural network depends on the architecture of the network, whereas for a few other error functions the convergence speed is independent of the architecture.
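The outlier-suppression argument can be made concrete by evaluating the three error functions on complex-valued residuals. The sketch below (with invented residual data, using common textbook forms of the losses) shows that the absolute and Cauchy errors grow far more slowly than the quadratic error when an outlier is present:

```python
import numpy as np

def quadratic(e):
    """Standard quadratic (sum-of-squares) error on complex residuals."""
    return float(np.sum(np.abs(e) ** 2) / 2)

def absolute(e):
    """Absolute error: grows linearly, so outliers are penalized less."""
    return float(np.sum(np.abs(e)))

def cauchy(e, c=1.0):
    """Cauchy error: grows only logarithmically for large residuals."""
    return float(np.sum(c ** 2 / 2 * np.log(1 + (np.abs(e) / c) ** 2)))

# A small residual and a clear outlier, as complex-valued errors.
clean = np.array([0.1 + 0.1j, -0.2 + 0.0j])
outlier = np.array([0.1 + 0.1j, 5.0 - 5.0j])

for name, f in [("quadratic", quadratic), ("absolute", absolute),
                ("cauchy", cauchy)]:
    print(f"{name:9s} clean={f(clean):7.3f} outlier={f(outlier):7.3f}")
```

Because a gradient step is proportional to the derivative of the loss, the flatter growth of the absolute and Cauchy errors directly limits how much a single outlying sample can distort training.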
Abstract: In this paper, the optimum weight and cost of a laminated composite plate are sought while it carries the heaviest load prior to complete failure. Various failure criteria are defined for such structures in the literature; in this work, the Tsai-Hill theory is used as the failure criterion. The analysis is based on the Classical Lamination Theory (CLT). A new type of Genetic Algorithm (GA), working directly with real variables, was employed as the optimization technique. However, since optimization via GAs is a long process and most of the time is consumed by the analysis, Radial Basis Function Neural Networks (RBFNN) were employed to predict the output of the analysis. Thus, the optimization is carried out in a hybrid neuro-GA environment, and the procedure continues until a predicted optimum solution is achieved.
Abstract: Spatial and mobile computing continue to evolve. This paper describes a smart modeling platform called "GeoSEMA". The approach aims to model multidimensional GeoSpatial Evolutionary and Mobile Agents. Beyond 3D and location-based issues, there are other dimensions that may characterize spatial agents, e.g. discrete-continuous time and agent behaviors. GeoSEMA is conceived as a dedicated design pattern motivating temporal geographic-based applications; it is a firm foundation for multipurpose and multidimensional spatial-based applications. It deals with multipurpose smart objects (buildings, shapes, missiles, etc.) by simulating geospatial agents.
Formally, GeoSEMA refers to geospatial, spatio-evolutive and mobile space constituents, for which a conceptual geospatial space model is given in this paper. In addition to modeling and categorizing geospatial agents, the model incorporates the concept of inter-agent event-based protocols. Finally, a rapid software-architecture prototype of the GeoSEMA platform is also given; it will be implemented and validated in the next phase of our work.
Abstract: Nowadays, computer worms, viruses and Trojan horses, collectively called malware, have become widespread. A decade ago, malware merely spoiled computers by deleting or rewriting important files; recent malware, however, seems to be built to earn money. Some malware collects personal information so that malicious people can find secrets such as online banking passwords, evidence of a scandal, or contact addresses related to a target. Moreover, the relation between money and malware has become more complex: many kinds of malware spawn bots to obtain springboards. Meanwhile, for ordinary Internet users, countermeasures against malware have come up against a wall. Pattern matching wastes too many computer resources, since matching tools have to deal with the many patterns derived from subspecies, and virus-making tools can generate subspecies of malware automatically. Moreover, metamorphic and polymorphic malware are no longer special. Recently, malware-checking sites have appeared that check content in place of users' PCs; however, a new type of malicious site has appeared that evades checking by these sites. In this paper, existing protocols and methods related to the web are reconsidered in terms of protection from current attacks, and a new protocol and method are proposed for the purpose of web security.
Abstract: The feasibility of applying a simple and cost-effective sliding friction testing apparatus to study the friction behaviour of a clutch facing material, as affected by variations in temperature and contact pressure, was investigated. It was found that the method used in this work gives a convenient and cost-effective measurement of the friction coefficients of a clutch facing material and their transitions. The results obtained will be useful in the development of new facing materials.
Abstract: Independent spanning trees (ISTs) provide a number of advantages in data broadcasting; one can cite their use in fault-tolerant network protocols for distributed computing and in bandwidth exploitation. However, the problem of constructing multiple ISTs is considered hard for arbitrary graphs. In this paper we present an efficient algorithm to construct ISTs on hypercubes that requires minimum resources.
Abstract: This paper presents the design, analysis and
development of permanent magnet (PM) torque couplers. These
couplers employ rare-earth magnets. Based on finite element analysis and earlier analytical work, both concentric and face-type synchronous couplers have been designed and fabricated. The experimental performance correlates well with the finite element calculations.
Abstract: A mathematical framework for the study of fuzzy approximate reasoning is presented in this paper. Two important defuzzification methods (area defuzzification and height defuzzification) are described, besides the center-of-gravity method, which is the best-known defuzzification method. The continuity of the defuzzification methods and their application to fuzzy feedback control are discussed.
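The three methods can be sketched on a discretized output universe. The definitions below follow common textbook forms and may differ in detail from the paper's own definitions:

```python
def center_of_gravity(xs, mu):
    """Center-of-gravity defuzzification: membership-weighted mean of the
    discretized output universe."""
    return sum(x * m for x, m in zip(xs, mu)) / sum(mu)

def height_defuzzify(peaks, heights):
    """Height defuzzification: average of each output set's peak point,
    weighted by the rule's firing strength."""
    return sum(p * h for p, h in zip(peaks, heights)) / sum(heights)

def area_defuzzify(xs, mu):
    """Area defuzzification (sketch): the point that splits the membership
    area into two equal halves, on a discretized universe."""
    total = sum(mu)
    acc = 0.0
    for x, m in zip(xs, mu):
        acc += m
        if acc >= total / 2:
            return x

xs = [0, 1, 2, 3, 4]
mu = [0.0, 0.5, 1.0, 0.5, 0.0]             # a symmetric triangular output set
print(center_of_gravity(xs, mu))            # 2.0 by symmetry
print(height_defuzzify([1.0, 3.0], [0.2, 0.6]))
print(area_defuzzify(xs, mu))
```

Continuity matters for feedback control because a defuzzifier that jumps discontinuously as memberships vary can inject discontinuities into the control signal.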
Abstract: Stilling basins are commonly used to dissipate energy and protect the downstream floor from erosion. The aim of the present experimental work is to improve the roughened stilling basin by using T-shaped roughness elements instead of the regular cubic ones and to design this new shape. As a result of the present work, the best intensity and the best roughness length are identified. It is also found that the T-shaped roughness saves material and reduces the jump length compared to the cubic one. A sensitivity analysis was performed, and it was noticed that the change in jump length is more sensitive to changes in roughness length than to changes in intensity.
Abstract: The extraction of laccase produced by L. polychrous in an aqueous two-phase system, composed of polyethylene glycol and phosphate salt at pH 7.0 and 25 °C, was investigated. The effects of PEG molecular weight, PEG concentration and phosphate concentration were determined. Laccase preferentially partitioned to the top phase, and good extraction to the top phase was observed with PEG 4000. The optimum system contained 12% w/w PEG 4000 and 16% w/w phosphate salt, giving a KE of 88.3, a purification factor of 3.0-fold and a 99.1% yield. Some properties of the enzyme, such as thermal stability, the effect of heavy metal ions and the kinetic constants, are also presented in this work. The thermal stability decreased sharply at temperatures above 60 °C. The enzyme was inhibited by Cd2+, Pb2+, Zn2+ and Cu2+. The Vmax and Km values of the enzyme were 74.70 μmol/min/ml and 9.066 mM, respectively.
Abstract: 4G wireless networks are characterized by the convergence of heterogeneous wireless access technologies. In such converged systems, seamless and efficient handoff between different access technologies (vertical handoff) is essential and remains a challenging problem. The coexistence of access technologies with largely different characteristics creates the decision problem of determining the "best" available network at the "best" time so as to reduce unnecessary handoffs. This paper proposes a dynamic decision model that makes the right vertical handoff decisions by determining the "best" network at the "best" time among the available networks, based on dynamic factors such as the Received Signal Strength (RSS) of the network and the velocity of the mobile station, together with static factors such as usage expense, link capacity (offered bandwidth) and power consumption. This model not only meets individual user needs but also improves overall system performance by reducing unnecessary handoffs.
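A weighted-sum scoring rule is one common way to combine such factors; the sketch below is illustrative only (the normalizations, weights and candidate data are invented, not the paper's decision model):

```python
def network_score(rss_dbm, bandwidth_mbps, expense, power_mw,
                  velocity_kmh, short_range,
                  weights=(0.35, 0.25, 0.2, 0.1, 0.1)):
    """Toy weighted-sum score over the dynamic factors (RSS, velocity) and
    static factors (expense, bandwidth, power) named in the abstract.
    All normalizations and weights are invented for illustration."""
    w_rss, w_bw, w_cost, w_pow, w_vel = weights
    rss = max(0.0, min(1.0, (rss_dbm + 100) / 60))    # map -100..-40 dBm to 0..1
    bw = min(1.0, bandwidth_mbps / 100)
    cost = 1.0 - min(1.0, expense)                    # expense given on a 0..1 scale
    pwr = 1.0 - min(1.0, power_mw / 1000)             # frugal radios score higher
    # A fast-moving station would soon leave a short-range cell, so velocity
    # only penalizes short-range candidates (e.g. WLAN hotspots).
    vel = 1.0 - min(1.0, velocity_kmh / 120) if short_range else 1.0
    return w_rss * rss + w_bw * bw + w_cost * cost + w_pow * pwr + w_vel * vel

def best_network(candidates):
    """Pick the highest-scoring (name, factors) candidate."""
    return max(candidates, key=lambda c: network_score(**c[1]))[0]

candidates = [
    ("WLAN", dict(rss_dbm=-60, bandwidth_mbps=54, expense=0.1,
                  power_mw=800, velocity_kmh=5, short_range=True)),
    ("3G",   dict(rss_dbm=-75, bandwidth_mbps=2, expense=0.6,
                  power_mw=400, velocity_kmh=5, short_range=False)),
]
print("best network:", best_network(candidates))
```

Re-evaluating the scores only when they differ by more than a hysteresis margin is a typical way such models suppress unnecessary back-and-forth handoffs.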
Abstract: The vast amount of information on the World Wide
Web is created and published by many different types of providers.
Unlike books and journals, most of this information is not subject to
editing or peer review by experts. This lack of quality control and the
explosion of web sites make the task of finding quality information
on the web especially critical. Meanwhile, new facilities for producing web pages, such as Blogs, make this issue more significant, because Blogs have simple content management tools that enable non-experts to build easily updatable web diaries or online journals. On the other hand, despite a decade of active research in information quality (IQ), there is still no framework for measuring the information quality of Blogs. This paper presents a novel experimental framework for ranking the quality of information on Weblogs. The results of the data analysis revealed seven IQ dimensions for the Weblog. For each dimension, variables and related coefficients were calculated so that the presented framework is able to assess the IQ of Weblogs automatically.