Abstract: In this paper, based on steady-state models of Flexible
AC Transmission System (FACTS) devices, the sizing of static
synchronous series compensator (SSSC) controllers in a transmission
network is formulated as an optimization problem. The objective of this
problem is to reduce the transmission losses in the network. The
optimization problem is solved using the particle swarm optimization
(PSO) technique. The Newton-Raphson load flow algorithm is
modified to consider the insertion of the SSSC devices in the
network. A numerical example, illustrating the effectiveness of the
proposed algorithm, is introduced. In addition, a novel model of a
3-phase voltage source converter (VSC) suitable for series-connected
FACTS controllers is introduced. The model is verified
by simulation using Power System Blockset (PSB) and Simulink
software.
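The PSO sizing procedure described above can be sketched in miniature. The following snippet is a hedged illustration, not the paper's implementation: the quadratic `losses` function standing in for network losses, the single scalar decision variable, and all swarm coefficients are assumptions for demonstration only.

```python
# Minimal particle swarm optimization sketch. The objective below is a
# hypothetical stand-in for transmission losses as a function of an
# SSSC setting x; it is NOT the paper's load-flow-based objective.
import random

random.seed(0)

def losses(x):
    # Assumed toy objective with a minimum at x = 0.25.
    return (x - 0.25) ** 2 + 1.0

def pso_minimize(n_particles=20, iters=100, lo=-1.0, hi=1.0,
                 w=0.7, c1=1.5, c2=1.5):
    pos = [random.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                        # personal best positions
    gbest = min(pos, key=losses)          # global best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            vel[i] = (w * vel[i]
                      + c1 * r1 * (pbest[i] - pos[i])
                      + c2 * r2 * (gbest - pos[i]))
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))
            if losses(pos[i]) < losses(pbest[i]):
                pbest[i] = pos[i]
            if losses(pos[i]) < losses(gbest):
                gbest = pos[i]
    return gbest

print(round(pso_minimize(), 2))
```

In the paper's setting, evaluating the objective would require a full SSSC-modified Newton-Raphson load flow per particle rather than a closed-form function.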
Abstract: This survey of recent literature examines the link between growth and poverty. It is widely accepted that economic growth is a necessary condition for sustainable poverty reduction. It is a fact, however, that economic growth has been pro-poor in some countries but not in others. Factors such as labor market conditions, policies, and demographics may lead to a weak relationship between economic performance and the poverty rate. In this sense, pro-growth policies should also be pro-poor to increase the poverty-alleviating effects of growth. The purpose of this study is to review recent studies on the effects of macroeconomic policies on poverty and inequality, and to review the poverty analyses that examine the relationship between growth, poverty, and inequality. This study also provides some facts about the relationship between economic growth, inequality, and poverty from Turkey. Keywords: economic growth, inequality, macroeconomic policy, poverty
Abstract: The advent of modern technology makes once-successful legacy systems obsolete with time. These systems have left large organizations with major problems in terms of new business requirements, response time, financial depreciation, and maintenance. A major difficulty is the constant system evolution and the incomplete, inconsistent, and obsolete documentation which a legacy system tends to have. The myriad dimensions of these systems can only be explored through reverse engineering, which, in this context, is the best method to extract useful artifacts and to exploit those artifacts for reengineering existing legacy systems to meet the new requirements of organizations. A case study is conducted on six different types of software systems, with source code in different programming languages, using the architectural recovery framework.
Abstract: Electricity market activities and a growing demand for electricity have led to heavily stressed power systems, requiring operation of the networks closer to their stability limits. Power system operation is affected by stability-related problems, leading to unpredictable system behavior. Voltage stability refers to the ability of a power system to sustain appropriate voltage levels through large and small disturbances. Steady-state voltage stability is concerned with limits on the existence of steady-state operating points for the network. FACTS devices can be utilized to increase the transmission capacity, the stability margin, and the dynamic behavior, or serve to ensure improved power quality. Their main capabilities are reactive power compensation, voltage control, and power flow control. Among the FACTS controllers, the Static Var Compensator (SVC) provides fast-acting dynamic reactive compensation for voltage support during contingency events. In this paper, voltage stability assessment with appropriate representations of tap-changing transformers and the SVC is investigated; integrating both of these devices is the main topic of the paper. The effect of the presence of tap-changing transformers on the SVC controller parameters and the ratings necessary to stabilize load voltages at certain values is highlighted. The interrelation between transformer off-nominal tap ratios and the SVC controller gains, droop slopes, and rating is established. P-V curves are constructed to calculate loadability margins.
Abstract: In this paper a novel method for finding the fault zone
on a transmission line incorporating a Thyristor Controlled Series
Capacitor (TCSC) is presented. The method makes use of the Support
Vector Machine (SVM), used in the classification mode to
distinguish between the zones before and after the TCSC. The
Discrete Wavelet Transform is used to extract the features given as
input to the SVM. The method was tested on a 400 kV, 50 Hz, 300 km
transmission line, and the results were highly accurate.
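The DWT-feature/SVM pipeline can be sketched as follows. This is a toy illustration, not the paper's method: the synthetic fault signals, the Haar wavelet decomposition, the assumed 5 kHz sampling rate, and the zone-dependent transient frequencies are all assumptions made for demonstration.

```python
# Sketch of wavelet-energy features feeding an SVM zone classifier.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
SQRT2 = np.sqrt(2.0)

def haar_dwt_energies(x, levels=4):
    """Energy of each Haar DWT sub-band: a compact feature vector."""
    energies = []
    approx = x
    for _ in range(levels):
        detail = (approx[0::2] - approx[1::2]) / SQRT2
        approx = (approx[0::2] + approx[1::2]) / SQRT2
        energies.append(np.sum(detail ** 2))
    energies.append(np.sum(approx ** 2))
    return np.array(energies)

def synthetic_fault(zone, n=512, fs=5000.0):
    """Toy fault current: zone-1 faults carry a higher-frequency transient."""
    t = np.arange(n) / fs
    base = np.sin(2 * np.pi * 50 * t)
    freq = 900.0 if zone == 1 else 200.0
    transient = 0.5 * np.sin(2 * np.pi * freq * t) * np.exp(-30 * t)
    return base + transient + 0.05 * rng.standard_normal(n)

zones = [1, 2] * 40
X = np.array([haar_dwt_energies(synthetic_fault(z)) for z in zones])
y = np.array(zones)
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize features

clf = SVC(kernel="rbf", gamma="scale").fit(X[:60], y[:60])
accuracy = clf.score(X[60:], y[60:])
print(accuracy)
```

A real implementation would extract features from measured line currents at the relay location rather than from synthetic waveforms.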
Abstract: A considerable progress has been achieved in transient
stability analysis (TSA) with various FACTS controllers. However,
all these controllers are associated with a single transmission line.
This paper discusses a new approach: a multi-line FACTS controller,
the interline power flow controller (IPFC), for TSA of a
multi-machine power system network. A mathematical model of the
IPFC, termed the power injection model (PIM), is presented and
incorporated into the Newton-Raphson (NR) power flow
algorithm. Then, the reduced admittance matrix of a multi-machine
power system network for a three-phase fault, without and with IPFC,
is obtained, which is required to draw the machine swing curves. A
general approach based on the L-index has also been discussed to find
the best location of the IPFC to reduce the proximity to instability of a
power system. Numerical studies are carried out on two test systems
namely, 6-bus and 11-bus systems. A program in MATLAB has
been written to plot the variation of generator rotor angle and speed
difference curves without and with IPFC for TSA, and a simple
approach is presented to evaluate the critical clearing time for the
test systems. The results obtained without and with IPFC are compared
and discussed.
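The swing curves mentioned above come from integrating the machine swing equation. Below is a minimal sketch of the classical single-machine model M d²δ/dt² = Pm − Pmax·sin δ, written in Python rather than the paper's MATLAB; the inertia constant, power levels, and 100 ms clearing time are illustrative assumptions, not values from the paper.

```python
# Semi-implicit Euler integration of the classical swing equation to
# trace a rotor-angle swing curve after a three-phase fault.
import math

def swing_curve(pm=0.8, pmax=2.0, M=0.05, dt=1e-3, t_end=2.0):
    delta = math.asin(pm / pmax)   # start at the pre-fault equilibrium
    omega = 0.0                    # rotor speed deviation
    fault_cleared = 0.1            # fault cleared after 100 ms (assumed)
    trace = []
    t = 0.0
    while t < t_end:
        # during the fault the electrical output is assumed zero
        pe = 0.0 if t < fault_cleared else pmax * math.sin(delta)
        omega += dt * (pm - pe) / M
        delta += dt * omega
        trace.append((t, delta))
        t += dt
    return trace

curve = swing_curve()
print(max(d for _, d in curve))  # peak rotor angle in radians
```

Sweeping `fault_cleared` upward until the rotor angle diverges gives a simple numerical estimate of the critical clearing time, in the spirit of the approach the abstract describes.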
Abstract: Electroencephalogram (EEG) recordings are often
contaminated with ocular and muscle artifacts. In this paper, the
canonical correlation analysis (CCA) is used as a blind source
separation (BSS) technique (BSS-CCA) to decompose the
artifact-contaminated EEG into component signals. We combine the
BSS-CCA technique with a wavelet filtering approach to minimize both
ocular and muscle artifacts simultaneously, and refer to the proposed
method as wavelet-enhanced BSS-CCA. In this approach, after
careful visual inspection, the muscle artifact components are
discarded and ocular artifact components are subjected to wavelet
filtering to retain high frequency cerebral information, and then clean
EEG is reconstructed. The performance of the proposed wavelet
enhanced BSS-CCA method is tested on real EEG recordings
contaminated with ocular and muscle artifacts, for which power
spectral density is used as a quantitative measure. Our results suggest
that the proposed hybrid approach minimizes ocular and muscle
artifacts effectively, minimally affecting underlying cerebral activity
in EEG recordings.
Abstract: The work presented in this paper focuses on Knowledge Management services enabling CSCW (Computer Supported Cooperative Work) applications to provide appropriate adaptation to the user and to the situation in which the user is working. In this paper, we explain how a knowledge management system can be designed to support users in different situations by exploiting contextual data, users' preferences, and profiles of the involved artifacts (e.g., documents, multimedia files, mockups). The presented work is rooted in our experience in the MILK project and in early steps made in the MAIS project.
Abstract: This paper presents a genetic algorithm based
approach for solving security constrained optimal power flow
problem (SCOPF) including FACTS devices. The optimal locations of
FACTS devices are identified using an index called the overload index,
and the optimal values are obtained using an enhanced genetic
algorithm. The optimal allocation by the proposed method optimizes
the investment, taking into account its effects on security in terms of
the alleviation of line overloads. The proposed approach has been
tested on the IEEE 30-bus system to show the effectiveness of the
proposed algorithm for solving the SCOPF problem.
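The genetic-algorithm search described above can be sketched in a minimal form. This is a hedged illustration, not the paper's enhanced GA: individuals here encode a single real-valued FACTS setting, and the quadratic fitness is an assumed stand-in for the overload-based SCOPF objective.

```python
# Minimal real-coded genetic algorithm with truncation selection,
# arithmetic crossover, and Gaussian mutation.
import random

random.seed(0)

def fitness(x):
    # Hypothetical stand-in for the overload-index SCOPF objective.
    return (x - 0.3) ** 2

def ga_minimize(pop_size=30, generations=60, lo=-1.0, hi=1.0):
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            child = 0.5 * (a + b)             # arithmetic crossover
            child += random.gauss(0, 0.05)    # Gaussian mutation
            children.append(min(hi, max(lo, child)))
        pop = elite + children
    return min(pop, key=fitness)

best = ga_minimize()
print(round(best, 2))
```

In the paper's setting, each fitness evaluation would involve a security-constrained power flow on the IEEE 30-bus system rather than a closed-form function.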
Abstract: In this work, we present, to the best of our knowledge for
the first time, an efficient digital watermarking scheme for MPEG
audio layer 3 (MP3) files that operates directly in the compressed
data domain, while manipulating the time and subband/channel domain. In
addition, it does not need the original signal to detect the watermark.
Our scheme was implemented taking special care for the efficient
usage of the two limited resources of computer systems: time and
space. It offers the industrial user watermark embedding and
detection in time comparable to the playing time of the original
audio file, which depends on the MPEG compression, while the end
user/audience faces no artifacts or delays when hearing the
watermarked audio file. Furthermore, it overcomes the vulnerability
of algorithms operating in the PCM-data domain to
compression/recompression attacks, as it places the watermark in the
scale-factor domain rather than in the digitized audio data. The
strength of our scheme, which allows it
to be used successfully in both authentication and copyright
protection, relies on the fact that users establish ownership of the
audio file not simply by detecting the bit pattern that comprises the
watermark itself, but by showing that the legal owner knows a
hard-to-compute property of the watermark.
Abstract: Much has been written about the difficulties students
have with producing traditional dissertations. This includes both
native English speakers (L1) and students with English as a second
language (L2). The main emphasis of these papers has been on the
structure of the dissertation, but in all cases, even when electronic
versions are discussed, the dissertation is still in what most would
regard as a traditional written form.
Master of Science degrees in computing disciplines require
students to gain technical proficiency and apply their knowledge to a
range of scenarios. The basis of this paper is that if a dissertation is a
means of showing that such a student has met the criteria for a pass,
which should be based on the learning outcomes of the dissertation
module, does meeting those outcomes require a student to
demonstrate their skills in a solely text based form, particularly in a
highly technical research project? Could it be possible for a student
to produce a series of related artifacts which form a cohesive
package that meets the learning outcomes of the dissertation?
Abstract: This paper aims to select the optimal location and
setting parameters of TCSC (Thyristor Controlled Series
Compensator) controller using Particle Swarm Optimization (PSO)
and Genetic Algorithm (GA) to mitigate small signal oscillations in a
multimachine power system. Though Power System Stabilizers (PSSs)
are the prime choice for this issue, installation of a FACTS device
is suggested here in order to achieve appreciable damping of system
oscillations. However, the performance of any FACTS device depends
highly upon its parameters and its location in the power network. In
this paper, PSO- and GA-based techniques are used separately and
their performances compared to investigate this
problem. The results of small signal stability analysis have been
represented employing eigenvalue analysis as well as time-domain
response in the face of two common power system disturbances, e.g.,
varying load and transmission line outage. It is revealed that the
PSO-based TCSC controller is more effective than the GA-based
controller, even during critical loading conditions.
Abstract: To distinguish small retinal hemorrhages in early
diabetic retinopathy from dust artifacts, we analyzed the hue,
lightness, and saturation (HLS) color space. The fundus of 5 patients with
diabetic retinopathy was photographed. For the initial experiment, we
placed 4 different colored papers on the ceiling of a darkroom. Using
each color, 10 fragments of house dust particles on a magnifier were
photographed. The colored papers were removed, and 3 different
colored light bulbs were suspended from the ceiling. Ten fragments of
house dust particles on the camera's objective lens were photographed.
We then constructed an experimental device that can photograph
artificial eyes. Five fragments of house dust particles under the ocher
fundus of the artificial eye were photographed. On analyzing HLS
color space of the dust artifact, lightness and saturation were found to
be highly sensitive. However, hue was not highly sensitive.
Abstract: Measurement and the subsequent evaluation of performance
represent an important part of management. The paper
focuses on indicators as the basic elements of performance
measurement system. It emphasizes the necessity of identifying
requirements for quality indicators so that they can become part of a
useful system. It introduces standpoints for a systematic
classification of indicators so that they offer the highest possible
informative value as background sources for the search, analysis,
design, and use of
indicators. It draws attention to requirements for indicators' quality
and at the same time deals with some dangers decreasing an
indicator's informative value. It submits a draft of questions that
should be answered during the construction of an indicator. Particular
indicators need to be defined exactly to stimulate the desired
behavior in order to attain expected results. In the appendix, a
concrete example of an indicator defined under the conditions of a
small firm is given. The authors draw attention to the
fact that a quality indicator makes it possible to get to the basic
causes of the problem and include the established facts into the
company information system. At the same time, they emphasize that
developing a quality indicator is a prerequisite for utilizing the
measurement system in management.
Abstract: As computer network technology becomes
increasingly complex, it becomes necessary to place greater
requirements on the validity of developing standards and the
resulting technology. Communication networks are based on a large
number of protocols. The validity of these protocols has to be
proved either individually or in an integral fashion. One strategy
for achieving this is to apply the growing field of formal methods.
Formal methods research defines systems in higher-order logic so
that automated reasoning can be applied for verification. In this
research we represent and implement a previously proposed multicast
protocol in the Prolog language so that certain properties of the
protocol can be
verified. It is shown that by using this approach some minor faults in
the protocol were found and repaired. Describing the protocol as
facts and rules also has other benefits, i.e., it leads to
processable knowledge. This knowledge can be transferred as an
ontology between systems in KQML format. Since a Prolog system can
incrementally extend its knowledge base, this method can also be
used to build a learning, intelligent network.
Abstract: This preliminary study attempts to see if a learning
environment influences instructor’s teaching strategies and learners’
in-class activities in a foreign language class at a university in Japan.
The class under study was conducted in a computer room, while the
majority of classes of the same course were offered in traditional
classrooms without computers. The study also examines whether the
unplanned blended learning environment enhanced, or worked against,
the achievement of course goals, by paying close attention to
in-class artefacts such as computers. In the macro-level analysis,
the course syllabus and weekly itinerary of the course were examined;
and in the micro-level analysis, nonhuman actors in their
environments were named and analyzed to see how they influenced the
learners’ task processes.
The results indicated that students were heavily influenced by the
presence of computers, which led them to disregard some aspects of
the intended learning objectives.
Abstract: Selecting the data modeling technique for an
information system is determined by the objective of the resultant
data model. Dimensional modeling is the preferred modeling
technique for data destined for data warehouses and data mining,
presenting data models that ease analysis and querying, in contrast
with entity-relationship modeling. The establishment of data
warehouses as components of information system landscapes in
many organizations has subsequently led to the development of
dimensional modeling. This has been significantly more developed
and reported for commercial database management systems than for
open-source ones, thereby making it less affordable for those in
resource-constrained settings. This paper presents
dimensional modeling of HIV patient information using open source
modeling tools. It aims to take advantage of the fact that the
regions most affected by HIV (sub-Saharan Africa) are also heavily
resource-constrained while having large quantities of HIV data. Two
HIV data source systems were studied to identify appropriate
dimensions and facts; these were then modeled using two open-source
dimensional modeling tools. Use of open-source tools would
reduce the software costs for dimensional modeling and in turn make
data warehousing and data mining more feasible even for those in
resource-constrained settings that have data available.
Abstract: Anchor bolts are commonly used in most buildings for
installing outdoor advertising structures. Anchor bolts of common
carbon steel are widely used and often installed indiscriminately
under inadequate installation standards. In areas where strong winds
frequently blow, outdoor advertising structures can fall and cause a
serious disaster, which is very dangerous and must be prevented. In
this regard, the development of high-performance anchor bolts is
urgently required. In the present
study, 25Cr-8Ni-1.5Si-1Mn-0.4C alloy was produced by traditional
vacuum induction melting (VIM) for anchor bolt applications.
Thermodynamic phase analysis with FactSage® revealed a duplex
microstructure for this composition, which was confirmed by
metallographic experiment. Addition of nitrogen to the alloy was
found to reduce the ferritic phase domain and significantly increase
the hardness and tensile strength. Microstructure observation revealed
a mixed structure of austenite and ferrite with fine carbides
distributed along the grain and phase boundaries.
Abstract: This paper deals principally with the socio-economic impact on the local Iban community in Mukah Division, Sarawak, following the commencement of the open-cut coal mining industry in 2003. To date, no actual studies have been carried out by either the public or private sector to truly analyze how the Iban community is coping with the advent of a large influx of cash into their society. The Iban community has traditionally practiced shifting cultivation and the farming of domesticated animals, with a portion of the younger generation working as laborers and professionals. This paper represents the views and observations of the author, supported by statistical facts extracted from published articles and unpublished reports. The paper deals primarily with the following areas: • Background of the coal mining industry in Mukah Division, Sarawak; • Benefits of the coal mining industry towards the Iban community; • Issues and problems arising in the Iban community because of the presence of the coal mining industry; and • Possible actions that need to be taken to overcome these issues and problems.
Abstract: This paper presents DJess, a novel distributed production system that provides an infrastructure for factual and procedural knowledge sharing. DJess is a Java package that provides programmers with a lightweight middleware through which inference systems implemented in Jess and running on different nodes of a network can communicate. Communication and coordination among inference systems (agents) are achieved through the ability of each agent to transparently and asynchronously reason on inferred knowledge (facts) that may be collected and asserted by other agents, on the basis of inference code (rules) that may be either local or transmitted by any node to any other node.