Abstract: The selection of appropriate requirements for product
releases can make a big difference in a product's success. The selection
of requirements is done by different requirements prioritization
techniques. These techniques are based on pre-defined and
systematic steps to calculate the requirements' relative weights.
Prioritization is complicated by new development settings, shifting
from traditional co-located development to geographically distributed
development. Stakeholders, connected to a project, are distributed all
over the world. These geographically distributions of stakeholders
make it hard to prioritize requirements as each stakeholder have their
own perception and expectations of the requirements in a software
project. This paper discusses the limitations of the Analytical Hierarchy
Process (AHP) with respect to prioritization of requirements by
geographically distributed stakeholders (GDS). The paper also provides a
solution, in the form of a modified AHP, for prioritizing
requirements for GDS. We conduct two experiments and analyze the
results in order to discuss AHP's limitations with respect to GDS.
The modified AHP variant is also validated in this paper.
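AHP's core computation, which the limitations above concern, derives a priority vector from a pairwise comparison matrix. A minimal sketch (the matrix values are illustrative; the geometric-mean row approximation is one common way to estimate the principal eigenvector):

```python
import numpy as np

def ahp_priorities(pairwise):
    """Estimate AHP priority weights from a pairwise comparison matrix
    using the geometric-mean (row) approximation of the principal eigenvector."""
    A = np.asarray(pairwise, dtype=float)
    gm = A.prod(axis=1) ** (1.0 / A.shape[0])  # geometric mean of each row
    return gm / gm.sum()                        # normalize so weights sum to 1

# Illustrative 3x3 matrix for requirements R1..R3 (reciprocal by construction):
# R1 is judged 3x as important as R2 and 5x as important as R3.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = ahp_priorities(A)
```

Each stakeholder supplies such a matrix; the distributed setting discussed above is precisely about reconciling many of them.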
Abstract: Motion detection is very important in image
processing. One way of detecting motion is using optical flow.
Optical flow cannot be computed locally, since only one independent
measurement is available from the image sequence at a point, while
the flow velocity has two components. A second constraint is needed.
The method used in this project for finding the optical flow
assumes that the apparent velocity of the brightness pattern varies
smoothly almost everywhere in the image. This technique is then
used to develop motion detection software capable of carrying out
four types of motion detection. The software can also highlight the
motion region, measure the motion level, and count the number of
objects. Objects such as vehicles and humans can be recognized in
video streams by applying the optical flow technique.
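The smoothness assumption described above is the basis of the Horn-Schunck method; a minimal sketch on synthetic frames (the blob images and parameter values are illustrative, not from the paper):

```python
import numpy as np

def horn_schunck(im1, im2, alpha=1.0, n_iter=100):
    """Dense optical flow under the smoothness assumption (Horn-Schunck):
    each iteration pulls (u, v) toward its neighbourhood average while
    honouring the brightness-constancy equation Ix*u + Iy*v + It = 0."""
    im1, im2 = im1.astype(float), im2.astype(float)
    Ix = np.gradient(im1, axis=1)
    Iy = np.gradient(im1, axis=0)
    It = im2 - im1
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)
    for _ in range(n_iter):
        # 4-neighbour average enforces the smoothness constraint
        u_avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                        + np.roll(u, 1, 1) + np.roll(u, -1, 1))
        v_avg = 0.25 * (np.roll(v, 1, 0) + np.roll(v, -1, 0)
                        + np.roll(v, 1, 1) + np.roll(v, -1, 1))
        num = Ix * u_avg + Iy * v_avg + It
        den = alpha ** 2 + Ix ** 2 + Iy ** 2
        u = u_avg - Ix * num / den
        v = v_avg - Iy * num / den
    return u, v

# toy frames: a bright blob shifted one pixel to the right
x = np.arange(32)
X, Y = np.meshgrid(x, x)
frame = lambda cx: np.exp(-((X - cx) ** 2 + (Y - 16) ** 2) / 8.0)
u, v = horn_schunck(frame(15), frame(16))
```

The recovered horizontal flow is positive over the blob region, consistent with the rightward motion.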
Abstract: Vector quantization is a powerful tool for speech
coding applications. This paper deals with LPC Coding of speech
signals using a new technique called Multi Switched Split
Vector Quantization. This is a hybrid of two product code vector
quantization techniques, namely the Multi stage vector quantization
technique and the Switched split vector quantization technique. The Multi
Switched Split Vector Quantization technique quantizes the linear
predictive coefficients in terms of line spectral frequencies. The
results show that Multi Switched Split Vector Quantization
provides a better trade-off between bit rate and spectral distortion
performance, computational complexity, and memory requirements
when compared to the Switched Split Vector Quantization, Multi stage
vector quantization, and Split Vector Quantization techniques. By
employing the switching technique at each stage of the vector
quantizer the spectral distortion, computational complexity and
memory requirements were greatly reduced. Spectral distortion was
measured in dB, computational complexity was measured in
floating point operations (flops), and memory requirements were
measured in floats.
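The split idea underlying these product-code schemes can be illustrated in a few lines: the vector (e.g., of line spectral frequencies) is partitioned into sub-vectors, and each is quantized against its own small codebook, reducing memory and search complexity compared to one large codebook. A minimal sketch with random codebooks and illustrative dimensions (not the paper's trained codebooks):

```python
import numpy as np

def split_vq(x, codebooks):
    """Split vector quantization: partition x according to the codebook
    dimensions and quantize each part by nearest-neighbour search."""
    out, idx, start = [], [], 0
    for cb in codebooks:
        dim = cb.shape[1]
        part = x[start:start + dim]
        d = ((cb - part) ** 2).sum(axis=1)  # squared Euclidean distances
        j = int(d.argmin())                  # nearest codeword index
        idx.append(j)
        out.append(cb[j])
        start += dim
    return np.concatenate(out), idx

rng = np.random.default_rng(0)
# a 10-dimensional LSF-like vector split into two 5-dimensional parts,
# each with its own 16-entry codebook (4 bits per part)
cbs = [rng.standard_normal((16, 5)), rng.standard_normal((16, 5))]
x = rng.standard_normal(10)
xq, idx = split_vq(x, cbs)
```

Multi-stage and switched variants wrap this primitive: stages quantize the residual of the previous stage, and a switch first selects which bank of codebooks to use.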
Abstract: A qanat is a passive water-supply system consisting of a
series of underground wells. A mother well was dug far from the
city, where the water table could be reached perhaps 100 meters
underground; further wells were then dug to direct the water toward
the city with the minimum possible gradient. Using the slope of the
land, water could be brought close to the surface within the city.
The source of water (the outlet of the qanat), the slope of the land,
and ownership lines are the important factors shaping routes and the
division of land into segments, to the extent that the use of qanats
as a technique for extracting groundwater creates a network of
routes with an organic order and hierarchy that follows the slope of
the land; it also guides qanat water through the traditional fabric of
the salt desert and its border provinces. Qanats are excavated at
specified distances from one another. The quantity of water provided
by a qanat depends on the type of land, the distance from the
mountains, the geographical situation, and the rate of groundwater
supply. The amount of groundwater, the feasibility of qanat
excavation, and the number of qanats and their rate of water supply
on the one hand, and the quantity of cultivable fertile land on the
other, are the important natural factors determining the size of
cities. Likewise, cities with several qanats have multi-centered
textures. The location of cities is directly related to land quality,
soil fertility, and the possibility of reaching groundwater by
excavating qanats. Observing the permissible distance for qanat
watering is a determining factor for the distance between villages
and cities. Topography, land slope, soil quality, the watering system,
ownership, the kind of cultivation, and similar factors govern where
qanats are excavated and how water is guided toward cultivable land,
and they also shape the different textures of land division in farming
provinces. Divisions such as orderly and wide, disorderly, thin and
long, or comb-like are the beginnings of organic order, and at the
same time they are in complete accordance with environmental
conditions, exemplifying ecological architecture and planning in
traditional cities and settlements.
Abstract: Recent developments in Soft computing techniques,
power electronic switches and low-cost computational hardware have
made it possible to design and implement sophisticated control
strategies for sensorless speed control of AC motor drives. Such an
attempt has been made in this work, for Sensorless Speed Control of
Induction Motor (IM) by means of Direct Torque Fuzzy Control
(DTFC), PI-type fuzzy speed regulator and MRAS speed estimator
strategy, which is inherently nonlinear. Direct torque
control is known to produce quick and robust response in AC drive
system. However, during steady state, torque, flux, and current
ripples occur, so the performance of conventional DTC with a PI speed
regulator can be improved by implementing fuzzy logic techniques.
Certain important issues in design including the space vector
modulated (SVM) 3-Ф voltage source inverter, DTFC design,
generation of reference torque using PI-type fuzzy speed regulator
and the sensorless speed estimator have been resolved. The proposed
scheme is validated through extensive numerical simulations in
MATLAB. The simulation results indicate that sensorless speed
control of the IM with DTFC and a PI-type fuzzy speed regulator
provides satisfactory dynamic and static performance compared to
conventional DTC with a PI speed regulator.
Abstract: This article presents the development of efficient
algorithms for comparing tablet copies. Image recognition has
specialized uses in digital systems such as medical imaging,
computer vision, defense, and communication. Comparison between
two images that look indistinguishable is a formidable task. Two
images taken from different sources may look identical, but due to
different digitizing properties they are not, while small variations
in image information, such as cropping, rotation, and slight
photometric alteration, can defeat direct matching
techniques. In this paper we introduce different matching
algorithms designed to facilitate, for art centers, identifying real
painting images from fake ones. Different vision algorithms for
local image features are implemented using MATLAB. In this
framework, a Table Comparison Computer Tool (“TCCT") is
designed to facilitate our research. The TCCT is a Graphical User
Interface (GUI) tool used to identify images by their shapes and
objects. The parameters of the vision system are fully accessible to
the user through this interface. For matching, the tool applies
different description techniques that can identify the exact
figures of objects.
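One baseline for deciding whether two near-identical images actually match is a global similarity score; a minimal sketch using normalized cross-correlation (a stand-in for the feature-based algorithms of the paper, which are not reproduced here):

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-sized images:
    1.0 for identical content up to brightness/contrast changes."""
    a = a.astype(float).ravel()
    b = b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom else 0.0

rng = np.random.default_rng(1)
original = rng.random((16, 16))
copy = original * 1.2 + 0.1   # photometric change only (brightness/contrast)
fake = rng.random((16, 16))   # unrelated content
s_copy, s_fake = ncc(original, copy), ncc(original, fake)
```

A purely photometric change leaves the score at 1.0, while unrelated content scores near zero; cropping and rotation, as noted above, require the feature-based techniques instead.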
Abstract: This paper investigates the effect of International
Financial Reporting Standards (IFRS) adoption on the frequency of
earnings management towards small positive profits. We focus on
two emerging-market IFRS adopters: South Africa and Turkey.
We tested our logistic regression using appropriate panel
estimation techniques over a sample of 330 South African and 210
Turkish firm-year observations over the period 2002-2008. Our
results document that mandatory adoption of IFRS is not associated
with a reduction in earnings management towards small positive
profits in emerging markets. These results contradict most of the
previous findings of the studies conducted in developed countries.
Based on the legal system factor, we compare the intensity of
earnings management between a code law country (Turkey) and a
common law country (South Africa) over the pre- and post-adoption
periods. Our findings show that the frequency of such earnings
management practice increases significantly for the code law
country.
Abstract: A large amount of valuable information is available in
plain text clinical reports. New techniques and technologies are
applied to extract information from these reports. In this study, we
developed a domain based software system to transform 600
Otorhinolaryngology discharge notes to a structured form for
extracting clinical data from the discharge notes. In order to decrease
the system's processing time, the discharge notes were transformed
into a data table after preprocessing. Several word lists were
constituted to identify common sections in the discharge notes,
including patient history, age, problems, and diagnosis. An n-gram
method was used to discover term co-occurrences within each section.
Using this method, a dataset of concept candidates was generated for
the validation step, and then the Predictive Apriori algorithm for
Association Rule Mining (ARM) was applied to validate the candidate concepts.
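The n-gram step can be illustrated in a few lines (the toy discharge-note text and the frequency threshold are invented for the example; the study's word lists and section segmentation are not reproduced):

```python
from collections import Counter

def ngram_counts(text, n=2):
    """Count word n-grams (term co-occurrences) in a section of text."""
    tokens = text.lower().split()
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

section = "chronic otitis media left ear chronic otitis media right ear"
bigrams = ngram_counts(section, n=2)
# frequent bigrams such as ('chronic', 'otitis') become concept candidates
candidates = [g for g, c in bigrams.items() if c >= 2]
```

Candidates produced this way would then be passed to the association-rule mining step for validation.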
Abstract: In this paper, we address the problem of reducing the
switching activity (SA) in on-chip buses through the use of a bus
binding technique in high-level synthesis. While many binding
techniques to reduce the SA exist, we present yet another technique for
further reducing the switching activity. Our proposed method
combines bus binding and data sequence reordering to explore a wider
solution space. The problem is formulated as a multiple traveling
salesman problem and solved using a simulated annealing technique.
The experimental results show that a binding solution obtained
with the proposed method reduces the switching activity by 5.6-27.2%
(18.0% on average) and 2.6-12.7% (6.8% on average) when compared
with conventional binding-only and hybrid binding-encoding
methods, respectively.
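The reordering half of the approach can be sketched as follows: treat each data word as a city, with the Hamming distance between consecutive words as the tour cost (the bit transitions, i.e., the switching activity), and anneal over orderings. This toy version handles a single bus rather than the paper's multiple-TSP formulation, and the word values and schedule parameters are illustrative:

```python
import math
import random

def transitions(order, words):
    """Total bit transitions (the switching activity) for one ordering."""
    return sum(bin(words[a] ^ words[b]).count("1")
               for a, b in zip(order, order[1:]))

def anneal(words, steps=4000, t0=4.0, seed=7):
    """Simulated annealing over data-sequence orderings (single-bus toy)."""
    rng = random.Random(seed)
    order = list(range(len(words)))
    cur = best = transitions(order, words)
    best_order = order[:]
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-9          # linear cooling schedule
        i, j = rng.sample(range(len(words)), 2)
        order[i], order[j] = order[j], order[i]    # propose a swap
        new = transitions(order, words)
        if new <= cur or rng.random() < math.exp(-(new - cur) / t):
            cur = new
            if new < best:
                best, best_order = new, order[:]
        else:
            order[i], order[j] = order[j], order[i]  # reject: undo the swap
    return best_order, best

words = [0b0000, 0b1111, 0b0001, 0b1110, 0b0011, 0b1100]
order, cost = anneal(words)
```

The annealer never returns a worse ordering than the initial one, so the switching activity can only drop.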
Abstract: Recently, grid computing has attracted wide attention in
the science, industry, and business fields, which require a vast
amount of computing. Grid computing provides an environment
in which many nodes (i.e., many computers) are connected to each
other through a local or global network and made available to many
users. In this environment, to achieve data processing among nodes
for any application, each node executes mutual authentication
using certificates issued by the Certificate Authority (CA).
However, if a failure or fault occurs in the CA, no new certificates
can be issued, and as a result a new node cannot participate in the
grid environment. In this paper, an off-the-shelf scheme for
dependable grid systems using virtualization techniques is proposed
and its implementation is verified. The proposed approach uses
virtualization techniques to restart an application, e.g., the CA,
if it has failed, so the system can tolerate a failure or fault
occurring in the CA. Since the proposed scheme is easily implemented
at the application level, its implementation cost for the system
builder is low compared with other methods. Simulation results show
that the CA in the system can recover from its failure or fault.
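The restart behaviour at the heart of the scheme (monitor the service, restart it on failure) can be sketched generically; this toy supervisor restarts any callable that raises, standing in for restarting a virtualized CA instance, and the flaky service is invented for the example:

```python
import time

def supervise(service, max_restarts=3, delay=0.0):
    """Run `service`; if it fails (raises), restart it, up to a limit.
    Returns (result, restarts_used)."""
    for attempt in range(max_restarts + 1):
        try:
            return service(), attempt
        except Exception:
            if attempt == max_restarts:
                raise          # give up: restart budget exhausted
            time.sleep(delay)  # brief back-off before restarting

# a flaky stand-in for the CA: fails twice, then succeeds
state = {"calls": 0}
def flaky_ca():
    state["calls"] += 1
    if state["calls"] < 3:
        raise RuntimeError("CA fault")
    return "certificate issued"

result, restarts = supervise(flaky_ca)
```

In the proposed scheme the restart acts on a virtual machine image rather than an in-process callable, but the supervision logic is the same shape.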
Abstract: The curriculum of the primary school science course was redesigned on the basis of constructivism in the 2005-2006 academic year in Turkey. In this context, the name of the course was changed to “Science and Technology", and the content, course books, and student workbooks for this course were redesigned in light of constructivism. The aim of this study is to determine whether the Science and Technology course books and student workbooks for the 5th grade of primary school are appropriate for constructivism by evaluating them in terms of its fundamental principles. In this study, among qualitative research methods, the documentation technique (i.e., document analysis) is applied; in selecting samples, criterion sampling is used from among the purposeful sampling techniques. When the Science and Technology course book and workbook for the 5th grade in primary education are examined, it is seen that the two books complement each other in certain areas. Consequently, it can be claimed that, in spite of some inadequate and missing points in the course book and workbook of the primary school Science and Technology course for 5th grade students, an attempt has been made to design these books in line with the principles of constructivism. To overcome the inadequacies in the books, it can be suggested that they be redesigned. In addition, so that the technology dimension of the course is not ignored, activities that encourage students to prepare projects using the technology cycle should be included.
Abstract: The protection of the contents of digital products is
referred to as content authentication. In some applications, to be able
to authenticate a digital product could be extremely essential. For
example, if a digital product is used as a piece of evidence in the
court, its integrity could mean life or death of the accused. Generally,
the problem of content authentication can be solved using
semi-fragile digital watermarking techniques. Recently, many authors
have proposed Computer Generated Hologram Watermarking
(CGH-Watermarking) techniques. Starting from these studies, this paper
proposes a semi-fragile Computer Generated Hologram coding technique
that is able to detect malicious tampering while tolerating some
incidental distortions. The proposed technique uses an encrypted
image as the watermark, and it is well suited to digital
image authentication.
Abstract: Chemical and physical functionalization of multiwalled
carbon nanotubes (MWCNT) has been commonly practiced to
achieve better dispersion of carbon nanotubes (CNTs) in polymer
matrix. This work describes various functionalization methods (acid
treatment, non-ionic surfactant treatment with Triton X-100),
fabrication of MWCNT/PP nanocomposites via melt blending, and
characterization of their mechanical properties. Microscopy and
spectroscopy analyses (FESEM, TEM, XPS) showed effective purification
of the MWCNTs under acid treatment, and better dispersion when the
chemical and physical functionalization techniques were combined, in
that order. Tensile tests showed an increase in tensile strength for the
nanocomposites containing up to 2 wt% MWCNTs. A decrease in
tensile strength was seen in samples containing 4 wt% MWCNTs,
for both raw and Triton X-100 functionalized nanotubes, signifying
MWCNT degradation/rebundling at compositions with higher MWCNT
content. For the acid-treated MWCNTs, however, the tensile
results showed a slight improvement even at 4 wt%, indicating
effective dispersion of the MWCNTs.
Abstract: The manufacture of large-scale precision aerospace
components using CNC requires a highly effective maintenance
strategy to ensure that the required accuracy can be achieved over
many hours of production. This paper reviews a strategy for a
maintenance management system based on Failure Mode Avoidance,
which uses advanced techniques and technologies to underpin a
predictive maintenance strategy. It is shown how condition
monitoring (CM) is important to predict potential failures in high
precision machining facilities and achieve intelligent and integrated
maintenance management. There are two distinct ways in which CM
can be applied. One is to monitor key process parameters and
observe trends which may indicate a gradual deterioration of
accuracy in the product. The other is to use CM techniques to
monitor high-status machine parameters, enabling trends to be
observed and corrected before machine failure and
downtime occur.
It is concluded that the key to developing a flexible and intelligent
maintenance framework in any precision manufacturing operation is
the ability to reliably and routinely evaluate machine tool condition
using condition monitoring techniques within a framework of Failure
Mode Avoidance.
Abstract: To minimize power losses, it is important to
determine the location and size of local generators to be placed in
unbalanced power distribution systems. On account of some inherent
features of unbalanced distribution systems, such as radial structure,
large number of nodes, a wide range of X/R ratios, the conventional
techniques developed for the transmission systems generally fail on
the determination of optimum size and location of distributed
generators (DGs). This paper presents a simple method for
simultaneously choosing the best location and size of DG in a
three-phase unbalanced radial distribution
system (URDS) for power loss minimization and to improve the
voltage profile of the system. The best location for the DG is
determined using voltage index analysis, and the size of the DG is
computed by a variational technique algorithm according to the
available standard sizes of DGs. This paper presents the results of
simulations for the 25-bus and IEEE 37-bus unbalanced radial
distribution systems.
Abstract: In this paper, we propose an NLP-based method for
Ontology Population from texts and apply it to semi-automatically
instantiate a Generic Knowledge Base (Generic Domain Ontology) in
the risk management domain. The approach is semi-automatic and
uses a domain expert intervention for validation. The proposed
approach relies on a set of Instances Recognition Rules based on
syntactic structures, and on the predicative power of verbs in the
instantiation process. It is not domain dependent since it heavily
relies on linguistic knowledge.
A description of an experiment performed on a part of the
ontology of the PRIMA project (supported by the European
Community) is given. A first validation of the method is done by
populating this ontology with Chemical Fact Sheets from the
Environmental Protection Agency. The results of this experiment
complete the paper and support the hypothesis that relying on the
predicative power of verbs in the instantiation process improves the
performance.
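A flavor of such Instance Recognition Rules can be given with a simple copular-verb pattern; this regex stand-in (the sentences and pattern are invented for the example, while the paper's rules operate on full syntactic structures) shows how the predicative power of a verb turns a sentence into an ontology instance:

```python
import re

# Rule sketch: "<Instance> is a/an <Concept>" instantiates Concept with Instance.
PATTERN = re.compile(r"(?P<inst>[A-Z][\w-]*)\s+is\s+an?\s+(?P<concept>[a-z][\w-]*)")

def extract_instances(text):
    """Return (instance, concept) pairs found by the copular rule."""
    return [(m.group("inst"), m.group("concept"))
            for m in PATTERN.finditer(text)]

text = "Benzene is a solvent. Toluene is an irritant. It was stored safely."
pairs = extract_instances(text)
```

A domain expert would then validate the extracted pairs, as in the semi-automatic workflow described above.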
Abstract: Testing is an activity required in both the
development and maintenance phases of the software development life
cycle, in which Integration Testing is an important activity.
Integration testing is based on the specification and functionality
of the software and can thus be called a black-box testing technique.
The purpose of integration testing is to test the integration between
software components. In function or system testing, the concern is with overall
behavior and whether the software meets its functional specifications
or performance characteristics or how well the software and
hardware work together. This explains the importance and necessity
of Integration Testing (IT), for which the emphasis is on interactions
between modules and their interfaces. Software errors should be
discovered early during IT to reduce the cost of correction. This
paper introduces a new type
of integration error, presenting an overview of Integration Testing
techniques with comparison of each technique and also identifying
which technique detects what type of error.
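A minimal illustration of the idea (exercising two components together through their interface rather than each unit in isolation), using Python's unittest; the parser and evaluator modules are invented for the example:

```python
import unittest

# Two hypothetical components: a parser and an evaluator that consumes its output.
def parse_expr(text):
    """Parse 'a op b' into (float, op, float)."""
    a, op, b = text.split()
    return float(a), op, float(b)

def evaluate(parsed):
    """Evaluate the parser's output tuple."""
    a, op, b = parsed
    return a + b if op == "+" else a - b

class IntegrationTest(unittest.TestCase):
    """Black-box integration test: only the combined behaviour of
    parse_expr and evaluate is checked, via their shared interface."""
    def test_parse_then_evaluate(self):
        self.assertEqual(evaluate(parse_expr("2 + 3")), 5.0)
        self.assertEqual(evaluate(parse_expr("7 - 4")), 3.0)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(IntegrationTest))
```

An integration error here would be an interface mismatch (e.g., the evaluator expecting strings while the parser emits floats) that neither unit test alone would catch.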
Abstract: This paper discusses an active power generator scheduling method for increasing the steady state stability limit of a system. Several power generator optimization methods, such as Lagrange, PLN (Indonesian electricity company) operation, and the proposed Z-Thevenin-based method, are studied and compared with respect to their steady state aspects. The method proposed in this paper is built upon the Thevenin equivalent impedance values between each load and each generator. The steady state stability index is obtained with the REI-Dimo method. This research reviews the 500 kV Jawa-Bali interconnection system. The simulation results show that the proposed method yields the highest steady state stability limit compared to the other optimization techniques, namely Lagrange and PLN operation. Thus, the proposed method can be used to establish the steady state stability limit of the system, especially in the peak-load condition.
Abstract: Since the majority of faults are found in a few of a
system's modules, there is a need to investigate the modules that are
affected severely as compared to other modules, and proper
maintenance needs to be done in time, especially for critical
applications. Neural networks have already been applied in software
engineering to build reliability growth models and to predict gross
change or reusability metrics. Neural networks are sophisticated
non-linear modeling techniques that are able to model complex
functions. Neural network techniques are used when the exact nature
of the inputs and outputs is not known. A key
feature is that they learn the relationship between input and output
through training. In the present work, various neural-network-based
techniques are explored and a comparative analysis is performed for
predicting the level of maintenance need by predicting the severity
level of faults present in NASA's public domain defect dataset.
The different algorithms are compared on the basis of Mean
Absolute Error, Root Mean Square Error, and accuracy values. It is
concluded that the Generalized Regression Network is the best
algorithm for classifying software components into different levels
of severity of fault impact. This algorithm can be used to develop a
model for identifying modules that are heavily affected by faults.
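The generalized regression network singled out above admits a compact description: the prediction for a new point is a Gaussian-kernel-weighted average of the training targets. A minimal sketch (the toy "fault severity" data and the sigma value are invented for the example), together with the MAE and RMSE figures used in the comparison:

```python
import numpy as np

def grnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Generalized regression (Nadaraya-Watson style) prediction:
    Gaussian-weighted average of the training targets."""
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

# toy data: fault severity grows with a single module metric
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 1.0, 3.0, 3.0])
pred = grnn_predict(X, y, np.array([[0.1], [2.9]]))

truth = np.array([1.0, 3.0])
mae = np.abs(pred - truth).mean()
rmse = np.sqrt(((pred - truth) ** 2).mean())
```

There is no iterative training: the network memorizes the training set, which is why it suits one-pass comparisons such as this study's.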
Abstract: Inter-organizational Workflow (IOW) is commonly
used to support the collaboration between heterogeneous and
distributed business processes of different autonomous organizations
in order to achieve a common goal. E-government is considered an
application field of IOW. The coordination of the different
organizations is the fundamental problem in IOW and remains the
major cause of failure in e-government projects. In this paper, we
introduce a new coordination model for IOW that improves the
collaboration between government administrations and that respects
IOW requirements applied to e-government. For this purpose, we
adopt a Multi-Agent approach, which deals more easily with
inter-organizational digital government characteristics: distribution,
heterogeneity, and autonomy. Our model also integrates different
technologies to deal with semantic and technological
interoperability. Moreover, it preserves the existing systems of
government administrations by offering distributed coordination
based on interface communication. This is especially applicable in
developing countries, where administrations are not necessarily
equipped with workflow systems. The use of our coordination
techniques allows an easier and lower-cost migration to an
e-government solution. To illustrate the applicability of the proposed
model, we present a case study of identity card creation in Tunisia.