Abstract: Enabling computers to process data and perform reasoning tasks is an important aim in Computer Science, and the Semantic Web is one step towards it. Enriching information with semantics through the use of ontologies is a current trend. Huge amounts of domain-specific, unstructured online data need to be expressed in a machine-understandable and semantically searchable format. Currently, users are often forced to search manually through the results returned by keyword-based search services; they also want to use their native languages to express what they are searching for. In this paper, an ontology-based automated question answering system for the software test documents domain is presented. The system allows users to enter a question about the domain in natural language and returns the exact answer to the question. Converting the natural language question into an ontology-based query is the challenging part of the system. To achieve this, a new algorithm for converting free text into ontology-based search engine queries is proposed. The algorithm is based on identifying the appropriate question type and parsing the words of the question sentence.
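The two steps the abstract names (question-type identification and word-level parsing) could be sketched as follows; the type labels, stop-word list and triple-pattern output below are illustrative assumptions, not the paper's actual algorithm.

```python
# Hypothetical sketch: classify a question by its leading wh-word, then
# strip stop words to obtain keywords for an ontology-style query.
QUESTION_TYPES = {
    "what": "definition",
    "who": "person",
    "when": "time",
    "where": "location",
    "how": "procedure",
}
STOP_WORDS = {"is", "are", "the", "a", "an", "of", "in", "do", "does", "to"}

def parse_question(question):
    """Return (question_type, keywords) for a natural-language question."""
    words = question.lower().rstrip("?").split()
    qtype = QUESTION_TYPES.get(words[0], "general")
    keywords = [w for w in words[1:] if w not in STOP_WORDS]
    return qtype, keywords

def build_query(question):
    """Map the parsed question onto a simple SPARQL-like triple pattern."""
    qtype, keywords = parse_question(question)
    subject = "_".join(keywords)
    return f"SELECT ?x WHERE {{ :{subject} :{qtype} ?x }}"
```

For example, `build_query("What is regression testing?")` would yield a pattern asking for the definition of the `regression_testing` concept.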
Abstract: Reverse engineering is a very important process in software engineering. It works backwards through the system development life cycle (SDLC) in order to recover the source data or representations of a system through analysis of its structure, function and operation. We use reverse engineering to introduce an automatic tool that generates system requirements from program source code. The tool accepts Cµ source code, scans it line by line and passes the code to a parser. The engine of the tool then generates system requirements for that specific program to facilitate its reuse and enhancement. The purpose of the tool is to help recover the system requirements of a system when the system requirements document (SRD) does not exist because the system is undocumented.
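The scan-and-parse idea can be illustrated with a toy version of the pipeline: read source code line by line and emit a requirement-like statement for each recognised construct. The patterns and requirement wording below are illustrative assumptions; the actual tool targets Cµ source code and a full parser, not regular expressions.

```python
import re

# Toy sketch: map recognised source-code constructs to requirement
# statements, one pass over the code, line by line.
RULES = [
    (re.compile(r"\bif\b"), "The system shall behave conditionally on: "),
    (re.compile(r"\b(for|while)\b"), "The system shall repeat: "),
    (re.compile(r"\bprintf\b"), "The system shall output: "),
]

def extract_requirements(source):
    """Scan source line by line and collect requirement-like statements."""
    requirements = []
    for line in source.splitlines():
        line = line.strip()
        for pattern, template in RULES:
            if pattern.search(line):
                requirements.append(template + line)
                break  # one requirement per line, first matching rule wins
    return requirements
```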
Abstract: The World Wide Web, coupled with the ever-increasing sophistication of online technologies and software applications, puts greater emphasis on the need for even more sophisticated and consistent quality-requirements modeling than traditional software applications do. Web sites and Web applications (WebApps) are becoming more information-driven and content-oriented, raising concern about their information quality (InQ). The consistent and consolidated modeling of InQ requirements for WebApps at different stages of the life cycle still poses a challenge. This paper proposes an approach to specifying InQ requirements for WebApps by reusing and extending the ISO 25012:2008(E) data quality model. We also discuss the learnability aspect of information quality for WebApps. The proposed ISO 25012-based InQ framework is a step towards a standardized approach to evaluating WebApp InQ.
Abstract: One of the common problems encountered in software engineering is addressing and responding to the changing nature of requirements. Several approaches have been devised to address this issue, ranging from instilling resistance to changing requirements in order to mitigate impacts on project schedules, to developing an agile mindset towards requirements. The approach discussed in this paper is to conceptualize the delta in requirements and model it, in order to plan a response to it. To provide some context, change is first formally identified and categorized as either formal change or informal change. While agile methodology facilitates informal change, the approach discussed in this paper seeks to develop the idea of facilitating formal change. Collecting and documenting meta-requirements that represent the phenomenon of change would be a proactive measure towards building a realistic cognition of the requirements entity, which can further be harnessed in the software engineering process.
Abstract: A new and cost-effective robotic device was designed for remote telesurgery using dual-tone multi-frequency (DTMF) technology. A telesystem based on DTMF has a large capability for sending and receiving data in both hardware and software. The robot consists of DC motors for arm movements and is controlled manually through a mobile phone via DTMF. The system enables the surgeon at the base station to send commands through a mobile phone to the patient's robotic system, which includes two robotic arms that translate the input into actual instrument manipulation. A mobile phone is attached to an 8051 microcontroller, which activates the robot through relays. Remote robot-assisted telesurgery eliminates geographic constraints on access to surgical expertise where it is needed and allows an expert surgeon to teach or proctor the performance of a surgical technique by real-time intervention.
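The control scheme described above amounts to mapping each decoded DTMF keypress to a relay or motor command. The key-to-command table below is a purely hypothetical assignment for illustration; the paper does not specify its key mapping.

```python
# Hypothetical DTMF key-to-command table for the two robotic arms.
COMMANDS = {
    "2": "arm1_up",
    "8": "arm1_down",
    "4": "arm2_left",
    "6": "arm2_right",
    "5": "stop",
}

def decode_keys(key_sequence):
    """Translate a sequence of decoded DTMF keys into motor commands,
    ignoring keys with no assigned action."""
    return [COMMANDS[k] for k in key_sequence if k in COMMANDS]
```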
Abstract: Dredging activities inevitably cause sediment
dispersion. In certain locations, where there are important ecological
areas such as mangroves or coral reefs, carefully planning the
dredging can significantly reduce negative impacts. This article
utilizes the dredging at Phuket port, Thailand, as a case study to
demonstrate how computer simulations can be helpful to protect
existing coral reefs. A software package named MIKE21 was
applied. Necessary information required by the simulations was
gathered. After calibrating and verifying the model, various dredging scenarios were simulated to predict spoil movement. The simulation results were used as guidance for setting up environmental measures. Finally, the recommendation to dredge during flood tide with silt curtains installed was made.
Abstract: This paper deals with the application of Principal Component Analysis (PCA) and the Hotelling's T² chart, using data collected from a drinking water treatment process. PCA is applied primarily for dimensional reduction of the collected data, and the Hotelling's T² control chart is used for fault detection in the process. The data were taken from a United Utilities multistage water treatment works, downloaded from an Integrated Program Management (IPM) dashboard system. The analysis of the results shows that Multivariate Statistical Process Control (MSPC) techniques such as PCA, and control charts such as Hotelling's T², can be effectively applied for the early fault detection of continuous multivariable processes such as drinking water treatment. The software package SIMCA-P was used to develop the MSPC models and the Hotelling's T² chart from the collected data.
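The monitoring statistic at the heart of this approach can be sketched without any statistical package. Below is a minimal pure-Python version of Hotelling's T² for a two-variable process; the data values in the usage example are illustrative assumptions, not the United Utilities dataset used in the paper.

```python
# Minimal Hotelling's T^2 sketch for two process variables: the T^2
# value is the Mahalanobis-type distance of a new observation from the
# in-control mean, using the inverse of the 2x2 sample covariance matrix.

def mean(xs):
    return sum(xs) / len(xs)

def covariance_2d(x, y):
    """Sample variances and covariance (sxx, syy, sxy) of paired data."""
    mx, my = mean(x), mean(y)
    n = len(x) - 1
    sxx = sum((a - mx) ** 2 for a in x) / n
    syy = sum((b - my) ** 2 for b in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return sxx, syy, sxy

def hotelling_t2(x, y, point):
    """T^2 distance of `point` from the in-control mean of (x, y)."""
    mx, my = mean(x), mean(y)
    sxx, syy, sxy = covariance_2d(x, y)
    det = sxx * syy - sxy * sxy
    dx, dy = point[0] - mx, point[1] - my
    # [dx dy] * S^-1 * [dx dy]^T with the 2x2 inverse written out
    return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det
```

In a control chart, T² values for incoming observations would be plotted against an upper control limit; observations above the limit signal a potential process fault.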
Abstract: Recent advances in both the testing and the verification of software based on formal specifications of the system to be built have reached a point where these ideas can be applied powerfully in the design of agent-based systems. Software engineering research has highlighted a number of important issues: the importance of the type of modeling technique used; the careful design of the model to enable powerful testing techniques to be used; the automated verification of the behavioural properties of the system; and the need for a mechanism that translates the formal models into executable software in a simple and transparent way. This paper introduces the use of the X-machine formalism as a tool for modeling biology-inspired agents and proposes the use of the techniques built around X-machine models for the construction of effective and reliable agent-based software systems.
Abstract: A new approach to power transformer protection is presented using a time-frequency transform known as the wavelet transform. Currents from different operating conditions, such as inrush, normal, load, external fault and internal fault, are sampled and processed to obtain wavelet coefficients; the different operating conditions produce variation in these coefficients. Features such as energy and standard deviation are calculated using Parseval's theorem and are used as inputs to a probabilistic neural network (PNN) for fault classification. The proposed algorithm provides accurate results even in the presence of noisy inputs and correctly distinguishes inrush currents from fault currents. The overall classification accuracy of the proposed method is found to be 96.45%. Simulation of the faults (with and without noise) was performed using MATLAB and Simulink, taking a two-cycle data window (40 ms) containing 800 samples. The algorithm was evaluated using 10% Gaussian white noise.
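The feature-extraction step can be sketched compactly: decompose the sampled current with a wavelet transform, then take the energy and standard deviation of the detail coefficients. The abstract does not name its mother wavelet, so the one-level Haar transform below is an assumption for illustration; Parseval's theorem then guarantees that the signal energy is preserved across the decomposition.

```python
import math

def haar_level1(signal):
    """One-level Haar decomposition: return (approximation, detail)
    coefficient lists; signal length must be even."""
    approx = [(signal[i] + signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal), 2)]
    return approx, detail

def features(coeffs):
    """Energy and standard deviation of a coefficient vector, the two
    features fed to the classifier."""
    energy = sum(c * c for c in coeffs)
    m = sum(coeffs) / len(coeffs)
    std = math.sqrt(sum((c - m) ** 2 for c in coeffs) / len(coeffs))
    return energy, std
```

Because the Haar transform is orthonormal, the energies of the approximation and detail bands sum to the energy of the original window, which is exactly the Parseval relation the features rely on.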
Abstract: Cloud computing has become more and more mature over the last few years, and consequently the demand for better cloud services is increasing rapidly. One research topic for improving cloud services is desktop computing in a virtualized environment. This paper aims at the development of an adaptive virtual desktop service on a cloud computing platform, based on our previous research on virtualization technology. We implement cloud virtual desktop and application software streaming technology that make it possible to provide Virtual Desktop as a Service (VDaaS). Remote desktop virtualization allows shifting the user's desktop from the traditional PC environment to the cloud-enabled environment, where it is stored on a remote virtual machine rather than locally. This proposed effort has the potential to provide an efficient, resilient and elastic environment for online cloud services. Users no longer need to bear the burden of platform maintenance, and the overall cost of hardware and software licenses is drastically reduced. Moreover, this flexible remote desktop service represents the next significant step towards the mobile workplace, letting users access their desktop environments from virtually anywhere.
Abstract: This paper examines the impact of object-oriented (OO) design on software quality characteristics such as defect density and rework by means of experimental validation. Encapsulation, inheritance, polymorphism, reusability, data hiding and message passing are the major attributes of an object-oriented system, and these attributes can act as indicators when evaluating the quality of such a system. Metrics are the well-known quantifiable approach to expressing any attribute. Hence, in this paper we formulate a framework of metrics representing the attributes of an object-oriented system. Empirical data are collected from three different projects based on the object-oriented paradigm to calculate the metrics.
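As one concrete example of the kind of metric such a framework could include for the inheritance attribute, the classic depth of inheritance tree (DIT) can be computed directly from a class hierarchy. The abstract does not list its metric suite, so DIT here is an illustrative assumption, computed over Python's own class objects.

```python
# Depth of inheritance tree (DIT): length of the longest path from a
# class to the root of its inheritance hierarchy.

def depth_of_inheritance(cls):
    """Recursively walk base classes; `object` itself has depth 0."""
    if not cls.__bases__ or cls.__bases__ == (object,):
        return 0 if cls is object else 1
    return 1 + max(depth_of_inheritance(b) for b in cls.__bases__)

# Small illustrative hierarchy.
class A: pass
class B(A): pass
class C(B): pass
```

A deeper tree generally signals more reuse through inheritance but also greater comprehension effort, which is why DIT is commonly paired with defect-density data in empirical OO quality studies.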
Abstract: Computer-aided design relies on the support of parametric software in the design of machine components as well as of any other parts of interest. The complexity of the element under study sometimes poses difficulties for computer design, or may even generate mistakes in the conception of the final body. Reverse engineering techniques are based on the transformation of images of an already conceived body into a matrix of points that can be visualized by the design software. The literature describes several techniques for obtaining the dimensional fields of machine components, such as contact instruments (MMC) and calipers, and optical methods such as laser scanning, holography and moiré methods. The objective of this research work was to analyze the moiré technique as an instrument of reverse engineering, applied to bodies of non-complex geometry such as simple solid figures, creating matrices of points. These matrices were forwarded to a parametric software package, SolidWorks, to generate the virtual object. The volume obtained by mechanical means, i.e., by caliper, the volume obtained through the moiré method and the volume generated by the SolidWorks software were compared and found to be in close agreement. This research work suggests the application of phase-shifting moiré methods as an instrument of reverse engineering, also serving to support the design of farm machinery elements.
Abstract: The flat double-layer grid belongs to the category of space structures formed from two flat layers connected together by diagonal members. Increased stiffness and better seismic resistance relative to other space structures are advantages of flat double-layer space structures. The objective of this study is the assessment and calculation of the behavior factor of flat double-layer space structures. Although these structures are widely used, the behavior factor used to design them against seismic forces has not been determined exactly, which makes this study necessary. The study is theoretical; we used structures with span lengths of 16 m and 20 m, with all connections pivotal. ANSYS software is used for the non-linear analysis of the structures.
Abstract: In this paper we report a study aimed at determining
the effects of animation on usability and appeal of educational
software user interfaces. Specifically, the study compares three
interfaces developed for the Mathsigner™ program: a static
interface, an interface with highlighting/sound feedback, and an
interface that incorporates five Disney animation principles. The
main objectives of the comparative study were to: (1) determine
which interface is the most effective for the target users of
Mathsigner™ (i.e., children ages 5-11), and (2) identify any gender and age differences in using the three interfaces. To accomplish
these goals we have designed an experiment consisting of a
cognitive walkthrough and a survey with rating questions. Sixteen
children ages 7-11 participated in the study, ten males and six
females. Results showed no significant interface effect on user task
performance (e.g., task completion time and number of errors);
however, interface differences were seen in rating of appeal, with
the animated interface rated more 'likeable' than the other two.
Task performance and rating of appeal were not significantly affected by the gender or age of the subjects.
Abstract: In this work, the electrical and optical properties of BaZrO3 have been calculated using the full-potential linearized augmented plane wave (FP-LAPW) method as implemented in the Wien2k software. The band structure, density of states, energy gap, refractive index and optical conductivity have been studied. The calculations show that BaZrO3 is an insulator with an indirect gap of 3.2 eV and a refractive index of 2.07. These results are in accordance with those obtained in experimental studies.
Abstract: Web applications have become very complex and crucial, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has therefore focused its attention on the design, development, analysis and testing of Web applications, studying and proposing methodologies and tools. This paper proposes an approach to automatic multi-dimensional concern mining for Web applications, based on concept analysis, impact analysis and token-based concern identification. The approach lets the user analyse and traverse the Web software relevant to a particular concern (concept, goal, purpose, etc.) via multi-dimensional separation of concerns, in order to document, understand and test Web applications. The technique was developed in the context of the WAAT (Web Applications Analysis and Testing) project, and a semi-automatic tool supporting it is currently under development.
Abstract: This study focuses on bureau management
technologies and information systems in developing countries.
Developing countries use such systems which facilitate executive and
organizational functions through the utilization of bureau
management technologies and provide the executive staff with
necessary information.
The concepts of data and information differ from each other in developing countries, and thus the concepts of data processing and information processing are different. Symbols represent ideas, objects, figures, letters and numbers. A data processing system is an integrated system, composed of both human beings and machines, which processes data related to the internal and external environment of the organization in order to make decisions, create plans and develop strategies. Information is obtained through the acquisition and processing of data; data, on the other hand, are raw communicative messages. Within this framework, data processing amounts to producing plausible information out of raw data.
Organizations in developing countries need to obtain information relevant to them, because rapid changes in the organizational arena require rapid access to accurate information. The most significant role of the directors and managers who work in the organizational arena is to make decisions, and making a correct decision is possible only when they are equipped with sound ideas and appropriate information. Therefore, the acquisition, organization and distribution of information gain significance. Today's organizations make use of computer-assisted "Management Information Systems" in order to obtain and distribute information.
A Decision Support System, which is closely related to practice, is an information system that facilitates the director's task of making decisions. It integrates human intelligence, information technology and software in order to solve complex problems: with the support of computer technology and software systems, it produces information relevant to the decision to be made and provides the executive staff with supportive ideas about that decision.
Artificial Intelligence programs which transfer the knowledge and experience of people to the computer are called expert systems. An expert system stores expert information in a limited area and can solve problems by deriving rational consequences.
Bureau management technologies and information systems in developing countries create a kind of information society and information economy that gives those countries a place in the global socio-economic structure and enables them to play a reasonable and fruitful role. It is therefore of crucial importance to make use of information and management technologies in order to work together with innovative and enterprising individuals, and it is also significant to create "scientific policies" based on information and technology in the fields of economy, politics, law and culture.
Abstract: Despite the various methods that exist for software risk management, software projects have a high rate of failure, and when the complexity and size of projects increase, managing software development becomes more difficult. In such projects the need for more analysis and risk assessment is vital. In this paper, a classification of software risks is specified, and the relations between these risks are presented using a risk tree structure. Analysis and assessment of these risks are performed using probabilistic calculations. This analysis supports both qualitative and quantitative assessment of the risk of failure and can aid the software risk management process. The classification and risk tree structure can also be applied in software tools.
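One way the probabilistic calculation over a risk tree could work is to assume that a parent risk materialises if any of its independent child risks does, giving P(parent) = 1 - ∏(1 - p_i). The tree shape and the independence assumption below are illustrative, not taken from the paper.

```python
# Hedged sketch: evaluate a risk tree bottom-up under an OR relation
# between independent child risks.

def or_combine(probabilities):
    """Probability that at least one independent child risk occurs."""
    p = 1.0
    for q in probabilities:
        p *= 1.0 - q
    return 1.0 - p

def tree_probability(node):
    """Recursively evaluate a risk tree given as {'p': float} for
    leaves or {'children': [...]} for inner nodes."""
    if "children" in node:
        return or_combine(tree_probability(c) for c in node["children"])
    return node["p"]
```

For instance, a project risk with two independent child risks of probability 0.1 and 0.2 would itself have probability 1 - 0.9 x 0.8 = 0.28.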
Abstract: An adaptive software reliability prediction model using an evolutionary connectionist approach based on a recurrent radial basis function architecture is proposed. Based on the currently available software failure time data, the Fuzzy Min-Max algorithm is used to globally optimize the number of the k Gaussian nodes. The corresponding optimized neural network architecture is iteratively and dynamically reconfigured in real time as new actual failure time data arrive. The performance of the proposed approach has been tested using sixteen real-time software failure datasets. Numerical results show that the proposed approach is robust across different software projects and offers better next-step predictability than existing neural network models for failure time prediction.
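The prediction step of such a model reduces to the forward pass of a Gaussian radial basis function network: a weighted sum of k Gaussian nodes. The sketch below shows only that forward pass; the centres, widths and weights are illustrative assumptions, and the Fuzzy Min-Max optimisation of the number of nodes is not reproduced here.

```python
import math

def gaussian(x, centre, width):
    """A single Gaussian RBF node evaluated at scalar input x."""
    return math.exp(-((x - centre) ** 2) / (2.0 * width ** 2))

def rbf_predict(x, centres, widths, weights, bias=0.0):
    """Forward pass: weighted sum of k Gaussian nodes plus a bias."""
    return bias + sum(w * gaussian(x, c, s)
                      for w, c, s in zip(weights, centres, widths))
```

In the adaptive setting the abstract describes, the lists of centres and widths would be regrown and the weights refitted each time a new failure time arrives.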
Abstract: Software maintenance, and mainly software comprehension, accounts for the largest costs in the software lifecycle. In order to assess the cost of software comprehension, various complexity measures have been proposed in the literature. This paper
proposes new cognitive-spatial complexity measures, which combine
the impact of spatial as well as architectural aspect of the software to
compute the software complexity. The spatial aspect of the software
complexity is taken into account using the lexical distances (in
number of lines of code) between different program elements and the
architectural aspect of the software complexity is taken into
consideration using the cognitive weights of control structures
present in the control flow of the program. The proposed measures are evaluated using standard axiomatic frameworks and then compared with the corresponding existing cognitive complexity measures as well as the spatial complexity measures for object-oriented software. This study establishes that the
proposed measures are better indicators of the cognitive effort
required for software comprehension than the other existing
complexity measures for object-oriented software.
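The combination of the two aspects the abstract describes can be sketched as follows: the lexical distance (in lines of code) between each use of a program element and its definition, weighted by a cognitive weight for the enclosing control structure. The weight table (sequence = 1, branch = 2, loop = 3) follows common cognitive-weight conventions and, like the averaging, is an assumption here rather than the paper's exact formula.

```python
# Illustrative cognitive-spatial measure: mean cognitively weighted
# lexical distance between definitions and uses of program elements.
COGNITIVE_WEIGHTS = {"sequence": 1, "branch": 2, "loop": 3}

def cognitive_spatial_complexity(uses):
    """`uses` is a list of (def_line, use_line, structure) tuples,
    where `structure` names the control structure enclosing the use."""
    total = sum(abs(use_line - def_line) * COGNITIVE_WEIGHTS[structure]
                for def_line, use_line, structure in uses)
    return total / len(uses)
```

A variable defined on line 1, used on line 5 in straight-line code and on line 11 inside a loop, would score (4x1 + 10x3) / 2 = 17 under this sketch.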