Abstract: Cloud Computing is a new technology that helps us use the Cloud to meet our computation needs. The Cloud refers to a scalable network of computers that work together, much like the Internet. An important element of Cloud Computing is that we shift the processing, managing, storing and implementing of our data from local machines into the Cloud, which improves efficiency. Because it is a new technology, it has both advantages and disadvantages, which are scrutinized in this article. Some pioneers of this technology are then studied. We conclude that Cloud Computing will play an important role in our future lives.
Abstract: Since the majority of faults are found in a few of a system's modules, there is a need to investigate the modules that are most severely affected and to carry out proper maintenance in time, especially for critical applications. Neural networks have already been applied in software engineering to build reliability growth models and to predict gross change or reusability metrics. Neural networks are sophisticated non-linear modeling techniques that are able to model complex functions; they are used when the exact relationship between inputs and outputs is not known, and a key feature is that they learn this relationship through training. In the present work, various neural-network-based techniques are explored and a comparative analysis is performed for predicting the level of maintenance needed, by predicting the severity level of the faults present in NASA's public domain defect dataset. The different algorithms are compared on the basis of Mean Absolute Error, Root Mean Square Error and accuracy values. It is concluded that the Generalized Regression Network is the best algorithm for classifying software components into different levels of fault-impact severity. The algorithm can be used to develop a model for identifying modules that are heavily affected by faults.
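The abstract compares algorithms by Mean Absolute Error, Root Mean Square Error and accuracy; as a hedged illustration (not the authors' implementation), these three metrics can be computed as follows, with invented severity labels for six hypothetical modules:

```python
import math

def mae(actual, predicted):
    # Mean Absolute Error: average magnitude of the prediction errors
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    # Root Mean Square Error: penalizes large errors more heavily than MAE
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def accuracy(actual, predicted):
    # Fraction of severity levels predicted exactly
    return sum(1 for a, p in zip(actual, predicted) if a == p) / len(actual)

# Hypothetical fault-severity levels (e.g. 1 = low ... 4 = critical) for six modules
actual    = [1, 3, 2, 4, 2, 1]
predicted = [1, 2, 2, 4, 3, 1]

print(mae(actual, predicted))       # 0.333...
print(accuracy(actual, predicted))  # 0.666...
```

A lower MAE/RMSE and a higher accuracy indicate a better classifier, which is the basis on which the abstract ranks the Generalized Regression Network first.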
Abstract: The purpose of this work is the measurement of the system presampling MTF of a variable resolution x-ray (VRX) CT scanner. In this paper, we used the parameters of an actual VRX CT scanner to simulate and study the effect of different focal spot sizes on the system presampling MTF using the Monte Carlo method (GATE simulation software). A focal spot size of 0.6 mm limited the spatial resolution of the system to 5.5 cy/mm at incident angles below 17° for cell#1. With a focal spot size of 0.3 mm, the spatial resolution increased up to 11 cy/mm, and the limiting effect of the focal spot size appeared at incident angles below 9°. The 0.3 mm focal spot size could therefore improve the spatial resolution to some extent, but because of magnification non-uniformity there is a 10 cy/mm difference between the spatial resolution of cell#1 and cell#256. A focal spot size of 0.1 mm acted as an ideal point source for this system: the spatial resolution increased to more than 35 cy/mm, and at all incident angles the spatial resolution was a function of the incident angle. Moreover, the 0.1 mm focal spot size minimized the effect of magnification non-uniformity.
Abstract: As seen in the literature, about 70% of improvement initiatives fail, and a significant number do not even get started. This paper analyses the problem of failing Software Process Improvement (SPI) initiatives and proposes good practices, supported by motivational tools, that can help minimize failures. It elaborates on the hypothesis that human factors are poorly addressed by deployers, especially because implementation guides usually emphasize only technical factors. This research was conducted with SPI deployers and analyses 32 SPI initiatives. The results indicate that, although human factors are not commonly highlighted in guidelines, successful initiatives usually address them implicitly. This research shows that practices based on human factors indeed play a crucial role in successful implementations of SPI; it proposes change management as a theoretical framework for introducing those practices in the SPI context and suggests some motivational tools, based on SPI deployers' experience, to support it.
Abstract: Virtual Assembly (VA) is one of the key technologies in advanced manufacturing. It is a promising application of virtual reality in design and manufacturing and has drawn much interest from industry and research institutes over the last two decades. This paper describes a process for integrating an interactive Virtual Reality-based assembly simulation of a digital mockup with the CAD/CAM infrastructure. The necessary hardware and software preconditions for the process are explained so that it can easily be adopted by non-VR experts. The article outlines how assembly simulation can improve CAD/CAM procedures and structures, how CAD model preparation has to be carried out, and which virtual environment requirements have to be fulfilled. The issue of data transfer is also explained, along with other challenges and requirements such as anti-aliasing and collision detection. Finally, a VA simulation has been carried out for a ball valve assembly and a car door assembly with the help of the Vizard virtual reality toolkit in a semi-immersive environment, and their performance has been analyzed on different workstations to evaluate the importance of the graphics processing unit (GPU) in the field of VA.
Abstract: The policies governing the business of any
organization are well reflected in her business rules. The business
rules are implemented by data validation techniques, coded during
the software development process. Any change in business
policies results in change in the code written for data validation
used to enforce the business policies. Implementing the change in
business rules without changing the code is the objective of this
paper. The proposed approach enables users to create rule sets at
run time once the software has been developed. The newly defined
rule sets by end users are associated with the data variables for
which the validation is required. The proposed approach facilitates
the users to define business rules using all the comparison
operators and Boolean operators. Multithreading is used to
validate the data entered by end user against the business rules
applied. The evaluation of the data is performed by a newly
created thread using an enhanced form of the RPN (Reverse Polish
Notation) algorithm.
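The paper's enhanced RPN algorithm is not reproduced in the abstract, but a minimal sketch of standard RPN evaluation for a business rule such as `age >= 18 AND salary > 3000` (the variable names and thresholds are invented for illustration) might look like this:

```python
def eval_rpn(tokens, variables):
    """Evaluate a business rule given in Reverse Polish (postfix) Notation.

    tokens: list of operands (variable names or numeric literals) and operators.
    variables: mapping of variable names to the values entered by the end user.
    """
    ops = {
        '>':  lambda a, b: a > b,    '>=': lambda a, b: a >= b,
        '<':  lambda a, b: a < b,    '<=': lambda a, b: a <= b,
        '==': lambda a, b: a == b,   '!=': lambda a, b: a != b,
        'AND': lambda a, b: a and b, 'OR': lambda a, b: a or b,
    }
    stack = []
    for tok in tokens:
        if tok in ops:
            b, a = stack.pop(), stack.pop()  # right operand was pushed last
            stack.append(ops[tok](a, b))
        elif tok in variables:
            stack.append(variables[tok])
        else:
            stack.append(float(tok))
    return stack.pop()

# "age >= 18 AND salary > 3000" written in postfix form
rule = ['age', '18', '>=', 'salary', '3000', '>', 'AND']
print(eval_rpn(rule, {'age': 25, 'salary': 3500}))  # True
print(eval_rpn(rule, {'age': 16, 'salary': 3500}))  # False
```

In the paper's approach this evaluation is performed by a newly created thread; threading is omitted here to keep the sketch focused on the RPN pass itself.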
Abstract: This paper presents the results of three senior capstone projects at the Department of Computer Engineering, Prince of Songkla University, Thailand. These projects focus on developing an examination management system for the Faculty of Engineering, in order to manage both the examination room assignments and the examination proctor assignments in each room. The current version of the software is a web-based application. The developed software allows the examination proctors to select their scheduled time online, while each subject is assigned to an available examination room according to its type and the room capacity. The developed system is evaluated with real data by prospective users, and the testers give several suggestions for further improvement. Even though the features of the developed software are not superior, the development process can serve as a case study for a project-based teaching style. Furthermore, the process of developing this software highlights several issues in building an educational support application.
Abstract: An inadequate curriculum for software engineering is considered to be one of the most common software risks. A number of solutions for improving Software Engineering Education (SEE) have been reported in the literature, but there is a need to collect these solutions in one place. We have performed a mapping study to present a broad view of the literature published on improving the current state of SEE. Our aim is to give academicians, practitioners and researchers an international view of the current state of SEE. Our study identified 70 primary studies that met our selection criteria, which we further classified and categorized within a well-defined software engineering educational framework. We found that the most researched category within the SE educational framework is Innovative Teaching Methods, whereas the least research was found in the Student Learning and Assessment category. Our future work is to conduct a Systematic Literature Review on SEE.
Abstract: We have developed a computer program consisting of 6 subtests assessing children's hand dexterity, applicable in rehabilitation medicine. We carried out a normative study on a representative sample of 285 children aged 7 to 15 (mean age 11.3) and proposed clinical standards for three age groups (7-9, 9-11, 12-15 years). We showed the statistical significance of the differences among the corresponding mean values of task completion time. We also found a strong correlation between task completion time and the age of the subjects, and we performed test-retest reliability checks on a sample of 84 children, which gave high Pearson coefficients for the dominant and non-dominant hand, in the ranges 0.74-0.97 and 0.62-0.93, respectively.
A new MATLAB-based programming tool aimed at the analysis of cardiologic RR intervals and blood pressure descriptors has been worked out as well. For each set of data, ten different parameters are extracted: 2 in the time domain, 4 in the frequency domain and 4 in Poincaré plot analysis. In addition, twelve different parameters of baroreflex sensitivity are calculated. All these data sets can be visualized in the time domain together with their power spectra and Poincaré plots. If available, the respiratory oscillation curves can also be plotted for comparison. Another application processes biological data obtained from BLAST analysis.
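The test-retest reliability figures reported above rest on the Pearson correlation coefficient; a minimal sketch of its computation follows, where the task-completion times are invented for demonstration and are not the study's data:

```python
import math

def pearson(x, y):
    # Pearson correlation coefficient between two equal-length samples
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical task-completion times (seconds) for the same five children,
# measured once at test and once at retest
test_times   = [12.1, 15.4, 9.8, 14.0, 11.2]
retest_times = [11.8, 15.0, 10.1, 13.6, 11.5]
print(round(pearson(test_times, retest_times), 2))
```

Values near 1.0, like the 0.74-0.97 range reported for the dominant hand, indicate that a child's score is highly reproducible between the two sessions.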
Abstract: This paper provides a framework that simultaneously incorporates into the traditional literature on capacitated facility location problems both a reliability issue, as a sign of disruption in distribution systems, and partial covering theory, as a response to limited coverage radii and economic preferences. As a result, we develop a bi-objective model based on discrete scenarios for expected cost minimization and demand coverage maximization through a three-echelon supply chain network, facilitating multiple capacity levels for the provider-side layers and imposing a gradual coverage function for the distribution centers (DCs). Additionally, besides aggregating the objectives to solve the model with the LINGO software, a branch of the LP-Metric method called the Min-Max approach is proposed, and different aspects of the corresponding model are explored.
Abstract: Risk management is an essential part of project management and plays a significant role in project success. Many failures associated with Web projects are the consequences of poor awareness of the risks involved and a lack of process models that can serve as a guideline for the development of Web-based applications; the contemporary process models devised so far target the development of conventional software. To circumvent this problem, this paper introduces WPRiMA (Web Project Risk Management Assessment), a tool used to implement RIAP, the risk identification architecture pattern model, which focuses on data from the proprietor's and vendor's perspectives. The paper also illustrates how the WPRiMA tool works and how it can be used to calculate the risk level for a given Web project, to generate recommendations that facilitate risk avoidance in a project, and to improve the prospects of early risk management.
Abstract: This paper proposes a new form of cloud computing for individual computer users to share applications in distributed communities, called community-based personal cloud computing (CPCC), and presents a prototype design and implementation of CPCC. The users of CPCC are able to share their computing applications with other users of the community, and any member of the community is able to execute remote applications shared by other members. The remote applications behave in the same way as their local counterparts, allowing the user to enter input and receive output, as well as providing access to the user's local data. CPCC provides a peer-to-peer (P2P) environment where each peer provides applications that can be used by the other peers connected to CPCC.
Abstract: The unsatisfactory effectiveness of software systems development and enhancement projects is one of the main reasons why software engineering tries to draw on experience from other engineering disciplines. In spite of the specificity of the software product and process, a belief has emerged that the execution of software projects could be more effective if these objects were subject to measurement, as is true in other engineering disciplines, for which measurement is an inherent feature. Thus, objective and reliable approaches to the measurement of software processes and products have been sought in software engineering for several decades. This may be proved, among others, by the current version of the CMMI for Development model. This paper analyzes the approach to software process and product measurement proposed in the latest version of this model, indicating the growing acceptance of this issue in software engineering.
Abstract: In today's fast-paced world, where everyone is short of time and works haphazardly, a similar scene is common on the roads. To avert the fatal consequences of speeding traffic on busy lanes, software that analyzes and keeps account of traffic and the resulting congestion is used in developed countries. This software has been implemented and used with the help of a support tool called the Critical Analysis Reporting Environment, of which two versions exist. The current research paper examines the issues and problems encountered while using these two versions in practice. Further, a hybrid architecture is proposed that retains the quality and performance of both versions and is better in terms of component coupling, maintainability and many other features.
Abstract: This paper examines the modeling and analysis of a
cruise control system using a Petri net based approach, task graphs,
invariant analysis and behavioral properties. It shows how the
structures used can be verified and optimized.
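The paper's Petri net models are not reproduced in the abstract, but as a hedged illustration of the invariant analysis it mentions: a place invariant x of a Petri net satisfies x·C = 0, where C is the incidence matrix. The tiny two-mode net below (engage/disengage toggling a cruise-control token) is invented for demonstration and is not the paper's model:

```python
# A minimal two-place, two-transition Petri net: one token cycles between
# the places "cruise_on" and "cruise_off" via the transitions "engage"
# and "disengage". Rows = places, columns = transitions;
# C[p][t] = net tokens added to place p when transition t fires.
C = [
    [+1, -1],   # cruise_on:  engage adds a token, disengage removes it
    [-1, +1],   # cruise_off: engage removes a token, disengage adds it
]

def is_place_invariant(x, C):
    # x is a place invariant iff x . C == 0 for every transition (column)
    return all(sum(x[p] * C[p][t] for p in range(len(C))) == 0
               for t in range(len(C[0])))

x = [1, 1]  # weight 1 on each place: the total token count is conserved
print(is_place_invariant(x, C))  # True: the system is always in exactly one mode
```

Such invariants are what behavioral verification of a net checks mechanically: here the invariant proves the controller can never be in both modes, or in neither, at once.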
Abstract: Decision Support Systems (DSSs) are interactive software systems built to assist the management of an organization in the decision making process when faced with non-routine problems in a specific application domain. Non-functional requirements (NFRs) for a DSS deal with the desirable qualities and restrictions that the DSS functionalities must satisfy. Unlike functional requirements, which are tangible functionalities provided by the DSS, NFRs are often hidden and transparent to DSS users, yet they affect the quality of the provided functionalities. NFRs are often overlooked or added to the system later in an ad hoc manner, leading to poor overall system quality. In this paper, we discuss the development of NFRs as part of the requirements engineering phase of the system development life cycle of DSSs. To help elicit NFRs, we provide a comprehensive taxonomy of NFRs for DSSs.
Abstract: Library management systems are commonly used in all education-related institutions. Many commercial products are available; however, many institutions may not be able to afford their cost, and an alternative solution in such situations is open source software. This paper focuses on reviewing the open source library management system packages currently available. The review focuses on the ability to perform four basic components: traditional services, interlibrary loan management, managing electronic materials, and common management functions such as security, alert systems and statistical reports. In addition, the environment, basic requirements and supporting aspects of each open source package are also mentioned.
Abstract: The cost of developing software from scratch can be saved by identifying and extracting reusable components from already developed, existing software systems or legacy systems [6]. However, the issue of how to identify reusable components in existing systems has remained relatively unexplored. We have used a metric-based approach for characterizing a software module. In the present work, McCabe's Cyclomatic Complexity Measure for complexity measurement, the Regularity Metric, the Halstead Software Science Indicator for volume indication, the Reuse Frequency metric and the Coupling Metric values of a software component are used as input attributes to different types of neural network systems, and the reusability of the software component is calculated. The results are recorded in terms of Accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE).
Abstract: In cryptography, confusion and diffusion are very important for achieving the confidentiality and privacy of messages in block ciphers and stream ciphers. There are two types of network that provide the confusion and diffusion properties of a message in block ciphers: the Substitution-Permutation network (S-P network) and the Feistel network. NLFS (Non-Linear Feedback Stream cipher) is a fast and secure stream cipher for software applications; NLFS has two modes, a basic (synchronous) mode and a self-synchronous mode. Real random numbers are non-deterministic. The R-box (random box) is based on dynamic properties and performs a stochastic transformation of data, which can be used effectively to meet the challenge of protecting information from intentional destructive impacts. In this paper, a new implementation of the stochastic transformation is proposed.
Abstract: For the past couple of decades, weak signal detection has been of crucial importance in various engineering and scientific applications, including wireless communication, radar, aerospace engineering, control systems and many others. Usually, weak signal detection requires a phase-sensitive detector and a demodulation module to detect and analyze the signal. This article gives an introduction to an intrusion detection system that can effectively detect a weak signal within a multiplexed signal. By carefully inspecting and analyzing the respective signal, this system can successfully indicate any peripheral intrusion. The intrusion detection system (IDS) is a comprehensive and easy approach to detecting and analyzing any signal that is weakened and garbled due to a low signal-to-noise ratio (SNR). This approach is of significant importance in applications such as peripheral security systems.