Abstract: Software fault prediction models are built from source code, metrics computed from the same or a previous version of the code, and the related fault data. Some companies do not store and track all the artifacts required for software fault prediction. To construct a fault prediction model for such companies, training data from other projects is one potential solution. The earlier a fault is predicted, the less it costs to correct. The training data consist of metrics data and the related fault data at the function/module level. This paper investigates early-stage fault prediction using cross-project data, focusing on design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can serve as an initial guideline for projects where no previous fault data are available. We analyze seven datasets from the NASA Metrics Data Program, which offer design as well as code metrics. Overall, the results of cross-project learning are comparable to those of within-company learning.
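The cross-project setup described above can be sketched in a few lines: a Gaussian Naïve Bayes classifier trained on the metrics of one project and applied to the modules of another. The classifier below is a minimal from-scratch sketch, and the metric values are synthetic placeholders purely for illustration — the paper itself uses the NASA MDP datasets.

```python
import math

class GaussianNB:
    """Minimal Gaussian Naive Bayes, as commonly used for fault prediction."""
    def fit(self, X, y):
        self.classes = sorted(set(y))
        self.stats = {}
        for c in self.classes:
            rows = [x for x, label in zip(X, y) if label == c]
            n = len(rows)
            means = [sum(col) / n for col in zip(*rows)]
            varis = [sum((v - m) ** 2 for v in col) / n + 1e-9
                     for col, m in zip(zip(*rows), means)]
            self.stats[c] = (means, varis, n / len(X))
        return self

    def _loglik(self, x, c):
        means, varis, prior = self.stats[c]
        ll = math.log(prior)
        for v, m, s2 in zip(x, means, varis):
            ll += -0.5 * math.log(2 * math.pi * s2) - (v - m) ** 2 / (2 * s2)
        return ll

    def predict(self, X):
        return [max(self.classes, key=lambda c: self._loglik(x, c)) for x in X]

# Hypothetical design metrics per module (e.g. fan-in, fan-out, complexity)
# for "project A" (training) -- values are synthetic, for illustration only.
train_X = [[2, 3, 1], [1, 2, 1], [2, 2, 2], [9, 8, 7], [8, 9, 6], [10, 7, 8]]
train_y = [0, 0, 0, 1, 1, 1]          # 0 = non-faulty, 1 = faulty
model = GaussianNB().fit(train_X, train_y)
# Modules of "project B", predicted with the cross-project model.
print(model.predict([[1, 3, 2], [9, 9, 7]]))   # -> [0, 1]
```

In the cross-project scenario of the paper, `train_X`/`train_y` would come from one NASA MDP project and the predicted modules from another.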
Abstract: In educational technology, the idea of innovation is
usually tethered to contemporary technological inventions and
emerging technologies. Yet, using long-known technologies in ways
that are pedagogically or experientially new can reposition them as
emerging educational technologies. In this study we explore how a
subtle pivot in pedagogical thinking led to an innovative education
technology. We describe the design and implementation of an online
writing tool that scaffolds students in the evaluation of their own
informational texts. We consider how pathways to innovation can emerge from such pivots: leveraging longstanding practices in novel ways has the potential to cultivate new opportunities for learning. We first unpack Infowriter in terms of its design, then we
describe some results of a study in which we implemented an
intervention which included our designed application.
Abstract: Transportation of long turbine blades from one place
to another is a difficult process. Hence, a feasibility study of the modularization of a wind turbine blade was undertaken from a structural standpoint through finite element analysis. Initially, a non-segmented blade is modeled and its structural behavior is evaluated to serve as a reference. The resonant, static bending, and fatigue tests are simulated in accordance with the IEC 61400-23 standard for comparison purposes.
The non-segmented test blade is separated at a suitable location based on trade-off studies, and the segments are joined with an innovative
double strap bonded joint configuration. The adhesive joint is
modeled by adopting cohesive zone modeling approach in ANSYS.
The developed blade model is analyzed for its structural response
through simulation. The performance of both blades is found to be similar, which indicates that efficient segmentation of the long blade is possible, facilitating easy transportation of the blades and on-site reassembly. The selected segmentation location and the adopted joint configuration have resulted in an efficient segmented blade model, which shows that the methodology adopted for segmentation was quite effective. The developed segmented blade appears to be a viable alternative considering its structural response, specifically in fatigue, within the considered assumptions.
Abstract: The paper deals with the classical fiber bundle model
of equal load sharing, sometimes referred to as the Daniels’ bundle
or the democratic bundle. Daniels formulated a multidimensional
integral and also a recursive formula for evaluation of the
strength cumulative distribution function. This paper describes
three algorithms for evaluation of the recursive formula and also
their implementations with source codes in the Python high-level
programming language. A comparison of the algorithms is provided with respect to execution time. An analysis of the orders of magnitude of the addends in the recursion is also provided.
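For readers unfamiliar with the recursion, one common statement of Daniels' recursive formula for the bundle strength CDF is G_n(x) = Σ_{r=1}^{n} (−1)^{r+1} C(n,r) F(x)^r G_{n−r}(nx/(n−r)), with G_0 ≡ 1, where x is the load per fiber and F the single-fiber strength CDF. The memoised sketch below is an illustration of that formula, not one of the paper's three implementations; the uniform fiber-strength distribution is an invented example.

```python
from functools import lru_cache
from math import comb

def bundle_cdf(n, x, F):
    """Daniels' recursion for the strength CDF G_n(x) of an equal-load-
    sharing (democratic) bundle of n fibers; x is the load per fiber and
    F the single-fiber strength CDF."""
    @lru_cache(maxsize=None)
    def G(m, load):
        if m == 0:
            return 1.0
        total = 0.0
        for r in range(1, m + 1):
            # the m - r surviving fibers share the load of the r broken ones
            tail = G(m - r, m * load / (m - r)) if m > r else 1.0
            # alternating signs: individual addends can be orders of
            # magnitude larger than the final sum (hence the paper's
            # analysis of the addends' magnitudes)
            total += (-1) ** (r + 1) * comb(m, r) * F(load) ** r * tail
        return total

    return G(n, x)

# Fiber strengths uniform on [0, 1] -- an illustrative choice only
F = lambda s: min(max(s, 0.0), 1.0)
print(bundle_cdf(2, 0.25, F))   # 2*F(0.25)*F(0.5) - F(0.25)**2 = 0.1875
```

The severe cancellation between addends of alternating sign is exactly why execution time and numerical behavior of different evaluation strategies are worth comparing, as the paper does.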
Abstract: In this contribution two approaches for calculating
optimal trajectories for highly automated vehicles are presented and
compared. The first one is based on a non-linear vehicle model, used
for evaluation. The second one is based on a simplified model and
can be implemented on a current ECU. In usual driving situations
both approaches show very similar results.
Abstract: All current experimental methods for determination of
stress intensity factors are based on the assumption that the state of
stress near the crack tip is plane stress. Therefore, these methods rely
on strain and displacement measurements made outside the near-crack-tip region affected by three-dimensional effects or by the process zone. In this paper, we develop and validate an experimental
procedure for the evaluation of stress intensity factors from the
measurements of the out-of-plane displacements in the surface area
controlled by 3D effects. The evaluation of stress intensity factors is
possible when the process zone is sufficiently small and the displacement field generated by the 3D effects is fully encapsulated by the K-dominance region.
Abstract: Microscopic simulation toolkits allow for consideration of the two processes of railway operations and the preceding timetable production. Block occupation conflicts on both
process levels are often solved by using defined train priorities. These
conflict resolutions (dispatching decisions) generate reactionary
delays to the involved trains. The sum of reactionary delays is
commonly used to evaluate the quality of railway operations, which
describes the timetable robustness. It is either compared to an
acceptable train performance or the delays are appraised
economically by linear monetary functions. It is impossible to
adequately evaluate dispatching decisions without a well-founded
objective function. This paper presents a new approach for the
evaluation of dispatching decisions. The approach uses mode choice
models and considers the behaviour of the end-customers. These
models evaluate the reactionary delays in more detail and consider
other competing modes of transport. The new approach pursues the
coupling of a microscopic model of railway operations with the
macroscopic mode choice model. At first, it will be implemented for the
railway operations process but it can also be used for timetable
production. The evaluation considers the possibility for the customer
to interchange to other transport modes. The new approach initially considers rail and road, but it can also be extended to air travel. The
result of mode choice models is the modal split. The reactions by the
end-customers have an impact on the revenue of the train operating
companies. Different travel purposes imply different levels of willingness to pay and different tolerances of late running. Aside from changes to
revenues, longer journey times can also generate additional costs.
The costs are either time- or track-specific and arise from required
changes to rolling stock or train crew cycles. Only the variable values
are summarised in the contribution margin, which is the base for the
monetary evaluation of delays. The contribution margin is calculated
for different possible solutions to the same conflict. The conflict
resolution is optimised until the monetary loss becomes minimal. The
iterative process therefore determines an optimum conflict resolution
by monitoring the change to the contribution margin. Furthermore, a
monetary value of each dispatching decision can also be derived.
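As a minimal illustration of the coupling described above, the sketch below uses a binomial logit mode choice model to turn a reactionary delay into a shift in modal split, then prices that shift through a contribution margin. All utilities, passenger numbers, fares, and costs are hypothetical placeholders, not values from the paper.

```python
import math

def modal_split(utilities):
    """Multinomial logit: each mode's share from its (dis)utility."""
    exps = {mode: math.exp(u) for mode, u in utilities.items()}
    z = sum(exps.values())
    return {mode: e / z for mode, e in exps.items()}

def contribution_margin(rail_share, passengers, fare, variable_cost):
    """Variable revenue minus variable cost for the train operator."""
    return rail_share * passengers * fare - variable_cost

# Hypothetical utilities: a reactionary delay lowers rail's utility by 0.3.
base    = modal_split({"rail": 0.5,       "road": 0.0})
delayed = modal_split({"rail": 0.5 - 0.3, "road": 0.0})

m_base    = contribution_margin(base["rail"],    1000, 20.0, 4000.0)
m_delayed = contribution_margin(delayed["rail"], 1000, 20.0, 4000.0)
# The drop in contribution margin is a monetary value for this delay,
# which a dispatcher could compare across conflict resolutions.
print(round(m_base - m_delayed, 2))
```

In the paper's iterative scheme, such a margin would be recomputed for each candidate conflict resolution until the monetary loss is minimal.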
Abstract: This paper presents the development of a robot car
that can track the motion of an object by detecting its color through
an Android device. The employed computer vision algorithm uses the
OpenCV library, which is embedded into an Android application of a
smartphone, for manipulating the captured image of the object. The
captured image of the object is subjected to color conversion and is
transformed to a binary image for further processing after color
filtering. The desired object is clearly determined after removing
pixel noise by applying image morphology operations and contour
definition. Finally, the area and the center of the object are determined so that the object's motion can be tracked. The smartphone application has been mounted on a robot car and transmits the motion directives via Bluetooth to an Arduino assembly so that the car follows objects of a specified color. The experimental evaluation of the
proposed algorithm shows reliable color detection and smooth
tracking characteristics.
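The last two steps of the pipeline — isolating pixels of the target color and computing the object's area and center — can be sketched without OpenCV on a toy frame (in the real system, cv2.inRange, the morphology operations, and contour moments play these roles; the frame and tolerance below are invented for illustration):

```python
def track_colored_object(image, target, tol=30):
    """Binary color filter followed by area/centroid computation.
    `image` is a list of rows of (R, G, B) tuples."""
    area, sx, sy = 0, 0, 0
    for y, row in enumerate(image):
        for x, px in enumerate(row):
            # crude per-channel color filter -> binary mask membership
            if all(abs(c - t) <= tol for c, t in zip(px, target)):
                area += 1
                sx += x
                sy += y
    if area == 0:
        return 0, None            # no object of the target color in view
    return area, (sx / area, sy / area)

# 4x4 toy frame with a 2x2 red patch in the lower-right corner
red, black = (255, 0, 0), (0, 0, 0)
frame = [[black, black, black, black],
         [black, black, black, black],
         [black, black, red,   red],
         [black, black, red,   red]]
print(track_colored_object(frame, red))   # -> (4, (2.5, 2.5))
```

The returned center would drive the motion directives sent to the Arduino: steer toward the centroid, and use the area as a proxy for distance.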
Abstract: Seismic risk mitigation concerning the old building stock is truly essential in Algerian urban areas, particularly those located in seismic-prone regions, such as Annaba city, where the old buildings present high levels of degradation associated with an absence of seismic strengthening and/or rehabilitation concerns. In this sense, the present paper addresses the issue of the
seismic vulnerability assessment of old masonry building stocks
through the adaptation of a simplified methodology developed for a
European context area similar to that of Annaba city, Algeria.
Therefore, this method is used for the first level of seismic
vulnerability assessment of the masonry buildings stock of the old
city center of Annaba. This methodology is based on a vulnerability
index that is suitable for the evaluation of damage and for the
creation of large-scale loss scenarios. Over 380 buildings were
evaluated in accordance with the referred methodology and the
results obtained were then integrated into a Geographical Information
System (GIS) tool. Such results can be used by the Annaba city council to support management decisions based on a global view of the site under analysis, leading to more accurate and faster decisions for risk mitigation strategies and rehabilitation plans.
Abstract: Design concepts of a real-time embedded system can be realized initially by introducing novel design approaches. In this work, a model-based design approach and in-the-loop testing were employed early in the conceptual and preliminary phases to formulate design requirements and perform quick real-time verification. The design and analysis methodology includes simulation analysis, model-based testing, and in-the-loop testing. The design of a conceptual drive-by-wire (DBW) algorithm for an electronic control unit (ECU) was
presented to demonstrate the conceptual design process, analysis, and
functionality evaluation. The concepts of DBW ECU function can be
implemented in the vehicle system to improve electric vehicle, or EV,
conversion drivability. However, within a new development process, conceptual ECU functions and parameters need to be evaluated. As a result, the testing system was employed to support the evaluation of conceptual DBW ECU functions. For the current setup, the system components consisted of actual DBW ECU hardware, electric vehicle models, and the controller area network (CAN) protocol. The
vehicle models and CAN bus interface were both implemented as
real-time applications where ECU and CAN protocol functionality
were verified according to the design requirements. The proposed system could potentially benefit rapid real-time analysis of design parameters for conceptual system or software algorithm development.
Abstract: We evaluate the performance of a numerical method
for global optimization of expensive functions. The method uses a response surface to guide the search for the global optimum. This metamodel could be based on radial basis functions, kriging, or a combination of different models. We discuss how to set the cyclic parameters of the optimization method to achieve a balance between local and global search. We also discuss the potential problem of Runge oscillations in the response surface.
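As a minimal illustration of such a metamodel, the sketch below fits a one-dimensional polyharmonic (linear) RBF interpolant through a handful of samples of an "expensive" function. The sample points and the function are invented for illustration, and the paper's metamodels (RBFs, kriging, or combinations) generalize this idea to higher dimensions and other basis functions.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def rbf_surface(xs, ys):
    """Linear polyharmonic RBF interpolant s(x) = sum_j w_j |x - x_j|
    through the sampled (expensive) function values -- the cheap
    metamodel that guides the search for the optimum."""
    A = [[abs(xi - xj) for xj in xs] for xi in xs]
    w = solve(A, ys)
    return lambda x: sum(wj * abs(x - xj) for wj, xj in zip(w, xs))

# A few samples of an "expensive" function f(x) = (x - 2)^2 (illustrative)
xs = [0.0, 1.0, 3.0, 4.0]
s = rbf_surface(xs, [(x - 2) ** 2 for x in xs])
print(round(s(1.0), 6))   # the surface interpolates the samples exactly
```

In a surrogate-based optimizer, candidate points that minimize s (or trade off its value against distance from known samples, per the cyclic parameters the paper discusses) are evaluated on the true function and added to the sample set.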
Abstract: Scheduling and mapping of tasks onto a set of processors is considered a critical problem in parallel and distributed computing systems. This paper deals with the problem of dynamic scheduling on a special type of multiprocessor architecture known as the Linear Crossed Cube (LCQ) network. This multiprocessor is a hybrid network that combines the features of both linear and cube-based architectures. Two standard dynamic scheduling schemes, namely Minimum Distance Scheduling (MDS) and Two Round Scheduling (TRS), are implemented on the LCQ network. Parallel tasks are mapped and the load imbalance is evaluated on different sets of processors in the LCQ network. The simulation results are evaluated, and a thorough analysis of the results is made to obtain the best solution for the given network in terms of residual load imbalance and execution time. Other performance metrics, such as speedup and efficiency, are also evaluated with the given dynamic algorithms.
Abstract: This study is presented to reduce earthquake damage to, and to support emergency rehabilitation of, critical structures such as schools, high-tech factories, and hospitals subjected to strong ground motions associated with climate change. In recent events, a strong earthquake causes serious damage to critical structures, and the damaged structure may then be affected by a sequence of aftershocks (or a tsunami) due to fault plane adjustments. Therefore, in order to improve the seismic performance of critical structures, the retrofitting or strengthening of structures under aftershock sequences, following emergency rehabilitation of structures subjected to strong earthquakes, is widely studied. Consequently, this study used composite materials for emergency rehabilitation of the structure, rather than concrete and steel, because of their high strength and stiffness, light weight, rapid manufacturing, and dynamic performance. This study also aimed to develop and improve the seismic performance and seismic retrofit of critical structures subjected to strong ground motions and earthquake aftershocks by utilizing GFRP-Corrugated Infill Panels (GCIP).
Abstract: A central element of higher education today is the
“core” or “general education” curriculum: that configuration of
courses that often encompasses the essence of liberal arts education.
Ensuring that such offerings reflect the mission and values of the
institution is a challenge faced by most colleges and universities, often
more than once. This paper presents an action model of program
planning designed to structure the processes of developing,
implementing and revising core curricula in a manner consistent with
key institutional goals and objectives. Through presentation of a case
study from a university in the United States, the elements of needs
assessment, stakeholder investment and collaborative compromise
are shown as key components of a planning strategy that can produce
a general education program that is comprehensive, academically
rigorous, assessable, and mission consistent. The paper concludes
with recommendations for both the implementation and evaluation of
such programs in practice.
Abstract: Every year, a considerable amount of money is being
invested in research, mainly in the form of funding allocated to
universities and research institutes. To better distribute the available
funds and to set the most proper R&D investment strategies for the
future, evaluation of the productivity of the funded researchers and
the impact of such funding is crucial. In this paper, using the data on
15 years of journal publications of the NSERC (Natural Sciences and
Engineering Research Council of Canada) funded researchers and by
means of bibliometric analysis, the scientific development of the
funded researchers and their scientific collaboration patterns will be
investigated in the period of 1996-2010. According to the results it
seems that there is a positive relation between the average level of
funding and the quantity and quality of the scientific output. In addition, whenever the funding allocated to researchers has increased, the number of co-authors per paper has also increased. Hence, an increase in the level of funding may enable researchers to get involved in larger projects and/or scientific teams and thereby increase their scientific output.
Abstract: Non-contact evaluation of the thickness of paint coatings can be attempted by different destructive and nondestructive methods such as cross-section microscopy, gravimetric mass measurement, magnetic gauges, eddy current, ultrasound, or terahertz techniques. Infrared thermography is a nondestructive and non-invasive
method that can be envisaged as a useful tool to measure the surface
thickness variations by analyzing the temperature response. In this
paper, the thermal quadrupole method for two-layered samples heated with a pulsed excitation is first used. By analyzing the thermal
responses as a function of thermal properties and thicknesses of both
layers, optimal parameters for the excitation source can be identified.
Simulations show that a pulsed excitation with a duration of ten milliseconds makes it possible to obtain a substrate-independent thermal response. Based on this result, an experimental setup consisting of a
near-infrared laser diode and an Infrared camera was next used to
evaluate the variation of paint coating thickness between 60 μm and
130 μm on two samples. Results show that the parameters extracted from the thermal images are correlated with the thicknesses estimated by the eddy current method. Laser pulsed thermography is thus an
interesting alternative nondestructive method that can be moreover
used for nonconductive substrates.
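For context, one common statement of the quadrupole formalism for a layered sample — following the standard quadrupole literature rather than the paper's own notation — relates front-face and rear-face temperature θ and heat flux φ in the Laplace domain (variable p) through a matrix per layer (thickness e_i, conductivity λ_i, diffusivity a_i), with the two-layer coating-plus-substrate case obtained by matrix multiplication:

```latex
\begin{pmatrix}\theta_{\mathrm{front}}\\ \varphi_{\mathrm{front}}\end{pmatrix}
= M_1 M_2
\begin{pmatrix}\theta_{\mathrm{rear}}\\ \varphi_{\mathrm{rear}}\end{pmatrix},
\qquad
M_i=\begin{pmatrix}
\cosh(k_i e_i) & \dfrac{\sinh(k_i e_i)}{\lambda_i k_i}\\[6pt]
\lambda_i k_i \sinh(k_i e_i) & \cosh(k_i e_i)
\end{pmatrix},
\qquad k_i=\sqrt{p/a_i}.
```

Writing M_1 M_2 = [[A, B], [C, D]], a pulse of energy Q on the front face with an adiabatic rear face (φ_rear = 0) gives the front-face response θ_front(p) = Q·A(p)/C(p); the sensitivity of this response to e_1 and the layer properties is what allows an excitation duration to be chosen that makes the measured response substrate-independent.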
Abstract: Grid is an environment with millions of resources
which are dynamic and heterogeneous in nature. A computational
grid is one in which the resources are computing nodes, meant for applications that involve large computations. A scheduling algorithm is said to be efficient only if it performs good resource allocation even in the case of resource failure. Resource
allocation is a tedious issue since it has to consider several
requirements such as system load, processing cost and time, user’s
deadline, and resource failure. This work designs a resource allocation algorithm that is cost-effective and also targets load balancing, fault tolerance, and user satisfaction by considering
the above requirements. The proposed Budget Constrained Load
Balancing Fault Tolerant algorithm with user satisfaction (BLBFT)
reduces the schedule makespan, schedule cost and task failure rate
and improves resource utilization. Evaluation of the proposed BLBFT algorithm is done using the GridSim toolkit, and the results are compared with algorithms that concentrate separately on each of these factors. The comparison results show that the proposed algorithm works better than its counterparts.
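The ingredients BLBFT combines — a budget constraint, load balancing, and fault tolerance — can be illustrated with a deliberately simple greedy allocator. This is not the paper's BLBFT algorithm; the resource names, speeds, costs, and task lengths are invented for the sketch.

```python
def allocate(tasks, resources, budget):
    """Greedy sketch: each task goes to an alive resource whose cost fits
    the remaining budget, choosing the one that finishes it earliest
    (load balancing).  Failed resources are skipped (fault tolerance)."""
    loads = {r["name"]: 0.0 for r in resources}   # busy time per resource
    plan, spent = [], 0.0
    for length in tasks:
        candidates = [r for r in resources
                      if not r["failed"]
                      and spent + length * r["cost"] <= budget]
        if not candidates:                 # budget exhausted or all failed
            plan.append(None)
            continue
        best = min(candidates,
                   key=lambda r: loads[r["name"]] + length / r["speed"])
        loads[best["name"]] += length / best["speed"]
        spent += length * best["cost"]
        plan.append(best["name"])
    makespan = max(loads.values())
    return plan, makespan, spent

resources = [
    {"name": "fast", "speed": 4.0, "cost": 2.0, "failed": False},
    {"name": "slow", "speed": 1.0, "cost": 1.0, "failed": False},
    {"name": "down", "speed": 8.0, "cost": 1.0, "failed": True},  # faulted
]
plan, makespan, spent = allocate([4, 4, 4, 4], resources, budget=40.0)
print(plan, makespan, spent)
```

A full algorithm like BLBFT would additionally weigh the user's deadline and satisfaction and reschedule on task failure; the sketch only shows how the three constraints interact in a single pass.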
Abstract: The principle of seismic performance evaluation methods is to provide a measure of the susceptibility of a building or set of buildings to damage by an earthquake. The common objective of many of these methods is to supply classification criteria. The purpose of this study is to present a method for assessing the seismic performance of structures based on the pushover method; we are particularly interested in reinforced concrete frame structures, which represent a significant percentage of the structures damaged after a seismic event. The work is based on the characterization of the seismic motion of the various earthquake zones in terms of PGA and PGD, obtained by means of the SIMQK_GR and PRISM software, and the correlation between the performance points and the scalar characterizing the earthquakes will be developed.
Abstract: This paper focuses on the CFD simulation of a radiaxial pump (i.e. a mixed-flow pump) with the aim of detecting the reasons for Y-Q characteristic instability. The main causes of pressure pulsations were detected by means of an analysis of the velocity and pressure fields within the pump, combined with a theoretical approach. Consequently, modifications of the spiral case and the pump suction area were made based on knowledge of the flow conditions and the shape of the dissipation function. The primary pump geometry was created as the base model serving for the comparison of the influence of the individual modifications. Basic experimental data are available for this geometry. This approach replaced the calculation for compressible liquid flow, which is more complicated and, with respect to the convergence of all computational tasks, more difficult. The modification of the primary pump consisted of inserting three types of fins. Subsequently, the evaluation of pressure pulsations, specific energy curves, and visualization of velocity fields were chosen as the criteria for a successful design.
Abstract: The use of energy dissipation systems for seismic applications has increased worldwide; thus it is necessary to develop practical and modern criteria for their optimal design. Here, a direct displacement-based seismic design approach for frame buildings with hysteretic energy dissipation systems (HEDS) is applied. The building is constituted by two individual structural systems: 1) a main elastic structural frame designed for service loads; and 2) a secondary system, corresponding to the HEDS, that controls the effects of lateral loads. The procedure involves controlling two design parameters: a) the stiffness ratio (α = Kframe/Ktotal system), and b) the strength ratio (γ = Vdamper/Vtotal system). The proposed damage-controlled approach contributes to the design of a more sustainable and resilient building because the structural damage is concentrated in the HEDS. The reduction of the design displacement spectrum is achieved by means of a recently published damping factor for elastic structural systems with HEDS located in Mexico City. Two limit states are verified: serviceability and near collapse. Instead of the traditional trial-and-error approach, a procedure that allows the designer to establish the preliminary sizes of the structural elements of both systems is proposed. The design methodology is applied to an 8-story steel building with buckling-restrained braces, located on the soft soil of Mexico City. With the aim of choosing the optimal design parameters, a parametric study is developed considering different values of α and γ. The simplified methodology is intended for preliminary sizing, design, and evaluation of the effectiveness of HEDS, and it constitutes a modern and practical tool that enables the structural designer to select the best design parameters.