Abstract: In this paper, a new concept of closed-loop design for a
product is presented. The closed-loop design model is developed by
integrating forward design and reverse design. Based on this new concept, a closed-loop design model for sustainable manufacturing is developed through the integrated evaluation of forward design, reverse design, and green manufacturing using a fuzzy analytic network process. In
the design stage of a product, with a given product requirement and
objective, there can be different ways to design the detailed
components and specifications. Therefore, there can be different
design cases to achieve the same product requirement and objective.
Subsequently, in the design evaluation stage, the different design cases must be analyzed and evaluated. The purpose of this research is to develop a model for evaluating the design cases by jointly evaluating the criteria in forward design, reverse design, and green manufacturing. A fuzzy analytic network process method is presented for the integrated evaluation of the criteria across the three groups. The
comparison matrices for evaluating the criteria in the three groups are
established. The total relational values among the three groups
represent the total relational effects. In applications, a super matrix
model is created and the total relational values can be used to evaluate
the design cases for decision-making to select the final design case. An example product demonstrates the approach, showing that the model is useful for the integrated evaluation of forward design, reverse design, and green manufacturing toward the objective of closed-loop design for sustainable manufacturing.
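The supermatrix step of the analytic network process can be illustrated with a small sketch. The matrix below is a hypothetical three-group example (the paper's actual comparison matrices and defuzzified fuzzy weights are not given in the abstract); the limit of the column-stochastic supermatrix yields the global priorities used to rank design cases.

```python
import numpy as np

# Hypothetical weighted supermatrix for three criteria groups
# (forward design, reverse design, green manufacturing); columns
# are stochastic, and the entries would come from fuzzy pairwise
# comparisons after defuzzification.
W = np.array([
    [0.2, 0.5, 0.3],
    [0.5, 0.2, 0.4],
    [0.3, 0.3, 0.3],
])

def limit_supermatrix(W, tol=1e-10, max_iter=1000):
    """Raise the column-stochastic supermatrix to successive powers
    until it converges; each column of the limit then holds the
    global priorities used to rank the design cases."""
    M = W.copy()
    for _ in range(max_iter):
        M_next = M @ W
        if np.max(np.abs(M_next - M)) < tol:
            return M_next
        M = M_next
    return M

L = limit_supermatrix(W)
priorities = L[:, 0]   # any column of the converged limit matrix
```

Any column of the converged limit matrix can then serve as the priority vector for comparing the design cases.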
Abstract: This paper presents a novel algorithm for the secure, reliable and flexible transmission of big data in two-hop wireless networks using a cooperative jamming scheme. Two-hop wireless networks consist of source, relay and destination nodes. Big data must be transmitted from source to relay and from relay to destination with security enforced at the physical layer. The cooperative jamming scheme makes the transmission of big data more secure by protecting it from eavesdroppers and malicious nodes at unknown locations. The novel algorithm, which ensures secure and energy-balanced transmission of big data, includes selecting the data transmission region, segmenting the selected region, determining the probability ratio for each node type (capture, non-capture and eavesdropper) in every segment, and evaluating the probability using a binary evaluation. If the transmission is deemed secure, the two-hop transmission of big data proceeds; otherwise, the attackers are countered by the cooperative jamming scheme and the data is then transmitted in two hops.
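The segmentation and binary security evaluation described above can be sketched as follows. All definitions here (region size, grid segmentation, the threshold on the eavesdropper ratio) are assumptions for illustration only; the paper's actual probability-ratio formulas are not given in the abstract.

```python
import random

# Toy sketch under assumed definitions: partition the transmission
# region into grid segments, estimate the per-segment fraction of
# eavesdropper nodes, and flag each segment secure/insecure with a
# binary threshold test before two-hop transmission.
random.seed(1)
REGION = 100.0      # square region side length (assumed)
SEGMENTS = 4        # 4 x 4 grid of segments (assumed)
THRESHOLD = 0.3     # assumed security threshold on eavesdropper ratio

def make_nodes(n, kind):
    return [{"x": random.uniform(0, REGION),
             "y": random.uniform(0, REGION),
             "kind": kind} for _ in range(n)]

nodes = (make_nodes(30, "capture") + make_nodes(20, "non-capture")
         + make_nodes(8, "eavesdropper"))

def segment_of(node):
    step = REGION / SEGMENTS
    return (int(node["x"] // step), int(node["y"] // step))

def secure_segments(nodes):
    """Binary evaluation: a segment is secure when the fraction of
    eavesdroppers among its nodes stays below THRESHOLD; otherwise
    cooperative jamming would be triggered before transmitting."""
    counts = {}
    for n in nodes:
        seg = segment_of(n)
        total, eaves = counts.get(seg, (0, 0))
        counts[seg] = (total + 1, eaves + (n["kind"] == "eavesdropper"))
    return {seg: (eaves / total) < THRESHOLD
            for seg, (total, eaves) in counts.items()}

status = secure_segments(nodes)
```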
Abstract: Cooperative spectrum sensing is a crucial challenge in
cognitive radio networks. Cooperative sensing can increase the
reliability of spectrum hole detection, optimize sensing time and
reduce delay in cooperative networks. In this paper, an efficient
central capacity optimization algorithm is proposed to minimize
cooperative sensing time in a homogeneous sensor network using the OR decision rule, subject to constraints on the detection and false alarm probabilities. The evaluation results reveal significant improvement in
the sensing time and normalized capacity of the cognitive sensors.
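Under the OR fusion rule with independent, identical sensors, the cooperative detection and false-alarm probabilities have standard closed forms. The sketch below uses them to find the smallest cooperating set meeting both constraints; the per-sensor probabilities and targets are illustrative values, not the paper's.

```python
def or_rule(p_d, p_f, n):
    """Cooperative detection/false-alarm probabilities under the OR
    fusion rule: the fusion centre declares the channel busy if any
    of the n sensors does."""
    q_d = 1 - (1 - p_d) ** n
    q_f = 1 - (1 - p_f) ** n
    return q_d, q_f

def min_sensors(p_d, p_f, q_d_min, q_f_max):
    """Smallest number of cooperating sensors meeting both the
    detection and the false-alarm constraint, or None when the
    false-alarm budget is exhausted first."""
    n = 1
    while True:
        q_d, q_f = or_rule(p_d, p_f, n)
        if q_f > q_f_max:
            return None
        if q_d >= q_d_min:
            return n
        n += 1
```

For example, with per-sensor P_d = 0.6 and P_f = 0.01, six sensors are the fewest that reach a cooperative detection probability of 0.99 while keeping the cooperative false-alarm probability under 0.1.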
Abstract: Water-miscible cutting fluids are conventionally used to lubricate and cool the machining zone. However, issues related to health hazards, maintenance and disposal costs have limited their usage, leading to the application of Minimum Quantity Lubrication (MQL). To increase the effectiveness of MQL, nanocutting fluids are proposed. In the present work, water-miscible nanographite cutting fluids of varying concentration are applied at the cutting zone by two systems, A and B. System A uses high-pressure air and supplies cutting fluid at a flow rate of 1 ml/min. System B uses low-pressure air and supplies cutting fluid at a flow rate of 5 ml/min. Their machining performance is evaluated by measuring cutting temperatures, tool wear, cutting forces and surface roughness, and is compared with dry machining and flood machining. Application of nanocutting fluid using both systems showed better performance than dry machining. Cutting temperatures and cutting forces obtained by both techniques are higher than in flood machining, but tool wear and surface roughness showed improvement compared to flood machining. An economic analysis has been carried out in all the cases to decide the applicability of the techniques.
Abstract: Froth flotation remains to date as one of the most used
metallurgical processes for concentrating metal-bearing minerals in
ores. Oxide ores are relatively less amenable to froth flotation and
require a judicious choice of reagents for the recovery of metals to be
optimised. Laboratory batch flotation tests were conducted to
determine the effect of two types of gasoil-rinkalore mixtures on the
flotation response of a copper cobalt oxide ore sample. The head
assay conducted on the initial ore sample showed that it contained
about 2.90% Cu and 0.12% Co.
The flotation test results indicated that the concentrate obtained with the gasoil-rinkalore RX mixture yielded 8.24% Cu and 0.22% Co concentrate grades, with recoveries of 76.0% Cu and 78.0% Co respectively. By contrast, the concentrate obtained with the gasoil-rinkalore RX3 mixture yielded comparatively poorer results, with 5.92% Cu and 0.18% Co concentrate grades and recoveries of 70.3% Cu and 65.3% Co respectively.
Abstract: Software fault prediction models are created by using
the source code, processed metrics from the same or previous version
of code and related fault data. Some companies do not store and keep track of all the artifacts required for software fault prediction. To construct a fault prediction model for such companies, training data from other projects is one potential solution. The earlier a fault is predicted, the less it costs to correct. The training
data consists of metrics data and related fault data at function/module
level. This paper investigates fault predictions at early stage using the
cross-project data focusing on the design metrics. In this study,
empirical analysis is carried out to validate design metrics for cross
project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects
can be used as initial guideline for the projects where no previous
fault data is available. We analyze seven datasets from NASA
Metrics Data Program which offer design as well as code metrics.
Overall, the results of cross-project prediction are comparable to those of within-company learning.
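The cross-project setup can be sketched with a hand-rolled Gaussian Naïve Bayes classifier on synthetic design metrics. The two-feature "projects" below are illustrative stand-ins; the NASA MDP datasets themselves are not reproduced here.

```python
import numpy as np

# Illustrative cross-project setup with synthetic design metrics:
# train Gaussian Naive Bayes on one "project", then predict fault
# labels on a slightly shifted "project" to mimic cross-project use.
rng = np.random.default_rng(0)

def make_project(n, shift):
    # non-faulty modules cluster low, faulty modules cluster high
    clean = rng.normal(2.0 + shift, 1.0, size=(n, 2))
    faulty = rng.normal(6.0 + shift, 1.0, size=(n, 2))
    X = np.vstack([clean, faulty])
    y = np.array([0] * n + [1] * n)
    return X, y

def fit_gnb(X, y):
    """Per-class mean, variance and prior."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9,
                     len(Xc) / len(X))
    return params

def predict_gnb(params, X):
    """Pick the class with the highest Gaussian log-posterior."""
    scores = []
    for c, (mu, var, prior) in params.items():
        log_lik = -0.5 * (np.log(2 * np.pi * var)
                          + (X - mu) ** 2 / var).sum(axis=1)
        scores.append(log_lik + np.log(prior))
    return np.argmax(np.stack(scores, axis=1), axis=1)

X_src, y_src = make_project(100, shift=0.0)   # "training" project
X_tgt, y_tgt = make_project(100, shift=0.5)   # different project
model = fit_gnb(X_src, y_src)
accuracy = (predict_gnb(model, X_tgt) == y_tgt).mean()
```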
Abstract: In educational technology, the idea of innovation is
usually tethered to contemporary technological inventions and
emerging technologies. Yet, using long-known technologies in ways
that are pedagogically or experientially new can reposition them as
emerging educational technologies. In this study we explore how a
subtle pivot in pedagogical thinking led to an innovative education
technology. We describe the design and implementation of an online
writing tool that scaffolds students in the evaluation of their own
informational texts. We consider how pathways to innovation can emerge from such pivots: leveraging longstanding practices in novel ways has the potential to cultivate new opportunities for learning. We first unpack Infowriter in terms of its design, then we
describe some results of a study in which we implemented an
intervention which included our designed application.
Abstract: Transportation of long turbine blades from one place
to another is a difficult process. Hence, a feasibility study of the modularization of a wind turbine blade was undertaken from a structural standpoint through finite element analysis. Initially, a non-segmented
blade is modeled and its structural behavior is evaluated to serve as
reference. The resonant, static bending and fatigue tests are simulated
in accordance with the IEC 61400-23 standard for comparison purposes.
The non-segmented test blade is separated at a suitable location based on trade-off studies, and the segments are joined with an innovative
double strap bonded joint configuration. The adhesive joint is
modeled by adopting cohesive zone modeling approach in ANSYS.
The developed blade model is analyzed for its structural response
through simulation. The performance of both blades is found to be similar, which indicates that efficient segmentation of the long blade is possible, facilitating easy transportation and on-site reassembly. The selected segmentation location and the adopted joint configuration resulted in an efficient segmented blade model, demonstrating that the segmentation methodology was effective. Within the considered assumptions, the developed segmented blade appears to be a viable alternative in terms of its structural response, particularly in fatigue.
Abstract: The paper deals with the classical fiber bundle model
of equal load sharing, sometimes referred to as the Daniels’ bundle
or the democratic bundle. Daniels formulated a multidimensional
integral and also a recursive formula for evaluation of the
strength cumulative distribution function. This paper describes
three algorithms for evaluation of the recursive formula and also
their implementations with source codes in the Python high-level
programming language. A comparison of the algorithms is provided with respect to execution time. An analysis of the orders of magnitude of the addends in the recursion is also provided.
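One standard form of Daniels' recursion for the strength cumulative distribution function can be implemented directly in Python with memoization (the paper compares three algorithmic variants; this is only one direct sketch). The uniform single-fibre strength CDF below is an illustrative choice; the recursion works for any continuous CDF F.

```python
from functools import lru_cache
from math import comb

def F(x):
    """Illustrative single-fibre strength CDF (uniform on [0, 1])."""
    return min(max(x, 0.0), 1.0)

@lru_cache(maxsize=None)
def G(n, x):
    """Daniels' recursion for the CDF of the strength of an
    equal-load-sharing bundle of n fibres at load x per fibre:
      G_n(x) = sum_{k=1}^{n} (-1)^(k+1) C(n,k) F(x)^k G_{n-k}(n x / (n-k))
    with G_0 = 1 (the k = n term needs no rescaled argument)."""
    if n == 0:
        return 1.0
    total = 0.0
    for k in range(1, n + 1):
        tail = 1.0 if k == n else G(n - k, n * x / (n - k))
        total += (-1) ** (k + 1) * comb(n, k) * F(x) ** k * tail
    return total
```

For n = 1 the recursion collapses to G_1 = F, and for n = 2 it reproduces the order-statistics result P(X_(1) <= x, X_(2) <= 2x); the memoization keeps the repeatedly rescaled sub-bundle evaluations from being recomputed.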
Abstract: In this contribution two approaches for calculating
optimal trajectories for highly automated vehicles are presented and
compared. The first one is based on a non-linear vehicle model, used
for evaluation. The second one is based on a simplified model and
can be implemented on a current ECU. In typical driving situations, both approaches yield very similar results.
Abstract: All current experimental methods for determination of
stress intensity factors are based on the assumption that the state of
stress near the crack tip is plane stress. Therefore, these methods rely
on strain and displacement measurements made outside the near
crack-tip region affected by three-dimensional effects or by the process zone. In this paper, we develop and validate an experimental
procedure for the evaluation of stress intensity factors from the
measurements of the out-of-plane displacements in the surface area
controlled by 3D effects. The evaluation of stress intensity factors is
possible when the process zone is sufficiently small, and the
displacement field generated by the 3D effects is fully encapsulated by the K-dominance region.
Abstract: Microscopic simulation toolkits allow for consideration of the two processes of railway operations and the preceding timetable production. Block occupation conflicts on both
process levels are often solved by using defined train priorities. These
conflict resolutions (dispatching decisions) generate reactionary
delays to the involved trains. The sum of reactionary delays is
commonly used to evaluate the quality of railway operations, which
describes the timetable robustness. It is either compared to an
acceptable train performance or the delays are appraised
economically by linear monetary functions. It is impossible to
adequately evaluate dispatching decisions without a well-founded
objective function. This paper presents a new approach for the
evaluation of dispatching decisions. The approach uses mode choice
models and considers the behaviour of the end-customers. These
models evaluate the reactionary delays in more detail and consider
other competing modes of transport. The new approach pursues the
coupling of a microscopic model of railway operations with the
macroscopic mode choice model. At first, it will be implemented for the
railway operations process but it can also be used for timetable
production. The evaluation considers the possibility for the customer
to interchange to other transport modes. The new approach initially considers rail and road, but it can also be extended to air travel. The
result of mode choice models is the modal split. The reactions by the
end-customers have an impact on the revenue of the train operating
companies. Different travel purposes imply different levels of willingness to pay and different tolerances towards late running. Aside from changes to
revenues, longer journey times can also generate additional costs.
The costs are either time- or track-specific and arise from required
changes to rolling stock or train crew cycles. Only the variable values
are summarised in the contribution margin, which is the base for the
monetary evaluation of delays. The contribution margin is calculated
for different possible solutions to the same conflict. The conflict
resolution is optimised until the monetary loss becomes minimal. The
iterative process therefore determines an optimum conflict resolution
by monitoring the change to the contribution margin. Furthermore, a
monetary value of each dispatching decision can also be derived.
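The mode-choice step of the approach can be sketched with a toy binomial logit model: a reactionary delay lowers the utility of rail, shifts passengers to road, and the resulting revenue change feeds the monetary evaluation. The utility coefficient and fare below are assumed values, not the calibrated model from the paper.

```python
import math

# Toy binomial logit mode choice between rail and road (assumed
# coefficients): a reactionary delay lowers rail utility, shifting
# the modal split and hence the operator's revenue.
BETA_TIME = -0.03   # assumed utility per minute of journey time
FARE_RAIL = 15.0    # assumed average rail fare per passenger

def modal_split_rail(rail_minutes, road_minutes, delay_minutes):
    """Share of passengers choosing rail under the logit model."""
    u_rail = BETA_TIME * (rail_minutes + delay_minutes)
    u_road = BETA_TIME * road_minutes
    e_rail, e_road = math.exp(u_rail), math.exp(u_road)
    return e_rail / (e_rail + e_road)

def revenue_loss(passengers, rail_minutes, road_minutes, delay_minutes):
    """Monetary evaluation of a dispatching decision: revenue lost
    through passengers switching to road because of the delay."""
    base = modal_split_rail(rail_minutes, road_minutes, 0.0)
    delayed = modal_split_rail(rail_minutes, road_minutes, delay_minutes)
    return (base - delayed) * passengers * FARE_RAIL
```

Comparing this loss across alternative resolutions of the same conflict mirrors the iterative search for the resolution with minimal monetary loss described above.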
Abstract: This paper presents the development of a robot car
that can track the motion of an object by detecting its color through
an Android device. The employed computer vision algorithm uses the
OpenCV library, which is embedded into an Android application of a
smartphone, for manipulating the captured image of the object. The
captured image of the object is subjected to color conversion and is
transformed to a binary image for further processing after color
filtering. The desired object is clearly determined after removing
pixel noise by applying image morphology operations and contour
definition. Finally, the area and the center of the object are
determined so that the object's motion can be tracked. The smartphone application is mounted on a robot car and transmits the motion directives via Bluetooth to an Arduino assembly so that the car follows objects of a specified color. The experimental evaluation of the
proposed algorithm shows reliable color detection and smooth
tracking characteristics.
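The processing pipeline (color filtering, noise removal by morphology, area and center extraction) can be illustrated with a NumPy-only sketch; in the actual system these steps correspond to OpenCV calls such as cv2.inRange, cv2.morphologyEx and cv2.moments.

```python
import numpy as np

# NumPy-only sketch of the tracking pipeline: threshold the image by
# color, drop isolated pixel noise (a crude stand-in for a
# morphological opening), then locate the object's area and center.
H, W = 40, 60
img = np.zeros((H, W, 3), dtype=np.uint8)
img[10:20, 30:45] = (255, 0, 0)   # a red rectangular "object"
img[5, 5] = (255, 0, 0)           # one pixel of noise

def track(img, lo=(200, 0, 0), hi=(255, 60, 60)):
    lo, hi = np.array(lo), np.array(hi)
    # color filtering -> binary mask (cf. cv2.inRange)
    mask = np.all((img >= lo) & (img <= hi), axis=2)
    # keep only pixels with at least one 4-neighbour in the mask
    padded = np.pad(mask, 1)
    support = (padded[:-2, 1:-1] | padded[2:, 1:-1]
               | padded[1:-1, :-2] | padded[1:-1, 2:])
    clean = mask & support
    ys, xs = np.nonzero(clean)
    if len(xs) == 0:
        return 0, None
    # area and centroid of the detected object (cf. cv2.moments)
    return len(xs), (xs.mean(), ys.mean())

area, center = track(img)
```

On this synthetic frame the isolated noise pixel is removed while the 10x15 object survives, so the reported area is 150 pixels with the centroid at the rectangle's middle.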
Abstract: Seismic risk mitigation for the old building stock is essential in Algerian urban areas, particularly those located in seismic-prone regions, such as Annaba city, where the old buildings present high levels of degradation combined with an absence of seismic strengthening and/or rehabilitation. In this sense, the present paper approaches the issue of the
seismic vulnerability assessment of old masonry building stocks
through the adaptation of a simplified methodology developed for a
European context area similar to that of Annaba city, Algeria.
Therefore, this method is used for the first level of seismic
vulnerability assessment of the masonry buildings stock of the old
city center of Annaba. This methodology is based on a vulnerability
index that is suitable for the evaluation of damage and for the
creation of large-scale loss scenarios. Over 380 buildings were
evaluated in accordance with the referred methodology and the
results obtained were then integrated into a Geographical Information
System (GIS) tool. Such results can be used by the Annaba city council to support management decisions based on a global view of the site under analysis, leading to more accurate and faster decisions on risk mitigation strategies and rehabilitation plans.
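A first-level vulnerability-index calculation of this kind can be sketched as a weighted sum of per-parameter class scores. The parameters, weights and class scores below are illustrative placeholders, not the calibrated values of the adapted methodology.

```python
# Sketch of a vulnerability-index calculation (illustrative values):
# each building is scored on a set of parameters in classes A-D and
# the weighted sum, normalised to 0-100, gives the index used for
# large-scale loss scenarios.
CLASS_SCORES = {"A": 0, "B": 5, "C": 20, "D": 50}
WEIGHTS = {                       # assumed relative weights
    "structural_system": 2.5,
    "conservation_state": 1.0,
    "roof_type": 0.5,
    "plan_irregularity": 0.5,
}

def vulnerability_index(building):
    """Weighted sum of per-parameter class scores, normalised to
    0-100 against the worst possible score."""
    raw = sum(WEIGHTS[p] * CLASS_SCORES[c] for p, c in building.items())
    worst = sum(w * CLASS_SCORES["D"] for w in WEIGHTS.values())
    return 100.0 * raw / worst

old_building = {"structural_system": "C", "conservation_state": "D",
                "roof_type": "B", "plan_irregularity": "C"}
iv = vulnerability_index(old_building)
```

Indices computed per building can then be mapped in a GIS layer, which is how the surveyed results support city-scale decisions.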
Abstract: Design concepts of a real-time embedded system can be realized initially by introducing novel design approaches. In this work, a model-based design approach and in-the-loop testing were employed early in the conceptual and preliminary phases to formulate design requirements and perform quick real-time verification. The
design and analysis methodology includes simulation analysis, model
based testing, and in-the-loop testing. The design of a conceptual drive-by-wire, or DBW, algorithm for an electronic control unit, or ECU, was
presented to demonstrate the conceptual design process, analysis, and
functionality evaluation. The concepts of DBW ECU function can be
implemented in the vehicle system to improve electric vehicle, or EV,
conversion drivability. However, within a new development process, conceptual ECU functions and parameters need to be evaluated. As a result, the testing system was employed to support the evaluation of conceptual DBW ECU functions. For the current setup, the system components consisted of actual DBW ECU hardware, electric vehicle models, and the controller area network, or CAN, protocol. The
vehicle models and CAN bus interface were both implemented as
real-time applications where ECU and CAN protocol functionality
were verified according to the design requirements. The proposed system could potentially benefit the rapid real-time analysis of design parameters in conceptual system or software algorithm development.
Abstract: We evaluate the performance of a numerical method
for global optimization of expensive functions. The method uses a response surface to guide the search for the global optimum. This
metamodel could be based on radial basis functions, kriging, or a
combination of different models. We discuss how to set the cyclic
parameters of the optimization method to get a balance between local
and global search. We also discuss the possible problem of Runge oscillations in the response surface.
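A minimal radial-basis-function response surface can be sketched as follows; the Gaussian kernel and the sampled test function are illustrative choices (kriging or a combination of models would be drop-in alternatives). The surrogate interpolates the expensive function at the sampled points and is cheap to evaluate everywhere else.

```python
import numpy as np

# Minimal Gaussian RBF response surface: fit an interpolant through
# the sampled points by solving the kernel system, then evaluate it
# as a cheap surrogate for the expensive function.
def rbf_fit(X, y, eps=1.0):
    """Fit a Gaussian RBF interpolant through (X, y)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    Phi = np.exp(-(eps * d) ** 2)
    w = np.linalg.solve(Phi, y)
    return X, w, eps

def rbf_eval(model, x):
    """Evaluate the fitted surrogate at a single point x."""
    X, w, eps = model
    d = np.linalg.norm(X - x, axis=1)
    return float(np.exp(-(eps * d) ** 2) @ w)

# "expensive" function sampled at a handful of points
f = lambda x: np.sin(3 * x[0]) + x[0] ** 2
X = np.array([[0.0], [0.3], [0.6], [0.9], [1.2]])
y = np.array([f(x) for x in X])
model = rbf_fit(X, y)
```

The cyclic balance between local and global search discussed above then amounts to alternating between minimizing this surrogate and sampling where the surrogate is most uncertain.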
Abstract: The scheduling and mapping of tasks onto a set of processors is considered a critical problem in parallel and distributed computing systems. This paper deals with the problem of dynamic scheduling on a special type of multiprocessor architecture known as the Linear Crossed Cube (LCQ) network. The LCQ is a hybrid network that combines the features of linear architectures with those of cube-based architectures. Two standard dynamic scheduling schemes, namely Minimum Distance Scheduling (MDS) and Two Round Scheduling (TRS)
schemes are implemented on the LCQ network. Parallel tasks are mapped and the load imbalance is evaluated on different sets of processors in the LCQ network. The simulation results are evaluated, and a thorough analysis is carried out to obtain the best solution for the given network in terms of the load imbalance left and the execution time. Other performance metrics, such as speedup and efficiency, are also evaluated with the given dynamic algorithms.
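A minimum-distance-style dynamic scheduling step and a load-imbalance measure can be sketched generically; the six-node adjacency below is an assumed stand-in, not the actual LCQ topology, and the one-hop rule is only in the spirit of MDS.

```python
# Generic sketch of minimum-distance-style dynamic scheduling on an
# assumed six-node topology: each task generated at a node migrates
# to the least-loaded processor within one hop, and the residual
# load imbalance is then measured across the network.
ADJACENCY = {
    0: [1, 2], 1: [0, 3], 2: [0, 4], 3: [1, 5], 4: [2, 5], 5: [3, 4],
}

def schedule(tasks):
    """tasks: list of origin node ids; returns per-node load after
    one-hop minimum-distance balancing (ties broken by node id)."""
    load = {n: 0 for n in ADJACENCY}
    for origin in tasks:
        candidates = [origin] + ADJACENCY[origin]
        target = min(candidates, key=lambda n: (load[n], n))
        load[target] += 1
    return load

def imbalance(load):
    """Load imbalance left: spread between the most and least loaded
    processors, relative to the ideal even share."""
    ideal = sum(load.values()) / len(load)
    return (max(load.values()) - min(load.values())) / max(ideal, 1)

load = schedule([0, 0, 0, 1, 1, 2, 5, 5, 5, 5])
```

Running the same task stream under different schemes (e.g. a two-round variant) and comparing the residual imbalance is the kind of experiment the abstract describes.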
Abstract: This study aims to reduce earthquake damage to, and support the emergency rehabilitation of, critical structures such as schools, high-tech factories, and hospitals subjected to strong ground motions associated with climate changes. As recent events show, a strong earthquake can cause serious damage to critical structures, which may then be affected by a sequence of aftershocks (or a tsunami) due to fault-plane adjustments. Therefore, in order to improve the seismic performance of critical structures, studies on retrofitting or strengthening structures against aftershock sequences, following the emergency rehabilitation of structures subjected to strong earthquakes, are widely carried out. Consequently, this study used composite material for the emergency rehabilitation of structures, rather than concrete and steel, because of its high strength and stiffness, light weight, rapid manufacturing, and dynamic performance. This study also aimed to develop or improve the seismic performance and seismic retrofit of critical structures subjected to strong ground motions and earthquake aftershocks by utilizing GFRP-Corrugated Infill Panels (GCIP).
Abstract: A central element of higher education today is the
“core” or “general education” curriculum: that configuration of
courses that often encompasses the essence of liberal arts education.
Ensuring that such offerings reflect the mission and values of the
institution is a challenge faced by most colleges and universities, often
more than once. This paper presents an action model of program
planning designed to structure the processes of developing,
implementing and revising core curricula in a manner consistent with
key institutional goals and objectives. Through presentation of a case
study from a university in the United States, the elements of needs
assessment, stakeholder investment and collaborative compromise
are shown as key components of a planning strategy that can produce
a general education program that is comprehensive, academically
rigorous, assessable, and mission consistent. The paper concludes
with recommendations for both the implementation and evaluation of
such programs in practice.
Abstract: Every year, a considerable amount of money is invested in research, mainly in the form of funding allocated to
universities and research institutes. To better distribute the available
funds and to set the most proper R&D investment strategies for the
future, evaluation of the productivity of the funded researchers and
the impact of such funding is crucial. In this paper, using the data on
15 years of journal publications of the NSERC (Natural Sciences and
Engineering Research Council of Canada) funded researchers and by
means of bibliometric analysis, the scientific development of the
funded researchers and their scientific collaboration patterns will be
investigated for the period 1996-2010. According to the results, there appears to be a positive relation between the average level of funding and the quantity and quality of the scientific output. In addition, whenever the funding allocated to the researchers has increased, the number of co-authors per paper has also increased. Hence, a higher level of funding may enable researchers to get involved in larger projects and/or scientific teams and thereby increase their scientific output.