Abstract: Carbon Fiber Reinforced Plastics (CFRPs) are widely
used for advanced applications, in particular in the aerospace, automotive
and wind energy industries. Once cured to near net shape, CFRP
parts need several finishing operations, such as trimming, milling or
drilling, in order to accommodate fastening hardware and meet the
final dimensions. The present research aims to study the effect of the
cutting temperature in trimming on the mechanical strength of high
performance CFRP laminates used for aeronautics applications. The
cutting temperature is of great importance when dealing with
trimming of CFRP. Temperatures higher than the glass-transition
temperature (Tg) of the resin matrix are highly undesirable: they
cause degradation of the matrix in the trimmed edges area, which can
severely affect the mechanical performance of the entire component.
In this study, a 9.50 mm diameter CVD diamond-coated carbide tool
with six flutes was used to trim 24-ply CFRP laminates. A cutting
speed of 300 m/min and a feed rate of 1140 mm/min were used in the
experiments. The tool was heated prior to trimming using a
blowtorch, for temperatures ranging from 20°C to 300°C. The
temperature at the cutting edge was measured using embedded K-type
thermocouples. Samples trimmed at different cutting temperatures,
below and above Tg, were mechanically tested in three-point bending
and short-beam loading configurations. Both new and worn cutting
tools were used in the experiments.
The experiments with the new tools revealed no correlation between
the length of cut, the cutting temperature, and the mechanical
performance: the mechanical strength remained constant regardless of
the cutting temperature. For worn tools, however, which produced
cutting temperatures of up to 450°C, thermal damage of the resin was
observed. The mechanical tests showed a reduced mean strength in the
short-beam configuration, while the strength in three-point bending
decreased as the cutting temperature increased.
Abstract: Owing to rapid technological innovation, a tremendous
amount of data is being generated worldwide in every domain, such as
pattern recognition, machine learning, spatial data mining, image
analysis, fraud analysis, and the World Wide Web. This makes it
increasingly important to develop tools for data mining. The main aim
of this paper is to analyze various tools used to build resourceful
analytical or descriptive models for handling large amounts of
information efficiently and in a user-friendly manner. In this survey,
the diverse tools are described with respect to their technical
paradigms, graphical interfaces, and built-in algorithms, which make
them useful for handling significant amounts of data.
Abstract: This paper describes an I²C slave implementation using an
I²C master obtained from the OpenCores website, which provides free
Verilog and VHDL code to users. The I²C slave is implemented in
Verilog and verified with ModelSim, an EDA tool for ASIC design from
Mentor Graphics, used for simulation and verification purposes. A
common application of this I²C master-slave integration is also
included. This paper also addresses the advantages and limitations of
the design.
Abstract: Structural failure is mainly caused by damage that occurs
in structures. Many researchers have focused on developing efficient
tools to detect such damage at an early stage. Over the past decades,
a subject that has received considerable attention in the literature
is damage detection based on variations in the dynamic characteristics
or response of structures. This study presents a new damage
identification technique that detects the damage location in an
incomplete structural system using output data only. The method
identifies damage from free vibration test data using the 'Two Points
Condensation (TPC) technique', which creates a set of matrices by
condensing the structural system to two-degree-of-freedom systems.
The current stiffness matrices are obtained by optimizing the
equation of motion using the measured test data, and are then compared
with the original (undamaged) stiffness matrices. Large percentage
changes in the matrix coefficients indicate the location of the
damage. The TPC technique is applied to experimental data from a
simply supported steel beam model structure after inducing a thickness
change in one element, with two cases considered. The method detects
the damage and determines its location accurately in both cases. In
addition, the results illustrate that these changes in the stiffness
matrix can be a useful tool for continuous monitoring of structural
safety using ambient vibration data. Furthermore, its efficiency shows
that the technique can also be applied to large structures.
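As a rough illustration of the matrix-comparison step described above, the following is a minimal sketch, not the authors' actual TPC implementation; the 5% threshold and the example matrices are hypothetical:

```python
import numpy as np

def damage_indicator(K_undamaged, K_current, threshold=0.05):
    """Relative change of condensed 2x2 stiffness coefficients.

    In the TPC scheme, one pair of condensed matrices is produced per
    candidate location; large relative changes flag damage there.
    """
    rel_change = np.abs(K_current - K_undamaged) / np.abs(K_undamaged)
    return rel_change, bool(np.any(rel_change > threshold))

# Hypothetical condensed matrices for one candidate location:
K0 = np.array([[ 2.0e6, -1.0e6],
               [-1.0e6,  2.0e6]])   # undamaged (original) stiffness
K1 = np.array([[ 1.7e6, -0.9e6],
               [-0.9e6,  2.0e6]])   # identified from measured test data

change, damaged = damage_indicator(K0, K1)
print(change)    # relative change per coefficient
print(damaged)   # True -> damage suspected near this location
```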
Abstract: To decrease the thermal expansion error of a grating scale,
a novel method based on multiple temperature detection is proposed.
Several temperature sensors are installed on the grating scale and
their temperatures are recorded. The temperature at every point on the
grating scale is calculated by interpolating between adjacent sensors.
Following the thermal expansion principle, the thermal expansion error
model of the grating scale is established by integrating the
temperature variation over position. A novel compensation method based
on this error model is proposed in this paper: by applying the
established error model, the thermal expansion error of the grating
scale is decreased by 90% compared with no compensation. The residual
positioning error of the grating scale is less than 15 μm/10 m, and
the accuracy of the machine tool is significantly improved.
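A minimal sketch of the underlying error model, assuming a constant expansion coefficient α and piecewise-linear interpolation between adjacent sensors at positions x_i (the notation is ours, not the paper's):

```latex
% Accumulated thermal expansion error up to position x,
% relative to the reference temperature T_0:
\Delta L(x) = \int_{0}^{x} \alpha \,\bigl(T(s) - T_0\bigr)\,\mathrm{d}s,
\qquad
T(s) = T_i + \frac{s - x_i}{x_{i+1} - x_i}\,\bigl(T_{i+1} - T_i\bigr),
\quad x_i \le s \le x_{i+1}.
```

Compensation then subtracts the modeled ΔL(x) from each position reading.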
Abstract: Milk is considered an essential and complete food. The
present study was conducted at the Milk Plant Mohali, especially with
reference to the procurement section, where the cash inflow was
maximum, with the objective of achieving higher productivity and
reducing wastage of milk. It was observed that during the period
January 2014 to March 2014, the average procurement of milk was
4,19,361 liters per month and the cost of procurement was Rs. 35 per
liter. The total procurement cost was thereby about Rs. 1 crore 46
lakh per month, but there was a mismatch between procurement and
production of milk, which led to an average loss of Rs. 12,94,405 per
month. To solve the procurement-production problem, quality control
tools such as brainstorming, flow charts, cause-and-effect diagrams
and Pareto analysis were applied wherever applicable. With the
successful implementation of these quality control tools, an average
saving of Rs. 4,59,445 per month was achieved.
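As a quick consistency check of the reported figures (our arithmetic, using the stated values):

```latex
4{,}19{,}361\ \text{L/month} \times \text{Rs.}\ 35/\text{L}
= \text{Rs.}\ 1{,}46{,}77{,}635
\approx \text{Rs.}\ 1\ \text{crore}\ 46\ \text{lakh per month.}
```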
Abstract: This paper deals with the problem of managing information
resources in the libraries of the public institution Sultan Moulay
Slimane University (SMSU), in order to analyze reader satisfaction and
allow university leaders to make better strategic and timely
decisions. To this end, the integration of a decision-support library
management system is a priority program of higher education, as part
of the Digital Morocco strategy, which pursues a proactive policy of
developing the use of new information and communication technologies
in higher education institutions. This operational information system
can provide better services to both students and leaders. Our approach
is to integrate business intelligence (BI) tools into library
management by using Power BI.
Abstract: The current study aims to highlight the impact of loading
characteristics on the time evolution (focusing particularly on
long-term effects) of the deformation of reinforced concrete beams. In
particular, the tension stiffening code provisions (i.e. within
Eurocode 2) are reviewed with a clear intention to reassess their
operational value and predictive capacity. In what follows, the
experimental programme adopted is presented along with some
preliminary findings and numerical modeling attempts. For a range of
long, slender, simply supported reinforced concrete beams (4200 mm
span), constant sustained and repeated cyclic loadings were applied,
mapping the time evolution of deformation. All experiments were
carried out at the Heavy Structures Lab of the University of Leeds.
During the tests, the mid-span deflection, creep coefficient and
shrinkage strains were monitored for a duration of 90 days. The
obtained results are set against the values predicted by Eurocode 2
and by the tools within a commercial FE package (i.e. Midas FEA),
showing that existing knowledge and practice is at times
over-conservative.
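For reference, the Eurocode 2 tension stiffening provision under review interpolates deformation parameters between the uncracked (I) and fully cracked (II) states (quoted here from the general form of EC2 §7.4.3; readers should check against the code text):

```latex
\alpha = \zeta\,\alpha_{II} + (1 - \zeta)\,\alpha_{I},
\qquad
\zeta = 1 - \beta \left( \frac{\sigma_{sr}}{\sigma_{s}} \right)^{2},
```

where ζ is the distribution coefficient, β = 1.0 for a single short-term load and 0.5 for sustained or repeated loading, σ_sr is the steel stress at first cracking, and σ_s the steel stress under the considered load; sustained and cyclic loading thus directly reduce the tension stiffening contribution.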
Abstract: This study aims to increase understanding of the
transition of business models in servitization. The significance of
service in all business has increased dramatically during the past
decades. Service-dominant logic (SDL) describes this change in the
economy and questions the goods-dominant logic on which business
has primarily been based in the past. The business model canvas is one
of the most cited and used tools for defining and developing business
models. The starting point of this paper lies in the notion that the
traditional business model canvas is inherently goods-oriented and
best suited to product-based business. However, the basic differences
between goods and services necessitate changes in business model
representations when proceeding in servitization. Therefore, new
knowledge is needed on how the conception of business model and
the business model canvas as its representation should be altered in
servitized firms in order to better serve business developers and
inter-firm co-creation. That is to say, compared to products, services are
intangible and they are co-produced between the supplier and the
customer. Value is always co-created in interaction between a
supplier and a customer, and customer experience primarily depends
on how well the interaction succeeds between the actors. The role of
service experience is even stronger in service business compared to
product business, as services are co-produced with the customer. This paper provides business model developers with a service
business model canvas, which takes into account the intangible,
interactive, and relational nature of service. The study employs a
design science approach that contributes to theory development via
design artifacts. This study utilizes qualitative data gathered in
workshops with ten companies from various industries. In particular,
key differences between goods-dominant logic (GDL) and SDL-based
business models are identified as an industrial firm proceeds in
servitization. As a result of the study, an updated version of the business
model canvas is provided based on service-dominant logic. The
service business model canvas ensures a stronger customer focus and
includes aspects salient for services, such as interaction between
companies, service co-production, and customer experience. It can be
used for the analysis and development of a current service business
model of a company or for designing a new business model. It
facilitates customer-focused new service design and service
development. It aids in the identification of development needs, and
facilitates the creation of a common view of the business model.
Therefore, the service business model canvas can be regarded as a
boundary object, which facilitates the creation of a common
understanding of the business model between several actors involved.
The study contributes to the business model and service business
development disciplines by providing a managerial tool for
practitioners in service development. It also provides research insight
into how servitization challenges companies’ business models.
Abstract: The current tools for real-time management of sewer systems
are based on two kinds of software: weather forecasting software and
hydraulic simulation software. The former is an important source of
imprecision and uncertainty, while the latter imposes long decision
time steps because of its computation times. As a result, the obtained
results generally differ from those expected. The main idea of this
project is to change the basic paradigm by approaching the problem
from the "automatic control" side rather than the "hydrology" side.
The objective is to make it possible to run a large number of
simulations in a very short time (a few seconds), allowing weather
forecasts to be bypassed by directly using real-time measured rainfall
data. The aim is to reach a system where decision-making is based on
reliable data and where errors are permanently corrected. A first
model of control laws was built and tested with rainfalls of different
return periods. The gains obtained in discharged volume vary from 19
to 100%. A new algorithm was then developed to optimize computation
time and thus overcome the combinatorial problem encountered in our
first approach. Finally, this new algorithm was tested with a 16-year
rainfall series. The obtained gains are 40% in the total volume
discharged to the natural environment and 65% in the number of
discharge events.
Abstract: The importance of energy efficiency within production processes is steadily increasing. Unfortunately, no tools yet exist for a comprehensive assessment of energy efficiency within the production process. Therefore, the Institute for Factory Automation and Production Systems at the Friedrich-Alexander-University Erlangen-Nuremberg has developed two methods with the goal of achieving transparency and a quantitative assessment of energy efficiency, namely the EEV (Energy Efficiency Value) and the EPE (Energetic Process Efficiency). This paper describes the basics and the state of the art as well as the developed approaches.
Abstract: Ecological systems are exposed to and influenced by various
natural and anthropogenic disturbances. These produce various effects
and states, with the system seeking a response symmetry toward a state
of global phase coherence, or stability and balance of its food webs.
This research project addresses the development of a computational
methodology for modeling plankton food webs. The use of algorithms to
establish connections, the generation of representative fuzzy
multigraphs, and the application of analysis techniques for complex
networks provide a set of tools for defining, analyzing, and
evaluating the community structure of coastal aquatic ecosystems,
beyond the estimation of possible external impacts on the networks.
Thus, this study aims to develop computational systems and data models
to assess how these ecological networks are structurally and
functionally organized, to analyze the types and degree of
compartmentalization and synchronization between the oscillatory and
interconnected elements of the network, and to evaluate the influence
of disturbances on the overall pattern of rhythmicity of the system.
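As a minimal sketch of the kind of representation described, the taxa, edge memberships, and use of networkx below are our own illustrative assumptions, not the paper's implementation:

```python
import networkx as nx

# Fuzzy multigraph: each (possibly parallel) edge carries a membership
# value in [0, 1] expressing the strength/confidence of a trophic link.
G = nx.MultiDiGraph()
interactions = [
    ("diatoms", "copepods", 0.9),       # grazing
    ("flagellates", "copepods", 0.6),
    ("copepods", "fish_larvae", 0.8),   # predation
    ("diatoms", "ciliates", 0.4),
    ("ciliates", "copepods", 0.5),
]
for prey, predator, mu in interactions:
    G.add_edge(prey, predator, membership=mu)

# A simple structural measure: summed outgoing membership as a proxy
# for a taxon's importance as a resource in the web.
strength = {
    n: sum(d["membership"] for _, _, d in G.out_edges(n, data=True))
    for n in G.nodes
}
print(sorted(strength.items(), key=lambda kv: -kv[1]))
```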
Abstract: The current paper presents the results of a case study.
During the past few years, the number of children diagnosed with
learning difficulties has drastically increased, especially cases of
ADHD (Attention Deficit Hyperactivity Disorder). One of the core
characteristics of ADHD is a deficit in
working memory functions. The review of the literature indicates a
plethora of educational software that aim at training and enhancing
the working memory. Nevertheless, the current paper explores the
possibility of using free online games for the same purpose. Another
issue of interest is the potential effect of working memory training
on the core symptoms of ADHD. In order
to explore the abovementioned research questions, three digital tests
are employed, all of which are developed on the E-slate platform by
the author, in order to check the levels of ADHD’s symptoms and to
be used as diagnostic tools, both at the beginning and at the end of
the case study. The tools used during the main intervention of the
research are free online games for the training of working memory.
The research and the data analysis focus on the following axes: a) the
presence and the possible change in two of the core symptoms of
ADHD, attention and impulsivity and b) a possible change in the
general cognitive abilities of the individual. The case study was
conducted, during after-school hours, with the participation of a
thirteen-year-old female student diagnosed with ADHD. The
results of the study indicate positive changes both in the levels of
attention and impulsivity. Therefore, we conclude that the training of
working memory through the use of free, online games has a positive
impact on the characteristics of ADHD. Finally, concerning the second
research question, no significant change in general cognitive
abilities was noted.
Abstract: This survey paper presents the recent state of model
comparison as it applies to Model-Driven Engineering. In Model-Driven
Engineering, calculating the difference between models is a very
important and challenging task. Model differencing involves a number
of tasks, starting with identifying and matching the elements of the
models. In this paper, we discuss how model matching is accomplished,
along with the strategies, techniques, and types of models involved.
We also discuss future directions. We found that many of the latest
model comparison strategies are geared toward enabling metamodel-based
and similarity-based matching, and that model versioning is the most
dominant application of model comparison. Recently, work on comparison
for versioning has begun to decline, giving way to different
applications. Finally, there is wide variation among the tools in the
amount of user effort needed to perform model comparisons, as some
require more effort to achieve greater generality and expressive
power.
Abstract: The teaching of computer programming to beginners has
generally been considered a difficult and challenging task. Several
methodologies and research tools have been developed; however, the
difficulty remains. Our work integrates
the state of the art in teaching programming with game software and
further provides metrics for the evaluation of student performance in
a collaborative game-playing activity. This paper presents a
multi-agent system architecture to be incorporated into educational
collaborative game software for teaching programming, which monitors,
evaluates, and encourages collaboration among the participants. A
literature review has been conducted on the concepts of Collaborative
Learning, multi-agent systems, collaborative games, and techniques for
teaching programming using these concepts simultaneously.
Abstract: The aim of this work was to characterize a potential target
group of people interested in participating in a training program in
organic farming in the context of mobile learning. The information
sought addressed, in particular but not exclusively, possible
contents, formats and forms of evaluation that will contribute to
defining the course objectives and curriculum, as well as to ensuring
that the course meets the needs and preferences of the learners. The
sample was selected from different European countries. The
questionnaires were delivered electronically for online completion,
and in the end 135 valid consented questionnaires were obtained. The
results allowed characterizing the target group and identifying their
training needs and preferences regarding m-learning formats, providing
valuable tools to design the training offer.
Abstract: Many software products offer a wide range and number of
features. This is called featuritis or creeping featurism, and it
tends to grow with each release of the product. Featuritis often adds
unnecessary complexity to software, leading to longer learning curves,
confusing users, and degrading their overall experience. We take a
look at an emerging design approach, the so-called "What You Get Is
What You Need" concept, which argues that products should be very
focused and simple, with minimalistic interfaces, in order to help
users carry out their tasks in distraction-free environments. This is
not as simple to implement as it might sound, since developers need to
cut down features. Our contribution illustrates and evaluates this
design method through a novel distraction-free diagramming tool named
Delineato Pro for Mac OS X, in which the user is confronted with an
empty canvas when launching the software and where tools only show up
when really needed.
Abstract: In order to obtain efficient pollutant removal in
small-scale wastewater treatment plants, uniform water flow has to be
achieved. The experimental setup, designed for treating high-load
wastewater (leachate), consists of two aerobic biological reactors and
a lamellar settler. Both biological tanks were aerated using three
different types of aeration systems: perforated pipes, membrane air
diffusers and ceramic tube diffusers. The ability of each air
diffusion system to homogenize the water mass was evaluated
comparatively. The oxygen concentration was determined by optical
sensors with data logging. The experimental data were analyzed
comparatively for all three air dispersion systems, aiming to identify
the variation in oxygen concentration under different operational
conditions. The oxygenation capacity was calculated for each of the
three systems and used as a performance and selection parameter. The
global mass transfer coefficients were also evaluated as important
tools in designing the aeration system. Even though the tubular porous
diffusers lead to higher oxygen concentrations than the perforated
pipe system (which produces medium-sized bubbles in the aqueous
solution), they do not reach the threshold of 80% oxygen saturation in
less than 30 minutes. The study has shown that the optimal solution
for the studied configuration was the radial air diffusers, which
ensure an oxygen saturation of 80% within 20 minutes. Performance
improved further when the air flow was increased.
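For context, the oxygenation capacity and the global mass transfer coefficient k_La are typically related through the standard first-order transfer model (stated here in its textbook form; the paper's exact definitions may differ):

```latex
\frac{\mathrm{d}C}{\mathrm{d}t} = k_L a \,\bigl(C_s - C(t)\bigr)
\;\Rightarrow\;
C(t) = C_s \bigl(1 - e^{-k_L a\, t}\bigr),
\qquad
t_{80\%} = \frac{\ln 5}{k_L a},
```

so reaching 80% saturation within 20 minutes corresponds to k_La ≈ ln 5 / 20 min ≈ 0.08 min⁻¹.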
Abstract: To help the expert validate association rules extracted from
data, several quality measures have been proposed in the literature.
We distinguish two categories: objective and subjective measures. The
former depend on a fixed threshold and on the quality of the data from
which the rules are extracted. The latter consist in providing the
expert with tools to explore and visualize the rules during the
evaluation step. However, the number of extracted rules to validate
remains high, which makes manual rule mining a very hard task. To
solve this problem, we propose in this paper a semi-automatic method
to assist the expert during association rule validation. Our method
uses rule-based classification as follows: (i) we transform
association rules into classification rules (classifiers); (ii) we use
the generated classifiers for data classification; (iii) we visualize
the association rules together with their classification quality, to
give the expert an overview and to assist him during the validation
process.
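A minimal sketch of steps (i) and (ii) as we read them (the rule format, the example items, and the accuracy measure are illustrative assumptions, not the authors' exact method):

```python
# Step (i): an association rule A -> c whose consequent c is a class
# label becomes the classification rule
#   "if all items of A are present, predict c".
def make_classifier(antecedent, label):
    antecedent = frozenset(antecedent)
    return lambda items: label if antecedent <= set(items) else None

# Hypothetical rule: {"milk", "bread"} -> "regular_customer"
clf = make_classifier({"milk", "bread"}, "regular_customer")

# Step (ii): apply the generated classifier to labeled data; the
# resulting classification quality is what gets visualized in step (iii).
data = [
    ({"milk", "bread", "eggs"}, "regular_customer"),
    ({"milk", "bread"}, "regular_customer"),
    ({"beer"}, "occasional_customer"),
]
covered = [(x, y) for x, y in data if clf(x) is not None]
accuracy = sum(clf(x) == y for x, y in covered) / len(covered)
print(f"coverage = {len(covered)}/{len(data)}, accuracy = {accuracy:.2f}")
```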
Abstract: The spindle system is one of the most important components
of a machine tool. The dynamic properties of the spindle affect the
machining productivity and the quality of the workpieces. Thus, it is
important and necessary to determine the dynamic characteristics of
spindles during design and development in order to avoid forced
resonance. The finite element method (FEM) has been adopted to obtain
the dynamic behavior of the spindle system. To this end, obtaining the
Campbell diagrams and determining the critical speeds are very useful
for evaluating the spindle system dynamics. The unbalance response of
the system to a center-of-mass unbalance at the cutting tool is also
calculated to investigate the dynamic behavior. In this paper, an
ANSYS Parametric Design Language (APDL) program based on the finite
element method has been implemented to perform the full dynamic
analysis and to evaluate the results. Results show that the calculated
critical speeds are far from the operating speed range of the spindle;
thus, the spindle would not experience resonance, and the maximum
unbalance response at the operating speed is still within acceptable
limits. APDL can be used by spindle designers as a tool to increase
product quality and to reduce cost and time in the design and
development stages.
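For reference, the two quantities evaluated follow from rotor dynamics basics (standard definitions, not equations quoted from the paper):

```latex
% Critical speeds: intersections of the 1x excitation line with the
% speed-dependent natural frequencies in the Campbell diagram.
\Omega_{\mathrm{cr},i}:\quad \Omega = \omega_i(\Omega),
\qquad
% Unbalance forcing grows quadratically with spin speed:
F_{\mathrm{unb}} = m\, e\, \Omega^{2},
```

where m is the unbalanced mass at the cutting tool, e its eccentricity from the spin axis, and ω_i(Ω) the i-th natural frequency, which depends on the spin speed through gyroscopic effects.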