Abstract: Laser Metal Deposition (LMD) is an additive manufacturing
process whose capabilities include producing new parts directly from a
three-dimensional Computer Aided Design (3D CAD) model, building new
features on existing components, and repairing high-value components
that would previously have been discarded. Despite these capabilities
and its advantages over other additive manufacturing techniques, the
underlying physics of the LMD process is not yet fully understood,
probably because of the strong interaction between the processing
parameters; studying many parameters at the same time makes the process
even harder to understand. In this study, the effect of laser power and
powder flow rate on the physical properties (deposition height and
deposition width), a metallurgical property (microstructure) and a
mechanical property (microhardness) of laser-deposited Ti6Al4V, the
most widely used aerospace alloy, is studied. Because Ti6Al4V is very
expensive and LMD is capable of reducing the buy-to-fly ratio of
aerospace parts, the material utilization efficiency is also studied.
Four sets of experiments were performed and repeated to establish
repeatability, using laser powers of 1.8 kW and 3.0 kW and powder flow
rates of 2.88 g/min and 5.67 g/min, while keeping the gas flow rate and
scanning speed constant at 2 l/min and 0.005 m/s respectively. The
deposition height and width are found to increase with increasing laser
power and increasing powder flow rate. Material utilization is favoured
by higher laser power, while a higher powder flow rate reduces it. The
results are presented and fully discussed.
Abstract: Computers are increasingly being used as educational
tools in elementary/primary schools worldwide. A specific
application of such computer use is that of multimedia games, where
the aim is to combine pedagogy and entertainment. This study
reports on a case study in which an educational multimedia game was
developed for use by elementary school children. The stages of
the application's design, implementation and evaluation are
presented. Strengths of the game are identified and discussed, and its
weaknesses are identified, allowing suggestions for future redesigns.
The results show that the use of games can engage children
in the learning process for longer periods of time, with the added
benefit of the entertainment factor.
Abstract: In the present research, a finite element model is
presented to study the geometrical and material nonlinear behavior of
reinforced concrete plane frames considering soil-structure
interaction. The nonlinear behaviors of concrete and reinforcing steel
are considered in both compression and tension up to failure. The
model also accounts for the number, diameter, and distribution
of reinforcing bars along every cross section. Soil behavior is taken
into consideration using four different models, namely: the linear and
nonlinear Winkler models, and the linear and nonlinear continuum
models. A computer program (NARC) was specially developed in order to
perform the analysis. The results achieved by the present model show
good agreement with both theoretical and experimental results in the
published literature. The nonlinear behavior of a rectangular frame
resting on soft soil up to failure, analyzed with the proposed model,
is presented for demonstration.
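The linear and nonlinear Winkler models mentioned above idealize the soil as independent springs under the foundation. A minimal sketch of the difference, assuming an illustrative subgrade modulus k and ultimate bearing pressure p_ult (both hypothetical values, not taken from the paper):

```python
def winkler_reaction(w, k=5.0e4, p_ult=3.0e4):
    """Soil reaction pressure for a settlement w (illustrative units).

    Linear Winkler model: p = k * w.
    Nonlinear Winkler model (bilinear sketch): the reaction is capped
    at the ultimate bearing pressure p_ult once the soil 'yields'.
    """
    p_linear = k * w
    p_nonlinear = max(-p_ult, min(p_ult, k * w))
    return p_linear, p_nonlinear

# Small settlement: both models coincide
print(winkler_reaction(0.1))   # (5000.0, 5000.0)
# Large settlement: the nonlinear model saturates at p_ult
print(winkler_reaction(1.0))   # (50000.0, 30000.0)
```

In a frame analysis these reactions would enter the stiffness assembly at the foundation nodes; the bilinear cap is one common nonlinear idealization, not necessarily the exact soil law used by NARC.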
Abstract: Assessment of the IEP (Individual Education Plan) is an
important stage in the area of special education. This paper deals
with this problem by introducing computer software which processes
the data gathered from the application of an IEP. The software is
intended to be used by special education institutions in Turkey and
allows assessment of school and family trainings. The software has a
user-friendly interface, and its design includes graphical developer
tools.
Abstract: Trust is essential for further and wider acceptance of
contemporary e-services. It was first addressed almost thirty years
ago in the Trusted Computer System Evaluation Criteria standard by
the US DoD. However, this and other approaches proposed in that
period were actually addressing security. Roughly ten years ago,
methodologies followed that addressed the trust phenomenon at its
core; they were based on Bayesian statistics and its derivatives,
while some approaches were based on game theory. However, trust is a
manifestation of judgment and reasoning processes. It has to be dealt
with in accordance with this fact and adequately supported in cyber
environments. On the basis of results in the field of psychology
and our own findings, a methodology called qualitative algebra has
been developed, which deals with so far overlooked elements of the
trust phenomenon. It complements existing methodologies and provides a
basis for a practical technical solution that supports management of
trust in contemporary computing environments. Such a solution is also
presented at the end of this paper.
Abstract: In the analysis of structures, the nonlinear effects due to large displacements, large rotations and material nonlinearity are very important and must be considered for a reliable analysis. Nonlinear finite element analysis has potential as a usable and reliable means of analyzing civil structures with the availability of computer technology. In this research, the large-displacement and materially nonlinear behavior of a shear wall is presented, with a finite element code developed using the standard Galerkin weighted residual formulation. A two-dimensional plane stress model was used to represent the shear wall response. The Total Lagrangian formulation, which is computationally more effective, is used in the formulation of the stiffness matrices, and the Newton-Raphson method is applied for the solution of the nonlinear transient equations. The details of the program formulation are highlighted and the results of the analyses are presented, along with a comparison of the response of the structure with ANSYS software results. The model presented in this paper can be extended to the nonlinear analysis of civil engineering structures with different material behavior and complicated geometry.
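The Newton-Raphson iteration used for such nonlinear equilibrium equations can be illustrated on a single degree of freedom. This is a minimal sketch, not the paper's code; the softening spring law f = 100u - 10u^2 is a made-up example:

```python
def newton_raphson(f_int, k_tan, p_ext, u0=0.0, tol=1e-10, max_iter=50):
    """Solve f_int(u) = p_ext for the displacement u by Newton-Raphson.

    f_int: internal force as a function of displacement
    k_tan: tangent stiffness df_int/du
    """
    u = u0
    for _ in range(max_iter):
        residual = p_ext - f_int(u)   # out-of-balance force
        if abs(residual) < tol:
            return u
        u += residual / k_tan(u)      # tangent-stiffness correction
    return u

# Hypothetical softening spring: f = 100*u - 10*u**2
u = newton_raphson(lambda u: 100*u - 10*u**2,
                   lambda u: 100 - 20*u,
                   p_ext=90.0)
# u converges to 1.0, the physically meaningful root of 100u - 10u^2 = 90
```

In a finite element code the scalar residual becomes the out-of-balance force vector and the tangent stiffness becomes the assembled tangent matrix, but the iteration structure is the same.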
Abstract: Burst noise is a destructive kind of noise frequently found
in semiconductor devices and ICs, yet detecting and removing it has
proved challenging for IC designers and users. Based on the properties
of burst noise, a methodological approach is proposed in this paper by
which burst noise can be analysed and detected in the time domain. The
principles and properties of burst noise are expounded first.
Afterwards, the feasibility of burst noise detection by means of the
wavelet transform in the time domain is corroborated, and the
multi-resolution characteristics of Gaussian noise, burst noise and
blurred burst noise are discussed in detail through computer
simulation. Furthermore, a practical method for choosing the
parameters of the wavelet transform is obtained through extensive
experiments and data statistics. The methodology shows promise for a
wide variety of applications.
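Wavelet-based detection of this kind exploits the fact that the abrupt level shifts of burst (popcorn) noise produce large detail coefficients, while Gaussian noise does not. A minimal one-level Haar sketch, with an illustrative threshold and signal (a practical detector would also use shifted windows so that edges falling on even sample boundaries are not missed):

```python
import math

def haar_detail(x):
    """One-level Haar wavelet detail coefficients of a signal."""
    return [(x[2*i] - x[2*i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]

def detect_burst_edges(x, threshold):
    """Indices (in the half-rate detail sequence) where the detail
    coefficient exceeds the threshold, marking the abrupt level shifts
    characteristic of burst noise."""
    return [i for i, d in enumerate(haar_detail(x)) if abs(d) > threshold]

# Flat signal with a level shift between samples 6 and 7
signal = [0.0] * 7 + [5.0] * 9
print(detect_burst_edges(signal, threshold=1.0))   # [3]
```

The pair containing the transition (samples 6 and 7) yields a detail coefficient of magnitude 5/sqrt(2), well above the threshold, while pure flat (or slowly varying) segments give near-zero details.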
Abstract: The system is built from distributed components on three
main levels. First level: industrial computers placed in the control
room, which monitor the thermal and electrical processes based on the
data provided by the second level. Second level: PLCs, which collect
data from the process and transmit information to the first level;
they also take commands from this level, which are further passed to
the execution elements on the third level. Third level: field elements
consisting of three categories: data collecting elements; data
transfer elements from the third level to the second; and execution
elements, which take commands from the second-level PLCs, execute
them, and then transmit confirmation of execution back to them.
The purpose of the automatic operation is the optimization of the
co-generated electrical energy commissioned into the national energy
system and of the thermal energy commissioned to the consumers.
The integrated system treats the functioning of all the equipment and
devices as a whole: Gas Turbine Units (GTU); MT 20 kV Medium Voltage
Station (MVS); 0.4 kV Low Voltage Station (LVS); Main Hot Water
Boilers (MHW); Auxiliary Hot Water Boilers (AHW); Gas Compressor Unit
(GCU); Thermal Agent Circulation Pumping Unit (TPU); Water Treating
Station (WTS).
Abstract: Optical burst switching (OBS) has been proposed to
realize the next-generation Internet based on wavelength division
multiplexing (WDM) network technologies. In OBS, burst contention is
one of the major problems, and deflection routing has been designed
to resolve it. However, deflection routing becomes less able to
prevent burst contentions as the network load becomes high. In this
paper, we introduce flow rate control methods to reduce burst
contentions. We propose new flow rate control methods based on the
leaky bucket algorithm and deflection routing, namely the separate
leaky bucket deflection method and the dynamic leaky bucket deflection
method. In the proposed methods, the edge nodes that generate data
bursts carry out the flow rate control protocols. To verify the
effectiveness of flow rate control in OBS networks, we show through
computer simulations that the proposed methods improve network
utilization and reduce the burst loss probability.
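The leaky bucket algorithm underlying both proposed methods can be sketched as follows. The capacity and drain rate are illustrative, and this plain bucket is not the authors' separate or dynamic variant; it only shows the admission rule an edge node would apply:

```python
class LeakyBucket:
    """Leaky-bucket flow rate control (illustrative sketch).

    Bursts 'fill' the bucket, which drains at a constant rate.  A burst
    is admitted only if it fits in the remaining capacity, bounding the
    rate at which an edge node injects bursts into the OBS core.
    """
    def __init__(self, capacity, drain_rate):
        self.capacity = capacity      # bucket size (e.g. bytes)
        self.drain_rate = drain_rate  # drain per unit time
        self.level = 0.0
        self.last_time = 0.0

    def admit(self, burst_size, now):
        # Drain the bucket for the elapsed time, then test the burst.
        elapsed = now - self.last_time
        self.level = max(0.0, self.level - elapsed * self.drain_rate)
        self.last_time = now
        if self.level + burst_size <= self.capacity:
            self.level += burst_size
            return True   # burst sent into the core
        return False      # burst held back at the edge node

bucket = LeakyBucket(capacity=100, drain_rate=10)
print(bucket.admit(80, now=0.0))   # True  (80 fits in an empty bucket)
print(bucket.admit(80, now=1.0))   # False (level 70 + 80 exceeds 100)
print(bucket.admit(80, now=6.0))   # True  (level has drained to 20)
```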
Abstract: The analysis of the electromagnetic environment using
deterministic mathematical models is hampered by the impossibility
of analyzing a large number of interacting network stations with
a priori unknown parameters; this is characteristic, for example,
of mobile wireless communication networks. One of the tasks of the
tools used in designing, planning and optimizing mobile wireless
networks is to simulate the electromagnetic environment based on
mathematical modelling methods, including computer experiments, and
to estimate its effect on radio communication devices. This paper
proposes the development of a statistical model of the electromagnetic
environment of a mobile wireless communication network by describing
the parameters and factors affecting it, including the propagation
channel, together with their statistical models.
Abstract: In dissimilar material joints, failure often occurs
along the interface between the two materials due to stress
singularity. The stress distribution and its concentration depend on
the materials and the geometry of the junction. Inhomogeneity of the
stress distribution at the interface of a junction of two materials
with different elastic moduli, and the stress concentration in this
zone, are the main factors leading to rupture of the junction. The
effect of the joining angle at an aluminum-polycarbonate interface is
discussed in this paper. Computer simulation and finite element
analysis with ABAQUS showed that a convex interfacial joint leads to
stress reduction at the junction corners compared with a straight
joint. This finding is confirmed by photoelastic experimental results.
Abstract: Human computer interaction has progressed
considerably from the traditional modes of interaction. Vision-based
interfaces are a revolutionary technology, allowing interaction
through human actions and gestures. Researchers have developed
numerous accurate techniques; however, with few exceptions, these
techniques are not evaluated using standard HCI methods. In this
paper we present a comprehensive framework to address this issue. Our
evaluation of a computer vision application shows that, in addition to
accuracy, it is vital to address human factors.
Abstract: Road traffic accidents are a major cause of disability and death throughout the world. Controlling intelligent vehicles in order to reduce human error and ease congestion cannot be accomplished by human resources alone. The present article introduces an intelligent control system based on RFID technology. With the help of RFID technology, vehicles are connected to computerized systems, intelligent light poles and other hardware available along the way. In this project, the intelligent control system is capable of tracking all vehicles, crisis management and control, traffic guidance, and recording driving offences along the highway.
Abstract: Gesture recognition for the communication between human
and computer in interactive computing environments is studied
vigorously, and many studies have proposed efficient recognition
algorithms using images captured by 2D cameras. However, these
methods share a limitation: the extracted features cannot fully
represent the object in the real world. Although many studies have
used 3D features instead of 2D features for more accurate gesture
recognition, problems such as the processing time needed to generate
3D objects remain unsolved in related research. We therefore propose
a method to extract 3D features combined with 3D object
reconstruction. This method uses a modified GPU-based visual hull
generation algorithm which disables unnecessary processes, such as
texture calculation, to generate three kinds of 3D projection maps as
the 3D features: the nearest boundary, the farthest boundary, and the
thickness of the object projected onto the base plane. In the
experimental results, we present the results of the proposed method
on eight human postures: T shape, both hands up, right hand up, left
hand up, hands front, stand, sit and bend, and compare the
computational time of the proposed method with that of previous
methods.
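The three projection maps can be illustrated on a binary voxel grid. This CPU sketch stands in for the GPU-based visual hull pipeline, and the 1x1 voxel column in the example is made up:

```python
def projection_maps(voxels):
    """Nearest-boundary, farthest-boundary and thickness maps of a
    binary voxel grid voxels[x][y][z], projected along z onto the
    base plane.  Unoccupied columns are marked with -1 / 0."""
    nx, ny = len(voxels), len(voxels[0])
    nz = len(voxels[0][0])
    nearest = [[-1] * ny for _ in range(nx)]
    farthest = [[-1] * ny for _ in range(nx)]
    thickness = [[0] * ny for _ in range(nx)]
    for x in range(nx):
        for y in range(ny):
            occupied = [z for z in range(nz) if voxels[x][y][z]]
            if occupied:
                nearest[x][y] = occupied[0]     # first occupied slice
                farthest[x][y] = occupied[-1]   # last occupied slice
                thickness[x][y] = farthest[x][y] - nearest[x][y] + 1
    return nearest, farthest, thickness

# A single column occupied on slices 2..4
vox = [[[0, 0, 1, 1, 1, 0]]]
n, f, t = projection_maps(vox)
print(n[0][0], f[0][0], t[0][0])   # 2 4 3
```

Here "thickness" is taken as the span between the two boundaries; whether interior gaps should be excluded is a design choice this sketch does not settle.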
Abstract: The design of a steam turbine is a very complex
engineering operation that can be simplified and improved thanks to
computer-aided multi-objective optimization. This process makes use
of existing optimization algorithms and loss correlations to identify
those geometries that deliver the best balance of performance (i.e.
Pareto-optimal points).
This paper deals with a one-dimensional multi-objective and
multi-point optimization of a single-stage steam turbine. Using a
genetic optimization algorithm and an algebraic one-dimensional
ideal gas-path model based on loss and deviation correlations, a code
capable of performing the optimization of a predefined steam turbine
stage was developed. More specifically, during this study the
parameters modified (i.e. decision variables) to identify the best
performing geometries were solidity and angles both for stator and
rotor cascades, while the objective functions to maximize were
total-to-static efficiency and specific work done.
Finally, an accurate analysis of the obtained results was carried
out.
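The Pareto-optimal points sought by such a genetic optimization can be extracted from any set of candidate designs by a non-dominated filter. The design points below are made up, and this brute-force filter is only an illustration of Pareto optimality, not the paper's genetic algorithm:

```python
def pareto_front(points):
    """Return the Pareto-optimal subset of (efficiency, specific_work)
    pairs, with both objectives maximized.  A point is kept if no other
    point is at least as good in both objectives and distinct from it."""
    front = []
    for p in points:
        dominated = any(q[0] >= p[0] and q[1] >= p[1] and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical (total-to-static efficiency, specific work) candidates
designs = [(0.85, 120.0), (0.88, 110.0), (0.80, 130.0), (0.84, 115.0)]
print(pareto_front(designs))
# [(0.85, 120.0), (0.88, 110.0), (0.80, 130.0)]
```

The point (0.84, 115.0) is dropped because (0.85, 120.0) beats it in both objectives; the remaining three trade efficiency against specific work and form the front.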
Abstract: The purpose of this study was to investigate the effect
of combining Real Experimentation (RE) with Virtual
Experimentation (VE) on students' conceptual understanding of the
photoelectric effect. To achieve this, a pre-post comparison study
design involving 46 undergraduate students was used. Two groups were
set up for this study. Participants in the control group used RE to
learn the photoelectric effect, whereas participants in the
experimental group used RE in the first part of the curriculum and
VE in the other part. An achievement test was given to the groups
before and after the intervention as a pre-test and post-test. The
independent-samples t-test, one-way ANOVA and Tukey HSD test were
used to analyse the data obtained from the study. According to the
results of the analyses, the experimental group was found to be more
successful than the control group.
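An independent-samples t-test of the kind used in the analysis computes a statistic like the following. This sketch uses Welch's form (which does not assume equal variances) and entirely made-up post-test scores:

```python
import math

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples."""
    na, nb = len(sample_a), len(sample_b)
    ma = sum(sample_a) / na
    mb = sum(sample_b) / nb
    va = sum((x - ma) ** 2 for x in sample_a) / (na - 1)  # sample variance
    vb = sum((x - mb) ** 2 for x in sample_b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

experimental = [78, 85, 90, 82, 88]   # hypothetical post-test scores
control      = [70, 75, 72, 68, 74]
t = welch_t(experimental, control)    # t ≈ 5.14
```

A t value this large relative to its degrees of freedom would indicate a statistically significant difference between the groups; an actual analysis would compare it against the t distribution to obtain a p-value.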
Abstract: Estimating the lifetime distribution of computer networks, in which nodes and links exist in time and are bound to fail, is very useful in various applications. This problem is known to be NP-hard. In this paper we present efficient combinatorial approaches to the Monte Carlo estimation of the network lifetime distribution. We also present some simulation results.
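One combinatorial Monte Carlo scheme consistent with this problem statement (though not necessarily the authors' method) samples link failure times and computes each replication's lifetime with a union-find sweep. The i.i.d. exponential failure model and the two-link example are assumptions for illustration:

```python
import random

def network_lifetime(edges, fail_times, s, t):
    """Lifetime of s-t connectivity: the moment the last surviving s-t
    path breaks.  Adding edges in decreasing order of failure time with
    union-find, s and t first become connected at exactly that moment."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    for (u, v), ft in sorted(zip(edges, fail_times), key=lambda e: -e[1]):
        parent[find(u)] = find(v)
        if find(s) == find(t):
            return ft
    return 0.0   # s and t were never connected

def lifetime_cdf_estimate(edges, s, t, horizon, trials=10000, seed=1):
    """Monte Carlo estimate of P(lifetime <= horizon) under i.i.d.
    unit-rate exponential link failure times (an assumed model)."""
    rng = random.Random(seed)
    hits = sum(
        network_lifetime(edges, [rng.expovariate(1.0) for _ in edges],
                         s, t) <= horizon
        for _ in range(trials))
    return hits / trials

# Two parallel s-t links: lifetime = max of the two failure times,
# so P(lifetime <= 1) = (1 - e**-1)**2
print(lifetime_cdf_estimate([("s", "t"), ("s", "t")], "s", "t",
                            horizon=1.0))   # ≈ 0.40
```

The union-find sweep makes each replication near-linear in the number of links, which is what makes Monte Carlo estimation practical for this NP-hard distribution problem.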
Abstract: This paper presents a MATLAB-based system named Smart Access Network Testing, Analyzing and Database (SANTAD), designed for in-service transmission surveillance and self-restoration against fiber faults in fiber-to-the-home (FTTH) access networks. The developed program is installed with the optical line terminal (OLT) at the central office (CO) to monitor the status of the FTTH network and detect any fiber fault that occurs downstream from the CO towards residential customer locations. SANTAD is interfaced with an optical time domain reflectometer (OTDR) to accumulate every network testing result for display on a single computer screen for further analysis. The program identifies and presents the parameters of each optical fiber line, such as the line's status (working or non-working), the magnitude of loss at each point, the failure location, and other details shown on the OTDR's screen. The failure status is delivered to field engineers for prompt action, while the failed line is diverted to a protection line to keep traffic flowing. This approach has bright prospects for improving the survivability and reliability as well as increasing the efficiency and monitoring capabilities of FTTH networks.
Abstract: This paper discusses the use of explorative data
mining tools that allow the educator to explore new relationships
between reported learning experiences and actual activities,
even if there are multiple dimensions with a large number
of measured items. The underlying technology is based on
the so-called Compendium Platform for Reproducible Computing
(http://www.freestatistics.org), which was built on top of the
computational R Framework (http://www.wessa.net).
Abstract: In modern human computer interaction (HCI) systems,
emotion recognition is becoming an imperative characteristic.
The quest for effective and reliable emotion recognition in HCI has
resulted in a need for better face detection, feature extraction and
classification. In this paper we present the results of a feature
space analysis, after briefly explaining our fully automatic
vision-based emotion recognition method. We demonstrate the
compactness of the feature space and show how the 2D/3D-based method
achieves superior features for the purpose of emotion classification.
We also show that feature normalization creates a largely
person-independent feature space. As a consequence, the classifier
architecture has only a minor influence on the classification result.
This is particularly elucidated with the help of confusion matrices.
For this purpose, advanced classification algorithms, such as Support
Vector Machines and Artificial Neural Networks, are employed, as well
as the simple k-Nearest Neighbor classifier.
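The simple k-Nearest Neighbor classifier can be sketched on normalized feature vectors as follows; the training vectors and emotion labels are made up for illustration:

```python
import math
from collections import Counter

def knn_classify(features, train, k=3):
    """k-Nearest Neighbor classification of a feature vector.

    train is a list of (feature_vector, label) pairs; the query is
    assigned the majority label among its k nearest neighbors by
    Euclidean distance."""
    nearest = sorted(train, key=lambda fv: math.dist(features, fv[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical normalized 2-D feature vectors with emotion labels
train = [([0.10, 0.20], "happy"), ([0.15, 0.25], "happy"),
         ([0.90, 0.80], "angry"), ([0.85, 0.90], "angry"),
         ([0.12, 0.22], "happy")]
print(knn_classify([0.10, 0.20], train))   # happy
print(knn_classify([0.90, 0.85], train))   # angry
```

In a largely person-independent feature space like the one described above, even this minimal classifier can perform comparably to SVMs or neural networks, which is the abstract's point about classifier architecture mattering little.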