Abstract: POS (also called DGPS/IMU) technology can directly obtain the exterior orientation elements of aerial photographs, so aerial triangulation and DLG production using POS can save large numbers of ground control points (GCPs), improving the production efficiency of DLG and reducing the cost of GCP collection. This paper mainly investigates GCP distribution when POS is applied to the production of 1:10 000 scale DLG. We designed 23 GCP distribution schemes and carried out triangulation experiments using the integrated sensor orientation method; based on the triangulation results, we produced a map at the 1:10 000 scale and tested its accuracy. From these experiments, the paper puts forward appropriate GCP distribution schemes and lays the groundwork for applying POS technology to photogrammetric 4D data production.
Abstract: A systematic and exhaustive method based on the group
structure of a unitary Lie algebra is proposed to generate an enormous
number of quantum codes. With respect to the algebraic structure,
the orthogonality condition, which is the central rule of generating
quantum codes, is proved to be fully equivalent to the distinguishability
of the elements in this structure. In addition, quantum codes are
classified into four types according to the relation between the
codeword operators and some initial quantum state. By linking the
unitary Lie algebra with the additive group, the classical
correspondences of some of these quantum codes can be derived.
Abstract: To date, theoretical studies concerning the Carbon
Fiber Reinforced Polymer (CFRP) strengthening of RC beams with
openings have been rather limited. In addition, various numerical
analyses presented so far have effectively simulated the behaviour of
solid beams strengthened with FRP materials. In this paper, a
two-dimensional nonlinear finite element analysis is presented and
validated against the laboratory test results of six RC beams. All beams had the
same rectangular cross-section geometry and were loaded under four
point bending. The crack pattern results of the finite element model
show good agreement with the crack pattern of the experimental
beams. The load-midspan deflection curves of the finite element
models were stiffer than those of the experimental beams. A
possible reason is the perfect bond assumed between the concrete
and the steel reinforcement.
Abstract: Multimedia courseware has been accepted as a tool
that can support the teaching and learning process. The 'Li2D'
courseware was developed to assist students' visualization of the
topic of Loci in Two Dimensions. This paper describes an evaluation
of the effectiveness and usability of the 'Li2D' courseware. A
quasi-experiment was used for the effectiveness evaluation. Usability
evaluation was accomplished based on four constructs of usability,
namely: efficiency, learnability, screen design and satisfaction. An
evaluation of the multimedia elements was also conducted. A total of
63 Form Two students were involved in the study. The students were
divided into two groups: control and experimental. The experimental
group had to interact with 'Li2D' courseware as part of the learning
activities while the control group used the conventional learning
methods. The results indicate that the experimental group performed
better than the control group in understanding the Loci in Two
Dimensions topic. In terms of usability, the results showed that the
students agreed on the usability of the multimedia elements in the
'Li2D' courseware.
Abstract: With the drastic growth of optical communication
technology, a lossless, low-crosstalk and multifunctional optical
switch is most desirable for large-scale photonic networks. To realize
such a switch, we have introduced a new optical switch architecture
that embeds many functions on a single device. The asymmetrical
architecture of the OXADM consists of three parts: the selective port,
the add/drop operation, and path routing. The selective port permits
only the wavelength of interest to pass through, acting as a filter,
while the add and drop functions are implemented in the second part
of the OXADM architecture. The signals can then be re-routed to any
output port and/or undergo an accumulation function that multiplexes
all the signals onto a single path before they exit at any output port
of interest; this is done by the path-routing operation. The unique
features offered by the OXADM have extended its application to
Fiber-to-the-Home (FTTH) technology, where the OXADM is used
as a wavelength management element in the Optical Line Terminal
(OLT). Each port is assigned specific operating wavelengths, with
dynamic routing management ensuring that no traffic congestion
occurs in the OLT.
Abstract: The development and extension of large cities has
induced a need for shallow tunnels in the soft ground of built-up
areas. Estimating the ground settlement caused by tunnel excavation
is an important engineering problem. In this paper, the prediction of
surface subsidence caused by tunneling in one section of Line 7 of
the Tehran subway is considered. On the basis of the studied
geotechnical conditions of the region, a tunnel 26.9 km long has been
excavated by a mechanized method using an EPB-TBM with a
diameter of 9.14 m. In this regard, settlement is estimated using both
analytical methods and the numerical finite element method. The
numerical method shows that the settlement in this section is 5 cm,
while the analytical results (Bobet and Loganathan-Poulos) are 5.29
and 12.36 cm, respectively. According to the results of this study,
owing to the saturation of this section, there is good agreement
between the Bobet and numerical methods. Therefore, the tunneling
process in this section needs special consolidation measures and a
support system before the passage of the tunnel boring machine.
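As an illustration of the kind of settlement estimate discussed above, the short Python sketch below evaluates Peck's empirical Gaussian settlement trough, a standard textbook description rather than either of the closed-form solutions used in the paper; the 5 cm maximum settlement is taken from the numerical result, while the trough width parameter is an assumed value.

```python
import math

def peck_settlement(x, s_max, i):
    """Surface settlement at transverse offset x (Peck's Gaussian trough)."""
    return s_max * math.exp(-x ** 2 / (2 * i ** 2))

# s_max = 5 cm from the paper's numerical result; the trough width i is an
# assumed value (often taken near half the tunnel depth) for illustration only.
s_max_cm = 5.0
i_m = 10.0  # assumed trough width parameter in metres

for x in range(0, 31, 10):
    print(f"x = {x:2d} m  settlement = {peck_settlement(x, s_max_cm, i_m):.2f} cm")
```

The trough is maximal above the tunnel axis and decays rapidly with transverse distance, which is why surface structures near the centreline govern the damage assessment.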
Abstract: This paper introduces an approach to constructing a set of criteria for evaluating alternative options. Content analysis was used to collect criterion elements. The elements were then classified and organized into a hierarchical structure. The reliability of the constructed criteria was evaluated in an experiment. Finally, the criteria were used to evaluate alternative options in decision-making.
Abstract: The present paper presents a finite element model and
analysis for the interaction between a piezoresistive tactile sensor and
biological tissues. The tactile sensor is proposed for use in minimally
invasive surgery to deliver tactile information of biological tissues to
surgeons. The proposed sensor measures the relative hardness of soft
contact objects as well as the contact force. Silicone rubbers were
used as phantoms of biological tissues. Finite element analysis of
the silicone rubbers and of the mechanical structure of the sensor
was performed in the COMSOL Multiphysics (v3.4) environment.
The simulation results verify the sensor's capability to differentiate
between different kinds of silicone rubber materials.
Abstract: Complexity, as a theoretical background has made it
easier to understand and explain the features and dynamic behavior
of various complex systems. As the common theoretical background
has confirmed, borrowing the terminology for design from the
natural sciences has helped to control and understand urban
complexity. Phenomena like self-organization, evolution and
adaptation are appropriate to describe the formerly inaccessible
characteristics of the complex environment in unpredictable
bottom-up systems. Increased computing capacity has been a key element in
capturing the chaotic nature of these systems.
A paradigm shift in urban planning and architectural design has
forced us to give up the illusion of total control in urban
environment, and consequently to seek novel methods for
steering the development. New methods using dynamic modeling
have offered a real option for more thorough understanding of
complexity and urban processes. At best new approaches may renew
the design processes so that we get a better grip on the complex
world via more flexible processes, support urban environmental
diversity and respond to our needs beyond basic welfare by liberating
ourselves from the standardized minimalism.
A complex system and its features are as such beyond human
ethics. Self-organization or evolution is neither good nor bad; their
mechanisms are by nature devoid of reason. They are common to
natural processes and urban dynamics alike. They are features
of a complex system, and they cannot be prevented. Yet their
dynamics can be studied and supported.
The paradigm of complexity and the new design approaches have
been criticized for a lack of humanity and morality, but the ethical
implications of scientific or computational design processes have not
been much discussed. It is important to distinguish the (unexciting)
ethics of the theory and tools from the ethics of computer-aided
processes based on ethical decisions. Urban planning and architecture
cannot be based on the survival of the fittest; however, the natural
dynamics of the system cannot be impeded on the grounds of being
“non-human”.
In this paper the ethical challenges of using dynamic models
are contemplated in light of a few examples of new architecture,
dynamic urban models and the literature. It is suggested that ethical
challenges in computational design processes could be reframed
under the concepts of responsibility and transparency.
Abstract: Users now expect a higher level of DSP (Digital
Signal Processing) software quality than ever before. Prevention and
detection of defects are critical elements of software quality
assurance. In this paper, principles and rules for the prevention and
detection of defects are suggested; they are not universal guidelines,
but they are useful for both novice and experienced DSP software
developers.
Abstract: Modeling of a heterogeneous industrial fixed-bed
reactor for the selective dehydrogenation of heavy paraffins over a
Pt-Sn-Al2O3 catalyst is the subject of the current study. By applying
mass and momentum balances to an appropriate element of the
reactor and using pressure drop, rate and deactivation equations, a
detailed model of the reactor has been obtained. Mass balance
equations have been written for five different components. In order
to estimate reactor production over time, the reactor model, which is
a set of partial differential equations, ordinary differential equations
and algebraic equations, has been solved numerically.
The mole percentages of paraffins, olefins, dienes, aromatics and
hydrogen as functions of time and reactor radius have been found by
numerical solution of the model. Results of the model have been
compared with industrial reactor data at different operation times,
and the comparison confirms the validity of the proposed model.
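As a minimal illustration of the numerical solution strategy described above, the Python sketch below applies the method of lines to a far simpler model: a single first-order reaction in a plug-flow element, with made-up parameter values. It is not the paper's five-component reactor model; it only shows how an axial discretization turns a PDE into a set of ODEs that can be marched in time.

```python
import numpy as np

# Method-of-lines sketch: dC/dt = -v dC/dz - k C for one species A -> B.
# The axial derivative is discretized with first-order upwind differences,
# and the resulting ODE system is integrated with explicit Euler.
# All parameter values are illustrative, not taken from the industrial reactor.
n, L = 50, 1.0            # grid points, reactor length (m)
v, k = 1.0, 2.0           # velocity (m/s), rate constant (1/s)
dz = L / (n - 1)
dt = 0.4 * dz / v         # CFL-limited time step
C = np.zeros(n)           # initial concentration along the reactor
C_in = 1.0                # inlet concentration

for _ in range(2000):     # march to (near) steady state
    dCdz = np.empty(n)
    dCdz[0] = (C[0] - C_in) / dz          # upwind difference at the inlet
    dCdz[1:] = (C[1:] - C[:-1]) / dz
    C = C + dt * (-v * dCdz - k * C)

# At steady state the analytic profile is C_in * exp(-k z / v).
print(C[-1], C_in * np.exp(-k * L / v))
```

The same pattern scales to several coupled species and deactivation terms: each balance contributes one discretized equation per grid point, and the whole stack is integrated together.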
Abstract: Today, with the rapid growth of telecommunications and electronic equipment and the continuing expansion of power networks, the influence of electromagnetic waves on one another has become a hot topic of discussion. This article therefore presents the issue together with appropriate mechanisms for EMC. First, the impact of high-voltage lines on the surrounding environment, especially on the control room, is investigated; then, to reduce electromagnetic radiation, various shielding methods are presented and their shielding effectiveness is compared. It should be noted that the simulations were carried out with the finite element method (FEM).
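For reference, shielding effectiveness for electric fields is commonly expressed in decibels as SE = 20 log10(Ei/Et), the ratio of incident to transmitted field. The Python sketch below evaluates it for a few hypothetical shields; the field values are assumptions for illustration, not the article's FEM results.

```python
import math

def shielding_effectiveness_db(e_incident, e_transmitted):
    """Electric-field shielding effectiveness in dB: SE = 20 log10(Ei/Et)."""
    return 20.0 * math.log10(e_incident / e_transmitted)

# Illustrative transmitted-field magnitudes (V/m) behind three hypothetical
# shields for a 100 V/m incident field; the numbers are assumptions.
fields = {"no shield": 100.0, "aluminium mesh": 1.0, "steel sheet": 0.1}
for name, e_t in fields.items():
    print(f"{name:14s} SE = {shielding_effectiveness_db(100.0, e_t):6.1f} dB")
```

A factor of ten reduction in the transmitted field adds 20 dB, which is why comparisons between shielding materials are usually reported on this logarithmic scale.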
Abstract: This work focuses on the study of unburned carbon in
ash from the combustion of coal (and wastes): in eight combustion
tests at three fluidised-bed power stations, in the co-combustion of
coal and wastes (also in fluidised beds) and in a bench-scale unit
simulating coal combustion in small domestic furnaces. Attention is
paid to the unburned carbon contents of the bottom ashes and fly
ashes in these eight combustion tests and to the morphology of the
unburned carbon. The specific surface areas of the coals, unburned
carbons and ashes, and the relation between the specific surface area
of unburned carbon and the content of volatile combustibles in the
coal, were studied as well.
Abstract: The in-core memory requirement is a bottleneck in
solving large three-dimensional Navier-Stokes finite element
formulations using sparse direct solvers. An out-of-core solution
strategy is a viable alternative for reducing the in-core memory
requirement when solving large-scale problems. This study
evaluates the performance of various out-of-core sequential solvers
based on multifrontal or supernodal techniques in the context of
finite element formulations for three-dimensional problems on a
Windows platform. Three solvers, HSL_MA78, MUMPS and
PARDISO, are compared. The performance of these solvers is
evaluated on a 64-bit machine with 16 GB of RAM for a finite
element formulation of flow through a rectangular channel. It is
observed that with the out-of-core PARDISO solver, relatively large
problems can be solved. The implementation of Newton's and
modified Newton's iterations is also discussed.
Abstract: Nowadays, engineering ceramics have significant
applications in industries such as the automotive, aerospace,
electrical, electronics and even military industries, owing to their
attractive physical and mechanical properties: very high hardness
and strength at elevated temperatures, chemical stability, low friction
and high wear resistance. However, these interesting properties,
together with low heat conductivity, make their machining too hard,
costly and time-consuming. Many attempts have been made to make
the grinding of engineering ceramics easier, and many scientists have
tried to find techniques that economize ceramic machining processes.
This paper proposes a new diamond plunge grinding technique using
ultrasonic vibration for grinding alumina ceramic (Al2O3). For this
purpose, a set of laboratory equipment has been designed, simulated
using the Finite Element Method (FEM) and constructed for use in
various measurements. The results have been compared with those of
the conventional plunge grinding process without ultrasonic vibration
and indicate that the surface roughness and fracture strength
improved while the grinding forces decreased.
Abstract: In this paper, a tooth shape optimization method for
cogging torque reduction in Permanent Magnet (PM) motors is
developed using the Reduced Basis Technique (RBT) coupled with
Finite Element Analysis (FEA) and Design of Experiments (DOE)
methods. The primary objective of the method is to reduce the
enormous number of design variables required to define the tooth
shape. RBT is a weighted combination of several basis shapes. The
aim of the method is to find the best combination using the weights
for each tooth shape as the design variables. A multi-level design
process is developed to find suitable basis shapes or trial shapes at
each level that can be used in the reduced basis technique. Each level
is treated as a separate optimization problem until the required
objective – minimum cogging torque – is achieved. The process is
started with geometrically simple basis shapes that are defined by
their shape co-ordinates. The experimental design of Taguchi method
is used to build the approximation model and to perform
optimization. The method is demonstrated on the tooth shape
optimization of an 8-pole/12-slot PM motor.
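The core idea of the reduced basis parameterization can be sketched in a few lines: the candidate tooth profile is a weighted sum of basis shapes, so the optimizer searches over a handful of weights instead of every shape coordinate. The basis profiles below are illustrative stand-ins, not the paper's trial shapes.

```python
import numpy as np

# Reduced basis sketch: 50 shape coordinates collapse to 3 design variables.
s = np.linspace(0.0, 1.0, 50)             # normalized position along the tooth
basis = np.stack([np.ones_like(s),        # uniform offset
                  s,                      # linear taper
                  np.sin(np.pi * s)])     # bulge in the middle (illustrative)

def candidate_shape(weights):
    """Profile = sum_i w_i * basis_i: only the weights are design variables."""
    return np.asarray(weights, dtype=float) @ basis

shape = candidate_shape([1.0, 0.5, 0.2])
print(shape.shape)  # one radial offset per coordinate from only 3 weights
```

An optimizer (the paper uses a Taguchi experimental design with FEA evaluations) then only has to explore this low-dimensional weight space, which is what makes the multi-level search tractable.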
Abstract: In an assessment of the extractability of metals in
green liquor dregs from the chemical recovery circuit of a
semichemical pulp mill, the extractable concentrations of heavy metals in
artificial gastric fluid were between 10 (Ni) and 717 (Zn) times
higher than those in artificial sweat fluid. Only Al (6.7 mg/kg; d.w.),
Ni (1.2 mg/kg; d.w.) and Zn (1.8 mg/kg; d.w.) showed extractability
in the artificial sweat fluid, whereas Al (730 mg/kg; d.w.), Ba (770
mg/kg; d.w.) and Zn (1290 mg/kg; d.w.) showed clear extractability
in the artificial gastric fluid. As certain heavy metals were clearly
soluble in the artificial gastric fluid, careful handling of this
residue is recommended in order to prevent green liquor dregs from
entering the human gastrointestinal tract.
Abstract: In this paper a back-propagation artificial neural
network (BPANN) with the Levenberg-Marquardt algorithm is
employed to predict the limiting drawing ratio (LDR) of the deep
drawing process. To prepare a training set for the BPANN, a number
of finite element simulations were carried out. Die and punch radius,
die arc radius, friction coefficient, sheet thickness, yield strength of
the sheet and strain hardening exponent were used as the input data,
with the LDR as the specified output, in training the neural network.
Given these parameters, the trained network can estimate the LDR
for any new condition. Comparing the FEM and BPANN results, an
acceptable correlation was found.
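The training setup described above can be sketched as follows. Note two substitutions made for brevity: plain full-batch gradient descent stands in for the Levenberg-Marquardt algorithm, and the training data are synthetic (a made-up smooth function of six normalized inputs), not FEM simulation results.

```python
import numpy as np

# One-hidden-layer backpropagation network for an LDR-style regression.
# Gradient descent stands in for Levenberg-Marquardt; the data are synthetic.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 6))            # 6 normalized inputs
y = 1.8 + 0.4 * X[:, 0] - 0.3 * X[:, 3] + 0.1 * np.sin(3.0 * X[:, 5])

W1 = rng.normal(0.0, 0.5, (6, 8)); b1 = np.zeros(8)  # 8 tanh hidden units
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)  # linear output
lr = 0.05

for _ in range(3000):
    h = np.tanh(X @ W1 + b1)                     # forward pass
    err = (h @ W2 + b2).ravel() - y              # prediction error
    gW2 = h.T @ err[:, None] / len(X)            # output-layer gradients
    gb2 = np.array([err.mean()])
    dh = (err[:, None] @ W2.T) * (1.0 - h ** 2)  # backprop through tanh
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean(((np.tanh(X @ W1 + b1) @ W2 + b2).ravel() - y) ** 2))
ldr_pred = (np.tanh(np.full((1, 6), 0.5) @ W1 + b1) @ W2 + b2).item()
print(mse, ldr_pred)
```

Once trained on simulation samples like these, the network interpolates an LDR estimate for any new parameter combination without re-running a finite element model.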
Abstract: Maintenance costs incurred on buildings differ. The
differences can result from the type, function, age, building health
index, size, form, height, location and complexity of the building,
all of which contribute to the difficulty of developing a deterministic
maintenance cost model. This paper reports preliminary findings on
the creation of building maintenance cost distributions for
universities in Malaysia. The study is triggered by the need to
provide guides on maintenance cost distributions for decision
making. For this purpose, a questionnaire survey was conducted to
investigate the distribution of maintenance costs in the universities.
Altogether, responses were received from twenty universities, both
privately and publicly owned. The research found that engineering
services, roofing and finishes were the elements contributing the
largest segment of the maintenance costs. Furthermore, the study
indicates the significance of maintenance cost distributions as a
decision-making tool in maintenance management.
Abstract: Games can be classified as games of skill, games of chance, or otherwise as mixed. This paper deals with the topic of scientifically classifying mixed games as more reliant on elements of chance or on elements of skill, and with ways to scientifically measure the amount of skill involved. This is predominantly useful for classifying games as legal or illegal in different jurisdictions based on the local gaming laws. We propose a novel measure of the skill-to-chance ratio called the Game Skill Measure (GSM) and utilize it to calculate the skill component of a popular variant of Poker.
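The skill-versus-chance idea can be illustrated with a Monte Carlo sketch. This is not the paper's GSM formula (the abstract does not specify it), only a toy comparison: in a game where a player draws a card from 1 to 10 and may discard and redraw once, an informed strategy outscores a random one, and the gap between their mean scores hints at the game's skill component.

```python
import random

def play(redraw_policy, rng):
    """One round: draw a card in 1..10, optionally discard and redraw once."""
    card = rng.randint(1, 10)
    if redraw_policy(card):
        card = rng.randint(1, 10)
    return card

rng = random.Random(42)
n = 100_000
# "Skilled" player redraws low cards; "unskilled" player redraws at random.
skilled = sum(play(lambda c: c <= 5, rng) for _ in range(n)) / n
unskilled = sum(play(lambda c: rng.random() < 0.5, rng) for _ in range(n)) / n
print(skilled, unskilled)  # the informed strategy attains a higher mean score
```

In expectation the informed strategy scores 6.75 against 5.5 for random play; in a pure game of chance the two would coincide, so the size of this gap (suitably normalized) is one natural ingredient of a skill-to-chance measure.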