Abstract: Vegetation affects the mean and turbulent flow
structure and may increase flood risks and sediment transport.
It is therefore important to develop analytical approaches for the bed
shear stress on vegetated beds in order to predict the resistance caused by
vegetation. In recent years, both experimental and numerical models
have been developed to model the effects of submerged
vegetation on open-channel flow. In this paper, different analytical
models are compared and tested using deviation criteria to
explore their capacity for predicting the mean velocity and to select a
suitable one for application to real rivers. The
comparison between data measured in a vegetated flume and
simulated mean velocities indicated good performance in the case
of rigid vegetation, with the Huthoff model showing the best
agreement, with a high coefficient of determination (R² = 80%) and the
smallest error in the prediction of the average velocities.
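The deviation criteria used to compare the analytical models (coefficient of determination and prediction error) can be sketched as follows; the velocity values below are illustrative, not the flume measurements of the study.

```python
import math

def r_squared(measured, simulated):
    """Coefficient of determination R^2 between measured and simulated values."""
    mean_m = sum(measured) / len(measured)
    ss_res = sum((m - s) ** 2 for m, s in zip(measured, simulated))
    ss_tot = sum((m - mean_m) ** 2 for m in measured)
    return 1.0 - ss_res / ss_tot

def rmse(measured, simulated):
    """Root mean squared error of the simulated values."""
    return math.sqrt(sum((m - s) ** 2 for m, s in zip(measured, simulated))
                     / len(measured))

# Illustrative depth-averaged velocities (m/s), not the paper's data.
measured  = [0.21, 0.28, 0.35, 0.41, 0.48]
simulated = [0.20, 0.30, 0.34, 0.43, 0.47]

print(round(r_squared(measured, simulated), 3))
print(round(rmse(measured, simulated), 4))
```

A model with R² close to 1 and the smallest RMSE would be ranked best under these criteria.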
Abstract: A Motzkin shift is a mathematical model for constraints
on genetic sequences. In terms of the theory of symbolic dynamics,
the Motzkin shift is nonsofic, and therefore, we cannot use the Perron-
Frobenius theory to calculate its topological entropy. The Motzkin
shift M(M,N), which comes from language theory, is defined to be the
shift system over an alphabet A that consists of N negative symbols,
N positive symbols and M neutral symbols. For an x in the full shift,
x will be in the Motzkin subshift M(M,N) if and only if every finite
block appearing in x has a non-zero reduced form. Therefore, the
constraint for x cannot be bounded in length. K. Inoue has shown that
the entropy of the Motzkin shift M(M,N) is log(M + N + 1). In this
paper, a new direct method of calculating the topological entropy of
the Motzkin shift is given without any measure theoretical discussion.
Abstract: This research study aimed to survey and analyze
pre-service teachers' attitudes toward analytical thinking development
based on Miller's Model. The informants of this study were 22 third-year
teacher students majoring in Thai. The course in which the
instruction was conducted was English for Academic Purposes in
Thai Language 2. The instrument of this research was an open-ended
questionnaire with two dimensions of questions: an academic and
a satisfaction dimension. The investigation revealed positive
attitudes. In the academic dimension, a majority of 12 informants (54.54%),
the highest percentage, reflected that the method of teaching
analytical thinking and language simultaneously was new
knowledge for them, and an equal percentage pointed to text cohesion
in writing. For the satisfaction dimension, the highest frequency count came
from 17 informants (77.27%), who favored the openness and
friendliness of the teacher.
Abstract: Construction cost estimation is one of the most
important aspects of construction project design. For generations, the
process of cost estimating has been manual, time-consuming and
error-prone. This has contributed to many cost estimates being unclear
and riddled with inaccuracies that at times lead to over- or underestimation
of construction cost. The development of standard sets of
measurement rules that are understandable by all those involved in a
construction project has not totally solved these challenges. Emerging
Building Information Modelling (BIM) technologies can exploit
standard measurement methods to automate the cost estimation process
and improve accuracy. This requires standard measurement
methods to be structured in an ontological and machine-readable format
so that BIM software packages can easily read them. Most standard
measurement methods are still text-based in textbooks and require
manual transfer into tables or spreadsheets during cost estimation. The
aim of this study is to explore the development of an ontology based
on New Rules of Measurement (NRM) commonly used in the UK for
cost estimation. The methodology adopted is Methontology, one of
the most widely used ontology engineering methodologies. The
challenges in this exploratory study are also reported and
recommendations for future studies proposed.
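As an illustration of the target representation, a measurement rule can be encoded in a machine-readable structure roughly as below; the class and the rule content are hypothetical examples, not entries of the NRM ontology developed in the study.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class MeasurementRule:
    """Hypothetical machine-readable form of a measurement rule."""
    element: str   # building element the rule applies to
    unit: str      # unit of measurement
    quantity: str  # what is measured
    note: str      # measurement convention

# Illustrative rule, not an actual NRM entry.
rule = MeasurementRule(
    element="Internal wall",
    unit="m2",
    quantity="area",
    note="Measure net area; deduct openings larger than a stated threshold.",
)

# Serialized form that estimation software could consume.
print(json.dumps(asdict(rule), indent=2))
```

An ontology adds formal classes, relations, and constraints on top of such structures, which is what makes automated reasoning over the rules possible.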
Abstract: In this paper, an improved semi-analytical
method based on scaled boundaries for solving 2D
elastodynamic problems is presented. In this approach, only the
boundaries of the problem domain are discretized, using specific
subparametric elements. By employing mapping functions, a class of
higher-order Lagrange polynomials as special shape functions,
Gauss-Lobatto-Legendre numerical integration, and the integral form
of the weighted residual method, diagonal coefficient matrices are
obtained in the equations of elastodynamics. The differences between
this study and prior research lie in the geometry production procedure
and in the choice of interpolation and integration functions.
The validity and accuracy of the present method are fully demonstrated
through two benchmark problems, which are successfully modeled
using a small number of DOFs. The numerical results agree very well
with the analytical solutions and the results from other numerical
methods.
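The diagonal coefficient matrices arise because the Lagrange shape functions are interpolatory at the Gauss-Lobatto-Legendre integration points. A minimal 1D sketch with the 3-point GLL rule (nodes -1, 0, 1 and weights 1/3, 4/3, 1/3):

```python
nodes = [-1.0, 0.0, 1.0]                     # 3-point Gauss-Lobatto-Legendre nodes
weights = [1.0 / 3.0, 4.0 / 3.0, 1.0 / 3.0]  # corresponding quadrature weights

def lagrange(i, x):
    """Lagrange polynomial L_i with L_i(nodes[j]) = delta_ij."""
    val = 1.0
    for j, xj in enumerate(nodes):
        if j != i:
            val *= (x - xj) / (nodes[i] - xj)
    return val

# Mass-type matrix entries integrated with the same GLL rule:
# M_ij = sum_k w_k * L_i(x_k) * L_j(x_k) = w_i * delta_ij, hence diagonal.
mass = [[sum(w * lagrange(i, x) * lagrange(j, x)
             for x, w in zip(nodes, weights))
         for j in range(3)] for i in range(3)]

for row in mass:
    print([round(v, 12) for v in row])
```

Since L_i(x_k) = δ_ik, the quadrature collapses M_ij to w_i δ_ij, which is the mechanism behind the diagonal matrices in the elastodynamic equations.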
Abstract: This study proposes a method for defining value
and function in the manufacturing sector. In discussing the present
condition of modeling methods, we note that the definition of
1D-CAE has so far remained ambiguous and insufficiently conceptual.
Across all physical fields, such methods are defined through the
formulation of differential algebraic equations that involve only time
derivatives, and through simulation. We propose a semi-acausal
modeling concept and a differential algebraic equation method as a
new modeling approach, whose efficiency has been verified through a
comparison of numerical analysis results between the semi-acausal
modeling calculation and an FEM calculation.
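The differential algebraic formulation can be illustrated on a toy system, a series RC circuit chosen here purely for illustration (not the study's model): one differential equation plus one algebraic constraint, stepped with backward Euler.

```python
import math

# Semi-explicit DAE for a series RC circuit driven by a constant source u:
#   differential:  C * dv/dt = i
#   algebraic:     0 = u - R*i - v
R, C, u = 1000.0, 1e-6, 5.0   # ohms, farads, volts (illustrative values)
tau = R * C

def backward_euler(v0, dt, steps):
    """Step the DAE; the algebraic constraint is eliminated at each step."""
    v = v0
    for _ in range(steps):
        # Solve v_new = v + dt * i_new / C with i_new = (u - v_new) / R:
        v = (v + dt * u / tau) / (1.0 + dt / tau)
    return v

dt, steps = 1e-5, 300          # integrate to t = 3*tau
v_num = backward_euler(0.0, dt, steps)
v_exact = u * (1.0 - math.exp(-steps * dt / tau))  # analytic solution
print(v_num, v_exact)
```

Acausal tools assemble such equation systems from component equations without a prescribed input-output direction; the solver then eliminates or integrates the constraints, as done by hand above.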
Abstract: The study of organisations’ information security
cultures has attracted scholars as well as healthcare services industry
to research the topic and find appropriate tools and approaches to
develop a positive culture. The vast majority of studies in Saudi
national health services are on the use of technology to protect and
secure health services information. On the other hand, there is a lack
of research on the role and impact of an organisation’s cultural
dimensions on information security. This research investigated and
analysed the role and impact of cultural dimensions on information
security in the Saudi Arabian health service. Hypotheses were tested and
two surveys were carried out in order to collect data and information
from three major hospitals in Saudi Arabia (SA). The first survey
identified the main cultural-dimension problems in SA health
services and developed an initial information security culture
framework model. The second survey evaluated and tested the
developed framework model to test its usefulness, reliability and
applicability. The model is based on human behaviour theory, where
the individual’s attitude is the key element of the individual’s
intention to behave as well as of his or her actual behaviour. The
research identified a set of cultural and sub-cultural dimensions in SA
health information security and services.
Abstract: High Peak to Average Power Ratio (PAPR) of the
transmitted signal is a serious problem in multicarrier systems (MC),
such as Orthogonal Frequency Division Multiplexing (OFDM), or in
Multi-Carrier Code Division Multiple Access (MC-CDMA) systems,
due to the large number of subcarriers. This effect can be reduced with
PAPR reduction techniques. Spreading sequences, in the
presence of the Saleh and Rapp models of a high power amplifier (HPA),
have a strong influence on system behavior. In this paper we
investigate the bit-error-rate (BER) performance of MC-CDMA
systems. Our simulations show that the MC-CDMA
system with an iterative algorithm provides significantly better
results than the conventional MC-CDMA system. The results of our analyses are
verified via simulation.
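PAPR itself can be illustrated with a short sketch: a naive inverse DFT over 64 QPSK-modulated subcarriers. The parameters are illustrative and unrelated to the simulation setup of the paper.

```python
import cmath, math, random

K = 64  # number of subcarriers (illustrative)
random.seed(1)
# Unit-power QPSK symbols on each subcarrier.
symbols = [cmath.exp(1j * (math.pi / 4 + math.pi / 2 * random.randrange(4)))
           for _ in range(K)]

# Naive inverse DFT giving one multicarrier time-domain symbol.
samples = [sum(symbols[k] * cmath.exp(2j * math.pi * k * n / K)
               for k in range(K)) / K
           for n in range(K)]

# Peak-to-average power ratio of the time-domain samples.
powers = [abs(s) ** 2 for s in samples]
papr = max(powers) / (sum(powers) / K)
papr_db = 10 * math.log10(papr)
print(round(papr_db, 2), "dB")
```

Subcarriers occasionally add in phase, so the peak power far exceeds the average; reduction techniques and the HPA nonlinearity both act on this quantity.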
Abstract: Recent research in neural networks science and
neuroscience for modeling complex time series data and statistical
learning has focused mostly on learning from high-dimensional input spaces and
signals. Local linear models are a strong choice for modeling local
nonlinearity in data series. Locally weighted projection regression is
a flexible and powerful algorithm for nonlinear approximation in
high-dimensional signal spaces. In this paper, different learning
scenarios with one- and two-dimensional data series of different
distributions are investigated in simulation; noise is then added to the
data distributions to create differently disordered time series and to
evaluate the algorithm's prediction of local nonlinearity. The
performance of the algorithm is simulated, and its sensitivity to the
data distribution when the data are widely spread or sparse, together
with the influence of the important local-validity parameter under
different data distributions, is explained.
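The local linear modeling idea can be sketched in one dimension with kernel-weighted least squares, a simplified stand-in for LWPR (which additionally projects high-dimensional inputs onto local low-dimensional directions):

```python
import math

def loess_predict(xs, ys, x0, h):
    """Fit a locally weighted linear model around x0 and predict y(x0).
    Weights come from a Gaussian kernel of bandwidth h."""
    w = [math.exp(-((x - x0) ** 2) / (2 * h * h)) for x in xs]
    sw = sum(w)
    mx = sum(wi * x for wi, x in zip(w, xs)) / sw
    my = sum(wi * y for wi, y in zip(w, ys)) / sw
    cov = sum(wi * (x - mx) * (y - my) for wi, x, y in zip(w, xs, ys))
    var = sum(wi * (x - mx) ** 2 for wi, x in zip(w, xs))
    slope = cov / var
    return my + slope * (x0 - mx)

# Noise-free nonlinear data series y = x^2; local linear fits track it well.
xs = [i * 0.05 for i in range(41)]   # x in [0, 2]
ys = [x * x for x in xs]
pred = loess_predict(xs, ys, 1.0, 0.2)
print(pred)  # close to 1.0, with a small smoothing bias
```

The bandwidth h plays the role of the local-validity parameter discussed above: a smaller h reduces the smoothing bias but makes the fit more sensitive to sparse or disordered data.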
Abstract: The Universal Modeling Method, well proven for industrial
compressors, was applied to the design of a high-flow-rate supersonic
stage. The results were checked by ANSYS CFX and NUMECA Fine
Turbo calculations. The impeller appeared to be very effective at
transonic flow velocities. Stator element efficiency is also acceptable at
the design Mach numbers. Their loss coefficient versus inlet flow
angle characteristics correlate well with the Universal Modeling
prediction. The impeller demonstrates satisfactory operation
at the design flow rate. The supersonic flow behavior in the impeller
inducer at the shroud blade-to-blade surface at the design flow
coefficient Φdes deserves additional study.
Abstract: There is an evident trend toward elevating the pressure ratio of a
single stage of turbo compressors, axial compressors in particular.
Whilst it was recently thought that a pressure ratio of 1.9 was a
reasonable limit, information has since appeared on successfully
tested stages with pressure ratios up to 2.8. The authors reckon that the
lack of information on high-pressure stages makes a study of the
rational choice of design parameters timely, before the problems of
high supersonic flow are solved. An engineering-type computer program was
developed. A sample of its application to study possible parameters
of the impeller of a stage with pressure ratio 3.0 is presented below.
The influence of the two main design parameters on expected efficiency,
peripheral blade speed, and flow structure is demonstrated. The results
led to the choice of a variant for further analysis and improvement by
CFD methods.
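The scale of the design task can be gauged with a textbook estimate of the peripheral blade speed implied by pressure ratio 3.0; the efficiency and work-coefficient values below are assumed for illustration and are not those of the study.

```python
# Rough estimate of the peripheral blade speed needed for a stage
# pressure ratio of 3.0, from the isentropic work and Euler's equation.
cp, gamma = 1005.0, 1.4        # air, J/(kg K)
T_inlet = 288.0                # inlet total temperature, K
pi_c = 3.0                     # stage total pressure ratio
eta = 0.80                     # assumed stage efficiency (illustrative)
psi = 0.70                     # assumed work (loading) coefficient (illustrative)

# Isentropic enthalpy rise for the target pressure ratio.
dh_is = cp * T_inlet * (pi_c ** ((gamma - 1.0) / gamma) - 1.0)
# Actual work input and the blade speed implied by dh = psi * u^2.
dh = dh_is / eta
u = (dh / psi) ** 0.5
print(round(u, 1), "m/s")
```

Blade speeds of this order (several hundred m/s) are what push the inducer flow into the supersonic regime discussed above.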
Abstract: This paper introduces novel approaches to partitioning
and mapping in terms of model-based embedded multicore system
engineering and further discusses benefits, industrial relevance and
features in common with existing approaches. In order to assess
and evaluate results, both approaches have been applied to a real
industrial application as well as to various prototypical demonstrator
applications that have been developed and implemented for
different purposes. Evaluations show that such applications improve
significantly in performance, energy efficiency, meeting of timing
constraints, and maintainability when using
the AMALTHEA platform and the implemented approaches.
Furthermore, the model-based design provides an open, expandable,
platform independent and scalable exchange format between
OEMs, suppliers and developers on different levels. Our proposed
mechanisms provide meaningful multicore system utilization since
load balancing by means of partitioning and mapping is effectively
performed with regard to the modeled systems including hardware,
software, operating system, scheduling, constraints, configuration and
more data.
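The mapping step can be illustrated with a classic greedy load-balancing heuristic (longest processing time first); this is a simplified stand-in, not the AMALTHEA partitioning algorithm itself, and the runnables are hypothetical.

```python
def lpt_mapping(runnables, n_cores):
    """Map runnables (name, execution time) to cores, longest first,
    always choosing the currently least-loaded core."""
    loads = [0.0] * n_cores
    mapping = {}
    for name, cost in sorted(runnables, key=lambda r: -r[1]):
        core = min(range(n_cores), key=lambda c: loads[c])
        mapping[name] = core
        loads[core] += cost
    return mapping, loads

# Hypothetical runnables with execution-time estimates (ms).
runnables = [("sensor", 7.0), ("filter", 5.0), ("fusion", 4.0),
             ("plan", 3.0), ("log", 2.0)]
mapping, loads = lpt_mapping(runnables, 2)
print(mapping)
print("makespan:", max(loads))
```

Model-based approaches refine this picture with communication costs, affinities, and timing constraints taken from the system model rather than bare execution times.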
Abstract: The mechanics of rip currents are complex, involving
interactions between waves, currents, water levels and the bathymetry,
which present particular challenges for numerical models. Here,
the effects of a grid-spacing dependent horizontal mixing on the
wave-current interactions are studied. Near the shore, wave rays
diverge from channels towards bar crests because of refraction by
topography and currents, in a way that depends on the rip current
intensity which is itself modulated by the horizontal mixing. At
low resolution with the grid-spacing dependent horizontal mixing,
the wave motion is the same for both coupling modes because the
wave deviation by the currents is weak. In the high-resolution case,
however, classical results are found with the stabilizing effect of
the flow by feedback of waves on currents. Lastly, wave-current
interactions and the horizontal mixing strongly affect the intensity
of the three-dimensional rip velocity.
Abstract: The number of Ground Motion Prediction Equations
(GMPEs) used for predicting peak ground acceleration (PGA) and
the number of earthquake recordings that have been used for fitting
these equations has increased in the past decades. The current PF-L
database contains 3550 recordings. Since GMPEs frequently
model peak ground acceleration, the goal of the present study was
to refit a selection of 44 of the existing equation models for PGA in
light of the latest data. The Levenberg-Marquardt algorithm was used
for fitting the coefficients of the equations and the results are
evaluated both quantitatively by presenting the root mean squared
error (RMSE) and qualitatively by drawing graphs of the five best
fitted equations. The RMSE was found to be as low as 0.08 for the
best equation models. The newly estimated coefficients vary from the
values published in the original works.
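The fitting step can be sketched with a minimal Levenberg-Marquardt loop on a deliberately simplified GMPE form, ln PGA = c1 + c2·M + c3·ln R, with synthetic records; the coefficients and data below are hypothetical, not the PF-L database or any of the 44 refitted models.

```python
import math

def model(c, mag, r):
    """Simplified GMPE form: ln PGA = c1 + c2*M + c3*ln(R)."""
    return c[0] + c[1] * mag + c[2] * math.log(r)

def solve3(a, b):
    """Solve a 3x3 linear system by Cramer's rule."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(a)
    return [det([[a[i][j] if j != k else b[i] for j in range(3)]
                 for i in range(3)]) / d for k in range(3)]

def lm_fit(data, c, lam=1e-3, iters=30):
    """Basic Levenberg-Marquardt loop for the model above."""
    def sse(c):
        return sum((y - model(c, m, r)) ** 2 for m, r, y in data)
    for _ in range(iters):
        # Jacobian rows and residuals at the current coefficients.
        J = [[1.0, m, math.log(r)] for m, r, _ in data]
        res = [y - model(c, m, r) for m, r, y in data]
        A = [[sum(J[k][i] * J[k][j] for k in range(len(J)))
              for j in range(3)] for i in range(3)]
        g = [sum(J[k][i] * res[k] for k in range(len(J))) for i in range(3)]
        damped = [[A[i][j] + (lam * A[i][i] if i == j else 0.0)
                   for j in range(3)] for i in range(3)]
        step = solve3(damped, g)
        trial = [ci + si for ci, si in zip(c, step)]
        if sse(trial) < sse(c):     # accept step, relax damping
            c, lam = trial, lam / 10.0
        else:                       # reject step, increase damping
            lam *= 10.0
    return c

# Synthetic, noise-free records from known coefficients (illustrative).
true_c = [-2.0, 0.8, -1.1]
data = [(m, r, model(true_c, m, r))
        for m in (4.0, 5.0, 6.0, 7.0) for r in (5.0, 20.0, 80.0)]
fit = lm_fit(data, [0.0, 0.0, 0.0])
print([round(x, 4) for x in fit])
```

On noise-free synthetic data the loop recovers the generating coefficients; on real recordings the residual RMSE quantifies the remaining misfit, as reported in the study.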
Abstract: The spring-driven ball-type check valve is one of the
most important components of hydraulic systems: it controls the
position of the ball and prevents backward flow. To simplify the
structure, the spring must be eliminated, and to accomplish this, the
flow pattern and the behavior of the check ball in L-shaped pipe must
be determined. In this paper, we present a full-scale model of a check
ball made of acrylic resin, and we determine the relationship between
the initial position of the ball and the position and diameter of the
inflow port. The check flow rate increases in a standard center-inflow
model, and it is possible to greatly decrease the check flow rate by
shifting the inflow away from the center.
Abstract: Job scheduling plays an important role in the efficient
utilization of grid resources available across different domains and
geographical zones. Scheduling of jobs is challenging and NP-complete.
Evolutionary and swarm intelligence algorithms have been
used extensively to address this NP-complete problem in grid scheduling.
The Artificial Bee Colony (ABC) algorithm, based on the foraging
behaviour of bees, has been proposed for optimization
problems. This work proposes a
modified ABC algorithm, Cluster Heterogeneous Earliest First Min-
Min Artificial Bee Colony (CHMM-ABC), to optimally schedule
jobs for the available resources. The proposed model utilizes a novel
Heterogeneous Earliest Finish Time (HEFT) Heuristic Algorithm
along with Min-Min algorithm to identify the initial food source.
Simulation results show the performance improvement of the
proposed algorithm over other swarm intelligence techniques.
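The Min-Min heuristic used in seeding the initial food source can be sketched as follows; the expected-time-to-compute values are illustrative, not the simulated workload.

```python
def min_min(etc):
    """Min-Min scheduling: etc[j][m] is the expected time of job j on
    machine m. Repeatedly assign the job with the smallest earliest
    completion time to the machine that achieves it."""
    n_jobs, n_machines = len(etc), len(etc[0])
    ready = [0.0] * n_machines          # machine ready times
    unassigned = set(range(n_jobs))
    schedule = {}
    while unassigned:
        best = None                     # (completion_time, job, machine)
        for j in sorted(unassigned):
            for m in range(n_machines):
                t = ready[m] + etc[j][m]
                if best is None or t < best[0]:
                    best = (t, j, m)
        t, j, m = best
        schedule[j] = m
        ready[m] = t
        unassigned.remove(j)
    return schedule, max(ready)

# Expected-time-to-compute matrix: 3 jobs x 2 machines (illustrative).
etc = [[3.0, 5.0],
       [1.0, 2.0],
       [4.0, 1.0]]
schedule, makespan = min_min(etc)
print(schedule, makespan)
```

In the proposed CHMM-ABC, a seed of this kind gives the bee colony a good starting food source instead of a random one.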
Abstract: The development, operation and maintenance of
Integrated Waste Management Systems (IWMS) essentially affects
the sustainability concerns of every region. The features of such systems
have great influence on all of the components of sustainability. In
order to optimize processes, a comprehensive
mapping of the variables affecting the future efficiency of the system
is needed, including analysis of the interconnections among the
components and modeling of their interactions. The planning of an
IWMS is based fundamentally on technical and economic
opportunities and the legal framework. Modeling the sustainability
and operation effectiveness of a certain IWMS is not in the scope of
the present research. The complexity of the systems and the large
number of the variables require the utilization of a complex approach
to model the outcomes and future risks. This complex method should
be able to evaluate the logical framework of the factors composing
the system and the interconnections between them. The authors of
this paper studied the usability of the Fuzzy Cognitive Map (FCM)
approach to modeling the future operation of IWMSs. The approach
requires two input data sets. One is the connection matrix containing
all the factors affecting the system in focus, together with all their
interconnections. The other input data set is a time series, a
retrospective reconstruction of the weights and roles of the factors.
This paper introduces a novel method to develop time series by
content analysis.
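FCM inference itself is a simple iteration: each concept's activation is updated from the weighted activations of the connected concepts through a squashing function. A minimal sketch with a hypothetical three-concept connection matrix (not the IWMS factors of the study):

```python
import math

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + math.exp(-lam * x))

def fcm_run(W, state, steps=50):
    """Iterate A_i(t+1) = f(A_i(t) + sum_j W[j][i] * A_j(t)),
    a common FCM update rule with a memory term."""
    for _ in range(steps):
        state = [sigmoid(state[i] + sum(W[j][i] * state[j]
                                        for j in range(len(state))))
                 for i in range(len(state))]
    return state

# Hypothetical weights: W[j][i] is the influence of concept j on concept i.
W = [[0.0, 0.6, -0.3],
     [0.4, 0.0, 0.5],
     [0.0, -0.2, 0.0]]
initial = [0.5, 0.5, 0.5]
final = fcm_run(W, initial)
print([round(a, 3) for a in final])
```

In the study's setting, the connection matrix would hold the IWMS factors and their interconnections, and the time series would calibrate the weights.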
Abstract: This study examines several critical dimensions of e-service
quality overlooked in the existing literature and proposes a
model and instrument framework for measuring customer perceived
e-service quality in the banking sector. The initial design was derived
from a pool of instrument dimensions and their items from the
existing literature by content analysis. Based on focus
group discussion, nine dimensions were extracted. An exploratory
factor analysis approach was applied to data from a survey of 323
respondents. The instrument has been designed specifically for the
banking sector. Research data was collected from bank customers
who use electronic banking in a developing economy. A nine-factor
instrument has been proposed to measure the e-service quality. The
instrument has been checked for reliability. The validity and the
sampling location limit the applicability of the instrument across
economies and service categories. Future research must be conducted to
check its validity. This instrument can help bankers in developing economies
like India to measure the e-service quality and make improvements.
The present study offers a systematic procedure that provides insight
into the conceptual and empirical comprehension of customer-perceived
e-service quality and its constituents.
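The reliability check mentioned above is commonly performed with Cronbach's alpha; a small sketch on made-up Likert scores (not the survey data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha; items[i][r] is the score of respondent r on item i."""
    k = len(items)
    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    n = len(items[0])
    totals = [sum(items[i][r] for i in range(k)) for r in range(n)]
    item_var = sum(variance(items[i]) for i in range(k))
    return k / (k - 1) * (1.0 - item_var / variance(totals))

# Made-up 5-point Likert scores: 3 items x 6 respondents.
items = [[4, 5, 3, 4, 2, 5],
         [4, 4, 3, 5, 2, 4],
         [5, 5, 2, 4, 3, 5]]
alpha = cronbach_alpha(items)
print(round(alpha, 3))
```

Values above roughly 0.7 are conventionally taken to indicate acceptable internal consistency of a dimension's items.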
Abstract: In medical imaging, segmentation of different areas of
human body like bones, organs, tissues, etc. is an important issue.
Image segmentation allows isolating the object of interest for further
processing that can lead for example to 3D model reconstruction of
whole organs. The difficulty of this procedure varies from trivial for
bones to quite difficult for organs like the liver. The liver is
considered one of the most difficult human organs to segment,
mainly because of its complexity, shape variability, and proximity
to other organs and tissues. Due to these facts, substantial user
effort usually has to be applied to obtain satisfactory image
segmentation results. The segmentation process then deteriorates from
an automatic or semi-automatic one to a fairly manual one. In this paper,
an overview of selected available software applications that can handle
semi-automatic image segmentation with subsequent 3D volume
reconstruction of the human liver is presented. The applications are
evaluated based on the segmentation results for several consecutive
DICOM images covering the abdominal area of the human body.
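The semi-automatic step such tools share can be sketched as seeded region growing on a toy intensity grid; this is a generic stand-in for slice-wise DICOM segmentation, not the algorithm of any of the evaluated applications.

```python
from collections import deque

def region_grow(image, seed, tol):
    """Grow a region from seed, accepting 4-connected pixels whose
    intensity differs from the seed intensity by at most tol."""
    h, w = len(image), len(image[0])
    base = image[seed[0]][seed[1]]
    mask = [[False] * w for _ in range(h)]
    queue = deque([seed])
    mask[seed[0]][seed[1]] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < h and 0 <= nc < w and not mask[nr][nc]
                    and abs(image[nr][nc] - base) <= tol):
                mask[nr][nc] = True
                queue.append((nr, nc))
    return mask

# Toy "slice": a bright organ-like blob (intensity ~100) on background (~20).
image = [[20, 20, 20, 20, 20],
         [20, 98, 102, 20, 20],
         [20, 101, 100, 99, 20],
         [20, 20, 97, 103, 20],
         [20, 20, 20, 20, 20]]
mask = region_grow(image, (2, 2), tol=10)
print(sum(cell for row in mask for cell in row), "pixels segmented")
```

The user interaction in the evaluated tools essentially supplies the seed and tolerance per slice; stacking the resulting masks yields the 3D volume.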
Abstract: The absorption power generation cycle based on the
ammonia-water mixture has attracted much attention for efficient
recovery of low-grade energy sources. In this paper, a thermodynamic
performance analysis is carried out for a Kalina cycle using
ammonia-water mixture as a working fluid for efficient conversion of
low-temperature heat source in the form of sensible energy. The
effects of the source temperature on the system performance are
extensively investigated by using the thermodynamic models. The
results show that the source temperature, as well as the ammonia mass
fraction, greatly affects the thermodynamic performance of the
cycle.
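The benefit of a hotter sensible source can be illustrated with the generic reversible efficiency limit for a stream cooled from TH to ambient T0, eta = 1 - T0·ln(TH/T0)/(TH - T0); this is a textbook exergy bound, not the Kalina-cycle model of the paper.

```python
import math

def sensible_source_limit(t_source, t_ambient):
    """Maximum (reversible) efficiency when a sensible heat source is
    cooled from t_source to t_ambient, rejecting heat at t_ambient.
    From the stream exergy: eta = 1 - T0*ln(TH/T0)/(TH - T0)."""
    ratio = t_source / t_ambient
    return 1.0 - math.log(ratio) / (ratio - 1.0)

t0 = 300.0  # ambient temperature, K (illustrative)
for th in (350.0, 400.0, 450.0):
    print(th, round(sensible_source_limit(th, t0), 3))
```

The bound rises with source temperature, mirroring the trend found in the cycle analysis; the ammonia mass fraction then determines how closely the real cycle approaches it.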