Abstract: In this paper we describe one critical research
program within a complex, ongoing multi-year project (2010 to 2014
inclusive) with the overall goal of improving the learning outcomes for
first year undergraduate commerce/business students within an
Information Systems (IS) subject with very large enrolment. The
single research program described in this paper is the analysis of
student attitudes and decision making in relation to the availability of
formative assessment feedback via Web-based real time conferencing
and document exchange software (Adobe Connect). The formative
assessment feedback between teaching staff and students is in respect
of an authentic problem-based, team-completed assignment. The
analysis of student attitudes and decision making is investigated via
a qualitative and then a quantitative application of the
Theory of Planned Behavior (TPB), using two separate, statistically
significant trial samples of the enrolled students. The initial
qualitative TPB investigation revealed that perceived self-efficacy,
improved time-management, and lecturer-student relationship
building were the major factors in shaping an overall favorable
student attitude to online feedback, whilst some students expressed
valid concerns with perceived control limitations identified within the
online feedback protocols. The subsequent quantitative TPB
investigation then confirmed that attitude towards usage, subjective
norms surrounding usage, and perceived behavioral control of usage
were all significant in shaping student intention to use the online
feedback protocol, with these three variables explaining 63 percent of
the variance in the behavioral intention to use the online feedback
protocol. The identification in this research of perceived behavioral
control as a significant determinant in student usage of a specific
technology component within a virtual learning environment (VLE)
suggests that VLEs could now be viewed not as a single, atomic
entity, but as a spectrum of technology offerings ranging from the
mature and simple (e.g., email, Web downloads) to the cutting-edge
and challenging (e.g., Web conferencing and real-time document
exchange). That is, not all VLEs should be considered the same.
The results of this research suggest that tertiary students have the
technological sophistication to assess a VLE in this more selective
manner.
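As a purely illustrative aside, the "63 percent of the variance" figure reported above is an R² from a regression of behavioral intention on the three TPB predictors. The sketch below, with invented survey scores (none of these numbers are from the paper), shows mechanically how such a variance-explained figure is computed via ordinary least squares:

```python
# Hypothetical TPB-style regression: behavioral intention (BI) on attitude
# (ATT), subjective norm (SN) and perceived behavioral control (PBC).
# All score values below are invented for illustration only.
att = [4.0, 3.5, 5.0, 2.0, 4.5, 3.0, 4.8, 2.5]
sn  = [3.8, 3.0, 4.6, 2.2, 4.0, 3.1, 4.4, 2.0]
pbc = [4.2, 3.2, 4.8, 2.5, 4.4, 2.8, 4.6, 2.6]
bi  = [4.1, 3.1, 4.9, 2.1, 4.3, 3.0, 4.7, 2.4]

X = [[1.0, a, s, p] for a, s, p in zip(att, sn, pbc)]  # design matrix with intercept

def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        pivot = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[pivot] = M[pivot], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Normal equations: (X^T X) beta = X^T y.
XtX = [[sum(row[i] * row[j] for row in X) for j in range(4)] for i in range(4)]
Xty = [sum(X[k][i] * bi[k] for k in range(len(X))) for i in range(4)]
beta = solve(XtX, Xty)

# R^2 = 1 - SS_res / SS_tot: the share of variance in BI explained by the model.
pred = [sum(b * x for b, x in zip(beta, row)) for row in X]
mean_bi = sum(bi) / len(bi)
ss_res = sum((y, p) and (y - p) ** 2 for y, p in zip(bi, pred))
ss_tot = sum((y - mean_bi) ** 2 for y in bi)
r2 = 1.0 - ss_res / ss_tot
print(round(r2, 3))
```

With real survey data the same computation would yield the R² ≈ 0.63 reported in the abstract; the toy data here are deliberately well-behaved.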
Abstract: In this paper, we consider and apply parametric
modeling to experimental data from dynamical systems. In this
study, we investigate the different distributions of output
measurements from several dynamical systems. Using variance
processing of the experimental data, we obtain the region of
nonlinearity in the data, and identification of the output
section is then applied under different situations and data distributions.
Finally, the effect of the spread of the measurements (such as the
variance) on identification, and the limitations of this approach,
are explained.
Abstract: The performance of different filtering approaches depends
on the modeling of the dynamical system and on the algorithm structure.
For modeling and smoothing the data, the evaluation of the posterior
distribution in each filtering approach should be chosen carefully.
In this paper, different filtering approaches, namely the Kalman filter
(KF), the extended Kalman filter (EKF), the unscented Kalman filter
(UKF) and the extended Kalman smoother (EKS), together with the
Rauch-Tung-Striebel (RTS) smoother, are simulated on several
trajectory-tracking problems, and the accuracy and limitations of these
approaches are explained. The probability of the model under the
different filters is then compared, and finally the effect of the noise
variance on the estimation is described with simulation results.
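To make the filter/smoother pairing discussed above concrete, here is a minimal self-contained sketch (not the paper's simulation) of a scalar Kalman filter followed by an RTS smoother on a simulated random-walk trajectory; the noise variances are assumed values:

```python
# Scalar Kalman filter + RTS smoother on a simulated random walk.
# q, r and the trajectory are illustrative assumptions, not the paper's setup.
import random

random.seed(0)

q, r = 0.01, 0.25   # process and measurement noise variances (assumed)
n = 50

# Simulate a random-walk state and noisy measurements of it.
x_true = [0.0]
for _ in range(n - 1):
    x_true.append(x_true[-1] + random.gauss(0.0, q ** 0.5))
z = [x + random.gauss(0.0, r ** 0.5) for x in x_true]

# Forward Kalman filter pass (identity dynamics, direct measurement).
x_f, p_f, x_pred, p_pred = [], [], [], []
x_est, p_est = 0.0, 1.0
for zk in z:
    xp, pp = x_est, p_est + q            # time update
    k = pp / (pp + r)                    # Kalman gain
    x_est = xp + k * (zk - xp)           # measurement update
    p_est = (1 - k) * pp
    x_pred.append(xp); p_pred.append(pp)
    x_f.append(x_est); p_f.append(p_est)

# Backward Rauch-Tung-Striebel (RTS) smoother pass.
x_s = x_f[:]
for t in range(n - 2, -1, -1):
    g = p_f[t] / p_pred[t + 1]           # smoother gain
    x_s[t] = x_f[t] + g * (x_s[t + 1] - x_pred[t + 1])

mse_f = sum((a - b) ** 2 for a, b in zip(x_f, x_true)) / n
mse_s = sum((a - b) ** 2 for a, b in zip(x_s, x_true)) / n
print(round(mse_f, 4), round(mse_s, 4))
```

The smoother revisits each estimate using later measurements, which is why RTS smoothing typically attains a lower error than the causal filter alone.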
Abstract: This paper presents a novel statistical description of
the counterpoise effective length due to lightning surges, where the
(impulse) effective length has been obtained by means of regression
formulas applied to the transient simulation results. The effective
length is described in terms of a statistical distribution function, from
which median, mean, variance, and other parameters of interest could
be readily obtained. The influence of lightning current amplitude,
lightning front duration, and soil resistivity on the effective length has
been accounted for, assuming statistical nature of these parameters. A
method for determining the optimal counterpoise length, in terms of
the statistical impulse effective length, is also presented. It is based on
estimating the number of dangerous events associated with lightning
strikes. The proposed statistical description and the associated method
provide valuable information which could aid the design engineer in
optimising physical lengths of counterpoises in different grounding
arrangements and soil resistivity situations.
Abstract: In-memory database systems are becoming popular
due to the availability and affordability of sufficiently large RAM and
processors in modern high-end servers with the capacity to manage
large in-memory database transactions. While fast and reliable
in-memory systems are still being developed to overcome cache misses,
CPU/IO bottlenecks and distributed transaction costs, disk-based data
stores still serve as the primary persistence. In addition, with the
recent growth in multi-tenancy cloud applications and associated
security concerns, many organisations consider the trade-offs and
continue to require the fast and reliable transaction processing of
disk-based database systems as an available choice. For these
organizations, the only way of increasing throughput is by improving
the performance of disk-based concurrency control. This warrants a
hybrid database system with the ability to selectively apply an
enhanced disk-based data management within the context of
in-memory systems that would help improve overall throughput.
The general view is that in-memory systems substantially
outperform disk-based systems. We question this assumption and
examine how a modified variation of access invariance, which we call
enhanced memory access (EMA), can be used to allow very high
levels of concurrency in the pre-fetching of data in disk-based
systems. We demonstrate how this prefetching in disk-based systems
can yield close to in-memory performance, which paves the way for
improved hybrid database systems. This paper proposes a novel EMA
technique and presents a comparative study between disk-based EMA
systems and in-memory systems running on hardware configurations
of equivalent power in terms of the number of processors and their
speeds. The results of the experiments conducted clearly substantiate
that when used in conjunction with all concurrency control
mechanisms, EMA can increase the throughput of disk-based systems
to levels quite close to those achieved by in-memory systems. The
promising results of this work show that enhanced disk-based
systems help improve hybrid data management within the
broader context of in-memory systems.
Abstract: In this study, conducted in the Akçasu
Forest Range District of the Devrek Forest Directorate, three methods
used in weed control for the regeneration of degraded oriental beech
forests (weed control with labourer power, cover removal with a
Hitachi F20 excavator, and weed control with agricultural equipment
mounted on a Ferguson 240S agricultural tractor) were compared.
In this respect, the three methods were compared by determining the
work hours and standard durations per unit area (1 hectare). For this
purpose, the tasks performed with human and machine power were
evaluated in terms of duration, productivity and cost, with the aim of
determining the most productive method under the actual ecological
conditions of the research field.
Within the scope of the study, time studies were conducted
for the three methods used in the weed control efforts. In those
studies, the implementations performed were evaluated by
dividing them into work stages. Actual data were used when
calculating the costs. In those calculations, up-to-date
formulas and equations, as also used in developed countries, were
utilized. Analysis of variance (ANOVA) was used to determine
whether there was any statistically significant difference among the
obtained results, and the Duncan test was used for grouping where
there was a significant difference. According to the
the measurements and findings carried out within the scope of this
study, it has been found during living cover removal efforts in
regeneration efforts in demolished oriental beech forests that the
removal of weed layer in 1 hectare of field has taken 920 hours with
labourer force, 15.1 hours with excavator and 60 hours with an
equipment mounted on a tractor. On the other hand, it has been
determined that the cost of removal of living cover in unit area (1
hectare) was 3220.00 TL for labourer power, 1250 TL for excavator
and 1825 TL for equipment mounted on a tractor.
According to the obtained results, the utilization of the excavator
in weed control for the regeneration of degraded oriental beech
areas under the actual ecological conditions of the research field
was found to be more productive in terms of both duration and cost.
These determinations should be repeated for weed control efforts in
degraded forest areas with different ecological conditions; this is
essential for finding the most efficient weed control method. These
findings will guide the technical staff of forestry directorates in
determining the most effective and economical weed control method.
Thus, more accurate data will be used when preparing weed control
budgets, and there will be significant contributions to the national
economy. The results of this and similar studies are also very
important for developing short- and long-term forestry policies.
Abstract: The tomato is a very important crop, whose
cultivation in the Mediterranean basin is severely affected by the
phytoparasitic weed Phelipanche ramosa. The semiarid regions of
the world are considered the main areas where this parasitic weed is
established causing heavy infestation as it is able to produce high
numbers of seeds (up to 500,000 per plant), which remain viable for
extended periods (more than 20 years). In this paper, the results
obtained from eleven treatments for controlling this parasitic
weed, including chemical, agronomic, biological and biotechnological
methods, compared with the untreated control under two plowing depths
(30 and 50 cm), are reported. A split-plot design with 3 replicates
was adopted. In 2014 a trial was performed in Foggia province
(southern Italy) on processing tomato (cv Docet) grown in the field
infested by Phelipanche ramosa. Tomato seedlings were transplanted
on May 5, on a clay-loam soil. During the growing cycle of the
tomato crop, at 56-78 and 92 days after transplantation, the number
of parasitic shoots emerged in each plot was detected. At tomato
harvesting, on August 18, the major quantity-quality yield parameters
were determined (marketable yield, mean weight, dry matter, pH,
soluble solids and color of fruits). All data were subjected to analysis
of variance (ANOVA) and the means were compared by Tukey's test.
None of the treatments studied provided complete control of
Phelipanche ramosa. However, among the different methods tested,
some of them (Fusarium, glyphosate, the radicon biostimulant and the
Red Setter tomato cv, an improved genotype obtained by TILLING
technology) under deeper plowing (50 cm depth) proved to mitigate
the virulence of the Phelipanche ramosa attacks. It is assumed that
these effects can be improved by combining some of these treatments,
especially for a gradual and continuing reduction of the
“seed bank” of the parasite in the soil.
Abstract: The objective of meta-analysis is to combine results
from several independent studies in order to create generalizations
and provide an evidence base for decision making. However, recent
studies show that the magnitude of effect size estimates reported in
many areas of research has changed significantly over time, and this
can impair the results and conclusions of meta-analysis. A number of
sequential methods have been proposed for monitoring effect
size estimates in meta-analysis. However, they are based on statistical
theory applicable only to the fixed-effect model (FEM) of meta-analysis.
For the random-effects model (REM), the analysis incorporates the
heterogeneity variance, τ², and its estimation creates complications.
In this paper we study the use of a truncated CUSUM-type test with
asymptotically valid critical values for sequential monitoring in REM.
Simulation results show that the test does not control the Type I error
well and is not recommended. Further work is required to derive an
appropriate test in this important area of application.
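As a general illustration of the CUSUM idea behind the test studied above (this is a generic sketch, not the authors' truncated test or its asymptotic critical values), a CUSUM path accumulates standardized deviations of incoming effect-size estimates, and a large excursion signals a shift:

```python
# Generic CUSUM-type statistic over a stream of effect-size estimates.
# The estimates below are hypothetical; a shift is built in after index 3.
effects = [0.30, 0.28, 0.33, 0.31, 0.52, 0.55, 0.58]

n = len(effects)
mean = sum(effects) / n
sd = (sum((e - mean) ** 2 for e in effects) / (n - 1)) ** 0.5

# CUSUM path: cumulative standardized deviations from the pooled mean.
cusum, path = 0.0, []
for e in effects:
    cusum += (e - mean) / sd
    path.append(cusum)

# Normalized maximum excursion; compared against a critical value in practice.
stat = max(abs(c) for c in path) / n ** 0.5
print(round(stat, 3))
```

In a real monitoring setting the critical value would come from the test's asymptotic distribution; under REM the extra uncertainty from estimating τ² is exactly what, per the abstract, makes Type I error control fail.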
Abstract: This study aims to develop an efficient fault
detection method for Global Navigation Satellite Systems (GNSS)
applications based on adaptive noise covariance estimation. Due to their
dependence on radio frequency signals, GNSS measurements are
dominated by systematic errors in the receiver’s operating environment.
In the proposed method, the pseudorange and carrier-phase
measurement noise covariances are obtained at time propagations and
measurement updates in process of Carrier-Smoothed Code (CSC)
filtering, respectively. The test statistics for fault detection are
generated by the estimated measurement noise covariances. To
evaluate the fault detection capability, intentional faults were added to
the field-collected measurements. The experimental results show that
the proposed method is efficient in detecting unhealthy measurements
and improves GNSS positioning accuracy in the presence of faults.
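The Carrier-Smoothed Code (CSC) filtering named above is conventionally implemented as a Hatch filter, in which the noisy pseudorange is smoothed using the far-less-noisy carrier-phase increments. The sketch below simulates this on invented measurements (window length and noise levels are assumptions, not the paper's values):

```python
# Hatch-filter sketch of Carrier-Smoothed Code (CSC) filtering.
# Ranges and noise levels are simulated and purely illustrative.
import random

random.seed(1)

true_range = [20000.0 + 2.0 * k for k in range(100)]             # metres
pseudorange = [r + random.gauss(0.0, 3.0) for r in true_range]   # code noise ~3 m
phase = [r + random.gauss(0.0, 0.01) for r in true_range]        # phase noise ~1 cm

smoothed = [pseudorange[0]]
for k in range(1, len(true_range)):
    n = min(k + 1, 30)  # smoothing window length (assumed 30 epochs)
    # Propagate the previous smoothed range with the carrier-phase delta,
    # then blend in the new pseudorange with weight 1/n.
    predicted = smoothed[-1] + (phase[k] - phase[k - 1])
    smoothed.append(pseudorange[k] / n + predicted * (n - 1) / n)

raw_rmse = (sum((p - r) ** 2 for p, r in zip(pseudorange, true_range))
            / len(true_range)) ** 0.5
csc_rmse = (sum((s - r) ** 2 for s, r in zip(smoothed, true_range))
            / len(true_range)) ** 0.5
print(round(raw_rmse, 2), round(csc_rmse, 2))
```

The residuals between the raw and smoothed series are also what make noise covariance estimable epoch by epoch, which is the quantity the proposed fault-detection statistics are built from.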
Abstract: One of the most important tasks in risk
management is the correct determination of the probability of default
(PD) of particular financial subjects. In this paper, the possibility of
determining a financial institution’s PD by means of credit-scoring
models is discussed. The paper is divided into two parts.
The first part is devoted to the estimation of the three different
models (based on the linear discriminant analysis, logit regression
and probit regression) from the sample of almost three hundred US
commercial banks. Afterwards these models are compared and
verified on the control sample with the view to choose the best one.
The second part of the paper is aimed at the application of the chosen
model on the portfolio of three key Czech banks to estimate their
present financial stability. However, it is no less important to be able
to estimate the evolution of PD in the future. For this reason, the
second task in this paper is to estimate the probability distribution of
the future PD for the Czech banks. The values of particular indicators
are therefore sampled randomly and the distribution of the PDs is
estimated, assuming that the indicators are distributed according to a
multidimensional subordinated Lévy model (specifically, the Variance
Gamma model and the Normal Inverse Gaussian model). Although the
obtained results show that all banks are relatively healthy, there is
still high chance that “a financial crisis” will occur, at least in terms
of probability. This is indicated by estimation of the various quantiles
in the estimated distributions. Finally, it should be noted that the
applicability of the estimated model (with respect to the used data) is
limited to the recessionary phase of the financial market.
Abstract: In this paper two approaches to joint signal detection,
time of arrival (ToA) and angle of arrival (AoA) estimation in
multi-element antenna array are investigated. Two scenarios were
considered: the first, in which the waveform of the useful signal
is known a priori, and the second, in which the waveform of the
desired signal is unknown. For the first scenario, the antenna array
signal processing based on multi-element matched filtering (MF)
with the following non-coherent detection scheme and maximum
likelihood (ML) parameter estimation blocks is exploited. For the second
scenario, the signal processing based on the antenna array elements
covariance matrix estimation with the following eigenvector analysis
and ML parameter estimation blocks is applied. The performance
characteristics of both signal processing schemes are thoroughly
investigated and compared for different useful signals and noise
parameters.
Abstract: Mech-Degla, Degla-Beida and Frezza are common date
(Phoenix dactylifera L.) varieties with more or less good
availability and low trade value. Some morphological and
physicochemical factors were determined. The results show that the
whole date weight differs significantly (P = 95%) between
Mech-Degla and Degla-Beida, which are more commercialized than
Frezza, whereas the pulp mass proportion relative to the whole fruit is
highest for Frezza (88.28%). Moreover, there is a large variability
concerning the weights and densities of constitutive tissues in each
variety. The white tissue is dominant in Mech-Degla, in contrast to
the two other varieties. The analysis of variance showed that the
difference in weight between the brown and white tissues is significant
(P = 95%) for all studied varieties. Some other morphological and
chemical properties of the whole pulps and their two constitutive
parts (brown, or pigmented, and white) were also investigated. The
predominance of phenolics in the brown pulp parts of Mech-Degla
(4.01 g/100 g, w.b.) and Frezza (4.96 g/100 g, w.b.) is the main result
revealed in this study.
Abstract: This paper presents the findings of an experimental investigation of important machining parameters for a horizontal boring tool modified to mount on a horizontal lathe in order to bore an over-length workpiece. To verify the usability of the modified tool, a design of experiments based on the Taguchi method was performed. The parameters investigated are spindle speed, feed rate, depth of cut and length of workpiece. A Taguchi L9 orthogonal array was selected for the four factors at three levels in order to minimize the surface roughness (Ra and Rz) of S45C steel tubes. Signal-to-noise ratio analysis and analysis of variance (ANOVA) were performed to study the effects of these parameters and to optimize the machine settings for the best surface finish. The controlled factors with the most effect are, in order, depth of cut, spindle speed, length of workpiece and feed rate. A confirmation test was performed to verify the optimal settings obtained from the Taguchi method, and the result is satisfactory.
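Since surface roughness is to be minimized, the signal-to-noise analysis above would use Taguchi's "smaller-the-better" criterion, S/N = -10·log10(mean(y²)). A minimal sketch with invented Ra measurements (not the paper's data) shows how the best run is picked:

```python
# Taguchi "smaller-the-better" S/N ratios for surface roughness.
# Ra values (micrometres) per experimental run are invented for illustration.
import math

runs = {
    1: [2.10, 2.25, 2.18],
    2: [1.45, 1.52, 1.48],
    3: [3.05, 2.95, 3.10],
}

# S/N = -10 * log10(mean of squared responses); higher is better here,
# since it corresponds to lower roughness.
sn = {run: -10.0 * math.log10(sum(y * y for y in ys) / len(ys))
      for run, ys in runs.items()}

best = max(sn, key=sn.get)
print(best, round(sn[best], 2))
```

In a full Taguchi analysis the S/N ratios would then be averaged per factor level across the L9 array to select the optimal setting of each factor.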
Abstract: This study investigated the implementation of
Neural Network (NN) techniques for the prediction of the loading of Cu
ions onto clinoptilolite. An experimental design using analysis of
variance (ANOVA) was chosen for testing the adequacy of the
Neural Network and for optimizing the effective input parameters
(pH, temperature and initial concentration). A feed-forward, multi-layer
perceptron (MLP) NN successfully tracked the non-linear behavior of
the adsorption process versus the input parameters, with a mean squared
error (MSE), correlation coefficient (R) and minimum squared error
(MSRE) of 0.102, 0.998 and 0.004, respectively. The results showed
that NN modeling techniques can effectively predict and simulate
highly complex systems and non-linear processes such as ion exchange.
Abstract: Today, following the developments in the area of
science and technology requires keeping up with the pace of the
advancements in this area. As in every profession, apart from their
personal efforts, the training of teachers in the period after they start
their careers is only possible through in-service training. The aim of
the present study is to determine the views of Information
Technologies (IT) teachers regarding the in-service training courses
organized by the Ministry of National Education. In this study, in
which quantitative research methods and techniques were employed,
the views of 196 IT teachers were collected by using the “Views on
In-service Training” questionnaire developed by the authors of the
paper. Independent groups t-test was used to determine whether the
views of IT teachers regarding in-service training differed depending
on gender, age and professional seniority. One-way analysis of
variance (ANOVA) was used to investigate whether the views of IT
teachers regarding in-service training differed depending on the
number of in-service training courses they joined and the type of
in-service training course they wanted to take. According to the findings
obtained in the study, the views of IT teachers on in-service training
did not show a significant difference depending on gender and age,
whereas those views differed depending on professional seniority, the
number of in-service training courses they joined and the type of
in-service training course they wanted to take.
Abstract: In this study, Electrical Discharge Machining (EDM) is used to modify the surface of high carbon steel En31 with the help of a tool electrode (Copper-Chromium-Nickel) manufactured by the powder metallurgy (PM) process. The effect of EDM on surface roughness during surface alloying is studied. Taguchi’s design of experiments (DOE) with an L18 orthogonal array is used to find the best levels of the input parameters in order to achieve a high surface finish. Six input parameters are considered, and their percentage contributions towards surface roughness are investigated by analysis of variance (ANOVA). Experimental results show that a hard alloyed surface (1.21% carbon, 2.14% chromium and 1.38% nickel) with a surface roughness of 3.19 µm can be generated using EDM with a PM tool. Additionally, techniques such as Scanning Electron Microscopy (SEM) and Energy Dispersive Spectroscopy (EDS) are used to analyze the machined surface and the EDMed layer composition, respectively. The increase in machined surface micro-hardness (101%) may be related to the formation of carbides containing chromium.
Abstract: Lately, with the increasing number of location-based applications, the demand for highly accurate and reliable indoor localization has become urgent. This is a challenging problem due to measurement variance, which is a consequence of various factors such as obstacles, equipment properties and environmental changes in the complex nature of indoor environments. In this paper we propose a low-cost, custom-setup infrastructure solution and a localization algorithm based on the Weighted Centroid Localization (WCL) method. Localization accuracy is increased by several enhancements: calibration of the RSSI values obtained from the wireless nodes, repeated RSSI measurements to exclude deviating values from the position estimation, and consideration of the orientation of the device relative to the wireless nodes. We conducted several experiments to evaluate the proposed algorithm. A high accuracy of approximately 1 m was achieved.
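The core WCL step referenced above estimates the device position as the centroid of the anchor-node positions weighted by received signal strength. A minimal sketch, with a hypothetical node layout and RSSI values and an assumed weighting exponent (the paper's calibration and orientation enhancements are omitted):

```python
# Weighted Centroid Localization (WCL) sketch; layout and RSSI are invented.
anchors = [  # ((x, y) in metres, RSSI in dBm measured at the device)
    ((0.0, 0.0), -48.0),
    ((10.0, 0.0), -62.0),
    ((0.0, 10.0), -60.0),
    ((10.0, 10.0), -70.0),
]

g = 2.0  # weighting exponent (assumed); larger g favours the strongest nodes

# Convert RSSI to a positive weight: a stronger (less negative) signal,
# i.e. a closer node, yields a larger weight.
weights = [(pos, (1.0 / abs(rssi)) ** g) for pos, rssi in anchors]
total = sum(w for _, w in weights)
x = sum(p[0] * w for p, w in weights) / total
y = sum(p[1] * w for p, w in weights) / total
print(round(x, 2), round(y, 2))  # estimate pulled toward the strongest anchor
```

The enhancements described in the abstract (RSSI calibration, outlier rejection over repeated measurements, orientation compensation) all act on the `rssi` inputs before this weighting step.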
Abstract: This paper proposes a GLMM with spatial and
temporal effects for malaria data in Thailand. A Bayesian method is
used for parameter estimation via Gibbs sampling MCMC. A
conditional autoregressive (CAR) model is assumed to represent the
spatial effects. The temporal correlation is represented through the
covariance matrix of the random effects. The malaria quarterly data
have been extracted from the Bureau of Epidemiology, Ministry of
Public Health of Thailand. The factors considered are rainfall and
temperature. The result shows that rainfall and temperature are
positively related to the malaria morbidity rate. The posterior means
of the estimated morbidity rates are used to construct the malaria
maps. The top 5 highest morbidity rates (per 100,000 population) are
in Trat (Q3, 111.70), Chiang Mai (Q3, 104.70), Narathiwat (Q4,
97.69), Chiang Mai (Q2, 88.51), and Chanthaburi (Q3, 86.82).
According to the DIC criterion, the proposed model has a better
performance than the GLMM with spatial effects but without
temporal terms.
Abstract: Using the technology acceptance model (TAM), this
study examined the external variables of technological complexity
(TC) to acquire a better understanding of the factors that influence the
acceptance of computer application courses by learners at Active
Aging Universities. After the learners in this study had completed a
27-hour Facebook course, 44 learners responded to a modified TAM
survey. Data were collected to examine the path relationships among
the variables that influence the acceptance of Facebook-mediated
community learning. The partial least squares (PLS) method was used
to test the measurement and the structural model. The study results
demonstrated that attitudes toward Facebook use directly influence
behavioral intentions (BI) with respect to Facebook use, evincing a
high prediction rate of 58.3%. In addition to the perceived usefulness
(PU) and perceived ease of use (PEOU) measures that are proposed in
the TAM, other external variables, such as TC, also indirectly
influence BI. These four variables can explain 88% of the variance in
BI and demonstrate a high level of predictive ability. Finally,
limitations of this investigation and implications for further research
are discussed.
Abstract: This paper analyzes word semantics, focusing on the invariance of advanced imagery in several pressing problems. Interest in the language of imagery has been prompted by the introduction, in linguistics, of a new paradigm whose center is the personality of the speaker (the subject of the language). Particularly noteworthy is the question of the place of the image in discussions of lexical and phraseological meaning, and of the relationship between imagery and metaphor. In part, the formation of a metaphor, as an interaction between two intellective entities, occurs at the cognitive level, and it is the category of the image, having cognitive roots, that aids in the correct interpretation of the results of this process at the lexical-semantic level.