Abstract: To ensure student success in a non-majors biology course, a flipped classroom pedagogical approach was developed and implemented. All students were assigned online lectures to listen to before coming to class. A three-hour lecture was split into a one-hour online component, one hour of in-class lecture, and one hour of worksheets completed by students in the classroom. This deviation from a traditional three-hour in-class lecture resulted in increased student interest in science as well as better understanding of difficult scientific concepts. A pre- and post-survey was given to measure interest in the subject, and grades were used to measure success rates. While the overall grade average did not change dramatically, students reported a much better appreciation of biology. Students also overwhelmingly liked the use of worksheets in class to help them understand the concepts, and they liked the fact that they could listen to lectures online at their own pace and repeat them if needed. The flipped classroom approach turned out to work very well for our non-science majors, and the author is ready to implement it in other classrooms.
Abstract: Thousands of organisations store important and
confidential information related to them, their customers, and their
business partners in databases all across the world. The stored data
ranges from less sensitive (e.g. first name, last name, date of birth) to
more sensitive data (e.g. password, pin code, and credit card
information). Losing data, disclosing confidential information, or
even altering the value of data are among the severe damages that a
Structured Query Language injection (SQLi) attack can cause to a
given database. SQLi is a code injection technique in which malicious
SQL statements are inserted into a given SQL database simply by using
a web browser. In this paper, we propose an effective pattern
recognition neural network model for detection and classification of
SQLi attacks. The proposed model is built from three main elements:
a Uniform Resource Locator (URL) generator that generates
thousands of malicious and benign URLs; a URL classifier that
1) labels each generated URL as either benign or malicious and
2) assigns the malicious URLs to different SQLi attack categories;
and a neural network (NN) model that 1) detects whether a
given URL is malicious or benign and 2) identifies the
type of SQLi attack for each malicious URL. The model is first
trained and then evaluated by employing thousands of benign and
malicious URLs. The results of the experiments are presented in
order to demonstrate the effectiveness of the proposed approach.
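The model itself is only sketched in the abstract; as a purely hypothetical illustration of the kind of lexical URL features such a pattern recognition classifier might consume, one could scan a URL's decoded query string for common SQLi indicators (the pattern list below is our own simplification, not the paper's feature set):

```python
import re
from urllib.parse import urlparse, unquote

# Illustrative SQLi indicator patterns -- a simplified stand-in for the
# features a trained classifier would learn, not the paper's actual model.
PATTERNS = [
    r"(?i)\bunion\b.*\bselect\b",   # union-based injection
    r"(?i)\bor\b\s+\d+\s*=\s*\d+",  # tautology, e.g. OR 1=1
    r"--|#",                        # SQL comment to truncate the query
    r"(?i)\bsleep\s*\(",            # time-based blind injection
]

def url_features(url):
    """Return a binary feature vector: one entry per suspicious pattern."""
    query = unquote(urlparse(url).query)
    return [1 if re.search(p, query) else 0 for p in PATTERNS]

benign = "http://shop.example/item?id=42"
attack = "http://shop.example/item?id=42%20OR%201=1--"
print(url_features(benign))  # [0, 0, 0, 0]
print(url_features(attack))  # [0, 1, 1, 0]
```

Such vectors (with many more features in practice) are the kind of input a neural network detector can be trained on.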
Abstract: The paper deals with the classical fiber bundle model
of equal load sharing, sometimes referred to as the Daniels bundle
or the democratic bundle. Daniels formulated a multidimensional
integral and also a recursive formula for evaluation of the
strength cumulative distribution function. This paper describes
three algorithms for evaluation of the recursive formula and also
their implementations with source codes in the Python high-level
programming language. A comparison of the algorithms is provided
with respect to execution time. An analysis of the orders of magnitude
of the addends in the recursion is also provided.
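The Daniels recursion for the strength CDF is commonly written as G_n(x) = sum_{k=1..n} (-1)^(k+1) C(n,k) F(x)^k G_{n-k}(n*x/(n-k)), with G_0 = 1, where x is the load per fiber and F is the fiber-strength CDF. A minimal naive Python sketch, assuming a uniform(0,1) fiber-strength CDF purely for illustration (the paper's three algorithms are more efficient evaluations of the same formula):

```python
from math import comb

def bundle_strength_cdf(n, x, F):
    """Daniels' recursion for the strength CDF G_n(x) of an
    equal-load-sharing bundle of n fibers, where x is the load
    per fiber and F is the fiber-strength CDF."""
    if n == 0:
        return 1.0
    total = 0.0
    for k in range(1, n + 1):
        # the load n*x is redistributed over the n-k surviving fibers
        tail = 1.0 if k == n else bundle_strength_cdf(n - k, n * x / (n - k), F)
        total += (-1) ** (k + 1) * comb(n, k) * F(x) ** k * tail
    return total

F = lambda s: min(max(s, 0.0), 1.0)   # uniform(0,1) strengths, illustrative
print(round(bundle_strength_cdf(2, 0.3, F), 4))  # 2*F(0.3)*F(0.6) - F(0.3)**2 = 0.27
```

Note that the naive recursion branches exponentially in n, which is exactly why efficient evaluation strategies (as compared in the paper) matter.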
Abstract: This article presents two methods for the
compensation of harmonics generated by a nonlinear load. The first is
the classic P-Q method. The second is a controller based on a modern
artificial-intelligence method, specifically fuzzy logic. Both
methods are applied to a shunt Active Power Filter (sAPF) based on a
five-level three-phase voltage converter in NPC topology. The
reference harmonic currents are calculated with the P-Q algorithm,
and the pulses are generated with intersective PWM. For
flexibility and dynamics, fuzzy logic is used. The results show
clearly that the Total Harmonic Distortion obtained with fuzzy logic
is lower than with the P-Q method.
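As background, the P-Q method rests on the Clarke transformation and the instantaneous powers p and q. A minimal numerical sketch with balanced sinusoidal signals and illustrative amplitudes (sign conventions for q vary across the literature):

```python
import numpy as np

def clarke(abc):
    """Power-invariant Clarke transform: 3 phase signals -> alpha-beta frame."""
    T = np.sqrt(2.0 / 3.0) * np.array([[1.0, -0.5, -0.5],
                                       [0.0, np.sqrt(3) / 2, -np.sqrt(3) / 2]])
    return T @ abc

t = np.linspace(0.0, 0.04, 2000)          # two 50 Hz periods
th = 2 * np.pi * 50 * t
V, I = 311.0, 10.0                        # illustrative amplitudes
v = np.array([V * np.cos(th), V * np.cos(th - 2*np.pi/3), V * np.cos(th + 2*np.pi/3)])
i = np.array([I * np.cos(th), I * np.cos(th - 2*np.pi/3), I * np.cos(th + 2*np.pi/3)])

v_a, v_b = clarke(v)
i_a, i_b = clarke(i)
p = v_a * i_a + v_b * i_b                 # instantaneous real power
q = v_b * i_a - v_a * i_b                 # instantaneous imaginary power
# A balanced linear load gives constant p (= 3/2 * V * I) and zero q; a
# nonlinear load adds oscillating components, which the sAPF cancels by
# injecting the corresponding compensating currents.
print(np.allclose(p, 1.5 * V * I), np.max(np.abs(q)) < 1e-6)
```

With a nonlinear load, the oscillating part of p and q is isolated (typically by low-pass filtering) and inverted to obtain the reference compensation currents.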
Abstract: The paper presents new results concerning selection of
optimal information fusion formula for ensembles of C-OTDR
channels. The goal of information fusion is to create an integral
classifier for the effective classification of seismoacoustic
target events. The LPBoost (LP-β and LP-B variants), Multiple
Kernel Learning, and Weighing Inversely as Lipschitz Constants
(WILC) approaches were compared. WILC is a new
approach to the optimal fusion of Lipschitz classifier ensembles.
Results of practical usage are presented.
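The WILC rule itself is given in the paper; the general idea of weighting ensemble members inversely to their Lipschitz constants can be sketched as follows (a hypothetical simplification with assumed constants, not the paper's optimal fusion formula):

```python
def wilc_fuse(scores, lipschitz):
    """Fuse ensemble scores with weights inversely proportional to each
    classifier's Lipschitz constant (smoother classifiers weigh more).
    `scores` and `lipschitz` have equal length; weights are normalized."""
    inv = [1.0 / L for L in lipschitz]
    s = sum(inv)
    weights = [w / s for w in inv]
    return sum(w * x for w, x in zip(weights, scores))

# Three channel classifiers score a candidate event; the third has the
# largest Lipschitz constant and is therefore down-weighted.
print(wilc_fuse([0.9, 0.8, 0.1], [1.0, 2.0, 10.0]))
```

The intuition is that a classifier with a small Lipschitz constant changes its score slowly under input perturbations and is therefore more trustworthy in the fused decision.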
Abstract: Experimental studies were conducted to investigate the
depth of scour at a side-weir intersection located in a 180° curved
flume in the Hydraulic Laboratory of Yıldız Technical
University, Istanbul, Turkey. Side weirs were located at the middle of
the straight part of the main channel. Three different lengths (25, 40
and 50 cm) and three different weir crest heights (7, 10 and 12 cm) of
the side weir were placed at the side-weir station. No scour occurred
when the bed material was kaolin alone. Therefore, the cohesive bed was prepared
by properly mixing clay material (kaolin) with 31% sand in all
experiments. Following 24h consolidation time, in order to observe
the effect of flow intensity on the scour depth, experiments were
carried out for five different upstream Froude numbers in the range of
0.33-0.81.
As a result of this study, the relation between scour depth and
upstream flow intensity as a function of time has been established.
The longitudinal velocities decreased along the side weir towards the
downstream due to overflow over the side weirs. At the beginning,
the scour depth increased rapidly with time and then asymptotically
approached constant values in all experiments for all side-weir
dimensions, as in non-cohesive sediment. Thus, the scour depth
reached equilibrium conditions. Time to equilibrium depends on the
approach flow intensity and the dimensions of side weirs. For
different heights of the weir crest, dimensionless scour depths
increased with increasing upstream Froude number. Equilibrium
scour depths for the 7 cm side-weir crest height were greater
than those for the 12 cm side-weir crest height. This means that as
the side-weir crest height increased, the equilibrium scour depth decreased.
Although the upstream side of the scour hole is almost vertical, the
downstream side of the hole is inclined.
Abstract: Fabric textures are very common in our daily life.
However, the representation of fabric textures has never been explored
from a neuroscience perspective. Theoretical studies suggest that primary
visual cortex (V1) uses a sparse code to efficiently represent natural
images. However, how the simple cells in V1 encode artificial
textures is still a mystery. Here, we take fabric textures as
stimuli to study the responses of an independent component analysis (ICA)
model that is established to describe the receptive fields of simple cells
in V1. We chose 140 types of fabrics to obtain the classical fabric textures as
materials. Experimental results indicate that the receptive fields of
simple cells have obvious selectivity in orientation, frequency and
phase when drifting gratings are used to determine their tuning
properties. Additionally, the distribution of optimal orientation and
frequency shows that the patch size selected from each original fabric
image has a significant effect on the frequency selectivity.
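The ICA step is not reproduced here; as an illustrative sketch of how orientation tuning is measured with drifting gratings, one can probe a Gabor-shaped receptive field (which ICA basis functions learned from natural images are known to resemble) with gratings of varying orientation:

```python
import numpy as np

def gabor(size, theta, freq):
    """A Gabor patch: a standard model of a V1 simple-cell receptive field."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    env = np.exp(-(x**2 + y**2) / (2 * (size / 5.0)**2))
    return env * np.cos(2 * np.pi * freq * xr)

def grating(size, theta, freq, phase=0.0):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.cos(2 * np.pi * freq * xr + phase)

rf = gabor(31, theta=0.0, freq=0.1)

def response(rf, theta):
    # maximum response over grating phases, mimicking a drifting grating
    return max(abs(np.sum(rf * grating(31, theta, 0.1, ph)))
               for ph in np.linspace(0.0, np.pi, 8))

r_pref = response(rf, 0.0)        # grating at the preferred orientation
r_orth = response(rf, np.pi / 2)  # orthogonal grating
print(r_pref > 5 * r_orth)        # strong orientation selectivity
```

The same probing procedure, applied to ICA basis functions instead of an analytic Gabor, yields the orientation, frequency and phase tuning distributions discussed above.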
Abstract: Purpose: The study aimed to assess the depressant or
antidepressant effects of several Nonsteroidal Anti-Inflammatory
Drugs (NSAIDs) in mice: the selective cyclooxygenase-2 (COX-2)
inhibitor meloxicam, and the non-selective COX-1 and COX-2
inhibitors lornoxicam, sodium metamizole, and ketorolac. The
current literature data regarding such effects of these agents are
scarce.
Materials and methods: The study was carried out on NMRI mice
weighing 20-35 g, kept in a standard laboratory environment. The
study was approved by the Ethics Committee of the University of
Medicine and Pharmacy "Carol Davila", Bucharest. The study agents
were injected intraperitoneally, 10 mL/kg body weight (bw) 1 hour
before the assessment of the locomotor activity by cage testing (n=10
mice/ group) and 2 hours before the forced swimming tests (n=15).
The study agents were dissolved in normal saline (meloxicam,
sodium metamizole), ethanol 11.8% v/v in normal saline (ketorolac),
or water (lornoxicam), respectively. Negative and positive control
agents were also given (amitriptyline in the forced swimming test).
The cage floor used in the locomotor activity assessment was divided
into 20 equal 10 cm squares. The forced swimming test involved
partial immersion of the mice in cylinders (15/9cm height/diameter)
filled with water (10 cm depth at 28C), where they were left for 6
minutes. The cage endpoint used in the locomotor activity assessment
was the number of treaded squares. Four endpoints were used in the
forced swimming test (immobility latency for the entire 6 minutes,
and immobility, swimming, and climbing scores for the final 4
minutes of the swimming session), recorded by an observer who was
blinded to the experimental design. The statistical analysis used the
Levene test for variance homogeneity, ANOVA and post-hoc
analysis as appropriate, Tukey or Tamhane tests.
Results: No statistically significant increase or decrease in the
number of treaded squares was seen in the locomotor activity
assessment of any mice group. In the forced swimming test,
amitriptyline showed an antidepressant effect in each experiment at
the 10 mg/kg bw dosage. Sodium metamizole was depressant at 100
mg/kg bw (increased the immobility score, p=0.049, Tamhane test),
but not at lower dosages (25 and 50 mg/kg bw). Ketorolac
showed an antidepressant effect at the intermediate dosage of 5
mg/kg bw, but not so in the dosages of 2.5 and 10 mg/kg bw,
respectively (increased the swimming score, p=0.012, Tamhane test).
Meloxicam and lornoxicam did not alter the forced swimming
endpoints at any dosage level.
Discussion: 1) Certain NSAIDs caused changes in the forced
swimming patterns without interfering with locomotion. 2) Sodium
metamizole showed a depressant effect, whereas ketorolac proved
antidepressant. Conclusion: NSAID-induced mood changes are not
class effects of these agents and apparently are independent of the
type of inhibited cyclooxygenase (COX-1 or COX-2).
Disclosure: This paper was co-financed from the European Social
Fund, through the Sectorial Operational Programme Human Resources Development 2007-2013, project number POSDRU /159
/1.5 /S /138907 "Excellence in scientific interdisciplinary research,
doctoral and postdoctoral, in the economic, social and medical fields
-EXCELIS", coordinator The Bucharest University of Economic
Studies.
Abstract: Brass terminal, one of the several crude oil and
petroleum products storage/handling facilities in the Niger Delta was
built in the 1980s. Activities at this site, over the years, released
crude oil into this 3 m-deep, 1500 m-long canal lying adjacent to the
terminal with oil floating on it and its sediment heavily polluted. To
ensure effective clean-up, three major activities were planned: site
characterization, bioremediation pilot plant construction and testing
and full-scale bioremediation of contaminated sediment / bank soil by
land farming. The canal was delineated into 12 lots and each
characterized, with reference to the floating oily phase, contaminated
sediment and canal bank soil. As a result of site characterization, a
pilot plant for on-site bioremediation was designed and a treatment
basin constructed for carrying out pilot bioremediation test.
Following a designed sampling protocol, samples from this pilot
plant were collected for analysis at two laboratories as a quality
assurance / quality control check. Results showed that the upstream
section of Brass Canal is contaminated with a dark, thick and viscous
oily film with a characteristic hydrocarbon smell, while downstream a
thin oily film interspersed with water was observed. Sediments were observed to be
dark with mixture of brownish sandy soil with TPH ranging from
17,800 mg/kg in Lot 1 to 88,500 mg/kg in Lot 12 samples. Brass
Canal bank soil was observed to be sandy from the ground surface to
3 m below ground surface (bgs); beneath that it was silty-sandy and
brownish, while deeper subsurface soil (4-10 m bgs) was sandy-clayey
and whitish/grayish with a typical hydrocarbon smell. Preliminary
results obtained so far have been very promising but are proprietary.
This project is
considered, to the best of technical literature knowledge, the first
large-scale on-site bioremediation project in the Niger Delta region,
Nigeria.
Abstract: Waste load allocation (WLA) policies may use multiobjective
optimization methods to find the most appropriate and
sustainable solutions. These usually intend to simultaneously
minimize two criteria, total abatement costs (TC) and environmental
violations (EV). If other criteria, such as inequity, need to be
minimized as well, more binary optimizations must be introduced
through different scenarios. In order to reduce the
calculation steps, this study presents the value index as an innovative
decision making approach. Since the value index contains both the
environmental violation and treatment costs, it can be maximized
simultaneously with the equity index. It implies that the definition of
different scenarios for environmental violations is no longer required.
Furthermore, the solution is not necessarily the point with minimized
total costs or environmental violations. This idea is tested on the
Haraz River in northern Iran. Here, the dissolved oxygen (DO) level of the river
is simulated by Streeter-Phelps equation in MATLAB software. The
WLA is determined for fish farms using multi-objective particle
swarm optimization (MOPSO) in two scenarios. At first, the trade-off
curves of TC-EV and TC-Inequity are plotted separately as the
conventional approach. In the second, the Value-Equity curve is
derived. The comparative results show that the solutions are in a
similar range of inequity with lower total costs. This is due to the
freedom of environmental violation attained in value index. As a
result, the conventional approach can well be replaced by the value
index particularly for problems optimizing these objectives. This
reduces the process to achieve the best solutions and may find better
classification for scenario definition. It is also concluded that decision
makers should rather focus on the value index, weighting its contents
to find the most sustainable alternatives based on their requirements.
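The Streeter-Phelps model referred to above gives the DO deficit D(t) = (kd*L0/(kr - kd)) * (exp(-kd*t) - exp(-kr*t)) + D0*exp(-kr*t). A minimal Python sketch with illustrative parameter values (the study's MATLAB implementation and calibrated rates are not reproduced here):

```python
import numpy as np

def streeter_phelps(t, L0, D0, kd, kr, DO_sat):
    """Dissolved-oxygen sag: deficit D(t) from the Streeter-Phelps model.
    L0: initial BOD (mg/L); D0: initial DO deficit (mg/L); kd, kr:
    deoxygenation and reaeration rates (1/day). Values below are
    illustrative, not the study's calibrated parameters."""
    D = (kd * L0 / (kr - kd)) * (np.exp(-kd * t) - np.exp(-kr * t)) \
        + D0 * np.exp(-kr * t)
    return DO_sat - D          # dissolved-oxygen concentration

t = np.linspace(0.0, 10.0, 101)   # days of travel downstream
do = streeter_phelps(t, L0=20.0, D0=1.0, kd=0.3, kr=0.6, DO_sat=9.0)
print(round(do[0], 2), round(do.min(), 2))   # initial DO and the sag minimum
```

Environmental violation in a WLA setting is then typically scored by how far this DO profile dips below a regulatory standard.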
Abstract: The posterior reference for the ala-tragal line is a
cause of confusion, with different authors suggesting different
locations, namely the superior, middle or inferior part of the tragus. This
study was conducted on 200 subjects to evaluate if any correlation
exists between the variation of angulation of palatal throat form and
the relative parallelism of occlusal plane to ala-tragal line at different
tragal levels. A custom made Occlusal Plane Analyzer was used to
check the parallelism between the ala-tragal line and occlusal plane.
A lateral cephalogram was taken for each subject to measure the
angulation of the palatal throat form. Fisher’s exact test was used to
evaluate the correlation between the angulation of the palatal throat
form and the relative parallelism of the occlusal plane to the ala-tragal
line. Also, a classification was formulated for the palatal throat form,
based on confidence interval. From the results of the study, the
inferior part, middle part and superior part of the tragus were seen as
the reference points in 49.5%, 32% and 18.5% of the subjects
respectively. Class I palatal throat form (41°-50°), Class
II palatal throat form (below 41°) and Class III palatal throat
form (above 50°) were seen in 42%, 43% and 15% of the
subjects respectively. It was also concluded that there is no
significant correlation between the variation in the angulations of the
palatal throat form and the relative parallelism of occlusal plane to
the ala-tragal line.
Abstract: This paper describes an argumentation approach
to the problem of inductive concept formation. It is
proposed to use argumentation, based on defeasible reasoning with
justification degrees, to improve the quality of classification models
obtained by generalization algorithms. Experimental results on
both clean and noisy data are also presented.
Abstract: The Margin-Based Principle was proposed long ago,
and it has been proved that this principle can reduce the
structural risk and improve performance in both theoretical
and practical respects. Meanwhile, the feed-forward neural network is
a traditional classifier, currently attracting much attention with
deeper architectures. However, the training algorithm of feed-forward
neural networks is derived from the Widrow-Hoff principle, which
means minimizing the squared error. In this paper, we propose
a new training algorithm for feed-forward neural networks based
on Margin-Based Principle, which could effectively promote the
accuracy and generalization ability of neural network classifiers
with fewer labelled samples and a flexible network. We have conducted
experiments on four UCI open datasets and achieved good results
as expected. In conclusion, our model can handle sparsely
labelled, high-dimensional datasets with high accuracy, while
migrating from the old ANN method to our method is easy and almost
free of work.
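The paper's training algorithm is not reproduced here; the contrast between Widrow-Hoff (squared-error) updates and margin-based updates can be illustrated with a single-layer margin perceptron on a toy dataset (our own minimal sketch, not the paper's method):

```python
def train_margin_perceptron(data, margin=1.0, lr=0.1, epochs=100):
    """Single-layer sketch of margin-based training: update not only on
    misclassification, but whenever the functional margin y*(w.x + b)
    falls below the target margin."""
    dim = len(data[0][0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in data:
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) < margin:
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

# Tiny linearly separable toy set (illustrative, not the UCI datasets)
data = [([2.0, 1.0], 1), ([1.5, 2.0], 1),
        ([-1.0, -1.5], -1), ([-2.0, -0.5], -1)]
w, b = train_margin_perceptron(data)
# After training, every sample sits at or beyond the target margin
print(all(y * (sum(wi * xi for wi, xi in zip(w, x)) + b) >= 1.0
          for x, y in data))
```

A squared-error rule would keep adjusting weights even for points already far on the correct side; the margin rule stops once every point clears the margin, which is the structural-risk intuition behind the principle.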
Abstract: In Hungary, society has changed a great deal over the past
25 years, and these changes can be detected in educational
settings as well. The number and the intensity of conflicts have
increased in most fields of life, as well as at schools. Teachers
find it difficult to handle school conflicts. What is more,
the new net generation, generation Z has values and behavioural
patterns different from those of the previous one, which might
generate more serious conflicts at school, especially with teachers
who were mainly socialised in a traditional teacher-student
relationship.
In Hungary, the bill CCIV of 2011 declared the foundation of
Institutes of Teacher Training in higher education institutes. One of
the tasks of the Institutes is to survey the competences and needs of
teachers working in public education and to provide further trainings
and services for them according to their needs and requirements. This
job is supported by the Social Renewal Operative Programs 4.1.2.B.
The professors of a college carried out a questionnaire and surveyed
the needs and the requirements of teachers working in the region.
Based on the results, the professors of the Institute of Teacher
Training decided to meet the requirements of teachers and to launch
short teacher further training courses in spring 2015. One of the
courses is going to focus on school conflict management through
mediation.
The aim of the pilot course is to provide conflict management
techniques for teachers and to present different mediation techniques
to them. The theoretical part of the course (5 hours) will enable
participants to understand the main points and the advantages of
mediation, while the practical part (10 hours) will involve teachers in
role plays to learn how to cope with conflict situations applying
mediation. We hope that if conflicts can be reduced, the school
atmosphere will be influenced in a positive way and the
teaching-learning process will become more successful and effective.
Abstract: The problems arising from unbalanced data sets
generally appear in real world applications. Due to unequal class
distribution, many researchers have found that the performance of
existing classifiers tends to be biased towards the majority class. The
k-nearest neighbors’ nonparametric discriminant analysis is a method
that was proposed for classifying unbalanced classes with good
performance. In this study, the methods of discriminant analysis are
of interest in investigating misclassification error rates for
class-imbalanced data of three diabetes risk groups. The purpose of this
study was to compare the classification performance between
parametric discriminant analysis and nonparametric discriminant
analysis in a three-class classification of class-imbalanced data of
diabetes risk groups. Data from a project maintaining healthy
conditions for 599 employees of a government hospital in Bangkok
were obtained for the classification problem. The employees were
divided into three diabetes risk groups: non-risk (90%), risk (5%),
and diabetic (5%). The original data including the variables of
diabetes risk group, age, gender, blood glucose, and BMI were
analyzed and bootstrapped for 50 and 100 samples, 599 observations
per sample, for additional estimation of the misclassification error
rate. Each data set was explored for the departure of multivariate
normality and the equality of covariance matrices of the three risk
groups. Both the original data and the bootstrap samples showed
non-normality and unequal covariance matrices. The parametric linear
discriminant function, quadratic discriminant function, and the
nonparametric k-nearest neighbors’ discriminant function were
performed over 50 and 100 bootstrap samples and applied to the
original data. In searching for the optimal classification rule, the
choices of prior probabilities were set to both equal proportions
(0.33:0.33:0.33) and unequal proportions (0.90:0.05:0.05),
(0.80:0.10:0.10), and (0.70:0.15:0.15). The results from 50 and 100 bootstrap samples
indicated that the k-nearest neighbors approach when k=3 or k=4 and
the defined prior probabilities of non-risk: risk: diabetic as 0.90:
0.05:0.05 or 0.80:0.10:0.10 gave the smallest error rate of
misclassification. The k-nearest neighbors approach is therefore
suggested for classifying the three-class imbalanced data of diabetes
risk groups.
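A common way to combine k-nearest-neighbour classification with specified priors is to score each class by prior × (neighbour count / class sample size); a toy sketch with hypothetical data (not the study's hospital data):

```python
from collections import Counter

def knn_prior_classify(train, x, k, priors):
    """k-NN discriminant with class priors: among the k nearest neighbours,
    score each class by prior * (neighbours from class / class sample size).
    A standard prior-adjusted nonparametric rule, sketched here; the study's
    exact implementation may differ."""
    nearest = sorted(train, key=lambda p: sum((a - b)**2 for a, b in zip(p[0], x)))
    votes = Counter(label for _, label in nearest[:k])
    sizes = Counter(label for _, label in train)
    return max(priors, key=lambda c: priors[c] * votes.get(c, 0) / sizes[c])

# Toy imbalanced data: the 'non-risk' class dominates, as in the study (90:5:5)
train = ([([i / 10, 0.0], 'non-risk') for i in range(18)]
         + [([3.0, 3.0], 'risk'), ([3.2, 3.0], 'risk')]
         + [([0.0, 3.0], 'diabetic'), ([0.2, 3.0], 'diabetic')])
priors = {'non-risk': 0.90, 'risk': 0.05, 'diabetic': 0.05}
print(knn_prior_classify(train, [3.1, 2.9], k=3, priors=priors))  # 'risk'
```

Dividing the neighbour count by the class sample size is what keeps the rule from being swamped by the majority class, which is the point of the nonparametric approach for imbalanced data.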
Abstract: In this work, we explore the capability of the mean
shift algorithm as a powerful preprocessing tool for improving the
quality of spatial data, acquired from airborne scanners, from densely
built urban areas. On one hand, high resolution image data corrupted
by noise caused by lossy compression techniques are appropriately
smoothed while at the same time preserving the optical edges and, on
the other, low resolution LiDAR data in the form of a normalized
Digital Surface Map (nDSM) are upsampled through the joint mean
shift algorithm. Experiments on both the edge-preserving smoothing
and upsampling capabilities using synthetic RGB-z data show that the
mean shift algorithm is superior to bilateral filtering as well as to
other classical smoothing and upsampling algorithms. Application of
the proposed methodology for 3D reconstruction of buildings of a
pilot region of Athens, Greece results in a significant visual
improvement of the 3D building block model.
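The core of the mean shift procedure can be sketched in one dimension with a flat kernel (the joint mean shift on RGB-z data generalizes this to higher-dimensional feature spaces):

```python
def mean_shift_1d(points, x, bandwidth, iters=50):
    """Flat-kernel mean shift: repeatedly move x to the mean of all points
    within `bandwidth`, converging to a local density mode. A 1-D sketch
    of the procedure applied in higher dimensions to image and nDSM data."""
    for _ in range(iters):
        window = [p for p in points if abs(p - x) <= bandwidth]
        new_x = sum(window) / len(window)
        if abs(new_x - x) < 1e-9:
            break                # converged to a mode
        x = new_x
    return x

data = [0.9, 1.0, 1.1, 4.9, 5.0, 5.1]    # two clusters (modes near 1 and 5)
print(round(mean_shift_1d(data, 0.5, bandwidth=1.0), 3))  # -> 1.0
print(round(mean_shift_1d(data, 5.4, bandwidth=1.0), 3))  # -> 5.0
```

Because each point is pulled to its local mode rather than a global average, noise is smoothed while distinct structures (edges, roof planes) on either side of a density gap are preserved.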
Abstract: The growth in the volume of text data, such as books
and articles in libraries over the centuries, has made it necessary
to establish effective mechanisms to locate them. Early techniques
such as abstracting, indexing and the use of classification categories
have marked the birth of a new field of research called "Information
Retrieval". Information Retrieval (IR) can be defined as the task of
defining models and systems whose purpose is to facilitate access to
a set of documents in electronic form (a corpus) and to allow a user
to find those relevant to him, that is to say, the content that matches
the information needs of the user. This paper presents a new
semantic indexing approach of a documentary corpus. The indexing
process starts first by a term weighting phase to determine the
importance of these terms in the documents. Then the use of a
thesaurus such as WordNet allows moving to the conceptual level.
Each candidate concept is evaluated by determining its level of
representation of the document, that is to say, the importance of the
concept in relation to other concepts of the document. Finally, the
semantic index is constructed by attaching to each concept of the
ontology, the documents of the corpus in which these concepts are
found.
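The term-weighting phase is typically a tf-idf-style computation; a minimal sketch on a toy corpus (a generic scheme, assumed here for illustration; the paper's exact weighting may differ):

```python
import math
from collections import Counter

def tfidf(corpus):
    """Term weighting: tf-idf scores per document, the standard first
    phase of an indexing pipeline before concept mapping."""
    docs = [doc.lower().split() for doc in corpus]
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))  # document freq
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({t: (tf[t] / len(doc)) * math.log(n / df[t])
                        for t in tf})
    return weights

corpus = ["information retrieval indexing",
          "semantic indexing of documents",
          "documents and information needs"]
w = tfidf(corpus)
# 'retrieval' occurs in only one document, so it gets a high weight;
# terms spread across documents are down-weighted.
print(round(w[0]["retrieval"], 3))
```

The resulting weights are then mapped to WordNet concepts and aggregated, so that each concept's importance reflects the weights of the terms that evoke it.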
Abstract: This study investigated the behavior of soft soils improved through the vibro-replacement technique, considering their settlements and consolidation rates, as well as the applicability of this technique to various types of soils and to settlement and bearing-capacity calculations.
Abstract: Micro-electromechanical system (MEMS)
accelerometers and gyroscopes are suitable for the inertial navigation
systems (INS) of many applications due to their low price, small
dimensions and light weight. The main disadvantage in comparison
with classic sensors is worse long-term stability. The estimation
accuracy is mostly affected by the time-dependent growth of inertial
sensor errors, especially the stochastic errors. In order to eliminate
negative effects of these random errors, they must be accurately
modeled. In this paper, the Allan variance technique will be used in
modeling the stochastic errors of the inertial sensors. By performing
a simple operation on the entire length of data, a characteristic curve
is obtained whose inspection provides a systematic characterization
of various random errors contained in the inertial-sensor output data.
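The Allan variance computation itself is straightforward; a minimal sketch using synthetic white noise and the non-overlapping estimator (real analyses usually prefer the overlapping estimator over long datasets):

```python
import numpy as np

def allan_variance(data, m):
    """Non-overlapping Allan variance for cluster size m (tau = m * dt):
    average consecutive clusters of m samples, then take half the mean
    squared difference between successive cluster averages."""
    n_clusters = len(data) // m
    means = data[:n_clusters * m].reshape(n_clusters, m).mean(axis=1)
    return 0.5 * np.mean(np.diff(means) ** 2)

rng = np.random.default_rng(0)
noise = rng.standard_normal(100_000)   # white noise stands in for sensor output
a_short = allan_variance(noise, 10)    # short cluster time
a_long = allan_variance(noise, 100)    # 10x longer cluster time
# For white (random walk-producing) noise, AVAR ~ sigma^2 / m, so the curve
# falls with a -1/2 slope on a log-log Allan deviation plot.
print(a_long < a_short)
```

Plotting the Allan deviation over a range of cluster times yields the characteristic curve whose slopes identify the individual random error terms (quantization noise, angle/velocity random walk, bias instability, rate random walk).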
Abstract: OPEN_EmoRec_II is an open multimodal corpus with
experimentally induced emotions. In the first half of the experiment,
emotions were induced with standardized picture material and in the
second half during a human-computer interaction (HCI), realized
with a wizard-of-oz design. The induced emotions are based on the
dimensional theory of emotions (valence, arousal and dominance).
These emotional sequences, recorded with multimodal data (facial
reactions, speech, audio and physiological reactions) in a
naturalistic-like HCI environment, can be used to improve
classification methods on a multimodal level.
This database is the result of an HCI-experiment, for which 30
subjects in total agreed to a publication of their data including the
video material for research purposes*. The now available open
corpus contains sensory signals of video, audio, physiology (SCL,
respiration, BVP, EMG Corrugator supercilii, EMG Zygomaticus
Major), and annotations of facial reactions.