Abstract: In the deep south of Thailand, checkpoints for people
verification are necessary for the security management of risk zones,
such as official buildings in the conflict area. In this paper, we
propose an automatic checkpoint system that verifies people using
information from ID cards together with facial features. Methods for
extracting and verifying a person's information are introduced, based
on the ID number and name read from official cards and on facial
images captured from video. The proposed system shows promising
results and has a real impact on local society.
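The abstract does not detail the matching algorithm; the sketch below illustrates one common way such a verification step is done, assuming the system stores a face-embedding vector for each card holder and compares it with an embedding extracted from the live video (the threshold value is an illustrative assumption, not the authors' setting):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(card_embedding, live_embedding, threshold=0.6):
    """Accept the person if the live face embedding is close enough to
    the embedding stored for the ID-card holder.  The 0.6 threshold is
    hypothetical; a real system would calibrate it on data."""
    return cosine_similarity(card_embedding, live_embedding) >= threshold
```

In a deployed checkpoint the embeddings would come from a face-recognition network and the ID number and name from OCR of the card; those components lie outside this sketch.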
Abstract: This paper outlines the development of an
experimental technique in quantifying supersonic jet flows, in an
attempt to avoid seeding particle problems frequently associated with
particle-image velocimetry (PIV) techniques at high Mach numbers.
Based on optical flow algorithms, the technique uses high-speed
cameras to capture Schlieren images of the supersonic jet shear
layers, which are then processed by an adapted optical flow algorithm
based on the Horn-Schunck method to determine the associated flow
fields. The proposed method is capable
of offering full-field unsteady flow information with potentially
higher accuracy and resolution than existing point-measurements or
PIV techniques. A preliminary study via numerical simulations of a
circular de Laval jet nozzle successfully reveals flow and shock
structures typically associated with supersonic jet flows, which serve
as useful data for subsequent validation of the optical flow based
experimental results. For the experimental technique, a Z-type
Schlieren setup is proposed, with the supersonic jet operated in cold
mode at a stagnation pressure of 4 bar and an exit Mach number of 1.5.
High-speed single-frame or double-frame cameras are used to capture
successive Schlieren images. As the application of optical flow
techniques to supersonic flows remains rare, the current focus is on
methodology validation through synthetic images. The results of the
validation tests offer valuable insight into how the optical flow
algorithm can be further improved in robustness and accuracy. Despite
these challenges, this supersonic flow
measurement technique may potentially offer a simpler way to
identify and quantify the fine spatial structures within the shock shear
layer.
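As a concrete reference for the method named above, a simplified Horn-Schunck iteration can be sketched as follows. This uses a plain 4-neighbour average rather than the weighted 9-point kernel of the original formulation, and illustrative values for the smoothing weight alpha and the iteration count:

```python
import numpy as np

def horn_schunck(im1, im2, alpha=1.0, n_iter=100):
    """Minimal Horn-Schunck optical flow between two grayscale frames.
    Returns per-pixel flow components (u, v)."""
    im1 = im1.astype(float)
    im2 = im2.astype(float)
    # Spatial and temporal image derivatives.
    Iy, Ix = np.gradient(im1)
    It = im2 - im1
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)

    def local_avg(f):
        # 4-neighbour average with edge replication (a simplification
        # of the weighted average in the original method).
        p = np.pad(f, 1, mode='edge')
        return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 4.0

    for _ in range(n_iter):
        u_bar, v_bar = local_avg(u), local_avg(v)
        # Shared update term of the Horn-Schunck iteration equations.
        d = (Ix * u_bar + Iy * v_bar + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_bar - Ix * d
        v = v_bar - Iy * d
    return u, v
```

For a linear intensity ramp translated by one pixel, the iteration converges to a uniform unit flow, which is a convenient sanity check before moving to synthetic Schlieren images.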
Abstract: With the increasing number of people reviewing
products online in recent years, opinion-sharing websites have become
the most important source of customers’ opinions. Unfortunately,
spammers generate and post fake reviews in order to promote or
demote brands and mislead potential customers. Fake reviews are
damaging not only to potential customers but also to business owners
and manufacturers. However, research in this area is not yet
adequate, and many critical problems related to spam detection remain
unsolved. To give newcomers to the field a solid starting point, in
this paper we attempt to build a high-quality framework that provides
a clear view of review spam-detection methods. In addition, this
report contains a comprehensive collection of the detection metrics
used in proposed spam-detection approaches. These metrics are highly
applicable to the development of novel detection methods.
Abstract: In language learning, second language learners as well
as native speakers commit errors in their attempt to achieve
competence in the target language. Collocation concerns the meaning
relations between lexical items. In every human language there is a
kind of ‘natural order’ in which words are arranged or related to one
another in sentences, so that when a word occurs in a given context,
the naturally co-occurring word automatically comes to mind. It
becomes an error, therefore, if
students inappropriately pair or arrange such naturally co-occurring
lexical items in a text. It has been observed that most of the second
language learners in this research group commit collocation errors. A
study of this kind is significant because it gives insight into the
kinds of errors learners commit. This will help the language teacher
identify the sources and causes of such errors and correct them,
thereby guiding the learners towards some level of competence in the
language. The aim of the study is to understand the nature of these
errors as stumbling blocks to effective essay writing. Its objectives
are to identify the errors, to analyze their structural composition
so as to determine whether there are similarities between students in
this regard, and to find out whether these errors follow patterns
that would enable the researcher to understand their sources
and causes. In this descriptive study, the researcher sampled nine
hundred essays collected from three hundred undergraduate learners of
English as a second language at the Federal College of Education,
Kano, North-West Nigeria, i.e. three essays per student. The essays,
written during lecture hours on three different occasions, had
similar thematic preoccupations (the same topics) and length (the
same number of words). Errors were identified systematically, with
each error recorded only once even if it occurred several times in a
student’s essays. The data were collated by converting the counts of
occurrences into percentages.
The findings from the study indicate similarities as well as regular,
repeated errors that form a pattern. Based on the pattern identified,
the conclusion is that students’ collocation errors are attributable
to poor teaching and learning, which result in the wrong
generalization of rules.
Abstract: Wind tunnel simulation is widely used to model real airflow situations. Besides the automotive industry, many applications can be listed: pollutant dispersion, pedestrian comfort studies, and particle dispersion. The objective of this work was to visualize the aerodynamic characteristics of two automobiles under different conditions. The airflow was generated by a fan producing speeds of 4.1 m/s and 4.95 m/s, measured with a hot-wire anemometer. To visualize the path of the air around the cars in the wind tunnel, smoke was used, obtained by burning vegetable oil: the oil was vaporized by a 2.5 mm wire heated with a voltage of 20 V. The cars were placed inside the wind tunnel in the smoke-seeded airflow and photographed, thereby recording the path lines around them at the different test speeds.
Abstract: The number and adequacy of performance indicators
(PIs) for organisational purposes are core to the success of
organisations and a major concern to the sponsor of this research.
This study developed a procedure to improve a firm’s
performance assessment system by identifying two key PIs out of 28
initial ones, and by setting criteria and their relative importance to
validate and rank the adequacy and the right number of operational
metrics. The Analytic Hierarchy Process (AHP) was used with a
synthesis method to treat the data from the management inquiries.
Although organisational alignment has been achieved, business
processes should also be targeted and the PIs continuously revised.
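As an illustration of the AHP step, the priority weights of criteria can be obtained from a pairwise-comparison matrix as its principal eigenvector. The matrix in the usage example is invented for illustration, not the sponsor's data:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix,
    taken as the principal eigenvector normalised to sum to 1."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)          # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    return w / w.sum()
```

For example, a 3-criterion matrix where criterion 1 is moderately preferred over 2 and strongly over 3 yields weights that decrease in that order.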
Abstract: This paper deals with the possibilities of increasing
train capacity by using a new type of railway wagon. In the first
part, a mathematical model is created to calculate the capacity of a
train. The model is based on the main limiting parameters of the
train: the maximum number of axles per train, the maximum gross
weight, the maximum length, and the number of TEUs per wagon. In the
second part, the model is applied to four different model trains with
different compositions of the train set and three different average
TEU weights, as well as to a train consisting of the new type of
wagons. The result identifies where the carrying capacity of the
original trains is higher or lower than that of a train composed of
the new wagons.
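The limiting-parameter model described above can be sketched as taking the minimum over the axle, weight, and length constraints. All numerical values in the example (locomotive figures, wagon data) are illustrative assumptions, not the paper's parameters:

```python
import math

def train_capacity_teu(max_axles, max_gross_t, max_length_m,
                       axles_per_wagon, wagon_tare_t, teu_weight_t,
                       wagon_length_m, teu_per_wagon,
                       loco_weight_t=90.0, loco_length_m=20.0,
                       loco_axles=6):
    """TEU capacity of a train limited by axle count, gross weight and
    length, in the spirit of the limiting parameters named in the
    abstract.  Default locomotive figures are illustrative assumptions."""
    wagon_gross_t = wagon_tare_t + teu_per_wagon * teu_weight_t
    by_axles = (max_axles - loco_axles) // axles_per_wagon
    by_weight = math.floor((max_gross_t - loco_weight_t) / wagon_gross_t)
    by_length = math.floor((max_length_m - loco_length_m) / wagon_length_m)
    wagons = min(by_axles, by_weight, by_length)   # binding constraint
    return wagons * teu_per_wagon
```

With, say, 100 axles allowed, a 2000 t gross limit and a 600 m length limit, 4-axle wagons of 20 t tare carrying two 20 t TEUs each, the axle limit binds (23 wagons), giving 46 TEU.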
Abstract: In this work, by replacing traditional solid spokes with colloidal spokes, a vehicle wheel with a built-in suspension structure is proposed. Following the background and description of the wheel system, a vibration model of the wheel equipped with colloidal spokes is first proposed, and based on this model the equivalent damping coefficients and spring constants are identified. Then, a modified model of a quarter-vehicle moving on a rough pavement is proposed in order to estimate the transmissibility of vibration from the road roughness to the vehicle body. Finally, the optimal design of the colloidal spokes and the optimum number of colloidal spokes are determined in order to minimize the transmissibility of vibration, i.e., to maximize the ride comfort of the vehicle.
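The quarter-vehicle transmissibility estimate mentioned above can be sketched in the frequency domain for a linear two-mass model. The parameter values below are generic passenger-car figures, not the identified colloidal-spoke constants:

```python
import numpy as np

def body_transmissibility(freq_hz, ms=300.0, mu=40.0,
                          ks=2.0e4, cs=1.5e3, kt=2.0e5):
    """|X_body / Y_road| of a linear quarter-car model at an excitation
    frequency in Hz.  ms/mu: sprung/unsprung mass, ks/cs: suspension
    stiffness/damping, kt: tyre stiffness.  Values are illustrative."""
    s = 1j * 2.0 * np.pi * freq_hz
    # Frequency-domain equations of motion for unit road displacement.
    A = np.array([[ms * s**2 + cs * s + ks, -(cs * s + ks)],
                  [-(cs * s + ks), mu * s**2 + cs * s + ks + kt]])
    b = np.array([0.0, kt])
    xs, xu = np.linalg.solve(A, b)
    return abs(xs)
```

Evaluating this over a frequency grid reproduces the familiar shape: unity at very low frequency, a peak near the body resonance, and strong isolation above it, which is the quantity the spoke design seeks to minimize.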
Abstract: This paper aims to analyze the behavior of DC corona
discharge in wire-to-plate electrostatic precipitators (ESPs).
Current-voltage curves are analyzed in particular. Experimental
results show that the discharge current is strongly affected by the
applied voltage. The proposed method of current identification uses
the method of least squares. Least squares problems fall into two
categories, linear (ordinary) least squares and non-linear least
squares, depending on whether or not the residuals are linear in all
unknowns. The linear least-squares problem occurs in statistical
regression analysis and has a closed-form solution, i.e., a formula
that can be evaluated in a finite number of standard operations. The
non-linear problem has no closed-form solution and is usually solved
iteratively.
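As a minimal example of the linear, closed-form case, a corona I-V characteristic can be fitted with a model that is linear in its unknowns, e.g. the quadratic I = aV² + bV + c. The quadratic form here is an illustrative choice, not necessarily the model identified in the paper:

```python
import numpy as np

def fit_iv_quadratic(voltages, currents):
    """Ordinary (linear) least-squares fit of an I-V characteristic
    with the model I = a*V**2 + b*V + c.  Because the model is linear
    in its unknowns (a, b, c), the solution is closed-form."""
    V = np.asarray(voltages, dtype=float)
    A = np.column_stack([V**2, V, np.ones_like(V)])   # design matrix
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(currents, dtype=float),
                                 rcond=None)
    return coeffs   # (a, b, c)
```

A non-linear model (e.g. one with an unknown onset voltage inside the expression) would instead require the iterative solvers mentioned above.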
Abstract: Segmentation of the left ventricle (LV) from cardiac
ultrasound images provides a quantitative functional analysis of the
heart to diagnose disease. Active Shape Model (ASM) is widely used
for LV segmentation, but it suffers from the drawback that
initialization of the shape model is not sufficiently close to the target,
especially when dealing with the abnormal shapes found in disease. In
this work, a two-step framework is proposed to achieve fast and
efficient LV
segmentation. First, a robust and efficient detector based on Hough
forests localizes cardiac feature points. Such feature points are used to
predict the initial fitting of the LV shape model. Second, ASM is
applied to further fit the LV shape model to the cardiac ultrasound
image. With the robust initialization, ASM is able to achieve more
accurate segmentation. The performance of the proposed method is
evaluated on a dataset of 810 cardiac ultrasound images, most of
which show abnormal shapes. The method is compared with several
combinations of ASM and existing initialization methods. Our
experimental results demonstrate that the feature point detection
used for initialization was 40% more accurate than the existing
methods. Moreover, the proposed method significantly reduces the
number of necessary ASM fitting loops and thus speeds up the whole
segmentation process. It therefore achieves more accurate and
efficient segmentation and is applicable to the unusual heart shapes
caused by cardiac diseases, such as left atrial enlargement.
Abstract: Digital images are widely used in computer
applications. To store or transmit the uncompressed images
requires considerable storage capacity and transmission bandwidth.
Image compression is a means to perform transmission or storage of
visual data in the most economical way. This paper explains how
images can be encoded for transmission over a multiplexing
time-frequency domain channel. Multiplexing involves packing together
signals whose representations are compact in the working domain. In
order to optimize transmission resources, each 4 × 4 pixel block of
the image is transformed by a suitable polynomial approximation into
a minimal number of coefficients. Using fewer than 4 × 4 coefficients
per block saves a significant amount of transmitted information, but
some information is lost. Different approximations for the image
transformation have been evaluated: polynomial representation
(Vandermonde matrix), least squares with gradient descent, 1-D
Chebyshev polynomials, 2-D Chebyshev polynomials, and singular value
decomposition (SVD). Results have
been compared in terms of nominal compression rate (NCR),
compression ratio (CR) and peak signal-to-noise ratio (PSNR)
in order to minimize the error function defined as the difference
between the original pixel gray levels and the approximated
polynomial output. The polynomial coefficients were later encoded and
used to generate chirps at a target rate of about two chirps per
4 × 4 pixel block, and then submitted to a multiplexing operation for
transmission in the time-frequency domain.
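The block-truncation idea and the PSNR figure of merit can be sketched for the SVD variant: keeping k < 4 singular triplets of a 4 × 4 block reduces the number of transmitted coefficients at the cost of reconstruction error.

```python
import numpy as np

def psnr(original, approx, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((np.asarray(original, float) - np.asarray(approx, float))**2)
    return float('inf') if mse == 0 else 10.0 * np.log10(peak**2 / mse)

def svd_truncate(block, k):
    """Rank-k SVD approximation of a pixel block: only k singular
    triplets, instead of all 16 raw pixels of a 4x4 block, would
    need to be transmitted."""
    U, s, Vt = np.linalg.svd(np.asarray(block, float), full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]
```

By the Eckart-Young theorem, adding singular triplets can only reduce the reconstruction error, so PSNR is non-decreasing in k; the full-rank case reproduces the block exactly.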
Abstract: In a wireless sensor network, sensor nodes periodically
transmit the sensed data to the sink node over multi-hop
communication. This heavy traffic induces congestion at the nodes
located one hop from the sink node, whose packet transmission and
reception rates are much higher than those of the other sensor nodes
in the network. The energy consumption of these nodes is therefore
very high, an effect known as the “funneling effect”. A tree-based
data aggregation technique (TBDA) is used to reduce the energy
consumption of the nodes; overall, it considerably decreases the
number of packet transmissions to the sink node. The proposed scheme,
TBDA, avoids the funneling effect and extends the lifetime of the
wireless sensor network. The average-case time complexity for
inserting a node into the tree is O(n log n), and the worst-case time
complexity is O(n²).
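The packet saving from tree-based aggregation can be illustrated by counting transmissions on a toy routing tree (the topology in the example is invented for illustration): without aggregation, a reading from a node at depth d costs d relayed packets, while with aggregation every node transmits exactly once.

```python
def node_depths(parent):
    """Depth of every node in a routing tree given as a
    child -> parent map; the sink has parent None."""
    def depth(n):
        return 0 if parent[n] is None else 1 + depth(parent[n])
    return {n: depth(n) for n in parent}

def transmissions(parent, aggregate=True):
    """Packets sent to deliver one reading from every non-sink node.
    Without aggregation each reading is relayed hop by hop; with
    tree-based aggregation each node fuses its children's data and
    transmits a single packet."""
    sensors = [n for n in parent if parent[n] is not None]
    if aggregate:
        return len(sensors)
    depths = node_depths(parent)
    return sum(depths[n] for n in sensors)
```

On a tree with two depth-1 nodes each relaying one depth-2 child, plain forwarding costs 6 transmissions while aggregation costs 4, and the gap widens with depth, which is exactly what relieves the one-hop "funnel" nodes.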
Abstract: The paper presents a method in which expert knowledge
is applied to a fuzzy inference model. Even less experienced users,
e.g. urban planners and officials, could benefit from such a system.
The analysis result is obtained in a very short time, so a large
number of proposed locations can be verified quickly. The proposed
method is intended for testing locations of park-and-ride (P&R) car
parks in a city. The paper shows selected examples of locations of
P&R facilities in cities planning to introduce P&R. Analyses of
existing facilities are also shown and confronted with the opinions
of the system users, with particular emphasis on unpopular locations.
The results of the analyses are compared with an expert analysis of
P&R facility locations that was outsourced by the city, and with
opinions about existing facilities expressed by users on social
networking sites. The obtained results are consistent with actual
user feedback. The proposed method proves effective, yet does not
require the involvement of a large team of experts or large financial
outlays for complicated research. The method also makes it possible
to propose alternative locations for P&R facilities. Although the
results of the method are approximate, they are no worse than the
results of analyses by employed experts. The advantage of the method
is its ease of use, which simplifies professional expert analysis.
The ability to analyze a large number of alternative locations gives
a broader view of the problem. Notably, the arduous analysis by a
team of people can be replaced by the model’s calculations. According
to the authors, the proposed method is also suitable for
implementation on a GIS platform.
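A minimal sketch of the kind of fuzzy inference involved: triangular membership functions and two Mamdani-style rules scoring a candidate location. The input variables, membership ranges, and rules below are illustrative assumptions, not the authors' expert knowledge base:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def park_and_ride_score(dist_km, buses_per_hour):
    """Toy two-rule Mamdani-style inference with weighted-average
    defuzzification.  Rule 1: near AND well served -> good (1.0);
    rule 2: far AND badly served -> poor (0.0)."""
    near = tri(dist_km, -5.0, 0.0, 10.0)           # close to the centre
    frequent = tri(buses_per_hour, 0.0, 20.0, 40.0)  # good transit service
    good = min(near, frequent)
    poor = min(1.0 - near, 1.0 - frequent)
    total = good + poor
    return 0.5 if total == 0 else good / total
```

A real knowledge base would cover many more inputs (land availability, road capacity, catchment population) and rules, but the evaluation remains cheap, which is what allows many alternative locations to be screened quickly.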
Abstract: The search for the “island of stability” is a topic of
intense interest in both theoretical and experimental modern physics.
This “island of stability” is spanned by superheavy elements (SHEs)
produced in the laboratory. SHEs are believed to exist primarily due
to the “magic” stabilizing effects of nuclear shell structure. SHE
synthesis is extremely difficult because of the very low production
cross sections, often of the order of picobarns or less. Stabilizing
effects of shell closures at proton number Z = 82 and neutron number
N = 126 are predicted theoretically. Although the stabilizing effect
of Z = 82 has been experimentally verified, no conclusive
observations have been made for N = 126 so far. In this work, we
measured and analyzed the total evaporation residue (ER) cross
sections for a number of systems with neutron numbers around 126 to
explore possible shell-closure effects in the ER cross sections.
Abstract: Lightweight, efficient structures aim to enhance the
efficiency of components in various industries. Toward this end,
composites are among the most widely used materials because of their
durability, high strength and modulus, and low weight. One type of
advanced composite is the grid-stiffened composite (GSC) structure,
which has been extensively considered in the aerospace, automotive,
and aircraft industries and is a top candidate for replacing some
traditional components currently in use. Although a good number of
surveys have been published on the design aspects and fabrication of
GSC structures, to our knowledge little systematic work has been
reported on modifying their materials to improve their properties.
Matrix modification
using nanoparticles is an effective method to enhance the flexural
properties of fibrous composites. In the present study, a silane
coupling agent (3-glycidoxypropyltrimethoxysilane, 3-GPTS) was
introduced onto the silica (SiO2) nanoparticle surface and its effects
on the three-point flexural response of isogrid E-glass/epoxy
composites were assessed. Based on the Fourier Transform Infrared
Spectrometer (FTIR) spectra, it was inferred that the 3-GPTS
coupling agent was successfully grafted onto the surface of SiO2
nanoparticles after modification. Flexural test revealed an
improvement of 16%, 14%, and 36% in stiffness, maximum load and
energy absorption of the isogrid specimen filled with 3 wt.% 3-
GPTS/SiO2 compared to the neat one. It is worth mentioning that in
these structures, considerable energy absorption was observed after
the primary failure associated with the load peak. In addition,
3-GPTS functionalization had a positive effect on the flexural
behavior of the multiscale isogrid composites. In conclusion, this
study suggests that the addition of modified silica nanoparticles is
a promising way to improve the flexural properties of grid-stiffened
fibrous composite structures.
Abstract: Industries handling mineral products, waste concrete
(fine aggregates), optical-field waste, and construction waste employ
separators to separate solids and classify them according to size.
Various sorting machines are used in industry, operating on
electrical properties, centrifugal force, wind power, vibration, or
magnetic force. Research on separators has been carried out to
contribute to the environmental industry. In this study, we perform a
CFD analysis to understand the basic mechanism by which waste
concrete (fine aggregate) particles are separated from air in a
machine built around a bladed rotor. In the CFD work, we first
performed two-dimensional particle tracking for various particle
sizes on models with 1, 1.5, and 2 degree angles between blades, to
verify the boundary conditions and the rotating-domain method to be
used in 3D. We then developed a 3D numerical model in ANSYS CFX to
calculate the air flow and track the particles. We judged the
machine’s ability to separate particles of a given size by counting
how many of 10 particles issued at the inlet escaped from the domain
toward the exit. We confirm that particles stagnate near the exit of
the rotating blades, where the centrifugal force acting on them
balances the air drag force. It was also found that the minimum
particle size the rotor machine can separate is determined by the
particles’ ability to stay at the outlet of the rotor channels.
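The force balance described above has a simple closed form if Stokes (creeping-flow) drag is assumed: the cut size is the diameter at which the centrifugal force on a particle equals the drag of the inward air flow, d = sqrt(18 μ v / (ρ_p ω² r)). This idealisation of the CFD model can be sketched as follows, with all numerical values illustrative:

```python
import math

def stokes_cut_size(mu_air, v_air, rho_p, omega, r):
    """Particle diameter at which the centrifugal force at radius r
    balances the inward Stokes drag of the air stream.  Smaller
    particles are carried out with the air; larger ones are rejected
    by the rotor.  mu_air: air viscosity [Pa s], v_air: radial air
    speed [m/s], rho_p: particle density [kg/m^3], omega: rotor
    angular speed [rad/s], r: radius [m]."""
    return math.sqrt(18.0 * mu_air * v_air / (rho_p * omega**2 * r))
```

Because d scales as 1/ω, doubling the rotor speed halves the cut size, which matches the intuition that a faster rotor rejects finer particles.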
Abstract: Background modeling and subtraction in video analysis
is widely used as an effective method for moving object detection in
many computer vision applications. Recently, a large number of
approaches have been developed to tackle different types of
challenges in this field; however, dynamic backgrounds and
illumination variations are the problems most frequently encountered
in practice. This paper presents a two-layer model based on the
codebook algorithm combined with the local binary pattern (LBP)
texture measure, targeted at handling dynamic-background and
illumination-variation problems. More specifically, the first layer
is a block-based codebook combined with an LBP histogram and the mean
value of each RGB color channel. Because LBP features are invariant
to monotonic gray-scale changes, this layer produces block-wise
detection results with considerable tolerance of illumination
variations. A pixel-based codebook is then employed to refine the
output of the first layer and further eliminate false positives. As a
result, the proposed approach greatly improves accuracy under dynamic
background and illumination changes.
Experimental results on several popular background subtraction
datasets demonstrate very competitive performance compared to
previous models.
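The LBP texture measure on which the first layer relies can be sketched as follows; its key property, invariance to monotonic gray-scale changes, is what provides the tolerance to illumination variation noted above:

```python
import numpy as np

def lbp_3x3(img):
    """Basic 8-neighbour local binary pattern for each interior pixel:
    neighbours >= centre contribute a 1-bit, packed clockwise from the
    top-left neighbour into an 8-bit code."""
    img = np.asarray(img, dtype=float)
    c = img[1:-1, 1:-1]                     # interior (centre) pixels
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros(c.shape, dtype=int)
    for bit, (dy, dx) in enumerate(shifts):
        n = img[1 + dy:img.shape[0] - 1 + dy,
                1 + dx:img.shape[1] - 1 + dx]
        code += (n >= c).astype(int) << bit
    return code
```

Scaling all intensities by the same factor leaves every neighbour-versus-centre comparison, and hence every code, unchanged, which is the invariance the two-layer model exploits; the first layer then histograms these codes per block.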
Abstract: Rapid technological developments in the present age
have made it necessary for communities to follow and adapt to them.
One of the fields most rapidly affected is undoubtedly education.
Determining the attitudes of preservice teachers, who live in an age
of technology and are preparing to educate future individuals, is of
paramount importance both educationally and professionally.
professionally. The purpose of this study was to analyze attitudes of
preservice teachers towards technology and some variables that
predict these attitudes (gender, daily duration of internet use, and the
number of technical devices owned). 329 preservice teachers attending
the education faculty of a large university in central Turkey
participated, on a volunteer basis, in this study, in which the
relational survey model was used as the research method. The findings
reveal that preservice teachers’ attitudes towards technology are
positive and that the attitudes of male preservice teachers are more
positive than those of their female counterparts. In the stepwise
multiple regression analysis of factors predicting preservice
teachers’ attitudes towards technology, daily duration of internet
use was found to be the strongest predictor.
Abstract: Over the last two decades, externally bonding fiber
reinforced polymer (FRP) composites to concrete substrates has become
a popular method for strengthening reinforced concrete (RC) highway
and railway bridges. Such structures are exposed to
severe cyclic loading throughout their lifetime often resulting in
fatigue damage to structural components and a reduction in the
service life of the structure. Since experimental and numerical results
on the fatigue performance of FRP-to-concrete joints are still limited,
the current research focuses on assessing the fatigue performance of
externally bonded FRP-to-concrete joints using a direct shear test.
Some early results indicate that the stress ratio and the applied cyclic
stress level have a direct influence on the fatigue life of the externally
bonded FRP. In addition, a calibrated finite element model is
developed to provide further insight into the influence of parameters
such as concrete strength, FRP thickness, number of cycles,
frequency, and stiffness on the fatigue life of the FRP-to-concrete
joints.
Abstract: English, like any other language, is rich in arbitrary, conventional symbols, which lend it many inconsistencies in spelling, phonology, syntax, and morphology. This research examines the irregularities prevalent in the structure and meaning of some ‘-er’ lexical items in English and their implications for vocabulary acquisition. It centers its investigation on the derivational suffix ‘-er’, which changes the grammatical category of a word. The English language poses many challenges to second language learners because of its irregularities, exceptions, and rules. One meaning of the -er derivational suffix is someone who does something; this rule often confuses learners when they meet the exceptions in normal discourse. The need to investigate instances of such inconsistencies in the formation of -er words, and the meanings students give to such words, motivated this study. For this purpose, some senior secondary two (SS2) students in six randomly selected schools in the metropolis were given a large number of alphabetically selected words ending in the -er suffix. The researcher opted for a test technique, which required them to provide the meaning of the selected -er words. The test was scored on a 1-0 scale, in which a correct formation of an -er word and its meaning scored one, while a wrong formation and meaning scored zero. The numbers of wrong and correct formations of -er word meanings were calculated as percentages. The results show that a large number of students made wrong generalizations about the meaning of the selected -er ending words. This shows how extensive the inconsistencies in the English language are and how they affect the learning of English. Findings from the study revealed that although students had mastered the basic morphological rules, errors were generally committed on vocabulary items that are not frequently in use.
The study arrives at this conclusion from a survey of the students’ textbooks and their spoken activities. The researcher therefore recommends an effective reappraisal of language teaching through implementation of the designed curriculum so as to reflect modern strategies of language teaching; the identification and incorporation of the exceptions into rigorous communicative activities in language teaching, language course books, and tutorials; and the training and retraining of teachers in strategies that conform to the new pedagogy.