Abstract: Mixed-traffic (e.g., pedestrians, bicycles, and vehicles)
data at an intersection is one of the essential factors for intersection
design and traffic control. However, some data such as pedestrian
volume cannot be directly collected by common detectors (e.g.
inductive loop, sonar, and microwave sensors). In this paper, a
video-based detection algorithm is proposed for mixed-traffic data collection
at intersections using surveillance cameras. The algorithm is derived
from Gaussian Mixture Model (GMM), and uses a mergence time
adjustment scheme to improve the traditional algorithm. Real-world
video data were selected to test the algorithm. The results show that
the proposed algorithm is faster and more accurate than the
traditional algorithm, indicating that the improved algorithm can be
applied to detect mixed traffic at signalized intersections, even
when conflicts occur.
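The abstract does not spell out the mergence-time adjustment scheme, but the underlying GMM idea, modelling each pixel's background statistically and flagging pixels that deviate, can be sketched. The Python sketch below keeps a single Gaussian per pixel (a real GMM keeps several per pixel); the learning rate, threshold, and toy frame are illustrative assumptions, not the paper's values.

```python
import numpy as np

def update_background(frame, mean, var, alpha=0.05, k=2.5):
    """One step of a simplified (single-Gaussian) background model.

    A full GMM keeps several Gaussians per pixel; this sketch keeps
    one to illustrate the match/update logic. `alpha` is the
    learning rate and `k` the match threshold in standard deviations.
    """
    diff = frame - mean
    # pixels whose intensity does not match the background model
    foreground = np.abs(diff) > k * np.sqrt(var)
    # update the model only at matched (background) pixels
    mean = np.where(foreground, mean, mean + alpha * diff)
    var = np.where(foreground, var, (1 - alpha) * var + alpha * diff ** 2)
    return foreground, mean, var

# toy example: static background with one bright moving object pixel
bg = np.full((4, 4), 50.0)
mean, var = bg.copy(), np.full((4, 4), 4.0)
frame = bg.copy()
frame[1, 1] = 200.0                      # a "vehicle" pixel
fg, mean, var = update_background(frame, mean, var)
print(fg[1, 1], fg[0, 0])                # True False
```

In practice each pixel would carry several weighted Gaussians, and the paper's mergence-time adjustment would control how quickly stopped objects are absorbed into the background.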
Abstract: Despite extensive study on wireless sensor network
security, defending against internal attacks and detecting abnormal
sensor behaviour remain difficult, unsolved tasks. Conventional
cryptographic techniques do not provide robust security or a
detection process that protects the network from internal attackers
exhibiting abnormal behaviour. This paper presents a framework for
identifying an insider attacker or abnormally behaving sensor and
detecting its location, using false message detection and Time
Difference of Arrival (TDoA). It is shown that the new framework can
efficiently identify the insider attacker and detect its location, so
that the attacker can be reprogrammed or removed from the network to
protect it from internal attack.
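The TDoA localization step can be illustrated with a small sketch: given known sensor positions and measured arrival-time differences, the source (here, a misbehaving node) is located by minimising the TDoA mismatch. The sensor layout, propagation speed, and brute-force grid search below are illustrative assumptions, not the paper's method; real systems typically solve a hyperbolic least-squares problem.

```python
import numpy as np

# Hypothetical square sensor layout (metres); values are illustrative.
sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
c = 340.0  # propagation speed (m/s); acoustic value used for illustration

def tdoa(src, sensors, c):
    """Arrival-time differences relative to the first sensor."""
    d = np.linalg.norm(sensors - src, axis=1)
    return (d - d[0]) / c

def locate(measured, sensors, c, grid_res=0.5):
    """Brute-force grid search minimising TDoA mismatch (a sketch;
    hyperbolic least squares would be used in practice)."""
    best, best_err = None, np.inf
    for x in np.arange(0.0, 10.01, grid_res):
        for y in np.arange(0.0, 10.01, grid_res):
            cand = np.array([x, y])
            err = np.sum((tdoa(cand, sensors, c) - measured) ** 2)
            if err < best_err:
                best, best_err = cand, err
    return best

true_src = np.array([3.0, 7.0])          # the misbehaving node
est = locate(tdoa(true_src, sensors, c), sensors, c)
print(est)                               # ~[3. 7.]
```

With noisy measurements the grid minimum would only approximate the true position, and finer localisation would refine around it.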
Abstract: The paper demonstrates the possibility of controlling
two-phase flows by exploiting unsteady effects. In the first problem,
we consider conditions under which fragmentation of the interface
between the two components intensifies mixing. The problem is solved
for temporal and linear scales that are too small for a developed
mixing layer to appear. It is shown that conditions exist on the
unsteady flow velocity at the channel surface that lead to the
creation and fragmentation of vortices at Reynolds numbers of order
unity. It is also shown that the Reynolds number is not a similarity
criterion for this type of flow, but a criterion can be introduced
that depends on both the Reynolds number and the vortex-splitting
frequency. A notable feature of this situation is that the
streamlines behave stably, while the interface between the components
exhibits all the properties of unstable flows. In the second problem,
we consider the behavior of solid impurities in an extensive system
of channels. An unsteady periodic flow modeling breathing is
simulated, and the behavior of the particles along their trajectories
is examined. It is shown that, depending on their mass and diameter,
the particles can collect in a caustic on the channel walls, stop at
a certain place, or fly back. The frequency distribution of particle
velocity is also of interest. It turns out that, by choosing the
behavior of the velocity field of the carrier gas, one can affect the
trajectories of individual particles, including forcing them to fly
back.
Abstract: This article first summarizes the reasons why current approaches supporting Open Learning and Distance Education need to be complemented by tools permitting lecturers, researchers, and students to cooperatively organize the semantic content of learning-related materials (courses, discussions, etc.) into a fine-grained shared semantic network. This first part of the article also briefly describes the approach adopted to permit such collaborative work. Then, examples of such semantic networks are presented. Finally, an evaluation of the approach by students is provided and analyzed.
Abstract: The paper focuses on the area of context modeling with respect to the specification of context-aware systems supporting ubiquitous applications. The proposed approach, followed within the SIMPLICITY IST project, uses a high-level system ontology to derive context models for system components, which are consequently mapped to the system's physical entities. For the definition of user- and device-related context models in particular, the paper suggests a standards-based process consisting of an analysis phase using the Common Information Model (CIM) methodology, followed by an implementation phase that defines 3GPP-based components. The benefits of this approach are further illustrated by preliminary examples of XML grammars defining profiles and components, component instances, coupled with descriptions of respective ubiquitous applications.
Abstract: A study was carried out to determine the effect of water quality on flotation performance. The experimental test work comprised batch flotation tests using a Denver lab cell for a period of 10 minutes. Nine different test runs were carried out in triplicate to ensure reproducibility, using different water types from different thickener overflows, return and sewage effluent water (process water), and potable water. The water sources differed in pH, total dissolved solids, total suspended solids, and conductivity. Process water was found to reduce the concentrate recovery and mass pull, while potable water increased the concentrate recovery and mass pull. Potable water reduced the concentrate grade, while process water increased the concentrate grade. It is proposed that a combination of process water and potable water supply be used in flotation circuits to balance the different effects that the different water types have on flotation efficiency.
Abstract: The recognition of human faces, especially those with
different orientations is a challenging and important problem in image
analysis and classification. This paper proposes an effective scheme
for rotation invariant face recognition using Log-Polar Transform and
Discrete Cosine Transform combined features. The rotation invariant
feature extraction for a given face image involves applying the
log-polar transform to eliminate the rotation effect and to produce a
row-shifted log-polar image. The discrete cosine transform is then applied
to eliminate the row shift effect and to generate the low-dimensional
feature vector. A PSO-based feature selection algorithm is utilized to
search the feature vector space for the optimal feature subset.
Evolution is driven by a fitness function defined in terms of
maximizing the between-class separation (scatter index).
Experimental results, based on the ORL face database with test sets
of images at different orientations, show that the proposed system
outperforms other face recognition methods. The overall recognition
rate for the rotated test images is 97%, demonstrating that the
extracted feature vector is an effective rotation-invariant feature
set with a minimal number of selected features.
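The key property exploited above, namely that a rotation of the face becomes a circular row shift of its log-polar image which a transform magnitude can cancel, can be checked numerically. The sketch below uses the DFT magnitude, whose circular-shift invariance is exact, as a stand-in for the paper's DCT step; the small array stands in for a log-polar image (rows index angle, columns index log-radius).

```python
import numpy as np

# A tiny stand-in for a log-polar "image": rows = angle, cols = log-radius.
lp = np.arange(24, dtype=float).reshape(6, 4)

# Rotating the input face circularly shifts the rows of its log-polar image.
lp_rot = np.roll(lp, 2, axis=0)

# The magnitude of the DFT taken along the angle axis is invariant to that
# circular shift (the paper uses the DCT for the same purpose; the DFT
# magnitude is shown here because its shift invariance is exact and easy
# to verify).
f1 = np.abs(np.fft.fft(lp, axis=0))
f2 = np.abs(np.fft.fft(lp_rot, axis=0))
print(np.allclose(f1, f2))  # True
```

The resulting shift-invariant coefficients would then feed the PSO-based feature selection described in the abstract.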
Abstract: This study was carried out to elicit the views of senior-year Elementary Mathematics Education preservice teachers on proving. Data were obtained via surveys and interviews conducted with 104 preservice teachers. According to the findings, although the preservice teachers hold positive views about using proving in mathematics teaching, their experience of proving is limited to their courses, and they regard proving as something done only for exams. Furthermore, they expressed in the interviews that proving is difficult for them, and for this reason they prefer memorizing instead of learning.
Abstract: Digital watermarking is one of the techniques for
copyright protection. In this paper, a normalization-based robust
image watermarking scheme which encompasses singular value
decomposition (SVD) and discrete cosine transform (DCT)
techniques is proposed. For the proposed scheme, the host image is
first normalized to a standard form and divided into non-overlapping
image blocks. SVD is applied to each block. By concatenating the
first singular values (SV) of adjacent blocks of the normalized image,
an SV block is obtained. DCT is then carried out on the SV blocks to
produce SVD-DCT blocks. A watermark bit is embedded in the
high-frequency band of an SVD-DCT block by imposing a particular
relationship between two pseudo-randomly selected DCT
coefficients. An adaptive frequency mask is used to adjust local
watermark embedding strength. Watermark extraction involves
mainly the inverse process. The watermark extracting method is blind
and efficient. Experimental results show that the quality degradation
of the watermarked image caused by the embedded watermark is visually
transparent. Results also show that the proposed scheme is robust
against various image processing operations and geometric attacks.
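The core embedding rule, forcing an order relation between two pseudo-randomly selected DCT coefficients, can be sketched as follows. The code operates directly on a coefficient vector to isolate that rule; the indices, margin, and values are illustrative assumptions, and the normalization, SVD, and adaptive frequency mask stages of the actual scheme are omitted.

```python
import numpy as np

def embed_bit(coeffs, i, j, bit, margin=2.0):
    """Embed one bit by imposing an order relation between two selected
    coefficients i and j (a simplified sketch of the high-frequency
    embedding rule; indices and margin are illustrative)."""
    c = coeffs.copy()
    lo, hi = min(c[i], c[j]), max(c[i], c[j])
    mid = (lo + hi) / 2.0
    if bit:   # bit 1 -> c[i] > c[j]
        c[i], c[j] = mid + margin / 2.0, mid - margin / 2.0
    else:     # bit 0 -> c[i] < c[j]
        c[i], c[j] = mid - margin / 2.0, mid + margin / 2.0
    return c

def extract_bit(coeffs, i, j):
    """Blind extraction: only the coefficient order is inspected,
    so the original image is not needed."""
    return int(coeffs[i] > coeffs[j])

coeffs = np.array([10.0, -3.0, 4.0, 1.5, -0.5, 0.2])
marked = embed_bit(coeffs, 3, 5, bit=1)
print(extract_bit(marked, 3, 5))  # 1
```

The margin trades imperceptibility against robustness: a larger margin survives stronger attacks but perturbs the coefficients (and hence the image) more.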
Abstract: Noise in a micro stepping motor is investigated in this
article. Because of the trend towards higher precision and
increasingly small 3C (Computer, Communication, and Consumer
Electronics) products, the micro stepping motor is frequently used to
drive micro systems and other 3C products. Unfortunately, the noise
of a micro stepping motor is too large for customers to accept. To
suppress this noise, the dynamic characteristics of the system must
be studied. In this article, a micro stepping motor in a digital
camera, speed-controlled by a Visual Basic (VB) computer program, is
investigated. A Karman KD2300-2S non-contact eddy-current
displacement sensor, a probe microphone, and an HP 35670A analyzer
are employed to analyze the dynamic characteristics of vibration and
noise in the motor. The vibration and noise measurements for
different types of bearings and different treatments of the coils are
compared. The rotating components of the motor (bearings, coil, etc.)
play important roles in producing vibration and noise. It is found
that the noise is reduced by about 3~4 dB when the copper bearing is
replaced with a plastic one, and by about 6~7 dB when the motor coil
is coated with paraffin wax.
Abstract: The purposes of this study are 1) to study the frequent
English writing errors of students registered in the course Reading
and Writing English for Academic Purposes II, and 2) to find out the
results of writing error correction using coded indirect corrective
feedback and writing error treatments. The sample comprises 28
second-year English major students of the Faculty of Education, Suan
Sunandha Rajabhat University. The tool for the experimental study is
the lesson plan of the course Reading and Writing English for
Academic Purposes II, and the tool for data collection is a set of 4
short-text writing tests. The research findings disclose that the
frequent English writing errors found in this course comprise 7 types
of grammatical errors, namely sentence fragments, subject-verb
agreement, wrong verb tense forms, singular or plural noun endings,
run-on sentences, wrong verb pattern forms, and lack of parallel
structure. Moreover, it is found that writing error correction using
coded indirect corrective feedback and error treatment produces an
overall reduction in the frequent English writing errors and an
increase in students' achievement in the writing of short texts,
significant at the .05 level.
Abstract: Pineapples can be classified using an index with seven
levels of maturity based on the green and yellow color of the skin. As
the pineapple ripens, the skin will change from pale green to a golden
or yellowish color. A current issue in agriculture is that farmers
are unable to distinguish between the indexes of pineapple maturity
correctly and effectively. There are several reasons why farmers
cannot properly follow the guidelines provided by the Federal
Agriculture Marketing Authority (FAMA); one is that, because
inspection is done manually by experts, there are no specific,
universal guidelines for farmers to adopt, owing to the experts'
differing points of view when sorting pineapples based on their
knowledge and experience. Therefore, an automatic system would help
farmers identify pineapple maturity effectively and would serve as a
universal indicator for farmers.
Abstract: This paper discusses E-government, in particular the
challenges that face adoption in Saudi Arabia. E-government can be
defined based on an existing set of requirements. In this research we
define E-government as a matrix of stakeholders: governments to
governments, governments to business and governments to citizens,
using information and communications technology to deliver and
consume services. E-government has been implemented for a
considerable time in developed countries. However, E-government
services still face many challenges in their implementation and
general adoption in many countries including Saudi Arabia. It has
been noted that the introduction of E-government is a major
challenge facing the government of Saudi Arabia, due to possible
concerns raised by its citizens. In addition, the literature review and
the discussion identify the influential factors that affect the citizens’
intention to adopt E-government services in Saudi Arabia.
Consequently, these factors have been defined and categorized
followed by an exploratory study to examine the importance of these
factors. This research has thus identified factors that determine
whether citizens will adopt E-government services, thereby aiding
governments in assessing what is required to increase adoption.
Abstract: This paper presents an interactive modeling system for
uniform polyhedra using isomorphic graphs. In particular,
Kepler-Poinsot solids are formed by modifications of the dodecahedron
and icosahedron.
Abstract: In this paper, an algorithm for estimating blood pressure
from the pulse transit time (PTT) is proposed as a more convenient
method of measuring blood pressure. After the ECG, pressure pulse,
and photoplethysmogram were measured, the PTT was calculated from the
acquired signals. A system to indirectly measure the systolic and
diastolic pressures was then constructed using a statistical method.
In a comparison between the blood pressure estimated by the proposed
algorithm and the blood pressure measured by a conventional
sphygmomanometer, the systolic pressure shows a mean error of
±3.24 mmHg with a standard deviation of 2.53 mmHg, while the
diastolic pressure shows an even more satisfactory result: a mean
error of ±1.80 mmHg with a standard deviation of 1.39 mmHg. These
results satisfy the ANSI/AAMI regulation for certification of
sphygmomanometers, which requires the measurement error to be within
a mean error of ±5 mmHg and a standard deviation of 8 mmHg. They
suggest the possibility of applying the method to portable, long-term
blood pressure monitoring systems.
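The statistical method relating PTT to blood pressure is commonly a linear regression calibrated against cuff readings; pressure falls as transit time rises. The sketch below illustrates that idea with synthetic calibration values, not the paper's data or coefficients.

```python
import numpy as np

# Hypothetical calibration pairs: PTT (ms) vs. cuff systolic pressure
# (mmHg). Values are synthetic; the paper's regression is not disclosed.
ptt = np.array([250.0, 260.0, 270.0, 280.0, 290.0])
sbp = np.array([135.0, 130.0, 125.0, 120.0, 115.0])

# Model the "statistical method" as a simple linear fit: blood pressure
# decreases as pulse transit time increases.
slope, intercept = np.polyfit(ptt, sbp, 1)

def estimate_sbp(ptt_ms):
    """Indirect systolic pressure estimate from a measured PTT."""
    return slope * ptt_ms + intercept

print(round(estimate_sbp(265.0), 1))  # 127.5
```

A separate fit (or a multivariate model) would be calibrated for the diastolic pressure, and per-subject recalibration is typically needed because the PTT-pressure relationship varies between individuals.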
Abstract: Ozone (O3) is considered one of the most phytotoxic
pollutants, with deleterious effects on the living and non-living
components of ecosystems. It reduces the growth and yield of many
crops and alters their physiology and quality. The present study
describes a series of experiments investigating the effects of
ambient O3 at locations whose ambient levels, depending on proximity
to the pollutant source, ranged from 17 ppb/h in the control
experiment to 112 ppb/h in an industrial area. The ambient levels at
the other three locations (King Saud University botanical garden,
King Fahd Rd, and Almanakh Garden) were 61, 61, and 77 ppb/h,
respectively. Two legume crop species (Vicia faba L. and Pisum
sativum) differing in phenology and sensitivity were used. The
results showed a significant negative effect of ozone on morphology,
number of injured leaves, growth, and productivity, with the degree
of response differing by plant type. Vicia faba showed sensitivity to
ozone in the number and area of leaves with injury degree 3, while
Pisum sativum showed higher sensitivity to the gas in injury degree
1, relative growth rate, and seed weight. No significant difference
was found between the two plants in plant height and number of seeds.
Abstract: One of the most important areas of knowledge management studies is knowledge sharing. Measured in terms of the number of scientific articles and organizations' applications, knowledge sharing stands as an example of success in the field. This paper reviews the related papers in the context of the underlying individual behavioral variables to provide a directional framework for future research and writing.
Abstract: In many applications, it is a priori known that the
target function should satisfy certain constraints imposed by, for
example, economic theory or a human-decision maker. Here we
consider partially monotone problems, where the target variable
depends monotonically on some of the predictor variables but not all.
We propose an approach to build partially monotone models based
on the convolution of monotone neural networks and kernel
functions. The results from simulations and a real case study on
house pricing show that our approach has significantly better
performance than partially monotone linear models. Furthermore, the
incorporation of partial monotonicity constraints not only leads to
models that are in accordance with the decision maker's expertise,
but also reduces considerably the model variance in comparison to
standard neural networks with weight decay.
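One standard way to make a network monotone in selected inputs is to constrain the corresponding weights to be non-negative and use non-decreasing activations; the sketch below illustrates that construction on a tiny two-input network. It is a generic illustration under that assumption and does not reproduce the paper's convolution of monotone networks with kernel functions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny two-input network: monotone (non-decreasing) in x1, unconstrained
# in x2. Monotonicity follows from non-negative weights on the x1 path
# combined with a non-decreasing activation (tanh).
W1_mono = np.abs(rng.normal(size=(1, 4)))   # x1 -> hidden, constrained >= 0
W1_free = rng.normal(size=(1, 4))           # x2 -> hidden, unconstrained
W2 = np.abs(rng.normal(size=(4, 1)))        # hidden -> output, >= 0

def net(x1, x2):
    h = np.tanh(x1 * W1_mono + x2 * W1_free)  # non-decreasing in x1
    return (h @ W2)[0, 0]

# The output never decreases as the monotone input grows (x2 fixed),
# regardless of the random weight values.
outs = [net(x1, 0.3) for x1 in np.linspace(-2.0, 2.0, 9)]
print(all(b >= a for a, b in zip(outs, outs[1:])))  # True
```

In a house-pricing setting, x1 might be floor area (price should not fall as area grows) while x2 is an unconstrained predictor such as a location code.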
Abstract: The effect of calcination temperature and MgO crystallite size on the structure and catalytic performance of a TiO2-supported nano-MgO catalyst for the trans-esterification of soybean oil has been studied. The catalyst was prepared by the deposition-precipitation method, characterised by XRD and FTIR, and tested in an autoclave at 225 °C. The soybean oil conversion after 15 minutes of the trans-esterification reaction increased when the calcination temperature was increased from 500 to 600 °C and decreased with further increase in calcination temperature. Some glycerolysis activity was also detected on catalysts calcined at 600 and 700 °C after 45 minutes of reaction. The trans-esterification reaction rate increased with decreasing MgO crystallite size for the first 30 min.
Abstract: Various models have been derived by studying a large number of completed software projects from various organizations and applications to explore how project size maps onto project effort. However, the prediction accuracy of these models still needs improvement. As a Neuro-Fuzzy based system can approximate non-linear functions with greater precision, a Neuro-Fuzzy system is used as a soft computing approach to generate a model by formulating the relationship based on its training. In this paper, the Neuro-Fuzzy technique is used for software effort estimation modeling on NASA software project data, and the performance of the developed models is compared with the Halstead, Walston-Felix, Bailey-Basili, and Doty models from the literature.
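The comparison models named in the abstract are classical size-to-effort formulas. The sketch below uses the coefficient values commonly quoted in the effort-estimation literature (effort in person-months, size in KLOC); verify the exact constants against the original sources before relying on them.

```python
# Classical size-to-effort models, with coefficients as commonly quoted
# in the literature (an assumption; the paper does not restate them here).
def halstead(kloc):
    return 0.7 * kloc ** 1.50

def walston_felix(kloc):
    return 5.2 * kloc ** 0.91

def bailey_basili(kloc):
    return 5.5 + 0.73 * kloc ** 1.16

def doty(kloc):
    # the Doty form typically quoted for projects above ~9 KLOC
    return 5.288 * kloc ** 1.047

for model in (halstead, walston_felix, bailey_basili, doty):
    print(model.__name__, round(model(10.0), 1))
```

A Neuro-Fuzzy model replaces these fixed power-law shapes with a set of fuzzy rules whose parameters are tuned on the NASA project data, which is what allows it to track the non-linear size-effort relationship more closely.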