Abstract: Mixed-traffic data (e.g., pedestrians, bicycles, and
vehicles) at an intersection are one of the essential inputs for
intersection design and traffic control. However, some data, such as
pedestrian volume, cannot be collected directly by common detectors
(e.g., inductive-loop, sonar, and microwave sensors). In this paper,
a video-based detection algorithm is proposed for mixed-traffic data
collection at intersections using surveillance cameras. The algorithm
is derived from the Gaussian Mixture Model (GMM) and uses a
mergence-time adjustment scheme to improve on the traditional
algorithm. Real-world video data were selected to test the algorithm.
The results show that the proposed algorithm is faster and more
accurate than the traditional algorithm, indicating that it can be
applied to detect mixed traffic at signalized intersections, even
when conflicts occur.
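The GMM foundation of such detectors can be pictured with a minimal per-pixel background model. The sketch below is a generic per-pixel Gaussian-mixture update in pure Python; the number of modes, learning rate, and thresholds are illustrative values, and the paper's mergence-time adjustment scheme is not reproduced here.

```python
import math

class PixelGMM:
    """Per-pixel Gaussian mixture background model (sketch).
    All parameters are illustrative, not the paper's."""
    def __init__(self, k=3, alpha=0.05, match_sigma=2.5, bg_weight=0.3):
        self.k, self.alpha = k, alpha
        self.match_sigma, self.bg_weight = match_sigma, bg_weight
        self.modes = []  # each mode: [weight, mean, variance]

    def update(self, x):
        # Try to match x against an existing mode.
        for m in self.modes:
            if abs(x - m[1]) < self.match_sigma * math.sqrt(m[2]):
                m[0] += self.alpha * (1.0 - m[0])              # reinforce weight
                m[1] += self.alpha * (x - m[1])                # adapt mean
                m[2] += self.alpha * ((x - m[1]) ** 2 - m[2])  # adapt variance
                m[2] = max(m[2], 1.0)                          # keep variance sane
                return m[0] >= self.bg_weight                  # background?
        # No match: decay existing weights, insert a new low-weight mode.
        for m in self.modes:
            m[0] *= (1.0 - self.alpha)
        self.modes.append([self.alpha, float(x), 100.0])
        self.modes.sort(key=lambda m: -m[0])
        del self.modes[self.k:]                                # keep k strongest
        return False                                           # treat as foreground

# A static pixel converges to background; a sudden jump is foreground.
gmm = PixelGMM()
labels = [gmm.update(120) for _ in range(50)] + [gmm.update(250)]
```

A full detector would run one such model per pixel and post-process the foreground mask into pedestrian/bicycle/vehicle tracks.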
Abstract: The paper demonstrates the ability to control two-phase
flows by exploiting unsteady effects. In the first case, we consider
conditions under which fragmentation of the interface between the two
components intensifies mixing. The problem is solved for temporal and
linear scales too small for a developed mixing layer to appear. We
show that there exist unsteady flow-velocity conditions at the
channel surface that lead to the creation and fragmentation of
vortices at Reynolds numbers of order unity. We also show that the
Reynolds number alone is not a similarity criterion for this type of
flow, but a criterion can be introduced that depends on both the
Reynolds number and the vortex-splitting frequency. A notable feature
of this regime is that the streamlines behave stably, while the
interface between the components exhibits all the properties of
unstable flows. In the second problem, we consider the behavior of
solid impurities in an extensive system of channels, where a
simulated unsteady periodic flow models breathing. We examine the
behavior of the particles along their trajectories. It is shown that,
depending on their mass and diameter, particles may collect in a
caustic on the channel walls, stop at a certain location, or fly
back. The distribution of particle velocity over frequency is also of
interest. It turns out that, by choosing the behavior of the velocity
field of the carrier gas, one can affect the trajectories of
individual particles, including forcing them to fly back.
Abstract: The binder drainage test is widely used to set an upper
limit on the design binder content of porous asphalt. However, the
presence of a high amount of fine particles in the drained binder may
affect the accuracy of the test result. This paper presents a study
to characterize the composition and particle size distribution of the
fine particles accumulated in the drained binder. Fine aggregates and
filler in the drained binder were extracted using a suitable solvent.
Wet and dry sieve analyses were then carried out to identify the
actual composition of the extracted fine aggregates and filler. The
results show that almost half of the drained binder consisted of fine
aggregates, which significantly affects the accuracy of the design
binder content of a porous asphalt mix. This simple finding
highlights the importance of taking the presence of fine aggregates
into account in the calculation of drained binder.
Abstract: This article first summarizes the reasons why current approaches supporting Open Learning and Distance Education need to be complemented by tools permitting lecturers, researchers and students to cooperatively organize the semantic content of learning-related materials (courses, discussions, etc.) into a fine-grained shared semantic network. This first part of the article also briefly describes the approach adopted to permit such collaborative work. Then, examples of such semantic networks are presented. Finally, an evaluation of the approach by students is provided and analyzed.
Abstract: The paper focuses on the area of context modeling with respect to the specification of context-aware systems supporting ubiquitous applications. The proposed approach, followed within the SIMPLICITY IST project, uses a high-level system ontology to derive context models for system components, which are subsequently mapped to the system's physical entities. For the definition of user- and device-related context models in particular, the paper suggests a standards-based process consisting of an analysis phase using the Common Information Model (CIM) methodology, followed by an implementation phase that defines 3GPP-based components. The benefits of this approach are further illustrated by preliminary examples of XML grammars defining profiles and components, and of component instances, coupled with descriptions of the respective ubiquitous applications.
Abstract: A study was carried out to determine the effect of water quality on flotation performance. The experimental test work comprised batch flotation tests in a Denver laboratory cell for a period of 10 minutes. Nine different test runs were carried out in triplicate to ensure reproducibility, using water types drawn from different thickener overflows, return and sewage effluent water (process water), and potable water. The water sources differed in pH, total dissolved solids, total suspended solids and conductivity. Process water was found to reduce the concentrate recovery and mass pull, while potable water increased the concentrate recovery and mass pull. Potable water reduced the concentrate grade, while process water increased the concentrate grade. It is proposed that a combination of process water and potable water supply be used in flotation circuits to balance the different effects that the two water types have on flotation efficiency.
Abstract: The recognition of human faces, especially those with
different orientations, is a challenging and important problem in
image analysis and classification. This paper proposes an effective
scheme for rotation-invariant face recognition using combined
Log-Polar Transform and Discrete Cosine Transform features. The
rotation-invariant feature extraction for a given face image involves
applying the log-polar transform to eliminate the rotation effect and
to produce a row-shifted log-polar image. The discrete cosine
transform is then applied to eliminate the row-shift effect and to
generate the low-dimensional feature vector. A PSO-based feature
selection algorithm is utilized to search the feature-vector space
for the optimal feature subset. Evolution is driven by a fitness
function defined in terms of maximizing the between-class separation
(scatter index). Experimental results, based on the ORL face database
using test data sets of images with different orientations, show that
the proposed system outperforms other face recognition methods. The
overall recognition rate for the rotated test images is 97%,
demonstrating that the extracted feature vector is an effective
rotation-invariant feature set with a minimal set of selected
features.
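The two transforms can be illustrated with a pure-Python sketch: a small grayscale image (list of lists) is resampled onto a log-polar grid, where an in-plane rotation becomes a row shift, and a direct 2-D DCT-II is then applied. Grid sizes, the nearest-neighbour sampling, and the unnormalized DCT are our simplifications, not the paper's exact pipeline.

```python
import math

def log_polar(img, n_rho=8, n_theta=8):
    """Nearest-neighbour log-polar resampling of a square grayscale
    image; an in-plane rotation of the input becomes a shift along
    the theta (row) axis of the output."""
    n = len(img)
    cx = cy = (n - 1) / 2.0
    log_r_max = math.log(n / 2.0)
    out = []
    for t in range(n_theta):
        theta = 2.0 * math.pi * t / n_theta
        row = []
        for r in range(n_rho):
            rho = math.exp(log_r_max * (r + 1) / n_rho)
            x = min(max(int(round(cx + rho * math.cos(theta))), 0), n - 1)
            y = min(max(int(round(cy + rho * math.sin(theta))), 0), n - 1)
            row.append(img[y][x])
        out.append(row)
    return out

def dct2(block):
    """Direct (unnormalized) 2-D DCT-II; fine for small blocks."""
    n, m = len(block), len(block[0])
    out = [[0.0] * m for _ in range(n)]
    for u in range(n):
        for v in range(m):
            s = 0.0
            for i in range(n):
                for j in range(m):
                    s += (block[i][j]
                          * math.cos(math.pi * (2 * i + 1) * u / (2 * n))
                          * math.cos(math.pi * (2 * j + 1) * v / (2 * m)))
            out[u][v] = s
    return out

img = [[i + j for j in range(8)] for i in range(8)]  # toy gradient image
lp = log_polar(img)
feats = dct2(lp)  # low-frequency entries would form the feature vector
```

In the paper's pipeline, the PSO-based selector would then pick a subset of these coefficients as the final feature vector.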
Abstract: The performance of an image filtering system depends on
its ability to detect the presence of noisy pixels in the image. Most
impulse detection schemes assume the presence of salt-and-pepper
noise in the images and do not work satisfactorily in the case of
uniformly distributed impulse noise. In this paper, a new algorithm
is presented to improve the performance of the switching median
filter in the detection of uniformly distributed impulse noise. The
performance of the proposed scheme is demonstrated by results
obtained from computer simulations on various images.
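The switching idea itself is simple to sketch: a sample is replaced by its local median only when the detector flags it as an impulse, so uncorrupted detail is left untouched. The 1-D version below uses a plain deviation threshold; the threshold value and the detector are illustrative, the paper's detector for uniformly distributed impulses being more elaborate.

```python
def switching_median(signal, window=3, threshold=40):
    """Sketch of a switching median filter on a 1-D signal: a sample
    is replaced by its local median only when it deviates strongly
    from that median (the detector 'switches' the filter on)."""
    half = window // 2
    out = list(signal)
    for i in range(half, len(signal) - half):
        neigh = sorted(signal[i - half:i + half + 1])
        med = neigh[len(neigh) // 2]
        if abs(signal[i] - med) > threshold:  # impulse detected
            out[i] = med                      # filter only this sample
    return out

noisy = [100, 102, 255, 103, 100, 102]        # one impulse at index 2
filtered = switching_median(noisy)
```

For images, the same test runs over a 2-D window; only the flagged pixels are replaced, which is what preserves edges relative to a plain median filter.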
Abstract: Digital watermarking is one of the techniques for
copyright protection. In this paper, a normalization-based robust
image watermarking scheme which encompasses singular value
decomposition (SVD) and discrete cosine transform (DCT)
techniques is proposed. For the proposed scheme, the host image is
first normalized to a standard form and divided into non-overlapping
image blocks. SVD is applied to each block. By concatenating the
first singular values (SV) of adjacent blocks of the normalized
image, an SV block is obtained. DCT is then carried out on the SV
blocks to produce SVD-DCT blocks. A watermark bit is embedded in the
high-frequency band of an SVD-DCT block by imposing a particular
relationship between two pseudo-randomly selected DCT coefficients.
An adaptive frequency mask is used to adjust the local watermark
embedding strength. Watermark extraction mainly involves the inverse
process. The watermark extraction method is blind
and efficient. Experimental results show that the quality degradation
of watermarked image caused by the embedded watermark is visually
transparent. Results also show that the proposed scheme is robust
against various image processing operations and geometric attacks.
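The coefficient-relationship idea can be sketched independently of the SVD/DCT machinery: given an already-computed list of DCT coefficients, one bit is embedded by forcing an order relation between two selected positions, and blind extraction only inspects that relation. The positions, margin, and averaging rule below are illustrative assumptions, not the paper's exact scheme (which also applies an adaptive frequency mask).

```python
def embed_bit(coeffs, i, j, bit, margin=4.0):
    """Embed one watermark bit into a block of DCT coefficients by
    imposing an order relation between coefficients i and j."""
    mid = (coeffs[i] + coeffs[j]) / 2.0
    if bit:  # bit 1: coefficient i must exceed coefficient j
        coeffs[i], coeffs[j] = mid + margin / 2.0, mid - margin / 2.0
    else:    # bit 0: the opposite relation
        coeffs[i], coeffs[j] = mid - margin / 2.0, mid + margin / 2.0
    return coeffs

def extract_bit(coeffs, i, j):
    """Blind extraction: only the coefficient relation is inspected."""
    return 1 if coeffs[i] > coeffs[j] else 0

block = [12.0, -3.5, 7.1, 0.4, 5.0, -1.2]     # toy high-frequency band
marked1 = embed_bit(list(block), 2, 4, 1)
marked0 = embed_bit(list(block), 2, 4, 0)
```

Because only the relative order carries the bit, the detector needs neither the host image nor the original coefficients, which is what makes the scheme blind.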
Abstract: This article investigates noise in a micro stepping
motor. With the trend toward higher precision and ever smaller 3C
(Computer, Communication and Consumer Electronics) products, micro
stepping motors are frequently used to drive micro systems and other
3C products. Unfortunately, the noise of a micro stepping motor is
often too loud for customers to accept. To suppress this noise, the
dynamic characteristics of the system must be studied. In this
article, a micro stepping motor in a digital camera, speed-controlled
by a Visual Basic (VB) computer program, is investigated. A Kaman
KD2300-2S non-contact eddy-current displacement sensor, a probe
microphone, and an HP 35670A analyzer are employed to analyze the
dynamic characteristics of vibration and noise in the motor.
Vibration and noise measurements for different types of bearings and
different coil treatments are compared. The rotating components of
the motor (bearings, coil, etc.) play important roles in producing
vibration and noise. It is found that the noise is reduced by about
3~4 dB when the copper bearing is replaced with a plastic one, and by
6~7 dB when the motor coil is coated with paraffin wax.
Abstract: Pineapples can be classified using an index with seven
levels of maturity based on the green and yellow color of the skin.
As the pineapple ripens, the skin changes from pale green to a golden
or yellowish color. A current issue in agriculture is that farmers
are unable to distinguish between the indexes of pineapple maturity
correctly and effectively. There are several reasons why farmers
cannot properly follow the guidelines provided by the Federal
Agriculture Marketing Authority (FAMA); one is that, because
inspection is done manually by experts, there are no specific and
universal guidelines for farmers to adopt, owing to the experts'
differing points of view when sorting pineapples based on their
knowledge and experience. Therefore, an automatic system would help
farmers to identify pineapple maturity effectively and would serve as
a universal indicator for farmers.
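A minimal sketch of the color-based idea, assuming RGB skin pixels: the fraction of yellow-dominant pixels is mapped onto a 1..7 index. The thresholds and the linear mapping below are hypothetical illustrations, not FAMA's actual guideline.

```python
def maturity_index(pixels, n_levels=7):
    """Toy maturity score from a list of (r, g, b) skin pixels: count
    'yellow-dominant' pixels (high red and green, low blue) and map
    their fraction onto a 1..n_levels ripeness index."""
    yellow = sum(1 for r, g, b in pixels
                 if r > 120 and g > 120 and b < 100)
    frac = yellow / len(pixels)
    return 1 + min(n_levels - 1, int(frac * n_levels))

green_skin  = [(60, 140, 60)] * 10    # mostly green skin -> index 1
yellow_skin = [(200, 180, 60)] * 10   # mostly yellow skin -> index 7
```

A practical system would calibrate such thresholds against expert-labeled samples rather than fixed constants.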
Abstract: This paper discusses the causal explanation capability
of QRIOM, a tool aimed at supporting the learning of organic
chemistry reactions. The development of the tool is based on the
hybrid use of the Qualitative Reasoning (QR) technique and the
Qualitative Process Theory (QPT) ontology. Our simulation combines
symbolic, qualitative descriptions of relations with quantity
analysis to generate causal graphs. The pedagogy embedded in the
simulator is to both simulate and explain organic reactions.
Qualitative reasoning through a causal chain is presented to explain
the overall changes made to the substrate, from the initial substrate
to the production of the final outputs. Several uses of the QPT
modeling constructs in supporting behavioral and causal explanation
at run-time are also demonstrated. Explaining organic reactions
through a causal-graph trace can help improve learners' reasoning
ability, nurturing their conceptual understanding of the subject.
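The causal-chain idea can be pictured as an ordered walk over a small directed graph of qualitative influences. The node names and graph below are hypothetical, not QRIOM's actual model; the sketch only shows how a traversal yields an ordered chain that could back a textual explanation.

```python
# Toy causal graph for one reaction step: nodes are quantities, edges
# are qualitative influences (all names are invented for illustration).
causal = {
    "nucleophile_concentration": ["reaction_rate"],
    "reaction_rate": ["substrate_amount", "product_amount"],
}

def explain(start, graph):
    """Depth-first walk over the causal graph, returning the causal
    chain in the order an explanation would present it."""
    chain, stack, seen = [], [start], set()
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        chain.append(node)
        stack.extend(reversed(graph.get(node, [])))
    return chain

chain = explain("nucleophile_concentration", causal)
```

Each edge in the chain would be rendered as one sentence of the generated explanation ("an increase in X drives a change in Y").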
Abstract: In illumination-variant face recognition, existing
methods that extract the face albedo as a light-normalized image may
lose extensive facial detail, since the light template is discarded.
To improve on this, a novel approach for realistic facial texture
reconstruction that combines the original image and the albedo image
is proposed. First, light subspaces for different identities are
established from the given reference face images; then, by projecting
the original and albedo images into each light subspace respectively,
texture reference images with the corresponding lighting are
reconstructed and two texture subspaces are formed. From the
projections in the texture subspaces, a facial texture with normal
lighting can be synthesized. Owing to the combination with the
original image, facial details can be preserved along with the face
albedo. In addition, image partitioning is applied to improve the
synthesis performance. Experiments on the Yale B and CMU PIE
databases demonstrate that this algorithm outperforms the others
both in image representation and in face recognition.
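The projection step can be sketched with a toy orthonormal basis: projecting a vectorized image onto a subspace yields the coefficients and the reconstruction that such methods combine. Real light subspaces would be learned (e.g., by PCA) from the reference images; the four-dimensional basis here is a stand-in.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project(vec, ortho_basis):
    """Project vec onto the span of an orthonormal basis, returning
    the subspace coefficients and the reconstructed vector."""
    coeffs = [dot(vec, b) for b in ortho_basis]
    recon = [sum(c * b[i] for c, b in zip(coeffs, ortho_basis))
             for i in range(len(vec))]
    return coeffs, recon

basis = [[1.0, 0.0, 0.0, 0.0],        # toy orthonormal 'light subspace'
         [0.0, 1.0, 0.0, 0.0]]
face = [3.0, -2.0, 5.0, 1.0]          # a vectorized image (toy values)
coeffs, recon = project(face, basis)
# recon keeps only the components explained by the subspace
```

In the paper's setting, both the original and the albedo image would be projected this way into each identity's light subspace before the textures are merged.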
Abstract: Structural representation and technology mapping of
a Boolean function is an important problem in the design of
non-regenerative digital logic circuits (also called combinational
logic circuits). Library-aware function manipulation offers a
solution to this problem. Compact multi-level representations of
binary networks based on simple circuit structures, such as
AND-Inverter Graphs (AIG) [1] [5], NAND Graphs, OR-Inverter Graphs
(OIG), AND-OR Graphs (AOG), AND-OR-Inverter Graphs (AOIG),
AND-XOR-Inverter Graphs, and Reduced Boolean Circuits [8], exist in
the literature. In this work, we discuss a novel and efficient graph
realization for combinational logic circuits, represented using a
NAND-NOR-Inverter Graph (NNIG), which is composed of only
two-input NAND (NAND2), NOR (NOR2) and inverter (INV) cells.
The networks are constructed on the basis of irredundant disjunctive
and conjunctive normal forms, after factoring, comprising terms with
minimum support. Construction of a NNIG for a non-regenerative
function in normal form would be straightforward, whereas for the
complementary phase, it would be developed by considering a virtual
instance of the function. However, the choice of best NNIG for a
given function would be based upon literal count, cell count and
DAG node count of the implementation at the technology
independent stage. In case of a tie, the final decision would be made
after extracting the physical design parameters.
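The NNIG structure itself is easy to picture: a DAG whose internal nodes are NAND2, NOR2, or INV cells over primary inputs. The tiny sketch below (our own helper class, not the authors' data structure) builds x XOR y from four NAND2 nodes and evaluates its truth table.

```python
class Node:
    """One NNIG cell: 'NAND2', 'NOR2', or 'INV'; children are either
    Nodes or primary-input names looked up in an environment."""
    def __init__(self, kind, a, b=None):
        self.kind, self.a, self.b = kind, a, b

    def eval(self, env):
        a = self.a.eval(env) if isinstance(self.a, Node) else env[self.a]
        if self.kind == "INV":
            return 1 - a
        b = self.b.eval(env) if isinstance(self.b, Node) else env[self.b]
        if self.kind == "NAND2":
            return 1 - (a & b)
        if self.kind == "NOR2":
            return 1 - (a | b)
        raise ValueError(self.kind)

# x XOR y from four NAND2 cells: t = NAND(x, y),
# f = NAND(NAND(x, t), NAND(y, t))
t = Node("NAND2", "x", "y")
f = Node("NAND2", Node("NAND2", "x", t), Node("NAND2", "y", t))
truth = [f.eval({"x": x, "y": y}) for x in (0, 1) for y in (0, 1)]
```

Sharing the intermediate node t (used by two parents) is what makes this a DAG rather than a tree, and the node/cell counts compared in the abstract are counts over such shared structures.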
We have considered AIG representation for reduced disjunctive
normal form and the best of OIG/AOG/AOIG for the minimized
conjunctive normal forms. This is necessitated due to the nature of
certain functions, such as Achilles-heel functions. NNIGs are found
to exhibit a 3.97% lower node count than AIGs and OIG/AOG/AOIGs, and
to consume 23.74% and 10.79% fewer library cells than AIGs and
OIG/AOG/AOIGs, respectively, for the various samples considered.
We compare the power efficiency and delay improvement achieved
by optimal NNIGs over minimal AIGs and OIG/AOG/AOIGs for
various case studies. In comparison with functionally equivalent,
irredundant and compact AIGs, NNIGs report mean savings in power
and delay of 43.71% and 25.85% respectively, after technology
mapping with a 0.35 micron TSMC CMOS process. For a
comparison with OIG/AOG/AOIGs, NNIGs demonstrate average
savings in power and delay of 47.51% and 24.83%, respectively. With
respect to the device count needed for implementation in a static
CMOS logic style, NNIGs use 37.85% and 33.95% fewer transistors than
their AIG and OIG/AOG/AOIG counterparts.
Abstract: Phylogenies, the evolutionary histories of groups of
species, are among the most widely used tools throughout the life
sciences, as well as objects of research within systematic and
evolutionary biology. Every phylogenetic analysis reconstructs trees,
which represent the evolutionary histories of many groups of
organisms. However, gene transfer in bacteria (through horizontal
gene transfer) and hybridization in plants lead to reticulate
networks, and tree-building methods fail to construct such reticulate
networks. In this paper, a model is employed to reconstruct a
phylogenetic network for the honey bee, representing its reticulate
evolution. The maximum parsimony approach is used to obtain this
reticulate network.
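The tree-based parsimony ingredient can be sketched with Fitch's small-parsimony pass, which counts the minimum number of character changes a rooted binary tree requires; the paper's network construction builds on such scores but is more involved. The tree and leaf states below are toy data.

```python
def fitch(tree, states):
    """Fitch's small-parsimony pass for one character: returns the
    candidate state set at this node and the minimum change count."""
    if isinstance(tree, str):                 # leaf: its known state
        return {states[tree]}, 0
    (ls, lc), (rs, rc) = fitch(tree[0], states), fitch(tree[1], states)
    inter = ls & rs
    if inter:                                 # children can agree
        return inter, lc + rc
    return ls | rs, lc + rc + 1               # disagreement: one change

tree = (("A", "B"), ("C", "D"))               # toy rooted binary tree
states = {"A": "G", "B": "G", "C": "T", "D": "G"}
root_set, changes = fitch(tree, states)
# a single substitution explains these leaf states
```

Summing such counts over all characters gives the tree's parsimony score; a maximum-parsimony network then admits extra (reticulate) edges when they reduce that score.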
Abstract: User-Centered Design (UCD), Usability Engineering (UE) and Participatory Design (PD) are the common Human-Computer Interaction (HCI) approaches practiced in the software development process, focusing on issues and matters concerning user involvement. They overlook the organizational perspective of HCI integration within the software development organization. The Management Information Systems (MIS) perspective of HCI takes a managerial and organizational context to view the effectiveness of integrating HCI in the software development process. Human-Centered Design (HCD), which encompasses all of the human aspects, including the aesthetic and the ergonomic, is claimed to provide a better way of strengthening the HCI approaches, and thereby the software development process. In determining the effectiveness of HCD in the software development process, this paper presents the findings of a content analysis of HCI approaches, viewing those approaches as a technology that integrates user requirements from the top management down to the other stakeholders in the software development process. The findings show that HCD is an approach that emphasizes humans, tools and knowledge in strengthening the HCI approaches, and thereby the software development process, in the quest to produce sustainable, usable and useful software products.
Abstract: In this paper, an algorithm for estimating blood
pressure from the pulse transit time (PTT) is proposed as a more
convenient measurement method. After measuring the ECG, the pressure
pulse, and the photoplethysmogram, the PTT was calculated from the
acquired signals. A system to measure the systolic and diastolic
pressures indirectly was then composed using statistical methods.
Comparing the blood pressure estimated by the proposed algorithm with
the blood pressure measured by a conventional sphygmomanometer, the
systolic pressure shows a mean error of ±3.24 mmHg with a standard
deviation of 2.53 mmHg, while the diastolic pressure shows a
satisfactory result: a mean error of ±1.80 mmHg with a standard
deviation of 1.39 mmHg. These results satisfy the ANSI/AAMI
regulation for the certification of sphygmomanometers, which requires
the measurement error to be within a mean error of ±5 mmHg and a
standard deviation of 8 mmHg. They suggest the possibility of
applying the method to portable, long-term blood pressure monitoring
systems.
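The statistical calibration step can be illustrated with an ordinary least-squares fit of cuff pressure against PTT. All numbers below are made up for illustration; the paper derives its own model and coefficients from measured data.

```python
def fit_line(xs, ys):
    """Ordinary least squares y = a*x + b, in closed form."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical calibration pairs: PTT in ms against cuff systolic
# pressure in mmHg (longer transit time -> lower pressure).
ptt = [180, 200, 220, 240, 260]
sbp = [135, 128, 121, 114, 107]
a, b = fit_line(ptt, sbp)
estimate = a * 210 + b    # systolic estimate for a new PTT of 210 ms
```

In practice, per-subject calibration against a cuff reference is what keeps such a regression within the ANSI/AAMI error bounds quoted in the abstract.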
Abstract: Ozone (O3) is considered one of the most phytotoxic
pollutants, with deleterious effects on the living and non-living
components of ecosystems. It reduces the growth and yield of many
crops and alters their physiology and quality. The present study
describes a series of experiments investigating the effects of
ambient O3 at locations whose ambient levels depend on proximity to
the pollutant source, ranging from 17 ppb/h in the control experiment
to 112 ppb/h in an industrial area. The ambient levels at the other
three locations (King Saud University botanical garden, King Fahd Rd,
and Almanakh Garden) were 61, 61 and 77 ppb/h, respectively. Two
legume crop species (Vicia faba L. and Pisum sativum), differing in
phenology and sensitivity, were used. The results showed a
significant negative effect of ozone on morphology, number of injured
leaves, growth and productivity, with the degree of response
depending on the plant type. Vicia faba showed sensitivity to ozone
in leaf number, leaf area and degree of leaf injury (level 3), while
Pisum sativum showed higher sensitivity to the gas in degree of
injury (level 1), relative growth rate and seed weight; there was no
significant difference between the two species in plant height or
number of seeds.
Abstract: One of the most important areas of knowledge management studies is knowledge sharing. Measured in terms of the number of scientific articles and of organizations' applications, knowledge sharing stands as an example of success in the field. This paper reviews the related papers in the context of the underlying individual behavioral variables, to provide a directional framework for future research and writing.
Abstract: In many applications, it is a priori known that the
target function should satisfy certain constraints imposed by, for
example, economic theory or a human-decision maker. Here we
consider partially monotone problems, where the target variable
depends monotonically on some of the predictor variables but not all.
We propose an approach to build partially monotone models based
on the convolution of monotone neural networks and kernel
functions. The results from simulations and a real case study on
house pricing show that our approach has significantly better
performance than partially monotone linear models. Furthermore, the
incorporation of partial monotonicity constraints not only leads to
models that are in accordance with the decision maker's expertise,
but also reduces considerably the model variance in comparison to
standard neural networks with weight decay.
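The core monotonicity constraint can be sketched in a few lines: in a one-hidden-layer network, forcing the weights attached to the monotone input to be positive (here via exp) makes the output non-decreasing in that input, while the other input keeps unconstrained weights. This is a generic illustration of partial monotonicity, not the authors' monotone-network/kernel construction; all parameters are toy values.

```python
import math

def pm_net(x_mono, x_free, wm, wf, v):
    """One-hidden-layer sketch of a partially monotone network:
    exp(wm) > 0 and exp(v) > 0, and tanh is increasing, so the output
    is strictly increasing in x_mono; x_free is unconstrained."""
    h = [math.tanh(math.exp(wm[j]) * x_mono + wf[j] * x_free)
         for j in range(len(wm))]
    return sum(math.exp(v[j]) * h[j] for j in range(len(v)))

wm, wf, v = [0.0, -1.0], [0.5, -0.3], [0.0, 0.2]   # toy trained weights
ys = [pm_net(x / 10.0, 0.4, wm, wf, v) for x in range(11)]
# ys is monotonically increasing along the constrained input
```

Training would optimize wm, wf, v by gradient descent; the exp reparameterization enforces the constraint for free, which is what lets such models match a decision maker's monotone prior knowledge (e.g., price non-decreasing in house size) while reducing variance.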