Abstract: Reasonably priced and well-constructed housing is an
integral element of a healthy society. The absence of housing that
everyone in society can afford negatively affects people's health,
education, employment prospects, and community development.
Without access to decent housing, economic development, the
integration of immigrants, and social inclusiveness all suffer.
Canada has a sterling record in creating housing compared to many
other nations around the globe. Canadian housing is supported by a
mature and responsive mortgage network, a top-quality construction
industry, and readily available, safe, excellent-quality building
materials. Yet 1.7 million Canadian households live in substandard
housing. During the past hundred years, Canada's government has
made a wide variety of attempts to provide decent housing that
every Canadian can afford. Despite these laudable efforts, today
Canada is left with housing that is inadequate for many Canadians.
Homeowners are granted all kinds of privileges and perks, while
people with relatively low incomes who rent their apartments or
houses are discriminated against.
To help address these problems, inclusionary zoning is a tool
developed to help provide people with the affordable residences that
they need. Now, thirty years after its introduction, this type of zoning
has been shown to be effective in helping build and provide
Canadians with houses or apartments they can afford. This form of
zoning can have different results depending on where and how it is
used. After examining Canadian affordable housing and four
American cases where inclusionary zoning was enforced, this paper
makes various recommendations for expanding Canadians' access to
housing they can afford.
Abstract: A zero dimensional model has been used to investigate
the combustion performance of a single cylinder direct injection
diesel engine fueled by biofuels with options like supercharging and
exhaust gas recirculation. The numerical simulation was performed at
constant speed. The indicated pressure and temperature diagrams are
plotted and compared for different fuels. The emissions of soot and
nitrous oxide are computed with phenomenological models.
Experimental work was also carried out with biodiesel (palm stearin
methyl ester)-diesel blends and ethanol-diesel blends to validate the
simulation against measurements; the comparison showed that the
present model successfully predicts engine performance with
biofuels.
Abstract: This paper reports the results of an experimental study
conducted to characterise the gas-liquid multiphase flows
experienced within a vertical riser transporting a range of gas-liquid
flow rates. The scale experiments were performed using an
air/silicone oil mixture within a 6 m long riser. The superficial air
velocities studied ranged from 0.047 to 2.836 m/s, whilst
maintaining the liquid superficial velocity at 0.047 m/s. Measurements
of the mean cross-sectional and time-averaged radial void fraction
were obtained using a wire mesh sensor (WMS). The data were
recorded at an acquisition frequency of 1000 Hz over an interval of
60 seconds. For the range of flow conditions studied, the average
void fraction was observed to vary between 0.1 and 0.9. An analysis
of the data collected concluded that the observed void fraction was
strongly affected by the superficial gas velocity, whereby the higher
the superficial gas velocity, the higher was the observed average void
fraction. The average void fraction distributions observed were in
good agreement with the results obtained by other researchers. When
the air-silicone oil flows were fully developed, reasonably symmetric
profiles were observed, with the shape of the symmetric profile being
strongly dependent on the superficial gas velocity.
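The averaging procedure described above can be sketched in a few lines. This is a minimal illustrative post-processing sketch, not the authors' code: it assumes each WMS frame arrives as a grid of local void fractions in [0, 1], sampled at 1000 Hz, and the toy data below stand in for a real 60 s acquisition.

```python
# Hypothetical post-processing of wire-mesh-sensor (WMS) frames:
# each frame is a small grid of local void fractions in [0, 1].

def mean_cross_sectional(frame):
    """Area-averaged void fraction of one WMS frame."""
    n = sum(len(row) for row in frame)
    return sum(sum(row) for row in frame) / n

def time_average(frames):
    """Time-averaged void fraction over an acquisition interval."""
    return sum(mean_cross_sectional(f) for f in frames) / len(frames)

# two toy 2x2 frames standing in for 60 s of data at 1000 Hz
frames = [[[0.2, 0.4], [0.4, 0.6]],
          [[0.3, 0.5], [0.5, 0.7]]]
print(time_average(frames))  # 0.45
```

A radial profile would be obtained the same way, averaging each pixel over time instead of over the cross-section.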
Abstract: This paper describes an automated event detection and location system for water distribution pipelines which is based upon low-cost sensor technology and signature analysis by an Artificial
Neural Network (ANN). A low-cost failure sensor that measures the opacity, or cloudiness, of the local water
flow has been designed, developed and validated, and an ANN-based system is then described which uses time series data produced by
sensors to construct an empirical model for time series prediction and
classification of events. These two components have been installed,
tested and verified in an experimental site in a UK water distribution
system. Verification of the system has been achieved through a series
of simulated burst trials which have provided real data sets. It is concluded that the system has potential in water distribution network
management.
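The paper's detector is an ANN; as a much simpler stand-in, the event-detection idea (predict the next sensor reading, flag large residuals) can be illustrated with a moving-average predictor. The window size, threshold, and readings below are assumptions for illustration only, not values from the paper.

```python
# Illustrative stand-in for the ANN event detector: predict each
# sample from a moving average of recent readings and flag an event
# when the prediction residual exceeds a threshold.

def detect_events(series, window=3, threshold=0.5):
    """Return indices where a reading deviates from the recent mean."""
    events = []
    for i in range(window, len(series)):
        prediction = sum(series[i - window:i]) / window
        if abs(series[i] - prediction) > threshold:
            events.append(i)
    return events

# steady opacity readings with a simulated burst starting at index 6
readings = [1.0, 1.1, 0.9, 1.0, 1.0, 1.1, 3.0, 2.9, 1.0]
print(detect_events(readings))  # flags the burst and its aftermath
```

A trained ANN plays the role of the `prediction` line, learning a richer model of normal behaviour than a moving average can.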
Abstract: A Learning Management System (LMS) presents a learning
environment which offers a collection of e-learning tools in a
package that allows a common interface and information sharing
among the tools. South East European University's initial experience
with LMSs was with the commercial LMS ANGEL. After a
three-year experience with ANGEL, because its expenses were
very high, it was decided to develop our own software. As part
of the research project team for the in-house design and development
of the new LMS, we primarily had to select the features that would
cover our needs and also comply with the actual trends in the area of
software development, and then design and develop the system. In
this paper we present the process of LMS in-house development for
South East European University, its architecture, conception and
strengths with a special accent on the process of migration and
integration with other enterprise applications.
Abstract: This paper presents a design method of self-tuning
Quantitative Feedback Theory (QFT) by using improved deadbeat
control algorithm. QFT is a technique to achieve robust control with
pre-defined specifications whereas deadbeat is an algorithm that
could bring the output to steady state with minimum step size.
Nevertheless, usually there are large peaks in the deadbeat response.
By integrating QFT specifications into the deadbeat algorithm, the
large peaks can be kept within tolerable limits. On the other hand,
merging QFT with an adaptive element will produce a robust
controller with wider coverage of uncertainty. By combining the
QFT-based deadbeat algorithm with an adaptive element, a superior
controller, called the self-tuning QFT-based deadbeat controller, can
be achieved. A fast, robust and adaptive output response is expected. Using a grain
dryer plant model as a pilot case-study, the performance of the
proposed method has been evaluated and analyzed. The grain drying
process is very complex, with highly nonlinear behaviour and long
delays, and it is affected by environmental changes and disturbances.
Performance comparisons have been carried out between the
proposed self-tuning QFT-based deadbeat, standard QFT and
standard deadbeat controllers. The test results demonstrate the
efficiency of the self-tuning QFT-based deadbeat controller: its
parameters are updated online, and it achieves a lower percentage
overshoot and a shorter settling time, especially when there are
variations in the plant.
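The plain deadbeat idea mentioned above (bringing the output to steady state in a minimum number of steps) can be sketched for the simplest case. This is not the paper's self-tuning QFT-based scheme, only an illustration of the underlying deadbeat law for an assumed first-order discrete plant y[k+1] = a*y[k] + b*u[k]; the parameter values are made up.

```python
# Minimal deadbeat control sketch for an assumed first-order plant
#   y[k+1] = a*y[k] + b*u[k].
# The deadbeat law u[k] = (r - a*y[k]) / b drives the output to the
# setpoint r in a single step and holds it there.

a, b = 0.8, 0.5   # illustrative plant parameters
r = 1.0           # setpoint

y = 0.0
history = [y]
for _ in range(4):
    u = (r - a * y) / b    # deadbeat control law
    y = a * y + b * u      # plant update
    history.append(y)

print(history)  # output reaches the setpoint after one step and stays
```

The large control effort at the first step is exactly the source of the "large peaks" in the deadbeat response that the paper's QFT specifications are meant to bound.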
Abstract: One of the vital developmental tasks that an
individual faces during adolescence is choosing a career. Arriving at
a career decision is difficult and anxiety-provoking for many
adolescents at the tertiary level. The main purpose of this study is to
determine the factors relating to career indecision among freshman
college students as a basis for the formulation of a comprehensive
career counseling program for the psychological well-being of
freshman university students. The subjects were purposively
selected. Slovin's formula was used in determining the sample size,
using a 0.05 margin of error in getting the total number of samples
per college and per major. The researcher made use of a descriptive
correlational design in determining significant factors relating to
career indecision. Multiple regression analysis indicated that career
thoughts, career decisions and vocational identity are factors related
to career indecision.
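Slovin's formula, used in the study for sample-size determination, is n = N / (1 + N·e²), where N is the population size and e the margin of error. A worked example follows; the population figure is illustrative, not the study's actual enrolment.

```python
# Slovin's formula: n = N / (1 + N * e^2), rounded up to a whole
# number of respondents. N below is an illustrative population size.
import math

def slovin(N, e=0.05):
    """Sample size for population N at margin of error e."""
    return math.ceil(N / (1 + N * e * e))

print(slovin(2000))  # -> 334 respondents at a 0.05 margin of error
```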
Abstract: A design of communication area for infrared
electronic-toll-collection systems to provide an extended
communication interval in the vehicle traveling direction and a
regular boundary between contiguous traffic lanes is proposed.
By utilizing two typical low-cost commercial infrared LEDs with
different half-intensity angles Φ1/2 = 22◦ and 10◦, the radiation
pattern of the emitter is designed to properly adjust the spatial
distribution of the signal power. The aforementioned purpose
can be achieved with an LED array in a three-piece structure
with appropriate mounting angles. With this emitter, the influence
of the mounting parameters, including the mounting height and
mounting angles of the on-board unit and road-side unit, on the
system performance in terms of the received signal strength and
communication area is investigated. The results reveal that, for
the emitter proposed in this paper, the ideal "long-and-narrow"
characteristic of the communication area is affected very little by
these mounting parameters. An optimum mounting configuration is
also suggested.
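LED radiation patterns with a given half-intensity angle are commonly approximated by a generalized Lambertian model, I(θ) = I₀·cosᵐθ with m = −ln 2 / ln(cos Φ½). This is a standard approximation, not a formula stated in the abstract, sketched here for the two half-intensity angles the emitter uses.

```python
# Generalized Lambertian model of an LED radiation pattern:
#   I(theta) = I0 * cos(theta)**m,  m = -ln(2) / ln(cos(phi_half)),
# so the intensity falls to one half at theta = phi_half by design.
import math

def lambertian_order(phi_half_deg):
    return -math.log(2) / math.log(math.cos(math.radians(phi_half_deg)))

def relative_intensity(theta_deg, phi_half_deg):
    m = lambertian_order(phi_half_deg)
    return math.cos(math.radians(theta_deg)) ** m

# the two half-intensity angles used in the emitter design
for phi in (22.0, 10.0):
    print(round(relative_intensity(phi, phi), 3))  # 0.5 at phi_half
```

The narrower 10° LED has a much higher Lambertian order m, which is what lets the three-piece array shape a long-and-narrow power footprint.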
Abstract: Composites based on a biodegradable polycaprolactone (PCL) containing 0.5, 1.0 and 2.0 wt % of titanium dioxide (TiO2) micro- and nanoparticles were prepared by melt mixing, and the effect of filler type and content on the thermal properties, dynamic-mechanical behaviour and morphology was investigated. Measurements of the storage modulus and loss modulus by dynamic mechanical analysis (DMA) showed better results for microfilled PCL/TiO2 composites than for nanofilled composites with the same filler content. DSC analysis showed that the Tg and Tc of the micro- and nanocomposites were slightly lower than those of neat PCL. The crystallinity of the PCL increased with the addition of TiO2 micro- and nanoparticles; however, the χc of the PCL was unchanged with micro TiO2 content. The thermal stability of the PCL/TiO2 composites was characterized using thermogravimetric analysis (TGA). The initial weight loss (5 wt %) occurs at a slightly higher temperature with micro and nano TiO2 addition and with increasing TiO2 content.
Abstract: Streaming applications usually consist of blocks, running
in parallel or in series, that incrementally transform a stream of
input data. It poses a design challenge to break such an application
into distinguishable blocks and then to map them onto independent
hardware processing elements. For this, a generic controller is
required that automatically maps such a stream of data onto
independent processing elements without any dependencies or
manual intervention. In this paper, a Kahn Process Network (KPN)
for such streaming applications is designed and developed that will
be mapped onto an MPSoC. It is designed in such a way that a
generic C-based compiler takes the mapping specifications as input
from the user, automates these design constraints, and automatically
generates synthesized, optimized RTL code for the specified
application.
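The KPN model of computation itself can be sketched in software: independent processes communicate only through FIFO channels with blocking reads, which makes the network's output deterministic regardless of scheduling. This toy two-stage pipeline illustrates the model only; the paper's C-based compiler and MPSoC mapping are not reproduced here.

```python
# Toy Kahn Process Network: processes linked by FIFO channels with
# blocking reads, so the result is deterministic however the threads
# are scheduled.
import threading
import queue

DONE = object()  # end-of-stream token

def producer(out_ch):
    for x in range(5):
        out_ch.put(x)
    out_ch.put(DONE)

def scaler(in_ch, out_ch):
    while True:
        x = in_ch.get()       # blocking read, as KPN semantics require
        if x is DONE:
            out_ch.put(DONE)
            return
        out_ch.put(x * 2)     # this block's transformation

def run_network():
    a, b = queue.Queue(), queue.Queue()
    results = []
    threads = [threading.Thread(target=producer, args=(a,)),
               threading.Thread(target=scaler, args=(a, b))]
    for t in threads:
        t.start()
    while True:
        x = b.get()
        if x is DONE:
            break
        results.append(x)
    for t in threads:
        t.join()
    return results

print(run_network())  # [0, 2, 4, 6, 8]
```

Mapping each such process onto its own MPSoC processing element is exactly the step the paper's generic compiler automates.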
Abstract: The purpose of this study was to investigate effects of
modality and redundancy principles on music theory learning among
pupils of different anxiety levels. The lesson of music theory was
developed in three different modes, audio and image (AI), text with
image (TI) and audio with image and text (AIT). The independent
variables were the three modes of courseware. The moderator
variable was the anxiety level, while the dependent variable was the
post test score. The study sample consisted of 405 third-grade pupils.
Descriptive and inferential statistics were conducted to analyze the
collected data. Analyses of covariance (ANCOVA) and post hoc
tests were carried out to examine the main effects as well as the
interaction effects of the independent variables on the dependent
variable. The findings of this study showed that medium anxiety
pupils performed significantly better than low and high anxiety
pupils in all the three treatment modes. The AI mode was found to
help pupils with high anxiety significantly more than the TI and AIT
modes.
Abstract: Air conditioning is mainly used as a human comfort
cooling medium. It is used more in high-temperature countries such
as Malaysia. Proper estimation of the cooling load will achieve the
ideal temperature; without proper estimation, the load can be over-
or under-estimated. The ideal temperature should be comfortable
enough. This study develops a program to calculate the ideal cooling
load demand, which matches the heat gain. Through this program, it
is easy to calculate a cooling load estimate. The objective of this
study is to develop a user-friendly, easily accessible cooling load
program, so that the cooling load can be estimated by any individual
rather than by rule-of-thumb. The software was developed using a
Matlab GUI and is valid only for common buildings in Malaysia. An
office building was selected as a case study to verify the
applicability and accuracy of the developed software. In conclusion,
the main objective has been achieved: the developed software is
user friendly and easily estimates the cooling load demand.
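A cooling-load program of the kind described above typically sums conduction gains through the envelope (Q = U·A·ΔT) with internal gains from occupants and equipment. The sketch below is a minimal illustration of that calculation; all coefficients and the 75 W per-person sensible gain are assumptions, not values from the paper.

```python
# Minimal sensible cooling-load estimate: conduction gains Q = U*A*dT
# summed over envelope elements, plus internal gains. All values are
# illustrative.

def conduction_gain(U, A, dT):
    """Sensible heat gain through one envelope element, in watts."""
    return U * A * dT

def cooling_load(elements, occupants, equipment_w,
                 sensible_per_person=75.0):
    envelope = sum(conduction_gain(U, A, dT) for U, A, dT in elements)
    internal = occupants * sensible_per_person + equipment_w
    return envelope + internal

# (U in W/m2K, area in m2, indoor-outdoor dT in K) for a small office
elements = [(2.5, 12.0, 8.0),   # window
            (0.5, 40.0, 8.0)]   # wall
print(cooling_load(elements, occupants=4, equipment_w=600.0))  # 1300.0 W
```

A full program adds solar, infiltration and latent loads on top of this, which is where the rule-of-thumb shortcuts the study aims to replace usually creep in.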
Abstract: Market based models are frequently used in the resource
allocation on the computational grid. However, as the size of
the grid grows, it becomes difficult for the customer to negotiate
directly with all the providers. Middle agents are introduced to
mediate between the providers and customers and facilitate the
resource allocation process. The most frequently deployed middle
agents are the matchmakers and the brokers. The matchmaking agent
finds possible candidate providers who can satisfy the requirements
of the consumers, after which the customer directly negotiates with
the candidates. The broker agents mediate the negotiation with
the providers in real time.
In this paper we present a new type of middle agent, the marketmaker.
Its operation is based on two parallel operations - through
the investment process the marketmaker is acquiring resources and
resource reservations in large quantities, while through the resale process
it sells them to the customers. The operation of the marketmaker
is based on the fact that through its global view of the grid it can
perform a more efficient resource allocation than the one possible in
one-to-one negotiations between the customers and providers.
We present the operation and algorithms governing the operation
of the marketmaker agent, contrasting it with the matchmaker and
broker agents. Through a series of simulations in the task-oriented
domain we compare the operation of the three agent types. We find
that the use of the marketmaker agent leads to better performance in
the allocation of large tasks and a significant reduction of the
messaging overhead.
Abstract: Space exploration is a highly visible endeavour of
humankind to seek profound answers to questions about the origins
of our solar system, whether life exists beyond Earth, and how we
could live on other worlds. Different platforms have been utilized in
planetary exploration missions, such as orbiters, landers, rovers, and
penetrators.
With their low mass, good mechanical contact with the surface,
ability to acquire high-quality scientific subsurface data, and ability
to be deployed in areas that may not be conducive to landers or
rovers, penetrators provide an alternative and complementary
solution that makes possible the scientific exploration of sites that
are otherwise hard to access (icy areas, gully sites, highlands, etc.).
The Canadian Space Agency (CSA) has made space exploration one
of the pillars of its space program, and established the ExCo program
to prepare Canada for future international planetary exploration.
ExCo sets surface mobility as its focus and priority, and invests
mainly in the development of rovers because of Canada's niche space
robotics technology. Meanwhile, CSA is also investigating how
micro-penetrators can help Canada to fulfill its scientific objectives
for planetary exploration.
This paper presents a review of the micro-penetrator technologies,
past missions, and lessons learned. It gives a detailed analysis of the
technical challenges of micro-penetrators, such as high impact
survivability, high-precision guidance, navigation and control,
thermal protection, communications, etc. Then, a Canadian perspective on
a possible micro-penetrator mission is given, including Canadian
scientific objectives and priorities, potential instruments, and flight
opportunities.
Abstract: Model-based approaches have been applied successfully
to a wide range of tasks such as specification, simulation, testing, and
diagnosis. But one bottleneck often prevents the introduction of these
ideas: Manual modeling is a non-trivial, time-consuming task.
Automatically deriving models by observing and analyzing running
systems is one possible way to remove this bottleneck. To
derive a model automatically, some a priori knowledge about the
model structure (i.e. about the system) must exist. Such a model
formalism would be used as follows: (i) by observing the network
traffic, a model of the long-term system behavior could be generated
automatically; (ii) test vectors can be generated from the model;
(iii) while the system is running, the model could be used to
diagnose abnormal system behavior.
The main contribution of this paper is the introduction of a model
formalism called 'probabilistic regression automaton' suitable for the
tasks mentioned above.
Abstract: The presence of cold air together with the convergent
topography of the Lut valley over the valley's sloping terrain can
generate Low Level Jets (LLJs). Moreover, the valley-parallel
pressure gradients and the northerly LLJ are produced as a result of
large-scale processes. In the numerical study, the regional MM5
model was run to obtain an appropriate dynamical analysis of flows
in the region for summer and winter. The results of this study show
that summer synoptic systems cause the formation of north-south
pressure gradients in the valley, which can lead to winds with
velocities of more than 14 m/s and damaging dust and wind storms
lasting more than 120 days. In winter, by contrast, the presence of
cold air masses in the region causes the average speed of the LLJs
to decrease. At this time, downslope flows play a noticeable role in
creating the nocturnal LLJs.
Abstract: Bioinformatics and computational biology involve
the use of techniques including applied mathematics,
informatics, statistics, computer science, artificial intelligence,
chemistry, and biochemistry to solve biological problems
usually on the molecular level. Research in computational
biology often overlaps with systems biology. Major research
efforts in the field include sequence alignment, gene finding,
genome assembly, protein structure alignment, protein structure
prediction, prediction of gene expression and protein-protein
interactions, and the modeling of evolution. Various
global rearrangements of permutations, such as reversals and
transpositions, have recently become of interest because of their
applications in computational molecular biology. A reversal is
an operation that reverses the order of a substring of a permutation.
A transposition is an operation that swaps two adjacent
substrings of a permutation. The problem of determining the
smallest number of reversals required to transform a given
permutation into the identity permutation is called sorting by
reversals. Similar problems can be defined for transpositions
and other global rearrangements. In this work we present a
study of some genome rearrangement primitives. We show
how a genome is modelled by a permutation, introduce some
of the existing primitives and the lower and upper bounds
on them, and then provide a comparison of the introduced
primitives.
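The two primitives defined above are easy to state on Python lists, and a simple selection-style sorter shows the trivial n−1 upper bound for sorting by reversals. This is an illustrative sketch only, not an optimal algorithm (computing the minimum reversal distance is the hard problem the abstract refers to).

```python
# Genome rearrangement primitives on a permutation stored as a list.

def reversal(p, i, j):
    """Reverse the substring p[i..j] of permutation p."""
    return p[:i] + p[i:j + 1][::-1] + p[j + 1:]

def transposition(p, i, j, k):
    """Swap the adjacent substrings p[i..j-1] and p[j..k-1]."""
    return p[:i] + p[j:k] + p[i:j] + p[k:]

def sort_by_reversals(p):
    """Sort p with at most len(p)-1 reversals; returns (result, count)."""
    p = list(p)
    count = 0
    for i in range(len(p)):
        j = p.index(i + 1)        # locate the value that belongs at i
        if j != i:
            p = reversal(p, i, j)
            count += 1
    return p, count

print(reversal([3, 1, 2, 4], 0, 2))          # [2, 1, 3, 4]
print(transposition([1, 2, 3, 4], 0, 1, 3))  # [2, 3, 1, 4]
print(sort_by_reversals([3, 1, 2, 4]))       # ([1, 2, 3, 4], 2)
```

Each loop iteration places one more element, so at most n−1 reversals suffice; optimal sorting by reversals needs far more careful breakpoint analysis.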
Abstract: During recent years wind turbine technology has
undergone rapid developments. Growth in size and the optimization
of wind turbines has enabled wind energy to become increasingly
competitive with conventional energy sources. As a result, today's
wind turbines participate actively in the power production of several
countries around the world. These developments raise a number of
challenges to be dealt with now and in the future. The penetration of
wind energy in the grid raises questions about the compatibility of the
wind turbine power production with the grid. In particular, the
contribution to grid stability, power quality and behavior during fault
situations plays as important a role as reliability. In the present
work, we address two fault situations and their influence on the
generator and the behavior of the wind turbine during these faults,
which are briefly discussed based on simulation results.
Abstract: A generalized digital modulation identification algorithm for an adaptive demodulator has been developed and is presented in this paper. The algorithm is verified using the wavelet transform and histogram computation to identify QPSK and QAM together with GMSK and M-ary FSK modulations. It has been found that the histogram peaks simplify the identification procedure. The simulated results show that correct modulation identification is possible down to a lower bound of 5 dB and 12 dB for GMSK and QPSK respectively. When the SNR is above 5 dB, the throughput of the proposed algorithm is more than 97.8%. The receiver operating characteristic (ROC) has been computed to measure the performance of the proposed algorithm, and the analysis shows that the probability of detection (Pd) drops rapidly when the SNR is 5 dB and the probability of false alarm (Pf) is smaller than 0.3. The performance of the proposed algorithm has been compared with existing methods and found to identify all the digital modulation schemes at low SNR.
Abstract: The paper describes the evaluation of the quality of
control for cases of controlled non-minimum-phase plants. Control
circuits containing non-minimum-phase plants have different
properties: they manifest a reversed reaction at the beginning of the
unit step response. For these types of plants, special criteria of the
quality of control are developed which consider this difference and
can be helpful for the synthesis of optimal controller tuning. All
results are clearly presented using Matlab/Simulink models.
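The reversed initial reaction mentioned above is the hallmark of a right-half-plane zero. A minimal sketch, using an illustrative plant (not one from the paper), G(s) = (1 − s)/(s² + 3s + 2), simulated in controllable canonical form with forward-Euler steps:

```python
# Step response of the non-minimum-phase plant
#   G(s) = (1 - s) / (s^2 + 3*s + 2)
# in controllable canonical form:
#   x1' = x2,  x2' = -2*x1 - 3*x2 + u,  y = x1 - x2
# (the RHP zero at s = 1 produces the "y = x1 - x2" output map).

def step_response(t_end=8.0, dt=0.001, u=1.0):
    x1 = x2 = 0.0
    ys = []
    for _ in range(int(t_end / dt)):
        ys.append(x1 - x2)             # output before this step's update
        x1, x2 = x1 + dt * x2, x2 + dt * (-2.0 * x1 - 3.0 * x2 + u)
    return ys

ys = step_response()
# the response first undershoots (goes negative), then settles at +0.5
print(min(ys) < 0.0, abs(ys[-1] - 0.5) < 0.01)  # True True
```

Analytically y(t) = 0.5 − 2e^(−t) + 1.5e^(−2t), so the initial slope is −1 even though the steady-state gain is +0.5; a quality-of-control criterion for such plants must account for this initial wrong-way excursion.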