Abstract: Distant-talking voice-based HCI systems suffer from
performance degradation due to the mismatch between the acoustic
speech observed at runtime and the acoustic model built during
training. The mismatch is
caused by the change in the power of the speech signal as observed at
the microphones. This change is greatly influenced by the change in
distance, affecting speech dynamics inside the room before reaching
the microphones. Moreover, as the speech signal is reflected, its
acoustical characteristic is also altered by the room properties. In
general, power mismatch due to distance is a complex problem. This
paper presents a novel approach to dealing with distance-induced
mismatch by intelligently sensing instantaneous voice power variation
and compensating the model parameters. First, the distant-talking speech
signal is processed through microphone array processing, and the
corresponding distance information is extracted. Distance-sensitive
Gaussian Mixture Models (GMMs), pre-trained to capture both
speech power and room properties, are used to predict the optimal
distance of the speech source. Consequently, pre-computed statistical
priors corresponding to the optimal distance are selected to correct
the statistics of the generic model, which was frozen during training.
Thus, the combined model statistics are post-conditioned to match the
power of the instantaneous speech acoustics at runtime. This results in
an improved likelihood of predicting the correct speech command at
farther distances. We experiment with real data recorded inside two
rooms. Experimental evaluation shows that voice recognition with our
method is more robust to changes in distance than the conventional
approach. In our experiment, under the most acoustically challenging
condition (i.e., Room 2 at 2.5 meters), our method achieved a 24.2%
improvement in recognition performance over the best-performing
conventional method.
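The distance-selection step described above can be sketched as follows, with hypothetical toy parameters standing in for the pre-trained distance-sensitive GMMs (a single 1-D Gaussian per candidate distance here, purely for illustration):

```python
import numpy as np

# Toy sketch (hypothetical parameters): one Gaussian model per candidate
# distance, trained on log-power features observed at that distance.
DISTANCES = [0.5, 1.5, 2.5]                     # metres (assumed grid)
MEANS     = {0.5: -2.0, 1.5: -5.0, 2.5: -7.5}   # mean log-power per distance
VARS      = {0.5: 1.0, 1.5: 1.2, 2.5: 1.5}

def log_likelihood(x, mean, var):
    """Log-density of a 1-D Gaussian, summed over the feature frames x."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(-0.5 * (np.log(2 * np.pi * var) + (x - mean) ** 2 / var)))

def predict_distance(frames):
    """Pick the distance whose model best explains the observed frames."""
    return max(DISTANCES, key=lambda d: log_likelihood(frames, MEANS[d], VARS[d]))

frames = np.array([-7.1, -7.9, -7.3])  # simulated far-field log-power frames
d_star = predict_distance(frames)      # low observed power selects 2.5 m here
```

In the full system, the winning distance would then index the pre-computed statistical priors used to correct the generic acoustic model.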
Abstract: Dust storms are one of the most costly and destructive
events in many desert regions. They can cause massive damage to
both natural environments and human lives. This paper presents a
preliminary study on dust storms as a major natural hazard in arid
and semi-arid regions. As a case study, dust storm events that
occurred in Zabol city, located in the Sistan region of Iran, were
analyzed to diagnose and predict dust storms. The identification and
prediction of dust storm events could significantly reduce damage.
Existing models for this purpose are complicated and not appropriate
for many data-poor areas. The present study explores the Gamma test
for identifying the inputs of an ANN model for dust storm prediction.
Results indicate that further work is needed on identifying dust
storms and on discriminating between the various dust storm types.
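A minimal sketch of the Gamma test idea on synthetic data (the statistic regresses near-neighbour output half-distances on input distances; its intercept estimates the noise variance on the output, i.e. the best mean-squared error any smooth model such as an ANN could reach with the chosen inputs):

```python
import numpy as np

def gamma_test(X, y, p=10):
    """Gamma test noise estimate (sketch): regress near-neighbour output
    half-distances on squared input distances; the intercept of the fitted
    line estimates the variance of the noise on y."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    n = len(y)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # pairwise dists
    np.fill_diagonal(D, np.inf)
    order = np.argsort(D, axis=1)[:, :p]          # p nearest neighbours per point
    delta = np.array([np.mean(D[np.arange(n), order[:, k]] ** 2) for k in range(p)])
    gamma = np.array([np.mean((y[order[:, k]] - y) ** 2) / 2 for k in range(p)])
    slope, intercept = np.polyfit(delta, gamma, 1)  # line gamma = A*delta + G
    return intercept                                # G: estimated noise variance

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(0, 0.1, 500)
noise_var = gamma_test(X, y)   # estimates Var(noise) = 0.1**2 = 0.01
```

A candidate input set whose Gamma statistic is close to zero explains the output almost completely; input sets can thus be ranked before any ANN is trained.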
Abstract: In the present essay, a model of choice by actors is analysed by utilizing the theory of chaos to explain how change comes about. Then, by using ancient and modern sources of literature, the theory of the social contract is analysed as a historical phenomenon that first appeared during the period of Classical Greece. Then, based on the findings of this analysis, the practice of direct democracy and public choice in ancient Athens is analysed through two historical cases: the political programs of Eubulus and Lycurgus in the second half of the 4th century BC. The main finding of this research is that these policies can be interpreted as an implementation of a social contract, through which citizens took decisions based on rational choice according to economic considerations.
Abstract: Prior to the use of the detectors, a characteristics
comparison study was performed and a baseline was established. In
patient-specific QA, the portal dosimetry mean values of area gamma,
average gamma and maximum gamma were 1.02, 0.31 and 1.31 with
standard deviation of 0.33, 0.03 and 0.14 for IMRT and the
corresponding values were 1.58, 0.48 and 1.73 with standard
deviation of 0.31, 0.06 and 0.66 for VMAT. With the ImatriXX 2-D
array system, on average 99.35% of the pixels passed the 3%/3 mm
gamma criterion, with a standard deviation of 0.24, for dynamic
IMRT. For VMAT, the average value was 98.16% with a standard
deviation of 0.86. The results showed that both systems can be
used for patient-specific QA measurements for IMRT and VMAT.
The values obtained with the portal dosimetry system were found to
be relatively more consistent than those obtained with the
ImatriXX 2-D array system.
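The gamma criterion reported above can be illustrated with a minimal 1-D sketch (real QA software evaluates 2-D dose arrays with interpolation; this toy version only conveys the definition of the 3%/3 mm test):

```python
import numpy as np

def gamma_index_1d(ref_pos, ref_dose, eval_pos, eval_dose,
                   dose_tol=0.03, dta_mm=3.0):
    """1-D global gamma (sketch): for each evaluated point, the minimum over
    the reference profile of sqrt((dx/DTA)^2 + (dD/dose_tol)^2); dose_tol is
    a fraction of the reference maximum (global normalisation)."""
    ref_pos, ref_dose = np.asarray(ref_pos, float), np.asarray(ref_dose, float)
    dmax = np.max(ref_dose)
    gammas = []
    for xe, de in zip(eval_pos, eval_dose):
        dx = (ref_pos - xe) / dta_mm
        dd = (ref_dose - de) / (dose_tol * dmax)
        gammas.append(np.min(np.sqrt(dx ** 2 + dd ** 2)))
    return np.array(gammas)

x = np.linspace(-10, 10, 201)               # positions in mm
ref = np.exp(-x ** 2 / 50)                  # reference dose profile
ev = np.exp(-(x - 0.5) ** 2 / 50)           # evaluated profile, 0.5 mm shift
g = gamma_index_1d(x, ref, x, ev)
pass_rate = 100 * np.mean(g <= 1.0)         # fraction passing 3%/3 mm
```

A point passes when gamma is at most 1, so the small 0.5 mm shift above is well within the 3 mm distance-to-agreement tolerance.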
Abstract: In this paper, the action-research-driven design of a
context-relevant, developmental peer review of teaching model, its
implementation strategy and its impact at an Australian university are
presented. PRO-Teaching realizes an innovative process that
triangulates contemporaneous teaching quality data from a range of
stakeholders including students, discipline academics, learning and
teaching expert academics, and teacher reflection to create reliable
evidence of teaching quality. Data collected over multiple classroom
observations allows objective reporting on development differentials
in constructive alignment, peer, and student evaluations. Further
innovation is realized in the application of this highly structured
developmental process to provide summative evidence of sufficient
validity to support claims for professional advancement and learning
and teaching awards. Design decision points and contextual triggers
are described within the operating domain. Academics and
developers seeking to introduce structured peer review of teaching
into their organization will find this paper a useful reference.
Abstract: Groundwater resources in Arsanjan plain provide
water for agriculture, industry, and human consumption. Continued
agricultural development in this area requires additional groundwater
resources, particularly during drought periods, and affects the
quantity and quality of the groundwater available. The purpose of
this study is to evaluate water level changes in the aquifer of
Arsanjan plain in the Fars province in order to determine the areas of
greatest depletion and the causes of depletion. In this plain, farmers
and other users are pumping groundwater faster than its natural
replenishment rate, causing a continuous drop in groundwater tables
and depletion of this resource. In this research, variations in the
groundwater level, their effects, and ways to help control groundwater
levels in the aquifer of the Arsanjan plain were evaluated. Excessive
exploitation of groundwater in this aquifer has caused the
groundwater levels to fall too fast or to unacceptable levels. The
average drawdown of the groundwater level in this plain was 19.66
meters from 1996 to 2003.
Abstract: Real-time object tracking is a problem which involves extraction of critical information from complex and uncertain image data. In this paper, we present a comprehensive methodology to design an artificial neural network (ANN) for a real-time object tracking application. The object tracked for the purpose of demonstration is a specific airplane; however, the proposed ANN can be trained to track any other object of interest. The ANN has been simulated and tested on the training and testing datasets, as well as on a real-time streaming video. The tracking error is analyzed with a post-regression analysis tool, which finds the correlation between the calculated coordinates and the correct coordinates of the object in the image. The encouraging results from the computer simulation and analysis show that the proposed ANN architecture is a good candidate solution to a real-time object tracking problem.
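The post-regression correlation analysis mentioned above can be sketched as follows, with a simple least-squares model standing in for the trained ANN and synthetic data in place of video frames:

```python
import numpy as np

# Hypothetical sketch of the post-regression step: after a tracker (here a
# stand-in linear least-squares model) predicts object coordinates, the
# correlation R between predicted and true coordinates quantifies tracking
# quality (R near 1 means good tracking).
rng = np.random.default_rng(1)
F = rng.normal(size=(200, 8))                        # simulated frame features
W_true = rng.normal(size=(8, 2))
coords = F @ W_true + rng.normal(0, 0.1, (200, 2))   # true (x, y) per frame

W, *_ = np.linalg.lstsq(F, coords, rcond=None)       # train the stand-in model
pred = F @ W                                         # predicted coordinates

r_x = np.corrcoef(pred[:, 0], coords[:, 0])[0, 1]    # correlation for x
r_y = np.corrcoef(pred[:, 1], coords[:, 1])[0, 1]    # correlation for y
```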
Abstract: This paper aims at overviewing the topics of a research project (CARDIOSENSOR) in the field of health sciences (biomaterials and biomedical engineering). The project has focused on the development of a nanosensor for the assessment of the risk of cardiovascular diseases by monitoring C-reactive protein (CRP), currently considered the best-validated inflammatory biomarker associated with cardiovascular diseases. The project involves tasks such as: 1) the development of sensor devices based on field effect transistors (FET): assembly, optimization and validation; 2) application of the sensors to the detection of CRP in standard solutions and comparison with enzyme-linked immunosorbent assay (ELISA); and 3) application of the sensors to real samples such as blood and saliva and evaluation of their ability to predict the risk of cardiovascular disease.
Abstract: The study investigates the causal link between trade
openness and economic growth for four South Asian countries for
the periods 1972-1985 and 1986-2007, to examine the scenario before
and after the implementation of SAARC. Panel cointegration and
FMOLS techniques are employed for the short-run and long-run
estimates. In 1972-85, short-run unidirectional causality from GDP to
openness is found, whereas in 1986-2007 there exists bi-directional
causality between GDP and openness. The long-run elasticity between
GDP and openness has a negative sign in 1972-85, which shows a
long-run negative relationship, while in 1986-2007 the elasticity has
a positive sign, indicating positive causation between GDP and
openness. So it can be concluded that after the implementation of
SAARC the overall situation of the selected countries improved. Also,
the long-run coefficient of the error-correction term suggests that
short-term disequilibria are corrected by adjustment back to the
long-run equilibrium.
Abstract: In this paper, a parallel interface for a microprocessor
trainer was implemented. A programmable parallel-port device such
as the IC 8255A is initialized for simple input or output, or for
handshake input or output, by selecting the appropriate mode. The
hardware connections and the programs can be used to interface the
microprocessor trainer with a personal computer via the IC 8255A.
The assembly programs edited in the PC's editor can be downloaded
to the trainer.
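As an illustration of the mode-selection step, a small sketch that assembles the 8255A mode-set control word from its documented bit layout (the helper function and parameter names are our own):

```python
# Sketch of how the 8255A mode-set control word is assembled (bit layout per
# the 8255A datasheet): bit 7 = 1 (mode-set flag), bits 6-5 = group A mode,
# bit 4 = port A direction, bit 3 = port C upper, bit 2 = group B mode,
# bit 1 = port B direction, bit 0 = port C lower (1 = input, 0 = output).
def control_word(mode_a=0, a_in=False, c_hi_in=False,
                 mode_b=0, b_in=False, c_lo_in=False):
    word = 0x80                       # bit 7: mode-set flag
    word |= (mode_a & 0b11) << 5      # group A mode (0, 1 or 2)
    word |= int(a_in) << 4            # port A: 1 = input
    word |= int(c_hi_in) << 3         # port C upper half
    word |= (mode_b & 0b1) << 2       # group B mode (0 or 1)
    word |= int(b_in) << 1            # port B: 1 = input
    word |= int(c_lo_in)              # port C lower half
    return word

simple_out = control_word()              # all ports simple output: 0x80
a_input = control_word(a_in=True)        # port A input, rest output: 0x90
```

Writing the resulting byte to the 8255A control register configures all three ports in one step; 0x80 (all output, mode 0) is the common starting point for simple output.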
Abstract: In the present study, the convective heat transfer
coefficient and pressure drop of Al2O3/water nanofluid in the laminar
flow regime under constant heat flux conditions inside a circular tube
were experimentally investigated. Al2O3/water nanofluids with 0.5%
and 1% volume concentrations, containing nanoparticles of 15 nm
diameter, were used as the working fluids. The effect of different volume
concentrations on convective heat transfer coefficient and friction
factor was studied. The results show that increasing the particle
volume concentration enhances the convective heat transfer
coefficient. Measurements show the average heat transfer coefficient
was enhanced by about 11-20% at 0.5% volume concentration and
by about 16-27% at 1% volume concentration compared to distilled
water. In addition, the convective heat transfer coefficient of
nanofluid enhances with increase in heat flux. From the results, the
average ratio of (fnf/fbf) was about 1.10 for 0.5% volume
concentration. Therefore, there is no significant increase in friction
factor for nanofluids.
Abstract: Sustainable energy usage has been recognized as one
of the important measures to increase the competitiveness of a
nation globally. Strong emphasis was given in the Ninth Malaysia
Plan (RMK9) to improving energy efficiency, especially in
government buildings. With this in view, a project to investigate the
potential for energy saving in selected buildings in Universiti Tun
Hussein Onn Malaysia (UTHM) was carried out. In this project, a
case study involving the electric energy consumption of the academic
staff office building was conducted. The scope of the study includes
identifying energy consumption in a selected building, studying
energy-saving opportunities, analysing investment costs in economic
terms, and identifying users' attitudes with respect to energy usage.
The MS1525:2001, Malaysian Standard - Code of Practice on Energy
Efficiency and Use of Renewable Energy for Non-Residential
Buildings, was used as a reference. Several energy-efficiency
measures were considered and their merits and priority were
compared. Improving human behavior can reduce energy
consumption by 6%, while technical measures can reduce it by 44%.
Two economic analysis methods were applied: the payback period
method and the net present value method.
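The two evaluation methods named above can be sketched with hypothetical cash-flow figures (the investment and saving amounts below are illustrative, not the study's data):

```python
# Hedged sketch of the two economic evaluation methods: simple payback
# period and net present value (NPV) of a uniform annual saving.
def payback_period(investment, annual_saving):
    """Years needed for cumulative savings to repay the investment."""
    return investment / annual_saving

def npv(investment, annual_saving, years, rate):
    """NPV of a uniform annual saving, discounted at `rate` per year."""
    return -investment + sum(annual_saving / (1 + rate) ** t
                             for t in range(1, years + 1))

pp = payback_period(50_000, 12_500)             # hypothetical figures: 4 years
val = npv(50_000, 12_500, years=8, rate=0.05)   # positive => measure worthwhile
```

A measure is accepted when its payback period is short enough and its NPV at the chosen discount rate is positive.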
Abstract: The paper deals with the kinematics and automated
calculation of intermittent mechanisms with radial cams. Currently,
electronic cams are increasingly applied in the drives of working link
mechanisms. Despite the huge advantages of electronic cams, namely
their reprogrammability and instantaneous change of displacement
diagrams, conventional cam mechanisms have an irreplaceable role in
production and handling machines. At high frequencies of the
working cycle, the dynamic load on the servomotor rotor increases
and the efficiency of electronic cams strongly decreases.
Though conventional intermittent mechanisms with radial cams are
representatives of fixed automation, they have distinct advantages in
their high speed (high dynamics), positional accuracy and relatively
easy manufacture. We try to offset the disadvantage of a fixed
displacement diagram by reducing costs through simple design and
automated calculation that leads reliably to high-quality and
inexpensive manufacture.
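As an illustration of radial-cam kinematics (not the paper's software), the widely used cycloidal displacement law, which gives zero velocity and acceleration at both ends of the rise and hence low dynamic load:

```python
import math

# Cycloidal displacement law for a cam rise: s(theta) = h * (theta/beta
# - sin(2*pi*theta/beta) / (2*pi)), where h is the total lift and beta the
# cam angle over which the rise takes place.
def cycloidal_rise(theta, beta, h):
    """Follower lift after cam angle theta within a rise of duration beta."""
    x = theta / beta
    return h * (x - math.sin(2 * math.pi * x) / (2 * math.pi))

h, beta = 10.0, math.radians(90)            # 10 mm rise over 90 degrees
mid = cycloidal_rise(beta / 2, beta, h)     # half-way: exactly h/2 by symmetry
end = cycloidal_rise(beta, beta, h)         # full lift h at the end of the rise
```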
Abstract: Very large and/or computationally complex optimization problems sometimes require parallel or high-performance computing to achieve reasonable computation times. One of the most popular and most complicated problems of this family is the Traveling Salesman Problem. In this paper, we introduce a Branch & Bound based algorithm for the solution of such complicated problems, with the main focus on the symmetric traveling salesman problem. We reviewed some of the already available algorithms and found a need for a new algorithm that gives an optimal, or near-optimal, solution. On the basis of logarithmic sampling, it was found that the proposed algorithm produces a relatively optimal solution for the problem and delivers excellent performance compared with the traditional algorithms of this series.
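A minimal sketch of the Branch & Bound scheme for the symmetric TSP (a textbook version with a simple cheapest-edge lower bound, not the paper's exact algorithm):

```python
# Depth-first Branch & Bound for the symmetric TSP (sketch): a partial tour
# is pruned when its cost plus a lower bound (the cheapest outgoing edge of
# every unvisited city) cannot beat the best complete tour found so far.
def tsp_branch_and_bound(dist):
    n = len(dist)
    cheapest = [min(dist[i][j] for j in range(n) if j != i) for i in range(n)]
    best = [float("inf")]

    def search(city, visited, cost):
        if len(visited) == n:                      # close the tour back to 0
            best[0] = min(best[0], cost + dist[city][0])
            return
        bound = cost + sum(cheapest[j] for j in range(n) if j not in visited)
        if bound >= best[0]:                       # prune this branch
            return
        for nxt in range(n):
            if nxt not in visited:
                search(nxt, visited | {nxt}, cost + dist[city][nxt])

    search(0, {0}, 0)
    return best[0]

dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
tour_len = tsp_branch_and_bound(dist)   # optimal tour length for this instance
```

Stronger bounds (reduced cost matrices, 1-trees) prune far more aggressively; the cheapest-edge bound is chosen here only for brevity.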
Abstract: The paper provides biomass characteristics from
proximate analysis (volatile matter, fixed carbon and ash) and
ultimate analysis (carbon, hydrogen, nitrogen and oxygen) for the
prediction of heating value equations. The estimated heating values
of various biomasses can be used for energy evaluation.
Thirteen types of biomass were studied. Proximate analysis was
investigated by mass loss method and infrared moisture analyzer.
Ultimate analysis was performed with a CHNO analyzer. The heating
values varied from 15 to 22.4 MJ kg-1. Correlations of the calculated
heating value with proximate and ultimate analyses were undertaken
using multiple regression analysis and summarized into three and two
equations, respectively. The correlations based on proximate analysis
showed a higher deviation of the calculated heating values from the
experimental heating values than the correlations based on ultimate
analysis.
Abstract: Surface roughness (Ra) is one of the most important requirements in the machining process. In order to obtain better surface roughness, the proper setting of cutting parameters is crucial before the process takes place. This research presents the development of a mathematical model for surface roughness prediction before the milling process in order to evaluate the fitness of the machining parameters: spindle speed, feed rate and depth of cut. 84 samples were run in this study using a FANUC CNC Milling α-T14iE. Those samples were randomly divided into two data sets: the training set (m=60) and the testing set (m=24). ANOVA analysis showed that at least one of the population regression coefficients was not zero. The Multiple Regression Method was used to determine the correlation between a criterion variable and a combination of predictor variables. It was established that the surface roughness is most influenced by the feed rate. Using the Multiple Regression equation, the average percentage deviation was 9.8% for the testing set and 9.7% for the training set. This showed that the statistical model could predict the surface roughness with about 90.2% accuracy on the testing data set and 90.3% accuracy on the training data set.
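The fit-and-validate procedure can be sketched on synthetic data (the paper's 84 CNC samples and fitted coefficients are not reproduced; all numbers below are illustrative):

```python
import numpy as np

# Sketch: fit Ra on spindle speed, feed rate and depth of cut by multiple
# linear regression, then report the average percentage deviation on a
# held-out testing set, as in the 60/24 split described above.
rng = np.random.default_rng(2)
n = 84
speed = rng.uniform(1000, 3000, n)          # spindle speed, rpm
feed = rng.uniform(100, 400, n)             # feed rate, mm/min
depth = rng.uniform(0.2, 1.0, n)            # depth of cut, mm
ra = (1.0 + 0.004 * feed - 0.0002 * speed + 0.5 * depth
      + rng.normal(0, 0.05, n))             # hypothetical Ra response, um

X = np.column_stack([np.ones(n), speed, feed, depth])
train, test = slice(0, 60), slice(60, 84)   # 60 training / 24 testing samples
beta, *_ = np.linalg.lstsq(X[train], ra[train], rcond=None)
pred = X[test] @ beta
avg_pct_dev = 100 * np.mean(np.abs(pred - ra[test]) / ra[test])
```

Prediction accuracy is then reported as 100% minus the average percentage deviation, mirroring the abstract's 90.2%/90.3% figures.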
Abstract: The purpose of this report is to suggest a new
methodology for the assessment of the comparative efficiency of the
reforms made in different countries using an integral index. We have
highlighted the reforms made in the post-crisis period in 21 former
socialist countries.
The integral index describes the social-economic development
level. It comprises six indexes: the Global Competitiveness Index,
Doing Business, the Corruption Perceptions Index, the Index of
Economic Freedom, the Human Development Index, and the
Democracy Index, which are reported by different international
organizations. With the help of our methodology, we first combined
the above-mentioned six indexes into one general index; moreover,
our new method enables us to assess the comparative efficiency of the
reforms made in different countries by analyzing them.
The purpose is to reveal the opportunities and threats of
social-economic reforms in different directions.
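One plausible way to combine six heterogeneous indexes into a single integral index (a sketch with made-up values; the paper's exact aggregation may differ) is min-max normalisation followed by averaging, inverting indexes where a lower raw score is better:

```python
import numpy as np

# Sketch: min-max normalise each component index across countries into
# [0, 1], flip the columns where a lower raw score means a better outcome
# (e.g. a rank), and average the six columns into one integral index.
def integral_index(scores, lower_is_better):
    """scores: countries x indexes matrix; returns one value per country."""
    S = np.asarray(scores, float)
    lo, hi = S.min(axis=0), S.max(axis=0)
    norm = (S - lo) / (hi - lo)                       # each column into [0, 1]
    norm[:, lower_is_better] = 1 - norm[:, lower_is_better]
    return norm.mean(axis=1)

scores = [[4.1, 70, 35, 60.1, 0.74, 5.5],    # country A (made-up values)
          [3.8, 95, 52, 55.3, 0.69, 4.8],    # country B
          [4.5, 40, 28, 65.0, 0.78, 6.2]]    # country C
idx = integral_index(scores, lower_is_better=[1, 2])  # columns 1-2 are ranks
```

Comparing the integral index before and after a reform period then gives a single-number measure of each country's relative progress.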
Abstract: As the air traffic increases at a hub airport, some
flights cannot land or depart at their preferred target times. This
happens because the airport runways become occupied to near their
capacity. This results in extra costs for both passengers and airlines
because of missed connecting flights, longer waiting, more fuel
consumption, rescheduling of crew members, etc. Hence, devising an
appropriate scheduling method that determines a suitable runway and
time for each flight in order to efficiently use the hub capacity and
minimize the related costs is of great importance. In this paper, we
present a mixed-integer zero-one model for scheduling a set of mixed
landing and departing flights (whereas most previous studies
considered only landings). Since the flight cost is strongly affected
by the airline's level, we consider different airline categories in our
model. The model has a single objective minimizing the total sum of
three terms, namely 1) the weighted deviation from target times,
2) the scheduled time of the last flight (i.e., the makespan), and
3) the workload imbalance across runways. We
solve 10 simulated instances of different sizes up to 30 flights and 4
runways. Optimal solutions are obtained in a reasonable time, which
are satisfactory in comparison with the traditional rule, namely
First-Come-First-Served (FCFS), which is far from optimal in most
cases.
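The FCFS baseline mentioned above can be sketched as follows (with an assumed fixed separation between consecutive operations on the same runway; real separation rules depend on aircraft classes):

```python
# Sketch of an FCFS runway scheduler: flights are taken in target-time
# order and each is placed on the runway that can serve it earliest,
# subject to a fixed separation between consecutive operations.
def fcfs_schedule(targets, n_runways, separation=2):
    """targets: desired times; returns (assigned time, runway) per flight."""
    free_at = [0] * n_runways                 # earliest next slot per runway
    schedule = []
    for t in sorted(targets):
        r = min(range(n_runways), key=lambda i: max(free_at[i], t))
        start = max(free_at[r], t)            # cannot start before target
        free_at[r] = start + separation
        schedule.append((start, r))
    return schedule

# 6 flights, 2 runways: delays appear once both runways are occupied.
targets = [0, 0, 0, 1, 5, 5]
sched = fcfs_schedule(targets, n_runways=2)
total_delay = sum(start - t
                  for (start, _), t in zip(sched, sorted(targets)))
```

The mixed-integer model, by contrast, may reorder flights and weight their delays by airline category, which is why it dominates this baseline.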
Abstract: Copolymerization of ethylene with 1-hexene was
carried out using two ansa-fluorenyl titanium derivative complexes.
The substituent effect on the catalytic activity, monomer reactivity
ratios and polymer properties was investigated. It was found that the
presence of t-Bu groups on fluorenyl ring exhibited remarkable
catalytic activity and produced polymer with high molecular weight.
However, these catalysts produce polymer with a narrow molecular
weight distribution, indicating the characteristics of a single-site
metallocene catalyst. Based on 13C NMR, we observed that the
monomer reactivity ratios were affected by the catalyst structure. The
rH values of complex 2 were lower than those of complex 1, which
might result from the higher steric hindrance leading to a reduction
in the 1-hexene insertion step.
Abstract: Scheduling algorithms are used in operating systems
to optimize the usage of processors. One of the most efficient
algorithms for scheduling is the Multi-Level Feedback Queue (MLFQ)
algorithm, which uses several queues with different quanta. The most
important weakness of this method is the inability to determine the
optimal number of queues and the quantum of each queue. This
weakness has been addressed in the IMLFQ scheduling algorithm.
The number of queues and the quantum of each queue affect the response
time directly. In this paper, we review the IMLFQ algorithm for
solving these problems and minimizing the response time. In this
algorithm, a Recurrent Neural Network is utilized to find both the
number of queues and the optimal quantum of each queue. Also, in
order to prevent probable faults in the computation of processes'
response times, a new fault-tolerant approach is presented. In this
approach, we use combinational software redundancy to prevent
probable faults. The experimental results show that using the IMLFQ
algorithm results in better response times in comparison with other
scheduling algorithms; moreover, the fault-tolerant mechanism
further improves IMLFQ performance.
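A toy simulation of the underlying MLFQ mechanism (with fixed quanta; IMLFQ instead tunes the number of queues and each quantum with a recurrent neural network):

```python
from collections import deque

# Toy MLFQ simulation (illustrative only): a job that exhausts its quantum
# is demoted to the next, slower queue; the lowest queue recycles its jobs.
def mlfq(jobs, quanta):
    """jobs: {name: burst time}; returns completion time by job name."""
    queues = [deque() for _ in quanta]
    for name, burst in jobs.items():
        queues[0].append([name, burst])       # all jobs enter the top queue
    clock, done = 0, {}
    while any(queues):
        level = next(i for i, q in enumerate(queues) if q)
        name, remaining = queues[level].popleft()
        run = min(remaining, quanta[level])   # run for at most one quantum
        clock += run
        remaining -= run
        if remaining == 0:
            done[name] = clock
        else:                                 # demote (last queue recycles)
            queues[min(level + 1, len(quanta) - 1)].append([name, remaining])
    return done

finish = mlfq({"A": 3, "B": 6, "C": 1}, quanta=[2, 4])
```

With quanta [2, 4], the short job C finishes quickly at the top level, while the longer jobs A and B are demoted; choosing the quanta (and queue count) well is exactly the tuning problem IMLFQ delegates to the neural network.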