Abstract: The fundamental motivation of this paper is how gaze estimation can be used effectively in games. In games, precise point-of-gaze (POG) estimation is not always essential for aiming at targets; the ability to move a cursor to a target accurately is equally important. From a game-production point of view, expressing head movement and gaze movement separately is sometimes advantageous for conveying a sense of presence: panning the background image with head movement while moving the cursor with gaze movement is a representative example. The widely used POG estimation technique is based on the relative position between the center of the corneal reflection of infrared light sources and the center of the pupil. However, computing the pupil center requires relatively complicated image processing, so processing delay is a concern, since minimizing input delay is one of the most important requirements in games. In this paper, a method is proposed that estimates head movement using only the corneal reflections of two infrared light sources in different locations. Furthermore, a method to control a cursor using both gaze movement and head movement is proposed. The proposed methods are evaluated with game-like applications; the results confirm performance similar to conventional methods, while achieving aiming control with lower computational cost and stress-free, intuitive operation.
Abstract: To produce sugar and ethanol, sugarcane processing generates several agricultural residues, straw and bagasse being the main ones. What to do with these residues has been the subject of many studies and experiences in an industry that, in recent years, has distinguished itself by its ability to transform waste into valuable products such as electric power. Cellulose is the main component of these materials. It is the most common organic polymer, accounting for about 1.5 × 10^12 tons of the total annual production of biomass, and it is considered an almost inexhaustible source of raw material. Pretreatment with mineral acids is one of the most widely used stages of cellulose extraction from lignocellulosic materials, as it solubilizes most of the hemicellulose content. The goal of this study was to find the reaction time for sugarcane bagasse pretreatment with sulfuric acid that minimizes the loss of cellulose while removing as much hemicellulose and lignin as possible. The best reaction time was found to be 40 minutes, at which the loss of hemicellulose was around 70% and the losses of lignin and cellulose around 15% each. Beyond this time, cellulose loss increased while no further hemicellulose or lignin was removed.
Abstract: The main objective of this work is to compare the quality of service of the bus companies operating in the city of Rio Branco, located in the state of Acre, with that of the bus companies operating in the city of Campos, situated in the state of Rio de Janeiro, both cities in Brazil. This comparison, based on the opinion of bus users, determines their degree of satisfaction with the service available in each city. The outcome of the evaluation shows that users are unhappy with the quality of the service provided by the bus companies in both cities, and that alternative solutions are needed to mitigate the main problems detected in this work. With these alternatives available, the bus companies will be able to better understand the needs of their customers in terms of manpower, service cost, time schedules, and so on.
Abstract: This paper introduces the foundations of Bayesian probability theory and the Bayesian decision method. The main goal of Bayesian decision theory is to minimize the expected loss of a decision, or equivalently the expected risk. The purposes of this study are to review the decision process for flood occurrences and to suggest possible ways to improve it. The study examines the problem structure of flood occurrences and theoretically explicates the decision-analytic approach based on Bayesian decision theory, applied to flood occurrences in environmental engineering. The analysis uses the annual maximum water level (in cm) from the 43-year record (1965 to 2007) available at the Sagaing gauging station on the Ayeyarwady River, whose drainage area is 120,193 sq km. Based on two standard maximum water levels over the 43 years, the loss and risk of whether vast areas of agricultural land will be inundated in the coming year are discussed, and a forecast is made of whether these lands will be safe from flood water during the next 10 years.
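The core of the Bayesian decision method described above, choosing the action with minimum expected loss, can be sketched in a few lines. The states, posterior probabilities, and loss values below are illustrative placeholders, not the paper's flood data:

```python
states = ["flood", "no_flood"]

# Posterior probabilities of each state (assumed values for illustration).
posterior = {"flood": 0.3, "no_flood": 0.7}

# loss[action][state]: cost incurred by taking `action` when `state` occurs
# (hypothetical costs, e.g. in arbitrary monetary units).
loss = {
    "protect":    {"flood": 10.0,  "no_flood": 5.0},
    "do_nothing": {"flood": 100.0, "no_flood": 0.0},
}

def expected_loss(action):
    # Bayes risk of an action: loss averaged over the posterior.
    return sum(posterior[s] * loss[action][s] for s in states)

# The Bayes decision rule: pick the action with minimum expected loss.
best = min(loss, key=expected_loss)
```

With these placeholder numbers, "protect" has expected loss 6.5 against 30.0 for "do_nothing", so the rule selects "protect".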
Abstract: In this paper, a block code that minimizes the peak-to-average power ratio (PAPR) of orthogonal frequency division multiplexing (OFDM) signals is proposed. It is shown that cyclic shifts and codeword inversion do not change the peak envelope power. The encoding rule for the proposed code comprises searching for a seed codeword, shifting the register elements, and determining codeword inversion, eliminating the look-up table for one-to-one correspondence between the source and the coded data. Simulation results show that OFDM systems with the proposed code always have the minimum PAPR.
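The invariance claimed above can be checked numerically: a cyclic shift of the frequency-domain codeword only multiplies the time-domain OFDM signal by a unit-magnitude phase ramp, and codeword inversion only negates it, so the per-sample envelope is unchanged. A minimal sketch (the BPSK mapping and 16-carrier symbol are assumptions for illustration, not the paper's code construction):

```python
import numpy as np

def papr(freq_symbols):
    # PAPR of the OFDM time-domain signal obtained by IFFT:
    # peak instantaneous power divided by mean power.
    x = np.fft.ifft(freq_symbols)
    p = np.abs(x) ** 2
    return p.max() / p.mean()

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 16)
c = 1.0 - 2.0 * bits            # BPSK mapping: 0 -> +1, 1 -> -1

base = papr(c)
shifted = papr(np.roll(c, 3))   # cyclic shift of the codeword
inverted = papr(-c)             # codeword inversion
# base, shifted and inverted agree up to floating-point error.
```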
Abstract: This paper proposes a method that reduces power consumption in the single-error-correcting, double-error-detecting (SEC-DED) checker circuits used for memory error correction codes. Power is minimized with little or no impact on area and delay by exploiting the degrees of freedom in selecting the parity check matrix of the error correcting code. A genetic algorithm is employed to solve the nonlinear power optimization problem. The method is applied to two commonly used SEC-DED codes: the standard Hamming code and odd-column-weight Hsiao codes. Experiments were performed to show the performance of the proposed method.
Abstract: The diagnostic goal for transformers in service is to detect faults in the winding or the core. Transformers are valuable equipment that make a major contribution to the supply security of a power system. Consequently, it is of great importance to minimize the frequency and duration of unwanted outages of power transformers. Frequency Response Analysis (FRA) has been found to be a useful tool for the reliable detection of incipient mechanical faults in a transformer by finding winding or core defects. In the first part of this article, the authors propose the coupled-circuits method, because it gives the most exhaustive modelling of transformers possible. The second part of this work applies FRA at low frequency in order to improve and simplify the reading of the response. This study can serve as base data for other transformers of the same categories intended for the distribution grid.
Abstract: The approach based on the wavelet transform has
been widely used for image denoising due to its multi-resolution
nature, its ability to produce high levels of noise reduction and the
low level of distortion introduced. However, by removing noise, high
frequency components belonging to edges are also removed, which
leads to blurring the signal features. This paper proposes a new
method of image noise reduction based on local variance and edge
analysis. The analysis is performed by dividing an image into 32 x 32
pixel blocks and transforming the data into the wavelet domain. A fast lifting wavelet spatial-frequency decomposition and reconstruction is developed, with the advantages of being computationally efficient and of minimizing boundary effects. Adaptive thresholding by local variance estimation and edge strength measurement can effectively reduce image noise while preserving the features of the original image corresponding to the boundaries of objects. Experimental results demonstrate that the method performs well for images contaminated by natural and artificial noise, and that it can be adapted to different classes of images and types of noise. The proposed algorithm provides a potential solution with parallel computation for real-time or embedded system applications.
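A one-level 1-D Haar lifting step with a local-variance-driven soft threshold illustrates the kind of scheme the abstract describes. This is a simplified sketch, not the paper's 2-D block-wise algorithm; the threshold rule is an assumed textbook form:

```python
import numpy as np

def haar_lift(x):
    # Forward Haar lifting step (1-D signal of even length):
    # split into even/odd samples, predict, then update.
    even, odd = x[0::2].copy(), x[1::2].copy()
    d = odd - even          # predict: detail coefficients
    s = even + d / 2        # update: approximation coefficients
    return s, d

def haar_unlift(s, d):
    # Exact inverse of haar_lift (perfect reconstruction).
    even = s - d / 2
    odd = d + even
    x = np.empty(2 * len(s))
    x[0::2], x[1::2] = even, odd
    return x

def denoise(x, noise_var):
    # Soft-threshold the detail band with a threshold driven by the
    # estimated local signal variance (an assumed sigma^2/sigma_x rule,
    # standing in for the paper's block-wise local-variance analysis).
    s, d = haar_lift(x)
    local_var = max(np.var(d) - noise_var, 1e-12)
    t = noise_var / np.sqrt(local_var)
    d = np.sign(d) * np.maximum(np.abs(d) - t, 0.0)
    return haar_unlift(s, d)
```

The lifting form needs only in-place additions and subtractions, which is what makes it attractive for the parallel and embedded settings the abstract mentions.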
Abstract: In this work, the effects of uniaxial mechanical stress on a pixel readout circuit are theoretically analyzed. It is shown that the effects of mechanical stress on the in-pixel transistors do not arise at the output when a correlated double sampling circuit is used. However, mechanical stress effects on the photodiode appear directly at the output of the readout chain. Therefore, compensation techniques are needed to overcome this situation. Moreover, a simulation technique for mechanical stress is proposed, and layout and design recommendations are put forward in order to minimize stress-related effects on the output of the circuit.
Abstract: In order to take the effects of the higher modes in pushover analysis into account, several multi-modal pushover procedures have been presented in recent years. In these methods, the responses of the considered modes are combined by the square-root-of-sum-of-squares (SRSS) rule, even though the application of elastic modal combination rules in the inelastic phase is no longer valid. In this research, the feasibility of defining an efficient alternative combination method is investigated. Two steel moment-frame buildings, denoted SAC-9 and SAC-20, are considered under ten earthquake records. The nonlinear responses of the structures are estimated by the direct algebraic combination of the weighted responses of the separate modes. The weight of each mode is defined so that the resulting combined response has minimum error with respect to the nonlinear time history analysis. A genetic algorithm (GA) is used to minimize the error and optimize the weight factors. The optimal factors obtained for each mode in the different cases are compared in order to find unique appropriate weight factors for each mode in all cases.
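For contrast, the classical SRSS rule and the weighted algebraic combination investigated above can be written in a few lines. The modal responses and weights below are illustrative placeholders; in the paper, the weights are obtained by the GA against nonlinear time-history results:

```python
import numpy as np

# Hypothetical peak modal responses at one storey (illustrative values).
r = np.array([12.0, 4.0, 1.5])

# Classical SRSS combination of the modal peaks.
srss = np.sqrt(np.sum(r ** 2))

# Directed algebraic combination with per-mode weight factors w_i
# (placeholder weights; the paper optimizes these with a GA).
w = np.array([1.0, 0.6, 0.3])
weighted = np.sum(w * r)
```

The algebraic form keeps the sign information that SRSS discards, which is what makes per-mode weighting against nonlinear benchmarks meaningful.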
Abstract: Performance of a limited Round-Robin (RR) rule is
studied in order to clarify the characteristics of a realistic sharing
model of a processor. Under the limited RR rule, the processor
allocates to each request a fixed amount of time, called a quantum, in a
fixed order. The sum of the requests being allocated these quanta is
kept below a fixed value. Arriving requests that cannot be allocated
quanta because of such a restriction are queued or rejected. Practical
performance measures, such as the relationship between the mean
sojourn time, the mean number of requests, or the loss probability and
the quantum size are evaluated via simulation. In the evaluation, the
requested service time of an arriving request is converted into a
quantum count. One of these quanta is included in each RR cycle, that is, in each series of quanta allocated to the requests in a fixed order. The service time of an arriving request can then be evaluated from the number of RR cycles required to complete the service, the number of requests receiving service, and the quantum size. The number of quanta still needed before a service completes is reevaluated at the arrival or departure of other requests. Tracking these events and calculations enables us to analyze the performance of our limited RR rule. In particular, we obtain the most suitable quantum size, which minimizes the mean sojourn time, for the case in which the switching time for each quantum is considered.
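The conversion from a requested service time to a quantum count, and the resulting number of RR cycles, can be sketched as follows; the quantum size is an assumed value for illustration:

```python
import math

QUANTUM = 0.5   # quantum size (assumed units of time)

def quanta_needed(service_time):
    # Convert a requested service time into a whole number of quanta,
    # as done in the evaluation described above.
    return math.ceil(service_time / QUANTUM)

def rr_cycles(active_services):
    # Each RR cycle grants one quantum to every active request in a
    # fixed order, so service finishes after as many cycles as the
    # largest quantum count among the active requests.
    return max(quanta_needed(s) for s in active_services)
```

For example, with a quantum of 0.5, a request of service time 1.2 needs 3 quanta, and a batch of requests with times 1.2, 0.4 and 2.0 completes after 4 RR cycles.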
Abstract: This paper presents the application of a signal intensity
independent similarity criterion for rigid and non-rigid body
registration of binary objects. The criterion is defined as the
weighted ratio image of two images. The ratio is computed on a voxel-per-voxel basis, and weighting is performed by setting the ratios between signal and background voxels to a standard high value. The mean squared value of the weighted ratio is computed over the union of the signal areas of the two images and is minimized using Chebyshev polynomial approximation.
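One plausible reading of the criterion, for strictly binary images, is sketched below. The "standard high value" and the unit ratio on overlapping signal voxels are assumptions of this sketch, not the paper's exact definitions:

```python
import numpy as np

HIGH = 10.0  # standard high value for signal/background ratios (assumption)

def weighted_ratio_mse(a, b):
    # Voxel-per-voxel weighted ratio of two binary images:
    # signal/signal voxels get ratio 1, signal/background mismatches
    # get the standard high value, and the mean squared value is
    # taken over the union of the two signal areas.
    union = (a > 0) | (b > 0)
    ratio = np.full(a.shape, HIGH)
    ratio[(a > 0) & (b > 0)] = 1.0
    return np.mean(ratio[union] ** 2)
```

Under this reading, perfectly registered binary objects give the minimum value 1.0, and any misalignment raises the criterion toward HIGH squared, which is what makes it usable as a registration cost.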
Abstract: This paper gives an overview of a deep drawing process that uses a pressurized liquid medium separated from the sheet by a rubber diaphragm. Hydroforming deep drawing of sheet
metal parts provides a number of advantages over conventional
techniques. It generally increases the depth to diameter ratio possible
in cup drawing and minimizes the thickness variation of the drawn
cup. To explore the deformation mechanism, analytical and
numerical simulations are used for analyzing the drawing process of
an AA6061-T4 blank. The effects of key process parameters such as
coefficient of friction, initial thickness of the blank and radius
between cup wall and flange are investigated analytically and
numerically. The simulated results were in good agreement with the
results of the analytical model. According to finite element
simulations, the hydroforming deep drawing method provides a more
uniform thickness distribution compared to conventional deep
drawing and decreases the risk of tearing during the process.
Abstract: With the need for increased processing capacity at lower energy consumption, power-aware multiprocessor systems have gained more attention in recent years. One of the additional challenges to be solved in a multiprocessor system, compared to a uniprocessor system, is job allocation. This paper presents a novel task-dependent job allocation algorithm, Energy-centric Allocation (Ec-A), with Rate Monotonic (RM) scheduling to minimize energy consumption in a multiprocessor system. A simulation analysis is carried out to verify the performance increase along with the reduction in energy consumption and in the required number of processors in the system.
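Since the paper pairs its allocation algorithm with Rate Monotonic scheduling, the classical Liu-Layland sufficient utilization test is a useful reference point for deciding whether a task set fits on one processor. This sketch is standard RM theory, not the paper's Ec-A algorithm:

```python
def rm_schedulable(tasks):
    # Liu & Layland sufficient test for Rate Monotonic scheduling:
    # `tasks` is a list of (execution_time, period) pairs.
    # The set is schedulable if total utilization does not exceed
    # the bound n * (2**(1/n) - 1).
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    return utilization <= n * (2 ** (1 / n) - 1)
```

For three tasks with utilizations summing to 0.65, the bound is about 0.78, so the set passes; a pair summing to 1.15 fails. The test is sufficient but not necessary, so a set that fails it may still be schedulable by exact response-time analysis.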
Abstract: In this paper, we propose an effective system for digital music retrieval. The proposed system is divided into a client part and a server part. The client part consists of pre-processing and content-based feature extraction stages. In the pre-processing stage, we minimize the time-code gap that occurs between copies of the same music content. As the content-based feature, first-order differentiated MFCCs are used; these approximate the envelope of the music feature sequences. The server part includes the music server and the music matching stage. Features extracted from 1,000 digital music files are stored in the music server. In the music matching stage, the retrieval result is found through similarity measurement by DTW. In the experiments, we used 450 queries, made by mixing different compression standards and sound qualities from 50 digital music files. Retrieval accuracy was 97%, and the average retrieval time was 15 ms per query. Our experiments prove that the proposed system is effective for retrieving digital music and robust in various web user environments.
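The similarity measure named above, DTW, can be sketched with the classic dynamic-programming recurrence. The 1-D sequences and absolute-difference cost below are simplifications of matching multi-dimensional MFCC feature sequences:

```python
import numpy as np

def dtw_distance(a, b):
    # Dynamic time warping distance between two 1-D feature sequences.
    # D[i, j] = local cost + min over the three allowed warping moves.
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[n, m]
```

Because the warping path can stretch or compress time, DTW tolerates the tempo and time-code differences between copies of the same track that the pre-processing stage targets.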
Abstract: Assembly line balancing is a very important issue in mass production systems because of its impact on production cost. Although many studies have been done on this topic, assembly line balancing problems are so complex that they are categorized as NP-hard, and researchers strongly recommend using heuristic methods. This paper presents a new heuristic approach called the critical task method (CTM) for solving U-shape assembly line balancing problems. The performance of the proposed heuristic is tested by solving a number of test problems and comparing the results with 12 other heuristics available in the literature, confirming its superior performance. Furthermore, to prove the efficiency of the proposed CTM, the objectives are extended to minimizing the number of workstations (or, equivalently, maximizing line efficiency) and minimizing the smoothness index. Finally, it is shown that the proposed heuristic is more efficient than the others at solving the U-shape assembly line balancing problem.
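The two objectives named above have standard textbook definitions that can be written directly; the station times and cycle time below are illustrative values, not the paper's test problems:

```python
import math

def line_efficiency(station_times, cycle_time):
    # Line efficiency: total work content divided by the capacity
    # provided (number of stations times the cycle time).
    return sum(station_times) / (len(station_times) * cycle_time)

def smoothness_index(station_times):
    # Smoothness index: root of the summed squared deviations of each
    # station's load from the most loaded station; 0 means a perfectly
    # balanced line.
    t_max = max(station_times)
    return math.sqrt(sum((t_max - t) ** 2 for t in station_times))
```

Fewer stations raise line efficiency for a fixed work content, which is why minimizing the number of workstations and maximizing line efficiency are equivalent objectives.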
Abstract: In the gas refineries of Iran's South Pars Gas Complex, the Sulfrex demercaptanization process is used to remove volatile and corrosive mercaptans from liquefied petroleum gases with a caustic solution. This process consists of two steps: removing low-molecular-weight mercaptans and regenerating the exhausted caustic. The effective factors include the LPG feed temperature, caustic concentration, and the feed's mercaptan content in the extraction step, and the sodium mercaptide content in the caustic, catalyst concentration, caustic temperature, and air injection rate in the regeneration step. This paper focuses on the temperature factor, which plays a key role in mercaptan extraction and caustic regeneration. The experimental results demonstrate that by optimizing the temperature, the sodium mercaptide content in the caustic was minimized thanks to good oxidation, and the sulfur impurities in the product were reduced.
Abstract: Variational methods for optical flow estimation are known for their excellent performance. The method proposed by Brox et al. [5] exemplifies the strength of that framework: it combines several concepts into a single energy functional that is then minimized according to a clear numerical procedure. In this paper we propose a modification of that algorithm starting from the spatiotemporal gradient constancy assumption. The numerical scheme allows us to establish the connection between our model and the CLG(H) method introduced in [18]. Experimental evaluation carried out on synthetic sequences shows the significant superiority of the spatial variant of the proposed method. A comparison between the methods on a real-world sequence is also included.
Abstract: This paper introduces a mixed integer programming model to find the optimum development plan for the port of Anzali. The model minimizes total system costs, taking into account both port infrastructure costs and shipping costs. Owing to the multipurpose function of the port, the model consists of 1020 decision variables and 2490 constraints. The results of the model determine the optimum number of berths that should be constructed in each period for each type of cargo. In addition, the results of a sensitivity analysis on port operation quantity provide useful information for managers in choosing the best scenario for port planning with the lowest investment risk. Despite all limitations due to data availability, the model offers a straightforward decision tool for port planners aspiring to achieve optimum port planning.
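The objective described above, minimizing port infrastructure plus shipping costs over periods and cargo types, can be sketched in a simplified form. The index sets, cost coefficients, and variables below are illustrative placeholders, not the paper's actual 1020-variable formulation:

```latex
\min \; \sum_{t \in T} \sum_{k \in K}
  \left( c^{\mathrm{berth}}_{k}\, x_{t,k} \;+\; c^{\mathrm{ship}}_{k}\, w_{t,k} \right),
\qquad x_{t,k} \in \mathbb{Z}_{\ge 0},
```

where (under the assumptions of this sketch) \(T\) is the set of planning periods, \(K\) the set of cargo types, \(x_{t,k}\) the number of berths of type \(k\) constructed in period \(t\), \(w_{t,k}\) the associated ship waiting and handling quantity, and \(c^{\mathrm{berth}}_{k}\), \(c^{\mathrm{ship}}_{k}\) the unit infrastructure and shipping costs. Capacity and demand constraints linking \(x\) and \(w\) would complete the model.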
Abstract: Multirate multimedia delivery applications in multihop Wireless Mesh Networks (WMNs) are data-redundant and delay-sensitive, which raises many challenges for designing efficient transmission systems. In this paper, we propose a new cross-layer resource allocation scheme that minimizes the receiver-side distortion within the delay bound requirements by exploring application-layer Position and Value (P-V) diversity as well as the multihop Effective Capacity (EC). We specifically consider image transmission optimization. First, the maximum supportable source traffic rate is identified by exploring the multihop Effective Capacity (EC) model. Then, the optimal source coding rate is selected according to the P-V diversity of multirate media streaming, which significantly increases the decoded media quality. Simulation results show that the proposed approach improves media quality significantly compared with traditional approaches under the same QoS requirements.