Abstract: As a by-product of the biodiesel industry, glycerol is generated in vast quantities that surpass market demand. It is therefore imperative to develop efficient glycerol valorization processes that minimize the net energy requirement and intensify biodiesel production. In this study, base-catalyzed transesterification of glycerol with dimethyl carbonate (DMC) to produce glycerol carbonate (GC) was conducted under microwave irradiation, using grades of glycerol of varying purity (70%, 86% and 99%) obtained from a biodiesel plant. Metal oxide catalysts were tested while varying operating parameters including reaction time, DMC/glycerol molar ratio, catalyst weight %, temperature and stirring speed. The study of these operating parameters found that the type of catalyst used has the most significant effect on the transesterification reaction. Among the metal oxide catalysts examined, CaO gave the best performance. This study demonstrates the feasibility of producing glycerol carbonate from different grades of glycerol under both conventional thermal activation and microwave irradiation with CaO as catalyst. Microwave-assisted transesterification (MAT) of glycerol into glycerol carbonate proved to be an energy-efficient route, achieving a 94.2% GC yield at 65°C with a 5-minute reaction time, 1 wt% CaO and a DMC/glycerol molar ratio of 2. The advantages of the MAT route allow direct utilization of bioglycerol from biodiesel production without purification, marking a more economical and less energy-intensive glycerol carbonate synthesis route.
Abstract: Sometimes the amount of time available for testing can be considerably less than the expected lifetime of the component. To overcome such a problem, there is the accelerated life-testing alternative, aimed at forcing components to fail by testing them at much higher-than-intended application conditions. These models are known as acceleration models. One possible way to translate test results obtained under accelerated conditions to normal use conditions is through the application of the "Maxwell Distribution Law." In this paper we apply a combined approach of sequential life testing and accelerated life testing to a low-alloy high-strength steel component used in the construction of overpasses in Brazil. The underlying sampling distribution will be the three-parameter Inverse Weibull model. To estimate the three parameters of the Inverse Weibull model we use a maximum likelihood approach for censored failure data. We assume a linear acceleration condition. To evaluate the accuracy (significance) of the parameter values obtained under normal conditions for the underlying Inverse Weibull model, we apply to the expected normal failure times a sequential life test using a truncation mechanism. An example illustrates the application of this procedure.
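The maximum likelihood step for censored Inverse Weibull data can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the three-parameter Inverse Weibull CDF is taken as F(t) = exp(-(σ/(t-μ))^β) for t > μ, and the parameter names (beta, sigma, mu), the censoring time c, and all simulated values are our own assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, t, censored):
    """Negative log-likelihood of the 3-parameter Inverse Weibull,
    F(t) = exp(-(sigma/(t - mu))**beta) for t > mu, with
    right-censored observations flagged in `censored`."""
    beta, sigma, mu = params
    if beta <= 0 or sigma <= 0 or np.any(t <= mu):
        return np.inf  # outside the parameter space
    z = (sigma / (t - mu)) ** beta
    # Failures contribute log f(t); censored units contribute log(1 - F(t)).
    log_f = (np.log(beta) + beta * np.log(sigma)
             - (beta + 1.0) * np.log(t - mu) - z)
    log_S = np.log1p(-np.exp(-z))
    return -(np.sum(log_f[~censored]) + np.sum(log_S[censored]))

# Simulate censored failure data from known parameters as a sanity check.
rng = np.random.default_rng(42)
beta_true, sigma_true, mu_true, c = 2.0, 100.0, 10.0, 400.0
u = rng.uniform(size=300)
t = mu_true + sigma_true * (-np.log(u)) ** (-1.0 / beta_true)
censored = t > c
t = np.minimum(t, c)  # censored units are only known to survive past c

res = minimize(neg_log_likelihood, x0=[1.5, 80.0, 0.0],
               args=(t, censored), method="Nelder-Mead")
beta_hat, sigma_hat, mu_hat = res.x
```

The derivative-free Nelder-Mead search is used here only because the likelihood is undefined for μ ≥ min(t); any bounded optimizer would serve.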
Abstract: With a growing number of digital libraries and other
open education repositories being made available throughout the
world, effective search and retrieval tools are necessary to access the
desired materials that surpass the effectiveness of traditional, all-inclusive
search engines. This paper discusses the design and use of
Folksemantic, a platform that integrates OpenCourseWare search,
Open Educational Resource recommendations, and social network
functionality into a single open source project. The paper describes
how the system was originally envisioned, its goals for users, and
data that provides insight into how it is actually being used. Data
sources include website click-through data, query logs, web server
log files and user account data. Based on a descriptive analysis of its
current use, modifications to the platform's design are recommended
to better address goals of the system, along with recommendations
for additional phases of research.
Abstract: Web-based systems have become increasingly important because the Internet and the World Wide Web have become ubiquitous, surpassing all other technological developments in our history. The Internet, and company websites in particular, have rapidly evolved in their scope and extent of use: from being little more than fixed advertising material, i.e. a "web presence" with no particular influence on the company's business, to being one of the most essential parts of the company's core business.
Traditional software engineering approaches with process models such as CMM and the Waterfall model do not work very well, since web system development differs from traditional development. The development differs in several ways: for example, there is a large gap between traditional software engineering designs and concepts and the low-level implementation model, and many web-based system development activities are business-oriented (for example, web applications are sales-oriented and intranets are content-oriented) rather than engineering-oriented.
This paper aims to introduce the Increment Iterative extreme Programming (IIXP) methodology for developing web-based systems. Unlike existing methodologies, this methodology is a combination of different traditional and modern software engineering and web engineering principles.
Abstract: In this paper we will develop a sequential life test approach applied to a modified low alloy-high strength steel part used in highway overpasses in Brazil. We will consider two possible underlying sampling distributions: the Normal and the Inverse Weibull models. The minimum life will be considered equal to zero. We will use the two underlying models to analyze a fatigue life test situation, comparing the results obtained from both. Since a major chemical component of this low alloy-high strength steel part has been changed, there is little information available about the possible values that the parameters of the corresponding Normal and Inverse Weibull underlying sampling distributions could have. To estimate the shape and the scale parameters of these two sampling models we will use a maximum likelihood approach for censored failure data. We will also develop a truncation mechanism for the Inverse Weibull and Normal models. We will provide rules to truncate a sequential life testing situation making one of the two possible decisions at the moment of truncation; that is, accept or reject the null hypothesis H0. An example will develop the proposed truncated sequential life testing approach for the Inverse Weibull and Normal models.
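The truncated sequential test described above can be illustrated with a small numerical sketch. This is our own illustration, not the authors' procedure: the hypothesized parameter values, the error rates and the truncation point are all assumed. With the minimum life taken as zero, the two-parameter Inverse Weibull density f(t) = βσ^β t^(-β-1) exp(-(σ/t)^β) drives Wald's log-likelihood ratio, which is compared against the usual boundaries and resolved by its sign if the test reaches truncation.

```python
import numpy as np

def iw_logpdf(t, beta, sigma):
    """Log-density of the two-parameter Inverse Weibull (minimum life zero)."""
    return (np.log(beta) + beta * np.log(sigma)
            - (beta + 1.0) * np.log(t) - (sigma / t) ** beta)

def sequential_test(times, beta, sigma0, sigma1,
                    alpha=0.05, beta_err=0.05, n_max=50):
    """Wald SPRT of H0: sigma = sigma0 vs H1: sigma = sigma1,
    truncated at n_max observations (sign of the ratio decides)."""
    lo = np.log(beta_err / (1.0 - alpha))       # accept-H0 boundary
    hi = np.log((1.0 - beta_err) / alpha)       # reject-H0 boundary
    llr = 0.0
    n = 0
    for n, t in enumerate(times[:n_max], start=1):
        llr += iw_logpdf(t, beta, sigma1) - iw_logpdf(t, beta, sigma0)
        if llr <= lo:
            return "accept H0", n
        if llr >= hi:
            return "reject H0", n
    # Truncation: decide by which hypothesis the data favor so far.
    return ("accept H0" if llr < 0 else "reject H0"), n_max

# Failure times simulated under H0 (sigma = 100, beta = 2):
# (sigma/t)^beta ~ Exp(1), so t = sigma * E**(-1/beta).
rng = np.random.default_rng(7)
times = 100.0 * rng.exponential(size=200) ** (-0.5)
decision, n_used = sequential_test(times, beta=2.0, sigma0=100.0, sigma1=50.0)
```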
Abstract: By systematically applying different engineering
methods, difficult financial problems become approachable. Using a
combination of theory and techniques such as wavelet transform,
time series data mining, Markov chain based discrete stochastic
optimization, and evolutionary algorithms, this work formulated a
strategy to characterize and forecast non-linear time series. It
attempted to extract typical features from the volatility data sets of
S&P100 and S&P500 indices, which include abrupt drops, jumps and other non-linearities. As a result, forecasting accuracy reached an average of over 75%, surpassing any other publicly available results on the forecasting of a financial index.
Abstract: Recently, vehicular ad-hoc networks (VANETs) for Intelligent Transport Systems (ITS) have become able to provide safety and convenience services that surpass simple services such as electronic toll collection. To provide these services, a VANET needs infrastructure deployed across the whole country, which demands a huge investment of human and material resources. For this reason, several studies have examined the use of existing cellular networks instead of deploying new infrastructure. The purpose of this study is to evaluate the performance of cellular networks for VANET use. This paper presents the results of an experiment on the suitability of cellular networks for VANET; among the cellular network technologies compared, LTE (Long Term Evolution) was found to be the most suitable.
Abstract: This study examined the effect of strain, storage period and their interaction on some quantitative and qualitative traits and on the percentages of egg components in eggs collected at the start of production (at 24 weeks of age). Eggs were divided into three storage periods (1, 7 and 14 days) under refrigerator temperature (5-7°C). Fifty-seven eggs were obtained randomly from each strain, namely Isa Brown and Lohman White. The General Linear Model in the SAS programme was used to analyze the collected data, and correlations between the studied traits were calculated for each strain. Average egg weight (EW), Haugh unit (HU), yolk index (YI), yolk % (YP), albumin % (AP) and yolk-to-albumin ratio (YAR) were 56.629 g, 87.968, 0.493, 22.13%, 67.74% and 32.76%, respectively. Eggs produced by Isa Brown surpassed those produced by Lohman White significantly (P
Abstract: The aim of this retrospective study was to evaluate parameters of dental implants such as patient gender, number of implants, implants that failed before prosthetic restoration, implants that failed after implantation, and implants that failed after prosthetic restoration. A total of 234 implant patients (135 male and 99 female) were treated with 450 implants between 2005 and 2009 at the GATA Haydarpasa Training Hospital Dental Service. Twelve implants failed before prosthetic restoration. Four implants failed after fixed prosthetic restoration. The cumulative survival rate after prosthesis placement was 97.56% over the 6-year period.
Abstract: The private theme parks are gradually surpassing
public-owned scenic areas after many years of development and have
become a mainstream choice for domestic tourists. Previous studies
show that visitors from different backgrounds differ in consumer
behavior and satisfaction factors. An understanding of visitor
satisfaction is therefore of extreme importance to operators of
privately-owned theme parks. Importance-Performance Analysis (IPA)
is used to measure consumers' potential satisfaction with services and
has become a widely used management tool for strength and weakness
analysis of brands, products, services and points of sale. As IPA has
so far not been used to evaluate the visitor satisfaction with
privately-owned theme parks, in this study the IPA method is used to
analyze visitor satisfaction with Janfusun Fancyworld (one of the most
popular private theme parks in Taiwan) and to rank visitor focus and
satisfaction on/in theme park facilities and services. Results of the
analysis provide private theme park operators with an understanding
of user or consumer demands as well as an assessment of the quality of
services currently offered.
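The quadrant logic behind IPA can be sketched in a few lines. The attribute names and scores below are invented for illustration and are not the survey items used in the study; each attribute is classified by comparing its mean importance and mean performance against the grand means, the crosshairs of the IPA grid.

```python
import numpy as np

# Hypothetical mean importance/performance scores (5-point scale) for four
# theme-park attributes; these names and values are illustrative only.
attributes = ["ride safety", "queue time", "souvenir shops", "staff courtesy"]
importance = np.array([4.8, 4.5, 2.1, 2.3])
performance = np.array([4.6, 2.8, 4.2, 2.2])

# Crosshairs of the IPA grid: grand means of the two axes.
imp_bar, perf_bar = importance.mean(), performance.mean()

quadrants = {}
for name, imp, perf in zip(attributes, importance, performance):
    if imp >= imp_bar and perf < perf_bar:
        quadrants[name] = "Concentrate Here"      # important, underperforming
    elif imp >= imp_bar and perf >= perf_bar:
        quadrants[name] = "Keep Up the Good Work"  # important, performing well
    elif imp < imp_bar and perf < perf_bar:
        quadrants[name] = "Low Priority"           # unimportant, underperforming
    else:
        quadrants[name] = "Possible Overkill"      # unimportant, performing well
```

On this toy data, queue time lands in "Concentrate Here", which is exactly the quadrant operators are meant to act on.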
Abstract: Cognitive dissonance can be conceived both as a concept related to the tendency to avoid internal contradictions in certain situations, and as a higher-order theory about information processing in the human mind. In recent decades, the latter sense has been strongly surpassed by the former, as nearly all experiments on the matter discuss cognitive dissonance as an output of motivational contradictions. In that sense, the question remains: is cognitive dissonance a process intrinsically associated with the way the mind processes information, or is it caused by such specific contradictions? Objective: To evaluate the effects of cognitive dissonance in the absence of rewards or any mechanisms to manipulate motivation. Method: To solve this question, we introduce a new task, the hypothetical social arrays paradigm, which was applied to 50 undergraduate students. Results: Our findings support the perspective that the human mind shows a tendency to avoid internal dissonance even when no rewards or punishments are involved. Moreover, our findings also suggest that this principle works outside the conscious level.
Abstract: Color image quantization (CQ) is an important problem in computer graphics and image processing. The aim of quantization is to reduce the number of colors in an image with minimum distortion. Clustering is a widely used technique for color quantization: all colors in an image are grouped into small clusters. In this paper, we propose a new hybrid approach for color quantization using the firefly algorithm (FA) and the K-means algorithm. The firefly algorithm is a swarm-based algorithm that can be used for solving optimization problems. The proposed method can overcome the drawbacks of both algorithms, such as the local-optimum convergence problem of K-means and the premature convergence of the firefly algorithm. Experiments on three commonly used images and the comparison results show that the proposed algorithm surpasses both the baseline K-means clustering and the original firefly algorithm.
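The K-means half of such a hybrid can be sketched as follows. In the paper's method the firefly algorithm would supply good initial cluster centers; this illustration simply uses random initialization, and the synthetic "image" and all names are our own.

```python
import numpy as np

def kmeans_quantize(pixels, k, n_iter=20, seed=0):
    """Quantize an (N, 3) float array of RGB pixels to k colors with K-means."""
    rng = np.random.default_rng(seed)
    # Random initialization; a firefly-style optimizer could seed this instead.
    centers = pixels[rng.choice(len(pixels), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each pixel to its nearest center (Euclidean distance in RGB).
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned pixels.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean(axis=0)
    return centers[labels], labels

# Synthetic "image": noisy pixels around three base colors.
rng = np.random.default_rng(1)
base = np.repeat(np.array([[200.0, 30, 30], [30, 200, 30], [30, 30, 200]]),
                 500, axis=0)
pixels = base + rng.normal(0, 10, base.shape)
quantized, labels = kmeans_quantize(pixels, k=3)
```

Random initialization is exactly the weakness the abstract targets: a poor draw can leave K-means in a local optimum, which is why a global seeder such as FA helps.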
Abstract: The classic problem of recovering arbitrary values of
a band-limited signal from its samples has an added complication
in software radio applications; namely, the resampling calculations
inevitably fold aliases of the analog signal back into the original
bandwidth. The phenomenon is quantified by the spur-free dynamic
range (SFDR). We demonstrate how a novel application of the Remez (Parks-
McClellan) algorithm permits optimal signal recovery and SFDR, far
surpassing state-of-the-art resamplers.
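The Parks-McClellan design at the heart of such a resampler can be sketched with SciPy's `remez`. The band edges, tap count and grid density below are illustrative choices, not the paper's design: the result is a generic equiripple low-pass prototype of the kind a polyphase resampler would use, whose stopband ripple bounds the alias spurs and hence the SFDR.

```python
import numpy as np
from scipy.signal import remez

# Equiripple low-pass prototype: passband to 0.40*Nyquist,
# stopband from 0.45*Nyquist (fs normalized to 2, so edges are in [0, 1]).
taps = remez(101, [0.0, 0.40, 0.45, 1.0], [1.0, 0.0], fs=2.0)

# Evaluate the frequency response on a dense grid.
H = np.fft.rfft(taps, 8192)
f = np.linspace(0.0, 1.0, len(H))          # frequency in units of Nyquist
stopband = np.abs(H[f >= 0.45])
passband = np.abs(H[f <= 0.40])
sfdr_db = -20.0 * np.log10(stopband.max())  # alias spurs sit below this level
```

Tightening the transition band or adding taps trades filter cost against SFDR, which is the design knob the abstract's optimality claim concerns.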
Abstract: UWB is a very attractive technology for many applications. It provides many advantages, such as fine resolution and high power efficiency. Our interest in the current study is the use of the UWB radar technique in microwave medical imaging systems, especially for early breast cancer detection. The Federal Communications Commission (FCC) has allocated the frequency band of 3.1 to 10.6 GHz for this purpose. In this paper we propose a UWB bowtie slot antenna with enhanced bandwidth. The effects of varying the geometry of the antenna on its performance and bandwidth are studied. The proposed antenna is simulated in CST Microwave Studio. Details of the antenna design and simulation results, such as return loss and radiation patterns, are discussed in this paper. The final antenna structure exhibits good UWB characteristics and surpasses the bandwidth requirements.
Abstract: This paper presents a review on published literature
and experimental works on local sands for possible use as proppant,
specifically those from the Terengganu coastal area. This includes examination of the characteristics of sand samples and selection of experiments for proppant testing. Sand samples from identified areas were tested for particle size distribution, density, roundness and sphericity, turbidity and mineralogy. Results from the sand samples were compared against the proppant specifications set by API RP 56 and against selected commercial proppants. The present study found that the size distribution, sphericity, turbidity and bulk density of Terengganu sands are on par with some commercial proppants. Nevertheless, the Terengganu sand samples do not fully meet the roundness required for use as proppant.
Abstract: Deformable active contours are widely used in computer vision and image processing applications for image segmentation, especially in biomedical image analysis. The active contour, or "snake", deforms towards a target object under the control of internal, image and constraint forces. However, if the contour is initialized with too few control points, there is a high probability of bypassing the sharp corners of the object during deformation of the contour. In this paper, a new technique is proposed to construct the initial contour by incorporating prior knowledge of significant corners of the object detected using the Harris operator. This reconstructed contour then deforms, attracting the snake towards the targeted object without missing the corners. Experimental results with several synthetic images show the ability of the new technique to handle sharp corners with higher accuracy than traditional methods.
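The Harris corner step that seeds the initial contour can be sketched as follows. This is a generic Harris implementation on a synthetic square, not the authors' code; the smoothing window and sensitivity constant k are assumed values.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def harris_response(img, window=3, k=0.04):
    """Harris corner response R = det(M) - k*trace(M)^2, where M is the
    structure tensor of image gradients smoothed over a square window."""
    iy, ix = np.gradient(img.astype(float))   # gradients along rows, columns
    ixx = uniform_filter(ix * ix, window)
    iyy = uniform_filter(iy * iy, window)
    ixy = uniform_filter(ix * iy, window)
    det = ixx * iyy - ixy * ixy
    trace = ixx + iyy
    return det - k * trace * trace

# Synthetic image: a bright square whose four corners the snake must not miss.
img = np.zeros((64, 64))
img[20:40, 20:40] = 1.0
R = harris_response(img)
y, x = np.unravel_index(R.argmax(), R.shape)   # strongest corner candidate
```

Thresholding R (rather than taking just the maximum) would yield all four corner seeds for the initial contour.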