Abstract: A procedure for the preparation of clarified pawpaw
juice was developed. About 750 ml of pawpaw pulp was measured into
each of two 1-litre measuring cylinders, A and B, heated to 40°C and
cooled to 20°C. 30 ml of pectinase was added to cylinder A, while
30 ml of distilled water was added to cylinder B. The enzyme-treated
sample (A) was allowed to digest for 5 hours, after which it was heated
to 90°C for 15 minutes to inactivate the enzyme. The heated sample
was cooled and filtered through a muslin cloth
to obtain the clarified pawpaw juice. The juice was filled into 100 ml
plastic bottles, pasteurized at 95°C for 45 minutes, cooled and stored
at room temperature. The sample treated with 30 ml of distilled water
underwent the same process. The freshly pasteurized samples were
analyzed for specific gravity, titratable acidity, pH, sugars and
ascorbic acid. The remaining samples were stored for 2 weeks and
the analyses repeated. The freshly pasteurized and stored samples
differed in pH and ascorbic acid levels, and the sample treated
with pectinase yielded a higher volume of juice than that treated
with distilled water.
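The titratable acidity reported above is conventionally obtained by NaOH titration. As a hedged sketch of the standard calculation (the abstract gives neither the formula nor the reference acid; expressing the result as citric acid with a 0.064 g/meq factor is an assumption):

```python
def titratable_acidity(v_naoh_ml, n_naoh, sample_ml, meq_factor=0.064):
    """Percent titratable acidity from an NaOH titration.

    Standard formula: %TA = (V_NaOH * N_NaOH * meq * 100) / V_sample.
    meq_factor = 0.064 g/meq assumes the acidity is expressed as citric
    acid; the abstract does not state which acid was used.
    """
    return v_naoh_ml * n_naoh * meq_factor * 100 / sample_ml
```

For example, a 10 ml juice sample requiring 10 ml of 0.1 N NaOH would give 0.64% titratable acidity as citric acid.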
Abstract: Recently there has been a growing interest in the field
of bio-mimetic robots that resemble the behaviors of an insect or an
aquatic animal, among many others. One of various bio-mimetic robot
applications is to explore pipelines, spotting any troubled areas or
malfunctions and reporting its data. Moreover, the robot is able to
prepare for and react to any abnormal routes in the pipeline. Special
types of mobile robots are necessary for the pipeline monitoring tasks.
In order to move effectively along a pipeline, the robot's movement
resembles that of insects or crawling animals. When situated in
massive pipelines with complex routes, the robot places fixed sensors
in several important spots in order to complete its monitoring. This
monitoring task is to prevent a major system failure by preemptively
recognizing any minor or partial malfunctions. Areas not covered by
fixed sensors usually cannot be observed and examined in real
time, and are thus dependent on periodic offline
monitoring. This paper proposes a monitoring system that is able to
monitor the entire area of a pipeline, both with and without fixed
sensors, by using the bio-mimetic robot.
Abstract: A new multi-inner-stage (MIS) cyclone was designed to
remove acidic gases and fine particles produced by the electronics
industry. To characterize the gas flow in the MIS cyclone, pressure and
velocity distributions were calculated using a CFD program. In addition,
the flow loci of fine particles and the particle removal efficiency were
analyzed by the Lagrangian method. In this study, removal efficiency
was highest when the outlet pressure condition was −100 mmAq.
Abstract: Proper management of residues originating from
industrial activities is considered one of the serious challenges
faced by industrial societies, owing to their potential hazards to the
environment. Common disposal methods for industrial solid wastes
(ISWs) encompass various combinations of individual management
options, i.e. recycling, incineration, composting, and sanitary
landfilling. The procedure used to evaluate and nominate the
best practical methods should be based on environmental, technical,
economic, and social assessments. In this paper an environmental-technical
assessment model is developed using the analytic network
process (ANP) to facilitate decision making for ISWs
generated in Gilan province, Iran. Using the results of surveys
performed on industrial units located in Gilan, the various groups of
solid wastes in the research area were characterized, and four
different ISW management scenarios were studied. The evaluation
was conducted using the above-mentioned model in the
Super Decisions software (version 2.0.8) environment. The results
indicate that the best ISW management scenario for Gilan province
consists of recycling the metal industries' residues; composting the
putrescible portion of the ISWs; combusting paper, wood, fabric and
polymeric wastes with energy recovery in the incineration
plant; and finally landfilling the rest of the waste stream together
with rejected materials from the recycling and compost production plants
and ashes from the incineration unit.
Abstract: The phenomenon of global warming or climate
change has led to many environmental issues including higher
atmospheric temperatures, intense precipitation, increased
greenhouse gaseous emissions and increased indoor discomfort.
Studies have shown that bringing nature to the roof, such as
constructing a green roof or implementing a high-reflective roof, may
have a positive impact in mitigating the effects of global warming and
in increasing the thermal comfort sensation inside buildings. However,
no study has been conducted in Malaysia to compare both types of
passive roof treatment for increasing thermal comfort in
buildings. Therefore, this study investigates the effect
of a green roof and a white-painted roof as passive roof treatments in
improving the indoor comfort of Malaysian homes. The study uses an
experimental approach in which temperature measurements
are conducted on a case study building. Measurements of the
outdoor and indoor environments were conducted on the flat roof
with the two types of roof treatment, namely the green roof and the
white roof. Measurements of the existing black bare roof were also
conducted to act as a control for this study.
Abstract: The zero-inflated strict arcsine model is a newly developed
model found to be appropriate for modeling overdispersed
count data. In this study, we extend the zero-inflated strict arcsine model
to a zero-inflated strict arcsine regression model by taking into
consideration the extra variability caused by excess zeros and
covariates in count data. The maximum likelihood method is
used to estimate the parameters of this zero-inflated strict arcsine
regression model.
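The strict arcsine pmf itself is not given in the abstract; as a hedged illustration of the zero-inflation mechanism and of the likelihood being maximized, the sketch below uses a zero-inflated Poisson as a stand-in distribution (the (pi, lam) parametrization and the crude grid-search fit are illustrative assumptions, not the authors' method):

```python
import math

def zip_loglik(counts, pi, lam):
    """Log-likelihood of a zero-inflated Poisson model.

    P(Y=0)   = pi + (1-pi) * exp(-lam)      (structural + sampling zeros)
    P(Y=y>0) = (1-pi) * exp(-lam) * lam^y / y!
    """
    ll = 0.0
    for y in counts:
        if y == 0:
            ll += math.log(pi + (1 - pi) * math.exp(-lam))
        else:
            ll += math.log(1 - pi) - lam + y * math.log(lam) - math.lgamma(y + 1)
    return ll

def fit_zip(counts, grid=50):
    """Crude maximum-likelihood fit by grid search over (pi, lambda)."""
    best = (None, None, -float("inf"))
    for i in range(1, grid):
        pi = i / grid
        for j in range(1, grid):
            lam = 5.0 * j / grid
            ll = zip_loglik(counts, pi, lam)
            if ll > best[2]:
                best = (pi, lam, ll)
    return best
```

In practice the regression extension links lam (and possibly pi) to covariates and maximizes the same kind of likelihood numerically.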
Abstract: In this paper, we present a simple circuit for
Manchester decoding that uses no complicated or
programmable devices. The circuit can decode transmitted
encoded data at 90 kbps; higher transmission rates can be
decoded if high-speed devices are used. We also present a new
method for extracting the embedded clock from Manchester data in
order to use it for serial-to-parallel conversion. All of our
experimental measurements were obtained by simulation.
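The abstract describes a hardware decoder; as a software sketch of the decoding rule such a circuit implements (assuming the IEEE 802.3 convention, where a low-to-high mid-bit transition encodes 1 and a high-to-low transition encodes 0):

```python
def manchester_decode(halfbits):
    """Decode a Manchester half-bit stream (IEEE 802.3 convention).

    Each data bit occupies two half-bit periods:
      '01' (low -> high transition) -> 1
      '10' (high -> low transition) -> 0
    A pair without a mid-bit transition is a coding violation.
    """
    if len(halfbits) % 2:
        raise ValueError("half-bit stream length must be even")
    bits = []
    for i in range(0, len(halfbits), 2):
        pair = halfbits[i:i + 2]
        if pair == "01":
            bits.append(1)
        elif pair == "10":
            bits.append(0)
        else:
            raise ValueError(f"coding violation at half-bit {i}: {pair}")
    return bits

def manchester_encode(bits):
    """Inverse operation, useful for round-trip testing."""
    return "".join("01" if b else "10" for b in bits)
```

The guaranteed mid-bit transition in every pair is what makes clock extraction from the data stream possible.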
Abstract: In this paper, we propose a face recognition algorithm
using AAM and Gabor features. Gabor feature vectors, which are well
known to be robust to small variations in shape, scale,
rotation, distortion, illumination and pose, are popularly
employed as features in many object detection and
recognition algorithms. EBGM, which is prominent among face
recognition algorithms employing Gabor feature vectors, requires
localization of the facial feature points at which Gabor feature vectors are
extracted. However, the localization method employed in EBGM is based
on Gabor jet similarity and is sensitive to initial values. Incorrect
localization of facial feature points degrades the face recognition rate. AAM
is known to be successfully applied to localization of facial feature
points. In this paper, we devise a facial feature point localization
method that first roughly estimates the facial feature points using AAM
and then refines them using the Gabor jet similarity-based
localization method, with the initial points set to the rough AAM
estimates; we then propose a face recognition
algorithm that uses this localization method and
Gabor feature vectors. It is observed through
experiments that such a cascaded localization method based on both
AAM and Gabor jet similarity is more robust than the localization
method based on only Gabor jet similarity. It is also shown that the
proposed face recognition algorithm, using this localization
method and Gabor feature vectors, performs better than
conventional face recognition algorithms such as EBGM that use
Gabor jet similarity-based localization and Gabor feature vectors.
Abstract: So-called all-pass filter circuits are commonly
used in the fields of signal processing, control and measurement.
When connected to capacitive loads, these circuits tend to lose their
stability; an elaborate analysis of their dynamic behavior is
therefore necessary. Compensation methods intended to increase the
stability of such circuits are discussed in this paper, and the so-called
lead-lag compensation technique is treated in detail. For
the dynamic modeling, a two-port network model of the all-pass filter
is derived. The results of the model analysis show that
effective lead-lag compensation can be achieved solely by
optimizing the circuit parameters; additional electrical
components are therefore not needed to fulfill the stability
requirement.
Abstract: This paper presents a computational methodology
based on matrix operations for a computer-based solution to the
problem of performance analysis of software reliability models
(SRMs). A set of seven comparison criteria has been formulated to
rank the various non-homogeneous Poisson process software reliability
models proposed during the past 30 years for estimating software
reliability measures such as the number of remaining faults, the software
failure rate, and software reliability. Selection of the optimal SRM for
use in a particular case has been an area of interest for researchers in
the field of software reliability. Tools and techniques for software
reliability model selection found in the literature cannot be used with
a high level of confidence, as they use a limited number of model
selection criteria. A real data set from a mid-sized software project
taken from published papers is used to demonstrate the matrix method.
The result of this study is a ranking of SRMs based on the
permanent value of the criteria matrix formed for each model from
the comparison criteria. The software reliability model with the
highest permanent value is ranked number 1, and so on.
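The ranking hinges on evaluating the permanent of each model's criteria matrix. The paper's matrix construction is not reproduced here, but the permanent itself can be computed with Ryser's inclusion-exclusion formula, sketched below:

```python
from itertools import combinations

def permanent(a):
    """Permanent of a square matrix via Ryser's inclusion-exclusion formula.

    perm(A) = (-1)^n * sum over nonempty column subsets S of
              (-1)^|S| * prod_i (sum_{j in S} A[i][j])

    Like the determinant's Leibniz expansion but with all signs positive;
    unlike the determinant, no polynomial-time algorithm is known.
    """
    n = len(a)
    total = 0
    for size in range(1, n + 1):
        for cols in combinations(range(n), size):
            prod = 1
            for row in a:
                prod *= sum(row[j] for j in cols)
            total += (-1) ** size * prod
    return (-1) ** n * total
```

For a 2x2 matrix [[a, b], [c, d]] this gives ad + bc, e.g. permanent([[1, 2], [3, 4]]) is 10; criteria matrices of the size used for SRM comparison are easily handled despite the exponential cost.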
Abstract: Complex engineering design problems consist of
numerous factors of varying criticality. Treating fundamental features of the design and minor details alike results in an extensive
waste of time and effort. Design parameters should be introduced gradually, as appropriate, based on their significance in the
problem context. This motivates representing design parameters at multiple levels of an abstraction hierarchy. However, developing abstraction hierarchies is not well
understood. Our research proposes a novel hierarchical abstraction methodology to plan effective engineering designs and processes. It
provides a theoretically sound foundation to represent, abstract and stratify engineering design parameters and tasks according to causality and criticality. The methodology creates abstraction
hierarchies in a recursive, bottom-up approach that guarantees no
backtracking across any of the abstraction levels. It consists of three main phases: representation, abstraction, and layering into multiple hierarchical levels. The effectiveness of the
developed methodology is demonstrated on a design problem.
Abstract: Corporate credit rating prediction using statistical and
artificial intelligence (AI) techniques has been one of the attractive
research topics in the literature. In recent years, multiclass
classification models such as artificial neural network (ANN) or
multiclass support vector machine (MSVM) have become very
appealing machine learning approaches due to their good
performance. However, most of them have focused only on classifying
samples into nominal categories, so the unique characteristic of
credit ratings, ordinality, has seldom been considered in their
approaches. This study proposes new types of ANN and MSVM
classifiers, which are named OMANN and OMSVM respectively.
OMANN and OMSVM are designed to extend binary ANN or SVM
classifiers by applying ordinal pairwise partitioning (OPP) strategy.
These models can handle ordinal multiple classes efficiently and
effectively. To validate the usefulness of these two models, we applied
them to the real-world bond rating case. We compared the results of
our models to those of conventional approaches. The experimental
results showed that our proposed models improve classification
accuracy compared to typical multiclass classification techniques,
with reduced computational resources.
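As a hedged sketch of the ordinal pairwise partitioning idea (one binary "label > k" problem per ordinal split, with the predicted class given by the number of splits passed), the code below uses a trivial one-dimensional threshold stump as a stand-in for the binary ANN/SVM base learners; the OMANN/OMSVM details are not reproduced:

```python
def train_threshold(xs, ys):
    """Fit a 1-D stump: predict positive when x > t. Hypothetical
    stand-in for the binary ANN/SVM base learner in OPP."""
    best_t, best_acc = xs[0], -1.0
    for t in sorted(xs):
        acc = sum((x > t) == y for x, y in zip(xs, ys)) / len(xs)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def opp_fit(xs, labels, n_classes):
    """Ordered partitioning: one binary problem 'label > k' per split k,
    for k = 0 .. n_classes - 2."""
    return [train_threshold(xs, [int(l > k) for l in labels])
            for k in range(n_classes - 1)]

def opp_predict(x, thresholds):
    """Predicted ordinal class = number of 'greater-than' splits passed."""
    return sum(x > t for t in thresholds)
```

The point of the decomposition is that the order of the classes is built into the set of binary problems, which a plain nominal multiclass classifier ignores.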
Abstract: The output beam quality of a laser oscillating on multiple
transverse modes is relatively poor. In order to obtain better beam quality, one
may place an aperture inside the laser resonator so that individual
transverse modes can be selected. We have selected various
transverse modes both by simulation and by experiment. By
inserting a circular aperture inside the diode end-pumped pulsed
Nd:YAG laser resonator, we have obtained the TEM00, TEM01
and TEM20 modes and have studied which parameters can change the mode
shape. We have then determined the beam quality factor of the TEM00
Gaussian mode.
Abstract: This paper presents a Geographic Information System (GIS) approach for qualifying and monitoring broadband lines in an efficient way. The interpolation methodology used is the Delaunay Triangular Irregular Network (TIN). The method is applied to a case study at an ISP in Greece monitoring 120,000 broadband lines.
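Once the Delaunay triangulation is built, a TIN interpolates linearly within each triangle via barycentric coordinates; a minimal sketch of that interpolation step (the triangulation itself and the paper's line-quality attributes are omitted):

```python
def tin_interpolate(p, tri, values):
    """Linearly interpolate a value at point p inside a TIN triangle.

    tri: three (x, y) vertices; values: measured value at each vertex.
    Uses barycentric coordinates, the standard TIN interpolant.
    """
    (x1, y1), (x2, y2), (x3, y3) = tri
    px, py = p
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (px - x3) + (x3 - x2) * (py - y3)) / det
    w2 = ((y3 - y1) * (px - x3) + (x1 - x3) * (py - y3)) / det
    w3 = 1.0 - w1 - w2
    return w1 * values[0] + w2 * values[1] + w3 * values[2]
```

At a vertex the interpolant reproduces that vertex's value exactly; at the centroid it returns the mean of the three vertex values.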
Abstract: Concerns about low levels of children's physical activity and motor skill development prompted the Ministry of Education to trial a physical activity pilot project (PAPP) in 16 New Zealand primary schools. The project comprised professional development and training in physical education for lead teachers and introduced four physical activity coordinators to liaise with the pilot schools and increase physical activity opportunities in them. A survey of generalist teachers (128 at baseline, 155 post-intervention) from these schools looked at timetabled physical activity sessions and issues related to teaching physical education. The authors calculated means and standard deviations of the data relating to timetabled PE sessions and used a one-way analysis of variance to determine significant differences. Results indicated that time devoted to physical activity-related subjects increased significantly over the course of the intervention. Teachers reported improved confidence and competence, which resulted in quality physical education being delivered more often.
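The one-way analysis of variance used above reduces to an F statistic comparing between-group and within-group variance; a minimal sketch (the sample data below are hypothetical, not the study's dataset):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of sample groups.

    F = (between-group mean square) / (within-group mean square),
    with k - 1 and n - k degrees of freedom for k groups, n samples.
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

Groups with identical means give F = 0; well-separated group means give a large F, which is then compared against the F distribution for significance.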
Abstract: The aim of this paper is to characterize a larger set of
wavelet functions for implementation in a still image compression
system using the SPIHT algorithm. The paper discusses important
features of the wavelet functions and filters used in subband coding to
convert an image into wavelet coefficients in MATLAB. Image quality
is measured objectively using the peak signal-to-noise ratio (PSNR) and
its variation with bit rate (bpp). The effect of different parameters is
studied for different wavelet functions. Our results provide a good
reference for application designers of wavelet-based coders.
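The PSNR metric used here has a standard closed form; a minimal sketch for 8-bit images flattened to pixel lists (the SPIHT coder itself is not reproduced):

```python
import math

def psnr(original, reconstructed, max_val=255):
    """Peak signal-to-noise ratio in dB between two equal-size pixel lists.

    PSNR = 10 * log10(MAX^2 / MSE); higher means better fidelity.
    Returns infinity for a perfect reconstruction (MSE = 0).
    """
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")
    return 10 * math.log10(max_val ** 2 / mse)
```

Plotting this value against the coder's bit rate in bits per pixel (bpp) yields the rate-distortion curves used to compare wavelet functions.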
Abstract: The present work compares the performance of three
turbulence modeling approaches (based on the two-equation k-ε
model) in predicting erosive wear in multi-size dense slurry flow
through a rotating channel. All three turbulence models include a
rotation modification to the production term in the turbulent kinetic
energy equation. The two-phase flow field, obtained numerically
using a Galerkin finite element methodology, relates the local flow
velocity and concentration to the wear rate via a suitable wear model.
The wear models for both the sliding wear and the impact wear mechanisms
account for the particle size dependence. Predicted wear
rates using the three turbulence models are compared for a large
number of cases spanning such operating parameters as rotation rate,
solids concentration, flow rate, and particle size distribution.
The root-mean-square error between the FE-generated data and the
correlation between maximum wear rate and the operating
parameters is found to be less than 2.5% for all three models.
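The root-mean-square error quoted above has the standard definition; a minimal sketch (the FE data and the fitted correlation themselves are not reproduced):

```python
import math

def rmse(predicted, observed):
    """Root-mean-square error between paired samples:
    sqrt(mean of squared differences)."""
    n = len(predicted)
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)
```

Dividing this value by a representative wear-rate magnitude gives the percentage error figure reported in the abstract.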
Abstract: Transaction management is one of the most crucial requirements for enterprise application development, which often requires concurrent access to distributed data shared among multiple applications/nodes. Transactions guarantee the consistency of data records when multiple users or processes perform concurrent operations. The existing Fault Tolerance Infrastructure for Mobile Agents (FTIMA) provides fault-tolerant behavior in distributed transactions and uses a multi-agent system for distributed transaction processing. In the existing FTIMA architecture, data flows through the network and contains personal, private or confidential information. In banking transactions, a minor change to a transaction can cause a great loss to the user. In this paper we modify the FTIMA architecture to ensure that the user request reaches the destination server securely and without any change. We use triple DES for encryption/decryption and the MD5 algorithm for message integrity.
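As a hedged sketch of the message-integrity step (a digest appended to the message for the receiver to verify; the triple DES encryption layer and the FTIMA message format are omitted, and MD5 is retained only because the abstract names it; it is not collision-resistant by modern standards):

```python
import hashlib

def seal(message: bytes) -> bytes:
    """Append an MD5 digest so the receiver can detect modification.

    Illustrative only: in the described architecture the sealed message
    would also be encrypted with triple DES, which is omitted here.
    """
    return message + hashlib.md5(message).digest()

def verify(sealed: bytes) -> bytes:
    """Recompute the digest over the payload; raise if it was altered."""
    message, digest = sealed[:-16], sealed[-16:]
    if hashlib.md5(message).digest() != digest:
        raise ValueError("message integrity check failed")
    return message
```

A tampered payload (e.g. an altered transaction amount) no longer matches the appended 16-byte digest and is rejected.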
Abstract: The challenge for software development houses in
Bangladesh is to find a path that uses minimal process rather than gigantic CMMI- or ISO-type practices and process areas. Small and medium-sized
organizations in Bangladesh want to ensure minimal
basic Software Process Improvement (SPI) in day-to-day operational
activities, in the expectation that these basic practices will help realize their companies' improvement goals. This paper focuses on the key issues in basic software practices for small and medium-sized software
organizations that cannot afford CMMI, ISO, ITIL and similar compliance certifications. The research also suggests a basic software process practice model for Bangladesh and maps our suggestions to international best practice. In today's
competitive IT world, small and medium-sized software companies require collaboration and
strengthening to integrate into the global IT scenario. This research performed investigations and
analysis of project life cycles, current good practices, effective
approaches, and the realities and pain points of practitioners. We carried out
reasoning, root cause analysis, comparative analysis of various
approaches, methods and practices, and justifications of CMMI against real life. We avoided reinventing the wheel, focusing instead on a minimal set of
practices that will ensure dignified satisfaction between
organizations and software customers.
Abstract: The world's largest Prestressed Concrete Cylinder
Pipe (PCCP) water supply project suffered a series of pipe failures
between 1999 and 2001. This led the Man-Made River
Authority (MMRA), the authority in charge of the implementation
and operation of the project, to setup a rehabilitation plan for the
conveyance system while maintaining the uninterrupted flow of
water to consumers. At the same time, MMRA recognized the need
for a long term management tool that would facilitate repair and
maintenance decisions and enable taking the appropriate preventive
measures through continuous monitoring and estimation of the
remaining life of each pipe. This management tool is known as the
Pipe Risk Management System (PRMS) and is now in operation at
MMRA. Both the rehabilitation plan and the PRMS require the
availability of complete and accurate pipe construction and
manufacturing data.
This paper describes a systematic approach to the collection,
analysis, evaluation and correction of the construction and
manufacturing data files of the phase I pipes, which are the platform for
the PRMS database and any other related decision support system.