Abstract: The study was conducted to evaluate the quality
characteristics of cookies produced from sweet potato-fermented
soybean flour. Cookies were subjected to proximate and sensory
analysis to determine the acceptability of the product. Protein, fat and
ash increased as the proportion of soybean flour increased, ranging
from 13.8-21.7%, 1.22-5.25% and 2.20-2.57%, respectively. The crude fibre
content was within the range of 3.08-4.83%. The moisture content of
the cookies decreased from 3.42% to 2.13% as the soybean flour
content increased: cookies produced from whole sweet potato flour had
the highest moisture content (3.42%), while those with 30%
substitution had the lowest (2.13%). A nine-point hedonic scale was used
to evaluate the organoleptic characteristics of the cookies. The
sensory analysis indicated no significant difference among the
cookies produced, even when compared with the control (100% sweet
potato) cookies. The cookies with 20% soybean flour substitution
ranked highest in overall acceptability.
Abstract: This paper focuses on local grid refinement using a
nested grid technique. A Cartesian grid numerical method is
developed for simulating unsteady, viscous, incompressible flows
with complex immersed boundaries. A finite volume method is used in
conjunction with a two-step fractional-step procedure. The key aspects
that need to be considered in developing such a nested grid solver are
the imposition of interface conditions at the inter-block boundaries
and the accurate discretization of the governing equations in cells
that have an inter-block boundary as a control surface. A new interpolation procedure is
presented which allows systematic development of a spatial
discretization scheme that preserves the spatial accuracy of the
underlying solver. The present nested grid method has been tested on
two numerical examples to examine its performance on two-dimensional
problems. The numerical examples include flow past a
circular cylinder symmetrically installed in a channel and flow past
two circular cylinders with different diameters. The numerical
experiments demonstrate the solver's ability to simulate flows with
complicated immersed boundaries and show that the nested grid
approach can efficiently speed up the numerical solution.
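The interpolation procedure at the inter-block boundaries is not given in the abstract; as a minimal sketch of one common low-order choice (bilinear interpolation of coarse-grid values into fine-grid ghost cells; the paper's own scheme is designed to preserve the solver's accuracy and may differ), consider:

```python
import numpy as np

def bilinear_ghost_value(coarse, x, y, dx, dy):
    """Interpolate a node-centred coarse-grid field at point (x, y).
    dx, dy are coarse-grid spacings; node (i, j) sits at (i*dx, j*dy)."""
    i = int(x // dx)
    j = int(y // dy)
    s = x / dx - i          # local coordinates in [0, 1] within the cell
    t = y / dy - j
    return ((1 - s) * (1 - t) * coarse[i, j]
            + s * (1 - t) * coarse[i + 1, j]
            + (1 - s) * t * coarse[i, j + 1]
            + s * t * coarse[i + 1, j + 1])

# example: fill a fine-grid ghost cell from a smooth coarse field
nx = ny = 9
dx = dy = 0.25
X, Y = np.meshgrid(np.arange(nx) * dx, np.arange(ny) * dy, indexing="ij")
coarse = np.sin(X) * np.cos(Y)
print(bilinear_ghost_value(coarse, 0.6, 0.9, dx, dy))
```

Bilinear interpolation is only second-order accurate, which is why a higher-order interface treatment, as developed in the paper, is needed when the underlying solver is of higher order.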
Abstract: The purpose of this study is to derive parameter
estimates for the Lyman–Kutcher–Burman (LKB) normal tissue
complication probability (NTCP) model using analysis of scintigraphy
assessments and quality of life (QoL) measurement questionnaires for
the parotid gland (xerostomia). In total, 31 patients with
head-and-neck (HN) cancer were enrolled. Salivary excretion factor
(SEF) and EORTC QLQ-H&N35 questionnaire datasets were used for
the NTCP modeling to describe the incidence of grade 4 xerostomia.
Assuming n = 1, the fitted NTCP parameters are TD50 = 43.6 Gy and
m = 0.18 for the SEF analysis, and TD50 = 44.1 Gy and m = 0.11 for
the QoL measurements. The SEF and QoL datasets validate the
Quantitative Analyses of Normal Tissue Effects in the Clinic
(QUANTEC) guidelines well, resulting in negative predictive values
(NPVs) of 100% for both datasets, and suggest that the QUANTEC
25/20 Gy gland-sparing guidelines are suitable for clinical use in
the HN cohort to effectively avoid xerostomia.
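For reference, the LKB model fitted above has the standard form below; with n = 1, the generalized effective dose reduces to the mean gland dose:

```latex
\mathrm{NTCP} = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{t} e^{-x^{2}/2}\,dx,
\qquad
t = \frac{D_{\mathrm{eff}} - \mathrm{TD}_{50}}{m \cdot \mathrm{TD}_{50}},
\qquad
D_{\mathrm{eff}} = \Bigl( \sum_{i} v_{i} D_{i}^{1/n} \Bigr)^{n}
```

Here TD50 is the uniform dose giving a 50% complication probability and m sets the steepness of the dose response; with the fitted SEF values, a mean parotid dose of 43.6 Gy thus corresponds to a 50% risk of grade 4 xerostomia.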
Abstract: A method to experimentally determine the melting
rate, rm, and the volumetric heat transfer coefficient, αv
(W/(m³·K)), for convective melting in a fixed bed of particles under
an adiabatic regime is established in this paper. The method consists
in determining the melting rate by measuring the fixed-bed height
over time. Experimental
values of rm, α and αv were determined using cylindrical particles of
ice (d = 6.8 mm, h = 5.5 mm) and, as a melting agent, an aqueous NaCl
solution at a temperature of 283 K, at different values of the liquid
flow rate (11.63·10⁻⁶, 28.83·10⁻⁶ and 38.83·10⁻⁶ m³/s).
Our experimental results were compared with those existing in the
literature, and good agreement was observed for Re values higher than
50.
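The abstract derives rm from the measured bed height over time but does not give the working equation; a minimal sketch of one plausible reduction (slope dH/dt from a least-squares line, converted to a mass rate under an assumed bed cross-section and bulk density, both illustrative values) is:

```python
import numpy as np

# hypothetical measurements: fixed-bed height H (m) sampled over time t (s)
t = np.array([0.0, 60.0, 120.0, 180.0, 240.0])
H = np.array([0.200, 0.172, 0.145, 0.119, 0.092])

# slope dH/dt from a least-squares line through the measurements
dHdt = np.polyfit(t, H, 1)[0]      # m/s, negative while the bed melts

A_bed = 0.01       # m^2, assumed bed cross-section
rho_bed = 550.0    # kg/m^3, assumed bulk density of the packed ice bed

r_m = -rho_bed * A_bed * dHdt      # kg/s of ice melted
print(f"melting rate r_m = {r_m:.3e} kg/s")
```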
Abstract: Biomass is becoming a major renewable resource for
power generation; it is increasingly involved in environmentally
clean processes, and it is even used for biofuel preparation.
Hydrogen, another energy source, can be produced by a variety of
methods, including gasification of biomass. In this study, the
production of hydrogen by gasification of
biomass waste is examined. This work explores the production of a
gaseous mixture with high energy potential from the Amazonian species
known as copoazu, using a counter-flow fixed-bed bioreactor.
Abstract: This research is part of a broad program aimed at
advancing the science and technology involved in the rescue and
rehabilitation of oiled wildlife. One aspect of this research involves
the use of oil-sequestering magnetic particles for the removal of
contaminants from plumage, so-called "magnetic cleansing". This
treatment offers a number of advantages over conventional
detergent-based methods, including portability, which offers the
possibility of providing a "quick clean" to the animal upon first
encounter in the field. This could be particularly advantageous
when the contaminant is toxic and/or corrosive and/or where there
is a delay in transporting the victim to a treatment centre. The
method could also be useful as part of a stabilization protocol when
large numbers of affected animals are awaiting treatment. This
presentation describes the design, development and testing of a
prototype field kit for providing a "quick clean" to contaminated
wildlife in the field.
Abstract: State-based testing is frequently used in software testing. Test data generation is one of the key issues in software testing. A properly generated test suite may not only locate errors in a software system but also help reduce the high cost associated with software testing. It is often desired that test data, in the form of test sequences within a test suite, be generated automatically to achieve the required test coverage. This paper proposes an Ant Colony Optimization approach to test data generation for state-based software testing.
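The abstract does not detail the algorithm, so the following is only a minimal sketch of how ant colony optimization might generate test sequences over a state machine: ants walk the transition graph, pheromone accumulates on transitions that appear in high-coverage walks, and evaporation keeps the search exploring. The state machine, parameters and fitness (transition coverage) are illustrative assumptions:

```python
import random

# hypothetical state machine under test: state -> list of (event, next_state)
FSM = {
    "idle":  [("start", "run"), ("config", "setup")],
    "setup": [("done", "idle"), ("start", "run")],
    "run":   [("pause", "hold"), ("stop", "idle")],
    "hold":  [("resume", "run"), ("stop", "idle")],
}
TRANSITIONS = [(s, e, t) for s, outs in FSM.items() for e, t in outs]

def aco_test_sequence(n_ants=20, n_iter=30, seq_len=8, rho=0.1, q=1.0):
    """Simplified ACO: ants walk the FSM, transition coverage is the
    fitness, and pheromone biases later ants toward high-coverage walks."""
    tau = {tr: 1.0 for tr in TRANSITIONS}   # pheromone on each transition
    best_seq, best_cov = None, -1.0
    for _ in range(n_iter):
        for _ in range(n_ants):
            state, seq = "idle", []
            for _ in range(seq_len):
                options = [(state, e, t) for e, t in FSM[state]]
                tr = random.choices(options, [tau[o] for o in options])[0]
                seq.append(tr)
                state = tr[2]
            cov = len(set(seq)) / len(TRANSITIONS)
            if cov > best_cov:
                best_seq, best_cov = seq, cov
            for tr in set(seq):             # reinforce visited transitions
                tau[tr] += q * cov
        for tr in tau:                      # pheromone evaporation
            tau[tr] *= 1.0 - rho
    return best_seq, best_cov

seq, cov = aco_test_sequence()
print(f"transition coverage {cov:.0%}:", [e for _, e, _ in seq])
```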
Abstract: The European countries that during the past two
decades based their exchange rate regimes on a currency board
arrangement (CBA) are usually analysed from the perspective of the
stabilisation effects of a corner-solution choice. There is an open
discussion on the advantages and disadvantages of choosing a strict
exchange rate regime, although it should be seen as part of the
transition process towards monetary union membership. The
focus of the paper is on the Baltic countries, which, after two
decades of a rigid exchange rate arrangement and strongly affected by
the global crisis, are completing their path towards the euro zone.
Besides its
stabilising capacity, the CBA is a highly vulnerable regime, with
limited development potential. The rigidity of the exchange rate (and
monetary) system, despite the ensured credibility, does not leave
enough (or any) space for adjustment and/or active crisis
management. Still, the Baltics are in a process of recovery, with fiscal
consolidation measures combined with (painful and politically
unpopular) measures of internal devaluation. Today, two of them
(Estonia and Latvia) are members of the euro zone, having fulfilled
their ultimate transition targets but de facto exchanging one fixed
regime for another.
The paper analyses the challenges for the CBA in an unstable
environment, since fixed regimes rely on imported stability and
are sensitive to external shocks. With limited monetary instruments,
these countries turned to fiscal policies and used a
combination of internal devaluation and tax policy measures. Despite
their rather quick recovery, our second goal is to analyse the
long-term influence that these measures had on the national economies.
Abstract: For high-speed control of robots, a good knowledge of system modelling is necessary to obtain the desired bandwidth. In this paper, we present a Cartesian robot with a pan/tilt unit as the end-effector (5 DOF). This robot is driven by powerful direct-drive AC induction machines. The dynamic model, parameter identification and model validation of the robot are studied (including actuators). This work treats the Cartesian robot as coupled and nonlinear (contrary to the usual assumptions for this type of robot). The mechanical and control architecture proposed in this paper is suitable for industrial and research applications in which high speed, a well-known model and very high accuracy are required.
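The abstract mentions parameter identification but not the procedure; a common approach for rigid-body robot dynamics (assumed here, not necessarily the authors' method) exploits linearity in the inertial and friction parameters, tau = Y(q, q̇, q̈)θ, and estimates θ by least squares over recorded trajectories. A sketch for a single prismatic axis with assumed mass and viscous friction:

```python
import numpy as np

rng = np.random.default_rng(0)

# true (unknown) parameters of one prismatic axis: mass and viscous friction
m_true, b_true = 12.0, 3.5

# recorded trajectory samples (hypothetical): velocity and acceleration
t = np.linspace(0.0, 2.0, 200)
qd = np.sin(2 * np.pi * t)                # velocity
qdd = 2 * np.pi * np.cos(2 * np.pi * t)   # acceleration

# measured axis force with sensor noise: tau = m*qdd + b*qd
tau = m_true * qdd + b_true * qd + 0.2 * rng.standard_normal(t.size)

# regressor matrix Y such that tau = Y @ theta, theta = [m, b]
Y = np.column_stack([qdd, qd])
theta, *_ = np.linalg.lstsq(Y, tau, rcond=None)
print(f"estimated mass {theta[0]:.2f} kg, friction {theta[1]:.2f} N s/m")
```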
Abstract: This paper presents a low-cost design of a heartbeat monitoring device using reflectance-mode photoplethysmography (PPG). PPG is known for its simple construction, ease of use and cost effectiveness, and can provide information about changes in cardiac activity as well as aid earlier non-invasive diagnostics. The operation of the proposed device is divided into three phases. First is the detection of pulses through the fingertip. The signal is then passed to the signal processing unit for amplification, filtering and digitizing. Finally, the heart rate is calculated and displayed on the computer using a parallel port interface. The paper concludes with the prototyping of the device, followed by a verification procedure for the heartbeat signal obtained in a laboratory setting.
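The filter-then-count pipeline described above can be illustrated offline (the device itself does this in hardware and firmware; the sampling rate, band edges and peak spacing below are assumptions):

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 100.0  # sampling rate in Hz (assumed)

def heart_rate_bpm(ppg, fs=FS):
    """Estimate heart rate from a raw PPG trace by band-pass
    filtering (0.5-5 Hz covers roughly 30-300 bpm) and peak counting."""
    b, a = butter(2, [0.5 / (fs / 2), 5.0 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ppg)
    # peaks at least 0.3 s apart (caps the rate near 200 bpm)
    peaks, _ = find_peaks(filtered, distance=int(0.3 * fs))
    intervals = np.diff(peaks) / fs        # beat-to-beat intervals, s
    return 60.0 / intervals.mean()

# synthetic test: a 72 bpm pulse train plus baseline drift and noise
t = np.arange(0, 10, 1 / FS)
ppg = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 0.1 * t)
ppg += 0.05 * np.random.randn(t.size)
print(f"{heart_rate_bpm(ppg):.1f} bpm")   # close to 72 bpm
```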
Abstract: The optimal control problem for the viscoelastic melt
spinning process has not yet been reported in the literature. In this
study, an optimal control problem for a mathematical model of a
viscoelastic melt spinning process is considered. The Maxwell-Oldroyd
model is used to describe the rheology of the polymeric material the
fiber is made of. The extrusion velocity of the polymer at the spinneret
as well as the velocity and the temperature of the quench air and the
fiber length serve as control variables. A constrained optimization
problem is derived and the first-order optimality system is set up
to obtain the adjoint equations. Numerical solutions are carried out
using a steepest descent algorithm. A computer program in MATLAB
is developed for simulations.
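The abstract names steepest descent on the first-order optimality system but does not give the iteration; a generic sketch of the control update (shown here in Python rather than the authors' MATLAB, with a toy objective standing in for the adjoint-based gradient) looks like:

```python
import numpy as np

def steepest_descent(grad, u0, step=0.1, tol=1e-6, max_iter=500):
    """Generic steepest-descent loop: u <- u - step * grad J(u).
    In the melt-spinning setting, grad would come from solving the
    state equations forward and the adjoint equations backward."""
    u = np.asarray(u0, dtype=float)
    for _ in range(max_iter):
        g = grad(u)
        if np.linalg.norm(g) < tol:
            break
        u = u - step * g
    return u

# toy stand-in objective: J(u) = 0.5 * ||u - u_target||^2
u_target = np.array([1.0, -2.0, 0.5])
u_opt = steepest_descent(lambda u: u - u_target, np.zeros(3))
print(u_opt)  # converges to u_target
```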
Abstract: Due to the recovering global economy, enterprises are
increasingly focusing on logistics. Investing in logistic measures in
production generates large potential for achieving a good starting
position within a competitive field. Unlike during the global economic
crisis, enterprises are now challenged with investing available capital
to maximize profits. In order to be able to create an informed and
quantifiably comprehensible basis for a decision, enterprises need an
adequate model for logistically and monetarily evaluating measures
in production. The Collaborative Research Centre 489 (SFB 489) at the
Institute for Production Systems (IFA) developed a Logistic
Information System which provides support in making decisions and
is designed specifically for the forging industry. The aim of a
project that has been applied for is now to transfer this process in
order to develop a universal approach for logistically and monetarily
evaluating measures in production.
Abstract: Information is power. Geographical information science
is an emerging field that is advancing the development of knowledge
to further help in understanding the relationship of "place" with
other disciplines such as crime. The researchers used crime data for
the years 2004 to 2007 from the Baguio City Police Office to
determine the incidence and actual locations of crime hotspots.
A combined qualitative and quantitative research methodology was
employed through extensive fieldwork and observation, geographic
visualization with Geographic Information Systems (GIS) and Global
Positioning Systems (GPS), and data mining. The paper discusses
emerging geographic visualization and data mining tools and
methodologies that can be used to generate baseline data for
environmental initiatives such as urban renewal and rejuvenation.
The study was able to demonstrate that crime hotspots can be
computed and were seen to occur in some select places in the
Central Business District (CBD) of Baguio City. It was observed that
some characteristics of the hotspot places, such as physical design
and milieu, may play an important role in creating opportunities for
crime. A list
of these environmental attributes was generated. This derived
information may be used to guide the design or redesign of the urban
environment of the City in order to reduce crime and, at the same
time, improve it physically.
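The abstract does not specify how the hotspots were computed; one standard approach (an assumption here, not necessarily the study's GIS workflow) is kernel density estimation over incident coordinates, flagging the densest grid cells:

```python
import numpy as np
from scipy.stats import gaussian_kde

# hypothetical incident coordinates (projected units, e.g. metres)
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 50, (200, 2)),        # a dense cluster
                 rng.uniform(-500, 500, (100, 2))])  # background incidents

kde = gaussian_kde(pts.T)

# evaluate density on a grid and flag the top 5% of cells as hotspots
xs = ys = np.linspace(-500, 500, 100)
X, Y = np.meshgrid(xs, ys)
dens = kde(np.vstack([X.ravel(), Y.ravel()])).reshape(X.shape)
hot = dens >= np.quantile(dens, 0.95)
print(f"{hot.sum()} of {hot.size} grid cells flagged as hotspots")
```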
Abstract: Atrial Fibrillation is the most common sustained
arrhythmia encountered by clinicians. Because the atrial activity
(AA) waveform of atrial fibrillation is not visible to the human eye,
it is necessary to develop an automatic diagnosis system. The 12-lead
ECG is now available in hospitals and is well suited to applying
Independent Component Analysis (ICA) to estimate the AA period. In
this research, we also adopt a second-order blind identification
approach to transform the sources extracted by ICA into a more
precise signal, and we then use a frequency-domain algorithm for the
classification. In experiments, we obtain significant results on
clinical data.
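As a minimal illustration of the ICA step (using scikit-learn's FastICA as a stand-in; the paper's exact implementation and the second-order refinement are not shown, and the simulated signals are assumptions), the sketch below separates a fibrillatory-like source from a ventricular-like source mixed into 12 leads and identifies it by its dominant frequency:

```python
import numpy as np
from sklearn.decomposition import FastICA

# hypothetical multichannel ECG: 12 leads as noisy mixtures of 2 sources
rng = np.random.default_rng(2)
t = np.linspace(0, 10, 2000)
atrial = np.sin(2 * np.pi * 6.0 * t)                 # ~6 Hz fibrillatory wave
ventricular = np.sign(np.sin(2 * np.pi * 1.2 * t))   # dominant QRS-like source
S = np.column_stack([atrial, ventricular])
A = rng.standard_normal((2, 12))                     # mixing into 12 leads
X = S @ A + 0.05 * rng.standard_normal((t.size, 12))

# separate sources; the atrial component shows a dominant peak near 6 Hz
ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(X)
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
for i in range(2):
    spec = np.abs(np.fft.rfft(sources[:, i]))
    print(f"component {i}: dominant frequency {freqs[spec.argmax()]:.1f} Hz")
```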
Abstract: Segmentation techniques based on Active Contour
Models have benefited strongly from the use of prior information
during their evolution. Shape prior information is captured from
a training set and is introduced in the optimization procedure to
restrict the evolution into allowable shapes. In this way, the evolution
converges onto regions even with weak boundaries. Although
significant effort has been devoted to different ways of capturing
and analyzing prior information, very little thought has been given
to the way of combining image information with prior information.
This paper focuses on a more natural way of incorporating the
prior information in the level set framework. As a proof of concept,
the method is applied to hippocampus segmentation in T1-MR
images. Hippocampus segmentation is a very challenging task, due
to the multivariate surrounding region and the missing boundary
with the neighboring amygdala, whose intensities are identical. The
proposed method mimics the way humans segment and thus shows
improvements in segmentation accuracy.
Abstract: Although services play a crucial role in the economy,
productivity management in services has not gained as much importance
as it has in manufacturing. This paper presents key findings from the
literature
and practice. Based on an initial definition of complex services, seven
productivity concepts are briefly presented and assessed by relevant,
complex service specific criteria. Following the findings a complex
service productivity model is proposed. The novel model comprises
of all specific dimensions of service provision from both, the
provider-s as well as costumer-s perspective. A clear assignment of
identified value drivers and relationships between them is presented.
In order to verify the conceptual service productivity model a case
study from a project engineering department of a chemical plant
development and construction company is presented.
Abstract: The term private equity usually refers to any type of
equity investment in an asset in which the equity is not freely
tradable on a public stock market. Some researchers believe that
private equity contributed to the extent of the crisis and increased
the pace of its spread over the world. We do not agree with this.
On the other hand, we argue that during the economic recession
private equity might become an important source of funds for firms
with special needs (e.g. for firms seeking buyout financing, venture
capital, expansion capital or distressed debt financing). However,
over-regulation of private equity in both the European Union and
the US can slow down this specific funding channel to the
economy and deepen the credit crunch during global crises.
Abstract: In this paper, a new approach based on the extent of
friendship between the nodes is proposed, which makes the nodes
co-operate in an ad hoc environment. The extended DSR protocol is
tested under different scenarios by varying the number of malicious
nodes and the node moving speed. It is also tested by varying the
number of nodes used in the simulation. The results indicate that the
throughput achieved by the extended DSR is greater than that of the
standard DSR, and that the percentage of malicious drops over total
drops is lower for the extended DSR than for the standard DSR.
Abstract: The response surface methodology (RSM) is a
collection of mathematical and statistical techniques useful in the
modeling and analysis of problems in which the dependent variable
is influenced by several independent variables, in order to
determine the conditions under which these variables should operate
to optimize a production process. The RSM estimates a first-order
regression model and sets the search direction using the method of
maximum/minimum slope up/down (MMS U/D).
However, this method selects the step size intuitively, which can
affect the efficiency of the RSM. This paper assesses how the step
size affects the efficiency of this methodology. The numerical
examples are carried out through Monte Carlo experiments,
evaluating three response variables: the gain in the efficiency
function, the distance to the optimum and the number of iterations.
The simulation experiments showed that the efficiency gain function
and the distance to the optimum were not affected by the step size,
while the number of iterations was affected by both the step size and
the type of test function used.
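For illustration, a minimal sketch of the steepest-slope step in RSM under the assumptions of a fitted first-order model y = b0 + b1*x1 + b2*x2 (the 2^2 design, coefficients and noise below are hypothetical): the search direction is the coefficient vector, and the step size, the quantity studied above, scales the moves along it:

```python
import numpy as np

rng = np.random.default_rng(3)

# hypothetical 2^2 factorial experiment around the current operating point
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1]], dtype=float)
y = 50 + 3.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0, 0.2, 4)

# fit the first-order model y = b0 + b1*x1 + b2*x2
A = np.column_stack([np.ones(4), X])
b0, b1, b2 = np.linalg.lstsq(A, y, rcond=None)[0]

# steepest-ascent direction is (b1, b2); the step size is the tuning knob
direction = np.array([b1, b2])
direction /= np.linalg.norm(direction)
for step in (0.5, 1.0, 2.0):
    print(f"step {step}: next point {step * direction}")
```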
Abstract: A feed-forward, back-propagation Artificial Neural
Network (ANN) model has been used to forecast the occurrences of
wastewater overflows in a combined sewerage reticulation system.
This approach was tested to evaluate its applicability as a method
alternative to the common practice of developing a complete
conceptual, mathematical hydrological-hydraulic model for the
sewerage system to enable such forecasts. The ANN approach
obviates the need for an a priori understanding and representation of
the underlying hydrological-hydraulic phenomena in mathematical terms
and instead enables learning the characteristics of a sewer overflow
from the historical data.
The performance of the standard feed-forward, back-propagation
of error algorithm was enhanced by a modified data normalizing
technique that enabled the ANN model to extrapolate into the
territory that was unseen by the training data. The algorithm and the
data normalizing method are presented along with the ANN model
output results, which indicate good accuracy in the forecasted sewer
overflow rates. However, it was revealed that accurate
forecasting of the overflow rates is heavily dependent on the
availability of real-time flow monitoring at the overflow structure
to provide antecedent flow rate data. The ability of the ANN to
forecast the overflow rates without the antecedent flow rates (as is
the case with traditional conceptual reticulation models) was found to
be quite poor.
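The modified normalizing technique is not specified in the abstract; one common trick consistent with the description (an assumption, not necessarily the authors' method) is to map the training range to a sub-interval such as [0.1, 0.9] so that a sigmoid-output network retains headroom to represent values beyond the training extremes:

```python
import numpy as np

def fit_scaler(x, lo=0.1, hi=0.9):
    """Map the training range of x to [lo, hi] instead of [0, 1].
    Leaving headroom below lo and above hi lets a sigmoid-output
    network represent values beyond the training extremes, one way
    to allow the kind of extrapolation described above."""
    xmin, xmax = x.min(), x.max()
    scale = (hi - lo) / (xmax - xmin)
    to_net = lambda v: lo + (v - xmin) * scale
    from_net = lambda s: xmin + (s - lo) / scale   # inverse transform
    return to_net, from_net

# hypothetical overflow-rate training data (m^3/s)
train = np.array([0.0, 1.2, 3.4, 5.0])
to_net, from_net = fit_scaler(train)
print(to_net(train))     # spans 0.1..0.9
print(from_net(0.95))    # decodes a value above the training maximum
```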