Abstract: In this paper, a simple microfluidic device for monitoring algal cell behavior is proposed. An array of algal microwells is fabricated by PDMS soft lithography using an X-ray LIGA mold and placed on a glass substrate. The two layers, replicated PDMS and substrate, are attached by oxygen plasma bonding, creating a microchannel for the microfluidic system. Algal cells are loaded into the microfluidic device, which provides a positive charge on the bottom surface of the wells. The algal cells, which are negatively charged, are attracted to the bottoms of the wells by electrostatic interaction. By varying the concentration of algal cells in the loading suspension, it is possible to obtain wells containing a single cell. Liquid medium flows continuously over the wells during monitoring, providing nutrient and waste exchange between each well and the main flow. This device could help uncover the quantitative biology of algae, which is a key to effective and extensive algal utilization in biotechnology, the food industry, and bioenergy research and development.
Abstract: State-of-the-art methods for secondary structure (Porter, Psi-PRED, SAM-T99sec, Sable) and solvent accessibility (Sable, ACCpro) prediction use evolutionary profiles represented by the position-specific scoring matrix (PSSM). It has been demonstrated that evolutionary profiles are the most important features in the feature space for these predictions. Unfortunately, applying the PSSM leads to high-dimensional feature spaces that may create problems with parameter optimization and generalization. Several recently published studies suggested that applying feature extraction to the PSSM may improve secondary structure predictions. However, none of the top-performing methods considered here utilizes dimensionality reduction to improve generalization. In the present study, we used simple and fast feature selection methods (t-statistics, information gain) that allow us to decrease the dimensionality of the PSSM by 75% and, in the case of secondary structure prediction, improve generalization compared to the Sable server.
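As a rough sketch of the kind of t-statistic feature ranking described above (the feature matrix, binary labels, and the keep-25% threshold are stand-ins for illustration; the paper's actual PSSM pipeline may differ):

```python
import numpy as np

def t_scores(X, y):
    """Two-sample t-statistic per feature for a binary labelling y."""
    a, b = X[y == 0], X[y == 1]
    num = a.mean(axis=0) - b.mean(axis=0)
    den = np.sqrt(a.var(axis=0, ddof=1) / len(a) + b.var(axis=0, ddof=1) / len(b))
    return np.abs(num / den)

def select_top(X, y, keep=0.25):
    """Keep the fraction `keep` of features with the largest |t|."""
    scores = t_scores(X, y)
    k = max(1, int(round(keep * X.shape[1])))
    idx = np.argsort(scores)[::-1][:k]      # indices of the top-k scores
    return np.sort(idx)
```

Information gain would rank the same features by entropy reduction of a discretized split instead of |t|; the selection interface stays identical.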
Abstract: This paper describes a newly designed decentralized nonlinear control strategy for a robot manipulator. The strategy, based on nonlinear state feedback theory and a decentralized concept, is developed to overcome the drawbacks of previous works, which relied on complicated intelligent control and sensors that were not cost-effective. The control methodology is derived in the sense of the Lyapunov theorem, so that the stability of the control system is guaranteed. The decentralized algorithm does not require angle and velocity information from the other joints. Each joint controller is implemented on a digital processor located near its actuator, making it possible to achieve good dynamics and modularity. Computer simulations have been conducted to validate the effectiveness of the proposed control scheme under possible uncertainties and different reference trajectories. The merit of the proposed control system is shown in comparison with a classical control system.
Abstract: Embedding sustainability in technological curricula has become a crucial factor for educating engineers with competences in sustainability. In 2008, the Technical University of Catalonia (UPC) designed the Sustainable Technology Excellence Program (STEP 2015) to ensure that sustainability is embedded successfully. This program takes advantage of the opportunity offered by the redesign, by 2010, of all Bachelor and Master degrees in Spain under the European Higher Education Area (EHEA) framework. The STEP program goals are: to design compulsory courses in each degree; to develop the conceptual base and identify reference models in sustainability for all specialties at UPC; to create an internal interdisciplinary network of faculty from all the schools; to initiate new transdisciplinary research activities in technology-sustainability-education; to spread the know-how attained; to achieve international scientific excellence in technology-sustainability-education; and to graduate the first engineers/architects of the new EHEA bachelor degrees with sustainability as a generic competence. Specifically, in this paper the authors explain their experience in leading the STEP program, and two examples are presented: the Industrial Robotics course and the curriculum of the School of Architecture.
Abstract: There are several approaches to solving the Quantitative Structure-Activity Relationship (QSAR) problem. These approaches are based either on statistical methods or on predictive data mining. Among the statistical methods, one should consider regression analysis, pattern recognition (such as cluster analysis, factor analysis and principal components analysis) or partial least squares. Predictive data mining techniques use neural networks, genetic programming, or neuro-fuzzy knowledge. These approaches have low explanatory capability or none at all. This paper attempts to establish a new approach to solving QSAR problems using descriptive data mining. This way, the relationship between the chemical properties and the activity of a substance would be comprehensibly modeled.
Abstract: In this research, a systematic investigation was carried out to determine the optimum conditions of an HDS (hydrodesulfurization) reactor. Moreover, a suitable model was developed for a rigorous RTO (real-time optimization) loop of the HDS process. A systematic experimental series was designed based on CCD (central composite design) and carried out in the related pilot plant to tune the developed model. The design variables in the experiments were temperature, LHSV, and pressure, while the hydrogen-to-fresh-feed ratio was held constant. The ranges of these variables were 320-380 °C, 1-2 hr-1, and 50-55 bar, respectively. A power-law kinetic model was also developed for our further research; its reaction order (the power of the reactant concentration), activation energy, and frequency factor were 1.4, 92.66 kJ/mol, and k0 = 2.7×10^9, respectively.
Abstract: Benchmarking cleaner production performance is an effective way of controlling pollution and reducing emissions in the coal-fired power industry. A benchmarking method using two-stage super-efficiency data envelopment analysis (DEA) for coal-fired power plants is proposed: first, the cleaner production performance of DEA-inefficient or weakly DEA-efficient plants is improved; then the benchmark is selected from the performance-improved power plants. An empirical study is carried out with survey data from 24 coal-fired power plants. The results show that in the first stage the performance of 16 plants is DEA-efficient while that of 8 plants is relatively inefficient. The target values for improving the DEA-inefficient plants are acquired by projection analysis. Efficient performance of all 24 power plants, and the benchmarking plant, is achieved in the second stage. The two-stage benchmarking method is practical for selecting the optimal benchmark in cleaner production for the coal-fired power industry and will continuously improve plants' cleaner production performance.
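For readers unfamiliar with DEA, a minimal input-oriented CCR efficiency computation in multiplier form can be sketched as follows (this is plain single-stage DEA on toy data, not the paper's two-stage super-efficiency variant):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency (multiplier form) of DMU `o`.

    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs).
    max u.Y[o]  s.t.  v.X[o] = 1,  u.Y[j] - v.X[j] <= 0,  u, v >= 0.
    """
    s, m = Y.shape[1], X.shape[1]                  # outputs, inputs
    c = np.concatenate([-Y[o], np.zeros(m)])       # linprog minimises -u.Y[o]
    A_ub = np.hstack([Y, -X])                      # u.Y[j] - v.X[j] <= 0
    b_ub = np.zeros(len(X))
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]   # v.X[o] = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=(0, None))
    return -res.fun
```

A plant is DEA-efficient when its score equals 1; scores below 1 mark the relatively inefficient units that the first stage would improve.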
Abstract: Reverse Monte Carlo (RMC) simulation is applied to the study of an aqueous electrolyte, LiCl·6H2O. On the basis of the available experimental neutron scattering data, RMC computes pair radial distribution functions in order to explore the structural features of the system. The results obtained include some unrealistic features. To overcome this problem, we use Hybrid Reverse Monte Carlo (HRMC), incorporating an energy constraint in addition to the constraints commonly derived from experimental data. Our results show good agreement between experimental and computed partial distribution functions (PDFs) as well as a significant improvement in the pair partial distribution curves. This kind of study can be considered a useful test of a given interaction model for conventional simulation techniques.
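The HRMC idea of adding an energy penalty to the usual χ²-based RMC acceptance rule can be illustrated with a deliberately tiny one-parameter model (the quadratic "misfit" and "energy" functions and the weight are invented for illustration, not the LiCl·6H2O system):

```python
import math
import random

def hrmc_accept(d_chi2, d_energy, w_e, rng):
    """HRMC Metropolis rule: accept with prob min(1, exp(-(dChi2/2 + w*dE)))."""
    cost = d_chi2 / 2.0 + w_e * d_energy
    return cost <= 0 or rng.random() < math.exp(-cost)

def run_hrmc(steps=20000, w_e=1.0, seed=1):
    rng = random.Random(seed)
    chi2 = lambda x: (x - 2.0) ** 2      # misfit to the "experimental" data
    energy = lambda x: (x - 3.0) ** 2    # interaction-model (energy) constraint
    x, trace = 0.0, []
    for _ in range(steps):
        x_new = x + rng.uniform(-0.5, 0.5)
        if hrmc_accept(chi2(x_new) - chi2(x),
                       energy(x_new) - energy(x), w_e, rng):
            x = x_new
        trace.append(x)
    return sum(trace[steps // 2:]) / (steps // 2)   # mean of second half
```

Plain RMC (w_e = 0) would drive the chain to the data-only optimum at x = 2; the energy constraint pulls the sampled configurations toward the physically sensible compromise, which is how HRMC suppresses unrealistic features.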
Abstract: The objective of this work was to investigate the flow properties of powdered infant formula samples. Samples were purchased at a local pharmacy and differed in composition. Lactose-free infant formula, gluten-free infant formula, and infant formulas containing dietary fibers and probiotics were tested and compared with a regular infant formula sample that did not contain any of these supplements. Particle size and bulk density were determined and their influence on flow properties was discussed. There were no significant differences in the bulk densities of the samples; therefore, a connection between flow properties and bulk density could not be established. Lactose-free infant formula showed flow properties different from those of the standard supplement-free sample. Gluten-free infant formula with added probiotic microorganisms and dietary fiber had the narrowest particle size distribution and exhibited the best flow properties. All the other samples exhibited the same tendency of decreasing compaction coefficient with increasing flow speed, which means they all became more free-flowing at higher flow speeds.
Abstract: We present a prototype interactive (hyper)map of strategic, tactical, and logistic options for supply chain management. The map comprises an anthology of options, broadly classified within the strategic spectrum of efficiency versus responsiveness, and according to logistic and cross-functional drivers. They are exemplified by cases in diverse industries. We seek to organize all this information and these ideas to help supply chain managers identify effective choices for specific business environments. The key and innovative linkage we introduce is the configuration of competitive forces. Instead of going through seemingly endless and isolated cases and wondering how one can borrow from them, we aim to provide a guide based on force comparisons. The premise is that best practices in a different industry facing similar forces may be a most productive resource in supply chain design and planning. A prototype template is demonstrated.
Abstract: The Paced Auditory Serial Addition Test (PASAT) has been used as a common research tool for neurological disorders such as multiple sclerosis. Recently, technology has let researchers introduce a visual version of the test, the Paced Visual Serial Addition Test (PVSAT). In this paper, a computerized version of these two tests is introduced. Besides interpreting the number of correct responses, the software also calculates each subject's reaction time. We hypothesize that paying attention to reaction time may be valuable. For this purpose, sixty-eight female and fifty-eight male normal subjects were enrolled in the study. We investigated the similarity between PASAT3 and PVSAT3 in the number of correct responses and in the new criterion (the average reaction time of each subject). The hypothesis that the two tests are similar was rejected (p-value = 0.000), which means that the two tests differ. No effect of sex was found, since the p-values for the difference between PASAT3 and PVSAT3 were the same for both sexes (p-value = 0.000), which means that male and female subjects performed the tests at the same level of performance. The new criterion shows a negative correlation with age, which suggests that aged normal subjects may give the same number of correct responses as young subjects but with more latent responses. This provides evidence for the importance of reaction time.
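The reported correlation between the average reaction time and age is an ordinary Pearson coefficient; a self-contained computation looks like this (the sample data below are illustrative, not the study's measurements):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)
```

A value near -1 for (age, reaction-time) pairs is what a strong negative correlation like the one reported would look like.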
Abstract: A noteworthy point in the advancement of Brain-Machine Interface (BMI) research is the ability to accurately extract features of brain signals and to classify them into targeted control actions with the simplest possible procedures, since the expected beneficiaries are disabled people. In this paper, a new feature extraction method combining adaptive band-pass filters and adaptive autoregressive (AAR) modelling is proposed and applied to the classification of right and left motor imagery signals extracted from the brain. The introduction of the adaptive band-pass filter improves the characterization of the autocorrelation functions of the AAR models, as it enhances and strengthens the EEG signal, which is noisy and stochastic in nature. Experimental results on the Graz BCI data set have shown that, with the proposed feature extraction method, LDA and SVM classifiers outperform other AAR approaches of the BCI 2003 competition in terms of mutual information (the competition criterion) and misclassification rate.
Abstract: Prickly pear (Opuntia spp.) fruit has received renewed interest since it contains a betalain pigment that gives an attractive purple colour for the production of juice. Prickly pear juice was prepared by homogenizing the fruit and treating the pulp with 48 g of pectinase from Aspergillus niger. Titratable acidity was determined by diluting 10 ml of prickly pear juice with 90 ml of deionized water and titrating to pH 8.2 with 0.1 N NaOH. Brix was measured using a refractometer, and ascorbic acid content was assayed spectrophotometrically. Colour variation was determined colorimetrically (Hunter L.a.b.). Hunter L.a.b. analysis showed that the red-purple colour of prickly pear juice was affected by the juice treatments, as indicated by low colour difference meter lightness (CDML*), hue, CDMa* and CDMb* values. It was observed that non-treated prickly pear juice had a high CDML* of 3.9 compared with the treated juices (range 3.29 to 2.14). The CDML* significantly (p
Abstract: Mining frequent tree patterns has many useful applications in XML mining, bioinformatics, network routing, etc. Most frequent subtree mining algorithms (e.g. FREQT, TreeMiner and CMTreeMiner) use the anti-monotone property in the candidate subtree generation phase. However, none of these algorithms has verified the correctness of this property for tree-structured data. In this research it is shown that anti-monotonicity does not generally hold when weighted support is used in tree pattern discovery. As a result, tree mining algorithms based on this property would probably miss some valid frequent subtree patterns in a collection of trees. In this paper, we investigate the correctness of the anti-monotone property for the problem of weighted frequent subtree mining. In addition, we propose W3-Miner, a new algorithm for the full extraction of frequent subtrees. The experimental results confirm that W3-Miner finds some frequent subtrees that the previously proposed algorithms are not able to discover.
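A minimal counterexample along these lines, under the assumption that weighted support counts embeddings rather than supporting trees (the paper's exact definition may differ): in a tree whose root A has two children labelled B, the single-node pattern A occurs once, but its superpattern A→B occurs twice, so occurrence-based support can grow as the pattern grows.

```python
# Toy tree: root 'A' (id 0) with two children labelled 'B' (ids 1, 2).
labels = {0: 'A', 1: 'B', 2: 'B'}
children = {0: [1, 2], 1: [], 2: []}

def occurrences_node(label):
    """Occurrence count of the single-node pattern `label`."""
    return sum(1 for l in labels.values() if l == label)

def occurrences_edge(parent, child):
    """Occurrence count of the two-node pattern parent -> child."""
    return sum(1 for p, kids in children.items() if labels[p] == parent
               for k in kids if labels[k] == child)
```

Since the superpattern's occurrence count (2) exceeds the subpattern's (1), pruning a pattern because its subpattern fell below the weighted-support threshold can discard valid frequent subtrees.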
Abstract: In this paper, the melting of a semi-infinite body caused by a moving laser beam has been studied. Because the Fourier heat transfer equation does not have sufficient accuracy at short times and large dimensions, a non-Fourier form of the heat transfer equation has been used. Because the beam moves in the x direction, the temperature distribution and the melt pool shape are not symmetric, so the problem is a transient three-dimensional one. Furthermore, thermophysical properties such as the heat conductivity coefficient, density, and heat capacity are functions of temperature and material state. The enthalpy technique, used for the solution of phase change problems, has been applied in an explicit finite volume form to the hyperbolic heat transfer equation. This technique has been used to calculate the transient temperature distribution in the semi-infinite body and the growth rate of the melt pool. To validate the numerical results, comparisons were made with experimental data. Finally, the results were compared with those of a similar problem solved using Fourier theory. The comparison shows the influence of the infinite speed of heat propagation assumed in Fourier theory on the temperature distribution and the melt pool size.
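To illustrate the enthalpy technique itself, in a deliberately reduced setting (1-D, classical Fourier conduction, constant properties, a fixed hot wall instead of a moving laser; not the paper's three-dimensional hyperbolic formulation), an explicit update might look like:

```python
def temperature(H, c=1.0, L=1.0, Tm=0.0):
    """Invert the enthalpy-temperature relation (per unit mass)."""
    if H < 0:
        return Tm + H / c          # solid
    if H <= L:
        return Tm                  # absorbing latent heat at the front
    return Tm + (H - L) / c        # liquid

def melt_1d(n=50, steps=2000, dx=0.02, dt=1e-4, k=1.0, rho=1.0,
            T_left=2.0, T_init=-1.0, c=1.0, L=1.0):
    """Explicit 1-D enthalpy update: rho * dH/dt = k * d2T/dx2."""
    H = [c * T_init] * n                       # start as sub-cooled solid
    for _ in range(steps):
        T = [temperature(h, c, L) for h in H]
        T_pad = [T_left] + T + [T[-1]]         # hot wall; insulated far end
        H = [H[i] + dt * k / (rho * dx ** 2) *
             (T_pad[i] - 2.0 * T_pad[i + 1] + T_pad[i + 2])
             for i in range(n)]
    return [temperature(h, c, L) for h in H]
```

The advantage sketched here is the one the abstract relies on: the phase boundary never has to be tracked explicitly, because cells whose enthalpy sits between 0 and L are automatically held at the melting temperature.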
Abstract: In this paper we present our results on the performance analysis of a multi-product manufacturing line. We study the influence of external perturbations, intermediate buffer content, and the number of manufacturing stages on the production tracking error of each machine in a multi-product line operated under a surplus-based production control policy. Starting with the analysis of a single machine with multiple production stages (one for each product type), we provide bounds on the production error of each stage. Then we extend our analysis to a line of multi-stage machines where, similarly, bounds on the production tracking error for each product type, as well as on the buffer content, are obtained. The performance of the closed-loop flow line model is illustrated in numerical simulations.
Abstract: This paper presents an analytical model to estimate the cost of an optimized design of a reinforced concrete isolated footing based on structural safety. Flexural and optimization formulas for square and rectangular footings are derived based on the ACI building code, material cost, and optimization. The optimization constraints consist of upper and lower limits on the depth and the area of steel. The footing depth and area of reinforcing steel are minimized to yield the optimal footing dimensions. The optimized material costs of the designed sections, covering concrete, reinforcing steel, and formwork, are computed. A total cost factor (TCF) and other cost factors are developed to generalize and simplify the calculation of footing material cost. Numerical examples are presented to illustrate the model's capability of estimating the material cost of a footing for a desired axial load.
Abstract: This paper introduces our first efforts in developing a new team for the RoboCup Middle Size Competition. Our robots use an omnidirectional mobile system with an omnidirectional vision system and a fuzzy control algorithm for navigation. The control architecture of the MRL middle-size robots is a three-layered architecture: Planning, Sequencing, and Executing. It also uses a blackboard system to achieve coordination among agents. Moreover, the architecture has minimum dependency on the low-level structure and provides a uniform protocol for interacting with the real robot.
Abstract: This paper presents a new and efficient approach for capacitor placement in radial distribution systems that determines the optimal locations and sizes of capacitors, with the objective of improving the voltage profile and reducing power loss. The solution methodology has two parts: in part one, loss sensitivity factors are used to select the candidate locations for capacitor placement; in part two, a new algorithm that employs the Plant Growth Simulation Algorithm (PGSA) is used to estimate the optimal size of the capacitors at the optimal buses determined in part one. The main advantage of the proposed method is that it does not require any external control parameters. Another advantage is that it handles the objective function and the constraints separately, avoiding the trouble of determining barrier factors. The proposed method is applied to 9-bus and 34-bus radial distribution systems, and the solutions obtained are compared with those of other methods. The proposed method outperforms the other methods in terms of the quality of the solution.
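A commonly used form of the loss sensitivity factor in this literature is, for the branch feeding bus q, dP_loss/dQ = 2·Q_eff·R/V², with candidate buses ranked by this value; a sketch with invented per-unit data (this is the general textbook formula, not necessarily the exact expression used in this paper):

```python
def loss_sensitivity(q_eff, r_branch, v_bus):
    """dP_loss/dQ for the branch feeding each bus: 2 * Q_eff * R / V^2.

    q_eff: effective reactive load beyond each bus (p.u.)
    r_branch: resistance of the branch feeding each bus (p.u.)
    v_bus: bus voltage magnitude (p.u.)
    """
    return [2.0 * q * r / v ** 2 for q, r, v in zip(q_eff, r_branch, v_bus)]

def candidate_buses(q_eff, r_branch, v_bus):
    """Bus indices ranked by descending loss sensitivity."""
    lsf = loss_sensitivity(q_eff, r_branch, v_bus)
    return sorted(range(len(lsf)), key=lambda i: lsf[i], reverse=True)
```

The top-ranked buses become the candidate locations handed to the PGSA sizing stage.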
Abstract: Low power consumption is a major constraint for battery-powered systems such as notebook computers and PDAs. In the past, specialists usually designed both specifically optimized equipment and code to address this concern. That approach worked for quite a long time; in this era, however, there is another significant constraint: time to market. To meet the power constraint while launching products in a shorter production period, object-oriented programming (OOP) has stepped into this field. Although everyone knows that OOP has much more overhead than assembly and procedural languages, the development trend still heads toward this new world, which contradicts the goal of low power consumption. Most prior power-related software research reported that OOP consumes many resources; however, as industry had to accept it for business reasons, no paper has yet addressed how to choose the best OOP practice within this power-limited boundary. This article is the first to try to specify and propose an optimized strategy for writing OOP software in energy-constrained environments, based on quantitative real-world results. The language chosen for the study is C# on .NET Framework 2.0, one of the popular OOP development environments. The recommendations obtained from this research form a roadmap that can help developers write code that balances time to market against battery life.