Abstract: Financial forecasting can be viewed as a signal processing problem. Of the many available ways to train the network, we use the Levenberg-Marquardt algorithm for error back-propagation to adjust the weights. Pre-processing the data reduces much of the large-scale variation to a smaller scale, lowering the variance of the training data.
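The Levenberg-Marquardt update named above can be sketched in a few lines. The following is a minimal illustration on a single tanh neuron with synthetic data, not the paper's forecasting network:

```python
import numpy as np

def lm_step(w, x, y, lam):
    """One Levenberg-Marquardt update for a single neuron y_hat = tanh(w1*x + w0)."""
    y_hat = np.tanh(w[1] * x + w[0])
    r = y_hat - y                              # residual vector
    dz = 1.0 - y_hat ** 2                      # derivative of tanh
    J = np.column_stack([dz, dz * x])          # Jacobian of residuals w.r.t. (w0, w1)
    H = J.T @ J + lam * np.eye(2)              # damped Gauss-Newton matrix
    return w - np.linalg.solve(H, J.T @ r)

def sse(w, x, y):
    return float(np.sum((np.tanh(w[1] * x + w[0]) - y) ** 2))

x = np.linspace(-2, 2, 50)
y = np.tanh(1.5 * x - 0.5)         # targets generated by known weights (w0=-0.5, w1=1.5)
w, lam = np.array([0.0, 0.5]), 1e-2
for _ in range(50):
    w_try = lm_step(w, x, y, lam)
    if sse(w_try, x, y) < sse(w, x, y):
        w, lam = w_try, lam * 0.5  # accept step, trust the Gauss-Newton direction more
    else:
        lam *= 2.0                 # reject step, fall back toward gradient descent
print(np.round(w, 3))              # recovers approximately [-0.5, 1.5]
```

Each step solves the damped normal equations (JᵀJ + λI)Δw = Jᵀr; λ shrinks when a step reduces the error and grows otherwise, blending gradient descent with Gauss-Newton.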
Abstract: Current research on the Semantic Web aims at making web pages meaningful to machines, and ontologies play a primary role in this effort. We believe that logic can make ontology languages such as OWL more expressive and efficient. In this paper we combine logic with OWL to reduce some of the disadvantages of this language: we extend OWL with logic and show how logic can satisfy our future expectations of an ontology language.
Abstract: In this paper, we explore the application of neural networks to the design of short-term temperature forecasting (STTF) systems for the city of Kermanshah in western Iran. We use the Multi-Layer Perceptron (MLP), an important neural network architecture, to model STTF systems. The MLP was trained and tested on ten years (1996-2006) of meteorological data. The results show that the MLP network has the minimum forecasting error and can be considered a good method for modeling STTF systems.
Abstract: Uncertainties in a serial production line affect production throughput and cannot be prevented in a real production line. The uncertain conditions can, however, be controlled with a robust prediction model. We therefore propose a hybrid model combining an autoregressive integrated moving average (ARIMA) with multiple polynomial regression to model the nonlinear relationship between production uncertainties and throughput. The uncertainties considered in this study are demand, break-time, scrap, and lead-time. The nonlinear relationship between production uncertainties and throughput is examined with quadratic and cubic regression models, whose adjusted R-squared values are 98.3% and 98.2%, respectively. We optimize the multiple quadratic regression (MQR) by modeling the time-series trend of the uncertainties with ARIMA. The resulting hybrid ARIMA-MQR model achieves a better adjusted R-squared of 98.9%.
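The hybrid idea, polynomial regression for the level plus a time-series model for the residual trend, can be sketched numerically. This is a toy illustration on synthetic data, with a single regressor and an AR(1) term standing in for the full ARIMA component; none of the numbers are from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic stand-in: throughput as a quadratic function of one uncertainty
# (demand) plus autocorrelated noise that the time-series part should pick up
demand = np.linspace(50, 150, 120)
shock = rng.normal(0, 2, demand.size)
noise = np.zeros_like(shock)
for t in range(1, shock.size):
    noise[t] = 0.7 * noise[t - 1] + shock[t]   # AR(1) noise process
throughput = 0.01 * demand**2 - 0.5 * demand + 30 + noise

def adj_r2(y, y_hat, p):
    """Adjusted R-squared with p regressors."""
    n = y.size
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - (ss_res / (n - p - 1)) / (ss_tot / (n - 1))

# quadratic regression on the uncertainty
X = np.column_stack([np.ones_like(demand), demand, demand**2])
beta, *_ = np.linalg.lstsq(X, throughput, rcond=None)
resid = throughput - X @ beta
r2_mqr = adj_r2(throughput, X @ beta, p=2)

# hybrid: an AR(1) fitted to the regression residuals corrects the next step
phi = np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1] ** 2)
hybrid = (X @ beta)[1:] + phi * resid[:-1]
r2_hybrid = adj_r2(throughput[1:], hybrid, p=3)
print(round(r2_mqr, 3), round(r2_hybrid, 3))   # the hybrid fit is tighter
```

Because the residuals are autocorrelated, the time-series correction recovers structure the regression alone leaves behind, which is exactly why the hybrid's adjusted R-squared exceeds the pure MQR's.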
Abstract: Dextran is a D-glucose polymer produced by Leuconostoc mesenteroides grown in a sucrose-rich medium. The organism was obtained from the Persian Type Culture Collection (PTCC) and transferred into MRS broth medium at 30°C and pH 6.8 for 24 h. After preparation of the inoculum, organisms were inoculated into five liquid fermentation media containing molasses, cheese whey, or different combinations of the two. After a set fermentation period, the produced dextran was separated and dried. Dextran yield was calculated, and significant differences between the media were observed. Furthermore, FT-IR analysis showed no significant differences in the structures of the dextrans produced.
Abstract: Atlantic herring (Clupea harengus) is an important commercial fish that is increasingly in demand for human consumption. It is therefore important to find good methods for monitoring the freshness of the fish in order to keep it at the best quality for human consumption. In this study, the fish was stored in ice for up to 2 weeks. Quality changes during storage were assessed by the Quality Index Method (QIM), quantitative descriptive analysis (QDA) and the Torry scheme; by texture measurements, namely puncture tests and Texture Profile Analysis (TPA) tests on a TA.XT2i texture analyzer; and by electronic nose (e-nose) measurements using a FreshSense instrument. The storage time of herring in ice could be estimated by QIM to within ± 2 days using 5 herring per lot. No correlation was found between instrumental texture parameters and storage time, or between sensory and instrumental texture variables. E-nose measurements could be used to detect the onset of spoilage.
Abstract: A petrol fuel station (PFS) poses potential hazards to the people, assets, environment and reputation of an operating company. Fire hazards, static electricity, and air pollution caused by aliphatic and aromatic organic compounds are major causes of accidents and incidents at fuel stations. Carelessness, maintenance and housekeeping activities, slips, trips and falls, transportation hazards, major and minor injuries, robbery and snake bites all have the potential to create unsafe conditions. The level of risk of these hazards varies by location and country, and the emphasis governments place on safety varies around the world: the safety records of developed countries are much better than the safety statistics of developing countries. No systematic approach is available for highlighting unsafe acts and unsafe conditions during the operation and maintenance of a fuel station. Fuel stations are among the most common facilities that contain flammable and hazardous materials, and because they operate continuously they pose various hazards to people, the environment and the assets of an organization. Controlling these hazards requires a specific approach. PFS operation is unique compared to other businesses; smooth operation demands the involvement of the operating company, the contractor and the operator group. This study addresses the hazard-contributing factors that can make PFS operation risky. One year of data was collected, 902 activities were analyzed, and comparisons were made to highlight the significant contributing factors. The study will help PFS outlet marketing companies make their fuel station operations safer, and will help health, safety and environment (HSE) professionals close the gaps related to safety matters at PFS.
Abstract: In this paper we propose a multi-agent architecture for web information retrieval that uses a fuzzy-logic-based result fusion mechanism. The model is designed in the JADE framework and takes advantage of the JXTA agent communication method to allow agents to communicate through firewalls and network address translators. This approach enables developers to build and deploy P2P applications through a unified medium and to manage agent-based document retrieval from multiple sources.
Abstract: Fast data retrieval is a need of users in any database application. This paper introduces a buffer-based query optimization technique in which queries are assigned weights according to their number of executions in a query bank. These queries and their optimized execution plans are loaded into the buffer at the start of the database application. For every query, the system searches for a match in the buffer and executes the stored plan without creating a new one.
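A minimal sketch can clarify the mechanism. The class below is hypothetical (the paper publishes no code): execution counts in a query bank act as weights, the heaviest queries are pre-planned at start-up, and later executions reuse buffered plans instead of re-optimizing:

```python
from collections import Counter

class PlanBuffer:
    """Weighted plan cache: frequently executed queries from the query bank
    are planned once at application start-up and reused afterwards."""

    def __init__(self, query_bank, capacity=2):
        weights = Counter(query_bank)              # weight = number of executions
        self.plans = {q: self._optimize(q)         # pre-load the heaviest queries
                      for q, _ in weights.most_common(capacity)}
        self.new_plans_created = 0

    def _optimize(self, query):
        return f"PLAN({query})"                    # stand-in for a real optimizer

    def execute(self, query):
        if query not in self.plans:                # miss: plan from scratch once
            self.plans[query] = self._optimize(query)
            self.new_plans_created += 1
        return self.plans[query]                   # hit: reuse the buffered plan

bank = ["SELECT a", "SELECT b", "SELECT a", "SELECT a", "SELECT c", "SELECT b"]
buf = PlanBuffer(bank, capacity=2)
buf.execute("SELECT a")            # pre-loaded, no new plan
buf.execute("SELECT c")            # not in the buffer, planned once
buf.execute("SELECT c")            # second execution reuses that plan
print(buf.new_plans_created)       # → 1
```

Only the cold query triggers planning, and only once; everything else runs on plans prepared before the first request arrives.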
Abstract: Because of the importance of energy, optimization of power generation systems is necessary. Gas turbine cycles are suitable for fast power generation, but their efficiency is rather low. To achieve higher efficiencies, several modifications are proposed, such as recovering heat from the exhaust gases in a regenerator, using an intercooler in a multistage compressor, and injecting steam into the combustion chamber. Even with these components, however, thermodynamic optimization of the gas turbine cycle is necessary. In this article, multi-objective genetic algorithms are employed for the Pareto-approach optimization of the Regenerative-Intercooling-Gas Turbine (RIGT) cycle. In multi-objective optimization, a number of conflicting objective functions are optimized simultaneously. The important objective functions considered for optimization are the entropy generation of the RIGT cycle (Ns), derived using exergy analysis and the Gouy-Stodola theorem, the thermal efficiency, and the net output power of the RIGT cycle. These objectives usually conflict with each other. The design variables consist of thermodynamic parameters such as the compressor pressure ratio (Rp), excess air in combustion (EA), turbine inlet temperature (TIT) and inlet air temperature (T0). In the first stage, single-objective optimization is investigated; the Non-dominated Sorting Genetic Algorithm (NSGA-II) is then used for multi-objective optimization. Optimization procedures are performed for two and three objective functions, and the results are compared for the RIGT cycle. To investigate the optimal thermodynamic behavior of two objectives, different sets, each including two of the output parameters as objectives, are considered individually, and a Pareto front is depicted for each set. The decision variables selected from these Pareto fronts yield the best possible combinations of the corresponding objective functions. No point on the Pareto front is superior to another on the same front, but all are superior to any other point. For the three-objective optimization, the results are given in tables.
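The core of NSGA-II, fast non-dominated sorting, can be sketched independently of the thermodynamic model. The code below ranks hypothetical (objective₁, objective₂) pairs, both to be minimized, into Pareto fronts; it is a generic illustration, not the study's implementation:

```python
def non_dominated_sort(points):
    """Fast non-dominated sorting (the core of NSGA-II) for minimization.
    Returns a list of fronts, each a list of indices into `points`."""
    def dominates(p, q):
        return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))
    n = len(points)
    dominated_by = [[] for _ in range(n)]   # S_p: the solutions p dominates
    dom_count = [0] * n                     # n_p: how many solutions dominate p
    for i in range(n):
        for j in range(n):
            if dominates(points[i], points[j]):
                dominated_by[i].append(j)
            elif dominates(points[j], points[i]):
                dom_count[i] += 1
    fronts = [[i for i in range(n) if dom_count[i] == 0]]
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:                # peel off the current front
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]

# e.g. a trade-off between entropy generation (minimize) and -power (minimize)
pts = [(1.0, 5.0), (2.0, 3.0), (3.0, 1.0), (2.5, 3.5), (4.0, 4.0)]
fronts = non_dominated_sort(pts)
print(fronts)   # → [[0, 1, 2], [3], [4]]
```

Within the first front no point dominates another, which is exactly the "no superiority among points on the Pareto front" property noted in the abstract.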
Abstract: Over half of total electricity consumption is used in buildings. Air-conditioning and electric lighting are the two main consumers of electricity in high-rise buildings. One way to reduce electricity consumption is to limit heat gain into buildings, thereby reducing the demand for air-conditioning during hot summer months, especially in hot regions. On the other hand, natural daylight can reduce the use of electricity for artificial lighting. In this paper, the factors effective in minimizing heat gain while achieving the required daylight are reviewed, since daylight is always accompanied by solar heat gain. Interactions between heat gain and daylight are also discussed through previous studies and the equations relating heat gain and daylighting, especially in high-rise buildings. As a result, the importance of a building's form and its components for energy consumption in buildings is clarified.
Abstract: This paper describes an intelligent control method for autonomous systems, called fuzzy control, used to correct the movement of a three-wheel omnidirectional robot when it misses its target. Fuzzy logic is especially advantageous for problems that cannot be easily represented by mathematical modeling because data are unavailable or incomplete, or the process is too complex. Such systems can easily be upgraded by adding new rules to improve performance or add new features. In many cases, fuzzy control can improve existing traditional controller systems by adding an extra layer of intelligence to the current control method. The fuzzy controller designed here is more accurate and flexible than traditional controllers. The project was carried out for the MRL middle-size soccer robot team.
Abstract: Physiological control of a left ventricular assist device (LVAD) is generally a complicated task due to diverse operating environments and patient variability. In this work, a tracking control algorithm based on sliding mode and feed-forward control for a class of discrete-time single-input single-output (SISO) nonlinear uncertain systems is presented. The controller was developed to track the reference trajectory to a set operating point without inducing suction in the ventricle. The controller regulates the estimated mean pulsatile flow Qp and the mean pulsatility index of pump rotational speed PIω generated from a model of the assist device. We recall the principles of sliding mode control theory and then combine the feed-forward control design with the sliding mode control technique to follow the reference trajectory. The uncertainty is bounded by its upper and lower limits. The controller was tested in a computer simulation covering two scenarios (preload and ventricular contractility). The simulation results demonstrate the effectiveness and robustness of the proposed controller.
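A hedged numeric sketch of the combination may help: the plant below is a generic scalar discrete-time system with a bounded disturbance, not the LVAD model, and the controller pairs a feed-forward (equivalent-control) term with a discrete sliding-mode reaching law in the spirit of the approach described:

```python
import math

# plant: x[k+1] = a*x[k] + b*u[k] + d[k], with d an unknown bounded disturbance
a, b = 0.9, 0.5
d_max = 0.1

def controller(x, r_next, s, q=0.8, eps=0.12):
    """Feed-forward (equivalent-control) term plus the discrete reaching law
    s[k+1] = (1-q)*s[k] - eps*sign(s[k]); choosing eps > d_max rejects d."""
    sign = (s > 0) - (s < 0)
    return (r_next - a * x + (1 - q) * s - eps * sign) / b

x, r = 5.0, 2.0                    # initial state and constant operating set point
for k in range(60):
    s = x - r                      # sliding surface: the tracking error
    u = controller(x, r, s)
    d = d_max * math.sin(0.1 * k)  # disturbance, bounded by d_max
    x = a * x + b * u + d
print(round(x - r, 3))             # error ends inside a small quasi-sliding band
```

Substituting the control law into the plant gives s[k+1] = (1-q)s[k] - eps·sign(s[k]) + d[k], so the error contracts until it enters a band whose width is set by eps and the disturbance bound, mirroring how the upper and lower boundary of the uncertainty enters the design.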
Abstract: In this paper we use data mining techniques to investigate factors that contribute significantly to the risk of acute coronary syndrome. The dependent variable is the diagnosis, a dichotomous variable indicating the presence or absence of disease. We apply binary logistic regression to the factors affecting the dependent variable. The data set was taken from two cardiac hospitals in Karachi, Pakistan. It contains sixteen variables in total, of which one is the dependent variable and the other 15 are independent. To improve the performance of the regression model in predicting acute coronary syndrome, a data reduction technique, principal component analysis, is applied; based on its results, we retain only 14 of the sixteen factors.
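The pipeline, PCA for data reduction followed by binary logistic regression, can be sketched on synthetic data (the patient data are not public; every number below is illustrative, including the choice of two retained components):

```python
import numpy as np

rng = np.random.default_rng(7)
# synthetic stand-in for the patient data: 200 cases, 6 risk factors,
# of which only the first two actually separate the two diagnosis classes
n = 200
y = rng.integers(0, 2, n)
X = rng.normal(0, 1, (n, 6))
X[:, 0] += 2.5 * y
X[:, 1] -= 2.0 * y

# principal component analysis: project onto the top-k right singular vectors
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T                        # reduced design matrix (k = 2)

# binary logistic regression fitted by plain gradient descent
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(Z @ w + b)))   # sigmoid of the linear predictor
    grad_w = Z.T @ (p - y) / n
    grad_b = np.mean(p - y)
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b
acc = np.mean((p > 0.5) == y)
print(round(acc, 2))                     # high accuracy despite the reduction
```

Because the class-separating directions carry most of the variance here, the two leading components preserve nearly all of the predictive signal, which is the rationale for dropping factors after PCA.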
Abstract: Static analysis of source code is used to audit web applications for vulnerabilities. In this paper, we propose a new algorithm that analyzes PHP source code to detect potential local file inclusion (LFI) and remote file inclusion (RFI) vulnerabilities. In our approach, we first define patterns that find functions which can be abused through unhandled user input. More precisely, we use regular expressions as a fast and simple method for defining vulnerability-detection patterns. Because inclusion functions can also be used safely, many false positives (FPs) can occur. The first cause of these FPs is that the function may not use a user-supplied variable as an argument, so we extract a list of user-supplied variables and use it to detect vulnerable lines of code. Moreover, since a vulnerability can spread among variables, for example through multi-level assignment, we also extract the hidden user-supplied variables. We use the resulting list to decrease the false positives of our method. Finally, as there are ways to neutralize the vulnerability of inclusion functions, we also define patterns to detect them and further decrease our false positives.
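The pattern-based first pass and the taint list can be sketched in a few lines. The regular expressions below are hypothetical simplifications written for this illustration (the paper's actual patterns are richer), and matching is line-oriented:

```python
import re

USER_SOURCES = re.compile(r"\$_(?:GET|POST|REQUEST|COOKIE)\b")
ASSIGN = re.compile(r"(\$\w+)\s*=\s*(.+?);")
INCLUDE = re.compile(r"\b(?:include|require)(?:_once)?\b\s*\(?\s*(.+?)\s*\)?\s*;")

def scan(php_source):
    """Flag lines whose include/require argument carries user-supplied data,
    propagating taint through simple multi-level assignments.
    (Substring matching of variable names is a deliberate simplification.)"""
    tainted = set()
    findings = []
    for lineno, line in enumerate(php_source.splitlines(), 1):
        m = ASSIGN.search(line)
        if m:
            rhs = m.group(2)
            if USER_SOURCES.search(rhs) or any(v in rhs for v in tainted):
                tainted.add(m.group(1))       # taint spreads via assignment
        m = INCLUDE.search(line)
        if m:
            arg = m.group(1)
            if USER_SOURCES.search(arg) or any(v in arg for v in tainted):
                findings.append(lineno)       # potential LFI/RFI
    return findings

php = """<?php
$page = $_GET['page'];
$path = $page . '.php';
include($path);
include('header.php');
"""
print(scan(php))  # → [4]
```

Here `$page` is tainted directly by `$_GET`, the taint spreads to `$path` through the assignment (the multi-level case), and only the `include` on line 4 is flagged; the constant include is not reported as a false positive.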
Abstract: This report focuses on the phase behavior of the polyethylene glycol (PEG) 4000/phosphate/guanidine hydrochloride/water system at different guanidine hydrochloride concentrations and pH values. The binodal of the systems was displaced toward higher concentrations of the components with increasing guanidine hydrochloride concentration. The partition coefficient of guanidine hydrochloride was near unity and increased with decreasing pH and increasing PEG/salt (%w/w) ratio.
Abstract: Speed estimation is one of the important and practical tasks in machine vision, robotics and mechatronics. The availability of high-quality, inexpensive video cameras and the increasing need for automated video analysis have generated a great deal of interest in machine vision algorithms. Numerous approaches to speed estimation have been proposed, so a classification and survey of these methods can be very useful. The goal of this paper is first to review and verify these methods. We then propose a novel algorithm for estimating the speed of a moving object using fuzzy concepts. There is a direct relation between the motion blur parameters and the object speed. In our new approach, we use the Radon transform to find the direction of the blurred image and fuzzy sets to estimate the motion blur length. The main benefit of this algorithm is its robustness and precision on noisy images. Our method was tested on many images over a wide range of SNR, with satisfactory results.
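As an illustration of the fuzzy part only (the Radon-transform direction step is omitted), the sketch below passes a crisp, possibly noisy blur-length measurement through overlapping triangular fuzzy sets and defuzzifies by weighted average; the membership parameters are hypothetical, not taken from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# hypothetical linguistic terms for motion blur length (in pixels):
# (a, b, c) of each triangle, and a representative crisp length
TERMS = {
    "short":  ((0, 5, 12), 5),
    "medium": ((8, 15, 22), 15),
    "long":   ((18, 25, 35), 25),
}

def estimate_blur_length(measurement):
    """Weighted-average (centroid-style) defuzzification: a noisy crisp
    measurement is smoothed across the overlapping fuzzy sets."""
    num = den = 0.0
    for (a, b, c), rep in TERMS.values():
        mu = tri(measurement, a, b, c)   # degree of membership in this term
        num += mu * rep
        den += mu
    return num / den if den else measurement

# a measurement between "short" and "medium" blends the two estimates
print(round(estimate_blur_length(11.0), 2))  # → 12.5
```

Because each measurement activates neighboring sets in proportion to its membership, small noise-driven fluctuations shift the output only gradually, which reflects the robustness on noisy images claimed for the fuzzy approach.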
Abstract: Using a neural network, we model an unknown function f for given input-output data pairs. The connection strength of each neuron is updated through learning. Repeated simulations of a crisp neural network produce different values of the weight factors, which are directly affected by changes in different parameters. We propose that for each neuron in the network, quasi-fuzzy weight sets (QFWS) can be obtained through repeated simulation of the crisp neural network. Such fuzzy weight functions may be applied where multivariate crisp input needs to be adjusted after iterative learning, as in claim amount distribution analysis. Since real data are subject to noise and uncertainty, QFWS may help simplify such complex problems. Furthermore, QFWS provide a good initial solution for training fuzzy neural networks with reduced computational complexity.
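One plausible reading of the construction can be sketched concretely: train the same crisp neuron repeatedly under varied random seeds, then summarize each weight's spread as a triangular fuzzy number (minimum, modal value, maximum). Everything below, the toy data and the single-neuron model, is illustrative, not the paper's procedure:

```python
import random

def train_neuron(data, seed, epochs=40, lr=0.05):
    """Train one crisp neuron y_hat = w*x + b by SGD; the seed varies the
    initial weights and the sample order, so each run ends slightly apart."""
    rng = random.Random(seed)
    w, b = rng.uniform(-1, 1), rng.uniform(-1, 1)
    for _ in range(epochs):
        rng.shuffle(data)
        for x, y in data:
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err
    return w, b

# noisy samples of y = 2x + 1 (a small deterministic perturbation stands in
# for measurement noise)
base = [(x / 10, 2 * (x / 10) + 1 + 0.05 * ((x * 7) % 5 - 2))
        for x in range(-10, 11)]
runs = [train_neuron(list(base), seed) for seed in range(30)]
ws = sorted(w for w, _ in runs)
# quasi-fuzzy weight set: a triangular fuzzy number (min, modal value, max)
qfws_w = (ws[0], sum(ws) / len(ws), ws[-1])
print(tuple(round(v, 2) for v in qfws_w))  # a tight triangle around 2
```

The resulting triangle captures the run-to-run uncertainty of the crisp weight, and its modal value is a natural initial point for a subsequent fuzzy-network training stage.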
Abstract: To improve HSE standards, the oil and gas industry is interested in using remotely controlled and autonomous robots instead of human workers on offshore platforms. Beyond this, the strategy would increase potential revenue, make efficient use of experts, and even allow operations in more remote areas. This article presents a custom climbing robot, called Walloid, designed for offshore platform topside automation. This four-arm climbing robot with grippers is an ongoing project at the University of Oslo.
Abstract: In this paper, we explore the applicability of the Sinc-Collocation method to a three-dimensional (3D) oceanography model. The model describes a wind-driven current with depth-dependent eddy viscosity in the complex-velocity system. In general, Sinc-based methods excel over traditional numerical methods due to their exponentially decaying errors, rapid convergence, and ability to handle problems with singularities at the end-points. In addition to these advantages, the Sinc-Collocation approach we utilize exploits first-derivative interpolation, whose integration is much less sensitive to numerical errors. We present several model problems to demonstrate the accuracy, stability, and computational efficiency of the method. The approximate solutions obtained by the Sinc-Collocation technique are compared to exact solutions and to those obtained by the Sinc-Galerkin approach in earlier studies. Our findings indicate that the Sinc-Collocation method outperforms the Sinc-based methods of past studies.
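The approximation underlying all Sinc-based schemes is the cardinal (sinc) expansion on a uniform grid. A minimal sketch (a generic 1-D illustration of that building block, not the 3D oceanography model or the first-derivative variant used in the paper):

```python
import math

def sinc(x):
    """Normalized sinc: sin(pi*x)/(pi*x), with the removable singularity filled."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def sinc_interpolate(f, h, N, x):
    """Truncated cardinal (Whittaker) expansion
    C(f,h)(x) = sum_{k=-N}^{N} f(kh) * sinc((x - kh)/h)."""
    return sum(f(k * h) * sinc((x - k * h) / h) for k in range(-N, N + 1))

f = lambda t: math.exp(-t * t)     # smooth, rapidly decaying test function
h, N = 0.5, 20                     # step size and truncation (41 nodes)
err = max(abs(sinc_interpolate(f, h, N, x / 10) - f(x / 10))
          for x in range(-30, 31))
print(err < 1e-3)                  # already well below 1e-3 on [-3, 3]
```

For such an analytic, rapidly decaying function the error of the truncated expansion falls off exponentially as h shrinks, which is the "exponentially decaying errors" property the abstract attributes to Sinc-based methods.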