Abstract: A dead leg is a typical subsea production system
component. CFD is required to model heat transfer within the dead
leg. Unfortunately, the CFD solution is time-consuming and thus not
suitable for fast prediction or repeated simulations. There is therefore
a need to create a thermal FEA model that mimics the heat flows and
temperatures seen in CFD cool-down simulations.
This paper describes the conventional way of tuning and a new
automated way using parametric model order reduction (PMOR)
together with an optimization algorithm. The tuned FE analyses
replicate the steady-state CFD parameters within a maximum heat-flow
error of 6% and 3% for the manual and PMOR methods, respectively.
During cool down, the relative error of the tuned FEA models with
respect to temperature is below 5% compared to the CFD. In addition,
the PMOR method obtained the correct FEA setup five times faster
than manual tuning did.
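As a toy illustration of the automated-tuning idea (the PMOR reduced-order model and the paper's optimizer are not reproduced here; the surrogate heat-flow function, the tuned parameter, and the target value below are invented for the sketch), a one-parameter search that matches a steady-state CFD heat-flow target might look like:

```python
# Hypothetical sketch: tune one FEA parameter (an effective thermal
# conductivity k) so the model's steady-state heat flow matches a CFD
# target. fea_heat_flow is a stand-in for a real (reduced-order) solve.

def fea_heat_flow(k):
    # Placeholder surrogate: heat flow grows with conductivity (W).
    return 120.0 * k / (k + 2.0)

def tune(target, lo, hi, tol=1e-6):
    """Golden-section search minimizing the squared heat-flow error."""
    phi = (5 ** 0.5 - 1) / 2
    err = lambda k: (fea_heat_flow(k) - target) ** 2
    a, b = lo, hi
    c, d = b - phi * (b - a), a + phi * (b - a)
    while b - a > tol:
        if err(c) < err(d):
            b, d = d, c
            c = b - phi * (b - a)
        else:
            a, c = c, d
            d = a + phi * (b - a)
    return (a + b) / 2

k_opt = tune(target=60.0, lo=0.1, hi=50.0)
```

In practice each evaluation of `fea_heat_flow` would be a (reduced-order) FEA run, which is why a cheap PMOR surrogate makes the search affordable.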
Abstract: Software project effort estimation is frequently seen
as complex and expensive for individual software engineers.
Software production is in a crisis: it suffers from excessive costs and
is often out of control. It has been suggested that software production
is out of control because we do not measure it, for you cannot control
what you cannot measure. During the last decade, a number of studies
on cost estimation have been conducted. Metric-set selection has a
vital role in software cost estimation studies, yet its importance has
been ignored, especially in neural-network-based studies. In this
study we have explored the reasons for those
disappointing results and implemented different neural network
models using augmented new metrics. The results obtained are
compared with previous studies using traditional metrics. To be able
to make comparisons, two types of data have been used. The first
part of the data is taken from the Constructive Cost Model
(COCOMO'81) which is commonly used in previous studies and the
second part is collected according to new metrics in a leading
international company in Turkey. The accuracy of the selected
metrics and the data samples is verified using statistical techniques.
The model presented here is based on Multi-Layer Perceptron
(MLP). Another difficulty associated with cost estimation studies is
that data collection requires time and care. To make more thorough
use of the samples collected, the k-fold cross-validation method is
also implemented. It is concluded that, as long as an accurate and
quantifiable set of metrics is defined and measured correctly, neural
networks can be applied to software cost estimation studies with
success.
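The k-fold splitting used to stretch a small data set can be sketched as follows (plain-Python index generation only; the paper's MLP model and metrics are not reproduced):

```python
# k-fold cross-validation: partition n samples into k folds; each fold
# serves once as the test set while the remaining folds form the
# training set, so every sample is tested exactly once.

def k_fold_splits(n_samples, k):
    """Yield (train_idx, test_idx) pairs covering every sample once."""
    indices = list(range(n_samples))
    # Earlier folds absorb the remainder so sizes differ by at most 1.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        yield train, test
        start += size

splits = list(k_fold_splits(10, 3))   # 3 folds of sizes 4, 3, 3
```

Averaging the estimation error over the k held-out folds gives a less optimistic accuracy figure than a single train/test split, which matters when effort data are scarce.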
Abstract: One of the main problems in a steel strip manufacturing
line is the breakage of any weld carried out between steel coils,
which are used to produce the continuous strip to be processed. A weld
breakage results in a stop of the manufacturing line lasting several
hours, during which the damage caused by the breakage must be
repaired. After the repair, the line must be restarted before
production can resume. To minimize this problem, a human operator
must inspect each weld visually and manually in order to avoid its
breakage during the manufacturing process. The work presented in this
paper is based on Bayesian decision theory and presents an approach
to detect defective steel strip welds in real time. This approach is
based on quantifying the tradeoffs between various classification
decisions using probability and the costs that accompany such
decisions.
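The underlying decision rule — choose the action whose expected cost, given the class posteriors, is lowest — can be sketched as follows (the class names, posterior values, and cost figures are illustrative, not the paper's):

```python
# Bayes minimum-risk decision: for each action a, compute the risk
# R(a) = sum_c cost(a, c) * P(c | features) and pick the smallest.

def bayes_decision(posteriors, cost):
    """posteriors: P(class | features); cost[a][c]: cost of taking
    action a when the true class is c. Returns the min-risk action."""
    risks = {a: sum(cost_ac[c] * posteriors[c] for c in cost_ac)
             for a, cost_ac in cost.items()}
    return min(risks, key=risks.get)

# Shipping a defective weld is far costlier than a needless rejection,
# so 'reject' wins even at a modest defect posterior.
post = {"good": 0.7, "defective": 0.3}
cost = {"accept": {"good": 0.0, "defective": 100.0},
        "reject": {"good": 5.0, "defective": 0.0}}
action = bayes_decision(post, cost)
```

Here the risk of accepting is 0.0·0.7 + 100.0·0.3 = 30, while rejecting costs only 5.0·0.7 = 3.5, so the asymmetric breakage cost drives the decision rather than the raw posterior alone.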
Abstract: The "cocktail party problem" is well known as one of the human auditory abilities. By this ability we can recognize the specific sound that we want to listen to, even if many undesirable sounds or noises are mixed with it. Blind source separation (BSS) based on independent component analysis (ICA) is one of the methods by which we can separate a specific signal from mixed signals under simple hypotheses. In this paper, we propose an online approach to blind source separation using the sliding DFT and time-domain independent component analysis. The proposed method can reduce computational complexity in comparison with conventional methods, and can be applied to parallel processing using digital signal processors (DSPs). We evaluate this method and demonstrate its effectiveness.
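The sliding DFT that the method exploits for its reduced complexity can be sketched as follows (an illustrative implementation, not the authors'): when the analysis window shifts by one sample, each frequency bin is updated in O(1) instead of recomputing an N-point DFT.

```python
import cmath

def slide(X_k, k, N, x_old, x_new):
    """One sliding-DFT step for bin k: drop the oldest sample, add the
    newest, then rotate the phase by one bin."""
    return (X_k - x_old + x_new) * cmath.exp(2j * cmath.pi * k / N)

N, k = 8, 3
x = [float(i % 5) for i in range(N + 1)]   # N+1 samples: one window shift

# Direct DFT of the first window x[0:N] at bin k.
X = sum(x[m] * cmath.exp(-2j * cmath.pi * k * m / N) for m in range(N))
X_next = slide(X, k, N, x_old=x[0], x_new=x[N])

# Direct DFT of the shifted window x[1:N+1], for comparison.
X_ref = sum(x[1 + m] * cmath.exp(-2j * cmath.pi * k * m / N)
            for m in range(N))
```

Because each bin's update is independent of the others, the per-bin recursions map naturally onto the parallel DSP processing mentioned in the abstract.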
Abstract: The influence of eccentric discharge of stored solids in
squat silos has received considerable attention from researchers.
However, methods for calculating lateral pressure under eccentric
flow still need further study. In particular, the lateral pressure
distribution on the vertical wall could not be accurately determined,
mainly because of its asymmetry. In order to build a mechanical model
of lateral pressure, the flow channel and flow pattern of stored solids
in a squat silo are studied. In this paper, based on Janssen's theory, a
method for calculating lateral static pressure in squat silos after
eccentric discharge is proposed. Formulae are derived for each of
three possible cases. The method also focuses on the unsymmetrical
distribution of the normal pressure on the silo wall. A finite element
model is used to analyze and compare the results for lateral pressure,
and the numerical results illustrate the practicability of the
theoretical method.
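Janssen's classical formula, the starting point of the paper's derivation, can be evaluated as below (concentric static case only; all parameter values are illustrative, not the paper's):

```python
import math

# Janssen's lateral pressure at depth z below the solids surface:
#   p_h(z) = (gamma * r_h / mu) * (1 - exp(-K * mu * z / r_h))
# gamma: bulk unit weight of the stored solid (kN/m^3),
# r_h:   hydraulic radius of the cross-section (area/perimeter, m),
# mu:    wall friction coefficient, K: lateral pressure ratio.

def janssen_lateral_pressure(z, gamma=9.0, r_h=2.0, mu=0.4, K=0.5):
    return (gamma * r_h / mu) * (1.0 - math.exp(-K * mu * z / r_h))

p_shallow = janssen_lateral_pressure(1.0)
p_deep = janssen_lateral_pressure(1000.0)  # tends to gamma*r_h/mu = 45
```

The pressure saturates at depth because wall friction carries an ever larger share of the solids' weight; the paper's contribution is to adapt this symmetric picture to the asymmetric pressures left after eccentric discharge.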
Abstract: Clustering is the process of subdividing an input data set into a desired number of subgroups so that members of the same subgroup are similar and members of different subgroups have diverse properties. Many heuristic algorithms have been applied to the clustering problem, which is known to be NP-hard. Genetic algorithms have been used in a wide variety of fields to perform clustering; however, the technique normally has a long running time that grows with the input set size. This paper proposes an efficient genetic algorithm for clustering very large data sets, especially image data sets. The genetic algorithm uses time-efficient techniques along with preprocessing of the input data set. We test our algorithm on both artificial and real image data sets, both of large size. The experimental results show that our algorithm outperforms the k-means algorithm in terms of running time as well as the quality of the clustering.
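A label-encoding genetic algorithm for clustering can be sketched in miniature as follows (a toy on 1-D data; the paper's time-efficient operators and preprocessing for large image sets are not reproduced, and the fitness, mutation, and selection choices here are our own):

```python
import random

def wcss(data, labels, k):
    """Within-cluster sum of squares; lower means a better clustering."""
    total = 0.0
    for c in range(k):
        pts = [x for x, l in zip(data, labels) if l == c]
        if pts:
            mu = sum(pts) / len(pts)
            total += sum((x - mu) ** 2 for x in pts)
    return total

def ga_cluster(data, k, pop_size=30, generations=200, seed=0):
    """Chromosome = one cluster label per point; elitist GA with
    single-point mutation."""
    rng = random.Random(seed)
    pop = [[rng.randrange(k) for _ in data] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: wcss(data, ind, k))
        survivors = pop[: pop_size // 2]          # keep the fitter half
        children = []
        for parent in survivors:
            child = parent[:]
            child[rng.randrange(len(data))] = rng.randrange(k)  # mutate
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda ind: wcss(data, ind, k))

data = [1.0, 1.2, 0.8, 9.0, 9.3, 8.7]   # two well-separated groups
best = ga_cluster(data, k=2)
```

The fitness evaluation dominates the running time, which is why the paper's preprocessing of the input set matters for large images.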
Abstract: This article is dedicated to the development of
mathematical models for determining the dynamics of the
concentration of hazardous substances in a turbulent urban
atmosphere. Developing the mathematical models meant taking into
account the time-space variability of the fields of meteorological
quantities and such properties of the turbulent atmosphere as
vorticity, nonlinearity, dissipativity and diffusivity. The turbulent
airflow velocity is not assumed to be known when developing the
model. However, a simplified model assumes that the ratio of
turbulent to molecular diffusion is a piecewise-constant function
that changes with vertical distance from the earth's surface.
Thereby an important assumption of vertical stratification of urban
air, due to atmospheric accumulation of hazardous substances
emitted by motor vehicles, is introduced into the mathematical
model. Through a non-degenerate transformation, the suggested
simplified nonlinear model for determining the sought exhaust
concentration at an a priori unknown turbulent flow velocity is
reduced to a model that is subsequently solved analytically.
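The piecewise-constant diffusivity assumption can be illustrated with a simple one-dimensional vertical diffusion step (the layer heights, diffusivity values, and grid below are invented for the sketch and not taken from the article, which treats the full nonlinear problem):

```python
# Explicit finite-difference step for dc/dt = d/dz ( D(z) dc/dz ),
# with an effective (turbulent + molecular) diffusivity D that is
# piecewise constant in height z, as in the simplified model.

def diffusivity(z):
    """Piecewise-constant effective diffusivity (m^2/s) by height (m)."""
    if z < 10.0:
        return 0.5      # near-surface layer
    elif z < 50.0:
        return 2.0      # intermediate layer
    return 5.0          # upper layer

def step(c, dz, dt):
    """One explicit Euler step; boundaries held at zero concentration."""
    n = len(c)
    new = c[:]
    for i in range(1, n - 1):
        z = i * dz
        d_up = 0.5 * (diffusivity(z) + diffusivity(z + dz))
        d_dn = 0.5 * (diffusivity(z) + diffusivity(z - dz))
        new[i] = c[i] + dt / dz**2 * (d_up * (c[i + 1] - c[i])
                                      - d_dn * (c[i] - c[i - 1]))
    return new

c = [0.0] * 21
c[2] = 1.0                        # pollutant released near the surface
for _ in range(100):
    c = step(c, dz=5.0, dt=1.0)   # dt*Dmax/dz^2 = 0.2, stable
```

The small near-surface diffusivity is what produces the vertical stratification the model assumes: pollutant lingers in the lowest layer and spreads upward only slowly.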
Abstract: University websites are considered one of the primary brand touch points for multiple stakeholders, but most of them do not have designs that create favorable impressions. Some of the elements that web designers should carefully consider are appearance, content, functionality, usability and search engine optimization. However, priority should be placed on website simplicity and negative space. In terms of content, previous research suggests that universities should include reputation, learning environment, graduate career prospects, destination image, cultural integration, and a virtual tour on their websites. The study examines how the top 200 world-ranking science and technology-based universities present their brands online and whether their websites capture these content dimensions. Content analysis of the websites revealed that the top-ranking universities captured the dimensions to varying degrees. Moreover, the UK-based university placed higher priority on website simplicity and negative space than the Malaysian-based university.
Abstract: An approach to developing an FPGA implementation of a
flexible-key RSA encryption engine that can be used as a standard
device in secured communication systems is presented. The VHDL
model of this RSA encryption engine has the unique characteristic of
supporting multiple key sizes, and thus can easily fit into systems
that require different levels of security. Simple nested-loop addition
and subtraction have been used to implement the RSA operation. This
makes processing faster and uses a comparatively small amount of
space in the FPGA. The hardware design is targeted at an Altera
Stratix II device, and it was determined that the flexible-key RSA
encryption engine best fits the device EP2S30F484C3. The RSA
encryption implementation uses 13,779 logic elements and achieves a
clock frequency of 17.77 MHz. It has been verified that this RSA
encryption engine can perform 32-bit, 256-bit and 1024-bit
encryption operations in less than 41.585 µs, 531.515 µs and
790.61 µs, respectively.
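The nested-loop addition/subtraction approach can be sketched in software as follows (a hedged analogue of the hardware idea, not the paper's VHDL; the toy key n = 33, e = 3, d = 7 is for illustration only): modular multiplication is built from conditional additions, and a single conditional subtraction replaces division in the modular reduction.

```python
def mod_add(a, b, n):
    """Modular addition for a, b < n: one subtraction replaces division."""
    s = a + b
    return s - n if s >= n else s

def mod_mul(a, b, n):
    """a*b mod n via conditional additions over the bits of b."""
    result, addend = 0, a % n
    while b:
        if b & 1:
            result = mod_add(result, addend, n)
        addend = mod_add(addend, addend, n)   # doubling = self-addition
        b >>= 1
    return result

def mod_exp(m, e, n):
    """m^e mod n by square-and-multiply, built on mod_mul only."""
    result, base = 1 % n, m % n
    while e:
        if e & 1:
            result = mod_mul(result, base, n)
        base = mod_mul(base, base, n)
        e >>= 1
    return result

# Textbook toy RSA key: n = 33 (p=3, q=11), e = 3, d = 7.
cipher = mod_exp(4, 3, 33)        # encrypt message 4
plain = mod_exp(cipher, 7, 33)    # decrypt
```

Because the inner loop needs only adders and a comparator, the same structure maps compactly onto FPGA logic elements, at the cost of more clock cycles per multiplication.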
Abstract: Performance of a limited Round-Robin (RR) rule is
studied in order to clarify the characteristics of a realistic sharing
model of a processor. Under the limited RR rule, the processor
allocates to each request a fixed amount of time, called a quantum, in a
fixed order. The sum of the requests being allocated these quanta is
kept below a fixed value. Arriving requests that cannot be allocated
quanta because of such a restriction are queued or rejected. Practical
performance measures, such as the relationship between the mean
sojourn time, the mean number of requests, or the loss probability and
the quantum size are evaluated via simulation. In the evaluation, the
requested service time of an arriving request is converted into a
number of quanta. One of these quanta is included in each RR cycle,
i.e., a series of quanta allocated to each request in service in a fixed
order. The service time of the arriving request can be evaluated using
the number of RR cycles required to complete the service, the number
of requests receiving service, and the quantum size. The number of
quanta needed before service is completed is then re-evaluated at the
arrival or departure of other requests.
Tracking these events and calculations enables us to analyze the
performance of our limited RR rule. In particular, we obtain the most
suitable quantum size, which minimizes the mean sojourn time, for the
case in which the switching time for each quantum is considered.
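The quantum bookkeeping described above can be sketched for the simplest case (all requests assumed present at time 0; arrivals, the admission limit, and rejection are omitted, and the function and variable names are our own, not the paper's):

```python
import math

def rr_sojourn_times(service_times, quantum, sw=0.0):
    """Serve requests one quantum at a time in fixed round-robin order.
    Each service time is first converted to a quantum count; 'sw' is
    the switching time charged per quantum."""
    remaining = [math.ceil(s / quantum) for s in service_times]
    finish = [0.0] * len(service_times)
    clock = 0.0
    while any(remaining):
        for i, r in enumerate(remaining):   # one RR cycle
            if r > 0:
                clock += quantum + sw
                remaining[i] -= 1
                if remaining[i] == 0:
                    finish[i] = clock
    return finish

sojourns = rr_sojourn_times([3.0, 1.0, 2.0], quantum=1.0)
```

The `sw` term shows the trade-off the paper quantifies: a smaller quantum lets short requests finish earlier, but charges the switching overhead more often, which is why an intermediate quantum size minimizes the mean sojourn time.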
Abstract: Character segmentation is an important preprocessing step for text recognition. In degraded documents, the existence of touching characters drastically decreases the recognition rate of any optical character recognition (OCR) system. In this paper a study of touching Gurmukhi characters is carried out, and these characters have been divided into various categories after careful analysis. Structural properties of Gurmukhi characters are used to define the categories. New algorithms have been proposed to segment touching characters in the middle zone. These algorithms show a reasonable improvement in segmenting touching characters in degraded Gurmukhi script. The algorithms proposed in this paper are applicable only to machine-printed text.
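The paper's algorithms rely on structural properties specific to the Gurmukhi middle zone, which are not reproduced here; as a generic point of reference only, a common baseline such methods improve upon is cutting a touching pattern at the column with the lowest vertical projection (fewest ink pixels):

```python
def cut_column(bitmap):
    """bitmap: list of rows of 0/1 ink pixels. Return the interior
    column with minimal ink, a candidate segmentation point."""
    width = len(bitmap[0])
    profile = [sum(row[c] for row in bitmap) for c in range(width)]
    interior = range(1, width - 1)          # avoid trivial edge cuts
    return min(interior, key=lambda c: profile[c])

# Two 3-pixel-wide blobs touching through a single pixel at column 3.
glyphs = [
    [1, 1, 1, 0, 1, 1, 1],
    [1, 1, 1, 1, 1, 1, 1],
    [1, 1, 1, 0, 1, 1, 1],
]
col = cut_column(glyphs)
```

Projection profiles alone fail when the touching stroke is as thick as a genuine character stroke, which is the degraded-document situation that motivates structural, script-aware rules.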
Abstract: This paper describes the architectural design
considerations for building a new class of application, a Personal
Knowledge Integrator, and a particular example, a Knowledge Theatre.
It then supports this description with a scenario of a child acquiring
knowledge, showing how this process could be augmented by the
proposed architecture and design of a Knowledge Theatre. David
Merrill's "first principles of instruction" are kept in focus to provide
a background against which to view the learning potential.
Abstract: The incorporation of renewable energy sources for sustainable electricity production is taking on a more prominent role in electric power systems. It is thus inevitable that the characteristics of future power networks, such as their stability, will be influenced by the features of sustainable energy sources. One of the distinctive attributes of sustainable energy sources is their stochastic behavior. This paper investigates the impact of this stochastic behavior on small-disturbance rotor angle stability in upcoming electric power networks. Considering the various types of renewable energy sources and the vast variety of system configurations, sensitivity analysis can be an efficient route towards generalizing the effects of new energy sources on stability. In this paper, the definition of small-disturbance angle stability for future power systems and an iterative-stochastic way of analyzing it are presented. The effects of system parameters on this type of stability are also described through a sensitivity analysis of an electric power test system.
Abstract: In this paper, we are concerned with the further study of systems of nonlinear equations. Since systems with inaccurate function values, or problems with high computational cost, arise frequently in science and engineering, such systems have recently attracted researchers' interest. In this work we present a new method which does not require derivative evaluations and has quadratic convergence. This method can be viewed as an extension of some recent methods for solving such systems of nonlinear equations. Numerical results from applying this method to some test problems show the efficiency and reliability of the method.
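The abstract does not specify the method itself; as a hedged point of reference only, Steffensen's iteration below is a classical scheme that likewise achieves quadratic convergence without derivative evaluations (shown in scalar form for brevity, whereas the paper treats systems):

```python
def steffensen(f, x0, tol=1e-12, max_iter=50):
    """Derivative-free root finding: replace f'(x) with the divided
    difference (f(x + f(x)) - f(x)) / f(x), then take a Newton step."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        g = (f(x + fx) - fx) / fx      # slope estimate, no derivative
        x = x - fx / g
    return x

root = steffensen(lambda x: x * x - 2.0, 1.5)   # converges to sqrt(2)
```

Like Newton's method, each step roughly doubles the number of correct digits near the root, but only function values are ever evaluated.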
Abstract: Trace element speciation of an integrated soil
amendment matrix was studied with a modified BCR sequential
extraction procedure. The analysis included pseudo-total
concentration determinations according to USEPA 3051A and
relevant physicochemical properties by standardized methods. Based
on the results, the soil amendment matrix possessed neutralization
capacity comparable to commercial fertilizers. Additionally, the
pseudo-total concentrations of all trace elements included in the
Finnish regulation for agricultural fertilizers were lower than the
respective statutory limit values. According to chemical speciation,
the lability of trace elements increased in the following order: Hg <
Cr < Co < Cu < As < Zn < Ni < Pb < Cd < V < Mo < Ba. The
validity of the BCR approach as a tool for chemical speciation was
confirmed by the additional acid digestion phase. Recovery of trace
elements during the procedure assured the validity of the approach
and indicated good quality of the analytical work.
Abstract: The design of distributed systems involves the
partitioning of the system into components or partitions and the
allocation of these components to physical nodes. Techniques have
been proposed for both the partitioning and allocation process.
However, these techniques suffer from a number of limitations. For
instance, object replication has the potential to greatly improve the
performance of an object-oriented distributed system, but it can be
difficult to use effectively, and there are few techniques that support
the developer in harnessing object replication.
This paper presents a methodological technique that helps
developers decide how objects should be allocated in order to
improve performance in a distributed system that supports
replication. The performance of the proposed technique is
demonstrated and tested on an example system.
Abstract: In order to guarantee secure communication in wireless sensor networks (WSNs), many user authentication schemes have drawn researchers' attention and been studied widely. In 2012, He et al. proposed a robust biometric-based user authentication scheme for WSNs. However, this paper demonstrates that He et al.'s scheme has some drawbacks: a poor reparability problem, vulnerability to user impersonation attacks, and vulnerability to sensor node impersonation attacks.
Abstract: Hyperglycemia-mediated accumulation of advanced glycation end-products (AGEs) plays a pivotal role in the development of diabetic complications by inducing inflammation. In the present study, we evaluated the possible effects of a water/ethanol (1/1, v/v) extract (WEE) and its fractions from the fruit of Canarium album Raeusch. (Chinese olive) on AGEs-stimulated oxidative stress and inflammation in monocytes and vascular endothelial cells. Co-incubation of EA.hy926 endothelial cells with WEE and its fractions for 24 h resulted in a significant decrease in monocyte–endothelial cell adhesion, ICAM-1 expression, intracellular ROS generation and GSH depletion induced by AGEs. Chinese olive fruit extracts also reduced the expression of pro-inflammatory mediators, such as TNF-α, IL-1β and IL-6, in THP-1 cells. These findings suggest that Chinese olive fruit is able to protect the vascular endothelium from dysfunction induced by AGEs.
Abstract: In this paper we propose a comparison of four content-based objective metrics with the results of subjective tests on 80 video sequences. We also include two established objective metrics, VQM and SSIM, in our comparison to serve as "reference" objective metrics, because their pros and cons have already been published. Each video sequence was preprocessed by the region recognition algorithm, and then the particular objective video quality metrics were calculated, i.e., mutual information, angular distance, moment of angle and the normalized cross-correlation measure. The Pearson coefficient was calculated to express each metric's relationship to the accuracy of the model, and the Spearman rank-order correlation coefficient to represent its relationship to monotonicity. The results show that the model with mutual information as the objective metric provides the best result and is suitable for evaluating the quality of video sequences.
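The two evaluation statistics used above can be computed as follows (plain-Python sketch; the data values are illustrative, and the simple rank construction assumes no ties, which the Spearman shortcut requires):

```python
def pearson(xs, ys):
    """Pearson correlation: linear agreement between two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def spearman(xs, ys):
    """Spearman rank-order correlation: Pearson applied to the ranks,
    so it measures monotonicity rather than linearity (no-ties case)."""
    rank = lambda v: [sorted(v).index(x) + 1 for x in v]
    return pearson(rank(xs), rank(ys))

# A perfectly monotone but non-linear relationship: Spearman is 1,
# Pearson falls short of 1.
objective = [1.0, 2.0, 3.0, 4.0]
subjective = [1.0, 4.0, 9.0, 16.0]
r_p = pearson(objective, subjective)
r_s = spearman(objective, subjective)
```

This is exactly why the paper reports both coefficients: a metric can track subjective scores monotonically (high Spearman) while still needing a nonlinear mapping to predict them accurately (lower Pearson).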
Abstract: Reducing river sediments through path correction and
preservation of river walls leads to a considerable reduction of
sedimentation at pumping stations. Path correction and preservation
of walls is not limited to one particular method; depending on
various conditions, a combination of several methods can be
employed. In this article, we review and evaluate methods for the
preservation of river banks in order to reduce sediments.