Abstract: Counting cell colonies is a long and laborious process that depends on the judgment and skill of the operator, and that judgment can vary with fatigue. Moreover, because the activity is time-consuming, it can limit the number of dishes usable in each experiment. For these reasons, an automatic system for counting cell colonies is needed. This article introduces a new automatic counting system based on the processing of digital images of cell colonies grown on Petri dishes. The system is based mainly on region-growing algorithms for the recognition of the regions of interest (ROI) in the image and on a Sanger neural network for the characterization of those regions. The best final classification is provided by a Feed-Forward Neural Network (FF-NN) and compared with a K-Nearest Neighbour (K-NN) classifier and a Linear Discriminative Function (LDF). Preliminary results are presented.
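As a rough illustration of the region-growing step, the sketch below grows an ROI from a seed pixel by accepting 4-connected neighbours whose grey level stays within a tolerance of the seed value. The image representation, seed choice and homogeneity criterion are assumptions for illustration, not the paper's actual design.

```python
def region_grow(img, seed, tol=10):
    """Grow a region from `seed` over a 2-D list of grey levels,
    adding 4-connected pixels whose intensity is within `tol` of the
    seed intensity. Illustrative sketch only; the actual homogeneity
    criterion used in the paper may differ."""
    h, w = len(img), len(img[0])
    sy, sx = seed
    seed_val = img[sy][sx]
    region = set()
    stack = [seed]
    while stack:
        y, x = stack.pop()
        if (y, x) in region:
            continue
        if abs(img[y][x] - seed_val) > tol:
            continue
        region.add((y, x))
        # visit the 4-connected neighbours
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                stack.append((ny, nx))
    return region
```

Each grown region can then be described by shape and intensity features and passed to the classifier stage.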
Abstract: A study was conducted in a greenhouse environment to determine the response of five tissue-cultured date palm cultivars, Al-Ahamad, Nabusaif, Barhee, Khalas, and Kasab, to irrigation water salinity of 1.6, 5, 10, or 20 dS/m. The salinity level of 1.6 dS/m was used as a control. The effects of high salinity on plant survival became apparent from 360 days after planting (DAP) onwards. Three cultivars, Khalas, Kasab and Barhee, were able to tolerate the 10 dS/m salinity level at 24 months after the start of the study. Khalas tolerated the highest salinity level of 20 dS/m, while 'Nabusaif' was found to be the least tolerant cultivar. The average height of the palms and the number of fronds decreased with increasing salinity levels as time progressed.
Abstract: In this work, we present a reliable framework for solving boundary value problems of particular significance in solid mechanics. Such problems arise as mathematical models of beam deformation. The algorithm rests mainly on a relatively new technique, the Variational Iteration Method. Examples are given to confirm the efficiency and accuracy of the method.
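For orientation, the Variational Iteration Method builds successive approximations through a correction functional; a standard general form (not necessarily the exact one used by the authors) is:

```latex
u_{n+1}(x) = u_n(x) + \int_0^x \lambda(s)\left[ L u_n(s) + N \tilde{u}_n(s) - g(s) \right] ds
```

Here $L$ and $N$ are the linear and nonlinear parts of the governing operator, $g$ is the source term, $\lambda$ is a Lagrange multiplier identified variationally, and $\tilde{u}_n$ denotes a restricted variation.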
Abstract: Copper-based composites reinforced with WC and Ti particles were prepared using a planetary ball mill. The experiment was designed using the Taguchi technique, and milling was carried out in air for several hours. The powder was characterized before and after milling by SEM, TEM and X-ray diffraction for microstructure and possible new phases. The micrographs show that the size of the milled particles and the reduction in particle size depend on many parameters. The interplanar spacing d was estimated from X-ray powder diffraction data and TEM images. X-ray diffraction patterns of the milled powder did not clearly show any new peak or energy shift, but the TEM images show a significant change in the crystalline structure due to the incorporation of titanium in the composites.
Abstract: In this paper, we present an offline system for the recognition of handwritten numeric chains. Our work is divided into two main parts. The first part is the realization of a recognition system for isolated handwritten digits. Here the study is based mainly on evaluating the performance of a neural network trained with the gradient back-propagation algorithm. The parameters used to form the input vector of the neural network are extracted from the binary images of the digits by several methods: the distribution sequence, the Barr features and the centred moments of the different projections and profiles. The second part is the extension of our system to the reading of handwritten numeric chains made up of a variable number of digits. The vertical projection is used to segment the numeric chain into isolated digits, and every digit (or segment) is presented separately to the input of the system developed in the first part (the recognition system for isolated handwritten digits). The recognition result for the numeric chain is displayed at the output of the global system.
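The vertical-projection segmentation can be sketched as follows: columns of the binary image with no ink separate consecutive digits. This is a minimal illustration of the idea; the paper's exact thresholds and handling of touching digits are not reproduced.

```python
def segment_digits(img):
    """Segment a binary image (2-D list of 0/1 rows) of a numeric
    chain into isolated digits using the vertical projection:
    zero-ink columns separate consecutive digits.
    Returns a list of (start_col, end_col) spans."""
    width = len(img[0])
    # ink count per column
    projection = [sum(row[x] for row in img) for x in range(width)]
    segments, start = [], None
    for x, count in enumerate(projection):
        if count > 0 and start is None:
            start = x                      # a digit begins
        elif count == 0 and start is not None:
            segments.append((start, x))    # a digit ends
            start = None
    if start is not None:
        segments.append((start, width))
    return segments
```

Each returned span is then cropped and fed to the isolated-digit recognizer.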
Abstract: The dynamics of the interaction between laser radiation and a metal target in water at 1064 nm was studied by applying the Mach-Zehnder interference technique. A mechanism for generating the well-developed regime of evaporation of a metal surface and a spherical shock wave in water is proposed. The critical NIR intensities for well-developed evaporation of silver and gold targets were determined. The dynamics of the shock waves was investigated at early (tens of nanoseconds) and later (hundreds of nanoseconds) times. The transparent expanding plasma-vapor-compressed-water object was visualized and measured. The thickness of the compressed water layer and the pressures behind the shock wave front at later time delays were obtained from optical treatment of the interferograms.
Abstract: A dead leg is a typical subsea production system component. CFD is required to model heat transfer within the dead leg; unfortunately, its solution is time-consuming and thus not suitable for fast prediction or repeated simulations. There is therefore a need for a thermal FEA model that mimics the heat flows and temperatures seen in CFD cool-down simulations.
This paper describes the conventional way of tuning such a model and a new automated way using parametric model order reduction (PMOR) together with an optimization algorithm. The tuned FE analyses replicate the steady-state CFD parameters within a maximum heat-flow error of 6% and 3% using the manual and PMOR methods, respectively. During cool-down, the relative temperature error of the tuned FEA models is below 5% compared to the CFD. In addition, the PMOR method obtained the correct FEA setup five times faster than manual tuning.
Abstract: The cup method is applied worldwide for the measurement of water vapor transport properties of porous materials. However, in practical applications the experimental results are often used without taking into account secondary effects that can play an important role under specific conditions. In this paper, the effect of temperature on the water vapor transport properties of cellular concrete is studied, together with the influence of sample thickness. First, the bulk density, matrix density, total open porosity, and sorption and desorption isotherms are measured for material characterization purposes. Then, the steady-state cup method is used to determine the water vapor transport properties, with the measurements performed at several temperatures and for three different sample thicknesses.
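In a steady-state cup measurement, the water vapor permeability follows from the measured rate of mass change. The sketch below shows this standard evaluation; the symbols and units are generic illustrations, not values taken from the paper.

```python
def vapor_permeability(dm_dt, d, A, dp):
    """Water vapor permeability from steady-state cup data:
    delta = (dm/dt) * d / (A * dp), where
      dm_dt : steady-state rate of mass change of the cup (kg/s)
      d     : sample thickness (m)
      A     : exposed sample area (m^2)
      dp    : water vapor partial pressure difference across
              the sample (Pa).
    Returns delta in kg/(m.s.Pa)."""
    return dm_dt * d / (A * dp)
```

Repeating the evaluation at several temperatures and thicknesses, as in the paper, exposes how delta varies with those conditions.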
Abstract: Free hemoglobin promotes the accumulation of hydroxyl radicals through its heme iron, which can react with endogenous hydrogen peroxide to produce free radicals that may cause severe oxidative cell damage. Haptoglobin binds hemoglobin strongly, and the haptoglobin-hemoglobin binding is irreversible. The peroxidase activity of the haptoglobin(2-2)-hemoglobin complex was assayed by following the increase in absorbance, at 470 nm and 42°C, of the tetraguaiacol produced from the second substrate of the haptoglobin-hemoglobin complex, using a UV-Vis spectrophotometer. The results show that the peroxidase activity of the haptoglobin(2-2)-hemoglobin complex is modulated via a homotropic effect of hydrogen peroxide as an allosteric substrate. On the other hand, the antioxidant property of haptoglobin(2-2)-hemoglobin was increased via a heterotropic effect of the two drugs (especially ampicillin) on the peroxidase activity of the complex. Both drugs also have a mild effect on the homotropic property of the peroxidase activity of the haptoglobin(2-2)-hemoglobin complex. Therefore, in vitro studies show that the two drugs may help the Hp-Hb complex to remove hydrogen peroxide from serum at pathologic temperature (42°C).
Abstract: Software project effort estimation is frequently seen as complex and expensive for individual software engineers. Software production is in a crisis: it suffers from excessive costs and is often out of control. It has been suggested that software production is out of control because we do not measure; you cannot control what you cannot measure. During the last decade, a number of studies on cost estimation have been conducted. Metric-set selection has a vital role in software cost estimation studies, but its importance has been ignored, especially in neural-network-based studies. In this study, we have explored the reasons for those disappointing results and implemented different neural network models using an augmented set of new metrics. The results obtained are compared with previous studies that used traditional metrics. To enable comparisons, two types of data have been used. The first part of the data is taken from the Constructive Cost Model (COCOMO'81), which is commonly used in previous studies, and the second part is collected according to the new metrics in a leading international company in Turkey. The accuracy of the selected metrics and of the data samples is verified using statistical techniques. The model presented here is based on a Multi-Layer Perceptron (MLP). Another difficulty associated with cost estimation studies is that data collection requires time and care. To make more thorough use of the samples collected, the k-fold cross-validation method is also implemented. It is concluded that, as long as an accurate and quantifiable set of metrics is defined and measured correctly, neural networks can be applied with success in software cost estimation studies.
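The k-fold cross-validation scheme mentioned above can be sketched in a few lines: the samples are split into k disjoint folds, and each fold serves once as the test set while the others form the training set. This is a generic illustration, not the authors' specific experimental setup.

```python
import random

def k_fold_indices(n, k, seed=0):
    """Yield (train, test) index lists for k-fold cross-validation
    over n samples. Each of the k disjoint folds is used once as the
    test set; the remaining folds form the training set."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)      # deterministic shuffle
    folds = [idx[i::k] for i in range(k)] # k roughly equal folds
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test
```

With a small data set, every sample contributes to both training and evaluation, which is why the scheme suits costly-to-collect effort data.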
Abstract: Various overlay structures provide efficient and scalable solutions for point and range queries in a peer-to-peer network. The overlay structure based on the m-Binary Search Tree (BST) is one such popular technique. It divides the tree into different key intervals and then assigns the key intervals to a BST. The popularity of the BST makes this overlay structure vulnerable to different kinds of attacks. Here we present four such possible attacks, namely the index poisoning attack, the eclipse attack, the pollution attack and the SYN flooding attack, all of which affect the functionality of the BST. We also describe different security techniques that can be applied against these attacks.
Abstract: The ''cocktail party problem'' is well known as one of the human auditory abilities: even when many undesirable sounds or noises are mixed in, we can recognize the specific sound we want to listen to. Blind source separation (BSS) based on independent component analysis (ICA) is one of the methods by which a particular signal can be separated from a mixture under simple hypotheses. In this paper, we propose an online approach to blind source separation using the sliding DFT and time-domain independent component analysis. The proposed method reduces computational complexity compared with conventional methods and can be applied to parallel processing using, for example, digital signal processors (DSPs). We evaluate this method and show its effectiveness.
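The complexity saving of the sliding DFT comes from updating each frequency bin in O(1) per sample instead of recomputing an O(N) transform. The following sketch implements the textbook recurrence X_k(n) = (X_k(n-1) + x(n) - x(n-N)) * exp(j*2*pi*k/N); the authors' exact formulation and the ICA stage are not reproduced here.

```python
import cmath

def sliding_dft(signal, N, k):
    """Track the k-th DFT bin of a length-N window over `signal`
    with the sliding DFT: each new sample updates the previous bin
    value in O(1). With zero-padded history, the output equals the
    standard DFT bin k of the current window."""
    twiddle = cmath.exp(2j * cmath.pi * k / N)
    window = [0.0] * N          # zero-padded sample history
    X = 0.0
    outputs = []
    for x in signal:
        oldest = window.pop(0)  # sample leaving the window
        window.append(x)        # sample entering the window
        X = (X + x - oldest) * twiddle
        outputs.append(X)
    return outputs
```

Because each bin is independent, the bins parallelize naturally across DSP cores, which matches the parallel-processing claim in the abstract.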
Abstract: The influence of eccentric discharge of stored solids in squat silos has been widely recognized by many researchers. However, the calculation method for lateral pressure under eccentric flow still needs deeper study. In particular, the lateral pressure distribution on the vertical wall could not be accurately determined, mainly because of its asymmetry. In order to build a mechanical model of the lateral pressure, the flow channel and flow pattern of stored solids in a squat silo are studied. In this paper, based on Janssen's theory, a method for calculating the lateral static pressure in squat silos after eccentric discharge is proposed, and formulae are derived for each of three possible cases. The method also focuses on the unsymmetrical distribution of the normal pressure on the silo wall. A finite element model is used to analyze and compare the lateral pressure results, and the numerical results illustrate the practicability of the theoretical method.
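For reference, the classic Janssen formula that the paper takes as its starting point gives the horizontal wall pressure at depth z in a concentrically discharged circular silo. The sketch below evaluates that baseline formula; the eccentric-discharge extensions derived in the paper are not reproduced.

```python
import math

def janssen_lateral_pressure(z, gamma, D, mu, K):
    """Janssen's horizontal wall pressure at depth z in a circular
    silo:  p(z) = (gamma*D / (4*mu)) * (1 - exp(-4*K*mu*z / D)).
      gamma : bulk unit weight of the stored solid (e.g. kN/m^3)
      D     : silo diameter (m)
      mu    : wall friction coefficient
      K     : lateral pressure ratio.
    The pressure rises from zero and saturates at gamma*D/(4*mu)."""
    return gamma * D / (4.0 * mu) * (1.0 - math.exp(-4.0 * K * mu * z / D))
```

The saturation with depth, caused by wall friction carrying part of the solid's weight, is the distinctive feature that eccentric-flow models must modify asymmetrically around the wall.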
Abstract: The present study was designed to test the influence of confirmed expectations, perceived usefulness and perceived competence on e-learning satisfaction among university teachers. A questionnaire was completed by 125 university teachers from 12 different universities in Norway. We found that 51% of the variance in university teachers' satisfaction with e-learning could be explained by the three proposed antecedents. Perceived usefulness seems to be the most important predictor of teachers' satisfaction with e-learning.
Abstract: Clustering is the process of subdividing an input data set into a desired number of subgroups so that members of the same subgroup are similar and members of different subgroups have diverse properties. Many heuristic algorithms have been applied to the clustering problem, which is known to be NP-hard. Genetic algorithms have been used in a wide variety of fields to perform clustering; however, the technique normally has a long running time with respect to the input set size. This paper proposes an efficient genetic algorithm for clustering very large data sets, especially image data sets. The genetic algorithm uses the most time-efficient techniques along with preprocessing of the input data set. We test our algorithm on both artificial and real image data sets, both of large size. The experimental results show that our algorithm outperforms the k-means algorithm in terms of running time as well as the quality of the clustering.
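To make the genetic-clustering idea concrete, here is a minimal label-encoded genetic algorithm: each chromosome assigns a cluster label to every point, and selection, one-point crossover and random-label mutation drive the population toward a low within-cluster sum of squares. The encoding, operators and parameters are hypothetical illustrations; the paper's time-efficient techniques and preprocessing are not reproduced.

```python
import random

def wcss(points, labels, k):
    """Within-cluster sum of squares: the fitness being minimized."""
    total = 0.0
    for c in range(k):
        members = [p for p, l in zip(points, labels) if l == c]
        if not members:
            continue
        dim = len(members[0])
        centroid = [sum(p[d] for p in members) / len(members) for d in range(dim)]
        total += sum(sum((p[d] - centroid[d]) ** 2 for d in range(dim))
                     for p in members)
    return total

def ga_cluster(points, k, pop_size=20, generations=50, pm=0.05, seed=0):
    """Evolve label-encoded clusterings of `points` into k groups."""
    rng = random.Random(seed)
    n = len(points)
    pop = [[rng.randrange(k) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda c: wcss(points, c, k))
        nxt = scored[:2]                       # elitism: keep the 2 best
        while len(nxt) < pop_size:
            a, b = rng.sample(scored[:10], 2)  # select from the fitter half
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]          # one-point crossover
            child = [rng.randrange(k) if rng.random() < pm else g
                     for g in child]           # random-label mutation
            nxt.append(child)
        pop = nxt
    return min(pop, key=lambda c: wcss(points, c, k))
```

Unlike k-means, such a search is not tied to a single initialization, which is one reason GA-based clustering can reach better-quality partitions at the cost of more fitness evaluations.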
Abstract: This article is dedicated to the development of mathematical models for determining the dynamics of the concentration of hazardous substances in the urban turbulent atmosphere. Developing the mathematical models meant taking into account the time-space variability of the meteorological fields and such properties of the turbulent atmosphere as its vortical, nonlinear, dissipative and diffusive nature. The turbulent airflow velocity is not assumed to be known when developing the model. However, a simplified model assumes that the ratio of turbulent to molecular diffusion is a piecewise constant function that changes with vertical distance from the earth's surface. Thereby an important assumption of vertical stratification of urban air, due to the atmospheric accumulation of hazardous substances emitted by motor vehicles, is introduced into the mathematical model. Through a non-degenerate transformation, the suggested simplified non-linear mathematical model for determining the sought exhaust concentration at an a priori unknown turbulent flow velocity is reduced to a model that is then solved analytically.
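Models of this kind are commonly built on the advection-diffusion equation for the pollutant concentration c; a generic form (the paper's actual equations are not reproduced here) is:

```latex
\frac{\partial c}{\partial t} + \mathbf{u} \cdot \nabla c = \nabla \cdot \left( K \nabla c \right) + Q
```

where $\mathbf{u}$ is the airflow velocity field, $K$ is the effective (turbulent plus molecular) diffusion coefficient, and $Q$ is the source term from vehicle emissions. The piecewise-constant diffusion-ratio assumption above corresponds to letting $K$ change stepwise with height.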
Abstract: This paper considers various channels of gamma-quantum generation via the interaction of an ultra-short high-power laser pulse with different targets. We analyse the possibilities of creating a pulsed gamma-radiation source using laser triggering of some nuclear reactions and isomer targets. It is shown that a sub-MeV monochromatic short pulse of gamma-radiation with pulse energy at the sub-mJ level can be obtained from an isomer target irradiated by an intense laser pulse. For the nuclear reaction channel in light-atom materials, it is shown that a sub-PW laser pulse gives rise to the formation of about a million gamma-photons of multi-MeV energy.
Abstract: University websites are considered one of the primary brand touch points for multiple stakeholders, but most of them do not have designs that create favorable impressions. Elements that web designers should carefully consider include the appearance, content, functionality, usability and search engine optimization; however, priority should be placed on website simplicity and negative space. In terms of content, previous research suggests that universities should include reputation, learning environment, graduate career prospects, destination image, cultural integration, and a virtual tour on their websites. This study examines how the top 200 world-ranking science- and technology-based universities present their brands online and whether their websites capture these content dimensions. Content analysis of the websites revealed that the top-ranking universities captured the dimensions to varying degrees. In addition, the UK-based university gave better priority to website simplicity and negative space compared with the Malaysian-based university.
Abstract: Network security remains a priority for almost all companies. Existing security systems have shown their limits; thus a new type of security system was born: honeypots. Honeypots are defined as programs or dedicated servers intended to attract attackers so that their behaviour can be studied. It is in this context that the leurre.com project, gathering about twenty platforms, was born. This article aims to specify a model of honeypot attacks. Our model describes, on a given platform, the evolution of attacks according to the hour at which they occur. We then identify the most attacked services by studying the attacks on the various ports. It should be noted that this article was elaborated within the framework of the research projects on honeypots within LABTIC (Laboratory of Information Technologies and Communication).
Abstract: An approach to developing an FPGA implementation of a flexible-key RSA encryption engine that can be used as a standard device in secured communication systems is presented. The VHDL model of this RSA encryption engine has the unique characteristic of supporting multiple key sizes, and thus can easily fit into systems that require different levels of security. Simple nested-loop addition and subtraction have been used to implement the RSA operation. This has made the processing faster and used a comparatively smaller amount of space in the FPGA. The hardware design is targeted at the Altera STRATIX II family, and it was determined that the flexible-key RSA encryption engine is best suited to the EP2S30F484C3 device. The RSA encryption implementation uses 13,779 logic elements and achieves a clock frequency of 17.77 MHz. It has been verified that this RSA encryption engine can perform 32-bit, 256-bit and 1024-bit encryption operations in less than 41.585 us, 531.515 us and 790.61 us, respectively.
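The nested-loop idea can be illustrated in software: the outer loop computes the modular exponentiation m^e mod n by repeated modular multiplication, the inner loop computes each multiplication by repeated modular addition, and the modular reduction itself needs only a subtraction. This is a behavioural sketch of the scheme, not the paper's VHDL design, which would use shift-based loops for practical key sizes.

```python
def mod_add(a, b, n):
    """(a + b) mod n using a single subtraction, assuming 0 <= a, b < n."""
    s = a + b
    if s >= n:
        s -= n
    return s

def mod_mul(a, b, n):
    """a * b mod n by repeated modular addition (inner loop)."""
    result = 0
    for _ in range(b):
        result = mod_add(result, a, n)
    return result

def rsa_encrypt(m, e, n):
    """m^e mod n by repeated modular multiplication (outer loop)."""
    c = 1
    for _ in range(e):
        c = mod_mul(c, m, n)
    return c
```

With the textbook toy key p=5, q=11 (n=55, e=3, d=27), encrypting 7 gives 13 and decrypting 13 with d recovers 7, confirming that the nested loops realize the RSA operation.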