Abstract: All over the world, including the Central and Eastern
European countries, sustainable tillage and sowing technologies are
applied increasingly broadly with a view to optimising soil resources,
mitigating soil degradation processes, saving energy resources,
preserving biological diversity, etc. As a result, altered conditions for
tillage and sowing technological processes are inevitably encountered. The
purpose of this study is to determine the seedbed topsoil hardness
when using a combined sowing coulter in different sustainable tillage
technologies. The research involved a combined coulter consisting
of two dissected blade discs and a shoe coulter. A multipenetrometer
was used to determine soil hardness in the seedbed area. The
experimental studies found that, in loosened soil, the combined
sowing coulter compacts the furrow bottom, walls, and soil near the
furrow equally; therefore, soil hardness there was similar at all
researched depths and no significant differences were established. In
loosened and compacted (double-rolled) soil, the impact of a
combined coulter on the hardness of seedbed soil surface was more
considerable at a depth of 2 mm. Soil hardness at the furrow bottom
and walls to a distance of up to 26 mm was 1.1 MPa. At a depth of 10
mm, the greatest hardness was established at the furrow bottom. In
loosened and heavily compacted (rolled six times) soil, at depths
of 2 and 10 mm the combined coulter compacted the furrow
bottom most of all, which reached a hardness of 1.8 MPa. At a depth of 20 mm, soil
hardness within the whole investigated area varied insignificantly and
fluctuated around 2.0 MPa. The hardness of the furrow walls and of the soil
near the furrow was approximately 1.0 MPa lower than that at the
furrow bottom.
Abstract: The quality and shelf life of foods containing lipids (fats and oils) are significantly reduced by rancidity. Applying natural antioxidants is one of the most effective ways to prevent the oxidation of oils and lipids. The antioxidant properties of juice extracted from barberry fruit (Berberis vulgaris L.) using maceration and SWE (10 bar and 120-180 °C) methods were investigated and compared with the conventional method. The amount of phenolic compounds and the reducing power of all samples were determined, and the data were statistically analyzed using a multifactor design. The results showed that the total amount of phenolic compounds increased with increasing pressure and temperature, from 1861.9 to 2439.1 mg gallic acid/100 g dry matter. The reducing power of the SWE-obtained antioxidant extract was compared with BHA (a synthetic antioxidant) and ascorbic acid (a natural antioxidant). There were significant differences among the reducing power of the extracts, and there was a remarkable difference with BHA and ascorbic acid (P
Abstract: The compression-absorption heat pump (C-A HP), one
of the promising heat-recovery devices that makes process hot
water using the low-temperature heat of wastewater, was evaluated by
computer simulation. A simulation program was developed based on
the continuity and the first and second laws of thermodynamics. Both
the absorber and the desorber were modeled using the UA-LMTD method. In
order to prevent an unfeasible temperature profile and to reduce
calculation errors from the curved temperature profile of a mixture,
the heat loads were divided into many segments. A single-stage
compressor was considered, its cooling load was taken into account,
and its isentropic efficiency was computed from map data.
Simulation conditions were given based on a system
consisting of conventionally designed components. The simulation results
show that most of the total entropy generation occurs during the
compression and cooling process, thus suggesting the possibility that
system performance can be enhanced if a rectifier is introduced.
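As a rough illustration of the UA-LMTD approach with segmented heat loads described above, the Python sketch below divides a single counter-flow heat duty into segments and sums the UA values obtained from the per-segment log-mean temperature differences. The stream temperatures, heat load, and segment count are hypothetical example values, not the paper's simulation inputs.

import math

def lmtd(dt_in, dt_out):
    """Log-mean temperature difference across one segment."""
    if abs(dt_in - dt_out) < 1e-9:
        return dt_in
    return (dt_in - dt_out) / math.log(dt_in / dt_out)

# Hypothetical counter-flow duty: hot mixture 95 -> 60 degC, process water
# 40 -> 70 degC, total heat load 100 kW, divided into 20 segments.
t_hot_in, t_hot_out = 95.0, 60.0
t_cold_in, t_cold_out = 40.0, 70.0
q_total, n_seg = 100.0, 20

ua_total = 0.0
for k in range(n_seg):
    # Assume each segment carries an equal share of the load and that each
    # stream temperature varies linearly with the heat transferred.
    f0, f1 = k / n_seg, (k + 1) / n_seg
    th0 = t_hot_in - (t_hot_in - t_hot_out) * f0
    th1 = t_hot_in - (t_hot_in - t_hot_out) * f1
    tc0 = t_cold_out - (t_cold_out - t_cold_in) * f0
    tc1 = t_cold_out - (t_cold_out - t_cold_in) * f1
    ua_total += (q_total / n_seg) / lmtd(th0 - tc0, th1 - tc1)

print(f"required UA: {ua_total:.2f} kW/K")

Segmenting the duty in this way avoids applying a single LMTD to a profile that is actually curved, which is the motivation given in the abstract for dividing the heat loads.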
Abstract: This article presents the development of efficient
algorithms for comparing tablet copies. Image recognition has
specialized use in digital systems such as medical imaging,
computer vision, defense, communication, etc. Comparison between
two images that look indistinguishable is a formidable task. Two
images taken from different sources might look identical but due to
different digitizing properties they are not. Moreover, small variations
in image information, such as cropping, rotation, and slight
photometric alteration, make such direct matching
techniques unsuitable. In this paper we introduce different matching
algorithms designed to help art centers identify real
painting images from fake ones. Different vision algorithms for
local image features are implemented using MATLAB. In this
framework, a Table Comparison Computer Tool “TCCT” is
designed to facilitate our research. The TCCT is a Graphical User
Interface (GUI) tool used to identify images by their shapes and
objects. The parameters of the vision system are fully accessible to the user
through this graphical user interface. For matching, it then
applies different description techniques that can identify the exact
figures of objects.
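The TCCT tool and its MATLAB algorithms are not reproduced here; as a minimal sketch of local-feature matching between two images, the Python snippet below uses OpenCV's ORB detector and a brute-force matcher. The file names, feature count, and distance threshold are illustrative assumptions rather than parameters from the paper.

import cv2

# Hypothetical file names for an original painting image and a suspected copy.
img1 = cv2.imread("original.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("candidate.jpg", cv2.IMREAD_GRAYSCALE)

# Detect local features and compute binary descriptors with ORB.
orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Brute-force matching with cross-check; Hamming distance suits ORB descriptors.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# A simple, illustrative similarity score: fraction of keypoints with a close match.
good = [m for m in matches if m.distance < 40]   # hypothetical distance threshold
score = len(good) / max(len(kp1), 1)
print(f"{len(good)} good matches, similarity score {score:.2f}")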
Abstract: In this paper we propose a Multiple Description Image Coding (MDIC) scheme to generate two compressed, rate-balanced descriptions in the wavelet domain (Daubechies biorthogonal (9, 7) wavelet) using an optimal pairwise correlating transform, and we apply the Generalized Multiple Description Coding (GMDC) method to image coding in the wavelet domain. The GMDC produces statistically correlated streams such that lost streams can be estimated from the received data. Our performance tests show that the proposed method gives more improvement and better quality of the reconstructed image when the wavelet coefficients are normalized by a Gaussian Scale Mixture (GSM) model rather than a Gaussian one.
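A minimal numpy sketch of the pairwise correlating transform idea behind such MDC schemes is given below: pairs of coefficients with unequal variances are mixed into two balanced, statistically correlated descriptions, and a lost description is estimated from the received one via their known correlation. The coefficients are drawn from plain Gaussians with made-up variances; the paper's GSM normalization and the optimal transform it derives are not implemented here.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-ins for paired wavelet coefficients with unequal variances.
var_a, var_b = 4.0, 0.25
a = rng.normal(0, np.sqrt(var_a), 10000)
b = rng.normal(0, np.sqrt(var_b), 10000)

# Pairwise correlating transform (45-degree rotation): two balanced descriptions.
c = (a + b) / np.sqrt(2)
d = (a - b) / np.sqrt(2)

# Suppose description d is lost: estimate it from c using the known correlation.
rho = (var_a - var_b) / (var_a + var_b)      # = cov(c, d) / var(c)
d_est = rho * c

# Invert the transform with the received c and the estimated d.
a_hat = (c + d_est) / np.sqrt(2)
b_hat = (c - d_est) / np.sqrt(2)
print("MSE of a:", np.mean((a - a_hat) ** 2))
print("MSE of b:", np.mean((b - b_hat) ** 2))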
Abstract: This paper presents the modeling and analysis of a 12-phase distribution static compensator (DSTATCOM), which is capable of balancing the source currents in spite of unbalanced loading and phase outages. In addition to balancing the supply current, the power factor can be set to a desired value. The theory of instantaneous symmetrical components is used to generate the twelve-phase reference currents. These reference currents are then tracked using a current-controlled voltage source inverter operated in a hysteresis band control scheme. An ideal compensator is used in place of a physical realization of the compensator. The performance of the proposed DSTATCOM is validated through MATLAB simulation, and detailed simulation results are given.
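The twelve-phase reference-current generation itself is not reproduced here; as a hedged illustration of the underlying theory, the numpy sketch below decomposes an unbalanced three-phase current set into its instantaneous symmetrical components (zero, positive, and negative sequence). The phasor values are hypothetical.

import numpy as np

a = np.exp(2j * np.pi / 3)    # complex rotation operator

# Symmetrical-component transformation matrix (zero, positive, negative sequence).
A = (1 / 3) * np.array([[1, 1,    1],
                        [1, a,    a**2],
                        [1, a**2, a]])

# Hypothetical unbalanced phase currents expressed as complex phasors (A).
i_abc = np.array([10.0 + 0j,
                  6.0 * np.exp(-1j * np.deg2rad(100)),
                  12.0 * np.exp(1j * np.deg2rad(130))])

i_zero, i_pos, i_neg = A @ i_abc
print("zero-sequence current:    ", abs(i_zero), "A")
print("positive-sequence current:", abs(i_pos), "A")
print("negative-sequence current:", abs(i_neg), "A")

The positive-sequence component is the balanced part of the load current; a compensator built on this theory injects the remainder so that the source supplies only the balanced part at the desired power factor.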
Abstract: Stream Control Transmission Protocol (SCTP) has been
proposed to provide reliable transport of real-time communications.
Due to its attractive features, such as multi-streaming and multi-homing,
SCTP is often expected to be an alternative protocol
to TCP and UDP. In the original SCTP standard, the secondary path
is mainly regarded as a redundancy. Recently, most research
has focused on extending SCTP to enable a host to send its
packets to a destination over multiple paths simultaneously. In order
to transfer packets concurrently over the multiple paths, the SCTP
should be well designed to avoid unnecessary fast retransmission
and misestimation of the congestion window size across the paths.
Therefore, we propose an Enhanced Cooperative ACK SCTP (ECA-SCTP)
to improve the path recovery efficiency of a multi-homed host
operating in concurrent multipath transfer mode. We evaluated the
performance of our proposed scheme using ns-2 simulation in terms
of cwnd variation, path recovery time, and goodput. Our scheme
provides better performance in lossy and path-asymmetric networks.
Abstract: This microfluidic study examines the flow behavior within a Y-shaped micro-bifurcation in two similar flow configurations. We report a numerical and experimental investigation of the evolution of the velocity profiles and of the secondary flows manifested at different Reynolds numbers (Re) and for two different boundary conditions. The experiments are performed using a specially designed setup based on optical microscopy devices. With this setup, direct visualizations and quantitative measurements of the path-lines are obtained. A Micro-PIV measurement system is used to obtain velocity profile distributions and their spatial evolution in the main flow domains. The experimental data are compared with numerical simulations performed with the commercial computational code FLUENT in a 3D geometry with the same dimensions as the experimental one. The numerical flow patterns are found to be in good agreement with the experimental observations.
Abstract: A new dual-fluid concept was studied that could eventually find application in cold-gas propulsion for small space satellites or other constant-flow applications. In its basic form, the concept uses two different refrigerant working fluids, each having a different saturation vapor pressure. The higher vapor pressure refrigerant remains in the saturated phase and is used to pressurize the lower saturation vapor pressure fluid (the propellant), which remains in the compressed liquid phase. A demonstration thruster concept based on this principle was designed and built to study its operating characteristics. An automotive-type electronic fuel injector was used to meter and deliver the propellant. Ejected propellant mass and momentum were measured for several combinations of refrigerants and hydrocarbon fluids. The thruster has the advantage of delivering relatively large total impulse at low tank pressure within a small volume.
Abstract: This paper aims to initiate an analytical account of the
issues of compliance with the economy condition for applying an
incentive pay system in an enterprise. Economy is considered one of the
conditions for effective incentive pay system application, another
condition being the achievement of the desired efficiency level of the
incentive pay system application. A bonus pay system is discussed as
an example.
Abstract: Perceptions of quality from both the designers' and the users'
perspectives have now stretched beyond traditional usability,
incorporating abstract and subjective concepts. This has led to a shift
in the focus of human-computer interaction research communities; a shift
towards achieving user experience (UX) by fulfilling not only
conventional usability needs but also those that go beyond them. The
term UX, although widespread and given significant importance,
lacks a consensus, unified definition. In this paper, we survey
various UX definitions and modeling frameworks and examine them
as the foundation for proposing a UX evolution lifecycle framework
for understanding UX in detail. In the proposed framework we identify
the building blocks of UX and discuss how UX evolves in various
phases. The framework can be used as a tool to understand experience
requirements and evaluate them, resulting in better UX design and
hence improved user satisfaction.
Abstract: Covering-based rough sets are an extension of rough
sets based on a covering instead of a partition of the
universe. They are therefore more powerful than rough sets in describing
some practical problems. However, this extension can increase the
roughness of each model in recognizing objects. How to obtain better
approximations from covering-based rough set models is an important issue.
In this paper, two concepts, determinate elements and indeterminate
elements in a universe, are proposed and given precise definitions.
This research makes a reasonable refinement of the covering elements
from a new viewpoint, and the refinement may generate better
approximations in covering-based rough set models. To verify the
theory, it is applied to eight major covering-based rough set models
adapted from the literature. The result is that, in all these models,
the lower approximation increases effectively, while, correspondingly,
the upper approximation decreases, with the exception of two models in
some special situations. Therefore, the roughness in recognizing
objects is reduced. This research provides a new approach to the study
and application of covering-based rough sets.
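For readers unfamiliar with covering approximations, a minimal Python sketch of the commonly used lower and upper approximations under a covering is given below; the universe, covering, and target set are toy data, and the determinate/indeterminate-element refinement proposed in this paper is not implemented.

def lower_approx(covering, target):
    """Union of covering blocks fully contained in the target set."""
    return set().union(*([blk for blk in covering if blk <= target] or [set()]))

def upper_approx(covering, target):
    """Union of covering blocks that intersect the target set."""
    return set().union(*([blk for blk in covering if blk & target] or [set()]))

# Toy universe and covering (blocks may overlap, unlike a partition).
U = {1, 2, 3, 4, 5, 6}
C = [{1, 2}, {2, 3, 4}, {4, 5}, {5, 6}]
X = {1, 2, 3, 5}

print("lower:", lower_approx(C, X))   # {1, 2}
print("upper:", upper_approx(C, X))   # {1, 2, 3, 4, 5, 6}

A refinement in the spirit of the abstract would enlarge the lower approximation and shrink the upper approximation, reducing the gap between them and hence the roughness.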
Abstract: In this work, two formulations of the boundary element method (BEM) for the linear bending analysis of plates reinforced by beams are discussed. Both formulations are based on Kirchhoff's hypothesis and are obtained from the reciprocity theorem applied to zoned plates, where each sub-region defines a beam or a slab. In the first model the problem values are defined along the interfaces and the external boundary. Then, in order to reduce the number of degrees of freedom, kinematic hypotheses are assumed along the beam cross-section, leading to a second formulation where the collocation points are defined along the beam skeleton instead of being placed on the interfaces. In these formulations no approximation of the generalized forces along the interface is required. Moreover, compatibility and equilibrium conditions along the interface are automatically imposed by the integral equation. Thus, these formulations require less approximation and the total number of degrees of freedom is reduced. In the numerical examples, the differences between these two BEM formulations are discussed, and the results are also compared with a well-known finite element code.
Abstract: Photoselective plastic films with thermic properties
are now available so that greenhouses clad with such plastics exhibit
a higher degree of “Greenhouse Effect” with a consequent increase in
night time temperature. In this study, we investigate the potential
benefits of a range of thermic plastic films used as greenhouse cover
materials on the vegetative and reproductive growth and development
of Iceberg lettuce (Lactuca sativa L.). Transplants were grown under
thermic films and destructively harvested 4, 5, and 6 weeks after
transplanting. Thermic films increased night temperatures by up to 2
°C, reducing the wide fluctuation in greenhouse temperature during
winter compared to the standard commercial film, and consequently
increased the yield (leaf number, fresh weight, and dry weight) of the
lettuce plants. Lettuce plants grown under the Clear film responded to cold
stress by accumulating secondary products (phenolics and
flavonoids).
Abstract: The purpose of this paper is to study Database Models
to use them efficiently in E-commerce websites. In this paper we
seek a method that can save and retrieve information in E-commerce
websites in a form that semantic web applications can work with,
and we also study the different technologies of E-commerce
databases. Since one of the most important deficits of the
semantic web is the shortage of semantic data, as most of the
information is still stored in relational databases, we present an
approach to map legacy data stored in relational databases into the
Semantic Web using virtually any modern RDF query language, as
long as it is closed within RDF. To achieve this goal we study XML
structures for the relational databases of old websites, and eventually we
move one level above XML and look for a mapping from the relational
model (RDM) to RDF. Noting that a large number of semantic web applications
take advantage of the relational model, opening up the ways in which it can be
converted to XML and RDF in modern (semantic web) systems is
important.
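As a minimal sketch of the general relational-to-RDF idea (not the specific RDM-to-RDF mapping developed in the paper), the Python snippet below turns the rows of a toy SQLite table into N-Triples; the table, namespace, and naming scheme are illustrative assumptions.

import sqlite3

# Toy relational table standing in for legacy e-commerce data (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT, price REAL)")
conn.executemany("INSERT INTO product VALUES (?, ?, ?)",
                 [(1, "Keyboard", 25.0), (2, "Monitor", 180.0)])

BASE = "http://example.org/shop/"   # hypothetical namespace

def row_to_triples(table, columns, row):
    """Map one relational row to RDF triples in N-Triples syntax."""
    subject = f"<{BASE}{table}/{row[0]}>"
    triples = [f"{subject} <{BASE}type> <{BASE}{table}> ."]
    for col, value in zip(columns[1:], row[1:]):
        triples.append(f'{subject} <{BASE}{table}#{col}> "{value}" .')
    return triples

cursor = conn.execute("SELECT id, name, price FROM product")
columns = [d[0] for d in cursor.description]
for row in cursor:
    print("\n".join(row_to_triples("product", columns, row)))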
Abstract: Microarray data profiles gene expression on a whole-genome
scale and therefore provides a good way to study associations
between gene expression and the occurrence or progression of cancer.
More and more researchers have realized that microarray data is helpful
for predicting cancer samples. However, the dimension of the gene
expression data is much larger than the sample size, which makes this
task very difficult. Therefore, how to identify the significant genes
causing cancer has become an urgent, and also a hot and hard, research
topic. Many feature selection algorithms have been proposed in
the past focusing on improving cancer predictive accuracy at the
expense of ignoring the correlations between the features. In this
work, a novel framework (named SGS) is presented for stable gene
selection and efficient cancer prediction. The proposed framework
first performs a clustering algorithm to find gene groups in which the
genes have higher correlation coefficients, then
selects the significant genes in each group with the Bayesian Lasso and
the important gene groups with the group Lasso, and finally builds a prediction
model on the shrunken gene space with an efficient classification
algorithm (such as SVM, 1NN, or regression). Experimental
results on real-world data show that the proposed framework often
outperforms existing feature selection and prediction methods
such as SAM, IG, and Lasso-type prediction models.
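A hedged, simplified sketch of such a group-then-select-then-classify pipeline is shown below using scikit-learn. KMeans stands in for the clustering step, a plain Lasso stands in for both the Bayesian Lasso and group Lasso stages (neither is part of scikit-learn), an SVM is the final classifier, and the data are synthetic; this illustrates the shape of the pipeline, not the SGS framework itself.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Lasso
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))                  # 60 samples, 500 synthetic "genes"
y = (X[:, :5].sum(axis=1) > 0).astype(int)      # labels driven by the first 5 genes

# Step 1: group correlated genes by clustering the gene (feature) vectors.
groups = KMeans(n_clusters=20, n_init=10, random_state=0).fit_predict(X.T)

# Step 2: within each group, keep the genes with non-zero Lasso coefficients
# (plain Lasso used as a stand-in for the Bayesian/group Lasso stages).
selected = []
for g in np.unique(groups):
    idx = np.where(groups == g)[0]
    coef = Lasso(alpha=0.01).fit(X[:, idx], y).coef_
    selected.extend(idx[np.abs(coef) > 1e-6])
if not selected:                                # fallback so the sketch always runs
    selected = list(range(X.shape[1]))

# Step 3: build the classifier on the shrunken gene space.
# (Illustration only: in practice selection should be nested inside the CV loop.)
score = cross_val_score(SVC(kernel="linear"), X[:, selected], y, cv=5).mean()
print(f"{len(selected)} genes selected, CV accuracy {score:.2f}")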
Abstract: Determining the length of thread engagement of a bolt installed in a tapped part in order to avoid thread stripping remains a very current problem in the design of threaded assemblies. No formalized calculation method exists for the cases where the bolt is screwed directly into a ductile material. In this article, we study the thread-stripping behavior of a loaded assembly using finite element modelling and a damage-based rupture criterion. This modelling enables us to study the different parameters likely to influence the behavior of this bolted connection. We study, in particular, the influence of the pair of materials constituting the connection, of the bolt's diameter, and of the geometrical characteristics of the tapped part, such as the external diameter and the length of thread engagement. We established a design of experiments to identify the most significant parameters. This enables us to propose a simple expression for calculating the resistance of the threads whatever the metallic materials of the bolt and the tapped part. We carried out stripping tests in order to validate our model. The estimated results are very close to those obtained by the tests.
Abstract: Names are important in many societies, even in technologically oriented ones which use, e.g., ID systems to identify individual people. Names such as surnames are the most important, as they are used in many processes, such as identifying people and genealogical research. On the other hand, variation in names can be a major problem for the identification of and search for people, e.g., in web search or for security reasons. Name matching presumes a priori that the recorded name written in one alphabet reflects the phonetic identity of two samples or some transcription error in copying a previously recorded name; we add to this the assumption that the two names imply the same person. This paper describes name variations and gives a basic description of various name matching algorithms developed to overcome name variation and to find reasonable variants of names which can be used to further increase matches for record linkage and name search. The implementation contains algorithms for computing a range of fuzzy matches based on different types of algorithms, e.g., composite and hybrid methods, and allows us to test and measure the algorithms for accuracy. NYSIIS, LIG2, and Phonex have been shown to perform well and provide sufficient flexibility to be included in the linkage/matching process for optimising name searching.
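NYSIIS, LIG2, and Phonex are not reproduced here; as a minimal sketch of the phonetic-encoding idea this family of algorithms shares, the Python function below implements a simplified Soundex code and applies it to a few name variants.

def soundex(name: str) -> str:
    """Simplified Soundex: first letter plus three digits for consonant groups.
    (The full standard has extra rules for 'h'/'w' that are omitted here.)"""
    codes = {**dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
             **dict.fromkeys("dt", "3"), "l": "4",
             **dict.fromkeys("mn", "5"), "r": "6"}
    name = "".join(ch for ch in name.lower() if ch.isalpha())
    if not name:
        return ""
    digits = [codes.get(ch, "") for ch in name]
    out, prev = name[0].upper(), digits[0]
    for d in digits[1:]:
        if d and d != prev:          # skip vowels, collapse repeated codes
            out += d
        prev = d
    return (out + "000")[:4]

# Spelling variants of the same name tend to map to the same code.
for n in ["Robert", "Rupert", "Ashcraft", "Ashcroft", "Tymczak"]:
    print(n, soundex(n))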
Abstract: This paper presents a hybrid association control
scheme that can maintain load balancing among access points in the
wireless LANs and can satisfy the quality of service requirements of
the multimedia traffic applications. The proposed model is
mathematically described as a linear programming model. Simulation
study and analysis were conducted in order to demonstrate the
performance of the proposed hybrid load balancing and association
control scheme. Simulation results show that the proposed scheme
outperforms the other schemes in terms of the blocking percentage
and the quality of the data transfer rate provided to multimedia
and real-time applications.
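As a hedged sketch of how such an association problem can be posed as a linear program, the Python snippet below uses SciPy's linprog to split the traffic of a few stations across access points so that the maximum AP load is minimized. The demands, AP count, and the relaxed (fractional) association variables are illustrative assumptions, not the paper's actual formulation.

import numpy as np
from scipy.optimize import linprog

# Hypothetical traffic demands of 6 stations (Mbps) and 3 access points.
demand = np.array([2.0, 5.0, 1.5, 4.0, 3.0, 2.5])
n, m = len(demand), 3

nvar = n * m + 1                       # x_ij (association fractions) + t (max AP load)
c = np.zeros(nvar)
c[-1] = 1.0                            # minimise t

# Each station's demand must be fully assigned: sum_j x_ij = 1.
A_eq = np.zeros((n, nvar))
for i in range(n):
    A_eq[i, i * m:(i + 1) * m] = 1.0
b_eq = np.ones(n)

# Every AP's load must stay below t: sum_i d_i * x_ij - t <= 0.
A_ub = np.zeros((m, nvar))
for j in range(m):
    for i in range(n):
        A_ub[j, i * m + j] = demand[i]
    A_ub[j, -1] = -1.0
b_ub = np.zeros(m)

bounds = [(0, 1)] * (n * m) + [(0, None)]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
x = res.x[:-1].reshape(n, m)
print("max AP load:", round(res.x[-1], 3))
print("association fractions:\n", x.round(3))

In a practical association-control scheme each station is attached to a single AP, so the fractional solution would be rounded or the problem solved as an integer program; the sketch only shows the load-balancing objective and constraints.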
Abstract: Recently, grid computing has attracted wide attention in
the science, industry, and business fields, which require a vast
amount of computing. Grid computing provides an environment
in which many nodes (i.e., many computers) are connected with each
other through a local/global network and made available to many
users. In this environment, to achieve data processing among nodes
for any application, each node performs mutual authentication by
using certificates published by the Certificate Authority
(CA for short). However, if a failure or fault occurs in the
CA, no new certificates can be published by the CA. As
a result, a new node cannot participate in the grid environment.
In this paper, an off-the-shelf scheme for dependable grid systems
using virtualization techniques is proposed and its implementation is
verified. The proposed approach uses the virtualization techniques
to restart an application, e.g., the CA, if it has failed, so the system
can tolerate a failure or fault occurring in the CA. Since
the proposed scheme is easily implemented at the application level,
its implementation cost for the system builder is very low
compared with other methods. Simulation results show that the
CA in the system can recover from its failure or fault.
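The paper's virtualization-based implementation is not reproduced here; as a minimal sketch of the application-level restart idea, the Python snippet below runs a stand-in CA process and restarts it whenever it exits. The command and check interval are hypothetical.

import subprocess
import time

CA_COMMAND = ["python", "ca_service.py"]   # hypothetical command that starts the CA
CHECK_INTERVAL = 5                          # seconds between liveness checks

def run_with_restart():
    """Start the CA process and restart it whenever it terminates unexpectedly."""
    process = subprocess.Popen(CA_COMMAND)
    while True:
        time.sleep(CHECK_INTERVAL)
        if process.poll() is not None:      # the process has exited (failure or fault)
            print(f"CA exited with code {process.returncode}; restarting")
            process = subprocess.Popen(CA_COMMAND)

if __name__ == "__main__":
    run_with_restart()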