Abstract: Stormwater wetlands have been designed largely empirically for water quality improvement, with little quantitative understanding of the internal microbial processes. This study investigated the heterotrophic bacterial production rate, the percentage of heterotrophic bacterial mineralization, and algal biomass in hypertrophic and eutrophic surface-flow stormwater wetlands. Compared with a nearby wood leachate treatment wetland, the stormwater wetlands had much higher chlorophyll-a concentrations. The eutrophic stormwater wetland improved water quality, whereas the hypertrophic stormwater wetland degraded it. Heterotrophic bacterial activity in the water column of the stormwater wetlands was limited by competition from algal growth for nutrients. The relative contribution of biofilms to the overall heterotrophic activity was higher in the stormwater wetlands than in the wood leachate treatment wetland.
Abstract: At present, the tendency to implement condition-based maintenance (CBM), which allows optimization of the expenses for equipment monitoring, is more and more evident; transformer substations with remote monitoring are also increasingly used. This paper reviews the advantages of on-line monitoring and presents equipment for on-line monitoring of bushings, which is the authors' own contribution. The paper presents a study of the temperature field using the finite element method; for this study, a 3D model of the above-mentioned bushing was built. The analysis takes into account the extreme thermal stresses, focusing on the first cooling-wing section of the ceramic insulator. This makes it possible to explain the variation of tan δ over time as a function of the transformer loading and the environmental conditions. To reduce the variation of dielectric losses in the bushing insulation, the use of ferrofluids instead of mineral oils is proposed.
Abstract: Ovshinsky initiated scientific research in the field of amorphous and disordered materials that continues to this day. The Ovshinsky Effect, in which the resistance of thin GST films is significantly reduced upon the application of a low voltage, is of fundamental importance in phase-change random access memory (PC-RAM) devices. GST stands for GeSbTe chalcogenide-type glasses. However, the Ovshinsky Effect is not without controversy. Ovshinsky thought the resistance of GST films is reduced by the redistribution of charge carriers, whereas others at that time, including many PC-RAM researchers today, argue that the GST resistance changes because the GST amorphous state is transformed to the crystalline state by melting, with the heat supplied by external heaters. In this controversy, quantum mechanics (QM) asserts that the heat capacity of GST films vanishes, and therefore melting cannot occur, as the heat supplied cannot be conserved by an increase in GST film temperature. By precluding melting, QM re-opens the controversy between the melting and charge-carrier mechanisms. Supporting analysis is presented to show that, instead of increasing the GST film temperature, conservation proceeds by the QED-induced creation of photons within the GST film, the QED photons being confined by TIR. QED stands for quantum electrodynamics and TIR for total internal reflection. The TIR confinement of QED photons is enhanced by the fact that the heat energy absorbed in the GST film is concentrated in the TIR mode because of the film's high surface-to-volume ratio. The QED photons, having Planck energy beyond the ultraviolet, produce excitons by the photoelectric effect, the electrons and holes of which reduce the GST film resistance.
Abstract: There is a growing body of evidence to support the proposition of product take-back for remanufacturing, particularly within the context of Extended Producer Responsibility (EPR). Remanufacturing, however, presents challenges unlike those of traditional manufacturing environments due to its high levels of uncertainty, which may deter organizations from considering its potential benefits. This paper presents a novel modeling approach for evaluating the uncertainty of part failures within the remanufacturing process and its impact on economic and environmental performance measures. Both the theoretical modeling approach and an example of its use in application are presented.
Abstract: The adhesion strength of the exterior or interior coating of steel pipes is highly important. Increasing coating adhesion to the surface extends the coating's lifetime, raises the safety factor of the transmission pipeline, and reduces the rate of corrosion and the associated costs. Steel pipe surfaces are prepared before coating by shot and grit blasting, a mechanical process. Effective parameters of this process include the abrasive particle size, distance to the surface, abrasive flow rate, the abrasive's physical properties and shape, abrasive selection, the type and power of the machine, the surface cleanliness standard, roughness, blasting time, and ambient humidity. This research aimed to find conditions that improve the surface preparation, adhesion strength, and corrosion resistance of the coating. Accordingly, this paper studied the effects of varying the abrasive flow rate, the abrasive particle size, and the surface blasting time on steel surface roughness, as well as the effect of over-blasting, using a centrifugal blasting machine. A number of steel samples (according to API 5L X52) were prepared and coated with epoxy powder, and the adhesion strength of the coatings was compared by the pull-off test. The results show that increasing the abrasive particle size and flow rate increases the steel surface roughness and coating adhesion strength, but increasing the blasting time causes over-blasting, which raises the surface temperature and hardness while decreasing the steel surface roughness and coating adhesion strength.
Abstract: This paper presents the use of ferroelectric materials in antenna applications. Two different ferroelectrics were used in the proposed antennas, Barium Strontium Titanate (BST) and Bismuth Titanate (BiT), suitable for access points operating in the WLAN IEEE 802.11 b/g and WiMAX IEEE 802.16 bands within the 2.3 GHz to 2.5 GHz range. BST was measured to have a dielectric constant of εr = 15, while BiT has a higher dielectric constant of εr = 21; both materials are rectangular in shape. The influence of various parameters on the antenna characteristics was investigated extensively using commercial electromagnetic simulation software by Computer Simulation Technology (CST). Theoretical analysis and simulation results demonstrated that the ferroelectric materials used not only improved the directive emission but also enhanced the radiation efficiency.
Abstract: The in vivo effect of extracellular ubiquitin on regenerating liver cells and liver histoarchitectonics has been studied. Experiments were performed on mature female white rats. Partial hepatectomy was performed using the modified method of Higgins and Anderson. Standard histopathological assessment of liver tissue was used. The proliferative activity of hepatocytes was analyzed by the colchicine mitotic index and by immunohistochemical staining for Ki-67. We found that, regardless of the number of injections and the dose of extracellular ubiquitin, liver histology was unchanged, so no effect was observed at the tissue level. In vivo, a double injection of ubiquitin significantly decreased mitotic activity at the 32-hour point after partial hepatectomy. Thus, we conclude that in vivo injected extracellular ubiquitin inhibits the proliferative activity of hepatocytes in partially hepatectomized rats.
Abstract: This paper describes a novel and effective approach to content-based image retrieval (CBIR) that represents each image in the database by a vector of feature values called "standard deviation of mean vectors of color distribution of rows and columns of images for CBIR". In many areas of commerce, government, academia, and hospitals, large collections of digital images are being created. This paper describes an approach that uses image content as the feature vector for retrieval of similar images. Several classes of features are used to specify queries: color, texture, shape, and spatial layout. Color features are often obtained directly from the pixel intensities. In this paper, feature extraction is performed for the texture descriptors 'variance' and 'variance of variances'. First, the standard deviation of the row means and of the column means is calculated for the R, G, and B planes; these six values per image form one feature vector. Second, we calculate the variance of each row and column of the R, G, and B planes of an image, and the six standard deviations of these variance sequences form a second feature vector of dimension six. We applied our approach to a database of 300 BMP images and evaluated the capability of automatic indexing by image content, using color and texture as features and Euclidean distance as the similarity measure.
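A minimal sketch of the first descriptor above (the function names and the use of NumPy are my assumptions for illustration, not the authors' implementation):

```python
import numpy as np

def rowcol_std_features(img):
    """Std. dev. of the row means and of the column means for each of
    the R, G, B planes -> a 6-element feature vector per image."""
    feats = []
    for c in range(3):                          # R, G, B planes
        plane = img[:, :, c].astype(float)
        feats.append(plane.mean(axis=1).std())  # std of row means
        feats.append(plane.mean(axis=0).std())  # std of column means
    return np.array(feats)

def euclidean(a, b):
    """Euclidean distance, the similarity measure used for retrieval."""
    return float(np.sqrt(((a - b) ** 2).sum()))
```

Retrieval then ranks database images by the Euclidean distance between their feature vectors and the query's.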
Abstract: By utilizing echo intensity and its distribution across different organs and local details of the human body, ultrasonic imaging can capture important medical pathological changes, which unfortunately may be obscured by ultrasonic speckle noise. A feature-preserving ultrasonic image denoising and edge enhancement scheme is put forth, which includes two terms, anisotropic diffusion and edge enhancement, controlled by the optimum smoothing time. In this scheme, the anisotropic diffusion is governed by a local coordinate transformation and the first- and second-order normal derivatives of the image, while the edge enhancement is performed by a hyperbolic tangent function. Experiments on real ultrasonic images indicate that our scheme effectively preserves edges, local details, and ultrasonic echoic bright strips during denoising.
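The diffusion scheme above is specific to the authors; as a generic reference point, one explicit step of classic Perona-Malik anisotropic diffusion can be sketched as follows (a textbook sketch, not the paper's method; periodic borders are assumed for brevity):

```python
import numpy as np

def perona_malik_step(u, kappa=0.1, dt=0.2):
    """One explicit Perona-Malik diffusion step: the conductance
    g = exp(-(d/kappa)^2) lets homogeneous regions smooth while
    diffusion across strong edges is suppressed."""
    dN = np.roll(u, -1, axis=0) - u   # differences to the four
    dS = np.roll(u, 1, axis=0) - u    # neighbours (periodic borders)
    dE = np.roll(u, -1, axis=1) - u
    dW = np.roll(u, 1, axis=1) - u
    g = lambda d: np.exp(-(d / kappa) ** 2)
    return u + dt * (g(dN) * dN + g(dS) * dS + g(dE) * dE + g(dW) * dW)
```

Iterating this step for a chosen number of iterations plays the role of the "smoothing time" mentioned above.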
Abstract: Two paradigms have been proposed to provide QoS for Internet applications: Integrated Services (IntServ) and Differentiated Services (DiffServ). IntServ is not appropriate for a large network like the Internet because it is very complex. Therefore, to reduce the complexity of QoS management, DiffServ was introduced to provide QoS within a domain using flow aggregation and per-class service. In these networks the QoS relationship between classes is fixed, which allows low-priority traffic to be affected by high-priority traffic; this is undesirable. In this paper, we propose a fuzzy controller that reduces the effect of the high-priority classes on the lower-priority ones. Our simulations show that our approach effectively reduces the latency dependency of the low-priority class on the higher-priority ones.
Abstract: To maintain spine stability during the postoperative period, transpedicular fixation of its elements is often used. Transpedicular systems are usually formed of rods that together constitute a frame-type structure, fastened to the vertebrae by screws. Such a structure should be rigid and should bear the loads acting from the spine without significant deformation. From the standpoint of improving known designs, the stress state of the structure as a whole, and of each of its elements in particular, is of interest. In this study, the transpedicular screw is modeled and its deformations are estimated, taking into account its interaction with a vertebral body of variable structure.
Abstract: Data mining is an extraordinarily demanding field concerned with the extraction of implicit knowledge and relationships that are not explicitly stored in databases. A wide variety of data mining methods have been introduced (classification, characterization, generalization, etc.), and each of these methods includes more than one algorithm. A data mining system involves different user categories, which means that user behavior must be a component of the system. The problem at this level is to know which algorithm of which method to employ for an exploratory purpose, which one for a decisional purpose, and how they can collaborate and communicate. The agent paradigm presents a new way of designing and realizing a data mining system. The purpose is to combine different data mining algorithms to prepare elements for decision-makers, benefiting from the possibilities offered by multi-agent systems. In this paper, an agent framework for data mining is introduced, and its overall architecture and functionality are presented. The validation is performed on spatial data, and the principal results are presented.
Abstract: As chip density increases, designers are trying to place as many computational and storage facilities as possible on a single chip. With the growing complexity of computational and storage circuits, designing, testing, and debugging become more and more complex and expensive. Hardware designs are therefore built using a very-high-speed hardware description language, which is more efficient and cost-effective. This paper focuses on the implementation of a 32-bit ALU design based on the Verilog hardware description language. The adder and subtractor operate correctly on both unsigned and positive numbers. In an ALU, addition takes most of the time if a ripple-carry adder is used. The general strategy for designing fast adders is to reduce the time required to form carry signals; adders that use this principle are called carry look-ahead adders. Here, the carry look-ahead adder is designed as a combination of 4-bit adders. The syntax of Verilog HDL is similar to that of the C programming language. This paper proposes a unified approach to ALU design in which simulation and formal verification can coexist.
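The carry look-ahead principle mentioned above can be sketched behaviourally (in Python rather than Verilog, purely for illustration):

```python
def cla_4bit(a, b, cin=0):
    """4-bit carry look-ahead addition: generate g_i = a_i AND b_i and
    propagate p_i = a_i XOR b_i feed c_{i+1} = g_i OR (p_i AND c_i);
    in hardware this recurrence is fully expanded so every carry is
    available after a fixed gate delay instead of rippling."""
    g = [(a >> i) & (b >> i) & 1 for i in range(4)]
    p = [((a >> i) ^ (b >> i)) & 1 for i in range(4)]
    c = [cin]
    for i in range(4):
        c.append(g[i] | (p[i] & c[i]))   # expanded in real hardware
    s = sum((p[i] ^ c[i]) << i for i in range(4))
    return s, c[4]  # 4-bit sum, carry-out
```

A 32-bit adder would then chain eight such 4-bit blocks, as the abstract describes.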
Abstract: In an open real-time system environment, the coexistence of different kinds of real-time and non-real-time applications confronts the system scheduling mechanism with new requirements and challenges. A two-level scheduling scheme for open real-time systems is reviewed, and we point out that when hard and soft real-time applications are scheduled indistinguishably as the same type of real-time application, Quality of Service (QoS) cannot be guaranteed. The scheme has two flaws. First, it cannot differentiate the scheduling priorities of hard and soft real-time applications; that is, it neglects the characteristic differences between hard real-time applications and soft ones, so it does not suit a more complex real-time environment. Second, the worst-case execution time of soft real-time applications cannot be predicted exactly, so it is not worthwhile to spend much effort ensuring that no soft real-time application misses its deadline, and doing so may waste resources. To solve this problem, a novel two-level real-time scheduling mechanism (comprising a scheduling profile and a scheduling algorithm) that adds a process for handling soft real-time applications is proposed. Finally, we verify the scheduling mechanism both theoretically and experimentally. The results indicate that our scheduling mechanism achieves the following objectives. (1) It reflects the difference in priority when scheduling hard and soft real-time applications. (2) It ensures the schedulability of hard real-time applications; that is, their deadline miss rate is 0. (3) The overall deadline miss rate of soft real-time applications is kept below 1. (4) Although no deadline is set for non-real-time applications, the scheduling algorithm used by the server avoids the "starvation" of jobs and increases QoS. Consequently, our scheduling mechanism is more compatible with different types of applications and can be applied more widely.
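As a toy illustration of class-then-deadline dispatching of the kind implied above (my simplification; the paper's actual scheduling algorithm is richer than this):

```python
def next_job(jobs):
    """Dispatch order: hard real-time (cls 0) before soft real-time
    (cls 1) before non-real-time (cls 2); within a class, the job
    with the earliest deadline runs first (EDF)."""
    return min(jobs, key=lambda j: (j["cls"], j["deadline"]))
```

Hard real-time jobs are thus never delayed by soft ones, while deadlines still order jobs within each class.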
Abstract: In designing river intakes and diversion structures, it is paramount that the sediments entering the intake be minimized or, if possible, completely excluded. Due to high water velocity, sediments can significantly damage hydraulic structures, especially when mechanical equipment such as pumps and turbines is used. This subsequently results in wasted water and electricity and in further costs. It is therefore prudent to investigate and analyze the performance of lateral intakes affected by sediment control structures. Laboratory experiments, despite their vast potential and benefits, face certain limitations and challenges, including limited equipment and facilities, space constraints, equipment errors such as inadequate precision or mal-operation, and, finally, human error. Research has shown that to achieve the ultimate goal of intake structure design, namely long-lasting and proficient structures, the best combination of sediment control structures (such as a sill and submerged vanes) should be determined along with the parameters that increase their performance (such as diversion angle and location). Cost, difficulty of execution, and environmental impacts should also be included in evaluating the optimal design. This solution can then be applied to similar problems in the future. The model used to arrive at the optimal design therefore requires a high level of accuracy and precision in order to avoid improper design and execution of projects, and the process of creating and executing the design should be as comprehensive and applicable as possible. It is thus important that the influential parameters and vital criteria are fully understood and applied at all stages of choosing the optimal design. In this article, the parameters influencing optimal intake performance, the advantages and disadvantages, and the efficiency of a given design are studied. A multi-criterion decision matrix is then utilized to choose the optimal model, which can be used to determine the proper parameters in constructing the intake.
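A multi-criterion decision matrix of the kind mentioned above can, in its simplest weighted-sum form, be sketched as follows (a generic illustration; the article's actual criteria and weights are not given here):

```python
def weighted_scores(matrix, weights):
    """Score each alternative (row) against the criteria (columns) by
    a weighted sum of normalized ratings; return the index of the
    best alternative together with all scores."""
    scores = [sum(w * v for w, v in zip(weights, row)) for row in matrix]
    return scores.index(max(scores)), scores
```

Cost, execution difficulty, and environmental impact would each be one column, with weights reflecting their relative importance.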
Abstract: Noise contamination in a magnetic resonance (MR) image can occur during acquisition, storage, and transmission, so effective filtering is required to avoid repeating the MR procedure. In this paper, an iterative asymmetrical triangle fuzzy filter with a moving average center (ATMAVi filter) is used to reduce different levels of salt-and-pepper noise in a brain MR image. Besides visual inspection of the filtered images, the mean squared error (MSE) is used as an objective measurement. Compared with the median filter, simulation results indicate that the ATMAVi filter is effective, especially for filtering higher noise levels (such as noise density = 0.45), using a smaller window size (such as 3×3) when operated iteratively or a larger window size (such as 5×5) when operated non-iteratively.
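The salt-and-pepper corruption and the MSE measurement used above are standard and can be sketched in plain Python (the ATMAVi filter itself is the authors' contribution and is not reproduced here):

```python
import random

def add_salt_pepper(img, density, seed=0):
    """Corrupt a grayscale image (list of rows, values 0..255) with
    salt-and-pepper noise of the given density."""
    rng = random.Random(seed)
    out = [row[:] for row in img]
    for row in out:
        for j in range(len(row)):
            r = rng.random()
            if r < density / 2:
                row[j] = 0        # pepper
            elif r < density:
                row[j] = 255      # salt
    return out

def mse(a, b):
    """Mean squared error between two equally sized images."""
    n = len(a) * len(a[0])
    return sum((x - y) ** 2 for ra, rb in zip(a, b)
               for x, y in zip(ra, rb)) / n
```

A filter is then judged by the MSE between its output and the clean reference image.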
Abstract: In this study, an analysis has been performed of the heat and mass transfer of a steady laminar boundary-layer flow of a viscous fluid past a nonlinearly stretching sheet. The parameters n, Ec, k0, and Sc represent the dominance of the nonlinear stretching effect, the viscous dissipation effect, the radiation effect, and the mass transfer effect, respectively, as they appear in the governing equations. The similarity transformation and the finite-difference method have been used to analyze the present problem.
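For reference, a commonly used similarity transformation for a sheet stretching nonlinearly with velocity u_w = c x^n (a standard textbook form; the paper's exact variables may differ) is:

```latex
\eta = y\,\sqrt{\frac{c\,(n+1)}{2\nu}}\; x^{\frac{n-1}{2}}, \qquad
u = c\,x^{n} f'(\eta), \qquad
v = -\sqrt{\frac{c\,\nu\,(n+1)}{2}}\; x^{\frac{n-1}{2}}
    \left[f(\eta) + \frac{n-1}{n+1}\,\eta\, f'(\eta)\right]
```

Substituting such a transformation reduces the boundary-layer partial differential equations to ordinary differential equations in f(η), which the finite-difference method can then solve.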
Abstract: This research investigates risk factors for defective products in auto-parts factories. Under a Bayesian framework, a generalized linear mixed model (GLMM) is adopted in which the dependent variable, the number of defective products, has a Poisson distribution. Its performance is compared with that of the Poisson GLM under a Bayesian framework. The factors considered are production process, machine, and worker, and the products coded RT50 are observed. The study found that the Poisson GLMM is more appropriate than the Poisson GLM. For the production process factor, the highest risk of producing defective products is Process 1; for the machine factor, the highest risk is Machine 5; and for the worker factor, the highest risk is Worker 6.
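The mean structures of the two models compared above can be sketched as follows (a minimal illustration; the variable names are mine, and the Bayesian fitting itself is omitted):

```python
import math

def poisson_glm_mean(x, beta):
    """Poisson GLM with log link: E[defect count] = exp(beta . x)."""
    return math.exp(sum(b * xi for b, xi in zip(beta, x)))

def poisson_glmm_mean(x, beta, b_j):
    """Poisson GLMM: a random effect b_j (e.g. for machine j) is
    added to the linear predictor before exponentiating."""
    return math.exp(sum(b * xi for b, xi in zip(beta, x)) + b_j)
```

The random effect lets defect rates vary across machines or workers beyond what the fixed effects explain, which is why the GLMM can fit better than the GLM.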
Abstract: Global competitiveness has recently become the biggest concern of both manufacturing and service companies. Electronic commerce, as a key technology, enables firms to reach potential consumers all over the world. In this study, we present commonly used electronic payment systems and then evaluate these systems with respect to different criteria. The payment systems included in this research are the credit card, the virtual credit card, electronic money, mobile payment, credit transfer, and debit instruments. We carry out a systematic comparison of these systems with respect to three main criteria: technical, economic, and social. We conduct a fuzzy multi-criteria decision-making procedure to deal with the multi-attribute nature of the problem; the subjectivity and imprecision of the evaluation process are modeled using triangular fuzzy numbers.
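The triangular fuzzy numbers (TFNs) mentioned above support simple component-wise arithmetic; a minimal sketch (generic TFN operations, not the authors' full procedure):

```python
def tfn_add(a, b):
    """Add two TFNs (l, m, u) component-wise."""
    return tuple(x + y for x, y in zip(a, b))

def tfn_scale(a, k):
    """Multiply a TFN by a crisp non-negative weight k."""
    return tuple(k * x for x in a)

def defuzzify(a):
    """Centroid defuzzification of a TFN: (l + m + u) / 3."""
    return sum(a) / 3.0
```

An alternative's fuzzy score is then the TFN-sum of its weighted criterion ratings, and alternatives are ranked after defuzzification.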
Abstract: Intelligence tests are series of tasks designed to measure the capacity to make abstractions, to learn, and to deal with novel situations. The visual abilities of the shape understanding system (SUS) are tested using visual intelligence tests. In this paper, progressive matrices tests are formulated as tasks given to SUS. These tests require good visual problem-solving abilities of the human subject. SUS solves them by performing complex visual reasoning that transforms the visual forms (the tests) into string forms. The experiment showed that the proposed method, which is part of the SUS visual understanding abilities, can solve a test that is very difficult for human subjects.