Abstract: This paper studies the application of a variety of
sawdust materials in the production of lightweight insulating bricks.
First, the mineralogical and chemical composition of clays was determined. Next, ceramic bricks were fabricated with different
quantities of materials (3, 6 and 9 wt. % sawdust; 65 wt. % grey clay; 24, 27 and 30 wt. % yellow clay; and 2 wt. % tuff).
These bricks were fired at 800 and 950 °C. The effect of adding this sawdust on the technological behaviour of the brick was assessed by
drying and firing shrinkage, water absorption, porosity, bulk density
and compressive strength. The results have shown that the optimum
sintering temperature is 950 °C. Below this temperature, at 800 °C,
increased open porosity was observed, which decreased the compressive strength of the bricks. Based on the results obtained, the
optimum conditions were 9 wt. % eucalyptus sawdust, 24 wt. % shaping moisture and a particle diameter of 1.6. These proportions produced bricks whose mechanical properties confirm that such
sawdust is suitable for use as a secondary raw material in ceramic
brick production.
Abstract: The evolution of current modeling specifications gives rise to the problem of generating automated test cases with a variety of application tools. Past endeavours on behavioural testing of UML statecharts have not systematically leveraged existing graph theory for object testing. There is therefore a need for a simple, tool-independent, and effective method for automatic test generation. An architecture, codenamed ACUTE-J (Automated stateChart Unit Testing Engine for Java), for automating the unit test generation process is presented. A sequential approach for converting UML statechart diagrams to JUnit test classes through the application of existing graph theory is described. Research byproducts such as a universal XML Schema and an API for statechart-driven testing are also proposed. Results from a Java implementation of ACUTE-J are briefly discussed. The Chinese Postman algorithm is used to illustrate a run-through of the ACUTE-J architecture.
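The core graph-theoretic step can be sketched as follows: once the directed Chinese Postman preprocessing has balanced the statechart graph (every state's in-degree equals its out-degree), Hierholzer's algorithm yields one tour that fires every transition, and each transition label becomes a call in a generated test. The three-transition statechart below is a hypothetical example, not one from the paper.

```python
from collections import defaultdict

def transition_tour(edges, start):
    """Hierholzer's algorithm: return one tour covering every transition
    (edge) exactly once. Assumes the statechart graph has already been
    balanced, which the directed Chinese Postman step guarantees by
    duplicating the cheapest unbalanced transitions."""
    out = defaultdict(list)
    for src, label, dst in edges:
        out[src].append((label, dst))
    stack, tour = [(start, None)], []
    while stack:
        state, label = stack[-1]
        if out[state]:
            nxt_label, nxt = out[state].pop()
            stack.append((nxt, nxt_label))
        else:
            stack.pop()
            if label is not None:
                tour.append(label)
    return list(reversed(tour))  # transition labels in firing order

# Hypothetical two-state statechart; each label would become one JUnit call.
edges = [("Idle", "start", "Run"), ("Run", "pause", "Idle"),
         ("Idle", "reset", "Idle")]
print(transition_tour(edges, "Idle"))  # → ['reset', 'start', 'pause']
```

Each tour visits every transition once, so the generated JUnit class exercises the full transition relation with a single test sequence.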
Abstract: A highly optimized implementation of binary mixture
diffusion with no initial bulk velocity on graphics processors is
presented. The lattice Boltzmann model is employed for simulating
the binary diffusion of oxygen and nitrogen into each other with
different initial concentration distributions. Simulations have been
performed using the latest proposed lattice Boltzmann model that
satisfies both the indifferentiability principle and the H-theorem for
multi-component gas mixtures. Contemporary numerical
optimization techniques such as memory alignment and increasing
the multiprocessor occupancy are exploited along with some novel
optimization strategies to enhance the computational performance on
graphics processors using the C for CUDA programming language.
Speedup of more than two orders of magnitude over single-core
processors is achieved on a variety of Graphics Processing Unit
(GPU) devices ranging from conventional graphics cards to
advanced, high-end GPUs, while the numerical results are in
excellent agreement with the available analytical and numerical data
in the literature.
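The cited two-component model with its cross-collision terms, and the CUDA-level optimizations, are beyond an abstract-sized sketch. As a minimal illustration of the underlying method, here is a pure-Python D1Q3 BGK lattice Boltzmann solver for a single diffusing species with zero bulk velocity; the grid size, relaxation time and step initial profile are assumptions for the example.

```python
# D1Q3 BGK lattice Boltzmann scheme for pure diffusion (zero bulk
# velocity). The paper's model adds cross-collision terms coupling the
# O2 and N2 distributions; those are omitted here.
W = (2/3, 1/6, 1/6)          # D1Q3 weights for velocities (0, +1, -1)
TAU = 1.0                    # relaxation time; diffusivity D = (TAU - 0.5) / 3

def step(f):
    n = len(f[0])
    rho = [f[0][x] + f[1][x] + f[2][x] for x in range(n)]
    # BGK collision toward the zero-velocity equilibrium w_i * rho
    post = [[f[i][x] + (W[i] * rho[x] - f[i][x]) / TAU for x in range(n)]
            for i in range(3)]
    # streaming with periodic boundaries
    f[0] = post[0]
    f[1] = [post[1][(x - 1) % n] for x in range(n)]
    f[2] = [post[2][(x + 1) % n] for x in range(n)]
    return rho

# Step concentration profile: high on the left half, low on the right.
n = 32
rho0 = [1.0 if x < n // 2 else 0.5 for x in range(n)]
f = [[W[i] * r for r in rho0] for i in range(3)]
for _ in range(200):
    rho = step(f)
print(round(sum(rho), 6))  # total mass is conserved
```

On a GPU, the inner loops over lattice sites become one thread per site, and memory alignment of the three distribution arrays is what the abstract's coalescing optimizations address.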
Abstract: The American Health Level Seven (HL7) Reference Information Model (RIM) consists of six backbone classes that have different specialized attributes. Furthermore, to enforce semantic expression, specific mandatory vocabulary domains have been defined to represent the content values of some attributes. Because of the variety of clinical workflows, most hospitals spend considerable time and human cost developing and modifying their Clinical Information Systems (CIS), a largely duplicated effort. This study attempts to design and develop shared RIM-based components of the CIS for different business processes, so that the CIS contains data of a consistent format and type. Programmers can perform transactions with the RIM-based clinical repository through the shared RIM-based components, and these components can also be adopted when developing new functions of the CIS. The components not only satisfy physicians' needs in using a CIS but also reduce the time needed to develop new components of a system. All in all, this study provides a new viewpoint: by integrating data and functions with the business processes, it offers an easy and flexible approach to building a new CIS.
Abstract: The equivalence class subset algorithm is a powerful
tool for solving a wide variety of constraint satisfaction problems and
is based on the use of a decision function which has a very high but
not perfect accuracy. Perfect accuracy is not required in the decision
function as even a suboptimal solution contains valuable information
that can be used to help find an optimal solution. In the hardest
problems, the decision function can break down leading to a
suboptimal solution where there are more equivalence classes than
are necessary and which can be viewed as a mixture of good decisions
and bad decisions. By choosing a subset of the decisions made in
reaching a suboptimal solution, an iterative technique can lead to an
optimal solution, using a series of steadily improved suboptimal
solutions. The goal is to reach an optimal solution as quickly as
possible. Various techniques for choosing the decision subset are
evaluated.
Abstract: We introduce a novel approach to measuring how
humans learn based on techniques from information theory and
apply it to the oriental game of Go. We show that the total amount
of information observable in human strategies, called the strategic
information, remains constant for populations of players of differing
skill levels for well studied patterns of play. This is despite the very
large amount of knowledge required to progress from the recreational
players at one end of our spectrum to the very best and most
experienced players in the world at the other and is in contrast to
the idea that having more knowledge might imply more 'certainty'
in what move to play next. We show this is true for patterns ranging
from very local up to medium-sized board patterns, across a variety
of different moves, using 80,000 game records. Consequences for theoretical and
practical AI are outlined.
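One way to make the "strategic information" idea concrete is as the entropy reduction of the empirical move distribution, relative to a uniform choice over the board, for a given pattern. This formulation and the move counts below are illustrative assumptions, not necessarily the paper's exact definition.

```python
from math import log2
from collections import Counter

def strategic_information(moves, board_size=19):
    """Entropy reduction (in bits) of the empirical move distribution
    relative to a uniform choice over the board: a simple proxy for the
    information carried by observed play at one pattern. Illustrative
    formulation, assumed from the information-theoretic setup."""
    counts = Counter(moves)
    total = sum(counts.values())
    entropy = -sum((c / total) * log2(c / total) for c in counts.values())
    return log2(board_size * board_size) - entropy

# Moves observed at the same local pattern in hypothetical game records:
moves = ["D4"] * 60 + ["C3"] * 30 + ["D3"] * 10
print(round(strategic_information(moves), 3))
```

The constancy claim in the abstract would then say that this quantity, estimated from records of players at different skill levels, does not vary with skill for well-studied patterns.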
Abstract: In wavelet regression, choosing the threshold value is a crucial issue. Too large a value cuts too many coefficients, resulting in oversmoothing. Conversely, too small a threshold value allows many coefficients into the reconstruction, giving a wiggly estimate that results in undersmoothing. A proper choice of threshold is therefore a careful balance between these principles. This paper gives a very brief introduction to some threshold selection methods, including Universal, SURE, EBayes, two-fold cross validation and level-dependent cross validation. A simulation study on a variety of sample sizes, test functions and signal-to-noise ratios is conducted to compare their numerical performance under three different noise structures. For Gaussian noise, EBayes outperforms the others in all cases for all functions used, while two-fold cross validation provides the best results in the case of long-tailed noise. For large signal-to-noise ratios, level-dependent cross validation works well in the correlated-noise case. As expected, increasing both the sample size and the signal-to-noise ratio increases estimation efficiency.
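Of the methods compared, the Universal (VisuShrink) threshold has the simplest closed form, t = σ√(2 ln n), applied with soft thresholding to the detail coefficients. The coefficient values below are made up for illustration.

```python
from math import sqrt, log

def universal_threshold(coeffs, sigma):
    """VisuShrink universal threshold t = sigma * sqrt(2 ln n)."""
    return sigma * sqrt(2 * log(len(coeffs)))

def soft_threshold(coeffs, t):
    """Soft thresholding: kill coefficients below t, shrink the rest
    toward zero by t (this is what controls the over/undersmoothing
    trade-off the abstract describes)."""
    return [(abs(c) - t) * (1 if c > 0 else -1) if abs(c) > t else 0.0
            for c in coeffs]

coeffs = [4.0, -0.3, 0.1, 2.5, -3.2, 0.05, 0.2, -2.8]
t = universal_threshold(coeffs, sigma=1.0)   # ≈ 2.039 for n = 8
print(soft_threshold(coeffs, t))
```

A larger σ or n raises t and prunes more coefficients, which is precisely the oversmoothing risk noted above; the cross-validation methods instead tune t against held-out data.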
Abstract: Alzheimer's disease is the loss of mental functions
such as thinking, memory, and reasoning that is severe enough to
interfere with a person's daily functioning. The symptoms of
Alzheimer's disease (AD) depend on which part of the brain is
affected by infection or damage. MRI is among the best biomedical
instruments for detecting the presence of AD. This paper therefore
proposes a fusion method to distinguish between normal and AD
MRIs. In this combined method, 27 MRIs collected from Jordanian
hospitals are analyzed using low-pass and morphological filters; the
statistical outputs extracted through the intensity histogram are then
summarized with descriptive box plots. An artificial neural network
(ANN) is also applied to test the performance of this approach.
Finally, the result of a t-test at a 95% confidence level is compared
with the classification accuracy of the ANN (100%). The robustness
of the developed method makes it effective for diagnosing and
determining the type of AD image.
Abstract: Acid rain changes the pH level of soil, which
directly influences root and leaf growth, and crop yield is reduced
when soil acidity is high. Acid rain seeps into the earth and poisons
plants and trees by dissolving toxic substances in the soil, such as
aluminum, which are absorbed by the roots. In the present
investigation, the effect of acid rain on the crop Vigna radiata was
studied. The effect of acid rain on soil fertility was assessed: the pH
of the control sample was 6.5, while the pH of the 1% H2SO4 and
1% HNO3 treatments was 3.5. Nitrate nitrogen in the soil was high
in the 1% HNO3-treated soil and the control sample. Ammonium
nitrogen was low in the 1% HNO3- and H2SO4-treated soils and
medium in the control and other samples. Regarding seed
germination, on the 3rd day the control sample had grown to 6.1 cm
with plumule, while the 0.001% HNO3 and 0.001% H2SO4 samples
were 5.5 cm and 8 cm with plumule, respectively. On the 10th day,
fungal growth was observed at the 1% and 0.1% H2SO4
concentrations, by which time all of those plants were dead. The
effect of acid rain on crop productivity was also investigated: roots
had developed by the 3rd day. On the 12th day, Vigna radiata
showed more growth in the 0.1% HNO3- and 0.1% H2SO4-treated
plants than in the control plants. On the 20th day, discoloration of
plant pigments was observed on the leaves of acid-treated plants. On
the 34th day, Vigna radiata flowered in the 0.1% HNO3, 0.01%
HNO3 and 0.01% H2SO4 treatments, while no flowers were
observed on the control plants. On the 42nd day, the 0.1% HNO3-,
0.01% HNO3- and 0.01% H2SO4-treated plants and the control
plants showed seeds. The 0.1% HNO3-, 0.01% HNO3- and 0.01%
H2SO4-treated plants were dead by the 46th day, and fungal growth
was observed. A toxicological study showed that cells of Vigna
radiata plants exposed to 1% HNO3 were damaged more than those
exposed to 1% H2SO4. Leaf sections exposed to 0.001% HNO3 and
H2SO4 showed less cell damage, and pigmentation was observed
across the entire slide when compared with the control plant.
Abstract: The evaluation of conversational agents or chatterbot question answering systems is a major research area that needs much attention. Before the rise of domain-oriented conversational agents based on natural language understanding and reasoning, evaluation was never a problem, as information retrieval-based metrics were readily available for use. However, as chatterbots became more domain specific, evaluation became a real issue. This is especially true when understanding and reasoning are required to cater for a wider variety of questions while achieving high-quality responses. This paper discusses the inappropriateness of the existing measures for response quality evaluation, and the call for new standard measures and related considerations is brought forward. As a short-term solution for evaluating the response quality of conversational agents, and to demonstrate the challenges in evaluating systems of a different nature, this research proposes a black-box approach using observation, a classification scheme and a scoring mechanism to assess and rank three example systems: AnswerBus, START and AINI.
Abstract: Effective estimation of just noticeable distortion (JND) for images is helpful for increasing the efficiency of a compression algorithm in which both the statistical redundancy and the perceptual redundancy should be accurately removed. In this paper, we design a DCT-based model for estimating the JND profiles of color images. Based on a mathematical model measuring the base detection threshold for each DCT coefficient in each color component, the luminance masking adjustment, the contrast masking adjustment, and the cross masking adjustment are utilized for the luminance component, and a variance-based masking adjustment, based on the coefficient variation in the block, is proposed for the chrominance components. In order to verify the proposed model, the JND estimator is incorporated into the conventional JPEG coder to improve the compression performance. A subjective and fair viewing test is designed to evaluate the visual quality of the coded image under the specified viewing condition. The simulation results show that the JPEG coder integrated with the proposed DCT-based JND model gives better coding bit rates at visually lossless quality for a variety of color images.
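The luminance masking step can be sketched as follows. The power-law adjustment with Watson's exponent is a standard formulation assumed for illustration; the paper's exact adjustment, and the coefficient values below, are not taken from it.

```python
def luminance_masked_threshold(t_base, block_mean, gamma=0.649):
    """Scale a base DCT detection threshold by block brightness:
    brighter blocks tolerate larger distortion (Weber-like behavior).
    gamma is Watson's luminance-masking exponent, an assumption here."""
    return t_base * (block_mean / 128.0) ** gamma

def jnd_suppress(dct_coeffs, t_base, block_mean):
    """Zero every coefficient below its JND threshold; by the model,
    such changes are visually lossless, so the JPEG coder spends no
    bits on them."""
    t = luminance_masked_threshold(t_base, block_mean)
    return [c if abs(c) >= t else 0.0 for c in dct_coeffs]

# Hypothetical AC coefficients of one 8x8 block with mean luminance 200:
print(jnd_suppress([12.0, -1.5, 0.8, 5.0], t_base=2.0, block_mean=200))
```

The contrast and cross masking adjustments would further scale the threshold per coefficient before the comparison.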
Abstract: This paper proposes a new technique based on a nonlinear Minmax Detector Based (MDB) filter for image restoration. The aim of image enhancement is to reconstruct the true image from the corrupted image. The process of image acquisition frequently leads to degradation, and the quality of the digitized image becomes inferior to the original. Image degradation can be due to the addition of different types of noise to the original image. Image noise can be modeled in many ways, and impulse noise is one of them. Impulse noise generates pixels with gray values not consistent with their local neighborhood; it appears as a sprinkle of both light and dark, or only light, spots in the image. Filtering is a technique for enhancing the image. In linear filtering, the value of an output pixel is a linear combination of neighborhood values, which can blur the image. Thus a variety of nonlinear smoothing techniques have been developed. The median filter is one of the most popular nonlinear filters; over a small neighborhood it is highly efficient, but for large windows and high noise levels it introduces more blurring into the image. The Centre Weighted Median (CWM) filter has a better average performance than the median filter. However, under high-noise conditions originally corrupted pixels may remain and the noise reduction is less substantial, so this technique also has a blurring effect on the image. To illustrate the superiority of the proposed approach, the new scheme has been simulated alongside the standard ones and various restoration performance measures have been compared.
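For reference, the CWM baseline discussed above can be sketched as a centre-weighted median over a 3x3 window, its usual form in the filtering literature; this is the baseline, not the proposed MDB filter, and the test image is a made-up example.

```python
from statistics import median

def cwm_filter(img, weight=3):
    """Centre Weighted Median: the centre pixel is repeated `weight`
    times in the 3x3 window before taking the median, so fine detail is
    preserved better than with the plain median (weight=1 reduces to
    the ordinary median filter). Border pixels are left unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            window += [img[y][x]] * (weight - 1)
            out[y][x] = median(window)
    return out

# 5x5 flat image with one salt-noise pixel (255) in the centre
img = [[10] * 5 for _ in range(5)]
img[2][2] = 255
print(cwm_filter(img)[2][2])  # impulse removed → 10
```

Raising `weight` biases the output toward the centre pixel, which is exactly the detail-preservation versus noise-suppression trade-off the abstract attributes to CWM.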
Abstract: A numerical simulation of the vortex-induced vibration of
a 2-dimensional elastic circular cylinder with two degrees of freedom
under uniform flow is carried out at a Reynolds number of 200. The
2-dimensional incompressible Navier-Stokes equations are solved
with the space-time finite element method, the equation of the cylinder
motion is solved with a new explicit integration method, and mesh
updating is achieved with a spring-based moving-mesh technique.
Considering vortex-induced vibration with a low reduced damping
parameter, the variation trends of the lift coefficient, the drag
coefficient and the displacement of the cylinder are analyzed under
different oscillating frequencies of the cylinder. The phenomena of
lock-in, beat and phase switch were captured successfully. The
evolution of vortex shedding from the cylinder with time is discussed.
The characteristics of the one-degree-of-freedom cylinder model show
very similar trends to those of the two-degree-of-freedom cylinder
model. The streamwise vibrations have a certain effect on the
lateral vibrations and their characteristics.
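The structural side of the coupled problem can be sketched with an explicit integration of the 1-DOF cylinder equation m·x″ + c·x′ + k·x = F_L(t). In the paper the lift force comes from the Navier-Stokes solver each time step; here a sinusoidal lift near the natural frequency is assumed for illustration, and all parameter values are made up.

```python
from math import sin, pi

def integrate_cylinder(m, c, k, lift, dt, steps):
    """Explicit (semi-implicit Euler) integration of the cross-flow
    cylinder equation m*x'' + c*x' + k*x = F_L(t). In the coupled
    simulation, lift(t) would be replaced by the force computed from
    the flow field at each step."""
    x, v, history = 0.0, 0.0, []
    for n in range(steps):
        a = (lift(n * dt) - c * v - k * x) / m
        v += a * dt
        x += v * dt
        history.append(x)
    return history

# Forcing near the natural frequency sqrt(k/m) gives the large
# resonant response characteristic of lock-in.
xs = integrate_cylinder(m=1.0, c=0.05, k=(2 * pi) ** 2,
                        lift=lambda t: sin(2 * pi * t), dt=0.01, steps=2000)
print(round(max(xs), 3))
```

Detuning the forcing frequency away from sqrt(k/m) reproduces the beat behavior mentioned above, with the response amplitude rising and falling periodically.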
Abstract: Manufacturing companies are facing a broad variety
of challenges caused by a dynamic production environment. To
succeed in such an environment, it is crucial to minimize the loss of
time required to trigger the adaptation process of a company's
production structures. This paper presents an approach for the
continuous monitoring of production structures by neurologic
principles. It enhances classical monitoring concepts, which are
principally focused on reactive strategies, and enables companies to
act proactively. Thereby, strategic aspects regarding the
harmonization of certain life cycles are integrated into the decision
making process for triggering the reconfiguration process of the
production structure.
Abstract: Since the presentation of the backpropagation algorithm, a vast variety of improvements of the technique for training feed-forward neural networks have been proposed. This article focuses on two classes of acceleration techniques. One is known as Local Adaptive Techniques, which are based only on weight-specific information, such as the temporal behavior of the partial derivative of the current weight. The other, known as Dynamic Adaptation Methods, dynamically adapts the momentum factor, α, and the learning rate, η, with respect to the iteration number or the gradient. Some of the most popular learning algorithms are described. These techniques have been implemented and tested on several problems and measured in terms of gradient and error function evaluations and percentage of success. Numerical evidence shows that these techniques improve the convergence of the backpropagation algorithm.
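The two ingredients, a momentum factor α and a dynamically adapted learning rate η, can be combined in a generic sketch like the one below. The "bold driver" style rate adaptation and all constants are illustrative assumptions, not any one of the cited algorithms, and the toy one-dimensional error function stands in for a network's error surface.

```python
def train(grad, w, eta=0.1, alpha=0.5, steps=50):
    """Gradient descent with a momentum term (alpha) and a 'bold
    driver' style dynamic adaptation of the learning rate eta: grow
    eta while the error falls, cut it sharply when the error rises."""
    velocity = 0.0
    prev_err = float("inf")
    for _ in range(steps):
        g, err = grad(w)
        # dynamic learning-rate adaptation
        eta = eta * 1.05 if err < prev_err else eta * 0.5
        prev_err = err
        # momentum-smoothed weight update
        velocity = alpha * velocity - eta * g
        w += velocity
    return w

# Toy quadratic error E(w) = (w - 3)^2, minimised at w = 3
result = train(lambda w: (2 * (w - 3), (w - 3) ** 2), w=0.0)
print(round(result, 2))  # settles near w = 3
```

Local adaptive techniques differ mainly in keeping a separate η (and its adaptation state) per weight rather than one global rate.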
Abstract: Various security APIs (Application Programming
Interfaces) are being used in a variety of application areas requiring
the information security function. However, these standards are not
compatible, and the developer must use those APIs selectively
depending on the application environment or the programming
language. To resolve this problem, we propose a standard draft for an
information security component, and an SSL (Secure Sockets Layer)
implementation using the confidentiality and integrity component
interface has been built to verify the validity of the proposed
standard. The implemented SSL uses the lower-level SSL component
when establishing RMI (Remote Method Invocation) communication
between components, as if the security algorithm had been
implemented by adding one more layer on top of TCP/IP.
Abstract: With the advent of emerging personal computing paradigms such as ubiquitous and mobile computing, Web contents are becoming accessible from a wide range of mobile devices. Since these devices do not have the same rendering capabilities, Web contents need to be adapted for transparent access from a variety of client agents. Such content adaptation results in better rendering and faster delivery to the client device. Nevertheless, Web content adaptation sets new challenges for semantic markup. This paper presents an advanced components platform, called MorfeoSMC, enabling the development of mobility applications and services according to a channel model based on Services Oriented Architecture (SOA) principles. It then goes on to describe the potential for integration with the Semantic Web through a novel framework of external semantic annotation of mobile Web contents. The role of semantic annotation in this framework is to describe the contents of individual documents themselves, assuring the preservation of the semantics during the process of adapting content rendering, as well as to exploit these semantic annotations in a novel user profile-aware content adaptation process. Semantic Web content adaptation is a way of adding value to and facilitating the repurposing of Web contents (enhanced browsing, Web Services location and access, etc.).
Abstract: Computer-mediated communication technologies which provide for virtual communities have typically evolved in a cross-dichotomous manner, such that technical constructs of the technology have evolved independently from the social environment of the community. The present paper analyses some limitations of current implementations of computer-mediated communication technology that are implied by such a dichotomy, and discusses their inhibiting effects on possible developments of virtual communities. A Socio-Technical Indicator Model is introduced that utilizes integrated feedback to describe, simulate and operationalise increasing representativeness within a variety of structurally and parametrically diverse systems. In illustration, applications of the model are briefly described for financial markets and for eco-systems. A detailed application is then provided to resolve the aforementioned technical limitations of moderation on the evolution of virtual communities. The application parameterises virtual communities to function as self-transforming social-technical systems which are sensitive to emergent and shifting community values as products of on-going communications within the collective.
Abstract: Software reuse can be considered as the most realistic
and promising way to improve software engineering productivity and
quality. Automated assistance for software reuse involves the
representation, classification, retrieval and adaptation of components.
The representation and retrieval of components are important to
software reuse in Component-Based Software Development
(CBSD). However, current industrial component models mainly focus
on implementation techniques and ignore semantic information
about components, so it is difficult to retrieve components that
satisfy users' requirements. This paper presents a method of business
component retrieval based on specification matching to support the
software reuse of enterprise information systems. First, a
reuse-oriented business component model is proposed. In our model,
the business data type is represented as a sign data type based on
XML, which can express the variable business data types that
describe the variety of business operations. Based on this model, we
propose specification match relationships at two levels: the business
operation level and the business component level. At the business
operation level, we use the input business data types, the output
business data types and the taxonomy of business operations to
evaluate the similarity between business operations. At the business
component level, we propose five specification matches between
business components. To retrieve reusable business components, we
propose similarity degree measures to calculate the similarities
between business components. Finally, an SQL-like business
component retrieval command is proposed to help users retrieve
approximate business components from a component repository.
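The operation-level match described above can be sketched as a weighted overlap of input types, output types and taxonomy category. The Jaccard measure, the weights and the example components are illustrative assumptions, not the paper's five specification matches or its similarity degrees.

```python
def type_similarity(a, b):
    """Jaccard overlap between two sets of business data types."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def operation_similarity(op_a, op_b, w_in=0.4, w_out=0.4, w_tax=0.2):
    """Weighted similarity between two business operations from their
    input types, output types and taxonomy category. The weights are
    illustrative, not values from the paper."""
    tax = 1.0 if op_a["category"] == op_b["category"] else 0.0
    return (w_in * type_similarity(op_a["inputs"], op_b["inputs"])
            + w_out * type_similarity(op_a["outputs"], op_b["outputs"])
            + w_tax * tax)

# Hypothetical query and repository candidate:
query = {"inputs": {"OrderID", "CustomerID"}, "outputs": {"Invoice"},
         "category": "billing"}
candidate = {"inputs": {"OrderID"}, "outputs": {"Invoice", "Receipt"},
             "category": "billing"}
print(operation_similarity(query, candidate))
```

An SQL-like retrieval command would then rank repository components by aggregating such operation similarities at the component level.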
Abstract: Flooding can be a serious problem for wheat
farmers, even under dryland conditions. The amount of flooding
damage depends on the flooding duration, the developmental stage,
and the wheat type and variety. Therefore, a factorial experiment in
a randomized complete design was carried out at the Faculty of
Agriculture, Razi University, Kermanshah, Iran, on winter bread
wheat cultivars (Pishtaz, Marvdasht, Shiraz, Zarin, Shahriar,
C-81-4, Sardari, Agosta seed, FGS and Azar2) under three
treatments (non-flooding stress, and flooding for 15 days at the
tillering and stem elongation stages). During flooding, the soil
environment of the plant roots was water saturated. Analysis of
variance showed that flooding had a significant effect on the number
of grains per spike, the grain weight per spike and the single grain
weight: flooding reduced the number of grains per spike by 27.1 to
42.5 percent, the grain weight per spike by 34.7 to 54.4 percent and
the single grain weight by 12.1 to 15.1 percent. Flooding at the
tillering stage reduced the studied traits more than flooding at the
stem elongation stage. The results also showed that flooding at the
tillering stage delayed spikelet primordia and floret development.
Differences among wheat cultivars were significant for the studied
traits, but the cultivars reacted differently: "Shiraz", "Zarin" and
"Shahriar" had the highest number of grains per spike, while
"Zarin" and "Sardari" had the highest grain weight per spike and
single grain weight, respectively. The interaction between the start
of flooding and the cultivar was also significant.