Abstract: This paper presents the results of an experimental
investigation carried out to evaluate the shrinkage of High Strength
Concrete. The High Strength Concrete is made by partial replacement of
cement with fly ash and silica fume. The shrinkage of High Strength
Concrete has been studied using different types of coarse and fine
aggregates, i.e. sandstone and granite of 12.5 mm size, and Yamuna
and Badarpur sand. The mix proportion of the concrete is 1:0.8:2.2 with
a water-cement ratio of 0.30. A superplasticizer dose of 2% by weight
of cement is added to achieve the required degree of workability in
terms of compaction factor.
From the test results of this investigation it can be concluded
that the shrinkage strain of High Strength Concrete increases with
age. The shrinkage strain of concrete in which 10% of the cement is
replaced by fly ash or silica fume is 6 to 10% higher at various ages
than that of concrete without fly ash or silica fume. The shrinkage
strain of concrete with Badarpur sand as fine aggregate at 90 days is
slightly less (10%) than that of concrete with Yamuna sand. Further,
the shrinkage strain of concrete with granite as coarse aggregate at
90 days is slightly less (6 to 7%) than that of concrete with sandstone
aggregate of the same size. The shrinkage strain of High Strength
Concrete is also compared with that of normal strength concrete. Test
results show that the shrinkage strain of high strength concrete is
less than that of normal strength concrete.
Abstract: Symbolic Circuit Analysis (SCA) is a technique used
to generate the symbolic expression of a network. It has become a
well-established technique in circuit analysis and design. The
symbolic expressions of networks offer an excellent way to perform
frequency response analysis, sensitivity computation, stability
measurements, performance optimization, and fault diagnosis. Many
approaches have been proposed in the area of SCA offering different
features and capabilities. Numerical Interpolation methods are very
common in this context, especially by using the Fast Fourier
Transform (FFT). The aim of this paper is to present a method for
SCA that relies on the Wavelet Transform (WT) as a mathematical
tool to generate the symbolic expressions of large circuits while
minimizing the analysis time by reducing the number of computations.
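
To make the classical FFT-based interpolation step concrete, the following minimal Python sketch recovers a polynomial's symbolic coefficients from numeric evaluations at roots of unity; the low-order network function below is a hypothetical stand-in for the numeric determinant evaluations used in SCA, and the paper's WT-based method is not reproduced.

```python
import numpy as np

def poly_coeffs_via_fft(evaluate, degree):
    """Recover coefficients c[0..degree] of a polynomial from its numeric
    values at the (degree+1)-th roots of unity, via the DFT relation."""
    n = degree + 1
    roots = np.exp(2j * np.pi * np.arange(n) / n)    # interpolation points
    values = np.array([evaluate(z) for z in roots])  # numeric evaluations
    return np.fft.fft(values) / n                    # invert the DFT

# Example: a hypothetical network-function numerator 1 + 3s + 2s^2,
# treated as a black box (as a determinant evaluation would be).
f = lambda s: 1.0 + 3.0 * s + 2.0 * s**2
print(np.round(poly_coeffs_via_fft(f, degree=2).real, 6))  # [1. 3. 2.]
```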
Abstract: Missing data is a persistent problem in almost all
areas of empirical research. The missing data must be treated very
carefully, as data plays a fundamental role in every analysis.
Improper treatment can distort the analysis or generate biased results.
In this paper, we compare and contrast various imputation techniques
on missing data sets and make an empirical evaluation of these
methods so as to construct quality software models. Our empirical
study is based on two of NASA's public data sets, KC4 and KC1, with
125 and 2107 cases respectively; the actual data sets contain no
missing values. These data sets were used to create Missing at Random
(MAR) data. Listwise Deletion (LD), Mean Substitution (MS),
Interpolation, Regression with an error term, and
Expectation-Maximization (EM) approaches were then used to compare
the effects of the various techniques.
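
For illustration, here is a minimal Python/pandas sketch of three of the compared treatments, applied to a hypothetical toy data frame; the column names are illustrative and not taken from KC1/KC4.

```python
import numpy as np
import pandas as pd

# Toy data frame with injected missing values (hypothetical columns).
df = pd.DataFrame({"loc":     [120.0, np.nan, 310.0, 95.0],
                   "defects": [3.0,   1.0,    np.nan, 0.0]})

listwise_deleted = df.dropna()           # Listwise Deletion (LD): drop any
                                         # case containing a missing value
mean_substituted = df.fillna(df.mean())  # Mean Substitution (MS): replace
                                         # each NaN with its column mean
interpolated = df.interpolate()          # linear interpolation between
                                         # neighbouring observed values
print(mean_substituted)
```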
Abstract: In this research work, poly(acrylonitrile-butadiene-styrene)/
polypropylene (ABS/PP) blends were processed by melt
compounding in a twin-screw extruder. Upgrading of the thermal
characteristics of the obtained materials was attempted by the
incorporation of organically modified montmorillonite (OMMT), as
well as by the addition of two types of compatibilizers:
polypropylene grafted with maleic anhydride (PP-g-MAH) and ABS
grafted with maleic anhydride (ABS-g-MAH). The effect of the
above treatments was investigated separately and in combination.
Increasing the PP content in the ABS matrix seems to increase the
thermal stability of their blend and the glass transition temperature
(Tg) of the SAN phase of ABS. On the other hand, the addition of ABS
to PP promotes the formation of its β-phase, which is at a maximum at
30 wt% ABS concentration, and increases the crystallization
temperature (Tc) of PP. In addition, it increases the crystallization
rate of PP. The β-phase of PP in ABS/PP blends is reduced by the addition of
compatibilizers and/or organoclay reinforcement. The incorporation
of compatibilizers increases the thermal stability of PP and reduces
its melting (ΔHm) and crystallization (ΔHc) enthalpies. Furthermore, it
slightly decreases the Tgs of the PP and SAN phases of ABS/PP blends.
Regarding the storage modulus of the ABS/PP blends, it presents a
change in behavior at about 10°C and returns to its initial
behavior at ~110°C. The incorporation of OMMT into non-compatibilized
and compatibilized ABS/PP blends enhances their storage modulus.
Abstract: The advances in multimedia and networking technologies
have created opportunities for Internet pirates, who can easily
copy multimedia contents and illegally distribute them on the Internet,
thus violating the legal rights of content owners. This paper describes
how a simple and well-known watermarking procedure based on a
spread spectrum method and a watermark recovery by correlation can
be improved to effectively and adaptively protect MPEG-2 videos
distributed on the Internet. In fact, the procedure, in its simplest
form, is vulnerable to a variety of attacks. However, its security
and robustness have been increased, and its behavior has been
made adaptive with respect to the video terminals used to open
the videos and the network transactions carried out to deliver them
to buyers. Such an adaptive behavior enables the proposed
procedure to efficiently embed watermarks, and this characteristic
makes the procedure well suited to be exploited in web contexts,
where watermarks, usually generated from fingerprinting codes, have
to be inserted into the distributed videos “on the fly”, i.e. during
the web purchase transactions.
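
As background, here is a minimal Python sketch of the underlying spread-spectrum embedding and correlation-based recovery the paper starts from; the coefficients, key and embedding strength below are hypothetical stand-ins (e.g. for selected DCT coefficients of a frame), and the paper's adaptive, MPEG-2-specific machinery is not shown.

```python
import numpy as np

rng = np.random.default_rng(2024)
coeffs = rng.normal(0.0, 10.0, size=4096)  # host coefficients (hypothetical
                                           # stand-in for DCT coefficients)
watermark = rng.choice([-1.0, 1.0], 4096)  # keyed pseudorandom +/-1 sequence
alpha = 0.8                                # embedding strength

marked = coeffs + alpha * watermark        # additive spread-spectrum embedding

# Recovery by correlation: the normalized correlation concentrates around
# alpha when the watermark is present and around 0 when it is absent.
score = np.dot(marked, watermark) / watermark.size
print("watermark detected" if score > alpha / 2 else "no watermark")
```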
Abstract: The recent growth of multimedia transmission over wireless
communication systems has created the challenge of protecting data
from loss due to wireless channel effects. Images are corrupted by
noise and fading when transmitted over a wireless channel. Since an
image is transmitted block by block, severe fading can damage entire
image blocks. The aim of this paper arises from the need to enhance
digital images at the wireless receiver side. A Boundary Interpolation
(BI) algorithm using wavelets is adapted here to reconstruct a lost
block in the image at the receiver, based on the correlation between
the lost block and its neighbors. A new technique, combining the
wavelet-based BI algorithm with a pixel interleaver, has been
implemented. The pixel interleaver redistributes pixels to new
positions in the original image before transmission, so that a block
lost over the wireless channel affects only individual, scattered
pixels. The lost pixels can then be recovered at the receiver side
using the wavelet-based BI algorithm. The results show that the newly
proposed BI algorithm using wavelets with a pixel interleaver performs
better in terms of MSE and PSNR.
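
A minimal Python sketch of the pixel-interleaving idea follows; the image size and seed are hypothetical. A keyed permutation scatters pixels before transmission, so a lost channel block corrupts only isolated pixels, which can then be reconstructed from their neighbors.

```python
import numpy as np

def interleave(img, seed=1234):
    """Scatter pixels with a keyed permutation shared by both ends."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(img.size)       # transmitter/receiver key
    return img.ravel()[perm].reshape(img.shape), perm

def deinterleave(img, perm):
    """Invert the permutation at the receiver."""
    flat = np.empty(img.size, dtype=img.dtype)
    flat[perm] = img.ravel()
    return flat.reshape(img.shape)

img = np.arange(64 * 64, dtype=np.float64).reshape(64, 64)
tx, perm = interleave(img)
# ... channel: a lost 8x8 block in `tx` maps to scattered pixels in `img` ...
rx = deinterleave(tx, perm)
assert np.array_equal(rx, img)
```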
Abstract: Ground-level tropospheric ozone is one of the air
pollutants of most concern. It is mainly produced by photochemical
processes involving nitrogen oxides and volatile organic compounds
in the lower parts of the atmosphere. Ozone levels become
particularly high in regions close to high ozone precursor emissions
and during summer, when stagnant meteorological conditions with
high insolation and high temperatures are common.
In this work, some results of a study about urban ozone
distribution patterns in the city of Badajoz, which is the largest and
most industrialized city in the Extremadura region (southwest Spain), are
shown. Fourteen sampling campaigns, at least one per month, were
carried out to measure ambient air ozone concentrations, during
periods selected for conditions favourable to ozone production, using
an automatic portable analyzer.
Later, to evaluate the ozone distribution in the city, the measured
ozone data were analyzed using geostatistical techniques. First, the
exploratory analysis of the data revealed that they were normally
distributed, which is a desirable property for the subsequent
stages of the geostatistical study. Secondly, during the structural
analysis of data, theoretical spherical models provided the best fit for
all monthly experimental variograms. The parameters of these
variograms (sill, range and nugget) revealed that the maximum
distance of spatial dependence is between 302 and 790 m and that the
variable, air ozone concentration, is not evenly distributed over
short distances. Finally, predictive ozone maps were derived for all points
of the experimental study area, by use of geostatistical algorithms
(kriging). High prediction accuracy was obtained in all cases as
cross-validation showed. Useful information for hazard assessment
was also provided when probability maps, based on kriging
interpolation and kriging standard deviation, were produced.
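
For reference, here is a minimal Python sketch of the spherical variogram model used in the structural analysis; the parameter values below are illustrative, not the fitted ones.

```python
import numpy as np

def spherical_variogram(h, nugget, sill, a):
    """Spherical model: gamma(h) rises from the nugget toward the sill and
    levels off at the range a (the maximum distance of spatial dependence)."""
    h = np.asarray(h, dtype=float)
    inside = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    g = np.where(h < a, inside, sill)
    return np.where(h == 0.0, 0.0, g)   # gamma(0) = 0 by definition

# Example with an illustrative range; the study's fitted ranges fell
# between 302 m and 790 m.
print(spherical_variogram([0.0, 250.0, 500.0, 1000.0],
                          nugget=0.1, sill=1.0, a=500.0))
```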
Abstract: In this paper we canvass three case studies of unique
research partnerships between universities and schools in the wider
community. In doing so, we consider those areas of indeterminate
zones of professional practice explored by academics in their
research activities within the wider community. We discuss three
cases: an artist-in-residence program designed to engage primary
school children with new understandings about local Indigenous
Australian issues in their pedagogical and physical landscapes; an
assessment of pedagogical concerns in relation to the use of physical
space in classrooms; and the pedagogical underpinnings of a
costumed museum school program. Throughout, we treat research as an
integral part of the development, implementation and maintenance of
academic engagement with wider community issues.
Abstract: The question of interethnic and interreligious conflicts
in ex-Yugoslavia receives much attention within the framework of
the international context created after 1991 because of the impact of
these conflicts on the security and stability of the Balkans and of
Europe.
This paper focuses on the rationales leading to the declaration of
independence by Kosovo according to ethnic and religious criteria
and analyzes why these same rationales were not applied in Bosnia
and Herzegovina. The approach undertaken aims at comparatively
examining the cases of Kosovo, and Bosnia and Herzegovina. At the
same time, it aims at understanding the political decision making of
the international community in the case of Kosovo. Specifically, was
this a good political decision for the security and stability of the
Balkans, of Europe, or even for global security and stability?
This research starts with an overview on the European security
framework post 1991, paying particular attention to Kosovo and
Bosnia and Herzegovina. It then presents the theoretical and
methodological framework and compares the representative cases.
Using a constructivist approach and the comparative methodology, it
arrives at the results of the study. An important issue of the paper is
the thesis that this event modifies the principles of international law
and creates dangerous precedents for regional stability in the
Balkans.
Abstract: Mobile laptop users need efficient access to their home
personal data or to the Internet from any place in the world,
regardless of their location or point of attachment, and especially
while roaming outside the home subnet. An efficient interpretation of
the packet-loss problem encountered during such roaming is central to
all aspects of this work. The main previous works related to this
problem, such as BER-systems, Amigos, and an ns-2 implementation, are
reviewed and discussed. Their drawbacks and limitations are noted:
they stop at monitoring and do not provide an actual solution for
eliminating, or even restricting, these losses. In addition, we
present the framework around which we built a Triple-R sequence as a
cost-effective solution to eliminate the packet losses and bridge the
gap between subnets, an area that until now has been largely
neglected. The results show that, in addition to the high bit error
rate of wireless mobile networks, the low efficiency of the Mobile IP
registration procedure is a main direct cause of these packet losses.
Furthermore, the interpretation of the packet losses yields an
illustrative triangle of the registration process, which should be
further researched and analyzed in our future work.
Abstract: In this paper we have proposed three- and two-stage still
gray-scale image compressors based on Block Truncation Coding (BTC).
In our schemes, we have employed a combination of four techniques to
reduce the bit rate: quad-tree segmentation, bit-plane omission,
bit-plane coding using 32 visual patterns, and interpolative bit-plane
coding. The experimental results show that the proposed schemes
achieve an average bit rate of 0.46 bits per pixel (bpp) for standard
gray-scale images with an average PSNR value of 30.25 dB, which is
better than the results of existing similar BTC-based methods.
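
For background, here is a minimal Python sketch of classical moment-preserving BTC on a single 4x4 block, the building block the proposed schemes extend; the quad-tree segmentation, bit-plane omission and the pattern/interpolative coding stages are not shown, and the sample block is hypothetical.

```python
import numpy as np

def btc_encode(block):
    """Classical BTC: a 1-bit-per-pixel bit plane plus two reconstruction
    levels chosen to preserve the block mean and standard deviation."""
    mean, std = block.mean(), block.std()
    bitplane = block >= mean
    q, n = int(bitplane.sum()), block.size
    if q in (0, n):                            # flat block: a single level
        return bitplane, mean, mean
    low = mean - std * np.sqrt(q / (n - q))    # level for the 0-bits
    high = mean + std * np.sqrt((n - q) / q)   # level for the 1-bits
    return bitplane, low, high

def btc_decode(bitplane, low, high):
    return np.where(bitplane, high, low)

block = np.array([[121, 114, 56, 47], [37, 200, 247, 255],
                  [16, 0, 12, 169], [43, 5, 7, 251]], dtype=float)
print(btc_decode(*btc_encode(block)).round())
```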
Abstract: The aim of this paper is to examine factors related to the system environment (namely, system quality and vendor support) that influence ERP implementation success in Iranian companies. Implementation success is identified using user satisfaction and organizational impact perspectives. The study adopts the survey questionnaire approach to collect empirical data. The questionnaire was distributed to ERP users, and a total of 384 responses were used for analysis. The results illustrate that both system quality and vendor support have a significant effect on ERP implementation success. This implies that companies must ensure they source the best available system and a vendor that is dependable, reliable and trustworthy.
Abstract: Data Warehouses (DWs) are repositories which contain the unified history of an enterprise for decision support. The data must be Extracted from information sources, Transformed and integrated to be Loaded (ETL) into the DW, using ETL tools. These tools focus on data movement, and the models are only used as a means to this end. From a conceptual viewpoint, the authors want to innovate the ETL process in two ways: 1) to make the compatibility between models explicit in a declarative fashion, using correspondence assertions, and 2) to identify the instances from different sources that represent the same entity in the real world. This paper presents an overview of the proposed framework to model the ETL process, which is based on the use of a reference model and perspective schemata. This approach provides the designer with a better understanding of the semantics associated with the ETL process.
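
A minimal sketch, in Python and on assumed toy schemata (all names below are hypothetical, not the paper's notation), of the two ideas: a declarative correspondence assertion mapping source fields onto a reference-model field, and instance matching to recognize that records from different sources represent the same real-world entity.

```python
# (1) Correspondence assertions: reference field <- (source, source field).
assertions = {
    "customer.name":  [("crm", "cust_nm"), ("billing", "client_name")],
    "customer.email": [("crm", "email"),   ("billing", "mail_addr")],
}

def to_reference(source, record):
    """Rewrite a source record into the reference model via the assertions."""
    out = {}
    for ref_field, mappings in assertions.items():
        for src, src_field in mappings:
            if src == source and src_field in record:
                out[ref_field] = record[src_field]
    return out

# (2) Instance matching on the unified representation (email as a toy key).
a = to_reference("crm", {"cust_nm": "Ada Lovelace",
                         "email": "ada@example.org"})
b = to_reference("billing", {"client_name": "A. Lovelace",
                             "mail_addr": "ada@example.org"})
same_entity = a.get("customer.email") == b.get("customer.email")
print(same_entity)  # True: both records denote the same real-world customer
```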
Abstract: The study of non-equilibrium systems has attracted
increasing interest in recent years, mainly because such systems lack
the theoretical frameworks available for their equilibrium
counterparts. Studying steady states and/or simple systems is thus one
of the main approaches. Hence in this work we have focused our attention on
the driven lattice gas model (DLG model) consisting of interacting
particles subject to an external field E. The dynamics of the system
are given by hopping of particles to nearby empty sites with rates
biased for jumps in the direction of E. Using small two-dimensional
DLG systems, the stochastic properties at the non-equilibrium steady
state were studied analytically. To understand the non-equilibrium
phenomena, we applied an analytic approach via the master equation to
calculate the probability function and to analyze the violation of
detailed balance in terms of the fluctuation-dissipation theorem.
Monte Carlo simulations have been performed to validate
the analytic results.
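
A minimal Python Monte Carlo sketch of field-biased hopping on a small two-dimensional lattice follows; the lattice size, field strength and rate rule are illustrative, and the nearest-neighbour particle interactions of the full DLG model, as well as the analytic master-equation treatment, are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
L, E, steps = 8, 2.0, 20_000
occ = rng.random((L, L)) < 0.5                 # half filling (illustrative)

# The four nearest-neighbour moves and their displacement along E (+y here).
moves = [((0, 1), +1), ((0, -1), -1), ((1, 0), 0), ((-1, 0), 0)]

for _ in range(steps):
    x, y = rng.integers(L), rng.integers(L)    # pick a random site
    if not occ[x, y]:
        continue
    (dx, dy), d_par = moves[rng.integers(4)]
    nx, ny = (x + dx) % L, (y + dy) % L        # periodic boundaries
    if occ[nx, ny]:
        continue                               # hard-core exclusion
    if rng.random() < min(1.0, np.exp(E * d_par)):  # field-biased rate
        occ[x, y], occ[nx, ny] = False, True
# For E > 0, hops along +y are favoured, driving a steady particle current.
```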
Abstract: This paper aims to present knowledge management for solving economic problems and poverty in a Thai community. A community in Thailand is studied as a case study for a master plan, a social and economic plan derived from research the people conducted by themselves in their community. The results show that the community uses knowledge management in recording income and expenses, analyzing their consumption, and then systematically planning production, distribution and consumption in the community. Besides, community enterprises, which people create as by-products of the master plan, can facilitate diverse economic activities that are able to reduce economic problems and poverty. The knowledge that people gain from solving their problems through building community enterprises is both tacit and explicit. Four styles of knowledge conversion are used: socialization, externalization, combination and internalization. Besides, knowledge sharing inside the organization, and between organizations and their environment, is found.
Keywords: knowledge management, community enterprise, Thailand.
Abstract: Nowadays, data backup formats do not cease to appear, raising anxiety about their accessibility and their perpetuity. XML is one of the most promising formats for guaranteeing the integrity of data. This article illustrates this by showing one thing that can be done with XML: XML is used to create a data backup model. The main task consists in defining a JAVA application able to convert the information of a database into XML format and to restore it later.
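
A minimal sketch of the intended round trip, written here in Python with SQLite standing in for the paper's JAVA application (the table and column names are hypothetical): a database table is dumped to XML and later restored from it.

```python
import sqlite3
import xml.etree.ElementTree as ET

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE person (id INTEGER, name TEXT)")
con.executemany("INSERT INTO person VALUES (?, ?)", [(1, "Ada"), (2, "Alan")])

# Backup: table -> XML document.
root = ET.Element("table", name="person")
for pid, name in con.execute("SELECT id, name FROM person"):
    row = ET.SubElement(root, "row")
    ET.SubElement(row, "id").text = str(pid)
    ET.SubElement(row, "name").text = name
xml_backup = ET.tostring(root, encoding="unicode")

# Restore: XML document -> table.
con.execute("DELETE FROM person")
for row in ET.fromstring(xml_backup):
    con.execute("INSERT INTO person VALUES (?, ?)",
                (int(row.find("id").text), row.find("name").text))
print(list(con.execute("SELECT * FROM person")))  # [(1, 'Ada'), (2, 'Alan')]
```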
Abstract: This paper reports a new and accurate method for the load-flow solution of radial distribution networks with minimum data preparation. The node and branch numbering need not be sequential, unlike in other available methods. The proposed method does not need sending-node, receiving-node and branch numbers if these are sequential. The proposed method uses a simple equation to compute the voltage magnitude and has the capability to handle composite load modelling. The proposed method uses the sets of nodes of the feeder, lateral(s) and sub-lateral(s). The effectiveness of the proposed method is compared with that of other methods using two examples. The detailed load-flow results for different kinds of load modelling are also presented.
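
For orientation, here is a minimal Python sketch of a generic backward/forward-sweep load flow on a three-node radial feeder; the per-unit data are illustrative, and this is the standard radial scheme rather than the paper's exact formulation.

```python
import numpy as np

# Node 0 is the slack (source) node held at 1.0 pu; the feeder is a chain.
z = np.array([0.02 + 0.04j, 0.03 + 0.05j])        # branches 0-1 and 1-2
s_load = np.array([0.0, 0.5 + 0.2j, 0.3 + 0.1j])  # complex load per node
v = np.ones(3, dtype=complex)                     # flat start

for _ in range(50):
    i_node = np.conj(s_load / v)                  # load currents (S = V I*)
    i_branch = np.array([i_node[1] + i_node[2],   # backward sweep: sum the
                         i_node[2]])              # downstream currents
    v_new = v.copy()
    v_new[1] = v_new[0] - z[0] * i_branch[0]      # forward sweep: apply the
    v_new[2] = v_new[1] - z[1] * i_branch[1]      # series voltage drops
    if np.max(np.abs(v_new - v)) < 1e-10:
        v = v_new
        break
    v = v_new

print(np.abs(v))  # converged voltage magnitudes (pu)
```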
Abstract: In this paper three different approaches for person
verification and identification, i.e. by means of fingerprints, face and
voice recognition, are studied. Face recognition uses parts-based
representation methods and a manifold learning approach. The
assessment criterion is recognition accuracy. The techniques under
investigation are: a) Local Non-negative Matrix Factorization
(LNMF); b) Independent Components Analysis (ICA); c) NMF with
sparse constraints (NMFsc); d) Locality Preserving Projections
(Laplacianfaces). Fingerprint detection was approached by classical
minutiae (small graphical patterns) matching through image
segmentation by using a structural approach and a neural network as
decision block. As to voice/speaker recognition, mel cepstral
and delta-delta mel cepstral analysis were used as the main methods, in
order to construct a supervised speaker-dependent voice recognition
system. The final decision (e.g. “accept/reject” for a verification
task) is taken by using a majority voting technique applied to the
three biometrics. The preliminary results, obtained for medium
databases of fingerprints, faces and voice recordings, indicate the
feasibility of our study and an overall recognition precision (about
92%) permitting the utilization of our system for a future complex
biometric card.
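
A minimal Python sketch of the majority-voting decision step over the three modalities follows; the per-modality decisions below are hypothetical.

```python
def majority_vote(decisions):
    """Accept if at least 2 of the 3 biometric verifiers accept."""
    return sum(decisions) >= 2

# Hypothetical per-modality verification outcomes.
fingerprint_ok, face_ok, voice_ok = True, False, True
print("accept" if majority_vote([fingerprint_ok, face_ok, voice_ok])
      else "reject")
```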
Abstract: In this paper, a self-starting two-step continuous block
hybrid formula (CBHF) with four off-step points is developed using
collocation and interpolation procedures. The CBHF is then used to
produce multiple numerical integrators which are of uniform order
and are assembled into a single block matrix equation. These
equations are simultaneously applied to provide the approximate
solution of stiff ordinary differential equations. The order of
accuracy and the stability of the block method are discussed, and its
accuracy is established numerically.
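
As a generic illustration of the collocation-and-interpolation construction (the CBHF's specific off-step points and coefficients are derived in the paper and are not reproduced here), a polynomial approximant $y(x)=\sum_{j=0}^{p} a_j x^{j}$ to the solution of $y'=f(x,y)$ is required to satisfy

$$y(x_n)=y_n \quad\text{(interpolation)}, \qquad y'(x_{n+c_i})=f\bigl(x_{n+c_i},\,y(x_{n+c_i})\bigr),\quad i=0,\dots,p-1 \quad\text{(collocation)},$$

where the collocation points $x_{n+c_i}$ include the grid and off-step points; evaluating the resulting continuous formula at those points yields the block of simultaneous integrators.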
Abstract: This article presents the results of a study conducted to identify operational risks for information systems (IS) with service-oriented architecture (SOA). Analysis of current approaches to risk and system error classification revealed that the system error classes were never used for SOA risk estimation. Additionally, system error classes are not normally supported experimentally with real enterprise error data. Through the study, several categories from various existing error classification systems are applied, and three new error categories with sub-categories are identified. As part of operational risk analysis, a new error classification scheme is proposed for SOA applications. It is based on the errors of real information systems which are service providers for applications with service-oriented architecture. The proposed classification approach has been used to classify SOA system errors for two different enterprises (oil and gas industry, metal and mining industry). In addition, we have conducted research to identify possible losses from operational risks.