Abstract: Reinforced concrete (RC) structures strengthened
with fiber reinforced polymer (FRP) lack thermal resistance at
elevated temperatures in the event of fire. This phenomenon led to
the lining of strengthened concrete with thin high performance
cementitious composites (THPCC) to protect the substrate against
elevated temperatures. The effects of elevated temperature on THPCC
based on different cementitious materials have been studied in the
past, but high-alumina cement (HAC)-based THPCC has not been well
characterized. This study focuses on THPCC based on HAC in which
60%, 70%, 80% and 85% of the cement is replaced by ground
granulated blast furnace slag (GGBS). Samples were evaluated by
measuring their mechanical strength (after 28 and 56 days of curing)
following exposure to 400°C and 600°C, with room temperature
(28°C) as a reference, and the findings were corroborated by a
microstructure study. Results
showed that, among all mixtures, the mix containing only HAC
exhibited the highest compressive strength after exposure to 600°C.
However, the tensile strength of the THPCC with 60% GGBS was
comparable to that of the HAC-only THPCC after exposure to
600°C. Field emission
scanning electron microscopy (FESEM) images of THPCC
accompanied by Energy Dispersive X-ray (EDX) microanalysis
revealed that the microstructure deteriorated considerably after
exposure to elevated temperatures, which explains the decrease in
mechanical strength.
Abstract: Alpha-fetoprotein and its fragments may be an important vehicle for targeted delivery of radionuclides to a tumor. We investigated the effect of reaction conditions on the labeling of a biologically active synthetic peptide based on alpha-fetoprotein (F-afp) with technetium-99m. The influence of the nature of the buffer solution, pH, reductant concentration, peptide concentration and reaction temperature on the labeling yield was examined. The following optimal conditions for labeling of (F-afp) were found: pH 8.5 (phosphate and bicarbonate buffers) and pH 1.7 to 7.0 (citrate buffer). The reaction proceeds with sufficient yield at room temperature within 30 min when the concentrations of SnCl2 and (F-afp) are below 10 µg/ml and 25 µg/ml, respectively. Accumulation of the test compound in tumor cells of human breast cancer was investigated. The results suggest that in vivo studies of (F-afp) in experimental tumor lesions will show concentrations sufficient for imaging these lesions by SPECT.
Abstract: Trustworthy simulation results based on realistic driving
data are among the best ways to support converting a conventional
vehicle into a hybrid. To this end, this paper first presents a seven-
degree-of-freedom dynamic model of the vehicle. Using static models
of the engine, gearbox, clutch, differential, electric machine and
battery, the hybrid vehicle is then modeled, and a forward (pedals-to-
wheels) simulation of the power flow is obtained. A fuzzy controller
with a suitable rule base is then designed to achieve fuel economy and
regenerative braking. Finally, a series of MATLAB/Simulink
simulation results demonstrates the effectiveness of the proposed
structure.
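A minimal sketch of a fuzzy controller of the kind described above, in Python rather than MATLAB/Simulink; the input variables (accelerator pedal position, battery state of charge) and the rule base are illustrative assumptions, not the paper's actual design:

```python
# Mamdani-style fuzzy rule evaluation sketch. The rules below are invented
# for illustration: they split drive torque between engine and electric
# machine based on pedal demand and battery state of charge (SOC).

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def motor_share(pedal, soc):
    """Crisp share of torque supplied by the electric machine.

    pedal, soc in [0, 1]; defuzzified by a weighted average of
    singleton consequents (centroid of singletons).
    """
    # Antecedent memberships
    pedal_low  = tri(pedal, -0.5, 0.0, 0.6)
    pedal_high = tri(pedal,  0.4, 1.0, 1.5)
    soc_low    = tri(soc,   -0.5, 0.0, 0.6)
    soc_high   = tri(soc,    0.4, 1.0, 1.5)

    # Rule base (min for AND), consequents as singleton shares:
    rules = [
        (min(pedal_low,  soc_high), 1.0),  # light load, full battery -> electric drive
        (min(pedal_high, soc_high), 0.5),  # heavy load -> split engine/motor
        (min(pedal_low,  soc_low),  0.0),  # low SOC -> engine only (recharge)
        (min(pedal_high, soc_low),  0.0),
    ]
    num = sum(w * c for w, c in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0
```

A real controller would feed the resulting share into the torque demand of the electric machine model at each simulation step.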
Abstract: We investigate efficient spreading codes for transmitter-based techniques in code division multiple access (CDMA) systems. The channel is considered to be known at the transmitter, which is usual in a time division duplex (TDD) system, where the channel is assumed to be the same on the uplink and downlink. For such a TDD/CDMA system, both bitwise and blockwise multiuser transmission schemes are taken up, where complexity is transferred to the transmitter side so that the receiver has minimum complexity. Different spreading codes are considered at the transmitter to spread the signal efficiently over the entire spectrum. The bit error rate (BER) curves portray the efficiency of the codes in the presence of multiple access interference (MAI) as well as intersymbol interference (ISI).
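As a generic illustration of spreading in synchronous CDMA (not the paper's transmitter-based schemes), Walsh-Hadamard codes are one family of orthogonal spreading codes; with perfect synchronization, their orthogonality removes multiple access interference at the correlator:

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix, n a power of 2.
    Its rows form n mutually orthogonal spreading codes of length n."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

H = hadamard(8)                    # 8 orthogonal spreading codes of length 8
bits = np.array([1, -1, 1])        # BPSK symbols for users 0, 1, 2
chips = sum(b * H[u] for u, b in enumerate(bits))  # synchronous superposition

# Despreading: correlate the received chips with each user's own code;
# orthogonality of the codes cancels the other users' contributions (MAI).
recovered = [int(np.sign(chips @ H[u])) for u in range(3)]
```

In a dispersive channel the codes' cross-correlations are no longer zero, which is why the choice of spreading code affects the BER curves discussed in the abstract.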
Abstract: Prospective analysis is an important tool to identify the most relevant opportunities and needs in research and development arising from planned interventions in innovation systems. This study chose Phyllanthus niruri, known as "stone breaker", to describe the knowledge about the species, using biotechnological forecasting through the software Vantage Point. A considerable increase in studies on Phyllanthus niruri can be seen in recent years, and patents on this plant date back twenty-five years. India is the country that has carried out the most research on the species, with interest mainly in studies of hepatoprotective, antioxidant and anti-cancer activities. Brazil is in second place, with special interest in anti-tumor studies. Given the identification of the Brazilian groups that exploit the species, it is possible to mediate partnerships and cooperation aimed at helping to implement the Program of Herbal Medicines (phytotherapics) in Brazil.
Abstract: We report a computational study of the spreading
dynamics of a viral infection in a complex (scale-free) network. The
final epidemic size distribution (FESD) was found to be unimodal or
bimodal depending on the value of the basic reproductive
number R0. The FESDs occurred on time-scales long enough for
intermediate-time epidemic size distributions (IESDs) to be important
for control measures. The usefulness of R0 for deciding on the
timeliness and intensity of control measures was found to be limited
by the multimodal nature of the IESDs and by its inability to inform
on the speed at which the infection spreads through the population. A
reduction of the transmission probability at the hubs of the scale-free
network decreased the occurrence of the larger-sized epidemic events
of the multimodal distributions. For effective epidemic control, an
early reduction in transmission at the index cell and its neighbors was
essential.
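The kind of network epidemic simulation described above can be sketched as follows; the network size, attachment parameter, transmission probability and one-step infectious period are illustrative assumptions, not the study's actual settings:

```python
import random

def barabasi_albert(n, m, rng):
    """Scale-free network by preferential attachment: each new node
    links to m targets drawn with probability proportional to degree."""
    adj = {i: set() for i in range(n)}
    targets = list(range(m))
    repeated = []                   # node ids repeated once per incident edge
    for new in range(m, n):
        for t in set(targets):
            adj[new].add(t)
            adj[t].add(new)
        repeated.extend(targets)
        repeated.extend([new] * m)
        targets = rng.sample(repeated, m)
    return adj

def sir_final_size(adj, p_transmit, rng, index_case=0):
    """Discrete-time SIR to extinction; returns the final epidemic size
    (total number of nodes ever infected)."""
    infected, recovered = {index_case}, set()
    while infected:
        new = set()
        for i in infected:
            for nb in adj[i]:
                if nb not in infected and nb not in recovered \
                        and rng.random() < p_transmit:
                    new.add(nb)
        recovered |= infected       # one time-step infectious period
        infected = new - recovered
    return len(recovered)

rng = random.Random(42)
g = barabasi_albert(200, 2, rng)
sizes = [sir_final_size(g, 0.3, rng) for _ in range(50)]  # final-size samples
```

A histogram of `sizes` over many runs is the kind of final epidemic size distribution (FESD) whose unimodal or bimodal shape the abstract discusses; reducing `p_transmit` at high-degree nodes models the hub intervention.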
Abstract: The bonding configuration and the heat of adsorption
of a furfural molecule on the Pd(111) surface were determined by ab
initio density-functional-theory calculations. The dynamics of pure
liquid water, the liquid-solid interface formed by liquid water and the
Pd(111) surface, as well as furfural at the water-Pd interface, were
investigated by ab initio molecular dynamics simulations at finite
temperatures. Calculations and simulations suggest that the bonding
configurations at the water-Pd interface promote decarbonylation of
furfural.
Abstract: Data clustering is an important data exploration technique
with many applications in data mining. We present an enhanced
version of the well known single link clustering algorithm. We will
refer to this algorithm as DCBOR. The proposed algorithm alleviates
the chain effect by removing the outliers from the given dataset.
So this algorithm provides outlier detection and data clustering
simultaneously. This algorithm does not need to update the distance
matrix, since the algorithm depends on merging the most k-nearest
objects in one step and the cluster continues grow as long as possible
under specified condition. So the algorithm consists of two phases;
at the first phase, it removes the outliers from the input dataset. At
the second phase, it performs the clustering process. This algorithm
discovers clusters of different shapes, sizes and densities, and requires
only one input parameter, which represents a threshold for
outlier points. The value of this parameter ranges from 0 to
1, and the algorithm supports the user in determining an appropriate
value for it. We have tested the algorithm on different datasets
containing outliers and clusters connected by chains of density points,
and the algorithm discovers the correct clusters. The results of
our experiments demonstrate the effectiveness and the efficiency of
DCBOR.
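A simplified sketch of the two-phase idea (outlier removal, then single-link clustering) follows; the distance cut-offs used here are assumptions for illustration, not the authors' exact formulas:

```python
import math

def dist(p, q):
    """Euclidean distance between two 2-D points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def cluster_without_outliers(points, t):
    """t in (0, 1): fraction of the maximum nearest-neighbour (NN)
    distance used as the outlier cut-off (assumed interpretation of
    the algorithm's single parameter)."""
    nn = [min(dist(p, q) for q in points if q is not p) for p in points]
    core = [p for p, d in zip(points, nn) if d <= t * max(nn)]  # phase 1: drop outliers
    link = 3 * sorted(nn)[len(nn) // 2]  # merge radius: 3x median NN distance (assumed)
    # phase 2: single-link clustering -- merge clusters whose closest
    # members lie within the merge radius, until no merge is possible
    clusters = [[p] for p in core]
    merged = True
    while merged:
        merged = False
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                if any(dist(a, b) <= link for a in clusters[i] for b in clusters[j]):
                    clusters[i] += clusters.pop(j)
                    merged = True
                    break
            if merged:
                break
    return clusters

pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (50, 50)]  # (50, 50) is isolated
out = cluster_without_outliers(pts, 0.5)  # two clusters; the outlier is dropped
```

Removing the isolated point first is what prevents the chain effect the abstract mentions: without phase 1, a stray point between clusters can bridge them under single-link merging.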
Abstract: In this paper a new fast simplification method is presented. The method realizes Karnaugh maps with a large number of variables. To accelerate the operation of the proposed method, a new approach for fast detection of groups of ones is presented. This approach is implemented in the frequency domain: the search operation relies on performing cross correlation in the frequency domain rather than in the time domain. It is proved mathematically and practically that the number of computation steps required by the presented method is less than that needed by conventional cross correlation. Simulation results using MATLAB confirm the theoretical computations. Furthermore, a powerful solution for the realization of complex functions is given. The simplified functions are implemented using a new design for neural networks. Neural networks are used because they are fault tolerant and can therefore recognize signals even with noise or distortion, which is very useful for logic functions used in data and computer communications. Moreover, the implemented functions are realized with a minimum amount of components. This is done by using modular neural networks (MNNs) that divide the input space into several homogeneous regions. The approach is applied to implement the XOR function, 16 logic functions on the one-bit level, and a 2-bit digital multiplier. Compared to previous non-modular designs, a clear reduction in the order of computations and in hardware requirements is achieved.
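The core trick, searching for groups of ones by cross correlation computed in the frequency domain rather than the time domain, can be illustrated generically (this is a textbook FFT-correlation sketch, not the paper's K-map procedure):

```python
import numpy as np

# A binary signal and a pattern of ones to locate within it.
signal  = np.array([0, 1, 1, 0, 0, 1, 1, 1, 0, 0], dtype=float)
pattern = np.array([1, 1, 1], dtype=float)

# Cross-correlation via the convolution theorem:
#   corr = IFFT( FFT(signal) * conj(FFT(pattern)) )
# Zero-padding to n avoids circular wrap-around over the valid lags.
n = len(signal) + len(pattern) - 1
corr = np.fft.irfft(np.fft.rfft(signal, n) * np.conj(np.fft.rfft(pattern, n)), n)
corr = np.round(corr, 6)

# A full match occurs at lags where the correlation equals the
# pattern's energy (here, 3 ones in a row).
hits = np.flatnonzero(corr[:len(signal)] == pattern.sum())
```

For long signals this costs O(n log n) via the FFT, versus O(n·m) for sliding the length-m pattern directly, which is the speed-up the abstract appeals to.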
Abstract: This paper presents a study of the Taguchi design
application to optimize surface quality in damper inserted end milling
operation. Maintaining good surface quality usually involves
additional manufacturing cost or loss of productivity. The Taguchi
design is an efficient and effective experimental method in which a
response variable can be optimized, given various factors, using
fewer resources than a factorial design. This study included spindle
speed, feed rate, and depth of cut as control factors; the use of
different tools of the same specification introduced tool condition and
dimensional variability. An L9(3^4) orthogonal array was used;
ANOVA analyses were carried out to identify the significant factors
affecting surface roughness, and the optimal cutting combination was
determined by seeking the best surface roughness (response) and
signal-to-noise ratio. Finally, confirmation tests verified that the
Taguchi design was successful in optimizing milling parameters for
surface roughness.
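The "smaller-the-better" signal-to-noise ratio commonly used in Taguchi analysis of surface roughness can be sketched as follows; the roughness values below are invented for illustration and are not the paper's data:

```python
import math

def sn_smaller_better(values):
    """Taguchi smaller-the-better S/N ratio:
    S/N = -10 * log10( mean of squared responses )."""
    return -10 * math.log10(sum(v * v for v in values) / len(values))

# Replicated roughness measurements (um) for two hypothetical settings:
setting_a = [0.8, 0.9, 1.0]
setting_b = [0.5, 0.6, 0.55]

# The factor-level combination with the higher S/N ratio is preferred.
best = max([("A", setting_a), ("B", setting_b)],
           key=lambda s: sn_smaller_better(s[1]))[0]
```

In a full L9 analysis, this ratio is computed for each of the nine runs, and the mean S/N per factor level identifies the optimal cutting combination.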
Abstract: The Shadoo protein (Sho) was described in 2003 as the newest member of the prion protein superfamily [1]. Sho has structural motifs similar to those of the prion protein (PrP), which is known for its central role in transmissible spongiform encephalopathies. Although a great number of functions have been proposed, the exact physiological function of PrP is not yet known. Investigation of the function and localization of Sho may help us understand the function of the prion protein superfamily. Analyzing the subcellular localization of YFP-tagged forms of Sho, we detected the protein in the plasma membrane and in the nucleus of various cell lines. To reveal the localization of the endogenous protein, we generated antibodies against Shadoo and also employed commercially available anti-Shadoo antibodies: i) the EG62 anti-mouse Shadoo antibody generated by Eurogentec Ltd.; ii) the S-12 anti-human Shadoo antibody by Santa Cruz Biotechnology Inc.; iii) the R-12 anti-mouse Shadoo antibody by Santa Cruz Biotechnology Inc.; iv) the SPRN antibody against human Shadoo by Abgent Inc. We carried out immunocytochemistry on non-transfected HeLa, Zpl 2-1, Zw 3-5, GT1-1, GT1-7 and SH-SY5Y cells as well as on YFP-Sho, Sho-YFP, and YFP-GPI transfected HeLa cells. The antibodies' specificity (in an antibody-peptide competition assay) and co-localization (with the YFP signal) were assessed.
Abstract: Previous studies have shown that there is debate
regarding the reliability and validity of the Ashworth and Modified
Ashworth Scales for evaluating patients diagnosed with upper
limb disorders, as these evaluations depend on the raters' experience.
This motivated us to develop an upper limb disorder part-task trainer
that is able to simulate consistent upper limb disorder signs, such as
spasticity and rigidity, based on the Modified Ashworth Scale, to
reduce the variability occurring between raters and within raters
themselves. By providing consistent signs, novice therapists would be
able to increase training frequency and exposure towards various
levels of signs. A total of 22 physiotherapists and occupational
therapists participated in the study. The majority of the therapists
agreed that with current therapy education, they still face problems
with inter-raters and intra-raters variability (strongly agree 54%; n =
12/22, agree 27%; n = 6/22) in evaluating patients’ conditions. The
therapists strongly agreed (72%; n = 16/22) that therapy trainees
needed to increase their frequency of training, and therefore believed
that our initiative to develop an upper limb disorder training tool will
help improve clinical education (strongly agree and agree
63%; n = 14/22).
Abstract: Graph coloring is an important problem in computer
science and many algorithms are known for obtaining reasonably
good solutions in polynomial time. One method of comparing
different algorithms is to test them on a set of standard graphs where
the optimal solution is already known. This investigation analyzes a
set of 50 well known graph coloring instances according to a set of
complexity measures. These instances come from a variety of
sources some representing actual applications of graph coloring
(register allocation) and others (mycieleski and leighton graphs) that
are theoretically designed to be difficult to solve. The size of the
graphs ranged from ranged from a low of 11 variables to a high of
864 variables. The method used to solve the coloring problem was
the square of the adjacency (i.e., correlation) matrix. The results
show that the most difficult graphs to solve were the leighton and the
queen graphs. Complexity measures such as density, mobility,
deviation from uniform color class size and number of block
diagonal zeros are calculated for each graph. The results showed that
the most difficult problems have low mobility (in the range of .2-.5)
and relatively little deviation from uniform color class size.
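Two of the quantities mentioned above can be illustrated on a small graph; the 5-cycle below is an arbitrary example, not one of the 50 benchmark instances:

```python
import numpy as np

# Adjacency matrix of a 5-cycle (vertices 0-1-2-3-4-0).
A = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]])

n = A.shape[0]
density = A.sum() / (n * (n - 1))   # fraction of possible directed pairs joined

# Square of the adjacency matrix: entry (i, j), i != j, counts walks of
# length two, i.e. the number of common neighbours of vertices i and j;
# the diagonal holds vertex degrees.
A2 = A @ A
```

Common-neighbour counts are one reason the squared adjacency matrix is informative for coloring: vertices sharing many neighbours are constrained by the same color classes.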
Abstract: This research evaluated the technical feasibility of
making single-layer experimental particleboard panels from the
bamboo waste (Dendrocalamus asper Backer) generated when
converting bamboo into strips for laminated bamboo furniture. The
variable factors were board density (600, 700 and 800 kg/m3) and
conditioning temperature (25, 40 and 55 °C). The experimental panels were tested for
their physical and mechanical properties including modulus of
elasticity (MOE), modulus of rupture (MOR), internal bonding
strength (IB), screw holding strength (SH) and thickness swelling
values according to the procedures defined by Japanese Industrial
Standard (JIS). The test result of mechanical properties showed that
the MOR, MOE and IB values were not in the set criteria, except the
MOR values at the density of 700 kg/m3 at 25 °C and at the density
of 800 kg/m3 at 25 and 40 °C, the IB values at the density of 600
kg/m3 at 40 °C, and at the density of 800 kg/m3 at 55 °C. The SH
values met the set standard, except at
the density of 600 kg/m3 at 40 and 55 °C. In conclusion, bamboo
waste, a valuable renewable biomass, can be used to manufacture
boards.
Abstract: IEEE has designed 802.11i protocol to address the
security issues in wireless local area networks. Formal analysis is
important to ensure that the protocols work properly without having
to resort to tedious testing and debugging which can only show the
presence of errors, never their absence. In this paper, we present
the formal verification of an abstract protocol model of 802.11i.
We translate the 802.11i protocol into the Strand Space Model and
then prove the authentication property of the resulting model using
the Strand Space formalism. The intruder in our model is imbued
with powerful capabilities, and the repercussions of possible attacks
are evaluated. Our analysis proves that the authentication of 802.11i is
not compromised in the presented model. We further demonstrate
how changes in our model will yield a successful man-in-the-middle
attack.
Abstract: Lately, interest has grown greatly in the use of
RFID in an unprecedented range of applications. This is shown by the
adoption of RFID capabilities by major software companies such as
Microsoft, IBM, and Oracle in their major software products. For
example, the Microsoft SharePoint 2010 workflow is now fully
compatible with the RFID platform. In addition, Microsoft BizTalk
Server is also capable of acquiring data from RFID sensors. This will
lead to applications that require a high bit rate, long range and
multimedia content. Higher frequencies of operation have been
designated for RFID tags, among them 2.45 and 5.8 GHz. A higher
frequency means longer range and a higher bit rate, but the drawback
is greater cost. In this paper we present a single-layer, low-
profile patch antenna operating at 5.8 GHz with a purely resistive input
impedance of 50 Ω and close to directive radiation. We also propose
a modification to the design in order to improve the operating
bandwidth from 8.7 to 13.8.
Abstract: Scalability poses a severe threat to the existing
DRAM technology. The capacitors that are used for storing and
sensing charge in DRAM are generally not scaled beyond 42nm.
This is because the capacitors must be sufficiently large for reliable
sensing and charge storage mechanism. This leaves DRAM memory
scaling in jeopardy, as charge sensing and storage mechanisms
become extremely difficult. In this paper we provide an overview of
the potential and the possibilities of using Phase Change Memory
(PCM) as an alternative to the existing DRAM technology. The
main challenges that we encounter in using PCM are its limited
endurance, high access latencies, and higher dynamic energy
consumption than conventional DRAM. We then provide
an overview of various methods, which can be employed to
overcome these drawbacks. Hybrid memories involving both PCM
and DRAM can be used to achieve good tradeoffs in access latency
and storage density. We conclude by presenting the results of these
methods, which make PCM a potential replacement for the current
DRAM technology.
Abstract: The study investigated the practices of organisations in Gulf Cooperation Council (GCC) countries with regard to G2C e-government maturity. It reveals that G2C e-government initiatives in the surveyed countries in particular, and arguably around the world in general, are progressing slowly because of the lack of a trusted and secure medium to authenticate the identities of online users. The authors conclude that national ID schemes will play a major role in helping governments reap the benefits of e-government if the three advanced technologies of smart cards, biometrics and public key infrastructure (PKI) are utilised to provide a reliable and trusted authentication medium for e-government services.
Abstract: Fine-grained data replication over the Internet allows duplication of frequently accessed data objects, as opposed to entire sites, to certain locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit, since they are driven by different goals, such as minimizing their communication costs, latency, etc. In this paper, we use game-theoretical techniques, and in particular auctions, to identify a bidding mechanism that encapsulates the selfishness of the agents while having a controlling hand over them. In essence, the proposed game-theory-based mechanism studies what happens when independent agents act selfishly and how to control them to maximize the overall performance. A bidding mechanism asks how one can design systems so that the agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality while maintaining fast execution time. The comparisons are recorded against some well known techniques such as greedy, branch and bound, game-theoretical auctions and genetic algorithms.
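One standard mechanism of the game-theoretic kind discussed above is a Vickrey (second-price) auction; the sketch below is a generic toy example of such a bidding step, not the paper's actual mechanism (the site names and savings are invented):

```python
# Sites bid the access-cost saving they claim they would obtain from
# hosting a replica of a data object. A Vickrey rule awards the replica
# to the highest bidder but charges the second-highest bid, which makes
# truthful bidding a dominant strategy for selfish agents.

def vickrey_award(bids):
    """bids: {site: claimed saving}. Returns (winner, price charged)."""
    ranked = sorted(bids, key=bids.get, reverse=True)
    winner = ranked[0]
    price = bids[ranked[1]] if len(ranked) > 1 else 0.0
    return winner, price

bids = {"siteA": 40.0, "siteB": 65.0, "siteC": 30.0}
winner, price = vickrey_award(bids)   # siteB wins and pays 40.0
```

Because the price is independent of the winner's own bid, agents cannot gain by misreporting their savings, which is the "controlling hand" over selfish agents that the abstract describes.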
Abstract: The paper presents relations between the air velocity values reproduced by a laser Doppler anemometer (LDA) and an ultrasonic anemometer (UA) and those calculated from flow rate measurements using a gas meter whose calibration uncertainty is ±(0.15 – 0.30)%. The investigation was performed in a channel installed in the aerodynamic facility used as part of the national standard of air velocity. The relations defined in this research confirm that the LDA and UA are the most advantageous instruments for air velocity reproduction. The results affirm the ultrasonic anemometer to be a reliable and favourable instrument for measuring mean velocity, or for controlling velocity stability, in the velocity range of 0.05 m/s – 10 (15) m/s when the LDA is used. The main aim of this research is to investigate low-velocity regularities, starting from 0.05 m/s, covering the regions of turbulent, laminar and transitional air flows. Theoretical and experimental results and a brief analysis of them are given in the paper. Maximum-to-mean velocity relations for transitional air flow, which has a unique distribution, are presented. Transitional flow, whose characteristics are distinct from those of both laminar and turbulent flow, has not yet been analysed experimentally.