Abstract: This paper presents a study of the Taguchi design
application to optimize surface quality in damper inserted end milling
operation. Maintaining good surface quality usually involves
additional manufacturing cost or loss of productivity. The Taguchi
design is an efficient and effective experimental method in which a
response variable can be optimized, given various factors, using
fewer resources than a factorial design. This study used spindle
speed, feed rate, and depth of cut as control factors, and employed
different tools of the same specification, which introduced
tool-condition and dimensional variability. An L9(3^4) orthogonal array was used;
ANOVA analyses were carried out to identify the significant factors
affecting surface roughness, and the optimal cutting combination was
determined by seeking the best surface roughness (response) and
signal-to-noise ratio. Finally, confirmation tests verified that the
Taguchi design was successful in optimizing milling parameters for
surface roughness.
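The smaller-the-better signal-to-noise ratio that Taguchi analysis applies to a response such as surface roughness can be sketched as follows; the Ra values below are hypothetical, for illustration only, not the paper's measurements.

```python
import math

def sn_smaller_is_better(values):
    """Taguchi signal-to-noise ratio for a smaller-the-better response
    such as surface roughness: S/N = -10 * log10(mean(y^2))."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

# Hypothetical Ra measurements (um) from two repeated runs.
run_a = [0.82, 0.86, 0.84]
run_b = [1.40, 1.55, 1.48]

sn_a = sn_smaller_is_better(run_a)
sn_b = sn_smaller_is_better(run_b)

# The run with the higher S/N ratio is the more robust (better) setting.
print(f"run_a S/N = {sn_a:.2f} dB, run_b S/N = {sn_b:.2f} dB")
```

In a full L9 analysis, the mean S/N ratio is computed for each level of each factor, and the optimal combination takes the level with the highest mean S/N per factor.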
Abstract: Shadoo protein (Sho) was described in 2003 as the newest member of the prion protein superfamily [1]. Sho has structural motifs similar to those of the prion protein (PrP), which is known for its central role in transmissible spongiform encephalopathies. Although a great number of functions have been proposed, the exact physiological function of PrP is not yet known. Investigating the function and localization of Sho may help us understand the function of the prion protein superfamily. Analyzing the subcellular localization of YFP-tagged forms of Sho, we detected the protein in the plasma membrane and in the nucleus of various cell lines. To reveal the localization of the endogenous protein, we generated antibodies against Shadoo as well as employed commercially available anti-Shadoo antibodies: i) the EG62 anti-mouse Shadoo antibody generated by Eurogentec Ltd.; ii) the S-12 anti-human Shadoo antibody by Santa Cruz Biotechnology Inc.; iii) the R-12 anti-mouse Shadoo antibody by Santa Cruz Biotechnology Inc.; iv) the SPRN antibody against human Shadoo by Abgent Inc. We carried out immunocytochemistry on non-transfected HeLa, Zpl 2-1, Zw 3-5, GT1-1, GT1-7 and SH-SY5Y cells as well as on YFP-Sho, Sho-YFP, and YFP-GPI transfected HeLa cells. Their specificity (in an antibody-peptide competition assay) and co-localization (with the YFP signal) were assessed.
Abstract: Previous studies have shown that there are arguments
regarding the reliability and validity of the Ashworth and Modified
Ashworth Scale towards evaluating patients diagnosed with upper
limb disorders. These evaluations depended on the raters’ experiences.
This motivated us to develop an upper limb disorder part-task trainer
able to simulate consistent upper limb disorder signs, such as
spasticity and rigidity, based on the Modified Ashworth Scale, in
order to reduce the variability occurring between raters and within
raters themselves. By providing consistent signs, novice therapists would be
able to increase training frequency and exposure towards various
levels of signs. A total of 22 physiotherapists and occupational
therapists participated in the study. The majority of the therapists
agreed that with current therapy education, they still face problems
with inter-raters and intra-raters variability (strongly agree 54%; n =
12/22, agree 27%; n = 6/22) in evaluating patients’ conditions. The
therapists strongly agreed (72%; n = 16/22) that therapy trainees
needed to increase their frequency of training, and therefore believed
that our initiative to develop an upper limb disorder training tool will
help improve clinical education (strongly agree and agree
63%; n = 14/22).
Abstract: Graph coloring is an important problem in computer
science and many algorithms are known for obtaining reasonably
good solutions in polynomial time. One method of comparing
different algorithms is to test them on a set of standard graphs where
the optimal solution is already known. This investigation analyzes a
set of 50 well known graph coloring instances according to a set of
complexity measures. These instances come from a variety of
sources, some representing actual applications of graph coloring
(register allocation) and others (Mycielski and Leighton graphs)
theoretically designed to be difficult to solve. The size of the
graphs ranged from a low of 11 vertices to a high of
864 vertices. The method used to solve the coloring problem was
the square of the adjacency (i.e., correlation) matrix. The results
show that the most difficult graphs to solve were the Leighton and the
queen graphs. Complexity measures such as density, mobility,
deviation from uniform color class size and number of block
diagonal zeros are calculated for each graph. The results showed that
the most difficult problems have low mobility (in the range of .2-.5)
and relatively little deviation from uniform color class size.
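Two of the quantities the abstract relies on, edge density and the square of the adjacency matrix, can be illustrated with a small sketch; the 4-cycle graph below is an arbitrary example, not one of the 50 benchmark instances.

```python
import numpy as np

def density(adj):
    """Edge density of an undirected graph: 2E / (n(n-1))."""
    n = adj.shape[0]
    edges = np.triu(adj, 1).sum()  # count each undirected edge once
    return 2.0 * edges / (n * (n - 1))

# Tiny illustrative graph: a 4-cycle (0-1-2-3-0), 4 of 6 possible edges.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])

d = density(A)
# Entry (i, j) of A @ A counts the walks of length 2 from i to j,
# which is the quantity the adjacency-square solution method works with.
walks_0_to_2 = (A @ A)[0, 2]
print(f"density = {d:.3f}, two-step walks from 0 to 2 = {walks_0_to_2}")
```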
Abstract: This research evaluated the technical feasibility of
making single-layer experimental particleboard panels from bamboo
waste (Dendrocalamus asper Backer) generated when bamboo is
converted into the strips used to make laminated bamboo furniture.
The variable factors were density (600, 700 and 800 kg/m3) and
conditioning temperature (25, 40 and 55 °C). The experimental panels were tested for
their physical and mechanical properties including modulus of
elasticity (MOE), modulus of rupture (MOR), internal bonding
strength (IB), screw holding strength (SH) and thickness swelling
values according to the procedures defined by Japanese Industrial
Standard (JIS). The test result of mechanical properties showed that
the MOR, MOE and IB values were not in the set criteria, except the
MOR values at the density of 700 kg/m3 at 25 °C and at the density
of 800 kg/m3 at 25 and 40 °C, the IB values at the density of 600
kg/m3, at 40 °C, and at the density of 800 kg/m3 at 55 °C. The SH
values met the set standard, except at the density of 600 kg/m3
at 40 and 55 °C. In conclusion, bamboo waste, a valuable
renewable biomass, could be used to manufacture boards.
Abstract: IEEE designed the 802.11i protocol to address the
security issues in wireless local area networks. Formal analysis is
important to ensure that the protocols work properly without having
to resort to tedious testing and debugging which can only show the
presence of errors, never their absence. In this paper, we present
the formal verification of an abstract protocol model of 802.11i.
We translate the 802.11i protocol into the Strand Space Model and
then prove the authentication property of the resulting model using
the Strand Space formalism. The intruder in our model is imbued
with powerful capabilities, and the repercussions of possible attacks
are evaluated. Our analysis proves that the authentication of 802.11i is
not compromised in the presented model. We further demonstrate
how changes in our model will yield a successful man-in-the-middle
attack.
Abstract: Lately, interest has grown greatly in the use of
RFID in an unprecedented range of applications. This is shown in
the adoption by major software companies such as Microsoft, IBM,
and Oracle of RFID capabilities in their major software products.
For example, the Microsoft SharePoint 2010 workflow is now fully
compatible with the RFID platform. In addition, Microsoft BizTalk
Server is also capable of acquiring data from all RFID sensors. This
will lead to applications that require high bit rates, long range, and
multimedia content. Higher frequencies of operation have been
designated for RFID tags, among them 2.45 and 5.8 GHz. A higher
frequency means a longer range and a higher bit rate, but the
drawback is greater cost. In this paper we present a single-layer,
low-profile patch antenna that operates at 5.8 GHz with a purely
resistive input impedance of 50 Ω and close-to-directive radiation.
We also propose a modification to the design in order to improve
the operating bandwidth from 8.7 to 13.8.
Abstract: Scalability poses a severe threat to the existing
DRAM technology. The capacitors that are used for storing and
sensing charge in DRAM are generally not scaled beyond 42nm.
This is because the capacitors must be sufficiently large for reliable
sensing and charge storage. This leaves DRAM memory
scaling in jeopardy, as charge sensing and storage mechanisms
become extremely difficult. In this paper we provide an overview of
the potential and the possibilities of using Phase Change Memory
(PCM) as an alternative for the existing DRAM technology. The
main challenges we encounter in using PCM are its limited
endurance, high access latencies, and higher dynamic energy
consumption than that of conventional DRAM. We then provide
an overview of various methods that can be employed to
overcome these drawbacks. Hybrid memories involving both PCM
and DRAM can be used to achieve good tradeoffs in access latency
and storage density. We conclude by presenting the results of these
methods, which make PCM a potential replacement for the current
DRAM technology.
Abstract: Fine-grained data replication over the Internet allows duplication of frequently accessed data objects, as opposed to entire sites, to certain locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit, since they are driven by different goals, such as minimizing their communication costs, latency, etc. In this paper, we use game-theoretical techniques, and in particular auctions, to identify a bidding mechanism that encapsulates the selfishness of the agents while keeping a controlling hand over them. In essence, the proposed game-theory-based mechanism is the study of what happens when independent agents act selfishly and how to control them to maximize the overall performance. A bidding mechanism asks how one can design systems so that agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality while maintaining fast execution time. The comparisons are recorded against some well-known techniques such as greedy, branch and bound, game-theoretical auctions, and genetic algorithms.
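A minimal sketch of the kind of sealed-bid allocation step such a bidding mechanism builds on, under the assumption that each agent bids the benefit (e.g. communication-cost saving) it expects from hosting a replica; the site names and bid values are invented for illustration and are not from the paper.

```python
def allocate_replica(bids):
    """Minimal sealed-bid auction sketch: the mechanism awards the
    replica of an object to the agent submitting the highest bid,
    so selfish bidding still drives the replica toward the site that
    benefits most (hypothetical mechanism, for illustration only)."""
    winner = max(bids, key=bids.get)
    return winner, bids[winner]

# Hypothetical bids from three site agents for one data object.
bids = {"site_A": 12.5, "site_B": 30.0, "site_C": 21.0}
winner, price = allocate_replica(bids)
print(winner, price)  # site_B 30.0
```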
Abstract: In the artificial intelligence field, knowledge
representation and reasoning are important areas for intelligent
systems, especially knowledge base systems and expert systems.
Knowledge representation methods have an important role in
designing such systems. There have been many models for knowledge,
such as semantic networks, conceptual graphs, and neural networks.
These models are useful tools for designing intelligent systems. However,
they are not suitable for representing knowledge in real-world
application domains. In this paper, new models for knowledge representation
called computational networks will be presented. They have been
used in designing some knowledge base systems in education for
solving problems, such as a system that supports studying
knowledge and solving analytic geometry problems, a program for
studying and solving problems in plane geometry, and a program for
solving problems about alternating current in physics.
Abstract: The paper depicts the relations between air velocity values reproduced by a laser Doppler anemometer (LDA) and an ultrasonic anemometer (UA) and those calculated from flow rate measurements using a gas meter whose calibration uncertainty is ± (0.15 – 0.30) %. The investigation was performed in a channel installed in an aerodynamic facility used as part of the national standard of air velocity. The relations defined in this research let us confirm the LDA and UA to be the most advantageous instruments for air velocity reproduction. The results affirm the ultrasonic anemometer to be a reliable and favourable instrument for measuring mean velocity or controlling velocity stability in the velocity range of 0.05 m/s – 10 (15) m/s when the LDA is used. The main aim of this research is to investigate low-velocity regularities, starting from 0.05 m/s, covering the turbulent, laminar and transitional air flow regions. Theoretical and experimental results and a brief analysis of them are given in the paper. Maximum and mean velocity relations for transitional air flow, which has a unique distribution, are represented. Transitional flow, whose characteristics are distinctive and different from those of laminar and turbulent flow, has not yet been analysed experimentally.
Abstract: A minimal complexity version of component mode
synthesis is presented that requires simplified computer
programming, but still provides adequate accuracy for modeling
lower eigenproperties of large structures and their transient
responses. The novelty is that a structural separation into components
is done along a plane/surface that exhibits rigid-like behavior; thus
the normal modes of each component alone are sufficient, without
computing any constraint, attachment, or residual-attachment modes.
The approach requires only such input information as a few (lower)
natural frequencies and corresponding undamped normal modes of
each component. A novel technique for formulating the equations of
motion is presented, in which a double transformation to generalized
coordinates is employed, and the formulation of a nonproportional
damping matrix in generalized coordinates is given.
Abstract: Due to the three-dimensional flow pattern interacting with bed material, the process of local scour around bridge piers is complex. Modeling the 3D flow field and scour hole evolution around a bridge pier is more feasible nowadays because computational cost and computational time have significantly decreased. In order to evaluate local flow and scouring around a bridge pier, a completely three-dimensional numerical model, the SSIIM program, was used. The model solves the 3-D Navier-Stokes equations and a bed load conservation equation. The model was applied to simulate local flow and scouring around a bridge pier in a large natural river with four piers. Computation for one day of flood conditions was carried out to predict the maximum local scour depth. The results show that the SSIIM program can be used efficiently for simulating scouring in natural rivers. The results also showed that, among the various turbulence models, the k-ω model gives the most reasonable results.
Abstract: This experiment was carried out to study the effects of
AMF, drought stress and phosphorus on physiological growth indices of basil in Iran using a split-plot design with three replications.
The main-plot factor included two levels of irrigation regime (control = no drought stress, and irrigation after 80 evaporation =
drought stress condition), while the sub-plot factors included
phosphorus (0, 35 and 70 kg/ha) and application versus non-application of Glomus fasciculatum. The results showed that total dry matter
(TDM), leaf area index (LAI), relative growth rate (RGR) and crop growth rate (CGR) all differed highly significantly among the
phosphorus levels, whereas drought stress had a practically
significant effect on TDM, LAI, RGR and CGR. The results also showed that the highest TDM, LAI, RGR and CGR were obtained from
application of Glomus fasciculatum under no-drought condition.
Abstract: Various methods based on regression ideas have been created to resolve the problem of data sets containing censored observations, e.g. the Buckley-James method, Miller's method, the Cox method, and the Koul-Susarla-Van Ryzin estimators. Even though comparison studies show that the Buckley-James method performs better than some other methods, it is still rarely used by researchers, mainly because of the limited diagnostic analysis developed for the Buckley-James method thus far. Therefore, a diagnostic tool for the Buckley-James method is proposed in this paper. It is called the renovated Cook's Distance (RD*i) and has been developed based on Cook's idea. The renovated Cook's Distance (RD*i) has advantages (depending on the analyst's demand) over (i) the change in the fitted value for a single case, DFIT*i, as it measures the influence of case i on all n fitted values Ŷ* (not just the fitted value for case i, as DFIT*i does), and (ii) the change in the estimate of the coefficients when the ith case is deleted, DBETA*i, since DBETA*i corresponds to the number of variables p, so it is usually easier to look at a diagnostic measure such as RD*i, since information from the p variables can be considered simultaneously. Finally, an example using the Stanford Heart Transplant data is provided to illustrate the proposed diagnostic tool.
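The classical (uncensored) Cook's Distance that RD*i renovates can be sketched as follows; this is the standard OLS version, not the paper's Buckley-James adaptation, and the data are synthetic.

```python
import numpy as np

def cooks_distance(X, y):
    """Classical Cook's Distance for ordinary least squares:
    D_i = r_i^2 * h_ii / (p * s^2 * (1 - h_ii)^2), measuring the
    influence of case i on all n fitted values at once. The paper's
    renovated RD*i adapts this idea to censored-data regression."""
    n, p = X.shape
    H = X @ np.linalg.inv(X.T @ X) @ X.T        # hat matrix
    h = np.diag(H)                              # leverages h_ii
    resid = y - H @ y                           # OLS residuals
    s2 = resid @ resid / (n - p)                # residual variance
    return resid**2 * h / (p * s2 * (1.0 - h) ** 2)

# Synthetic example with one deliberately perturbed (influential) case.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 20)
X = np.column_stack([np.ones_like(x), x])
y = 2.0 + 0.5 * x + rng.normal(0, 0.1, 20)
y[-1] += 5.0                                    # perturb the last case
D = cooks_distance(X, y)
print("most influential case:", int(np.argmax(D)))
```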
Abstract: In this study, the effects of machining parameters on
specific energy during surface grinding of 6061Al-SiC35P
composites are investigated. The volume percentage of SiC, feed, and
depth of cut were chosen as process variables. The power needed for
the calculation of the specific energy is measured using the
two-wattmeter method. Experiments are conducted using a standard
RSM design, the central composite design (CCD). A second-order
response surface model was developed for the specific energy. The
results identify the significant factors for minimizing the specific energy. The confirmation
results demonstrate the practicability and effectiveness of the
proposed approach.
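Specific grinding energy is conventionally defined as cutting power divided by the volumetric material removal rate, u = P / (v_w · a_p · b); a minimal sketch with hypothetical values (not the paper's measurements):

```python
def specific_energy(power_w, feed_m_s, depth_m, width_m):
    """Specific grinding energy u = P / (v_w * a_p * b): measured
    cutting power divided by the volumetric material removal rate
    (table feed x depth of cut x grinding width)."""
    mrr = feed_m_s * depth_m * width_m   # material removal rate, m^3/s
    return power_w / mrr                 # J/m^3

# Hypothetical inputs: 900 W net power, 0.1 m/s feed,
# 20 um depth of cut, 10 mm grinding width.
u = specific_energy(power_w=900.0, feed_m_s=0.1, depth_m=20e-6, width_m=0.01)
print(f"u = {u / 1e9:.1f} GJ/m^3")  # 45.0 GJ/m^3
```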
Abstract: This study proposes a basic molecular formula for all
proteins. A total of 10,739 proteins belonging to 9 different protein
groups classified on the basis of their functions were selected
randomly. They included enzymes, storage proteins, hormones,
signalling proteins, structural proteins, transport proteins,
immunoglobulins or antibodies, motor proteins and receptor proteins.
After obtaining the protein molecular formula using the ProtParam
tool, the H/C, N/C, O/C, and S/C ratios were determined for each
randomly selected sample. In this case, H, N, O, and S coefficients
were specified per carbon atom. Surprisingly, the results
demonstrated that H, N, O, and S coefficients for all 10,739 proteins
are similar and highly correlated. This study demonstrates that
despite differences in the structure and function, all known proteins
have a similar basic molecular formula:
CnH(1.58±0.015)nN(0.28±0.005)nO(0.30±0.007)nS(0.01±0.002)n. The
total correlation between all coefficients was found to be 0.9999.
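The per-carbon coefficients reported here can be recomputed from any empirical formula string; a minimal sketch, using a hypothetical formula chosen only to match the abstract's average coefficients (not an actual ProtParam output):

```python
import re

def per_carbon_coefficients(formula):
    """Compute the H/C, N/C, O/C and S/C ratios from an empirical
    protein formula string of the kind reported by the ProtParam tool,
    i.e. the H, N, O, S coefficients per carbon atom."""
    counts = {el: int(n) for el, n in re.findall(r"([A-Z][a-z]?)(\d+)", formula)}
    c = counts["C"]
    return {el: counts.get(el, 0) / c for el in ("H", "N", "O", "S")}

# Hypothetical formula consistent with the reported averages.
ratios = per_carbon_coefficients("C1000H1580N280O300S10")
print(ratios)  # {'H': 1.58, 'N': 0.28, 'O': 0.3, 'S': 0.01}
```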
Abstract: In the present article, a new method has been developed to enhance the application of equipment monitoring, which in turn improves the economic impact of condition-based maintenance in an automobile parts manufacturing factory. This study also describes how effective software with a simple database can be utilized to achieve cost-effective improvements in maintenance performance. The most important results of this project are: 1. a 63% reduction in direct and indirect maintenance costs; 2. creating a proper database to analyse failures; 3. creating a method to control system performance and extend it to similar systems; 4. designing software to analyse the database and consequently create the technical knowledge to face unusual conditions of the system. Moreover, the results of this study have shown that the concept and philosophy of maintenance have not been understood in most Iranian industries. Thus, more investment is strongly required to improve maintenance conditions.
Abstract: The main aim is to perform mutational analysis of exon 1 of the CTLA4 gene in SLE patients. A total of 61 SLE patients fulfilling the American College of Rheumatology (ACR) criteria and 61 controls were enrolled in this study. The region of CTLA4 gene exon 1 was amplified using a step-down PCR technique. The extracted DNA of the 354 bp band was sequenced to analyze mutations in exon 1 of the CTLA4 gene. Protein sequences were then derived from the nucleotide sequences of CTLA4 exon 1 using Expasy software. No variations were found when the patients' sequences were compared with the control sequence. Furthermore, using BLASTP it was found that the CTLA4 protein sequences of Pakistani SLE patients were similar to those of the Chinese SLE population. Thus, the CTLA4 gene may not be responsible for the autoimmune disease SLE.
Abstract: The study investigated the hydrophilic to hydrophobic
transition of modified polyacrylamide hydrogel with the inclusion of
N-isopropylacrylamide (NIPAM). The modification was done by
mimicking micellar polymerization, which resulted in a better
arrangement of NIPAM chains in the polyacrylamide network. The
degree of NIPAM arrangement is described by the NH number. The
hydrophilic to hydrophobic transition was measured through the
partition coefficient, K, of Orange II and Methylene Blue in hydrogel
and in water. These dyes were chosen as a model for solutes with
different degrees of hydrophobicity. The study showed that
hydrogels with higher NH values gave better solubility of both
dyes. Moreover, temperatures above the lower critical solution
temperature (LCST) of poly(N-isopropylacrylamide) (PNIPAM)
caused the collapse of the NIPAM chains, which results in a more
hydrophobic environment that increases the solubility of Methylene
Blue and decreases the solubility of Orange II in the hydrogels with
NIPAM present.
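The partition coefficient K used here as the hydrophobicity probe is the ratio of equilibrium dye concentrations in the hydrogel and in water; a minimal sketch with hypothetical concentrations, for illustration only:

```python
def partition_coefficient(conc_in_gel, conc_in_water):
    """Partition coefficient K of a dye between hydrogel and water,
    K = C_gel / C_water; K > 1 indicates preferential uptake of the
    dye by the gel phase."""
    return conc_in_gel / conc_in_water

# Hypothetical equilibrium concentrations (same units cancel out).
K = partition_coefficient(3.2, 1.6)
print(K)  # 2.0
```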