Abstract: The dielectric properties and ionic conductivity of
novel "ceramic state" polymer electrolytes for high-capacity lithium
batteries are characterized by radio-frequency and microwave
methods in two broad frequency ranges, 50 Hz to 20 kHz and 4
GHz to 40 GHz. This innovative solid polymer electrolyte, which is
highly ionically conductive (10^-3 S/cm at room temperature) from
-40 °C to +150 °C, can be used in any battery application. The
polymer exhibits properties more like those of a ceramic than of a
polymer. The applied measurement methods produced accurate
dielectric results for a comprehensive analysis of the electrochemical
properties and ion transport mechanism of this newly invented
polymer electrolyte. Two techniques and instruments, air-gap
measurement with a capacitance bridge and in-waveguide measurement
with a vector network analyzer, are applied to measure the complex
dielectric spectra. The complex dielectric spectra are used to
determine the complex alternating-current electrical conductivity and
thus the ionic conductivity.
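The final step of that analysis, recovering AC conductivity from the
measured loss spectra, follows the standard relation
sigma(omega) = omega * eps0 * eps''(omega). A minimal sketch (the loss
value used in the example is illustrative, not a measurement from the
paper):

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def ac_conductivity(freq_hz, eps_loss):
    """AC conductivity (S/m) from the dielectric loss factor eps'':
    sigma(omega) = omega * eps0 * eps''(omega)."""
    omega = 2.0 * math.pi * freq_hz  # angular frequency, rad/s
    return omega * EPS0 * eps_loss

# Illustrative: a loss factor of ~1800 at 1 MHz corresponds to about
# 0.1 S/m, i.e. the 1e-3 S/cm quoted for the electrolyte.
sigma = ac_conductivity(1e6, 1800.0)
```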
Abstract: One image is worth more than a thousand words.
Images, if analyzed, can reveal useful information. Low-level image
processing deals with the extraction of specific features from a single
image. The question then arises: what technique should be used to
extract patterns from a very large and detailed image database? The
answer is image mining. Image mining deals with the extraction of
image data relationships, implicit knowledge, and other patterns from
a collection of images or an image database; it is an extension of
data mining. In this paper, we not only scrutinize the current
techniques of image mining but also present a new technique for
mining images using a Genetic Algorithm.
Abstract: This study was carried out to evaluate the nutritional
composition of the African River Prawn (Macrobrachium
vollenhovenii) in relation to Chokor (traditional) and Altona
(improved traditional) drying techniques used in the preservation and
processing of prawns by carrying out proximate composition
analysis. The values obtained from the proximate analysis of Chokor
and Altona smoke-dried prawns were: moisture (14.90% and
15.15%), dry matter (85.10% and 84.85%), protein (55.80% and
58.87%), crude fat (1.95% and 1.98%), crude fibre (21.40% and
13.11%), carbohydrate (0.54% and 0.54%) and ash (19.76% and
15.86%) respectively. The mineral composition of Chokor and Altona
smoke-dried prawns was calcium (5.66% and 4.20%) and phosphorus
(9.22% and 6.34%) respectively. Results show that there was no loss
of nutritional value with either the Chokor or the Altona drying
technique used in the processing of prawns.
Abstract: Speech enhancement is a long-standing problem with
numerous applications such as teleconferencing, VoIP, hearing aids and
speech recognition. The motivation behind this research work is to
obtain a clean speech signal of higher quality by applying the optimal
noise cancellation technique. Real-time adaptive filtering algorithms
seem to be the best candidate among all categories of the speech
enhancement methods. In this paper, we propose a speech
enhancement method based on Recursive Least Squares (RLS)
adaptive filtering of speech signals. Experiments were performed on
noisy data prepared by adding AWGN, babble and pink noise to clean
speech samples at -5 dB, 0 dB, 5 dB and 10 dB SNR levels. We then
compare the noise cancellation performance of the proposed RLS
algorithm with the existing NLMS algorithm in terms of Mean
Squared Error (MSE), Signal-to-Noise Ratio (SNR) and SNR loss.
Based on the performance evaluation, the proposed RLS algorithm
was found to be the better noise cancellation technique for speech
signals.
Abstract: The aim of the present paper is to obtain certain types
of relations for the well-known hypergeometric functions by
employing fractional derivative and integral techniques.
Abstract: The use of technology in the classroom is an issue that
is constantly evolving. Digital-age students learn differently than
their teachers did, so teachers should constantly evolve their methods
and teaching techniques to stay in touch with their students. In this
paper, a case study presents how some of these technologies were used
to accompany a classroom course, in order to provide students with an
experience different from and more innovative than the way their
teacher usually presented the activities. As students worked on the
various activities, they increased their digital skills by employing
previously unfamiliar tools that helped them in their professional
training. The twenty-first-century teacher should consider the use of
Information and Communication Technologies in the classroom with
the skills that digital-age students should possess in mind. The
paper also takes a brief look at the history of distance education
and highlights the importance of integrating technology into the
student's training.
Abstract: An analysis is carried out to investigate the effect of
magnetic field and heat source on the steady boundary layer flow and
heat transfer of a Casson nanofluid over a vertical cylinder stretching
exponentially along its radial direction. Using a similarity
transformation, the governing mathematical equations, together with
the boundary conditions, are reduced to a system of coupled,
non-linear ordinary differential equations. The resulting system is
solved numerically by the fourth-order Runge-Kutta scheme with a
shooting technique. The influence of various physical parameters,
such as the Reynolds number, Prandtl number, magnetic field,
Brownian motion parameter, thermophoresis parameter, Lewis number
and natural convection parameter, on the non-dimensional velocity,
temperature and nanoparticle volume fraction is presented graphically
and discussed. Numerical data for the skin-friction coefficient, local
Nusselt number and local Sherwood number have been tabulated
for various parametric conditions. It is found that the local Nusselt
number is a decreasing function of both the Brownian motion
parameter Nb and the thermophoresis parameter Nt.
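The solution scheme, fourth-order Runge-Kutta integration combined
with shooting on the unknown initial slope, can be illustrated on a
simple model boundary value problem (y'' = -y, y(0) = 0, y(pi/2) = 1,
exact solution sin x) rather than the coupled nanofluid equations
themselves; this is a sketch of the generic technique, not the paper's
solver:

```python
import math

def rk4(f, y0, x0, x1, n=200):
    """Integrate the first-order system y' = f(x, y) with classical RK4."""
    h = (x1 - x0) / n
    x, y = x0, list(y0)
    for _ in range(n):
        k1 = f(x, y)
        k2 = f(x + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
        k3 = f(x + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
        k4 = f(x + h, [yi + h * ki for yi, ki in zip(y, k3)])
        y = [yi + h / 6 * (a + 2 * b + 2 * c + d)
             for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]
        x += h
    return y

def shoot(s):
    """Terminal value y(pi/2) of the model problem y'' = -y, y(0)=0, y'(0)=s."""
    f = lambda x, y: [y[1], -y[0]]  # rewrite y'' = -y as a system [y, y']
    return rk4(f, [0.0, s], 0.0, math.pi / 2)[0]

def shooting(target=1.0, s0=0.0, s1=2.0, tol=1e-9):
    """Secant iteration on the unknown initial slope until y(pi/2) = target."""
    f0, f1 = shoot(s0) - target, shoot(s1) - target
    while abs(f1) > tol:
        s0, s1 = s1, s1 - f1 * (s1 - s0) / (f1 - f0)
        f0, f1 = f1, shoot(s1) - target
    return s1
```

For the nanofluid problem the state vector simply grows to hold
velocity, temperature and concentration, with one shooting unknown per
missing boundary condition.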
Abstract: Verification and Validation of Simulated Process
Model is the most important phase of the simulator life cycle.
Evaluation of simulated process models based on Verification and
Validation techniques checks the closeness of each component model
(in a simulated network) with the real system/process with respect to
dynamic behaviour under steady state and transient conditions. The
process of Verification and Validation helps in qualifying the process
simulator for the intended purpose whether it is for providing
comprehensive training or design verification. In general, model
verification is carried out by comparison of simulated component
characteristics with the original requirement to ensure that each step
in the model development process completely incorporates all the
design requirements. Validation testing is performed by comparing
the simulated process parameters to the actual plant process
parameters either in standalone mode or integrated mode.
A full-scope replica operator training simulator for the Prototype
Fast Breeder Reactor (PFBR), named KALBR-SIM (Kalpakkam Breeder
Reactor Simulator), has been developed at IGCAR, Kalpakkam, India,
wherein the main participants are engineers/experts belonging to the
modeling, process design, and instrumentation and control design
teams. This paper discusses the Verification and Validation process
in general, the evaluation procedure adopted for the PFBR operator
training simulator, the methodology followed for verifying the
models, and the reference documents and standards used. It details
the importance of internal validation by design experts, subsequent
validation by an external agency consisting of experts from various
fields, model improvement by tuning based on the experts' comments,
final qualification of the simulator for the intended purpose, and
the difficulties faced while coordinating the various activities.
Abstract: This work presents the synthesis of α,ω-dithienyl-
terminated poly(ethylene glycol) (PEGTh) capable of further chain
extension by either chemical or electrochemical polymerization.
PEGTh was characterized by FTIR and 1H-NMR. Copolymerization of
PEGTh and pyrrole (Py) was then performed by chemical oxidative
polymerization using a ceric(IV) salt as the oxidant (PPy-PEGTh).
PEG without end-group modification was used directly to prepare
copolymers with Py by the Ce(IV) salt (PPy-PEG). Block copolymers
with pyrrole to PEGTh (PEG) mole ratios of 50:1 and 10:1 were
synthesized. The electrical conductivities of the copolymers
PPy-PEGTh and PPy-PEG were determined by the four-point probe
technique. The influence of the synthetic route and the content of
the insulating segment on the conductivity and yield of the
copolymers was investigated.
Abstract: The emerging Cognitive Radio is a combination of two
technologies: radio dynamics and software technology. It involves
wireless systems with efficient coding and design, made artificially
intelligent so that they can take decisions according to the
surrounding environment and adapt themselves accordingly, so as to
deliver the best QoS. This is a breakthrough from fixed hardware and
fixed utilization of the spectrum. This software-defined approach to
research centers on a user-defined and application-driven model, and
various software methods are used for the optimization of wireless
communication. This paper focuses on a spectrum allocation technique
using a genetic algorithm (GA) to evolve radios represented by
chromosomes. A chromosome's genes represent the adjustable parameters
of a given radio; by using the GA, evolving over generations, an
optimized set of parameters is produced according to the requirements
of the user and the availability of the spectrum. In our prototype,
the chromosome consists of six different parameters, and the best set
of parameters is evolved according to the application's needs and the
availability of spectrum holes, thus maintaining the best QoS for the
user while simultaneously preserving licensed users' rights. MATLAB
is used to analyze the performance of the prototype.
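The chromosome encoding described above can be sketched as follows;
the six genes, their value sets and the fitness weights are
hypothetical stand-ins, since the paper's actual radio parameters are
not listed here:

```python
import random

# Hypothetical gene value sets: one gene per adjustable radio parameter.
GENE_CHOICES = [
    [0, 5, 10, 15, 20],   # tx power (dBm)
    [1, 2, 4, 6],         # bits per symbol (BPSK .. 64-QAM)
    [0.25, 0.5, 0.75],    # coding rate
    [5, 10, 20],          # bandwidth (MHz)
    [0, 1, 2, 3],         # spectrum-hole index
    [64, 128, 256],       # frame length
]

def fitness(chrom):
    """Toy QoS score: reward throughput, penalize tx power (interference)."""
    power, bps, rate, bw, hole, frame = chrom
    return bps * rate * bw - 0.5 * power

def evolve(pop_size=30, generations=40, pm=0.1, seed=1):
    """Elitist GA: keep the best half, refill with crossover + mutation."""
    rng = random.Random(seed)
    rand_chrom = lambda: [rng.choice(c) for c in GENE_CHOICES]
    pop = [rand_chrom() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, len(GENE_CHOICES))
            child = a[:cut] + b[cut:]            # one-point crossover
            for i in range(len(child)):          # per-gene mutation
                if rng.random() < pm:
                    child[i] = rng.choice(GENE_CHOICES[i])
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)
```

In a real cognitive radio the fitness would weigh the user's QoS
requirements against the measured availability of spectrum holes and
licensed-user constraints.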
Abstract: Managing and improving efficiency in the current
highly competitive global automotive industry demands that those
companies adopt leaner and more flexible systems. During the past
20 years the domestic automotive industry in North America has been
focusing on establishing new management strategies in order to meet
market demands. The lean management process, also known as the
Toyota Production System (TPS) or lean manufacturing, encompasses
tools and techniques that were established in order to provide the
best-quality product with the fastest lead time at the lowest cost.
The following paper presents a study that focused on improving labor
efficiency at one of the Big Three (Ford, GM, Chrysler LLC) domestic
automotive facilities in North America. The objective of the study
was to utilize several lean management tools to optimize the
efficiency and utilization levels at the "Pre-Marriage" chassis area
in a truck manufacturing and assembly facility. Utilizing three
different lean tools (i.e., standardization of work, the 7 wastes,
and 5S), this research was able to improve efficiency by 51%, improve
utilization by 246%, and reduce operations by 14%. The return on
investment calculated based on the improvements made was 284%.
Abstract: Fast changing knowledge systems on the Internet can
be accessed more efficiently with the help of automatic document
summarization and updating techniques. The aim of multi-document
update summary generation is to construct a summary unfolding the
mainstream of data from a collection of documents based on the
hypothesis that the user has already read a set of previous documents.
In order to extract more semantic information from the documents,
deeper linguistic or semantic analysis of the source documents is
used instead of relying only on document word frequencies to select
important concepts. In order to produce a responsive summary,
meaning-oriented structural analysis is needed. To address this
issue, the proposed system presents a document summarization approach
based on sentence annotation with aspects, prepositions and named
entities. A semantic element extraction strategy is used to select
important concepts from the documents, which are then used to
generate an enhanced semantic summary.
Abstract: An exact solution for the unsteady flow of an
elastico-viscous fluid through a porous medium in a tube of spherical
cross section under the influence of a constant pressure gradient is
obtained in this paper. Initially, the flow is generated by a
constant pressure gradient. After the steady state is attained, the
pressure gradient is suddenly withdrawn, and the resulting fluid
motion in the tube, taking into account the porosity factor of the
bounding surface, is investigated. The problem is solved in two
stages: the first stage is the steady motion in the tube under the
influence of a constant pressure gradient; the second stage concerns
the unsteady motion. The problem is solved by employing the
separation of variables technique. The results are expressed in terms
of a non-dimensional porosity parameter (K) and an elastico-viscosity
parameter (β), which depends on the non-Newtonian coefficient. The
flow parameters are found to be identical with those of the Newtonian
case as the elastico-viscosity parameter tends to zero and the
porosity tends to infinity. It is seen that the elastico-viscosity
parameter and the porosity parameter of the bounding surface have a
significant effect on the velocity.
Abstract: In the present work, bismuth-lead alloys are prepared
on the basis of molecular-weight percentage in 9:1, 5:5 and 1:9
ratios and grown by the zone-refining technique under a vacuum
atmosphere. EDAX analysis of these samples is performed and the
results are reported. The microhardness test has been used as an
alternative test for measuring a material's tensile properties. The
effect of temperature and load on the hardness of the grown alloys
has been studied. Further, comparative studies of the work hardening
coefficients are reported.
Abstract: Construction cost estimation is one of the most
important aspects of construction project design. For generations, the
process of cost estimating has been manual, time-consuming and
error-prone. This has partly led to most cost estimates being unclear
and riddled with inaccuracies that at times lead to over- or
under-estimation of construction cost. The development of standard
sets of measurement rules that are understandable by all those
involved in a construction project has not totally solved these
challenges. Emerging Building Information Modelling (BIM)
technologies can exploit standard measurement methods to automate the
cost estimation process and improve accuracy. This requires standard
measurement methods to be structured in an ontological and
machine-readable format so that BIM software packages can easily read
them. Most standard measurement methods are still text-based in
textbooks and require manual editing into tables or spreadsheets
during cost estimation. The
aim of this study is to explore the development of an ontology based
on New Rules of Measurement (NRM) commonly used in the UK for
cost estimation. The methodology adopted is Methontology, one of
the most widely used ontology engineering methodologies. The
challenges in this exploratory study are also reported and
recommendations for future studies proposed.
Abstract: Clustering involves the partitioning of n objects into k
clusters. Many clustering algorithms use hard-partitioning techniques
where each object is assigned to exactly one cluster. In this paper
we propose an overlapping algorithm, MCOKE, which allows objects to
belong to one or more clusters. The algorithm differs from fuzzy
clustering techniques because objects that overlap are assigned a
membership value of 1 (one) as opposed to a fuzzy membership degree.
It also differs from other overlapping algorithms that require a
similarity threshold to be defined a priori, which can be difficult
for novice users to determine.
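A minimal sketch of one plausible reading of such a scheme, assuming
hard k-means runs first and the overlap threshold is derived from the
data as the largest point-to-assigned-centroid distance (an
illustration of the idea, not the published MCOKE algorithm):

```python
import math

def kmeans(points, k, iters=50):
    """Plain hard k-means; the first k points seed the centroids
    (deterministic, for the sketch)."""
    centers = list(points[:k])
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: math.dist(p, centers[j]))
            groups[i].append(p)
        centers = [
            tuple(sum(v) / len(g) for v in zip(*g)) if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers

def overlap_cluster(points, k):
    """Overlap step (sketch): after hard k-means, 'maxdist', the largest
    distance from any object to its assigned centroid, serves as an
    automatically derived threshold; an object joins every cluster whose
    centroid lies within maxdist, with membership value 1 (not fuzzy)."""
    centers = kmeans(points, k)
    primary = [min(range(k), key=lambda j: math.dist(p, centers[j]))
               for p in points]
    maxdist = max(math.dist(p, centers[c]) for p, c in zip(points, primary))
    return centers, [
        {j for j in range(k) if math.dist(p, centers[j]) <= maxdist}
        for p in points
    ]
```

Because the threshold comes from the clustering itself, no similarity
value has to be supplied a priori by the user.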
Abstract: This paper presents the details of a numerical study of
buckling and post buckling behaviour of laminated carbon fiber
reinforced plastic (CFRP) thin-walled cylindrical shell under axial
compression using asymmetric meshing technique (AMT) by
ABAQUS. AMT is considered to be a new perturbation method to
introduce disturbance without changing geometry, boundary
conditions or loading conditions. Asymmetric meshing affects both
predicted buckling load and buckling mode shapes. A cylindrical shell
having lay-up orientation [0°/+45°/-45°/0°], with a radius-to-thickness
ratio (R/t) of 265 and a length-to-radius ratio (L/R) of 1.5, is
analysed numerically. A series of numerical simulations
(experiments) are carried out with symmetric and asymmetric
meshing to study the effect of asymmetric meshing on predicted
buckling behaviour. The asymmetric meshing technique is employed in
the axial direction and the circumferential direction separately,
using two different methods: first by changing the shell element size
and varying the total number of elements, and second by varying the
shell element size while keeping the total number of elements
constant. The results of linear analysis (eigenvalue analysis) and
non-linear analysis (Riks analysis) using symmetric meshing agree
well with analytical results. The results of the numerical analysis
are presented in the form of a non-dimensional load factor, which is
the ratio of the buckling load using the asymmetric meshing technique
to the buckling load using the symmetric meshing technique. Using
AMT, the load factor varies by about 2% for linear eigenvalue
analysis and by about 2% for non-linear Riks analysis. The behaviour
of the load versus end-shortening curve in pre-buckling is the same
for both symmetric and asymmetric meshing, but with asymmetric
meshing the curve behaviour in post-buckling becomes extraordinarily
complex. The major conclusions are: different methods of AMT have a
small influence on the predicted buckling load and a significant
influence on the load-displacement curve behaviour in post-buckling;
and AMT in the axial direction and AMT in the circumferential
direction have different influences on the buckling load and the
load-displacement curve in post-buckling.
Abstract: Image search engines rely on the surrounding textual
keywords for the retrieval of images. It is tedious for search
engines like Google and Bing to interpret the user's search intention
and to provide the desired results. Recent research also indicates
that the Google image search engine does not work well on all images.
Consequently, this has led to the emergence of efficient image
retrieval techniques that interpret the user's search intention and
show the desired results. To accomplish this task, an efficient image
re-ranking framework is required, and a new image re-ranking
framework is evaluated in this paper. The implemented framework
provides the best image retrieval from the image dataset by
re-ranking the retrieved images based on the user's desired images.
It is evaluated in two sections, one offline and one online. In the
offline section, the re-ranking framework learns different semantic
spaces (reference classes) for diverse user query keywords. Semantic
signatures are generated by combining the textual and visual features
of the images. In the online section, images are re-ranked by
comparing the semantic signatures obtained from the reference classes
with the user-specified image query keywords. This re-ranking
methodology increases image retrieval efficiency, and the results are
more relevant to the user.
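The online comparison step reduces to ranking candidates by the
similarity of their semantic signatures to the query image's
signature; a minimal sketch, assuming signatures are plain feature
vectors compared by cosine similarity:

```python
import math

def cosine(u, v):
    """Cosine similarity between two semantic-signature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def rerank(query_sig, candidates):
    """Order candidate images by similarity of their semantic signatures
    to the signature of the user's query image (most similar first)."""
    return sorted(candidates,
                  key=lambda c: cosine(query_sig, c["sig"]),
                  reverse=True)
```

In the framework described above, the signatures themselves would be
produced offline by the per-keyword reference classes combining
textual and visual features.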
Abstract: High Peak to Average Power Ratio (PAPR) of the
transmitted signal is a serious problem in multicarrier systems (MC),
such as Orthogonal Frequency Division Multiplexing (OFDM), or in
Multi-Carrier Code Division Multiple Access (MC-CDMA) systems,
due to the large number of subcarriers. This effect can be reduced
with PAPR reduction techniques. Spreading sequences, in the presence
of the Saleh and Rapp models of a high-power amplifier (HPA), have a
significant influence on the behavior of the system. In this paper we
investigate the bit-error-rate (BER) performance of MC-CDMA systems.
The simulations show that the MC-CDMA system with an iterative
algorithm can provide significantly better results than the
conventional MC-CDMA system. The results of our analyses are
verified via simulation.
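PAPR itself can be stated concretely; the sketch below computes it for
a toy multicarrier symbol built by a naive IDFT (a generic OFDM-style
illustration, not the paper's MC-CDMA simulation chain):

```python
import cmath
import math

def ofdm_symbol(symbols):
    """Time-domain multicarrier signal: naive IDFT of the subcarrier symbols."""
    n = len(symbols)
    return [sum(symbols[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)) / n
            for t in range(n)]

def papr_db(samples):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    powers = [abs(s) ** 2 for s in samples]
    return 10.0 * math.log10(max(powers) / (sum(powers) / len(powers)))
```

When all N subcarriers add in phase the peak grows as N while the
average power does not, giving the worst-case PAPR of 10*log10(N) dB
that reduction techniques try to avoid.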
Abstract: Background: With the perceived pain and poor
function experienced following knee arthroplasty, patients usually
feel unsatisfied. Yet a controversy still persists over the
appropriate operative technique that does not greatly affect
proprioception.
Purpose: This study compared the effects of Cruciate Retaining
(CR) and Posterior Stabilized (PS) total knee arthroplasty (TKA) on
dynamic balance, pain and functional performance following
rehabilitation.
Methods: Thirty patients with CRTKA (group I), thirty with
PSTKA (group II) and fifteen patients indicated for arthroplasty but
not yet operated on (group III) participated in the study. The mean
age was 54.53±3.44, 55.13±3.48 and 55.33±2.32 years and the BMI
35.7±3.03, 35.7±1.99 and 35.73±1.03 kg/m2 for groups I, II and III
respectively. The Berg Balance Scale (BBS), the WOMAC pain subscale,
and the Timed Up-and-Go (TUG) and Stair-Climbing (SC) tests were
used for assessment. Assessments were conducted four weeks pre- and
post-operatively and three, six and twelve months post-operatively,
with the control group being assessed at the same time intervals. The
post-operative rehabilitation involved hospitalization (1st week),
home-based (2nd-4th weeks), and outpatient clinic (5th-12th weeks)
programs, with follow-up of all groups for twelve months.
Results: The mixed-design MANOVA revealed that group I had
significantly lower pain scores and SC times compared with group II
at three, six and twelve months post-operatively. Moreover, the BBS
scores increased significantly, and the pain scores and TUG and SC
times decreased significantly, at six months post-operatively
compared with four weeks pre- and post-operatively and three months
post-operatively in groups I and II, with the opposite being true
four weeks post-operatively. However, there were no significant
differences in BBS scores, pain scores, or TUG and SC times between
six and twelve months post-operatively in groups I and II.
Interpretation/Conclusion: CRTKA is preferable to PSTKA,
possibly due to the preserved human proprioceptors in the un-excised
PCL.