Abstract: The Standard Penetration Test (SPT) is the most
common in situ test for soil investigations. On the other hand, the
Cone Penetration Test (CPT) is considered one of the best
investigation tools. Because it yields fast and accurate results, it
complements the SPT in many applications such as field
exploration, determination of design parameters, and quality control assessments.
Many soil index and engineering properties have been correlated
with both the SPT and the CPT. Various foundation design methods have
been developed based on the outcomes of these tests. It is therefore vital to
correlate these tests to each other so that either one of the tests can be
used in the absence of the other, especially for preliminary evaluation
and design purposes.
The primary purpose of this study was to investigate the
relationships between the SPT and CPT for different types of sandy
soils in Florida. Data for this research were collected from a number of
projects sponsored by the Florida Department of Transportation
(FDOT); six sites served as the subject of the SPT-CPT correlations. The
correlations were established between the cone resistance (qc), sleeve
friction (fs) and the uncorrected SPT blow counts (N) for various
soils.
A positive linear relationship was found between qc, fs and N for
various sandy soils. In general, qc versus N showed higher
correlation coefficients than fs versus N. qc/N ratios were developed
for different soil types and compared with literature values; the
ratios found in this research were higher than those reported in the literature.
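As a minimal sketch of how such a ratio can be obtained (the data values below are hypothetical, not measurements from the FDOT sites), a zero-intercept least-squares fit of qc against N gives the qc/N ratio directly:

```python
# Least-squares estimate of the qc/N ratio for a zero-intercept
# linear model qc = a * N (hypothetical illustrative data).
def qc_over_n_ratio(n_values, qc_values):
    """Return the slope a minimizing sum((qc - a*N)^2)."""
    num = sum(n * q for n, q in zip(n_values, qc_values))
    den = sum(n * n for n in n_values)
    return num / den

# Hypothetical SPT blow counts and cone resistances (MPa)
n = [5, 10, 15, 20, 30]
qc = [2.1, 4.0, 6.2, 7.9, 12.3]
print(round(qc_over_n_ratio(n, qc), 3))  # qc/N ratio in MPa per blow
```

A per-soil-type table of such slopes is the kind of summary the study compares against literature values.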
Abstract: Collaborative technologies or software known as
groupware are key enabling tools for communication, collaboration
and co-ordination among individuals, work groups and businesses.
Available reviews of the groupware literature are few,
and most are neither systematic nor recent.
This paper is an effort to fill this gap and to provide researchers
with a more up-to-date and wider systematic literature review. For this
purpose, 1087 scholarly articles, published from 1990 to 2013, on the
topic of groupware, were collected through a literature search. The study
adopted a systematic lexical-analysis approach to analyze
those articles.
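As an illustration of the kind of lexical analysis involved (a toy corpus, not the 1087 collected articles), term frequencies across documents can be computed as follows:

```python
# Minimal sketch of lexical analysis: tokenize documents and count
# term frequencies across the corpus (illustrative corpus only).
import re
from collections import Counter

def term_frequencies(documents):
    counts = Counter()
    for doc in documents:
        counts.update(re.findall(r"[a-z]+", doc.lower()))
    return counts

docs = [
    "Groupware supports collaboration and coordination",
    "Collaboration tools enable group communication",
]
freq = term_frequencies(docs)
print(freq["collaboration"])  # 2
```

Ranking the resulting counts is a typical first step before grouping terms into research themes.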
Abstract: Verification and Validation of a simulated process
model is the most important phase of the simulator life cycle.
Evaluation of simulated process models based on Verification and
Validation techniques checks the closeness of each component model
(in a simulated network) with the real system/process with respect to
dynamic behaviour under steady state and transient conditions. The
process of Verification and Validation helps qualify the process
simulator for its intended purpose, whether that is
comprehensive training or design verification. In general, model
verification is carried out by comparison of simulated component
characteristics with the original requirement to ensure that each step
in the model development process completely incorporates all the
design requirements. Validation testing is performed by comparing
the simulated process parameters to the actual plant process
parameters either in standalone mode or integrated mode.
A full-scope replica operator training simulator for the Prototype
Fast Breeder Reactor (PFBR), named KALBR-SIM (Kalpakkam Breeder
Reactor Simulator), has been developed at IGCAR, Kalpakkam,
India; the main participants are engineers and experts from the
modeling, process design, and instrumentation & control design
teams. This paper discusses
the Verification and Validation process in general, the evaluation
procedure adopted for PFBR operator training Simulator, the
methodology followed for verifying the models, the reference
documents and standards used. It details the importance of
internal validation by design experts, subsequent validation by an
external agency consisting of experts from various fields, model
improvement by tuning based on experts' comments, final
qualification of the simulator for the intended purpose, and the
difficulties faced while coordinating the various activities.
Abstract: This work presents the synthesis of α,ω-dithienyl-
terminated poly(ethylene glycol) (PEGTh) capable of further chain
extension by either chemical or electrochemical polymerization.
PEGTh was characterized by FTIR and 1H-NMR. Further
copolymerization of PEGTh and pyrrole (Py) was performed by
chemical oxidative polymerization using a cerium(IV) salt as the oxidant
(PPy-PEGTh). PEG without end group modification was used
directly to prepare copolymers with Py by Ce (IV) salt (PPy-PEG).
Block copolymers with pyrrole-to-PEGTh (PEG) mole ratios of 50:1
and 10:1 were synthesized. The electrical conductivities of the
copolymers PPy-PEGTh and PPy-PEG were determined by four
point probe technique. The influence of the synthetic route and of the
content of the insulating segment on the conductivity and yield of the
copolymers was investigated.
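For reference, the standard collinear four-point probe relations for a thin sample (thickness much smaller than the probe spacing) can be sketched as follows; the voltage, current and thickness values are hypothetical, not measurements from this study:

```python
# Standard four-point probe relations for a thin film:
#   sheet resistance  R_s = (pi / ln 2) * (V / I)     [ohm/sq]
#   conductivity      sigma = 1 / (R_s * t)           [S/cm]
# Illustrative values only.
import math

def conductivity_four_point(voltage_v, current_a, thickness_cm):
    r_sheet = (math.pi / math.log(2)) * (voltage_v / current_a)
    return 1.0 / (r_sheet * thickness_cm)

# Hypothetical reading: 10 mV at 1 mA on a 0.01 cm thick pellet
print(round(conductivity_four_point(0.010, 0.001, 0.01), 3))
```

This thin-sample correction factor (π/ln 2 ≈ 4.53) applies when the probe spacing is large compared with the sample thickness.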
Abstract: Conventional educational practices do not offer all
the skills teachers require to succeed in today’s
workplace. Owing to poor professional training, a large gap exists between
the curriculum plan and teachers' practices in the classroom. As
such, raising the quality of teaching through ICT-enabled training and
professional development of teachers should be an urgent priority.
‘Mobile Learning’, in that vein, is an increasingly growing field of
educational research and practice across schools and work places. In
this paper, we propose a novel mobile learning system that allows
users to learn cooperatively through an intelligent mobile learning
environment, anytime and anywhere. The system will reduce training costs
and increase consistency, efficiency, and data reliability. To establish
that our system will display neither functional nor performance
failure, the evaluation strategy is based on formal observation of
users interacting with the system, followed by questionnaires and
structured interviews.
Abstract: The negative pressure phenomenon appears in many
thermodynamic, geophysical and biophysical processes in nature
and in technological systems. Over more than 100 years of laboratory
research, beginning with F. M. Donny’s experiments, very large values of
negative pressure have been achieved. Yet the phenomenon has found
no practical application, remaining little more than a laboratory
curiosity because of the special demands it places on the purity and
homogeneity of the liquids in which it appears. The possibility of
creating a direct wave of negative pressure in real, heterogeneous
liquid systems was confirmed experimentally under certain kinetic
and hydraulic conditions.
Negative pressure can be regarded as a source of both useful
and destructive energy. A new approach to generating
negative pressure waves in impure, unclean fluids has enabled the
creation of fundamentally new energy-saving technologies and
installations that increase the effectiveness and efficiency of various
production processes. It was also shown that negative pressure is one
of the main factors causing serious problems in some technological and
natural processes. The results emphasize the need to take
into account the role of negative pressure as an energy factor when
evaluating many transient thermohydrodynamic processes in
nature and in production systems.
Abstract: Motion Tracking and Stereo Vision are complicated,
albeit well-understood, problems in computer vision. Existing
software that combines the two approaches to perform stereo motion
tracking typically employs complicated and computationally expensive
procedures. The purpose of this study is to create a simple and
effective solution capable of combining the two approaches. The
study aims to explore a strategy to combine the two techniques
of two-dimensional motion tracking using a Kalman filter and depth
detection of objects using stereo vision. In conventional approaches,
objects in the scene of interest are observed using a single camera.
For stereo motion tracking, however, the scene of interest is
observed using video feeds from two calibrated cameras. From
simultaneous measurements taken by the two cameras, the depth of
the object from the plane containing the cameras is calculated.
The approach attempts to capture the entire three-dimensional spatial
information of each object at the scene and represent it through a
software estimator object. At discrete intervals, the estimator tracks
object motion in the plane parallel to the plane containing the
cameras and updates the object's perpendicular distance from that
plane as its depth. The ability to efficiently track
the motion of objects in three-dimensional space using a simplified
approach could prove to be an indispensable tool in a variety of
surveillance scenarios. The approach may find applications ranging
from high-security surveillance scenes, such as the premises of bank
vaults, prisons or other detention facilities, to low-cost applications
in supermarkets and car parking lots.
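The depth-recovery step described above can be sketched with the standard parallel-camera relation Z = f·B/d, where f is the focal length in pixels, B the baseline between the cameras, and d the disparity between the object's image coordinates; the numbers below are hypothetical:

```python
# Sketch of stereo depth recovery for two calibrated parallel cameras:
# depth Z = f * B / d, with disparity d = x_left - x_right.
def depth_from_disparity(x_left_px, x_right_px, focal_px, baseline_m):
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("object must appear right-shifted in the left image")
    return focal_px * baseline_m / disparity

# Hypothetical: f = 800 px, baseline 0.12 m, disparity 32 px
print(depth_from_disparity(416, 384, 800, 0.12))  # 3.0 metres
```

In the estimator described by the paper, this depth value would update the object's perpendicular-distance state between Kalman filter steps.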
Abstract: A Reverse Logistics (RL) network is a
complex and dynamic network that involves many stakeholders, such
as suppliers, manufacturers, warehouses, retailers and customers; this
complexity is inherent in the process owing to the lack of perfect
knowledge and to conflicting information. Ontologies, on the other hand,
can be considered an approach to overcoming the problem of sharing
knowledge and communicating among the various reverse logistics
partners. In this paper we propose a semantic representation based on a
hybrid architecture for building the ontologies in a bottom-up way; this
method facilitates semantic reconciliation between the
heterogeneous information systems that support reverse logistics
processes and product data.
Abstract: Batteries of electric vehicles (BEVs) are becoming
more attractive with the advancement of new battery technologies
and the promotion of electric vehicles. BEV batteries are recharged on
board the vehicle using either the grid (G2V, Grid to Vehicle) or
renewable energies in a stand-alone application (H2V, Home to
Vehicle). This paper deals with the modeling, sizing and control of a
photovoltaic stand-alone application that can charge the BEV at
home. The modeling approach and developed mathematical models
describing the system components are detailed. Simulation and
experimental results are presented and discussed.
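A rough sketch of the kind of sizing calculation involved; the battery capacity, panel rating, peak sun hours and system efficiency below are hypothetical assumptions, not the paper's design values:

```python
# Back-of-the-envelope PV array sizing for a stand-alone BEV charger:
# number of panels needed to replenish the pack in one day of sun.
import math

def panels_needed(battery_kwh, panel_kw, sun_hours, system_eff):
    daily_energy = battery_kwh / system_eff   # energy the PV must supply
    per_panel = panel_kw * sun_hours          # daily yield per panel
    return math.ceil(daily_energy / per_panel)

# 24 kWh BEV pack, 0.3 kW panels, 5 peak sun hours, 80% overall efficiency
print(panels_needed(24.0, 0.3, 5.0, 0.8))  # 20
```

A full sizing study such as the paper's would also model the battery charge controller and the converter dynamics, not just the daily energy balance.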
Abstract: This paper focuses on the assessment of the air
pollution and morbidity relationship in Tunisia. Air pollution is
measured by ozone air concentration and the morbidity is measured
by the number of respiratory-related restricted activity days during
the 2-week period prior to the interview. Socioeconomic data are also
collected in order to adjust for any confounding covariates. Our
sample is composed of 407 Tunisian respondents; 44.7% are women,
the average age is 35.2, nearly 69% live in a house built after
1980, and 27.8% reported at least one day of respiratory-related
restricted activity. The model consists of regressing the
number of respiratory-related restricted activity days on the air
quality measure and the socioeconomic covariates. In order to correct
for zero-inflation and heterogeneity, we estimate several models
(Poisson, negative binomial, zero inflated Poisson, Poisson hurdle,
negative binomial hurdle and finite mixture Poisson models).
Bootstrapping and post-stratification techniques are used in order to
correct for any sample bias. According to the Akaike information
criteria, the hurdle negative binomial model has the greatest goodness
of fit. The main result indicates that, after adjusting for
socioeconomic data, the ozone concentration increases the probability
of positive number of restricted activity days.
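As an illustration of the model-selection criterion used (AIC = 2k - 2 log L, lower is better), the sketch below fits an intercept-only Poisson model to hypothetical count data; the study's actual models include covariates, zero-inflation and hurdle components:

```python
# AIC for an intercept-only Poisson model: the MLE of the rate is the
# sample mean (hypothetical restricted-activity-day counts).
import math

def poisson_aic(counts):
    lam = sum(counts) / len(counts)          # MLE of the Poisson rate
    loglik = sum(y * math.log(lam) - lam - math.lgamma(y + 1)
                 for y in counts)
    k = 1                                    # one fitted parameter
    return 2 * k - 2 * loglik

days = [0, 0, 0, 1, 0, 2, 0, 0, 3, 0]
print(round(poisson_aic(days), 2))  # 25.1
```

Comparing such AIC values across candidate models (Poisson, negative binomial, hurdle variants) is how the paper identifies the hurdle negative binomial as the best fit.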
Abstract: Wireless mesh networking is rapidly gaining in
popularity with a variety of users: from municipalities to enterprises,
from telecom service providers to public safety and military
organizations. This increasing popularity rests on two basic facts:
ease of deployment and an increase in network capacity, expressed in
bandwidth per unit area, since WMNs do not rely on any fixed
infrastructure. Many efforts have been made to maximize the
throughput of the network in a multi-channel, multi-radio wireless
mesh network. Current approaches are based purely on either static or
dynamic channel allocation. In this paper, we use a
hybrid multi-channel, multi-radio wireless mesh networking
architecture, in which both static and dynamic interfaces are built
into the nodes. The Dynamic Adaptive Channel Allocation (DACA)
protocol considers optimization of both throughput and delay in the
channel allocation. Channel assignment is made co-dependent
with the routing problem in the wireless mesh network and
is based on the traffic flow on every link. Temporal and
spatial variations require recomputing the channel assignment
whenever the traffic pattern in the mesh network changes, at which
point the channel assignment algorithm reassigns channels across the
network. This paper proposes a path computation that captures the
available path bandwidth, together with an efficient routing protocol
based on the new path metric, which supports both static and dynamic
links. The consistency property guarantees that each node makes an
appropriate packet-forwarding decision and balances the control
overhead of the network, so that a data packet will traverse the right path.
Abstract: The cost of governance in Nigeria has become a challenge
to development and a concern to practitioners and scholars alike in the
field of business and social science research. In the 2010 national
budget of NGN4.6 trillion or USD28.75 billion, for instance, only a
paltry sum of NGN1.8 trillion or USD11.15 billion was earmarked for
capital expenditure. Similarly, in 2013, out of a total national budget
of NGN4.92 trillion or USD30.75 billion, only the sum of
NGN1.50 trillion or USD9.38 billion was voted for capital expenditure.
Therefore, based on data sourced from the Nigerian Office of
Statistics, the Central Bank of Nigeria Statistical Bulletin, and
the United Nations Development Programme, this study examined
the causes of the high cost of governance in Nigeria. It found that the
high cost of governance in the country is in the interest of the ruling
class, arising from their unethical behaviour – corrupt practices and
the poor management of public resources. As a result, the study
recommends intensifying the war against corruption and the
mismanagement of public resources by government officials as a
possible way to reduce the high cost of governance in Nigeria.
This could be achieved by strengthening the constitutional powers of
the various anti-corruption agencies in the area of arrest, investigation
and prosecution of offenders without the interference of the executive
arm of government either at the local, state or federal level.
Abstract: Water spray cooling is a technique typically used in
heat treatment and other metallurgical processes where controlled
temperature regimes are required. Water spray cooling is used in
static (without movement) or dynamic (with movement of the steel
plate) regimes. The static regime is notable for the fixed position of
the hot steel plate and fixed spray nozzle. This regime is typical for
quenching systems focused on heat treatment of the steel plate. The
second application of spray cooling is the dynamic regime. The
dynamic regime is notable for its static section cooling system and
moving steel plate. This regime is used in rolling and finishing mills.
The fixed position of cooling sections with nozzles and the
movement of the steel plate produce nonhomogeneous water
distribution on the steel plate. The length of cooling sections and
placement of water nozzles in combination with the nonhomogeneity
of water distribution lead to discontinued or interrupted cooling
conditions. The impact of static and dynamic regimes on cooling
intensity and the heat transfer coefficient during the cooling process
of steel plates is an important issue.
Heat treatment of steel is accompanied by oxide scale growth. The
oxide scale layers can significantly modify the cooling properties and
intensity during the cooling. The combination of static and dynamic
(section) regimes with the variable thickness of the oxide scale layer
on the steel surface impacts the final cooling intensity. The study of
the influence of the oxide scale layers with different cooling regimes
was carried out using experimental measurements and numerical
analysis. The experimental measurements compared both types of
cooling regimes and the cooling of scale-free surfaces and oxidized
surfaces. A numerical analysis was prepared to simulate the cooling
process with different conditions of the section and samples with
different oxide scale layers.
Abstract: The use of eXtensible Markup Language (XML) in
web, business and scientific databases has led to the development of
methods, techniques and systems to manage and analyze XML data.
Semi-structured documents suffer from heterogeneity and high
dimensionality. XML structure and content mining represent a
convergence of research in semi-structured data and text mining. As
the information available on the internet grows drastically, extracting
knowledge from XML documents becomes a harder task. Indeed,
documents are often so large that the data set returned in answer to a
query may itself be too large to convey the required information. To
improve the query answering, a Semantic Tree Based Association
Rule (STAR) mining method is proposed. This method provides
intentional information by considering the structure, content and the
semantics of the content. The method is applied on Reuter’s dataset
and the results show that the proposed method performs well.
Abstract: This paper investigates the viability of using carbon
fiber reinforced epoxy composites modified with carbon nanotubes to
strengthen reinforced concrete (RC) columns. Six RC columns
were designed and constructed according to ASCE standards. The
columns were wrapped using carbon fiber sheets impregnated with
either neat epoxy or CNT-modified epoxy. These columns were then
tested under concentric axial loading. Test results show that,
compared to the unwrapped specimens, wrapping concrete columns
with carbon fiber sheet embedded in CNT-modified epoxy increased
the axial load resistance, maximum displacement,
and toughness by 24%, 109% and 232%, respectively. These
results reveal that adding CNTs to the epoxy resin enhanced the
confinement effect; specifically, it increased the axial load resistance,
maximum displacement, and toughness by 11%, 6%, and
19%, respectively, compared with columns strengthened with carbon
fiber sheet embedded in neat epoxy.
Abstract: The objective of meta-analysis is to combine results
from several independent studies in order to create generalizations
and provide an evidence base for decision making. Recent studies
show, however, that the magnitude of effect size estimates reported in
many areas of research has changed significantly over time, and this
can impair the results and conclusions of meta-analysis. A number of
sequential methods have been proposed for monitoring the effect
size estimates in meta-analysis. However, they are based on statistical
theory applicable only to the fixed-effect model (FEM) of meta-analysis.
For the random-effects model (REM), the analysis incorporates the
heterogeneity variance τ², whose estimation creates complications.
In this paper we study the use of a truncated CUSUM-type test with
asymptotically valid critical values for sequential monitoring in REM.
Simulation results show that the test does not control the Type I error
well and is not recommended. Further work is required to derive an
appropriate test in this important area of application.
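A minimal sketch of a one-sided CUSUM-type monitoring scheme with truncation at zero; the drift and threshold values here are illustrative, not the asymptotically valid critical values studied in the paper:

```python
# One-sided truncated CUSUM over a stream of standardized effect-size
# estimates: the cumulative sum is clipped at zero from below, and a
# shift is signalled when it exceeds the threshold h.
def cusum_detect(z_scores, drift=0.5, threshold=4.0):
    s = 0.0
    for i, z in enumerate(z_scores):
        s = max(0.0, s + z - drift)   # truncate at zero
        if s > threshold:
            return i                   # index where the shift is signalled
    return None                        # no shift detected

stream = [0.1, -0.2, 0.3, 2.5, 2.8, 2.6, 2.9, 0.2]
print(cusum_detect(stream))  # signals at index 4
```

Choosing the threshold so that the Type I error is controlled under the REM is exactly the difficulty the paper investigates.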
Abstract: In EFL programs, rating scales used in writing
assessment are often constructed by intuition. Intuition-based scales
tend to provide inaccurate and divisive ratings of learners’ writing
performance. Hence, following an empirical approach, this study
attempted to develop a rating scale for elementary-level writing at an
EFL program in Saudi Arabia. Towards this goal, 98 students’ essays
were scored and then coded using a comprehensive taxonomy of
writing constructs and their measures. Automatic linear modeling
was run to find out which measures would best predict essay scores.
A nonparametric ANOVA, the Kruskal-Wallis test, was then used to
determine which measures could best differentiate among scoring
levels. Findings indicated that there were certain measures that could
serve as either good predictors of essay scores or differentiators
among scoring levels, or both. The main conclusion was that a rating
scale can be empirically developed using predictive and
discriminative statistical tests.
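As an illustration, the Kruskal-Wallis H statistic used to compare scoring levels can be computed as follows (hypothetical measure values with no ties, not the study's data):

```python
# Kruskal-Wallis H statistic for k independent groups, assuming no
# tied observations: rank the pooled data, then
#   H = 12/(N(N+1)) * sum(R_g^2 / n_g) - 3(N+1)
def kruskal_wallis_h(*groups):
    pooled = sorted(v for g in groups for v in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}   # ranks (no ties)
    n = len(pooled)
    h = sum(sum(rank[v] for v in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n * (n + 1)) * h - 3 * (n + 1)

low = [3, 5, 8]       # hypothetical measure values per scoring level
mid = [10, 12, 14]
high = [16, 18, 20]
print(round(kruskal_wallis_h(low, mid, high), 2))  # 7.2
```

A large H (compared against a chi-squared reference with k-1 degrees of freedom) indicates that the measure differentiates among scoring levels.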
Abstract: The fuzzy composition of objects depicted in images
acquired through MR imaging or the use of bio-scanners has often
been a point of controversy for field experts attempting to effectively
delineate between the visualized objects. Modern approaches in
medical image segmentation tend to consider fuzziness as a
characteristic and inherent feature of the depicted object, instead of
an undesirable trait. In this paper, a novel technique for efficient
image retrieval in the context of images in which segmented objects
are either crisp or fuzzily bounded is presented. Moreover, the
proposed method is applied in the case of multiple, even conflicting,
segmentations from field experts. Experimental results demonstrate
the efficiency of the suggested method in retrieving similar objects
from the aforementioned categories while taking into account the
fuzzy nature of the depicted data.
Abstract: Images are an important source of information used as
evidence during any investigation process. Their clarity and accuracy
are essential and of the utmost importance for any investigation.
Images are vulnerable to losing blocks and to having noise added to
them, either after alteration or when the image was first captured;
therefore, a high-performance image processing system and its
implementation are very important from a forensic point of view. This
paper focuses on improving the quality of the forensic images.
For various reasons, the packets that store image data can be
corrupted, damaged or even lost because of noise. For example,
sending an image through a wireless channel can cause loss of
bits. Such errors generally degrade the visual display quality of
forensic images.
Two image problems are covered here: noise and block loss.
Information transmitted through any means of
communication may be altered from its original state or may even
lose important data due to channel noise. Therefore, a
system is introduced to improve the quality and clarity of forensic
images.
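As one common remedy for the impulse noise described above (a sketch only; the abstract does not specify the paper's actual pipeline), a 3x3 median filter can be applied to a grayscale image:

```python
# 3x3 median filter for a grayscale image stored as a 2-D list; a
# standard remedy for impulse (salt-and-pepper) noise. Border pixels
# are left unchanged in this simple sketch.
def median_filter3x3(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]       # median of the 9 values
    return out

noisy = [
    [10, 10, 10, 10],
    [10, 255, 10, 10],   # single impulse-noise pixel
    [10, 10, 10, 10],
    [10, 10, 10, 10],
]
print(median_filter3x3(noisy)[1][1])  # 10
```

The median replaces the outlier value while preserving flat regions, which is why it is preferred over mean filtering for this type of noise.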
Abstract: A tumor is an uncontrolled growth of tissue in any part
of the body. Tumors are of different types and have different
characteristics and treatments. A brain tumor is inherently serious and
life-threatening because of its character in the limited space of the
intracranial cavity (the space formed inside the skull). Locating the
tumor within an MR (magnetic resonance) image of the brain is an
integral part of the treatment of a brain tumor. This segmentation task requires
classification of each voxel as either tumor or non-tumor, based on
the description of the voxel under consideration. Many studies in
the medical field use Markov Random Fields (MRF) for the
segmentation of MR images. Although the segmentation quality is
good, computing the probabilities and estimating the parameters are
difficult. To overcome these issues, a Conditional
Random Field (CRF) is used in this paper for segmentation, along
with a modified artificial bee colony optimization and a modified
fuzzy possibilistic c-means (MFPCM) algorithm. This work mainly
aims to reduce the computational complexity found in
existing methods and to achieve higher accuracy. The
efficiency of this work is evaluated using parameters such as
region non-uniformity, correlation and computation time. The
experimental results are compared with the existing methods such as
MRF with improved Genetic Algorithm (GA) and MRF-Artificial
Bee Colony (MRF-ABC) algorithm.