Abstract: Hypersonic flows around space vehicles during the reentry phase in planetary atmospheres are characterized by intense aerothermodynamic phenomena. The aim of this work is to analyze high-temperature flows around an axisymmetric blunt body, taking into account chemical and vibrational non-equilibrium for the air-mixture species and the no-slip condition at the wall. For this purpose, the Navier-Stokes equation system is solved with a finite volume method to determine the flow parameters around the axisymmetric blunt body, especially at the stagnation point and in the boundary layer along the wall. The code captures the shock wave ahead of a blunt body placed in a hypersonic free stream. The numerical technique uses the Flux Vector Splitting method of Van Leer. The CFL coefficient and mesh refinement level are selected to ensure numerical convergence.
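As an illustration of the flux-splitting idea (a textbook sketch of the Van Leer splitting for the 1-D mass flux, not the authors' reacting-flow code), the subsonic/supersonic cases can be written as:

```python
def van_leer_mass_flux(rho, u, c):
    """Van Leer splitting of the 1-D mass flux rho*u into forward (F+)
    and backward (F-) parts; F+ + F- recovers rho*u exactly."""
    M = u / c                      # local Mach number
    if M >= 1.0:                   # supersonic to the right: all flux forward
        return rho * u, 0.0
    if M <= -1.0:                  # supersonic to the left: all flux backward
        return 0.0, rho * u
    # Subsonic: smooth polynomial splitting in the Mach number
    f_plus = 0.25 * rho * c * (M + 1.0) ** 2
    f_minus = -0.25 * rho * c * (M - 1.0) ** 2
    return f_plus, f_minus
```

The momentum and energy fluxes are split analogously; the smoothness of the subsonic branch is what makes the scheme robust across the captured shock.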
Abstract: We investigated the structure and electronic properties
of the compound Mg1-xBixO at bismuth concentrations x = 0, ¼, ½,
and ¾ in the cesium chloride (CsCl), zinc-blende (ZnS), nickel
arsenide (NiAs), rock-salt (NaCl), and wurtzite (WZ) phases. The
calculations were performed using the first-principles
pseudo-potential method within the framework of spin density
functional theory (DFT).
Abstract: With the growth of computers and networks, digital
data can be spread anywhere in the world quickly. In addition,
digital data can easily be copied or tampered with, so security
has become an important topic in the protection of digital data.
Digital watermarking is a method of protecting the ownership of
digital data, although embedding the watermark inevitably affects
image quality. In this paper, Vector Quantization (VQ) is used to
embed the watermark into the image to achieve data hiding. This kind
of watermarking is invisible: users will not be aware of the
existence of the embedded watermark, even though the watermarked
image differs slightly from the original. However, VQ carries a heavy
computational burden, so we adopt a fast VQ encoding scheme based on
partial distortion search (PDS) and a mean-approximation scheme to
speed up the data hiding process.
The watermarks we hide in the image can be grayscale, bi-level, or
color images; text can also be embedded as a watermark.
To test the robustness of the system, we use Photoshop to
apply sharpening, cropping, and alteration, and check whether the
extracted watermark is still recognizable. Experimental results
demonstrate that the proposed system can resist these three kinds of
tampering in general cases.
Abstract: This work assesses the cortical and sub-cortical
neural activity recorded from rodents using entropy- and
mutual-information-based approaches to study how hypothermia affects
neural activity. By applying multi-scale entropy and Shannon entropy,
we quantify the degree of regularity embedded in the cortical and
sub-cortical neurons and characterize how the entropy of
these regions depends on temperature. We also study how mutual
information along the thalamocortical pathway depends on temperature.
The latter is most likely an indicator of coupling between these highly
connected structures in response to temperature manipulation leading
to arousal after global cerebral ischemia.
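The multi-scale entropy computation can be sketched as follows: coarse-grain the signal at each scale, then compute sample entropy of the coarse-grained series. This is a simplified illustration with generic parameter defaults (m, r), not the study's exact settings.

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy -ln(A/B): B counts template pairs of length m
    matching within tolerance r, A those of length m+1."""
    n = len(x)
    def count_matches(length):
        count = 0
        for i in range(n - length):
            for j in range(i + 1, n - length + 1):
                if all(abs(x[i + k] - x[j + k]) <= r for k in range(length)):
                    count += 1
        return count
    b = count_matches(m)
    a = count_matches(m + 1)
    return float("inf") if a == 0 or b == 0 else -math.log(a / b)

def coarse_grain(x, scale):
    """Average non-overlapping windows of `scale` consecutive samples."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

def multiscale_entropy(x, scales, m=2, r=0.2):
    """Sample entropy of the signal at each coarse-graining scale."""
    return [sample_entropy(coarse_grain(x, s), m, r) for s in scales]
```

In practice r is usually set relative to the signal's standard deviation; a flat entropy-versus-scale curve indicates regularity, while complex physiological signals keep entropy high across scales.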
Abstract: One of the key aspects of power quality improvement
in power systems is the mitigation of voltage sags/swells and flicker.
Custom power devices have been known as the best tools for mitigating
voltage disturbances as well as for reactive power compensation.
The Dynamic Voltage Restorer (DVR), the most efficient and
effective modern custom power device, can provide the most
economical solution to several power quality problems in
distribution networks. This paper deals with the analysis and
simulation of a DVR based on instantaneous power theory, which
provides fast detection of disturbance signals. The main purpose of
this work is to remove three important disturbances: voltage sags,
swells, and flicker. Simulation of the proposed method was carried
out on two sample systems in the MATLAB environment, and the
simulation results show that the proposed method is able to
provide the desired power quality in the presence of a wide range of
disturbances.
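The core of the p-q detection scheme can be sketched as below (power-invariant Clarke transform; for a balanced, disturbance-free system the real power p is constant and the imaginary power q is zero, so deviations in p and q flag sags, swells, and flicker). This is a simplified illustration, not the authors' controller.

```python
import math

def clarke(a, b, c):
    """Power-invariant Clarke (abc -> alpha-beta) transformation."""
    alpha = math.sqrt(2.0 / 3.0) * (a - 0.5 * b - 0.5 * c)
    beta = math.sqrt(2.0 / 3.0) * (math.sqrt(3.0) / 2.0) * (b - c)
    return alpha, beta

def instantaneous_powers(v_abc, i_abc):
    """p-q theory: instantaneous real power p and imaginary power q
    from three-phase voltage and current samples."""
    v_alpha, v_beta = clarke(*v_abc)
    i_alpha, i_beta = clarke(*i_abc)
    p = v_alpha * i_alpha + v_beta * i_beta
    q = v_beta * i_alpha - v_alpha * i_beta
    return p, q
```

A DVR controller would compare p and q against their steady-state values sample by sample and synthesize the injection voltage from the difference.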
Abstract: This paper analyzes the conceptual framework of three
statistical methods: multiple regression, path analysis, and structural
equation modeling. When building a statistical model of a complex
social phenomenon, it is important to know the strengths and
limitations of these three statistical models. This study explores
the characteristics, strengths, and limitations of each model and
suggests strategies for accurately explaining or predicting
causal relationships among variables. In particular, for studies of
depression and mental health, common research-modeling mistakes are
discussed.
Abstract: The Polymer Electrolyte Membrane Fuel Cell (PEMFC) is
a time-varying nonlinear dynamic system. Traditional linear
modeling approaches struggle to estimate the structure of the PEMFC
system correctly. For this reason, this paper presents a nonlinear
model of the PEMFC using the Neural Network Auto-Regressive model
with eXogenous inputs (NNARX) approach. A multilayer perceptron
(MLP) network is applied to evaluate the structure of the NNARX
model of the PEMFC. The validity and accuracy of the NNARX model are
tested by one-step-ahead prediction relating output voltage to input
current, using experimental measurements from the PEMFC. The results
show that the obtained nonlinear NNARX model can efficiently
approximate the dynamic behavior of the PEMFC, with model output and
measured system output in close agreement.
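The NNARX structure — past outputs and past inputs fed through an MLP to predict the next output — can be sketched as follows. The lag orders and network size here are hypothetical illustrations, not the fitted model from the paper, and training of the weights is omitted.

```python
import math

def narx_regressor(y, u, t, na=2, nb=2):
    """NNARX regression vector at time t: na past outputs, nb past inputs."""
    return ([y[t - k] for k in range(1, na + 1)]
            + [u[t - k] for k in range(1, nb + 1)])

def mlp_one_step(x, W1, b1, W2, b2):
    """One-step-ahead prediction: single hidden layer (tanh units),
    linear output. W1/b1 are hidden weights/biases, W2/b2 the output's."""
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, b1)]
    return sum(w * h for w, h in zip(W2, hidden)) + b2
```

For the PEMFC case, y would be the measured stack voltage and u the load current; validation consists of comparing mlp_one_step(narx_regressor(...)) against the next measured voltage sample.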
Abstract: Design concepts of a real-time embedded system can be
realized initially by introducing novel design approaches. In this
work, a model-based design approach and in-the-loop testing were
employed early in the conceptual and preliminary phases to formulate
design requirements and perform quick real-time verification. The
design and analysis methodology includes simulation analysis,
model-based testing, and in-the-loop testing. The design of a
conceptual drive-by-wire (DBW) algorithm for an electronic control
unit (ECU) is presented to demonstrate the conceptual design process,
analysis, and functionality evaluation. The concepts of the DBW ECU
function can be implemented in the vehicle system to improve electric
vehicle (EV) conversion drivability. However, within a new development
process, conceptual ECU functions and parameters need to be evaluated.
As a result, the testing system was employed to support the
evaluation of conceptual DBW ECU functions. In the current setup, the
system components consisted of actual DBW ECU hardware, electric
vehicle models, and the controller area network (CAN) protocol. The
vehicle models and CAN bus interface were both implemented as
real-time applications, and the ECU and CAN protocol functionality
were verified against the design requirements. The proposed
system could potentially benefit rapid real-time analysis of design
parameters during conceptual system or software algorithm
development.
Abstract: Cloud computing is an innovative and leading
information technology model for enabling convenient, on-demand
network access to a shared pool of configurable computing resources
that can be rapidly provisioned and released with minimal
management effort. In this paper, we aim at the development of a
workflow management system for cloud computing platforms, based
on our previous research on the dynamic allocation of cloud
computing resources and its workflow process. We took advantage of
HTML5 technology and developed a web-based workflow interface.
To enable many tasks running on the cloud platform to be combined
in sequence, we designed a mechanism and developed an
execution engine for workflow management on clouds. We also
established a prediction model, integrated with the job queuing
system, to estimate the waiting time and cost of individual tasks on
different computing nodes, thereby helping users achieve maximum
performance at the lowest cost. The proposed effort has the potential
to provide an efficient, resilient, and elastic environment
for cloud computing platforms. This development also helps boost user
productivity by providing a flexible workflow interface that lets users
design and control their tasks' flow from anywhere.
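The node-selection idea behind such a prediction model might look like this sketch: among nodes whose predicted wait plus runtime meets the deadline, pick the cheapest. The node attributes and function names here are our illustrative assumptions, not the system's actual API.

```python
def pick_node(nodes, task_runtime, deadline):
    """Choose the cheapest node whose predicted wait + runtime meets
    the deadline; fall back to the earliest-finishing node otherwise.
    `nodes` maps name -> (predicted_wait, cost_per_time_unit, speed)."""
    feasible = []
    for name, (wait, cost, speed) in nodes.items():
        finish = wait + task_runtime / speed
        if finish <= deadline:
            # Rank by cost first, then finish time, then name for ties.
            feasible.append((cost * task_runtime / speed, finish, name))
    if feasible:
        return min(feasible)[2]
    return min(nodes, key=lambda n: nodes[n][0] + task_runtime / nodes[n][2])
```

A real scheduler would refresh the predicted waits from the queuing system before each decision; the trade-off logic stays the same.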
Abstract: This work is the first step in a rather broad research
activity, in collaboration with the Euro-Mediterranean Center on
Climate Change, aimed at introducing scalable approaches in Ocean
Circulation Models. We discuss the design and implementation of
a parallel algorithm for solving the Variational Data Assimilation
(DA) problem on Graphics Processing Units (GPUs). The algorithm
is based on the fully scalable 3DVar DA model, previously proposed
by the authors, which uses a Domain Decomposition approach
(we refer to this model as the DD-DA model). We proceed with
an incremental porting process consisting of three distinct stages:
requirements and source code analysis, incremental development of
CUDA kernels, and testing and optimization. Experiments confirm the
theoretical performance analysis, based on the so-called scale-up
factor, demonstrating that the DD-DA model can be suitably mapped
onto GPU architectures.
Abstract: A field study was conducted to evaluate the efficacy
of lavender for the phytoremediation of contaminated soils. The
experiment was performed on agricultural fields contaminated by
the Non-Ferrous-Metal Works near Plovdiv, Bulgaria. The
concentrations of Pb, Zn, and Cd in lavender (roots, stems, leaves,
and inflorescences) and in the essential oils of lavender were
determined. Lavender is tolerant to heavy metals and can be
grown on contaminated soils; it can be classed among the
hyperaccumulators of lead and the accumulators of cadmium and
zinc, and can be successfully used in the phytoremediation of
heavy-metal-contaminated soils. It is also favorable that heavy
metals influence neither the development of the lavender nor the
quality and quantity of the essential oil. The possibility of further
industrial processing makes lavender an economically attractive
crop for farmers applying phytoextraction technology.
Abstract: Latin hypercube designs (LHDs) have been applied in
many computer experiments, among the space-filling designs found in
the literature. An LHD can be randomly generated, but a randomly
chosen LHD may have bad properties and thus perform poorly in
estimation and prediction. There is a connection between Latin
squares and orthogonal arrays (OAs). A Latin square of order s
is an arrangement of s symbols in s rows and s columns such
that every symbol occurs once in each row and once in each column;
such a square exists for every positive integer s. In this paper, a
computer program was written to construct orthogonal array-based
Latin hypercube designs (OA-LHDs). Orthogonal arrays were
constructed from a Latin square of order s, and the OAs thus obtained
were then used to construct the desired Latin hypercube designs
for three input variables for use in computer experiments. The LHDs
constructed have better space-filling properties, and they can be
used in computer experiments that involve only three input factors.
The MATLAB R2012a package (www.mathworks.com/) was
used to develop the program that constructs the designs.
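The construction described above might be sketched as follows. We use Python rather than the paper's MATLAB, a cyclic Latin square, and the standard OA-based level relabeling; the details of the paper's own program may differ.

```python
import random

def latin_square(s):
    """Cyclic Latin square of order s: L[i][j] = (i + j) mod s."""
    return [[(i + j) % s for j in range(s)] for i in range(s)]

def oa_from_latin_square(s):
    """OA(s^2, 3, s, 2): one run (i, j, L[i][j]) per cell of the square."""
    L = latin_square(s)
    return [(i, j, L[i][j]) for i in range(s) for j in range(s)]

def oa_based_lhd(s, rng=random):
    """OA-based LHD: within each column, the s runs at each level are
    relabeled with a random permutation of that level's s slots, giving
    n = s^2 distinct levels per column (an n-run LHD in 3 factors)."""
    oa = oa_from_latin_square(s)
    n = s * s
    design = [[0] * 3 for _ in range(n)]
    for col in range(3):
        for level in range(s):
            rows = [r for r in range(n) if oa[r][col] == level]
            slots = list(range(level * s, level * s + s))
            rng.shuffle(slots)
            for r, v in zip(rows, slots):
                design[r][col] = v
    return design
```

Because the relabeling stays inside each OA level's band of slots, the resulting LHD inherits the OA's two-dimensional stratification, which is the source of its better space-filling properties.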
Abstract: The present study investigates the
effect of alloying elements and thermo-mechanical treatment (TMT),
i.e. hot rolling and forging with different reduction ratios, on the
hardness (HV) and impact toughness (J) of heat-treated low alloy
steels. By understanding the combined effect of TMT and alloying
elements, and by measuring the hardness and impact toughness
resulting from different heat treatments following TMT of the low
alloy steels, it is possible to determine which conditions yield
optimum mechanical properties and a high strength-to-weight ratio.
Experimental correlations between hot-work reduction ratio,
hardness, and impact toughness for thermo-mechanically heat-treated
low alloy steels are analyzed quantitatively, and both regression and
mathematical models of hardness and impact toughness are developed.
Abstract: This paper presents an optimal broadcast algorithm
for hypercube networks. The main focus of the paper is the
effectiveness of the algorithm in the presence of many node faults.
For the optimal solution, our algorithm builds a spanning tree
connecting all nodes of the network, through which messages
are propagated from the source node to the remaining nodes. At any
given time, at most n − 1 nodes may fail by crashing. We show
that hypercube networks are strongly fault-tolerant. Simulation
results are analyzed to characterize the algorithm's behavior under
many node faults. We compared our simulation results with those of
Fu's method. Fu's approach cannot tolerate n − 1 faulty nodes in the
worst case, but our approach can.
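In the fault-free case, an optimal spanning-tree (binomial-tree) broadcast on the hypercube can be sketched as below; fault handling, which is the paper's main contribution, is omitted from this illustration.

```python
def hypercube_broadcast_schedule(n):
    """Optimal n-round broadcast in an n-dimensional hypercube with
    2^n nodes: in round d, every node that already holds the message
    forwards it to its neighbor across dimension d (source = node 0).
    Returns the list of (sender, receiver) pairs per round."""
    have = {0}
    rounds = []
    for d in range(n):
        sends = [(u, u ^ (1 << d)) for u in sorted(have)]
        rounds.append(sends)
        have |= {v for _, v in sends}
    return rounds
```

Each round doubles the informed set, so all 2^n nodes are reached in exactly n rounds, which is optimal since the hypercube's diameter is n.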
Abstract: During the post-Civil War era, the city of Nashville,
Tennessee, had the highest mortality rate in the United States. The
elevated death and disease rates among former slaves were
attributable to a lack of quality healthcare. To address the paucity of
healthcare services, Meharry Medical College, an institution with the
mission of educating minority professionals and serving the
underserved population, was established in 1876.
Purpose: The social ecological framework and partial least squares
(PLS) path modeling were used to quantify the impact of
socioeconomic status and adverse health outcome on primary care
professionals serving the disadvantaged community. Thus, the study
results could demonstrate the accomplishment of the College’s
mission of training primary care professionals to serve in underserved
areas.
Methods: Various statistical methods were used to analyze alumni
data from 1975 to 2013. K-means cluster analysis was used to assign
individual medical and dental graduates to cluster groups
of practice communities (Disadvantaged or Non-disadvantaged
Communities). Discriminant analysis was implemented to verify the
classification accuracy of the cluster analysis. The independent
t-test was performed to detect significant mean differences in the
respective clustering and criterion variables. A Chi-square test was
used to test whether the proportions of primary care and non-primary
care specialists are consistent with those of medical and dental
graduates practicing in the designated community clusters. Finally,
the PLS path model was constructed to explore the construct validity
of the analytic model by providing the magnitude of the effects of
socioeconomic status and adverse health outcomes on primary care
professionals serving the disadvantaged community.
Results: Approximately 83% (3,192/3,864) of Meharry Medical
College's medical and dental graduates from 1975 to 2013 were
practicing in disadvantaged communities. The independent t-test
confirmed the content validity of the cluster analysis model. Also,
the PLS path modeling demonstrated that alumni served as primary care
professionals in communities with significantly lower socioeconomic
status and higher adverse health outcomes (p < .001). The PLS path
modeling exhibited a meaningful interrelation between the communities
where primary care professionals practice and their surrounding
environments (socioeconomic status and adverse health outcomes),
which supports the model's reliability, validity, and applicability.
Conclusion: This study applied social ecological theory and
analytic modeling approaches to assess the attainment of Meharry
Medical College’s mission of training primary care professionals to
serve in underserved areas, particularly in communities with low
socioeconomic status and high rates of adverse health outcomes. In
summary, the majority of medical and dental graduates from Meharry
Medical College provided primary care services to disadvantaged
communities with low socioeconomic status and high adverse health
outcomes, demonstrating that Meharry Medical College has
fulfilled its mission. The high reliability, validity, and applicability of
this model imply that it could be replicated for comparable
universities and colleges elsewhere.
Abstract: Introduction: Multiple social, individual, and
cultural factors influence an individual's decision to adopt family
planning methods, especially among non-users in patriarchal societies
like Pakistan. Non-users, if targeted efficiently, can contribute
significantly to the country's contraceptive prevalence rate (CPR).
One research study showed that non-users, if convinced to adopt the
lactational amenorrhea method, can shift to long-term methods in the
future. Research shows that if non-users are targeted efficiently, a
59% reduction in unintended pregnancies in sub-Saharan Africa and
South-Central and South-East Asia is anticipated.
Methods: We performed a secondary data analysis of the Pakistan
Demographic and Health Survey (2012-13) dataset. Use of contraception
(never-use/ever-use) was the outcome variable. At the univariate
level, the Chi-square/Fisher exact test was used to assess the
relationship of baseline covariates with contraceptive use. Variables
to be incorporated in the model were then checked for
multicollinearity, confounding, and interaction. Binary logistic
regression (with an urban-rural stratification) was then used to find
the relationship between contraceptive use and baseline demographic
and social variables.
Results: The multivariate analyses showed that younger women
(≤29 years) were more likely to be never-users than those who were
>30 years, a trend seen in both urban areas (AOR 1.92, CI
1.453-2.536) and rural areas (AOR 1.809, CI 1.421-2.303). Regarding
regional variation, women from urban Sindh (AOR 1.548, CI
1.142-2.099) and urban Balochistan (AOR 2.403, CI 1.504-3.839)
included more never-users than other urban regions. Women in the rich
wealth quintile were more often never-users, in both urban and rural
localities (urban: AOR 1.106, CI .753-1.624; rural: AOR 1.162, CI
.887-1.524), although these results were not statistically
significant. Women considering more children (>4) ideal were more
often never-users than those idealizing fewer children, in both urban
(AOR 1.854, CI 1.275-2.697) and rural areas (AOR 2.101, CI
1.514-2.916). Women who had never lost a pregnancy were more inclined
to be non-users in rural areas (AOR 1.394, CI 1.127-1.723). Women
familiar with only traditional methods or no method included more
never-users in rural areas (AOR 1.717, CI 1.127-1.723), but in urban
areas the effect was not significant. Women unaware of a Lady Health
Worker's presence in their area were more often never-users,
especially in rural areas (AOR 1.276, CI 1.014-1.607). Women who did
not visit any care provider were more often never-users (urban: AOR
11.738, CI 9.112-15.121; rural: AOR 7.832, CI 6.243-9.826).
Discussion/Conclusion: This study concluded that the government,
policy makers, and private-sector family planning programs should
focus on the untapped pool of never-users (younger women from
underserved provinces and in higher wealth quintiles who desire more
children). We also need to cover catchment areas with fewer LHWs and
fewer providers, as ignorance of modern methods and never having been
visited by an LHW are important determinants of never-use. All of
this is consistent with previous literature from similar developing
countries.
Abstract: Theory of Mind (ToM) refers to the ability to infer
another’s mental state. With appropriate ToM, one can behave well in
social interactions. A growing body of evidence has demonstrated that
temporal lobe epilepsy (TLE) may impair ToM by affecting regions of
its underlying neural network.
However, the question of whether there is cerebral laterality for ToM
functions remains open. This study aimed to examine whether there is
cerebral lateralization for ToM abilities in TLE patients. Sixty-seven
adult TLE patients and 30 matched healthy controls (HC) were
recruited. Patients were classified into right (RTLE), left (LTLE), and
bilateral (BTLE) TLE groups on the basis of a consensus panel review
of their seizure semiology, EEG findings, and brain imaging results.
All participants completed an intellectual test and four tasks measuring
basic and advanced ToM. The results showed that, on all ToM tasks,
(1) each patient group performed worse than HC; (2) there were no
significant differences between LTLE and RTLE groups; and (3) the
BTLE group performed the worst. It appears that the neural network
responsible for ToM is distributed evenly between the cerebral
hemispheres.
Abstract: This work sets out to debate the tensions involved in
the processes of contamination and self-purification in the urban
space, particularly in the streams that run through the Buenos Aires
metropolitan area. For much of their course, those streams are piped;
their waters do not come into contact with the outdoors until they
have reached deeply impoverished urban areas with high levels of
environmental contamination. These are peripheral zones that, until
thirty years ago, were marshlands and fields. They are now densely
populated areas largely lacking in urban infrastructure.
The Cárcova neighborhood, where this project is underway, is in
the José León Suárez section of General San Martín county, Buenos
Aires province. A stretch of José León Suarez canal crosses the
neighborhood. Starting upstream, this canal carries pollutants due to
the sewage and industrial waste released into it. Further downstream,
in the neighborhood, domestic drainage is poured into the stream. In
this paper, we formulate a hypothesis diametrically opposed to the
one that holds that these neighborhoods are the primary source of
contamination, suggesting instead that in the stretch of the canal
that runs through the neighborhood the stream's waters are actually
cleaned and the sediments accumulate pollutants. Indeed, the
stretches of water that run through these neighborhoods act as water
processing plants for the metropolis.
This project has studied the different organic-load polluting
contributions to the water in a certain stretch of the canal, the
reduction of that load over the course of the canal, and the
incorporation of pollutants into the sediments. We have found that
the surface water has considerable ability to self-purify, mostly due to
processes of sedimentation and adsorption. The polluting load is
accumulated in the sediments where that load stabilizes slowly by
means of anaerobic processes. In this study, we also investigated the
risks of sediment management and the use of the processes studied
here in controlled conditions as tools of environmental restoration.
Abstract: Quality of Service (QoS) attributes, as part of the
service description, are an important factor in service selection. It
is not easy to quantify the weight of each QoS condition exactly,
since human judgments based on preference introduce vagueness. As
web service selection requires optimization, evolutionary computing,
which uses heuristics to select an optimal solution, is adopted. In
this work, the evolutionary computing technique Particle Swarm
Optimization (PSO) is used to select suitable web services based on
the user's weighting of each QoS value, optimizing the QoS weight
vector and thereby finding the best weight vectors for the best
services to be selected. Finally, the results are compared and
analyzed using the static inertia weight and the deterministic
inertia weight variants of PSO.
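A minimal PSO sketch showing the two inertia-weight options compared above (the fitness function below is a stand-in; a real QoS objective would score candidate weight vectors against the user's stated preferences):

```python
import random

def pso(f, dim, n_particles=20, iters=100, w=0.7, w_schedule=None,
        c1=1.5, c2=1.5, rng=random):
    """Minimal PSO minimizing f. A static inertia weight w is used
    unless w_schedule(t) supplies a deterministic (e.g. linearly
    decreasing) weight per iteration."""
    pos = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for t in range(iters):
        wt = w_schedule(t) if w_schedule else w
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (wt * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:          # update personal best
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:         # update global best
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

With a static weight, w stays fixed (here 0.7); a deterministic schedule such as w_schedule=lambda t: 0.9 - 0.5 * t / 200 shifts the swarm from exploration toward exploitation as the run proceeds.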
Abstract: Scheduling and mapping of tasks onto a set of
processors is considered a critical problem in parallel and
distributed computing systems. This paper deals with the problem of
dynamic scheduling on a special type of multiprocessor architecture
known as the Linear Crossed Cube (LCQ) network. This multiprocessor
is a hybrid network that combines the features of both linear and
cube-based architectures. Two standard dynamic scheduling schemes,
namely Minimum Distance Scheduling (MDS) and Two Round Scheduling
(TRS), are implemented on the LCQ network. Parallel tasks are
mapped and the load imbalance is evaluated on different sets of
processors in the LCQ network. The simulation results are evaluated,
and, through analysis of the results, an effort is made to obtain the
best solution for the given network in terms of residual load
imbalance and execution time. Other performance metrics, such as
speedup and efficiency, are also evaluated with the given dynamic
algorithms.
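The load-imbalance evaluation and an MDS-style placement step might be sketched as follows. The neighbor lists here are a generic stand-in, not the actual LCQ topology, and the imbalance metric is one common definition among several.

```python
def load_imbalance(loads):
    """Load imbalance factor: (max load - ideal load) / ideal load,
    where the ideal load is the per-processor average."""
    ideal = sum(loads) / len(loads)
    return 0.0 if ideal == 0 else (max(loads) - ideal) / ideal

def minimum_distance_schedule(tasks, neighbors, loads):
    """MDS-style step: each arriving task (origin, cost) is placed on
    the least-loaded processor among the origin and its immediate
    neighbors, keeping communication distance minimal."""
    for origin, cost in tasks:
        candidates = [origin] + neighbors[origin]
        target = min(candidates, key=lambda p: loads[p])
        loads[target] += cost
    return loads
```

A two-round variant would re-examine the most-loaded processors after the first pass and migrate surplus tasks one more hop, trading extra scheduling time for lower residual imbalance.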