Abstract: In this paper, a simplified analytical method for
calculating elasto-plastic stresses and strains in notched bodies subject to
non-proportional loading paths is discussed. The method is based
on the Neuber notch correction, which relates the incremental elastic
and elastic-plastic strain energy densities at the notch root and the
material constitutive relationship. The validity of the method was
demonstrated by comparing the computed results of the proposed model
against finite element numerical data for a notched shaft. The
comparison showed that the model estimated notch-root elasto-plastic
stresses and strains with good accuracy using linear-elastic stresses. The
proposed model provides a more efficient and simpler analysis method,
preferable to expensive experimental component tests and to more
complex and time-consuming incremental non-linear FE analysis.
The model is particularly suitable for fatigue life and fatigue
damage estimates of notched components subjected to non-proportional
loading paths.
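As a sketch of the notch-correction idea, Neuber's rule equates the elastic strain energy term S²/E with the elasto-plastic product σ·ε at the notch root; combined with a constitutive law such as Ramberg-Osgood (our choice here, with illustrative material constants not taken from the paper), the notch-root stress follows from a one-dimensional root search:

```python
def ramberg_osgood_strain(sigma, E, K, n):
    """Total strain from the Ramberg-Osgood law: elastic + plastic parts."""
    return sigma / E + (sigma / K) ** (1.0 / n)

def neuber_notch_stress(S_elastic, E=200e3, K=1000.0, n=0.2, tol=1e-9):
    """Solve Neuber's rule sigma * eps = S^2 / E by bisection.

    S_elastic is the linear-elastic notch stress (MPa); E, K, n are
    illustrative Ramberg-Osgood constants, not values from the paper.
    """
    target = S_elastic ** 2 / E        # elastic strain energy density term
    lo, hi = 0.0, S_elastic            # elasto-plastic stress <= elastic stress
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mid * ramberg_osgood_strain(mid, E, K, n) < target:
            lo = mid
        else:
            hi = mid
    sigma = 0.5 * (lo + hi)
    return sigma, ramberg_osgood_strain(sigma, E, K, n)
```

Because σ·ε is monotonically increasing in σ, bisection between zero and the elastic stress always converges.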
Abstract: During the post-Civil War era, the city of Nashville,
Tennessee, had the highest mortality rate in the United States. The
elevated death and disease rates among former slaves were
attributable to a lack of quality healthcare. To address the paucity of
healthcare services, Meharry Medical College, an institution with the
mission of educating minority professionals and serving the
underserved population, was established in 1876.
Purpose: The social ecological framework and partial least squares
(PLS) path modeling were used to quantify the impact of
socioeconomic status and adverse health outcome on primary care
professionals serving the disadvantaged community. Thus, the study
results could demonstrate the accomplishment of the College’s
mission of training primary care professionals to serve in underserved
areas.
Methods: Various statistical methods were used to analyze alumni
data from 1975 to 2013. K-means cluster analysis was utilized to
identify individual medical and dental graduates in the cluster groups
of the practice communities (Disadvantaged or Non-disadvantaged
Communities). Discriminant analysis was implemented to verify the
classification accuracy of cluster analysis. The independent t-test was
performed to detect the significant mean differences of respective
clustering and criterion variables. A chi-square test was used to test
whether the proportions of primary care and non-primary care specialists
were consistent with those of medical and dental graduates practicing in
the designated community clusters. Finally, the PLS path model was
constructed to explore the construct validity of the analytic model by
quantifying the magnitudes of the effects of socioeconomic status and
adverse health outcome on primary care professionals serving the
disadvantaged community.
Results: Approximately 83% (3,192/3,864) of Meharry Medical
College’s medical and dental graduates from 1975 to 2013 were
practicing in disadvantaged communities. The independent t-test confirmed the content validity of the cluster analysis model. Also, the
PLS path modeling demonstrated that alumni served as primary care
professionals in communities with significantly lower socioeconomic
status and higher adverse health outcome (p < .001). The PLS path
modeling revealed a meaningful interrelation between the communities
in which primary care professionals practice and the surrounding
environment (socioeconomic status and adverse health outcome),
supporting the model's reliability, validity, and applicability.
Conclusion: This study applied social ecological theory and
analytic modeling approaches to assess the attainment of Meharry
Medical College’s mission of training primary care professionals to
serve in underserved areas, particularly in communities with low
socioeconomic status and high rates of adverse health outcomes. In
summary, the majority of medical and dental graduates from Meharry
Medical College provided primary care services to disadvantaged
communities with low socioeconomic status and high adverse health
outcome, which demonstrated that Meharry Medical College has
fulfilled its mission. The high reliability, validity, and applicability of
this model imply that it could be replicated for comparable
universities and colleges elsewhere.
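The clustering step described above can be illustrated with a minimal two-cluster k-means over a one-dimensional composite disadvantage score; the data here are hypothetical stand-ins for the study's socioeconomic and health variables:

```python
import random

def kmeans_2(points, iters=50, seed=0):
    """Minimal 2-means clustering of per-community indicator scores.

    `points` are 1-D composite disadvantage scores (hypothetical inputs,
    standing in for the socioeconomic/health variables in the study).
    Returns (centroids, labels).
    """
    rng = random.Random(seed)
    c = rng.sample(points, 2)            # initialise centroids from the data
    for _ in range(iters):
        # assign each point to the nearest centroid
        labels = [0 if abs(p - c[0]) <= abs(p - c[1]) else 1 for p in points]
        # recompute each centroid as the mean of its members
        for k in (0, 1):
            members = [p for p, l in zip(points, labels) if l == k]
            if members:
                c[k] = sum(members) / len(members)
    return c, labels
```

In the study the resulting clusters (Disadvantaged vs. Non-disadvantaged Communities) were then checked with discriminant analysis.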
Abstract: The exponential growth of social media has drawn much
attention to public opinion information. Online forums, blogs, and
micro-blogs are proving to be extremely valuable resources holding
vast volumes of information. However, most social media data is in
unstructured or semi-structured form, which makes it difficult to
decipher automatically. It is therefore essential to understand and
analyze these data to support sound decision-making. Hotspot detection
in online forums is a promising research field in web mining that helps
motivate users to make the right decision at the right time. The
proposed system consists of a novel approach to detect hotspot forums
for any given time period. It uses aging theory to find hot terms and
E-K-means to detect the hotspot forums. Experimental results
demonstrate that the proposed approach outperforms k-means in
detecting hotspot forums, with improved accuracy.
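A minimal sketch of the aging-theory idea: each term carries an "energy" that is fed by new mentions and decays over time, so terms whose energy spikes are candidates for hot terms. The constants below are illustrative, not the paper's:

```python
def term_energy(counts, alpha=0.5, beta=0.8):
    """Aging-theory energy of a term over time steps.

    counts[t] = number of mentions at step t. alpha scales the nutrition
    gained from mentions; beta is the fraction of energy retained after
    decay. Both constants are illustrative, not from the paper.
    """
    energy = 0.0
    history = []
    for c in counts:
        energy = beta * energy + alpha * c   # decay, then feed with mentions
        history.append(energy)
    return history
```

A term whose energy exceeds a chosen threshold in a time window would be flagged as hot, and forums dominated by hot terms become candidates for the clustering stage.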
Abstract: Association rule mining is one of the most important fields of data mining and knowledge discovery. In this paper, we propose an efficient multiple-support frequent pattern growth algorithm, which we call “MSFP-growth”, that enhances the FP-growth algorithm by adding an infrequent-child-node pruning step with multiple minimum supports using maximum constraints. The algorithm is implemented and compared with other common algorithms: Apriori with multiple minimum supports using maximum constraints, and FP-growth. The experimental results show that the rules mined by the proposed algorithm are interesting and that our algorithm achieves better performance than the other algorithms without sacrificing accuracy.
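To illustrate the multiple-minimum-support threshold under the maximum constraint (the criterion applied during MSFP-growth's infrequent-child-node pruning), here is a brute-force miner that keeps an itemset only if its support meets the largest MIS among its items; MSFP-growth itself avoids this enumeration by building an FP-tree:

```python
from itertools import combinations

def frequent_itemsets(transactions, mis):
    """Brute-force frequent-itemset miner with multiple minimum supports
    under the maximum constraint: an itemset is frequent only if its
    support >= max(MIS(item) for item in itemset). This illustrates the
    threshold rule only; MSFP-growth applies it during FP-tree pruning.
    """
    items = sorted({i for t in transactions for i in t})
    n = len(transactions)
    result = {}
    for r in range(1, len(items) + 1):
        for cand in combinations(items, r):
            sup = sum(1 for t in transactions if set(cand) <= set(t)) / n
            if sup >= max(mis[i] for i in cand):
                result[cand] = sup
    return result
```

With per-item MIS values, rare items can be mined at a low threshold while combinations containing a high-MIS item remain tightly constrained.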
Abstract: This paper presents a model for a modified T-junction
device for microspheres generation. The numerical model is
developed using a commercial software package: COMSOL
Multiphysics. In order to test the accuracy of the numerical model,
multiple variables, such as the flow rate of cross-flow, fluid properties,
structure, and geometry of the microdevice are applied. The diameters
of the generated microspheres predicted by the model are compared
with experimental results, and the comparison shows good agreement.
The model is therefore useful for further optimization of the device
and for feedback control of microsphere generation.
Abstract: A model was constructed to predict the amount of
solar radiation that will make contact with the surface of the earth in
a given location an hour into the future. This project was supported
by the Southern Company to determine at what specific times during
a given day of the year solar panels could be relied upon to produce
energy in sufficient quantities. Owing to its ability as a universal
function approximator, an artificial neural network was used to
estimate the nonlinear pattern of solar radiation, utilizing
measurements of weather conditions collected at the Griffin, Georgia
weather station as inputs. A number of network configurations and
training strategies were utilized, though a multilayer perceptron with
a variety of hidden nodes trained with the resilient propagation
algorithm consistently yielded the most accurate predictions. In
addition, a modeled direct normal irradiance field and adjacent
weather station data were used to bolster prediction accuracy. In later
trials, the solar radiation field was preprocessed with a discrete
wavelet transform with the aim of removing noise from the
measurements. The current model provides predictions of solar
radiation with a mean square error of 0.0042, though ongoing efforts
are being made to further improve the model’s accuracy.
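A toy version of the network component can be sketched as a one-hidden-layer perceptron. We train it with plain stochastic gradient descent rather than the resilient propagation (Rprop) used in the work above, since only the weight-update rule differs; the data are synthetic stand-ins for the weather inputs:

```python
import math, random

def train_mlp(xs, ys, hidden=4, epochs=3000, lr=0.1, seed=1):
    """Tiny one-hidden-layer MLP trained with plain SGD.

    Stands in for the paper's multilayer perceptron; the paper trains
    with resilient propagation (Rprop), which changes only the update
    rule. xs, ys are scalar input/target pairs.
    """
    rng = random.Random(seed)
    w1 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0

    def forward(x):
        h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
        return h, b2 + sum(w2[j] * h[j] for j in range(hidden))

    for _ in range(epochs):
        for x, y in zip(xs, ys):
            h, out = forward(x)
            err = out - y                       # d(loss)/d(out) for 0.5*err^2
            for j in range(hidden):
                grad_h = err * w2[j] * (1 - h[j] ** 2)  # backprop through tanh
                w2[j] -= lr * err * h[j]
                w1[j] -= lr * grad_h * x
                b1[j] -= lr * grad_h
            b2 -= lr * err
    return lambda x: forward(x)[1]
```

Swapping the inner updates for Rprop's sign-based step-size adaptation recovers the training strategy the study reports as most accurate.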
Abstract: The arm length, hand length, hand breadth and middle
finger length of 1540 right-handed industrial workers of Haryana
state were used to assess the relationship between the upper limb
dimensions and stature. Initially, the data were analyzed using basic
univariate analysis and independent t-tests; then simple and multiple
linear regression models were used to estimate stature using SPSS
(version 17). There was a positive correlation between upper limb
measurements (hand length, hand breadth, arm length and middle
finger length) and stature (p < 0.01), which was highest for hand
length. The accuracy of stature prediction ranged from ± 54.897 mm
to ± 58.307 mm. The use of multiple regression equations gave better
results than simple regression equations. This study provides new
forensic standards for stature estimation from the upper limb
measurements of male industrial workers of Haryana (India). The
results of this research indicate that stature can be determined
accurately from hand dimensions when only the upper limb is
available, for example after explosions, train/plane crashes, or in
cases of mutilated bodies. The regression formulae derived in this
study will be useful to anatomists, archaeologists, anthropologists,
design engineers and forensic scientists for fairly accurate
prediction of stature.
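The simple regression step can be sketched as an ordinary least-squares fit of stature on a single limb dimension; the numbers below are hypothetical, not the Haryana measurements from the study:

```python
def simple_regression(x, y):
    """Ordinary least squares fit of stature (y) on a limb dimension (x).

    Returns (intercept, slope). The data below are hypothetical, not the
    Haryana measurements from the study.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

# hypothetical hand lengths (mm) and statures (mm)
hand = [180.0, 185.0, 190.0, 195.0, 200.0]
stature = [1650.0, 1680.0, 1700.0, 1730.0, 1760.0]
a, b = simple_regression(hand, stature)   # stature ~= a + b * hand_length
```

The multiple-regression variant the study prefers simply extends the design matrix with the remaining limb dimensions.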
Abstract: This paper introduces an original method for
guaranteed estimation of the accuracy for an ensemble of Lipschitz
classifiers. The solution was obtained as a finite closed set of
alternative hypotheses, which contains the object of classification with
a probability of not less than the specified value. Thus, the
classification is represented by a set of hypothetical classes. In this
case, the smaller the cardinality of the discrete set of hypothetical
classes is, the higher the classification accuracy. Experiments have
shown that if the cardinality of the classifier ensemble is increased,
the cardinality of this set of hypothetical classes is reduced. The
problem of the guaranteed estimation of the accuracy of an ensemble
of Lipschitz classifiers is relevant in the multichannel classification
of target events in C-OTDR monitoring systems. Results of the
practical application of the suggested approach to accuracy control in
C-OTDR monitoring systems are presented.
Abstract: This paper presents system-level enhancements to CMOS
solid-state nanopore techniques to speed up next-generation
molecular recording and provide high-throughput channels. The
discussion also considers the optimum number of base-pair (bp)
measurements per channel, which plays an important role in enhancing
potential read accuracy. Estimation of effective power consumption
identified a suitable range of multi-channel configurations. A
statistical nanopore bp extraction model can contribute higher read
accuracy for longer read lengths (read length > 200). A Time
Multiplexing (TM) based multichannel readout system with nanopore
ionic current switching contributed hardware savings.
Abstract: The real bronchial tree is a very complicated piping system,
and analysis of flow and pressure losses in this system is very
difficult. Due to the complex geometry and the very small dimensions
in the lower generations, examination by CFD is possible only in the
central part of the bronchial tree. To specify the pressure losses of
the lower generations, it is necessary to provide a mathematical
equation. Determining mathematical formulas for the calculation of
pressure losses in real lungs is a time-consuming and inefficient
process because of their complexity and diversity. For these
calculations it is necessary to slightly simplify the geometry of the
lungs (same cross-section over the length of each individual
generation) or to use one of the idealized models of the lungs
(Horsfield, Weibel). The article compares the values of pressure
losses obtained from CFD simulation of air flow in the central part of
the real bronchial tree with the values calculated in slightly
simplified real lungs using a mathematical relationship derived from
the Bernoulli and continuity equations. The aim of the article is to
analyse the accuracy of the analytical method and the possibility of
using it to calculate pressure losses in the lower generations, which
are difficult to solve by numerical methods due to their small geometry.
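A minimal sketch of such an analytical relationship: continuity fixes the velocity in each cross-section (v = Q/A), and the frictionless Bernoulli equation then gives the dynamic pressure change across a segment. The article's full relationship also includes loss terms omitted here:

```python
import math

def segment_pressure_drop(Q, d1, d2, rho=1.2):
    """Dynamic pressure change along an airway segment from Bernoulli +
    continuity (frictionless sketch; the article's relationship also
    covers losses we omit here).

    Q in m^3/s, inlet/outlet diameters d1, d2 in m, air density rho in
    kg/m^3. Continuity: v = Q / A. Bernoulli: dp = rho/2 * (v2^2 - v1^2).
    """
    a1 = math.pi * d1 ** 2 / 4.0
    a2 = math.pi * d2 ** 2 / 4.0
    v1, v2 = Q / a1, Q / a2
    return 0.5 * rho * (v2 ** 2 - v1 ** 2)
```

For a narrowing segment (d2 < d1) the velocity rises and the dynamic pressure change is positive, as expected from continuity.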
Abstract: In a multi-cultural learning context, where ties are
weak and dynamic, combining qualitative with quantitative research
methods may be more effective. Such a combination may also allow
us to answer different types of questions, such as those about people's
perception of the network. In this study, the use of observation,
interviews and photos was explored as a way of enhancing data from
social network questionnaires. Integrating all of these methods was
found to enhance the quality of data collected and its accuracy, also
providing a richer story of the network dynamics and the factors that
shaped these changes over time.
Abstract: The detection of moving objects from video image
sequences is very important for object tracking, activity recognition,
and behavior understanding in video surveillance.
The most widely used approach for moving object detection and
tracking is background subtraction. Many background subtraction
approaches have been suggested, but these are sensitive to
illumination changes, and the solutions proposed to bypass this
problem are time-consuming.
In this paper, we propose a robust yet computationally efficient
background subtraction approach and, mainly, focus on the ability to
detect moving objects in dynamic scenes, for possible applications in
complex and restricted access areas monitoring, where moving and
motionless persons must be reliably detected. It consists of three
main phases: establishing invariance to illumination changes,
background/foreground modeling, and morphological analysis for
noise removal.
We handle illumination changes using Contrast Limited Adaptive
Histogram Equalization (CLAHE), which limits the intensity of each
pixel to a user-determined maximum. Thus, it mitigates the degradation due to
scene illumination changes and improves the visibility of the video
signal. Initially, the background and foreground images are extracted
from the video sequence. Then, the background and foreground
images are separately enhanced by applying CLAHE.
In order to form multi-modal backgrounds, we model each channel
of a pixel as a mixture of K Gaussians (K = 5) using a Gaussian
Mixture Model (GMM). Finally, we post-process the resulting binary
foreground mask using morphological erosion and dilation to remove
possible noise.
For experimental testing, we used a standard dataset to evaluate the
efficiency and accuracy of the proposed method on a diverse set of
dynamic scenes.
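The per-pixel GMM background model can be sketched for a single intensity channel as follows (a simplified Stauffer-Grimson style update with illustrative constants; the method above uses K = 5 Gaussians per channel, with CLAHE applied beforehand and morphological cleanup afterwards):

```python
class PixelGMM:
    """Per-pixel background model as a mixture of K Gaussians
    (simplified Stauffer-Grimson style update; constants are
    illustrative, and weight normalisation is omitted for brevity)."""

    def __init__(self, k=5, alpha=0.05, match_sigmas=2.5):
        self.k, self.alpha, self.match = k, alpha, match_sigmas
        self.weights = [1.0 / k] * k
        self.means = [i * 255.0 / (k - 1) for i in range(k)]
        self.vars = [400.0] * k          # initial variance, illustrative

    def update(self, x):
        """Feed one intensity sample; return True if x is background
        (matches a high-weight component), False if foreground."""
        for i in range(self.k):
            if (x - self.means[i]) ** 2 <= (self.match ** 2) * self.vars[i]:
                rho = self.alpha
                self.means[i] += rho * (x - self.means[i])
                self.vars[i] += rho * ((x - self.means[i]) ** 2 - self.vars[i])
                for j in range(self.k):
                    self.weights[j] = (1 - self.alpha) * self.weights[j] + (
                        self.alpha if j == i else 0.0)
                return self.weights[i] > 1.0 / self.k  # crude background test
        # no component matched: replace the lowest-weight one
        i = self.weights.index(min(self.weights))
        self.means[i], self.vars[i], self.weights[i] = x, 400.0, self.alpha
        return False
```

After the model stabilises on a static background value, a sudden outlier intensity is classified as foreground, which is the behaviour the binary mask encodes per pixel.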
Abstract: This paper addresses the issue of the autonomous
mobile robot (AMR) navigation task based on the hybrid control
modes. The novel hybrid control mode, based on multi-sensors
information by using the fuzzy approach, has been presented in this
research. The system operates in real time, is robust, enables the robot
to operate with imprecise knowledge, and takes into account the
physical limitations of the environment in which the robot moves,
obtaining satisfactory responses for a large number of different
situations. Experiments were simulated and carried out with a Pioneer
mobile robot. From the experimental results, the effectiveness and
usefulness of the proposed AMR obstacle avoidance and navigation
scheme are confirmed. The experimental results show the feasibility
of the approach, and the control system improved the navigation accuracy. The
implementation of the controller is robust, has a low execution time,
and allows an easy design and tuning of the fuzzy knowledge base.
Abstract: The aim of this paper is to select the most accurate
forecasting method for predicting the future values of the
unemployment rate in selected European countries. In order to do so,
several forecasting techniques adequate for forecasting time series
with a trend component were selected, namely double exponential
smoothing (also known as Holt's method) and the Holt-Winters method,
which accounts for both trend and seasonality. The results of the empirical
analysis showed that the optimal model for forecasting
the unemployment rate in Greece was the Holt-Winters additive method.
In the case of Spain, according to MAPE, the optimal model was the
double exponential smoothing model. Furthermore, for Croatia and Italy
the best forecasting model for the unemployment rate was the
Holt-Winters multiplicative model, whereas in the case of Portugal the
best model for forecasting the unemployment rate was the double
exponential smoothing model. Our findings are in line with European Commission
unemployment rate estimates.
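Holt's double exponential smoothing, one of the two methods compared above, can be sketched as follows (illustrative smoothing constants, not values fitted to the unemployment series):

```python
def holt_forecast(series, alpha=0.5, beta=0.3, horizon=1):
    """Double exponential smoothing (Holt's linear method).

    Maintains a smoothed level and trend, then extrapolates
    level + horizon * trend. alpha and beta are illustrative smoothing
    constants, not values fitted to the unemployment data.
    """
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        last_level = level
        level = alpha * y + (1 - alpha) * (level + trend)   # smooth the level
        trend = beta * (level - last_level) + (1 - beta) * trend  # smooth trend
    return level + horizon * trend
```

The Holt-Winters variants add a third smoothing equation for an additive or multiplicative seasonal component on top of this level/trend recursion.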
Abstract: Ti6Al4V alloy is widely used in the automotive and
aerospace industries due to its good machining characteristics.
Micro-EDM drilling is commonly used to drill micro holes in extremely
hard materials with very high depth-to-diameter ratios. In this study,
the parameters of micro-electrical discharge machining (EDM) in
drilling of Ti6Al4V alloy are optimized for higher machining accuracy
with less hole dilation and a lower hole taper ratio. The micro-EDM
machining parameters include peak current and pulse on time. Fuzzy analysis
was developed to evaluate the machining accuracy. The analysis
shows that hole-dilation and hole-taper ratio are increased with the
increasing of peak current and pulse on time. However, the surface
quality deteriorates as the peak current and pulse on time increase.
The combination that gives the optimum result for hole dilation is
medium peak current and short pulse on time. Meanwhile, the
optimum result for hole taper ratio is low peak current and short pulse
on time.
Abstract: The aim of this work is to build a model based on
tissue characterization that is able to discriminate pathological and
non-pathological regions in three-phasic CT images. Based on feature
selection in the different phases, we design a neural network system
with an optimal number of neurons in its hidden layer. Our approach
consists of three steps:
feature selection, feature reduction, and classification. For each
region of interest (ROI), 6 distinct sets of texture features are
extracted such as: first order histogram parameters, absolute gradient,
run-length matrix, co-occurrence matrix, autoregressive model, and
wavelet, for a total of 270 texture features. When analyzing multiple
phases, we show that the injected liquid causes changes to the most
relevant features in each region. Our results demonstrate that, for
detecting HCC tumors, phase 3 is the best phase for most of the
features that we apply to the classification algorithm. The detection
rate between the pathological and healthy classes achieved with our
method, using first-order histogram parameters, is 85% in phase 1,
95% in phase 2, and 95% in phase 3.
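Of the six feature sets, the first-order histogram parameters (which performed best here) can be sketched as follows; the exact parameter list used by the authors may differ:

```python
import math

def first_order_features(roi):
    """First-order histogram parameters for a flattened ROI.

    One of the six texture feature sets listed above; a sketch, not the
    authors' exact feature definitions.
    """
    n = len(roi)
    mean = sum(roi) / n
    var = sum((v - mean) ** 2 for v in roi) / n
    std = math.sqrt(var)
    skew = sum((v - mean) ** 3 for v in roi) / (n * std ** 3) if std else 0.0
    kurt = sum((v - mean) ** 4 for v in roi) / (n * var ** 2) if var else 0.0
    # histogram entropy over 8 equal-width bins
    lo, hi = min(roi), max(roi)
    width = (hi - lo) / 8 or 1.0
    counts = [0] * 8
    for v in roi:
        counts[min(int((v - lo) / width), 7)] += 1
    entropy = -sum(c / n * math.log2(c / n) for c in counts if c)
    return {"mean": mean, "variance": var, "skewness": skew,
            "kurtosis": kurt, "entropy": entropy}
```

Each ROI then contributes one such feature vector (here 5 of the 270 features) to the neural network classifier.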
Abstract: Development of a method to estimate gene functions is
an important task in bioinformatics. One of the approaches for the
annotation is the identification of the metabolic pathway that genes are
involved in. Since gene expression data reflect various intracellular
phenomena, those data are considered to be related to genes'
functions. However, it has been difficult to estimate gene function
with high accuracy. The low accuracy of the estimation is considered
to be caused by the difficulty of accurately measuring gene
expression: even when measured under the same conditions, gene
expression values usually vary. In this study, we propose a feature
extraction method focusing on the variability of gene expressions to
estimate genes' metabolic pathways accurately. First,
we estimated the distribution of each gene expression from replicate
data. Next, we calculated the similarity between all gene pairs by KL
divergence, which is a method for calculating the similarity between
distributions. Finally, we utilized the similarity vectors as feature
vectors and trained the multiclass SVM for identifying the genes'
metabolic pathway. To evaluate our developed method, we applied the
method to budding yeast and trained the multiclass SVM for
identifying the seven metabolic pathways. As a result, the accuracy
achieved with our developed method was higher than that calculated
from the raw gene expression data. Thus, our developed method based
on KL divergence is useful for identifying genes' metabolic pathways.
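The distribution-similarity step can be sketched with the closed-form KL divergence between univariate Gaussians fitted to each gene's replicate measurements; mapping the symmetrised divergence to a similarity via exp(-d) is our assumption for illustration, not necessarily the paper's choice:

```python
import math

def kl_gaussian(m1, s1, m2, s2):
    """Closed-form KL divergence KL(N(m1, s1^2) || N(m2, s2^2)) between
    univariate Gaussians fitted to two genes' replicate expressions."""
    return math.log(s2 / s1) + (s1 ** 2 + (m1 - m2) ** 2) / (2 * s2 ** 2) - 0.5

def similarity(m1, s1, m2, s2):
    """Symmetrised KL turned into a similarity feature; the exp(-d)
    mapping is our assumption for illustration."""
    d = 0.5 * (kl_gaussian(m1, s1, m2, s2) + kl_gaussian(m2, s2, m1, s1))
    return math.exp(-d)
```

Stacking these pairwise similarities per gene yields the feature vectors that are fed to the multiclass SVM.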
Abstract: In this paper, an effective non-destructive, non-invasive
approach for leak detection is proposed. The process relies on
analyzing thermal images collected by an IR viewer device that
captures thermograms. A statistical analysis of the collected thermal
images of the ground surface along the expected leak location,
followed by a visual inspection of the thermograms, was performed in
order to locate the leak. To verify the
applicability of the proposed approach the predicted leak location
from the developed approach was compared with the real leak
location. The results showed that the expected leak location was
successfully identified with an accuracy of more than 95%.
Abstract: The ESPRIT-TLS method appears to be a good choice for
high-resolution fault detection in induction machines, as it is
highly effective in identifying frequencies and amplitudes. However,
its high computational complexity hinders its implementation in
real-time fault diagnosis. To avoid this problem, a Fast-ESPRIT
algorithm that combines an IIR band-pass filtering technique, a
decimation technique, and the original ESPRIT-TLS method was employed
to accurately extract frequencies and their magnitudes from the
wind-turbine stator current at lower computational cost. The proposed
algorithm addresses the wind turbine machine's need for online, fast,
and proactive condition monitoring. This type of remote and periodic
maintenance ensures an acceptable machine lifetime, minimizes
downtime, and maximizes productivity. The developed technique has
been evaluated by computer simulations under many fault scenarios.
The study results prove the performance of Fast-ESPRIT, offering
rapid, high-resolution harmonic recognition with minimum computation
time and low memory cost.
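One ingredient of such a pipeline, anti-aliased decimation, can be sketched as follows; a simple boxcar low-pass is used here as a dependency-free stand-in for the IIR band-pass filters the method employs:

```python
def decimate(signal, factor):
    """Anti-aliased decimation sketch: boxcar low-pass, then keep every
    factor-th sample. Fast-ESPRIT's pipeline uses IIR band-pass filters;
    the boxcar here is a stand-in to keep the sketch dependency-free.
    """
    smoothed = []
    for i in range(len(signal)):
        window = signal[max(0, i - factor + 1): i + 1]  # trailing average
        smoothed.append(sum(window) / len(window))
    return smoothed[::factor]
```

Reducing the sample count by the decimation factor is what shrinks the correlation matrices ESPRIT-TLS works on, and hence the computation time.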
Abstract: The Blue Nile Basin is the most important tributary of
the Nile River. Egypt and Sudan are almost entirely dependent on
water originating from the Blue Nile. This multi-dependency creates
conflicts among the three countries Egypt, Sudan, and Ethiopia,
making the management of these conflicts an international issue.
A good assessment of the water resources of the Blue Nile is
important to help manage such conflicts, and hydrological models are
a good tool for such assessment. This paper presents a critical
review of the nature and variability of the climate and hydrology of
the Blue Nile Basin as a first step towards using hydrological
modeling to assess the water resources of the Blue Nile. Several
attempts have been made to develop basin-scale hydrological models of
the Blue Nile. Lumped and semi-distributed models used averages of
meteorological inputs and watershed characteristics in hydrological
simulation to analyze runoff for flood control and water resource
management. Distributed models include the temporal and spatial
variability of catchment conditions and meteorological inputs,
allowing better representation of the hydrological process. The main
challenge for all the models used to assess the water resources of
the basin is the shortage of the data needed for model calibration
and validation. It is recommended to use distributed models, for
their higher accuracy, to cope with the great variability and
complexity of the Blue Nile basin, and to collect sufficient data to
enable more sophisticated and accurate hydrological modeling.