Abstract: Health and Social Care (HSc) services planning and scheduling face unprecedented challenges due to pandemic pressure, and also suffer from unplanned spending aggravated by the global financial crisis. Data-driven approaches can help improve policies and plan and design service provision schedules, using algorithms that assist healthcare managers in meeting unexpected demands with fewer resources. This paper discusses services packing using statistical significance tests and machine learning (ML) to evaluate demand similarity and coupling. This is achieved by predicting the range (class) of a demand using ML methods such as Classification and Regression Trees (CART), Random Forests (RF), and Logistic Regression (LGR). The Chi-Squared and Student's t significance tests are applied to data spanning 39 years of services delivered in Scotland. The demands are associated using probabilities and form parts of statistical hypotheses whose null component assumes that the target demand is statistically dependent on the demands of other services; this linking is checked against the data. In addition, ML methods are used to predict the target demands linearly from the statistically identified associations and to extend the linear dependence of the target demand to independent demands, thus forming groups of services. The statistical tests confirmed the ML coupling, made the prediction statistically meaningful, and showed that a target service can be matched reliably to other services, while ML showed that such relationships can also be linear. Zero padding was used for years with missing records and better illustrated such relationships, both for limited periods and for the entire span, offering long-term data visualizations; limited-year periods explained how well patient numbers can be related over short periods of time, or how they can change over time as opposed to behaviours across more years.
The prediction performance of the associations was measured using metrics such as the Receiver Operating Characteristic (ROC), the Area Under the Curve (AUC) and Accuracy (ACC), as well as the Chi-Squared and Student's t statistical tests. Co-plots and comparison tables for the RF, CART, and LGR methods, together with the p-values from the tests and Information Exchange (IE/MIE) measures, are provided, showing the relative performance of the ML methods and of the statistical tests, as well as their behaviour under different learning ratios. The impact of initial groupings by k-nearest neighbours (k-NN) classification, Cross-Correlation (CC) and C-Means (CM) was also studied, both over limited years and over the entire span. CART was generally behind RF and LGR, but in some interesting cases LGR reached an AUC of 0, falling below CART, while the ACC was as high as 0.912, showing that ML methods can be confused by zero padding, data irregularities, or outliers. On average, three linear predictors were sufficient; LGR was found to compete well with RF, and CART followed with the same performance at higher learning ratios. Services were packed only when the significance level (p-value) of their association coefficient exceeded 0.05. Social-factor relationships were observed between home care services and the treatment of old people, low birth weights, alcoholism, drug abuse, and emergency admissions. The work found that different HSc services can be packed well as plans of limited duration, across various service sectors and learning configurations, as confirmed by statistical hypotheses.
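The chi-squared linking of a target service's demand to another service's demand can be sketched as follows; the synthetic Poisson series, the median-split into high/low demand classes, and the 2×2 table are illustrative assumptions, not the paper's data or code:

```python
# Illustrative sketch (not the paper's code): chi-squared test of
# association between a target service's demand class and another
# service's demand class over a 39-year span. Data are invented.
import numpy as np

rng = np.random.default_rng(0)
other = rng.poisson(100, size=39)          # 39 yearly demand counts
target = other + rng.poisson(5, size=39)   # correlated target demand

# Bin each series into low/high classes around its median.
t_cls = (target > np.median(target)).astype(int)
o_cls = (other > np.median(other)).astype(int)

# 2x2 contingency table of observed class co-occurrences.
obs = np.zeros((2, 2))
for t, o in zip(t_cls, o_cls):
    obs[t, o] += 1

# Pearson chi-squared statistic against the independence expectation.
row = obs.sum(axis=1, keepdims=True)
col = obs.sum(axis=0, keepdims=True)
exp = row @ col / obs.sum()
chi2 = ((obs - exp) ** 2 / exp).sum()

# df = (2-1)*(2-1) = 1; the 5% critical value is 3.841.
print(chi2, chi2 > 3.841)
```

A large statistic relative to the 5% critical value indicates that the two demand classes co-vary, which is the kind of association the packing procedure relies on.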
Abstract: This paper considers the modelling of a non-stationary
bivariate integer-valued autoregressive moving average of order
one (BINARMA(1,1)) with correlated Poisson innovations. The
BINARMA(1,1) model is specified using the binomial thinning
operator and by assuming that the cross-correlation between the
two series is induced by the innovation terms only. Based on
these assumptions, the non-stationary marginal and joint moments
of the BINARMA(1,1) are derived iteratively by using some initial
stationary moments. As regards the estimation of the parameters of
the proposed model, the conditional maximum likelihood (CML)
estimation method is derived based on thinning and convolution
properties. The forecasting equations of the BINARMA(1,1) model
are also derived. A simulation study is conducted in which
BINARMA(1,1) count data are generated using a multivariate
Poisson R code for the innovation terms. The performance of
the BINARMA(1,1) model is then assessed through a simulation
experiment and the mean estimates of the model parameters obtained
are all efficient, based on their standard errors. The proposed model
is then used to analyse real-life accident data on the motorway in
Mauritius, based on some covariates: policemen, daily patrol, speed
cameras, traffic lights and roundabouts. The BINARMA(1,1) model
is applied on the accident data and the CML estimates clearly indicate
a significant impact of the covariates on the number of accidents on
the motorway in Mauritius. The forecasting equations also provide
reliable one-step ahead forecasts.
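The binomial thinning operator on which the BINARMA(1,1) specification rests can be illustrated with a minimal univariate INAR(1) sketch (the bivariate moving-average structure and correlated innovations are omitted); the parameter values and the use of NumPy are assumptions for illustration:

```python
# Minimal sketch of binomial thinning a∘X = Binomial(X, a) and the
# univariate INAR(1) recursion X_t = a∘X_{t-1} + e_t with Poisson
# innovations. Illustrative only; the paper's model is bivariate.
import numpy as np

rng = np.random.default_rng(1)

def thin(x, a):
    """Binomial thinning a∘x: keep each of the x counts with prob a."""
    return rng.binomial(x, a)

def simulate_inar1(a, lam, n):
    """Generate n counts from an INAR(1) with thinning probability a
    and Poisson(lam) innovations."""
    x = np.empty(n, dtype=int)
    x[0] = rng.poisson(lam / (1 - a))      # start near stationary mean
    for t in range(1, n):
        x[t] = thin(x[t - 1], a) + rng.poisson(lam)
    return x

series = simulate_inar1(a=0.4, lam=3.0, n=500)
print(series.mean())   # stationary mean is lam/(1-a) = 5
```

The thinning operator guarantees integer-valued outcomes, which is why it replaces scalar multiplication in count-data analogues of ARMA models.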
Abstract: In this paper, we report experimental results on using complementary Golay coded signals at 7.5 MHz to detect breast microcalcifications of 50 µm size. Simulations using complementary Golay coded signals show perfect consistency with the experimental results, confirming the improved signal-to-noise ratio of complementary Golay coded signals. To improve the success in detecting the microcalcifications, orthogonal complementary Golay sequences, whose cross-correlation is kept low for minimum interference, are used as coded signals and compared with a tone-burst pulse of equal energy in terms of resolution under weak signal conditions. The measurements are conducted using an experimental ultrasound research scanner, the Digital Phased Array System (DiPhAS), which has 256 channels, and a phased-array transducer with a 7.5 MHz center frequency; the results obtained through experiments are validated with the Field-II simulation software. In addition, to investigate the superiority of coded signals in terms of resolution, a multipurpose tissue-equivalent phantom containing a series of monofilament nylon targets, 240 µm in diameter, and cyst-like objects with an attenuation of 0.5 dB/(MHz·cm) is used in the experiments. We obtained ultrasound images of the monofilament nylon targets for the evaluation of resolution. Simulation and experimental results show that it is possible to differentiate closely positioned small targets with increased success by using coded excitation under very weak signal conditions.
Abstract: A sparse-representation speech denoising method based on an adapted stopping residue error is presented in this paper. First, the cross-correlation between the clean speech spectrum and the noise spectrum is analyzed, and an estimation method is proposed. In the denoising method, an over-complete dictionary of the clean speech power spectrum is learned with the K-singular value decomposition (K-SVD) algorithm. In the sparse representation stage, the stopping residue error is adaptively set according to the estimated cross-correlation and the adjusted noise spectrum, and the orthogonal matching pursuit (OMP) approach is applied to reconstruct the clean speech spectrum from the noisy speech. Finally, the clean speech is re-synthesised via the inverse Fourier transform using the reconstructed speech spectrum and the noisy speech phase. The experimental results show that the proposed method outperforms conventional methods in terms of both subjective and objective measures.
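The OMP reconstruction step with a stopping residue error might look like the following minimal sketch; the random Gaussian dictionary and synthetic 2-sparse signal are stand-ins for the learned K-SVD dictionary and speech spectra:

```python
# Hedged sketch of orthogonal matching pursuit (OMP) with a stopping
# residue error. Illustrative only; not the paper's implementation.
import numpy as np

def omp(D, y, stop_err):
    """Greedy OMP: pick the atom most correlated with the residual,
    re-fit by least squares, stop when ||residual|| <= stop_err."""
    n_atoms = D.shape[1]
    support, r = [], y.copy()
    x = np.zeros(n_atoms)
    while np.linalg.norm(r) > stop_err and len(support) < n_atoms:
        support.append(int(np.argmax(np.abs(D.T @ r))))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        r = y - D[:, support] @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(2)
D = rng.normal(size=(32, 64))
D /= np.linalg.norm(D, axis=0)             # unit-norm atoms
true_x = np.zeros(64)
true_x[[3, 17]] = [2.0, -1.5]              # 2-sparse ground truth
y = D @ true_x
x_hat = omp(D, y, stop_err=1e-8)
print(np.linalg.norm(D @ x_hat - y))       # residual near zero
```

The adaptive part described in the abstract amounts to choosing `stop_err` per frame from the estimated cross-correlation and noise spectrum rather than fixing it globally.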
Abstract: In this research, the occurrences of large-size events in various system sizes of the Bak-Tang-Wiesenfeld sandpile model are considered. The system sizes (square lattices) considered here are 25×25, 50×50, 75×75 and 100×100. The cross-correlation between the time series of the ratio of sites containing 3 grains and the time series of large-size events is analyzed for these 4 system sizes. Moreover, a method for predicting large-size events in the 50×50 system is introduced. Lastly, this prediction method is shown to provide slightly higher efficiency than random predictions.
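A toy version of the Bak-Tang-Wiesenfeld toppling rule on one of the lattice sizes above can be sketched as follows (illustrative only, not the paper's implementation; the avalanche trigger and random initial state are assumptions):

```python
# Sketch of one avalanche in the Bak-Tang-Wiesenfeld sandpile on a
# 25x25 lattice: a site topples when it holds 4 or more grains,
# shedding one grain to each of its four neighbours; grains pushed
# off the edge are lost (open boundaries).
import numpy as np

def relax(grid):
    """Topple until every site holds at most 3 grains.
    Returns the avalanche size (number of topplings)."""
    size = 0
    while (grid >= 4).any():
        for i, j in zip(*np.where(grid >= 4)):
            grid[i, j] -= 4
            size += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < grid.shape[0] and 0 <= nj < grid.shape[1]:
                    grid[ni, nj] += 1
    return size

rng = np.random.default_rng(3)
grid = rng.integers(0, 4, size=(25, 25))   # stable start: 0..3 grains
grid[12, 12] += 4                          # drop grains to trigger
aval = relax(grid)
print(aval)
```

The "ratio of sites containing 3 grains" studied in the abstract is then simply `(grid == 3).mean()` sampled after each relaxation.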
Abstract: This paper presents a SAC-OCDMA code with the zero cross-correlation property, named the New Zero Cross-Correlation (NZCC) code, to minimize Multiple Access Interference (MAI); it is found to be more scalable than other existing SAC-OCDMA codes. The NZCC code is constructed using an address segment and a data segment. In this work, the proposed NZCC code is implemented in an optical system using the OptiSystem software for the spectral-amplitude-coded optical code-division multiple-access (SAC-OCDMA) scheme. The main contribution of the proposed NZCC code is its zero cross-correlation, which reduces both MAI and PIIN noise. The proposed NZCC code offers minimum cross-correlation, flexibility in selecting the code parameters and support for a large number of users, combined with a high data rate and longer fiber length. Simulation results reveal that the optical code-division multiple-access system based on the proposed NZCC code accommodates the maximum number of simultaneous users with higher data-rate transmission, lower Bit Error Rates (BER) and longer travelling distance without signal-quality degradation, compared to the existing SAC-OCDMA codes.
Abstract: Information security plays a major role in raising the standard of secure communications over global media. In this paper, we suggest a technique of encryption followed by insertion before transmission, implementing two different concepts to carry out these tasks. We use a two-point crossover technique from the genetic algorithm to facilitate the encryption process. For each uniquely identified row of pixels, different mathematical methodologies are applied under several checked conditions in order to determine the parent pixels on which we perform the crossover operation. This is done by selecting two crossover points within the pixels, thereby producing newly encrypted child pixels and hence the encrypted cover image. In the next stage, the first- and second-order derivative operators are evaluated to increase security and robustness. The final stage reapplies the crossover procedure to form the final stego-image. The complexity of this system as a whole is large, thereby deterring third-party interference. The embedding capacity is also very high, so a larger amount of secret image information can be hidden. The imperceptibility of the obtained stego-image clearly demonstrates the proficiency of this approach.
Abstract: A Wireless Body Area Network (WBAN) is a short-range wireless communication technology operating around the human body for various applications such as wearable devices, entertainment, military, and especially medical devices. WBANs have attracted attention for continuous health monitoring systems, including diagnostic procedures, early detection of abnormal conditions, and prevention of emergency situations. Compared with cellular networks, inter- and intra-cell interference is more difficult to control in WBAN systems because of limited power, limited computational capability, patient mobility, and non-cooperation among WBANs.
In this paper, we compare the performance of resource allocation schemes based on several Pseudo Orthogonal Codewords (POCs) in mitigating inter-WBAN interference. POCs have previously been widely exploited for protocol sequences and optical orthogonal codes. Each POC has different auto- and cross-correlation properties and spectral efficiency, depending on its construction. To identify different WBANs, several different pseudo-orthogonal patterns based on POCs are exploited for WBAN resource allocation. By simulating these pseudo-orthogonal resource allocations in MATLAB, we obtain the performance of WBANs under the different POCs and analyze and evaluate the suitability of each POC for resource allocation in WBAN systems.
Abstract: The social logic of the 'Sequina' slum area in Alexandria details the integration measure of space syntax at the room level for twenty building samples. The essence of the spatial structure integrates the central 'visitor' domain with the 'living' frontage of the 'children' zone against the segregated privacy of the opposite 'parent' depth. Meanwhile, the multifunctioning of shallow rooms optimizes the integrated 'visitor' structure through graph and visibility dimensions, in contrast to the 'inhabitant' structure of graph-tails out of sight. A common theme is that layout integrity increases in compensation for the decrease in room visibility. Despite the 'pheno-type' of collective integration, the individual layouts observe a 'geno-type' structure of spatial diversity across room adjacencies. In this regard, the layout integrity alternates the cross-correlation of the 'kitchen & living' rooms with the 'inhabitant & visitor' domains of the 'motherhood' dynamic structure. Moreover, the added 'grandparent' restructures the integration measure to become the deepest space, yet opens to the 'living' of 'household' integrity. Some isomorphic layouts change the integrated structure merely through the 'balcony' extension of access and the visual or ignored 'ringiness' of space syntax. However, the most integrated or segregated layouts invert the 'geno-type' into a shallow 'inhabitant' centrality versus a remote 'visitor' structure. An overview of the multivariate social logic of spatial integrity could never be clarified without micro-data analysis.
Abstract: Image compression based on fractal coding is a lossy compression method, normally used for gray-level images with range and domain blocks of rectangular shape. Fractal-based digital image compression techniques provide a large compression ratio, and in this paper a method is proposed using the YUV colour space and fractal theory based on iterated transformations. Fractal geometry is applied in the current study mainly towards colour image compression coding. Colour images possess correlations among their colour components, and hence a high compression ratio can be achieved by exploiting these redundancies. The proposed method utilises the self-similarity within the colour image as well as the cross-correlations between the colour components. Experimental results show that a greater compression ratio can be achieved with large domain blocks, with a trade-off in image quality that remains good to acceptable at less than 1 bit per pixel.
Abstract: In this study, we propose a novel technique for acoustic
echo suppression (AES) during speech recognition under barge-in
conditions. Conventional AES methods based on spectral subtraction
apply fixed weights to the estimated echo path transfer function
(EPTF) at the current signal segment and to the EPTF estimated until
the previous time interval. However, the effects of echo path changes
should be considered for eliminating the undesired echoes. We
describe a new approach that adaptively updates weight parameters in
response to abrupt changes in the acoustic environment due to
background noises or double-talk. Furthermore, we devised a voice
activity detector and an initial time-delay estimator for barge-in speech
recognition in communication networks. The initial time delay is
estimated using a log-spectral distance measure, as well as
cross-correlation coefficients. The experimental results show that the
developed techniques can be successfully applied in barge-in speech
recognition systems.
Abstract: In this work, a method of time delay estimation for
dual-channel acoustic signals (speech, music, etc.) recorded under
reverberant conditions is investigated. Standard methods based on
cross-correlation of the signals show poor results in cases involving
strong reverberation, large distances between microphones and
asynchronous recordings. Under similar conditions, a method based
on cross-correlation of temporal envelopes of the signals delivers a
delay estimation of acceptable quality. This method and its properties
are described and investigated in detail, including its limits of
applicability. The method’s optimal parameter estimation and a
comparison with other known methods of time delay estimation are
also provided.
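The envelope-based estimator described above can be sketched roughly as follows; the moving-average envelope, its window length, and the synthetic delayed burst are assumptions for illustration, not the paper's method in detail:

```python
# Minimal sketch: cross-correlate smoothed amplitude envelopes of the
# two channels and take the lag of the correlation peak as the delay.
# Signals here are synthetic; real recordings add reverberation.
import numpy as np

def envelope(x, win=64):
    """Crude temporal envelope: moving average of |x|."""
    return np.convolve(np.abs(x), np.ones(win) / win, mode="same")

def delay_by_envelope(a, b):
    """Estimate the delay of channel b relative to a, in samples."""
    ea = envelope(a) - envelope(a).mean()
    eb = envelope(b) - envelope(b).mean()
    xc = np.correlate(eb, ea, mode="full")
    return int(np.argmax(xc)) - (len(ea) - 1)

rng = np.random.default_rng(4)
burst = rng.normal(size=2000) * np.hanning(2000)  # modulated noise
a = np.zeros(6000); a[1000:3000] = burst
b = np.zeros(6000); b[1250:3250] = burst          # 250-sample delay
print(delay_by_envelope(a, b))
```

Correlating envelopes rather than waveforms discards the fine structure that reverberation and asynchronous sampling corrupt, which is why the method survives conditions where waveform cross-correlation fails.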
Abstract: There is growing interest in the use of ultrasonic speckle tracking for biomedical image formation of tissue deformation. Speckle tracking is angle independent and can differentiate soft tissue into benign and malignant regions. In this paper a simulation model for dynamic ultrasound scatterers is presented. The model combines Field-II ultrasonic scatterers with FEM (ANSYS-11) nodes to represent regional tissue deformation. A performance evaluation is presented for the estimation of the axial displacement and strain fields of a uniformly elastic model, using speckle tracking based on 1D cross-correlation of optimally segmented pre- and post-deformation frames. The optimum correlation window length is investigated in terms of the highest signal-to-noise ratio (SNR) for a selected region of interest of a smoothed displacement field. Finally, the gradient-based strain fields of the smoothed and non-smoothed displacement fields are compared. Simulation results from the model are shown to compare favorably with FEM results.
Abstract: This paper investigates the problem of spreading-sequence and receiver-code synchronization techniques for satellite-based CDMA communications systems. The performance of a CDMA system depends on the autocorrelation and cross-correlation properties of the spreading sequences used. In this paper we propose the use of the chaotic Lu system to generate binary sequences as spreading codes in a direct-sequence spread-spectrum CDMA system. To minimize multiple access interference (MAI), we propose the use of a genetic algorithm for the optimum selection of chaotic spreading sequences. To solve the problem of transmitter-receiver synchronization, we use passivity-based controls. The concept of semipassivity is defined to find simple conditions which ensure the boundedness of the solutions of coupled Lu systems. Numerical results are presented to show the effectiveness of the proposed approach.
Abstract: In this paper, a target signal detection method using
multiple signal classification (MUSIC) algorithm is proposed. The
MUSIC algorithm is a subspace-based direction of arrival (DOA)
estimation method. The algorithm detects the DOAs of multiple
sources using the inverse of the eigenvalue-weighted eigenspectra. To
apply the algorithm to target signal detection for GSC-based
beamforming, we utilize its spectral response for the target DOA in
noisy conditions. For evaluation of the algorithm, the performance of
the proposed target signal detection method is compared with that of
the normalized cross-correlation (NCC), the fixed beamforming, and
the power ratio method. Experimental results show that the proposed
algorithm significantly outperforms the conventional ones in terms of receiver operating characteristic (ROC) curves.
Abstract: In this paper we propose a comparison of four content-based objective metrics with the results of subjective tests on 80 video sequences. We also include two objective metrics, VQM and SSIM, in our comparison to serve as "reference" objective metrics, because their pros and cons have already been published. Each video sequence was preprocessed by the region recognition algorithm, and then the particular objective video quality metrics were calculated, i.e. mutual information, angular distance, moment of angle and the normalized cross-correlation measure. The Pearson coefficient was calculated to express each metric's relationship to the accuracy of the model, and the Spearman rank-order correlation coefficient to represent each metric's relationship to monotonicity. The results show that the model with mutual information as the objective metric provides the best results and is suitable for evaluating the quality of video sequences.
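The two agreement measures named above can be sketched as follows; the subjective scores and metric values are invented for illustration, and the rank computation assumes no ties:

```python
# Sketch: Pearson's r (accuracy) and Spearman's rank correlation
# (monotonicity) between an objective metric and subjective scores.
import numpy as np

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

def spearman(x, y):
    """Spearman's rho = Pearson's r computed on the ranks
    (simple double-argsort ranking; assumes no tied values)."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return pearson(rank(x), rank(y))

subjective = [1.0, 2.1, 2.9, 4.2, 4.8]   # e.g. mean opinion scores
metric = [0.11, 0.35, 0.38, 0.71, 0.90]  # e.g. mutual information
print(pearson(subjective, metric), spearman(subjective, metric))
```

Because the invented metric increases monotonically with the scores, Spearman's rho is exactly 1 here even though Pearson's r is slightly below 1; that gap is precisely what the two coefficients are meant to separate.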
Abstract: In this study, the problem of discriminating between interictal epileptic and non-epileptic pathological EEG cases, which present episodic loss of consciousness, is investigated. We verify the accuracy of the feature extraction method based on auto-cross-correlated coefficients, which was introduced and studied in a previous study. For this purpose we used, on the one hand, a suitably constructed supervised artificial LVQ1 neural network and, on the other, a cross-correlation technique. To reinforce the above verification we used a statistical procedure based on a chi-square test. The classification and statistical results showed that the proposed feature extraction is a significantly accurate method for the diagnostic discrimination between interictal and non-interictal EEG events, and specifically the classification procedure showed that the LVQ neural method is superior to the cross-correlation one.
Abstract: In this paper, a watermarking algorithm that uses the wavelet transform with the Multiple Description Coding (MDC) and Quantization Index Modulation (QIM) concepts is introduced. The paper also investigates the role of the Contourlet Transform (CT) versus the Wavelet Transform (WT) in providing robust image watermarking. Two measures are utilized in the comparison between the wavelet-based and the contourlet-based methods: Peak Signal-to-Noise Ratio (PSNR) and Normalized Cross-Correlation (NCC). Experimental results reveal that the introduced algorithm is robust against different attacks and compares well with the contourlet-based algorithm.
Abstract: We analyze the effectiveness of different pseudo-noise (PN) and orthogonal sequences for encrypting speech signals in terms of perceptual intelligibility. A speech signal can be viewed as a sequence of correlated samples, and each sample as a sequence of bits. The residual intelligibility of the speech signal can be reduced by removing the correlation among the speech samples. PN sequences have random-like properties that help in reducing the correlation among speech samples. The mean-square aperiodic auto-correlation (MSAAC) and mean-square aperiodic cross-correlation (MSACC) measures are used to test the randomness of the PN sequences. Results of the investigation show the effectiveness of large Kasami sequences for this purpose among the many PN sequences examined.
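The MSAAC measure mentioned above can be sketched as follows for a short ±1 sequence; the length-7 sequence and the normalisation by the zero-lag value are illustrative assumptions:

```python
# Sketch: mean square of the aperiodic autocorrelation sidelobes of a
# ±1 sequence, normalised by the zero-lag value (MSAAC-style measure).
import numpy as np

def msaac(seq):
    """Mean-square aperiodic autocorrelation sidelobe level."""
    s = np.asarray(seq, float)
    n = len(s)
    full = np.correlate(s, s, mode="full") / n  # normalised aperiodic ACF
    side = np.delete(full, n - 1)               # drop the zero-lag term
    return np.mean(side ** 2)

# Length-7 m-sequence mapped to ±1 (illustrative PN sequence),
# against a constant sequence for contrast.
pn = np.array([1, 1, 1, -1, -1, 1, -1])
rect = np.ones(7)
print(msaac(pn), msaac(rect))   # PN sidelobes are much smaller
```

Lower MSAAC means flatter sidelobes, i.e. a more noise-like sequence; the MSACC counterpart applies the same averaging to the cross-correlation between two different sequences.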
Abstract: The successful use of CDMA technology is based on
the construction of large families of encoding sequences with good
correlation properties. This paper discusses PN sequence generation
based on Residue Arithmetic with an effort to improve the performance
of existing interference-limited CDMA technology for mobile
cellular systems. Spreading codes based on the residue number system proposed earlier did not consider external interference, multipath propagation, the Doppler effect, etc. In the literature, the use of residue arithmetic in DS-CDMA was restricted to encoding an already spread sequence, where the spreading itself is done by existing techniques. The novelty of this paper is the use of the residue number system in the generation of the PN sequences used to spread the message signal. The significance of the cross-correlation factor in alleviating multiple-access interference is also discussed. The RNS-based PN sequences have superior performance to most of the existing codes that are widely used in DS-CDMA applications. Simulation
results suggest that the performance of the proposed system is
superior to many existing systems.
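The residue-number-system representation behind the proposed sequences can be illustrated with a minimal sketch; the moduli are hypothetical and the mapping from residues to PN chips is omitted:

```python
# Sketch: an integer represented by its residues modulo pairwise
# coprime moduli, and recovered via the Chinese Remainder Theorem.
# Moduli (3, 5, 7) are illustrative; dynamic range is 0..104.
from math import prod

MODULI = (3, 5, 7)

def to_rns(x):
    """Residue representation of x under MODULI."""
    return tuple(x % m for m in MODULI)

def from_rns(residues):
    """CRT reconstruction of x from its residues."""
    M = prod(MODULI)
    x = 0
    for r, m in zip(residues, MODULI):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # modular inverse of Mi mod m
    return x % M

print(to_rns(52), from_rns(to_rns(52)))
```

Each residue channel operates on small independent digits with no carries, which is the property such schemes exploit when generating or processing spreading sequences in parallel.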