Region-Based Image Fusion with Artificial Neural Network
Most image fusion algorithms treat the pixels of an image more or less
independently, ignoring the relationships between them. In addition,
their parameters must be re-adjusted for different times of day or
weather conditions. In this paper, we propose a region-based image
fusion method that combines aspects of feature-level and pixel-level
fusion, rather than operating on pixels alone. The basic idea is to
segment only the far-infrared image and to add the information of each
region of the segmented image to the visual image. Different fusion
parameters are then determined for each region. Finally, we adopt an
artificial neural network to handle varying time and weather
conditions, because the relationship between the fusion parameters and
the image features is nonlinear. This allows the fusion parameters to
be produced automatically for different conditions. The experimental
results show that the proposed method has good adaptive capability,
with automatically determined fusion parameters, and that the
architecture can be used in many applications.
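The pipeline described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the quantile-threshold segmentation, the two per-region features, and the fixed logistic standing in for the trained artificial neural network are all assumptions made for the sake of a runnable example.

```python
import numpy as np

def segment_ir(ir, n_levels=3):
    """Segment the far-infrared image into regions by intensity
    quantile thresholding (an illustrative stand-in for the paper's
    segmentation step). Returns an integer label map."""
    thresholds = np.quantile(ir, np.linspace(0, 1, n_levels + 1)[1:-1])
    return np.digitize(ir, thresholds)

def region_features(ir, labels, region):
    """Simple per-region features: mean and standard deviation of the
    IR intensity inside the region."""
    vals = ir[labels == region]
    return np.array([vals.mean(), vals.std()])

def fusion_weight(features):
    """Map region features to a fusion weight in (0, 1). A fixed
    logistic of the mean intensity stands in here for the trained
    artificial neural network described in the paper."""
    return 1.0 / (1.0 + np.exp(-(features[0] - 0.5) * 4.0))

def fuse(visual, ir):
    """Region-based fusion: each region of the segmented IR image is
    blended into the visual image with its own weight."""
    labels = segment_ir(ir)
    fused = visual.copy()
    for region in np.unique(labels):
        mask = labels == region
        w = fusion_weight(region_features(ir, labels, region))
        fused[mask] = (1 - w) * visual[mask] + w * ir[mask]
    return fused
```

In the paper's setting, `fusion_weight` would be replaced by the trained neural network, so that the per-region parameters adapt automatically to different time and weather conditions.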
[1] Z. Wang, D. Ziou, C. Armenakis, D. Li, and Q. Li, "A Comparative
Analysis of Image Fusion Methods," IEEE Trans. Geoscience and Remote
Sensing, vol. 43, no. 6, pp. 1391-1402, June 2005.
[2] J. G. Liu, "Smoothing filter-based intensity modulation: A spectral
preserve image fusion technique for improving spatial details," Int. J.
Remote Sensing, vol. 21, no. 18, pp. 3461-3472, 2000.
[3] M. Li, W. Cai, and Z. Tan, "A region-based multi-sensor image fusion
scheme using pulse-coupled neural network," Pattern Recognition
Letters, vol. 27, pp. 1948-1956, 2006.
[4] L. J. Guo and J. M. Moore, "Pixel block intensity modulation: adding
spatial detail to TM band 6 thermal imagery," Int. J. Remote Sensing,
vol. 19, no. 13, pp. 2477-2491, 1998.
[5] P. S. Chavez and J. A. Bowell, "Comparison of the spectral
information content of Landsat thematic mapper and SPOT for three
different sites in the Phoenix, Arizona region," Photogramm. Eng.
Remote Sensing, vol. 54, no. 12, pp. 1699-1708, 1988.
[6] A. R. Gillespie, A. B. Kahle, and R. E. Walker, "Color enhancement of
highly correlated images. II. Channel ratio and chromaticity
transformation techniques," Remote Sensing of Environment, vol. 22,
pp. 343-365, 1987.
[7] J. Sun, J. Li, and J. Li, "Multi-source remote sensing image fusion,"
Int. J. Remote Sensing, vol. 2, no. 1, pp. 323-328, Feb. 1998.
[8] W. J. Carper, T. M. Lillesand, and R. W. Kiefer, "The use of Intensity-
Hue-Saturation transformation for merging SPOT panchromatic and
multispectral image data," Photogramm. Eng. Remote Sensing, vol. 56,
no. 4, pp. 459-467, 1990.
[9] K. Edwards and P. A. Davis, "The use of Intensity-Hue-Saturation
transformation for producing color shaded-relief images,"
Photogramm. Eng. Remote Sensing, vol. 60, no. 11, pp. 1369-1374,
1994.
[10] E. M. Schetselaar, "Fusion by the IHS transform: Should we use
cylindrical or spherical coordinates?," Int. J. Remote Sensing, vol. 19,
no. 4, pp. 759-765, 1998.
[11] J. Zhou, D. L. Civco, and J. A. Silander, "A wavelet transform method
to merge Landsat TM and SPOT panchromatic data," Int. J. Remote
Sensing, vol. 19, no. 4, pp. 743-757, 1998.
[12] S. Li, J. T. Kwok, and Y. Wang, "Multifocus image fusion using artificial
neural networks," Pattern Recognition Letters, vol. 23, pp. 985-997,
2002.
[13] Q. Yuan, C. Y. Dong, and Q. Wang, "An adaptive fusion algorithm based
on ANFIS for radar/infrared system," Expert Systems with
Applications, vol. 36, pp. 111-120, 2009.
@article{"International Journal of Information, Control and Computer Sciences:56638", author = "Shuo-Li Hsu and Peng-Wei Gau and I-Lin Wu and Jyh-Horng Jeng", title = "Region-Based Image Fusion with Artificial Neural Network", abstract = "Most image fusion algorithms treat the pixels
of an image more or less independently, ignoring the relationships
between them. In addition, their parameters must be re-adjusted for
different times of day or weather conditions. In this paper, we propose
a region-based image fusion method that combines aspects of
feature-level and pixel-level fusion, rather than operating on pixels
alone. The basic idea is to segment only the far-infrared image and to
add the information of each region of the segmented image to the
visual image. Different fusion parameters are then determined for each
region. Finally, we adopt an artificial neural network to handle
varying time and weather conditions, because the relationship between
the fusion parameters and the image features is nonlinear. This allows
the fusion parameters to be produced automatically for different
conditions. The experimental results show that the proposed method has
good adaptive capability, with automatically determined fusion
parameters, and that the architecture can be used in many
applications.", keywords = "Image fusion, Region-based fusion, Segmentation,
Neural network, Multi-sensor.", volume = "3", number = "5", pages = "1326-4", }