Computer Science ›› 2009, Vol. 36 ›› Issue (10): 268-273.

• Graphics, Image and Architecture •

Fast Image Segmentation Based on NAMPD

WU Xue-li, CHEN Chuan-bo, XIA Hui   

  1. (School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan 430074, China)
  • Online: 2018-11-16 Published: 2018-11-16
  • Funding:
    This work was supported by the National Natural Science Foundation of China (60873031).



Abstract: Drawing on the concept of the packing problem, the Non-symmetry and Anti-packing pattern representation Model (NAM) uses a set of sub-patterns to represent an original pattern. Based on NAM, this paper develops a new method for grey-scale image representation, called NAM-structured Plane Decomposition (NAMPD). In NAMPD, each sub-pattern is associated with a rectangular region of the image, and the luminance function over that region is approximated by an oblique plane model. Image segmentation is a key operation in image analysis, but most traditional segmentation algorithms work on the pixel-matrix representation and are therefore computationally inefficient. This paper proposes a fast algorithm for segmenting grey-scale images based on NAMPD. Because NAMPD takes image blocks rather than individual pixels as its smallest unit of operation, image processing based on NAMPD is more efficient. Experimental results show that the NAMPD-based segmentation algorithm runs faster than the classical ones.
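The paper's anti-packing decomposition algorithm is not reproduced on this page, but the core idea of the oblique plane model — approximating the luminance of a rectangular sub-pattern by a plane z = a·x + b·y + c fitted in the least-squares sense — can be sketched as follows. This is a minimal illustration in Python with NumPy; the function name `fit_plane` and the error measure are our assumptions, not identifiers from the paper.

```python
import numpy as np

def fit_plane(block):
    """Least-squares fit of an oblique plane z = a*x + b*y + c to the
    luminance values of a rectangular image block.

    Returns the plane coefficients (a, b, c) and the maximum absolute
    approximation error over the block."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Design matrix: one row [x, y, 1] per pixel.
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    coeffs, *_ = np.linalg.lstsq(A, block.ravel().astype(float), rcond=None)
    approx = (A @ coeffs).reshape(h, w)
    err = np.abs(block - approx).max()
    return coeffs, err

# A synthetic block whose luminance is exactly planar: 2*x + 3*y + 10.
ys, xs = np.mgrid[0:8, 0:8]
block = 2 * xs + 3 * ys + 10
coeffs, err = fit_plane(block)
```

In a NAMPD-style decomposition, a block whose fitting error stays below a tolerance would be stored as a single sub-pattern (coordinates plus three plane coefficients); blocks exceeding the tolerance would be split further, which is what lets later operations work on blocks instead of pixels.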

Key words: Non-symmetry and anti-packing pattern representation model (NAM), Packing problem, Image representation, Image segmentation
