[Image Algorithms] Color Image Segmentation, Part 5: Extracting Specific Colors from a Color Image
SkySeraph, Jun 8th 2011, HQU
Email: zgzhaobo@gmail.com    QQ: 452728574
Latest Modified Date: Jun 8th 2011, HQU
一 Principle and Notes:
1  The RGB (red, green, blue) model is a device-dependent color space whose most common use is in display systems. In RGB the components are strongly correlated and every channel carries luminance information, so the representation is easily disturbed by the surroundings (illumination and so on); it does not match the way the human eye perceives color very well and is therefore not well suited to analyzing and segmenting color images. The HSV space, by contrast, starts from the human visual system and is better suited to image analysis. For more on the various color space models see http://www.cnblogs.com/skyseraph/archive/2011/05/03/2035643.html
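The code in section 二 relies on a helper pMyColorSpace.RGB2HSV that is not listed in this post. A minimal sketch of the standard conversion is given below, under the assumption (made to match the call site, where H is later rescaled by 360/(2*PI)) that H is returned in radians and S, V in [0, 1]; the free-standing function name is hypothetical.

// Minimal sketch of the usual RGB->HSV conversion (assumed implementation;
// the post does not list pMyColorSpace.RGB2HSV itself).
// Inputs R, G, B are in [0, 255]; outputs: H in radians, S and V in [0, 1],
// matching how the result is used in the code of section 二.
void RGB2HSV(double R, double G, double B, double &H, double &S, double &V)
{
    const double PI = 3.14159265358979;
    double r = R/255.0, g = G/255.0, b = B/255.0;
    double maxc  = r > g ? (r > b ? r : b) : (g > b ? g : b);
    double minc  = r < g ? (r < b ? r : b) : (g < b ? g : b);
    double delta = maxc - minc;

    V = maxc;                                 // value = largest component
    S = (maxc > 0.0) ? delta/maxc : 0.0;      // saturation

    double hDeg = 0.0;                        // hue in degrees
    if (delta > 0.0)
    {
        if (maxc == r)      hDeg = 60.0*(g - b)/delta;
        else if (maxc == g) hDeg = 60.0*(b - r)/delta + 120.0;
        else                hDeg = 60.0*(r - g)/delta + 240.0;
        if (hDeg < 0.0)     hDeg += 360.0;
    }
    H = hDeg*2.0*PI/360.0;                    // convert degrees to radians
}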
2  In many domestic papers on license plate recognition, whenever color information is used the plate is generally segmented in the HSV/YIQ/Lab model according to the known plate colors (common schemes: black characters on a white plate, white on black, white on blue, black on yellow, and so on). The color extraction method they use is the one described in this article. This method only works for extracting a predefined color; in pattern recognition terms it resembles "supervised learning". The opposite, unsupervised case, i.e. segmenting an arbitrary image by color, belongs to the field of color segmentation proper. (A minimal sketch of this kind of predefined-color extraction is given below.)
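As a small illustration of this predefined-color ("supervised") style of extraction, the sketch below keeps only the pixels whose HSV values fall inside a hand-picked window for the blue of a blue plate, using OpenCV's built-in conversion. The function name and the numeric thresholds are assumptions for illustration only; they are not taken from the papers cited in the next point.

// Hypothetical sketch: keep pixels that look like plate blue, everything else -> 0.
// OpenCV's 8-bit HSV convention: H in [0, 180], S and V in [0, 255].
// The window 100 <= H <= 130, S >= 80, V >= 60 is an assumed, rough choice.
IplImage* ExtractBlueRegion(IplImage* src)    // src: 8-bit, 3-channel BGR image
{
    IplImage* hsv  = cvCreateImage(cvGetSize(src), IPL_DEPTH_8U, 3);
    IplImage* mask = cvCreateImage(cvGetSize(src), IPL_DEPTH_8U, 1);
    cvCvtColor(src, hsv, CV_BGR2HSV);         // BGR -> HSV
    cvInRangeS(hsv, cvScalar(100, 80, 60, 0), cvScalar(130, 255, 255, 0), mask);
    cvReleaseImage(&hsv);
    return mask;                              // 255 where the pixel is "plate blue"
}

The binary mask could then be passed to cvFindContours to locate the plate rectangle, much like the supplement in section 四 does with its RGB masks.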
3  On how to partition the HSV ranges:
<1> Paper: Car color recognition from CCTV camera image: http://www.docin.com/p-211572110.html
The author adopts the following scheme:
<2> Paper: 利用支持向量机识别汽车颜色 (Recognizing vehicle color with a support vector machine): http://www.cnki.com.cn/Article/CJFDTotal-JSJF200405018.htm
The author first separates 16 color classes in the Lab space and then decomposes the sample space in HSV, as follows:
<3> Based on experiments, this article adopts the partition given in the source code below; with this partition the test results are fairly good. (A compact sketch of the decision rule follows this list.)
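For reference, the decision rule behind that partition can be written out as a small stand-alone function. ClassifyHSV is a hypothetical helper, but the thresholds and the returned gray labels are taken directly from the source code in section 二 (the original code uses strict inequalities at V = 0.35 and V = 0.75 and leaves those boundary pixels unassigned; here they are folded into the gray class).

// Hypothetical summary of the HSV partition used by ColorSegByHSV below.
// H is in degrees [0, 360), S and V in [0, 1]; the return value is the gray
// label written to the grayscale output image in section 二.
int ClassifyHSV(double H, double S, double V)
{
    if (V < 0.35)                 return 0;    // black
    if (S < 0.15 && V > 0.75)     return 255;  // white
    if (S < 0.15)                 return 128;  // gray (0.35 <= V <= 0.75)
    // chromatic colors: V >= 0.35 and S >= 0.15
    if (H < 15.0 || H >= 340.0)   return 40;   // red
    if (H < 75.0)                 return 80;   // yellow
    if (H < 150.0)                return 120;  // green
    if (H < 185.0)                return 160;  // cyan
    if (H < 270.0)                return 200;  // blue
    if (H < 340.0)                return 220;  // magenta
    return 180;                                // fallback: purple (matches the final else below)
}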
二 Source Code:
/*================================================================
// Note:    Color segmentation -- extract specific colors
// Version: 5/11/2011  skyseraph  zgzhaobo@gmail.com
================================================================*/
void CColorSegDlg::ColorSegByHSV(IplImage* img)   // extract specific colors
{
    //====================== variables ======================
    int x, y;                                     // loop counters

    //====================== input color image info ======================
    IplImage* pSrc = NULL;
    pSrc = cvCreateImage(cvGetSize(img), img->depth, img->nChannels);
    cvCopyImage(img, pSrc);
    int width    = pSrc->width;                   // image width
    int height   = pSrc->height;                  // image height
    int depth    = pSrc->depth;                   // bit depth (IPL_DEPTH_8U ...)
    int channels = pSrc->nChannels;               // number of channels (1, 2, 3, 4)
    int imgSize  = pSrc->imageSize;               // image size: imageSize = height*widthStep
    int step     = pSrc->widthStep/sizeof(uchar); // bytes between the same column of adjacent rows;
                                                  // note widthStep != width*nChannels (rows are zero-padded)
    uchar* data  = (uchar*)pSrc->imageData;
    int imageLen = width*height;

    double B = 0.0, G = 0.0, R = 0.0, H = 0.0, S = 0.0, V = 0.0;
    IplImage* dstColorSegByColor     = cvCreateImage(cvGetSize(pSrc), IPL_DEPTH_8U, 3);
    IplImage* dstColorSegByColorGray = cvCreateImage(cvGetSize(pSrc), IPL_DEPTH_8U, 1);
    //CvFont font = cvFont(1, 1);

    for (y = 0; y < height; y++)
    {
        uchar* pSrcRow  = (uchar*)(pSrc->imageData + y*pSrc->widthStep);
        uchar* pDstRow  = (uchar*)(dstColorSegByColor->imageData + y*dstColorSegByColor->widthStep);
        uchar* pGrayRow = (uchar*)(dstColorSegByColorGray->imageData + y*dstColorSegByColorGray->widthStep);

        for (x = 0; x < width; x++)
        {
            // read BGR values
            B = pSrcRow[x*pSrc->nChannels];
            G = pSrcRow[x*pSrc->nChannels + 1];
            R = pSrcRow[x*pSrc->nChannels + 2];

            // RGB -> HSV (H comes back in radians, S and V in [0,1])
            pMyColorSpace.RGB2HSV(R, G, B, H, S, V);
            H = (360*H)/(2*PI);                   // convert H to degrees [0, 360)

            // ---------- achromatic colors ----------
            if (V < 0.35)                          // black
            {
                pGrayRow[x]  = 0;                  // gray label
                pDstRow[3*x] = 0;  pDstRow[3*x+1] = 0;  pDstRow[3*x+2] = 0;        // B, G, R
            }
            if (S < 0.15 && V > 0.75)              // white
            {
                pGrayRow[x]  = 255;
                pDstRow[3*x] = 255;  pDstRow[3*x+1] = 255;  pDstRow[3*x+2] = 255;
            }
            if (S < 0.15 && 0.35 < V && V < 0.75)  // gray
            {
                pGrayRow[x]  = 128;
                pDstRow[3*x] = 128;  pDstRow[3*x+1] = 128;  pDstRow[3*x+2] = 128;
            }

            // ---------- chromatic colors ----------
            if (V >= 0.35 && S >= 0.15)
            {
                if ((H >= 0 && H < 15) || (H >= 340 && H < 360))   // near red
                {
                    pGrayRow[x]  = 40;
                    pDstRow[3*x] = 0;    pDstRow[3*x+1] = 0;    pDstRow[3*x+2] = 255;
                }
                else if (H >= 15 && H < 75)                        // near yellow
                {
                    pGrayRow[x]  = 80;
                    pDstRow[3*x] = 0;    pDstRow[3*x+1] = 255;  pDstRow[3*x+2] = 255;
                }
                else if (H >= 75 && H < 150)                       // near green
                {
                    pGrayRow[x]  = 120;
                    pDstRow[3*x] = 0;    pDstRow[3*x+1] = 255;  pDstRow[3*x+2] = 0;
                }
                else if (H >= 150 && H < 185)                      // near cyan
                {
                    pGrayRow[x]  = 160;
                    pDstRow[3*x] = 255;  pDstRow[3*x+1] = 255;  pDstRow[3*x+2] = 0;
                }
                else if (H >= 185 && H < 270)                      // near blue
                {
                    pGrayRow[x]  = 200;
                    pDstRow[3*x] = 255;  pDstRow[3*x+1] = 0;    pDstRow[3*x+2] = 0;
                }
                else if (H >= 270 && H < 340)                      // magenta: 270-340
                {
                    pGrayRow[x]  = 220;
                    pDstRow[3*x] = 255;  pDstRow[3*x+1] = 0;    pDstRow[3*x+2] = 255;
                }
                else                                               // everything else -> purple
                {
                    pGrayRow[x]  = 180;
                    pDstRow[3*x] = 128;  pDstRow[3*x+1] = 0;    pDstRow[3*x+2] = 128;
                }
            }
        }
    }

    //cvNamedWindow("src", 1);
    //cvShowImage("src", pSrc);
    cvNamedWindow("dstColorSegByColor", 1);
    cvShowImage("dstColorSegByColor", dstColorSegByColor);
    cvNamedWindow("dstColorSegByColorGray", 1);
    cvShowImage("dstColorSegByColorGray", dstColorSegByColorGray);
    cvSaveImage(".\\dstColorSegByColor.jpg", dstColorSegByColor);
    cvSaveImage(".\\dstColorSegByColorGray.jpg", dstColorSegByColorGray);
    cvWaitKey(0);
    cvDestroyAllWindows();
    cvReleaseImage(&pSrc);
    cvReleaseImage(&dstColorSegByColor);
    cvReleaseImage(&dstColorSegByColorGray);
}

三 Results:
(1) Original image
(2) Color image after color segmentation
(3) Grayscale image after color segmentation (each class shown with a different gray level)
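For reference, the gray levels written to the grayscale output by the code above are: black = 0, white = 255, gray = 128, red = 40, yellow = 80, green = 120, cyan = 160, blue = 200, magenta = 220, fallback (purple) = 180.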
四 Supplement (RGB mode, collected from the web)
1 Source Code
void CFindRGBDlg::OnFind()
{
    int color = m_colorList.GetCurSel();        // 0 = red, 1 = green, 2 = blue
    pic = cvCreateImage(cvSize(image->width, image->height), 8, 1);
    cvZero(pic);

    for (int x = 0; x < image->height; x++)     // row index
    {
        for (int y = 0; y < image->width; y++)  // column index
        {
            // IplImage stores pixels in B, G, R order:
            // ptrImg[0] = B, ptrImg[1] = G, ptrImg[2] = R
            uchar* ptrImg = &CV_IMAGE_ELEM(image, uchar, x, y*3);
            // uchar* ptrPic = &((uchar*)(pic->imageData + pic->widthStep*y))[x];

            if (color == 0)        // red: R much larger than G and B
            {
                if ((ptrImg[2] - ptrImg[1]) > 200 && (ptrImg[2] - ptrImg[0]) > 200)
                    CV_IMAGE_ELEM(pic, uchar, x, y) = 255;
            }
            else if (color == 1)   // green: G much larger than B and R
            {
                if ((ptrImg[1] - ptrImg[0]) > 200 && (ptrImg[1] - ptrImg[2]) > 200)
                    CV_IMAGE_ELEM(pic, uchar, x, y) = 255;
            }
            else if (color == 2)   // blue: B much larger than G and R
            {
                if ((ptrImg[0] - ptrImg[1]) > 200 && (ptrImg[0] - ptrImg[2]) > 200)
                    CV_IMAGE_ELEM(pic, uchar, x, y) = 255;
            }
        }
    }

    cvNamedWindow("temp", -1);
    cvShowImage("temp", pic);
    cvWaitKey();

    storage = cvCreateMemStorage(0);
    contour = 0;
    mode = CV_RETR_EXTERNAL;
    cvFindContours(pic, storage, &contour, sizeof(CvContour), mode, CV_CHAIN_APPROX_SIMPLE);
    cvDrawContours(image, contour, CV_RGB(0,0,0), CV_RGB(0,0,0), 2, 2, 8);

    CRect rect;
    GetDlgItem(IDC_PICTURE)->GetClientRect(&rect);
    InvalidateRect(rect, true);
}

2 Results:
More in http://skyseraph.com/2011/08/27/CV/圖像算法專題/
Author:      SkySeraph
Email/GTalk: zgzhaobo@gmail.com    QQ: 452728574
From:        http://www.cnblogs.com/skyseraph/
The copyright of this article belongs to the author and 博客园 (cnblogs). You are welcome to repost it, but unless the author agrees otherwise this statement must be retained and a link to the original article must appear in a prominent place on the page. Please respect the author's work.