The Weisfeiler-Lehman Graph Isomorphism Test and Beyond
Questions and comments are welcome.
Weisfeiler-Lehman Test (WL Test)
Boris Weisfeiler and Andrey Lehman, 1968
Graph Isomorphism
A simple example of a pair of isomorphic graphs:
1-dimensional WL Test
Input: two graphs, optionally with node attributes.
Output: whether the two graphs are isomorphic (passing the WL test is a necessary condition for the two graphs to be isomorphic).
- Demo 1
- Demo 2
- A more detailed description
- Stable state
- Failure cases
Note that, differing from the original slides, 2-WL should in fact be unable to distinguish this hexagon from two triangles. (One way to put it: 1-WL and 2-WL have the same distinguishing power.)
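A minimal Python sketch of 1-WL color refinement may make the procedure concrete. Graphs are assumed to be plain adjacency-list dicts, and all names here are illustrative rather than taken from any library; the final example reproduces the hexagon vs. two-triangles failure case mentioned above.

```python
from collections import Counter

def wl_colors(adj, labels=None, rounds=None):
    """Run 1-WL color refinement for a fixed number of rounds.

    adj: {node: [neighbors]}; labels: optional {node: initial label}.
    After |V| rounds the partition is guaranteed to be stable, since it
    only ever gets finer and has at most |V| classes."""
    colors = {v: (labels[v] if labels else 0) for v in adj}
    for _ in range(rounds if rounds is not None else len(adj)):
        colors = {
            # New color = hash of (own color, sorted multiset of neighbor colors).
            v: hash((colors[v], tuple(sorted(colors[u] for u in adj[v]))))
            for v in adj
        }
    return colors

def wl_maybe_isomorphic(adj1, adj2):
    """Necessary condition only: isomorphic graphs always get equal color histograms."""
    rounds = max(len(adj1), len(adj2))
    c1 = wl_colors(adj1, rounds=rounds)
    c2 = wl_colors(adj2, rounds=rounds)
    return Counter(c1.values()) == Counter(c2.values())

# Failure case from the note above: a hexagon vs. two disjoint triangles.
hexagon   = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(wl_maybe_isomorphic(hexagon, triangles))  # True, yet the graphs are not isomorphic
```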
k-dimensional WL Test
The 3-dimensional WL test first enumerates all combinations of three nodes in the graph, initializes their labels, and then refines them in a similar fashion.
The k-dimensional WL test considers combinations of k nodes.
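For concreteness, below is a rough sketch of the "oblivious" (vanilla) k-WL variant, the one for which 2-WL is no stronger than 1-WL as noted above; the folklore k-WL variant uses a different update. Graphs are the same adjacency-list dicts as before, the initial color is just the equality/adjacency pattern inside each ordered tuple, and the code is purely illustrative since it colors all $n^k$ tuples.

```python
from collections import Counter
from itertools import product

def k_wl_colors(adj, k, iterations=3):
    nodes = sorted(adj)
    adj_sets = {v: set(adj[v]) for v in nodes}

    def atomic_type(t):
        # Initial color: equality and adjacency pattern within the ordered k-tuple.
        eq  = tuple(t[i] == t[j] for i in range(k) for j in range(k))
        edg = tuple(t[j] in adj_sets[t[i]] for i in range(k) for j in range(k))
        return hash((eq, edg))

    colors = {t: atomic_type(t) for t in product(nodes, repeat=k)}
    for _ in range(iterations):
        new_colors = {}
        for t in colors:
            neigh = []
            for i in range(k):
                # Multiset of colors of tuples obtained by replacing position i.
                swapped = Counter(colors[t[:i] + (w,) + t[i + 1:]] for w in nodes)
                neigh.append(tuple(sorted(swapped.items())))
            new_colors[t] = hash((colors[t], tuple(neigh)))
        colors = new_colors
    return colors

def k_wl_maybe_isomorphic(adj1, adj2, k=2, iterations=3):
    c1 = k_wl_colors(adj1, k, iterations)
    c2 = k_wl_colors(adj2, k, iterations)
    return Counter(c1.values()) == Counter(c2.values())

# Consistent with the note above: with k = 2 this variant still cannot
# tell the hexagon from the two disjoint triangles.
hexagon   = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(k_wl_maybe_isomorphic(hexagon, triangles, k=2))  # True
```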
Morgan Algorithm
Morgan, 1965
Chemical Fingerprints (ECFP)
The ECFP generation process has three sequential stages:
- An initial assignment stage in which each atom has an integer identifier assigned to it.
- An iterative updating stage in which each atom identifier is updated to reflect the identifiers of each atom’s neighbors, including identification of whether it is a structural duplicate of other features.
- A duplicate identifier removal stage in which multiple occurrences of the same feature are reduced to a single representative in the final feature list. (The occurrence count may be retained if one requires a set of counts rather than a standard binary fingerprint.)
- Initial atom identifiers: a hash function maps the properties of each non-hydrogen atom (atomic number, connectivity, and so on) to an integer.
- Iterative identifier updates: similar to the Morgan algorithm, but the number of iterations is fixed in advance; ECFP does not aim for 1-WL-level discrimination.
- Identifier deduplication: all distinct identifiers are kept and folded into a bit string.
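The toy sketch below walks through these three stages on a hand-coded molecular graph (ethanol, heavy atoms only). The atom property tuples, the hashing, and the bit folding are deliberately simplified and do not reproduce a real ECFP implementation such as RDKit's.

```python
def ecfp_like_fingerprint(atoms, bonds, radius=2, n_bits=64):
    # atoms: {idx: (atomic_number, heavy_degree)}, bonds: {idx: [neighbor idx]}
    # Stage 1: initial integer identifier per atom, hashed from its properties.
    ids = {a: hash(props) for a, props in atoms.items()}
    features = set(ids.values())
    # Stage 2: a fixed number of update rounds (unlike 1-WL, no stability target).
    for _ in range(radius):
        new_ids = {}
        for a in atoms:
            neighbor_ids = tuple(sorted(ids[b] for b in bonds[a]))
            new_ids[a] = hash((ids[a], neighbor_ids))
        ids = new_ids
        features |= set(ids.values())   # keep identifiers from every round
    # Stage 3: deduplicate (the set already does) and fold into a bit string.
    bits = [0] * n_bits
    for f in features:
        bits[f % n_bits] = 1
    return bits

# Ethanol C-C-O; per-atom properties are (atomic number, heavy-atom degree).
atoms = {0: (6, 1), 1: (6, 2), 2: (8, 1)}
bonds = {0: [1], 1: [0, 2], 2: [1]}
print(ecfp_like_fingerprint(atoms, bonds))
```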
Canonical SMILES
The Morgan algorithm iterates connectivity values until they stabilize (the histogram of connectivity values stops changing after an update); the atoms are then ordered by these values to obtain a unique SMILES string. (The exact algorithm is commercial, so computing Canonical SMILES requires Daylight's software.)
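Below is a sketch of the extended-connectivity iteration only; the tie-breaking rules and the actual string generation of the Daylight canonicalization are not reproduced. The molecule is a plain adjacency-list dict over heavy atoms, and the names are illustrative.

```python
def morgan_ranks(adj):
    """Morgan's extended-connectivity values, used to rank atoms."""
    ec = {v: len(adj[v]) for v in adj}        # start from the degree of each atom
    n_classes = len(set(ec.values()))
    while True:
        # Each atom's new value is the sum of its neighbors' current values.
        new_ec = {v: sum(ec[u] for u in adj[v]) for v in adj}
        new_classes = len(set(new_ec.values()))
        if new_classes <= n_classes:          # stable: no new distinct values appear
            return ec
        ec, n_classes = new_ec, new_classes

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}  # a simple 4-atom chain
ranks = morgan_ranks(adj)
print(sorted(adj, key=lambda v: -ranks[v]))   # atoms ordered by extended connectivity
```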
Message-Passing Neural Network (MPNN)
Weisfeiler-Lehman Network (WLN)
The idea of WLN is to replace the discrete, exponentially growing node labels of 1-WL with embedding vectors.
It was used to predict the outcomes of organic chemical reactions (NeurIPS 2017).
MPNN
The message passing neural network (MPNN) is a graph neural network framework that aggregates information from neighboring nodes.
MPNN consists of two phases: a message passing phase (the propagation step) and a readout phase.
- The message passing phase runs for $T$ steps and is defined by a message function $M_t$ and a vertex update function $U_t$:
$$m_v^{t+1} = \sum_{w \in N(v)} M_t\big(h_v^t, h_w^t, e_{vw}\big), \qquad h_v^{t+1} = U_t\big(h_v^t, m_v^{t+1}\big)$$
where $m_v^t$ denotes the message and $e_{vw}$ the feature of the edge from node $v$ to $w$.
- The readout phase computes a feature vector for the whole graph using a readout function $R$: $\hat{y} = R\big(\{h_v^T \mid v \in G\}\big)$.
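The numpy sketch below instantiates the two phases with deliberately simple choices: $M_t$ is a linear map over the sender state and edge feature, $U_t$ a linear update, and $R$ a sum over nodes. All weights, sizes, and the toy graph are arbitrary placeholders, not the configuration from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, d_h, d_e, T = 4, 8, 3, 3

# Toy graph: directed edges (v, w), each carrying an edge feature e_vw.
edges = [(0, 1), (1, 0), (1, 2), (2, 1), (2, 3), (3, 2)]
edge_feat = {e: rng.normal(size=d_e) for e in edges}
h = rng.normal(size=(n_nodes, d_h))          # initial node states h_v^0

W_msg = rng.normal(size=(d_h + d_e, d_h))    # parameters of M_t
W_upd = rng.normal(size=(2 * d_h, d_h))      # parameters of U_t

for t in range(T):                           # message passing phase
    m = np.zeros_like(h)
    for (v, w) in edges:
        # m_v^{t+1} += M_t(h_v^t, h_w^t, e_vw); this M_t looks only at h_w and e_vw.
        m[v] += np.tanh(np.concatenate([h[w], edge_feat[(v, w)]]) @ W_msg)
    # h_v^{t+1} = U_t(h_v^t, m_v^{t+1})
    h = np.tanh(np.concatenate([h, m], axis=1) @ W_upd)

graph_vector = h.sum(axis=0)                 # readout phase: R = sum over nodes
print(graph_vector.shape)                    # (8,)
```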
How Powerful Are Graph Neural Networks?
ICLR 2019
WL Test & MPNN
- MPNN
- 1d WL Test
1-WL is an upper bound on the expressive power of MPNN-style GNNs.
However, the node features produced by the WL test are discrete (essentially one-hot), so they cannot be used for tasks such as computing graph similarity.
- Designing suitable update and aggregation functions is therefore essential.
- Some cases that common aggregation functions such as MAX and MEAN cannot handle (see the small numeric example below).
The aggregation function needs to be injective, i.e., different inputs must not map to the same output.
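As a small numeric illustration (my own example, in the spirit of the figures in the GIN paper): two different multisets of one-hot neighbor features that MEAN and MAX map to the same output, while SUM separates them.

```python
import numpy as np

x, y = np.array([1.0, 0.0]), np.array([0.0, 1.0])
A = [x, x, y, y]          # a node with four neighbors
B = [x, y]                # a node with two neighbors

for name, agg in [("mean", lambda s: np.mean(s, axis=0)),
                  ("max",  lambda s: np.max(s, axis=0)),
                  ("sum",  lambda s: np.sum(s, axis=0))]:
    print(name, agg(A), agg(B))
# mean and max give identical outputs for A and B; only sum distinguishes them.
```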
Graph Isomorphism Networks (GIN)
- Update (and aggregation) function: MLP + SUM (a minimal sketch follows this list).
- There exists such a function that is injective over multisets.
- An MLP is used to fit $f$ and $\varphi$ (by the universal approximation theorem).
- The READOUT function.
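A compact numpy sketch of the GIN update $h_v^{(k)} = \mathrm{MLP}\big((1+\epsilon)\,h_v^{(k-1)} + \sum_{u \in N(v)} h_u^{(k-1)}\big)$ and of the layer-wise sum READOUT. The weights are random placeholders, $\epsilon$ is fixed rather than learned, and no training loop is shown.

```python
import numpy as np

def mlp(x, W1, b1, W2, b2):
    return np.maximum(x @ W1 + b1, 0.0) @ W2 + b2    # two-layer ReLU MLP

def gin_layer(A, H, params, eps=0.0):
    # A: (n, n) adjacency matrix, H: (n, d) node features.
    aggregated = (1.0 + eps) * H + A @ H             # SUM aggregation over neighbors
    return mlp(aggregated, *params)

rng = np.random.default_rng(0)
n, d, layers = 5, 4, 3
A = np.zeros((n, n))
for v, w in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]:   # a 5-cycle
    A[v, w] = A[w, v] = 1.0
H = np.ones((n, d))                                  # unlabeled graph: constant features

graph_readout = []
for _ in range(layers):
    params = (rng.normal(size=(d, d)), np.zeros(d),
              rng.normal(size=(d, d)), np.zeros(d))
    H = gin_layer(A, H, params, eps=0.1)
    graph_readout.append(H.sum(axis=0))              # per-layer graph embedding
graph_embedding = np.concatenate(graph_readout)      # final READOUT
print(graph_embedding.shape)                         # (12,)
```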
Experiments (Graph Classification)
Training set:
Test set:
Beyond WL
Beyond 1-WL
Non-local
Networks built on k-WL and its various variants have appealing theory but do not work well in practice.
- k-GNNs require $O(n^k)$ memory.
- Invariant Graph Networks (IGN) based on k-order tensors
- 3-WL 級別的IGN有平方級別的復雜度,但較MPNN的線性復雜度還是略顯臃腫
Beyond WL
- GSN builds on MPNN and lets the aggregated messages include local graph structure, preserving locality and linear complexity (see the sketch after this list).
- Approximate isomorphism: measure the similarity of two graphs with some metric.
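To make the GSN idea concrete, the sketch below precomputes one kind of structural feature (per-node triangle counts) and appends it to the node features before ordinary message passing. With this extra information, the hexagon vs. two-triangles pair from the 1-WL section becomes separable. This is my simplification of the idea, not the authors' implementation.

```python
import numpy as np

def triangle_counts(edges, n):
    A = np.zeros((n, n))
    for v, w in edges:
        A[v, w] = A[w, v] = 1.0
    return np.diag(A @ A @ A) / 2.0     # number of triangles through each node

hexagon   = [(i, (i + 1) % 6) for i in range(6)]
triangles = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3)]

print(triangle_counts(hexagon, 6))      # [0. 0. 0. 0. 0. 0.]
print(triangle_counts(triangles, 6))    # [1. 1. 1. 1. 1. 1.]
# Appending these counts to the node features lets an ordinary MPNN
# distinguish the two graphs while keeping locality and near-linear cost.
```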
References
Michael Bronstein’s Blog (Recommended)
- Expressive power of graph neural networks and the Weisfeiler-Lehman test
- Beyond Weisfeiler-Lehman: using substructures for provably expressive graph neural networks
- Beyond Weisfeiler-Lehman: approximate isomorphisms and metric embeddings
WL Test
- Combinatorial Properties of the Weisfeiler-Leman Algorithm by Sandra Kiefer
- Weisfeiler-Lehman Graph Kernels
- On Weisfeiler-Leman Invariance: Subgraph Counts and Related Graph Properties
Chemical Fingerprint
- Extended Connectivity Fingerprint (ECFP)
- Extended-Connectivity Fingerprints, J. Chem. Inf. Model. 2010
WLN
- Predicting Organic Reaction Outcomes with Weisfeiler-Lehman Network, NeurIPS 2017
MPNN
- Graph Neural Networks: A Review of Methods and Applications, arXiv 2018
- Neural Message Passing for Quantum Chemistry, arXiv 2017
GIN
- How Powerful Are Graph Neural Networks?, ICLR 2019