This is sandbox code for the "DNCuts" (Downsampled Normalized Cuts) part of Multiscale Combinatorial Grouping, CVPR 2014.
Here we demonstrate a technique for taking an affinity matrix from an image (where each pixel in the image is a row/column of the affinity matrix) and efficiently approximating the eigenvectors of the Laplacian of that graph, à la Normalized Cuts.
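For reference, the standard Normalized Cuts embedding that this technique approximates can be sketched as follows. This is not the authors' MATLAB code, just a minimal Python/scipy illustration: the eigenvectors come from the smallest eigenvalues of the symmetric normalized Laplacian of the affinity matrix.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

def ncuts_eigenvectors(A, n_vecs=2):
    """Smallest eigenvectors of the symmetric normalized Laplacian of A.

    A is a sparse, symmetric, nonnegative affinity matrix where each
    row/column corresponds to one pixel of the image.
    """
    d = np.asarray(A.sum(axis=1)).ravel()          # node degrees
    d_inv_sqrt = sp.diags(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    # L_sym = I - D^{-1/2} A D^{-1/2}
    L = sp.eye(A.shape[0]) - d_inv_sqrt @ A @ d_inv_sqrt
    # Smallest-magnitude eigenpairs; fine for toy sizes, slow at scale,
    # which is exactly the cost DNCuts is designed to avoid.
    vals, vecs = eigsh(L, k=n_vecs, which="SM")
    return vals, vecs

# Toy example: a 4-pixel chain-graph affinity.
A = sp.csr_matrix(np.array([[0, 1, 0, 0],
                            [1, 0, 1, 0],
                            [0, 1, 0, 1],
                            [0, 0, 1, 0]], dtype=float))
vals, vecs = ncuts_eigenvectors(A, n_vecs=2)
```

The smallest eigenvalue of the normalized Laplacian of a connected graph is zero; the next eigenvectors carry the segmentation information.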
Run "go.m" to see the technique in progress. This script uses a simple affinity measure, not the kind used in the CVPR 2014 paper, as this code is just meant to demonstrate the effectiveness of the eigenvector computation. The affinity measure can be swapped out for most commonly used affinity measures for segmentation, such as those available at http://www.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/resources.html. If everything works correctly, you should see the true NCuts eigenvectors for the image alongside the fast eigenvectors from our method, and the fast eigenvectors should be roughly 20x faster to compute (though the exact speedup depends heavily on the connectivity of the affinity matrix and the parameters chosen for DNCuts).
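The core idea behind the speedup is to decimate the affinity matrix, solve the much smaller eigenproblem, and interpolate the eigenvectors back to full resolution. The rough sketch below (in Python/scipy, not the authors' MATLAB; the real code's decimation pattern, number of downsampling levels, and eigenvector whitening differ) shows one level of this downsample/upsample scheme: keep every other node's column, row-normalize the result to get an interpolation matrix `B`, form the smaller affinity `Bᵀ A B`, solve NCuts there, and upsample with `B`.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

def dncuts_sketch(A, n_vecs=2):
    """One level of downsampled NCuts (illustrative sketch only)."""
    n = A.shape[0]
    idx = np.arange(0, n, 2)                       # keep every other node
    A_sub = A[:, idx]                              # n x (n/2) column slice
    row_sum = np.asarray(A_sub.sum(axis=1)).ravel()
    # Row-normalized A_sub doubles as the interpolation matrix B.
    B = sp.diags(1.0 / np.maximum(row_sum, 1e-12)) @ A_sub
    A_small = B.T @ A @ B                          # downsampled affinity
    # Standard NCuts eigenproblem on the small graph.
    d = np.asarray(A_small.sum(axis=1)).ravel()
    Dih = sp.diags(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    L = sp.eye(A_small.shape[0]) - Dih @ A_small @ Dih
    _, vecs_small = eigsh(L, k=n_vecs, which="SM")
    return B @ vecs_small                          # upsample to full size

# Toy: an 8-node chain affinity with self-loops so every row has mass.
n = 8
A = sp.diags([np.ones(n - 1), np.ones(n), np.ones(n - 1)],
             [-1, 0, 1], format="csr")
EV = dncuts_sketch(A, n_vecs=2)
```

Because the eigensolve runs on a matrix a fraction of the original size, its cost drops sharply, which is where the reported ~20x speedup comes from.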
Email questions about this code and technique to Jon Barron ([email protected]), but please direct all questions regarding the rest of the Multiscale Combinatorial Grouping paper to Pablo Arbelaez ([email protected]).