Fix typos
dboyliao committed Jul 7, 2019
1 parent 665ecfd commit f56b40b
Showing 1 changed file, README.md, with 6 additions and 3 deletions.
@@ -91,16 +91,19 @@ With `uTensorGraphMatcher`, performing common subgraph tasks such as isomorphic

Note: we'll use operation/node/layer interchangeably in the documentation

- A commonly seen pattern in convolutional neural networks (`CNN`s) is `conv -> relu -> pooling`: a 2D convolution followed by a ReLU layer and then a pooling (down-sampling) layer.
- With our `uTensorGraphMatcher`, you can locate such a pattern in your `CNN` model and fuse/replace the matched nodes into one optimized `QuantizedFusedConv2DMaxpool` node.
- Left: original graph
- Middle: matched convolution layer
- Right: the matched layer replaced with a specialized `QuantizedFusedConv2DMaxpool` node

![conv-pool-fuse](images/conv_pool_fuse.png)
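The fusion idea above can be sketched with a toy example. Note this is only an illustration of the pattern-match-and-replace concept on a linearized op list; the function and node names here are hypothetical, not the actual `uTensorGraphMatcher` API.

```python
# Toy sketch of pattern fusion on a linear list of op types.
# NOT the real uTensorGraphMatcher API; names are illustrative only.

def fuse_pattern(ops, pattern, fused_name):
    """Replace every occurrence of `pattern` (a consecutive run of op
    types) in the op list `ops` with a single fused op `fused_name`."""
    out, i = [], 0
    while i < len(ops):
        if ops[i:i + len(pattern)] == pattern:
            out.append(fused_name)      # collapse the matched run
            i += len(pattern)
        else:
            out.append(ops[i])
            i += 1
    return out

model = ["conv2d", "relu", "maxpool", "conv2d", "relu", "maxpool", "matmul"]
fused = fuse_pattern(model, ["conv2d", "relu", "maxpool"],
                     "QuantizedFusedConv2DMaxpool")
print(fused)
# ['QuantizedFusedConv2DMaxpool', 'QuantizedFusedConv2DMaxpool', 'matmul']
```

The real matcher works on a graph (with isomorphic matching over edges), but the payoff is the same: three kernel invocations and their intermediate tensors collapse into one fused, quantized kernel.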

### Dropout Layer Removal

- Though `dropout` is an effective technique for improving the training performance of your model, it's not needed during the inference phase.
- In mainstream frameworks such as `TensorFlow` or `PyTorch`, a `dropout` layer is typically implemented with several elementary operations/nodes. As a result, finding and removing those nodes for inference optimization (in both model size and prediction time) is non-trivial and error-prone.
- With our `uTensorGraphMatcher`, you can find and remove the dropout nodes as illustrated in the following picture.
- Left: original graph with dropout layers
- Middle: matched dropout layers
- Right: graph with dropout layers removed
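Dropout removal can likewise be sketched on a toy graph. This is an illustration only, with a hypothetical helper and simplified node names, not the actual `uTensorGraphMatcher` API; a graph is modeled as a dict mapping each node to its list of input nodes.

```python
# Toy sketch of dead-layer removal with rewiring.
# NOT the real uTensorGraphMatcher API; names are illustrative only.

def remove_nodes(graph, to_remove):
    """Drop the nodes in `to_remove` and rewire each consumer to the
    nearest surviving producer upstream of the removed nodes."""
    def resolve(node):
        # Follow through removed nodes until a surviving producer is found;
        # for simplicity we assume each removed node has one data input.
        while node in to_remove:
            node = graph[node][0]
        return node

    return {node: [resolve(i) for i in inputs]
            for node, inputs in graph.items() if node not in to_remove}

# A dropout layer often lowers to a small cluster of elementary ops.
graph = {
    "conv": [],
    "dropout/mul": ["conv"],
    "dropout/floor": ["dropout/mul"],
    "matmul": ["dropout/floor"],
}
pruned = remove_nodes(graph, {"dropout/mul", "dropout/floor"})
print(pruned)  # {'conv': [], 'matmul': ['conv']}
```

After pruning, `matmul` consumes `conv` directly, which is exactly the identity behavior dropout has at inference time.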
