
[feature request] - Add support for Cloudflare R2 #1062

Closed

StevenMapes opened this issue Sep 29, 2021 · 20 comments

Comments

@StevenMapes

With the announcement of Cloudflare R2, it would be great to add a backend that supports it.

@akshaybabloo

Aren't they using S3 APIs? 🤔

@StevenMapes
Author

> Aren't they using S3 APIs? 🤔

That is true. I'm still waiting for a response to my access request so I can test it myself. I was hoping someone here may already have access and could confirm whether it works out of the box or whether any tweaks are required.

@eliezerp3

@StevenMapes Did you end up testing it?

@shrawanx

Hi @StevenMapes,
I tested it, and it works with S3Boto3Storage for private media files. Since buckets can't be made public as of now, I didn't test the static files / public media part.

I had written a blog regarding it at https://djangotherightway.com/using-cloudflare-r2-with-django-for-storage
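
For anyone who wants a starting point, a minimal settings sketch in the spirit of that post (the endpoint, bucket name, and credentials below are placeholders, not values from the blog):

    # Sketch only — replace the placeholders with your own R2 values.
    DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
    AWS_STORAGE_BUCKET_NAME = "my-bucket"
    AWS_S3_ENDPOINT_URL = "https://<account_id>.r2.cloudflarestorage.com"
    AWS_S3_ACCESS_KEY_ID = "<r2_access_key_id>"
    AWS_S3_SECRET_ACCESS_KEY = "<r2_secret_access_key>"
    AWS_S3_SIGNATURE_VERSION = "s3v4"  # R2 only accepts SigV4
    AWS_DEFAULT_ACL = None  # R2 has no ACL support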

@timkofu

timkofu commented Aug 20, 2022

We're successfully serving Django static files from an R2 bucket on a custom domain by attaching a CF worker to it.

@djch

djch commented Jan 17, 2023

> Hi @StevenMapes, I tested it, and it works with S3Boto3Storage for private media files. Since buckets can't be made public as of now, I didn't test the static files / public media part.
>
> I had written a blog regarding it at https://djangotherightway.com/using-cloudflare-r2-with-django-for-storage

Thanks for the little tutorial. Unfortunately it doesn't seem to be compatible with R2's custom domains feature, which is the only way to use their caching. If you set AWS_S3_ENDPOINT_URL to the custom domain, uploads don't work but rendering (with caching) does. If you use AWS_S3_CUSTOM_DOMAIN instead, signed URLs aren't generated, so uploads work but rendering doesn't.

It's annoying, because it's so close to being there. If you could use AWS_S3_ENDPOINT_URL for uploading but a different URL for serving the signed images, it would be fine. If there's a way I've overlooked, please let me know.
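
To make the dead end concrete, the naive workaround would be something like this sketch (R2RewriteStorage and serving_host are made up for illustration); it doesn't actually solve the signed-URL case, because the query-string signature is computed for the original host and R2 won't honour it on the custom domain:

    from urllib.parse import urlsplit, urlunsplit

    from storages.backends.s3boto3 import S3Boto3Storage


    class R2RewriteStorage(S3Boto3Storage):
        """Upload via AWS_S3_ENDPOINT_URL, swap the host when serving."""

        serving_host = "things.example.com"  # hypothetical custom domain

        def url(self, name, *args, **kwargs):
            signed = super().url(name, *args, **kwargs)
            parts = urlsplit(signed)
            # NOTE: the signature in parts.query was made for the original
            # host (and the path still carries the bucket prefix), so this
            # only helps for unsigned/public URLs, not presigned ones.
            return urlunsplit(
                (parts.scheme, self.serving_host, parts.path, parts.query, parts.fragment)
            )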

@mikhail-skorikov

@djch Any luck finding a way to solve this? I think I have the same problem, but I'm no good at media file handling, or DevOps in general.

My files get uploaded, but I can't retrieve them unless I allow public access to the bucket and use the public bucket URL, and the same doesn't work when using a custom domain for the public URL. In private mode, the S3 API uploads the file but won't load it. I haven't even looked into caching yet, but I'll need that too.

I guess R2 is not quite viable yet.

@djch

djch commented Jan 25, 2023

@mikhail-skorikov For the time being I've implemented the same solution as @timkofu and deployed a CF Worker called render to proxy requests in front of the R2 bucket on a custom domain. That seems like a viable workaround until django-storages has better (or native) support for R2 storage.

@banool

banool commented Feb 15, 2023

Would any of you want to share all the configs / code that you've put together to make this work? It's a shame that this doesn't work natively.

@timkofu

timkofu commented May 14, 2023

It now works as expected: create a bucket, choose a region, attach a subdomain, set up CORS, and voilà!

> Would any of you want to share all the configs / code that you've put together to make this work?

    STORAGES = {"staticfiles": {"BACKEND": "storages.backends.s3boto3.S3StaticStorage"}}
    AWS_STORAGE_BUCKET_NAME = "bucket_name"
    AWS_LOCATION = "a_folder_inside_the_bucket"
    AWS_S3_ACCESS_KEY_ID = "r2_key"
    AWS_S3_SECRET_ACCESS_KEY = "r2_secret"
    AWS_S3_CUSTOM_DOMAIN = "things.example.com"
    AWS_S3_ENDPOINT_URL = (
        "https://s3_api.url.from.r2_bucket.settings.page/" # Yes; without the appended bucket name.
    )
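
On Django 4.2+ the same configuration can also live entirely inside STORAGES via OPTIONS, instead of the module-level AWS_* settings. A sketch with the same placeholder values as above:

    STORAGES = {
        "staticfiles": {
            "BACKEND": "storages.backends.s3boto3.S3StaticStorage",
            "OPTIONS": {
                "bucket_name": "bucket_name",
                "location": "a_folder_inside_the_bucket",
                "access_key": "r2_key",
                "secret_key": "r2_secret",
                "custom_domain": "things.example.com",
                # Again, without the appended bucket name.
                "endpoint_url": "https://s3_api.url.from.r2_bucket.settings.page/",
            },
        },
    }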

@alexandernst

Is URL signing also working?

@timkofu

timkofu commented Jun 1, 2023

Yes. I tested it with AWS CLI:

    aws s3 presign --endpoint-url https://account_id.r2.cloudflarestorage.com s3://private_bucket/folder_in_bucket/starfleet.png --expires-in 3600

This produced a public URL that was accessible for an hour.
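
The same check from Python with boto3 looks like this (sketch only; the account ID, bucket, key, and credentials are placeholders):

    import boto3

    client = boto3.client(
        "s3",
        endpoint_url="https://account_id.r2.cloudflarestorage.com",
        aws_access_key_id="<r2_access_key_id>",
        aws_secret_access_key="<r2_secret_access_key>",
    )
    url = client.generate_presigned_url(
        "get_object",
        Params={"Bucket": "private_bucket", "Key": "folder_in_bucket/starfleet.png"},
        ExpiresIn=3600,
    )
    print(url)  # publicly accessible for an hour, like the CLI example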

@alexandernst

@timkofu I tried this and while the command does output a link, the link itself doesn't work. When I try to access the URL I get this error:

<Error>
  <Code>InvalidArgument</Code>
  <Message>
    Invalid Argument: Credential access key has length 20, should be 32
  </Message>
</Error>

Does it work for you? If "yes", are you using a paid Cloudflare plan?


@alexandernst

I did some research. Basically, this won't work with custom domains: custom domains must use HMAC validation (https://developers.cloudflare.com/ruleset-engine/rules-language/functions/#hmac-validation).

@pirsquare
Contributor

> @timkofu I tried this and while the command does output a link, the link itself doesn't work. When I try to access the URL I get this error:
>
>     <Error>
>       <Code>InvalidArgument</Code>
>       <Message>
>         Invalid Argument: Credential access key has length 20, should be 32
>       </Message>
>     </Error>
>
> Does it work for you? If "yes", are you using a paid Cloudflare plan?

I've encountered and resolved this issue. In my case it happened because we were still using our old AWS credentials (as the error says, AWS access key IDs are 20 characters, while R2's are 32). I'd suggest overriding the storage class and inspecting which access key gets printed:

    import boto3
    from storages.backends.s3boto3 import S3Boto3Storage


    class R2Storage(S3Boto3Storage):
        def _create_session(self):
            # Debug: print the credentials the storage actually resolved.
            print(f"access_key: {self.access_key}")
            print(f"secret_key: {self.secret_key}")

            if self.session_profile:
                session = boto3.Session(profile_name=self.session_profile)
            else:
                session = boto3.Session(
                    aws_access_key_id=self.access_key,
                    aws_secret_access_key=self.secret_key,
                    aws_session_token=self.security_token,
                )
            return session

I don't think it's due to the custom domain. We're using a custom domain with no issues, and I've managed to get it fully working with R2.

In our case, we inherited from the S3Boto3Storage class and overrode the default settings accordingly.
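
Roughly along these lines (a sketch of that pattern, not our actual code; every value is a placeholder):

    from storages.backends.s3boto3 import S3Boto3Storage


    class R2MediaStorage(S3Boto3Storage):
        # Class attributes override the AWS_* defaults from settings.
        bucket_name = "my-bucket"
        endpoint_url = "https://<account_id>.r2.cloudflarestorage.com"
        access_key = "<r2_access_key_id>"  # the 32-character R2 token key
        secret_key = "<r2_secret_access_key>"
        signature_version = "s3v4"
        default_acl = None  # R2 does not support ACLs

Then point DEFAULT_FILE_STORAGE (or STORAGES["default"] on Django 4.2+) at the subclass.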

@smyja

smyja commented Feb 8, 2024

> It now works as expected: create a bucket, choose a region, attach a subdomain, set up CORS, and voilà!
>
> Would any of you want to share all the configs / code that you've put together to make this work?
>
>     STORAGES = {"staticfiles": {"BACKEND": "storages.backends.s3boto3.S3StaticStorage"}}
>     AWS_STORAGE_BUCKET_NAME = "bucket_name"
>     AWS_LOCATION = "a_folder_inside_the_bucket"
>     AWS_S3_ACCESS_KEY_ID = "r2_key"
>     AWS_S3_SECRET_ACCESS_KEY = "r2_secret"
>     AWS_S3_CUSTOM_DOMAIN = "things.example.com"
>     AWS_S3_ENDPOINT_URL = (
>         "https://s3_api.url.from.r2_bucket.settings.page/"  # without the appended bucket name
>     )

This worked.

@alexdeathway

There is an authorization issue related to relative paths in CSS files. A similar problem was encountered with AWS in #734 and resolved by @dennisvang by whitelisting the files in the bucket policy, but there's no such solution for Cloudflare R2. Has anybody hit similar issues or found a solution?

@jowparks

> There is an authorization issue related to relative paths in CSS files. A similar problem was encountered with AWS in #734 and resolved by @dennisvang by whitelisting the files in the bucket policy, but there's no such solution for Cloudflare R2. Has anybody hit similar issues or found a solution?

This is a pretty lame workaround, but I ended up using the S3Client in Node.js instead; it just needed a short script:

    const { S3Client } = require("@aws-sdk/client-s3");

@jschneier
Owner

Docs added in #1378
