Saleor and AWS S3: Files with custom ACL and how to disable files overwriting

by Gregory Sánchez

As you might know, Saleor is an open-source storefront that features out-of-the-box solutions for managing an e-commerce platform.

Saleor is built with Django, and many projects use it as a base for custom marketplaces. This has led to forks of the main project and to a great number of developers experimenting with it.

One of Saleor's cool features is storing files in AWS S3 buckets. To do this, Saleor uses Django Storages, a collection of custom storage backends for Django. Once you set the credentials, Saleor lets you use a bucket as media storage.

Although Django Storages takes care of uploading the files, you can face scenarios that call for more advanced solutions. Here are two of those cases.

Avoid file overwriting when uploading to a bucket

By default, if an uploaded file has the same name and path as an existing file, Saleor will overwrite that file. If you want to avoid this behavior and keep both files, add this setting to Saleor's settings.py file:

# In saleor/settings.py
AWS_S3_FILE_OVERWRITE = os.environ.get('AWS_S3_FILE_OVERWRITE', False)

This setting tells the storage backend to add a random suffix to the file name if a file with the same path and name already exists in the bucket.
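To illustrate, the renaming works roughly like the following sketch. It is a simplification of the alternative-name logic that Django's storage API provides and django-storages relies on; the function name, the root/extension split, and the suffix length here are assumptions for illustration only:

```python
import secrets


def alternative_name(file_root, file_ext):
    """Sketch of how a storage backend can derive a non-clashing
    name: keep the extension and append a short random suffix
    to the file's root."""
    return f"{file_root}_{secrets.token_hex(4)}{file_ext}"
```

So uploading photo.jpg twice would keep the first file and store the second one under a name such as photo_1a2b3c4d.jpg.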

Notice the use of an environment variable. If it is not set, the default value is False, so the files won't be overwritten.
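One caveat: os.environ.get returns strings, so setting the variable to 'False' in the environment would still evaluate as truthy. A small helper makes the intent explicit (env_bool is a hypothetical name, not part of Saleor's settings):

```python
import os


def env_bool(name, default=False):
    """Parse an environment variable as a boolean.

    Environment values are strings, so bool('False') would be
    True; compare against common truthy spellings instead.
    """
    value = os.environ.get(name)
    if value is None:
        return default
    return value.strip().lower() in ('1', 'true', 'yes', 'on')


# AWS_S3_FILE_OVERWRITE = env_bool('AWS_S3_FILE_OVERWRITE', False)
```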

Change AWS Storage to manage different ACL depending on the FileField

Saleor uses Django’s FileField to store files in the configured storage.

You can set the object’s ACL via the AWS_DEFAULT_ACL env variable. But, this configuration will affect ALL of the uploaded files. This can be an issue if you want to manage different ACLs depending on the type of resource you want to store.

For example, let’s assume you’ve set the AWS_DEFAULT_ACL variable to public-read. But you wish to have some FileFields that store files with private access.

To solve this, you can create a new Storage extending the S3MediaStorage (saleor/core/storages.py) class. That class is the one responsible for uploading the files to AWS. The idea is to write this custom behavior in a child class. You can place this new Storage within the same Saleor application tree, or in a new module, if you wish.

# In saleor/core/storages.py
from django.conf import settings
from storages.backends.s3boto3 import S3Boto3Storage


class S3MediaStorage(S3Boto3Storage):
    def __init__(self, *args, **kwargs):
        self.bucket_name = settings.AWS_MEDIA_BUCKET_NAME
        self.custom_domain = settings.AWS_MEDIA_CUSTOM_DOMAIN
        super().__init__(*args, **kwargs)


# New class to manage private storage
class PrivateMediaStorage(S3MediaStorage):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.default_acl = 'private'

As you may notice, S3MediaStorage inherits from S3Boto3Storage, the storage backend provided by the Django Storages collection that allows us to use AWS S3.

For our specific purposes, we only have to create a new class extending S3MediaStorage. This way we keep all the features already provided by that storage and override only the ACL value we want to use.

Later, when you create a model that uses this feature, you only need to specify the new storage on the FileField.

from django.db import models
from django.utils.timezone import now

from saleor.core.storages import PrivateMediaStorage


class Revision(models.Model):
    created_at = models.DateTimeField(default=now, blank=True)
    file = models.FileField(
        blank=True, max_length=250, storage=PrivateMediaStorage())

And that’s it. Our FileField will store files with private access.

Note that you won't be able to store files locally if you don't set the AWS credentials, because we are overriding the default storage.

If you want to keep this compatibility, you can create a function that returns the storage type depending on whether the AWS credentials are set.

# In saleor/core/utils.py
from saleor.core.storages import PrivateMediaStorage
from saleor.settings import AWS_MEDIA_BUCKET_NAME


def get_private_storage():
    # If the bucket is defined, use the private S3 storage
    if AWS_MEDIA_BUCKET_NAME:
        return PrivateMediaStorage()
    # Otherwise fall back to Django's default storage
    return None


# In your app's models.py
from django.db import models
from django.utils.timezone import now

from saleor.core.utils import get_private_storage


class Revision(models.Model):
    created_at = models.DateTimeField(default=now, blank=True)
    file = models.FileField(
        blank=True, max_length=250,
        storage=get_private_storage())

Now the function detects whether the AWS storage is enabled, and the FileField uses either the custom storage or Django's default storage depending on the configuration.
