This plugin uploads all compiled assets to an AWS S3 bucket during the webpack build. You can then serve the files via CloudFront or another CDN.
$ npm i -S webpack-s3-uploader
First, set these environment variables:
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
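For example, in a POSIX shell you can export them before running the build (the values shown are placeholders):

$ export AWS_ACCESS_KEY_ID=your_access_key_id
$ export AWS_SECRET_ACCESS_KEY=your_secret_access_key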
// require the plugin and the path helper
var path = require('path')
var S3Uploader = require('webpack-s3-uploader')

const config = {
  context: path.resolve(__dirname, '..'),
  output: {
    path: path.resolve(__dirname, '../build/public/assets'),
    publicPath: 'your_cdn_url',
  },
  plugins: [
    new S3Uploader({
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
        region: 'us-west-1'
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      },
    })
  ],
  // ...other configuration
}

module.exports = config
It is required to set:

output.path: the path where all assets are compiled; everything in it will be uploaded. You can narrow the set of files with the exclude and include options.
output.publicPath: the path that compiled assets will be referenced against. During compilation webpack replaces the local path with this one. If you have CloudFront pointed at your S3 bucket, put its URL here (see the sketch after this list).

Options:

exclude: a pattern matching content to exclude (e.g. /.*\.(css|js)/). Behaves similarly to webpack's loader configuration.
include: a pattern matching content to include. Behaves the same as exclude.
s3Options: credentials and other options for the S3 client (e.g. accessKeyId, secretAccessKey, region).
s3UploadOptions: options passed to S3 putObject (e.g. Bucket).
basePath: the namespace (key prefix) under which files are uploaded on S3.
progress: enable the progress bar (defaults to true).
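For instance, here is a minimal sketch of the output section, assuming a hypothetical CloudFront distribution URL in front of the bucket:

output: {
  path: path.resolve(__dirname, '../build/public/assets'),
  // hypothetical CloudFront distribution URL; compiled asset references will point here
  publicPath: 'https://d111111abcdef8.cloudfront.net/assets/'
},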
The include and exclude rules behave similarly to webpack's loader options. In addition to a RegExp you can pass a function, which will be called with the path as its first argument; returning a truthy value matches the rule. You can also pass an array of rules, all of which must pass for the file to be included or excluded.
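As an illustration (the patterns below are made up for this example), a single plugin instance could mix the rule forms like this:

new S3Uploader({
  // array of include rules: every rule must match for a file to be uploaded
  include: [
    /\.(js|css)$/,                                // RegExp rule: JS and CSS bundles
    (assetPath) => !assetPath.endsWith('.map')    // function rule: a truthy return value matches
  ],
  exclude: /\.html$/,                             // single RegExp rule: skip HTML files
  // ...s3Options and s3UploadOptions as in the config above
})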
This is a lightweight, refactored version of s3-plugin-webpack.