How to Upload and Serve Data Using Amazon CloudFront and Amazon S3 in Node.js
Upload and serve data faster and more efficiently.
Most applications today serve users across the globe and need a way to deliver their content fast. To accomplish this, developers often rely on a Content Delivery Network (CDN): a network of geographically distributed servers designed to serve content to users as quickly as possible.
Amazon CloudFront is one such CDN. In this article, I will describe how to upload files to an S3 bucket and serve those files through CloudFront in Node.js.
Prerequisites
Create an S3 bucket and a CloudFront distribution in AWS. Then navigate to IAM, go to Security Credentials under your user, create an access key, and download the CSV file. We will need this access key later.
Then, click on My Security Credentials under My Account.
Under CloudFront key pairs, create a key pair and download the private key. Make sure to keep track of your access key ID; we will need it later for integration.
Creating Node.js Application
Let’s create a simple Node.js Express server and add two REST API endpoints, one for file upload and one for download.
I am using the typescript and ts-node-dev npm modules in this sample project, so the project includes a tsconfig.json.
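The article does not show the tsconfig.json contents; a minimal configuration for an Express project run with ts-node-dev might look like the sketch below (the exact options in the sample project may differ):

```json
{
  "compilerOptions": {
    "target": "es2017",
    "module": "commonjs",
    "esModuleInterop": true,
    "outDir": "dist"
  },
  "include": ["*.ts"]
}
```

The esModuleInterop flag is what allows default-style imports such as import express from 'express' used below.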
Here is the entire app.ts file. It contains the logic to initialize the Express server and the REST endpoints. I am also using the multer npm module to handle multipart file uploads.
import express from 'express';
import * as fileCtrl from './fileController';
import multer from 'multer';
import crypto from 'crypto';
import path from 'path';

const app = express();
const port = 3000;

app.listen(port, () => {
    console.log('Server listening on port %s.', port);
});

// Multer storage engine for handling multipart file uploads.
// Files are written to ./files under a random hex name plus the original extension.
const storage = multer.diskStorage({
    destination: './files',
    filename: function (req, file, cb) {
        // crypto.pseudoRandomBytes is deprecated; randomBytes is the modern equivalent.
        crypto.randomBytes(16, function (err, raw) {
            if (err) return cb(err);
            cb(null, raw.toString('hex') + path.extname(file.originalname));
        });
    }
});

app.use(multer({ storage: storage }).single('file'));

app.get('/api/download', asyncHandler(fileCtrl.download));
app.post('/api/upload', asyncHandler(fileCtrl.upload));

// Wraps an async route handler so that rejected promises are passed to
// Express error handling via next() instead of becoming unhandled rejections.
export function asyncHandler(handler) {
    return function (req, res, next) {
        if (!handler) {
            next(new Error(`Invalid handler ${handler}, it must be a function.`));
        } else {
            handler(req, res, next).catch(next);
        }
    };
}
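To see why asyncHandler matters, here is a small framework-free sketch showing that a rejection inside an async handler is routed to next() rather than swallowed. The types are simplified placeholders; in the real app they would be Express's Request, Response, and NextFunction:

```typescript
// Simplified stand-ins for Express's handler and next() types.
type Handler = (req: unknown, res: unknown, next: (err?: Error) => void) => Promise<void>;

function asyncHandler(handler: Handler) {
    return function (req: unknown, res: unknown, next: (err?: Error) => void) {
        // Any rejection from the async handler is forwarded to next().
        handler(req, res, next).catch(next);
    };
}

// A handler that always fails; the error lands in next() instead of crashing the process.
const failing = asyncHandler(async () => {
    throw new Error('boom');
});

failing({}, {}, (err?: Error) => {
    console.log(err ? `caught: ${err.message}` : 'no error');
});
// prints "caught: boom"
```

Without the wrapper, Express 4 would not catch the rejection, and the request would hang.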
Uploading Files to Amazon S3 Bucket
Let us look at how to upload files to the S3 bucket. We will need to install the aws-sdk node module to access S3 buckets from our Node.js application.
Once we have installed it, the handler for the upload endpoint is defined as follows:
export async function upload(req, res) {
    // multer has already written the file to disk; req.file holds its metadata.
    let response = await uploadFile(req.file.originalname, req.file.path);
    res.send(response);
    res.end();
}
In fileController.ts, we need to import the aws-sdk module as follows.
import awsSDK from 'aws-sdk';
At the beginning of this article, we downloaded a CSV file that contained the access key ID and secret access key. We will use them to upload files to the S3 bucket. Using the aws-sdk module, we configure the access key ID and secret access key as follows:
import fs from 'fs';

export function uploadFile(filename, fileDirectoryPath) {
    awsSDK.config.update({
        accessKeyId: process.env.S3_ACCESS_KEY_ID,
        secretAccessKey: process.env.S3_SECRET_ACCESS_KEY
    });
    const s3 = new awsSDK.S3();
    return new Promise(function (resolve, reject) {
        fs.readFile(fileDirectoryPath.toString(), function (err, data) {
            if (err) {
                return reject(err);
            }
            s3.putObject({
                Bucket: '' + process.env.S3_BUCKET_NAME,
                Key: filename,
                Body: data,
                ACL: 'public-read'
            }, function (err, data) {
                if (err) {
                    return reject(err);
                }
                resolve('successfully uploaded');
            });
        });
    });
}
The putObject() method uploads the file to the S3 bucket; we pass it the bucket name, the object key, and the file contents. Depending on your bucket policies, you can pass additional parameters to putObject(). In this example, I have set the canned ACL policy to public-read.
Now, we can start the server and test our POST endpoint, for example from Postman, by sending the file in a multipart form field named file.
Once the request is successful, we can see the file in the S3 bucket.
Serving Files via Amazon CloudFront
Earlier, we downloaded the private key from CloudFront key pairs. We will use that private key and the access key ID to access CloudFront from Node.js.
The handler for download API endpoint is as follows.
export async function download(req, res) {
    let response = await getFileLink(req.query.filename);
    res.send(response);
    res.end();
}
This handler expects, as a query parameter, the name of the file to be downloaded via CloudFront.
Let us look at how to access CloudFront in our Node.js app. First, we will install the aws-cloudfront-sign npm module. Using this module, we can generate signed Amazon CloudFront URLs, which let us give users access to our private content. Signed URLs also carry additional metadata, such as an expiration time, giving us finer control over access to our content.
import awsCloudFront from 'aws-cloudfront-sign';

export function getFileLink(filename) {
    return new Promise(function (resolve, reject) {
        var options = {
            keypairId: process.env.CLOUDFRONT_ACCESS_KEY_ID,
            privateKeyPath: process.env.CLOUDFRONT_PRIVATE_KEY_PATH
        };
        var signedUrl = awsCloudFront.getSignedUrl(process.env.CLOUDFRONT_URL + filename, options);
        resolve(signedUrl);
    });
}
Here, we pass the access key ID, the path to the private key file, and the CloudFront URL to getSignedUrl(). The CloudFront URL should look something like this: https://XYZ.cloudfront.net.
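aws-cloudfront-sign also accepts an expireTime option (an epoch timestamp in milliseconds, or a Date) to control how long a signed URL stays valid. As a sketch, here is a small helper for building the options object with a TTL; the key-pair ID and key path below are placeholders, not values from the sample project:

```typescript
// Builds the options object for aws-cloudfront-sign's getSignedUrl().
// expireTime is an epoch timestamp in milliseconds.
function signedUrlOptions(keypairId: string, privateKeyPath: string, ttlSeconds: number = 3600) {
    return {
        keypairId,
        privateKeyPath,
        expireTime: Date.now() + ttlSeconds * 1000,
    };
}

// Placeholder values for illustration only.
const opts = signedUrlOptions('APKAEXAMPLEID', './pk-cloudfront.pem', 300);
console.log(opts.expireTime > Date.now()); // prints true
```

Such an options object would be passed as the second argument to awsCloudFront.getSignedUrl(), as in getFileLink() above; if expireTime is omitted, the module falls back to its default expiration.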
Start the server and test the GET endpoint, passing the file name in the filename query parameter.
Conclusion
In this article, we saw how to upload files to Amazon S3 and serve those files via Amazon CloudFront. I hope you enjoyed this article. Let me know if you have any comments or suggestions in the comments section below.
The example for this article can be found on this GitHub repository.