Uploading files throws a 413 (Request Entity Too Large) with "large" files

Hello everyone, I have an issue I’m trying to resolve. If someone can help me I’ll be very grateful, since I have a deadline to implement this functionality in my project. Here is some context:

I’m currently implementing a controller function that uploads files to AWS S3 and creates a record in the DB with the name and details of each file. I’m able to upload files to my S3 bucket without any problem. I created a React app that gathers the files and sends them through an API to the controller. I also have this project running on a production AWS server.

I send the files from the frontend to the backend as a multipart/form-data request in axios. This works fine on both the local server and the production server, and as I mentioned, the files do reach S3.

Recently I tried to upload “large” files through the production server. The concept of “large” is relative: in my case I’m not able to upload files bigger than roughly 5-7 MB, while files up to about 3 MB upload without any problem.

Both the local and the production app (the same project) upload the files to the same S3 bucket. If I upload from my local environment everything works fine regardless of file size, but that’s not the case in production.

Searching the internet, I found that I had to change maxSize in the files object of the bodyParser.js config. I tried that without success. I also tried changing the limit field of the json object in the same file, and even the urlencoded one.

In the following code segments I show you what I’ve done so far:

bodyParser.js

  json: {
    limit: '1024mb',
    strict: true,
    types: [
      'application/json',
      'application/json-patch+json',
      'application/vnd.api+json',
      'application/csp-report'
    ]
  },
  raw: {
    types: [
      'text/*'
    ]
  },
  form: {
    types: [
      'application/x-www-form-urlencoded'
    ],
    limit: '1024mb'
  },
  files: {
    types: [
      'multipart/form-data'
    ],
    maxSize: '1024mb',
    limit: '1024mb',
    autoProcess: true,
    processManually: ['/api/v1/resources-upload', '/api/v1/resources-upload/:topicContentId']
  }

As you can see, ‘/api/v1/resources-upload’ and ‘/api/v1/resources-upload/:topicContentId’ are the routes I use to handle file uploads.
I set limit and maxSize to '1024mb' (roughly 1 GB) because at some point the users of this project will upload files of a little less than 1 GB.

The following code segment shows the axios request I make from the frontend (this is not directly related to AdonisJS, but I post it here to give a better picture):

const options = {
  headers: {                      // was 'header' — axios expects 'headers'
    'Content-Type': 'multipart/form-data',
  },
  onUploadProgress: (progressEvent) => {
    const { loaded, total } = progressEvent;
    const percentCompleted = Math.floor((loaded * 100) / total);
    console.log(`${percentCompleted}%`);
  },
};

axios.post('/resources-upload', fd, options)
  .then((response) => {
    console.log(response);
  })
  .catch((error) => {
    console.log(error);
  });

The onUploadProgress callback lets me watch the upload progress.
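For completeness, fd is a FormData instance built before the request; a minimal sketch, assuming selectedFiles stands in for however you collect the File objects (the field name 'files' has to match what the controller reads):

    const fd = new FormData();

    // append each chosen file under the 'files' field,
    // which is the name the controller's multipart handler listens on
    selectedFiles.forEach((file) => {
      fd.append('files', file, file.name);
    });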

The next segment is the controller code that handles the upload to S3:

Controller:

async upload ({ request, response, view }) {
  await request.multipart.file('files', {}, async (file) => {
    try {
      // derive the S3 object metadata from the incoming part
      const ContentType = file.headers['content-type']
      const ACL = 'public-read'
      const Key = `${Math.round((new Date()).getTime() / 1000)}_${file.clientName}`

      // stream the part straight to S3 through the Drive provider
      const url = await Drive.put(Key, file.stream, {
        ContentType,
        ACL,
      })

      // persist a DB record pointing at the stored file
      const resource = await Resource.create({
        path: url,
        name: file.clientName,
      })

      return resource
    } catch (err) {
      console.log('error', err)
      // fall back to 500 when err carries no status of its own
      return response.status(err.status || 500).send({
        error: {
          err_message: err.message
        }
      })
    }
  })
  .process()
}
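For reference, the matching route definitions would look like this (a sketch — ResourceController is an assumed name; the paths come from the processManually list above):

    // start/routes.js (AdonisJS v4)
    const Route = use('Route')

    Route.post('/api/v1/resources-upload', 'ResourceController.upload')
    Route.post('/api/v1/resources-upload/:topicContentId', 'ResourceController.upload')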

So basically that’s my problem. Let me summarize:

  • I have an app built with Adonis + a React frontend that can upload “small” files to an S3 bucket without any problem.
  • On the local server I can upload files of any size, e.g. 50 KB, 10 MB, 100 MB, 500 MB and so on.
  • In production I can’t upload files larger than roughly 5-7 MB.
  • When I try to upload files bigger than that in production, I get a 413 response from the server once the upload progress reaches 100% (see the curl sketch after this list).
  • I tried changing the maxSize and limit attributes in the bodyParser.js file without success.
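One way to check which layer is sending the 413 (a sketch — the domain is a placeholder): nginx’s default 413 page is a small HTML document with “nginx” in the footer, which is easy to tell apart from a response generated by the app.

    # create a ~10 MB dummy file and post it straight at the upload route
    dd if=/dev/zero of=/tmp/test-10mb.bin bs=1M count=10
    curl -i -F "files=@/tmp/test-10mb.bin" https://your-app.example.com/api/v1/resources-upload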

Are you running production behind a proxy, nginx for example? If yes, check client_max_body_size in the config.
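For example, something like this in the site’s server block (a sketch — the file path, server_name and upstream port are placeholders; nginx’s default client_max_body_size is only 1m):

    # /etc/nginx/sites-available/your-app — path assumed
    server {
        listen 80;
        server_name your-app.example.com;

        # default is 1m; raise it above the largest expected upload
        client_max_body_size 1024m;

        location / {
            proxy_pass http://127.0.0.1:3333;   # port the Adonis app listens on
        }
    }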

Yes, I am. The client_max_body_size in my proxy config is set to the same value as maxSize in the files: { … } object (1024mb).

If it works locally but not in prod, think about all the differences between those two environments.
Maybe there are several proxies, or the proxy didn’t reload after the update, or something similar.
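A couple of quick checks on the production box, assuming nginx (service name and commands may differ per distro):

    # show the config nginx has actually loaded and look for the directive
    sudo nginx -T | grep client_max_body_size

    # validate the config, then reload so any edits take effect
    sudo nginx -t && sudo systemctl reload nginx

If the grep shows nothing, the directive isn’t in the loaded config at all; if it shows an unexpected value, another include may be overriding it.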