Hi, it seems the size of the CSV is hitting some server limits.
One approach could be to increase those limits. I'm NOT an expert in server config, but a bit of googling for both "Request Entity Too Large" and "the amount of data provided in the request exceeds the capacity limit" turns up posts that mention a bunch of settings that could cause it, e.g.:
https://stackoverflow.com/questions/8896644/request-entity-too-large
https://stackoverflow.com/questions/18121227/how-to-avoid-request-entity-too-large-413-error
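By way of illustration only, these are the sorts of settings those posts talk about. The values and file locations below are just assumptions on my part (check with your host, and reload the web server / PHP after editing):

# Nginx: in nginx.conf or the site's server block
client_max_body_size 64M;

# Apache: in httpd.conf, a vhost, or .htaccess (value is in bytes)
LimitRequestBody 67108864

# PHP: in php.ini, in case PHP's own POST limits are also being hit
post_max_size = 64M
upload_max_filesize = 64M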
OTHER APPROACHES:
1) Avoid the huge POST form and have the CSV written to a FILE on the server
Are you doing the CSV extract through the admin backend on a non-public list? Temporarily make the list public (if it's sensitive data, MAKE sure that the list does NOT have 'Show CSV' ticked in the overview, so the CSV does not get offered as an option on the front end).
On the next cache run the plugin will write the CSV to a file. Download that file.
Then make the list NOT public again; the file will be deleted.
2) OR reduce the size of the data. Are all fields required?
Do you have any junk users who could be excluded from the list by some criteria? Or break the list into smaller sublists.
See note 1 re large lists near the bottom of the main plugin description: https://www.ads-software.com/plugins/amr-users/
DO please let us know how you go, for the benefit of any others.