Is there any downside or performance issue if we extend the index size?
There shouldn't be, since it's you who controls what gets indexed and it's not an "open" endpoint where limits are needed as protection against DoS attacks.
If, for example, you have a few files of around 100 MB that currently don't get indexed, it should be fine, but it's still worth considering why they are so big and whether their content really needs to be indexed. It's good to keep a full index run as fast as possible.
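If it helps to audit which files would exceed a limit before a full index run, something along these lines could work. This is only a sketch: the content root and size threshold are placeholders, not product settings.

```python
import os

# Placeholder values -- adjust to your own content directory and limit.
CONTENT_ROOT = "."
SIZE_LIMIT_MB = 50

def oversized_files(root, limit_mb):
    """Yield (path, size_in_bytes) for files larger than limit_mb."""
    limit_bytes = limit_mb * 1024 * 1024
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            size = os.path.getsize(path)
            if size > limit_bytes:
                yield path, size

if __name__ == "__main__":
    for path, size in oversized_files(CONTENT_ROOT, SIZE_LIMIT_MB):
        print(f"{path}: {size / (1024 * 1024):.1f} MB")
```

Running this over the content directory gives a quick list of candidates to review (or exclude) before raising any limits.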
We are seeing this error while indexing some files:

"Unable to write data to the transport connection: An established connection was aborted by the software in your host machine."

Please note that the files throwing this error are around 90-100 MB in size.
Have you contacted support to get your upload limit raised?
We have a requirement in our application to index some files that are larger than 50 MB. Please let me know if there is a workaround for this.