Memory Issues with large files

When I use data exported from a 35 MB Excel file, it brings Grist to a crawl, with almost 95% memory usage on a 16 GB machine.

Is it the same for everyone? Is there any memory-related setting that can be tweaked?

Hi there!

Are you using our hosted version of Grist (available at getgrist.com) or using a self-hosted version?

Would you be able to share your Excel file with support@getgrist.com? A 35MB document should not take up 16GB of memory.

The document linked below is a random, large document available publicly. Try importing that and see how Grist fares. It should take about 39MB of memory to load it. If it loads fine, then the problem is likely with the Excel file.

https://www.stats.govt.nz/assets/Uploads/International-trade/International-trade-June-2023-quarter/Download-data/overseas-trade-indexes-june-2023-quarter-provisional.csv
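If you'd like a rough baseline outside of Grist first, a quick sketch like the one below (assuming a Python environment with `requests` and `pandas` installed) downloads that CSV and reports its on-disk and in-memory size. The pandas figures are only an approximation and won't match Grist's own memory usage exactly.

```python
import io

import pandas as pd
import requests

URL = (
    "https://www.stats.govt.nz/assets/Uploads/International-trade/"
    "International-trade-June-2023-quarter/Download-data/"
    "overseas-trade-indexes-june-2023-quarter-provisional.csv"
)

# Download the CSV and note its size on disk.
resp = requests.get(URL, timeout=120)
resp.raise_for_status()
print(f"Download size: {len(resp.content) / 1e6:.1f} MB")

# Load it with pandas and report an approximate in-memory footprint.
df = pd.read_csv(io.BytesIO(resp.content))
mem_mb = df.memory_usage(deep=True).sum() / 1e6
print(f"Rows: {len(df)}, columns: {len(df.columns)}")
print(f"Approximate in-memory size: {mem_mb:.1f} MB")
```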

Thanks,
Natalie

I am using a self-hosted local version on Windows 11.

I cannot share the file since it is work-related. However, I can share that it has approximately:

- 4 sheets of 7,000 rows by 200 columns
- full Unicode text (in at least 5 different languages)
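In case it helps to reproduce the issue without the real data, here is a minimal sketch (assuming `pandas` and `openpyxl` are installed) that generates a synthetic Excel file with roughly the same shape: 4 sheets of 7,000 rows by 200 columns of multilingual Unicode text. The languages and cell contents are placeholders, not the actual data, and generating that many cells will take a while.

```python
import random
import string

import pandas as pd

# Rough shape of the real workbook: 4 sheets, ~7,000 rows x 200 columns.
N_SHEETS, N_ROWS, N_COLS = 4, 7000, 200

# Placeholder words in a few scripts to stand in for the real multilingual text.
SAMPLES = ["données", "Beispieltext", "пример", "例のテキスト", "مثال"]

def random_cell() -> str:
    # A short Unicode string, similar in spirit to the real data.
    suffix = "".join(random.choices(string.ascii_letters, k=8))
    return random.choice(SAMPLES) + "-" + suffix

with pd.ExcelWriter("synthetic-workbook.xlsx", engine="openpyxl") as writer:
    for sheet in range(N_SHEETS):
        data = {
            f"col_{c}": [random_cell() for _ in range(N_ROWS)]
            for c in range(N_COLS)
        }
        pd.DataFrame(data).to_excel(
            writer, sheet_name=f"Sheet{sheet + 1}", index=False
        )
```

Importing a synthetic file like this into Grist and watching memory usage should help show whether the growth comes from the sheer shape of the workbook or from something specific to the real file.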

For a database with more than 400k rows (the raw data indicates a total size of 120 MB), `docker stats` reports 6.36 GiB of memory usage.