Hello there,
I am reaching out to seek advice and share my experience with managing and optimizing large datasets in Grist. Recently, I have been working on a project that involves handling a substantial amount of data: over 200,000 rows and counting.
While Grist has been fantastic for organizing and visualizing my data, I am starting to notice some performance issues, particularly when running complex queries and generating reports.
Some of my more complex queries take significantly longer to run than I would like. Are there any best practices for optimizing query performance in Grist?
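To make that concrete, here is a simplified sketch of the kind of report query I run. The document ID, table, and column names are placeholders, and I am currently pulling the whole table through the REST API and aggregating on my side, which may well be part of the problem:

```python
import os
from collections import defaultdict

import requests

# Placeholders -- my real document ID, table, and column names differ.
SERVER = "https://docs.getgrist.com"
DOC_ID = "YOUR_DOC_ID"
TABLE = "Transactions"
API_KEY = os.environ["GRIST_API_KEY"]

# Fetch the entire table (200,000+ rows) in one request...
resp = requests.get(
    f"{SERVER}/api/docs/{DOC_ID}/tables/{TABLE}/records",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
resp.raise_for_status()

# ...then aggregate client-side to build the report.
totals = defaultdict(float)
for record in resp.json()["records"]:
    fields = record["fields"]
    totals[fields["Region"]] += fields["Amount"] or 0

for region, amount in sorted(totals.items()):
    print(region, amount)
```

Fetching everything and filtering locally feels wasteful at this scale, but I have not figured out whether filtering on the server side or restructuring the document would help more.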
Importing large CSV files into Grist can be quite slow. Is there a more efficient method for importing data, or are there any recommended tools that integrate well with Grist to streamline this process?
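If it helps to make the question concrete, a batched load through the REST API along the lines below is the sort of workaround I have been considering; the IDs, column handling, and batch size are placeholder guesses rather than something I know to be faster:

```python
import csv
import os

import requests

# Placeholder IDs and file name -- the real ones differ.
SERVER = "https://docs.getgrist.com"
DOC_ID = "YOUR_DOC_ID"
TABLE = "Transactions"
API_KEY = os.environ["GRIST_API_KEY"]
BATCH_SIZE = 500  # arbitrary guess, not a tuned value

def add_records(batch):
    """POST one batch of rows to the table's /records endpoint."""
    resp = requests.post(
        f"{SERVER}/api/docs/{DOC_ID}/tables/{TABLE}/records",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"records": [{"fields": row} for row in batch]},
    )
    resp.raise_for_status()

with open("export.csv", newline="") as f:
    reader = csv.DictReader(f)  # note: every value arrives as a string here
    batch = []
    for row in reader:
        batch.append(row)
        if len(batch) >= BATCH_SIZE:
            add_records(batch)
            batch = []
    if batch:  # send whatever is left over
        add_records(batch)
```

Is a batched approach like this actually faster than the built-in importer for files of this size, or is there a better-supported path for bulk loads?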
I have several columns with complex formulas that seem to be impacting the overall performance. How can I optimize these calculations to reduce lag time?
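To illustrate, one of the heavier columns looks roughly like the sketch below (table and column names are simplified stand-ins). It scans every row of another table for each record, which I suspect scales badly at 200,000+ rows:

```python
# Formula on a "Region_Total" column (simplified; real names differ).
# For each row, this walks the entire Transactions table, so the work
# grows with rows-in-this-table times rows-in-Transactions.
sum(
    t.Amount
    for t in Transactions.all
    if t.Region == rec.Region and t.Status == "Paid"
)
```

Would rewriting columns like this around Transactions.lookupRecords(...) be the right direction, or is the recalculation lag likely coming from somewhere else?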
Are there specific settings or configurations within Grist that can help better allocate resources for handling large datasets?
I have read through the official documentation and explored some of the community discussions, but I am hoping to gather more insights from those who have dealt with similar issues. Any tips, tricks, or tools that you have found helpful would be greatly appreciated.
Also, I have gone through this post: https://community.getgrist.com/t/grist-accepting-data-very-slowly/3417, which definitely helped me out a lot.
Additionally, if anyone has recommendations for resources or tutorials that dive deeper into advanced Grist usage and performance optimization, I would love to check them out.
Thank you in advance for your help and suggestions.