Hi,
It would be great if there were a simpler way to include extra Python libraries in a self-hosted Grist instance.
I think it could be done easily if the Docker image took an additional ENV variable, EXTRA_PYTHON3_REQUIREMENTS,
which could be set to e.g. /persist/requirements.txt and would get installed when the container is created.
I imagine you could potentially run into clashes if a user wants to install a higher version of a package than Grist supports/needs, but if the user-provided requirements.txt is installed first and Grist's default requirements.txt is then force-installed on top of it, that should be a sufficient solution for now.
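Something roughly like this in the image's startup script is what I have in mind (just a sketch; the path to Grist's own pinned requirements file inside the image is a guess on my part):

```sh
#!/usr/bin/env bash
# Sketch of a hypothetical startup hook, not the actual Grist entrypoint.
set -e

if [ -n "${EXTRA_PYTHON3_REQUIREMENTS:-}" ] && [ -f "$EXTRA_PYTHON3_REQUIREMENTS" ]; then
  # 1. Install the user's extra packages first.
  pip3 install -r "$EXTRA_PYTHON3_REQUIREMENTS"
  # 2. Re-apply Grist's own pinned requirements so any version clash resolves
  #    in Grist's favour (the path below is an assumption about the image layout).
  pip3 install --force-reinstall -r /grist/sandbox/requirements.txt
fi

exec "$@"   # continue with the normal container command
```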
Your idea makes sense @Pawel_Cwiek. My one concern would be needing to go through an installation step each time a container is created. Perhaps a volume could be mounted or a subdirectory of /persist
used to store and retain the installed material?
Hi, I’m not that fluent in Docker and it’s still quite new to me, but your suggestion of a subdirectory of /persist makes sense. When creating a container you would still need to try installing from the file specified by that custom requirements variable, but it could skip the packages already present in that /persist subfolder from previous installs.
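Roughly what I'm picturing (just a sketch; the folder names are made up and the checksum trick is only one way of skipping work that was already done on a previous start):

```sh
#!/usr/bin/env bash
# Sketch: install user packages into a persistent target directory and only
# repeat the work when the requirements file actually changes.
set -e

REQS="${EXTRA_PYTHON3_REQUIREMENTS:-/persist/requirements.txt}"
TARGET=/persist/python-packages          # hypothetical persistent subfolder
STAMP="$TARGET/.requirements.sha256"

if [ -f "$REQS" ]; then
  mkdir -p "$TARGET"
  NEW_SUM=$(sha256sum "$REQS" | cut -d' ' -f1)
  if [ ! -f "$STAMP" ] || [ "$(cat "$STAMP")" != "$NEW_SUM" ]; then
    pip3 install --target "$TARGET" -r "$REQS"
    echo "$NEW_SUM" > "$STAMP"
  fi
fi
```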
I was just having a thought about this: the easiest way to implement it in the way you mentioned, @paul-grist, would be if that env variable were a path that gets appended to the PYTHONPATH variable inside the container. Python then uses that variable to discover available packages.
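For example, something like this when creating the container (the image name and port are from the standard Grist Docker instructions as far as I remember; the folder names are made up):

```sh
# Sketch: point Python at the persistent package folder when creating the container.
docker run -d --name grist \
  -v /opt/grist-persist:/persist \
  -e EXTRA_PYTHON3_REQUIREMENTS=/persist/requirements.txt \
  -e PYTHONPATH=/persist/python-packages \
  -p 8484:8484 \
  gristlabs/grist
```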
The issue with binding it in would be if the version of Python changed. Given that Python suffers from notorious DLL hell, and since from the above it seems the system Python interpreter is used (so the interpreter version would change if the base Debian image got bumped), I think installing on first boot-up would be acceptable. Other images perform special actions on first startup, and I’d usually expect this if I recreated a container, so it’s a known idiom; as an administrator I would be prepared to wait a bit while it downloaded some things if I decided to nuke it and start again. Also, mounting the site-packages folder would still be possible for users who want to do this and accept the risks. It would permit e.g. custom packages built locally, without exec-ing in and installing a wheel.
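For the bind-mount variant, a sketch like this could work for anyone willing to take that risk (the container name is just an example, and the exact site-packages path depends on the image's Python, so check it first):

```sh
# Find out where the interpreter inside the image keeps its site-packages.
docker exec grist python3 -c "import site; print(site.getsitepackages())"

# Then recreate the container with that directory mounted from the host,
# e.g. (the container-side path will vary with the image's Python version):
docker run -d --name grist \
  -v /opt/grist-site-packages:/usr/lib/python3/dist-packages \
  gristlabs/grist
```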
Also, I see that I need to edit gencode.py
in order to import packages; an editable area in the UI would be nice, but in the meantime it would be useful to be able to inject some custom code into the interpreter Grist is running. From there I could easily add custom routines, import packages (you could perhaps even lazily load required packages yourself), etc. In that case it might make sense to separate core Grist from user modifications behind some neat interface. General users would then get a vanilla setup baked into the container image, while self-hosters who want to fine-tune their use of Python could have full access to customisation, with Grist hooking into it in a clean way, e.g. by mounting in a hook file and going from there.