Actually, I have two questions about using Docker.
1) How do we enable the GPU version of TensorFlow inside the cloned CodaLab Docker image?
As far as I know, we need to use nvidia-docker to achieve this?
https://github.com/NVIDIA/nvidia-docker
2) Also, let's say I make changes inside the cloned CodaLab Docker image and push it to my Docker Hub repository.
Do we need to submit this information for you to run our code?
Hi,
We do not provide GPU support, sorry! As for TensorFlow, for now you could install it from your model's code. We will try to include it in the competition Docker image.
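For reference, a minimal sketch of that approach, installing TensorFlow at runtime from the model's own code; the pip invocation and the version pin here are assumptions, not part of the official competition image:

```python
# Minimal sketch: install TensorFlow from the model's own code if the
# competition image does not ship with it. The version pin is an assumption.
import subprocess
import sys


def ensure_tensorflow(version="1.10.0"):
    """Install TensorFlow via pip only if it is not already importable."""
    try:
        import tensorflow  # noqa: F401
    except ImportError:
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", "tensorflow==" + version]
        )


ensure_tensorflow()
import tensorflow as tf
print(tf.__version__)
```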
Regarding the other question: no, sorry, for this competition we do not collect participants' Docker images, only their models.
Best
Posted by: hugo.jair @ Aug. 13, 2018, 2:23 p.m.