Lack of space in Server Filesystem
Tagged: file-download, jupyter, storage
Hi,
My experiments run on VMs in a slice, and some of the VMs generate large files. Even after compression the files remain large; for example, one 4.3 GB file still compresses to 2.5 GB.
The problem is that when I try to download such a file (via scp) to my Jupyter server, the transfer fails because the workspace mount point there has only 1 GB of total storage available. How can I either increase the Jupyter server's storage or download such large files?
Hello,
You can copy the SSH configuration file from work/fabric_config in the Jupyter Hub to your laptop (e.g., into your ~/.ssh/ directory), along with the referenced bastion and sliver keys, and then use scp to copy the large files from the VMs directly onto your laptop. We are not able to provide significant storage space inside the Jupyter Hub.
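For later readers, the workflow above looks roughly like the sketch below. The key names, the sliver IP, and the remote path are all placeholders (your work/fabric_config will contain the actual files); the final echo just prints the scp invocation you would run, so the shape of the command is visible without live FABRIC credentials.

```shell
# Hypothetical layout: ssh_config and keys copied out of the Hub's
# work/fabric_config into a local directory on your laptop.
mkdir -p "$HOME/.ssh/fabric"
CFG="$HOME/.ssh/fabric/ssh_config"

VM_IP="203.0.113.10"                      # placeholder sliver IP
REMOTE="/home/ubuntu/results.tar.gz"      # placeholder path on the VM

# The scp command: -F points at the copied FABRIC ssh config, which in
# turn references the bastion and sliver keys, so the file travels
# VM -> bastion -> laptop and never touches the 1 GB Jupyter workspace.
CMD="scp -F $CFG ubuntu@$VM_IP:$REMOTE ."
echo "$CMD"
```

Remember to `chmod 600` the copied keys, or ssh will refuse to use them.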
Thanks Ilya, this solution worked. It requires a bit of manual intervention but is still mostly scriptable. It could probably be fully scripted with the FABlib APIs as well; I will look into that when time permits.