File save error and Load file error

  • #3709
    Xusheng Ai
    Participant

      Hello,

      We were running a layer 2 network test experiment with our Jupyter notebook, which we created within JupyterHub. After we requested the slice, a file save error popped up. The network itself works properly. Please see the detailed information below:

      At the same time, we could not open other example notebooks either; the error shows a file load error for example.ipynb. Please see the screenshot below:

      I am not sure what might have caused this. I was wondering if I could get some help with this issue.

      Thanks,
      Xusheng

  • #3714
    Paul Ruth
    Keymaster

      It looks like you have filled your disk allocation on our JupyterHub. Do you have old files that you can clean up?
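      For anyone hitting the same limit, a quick way to see what is using the space is a couple of standard commands run from a JupyterHub terminal (nothing FABRIC-specific is assumed here):

        # list the 20 largest files and directories under the home directory
        du -ah ~ | sort -rh | head -n 20

        # check overall usage of the home filesystem
        df -h ~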

  • #3724
    Xusheng Ai
    Participant

      I deleted some old files, and it works well now. Thank you so much for the hints.

  • #3725
    Xusheng Ai
    Participant

      Is there any chance that we could have a larger disk on JupyterHub?

  • #3726
    Paul Ruth
    Keymaster

      What are you using it for? Generally, the JupyterHub is a good place for code/scripts/docs (i.e. smaller things). Do you need space for large data sets? If so, we can create a persistent storage volume in the testbed itself.
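      For reference, attaching such a volume from a notebook might look roughly like the sketch below. This is a hedged sketch using fablib (fabrictestbed-extensions); the volume name, node name, and site are hypothetical placeholders, and the volume itself has to be provisioned for your project by FABRIC staff before it can be attached:

        # hypothetical sketch; assumes a pre-provisioned project volume
        # named 'mystorage' and the fabrictestbed-extensions (fablib) API
        from fabrictestbed_extensions.fablib.fablib import FablibManager

        fablib = FablibManager()

        slice = fablib.new_slice(name="storage-demo")
        node = slice.add_node(name="node1", site="UKY")
        node.add_storage(name="mystorage")  # attach the persistent volume
        slice.submit()

        # the volume appears as an extra block device on the node;
        # identify it (e.g. with lsblk) and mount it where the data should live
        node = fablib.get_slice(name="storage-demo").get_node(name="node1")
        stdout, stderr = node.execute("lsblk")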

  • #3727
    Xusheng Ai
    Participant

      We are trying to upload a 1 GB bio-related file to the slice nodes through JupyterHub, since there is no way to upload files from a local machine to FABRIC nodes directly. I checked the disk space: we have 1 GB of available disk space in total. If it were possible to create a persistent storage volume in the testbed, it would be much appreciated.

  • #3728
    yoursunny
    Participant

      “there is no way to upload files from local to Fabric nodes directly”

      It’s possible in two ways (both sketched below):

      • Host your file with an HTTPS server somewhere on the Internet (with HTTP Basic authentication if desired), and download it on the nodes with the wget command.
      • Add the nodes to your local ~/.ssh/config with ProxyJump through the bastion, and then run scp to upload the file to the nodes.

      I’ve done both in different experiments, but only the first one can be automated.
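      To make the two options concrete, here is a rough sketch. All hostnames, usernames, key paths, and file names below are placeholders; check the FABRIC portal for your actual bastion hostname, bastion username, and keys:

        # Option 1: host the file behind HTTPS and pull it from each node
        wget --user alice --ask-password https://files.example.com/bio-data.tar.gz

        # Option 2: ~/.ssh/config on the local machine, jumping through the bastion
        Host fabric-bastion
            HostName bastion.fabric-testbed.net
            User YOUR_BASTION_USERNAME
            IdentityFile ~/.ssh/fabric_bastion_key

        Host my-node
            HostName NODE_MANAGEMENT_IP
            User ubuntu
            IdentityFile ~/.ssh/fabric_sliver_key
            ProxyJump fabric-bastion

        # then upload directly from the local machine:
        scp ./bio-data.tar.gz my-node:~/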

  • #3766
    Ilya Baldin
    Participant

      Just by way of explanation: we host the JupyterHub in Google Cloud, which costs real dollars allocated to us by NSF via a project named CloudBank. We are still evaluating the true costs of running it in its current configuration (so we can more accurately project future costs). We may revise the amount of disk space and other resources each notebook server gets; however, we are constrained by the budget, and this is not a decision we will be making in the near term.

      In general, the Hub is not intended as a place to park or transfer large files.
