Test Server/Bastion Host Login

This topic has 3 replies, 2 voices, and was last updated 3 years ago by POLINA LYUBAVINA.
November 10, 2021 at 9:29 pm #971
I'm having trouble getting through the bastion host after logging into the test server. I can see that the authorized_keys file on the test server matches my fab_rsa.pub, but the corresponding private key isn't recognized.
This is the error I receive:
Identity file fab_rsa not accessible: No such file or directory.
The bastion host then denies me access.

November 11, 2021 at 2:33 pm #982

I think one issue might be mixing up which key pair is which. For reference, the initialization code in the Hello, FABRIC notebook looks like the following.
Each user has two key pairs. The first key pair is the one that is installed on the bastion host. The other key pair is the one that is installed in the VMs.
In the Hello, FABRIC notebook example the bastion_key_filename needs to point to the bastion private key. This is the private half of the key pair that we installed in your account on the bastion host. By default this private key is not in your Jupyter environment. You will need to copy that private key to Jupyter and set bastion_key_filename to the path to that private key.
In the Hello, FABRIC notebook example the key pair that is used in your VMs is referenced with the vars ssh_key_file_priv and ssh_key_file_pub. These are set to a default key pair that is automatically in your Jupyter environment. You can use that key pair or create a new one, if you want. If you want to ssh to your VMs from outside of your Jupyter environment, you need to copy the private key to your laptop or other workstation.
```python
bastion_public_addr = 'bastion-1.fabric-testbed.net'
bastion_private_ipv4_addr = '192.168.11.226'
bastion_private_ipv6_addr = '2600:2701:5000:a902::c'
bastion_username = <your bastion id>
bastion_key_filename = os.environ['HOME'] + "/.ssh/id_rsa_fabric"

ssh_key_file_priv = os.environ['HOME'] + "/.ssh/id_rsa"
ssh_key_file_pub = os.environ['HOME'] + "/.ssh/id_rsa.pub"

ssh_key_pub = None
with open(ssh_key_file_pub, "r") as myfile:
    ssh_key_pub = myfile.read()
    ssh_key_pub = ssh_key_pub.strip()
```
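Before running the notebook, it can help to confirm that both private keys referenced above actually exist in the Jupyter environment and have permissions ssh will accept. Here's a minimal stdlib-only sketch (the two paths follow the defaults in the snippet above; the `check_key` helper is mine, not part of the notebook):

```python
import os
import stat

def check_key(path):
    """Report whether an SSH private key file exists and has safe permissions."""
    if not os.path.isfile(path):
        return f"{path}: missing (copy the private key here first)"
    mode = stat.S_IMODE(os.stat(path).st_mode)
    # ssh refuses private keys readable by group/other, so flag loose permissions
    if mode & 0o077:
        return f"{path}: permissions too open ({oct(mode)}); run chmod 600"
    return f"{path}: ok"

for key in [os.environ['HOME'] + "/.ssh/id_rsa_fabric",
            os.environ['HOME'] + "/.ssh/id_rsa"]:
    print(check_key(key))
```

If the bastion key reports "missing", that matches the "Identity file fab_rsa not accessible" error above: the code is pointing at a path where no key file exists.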
Let me know if this was the issue,
Paul
November 11, 2021 at 5:31 pm #997

Hi Paul,
My id_rsa_fabric is called fab_rsa on my local machine, and I've copied it over to the Jupyter host. I updated the code you have above to make that one change. I'm able to build slices; I'm just not able to log in to them. Here's the stack trace I'm getting:
```
Unknown exception: q must be exactly 160, 224, or 256 bits long
Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/paramiko/transport.py", line 2109, in run
    handler(self.auth_handler, m)
  File "/opt/conda/lib/python3.9/site-packages/paramiko/auth_handler.py", line 298, in _parse_service_accept
    sig = self.private_key.sign_ssh_data(blob)
  File "/opt/conda/lib/python3.9/site-packages/paramiko/dsskey.py", line 108, in sign_ssh_data
    key = dsa.DSAPrivateNumbers(
  File "/opt/conda/lib/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/dsa.py", line 244, in private_key
    return backend.load_dsa_private_numbers(self)
  File "/opt/conda/lib/python3.9/site-packages/cryptography/hazmat/backends/openssl/backend.py", line 826, in load_dsa_private_numbers
    dsa._check_dsa_private_numbers(numbers)
  File "/opt/conda/lib/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/dsa.py", line 282, in _check_dsa_private_numbers
    _check_dsa_parameters(parameters)
  File "/opt/conda/lib/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/dsa.py", line 274, in _check_dsa_parameters
    raise ValueError("q must be exactly 160, 224, or 256 bits long")
ValueError: q must be exactly 160, 224, or 256 bits long
```
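For what it's worth, the `dsskey.py` frames in that traceback suggest paramiko is treating the key file as a DSA key rather than RSA, which usually means the file isn't in the format the code expects. A quick stdlib-only way to sanity-check a private key file is to look at its PEM header; this is a sketch I put together, not FABRIC-provided code, and the header strings are the standard PEM markers:

```python
# Map the first line of a private key file to the key type it advertises.
# A DSA or unrecognized header here would explain the "q must be exactly
# 160, 224, or 256 bits long" error from paramiko's DSA code path.
def key_header_type(first_line):
    headers = {
        "-----BEGIN RSA PRIVATE KEY-----": "RSA (PEM)",
        "-----BEGIN DSA PRIVATE KEY-----": "DSA (PEM)",
        "-----BEGIN EC PRIVATE KEY-----": "EC (PEM)",
        "-----BEGIN OPENSSH PRIVATE KEY-----": "OpenSSH (type encoded in body)",
        "-----BEGIN PRIVATE KEY-----": "PKCS#8 (type encoded in body)",
    }
    return headers.get(first_line.strip(), "unrecognized")

def inspect_key(path):
    """Read just the first line of a key file and report its apparent type."""
    with open(path) as f:
        return key_header_type(f.readline())
```

For example, `inspect_key(os.environ['HOME'] + "/.ssh/fab_rsa")` run in the Jupyter environment would show whether the copied file still looks like the RSA key it should be.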
Here’s the printed output:
Node Node1 IP 63.239.135.112
q must be exactly 160, 224, or 256 bits long

Thanks,
Polina
November 11, 2021 at 8:23 pm #999

I meant to post this in the No such file or directory discussion 🙂