@stevekm
Last active May 31, 2025 07:03
SSH config to allow VS Code Remote SSH to connect to a fresh new SLURM job running on a remote HPC system
# ssh config to use with VS Code Remote SSH
# to launch a VS Code remote ssh session inside of
# a new SLURM compute job running on the remote HPC
# when you start a new ssh session from your LOCAL laptop,
# SLURM will spin up a new srun job and connect to the ssh daemon on the compute node
# your ssh connection will thus tunnel from your LOCAL laptop
# through the head node, to the compute node
# the SLURM job on the compute node will remain active until it expires via its time limit,
# or until the ssh session is closed, whichever comes first
# when you connect VS Code in this way via the Remote SSH extension,
# the VS Code remote server processes will run INSIDE THE COMPUTE NODE
# so you don't BLOW UP THE HEAD NODE with any VS Code tasks
# the default behavior of VS Code with Remote SSH is to attempt to re-connect if the ssh connection is lost
# so every time the SLURM job expires, VS Code will just automatically re-connect, triggering a new SLURM job
# and your VS Code session should continue largely uninterrupted
# REQUIREMENTS
# - able to ssh into the HPC head node
# - able to ssh into a running HPC compute node, from the head node
# - desktop VS Code app installed
# - VS Code Remote SSH extension installed
# start an interactive ssh session like this:
# $ ssh slurm-compute-node
#
# launch VS Code directly with this:
# $ code --remote ssh-remote+slurm-compute-node /home/my-username
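#
# as a quick sanity check, confirm the tunnel lands on a compute node
# and not the head node:
# $ ssh slurm-compute-node hostname
#
# from the head node, the backing SLURM job should also be visible in the
# queue (the job name matches the srun command below):
# $ squeue -u my-username --name=vs-code-session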
Host slurm-compute-node
User my-username
#
# make sure the .pub key for this key is added to ~/.ssh/authorized_keys on the HPC
IdentityFile ~/.ssh/keys/id_ed25519
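#
# a minimal sketch for generating and installing such a key,
# assuming the paths above (adjust as needed):
# $ ssh-keygen -t ed25519 -f ~/.ssh/keys/id_ed25519
# $ ssh-copy-id -i ~/.ssh/keys/id_ed25519.pub my-slurm-hpc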
#
# put your alias to the HPC Cluster head node here after 'ssh':
ProxyCommand ssh my-slurm-hpc "bash -l -i -c 'srun --job-name=vs-code-session --nodes=1 --ntasks=1 --time=04:00:00 --pty bash -l -i -c \"exec nc localhost %p\" ' "
#
# wait up to 10 minutes for the connection; it can take a while for a new SLURM job to be scheduled
ConnectTimeout 600
#
# once connected, keep the tunnel alive
ServerAliveInterval 60
ServerAliveCountMax 10
Host my-slurm-hpc
#
# put your ssh config to log into the remote HPC head node here
#
ProxyCommand lots-of-complicated-stuff-goes-here
#
IdentityFile ~/.ssh/keys/id_ed25519
User my-username
ServerAliveInterval 60
ServerAliveCountMax 10
# NOTE: these untested variants might also work
# ProxyCommand ssh my-slurm-hpc \
# "bash -l -i -c '\
# srun --job-name=vs-code-ssh --nodes=1 --ntasks=1 --time=04:00:00 --pty ssh -o BatchMode=yes -o StrictHostKeyChecking=yes -W localhost:%p $(whoami)@localhost '"
# srun --nodes=1 --ntasks=1 --pty ssh -W localhost:%p $(whoami)@localhost
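#
# if the connection hangs or fails, run ssh with verbose output to watch
# each hop of the ProxyCommand chain negotiate:
# $ ssh -vvv slurm-compute-node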