@rubenfb23 rubenfb23 commented Nov 4, 2025

  • Added a thin `pufferlib.postprocess` shim that re-exports the legacy wrappers, so existing environment modules keep importing from the old path without breaking.
  • Switched the training TF32 setup to the new `torch.backends.cuda.matmul.fp32_precision` / `torch.backends.cudnn.conv.fp32_precision` knobs, with fallbacks for older runtimes, eliminating the PyTorch 2.9 deprecation warning.
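
The first bullet's re-export shim follows a standard pattern: register a tiny module under the legacy import path whose names simply alias the relocated implementations. A minimal self-contained sketch of that pattern (all module and class names here are stand-ins, not the PR's actual identifiers):

```python
import sys
import types

# Stand-in for the module that now hosts the wrappers (hypothetical name;
# in the PR this would be a real pufferlib module).
_impl = types.ModuleType("wrappers_impl")

class EpisodeStats:  # placeholder for a legacy wrapper class
    pass

_impl.EpisodeStats = EpisodeStats
sys.modules["wrappers_impl"] = _impl

# The shim: a tiny module registered under the legacy import path that
# re-exports the relocated names, so old `from ... import X` code still works.
shim = types.ModuleType("postprocess_shim")
shim.EpisodeStats = sys.modules["wrappers_impl"].EpisodeStats
shim.__all__ = ["EpisodeStats"]
sys.modules["postprocess_shim"] = shim
```

In the real shim the module file would live at the legacy path and contain only the re-export imports; registering via `sys.modules` here just keeps the sketch runnable in one file.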
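
The second bullet's version guard can be sketched as a try-the-new-knob, fall-back-to-the-old-flag helper. The `fp32_precision` / `allow_tf32` names are the real PyTorch knobs; the helper name and the pass-a-backend-object shape are assumptions made so the fallback path is easy to exercise without a GPU:

```python
def set_fp32_precision(backend, mode="tf32"):
    """Prefer the PyTorch 2.9+ `fp32_precision` knob; fall back to `allow_tf32`.

    `backend` stands in for an object like torch.backends.cuda.matmul or
    torch.backends.cudnn.conv. Older runtimes reject the new attribute
    with AttributeError, so we fall back to the boolean flag.
    """
    try:
        backend.fp32_precision = mode  # new-style string knob (PyTorch 2.9+)
        return "fp32_precision"
    except AttributeError:
        backend.allow_tf32 = (mode == "tf32")  # legacy boolean flag
        return "allow_tf32"
```

In training code this would be called on `torch.backends.cuda.matmul` and `torch.backends.cudnn.conv`; note that on runtimes old enough to lack `torch.backends.cudnn.conv` entirely, the legacy flag lives at `torch.backends.cudnn.allow_tf32` instead, which a real implementation would also have to handle.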