
docs: Add B200 and RTX PRO 6000 Blackwell GPU types to Flash docs#569

Merged
muhsinking merged 2 commits into `main` from `promptless/flash-b200-rtx-pro-6000-gpu-types` on Mar 23, 2026
Conversation


@promptless promptless bot commented Mar 23, 2026


Updates the Flash GPU types documentation with the newly supported Blackwell-architecture GPUs: the B200 (180GB) and the RTX PRO 6000 variants (96GB). Adds both the new GPU pools (`BLACKWELL_96`, `BLACKWELL_180`) and the specific GPU types for precise hardware selection.

Trigger Events

runpod/flash PR #233: feat: refactor GpuType enum names, add B200 and RTX PRO 6000s

Promptless Research (8 files, 1 GitHub PR)

  • Reviewed runpod/flash#233
  • Viewed .long_term_context/product_knowledge/product_overview.md
  • Viewed .long_term_context/doc_workflow/client_instructions.md
  • Viewed flash/configuration/gpu-types.mdx
  • Viewed flash/configuration/parameters.mdx
  • Viewed flash/quickstart.mdx
  • Viewed .long_term_context/style/client_style_guide.md
  • Viewed release-notes.mdx
  • Viewed references/gpu-types.mdx

Agent Response

I've created a documentation suggestion based on the Flash PR #233. Here's a summary:

Documentation Updated:

  • flash/configuration/gpu-types.mdx - Added the new Blackwell-architecture GPUs to both tables:

New GPU Pools:

| Pool | GPUs | VRAM | Use Case |
|------|------|------|----------|
| `GpuGroup.BLACKWELL_96` | RTX PRO 6000 Blackwell (Server, Workstation, Max-Q) | 96GB | Professional Blackwell workloads |
| `GpuGroup.BLACKWELL_180` | B200 | 180GB | Maximum VRAM, next-gen training |

New GPU Types:

| Type | Model | VRAM | Architecture |
|------|-------|------|--------------|
| `GpuType.NVIDIA_B200` | NVIDIA B200 | 180GB | Blackwell |
| `GpuType.NVIDIA_RTX_PRO_6000_BLACKWELL_SERVER_EDITION` | RTX PRO 6000 Server | 96GB | Blackwell |
| `GpuType.NVIDIA_RTX_PRO_6000_BLACKWELL_WORKSTATION_EDITION` | RTX PRO 6000 Workstation | 96GB | Blackwell |
| `GpuType.NVIDIA_RTX_PRO_6000_BLACKWELL_MAX_Q_WORKSTATION_EDITION` | RTX PRO 6000 Max-Q | 96GB | Blackwell |
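The pool/type relationship above can be sketched as plain Python enums. This is an illustrative sketch based only on the names and VRAM figures in these tables, not the actual `runpod_flash` source; the enum string values, the `VRAM_GB` mapping, and the `pools_for` helper are hypothetical.

```python
from enum import Enum


class GpuType(str, Enum):
    # Blackwell-architecture GPU types described in PR #233
    NVIDIA_B200 = "NVIDIA B200"
    NVIDIA_RTX_PRO_6000_BLACKWELL_SERVER_EDITION = "RTX PRO 6000 Server"
    NVIDIA_RTX_PRO_6000_BLACKWELL_WORKSTATION_EDITION = "RTX PRO 6000 Workstation"
    NVIDIA_RTX_PRO_6000_BLACKWELL_MAX_Q_WORKSTATION_EDITION = "RTX PRO 6000 Max-Q"


class GpuGroup(Enum):
    # Each pool's value is the tuple of GPU types it can schedule onto
    BLACKWELL_96 = (
        GpuType.NVIDIA_RTX_PRO_6000_BLACKWELL_SERVER_EDITION,
        GpuType.NVIDIA_RTX_PRO_6000_BLACKWELL_WORKSTATION_EDITION,
        GpuType.NVIDIA_RTX_PRO_6000_BLACKWELL_MAX_Q_WORKSTATION_EDITION,
    )
    BLACKWELL_180 = (GpuType.NVIDIA_B200,)


# Per-GPU VRAM for each pool, taken from the tables above
VRAM_GB = {GpuGroup.BLACKWELL_96: 96, GpuGroup.BLACKWELL_180: 180}


def pools_for(min_vram_gb: int) -> list[GpuGroup]:
    """Return the pools whose per-GPU VRAM meets the requirement."""
    return [group for group, vram in VRAM_GB.items() if vram >= min_vram_gb]


print(pools_for(100))  # a >100GB requirement is only met by the B200 pool
```

Choosing a pool (`GpuGroup.BLACKWELL_96`) keeps scheduling flexible across the three RTX PRO 6000 variants, while a specific `GpuType` pins the workload to exactly one model.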

Notes:

  • The PR initially proposed removing the NVIDIA_ prefix from all enum names (a breaking change), but based on comments from KAJdev, this was reverted. The final PR only adds the new GPU types while keeping existing enum names unchanged.
  • The general references/gpu-types.mdx page already lists these GPUs (auto-generated from the API), so no update needed there.
  • Release notes were not updated since this PR is not yet merged—the new GPUs will be announced when the feature ships.


| Pool | GPUs | VRAM | Use Case |
|------|------|------|----------|
| `GpuGroup.ADA_48_PRO` | RTX 6000 Ada | 48GB | Professional inference |
| `GpuGroup.AMPERE_80` | A100 80GB PCIe, A100-SXM4-80GB | 80GB | XL models, intensive training |
| `GpuGroup.ADA_80_PRO` | H100 80GB HBM3 | 80GB | Cutting-edge inference |
| `GpuGroup.BLACKWELL_96` | RTX PRO 6000 Blackwell (Server, Workstation, Max-Q) | 96GB | Professional Blackwell workloads |
Citation: New GPU groups and types added in PR #233. Changes in src/runpod_flash/core/resources/gpu.py add GpuGroup.BLACKWELL_96 and GpuGroup.BLACKWELL_180 pools, along with GpuType.NVIDIA_B200 and three RTX PRO 6000 Blackwell variants. VRAM specifications (96GB for RTX PRO 6000, 180GB for B200) are derived from the README update in the same PR.


mintlify bot commented Mar 23, 2026

Preview deployment for your docs.

| Project | Status | Preview | Updated (UTC) |
|---------|--------|---------|---------------|
| runpod-docs | 🟢 Ready | View Preview | Mar 23, 2026, 3:34 PM |


promptless bot commented Mar 23, 2026

Just a reminder: If you'd like me to act on any feedback you have via GitHub comments, just type @Promptless in your suggestion and I'll get right on it! (I won't show up in the user dropdown, but I'll process any request that has @Promptless in the comment body.)

muhsinking merged commit f0e5dbc into `main` on Mar 23, 2026
3 checks passed
muhsinking deleted the `promptless/flash-b200-rtx-pro-6000-gpu-types` branch on March 23, 2026 at 16:03