GCP client pagination should be iterative to avoid recursion depth failures #2337

@drwicid

Description

Summary

The GCP client currently walks paginated API responses recursively, consuming one stack frame per page. That works for small result sets, but it can fail once an API returns more pages than Python's recursion limit allows, such as large Security Command Center findings collections.

The fix is to convert pagination from recursion to an iterative loop while preserving the existing request and next-page behavior.
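A minimal sketch of the iterative shape, assuming a hypothetical `execute` callable and a `nextRequest` field on responses; the names are illustrative and not the actual fix_plugin_gcp API:

```python
def collect_all_pages(first_request, execute):
    """Accumulate items from every page using a loop instead of recursion.

    `execute` is a stand-in for whatever issues one API request and returns
    its decoded response; `nextRequest` models the API's next-page handle.
    """
    results = []
    request = first_request
    while request is not None:          # loop, so stack depth stays constant
        response = execute(request)
        results.extend(response.get("items", []))
        request = response.get("nextRequest")  # None ends the loop
    return results
```

The accumulation is unchanged from the recursive version; only the control flow moves from the call stack into the loop variable.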

Impact

  • Large paginated collections can fail with Python recursion depth errors.
  • Collection reliability drops for APIs with many small pages.
  • A low-level client failure can block otherwise valid resource collection.

Affected Files And Changes

  • plugins/gcp/fix_plugin_gcp/gcp_client.py
    • Replace recursive next_responses() paging with an iterative request loop.
    • Keep page accumulation logic unchanged.
  • plugins/gcp/test/test_gcp_client.py
    • Add a regression test that simulates 1500 pages and verifies the full result is collected.
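The regression test could look roughly like the following self-contained sketch, where `fake_execute` and `collect_all_pages` are hypothetical stand-ins for the real client plumbing and 1500 pages exceeds Python's default recursion limit of 1000:

```python
def fake_execute(request):
    """Simulate one page of an API response; names are illustrative."""
    page = request["page"]
    response = {"items": [f"finding-{page}"]}
    if page < 1500:                       # 1500 pages > default recursion limit
        response["nextRequest"] = {"page": page + 1}
    return response

def collect_all_pages(first_request, execute):
    """Iterative paging loop under test (sketch, not the real client)."""
    results = []
    request = first_request
    while request is not None:
        response = execute(request)
        results.extend(response["items"])
        request = response.get("nextRequest")
    return results

def test_many_pages():
    results = collect_all_pages({"page": 1}, fake_execute)
    assert len(results) == 1500
    assert results[0] == "finding-1"
    assert results[-1] == "finding-1500"
```

A recursive implementation would raise RecursionError well before page 1500, so this test fails against the old code and passes against the loop.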

Expected Behavior

  • Pagination should continue until the API returns no next request.
  • Result aggregation should be identical to the recursive implementation.
  • Large page counts should not fail because of recursion depth.

Version

v4.3.0

Environment

Docker

Steps to Reproduce

  1. Start a collection against a large GCP project with a high volume of gcp_scc_finding resources
  2. Observe a Python RecursionError
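The failure mode can be reproduced in isolation with a toy recursive pager; the names here mimic the shape of the bug, not the actual gcp_client.py code:

```python
import sys

def next_responses(page, acc):
    """Toy recursive pager: one stack frame per page (illustrative only)."""
    acc.append(f"finding-{page}")
    if page < 5000:                    # far more pages than the stack allows
        next_responses(page + 1, acc)  # recursion grows the call stack
    return acc

try:
    next_responses(1, [])
except RecursionError:
    # The default limit (sys.getrecursionlimit() == 1000 in CPython) is hit
    # long before page 5000, aborting the collection mid-way.
    print("RecursionError before all pages were collected")
```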

Logs

Additional Context

No response

Metadata

Assignees

No one assigned

Labels

bug (Something isn't working)

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests