How can I re-queue a URL? #942

@matecsaj

Description

Crawlee optimizes crawling by suppressing duplicate URLs and by giving up on a URL once it reaches a configurable retry limit. This is great, but I've run into an edge case.

Situation: You're extracting data from a page and realize that it wasn’t downloaded properly.

Ideal Goal: Re-queue the URL while decrementing its retry counter.

Alternative: Add the URL back to the queue with a fresh retry counter.

How can I achieve either of these?
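Crawlee's deduplication is keyed on each request's unique key, which by default is derived from the URL, so simply re-adding the same URL is suppressed. A common workaround for the "fresh retry counter" alternative is to enqueue the request again with a distinct unique key. The sketch below demonstrates the idea with a toy deduplicating queue; `ToyRequestQueue` and `requeue` are illustrative stand-ins, not Crawlee's actual API.

```python
import uuid
from typing import Optional


class ToyRequestQueue:
    """Minimal stand-in for a deduplicating request queue (not Crawlee's API).

    Like Crawlee, it suppresses requests whose unique key has been seen
    before; the unique key defaults to the URL itself.
    """

    def __init__(self) -> None:
        self._seen: set = set()
        self.pending: list = []  # (url, unique_key) pairs awaiting processing

    def add(self, url: str, unique_key: Optional[str] = None) -> bool:
        key = unique_key if unique_key is not None else url
        if key in self._seen:
            return False  # duplicate unique key: request is suppressed
        self._seen.add(key)
        self.pending.append((url, key))
        return True


def requeue(queue: ToyRequestQueue, url: str) -> bool:
    """Re-add an already-seen URL by giving it a fresh unique key.

    Appending a random suffix makes the key distinct, so the dedup check
    no longer suppresses it, and the request gets a fresh retry budget.
    """
    return queue.add(url, unique_key=f"{url}#requeue-{uuid.uuid4().hex}")


queue = ToyRequestQueue()
queue.add("https://example.com/page")       # accepted
queue.add("https://example.com/page")       # suppressed as a duplicate
requeue(queue, "https://example.com/page")  # accepted again
print(len(queue.pending))  # 2
```

The same pattern should translate to Crawlee by passing a custom unique key when constructing the re-enqueued request, at the cost of losing the original request's retry history (it starts over, matching the "fresh retry counter" alternative rather than the decrement-the-counter ideal).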
