
Conversation

@michelle0927 (Collaborator) commented Oct 6, 2025

Resolves #10939

Summary by CodeRabbit

  • New Features

    • Added a Crawl URL action to fetch web pages with options to select a scraper, capture screenshots, manage storage, and include custom headers. Provides a success summary and returns the API response.
    • Introduced a preset list of supported scrapers for easier selection.
  • Chores

    • Bumped package version to 0.1.0 and added a required platform dependency.

vercel bot commented Oct 6, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

2 Skipped Deployments

| Project | Deployment | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| pipedream-docs | Ignored | Ignored | | Oct 6, 2025 6:36pm |
| pipedream-docs-redirect-do-not-edit | Ignored | Ignored | | Oct 6, 2025 6:36pm |
coderabbitai bot (Contributor) commented Oct 6, 2025

Walkthrough

Adds a new Crawl URL action using a refactored Crawlbase app client. Introduces a constants module with scraper identifiers. Replaces a deprecated auth method with a generic request helper. Updates package metadata and adds a dependency.

Changes

| Cohort / File(s) | Change Summary |
| --- | --- |
| **New action: Crawl URL**<br>`components/crawlbase/actions/crawl-url/crawl-url.mjs` | Adds a default-exported action to crawl a URL. Defines props (`crawlbase`, `url`, `scraper`, `screenshot`, `store`, `getHeaders`). Uses the app's `makeRequest` to call the API and exports a success summary. |
| **Common constants**<br>`components/crawlbase/common/constants.mjs` | Introduces a `SCRAPERS` array and a default export `{ SCRAPERS }`. |
| **App client refactor**<br>`components/crawlbase/crawlbase.app.mjs` | Removes `authKeys()`; adds `_baseUrl()` and `makeRequest({...})`, which builds axios calls with the token and path. Centralizes HTTP logic. |
| **Package metadata**<br>`components/crawlbase/package.json` | Bumps version 0.0.1 → 0.1.0; adds dependency `"@pipedream/platform": "^3.1.0"`. |

Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    participant U as User
    participant A as Crawl URL Action
    participant C as Crawlbase App (makeRequest)
    participant API as Crawlbase API
    U->>A: Provide props (url, scraper, screenshot, store, getHeaders)
    A->>C: makeRequest({ path:"/...", params:{ url, scraper, screenshot, store, get_headers, format } })
    C->>API: HTTP request with api_token and params
    API-->>C: Response (result payload)
    C-->>A: Response
    A-->>U: Export summary "Successfully crawled URL: <url>" and return response
```
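As a concrete instance of the flow above, here is a hedged sketch of the query parameters the action would hand to `makeRequest`. The values shown are illustrative only, and the `format` key and snake_case `get_headers` mapping are taken from the diagram rather than the diff itself.

```javascript
// Hypothetical param mapping (names follow the change summary; values are illustrative).
const params = {
  url: "https://example.com",          // page to crawl
  scraper: "amazon-product-details",   // assumed entry from the SCRAPERS constant
  screenshot: true,                    // request a screenshot of the page
  store: false,                        // whether Crawlbase should store the result
  get_headers: true,                   // snake_case key expected by the API (per the diagram)
  format: "json",
};
// The app's makeRequest helper is expected to append the account token before the HTTP call.
```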

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Poem

I hop through links with whiskers high,
New paths to crawl, beneath the sky.
A bundle of scrapers, neatly packed,
One makeRequest—clean, exact.
Version bumped, I thump with glee,
Logs are quiet, but pages spree. 🐇✨

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)

| Check name | Status | Explanation | Resolution |
| --- | --- | --- | --- |
| Description Check | ⚠️ Warning | The pull request description only contains a reference to the issue number and does not include the required "## WHY" section or any explanation of the motivation, so it does not follow the repository template. | Please expand the description to include a "## WHY" section explaining the motivation for this change per the repository template. |

✅ Passed checks (4 passed)

| Check name | Status | Explanation |
| --- | --- | --- |
| Title Check | ✅ Passed | The title clearly indicates the added Crawlbase action named crawl-url, matching the primary change to introduce this action module in the Crawlbase component. |
| Linked Issues Check | ✅ Passed | The changes add the Crawlbase integration by introducing the makeRequest method, new crawl-url action, and related constants, satisfying the objectives of integrating Crawlbase functionality from issue #10939. |
| Out of Scope Changes Check | ✅ Passed | All modifications are focused on implementing the Crawlbase integration and the new action; there are no unrelated or out-of-scope code changes present. |
| Docstring Coverage | ✅ Passed | No functions found in the changes. Docstring coverage check skipped. |
✨ Finishing touches
  • 📝 Generate docstrings
🧪 Generate unit tests (beta)
  • Create PR with unit tests
  • Post copyable unit tests in a comment
  • Commit unit tests in branch issue-10939

📜 Recent review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between d268b9a and 343fd82.

⛔ Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (4)
  • components/crawlbase/actions/crawl-url/crawl-url.mjs (1 hunks)
  • components/crawlbase/common/constants.mjs (1 hunks)
  • components/crawlbase/crawlbase.app.mjs (1 hunks)
  • components/crawlbase/package.json (2 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
components/crawlbase/crawlbase.app.mjs (1)
components/spotify/actions/get-album-tracks/get-album-tracks.mjs (1)
  • axios (53-56)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (4)
  • GitHub Check: Lint Code Base
  • GitHub Check: Publish TypeScript components
  • GitHub Check: pnpm publish
  • GitHub Check: Verify TypeScript components
🔇 Additional comments (4)
components/crawlbase/package.json (1)

3-17: Package metadata update looks good

Version bump and the @pipedream/platform dependency align with the new axios-based helper. ✔️

components/crawlbase/common/constants.mjs (1)

1-48: SCRAPERS catalog addition is solid

Comprehensive list of scraper identifiers will keep UI options in sync with Crawlbase’s catalog.
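For reference, a minimal sketch of the shape `constants.mjs` likely takes; the specific scraper identifiers shown are assumptions based on Crawlbase's public scraper catalog, not values copied from the diff.

```javascript
// components/crawlbase/common/constants.mjs (sketch)
// Scraper identifiers surfaced as options for the action's "scraper" prop.
const SCRAPERS = [
  "amazon-product-details",
  "amazon-serp",
  "google-serp",
  "linkedin-profile",
  "instagram-profile",
  // ...remaining identifiers from Crawlbase's catalog
];

export default {
  SCRAPERS,
};
```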

components/crawlbase/crawlbase.app.mjs (1)

1-24: New request helper is well structured

makeRequest cleanly centralizes token injection and base URL handling—this should simplify future actions.
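A minimal sketch of the structure described here, assuming the method names from the walkthrough; the base URL, the token parameter name, and the `$auth` field name are assumptions rather than details confirmed by the diff.

```javascript
// components/crawlbase/crawlbase.app.mjs (sketch)
import { axios } from "@pipedream/platform";

export default {
  type: "app",
  app: "crawlbase",
  methods: {
    _baseUrl() {
      return "https://api.crawlbase.com"; // assumed Crawlbase API base URL
    },
    // Central request helper: resolves the path and injects the account token.
    makeRequest({
      $ = this, path = "/", params = {}, ...otherOpts
    } = {}) {
      return axios($, {
        url: `${this._baseUrl()}${path}`,
        params: {
          token: this.$auth.api_token, // token param and auth field names are assumptions
          ...params,
        },
        ...otherOpts,
      });
    },
  },
};
```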

components/crawlbase/actions/crawl-url/crawl-url.mjs (1)

1-64: Crawl URL action implementation looks correct

Prop wiring, param mapping, and summary export are consistent with the new app helper—ready to ship.
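For orientation, a hedged sketch of the action shape being reviewed, assuming the props and summary text from the walkthrough; the key, version, description, labels, and the decision to rely on the helper's default path are illustrative, not copied from the diff.

```javascript
// components/crawlbase/actions/crawl-url/crawl-url.mjs (sketch)
import crawlbase from "../../crawlbase.app.mjs";
import constants from "../../common/constants.mjs";

export default {
  key: "crawlbase-crawl-url",       // assumed key
  name: "Crawl URL",
  description: "Crawl a web page with the Crawlbase Crawling API",
  version: "0.0.1",                 // assumed action version
  type: "action",
  props: {
    crawlbase,
    url: {
      type: "string",
      label: "URL",
      description: "The URL to crawl",
    },
    scraper: {
      type: "string",
      label: "Scraper",
      description: "Scraper to apply to the crawled page",
      options: constants.SCRAPERS,
      optional: true,
    },
    screenshot: {
      type: "boolean",
      label: "Screenshot",
      optional: true,
    },
    store: {
      type: "boolean",
      label: "Store",
      optional: true,
    },
    getHeaders: {
      type: "boolean",
      label: "Get Headers",
      optional: true,
    },
  },
  async run({ $ }) {
    const response = await this.crawlbase.makeRequest({
      $,
      params: {
        url: this.url,
        scraper: this.scraper,
        screenshot: this.screenshot,
        store: this.store,
        get_headers: this.getHeaders,
        format: "json", // assumed response format
      },
    });
    $.export("$summary", `Successfully crawled URL: ${this.url}`);
    return response;
  },
};
```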


Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.


Comment @coderabbitai help to get the list of available commands and usage tips.
