mendableai/firecrawl v1.7.0

v1.7.0 - Release Notes

New Features

  • Deep Research Open Alpha: Structured outputs + customizability.
  • llmstxt.new: Generate an llms.txt for any website by appending its URL, e.g. llmstxt.new/firecrawl.dev
  • Concurrent Browsers: Improved rate limits for all users.
  • Compare Beta: See what has changed on a website directly from the /scrape and /crawl endpoints. Currently in closed beta.
  • /extract: URLs are now optional (also sketched after this list).
  • /scrape: Adds a warning to the returned document if the request was concurrency-limited.
  • New Firecrawl Examples: Featuring models like Claude 3.7, Gemini 2.5, Deepseek V3, Mistral 3.1, and more.
  • Crawl: maxDiscoveryDepth option added (sketched after this list).
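
A minimal sketch of the new crawl option, assuming the v1 REST endpoint at api.firecrawl.dev/v1/crawl accepts maxDiscoveryDepth at the top level of the request body alongside existing options like limit; check the API reference for the exact request shape:

```python
# Minimal sketch: start a crawl with the new maxDiscoveryDepth option.
# The endpoint, parameter placement, and env var name are assumptions here.
import os

import requests

api_key = os.environ["FIRECRAWL_API_KEY"]  # assumed env var name

resp = requests.post(
    "https://api.firecrawl.dev/v1/crawl",
    headers={"Authorization": f"Bearer {api_key}"},
    json={
        "url": "https://firecrawl.dev",
        "limit": 50,              # cap on total pages crawled
        "maxDiscoveryDepth": 2,   # new in v1.7.0
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # the crawl endpoint returns a job you can poll for results
```

The option name suggests it bounds how many link-discovery hops the crawler follows from the starting URL, as opposed to capping the total number of pages.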
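
And a sketch of /extract with the now-optional URLs left out; whether a prompt-only request is accepted exactly like this, and what the response contains, are assumptions rather than documented behavior:

```python
# Minimal sketch: call /extract without the (now optional) urls field.
# Prompt-only extraction as shown here is an assumption about the new behavior.
import os

import requests

api_key = os.environ["FIRECRAWL_API_KEY"]  # assumed env var name

resp = requests.post(
    "https://api.firecrawl.dev/v1/extract",
    headers={"Authorization": f"Bearer {api_key}"},
    json={
        # "urls": ["https://firecrawl.dev/*"],  # optional as of v1.7.0
        "prompt": "What is Firecrawl and which endpoints does it expose?",
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```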

Fixes & Improvements

  • Fixed circular JSON error in search.
  • Reworked the tally system (API switchover).
  • Fixed sitemaps poisoning crawler with unrelated links.
  • Crawl status checks are now retried on failure (up to 3 times).
  • Credit check now snaps the crawl limit to the remaining credits instead of erroring out when it is exceeded.
  • Fixed path filtering bug in Map.
  • Removed unsupported JSON schema properties in llmExtract.

What's Changed

  • fix: resolve circular JSON structure error in search function by @invarrow in #1330
  • feat(crawl): add maxDiscoveryDepth by @mogery in #1329
  • tally rework api switchover by @mogery in #1328
  • (feat/pricing) Concurrent Browsers - Improve rate limits by @nickscamara in #1331
  • Fix SearxNG categories by @loorisr in #1319
  • (fix/map) Map failed to filter by path if indexed by @nickscamara in #1333
  • fix(crawler): sitemaps poisoning crawls with unrelated links by @mogery in #1334
  • fix(llmExtract): remove unsupported JSON schema properties (FIR-1246) by @mogery in #1335
  • Add/ Claude 3.7 implementation by @aparupganguly in #1336
  • fix(js-sdk/crawl,batch-scrape): retry status call if it returns an error up to 3 times by @mogery in #1343
  • Increase maxurls for generate /llmstxt by @ericciarla in #1349
  • fix(v1/checkCredits): snap crawl limit to remaining credits if over without erroring out (FIR-1450) by @mogery in #1350
  • feat(scrape): add warning to document if it was concurrency limited by @mogery in #1348
  • (feat/extract) URLs can now be optional in /extract by @nickscamara in #1346
  • [SDKs] Added 403s to sdk error handlers by @rafaelsideguide in #1357
  • feat(scrape/actions/click): add all parameter (FIR-1443) by @mogery in #1342
  • (feat/deep-research) Add Analysis Prompt by @nickscamara in #1351
  • Add examples/ mistral-small-3.1-crawler by @aparupganguly in #1366
  • Add examples/mistral 3.1 company researcher by @aparupganguly in #1369
  • Add example/Deep-research Apartment finder by @aparupganguly in #1378
  • (feat/deep-research) Deep Research Alpha v1 - Structured Outputs + Customizability by @nickscamara in #1365
  • Add examples/gemini-2.5-extractor by @aparupganguly in #1381
  • Add examples/gemini-2.5-pro crawler by @aparupganguly in #1380
  • feat(scrapeURL): return js returns from f-e (FIR-1535) by @mogery in #1385
  • Add examples/deepseek-v3-crawler by @aparupganguly in #1383
  • Add examples/ Deepseek V3 Company Researcher by @aparupganguly in #1384
  • feat(queue-jobs): update notification logic for concurrency limits and add parameter (jsdocs) to batchScrapeUrls by @ftonato in #1398
  • feat(notification): add notification message for concurrency limit reached by @ftonato in #1404
  • compare format (FIR-1560) by @mogery in #1405
  • feat(queue-jobs): add function to determine job type and update notification logic for concurrency limits by @ftonato in #1409

New Contributors

Full Changelog: v1.6.0...v1.7.0
