### Added
- Scripts are now auto-detected on the command line, so `--multi` is no longer required to run them. Any positional argument that looks like a script (`.js`/`.cjs`/`.mjs`, or any file whose first non-empty line doesn't start with `http`) flips multi mode on automatically, and mixed inputs like `login.js https://example.com logout.js` just work. `--multi` is kept as an explicit override for the one case auto-detection can't cover: sharing a single browser session across a list of plain URLs. All existing invocations behave exactly as before #4725.
### Fixed
- Gzipped HAR files are now written by piping the JSON through `createGzip` straight to disk, instead of materialising the JSON string, a `Buffer` copy of it, and the full gzipped `Buffer` all at once. For a 200 MB HAR that removes several hundred MB of avoidable peak RSS, multiplied when multiple pages finish around the same time. The storage layer now accepts a `Readable` in addition to strings and Buffers; existing callers are unaffected #4728.
- Gzipped JSON result files (Chrome traces, console logs, etc.) are read by streaming through `createGunzip` and collecting UTF-8 chunks, rather than buffering the whole gzipped payload, gunzipping it into another `Buffer`, and then stringifying. The parsed object still has to fit in memory, but the throwaway gzipped and unzipped buffer copies are gone; that's meaningful on 50+ MB traces #4726.
- The sustainable plugin and the S3 plugin pick up the same streaming treatment: the sustainable plugin now uses the shared streaming gzipped-JSON helper instead of its own buffer-everything copy, and the S3 plugin streams uploads via `createReadStream` with an explicit `ContentLength` instead of loading each file fully into memory before `PutObject`. With 20 concurrent S3 uploads, RSS no longer spikes to ~20× the size of the largest file in the result bundle #4727.