Conversation
b3c7b80 to 5fa58c9 (force-push)
Walkthrough

Adds PGP-based manifest verification and asset digest checks, a filesystem-backed response/cache layer with quick binary fingerprinting and atomic writes, centralizes GitHub URL/header helpers, extends cache-state persistence and download orchestration, refactors the download/restore flow, updates CI/scripts, and adds extensive tests and patches.
🚥 Pre-merge checks: ✅ Passed checks (2 passed)
Actionable comments posted: 8
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@.github/workflows/test.yml`:
- Around line 232-236: The normalization currently maps any non-X64 arch to
aarch64 and any non-macOS OS to a lowercased value, which can silently accept
unknown runner values; update the case statements that set runner_arch and
runner_os to explicitly handle known values (e.g., X64 -> x64, ARM64/ARM ->
aarch64, macOS -> darwin, Linux/Windows -> linux/windows) and add a default
branch that prints an error and exits non‑zero when an unexpected runner.arch or
runner.os is encountered, so the workflow fails fast instead of producing
invalid outputs.
In `@src/action.ts`:
- Around line 125-126: The code currently calls verifyAsset for every download
(including custom user URLs), which breaks installs from custom
`bun-download-url`; update the download flow so that verifyAsset is only invoked
for assets that come from the official GitHub release/manifest path and skipped
for custom URLs: detect when the download came from a user-provided URL (the
branch that calls downloadBun with the custom url / the option controlling
bun-download-url) and bypass verifyAsset for that path (i.e., only call
verifyAsset when the asset metadata/manifest is present or when the URL is from
the official release flow), adjusting the logic around downloadBun/verifyAsset
and the revision assignment so custom installs proceed without verification.
In `@src/filesystem-cache.ts`:
- Around line 32-34: The empty catch blocks in src/filesystem-cache.ts (the two
silent catches around filesystem read/write operations) should log the caught
Error at debug level so persistent cache failures are observable; update those
catch blocks to call the project's debug logger (e.g., processLogger.debug or
the module's logger) with a short contextual message and the error object (fall
back to console.debug if no logger is available), preserving the original
behavior of not throwing after logging.
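A minimal sketch of the debug-logged catch described above, assuming a filesystem read path; the helper name and the logger-callback shape are illustrative, not the module's actual API:

```typescript
import { readFileSync } from "node:fs";

// Hypothetical helper: read a cache file, surfacing failures at debug
// level instead of swallowing them silently, while still not throwing.
function readCacheFile(
  path: string,
  debugLog: (msg: string) => void = console.debug,
): string | null {
  try {
    return readFileSync(path, "utf8");
  } catch (err) {
    // Log the failure for observability, preserve the non-throwing behavior.
    debugLog(`cache read failed for ${path}: ${String(err)}`);
    return null;
  }
}
```

The same pattern applies to the write-side catch: log with context, return without rethrowing.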
In `@src/github-asset.ts`:
- Around line 65-88: The parseAssetUrl function currently assumes pathname
segments at fixed indexes and can mis-parse unrelated URLs; add an explicit
shape guard before using parts[0..5]: verify parts[2] === "releases" and either
(parts[3] === "download" and parts.length >= 6) or (parts[3] === "latest" &&
parts[4] === "download" and parts.length >= 6); if the shape check fails, throw
an informative Error; keep the rest of the logic (setting latest and extracting
owner, repo, tag, name) the same but only after the shape guard passes so you
never index into unexpected parts in parseAssetUrl.
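The shape guard can be sketched like this; the function name, return shape, and the `"latest"` placeholder tag are illustrative assumptions rather than the module's actual signatures:

```typescript
// Sketch: validate the pathname segments before indexing into them.
interface AssetRef {
  owner: string;
  repo: string;
  tag: string;
  name: string;
  latest: boolean;
}

function parseAssetUrlSafe(url: string): AssetRef {
  const parts = new URL(url).pathname.split("/").filter(Boolean);
  // /owner/repo/releases/download/<tag>/<name>
  const tagged =
    parts[2] === "releases" && parts[3] === "download" && parts.length >= 6;
  // /owner/repo/releases/latest/download/<name>
  const latest =
    parts[2] === "releases" &&
    parts[3] === "latest" &&
    parts[4] === "download" &&
    parts.length >= 6;
  if (!tagged && !latest) {
    throw new Error(`Unrecognized GitHub release asset URL: ${url}`);
  }
  return latest
    ? { owner: parts[0], repo: parts[1], tag: "latest", name: parts[5], latest: true }
    : { owner: parts[0], repo: parts[1], tag: parts[4], name: parts[5], latest: false };
}
```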
In `@src/response-storage.ts`:
- Around line 37-43: The cache-hit branch currently treats falsy empty payloads
as misses and always sets "Content-Type": "application/json"; change the hit
check to detect actual presence (e.g., data !== undefined && data !== null) so
empty strings are valid hits, and set the Content-Type from the stored
metadata/headers for that cached entry (use the cache entry's contentType or
headers if available, or infer from the cache key suffix like ".txt" / ".asc"
and fall back to "application/octet-stream"), updating the new Response(...)
call that currently sets "Content-Type": "application/json" to use that dynamic
content type; refer to the variables/expressions data and the new Response(...)
call in response-storage.ts.
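Both halves of the suggestion can be sketched as below; the suffix-to-type map and the helper names are assumptions for illustration, following this comment's "empty string is a valid hit" reading:

```typescript
// Sketch: infer a Content-Type from the cache key suffix instead of
// hardcoding "application/json". The suffix map is an assumption.
function contentTypeForKey(key: string): string {
  if (key.endsWith(".txt") || key.endsWith(".asc")) return "text/plain";
  if (key.endsWith(".json")) return "application/json";
  return "application/octet-stream";
}

// Presence check that keeps empty strings as valid hits.
function isHit(data: string | null | undefined): data is string {
  return data !== undefined && data !== null;
}
```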
In `@src/utils.ts`:
- Around line 31-34: The cache lookup uses only URL (getStoredResponse(url))
which can return wrong responses for different HTTP methods or Authorization
headers; update the caching to include request context in the cache key (e.g.,
method and relevant headers like Authorization, and request body for non-GETs)
by either: (1) changing getStoredResponse to accept a computed cacheKey (e.g.,
`${method}|${url}|auth:${authHeader}|bodyHash:${hash}`) and using that
everywhere the function is called (including the other spot at line ~47), or (2)
adding a helper like computeCacheKey(method, url, headers, body) and passing its
result into getStoredResponse and the corresponding storeResponse so cache
lookups and stores use the same composite key.
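Option (2) could look roughly like this; the key layout, header names, and hashing of sensitive parts are illustrative assumptions, not the action's actual code:

```typescript
import { createHash } from "node:crypto";

// Sketch of a composite cache key: method + URL + hashed auth + hashed body.
// Hashing avoids persisting the raw Authorization value in the key itself.
function computeCacheKey(
  method: string,
  url: string,
  headers: Record<string, string> = {},
  body?: string,
): string {
  const auth = headers["authorization"] ?? headers["Authorization"] ?? "";
  const authPart = auth
    ? createHash("sha256").update(auth).digest("hex").slice(0, 16)
    : "none";
  const bodyPart = body
    ? createHash("sha256").update(body).digest("hex").slice(0, 16)
    : "none";
  return `${method.toUpperCase()}|${url}|auth:${authPart}|body:${bodyPart}`;
}
```

The same key would be passed to both `getStoredResponse` and `storeResponse` so lookups and stores agree.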
In `@src/verify.ts`:
- Around line 48-49: The cleanup calls using unlinkSync(zipPath) currently can
throw and mask the original verification errors; update each place (the
unlinkSync(zipPath) calls at the sites shown and the similar cleanup calls
around lines 84-85 and 100-101) so you preserve and rethrow the original error:
when an integrity/verification error occurs, capture that original error in a
variable, then perform unlinkSync inside a try/catch; if unlinkSync throws,
attach or log the cleanup error (for example as a suppressed property on the
original Error or via processLogger) but do not replace it, and finally rethrow
the original verification error. Locate the unlinkSync usages in src/verify.ts
to implement this pattern around the functions/blocks performing verification
(the blocks that currently throw new Error(...) and then call
unlinkSync(zipPath)) so cleanup failures never hide the verification failure.
- Around line 78-81: The current search uses line.includes(metadata.name) which
can match substrings; instead locate the manifest line and compare the filename
exactly. In the block around verifiedText/manifestMatch, first find the line,
parse the checksum and filename from the line (using the same regex that
extracts the checksum and filename), and then require that the parsed filename
=== metadata.name before using the checksum; reference the variables
manifestMatch, verifiedText and metadata.name and update the .find(...)
predicate to parse and compare the filename exactly rather than using includes.
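The parse-then-compare lookup can be sketched as follows; the function name is illustrative, and the regex mirrors the checksum-line pattern the review references (with the 64-char SHA-256 tightening suggested elsewhere in this review):

```typescript
// Sketch: locate a manifest entry by exact filename match, never by substring.
function findManifestHash(
  verifiedText: string,
  assetName: string,
): string | undefined {
  for (const line of verifiedText.split("\n")) {
    // "<64-hex-hash> <' ' or '*'><filename>" (classic shasum format)
    const m = line.match(/^([A-Fa-f0-9]{64}) [* ](.+)$/);
    if (m && m[2].trim() === assetName) return m[1].toLowerCase();
  }
  return undefined;
}
```

With `line.includes(assetName)`, the entry for `not-bun-linux-x64.zip` would wrongly match a lookup for `bun-linux-x64.zip`; the exact comparison avoids that.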
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 993a239a-fa48-4792-bd09-dc1f06b24e47
⛔ Files ignored due to path filters (4)
- `bun.lock` is excluded by `!**/*.lock`
- `dist/cache-save/index.js` is excluded by `!**/dist/**`
- `dist/setup/index.js` is excluded by `!**/dist/**`
- `package-lock.json` is excluded by `!**/package-lock.json`
📒 Files selected for processing (14)
- .github/workflows/test.yml
- package.json
- src/action.ts
- src/download-url.ts
- src/filesystem-cache.ts
- src/github-api.ts
- src/github-asset.ts
- src/manifest.ts
- src/response-storage.ts
- src/signing-key.ts
- src/url.ts
- src/utils.ts
- src/verify.ts
- tests/download-url.spec.ts
♻️ Duplicate comments (2)
src/verify.ts (2)
48-49: ⚠️ Potential issue | 🟡 Minor

Do not let cleanup failures replace verification failures.

Line 48, Line 86, and Line 102 call `unlinkSync` directly before throwing verification errors. If unlink fails, the filesystem error can mask the integrity/security failure.

Suggested fix:

```diff
+function silentUnlink(filePath: string): void {
+  try {
+    unlinkSync(filePath);
+  } catch {
+    // keep original verification failure as the thrown error
+  }
+}
 ...
-    unlinkSync(zipPath);
+    silentUnlink(zipPath);
     throw new Error(
 ...
-    unlinkSync(zipPath);
+    silentUnlink(zipPath);
     throw new Error(
 ...
-    unlinkSync(zipPath);
+    silentUnlink(zipPath);
     throw new Error(
```

Also applies to: 86-87, 102-103
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/verify.ts` around lines 48 - 49, The cleanup calls to unlinkSync(zipPath) are executed immediately before throwing verification errors, so if unlinkSync fails its filesystem error can replace the intended verification error; wrap each unlinkSync(zipPath) in a try/catch (or call a safeRemove helper) so cleanup errors are caught and ignored or logged but do not overwrite or replace the original verification Error being thrown (i.e., call unlinkSync inside try { unlinkSync(zipPath) } catch (e) { /* log/ignore */ } and then throw the original verification error).
80-80: 🧹 Nitpick | 🔵 Trivial

Constrain manifest hash parsing to SHA-256 length.

Line 80 accepts any-length hex (`+`). Since this path is SHA-256, requiring exactly 64 hex chars is safer and matches the intent documented at Line 93.

Suggested fix:

```diff
-      .map((line) => line.match(/^([A-Fa-f0-9]+) [* ](.+)$/))
+      .map((line) => line.match(/^([A-Fa-f0-9]{64})\s+[* ](.+)$/))
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/verify.ts` at line 80, The regex used in the map call that parses manifest lines currently allows any-length hex (the .map((line) => line.match(/^([A-Fa-f0-9]+) [* ](.+)$/)) expression); tighten it to require exactly 64 hex characters for the SHA-256 hash (e.g. replace the + quantifier with {64}) so the first capture group only matches 64 hex chars and not arbitrary lengths, leaving the rest of the pattern (space, optional *, and filename) unchanged.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@src/verify.ts`:
- Around line 48-49: The cleanup calls to unlinkSync(zipPath) are executed
immediately before throwing verification errors, so if unlinkSync fails its
filesystem error can replace the intended verification error; wrap each
unlinkSync(zipPath) in a try/catch (or call a safeRemove helper) so cleanup
errors are caught and ignored or logged but do not overwrite or replace the
original verification Error being thrown (i.e., call unlinkSync inside try {
unlinkSync(zipPath) } catch (e) { /* log/ignore */ } and then throw the original
verification error).
- Line 80: The regex used in the map call that parses manifest lines currently
allows any-length hex (the .map((line) => line.match(/^([A-Fa-f0-9]+) [*
](.+)$/)) expression); tighten it to require exactly 64 hex characters for the
SHA-256 hash (e.g. replace the + quantifier with {64}) so the first capture
group only matches 64 hex chars and not arbitrary lengths, leaving the rest of
the pattern (space, optional *, and filename) unchanged.
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/verify.ts`:
- Line 43: Validate that metadata.updated_at is a valid Date before comparing it
to GITHUB_DIGEST_THRESHOLD: after the Date is constructed (where new
Date(asset.updated_at) produces metadata.updated_at) check for invalidity (e.g.,
isNaN(metadata.updated_at.getTime()) or equivalent) and if invalid, fail the
verification with an error/log and do not silently skip the digest check;
replace the direct comparison at if (metadata.updated_at >=
GITHUB_DIGEST_THRESHOLD) with a guard that throws or returns an error on an
invalid Date so the verification cannot be bypassed.
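The fail-closed guard described above can be sketched like this; the threshold constant's value and the function name are assumptions for illustration:

```typescript
// Assumed cutoff: assets updated after this date must carry a GitHub digest.
const GITHUB_DIGEST_THRESHOLD = new Date("2024-01-01T00:00:00Z");

// Sketch: validate updated_at before comparing; an unparseable timestamp
// must fail verification rather than silently skip the digest check.
function digestCheckRequired(updatedAtRaw: string, assetName: string): boolean {
  const updatedAt = new Date(updatedAtRaw);
  if (Number.isNaN(updatedAt.getTime())) {
    throw new Error(`Invalid updated_at for asset ${assetName}`);
  }
  return updatedAt >= GITHUB_DIGEST_THRESHOLD;
}
```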
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/verify.ts`:
- Around line 20-21: The code currently hardcodes "SHASUMS256.txt" and the
digest key "shasum256" while the function parameter algorithm can be something
else; update the logic in the verify function that accepts parameter algorithm
so the manifest filename and digest key are derived from that algorithm instead
of being hard-coded. Concretely, normalize algorithm (to lowercase), extract the
numeric suffix (e.g., "sha256" -> "256"), build the manifest filename (e.g.,
`SHASUMS{suffix}.txt`) and the digest key/name used to look up digests (e.g.,
`shasum{suffix}`), and ensure any hash computation uses
crypto.createHash(algorithm) rather than a fixed "sha256". Add a safe fallback
for unexpected algorithm strings (or throw) and update all places referencing
"SHASUMS256.txt" or "shasum256" to use these derived variables.
- Around line 42-47: Declare metadata before the try so it remains in scope
(e.g., let metadata: ReturnType<typeof fetchAssetMetadata> | undefined), and
inside the catch ensure subsequent code checks for metadata being undefined
before accessing metadata.name/owner/repo/tag; if metadata is unavailable, fall
back to deriving manifest info from downloadUrl or skip manifest-based matching.
Also make the manifest filename depend on the algorithm parameter instead of
hardcoding "SHASUMS256.txt"—map algorithm (e.g., "sha256" -> "SHASUMS256.txt",
"sha1" -> "SHASUMS1.txt", "sha512" -> "SHASUMS512.txt") and use that when
calling getGitHubManifestUrl and when looking up entries, so the manifest and
local hash algorithm are consistent; update all references to
getGitHubManifestUrl and manifest lookup that assumed metadata to handle the
metadata-undefined case.
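The algorithm-to-manifest mapping that both comments describe can be sketched as follows; the function name and the throw-on-unknown fallback are assumptions, not the module's actual API:

```typescript
// Sketch: derive the manifest filename and digest key from the algorithm
// parameter instead of hardcoding the SHA-256 variants.
function manifestNamesFor(algorithm: string): { file: string; digestKey: string } {
  const normalized = algorithm.toLowerCase();
  const match = normalized.match(/^sha(\d+)$/);
  if (!match) {
    // Fail loudly rather than silently falling back to SHA-256.
    throw new Error(`Unsupported digest algorithm: ${algorithm}`);
  }
  return { file: `SHASUMS${match[1]}.txt`, digestKey: `shasum${match[1]}` };
}
```

Hash computation would then use `crypto.createHash(normalized)` with the same normalized name, keeping the manifest and the local hash algorithm consistent.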
81cff7a to 00a66af (force-push)
✅ Actions performed: Reviews resumed.
♻️ Duplicate comments (2)
src/verify.ts (2)
75-77: ⚠️ Potential issue | 🔴 Critical

Fix undefined identifier at Line 76 (`meta` → `metadata`).

`meta` is not declared. This is a compile/runtime correctness bug in the digest verification path.

Proposed fix:

```diff
-    assetName = meta.name;
+    assetName = metadata.name;
```

```bash
#!/bin/bash
set -euo pipefail
# Confirm the undefined identifier usage
rg -n --type=ts -C2 '\bassetName\s*=\s*meta\.name\b' src/verify.ts
# Confirm no declaration exists for `meta` in this file
rg -n --type=ts -C1 '\b(const|let|var)\s+meta\b' src/verify.ts || true
```

Expected result: the first command reports Line 76; the second command reports no `meta` declaration.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/verify.ts` around lines 75 - 77, The code assigns assetName from a non-existent identifier `meta`; change the usage to the fetched variable `metadata` so the digest verification uses the correct object. Update the assignment where `assetName = meta.name` to reference `metadata.name` (the result of fetchAssetMetadata) and ensure any other uses in the same block that refer to `meta` are replaced with `metadata`; check the block around fetchAssetMetadata, assetName, and getGitHubManifestUrl to keep names consistent.
83-87: ⚠️ Potential issue | 🟠 Major

Line 84 invalid-date guard is neutralized by the catch block.

The `Invalid updated_at` error thrown at Lines 84-87 is swallowed by Lines 112-117, so verification continues instead of failing that path.

Proposed fix:

```diff
   } catch (err) {
-    if (err instanceof Error && err.message.startsWith("Security Mismatch:")) {
+    if (
+      err instanceof Error &&
+      (
+        err.message.startsWith("Security Mismatch:") ||
+        err.message.startsWith("Invalid updated_at")
+      )
+    ) {
       throw err; // always propagate real mismatches
     }
     warning(`Skipping GitHub API digest check for: ${downloadUrl}`);
   }
```

Also applies to: 112-117
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/verify.ts` around lines 83 - 87, The guard that constructs updatedAt from metadata.updated_at and throws an Error for invalid dates (the updatedAt/metadata.updated_at check that calls silentUnlink(zipPath) and throws `Invalid updated_at for asset ${assetName}`) is currently being swallowed by the surrounding catch, allowing verification to continue; update the error handling so the invalid-date path stops processing: either perform this validation outside the try/catch that swallows errors, or after calling silentUnlink(zipPath) rethrow the Error (or return a rejected Promise) so the catch block does not silently continue — make the change where updatedAt is validated and ensure the catch around the verification logic does not suppress that specific thrown Error.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@src/verify.ts`:
- Around line 75-77: The code assigns assetName from a non-existent identifier
`meta`; change the usage to the fetched variable `metadata` so the digest
verification uses the correct object. Update the assignment where `assetName =
meta.name` to reference `metadata.name` (the result of fetchAssetMetadata) and
ensure any other uses in the same block that refer to `meta` are replaced with
`metadata`; check the block around fetchAssetMetadata, assetName, and
getGitHubManifestUrl to keep names consistent.
- Around line 83-87: The guard that constructs updatedAt from
metadata.updated_at and throws an Error for invalid dates (the
updatedAt/metadata.updated_at check that calls silentUnlink(zipPath) and throws
`Invalid updated_at for asset ${assetName}`) is currently being swallowed by the
surrounding catch, allowing verification to continue; update the error handling
so the invalid-date path stops processing: either perform this validation
outside the try/catch that swallows errors, or after calling
silentUnlink(zipPath) rethrow the Error (or return a rejected Promise) so the
catch block does not silently continue — make the change where updatedAt is
validated and ensure the catch around the verification logic does not suppress
that specific thrown Error.
♻️ Duplicate comments (1)
src/verify.ts (1)
92-125: ⚠️ Potential issue | 🟠 Major

Do not swallow `Invalid updated_at` after deleting the archive.

Line 93 deletes `zipPath`, then Line 94 throws; but Line 121 only rethrows `"Security Mismatch:"` errors. This can continue and report success after the file is already removed.

Proposed fix:

```diff
+class DigestVerificationError extends Error {}
+
 export async function verifyAsset(
   zipPath: string,
   downloadUrl: string,
   token?: string,
   algorithm: SupportedAlgorithmNames = "sha256",
 ): Promise<void> {
 @@
   const updatedAt = new Date(metadata.updated_at);
   if (Number.isNaN(updatedAt.getTime())) {
     silentUnlink(zipPath);
-    throw new Error(`Invalid updated_at for asset ${assetName}`);
+    throw new DigestVerificationError(
+      `Invalid updated_at for asset ${assetName}`,
+    );
   }
 @@
   if (githubHash !== actualHash) {
     silentUnlink(zipPath);
-    throw new Error(
+    throw new DigestVerificationError(
       `Security Mismatch: GitHub API digest (${githubHash}) differs from local hash (${actualHash})!`,
     );
   }
 @@
   } catch (err) {
-    if (err instanceof Error && err.message.startsWith("Security Mismatch:")) {
+    if (err instanceof DigestVerificationError) {
       throw err; // always propagate real mismatches
     }
     warning(`Skipping GitHub API digest check for: ${downloadUrl}`);
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/verify.ts` around lines 92 - 125, The catch currently only rethrows errors whose message starts with "Security Mismatch:" which lets the earlier thrown "Invalid updated_at for asset ..." be swallowed; update the catch in the verification logic so it also rethrows errors whose message starts with "Invalid updated_at" (or otherwise matches that exact message pattern) in addition to "Security Mismatch:" so that when silentUnlink(zipPath) and throw new Error(`Invalid updated_at for asset ${assetName}`) occur they are propagated instead of being converted into a warning; locate the catch surrounding the asset metadata check (referencing updatedAt, silentUnlink, assetName, downloadUrl, and the existing "Security Mismatch:" handling) and add the additional rethrow branch for the "Invalid updated_at" message.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@src/verify.ts`:
- Around line 92-125: The catch currently only rethrows errors whose message
starts with "Security Mismatch:" which lets the earlier thrown "Invalid
updated_at for asset ..." be swallowed; update the catch in the verification
logic so it also rethrows errors whose message starts with "Invalid updated_at"
(or otherwise matches that exact message pattern) in addition to "Security
Mismatch:" so that when silentUnlink(zipPath) and throw new Error(`Invalid
updated_at for asset ${assetName}`) occur they are propagated instead of being
converted into a warning; locate the catch surrounding the asset metadata check
(referencing updatedAt, silentUnlink, assetName, downloadUrl, and the existing
"Security Mismatch:" handling) and add the additional rethrow branch for the
"Invalid updated_at" message.
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/response-storage.ts`:
- Around line 36-50: The cache check currently uses "undefined !== data" but
getCache() returns string | null so null values are treated as hits; change the
condition in the block that uses getCache(url) to a truthy check so null and
empty string are treated as misses (replace the "undefined !== data" check with
a truthy check such as "if (data)" for the getCache(url) result used when
constructing the Response with new URL(url).hostname and new Response(...)).
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 28a2dca2-bf3b-4a51-b6dc-06d42ae9459c
📒 Files selected for processing (1)
src/response-storage.ts
83bf433 to 689123d (force-push)
✅ Actions performed: Reviews resumed.
Actionable comments posted: 5
♻️ Duplicate comments (2)
src/github-asset.ts (1)
77-79: ⚠️ Potential issue | 🟡 Minor

Incomplete URL structure validation for the "latest" pattern.

The check at lines 77-79 validates that `parts[3]` is either `"latest"` or `"download"`, but doesn't verify that `parts[4]` is `"download"` when `parts[3]` is `"latest"`. This could accept malformed URLs like `/owner/repo/releases/latest/invalid/file.zip`.

🛡️ Proposed hardening:

```diff
-  const expectedStructure =
-    "releases" === parts[2] &&
-    ("latest" === parts[3] || "download" === parts[3]);
+  const isTaggedDownload =
+    "releases" === parts[2] && "download" === parts[3];
+  const isLatestDownload =
+    "releases" === parts[2] &&
+    "latest" === parts[3] &&
+    "download" === parts[4];
+  const expectedStructure = isTaggedDownload || isLatestDownload;
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@src/github-asset.ts` around lines 77 - 79, The URL validation in the expectedStructure check is too permissive: ensure that when parts[3] === "latest" you also require parts[4] === "download" (and that parts.length is sufficient) before treating it as valid. Update the logic around the expectedStructure variable (and any code that reads the parts array) to explicitly check the "latest" case (parts[3] === "latest" && parts[4] === "download") or the "download" case (parts[3] === "download"), and fail validation otherwise to prevent accepting malformed URLs.

src/response-storage.ts (1)
36-50: ⚠️ Potential issue | 🟡 Minor

Treat empty cached bodies as misses again.

`null !== data` makes `""` a cache hit, so an empty stored response is synthesized as `200` here instead of falling through to the network. If the intended behavior is still "empty = miss", this needs to go back to a truthy check (`if (data)`).
null !== datamakes""a cache hit, so an empty stored response is synthesized as200here instead of falling through to the network. If the intended behavior is still “empty = miss”, this needs to go back to a truthy check (if (data)).🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/response-storage.ts` around lines 36 - 50, The current cache check uses `if (null !== data)` which treats empty string bodies as hits; change the condition in the block around `getCache(url)` so empty bodies are treated as misses (use a truthy check like `if (data)`), keeping the rest of the logic that builds the `contentType` (based on `new URL(url).hostname`) and returns the `Response` with headers including `"X-Storage-Hit": "true"`. This ensures only non-empty cached responses are returned and empty strings fall through to network handling.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/github-asset.ts`:
- Around line 65-69: The error occurs in parseAssetUrl: the thrown Error uses an
undefined variable `url.hostname`; change that reference to the declared URL
object `urlObj.hostname` so the error message correctly reports the hostname;
ensure the check and Error construction inside parseAssetUrl consistently use
`urlObj` (and keep return type UrlAssetMetadata unchanged).
- Around line 119-127: Before calling response.json(), verify the HTTP response
succeeded (check response.ok or response.status); if not, read the error body
(await response.text() or json()) and throw a descriptive Error including
response.status and the returned error body so callers don’t try to access
release.assets on an invalid payload. Update the logic around response, release,
and asset (the code that finds release.assets?.find(...) and throws `Asset
${base.name} not found`) to only run after the successful-response check and
include base.name and base.tag in thrown errors for clearer diagnostics.
In `@src/response-storage.ts`:
- Around line 22-25: The current logic in response-storage.ts (variables
secureApi and smallish) marks all HTTPS GitHub API GETs as cacheable, which
risks persisting authenticated/private responses because the cache key ignores
Authorization; update this by restricting secureApi to only a small whitelist of
public metadata endpoints used by the action (e.g., specific paths like
/repos/:owner/:repo, /repos/:owner/:repo/releases/latest, or whichever exact
endpoints the code uses) or move the cacheability check out of
response-storage.ts into the call sites (the functions that perform the GitHub
requests) where the Authorization/token is known and you can skip caching when
an auth header is present; specifically change the logic that defines secureApi
(and thus the return value) so it only returns true for those approved public
endpoints or defer the decision to the request() caller that can inspect
headers.
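The whitelist approach could be sketched as below; the path patterns and the `hasAuth` flag passed in from the call site are assumptions for illustration:

```typescript
// Sketch: only cache HTTPS GETs to known public release-metadata
// endpoints, and never when an Authorization header is present.
function isCacheable(url: string, method: string, hasAuth: boolean): boolean {
  if (method.toUpperCase() !== "GET" || hasAuth) return false;
  const u = new URL(url);
  if (u.protocol !== "https:" || u.hostname !== "api.github.com") return false;
  // Assumed whitelist: /repos/:owner/:repo/releases[/latest|/tags/:tag]
  return /^\/repos\/[^/]+\/[^/]+\/releases(\/latest|\/tags\/[^/]+)?$/.test(
    u.pathname,
  );
}
```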
In `@src/verify.ts`:
- Around line 157-162: The manifest parsing accepts any-length hex in the regex
used to build manifestMatch from verifiedText, which should be tightened to a
fixed SHA-256 length; update the regex in the manifestMatch creation (the .map
callback that currently uses /^([A-Fa-f0-9]+) [* ](.+)$/) to require exactly 64
hex chars for the first capture group, leaving the asset name capture and the
subsequent .find check against assetName unchanged so manifestMatch still
selects entries by m[2].trim() === assetName.
- Line 83: Remove the dead variable digest_matched and its assignment(s) (the
let digest_matched = false; declaration and the place where it is set to true)
since it is never read; alternatively, if the intent was to short-circuit
manifest verification when the GitHub digest matches, use digest_matched in the
control flow by checking it before calling verifyManifest (or the manifest
verification block) and skip that work when digest_matched is true—update the
logic around the digest comparison and the verifyManifest call accordingly.
---
Duplicate comments:
In `@src/github-asset.ts`:
- Around line 77-79: The URL validation in the expectedStructure check is too
permissive: ensure that when parts[3] === "latest" you also require parts[4] ===
"download" (and that parts.length is sufficient) before treating it as valid.
Update the logic around the expectedStructure variable (and any code that reads
the parts array) to explicitly check the "latest" case (parts[3] === "latest" &&
parts[4] === "download") or the "download" case (parts[3] === "download"), and
fail validation otherwise to prevent accepting malformed URLs.
In `@src/response-storage.ts`:
- Around line 36-50: The current cache check uses `if (null !== data)` which
treats empty string bodies as hits; change the condition in the block around
`getCache(url)` so empty bodies are treated as misses (use a truthy check like
`if (data)`), keeping the rest of the logic that builds the `contentType` (based
on `new URL(url).hostname`) and returns the `Response` with headers including
`"X-Storage-Hit": "true"`. This ensures only non-empty cached responses are
returned and empty strings fall through to network handling.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: ecb5b52c-fe71-45f2-a1e2-be6b76071c25
📒 Files selected for processing (4)
- src/filesystem-cache.ts
- src/github-asset.ts
- src/response-storage.ts
- src/verify.ts
b659934 to 2ccaaf3 (force-push)
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/github-asset.ts`:
- Around line 43-53: The getHexFromDigest function currently destructures
digest.toLowerCase().split(":") without validating parts, causing hex to be
undefined for inputs missing the colon; update getHexFromDigest to first split
into parts = digest.toLowerCase().split(":"), verify parts.length === 2 and both
parts[0] and parts[1] are non-empty, then assign algorithm and hex from parts,
and only after that look up DIGEST_CONFIG[algorithm] and check hex.length and
/^[a-f0-9]+$/; if the split validation fails throw a clear Unsupported or
Invalid digest error instead of allowing a TypeError.
- Around line 82-85: The asset filename from the URL is percent-encoded, so
decode the filename before comparing it to the GitHub API metadata: replace uses
of the raw URL-derived name (the variable `name` / `base.name`) with a decoded
value (e.g., run `decodeURIComponent` once) prior to the asset lookup that
compares to `asset.name` and in the error message; update the lookup in the
function handling asset resolution and the error string that references `name`
so they use the decoded filename to correctly match names like "foo bar.zip".
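The decode-before-match step might look like this; the `ApiAsset` shape and `findAsset` name are illustrative, not the module's actual types:

```typescript
interface ApiAsset {
  name: string;
}

// Sketch: decode the URL-derived asset name once before matching it
// against GitHub API metadata, so "foo%20bar.zip" matches "foo bar.zip".
function findAsset(rawUrlName: string, assets: ApiAsset[]): ApiAsset {
  const decoded = decodeURIComponent(rawUrlName);
  const asset = assets.find((a) => a.name === decoded);
  if (!asset) throw new Error(`Asset ${decoded} not found`);
  return asset;
}
```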
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 5f316f6b-2937-48c2-a916-757659436548
📒 Files selected for processing (1)
src/github-asset.ts
Actionable comments posted: 2
♻️ Duplicate comments (1)
src/github-asset.ts (1)
43-53: ⚠️ Potential issue | 🟠 Major

Still crashes on malformed digests before validation runs.

Line 44 trusts `split(":")`, but Line 132 later feeds this function a raw API field via `as GitHubDigest`. Inputs like `sha256` reach Line 51 with `hex === undefined`, so this throws a `TypeError` instead of the intended validation error.

🐛 Proposed fix:

```diff
 export function getHexFromDigest(digest: GitHubDigest): string {
-  const [algorithm, hex] = digest.toLowerCase().split(":");
+  const parts = digest.toLowerCase().split(":");
+  if (parts.length !== 2 || !parts[0] || !parts[1]) {
+    throw new Error(`Invalid digest format: ${digest}`);
+  }
+
+  const [algorithm, hex] = parts;
   const expectedLength = DIGEST_CONFIG[algorithm as DigestAlgorithm];
```

Run this to reproduce the current failure mode:

```bash
#!/bin/bash
set -euo pipefail
node <<'NODE'
function current(digest) {
  const DIGEST_CONFIG = { sha256: 64, sha512: 128 };
  const [algorithm, hex] = digest.toLowerCase().split(":");
  const expectedLength = DIGEST_CONFIG[algorithm];
  if (!expectedLength) {
    throw new Error(`Unsupported digest algorithm: ${algorithm}`);
  }
  if (hex.length !== expectedLength || !/^[a-f0-9]+$/.test(hex)) {
    throw new Error(
      `Invalid ${algorithm} hex format. Expected ${expectedLength} chars, got ${hex.length}`,
    );
  }
}
try {
  current("sha256");
  console.log("unexpectedly accepted");
} catch (error) {
  console.log(String(error));
}
NODE
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/github-asset.ts` around lines 43 - 53, getHexFromDigest currently assumes digest.split(":") yields two parts so hex can be used, which causes a TypeError for inputs like "sha256"; change the function to explicitly validate the split result before using hex: call digest.toLowerCase().split(":", 2) (or check parts.length === 2), ensure both algorithm and hex are present and hex is a string, then look up DIGEST_CONFIG[algorithm as DigestAlgorithm] and run the length/regex checks; if the split fails or hex is missing, throw the same validation error (e.g., "Invalid <algorithm> hex format") so malformed inputs are rejected gracefully.
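The parse-then-validate ordering the reviewer asks for can be sketched as a standalone function. This is a minimal illustration, not the project's actual implementation; the `DIGEST_CONFIG` lengths are taken from the script above, and the function name merely mirrors the one under review:

```typescript
// Hypothetical, self-contained sketch of strict digest parsing.
// DIGEST_CONFIG maps algorithm names to expected hex lengths, as in the review.
const DIGEST_CONFIG: Record<string, number> = { sha256: 64, sha512: 128 };

function getHexFromDigest(digest: string): string {
  const parts = digest.toLowerCase().split(":");
  // Reject anything that is not exactly "<algorithm>:<hex>" before touching hex.
  if (parts.length !== 2 || !parts[0] || !parts[1]) {
    throw new Error(`Invalid digest format: ${digest}`);
  }
  const [algorithm, hex] = parts;
  const expectedLength = DIGEST_CONFIG[algorithm];
  if (!expectedLength) {
    throw new Error(`Unsupported digest algorithm: ${algorithm}`);
  }
  if (hex.length !== expectedLength || !/^[a-f0-9]+$/.test(hex)) {
    throw new Error(
      `Invalid ${algorithm} hex format. Expected ${expectedLength} chars, got ${hex.length}`,
    );
  }
  return hex;
}
```

With this ordering, a bare `"sha256"` fails the shape check first and never reaches the length comparison that previously dereferenced `undefined`.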
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/github-asset.ts`:
- Around line 72-85: The code currently allows paths with more than six segments
which can mis-parse asset name; change the initial length check from
"parts.length < 6" to "parts.length !== 6" so only exactly six path segments are
accepted, keep the existing expectedStructure validation, and ensure this change
is applied where parts, downloadUrl, expectedStructure, owner, repo, tag, and
name are used to locate the asset.
- Around line 120-123: Replace the raw API tag assignment so base.tag stores a
percent-encoded form of the release tag instead of release.tag_name directly:
when setting base.tag (the assignment that currently uses release.tag_name),
apply proper URL encoding (e.g., encodeURIComponent) so later callers like
getGitHubManifestUrl() and buildUrl() receive a safe pathname segment; update
the assignment in the code that sets base.tag to use the encoded value and
preserve the original tag elsewhere if needed.
---
Duplicate comments:
In `@src/github-asset.ts`:
- Around line 43-53: getHexFromDigest currently assumes digest.split(":") yields
two parts so hex can be used, which causes a TypeError for inputs like "sha256";
change the function to explicitly validate the split result before using hex:
call digest.toLowerCase().split(":", 2) (or check parts.length === 2), ensure
both algorithm and hex are present and hex is a string, then look up
DIGEST_CONFIG[algorithm as DigestAlgorithm] and run the length/regex checks; if
the split fails or hex is missing, throw the same validation error (e.g.,
"Invalid <algorithm> hex format") so malformed inputs are rejected gracefully.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: ab44ff63-929c-4dca-830d-843603b438f9
📒 Files selected for processing (1)
src/github-asset.ts
♻️ Duplicate comments (2)
src/github-asset.ts (2)
126-127: ⚠️ Potential issue | 🟠 Major: Normalize latest-release tags to the encoded form.

Line 127 stores raw `release.tag_name`, but the non-`latest` path keeps `tag` percent-encoded from the download URL. If the resolved latest tag contains `/`, spaces, or other reserved characters, later URL builders will treat it as multiple path segments.

🛠️ Proposed fix

```diff
   const release = await response.json();
   if (release && base.latest) {
-    base.tag = release.tag_name;
+    base.tag = encodeURIComponent(release.tag_name);
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/github-asset.ts` around lines 126 - 127, The code assigns raw release.tag_name to base.tag which mismatches the percent-encoded tag used on the non-`latest` path; update the assignment so base.tag is the encoded form (e.g., encode the resolved release tag the same way the non-latest download URL does) when setting base.tag from release.tag_name in src/github-asset.ts to ensure URL builders treat the tag as a single path segment (reference the variables base.tag and release.tag_name).
78-85: ⚠️ Potential issue | 🟡 Minor: Reject URLs with extra path segments.

Line 78 still only enforces a minimum length, so an unsupported path like `/releases/download/v1/foo/bar.zip` is parsed as asset `foo` and fails later as a misleading lookup error instead of being rejected up front.

🛠️ Proposed fix

```diff
-  if (parts.length < 6) {
+  if (parts.length !== 6) {
     throw new Error(`Unsupported GitHub asset URL format: ${downloadUrl}`);
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/github-asset.ts` around lines 78 - 85, The code currently only checks a minimum parts.length and the expectedStructure boolean, so URLs with extra path segments slip through; update the validation to also enforce exact path segment counts for each supported pattern: when parts[2] === "releases" and parts[3] === "latest" && parts[4] === "download" require the overall parts.length that matches the "owner/repo/releases/latest/download/<asset>" shape, and when parts[2] === "releases" and parts[3] === "download" require the parts.length that matches the "owner/repo/releases/download/<tag>/<asset>" shape; modify the logic around expectedStructure (and the earlier parts.length check) to reject URLs with extra segments using the downloadUrl and parts variables so unsupported extra-path URLs are thrown immediately.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@src/github-asset.ts`:
- Around line 126-127: The code assigns raw release.tag_name to base.tag which
mismatches the percent-encoded tag used on the non-`latest` path; update the
assignment so base.tag is the encoded form (e.g., encode the resolved release
tag the same way the non-latest download URL does) when setting base.tag from
release.tag_name in src/github-asset.ts to ensure URL builders treat the tag as
a single path segment (reference the variables base.tag and release.tag_name).
- Around line 78-85: The code currently only checks a minimum parts.length and
the expectedStructure boolean, so URLs with extra path segments slip through;
update the validation to also enforce exact path segment counts for each
supported pattern: when parts[2] === "releases" and parts[3] === "latest" &&
parts[4] === "download" require the overall parts.length that matches the
"owner/repo/releases/latest/download/<asset>" shape, and when parts[2] ===
"releases" and parts[3] === "download" require the parts.length that matches the
"owner/repo/releases/download/<tag>/<asset>" shape; modify the logic around
expectedStructure (and the earlier parts.length check) to reject URLs with extra
segments using the downloadUrl and parts variables so unsupported extra-path
URLs are thrown immediately.
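The exact-shape check described in the two comments above can be sketched end to end. This is a hypothetical parser assuming the two path layouts named in the review; `parseAssetUrl` and its return shape are illustrative, not the project's API:

```typescript
// Illustrative parser for the two supported release-asset URL shapes
// (segment counts after stripping the leading slash, as in the review):
//   owner/repo/releases/latest/download/<asset>   -> 6 segments
//   owner/repo/releases/download/<tag>/<asset>    -> 6 segments
function parseAssetUrl(downloadUrl: string): {
  owner: string;
  repo: string;
  tag: string;
  name: string;
} {
  const parts = new URL(downloadUrl).pathname.replace(/^\//, "").split("/");
  // Exactly six segments: extra or missing segments are rejected up front.
  if (parts.length !== 6) {
    throw new Error(`Unsupported GitHub asset URL format: ${downloadUrl}`);
  }
  const [owner, repo, releases, a, b, last] = parts;
  if (releases !== "releases") {
    throw new Error(`Unsupported GitHub asset URL format: ${downloadUrl}`);
  }
  if (a === "latest" && b === "download") {
    return { owner, repo, tag: "latest", name: decodeURIComponent(last) };
  }
  if (a === "download") {
    // Non-latest shape: .../releases/download/<tag>/<asset>
    return { owner, repo, tag: b, name: decodeURIComponent(last) };
  }
  throw new Error(`Unsupported GitHub asset URL format: ${downloadUrl}`);
}
```

Because the segment count is checked first, a seven-segment path fails immediately instead of being mis-parsed into a bogus asset name.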
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 69b7140e-b39a-42a4-b94d-3d8479491801
📒 Files selected for processing (1)
src/github-asset.ts
✅ Actions performed: Reviews resumed.
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/verify.ts`:
- Around line 124-141: The fallback logic that constructs manifestBaseUrl by
replacing the last path segment of downloadUrl is unsafe for arbitrary customUrl
inputs and can produce incorrect manifest paths after fetchAssetMetadata()
errors; update verify.ts so that before performing the pathname-replacement
fallback (the code using downloadUrl, manifestFile, manifestBaseUrl) you
validate the custom/download URL is an asset-like URL (e.g., ends with a known
archive extension or matches expected GitHub asset path pattern) and only
perform the replacement when it matches, otherwise either reject the input with
a clear error or use the resolved filename returned by fetchAssetMetadata() when
available (thread the resolved asset filename from fetchAssetMetadata() into the
fallback) so you never assume URL structure; reference the symbols downloadUrl,
manifestBaseUrl, manifestFile, fetchAssetMetadata(), and DigestVerificationError
when making these changes.
- Around line 107-117: The code sets the GitHub Actions output
"bun-download-checksum" prematurely inside the metadata.digest branch (where
getHexFromDigest(metadata.digest) is compared to actualHash); remove the
setOutput("bun-download-checksum", `${metadata.digest}`) call in that
conditional so the checksum is not published until the full verification
completes, and ensure the final setOutput call after all integrity checks (the
existing setOutput later in the function) remains as the only place that
publishes the checksum; update any references to
assetName/metadata.digest/actualHash accordingly so the output is only written
on the successful verification path.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 3eb09603-d114-4acb-ad53-d58491f508a7
📒 Files selected for processing (3)
README.md
action.yml
src/verify.ts
8208b8b to 922dbf2 (Compare)
Actionable comments posted: 5
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@patches/compare-versions`+6.1.1.patch:
- Around line 27-29: The build emits an empty import statement ("import {} from
'./utils.js';") which is a compilation artifact and should be removed or
prevented: either delete the empty import from the generated module (the line
importing './utils.js') or adjust the source to use a type-only import (e.g.,
use "import type { CompareOperator } from './utils.js';" in the
TypeScript/source so the compiler doesn't emit an empty import) to avoid leaving
an unused, no-op import in compare.js.
- Around line 288-339: The tsconfig.json you added contains an
invalid/unnecessary compiler option "project": "." inside compilerOptions which
can trigger TypeScript warnings; open the new tsconfig.json (look for the
compilerOptions block and the "project" key) and remove the "project": "." entry
(or move it out of compilerOptions to where your build tooling expects it),
verify only valid tsconfig compilerOptions remain (e.g., no misplaced top-level
build settings), and re-run tsc to confirm the warning is gone.
In `@src/action.ts`:
- Around line 123-125: Cache hits skip verifyAsset() because verification only
runs inside downloadBun(), so existsSync(bunPath) and restoreCache() trust the
binary by version alone and never emit the bun-download-checksum output; update
the cache handling so the trust model is consistent: either persist the verified
archive checksum with the cache entry when storing/restoring (so restoreCache()
reads and verifyAsset() is considered satisfied) or explicitly disable
checksum/verified mode on cache hits and emit a clear
bun-download-checksum-not-verified output. Concretely, modify the cache
save/restore logic referenced by restoreCache and the existsSync(bunPath) branch
to include the archive checksum (or a verified flag) alongside the cached
artifact and ensure downloadBun/verifyAsset and the code that sets
bun-download-checksum read that metadata and behave consistently.
In `@src/filesystem-cache.ts`:
- Around line 50-52: Remove the redundant existsSync check before creating the
cache directory: replace the conditional block that calls existsSync(CACHE_DIR)
then mkdirSync(CACHE_DIR, { recursive: true }) with a single
mkdirSync(CACHE_DIR, { recursive: true }) call so the directory creation is
idempotent; update any related comments in filesystem-cache.ts and keep
references to CACHE_DIR, existsSync, and mkdirSync to locate the change.
In `@src/manifest.ts`:
- Around line 80-85: In getVerifiedManifest, building ascUrl by
string-concatenating `${downloadUrl}.asc` incorrectly modifies query/fragment;
instead construct a URL object from downloadUrl (use new URL(downloadUrl)),
append ".asc" to its pathname (e.g. parsedUrl.pathname = parsedUrl.pathname +
'.asc' or parsedUrl.pathname += '.asc'), then use parsedUrl.toString() where
ascUrl is needed so the .asc is applied to the path only and query/fragment
remain intact; update the code around ascUrl/parsedUrl accordingly.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: e952eea6-0273-4ce0-84dc-63700531c492
⛔ Files ignored due to path filters (4)
bun.lock is excluded by !**/*.lock
dist/cache-save/index.js is excluded by !**/dist/**
dist/setup/index.js is excluded by !**/dist/**
package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (18)
.github/workflows/test.yml
README.md
action.yml
package.json
patches/compare-versions+6.1.1.patch
patches/[email protected]
src/action.ts
src/download-url.ts
src/filesystem-cache.ts
src/github-api.ts
src/github-asset.ts
src/manifest.ts
src/response-storage.ts
src/signing-key.ts
src/url.ts
src/utils.ts
src/verify.ts
tests/download-url.spec.ts
💤 Files with no reviewable changes (1)
- patches/[email protected]
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/manifest.ts`:
- Around line 144-147: The current logic in manifest verification incorrectly
compares a subkey's fingerprint to the primary key's fingerprint: locate the
block using publicKey.getCommonKeys(sig.keyID) and signingSubkey, and change it
so that if getCommonKeys(sig.keyID) returns a matching subkey (signingSubkey is
truthy) you treat the signature as trusted (return true) instead of comparing
signingSubkey.getFingerprint() to fingerprint; remove the fingerprint equality
check and simply accept the match from getCommonKeys (optionally ensure
getCommonKeys result is non-empty before returning true).
- Around line 22-46: The duplicate search should be consolidated into one pass:
replace the two separate blocks that build `subpackets`/`allSubpackets` and the
`creationSub`/for-loop with a single iteration over the combined set of
subpackets (e.g., Array.from(...) of `sigPacket.hashedSubpackets` and
`sigPacket.unhashedSubpackets` or `sigPacket.subpackets` when present) and check
`if (p?.type === 2 && p?.creationTime instanceof Date) return p.creationTime;`;
remove the earlier `creationSub` find and the later loop so you only inspect
each subpacket once and still handle iterable/map cases for
`hashedSubpackets`/`unhashedSubpackets`.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: befd21b9-2aef-4ce9-b16e-2e61a5419065
📒 Files selected for processing (1)
src/manifest.ts
✅ Actions performed: Reviews resumed.
- Cache headers with the response
First, `package-lock.json` then `bun.lock` by reading that file. Used: bun run modules:lock
e56dd20 to 7d54eaf (Compare)
Actionable comments posted: 5
♻️ Duplicate comments (4)
src/action.ts (1)
71-78: ⚠️ Potential issue | 🟠 Major: Defer and safely reconcile `bunx` against the final binary path.

`bunx` is created before download resolution and points to the initial `bunPath`; later the canonical binary can move to `result.binPath`, leaving `bunx` stale. This block also still relies on direct `error.code` access and EEXIST-only handling instead of guarded symlink reconciliation.

🔧 Suggested fix

```diff
-import { mkdirSync, symlinkSync, existsSync, readFileSync } from "node:fs";
+import {
+  mkdirSync,
+  symlinkSync,
+  existsSync,
+  readFileSync,
+  lstatSync,
+  readlinkSync,
+  unlinkSync,
+} from "node:fs";
@@
-  const bunPath = join(binPath, exe("bun"));
-  try {
-    symlinkSync(bunPath, join(binPath, exe("bunx")));
-  } catch (error) {
-    if ("EEXIST" !== error.code) {
-      throw error;
-    }
-  }
+  const bunPath = join(binPath, exe("bun"));
@@
   saveState("cache", stateValue);
+
+  const bunxPath = join(dirname(cacheState.bunPath), exe("bunx"));
+  const bunxTarget = exe("bun");
+  try {
+    const stat = lstatSync(bunxPath);
+    if (stat.isSymbolicLink() && readlinkSync(bunxPath) !== bunxTarget) {
+      unlinkSync(bunxPath);
+      symlinkSync(bunxTarget, bunxPath);
+    }
+  } catch (error) {
+    if (
+      !(error instanceof Error) ||
+      (error as NodeJS.ErrnoException).code !== "ENOENT"
+    ) {
+      throw error;
+    }
+    symlinkSync(bunxTarget, bunxPath);
+  }
```

```bash
#!/bin/bash
# Verify current symlink lifecycle and error narrowing in src/action.ts
rg -n "symlinkSync\(.*exe\(\"bunx\"\)\)|result\.binPath|cacheState\.bunPath|catch \(error\)|error\.code|lstatSync|unlinkSync|readlinkSync" src/action.ts -C2
```

Based on learnings, `bunx` must target `exe("bun")` relatively and be reconciled only after final `cacheState.bunPath`, with narrowed errors and symlink-only unlink behavior.

Also applies to: 232-237, 269-269
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/action.ts` around lines 71 - 78, The current creation of bunx uses an early absolute bunPath and swallows non-EEXIST errors; instead defer or reconcile bunx against the finalized cacheState.bunPath/result.binPath by creating bunx as a relative symlink that points to exe("bun") and, if bunx already exists, narrow error handling: use lstatSync to confirm it's a symlink, readlinkSync to inspect its target, and only unlinkSync when the target differs (and only if lstat indicates a symlink), then recreate the relative symlink; ensure you check and rethrow unexpected errors by testing err.code explicitly (e.g., EEXIST, ENOENT) rather than blind property access on error.patches/compare-versions+6.1.1.patch (1)
293-339: ⚠️ Potential issue | 🟡 Minor: Remove invalid `"project": "."` from tsconfig.json compilerOptions.

The `project` field is not a valid `compilerOptions` property in TypeScript's tsconfig schema. It may cause TypeScript to emit warnings. This option belongs at the top level (for project references) or should be removed entirely.

🔧 Proposed fix in the patch

```diff
     "skipLibCheck": true,
-
-    "project": ".",
   }
 }
```
"skipLibCheck": true, - - "project": ".", } }🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@patches/compare-versions`+6.1.1.patch around lines 293 - 339, Remove the invalid "project": "." entry from the tsconfig.json's compilerOptions block (the file added in the diff), because "project" is not a valid compilerOptions property; edit the tsconfig.json to delete that key (or move it to top-level if you intended a project reference) so the "compilerOptions" object only contains valid keys like "module", "target", "outDir", etc.tests/init.suite.ts (1)
4-4: 🧹 Nitpick | 🔵 Trivial: Remove unused `previousEnvVars` variable.

This variable captures the environment state but is never used for teardown or restoration. Either implement the teardown logic or remove the dead code.

♻️ Proposed fix

```diff
 import { join } from "node:path";
 import { tmpdir } from "node:os";

-const previousEnvVars = { ...process.env };
-
 // This is a side-effect only file
 process.env["RUNNER_TEMP"] = join(tmpdir(), "setup-bun-integration-test");
```
Verify each finding against the current code and only fix it if needed. In `@tests/init.suite.ts` at line 4, The variable previousEnvVars is declared but never used; either remove the dead declaration "previousEnvVars" from tests/init.suite.ts, or implement teardown that restores process.env using that snapshot (e.g., add an afterAll or afterEach hook that assigns the saved previousEnvVars back to process.env or selectively restores modified keys). Ensure you reference the existing previousEnvVars identifier and process.env in the chosen fix so the intent is clear.src/manifest.ts (1)
79-80: ⚠️ Potential issue | 🟠 Major: Redact `ascUrl` in thrown errors to avoid leaking signed/credentialed URLs.

Line 79 and Line 113 currently interpolate the full URL directly into exception text.

🔧 Proposed fix

```diff
-import { getValidatedLastModified, request } from "./utils";
+import { getValidatedLastModified, request, redactUrlForLogs } from "./utils";
@@
   const ascUrl = parsedUrl.href;
+  const safeAscUrl = redactUrlForLogs(ascUrl);
@@
-  throw new Error(`No PGP signatures from ${fingerprint} found in ${ascUrl}`);
+  throw new Error(
+    `No PGP signatures from ${fingerprint} found in ${safeAscUrl}`,
+  );
@@
-    `PGP Signature verification failed for ${ascUrl}: ${errMessage}`,
+    `PGP Signature verification failed for ${safeAscUrl}: ${errMessage}`,
   );
```

Also applies to: 112-114
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/manifest.ts` around lines 79 - 80, The error messages currently interpolate the full ascUrl (e.g. in the throw new Error(`No PGP signatures from ${fingerprint} found in ${ascUrl}`) statements), which can leak credentialed URLs; update those throws to omit or redact ascUrl (use a placeholder like "[REDACTED]" or only include non-sensitive parts such as hostname) and retain fingerprint for context, while if full ascUrl is needed for debugging send it to a safe logger instead of the thrown Error; specifically change the throw expressions that reference ascUrl to use a redacted string and keep any logging of the full ascUrl separate.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/test.yml:
- Line 303: The echo line is dedented and falls outside the run: shell block;
indent the `echo "Return code: ${status}"` line so it is part of the `run:`
block (add 4 more spaces so it totals 12 spaces to match the surrounding else
block indentation), ensuring the `else` / `run:` structure remains aligned and
the shell commands are parsed as script content.
In `@src/action.ts`:
- Line 93: Restore and save are deriving cache keys from different URL sources
(getCacheKey(url) for restore vs result.url for save) which can drift if
downloadBun normalizes the URL; change the code to derive a single stable cache
key from the original input (call getCacheKey(url) once into cacheKey) and reuse
that same cacheKey when saving the cache instead of using result.url, or
alternatively compute and persist the normalized URL immediately after download
and consistently use that persisted value for both restore and save operations
(refs: getCacheKey, downloadBun, cacheKey, result.url).
In `@src/extract-bun.ts`:
- Around line 14-18: The filename check and the try/catch are misordered: change
the regex to anchor the match (e.g., /^bun.*\.zip$/ against name) and move the
try so it wraps both extractZip(entryPath) and extractBun(extractedPath) calls;
i.e., for the branch that tests name with the anchored regex, place a try {
const extractedPath = await extractZip(entryPath); return await
extractBun(extractedPath); } catch (err) { /* fallback handling */ } so
extractZip failures are caught alongside extractBun.
In `@src/verify.ts`:
- Around line 26-30: The type-guard isSupportedAlgorithm currently uses the `in`
operator which can match inherited properties; change it to an own-property
check by using Object.prototype.hasOwnProperty.call(supportedAlgorithms,
algoName) (or an equivalent own-property check) so the guard only returns true
for keys actually present on supportedAlgorithms; update the function
isSupportedAlgorithm accordingly to use that check.
In `@tests/response-storage.test.ts`:
- Line 104: The test assigns a non-standard property to the Response object
using "res.storedAt = oldDate.getTime()" which breaks strict TypeScript; change
it to use the established escape-hatch cast pattern so it matches the other
occurrence (e.g., replace the assignment with "(res as any).storedAt =
oldDate.getTime()") so the compiler accepts the ad-hoc property while keeping
the rest of the test unchanged.
---
Duplicate comments:
In `@patches/compare-versions`+6.1.1.patch:
- Around line 293-339: Remove the invalid "project": "." entry from the
tsconfig.json's compilerOptions block (the file added in the diff), because
"project" is not a valid compilerOptions property; edit the tsconfig.json to
delete that key (or move it to top-level if you intended a project reference) so
the "compilerOptions" object only contains valid keys like "module", "target",
"outDir", etc.
In `@src/action.ts`:
- Around line 71-78: The current creation of bunx uses an early absolute bunPath
and swallows non-EEXIST errors; instead defer or reconcile bunx against the
finalized cacheState.bunPath/result.binPath by creating bunx as a relative
symlink that points to exe("bun") and, if bunx already exists, narrow error
handling: use lstatSync to confirm it's a symlink, readlinkSync to inspect its
target, and only unlinkSync when the target differs (and only if lstat indicates
a symlink), then recreate the relative symlink; ensure you check and rethrow
unexpected errors by testing err.code explicitly (e.g., EEXIST, ENOENT) rather
than blind property access on error.
In `@src/manifest.ts`:
- Around line 79-80: The error messages currently interpolate the full ascUrl
(e.g. in the throw new Error(`No PGP signatures from ${fingerprint} found in
${ascUrl}`) statements), which can leak credentialed URLs; update those throws
to omit or redact ascUrl (use a placeholder like "[REDACTED]" or only include
non-sensitive parts such as hostname) and retain fingerprint for context, while
if full ascUrl is needed for debugging send it to a safe logger instead of the
thrown Error; specifically change the throw expressions that reference ascUrl to
use a redacted string and keep any logging of the full ascUrl separate.
In `@tests/init.suite.ts`:
- Line 4: The variable previousEnvVars is declared but never used; either remove
the dead declaration "previousEnvVars" from tests/init.suite.ts, or implement
teardown that restores process.env using that snapshot (e.g., add an afterAll or
afterEach hook that assigns the saved previousEnvVars back to process.env or
selectively restores modified keys). Ensure you reference the existing
previousEnvVars identifier and process.env in the chosen fix so the intent is
clear.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 5d1ebc5b-498a-4601-b29c-45d1f13d368b
⛔ Files ignored due to path filters (4)
bun.lock is excluded by !**/*.lock
dist/cache-save/index.js is excluded by !**/dist/**
dist/setup/index.js is excluded by !**/dist/**
package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (34)
.github/workflows/format.yml
.github/workflows/test.yml
.gitignore
README.md
action.yml
bunfig.toml
package.json
patches/compare-versions+6.1.1.patch
patches/[email protected]
src/action.ts
src/atomic-write.ts
src/bunfig.ts
src/cache-save.ts
src/download-bun.ts
src/download-url.ts
src/extract-bun.ts
src/filesystem-cache.ts
src/github-api.ts
src/github-asset.ts
src/index.ts
src/manifest.ts
src/quick-checksum.ts
src/response-storage.ts
src/signing-key.ts
src/url.ts
src/utils.ts
src/verify.ts
tests/atomic-write.test.ts
tests/bundled.spec.ts
tests/bundled.suite.ts
tests/download-url.spec.ts
tests/filesystem-cache.test.ts
tests/init.suite.ts
tests/response-storage.test.ts
💤 Files with no reviewable changes (1)
- patches/[email protected]
```ts
    revision,
  };

  const cacheKey = getCacheKey(url);
```
Use a stable cache URL source for both restore and save key derivation.
Restore key uses getCacheKey(url) from pre-download input, but saved state URL is derived from result.url; if normalization changes URL, restore and save can drift and cause recurring cache misses.
🔧 Suggested fix
```diff
-  const cacheKey = getCacheKey(url);
+  const cacheUrl = sUrl;
+  const cacheKey = getCacheKey(cacheUrl);
@@
-  cacheState.url = result.url;
+  cacheState.url = result.url; // output URL (raw) for downstream steps
@@
-  const stateValue = JSON.stringify({
-    ...cacheState,
-    url: stripUrlCredentials(cacheState.url),
-  });
+  const stateValue = JSON.stringify({
+    ...cacheState,
+    url: cacheUrl, // stable cache/state URL used for restore/save keying
+  });
```

Based on learnings, downloadBun may return a normalized URL different from input, so cache key material should come from a stable state/caching URL rather than post-download output URL.
Also applies to: 237-237, 260-263
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/action.ts` at line 93, Restore and save are deriving cache keys from
different URL sources (getCacheKey(url) for restore vs result.url for save)
which can drift if downloadBun normalizes the URL; change the code to derive a
single stable cache key from the original input (call getCacheKey(url) once into
cacheKey) and reuse that same cacheKey when saving the cache instead of using
result.url, or alternatively compute and persist the normalized URL immediately
after download and consistently use that persisted value for both restore and
save operations (refs: getCacheKey, downloadBun, cacheKey, result.url).
```ts
if (/^bun.*\.zip/.test(name)) {
  const extractedPath = await extractZip(entryPath);
  try {
    return await extractBun(extractedPath);
  } catch {
```
Anchor zip matching and catch extractZip failures in the same fallback block.
Line 14 currently matches filenames that only contain .zip, and Line 15 can throw before your catch executes. That can abort extraction even when valid siblings exist.
🔧 Proposed fix
```diff
-  if (/^bun.*\.zip/.test(name)) {
-    const extractedPath = await extractZip(entryPath);
-    try {
-      return await extractBun(extractedPath);
-    } catch {
-      // keep searching sibling entries
-    }
-  }
+  if (/^bun.*\.zip$/.test(name)) {
+    try {
+      const extractedPath = await extractZip(entryPath);
+      return await extractBun(extractedPath);
+    } catch {
+      // keep searching sibling entries
+    }
+  }
```
+ }🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/extract-bun.ts` around lines 14 - 18, The filename check and the
try/catch are misordered: change the regex to anchor the match (e.g.,
/^bun.*\.zip$/ against name) and move the try so it wraps both
extractZip(entryPath) and extractBun(extractedPath) calls; i.e., for the branch
that tests name with the anchored regex, place a try { const extractedPath =
await extractZip(entryPath); return await extractBun(extractedPath); } catch
(err) { /* fallback handling */ } so extractZip failures are caught alongside
extractBun.
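The difference between the unanchored and anchored patterns is easy to demonstrate (the filenames below are invented):

```typescript
// Unanchored tail: matches anywhere after "bun", so "bun-....zip.sha256" slips through.
const loose = /^bun.*\.zip/;
// Anchored at both ends: only genuine .zip names match.
const strict = /^bun.*\.zip$/;

const names = ["bun-linux-x64.zip", "bun-linux-x64.zip.sha256", "notes-bun.zip"];
const looseMatches = names.filter((n) => loose.test(n));
const strictMatches = names.filter((n) => strict.test(n));
// looseMatches  -> ["bun-linux-x64.zip", "bun-linux-x64.zip.sha256"]
// strictMatches -> ["bun-linux-x64.zip"]
```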
```ts
function isSupportedAlgorithm(
  algoName: string,
): algoName is SupportedAlgorithmNames {
  return algoName in supportedAlgorithms;
}
```
🧩 Analysis chain
🏁 Script executed:
# First, let's examine the verify.ts file and how supportedAlgorithms is defined
find . -name "verify.ts" -type f

Repository: oven-sh/setup-bun
Length of output: 76
🏁 Script executed:
# Look at the src/verify.ts file around lines 26-30
head -50 src/verify.ts | cat -n

Repository: oven-sh/setup-bun
Length of output: 2022
🏁 Script executed:
# Search for supportedAlgorithms definition
rg "supportedAlgorithms" -B 5 -A 5Repository: oven-sh/setup-bun
Length of output: 1231
🏁 Script executed:
# Check the full definition and usage context
cat -n src/verify.ts | head -100

Repository: oven-sh/setup-bun
Length of output: 4018
Use an own-property check for the algorithm guard.
Line 29 uses the in operator to check for algorithm support. While this works for the current plain object, using hasOwnProperty is the more precise and defensive approach for dictionary lookups, as in includes inherited properties.
🔧 Proposed fix
```diff
 function isSupportedAlgorithm(
   algoName: string,
 ): algoName is SupportedAlgorithmNames {
-  return algoName in supportedAlgorithms;
+  return Object.prototype.hasOwnProperty.call(supportedAlgorithms, algoName);
 }
```
}🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/verify.ts` around lines 26 - 30, The type-guard isSupportedAlgorithm
currently uses the `in` operator which can match inherited properties; change it
to an own-property check by using
Object.prototype.hasOwnProperty.call(supportedAlgorithms, algoName) (or an
equivalent own-property check) so the guard only returns true for keys actually
present on supportedAlgorithms; update the function isSupportedAlgorithm
accordingly to use that check.
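The behavior the guard change defends against can be shown with a plain object (the table below is a stand-in, not the action's real supportedAlgorithms):

```typescript
const supportedAlgorithms: Record<string, unknown> = {
  sha256: true,
  sha512: true,
};

// `in` walks the prototype chain, so inherited members like toString match:
const inherited = "toString" in supportedAlgorithms;

// An own-property check only accepts keys defined on the object itself:
const ownInherited = Object.prototype.hasOwnProperty.call(
  supportedAlgorithms,
  "toString",
);
const ownReal = Object.prototype.hasOwnProperty.call(
  supportedAlgorithms,
  "sha512",
);
```

On runtimes with ES2022 support, Object.hasOwn(supportedAlgorithms, algoName) gives the same result with less ceremony.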
const res = new Response("old content", {
  headers: { "Date": oldDate.toUTCString() },
});
res.storedAt = oldDate.getTime();
🧩 Analysis chain
🏁 Script executed:
# Check the actual code at the specified lines
fd response-storage.test.ts --type f | head -1 | xargs cat -n | sed -n '100,110p'

Repository: oven-sh/setup-bun
Length of output: 441
🏁 Script executed:
# Search for other storedAt usages in the test file
fd response-storage.test.ts --type f | xargs grep -n "storedAt" | head -20

Repository: oven-sh/setup-bun
Length of output: 104
🏁 Script executed:
# Search for storedAt in source files (non-test)
rg "storedAt" --type ts -g "src/**" -n

Repository: oven-sh/setup-bun
Length of output: 286
Use the test escape-hatch cast when setting storedAt.
Line 104 sets a non-standard property on Response without a type cast, which violates TypeScript's type safety and fails type-checking. The source code already establishes the pattern at line 198: (res as any).storedAt. Apply the same minimal cast syntax here.
🔧 Proposed fix
- res.storedAt = oldDate.getTime();
+ (res as any).storedAt = oldDate.getTime();

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
- res.storedAt = oldDate.getTime();
+ (res as any).storedAt = oldDate.getTime();
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@tests/response-storage.test.ts` at line 104, The test assigns a non-standard
property to the Response object using "res.storedAt = oldDate.getTime()" which
breaks strict TypeScript; change it to use the established escape-hatch cast
pattern so it matches the other occurrence (e.g., replace the assignment with
"(res as any).storedAt = oldDate.getTime()") so the compiler accepts the ad-hoc
property while keeping the rest of the test unchanged.
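The cast pattern itself, sketched without DOM typings (FakeResponse is a hypothetical stand-in so the snippet does not depend on the Response global):

```typescript
class FakeResponse {
  constructor(public body: string) {}
}

const res = new FakeResponse("old content");

// res.storedAt = 123;        // rejected: the property is not on the type
(res as any).storedAt = 123;  // escape hatch: opts this one expression out

const stored = (res as any).storedAt;
```

The cast is scoped to a single expression, so the rest of the test keeps full type-checking on res.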
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/test.yml:
- Around line 299-304: The workflow currently greps for the transport-prefixed
string "HTTP/1.1 GET https://registry.npmjs.org/is-odd" making the assertion
fragile; update the grep to match the registry URL only and as a fixed string
(e.g. use grep -F <<<"$output" -q "https://registry.npmjs.org/is-odd") in the
shell if block that checks installation success, and make the identical change
for the second occurrence of the same assertion elsewhere in the file.
- Around line 43-44: The workflow is appending dist to PATH and GITHUB_PATH too
late so the runner's preinstalled npm/npx are used; change the step that writes
"$(realpath -e dist)" to GITHUB_PATH and updates PATH so that dist is prepended
(not appended) before any lock/install commands run (so that the custom shims in
dist take precedence for commands like bun run 'modules:lock' which invokes
npm); specifically ensure the command that writes to GITHUB_PATH and sets PATH
uses prepending (e.g., PATH="$(realpath -e dist):${PATH}") and that this step
occurs before the modules:lock/install steps.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: ba80fa33-3724-448a-b8c2-05372b6c7d49
📒 Files selected for processing (1)
.github/workflows/test.yml
if [ 0 -eq "${status}" ] && grep <<<"$output" -q "HTTP/1.1 GET https://registry.npmjs.org/is-odd"; then
  echo "Successfully installed from default registry"
else
  echo "${output}"
  echo "Return code: ${status}"
  echo "Failed to install from default registry"
🧹 Nitpick | 🔵 Trivial
Match on the registry URL, not Bun's exact verbose log prefix.
These assertions currently depend on Bun printing HTTP/1.1 and lowercase %2f. The invariant this workflow owns is the registry URL itself; coupling to the transport/log format will create noisy failures if Bun switches to HTTP/2 or changes its encoding style.
🔧 More robust assertion
- if [ 0 -eq "${status}" ] && grep <<<"$output" -q "HTTP/1.1 GET https://registry.npmjs.org/is-odd"; then
+ if [ 0 -eq "${status}" ] && grep -Fiq "https://registry.npmjs.org/is-odd" <<<"$output"; then
echo "Successfully installed from default registry"
@@
- if [ 0 -eq "${status}" ] && grep <<<"$output" -q "HTTP/1.1 GET https://registry.yarnpkg.com/@types%2fbun"; then
+ if [ 0 -eq "${status}" ] && grep -Fiq "https://registry.yarnpkg.com/@types%2fbun" <<<"$output"; then
echo "Successfully installed from `@types` registry"

Also applies to: 314-319
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.github/workflows/test.yml around lines 299 - 304, The workflow currently
greps for the transport-prefixed string "HTTP/1.1 GET
https://registry.npmjs.org/is-odd" making the assertion fragile; update the grep
to match the registry URL only and as a fixed string (e.g. use grep -F
<<<"$output" -q "https://registry.npmjs.org/is-odd") in the shell if block that
checks installation success, and make the identical change for the second
occurrence of the same assertion elsewhere in the file.
I haven't actually seen this output yet.
However, it seems like removing GET is a mistake. Presumably, the URL may appear multiple times and we only want to check for a particular instance in the output.
@tcely, that's a fair correction. The real fragility is only the HTTP/1.1 prefix (which breaks if Bun's verbose output switches to HTTP/2), not the GET verb itself. Keeping GET is the right call since it anchors the match to the actual outgoing request line rather than any other mention of the URL (e.g. resolved/downloaded lines).
A minimal adjustment would be to drop only HTTP/1.1 while keeping GET:
- if [ 0 -eq "${status}" ] && grep <<<"$output" -q "HTTP/1.1 GET https://registry.npmjs.org/is-odd"; then
+ if [ 0 -eq "${status}" ] && grep <<<"$output" -Fq "GET https://registry.npmjs.org/is-odd"; then

- if [ 0 -eq "${status}" ] && grep <<<"$output" -q "HTTP/1.1 GET https://registry.yarnpkg.com/@types%2fbun"; then
+ if [ 0 -eq "${status}" ] && grep <<<"$output" -Fq "GET https://registry.yarnpkg.com/@types%2fbun"; then

The -F (fixed-string) flag is also worth keeping: the URL contains characters like / and %, which have no special meaning in BRE/ERE, but -F makes the intent explicit. Since you mentioned you haven't seen the actual output yet, you may want to revisit this once you have a real run to inspect; if the format turns out to be stable, the current code is fine to leave as-is.
🧠 Learnings used
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: .github/workflows/test.yml:43-44
Timestamp: 2026-03-15T02:04:23.763Z
Learning: In GitHub Actions workflows under .github/workflows, ensure that dist/ is appended to PATH in the Install dependencies step (as in oven-sh/setup-bun PR `#171`). The dist/ directory provides fallback shims for node, npm, and npx so the workflow can run with only bun available, while real versions on the runner should take precedence. Do not prepend dist/ to PATH.
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: src/github-asset.ts:115-128
Timestamp: 2026-03-08T21:18:39.908Z
Learning: In `src/github-asset.ts` of `oven-sh/setup-bun` (PR `#171`), the preferred approach to normalize tag handling in `parseAssetUrl` is `const parts = urlObj.pathname.slice(1).split("/").map(decodeURIComponent)` — this decodes all path segments uniformly. As a result, the existing `decodeURIComponent(parts[5])` on `name` must be removed (it would double-decode). In `fetchAssetMetadata`, `base.tag` should always be stored decoded (including `base.tag = release.tag_name` for latest releases) and encoded with `encodeURIComponent(base.tag)` only when constructing the API URL path.
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: src/action.ts:59-60
Timestamp: 2026-03-10T19:27:53.007Z
Learning: In `src/action.ts` of `oven-sh/setup-bun` (PR `#171`), the raw (potentially credentialed) URL is intentionally returned in the `Output` struct (`cacheState.url = result.url`) so that downstream workflow steps consuming the `bun-download-url` output can use it. Credential stripping via `stripUrlCredentials` is only applied to `sUrl`, which is used exclusively for state persistence (the `.json` sidecar) and cache-key comparison. Do NOT flag the returned URL as a credential leak.
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: src/verify.ts:107-117
Timestamp: 2026-03-06T02:54:34.457Z
Learning: In `src/verify.ts` of the `setup-bun` action, the `setOutput("bun-download-checksum", ...)` call inside the GitHub API digest check branch (before PGP manifest verification) is intentional. The author wants to publish the checksum as soon as the GitHub API digest matches the local hash, rather than waiting for the full PGP manifest verification to complete. Do not flag this as a premature output.
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: src/action.ts:116-121
Timestamp: 2026-03-07T22:33:16.093Z
Learning: In `src/action.ts` of `oven-sh/setup-bun` (PR `#171`), the `if (!cacheHit) { downloadBun(...) }` branch is intentionally placed **outside** the `if (!revision)` block. This means the fast-path (existing binary + version match, no `.json` sidecar) leaves `cacheHit = false`, so `downloadBun` (and thus `verifyAsset`) still runs to produce a verified checksum before any caching. Setting `cacheHit = cacheState.cacheHit` (not unconditionally `true`) in the fast-path is correct and intentional — do not suggest changing it to `true`.
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: src/github-asset.ts:119-127
Timestamp: 2026-03-06T00:03:37.714Z
Learning: In `oven-sh/setup-bun`, the `request` function in `src/utils.ts` already throws a descriptive error (including status code, status text, and response body) for any non-OK HTTP response before returning. Callers of `request` never need to add their own `!response.ok` guard — suggesting one is redundant.
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: src/download-bun.ts:29-33
Timestamp: 2026-03-08T21:54:43.567Z
Learning: In `src/download-bun.ts` of `oven-sh/setup-bun` (PR `#171`), `downloadBun(url, bunPath, token?)` intentionally returns `Promise<{ binPath: string; checksum: string; url: string }>`. The returned `url` may differ from the input because the URL can be normalized during the download process, and `binPath` may differ from the input `bunPath` because the final binary path is resolved during extraction. Do NOT flag this return contract as redundant or suggest narrowing it to only `{ revision, checksum }`.
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: src/verify.ts:124-141
Timestamp: 2026-03-06T03:02:02.768Z
Learning: In src/verify.ts of the setup-bun repo, all Bun release archives follow the naming convention `bun-<platform>.zip` — they always start with `bun-` and end with `.zip`. Use `lastSegment.startsWith("bun-") && lastSegment.endsWith(".zip")` to validate asset URLs before deriving the manifest URL via path-replacement fallback.
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: src/manifest.ts:61-75
Timestamp: 2026-03-08T21:58:44.853Z
Learning: In `src/manifest.ts` of `oven-sh/setup-bun` (PR `#171`), the `verification.signatures.find(...)` pattern is intentional: a bad (unverified) signature from a trusted key is treated as a hard release-process error, not a fallback scenario. Do NOT suggest iterating over multiple candidate signatures to find a valid one — the single-match approach is by design.
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: package.json:21-24
Timestamp: 2026-03-13T23:38:01.902Z
Learning: In `package.json` of `oven-sh/setup-bun` (PR `#171`), all externally-invoked tool commands in scripts (e.g., `patch-package`, `prettier`, `esbuild`) must be prefixed with `npx -- ` (e.g., `"npx -- patch-package"`). This is because Bun intercepts bare `npx` calls and rewrites them to `bun x --` during testing. Always use `npx -- <tool>` for consistency and correctness in this project.
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: src/action.ts:71-78
Timestamp: 2026-03-14T10:48:53.035Z
Learning: In `src/action.ts` of `oven-sh/setup-bun` (PR `#171`), when reconciling the `bunx` symlink, caught errors must be narrowed from `unknown` with `instanceof Error` before accessing `.code` (e.g., `if (!(error instanceof Error) || (error as NodeJS.ErrnoException).code !== "ENOENT") throw error`). Use `lstatSync` to check existence and type before touching the path; only call `unlinkSync` (never `rmSync` with `force: true`) and only when `lstatSync` confirms the entry is a symbolic link (`stat.isSymbolicLink()`). Non-symlink entries must not be removed. Create the symlink with `symlinkSync(exe("bun"), bunxPath)` (relative target).
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: src/action.ts:71-78
Timestamp: 2026-03-14T10:33:33.207Z
Learning: In `src/action.ts` of `oven-sh/setup-bun` (PR `#171`), the `bunx` symlink target must be the relative name `exe("bun")` (i.e., `"bun"` on Linux/macOS, `"bun.exe"` on Windows), not the absolute path. The destination (where `bunx` is placed) may be an absolute path. Use `symlinkSync(exe("bun"), bunxPath)` and check staleness with `readlinkSync(bunxPath) !== exe("bun")`. The symlink should be deferred until after `downloadBun` resolves the final binary path (`cacheState.bunPath`).
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: src/action.ts:123-125
Timestamp: 2026-03-06T11:58:22.397Z
Learning: In `src/action.ts` of `oven-sh/setup-bun` (PR `#171`), the correct way to persist the verified download checksum between the main action run and the post-run step is via `saveState`, by adding a `checksum?: string` field to `CacheState` and saving it with `JSON.stringify(cacheState)`. Do not use the filesystem cache (`src/filesystem-cache.ts`) for this purpose.
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: src/action.ts:101-125
Timestamp: 2026-03-08T01:27:05.925Z
Learning: In `src/action.ts` of `oven-sh/setup-bun` (PR `#171`), the fast-path binary trust mechanism uses a `binaryFingerprint` field in `CacheState` (persisted to the `.json` sidecar). The fingerprint is computed by `computeBinaryFingerprint(filePath)`: reads first and last 256 KiB (BINARY_PARTIAL_SIZE = 256 * 1024), individually sha512-hashes each partial, then combines stat metadata (size, mtimeMs, ino) and both partial hashes into an ordered string `size:${size}|mtimeMs:${mtimeMs}|ino:${ino}|head:${headHash}|tail:${tailHash}` and returns `"sha512:" + sha512(combined)`. This fingerprint is stored as `binaryFingerprint?: string` in `CacheState` and compared on the fast path before trusting the existing binary.
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: src/manifest.ts:22-46
Timestamp: 2026-03-07T09:52:23.520Z
Learning: In `src/manifest.ts` of `oven-sh/setup-bun` (PR `#171`), fix `getSignatureDate` to await `VerificationResult.signature` (which is a `Promise<Signature>`) and read the creation time from the resolved `SignaturePacket`, e.g. `sigObj.packets[0]?.created`. Update the function to accept a `VerificationResult` and return the timestamp after awaiting `.signature`. Remove any unnecessary subpacket traversal and ensure the result is a `Date`. Alternatively, the creation time can be emitted inline after `await sigResult.verified` resolves, eliminating the need for a separate helper entirely.
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: src/manifest.ts:22-46
Timestamp: 2026-03-07T09:25:44.242Z
Learning: In `src/manifest.ts` of `oven-sh/setup-bun` (PR `#171`), the `getSignatureDate` helper never successfully returns a timestamp because `VerificationResult.signature` in openpgp.js v6 is a `Promise<Signature>` that must be awaited first. After awaiting, the creation time is available directly as `sigObj.packets[0]?.created` (a plain `Date`) on the resolved `SignaturePacket` — no subpacket traversal is needed. The function should be updated to accept a `VerificationResult` and await its `.signature` property.
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: src/response-storage.ts:81-89
Timestamp: 2026-03-13T23:37:34.531Z
Learning: In `src/response-storage.ts` of `oven-sh/setup-bun` (PR `#171`), the synthetic backdated `Last-Modified` header in the legacy/raw cache handler (`new Date(Date.now() - CACHE_TTL).toUTCString()`) is intentional. It anchors OpenPGP signature verification (via `getValidatedLastModified` → `openpgp.verify({ date })`) in the recent past so that signatures which appear expired relative to `now` still pass. The accepted trade-off is a temporary failure window of up to `CACHE_TTL` (12 hours) for freshly published manifests. Do NOT suggest removing this header.
Learnt from: tcely
Repo: oven-sh/setup-bun PR: 171
File: src/filesystem-cache.ts:98-109
Timestamp: 2026-03-07T09:24:30.540Z
Learning: In `src/filesystem-cache.ts` of `oven-sh/setup-bun` (PR `#171`), the filesystem cache uses zero-padded 3-digit month subdirectories (001–012) under `$RUNNER_TEMP/setup-bun/`. Cache reads check the current AND previous month directories (sliding window). `cleanupFutureCache()` removes only the next-month directory on each `setCache` call to bound growth. A 12-hour mtime check is also applied within `getCache` as an additional TTL. Atomic writes use `randomBytes`-based temp extensions plus `renameSync`.
✅ Actions performed
Reviews resumed.
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/test.yml:
- Around line 294-305: Extract the duplicated install/assert logic into a small
bash helper function (e.g., run_install_and_assert) that accepts the install
command and expected grep string; inside it run the command (the bun add is-odd
--verbose --force invocation), capture status and output into the same
status/output variables, grep for the expected HTTP/1.1 GET string, print
diagnostics (output and Return code) and exit 1 on failure, and then replace
both duplicate blocks with calls to this helper passing the appropriate expected
registry string.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: d94f60cc-a442-4dc9-a13b-43d346fd377e
📒 Files selected for processing (1)
.github/workflows/test.yml
Introduces end-to-end asset verification (hash + OpenPGP manifest), GitHub API/header and URL helpers, filesystem-backed caching and response storage, signing-key retrieval with cache/failover, CI runner normalization, package/build script updates, and exposes a verified download checksum output.
See oven-sh/bun#27830 for the canary failures.