Systematic audit of Next.js test suite (file-by-file) #204

@southpolesteve

Description

Problem

Our Next.js compatibility test tracking (tests/nextjs-compat/TRACKING.md) was built feature-first: identify the features vinext implements, then find the relevant Next.js tests. That approach missed edge cases and error-handling tests that don't map neatly onto a "feature."

Example: test/e2e/app-dir/proxy-missing-export/ tests that Next.js throws an error when a proxy/middleware file doesn't export the expected function. Our middleware implementation silently failed open instead, letting requests through unprotected. This was never caught because middleware was already "covered" by other tests (ON-6, ON-11), and the gap analysis never opened this specific test directory.
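The fail-closed behavior the Next.js test expects can be sketched like this (hypothetical helper names, not vinext's actual loader API):

```typescript
// Hypothetical sketch of fail-closed middleware resolution. The function
// name and module shape are illustrative assumptions, not vinext's API.
type MiddlewareModule = Record<string, unknown>;
type MiddlewareFn = (req: unknown) => unknown;

function resolveMiddleware(mod: MiddlewareModule): MiddlewareFn {
  // Next.js accepts either a named `middleware` export or a default export.
  const candidate = mod.middleware ?? mod.default;
  if (typeof candidate !== "function") {
    // Fail closed: a malformed middleware file should be a hard error,
    // not a silent pass-through that leaves requests unprotected.
    throw new Error(
      "Middleware file must export a `middleware` function or a default function",
    );
  }
  return candidate as MiddlewareFn;
}
```

The point of the port in #203 is exactly this distinction: the missing export should surface as an error at load time rather than being treated as "no middleware."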

The root cause is that TRACKING.md was built by asking "what features do we have, and do we have tests?" instead of "what does Next.js test, and do we match?"

Fixed in #203, but we need a systematic audit to find other gaps like this.

Proposal

Do a file-by-file walk through every test directory in the Next.js repo's test suite. For each directory:

  1. Read what the test covers
  2. Determine if it's relevant to vinext (skip build-only, Turbopack-specific, Vercel-deploy-specific)
  3. If relevant, check whether we have equivalent coverage
  4. If not, either port the test or document why it's not applicable
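The walk itself can be mechanized so that no directory gets skipped the way proxy-missing-export was. A minimal sketch (the helper name is an assumption; it just enumerates immediate subdirectories):

```typescript
// Hypothetical enumerator for the audit: list every immediate test
// directory under a root so each one can be checked off exactly once.
import { readdirSync } from "node:fs";
import { join } from "node:path";

function listTestDirs(root: string): string[] {
  return readdirSync(root, { withFileTypes: true })
    .filter((entry) => entry.isDirectory())
    .map((entry) => join(root, entry.name))
    .sort();
}
```

Running this over test/e2e/app-dir/ in a Next.js checkout would give the exhaustive worklist; steps 1-4 above are then applied per entry.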

Directories to audit

  • test/e2e/app-dir/ (365+ directories, partially covered by TRACKING.md)
  • test/e2e/ top-level (middleware, pages router, config, etc.)
  • test/unit/ (pure function tests for routing, matching, etc.)

Methodology

For each directory, record in a tracking document:

  • Directory name and what it tests
  • Relevance to vinext (yes/no/partial)
  • Current vinext coverage (covered/missing/partial)
  • Action needed (port tests/skip/N/A)
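The fields above could be captured as a typed record so entries stay uniform across the tracking document. A sketch (field names and the example values are illustrative assumptions, not an existing vinext schema):

```typescript
// Hypothetical tracking-entry shape mirroring the fields listed above.
type Relevance = "yes" | "no" | "partial";
type Coverage = "covered" | "missing" | "partial";
type Action = "port" | "skip" | "n/a";

interface AuditEntry {
  directory: string; // e.g. "test/e2e/app-dir/proxy-missing-export"
  summary: string;   // what the test covers
  relevance: Relevance;
  coverage: Coverage;
  action: Action;
}

// Illustrative entry for the gap described in this issue, as it would
// have looked before the fix in #203.
const example: AuditEntry = {
  directory: "test/e2e/app-dir/proxy-missing-export",
  summary: "errors when a proxy/middleware file lacks the expected export",
  relevance: "yes",
  coverage: "missing",
  action: "port",
};
```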

Priority ordering

Focus first on:

  1. Error handling and validation tests (like proxy-missing-export), since these are the most likely to be missed by feature-first analysis and the most dangerous when missing
  2. Edge cases for already-implemented features
  3. New feature areas not yet covered
