feat: add HuggingFace Transformers.js local inference provider#604

Closed
SaaiAravindhRaja wants to merge 9 commits into accomplish-ai:main from SaaiAravindhRaja:feat/issue-183-huggingface-transformers-local

Conversation

@SaaiAravindhRaja
Contributor

@SaaiAravindhRaja SaaiAravindhRaja commented Feb 22, 2026

Description

Add HuggingFace Transformers.js as a local inference provider, enabling users to run ONNX models locally via a lightweight inference server.

Demo

demo-issue-183.mov

Changes

  • New huggingface-local provider type in agent-core with connection testing and model fetching
  • Migration v009 adding huggingface_local_config column to provider settings
  • Provider form UI with server URL configuration and connect/disconnect flow
  • IPC handlers for testHuggingFaceLocalConnection and fetchHuggingFaceLocalModels
  • HuggingFace logo asset and provider grid entry
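
The connection-testing flow described above can be pictured roughly as follows, assuming the server exposes an OpenAI-compatible `GET /v1/models` endpoint. The function names, the timeout value, and the response shape are illustrative assumptions, not the actual agent-core implementation:

```typescript
// Hypothetical sketch of the provider's connection test; names and
// defaults are assumptions for illustration only.
interface HFLocalTestResult {
  success: boolean;
  models?: string[];
  error?: string;
}

// Trim trailing slashes so `${url}/v1/models` never contains a double slash.
function normalizeServerUrl(url: string): string {
  return url.trim().replace(/\/+$/, '');
}

async function testHuggingFaceLocalConnection(
  serverUrl: string,
  timeoutMs = 5000,
): Promise<HFLocalTestResult> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    const res = await fetch(`${normalizeServerUrl(serverUrl)}/v1/models`, {
      method: 'GET',
      signal: controller.signal,
    });
    if (!res.ok) {
      return { success: false, error: `HTTP ${res.status}` };
    }
    // OpenAI-compatible servers return { data: [{ id: string }, ...] }.
    const body = (await res.json()) as { data?: { id: string }[] };
    return { success: true, models: (body.data ?? []).map((m) => m.id) };
  } catch (err) {
    return { success: false, error: err instanceof Error ? err.message : String(err) };
  } finally {
    clearTimeout(timer);
  }
}
```

Aborting the fetch via `AbortController` keeps a hung local server from blocking the settings UI indefinitely.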

Type of Change

  • New feature (non-breaking change which adds functionality)

Checklist

  • My code follows the project's coding standards
  • I have performed a self-review of my code
  • New and existing tests pass locally

Related Issues

Fixes #183

Summary by CodeRabbit

  • New Features

    • HuggingFace Local provider integration: connect to and configure local HuggingFace inference servers.
    • Test connections, discover and fetch available local models, and select models from the UI.
    • New provider settings UI/form, provider icon and logo, and public API surface for HuggingFace Local.
  • Chores

    • Added persistent storage support, migration, and accessors for HuggingFace Local configuration.

…plish-ai#183)

Add full-stack support for running local HuggingFace models via
Transformers.js with ONNX Runtime. Includes provider types, IPC
handlers, config-builder wiring, DB migration v009, UI form
component, and logo assets.
@orcaman
Contributor

orcaman commented Feb 22, 2026

Snyk checks have passed. No issues have been found so far.

Scanner results — Critical / High / Medium / Low / Total:
  • Open Source Security: 0 / 0 / 0 / 0 — 0 issues
  • Licenses: 0 / 0 / 0 / 0 — 0 issues


@coderabbitai

coderabbitai bot commented Feb 22, 2026

Caution

Review failed

An error occurred during the review process. Please try again later.

📝 Walkthrough

Walkthrough

Adds HuggingFace Local provider support across desktop IPC/preload, web settings UI, core provider implementation (connection, model fetch, validation), storage layer (types, migration, accessors), and OpenCode config builder integration.

Changes

Cohort / File(s) / Summary

  • Desktop IPC & Preload — apps/desktop/src/main/ipc/handlers.ts, apps/desktop/src/preload/index.ts
    New IPC channels and preload APIs: huggingface-local:test-connection, huggingface-local:fetch-models, huggingface-local:get-config, huggingface-local:set-config, wired to core provider functions and storage accessors.
  • Web Settings UI — apps/web/src/client/components/settings/ProviderGrid.tsx, apps/web/src/client/components/settings/ProviderSettingsPanel.tsx, apps/web/src/client/components/settings/providers/HuggingFaceLocalProviderForm.tsx, apps/web/src/client/components/settings/providers/index.ts
    Added provider entry and the HuggingFaceLocalProviderForm component and export; renders connect/disconnect, server URL, model discovery, and model selection flows.
  • Web Assets & Client API — apps/web/src/client/components/ui/ProviderIcon.tsx, apps/web/src/client/lib/provider-logos.ts, apps/web/src/client/lib/accomplish.ts
    Registered provider color/initials and logo; extended AccomplishAPI with four HuggingFace Local methods (test, fetch, getConfig, setConfig).
  • Core Provider Types & Exports — packages/agent-core/src/common.ts, packages/agent-core/src/index.ts, packages/agent-core/src/common/types/provider.ts, packages/agent-core/src/common/types/providerSettings.ts
    Added HuggingFace Local types, credentials, provider metadata, and exported provider functions/constants/types.
  • Core Provider Implementation — packages/agent-core/src/providers/huggingface-local.ts, packages/agent-core/src/providers/validation.ts
    New provider module with connection test, model fetch, config validation, and constants (default URL, timeout, recommended models); validation now covers the provider.
  • Model Display Constants — packages/agent-core/src/common/constants/model-display.ts
    Added huggingface-local/ to provider prefixes for display-name normalization.
  • Storage, Migration & Facade — packages/agent-core/src/storage/repositories/appSettings.ts, packages/agent-core/src/storage/repositories/index.ts, packages/agent-core/src/storage/migrations/index.ts, packages/agent-core/src/storage/migrations/v009-huggingface-local.ts, packages/agent-core/src/types/storage.ts, packages/agent-core/src/factories/storage.ts
    Added DB migration v009 adding the huggingface_local_config column; extended AppSettings with storage accessors (getHuggingFaceLocalConfig / setHuggingFaceLocalConfig), exposed through facades and types.
  • OpenCode Integration — packages/agent-core/src/opencode/config-builder.ts
    Integrated huggingface-local into the OpenCode config builder: registers the connected provider or legacy config models and maps tool support.
  • Localization — apps/web/locales/en/settings.json
    Added localization strings and UI copy for the HuggingFace Local provider and settings.
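
The storage accessors added alongside migration v009 presumably round-trip a JSON blob through the new huggingface_local_config column. The sketch below fakes that column with an in-memory row; the accessor names come from the walkthrough, but the config field layout is an assumption:

```typescript
// Sketch of getHuggingFaceLocalConfig / setHuggingFaceLocalConfig with an
// in-memory stand-in for the huggingface_local_config column; field names
// are assumptions based on the review, not the real schema.
interface HuggingFaceLocalConfig {
  serverUrl: string;
  enabled: boolean;
  models: { id: string; displayName: string; toolSupport?: string }[];
}

// Fake row standing in for the appSettings table.
const row: { huggingface_local_config: string | null } = {
  huggingface_local_config: null,
};

function setHuggingFaceLocalConfig(config: HuggingFaceLocalConfig): void {
  row.huggingface_local_config = JSON.stringify(config);
}

function getHuggingFaceLocalConfig(): HuggingFaceLocalConfig | null {
  return row.huggingface_local_config === null
    ? null
    : (JSON.parse(row.huggingface_local_config) as HuggingFaceLocalConfig);
}
```

Storing the whole config as one JSON column keeps the migration to a single ALTER TABLE and matches how the PR describes the v009 change.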

Sequence Diagram(s)

sequenceDiagram
    participant Renderer as Renderer (Settings UI)
    participant IPC as Main Process (IPC)
    participant HFServer as HuggingFace Local Server
    participant Storage as AppSettings Storage

    Renderer->>IPC: huggingface-local:test-connection(url)
    IPC->>HFServer: GET /v1/models (with timeout)
    alt Success
        HFServer-->>IPC: 200 + models JSON
        IPC-->>Renderer: { success: true, models: [...] }
        Renderer->>IPC: huggingface-local:set-config(config)
        IPC->>Storage: setHuggingFaceLocalConfig(config)
        Storage-->>IPC: saved
        IPC-->>Renderer: void
    else Failure
        HFServer-->>IPC: error / timeout
        IPC-->>Renderer: { success: false, error: "..." }
    end
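
The IPC surface in the diagram above can be mimicked with a tiny dispatcher. In the desktop app these would be ipcMain.handle registrations in Electron, so treat this as a shape sketch with stubbed handler bodies rather than real provider behavior:

```typescript
// Minimal stand-in for the four IPC channels listed in the walkthrough;
// the stub results are placeholders, not the actual handler logic.
type IpcHandler = (...args: unknown[]) => Promise<unknown>;

const handlers = new Map<string, IpcHandler>();

function handle(channel: string, fn: IpcHandler): void {
  handlers.set(channel, fn);
}

async function invoke(channel: string, ...args: unknown[]): Promise<unknown> {
  const fn = handlers.get(channel);
  if (!fn) throw new Error(`No handler registered for ${channel}`);
  return fn(...args);
}

handle('huggingface-local:test-connection', async (url) => ({ success: true, url }));
handle('huggingface-local:fetch-models', async () => []);
handle('huggingface-local:get-config', async () => null);
handle('huggingface-local:set-config', async () => undefined);
```

Keeping all channels under the huggingface-local: prefix mirrors the naming the PR uses and makes the provider's IPC surface easy to grep.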
sequenceDiagram
    participant OpenCode as OpenCode Config Builder
    participant Storage as AppSettings Storage
    participant Core as Provider Core Logic

    OpenCode->>Storage: getHuggingFaceLocalConfig()
    alt Config present
        Storage-->>OpenCode: { serverUrl, enabled, models, ... }
        OpenCode->>Core: map & validate models
        Core-->>OpenCode: provider entry (modelId, toolSupport)
        OpenCode->>OpenCode: add to config.providers
    else No config
        Storage-->>OpenCode: null
        OpenCode->>OpenCode: skip registration
    end

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~30 minutes

Suggested reviewers

  • mcmatan
  • orcaman

"🐰 I hopped through code at break of dawn,
Found models local, ready to spawn,
A server nearby, no cloud in sight,
I nudged the config — now inference takes flight,
Hooray for paws and bytes so bright!"

🚥 Pre-merge checks | ✅ 3 | ❌ 2

❌ Failed checks (1 warning, 1 inconclusive)

  • Docstring Coverage — ⚠️ Warning: docstring coverage is 23.53%, below the required 80.00% threshold. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.
  • Linked Issues check — ❓ Inconclusive: the PR implements most core objectives from issue #183 (provider type, config management, IPC handlers, settings UI), but reviewer comments indicate some implementation details may not fully match the original specifications (e.g., i18n handling, utility function usage). Resolution: verify that remaining code-quality suggestions from the review are intentional design choices or backlog items, particularly regarding i18n in the form component.

✅ Passed checks (3 passed)

  • Description Check — ✅ Passed: check skipped because CodeRabbit's high-level summary is enabled.
  • Title check — ✅ Passed: the title 'feat: add HuggingFace Transformers.js local inference provider' clearly summarizes the main change, identifying the specific feature (HuggingFace Transformers.js) and its purpose (local inference provider).
  • Out of Scope Changes check — ✅ Passed: all changes directly support the HuggingFace Transformers.js local inference integration specified in issue #183. Database migrations, type definitions, provider infrastructure, UI components, and localization are all in scope for adding a new local inference provider.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


Comment @coderabbitai help to get the list of available commands and usage tips.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🧹 Nitpick comments (2)
apps/web/src/client/components/settings/providers/HuggingFaceLocalProviderForm.tsx (2)

90-101: Complex type mapping could be simplified.

The model mapping logic handles two different shapes (ConnectedProvider's availableModels and local HuggingFaceLocalModel[]) with verbose type assertions. Consider normalizing the model shape earlier in handleConnect to avoid runtime type checks here.

The current implementation works but the as { name?: string } and as { toolSupport?: ToolSupportStatus } casts reduce type safety. Since you control both paths, consider ensuring a consistent shape at the source.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@apps/web/src/client/components/settings/providers/HuggingFaceLocalProviderForm.tsx`
around lines 90 - 101, Normalize the model shape in handleConnect so
connectedProvider.availableModels matches the HuggingFaceLocalModel shape and
you can drop ad-hoc casts in HuggingFaceLocalProviderForm.tsx: in handleConnect
map each incoming model to an object with id (strip /^huggingface-local\//),
displayName (prefer displayName, fall back to name or id), size, and toolSupport
(ensure a ToolSupportStatus value) and assign that to
connectedProvider.availableModels; then update the models computation in
HuggingFaceLocalProviderForm (the const models mapping) to assume
HuggingFaceLocalModel fields exist and remove the `as { name?: string }` and `as
{ toolSupport?: ToolSupportStatus }` casts.
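
The normalization this comment asks for could look roughly like the helper below. The input union and the defaults are assumptions; only the HuggingFaceLocalModel field names are taken from the review itself:

```typescript
// Sketch of the suggested normalization: collapse both incoming shapes
// (ConnectedProvider models carrying `name`, raw models carrying
// `displayName`) into one canonical HuggingFaceLocalModel so the form
// needs no `as` casts. Defaults are illustrative assumptions.
type ToolSupportStatus = 'supported' | 'unsupported' | 'unknown';

interface HuggingFaceLocalModel {
  id: string;
  displayName: string;
  size: number;
  toolSupport: ToolSupportStatus;
}

function normalizeModel(raw: {
  id: string;
  name?: string;
  displayName?: string;
  size?: number;
  toolSupport?: ToolSupportStatus;
}): HuggingFaceLocalModel {
  const id = raw.id.replace(/^huggingface-local\//, '');
  return {
    id,
    displayName: raw.displayName ?? raw.name ?? id,
    size: raw.size ?? 0,
    toolSupport: raw.toolSupport ?? 'unknown',
  };
}
```

Applied in handleConnect, this lets the `const models` computation consume a single shape and drop the ad-hoc casts.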

20-25: Consider importing HuggingFaceLocalModel from agent-core instead of duplicating.

This local interface duplicates HuggingFaceLocalModel from packages/agent-core/src/providers/huggingface-local.ts. Consider importing it to maintain a single source of truth.

+import type { HuggingFaceLocalModel } from '@accomplish_ai/agent-core';
 import type {
   ConnectedProvider,
   HuggingFaceLocalCredentials,
   ToolSupportStatus,
 } from '@accomplish_ai/agent-core/common';
-
-interface HuggingFaceLocalModel {
-  id: string;
-  displayName: string;
-  size: number;
-  toolSupport?: ToolSupportStatus;
-}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@apps/web/src/client/components/settings/providers/HuggingFaceLocalProviderForm.tsx`
around lines 20 - 25, Remove the duplicated local interface definition of
HuggingFaceLocalModel and instead import HuggingFaceLocalModel from the
agent-core provider module where it is defined; update this file to import the
shared HuggingFaceLocalModel (and ToolSupportStatus if not already imported) and
use that type for props/variables so there is a single source of truth for the
model shape.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@packages/agent-core/src/opencode/config-builder.ts`:
- Around line 455-501: The HuggingFace Local branches add entries to
providerConfigs but never append 'huggingface-local' to enabledProviders; update
both the connected-provider branch (where hfLocalProvider is used and
providerConfigs.push is called) and the legacy branch (where
getHuggingFaceLocalConfig(), hfLocalModels and providerConfigs.push are used) to
also push 'huggingface-local' into the enabledProviders array (same
place/approach used for Azure Foundry) so enabledProviders stays in sync with
providerConfigs.

In `@packages/agent-core/src/providers/huggingface-local.ts`:
- Around line 88-93: Sanitized URL may contain a trailing slash causing a double
slash when building `${sanitizedUrl}/v1/models`; update the logic around the
fetch call (the code that computes sanitizedUrl used by the fetchWithTimeout
call) to normalize/remove any trailing slash from sanitizedUrl before
concatenation (or join paths safely) so the request URL becomes
`${sanitizedUrl}/v1/models` without a double slash; adjust the code near the
fetchWithTimeout invocation (and any helper that produces sanitizedUrl) to trim
trailing '/' from sanitizedUrl.

---

Nitpick comments:
In
`@apps/web/src/client/components/settings/providers/HuggingFaceLocalProviderForm.tsx`:
- Around line 90-101: Normalize the model shape in handleConnect so
connectedProvider.availableModels matches the HuggingFaceLocalModel shape and
you can drop ad-hoc casts in HuggingFaceLocalProviderForm.tsx: in handleConnect
map each incoming model to an object with id (strip /^huggingface-local\//),
displayName (prefer displayName, fall back to name or id), size, and toolSupport
(ensure a ToolSupportStatus value) and assign that to
connectedProvider.availableModels; then update the models computation in
HuggingFaceLocalProviderForm (the const models mapping) to assume
HuggingFaceLocalModel fields exist and remove the `as { name?: string }` and `as
{ toolSupport?: ToolSupportStatus }` casts.
- Around line 20-25: Remove the duplicated local interface definition of
HuggingFaceLocalModel and instead import HuggingFaceLocalModel from the
agent-core provider module where it is defined; update this file to import the
shared HuggingFaceLocalModel (and ToolSupportStatus if not already imported) and
use that type for props/variables so there is a single source of truth for the
model shape.

Comment on lines +455 to +501
// HuggingFace Local provider
const hfLocalProvider = providerSettings.connectedProviders['huggingface-local'];
if (
hfLocalProvider?.connectionStatus === 'connected' &&
hfLocalProvider.credentials.type === 'huggingface-local' &&
hfLocalProvider.selectedModelId
) {
const modelId = hfLocalProvider.selectedModelId.replace(/^huggingface-local\//, '');
const modelInfo = hfLocalProvider.availableModels?.find(
(m) => m.id === hfLocalProvider.selectedModelId || m.id === modelId,
);
const supportsTools = (modelInfo as { toolSupport?: string })?.toolSupport === 'supported';
providerConfigs.push({
id: 'huggingface-local',
npm: '@ai-sdk/openai-compatible',
name: 'HuggingFace Local',
options: {
baseURL: `${hfLocalProvider.credentials.serverUrl}/v1`,
},
models: {
[modelId]: { name: modelId, tools: supportsTools },
},
});
console.log(
`[OpenCode Config Builder] HuggingFace Local configured: ${modelId} (tools: ${supportsTools})`,
);
} else {
const hfLocalConfig = getHuggingFaceLocalConfig();
const hfLocalModels = hfLocalConfig?.models;
if (hfLocalConfig?.enabled && hfLocalModels && hfLocalModels.length > 0) {
const models: Record<string, ProviderModelConfig> = {};
for (const model of hfLocalModels) {
models[model.id] = { name: model.displayName, tools: model.toolSupport === 'supported' };
}
providerConfigs.push({
id: 'huggingface-local',
npm: '@ai-sdk/openai-compatible',
name: 'HuggingFace Local',
options: { baseURL: `${hfLocalConfig.serverUrl}/v1` },
models,
});
console.log(
'[OpenCode Config Builder] HuggingFace Local (legacy) configured:',
Object.keys(models),
);
}
}

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Check where enabledProviders is consumed
rg -n -C3 "enabledProviders" packages/agent-core/src -g '*.ts'

Repository: accomplish-ai/accomplish

Length of output: 7295


🏁 Script executed:

#!/bin/bash
# Confirm provider ID mapping exists
rg -n -C3 "PROVIDER_ID_TO_OPENCODE" packages/agent-core/src/common -g '*.ts'

Repository: accomplish-ai/accomplish

Length of output: 1681


🏁 Script executed:

#!/bin/bash
# Search for huggingface-local references in the codebase
rg -n -C3 "huggingface-local" packages/agent-core/src -g '*.ts'

Repository: accomplish-ai/accomplish

Length of output: 8387


Ensure HuggingFace Local is added to enabledProviders when configured.

The connected and legacy HuggingFace Local paths push provider configs without explicitly adding 'huggingface-local' to enabledProviders. This creates an inconsistency with how Azure Foundry handles the same scenario (lines 519–521, 539–541), where it explicitly maintains the sync. While the provider ID is correctly mapped in PROVIDER_ID_TO_OPENCODE, the missing enabledProviders update should be added to ensure consistent behavior and prevent potential issues with downstream logic that depends on this field.

Suggested fix
   if (
     hfLocalProvider?.connectionStatus === 'connected' &&
     hfLocalProvider.credentials.type === 'huggingface-local' &&
     hfLocalProvider.selectedModelId
   ) {
     const modelId = hfLocalProvider.selectedModelId.replace(/^huggingface-local\//, '');
     const modelInfo = hfLocalProvider.availableModels?.find(
       (m) => m.id === hfLocalProvider.selectedModelId || m.id === modelId,
     );
     const supportsTools = (modelInfo as { toolSupport?: string })?.toolSupport === 'supported';
     providerConfigs.push({
       id: 'huggingface-local',
       npm: '@ai-sdk/openai-compatible',
       name: 'HuggingFace Local',
       options: {
         baseURL: `${hfLocalProvider.credentials.serverUrl}/v1`,
       },
       models: {
         [modelId]: { name: modelId, tools: supportsTools },
       },
     });
+    if (!enabledProviders.includes('huggingface-local')) {
+      enabledProviders.push('huggingface-local');
+    }
     console.log(
       `[OpenCode Config Builder] HuggingFace Local configured: ${modelId} (tools: ${supportsTools})`,
     );
   } else {
     const hfLocalConfig = getHuggingFaceLocalConfig();
     const hfLocalModels = hfLocalConfig?.models;
     if (hfLocalConfig?.enabled && hfLocalModels && hfLocalModels.length > 0) {
       const models: Record<string, ProviderModelConfig> = {};
       for (const model of hfLocalModels) {
         models[model.id] = { name: model.displayName, tools: model.toolSupport === 'supported' };
       }
       providerConfigs.push({
         id: 'huggingface-local',
         npm: '@ai-sdk/openai-compatible',
         name: 'HuggingFace Local',
         options: { baseURL: `${hfLocalConfig.serverUrl}/v1` },
         models,
       });
+      if (!enabledProviders.includes('huggingface-local')) {
+        enabledProviders.push('huggingface-local');
+      }
       console.log(
         '[OpenCode Config Builder] HuggingFace Local (legacy) configured:',
         Object.keys(models),
       );
     }
   }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/agent-core/src/opencode/config-builder.ts` around lines 455 - 501,
The HuggingFace Local branches add entries to providerConfigs but never append
'huggingface-local' to enabledProviders; update both the connected-provider
branch (where hfLocalProvider is used and providerConfigs.push is called) and
the legacy branch (where getHuggingFaceLocalConfig(), hfLocalModels and
providerConfigs.push are used) to also push 'huggingface-local' into the
enabledProviders array (same place/approach used for Azure Foundry) so
enabledProviders stays in sync with providerConfigs.

Comment on lines +88 to +93
try {
const response = await fetchWithTimeout(
`${sanitizedUrl}/v1/models`,
{ method: 'GET' },
HF_LOCAL_API_TIMEOUT_MS,
);

⚠️ Potential issue | 🟡 Minor

Handle potential trailing slash in URL.

If sanitizedUrl ends with /, the constructed URL becomes http://localhost:8787//v1/models with a double slash. While most servers handle this gracefully, it's cleaner to normalize the URL.

🛡️ Proposed fix
   try {
+    const normalizedUrl = sanitizedUrl.replace(/\/+$/, '');
     const response = await fetchWithTimeout(
-      `${sanitizedUrl}/v1/models`,
+      `${normalizedUrl}/v1/models`,
       { method: 'GET' },
       HF_LOCAL_API_TIMEOUT_MS,
     );
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/agent-core/src/providers/huggingface-local.ts` around lines 88 - 93,
Sanitized URL may contain a trailing slash causing a double slash when building
`${sanitizedUrl}/v1/models`; update the logic around the fetch call (the code
that computes sanitizedUrl used by the fetchWithTimeout call) to
normalize/remove any trailing slash from sanitizedUrl before concatenation (or
join paths safely) so the request URL becomes `${sanitizedUrl}/v1/models`
without a double slash; adjust the code near the fetchWithTimeout invocation
(and any helper that produces sanitizedUrl) to trim trailing '/' from
sanitizedUrl.

@SaaiAravindhRaja
Contributor Author

@coderabbitai review

@coderabbitai

coderabbitai bot commented Feb 22, 2026

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.

@SaaiAravindhRaja
Contributor Author

@coderabbitai, do you have any minor suggestions or nitpicks?

@coderabbitai

coderabbitai bot commented Feb 22, 2026

Tip

For best results, initiate chat on the files or code changes.

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash

# Check for console.log statements that might be debug code
echo "=== Checking for console.log statements ==="
rg -n "console\.(log|debug|info)" \
  packages/agent-core/src/providers/huggingface-local.ts \
  packages/agent-core/src/opencode/config-builder.ts \
  apps/web/src/client/components/settings/providers/HuggingFaceLocalProviderForm.tsx \
  apps/desktop/src/main/ipc/handlers.ts

echo -e "\n=== Reading HuggingFaceLocalProviderForm for UI patterns ==="
cat apps/web/src/client/components/settings/providers/HuggingFaceLocalProviderForm.tsx

echo -e "\n=== Reading huggingface-local.ts for implementation details ==="
cat packages/agent-core/src/providers/huggingface-local.ts

echo -e "\n=== Reading migration file ==="
cat packages/agent-core/src/storage/migrations/v009-huggingface-local.ts

echo -e "\n=== Comparing with Ollama provider form for consistency ==="
cat apps/web/src/client/components/settings/providers/OllamaProviderForm.tsx | head -100

Length of output: 18382


@SaaiAravindhRaja, great work on this feature! Here are some minor suggestions and nitpicks I found:

1. Hardcoded default URL in the form component

File: apps/web/src/client/components/settings/providers/HuggingFaceLocalProviderForm.tsx (Line 40)

The default server URL is hardcoded as 'http://localhost:8787', but you already have a constant HF_LOCAL_DEFAULT_URL exported from @accomplish_ai/agent-core. Consider importing and using it:

import { HF_LOCAL_DEFAULT_URL } from '@accomplish_ai/agent-core';

// Then use it:
const [serverUrl, setServerUrl] = useState(HF_LOCAL_DEFAULT_URL);

Also on line 173 where it's used as a fallback.

2. Missing internationalization (i18n)

File: apps/web/src/client/components/settings/providers/HuggingFaceLocalProviderForm.tsx

The OllamaProviderForm uses useTranslation for all user-facing strings, but the HuggingFace form has hardcoded English strings. For consistency with the rest of the codebase, consider adding translation keys for:

  • "HuggingFace Local Server URL"
  • "Run a local HuggingFace Transformers.js inference server with ONNX Runtime"
  • "Please select a model"
  • "Select a model..."
  • The info banner text (lines 193-200)
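
The missing locale entries might land in apps/web/locales/en/settings.json roughly like this; the key names and nesting are illustrative, and only the English strings come from the list above:

```json
{
  "providers": {
    "huggingfaceLocal": {
      "serverUrlLabel": "HuggingFace Local Server URL",
      "description": "Run a local HuggingFace Transformers.js inference server with ONNX Runtime",
      "selectModelError": "Please select a model",
      "selectModelPlaceholder": "Select a model..."
    }
  }
}
```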

3. Simplify model display name extraction

File: packages/agent-core/src/providers/huggingface-local.ts (Line 103)

The current implementation uses a simple split:

displayName: m.id.split('/').pop() || m.id,

Consider using the existing getModelDisplayName utility from model-display.ts for consistency with how other providers format model names, since you already added the 'huggingface-local/' prefix to PROVIDER_PREFIXES.

4. Simplify model mapping logic

File: apps/web/src/client/components/settings/providers/HuggingFaceLocalProviderForm.tsx (Lines 95-105)

The model mapping has some complex type casting. Consider simplifying by normalizing the types earlier or using a type guard function:

const models: HuggingFaceLocalModel[] = (
  connectedProvider?.availableModels || availableModels
).map((m) => ({
  id: m.id.replace(/^huggingface-local\//, ''),
  displayName: m.name || m.displayName || m.id,
  size: 0,
  toolSupport: m.toolSupport || 'unknown',
}));

5. Console.log statements

File: packages/agent-core/src/opencode/config-builder.ts (Lines 478, 496)

There are console.log statements for HuggingFace Local configuration. While these match the pattern used for other providers, consider whether they should remain or if you want to use a proper logging utility for production.

6. Missing JSDoc for public function

File: packages/agent-core/src/providers/huggingface-local.ts (Line 136)

The fetchHuggingFaceLocalModels function is missing JSDoc documentation, while testHuggingFaceLocalConnection has good documentation. Consider adding similar documentation for consistency.


These are all minor polish items — the core implementation looks solid! The functionality follows the established patterns from Ollama and LM Studio providers nicely. 🎉

…SDoc

- Replace hardcoded 'http://localhost:8787' with imported HF_LOCAL_DEFAULT_URL
- Simplify model display name extraction in provider form
- Add comprehensive JSDoc to fetchHuggingFaceLocalModels
@SaaiAravindhRaja
Contributor Author

@coderabbitai, does everything look good now? I have worked on the minor suggestions too!

@coderabbitai

coderabbitai bot commented Feb 22, 2026


🧩 Analysis chain

🏁 Script executed:

#!/bin/bash

echo "=== Checking HuggingFaceLocalProviderForm.tsx for default URL import and i18n ==="
cat apps/web/src/client/components/settings/providers/HuggingFaceLocalProviderForm.tsx

echo -e "\n=== Checking huggingface-local.ts for getModelDisplayName usage and JSDoc ==="
cat packages/agent-core/src/providers/huggingface-local.ts

echo -e "\n=== Checking config-builder.ts for console.log statements ==="
cat packages/agent-core/src/opencode/config-builder.ts | grep -A2 -B2 "console.log"

Length of output: 15739


@SaaiAravindhRaja, great progress! 🎉 I can see you've addressed several of the suggestions:

✅ What's been fixed:

  1. Default URL constant - Now properly imports and uses HF_LOCAL_DEFAULT_URL from agent-core in both places (Line 9, 40, 173)
  2. Model mapping logic - Much cleaner now with proper type guards ('name' in m, 'displayName' in m)
  3. JSDoc documentation - Added comprehensive documentation to fetchHuggingFaceLocalModels (Lines 135-143)

📝 Minor items still present:

  1. Internationalization (i18n) - The form still has hardcoded English strings (e.g., "HuggingFace Local Server URL", error messages, etc.), while OllamaProviderForm uses useTranslation. If i18n support is planned for a future PR, that's totally fine!

  2. Model display name - Line 103 in huggingface-local.ts still uses m.id.split('/').pop() instead of the getModelDisplayName utility that you've already configured to handle the 'huggingface-local/' prefix. Though the current approach works fine!

  3. Console.log statements - Still present in config-builder.ts (Lines 487-488, 504-507), but I notice this matches the pattern used for all other providers, so it's likely intentional for debugging.

If items 1-3 are intentional choices or planned for future work, everything looks good to merge! The core functionality is solid and follows the established patterns well. Let me know if you'd like me to take another look after any additional changes, or if you're ready to proceed as-is. 👍
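
As a side note, the type-guard mapping praised in item 2 of the earlier review can be sketched like this. The `RawModel` union is an assumption for illustration; the real provider types live in agent-core:

```typescript
// Illustrative union: upstream model entries may expose `displayName`, `name`,
// or only an `id` (this shape is an assumption, not the PR's actual types).
type RawModel =
  | { id: string; displayName: string }
  | { id: string; name: string }
  | { id: string };

// `in`-operator type guards let TypeScript narrow the union without casts.
export function toDisplayName(m: RawModel): string {
  if ('displayName' in m) return m.displayName;
  if ('name' in m) return m.name;
  return m.id;
}
```

This is why checks like `'name' in m` read cleanly in the provider form: the compiler proves each branch only touches fields that exist on that union member.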

…vider

Replace hardcoded English strings with useTranslation keys matching
the pattern used by all other provider forms. Use the shared
getModelDisplayName utility for consistent model name formatting.
@SaaiAravindhRaja
Contributor Author

@coderabbitai, does everything look good now? I have worked on the minor suggestions too!

@coderabbitai

coderabbitai bot commented Feb 22, 2026


🧩 Analysis chain

🏁 Script executed:

#!/bin/bash

echo "=== Checking HuggingFaceLocalProviderForm.tsx for i18n and default URL usage ==="
cat apps/web/src/client/components/settings/providers/HuggingFaceLocalProviderForm.tsx

echo -e "\n=== Checking huggingface-local.ts for getModelDisplayName usage ==="
cat packages/agent-core/src/providers/huggingface-local.ts

echo -e "\n=== Checking if getModelDisplayName is imported in huggingface-local.ts ==="
rg -n "getModelDisplayName" packages/agent-core/src/providers/huggingface-local.ts

echo -e "\n=== Checking config-builder.ts for console.log statements ==="
rg -n "console\.(log|debug|info)" packages/agent-core/src/opencode/config-builder.ts | grep -i hugging

Length of output: 12217


@SaaiAravindhRaja, excellent work! 🎉 Everything looks great now!

I can confirm that you've addressed all the remaining suggestions:

✅ Recently Fixed:

  1. Internationalization (i18n) - Perfect! The form now properly uses useTranslation with translation keys for all user-facing strings:

    • t('huggingfaceLocal.serverUrl')
    • t('huggingfaceLocal.serverHint')
    • t('huggingfaceLocal.selectModel')
    • t('huggingfaceLocal.selectModelPlaceholder')
    • t('huggingfaceLocal.infoBannerTitle')
    • t('huggingfaceLocal.infoBannerDescription')

    This matches the pattern used in other provider forms like Ollama! 👍

  2. Model display name - Excellent! Now using getModelDisplayName(m.id) utility function (line 105 in huggingface-local.ts) instead of the manual split approach. This ensures consistency with how other providers format model names.

✅ Previously Fixed (still good):

  • Default URL constant imported and used correctly
  • Clean model mapping logic with type guards
  • Comprehensive JSDoc documentation
  • Core functionality following established patterns

The implementation is solid, consistent with the rest of the codebase, and ready to merge! Great job on the thorough implementation and addressing all the feedback. 🚀
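
For readers following along, the behavior the review attributes to getModelDisplayName can be approximated as follows. This is a sketch under assumptions: the actual PROVIDER_PREFIXES list and edge-case handling live in agent-core and may differ:

```typescript
// Assumed prefix list; the real PROVIDER_PREFIXES constant in agent-core
// may contain different entries.
const PROVIDER_PREFIXES = ['huggingface-local/', 'ollama/', 'lmstudio/'];

export function getModelDisplayName(modelId: string): string {
  // Strip a provider prefix if present, e.g.
  // 'huggingface-local/Xenova/gpt2' -> 'Xenova/gpt2'.
  let id = modelId;
  for (const prefix of PROVIDER_PREFIXES) {
    if (id.startsWith(prefix)) {
      id = id.slice(prefix.length);
      break;
    }
  }
  // Keep only the last path segment: 'Xenova/gpt2' -> 'gpt2'.
  return id.split('/').pop() ?? id;
}
```

Centralizing this in one utility, rather than repeating `m.id.split('/').pop()` per provider, is what keeps model names formatted consistently across Ollama, LM Studio, and the new provider.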

@SaaiAravindhRaja
Contributor Author

@coderabbitai Thanks so much for confirming!

@coderabbitai

coderabbitai bot commented Feb 22, 2026


@SaaiAravindhRaja, you're very welcome! 😊

It was a pleasure reviewing this feature. You did an excellent job implementing the HuggingFace Transformers.js local inference provider, and the way you addressed all the feedback shows great attention to detail. The integration follows the established patterns in the codebase perfectly and adds a valuable capability for users who want to run models locally without cloud APIs.

Best of luck with the merge, and feel free to tag me if you need any future reviews or have questions! 🚀


@coderabbitai coderabbitai bot left a comment


🧹 Nitpick comments (2)
apps/web/src/client/components/settings/providers/HuggingFaceLocalProviderForm.tsx (2)

22-27: Import HuggingFaceLocalModel from agent-core instead of redefining locally.

This interface duplicates the one already exported from @accomplish_ai/agent-core (at packages/agent-core/src/providers/huggingface-local.ts), which also includes a quantization?: string field that's missing here. Using the shared type avoids drift and ensures consistency.

Suggested fix
 import type {
   ConnectedProvider,
   HuggingFaceLocalCredentials,
   ToolSupportStatus,
 } from '@accomplish_ai/agent-core/common';
-import { HF_LOCAL_DEFAULT_URL } from '@accomplish_ai/agent-core';
+import { HF_LOCAL_DEFAULT_URL, type HuggingFaceLocalModel } from '@accomplish_ai/agent-core';
 import {
   ConnectButton,
   ...
 } from '../shared';

 import huggingfaceLogo from '/assets/ai-logos/huggingface.png';

-interface HuggingFaceLocalModel {
-  id: string;
-  displayName: string;
-  size: number;
-  toolSupport?: ToolSupportStatus;
-}
-
 interface HuggingFaceLocalProviderFormProps {
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@apps/web/src/client/components/settings/providers/HuggingFaceLocalProviderForm.tsx`
around lines 22 - 27, The local HuggingFaceLocalModel interface duplicates the
shared type and omits the quantization field; remove the local definition in
HuggingFaceLocalProviderForm.tsx and instead import HuggingFaceLocalModel from
`@accomplish_ai/agent-core`, updating any usages in the file to rely on the
imported type (e.g., props, state, and functions that reference
HuggingFaceLocalModel) so the shared quantization?: string and other fields are
preserved and type drift is prevented.

117-117: Use i18n for provider name consistency.

The provider name is hardcoded as "HuggingFace Local" while other strings in this component use useTranslation. The localization key providers.huggingface-local already exists in settings.json.

Suggested fix
-      <ProviderFormHeader logoSrc={huggingfaceLogo} providerName="HuggingFace Local" />
+      <ProviderFormHeader logoSrc={huggingfaceLogo} providerName={t('providers.huggingface-local')} />
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@apps/web/src/client/components/settings/providers/HuggingFaceLocalProviderForm.tsx`
at line 117, Replace the hardcoded provider name passed to ProviderFormHeader
with the i18n key: call the existing useTranslation hook's t function and pass
t('providers.huggingface-local') as the providerName prop to ProviderFormHeader
(ensure the component uses the t function already imported/initialized in this
file and reference ProviderFormHeader and the localization key
providers.huggingface-local).

Import HuggingFaceLocalModel from agent-core instead of redefining
locally. Use translation key for provider name in form header.
@SaaiAravindhRaja
Contributor Author

@coderabbitai I've worked on it now! Thank you :)

@coderabbitai

coderabbitai bot commented Feb 22, 2026


@SaaiAravindhRaja, you're very welcome! 😊 Great work on this feature — the HuggingFace Local integration looks fantastic. Wishing you a smooth merge! 🚀

@mavishay
Collaborator

mavishay added a commit that referenced this pull request Mar 12, 2026
Adds a HuggingFace Local inference provider enabling fully offline LLM
inference via @huggingface/transformers, without relying on external APIs.

The provider runs an OpenAI-compatible HTTP server inside the Electron main
process and integrates with the existing provider settings.

Key changes:
- New huggingface-local provider with model manager and inference server
- HuggingFaceProviderForm UI component in settings
- v009 migration adding huggingface_local_config column
- IPC handlers for model download/management
- Provider icon and logo assets

Credit: GunaPalanivel (#650), nancysangani (#488), SaaiAravindhRaja (#604)
Closes #183
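
To make "OpenAI-compatible HTTP server" concrete: the non-streaming payload such a server returns from POST /v1/chat/completions looks roughly like this. Field names follow the OpenAI chat-completion wire format; the helper itself and the id scheme are illustrative, not the consolidated PR's implementation:

```typescript
// Builds the response envelope an OpenAI-compatible local server would send
// back; a sketch of the wire format, not the actual server code.
export function buildChatCompletion(model: string, text: string) {
  return {
    id: `chatcmpl-${Date.now().toString(36)}`,
    object: 'chat.completion' as const,
    created: Math.floor(Date.now() / 1000),
    model,
    choices: [
      {
        index: 0,
        message: { role: 'assistant' as const, content: text },
        finish_reason: 'stop' as const,
      },
    ],
  };
}
```

Matching this shape is what lets the existing provider plumbing treat the local server like any other OpenAI-style endpoint.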
@mavishay
Collaborator

Thank you for your contribution! 🙏 This feature has been consolidated into PR #705 which includes work from multiple contributors. Your implementation was a key reference for the final solution. Closing in favor of the consolidated PR.


mavishay added a commit that referenced this pull request Mar 12, 2026
- From #604 (SaaiAravindhRaja): i18n locale strings for HuggingFace Local
  settings panel (provider name, labels, server URL hint, model selection),
  'huggingface-local/' prefix in PROVIDER_PREFIXES, testHuggingFaceLocalConnection(),
  fetchHuggingFaceLocalModels(), and curated HF_RECOMMENDED_MODELS list
- From #488 (nancysangani): HuggingFaceHubModel type, searchHuggingFaceHubModels()
  for discovering ONNX-compatible models via HuggingFace Hub API,
  and additional Xenova mirror models in HF_RECOMMENDED_MODELS
@mavishay mavishay closed this Mar 17, 2026


Development

Successfully merging this pull request may close these issues.

[Feature] HuggingFace Transformers.js integration for local model inference
