Conversation
Review the following changes in direct dependencies.
Warning: Review the following alerts detected in dependencies. According to your organization's Security Policy, it is recommended to resolve "Warn" alerts.
Size Change: -352 B (-0.01%) Total Size: 4.2 MB
🚀 Preview will be available at https://docs-ia.beta.numerique.gouv.fr/ You must create an account with yopmail. Once this Pull Request is merged, the preview will be destroyed.
Force-pushed from 7d5fed3 to 85ae764
Force-pushed from a899b5d to b617394
Force-pushed from 352dbd0 to d53583f
Force-pushed from d53583f to f161c8f
Add an AI proxy to handle AI-related requests to the AI service.
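For reference, a minimal sketch of what such a proxy endpoint could look like, assuming a plain Django view plus httpx and hypothetical `AI_BASE_URL` / `AI_API_KEY` settings (names and URL path are illustrative, not the actual ones used in this PR):

```python
# Sketch of an AI proxy view (hypothetical setting names and path).
import json

import httpx
from django.conf import settings
from django.http import HttpRequest, JsonResponse
from django.views.decorators.http import require_POST


@require_POST
def ai_proxy(request: HttpRequest) -> JsonResponse:
    """Forward an AI request from the frontend to the configured AI service."""
    payload = json.loads(request.body)

    # The API key stays server-side; the browser never sees it.
    response = httpx.post(
        f"{settings.AI_BASE_URL}/chat/completions",
        json=payload,
        headers={"Authorization": f"Bearer {settings.AI_API_KEY}"},
        timeout=30,
    )
    return JsonResponse(response.json(), status=response.status_code)
```

The point of the proxy is that the frontend only ever talks to the Docs backend, which injects credentials and can adapt payloads before they reach the AI service.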
Force-pushed from a284cc1 to 34090ea
Force-pushed from 34090ea to d44f9e9
We make the AI bot configurable with settings. We will be able to have a different AI bot name per instance.
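A possible shape for this, assuming a setting read from the environment (the names `AI_BOT` / `AI_BOT_NAME` are illustrative):

```python
# settings.py (sketch): the bot identity comes from the environment,
# so each instance can advertise a different AI bot name.
import os

AI_BOT = {"name": os.environ.get("AI_BOT_NAME", "Docs AI")}
```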
We want to handle both streaming and non-streaming modes when interacting with the AI backend service.
Standards can vary depending on the AI service used. To work with the Albert API (sketched below):
- a description field is required in the payload for every tool call.
- if stream is set to false, stream_options must be omitted from the payload.
- the response from Albert sometimes did not respect the format expected by Blocknote, so we added a system prompt to enforce it.
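A sketch of the kind of payload adaptation this implies, using a hypothetical helper (the field layout assumes an OpenAI-style chat completion payload):

```python
# Hypothetical helper adapting an outgoing payload to Albert's expectations.
def adapt_payload_for_albert(payload: dict) -> dict:
    # Albert requires a description field on every tool definition.
    for tool in payload.get("tools", []):
        function = tool.setdefault("function", {})
        function.setdefault("description", function.get("name", "tool"))

    # When stream is false, stream_options must not be sent at all.
    if not payload.get("stream"):
        payload.pop("stream_options", None)

    return payload
```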
We integrate the new Blocknote AI feature into Docs, enhancing the document editing experience with AI capabilities.
The AI feature is under the AGPL license, so it is removed when the project is published under the MIT license. NEXT_PUBLIC_PUBLISH_AS_MIT manages this.
Bind the ai_proxy ability to the AI feature. If ai_proxy is false, the AI feature will not be available.
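A rough sketch of the idea on the backend side, assuming abilities are returned as a dict (`get_abilities` and the surrounding names are illustrative; only `ai_proxy` comes from this PR):

```python
# Sketch: expose an "ai_proxy" ability so the frontend can decide
# whether to show the AI feature at all.
from django.conf import settings


def get_abilities(user, document) -> dict:
    """Hypothetical abilities computation for a document."""
    can_edit = document.has_edit_access(user)  # hypothetical helper
    return {
        "update": can_edit,
        # The AI feature is only offered when the proxy is enabled and the user can edit.
        "ai_proxy": bool(getattr(settings, "AI_FEATURE_ENABLED", False)) and can_edit,
    }
```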
Notify screen readers about AI thinking, writing, ready, or error states.
This is a naive first switch from sync to async. It enables the backend to keep answering incoming requests while streaming LLM results to the user. There is certainly room for code cleanup and improvements, but this provides a nice improvement out of the box.
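Concretely, the switch means the view hands Django an async iterator instead of a fully buffered body, so the worker can serve other requests while chunks are awaited. A minimal sketch (the LLM call is a placeholder):

```python
# Sketch of an async streaming view: Django can keep serving other requests
# while this generator awaits chunks from the LLM. Requires an ASGI deployment.
from collections.abc import AsyncIterator

from django.http import HttpRequest, StreamingHttpResponse


async def call_llm_stream(prompt: str) -> AsyncIterator[str]:
    """Placeholder for the real async LLM client call."""
    for part in ("Thinking about: ", prompt, " ... done."):
        yield part


async def ai_answer(prompt: str) -> AsyncIterator[bytes]:
    async for chunk in call_llm_stream(prompt):
        yield chunk.encode()


async def ai_stream_view(request: HttpRequest) -> StreamingHttpResponse:
    prompt = request.GET.get("prompt", "")
    return StreamingHttpResponse(
        ai_answer(prompt),
        content_type="text/event-stream",
    )
```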
The frontend application uses the Vercel AI SDK and its data stream protocol. We decided to use the Pydantic AI library for its Vercel AI adapter: it handles payload validation, uses an AsyncIterator, and deals with the Vercel specification.
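For context, the basic streaming loop with Pydantic AI looks roughly like this (model identifier and prompts are placeholders; the actual integration goes through the Vercel AI adapter on top of this):

```python
# Rough sketch of streaming a model answer with Pydantic AI.
import asyncio

from pydantic_ai import Agent

agent = Agent(
    "openai:gpt-4o",  # placeholder model identifier
    system_prompt="You are the Docs AI assistant.",  # illustrative prompt
)


async def main() -> None:
    async with agent.run_stream("Summarise this document.") as result:
        # Stream text deltas as they are produced by the model.
        async for delta in result.stream_text(delta=True):
            print(delta, end="", flush=True)


if __name__ == "__main__":
    asyncio.run(main())
```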
When the applyDocumentOperations tool is used, we have to force the usage of a system prompt in order to make the model use it correctly without inventing different actions. The Pydantic AI Agent class can take a system prompt, but it is ignored when a UI adapter such as the VercelAIAdapter is used.
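One way to force it anyway, assuming the incoming conversation is available as Pydantic AI message objects, is to prepend an explicit system prompt message to the history. This is a sketch of the idea, not necessarily the exact workaround used in this PR:

```python
# Sketch: inject the system prompt as the first message of the history,
# so it applies even when the Agent-level system prompt is ignored.
from pydantic_ai.messages import ModelMessage, ModelRequest, SystemPromptPart

FORCED_SYSTEM_PROMPT = (  # illustrative wording
    "When calling applyDocumentOperations, only emit the documented operations."
)


def with_forced_system_prompt(history: list[ModelMessage]) -> list[ModelMessage]:
    """Prepend a system prompt request to the conversation history."""
    system_message = ModelRequest(parts=[SystemPromptPart(content=FORCED_SYSTEM_PROMPT)])
    return [system_message, *history]
```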
We no longer need the AI_STREAM setting; we use streaming all the time.
Force-pushed from d44f9e9 to 7b20b47
Purpose
New AI feature powered by Blocknote. 🚀
Proposal
New Settings:
AI_STREAM is a setting because not all models support the stream mode.
Careful: You can still use the previous AI feature that is under the MIT license.
Demo
Enregistrement.2026-02-03.113122.mp4