I am hitting a problem with JSON columns in my Azure Function. My MS SQL server rejects the generated UPSERT command because it contains `JSON(MAX)`, which does not appear to be valid syntax: the native json type takes no width specifier (https://learn.microsoft.com/en-us/sql/t-sql/data-types/json-data-type?view=azuresqldb-mi-current).
It feels odd that I couldn't find any existing report of this problem, so I may just be doing something wrong. Apologies in advance if the mistake is mine.
Azure Functions Core Tools
Core Tools Version: 4.8.0+ec58eb7110992ea02aa19e9c060e45ac8882cc02 (64-bit)
Function Runtime Version: 4.1046.100.25610
host.json
{
"version": "2.0",
"logging": {
"logLevel": {
"default": "Warning",
"Function": "Debug"
},
"applicationInsights": {
"samplingSettings": {
"isEnabled": true,
"excludedTypes": "Request"
}
}
},
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[4.32, 5.0.0)"
}
}
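If the `json(max)` emission was introduced by a recent bundle release, one experiment worth trying (purely a guess on my part; the exact version range that avoids it is an assumption, not a confirmed fix) is pinning the extension bundle below the current floor and re-running the upsert:

```json
{
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[4.0.0, 4.32.0)"
  }
}
```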
Database schema
IF OBJECT_ID('Reviews_Process_v2', 'U') IS NULL
BEGIN
CREATE TABLE Reviews_Process_v2 (
id varchar(150) PRIMARY KEY,
status nvarchar(50),
batch_id varchar(100),
positive_aspects nvarchar(max),
negative_aspects nvarchar(max),
user_language nvarchar(50),
suggested_answer_cz nvarchar(max),
suggested_answer_en nvarchar(max),
request JSON,
response JSON
);
-- Index on status
CREATE NONCLUSTERED INDEX IX_Reviews_Process_v2_Status
ON Reviews_Process_v2 (status);
-- Index on batch_id
CREATE NONCLUSTERED INDEX IX_Reviews_Process_v2_BatchId
ON Reviews_Process_v2 (batch_id);
END
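Until the binding generates valid syntax for the native json type, a possible workaround (an assumption on my side, not a confirmed fix) is to declare the two columns as `nvarchar(max)` with an `ISJSON` check constraint, so the generated MERGE never has to name the json type at all:

```sql
-- Workaround sketch: store the payloads as nvarchar(max) and validate them
-- with ISJSON, instead of using the native json type the binding mishandles.
ALTER TABLE Reviews_Process_v2 ALTER COLUMN request nvarchar(max);
ALTER TABLE Reviews_Process_v2 ALTER COLUMN response nvarchar(max);

ALTER TABLE Reviews_Process_v2
    ADD CONSTRAINT CK_Reviews_Process_v2_request_json
    CHECK (request IS NULL OR ISJSON(request) = 1);
ALTER TABLE Reviews_Process_v2
    ADD CONSTRAINT CK_Reviews_Process_v2_response_json
    CHECK (response IS NULL OR ISJSON(response) = 1);
```

The constraint names are ones I made up for the sketch; the trade-off is losing the native json type's storage format and functions, but keeping the data queryable via `OPENJSON`/`JSON_VALUE`.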
Error text
[2026-03-31T15:57:17.816Z] Exception executing query. Message=Column, parameter, or variable #9: Cannot specify a column width on data type json.
[2026-03-31T15:57:17.817Z] Column, parameter, or variable #10: Cannot specify a column width on data type json.
[2026-03-31T15:57:17.817Z] Parameter or variable 'json' has an invalid data type.
[2026-03-31T15:57:17.817Z] Parameter or variable 'json' has an invalid data type.
[2026-03-31T15:57:17.817Z] Query=WITH cte AS ( SELECT * FROM OPENJSON(REPLACE(@rowData, N'/', N'\/')) WITH ([id] varchar(150),[source] varchar(50),[status] nvarchar(50),[batch_id] varchar(100),[positive_aspects] nvarchar(max),[negative_aspects] nvarchar(max),[user_language] nvarchar(50),[suggested_answer_cz] nvarchar(max),[suggested_answer_en] nvarchar(max),[request] json(max),[response] json(max)) )
[2026-03-31T15:57:17.817Z] MERGE INTO [dbo].[Reviews_Process_v2] WITH (HOLDLOCK)
[2026-03-31T15:57:17.817Z] AS ExistingData
[2026-03-31T15:57:17.817Z] USING cte
[2026-03-31T15:57:17.817Z] AS NewData
[2026-03-31T15:57:17.817Z] ON
[2026-03-31T15:57:17.817Z] ExistingData.[id] = NewData.[id]
[2026-03-31T15:57:17.817Z] WHEN MATCHED THEN
[2026-03-31T15:57:17.817Z] UPDATE SET ExistingData.[id] = NewData.[id], ExistingData.[source] = NewData.[source], ExistingData.[status] = NewData.[status], ExistingData.[batch_id] = NewData.[batch_id], ExistingData.[positive_aspects] = NewData.[positive_aspects], ExistingData.[negative_aspects] = NewData.[negative_aspects], ExistingData.[user_language] = NewData.[user_language], ExistingData.[suggested_answer_cz] = NewData.[suggested_answer_cz], ExistingData.[suggested_answer_en] = NewData.[suggested_answer_en], ExistingData.[request] = NewData.[request], ExistingData.[response] = NewData.[response]
[2026-03-31T15:57:17.817Z] WHEN NOT MATCHED THEN
[2026-03-31T15:57:17.817Z] INSERT ([id],[source],[status],[batch_id],[positive_aspects],[negative_aspects],[user_language],[suggested_answer_cz],[suggested_answer_en],[request],[response]) VALUES ([id],[source],[status],[batch_id],[positive_aspects],[negative_aspects],[user_language],[suggested_answer_cz],[suggested_answer_en],[request],[response]);
[2026-03-31T15:57:17.911Z] Executed 'Functions.enqueue_reviews_batch_http' (Failed, Id=5162ac02-6da9-484c-986e-fc5f29c6787a, Duration=10367ms)
[2026-03-31T15:57:17.912Z] System.Private.CoreLib: Exception while executing function: Functions.enqueue_reviews_batch_http. Microsoft.Azure.WebJobs.Host: Error while handling parameter _binder after function returned:. Microsoft.Azure.WebJobs.Extensions.Sql: Unexpected error upserting rows. Core Microsoft SqlClient Data Provider: Column, parameter, or variable #9: Cannot specify a column width on data type json.
[2026-03-31T15:57:17.912Z] Column, parameter, or variable #10: Cannot specify a column width on data type json.
[2026-03-31T15:57:17.912Z] Parameter or variable 'json' has an invalid data type.
[2026-03-31T15:57:17.912Z] Parameter or variable 'json' has an invalid data type.
Azure Function definition
SQL_CONNECTION_STRING = "SqlConnectionString"
ENQUEUE_SQL_INPUT_ARGNAME = "inputDocuments"
# Larger batches are not well supported because the connection to the SQL database becomes
# unstable when transferring large amounts of data. Not sure what the root cause is, but
# limiting to 500 reviews per batch seems to work reliably.
ENQUEUE_SQL_INPUT_COMMAND = """
SELECT TOP (20)
input.ReviewID AS id,
input.ReviewSource AS source,
output.status,
input.Author AS author,
input.ReviewerComment AS comment,
output.positive_aspects,
output.negative_aspects,
output.user_language,
output.suggested_answer_cz,
output.suggested_answer_en,
output.batch_id,
output.request,
output.response
FROM [AI_Reviews].[dbo].[OverALL] input
LEFT JOIN [AI_Reviews].[dbo].[Reviews_Process_v2] output
ON input.ReviewID = output.id
WHERE (output.status IS NULL OR output.status = 'to_retry')
ORDER BY input.ReviewDate DESC;
"""
ENQUEUE_SQL_INPUT_TYPE = "Text"
ENQUEUE_SQL_OUTPUT_BATCHES_ARGNAME = "llmBatches"
# Note that the command is the name of the table. Specifying the database will not have any effect.
# Instead, the connection string must specify the database to connect to. If the table is not in the
# configured database, the function will fail to write to the table.
ENQUEUE_SQL_OUTPUT_BATCHES_COMMAND = "[dbo].[Batches_v2]"
ENQUEUE_SQL_OUTPUT_REVIEWS_ARGNAME = "reviewsEnqueued"
ENQUEUE_SQL_OUTPUT_REVIEWS_COMMAND = "[dbo].[Reviews_Process_v2]"
@app.function_name(name="enqueue_reviews_batch_timer")
@app.timer_trigger(arg_name="timer", schedule="0 0 * * * *", use_monitor=True)
@app.sql_input(
arg_name=ENQUEUE_SQL_INPUT_ARGNAME,
command_text=ENQUEUE_SQL_INPUT_COMMAND,
command_type=ENQUEUE_SQL_INPUT_TYPE,
connection_string_setting=SQL_CONNECTION_STRING,
)
@app.sql_output(
arg_name=ENQUEUE_SQL_OUTPUT_BATCHES_ARGNAME,
command_text=ENQUEUE_SQL_OUTPUT_BATCHES_COMMAND,
connection_string_setting=SQL_CONNECTION_STRING,
)
@app.sql_output(
arg_name=ENQUEUE_SQL_OUTPUT_REVIEWS_ARGNAME,
command_text=ENQUEUE_SQL_OUTPUT_REVIEWS_COMMAND,
connection_string_setting=SQL_CONNECTION_STRING,
)
def enqueue_reviews_batch_timer_func(
timer: func.TimerRequest,
llmBatches: func.Out[func.SqlRow],
inputDocuments: func.SqlRowList,
reviewsEnqueued: func.Out[func.SqlRow],
) -> None:
enqueue_reviews_batch(llmBatches, inputDocuments, reviewsEnqueued)
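If I switch the columns to `nvarchar(max)` as a workaround, the function would need to hand the binding JSON text rather than dicts. A minimal sketch of the kind of helper I have in mind (`serialize_json_fields` is a hypothetical name, not part of my actual `enqueue_reviews_batch`):

```python
import json

def serialize_json_fields(row: dict, fields=("request", "response")) -> dict:
    """Return a copy of `row` with the given fields dumped to JSON strings.

    The SQL output binding upserts whatever string it is given, so if the
    request/response columns are declared nvarchar(max) instead of json,
    serializing the payloads up front keeps the generated MERGE valid.
    """
    out = dict(row)
    for field in fields:
        value = out.get(field)
        # Leave missing values and already-serialized strings untouched.
        if value is not None and not isinstance(value, str):
            out[field] = json.dumps(value)
    return out
```

For example, `serialize_json_fields({"id": "r1", "request": {"model": "m"}, "response": None})` would leave `id` and the `None` response alone and serialize only the `request` dict.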