feat: store llm model as env var #16

Open
fredj wants to merge 2 commits into main from llm_model

Conversation

@fredj (Contributor) commented Mar 4, 2026

No description provided.

@fredj fredj force-pushed the llm_model branch 2 times, most recently from db62b9b to 4c8fc6e Compare March 9, 2026 07:23
@fredj fredj marked this pull request as ready for review March 18, 2026 09:50
@zurfjereluhmie (Member) left a comment

Some general suggestions, plus one required change to conftest.py; otherwise the LLM-based tests fail.

model="gpt-4o",
temperature=0,
api_key=os.getenv("OPENAI_API_KEY")
api_key="sk-..."
Member

Suggested change
api_key="sk-..."
api_key=os.getenv("LLM_API_KEY")

if not os.getenv("LLM_API_KEY"):
pytest.skip("LLM_API_KEY not set")
set_llm_cache(InMemoryCache())
llm = init_chat_model(model="gpt-4o", model_provider="openai", temperature=0)
Member

Since this PR introduces independence from the LLM provider, shouldn't this also apply to the tests?
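The conftest.py guard quoted above (skip when no key, then initialize the model) can be made provider-agnostic without touching any LLM library. A minimal sketch, standard library only; `require_env` is a hypothetical helper, not part of the PR:

```python
import os
from typing import Optional


def require_env(name: str) -> Optional[str]:
    """Return the value of an environment variable, or None when unset or empty."""
    value = os.getenv(name)
    return value if value else None


# Mirrors the conftest.py guard: LLM-based tests only run when a key is present.
if require_env("LLM_API_KEY") is None:
    print("LLM_API_KEY not set: LLM tests would be skipped")
```

Treating an empty string the same as an unset variable avoids tests "passing" the guard with `LLM_API_KEY=""` in a stale .env file.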

return

# Initialize LLM
print("🔄 Initializing etter...")
Member

I think it was nice to catch undefined env variables early. Plus we get correct typing this way.

Suggested change
print("🔄 Initializing etter...")
print("🔄 Initializing etter...")
LLM_API_KEY = os.getenv("LLM_API_KEY")
if not LLM_API_KEY:
print("❌ LLM_API_KEY not set. Please set it in your .env file.")
return
LLM_MODEL = os.getenv("LLM_MODEL", "gpt-4o")
if not LLM_MODEL:
print("❌ LLM_MODEL not set. Please set it in your .env file.")
return
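The early-validation pattern suggested above can be collected into one helper (a sketch, not the PR's code; note that the `if not LLM_MODEL` branch in the suggestion is unreachable once `os.getenv` is given the `"gpt-4o"` default):

```python
import os
from typing import Optional


def check_llm_env() -> Optional[dict]:
    """Validate LLM env vars up front; return them, or None with a message on failure.

    LLM_MODEL falls back to "gpt-4o" when unset, matching the suggestion above.
    """
    api_key = os.getenv("LLM_API_KEY")
    if not api_key:
        print("❌ LLM_API_KEY not set. Please set it in your .env file.")
        return None
    model = os.getenv("LLM_MODEL", "gpt-4o")
    return {"api_key": api_key, "model": model}
```

The caller can then bail out on `None` once, instead of repeating the getenv-and-check pair for every variable.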

Comment on lines +53 to +55
model=os.getenv("LLM_MODEL"),
temperature=0,
api_key=os.getenv("LLM_API_KEY"),
Member

Suggested change
model=os.getenv("LLM_MODEL"),
temperature=0,
api_key=os.getenv("LLM_API_KEY"),
model=LLM_MODEL,
temperature=0,
api_key=LLM_API_KEY,


# Initialize etter components
llm = ChatOpenAI(model="gpt-4o", temperature=0)
llm = init_chat_model(model=os.getenv("LLM_MODEL"), temperature=0, api_key=os.getenv("LLM_API_KEY"))
Member

By checking the env variables early we can provide a str to init_chat_model, which returns a BaseChatModel instead of a _ConfigurableModel, so the typing is correct.

Suggested change
llm = init_chat_model(model=os.getenv("LLM_MODEL"), temperature=0, api_key=os.getenv("LLM_API_KEY"))
LLM_API_KEY = os.getenv("LLM_API_KEY")
if not LLM_API_KEY:
print("❌ LLM_API_KEY not set. Please set it in your .env file.")
return
LLM_MODEL = os.getenv("LLM_MODEL", "gpt-4o")
if not LLM_MODEL:
print("❌ LLM_MODEL not set. Please set it in your .env file.")
return
llm = init_chat_model(model=LLM_MODEL, temperature=0, api_key=LLM_API_KEY)
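For full provider independence, `init_chat_model` also accepts a `"provider:model"` string (e.g. `"openai:gpt-4o"`), so a single `LLM_MODEL` variable could carry both pieces. A pure-Python sketch of that parsing convention, with no LangChain dependency (`split_model_spec` is a hypothetical helper, not LangChain API):

```python
def split_model_spec(spec: str, default_provider: str = "openai") -> tuple:
    """Split a "provider:model" env value into (provider, model).

    Falls back to default_provider when no ":" prefix is present,
    so plain model names like "gpt-4o" keep working.
    """
    provider, sep, model = spec.partition(":")
    if sep:  # a ":" was present: the left side names the provider
        return (provider, model)
    return (default_provider, spec)
```

Usage: `LLM_MODEL="anthropic:claude-3"` selects both the provider and the model, while `LLM_MODEL="gpt-4o"` keeps the current default provider.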

if not os.getenv("LLM_API_KEY"):
pytest.skip("LLM_API_KEY not set")
set_llm_cache(InMemoryCache())
llm = init_chat_model(model="gpt-4o", model_provider="openai", temperature=0)
Member

Required to pass the API key for it to work

Suggested change
llm = init_chat_model(model="gpt-4o", model_provider="openai", temperature=0)
llm = init_chat_model(model="gpt-4o", model_provider="openai", temperature=0, api_key=os.getenv("LLM_API_KEY"))
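With the suggestions applied, the .env file would carry both variables along these lines (placeholder values; variable names taken from the suggestions above):

```shell
# .env (placeholder values, never commit real keys)
LLM_MODEL=gpt-4o
LLM_API_KEY=your-api-key-here
```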


2 participants