
Commit 7bd41f0

committed
WIP
1 parent 941f14c commit 7bd41f0

22 files changed: +228 −1 lines changed

docs/Gemfile

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
gem "github-pages", group: :jekyll_plugins

docs/_config.yml

Lines changed: 8 additions & 0 deletions
@@ -0,0 +1,8 @@
remote_theme: pages-themes/cayman@v0.2.0
plugins:
  - jekyll-remote-theme # add this line to the plugins list if you already have one

title: python-llms-wrapper
description: A library for making it simple and straightforward to use many different LLMs, including tooling

show_downloads: false

docs/_notes.txt

Lines changed: 15 additions & 0 deletions
@@ -0,0 +1,15 @@
GitHub help:
* https://docs.github.com/en/pages/setting-up-a-github-pages-site-with-jekyll/about-github-pages-and-jekyll
* https://jekyll.github.io/github-metadata/site.github/
* https://pages.github.com/themes/
* https://github.com/pages-themes/cayman
* https://github.com/pages-themes/midnight
* https://github.com/jekyll/minima
* https://github.com/pages-themes/minimal
* https://github.com/pages-themes/slate
* https://github.com/pages-themes/tactile
* Customize theme: https://docs.github.com/en/pages/setting-up-a-github-pages-site-with-jekyll/adding-a-theme-to-your-github-pages-site-using-jekyll


Test locally: see the theme readme, https://github.com/pages-themes/cayman
Then run: bundle exec jekyll serve

docs/_run.sh

Lines changed: 3 additions & 0 deletions
@@ -0,0 +1,3 @@
#!/bin/bash

bundle exec jekyll serve

docs/configuration.md

Lines changed: 73 additions & 0 deletions
@@ -0,0 +1,73 @@
Every LLM to be used in a project needs configuration data. Configuration data is expected as a dictionary which contains the key `llms`, associated with a list of per-LLM configurations, and optionally the key `providers`, associated with a dictionary mapping provider names to settings common to all LLMs of that provider.

Here is an example configuration showing a subset of the settings:
```
{
    "llms" : [
        {
            "api_key_env" : "OPENAI_KEY1",
            "llm" : "openai/gpt-4o",
            "temperature" : 0
        },
        {
            "llm" : "gemini/gemini-1.5-flash",
            "temperature" : 1,
            "alias" : "gemini1"
        },
        {
            "llm" : "gemini/gemini-1.5-flash",
            "temperature" : 0,
            "alias" : "gemini2"
        }
    ],
    "providers" : {
        "gemini" : {
            "api_key_env" : "GEMINI_KEY1"
        }
    }
}
```

Each LLM is identified by the provider name, e.g. "openai" or "gemini", followed by a slash, followed by the model name or id. Provider names must match the provider names known in the [litellm](https://docs.litellm.ai/docs/providers) package.

Parameters specified for each of the providers in the `providers` section apply to every LLM in the `llms` section unless the same parameter is also specified for the LLM, in which case the LLM's value takes precedence.
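
This precedence rule can be sketched as a plain dictionary merge. The following is an illustrative sketch only, not the library's actual code (`effective_settings` is a made-up helper name):

```python
# Illustrative sketch of the provider/LLM precedence rule:
# provider-level settings act as defaults, per-LLM settings win.

def effective_settings(llm_cfg: dict, providers: dict) -> dict:
    """Merge provider defaults into an LLM entry; LLM values take precedence."""
    # "gemini/gemini-1.5-flash" -> provider name "gemini"
    provider = llm_cfg["llm"].split("/", 1)[0]
    defaults = providers.get(provider, {})
    return {**defaults, **llm_cfg}

llm = {"llm": "gemini/gemini-1.5-flash", "temperature": 1, "alias": "gemini1"}
providers = {"gemini": {"api_key_env": "GEMINI_KEY1", "temperature": 0}}
merged = effective_settings(llm, providers)
# merged["temperature"] is 1 (the per-LLM value wins),
# merged["api_key_env"] is "GEMINI_KEY1" (inherited from the provider)
```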

The following parameters are known and supported in the `llms` and/or `providers` sections:

* `llm` (`llms` section only): specifies a specific model using the format `providername/modelid`.
* `api_key`: the literal API key to use
* `api_key_env`: the environment variable which contains the API key
* `api_url`: the base URL to use for the model, e.g. for an ollama server. The URL may contain placeholders which get replaced with the model name (`${model}`) or with the user and password for basic authentication (`${user}`, `${password}`), e.g. `http://${user}:${password}@localhost:11434`
* `user`, `password`: the user and password to use for basic authentication; this requires `api_url` to also be specified with the corresponding placeholders
* `alias` (`llms` section only): an alias name for the model which then has to be used in the API. If no `alias` is specified, the name specified for `llm` is used.
* `num_retries`: if present, specifies the number of retries to perform if an error occurs before giving up
* `timeout`: if present, raise a timeout error after that many seconds
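
As an illustration of the `api_url` placeholders, an entry for a model served by a local ollama instance behind basic authentication might look like the following (the model name, host, port and credentials here are made-up examples):

```
{
    "llm" : "ollama/llama3",
    "api_url" : "http://${user}:${password}@localhost:11434",
    "user" : "myuser",
    "password" : "mypassword",
    "num_retries" : 3,
    "timeout" : 60
}
```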

All other settings are passed as-is to the model invocation function. Different providers or APIs may support different parameters, but most will support `temperature`, `max_tokens` and `top_p`.

IMPORTANT: the raw configuration as shown in the example above needs to be processed by the function `llms_wrapper.config.update_llm_config` in order to perform all the necessary substitutions!
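
To illustrate what these substitutions involve, here is a simplified sketch of the kind of processing performed (this is not the library's actual implementation, and `substitute` is a made-up helper name; the real function handles more cases):

```python
import os
from string import Template

# Simplified sketch of the kind of substitutions applied to one LLM entry:
# resolve api_key_env into an actual key, and fill URL placeholders.

def substitute(llm_cfg: dict) -> dict:
    cfg = dict(llm_cfg)
    # look up the API key from the named environment variable
    if "api_key_env" in cfg and "api_key" not in cfg:
        cfg["api_key"] = os.environ.get(cfg["api_key_env"], "")
    # replace ${model}, ${user}, ${password} placeholders in the URL
    if "api_url" in cfg:
        cfg["api_url"] = Template(cfg["api_url"]).safe_substitute(
            model=cfg.get("llm", "").split("/", 1)[-1],
            user=cfg.get("user", ""),
            password=cfg.get("password", ""),
        )
    return cfg

os.environ["OPENAI_KEY1"] = "sk-demo"  # set here for demonstration only
cfg = substitute({"llm": "openai/gpt-4o", "api_key_env": "OPENAI_KEY1"})
# cfg now also contains the resolved key under "api_key"
```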

#### Other top-level config entries

Currently, the following top-level configuration fields in addition to `llms` and `providers` are recognized:

* `use_phoenix`: if present, should be the URL of a local phoenix endpoint or a list containing the endpoint URL and the project name

## Configuration files

The configuration as described above can be created programmatically or read from a config file. The function `llms_wrapper.config.read_config_file` allows reading the configuration from files in any of the following formats: json, hjson, yaml, toml. By default, reading the configuration that way will also perform the necessary substitutions by automatically invoking `llms_wrapper.config.update_llm_config`.

docs/index.html

Lines changed: 0 additions & 1 deletion
This file was deleted.

docs/index.md

Lines changed: 18 additions & 0 deletions
@@ -0,0 +1,18 @@
# python-llms-wrapper

The `llms_wrapper` Python package is an opinionated library based on the excellent
[`litellm`](https://github.com/BerriAI/litellm) package to simplify using diverse
LLMs from many different providers.

This documentation contains the following sections:

* [Installation](installation)
* [Quick Start / Examples](quickstart)
* [Configuration](configuration)
* [Test Script](test-script)
* [Usage](usage)
* [Python Docs](pythondoc/llms_wrapper)

LLM support is based on the [LiteLLM](https://github.com/BerriAI/litellm) package, and the supported LLMs are listed [here](https://docs.litellm.ai/docs/providers/).

docs/installation.md

Lines changed: 15 additions & 0 deletions
@@ -0,0 +1,15 @@
# Installation

### 1. Python environment

* If necessary, create a new Python environment
    * e.g. using Anaconda/Miniconda: `conda create -y -n llms_wrapper python=3.11`

### 2. Install package

* From local source:
    * clone the repository and change into the directory
    * run `pip install -e .`
    * to install for development: `python -m pip install -e .[dev]`
* From source directly from GitHub: `pip install -U git+https://github.com/OFAI/python-llms-wrapper.git`
* To create a notebook kernel run `python -m ipykernel install --user --name=llms_wrapper`
