boto3-helpers

Lightweight AWS SDK utilities for Python projects using boto3.

Eliminates repetitive boto3.client() calls, centralizes region and credential resolution, caches clients at process scope with thread-safe double-checked locking, and provides idiomatic thin wrappers for Bedrock's control-plane and runtime APIs. Designed for consistent boto3 behavior across any number of projects and a single mock-injection point for tests.


Why you need this

When working with boto3 in production Python code, you inevitably need:

  1. One place to set the default region — instead of passing region= to every single call
  2. A client cache — boto3 client creation is expensive; the same service should return the same cached object
  3. Sensible region resolution order — explicit arg > env var > configured default > AWS SDK chain
  4. Validation before API calls — obvious mistakes like empty model ID or invalid top_p should fail immediately with clear errors
  5. Thin wrappers around high-use APIs — Bedrock Converse, ConverseStream, and model listing are used everywhere; don't repeat the request-assembly boilerplate
  6. Thread-safe operations — production workloads use threads and async contexts; client access must be protected

This library provides all of the above. It's small (~350 lines), has no external dependencies beyond boto3, and integrates with any existing boto3 codebase without intrusion.


Features

| Capability | How it works |
| --- | --- |
| Project-wide default region | `configure(region="us-west-2")` sets the default once at startup |
| Cached boto3 clients | `get_client()` caches by `(service, region, profile)` — same object returned on every call |
| Region resolution chain | Explicit arg > `AWS_DEFAULT_REGION` env var > `configure()` value > SDK chain |
| Creation logging | Client/resource cache hits, misses, and creation failures are logged via Python's standard `logging` module |
| Bedrock Converse wrapper | `runtime.converse()` — builds the request dict, handles `top_k` via `additionalModelRequestFields`, returns the raw response |
| Bedrock ConverseStream wrapper | `runtime.converse_stream()` — yields normalised `(event_type, payload)` tuples, including start/stop/tool events |
| Model listing wrapper | `control.list_foundation_models()` — returns the full foundation model catalogue as a list of dicts |
| Model detail wrapper | `control.get_foundation_model()` — returns a single model's full metadata |
| DynamoDB batch helpers | `batch_get_items()` and `batch_write_items()` handle multi-item reads and writes with retries/buffering |
| Cross-platform build scripts | `build.sh` (Linux/macOS) and `build.ps1` (Windows) — bump version, build wheel, install |

Package layout

boto3-helpers/
├── pyproject.toml          # package metadata and build config
├── setup.py                # minimal stub for editable-install compatibility
├── build.sh                # packaging script (Linux / macOS / WSL)
├── build.ps1               # packaging script (Windows PowerShell)
├── .gitignore
└── boto3_helpers/
    ├── __init__.py          # re-exports configure, get_client, get_resource
    ├── client.py            # generic cached client/resource factory + configure()
    ├── dynamodb.py          # common DynamoDB get/put/delete/scan + batch helpers
    └── bedrock/
        ├── __init__.py
        ├── control.py       # Bedrock control-plane (list / get models)
        └── runtime.py       # Bedrock Converse API wrapper

How it works

configure() — set project-wide defaults

Call once at application startup so you never have to pass region/profile on every individual call:

import boto3_helpers

boto3_helpers.configure(region="eu-west-1", profile="my-prod")

Region resolution order (first match wins):

| Priority | Source |
| --- | --- |
| 1 | Explicit `region=` argument on `get_client()` / `get_resource()` |
| 2 | `AWS_DEFAULT_REGION` environment variable |
| 3 | Value set via `configure()` |
| 4 | boto3's own chain (`~/.aws/config`, instance metadata, etc.) |
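The chain above can be sketched as a single pure function. This is an illustrative helper, not the library's actual internals — the function name `resolve_region` and the `_DEFAULT_REGION` module global are assumptions:

```python
import os

_DEFAULT_REGION = None  # stands in for the value stored by configure()

def resolve_region(explicit=None):
    # First match wins; returning None defers to boto3's own chain.
    if explicit:
        return explicit                       # 1. explicit argument
    env = os.environ.get("AWS_DEFAULT_REGION")
    if env:
        return env                            # 2. environment variable
    if _DEFAULT_REGION:
        return _DEFAULT_REGION                # 3. configure() default
    return None                               # 4. fall through to the SDK
```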

get_client() / get_resource() — cached boto3 factory

from boto3_helpers import get_client, get_resource
| Function | Returns |
| --- | --- |
| `get_client(service, region, profile)` | Cached `boto3.Client` |
| `get_resource(service, region, profile)` | Cached `boto3.ServiceResource` |

Clients are cached by (service, resolved_region, resolved_profile) — the same object is returned on every subsequent call (thread-safe via double-checked locking).

import boto3_helpers

boto3_helpers.configure(region="us-east-1")

s3  = get_client("s3")                      # uses us-east-1
ddb = get_resource("dynamodb")              # uses us-east-1
eu  = get_client("s3", region="eu-west-1")  # explicit per-call override

bedrock/control.py — Bedrock control-plane

from boto3_helpers.bedrock import control

models  = control.list_foundation_models()
details = control.get_foundation_model("anthropic.claude-3-haiku-20240307-v1:0")
| Function | AWS API |
| --- | --- |
| `list_foundation_models(region)` | ListFoundationModels |
| `get_foundation_model(model_id, region)` | GetFoundationModel |

bedrock/runtime.py — Bedrock Converse API

from boto3_helpers.bedrock import runtime

response = runtime.converse(
    model_id="us.anthropic.claude-3-haiku-20240307-v1:0",
    messages=[{"role": "user", "content": [{"text": "Hello!"}]}],
    max_tokens=256,
    temperature=0.7,
    top_p=0.9,
    top_k=50,
    system_prompt="You are a concise assistant.",
)

text  = response["output"]["message"]["content"][0]["text"]
usage = response["usage"]   # inputTokens, outputTokens, totalTokens

Inputs are validated before the SDK call is made. Obvious mistakes such as an empty model_id, empty messages, invalid top_p, or non-positive max_tokens raise ValueError immediately instead of failing later inside boto3.

| Parameter | Default | Description |
| --- | --- | --- |
| `model_id` | required | Bedrock / inference-profile model ID |
| `messages` | required | Converse-format message list |
| `max_tokens` | 250 | Max tokens in the response |
| `temperature` | None | Randomness of sampling — 0.0 = deterministic, 1.0 = most random. None = model default |
| `top_p` | None | Nucleus sampling — only tokens whose cumulative probability reaches this threshold are considered. None = model default |
| `top_k` | None | Limits the sampling pool to the K most-likely next tokens. Passed as `additionalModelRequestFields`; support varies by model. None = model default |
| `system_prompt` | None | Optional system prompt |
| `region` | None | Per-call region override |
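The fail-fast checks described above can be sketched like this. The helper name and exact error messages are assumptions, not the library's actual code:

```python
def validate_converse_args(model_id, messages, max_tokens=250, top_p=None):
    # Reject obvious mistakes before any boto3 call is made.
    if not model_id:
        raise ValueError("model_id must be a non-empty string")
    if not messages:
        raise ValueError("messages must contain at least one message")
    if max_tokens <= 0:
        raise ValueError("max_tokens must be a positive integer")
    if top_p is not None and not 0.0 <= top_p <= 1.0:
        raise ValueError("top_p must be between 0.0 and 1.0")
```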

bedrock/runtime.py — Bedrock ConverseStream events

runtime.converse_stream() yields normalised (event_type, payload) tuples.

| Event type | Meaning |
| --- | --- |
| `message_start` | Bedrock started a new streamed message |
| `content_block_start` | A non-tool content block started |
| `text_delta` | Text chunk emitted by the model |
| `tool_use` | Tool-use block started |
| `tool_use_delta` | Incremental tool-use payload |
| `content_block_stop` | Content block finished |
| `message_stop` | Message finished with stop reason |
| `metadata` | Final usage and metrics block |
| `tool_result` | Tool result event if present in the stream |
| `unknown` | Any event shape not yet normalised |
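A consumer of these tuples might accumulate text and usage like this. The payload shapes assumed here (a plain text chunk for `text_delta`, a dict for `metadata`) are illustrative — check the actual payloads yielded by `runtime.converse_stream()`:

```python
def collect_stream(events):
    # Fold a sequence of normalised (event_type, payload) tuples into
    # the final text, stop reason, and usage block.
    text_parts, stop_reason, usage = [], None, None
    for event_type, payload in events:
        if event_type == "text_delta":
            text_parts.append(payload)   # assumed: payload is the text chunk
        elif event_type == "message_stop":
            stop_reason = payload        # assumed: payload is the stop reason
        elif event_type == "metadata":
            usage = payload              # final usage/metrics block
    return "".join(text_parts), stop_reason, usage
```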

dynamodb.py — common item helpers

from boto3_helpers import batch_get_items, batch_write_items, get_item, put_item, scan_keys

items = batch_get_items(
    "sessions",
    [{"session_id": "a"}, {"session_id": "b"}],
    region="us-west-2",
)

batch_write_items(
    "sessions",
    [{"session_id": "c", "messages": "[]"}, {"session_id": "d", "messages": "[]"}],
)
| Function | Purpose |
| --- | --- |
| `get_item(table_name, key)` | Fetch one item or return None |
| `put_item(table_name, item)` | Upsert one item |
| `delete_item(table_name, key)` | Delete one item |
| `scan_keys(table_name, key_name)` | Read a projected key across all pages |
| `batch_get_items(table_name, keys)` | Fetch many items with automatic 100-key chunking and retry of unprocessed keys |
| `batch_write_items(table_name, items)` | Buffered batch writes using DynamoDB's batch writer |
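The 100-key chunking that `batch_get_items()` is described as performing can be sketched as a simple slicing generator (illustrative, not the library's actual code):

```python
def chunk_keys(keys, size=100):
    # DynamoDB's BatchGetItem accepts at most 100 keys per request,
    # so larger key lists must be split into successive slices.
    for start in range(0, len(keys), size):
        yield keys[start:start + size]
```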

Building the package

Two scripts are provided — pick one based on your OS. Both support the same flags and produce identical output in dist/.

Linux / macOS / WSL — build.sh

# First time only — make it executable
chmod +x build.sh

./build.sh                       # build wheel + sdist (uses version in pyproject.toml)
./build.sh --bump 0.2.0          # bump version, then build
./build.sh --install             # build, then install wheel into active Python env
./build.sh --editable            # pip install -e . (local development, no wheel)
./build.sh --bump 0.2.0 --install  # bump + build + install in one step

Windows — build.ps1

# Allow script execution (run once in an elevated PowerShell)
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser

.\build.ps1                          # build wheel + sdist
.\build.ps1 -Bump 0.2.0             # bump version, then build
.\build.ps1 -Install                 # build, then install wheel
.\build.ps1 -Editable                # editable install for dev
.\build.ps1 -Bump 0.2.0 -Install    # bump + build + install

What the scripts do

  1. Bump — rewrites version = "..." in pyproject.toml and __version__ in boto3_helpers/__init__.py atomically.
  2. Build — runs python -m build, which produces:
    • dist/boto3_helpers-X.Y.Z-py3-none-any.whl — the installable wheel
    • dist/boto3_helpers-X.Y.Z.tar.gz — the source distribution (sdist)
  3. Install — runs pip install --upgrade dist/*.whl for a regular install.
  4. Editable — runs pip install -e . so source changes are reflected without reinstalling.
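The bump step can be sketched as a regex rewrite of the PEP 621 version line. This assumes `pyproject.toml` contains exactly one `version = "..."` entry; the real scripts may do this differently:

```python
import re

def bump_pyproject_version(text, new_version):
    # Replace the single top-level `version = "..."` line in-place.
    return re.sub(r'(?m)^version\s*=\s*"[^"]*"',
                  f'version = "{new_version}"', text)
```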

Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines on how to contribute to this library.

License

This project is licensed under the MIT License. See the LICENSE file for the full text. When reusing this software, please ensure that the original copyright notice and attribution to Vijay Mourya are included.


Installation

This library is designed to be installed locally within the workspace as an editable package.

From source (local workspace)

# Navigate to the library directory
cd boto3-helpers

# Create a virtual environment (if not already created at workspace level)
python3 -m venv .venv
source .venv/bin/activate  # macOS/Linux
# or
.venv\Scripts\activate     # Windows

# Install in editable mode (recommended for development)
pip install -e .

From another workspace project

# Add to requirements.txt
-e ../boto3-helpers

# Or install directly
pip install -e ../boto3-helpers

Prerequisites

  • Python 3.10+
  • boto3 >= 1.26
  • AWS credentials configured locally (~/.aws/credentials, environment variables, or SSO)

Quick start

import boto3_helpers
from boto3_helpers import get_client
from boto3_helpers.bedrock import runtime

# Configure default region once
boto3_helpers.configure(region="us-west-2")

# Get a cached client
s3 = get_client("s3")

# Call Bedrock
response = runtime.converse(
    model_id="us.amazon.nova-micro-v1:0",
    messages=[{"role": "user", "content": [{"text": "Hello!"}]}],
    max_tokens=256,
)


Installing in another project

Option 1 — Editable (local development)

Both are equivalent:

# Using the build script
cd /path/to/boto3-helpers
./build.sh --editable

# Or manually
pip install -e /path/to/boto3-helpers

Add to requirements.txt:

boto3_helpers @ file:///absolute/path/to/boto3-helpers
# or relative path
-e ../boto3-helpers

Option 2 — Install built wheel

# Build the wheel first
cd /path/to/boto3-helpers
./build.sh

# Install in your project
pip install /path/to/boto3-helpers/dist/boto3_helpers-*.whl

Or add to requirements.txt:

boto3_helpers @ file:///path/to/boto3-helpers/dist/boto3_helpers-0.1.0-py3-none-any.whl

Using in your project

After installation, import directly — no extra configuration needed:

import boto3_helpers

# Set default region once (at app startup)
boto3_helpers.configure(region="us-west-2")

# Import and use anywhere
from boto3_helpers import get_client, get_resource
from boto3_helpers.bedrock import control, runtime

Override region via environment variable (no code changes):

export AWS_DEFAULT_REGION=ap-southeast-1
python my_app.py

Dependencies

| Package | Purpose | Notes |
| --- | --- | --- |
| boto3 | AWS SDK for all AWS services | Required |
| *(no other dependencies)* | | Intentionally minimal |


Design patterns

Global client cache by service/region/profile — boto3 clients are expensive to create. Caching at module scope ensures each unique (service, region, profile) combination returns the same object on every call, with thread safety via double-checked locking.

Region resolution chain — Explicit argument > AWS_DEFAULT_REGION > configure() value > boto3 SDK chain. This allows global configuration while remaining overridable.

Thread-safe with minimal locks — Uses threading.Lock only during cache miss and client creation. Cache hits are lock-free reads.
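The double-checked locking pattern described above can be sketched as follows — a minimal illustration, not the library's actual implementation:

```python
import threading

_cache = {}
_lock = threading.Lock()

def get_cached(key, factory):
    # Fast path: a cache hit is a lock-free dict read.
    obj = _cache.get(key)
    if obj is not None:
        return obj
    # Slow path: take the lock, then re-check the key inside it so
    # concurrent callers never construct the client twice.
    with _lock:
        obj = _cache.get(key)
        if obj is None:
            obj = factory()
            _cache[key] = obj
        return obj
```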

Bedrock Converse validation — Request validation happens before the boto3 call. Invalid top_p, empty model ID, or empty messages raise ValueError immediately with a clear error message.

Thin API wrappers — Functions accept plain Python types (strings, dicts, lists) and return plain dicts or lists. No custom classes or ORM-like abstractions. Easy to mock and test.

DynamoDB batch helpers — Standard operations like batch reads/writes handle pagination, retries, and chunking automatically. You pass items, the library handles the plumbing.


Advanced configuration

In a Jupyter notebook

import boto3_helpers

# Configurable per kernel session
boto3_helpers.configure(
    region="eu-west-1",
    profile="my-profile"
)

**Client caching by `(service, region, profile)` tuple:**
boto3 `Session` and `Client` creation resolves credentials and sets up an HTTP connection pool. Caching by the full key ensures a per-call region override doesn't evict the default-region client.

**Region resolution chain:**
`AWS_DEFAULT_REGION` overrides `configure()` but loses to an explicit argument. This lets code define a default while remaining overridable via environment variables, with no conditional branches needed.

**Bedrock Converse API:**
The unified message format `{"role": "user", "content": [{"text": "..."}]}` works across every provider. `usage` in the response always contains `inputTokens` and `outputTokens` regardless of model.

**ConverseStream event protocol:**
The key raw event types are `contentBlockDelta` (text chunks), `messageStop` (finish reason), and `metadata` (token counts); the stream also carries start/stop and tool-use events, as listed in the normalised event table above. `metadata` arrives last because the model doesn't know its output token count until generation ends.

**`top_k` via `additionalModelRequestFields`:**
Unlike `temperature` and `topP`, `top_k` is not a standard `InferenceConfig` field. It must be passed via `additionalModelRequestFields` as a model-specific parameter.
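Request assembly along these lines can be sketched as below. The `additionalModelRequestFields` key name `top_k` is an assumption — the exact field name is model-specific, which is why Converse can't standardise it:

```python
def build_converse_request(model_id, messages, max_tokens=250,
                           temperature=None, top_p=None, top_k=None):
    # Standard knobs live in inferenceConfig; top_k rides in
    # additionalModelRequestFields as a model-specific parameter.
    inference = {"maxTokens": max_tokens}
    if temperature is not None:
        inference["temperature"] = temperature
    if top_p is not None:
        inference["topP"] = top_p
    request = {"modelId": model_id, "messages": messages,
               "inferenceConfig": inference}
    if top_k is not None:
        request["additionalModelRequestFields"] = {"top_k": top_k}
    return request
```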

**Control plane vs data plane:**
`ListFoundationModels` / `GetFoundationModel` are control-plane calls (low rate limit, highly cacheable). `Converse` / `ConverseStream` are data-plane calls (higher rate limits, not cacheable). Separating them into `control.py` and `runtime.py` reflects this operational boundary.

---

## Real-world use cases

### Shared library across a monorepo

`bedrock-model-compare` and `stream-cli` both install `boto3-helpers` as a sibling editable install. When a bug is fixed in `client.py` (say, a caching edge case), both projects pick it up on the next process start with zero reinstall steps. This is the standard monorepo pattern for shared internal libraries.

### Region switching without code changes

Any project that calls `boto3_helpers.configure(region="us-west-2")` at startup can be redirected to another region by setting `AWS_DEFAULT_REGION=us-east-1` in the shell. The resolution chain (`explicit arg > env var > configure() > SDK`) means the shell override wins over the code default, but per-call explicit region arguments still win over both. No `if region is None` branches required.

### Lambda connection reuse

Lambda functions that call `boto3.client("bedrock-runtime")` on every invocation re-create the client and its connection pool each time, forcing fresh TLS handshakes. `get_client()` caches at module scope, so warm invocations reuse the same object and its pooled connections. For a Lambda handling 10 requests per second, that avoids on the order of 864,000 redundant client setups per day.

### Single mock injection point for tests

`get_client()` is the only place in any project where a real boto3 client is created. Patching `boto3_helpers.get_client` at test session scope intercepts every AWS call across every module in every project. No function-level patching, no fixture threading, no leaky mocks.
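The single-seam idea can be sketched with `unittest.mock`. The local `get_client` and `bucket_names` below are illustrative stand-ins; in a real suite you would patch `"boto3_helpers.get_client"` instead:

```python
import sys
from unittest import mock

def get_client(service):
    # Stand-in for boto3_helpers.get_client: the one seam where a
    # real boto3 client would otherwise be created.
    raise RuntimeError("would create a real boto3 client")

def bucket_names():
    # Hypothetical app code: every AWS call funnels through get_client().
    return [b["Name"] for b in get_client("s3").list_buckets()["Buckets"]]

fake_s3 = mock.Mock()
fake_s3.list_buckets.return_value = {"Buckets": [{"Name": "demo"}]}

# One patch at the seam intercepts every AWS call made by bucket_names().
with mock.patch.object(sys.modules[__name__], "get_client",
                       return_value=fake_s3):
    names = bucket_names()
```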

### Adding a new service

Create `boto3_helpers/<service>/__init__.py`, write functions that call `get_client("<aws-service-name>")`, and import them anywhere. Caching, region resolution, and the credential chain all work automatically; no extra infrastructure code is needed for the new service.

---


Releasing a new version

# 1. Bump version, build, and install locally to test
./build.sh --bump 0.2.0 --install

# 2. Verify the installed version
python3 -c "import boto3_helpers; print(boto3_helpers.__version__)"

# 3. Tag the git commit
git tag v0.2.0 && git push origin v0.2.0

# 4. (Optional) Publish to PyPI
pip install twine
twine upload dist/*

Adding new AWS services

  1. Create boto3_helpers/<service>/ with __init__.py and a module per concern.
  2. Call get_client("<aws-service-name>", region) — never construct boto3.client() directly.
  3. Keep functions thin — accept plain Python types, return plain dicts or lists.

Requirements

  • Python ≥ 3.10
  • boto3 (installed automatically as a dependency)
  • AWS credentials via the standard chain (~/.aws/credentials, env vars, instance role)
