
Commit 3d40c49

observability(docs): adopt phase terminology across framework
Changed "stage" to "phase" throughout the observability framework to better reflect the non-linear, iterative nature of the model. Phases can be revisited and worked on concurrently, unlike sequential stages.

Changes:
- Framework page: Updated all section headings from "Stage X:" to "Phase" format, removed numbering from navigation cards, updated prose
- All 5 phase guides: Added phase context to subtitle frontmatter (e.g., "This is the INSTRUMENT phase of the observability framework")
- Removed numbered stage references throughout

Also includes from earlier consistency review:
- Framework: Added Test Suites deprecated label, Simulations pre-release
- Instrumentation: Removed Call Analysis recommendation, reordered nav cards, added back-link, added inter-stage bridge, removed decorative emoji
- Testing strategies: Added prerequisite reference to instrumentation
- Extraction patterns: Removed decorative emojis from comparison table
1 parent 98eb623 commit 3d40c49

6 files changed, +50 -35 lines changed

fern/observability/extraction-patterns.mdx

Lines changed: 6 additions & 4 deletions
```diff
@@ -1,9 +1,11 @@
 ---
 title: Choosing your extraction pattern
-subtitle: Understand the three architectural patterns for getting data out of Vapi
+subtitle: Understand the three architectural patterns for getting data out of Vapi. This is the **EXTRACT phase** of the [observability framework](/observability/framework).
 slug: observability/extraction-patterns
 ---
 
+<span className="internal-note">This page is in Rough Draft stage</span>
+
 ## Why extraction is an architectural choice
 
 Unlike traditional observability platforms (DataDog, New Relic) where data flows automatically from instrumentation to monitoring, **Vapi requires you to choose how data gets extracted** for analysis.
```

```diff
@@ -31,9 +33,9 @@ Vapi offers three architectural patterns for extracting observability data from
 
 | Pattern | Description | Engineering effort | Data richness | Typical users |
 |---------|-------------|-------------------|---------------|---------------|
-| **Dashboard Native** | Use Vapi's built-in Boards with scalar Structured Outputs for real-time dashboards | Minimal (no infrastructure) | Basic (scalar fields only) | Solo founders, non-technical teams, startups |
-| **Webhook-to-External** | Build custom post-call processing that captures data via webhooks and exports to your data warehouse | 🛠️ High (requires backend infrastructure) | Rich (full object schemas, nested data) | Engineering teams, enterprises with existing data platforms |
-| **Hybrid** | Combine both approaches - use Boards for operational metrics, webhooks for deep analysis | ⚙️ Medium (partial infrastructure) | Flexible (mix of scalar and object data) | Growing teams balancing simplicity and power |
+| **Dashboard Native** | Use Vapi's built-in Boards with scalar Structured Outputs for real-time dashboards | Minimal (no infrastructure) | Basic (scalar fields only) | Solo founders, non-technical teams, startups |
+| **Webhook-to-External** | Build custom post-call processing that captures data via webhooks and exports to your data warehouse | High (requires backend infrastructure) | Rich (full object schemas, nested data) | Engineering teams, enterprises with existing data platforms |
+| **Hybrid** | Combine both approaches - use Boards for operational metrics, webhooks for deep analysis | Medium (partial infrastructure) | Flexible (mix of scalar and object data) | Growing teams balancing simplicity and power |
 
 **How to choose**: Start with Dashboard Native (fastest setup). Migrate to Hybrid or Webhook-to-External as your analytics needs grow or when you need features like Scorecard visualization or external BI tools.
```
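The Webhook-to-External pattern in the table above can be made concrete with a small sketch: a handler that flattens an end-of-call webhook payload into a row for a data warehouse. The payload shape and field names here are illustrative assumptions, not Vapi's actual webhook schema:

```typescript
// Hypothetical shape of an end-of-call webhook payload.
// Field names are illustrative only -- check Vapi's webhook docs for the real schema.
interface EndOfCallPayload {
  callId: string;
  startedAt: string; // ISO 8601 timestamp
  endedAt: string;   // ISO 8601 timestamp
  costUsd: number;
  structuredOutputs: Record<string, unknown>; // rich nested data captured at INSTRUMENT time
}

// Flatten one payload into a row suitable for a warehouse table.
function toWarehouseRow(p: EndOfCallPayload) {
  const durationSeconds =
    (Date.parse(p.endedAt) - Date.parse(p.startedAt)) / 1000;
  return {
    call_id: p.callId,
    duration_seconds: durationSeconds,
    cost_usd: p.costUsd,
    // Keep the full object schema as JSON -- this is the "data richness"
    // advantage of Webhook-to-External over scalar-only dashboards.
    outputs_json: JSON.stringify(p.structuredOutputs),
  };
}
```

In a real deployment this function would sit behind an HTTP endpoint that verifies the webhook's authenticity before inserting the row.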

fern/observability/instrumentation.mdx

Lines changed: 21 additions & 11 deletions
```diff
@@ -1,6 +1,6 @@
 ---
 title: Instrumentation
-subtitle: Configure your assistant to capture operational and business metrics
+subtitle: Configure your assistant to capture operational and business metrics. This is the **INSTRUMENT phase** of the [observability framework](/observability/framework).
 slug: observability/instrumentation
 ---
 
```

```diff
@@ -39,6 +39,8 @@ Think of instrumentation as installing sensors in your assistant:
 - What metrics will help you debug failures?
 - What data do you need for compliance or reporting?
 
+The schemas you define here become the assertions your Evals validate in the [TEST stage](/observability/testing-strategies).
+
 The "Instrumentation tools at a glance" section below shows how to configure custom instrumentation.
 
 ---
```

```diff
@@ -48,7 +50,7 @@ The "Instrumentation tools at a glance" section below shows how to configure cus
 | Tool | What it does | Configuration |
 | ---------------------------- | ------------------------------------------------------------------------------------------------------------------------------------ | ---------------------------------------------------------- |
 | **Built-in Instrumentation** | Automatic capture of call metadata (duration, cost, timestamps), transcripts, messages, tool calls, operational metrics. | ✅ Automatic - no configuration needed |
-| **Structured Outputs** | AI-powered data extraction using JSON Schema. Define custom schemas for customer info, call outcomes, sentiment analysis, summaries. | ⚙️ Configure schemas on assistant |
+| **Structured Outputs** | AI-powered data extraction using JSON Schema. Define custom schemas for customer info, call outcomes, sentiment analysis, summaries. | Configure schemas on assistant |
 | **Call Analysis** | Legacy feature for generating call summaries using AnalysisPlan configuration. | ⚠️ Legacy (use Structured Outputs for new implementations) |
 
```

```diff
@@ -134,7 +136,7 @@ Built-in does NOT cover:
 
 **When NOT to use**:
 
-- You only need simple call summaries (consider Call Analysis)
+- You only need simple call summaries (Structured Outputs can generate simple summaries too, but may be overkill if you don't need structured data)
 - Built-in operational metrics are sufficient
 
 **[Configure Structured Outputs: Quickstart](/assistants/structured-outputs-quickstart)**
```

```diff
@@ -275,19 +277,27 @@ Start with basic business metrics (call success, customer info), then add qualit
     Set up your first custom instrumentation
   </Card>
 
-  <Card
-    title="Extraction patterns"
-    icon="diagram-project"
-    href="/observability/extraction-patterns"
-  >
-    Choose your data extraction strategy
-  </Card>
-
   <Card
     title="Testing strategies"
     icon="vial"
     href="/observability/testing-strategies"
   >
     Next stage: Validate your instrumented assistant
   </Card>
+
+  <Card
+    title="Extraction patterns"
+    icon="diagram-project"
+    href="/observability/extraction-patterns"
+  >
+    Choose your data extraction strategy
+  </Card>
+
+  <Card
+    title="Back to framework"
+    icon="arrow-left"
+    href="/observability/framework"
+  >
+    Return to the observability maturity model
+  </Card>
 </CardGroup>
```
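Since the diff above describes Structured Outputs as "data extraction using JSON Schema", a minimal call-outcome schema helps illustrate the idea. This is an assumed example, not a schema taken from Vapi's docs; the scalar `success` and `outcome` fields are the kind a dashboard can chart directly, while richer nested data would flow through the webhook path:

```typescript
// Illustrative JSON Schema for a call-outcome Structured Output.
// Field names are examples, not Vapi's required schema.
const callOutcomeSchema = {
  type: "object",
  properties: {
    success: {
      type: "boolean",
      description: "Did the caller achieve their goal?",
    },
    outcome: {
      type: "string",
      enum: ["resolved", "escalated", "abandoned"],
    },
    summary: {
      type: "string",
      description: "One-sentence call summary",
    },
  },
  required: ["success", "outcome"],
} as const;
```

Schemas like this double as the contract that downstream Evals can assert against, which is why defining them carefully in the INSTRUMENT phase pays off in the TEST phase.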

fern/observability/monitoring.mdx

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,6 +1,6 @@
 ---
 title: Monitoring & Operating
-subtitle: Visualize trends, track operational health, and ensure production reliability
+subtitle: Visualize trends, track operational health, and ensure production reliability. This is the **MONITOR phase** of the [observability framework](/observability/framework).
 slug: observability/monitoring
 ---
 
```

fern/observability/observability-framework.mdx

Lines changed: 17 additions & 17 deletions
````diff
@@ -40,7 +40,7 @@ If you're just experimenting or building a demo, you might not need the full fra
 
 ## The observability maturity model
 
-Vapi's observability tools support a 5-stage progression:
+Vapi's observability tools support a 5-phase progression:
 
 ```
 ┌──────────────────────────────────────────────────────────────────┐
````

```diff
@@ -59,7 +59,7 @@ Vapi's observability tools support a 5-stage progression:
 
 ### This is a maturity progression, not a linear checklist
 
-You don't complete one stage and never return to it. Observability is **continuous**:
+You don't complete one phase and never return to it. Observability is **continuous**:
 
 - **Instrument** as you build new features
 - **Test** after every change
```

```diff
@@ -69,17 +69,17 @@ You don't complete one stage and never return to it. Observability is **continuo
 
 **For teams just starting**: Begin with INSTRUMENT + TEST (validate before production). Add EXTRACT + MONITOR as you scale. OPTIMIZE becomes natural once you have data flowing.
 
-**For experienced teams**: You're likely already monitoring production. This framework helps systematize pre-production testing (TEST stage) and formalize continuous improvement (OPTIMIZE stage).
+**For experienced teams**: You're likely already monitoring production. This framework helps systematize pre-production testing (TEST phase) and formalize continuous improvement (OPTIMIZE phase).
 
-<span className="vapi-validation">Is "maturity model" the right framing? Should we emphasize iteration more explicitly? How do customer segments (startups vs enterprises) typically progress through these stages?</span>
+<span className="vapi-validation">Is "maturity model" the right framing? Should we emphasize iteration more explicitly? How do customer segments (startups vs enterprises) typically progress through these phases?</span>
 
 ---
 
 ## How this framework maps to Vapi tools
 
-Each stage uses specific Vapi features. Here's a quick reference:
+Each phase uses specific Vapi features. Here's a quick reference:
 
-### Stage 1: INSTRUMENT
+### INSTRUMENT Phase
 
 Configure your assistant to capture operational and business metrics.
 
```

```diff
@@ -89,17 +89,17 @@ Configure your assistant to capture operational and business metrics.
 
 ---
 
-### Stage 2: TEST
+### TEST Phase
 
 Validate your assistant works correctly before production deployment.
 
-**What you'll use**: Evals, Simulations, Test Suites
+**What you'll use**: Evals, Simulations (Pre-release), Test Suites (⚠️ Deprecated)
 
 **[Deep dive: Testing strategies](/observability/testing-strategies)**
 
 ---
 
-### Stage 3: EXTRACT
+### EXTRACT Phase
 
 Choose your data extraction pattern based on technical capability and analytics needs.
 
```

```diff
@@ -109,7 +109,7 @@ Choose your data extraction pattern based on technical capability and analytics
 
 ---
 
-### Stage 4: MONITOR
+### MONITOR Phase
 
 Visualize trends, track operational health, and catch problems early.
 
```

```diff
@@ -119,7 +119,7 @@ Visualize trends, track operational health, and catch problems early.
 
 ---
 
-### Stage 5: OPTIMIZE
+### OPTIMIZE Phase
 
 Use observability data to continuously improve your assistant.
 
```

```diff
@@ -167,47 +167,47 @@ Most teams start with Dashboard Native (simple, no engineering required), add we
 
 ## Next steps
 
-### Learn the framework stages
+### Learn the framework phases
 
 <CardGroup cols={2}>
   <Card
     title="Instrumentation"
     icon="wrench"
     href="/observability/instrumentation"
   >
-    Stage 1: Configure data capture
+    Configure data capture
   </Card>
 
   <Card
     title="Testing strategies"
     icon="vial"
     href="/observability/testing-strategies"
   >
-    Stage 2: Validate before production
+    Validate before production
   </Card>
 
   <Card
     title="Extraction patterns"
     icon="diagram-project"
     href="/observability/extraction-patterns"
   >
-    Stage 3: Choose your data pipeline
+    Choose your data pipeline
   </Card>
 
   <Card
     title="Monitoring"
     icon="chart-line"
     href="/observability/monitoring"
   >
-    Stage 4: Track operational health
+    Track operational health
   </Card>
 
   <Card
     title="Optimization workflows"
     icon="arrow-trend-up"
     href="/observability/optimization-workflows"
   >
-    Stage 5: Continuously improve
+    Continuously improve
   </Card>
 </CardGroup>
 
```

fern/observability/optimization-workflows.mdx

Lines changed: 2 additions & 1 deletion
```diff
@@ -1,11 +1,12 @@
 ---
 title: Optimization workflows
-subtitle: Use observability data to continuously improve your assistant
+subtitle: Use observability data to continuously improve your assistant. This is the **OPTIMIZE phase** of the [observability framework](/observability/framework).
 slug: observability/optimization-workflows
 ---
 
 <span className="internal-note">This page is in Skeleton Draft stage - structure and scope for review, detailed content to be developed in iteration 2</span>
 
+
 ## What is optimization?
 
 **Optimization** is the continuous improvement loop: using observability data to refine prompts, improve tool calls, and enhance conversation flows.
```

fern/observability/testing-strategies.mdx

Lines changed: 3 additions & 1 deletion
```diff
@@ -1,6 +1,6 @@
 ---
 title: Testing strategies
-subtitle: Validate your assistant works correctly before deploying to production
+subtitle: Validate your assistant works correctly before deploying to production. This is the **TEST phase** of the [observability framework](/observability/framework).
 slug: observability/testing-strategies
 ---
 
```

```diff
@@ -18,6 +18,8 @@ Unlike traditional software testing (unit tests, integration tests), voice AI te
 - **Edge cases** — How does the system handle interruptions, unclear requests, or unexpected inputs?
 - **Regression** — Do changes break existing functionality?
 
+Testing assumes you've already instrumented your assistant with Structured Outputs (see [Instrumentation](/observability/instrumentation)).
+
 <span className="vapi-validation">What other specific validation and/or testing uniqueness have clients reported when working with voice AI testing?</span>
 
 ---
```
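The instrument-then-test ordering added in the diff above can be illustrated with a toy check: once a call produces a structured output, an eval reduces to asserting on its fields. This sketches the concept only, under assumed field names; it is not Vapi's Evals API:

```typescript
// Toy eval: assert on fields of a structured output captured during instrumentation.
// A conceptual sketch -- Vapi's actual Evals are configured in the platform.
type StructuredOutput = Record<string, string | number | boolean>;

interface Assertion {
  field: string;
  expected: string | number | boolean;
}

// Returns a list of human-readable failure messages; empty means the eval passed.
function runEval(output: StructuredOutput, assertions: Assertion[]): string[] {
  const failures: string[] = [];
  for (const a of assertions) {
    if (output[a.field] !== a.expected) {
      failures.push(`${a.field}: expected ${a.expected}, got ${output[a.field]}`);
    }
  }
  return failures;
}
```

The point of the prerequisite is visible here: without the schema defined in the INSTRUMENT phase, there are no fields for the TEST phase to assert on.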
