
Commit 768b755

Fix broken links in tutorials (#427)
* update the pipeline link in tut 44
* updated the 404 links in tut 27, 35 and 40
* remove the Prepare the Colab Environment section from template and tutorials
1 parent 3919e45 commit 768b755

12 files changed: +7 −124 lines

tutorials/27_First_RAG_Pipeline.ipynb

Lines changed: 2 additions & 14 deletions
@@ -28,18 +28,6 @@
  "For this tutorial, you'll use the Wikipedia pages of [Seven Wonders of the Ancient World](https://en.wikipedia.org/wiki/Wonders_of_the_World) as Documents, but you can replace them with any text you want.\n"
  ]
  },
- {
- "cell_type": "markdown",
- "metadata": {
- "id": "QXjVlbPiO-qZ"
- },
- "source": [
- "## Preparing the Colab Environment\n",
- "\n",
- "- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration)\n",
- "- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/logging)"
- ]
- },
  {
  "cell_type": "markdown",
  "metadata": {
@@ -361,7 +349,7 @@
  "### Initialize a ChatGenerator\n",
  "\n",
  "\n",
- "ChatGenerators are the components that interact with large language models (LLMs). Now, set `OPENAI_API_KEY` environment variable and initialize a [OpenAIChatGenerator](https://docs.haystack.deepset.ai/docs/OpenAIChatGenerator) that can communicate with OpenAI GPT models. As you initialize, provide a model name:"
+ "ChatGenerators are the components that interact with large language models (LLMs). Now, set `OPENAI_API_KEY` environment variable and initialize a [OpenAIChatGenerator](https://docs.haystack.deepset.ai/docs/openaichatgenerator) that can communicate with OpenAI GPT models. As you initialize, provide a model name:"
  ]
  },
  {
@@ -626,4 +614,4 @@
  },
  "nbformat": 4,
  "nbformat_minor": 0
-}
+}
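
The context line in this diff describes the tutorial's core idea: retrieved documents are handed to a ChatGenerator to answer a question. A minimal, dependency-free sketch of that prompt-assembly step (the `build_rag_prompt` helper is hypothetical, not from the tutorial, which uses Haystack components):

```python
def build_rag_prompt(question: str, documents: list[str]) -> str:
    """Stitch retrieved documents into a grounded prompt (hypothetical helper)."""
    context = "\n".join(f"- {doc}" for doc in documents)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {question}\n"
        "Answer:"
    )

docs = ["The Great Pyramid of Giza is in Egypt."]
prompt = build_rag_prompt("Where is the Great Pyramid?", docs)
assert prompt.startswith("Answer the question")
```

In the real pipeline this string-building is done by a prompt builder component, and the resulting prompt is sent to the ChatGenerator.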

tutorials/29_Serializing_Pipelines.ipynb

Lines changed: 0 additions & 12 deletions
@@ -30,18 +30,6 @@
  "Although it's possible to serialize into other formats too, Haystack supports YAML out of the box to make it easy for humans to make changes without the need to go back and forth with Python code. In this tutorial, we will create a very simple pipeline in Python code, serialize it into YAML, make changes to it, and deserialize it back into a Haystack `Pipeline`."
  ]
  },
- {
- "cell_type": "markdown",
- "metadata": {
- "id": "9smrsiIqfS7J"
- },
- "source": [
- "## Preparing the Colab Environment\n",
- "\n",
- "- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration)\n",
- "- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/logging)"
- ]
- },
  {
  "cell_type": "markdown",
  "metadata": {
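
The context line above describes serializing a pipeline to YAML and loading it back. A dependency-free toy round trip that illustrates the shape of the idea, limited to a flat string mapping (the helpers are made up for illustration; Haystack itself handles full pipeline graphs):

```python
def dump_flat_yaml(data: dict[str, str]) -> str:
    """Emit a flat string-to-string mapping as YAML-style 'key: value' lines."""
    return "\n".join(f"{key}: {value}" for key, value in data.items())

def load_flat_yaml(text: str) -> dict[str, str]:
    """Parse the flat 'key: value' lines back into a dict."""
    pairs = (line.split(": ", 1) for line in text.splitlines() if line.strip())
    return {key: value for key, value in pairs}

config = {"type": "InMemoryDocumentStore", "policy": "overwrite"}
text = dump_flat_yaml(config)
assert load_flat_yaml(text) == config  # lossless round trip
```

The benefit the tutorial points at is exactly this round trip: the text form can be edited by hand, then deserialized back into a working object.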

tutorials/30_File_Type_Preprocessing_Index_Pipeline.ipynb

Lines changed: 0 additions & 12 deletions
@@ -40,18 +40,6 @@
  "Optionally, you can keep going to see how to use these documents in a query pipeline as well."
  ]
  },
- {
- "cell_type": "markdown",
- "metadata": {
- "id": "rns_B_NGN0Ze"
- },
- "source": [
- "## Preparing the Colab Environment\n",
- "\n",
- "- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration)\n",
- "- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/logging)"
- ]
- },
  {
  "cell_type": "markdown",
  "metadata": {

tutorials/31_Metadata_Filtering.ipynb

Lines changed: 1 addition & 13 deletions
@@ -28,18 +28,6 @@
  "Although new retrieval techniques are great, sometimes you just know that you want to perform search on a specific group of documents in your document store. This can be anything from all the documents that are related to a specific _user_, or that were published after a certain _date_ and so on. Metadata filtering is very useful in these situations. In this tutorial, we will create a few simple documents containing information about Haystack, where the metadata includes information on what version of Haystack the information relates to. We will then do metadata filtering to make sure we are answering the question based only on information about Haystack 2.0.\n"
  ]
  },
- {
- "cell_type": "markdown",
- "metadata": {
- "id": "tM3U5KyegTAE"
- },
- "source": [
- "## Preparing the Colab Environment\n",
- "\n",
- "- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration)\n",
- "- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/logging)"
- ]
- },
  {
  "cell_type": "markdown",
  "metadata": {
@@ -269,4 +257,4 @@
  },
  "nbformat": 4,
  "nbformat_minor": 0
-}
+}
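
The context line in this diff explains metadata filtering: restricting retrieval to documents whose metadata matches a condition (here, the Haystack version). A minimal stand-in using plain dicts rather than Haystack's filter syntax (`filter_by_meta` is a hypothetical helper):

```python
def filter_by_meta(documents: list[dict], **conditions) -> list[dict]:
    """Keep only documents whose 'meta' matches every given condition."""
    return [
        doc for doc in documents
        if all(doc["meta"].get(key) == value for key, value in conditions.items())
    ]

docs = [
    {"content": "Pipelines are linear.", "meta": {"version": 1.15}},
    {"content": "Pipelines can branch and loop.", "meta": {"version": 2.0}},
]
hits = filter_by_meta(docs, version=2.0)
assert [d["content"] for d in hits] == ["Pipelines can branch and loop."]
```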

tutorials/32_Classifying_Documents_and_Queries_by_Language.ipynb

Lines changed: 0 additions & 12 deletions
@@ -32,18 +32,6 @@
  "In the last section, you'll build a multi-lingual RAG pipeline. The language of a question is detected, and only documents in that language are used to generate the answer. For this section, the [`TextLanguageRouter`](https://docs.haystack.deepset.ai/docs/textlanguagerouter) will come in handy.\n"
  ]
  },
- {
- "cell_type": "markdown",
- "metadata": {
- "id": "oBa4Q25cGTr6"
- },
- "source": [
- "## Preparing the Colab Environment\n",
- "\n",
- "- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration)\n",
- "- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/logging)"
- ]
- },
  {
  "cell_type": "markdown",
  "metadata": {

tutorials/33_Hybrid_Retrieval.ipynb

Lines changed: 1 addition & 13 deletions
@@ -28,18 +28,6 @@
  "There are many cases when a simple keyword-based approaches like BM25 performs better than a dense retrieval (for example in a specific domain like healthcare) because a dense model needs to be trained on data. For more details about Hybrid Retrieval, check out [Blog Post: Hybrid Document Retrieval](https://haystack.deepset.ai/blog/hybrid-retrieval)."
  ]
  },
- {
- "cell_type": "markdown",
- "metadata": {
- "id": "ITs3WTT5lXQT"
- },
- "source": [
- "## Preparing the Colab Environment\n",
- "\n",
- "- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration)\n",
- "- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/setting-the-log-level)"
- ]
- },
  {
  "cell_type": "markdown",
  "metadata": {
@@ -571,4 +559,4 @@
  },
  "nbformat": 4,
  "nbformat_minor": 0
-}
+}
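
The context line in this diff motivates hybrid retrieval: merging a keyword (BM25) ranking with a dense-embedding ranking. Reciprocal rank fusion is one common way to combine two ranked lists; a minimal sketch under that assumption (not the tutorial's own code, which uses Haystack components):

```python
def reciprocal_rank_fusion(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Merge ranked ID lists; each appearance contributes 1 / (k + rank)."""
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

bm25_hits = ["d1", "d2", "d3"]   # keyword ranking
dense_hits = ["d3", "d1", "d4"]  # embedding ranking
fused = reciprocal_rank_fusion([bm25_hits, dense_hits])
assert fused[0] == "d1"  # top of one list and runner-up in the other wins
```

Documents ranked well by both retrievers rise to the top, which is the behaviour hybrid retrieval is after.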

tutorials/34_Extractive_QA_Pipeline.ipynb

Lines changed: 0 additions & 12 deletions
@@ -29,18 +29,6 @@
  "To get data into the extractive pipeline, you'll also build an indexing pipeline to ingest the [Wikipedia pages of Seven Wonders of the Ancient World dataset](https://en.wikipedia.org/wiki/Wonders_of_the_World)."
  ]
  },
- {
- "cell_type": "markdown",
- "metadata": {
- "id": "eF_hnatJUEHq"
- },
- "source": [
- "## Preparing the Colab Environment\n",
- "\n",
- "- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration)\n",
- "- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/logging)"
- ]
- },
  {
  "cell_type": "markdown",
  "metadata": {

tutorials/35_Evaluating_RAG_Pipelines.ipynb

Lines changed: 1 addition & 13 deletions
@@ -52,18 +52,6 @@
  "<iframe width=\"560\" height=\"315\" src=\"https://www.youtube.com/embed/5PrzXaZ0-qk?si=lgBSfHatbV2i59J-\" title=\"YouTube video player\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen></iframe>\n"
  ]
  },
- {
- "cell_type": "markdown",
- "metadata": {
- "id": "QXjVlbPiO-qZ"
- },
- "source": [
- "## Preparing the Colab Environment\n",
- "\n",
- "- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration)\n",
- "- [Set logging level to INFO](https://docs.haystack.deepset.ai/docs/setting-the-log-level)"
- ]
- },
  {
  "cell_type": "markdown",
  "metadata": {
@@ -382,7 +370,7 @@
  "\n",
  "In this example, we'll be using:\n",
  "- [`InMemoryEmbeddingRetriever`](https://docs.haystack.deepset.ai/docs/inmemoryembeddingretriever) which will get the relevant documents to the query.\n",
- "- [`OpenAIChatGenerator`](https://docs.haystack.deepset.ai/docs/OpenAIChatGenerator) to generate answers to queries. You can replace `OpenAIChatGenerator` in your pipeline with another `ChatGenerator`. Check out the full list of generators [here](https://docs.haystack.deepset.ai/docs/generators)."
+ "- [`OpenAIChatGenerator`](https://docs.haystack.deepset.ai/docs/openaichatgenerator) to generate answers to queries. You can replace `OpenAIChatGenerator` in your pipeline with another `ChatGenerator`. Check out the full list of generators [here](https://docs.haystack.deepset.ai/docs/generators)."
  ]
  },
  {

tutorials/40_Building_Chat_Application_with_Function_Calling.ipynb

Lines changed: 1 addition & 1 deletion
@@ -27,7 +27,7 @@
  "\n",
  "📚 Useful Sources:\n",
  "* [OpenAIChatGenerator Docs](https://docs.haystack.deepset.ai/docs/openaichatgenerator)\n",
- "* [OpenAIChatGenerator API Reference](https://docs.haystack.deepset.ai/reference/generator-api#openaichatgenerator)\n",
+ "* [OpenAIChatGenerator API Reference](https://docs.haystack.deepset.ai/reference/generators-api#openaichatgenerator)\n",
  "* [🧑‍🍳 Cookbook: Function Calling with OpenAIChatGenerator](https://github.com/deepset-ai/haystack-cookbook/blob/main/notebooks/function_calling_with_OpenAIChatGenerator.ipynb)\n",
  "\n",
  "[OpenAI's function calling](https://platform.openai.com/docs/guides/function-calling) connects large language models to external tools. By providing a `tools` list with functions and their specifications to the OpenAI API calls, you can easily build chat assistants that can answer questions by calling external APIs or extract structured information from text.\n",
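
The context line above explains the mechanism: a `tools` list of function specs goes to the API, the model replies with a call request, and your code dispatches it. A self-contained sketch of the spec shape and the dispatch step, with no API call made (`get_weather` and the pretend call are made-up examples):

```python
def get_weather(city: str) -> str:
    """Made-up tool function for the sketch."""
    return f"Sunny in {city}"

# The spec shape OpenAI's chat API expects in its `tools` parameter.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# Pretend the model asked for this call; dispatch it by name.
registry = {"get_weather": get_weather}
call = {"name": "get_weather", "arguments": {"city": "Berlin"}}
result = registry[call["name"]](**call["arguments"])
assert result == "Sunny in Berlin"
```

In a real chat loop, `result` would be sent back to the model as a tool message so it can phrase the final answer.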

tutorials/42_Sentence_Window_Retriever.ipynb

Lines changed: 0 additions & 11 deletions
@@ -24,17 +24,6 @@
  "`SentenceWindowRetriever(document_store=doc_store, window_size=2)`"
  ]
  },
- {
- "cell_type": "markdown",
- "id": "784caaa2",
- "metadata": {},
- "source": [
- "\n",
- "## Preparing the Colab Environment\n",
- "\n",
- "- [Enable GPU Runtime](https://docs.haystack.deepset.ai/docs/enabling-gpu-acceleration#enabling-the-gpu-in-colab)\n"
- ]
- },
  {
  "cell_type": "markdown",
  "id": "98c2f9d3",
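
The context line above shows `SentenceWindowRetriever(..., window_size=2)`: once a sentence matches a query, its neighbours are returned too for extra context. A dependency-free sketch of just the windowing step (not Haystack's implementation):

```python
def sentence_window(sentences: list[str], match_index: int, window_size: int) -> list[str]:
    """Return the matched sentence plus `window_size` neighbours on each side."""
    start = max(0, match_index - window_size)
    end = min(len(sentences), match_index + window_size + 1)
    return sentences[start:end]

sentences = ["s0", "s1", "s2", "s3", "s4", "s5"]
assert sentence_window(sentences, match_index=3, window_size=2) == ["s1", "s2", "s3", "s4", "s5"]
assert sentence_window(sentences, match_index=0, window_size=2) == ["s0", "s1", "s2"]  # clamped at the start
```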
