### Bonus: download a full crawl index and query with DuckDB

If you want to run many of these queries, and you have a lot of disk space, you'll want to download the 300-gigabyte index and query it repeatedly.

> [!IMPORTANT]
> If you happen to be using the Common Crawl Foundation development server, we've already downloaded these files, and you can run `make duck_ccf_local_files`.
To download the crawl index, there are two options: if you have access to the CCF AWS buckets, run:
If you don't have access through the AWS CLI:
```shell
rm cc-index-table.paths
cd -
```

Either way, the file structure should look something like this:

```shell
tree my_data
my_data
```

Then, you can run `make duck_local_files LOCAL_DIR=/path/to/the/downloaded/data` to run the same query as above, but this time using your local copy of the index files.
Both `make duck_ccf_local_files` and `make duck_local_files LOCAL_DIR=/path/to/the/downloaded/data` run the same SQL query and should return the same record (written as a Parquet file).