
read_point_cloud tensor API crashes with large Point Clouds (Segmentation Fault) #7402

@photosartd

Description



Describe the issue

Hi, I looked for a similar issue but could not find one. When reading a large binary .pcd point cloud (~1.2 billion points) with xyz plus a custom field "labels", the process crashes with a "Segmentation fault" message. The same code works for smaller .pcd files, and the legacy API can open even larger .pcd files (though it does not support custom fields).

Here is the header of the file I tried to open:

VERSION 0.7
FIELDS x y z labels
SIZE 4 4 4 4
TYPE F F F U
COUNT 1 1 1 1
WIDTH 1201172100
HEIGHT 1
VIEWPOINT 0.0 0.0 0.0 1.0 0.0 0.0 0.0
POINTS 1201172100
DATA binary
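
For reference, a quick back-of-the-envelope check (numbers taken from the header above) shows the binary payload the header implies:

points = 1_201_172_100              # POINTS from the header
bytes_per_point = 4 + 4 + 4 + 4     # SIZE 4 4 4 4, COUNT 1 1 1 1
payload = points * bytes_per_point  # 19,218,753,600 bytes
print(payload / 2**30)              # ~17.9 GiB, consistent with the ~18 GB file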

I know I have enough RAM for this (600+ GB), and the whole .pcd file is 18 GB, which matches the header. I was able to open it successfully with pypcd4, but that route requires an extra copy: the data is first read into numpy and then transferred into the Open3D Tensor API.
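
A minimal sketch of that workaround (assuming pypcd4's PointCloud.from_path and numpy() return the fields in header order; exact dtype handling in pypcd4 may differ):

import numpy as np
import open3d as o3d
from pypcd4 import PointCloud

pc = PointCloud.from_path(path)  # path: the .pcd file described above
arr = pc.numpy()                 # (N, 4) array: x, y, z, labels

pcd = o3d.t.geometry.PointCloud()
# Column slices are non-contiguous, so each attribute needs a contiguous copy.
pcd.point["positions"] = o3d.core.Tensor.from_numpy(
    np.ascontiguousarray(arr[:, :3], dtype=np.float32))
pcd.point["labels"] = o3d.core.Tensor.from_numpy(
    np.ascontiguousarray(arr[:, 3], dtype=np.uint32))  # TYPE U, SIZE 4 per the header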

I saw many similar problems in existing issues, so I am not expecting a fast fix; I am mainly reporting this in case anyone else encounters the same crash.

Steps to reproduce the bug

import open3d as o3d

tensor_pcd = o3d.t.io.read_point_cloud(path)  # path: the 18 GB binary .pcd described above

Error message

Segmentation fault

Expected behavior

No response

Open3D, Python and System information

- Operating system: Debian GNU/Linux 12 (bookworm)
- Python version: Python 3.11.14
- Open3D version: 0.19.0
- System architecture: x86
- Is this a remote workstation?: yes
- How did you install Open3D?: pip

Additional information

No response
