
storeChunk: Workaround for bug with parallel flushing #5510

Draft
franzpoeschel wants to merge 2 commits into ComputationalRadiationPhysics:dev from franzpoeschel:fix-span-parallel-flushing

Conversation

@franzpoeschel
Contributor

Workaround for this bug: openPMD/openPMD-api#1794

I think that this has little importance for PIConGPU, since we process Iterations collectively anyway, which makes the bug unlikely to trigger.
But I did stumble over this issue a while ago. Unfortunately I did not document it at the time and have no reproducer at hand now…

TODO:

  • Check if other places are affected. I think not; only the particle datasets may receive zero-size contributions from some ranks.
  • Check for a reproducer. I guess replacing .writeIterations() with .snapshots() and using WRITE_RANDOM_ACCESS might trigger the issue, but in that case it would be restricted to dev versions of openPMD, hence not so important.
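The suspected trigger condition (a span-based storeChunk where some MPI ranks contribute zero elements) can be sketched roughly as follows. This is a hypothetical, unverified sketch against openPMD-api's span-based storeChunk API; the file name, dataset name, and extents are made up, and whether this actually reproduces openPMD/openPMD-api#1794 has not been confirmed:

```cpp
// Hypothetical reproducer sketch for openPMD/openPMD-api#1794 (unverified).
// Assumes an MPI-enabled build of openPMD-api; names and sizes are made up.
#include <openPMD/openPMD.hpp>
#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    {
        openPMD::Series series(
            "data_%T.bp", openPMD::Access::CREATE, MPI_COMM_WORLD);

        // Note: .snapshots() instead of .writeIterations() is only
        // available in dev versions of openPMD-api.
        auto iteration = series.writeIterations()[0];
        auto charge =
            iteration.meshes["charge"][openPMD::RecordComponent::SCALAR];

        // Global dataset shared by all ranks.
        openPMD::Extent globalExtent{10};
        charge.resetDataset({openPMD::Datatype::DOUBLE, globalExtent});

        // Rank 0 writes everything; all other ranks contribute nothing.
        // The zero-size span-based storeChunk on the other ranks is what
        // presumably interacts badly with parallel flushing.
        openPMD::Offset offset{0};
        openPMD::Extent extent{rank == 0 ? globalExtent[0] : 0};
        auto view = charge.storeChunk<double>(offset, extent);
        auto span = view.currentBuffer();
        for (size_t i = 0; i < span.size(); ++i)
        {
            span[i] = 1.0; // fill the backend-provided buffer in place
        }

        iteration.close();
    }
    MPI_Finalize();
}
```

Run with more than one rank (e.g. `mpirun -n 2`) so that at least one rank actually issues a zero-size chunk.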

@franzpoeschel franzpoeschel marked this pull request as draft October 29, 2025 12:42
@psychocoderHPC psychocoderHPC added this to the 0.9.0 / next stable milestone Jan 14, 2026
@psychocoderHPC psychocoderHPC added the bug a bug in the project's code label Jan 14, 2026
