
Docformatter produces error when parsing docstring with Ruff noqa at the end #346

@Eutropios

Description


When a docstring ends with a noqa directive (used to suppress one-off linting errors in a docstring), docformatter throws an error.

Example file:

"""
conf.py
~~~~~~~

Copyright (C) 2024 Noah Jenner under CC BY-SA 4.0.
To view a copy of this license, visit https://creativecommons.org/licenses/by-sa/4.0/

------------------------------------------------------------------------

Configuration file for Sphinx.
"""  # noqa: D205, D400

Error output:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\noahj\.local\bin\docformatter.exe\__main__.py", line 10, in <module>
    sys.exit(main())
             ~~~~^^
  File "C:\Users\noahj\AppData\Roaming\uv\data\tools\docformatter\Lib\site-packages\docformatter\__main__.py", line 144, in main
    return _main(
        sys.argv,
    ...<2 lines>...
        standard_in=sys.stdin,
    )
  File "C:\Users\noahj\AppData\Roaming\uv\data\tools\docformatter\Lib\site-packages\docformatter\__main__.py", line 134, in _main
    return formator.do_format_files()
           ~~~~~~~~~~~~~~~~~~~~~~~~^^
  File "C:\Users\noahj\AppData\Roaming\uv\data\tools\docformatter\Lib\site-packages\docformatter\format.py", line 649, in do_format_files
    result = self._do_format_file(filename)
  File "C:\Users\noahj\AppData\Roaming\uv\data\tools\docformatter\Lib\site-packages\docformatter\format.py", line 815, in _do_format_file
    formatted_source = self._do_format_code(source)
  File "C:\Users\noahj\AppData\Roaming\uv\data\tools\docformatter\Lib\site-packages\docformatter\format.py", line 876, in _do_format_code
    _code = tokenize.untokenize(self.new_tokens)
  File "C:\Users\noahj\AppData\Roaming\uv\data\python\cpython-3.14.2-windows-x86_64-none\Lib\tokenize.py", line 341, in untokenize
    out = ut.untokenize(iterable)
  File "C:\Users\noahj\AppData\Roaming\uv\data\python\cpython-3.14.2-windows-x86_64-none\Lib\tokenize.py", line 262, in untokenize
    self.add_whitespace(start)
    ~~~~~~~~~~~~~~~~~~~^^^^^^^
  File "C:\Users\noahj\AppData\Roaming\uv\data\python\cpython-3.14.2-windows-x86_64-none\Lib\tokenize.py", line 178, in add_whitespace
    raise ValueError("start ({},{}) precedes previous end ({},{})"
                     .format(row, col, self.prev_row, self.prev_col))
ValueError: start (11,5) precedes previous end (12,0)
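The ValueError itself comes from CPython's `tokenize.untokenize`, which refuses to emit a token whose start position precedes the previous token's end. The sketch below is not docformatter's code; it is a minimal, hypothetical reproduction of the same failure mode, assuming docformatter rewrites the docstring token without adjusting the position of the trailing `# noqa` COMMENT token:

```python
import io
import tokenize

# Tokenize a line that ends with a noqa comment.
src = 'x = 1  # noqa: D205\n'
toks = list(tokenize.generate_tokens(io.StringIO(src).readline))

# Simulate the suspected position overlap: shift the COMMENT token so it
# starts at column 3, before the previous token ("1") ends at column 5.
broken = [
    t._replace(start=(1, 3)) if t.type == tokenize.COMMENT else t
    for t in toks
]

try:
    tokenize.untokenize(broken)
except ValueError as exc:
    print(exc)  # start (1,3) precedes previous end (1,5)
```

This reproduces the same `start (...) precedes previous end (...)` message seen in the traceback above, which suggests the bug is in how docformatter rebuilds token positions around the comment rather than in `tokenize` itself.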

This occurs regardless of the options passed on the command line.

Removing the directives does fix the problem, though doing so reintroduces the linting errors reported by Ruff or other linters.
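Until this is fixed, one workaround that avoids the trailing comment entirely is to move the suppressions into Ruff's configuration instead of an inline noqa. A sketch, assuming the file is `conf.py` and the codes from the example above:

```toml
# pyproject.toml — suppress D205/D400 for this file via Ruff's
# per-file-ignores table instead of a trailing "# noqa" comment.
[tool.ruff.lint.per-file-ignores]
"conf.py" = ["D205", "D400"]
```

This keeps the docstring free of comments that docformatter's tokenizer pass trips over, at the cost of suppressing the rules for the whole file rather than one docstring.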
