fix: prevent markdown converter hang and support label custom fields #44

Draft
Hinne1 wants to merge 3 commits into main from claude/fix-converter-and-fields

Conversation


Hinne1 (Contributor) commented Apr 7, 2026

Summary

Two independent bug fixes discovered while creating NX-15439 — both surfaced from the same incident response session.

1. Markdown converter could hang indefinitely

atl issue create --description '...' could silently spin forever on certain indented markdown — most reliably an indented fenced code block inside an ordered list:

1. Verify the token works:
    ```
    curl https://api.example.com/v1/me
    ```

Root cause: parseBlocks checked strings.HasPrefix(line, "```") against the raw line, so the leading whitespace caused indented fences to slip past every block branch and fall into parseParagraph. parseParagraph then broke immediately on the trimmed prefix and returned consumed=0. The dispatch loop did i += 0 and re-dispatched the same line forever. No timeout, no error.

Fix:

  • parseBlocks trims before checking the ``` prefix so indented fences are parsed as code blocks.
  • parseParagraph is now defensive: it only treats block-prefix lines as a break condition after at least one paragraph line has been collected. This guarantees consumed >= 1, so any future dispatcher gap can no longer cause an infinite loop.
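A minimal Go sketch of the defensive paragraph parser described above (the function names come from the PR; the body is simplified — the real converter builds ADF nodes and recognizes more block shapes):

```go
package main

import (
	"fmt"
	"strings"
)

// isBlockPrefix reports whether a trimmed line looks like the start of a
// non-paragraph block. The prefix set here is illustrative.
func isBlockPrefix(trimmed string) bool {
	for _, p := range []string{"```", "#", ">", "- ", "1. ", "---"} {
		if strings.HasPrefix(trimmed, p) {
			return true
		}
	}
	return false
}

// parseParagraph collects lines into a paragraph. It only treats a
// block-shaped line as a break condition AFTER at least one line has been
// collected, so it always returns consumed >= 1 — the dispatcher can never
// loop on i += 0 again.
func parseParagraph(lines []string, i int) (text string, consumed int) {
	var collected []string
	for j := i; j < len(lines); j++ {
		trimmed := strings.TrimSpace(lines[j])
		if trimmed == "" {
			break // blank line always ends a paragraph
		}
		if len(collected) > 0 && isBlockPrefix(trimmed) {
			break // block prefix only breaks after the first line
		}
		collected = append(collected, trimmed)
	}
	if len(collected) == 0 {
		// Defensive backstop: consume the current line as literal text.
		collected = []string{strings.TrimSpace(lines[i])}
	}
	return strings.Join(collected, " "), len(collected)
}

func main() {
	// The line that previously hung the converter: an indented fence that
	// reaches the paragraph parser is now consumed as literal text.
	text, consumed := parseParagraph([]string{"    ```"}, 0)
	fmt.Println(consumed, text)
}
```

The corresponding parseBlocks change (trimming before the fence check) is a one-line fix and is omitted here.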

2. Label-typed custom fields rejected by --field

Custom fields of type labels (e.g. NX Repo and Application — schema.custom = "...customfieldtypes:labels", schema.type = "array", schema.items = "string") returned 400 because the CLI sent the bare string instead of an array:

$ atl issue edit NX-15439 --field "Repo=API"
Error: 400 Bad Request: customfield_10410: Specify the value for Repo in an array of strings

The previous code only handled labels when Schema.Custom == "" (the system Labels field), missing every custom-label field. The workaround was building a JSON file and using --field-file for what should be a single-value flag.

Fix:

  • Added Items to FieldSchema so the parser can read the schema's element type.
  • coerceFieldValue now detects label-typed fields three ways: Schema.Custom contains "labels", OR Type=="array" && Items=="string", OR Type=="array" && Custom=="" (preserved behavior). All coerce to []string.
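The detection logic can be sketched as follows (FieldSchema, Items, and coerceFieldValue are names from the PR; this body is a simplified stand-in, not the actual implementation):

```go
package main

import (
	"fmt"
	"strings"
)

// FieldSchema mirrors the relevant bits of a Jira field schema.
// Items is the field added by this PR so the parser can see the
// array element type.
type FieldSchema struct {
	Type   string // e.g. "array"
	Items  string // e.g. "string"
	Custom string // e.g. "...customfieldtypes:labels"; "" for system fields
}

// isLabelLike sketches the three detection paths described above.
func isLabelLike(s FieldSchema) bool {
	switch {
	case strings.Contains(s.Custom, "labels"): // custom label fields
		return true
	case s.Type == "array" && s.Items == "string": // any string array
		return true
	case s.Type == "array" && s.Custom == "": // preserved system Labels path
		return true
	}
	return false
}

// coerceLabels splits a comma-separated --field value into the []string
// that the Jira API expects for label-typed fields.
func coerceLabels(raw string) []string {
	parts := strings.Split(raw, ",")
	for i := range parts {
		parts[i] = strings.TrimSpace(parts[i])
	}
	return parts
}

func main() {
	repo := FieldSchema{Type: "array", Items: "string", Custom: "...customfieldtypes:labels"}
	fmt.Println(isLabelLike(repo), coerceLabels("API, MTS"))
}
```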

After this:

atl issue edit NX-15439 --field "Repo=API" --field "Application=MTS"

works without --field-file.

Test plan

  • TestMarkdownToADF_IndentedCodeFenceInList — repros the original hang, guarded by 2s timeout
  • TestMarkdownToADF_IndentedCodeFenceTopLevel — top-level indented fence
  • TestParseParagraph_NeverReturnsZero — table-driven check for every block-prefix edge case (#, ```, >, -, 1., ---)
  • TestCoerceFieldValue_LabelsCustomField — repros the field 400 error, single + comma-separated
  • TestCoerceFieldValue_StandardLabelsField — preserves system Labels behavior
  • TestCoerceFieldValue_SelectStillWorks / _RadioStillWorks — regression guards
  • go test ./... — full suite passes
  • make lint — zero issues
  • End-to-end verified against prod Jira: built from this branch and successfully created NX-15441 with the original hang-triggering markdown AND --field "Repo=API" --field "Application=MTS" in a single command (NX-15441 then closed).

Hinne1 added 2 commits April 7, 2026 13:35
The markdown→ADF converter could hang indefinitely on certain indented
content — most reliably an indented fenced code block inside an ordered
or bulleted list:

    1. Verify the token works:
        ```
        curl https://api.example.com/v1/me
        ```

What happened:
1. parseOrderedList saw the indented "```" line, didn't recognize it as
   a list item, and broke — consuming only the "1." line.
2. parseBlocks then dispatched the "    ```" line, but its check used
   the raw line (strings.HasPrefix(line, "```")) so
   the leading whitespace caused it to slip past every block branch.
3. It fell into parseParagraph, which broke immediately on the trimmed
   "```" prefix and returned consumed=0.
4. parseBlocks did `i += 0` and re-dispatched the same line forever.

The user-visible symptom was `atl issue create --description '...'`
hanging with no output, no error, no timeout — silent infinite loop.

Two-part fix:
- parseBlocks now trims before checking the "```" prefix, so indented
  fences are routed to parseCodeBlock as they should be.
- parseParagraph now guarantees consumed >= 1 by only treating
  block-shaped lines as a break condition AFTER at least one paragraph
  line has been collected. If a block-shaped line slips through to the
  paragraph parser, it is consumed as literal paragraph text rather
  than causing a hang. This is a defensive backstop for any future
  dispatcher gap.

Tests:
- TestMarkdownToADF_IndentedCodeFenceInList — repros the original hang,
  guarded by a 2s timeout
- TestMarkdownToADF_IndentedCodeFenceTopLevel — top-level indented fence
- TestParseParagraph_NeverReturnsZero — table-driven check that the
  paragraph parser always consumes at least one line for any block-prefix
  edge case (indented #, ```, >, -, 1., ---)

Discovered while creating NX-15439.
Custom fields with type "labels" (e.g. NX `Repo`, `Application` —
schema.custom = "...customfieldtypes:labels", schema.type = "array",
schema.items = "string") were rejected by Jira because the CLI sent the
bare string instead of an array:

    atl issue edit NX-15439 --field "Repo=API"
    Error: 400 Bad Request: customfield_10410: Specify the value for
    Repo in an array of strings

The previous code only handled labels when Schema.Custom was empty
(the system "labels" field), missing every custom-label field. The
workaround was building a JSON file and using --field-file, which is
cumbersome for a single value.

Fix:
- Add Items field to FieldSchema so the parser can read the schema's
  array element type ("string" for label fields).
- coerceFieldValue now detects label-typed fields three ways:
  1. Schema.Custom contains "labels" (custom label fields)
  2. Schema.Type=="array" + Schema.Items=="string" (any string-array)
  3. Schema.Type=="array" + Schema.Custom=="" (preserved behavior for
     the system Labels field)
  All three are coerced to a comma-split []string.

After this:
    atl issue edit NX-15439 --field "Repo=API" --field "Application=MTS"
works without --field-file.

Tests cover custom labels (Repo), system labels, and verify select /
radio coercion is not regressed by the new label detection.

Discovered while creating NX-15439.

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request addresses an infinite loop in the markdown parser triggered by indented code fences and extends Jira field coercion to support custom label fields. The markdown parsing logic now correctly handles indented fences by trimming whitespace and guarantees that parseParagraph always consumes at least one line to prevent hangs. In the Jira field utility, the logic for identifying label-type fields was expanded to include custom fields. Feedback was provided regarding the need to filter out empty strings when parsing comma-separated labels to avoid potential API rejection.

Jira rejects labels containing empty strings with a 400. Previously
inputs like `Repo=API,,GUI`, `Repo=,API,GUI`, or `Repo=API, ,GUI`
preserved the empty entries verbatim and produced API rejections that
surfaced as confusing validation errors.

Filter empty strings after trimming so:
  "API,,GUI"  -> ["API", "GUI"]
  "API,GUI,"  -> ["API", "GUI"]
  ",,,"       -> []

Addresses gemini-code-assist review feedback on PR #44.
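The filtering described in this commit can be sketched as (splitLabels is a hypothetical helper name; the actual change lives in coerceFieldValue):

```go
package main

import (
	"fmt"
	"strings"
)

// splitLabels splits a comma-separated --field value and drops entries that
// are empty after trimming, since Jira rejects empty labels with a 400.
func splitLabels(raw string) []string {
	out := []string{}
	for _, p := range strings.Split(raw, ",") {
		if t := strings.TrimSpace(p); t != "" {
			out = append(out, t)
		}
	}
	return out
}

func main() {
	fmt.Println(splitLabels("API,,GUI")) // empty entry dropped
	fmt.Println(splitLabels("API,GUI,")) // trailing comma dropped
	fmt.Println(splitLabels(",,,"))      // nothing left
}
```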