
Conversation

@aufi
Member

@aufi aufi commented Nov 7, 2025

In the java provider, there is a .metadata/.log file with the full output of jdtls. This might be needed for debugging, so this PR adds it to analysis task attachments (under the more descriptive provider-jdtls.log name).

Konveyor-related entries of this log were already part of the provider log, but this PR adds the full content of the jdtls log file.

Fixes: #177

Summary by CodeRabbit

  • New Features
    • Diagnostic upload now automatically includes available metadata logs alongside primary outputs and dependencies.
    • Improves troubleshooting by ensuring richer diagnostic data is submitted when a metadata log exists, with existing error handling preserved.


coderabbitai bot commented Nov 7, 2025

Walkthrough

Added functionality to cmd/analyzer.go to extract and expose the full jdtls language server log file. The analyzer now copies the metadata log to a provider-specific filename, uploads it, and attaches it to the output workflow, with proper error handling for file operations.

Changes

Cohort / File(s): Log file upload integration — cmd/analyzer.go
Summary: Added io package import; introduced a conditional block in Run() to detect, copy, rename, upload, and attach the jdtls metadata log file, with error handling for the open, create, copy, and upload operations.

Sequence Diagram(s)

sequenceDiagram
    participant Run as Run Function
    participant FS as File System
    participant API as addon.File API
    
    Run->>FS: Check if Dir/.metadata/.log exists
    alt File exists
        Run->>FS: Open metadata log
        Run->>FS: Create Dir/provider-jdtls.log
        Run->>FS: Copy metadata log contents
        Run->>API: Post file (upload)
        alt Upload successful
            Run->>Run: Attach uploaded file to output
        else Upload fails
            Run->>Run: Return error
        end
    else File not found
        Note over Run: Continue (no-op)
    end

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

  • Single file modification with focused scope
  • Straightforward file I/O and upload operations
  • Standard error handling pattern applied consistently across multiple operations
  • No complex logic or structural changes

Poem

🐰 A logger's heart beats full and bright,
No secrets hid from Hub's keen sight,
The jdtls tale now flows complete,
From server logs to user's sheet,
Debugging dreams, debugger's treat!

Pre-merge checks and finishing touches

✅ Passed checks (5 passed)
  • Description Check — ✅ Passed — Check skipped: CodeRabbit's high-level summary is enabled.
  • Linked Issues Check — ✅ Passed — The PR meets the requirement from issue #177: it exposes the full jdtls logs from the analyzer by uploading the complete .metadata/.log file as provider-jdtls.log to analysis attachments.
  • Out of Scope Changes Check — ✅ Passed — All changes are directly related to the objective: importing the io package and adding logic to copy, upload, and attach the jdtls log file.
  • Docstring Coverage — ✅ Passed — No functions found in the changed files to evaluate docstring coverage; skipping docstring coverage check.
  • Title Check — ✅ Passed — The title clearly and accurately describes the main change: uploading the jdtls log to analysis attachments, matching the PR's core objective of exposing full language server logs.


@aufi
Member Author

aufi commented Nov 7, 2025

Example log file content:


@aufi aufi changed the title Upload jdtls logs to analysis attachments ✨ Upload jdtls log to analysis attachments Nov 7, 2025

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (1)
cmd/analyzer.go (1)

59-85: Consider flushing the destination file before upload.

The implementation correctly copies the metadata log and follows the established upload pattern. However, there's a reliability concern: the destination file remains open when addon.File.Post is called on line 79, since the defer dst.Close() on line 73 won't execute until the function returns. While the data from io.Copy is typically visible to other readers on modern systems, explicitly flushing ensures the log is completely written to disk before upload, preventing potential incomplete or buffered data issues.

Apply this diff to explicitly sync the file before uploading:

 		_, err = io.Copy(dst, src)
 		if err != nil {
 			return
 		}
+		err = dst.Sync()
+		if err != nil {
+			return
+		}
 		// Upload the renamed file
 		f, pErr = addon.File.Post(renamedLog)
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c481991 and ab0ff5e.

📒 Files selected for processing (1)
  • cmd/analyzer.go (2 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
cmd/analyzer.go (1)
cmd/main.go (1)
  • Dir (23-23)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: build-and-upload-for-global-ci
🔇 Additional comments (1)
cmd/analyzer.go (1)

4-4: LGTM: Import addition is correct.

The io package is needed for the io.Copy operation introduced later in the file.

Reviewed hunk (cmd/analyzer.go):

	}
	addon.Attach(f)
}
metadataLog := path.Join(Dir, ".metadata", ".log")
Contributor

This seems like a java provider specific concern. The java provider knows about jdtls and these log files and should be responsible for logging them properly. I don't think we should be putting provider specific logic in the addon.

Member Author

I agree this file is provider-specific (present for the java provider only). The reason for not putting it through the "standard" tackle2-addon command logging with Reporter is primarily its size (typically 100MB per application or more).

Fragments of this file are already added to the java provider output (executed via the analyzer); those lines are filtered for konveyor-related content. However, it turned out to be critical to have the full jdtls log to debug possible failures from user error reports where jdtls failed with something, so we need the entire content of that file.

This file is already created and present on the analysis task pod filesystem, so I'd like to find an acceptable way to get it into the Hub files attached to the analysis. Theoretically it could be requested with data on a Task, but that looks risky to me (and we'd need to filter allowed paths anyway, to not leak e.g. credentials).

I'm not sure providers have a reasonable way to report this file directly. I'm also thinking about extending the filter for .metadata/.log https://github.com/konveyor/analyzer-lsp/blob/main/external-providers/java-external-provider/pkg/java_external_provider/provider.go#L496-L511 but that doesn't look like a reliable solution for covering unexpected failures and all relevant information from the log.

@aufi
Member Author

aufi commented Dec 4, 2025

Closing this PR, as this is going to be resolved in a different way:

Proposed solution: use the existing addon Data.Verbosity field to tell the analyzer (and its providers) the expected log level, which the java-external-provider will use to decide when the full jdtls log should be sent to its stdout.

Steps:

  1. analyzer-lsp - update provider_config.json files with an optional logLevel field (CLI args would take priority if both were specified), or push back if there is no suitable solution on the analyzer side
  2. java-external-provider - update logging of the jdtls command execution according to logLevel, to optionally provide the full jdtls log
  3. addon-analyzer - pass data.Verbosity to the generated initConfig (if >0; note, the analyzer uses 5 as its default "normal" log level)
  4. decide whether this update is also needed for containerized (run-local=false) kantra

cc @dymurray

@aufi aufi closed this Dec 4, 2025


Development

Successfully merging this pull request may close these issues.

Expose LS full logs from analyzer to Hub and UI

2 participants