build: Add spark-4.1 profile and shims #2829
Conversation
Codecov Report

@@            Coverage Diff             @@
##              main     #2829    +/-  ##
============================================
+ Coverage     56.12%    59.50%   +3.38%
- Complexity      976      1384     +408
============================================
  Files           119       168      +49
  Lines         11743     15561    +3818
  Branches       2251      2582     +331
============================================
+ Hits           6591      9260    +2669
- Misses         4012      5002     +990
- Partials       1140      1299     +159
@andygrove @coderfender Please help review. Do we need to pass all tests now?
Test failure:
Yes, we either need the tests to pass, or we can potentially disable specific tests by updating the diff file and file issues to resolve those failures.
@andygrove Sure. Since Spark 4.1.1 is to be released soon, I will check again after upgrading to 4.1.1.
Co-authored-by: Andy Grove <agrove@apache.org>
Which issue does this PR close?
First step of #2792.
Rationale for this change
What changes are included in this PR?
- Added a spark-4.1 profile with minor shim version spark-4.1.
- Renamed src/main/spark-4.0 to src/main/spark-4.x for common shim classes shared by spark-4.0 and spark-4.1.
- Added CometSumShim and ShimSQLConf for spark-4.0 and spark-4.1 specific shims, respectively.
- Added MapStatusBuilder.scala to access org.apache.spark.scheduler.MapStatus from Java. MapStatus gained a constructor argument in Spark 4.1 and only kept compatibility for Scala code.
How are these changes tested?
Added UTs.
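The MapStatus compatibility problem described above is an instance of a general Scala/Java interop pitfall: a new constructor parameter with a default value keeps Scala call sites compiling, but Java call sites cannot use defaults, so a small Scala builder object gives Java one stable entry point. The sketch below is illustrative only (Status and StatusBuilder are hypothetical names, not the actual Comet or Spark code):

```scala
// Hypothetical stand-in for a class that, like MapStatus in Spark 4.1,
// gained an extra constructor parameter with a default value.
final case class Status(sizes: Array[Long], taskId: Long = -1L)

// A Scala-side builder, analogous in spirit to MapStatusBuilder.scala:
// Java code cannot use Scala default arguments, so it calls this
// fixed-arity factory instead of the constructor directly.
object StatusBuilder {
  def build(sizes: Array[Long], taskId: Long): Status =
    Status(sizes, taskId)
}

object Demo {
  def main(args: Array[String]): Unit = {
    // Scala call sites may still rely on the default...
    val implicitDefault = Status(Array(1L, 2L))
    // ...while Java call sites go through the builder.
    val explicit = StatusBuilder.build(Array(1L, 2L), 42L)
    println(implicitDefault.taskId) // prints -1
    println(explicit.taskId)        // prints 42
  }
}
```

Keeping the builder in Scala (rather than overloading in Java) means only one place needs updating if the upstream constructor changes again.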