[SPARK-50113][CONNECT][PYTHON][TESTS] Add @spark_connect_only to check the APIs that are only supported with Spark Connect #48651

Open
wants to merge 5 commits into base: master

Conversation

itholic (Contributor) commented Oct 25, 2024

What changes were proposed in this pull request?

This PR proposes to add @spark_connect_only to check the APIs that are only supported with Spark Connect.
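
For illustration, a minimal sketch of what such a decorator could look like; the tagging attribute and this exact shape are assumptions, not necessarily the PR's actual implementation:

from typing import Callable, TypeVar

F = TypeVar("F", bound=Callable)

def spark_connect_only(f: F) -> F:
    """Mark an API as available only with Spark Connect.

    A compatibility test can then discover marked methods through this
    attribute instead of inspecting their source text.
    """
    f._spark_connect_only = True  # attribute name assumed for illustration
    return f

# Usage at the declaration site, e.g. on a Connect-only method such as addTag:
#
#     @spark_connect_only
#     def addTag(self, tag: str) -> None:
#         ...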

Why are the changes needed?

The current compatibility check cannot capture missing methods that are only supported with Spark Connect.

Does this PR introduce any user-facing change?

No, it's test-only

How was this patch tested?

Updated the existing UT

Was this patch authored or co-authored using generative AI tooling?

No

Comment on lines +270 to +285
expected_missing_connect_methods = {
    "addArtifact",
    "addArtifacts",
    "addTag",
    "clearProgressHandlers",
    "clearTags",
    "copyFromLocalToFs",
    "getTags",
    "interruptAll",
    "interruptOperation",
    "interruptTag",
    "newSession",
    "registerProgressHandler",
    "removeProgressHandler",
    "removeTag",
}
Contributor Author

Now the compatibility check respects ONLY_SUPPORTED_WITH_SPARK_CONNECT.

cc @HyukjinKwon @hvanhovell FYI

if (
    inspect.isfunction(method) or isinstance(method, functools._lru_cache_wrapper)
) and not name.startswith("_"):
    source_lines = inspect.getsource(method).upper()
    if "ONLY_SUPPORTED_WITH_SPARK_CONNECT" in source_lines:
Member

This is too flaky.

Member

Let's probably not do this for now.

Contributor Author

Agreed that it's flaky, but I believe this is currently the only way to check functions that are only supported with Spark Connect.
Should we just close it? Also cc @hvanhovell

Member

I don't think we should rely on reading/checking the source code itself in the test. That isn't what the signature comparison / compatibility test is meant for.

Contributor

How about using a new decorator @SparkConnectOnly, which doesn't suffer from flakiness? Instead of keeping a list in test_spark_session_compatibility, we would spread the items to their declaration sites.
That would also make people more aware of adding such a decorator when needed.
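
For instance, the test side could then collect the marked methods from their declaration sites rather than string-matching source. A sketch under the assumption that the decorator tags functions with a _spark_connect_only attribute (names here are illustrative, not the PR's final code):

import inspect

def collect_spark_connect_only(cls):
    """Return the names of public methods tagged by the decorator."""
    return {
        name
        for name, method in inspect.getmembers(cls, inspect.isfunction)
        if not name.startswith("_") and getattr(method, "_spark_connect_only", False)
    }

A test could then compare such a collected set against an expectation like the expected_missing_connect_methods set shown above.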

Contributor Author

> How about using a new decorator @SparkConnectOnly

It sounds like a reasonable suggestion.
Let me create a separate PR, and I'll ping you when it's ready.

Contributor Author

@xupefei I just applied the comments. Could you review again when you find some time? Thanks!

Also cc @HyukjinKwon: we now apply the @spark_connect_only decorator to APIs that are only supported with Spark Connect and use it in the test instead of string comparison.

@itholic changed the title from "[SPARK-50113][CONNECT][PYTHON][TESTS] Compatibility check should respect ONLY_SUPPORTED_WITH_SPARK_CONNECT" to "[SPARK-50113][CONNECT][PYTHON][TESTS] Add @spark_connect_only to check the APIs that are only supported with Spark Connect" on Nov 15, 2024
The github-actions bot added the CORE label on Nov 15, 2024