Replies: 2 comments
-
Currently, authz supports Spark 3.0 to 3.3, and each supported version is validated by CI. If you are asking whether an authz build against one Spark version is compatible with all of the supported versions, that cross-version compatibility is not covered by CI.
We use reflection almost everywhere in the authz module to instantiate and match logical plan nodes, so if some method arguments are changed, the corresponding invocations may need to change as well. Since it is a fairly common case that vendors modify Catalyst, there is room for improvement in adapting to such changes. We borrowed some reflection tools from Apache Parquet, which allow writing reflective invocations in a chained style. For example, suppose a class `foo` has a method `bar` that takes one argument `a`, and internally it was changed to take two arguments `a, b`. To adapt to that internal change, you only need:

```
DynMethods.builder("bar")
  .impl(foo, a)
+ .impl(foo, a, b) // FOR INTERNAL CHANGE
  .buildChecked
```
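To make the fallback behavior concrete, here is a minimal, self-contained sketch of the same idea written with plain Java reflection rather than the actual DynMethods helper. The class `Foo`, its method `bar`, and the argument types are hypothetical stand-ins for the `foo` / `a` / `b` above, and the default value for the extra argument is an assumption for illustration only.

```scala
// Hypothetical vendor scenario: upstream ships Foo.bar(String), a vendor fork
// changes it to Foo.bar(String, Int). The helper tries the upstream signature
// first and falls back to the modified one, mirroring the chained .impl(...) idea.
object ReflectiveBar {
  def callBar(target: AnyRef, a: String, bDefault: Int = 0): Any = {
    val cls = target.getClass
    try {
      // upstream signature: bar(String)
      cls.getMethod("bar", classOf[String]).invoke(target, a)
    } catch {
      case _: NoSuchMethodException =>
        // vendor-changed signature: bar(String, Int), filled with a default value
        cls.getMethod("bar", classOf[String], classOf[Int])
          .invoke(target, a, Int.box(bDefault))
    }
  }
}
```

In the DynMethods style above, the same fallback is expressed declaratively: each `.impl(...)` registers a candidate signature, and building the method picks whichever candidate actually resolves against the Spark classes on the classpath at runtime.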
-
cc @bowenliang123 (one of the primary authors of this module)
-
Now that we know the authz module loads its rules from JSON files, is that enough to handle multiple versions of Spark?
On the other hand, we have implemented some features in our Spark engine that may change attributes of the logical plan. Would it be possible to add a new feature that loads the JSON files from a location specified by configuration?
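To make the question concrete, here is a hedged sketch of what a config-driven spec loader could look like: it reads a command-spec JSON file from a user-provided directory when one is configured, and otherwise falls back to the file bundled in the authz jar. The config idea, the `CommandSpec` shape, the `SpecLoader` object, and the `table_command_spec.json` file name used here are assumptions for illustration, not an existing Kyuubi API.

```scala
import java.nio.charset.StandardCharsets
import java.nio.file.{Files, Paths}

import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

// Simplified spec shape; the real spec files carry more fields per plan node.
case class CommandSpec(classname: String, opType: String)

object SpecLoader {
  private val mapper = new ObjectMapper().registerModule(DefaultScalaModule)

  def loadTableCommandSpecs(customDir: Option[String]): Array[CommandSpec] = {
    val json = customDir match {
      case Some(dir) =>
        // hypothetical override, e.g. driven by a conf such as an authz spec-dir setting
        new String(
          Files.readAllBytes(Paths.get(dir, "table_command_spec.json")),
          StandardCharsets.UTF_8)
      case None =>
        // default: the spec file bundled inside the authz jar
        val in = getClass.getClassLoader.getResourceAsStream("table_command_spec.json")
        try scala.io.Source.fromInputStream(in).mkString finally in.close()
    }
    mapper.readValue(json, classOf[Array[CommandSpec]])
  }
}
```

This is only meant to show that a configurable spec path and the bundled defaults can coexist; whether such a feature fits the authz module is for the maintainers to decide.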