lakeFS metadata Spark client
As part of "lakeFS on the Rocks", we're designing client code that will let you write programs that process your commits. We'd like to create a Spark client for reading inventories of committed data (metaranges). These inventories include lakeFS paths, paths to the objects on the underlying storage, and some additional metadata per object.
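As a rough illustration, here is a minimal sketch of how such an inventory might be processed from Spark. The storage path, reader entry point, and column names are assumptions for illustration only, not the client's actual API.

```scala
import org.apache.spark.sql.SparkSession

object InventoryExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("lakefs-metadata-example")
      .getOrCreate()

    // Hypothetical: read a commit's inventory (metarange) as Parquet from object storage.
    val inventory = spark.read.parquet("s3a://example-bucket/_lakefs/metaranges/<commit-id>")

    // Each row would describe one committed object: its lakeFS path, the backing
    // object-store address, and per-object metadata such as size.
    inventory
      .select("lakefs_path", "physical_address", "size_bytes")
      .show(10, truncate = false)

    spark.stop()
  }
}
```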
We'd like some information on which versions you would like us to support.
I would like to use Spark to process inventories of lakeFS committed data.
If you would like to use some other batch engine, please select "Other".
I can use these versions of Spark:
I prefer to be able to read the implementation in:
Anything else we should know about metadata clients?
[optional] Please email me a summary of the results to:
Thanks!