IP & Data Security Agreement

By participating in the LLM evaluation tasks hackathon with Apart, you agree to the following:

Intellectual Property Rights

You own what you make. Apart and METR can promote it and use it for benchmarks and future research in collaboration with others. In legal terms, this means that you grant Apart and METR a perpetual, non-exclusive, transferable, irrevocable, worldwide license to your submission.

Don't Let AIs Train On Your Dataset

Tasks and solutions must stay secret. Please avoid discussing the tasks with chatbots that remember or train on your conversations. Doing research with GitHub Copilot is not fine. Using private Google Docs is fine. Using private Google Colab notebooks is fine (per Google's ToS). Using a private GitHub repository is fine. Making any of these public is not fine.

Due to these restrictions, please use the provided template package and setup tooling for task-based model evaluation research. Please briefly review the data privacy guidelines of any AI services you use, and enable any available data-security or training opt-out settings.

Statement of Compliance

You will be required to include a statement of compliance detailing any ways you used AI in your project, so that we can trace any potential training-data violations.

We Must Protect the Tasks

If solutions leak, the tasks lose value: anything that goes public makes the tasks you create significantly less useful and may disqualify them from the bounty. These requirements protect the validity of our model evaluation work.

Unlike in other hackathons, your submitted project will not be made publicly available upon submission.

Email
By entering your email here, you agree to follow the above terms.