Horizon Europe & Eureka - How to (not) use AI in grant proposal writing
This short guide gives you insight into the risks of using Generative AI (GenAI) while writing a project proposal. It also covers which questions you should ask yourself, your consortium partners and your grant consultant.
Rapid development of new AI tools
New Generative AI (GenAI) models enter the market every day. Providers based in the US and China in particular release new versions for different user groups. These models often form the basis for tools that can support the grant process. However, when you use GenAI, several things can be unclear, for example:
- Which AI model is behind a certain tool?
- What are the user conditions?
- What happens with the data you enter and generate?
- Who could use this data?
Somebody in your consortium or your grant consultant may be using an AI model without your knowledge. This can pose risks, especially in the domain of research and innovation.
Consequences for eligibility
Not adhering to the AI guidelines of funding programmes can make a proposal ineligible
An example is the guidance in the standard application form for CSA proposals (pdf) of the Horizon Europe programme. Page 32 of this form states that applicants:
- are fully responsible for parts that are produced by an AI tool;
- must be transparent concerning the tools they used;
- must be able to provide a list of sources used to generate/rewrite content and citations; and
- should be conscious of potential plagiarism caused by text produced by an AI tool.
This example illustrates that you should always be aware of the guidelines on the use of AI tools in the funding programme for which you are applying.
Consequences for privacy
GenAI tools are rarely fully private: treat them as open tools
Many providers of AI tools can access the input you provide to the model, and its output, to improve the model and their services. See, for example, the terms and conditions of Mistral's Le Chat; paragraph 3 ('Your Data') is especially relevant when you handle sensitive data, such as the names and skills of project members, or your organisation's financial data and intellectual property.
Often, you can turn off the use of your data for improving the model. Make sure that all consortium partners and grant consultants involved use AI tools with this setting turned off.
Consequences for patents
Do not use GenAI tools on an idea that you would like to patent later
Even if all options for using input and output for model improvement are turned off, providers of AI tools can often still monitor their tools and see the provided input. An example is the 'abuse monitoring' by OpenAI. If your input relates to a potential invention that you want to patent later, you could lose your right to a patent: providing the information to the tool could be seen as disclosing your invention.
Therefore, never enter information into AI tools that has the potential to result in a patent. You can reduce the risk by running the AI model locally through a service provider with whom you have a confidentiality agreement. You can substantially reduce the risk by running the AI model locally yourself.
Questions to ask during the proposal process
- Which specific AI models does the tool use? And are they reputable and reliable? Has the model been reviewed by independent parties for reliability, privacy, and ethical considerations?
- What organisation owns and provides the AI tool? And what are their data policies? Always carefully review the terms and conditions before you purchase (license) or use the tool.
- Where is the AI model hosted? And who has access to the server? Is the server local to the organisation providing the tool, or is it cloud-based?
- Can training on your prompts and results be disabled? And is this noted in the (license) contract?
- When is data deleted? Who has access to prompt history and outputs? And do privacy conditions align with your organisation’s standards?
- What manual abuse monitoring checks are in place? And how are they conducted?
- Do you have copyright protection on the generated content?
Practical tips:
- Do not use AI on the last day before handing in the proposal.
- Check that everything is complete and correct.
- Check for plagiarism, verify your sources, and disclose that you used AI.
Does your consultant use AI or any tools based on it? If so, consider asking them to sign an agreement on the use of AI, and ask them the questions listed above.
Do your partners use AI or any tools based on it? If so, consider signing a mutual agreement on the use of AI, and ask them the questions listed above.
Contact an advisor
Please note: as the National Contact Point and advisors for several European Commission and Eureka network funding programmes, we only support applicants in the proposal writing stage. As an applicant, you are and remain responsible for the correct use of AI solutions.
Do you have any questions about Horizon Europe? Or would you like to consult an advisor?
- Ministry of Economic Affairs