Horizon Europe & Eureka - How to (not) use AI in grant proposal writing

Published on: 20 May 2025

This short guide gives you insight into the risks of using Generative AI (GenAI) while writing a project proposal. It also shows which questions you should ask yourself, your partners in the consortium and your grant consultant.

Rapid development of new AI tools 

New Generative AI (GenAI) models enter the market every day. Providers based in the US and China in particular release new versions for different user groups. These models often form the basis for tools that can support the grant process. However, when you use GenAI, several things can be unclear, for example:

  • Which AI model is behind a certain tool?
  • What are the user conditions?
  • What happens with the data you enter and generate?
  • Who could use this data? 

Somebody in your consortium or your grant consultant may be using an AI model without your knowledge. This can pose certain risks, especially in the domain of research and innovation.

Consequences for eligibility

Not complying with the AI guidelines of funding programmes can make a proposal ineligible

An example is the guidance within the standard application form for CSA proposals (pdf) of the Horizon Europe programme. Page 32 of this form mentions that applicants: 

  • are fully responsible for parts that are produced by an AI tool;
  • must be transparent concerning the tools they used;
  • must be able to provide a list of sources used to generate/rewrite content and citations; and
  • should be conscious of potential plagiarism caused by text produced by an AI tool. 

This example illustrates that you should always be aware of the guidelines on the use of AI tools in the funding programme for which you are applying.
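
If you do use GenAI, keeping a record of where and how it was used makes it easier to meet such transparency requirements. The sketch below is one possible way to keep that record; it is not a prescribed format, and the file name and field names are illustrative assumptions only.

```python
import json
from datetime import date

# Illustrative sketch of an AI-use log for a proposal: one entry per
# AI-assisted passage, so the consortium can show which tool was used,
# for which section, and which sources fed the generated text.
# File name and field names are examples, not a required format.

LOG_FILE = "ai_use_log.json"

def log_ai_use(section: str, tool: str, purpose: str, sources: list[str]) -> None:
    entry = {
        "date": date.today().isoformat(),
        "section": section,   # e.g. "1.2 Methodology"
        "tool": tool,          # which GenAI tool (and version) was used
        "purpose": purpose,    # generated, rewritten, translated, ...
        "sources": sources,    # material used to generate/rewrite the text
    }
    try:
        with open(LOG_FILE, encoding="utf-8") as f:
            entries = json.load(f)
    except FileNotFoundError:
        entries = []
    entries.append(entry)
    with open(LOG_FILE, "w", encoding="utf-8") as f:
        json.dump(entries, f, indent=2)

if __name__ == "__main__":
    log_ai_use(
        section="2.1 Impact",
        tool="Example GenAI tool",
        purpose="rewrote a draft paragraph for clarity",
        sources=["consortium draft v3"],
    )
```

Such a log also gives every partner a single place to check which AI tools have already been used in the proposal.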

Consequences for privacy

GenAI tools are rarely fully private: treat anything you enter as if it could become public

Many providers of AI tools can access the input you provide to the model and its output in order to improve the model and their services. See, for example, paragraph 3, 'Your Data', in the terms and conditions of Mistral's Le Chat. This is especially important when you handle sensitive data, such as the names and skills of project members, or the financial data and intellectual property of your organisation.

Often, you can turn off the use of your data for improving the model. It is important that all involved consortium partners and grant consultants use AI tools with this setting turned off.

Consequences for patents 

Do not use GenAI tools on an idea that you would like to patent later

Even if all options for using your input and output to improve the AI model are turned off, providers of AI tools can often still monitor their tools and see the input you provide. An example of such monitoring is OpenAI's 'abuse monitoring'. If your input relates to a potential invention that you want to patent later, you could lose your right to a patent by using AI tools: providing the information to the tool could be seen as disclosing your invention.

Therefore, never enter information into AI tools that has the potential to result in a patent. You can reduce the risk if the AI model is run locally by a service provider with whom you have a confidentiality agreement. You can reduce the risk substantially if you run the AI model locally yourself.
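
As an illustration of that last option, the sketch below sends a prompt to a model served on your own machine, so the text never leaves your infrastructure. It assumes a locally running Ollama server with its default HTTP API on localhost port 11434; the model name is only an example, and other local model servers work similarly.

```python
import json
import urllib.request

# Minimal sketch: query a locally hosted model (here assumed to be an Ollama
# server at http://localhost:11434) so that proposal text stays on your own
# machine. Endpoint, port and model name follow Ollama's defaults.

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return the full answer in one JSON response
    }).encode("utf-8")

    request = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["response"]

if __name__ == "__main__":
    # Example use on non-sensitive boilerplate text only.
    print(ask_local_model("Suggest a clearer phrasing for: 'The project will do dissemination.'"))
```

Note that running a model locally reduces the disclosure risk, but the other obligations (transparency, plagiarism checks, responsibility for the output) still apply.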

Questions to ask during the proposal process

Contact an advisor

Please note: as the National Contact Point and advisors for several European Commission and Eureka network funding programmes, we only support applicants during the proposal writing stage. As an applicant, you are and remain responsible for the correct use of AI solutions.

Do you have any questions about Horizon Europe? Or do you need an advisor to think along with you?

Contact one of our advisors

Commissioned by:
  • Ministry of Economic Affairs