2.3 Transparency and Disclosure Requirements

Learning Objectives

By the end of this lesson, you will be able to:

  • Navigate funder policies regarding AI use in grant applications.

  • Develop transparency and disclosure strategies.

  • Maintain positive relationships with funders while using AI assistance for grant preparation.

  • Create accountability and transparency systems for AI use.

Lesson Content

Funders are formulating policies to address AI use in the writing and reviewing of grant applications. While funders appear to universally forbid members of their review panels from using AI tools when evaluating proposals, funders are less unified on whether AI tools can be used in proposal preparation.

Funders’ Policies on AI Use

Funders’ policies on AI use vary, so as part of the preliminary steps before proposal writing begins, proposal teams must research and document any AI guidance provided by the funder. Typically, if AI guidance exists, it will be referenced in the solicitation; however, it is advisable to check the funder’s website for updates in case its policies have evolved since the solicitation’s release.

  • U.S. government agencies have taken different approaches to AI use in the proposal process. The National Institutes of Health (NIH) has issued a notice on AI use stating that NIH uses software to detect AI-generated content and that applicants must adhere to NIH Grants Policy Statement Section 2.1.2, which requires that the ideas proposed in grant applications be original; NIH will not review applications that have been “substantially” developed by AI. The National Science Foundation (NSF) has released a notice prohibiting NSF reviewers from uploading proposal content to non-approved AI tools and stating that applicants are “encouraged” to indicate in the project description the extent to which AI technology was used to develop their proposal. The NIH notice emphasizes that AI use undermines the originality of research proposals, while NSF raises concerns that reliance on AI tools could lead to plagiarism. Other government agencies have their own approaches and requirements for disclosing AI use, so applicants should review not only the solicitation but also any agency notices and published guidelines related to the application process.

  • Foundations also vary in their use of AI in review processes and in their acceptance of AI-assisted proposals. A 2024 survey by Candid found that 10% of surveyed funders accept grant applications containing AI-generated content, with some foundations reasoning that AI-generated content is inevitable and that what matters is whether grantees accurately describe their mission and work. However, 23% of the surveyed foundations said they would not accept AI-generated proposals, and 67% of responding foundations said they do not have an established policy on AI-generated applications. As with government agencies, because policies governing AI use vary, applicants must check the solicitation and the foundation’s website to determine whether and how AI tools may be used in the proposal process.

Establishing Policies for AI Use & Disclosure

Funder policies around AI use continue to evolve, and organizations may face more stringent disclosure requirements in the future regarding how AI can be used to develop proposals, manage projects, and collect and analyze participant and other data. Professional associations serving the nonprofit community may start releasing draft AI guidance and policy templates that nonprofits can adapt to their needs. In the meantime, organizations should start thinking about internal policies and procedures for AI use and disclosure, so their staff have guidance on best practices and are ready to meet any future disclosure requirements imposed by funders.

AI use policies should cover key areas, including:

  • Transparency regarding when, how, and to what extent AI tools have been used in materials such as proposals or project reports. This includes establishing a culture in which staff feel comfortable being honest about the extent to which they used AI in their work products. Supporting transparency in AI use may involve not only drafting a policy outlining what staff must be transparent about, but also providing a template or other tool for staff to use to document their AI use.

  • Communication guidelines regarding how and what to communicate to funders and other stakeholders regarding how the organization uses AI, as well as the level of detail to share with various external parties. For example, the policy could cover what to communicate (e.g., describing how AI was used to develop a proposal) but explicitly limit disclosure of other information, such as which AI tool or model was used.

  • Documentation systems to track the organization’s permitted AI tools and how those tools can be used, as well as how AI tools were used to develop specific work products, including donor-related materials such as proposals and reports. In addition to documenting how the AI tools were leveraged, it is also essential to document the “human in the loop” policies and procedures adopted to ensure that all AI use is monitored and reviewed by humans. A system for documenting AI use will not only help when AI use must be disclosed to stakeholders but will also show the organization how (and whether) the AI tools and training it has invested in are being used.

Professional Development

As AI use becomes more widely adopted in grant writing and other aspects of nonprofit operations, organizations must ensure that their staff receive training on using AI tools effectively and responsibly. Relatively few nonprofit staff are involved in professional associations, so for most, the employer will be their main source of information and training on AI tools.

In developing training programs and messaging around AI, organizations should emphasize the following:

  • AI tools should serve as assistants in the grant writing process, not as grant writer replacements. The tools can make certain tasks easier, such as creating outlines, conducting background research, or improving writing, but AI tools cannot replace the lead proposal writer. Humans must review all AI-generated content, and that review should include fact-checking and confirmation that the content is appropriate to the context (i.e., the specific funding opportunity and donor).

  • Staff should complete training on AI tools and their responsible use. Training should cover security issues, like not uploading confidential information to an AI system, and how to identify bias in AI-generated content. As AI tools become more sophisticated and their use cases expand, staff training will need regular updates to keep up with these changes.

  • Quality assurance processes should be integrated into the organization’s standard policies. Because of the risks posed by unrestricted AI use, organizations must give staff guidance on how to verify AI-generated content and evaluate it for relevance and authenticity. Additionally, training should cover any required documentation of AI use and confirmation of donor requirements for proposal development and project implementation.

AI will continue to grow in importance. Creating systems and procedures for how AI should be used, and continually updating training to reflect the latest developments, will help future-proof the organization and its staff. Staying current with AI developments is key, as organizations that are not actively working to understand how to effectively leverage AI may find it challenging to catch up as the technology evolves and its use becomes standard practice.

Finally, grant writers and staff may fear that AI will replace them and that their jobs are at risk. This fear can make employees resistant to learning how AI tools can enhance their work. To facilitate the integration of AI into the workplace, organizations may find it valuable to reassure their staff by emphasizing that AI cannot replace human expertise and insight.

Lesson Reflection and Assignment

Reflection Questions

  • How would you explain your AI use to a funder that doesn’t have any specific policies around AI use in grant applications?

  • What documentation could you prepare to demonstrate that your AI use supports professional grant writing standards?

  • How could transparency about AI use strengthen your relationships with funders?

Assignment: Transparency and Communication Plan

Develop a comprehensive approach to AI transparency, including:

  • Funder research: Develop a process for researching and documenting funders’ policies around AI use.

  • Disclosure strategies: Develop strategies, tailored for different types of funders and other stakeholders, to share how your organization uses AI.

  • Record-keeping: Implement a record-keeping system to document how AI is used in proposals.

  • Communication Templates: Prepare draft language to inform donors and other stakeholders about the organization’s AI use in proposals and other documents.

  • Relationship Management: Develop policies for using AI to support funder and partner relationships, including how the organization uses AI systems to process data.

To be prepared to discuss your organization’s AI use with a funder, develop talking points on how you use AI tools in proposal writing.

Coming Up Next

In Module 2.4, we'll address the security and privacy considerations for AI-enhanced grant writing, ensuring your workflows protect sensitive organizational information while enabling productive AI collaboration.


SOURCES

National Institutes of Health. (n.d.). 2.1.2 Recipient staff. NIH Grants Policy Statement. Retrieved October 12, 2025, from https://grants.nih.gov/grants/policy/nihgps/html5/section_2/2.1.2_recipient_staff.htm

National Science Foundation. (2023, December 14). Notice to research community: Use of generative artificial intelligence technology in the NSF merit review process. https://www.nsf.gov/news/notice-to-the-research-community-on-ai

Mika, G. (n.d.). Where do foundations stand on AI-generated grant proposals? Candid. Retrieved October 12, 2025, from https://candid.org/blogs/funders-insights-on-ai-generated-grant-application-proposals/
