Evidence-Based Practices in Grant Writing: How to Find, Align, and Use Them

When I started my career in grant writing, I was writing federal grants. Evidence-based practices weren't optional — they were expected. You designed your program around an established model, you cited the research, and you explained how your implementation aligned with what had been proven to work. It was just part of the job.

So when I moved into private foundation grant writing and started working with a wider range of nonprofits, I was genuinely surprised. I'd work with an organization whose program was clearly built on solid research — the approach was thoughtful, the design was strong — but nobody had ever connected it to the evidence base. They were doing evidence-informed work without knowing it, and leaving one of their strongest selling points on the table.

I also worked with clients who were designing new programs and had no idea that you could look up evidence-based models before building from scratch — that there were registries of programs that had been studied, tested, and proven effective, waiting to be adapted.

Both situations are fixable. And if you're writing grants without leveraging the evidence behind your work, this article is for you.

What "Evidence-Based" Actually Means

The term "evidence-based" gets thrown around a lot — often loosely. Before you can use it effectively in a grant proposal, it helps to understand what it actually means and where your program falls on the spectrum.

Most funders and researchers recognize a continuum of evidence, something like this:

Evidence-based refers to programs or practices that have been rigorously evaluated — typically through randomized controlled trials or multiple high-quality studies — and have demonstrated measurable positive outcomes. These are the gold standard. Think Nurse-Family Partnership, Multisystemic Therapy, or the Good Behavior Game.

Evidence-informed describes programs that are designed based on research and best available knowledge, even if the specific program model hasn't been formally evaluated. The design draws from what research shows works; it just hasn't been put through a formal efficacy study.

Promising practice typically refers to programs that have shown positive results in at least one rigorous study, but where the evidence base is still being built. Some funders use this term specifically; others use it more loosely.

Best practice is the broadest category — approaches that are widely accepted by experts and practitioners as effective, based on professional experience and available evidence, even without formal research backing.

Why does this matter for grant writing? Because the claim "our program is evidence-based" means something very specific to many funders — and using it when what you mean is "best practice" can undermine your credibility. Be precise. If your program is evidence-informed, say that. If it's based on an established model, name the model. If it's a best practice in your field, describe it as such.

Why Funders Care About Evidence

Federal funders have required evidence-based program design for decades. If you've written federal grants, you know this — your program plan has to identify the practice model, cite the research, and describe how your implementation fidelity will be maintained.

Private foundations vary much more. Some require it explicitly. Others appreciate it but don't mandate it. Many are increasingly interested in it as the field has moved toward more rigorous grantmaking.

But here's the underlying logic, and it's worth understanding: funders are making a bet. They're investing money in the hope of a particular outcome. An organization that can point to research showing that its approach — when implemented well — produces results is a safer bet than one designing something from scratch with no evidence behind it.

That's not to say innovative programs have no place in grantmaking. They do. But even innovative programs benefit from being anchored to what the research says about the population, the intervention type, and the likely pathways to change.

How to Find Evidence-Based Models

This is the step clients most often skip — and it's often the most valuable one, especially during program design.

Several national registries catalog evidence-based programs by topic area, population, and outcome. These are places where researchers and policymakers have done the work of reviewing the evidence and rating program models. You can use them to find an established model to adapt, or to find research that supports an approach your organization is already using.

The registries below are listed alphabetically. Start with the one most relevant to your population and topic area — and note that some sectors have robust registries while others are still building that infrastructure (more on that below).

Aging and Disability Evidence-Based Programs and Practices (ADEPP) (acl.gov/programs/strengthening-aging-and-disability-networks/aging-and-disability-evidence-based-programs) — Maintained by the Administration for Community Living (ACL), ADEPP reviews interventions for older adults and people with disabilities. Programs are rated for quality of research and readiness for dissemination. The go-to registry for aging services, fall prevention, chronic disease self-management, and disability-related programs.

Blueprints for Healthy Youth Development (blueprintsprograms.org) — Focuses on youth-serving programs with rigorous rating criteria. A great resource for juvenile justice, substance use prevention, and youth mental health.

California Evidence-Based Clearinghouse for Child Welfare (CEBC) (cebc4cw.org) — Covers programs related to child welfare, family strengthening, and mental health for youth and families. Despite the California name, it's widely used nationally and is one of the more user-friendly registries for practitioners.

Community Guide, The (thecommunityguide.org) — Run by the Community Preventive Services Task Force, this covers public health interventions across a wide range of topics, including chronic disease, mental health, violence prevention, and more.

Conservation Evidence (conservationevidence.com) — A peer-reviewed database of scientific evidence on conservation interventions — habitat management, species protection, land use, and more. It's the closest thing the environmental sector has to a formal EBP registry, and it's rigorous.

CrimeSolutions.gov — Maintained by the National Institute of Justice; covers criminal justice, corrections, and community safety programs.

Home Visiting Evidence of Effectiveness (HomVEE) (homvee.acf.hhs.gov) — Maintained by the Administration for Children and Families (ACF), HomVEE reviews evidence for home visiting programs serving pregnant women and families with young children. If your work touches early childhood, maternal health, or family support, this is your registry.

OJJDP Model Programs Guide (ojjdp.gov/mpg) — Specifically for juvenile justice and delinquency prevention.

SAMHSA Evidence-Based Practices Resource Center (samhsa.gov/libraries/evidence-based-practices-resource-center) — SAMHSA's current hub for evidence-based resources in mental health and substance use. Covers treatment approaches, toolkits, and practice guidelines across a broad range of behavioral health topics. This replaced the older NREPP registry and is the first stop for anything related to substance use, mental health, and co-occurring disorders.

Title IV-E Prevention Services Clearinghouse (preventionservices.abtsites.com) — Also maintained by ACF, this clearinghouse rates programs designed to prevent child abuse, neglect, and family separation. Essential for organizations working in child welfare, family strengthening, and foster care prevention.

What Works Clearinghouse (ies.ed.gov/ncee/wwc) — Maintained by the Institute of Education Sciences, this is the go-to resource for education-related programs. If you're working with schools, youth development, or literacy, start here.

What about arts, animal welfare, and workforce development? Some fields simply don't have a formally rated program registry — and that includes arts and culture, animal welfare, and workforce development. That's worth being honest about with clients and in your grant narratives. It doesn't mean evidence doesn't exist; it means it lives in different places.

For arts and culture organizations, the National Endowment for the Arts (arts.gov/impact/research) publishes research on the impact of arts programs on health, learning, and social-emotional well-being. The NEA's National Arts Statistics and Evidence-based Reporting Center (NASERC) is a newer resource building this evidence base. These aren't program registries, but they are legitimate sources you can cite to support your program model.

For animal welfare organizations, no centralized EBP registry exists. Cite peer-reviewed research from veterinary, zoological, or humane education journals, and reference standards from national professional bodies such as the American Veterinary Medical Association (AVMA) or the Association of Zoos and Aquariums (AZA), where applicable.

For workforce development organizations, anchor your program design to research from labor economists, the Department of Labor's Employment and Training Administration, or sector-specific workforce studies. Programs like Individual Placement and Support (IPS) for supported employment are well-documented in the research literature, even without a single clearinghouse to point to.

When no formal registry exists for your sector, anchor your program to peer-reviewed research, published reports from sector leaders, or national organization standards. The absence of a rated registry doesn't mean the absence of evidence — and a well-sourced narrative built on solid research can be just as compelling.

When you find a model that's relevant to your work, don't just cite the name — read the research summaries. Understand what outcomes were measured, in what population, under what conditions. That detail is what allows you to write about it with specificity and credibility.

How to Align Your Program With an EBP

Finding a relevant model is the first step. The more nuanced challenge — and the one I help clients work through most often — is alignment: how do you connect what your organization is actually doing to an established evidence base?

A few scenarios:

Your program closely matches an established model. This is the clearest case. If your youth mentoring program follows the Big Brothers Big Sisters model, say so. If your afterschool literacy intervention uses the Reading Recovery approach, name it. Identify the model, describe how your implementation aligns with its core components, and note any adaptations you've made and why.

Your program is inspired by a model but adapted for your context. This is very common — and completely legitimate. Programs get adapted for different populations, geographies, languages, and resource environments all the time. The key is to be explicit: name the model you adapted, describe the adaptations, and explain why those adaptations were appropriate for your community while preserving the core elements that drive outcomes.

Your program is built from multiple evidence bases. Sometimes an organization has designed a program that draws on several bodies of research without following any single established model. In this case, you're not citing an EBP program — you're citing an evidence base. Describe your program design and then anchor each component to relevant research: "Our case management approach is grounded in Motivational Interviewing, which has been shown to increase treatment engagement in..." This is evidence-informed design, and it's absolutely defensible in a grant narrative.

You're designing a new program and want to use the research. This is the best use of the registries I described above. Before you finalize your program model, look up what the research says about effective approaches for your population and outcome area. You may find an established model worth adapting. At minimum, you'll find research that should inform your design — and that can be cited in your grant proposal.

How to Write About EBPs in Your Grant Narrative

Knowing you're using an evidence-based approach and writing about it effectively are two different skills. Here's how to do it well.

Name it specifically. "Our program is based on evidence-based practices" tells the reviewer nothing. "Our program is based on the Incredible Years curriculum, an evidence-based program rated in the OJJDP Model Programs Guide, which has been shown in multiple randomized trials to reduce conduct problems in children ages 3–8" tells them everything they need to know.

Cite the research. Funders — especially federal funders — want citations. At a minimum, include the name of the model and the registry or clearinghouse where it is listed. For stronger proposals, cite a specific study or meta-analysis. You don't need to write an academic paper, but a sentence with a citation in parentheses carries weight.

Explain your fidelity plan. One of the things that distinguishes organizations that use EBPs well from those that just name-drop them is a fidelity plan: how will you ensure your implementation actually reflects the model? That might mean trained facilitators, certified trainers, fidelity checklists, or regular supervision. Funders who know their EBPs will look for this.

Connect the model to your population. Research is conducted in specific contexts with specific populations. If there's a good match between the population the EBP was tested with and the community you serve, say so. If there's a gap — different age group, different language, different cultural context — acknowledge it and explain how you've adapted the model to address it.

Don't overcite. A grant narrative that is more literature review than program description has gone too far in the other direction. The evidence supports your program design — it doesn't replace the narrative about why your organization is uniquely positioned to do this work.

When You Can't Point to a Formal EBP

Not every program has a formal evidence-based model to reference. And that's okay — especially for innovative programs, emerging fields, and communities whose needs haven't been the subject of significant research.

In these cases, you have a few options:

Cite the evidence base, not the model. Even if there's no formal EBP for what you're doing, there's likely research on the population, the type of intervention, or the intended outcomes. Anchor your program design to that research. Show that your approach is grounded in what the field knows, even if it doesn't fit neatly into a named model.

Name it honestly. If your program is a best practice in your field — widely used, professionally endorsed, growing evidence behind it — call it that. Don't stretch a "promising practice" into an "evidence-based" claim if the evidence doesn't support it. Precision builds trust.

Make the case for why innovation is warranted. Some funders — particularly those focused on equity, emerging issues, or underserved communities — are explicitly open to funding innovative approaches. In those cases, make the argument: the standard models weren't designed for this population, or this problem requires a new approach, and here's what you're drawing on as you design it.

Build evaluation in. If you can't point to strong prior evidence, demonstrating that you're building the evidence through rigorous evaluation of your own work is a meaningful response. A well-designed evaluation plan shows funders that you're committed to learning and contributing to the field — not just doing the work and hoping for the best.

Frequently Asked Questions About Evidence-Based Practices in Grant Writing

Do all grant funders require evidence-based practices?
No — requirements vary widely. Federal funders almost always require or strongly prefer evidence-based program designs. Private foundations vary: some require it, many appreciate it, and some are more focused on innovation and community-driven approaches. Always read the guidelines carefully and tailor your language accordingly.

What if my organization is doing something that works, but there's no formal research on it?
This is more common than you might think, especially for programs serving communities whose needs have been historically underresearched. In this case, describe your program as evidence-informed or grounded in best practice, anchor it to whatever relevant research exists, and consider whether building your own evidence base through evaluation would strengthen future grant applications.

Can I just say my program is "evidence-based" without citing a specific model?
I'd strongly advise against it. Without specifics, the claim is unverifiable and reads as boilerplate. Name the model, cite the research, or describe the evidence base you're drawing from. Vague references to "evidence-based practices" are a red flag for experienced reviewers.

Where do I start if I'm designing a new program and want to make it evidence-based?
Start with the registries. Search the What Works Clearinghouse, the Community Guide, or whichever registry is most relevant to your topic area. See what models exist, read the research summaries, and evaluate whether any of them are a good fit to adapt. You'll either find a model worth building from, or you'll build a much stronger evidence-informed design by understanding what the research says.

What's the difference between fidelity and adaptation?
Fidelity refers to implementing a program the way it was designed — following the curriculum, using trained facilitators, and hitting the recommended dosage. Adaptation refers to modifications made to fit a different context, population, or resource environment. The key is to adapt intentionally and thoughtfully: preserve the core components that drive the model's outcomes, and document your reasoning for any changes you make.

Evidence-based practices are one of the most underused tools in the private-sector grant writer's toolkit — and one of the most powerful. Whether you're designing a new program, strengthening an existing one, or simply trying to write more compelling grant narratives, understanding how to find, align with, and communicate about the evidence behind your work will make you a significantly stronger advocate for the organizations you support.

If you want a formatted, printable version of this registry list — plus bonus content you won't find in this article — check out the Evidence-Based Practice Registry Guide in the shop. It includes the full registry list with a "best for" column so you can find your match at a glance, a plain-English explanation of how each registry's rating system works, tips for searching each one effectively, and a Program Evidence Worksheet for documenting the evidence base behind your own program before you write your next grant. It's the reference doc you'll reach for every time a funder asks, "What's the evidence?"

Have a question about a specific program model or how to approach evidence in a particular grant? Leave it in the comments — I read everything.

Allison is a grant writing educator with 25+ years of experience. She is the founder of Spark the Fire Grant Writing and creator of the Certificate in Grant Writing program.