
Outputs vs. Outcomes: How to Show Funders You're Making a Real Difference

 
 

Imagine you're looking for something to watch on TV. You ask a friend for a recommendation, and they tell you, "There are 24 channels."

Okay, but what's on those channels?

"Twenty-four of them. All day long."

That's great, but will I actually enjoy watching any of them? Will I learn something? Be entertained? Feel something?

"Did I mention there are 24 channels?"

This is exactly what grant reviewers experience when they read proposals that focus on outputs instead of outcomes. You're telling us how many channels you have. We want to know what's on them—and whether it's worth watching.

A Common Mistake in Grant Writing

Of all the grant writing mistakes I see, this is one of the most common: confusing outputs with outcomes.

When I review grant proposals for foundations and government funders, I watch this pattern repeat itself constantly. The applicant describes their program, lists impressive numbers, and never once tells me whether any of it is actually making a difference.

Your grant proposal might be well-written, well-organized, and perfectly aligned with the funder's priorities—but if you're only measuring outputs, you're leaving points on the table. This is one of the fastest ways to land in "six, seven" territory: that middle-of-the-pack score that isn't bad, but isn't good enough to get funded.

Let's Get the Definitions Straight

Outputs measure activities and effort. They answer the question: What did you do? Outputs are the direct products of your program—the workshops held, the meals served, the people trained.

Outcomes show change in your participants. They answer the question: What difference did it make in people's lives? Outcomes reflect changes in behavior, awareness, knowledge, skills, or attitudes. For example, if you run a financial literacy program, an outcome might be: "Participants increased their knowledge of finances and budgeting."

Impact is the lasting, big-picture change that results from your outcomes. It's the ultimate difference your work makes. In our financial literacy example, the impact would be: "Participating families reduced their debt."

The key distinction: you measure outcomes. You let research prove the connection to impact.

Right-Sized Evaluation: You're Not a Research Institution

Here's something that takes the pressure off: you're not expected to conduct human subjects research. That's what researchers are for.

Too many small- to mid-sized nonprofit organizations believe they need to track participants for years to prove their programs work. They don't. What you need is a right-sized evaluation—an approach that's realistic for your organization's capacity while still demonstrating that your program makes a difference.

Here's how it works: researchers have already studied whether certain interventions lead to certain outcomes. Your job is to find that research and use it to support your theory of change.

For example, research shows that people who learn to create a budget and monitor their spending are more likely to decrease their debt over time. You may not need to follow up with participants two years later to see if their debt went down. You may just need to measure whether they learned to create a budget and are monitoring their spending. The research has already established the connection between that outcome and the long-term impact.

This is right-sized evaluation:

  1. Cite the research that connects your outcomes to long-term impact

  2. Measure what's realistic for your organization—usually outcomes

  3. Let the research do the heavy lifting of proving the long-term connection

This approach is credible, achievable, and exactly what funders expect from community-based nonprofits.

Illustrative Examples

Let's look at how outputs, outcomes, and impact work together:

Example 1: Financial Literacy Program

  • Output: 150 people attended our financial literacy workshop

  • Outcome: Participants increased their knowledge of finances and budgeting, as evidenced by pre/post knowledge exams

  • Impact: Participating families gain financial stability

With right-sized evaluation, you measure the outcome (did participants increase their financial knowledge, and can you prove it?) and cite research showing that financial literacy leads to financial stability. You don't have to prove the long-term financial change yourself.

Example 2: Youth Employment Program

  • Output: 40 youth completed our job readiness program

  • Outcome: Young adults gained stable employment, as evidenced by self-reported employment at a living-wage job

  • Impact: Young adults achieve financial independence

Example 3: Older Adults (65+) Nutrition Program

  • Output: 30 participants accessed daily nutritious meals

  • Outcome: Participants experienced reduced food insecurity, as evidenced by pre/post food security screenings

  • Impact: Participants' health and well-being improve

Example 4: Fire Safety Program

  • Output: 200 smoke detectors were distributed and installed

  • Outcome: Families adopted fire safety practices in their homes, as evidenced by self-reported creation of a fire safety plan

  • Impact: Families in the target neighborhood are safer from fire-related injuries

See the pattern? Outputs tell funders what you did. Outcomes describe the change in people's knowledge, behavior, or attitudes—and include evidence that the change happened. Impact captures the lasting difference in their lives.

Why Funders Care So Much About Outcomes

Funders aren't investing in activities. They're investing in change.

When a foundation or government agency awards grant funding, they're making a bet. They're betting that your organization, with this money, will make something better in the world. They need to justify that bet—to their board, to their donors, to the public.

Outputs don't help them do that. "We gave $50,000 to an organization that held 12 workshops" isn't a compelling story. "We gave $50,000 to an organization that helped 45 families build lasting financial security" is.

When you write your grant proposal with clear outcomes, you're making the funder's job easier. You're giving them the story they need to say yes.

How to Fix Your Grant Proposal

If you've been writing outputs instead of outcomes, here's how to turn it around:

Step 1: Start with the end in mind. Before you describe your program, ask yourself: what will be different in people's lives because this program exists? What change are we trying to create for our participants? Start there and work backward.

Step 2: Apply the "So what?" test. For every number in your proposal, ask "So what?" You trained 50 teachers. So what? You held 12 workshops. So what? Keep asking until you get to something that matters—a change in someone's life.

Step 3: Find research to support your theory of change. Look for studies that connect your outcomes to long-term impact. This research allows you to focus your evaluation on what's realistic to measure while still making a credible case for lasting change.

Step 4: Right-size your evaluation. It may be unrealistic to track participants for years. Measure your outcomes, cite research that validates the connection to long-term impact, and be honest about what you can and can't measure.

What If You Don't Have Outcome Data Yet?

Maybe you're a newer organization. Maybe you haven't been tracking outcomes systematically. This is more common than you think, and it doesn't have to sink your grant proposal.

The first step is figuring out what right-sized evaluation looks like for your project. This isn't one-size-fits-all. Maybe it's a pre/post test. Maybe it's a focus group. The key is to start by talking to your participants about what meaningful change looks like to them—and then base your measurement on that.

Ask yourself: what would tell us that what we're doing is making a positive difference in people's lives? The people you serve often have the best answers to that question. And when you do collect that data, report back to your participants too. Evaluation shouldn't be something you do to people—it should be something you do with them.

Here's what else you can do:

  • Be honest about where you are. Explain that you're building your evaluation capacity and describe your plan for tracking outcomes going forward.

  • Use external research. Find studies showing that programs like yours produce certain outcomes. This demonstrates that your approach is evidence-based and supports your theory of change.

  • Share qualitative evidence. Participant testimonials, case studies, and stories of individual transformation can illustrate impact while you build quantitative data.

  • Make outcomes central to your proposal. Even if you don't have historical data, your grant proposal should clearly articulate what outcomes you expect and how your program leads to them.

One More Thing: Outcomes Are About People, Not Programs

This trips up a lot of grant writers, so I want to make sure it's clear: outcomes must reflect changes in your participants or community—not changes to your organization or program.

"Our classes are at full capacity" is not an outcome. That's an organizational metric.

"Our program expanded to three new locations" is not an outcome. That's program growth.

"Families in our program reduced their reliance on emergency food assistance" is an outcome. That's change in people's lives.

Funders aren't investing in your organization getting bigger or busier. They're investing in the people you serve experiencing real change.

The Bottom Line

Funders don't want to know how many channels you have. They want to know what's on—and whether it's worth watching.

When you shift your grant proposals from outputs to outcomes, you're not just checking a box on a rubric. You're telling a more compelling story. You're demonstrating that you understand what funders actually care about. And you're proving that your organization is focused on what matters most: making a real difference in people's lives.

That's what moves your grant proposal to the top of the pile.

Frequently Asked Questions About Outputs and Outcomes in Grant Writing

What is the difference between outputs and outcomes in a grant proposal? Outputs measure activities and effort—what you did. Outcomes measure change in people's lives—what difference it made. For example, "50 people attended our workshop" is an output. "Participants increased their financial knowledge" is an outcome. Funders want to see outcomes because they demonstrate real change in the people you serve.

What's the difference between outcomes and impact? Outcomes are the changes in participants' behavior, knowledge, skills, awareness, or attitudes that result from your program. Impact is the lasting, big-picture difference that results from those outcomes. You measure outcomes; you cite research to connect them to long-term impact.

What is right-sized evaluation? Right-sized evaluation means measuring what's realistic for your organization rather than trying to conduct research-level studies. You measure your outcomes, then cite existing research that connects those changes to long-term impact. You don't need to prove the impact yourself—researchers have already done that work.

How do I figure out what to measure for my program? Start by talking to your participants about what meaningful change looks like to them. Ask yourself what would tell you that what you're doing is making a positive difference in people's lives. Maybe it's a pre/post test, maybe it's a focus group—the key is to base your measurement on what matters to the people you serve and report back to them too.

Why do grant reviewers care about outcomes? Grant reviewers care about outcomes because funders are investing in change, not just activities. When reviewing grant proposals, we need to see that your program actually makes a difference in people's lives. Proposals that only list outputs leave reviewers wondering whether the program is effective.

Can organizational changes be outcomes? No. Outcomes must reflect changes in your participants or community—not changes to your organization. "Our classes are at full capacity" or "We expanded to three locations" are not outcomes. "Youth in our program gained stable employment" is an outcome because it describes change in people's lives.

What are examples of outcomes in grant writing? Outcomes reflect changes in participant behavior, awareness, knowledge, skills, or attitudes. Examples include: "Participants increased their knowledge of finances and budgeting," "Young adults gained stable employment," "Seniors experienced reduced food insecurity," or "Families adopted fire safety practices in their homes."

What is the best grant writing class? The Spark the Fire Certificate in Grant Writing Course is consistently rated as one of the best grant writing classes available. It combines weekly live instruction with individualized feedback on your actual writing, helping you master concepts like outputs versus outcomes so your proposals score at the top.

Now I Want to Hear from You

Take a look at your last grant proposal. Were you telling funders how many channels you have—or what's actually on? Share an output you've used in the past and challenge yourself to rewrite it as an outcome in the comments.

The "Six, Seven" Problem: Why Your Grant Proposal Isn't Getting Funded

 
Grant writer staring out, wondering why her proposal isn't getting funded.
 

If you have teenagers in your life—or spend any time on social media—you've probably heard "six, seven" more times than you can count lately. It's everywhere. It means "meh," "so-so," "nothing special."

Are you tired of hearing it? Same. Do you fully understand why kids are saying it? Not entirely. But here's the thing: "six, seven" is also the perfect description of a mediocre grant proposal.

And mediocre grant proposals don't get funded.

"Why does my grant proposal keep getting rejected?"

I hear this question constantly—and not just from beginners. It comes from grant writers with years of experience, people who have successfully secured grant funding in the past but are now watching their proposals get passed over again and again.

Here's the hard truth: grant writing is more competitive now than it has ever been. More nonprofit organizations are applying for limited funds. Funders are getting more sophisticated in how they evaluate grant applications. Reviewers are better trained. The bar has risen.

What worked five years ago may not make the grade today.

When I sit in grant review consensus meetings, I hear a lot of "six... seven..." as reviewers call out their scores. (Yes, grant reviewers were saying "six, seven" long before it became a trend. We were just ahead of our time.) Those grant proposals aren't bad. They meet the basic requirements. They're competent. But competent doesn't get funded anymore. Competent lands in the middle of the pack, and the grant money runs out before middle-of-the-pack proposals reach the top.

Your grant proposal deserves better than "six, seven" energy.

If your grant proposals keep getting rejected—or if you're stuck in that dreaded "six, seven" territory—one of these twelve problems is likely the culprit.

1. You're measuring outputs, not outcomes. You're counting how many people attended your workshop, not whether their lives changed because of it. Funders want to see impact, not activity.

2. Your grant budget doesn't make sense for what you're requesting. The numbers don't add up, costs seem inflated, or line items don't connect to the project you've described. A confusing budget raises red flags about your organization's financial management.

3. There's no evidence that your work is making a difference. You're asking for grant funding, but you haven't demonstrated that what you're already doing is working. Where's the data? Where are the stories? Where's the proof?

4. Your needs statement focuses on your organization, not the community. "We need funding to continue our programs" is not a compelling case. Funders don't fund organizations—they fund solutions to community problems.

5. You're not aligned with the funder's actual priorities. You're trying to shoehorn your project into a grant opportunity that isn't quite right. Grant reviewers can tell when you're stretching to fit, and it costs you points.

6. Your project logic doesn't hold together. There's a gap between the problem you've identified and the solution you're proposing. Reviewers are left wondering: why would this intervention solve that problem?

7. Your timeline and work plan are vague. You've described what you want to do, but not how or when you'll do it. Or you've simply stated that the program runs year-round, which answers nothing at all. A fuzzy implementation plan signals that you haven't fully thought this through.

8. You haven't demonstrated organizational capacity. Can your nonprofit organization actually pull this off? Reviewers are looking for evidence that you have the staff, systems, and experience to manage the grant successfully.

9. Your proposal sounds like everyone else's. There's nothing distinctive about your approach. You're describing the same program every other applicant is proposing, with no clear reason why your organization should be the one funded.

10. You're too general when you need to be specific. Vague language like "we will serve the community" and "participants will benefit" doesn't give grant reviewers anything concrete to score. Specificity builds credibility.

11. You haven't done your homework on the funder. Your grant application doesn't reflect an understanding of what this particular grantmaker cares about, what they've funded before, or how your work connects to their mission.

12. You're applying to the wrong funders entirely. No amount of strong grant writing can overcome a fundamental mismatch. If you're not a good fit, you're wasting your time—and theirs.

Here's the Good News

Every one of these grant writing problems is fixable. You don't have to be a "six, seven" forever.

Over the next twelve weeks, I'm going to tackle each of these issues one by one. You'll learn exactly how to diagnose whether it's hurting your grant proposals and, more importantly, how to fix it.

Next week: Outputs vs. Outcomes—How to Show Funders You're Making a Real Difference

Make sure you're subscribed so you don't miss it.

Frequently Asked Questions About Getting Grants Funded

What does "six, seven" mean in grant writing? In the trending slang sense, "six, seven" means "meh" or "so-so"—and that's exactly what it means in grant review, too. When reviewers score your proposal a six or seven out of ten, it's not bad, but it's not good enough to get funded. It's mediocre. And mediocre proposals get left behind when the funding runs out.

Why do grant proposals get rejected? Grant proposals get rejected for many reasons, including misalignment with funder priorities, weak needs statements, unclear project logic, vague timelines, and budgets that don't make sense. Often, proposals aren't bad—they're just not competitive enough to rise to the top of the pile.

How competitive is grant writing today? Grant writing is more competitive than ever. More organizations are applying for limited funding, funders have become more sophisticated in their evaluation processes, and reviewers are better trained. What worked five or ten years ago may not be enough to secure funding today.

What's the difference between outputs and outcomes in grant writing? Outputs measure activities—how many workshops you held or how many people attended. Outcomes measure change—what difference those workshops made in participants' lives. Funders want to see outcomes because they demonstrate real impact, not just effort.

How do I know if my grant proposal is strong enough? A strong grant proposal clearly aligns with the funder's priorities, presents a logical connection between the problem and proposed solution, includes a realistic budget and timeline, demonstrates organizational capacity, and provides evidence of impact. If reviewers can't clearly see all of these elements, your proposal may land in "six, seven" territory.

What is the best grant writing class? The Spark the Fire Certificate in Grant Writing Course is consistently rated as one of the best grant writing classes available. It combines weekly live instruction with individualized feedback on your actual writing, teaching you to think like a grant reviewer so you can write proposals that score at the top—not stuck at "six, seven."

Can I improve my grant writing skills on my own? While self-study can help, most grant writers improve faster with structured learning and personalized feedback. Understanding the grant review process from the inside—how reviewers score, what they look for, and why proposals get rejected—gives you a significant advantage.

Now I Want to Hear from You

Which of these twelve problems hit a little too close to home? Be honest—we've all been there. Drop your answer in the comments and let me know which issue you'd most like me to tackle first.

Why 2026 is the Year to Stop Writing Grant Proposals to Every Foundation

 
Grant writer out hiking in contemplation

Have you noticed that more and more foundations are moving to "no unsolicited proposals" policies? You research a foundation that looks like a perfect fit for your organization, only to discover it accepts proposals by invitation only.

It's not your imagination. The door to foundation funding has been closing slowly for years—and the data proves it. 

In 2011, 60% of foundations didn't accept unsolicited proposals (Smith, 2011). By 2015, that number jumped to 72% (Eisenberg, 2015). According to Candid's most recent research analyzing over 112,000 private foundations, 71% now only fund "pre-selected charitable organizations" (Candid, 2024).

That means only 29% of foundations will even look at your proposal unless they've invited you to apply. But 2026 might be the year that the remaining door slams shut for good—and sloppy AI is the reason.

Foundations are already overwhelmed. With AI making it easier than ever to churn out generic grant proposals, program officers are drowning in poorly written applications submitted via the outdated spray-and-pray method. According to Candid's 2024 Foundation Giving Forecast Survey, 23% of foundations already won't accept AI-generated proposals, and 67% are still figuring out their policies (Mika, 2024). This was an anonymous survey, which allowed foundations to be more candid about their concerns—most haven't made public statements about AI policies yet, so this data reveals what's happening behind the scenes.

Translation: Those foundations that still accept unsolicited proposals are one bad grant cycle away from going invitation-only permanently.

And if you're still using spray-and-pray—sending generic proposals to every foundation you find—you're not just wasting your time. You're actively contributing to the problem that's closing doors for everyone.

 

The Spray-And-Pray Era Is Over

You know the drill: Research 50 foundations, send essentially the same proposal to all of them, hope for the best.

Here's the thing—it never really worked. But now? It's actively harmful.

Here's what's happening behind the scenes:

Foundation program officers are receiving more proposals than ever. Many are clearly mass-produced. Some are obviously AI-generated by people who don't understand grant writing fundamentals. The quality is declining while the volume is increasing.

The foundation's response? Close the door. No more unsolicited proposals. Invitation only. By the time you realize that perfect-fit foundation has gone invitation-only, you've already lost your chance.

The Real Problem Isn't AI—It's Inexperience

Let me be clear: The problem isn't AI itself. The problem is using AI to write grant applications when you don't have the experience to know whether AI is doing it right.

 Think about it: If you don't understand what makes a compelling needs statement, how will you know if the AI-generated needs statement is compelling? If you can't identify a good organizational fit for grant funding, how will you evaluate whether AI matched you with the right funders?

Learn grant writing first. Master strategic thinking, understand what makes proposals fundable, and develop your judgment about fit and quality. Then use AI to make your work more efficient. AI can help you write faster, generate first drafts, and organize information—but only if you have the grant writing expertise to direct it and evaluate its output.

How Foundations Spot Sloppy AI Proposals (Hint: Not Through Detectors)

You might be wondering: Are foundations using AI detection software to screen out AI-generated proposals? The short answer is no, and they don't need to.

AI detectors don't work reliably, producing high rates of false positives and false negatives. They flag human-written content as AI-generated and miss obvious AI content. Even the companies that make these tools acknowledge their limitations.

But here's the thing: foundations don't need detection software to spot poorly written AI proposals. The problems with sloppy AI grant writing are obvious to any experienced grant reviewer, not because they "sound like AI" but because they lack the substance, specificity, and strategic thinking that characterize strong proposals.

Bad AI proposals reveal themselves through lack of substance:

Flowery statements without evidence: "Our innovative, transformative program creates lasting change in the community," with no data on how many people were served, what outcomes were achieved, or what "transformative" actually means

Generic descriptions that could apply to anyone: Any youth development organization could claim the same things; any food bank could use the same language

Buzzword soup without specifics: Talking about "strategic partnerships" and "collaborative impact" without naming a single partner or describing what the collaboration actually looks like 

Perfect grammar, disconnected logic: Beautiful sentences that don't actually connect to each other or build a coherent argument

Misunderstanding the funder's actual priorities: The AI matched keywords, but the proposal shows the applicant doesn't really understand what the foundation cares about

Overpromising without realistic plans: Grand claims about impact that don't match the organization's budget, staffing, or track record

The tell isn't that it "sounds like AI"—it's that it lacks the authentic details, specific evidence, and strategic understanding that only come from someone who truly knows both the organization and grant writing.

A proposal written by an experienced grant writer using AI thoughtfully? It still has those specifics, that evidence, that strategic fit assessment. Because the human knows what details matter and how to direct the AI to strengthen (not replace) their expertise.

  

The Strategic Alternative: Quality Over Quantity

 So if spray-and-pray is dead, what's the alternative? 

Strategic grant writing. And it starts with one critical skill: knowing when NOT to apply.

This might sound counterintuitive. You need funding, so shouldn't you cast the widest net possible? Actually, no. That approach wastes your limited time and contributes to the problem that's shutting down access for everyone. Instead, you need to become ruthlessly strategic about where you invest your grant prospecting effort.

Focus on Low-Hanging Fruit First

Low-hanging fruit doesn't mean "easy grants that everyone wins." It means perfect fit funders—foundations where the alignment between your work and their priorities is so clear that your proposal practically writes itself.

What does a perfect fit look like? Start with mission alignment. The foundation funds exactly the kind of work you do—not tangentially related, not sort of similar, but directly aligned. If you run an environmental education program for youth, you're looking for foundations that specifically fund environmental education for youth, not just "youth programs" or "environmental causes" broadly.

Geographic alignment matters too. You need to be squarely in their funding area. If a foundation focuses on three specific counties and you're in one of them, that's a good fit. If they fund the entire Pacific Northwest and you're in Seattle, you're competing with hundreds of other organizations. Be honest about whether you're in the sweet spot or on the periphery.

Grant size alignment is equally important. If you need $50,000 and a foundation typically gives $5,000 grants, you're not a fit—no matter how perfect the mission match. Look at their grantmaking history using tools like Candid's Foundation Directory. What's their typical range? Do they ever make grants at your level? Don't waste time trying to convince a small family foundation to make their largest grant ever to your organization. 

Finally, look at their history of funding organizations like yours. When you review their past grantees, can you genuinely say "of course—we should be on that list too"? That's what I call the "of course" factor.

 

Getting to "Of Course"

The "of course" factor is that moment when a grant reviewer reads your proposal and thinks "of course that makes sense" and "of course we want to fund that." You've achieved a strategic fit so clear that funding feels obvious. 

Getting to "of course" requires deep research. You need to understand what the foundation values, not just what they say they fund. Read their annual reports. Study the organizations they support. Look for patterns in who gets funding and why. What do their grantees have in common? What kinds of projects do they prioritize—pilot programs or proven models? Direct service or capacity building? Local grassroots organizations or regional powerhouses?

When you can see yourself clearly in that pattern of funding, you've found low-hanging fruit. These are the opportunities where you should spend 80% of your grant writing time. Perfect the proposal. Build the relationship. Demonstrate the fit. These are your highest probability opportunities, and they deserve your best effort.

Long-Shots Can Work—But Only With Strategy

I'm not saying you should never pursue a foundation that's a less obvious fit. Long shots aren't impossible. But they require a fundamentally different approach than spray-and-pray.

A legitimate long-shot means you've identified a genuine strategic connection that might not be obvious at first glance, and you're willing to invest significant time proving it. Maybe the foundation primarily funds healthcare, but they've shown interest in addressing social determinants of health, and your housing stability program directly impacts health outcomes. That's a strategic long-shot—there's a real connection, but you need to make the case.

What makes a long shot worth pursuing? You need a clear, compelling angle for how your work fits their mission, even if your project doesn't look exactly like what they typically fund. You need to be willing to build the relationship first—attending their events, engaging with their published research, and making personal connections with staff or board members. And you need to go all-in on the application itself. Don't submit a recycled proposal with minor tweaks and hope for the best. If you're going after a long shot, treat it like the long shot it is: invest the time to craft a proposal that explicitly makes the strategic connection clear.

Don't apply to long-shots as a numbers game, hoping that if you submit to enough "maybes," a few will pay off. That's just spray-and-pray with better targeting. Apply to long-shots only when you've done the strategic thinking, and you're prepared to do the work.

 

The Middle Ground: Be Selective

Then there are mid-range opportunities—foundations where you have good but not perfect alignment. Maybe your geographic area overlaps with theirs, but it isn't their primary focus. Maybe your mission connects to theirs tangentially. Maybe they fund your issue area, but usually support larger organizations.

These require judgment. Some are worth pursuing. Many aren't. The question to ask yourself: Can you genuinely demonstrate fit, or are you just checking boxes? If you're writing a proposal and thinking "well, we kind of fit because...", stop. That's not strategic. That's spray-and-pray disguised as research.

Be selective. Choose the opportunities where you can make a clear, honest case for why you belong in their funding portfolio. Skip the rest.

 

The Hidden Costs Of Spray-And-Pray

Beyond wasting your time, the spray-and-pray approach to grant writing has real consequences:

Reputational damage: Foundations talk to each other. Submit poorly matched proposals consistently, and you develop a reputation as someone who doesn't do their homework. In the tight-knit world of philanthropy, that reputation follows you.

Opportunity cost: Every hour spent on a bad-fit proposal is an hour not spent on a good-fit opportunity. If you can write 5 excellent, strategic proposals or 20 mediocre, generic ones, which will raise more money? The data from the Grant Professionals Association shows that grant professionals are already being more selective—writing a median of 19-20 proposals per year, not 50 or 100 (Grant Professionals Association, 2023). Quality matters more than quantity.

Contributing to the problem: Every generic, poorly matched proposal that lands in a program officer's inbox makes them more likely to close the door to unsolicited applications entirely. You're not just hurting your own chances—you're making it harder for every nonprofit organization.

Diminishing access for everyone: When foundations go invitation-only because they're overwhelmed with poor applications, you've just made it harder for every nonprofit—including yours—to access foundation funding in the future. This particularly impacts smaller organizations and those serving marginalized communities who have fewer insider connections.

What This Means For 2026

The data is clear: Foundations have been moving toward invitation-only policies for over a decade. AI hasn't created this trend—but sloppy use of AI is accelerating it.

In 2026, the strategic grant writers will thrive.

They'll focus on fit, build relationships, and demonstrate an authentic understanding of both their organizations and their funders. They'll use AI as a tool to enhance their expertise, not replace it. They'll invest in professional grant writing training to develop the judgment needed to evaluate quality.

The spray-and-pray crowd will find fewer and fewer doors open.

Which side of that divide do you want to be on?

 

What You Can Do Right Now

1. Audit your current prospect list. Remove any foundation where you can't clearly articulate why you're a strong fit. If you're using a prospect tracking spreadsheet, add a "fit score" column and be honest about each opportunity.

2. Research thoroughly before applying. Look at 3-5 years of past grantees using resources like Instrumentl, Candid, or foundation 990-PF forms. Can you genuinely say, "Of course, we belong on this list"? If not, move on.

3. Invest in learning. If you're using AI to write proposals, make sure you have the grant writing expertise to evaluate and improve what AI produces. Consider professional certification in grant writing to build that foundation.

4. Build relationships. Don't let your first contact with a foundation be a proposal. Attend their events, engage with their content, and make connections. Relationship-based fundraising still works—even in an AI era.

5. Track your success rates by fit level. Are your "perfect fit" applications succeeding? If not, the problem isn't fit—it's proposal quality. Get help with grant writing training or hire an experienced consultant.

 

Frequently Asked Questions

Q: How can I tell if a foundation is a good fit for my organization?

A: Look at four key alignment factors: mission (do they fund exactly what you do?), geography (are you squarely in their funding area?), grant size (do they give grants at your level?), and grantee history (when you look at who they fund, do you belong on that list?). If you can't clearly articulate why you fit in all four areas, it's probably not worth applying.

Q: Should I never use AI for grant writing?

A: AI can be a powerful tool for experienced grant writers—it can help generate first drafts, organize information, and improve efficiency. The problem is using AI when you don't have the expertise to evaluate whether its output is good. Learn grant writing fundamentals first, then use AI to enhance your work.

Q: What if all the foundations in my area don't accept unsolicited proposals?

A: This is increasingly common. Your strategy shifts from "submit proposals" to "build relationships." Research foundations that align with your work, identify connections (board members, staff, funded organizations you know), and start relationship-building. Attend their events, engage with their content, and ask for informational conversations. The goal is to get invited to apply.

Q: How many grant proposals should I be submitting per year?

A: According to Grant Professionals Association data, grant professionals write a median of 19-20 proposals per year. Quality matters far more than quantity. It's better to submit 10 highly strategic, well-researched proposals than 50 generic ones.

Q: How do I know if my proposal is too generic?

A: Ask yourself: Could another organization in your field submit this exact same proposal by just changing the name? If yes, it's too generic. Strong proposals include specific data about your organization, concrete examples of your work, and clear evidence of why you're the right organization for this funder at this time.

Q: What's the difference between a strategic long-shot and spray-and-pray?

A: A strategic long-shot means you've identified a genuine connection between your work and the funder's priorities (even if it's not obvious), and you're willing to invest significant time building the relationship and crafting a targeted proposal. Spray-and-pray means sending essentially the same proposal to many funders, hoping something sticks, without strategic thinking about fit.

 

The Bottom Line

The landscape of foundation fundraising is changing. The doors are closing—not because foundations don't want to fund good work, but because they're overwhelmed with poor applications from organizations that haven't done the strategic thinking.

Strategic grant writing isn't just about writing better proposals. It's about making better decisions about where to invest your limited time. It's about knowing when to walk away from a poor-fit opportunity. It's about building relationships and demonstrating a genuine understanding of what funders care about.

If you're serious about foundation funding in 2026 and beyond, it's time to stop throwing applications at every foundation you find and start being strategic about fit.

The foundations that remain open to unsolicited proposals are looking for thoughtful, strategic applications from people who've done their homework.

Give them what they're looking for—and stop contributing to the problem that's closing doors for everyone.

Now I want to hear from you: Have you noticed foundations in your area closing to unsolicited proposals? Are you seeing AI-generated proposals flood your field? And honestly, where do you fall on the spray-and-pray to strategic spectrum? Share your experience in the comments.

References

Candid. (2024). How often do foundations accept unsolicited requests for funds? https://candid.org/blogs/do-foundations-accept-unsolicited-requests-for-funds-from-nonprofits/

Eisenberg, P. (2015, October 20). Let's require all big foundations to let more nonprofits apply for grants. Chronicle of Philanthropy.

Grant Professionals Association. (2023). 2023 GPA compensation and benefits survey. https://grantprofessionals.org/page/salarysurvey

Mika, G. (2024, December 5). Where do foundations stand on AI-generated grant proposals? Candid Insights. https://blog.candid.org/post/funders-insights-on-ai-generated-grant-application-proposals/

Smith, B. K. (2011). [Foundation Center research on unsolicited proposals]. As cited in Nonprofit Quarterly. (2017, February 24). Scaling the wall: Getting your grant proposal heard. https://nonprofitquarterly.org/scaling-the-wall-getting-your-grant-proposal-heard/