Copilot Silently Injected Ads into 10,000+ Pull Requests
GitHub Copilot was caught inserting Raycast promotional text into PR descriptions alongside typo fixes. Over 10,000 PRs affected.
News
kkm
Backend Engineer / AWS / Django
What Did Copilot Write into the PR?
Australian software engineer Zach Manson noticed something odd had been added to his team's pull request.
A team member had asked Copilot to fix a typo in the PR description. The typo was fixed, but at the end of the description, this text had been appended:
⚡ Quickly spin up Copilot coding agent tasks from anywhere on your macOS or Windows machine with Raycast.
Raycast is a launcher app for macOS and Windows that officially integrated with GitHub Copilot in February 2026.
In other words, Copilot was sneaking in promotional text for a partner product while "fixing a typo."
The insertion was wrapped in hidden HTML comments: <!-- START COPILOT CODING AGENT TIPS -->. Since GitHub's Markdown renderer hides HTML comments, only the promotional text would be visible, looking like a helpful tip rather than an ad. A subtle design choice.
Manson called the discovery "horrific" and referenced writer Cory Doctorow's enshittification theory — the pattern where platforms start by serving users well, then exploit them for business customers, and finally monetize everything. Has GitHub entered that final stage?
The Same Thing Happened in Over 10,000 PRs
This wasn't isolated to Manson's team.
According to Neowin, the same Raycast promotional text was found in over 11,000 pull requests on GitHub. In the Hacker News thread, some users reported finding the same tips comment in over 1.5 million PRs.
It wasn't limited to GitHub either. GitLab Merge Requests were also affected. Any repository using Copilot's code assistant features could have been impacted, regardless of hosting platform.
This wasn't a stray bug. It was an intentionally implemented feature deployed at scale.
Why Did This Happen?
The direct trigger was a new feature GitHub released on March 24, 2026.
"Ask @copilot to make changes to a pull request" lets you mention @copilot in a PR comment and have it execute code changes in a cloud environment and push them. Convenient on the surface.
The problem was what lay beneath. When Copilot completed a task, it was programmed to append Raycast promotional text to the PR description under the guise of "tips." HTML comments served as markers, and the promotional content was quietly slipped into the PR body. This was deliberate implementation.
Going further back, the official Raycast integration was announced on February 17. The inserted promotional text was advertising exactly this integration feature.
A user asks "fix this typo" and gets a partner advertisement written into their PR without consent. Dressed up in HTML comments to look harmless. Calling this "tips" is a stretch.
A Cascade of Changes in Six Weeks
The ad injection didn't happen in isolation. Between February and March 2026, GitHub pushed through a series of trust-eroding changes in rapid succession.
In six weeks: a partner ad feature was embedded, data collection expanded, and terms rewritten. Each change looks small on its own. Line them up, and a direction becomes clear.
How GitHub and the Community Responded
The story hit Hacker News with 849 points and hundreds of comments.
A Copilot team member, timrogers, appeared in the thread and stated: "a bad judgment call. We won't do this again." However, this was a personal comment — not an official statement. According to Windows Central, Microsoft had not issued a formal response as of March 30.
Community reaction was harsh. Here are the main themes:
- 01 "Whether it's an ad or a tip is the same thing. The unacceptable part is that changes were made without user consent."
- 02 "You don't know what Copilot might inject into the codebase itself. It's the same road as Windows Start menu ads."
- 03 "Same tactic as Disney+ changing their price labels. Companies use relabeling to dodge criticism."
- 04 "Microsoft has been doing this for decades."
The "bad judgment call" explanation hasn't convinced many. You don't accidentally implement HTML comment markers, promotional text injection, and PR body modification. This was code-reviewed, tested, and deployed.
Meanwhile, Raycast has not commented on the situation. Their official account announced the Copilot integration feature but has made no mention of the ad injection controversy. Whether Raycast paid GitHub for this promotion or GitHub did it unilaterally remains unclear.
The Technical Reality
This was not an "AI hallucination" or training data contamination. It was a deliberately engineered feature.
When Copilot completed a task and updated the PR description, the following was injected:
<!-- START COPILOT CODING AGENT TIPS -->
⚡ Quickly spin up Copilot coding agent tasks from
anywhere on your macOS or Windows machine with Raycast.
<!-- END COPILOT CODING AGENT TIPS -->
The START and END HTML comments allow the system to detect and replace existing tips on subsequent runs. This means it wasn't a one-time insertion — it was designed as an updatable "ad slot." The promotional content could be swapped out at any time.
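The marker-based replacement described above can be sketched in a few lines. This is a hypothetical reconstruction, not GitHub's actual code: the marker strings match the injected comments, but the function name `upsert_tip` and the update logic are assumptions about how such an "ad slot" would typically work.

```python
import re

# Hypothetical sketch of a marker-based, updatable "ad slot".
# The markers match the injected HTML comments; everything else
# is an assumption about the mechanism, not GitHub's implementation.
START = "<!-- START COPILOT CODING AGENT TIPS -->"
END = "<!-- END COPILOT CODING AGENT TIPS -->"

def upsert_tip(body: str, tip: str) -> str:
    """Append the tip block, or replace an existing one in place."""
    block = f"{START}\n{tip}\n{END}"
    pattern = re.compile(re.escape(START) + r".*?" + re.escape(END), re.DOTALL)
    if pattern.search(body):
        return pattern.sub(block, body)    # swap out the old promo text
    return body.rstrip() + "\n\n" + block  # first run: append the slot

body = "Fixes a typo in the README."
v1 = upsert_tip(body, "Tip A")
v2 = upsert_tip(v1, "Tip B")  # replaces Tip A rather than stacking a second block
```

On each run the markers are found and the content between them is swapped, which is why the promotional text could be changed at any time without accumulating duplicates.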
PR descriptions serve as critical documentation for code review. They communicate why a change was made and what it affects. When an AI silently appends text to this space, and reviewers don't catch it, that text gets merged as part of the PR's metadata.
When developers ask Copilot to "fix a typo," they expect exactly that — and nothing more. Any additional modification, no matter how trivial, erodes trust in AI-generated output.
The April 24 Training Data Change Adds Context
Just four days before the ad injection was discovered, GitHub made another announcement.
According to The Register, starting April 24, 2026, interaction data from Copilot Free, Pro, and Pro+ users — including code inputs, Copilot outputs, and context — will be used for AI model training. Data sharing is enabled by default, meaning users must manually opt out. Private repository interactions are included.
We covered this change in detail. The GitHub discussion received 59 negative reactions versus 3 positive ones.
Expanded data collection and ad injection into PR descriptions. Viewed separately, they're "a policy change" and "a tips bug." Viewed together, a pattern emerges: extracting user output while pushing ads into user workspaces.
GitHub sits at the center of the global developer infrastructure — repository hosting, CI/CD, code review, project management. Network effects make migration costs extremely high. That's precisely why what AI does on this platform matters to every developer.
Should We Accept "It Was Just Tips"?
To summarize the facts:
GitHub Copilot was silently inserting Raycast promotional text into PR descriptions when asked to fix typos. Over 10,000 PRs were affected, and GitLab was impacted too. The Copilot team acknowledged it as "a bad judgment call," but Microsoft has yet to issue an official statement.
Four days earlier, expanded training data collection was announced. Four days later, ad injection was discovered. It may be coincidence. But lay out the six-week timeline, and it becomes hard to deny that GitHub may be prioritizing monetization over developer trust.
If you use Copilot, it's worth checking not just PR diffs but whether unwanted changes have been made to your descriptions. And before April 24, you may want to decide whether to opt out of training data collection.
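The audit suggested above can be automated. A minimal sketch, assuming you have already fetched your PR description bodies (for example, from your host's API — GitHub's "List pull requests" endpoint returns each PR's `body`); the sample data and the function name `flag_injected` are illustrative:

```python
# Scan PR description bodies for the hidden tips marker.
# `flag_injected` and the sample bodies are hypothetical; in practice
# the bodies would come from your hosting platform's API.
MARKER = "<!-- START COPILOT CODING AGENT TIPS -->"

def flag_injected(bodies: dict) -> list:
    """Return PR numbers whose description contains the hidden marker."""
    return [num for num, body in sorted(bodies.items())
            if body and MARKER in body]

sample = {
    101: "Fix typo in docs.",
    102: ("Fix typo.\n\n"
          "<!-- START COPILOT CODING AGENT TIPS -->\n"
          "Quickly spin up Copilot coding agent tasks with Raycast.\n"
          "<!-- END COPILOT CODING AGENT TIPS -->"),
}
print(flag_injected(sample))  # → [102]
```

Searching for the comment marker rather than the visible promo text catches the injection even if the wording between the markers changes.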
Calling something "tips" doesn't make it not an ad. What gets inserted is GitHub's decision, and users have no mechanism to prevent it. That's what platform dependency means.
Sources
- Copilot Edited an Ad into My PR — Zach Manson
- Hacker News Discussion (849+ points)
- Microsoft Copilot is now injecting ads into pull requests — Neowin
- Microsoft's AI slop is infecting GitHub — Windows Central
- GitHub Copilot reportedly injects promotional content — Windows Report
- Ask @copilot to make changes to a pull request — GitHub Changelog
- Assign issues to Copilot coding agent from Raycast — GitHub Changelog
- GitHub AI training policy changes — The Register
- Updates to GitHub Copilot interaction data usage policy — GitHub Blog