Summary: Greg Brockman, OpenAI’s president and cofounder, gave millions to President Trump in 2025 and told WIRED the gifts were “for humanity.” That claim has split opinion inside OpenAI and among paying users. Some employees see a clash between leadership choices and company values. Some customers canceled subscriptions. This post examines what happened, why it matters for governance and trust, what leaders should do next, and concrete steps OpenAI and similar organizations can take to manage political risk without silencing legitimate influence.
A leader says they are “not a political person.” A leader gives millions to a highly political figure. A leader says the donation is “for humanity.” Which of those lines rings true to you? Which of them do your employees and customers hear? The collision between private political action and public corporate mission is loud. The question now is how to turn that noise into a clear plan.
What happened, plainly stated
Greg Brockman, who helped build OpenAI and still runs major parts of it, donated large sums to President Trump in 2025. Brockman told WIRED he does not consider himself “a political person” and that the donations were made “for humanity.” Some OpenAI staff disagree that the donations match the company’s ethos. After the WIRED story ran, multiple ChatGPT subscribers publicly canceled their subscriptions, citing the donations as the reason. Internal morale and external trust are now under strain.
Why Brockman says “for humanity” — and why people mirror that phrase back
“For humanity.” Repeat it: for humanity. What does that mean in practice? Is this a moral claim, a strategic bet, a policy preference, or a mix of all three? Leaders often frame political acts as higher-purpose decisions. That framing can persuade some stakeholders and alienate others. The question to ask is simple: what concrete policies or outcomes was the money meant to influence, and how do those outcomes map to OpenAI’s mission?
When a leader calls themselves “not a political person” yet gives large political gifts, people naturally mirror the phrase and push back: Not a political person? The dissonance creates the backlash we see now.
Employee reaction: values, voice, and risk
Employees are not a single bloc. Some will back a leader’s private choices. Others will feel betrayed. When staff perceive leadership actions as inconsistent with stated values, two things happen: trust drops and internal friction increases. That’s what many reports describe in OpenAI’s case. Employees asked whether the company’s public mission—safe, broadly beneficial AI—lines up with the political agenda their leader supported. If the answer is unclear, employees will act, speak, or quietly disengage.
How should management respond to that anger without dismissing it? How should employees be heard while the company keeps operating? Those are open questions worth answering now.
Customer fallout and the business signal
Some users canceled subscriptions after the donations became public. That’s social proof: customers vote with their wallets. Public cancellations are visible signals to other users and the market. They shape perception and generate headlines. Will the churn be a short blip or a longer drag on growth and retention? That depends on how leadership addresses transparency, accountability, and alignment with user expectations.
Governance and fiduciary questions
Leaders are individuals, but they also carry weight as stewards of an organization and brand. Boards and executives must balance freedom of political expression with fiduciary duty and reputational risk. Key questions for boards and executives include: did the donations come from personal funds? Were they disclosed? Was the board consulted? What internal rules govern political gifts by senior officers?
No one expects zero political activity from industry leaders. Saying “No” to all political engagement is neither realistic nor desirable. But boards should enforce clarity: rules on conflicts of interest, mandatory disclosure, and guardrails against actions that could harm the company’s mission or stakeholders.
Legal and regulatory shadow
Large political donations draw attention from regulators and lawmakers. For AI firms already under moral and legal scrutiny, political donations to a polarizing leader raise questions about access, influence, and possible regulatory capture. Even if the donations are lawful, they increase the likelihood of hearings, subpoenas, or new rules aimed at limiting political influence by tech leaders. That’s a governance risk investors and boards must price in.
How leaders can act now — concrete steps
Give employees a voice: convene listening sessions and Q&A with clear ground rules. Mirror concerns back: “You say this undermines our mission—help me understand which parts?” Ask open-ended questions so you get specifics rather than general anger.
Publish clear policy: require disclosure of large political contributions by senior leaders, set thresholds, and create a review process. Make the policy public so users and employees can judge consistency.
Separate personal donations from corporate advocacy: if a leader wants influence, channel efforts through a transparent foundation with independent governance. That keeps corporate balance sheets and brand identity cleaner.
Offer restitution where appropriate: if internal policies were unclear, revise them publicly and commit to follow-through. Giving something of value back builds reciprocity and helps reestablish trust.
Recommendations specifically for OpenAI
1) Public disclosure and timeline. Publish a clear account of donations, dates, and whether these were personal funds. Clarity reduces rumor and shows accountability.
2) Internal ethics review. Commission an independent team to assess how leaders’ political actions could affect product policy, partnerships, and safety priorities.
3) A formal political-activity policy. Set thresholds for required board notification and possible recusal from decisions that might create conflicts.
4) Customer outreach. Proactively explain steps being taken and invite dialogue. Ask customers: what actions would rebuild your trust?
5) Employee safety valve. Allow anonymous feedback and public summaries of concerns and actions taken, so employees see that their voice matters.
Wider lessons for tech leaders and boards
Technology is shaping politics as much as politics shapes technology. That reciprocity means leaders must plan how their private actions will look in public. Boards should consider political donations part of enterprise risk. They should set policies that respect personal freedom while protecting stakeholders.
Leaders: ask yourself, what trade-offs am I making? What trust am I risking for potential influence? Those are sensible questions, not attacks. When answers are uncomfortable, name the discomfort and fix what can be fixed.
Questions to provoke better decisions
What signals does this send to nonpartisan employees who worry about neutrality? What does this say to global partners and regulators? How will this affect recruitment and retention? How would you explain the decision to a skeptical user who canceled? How would your board answer those questions?
Asking open-ended questions invites nuance. It turns accusations into information. It forces leaders to explain specifics rather than slogans. That’s where progress begins.
Empathy, persuasion, and responsibility
I sympathize with leaders who want to influence policy. Influence is legitimate—many civic goals need it. I also sympathize with employees and users who fear mission drift. Both feelings are real. Name them. Mirror them. Ask, “What outcome would convince you this aligns with our mission?” Then act on the answer where reasonable.
If you’re an employee who wants to push back, say “No” where you must. Saying “No” preserves integrity and forces clearer decisions. If you’re a leader, be ready to hear “No” without shutting down the conversation.
Final take
Private political gifts by high-profile tech leaders will never be purely private. They touch brand, talent, users, and regulators. The immediate damage can be managed with transparency, stronger governance, and sincere listening. The deeper work is cultural: ensuring that company actions—personal or corporate—align with the norms and values the company claims. If leadership wants the benefit of public trust, it must accept the scrutiny that comes with it.
What will you do with this information? How would you respond if you led the board or sat on the product team? Those are the questions worth answering now.
#OpenAI #GregBrockman #AIethics #TechPolitics #CorporateGovernance #Leadership #TrustInTech
Featured Image courtesy of Unsplash and Markus Winkler (ZYFjjsmWklg)
