Stop Looking for Unicorns

Healthcare AI job postings keep asking for unicorns: 14 years of experience in a field that is three years old, deep AI expertise, deep clinical expertise, deep product expertise. That person does not exist. And even if they did, they would not fix the real problem.

Dr. Yoram Friedman
5 min read

I have been spending a lot of time lately on LinkedIn, writing about healthcare AI and engaging with the people building and deploying it. Alongside the conversations and the articles, I have been scrolling through job postings, as this is a great way to understand where the market is heading. And I keep seeing something that stops me every time.

The postings are serious. The companies are serious. The problems they are trying to solve are genuinely important. But the profiles they are hiring for describe a person who, in many cases, cannot exist. And even where that person does exist, they are not the person who can solve the problem.

Two patterns keep appearing. Each one is a mistake. Together, they reveal something important about how the industry is thinking about product leadership in the age of AI.


Mistake One: You Are Looking for Zebras

One posting I came across required "14+ years of experience at the intersection of AI and digital health."

To be fair, there are people who were working on AI in healthcare fourteen years ago. They just were not calling it that. It was machine learning, or neural networks, or computer-aided detection. It lived mostly in radiology, pathology, and a handful of other specialized domains. It was the foundation for some of what we are building today, and that work genuinely matters. But it is a different discipline, developed under different constraints, without the large language model infrastructure, the generative tooling, or the agentic frameworks that define what "AI and digital health" means in the role being advertised.

There is a saying in medicine that every clinician learns early: when you hear hoofbeats, think horses, not zebras. Look for the common diagnosis before the exotic one. The zebra exists. It is just rare, and designing your hiring process around finding one means you will pass over candidates who already know how to solve the problem and can acquire the AI layer quickly. That is the diagnostic mistake.

The horse is a product leader with deep experience in health tech data infrastructure, clinical workflows, regulated delivery, and enterprise platforms. They understand what makes healthcare hard and what makes AI projects fail. The specific AI framing is learnable. The underlying judgment is not. Overfitting the search to a rare hybrid profile does not raise the quality bar. It just empties the candidate pool.


Mistake Two: Listing Knowledge That AI Can Now Compress

The second pattern is subtler but matters more for what it says about how companies are thinking about the role.

The same postings list requirements like deep expertise in HL7, FHIR, SNOMED CT, TEFCA, HIPAA, and GDPR. These are presented as baseline qualifications, things a candidate must already carry before walking in the door.

I understand the instinct. These are real standards. They matter. Getting them wrong has consequences in a regulated environment. Some of them are not simply knowledge but complex operational constraints whose correct interpretation depends heavily on the use case and requires legal and compliance review.

But here is what has changed. A PM today can use AI to get a fast, structured first pass on any of these frameworks, identify likely conflicts, surface the questions that need legal or architecture review, and prepare a defensible position for a stakeholder conversation. That used to take days of reading. It now takes hours, sometimes less. The knowledge is no longer the bottleneck.

What remains scarce is the judgment to know which parts of that knowledge matter for this decision, in this context, for this product, right now. That judgment cannot be compressed. It is built through years of product work, clinical exposure, and the repeated experience of watching what happens when a good idea meets the reality of a clinical workflow at midnight.

AI can now compress knowledge acquisition. It cannot compress judgment.

Listing HL7 expertise as a primary filter in 2026 is a bit like listing typing speed as a requirement for a director of communications. The tool has changed. The skill being assessed is the wrong one.


What Gets Delegated

The PM who will succeed in healthcare AI over the next five years is not the one who has memorized the most standards. It is the one who can run an end-to-end product definition cycle in an environment where AI is a permanent, capable teammate.

Consider what a modern PM can already delegate: market analysis, competitive summaries, JIRA tickets, white papers, UI mockup concepts, first drafts of PRDs. These used to consume the majority of a senior PM's week. Today they are hours of work, often less. A junior PM with good AI tooling can produce in an afternoon what once required a seasoned practitioner several days.

If AI compresses the time it takes to create artifacts, the value of a senior PM shifts from artifact production to artifact judgment.


What Cannot Be Delegated

The lead PM of the future is not the one who produces the most artifacts. They are the one with the experience to see whether the artifacts are right. They can look at a competitive analysis generated in twenty minutes and immediately identify what is missing, what is misleading, and what actually matters for the decision at hand. They can evaluate a UI mockup not just aesthetically but in terms of whether it will survive contact with a clinical workflow at midnight.

They also know how to resolve the problems that no tool can handle: the stakeholder blocking a critical decision, the engineering lead building the wrong thing for understandable reasons, the clinical champion who has lost confidence in the product direction. Those are human problems that require human judgment developed over years of watching what happens when good products meet real organizations.

AI can accelerate analysis, drafting, and synthesis. It cannot make the call a senior PM has to make under uncertainty, with incomplete information, in a room where the priorities genuinely conflict. That call is the one thing neither AI nor junior colleagues can take off their plate.


What the Posting Should Actually Say

The candidate these postings actually need exists. There are more of them than the current postings acknowledge, because many of them look confusing on paper. They have clinical backgrounds alongside product backgrounds. They have built complex platforms in adjacent industries. They have breadth where the posting is looking for narrow depth.

They will not have fourteen years of AI and digital health experience in the form the posting imagines. They will have something harder to screen for and considerably more valuable: the ability to think clearly about hard problems, the humility to know the limits of their own knowledge, and the discipline to build the product the patient actually needs rather than the one that looked compelling in the specification document.

If you are writing the job posting, here is what the "What You Bring - Must Have" section should actually say:

  • The ability to direct a team of junior PMs and AI agents across the full product cycle, from market analysis through delivery, with the judgment to validate what the team produces and know when it falls short
  • Deep domain expertise in the clinical or operational environment they are building for: not because AI cannot summarize a standard, but because domain depth is what makes AI outputs trustworthy rather than generically plausible
  • The ability to develop differentiated product strategy: this is the area where AI consistently falls short, and it remains the highest-leverage skill a senior PM can own
  • A record of resolving the human and organizational problems that no tool can handle: stakeholder misalignment, clinical resistance, competing priorities in real time, and the gap between a product that works in a demo and one that survives contact with a real workflow
  • High standards: the ability to recognize what great product work actually looks like, and the discipline to refuse anything that does not meet that bar

Stop looking for zebras. Hire the horse.
