The Intelligence Gap: The Human Intelligence Automation Can't Capture... Yet

A recent Digiday piece on influencer agencies and the shift toward automation platforms captured something important: the industry is undergoing real structural change. The pattern it identified—agencies collapsing, platforms raising capital, sophistication moving from service to software—tracks with what we're seeing.

"The influencer marketer's role is dead," Jonathan Chanti declared in the piece. "All the jobs they used to do—find people, curate them, reach out to them, nurture them, report on them—you can build an AI that does a better job of that right off the bat."

It's a bold claim, and the directional thesis is right. But it got me thinking about what automation actually looks like once you're in the operational trenches—and whether we're building toward the foundation that makes it possible, or just automating the easy parts while the hard infrastructure problems go unaddressed.

What "Human in the Loop" Actually Means

The phrase "human in the loop" has become ubiquitous. Everyone agrees humans will be involved. The question is: doing what?

A brand wants to activate 200 creators for a trending moment. Discovery platforms surface candidates instantly. AI drafts outreach and suggests rates. Then the human part starts: A creator's manager says the rate is too low and they only do 60-day exclusivity, not 30. Legal flags usage rights concerns. The creator delivers early with extra content someone needs to review. Their manager mentions expanding into a new vertical—worth exploring?

Email threads branch. WhatsApp conversations happen outside the system. Spreadsheets track responses. Slack holds approval status. Contracts go through DocuSign, get saved as PDFs somewhere. The intelligence generated—what rates closed, which terms caused delays, what delivery patterns emerged—scatters across tools. Most never becomes structured data.

The Data That Exists Versus The Data We Can Access

The automation narrative assumes the data problem is solved. But the constraint isn't AI capability—it's what data exists in accessible form.

Discovery platforms pull from social media APIs. Everyone has the same follower counts, engagement rates, demographics. This intelligence is commoditized.

The differentiated intelligence—what you paid, what terms you negotiated, which creators over-deliver, which managers respond quickly, what contract structures work—lives in scattered communications. Email archives, WhatsApp messages, PDFs on desktops, departing employees' institutional knowledge.

AI can't analyze data that doesn't exist in structured form. This is why Jamie Gutfreund's insight in the piece resonated so strongly: "Access isn't the advantage anymore. Data is." She's exactly right. But which data? The performance metrics everyone can access, or the operational intelligence nobody's capturing?

Why Consolidation Is Harder Than It Looks

The natural response: build one platform that handles everything. Discovery, outreach, negotiation, contracting, performance tracking.

This runs into a problem that's behavioral, not technical. People use different tools for different contexts: email for formal negotiation, WhatsApp for urgent coordination, Slack for internal approvals, DocuSign because legal requires it, spreadsheets for tracking what platforms don't handle, discovery platforms for sourcing intelligence.

Each tool serves a purpose. The fragmentation reflects how the workflow actually functions. The "one platform to rule them all" approach fails because human behavior naturally fragments across contexts. The better question: how to consolidate the intelligence those tools generate, not the tools themselves.

What Element Human and Others Are Actually Wrestling With

Steph Money from Element Human articulated the industry's challenge clearly in the discussion thread around the piece. She sees two divergent paths: agencies becoming specialized cultural experts operating within brand tech constraints, while brands bring technology in-house to treat creators like media inventory.

But both paths converge on the same risk: AI-driven efficiency sliding into creative mediocrity. The future edge isn't just performance and execution—those get automated. It's mastering the "holy trinity": creativity, tangible ROI, and scale.

That trinity requires connecting creative quality to economic results to operational scalability. You need to know not just what performed well, but what you paid, what terms you negotiated, what delivery patterns emerged, whether those patterns scale.

Element Human and similar firms wrestle with capturing this when it lives in unstructured communications across fragmented systems. It's not a discovery or performance tracking problem—those are solved. It's an intelligence consolidation problem.

The Behavioral Complexity Nobody's Accounting For

There's an assumption embedded in the automation narrative: micro and nano creators can be automated because they have less leverage. They're not working with managers or agents, they need the income more—so deals are simpler, faster, more standardized.

Are we sure? When you're activating 200 micro-creators instead of 20 mid-tier influencers, coordination complexity explodes. Each has their own communication style, schedule, comfort level with legal language. And risk compounds: with 200 creators, you're exponentially more likely to encounter someone who doesn't understand exclusivity, misses deadlines, delivers off-brand content, or has a controversy emerge.

We saw this building Basa. A teenage creator's parent joins calls about image rights worried about college applications. A micro-creator reviews terms on their phone between work shifts with questions about unfamiliar language. Someone delivers technically compliant content that's tonally wrong in ways automation wouldn't flag.

The Spectrum of Complexity

Consider complexity across tiers. Mid-tier creators often have managers but not full representation—who approves contracts when they disagree? Macro and mega influencers have agents, managers, lawyers, business managers, publicists. A single deal might require approval from four people with different priorities.

Each tier has its own choreography. Micro-creators need hand-holding but decide quickly. Mid-tier moves faster on communications, slower on approvals. Mega influencers have sophisticated teams that negotiate every clause. And creative varies: nano creators deliver on phones, macro influencers have content teams, celebrities involve photographers and stylists.

Then there's the brand side. Straightforward posts need legal and brand approval. Complex partnerships—video series, product claims—require legal, brand, product, PR, potentially executive sign-off. Each layer adds friction. When something goes wrong, responses aren't automated: Do we extend deadlines? Request reshoots? Terminate? These require understanding contract terms, relationship history, timing, and risk tolerance.

The automation narrative assumes that once you solve discovery and outreach, the rest is straightforward execution. But execution is where behavioral complexity lives—different at every tier, across cultures, based on relationship history. That's not a problem AI solves without infrastructure designed around behavioral reality.

How Close Are We Really?

So how close are we to total automation? It depends on what we mean. Can AI handle discovery and initial outreach? Increasingly. Generate reports and surface performance patterns? Absolutely.

But can it synthesize intelligence from scattered communications? Navigate behavioral complexity that compounds with volume? Make judgment calls about risk, relationship history, strategic trade-offs? Not without infrastructure that doesn't exist yet.

Agencies struggle not because AI can replace them, but because they never built infrastructure to capture their own intelligence. Platforms raise money not because they've solved it, but because everyone recognizes something needs building. Brands go in-house not because they have better technology, but because they're tired of operating blind on contract economics.

The real question isn't whether AI will automate creator marketing. It's whether we'll build infrastructure that makes automation meaningful—that captures operational intelligence, connects it to performance data, makes both accessible for better decisions. That's not replacing human judgment. It's giving judgment better intelligence to work with.

The Intelligence Layers Nobody's Connecting

If the constraint is intelligence consolidation, what does infrastructure actually look like? The answer: it depends which intelligence layer you're prioritizing. There isn't one missing piece. There are several serving different functions.

Four Different Intelligence Sources

Element Human focuses on social listening and cultural intelligence—sentiment, trends, brand perception from social platforms, comments, engagement patterns. This informs creative strategy. The challenge: structuring unstructured social signals into actionable insights, knowing what's signal versus noise.

Contract economics reveals what deals cost, what terms work, and what patterns predict success. Operational intelligence—who responds quickly, what causes delays, relationship quality—lives in email, WhatsApp, and Slack. Performance data from platform APIs is the only layer that's already solved.

Which layer should be prioritized? It depends. Staying culturally relevant needs social listening. Scaling efficiently needs contract economics. Moving fast needs operational intelligence. The problem: these layers operate in silos with no meaningful connections.

You can know a creator performed well without knowing what you paid or what terms you negotiated. You can understand cultural trends without knowing which creators execute efficiently. You can see responsiveness without knowing if their audience cares about the cultural moment.
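To make the silo problem concrete, here's a minimal sketch of the four layers as separate record types. The class names, field names, and identifiers are hypothetical, not any platform's actual schema; the point is only that each layer keys on a different identifier and lives in a different tool.

```python
from dataclasses import dataclass

# Hypothetical records; names are illustrative, not any platform's schema.

@dataclass
class PerformanceRecord:          # from platform APIs (commoditized)
    creator_handle: str
    campaign: str
    engagement_rate: float
    impressions: int

@dataclass
class CulturalSignal:             # from social listening
    topic: str
    sentiment: float
    engaged_handles: list[str]    # audiences engaging with the topic

@dataclass
class ContractTerms:              # from deal-making (usually PDFs and email)
    creator_legal_name: str       # note: a different identifier than the handle
    fee_usd: float
    exclusivity_days: int
    usage_rights: str

@dataclass
class OperationalNote:            # from email / WhatsApp / Slack (mostly unstructured)
    contact_name: str
    avg_response_hours: float
    delivered_early: bool

# The silo problem in one line: nothing guarantees these four records describe
# the same creator, because each layer keys on a different identifier
# (handle, topic, legal name, contact name) in a different tool.
```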

The Synthesis Problem

How should these layers connect? Who's responsible? Right now: humans manually. Someone looks at performance dashboards, reads cultural intelligence reports, checks email for contract terms, synthesizes into decisions. Works at small scale. Breaks at volume.

What's needed isn't just better point solutions—it's connective tissue between layers. The ability to ask: "Show me creators who performed well on similar campaigns, whose audience engages with this cultural moment, who we've worked with successfully, and what we paid them."
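Here's what that question might look like once the layers sit behind a shared creator identifier—a minimal sketch, assuming hypothetical dictionaries, field names, and thresholds, not a description of any existing system.

```python
# Sketch of the cross-layer question, assuming the four layers have already
# been consolidated and keyed to a shared creator_id. Everything here
# (structures, fields, thresholds) is hypothetical.

def shortlist(creators, performance, cultural, contracts, history,
              campaign_type, cultural_moment, max_fee_usd):
    """Creators who performed well on similar campaigns, whose audience
    engages with this cultural moment, whom we've worked with successfully,
    and whose last fee fits the budget."""
    picks = []
    for cid in creators:
        perf = performance.get((cid, campaign_type))       # platform APIs
        trend = cultural.get((cid, cultural_moment))       # social listening
        deal = contracts.get(cid)                          # contract economics
        past = history.get(cid)                            # operational intelligence
        if not all([perf, trend, deal, past]):
            continue                                       # a missing layer is a blind spot
        if (perf["engagement_rate"] > 0.04
                and trend["audience_affinity"] > 0.5
                and past["delivered_on_time"]
                and deal["last_fee_usd"] <= max_fee_usd):
            picks.append((cid, deal["last_fee_usd"]))
    return sorted(picks, key=lambda p: p[1])
```

The filtering itself is trivial. The reason nobody can run it today is that two of those four inputs mostly exist as PDFs and chat threads rather than structured data.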

No single platform can capture this because intelligence sources are fundamentally different. Social listening needs real-time cultural analysis. Contract data needs structured deal-making workflow. Performance tracking needs platform API integrations. Operational intelligence needs capturing communications across fragmented tools.

Ecosystem Over Platform

What we might need isn't "one platform to rule them all" but an ecosystem where specialized tools focus on their intelligence layer and expose it for others to access. Where Element Human's cultural insights inform Basa's contract negotiations, which connect to discovery platforms' performance data, which feed back into social listening.

This is harder than "automate creator marketing with AI." It requires rethinking how intelligence flows through the industry, not just building better isolated solutions.

Maybe that's why the automation narrative feels off. It assumes we can automate execution once we solve sourcing and reporting. But execution requires synthesizing multiple intelligence layers that don't connect—many of which don't exist as structured data at all.

The question isn't whether automation is coming. It's whether we're building the intelligence infrastructure that makes it meaningful. Right now, most investment goes toward the parts that are already commoditizing—discovery, initial outreach, performance reporting—while the harder infrastructure problems go unaddressed.

Maybe we get to meaningful automation faster by acknowledging what's actually missing rather than assuming the foundation already exists.

The Digiday piece surfaced the structural shift that's happening. These are the operational layers we're seeing beneath it, not as a counterpoint but as texture on what the transition actually requires.

© Basa Futura, Inc, 2025