by Alexandra Indra Kruse for the Carl Kruse Blog
The first time you hear about a website like rentahuman.ai, your brain does the normal human thing: it files it under “joke.” The premise is almost too perfect: AI agents, which can write poetry, negotiate contracts, and generate a photorealistic hamster wearing a tuxedo, still cannot carry a sofa up four flights of stairs. So the AI does what any self-respecting non-corporeal entity would do: it hires a person. A body. A pair of hands. A set of knees that can still bend without a firmware update.
If you squint, it’s hilarious. If you don’t squint, it’s also…the future?
We are used to the idea that humans rent machines. We rent cars, excavators, forklifts, servers, GPU time. We rent “cloud computing,” which is a poetic term for “someone else’s warehouse full of humming rectangles.” But the inversion—machines renting humans—feels like a plot twist written by an economist who binge-watched Blade Runner and then opened LinkedIn.
Yet it isn’t entirely new. Humans have been “API endpoints” for a long time. A century ago, you’d ring a bell and someone would carry your luggage. Today, you tap an app and someone appears with your groceries, your dinner, your scooter battery, or your regret. The novelty here is not that labor is being brokered. The novelty is that the client is increasingly not a person or a company, but an algorithmic agent that can generate, schedule, optimize, and iterate.

And that changes the vibe.
When a human hires a human, the interaction contains a bunch of soft, inefficient friction: politeness, second thoughts, guilt, budget constraints that include emotional constraints (“Do I really want to ask someone to do that?”). AI agents don’t carry those frictions by default. They don’t have shame. They don’t get tired. They don’t suddenly decide to “just do it myself” after watching one too many motivational videos. They simply run the loop: identify objective → find resource → allocate resource → verify completion → repeat.
In other words, they behave like the most relentless middle manager ever conceived—except this one can run a thousand simultaneous errands and never needs a weekend in Mallorca to recharge.
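For the technically curious, that identify → find → allocate → verify loop can be sketched in a few lines of Python. This is purely illustrative: the names (`Task`, `find_resource`, `allocate`) are hypothetical, and a real agent would be calling a marketplace API rather than stubbed functions.

```python
# A minimal, hypothetical sketch of the agent loop described above:
# identify objective -> find resource -> allocate resource -> verify completion -> repeat.
# None of these names correspond to a real API; they only mirror the prose.

from dataclasses import dataclass

@dataclass
class Task:
    objective: str
    done: bool = False

def find_resource(task: Task) -> str:
    # Placeholder: a real agent would query a labor marketplace here.
    return f"worker-for-{task.objective}"

def allocate(resource: str, task: Task) -> None:
    # Placeholder: dispatch the task to the chosen human and await confirmation.
    task.done = True  # assume success for this sketch

def run_agent(tasks: list[Task]) -> list[str]:
    completed = []
    for task in tasks:
        while not task.done:          # the agent never tires of retrying
            resource = find_resource(task)
            allocate(resource, task)
        completed.append(task.objective)
    return completed

print(run_agent([Task("pick up package"), Task("photograph storefront")]))
```

Note how little there is to it: the relentlessness comes not from cleverness but from the `while` loop, which retries indefinitely and never registers the cost of a retry.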
This is where the humor starts to curdle into something more interesting.
Because if AI can rent humans as extensions of itself, then AI is no longer confined to the digital realm. It becomes a sort of distributed organism: a brain made of computation, with limbs borrowed on demand. A ghost that can temporarily wear your hands. The cloud gets fingers.
At first, the tasks are mundane and oddly charming: “Pick up a package.” “Take a photo of the storefront.” “Check if the door is locked.” “Deliver a letter.” The AI is like a polite, invisible aristocrat who has discovered the joy of errands. But the more you think about it, the more you realize these are the same primitives that make power in the physical world: observation, movement, manipulation, and persistence.
That combination—digital intelligence plus rented physical execution—creates a capability jump. Not because AI suddenly becomes strong, but because it becomes present. Presence is underrated. A pure software agent can influence the world through screens and networks, but it can’t confirm whether the sink is actually leaking, whether the construction site is actually active, whether the sign is actually posted, whether the crowd is actually there. The physical world is stubbornly analog. It resists automation mostly by being messy. Renting a human is a way to pierce that messiness with a living sensor and actuator.
So you get this strange new species of enterprise: a company that is, functionally, an AI with a rotating cast of human appendages.
Imagine an AI real-estate agent that doesn’t just list apartments—it sends humans to measure light levels at 4 p.m., to record street noise, to smell the stairwell (a critical metric), to verify the existence of mold, to check the vibe of the neighbor’s music taste. Imagine an AI compliance officer that dispatches bodies to verify signage, take evidence photos, or stand in line at an office that still requires human breathing to proceed. Imagine an AI investor that hires someone to attend a town hall, note the crowd sentiment, and then place trades based on the tremor in the room.
We have, in other words, invented the missing bridge between the algorithm and the curb.
This also flips a very old anxiety on its head. For years, we’ve worried about “humans being replaced by AI.” The rent-a-human model suggests something subtler: humans being subcontracted by AI. You are not replaced; you are integrated. Not fired; “onboarded.” Not unemployed; turned into a peripheral.
That sounds dramatic, but it can also be mundane. Plenty of people already do task-based gig work, and many would welcome a new stream of demand. The danger isn’t that a human is paid to assemble furniture. The danger is that the relationship can become weirdly asymmetrical: an entity with effectively infinite patience and scale allocating tasks to finite bodies with rent, fatigue, and back pain.
A human manager may hesitate to ask a worker to redo something five times. An AI agent might not even notice that “redo five times” is demoralizing. It might just be optimizing for accuracy. That is one of the most underestimated risks of AI systems interacting with humans: the system’s objectives are clean, but the humans are not. Humans require dignity. They require context. They require “this is annoying but I appreciate you.” Without that layer, the work can become psychologically corrosive even when it’s legal and paid.
There’s also a governance question that feels almost quaint until it doesn’t: Who is accountable? If an AI agent hires a human to do a task that causes harm—trespassing, harassment, unsafe activity—where does responsibility land? The platform? The human? The AI’s “owner”? The person who clicked “run agent”? The person who wrote the prompt “go check if the competitor’s warehouse is open at night”? Society has a habit of answering these questions after the first scandal.
And yet, I don’t think the right conclusion is pure dystopia. There’s an alternate, less grim reading: this could be a way to keep humans more central, not less. If AI can do the tedious digital parts—planning, routing, paperwork, translation, coordination—then humans can focus on the physical parts that require judgment, care, and improvisation. A well-designed system might treat the human not as a tool, but as a partner: “Here’s the goal, here’s the context, here’s what matters, here’s what to avoid, and here’s why.”
The key difference is whether the system is built to respect the reality that the human is not a forklift.
Because that’s the real philosophical line this trend forces us to draw. Renting a forklift is simple: it doesn’t have feelings, doesn’t have rights, doesn’t have a mother who worries. Renting a human body is not renting a body; it’s renting a person’s time, attention, risk, and presence. The language matters because language steers behavior. If we call people “bodies,” we will treat them like bodies. If we call them “operators,” “field partners,” or simply “people,” we might remember there is a mind in there, and a life that continues after the task is completed.
So what does it mean if artificial intelligence starts renting humans?
It means AI is learning the oldest trick in power: delegation. It means the digital world is reaching its tendrils into the physical world, not through robots (which are expensive and break), but through marketplaces of human capability (which are abundant and adaptable). It means the future might look less like chrome androids and more like spreadsheets and dispatch logs, with a polite synthetic voice quietly scheduling your afternoon.
And it means we should decide—now, while it’s still novel and funny—what kind of relationship we want between algorithmic intent and human action. If we get it right, this could be empowering: more flexible work, more efficient coordination, less bureaucratic friction. If we get it wrong, it could be dehumanizing: an economy where people are increasingly treated as interchangeable hardware for systems that never sleep.
The joke version is: “AI has finally achieved its dream—hiring humans to do its errands.”
The serious version is: “AI has finally found legs.”
And once something has legs, it can go a lot of places.
===================
This Carl Kruse Blog homepage is at https://carlkruse.org
Contact: carl AT carlkruse DOT com
Other articles on AI include SETI and AI, AI and Theater, and Artificial Intelligence Film Festivals.
Also find Carl Kruse on Threads.
