In “The Measure Of A Man,” the 9th episode of Star Trek: The Next Generation’s 2nd season, someone from Starfleet wants to disassemble robot crewman Data to learn more about him and create more robots. When Data (and Picard) object, the Federation convenes a tribunal to determine whether or not Data is human.
I do not like this.
Let me establish two things right out of the gate. One, if you don’t know Star Trek, it is possible the rest of this post will be impenetrable to you, but I’ll do my best to explain why I think the plot does not work and why that matters in a way that should be generalizable to non-Trekkies.
Two, if you do know Star Trek, you know that “The Measure Of A Man” is one of the all-time great episodes of the franchise—definitely of TNG. So: when I say “I do not like this,” I am not saying that it is a bad episode, that you are bad for liking it, that it should not have been written, etc.
It is, almost any way you look at it, one of the better-plotted episodes of the show’s first couple seasons, and it demonstrates a level of writing and of imagination that TNG struggled to find early on. “The Measure Of A Man” is excellent television and pretty good Star Trek—to answer the question posed in the headline.
So. What do I think about it?
This will take a bit. I think this is the first Substack post I’ve written that is more opinion than research, but I’ll try to bring this back to something instructive—at least, how I’ve found it instructive in my own writing. To begin with, though, here is some context.
In-universe, Data (played by Brent Spiner) is an android created by brilliant, idiosyncratic roboticist Noonien Soong. He outwardly resembles a human being, albeit with pale skin and yellow eyes; he weighs about 100 kilos, wears a normal Starfleet uniform, and occupies ordinary quarters aboard the USS Enterprise-D.
Data did not experience a childhood and was theoretically an adult from the beginning of his existence. He was “activated” in 2338 and matriculated into “Starfleet Academy” in 2341, where he earned high marks. He was formally commissioned as a Starfleet ensign in 2345, and was promoted to lieutenant commander in 2360, five years before “The Measure Of A Man” is set.
Out-of-universe, Data’s character arc in TNG is that of someone who wishes to both understand humanity and, eventually, to “become human” in some meaningful sense. He is articulate, but struggles with metaphor and colloquial speech; he does not always perceive context and does not—or believes he does not—truly “understand” things like humor.
You will, perhaps, recognize this as a very common Robot Awakens plot. Data is explicitly compared to Pinocchio in an early episode. In that sense, “is it okay to take Pinocchio apart?” is, on its face, not an unreasonable thing to explore.
If nothing else, though, I think Star Trek: The Next Generation in particular was ill-equipped to explore it.
For a start: “The Measure Of A Man” takes place halfway into the second season. This is pretty late in the game to be asking a fundamental “is this character human enough to not murder?” question. The Good Place had its characters wrestle with whether or not it was moral to exploit artificial person* Janet seven episodes in.
* not a person
At that point she’d had like 15 minutes of screen time, and most of those were answering questions like an upbeat Google. By the time Picard and Riker are engaged in their tribunal, though, viewers had watched over 21 hours of TNG. Which means for most of them, the question was long since answered.
Also, most of those hours sucked.
The first episode of The Next Generation, “Encounter at Farpoint,” introduced the audience to the show’s main antagonist: an omnipotent creature who exerts total control over the Enterprise and her crew, and whose whims—variously his own amusement or his desire to better understand humanity—set the direction for the early years of TNG.
This mercurial being, “Star Trek” creator Gene Roddenberry, spent the entire first season and part of the second subjecting both the crew and the audience to some of the worst television that it is still possible to buy and consume on purpose. A better-managed show might have done a proper job establishing Data as a more alien (and alienating) being, one where we ourselves might have some doubts about how human he truly is.
TNG was not that show. Over the previous 33 episodes of Brent Spiner mugging for the camera, Data had:
Pretended to be Sherlock Holmes
Had sex, after being infected with the same intoxicant as the rest of the crew
Decked a gangster mook with relish and then narrated it in a perfect film-noir pastiche, complete with radio voice
Taken up painting
Talked to himself so distractingly he upset the ship’s computer
Intentionally misinterpreted a direct order to buy himself enough time to rescue his friends
Been the only crewman with sufficient empathy to realize the cryogenic passengers of an ancient starship should not be left to simply die because it’s inconvenient
Pretended to be Sherlock Holmes again
The audience is told that Data is not human and does not experience human emotions, but apparently Brent Spiner was not. Consequently, the telling never feels credible. A recurring theme, for instance, is someone else in the cast recounting a “joke” to Data that would not make the cut for a popsicle stick of even the lowest standards, then gaslighting him like an asshole when he fails to react.
(In “The Outrageous Okona,” the “joke” that drives Data to seek out advice from a professional comedian is Guinan—Whoopi Goldberg, for God’s sake—telling Data “you’re a ‘droid, and I’m a ‘noid.” This leads him to watch, and try and learn from, a physical comedy routine in the ‘holodeck’—the ship’s virtual-reality entertainment room. Unimpressed, he tells the comic, dryly: “So, if you put funny teeth in your mouth and jump around like an idiot, that is considered funny?”)
In “Skin of Evil,” Data processes and experiences grief over the death of his friend-and-maybe-more Natasha Yar. In “The Child,” new ship’s doctor Katherine Pulaski (Diana Muldaur) mispronounces Data’s name (and is corrected by him) and brushes off his lack of “human touch” (and is corrected by her patient). The narrative, and the crew, understand Dr. Pulaski to be unequivocally in the wrong.
I’m setting all of this up to get to two points.
The first is that “is this new, confusing, and alien life-form deserving of respect and equal treatment?” is one of the classic sci-fi questions. Star Trek asks it a lot. Star Trek: The Next Generation had already tackled this in Season 1’s “Home Soil,” where they encounter a form of intelligent sand and Data is the first to realize it is alive. Data would get to explore the topic directly in Season 3’s “The Offspring,” where he constructs a child for himself.
Data is not, however, a new, confusing, or alien life-form. Even in-universe, we know that Data is capable of experiencing emotions, because his identical “brother” Lore is capable of doing so, and when a dying scientist transplants his mind into Data (“The Schizoid Man”), he does so without any apparent difficulty imposing his own personality on Data’s circuits.
As I said: a well-written TV show might have been able to present Data as a genuinely questionable example, but early TNG was not a well-written TV show. Indeed, having established as a shibboleth in “Datalore” that the emotional Lore is capable of using contractions and Data is not, early TNG was not even sufficiently competent to make it through that entire episode without having Data use a contraction.
There is no doubt that Data is either human, or near enough to human as makes no difference. My second, and more minor, point is that this is so obvious that the premise of “The Measure Of A Man” becomes profoundly flawed. Actually, I’d call it so flawed that—taken not merely as 40 minutes of television drama but as instructive about the Trek vision itself—it introduces real problems to the setting.
Nobody, in the episode, is willing to treat the tribunal as the obvious farce it is. Riker and Picard aver that they believe Data to be human, but not so strongly as to just shoot the judge in the face and help Data flee to safety. The judge, in true enlightened centrist form, wants to have a polite discussion about this issue of fundamental injustice; the crew plays along.
And, in the end, Maddox—the man who wants to disassemble Data—abides by the ruling when it goes against him. Would Picard?
I don’t know. That is a problem. “The Measure Of A Man” requires us to accept that the Federation’s sense of morality is so utterly twisted that the characters we’ve watched for 33 episodes are willing to sit in a courtroom and debate whether their friend is, in fact, their friend—or even capable of being one. The discussion being entertained at all is, as an indictment of the Federation and Star Trek’s utopian vision, worse than anything Section 31 did in Deep Space Nine.
(Also, and far more trivially, the judge requires Commander Riker to prosecute the case against Data—if he does not, she will enter summary judgment. This is great drama and great television. Can you imagine its impact on morale? Starfleet having this as established policy is not only unspeakably cruel but so unfathomably stupid that it makes the implications of Voyager’s “warp 10, but only if you turn into mutants and have salamander babies” episode look as inconsequential as a boom mic accidentally left in shot or a misspoken stardate.)
Anyway, here’s why I think this matters: “The Measure Of A Man” claims to be asking what it means to be human. It’s not. What it is actually asking is: “hey, is it cannibalism if I eat my clone?”
Science fiction also asks this question all the time, and I basically always hate it. I am extremely uninterested in whether someone who is clearly human might not actually be because they weren’t carried to term in a human womb until their umbilical cord was severed and they took their first breaths of the oxygen-nitrogen mixture required for life.
I never find this convincing. I have never seen a narrative give a believable reason for why we-the-audience should treat a clone as different from a human child, or why it is in any way valuable or interesting to ask if the substrate on which consciousness is delivered is meaningful. I could see this mattering in, say, theological discussions. I assume there is debate as to whether or not Data will go to heaven when he is deactivated.
Beyond that, the idea that there is some “thing,” absent sapience itself, that makes someone “human” and therefore ineligible to be eaten—that it is a narratively intriguing debate to argue whether an engineered form of life might be 1:1 equivalent to you or me but still be “lesser than”—does not inspire me. Because, I would propose, the better thing to interrogate is why, knowing the robot is human, we treat them as though they are not.
I think, to me, this is why the conflict at the heart of Do Androids Dream of Electric Sheep? or even, I dunno, Westworld works. To the extent that anyone in those stories articulates a belief that replicants aren’t people, we know they’re full of shit. They’re not engaged in a sincere philosophical question—“when people of good conscience have an honest dispute,” as the judge in “The Measure Of A Man” puts it. They are engaged in rationalization over why their exploitation of a disenfranchised class of people is acceptable.
Whether you can eat your clone is a silly question. “Why do we dehumanize our fellow man?” is not.
And from that: what is the flaw in the human condition that leads us to reinvent that argument over, and over, and over? What is the motivation that underpins its new form? Maddox’s argument is that he should be able to create an army of Datas, one on every vessel in Starfleet, for the good of the Federation. Picard argues that “a single Data is a curiosity; a wonder, even. Thousands of Datas, isn’t that becoming a race?”
It is treading old territory now to point out that the central flaw of fantastic racism is how often it is grounded in some actual history or some tangible difference. Racism doesn’t work like that. Chinese laborers on the Transcontinental Railroad weren’t abused and denied citizenship because the Chinese pledged allegiance to a dark elf warlord 800 years ago, or because Chinese people have sharp claws and can see in the dark. It was because white capitalists found their exploitation to be convenient.
“The Measure Of A Man” almost grapples with this. Ship’s bartender Guinan says:
In the history of many worlds there have always been disposable creatures. They do the dirty work. They do the work that no one else wants to do because it’s too difficult, or too hazardous. And an army of Datas, all disposable… you don’t have to think about their welfare, you don’t think about how they feel.
“Whole generations of disposable people,” she finishes to Picard. “You’re talking about slavery,” he conjectures. Guinan tells him, inexplicably: “I think that’s a little harsh,” and we are treated to the spectacle of the white British dude explaining how slavery works to an African-American actor. But then he says “that’s not the issue” and goes back to trying to prove that Data must be human because he keeps the gifts that people give him.
The episode has both Starfleet—an institution we are intended to admire—and characters we love and respect engage in the same kind of sophistry as an anti-abolitionist arguing that chattel slavery is good, actually because some races of people haven’t yet learned how to govern themselves.
It’s just none of them really seem to understand that. Worse, the writers don’t seem to understand it. The writers appear to think that, when Riker turns Data off to prove that he’s just a machine, he’s actually making a really good point.
In the real world, depressingly, we can guess how an android like Data, or a genetically engineered transhuman mutant, or a clone, would be treated. But at the heart of it, the real world is not going to deny androids the right to self-determination for anything so clever or meaningful or distinct.
We will keep androids from voting not based on any argument over whether their computer brains can’t truly understand the issues on a ballot but for the same reason America enjoined (…enjoins) Black citizens from voting. We will keep clones from marrying not because of the complication of their genetics but for the same reason we denied gays the right to marry.
Which is, of course: no goddamned reason at all.
The “explanations” for bigotry are always irrational. It is, I think, often a fraught notion to engage with the concept as though it is anything but. I try to avoid dealing with racism in my own speculative fiction, but it’s something I keep in mind when I do. That is: my humans don’t discriminate against my uplifted animals because of anything innate to the uplifts. Humans just don’t like them, and it often benefits the powerful to keep an underclass around.
There is nothing the animals can do to change it—no way to become human enough to “pass” and gain sanction—because there is no logic behind it; no special threshold of humanity. It is only possible to become one of “the good ones,” and when that happens I try to make the incoherent rationalizations that enable this distinction clear and central.
Guinan was right. In the real world, people like Maddox don’t respect people like Data because they just don’t. Maddox wants to create a race of mechanical servants because it would benefit him, full stop. The rest, whatever he needs to tell himself—whatever Judge Louvois needs to tell herself that Maddox shouldn’t be cashiered out of Starfleet as a disgrace to the uniform—is window dressing and motivated reasoning.
And so, in part, my problem with “The Measure Of A Man” is that the episode establishes the Federation to be precisely that kind of bigoted, unjust, regressive society, but will not admit it. Jean-Luc Picard is arguing before a superficially rational judge in an apartheid court, indulging the image of her rationality, and the episode appears to legitimize that debate-me-bro indulgence.
How could the story be fixed?
I don’t know. What does “fixing” even mean? I’m not just trying to explain why I think it’s broken; I’m trying to explain why I think it embodies a common failing in the way speculative fiction understands injustice and oppression. Pascal Farful, on Bluesky, describes it as “an episode which if you stand 20 paces from is perfect,” and that’s quite fair.
On the one hand, I’d say an “is this person truly a person?” story would’ve worked better earlier on, before we really had a chance to appreciate Data as a character and when he might still have been mysterious. On the other, TNG “earlier on” was absolutely not mature enough to have told that story.
Indeed, I’d call that worthy of an entire second discussion: Data’s idiosyncrasies, presented as evidence of his incomplete humanity, almost invariably parse today as examples of the crew failing to understand or empathize with him. It was not exactly unusual for the plot of an episode to boil down to “the crew of the Enterprise encounter, and are baffled by, neurodivergence”; at least this story avoided that.
Maybe, when the judge issues her ruling, we should’ve seen Picard thank her, then go and privately tell Worf “stand down your security team. We’re not going to need them,” so we know that the ruling would never have mattered—that the Enterprise and its crew would’ve done everything to protect Data if Starfleet tried to kidnap him.
Or perhaps, as structured, what I perceive as its flaws were unavoidable for the time and for the series in question. Consider the alternative presented by Star Trek: Voyager, which had the advantage of coming after TNG trail-blazed a new era of Trek, and also the advantage of not being touched by Roddenberry.
Star Trek: Voyager cast Robert Picardo as the “Emergency Medical Hologram,” a computer program designed to supplement human beings in sickbay during crisis conditions, who is pressed into service full-time when the starship Voyager is stranded many thousands of light years from home. The EMH, like Data, is an artificial person—more than Data, even, because he lacks a physical form.
The crew of Voyager has even more reasons to doubt the EMH’s consciousness, because they have benefited from a decade of “holodeck” programs with characters who also appear to be sentient. Also, unlike Data, the EMH is not unique and has never passed a test like being, say, implicitly approved as “human enough” by the entrance board of Starfleet Academy.
Notwithstanding, Star Trek: Voyager correctly—in my opinion—established early on that the question of whether or not the Doctor is alive enough to have feelings and rights was a boring one. Having been introduced as a mere computer program, Captain Janeway describes him as a “full-fledged member of the crew” who should have autonomy over his programming (and when he can be turned on and off) in the 7th episode.
For the rest of the series, and with a few notable exceptions, how the Doctor is human, and the precise ways in which he processes and develops humanity, is explored. Whether he is an individual, with individual rights, is not. Voyager also made the smart decision to have him be an advocate of his own personhood rather than continuously doubting it (or having otherwise-sympathetic recurring characters doubt it).
It is taken as a given that an artificial person whose physical form is ephemeral and computer-generated will, of necessity, have a distinct experience. It is also taken as a given that he faces real and unique challenges and discrimination. But the show does not allow questioning his humanity to be seriously entertained—more than that, when someone appears to do so, having it pointed out to them is treated as a sobering reality check.
VOY’s equivalent of “The Measure Of A Man” is probably the 7th season episode “Author, Author,” in which a “holonovel” written by the Doctor is released without his permission, and he argues that he should have the copyright over his work. Because USS Voyager is still far from home, the arguments are conducted through a limited window of regular communications the ship has with Earth. I would propose that it is different, and in some sense “better,” for four-ish reasons.
The issue is on some level more trivial (ownership of intellectual property). However, the Doctor and the rest of the crew understand that it has more serious implications—which is to say that even without the specter of “can the Doctor be disassembled for our benefit,” they understand that it is a meaningful attack on his personhood.
Nobody on the crew disagrees with this. Nobody is forced into the position of arguing that maybe he shouldn’t have rights. Indeed, literally everyone is engaged in a shared sacrifice (giving up their own ability to talk to Earth) on his behalf.
The Doctor’s acerbic personality enables him to better articulate the absurdity of what we in the audience understand to be a problematic, wrong-headed argument. The holonovel he’s written also mocks the crew in a way they find mean-spirited, which establishes him as having clear foibles more serious than “can’t say ‘can’t’” that the crew understands absolutely do not impact his personhood or legitimacy.
The arbitrator—who in the end comes to the same limited decision as Judge Louvois in “Measure Of A Man”: that, irrespective of the rights of holograms generally, Voyager’s EMH has unquestionable rights—also both understands and states explicitly that they’re kicking the can down the road and that the issue will eventually need to be actually settled.
But also, “Author, Author” ends with other EMHs—retired from medical service and put to work in a mining colony—having learned about the Doctor’s work and passing it around themselves as a piece of subversive material, integral to their eventual liberation.
That is to say: “Author, Author” knows that it is an episode about rationalizing the unjust exploitation of an oppressed class of people. It does this even though, of necessity, it means acknowledging that the Federation is a society that keeps slaves—probably a step that TNG, coming as it did before Deep Space Nine’s darker turn, could not take.
“The Measure Of A Man” is irksome in that it does not get what it is doing. It becomes maddening in that it almost gets it, acknowledges the argument it’s dancing around, and then tosses it aside in order to make a far more stupid one (“so what if Data is a machine; we’re all machines”). In the end, the judge agrees that Data cannot be eaten no matter how tasty he looks, everyone has a party, and nobody asks how in God’s name they allowed themselves to get there.
But, if you’ve gotten this far: it’s great television, you know? It is a fine early example of Picard doing one of his Picard Speeches. Jonathan Frakes is equally empathetic and powerful as Commander Riker. I wouldn’t skip it if it’s on. I would never tell you to skip it. I totally get why it’s in so many top-ten lists. Depending on my mood, I might even put it on my own.
…And for all the words I’ve spent here—almost three-quarters as long as the episode script itself—I can’t tell you why I don’t have the same reaction to Voyager’s “Latent Image.”
Maybe next time.