
On Black Museum

☕ 3 min read

Black Museum is an anthology episode about a failed neural-tech salesman and the lives he’s ruined by pushing untested products on vulnerable people. His products include a sensor-and-receiver pair that lets a doctor at a hospital with a much-too-high mortality rate feel the experiences and sensations of his patients; a method of transferring the consciousness of a woman in a coma into her husband’s mind (and later into a stuffed monkey); and, lastly, the transfer of a death-row inmate’s consciousness into a freak-show main attraction, where attendees can repeatedly flip the switch on his electric-chair sentence.

Each story (and technology) builds on the previous one, and collectively they demonstrate the slippery slope of creating products without considering unintended effects. I thought the first two stories were relatively solid in demonstrating potential pitfalls, but the final arc (which did tie everything together) seemed a little lackluster, especially considering how repeated electrocutions degraded the holographic consciousness even as they produced fully conscious copies for keychain souvenirs. If the consciousness is replicable, it shouldn’t degrade (kind of like working on layers in Photoshop instead of directly on the image). Sure, maybe the ex-salesman didn’t consider future-proofing his investment, but that somehow seems less likely.

I found the second story the easiest to relate to. I have a hard enough time reconciling my own thoughts; I can’t imagine adding another person to that continual conversation. That said, I fully agreed with Emily’s line that deleting “Carrie” is just deleting code: she could very well still be lying in the hospital bed. This “barbarism” was similarly touched on in the 2001 movie A.I., where we see rednecks making obsolete robots fight each other.

A.I. Robot Fight

While artificial intelligence is separate from a copy of consciousness, I don’t think either amounts to human rights. Perhaps there’s more to the story, since the fallout of transferring Carrie into the stuffed monkey cost the salesman his job with no separation package, but the episode didn’t touch on it. That said, forcing a consciousness to repeatedly relive an execution sounds evil until you realize it’s not much different from slaughtering NPCs in video games. Just like those robots in A.I., none of them are actually alive.

I’m not sure how to relate my Black Mirror observations to this, but Norman’s views on automation are still surprisingly insightful 12 years later. He almost exactly predicted the autonomous-car timeline, and he perfectly described my life doing battle with automated tools that supposedly make my job easier but really don’t, often requiring more manual work than doing the task myself in the first place. It’s not Terminator-style slavery but an observation from Thoreau: “men have become tools of their tools.” (Also, AI is dumb. SO DUMB.)

Not knowing the side effects of something (that neural-link addiction) is drastically different from not considering likely outcomes (making two minds share a head: “no privacy for him, no agency for her”). QA is dying, and its responsibilities are being forced onto other roles. How can technical communicators push back against development decisions?


WRITTEN BY
Cody
A recovering prescriptivist, woodwind doubler, teaching artist