Bodyword #3 Organic Intelligence (O.I.)

In defense of why your brain needs a first draft

Drawing by Tucker Legerski

Back in high school, a psychology teacher liked to say, “It’s three pounds of gray matter. That’s it.” Miss Michaels1 said this with her hands on her head. “It’s amazing what this three pounds of matter can do.” She pinned back her graying brown hair and had a mole on her left cheek. She took off her glasses when she made a point and let them hang from her neck on a beaded lanyard. Miss Michaels was tall and older, with a pointed nose and a deep, tuba-toned voice. She walked like a wraith or a river nymph, it seemed. She floated between desks and spun the wonders of psychology and the mysteries of the human brain for us — the fleshy organ between the ears.

Since the boom and thunder of Artificial Intelligence this year, I’ve been thinking a lot about the brain and how we connect to information. What does it mean to connect to information? To add information to those three pounds? Or, in the case of A.I. tools, to detach from the process of learning and metabolizing information?

A.I. boosters say the tools are in step with every technical tool that has made an impact on cognitive work: cameras, calculators, the dictionary. Even writing itself is a technology, one that replaced the strength of our memories and oral abilities. There are trade-offs with every new piece of technology: when you gain something, you lose something else. I fear A.I. tools will automate our ability to connect to information. In short, they will atrophy our ability to research and process information, a skill I believe we need if we hope to do anything important or continue to create anything new. We need the slow churn, focus, and bonding with information that gives us the ability to write a first draft.

For me, one of the wonders of the brain is how I connect and build a relationship with information, and how that information becomes not only a part of my mind, but a part of my rhetoric and my understanding of the world. Looking at a Wikipedia page is a first date. Finding good sources of information is a second and third, and interviewing someone about a subject (or reading a book on it) is a relationship going steady. A first draft is a huge milestone, like a six-month anniversary. It means things are going well; you can’t stop thinking about that information.

In early 2019, when I entered graduate school, I decided I wanted to understand why my mother died from complications of Type-1 diabetes at a young age. I wanted to know what happened to her pancreas, exactly — the organ whose failure causes diabetes in the body. I stumbled upon a whole world. I came across a video about high insulin prices. Then I found articles, scientific papers, books, political speeches. People with Type-1 diabetes were dying because they could not afford their essential medicine — a medical tool that was overpriced and inaccessible for many, especially those without insurance.

I was hooked, and I started diving in. This led me to connect with activist groups, read the history of insulin’s creation, and trace the long history of people who have died from diabetes. Along the way, I discovered a trove of information about my mother that would have otherwise stayed buried and undiscovered. It was information that changed the narrative of what happened to my sick mother, and changed how I felt about the disease.

I wrote an unpublished book about the experience, chronicling the discovery of insulin and the historic plight of living with Type-1 diabetes. The research, and the insights I gained from it, connected me to this important disease. I got to know my mother in a new way. That information became a part of me, a part of my brain and of my understanding of the world and my own past. If I’d asked an A.I. tool to write or summarize all this information, I wouldn’t have gained the same connection to the information around diabetes, healthcare, self-care, and acceptance of bodies. I needed the time to wade through the information, to read, talk, and think.

Sure, do I wish I could jack that info into my head and have it all there, like Neo in The Matrix? A whole neural network of info that flows straight into my three pounds of brain? Yes. But A.I. is not that tool. Even an A.I. tool such as Humata, where you can upload a file and ask questions about it, can’t recreate the experience of connecting to information. A.I. can scrape and scan text in seconds, but is that understanding? If you read a summary, is that understanding? Is watching a five-minute recap of a film the same as watching the full two hours? Are SparkNotes the same as reading a novel? Does a user reach the same conclusions and thoughts they would have reached by reading the full text? My answer is no.

To build a relationship with information and connect with research, you need to do the dirty and slow work of thinking, processing, and giving your thoughts the room to take in new information. You need to read, watch, listen, talk, process. It takes time and attention, and I believe that’s not a bad thing. It’s good to give your attention to a single subject for a sustained period of time. To be wholly dedicated to one task, or to understanding a moment in history, or to how a chronic disease works biologically in the body.

Journalist and podcaster Ezra Klein wrote an op-ed2 on what it means to have tools that access information with the click and smash of buttons. He ditches the idea that speed and more information are always better. Speed, he writes, “…misses much of what’s really happening when we spend nine hours reading a biography. It’s the time inside that book spent drawing connections to what we know and having thoughts we would not otherwise have had that matters.” Our tools — the Internet, email, Slack, chatbots, synthetic relationships with everything from Claude to Bard to ChatGPT to Sydney — aren’t built for the reality of how human cognition works.

Klein quotes professor Gloria Mark, who studies the relationship between humans and computers, on why over-automating cognitive and creative tasks is a fast way to lose that relationship with information:

“Nobody likes to write reports or do emails, but we want to stay in touch with information,” Mark said. “We learn when we deeply process information. If we’re removed from that and we’re delegating everything to GPT — having it summarize and write reports for us — we’re not connecting to that information.”

A.I. has the potential to dilute, confuse, distract, and sever our connection to information. Information we not only need for cognitive tasks, but need as a part of our unique selves. What happens if our novels, political documents, essays, every email and bill of sale, joke and game and poem gets created by these infinite language models? As the writer and journalist Anna Wiener3 put it in a recent article, we will be outsourcing our conversations and information processing to a bunch of server farms and racks “in Altoona and Ashburn—a world of kaleidoscopic interfaces waiting to be prompted, ready to say just what users wanted to hear.” Do we really want to limit our relationship to information to a chatbot that imitates human speech? To let it decide what information is best within its vast network?

We won’t find anything new or surprising or weird. We won’t feel the nooks and crannies that come with diving into a subject. We will talk with fewer actual people. Above all, we will lose our understanding of what we wish to communicate. Being severed from information is like losing language. Our minds won’t connect to the valuable and nutritious information that helps them grow and expand.

Taking in new information, having it slowly work into your mind and rhetoric, isn’t about doing or hearing exactly what you want. Just take learning a second language, or a third or fourth. It’s hard. It can be a lot to put your brain through. But something happens when you learn a new language. You have access to more people, places, literature, art, experiences, and perceptions than if you were monolingual.4 But to get to that bilingual space, you have to put in the work. To add that language — that information — is to add more to who you are as a person. I think it’s the same when you learn a new subject, whether it’s diabetes, mathematics, Chinese history, how to code, or writing screenplays. No matter what you learn and want to talk and write and think about, we are the information we consume. We are what we read, listen to, and watch.

From what you read, you create first drafts. Your brain needs that first draft. Many fear the first draft, but it’s the sign you’re ready to use the information in your head — it means you’re excited and prepared to finally communicate the information that your brain has taken in.

The sci-fi writer Ted Chiang has called an A.I.’s first draft superficial. He equates the writing of ChatGPT to making a photocopy rather than an original. Wrestling a first draft out of your own brain, he argues, is necessary and vital:

“Your first draft isn’t an unoriginal idea expressed clearly; it’s an original idea expressed poorly, and it is accompanied by your amorphous dissatisfaction, your awareness of the distance between what it says and what you want it to say. That’s what directs you during rewriting, and that’s one of the things lacking when you start with text generated by an A.I.”5

Writing a first draft, an original idea expressed poorly, is using organic intelligence. Original thought is organic thought. With each draft you grow the idea. When it comes to creating a first draft with an A.I. tool versus your own brain, it’s the difference between growing a farm-raised tomato and concocting a Cheeto. One takes care, thought, and intimacy; the other is whipped up in a faraway lab by a chemist and churned out infinitely. “Chatbots, for all their ostensible personalization, are in the business of mass production,” Wiener writes.

Organic intelligence has stronger roots and is fed with more natural nutrition. In short, human-written text feels more real and alive. Text written with O.I. has more possibility and, ironically, a stronger connection to information. A.I. bots may have access to more raw information, but that doesn’t mean their output is always right, or trustworthy, safe, intuitive, or even good. A bot is still just mathematically plucking the next word based on its algorithm. A human is thinking about, and feeling, the information they are selecting.

“In some ways, the robots need us, they need human creativity, to be able to produce good answers. If they keep on ingesting their own outputs they get worse,” tech reporter Kevin Roose said on a recent episode of his podcast Hard Fork.6 Roose and his co-host Casey Newton cite a recent study concluding that language models fed on A.I.-generated data become poisoned and eventually collapse. And in a world where everyone from advertisers and corporations to lawyers and fiction writers starts using A.I. chatbots, the junkification of the web with artificially written text and “thoughts” won’t give A.I. new training ground.7 As the internet becomes a junkyard of A.I.-generated text, the models will start eating themselves, gobbling up unoriginal, stale text.

The study concludes: “To make sure that learning is sustained over a long time period, one needs to make sure that access to the original data source is preserved and that additional data not generated by LLMs remain available over time.” LLMs, and the companies that offer these tools, need “original data.” They — the companies, investors, servers, tools, interfaces — need O.I. to survive. They need O.I. that was crafted and grown in a human brain.

Maybe we will get to a point like in Black Panther, where an A.I. computer helps build a new super suit and medical equipment. Or LLMs will fold into our phones and earbuds, like in Her or the translator devices in 2013’s Snowpiercer. Maybe A.I. will serve me the way Google has served me at times: helping me find information to connect with. But if this technology distracts from and severs an important part of what it means to be a person, or worse, exploits my data and manipulates my time for profit, then it’s not tech that elevates or deepens my experience of being human. I am willing to say I don’t know what will happen. I could be using these tools every day within a year. But as of now, these tools don’t serve my relationship to information — they only degrade it.

If we are the information we hold within our bodies and brains, I want to end on a famous scene from 2000’s Erin Brockovich. Based on a true story, the film stars Julia Roberts as Erin, a twice-divorced single mother who finds herself working as a file clerk for a law firm in the San Fernando Valley. She stumbles across some alarming information — a box of medical records — while sorting files. She follows her curiosity and drives out to the desert town of Hinkley, California, where she finds two-headed frogs, green water, and a sick population full of cancer, skin disorders, and miscarriages. Erin uncovers that the company Pacific Gas & Electric has knowingly polluted the water with the chemical compound hexavalent chromium, making people sick. Erin becomes instrumental in building, and eventually winning, the case, which ended in a $333 million settlement.

As the scene demonstrates, it was Erin’s connection to this important information that helped win the case. It’s a showcase of wondrous organic intelligence. It’s finding and building a relationship with information and doing something powerful with it. Something only a human could do.



  1. Not her real name.

  2. “Beyond the Matrix Theory of Mind” by Ezra Klein, The New York Times, 2023.

  3. “The Age of Chat” by Anna Wiener, The New Yorker, 2023.

  4. Much of this information came from the one and only Hank Green on SciShow Psych, 2019.

  5. “ChatGPT Is a Blurry JPEG of the Web” by Ted Chiang, The New Yorker, 2023.

  6. “Is A.I. Poisoning Itself?” (quoted material from the 16:00 mark on Spotify), Hard Fork via The New York Times, 2023.

  7. “Is A.I. eating itself?” by Casey Newton on Platformer, 2023.