Most people remember Jobs as a tech visionary. That framing misses almost everything. He was a product of the counterculture — LSD, Zen Buddhism, an India pilgrimage at nineteen, a calligraphy class audited at Reed College for no reason at all. The calligraphy became the Macintosh's typefaces. The Zen became the obsessive subtraction: what can you remove and still have the thing? The India trip shattered his certainty that Western rational thought was the only lens. He spent the rest of his life building tools that served both sides of the mind.
“You can't connect the dots looking forward. You can only connect them looking backwards.”
— Steve Jobs, Stanford Commencement Address, 2005
Why They Belong Here
Jobs belongs here because he proved, in hardware and film and cultural memory, that the deepest questions about how humans perceive and create are not separate from commerce — they are its hidden engine.
Jobs argued the next great things would come from people who could move between art and technology. He didn't just preach it. He built companies around it, twice.
His study under Kobun Chino Otogawa wasn't a hobby. The principle of *mu* — beautiful emptiness, the thing stripped to its essential nature — became Apple's design philosophy. Simplicity was a spiritual practice applied to industrial objects.
Jobs returned from India in 1974 convinced the West had trained itself to ignore intuition. He spent his career designing products that responded to human instinct rather than demanding users learn a system.
Adopted, and aware of it from childhood, Jobs described the experience as both abandonment and liberation — freedom from inherited trajectories. Early dislocation from expected paths produces the capacity to see sideways. His life is a case study in that pattern.
Jobs was verbally cruel, prone to taking credit for others' ideas, and capable of what employees called a "reality distortion field." His legacy forces an unanswered question: whether the products justify the methods — and whether that question even has a clean answer.
After dropping out, he stayed at Reed College for eighteen months auditing classes, following curiosity with no plan. That calligraphy class had no apparent use. Then it had every use. His life argues that undirected learning is not waste: it is investment with a hidden maturity date.
Timeline
Jobs moved in arcs — expulsion, exile, return — and each phase produced something the previous one made possible.
1972: Jobs enrolls at Reed College in Oregon, then drops out after six months. He stays eighteen more months, auditing classes including calligraphy. No degree. No credential. The typefaces on the first Macintosh trace directly to this period.
1974: Jobs travels to India with friend Dan Kottke, reads Ram Dass's *Be Here Now*, and returns committed to Zen practice. He begins studying formally under Kobun Chino Otogawa. Intuition becomes, for him, a design principle rather than a personality trait.
1976: Jobs co-founds Apple Computer with Steve Wozniak and Ronald Wayne. Wayne sells his 10% stake for $800. Apple goes on to become the first publicly traded U.S. company to reach a $1 trillion market capitalization, in 2018, long after Jobs's death.
1985: Apple's board forces Jobs out of the company he built. He founds NeXT, and the following year buys Lucasfilm's computer graphics division from George Lucas for $10 million, naming it Pixar. Both decisions look like failure. Neither is.
1995: Pixar releases *Toy Story*, the first feature-length computer-animated film. It grosses $373 million worldwide. Jobs, who had funded Pixar through years of losses, becomes a billionaire when the company goes public that November. The lesson he draws: artists and technologists in the same room, neither subordinate.
1997: Apple acquires NeXT and Jobs returns as interim CEO. He slashes a sprawling product line of roughly 350 items down to a four-quadrant grid: consumer and pro, desktop and portable. The iMac, iPod, iPhone, and iPad follow in sequence. Jobs dies in 2011 of pancreatic cancer, eight years after his 2003 diagnosis.
Our Editorial Position
Jobs is not here because he was successful. Plenty of successful people have nothing to say about the questions this platform exists to ask. He is here because his life is a working argument — tested in public, at scale — for the idea that the inner life and the outer world are not separate projects.
He crossed between Zen philosophy and processor architecture. Between calligraphy and computer interfaces. Between the wound of adoption and the freedom to invent himself without a template. Each crossing produced something that neither field could have made alone. That is the esoteric claim made concrete.
We feature him without resolving him. The cruelty is real. The vision is real. The intersection he kept pointing to — Liberal Arts / Technology, intuition / reason, East / West — remains unresolved in the culture he left behind. That unresolved quality is precisely why he belongs here.
The Questions That Remain
What is lost when companies that claim his legacy mistake the aesthetic for the philosophy — when simplicity becomes a brand rather than a discipline of removal?
Can the "reality distortion field" be separated from abuse — or does the question become meaningless once the products are already in a billion hands?
When he told Stanford in 2005 that "death is very likely the single best invention of life," was he making a Zen observation about impermanence — or telling an auditorium full of young people something he already knew was coming for him?