This is adapted from a talk Will Canine gave at SynBioBeta SF 2015 during the Automation and the Internet of Biotech Things session.
Biology in the 21st Century
We are in the century of software — a world with an API for everything. And yet, the way we do life-science research remains artisanal.
Biologists, by and large, use laborious manual processes for every step of their experiments, relying as much on their craftsmanship with a micropipette as on their scientific knowledge. The status quo — an estimated $28 billion wasted on irreproducible research every year, a shocking 80% of research according to some studies — is untenable for a world that needs biotech solutions to sustain itself.
Digitizing the way we experiment with and design life will make it possible to do world-changing things with biotechnology. Organizing the chaos of biology to make sense of its complex systems is the key to solving big problems in everything from medicine to agriculture, and managing complexity is what software does best. We can only hope to see more software ‘eating’ biology.
Digital platforms that run biology experiments from software are the key to all of them — we need an API for the wetlab, one that brings life science into the digital age and unlocks biotech for software development. To that end, here is an overview of the hardware platforms that let people write code to do biology.
First, There Were the Mainframes
The first digital biology platform was the Swiss-made Tecan Sampler 500 Series, released in 1985. It was a ‘process-controlled pipetting robot’ — engineers could code up protocols that the machine would execute, transferring liquid from vial to vial at an industrial scale never before possible. With this machine, the high-throughput liquid-handler paradigm was born, and many other robot brands (like Hamilton) have entered the market since.
This paradigm is typical of industrial-style robotics: expensive, heavy, proprietary technology that requires certified engineers to run.
For a long time, the only real need for this scale of experimentation — orders of magnitude more samples processed and data collected than manual workflows could provide — came from pharmaceutical companies. Drug trials require huge amounts of testing on tens of thousands of tissue samples, and that is what these robots were designed to do. And they’re still doing it largely the same way, thirty years later.
These platforms are great at what they were built for — centralized, high-throughput workflows. Projects from the Human Genome Project to the engineered perfume-producing microbes at Ginkgo Bioworks use these mainframe-style machines to great effect. But they are also very limited platforms; if we want biologists to be able to work like software developers — decentralized, interconnected, low up-front investment — we need a digital biology infrastructure that allows that.
Coffee Shop Biotech: Cloud Labs
Next came the cloud labs, Transcriptic and Emerald. Their goal is to do for lab-work what Amazon Web Services (AWS) did for server maintenance — automate it away. With these platforms, scientists are able to ‘spin up’ experiments in a robot lab far away and, so the story goes, build a biotech from their laptops.
I am a huge fan of both of these companies. They are disrupting the traditional outsourced-lab business with fully digitized robot labs, and they should drastically lower the cost of scaling new biotechnologies. Cloud labs will be an essential piece of biotech innovation infrastructure going forward.
I am especially appreciative of Autoprotocol, Transcriptic’s open-source data format for experiments. It and others like Antha are establishing some of the core principles of representing wetlab work in software.
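To make that idea concrete, here is a minimal Python sketch of an experiment expressed as plain structured data, in the spirit of Autoprotocol. The field names and values below are illustrative, not the exact Autoprotocol spec.

```python
import json

# A liquid-transfer step expressed as data rather than as hands-on labwork.
# Field names are illustrative, loosely modeled on Autoprotocol-style JSON.
protocol = {
    "refs": {
        "source_plate": {"new": "96-flat", "store": {"where": "cold_4"}},
        "dest_plate": {"new": "96-flat", "store": {"where": "cold_4"}},
    },
    "instructions": [
        {
            "op": "pipette",
            "groups": [
                {"transfer": [
                    {"from": "source_plate/A1",
                     "to": "dest_plate/A1",
                     "volume": "50:microliter"}
                ]}
            ],
        }
    ],
}

# Serialize to JSON so a machine, local or in the cloud, can execute it.
print(json.dumps(protocol, indent=2))
```

Because the experiment is just data, it can be versioned, shared, and replayed exactly, which turns reproducibility from a craftsmanship problem into a software problem.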
But the cloud alone is not sufficient. For lots of projects, outsourcing is simply not possible, especially in the earliest stages of development. And much of the creative improvisation that happens in a lab is lost when everything is done off-site — how do you tinker in the cloud? Innovation ecosystems thrive when they are decentralized, when the tools are where the people are. We need a digital biology platform that people can use themselves.
OpenTrons: The PC of Lab Automation
At OpenTrons we’re building the world’s first personal digital biology platform. The OT-One is a robot that biologists use themselves, to run experiments on their own lab benches, from their own web browsers.
You can download experiment protocols to run on the OT-One from Mix.Bio. These have all been vetted by the OpenTrons team and, increasingly, by our community of users. We’re building open, workhorse protocols — simple PCR preps, serial dilutions, ELISAs, transformations, and more — that everyone can use to automate their work.
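As a rough sketch of what one of these workhorse protocols encodes, here is the logic behind a simple serial dilution written in Python. It only computes and prints the transfer steps; the wells, volumes, and dilution factor are made-up illustrative values, not a specific Mix.Bio protocol or the OT-One’s own interface.

```python
# Sketch of the liquid-handling steps behind a simple serial dilution.
# Each step dilutes the sample 3-fold: 50 uL of sample into 100 uL of diluent.

def serial_dilution_steps(wells, transfer_ul=50, diluent_ul=100):
    """Return (source, destination, volume_uL) transfers for a dilution series."""
    steps = []
    # 1. Pre-fill every well after the first with diluent.
    for well in wells[1:]:
        steps.append(("diluent_trough", well, diluent_ul))
    # 2. Carry sample down the row, transferring a fixed volume into each next well.
    for source, dest in zip(wells, wells[1:]):
        steps.append((source, dest, transfer_ul))
    return steps

if __name__ == "__main__":
    row = [f"A{i}" for i in range(1, 7)]  # wells A1..A6 on a 96-well plate
    for source, dest, volume in serial_dilution_steps(row):
        print(f"transfer {volume} uL from {source} to {dest}")
```

Once steps like these live in code instead of in someone’s hands, the same protocol runs the same way on every robot, which is exactly what makes sharing them worthwhile.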
Our mission at OpenTrons is to democratize digital biology tools. We want to empower as many people as possible to write code that does biology — everyone doing life-science should have a lab robot. That’s why the OT-One starts at $3,000.
The OT-One brings digital biology to everyone’s lab. For the first time, there’s a community of individuals who can code experiments to run on their own robots. Only time will tell what this distributed, digital biotech community will achieve (watch this blog for news), but OT-One robots are already in more than 40 labs around the world, and that number grows every week.
Conclusion
It’s time to stop just talking about how DNA is the code of life, and start using code to do biology.
Biotech is poised for a Cambrian explosion thanks to its convergence with digital tech. And we’re finally seeing a complete toolset emerge for doing biology digitally — mainframe, cloud, and personal platforms — that will accelerate the next generation of research. It’s an exciting time to be in biotech!