
Mantis Biotech CEO Georgia Witchel on Building Digital Twins, the Ethics of Predictive Data and Why Startup Founders “Are Not Special”

…The number one thing to remember at all times is that you are not special. No matter how you feel, you are not special. If you feel like a poseur who doesn’t deserve to be in the room and fear someone will expose you, everyone feels that way. If you feel like your company is outperforming everyone else, you’re on top of the world, and you’re about to be acquired for millions, a lot of other startups feel that exact same way. Look around and see where they actually are. If you feel like everything is going to fall apart tomorrow and you have no idea what you’re doing, everybody feels that way too. That is definitely one major lesson…

I had the pleasure of talking with Georgia Witchel. At just twenty-four years old, she has already lived a few different lifetimes. She is the founder and CEO of Mantis Biotech, a company that builds digital models of human beings to solve complex problems in medicine, sports, and defense. But before she was navigating the cutthroat world of Silicon Valley venture capital, she was hanging off frozen waterfalls.

Witchel was born in San Francisco, but her trajectory shifted in 2008 when her parents moved the family to Durango, a small Colorado town where tourism rules the local economy. For a restless teenager, the best way to make a buck was in the outdoor industry. Her after-school hustle was rock and ice climbing: she hauled gear, belayed tourists, and taught them to climb. But weekend downtime with other young guides sparked a wild idea. Ice climbing was blowing up in Europe and Asia, but the United States was barely on the map. Witchel and her friends decided to change that, piecing together a US team to compete on the global stage. Against all odds, they started winning, taking a World Cup title when Witchel was just fifteen.

While two of her friends went on to climb professionally, Witchel had a different realization. Looking at what they had built, she knew that while she liked ice climbing, she really loved building things. She craved the experience of exploring the unknown. So she took a hard left turn: she dropped out of high school, quit climbing cold turkey, and threw herself into academia. She landed at Harvey Mudd College for computer science, eventually moving through Johns Hopkins and the University of Washington to study math and biomedical engineering.

That relentless drive to build led her to create Mantis Biotech. The company focuses on creating synthetic data to train machine learning models, essentially building digital twins to close what the industry calls the “sim-to-real gap.” Instead of waiting decades to collect enough real-world data on a rare infant heart condition to train a surgical robot, Mantis uses a physics engine to expand a handful of existing cases into thousands of simulated ones. They create a synthetic data set from that simulation and train a machine learning model to be pre-weighted, so it understands the chaotic dynamics of movement within that environment.

Despite the heavily funded buzzwords surrounding artificial intelligence, Witchel views the technology with a striking, pragmatic neutrality. She is quick to separate cold science from messy societal issues. When discussing the ethical dilemmas of predictive health data — like using brain scans to predict a person’s sexual orientation from a young age or using DNA to predict alcoholism — she maintains that individuals have an absolute right to their own electronic personal information. She firmly believes that a person’s data should never be kept from them, arguing that if third-party industries like gambling casinos are going to figure out your propensities, you should at least be armed with that same knowledge.

She brings this same grounded perspective to the emotional toll of running a business. A common myth in Silicon Valley is that emotions are bad and should be ignored, but Witchel argues that negative emotion is your body’s way of signaling that something needs to change. If you wake up every day feeling bad, she insists you must identify the source; otherwise, you will slowly succumb to that unresolved issue.

This blunt realism also extends to her views on the defense industry. While some tech giants engage in public hand-wringing over military contracts, Witchel cuts through the noise. She suggests that companies hesitating to hand over artificial intelligence for autonomous weapons are doing so largely to avoid the blowback of a machine firing on civilians. She views the debate as a marketing and optics issue, noting that the technology is simply too powerful not to be used in autonomous warfare by the end of the decade.

Building a biotech company as a solo female founder is notoriously difficult, with less than one percent of venture capital dollars going to solo female founders. Yet Witchel actively pushes against the manufactured, polished image of the modern female entrepreneur. Working in biotech, she gets compared to Elizabeth Holmes on a daily basis, a reality she chalks up to a lack of highly visible female executives in the media. She is exhausted by the assumption that successful women charm their way into funding. If that were the case, she points out, there would be significantly more successful female entrepreneurs.

Instead of leaning into the hype, she offers a sobering reality check for anyone trying to build something new. The number one thing to remember at all times, she says, is that you are not special. The startup grind is mostly waiting out the silent gaps where you work incredibly hard and see absolutely no immediate benefit. To survive it, she says you have to find your own spreadsheet — the grueling, unglamorous task you can do relentlessly for fifteen hours a day without stopping.

Witchel does not want to be a symbol or an Instagram persona. She just wants to be a founder. In an industry obsessed with selling the future, she is one of the few people perfectly content to do the quiet, unglamorous work required to actually build it.

Yitzi: Georgia, it’s so nice to meet you. Before we dive in deep, our readers would love to learn about your personal origin story. Can you share with us a story of your childhood, how you grew up, and the seeds for all the amazing work that has come since then?

Georgia: Nice to meet you as well. Yes. I was born in San Francisco, and I grew up there until 2008. My parents decided they did not want to raise kids in the city, so they moved us to a very small town in Colorado called Durango. It’s a little bit bigger now. Colorado’s primary export is tourism. When you’re a kid growing up in Colorado and you reach your teenage years, the best way to make money is to help out the guides who run the local tourism attractions. The hot job in high school was either being a guide or an assistant to a guide. My job, the way I made money and spent my after-school hours, was doing rock and ice climbing. In the summer, I’d do rock climbing, and in the winter, I’d do ice climbing. I carried the rope, sat on the other end, belayed the tourists, and taught them how to climb.

On the weekends when you don’t work, you hang out with all the other kids who are also guide assistants or working in the outdoor industry in some capacity. We were all thinking about what we wanted to do with our lives, having a lot of fun ice climbing, and researching how we could use that to see more of the world and get out of the small town. It just so happened that during that period, ice climbing was becoming really popular in Europe, Russia, and China. We realized they were having all these World Cup events and the US just wasn’t participating. We thought, “What if we put together a US team? We’ll be the US team and compete.”

We did that, and we actually did really well and started winning these competitions. Obviously, it was this whirlwind thing where everyone was amazed that we won a World Cup at 15 years old. That was crazy. We kept doing it, and two of my friends who I started the team with went on to do it professionally. I ended up looking at what we built and thought to myself that while I liked ice climbing, I really loved building things. I love building teams, doing things that haven’t been done before, and the experience of exploring. I decided I wanted to explore other things that are even more unknown. I actually ended up dropping out of high school, went to a small school in California called Harvey Mudd, and went totally cold turkey on climbing. I got a degree in computer science. Then, I went back into sports tech. While working in sports tech, I was still thinking that I wanted to continue to innovate. I decided to get a master’s in engineering, and while doing that, I conducted all the research for my current company and started a business out of it. And now we’re here.

Yitzi: So inspiring. Can I ask how old you are?

Georgia: I am 24.

Yitzi: Unbelievable. Please tell us about the company you built. Tell us what it does and what problem it solves.

Georgia: Absolutely. We are a digital twin company. That means we create digital versions of humans. The way we do that is by ingesting various data sources from around the internet, combining them, and simplifying them. Then, we use a machine learning model that takes all that data and makes it significantly more valuable and expansive. We use that to make existing machine learning models more accurate. Creating synthetic data is our main focus. Most recently, we completed a proof of concept where we made the leading COVID prediction model 4% more accurate, which was very exciting. We work in the space of human data expansion and human data simulation.

Yitzi: When you say synthetic data, for the layperson, is that referring to data used to train Generative AI, or something else?

Georgia: Generative AI and classifiers. We are primarily a data company; we don’t specialize in the AI models themselves, but rather in the source data.

Yitzi: Please share some other real-world applications of having a digital twin. For a regular person like me, what is the benefit of having one?

Georgia: A common misconception is that we make digital twins of specific people. That is not really what we do. We create digital humans or digital twins in a general sense. An example use case is training a robot to perform surgery on a very rare heart condition that only exists in infants. To train this robot, you need many examples of infants with this specific cardiac condition. You can either wait decades for enough cases to emerge and document them, or you can take the 10 existing documented cases and use our product and physics engine to expand them through simulation. By using our engine and digital twins based on those original source cases, you can train a machine learning model to be much more accurate when performing the necessary intervention or surgical procedure.

Yitzi: That’s a great explanation. Just to clarify, I went to a Techstars demonstration where a founder presented an AI system for hospitals. If a hospital wanted to change its workflow, instead of testing it out live, they would use a simulation. They would simulate how the workflow would change. I was skeptical about the accuracy of those simulations. But you’re saying every single point in that simulation is a digital twin modeling a certain type of person or avatar, and that’s how it works?

Georgia: Absolutely. A common vocabulary term I would reference here is the “sim-to-real gap.” It is a concept everyone in the simulation space is obsessed with. What you just referenced is essentially doubting the technology’s effectiveness because the sim-to-real gap is too large, which many people believe. They claim they can simulate a hospital, but the way hospitals work and people behave is far too chaotic to predict perfectly. We actually agree with you. A lot of companies advertise that the sim-to-real gap is zero, but if that were true, human behavior wouldn’t be fundamentally unpredictable. That is why the industry is currently obsessed with machine learning; it closes the sim-to-real gap better than anyone imagined.

We have figured out how to leverage machine learning to make that gap as small as possible. We create data sets used to train the foundational part of a model. Our models are used in the first steps of training, which preserves all the actual real-world data for the final steps when the model makes the jump across the sim-to-real gap. In your hospital example, we are not just simulating the hospital and providing that as the output. We create a synthetic data set from that simulation and train a machine learning model to be pre-weighted, so it understands the chaotic dynamics of movement within that environment. Then, we stack real-world data on top of it to make that final jump. That is how we outperform existing simulation technology.
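For readers curious what “pre-weighting on synthetic data, then stacking real data on top” looks like in practice, here is a minimal sketch of the general pattern. This is not Mantis Biotech’s actual pipeline; the simulator, model, and all function names below (`simulate_cases`, `train_logreg`) are illustrative stand-ins, with a toy threshold rule playing the role of the physics engine and a simple logistic regression playing the role of the model.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_cases(n, noise=0.3):
    """Toy stand-in for a physics engine: expand a concept into n cases.
    Each case has 2 features; the label follows a simple threshold rule.
    Feature noise models the sim-to-real mismatch."""
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)
    X = X + rng.normal(scale=noise, size=X.shape)
    return X, y

def train_logreg(X, y, w=None, lr=0.1, steps=500):
    """Logistic regression via gradient descent.
    Pass existing weights w to fine-tune instead of training from scratch."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append bias column
    if w is None:
        w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))     # predicted probabilities
        w = w - lr * Xb.T @ (p - y) / len(y)  # gradient step
    return w

def accuracy(w, X, y):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return float(((Xb @ w > 0).astype(float) == y).mean())

# Step 1: pretrain on a large synthetic set (the "pre-weighted" model).
X_syn, y_syn = simulate_cases(5000)
w_pre = train_logreg(X_syn, y_syn)

# Step 2: fine-tune on a tiny "real" data set, starting from the
# pretrained weights, to make the final jump across the sim-to-real gap.
X_real, y_real = simulate_cases(10, noise=0.1)
w_final = train_logreg(X_real, y_real, w=w_pre.copy(), steps=100)
```

The design point is that the scarce real-world examples are never spent on learning the basic structure of the problem; the synthetic set handles that, and the real data only nudges the pretrained weights across the remaining gap.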

Yitzi: That is so interesting. Have you watched Black Mirror?

Georgia: Oh my god, I love Black Mirror.

Yitzi: Me too; it’s my absolute favorite series. Keeping that in mind, along with the law of unintended consequences, can you point out some potential unintended consequences of your technology that we need to consider before developing it further?

Georgia: A consequence of this technology is that if you can accurately predict a person’s future, it begs the question of whether you should make decisions based on that in the present. An existing example I think about constantly — which is unrelated to my company but exists in the same space — involves a part of the brain called the third interstitial nucleus. There is a clear correlation between the size of this nucleus and a person’s sexual orientation, and this is present from a very young age. If you take a brain image of a child and measure that area, you can predict their sexual orientation later in life with a high degree of certainty.

The ethical question becomes: do you tell this person about the size of their third interstitial nucleus, or do you let them discover their orientation on their own? My personal opinion is that this problem is not as complicated as people make it out to be. At the end of the day, that is electronic personal information (EPI). It is simply health data, and people have a right to access every possible piece of information about their own health. The broader question is how that framework should be utilized in society. We are conflating societal issues with scientific knowledge. Everyone should be given their health information, and we have no right to redact it under any circumstances. However, the stigma attached to that information indicates a separate societal issue that needs to be addressed from a socio-cultural perspective.

Yitzi: That’s a great point. It relates to a much broader application. For instance, as we map the human genome, we can correlate DNA with potential health outcomes. If a person has a propensity toward alcoholism or diabetes, it doesn’t guarantee they will develop those conditions. Given their propensity and lifestyle choices, they still have agency.

Georgia: I firmly believe that a person’s data and EPI should never be kept from them. The bigger question is what information third parties are allowed to use in their decision-making. People should always be made aware if they have a propensity for alcoholism, especially since outside industries will likely figure it out anyway. For example, the gambling industry will identify if you have a propensity for addiction. At the very least, you should be armed with that same knowledge. Then, as a society, we must answer questions like: if a person is at risk for a gambling addiction, should they be allowed in a casino? Those are two entirely different questions. I am strongly in favor of separating the science from the societal consequences. The people who conduct the science and build the solutions are rarely the same people in charge of determining the laws and distribution surrounding them.

Yitzi: What is your business model? Who are your clients, and who pays for your services?

Georgia: We focus on enterprise sales. We generally sell to management levels because we deal with human optimization. We work directly with organizations that employ high-value individuals. The military, professional athletes, medical patients, and factory workers — whose employers want to minimize injuries and workers’ compensation claims — are all clients of ours.

Yitzi: I’m sure you are familiar with Anthropic and OpenAI and how there is a significant debate right now about whether the military should implement their AI technology. Do you have any thoughts on that discussion, and how does your technology fit into or differ from this?

Georgia: Absolutely. The way the Department of Defense is approaching the Anthropic situation is quite ironic. They are simultaneously acknowledging that Anthropic is a national threat, yet also essential to American democracy and should be handed over. My personal opinion is that Anthropic hasn’t handed over their technology yet because they don’t want to be responsible for the repercussions of its use in autonomous warfare. Deep down, Anthropic knows there is a high probability that an autonomous machine might open fire on civilians, and they are trying to avoid that blowback.

It isn’t as if Anthropic is standing on extreme moral ground, considering the tech is already used in other defense contexts. They are simply stating that as a company, they aren’t ready for this specific autonomous use case. Meanwhile, the DOD is essentially saying they don’t have time for corporate morality; they’ve decided the tech is ready and want it handed over. I view this as a marketing and optics issue. OpenAI and ChatGPT seem ready to make that jump, accepting the risks. It is Anthropic’s right to resist, but the US government may forcibly take the technology, causing Anthropic to take a stock hit. Alternatively, a worst-case scenario could occur with an AI system on an autonomous weapon, causing the responsible company’s stock to plummet. That is really how we should view it. The technology is simply too powerful not to be used in autonomous warfare, and it absolutely will be by the end of the decade. The real question is who will assume the risk.

Yitzi: Is that what Palantir does?

Georgia: Palantir has a software called Foundry that primarily focuses on battlefield surveillance rather than large language model-facilitated autonomous weapons. The controversy surrounding Anthropic’s use in defense comes down to its employment in autonomous weapons where the human is out of the loop, and the AI itself decides whether a human being should be killed.

Yitzi: Wow, unbelievable. Could companies use your technology to make a digital twin of me and sell it to advertisers so they know my weak points? Similar to what you mentioned about a propensity for gambling or alcohol, could they target me with those specific things?

Georgia: Yes, absolutely. We actually had an opportunity to do that, and we chose not to.

Yitzi: This is our signature question. You’ve been blessed with a lot of success, and you must have learned a lot from your experiences. Looking back to when you first started your company, can you share five things you’ve learned over the years that would have been nice to know in the beginning?

Georgia: The number one thing to remember at all times is that you are not special. No matter how you feel, you are not special. If you feel like a poseur who doesn’t deserve to be in the room and fear someone will expose you, everyone feels that way. If you feel like your company is outperforming everyone else, you’re on top of the world, and you’re about to be acquired for millions, a lot of other startups feel that exact same way. Look around and see where they actually are. If you feel like everything is going to fall apart tomorrow and you have no idea what you’re doing, everybody feels that way too. That is definitely one major lesson.

Another thing I’ve learned is that for everything you do, there will be a period where you try endlessly and no one notices. Ninety percent of startup life is a willingness to shoot a series of gaps. The first gap occurs when you decide to raise money and start the company, which usually takes six months to a year. Anyone who claims they raised money overnight or in six weeks either lacks a defensible business or is lying. After that, there is another six-month gap to navigate where you secure your first customers, onboard people, and figure out how to allocate funding. Then comes another gap where you try to scale. During these periods, you will work incredibly hard and see absolutely no immediate benefit, but eventually, it will all come together almost overnight.

Three. A common myth in Silicon Valley is that emotions are bad and should be ignored. In reality, your emotions are usually your most intuitive secret weapon. If you wake up every day feeling bad, you need to figure out why and address it. I often see companies fail because founders refuse to acknowledge their negative feelings. Negative emotion is your body’s way of signaling that something needs to change. If you are constantly in pain, you must identify the source, even if the remedy is difficult; otherwise, you will slowly succumb to that unresolved issue.

Number four is that you must simultaneously listen to everything and listen to nothing, which sounds contradictory. However, 95% of people will criticize you daily simply because they dislike that you are challenging the status quo. The other 5% will offer constructive criticism, and you must pay close attention to them. A large part of being a founder is distinguishing between valuable feedback and resistance to progress. Additionally, you only get to be “special” once or twice. You cannot do everything differently; there is a reason standard practices exist. Pick one or two specific areas to innovate and stick to them.

Lastly, always swim with the current. You are ten times more likely to succeed at something you are genuinely passionate about or naturally skilled in. Someone once memorably summarized why autistic founders are so highly valued: you want to back the person who doesn’t need stimulants to stare at a spreadsheet for 15 hours a day. Find your own “spreadsheet” — the task you can do relentlessly without stopping — and make it the core of your business. Assume your competitors are working just as hard but with a smile on their face. Even if a field like finance seems glamorous, do not pursue it unless you are willing to analyze financial models for 15 hours a day happily. If you force yourself through every step, you will never beat the people who genuinely love the work.

Yitzi: That is amazing. Georgia, this is our final aspirational question, which we ask in all our interviews. Because of your great work and the platform you’ve built, it’s not an exaggeration to say you are a person of enormous influence. If you could spread an idea or inspire a movement that would bring the most amount of good to the most amount of people, what would that be? You never know how far your idea can spread.

Georgia: I would say that more women should start companies. Starting a business is incredibly hard, and doing it as a woman can feel like operating with your hands tied behind your back, which leads many women to approach it from a place of frustration. I believe more women should embrace the idea that you don’t have to be a “female founder”; you can simply be a founder. You don’t have to attach your gender to your business model by starting a women’s health or rights company. You can start any company, and the fact that you are a woman does not need to define it. While there are statistical challenges regarding funding and respect, none of that has to matter unless you let it. If you want to be an entrepreneur without making it about gender, “girl power,” or an Instagram persona, you can do that. There are women quietly building highly successful companies exactly like this; you just don’t see them because they aren’t seeking a platform for it. If you are a woman who wants to start a company, recognize that this path is entirely possible.

Yitzi: That is a beautiful answer. I once interviewed a female founder who told me she feels that whenever there is a successful woman founder, people assume she is the next Elizabeth Holmes.

Georgia: Yes. Since I work in biotech, I get compared to Elizabeth Holmes probably on a daily basis.

Yitzi: Really?

Georgia: Yes. Many people don’t mean it offensively; it happens simply because there aren’t many highly visible female CEOs in the media. When people try to think of one, they only have a couple of references, and Elizabeth Holmes happens to be one of them. It is also funny how many people assume that if a woman raises funding, she only succeeded because she charmed her way in or relied on her looks. Respectfully, if that were the case, there would be significantly more successful female entrepreneurs. Trust me, I wish it were that easy! But unfortunately, that remains a commonly held misconception about female founders.

Yitzi: That is a great answer. Georgia, it is so nice to talk to you. I could talk to you for much longer, but I want to respect your time. How can our readers continue to follow your work, support you, or engage your services?

Georgia: If you are an interested client, you should email me at georgia@mantisbiotech.com. If you are an aspiring female founder, you can also email me or follow my public Instagram, @gorgagarad, which I keep public specifically to show that you don’t have to fit the typical “Instagram founder” mold. Otherwise, to follow my journey, just Google me to see what’s new.

Yitzi: Amazing. It has been great interacting with you. I hope we can stay in touch and perhaps do this again next year. I wish you continued success and blessings. Thank you, and keep up the amazing work.

Georgia: You as well. It was very nice meeting you.


Mantis Biotech CEO Georgia Witchel on Building Digital Twins, the Ethics of Predictive Data and Why Startup Founders “Are Not Special” was originally published in Authority Magazine on Medium.