



Roboflow, Vision AI startup, raises $40 million in Series B

Roboflow is a company with a thousand use cases.

When I met Joseph Nelson, CEO and co-founder of Roboflow, I asked him a question I ask almost everyone: If we met by chance at a party, how would you explain what your company does?

“I tell people that Roboflow creates developer tools to create a sense of visual understanding,” Nelson told me. But, he added, depending on the person’s work, “I’ll immediately give them an example that makes it real for them.”

Then, for the next ten minutes, Nelson had a blast talking through different jobs and the corresponding visual use cases for Roboflow. For doctors, think medical imaging and diagnosis. Firefighters? What about early detection of forest fires? For anyone working in environmental research, Nelson points out that Roboflow is already used to monitor coral reefs and underwater ecosystems. There are many potential use cases for Roboflow, a computer vision startup, because the company deals with something very fundamental, even primordial: what we see.

Roboflow has raised $40 million in its Series B, Fortune has exclusively learned. The round was led by GV, joined by Craft Ventures and Y Combinator, as well as Guillermo Rauch of Vercel, Jeff Dean of Google, and Amjad Masad of Replit. Previous investors in the company include Lachy Groom, Sam Altman, and Scott Belsky.

“The amazing thing is that if you think of machine learning, you immediately think of technical teams, and that’s true,” Nelson told Fortune. “But the truth is that the impact is almost everything else, every other place that wants to have a better understanding of the visual world.”

In short, Roboflow uses AI to make sense of what we see. There is a deluge of enterprise AI platforms, but Roboflow is directly tied to the tangible, physical world. It’s a clear throughline for Nelson, who grew up with a keen awareness of the relationship between technology and real-world environments. Nelson grew up in Iowa, where his family runs a traditional Midwestern row crop farm, growing primarily corn and soybeans. Even today, he comes back for the harvest. When Roboflow signed a contract with a major agricultural equipment manufacturer, that was “the moment my parents realized we had a real, real company,” Nelson jokes. (He declined to say who exactly, but some web searches suggest it’s probably John Deere.)

The company told Fortune that more than 25,000 organizations build with Roboflow, including more than half of the Fortune 100. In the last 30 days, Roboflow has seen its open source packages downloaded more than 1 million times.

And those downloads suggest a dizzying number of customer use cases happening right now. There’s Pella Windows & Doors, for example, which uses the Roboflow platform to scan products for defects. Automaker Rivian also uses Roboflow for quality control, while Wimbledon and US Open broadcasters leverage the company’s models for player and ball tracking. BNSF Railway, another Roboflow customer, uses the platform to literally keep trains running. The company uses computer vision to maintain yard inventory in real time, reducing search times and optimizing loading and delivery in yards holding up to 10,000 containers. A subsidiary of Berkshire Hathaway and one of the largest freight railroads in the United States, BNSF also uses Roboflow for real-time safety inspections.

The world is “full of images and things to see, and people make decisions based on what they see,” said Crystal Huang, general partner at GV, who will join Roboflow’s board of directors. Huang believes that solving these types of physical problems represents a largely untapped “new” opportunity, and she sees potential in verticals such as manufacturing, logistics, retail, healthcare, and hospitality.

Nelson understands the challenge inherent in many use cases: after all, if you’re known for everything, you can end up being known for nothing.

Yet there is an almost overwhelming sense of scale that makes me think about how much we see every day and the extent to which that constitutes our primary source of information. There are many ways of seeing, some more comprehensive than others. And if you can help people see things more clearly, there’s a market for it.

“As humans, our sight predates our use of language,” Nelson said. “The ability to experience the world, understand it and synthesize it is innate to intelligence. And if you think about it, so much software that exists in the world doesn’t have that sense, right? So much is invisible and unknown, but the promise of visual AI is that these things can be improved and better understood.”

We are so back…ServiceTitan has filed its S-1 for its planned IPO on Nasdaq. Some figures include the software company’s net loss of $35.7 million, on revenue of $193 million, for the quarter ending July 31. Read the filing here. As always, I love hearing your takes.

ICYMI…My colleague Jason Del Rey exclusively spoke with Aravind Srinivas, CEO of Perplexity, about the company’s new shopping tools.

See you tomorrow,

Allie Garfinkle
Twitter: @agarfinks
E-mail: [email protected]
Submit a deal for the Term Sheet newsletter here.

Correction, November 19, 2024: The electronic version of this article incorrectly stated Jeff Dean’s professional association.

Nina Ajemian curated the deals section of today’s newsletter. Subscribe here.

This story was originally featured on Fortune.com