The explosion of AI companies has pushed demand for computing power to new extremes, and companies like CoreWeave, Together AI, and Lambda Labs have capitalized on that demand, attracting immense amounts of attention and capital for their ability to supply distributed compute capacity.
But most companies still store data with the big three cloud providers (AWS, Google Cloud, and Microsoft Azure), whose storage systems were built to keep data close to their own compute resources, not spread across multiple clouds or regions.
“Modern AI workloads and AI infrastructure are choosing distributed computing instead of big cloud,” Ovais Tariq, co-founder and CEO of Tigris Data, told Trendster. “We want to provide the same option for storage, because without storage, compute is nothing.”
Tigris, founded by the team that developed Uber’s storage platform, is building a network of localized data storage centers that it claims can meet the distributed compute needs of modern AI workloads. The startup’s AI-native storage platform “moves with your compute, [allows] data [to] automatically replicate to where GPUs are, supports billions of small files, and provides low-latency access for training, inference, and agentic workloads,” Tariq said.
To do all of that, Tigris recently raised a $25 million Series A round that was led by Spark Capital and saw participation from existing investors, which include Andreessen Horowitz, Trendster has exclusively learned. The startup is going up against the incumbents, which Tariq calls “Big Cloud.”
Tariq argues these incumbents offer not only a more expensive data storage service, but also a less efficient one. AWS, Google Cloud, and Microsoft Azure have historically charged egress fees (dubbed a “cloud tax” in the industry) when a customer wants to migrate to another cloud provider, or download and move their data if they want to, say, use a cheaper GPU or train models in different parts of the world simultaneously. Think of it like having to pay your gym extra if you want to stop going there.
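To make the “cloud tax” concrete, here is a back-of-envelope sketch of how egress fees compound for an AI workload. The rate and dataset size below are illustrative assumptions roughly in line with published big-cloud list prices, not figures from Tigris, Fal, or any specific provider:

```python
# Back-of-envelope egress cost for moving a training dataset between clouds.
# Both numbers below are illustrative assumptions, not provider quotes.

EGRESS_PER_GB = 0.09   # assumed egress rate, USD per GB
DATASET_TB = 50        # assumed training dataset size, terabytes

def egress_cost(dataset_tb: float, rate_per_gb: float) -> float:
    """USD cost to move `dataset_tb` terabytes out of a cloud once."""
    return dataset_tb * 1024 * rate_per_gb

one_move = egress_cost(DATASET_TB, EGRESS_PER_GB)
# Re-pulling the same dataset to GPUs in three regions triples the bill.
three_regions = 3 * one_move
print(f"one egress: ${one_move:,.0f}, three regions: ${three_regions:,.0f}")
```

At these assumed rates, a single 50 TB move costs several thousand dollars, and every additional region or cheaper-GPU migration pays the toll again, which is how egress can come to dominate a startup’s cloud bill.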
According to Batuhan Taskaya, head of engineering at Fal.ai, one of Tigris’ customers, these costs once accounted for the majority of Fal’s cloud spending.
Beyond egress fees, Tariq says there’s still the problem of latency with larger cloud providers. “Egress fees were just one symptom of a deeper problem: centralized storage that can’t keep up with a decentralized, high-speed AI ecosystem,” he said.
Most of Tigris’ 4,000+ customers are like Fal.ai: generative AI startups building image, video, and voice models, which tend to have large, latency-sensitive datasets.
“Imagine talking to an AI agent that’s doing live audio,” Tariq said. “You want the lowest latency. You want your compute to be local, close by, and you want your storage to be local, too.”
Big clouds aren’t optimized for AI workloads, he added. Streaming massive datasets for training or running real-time inference across multiple regions can create latency bottlenecks that slow model performance. Access to localized storage means data is retrieved faster, which lets developers run AI workloads reliably and more cost-effectively on decentralized clouds.
“Tigris lets us scale our workloads in any cloud by providing access to the same data filesystem from all these places without charging egress,” Fal’s Taskaya said.
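The “same filesystem from all these places” pattern typically rests on an S3-compatible API: workloads in any cloud point their existing S3 tooling at one storage endpoint rather than a per-cloud, per-region bucket. A minimal sketch of what that configuration can look like, where the endpoint URL, credentials, and bucket name are placeholders for illustration, not documented Tigris values:

```shell
# Point any S3-compatible client at a single global endpoint, regardless
# of which cloud the GPU workload runs in. All values are placeholders.
export AWS_ENDPOINT_URL_S3="https://storage.example-endpoint.dev"
export AWS_ACCESS_KEY_ID="example-access-key"
export AWS_SECRET_ACCESS_KEY="example-secret-key"

# The same bucket is then addressable from AWS, GCP, or a GPU cloud alike:
aws s3 cp s3://training-data/shard-0001.tar /mnt/local/
```

Because only the endpoint changes, existing S3-based training and inference pipelines can move between clouds without code changes, which is the portability argument Taskaya is making.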
There are other reasons why companies want data closer to their distributed cloud deployments. For example, in highly regulated fields like finance and healthcare, one large roadblock to adopting AI tools is that enterprises need to ensure data security.
Another motivation, says Tariq, is that companies increasingly want to own their data, pointing to how Salesforce earlier this year blocked its AI rivals from using Slack data. “Companies are becoming more and more aware of how important the data is, how it’s fueling the LLMs, how it’s fueling the AI,” Tariq said. “They want to be more in control. They don’t want someone else to be in charge of it.”
With the fresh funds, Tigris intends to keep building out its data storage centers to support growing demand; Tariq says the startup has grown 8x annually since its founding in November 2021. Tigris already has three data centers, in Virginia, Chicago, and San Jose, and wants to continue expanding in the U.S. as well as in Europe and Asia, specifically in London, Frankfurt, and Singapore.





