According to GeekWire, the Allen Institute for AI launched OlmoEarth on Tuesday as an open-source alternative to proprietary Earth monitoring systems. The platform uses AI models trained on millions of Earth observations totaling about 10 terabytes of data. It includes OlmoEarth Studio for creating datasets and OlmoEarth Viewer for exploring AI-generated maps. Early adopters are already using it to update global mangrove maps twice as fast with 97% accuracy and to detect Amazon deforestation. Ai2 CEO Ali Farhadi says the initiative makes Earth AI accessible to frontline workers, while lead researcher Patrick Beukema emphasizes cross-field collaboration. The full platform is rolling out to select partners first, and Ai2 is inviting additional collaborations.
Taking on the giants
Here’s the thing about geospatial analysis: it’s been dominated by Big Tech for years. Google Earth Engine and Microsoft’s Planetary Computer have petabytes of satellite data, but they’re not exactly accessible to your average conservation group or small research team. You need serious technical chops to make them work. Ai2 is basically saying “enough with the walled gardens” and offering a completely open alternative.
And they’re not just talking about data access – they’re providing the whole system. Model fine-tuning, deployment tools, the works. It’s the kind of end-to-end platform that could actually level the playing field. Think about it: what happens when local environmental groups can monitor deforestation as effectively as Google can?
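To make “model fine-tuning” concrete: for a small team, it usually means the standard workflow of bolting a task-specific head onto a pretrained encoder and training on a modest labeled dataset. Here’s a minimal PyTorch sketch of that pattern – note that names like `DeforestationHead` and the shape of `backbone` are placeholders I’m assuming for illustration, not anything from OlmoEarth’s actual API.

```python
# Hypothetical sketch: adapting an open geospatial foundation model to a
# local task. `backbone` stands in for any pretrained encoder that maps
# satellite tiles to feature vectors; names are illustrative, not taken
# from the OlmoEarth codebase.
import torch
import torch.nn as nn

class DeforestationHead(nn.Module):
    """Small classifier attached on top of the backbone's features."""
    def __init__(self, embed_dim: int, num_classes: int = 2):
        super().__init__()
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        return self.classifier(features)

def fine_tune(backbone: nn.Module, head: nn.Module, loader, epochs: int = 5):
    """Standard supervised fine-tuning over a small labeled dataset."""
    optimizer = torch.optim.AdamW(
        list(backbone.parameters()) + list(head.parameters()), lr=1e-4
    )
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for tiles, labels in loader:  # tiles: (B, C, H, W), labels: (B,)
            optimizer.zero_grad()
            logits = head(backbone(tiles))
            loss_fn(logits, labels).backward()
            optimizer.step()
```

Nothing exotic – and that’s the point. If the weights are open, this loop is all a conservation group needs to turn a general-purpose model into a regional deforestation detector.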
The open source advantage
Ai2 is really leaning into their “true openness” philosophy here. They’re directly calling out Google’s AlphaEarth Foundations for releasing only “embeddings” – precomputed numerical summaries of the imagery – rather than the model itself. That’s a pretty bold move. They claim their fine-tuned OlmoEarth “outperformed AEF substantially” and held its own against models from Meta, IBM, and NASA too.
But here’s what matters: when models are truly open, researchers can build on each other’s work. They can fine-tune for specific regions or problems. They can understand why the AI makes certain decisions. That transparency is crucial for something like climate science, where trust in the results really matters.
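The practical gap between an embeddings-only release and an open-weights one is easy to show. A rough sketch, using a toy stand-in encoder (the names here are mine, not anyone’s actual API):

```python
import torch
import torch.nn as nn

embed_dim, num_classes = 256, 2
# Toy stand-in for a pretrained encoder; purely illustrative.
backbone = nn.Sequential(nn.Flatten(), nn.LazyLinear(embed_dim))

# Embeddings-only release: the encoder is a black box behind someone
# else's pipeline. All you can train is a probe on the frozen vectors
# you were handed.
probe = nn.Linear(embed_dim, num_classes)

# Open-weights release: the encoder itself is in your hands. Unfreeze it
# and gradients can reshape the representation for your region or sensor...
for param in backbone.parameters():
    param.requires_grad = True

# ...and you can inspect the architecture directly, which matters when
# you need to explain why the model flagged a particular patch of forest.
print(backbone)
```

With embeddings you’re limited to the top half of that snippet; with open weights you get the whole thing.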
Beyond the lab
The early use cases they’re highlighting are pretty compelling. Updating mangrove maps twice as fast? That’s not just an academic exercise – mangroves are critical for coastal protection and carbon storage. Detecting Amazon deforestation and mapping vegetation dryness for wildfire prediction? These are immediate, real-world problems where faster, cheaper monitoring could literally save lives and ecosystems.
I’m curious how this plays out at the grassroots level. Will local governments start using this instead of paying for expensive proprietary services? Could indigenous communities monitor their own territories without relying on outside tech giants? The potential here feels significant.
The road ahead
Right now, the OlmoEarth Viewer is publicly available, and they’ve put the code and documentation on GitHub. The full platform with OlmoEarth Studio is going out to select partners first. It’s a smart rollout – get it into the hands of people who can really test it and build meaningful use cases.
This feels like part of a bigger trend we’re seeing across AI. The closed, proprietary approach versus the open, collaborative one. For climate and conservation work specifically, having open tools feels particularly important. After all, we’re all living on the same planet – shouldn’t the tools to understand it be accessible to everyone?
