Neu.ro AI Cloud presentation at Digital Transformation Week in Amsterdam

December 08, 2021

The Neu.ro team recently presented our zero-emissions AI Cloud at Digital Transformation Week in Amsterdam. Sebastiaan Holtslag of atNorth and Neu.ro CEO Constantine Goltsev presented on the main stage. Watch the video from the event or read the summary below to learn more about sustainable compute and responsible AI.

MLOps AI Cloud introductions

Sebastiaan Holtslag: Our data centers are concentrated in Iceland, where we leverage a combination of geothermal and hydropower energy. The next step in this process is infrastructure cooling. For that, we have implemented a system that takes full advantage of the Nordic countries' cool climate and circulates only filtered outdoor air through the server rooms within the data center. That helps create ideal conditions for the cooling systems, which drain more than 50% of the power in an average data center. Next, we look at leveraging waste heat: instead of wasting heat by emitting it out of the facility, as most data centers do, atNorth will harness it for the benefit of local residential heating grids.

Not only does that enable a sharp boost in energy efficiency, but it also allows us to utilize a resource that is normally treated as waste. So, in summary, the way we have engineered our data centers enables them to operate much more efficiently than anywhere else, resulting in huge capital savings that we can pass through to our customers, lowering their total cost of ownership and helping them concentrate on innovating and modeling faster, in a way that doesn't cost the earth. With that, I would like to ask Constantine from Neu.ro to explain how their sustainable MLOps platform takes full advantage of this.

Constantine Goltsev: Hi, great to be here. We are Neu.ro, an AI infrastructure tooling company, and we partner with the great people at atNorth to deliver sustainable and ethical AI. There are two things I really want you to remember from this. AI is obviously a trend today. AI is going to be ubiquitous: everybody talks about the metaverse, self-driving cars, and much more. But it's only a baby; it's still developing, at the beginning of its trajectory. Over the next decade it might grow 10x or 100x, nobody really knows, but we know it's going to be huge.

The point is that it's up to us today to make sure that it's ethical and sustainable. We're not even talking about the AI Armageddon that Elon Musk prophesies, where artificial general intelligence destroys humanity; it might happen or it might not, we don't know. But sustainability is something we can do something about, with partners like atNorth. To give a concrete example: already, about 10% of our electricity goes toward sustaining the information infrastructure we have right now, and that is going to roughly double over the next decade. So imagine, here in Europe, electricity just going into pushing bits around,

and, along with increasing internet traffic and everything else, a lot of electricity is going to be used on this. Within that, AI as a portion of that computation is expected to take around 50% or more of the compute, so an enormous amount of capacity is going to go into AI. And here's the thing about AI right now: most of it is powered by the mathematical engines behind neural networks and their training algorithms. How does typical compute work? You want to accomplish something, and then you want to do twice as much of it, so you just double your computing capacity.

Well, unfortunately, neural networks don't behave like that: AI capacity is subject to a power law. That means that in order to gain the next 10% in output quality, you might need to expend 10x the resources. We have shown this here on a logarithmic scale because otherwise we'd run out of vertical space to display it. Roughly three years ago, OpenAI released the first GPT model, which was a breakthrough in natural language processing with a lot of great capabilities, and it had around 100 million parameters. A parameter is just a number in a vector that gets multiplied and adjusted; it's pure mathematical computation, but the computer spends electricity on every operation. So to run one epoch of training, we have to perform on the order of 100 million of these multiplications, and then we need to run thousands and thousands of epochs to train the network. GPT-3, which arrived a mere two years later, already had 175 billion parameters, so the parameter space of these computations grew more than 1,000 times in just two years. And the next level, GPT-4, is prophesied to have yet another large parameter increase over that, about 100 trillion or more parameters, with just one more year having passed. So it clearly scales as a power law.
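The parameter growth the talk describes can be sanity-checked with quick arithmetic. This is a minimal sketch; the counts below are the approximate figures cited in the talk (the GPT-4 number was a prediction at the time, not a confirmed count):

```python
import math

# Approximate parameter counts cited in the talk.
# The GPT-4 figure is speculative, not an official number.
params = {
    "GPT (2018)": 100e6,          # ~100 million parameters
    "GPT-3 (2020)": 175e9,        # 175 billion parameters
    "GPT-4 (predicted)": 100e12,  # ~100 trillion (rumored)
}

base = params["GPT (2018)"]
for name, count in params.items():
    growth = count / base
    # A log scale is needed precisely because growth spans many orders of magnitude.
    print(f"{name}: {count:.0e} params, {growth:,.0f}x over GPT, "
          f"10^{math.log10(count):.1f} on a log scale")
```

The GPT-to-GPT-3 ratio alone works out to 1,750x in two years, which is why the chart in the talk has to be drawn on a logarithmic axis.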

To offset the carbon footprint of training GPT-2, which is roughly equivalent to driving an average car 1.5 million miles, you would need a small patch of forest sequestering carbon for a year. But for GPT-4, with its 100 trillion or more parameters trained across huge data centers, you would need a patch the size of 1,000 New York Central Parks just to offset the CO2 spent on that one model. And this is only one organization, OpenAI, training these models. Soon they are going to be ubiquitous, adopted by every large corporation and bank, because they are so powerful. The carbon footprint is going to explode. So it quickly comes down to sustainability, and we have a responsibility to do something about it right away. What we have done is build a toolkit that not only creates MLOps pipelines for data scientists but also takes sustainability into account. It can help you offload workloads to nodes like atNorth's computational capacity, so that when you train your humongous model, you are limiting the damage you cause to the environment. It also helps you optimize the compute itself, because there are several paths to your goal: you can take a no-prisoners approach and burn as much electricity and capacity as you can, or you can do it perhaps slower, with appropriate tooling, and minimize the CO2 and the impact on the environment. This is what we are trying to engineer, and how we are trying to make that future sustainable. The goal is to stop an avalanche of cloud compute power from heating up the planet. Thank you.
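The driving-versus-forest comparison above can be roughed out numerically. The constants below are not from the talk: they are ballpark public figures (roughly 400 g of CO2 per mile for an average car, and on the order of 2.5 tonnes of CO2 sequestered per acre of forest per year), so treat this as an order-of-magnitude sketch, not a precise accounting:

```python
# Rough sanity check of the talk's carbon comparison.
# All constants are approximate assumptions, not figures from the talk.
CO2_KG_PER_MILE = 0.4             # ~400 g CO2 per mile, average car
SEQUESTRATION_T_PER_ACRE_YEAR = 2.5  # rough tonnes CO2 an acre of forest absorbs yearly
CENTRAL_PARK_ACRES = 843          # area of New York's Central Park

miles = 1.5e6  # the "1.5 million miles" equivalence cited for GPT-2's training
emissions_tonnes = miles * CO2_KG_PER_MILE / 1000
acres_needed = emissions_tonnes / SEQUESTRATION_T_PER_ACRE_YEAR

print(f"~{emissions_tonnes:.0f} t CO2, offset by ~{acres_needed:.0f} acres of forest "
      f"for a year ({acres_needed / CENTRAL_PARK_ACRES:.2f} Central Parks)")
```

Under these assumptions the GPT-2-scale footprint lands at a few hundred acres for a year, i.e. well under one Central Park, which is consistent with the talk's "small patch of trees" framing for the smaller model.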