“Neu.ro’s approach to ML/AI model development, training, and inference perfectly aligns with our view of how sustainability should look in the data processing industry”

Dean Nelson
Cato Digital CEO
Client
Low-cost, low-carbon bare metal provider for retail and wholesale customers in all the right places. Cato Digital is a member of the iMasons Climate Accord, reducing carbon in materials, products, and power, and committing to tackling Scope-3 emissions.
Industry
IT infrastructure, cloud and data center services

THE OPPORTUNITY

“The Neu.ro MLOps Platform is created and developed with a responsible and sustainable AI philosophy. In this regard, Cato Digital is the perfect match for Neu.ro; together, we are committed to achieving a true carbon-neutral AI Cloud.” – Neu.ro CEO Uri Soroka

AI is a significant and growing driver of increased cloud usage in the data center industry, but teams require orchestration and integration support at each stage of development and deployment. All of this should come with a seamless way to track the carbon footprint of training and the tools to reduce it.
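As a rough illustration of what such tracking involves, the sketch below estimates the CO2 footprint of a single training run from GPU power draw, run time, data center PUE, and grid carbon intensity. The figures and function are illustrative placeholders, not Cato or Neu.ro measurements.

```python
# Rough estimate of the carbon footprint of one training run.
# All numbers below are illustrative placeholders, not measured values.

def training_co2_kg(gpu_count: int,
                    gpu_power_watts: float,
                    hours: float,
                    pue: float,
                    grid_kg_co2_per_kwh: float) -> float:
    """Return estimated emissions in kg CO2e for a single job."""
    # Energy drawn by the accelerators, scaled by the data center's
    # power usage effectiveness (cooling, networking, etc.).
    energy_kwh = gpu_count * gpu_power_watts / 1000.0 * hours * pue
    # Emissions depend on the carbon intensity of the local grid.
    return energy_kwh * grid_kg_co2_per_kwh


if __name__ == "__main__":
    # Example: 8 x V100-class GPUs (~300 W each) running for 24 hours in a
    # facility with PUE 1.2, on a renewable-heavy grid at 0.05 kg CO2e/kWh.
    print(f"{training_co2_kg(8, 300, 24, 1.2, 0.05):.1f} kg CO2e")
```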

There is growing demand for cloud-based AI infrastructure to support AI-associated workloads. Cloud providers are investing heavily in building their AI capabilities to meet this demand, offering machine learning, natural language processing, and computer vision services.

AI progress, often driven by larger models such as GPT-4 or more extensive data sets, comes at a real cost to the environment. Organizations of all sizes are increasingly conscious of the impact of their operations on the climate. Designing and operating AI systems should therefore focus on energy efficiency: using algorithms that demand fewer computational resources and eliminating unnecessary energy consumption.

AI’s share of electricity consumption is growing much faster than that of other technologies. Deep learning models and the data sets they train on are increasing at a truly extraordinary rate: within a few years, the leading language models will have increased in size by over 100,000x.

“Low-cost, Carbon-Free Data Modeling”
- Carbon-neutral ML Cloud environment
- Complete MLOps stack with best-in-breed tool sets
- Complete ML pipelines
- White glove support
- Carbon-neutral data modeling and training

The Challenge

Efficient and secure AI development in the cloud demands robust management of resources, processes, and assets in an environment that prioritizes convenience and efficacy. Cloud providers that solely offer bare-metal GPU servers for AI, hoping their customers will possess the skills, time, and resources necessary to create a custom pipeline, will find themselves at a significant competitive disadvantage.

The hyperscale providers have already developed in-house cloud AI software stacks (e.g., Azure Machine Learning, AWS SageMaker, GCP AI). Still, these have disadvantages: vendor lock-in and a lack of support for on-premise and hybrid architectures. Moreover, the environmental footprint of existing hyperscale cloud providers is nowhere near zero emissions.

Cato's Groundbreaking GPU Servers
Cato's bare metal GPU servers are designed to fast-track complex computing tasks, such as machine learning, scientific simulations, and virtual reality. The use of graphics processing units (GPUs) provides considerable parallel processing power, reducing processing times from days or weeks to hours or even minutes. These high-performance servers are available at significantly lower costs compared to equivalent offerings from major cloud providers. Neu.ro comes pre-installed on each of Cato's GPU servers, equipping them with advanced AI development and inference capabilities.

The Solution

Cato Digital is fully dedicated to constructing the world’s most sustainable bare metal platform. This is achieved using second-life hardware, stranded data center power capacity, and renewable energy. In alignment with the iMasons Climate Accord, Cato addresses Scope-3 emissions as its contribution to the Accord.

To facilitate AI workload growth and satisfy Cato's current and future AI needs, Neu.ro installed its MLOps orchestration and interoperability solution to run natively in Cato Digital's data centers.

This platform offers unique advantages such as easy and rapid access to the computing infrastructure via a CLI or menu system, access control and permissions for authorized team members, orchestration of both cloud and on-prem compute resources, workflow automation, and protection of AI assets and artifacts throughout the ML lifecycle.
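To make the workflow concrete, here is a minimal, hypothetical sketch of what submitting a training job through such an orchestration layer could look like. The `JobSpec` structure, preset and cluster names, and `submit` helper are illustrative assumptions, not Neu.ro's actual CLI or API.

```python
# Hypothetical job-submission sketch; names and presets are illustrative,
# not the actual Neu.ro interface.
from dataclasses import dataclass, field


@dataclass
class JobSpec:
    image: str                      # container image with the training code
    command: str                    # entrypoint inside the container
    preset: str = "gpu-small"       # maps to a compute preset (e.g. 1 x T4)
    cluster: str = "cato-onprem"    # cloud or on-prem target cluster
    env: dict = field(default_factory=dict)


def submit(spec: JobSpec, user: str, allowed_clusters: set[str]) -> str:
    """Validate permissions, then hand the job to the scheduler (stubbed)."""
    if spec.cluster not in allowed_clusters:
        raise PermissionError(f"{user} may not run jobs on {spec.cluster}")
    # In a real platform this call would go to the orchestration API or CLI.
    return f"submitted {spec.command!r} on {spec.cluster} ({spec.preset})"


if __name__ == "__main__":
    job = JobSpec(image="pytorch/pytorch:latest",
                  command="python train.py --epochs 10")
    print(submit(job, user="alice", allowed_clusters={"cato-onprem"}))
```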

Moreover, the platform integrates a wide selection of best-in-breed AI/ML toolsets covering the entire ML lifecycle. On an accelerated timeline, Neu.ro installed, tested, and launched turnkey AI/ML services on Cato, enabling the company to leverage 100% green data center infrastructure.

Additionally, the platform supports multi-cloud and hybrid cloud architectures, and users can access pre-integrated AI/ML products, apps, and APIs – both open-source and proprietary. This reduces the cost, time, complexity, and risk of their AI development projects.

Additional features:

  • Industry-unique Green Scheduler to manage the environmental impact of your AI program (see the sketch after this list)
  • Monitor the CO2 footprint of every job, model, project, and team
  • Optimize compute usage to balance performance / environmental impact 
  • Automate reporting for ESG disclosures
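
A minimal sketch of the idea behind carbon-aware scheduling: given a forecast of grid carbon intensity, pick the start hour that minimizes a job's emissions while still finishing before a deadline. The forecast values and helper below are hypothetical, not the Green Scheduler's actual implementation.

```python
# Toy carbon-aware scheduler; the forecast and helper are illustrative,
# not the actual Green Scheduler implementation.

def best_start_hour(forecast_kg_per_kwh: list[float],
                    job_hours: int,
                    deadline_hour: int) -> int:
    """Return the start hour minimizing average grid carbon intensity
    over the job's runtime, finishing no later than deadline_hour."""
    best_hour, best_avg = 0, float("inf")
    for start in range(0, deadline_hour - job_hours + 1):
        window = forecast_kg_per_kwh[start:start + job_hours]
        avg = sum(window) / job_hours
        if avg < best_avg:
            best_hour, best_avg = start, avg
    return best_hour


if __name__ == "__main__":
    # Hypothetical 24-hour forecast of grid carbon intensity (kg CO2e/kWh).
    forecast = [0.40, 0.38, 0.35, 0.30, 0.22, 0.15, 0.10, 0.09,
                0.08, 0.10, 0.12, 0.18, 0.25, 0.30, 0.33, 0.35,
                0.38, 0.42, 0.45, 0.44, 0.43, 0.42, 0.41, 0.40]
    # A 4-hour training job that must finish within the next 24 hours.
    print("start at hour", best_start_hour(forecast, job_hours=4, deadline_hour=24))
```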

 

                         T4 Server                   V100 Server
Processor:               2 x Intel Xeon E5-2680v4    2 x Intel Xeon E5-2680v4
vCores:                  56                          56
Processor Speed:         2.4-3.3GHz                  2.4-3.3GHz
Monthly †:               Coming Soon                 Coming Soon
Hourly ‡:                Coming Soon                 Coming Soon
SLA:                     99.5%                       99.5%
Memory:                  256GiB                      256GiB
Local Storage:           1 x 512GB SSD               1 x 512GB SSD
Attached Storage:
Network:                 10Gbps                      10Gbps
Data Transfer Included:  5TiB                        5TiB
Network Overage:         $0.10/GiB                   $0.10/GiB
GPU:                     1 x Nvidia T4               8 x Nvidia V100
CUDA Cores:              2,560                       40,960
Tensor Cores:            320                         5,120