Neu.ro Tool Integrations
In our experience, nearly all AI development efforts, whether at large enterprises or new startups, begin by spending the first 3-6 months assembling their first ML pipelines from available tools. These custom integrations are time-consuming and expensive to produce, can be fragile, and frequently require drastic changes as project requirements evolve.
Frequently, these custom ML pipelines support only a small set of built-in algorithms or a single ML library, and are tied to each company’s infrastructure. Users cannot easily adopt new ML libraries or share their work with a wider community.
Neu.ro facilitates adoption of robust, adaptable Machine Learning Operations (MLOps) by simplifying resource orchestration, automation, and instrumentation at every step of ML system construction, including integration, testing, deployment, monitoring, and infrastructure management.
To maintain agility and avoid the pitfalls of technical debt, Neu.ro lets you seamlessly connect an ever-expanding range of ML tools into your workflow.
We cover the entire ML lifecycle, from Data Collection to Testing and Interpretation. All resources, processes, and permissions are managed through our Neu.ro platform, which can be installed and run on virtually any compute infrastructure, whether on-premises or in the cloud of your choice.
Resource Management
The various components of a machine learning workflow can be split into independent, reusable, modular parts that can be pipelined together to create, test, and deploy models.
Our toolset integrator, Neu.ro Toolbox, contains up-to-date, out-of-the-box integrations with a wide range of open-source and commercial tools required for modern ML/AI development.
For Resource Management, the Neu.ro platform provides native functionality for managing Docker environments and uses YAML to configure routine tasks such as starting a Jupyter Notebook in the platform, starting a Training Pipeline, or opening a file browser for Remote Storage.
YAML:
YAML is a language commonly used for configuration files and in applications where an object state is being stored or transmitted. YAML targets many of the same communications applications as Extensible Markup Language (XML) but has a minimal syntax which intentionally differs from SGML. It uses both Python-style indentation to indicate nesting, and a more compact format that uses […] for lists and {…} for maps so that JSON files are valid YAML 1.2.
A user can create YAML files that configure routine tasks, e.g. starting a Jupyter Notebook in the platform, starting a Training Pipeline, or opening a file browser for Remote Storage, as in the sketch below.
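For illustration, here is a minimal sketch of a Neu.ro Flow "live" workflow that starts a Jupyter Notebook job. It assumes Neu.ro Flow's live workflow YAML format; the title, image, preset, and command values are placeholders to be adapted to your project and cluster.

kind: live
title: example-project
jobs:
  jupyter:
    # Placeholder base image; substitute your project's environment image
    image: ghcr.io/neuro-inc/base
    # Placeholder resource preset; available presets are defined per cluster
    preset: gpu-small
    http_port: 8888
    # Open the notebook URL in a browser once the job is running
    browse: true
    cmd: jupyter notebook --no-browser --ip=0.0.0.0 --port=8888 --allow-root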
Neu.ro Docs:
Creating a Cluster with YAML
Neu.ro Flow
Docker
Neu.ro uses Docker containers to run jobs in isolated environments and lets users treat Docker images as templates containing an application and all the dependencies needed to run it. With Neu.ro, you can run jobs from images hosted on the public Docker registry as well as on the platform registry, as in the sketch below.
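As a sketch in the same assumed Neu.ro Flow YAML format, a job's image field can point either at an image on the public Docker registry or at one in the platform registry (referenced here with the image: scheme); all names, tags, and presets are illustrative.

kind: live
title: image-examples
jobs:
  train_public:
    # Image pulled from the public Docker registry (Docker Hub)
    image: pytorch/pytorch
    preset: gpu-small
    cmd: python train.py
  train_platform:
    # Image previously pushed to the Neu.ro platform registry
    image: image:my-project:v1
    preset: gpu-small
    cmd: python train.py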
Neu.ro Docs:
Environments (Docker Images)
Additional toolset integrations in Neu.ro include the following (some tools appear more than once because they support multiple stages of the ML lifecycle):
- Label Studio (open source)
- DVC (open source)
- Pachyderm
- VS Code (open source)
- Jupyter (open source)
- Git (open source)
- Neu.ro native
- YAML
- Docker
- Neu.ro native
- NNI (open source)
- W&B
- MLflow (open source)
- W&B
- Neu.ro native
- MLflow (open source)
- W&B
- TensorBoard
- DVC (open source)
- MLflow (open source)
- W&B
- Neu.ro native
- Algorithmia
- Seldon Core (open source)
- Algorithmia
- Prometheus + Grafana (open source)
- Fiddler
- Seldon Alibi (open source)
- WhyLabs