Get Started
In the current era of large-scale deep learning, the most interesting AI models are massive black boxes that are both costly and difficult to run. Ordinary commercial inference services and APIs let us interact with these models, but they do not let us access model internals. We are changing this with NDIF and NNsight.
NDIF
An inference service hosting large open-weight LLMs for use by researchers — free of charge.
Together, NDIF and NNsight enable researchers to easily run complex experiments on huge open AI models, with fully transparent access to model internals. Follow the steps below to get started.
Install NNsight
To start using NNsight, install it via pip:
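NNsight is published on PyPI, so a standard pip install is all that is needed:

```shell
# Install the NNsight library from PyPI
pip install nnsight
```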
For a deeper exploration of NNsight, run through the full NNsight walkthrough.
We welcome open-source contributions and suggested improvements on GitHub.
Sign up for NDIF remote model access
To remotely access LLMs through NDIF, sign up for an NDIF API key.
Register for your free API key

NDIF hosts multiple LLMs, including various sizes of the Llama 3.1 models and DeepSeek-R1 models. All models are open for public use. View the full list of hosted models on the status page.
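Once you have a key, you can set it as the default for your NNsight installation. A minimal sketch using NNsight's `CONFIG` helper (replace the placeholder string with the key you received from NDIF):

```python
from nnsight import CONFIG

# Store your NDIF API key so future remote requests are authenticated.
# "YOUR_API_KEY" is a placeholder, not a real key.
CONFIG.set_default_api_key("YOUR_API_KEY")
```

You only need to do this once; the key is saved to NNsight's local configuration and reused across sessions.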
Access LLM internals
Now that you have your NDIF API key, you're ready to start exploring LLM internals. We've put together a Colab notebook to help you get started.
Open in Google Colab

The notebook covers:
- Installing NNsight
- Setting up your NDIF API key
- Loading an LLM in NNsight
- Accessing and altering LLM internals remotely
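The steps the notebook walks through can be sketched in a few lines. This is an illustrative example, not the notebook's exact code: it assumes your NDIF API key is already configured, that the chosen Llama 3.1 model is currently hosted (check the status page), and that the module path `model.model.layers` matches the model's architecture:

```python
from nnsight import LanguageModel

# Load a hosted model by its Hugging Face identifier; weights are not
# downloaded locally when running remotely.
model = LanguageModel("meta-llama/Meta-Llama-3.1-8B")

# remote=True sends the trace to NDIF instead of executing locally.
with model.trace("The capital of France is", remote=True):
    # Save the hidden states output by the first transformer layer.
    hidden = model.model.layers[0].output[0].save()

# After the trace completes, the saved value is available locally.
print(hidden.shape)
```

Inside the `trace` context you can also modify internals (for example, overwriting an activation) before the forward pass continues, which is what "altering LLM internals remotely" refers to above.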
Get involved
This has been a quick overview to get started with NDIF's remote models. To learn more, dive into these resources:
- Get a comprehensive overview with the NNsight Walkthrough
- Explore NNsight implementations of common LLM interpretability techniques
- Join the conversation in the NDIF Discord community
