Inference and AI infrastructure for developers and DevOps teams
Run your AI models and applications in a secure, EU-based environment with full control over your code and data.
GitOps-based infrastructure
Skip complex Terraform scripts and the AWS console. Define your infrastructure using simple YAML files in your Git repo.
Automatic sync – changes in your repo are automatically applied to your infrastructure
Version control – full history and ability to roll back changes
Portability – your infrastructure is defined as code and can be moved between environments
CI/CD integration – connect with your existing CI/CD pipelines
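As a sketch of what such a Git-tracked definition might look like, here is a hypothetical YAML file. The field names and values are illustrative assumptions, not the platform's actual schema:

```yaml
# infra/my-ai-app.yaml – illustrative only; field names are assumptions,
# not the platform's real configuration format.
kind: Service
name: my-ai-app
region: eu
replicas:
  min: 1
  max: 5
image: registry.example.com/my-ai-app:1.4.2
env:
  - name: MODEL_ENDPOINT
    value: https://api.example.com/v1
```

Committing a change to this file in your repo would then be picked up by the automatic sync described above.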
OpenAI-compatible API
Use the same code and libraries you're already using with OpenAI, but with full control over your data.
Minimal code change – just switch your API key and endpoint to migrate from OpenAI
Support for popular libraries – works with LangChain, LlamaIndex and other frameworks
GDPR-compliant – all data stays within the EU and is never stored
CO₂ tracking – get carbon emission data for every API call
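Because the API follows the OpenAI conventions, migrating usually comes down to swapping the base URL and API key in your existing client code. A minimal standard-library sketch of building an OpenAI-style chat completions request (the endpoint URL, key, and model name are placeholders, not the provider's actual values):

```python
import json
import urllib.request

# Placeholder values – substitute your actual endpoint, key, and model.
BASE_URL = "https://api.example.eu/v1"  # assumption: OpenAI-style path layout
API_KEY = "YOUR_API_KEY"

def build_chat_request(model: str, messages: list) -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions request without sending it."""
    payload = {"model": model, "messages": messages}
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    "example-model",
    [{"role": "user", "content": "Hello!"}],
)
print(req.full_url)  # https://api.example.eu/v1/chat/completions
```

The same swap applies when using the official OpenAI SDK or frameworks such as LangChain: point the client's base URL at the new endpoint and keep the rest of your code unchanged.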
Kubernetes without the complexity
Get the benefits of Kubernetes without having to manage the infrastructure yourself.
Preconfigured clusters – get started in minutes instead of days
Standardised Helm charts – install databases, caching and services with a single command
GPU support – run your own models on dedicated GPUs when needed
Autoscaling – automatically scale up and down based on demand
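On a standard Kubernetes cluster, demand-based scaling of this kind is typically expressed as a HorizontalPodAutoscaler. A generic sketch using the stock `autoscaling/v2` API (the deployment name and utilisation target are placeholders):

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: my-ai-app            # placeholder name
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: my-ai-app          # the workload being scaled
  minReplicas: 1
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70   # scale out above 70% average CPU
```

With a preconfigured cluster, this is the kind of resource you can apply directly without first setting up the metrics pipeline yourself.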
Developer tools that simplify your workflow
Tools to help you build, test, and deploy your AI applications faster.
CLI tools – manage your resources directly from the terminal
Logging and monitoring – built-in integration with Prometheus and Grafana
Cost tracking – see exactly what each resource costs and optimise accordingly
Developer environments – preconfigured environments for rapid development
Manage your infrastructure with GitOps
GitOps brings the best practices of software development to infrastructure management: version control, collaboration, and automated deployments, all through Git.
Git repository
Your infrastructure defined as code, version controlled and collaborative
Berget cluster
Secure and scalable Kubernetes cluster for your AI workloads
AI APIs and services
Connect to any AI service or model you need
Ready to start building?
Get started in under 5 minutes with our CLI tool and build your first AI application.
Models
Access a broad selection of powerful open models to build diverse AI solutions.
Comprehensive model selection
- Broad selection of model types including instruct, rerank, and moderation models
- Optimised for agentic applications and complex AI workflows
- Support for multiple modalities: text, image, speech-to-text, and text-to-speech