Data sovereignty

Your data stays in your jurisdiction.

Inference in London (UK) by default. No US transfer. No third-party model provider in the request path.

Marigold runs open-weight models on private AWS infrastructure in your chosen region. No request reaches OpenAI, Anthropic, Google, or any other external provider. GDPR data processing agreements are available on the Pro tier.

Most AI API products route every prompt to a US-based model provider. Marigold removes that dependency: there is no external inference provider in the request path, only infrastructure in the AWS region you select.

01

Region selection

Default is London (UK). Ireland is also available. Inference, storage, and queuing all run within the selected region. Additional regions are available on the Pro tier.
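Region pinning could be pictured as the sketch below. The endpoint URLs, client function, and payload fields are illustrative assumptions, not Marigold's actual API; only the AWS region codes (eu-west-2 for London, eu-west-1 for Ireland) are real.

```python
# Hypothetical sketch of a region-pinned inference request.
# Endpoint URLs and payload fields are assumptions for illustration,
# not Marigold's documented API.

REGION_ENDPOINTS = {
    "eu-west-2": "https://inference.eu-west-2.example.com/v1/jobs",  # London (default)
    "eu-west-1": "https://inference.eu-west-1.example.com/v1/jobs",  # Ireland
}

def build_request(prompt: str, region: str = "eu-west-2") -> dict:
    """Return the endpoint and payload for a job pinned to one region."""
    if region not in REGION_ENDPOINTS:
        raise ValueError(f"Region {region!r} is not provisioned for this account")
    return {
        "url": REGION_ENDPOINTS[region],
        "payload": {"prompt": prompt, "region": region},
    }

# Requests resolve to the endpoint for the account's region; there is
# no fallback to an endpoint outside it.
req = build_request("Summarise this contract.")
```

Because the region is fixed at provisioning, a request for an unprovisioned region fails outright rather than being rerouted.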

02

No third-party model provider

Inference inputs are not forwarded to OpenAI, Anthropic, Google, or any other external provider. The model runs on private AWS infrastructure from weights held in your region.

03

No prompt retention

Inference requests are not stored beyond the duration of the job. Outputs are held briefly for retrieval and then deleted on schedule.
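One way to picture scheduled deletion: each output carries an expiry, and anything past its expiry is unreadable and purged. A minimal sketch; the retention window and store shape are assumptions for illustration, not Marigold's internals.

```python
import time

OUTPUT_TTL_SECONDS = 3600  # assumed retention window, for illustration only

class OutputStore:
    """Holds job outputs until their scheduled deletion time."""

    def __init__(self):
        self._items = {}  # job_id -> (output, expires_at)

    def put(self, job_id, output, now=None):
        now = time.time() if now is None else now
        self._items[job_id] = (output, now + OUTPUT_TTL_SECONDS)

    def get(self, job_id, now=None):
        now = time.time() if now is None else now
        item = self._items.get(job_id)
        if item is None or item[1] <= now:
            # Expired outputs are dropped rather than returned.
            self._items.pop(job_id, None)
            return None
        return item[0]

store = OutputStore()
store.put("job-1", "result", now=0)
assert store.get("job-1", now=10) == "result"   # retrievable within the window
assert store.get("job-1", now=4000) is None     # gone after expiry
```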

04

Open-weight models

Models run from weights held on private AWS infrastructure. No model provider receives inference traffic. See the full registry for available models.

05

DPA available

Pro tier accounts can obtain a GDPR-compliant data processing agreement. Marigold acts as processor; your organisation retains controller status.

06

Audit trail

Every inference job records status and timing within your region. Execution history is queryable. No telemetry leaves the deployment.
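The shape of such a record might look like the sketch below: status and timing only, with no prompt or output content. The field and class names are hypothetical, not Marigold's schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class JobRecord:
    """Per-job audit entry: status and timing only, no prompt content."""
    job_id: str
    status: str                      # e.g. "queued", "running", "completed"
    started_at: float
    finished_at: Optional[float] = None
    region: str = "eu-west-2"

class AuditLog:
    """In-region execution history, queryable by status."""

    def __init__(self):
        self._records = []

    def append(self, record: JobRecord) -> None:
        self._records.append(record)

    def query(self, status: Optional[str] = None) -> list:
        return [r for r in self._records if status is None or r.status == status]

log = AuditLog()
log.append(JobRecord("job-1", "completed", started_at=0.0, finished_at=1.2))
log.append(JobRecord("job-2", "running", started_at=2.0))
completed = log.query(status="completed")
```

The key property is what the record omits: nothing in it would need to leave the region, and nothing in it reproduces the request itself.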

Criterion                      | Marigold              | OpenAI API       | Anthropic API
Inference region               | UK / EU (your choice) | US               | US / EU
Prompt sent to third party     | Never                 | Always           | Always
Training on prompts            | Never                 | Opt-out required | No (by default)
Data processing agreement      | Available (Pro)       | Available        | Available
You control deployment region  | Yes                   | No               | No
Open-weight models             | Yes                   | No               | No

Can I use Marigold for GDPR-regulated data?

Marigold is designed for use with data subject to EU and UK GDPR. Inference runs within your chosen AWS region. No data crosses jurisdictional boundaries as part of inference. A data processing agreement is available to Pro tier accounts. You remain data controller; Marigold acts as processor.

Does Marigold use US-based sub-processors for inference?

No external model provider is in the inference request path. Models run on private AWS infrastructure in your chosen region. The only external dependency at inference time is AWS itself.

Is Marigold suitable for processing sensitive personal data?

The architecture supports it: no cross-border transfer, no third-party model provider, no prompt retention. Whether a specific processing activity is lawful depends on your legal basis, sector, and the nature of the data. Marigold does not provide legal advice. Pro tier accounts can discuss specific use cases and obtain a DPA before processing begins.

Which AWS regions are supported?

Default is London (UK). Ireland is available. Other regions including Sydney and Canada are available on request for Pro tier accounts. Region selection is fixed at account provisioning.

Inference that does not leave your region.

Leave your email and we will reach out when access opens. Include any specific compliance requirements and we will address them before onboarding.

Join the waitlist

No spam. One email when access opens.
