
2026-04-23 / 9 MIN READ

Server-side GA4 via Measurement Protocol: the real setup

Tutorial walkthrough for server-side GA4 using the Measurement Protocol. Endpoint, payload shape, api_secret rotation, and the consent-aware event filter.

The GA4 Measurement Protocol is Google's server-to-server API for sending events into a GA4 property. Most DTC brands have heard of it, most have not set it up, and most of what is published about it online is a GTM server-container tutorial that skips the actual protocol layer.

This is the real setup. Endpoint, payload, authentication, consent handling, and the gotchas I have hit in production. It assumes you have a working warehouse-first analytics stack and you want GA4 to be a downstream consumer instead of the source of truth.


The endpoint and payload

GA4 Measurement Protocol has a single endpoint:

https://www.google-analytics.com/mp/collect

There is also a validation endpoint at /debug/mp/collect that returns errors in JSON instead of silently dropping events. Use debug during development, switch to collect in production.

Every request needs two query parameters and a JSON body:

POST https://www.google-analytics.com/mp/collect
  ?measurement_id=G-XXXXXXXXXX
  &api_secret=YOUR_API_SECRET

{
  "client_id": "uuid-or-ga-client-id",
  "user_id": "optional-canonical-user-id",
  "timestamp_micros": 1714089600000000,
  "non_personalized_ads": false,
  "consent": {
    "ad_user_data": "GRANTED",
    "ad_personalization": "GRANTED"
  },
  "events": [
    {
      "name": "purchase",
      "params": {
        "session_id": "1714089000",
        "engagement_time_msec": 100,
        "transaction_id": "shopify-order-12345",
        "currency": "USD",
        "value": 127.50,
        "items": [
          {
            "item_id": "SKU-001",
            "item_name": "Product Title",
            "item_brand": "Brand",
            "item_category": "Category",
            "price": 49.00,
            "quantity": 2
          }
        ]
      }
    }
  ]
}

The canonical reference is at developers.google.com/analytics/devguides/collection/protocol/ga4. The fields above are the ones that matter for DTC.

The six fields that actually matter

Measurement Protocol has dozens of optional fields. Six are load-bearing for DTC.

client_id is the GA4 user identifier. It must match the _ga cookie value on the browser if you want session continuity. If you do not have a browser _ga cookie (pure server scenario, bot traffic, etc.), mint a UUID and use it consistently per user.

user_id is optional but strongly recommended. It is your canonical user ID (hashed email, customer_id, etc.) and lets GA4 attribute the same user across devices. Always hash the value (SHA-256) before sending.

timestamp_micros is the event time in microseconds. If you omit it, GA4 uses the request arrival time. For events backfilled from a warehouse, always set this explicitly. GA4 accepts events up to 72 hours in the past.

session_id and engagement_time_msec together tell GA4 this event is part of a session. Without them, every event is treated as its own single-event session, which breaks your funnel reports. session_id is a Unix timestamp of session start; engagement_time_msec is how long the user was active (1 to 100 is fine for server-synthesized events).

consent is the consent-mode signal. GA4 uses it to decide whether to process the event for personalization and ads. For a user who declined consent, send "ad_user_data": "DENIED" and "ad_personalization": "DENIED" and GA4 will treat the event as anonymized.

transaction_id on purchase events is how GA4 deduplicates. Missing or duplicated transaction_id is the single most common cause of doubled-up GA4 revenue.
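A minimal pre-flight check for these six fields, run before handing an event to the forwarder, catches most silent rejections before they happen. This is a sketch; the helper name and message strings are illustrative, and the field paths match the payload shape above:

```javascript
// Illustrative pre-flight check for the six load-bearing fields.
// Returns a list of problems; an empty list means the event is safe to forward.
function checkMpEvent(payload) {
  const problems = [];
  const event = payload.events?.[0];

  if (!payload.client_id) problems.push("client_id is required");

  // timestamp_micros must fall inside GA4's 72-hour backfill window
  if (payload.timestamp_micros) {
    const ageMicros = Date.now() * 1000 - payload.timestamp_micros;
    if (ageMicros > 72 * 3600 * 1e6) problems.push("timestamp_micros older than 72 hours");
  }

  // session_id + engagement_time_msec keep the event inside a session
  if (!event?.params?.session_id) problems.push("session_id missing: event becomes its own session");
  if (!event?.params?.engagement_time_msec) problems.push("engagement_time_msec missing");

  // purchase events need transaction_id for deduplication
  if (event?.name === "purchase" && !event.params.transaction_id) {
    problems.push("purchase without transaction_id: revenue will double on retries");
  }

  return problems;
}
```

Wire it in front of the forwarder and log (or dead-letter) anything that comes back non-empty, rather than letting GA4 drop it silently.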

The ingestion service pattern

The server-side GA4 code sits inside your ingestion service (the one that is also writing to BigQuery). Every event gets written to the warehouse first, then a subset gets forwarded to GA4.

Pseudocode for the pattern I use (Node.js; Python is analogous):

// server/ingestion/forward-to-ga4.js
import { z } from "zod";

const EVENT_SCHEMA = z.object({
  client_id: z.string().min(1),
  user_id: z.string().optional(),
  event_name: z.string(),
  event_params: z.record(z.unknown()),
  occurred_at: z.string().datetime(),
  consent: z.object({
    ad_user_data: z.enum(["GRANTED", "DENIED"]),
    ad_personalization: z.enum(["GRANTED", "DENIED"]),
  }),
});

const GA4_ECOMMERCE_EVENTS = new Set([
  "view_item_list",
  "select_item",
  "view_item",
  "add_to_cart",
  "view_cart",
  "remove_from_cart",
  "begin_checkout",
  "add_shipping_info",
  "add_payment_info",
  "purchase",
  "refund",
]);

export async function forwardToGa4(rawEvent) {
  const event = EVENT_SCHEMA.parse(rawEvent);

  // Skip events GA4 does not model in ecommerce
  if (!GA4_ECOMMERCE_EVENTS.has(event.event_name)) return;

  const payload = {
    client_id: event.client_id,
    ...(event.user_id && { user_id: event.user_id }),
    timestamp_micros: new Date(event.occurred_at).getTime() * 1000,
    consent: event.consent,
    events: [
      {
        name: event.event_name,
        params: {
          // defaults first, so event_params (including session_id) can override them
          engagement_time_msec: 1,
          ...event.event_params,
        },
      },
    ],
  };

  const url = new URL("https://www.google-analytics.com/mp/collect");
  url.searchParams.set("measurement_id", process.env.GA4_MEASUREMENT_ID);
  url.searchParams.set("api_secret", process.env.GA4_API_SECRET);

  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });

  if (!res.ok) {
    // MP returns 2xx even for invalid payloads; non-2xx means a transport or auth failure
    throw new Error(`GA4 MP failed: ${res.status}`);
  }
}

A few production details. Retries: Measurement Protocol does not have an idempotency key, so retries can cause duplicate events. Use the Cloud Tasks or SQS pattern and only retry on 5xx, never on 4xx. Rate limiting: GA4 will drop requests above 50 events per second per property; batch events in the events array (up to 25 per request) for high-volume spikes.
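The batching step can be sketched as a pure function: group queued events by client_id (one MP request carries a single client_id) and chunk each group at the 25-events-per-request limit. The function name and the queued-event shape are illustrative:

```javascript
// Sketch: turn a queue of { client_id, event } records into MP POST bodies.
// One request per client_id per chunk of 25 events.
function batchForMp(queued) {
  const byClient = new Map();
  for (const item of queued) {
    if (!byClient.has(item.client_id)) byClient.set(item.client_id, []);
    byClient.get(item.client_id).push(item.event);
  }

  const payloads = [];
  for (const [client_id, events] of byClient) {
    for (let i = 0; i < events.length; i += 25) {
      payloads.push({ client_id, events: events.slice(i, i + 25) });
    }
  }
  return payloads;
}
```

Each returned payload gets its own POST (and its own retry-on-5xx treatment), so one bad batch does not block the rest of the spike.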

Consent mode v2 is the legal gating layer. For events from users who declined consent, you have three options in order of preference:

Option 1: do not forward at all. Respect the user's denial. The warehouse still has the event for your own analysis; GA4 just does not get it. This is the most conservative and the one I default to.

Option 2: forward with consent DENIED. GA4 receives the event but treats it as anonymized. The event shows up in non-personalized reports only. This is what Google's own guidance suggests for modeling purposes.

Option 3: forward with consent DENIED only for users in gating regions. Non-EU users flow through with GRANTED; EU users flow through with DENIED. Requires geo-IP lookup at the event-forward layer. More work, but preserves optimization signal in non-EU markets.

Most DTC brands I work with land on Option 2 as the default with a per-region override. It is the safest middle ground, but run it by your DPO.
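That default-plus-override policy fits in a few lines. A sketch, assuming your ingestion layer already knows the user's consent state and a geo-IP country code; the region list and function name are illustrative:

```javascript
// Sketch of the consent gate: Option 2 (forward with DENIED) as the default,
// with a per-region override that drops the event entirely (Option 1).
const DROP_REGIONS = new Set(); // e.g. add "DE" to drop declined German traffic

function consentForForwarding(consentGranted, countryCode) {
  if (consentGranted) {
    return { ad_user_data: "GRANTED", ad_personalization: "GRANTED" };
  }
  if (DROP_REGIONS.has(countryCode)) return null; // null = do not forward
  return { ad_user_data: "DENIED", ad_personalization: "DENIED" };
}
```

The forwarder checks for null before building the payload; everything else flows through with the consent object attached.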

api_secret rotation

The api_secret is a GA4 Admin-level credential that authenticates your server to the property. If it leaks, any attacker can send events to your GA4 property and corrupt your data. Two defenses:

Rotate the api_secret quarterly. Google lets you have multiple active secrets per property; create a new one, deploy the new one, retire the old one. 15-minute maintenance window.

Scope the secret to a specific ingestion service. Do not share the same api_secret across a warehouse ingestion service, a Klaviyo webhook handler, and a custom landing-page form. Each gets its own, so a leak in one does not force rotation across all.

Store the secret in a secret manager (AWS Secrets Manager, Google Secret Manager, Doppler, 1Password), never in a repo and never as a plain environment variable in a shared config.

Verifying the setup works

Three checks I run on every new Measurement Protocol integration.

Debug endpoint test. Hit /debug/mp/collect with a sample event and read the JSON response. It will tell you which fields are missing or malformed. Do this before pointing at the production endpoint.

Realtime verification. Open GA4 → Reports → Realtime. Fire one event from your ingestion service. It should appear within 30 seconds. If it does not, check the api_secret, the measurement_id, and the consent fields.

24-hour reconciliation. After a day, compare GA4 purchase count to warehouse purchase count for the same window. GA4 should land 2 to 8 percent below the warehouse (some events dropped by Google's bot filter, some rejected for bad schema, some from pre-consent states). If the gap is above 15 percent or below 2 percent, something is off.
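The reconciliation check is easy to automate once you have both counts. A sketch using the thresholds above (2 to 8 percent healthy, 15 percent alert); the function name and status strings are illustrative:

```javascript
// Sketch: classify the GA4-vs-warehouse purchase-count gap.
// Thresholds are the heuristics from the reconciliation check above.
function reconcile(warehouseCount, ga4Count) {
  const gapPct = ((warehouseCount - ga4Count) / warehouseCount) * 100;
  if (gapPct > 15) return { gapPct, status: "alert: GA4 dropping too many events" };
  if (gapPct < 2) return { gapPct, status: "alert: possible duplicate events in GA4" };
  if (gapPct > 8) return { gapPct, status: "warn: above normal drop range" };
  return { gapPct, status: "ok" };
}
```

Run it daily against the previous day's window and page only on the alert statuses.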

Measurement Protocol is the protocol, not a tool. The hard part is the ingestion service around it, the consent layer in front of it, and the warehouse behind it.

FAQ

Do I still need the browser pixel if I have server-side GA4?

Yes, for now. GA4 still uses browser signals (client_id continuity, session cookies, etc.) to stitch cross-device journeys. The pattern that works is: browser pixel for session continuity, server-side for guaranteed event delivery. Dedupe the overlap deliberately (a shared transaction_id on purchases, or a shared event identifier you filter on downstream), not by killing one side.

What is the difference between GA4 MP and GTM server container?

GTM server container is a visual tag manager that sits between your client and GA4. It can use Measurement Protocol under the hood. MP is the API itself, which you call from your own ingestion code. GTM server is easier to stand up; MP is more controllable and fits a warehouse-first stack better.

How do I handle refunds?

Send a refund event with the same transaction_id as the original purchase; for a partial refund, include only the refunded items and quantities. Keep value positive, because the refund event name signals the direction. GA4 will subtract the refund from the original transaction's revenue in reports.
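As a sketch, a partial-refund payload for the purchase shown earlier might look like this (IDs and values are illustrative):

```json
{
  "client_id": "1847123456.1714089000",
  "events": [
    {
      "name": "refund",
      "params": {
        "transaction_id": "shopify-order-12345",
        "currency": "USD",
        "value": 49.00,
        "items": [
          { "item_id": "SKU-001", "item_name": "Product Title", "price": 49.00, "quantity": 1 }
        ]
      }
    }
  ]
}
```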

Can I use Measurement Protocol to backfill historical data?

Up to 72 hours, yes. Beyond 72 hours, GA4 rejects the event. For historical backfill, you cannot use MP. You can import via the Data Import feature in GA4 Admin, which accepts CSVs, but it is a different workflow.

What is the error rate and how do I monitor it?

Measurement Protocol returns 2xx for both success and validation failure; you have to use the debug endpoint to see errors. In production, monitor the ratio of events forwarded to events landing in GA4 (via a daily query against BigQuery Export if you have the paid tier, or a proxy count in your own warehouse). A 2-8 percent drop is normal. A 15+ percent drop means something is broken.

What to try this week

Hit the debug endpoint with one real event from your system. Here is a minimum curl:

curl -X POST "https://www.google-analytics.com/debug/mp/collect?measurement_id=$MID&api_secret=$SEC" \
  -H "Content-Type: application/json" \
  -d '{
    "client_id": "test-client-001",
    "events": [{
      "name": "purchase",
      "params": {
        "transaction_id": "test-order-001",
        "currency": "USD",
        "value": 49.00,
        "items": [{"item_id": "SKU-001", "item_name": "Test", "price": 49.00, "quantity": 1}]
      }
    }]
  }'

If it returns {"validationMessages": []}, your setup works. If it returns validation errors, fix them before pointing at production.

If you want the full warehouse-first rebuild done end-to-end rather than one protocol at a time, a DTC Stack Audit scopes the work for your specific stack and identifies which stage is the lowest-hanging rebuild. For the reconciliation step that follows this ingestion pattern, see reconciling Shopify, GA4, and Meta.
