Marketing Analytics Glossary
Plain-language definitions for the terms you'll meet when running growth, paid media, or analytics. EN + RU, with practical context for the Kazakhstan market where it matters.
Acquisition
CAC
Customer Acquisition Cost
Total marketing + sales spend divided by new customers acquired in the same period.
Always specify the channel and the period. Blended CAC mixes paid and organic and hides which channel is actually working. Compare CAC to LTV: a healthy ratio is roughly 1:3 (LTV at least three times CAC). For Kazakhstan fintech, watch out for one-off bonuses — a 5,000 ₸ welcome cashback inflates first-purchase CAC but can pay back over 6 months.
Benchmarks
- LTV:CAC ≥ 3:1 (healthy SaaS / fintech)
- LTV:CAC = 1:1 (burning cash, no margin)
- Payback period: < 12 months for B2C, < 18 for B2B
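The arithmetic above reads directly as code. A minimal sketch; the spend, customer count and per-month gross profit below are made-up illustrative figures, not benchmarks:

```python
def cac(spend: float, new_customers: int) -> float:
    """Customer Acquisition Cost for one channel and one period."""
    return spend / new_customers

def payback_months(cac_value: float, monthly_gross_profit: float) -> float:
    """Months of gross profit needed to recover the acquisition cost."""
    return cac_value / monthly_gross_profit

# Hypothetical fintech channel: 1,200,000 ₸ spend, 400 new customers
channel_cac = cac(1_200_000, 400)          # 3,000 ₸ per customer
months = payback_months(channel_cac, 500)  # 6 months at 500 ₸/month gross profit
```

Run it per channel and per period; a blended version of the same calculation is exactly what hides the underperforming channel.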
LTV
Customer Lifetime Value
The total revenue (or gross profit) you expect from a customer over their lifetime.
Measured in revenue or gross profit — the second is usually more honest. Calculate as ARPU × gross margin × average customer lifespan, or run a cohort revenue curve and project. Don't use blended LTV for ad bidding decisions; segment by acquisition channel because retention differs sharply.
Benchmarks
- LTV:CAC = 3:1 (standard SaaS target)
- LTV:CAC > 5:1 (you're likely under-spending on growth)
- Use gross profit, not revenue (the more honest input)
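The simple-formula variant described above, as code; the ARPU, margin and lifespan values are hypothetical:

```python
def ltv(arpu: float, gross_margin: float, avg_lifespan_months: float) -> float:
    """LTV on a gross-profit basis: ARPU x margin x average lifespan."""
    return arpu * gross_margin * avg_lifespan_months

# e.g. 4,000 ₸ monthly ARPU, 60% gross margin, 18-month average lifespan
value = ltv(4_000, 0.60, 18)  # 43,200 ₸
ratio = value / 9_000         # LTV:CAC against an assumed 9,000 ₸ CAC -> 4.8
```

Compute this per acquisition channel, as the text suggests, since lifespan (retention) differs sharply between them.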
ROAS
Return on Ad Spend
Revenue generated by ads divided by ad spend.
A 4× ROAS means you earned ₸4 for every ₸1 spent. ROAS is a vanity number on its own — it doesn't account for COGS, returns, or fulfillment costs. Use POAS (Profit on Ad Spend) when shopping platforms underweight margin, or compare ROAS to your contribution-margin breakeven.
Benchmarks
- ≥ 4× (e-commerce target with a 25%+ margin)
- 2–4× (acceptable; watch margins)
- < 2× (likely losing money on the channel)
- Breakeven ROAS: 1 / contribution margin
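A sketch of the ROAS vs POAS distinction and the breakeven rule, with invented campaign numbers:

```python
def roas(revenue: float, ad_spend: float) -> float:
    """Return on Ad Spend: revenue per unit of spend."""
    return revenue / ad_spend

def poas(revenue: float, cogs: float, ad_spend: float) -> float:
    """Profit on Ad Spend: strips COGS so margin is not ignored."""
    return (revenue - cogs) / ad_spend

def breakeven_roas(contribution_margin: float) -> float:
    """ROAS below this value loses money: 1 / contribution margin."""
    return 1 / contribution_margin

r = roas(400_000, 100_000)            # 4.0x
p = poas(400_000, 280_000, 100_000)   # 1.2x -- same campaign, margin included
b = breakeven_roas(0.25)              # 4.0x needed at a 25% contribution margin
```

Note that the hypothetical 4× campaign here only just clears its breakeven once margin is counted, which is exactly the vanity-number trap the definition warns about.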
CPA
Cost per Acquisition
Average ad spend per single conversion (lead, install, purchase).
Closely related to CAC but usually narrower: CPA prices a specific conversion event, while CAC prices becoming a paying customer. In ad platform UIs, CPA usually appears as "Cost / Conversion." Always pair the number with its conversion definition: a "lead" can mean a form-fill or a qualified call.
Benchmarks
- Lead-gen B2C (KZ): 500–3,000 ₸ per lead
- Mobile install (utility): $1–4
- Mobile install (fintech): $5–25
- Healthy threshold: CPA ÷ expected close rate ≤ unit margin
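The healthy-threshold check amounts to converting a per-lead CPA into the cost of a paying customer. The lead price, close rate and unit margin below are assumptions for illustration:

```python
def cpa_to_cac(cpa: float, close_rate: float) -> float:
    """Cost per paying customer implied by a per-lead CPA."""
    return cpa / close_rate

# 2,000 ₸ per lead and an assumed 20% lead-to-customer close rate
implied_cac = cpa_to_cac(2_000, 0.20)  # 10,000 ₸ per paying customer
healthy = implied_cac <= 12_000        # compared against an assumed 12,000 ₸ unit margin
```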
CTR
Click-Through Rate
Clicks divided by impressions.
Most useful for comparing creatives within the same audience. Industry CTR benchmarks are noisy — your historical baseline matters more. A sudden CTR drop with stable spend usually means audience saturation; refresh creative before bidding more.
Benchmarks
- Google Search · brand: 6–15%
- Google Search · non-brand: 2–4%
- Google Display / GDN: 0.4–0.8%
- Meta · paid social: 0.9–1.6%
- TikTok Ads: 1.0–2.5%
- Email campaigns: 2–5%
CPM
Cost per Mille (per 1,000 impressions)
How much you pay for one thousand ad impressions.
A signal of audience demand and creative quality (algorithms reward higher engagement with lower CPM). Programmatic and YouTube use CPM bidding by default. Watch for CPM inflation in Q4 (Black Friday → New Year) and during election cycles in markets with political ad budgets.
Benchmarks
- Meta · Kazakhstan: $0.50–2.50
- Meta · global: $5–15
- YouTube In-Stream: $5–20
- Programmatic display: $1–5
- Q4 inflation: +20–40% from Black Friday to New Year
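CPM math in two small helpers; the spend and budget figures are placeholders:

```python
def cpm(spend: float, impressions: int) -> float:
    """Cost per 1,000 impressions."""
    return spend / impressions * 1_000

def impressions_for_budget(budget: float, cpm_rate: float) -> int:
    """How much reach a budget buys at a given CPM."""
    return int(budget / cpm_rate * 1_000)

rate = cpm(150, 100_000)                 # roughly $1.50, inside the Meta-KZ range
reach = impressions_for_budget(500, 1.5) # a $500 budget buys about 333,000 impressions
```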
Retention
Retention
Customer retention rate
Share of customers from a cohort still active in a later period.
Measured by cohort: out of users who signed up in March, how many were still active in April, May, June. Critical because LTV ≈ ARPU × retention. Define "active" by the action that matters for your business — login is not the same as a paid transaction.
Benchmarks
- Mobile app · Day 1: 25–35%
- Mobile app · Day 7: 10–15%
- Mobile app · Day 30: 4–7%
- B2C SaaS · Month 1: 60–75%
- E-commerce · 12-month repeat: 20–40%
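The cohort measurement described above, sketched with an invented March cohort; "active" here is defined as a paid transaction, as the text recommends:

```python
def retention_rate(cohort_size: int, active_later: int) -> float:
    """Share of a signup cohort still active in a later period."""
    return active_later / cohort_size

# March signup cohort of 1,000 users, "active" = made a paid transaction
d1 = retention_rate(1_000, 300)   # 0.30 -- inside the 25-35% Day-1 band
d30 = retention_rate(1_000, 50)   # 0.05 -- inside the 4-7% Day-30 band
```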
Churn
Customer churn rate
The complement of retention: the share of customers who left in a period.
Tracked monthly for SaaS/subscription, weekly for marketplaces. Differentiate voluntary churn (canceled) from involuntary (failed payment) — the latter is fixable with dunning automation. Churn under 5% monthly is decent for B2C subscription, under 1% is excellent.
Benchmarks
- B2B SaaS · monthly: < 1% excellent, 1–2% normal
- B2C subscription · monthly: < 5% decent, 5–7% high
- Involuntary share: 20–40% of total; fixable with dunning
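The voluntary/involuntary split, sketched with made-up subscriber numbers:

```python
def churn_rate(customers_start: int, churned: int) -> float:
    """Share of customers lost over one period."""
    return churned / customers_start

# 10,000 subscribers at month start; 420 cancelled, 180 failed payments
voluntary = churn_rate(10_000, 420)    # 4.2%
involuntary = churn_rate(10_000, 180)  # 1.8%
total = voluntary + involuntary        # ~6% -- high for B2C; dunning targets the 1.8%
```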
Cohort
Cohort analysis
A group of users who share a starting event (signup week, first purchase month).
The single most valuable view in product analytics. A cohort retention table reveals whether changes you ship actually improve long-term behavior, separate from acquisition mix shifts. GA4, Amplitude and Mixpanel all support cohort tables out of the box.
ARPU
Average Revenue Per User
Total revenue divided by active users in a period.
Use monthly ARPU for SaaS and weekly ARPU for high-frequency apps. ARPU rising with a stable user base means monetization is working; ARPU rising because heavy churn dropped low-value users is a false positive, so always pair it with retention.
MAU / DAU
Monthly / Daily Active Users
Distinct users active in the last 30 days (MAU) or in the last day (DAU).
DAU/MAU ratio is "stickiness" — how often the typical monthly user comes back. 50%+ is excellent (used roughly every other day), 20% is solid for niche tools, under 10% means a notification or content problem.
Benchmarks
- Daily-use products (chat, social): 50%+
- Weekly-use products (SaaS, fintech): 20–35%
- Niche tools: 10–20% acceptable
- Below 10%: notification or content problem
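Stickiness as code, with hypothetical DAU/MAU counts:

```python
def stickiness(dau: int, mau: int) -> float:
    """DAU/MAU ratio: how often the typical monthly user comes back."""
    return dau / mau

ratio = stickiness(180_000, 600_000)  # 0.30 -- solid weekly-use territory
```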
Product
Aha moment
Aha moment / activation
The first time a user experiences the product's core value.
Quantify it: Facebook's was "7 friends in 10 days," Slack's was "2,000 messages sent." Find yours by comparing retained vs churned cohorts and looking for the threshold. Then redesign onboarding to push every new user toward that exact action.
PMF
Product-Market Fit
When the product reliably solves a real problem for a defined market.
Sean Ellis's test: ask users "How would you feel if you couldn't use this product?" — 40%+ saying "very disappointed" is the threshold. Without PMF, growth tactics paper over the gap; with PMF, retention curves flatten and word-of-mouth becomes a real channel.
North Star metric
North Star metric
The single output metric that best predicts long-term value creation.
Should reflect customer value (not just revenue) and be measurable at high frequency. Airbnb: "nights booked." Spotify: "time spent listening." Pick one — having three is having none.
AAARRR
Pirate metrics framework
Awareness · Acquisition · Activation · Retention · Revenue · Referral.
Dave McClure's framework for breaking the funnel into stages, each with its own metrics. Useful for diagnosing where growth is stuck: most "marketing" problems are actually activation or retention problems further down the funnel.
Analytics
GA4
Google Analytics 4
Google's event-based analytics product, which replaced Universal Analytics in 2023.
Every interaction is an event with parameters; sessions are derived, not primary. Free tier processes up to 10M events/month, has a 14-month default lookback, and ships with native BigQuery export — use it. Server-side measurement via Measurement Protocol or sGTM closes gaps from ad blockers.
GTM
Google Tag Manager
A tag management system that lets marketers deploy tracking changes without involving engineering each time.
Container holds tags, triggers and variables. Three tag types you'll use 95% of the time: GA4 Event, Meta Pixel, custom HTML. Always work in Preview mode and version everything — every Publish creates a restorable version.
dataLayer
Data layer
A JavaScript array your site pushes events into so GTM can listen.
The single most important contract in any tracking setup. Define the schema before writing tags: which events fire, what parameters they carry, naming conventions (snake_case is standard). A messy dataLayer is the #1 cause of "GTM stopped working."
Event Taxonomy
Event taxonomy / tracking plan
A documented list of every event the product emits, with parameters and triggers.
Lives in a Google Sheet or Notion table that engineering, analytics and marketing all reference. Columns: event name, trigger condition, parameters (type, example), destination (GA4, Meta, internal). Without one, you ship inconsistent events that break reports six months later.
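One way to make the plan enforceable is to validate events against it in code before they reach GA4 or Meta. A minimal sketch; the event names and required parameters below are an illustrative schema, not any platform's canonical taxonomy:

```python
# Illustrative tracking plan: event name -> required parameters
TRACKING_PLAN = {
    "sign_up": {"method"},
    "purchase": {"value", "currency", "transaction_id"},
    "begin_checkout": {"value", "currency"},
}

def validate_event(name: str, params: dict) -> list[str]:
    """Return a list of problems; an empty list means the event matches the plan."""
    if name not in TRACKING_PLAN:
        return [f"unknown event: {name}"]
    missing = TRACKING_PLAN[name] - params.keys()
    return [f"missing param: {p}" for p in sorted(missing)]

problems = validate_event("purchase", {"value": 25_000, "currency": "KZT"})
# -> ["missing param: transaction_id"]
```

Wiring a check like this into CI or the sGTM layer catches the inconsistent events before they break reports six months later.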
UTM
UTM parameters
Five URL parameters (source, medium, campaign, term, content) that label traffic.
GA4 reads them as session-level dimensions. Lowercase, no spaces, kebab-case — and never tag internal links. Common mistake: tagging the same campaign as `meta` from one team and `facebook` from another, splitting the report into two ghosts.
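The tagging conventions (lowercase, no spaces, kebab-case) can be enforced with a small helper so two teams cannot produce `meta` and `facebook` ghosts by hand; the URL and campaign names below are invented:

```python
from urllib.parse import urlencode

def add_utm(url: str, source: str, medium: str, campaign: str,
            term: str = "", content: str = "") -> str:
    """Append UTM parameters, normalized to lowercase kebab-case."""
    norm = lambda v: v.strip().lower().replace(" ", "-")
    params = {
        "utm_source": norm(source),
        "utm_medium": norm(medium),
        "utm_campaign": norm(campaign),
    }
    if term:
        params["utm_term"] = norm(term)
    if content:
        params["utm_content"] = norm(content)
    return f"{url}?{urlencode(params)}"

tagged = add_utm("https://example.kz/offer", "Meta", "paid-social", "Spring Sale")
# -> "https://example.kz/offer?utm_source=meta&utm_medium=paid-social&utm_campaign=spring-sale"
```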
CR
Conversion Rate
Share of visitors who complete a target action.
Always specify the numerator (which conversion) and the denominator (sessions, users, or qualified visitors). E-commerce industry CR averages mask huge segment differences — first-time vs returning, mobile vs desktop, paid vs organic. A useful baseline is your own moving 90-day average; meaningful improvement is a 20%+ relative lift sustained across two cohorts.
Benchmarks
- E-commerce · global average: 1.5–3.5%
- E-commerce · top quartile: 5–7%
- B2B SaaS lead form: 2–4%
- Landing page → trial: 5–10%
- Mobile vs desktop gap: mobile typically 30–50% lower
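Both the metric and the relative-lift yardstick mentioned above, with placeholder traffic numbers:

```python
def conversion_rate(conversions: int, sessions: int) -> float:
    """Name the numerator and denominator explicitly; here: purchases / sessions."""
    return conversions / sessions

def relative_lift(cr_new: float, cr_baseline: float) -> float:
    """Relative improvement against your own 90-day baseline."""
    return cr_new / cr_baseline - 1

cr = conversion_rate(420, 15_000)   # 2.8%, inside the global e-commerce band
lift = relative_lift(0.034, 0.028)  # roughly +21% relative lift
```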
AOV
Average Order Value
Total revenue divided by number of orders.
Easier to grow than CR for many businesses. Levers: bundles, free-shipping thresholds, tiered cashback, post-purchase upsells. Watch median alongside mean — a long tail of large B2B orders can hide that the typical order is shrinking.
Benchmarks
- KZ e-commerce · electronics: 40,000–120,000 ₸
- KZ e-commerce · apparel: 15,000–35,000 ₸
- Free-shipping threshold rule: set at 1.3× current AOV
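The mean-vs-median point with a toy order list; applying the 1.3× free-shipping rule to the median rather than the mean is a judgment call here, made because the outlier distorts the mean:

```python
from statistics import mean, median

orders = [12_000, 14_000, 15_000, 16_000, 450_000]  # one large B2B order in the tail
aov_mean = mean(orders)      # 101,400 ₸ -- inflated by the outlier
aov_median = median(orders)  # 15,000 ₸ -- the typical order
free_shipping_threshold = round(aov_median * 1.3)  # 19,500 ₸ by the 1.3x rule
```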
Attribution
Attribution
Attribution model
A rule for assigning credit for a conversion across multiple touchpoints.
Last-click is the common default and the worst: it gives all credit to the channel closest to purchase, usually branded search. Data-driven attribution uses ML to weight touches and is now the default model in GA4. For DIY, position-based (40/20/40) is a reasonable middle ground.
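A sketch of position-based (40/20/40) credit assignment; the channel names are examples:

```python
def position_based(touchpoints: list[str], first=0.4, middle=0.2, last=0.4) -> dict[str, float]:
    """40/20/40: first and last touches get 40% each, the middle splits 20%.
    Repeated channels accumulate credit in one entry."""
    if len(touchpoints) == 1:
        return {touchpoints[0]: 1.0}
    if len(touchpoints) == 2:
        return {touchpoints[0]: 0.5, touchpoints[1]: 0.5}
    credit = {t: 0.0 for t in touchpoints}
    credit[touchpoints[0]] += first
    credit[touchpoints[-1]] += last
    mids = touchpoints[1:-1]
    for t in mids:
        credit[t] += middle / len(mids)
    return credit

position_based(["tiktok", "meta", "google-brand"])
# -> {"tiktok": 0.4, "meta": 0.2, "google-brand": 0.4}
```

Note how branded search at the end of the journey gets 40% here instead of the 100% last-click would hand it.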
Last-click
Last-click attribution
Gives 100% credit for a conversion to the most recent paid touchpoint.
Easy to compute and dangerous to use for budgeting. Brand search and direct often dominate the report despite contributing little to net new demand. Always run a holdback experiment for any "high-ROAS" channel before doubling its budget.
MTA
Multi-touch attribution
Attribution that distributes credit across all touchpoints in a journey.
Linear, time-decay, position-based and data-driven are the common variants. MTA depends on cookies — which iOS ATT and 3rd-party cookie deprecation have crippled. Pair with MMM (Marketing Mix Modeling) for the budget-allocation question.
CAPI
Conversions API (Meta)
Server-side event sending from your backend directly to Meta, bypassing the browser.
Recovers signal lost to ad blockers, iOS ATT and consent rejections. Send the same event from both pixel and CAPI with a deduplication ID; Meta merges them. Without CAPI, your Meta cost per conversion looks 30–50% worse than reality on iOS.
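A sketch of the deduplication mechanics. The field names follow the general shape of Meta's Conversions API payload, but treat this as illustrative rather than a drop-in client; the order ID and email hash are placeholders:

```python
import time

def capi_event(event_name: str, event_id: str, hashed_email: str) -> dict:
    """Build a server-side event carrying the same event_id the browser pixel sent,
    so the platform can merge the two copies instead of double-counting."""
    return {
        "event_name": event_name,
        "event_time": int(time.time()),
        "event_id": event_id,  # the browser pixel sends the same ID as eventID
        "action_source": "website",
        "user_data": {"em": [hashed_email]},
    }

# Pixel fired Purchase with eventID "ord-1042"; the server sends the same ID,
# and Meta deduplicates on the (event_name, event_id) pair:
payload = capi_event("Purchase", "ord-1042", "<sha256-of-lowercased-email>")
```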
Technical
S2S
Server-to-server tracking
Conversions sent from your backend (CRM, payment) to ad platforms via API.
Used for offline conversions (loan approved, contract signed) and to bypass browser-side gaps. Google Ads has Offline Conversion Imports + Enhanced Conversions for Leads; Meta has CAPI; TikTok has Events API. For fintech KZ, S2S is essential — most loan approvals happen days after the click.
sGTM
Server-side Google Tag Manager
A self-hosted GTM container that runs on your servers, intercepting events before they reach platforms.
Lets you enrich, redact, deduplicate and route events server-side. Runs on Google App Engine, Cloud Run, or self-hosted Docker. Initial setup is fiddly but pays back: faster page loads, first-party cookies, recovered iOS signal. Budget $30–100/month for App Engine.
Pixel
Tracking pixel
A tiny script (often an invisible 1×1 image) that an ad platform places on your site to collect events.
Meta Pixel, TikTok Pixel, LinkedIn Insight Tag and Yandex Metrika are all variations. Each platform has its own naming for events ("Purchase" vs "purchase" vs "transaction") — getting them wrong silently breaks ad optimization. Test in the platform's native debugger before going live.
First-party data
First-party data
Data you collect directly from your customers — email, behavior, purchases.
The strategic asset of the post-cookie era. Build a Customer Data Platform or at minimum a unified customer table in BigQuery. Hashed first-party emails feed Customer Match (Google) and Custom Audiences (Meta) for cookieless retargeting.
iOS ATT
App Tracking Transparency
Apple's 2021 prompt asking users to opt into IDFA tracking. Most say no.
Opt-in rates hover around 25%. For iOS apps relying on Meta or TikTok ads, the post-ATT world means SKAdNetwork (24-hour conversion windows, aggregated data) and the Conversions API for App Events are now mandatory. Web tracking in iOS Safari is similarly hobbled by ITP; server-side becomes the only reliable path.
CWV
Core Web Vitals
Google's page-experience metrics: LCP, INP, CLS.
LCP (Largest Contentful Paint) under 2.5s, INP (Interaction to Next Paint) under 200ms, CLS (Cumulative Layout Shift) under 0.1. These influence Google rankings for both desktop and mobile and correlate with conversion rate. Defer GTM, lazy-load images and self-host fonts as the three biggest wins.
Benchmarks
- LCP: < 2.5 s good · 2.5–4.0 s needs work · > 4.0 s poor
- INP: < 200 ms good · 200–500 ms needs work · > 500 ms poor
- CLS: < 0.1 good · 0.1–0.25 needs work · > 0.25 poor
- Mobile pass rate (Goodlabs target): ≥ 75% of sessions in "good" buckets
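The three thresholds fold into one classifier; treating a value exactly at the boundary as "good" is a minor judgment call in this sketch:

```python
def cwv_bucket(metric: str, value: float) -> str:
    """Classify a measurement into good / needs-improvement / poor buckets
    using the LCP (s), INP (ms) and CLS thresholds listed above."""
    thresholds = {"LCP": (2.5, 4.0), "INP": (200, 500), "CLS": (0.1, 0.25)}
    good, poor = thresholds[metric]
    if value <= good:
        return "good"
    return "needs improvement" if value <= poor else "poor"

cwv_bucket("LCP", 2.1)  # "good"
cwv_bucket("INP", 350)  # "needs improvement"
cwv_bucket("CLS", 0.3)  # "poor"
```

Running this over field data per session is one way to compute the mobile pass rate in the last benchmark.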