The Google Ads interface is built for campaign management. It is not built for reporting. Switching between views, adjusting date ranges, pulling cross-campaign comparisons — it is doable but slow. And sending a client into the Google Ads interface to review performance is an invitation for them to touch things they should not touch.

Looker Studio gives you a way to present Google Ads data cleanly, in a format the client can read, without exposing the campaign controls.

Connecting the Google Ads Data Source

In Looker Studio, click Add Data and select the Google Ads connector. Authorize with the Google account that has access to the ad account. Select the specific Google Ads account — not the manager account level if you are using an MCC.

Once connected, the data source exposes Google Ads dimensions and metrics: campaigns, ad groups, keywords, impressions, clicks, cost, conversions, conversion value, and more.

One field to check immediately: Conversions vs. All Conversions. Google Ads tracks these separately. Conversions includes only the conversion actions you have marked as “primary.” All Conversions includes secondary actions and cross-device conversions. Make sure you are using the metric that matches what the client monitors in the Google Ads dashboard — otherwise your Looker Studio numbers will not match their interface and you will spend time explaining a discrepancy that is not actually a problem.
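The difference is easy to illustrate in a few lines of Python. The conversion actions and counts below are made-up examples, not real connector output:

```python
# Hypothetical conversion-action rows, mimicking how Google Ads
# categorizes actions as primary or secondary.
actions = [
    {"name": "Purchase",           "primary": True,  "count": 120},
    {"name": "Phone call",         "primary": True,  "count": 30},
    {"name": "Newsletter sign-up", "primary": False, "count": 85},
]

# "Conversions" counts only primary actions.
conversions = sum(a["count"] for a in actions if a["primary"])

# "All Conversions" counts every action, primary or secondary.
all_conversions = sum(a["count"] for a in actions)

print(conversions)      # 150
print(all_conversions)  # 235
```

If the client watches the Conversions column in their interface but your report charts All Conversions, the report will read roughly 57% high in this example, for no real reason.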

Page 1: Executive Summary

The first page should answer the questions a client asks every time they look at a report: How much did we spend? How many conversions did we get? What did each conversion cost?

Add four scorecards at the top: Cost, Conversions, Cost / Conversion (CPA), and Conversion Rate.

Enable comparison periods on each scorecard. Period-over-period comparison is the fastest way to answer “is performance improving?”
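The comparison math Looker Studio does behind those scorecards is a simple percentage change. A sketch with hypothetical last-30-days vs. previous-30-days figures:

```python
def pct_change(current, previous):
    """Period-over-period change as a percentage of the previous period."""
    if previous == 0:
        return None  # no baseline to compare against
    return (current - previous) / previous * 100

# Made-up figures for two consecutive 30-day windows.
this_period = {"cost": 5200.0, "conversions": 130}
last_period = {"cost": 4800.0, "conversions": 100}

cpa_now = this_period["cost"] / this_period["conversions"]   # 40.0
cpa_then = last_period["cost"] / last_period["conversions"]  # 48.0

print(round(pct_change(this_period["cost"], last_period["cost"]), 1))  # 8.3
print(round(pct_change(cpa_now, cpa_then), 1))                         # -16.7
```

Note how the two deltas answer "is performance improving?" together: spend is up 8.3%, but cost per conversion is down 16.7%, so the extra spend is paying off.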

Below the scorecards, add a time series chart showing daily Cost and Conversions on dual axes. This gives an immediate visual of whether spend and conversions are tracking together — a widening gap between cost and conversions is the earliest signal of a performance problem.
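The "widening gap" is just cost per conversion drifting upward day over day. A sketch of how you might flag it numerically, with made-up daily figures:

```python
# Hypothetical daily series; a rising cost-per-conversion is the
# numeric version of the widening gap on the dual-axis chart.
daily_cost =        [100, 110, 105, 130, 150, 170]
daily_conversions = [  5,   5,   5,   4,   4,   3]

daily_cpa = [c / v for c, v in zip(daily_cost, daily_conversions)]
# 20.0, 22.0, 21.0, 32.5, 37.5, ~56.67

# Flag days where CPA runs well above the period average.
avg_cpa = sum(daily_cpa) / len(daily_cpa)
flagged = [i for i, cpa in enumerate(daily_cpa) if cpa > 1.25 * avg_cpa]
print(flagged)  # [5]
```

On the dual-axis chart, day 5 is where the cost line keeps climbing while the conversions line flattens: that is the picture the client should learn to recognize.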

Add a date range control in the top right corner. Set the default to Last 30 days.

Page 2: Campaign Breakdown

The second page goes deeper. Add a table with Campaign, Cost, Clicks, Impressions, CTR, Conversions, and Cost / Conversion columns.

Enable heatmaps on the Cost and Conversions columns. Sort by Cost descending by default. This surfaces the campaigns spending the most at the top — where the client’s attention should be.
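The table's logic, reduced to code: sort by cost descending and derive CPA per row. The campaign names and figures below are made-up examples:

```python
# Hypothetical campaign rows, mirroring the table's default sort:
# Cost descending, so the biggest spenders surface first.
campaigns = [
    {"name": "Brand Search",   "cost": 1200.0, "conversions": 60},
    {"name": "Generic Search", "cost": 3400.0, "conversions": 40},
    {"name": "Remarketing",    "cost": 800.0,  "conversions": 25},
]

for row in sorted(campaigns, key=lambda r: r["cost"], reverse=True):
    cpa = row["cost"] / row["conversions"]
    print(f'{row["name"]}: cost={row["cost"]:.2f}, CPA={cpa:.2f}')
```

In this example the biggest spender (Generic Search) also has the highest CPA, which is exactly the pattern the cost-descending sort plus a heatmap is designed to surface.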

Add a bar chart showing Conversions by Campaign next to the table. Clients who scan dashboards visually rather than reading tables will get the campaign ranking immediately from the bar chart.

Page 3: Keyword Performance

This page is for deeper-dive sessions, not daily monitoring. Add a table with Keyword, Cost, Clicks, Conversions, and Cost / Conversion columns.

Add a dropdown filter for Campaign so the client can filter to a specific campaign’s keywords. Set row limit to 25 with pagination.

This page answers the question: which keywords are actually driving conversions, and which are spending without contributing?
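The second half of that question (spending without contributing) is a simple filter. A sketch with hypothetical keyword rows, using a made-up spend threshold to ignore trivial spenders:

```python
# Hypothetical keyword rows; the goal is to find keywords that
# spend real money but produce no conversions.
keywords = [
    {"keyword": "buy running shoes", "cost": 420.0, "conversions": 18},
    {"keyword": "shoes",             "cost": 310.0, "conversions": 0},
    {"keyword": "cheap sneakers",    "cost": 12.0,  "conversions": 0},
]

SPEND_THRESHOLD = 50.0  # assumed cutoff; ignore keywords with trivial spend

wasted = [
    k["keyword"]
    for k in keywords
    if k["conversions"] == 0 and k["cost"] >= SPEND_THRESHOLD
]
print(wasted)  # ['shoes']
```

The threshold matters: a zero-conversion keyword that spent twelve dollars is noise, while one that spent three hundred is an action item.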

Useful Filters and Controls

Date range control — essential on every page.

Campaign name filter — a text input or dropdown that filters all charts on the page to a single campaign.

Device type filter — lets you split performance between mobile, desktop, and tablet in a single click.

Network filter — separates Search network from Display network performance. These two networks have fundamentally different CPCs, CTRs, and conversion rates. Mixing them in the same chart without a filter can mask problems.
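How blending the networks masks problems is easiest to see with numbers. The impression and click counts below are made-up, but the order-of-magnitude CTR difference between Search and Display is typical:

```python
# Hypothetical rows split by network; blending them hides that
# Display's CTR is far below Search's.
rows = [
    {"network": "Search",  "impressions": 10_000,  "clicks": 500},
    {"network": "Display", "impressions": 200_000, "clicks": 400},
]

for r in rows:
    ctr = r["clicks"] / r["impressions"] * 100
    print(f'{r["network"]} CTR: {ctr:.2f}%')  # Search 5.00%, Display 0.20%

blended_ctr = (
    sum(r["clicks"] for r in rows)
    / sum(r["impressions"] for r in rows) * 100
)
print(f"Blended CTR: {blended_ctr:.2f}%")  # 0.43%
```

The blended 0.43% looks like a weak Search campaign when it is really a healthy Search campaign averaged down by high-volume Display impressions. The network filter lets the client see each number in its own context.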

Metrics to Track vs Metrics to Avoid

Not every Google Ads metric belongs on a client dashboard.

Include: Cost, Clicks, Impressions, CTR, Conversions, CPA, Conversion Rate, Conversion Value, ROAS.

Exclude from executive view: Impression Share (important but requires explanation), Quality Score (a keyword-level diagnostic that does not aggregate meaningfully across rows), Search Top IS (useful for reporting on specific search campaigns but confusing without context).

Quality Score and Impression Share belong on an agency-internal monitoring view, not the standard client report. They require explanation that derails the conversation away from results.
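For reference, the derived metrics in the include list reduce to simple ratios over the raw columns. The figures here are made-up examples:

```python
# Standard definitions of the derived metrics; figures are illustrative.
cost, clicks, impressions = 2500.0, 1250, 50_000
conversions, conversion_value = 50, 7500.0

ctr = clicks / impressions * 100              # click-through rate, in %
cpa = cost / conversions                      # cost per acquisition
conversion_rate = conversions / clicks * 100  # in %
roas = conversion_value / cost                # return on ad spend, as a multiple

print(ctr, cpa, conversion_rate, roas)  # 2.5 50.0 4.0 3.0
```

Looker Studio's Google Ads connector supplies most of these pre-computed, but knowing the formulas helps when a client asks why two numbers do not line up.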

Keeping Numbers Consistent With the Google Ads Interface

The most common client complaint about Looker Studio reports is that the numbers do not match what they see in Google Ads. This is almost always a configuration issue, not a data problem.

Check these settings first:

Date range alignment. Make sure the default date range in the report matches what the client typically views in Google Ads. If they check Last 7 days in the interface and your report defaults to Last 30 days, the numbers will look different.

Conversion column. As mentioned earlier, Conversions vs All Conversions. Verify which column the client monitors and match it.

Attribution model. Google Ads applies a default attribution model to conversion reporting. If the attribution model in your data source does not match the account’s attribution settings, conversion counts can differ. Check the account’s attribution model in the Google Ads Measurement settings.

Time zone. The Google Ads connector uses the account’s time zone. If you have data from multiple accounts in different time zones, date-level data may not align cleanly.

Document the field choices you made when building the report and leave a note in the report itself (a Text element in the header works) so anyone auditing later can see which columns were used and why.

Adnan Agic

Google Ads Strategist & Technical Marketing Expert with 5+ years experience managing $10M+ in ad spend across 100+ accounts.
