Complete guide to GA4 BigQuery export including setup, schema documentation, SQL query patterns, and data analysis. Use when exporting GA4 data to BigQuery, writing SQL queries for GA4 data, analyzing event-level data, working with nested/repeated fields (UNNEST), or building custom reports from raw data. Covers BigQuery linking, events_* tables, SQL patterns, and performance optimization.
Installation:

```
/plugin marketplace add henkisdabro/wookstar-claude-code-plugins
/plugin install ga-suite@wookstar-claude-code-plugins
```

This skill inherits all available tools. When active, it can use any tool Claude has access to.
Bundled reference: references/sql-query-cookbook.md

GA4 BigQuery export provides raw, event-level data access for advanced analysis, custom reporting, machine learning, and long-term data warehousing.
Invoke this skill when:

- Setting up or troubleshooting the GA4-to-BigQuery link
- Writing SQL queries against GA4 event-level data
- Working with nested/repeated fields (UNNEST)
- Building custom reports or analyses from raw export data
Requirements:

- A GA4 property with Editor access (or higher)
- A Google Cloud project; the daily export works on the free tier, but the streaming export requires billing to be enabled
Setup Steps:

1. Create Google Cloud Project: In the Google Cloud console, create or select a project and note its project ID. Enable the BigQuery API if it is not already enabled.
2. Link GA4 to BigQuery: In GA4, go to Admin > Product Links > BigQuery Links, click Link, select the project, and choose a data location.
3. Export Options: Choose Daily (one complete events_YYYYMMDD table per day), Streaming (a continuously updated events_intraday_YYYYMMDD table), or both.
4. Data Availability: Daily tables typically finish loading within hours after the property's day closes; intraday data appears within minutes but is replaced once the corresponding daily table lands.
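Once the link is active, you can confirm tables are arriving by querying the dataset's INFORMATION_SCHEMA (a sketch; `project.dataset` is a placeholder for your own project and dataset names):

```sql
-- List exported GA4 tables, newest first
SELECT
  table_name,
  creation_time
FROM
  `project.dataset.INFORMATION_SCHEMA.TABLES`
WHERE
  table_name LIKE 'events_%'
ORDER BY
  table_name DESC
```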
Table Naming:

- project.dataset.events_YYYYMMDD - Daily export
- project.dataset.events_intraday_YYYYMMDD - Intraday (partial day)
- project.dataset.events_* - Wildcard for all dates

Key Schema Fields:
Event Fields:
- event_date: YYYYMMDD format (STRING)
- event_timestamp: Microseconds since epoch (INTEGER)
- event_name: Event name (STRING)
- event_params: Event parameters (RECORD, REPEATED)
- event_value_in_usd: Event value in USD (FLOAT)

User Fields:
- user_id: User ID if set (STRING)
- user_pseudo_id: Anonymous user ID (STRING)
- user_properties: User properties (RECORD, REPEATED)
- user_first_touch_timestamp: First visit timestamp (INTEGER)

Device Fields:
- device.category: desktop, mobile, tablet
- device.operating_system: Windows, iOS, Android
- device.browser: Chrome, Safari, etc.

Geo Fields:
- geo.country: Country name
- geo.region: State/region
- geo.city: City name

Traffic Source Fields:
- traffic_source.source: Source (google, direct)
- traffic_source.medium: Medium (organic, cpc)
- traffic_source.name: Campaign name

E-commerce Fields:
- ecommerce.transaction_id: Transaction ID (STRING)
- ecommerce.purchase_revenue_in_usd: Purchase revenue (FLOAT)
- items: Items array (RECORD, REPEATED)

Event Counts by Name:

```sql
SELECT
  event_name,
  COUNT(*) AS event_count
FROM
  `project.dataset.events_*`
WHERE
  _TABLE_SUFFIX BETWEEN '20250101' AND '20250131'
GROUP BY
  event_name
ORDER BY
  event_count DESC
```
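Since event_timestamp is stored as microseconds since epoch, convert it with TIMESTAMP_MICROS() when you need readable timestamps (a sketch against the same placeholder table; the America/New_York zone is just an example):

```sql
SELECT
  event_name,
  TIMESTAMP_MICROS(event_timestamp) AS event_ts,  -- UTC timestamp
  DATETIME(TIMESTAMP_MICROS(event_timestamp), 'America/New_York') AS event_ts_local
FROM
  `project.dataset.events_*`
WHERE
  _TABLE_SUFFIX = '20250115'
LIMIT 100
```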
Page Views with Location and Title:

```sql
SELECT
  event_date,
  event_name,
  user_pseudo_id,
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location') AS page_location,
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_title') AS page_title
FROM
  `project.dataset.events_*`
WHERE
  _TABLE_SUFFIX BETWEEN '20250101' AND '20250131'
  AND event_name = 'page_view'
LIMIT 1000
```
Daily Purchase Revenue:

```sql
SELECT
  event_date,
  COUNT(DISTINCT user_pseudo_id) AS purchasers,
  COUNT(DISTINCT ecommerce.transaction_id) AS transactions,
  SUM(ecommerce.purchase_revenue_in_usd) AS total_revenue,
  AVG(ecommerce.purchase_revenue_in_usd) AS avg_order_value
FROM
  `project.dataset.events_*`
WHERE
  _TABLE_SUFFIX BETWEEN '20250101' AND '20250131'
  AND event_name = 'purchase'
  AND ecommerce.transaction_id IS NOT NULL
GROUP BY
  event_date
ORDER BY
  event_date
```
Item Performance:

```sql
SELECT
  event_date,
  item.item_name,
  item.item_category,
  SUM(item.quantity) AS total_quantity,
  SUM(item.item_revenue_in_usd) AS total_revenue
FROM
  `project.dataset.events_*`,
  UNNEST(items) AS item
WHERE
  _TABLE_SUFFIX BETWEEN '20250101' AND '20250131'
  AND event_name = 'purchase'
GROUP BY
  event_date,
  item.item_name,
  item.item_category
ORDER BY
  total_revenue DESC
```
User Event Journeys:

```sql
WITH user_events AS (
  SELECT
    user_pseudo_id,
    event_timestamp,
    event_name,
    (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location') AS page_location
  FROM
    `project.dataset.events_*`
  WHERE
    _TABLE_SUFFIX = '20250115'
)
SELECT
  user_pseudo_id,
  ARRAY_AGG(
    STRUCT(event_name, page_location, event_timestamp)
    ORDER BY event_timestamp
  ) AS event_sequence
FROM
  user_events
GROUP BY
  user_pseudo_id
LIMIT 100
```
Sessions by Traffic Source:

```sql
SELECT
  event_date,
  traffic_source.source,
  traffic_source.medium,
  traffic_source.name AS campaign,
  COUNT(DISTINCT user_pseudo_id) AS users,
  -- ga_session_id is an INT64 param, so cast it before concatenating
  COUNT(DISTINCT CONCAT(user_pseudo_id, '-', CAST(
    (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 'ga_session_id') AS STRING))) AS sessions
FROM
  `project.dataset.events_*`
WHERE
  _TABLE_SUFFIX BETWEEN '20250101' AND '20250131'
GROUP BY
  event_date,
  traffic_source.source,
  traffic_source.medium,
  traffic_source.name
ORDER BY
  sessions DESC
```

Note: the traffic_source record holds the user's first-touch acquisition source, not the per-session source.
Reusable Parameter-Extraction Functions:

```sql
-- Create reusable functions for parameter extraction
CREATE TEMP FUNCTION GetParamString(params ANY TYPE, target_key STRING)
RETURNS STRING
AS (
  (SELECT value.string_value FROM UNNEST(params) WHERE key = target_key)
);

CREATE TEMP FUNCTION GetParamInt(params ANY TYPE, target_key STRING)
RETURNS INT64
AS (
  (SELECT value.int_value FROM UNNEST(params) WHERE key = target_key)
);

-- Use in query
SELECT
  event_date,
  GetParamString(event_params, 'page_location') AS page_location,
  GetParamInt(event_params, 'engagement_time_msec') AS engagement_time
FROM
  `project.dataset.events_*`
WHERE
  _TABLE_SUFFIX BETWEEN '20250101' AND '20250131'
```
Best Practices:

1. Filter on _TABLE_SUFFIX, not event_date - only _TABLE_SUFFIX limits which daily tables BigQuery scans (and bills for):

```sql
WHERE _TABLE_SUFFIX BETWEEN '20250101' AND '20250131'
```

NOT:

```sql
WHERE event_date BETWEEN '20250101' AND '20250131'
```

2. Filter early on event_name and event_timestamp:

```sql
WHERE event_name IN ('page_view', 'purchase')
```

3. Select only the columns you need:

```sql
SELECT event_name, user_pseudo_id, event_timestamp
```

NOT:

```sql
SELECT *
```

4. Extract single parameters with an inline UNNEST subquery; a full UNNEST in FROM multiplies the row count by the number of parameters per event:

```sql
-- Good: inline UNNEST
(SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location')

-- Avoid: full UNNEST in FROM
FROM table, UNNEST(event_params) AS param
WHERE param.key = 'page_location'
```

5. Use LIMIT while developing (LIMIT caps returned rows but does not reduce bytes scanned):

```sql
LIMIT 1000 -- Test query first
```
BigQuery Pricing:

- On-demand queries are billed per TB of data scanned (roughly $6.25/TB at the time of writing; check current pricing)
- Storage is billed per GB per month, with a discounted long-term rate for tables untouched for 90+ days

Reducing Costs:

- Filter on _TABLE_SUFFIX so only the needed daily tables are scanned
- Select specific columns instead of SELECT *
- Use the query validator or a dry run to estimate bytes scanned before executing
- Materialize frequently repeated aggregations into summary tables

Free Tier:

- 10 GB of storage and 1 TB of query processing per month at no charge
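To see what recent queries actually scanned, the INFORMATION_SCHEMA jobs views can be queried directly (a hedged sketch: the `region-us` qualifier and the ~$6.25/TB rate are assumptions; adjust both to your project):

```sql
-- Bytes processed by your recent jobs, most expensive first
SELECT
  job_id,
  creation_time,
  total_bytes_processed,
  ROUND(total_bytes_processed / POW(2, 40) * 6.25, 4) AS approx_cost_usd  -- assumes on-demand pricing
FROM
  `region-us`.INFORMATION_SCHEMA.JOBS_BY_USER
WHERE
  creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
ORDER BY
  total_bytes_processed DESC
LIMIT 20
```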
GA4 Export Retention:

- Tables exported to BigQuery are under your control and are not affected by GA4's property data-retention settings
- Set a table or dataset expiration if you want old daily tables cleaned up automatically
Setting Expiration:

```sql
ALTER TABLE `project.dataset.events_20250101`
SET OPTIONS (
  expiration_timestamp = TIMESTAMP "2026-01-01 00:00:00 UTC"
);
```
1. Unsampled Reporting: Query complete raw data without the sampling the GA4 UI applies to large date ranges.
2. Custom Attribution: Build your own attribution models from event-level source/medium data.
3. Data Integration: Join GA4 events with CRM, ad-platform, or backend data.
4. Machine Learning: Feed event-level data into BigQuery ML or external models (e.g., churn or LTV prediction).
5. Long-term Analysis: Keep data beyond GA4's retention limits for year-over-year analysis.
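As a sketch of the machine-learning use case, BigQuery ML can train directly on export data. The model name and feature/label choices below are hypothetical, not part of this skill:

```sql
-- Hypothetical purchase-propensity model: per user, did any event in the window include a purchase?
CREATE OR REPLACE MODEL `project.dataset.purchase_propensity`
OPTIONS (model_type = 'logistic_reg') AS
SELECT
  device.category AS device_category,
  geo.country AS country,
  traffic_source.medium AS medium,
  MAX(IF(event_name = 'purchase', 1, 0)) AS label
FROM
  `project.dataset.events_*`
WHERE
  _TABLE_SUFFIX BETWEEN '20250101' AND '20250131'
GROUP BY
  user_pseudo_id, device_category, country, medium
```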
Table Names:

- events_YYYYMMDD
- events_intraday_YYYYMMDD
- events_*

Filter by Date:

```sql
WHERE _TABLE_SUFFIX BETWEEN '20250101' AND '20250131'
```

Extract Parameter:

```sql
(SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'param_name')
```

UNNEST Items:

```sql
FROM table, UNNEST(items) AS item
```

Costs:

- Queries billed per TB scanned, storage per GB/month; the free tier covers 10 GB of storage and 1 TB of queries monthly