Complete Forge analytics guide
Build a complete analytics infrastructure with a queue system
Working Example Available
A complete, production-ready sample of this guide's implementation is available in our GitHub repo.
View the full Forge Analytics Example on GitHub
What's covered
- Queue-based analytics infrastructure for reliable event delivery
- Privacy-compliant backend processing (no End User Data transmission)
- Modular system with dispatcher, consumer, and event definitions
- Frontend integration that routes through backend resolvers
- Debug mode and production deployment patterns
Prerequisites
- Working Forge app
- Analytics provider account (Accoil, Segment, etc.)
- API key from your analytics provider
Step 1: Manifest Configuration
Update your manifest.yml with the complete analytics infrastructure:
modules:
  # Queue consumer for processing analytics events
  consumer:
    - key: analytics-consumer
      queue: analytics-queue
      resolver:
        function: analytics-consumer-func
        method: analytics-listener
  function:
    - key: analytics-consumer-func
      handler: analytics/consumer.handler
permissions:
  external:
    fetch:
      backend:
        - address: "in.accoil.com"
          category: analytics
          inScopeEUD: false
Key Components:
- Consumer: Processes events from the analytics queue
- Queue: analytics-queue provides reliable event delivery (see the note below)
- External Permissions: Backend-only analytics egress
- inScopeEUD: false ensures no End User Data is transmitted
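The queue key declared in the consumer module must match the key used when constructing the Queue in your backend code (covered in Step 5); a quick preview:

import { Queue } from '@forge/events';

// The key here must match the consumer's `queue` entry in manifest.yml
const analyticsQueue = new Queue({ key: 'analytics-queue' });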
Step 2: Environment Variables
Set up secure configuration:
# Required: API key for your analytics provider
forge variables set --environment development ANALYTICS_API_KEY your_dev_key
# Optional: Debug mode for development (logs instead of sending)
forge variables set --environment development ANALYTICS_DEBUG true
# Optional: Cost optimization - use cloudId as userId
forge variables set --environment development ANALYTICS_USER_ID_OVERRIDE true
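These variables are read as plain strings at runtime. A minimal sketch of how the backend code interprets them (this helper module is illustrative, not part of the sample repo):

// src/analytics/config.js (hypothetical helper)
// Both flags are stored as strings, so compare case-insensitively.
export const isDebug = () =>
  process.env.ANALYTICS_DEBUG?.toLowerCase() === "true";

export const useCloudIdAsUserId = () =>
  process.env.ANALYTICS_USER_ID_OVERRIDE?.toLowerCase() === "true";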
Step 3: Backend Infrastructure
File Structure
Create the analytics infrastructure:
src/analytics/
├── dispatcher.js   # HTTP transport to analytics provider
├── consumer.js     # Queue event processor
├── utils.js        # Helper functions
├── events.js       # Backend event definitions
└── resolvers.js    # Frontend-backend bridge
Dispatcher (src/analytics/dispatcher.js)
The dispatcher handles HTTP communication with your analytics provider:
import { fetch } from '@forge/api';

export const handleTrackEvent = async (userId, event) => {
  await dispatch("events", {
    user_id: userId,
    event: event
  });
}

export const handleIdentify = async (userId, groupId, traits) => {
  await dispatch("users", {
    user_id: userId,
    group_id: groupId,
    traits: traits
  });
}

export const handleGroup = async (groupId, traits) => {
  await dispatch("groups", {
    group_id: groupId,
    traits: traits,
  });
}

const dispatch = async (eventType, event) => {
  const apiKey = process.env.ANALYTICS_API_KEY;
  const payload = JSON.stringify({
    ...event,
    api_key: apiKey,
    timestamp: Date.now(),
  });
  const url = `https://in.accoil.com/v1/${eventType}`;
  if (process.env.ANALYTICS_DEBUG?.toLowerCase() === "true") {
    console.log(`Running analytics in debug. The following payload would be sent to ${url}:\n${payload}`);
  } else {
    const response = await fetch(url, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: payload
    });
    // Surface HTTP failures so the queue can retry the event
    if (!response.ok) {
      throw new Error(`Analytics dispatch to ${url} failed with status ${response.status}`);
    }
  }
}
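For reference, a call like handleTrackEvent(userId, "Todo Created") produces a POST to https://in.accoil.com/v1/events with a body like the following (field names taken from the dispatcher above; the ID value is illustrative):

// Illustrative request body assembled by dispatch("events", ...)
const body = {
  user_id: "557058:c5b8e3d4-...", // accountId (or cloudId when the override flag is set)
  event: "Todo Created",
  api_key: "your_dev_key",
  timestamp: 1704067200000        // milliseconds since epoch, added by dispatch()
};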
Utils (src/analytics/utils.js)
Helper functions for processing Forge context data:
export const userIdFromContext = (context) => {
  // Optional cost optimization: attribute all events to the site rather than individual users
  if (process.env.ANALYTICS_USER_ID_OVERRIDE?.toLowerCase() === "true") {
    return groupIdFromContext(context);
  } else {
    return context.accountId;
  }
}

export const groupIdFromContext = (context) => {
  return context.cloudId;
}
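These helpers rely on two fields of the Forge resolver context. A minimal sketch of the relevant shape (values illustrative; other context fields omitted):

// Illustrative resolver context fields used by the helpers above
const context = {
  accountId: "557058:c5b8e3d4-...", // the invoking user's Atlassian account ID
  cloudId: "8a3035e1-...",          // the site (tenant) ID, used as the group ID
};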
Consumer (src/analytics/consumer.js)
The queue consumer processes events asynchronously:
import { handleGroup, handleIdentify, handleTrackEvent } from "./dispatcher";
import Resolver from "@forge/resolver";

const resolver = new Resolver();

resolver.define('analytics-listener', async ({ payload }) => {
  switch (payload.type) {
    case "identify":
      await handleIdentify(payload.userId, payload.groupId, payload.traits);
      break;
    case "group":
      await handleGroup(payload.groupId, payload.traits);
      break;
    case "track":
      await handleTrackEvent(payload.userId, payload.event);
      break;
    default:
      console.log(`analytics-listener: unable to process payload with type ${payload.type}`);
  }
});

export const handler = resolver.getDefinitions();
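Each queued item arrives as the payload argument of the listener. Given the track function defined in Step 5, a single user action enqueues three events like these (IDs illustrative):

// Illustrative batch pushed by events.js; the listener receives them one at a time
const batch = [
  { type: "identify", userId: "557058:...", groupId: "8a3035e1-...", traits: { name: "557058:..." } },
  { type: "group", groupId: "8a3035e1-...", traits: { name: "8a3035e1-..." } },
  { type: "track", userId: "557058:...", event: "Todo Created" },
];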
Step 4: Frontend to Backend Bridge
Resolvers (src/analytics/resolvers.js)
Bridge between frontend and backend:
import { track } from "./events";

export const trackEvent = async ({ payload, context }) => {
  await track(context, payload.event);
}
Update your main resolver (src/index.js) to register the analytics resolver:
import Resolver from '@forge/resolver';
import { trackEvent } from './analytics/resolvers';
...
// Make sure to add track-event to the resolver
resolver.define('track-event', trackEvent);

export const handler = resolver.getDefinitions();
Step 5: Backend Events
Now that we have the infrastructure, let's define how to use it:
Events Module (src/analytics/events.js)
import { Queue } from '@forge/events';
import {groupIdFromContext, userIdFromContext} from "./utils";
const analyticsQueue = new Queue({ key: 'analytics-queue' });
export const track = async (context, eventName) => {
  const userId = userIdFromContext(context);
  const groupId = groupIdFromContext(context);
  const identifyTraits = { name: userId };
  const groupTraits = { name: groupId };
  const events = [
    { type: "identify", userId: userId, groupId: groupId, traits: identifyTraits },
    { type: "group", groupId: groupId, traits: groupTraits },
    { type: "track", userId: userId, event: eventName },
  ];
  await analyticsQueue.push(events);
}

// Example events from the example Forge todo app
export const trackCreate = async (context) => {
  await track(context, "Todo Created");
}

export const trackUpdate = async (context) => {
  await track(context, "Todo Updated");
}

export const trackDelete = async (context) => {
  await track(context, "Todo Deleted");
}

export const trackDeleteAll = async (context) => {
  await track(context, "Todo Cleared");
}
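Defining additional backend events follows the same one-line pattern; for example, a hypothetical settings event (the event name below is illustrative, not part of the sample app):

// Hypothetical additional event following the same pattern
export const trackSettingsSaved = async (context) => {
  await track(context, "Settings Saved");
}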
Using Backend Events in Your App
Now integrate these tracking functions into your application (src/index.js):
// The following code is from the example Todo app provided by Atlassian.
import Resolver from '@forge/resolver';
import { kvs } from '@forge/kvs';
// Import analytics functions directly - no resolvers needed for backend events
import { trackEvent } from "./analytics/resolvers";
import { trackCreate, trackDelete, trackDeleteAll, trackUpdate } from "./analytics/events";

const resolver = new Resolver();

resolver.define('create', async ({ payload, context }) => {
  // Track todo creation - direct function call with context
  await trackCreate(context);

  // Your business logic here
  const listId = getListKeyFromContext(context);
  const records = await getAll(listId);
  const id = getUniqueId();
  const newRecord = {
    id,
    ...payload,
  };
  await kvs.set(getListKeyFromContext(context), [...records, newRecord]);
  return newRecord;
});

resolver.define('update', async ({ payload, context }) => {
  // Direct analytics call - no resolver needed
  await trackUpdate(context);

  // Your business logic continues...
  const listId = getListKeyFromContext(context);
  let records = await getAll(listId);
  records = records.map(item => {
    if (item.id === payload.id) {
      return payload;
    }
    return item;
  });
  await kvs.set(getListKeyFromContext(context), records);
  return payload;
});

// Only frontend events need resolvers
resolver.define('track-event', trackEvent);

export const handler = resolver.getDefinitions();
Key Points:
- Direct Function Calls: Backend events use direct imports, not resolvers
- Context Parameter: Pass the Forge context object directly to tracking functions
- No invoke() Needed: Backend code calls analytics functions directly
- Frontend vs Backend: Only the frontend needs the invoke() → resolver bridge
Step 6: Frontend Integration
Create a simple frontend analytics module:
Frontend Analytics (static/spa/src/analytics.js)
import { invoke } from '@forge/bridge';
/**
 * Track events from the frontend.
 * All events are routed through backend resolvers for privacy compliance.
 */
export const track = async (eventName) => {
  try {
    await invoke('track-event', { event: eventName });
  } catch (error) {
    // Don't let analytics errors break the UI
    console.error('[Analytics] Failed to track event:', error);
  }
};
// Specific frontend event functions
export const trackSearchPerformed = () => track('Search Performed');
export const trackFilterApplied = (filterType) => track(`${filterType} Filter Applied`);
export const trackExportGenerated = (exportType) => track(`${exportType} Export Generated`);
React Component Example
import React, { useState } from 'react';
import { trackSearchPerformed } from './analytics';

function SearchComponent() {
  const [query, setQuery] = useState('');

  const handleSearch = () => {
    if (query.length > 2) {
      trackSearchPerformed();
      // ... perform search
    }
  };

  return (
    <div>
      <input
        value={query}
        onChange={(e) => setQuery(e.target.value)}
        onKeyDown={(e) => e.key === 'Enter' && handleSearch()}
        placeholder="Search..."
      />
    </div>
  );
}
Step 7: Testing Your Setup
1. Enable Debug Mode
forge variables set ANALYTICS_DEBUG true
forge deploy
2. Test the Complete Flow
# Watch all logs
forge logs --tail
# Or filter for analytics
forge logs --tail | grep -i analytics
3. Verify Event Processing
When you trigger events with debug mode enabled, the dispatcher logs each payload instead of sending it:
Running analytics in debug. The following payload would be sent to https://in.accoil.com/v1/events:
{"user_id":"557058:c5b8e3d4-...","event":"Todo Created","api_key":"your_dev_key","timestamp":1704067200000}
4. Test Error Handling
# Temporarily break the API key to test error handling
forge variables set ANALYTICS_API_KEY invalid_key
You should see retry attempts and eventual failure logs.
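If you want explicit control over retries, @forge/events also lets a consumer ask the platform to redeliver an event. A minimal sketch of an alternative consumer, assuming the InvocationError API from Forge's async events documentation:

import Resolver from "@forge/resolver";
import { InvocationError, InvocationErrorCode } from "@forge/events";
import { handleTrackEvent } from "./dispatcher";

const resolver = new Resolver();

resolver.define("analytics-listener", async ({ payload }) => {
  try {
    await handleTrackEvent(payload.userId, payload.event); // ...handle the other event types as in Step 3
  } catch (error) {
    // Ask the platform to redeliver this event after 60 seconds
    return new InvocationError({
      retryAfter: 60,
      retryReason: InvocationErrorCode.FUNCTION_RETRY_REQUEST,
    });
  }
});

export const handler = resolver.getDefinitions();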
Production Deployment
1. Configure Production Environment
forge variables set --environment production ANALYTICS_API_KEY your_prod_key
forge variables unset --environment production ANALYTICS_DEBUG
2. Deploy and Monitor
forge deploy --environment production
forge logs --environment production | grep Analytics
3. Verify Analytics Data
Check your analytics provider dashboard to confirm events are arriving.
Key Benefits of This Architecture
- Reliable Delivery: Queue system handles failures and retries
- Privacy Compliant: Zero End User Data transmission
- Modular Design: Clear separation of concerns
- Error Resilient: Graceful degradation when analytics fail
- Testable: Debug mode and comprehensive logging
- Scalable: Queue system handles high event volumes
Congratulations! You now have a production-ready, queue-based analytics infrastructure that maintains privacy compliance while providing reliable event delivery.
What's Next
- Event Naming Strategy: Establish consistent patterns
- Frontend Integration Patterns: Advanced React patterns
- Scheduled Analytics: Automated data collection