Build an Auto-Refreshing Dashboard with React, BigQuery, and Deloc
This guide walks through building a revenue dashboard that refreshes its data daily from BigQuery, requires no server to maintain, never needs redeployment after the initial deploy, and costs about $1 per month to run.
The key idea: the dashboard is a static React app deployed once to Deloc. The data is a separate JSON file that gets updated on a schedule by a Cloud Run job. The frontend fetches the data file on page load. When the data file changes, the dashboard shows fresh numbers. No redeploy, no server, no CI/CD pipeline.
Architecture
+------------------+     +---------------+     +----------+
| Cloud Scheduler  |---->| Cloud Run Job |---->| BigQuery |
| (daily at 6 AM)  |     |   (Python)    |<----|          |
+------------------+     +-------+-------+     +----------+
                                 |
                                 | POST data.json
                                 v
                         +-------+-------+
                         |     Deloc     |
                         | (CDN + Auth)  |
                         +-------+-------+
                                 |
                                 | Static files + data.json
                                 v
                         +-------+-------+
                         |    Browser    |
                         |  (React app)  |
                         +---------------+
Cloud Scheduler triggers the Cloud Run job on a cron schedule. The job queries BigQuery, formats the result as JSON, and pushes it to Deloc using the data upload API. The React dashboard fetches ./data.json on every page load and renders the latest numbers. The dashboard itself is served from Deloc’s CDN, so load times are fast and there is nothing to maintain.
Prerequisites
- Node.js 18+
- A Google Cloud project with BigQuery enabled and data you want to visualize
- A Deloc account (sign up at deloc.dev)
- gcloud CLI installed and authenticated
- Docker (for building the Cloud Run job container)
Step 1: Build the dashboard
You have two paths here. You can scaffold it yourself or let an AI coding tool do the work.
Option A: Scaffold with Vite
npx create-vite revenue-dashboard --template react-ts
cd revenue-dashboard
npm install recharts
The important pattern is this: your dashboard loads data from a relative path at runtime. Create public/data.json with sample data for development:
{
  "last_updated": "2026-04-09T06:00:00Z",
  "summary": {
    "total_revenue": 2847500,
    "month_over_month": 0.12,
    "active_customers": 1842
  },
  "monthly_revenue": [
    { "month": "2025-11", "revenue": 2340000 },
    { "month": "2025-12", "revenue": 2510000 },
    { "month": "2026-01", "revenue": 2425000 },
    { "month": "2026-02", "revenue": 2680000 },
    { "month": "2026-03", "revenue": 2540000 },
    { "month": "2026-04", "revenue": 2847500 }
  ],
  "revenue_by_region": [
    { "region": "North America", "revenue": 1250000 },
    { "region": "Europe", "revenue": 820000 },
    { "region": "Asia Pacific", "revenue": 480000 },
    { "region": "Latin America", "revenue": 297500 }
  ]
}
Then in your main component, fetch it on load:
import { useEffect, useState } from "react";
import {
  LineChart, Line, BarChart, Bar,
  XAxis, YAxis, Tooltip, ResponsiveContainer,
} from "recharts";

interface DashboardData {
  last_updated: string;
  summary: {
    total_revenue: number;
    month_over_month: number;
    active_customers: number;
  };
  monthly_revenue: { month: string; revenue: number }[];
  revenue_by_region: { region: string; revenue: number }[];
}

function App() {
  const [data, setData] = useState<DashboardData | null>(null);

  useEffect(() => {
    fetch("./data.json")
      .then((res) => res.json())
      .then(setData)
      .catch(console.error);
  }, []);

  if (!data) return <div className="loading">Loading...</div>;

  const fmt = (n: number) =>
    "$" + (n / 1_000_000).toFixed(1) + "M";

  return (
    <div style={{ maxWidth: 1200, margin: "0 auto", padding: 24 }}>
      <h1>Revenue Dashboard</h1>
      <p style={{ color: "#666" }}>
        Last updated: {new Date(data.last_updated).toLocaleString()}
      </p>

      <div style={{ display: "flex", gap: 24, marginBottom: 32 }}>
        <div className="card">
          <div className="card-label">Total Revenue</div>
          <div className="card-value">{fmt(data.summary.total_revenue)}</div>
        </div>
        <div className="card">
          <div className="card-label">MoM Growth</div>
          <div className="card-value">
            {(data.summary.month_over_month * 100).toFixed(1)}%
          </div>
        </div>
        <div className="card">
          <div className="card-label">Active Customers</div>
          <div className="card-value">
            {data.summary.active_customers.toLocaleString()}
          </div>
        </div>
      </div>

      <h2>Monthly Revenue</h2>
      <ResponsiveContainer width="100%" height={300}>
        <LineChart data={data.monthly_revenue}>
          <XAxis dataKey="month" />
          <YAxis tickFormatter={(v) => fmt(v)} />
          <Tooltip formatter={(v: number) => fmt(v)} />
          <Line
            type="monotone"
            dataKey="revenue"
            stroke="#4f46e5"
            strokeWidth={2}
          />
        </LineChart>
      </ResponsiveContainer>

      <h2>Revenue by Region</h2>
      <ResponsiveContainer width="100%" height={300}>
        <BarChart data={data.revenue_by_region}>
          <XAxis dataKey="region" />
          <YAxis tickFormatter={(v) => fmt(v)} />
          <Tooltip formatter={(v: number) => fmt(v)} />
          <Bar dataKey="revenue" fill="#4f46e5" radius={[4, 4, 0, 0]} />
        </BarChart>
      </ResponsiveContainer>
    </div>
  );
}

export default App;
This is a minimal example. In practice, you would add better styling, responsive breakpoints, loading states, and error handling. Or you would let the AI tool handle all of that.
Option B: Let AI build it
Open Claude Code or Cursor in an empty directory and say:
Build me a revenue dashboard with React and TypeScript. It should load data from ./data.json on page load. Include a summary row at the top showing total revenue, month-over-month growth, and active customers. Below that, a line chart of monthly revenue and a bar chart of revenue by region. Use Recharts. Make it clean and professional — dark sidebar, white content area, good typography. Here is the data shape: [paste the sample JSON above].
You will get a complete, styled, production-ready dashboard in a few minutes. Iterate by describing changes: “make the sidebar narrower,” “add a date range filter,” “switch to a stacked bar chart.”
Either way, the contract between frontend and backend is simple: the dashboard reads ./data.json at runtime. Everything else is a design choice.
Step 2: Deploy to Deloc
Build and deploy:
npm run build
npx @deloc/cli deploy --name "Revenue Dashboard"
Output:
Deploying Revenue Dashboard...
Uploading 14 files (842 KB)...
Done!
https://revenue-dashboard--deloc-team.deloc.app
Link copied to clipboard.
Your dashboard is live. Right now it shows the sample data from public/data.json that was included in the build. Next, we wire up the real data.
Step 3: Create the data refresh script
This is a Python script that queries BigQuery, formats the result, and pushes it to Deloc. It will run as a Cloud Run job.
main.py
import json
import os
from datetime import datetime, timezone

import requests
from google.cloud import bigquery

DELOC_TOKEN = os.environ["DELOC_TOKEN"]
APP_SLUG = os.environ.get("APP_SLUG", "revenue-dashboard")
DELOC_API = os.environ.get("DELOC_API", "https://api.deloc.dev")


def query_bigquery() -> dict:
    client = bigquery.Client()

    # Monthly revenue for the last 6 months
    monthly_query = """
        SELECT
          FORMAT_DATE('%Y-%m', order_date) AS month,
          SUM(revenue) AS revenue
        FROM `your_project.your_dataset.orders`
        WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 6 MONTH)
        GROUP BY month
        ORDER BY month
    """
    monthly_rows = list(client.query(monthly_query).result())
    monthly_revenue = [
        {"month": row.month, "revenue": float(row.revenue)}
        for row in monthly_rows
    ]

    # Revenue by region
    region_query = """
        SELECT
          region,
          SUM(revenue) AS revenue
        FROM `your_project.your_dataset.orders`
        WHERE order_date >= DATE_TRUNC(CURRENT_DATE(), MONTH)
        GROUP BY region
        ORDER BY revenue DESC
    """
    region_rows = list(client.query(region_query).result())
    revenue_by_region = [
        {"region": row.region, "revenue": float(row.revenue)}
        for row in region_rows
    ]

    # Summary
    current_month = monthly_revenue[-1]["revenue"] if monthly_revenue else 0
    prev_month = monthly_revenue[-2]["revenue"] if len(monthly_revenue) >= 2 else 0
    mom_growth = (current_month - prev_month) / prev_month if prev_month else 0

    customer_query = """
        SELECT COUNT(DISTINCT customer_id) AS active_customers
        FROM `your_project.your_dataset.orders`
        WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    """
    customer_rows = list(client.query(customer_query).result())
    active_customers = customer_rows[0].active_customers if customer_rows else 0

    return {
        "last_updated": datetime.now(timezone.utc).isoformat(),
        "summary": {
            "total_revenue": current_month,
            "month_over_month": round(mom_growth, 4),
            "active_customers": active_customers,
        },
        "monthly_revenue": monthly_revenue,
        "revenue_by_region": revenue_by_region,
    }


def push_to_deloc(data: dict) -> None:
    url = f"{DELOC_API}/api/apps/{APP_SLUG}/data/data.json"
    response = requests.post(
        url,
        headers={
            "Authorization": f"Bearer {DELOC_TOKEN}",
            "Content-Type": "application/json",
        },
        data=json.dumps(data),
    )
    response.raise_for_status()
    print(f"Data pushed to Deloc: {response.status_code}")


def main():
    print("Querying BigQuery...")
    data = query_bigquery()
    print(f"Got {len(data['monthly_revenue'])} months of data")
    print("Pushing to Deloc...")
    push_to_deloc(data)
    print("Done.")


if __name__ == "__main__":
    main()
Replace the BigQuery project, dataset, and table references with your own. The queries are straightforward SQL — adapt them to your schema.
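One inexpensive safeguard when adapting the SQL: validate the payload shape before pushing, so a schema change fails loudly in the job instead of silently breaking the charts. A minimal sketch; `validate_payload` is a helper written for this guide, not part of the Deloc or BigQuery APIs:

```python
def validate_payload(data: dict) -> None:
    """Fail fast if the payload no longer matches the dashboard's contract.

    Checks structure only, not values; raises AssertionError on drift.
    """
    assert isinstance(data.get("last_updated"), str), "missing last_updated"
    summary = data.get("summary", {})
    for key in ("total_revenue", "month_over_month", "active_customers"):
        assert key in summary, f"summary missing {key}"
    for row in data.get("monthly_revenue", []):
        assert set(row) == {"month", "revenue"}, f"unexpected monthly row: {row}"
    for row in data.get("revenue_by_region", []):
        assert set(row) == {"region", "revenue"}, f"unexpected region row: {row}"
```

Calling this in `main()` between `query_bigquery()` and `push_to_deloc(data)` turns schema drift into a failed job execution, which shows up in Cloud Run's logs.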
requirements.txt
google-cloud-bigquery==3.27.0
requests==2.32.3
Dockerfile
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY main.py .
CMD ["python", "main.py"]
Alternative: Use the MCP tool
If you are working in Claude Code with the Deloc MCP server installed, you can skip writing this by hand:
Set up auto-refresh for revenue-dashboard from BigQuery. The data source is
your_project.your_dataset.orders with columns order_date, revenue, region, and customer_id. Refresh daily at 6 AM UTC.
The setup_data_refresh MCP tool generates the Python script, Dockerfile, and Cloud Run/Scheduler configuration tailored to your schema.
Step 4: Deploy the Cloud Run job and schedule it
First, build and push the container image:
# Set your project
export PROJECT_ID=your-gcp-project
export REGION=us-central1
# Build and push to Artifact Registry
gcloud artifacts repositories create deloc-jobs \
--repository-format=docker \
--location=$REGION \
--description="Deloc data refresh jobs" \
2>/dev/null || true
gcloud builds submit \
--tag $REGION-docker.pkg.dev/$PROJECT_ID/deloc-jobs/revenue-refresh:latest
Create the Cloud Run job:
gcloud run jobs create revenue-dashboard-refresh \
--image $REGION-docker.pkg.dev/$PROJECT_ID/deloc-jobs/revenue-refresh:latest \
--region $REGION \
--set-env-vars "DELOC_TOKEN=dl_your_token_here,APP_SLUG=revenue-dashboard" \
--max-retries 2 \
--task-timeout 300s \
--memory 512Mi
A note on the DELOC_TOKEN: generate one from your Deloc dashboard under Settings > API Tokens, or run npx @deloc/cli token create. For production use, store it in Google Secret Manager instead of passing it as a plain environment variable:
# Store the token as a secret
echo -n "dl_your_token_here" | gcloud secrets create deloc-token --data-file=-
# Reference the secret in the Cloud Run job
gcloud run jobs update revenue-dashboard-refresh \
--set-secrets "DELOC_TOKEN=deloc-token:latest" \
--region $REGION
Now schedule it with Cloud Scheduler:
gcloud scheduler jobs create http revenue-dashboard-refresh-schedule \
--location $REGION \
--schedule "0 6 * * *" \
--time-zone "UTC" \
--uri "https://$REGION-run.googleapis.com/apis/run.googleapis.com/v1/namespaces/$PROJECT_ID/jobs/revenue-dashboard-refresh:run" \
--http-method POST \
--oauth-service-account-email your-scheduler-sa@$PROJECT_ID.iam.gserviceaccount.com
The service account you pass here must have the Cloud Run Invoker role (roles/run.invoker) on the job; otherwise Scheduler's trigger requests will be rejected.
The cron expression 0 6 * * * means “at 6:00 AM UTC every day.” Adjust to match when your data is typically fresh. Some common patterns:
- 0 6 * * * — daily at 6 AM UTC
- 0 */4 * * * — every 4 hours
- 0 6 * * 1-5 — weekdays only at 6 AM UTC
- 0 6 1 * * — first of each month at 6 AM UTC
Step 5: Verify it works
Run the job manually to test:
gcloud run jobs execute revenue-dashboard-refresh --region $REGION --wait
Check the logs:
gcloud run jobs executions list --job revenue-dashboard-refresh --region $REGION
You should see output like:
Querying BigQuery...
Got 6 months of data
Pushing to Deloc...
Data pushed to Deloc: 200
Done.
Now open your dashboard at https://revenue-dashboard--deloc-team.deloc.app. It should show live data from BigQuery instead of the sample data. The “Last updated” timestamp should reflect when the job ran.
You can also verify the data file directly:
npx @deloc/cli upload-data --app revenue-dashboard --list
This shows all data files attached to the app, their sizes, and when they were last updated.
Step 6: Add access control
By default, your Deloc app is accessible to anyone with the link. That might be fine for some dashboards, but for revenue data you probably want access control.
Password protection (quick and simple):
npx @deloc/cli password revenue-dashboard
This prompts for a password. Anyone visiting the dashboard will need to enter it before seeing the content. Remove it later with npx @deloc/cli password revenue-dashboard --remove.
Domain restriction (for organizations):
If your team uses Google Workspace or Microsoft 365, you can restrict access to your email domain. Visitors authenticate with their work account, and Deloc’s edge proxy checks their email domain before serving the page.
This is configured from the Deloc dashboard under your app’s settings, or via the API:
curl -X PATCH https://api.deloc.dev/api/apps/revenue-dashboard \
-H "Authorization: Bearer $DELOC_TOKEN" \
-H "Content-Type: application/json" \
-d '{"visibility": "domain_restricted"}'
No code changes to your dashboard. No auth libraries. The authentication happens at Deloc’s CDN edge before the first byte of your app is served.
Adapting for other data sources
The architecture works with any data source. The Deloc push (the push_to_deloc function) stays the same. Only the query logic changes.
Snowflake:
import snowflake.connector

def query_snowflake() -> dict:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse=os.environ["SNOWFLAKE_WAREHOUSE"],
        database=os.environ["SNOWFLAKE_DATABASE"],
    )
    cursor = conn.cursor()
    cursor.execute("""
        SELECT DATE_TRUNC('month', order_date) AS month,
               SUM(revenue) AS revenue
        FROM orders
        WHERE order_date >= DATEADD(month, -6, CURRENT_DATE())
        GROUP BY month ORDER BY month
    """)
    rows = cursor.fetchall()
    # ... format and return
PostgreSQL:
import psycopg2

def query_postgres() -> dict:
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    cursor = conn.cursor()
    cursor.execute("""
        SELECT TO_CHAR(order_date, 'YYYY-MM') AS month,
               SUM(revenue) AS revenue
        FROM orders
        WHERE order_date >= CURRENT_DATE - INTERVAL '6 months'
        GROUP BY 1 ORDER BY 1
    """)
    rows = cursor.fetchall()
    # ... format and return
REST API:
import requests

def query_api() -> dict:
    response = requests.get(
        "https://api.your-service.com/v1/revenue",
        headers={"Authorization": f"Bearer {os.environ['API_KEY']}"},
        params={"period": "6m"},
    )
    response.raise_for_status()
    raw = response.json()
    # ... transform and return in your dashboard's expected format
The pattern is always the same: query the source, shape the data into the JSON structure your dashboard expects, push to Deloc.
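That repetition can be captured in one small wrapper. A sketch, assuming each `query_*` function returns the chart data without a timestamp; `build_payload` is a convenience invented here, not part of the main.py from Step 3:

```python
from datetime import datetime, timezone
from typing import Callable


def build_payload(query_fn: Callable[[], dict]) -> dict:
    """Run any source-specific query function and stamp the result.

    Keeps the Deloc side identical across sources: every payload gets a
    last_updated field, and the push function never has to know the source.
    """
    payload = query_fn()
    payload["last_updated"] = datetime.now(timezone.utc).isoformat()
    return payload


# Usage: push_to_deloc(build_payload(query_snowflake))
#    or: push_to_deloc(build_payload(query_postgres))
```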
Cost breakdown
Monthly costs for a dashboard that refreshes daily from BigQuery:
| Component | Cost |
|---|---|
| Deloc (free tier) | $0 |
| Cloud Run job (runs once daily, ~30 seconds, 512MB) | ~$0.50/mo |
| Cloud Scheduler (one trigger) | Free (up to 3 free per account) |
| BigQuery queries (a few MB scanned daily) | ~$0.01/mo |
| Artifact Registry (one small container image) | ~$0.10/mo |
| Total | ~$1/mo |
Viewers are unlimited and free. There are no per-seat costs for people who look at the dashboard. Compare this to Tableau’s per-viewer licensing and the math is not close.
The free tier includes 3 live apps, 100MB storage, and 1GB bandwidth per app per month. Apps expire after 30 days but can be renewed with one click. For teams that need permanent links, no branding, and more apps, paid plans start at $10/month.
Next steps
A few things you might want to add once the basic setup is running:
Multiple data files. Your dashboard does not have to load a single data.json. You can push multiple files — revenue.json, customers.json, pipeline.json — and have different sections of the dashboard load different datasets. Each gets its own upload endpoint.
Error handling in the refresh script. Add alerting when the BigQuery query fails or the Deloc push returns an error. A simple approach: catch exceptions and send a notification to Slack or email. Cloud Run’s built-in logging also shows failures in the Google Cloud console.
Multiple dashboards from one data source. The Cloud Run job can push data to multiple Deloc apps in a single run. Query once, shape the data differently for each dashboard, push to each.
Historical snapshots. Push timestamped data files (data-2026-04-09.json) alongside the current data.json. Add a date picker to the dashboard that loads historical snapshots for comparison.
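The snapshot idea can be sketched with a pure helper; the dated filename convention is this guide's, not a Deloc requirement, and the push itself reuses the Step 3 upload endpoint:

```python
from datetime import datetime, timezone


def snapshot_filenames(now: datetime) -> list[str]:
    """The current data file plus a dated copy, e.g. data-2026-04-09.json."""
    return ["data.json", f"data-{now:%Y-%m-%d}.json"]


# In the Cloud Run job, push the same payload once per filename using the
# Step 3 endpoint (POST {DELOC_API}/api/apps/{APP_SLUG}/data/{filename}):
#
#   for filename in snapshot_filenames(datetime.now(timezone.utc)):
#       # requests.post as in push_to_deloc, with this filename in the URL
#       ...
```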
The working example for this guide is deployed at https://revenue-dashboard--deloc-team.deloc.app.
For the business case behind this approach — why custom dashboards are replacing Tableau and Looker for a growing number of teams — read the companion post: You Don’t Need Tableau Anymore.