I was paying $200/month for a Solana analytics tool that gave me the same data everyone else had. Same dashboards, same metrics, same delayed insights. Then I realized most of this data is sitting right there on-chain, free, waiting to be queried.
So I built my own. It took a weekend to get the first version running, and now I have analytics that nobody else sees. That’s the whole point.
What You Can Pull from Solana RPC
People underestimate what’s available through a standard Solana RPC node. You can get transaction history, slot data, token balances, program interactions — basically everything that happens on-chain.
The stuff that matters for analytics:
- Recent transaction counts per slot
- Program-specific transactions (Raydium, Jupiter, etc.)
- Wallet activity and signature history
- Token account changes and swap patterns
You don’t need a fancy indexer for most of this. A well-structured set of RPC calls gets you surprisingly far.
Project Setup
I’m using Next.js with API routes as a lightweight backend. The frontend is React with some simple charting. Nothing fancy.
// package.json dependencies that matter
{
"@solana/web3.js": "^1.95.0",
"next": "^14.2.0",
"recharts": "^2.12.0",
"swr": "^2.2.0"
}
The connection setup lives in a shared util so every API route uses the same instance:
// lib/solana.ts
import { Connection } from "@solana/web3.js";
const RPC_URL = process.env.SOLANA_RPC_URL || "https://api.mainnet-beta.solana.com";
export const connection = new Connection(RPC_URL, {
  commitment: "confirmed",
  wsEndpoint: RPC_URL.replace("https", "wss"),
});
I use a paid RPC endpoint from Helius. The public one works for testing but it’ll rate-limit you fast in production.
Fetching Transaction Volume
The first metric I wanted was transaction volume over time. Here’s the API route that pulls recent slot data and counts transactions:
// app/api/volume/route.ts
import { connection } from "@/lib/solana";
import { NextResponse } from "next/server";
export async function GET() {
  const currentSlot = await connection.getSlot();
  const slots = Array.from({ length: 20 }, (_, i) => currentSlot - i);
  const blocks = await Promise.all(
    slots.map(async (slot) => {
      try {
        // "signatures" is enough for counting — no need to pull full
        // transaction payloads just to measure volume
        const block = await connection.getBlock(slot, {
          maxSupportedTransactionVersion: 0,
          transactionDetails: "signatures",
        });
        return {
          slot,
          txCount: block?.signatures.length ?? 0,
          timestamp: block?.blockTime ?? 0,
        };
      } catch {
        // Skipped slots or RPC hiccups count as zero rather than failing the route
        return { slot, txCount: 0, timestamp: 0 };
      }
    })
  );
  return NextResponse.json({ blocks: blocks.reverse() });
}
I used to fetch 100 slots at once. Bad idea. RPC nodes don’t like that, and you’ll get timeouts. 20 slots at a time is the sweet spot — enough data to show a trend without hammering the endpoint.
Tracking DEX Volume
This is where it gets interesting. I monitor transactions that interact with Raydium and Jupiter programs to track DEX volume:
// app/api/dex-volume/route.ts
import { connection } from "@/lib/solana";
import { PublicKey } from "@solana/web3.js";
import { NextResponse } from "next/server";
const DEX_PROGRAMS = {
  raydium: new PublicKey("675kPX9MHTjS2zt1qfr1NYHuzeLXfQM9H24wFSUt1Mp8"),
  jupiter: new PublicKey("JUP6LkbZbjS1jKKwapdHNy74zcZ3tLUZoi5QNyVTaV4"),
};
export async function GET() {
  const results: Record<string, number> = {};
  for (const [name, programId] of Object.entries(DEX_PROGRAMS)) {
    const signatures = await connection.getSignaturesForAddress(programId, {
      limit: 100,
    });
    results[name] = signatures.length;
  }
  return NextResponse.json({
    dexActivity: results,
    period: "recent",
    fetchedAt: Date.now(),
  });
}
This gives you a rough count of recent swaps per DEX. It’s not perfect volume in USD — you’d need to decode the transaction data for that — but it tells you which DEX is hot right now. And honestly, relative activity is more useful than exact dollar amounts for spotting trends early.
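If you do go down that road, the core of the decoding is diffing preTokenBalances against postTokenBalances on the parsed transaction. Here's a rough sketch — the TokenBalance type is a trimmed-down stand-in for what web3.js actually returns, and treating the largest per-account balance move as the swap size is an approximation, not exact volume:

```typescript
// Rough swap-size estimate for one mint: compare each token account's
// balance before and after the transaction and take the largest move.
// TokenBalance is a simplified stand-in for the web3.js shape.
type TokenBalance = {
  accountIndex: number;
  mint: string;
  uiTokenAmount: { uiAmount: number | null };
};

export function swapDelta(
  pre: TokenBalance[],
  post: TokenBalance[],
  mint: string
): number {
  // Index pre-transaction balances by account so we can match them up
  const preByAccount = new Map<number, number>();
  for (const b of pre) {
    if (b.mint === mint) {
      preByAccount.set(b.accountIndex, b.uiTokenAmount.uiAmount ?? 0);
    }
  }
  let maxDelta = 0;
  for (const b of post) {
    if (b.mint !== mint) continue;
    const before = preByAccount.get(b.accountIndex) ?? 0;
    const delta = Math.abs((b.uiTokenAmount.uiAmount ?? 0) - before);
    if (delta > maxDelta) maxDelta = delta;
  }
  return maxDelta;
}
```

Summing all deltas doesn't work — tokens just move between accounts inside the transaction, so the totals roughly cancel. The largest single-account move is the useful signal.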
Active Wallets Counter
Counting unique active wallets is trickier than you’d think. I pull recent block transactions and extract the fee payers:
// app/api/active-wallets/route.ts
import { connection } from "@/lib/solana";
import { NextResponse } from "next/server";
export async function GET() {
  const currentSlot = await connection.getSlot();
  const uniqueWallets = new Set<string>();
  for (let i = 0; i < 10; i++) {
    try {
      const block = await connection.getBlock(currentSlot - i, {
        maxSupportedTransactionVersion: 0,
        transactionDetails: "full",
      });
      if (!block) continue;
      for (const tx of block.transactions) {
        // The fee payer is always the first static account key
        const feePayer = tx.transaction.message.staticAccountKeys[0];
        if (feePayer) {
          uniqueWallets.add(feePayer.toBase58());
        }
      }
    } catch {
      // Skipped slot — move on
      continue;
    }
  }
  return NextResponse.json({
    activeWallets: uniqueWallets.size,
    slotRange: { from: currentSlot - 9, to: currentSlot },
  });
}
I made a mistake early on — I was counting all account keys in a transaction as “active wallets.” That inflates the number massively because program accounts and token accounts get included. Only the fee payer matters if you want real human wallet activity.
Trending Tokens Detector
This one’s my favorite. It finds tokens with the most swap activity in recent blocks by parsing Jupiter transactions:
// app/api/trending/route.ts
import { connection } from "@/lib/solana";
import { PublicKey } from "@solana/web3.js";
import { NextResponse } from "next/server";
const JUPITER_V6 = new PublicKey("JUP6LkbZbjS1jKKwapdHNy74zcZ3tLUZoi5QNyVTaV4");
export async function GET() {
  const signatures = await connection.getSignaturesForAddress(JUPITER_V6, {
    limit: 200,
  });
  const tokenCounts = new Map<string, number>();
  const txs = await connection.getParsedTransactions(
    signatures.map((s) => s.signature),
    { maxSupportedTransactionVersion: 0 }
  );
  for (const tx of txs) {
    if (!tx?.meta?.postTokenBalances) continue;
    for (const balance of tx.meta.postTokenBalances) {
      if (balance.mint) {
        tokenCounts.set(balance.mint, (tokenCounts.get(balance.mint) || 0) + 1);
      }
    }
  }
  const trending = [...tokenCounts.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, 20)
    .map(([mint, count]) => ({ mint, swapCount: count }));
  return NextResponse.json({ trending });
}
The trick is looking at postTokenBalances in the transaction metadata. Every token involved in a swap shows up there. Count the mints, sort by frequency, and you’ve got your trending list. I’ve caught tokens pumping 10-30 minutes before they showed up on Birdeye’s trending page.
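The counting-and-sorting step is just a frequency map. Pulled out as a pure helper — my own naming, the route above inlines this logic — it's easy to test against canned data:

```typescript
// Rank mints by how often they appear across transactions.
// Each inner array holds the mints touched by one swap.
export function rankMints(
  txMints: string[][],
  topN = 20
): { mint: string; swapCount: number }[] {
  const counts = new Map<string, number>();
  for (const mints of txMints) {
    for (const mint of mints) {
      counts.set(mint, (counts.get(mint) ?? 0) + 1);
    }
  }
  // Sort descending by count and keep the top N
  return [...counts.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, topN)
    .map(([mint, swapCount]) => ({ mint, swapCount }));
}
```

Feed it one array of mints per parsed transaction and you get the same shape of trending list the route returns.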
React Dashboard Components
For the frontend, I use SWR for data fetching and Recharts for visualization. Here’s the volume chart:
// components/VolumeChart.tsx
"use client";
import useSWR from "swr";
import { BarChart, Bar, XAxis, YAxis, Tooltip, ResponsiveContainer } from "recharts";
const fetcher = (url: string) => fetch(url).then((r) => r.json());
export function VolumeChart() {
  const { data, error } = useSWR("/api/volume", fetcher, {
    refreshInterval: 5000,
  });
  if (error) return <p>Failed to load volume data</p>;
  if (!data) return <p>Loading...</p>;
  return (
    <div className="rounded-lg border border-zinc-800 bg-zinc-900 p-4">
      <h3 className="mb-3 text-sm font-medium text-zinc-400">
        Transactions per Slot
      </h3>
      <ResponsiveContainer width="100%" height={240}>
        <BarChart data={data.blocks}>
          <XAxis dataKey="slot" tick={{ fontSize: 10 }} />
          <YAxis tick={{ fontSize: 10 }} />
          <Tooltip />
          <Bar dataKey="txCount" fill="#14b8a6" radius={[2, 2, 0, 0]} />
        </BarChart>
      </ResponsiveContainer>
    </div>
  );
}
And a trending tokens table:
// components/TrendingTokens.tsx
"use client";
import useSWR from "swr";
const fetcher = (url: string) => fetch(url).then((r) => r.json());
export function TrendingTokens() {
  const { data } = useSWR("/api/trending", fetcher, {
    refreshInterval: 30000,
  });
  if (!data) return <p>Loading trending tokens...</p>;
  return (
    <div className="rounded-lg border border-zinc-800 bg-zinc-900 p-4">
      <h3 className="mb-3 text-sm font-medium text-zinc-400">
        Trending Tokens (Last 200 Jupiter Swaps)
      </h3>
      <div className="space-y-2">
        {data.trending.map((token: { mint: string; swapCount: number }, i: number) => (
          <div key={token.mint} className="flex items-center justify-between text-sm">
            <span className="font-mono text-zinc-300">
              {i + 1}. {token.mint.slice(0, 8)}...{token.mint.slice(-4)}
            </span>
            <span className="text-teal-400">{token.swapCount} swaps</span>
          </div>
        ))}
      </div>
    </div>
  );
}
SWR’s refreshInterval handles the polling. 5 seconds for volume, 30 seconds for trending tokens. That balance keeps the dashboard feeling live without burning through RPC credits.
Real-Time Updates with WebSockets
Polling is fine for most metrics, but for whale alerts I want instant notifications. Solana RPC supports WebSocket subscriptions:
// lib/websocket.ts
import { connection } from "./solana";
import { PublicKey, LAMPORTS_PER_SOL } from "@solana/web3.js";
const WHALE_THRESHOLD = 500 * LAMPORTS_PER_SOL;
type WhaleCallback = (tx: { signature: string; amount: number }) => void;
export function subscribeToWhaleTransactions(onWhale: WhaleCallback) {
  const subId = connection.onLogs(
    new PublicKey("11111111111111111111111111111111"),
    async (logs) => {
      if (logs.err) return;
      try {
        const tx = await connection.getParsedTransaction(logs.signature, {
          maxSupportedTransactionVersion: 0,
        });
        if (!tx?.meta) return;
        const balanceChange = Math.abs(
          (tx.meta.postBalances[0] ?? 0) - (tx.meta.preBalances[0] ?? 0)
        );
        if (balanceChange >= WHALE_THRESHOLD) {
          onWhale({
            signature: logs.signature,
            amount: balanceChange / LAMPORTS_PER_SOL,
          });
        }
      } catch {
        // Transaction fetch failed, skip it
      }
    },
    "confirmed"
  );
  return () => connection.removeOnLogsListener(subId);
}
I subscribe to the System Program logs because every SOL transfer goes through it. When a transfer exceeds 500 SOL, that’s a whale move worth knowing about. You can pipe these into a Next.js API route using Server-Sent Events to push updates to the frontend.
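If you wire this into Server-Sent Events, the framing is plain text. A tiny formatter — my own sketch, not part of any library — turns a whale event into a valid SSE frame:

```typescript
// Sketch of SSE framing for whale alerts. The event shape mirrors the
// WhaleCallback payload above; the helper itself is an assumption,
// not part of the dashboard's API routes.
type WhaleEvent = { signature: string; amount: number };

export function formatSSE(event: WhaleEvent): string {
  // Each SSE frame is "event:" and "data:" lines terminated by a blank line
  return `event: whale\ndata: ${JSON.stringify(event)}\n\n`;
}
```

In a route handler you'd enqueue these frames into a ReadableStream and respond with Content-Type: text/event-stream; the browser's EventSource handles reconnection for free.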
I tried subscribing to every program at once early on. The WebSocket connection died within minutes. Pick your subscriptions carefully — one or two high-value streams beats ten noisy ones.
When You Need a Database
For the first few weeks, I ran everything off live RPC calls. No database. It worked fine for real-time data, but I couldn’t answer questions like “what was the transaction volume at 3am last Tuesday?”
Here’s my rule: if you only need current state, RPC is enough. If you need historical comparisons, add a database.
I use a simple SQLite setup through Prisma. Every 5 minutes, a cron job snapshots the key metrics:
// lib/snapshot.ts
import { prisma } from "./db";
import { connection } from "./solana";
export async function takeSnapshot() {
  const slot = await connection.getSlot();
  const block = await connection.getBlock(slot, {
    maxSupportedTransactionVersion: 0,
  });
  await prisma.metric.create({
    data: {
      timestamp: new Date(),
      slot,
      txCount: block?.transactions.length ?? 0,
      type: "VOLUME",
    },
  });
}
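For reference, the snapshot code assumes a Prisma model shaped roughly like this — the field names are inferred from the create() call, so treat it as a sketch rather than the exact schema:

```prisma
// prisma/schema.prisma — sketch inferred from the create() call above
model Metric {
  id        Int      @id @default(autoincrement())
  timestamp DateTime
  slot      Int
  txCount   Int
  type      String
}
```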
SQLite is perfect here. No external services, no connection pooling headaches. It just works. I only moved to Postgres when I had 3 months of data and queries started slowing down.
RPC Costs and Optimization
The public Solana RPC is free but useless for anything serious. Rate limits will kill your dashboard within minutes.
I use Helius at $50/month. It handles everything I need, and their enhanced APIs fill gaps that standard RPC can’t cover — like parsed transaction history and token metadata. QuickNode and Triton are solid alternatives.
Three things that cut my RPC costs in half:
Caching is non-negotiable. I cache API responses for 3-5 seconds. Most dashboard viewers don’t need sub-second freshness:
// Simple in-memory cache for API routes
const cache = new Map<string, { data: unknown; expiry: number }>();
export function getCached<T>(key: string, ttlMs: number, fetcher: () => Promise<T>): Promise<T> {
  const cached = cache.get(key);
  if (cached && cached.expiry > Date.now()) return Promise.resolve(cached.data as T);
  return fetcher().then((data) => {
    cache.set(key, { data, expiry: Date.now() + ttlMs });
    return data;
  });
}
Batch requests where possible. Instead of firing 20 getBlock calls one after another, I run them through Promise.all with a concurrency limit. Same number of requests, but they overlap, so total latency drops without tripping the provider's rate limiter.
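A generic limiter makes this concrete. This helper is my own (web3.js doesn't ship one); it keeps at most `limit` calls in flight while preserving result order:

```typescript
// Run an async mapper over items with at most `limit` in flight at once.
// Results come back in input order regardless of completion order.
export async function mapWithConcurrency<T, R>(
  items: T[],
  limit: number,
  fn: (item: T) => Promise<R>
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  async function worker(): Promise<void> {
    // Each worker pulls the next unclaimed index until none remain
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i]);
    }
  }
  await Promise.all(
    Array.from({ length: Math.min(limit, items.length) }, worker)
  );
  return results;
}
```

Wrapping the 20 slot fetches as mapWithConcurrency(slots, 5, fetchBlock) keeps at most five RPC requests in flight at a time.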
Ask for less data. The transactionDetails parameter in getBlock makes a huge difference. If you only need a transaction count, you don't need full transaction payloads — request "signatures" instead of "full" whenever you can.
Enhanced APIs for the Hard Stuff
Some data is painful to get from standard RPC. Token prices, historical balances, NFT metadata — that stuff requires parsing hundreds of transactions or hitting multiple endpoints.
Helius DAS API gives me token metadata and ownership data in a single call. Their enhanced transaction history is parsed and human-readable, which saves me from writing deserialization logic for every program.
For token prices, I hit Jupiter’s price API. It’s free and fast:
// lib/prices.ts
export async function getTokenPrice(mint: string): Promise<number | null> {
try {
const res = await fetch(`https://api.jup.ag/price/v2?ids=${mint}`);
const data = await res.json();
return data.data?.[mint]?.price ?? null;
} catch {
return null;
}
}
I used to try calculating prices from on-chain pool reserves. Don’t do that. It’s a rabbit hole of edge cases and stale data. Just use Jupiter’s API.
Putting It All Together
The dashboard page is straightforward. Just components in a grid:
// app/page.tsx
import { VolumeChart } from "@/components/VolumeChart";
import { TrendingTokens } from "@/components/TrendingTokens";
export default function Dashboard() {
  return (
    <main className="min-h-screen bg-black p-6 text-white">
      <h1 className="mb-6 text-2xl font-bold">Solana Analytics</h1>
      <div className="grid gap-4 md:grid-cols-2">
        <VolumeChart />
        <TrendingTokens />
      </div>
    </main>
  );
}
Nothing complex. The components handle their own data fetching with SWR, so the page is just layout. I add new metrics by creating an API route and a component. Takes maybe 20 minutes each.
The Real Value
The paid analytics tools show you what everyone else already knows. By the time a token shows up on Birdeye’s trending page or DEX Screener’s gainers list, the move is mostly done.
When you build your own dashboard, you control what you track. You can monitor specific programs, specific wallets, specific token pairs. You can set thresholds that matter to your strategy, not some generic “top gainers” list.
I’ve been running this for a few months now. The trending tokens detector alone has paid for itself many times over. Not because it’s smarter than existing tools — it’s just faster. I see the data before it gets packaged and delayed through someone else’s platform.
Build your own analytics. The data is free. The edge comes from seeing it first.