How We Built a Real-Time Customer Portal
A B2B client came to us with a problem: their customers were emailing project managers for status updates, and project managers were spending 30% of their time writing update emails instead of shipping features.
The solution was a self-serve customer portal. Real-time project visibility, file delivery, messaging, and invoice tracking — all without a single status email.
Here's exactly how we built it.
Architecture Decisions
Why Not a Dashboard Template
We evaluated Retool, Appsmith, and three React admin templates. All had the same problem: they're built for internal tools, not customer-facing portals. The UX expectations are completely different.
Internal tools tolerate loading spinners and table-heavy layouts. Customer portals need to feel polished, branded, and responsive. We went custom.
The Stack
- Next.js 16 with App Router — Server components for initial data, client components for real-time
- Supabase — Auth, PostgreSQL, Realtime subscriptions, Storage for file delivery
- Tailwind CSS v4 — Rapid UI development with dark mode support
- Framer Motion — Subtle animations that make the portal feel alive
Data Flow
Client Browser
→ Supabase Auth (JWT)
→ Next.js Server Component (initial load via Supabase server client)
→ Client Component (Supabase Realtime subscription for live updates)
→ Supabase Storage (signed URLs for file downloads)
The key architectural decision: server components fetch the initial state, then client components subscribe to changes. This gives us fast initial page loads with SEO-friendly HTML, plus real-time updates without polling.
Authentication
We needed three access levels:
- Customer — sees only their own projects, invoices, and messages
- Team Member — sees assigned projects and can respond to messages
- Admin — sees everything, manages all projects and clients
Supabase Auth handles the session. A profiles table extends auth.users with a role column. Row Level Security policies enforce access at the database level.
```typescript
// src/lib/supabase/server.ts
import { createServerClient } from '@supabase/ssr'
import { cookies } from 'next/headers'

export const createClient = async () => {
  const cookieStore = await cookies()

  return createServerClient(
    process.env.NEXT_PUBLIC_SUPABASE_URL!,
    process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!,
    {
      cookies: {
        getAll: () => cookieStore.getAll(),
        setAll: (cookiesToSet) => {
          for (const { name, value, options } of cookiesToSet) {
            cookieStore.set(name, value, options)
          }
        },
      },
    }
  )
}
```
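The three access levels can be pictured as a pure predicate that mirrors what the RLS policies enforce at the database level. This is an illustration only — the names (`Role`, `canViewProject`, `assignedTeamIds`) are ours, not from the actual schema, and the database remains the source of truth:

```typescript
// Illustrative mirror of the RLS access rules. Names and types here are
// our own sketch, not the production schema; RLS does the real enforcement.
type Role = 'customer' | 'team_member' | 'admin'

interface Project {
  clientId: string
  assignedTeamIds: string[]
}

const canViewProject = (role: Role, userId: string, project: Project): boolean => {
  switch (role) {
    case 'admin':
      return true // admins see everything
    case 'team_member':
      return project.assignedTeamIds.includes(userId) // only assigned projects
    case 'customer':
      return project.clientId === userId // only their own projects
  }
}
```

Writing the equivalent checks once as RLS policies means every query path — server components, client subscriptions, storage — gets the same answer without repeating this logic in application code.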
Real-Time Project Updates
The portal's killer feature: when a developer pushes a milestone to "completed," the customer sees it update instantly without refreshing.
Server Component: Initial Load
```tsx
const ProjectDashboard = async () => {
  const supabase = await createClient()

  const { data: projects } = await supabase
    .from('projects')
    .select(`
      *,
      milestones (*, deliverables (*)),
      messages (*, sender:profiles (full_name, avatar_url))
    `)
    .order('updated_at', { ascending: false })

  return <ProjectList initialProjects={projects ?? []} />
}
```
Client Component: Real-Time Subscription
```tsx
'use client'

import { useEffect, useMemo, useState } from 'react'
import { createBrowserClient } from '@supabase/ssr'

const ProjectList = ({ initialProjects }) => {
  const [projects, setProjects] = useState(initialProjects)
  // Memoize the client so a new instance (and a new subscription)
  // isn't created on every render
  const supabase = useMemo(() => createBrowserClient(/* ... */), [])

  useEffect(() => {
    const channel = supabase
      .channel('project-updates')
      .on(
        'postgres_changes',
        { event: '*', schema: 'public', table: 'milestones' },
        (payload) => {
          // Update the specific milestone in our local state
          setProjects((prev) => updateMilestoneInProjects(prev, payload))
        }
      )
      .subscribe()

    return () => { supabase.removeChannel(channel) }
  }, [supabase])

  return (/* render projects */)
}
```
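The `updateMilestoneInProjects` helper isn't shown above; a minimal sketch, assuming each project carries a `milestones` array and the Realtime payload delivers the changed row as `payload.new` with `id` and `project_id` columns. This version handles UPDATE events; INSERT and DELETE would need their own branches:

```typescript
// Sketch of the helper referenced in the subscription callback. Assumes the
// changed row arrives as `payload.new`; handles UPDATEs only.
interface Milestone {
  id: string
  project_id: string
  status: string
  [key: string]: unknown
}

interface Project {
  id: string
  milestones: Milestone[]
  [key: string]: unknown
}

const updateMilestoneInProjects = (
  projects: Project[],
  payload: { new: Milestone }
): Project[] =>
  projects.map((project) =>
    project.id === payload.new.project_id
      ? {
          ...project,
          // Merge the changed columns into the matching milestone
          milestones: project.milestones.map((m) =>
            m.id === payload.new.id ? { ...m, ...payload.new } : m
          ),
        }
      : project
  )
```

Returning fresh arrays (rather than mutating in place) is what lets React detect the change and re-render only the affected project.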
Handling Reconnection
WebSocket connections drop. Mobile users switch networks. Supabase Realtime handles reconnection automatically, but there's a gap: events that occurred while disconnected are lost.
Our solution: on reconnection, re-fetch the full dataset from the server. The status callback tells us when the subscription state changes:
```typescript
// wasDisconnected is a ref so the callback always sees the current value;
// refetchProjects re-runs the initial server query to fill the gap
channel.subscribe((status) => {
  if (status === 'SUBSCRIBED') {
    // Connected — we're receiving live updates again
    if (wasDisconnected.current) {
      wasDisconnected.current = false
      refetchProjects() // recover events missed while offline
    }
  }
  if (status === 'CHANNEL_ERROR' || status === 'TIMED_OUT') {
    // Connection lost — Supabase auto-retries; remember that events were missed
    wasDisconnected.current = true
  }
})
```
File Delivery
Customers need to download deliverables — design mockups, code packages, documents. We use Supabase Storage with signed URLs.
Why signed URLs? Files are private. A signed URL grants time-limited access (we use 1 hour) without exposing storage credentials.
```typescript
const getDeliverableUrl = async (filePath: string) => {
  const supabase = await createClient()

  const { data } = await supabase.storage
    .from('deliverables')
    .createSignedUrl(filePath, 3600) // expires in 1 hour

  return data?.signedUrl
}
```
Upload flow for team members:
```typescript
const uploadDeliverable = async (projectId: string, file: File) => {
  const supabase = createBrowserClient(/* ... */)
  const path = `${projectId}/${Date.now()}-${file.name}`

  const { error } = await supabase.storage
    .from('deliverables')
    .upload(path, file, { upsert: false })

  if (!error) {
    await supabase.from('deliverables').insert({
      project_id: projectId,
      file_url: path,
      file_name: file.name,
      uploaded_by: (await supabase.auth.getUser()).data.user?.id,
    })
  }
}
```
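One caveat with the path above: it interpolates `file.name` directly, so names with spaces or unusual characters become awkward storage keys. A small sanitizer (our addition, not part of the original flow — the name `makeDeliverablePath` is hypothetical) keeps keys predictable while preserving the `projectId/timestamp-name` shape:

```typescript
// Our addition: normalize user-supplied file names before building a storage
// key. Preserves the `${projectId}/${timestamp}-${name}` shape used above.
const makeDeliverablePath = (
  projectId: string,
  fileName: string,
  now: number = Date.now()
): string => {
  const safeName = fileName
    .toLowerCase()
    .replace(/[^a-z0-9.-]+/g, '-') // collapse anything unusual into a dash
    .replace(/^-+|-+$/g, '') // trim leading/trailing dashes
  return `${projectId}/${now}-${safeName}`
}
```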
Messaging
Each project has a message thread. Messages appear in real-time for both the customer and the team.
The critical UX detail: optimistic updates. When a user sends a message, it appears immediately in their UI before the server confirms. If the insert fails, we remove the optimistic message and show an error.
```typescript
const sendMessage = async (content: string) => {
  const optimisticId = crypto.randomUUID()

  const optimisticMessage = {
    id: optimisticId,
    content,
    sender: currentUser,
    created_at: new Date().toISOString(),
    pending: true,
  }
  setMessages((prev) => [...prev, optimisticMessage])

  const { error } = await supabase.from('messages').insert({
    project_id: projectId,
    content,
    sender_id: currentUser.id,
  })

  if (error) {
    // Roll back the optimistic message and surface the failure
    setMessages((prev) => prev.filter((m) => m.id !== optimisticId))
    toast.error('Failed to send message')
  }
}
```
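There's a second wrinkle: the sender also receives the Realtime INSERT event for their own message, so the optimistic copy and the confirmed row can briefly coexist in state. One way to reconcile (a sketch — field names like `sender_id` and the helper name `mergeIncomingMessage` are our assumptions) is to replace a pending message from the same sender with matching content:

```typescript
// Sketch of reconciling an incoming Realtime row with optimistic state.
// Field names are assumed from the insert shape above.
interface Message {
  id: string
  content: string
  sender_id: string
  created_at: string
  pending?: boolean
}

const mergeIncomingMessage = (messages: Message[], incoming: Message): Message[] => {
  // Already have the confirmed row — nothing to do
  if (messages.some((m) => m.id === incoming.id)) return messages

  // Swap a matching pending message for the confirmed one
  const pendingIndex = messages.findIndex(
    (m) => m.pending && m.sender_id === incoming.sender_id && m.content === incoming.content
  )
  if (pendingIndex === -1) return [...messages, incoming]

  return messages.map((m, i) => (i === pendingIndex ? incoming : m))
}
```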
Notifications
When something changes — a milestone completes, a file is uploaded, a message arrives — the customer gets notified via email (Resend) and in-app notifications.
We use Supabase Database Functions (triggers) to generate notifications:
```sql
CREATE OR REPLACE FUNCTION notify_milestone_complete()
RETURNS TRIGGER AS $$
BEGIN
  -- IS DISTINCT FROM is NULL-safe, unlike != (a NULL old status would
  -- otherwise silently suppress the notification)
  IF NEW.status = 'completed' AND OLD.status IS DISTINCT FROM 'completed' THEN
    INSERT INTO public.notifications (user_id, type, title, body, link)
    SELECT
      p.client_id,
      'project_update',
      'Milestone Completed',
      NEW.title || ' has been marked as complete',
      '/customer/projects/' || NEW.project_id
    FROM public.projects p
    WHERE p.id = NEW.project_id;
  END IF;
  RETURN NEW;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;

-- The function only runs once it's attached to the table:
CREATE TRIGGER on_milestone_complete
  AFTER UPDATE ON public.milestones
  FOR EACH ROW EXECUTE FUNCTION notify_milestone_complete();
```
Results
After three months in production:
- Status update emails dropped 87% — customers check the portal instead
- Average project visibility score increased from 3.2 to 4.7 (customer survey, scale of 5)
- File delivery time reduced from 24 hours to instant — no more email attachment size limits
- Message response time dropped from 8 hours to 45 minutes — real-time notifications
The customer portal is now our default offering for all B2B engagements. Every new project gets a portal from day one.
Lessons Learned
- Real-time is table stakes — Customers expect live updates in 2026. Polling and manual refresh feel broken.
- Optimistic updates are non-negotiable — Any perceptible delay in sending a message or updating state feels like a bug.
- RLS simplifies everything — We never write authorization checks in application code. The database handles it.
- Signed URLs beat public buckets — The extra complexity of signed URLs is worth the security guarantee.
- Server components + Realtime is the right split — Fast initial load, live updates, zero loading spinners on navigation.
Austin Coders
We build SaaS & AI apps that actually scale. React, Next.js, and AI-powered solutions for startups and enterprises.