How to Build a Custom "Advanced Reporting" Dashboard for Marketing ROI

If you want a clean, usable view of how every marketing penny turns into revenue, building a custom "Advanced Reporting" dashboard for Marketing ROI is a perfect project. In this post I'll walk you through a practical, hands-on approach: architecture, data sources, code examples, visualizations, automation, and SEO & sharing tips, so you can ship something useful for your team and integrate it with Magento 2 stores (including stock modules like Force Product Stock Status).

Why build a custom dashboard?

Off-the-shelf analytics tools are great, but they often miss the context of your store: custom product attributes, stock extensions, promotions, and how Magento stores campaign metadata. A custom dashboard lets you:

  • Create metrics tailored to your campaigns (ROAS, CAC, incremental revenue).
  • Combine sales + stock + marketing data (e.g., Force Product Stock Status) for smarter decisions.
  • Provide near real-time visualization for quick optimization.
  • Automate scheduled reports for leadership or ad platforms.
  • Make the dashboard SEO-friendly and shareable for internal and external stakeholders.

Overview: architecture and components

At a high level you’ll want 4 layers:

  1. Data sources: Magento 2 database + extensions (orders, order items, product stock, Force Product Stock Status, Google Ads/UA/GA4 export, email platform).
  2. ETL/aggregation: lightweight service to extract, normalize and store optimized metrics (a small analytics DB like Postgres, or ClickHouse if you have high volume).
  3. API layer: endpoints to serve aggregated metrics to the UI (JSON, with filters and pagination).
  4. UI: interactive dashboard (Chart.js / D3 + DataTables) with filters, date ranges and real-time updates.

Step 1 — Identify the key data sources in Magento 2

For Marketing ROI you typically need:

  • Orders and order lines (sales_order and sales_order_item)
  • Customer and acquisition metadata (if you capture UTM params into the order or a custom table)
  • Catalog inventory for stock levels (cataloginventory_stock_item)
  • Any custom stock or availability attributes from modules like Force Product Stock Status

Common Magento tables and fields to use (examples):

  • sales_order.entity_id, sales_order.increment_id, sales_order.created_at, sales_order.grand_total, sales_order.customer_id
  • sales_order_item.order_id, sales_order_item.sku, sales_order_item.name, sales_order_item.qty_ordered, sales_order_item.price
  • cataloginventory_stock_item.product_id, cataloginventory_stock_item.qty, cataloginventory_stock_item.is_in_stock

If you capture UTM parameters at checkout, they might be saved as order attributes (e.g. sales_order columns like marketing_source, marketing_campaign). If not, plan to start capturing them (I'll show a small example later).

Step 2 — Designing the metrics

Before coding, define the metrics you need. Here are the high-impact ones for marketing ROI:

  • Revenue by campaign, channel, and UTM source
  • Orders and conversion rate by campaign
  • Average Order Value (AOV) per campaign
  • Customer Acquisition Cost (CAC) per channel (requires ad spend import)
  • Return on Ad Spend (ROAS) = revenue / ad spend
  • Incremental revenue: revenue from campaign cohorts vs baseline
  • Stock-aware metrics: % of orders impacted by out-of-stock or forced stock status (Force Product Stock Status)

Example metric definitions (precise):

  • ROAS (channel) = SUM(revenue WHERE channel) / SUM(ad_spend WHERE channel)
  • CAC (channel) = SUM(ad_spend WHERE channel) / COUNT(new_customers_acquired_by_channel)
  • Stock Risk = COUNT(order_items WHERE product_force_status = 'out_of_stock' OR is_in_stock = 0) / COUNT(order_items)
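To keep these definitions unambiguous, the same formulas can be expressed as small helper functions. This is a sketch; the input shapes (plain aggregated numbers and a simple item array) are assumptions, not part of the schema above:

```javascript
// Metric helpers mirroring the definitions above; guard against divide-by-zero.

function roas(revenue, adSpend) {
  // ROAS = revenue / ad spend; null when there is no spend to divide by
  return adSpend > 0 ? revenue / adSpend : null;
}

function cac(adSpend, newCustomers) {
  // CAC = ad spend / newly acquired customers
  return newCustomers > 0 ? adSpend / newCustomers : null;
}

function stockRisk(orderItems) {
  // Share of order items that are forced out of stock OR flagged not in stock.
  // Each item: { forceStatus: string|null, isInStock: 0|1 }
  if (orderItems.length === 0) return 0;
  const risky = orderItems.filter(
    i => i.forceStatus === 'out_of_stock' || i.isInStock === 0
  ).length;
  return risky / orderItems.length;
}
```

These run anywhere you aggregate (ETL, API layer, or the browser), so the dashboard and the scheduled reports compute identical numbers.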

Step 3 — Data model and aggregation queries

Keep the analytics queries simple and pre-aggregate where possible (daily rollups, campaign-level rollups). Here are useful SQL examples against Magento tables. These assume you either run them on a replica DB or copy the data into a separate analytics database.

Get daily revenue by campaign

-- Example SQL (Postgres / MySQL compatible)
SELECT
  DATE(o.created_at) AS day,
  COALESCE(o.marketing_campaign, 'unknown') AS campaign,
  SUM(o.grand_total) AS revenue,
  COUNT(DISTINCT o.entity_id) AS orders
FROM sales_order o
WHERE o.state = 'complete'
  AND o.created_at BETWEEN '2025-01-01' AND '2025-12-31'
GROUP BY day, campaign
ORDER BY day, campaign;

If marketing_campaign is not a column in sales_order, you should add it (see the next section on capturing UTM params).

Get product stock status per order item using Force Product Stock Status if available

Force Product Stock Status modules can store a product attribute or a separate table. A robust approach is to check the product attribute first and then fall back to the stock item.

-- Join product attribute (if stored as an EAV attribute) and fallback to cataloginventory
-- This is simplified pseudo-SQL using Magento conventions
SELECT
  oi.order_id,
  oi.product_id,
  oi.sku,
  COALESCE(fp.value, si.is_in_stock) AS final_stock_status
FROM sales_order_item oi
LEFT JOIN cataloginventory_stock_item si ON si.product_id = oi.product_id
LEFT JOIN (
  -- hypothetical table where extension stores force status
  SELECT product_id, value FROM force_product_stock_status
) fp ON fp.product_id = oi.product_id
LIMIT 100;

Adjust the fp table name or join to the EAV tables (catalog_product_entity_int) depending on how the extension stores the status.

Step 4 — Capturing UTM & campaign metadata in Magento

If you haven't been saving UTMs, add a small piece of code to store UTM params on the quote and order. Two pragmatic approaches:

  • Frontend: capture UTMs with JavaScript into a cookie, then write them to the quote during checkout
  • Backend: use an observer to copy custom quote attributes onto the order on sales_order_place_after

Example JavaScript to capture UTMs into a cookie:

<script>
(function(){
  function getParam(name){
    var m = window.location.search.match(new RegExp('(?:\\?|&)' + name + '=([^&]+)'));
    return m ? decodeURIComponent(m[1]) : null;
  }
  var utm_source = getParam('utm_source');
  var utm_medium = getParam('utm_medium');
  var utm_campaign = getParam('utm_campaign');
  if (utm_source || utm_medium || utm_campaign){
    var data = {
      s: utm_source || '',
      m: utm_medium || '',
      c: utm_campaign || ''
    };
    document.cookie = 'mf_utm=' + btoa(JSON.stringify(data)) + '; path=/; max-age=' + (60*60*24*30);
  }
})();
</script>

Server-side, Magento can read that cookie during checkout and save it to the quote and then to the order. Here is a conceptual PHP snippet for a Magento 2 observer:

// In your Observer for checkout_onepage_controller_success_action or the
// quote-to-order transfer. $this->request is an injected
// \Magento\Framework\App\RequestInterface.
public function execute(
  \Magento\Framework\Event\Observer $observer
) {
  $order = $observer->getEvent()->getOrder();
  $utmCookie = $this->request->getCookie('mf_utm');
  if ($utmCookie) {
    $data = json_decode(base64_decode($utmCookie), true);
    if (!empty($data['c'])) {
      $order->setData('marketing_campaign', $data['c']);
    }
    if (!empty($data['s'])) {
      $order->setData('marketing_source', $data['s']);
    }
    if (!empty($data['m'])) {
      $order->setData('marketing_medium', $data['m']);
    }
    // Model::save() is deprecated; in a real module, inject
    // OrderRepositoryInterface and call $this->orderRepository->save($order).
    $order->save();
  }
}
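For completeness, an observer like this gets wired up via an events.xml file in your module. This is a minimal sketch; the module name Vendor_Module, the observer name, and the class path are placeholders, and the event you subscribe to depends on where in the flow you want to copy the data:

```xml
<!-- app/code/Vendor/Module/etc/events.xml (names are placeholders) -->
<config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:noNamespaceSchemaLocation="urn:magento:framework:Event/etc/events.xsd">
    <event name="sales_order_place_after">
        <observer name="vendor_module_save_utm"
                  instance="Vendor\Module\Observer\SaveUtmToOrder" />
    </event>
</config>
```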

Once campaign data is captured on orders, the SQL rollups become straightforward.

Step 5 — Building a simple ETL pipeline

For clarity I'll show a minimal pipeline using Node.js to pull orders via the Magento REST API and write daily aggregations into Postgres. It's easy to run as a cron job or in a container.

Minimal Node ETL (concept)

// package.json dependencies: node-fetch, pg
// (Node 18+ ships a global fetch, so node-fetch is optional there)
const fetch = require('node-fetch');
const { Client } = require('pg');

const MAGENTO_BASE = process.env.MAGENTO_BASE; // e.g. https://store.example.com
const MAGENTO_TOKEN = process.env.MAGENTO_TOKEN; // integration token
const PG_CONN = process.env.PG_CONN; // postgres connection

async function fetchOrders(updatedAfter){
  const url = `${MAGENTO_BASE}/rest/V1/orders?searchCriteria[filter_groups][0][filters][0][field]=updated_at&searchCriteria[filter_groups][0][filters][0][value]=${encodeURIComponent(updatedAfter)}&searchCriteria[filter_groups][0][filters][0][condition_type]=gt`;
  const res = await fetch(url, { headers: { 'Authorization': `Bearer ${MAGENTO_TOKEN}` } });
  const body = await res.json();
  return body.items || [];
}

async function upsertDailyAggregates(orders){
  const client = new Client({ connectionString: PG_CONN });
  await client.connect();
  for (const o of orders){
    const day = o.created_at.split('T')[0];
    // a custom order column may instead surface under extension_attributes,
    // depending on how you expose it through the REST API
    const campaign = o.marketing_campaign || 'unknown';
    const revenue = parseFloat(o.grand_total) || 0;
    await client.query(`
      INSERT INTO analytics.daily_campaign (day, campaign, revenue, orders)
      VALUES ($1,$2,$3,1)
      ON CONFLICT (day,campaign) DO UPDATE
      SET revenue = analytics.daily_campaign.revenue + EXCLUDED.revenue,
          orders = analytics.daily_campaign.orders + 1;
    `, [day, campaign, revenue]);
  }
  await client.end();
}

(async ()=>{
  const orders = await fetchOrders('2025-09-01 00:00:00');
  await upsertDailyAggregates(orders);
})();

This illustrates the idea: pull new/updated orders, map each to your campaign field, and update an analytics table.
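One gotcha: /rest/V1/orders pages its results, so a production ETL should loop over pages rather than take the first response. A sketch of the URL construction, using Magento's searchCriteria query convention (pageSize is a tuning assumption):

```javascript
// Build a paged Magento orders-search URL for a given "updated after" cutoff.
function buildOrdersUrl(base, updatedAfter, page, pageSize) {
  const p = new URLSearchParams();
  p.set('searchCriteria[filter_groups][0][filters][0][field]', 'updated_at');
  p.set('searchCriteria[filter_groups][0][filters][0][value]', updatedAfter);
  p.set('searchCriteria[filter_groups][0][filters][0][condition_type]', 'gt');
  p.set('searchCriteria[pageSize]', String(pageSize));
  p.set('searchCriteria[currentPage]', String(page));
  return `${base}/rest/V1/orders?${p.toString()}`;
}

// Usage sketch: keep requesting pages until a short page comes back.
// let page = 1, items;
// do {
//   const res = await fetch(buildOrdersUrl(MAGENTO_BASE, cutoff, page, 100),
//     { headers: { 'Authorization': `Bearer ${MAGENTO_TOKEN}` } });
//   items = (await res.json()).items || [];
//   await upsertDailyAggregates(items);
//   page += 1;
// } while (items.length === 100);
```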

Step 6 — API endpoints for the dashboard

Build a couple of optimized endpoints to serve the UI: e.g., /api/metrics/daily?start=...&end=...&campaign=... and /api/metrics/summary?period=30d

A simple Express endpoint example that returns daily revenue by campaign from Postgres:

app.get('/api/metrics/daily', async (req,res)=>{
  const { start, end, campaign } = req.query;
  const where = [];
  const params = [];
  if (start) { params.push(start); where.push(`day >= $${params.length}`); }
  if (end) { params.push(end); where.push(`day <= $${params.length}`); }
  if (campaign) { params.push(campaign); where.push(`campaign = $${params.length}`); }
  const q = `SELECT day, campaign, revenue, orders FROM analytics.daily_campaign ${where.length ? 'WHERE ' + where.join(' AND ') : ''} ORDER BY day`;
  const { rows } = await pg.query(q, params);
  res.json(rows);
});

Step 7 — Frontend: real-time interactive visualization

For an interactive dashboard you can use Chart.js for charts and DataTables for tabular drill-downs. Provide date pickers and campaign selectors. For near real-time updates, either use short polling (every 10-30s) or WebSockets if you need push updates.

Minimal HTML + JS example (Chart.js + fetch API)

<div id="controls">
  <label>Start<input type="date" id="start" /></label>
  <label>End<input type="date" id="end" /></label>
  <select id="campaignSel"><option value="">All</option></select>
  <button id="refresh">Refresh</button>
</div>
<canvas id="revenueChart" width="1000" height="400"></canvas>

<script src="https://cdn.jsdelivr.net/npm/chart.js"></script>
<script>
async function loadData(){
  const start = document.getElementById('start').value;
  const end = document.getElementById('end').value;
  const campaign = document.getElementById('campaignSel').value;
  const params = new URLSearchParams({ start, end, campaign });
  const res = await fetch('/api/metrics/daily?' + params.toString());
  const rows = await res.json();
  // build one shared date axis first, then align each campaign's points to it
  const labels = [...new Set(rows.map(r => r.day))];
  const byCampaign = {};
  rows.forEach(r => {
    if (!byCampaign[r.campaign]) byCampaign[r.campaign] = {};
    byCampaign[r.campaign][r.day] = parseFloat(r.revenue);
  });
  const datasets = Object.keys(byCampaign).map((k, idx) => ({
    label: k,
    // null = no revenue recorded for that campaign on that day
    data: labels.map(day => byCampaign[k][day] ?? null),
    fill: false,
    borderColor: `hsl(${(idx*60)%360},60%,50%)`
  }));
  if (window.revChart) { window.revChart.data.labels = labels; window.revChart.data.datasets = datasets; window.revChart.update(); }
  else {
    const ctx = document.getElementById('revenueChart').getContext('2d');
    window.revChart = new Chart(ctx, { type: 'line', data: { labels, datasets }, options: { responsive: true } });
  }
}

document.getElementById('refresh').addEventListener('click', loadData);
loadData();
</script>

This basic interface gives you multiple campaigns on one chart. Add tooltips and on-click drill-downs to fetch order-level data for any day/campaign.

Step 8 — Real-time considerations

If you need true real time: stream events into a message queue (e.g., Kafka, Redis Streams) when orders are placed, and have a small consumer update an in-memory store (Redis) and push via WebSockets. For many shops, polling or a 1-minute aggregation is plenty.

Step 9 — Periodic reports and automation

Leadership usually wants a scheduled PDF or CSV summary every morning. Options:

  • Server-side generator: produce a PDF (wkhtmltopdf, Puppeteer) or CSV from aggregated data and send via email.
  • Magento cron: implement a console command that runs daily and sends reports.
  • Use a scheduler in your container (cron or Kubernetes CronJob) to run the ETL & report generator.

Example: simple Linux cron entry that runs a Node script every day at 6am

0 6 * * * /usr/bin/node /opt/analytics/scripts/daily_report.js >> /var/log/mf/daily_report.log 2>&1

And a minimal Node script that creates a CSV and sends via SMTP:

const fs = require('fs');
const { Client } = require('pg');
const nodemailer = require('nodemailer');
(async ()=>{
  const client = new Client({ connectionString: process.env.PG_CONN });
  await client.connect();
  const { rows } = await client.query(`SELECT day, campaign, revenue, orders FROM analytics.daily_campaign WHERE day = current_date - 1`);
  const csv = ['day,campaign,revenue,orders', ...rows.map(r => `${r.day},"${r.campaign}",${r.revenue},${r.orders}`)].join('\n');
  fs.writeFileSync('/tmp/daily_report.csv', csv);
  // keep SMTP credentials in environment variables, never hardcoded
  const transporter = nodemailer.createTransport({ host: 'smtp.example.com', port: 587, auth: { user: process.env.SMTP_USER, pass: process.env.SMTP_PASS } });
  await transporter.sendMail({ from: 'noreply@example.com', to: 'leadership@example.com', subject: 'Daily Marketing Report', text: 'Attached', attachments:[{ path: '/tmp/daily_report.csv'}] });
  await client.end();
})();

If you prefer PDFs, use Puppeteer to render an HTML dashboard page and save a PDF snapshot to attach to your email.

Step 10 — Visualization tips for impact

Keep the UI focused:

  • Top row: summary KPIs (Revenue, ROAS, CAC, Conversion rate)
  • Middle: time-series charts (revenue by campaign, orders)
  • Bottom: detailed table with quick actions (export, create a ticket for stock issues)

Add stock-aware visual cues: highlight campaigns where a significant portion of revenue comes from products with forced out-of-stock statuses. Add a column for forced_stock_flag or is_in_stock and color rows red/orange when stock is an issue.

Step 11 — SEO & sharing optimization for the dashboard

Even internal dashboards benefit from SEO-like optimizations because they make the page easier to share and discover inside the company, and often you'll want to provide public snapshots or embedded reports for partners.

Here's a practical checklist to make your dashboard shareable and SEO-friendly:

  • Friendly URL: use a clean slug (e.g. /analytics/marketing-roi or /articles/advanced-reporting-dashboard-marketing-roi for a public writeup).
  • Descriptive meta title and meta description for each dashboard page (server-side render these with the date range or campaign name in the title).
  • OpenGraph tags so shared links on Slack/Teams/LinkedIn show a useful preview card (title, description, thumbnail).
  • Canonical URLs for saved snapshots to avoid duplication.
  • Structured data (JSON-LD) if you publish a public case study or report page so search engines understand it.
  • Sitemap entries for public reports or snapshot pages so they're indexed if intended.
  • Allow per-report permalinks (share a URL that encodes the filters like ?start=...&end=...&campaign=...)
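The permalink idea is easy to implement: serialize the active filters into the query string and restore them on page load. A small sketch (the filter names match the /api/metrics/daily endpoint shown earlier; the function names are mine):

```javascript
// Encode the dashboard's filter state into a shareable query string, and back.
function filtersToPermalink(filters) {
  const p = new URLSearchParams();
  for (const key of ['start', 'end', 'campaign']) {
    if (filters[key]) p.set(key, filters[key]);
  }
  return '?' + p.toString();
}

function permalinkToFilters(search) {
  const p = new URLSearchParams(search);
  return {
    start: p.get('start') || '',
    end: p.get('end') || '',
    campaign: p.get('campaign') || ''
  };
}
```

On load, call permalinkToFilters(location.search) to pre-fill the date pickers and campaign selector, and update history.replaceState with filtersToPermalink on every refresh so the current view is always copy-pasteable.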

Example meta tags (server-render):

<meta name="title" content="Advanced Reporting Dashboard for Marketing ROI | Magefine" />
<meta name="description" content="Build a custom Magento 2 advanced reporting dashboard for Marketing ROI. Integrate Force Product Stock Status, visualise in real time, automate reports and optimise SEO for sharing." />
<meta property="og:title" content="Advanced Reporting Dashboard for Marketing ROI | Magefine" />
<meta property="og:description" content="Custom dashboard for Marketing ROI: integrate Magento sales and stock modules, create custom metrics, visualise in real time, and automate reporting." />
<meta property="og:url" content="https://magefine.com/articles/advanced-reporting-dashboard-marketing-roi" />
<link rel="canonical" href="https://magefine.com/articles/advanced-reporting-dashboard-marketing-roi" />

Step 12 — Performance & hosting considerations (Magento context)

Analytics pages can be heavy. Keep these points in mind for Magento hosting:

  • Serve the dashboard from a separate service or subdomain (analytics.magefine.com), or use a lightweight app server, to avoid overloading Magento storefront servers.
  • Use caching for public snapshots and pre-computed aggregates for charts.
  • Use Redis for caching and the session store, and Elasticsearch for fast product-level searches if you need them inside the dashboard.
  • If you host on Magefine, choose hosting with enough CPU and RAM for data processing, or use a managed DB for your analytics store.

Step 13 — Integrating Force Product Stock Status specifically

Force Product Stock Status modules typically let merchants set the displayed stock status independently from real inventory. For ROI decisions this matters: you may be promoting products that show as "In Stock" while the inventory provider shows 0 available.

How to incorporate:

  1. Discover how the module stores values: check the extension docs or look up the product attribute or table created by the extension.
  2. If stored as a product attribute (common), join via the EAV tables (catalog_product_entity_int / varchar) using the attribute_id for the extension attribute.
  3. If the extension uses a custom table like force_product_stock_status(product_id, status), join that table.
  4. Define resolved_stock_status = CASE WHEN force_status IS NOT NULL THEN force_status ELSE cataloginventory_stock_item.is_in_stock END
  5. Expose this as a column in your order-line rollups so you can compute stock-aware revenue and highlight risky campaigns.

Example SQL snippet to compute resolved stock status:

SELECT
  oi.order_id,
  oi.product_id,
  COALESCE(fp.status, si.is_in_stock) AS resolved_stock_status,
  oi.qty_ordered,
  oi.price
FROM sales_order_item oi
LEFT JOIN cataloginventory_stock_item si ON si.product_id = oi.product_id
LEFT JOIN force_product_stock_status fp ON fp.product_id = oi.product_id;

Now compute the revenue coming from items where resolved_stock_status = 0 or 'out_of_stock', and report it by campaign to flag promotions that may cause fulfillment problems.
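The same CASE logic can live in application code, which is handy when the rollup runs in your ETL rather than in SQL. A sketch, where the input shape ({ campaign, revenue, forceStatus, isInStock }) is an assumption about your rollup rows:

```javascript
// Mirror the SQL COALESCE: a forced status wins, otherwise fall back to the
// inventory flag.
function resolveStockStatus(forceStatus, isInStock) {
  return forceStatus !== null && forceStatus !== undefined ? forceStatus : isInStock;
}

// Per-campaign revenue at risk: revenue on items whose resolved status is
// out of stock (0 from cataloginventory, or the extension's string status).
function stockRiskByCampaign(items) {
  const out = {};
  for (const i of items) {
    const resolved = resolveStockStatus(i.forceStatus, i.isInStock);
    const risky = resolved === 0 || resolved === 'out_of_stock';
    if (!out[i.campaign]) out[i.campaign] = { revenue: 0, atRisk: 0 };
    out[i.campaign].revenue += i.revenue;
    if (risky) out[i.campaign].atRisk += i.revenue;
  }
  return out;
}
```

A campaign whose atRisk share is high is exactly the "promotion that may cause fulfillment problems" case described above, and is worth a visual flag on the dashboard.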

Step 14 — Security, roles and collaboration

Dashboards often expose sensitive info. Follow these rules:

  • Authenticate and authorize: integrate with SSO (SAML / OAuth) if possible and restrict who can view/export raw data.
  • Audit: log exports and scheduled report sends.
  • Mask PII when exporting to external partners.
  • Use HTTPS everywhere and secure DB credentials in environment variables or a vault.

Step 15 — Putting it all together: an implementation checklist

  1. Capture UTMs into orders (start today).
  2. Decide on the analytics DB and ETL cadence (near real-time vs hourly).
  3. Implement the ETL to daily_campaign and order_item_rollups (include resolved_stock_status).
  4. Build optimized API endpoints for charts and tables.
  5. Implement the frontend with Chart.js and DataTables, and add filters and permalinks.
  6. Schedule daily automated reports with CSV/PDF attachments and keep logs of sends.
  7. Secure the dashboard and add role-based access and SSO.
  8. Optimize SEO/Open Graph for public snapshots and share permalinks.
  8. Optimize SEO/open graph for public snapshots and share-permalinks.

Quick troubleshooting and tips

  • Slow queries? Pre-aggregate. Use materialized views or daily rollups and refresh them regularly.
  • Missing campaign data? Add the cookie capture code and run a backfill where you can match referrers and timestamp heuristics.
  • Discrepancies with GA? Understand attribution windows and refunds: your revenue comes from orders, and GA may attribute differently.
  • Stock mismatch? Use resolved_stock_status and include a small reconciliation job that compares forced status vs inventory quantity weekly.

Final notes — ship iteratively

Start with the core: revenue by campaign and ROAS. Add stock awareness next (Force Product Stock Status integration). Then iterate toward automation and real time. Keep the UI simple: a few meaningful filters and a clear export button will make your dashboard useful from day one.

If you'd like, I can help produce the exact migration scripts (schema for analytics tables), provide a ready-to-deploy Node/Express app that fetches data from Magento, or a reference frontend with Chart.js and server-side rendering that's SEO-ready for Magefine hosting. Tell me which part you want first and I'll produce the code and configuration you can run this week.

Good luck and happy reporting. This is the sort of feature that quickly pays for itself when your marketing team can see which campaigns actually generate profitable orders and which ones cause stock headaches.

Author: Magefine Insights — practical tips for Magento 2 sellers and agencies.