Hyvä vs Luma: what the performance benchmarks actually show
We ran Lighthouse across 15 pages on identical Magento 2 environments — same server, same catalogue, same product data. The only difference was the frontend theme. Here is what the data shows.
Methodology
Both storefronts run on the same Hypernode server with identical Magento 2.4.x installations and the same product catalogue. Lighthouse was run three times per page through Google's PageSpeed Insights API, and the results averaged. All tests used the mobile profile, which is where the performance gap between the two themes is most pronounced.
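The methodology above can be scripted. A minimal sketch, assuming the public PageSpeed Insights v5 endpoint (an API key is required for sustained use) and the standard Lighthouse response layout:

```python
import json
import statistics
from urllib.request import urlopen

# Real PSI v5 endpoint; append &key=... for anything beyond light testing.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_mobile_score(url: str) -> float:
    """Fetch one mobile Lighthouse performance score (0-100) for a page."""
    query = f"{PSI_ENDPOINT}?url={url}&strategy=mobile&category=performance"
    with urlopen(query) as resp:
        data = json.load(resp)
    # Lighthouse reports category scores as 0-1; scale to the familiar 0-100.
    return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

def average_runs(scores: list[float]) -> float:
    """Average repeated runs for one page, as in the methodology above."""
    return round(statistics.mean(scores), 1)
```

Calling `fetch_mobile_score` three times per page and feeding the results to `average_runs` reproduces the three-run averaging described here.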
The results
Mobile Lighthouse scores, averaged across 3 runs per page using the PageSpeed Insights API.
Why the gap is so large
The performance difference comes down to JavaScript. Luma loads KnockoutJS, RequireJS, and a large number of RequireJS-managed modules on every page. Even pages with no interactive elements pay the full JavaScript cost, because the theme declares its entire module graph regardless of what the page actually uses: the browser must download, parse, and execute all of it before the page becomes fully interactive.
Hyvä replaces this with Alpine.js — a lightweight reactive framework at around 15 KB — and Tailwind CSS. Instead of loading all JavaScript upfront, Alpine.js initialises only the components present on the current page. The result is dramatically lower JavaScript parse and execution time, which directly improves Largest Contentful Paint and First Contentful Paint scores.
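One way to see this payload gap for yourself is to export a HAR file from DevTools for each storefront and total the JavaScript bytes. A small sketch, following the HAR 1.2 field layout (`_transferSize` is a Chrome-specific extension field, so the code falls back to `bodySize`):

```python
import json

def total_js_bytes(har: dict) -> int:
    """Sum the transferred bytes of JavaScript responses in a HAR capture."""
    total = 0
    for entry in har["log"]["entries"]:
        resp = entry["response"]
        mime = resp.get("content", {}).get("mimeType", "")
        if "javascript" in mime:
            # Prefer Chrome's _transferSize; fall back to uncompressed bodySize.
            size = resp.get("_transferSize", resp.get("bodySize", 0))
            total += max(size, 0)
    return total
```

Running this against a HAR from each theme's homepage makes the RequireJS-versus-Alpine.js difference concrete in kilobytes rather than scores.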
Core Web Vitals in practice
Beyond Lighthouse scores, the most visible difference is Cumulative Layout Shift. Luma frequently has elements that visibly jump as the page loads — navigation, pricing, and product images can all shift position during render. Hyvä is designed to avoid this, and in our tests it consistently returned a CLS score of 0 across all page types.
This matters in two ways. First, layout shift is a frustrating user experience — particularly on mobile where a shift can cause an accidental tap. Second, Google uses CLS as a ranking signal through Core Web Vitals. Stores with poor CLS are at a disadvantage in organic search compared to stores that pass.
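For readers curious how a CLS number is actually produced: layout shifts are grouped into "session windows" (a window closes after a 1-second gap between shifts, or once it spans 5 seconds), and CLS is the worst window's sum. A simplified sketch of that logic, assuming shifts caused by recent user input have already been filtered out, as the real metric requires:

```python
def cls_score(shifts: list[tuple[float, float]]) -> float:
    """Compute CLS as the worst session window of layout shifts.

    shifts: (timestamp_seconds, shift_value) pairs in chronological order.
    A new window starts after a >1 s gap or once a window spans >5 s.
    """
    best = current = 0.0
    window_start = prev_time = None
    for t, value in shifts:
        if (window_start is None
                or t - prev_time > 1.0
                or t - window_start > 5.0):
            window_start = t       # open a new session window
            current = 0.0
        current += value
        prev_time = t
        best = max(best, current)  # CLS is the largest window sum
    return round(best, 4)
```

A page with no layout shifts scores 0, which is what Hyvä achieved across every page type in these tests; two small shifts close together accumulate into one window, which is why Luma's mid-load jumps add up.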
When Luma scores still matter
Luma's Lighthouse scores are not a dealbreaker for every store. For B2B platforms or internal tools where the primary users are on desktop connections and performance is less commercially critical, Luma's desktop scores — typically in the 75–85 range — may be acceptable.
Where performance directly affects conversion — consumer-facing B2C stores, especially those with significant mobile traffic — the difference between a score of 30 and a score of 97 is commercially meaningful. Slower pages convert less well, rank lower in organic search, and deliver a worse user experience.
See the difference yourself
Open both demos and run your own Lighthouse audit. The numbers are right there.