BEIS Energy Certificate

Designing a new operational energy rating service for commercial buildings

Duration
Jan - Apr 2022

My Role
Senior Service Designer
& Researcher

The Brief

The Department for Business, Energy and Industrial Strategy (BEIS) was designing a new mandatory government service — PEERS (Property Energy Efficiency Rating Scheme) — to introduce operational energy ratings for large commercial and industrial buildings across England and Wales. Unlike existing certificates such as the EPC, which measure theoretical energy performance, PEERS would rate buildings on their actual annual energy use and carbon emissions, directly supporting the UK Government's net-zero 2050 commitment.

No equivalent digital service existed in the UK. The closest comparison — the Display Energy Certificate (DEC) — was decentralised, couldn't support self-reporting, and gave government no direct access to assessment data. BEIS needed to understand whether a viable, user-centred, cost-effective digital service could be built from scratch — and what it should look like.

My Role

I was embedded as a Service Designer and Researcher within a multidisciplinary team that included policy leads, UX designers, business analysts and technologists from BEIS and a delivery partner. Working across a six-week Discovery sprint, I was responsible for user research, service design artefacts, stakeholder documentation and recommendations to support the decision to proceed to Alpha.

Specifically, I owned the design and facilitation of user interviews, synthesis of research findings into structured documentation for BEIS stakeholders, and the production of explanatory infographics to communicate complex policy and service relationships to non-design audiences.

User interviews and research synthesis

The project engaged 45 users across six distinct groups: building owners, tenants, assessors, scheme administrators, accreditors, and citizens. I participated in the design and facilitation of in-depth qualitative interviews across this range, covering sentiment toward a new energy scheme, process pain points, digital literacy, and expectations for self-reporting.

I organised and documented research outputs into structured findings for BEIS stakeholders — translating interview data into 141 key insights that informed the service design direction. I helped develop the taxonomy of user groups and characteristics that framed the entire research sample strategy.

A key finding I helped surface: self-reporting — a core policy objective — was not supported by any comparable service globally and carried significant quality risks. My research documentation of assessor concerns about self-reporting accuracy directly influenced the recommendation to pursue further Alpha-phase testing before committing to this feature.

Stakeholder workshops and design validation

I organised and supported workshops with internal BEIS stakeholders — including policy, service and technology leads — to align on research priorities, test early service design concepts, and validate design decisions against policy intent. The team also ran 12 concept-testing sessions to validate the most uncertain design elements, including self-reporting flows and rating-comparison approaches.

I produced explanatory infographics for these sessions to make complex service relationships — such as the interaction between building owners, tenants, assessors and the scheme administrator — legible to policy stakeholders who were not familiar with service design thinking.

Service design artefacts

I contributed to the end-to-end service design work that mapped the proposed PEERS service across seven key design decisions — including user account models, ratings engine architecture, self-reporting complexity, assessor workflows and audit processes. Each design decision was documented with research evidence, options considered, the recommended approach, and rationale — a structured format that enabled BEIS stakeholders to make informed decisions quickly.

I designed explanatory infographics to visualise the policy components, user journeys and service relationships in formats accessible to non-designer stakeholders in BEIS and the Department for Levelling Up, Housing and Communities (DLUHC).

Managing upward and across teams

The project required navigating a complex stakeholder landscape: BEIS policy leads, DLUHC (who owned the existing DEC service), NABERS Australia (the international equivalent), accreditation bodies, and the delivery team. I contributed to a formal 137-page Discovery Report submitted to BEIS Digital — covering executive findings, service recommendations, a RAID log, existing service analysis and a horizon scan — which required careful alignment between research findings and policy objectives before submission.

The report was structured to pass two internal approval boards (the Estimation and Prioritisation Board and the Triage Board), demonstrating a level of governance rigour beyond typical UX delivery work.

Inclusive research and design

The research sample was deliberately designed to include users with a range of digital literacy levels, from confident, expert users to those likely to struggle with a digital-first service. Accessibility considerations were built into the user taxonomy from the outset, with specific attention to users with disabilities that could affect how they use the service.

Service design recommendations explicitly addressed the need to support non-digital channels alongside the digital service, recognising that assessors and some building owners operate with paper-based processes and offline workflows.

Impact

The Discovery phase concluded with a formal recommendation to proceed to Alpha — the direct outcome of the research and service design work the team produced. The 137-page Discovery Report, to which I contributed research documentation, infographics and service design artefacts, was submitted for approval to BEIS Digital's governance boards.

Key recommendations from the Discovery — including the need for a centralised government-hosted service, the caution around self-reporting, and the potential to reuse DEC technology components — directly shaped the Alpha phase scope and roadmap.

The research identified 141 key user insights across 45 interviews and 30 existing studies, and generated 51 ideas mapped across 8 design levers for the future service — providing BEIS with the most comprehensive user evidence base this policy area had seen to that point.

Phase
Discovery


Gov criteria — self-assessment


Agile, multidisciplinary team — embedded across a 6-week Discovery sprint with policy, technology and design workstreams

Accessible design across channels — inclusive research sample, digital literacy range, non-digital channel consideration

UCD deliverables — user taxonomy, interview guides, research synthesis, service design documentation, explanatory infographics

Prototyping tools — Figma / Adobe XD; GOV.UK Prototype Kit exposure (actively developing)

Usability testing — concept validation sessions with 12 participants on key uncertain design decisions

GOV.UK Design System — add specific reference if you used it or referenced it in your design work

Community of practice — add if you contributed to any internal design community sessions or shared methods with team peers

Mentoring — add if you supported any junior team members or shared tools knowledge during this project

Developer collaboration tools — add specifics: ADO, Confluence, Jira, sprint ceremonies, handover documentation