The Politics of Performance Transparency

The series introduction established that public sector performance dashboards function as policy instruments, and the previous post examined how different dashboard designs constitute different publics with different capacities for democratic participation. This post applies a framework developed by Lucy Kimbell and Cameron Tonkinwise in their chapter "A Political Dialogue About Government Service Design Politics" (Kimbell & Tonkinwise, 2025) to the public sector dashboard context. Their analytical vocabulary clarifies what is at stake when we design dashboards as public services - and why the design decisions involved are unavoidably political.

The core claim is straightforward: public sector performance dashboards are not neutral information infrastructure. They are service designs that constitute particular relationships between citizens, providers, and the state, embedding theories of accountability that deserve explicit interrogation rather than technical resolution.

The Framework: Key Concepts

Kimbell and Tonkinwise develop several analytical distinctions that prove essential for understanding dashboard design as government service design. The most foundational is the distinction between politics and the political. "Politics", in their usage, refers to the formal field of government - its systems, politicians, and bureaucrats, and the routines and rituals that societies establish for making collective decisions. "The political", by contrast, denotes the contestation about values and power that exists in all interactions, including those well outside the formal political arena (Kimbell & Tonkinwise, 2025, p. 40).

The significance of this distinction for dashboard design is immediate: politics tries to contain and control the unwieldy nature of the political. When dashboard design is treated as a technical rather than political exercise, the political dimensions are suppressed rather than resolved. The choices embedded in metric selection, aggregation methodology, and visual presentation all involve contestation about values and power, yet the technical framing renders these contests invisible.

Three Functions of Government

Tonkinwise distinguishes three overlapping functions of government that prove analytically productive when applied to dashboards (Kimbell & Tonkinwise, 2025, p. 40). The first is rules - what citizens are not allowed to do, encompassing enforcement, compliance, and licensing. In the dashboard context, this manifests as audit trails, methodology enforcement, and statistical compliance requirements. The second is services - what government does to support and enable citizens - which maps onto performance information provision and self-service tools. The third is policies - how government develops new rules and services and implements them - which is precisely the function that dashboard design decisions perform when they operate as policy instruments.

The critical observation is that public sector performance dashboards conflate these functions. A single dashboard simultaneously enforces methodological standards, provides a performance information service, and implements transparency policy. Treating the design as purely technical obscures the policy and rules functions that operate beneath the service framing.

The Customerisation Critique

Perhaps the most incisive element of Kimbell and Tonkinwise's analysis concerns the importation of "customer-centredness" into government services. As they put it, "if you design a government service in a customer-centred way, you are ontologically redesigning people from being citizens to being customers when interacting with their governments" (Kimbell & Tonkinwise, 2025, p. 40). The customer framing positions the relationship as transactional rather than mutual; it assumes a "customer is always right" logic; it creates ratchet effects - "why can't getting health data be as easy as Amazon?" - and it conceals what Mol (2008) calls the "logic of care" behind a "logic of choice".

This critique applies directly to public sector dashboard strategy. The language of "executive self-service" positions senior leaders as customers of a data service rather than as participants in collective governance. The ontological shift matters because it reshapes the design requirements: convenience and speed rather than deliberation and accountability.

Democratic Services and Mutuality

Against the customerisation model, Kimbell and Tonkinwise propose an alternative grounded in mutuality. "A democratic service is a balanced relation, support without subordination, or mutual support" (Kimbell & Tonkinwise, 2025, p. 40). This reframes the design question: rather than asking how dashboards can better serve executive customers, it asks how dashboards might support mutual accountability - relationships in which both those who publish data and those who are accountable to it participate in shaping how performance is understood.

The Co-Design Critique

Kimbell and Tonkinwise are also direct about the misuse of co-design in government contexts. Co-design "associated with government services in Europe sometimes appears to be a simplistic stand-in for democratic processes which are not working well", they write; "get some designers in to run a couple of workshops with stakeholders, and public servants can tell themselves they are enacting democracy" (Kimbell & Tonkinwise, 2025, p. 40). This cautions against treating user research workshops as sufficient democratic legitimation for dashboard design choices that are fundamentally about how power is exercised. The participatory design literature, including von Busch and Palmas's (2023) analysis of how co-design can be co-opted, reinforces the point that participation in design processes does not automatically produce democratic outcomes.

Applying the Framework to Public Sector Performance Dashboards

What Kind of Government Function Are Dashboards?

Using Kimbell and Tonkinwise's tripartite framework, public sector performance dashboards sit awkwardly across all three governmental functions. As rules infrastructure, dashboards enforce methodological standards for how metrics must be calculated, impose publication schedules that create compliance requirements on data providers, and establish ranking methodologies that providers must accept. As service provision, dashboards ostensibly "serve" executives by providing performance visibility, "serve" the public by creating transparency, and "serve" analysts by providing data access. As policy implementation, dashboard design choices instantiate transparency policy, the selection of metrics embeds political priorities, the aggregation methodology encodes assumptions about what "good performance" means, and the visual presentation frames how performance should be interpreted.

The problem is that the service framing dominates while the rules and policy functions remain implicit. Dashboard programmes are presented as improving "information services" when they are actually implementing policy choices and enforcing methodological rules. This is the kind of depoliticisation that Kimbell and Tonkinwise's framework is designed to expose.

The Customerisation of Public Sector Executives

The "executive self-service" strategic aim directly instantiates the customerisation critique. The language tells the story: "self-service" positions executives as customers of a data service; "interactive statistics" frames data as a product to be consumed; "user needs" adopts commercial service design vocabulary. What this language obscures is that executives are not customers but accountable public officials, that their relationship to performance data is fiduciary rather than transactional, and that the "service" of providing data is not value-neutral - it shapes what can be known and acted upon.

Following Kimbell and Tonkinwise's argument, designing dashboards as self-service tools ontologically reconstitutes executives as customers rather than as accountable governors. Customer logic prioritises convenience over deliberation, assumes individual choice rather than collective decision-making, and positions the dashboard as subordinate to user preferences rather than as infrastructure for governance. The alternative framing - dashboards as infrastructure for accountable governance - would foreground different design requirements entirely: deliberation support, audit trails, and contextual explanation rather than convenience and speed.

The Public Transparency Paradox

Kimbell and Tonkinwise's framework illuminates a deep tension in the "provoking action in others" strategic aim. The nominal framing presents public dashboards as serving citizens by providing transparency, enabling them to hold providers accountable through access to performance information. This appears democratic: the public participates in governance through information access.

The actual dynamic, as explored in the previous post on how dashboards constitute publics, is quite different. The mechanism is not information-driven citizen action but anticipatory reputation effects on providers. The public is not actually the audience; intermediaries - journalists, politicians, advocates - are. This creates what Kimbell and Tonkinwise would identify as a democratic deficit masquerading as democratic participation. The "public" is assembled as passive recipients of information rather than active participants in governance; the transparency service is designed for a fictional citizen-user who interprets data directly; and the actual democratic function - intermediary-mediated accountability - goes unsupported by the dashboard design.

If we take seriously Kimbell and Tonkinwise's argument about designing government services "in ways that make explicit that such services involve the co-creation of value" (Kimbell & Tonkinwise, 2025, p. 40), then public dashboards should be designed to support intermediary workflows explicitly, to acknowledge that "the public" is constituted differently by different dashboard designs, to make visible the methodology and choices embedded in the data presentation, and to create mechanisms for public deliberation about the metrics themselves - not just passive consumption of results.

The Analyst as Democratic Infrastructure

Performance analysts who translate dashboard outputs into contextualised narratives for board meetings and governance discussions can be read through Kimbell and Tonkinwise's framework as performing a democratic function that the self-service model structurally cannot support: contextualisation of what the numbers mean locally, deliberation support through framing data for collective discussion, translation of abstract metrics into actionable intelligence, and mediation of the relationship between the data and those accountable to it.

In Kimbell and Tonkinwise's terms, these analyst roles sustain the mutuality that the self-service model reduces to transaction. They create the human relationships necessary for data to function in governance rather than merely being consumed. The persistence of this function despite sustained investment in self-service tooling suggests that the dashboard-as-self-service model is misaligned with the democratic requirements of public sector governance - not because the analyst role is a workaround but because the interpretive and relational work analysts perform is constitutive of how governance actually functions.

The Analyst Data Pipeline as Hidden Infrastructure

Kimbell and Tonkinwise emphasise that "designers' attentiveness to lived experience brings into view how people experience 'rules' and 'policies' as materialized into service infrastructures and the practices of public administrations" (Kimbell & Tonkinwise, 2025, p. 40). The analyst data pipeline - the workflow from data collection through transformation to dashboard publication - is precisely such an infrastructure. But it is invisible to dashboard users.

The Pipeline as Rules and Service Labour

The data pipeline enforces rules: what counts as valid data, how metrics are calculated, when data is available, who can access what. These rules are experienced by providers as constraints but remain invisible to dashboard consumers. As Muller et al. (2019) demonstrate through their analysis of data science work practices, the pipeline performs curation, design, and creation work that disappears behind the polished dashboard surface. Their observation - "I am the ground truth" - captures how pipeline workers create the reality that dashboards then present as discovered fact.
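A minimal sketch can make this gate-keeping concrete. The rules below are hypothetical illustrations, not any real publication standard: each constraint is experienced directly by the submitting provider, yet none of them is visible to someone viewing the finished dashboard.

```python
from datetime import date

def validate_submission(record: dict) -> list[str]:
    """Return the rule violations for one provider submission.

    Hypothetical intake rules of the kind a pipeline enforces:
    what counts as valid data is decided here, upstream of the dashboard.
    """
    errors = []
    if record.get("provider_code") is None:
        errors.append("missing provider code")
    value = record.get("value")
    if not isinstance(value, (int, float)):
        errors.append("value must be numeric")
    elif value < 0:
        errors.append("negative values rejected")
    if record.get("period_end", date.min) > date.today():
        errors.append("future-dated periods rejected")
    return errors

# A submission that fails validation never reaches the dashboard at all;
# the published data presents the survivors as the complete picture.
print(validate_submission({"value": -3, "period_end": date(2024, 1, 31)}))
```

The point of the sketch is that rejection happens silently from the consumer's perspective: the dashboard shows only the records that passed, with no trace of what the rules excluded.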

Following Kimbell and Tonkinwise's attention to service work, the data pipeline also involves substantial human labour: analysts who transform raw data into publishable metrics, quality assurance staff who verify accuracy, methodology specialists who develop and maintain calculation rules, publication teams who manage release schedules. This labour is rendered invisible by the self-service framing. When executives "self-serve", they are actually consuming the products of extensive human service work upstream. Framing dashboards as self-service obscures the labour relations embedded in data production - a concern that connects directly to Kimbell and Tonkinwise's argument that service design "can lock people into particular relations with government" (Kimbell & Tonkinwise, 2025, p. 40).

The Pipeline as Policy Choice

Every pipeline decision is also a policy choice: which data sources to include and exclude, how to handle missing or conflicting data, what aggregation level to present, how to calculate confidence intervals, whether to show trends, rankings, or absolute values. These choices implement policy but are typically framed as technical decisions. The Muller et al. (2019) insight about data workers constructing ground truth applies here: the pipeline does not neutrally transmit pre-existing facts but actively produces the performance reality that dashboards present.
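To make this concrete, here is a hypothetical sketch (invented figures, invented policy names) showing how two defensible "technical" configurations publish different performance pictures from identical raw data:

```python
from statistics import mean, median

# Hypothetical monthly wait times for one provider; None marks missing returns.
raw = [4.2, 5.1, None, 12.8, 4.9, None, 5.4]

def metric(values, missing_policy, aggregate):
    # Policy choice 1: what to do about missing data.
    present = [v for v in values if v is not None]
    if missing_policy == "exclude":
        cleaned = present
    elif missing_policy == "impute_worst":
        cleaned = [max(present) if v is None else v for v in values]
    else:
        raise ValueError(missing_policy)
    # Policy choice 2: how to aggregate the cleaned series.
    return round(aggregate(cleaned), 1)

# Each configuration is defensible as a 'technical' decision, yet each
# produces a different performance reality from the same raw data.
print(metric(raw, "exclude", median))        # 5.1
print(metric(raw, "exclude", mean))          # 6.5
print(metric(raw, "impute_worst", mean))     # 8.3
```

Which of the three numbers is the provider's "performance"? The pipeline decides, and the dashboard presents the result as discovered fact.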

Reconceptualising Dashboard Design as Democratic Infrastructure

From Customerisation to Mutuality

Following Kimbell and Tonkinwise's concept of democratic services, what would dashboards designed for mutuality look like? Where a customerised design optimises for self-service access, convenience, and individual consumption, a mutual design would optimise for supported deliberation, collective sense-making, and relationships of accountability. Where the customerised model positions the user as a customer and the dashboard as a product, the mutual model positions the user as a co-participant in governance and the dashboard as infrastructure for accountability.

The practical implications are significant. Mutual dashboards would invite and display contextual annotation - what providers say about their performance, what the data cannot capture - rather than presenting data as finished fact. They would design for board meetings and governance conversations rather than individual browsing sessions. They would make the choices embedded in the pipeline visible and contestable rather than hidden behind polished visualisation. And they would create channels for dashboard users to challenge, question, and propose alternatives to how performance is represented.
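As a sketch of the data-model shift this implies (the types, fields, and figures below are hypothetical, not a real schema), a mutual dashboard record might carry context and contestation alongside the number itself:

```python
from dataclasses import dataclass, field

@dataclass
class AnnotatedMetric:
    """A metric record designed for mutuality rather than bare transparency."""
    name: str
    value: float
    methodology_note: str                  # the pipeline choices, made visible
    provider_note: str = ""                # what the provider says about the figure
    limitations: list = field(default_factory=list)   # what the data cannot capture
    challenges: list = field(default_factory=list)    # open contestation, not hidden

    def contest(self, objection: str) -> None:
        # Record a challenge instead of presenting the value as finished fact.
        self.challenges.append(objection)

m = AnnotatedMetric(
    name="62-day referral-to-treatment standard",
    value=71.4,
    methodology_note="Denominator excludes patients transferred between providers.",
    provider_note="Q2 figure reflects a records-system migration.",
    limitations=["Does not capture clinical complexity of the cohort"],
)
m.contest("April denominator change makes the published trend non-comparable.")
```

The design choice is that annotation, limitation, and challenge are first-class parts of the published record, not footnotes held elsewhere - the customerised equivalent would publish `value` alone.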

From Transparency to Accountability

Kimbell and Tonkinwise's distinction between politics and the political also suggests a shift from transparency - nominal information provision - to accountability, understood as relationships of mutual obligation. In the transparency model, data is published, citizens can access it, providers are deemed "transparent", and democracy is considered served. In the accountability model, data is contextualised, intermediaries translate and deliberate, providers explain and respond, and relationships of obligation are enacted rather than merely assumed.

The design implication is that accountability dashboards would explicitly support the response dimension: not just what providers' performance is, but what they are doing about it, what constraints they face, and how they are being supported. The next post in this series examines this shift from the perspective of user behaviour, exploring how the deficit model of dashboard use fails to account for what actually happens when people encounter performance data.

From Self-Service to Cognitive Assemblage

Integrating Tkacz's (2022) concept of the "cognitive assemblage" with Kimbell and Tonkinwise's service design framework suggests reconceiving dashboards not as tools that individual users access to extract information, but as components of distributed governance systems that include human intermediaries as essential elements. This reframing accepts that intermediary roles are not workarounds for dashboard deficiencies but constitutive elements of how performance governance actually functions. Dashboard design should therefore explicitly support these intermediary roles rather than attempting to eliminate them through self-service.

The Political Stakes of Dashboard Design

What Publics Are Being Constituted?

Following Kimbell and Tonkinwise's invocation of Dewey's argument that people "have to be assembled as a 'public', constituted as 'citizens'" (Kimbell & Tonkinwise, 2025, p. 40), the question becomes: what publics do current public sector dashboards constitute? As the previous post argued, dashboard designs assemble particular kinds of subjects - executives as customers rather than as accountable public officials, the wider public as passive recipients rather than as deliberative participants, providers as objects of surveillance rather than as partners in improvement, analysts as invisible service workers rather than as knowledge professionals with expertise. Different dashboard designs would constitute different publics with different capacities for democratic participation.

What Political Choices Are Being Depoliticised?

Kimbell and Tonkinwise cite Kinross (1985) on "the rhetoric of neutrality" - how visual information design makes "certain (statistical) pictures of the world seem like facts under the guise of making them more legible" (Kimbell & Tonkinwise, 2025, p. 40). Performance dashboards depoliticise several categories of choice simultaneously:

- the selection of metrics, which embeds political priorities
- the aggregation methodology, which encodes what "good performance" means
- the visual presentation, which frames how performance should be interpreted
- the publication schedule and ranking methodology, which determine when and against whom providers are compared

Each of these is a site of contestation rendered invisible by the technical framing.

Heuristics for a Democratic Alternative

Kimbell and Tonkinwise do not provide a prescription, but they do offer heuristics. Dashboard design should attend to service workers as well as service recipients - considering the analysts, data managers, and intermediaries who make the infrastructure work, not just the executives who consume it. It should question the customer framing: when "executive self-service" is the goal, ask what is lost by positioning executives as customers rather than as participants in collective governance. It should make the political visible rather than presenting dashboards as neutral information infrastructure. It should design for mutuality, creating infrastructure for relationships of mutual accountability rather than one-directional transparency. And it should support deliberation rather than consumption - designing for governance conversations rather than individual browsing sessions.

As Kimbell and Tonkinwise conclude, "these heuristics highlight the reasons why Service Design in government is not simply a special case of Service Design" (Kimbell & Tonkinwise, 2025, p. 40). The same applies to public sector performance dashboards: dashboard design is not simply a special case of data visualisation or information design. It is policy design with democratic stakes, and the choices embedded in how performance data is selected, aggregated, presented, and made available shape how accountability functions in the health system. Treating these choices as technical questions to be resolved by designers and analysts alone obscures the political contestation they deserve.

References

Kimbell, L., & Tonkinwise, C. (2025). A political dialogue about government service design politics. In L. Penin, A. Prendiville, & D. Sangiorgi (Eds.), The Bloomsbury Handbook of Service Design (Chapter 6.3). Bloomsbury Academic.

Kinross, R. (1985). The rhetoric of neutrality. Design Issues, 2(2), 18-30.

Mol, A. (2008). The logic of care: Health and the problem of patient choice. Routledge.

Muller, M., Lange, I., Wang, D., Piorkowski, D., Tsay, J., Liao, Q. V., Dugan, C., & Erickson, T. (2019). How data science workers work with data. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. ACM.

Tkacz, N. (2022). Being with data. Goldsmiths Press.

von Busch, O., & Palmas, K. (2023). The corruption of co-design: Political and social conflicts in participatory design thinking. Routledge.