Corporate Visualization Entrepreneurship: Proactively Providing Visualizations for Institutional Use

Design ideally is about service on behalf of the other.

The paper is targeted mainly at those proactively designing and/or producing visualizations to be adopted for institutional use. I call this kind of activity Corporate Visualization Entrepreneurship (CVE); those engaged in CVE are referred to here as entrepreneurs.
I first introduce a conceptual framework, a starting point for systematic reflection on the process and the hurdles along the way, for either planning or diagnostic purposes. For several relevant aspects I have included contrasting comparisons with two alternative situations: commissioned projects, and projects whose end product is aimed at the general public or a large, non-specialist audience.
The descriptive sections include an account of our team's initiative to provide a visual analytics (VA) tool to colleagues in a policy DG, examined with reference to the conceptual framework. I conclude with the presentation of the end product, prefaced by a short overview of the design choices and constraints. 


Conceptual framework: Thinking about the journey


Key terms: institutional use and proactive

Institutional use should be understood in contradistinction to personal use: the necessary but insufficient condition is that it takes place in an institutional setting; additionally, it has to occur as a result of a decision taken and transmitted through the formal channels of a hierarchical organization. To clarify: my use of a given piece of equipment at work is personal if it is the result of my own decision or preference, and not mandated or at least recommended by the hierarchy.
Users in this category usually have a strong and specific, even narrowly focused, interest in a topic or domain, for professional reasons (policy analysis, or governmental and administrative decision making). Their knowledge of the subject matter may surpass that of the entrepreneur, and they know how to correctly use and interpret the data.
A prerequisite for success in CVE is adoption of the end product by the target organization; therefore the operational objective of the enterprise should be phrased with reference to formal structures and processes. Unlike personal use, where the group of decision-makers and the group of users/consumers are coterminous, corporate decision-makers are by definition a minority, or even outsiders relative to the user group; consequently, the steps taken to ensure adoption differ in the two cases.
The entrepreneur has to pay attention to the decision process within the target organization. This involves determining, among other things, the accepted ways for setting the agenda, the management style, and the existence, identity, and actual influence of opinion leaders.
In order to win over the key corporate decision-makers or influencers, the entrepreneur has to ensure that the proposition can satisfy simultaneously (or at least not antagonize) a variety of preferences, interests, and standpoints, sometimes very different and even divergent. In contrast, when dealing with mass customers, what really matters is appealing to as many of them as possible; if the market is markedly heterogeneous, the developer can focus on a subgroup at the expense of the rest, or address each market segment with a different proposition.
The difference between institutional and other types of users, as well as the segmentation of the user universe, is illustrated by the Eurostat user personas (see the two examples below).

Ingrid, an experienced staffer in a policy DG, uses raw data to conduct her own analyses and create her own graphs as part of her regular tasks.

Kristoffer, a member of the general public, reaches a Eurostat webpage via a Facebook post; he has a personal interest in statistics and occasionally looks at visualizations and reads reports, but only exceptionally digs into the data in order to reach his own conclusions.

Being proactive means that the design/production process is not initiated through a specific request from the users/stakeholders (as in the case of a commissioned project), but rather on the initiative of the design/production team.
One important consequence is that the users/stakeholders are highly unlikely to provide specifications or requirements. A commissioned project is the opposite: the kickoff is usually accompanied by a concrete description of the desired end state, which subsequently guides the process. In this respect, the proactive approach bears some similarities to the development of a new mass product or service, where the customers' needs and requirements are elicited through user and/or market research; some of these new products and services can be described as market-creating, need-inducing, or need-revealing.
Other concerns, absent in the case of commissioned projects, relate to issue onboarding and stakeholder cooperation (discussed below).

Stakeholder, need, solution

The landmarks of the landscape the entrepreneur is attempting to navigate are the stakeholder (user), their need (the problem to be solved, and the main reason for adopting the solution), and the solution (a product or service which bridges the gap between the current, unsatisfactory state and the desired one); this also seems to be the natural, logical progression.
In unfortunate cases, the need is forgotten, mistaken for a product feature; as the late Harvard Business School marketing professor Theodore Levitt used to tell his students, “people don’t want a quarter-inch drill, they want a quarter-inch hole!” In order to avoid such circumstances, one should ask oneself "what is the question to which my product is the answer?" To illustrate the idea with an extreme example, imagine a present-day computer with a floppy disk drive.
In other cases, the path is reversed: the starting point is the end product, either alone (serendipitous discoveries such as Teflon, the microwave oven, and graphene come to mind) or in combination with the need it purportedly addresses (essentially the story of traveling salesmen, as well as of many technology startups).
Many sources insist that all three elements - user, need, product - have to be aligned; here is the formulation from the GOV.UK Service Manual:
To deliver a service that meets your users’ needs, you have to understand:
  • who your users are
  • what they’re trying to do
  • how they currently do it
  • the problems or frustrations they experience
  • what they need from you to achieve their goal
Backward chaining is also crucial : "If you can’t easily relate your ideas back to how you framed the problem, either you were wrong about the problem or you are wrong about your solutions."

Context scan

In order to determine the likelihood of success, and to decide whether to proceed, the entrepreneur should look at:
  • Issue onboarding: whether the problem the enterprise attempts to tackle is perceived and acknowledged at the relevant decision-making levels, and, if so, whether the organization is (or is likely to become) committed to finding a solution, to an extent at least sufficient for the internal interest groups to accept compromises.
    In contrast, in a commissioned project, where the process is initiated by the stakeholders or their representatives, these aspects are usually settled beforehand.
  • Likelihood of stakeholders' cooperation throughout the process: whether users, opinion leaders, and decision-makers are willing or likely to provide input and feedback. If the problem to be solved is not pressing, or the value of the potential solution is not apparent or not communicated successfully, cooperation is seen by the stakeholders as just a pointless burden. One should also not mistake cooperation and contribution for acknowledgement of the need and acceptance in principle of the solution.
  • Entrepreneur's brand: how is the entrepreneur perceived by the target group? Is the entrepreneur credible and accepted as a solution provider?
  • Competition: are there any potential or actual competing solutions or providers? How do the entrepreneur and the proposed solution stack up against them?

Types of problems

These are types of visualization-related problems, presented in their logical sequence.
  1. Idea: Out of the possible ways of approaching the topic, I don't know which is suited to visualization.
  2. Illustration: I have an idea, but I don't know what specific visualization would properly illustrate it.
  3. Implementation: I have in mind a specific visualization, but I cannot produce it, because:
    1. [instrument] the tool doesn't have the functionality
    2. [inability] I don't know how to do it
Conditionality also runs in the inverse order: implementation capabilities determine the feasible illustrations, and consequently whether the idea can be visualized at all. Higher-level problems should therefore be tackled with an eye towards the levels below, and primarily towards implementation.

Narrowing down the solution space

Here are various aspects to consider when exploring the universe of possible solutions with the intention of narrowing it down to the one(s) that will enter the implementation pipeline.
  • Deliverable types:
    • design (idea, outline, mockup) / actual product or service;
    • building blocks (tools, components, knowhow) / complete solution.
  • Usage type, informing interface complexity: a continuum from simple interfaces that display and emphasize actionable information (for snapshot reporting, e.g. dashboards and scorecards) to those that support visual discovery and analysis.
  • Technical environment / constraints:
    • Likelihood of adoption is higher if the required support infrastructure (software, hardware) is already available and the beneficiary does not have to make extra investments. Examples of visualization infrastructure that requires additional investment: Tableau, Shiny Server.
    • Data confidentiality might prevent hosting or processing the data on an external server.
    • If offline use is required, only local / in-browser data processing is possible, and server-hosted interactive visualizations (e.g. Shiny) are ruled out.
    • A static (print, pdf) format precludes interactivity and dynamic display, and constrains the size of the visualization.
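The offline constraint above can be illustrated concretely: one common pattern is to inline the data into a single HTML file at build time, so that all processing happens in the browser and no server is needed. The sketch below (hypothetical file name and illustrative data, Python standard library only) shows the idea; it is not the implementation used in our project.

```python
import json

# Illustrative data (not real Eurostat figures); in practice this would be
# exported from the statistical production pipeline.
data = [{"year": 2006, "value": 100}, {"year": 2007, "value": 104}]

# Build a single, self-contained HTML file: the data is embedded as JSON,
# so the page works offline and never contacts a server.
html = f"""<!DOCTYPE html>
<html><head><meta charset="utf-8"><title>Offline demo</title></head>
<body>
<script>
// Data inlined at build time; all processing happens in the browser.
const DATA = {json.dumps(data)};
document.body.textContent = "Years covered: " + DATA.map(d => d.year).join(", ");
</script>
</body></html>
"""

with open("offline_demo.html", "w", encoding="utf-8") as f:
    f.write(html)
```

The resulting file can be emailed or placed on a shared drive, satisfying both the offline-use and the no-extra-infrastructure constraints.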


Our story


The next section presents the experience of our team at Eurostat in proactively providing a visual analytics (VA) tool intended for institutional use by colleagues in a policy DG.
The operational objective was to secure the integration of the analytical tool in a specific workflow, by demonstrating added value. The VA tool is an example of deriving new outputs from existing data.

Context

The endeavor should be seen through the frame of reference set by Eurostat's strategic efforts to answer the imperatives of the user-centric orientation of the ESS Vision 2020. Similar experiences might become more common at Eurostat as a result of the drive to create innovative visualizations for ESS core users.
The key reference is the ESS Vision 2020 (henceforth V2020) - the common strategic response of the European Statistical System (ESS) to the challenges that official statistics is facing. It identifies five key areas in which common action is needed in order for European statistics to be "fit for the future". Two of these key areas, focus on users and improved dissemination and communication, are reflected in a strategy built around the principle of putting users at the forefront. The Vision requires that the ESS respond by, inter alia: engaging with users proactively, improving responsiveness to user needs, and delivering information in an interactive and easily comprehensible way.
Our unit has also been set up as part of the V2020 implementation; we are dealing mostly with experimental statistics, regarded by the ESS as an effort to better respond to our users' needs.
My team produces advanced estimates of income inequality indicators (AROP and QSR), which are included in the Social Scoreboard, a monitoring tool for Member States' performance in relation to the European Pillar of Social Rights and a key component of the European Semester.
We are in close contact with people from various EU institutions involved in both the preparatory methodological work and the actual compilation of the European Semester documents - these are our stakeholders. They are part of what V2020 describes as the ESS "core users".


The discovery phase

The trigger for initiating the project was our perception of a diffuse, unfulfilled need among our stakeholders for a more dynamic representation of income inequality. In other words, our target group had a Type 1 (Idea) problem.
After scanning the context, we assessed that the target group acknowledged the issue and was open to a solution, but its potential for active involvement in the design phase was quite low: they rejected suggestions to conduct brainstorming sessions for identifying potential solutions, and agreed only to provide feedback on ideas, supplied by our team, at an advanced stage of maturity and implementation. Our credibility as solution providers was medium-high, and the acceptability level was high.
We also investigated the existence of potential competing ideas and solution providers, through literature reviews and networking, but could identify none.
The users fell into two groups, one of them more accessible, on which we decided to focus for cooperation during the design phase. This group was also the primary target for institutional use; the other was to be approached at a later stage.
Additional factors contributing to our subsequent decision to act were prior experience with the same issue and the expected high value of an apposite solution.

User research

In order to obtain the necessary grounding knowledge we conducted short, informal interviews with target users about their analytical habits and practices, and carried out close examinations of public documents (analyses and reports) they had produced; insight generated from the collected information was systematically checked and validated with stakeholders. Knowledge acquisition alternated with synthetic reviews and gap identification. As a result of this iterative process we became confident that we had reached a correct understanding of stakeholders' expectations and analytical needs.
Among the hurdles we encountered during the grounding phase was the frequently limited availability of the stakeholders. Unlike a mass target group, our users were not interchangeable: when one of our colleagues was busy, we could not simply interview someone with similar socio-demographic characteristics in a different DG instead.
There are of course more elaborate and methodical ways of understanding user expectations, behaviors, needs, and motivations, of acquiring knowledge about context of use, challenges, and opportunities. As mentioned by IBM's chief designer, these include "direct and indirect partnerships with clients and users, including deep observational studies to learn about their workflow as well as how they want to use a data analytics tool", and "generative research, which provides insights into a particular problem area, and into the user’s broader concerns and their end-to-end experience."


The design process

Once the general nature of the need had been established by our team, and action had been decided, we undertook the design of the intended product.

Setting the design objective

Next, our team held a session to set the design objective, which started with the translation of the collected insights into analytical questions to be addressed by the tool, followed by a scan of the solution space. We noticed a significant overlap with the questions informing and directing our own analyses, which opened the possibility of repurposing some of our own analytical tools.

Prototype

The next step in the design process was the in-house development of a prototype, during several mini-brainstorming sessions, starting from a visualization already produced for similar purposes (see sidebar).


The co-design alternative for the steps presented so far would have been, for example, to conduct several sessions with (ideally) all stakeholders, for elicitation and confirmation of needs and requirements, review of the solution space, brainstorming, concept selection, drafting, and mockup building.

First version

The first version of the tool was presented to a group of self-selected interested users, most of them also influential in the decision process within their units. The chosen delivery setting was a workshop whose purpose was not only to familiarize the users with the tool, but also to allow them to explore its heuristic potential. We believe that this setting contributed decisively to the stakeholders' enthusiastic response. Such a response was also a further confirmation that we had managed to understand correctly their needs and expectations, and that the tool's added value was communicated effectively.
The feedback we received included the request for some minor changes, mainly to eliminate superfluous content; we thus realized the importance of considering not only what the users want, but also what they do not need.
During the workshop the participants had direct access to the analytical tool and were encouraged to use it by themselves, so this was for us also an opportunity to test its usability. Thanks to the initial decision to have an interface as simple as possible, and with all explanatory information immediately available, we scored high on this aspect.
Usually, the design process includes several iterations of versioning and feedback, but it seems that we reached a high level of user satisfaction with the tool design from the outset.
Positive feedback is a strong footing for organizational adoption, but nevertheless does not guarantee it. For us, the next step is the presentation of the tool at the formal, mandatory training session for the EC staff elaborating the European Semester documents.

Lessons learned

We would like to offer some suggestions and warnings to those who plan to embark on a similar exercise, of proactively designing a product or service for institutional users.
  • It might take a lot of work just to register on the users' radar.
  • A great idea is not enough; it doesn't speak for itself - you have to sell it to the potential users. Prepare for reluctance, resistance, and even rejection.
  • Keep in mind that users and those who decide about use are not necessarily the same persons.
  • Find the right extent, form, and timing of user involvement : it should secure their buy-in and get you the necessary input, without becoming an excessive burden that would turn them away. Remember : they didn't ask for what you are offering!
  • The user is the supreme arbiter of the value of your idea and your work. If, despite your best and sustained effort, the user doesn't see it, perhaps it is not there.


The Visual Analytics (VA) tool


A VA tool is a visual interface to data, one that supports analytical reasoning; it is supposed to reveal insightful information, for instance by zooming in on different subsets or (as in our case) by presenting different representations of the data.
The primary purpose of our analytical tool is to enable the audience to easily find the story in the data, to "navigate the users towards the insights they’re really looking to make". Unlike storytelling with data, quite popular for targeting non-specialist audiences and the general public, we do not attempt to interpret the data for our users - remember Ingrid, the user persona most similar to our target users: she is sometimes distrustful of processed data and prefers to draw her own conclusions.

Design choices

We chose to be rather conservative in terms of visualization, because we wanted to deliver a tool accepted by a diverse group of people with a wide range of numerical and visual literacy, as well as of willingness to experiment. Moreover, the usefulness of novel, complex visualizations is not always immediately apparent, which constitutes an additional potential point of failure.
The tool had to be self-contained, meaning it had to include the data, in order to allow offline use.
As for the software implementation, we opted for the ubiquitous and familiar Excel; the file does not include macros, because IT policies in this regard vary, which might create a functional vulnerability. We are considering switching to an R-generated interactive html document, hosted online, in order to eliminate issues related to updating and the vulnerability of the data and of the file itself to mishandling by users. The tool is meant to be used in an institutional setting, on a desktop, laptop, or large screen; a mobile version was never considered.
Esthetically, I have tried to follow the advice of the author of Storytelling with Data : "you want the design to fade into the background, letting the data take center stage."

The end product

The Excel-based VA tool contains 7 sheets. On each sheet, all information is visible on a single screen, so no scrolling is necessary.

The first sheet is a read.me page, with a short presentation of the tool and the data, and the instructions for use.

The main visualization is supplemented by 4 separate sheets, each presenting one of the 4 charts that compose it. The user can select the target country in the pink cell (top left); the charts and the table are then automatically refreshed. Currently, no drill-down or zoom-in functionality is implemented or foreseen.
The last sheet describes and explains the communication format for the estimated values.
The charts present the same data - yearly evolution of income quintiles from 2006 to date - in different ways: (1) as absolute values, (2) as percentage change vs. 2006, (3) as absolute change vs. 2006, and (4) as change relative to the median.
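Expressed as code, the four views can be sketched as follows. All numbers are hypothetical, not real Eurostat data; the exact definition of chart (4) is not spelled out above, so the version below (the quintile's percentage change minus the median's percentage change) is an assumption for illustration only.

```python
# Illustrative yearly series for one income quintile and the national median
# (hypothetical values, not real Eurostat data).
quintile = {2006: 6000, 2007: 6120, 2008: 6180}
median = {2006: 14000, 2007: 14140, 2008: 14420}

def views(series, base_year=2006):
    """Return the series under transformations (1)-(3)."""
    base = series[base_year]
    return {
        "absolute": dict(series),                                             # (1) absolute values
        "pct_change": {y: 100.0 * (v - base) / base for y, v in series.items()},  # (2) % change vs 2006
        "abs_change": {y: v - base for y, v in series.items()},               # (3) absolute change vs 2006
    }

def change_relative_to_median(series, median_series, base_year=2006):
    """(4) One possible reading of 'change relative to the median':
    the quintile's percentage change minus the median's percentage change."""
    s = views(series, base_year)["pct_change"]
    m = views(median_series, base_year)["pct_change"]
    return {y: s[y] - m[y] for y in series}
```

In the tool itself these views are computed by Excel formulas on the hidden data; the point here is only that all four charts are derived from a single underlying series, which is why refreshing the country selection can update them all at once.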