
Mapping the Public Media Stack: what we did and how you can use it

Published on May 19, 2020

By Jay Owens

In order to offer a guide to public media organizations on media technologies - and the potential to make more ethical and sustainable decisions about which to use - we needed to do two things: identify a comprehensive list of these technologies, and then review them.

Following a workshop in New York in May 2019 - where over 50 people working in public media and public media tech came together to explore the idea of a Public Media Stack - we had an exciting list of ideas and technologies to consider. The next step was to get from these hypotheses to a comparable dataset.

How do you do this? You need to create structure. And this has been my focus as the project’s research design lead. 

Stage 1. Questionnaire design

I made a single, comprehensive list of all the questions, issues and topics raised at the New York workshop - then interrogated these questions to develop them into a research questionnaire. 

Key considerations were:

  1. We needed to ask closed questions (ones with yes/no answers, or a range of options from a list) so that the answers would be a structured dataset that we could chart and summarize - not just a series of text strings.

  2. We needed to ask objective questions, not opinion-based ones - questions that different analysts would answer in the same way. 

  3. We needed clear sources. Questions were designed to be answered from information found on the product website or in news coverage, rather than requiring first-hand user experience.

  4. We needed to know why we were asking these questions. I mapped each question to the type(s) of risk it helped publishers avoid - e.g. knowing what drives higher pricing helps reduce the financial risk of buying a tool. (A sketch of how such a risk-mapped question might be represented follows this list.)
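
To make this concrete, here is a minimal sketch in Python of how a closed, risk-mapped question can be represented as data. The `Question` class and its field names are my own illustration, not the project’s actual schema; the example values are modelled on the scoring table at the end of this piece.

```python
from dataclasses import dataclass

@dataclass
class Question:
    """One closed question from the review questionnaire (illustrative schema)."""
    text: str           # the question as asked
    section: str        # questionnaire section, e.g. "Strategic"
    options: list[str]  # the closed set of allowed answers
    risks: list[str]    # the risk type(s) this question helps assess

# Hypothetical example, modelled on the scoring table below.
market_share = Question(
    text="What market share does the tool have?",
    section="Strategic",
    options=["Leader", "Challenger", "Very Niche", "Cannot Tell"],
    risks=["Lock-in", "Financial"],
)
```

Holding every question to this shape is what makes the four design considerations above enforceable: closed options, objective wording, and an explicit risk mapping all live in one structure.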

In this way, some initial concerns around instances of media industry ineptitude, or questionable tool use, became rather more dialled-down inquiries into the competitive position of each tool, availability of open-source alternatives, and lock-in factors like data export, deletion, and proprietary file formats. 

It’s obviously crucial to give space to more qualitative values questions - but these are addressed in our expert essays, the interviews with media organizations about their experiences with their tech stacks, and potential user reviews to come. 

Stage 2. Research process

In the interest of going beyond Google ourselves, we used the cloud-based spreadsheet/database tool Airtable both to build a form for data entry and to store the answers. Tech journalists Martin Bryant and Imran Ali worked through the coding in December 2019 and January 2020. We had a total of 36 questions, which I had initially thought might be too many - but the easy-to-use form and the closed question format meant that they could be answered in around one hour per tool.

Once coding was complete, I ran some data quality checks to ensure classifications were consistent, then charted the data to get a read on patterns before applying the scoring. At this point we had to switch to a Microsoft stack, for ease of use of Excel pivot tables and PowerPoint charting.
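
For illustration, a consistency check of this kind can be as simple as validating each coded column against its closed answer set. This is a minimal sketch assuming the Airtable answers were exported to CSV; the filename, column names, and option lists here are hypothetical, not the project’s actual export.

```python
import pandas as pd

# Allowed answer sets per coded column (illustrative, not the real codebook).
ALLOWED = {
    "profit_making": {"Profit", "Loss", "Cannot find"},
    "market_share": {"Leader", "Challenger", "Very Niche", "Cannot Tell"},
}

answers = pd.read_csv("tool_answers.csv")  # hypothetical export filename
for column, allowed in ALLOWED.items():
    # Flag any coded value that falls outside the closed option set.
    bad = answers[~answers[column].isin(allowed)]
    if not bad.empty:
        print(f"Inconsistent values in '{column}':")
        print(bad[["product_name", column]])
```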

Stage 3. Scoring

The questions had already been designed to assess potential risk areas for publishers - so scoring was a straightforward matter of assigning ‘risk points’ for particular answers. We’ve included the scoring table below so you can see how we allocated risk points to each question.

  • 22 of the 36 questions were scorable

  • Up to two points were given per ‘risk’ answer - with one point for answers carrying a lower level of risk, and zero for non-risk answers

  • ‘Cannot tell’ answers were scored as risks, as lack of information available pre-purchase increases the risk of making suboptimal decisions

  • Totals were summed for each of the four sections of the questionnaire (Strategy, Financial, Technical, and Data, Security & Ethics), and calculated as a percentage of the maximum score possible

  • Each section was weighted to account for a quarter of the final score (a sketch of this calculation follows this list)
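
As a sketch of the arithmetic described above: the per-section maxima here are read off the scoring table at the end of this piece, while the example tool’s points are made up.

```python
SECTIONS = ["Strategic", "Financial", "Technical", "Data, Ethics & Security"]

def overall_risk(points: dict[str, int], max_points: dict[str, int]) -> float:
    """Per-section risk points as a share of that section's maximum,
    averaged so each section contributes a quarter of the final score."""
    section_shares = [points[s] / max_points[s] for s in SECTIONS]
    return 100 * sum(section_shares) / len(SECTIONS)

# Hypothetical tool: 4 of 13 possible Strategic risk points, and so on.
score = overall_risk(
    points={"Strategic": 4, "Financial": 1, "Technical": 2,
            "Data, Ethics & Security": 5},
    max_points={"Strategic": 13, "Financial": 3, "Technical": 10,
                "Data, Ethics & Security": 16},
)
print(f"{score:.0f}%")  # -> 29%
```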

Final scores ranged from 0% risk points for the best scoring tool (a very well-documented collaboration product) to 62% for the worst (also, as it happens, a collaboration tool).

However, percentage scores felt artificially precise, and even misleading to communicate: a tool scoring 13% is not meaningfully riskier than one scoring 12%. It seemed more useful to indicate that both were low scorers overall, and among the least risky.

We then grouped the products by quartile, with 28 products in each category, and the bottom quartile broken into two groups depending on the information available (a sketch of this banding logic follows the list):

  • Lowest risk (risk scores 0-16%)

  • Small risk (risk scores 16-23%)

  • Some risk (risk scores 25-39%)

  • High risk (scores >40%) - 17 products

  • Lacks info (where >30% of scored answers were ‘cannot tell’)  - 11 products
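
In code, the banding logic reads roughly as follows. This is my sketch, not the project’s script: the exact boundary handling is an assumption (the published ranges meet at 16% and leave small gaps between 23-25% and 39-40%), and I’ve taken ‘Lacks info’ to apply within the bottom quartile, as described above.

```python
def risk_band(score_pct: float, cannot_tell_share: float) -> str:
    """Map a percentage risk score to one of the five published bands."""
    if score_pct <= 16:
        return "Lowest risk"
    if score_pct <= 23:
        return "Small risk"
    if score_pct < 40:
        return "Some risk"
    # Bottom (highest-risk) quartile, split by information availability:
    # >30% of scored answers being "cannot tell" means we simply lack info.
    return "Lacks info" if cannot_tell_share > 0.30 else "High risk"

print(risk_band(12.0, 0.05))  # Lowest risk
print(risk_band(45.0, 0.35))  # Lacks info
```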

Interpreting this information

This research is essentially a first round of due diligence on tools you might consider using in your media technology stack. Tools we’ve flagged as ‘lowest risk’ are unlikely to catch you out: their pricing is transparent, they have clear data policies, they’re typically updated regularly with new feature releases, and most have open-source alternatives. The ‘small risk’ set is also likely to provide sound choices, but these tools perhaps share a bit less information in one or two areas - an ambiguity that raised their risk scores slightly.

Tools flagged as ‘some risk’ and ‘high risk’ could still be a fit for your organization - but we’d recommend robust research before you make that decision. They may be better suited for more experienced and technically resourced media projects: for example, they’re likely to need expertise or developers to install. We’ve flagged the areas where their risk scores were higher, as pointers to ask sales reps for additional information if you want to. And it could be helpful to speak with other users and read reviews (e.g. on G2Crowd.com) before you decide. 

Some of the questions we asked didn’t easily fit a scoring framework - but it’s information that’s nonetheless very useful to share. We’ve summarized these factors on each product profile, so, at a glance, you can understand skill needs and factors affecting pricing.

Key findings

  1. The skill levels required to set up and use media tools increase as you move through the workflow. 83% of collaboration tools are ‘plug and play’ - but for Publishing, Measurement, Audience, and Storage, you’re likely to need either a tool or domain expert, or (for the latter two especially) a software developer.

  2. Despite GDPR, about a third of tools are lacking in data policy and data control. 38% of tools don’t have clear GDPR information - with Collaboration, Audience, and Publishing tools the most lacking - which can also make it harder for media projects to manage their own data policies. And around a third report only partial or no ability to export or delete your data (or your audience’s, if applicable). This applies particularly to Publishing tools.

  3. Ensuring mission and values alignment with tech suppliers isn’t easy: 63% of products don’t have an accessibility statement on their websites, and 68% don’t share workforce diversity data. Including these factors in RFPs and tenders could help shape industry norms for the better.

  4. Pricing clarity varies wildly. 80% of Collaboration and Production tools either have very clear pricing information online, or are clearly free. By contrast, three in four Measurement and Audience tools are either partly clear, or not clear at all. Pricing ad spend on the big social media platforms is hard to judge: you can control campaign pricing closely, but there aren’t benchmarks for individual publishers to judge typical or optimal campaign costs.

  5. Five products in the top 10 were from Microsoft, with very low risk scores of 5% or less. Microsoft win on clear documentation: from product roadmaps to pricing, they lay it all out commendably clearly.

What you can do with this information

Above all, we hope this structured review can save many media projects a lot of time. You can use this database in at least three ways:

  • A comprehensive list of the main tools at each stage of the stack. Several rounds of expert input and review should mean that we’ve got all the main contenders - though we are almost certainly missing some of the smaller open source and brand new startup options. Use these lists to ensure your own longlists are comprehensive.

  • Shortlisting which tools you want to explore further. Use these rankings and the detailed tool information provided to prioritize which tools are most likely to fit your needs, so you only have to explore a few in depth.

  • Identifying key due diligence questions that you need to ask. It can be tricky to know you’ve covered every question you need to, particularly regarding business or technical problems you haven’t encountered before. Use our expert-generated list to feel confident that you’re covering every area of risk.

We sincerely hope you find this analysis useful. Let us know how you’re using it over the next year, which information was most useful, and whether there’s anything you’d like us to add: either tweet us at @Storythings (https://twitter.com/Storythings) using #publicmediastack, or email us your feedback.

Public Media Stack Scoring Table

| Question | Section | Score | Risk assessed |
| --- | --- | --- | --- |
| Product Name | Key Information | n/a | n/a (key info) |
| Product URL | Key Information | n/a | n/a (key info) |
| Parent Company/Organisation (if applicable) | Key Information | n/a | n/a (key info) |
| Workflow Category | Key Information | n/a | n/a (key info) |
| Date product founded | Key Information | n/a | Stability & longevity |
| Product Description | Key Information | n/a | n/a (key info) |
| What type of company or organisation is the product owned by? | Strategic | Non-scoring question | Stability & longevity |
| Is the company profit-making? | Strategic | 2 points for Loss; 1 point for Cannot Find (transparency risk) | Stability & longevity |
| How is this product funded? | Strategic | Non-scoring question | Stability & longevity |
| What market share does the tool have? | Strategic | 2 points for Very Niche; 1 point for Challenger or Cannot Tell | Lock-in, Financial |
| Is there an Open Source alternative in this product space? | Strategic | 1 risk point if No | Lock-in |
| When did the company last release new features? | Strategic | 1 risk point if H1 2019; 2 risk points if earlier than that, or unclear | Functionality, Stability & longevity |
| Have they shared details of the product roadmap? | Strategic | 2 risk points if No | Functionality, Stability & longevity |
| Is there talk of the product raising further investment? | Strategic | Non-scoring question | Stability & longevity |
| Is there talk of the product being acquired? | Strategic | 2 risk points if Yes | Stability & longevity |
| Have there been any newsworthy business issues or challenges in the last 12 months? | Strategic | 2 risk points if Yes | Functionality, Stability & longevity |
| Is there a pricing model listed on the website? | Financial | 2 risk points if No | Financial |
| How clear & transparent is this pricing model? | Financial | 1 risk point if Not Clear; 0 risk points if n/a | Financial |
| What is the range of prices offered? | Financial | Non-scoring question | Financial |
| What factors drive higher pricing? | Financial | Non-scoring question | Financial, Functionality |
| What skills are needed to get this tool set up & usable? | Technical | 2 risk points for Cannot Tell; 1 risk point for Expert, Dev or Vendor | Financial |
| What skills are needed to USE this tool day-to-day? | Technical | 2 risk points for Cannot Tell; 1 risk point for Expert, Dev or Vendor | Financial, Functionality |
| Does this tool use proprietary file formats for saving documents & outputs? | Technical | 2 risk points if Demands; 1 risk point if Cannot Tell | Lock-in |
| Is the product reliant on 3rd-party APIs or interoperability to function? | Technical | 2 risk points if Yes | Stability & longevity |
| Have there been any newsworthy technical issues in the last 3 years? | Technical | 2 risk points if Yes | Functionality, Security |
| How clear is the Terms of Service for users (i.e. publishing customers)? | Data, Ethics & Security | 2 risk points if None Available; 1 risk point if Technical | Data control |
| Does the product have a data collection & usage policy on its website? (Can be a GDPR statement) | Data, Ethics & Security | 2 risk points if No / Cannot Find; 1 risk point if Topline only | Data control, Privacy |
| Is their data policy “GDPR everywhere”? | Data, Ethics & Security | 2 risk points if None Available or Unclear | Data control, Privacy |
| Does the product require your audience to sign in to use it? | Data, Ethics & Security | Non-scoring question | Data control, Privacy |
| IF SIGN-IN REQUIRED: Does audience sign-in use 3rd-party platforms? | Data, Ethics & Security | Non-scoring question | Lock-in |
| Does the public product website mention how to export or transfer your data to another service? | Data, Ethics & Security | 2 risk points for No / Cannot Tell; 1 risk point for Only Partial | Lock-in, Data control |
| Does the public product website mention how to delete your data? | Data, Ethics & Security | 2 risk points for No; 1 risk point for Only Partial | Lock-in, Privacy |
| Have there been any newsworthy security breaches with this product in the last 3 years? | Data, Ethics & Security | 2 risk points for Yes | Security, Privacy |
| Does the company have an accessibility statement on its website? | Data, Ethics & Security | 2 risk points for No | Mission & values |
| Does the company release information on the diversity of the workforce? | Data, Ethics & Security | 2 risk points for No | Mission & values |


Jay Owens

Research and Framework Designer for the Public Media Stack
