How Social Media Platforms Use MLOps and AI Governance to Help Moderate Content

by Dr. Phil Winder, CEO

AI Governance Report

The UK government’s communications regulator, Ofcom, commissioned Winder.AI to produce a report to improve their understanding of the end-to-end AI governance processes that support the creation and deployment of automated content classifiers used in moderating online content. Together we interviewed social media platforms and moderation technology vendors to ask them about the tools, technologies and processes that are often referred to as machine learning operations (MLOps).

Download the report


Background: The Online Safety Bill, MLOps and AI Governance

The Online Safety Bill (OSB) names Ofcom as the new online safety regulator in the United Kingdom (UK). The OSB will give online platforms new duties of care aimed at improving online safety.

Online platforms keep their users safe through moderation: reviewing content against community guidelines and taking action when it violates them. Moderation can be manual or automated using technologies that fall under the banner of artificial intelligence (AI), and AI-powered moderation is becoming more common as platforms seek to scale their moderation processes.
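To make that workflow concrete, the minimal sketch below shows one common pattern: a trained classifier scores a piece of content for a potential guideline violation, and the score determines whether the content is removed automatically, escalated to a human moderator, or allowed. Everything in it, from the toy training data to the thresholds and routing labels, is an illustrative assumption and does not describe any specific platform's system.

```python
# Purely illustrative sketch of an automated content classifier with a
# human-review threshold. The tiny dataset, labels, and thresholds are
# invented for demonstration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: 1 = violates community guidelines, 0 = acceptable.
texts = [
    "buy followers now cheap",             # spam-like
    "click this link to win money",        # spam-like
    "lovely photo of your dog",            # benign
    "great article, thanks for sharing",   # benign
]
labels = [1, 1, 0, 0]

# Train a simple text classifier (TF-IDF features + logistic regression).
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

def moderate(text: str, remove_above: float = 0.9, review_above: float = 0.5) -> str:
    """Route content based on the model's estimated violation probability."""
    p_violation = model.predict_proba([text])[0][1]
    if p_violation >= remove_above:
        return "remove"        # high confidence: act automatically
    if p_violation >= review_above:
        return "human review"  # uncertain: escalate to a moderator
    return "allow"

print(moderate("win free money, click here"))
```

The band between the two thresholds is where a human moderator would review the cases the model is least certain about, which is one reason the operational processes around these classifiers matter as much as the models themselves.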

AI governance is a new and increasingly important field that provides an overarching framework for the use of AI and reporting mechanisms to stakeholders. In this report, we focussed on the operational practices surrounding the development, deployment, and ongoing maintenance of such systems.

Problem: Understanding the Challenges in AI Governance

Ofcom, traditionally known for its role in regulating the communications market, required MLOps expertise to help them understand the end-to-end processes that support the operation of AI solutions used in moderating online content. They also wanted to understand how a variety of online platforms were dealing with these challenges, ultimately to improve online safety as a whole.

The project was a second phase which followed a less thorough engagement with online platforms and stakeholders in the first phase. Winder.AI drafted a range of questions which explored the key areas of interest and discussed them with Ofcom before they were finalised and used in interviewing the online platforms.

– Sanya Osisanya, Senior Technology Adviser, Ofcom

Solution: AI Interviews and Analysis

The Winder.AI team developed a comprehensive strategy to help Ofcom interview online platforms about their MLOps solutions and AI governance practices. This included the curation of hundreds of questions about the end-to-end lifecycle of AI solutions, with a specific emphasis on the production operation of models. We then conducted interviews with leading online social media platforms, content platforms, and moderation technology vendors, collated their responses, and analysed the data to produce a report that was presented to Ofcom.

Based upon this information we subsequently wrote a public report, with commercial and sensitive information removed, which was published on the Ofcom website.

Winder.AI developed a comprehensive report with expert analysis and judgement based on the interviews with the platforms, which has increased Ofcom’s internal knowledge on the application of automated content classifiers by online platforms and the processes through which they build and maintain them. We will be using the report as we develop our approach to regulating the platforms in the near future.

– Sanya Osisanya, Senior Technology Adviser, Ofcom

Value of this Work

The core value of this work was to help Ofcom appreciate the challenges involved in developing AI technologies that are meant to be used at global scale in mission-critical systems. It also gave them a better understanding of how the leaders in the space are achieving these goals, which will help them make informed decisions about how to support the industry in the future. The publication of the report will also help to raise awareness of the challenges of moderating content at scale.

But ultimately, we hope that this work will help to improve online safety for everyone, by distributing MLOps knowledge and best practices to the wider industry.

The most impressive thing about Winder.AI was their expertise and knowledge in the AI and MLOps space. This was attested to by some of the online platforms interviewed in their discussions with Ofcom.

– Sanya Osisanya, Senior Technology Adviser, Ofcom

Contact

If this work is of interest to your organisation, then we’d love to talk to you. Please get in touch with the sales team at Winder.AI and we can chat about how we can help you.

Contact Us
