Helping Modzy Build an ML Platform

by Dr. Phil Winder, CEO

Winder.AI collaborated with the Modzy development team and MLOps Consulting to deliver a variety of solutions that make up the Modzy product, a ModelOps and MLOps platform. A summary of this work includes:

  • Developing the Open Model Interface
  • Open-sourcing chassis, the missing link that allows data scientists to build robust ML containers
  • Model monitoring and observability product features
  • MLOps and model management product features

The Problem: How to Build an ML Platform

Modzy’s goal is to help large organizations orchestrate and manage their machine learning (ML) models. This is the phase of the ML lifecycle where data scientists often take a back seat: they hand their work to ML engineers, who turn models into deployable artifacts, who in turn hand those artifacts to operational teams responsible for serving and monitoring.

The majority of companies don’t have the resources or capacity for such tightly focused roles, so they lean on platforms to do much of the heavy lifting. Modzy wants to make it as easy as possible for data scientists to release their own models, which means it needs to concentrate on automating and simplifying the process.

The Solution: Leverage Winder.AI’s Experience and Expertise

Modzy asked Winder.AI to develop a range of POCs that produced prototype versions of Modzy’s product, beginning with model management and advancing to model monitoring. We delivered working microservices that integrated directly with their backend systems, which meant they could prove viability and offer new functionality in a fraction of the time it would otherwise have taken.

This work led to further open source projects that leveraged our tight links with the MLOps.community. Modzy wanted to begin contributing to open source efforts to improve awareness of the brand. Using our ML experience, we helped define the most pressing MLOps problems and created a strategy to alleviate them. This work resulted in two open source projects: the Open Model Interface and chassis.

An image of the Modzy administrative panel.
Modzy dashboard, courtesy of Modzy.

Open Source Partners

Our community expertise, built from a decade of strategic, implementation, and marketing experience, was put to the test when Modzy asked us to collaborate on an open source project. The first problem was defining a project worth spending time on.

In our ML and RL application work, there was one area that we think is underrepresented in the ML landscape: building and maintaining servable models. A variety of vendors have come up with solutions, but in much of our work a simple, cloud-native container running on a Kubernetes cluster, or equivalent, is often good enough, especially when projects are internally facing. But data scientists, and those who work in a pure data role, often don’t have experience with Docker, Kubernetes, or the rest of the cloud-native landscape.

An image of the chassis website.
Open source project chassis empowers data scientists with a safe and fast way of building production-ready models.

So we built chassis, a library and service that allows data scientists to build robust, performant, deploy-ready containers with a single command that they can run from their notebook.
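To make that concrete, here is a rough sketch of what that notebook workflow might look like. It is illustrative only: the service URL, the process function, and the method and parameter names (create_model, test, publish, the registry credentials) are assumptions based on the chassis SDK workflow as documented at the time, and may differ from the current chassis API.

```python
import chassisml

# Connect to a running chassis build service (the URL here is an assumption for this example)
chassis_client = chassisml.ChassisClient("http://localhost:5000")

# A trivial process function: raw input bytes in, JSON-serializable prediction out.
# In practice this would load and call your trained model.
def process(input_bytes):
    text = input_bytes.decode()
    return {"length": len(text)}

# Wrap the function as a chassis model and run a quick local test
chassis_model = chassis_client.create_model(process_fn=process)
print(chassis_model.test(b"hello chassis"))

# Build and publish an OMI-conformant container image to a registry,
# all without writing a Dockerfile or touching Kubernetes.
chassis_model.publish(
    model_name="text-length-demo",   # hypothetical model name
    model_version="0.0.1",
    registry_user="my-registry-user",
    registry_pass="my-registry-pass",
)
```

The point of the design is that the data scientist stays in the notebook: the container build, the OMI-conformant API layer, and the push to a registry all happen behind that single publish call.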

As part of this work we realized that there aren’t many ML-focused API standards, and those that do exist are lacking in places. So we also open sourced the Open Model Interface (OMI), the specification that chassis containers conform to. They are fast, secure, and upgradable. In the future we hope it will become possible to deploy OMI-compatible containers to a range of clouds and vendors, but of course you can publish your OMI models to Modzy right now.

An image of the OMI website.
The Open Model Interface project aims to unify ML APIs to make it easier to deploy models to a variety of targets.

Value of This Work

By leveraging Winder.AI’s experience, Modzy was able to quickly prototype new product ideas and validate them with early adopters. We also laid the groundwork for the future direction of the product, both technically and strategically, much faster and more cheaply than if they had waited for internal engineering time.

This relationship blossomed into externally facing open-source work, which is some indication of the trust and track record we had built over the years. This work is helping to solidify Modzy in the ML marketplace and establish its open source pedigree.

Contact

If this work is of interest to your organization, then we’d love to talk to you. Please get in touch with the sales team at Winder.AI and we can chat about how we can help you.
