Artefact has adapted Lean Manufacturing methodologies to remedy the seven sources of waste traditionally encountered in artificial intelligence projects.
Lean AI provides a two-part framework to address them:
- Optimisation of the production chain via the development of standard data products
- A clear emphasis on operationalising data products in the last mile
These guidelines are designed to avoid common pitfalls and streamline processing, but they can be adapted to each organisation.
Followed consistently, they deliver faster time-to-market, continuous delivery, fluid collaboration and quick decision-making to maximise profit.
Optimisation of the production chain via the development of standard data products
Standardisation: The main way to implement technological advances
In his book The Design of Everyday Things, Don Norman observes that technological advances come either from technologies themselves or from standardisation. He illustrates this with the history of the automobile. The first cars were all different, making them difficult to operate, requiring strength and specific skills. Over time, however, they became more standardised. All cars now have steering wheels, indicators and gear sticks in broadly the same places, fulfilling the same functions. By standardising these key elements, a driver can drive any car anywhere in the world. This kind of standardisation has facilitated most major technological and functional advances.
Data products as standardised, intelligent Lego
Lean AI is committed to building ecosystem-based technological bricks, or data products, on which the company can build a sustainable AI strategy.
The objective is to provide the data teams with a library of standard technological components, so that projects with the same technical characteristics can be processed using the same approach and be integrated into the rest of the ecosystem.
To be reusable, AI must be “packaged” like a Lego structure, composed of several bricks that, when combined, create a more complex model. The artificial intelligence model relies on a foundation of common denominators – generic data products – essential for the proper functioning of AI but not specific to the problem being addressed. Specialised data products can then be built on top where necessary to meet the specific requirements of the AI product.
Uber is building its common denominators with Michelangelo, an internal data science platform. The platform addresses production issues by standardising workflows and packaging data code snippets, helping data scientists share and disseminate knowledge.
In retail, the customer and the store are common data sources across most projects. They are generic data sources that can be used by specialised data products such as recommendation algorithms or algorithms to ensure stock availability.
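The retail example above can be sketched in a few lines of Python. This is purely illustrative: the class and method names are assumptions, not Artefact's actual components, and a toy rule stands in for a real recommendation model.

```python
# A minimal sketch of the "Lego" idea: a generic data product forms a shared
# foundation, and a specialised product is composed on top of it.
# All names here are illustrative assumptions, not a real Artefact API.

class CustomerDataProduct:
    """Generic brick: cleans and exposes customer data for any project."""

    def load(self):
        # In practice this would read from a governed, documented data source.
        return [{"customer_id": 1, "segment": "loyal"},
                {"customer_id": 2, "segment": "new"}]


class RecommendationProduct:
    """Specialised brick: reuses the generic customer brick."""

    def __init__(self, customers: CustomerDataProduct):
        self.customers = customers

    def recommend(self, customer_id: int) -> str:
        segment = next(c["segment"] for c in self.customers.load()
                       if c["customer_id"] == customer_id)
        # Toy rule in place of a trained model.
        return "premium_offer" if segment == "loyal" else "welcome_offer"


reco = RecommendationProduct(CustomerDataProduct())
print(reco.recommend(1))  # premium_offer
```

The point of the structure, not the toy logic, is what matters: a stock-availability product could reuse the same `CustomerDataProduct` brick without rebuilding it.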
This approach has multiple benefits:
- It builds the capability of teams by standardising approaches and building internal skills
- It increases the quality of the final product by setting development standards and quality controls
- It improves the time-to-market by centralising knowledge and learning from past experiences
Above all it allows the teams to concentrate on the essential 20% of specialist work that will increase the value generated by the final product.
A clear emphasis on operationalising data products in the last mile
Lean AI methodology puts the emphasis on the last mile of the production chain. A data product is only successful if it is widely distributed and consumed by end users.
The distribution of data products
The work of the data teams does not stop when the model is constructed; it stops when business KPIs are met. This means making the algorithm available to all end users.
For example, once a new Google Cloud component is developed, it is packaged so Google Cloud Platform users can use it easily. Google develops machine learning APIs designed to expand the usage of its AI products and drive widespread adoption. The goal of a data product is to benefit a much larger audience than the one the component was originally designed for.
A machine learning model for predicting the stock availability of a specific EAN (European Article Number) can be profoundly useful for a retailer. However, if the retailer feeds it, via a standardised API, into a dashboard available to the entire company, it is far more powerful. This dashboard can be used by store managers to track the flow of their inventory, by the marketing team to target campaigns on successful products, by the promotion team to define its promotional strategy, and by the supply team to optimise the supply chain.
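A standardised API contract of this kind could be sketched as below. The model is stubbed out and the function, field names and example EAN are assumptions; the point is the stable JSON interface that all downstream teams consume.

```python
import json

# Sketch of the "last mile": one stock-availability model wrapped in a single
# JSON contract that store, marketing, promotion and supply teams can all use.
# The model is a stub; names and values are illustrative assumptions.

def predict_stock_availability(ean: str) -> float:
    """Stub for the ML model: probability that this EAN is in stock."""
    # A real implementation would call the trained model here.
    return 0.92 if ean == "4006381333931" else 0.50


def handle_request(ean: str) -> str:
    """Wrap the model in a stable, versioned JSON contract."""
    return json.dumps({
        "ean": ean,
        "in_stock_probability": predict_stock_availability(ean),
        "model_version": "v1",  # versioning keeps downstream dashboards stable
    })


print(handle_request("4006381333931"))
```

Because every consumer reads the same contract, the model team can improve the algorithm behind `predict_stock_availability` without breaking any dashboard.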
Once the data product has been distributed to the different areas of the company, we must measure adoption and drive performance.
Artefact recommends implementing an AI Analytics tool, controlled by product owners, to promote improvement.
Example KPIs tracked by AI Analytics:
- Usage KPIs (e.g. number of users of the model)
- Technical KPIs (e.g. model performance (% error), number of failures on the production line)
- Business KPIs (e.g. savings generated, incremental turnover)
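The three KPI families above can be computed from a simple event log, as in the sketch below. The log schema, field names and figures are illustrative assumptions, not Artefact's tooling.

```python
# Toy event log for a deployed model; every field name and value is an
# illustrative assumption, not real AI Analytics data.
events = [
    {"user": "store_manager_1", "error_pct": 4.0, "saving_eur": 120, "failed": False},
    {"user": "marketing_2",     "error_pct": 6.5, "saving_eur": 80,  "failed": False},
    {"user": "store_manager_1", "error_pct": 5.0, "saving_eur": 150, "failed": True},
]

# Usage KPI: number of distinct users of the model
n_users = len({e["user"] for e in events})

# Technical KPIs: mean model error (%) and number of production failures
mean_error = sum(e["error_pct"] for e in events) / len(events)
n_failures = sum(e["failed"] for e in events)

# Business KPI: total savings generated
total_savings = sum(e["saving_eur"] for e in events)

print(n_users, round(mean_error, 2), n_failures, total_savings)
```

Keeping all three families in one place lets the product owner see, for example, that usage is growing while savings stagnate, which is a signal to revisit the model rather than the rollout.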
The AI product owner must become an expert in human behaviour
To understand how real people use the product, the product owner must spend time with them. Immersion sessions with business teams therefore work better than sending user surveys. The AI product owner should act as an ethnographer, observing how the tool they created is used in the users' work environment.
This methodology is a proven best practice for developing digital products. Pierre Fournier, Head of Product at ManoMano, recommends product owners organise “Popcorn Fridays”, where they watch video recordings of user journeys on the site. The first Friday of the month is dedicated to viewing 20 sessions on the payment path, and the last Friday of the month to watching 20 sessions of users landing on the site. This allows the product owners to identify with users and makes deciding on adjustments much easier.