Skaff is an incubator
for Artefact technical products

Elevate technical delivery standards.
Improve commercial success.
Consolidate Artefact's position as a leader in data.

Skaff provides open-source knowledge and deployable solutions that solve foundational technical problems.

There is significant technical overhead before data and AI projects can demonstrate value.

Skaff takes on this foundational work, building high-quality accelerators that streamline build and deployment so teams can focus on value-added work.

Knowledge Packs

Get a running start

Speed up your onboarding onto a technology or area of expertise by walking through one of our knowledge packs.

What is in the box?

Each knowledge pack includes a 45-minute hands-on exercise
and our collective convictions on how to
approach the subject.

Try one

Deployable packages

Cut through the boilerplate

Accelerate the development and industrialization of data projects with off-the-shelf open-source software.

What does it look like?

These accelerators can be Python packages,
Terraform modules, Git repository templates,
dashboard wireframes, and much more.

Try one

Fast-track your GenAI app using
our LangChain industrialization kit.

Success stories

Private Equity

By leveraging Skaff’s GenAI accelerators, an Artefact team quickly demonstrated the significant time savings achievable by indexing and querying unstructured data for M&A.

Analysts were able to ask questions about due diligence documents, market studies, expert interviews, and other reports in natural language. This allowed them to cross-reference information easily, greatly improving productivity.

Greenlit to scale to 1,500 users.

Consumer beauty

When building a data platform to support marketing use cases, Skaff accelerators fast-tracked the deployment of data lakes, data pipelines, access control, FinOps, and data governance.

With this taken care of in a matter of days instead of weeks or months, data engineers were able to focus on building data products and serving strategic use cases for the brand.

Retail

To analyze data streaming from points of sale for a fraud detection use case, Skaff’s dbt server accelerator was used to deploy and schedule analytics pipelines.

This allowed the Artefact team to quickly gain insights into fraud detection events and other incidents.

Having this accelerator ready to go allowed them to shave off weeks of development time and focus on their product.

Meet the Skaff Staff

Alexis Vialaret
Robin Doumerc

Medium blog articles by our tech experts

Choice-Learn: Large-scale choice modeling for operational contexts through the lens of machine learning

Discrete choice models aim at predicting choice decisions made by individuals from a menu of alternatives, called an assortment. Well-known use cases include predicting a...

The era of generative AI: What’s changing

The abundance and diversity of responses to ChatGPT and other generative AIs, whether skeptical or enthusiastic, demonstrate the changes they're bringing about and the impact...

How Artefact managed to develop a fair yet simple career system for software engineers

In today’s dynamic and ever-evolving tech industry, a career track can often feel like a winding path through a dense forest of opportunities. With rapid...

Why you need LLMOps

This article introduces LLMOps, a specialised branch merging DevOps and MLOps for managing the challenges posed by Large Language Models (LLMs)...

Unleashing the Power of LangChain Expression Language (LCEL): from proof of concept to production

In less than a year, LangChain has become one of the most used Python libraries for interacting with LLMs, but LangChain was mostly a library...

How we handled profile ID reconciliation using Treasure Data Unification and SQL

In this article, we explain the challenges of ID reconciliation and demonstrate our approach to creating a unified profile ID in a Customer Data Platform, specifically...

Snowflake’s Snowday ’23: Snowballing into Data Science Success

As we reflect on the insights shared during the ‘Snowday’ event on November 1st and 2nd, a cascade of exciting revelations about the future of...

How we interview and hire software engineers at Artefact

We go through the skills we are looking for, the different steps of the process, and the commitments we make to all candidates.

Encoding categorical features in forecasting: are we all doing it wrong?

We propose a novel method for encoding categorical features specifically tailored for forecasting applications.

How we deployed a simple wildlife monitoring system on Google Cloud

We collaborated with Smart Parks, a Dutch company that provides advanced sensor solutions to conserve endangered wildlife...

Deploying Stable Diffusion on Vertex AI

This article provides a guide to deploying the Stable Diffusion model, a popular image generation model, on Google Cloud using Vertex AI.

All you need to know to get started with Vertex AI Pipelines

A practical presentation of a tool built from our experience using Vertex AI Pipelines in a project running in production.