Labelf Blog

February 27, 2023

Fairly Simple Explanation of what the Labelf AI Platform does

To explain what the Labelf Platform does, let's start with ChatGPT, since that is something most people know about these days.

The "GPT" in ChatGPT means that it is a  "Generative Pre-Trained Transformer". The "generative" part of it means that its main purpose is to generate text, in the case of ChatGPT an answer to a question or a prompt. It does need to read text in order to be able to generate something relevant, but it is not its main purpose.

A text classification model is optimized to read text. Instead of generating a string of words as its answer, it gives you one answer from a predefined set of possible answers. This is much more useful if you want to analyze large amounts of text. Say, for example, that you conducted a survey and asked people whether they had any complaints about visiting your stores. With a text classification model you can quickly find out, and keep track of, how many complained about service versus how many complained that things were out of stock, without having to read thousands of answers.

It is also much better for automating software, since it is easier for the next part of the code to act on an answer when the possible answers are known beforehand. To train a model like this, you give it a number of example texts along with the answer each one should get.
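To make the idea concrete, here is a minimal sketch of training and using a text classifier on the survey example above. This uses a simple bag-of-words model from scikit-learn purely for illustration; it is not how the Labelf Platform works internally (Labelf trains Transformer models), and the example texts and label names are made up.

```python
# Minimal text classification sketch (illustrative only, not Labelf's method).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# A few labeled example answers, like the ones you supply during training.
texts = [
    "The staff was rude and unhelpful",
    "Nobody came to help me at the checkout",
    "The shelves were empty, no milk at all",
    "Half the products I wanted were out of stock",
]
labels = ["service", "service", "out_of_stock", "out_of_stock"]

# Turn each text into word counts, then fit a simple classifier on them.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(texts, labels)

# The model answers with one of the predefined labels, never free text,
# which makes it easy for downstream code to act on the result.
print(model.predict(["Everything I needed was sold out"]))
```

Because the output is always one of the known labels, counting complaints per category is just a matter of tallying predictions.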

The Labelf Platform helps you create such text classification models by providing an interface that lets you give these answers to the model quickly and effectively. It also helps you check that the model is performing as expected and put it into production.

The "Transformers" part of "Generative Pre-Trained Transformer" (as in GPT or ChatGPT) is the name for the architecture of those models. Basically how the model was constructed. The Transformer architecture revolutionized Natural Language Processing back in 2017/2018 and text classification models with that type of architecture was outperforming humans for the first time in history. In the Labelf Platform, you can train Transformers models, without having to know any of the hard technical stuff about Transformers, and still reap the same rewards.

Note: OpenAI, the creators of ChatGPT, also provide a text classification service, but as of right now it requires a pre-labeled dataset and is only available via an API. It also lacks many essential features that the Labelf Platform offers, such as cluster discovery, dashboards, language filters, synthetic data generation, a labeling interface, active learning, data on-ramps, integrations, metrics visualization, and much more.

Per Näslund

CTO & Co-Founder @ Labelf AI
