Definitions and Acronyms

As with most emerging technologies, there are a lot of buzzwords, acronyms, and phrases that get used liberally. Here’s an ever-growing list to help level the playing field.

Acronyms

AI

Artificial Intelligence

Artificial Intelligence refers to the development of computer systems capable of performing tasks that typically require human intelligence, such as learning, reasoning, problem-solving, and decision-making.

ChatGPT

Chat Generative Pre-trained Transformer

ChatGPT is an AI-based language model developed by OpenAI. It is designed to generate human-like responses in conversational contexts. The “GPT” in ChatGPT stands for “Generative Pre-trained Transformer,” referring to the underlying model architecture used in the language model.

LLM

Large Language Model

A Large Language Model is a sophisticated artificial intelligence system that has been trained on vast amounts of text data to generate coherent and contextually relevant human-like responses.

Definitions

Token

A token, in the context of generative AI and natural language processing, is a unit of text: a document is split into words, sub-words, or symbols according to a certain rule set. In English, a token could be as short as one character, like ‘a’ or ‘I’, or as long as a word like ‘elephant’. Even punctuation characters like a comma or period can be tokens. The way text is tokenized affects how the model interprets and generates it.
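To make this concrete, here is a minimal sketch of tokenization using a simple regular expression. This is illustrative only: production language models use learned subword tokenizers (such as byte-pair encoding), not regex splitting.

```python
import re

def simple_tokenize(text):
    # Split into runs of word characters OR single punctuation marks.
    # Real LLM tokenizers learn subword units from data instead.
    return re.findall(r"\w+|[^\w\s]", text)

tokens = simple_tokenize("I saw an elephant, honestly!")
# Words and punctuation each become separate tokens:
# ['I', 'saw', 'an', 'elephant', ',', 'honestly', '!']
```

Note that under a subword scheme, a rare word like ‘elephant’ might itself be split into several smaller tokens.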

Zero-Shot Learning

Zero-shot learning is when a model can make predictions for things it has never seen before, using its previous knowledge to understand and recognize new items or tasks. It’s like learning to identify new objects without being directly trained on them.
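In practice with language models, zero-shot often means prompting for a task with no worked examples at all, relying only on what the model learned during pretraining. A minimal sketch, with a made-up sentiment-classification task:

```python
def zero_shot_prompt(review):
    # No examples are provided -- just the task description and the input.
    # The task and labels here are hypothetical, for illustration only.
    return (
        "Classify the sentiment of the following review as "
        "Positive or Negative.\n\n"
        f"Review: {review}\n"
        "Sentiment:"
    )

prompt = zero_shot_prompt("The battery died after two days.")
```

The model is expected to complete the prompt with a label it was never explicitly shown how to produce for this task.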

Few-shot Learning

Few-shot learning refers to the capacity of a machine learning model to understand a task and start generating appropriate outputs from a very small number of examples, typically fewer than ten. It pushes the boundaries of traditional machine learning, where models often require hundreds or thousands of training examples to perform a task effectively. In the context of AI language models like GPT-3, few-shot learning means the model can infer the desired output format or style from just a few example inputs.
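Building on the zero-shot idea above, a few-shot prompt prepends a handful of labeled examples so the model can infer the desired format. A minimal sketch; the examples and labels are made up for illustration:

```python
# Hypothetical labeled examples shown to the model in the prompt itself.
EXAMPLES = [
    ("Great service and friendly staff.", "Positive"),
    ("The food was cold and bland.", "Negative"),
]

def few_shot_prompt(review):
    # Demonstrations first, then the new input in the same format,
    # ending where the model should write its answer.
    parts = ["Classify the sentiment of each review."]
    for text, label in EXAMPLES:
        parts.append(f"Review: {text}\nSentiment: {label}")
    parts.append(f"Review: {review}\nSentiment:")
    return "\n\n".join(parts)

prompt = few_shot_prompt("Checkout took forever.")
```

No model weights are updated here; the “learning” happens entirely in context, from the examples in the prompt.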
