
If you are into GenAI, you must know these 12 Terms

  • Feb 4
  • 3 min read

Studying generative AI requires an understanding of some key terms, such as LLM (Large Language Model), NLG (Natural Language Generation), and prompt engineering. Familiarity with these concepts will deepen your understanding of how generative AI works and where it is applied.


  1. LLM (Large Language Model)

  • What: Advanced AI systems trained on massive text datasets


  • Used in: ChatGPT, Claude, Gemini, Copilot


  • Real-world example: GitHub Copilot using LLMs for code suggestions


  2. Transformers

  • What: Neural network architecture that revolutionized AI


  • Used in: All modern language models (BERT, GPT, T5)


  • Real-world example: Google BERT powering search results understanding
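The key ingredient transformers introduced is the attention mechanism. The sketch below is a toy, pure-Python version of scaled dot-product attention over tiny 2-dimensional vectors; real models use large learned weight matrices and many attention heads, so treat this as an illustration of the math only.

```python
import math

def scaled_dot_product_attention(queries, keys, values):
    """Toy scaled dot-product attention over lists of small vectors."""
    d_k = len(keys[0])
    output = []
    for q in queries:
        # Attention scores: dot(q, k) / sqrt(d_k) for each key
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in keys]
        # Softmax the scores into attention weights
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        weights = [e / total for e in exps]
        # Output is the weighted sum of the value vectors
        out = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
        output.append(out)
    return output

q = [[1.0, 0.0]]                      # one query vector
k = [[1.0, 0.0], [0.0, 1.0]]          # two key vectors
v = [[1.0, 2.0], [3.0, 4.0]]          # two value vectors
print(scaled_dot_product_attention(q, k, v))
```

Because the query points in the same direction as the first key, the output leans toward the first value vector.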


  3. Prompt Engineering

  • What: Art of crafting effective instructions for AI models


Used in:


  • Business applications


  • Content creation


  • Specialized tasks


  • Real-world example: Writing prompts for image generation in DALL-E
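In practice, prompt engineering often means assembling a prompt from reusable parts: a role, a task, and constraints. A minimal sketch (the role and constraint strings here are invented examples, not anything an API requires):

```python
def build_prompt(role, task, constraints, examples=None):
    """Assemble a structured prompt from common prompt-engineering parts."""
    parts = [f"You are {role}.", f"Task: {task}", "Constraints:"]
    parts += [f"- {c}" for c in constraints]
    if examples:
        parts.append("Examples:")
        parts += [f"- {e}" for e in examples]
    return "\n".join(parts)

prompt = build_prompt(
    role="a senior technical writer",
    task="Summarize the attached release notes in three bullet points.",
    constraints=["Plain English", "No marketing language"],
)
print(prompt)
```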


  4. Fine-tuning

  • What: Customizing pre-trained models for specific tasks


Used in:


  • Industry-specific applications


  • Custom chatbots


  • Specialized tools


  • Real-world example: Fine-tuning models for medical diagnosis
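Fine-tuning starts with a training file of example conversations. Several providers (OpenAI among them) accept a chat-style JSONL format, one JSON object per line; the medical content below is invented purely for illustration:

```python
import json

# One training example in chat-style JSONL format: a system instruction,
# a user turn, and the assistant reply we want the model to learn.
record = {
    "messages": [
        {"role": "system",
         "content": "You are a cautious medical triage assistant."},
        {"role": "user",
         "content": "Patient reports a persistent dry cough for 3 weeks."},
        {"role": "assistant",
         "content": "A cough lasting over 3 weeks warrants a clinical evaluation."},
    ]
}
jsonl_line = json.dumps(record)  # one such line per example in the training file
print(jsonl_line)
```

A real fine-tuning run would collect hundreds or thousands of such lines and upload the file to the provider's fine-tuning endpoint.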


  5. Embeddings

  • What: Numerical representations of text/images


Used in:


  • Search engines


  • Recommendation systems


  • Document comparison


  • Real-world example: Pinecone using embeddings for vector search
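Once text is turned into vectors, "similar meaning" becomes "nearby vectors", usually measured with cosine similarity. A sketch with hand-made 3-dimensional vectors (real embedding models produce hundreds or thousands of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings"; the numbers are invented for illustration
cat = [0.9, 0.1, 0.2]
kitten = [0.85, 0.15, 0.25]
car = [0.1, 0.9, 0.3]

print(cosine_similarity(cat, kitten))  # high: similar meaning
print(cosine_similarity(cat, car))     # lower: different meaning
```

Vector databases like Pinecone are essentially this comparison done efficiently over millions of stored vectors.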


  6. RAG (Retrieval Augmented Generation)

  • What: Combining external knowledge with AI generation


Used in:


  • Enterprise chatbots


  • Documentation tools


  • Customer support


  • Real-world example: Enterprise chatbots accessing company documentation
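The RAG pattern is: retrieve the most relevant documents, then stuff them into the prompt as context. The sketch below uses naive word overlap as a stand-in for the embedding-based vector search a production system would use; the documents are invented examples:

```python
def retrieve(query, documents, top_k=1):
    """Rank documents by word overlap with the query (stand-in for vector search)."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_rag_prompt(query, documents):
    # Prepend the retrieved context so the model answers from it, not from memory
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Refunds are processed within 5 business days.",
    "Our office is closed on public holidays.",
]
print(build_rag_prompt("How long do refunds take?", docs))
```

Grounding the answer in retrieved company documents is also a common way to reduce hallucination.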


  7. Tokens

  • What: Units of text processing in AI models


Used in:


  • Model capacity planning


  • Cost calculation


  • Input/output management


  • Real-world example: GPT-4's 32k token context window
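For capacity planning and cost estimates, a common rule of thumb is roughly 4 characters per English token. Real models use subword tokenizers (such as BPE), so actual counts differ; the price used below is a placeholder, not any provider's real rate:

```python
def estimate_tokens(text):
    """Rough token estimate via the ~4-characters-per-token rule of thumb."""
    return max(1, len(text) // 4)

def estimate_cost(text, price_per_1k_tokens):
    """Estimated cost; price_per_1k_tokens is a placeholder value."""
    return estimate_tokens(text) / 1000 * price_per_1k_tokens

doc = "word " * 800  # about 4000 characters
print(estimate_tokens(doc))
print(estimate_cost(doc, price_per_1k_tokens=0.01))
```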


  8. Hallucination

  • What: AI generating false but plausible information


Impact on:


  • Business applications


  • Content generation


  • Decision support


  • Real-world example: ChatGPT generating incorrect historical dates


  9. Zero-shot Learning

  • What: AI performing tasks without specific training


Used in:


  • Classification tasks


  • Language understanding


  • New use cases


  • Real-world example: Classifying new product categories without training
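Zero-shot classification usually amounts to listing the candidate labels in the prompt and letting the model pick one, with no task-specific training. A sketch (the labels and text are invented examples):

```python
def zero_shot_prompt(text, labels):
    """Ask a model to classify text into labels it was never explicitly trained on."""
    return (
        f"Classify the following text into exactly one of these categories: "
        f"{', '.join(labels)}.\n"
        f"Text: {text}\n"
        "Category:"
    )

print(zero_shot_prompt(
    "Solar-powered drone for crop monitoring",
    ["agritech", "fintech", "healthtech"],
))
```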


  10. Chain-of-Thought

  • What: Step-by-step reasoning in AI responses


Used in:


  • Problem solving


  • Mathematical calculations


  • Logic tasks


  • Real-world example: Solving complex math problems step-by-step
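A common way to elicit chain-of-thought is few-shot prompting: include a worked exemplar that shows the step-by-step format before the real question. The exemplar below is invented for illustration:

```python
# One worked exemplar demonstrating the reasoning format we want the model to copy
COT_EXEMPLAR = (
    "Q: A shop sells pens at $2 each. How much do 7 pens cost?\n"
    "A: Each pen costs $2. 7 pens cost 7 * 2 = $14. The answer is 14.\n"
)

def cot_prompt(question):
    """Few-shot chain-of-thought: exemplar first, then the real question."""
    return f"{COT_EXEMPLAR}\nQ: {question}\nA:"

print(cot_prompt("A ticket costs $13. How much do 5 tickets cost?"))
```

For zero-shot chain-of-thought, simply appending an instruction like "Let's think step by step." to the question has a similar effect.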


  11. Context Window

  • What: Maximum text length AI can process


Impact on:


  • Document processing


  • Conversation length


  • Task complexity


  • Real-world example: Processing long legal documents
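When a conversation outgrows the window, a common strategy is to keep only the most recent messages that fit the token budget. A sketch, reusing the rough 4-characters-per-token estimate from the Tokens section as a stand-in for a real tokenizer:

```python
def fit_to_context(messages, max_tokens,
                   count_tokens=lambda m: len(m) // 4 + 1):
    """Keep the most recent messages whose estimated token total fits the window."""
    kept, total = [], 0
    for msg in reversed(messages):       # walk newest-first
        cost = count_tokens(msg)
        if total + cost > max_tokens:
            break                        # older messages are dropped
        kept.append(msg)
        total += cost
    return list(reversed(kept))          # restore chronological order

history = ["hello there", "how can I help?",
           "summarize this contract for me please"]
print(fit_to_context(history, max_tokens=12))
```

Long-document workflows (legal contracts, for example) use the same idea, chunking the text so each piece fits the window.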


  12. Temperature

  • What: Controls AI output randomness


Used in:


  • Creative writing


  • Code generation


  • Response variation


Settings:


  • Low (0.0): Consistent, focused


  • High (0.7-1.0): Creative, varied


  • Real-world example: Adjusting creativity in marketing copy generation
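Under the hood, temperature divides the model's raw scores (logits) before they are turned into sampling probabilities. A minimal sketch with made-up logits; note that a temperature of exactly 0 would divide by zero, which is why implementations treat 0 as greedy argmax:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert raw scores into sampling probabilities.
    Lower temperature sharpens the distribution; higher temperature flattens it."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                          # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]  # scores for three candidate next tokens
print(softmax_with_temperature(logits, 0.2))  # near-deterministic: top token dominates
print(softmax_with_temperature(logits, 1.0))  # flatter: more varied sampling
```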

What other GenAI terms do you use regularly? Please share them below, and I will add them in the next iteration.

  1. Generative AI: AI systems that can generate new content in text, image, and music form by learning patterns from existing data.


  2. LLM: Large Language Model; an advanced model, such as GPT-3, capable of producing human-like text and performing a wide range of natural language processing tasks.


  3. Chatbot: Software that uses generative AI to hold conversations, providing information or assistance through natural language processing.


  4. GPT (Generative Pre-trained Transformer): A powerful language generator that builds on patterns learned from diverse datasets.


  5. Prompt: The text or file given to a generative AI to elicit a response, which may be text, files, images, or even audio or video.


  6. Prompt Engineering: Crafting specific text input to guide an AI toward producing your desired output.


  7. Tokens: Meaningful units, like words or characters, which form the basis of understanding and generation in language models.


  8. Context Window: The number of preceding words or tokens that a model takes into account when generating or interpreting text.


  9. Fine-Tuning: The process of additional training on specific data to adapt a model's performance for particular applications.


  10. Attention Mechanism: A module that enables models to pay attention to the relevant parts of input data, thereby enhancing decision-making and output generation.


  11. AI Image Generation: Models that generate images from text inputs, trained on vast datasets of visual content.


  12. Embeddings: Numerical representations of words or entities that capture their semantic relationships, aiding in language understanding.


Additional Concepts

  1. Retrieval Augmented Generation (RAG): A technique that enhances prompts by sourcing relevant information from a vector database.


  2. Graph Retrieval Augmented Generation (GraphRAG): An advanced version of RAG that uses graph structures during generation to boost the accuracy and relevance of the output.


  3. AI Agent: An intelligent entity that performs specific tasks autonomously within AI systems, enhancing efficiency and adaptability.

