If you are into GenAI, you must know these 12 Terms
- Feb 4
- 3 min read
Studying generative AI requires an understanding of some key terms, such as LLM (Large Language Model), NLG (Natural Language Generation), and prompt engineering. Familiarity with these concepts will further your understanding of how generative AI works and its applications.

LLM (Large Language Model)
What: Advanced AI systems trained on massive text datasets
Used in: ChatGPT, Claude, Gemini, Copilot
Real-world example: GitHub Copilot using LLMs for code suggestions
Transformers
What: Neural network architecture that revolutionized AI
Used in: All modern language models (BERT, GPT, T5)
Real-world example: Google BERT powering search results understanding
Prompt Engineering
What: Art of crafting effective instructions for AI models
Used in:
Business applications
Content creation
Specialized tasks
Real-world example: Writing prompts for image generation in DALL-E
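A common prompt-engineering pattern is to state a role, a task, and an output format explicitly. A minimal sketch (the function and the wording are illustrative, not a standard API):

```python
def make_prompt(role, task, output_format):
    """Assemble a structured prompt: role, task, and output format.

    The three-part structure is a widely used prompt-engineering
    pattern; the exact wording here is illustrative.
    """
    return (
        f"You are {role}.\n"
        f"Task: {task}\n"
        f"Respond as {output_format}."
    )

print(make_prompt("a travel copywriter",
                  "write a tagline for a beach resort",
                  "a single sentence under 12 words"))
```

The same template works for image prompts: swap the role/task fields for subject, style, and detail fields.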
Fine-tuning
What: Customizing pre-trained models for specific tasks
Used in:
Industry-specific applications
Custom chatbots
Specialized tools
Real-world example: Fine-tuning models for medical diagnosis
Embeddings
What: Numerical representations of text/images
Used in:
Search engines
Recommendation systems
Document comparison
Real-world example: Pinecone using embeddings for vector search
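Similarity between embeddings is typically measured with cosine similarity. A minimal sketch using made-up 3-dimensional vectors (real models produce hundreds or thousands of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy embeddings, invented for illustration only
king = [0.9, 0.8, 0.1]
queen = [0.85, 0.82, 0.15]
banana = [0.1, 0.2, 0.95]

print(cosine_similarity(king, queen))   # high: semantically similar
print(cosine_similarity(king, banana))  # much lower: unrelated concepts
```

Vector databases like Pinecone rank stored embeddings by exactly this kind of similarity score.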
RAG (Retrieval Augmented Generation)
What: Combining external knowledge with AI generation
Used in:
Enterprise chatbots
Documentation tools
Customer support
Real-world example: Enterprise chatbots accessing company documentation
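The retrieve-then-generate flow can be sketched in a few lines. The keyword-overlap scoring below is a deliberately naive stand-in for the embedding-based vector search that production RAG systems use:

```python
def retrieve(query, documents, top_k=1):
    """Rank documents by shared words with the query (naive retrieval)."""
    query_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(query_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query, documents):
    """Augment the user question with retrieved context before generation."""
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Refunds are processed within 5 business days.",
    "Our office is open Monday to Friday.",
]
print(build_prompt("How long do refunds take?", docs))
```

The assembled prompt is what actually gets sent to the LLM, which is why RAG answers can cite company documentation the model never saw in training.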
Tokens
What: Units of text processing in AI models
Used in:
Model capacity planning
Cost calculation
Input/output management
Real-world example: GPT-4's 32k token context window
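Exact token counts depend on the model's own tokenizer (e.g. tiktoken for GPT models), but for quick cost and capacity planning a common rule of thumb for English text is roughly four characters per token:

```python
def estimate_tokens(text, chars_per_token=4):
    """Rough token estimate using the ~4-characters-per-token rule of
    thumb for English. Real counts require the model's tokenizer."""
    return max(1, round(len(text) / chars_per_token))

prompt = "Summarize the attached quarterly report in three bullet points."
print(estimate_tokens(prompt))
```

Estimates like this are good enough for budgeting, but always measure with the real tokenizer before relying on a context-window limit.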
Hallucination
What: AI generating false but plausible information
Impact on:
Business applications
Content generation
Decision support
Real-world example: ChatGPT generating incorrect historical dates
Zero-shot Learning
What: AI performing tasks without specific training
Used in:
Classification tasks
Language understanding
New use cases
Real-world example: Classifying new product categories without training
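In practice, zero-shot classification often just means supplying the label set in the prompt itself, since the model was never trained on those specific categories. A sketch (labels and wording are illustrative):

```python
labels = ["electronics", "clothing", "groceries"]
product = "Wireless noise-cancelling headphones"

# Zero-shot: the full label set appears only in the prompt; the model
# received no task-specific training on these categories.
prompt = (
    f"Classify the product into exactly one of these categories: "
    f"{', '.join(labels)}.\n"
    f"Product: {product}\n"
    f"Category:"
)
print(prompt)
```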
Chain-of-Thought
What: Step-by-step reasoning in AI responses
Used in:
Problem solving
Mathematical calculations
Logic tasks
Real-world example: Solving complex math problems step-by-step
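Chain-of-thought is usually elicited simply by asking for intermediate steps. A hypothetical before/after prompt pair for an arithmetic question:

```python
# The same question, asked plainly and with an explicit request for
# step-by-step reasoning (the wording is illustrative).
plain_prompt = "What is 17% of 240?"

cot_prompt = (
    "What is 17% of 240? Think step by step: "
    "convert the percentage to a decimal, multiply, "
    "and state the final answer on its own line."
)

# For reference, the exact answer the model should reach:
expected = 0.17 * 240  # approximately 40.8
```

Models tend to make fewer arithmetic and logic mistakes when the prompt forces the intermediate steps into the output.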
Context Window
What: Maximum text length AI can process
Impact on:
Document processing
Conversation length
Task complexity
Real-world example: Processing long legal documents
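When a document exceeds the context window, a common workaround is to split it into chunks and process each one separately. A minimal sketch using the rough 4-characters-per-token heuristic (a real pipeline would count with the model's tokenizer and overlap chunks):

```python
def chunk_text(text, max_tokens=32000, chars_per_token=4):
    """Split text into pieces that fit within a context window,
    using a rough characters-per-token heuristic."""
    max_chars = max_tokens * chars_per_token
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

document = "x" * 300_000          # stand-in for a long legal document
chunks = chunk_text(document)
print(len(chunks))                # number of pieces needed
```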
Temperature
What: Controls AI output randomness
Used in:
Creative writing
Code generation
Response variation
Settings:
Low (0.0): Consistent, focused
High (0.7-1.0): Creative, varied
Real-world example: Adjusting creativity in marketing copy generation
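Under the hood, temperature divides the model's logits before the softmax that produces next-token probabilities: low values sharpen the distribution, high values flatten it. A self-contained sketch with invented logits:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Scale logits by temperature, then softmax.

    Low temperature concentrates probability on the top token
    (consistent output); high temperature spreads it out (varied output).
    """
    scaled = [x / temperature for x in logits]
    m = max(scaled)                         # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for three candidate next tokens
logits = [2.0, 1.0, 0.1]
focused = softmax_with_temperature(logits, 0.2)   # low: near-deterministic
varied = softmax_with_temperature(logits, 1.0)    # high: more spread out
print(focused)
print(varied)
```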
What other GenAI terms do you use regularly? Please share them below, and I will add them in the next iteration.
Generative AI: AI systems that generate new content, such as text, images, and music, by learning patterns from existing data.
LLM: Large Language Model; an advanced model, such as GPT-3, capable of producing human-like text and handling a wide range of natural language processing tasks.
Chatbot: Software that uses generative AI to hold conversations, providing information or assistance through natural language processing.
GPT (Generative Pre-trained Transformer): A powerful language generator that builds on patterns learned from diverse datasets.
Prompt: The text or file given to a generative AI model to elicit a response, which may be text, files, images, or even audio or video.
Prompt Engineering: Crafting specific text input to guide an AI model toward the desired output.
Tokens: Units of text, such as words, subwords, or characters, that form the basis of understanding and generation in language models.
Context Window: The number of preceding words or tokens that a model takes into account while generating or trying to understand text.
Fine-Tuning: Additional training on specific data to adapt a model's performance to particular applications.
Attention Mechanism: A module that enables models to pay attention to the relevant parts of input data, thereby enhancing decision-making and output generation.
AI Image Generation: Models that generate images from text inputs, trained on vast datasets of visual content.
Embeddings: Numerical representations of words or entities that capture their semantic relationships, aiding in language understanding.
Additional Concepts
Retrieval Augmented Generation (RAG): A technique that enhances prompts by sourcing relevant information from a vector database.
Graph Retrieval Augmented Generation (GraphRAG): An advanced version of RAG that uses graph structures during retrieval to boost the accuracy and relevance of generated content.
AI Agent: An intelligent component that performs specific tasks autonomously within AI systems, enhancing efficiency and adaptability.


