Welcome to the LangGraph Playlist repository!
This playlist is designed to help you understand LangGraph from the ground up — starting with basic graph concepts like nodes and edges, all the way to advanced topics like tool nodes, multi-agent systems, persistence, and human-in-the-loop workflows.
Each video walks you through real, practical examples so you can build production‑ready AI applications using stateful, graph-based pipelines.
To keep your AI projects clean and organized, we recommend using conda environments. Follow the steps below to install Miniforge and set up your environment.
Download the installer from the official repository (the link below is for Apple Silicon macOS; pick the installer that matches your OS and architecture):
https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-MacOSX-arm64.sh
Run the following commands:
chmod +x ~/Downloads/Miniforge3-MacOSX-arm64.sh
sh ~/Downloads/Miniforge3-MacOSX-arm64.sh
source ~/miniforge3/bin/activate
conda create --prefix ./env python=3.13
conda activate ./env
pip install -r requirements.txt
Your LangGraph environment is ready to build powerful AI apps 🚀
- Introduction to LangGraph fundamentals.
- Understanding nodes, edges, and state in a graph.
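The core idea can be sketched without the framework: nodes are functions that take state and return updates, and edges decide which node runs next. This is a minimal, framework-free Python sketch of that pattern (LangGraph's `StateGraph` provides this machinery natively; all names here are illustrative):

```python
# Nodes are functions: state in, partial state update out.
# Edges map each node to its successor; "END" terminates the run.

def greet(state: dict) -> dict:
    return {"message": f"Hello, {state['name']}!"}

def shout(state: dict) -> dict:
    return {"message": state["message"].upper()}

NODES = {"greet": greet, "shout": shout}
EDGES = {"greet": "shout", "shout": "END"}

def run_graph(entry: str, state: dict) -> dict:
    node = entry
    while node != "END":
        state = {**state, **NODES[node](state)}  # merge the node's update
        node = EDGES[node]                        # follow the edge
    return state

print(run_graph("greet", {"name": "LangGraph"}))
# {'name': 'LangGraph', 'message': 'HELLO, LANGGRAPH!'}
```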
- Building graphs where nodes execute one after another.
- Understanding how data flows through a linear pipeline.
- Adding conditional edges to route execution based on state.
- Implementing dynamic decision-making within a graph.
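Conditional routing boils down to a router function that inspects state and names the next node. A hedged, framework-free sketch (the node names and threshold are made up for illustration):

```python
# A router inspects state and picks the next node to execute.
def router(state: dict) -> str:
    # Route on a value computed by an upstream node.
    return "refine" if state["score"] < 0.8 else "publish"

def refine(state: dict) -> dict:
    return {**state, "status": "needs work"}

def publish(state: dict) -> dict:
    return {**state, "status": "published"}

NODES = {"refine": refine, "publish": publish}

def step(state: dict) -> dict:
    return NODES[router(state)](state)

print(step({"score": 0.5})["status"])  # needs work
print(step({"score": 0.9})["status"])  # published
```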
- Creating graphs with cycles and loops for iterative processing.
- Understanding when and how to break out of loops.
- Running multiple nodes in parallel to improve efficiency.
- Fan-out and fan-in patterns in LangGraph.
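Fan-out/fan-in can be sketched with a thread pool: independent branches run concurrently, then their partial results merge back into one state. A framework-free sketch (the branch functions are toy stand-ins):

```python
from concurrent.futures import ThreadPoolExecutor

# Fan-out: run independent branch nodes concurrently.
# Fan-in: merge their partial results back into one state.
def fetch_weather(city: str) -> dict:
    return {"weather": f"sunny in {city}"}

def fetch_news(city: str) -> dict:
    return {"news": f"headlines for {city}"}

def fan_out_fan_in(state: dict) -> dict:
    with ThreadPoolExecutor() as pool:
        results = pool.map(lambda f: f(state["city"]),
                           [fetch_weather, fetch_news])
    merged = dict(state)
    for r in results:        # fan-in: merge each branch's output
        merged.update(r)
    return merged

out = fan_out_fan_in({"city": "Paris"})
print(sorted(out))  # ['city', 'news', 'weather']
```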
- Understanding state reducers to manage and merge state updates.
- Using custom reducers for complex state management.
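A reducer decides how a node's update is combined with the existing value for a key, instead of simply overwriting it. A sketch of the idea (in LangGraph you attach reducers such as `operator.add` to state fields via `Annotated` type hints; the keys below are illustrative):

```python
import operator

# Per-key reducers: operator.add concatenates lists; a custom reducer
# can implement any merge policy (here: keep the higher score).
REDUCERS = {
    "messages": operator.add,
    "best_score": max,
}

def apply_update(state: dict, update: dict) -> dict:
    new = dict(state)
    for key, value in update.items():
        reduce = REDUCERS.get(key)
        # Reduce when a reducer exists and the key is already present;
        # otherwise fall back to plain overwrite.
        new[key] = reduce(state[key], value) if reduce and key in state else value
    return new

s = {"messages": ["hi"], "best_score": 0.4}
s = apply_update(s, {"messages": ["hello"], "best_score": 0.2})
print(s)  # {'messages': ['hi', 'hello'], 'best_score': 0.4}
```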
- Integrating a Language Model as a node inside the graph.
- Passing state context to and from the LLM.
- Combining LLM calls with conditional routing for smarter pipelines.
- Building graphs that adapt based on LLM output.
- Building a conversational chatbot using LangGraph state management.
- Maintaining context across multiple turns of a conversation.
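Multi-turn context is an accumulating message history that the model node sees in full on every turn. A sketch with a fake model standing in for a real LLM call:

```python
# Each turn appends to the history; echo_model is a stand-in for an LLM.
def echo_model(messages: list) -> str:
    turns = len([m for m in messages if m["role"] == "user"])
    return f"You said: {messages[-1]['content']} (turn {turns})"

def chat_turn(state: dict, user_input: str) -> dict:
    messages = state["messages"] + [{"role": "user", "content": user_input}]
    reply = echo_model(messages)  # the model sees the whole history
    return {"messages": messages + [{"role": "assistant", "content": reply}]}

state = {"messages": []}
state = chat_turn(state, "hello")
state = chat_turn(state, "how are you?")
print(state["messages"][-1]["content"])  # You said: how are you? (turn 2)
```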
- Implementing streaming responses from LLMs inside a graph.
- Delivering real-time output to the user as tokens are generated.
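Streaming means yielding tokens as they are produced rather than waiting for the full response. A toy generator standing in for a streaming LLM:

```python
# fake_llm_stream stands in for a real streaming LLM call; each yielded
# token would be forwarded to the client as soon as it is produced.
def fake_llm_stream(prompt: str):
    for token in ["Lang", "Graph ", "streams ", "tokens."]:
        yield token

chunks = []
for token in fake_llm_stream("explain streaming"):
    chunks.append(token)  # e.g. render incrementally in a UI
print("".join(chunks))  # LangGraph streams tokens.
```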
- Adding tool nodes that can call external functions and APIs.
- Connecting LangGraph with real-world capabilities.
- Combining LLM reasoning with tool execution in a unified graph.
- Building an agent-like pipeline that decides when to use tools.
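The agent pattern is a decision step that either calls a tool or answers directly. A sketch where a simple heuristic stands in for the LLM's tool-calling choice (the tool and routing rule are invented for illustration):

```python
# A "reasoner" decides whether to call a tool or answer directly.
def calculator(expression: str) -> str:
    return str(eval(expression))  # toy tool; never eval untrusted input

TOOLS = {"calculator": calculator}

def decide(question: str):
    # A real agent would ask the LLM; here we route on a heuristic.
    if any(ch.isdigit() for ch in question):
        return ("calculator", question.strip("?= "))
    return (None, "I don't need a tool for that.")

def agent(question: str) -> str:
    tool, arg = decide(question)
    return TOOLS[tool](arg) if tool else arg

print(agent("2+3"))          # 5
print(agent("hello there"))  # I don't need a tool for that.
```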
- Adding checkpointing to save and restore graph state.
- Enabling long-running and resumable workflows.
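Checkpointing snapshots the state after each step so a run can resume from the last checkpoint instead of restarting. A minimal sketch of the idea (LangGraph's built-in checkpointers, such as `MemorySaver`, similarly key snapshots by a `thread_id`):

```python
import copy

# Snapshot state per thread; resume from the latest snapshot.
class InMemoryCheckpointer:
    def __init__(self):
        self.checkpoints = {}

    def save(self, thread_id: str, state: dict) -> None:
        # Deep-copy so later mutations don't corrupt the snapshot.
        self.checkpoints.setdefault(thread_id, []).append(copy.deepcopy(state))

    def latest(self, thread_id: str) -> dict:
        return copy.deepcopy(self.checkpoints[thread_id][-1])

cp = InMemoryCheckpointer()
cp.save("thread-1", {"step": 1, "draft": "outline"})
cp.save("thread-1", {"step": 2, "draft": "full text"})
resumed = cp.latest("thread-1")
print(resumed["step"])  # 2
```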
- Pausing graph execution to incorporate human feedback.
- Building workflows where humans can review, approve, or correct AI actions.
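Human-in-the-loop means pausing before a sensitive action until a person approves or rejects it. A framework-free sketch where a callback stands in for the human reviewer (the action and names are illustrative):

```python
# The graph proposes an action, pauses, and only executes on approval.
def propose_action(state: dict) -> dict:
    return {**state, "proposal": f"send email to {state['recipient']}"}

def run_with_review(state: dict, approver) -> dict:
    state = propose_action(state)
    if not approver(state["proposal"]):   # pause point: ask a human
        return {**state, "status": "rejected"}
    return {**state, "status": "executed"}

auto_approve = lambda proposal: True
auto_reject = lambda proposal: False

print(run_with_review({"recipient": "alice"}, auto_approve)["status"])  # executed
print(run_with_review({"recipient": "bob"}, auto_reject)["status"])     # rejected
```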
- Composing complex systems using nested subgraphs.
- Encapsulating reusable graph logic into modular components.
- Using LangGraph's Send API to dynamically dispatch messages to nodes.
- Enabling fine-grained control over node execution and state passing.
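Send-style dispatch means fanning out one worker invocation per item discovered at runtime, each with its own payload. A framework-free sketch of the pattern (LangGraph's `Send` API does this inside a graph; the worker and payload names below are invented):

```python
# Emit one ("worker", payload) send per item found at runtime, then
# collect the results.
def plan(state: dict) -> list:
    return [("summarize", {"topic": t}) for t in state["topics"]]

def summarize(payload: dict) -> str:
    return f"summary of {payload['topic']}"

WORKERS = {"summarize": summarize}

def run_sends(state: dict) -> dict:
    results = [WORKERS[name](payload) for name, payload in plan(state)]
    return {**state, "summaries": results}

out = run_sends({"topics": ["nodes", "edges"]})
print(out["summaries"])  # ['summary of nodes', 'summary of edges']
```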
- Orchestrating multiple AI agents within a single LangGraph system.
- Building collaborative agent networks for complex, multi-step tasks.
- Implementing Retrieval-Augmented Generation (RAG) inside a LangGraph agent.
- Combining vector search, LLM reasoning, and graph state for intelligent document Q&A.
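At its core, RAG retrieves the most relevant document for a query and hands it to the generator as context. A toy sketch using word overlap in place of embedding search (a real pipeline would use a vector store and an LLM; the corpus here is invented):

```python
# Toy retrieval: score documents by word overlap with the query, then
# pass the best match to a stand-in "generator".
DOCS = [
    "LangGraph models workflows as graphs of nodes and edges.",
    "Conda environments isolate Python dependencies per project.",
    "Checkpointing lets long-running workflows resume after a crash.",
]

def retrieve(query: str) -> str:
    def overlap(doc: str) -> int:
        return len(set(query.lower().split()) & set(doc.lower().split()))
    return max(DOCS, key=overlap)

def generate(query: str, context: str) -> str:
    # A real system would prompt an LLM with the retrieved context.
    return f"Based on the docs: {context}"

query = "how does checkpointing help workflows?"
print(generate(query, retrieve(query)))
```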
langgraph
langchain
langchain-openai
langchain-core
python-dotenv
notebook
Got suggestions or improvements?
Feel free to open an issue or submit a pull request.
This project is licensed under the MIT License.
See the LICENSE file for details.
Thank you for checking out the LangGraph Playlist!
Happy building with AI 🚀