API Loops and Programmatic AI
Move beyond copy-paste. Learn to call AI from code and process hundreds or thousands of records automatically.
This pathway teaches you the fundamental skill that separates casual AI use from serious productivity gains: calling AI programmatically.
When you have 500 records to classify, 300 documents to summarise, or 1,000 rows to enrich, copy-pasting into Claude is not practical. You need to write code that loops through your data, calls the AI API for each item, and saves the results.
We start with the simplest possible pattern - a Python script that processes a CSV file row by row. Then we move to document processing and more sophisticated classification tasks. By the end, you will be comfortable writing scripts that process any amount of data.
You do not need to be a developer, but you should be comfortable running Python scripts and using Google Colab or similar environments.
Before you start
- Basic Python skills (variables, loops, functions)
- Familiarity with Google Colab or Jupyter notebooks
- An OpenAI or Anthropic API key with some credit
- Some data you want to process (CSV files, documents)
What you will achieve
- ✓ Process hundreds or thousands of records using AI
- ✓ Write reusable scripts for common data tasks
- ✓ Understand the costs and trade-offs of API-based AI
- ✓ Be ready to build more sophisticated AI applications
Pathway stages
The Basic Loop
2-3 hours
The fundamental pattern: read CSV, call API, save results. This is the foundation for everything else.
Recipes in this stage
- Enrich Data at Scale with LLM APIs
The "boiling water" of programmatic AI - master this and everything else follows
What you will be able to do
- ✓ Call the OpenAI or Claude API from Python
- ✓ Process a CSV file row by row
- ✓ Handle API errors and rate limits gracefully
- ✓ Save enriched data back to CSV
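The basic loop can be sketched as below. This is a minimal, dependency-free illustration: `call_model` is a placeholder you would replace with your provider's SDK call (e.g. the `openai` or `anthropic` client), and `ai_result` is just an example column name. The retry helper shows the standard exponential-backoff pattern for rate limits.

```python
import csv
import time


def call_model(prompt):
    """Placeholder for a real API call.

    Swap in your provider's SDK here (openai, anthropic, etc.).
    """
    raise NotImplementedError("plug in your API client")


def call_with_retries(prompt, call=call_model, retries=3, backoff=2.0):
    """Retry transient failures (rate limits, timeouts) with exponential backoff."""
    for attempt in range(retries):
        try:
            return call(prompt)
        except Exception:
            if attempt == retries - 1:
                raise  # give up after the final attempt
            time.sleep(backoff * 2 ** attempt)


def enrich_csv(in_path, out_path, build_prompt, call=call_model):
    """Read a CSV, call the model once per row, and save the enriched rows.

    build_prompt(row) turns one row (a dict) into a prompt string.
    """
    with open(in_path, newline="") as f:
        rows = list(csv.DictReader(f))
    for row in rows:
        row["ai_result"] = call_with_retries(build_prompt(row), call=call)
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
```

Writing results to a *new* file rather than overwriting the input is a deliberate choice: if the run fails halfway, your source data is untouched.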
Document Processing
3-4 hours
Process PDFs, Word documents, and other files in bulk.
Recipes in this stage
- Process Documents in Bulk with APIs
Extract, analyse, and summarise documents at scale
What you will be able to do
- ✓ Extract text from PDFs and Word documents
- ✓ Process document folders automatically
- ✓ Extract structured data from unstructured documents
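The folder-processing pattern looks like this. To keep the sketch dependency-free it only extracts text from `.txt` files; for PDFs you would use a library such as pypdf, and for Word documents python-docx (both noted in the comments, not called here). `call_model` is again a placeholder for your API client.

```python
from pathlib import Path


def extract_text(path: Path) -> str:
    """Extract plain text from one file.

    For PDFs you would typically use pypdf (PdfReader, page.extract_text());
    for .docx, python-docx. This sketch handles only .txt to stay
    dependency-free.
    """
    if path.suffix.lower() == ".txt":
        return path.read_text(encoding="utf-8")
    raise ValueError(f"no extractor for {path.suffix}")


def summarise_folder(folder, call_model):
    """Walk a folder and ask the model to summarise each document.

    Returns {filename: summary}. Truncating the text guards against
    prompts that exceed the model's context window.
    """
    results = {}
    for path in sorted(Path(folder).glob("*.txt")):
        text = extract_text(path)
        prompt = (
            "Summarise the following document in two sentences:\n\n"
            + text[:8000]
        )
        results[path.name] = call_model(prompt)
    return results
```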
Classification at Scale
3-4 hours
Categorise, tag, and route records using AI - the most common production use case.
Recipes in this stage
- Classify Enquiries with AI
Categorise incoming messages automatically
- Categorise Transactions Automatically
Apply consistent categories to financial data
What you will be able to do
- ✓ Build classification pipelines that run unattended
- ✓ Achieve consistent categorisation across large datasets
- ✓ Reduce manual data cleaning work significantly
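A classification pipeline adds one crucial step to the basic loop: constraining the model's answer to a fixed label set. A sketch of that idea, with example category names and a placeholder `call_model`:

```python
# Example label set -- replace with your own categories.
CATEGORIES = ["billing", "technical", "sales", "other"]


def classify(text, call_model, categories=CATEGORIES):
    """Ask the model to pick exactly one label.

    Models sometimes reply with extra words, so the reply is normalised
    and anything outside the allowed set falls back to 'other'. That
    fallback is what keeps categorisation consistent across a large run.
    """
    prompt = (
        "Classify the following enquiry into exactly one of these "
        "categories: " + ", ".join(categories)
        + ". Reply with the category name only.\n\n" + text
    )
    reply = call_model(prompt).strip().lower()
    return reply if reply in categories else "other"


def classify_batch(texts, call_model):
    """Classify a list of messages; runs unattended once call_model is real."""
    return [classify(t, call_model) for t in texts]
```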
Production Patterns
2-3 hours
Make your scripts robust enough for regular use.
Recipes in this stage
- Check Data for Problems
Validate AI outputs before trusting them
What you will be able to do
- ✓ Build quality checks into your pipelines
- ✓ Handle edge cases and failures gracefully
- ✓ Monitor AI outputs for drift or problems
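Two simple production checks can be sketched as below. The field name `ai_result` and the label sets are illustrative, not fixed; the point is the pattern: validate every output before trusting it, and compare label distributions between runs to spot drift.

```python
def validate_outputs(records, allowed_labels, label_key="ai_result"):
    """Split records into clean and flagged.

    A record is flagged if the AI output is missing, empty, or outside
    the allowed label set -- flagged rows go to a human, not downstream.
    """
    clean, flagged = [], []
    for rec in records:
        value = (rec.get(label_key) or "").strip()
        if value and value in allowed_labels:
            clean.append(rec)
        else:
            flagged.append(rec)
    return clean, flagged


def label_distribution(records, label_key="ai_result"):
    """Count labels per run.

    Comparing distributions between runs is a cheap drift check: one
    label suddenly dominating usually means a prompt or model change.
    """
    counts = {}
    for rec in records:
        label = rec.get(label_key)
        counts[label] = counts.get(label, 0) + 1
    return counts
```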