API Loops and Programmatic AI

Move beyond copy-paste. Learn to call AI from code and process hundreds or thousands of records automatically.

Intermediate · 4 stages · 8 recipes · 10-14 hours

This pathway teaches you the fundamental skill that separates casual AI use from serious productivity gains: calling AI programmatically.

When you have 500 records to classify, 300 documents to summarise, or 1,000 rows to enrich, copy-pasting into Claude is not practical. You need to write code that loops through your data, calls the AI API for each item, and saves the results.

We start with the simplest possible pattern - a Python script that processes a CSV file row by row. Then we move to document processing and more sophisticated classification tasks. By the end, you will be comfortable writing scripts that process any amount of data.
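The row-by-row pattern described above can be sketched in a few lines. This is a minimal illustration, not the pathway's recipe code: `classify` is a placeholder standing in for a real API call, and the `comment`/`label` column names are made up for the example.

```python
import csv

def classify(text):
    # Placeholder for a real API call - swap in your provider's
    # chat/completions method (OpenAI, Anthropic, etc.) here.
    return "positive" if "good" in text.lower() else "other"

def process_csv(in_path, out_path, column="comment"):
    """Read a CSV, call the model once per row, write results alongside."""
    with open(in_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    for row in rows:
        row["label"] = classify(row[column])
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
```

The same skeleton covers summarisation and enrichment too; only the placeholder function and the output columns change.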

You do not need to be a developer, but you should be comfortable running Python scripts and using Google Colab or similar environments.

Before you start

  • Basic Python skills (variables, loops, functions)
  • Familiarity with Google Colab or Jupyter notebooks
  • An OpenAI or Anthropic API key with some credit
  • Some data you want to process (CSV files, documents)

What you will achieve

  • Process hundreds or thousands of records using AI
  • Write reusable scripts for common data tasks
  • Understand the costs and trade-offs of API-based AI
  • Be ready to build more sophisticated AI applications
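On costs: most providers price per million input and output tokens, so a rough batch estimate is simple arithmetic. The helper below is a sketch; the prices are parameters you must supply from your provider's current pricing page, not real figures.

```python
def estimate_cost(n_records, avg_input_tokens, avg_output_tokens,
                  price_in_per_m, price_out_per_m):
    """Rough cost estimate for a batch job.

    Prices are in currency units per million tokens - check your
    provider's pricing page; nothing here is a real price.
    """
    input_cost = n_records * avg_input_tokens * price_in_per_m / 1_000_000
    output_cost = n_records * avg_output_tokens * price_out_per_m / 1_000_000
    return input_cost + output_cost
```

Running a small pilot (say, 20 records) first gives you realistic average token counts to plug in before committing to the full batch.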

Pathway stages

Stage 1: The Basic Loop

2-3 hours

The fundamental pattern: read CSV, call API, save results. This is the foundation for everything else.

What you will be able to do

  • Call OpenAI or Claude API from Python
  • Process a CSV file row by row
  • Handle API errors and rate limits gracefully
  • Save enriched data back to CSV
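Handling errors and rate limits usually comes down to retrying with exponential backoff. Below is one common way to wrap any flaky call; it is a sketch, and in practice you would catch your SDK's specific rate-limit and transient-error exception types rather than bare `Exception`.

```python
import random
import time

def call_with_retries(fn, max_retries=5, base_delay=1.0):
    """Retry a flaky API call with exponential backoff plus jitter.

    Catches all exceptions for brevity; in real code, catch only your
    SDK's rate-limit / transient error types so genuine bugs surface.
    """
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries - let the caller decide
            # Double the delay each attempt; jitter spreads out retries
            # so many parallel workers don't all hammer the API at once.
            delay = base_delay * (2 ** attempt) * (1 + random.random())
            time.sleep(delay)
```

Inside the CSV loop you would call `call_with_retries(lambda: classify(row[column]))` instead of calling the API directly.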

Stage 2: Document Processing

3-4 hours

Process PDFs, Word documents, and other files in bulk.

What you will be able to do

  • Extract text from PDFs and Word documents
  • Process document folders automatically
  • Extract structured data from unstructured documents
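Bulk document processing typically means walking a folder, extracting text per file type, then handing the text to the same loop you built in Stage 1. A sketch of that dispatch is below; the `.pdf` and `.docx` branches assume the third-party `pypdf` and `python-docx` packages are installed, and `handle` is whatever per-document function you supply.

```python
from pathlib import Path

def extract_text(path: Path) -> str:
    """Return plain text for a supported document type."""
    suffix = path.suffix.lower()
    if suffix == ".txt":
        return path.read_text(encoding="utf-8")
    if suffix == ".pdf":
        from pypdf import PdfReader  # pip install pypdf
        return "\n".join(page.extract_text() or ""
                         for page in PdfReader(path).pages)
    if suffix == ".docx":
        import docx  # pip install python-docx
        return "\n".join(p.text for p in docx.Document(str(path)).paragraphs)
    raise ValueError(f"Unsupported file type: {suffix}")

def process_folder(folder, handle):
    """Apply handle(filename, text) to every supported document in a folder."""
    results = {}
    for path in sorted(Path(folder).iterdir()):
        if path.suffix.lower() in {".txt", ".pdf", ".docx"}:
            results[path.name] = handle(path.name, extract_text(path))
    return results
```

Keeping extraction separate from the AI call means you can test the folder walk on cheap local functions before spending API credit.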

Stage 3: Classification at Scale

3-4 hours

Categorise, tag, and route records using AI - the most common production use case.

What you will be able to do

  • Build classification pipelines that run unattended
  • Achieve consistent categorisation across large datasets
  • Reduce manual data cleaning work significantly
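Consistent categorisation at scale usually depends on constraining the model to a fixed label set and normalising its reply. The sketch below shows that shape; the label names are invented for illustration, and `call_model` stands in for whatever function wraps your provider's API.

```python
# Hypothetical label set - replace with your own taxonomy.
ALLOWED = {"billing", "technical", "account", "other"}

def classify_record(text, call_model):
    """Ask for exactly one label; fall back to 'other' on anything else.

    call_model is any function taking a prompt string and returning the
    model's reply as a string.
    """
    prompt = (
        "Classify the support ticket into exactly one of: "
        + ", ".join(sorted(ALLOWED))
        + ". Reply with the label only.\n\nTicket: " + text
    )
    reply = call_model(prompt).strip().lower()
    # Normalising and validating the reply keeps the output column clean
    # even when the model rambles or invents a new category.
    return reply if reply in ALLOWED else "other"

def classify_batch(texts, call_model):
    return [classify_record(t, call_model) for t in texts]
```

Routing the `"other"` bucket to a human reviewer is a cheap way to catch the cases the prompt does not cover yet.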

Stage 4: Production Patterns

2-3 hours

Make your scripts robust enough for regular use.

What you will be able to do

  • Build quality checks into your pipelines
  • Handle edge cases and failures gracefully
  • Monitor AI outputs for drift or problems
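One simple drift check is to compare the label distribution of each run against a known-good baseline and flag labels whose share has shifted. This is a crude sketch, not a full monitoring system; the 0.15 threshold is an arbitrary placeholder to tune against your own data.

```python
from collections import Counter

def label_distribution(labels):
    """Fraction of rows carrying each label."""
    total = len(labels)
    return {label: count / total for label, count in Counter(labels).items()}

def drift_alert(baseline, current, threshold=0.15):
    """Return labels whose share moved more than `threshold` vs baseline.

    A sudden swing often means the prompt, the model version, or the
    incoming data has changed - worth a manual spot-check either way.
    """
    alerts = []
    for label in set(baseline) | set(current):
        if abs(baseline.get(label, 0.0) - current.get(label, 0.0)) > threshold:
            alerts.append(label)
    return sorted(alerts)
```

Logging each run's distribution alongside the output file makes these comparisons trivial later.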

Created by AI Recipes Team

Last updated: 2024-12-23