ContentAutomation Boilerplate

Project scaffold for posting media from a storage backend to multiple social networks.

Structure

src/content_automation/
	adapters/
		social/
			base.py
			instagram.py
			youtube.py
		storage/
			base.py
			local.py
			s3.py
	controller.py
	factories.py
	interfaces.py
	main.py
	settings.py

Design

  • SocialNetworkAdapter interface defines a typed post_media(media_url, caption) contract.
  • Social adapters inherit from SocialNetworkBaseAdapter and use typed uplink.Consumer clients.
  • StorageAdapterBase defines exists() and get_public_url().
  • LocalFilesystemStorageAdapter and S3StorageAdapter implement storage behavior.
  • PublishController is adapter-agnostic and only depends on abstractions.
  • AppSettings uses pydantic-settings with nested settings classes per adapter/backend.
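The abstractions above can be sketched as follows. Only the names SocialNetworkAdapter, post_media, StorageAdapterBase, exists, get_public_url, and PublishController come from this README; the method signatures beyond post_media and the controller's constructor shape are illustrative assumptions, not the project's actual code.

```python
# Hedged sketch of the adapter contracts and the adapter-agnostic controller.
from abc import ABC, abstractmethod


class SocialNetworkAdapter(ABC):
    """Typed contract every social adapter satisfies."""

    @abstractmethod
    def post_media(self, media_url: str, caption: str) -> None: ...


class StorageAdapterBase(ABC):
    """Storage backends expose existence checks and public URLs."""

    @abstractmethod
    def exists(self, path: str) -> bool: ...

    @abstractmethod
    def get_public_url(self, path: str) -> str: ...


class PublishController:
    """Depends only on the abstractions above, never on concrete adapters."""

    def __init__(
        self,
        storage: StorageAdapterBase,
        networks: list[SocialNetworkAdapter],
    ) -> None:
        self._storage = storage
        self._networks = networks

    def publish(self, path: str, caption: str) -> None:
        # Resolve the media through the storage abstraction, then fan out
        # to every configured social network.
        if not self._storage.exists(path):
            raise FileNotFoundError(path)
        url = self._storage.get_public_url(path)
        for network in self._networks:
            network.post_media(url, caption)
```

Because the controller only sees the base classes, adding a new network or storage backend means implementing one adapter and registering it in the factories, with no controller changes.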

Configuration

Environment variables use the prefix CONTENT_AUTOMATION_ and the nested delimiter __ to address fields of nested settings models.

Examples:

# Targets
$env:CONTENT_AUTOMATION_TARGET_SOCIAL_NETWORKS='["instagram","youtube"]'

# Instagram
$env:CONTENT_AUTOMATION_INSTAGRAM__ACCESS_TOKEN='your-instagram-token'
$env:CONTENT_AUTOMATION_INSTAGRAM__USER_ID='your-instagram-user-id'
$env:CONTENT_AUTOMATION_INSTAGRAM__API_VERSION='v25.0'

# YouTube
$env:CONTENT_AUTOMATION_YOUTUBE__ACCESS_TOKEN='your-youtube-token'
$env:CONTENT_AUTOMATION_YOUTUBE__REFRESH_TOKEN='your-youtube-refresh-token'
$env:CONTENT_AUTOMATION_YOUTUBE__CLIENT_ID='your-google-oauth-client-id'
$env:CONTENT_AUTOMATION_YOUTUBE__CLIENT_SECRET='your-google-oauth-client-secret'
$env:CONTENT_AUTOMATION_YOUTUBE__TOKEN_URI='https://oauth2.googleapis.com/token'
$env:CONTENT_AUTOMATION_YOUTUBE__SCOPES='["https://www.googleapis.com/auth/youtube.upload"]'
$env:CONTENT_AUTOMATION_YOUTUBE__EXPIRY='2026-03-13T00:00:00Z'
$env:CONTENT_AUTOMATION_YOUTUBE__CATEGORY_ID='22'
$env:CONTENT_AUTOMATION_YOUTUBE__PRIVACY_STATUS='public'

# Storage backend (local or s3)
$env:CONTENT_AUTOMATION_STORAGE__BACKEND='local'

# Local storage settings
$env:CONTENT_AUTOMATION_STORAGE__LOCAL__ROOT_DIRECTORY='D:\media'

# S3 storage settings
$env:CONTENT_AUTOMATION_STORAGE__BACKEND='s3'
$env:CONTENT_AUTOMATION_STORAGE__S3__BUCKET_NAME='your-bucket'
$env:CONTENT_AUTOMATION_STORAGE__S3__KEY_PREFIX='uploads'
$env:CONTENT_AUTOMATION_STORAGE__S3__REGION_NAME='us-west-1'
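A minimal sketch of how the prefix and nested delimiter might be declared with pydantic-settings. The field names mirror the variables above; the exact class layout and defaults are assumptions, not the project's actual settings module.

```python
# Hedged sketch: env_prefix + env_nested_delimiter make
# CONTENT_AUTOMATION_INSTAGRAM__ACCESS_TOKEN populate instagram.access_token.
from pydantic import BaseModel
from pydantic_settings import BaseSettings, SettingsConfigDict


class InstagramSettings(BaseModel):
    access_token: str = ""
    user_id: str = ""
    api_version: str = "v25.0"


class AppSettings(BaseSettings):
    model_config = SettingsConfigDict(
        env_prefix="CONTENT_AUTOMATION_",
        env_nested_delimiter="__",
    )

    target_social_networks: list[str] = []
    instagram: InstagramSettings = InstagramSettings()
```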

Run

python -m content_automation.main path/to/video.mp4 --caption "My new post"
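The entry point takes a media path plus a --caption option; a hedged sketch of the argument parsing, assuming a plain argparse CLI (the parser layout and argument names beyond those shown in the command above are assumptions):

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    # Mirrors the invocation: python -m content_automation.main <path> --caption <text>
    parser = argparse.ArgumentParser(prog="content_automation")
    parser.add_argument("media_path", help="path to the media file in the storage backend")
    parser.add_argument("--caption", default="", help="caption for the post")
    return parser
```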

Pre-commit

Install hooks:

uv run pre-commit install --hook-type pre-commit --hook-type pre-push

Run hooks manually:

uv run pre-commit run --all-files

Docker

Build the image:

docker build -t content-automation .

Run a publish job:

docker run --rm --env-file .env -v "$PWD:/workspace" content-automation upload path/to/video.mp4 --caption "My new post"

Run any other command in the container:

docker run --rm -it --env-file .env content-automation bash