What happens if you treat a prompt for a long-horizon task as part of the external environment, so the LLM of your choice can decompose and chew over it recursively?

This post is for paying subscribers only

