Looks Like GPT-4-32k is Rolling Out
Ever gotten this error when trying to generate a large body of text with GPT-4? *"This model's maximum context length is <8192> tokens. However, your messages resulted in <a Gazillion> tokens. Please reduce the length of the messages."* So have I. Now I've just discovered that the new "gpt-4-32k" model is slowly rolling out, as …
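One way to sidestep that error is to pick the model based on a rough token estimate before sending the request. The sketch below is hypothetical: it uses a crude chars-divided-by-four heuristic (a real implementation would count tokens with a proper tokenizer such as `tiktoken`), and the 8,192 / 32,768 limits are the documented context windows of "gpt-4" and "gpt-4-32k".

```python
# Hypothetical sketch: route a chat request to "gpt-4" or "gpt-4-32k"
# depending on an estimated token count. The estimate (len // 4) is a
# rough heuristic, not a real tokenizer.

GPT4_LIMIT = 8192       # context window of gpt-4
GPT4_32K_LIMIT = 32768  # context window of gpt-4-32k


def estimate_tokens(text: str) -> int:
    """Very rough estimate: ~4 characters per token for English text."""
    return len(text) // 4


def choose_model(messages: list[dict]) -> str:
    """Return the cheapest GPT-4 variant whose context fits the messages."""
    total = sum(estimate_tokens(m["content"]) for m in messages)
    if total <= GPT4_LIMIT:
        return "gpt-4"
    if total <= GPT4_32K_LIMIT:
        return "gpt-4-32k"
    raise ValueError(f"~{total} tokens exceeds even the 32k context window")
```

A short prompt would be routed to "gpt-4", while a message body far beyond 8,192 estimated tokens would fall through to "gpt-4-32k" — assuming, of course, that your account already has access to the rolled-out model.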