William Wang
Dec 15, 2023
Use a long-context GPT model
Please add support for a long-context model. Currently I get this response:
{
  "error": {
    "message": "This model's maximum context length is 4097 tokens, however you requested 5362 tokens (3314 in your prompt; 2048 for the completion). Please reduce your prompt; or completion length.",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }
}
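Until a longer-context model is available, one workaround is to cap the requested completion length so that prompt plus completion stays under the model's limit. A minimal sketch, using only the numbers reported in the error above (the 4097-token limit and the `clamp_completion` helper name are illustrative, not part of any API):

```python
# Sketch: keep prompt + completion within a model's context window.
# Numbers taken from the error above: 4097-token limit, 3314-token prompt.
MAX_CONTEXT = 4097

def clamp_completion(prompt_tokens: int, requested: int,
                     max_context: int = MAX_CONTEXT) -> int:
    """Return the largest completion length that still fits the window."""
    return max(0, min(requested, max_context - prompt_tokens))

# With a 3314-token prompt, only 4097 - 3314 = 783 completion tokens fit,
# so requesting 2048 must be clamped down.
print(clamp_completion(3314, 2048))  # 783
```

Passing the clamped value as the completion limit avoids the `invalid_request_error`, at the cost of shorter answers; switching to a longer-context model removes the need for clamping.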