Do you need to file an issue?
Describe the bug
I followed the tutorial in get_started.md to index the book, but it didn't generate the community reports. I found these exceptions in the log:
^^^^^^^^^^^^^^^^^^^
File "/Volumes/workplace/community/graphrag/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 859, in acompletion
raise OpenAIError(
litellm.llms.openai.common_utils.OpenAIError: Error code: 400 - {'error': {'message': "Invalid parameter: 'response_format' of type 'json_schema' is not supported with this model. Learn more about supported models at the Structured Outputs guide: https://platform.openai.com/docs/guides/structured-outputs", 'type': 'invalid_request_error', 'param': None, 'code': None}}
After switching the model to gpt-4o in settings.yaml, it ran as expected. Could the default model in this repo be changed to one that supports Structured Outputs?
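For reference, the change that fixed it for me was roughly the following model section in settings.yaml. This is a minimal sketch assuming the models/default_chat_model layout used by recent GraphRAG versions; exact key names may differ between versions:

    models:
      default_chat_model:
        type: openai_chat              # assumption: OpenAI-hosted chat model
        api_key: ${GRAPHRAG_API_KEY}
        model: gpt-4o                  # supports Structured Outputs (response_format: json_schema)

gpt-4o-mini should also work if cost is a concern, since it supports Structured Outputs as well.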
Steps to reproduce
No response
Expected Behavior
No response
GraphRAG Config Used
# Paste your config here
Logs and screenshots
No response
Additional Information
- GraphRAG Version:
- Operating System:
- Python Version:
- Related Issues: