minhyung / error-solutions-openai
An OpenAI API-compatible solution provider for spatie/error-solutions.
Package info
github.com/overworks/error-solutions-openai
pkg:composer/minhyung/error-solutions-openai
Requires
- php: ^8.2
- guzzlehttp/guzzle: ^7.0
- illuminate/support: ^10.0|^11.0|^12.0|^13.0
- openai-php/client: ^0.19
- psr/simple-cache: ^3.0
- spatie/error-solutions: ^2.0
Requires (Dev)
- illuminate/cache: ^10.0|^11.0|^12.0|^13.0
- orchestra/testbench: ^8.22|^9.0|^10.0|^11.0
- pestphp/pest: ^2.20|^3.0
- phpstan/phpstan: ^2.1
README
minhyung/error-solutions-openai provides a small replacement for the OpenAI
solution classes in spatie/error-solutions.
It keeps Spatie's existing solution provider flow, but lets you use modern OpenAI models and OpenAI API-compatible providers such as OpenRouter, vLLM, or Ollama-compatible servers.
Installation
composer require minhyung/error-solutions-openai
Publish the optional config file:
php artisan vendor:publish --tag="error-solutions-openai-config"
Configuration
Set your API key and model:
ERROR_SOLUTIONS_OPENAI_KEY=sk-...
ERROR_SOLUTIONS_OPENAI_MODEL=gpt-5.4-mini
For an OpenAI API-compatible provider, set a custom base URL:
ERROR_SOLUTIONS_OPENAI_KEY=...
ERROR_SOLUTIONS_OPENAI_BASE_URL=https://openrouter.ai/api/v1
ERROR_SOLUTIONS_OPENAI_MODEL=openai/gpt-5.4-mini
Extra provider headers can be configured in config/error-solutions-openai.php:
'headers' => [
    'HTTP-Referer' => env('APP_URL'),
    'X-Title' => env('APP_NAME'),
],
If your provider expects a token limit parameter other than max_tokens, set:
ERROR_SOLUTIONS_OPENAI_TOKEN_LIMIT_PARAMETER=max_completion_tokens
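Taken together, these options presumably land in the published config file. A hypothetical sketch of config/error-solutions-openai.php follows; only the 'headers' key is confirmed above, and the other key names are assumptions that mirror the ERROR_SOLUTIONS_OPENAI_* environment variables:

```php
<?php

// Hypothetical sketch of config/error-solutions-openai.php. Only the
// 'headers' key is confirmed by this README; the remaining key names
// are assumed to mirror the environment variables shown above.
return [
    'api_key' => env('ERROR_SOLUTIONS_OPENAI_KEY'),
    'model' => env('ERROR_SOLUTIONS_OPENAI_MODEL'),
    'base_url' => env('ERROR_SOLUTIONS_OPENAI_BASE_URL'),
    'token_limit_parameter' => env('ERROR_SOLUTIONS_OPENAI_TOKEN_LIMIT_PARAMETER', 'max_tokens'),
    'headers' => [
        'HTTP-Referer' => env('APP_URL'),
        'X-Title' => env('APP_NAME'),
    ],
];
```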
Usage
Register the provider in Spatie's existing config/error-solutions.php:
use Minhyung\ErrorSolutionsOpenAI\OpenAiSolutionProvider;

return [
    'solution_providers' => [
        'php',
        'laravel',
        OpenAiSolutionProvider::class,
    ],
];
You can also instantiate it directly:
use Minhyung\ErrorSolutionsOpenAI\OpenAiSolutionProvider;

$provider = new OpenAiSolutionProvider(
    apiKey: env('ERROR_SOLUTIONS_OPENAI_KEY'),
    model: 'gpt-5.4-mini',
);
The provider uses the Chat Completions API because it is the most widely supported endpoint across OpenAI-compatible model providers.
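As an illustration of that choice, a Chat Completions request body is just a model, a list of messages, and a token limit. The sketch below builds such a payload in plain PHP, including the configurable token-limit parameter name described earlier; the buildChatPayload helper is hypothetical and not part of this package.

```php
<?php

// Hypothetical helper (not part of this package) showing the shape of a
// Chat Completions request body, with a configurable name for the
// token-limit parameter ('max_tokens' vs 'max_completion_tokens').
function buildChatPayload(
    string $model,
    array $messages,
    int $tokenLimit,
    string $tokenLimitParameter = 'max_tokens',
): array {
    return [
        'model' => $model,
        'messages' => $messages,
        $tokenLimitParameter => $tokenLimit,
    ];
}

$messages = [
    ['role' => 'system', 'content' => 'You suggest fixes for PHP exceptions.'],
    ['role' => 'user', 'content' => 'Explain: Undefined variable $user'],
];

// Default OpenAI-style payload.
$default = buildChatPayload('gpt-5.4-mini', $messages, 1000);

// Provider that expects 'max_completion_tokens' instead.
$custom = buildChatPayload('gpt-5.4-mini', $messages, 1000, 'max_completion_tokens');
```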
Testing
composer test
composer analyse
Changelog
Please see the GitHub releases for more information on what has changed.
Contributing
Pull requests are welcome. Please run the test suite and static analysis before opening a pull request.
Security
If you discover a security vulnerability, please report it privately by emailing urlinee@gmail.com instead of opening a public issue.
Credits
License
The MIT License (MIT). Please see License File for more information.