btx/sql-ai-query

PHP package that generates SQL queries from natural language using AI, grounded in your database schema, with conversation-context support.


github.com/bachtiarpanjaitan/SqlAiQuery

pkg:composer/btx/sql-ai-query


Latest version: 1.0.1 (2026-03-28 06:31 UTC)



README


> Generate SQL queries from natural language using **ChatGPT** or **Grok**, with full conversation context support.

Features

  • 🤖 Multi-provider — OpenAI (GPT-4o, GPT-4, GPT-3.5-turbo), xAI Grok (grok-3), or any other OpenAI-compatible provider, all via the same openai-php/client package
  • 📝 Conversation context — every exchange is saved so the AI understands follow-up questions
  • 🗄️ Auto schema extraction — introspect MySQL, PostgreSQL, or SQLite via PDO, or supply a PHP array / raw DDL
  • 🧱 Fluent builder API — compose your generator in one readable chain
  • 📦 PSR-4 autoloaded, PHP 8.1+

Installation

composer require btx/sql-ai-query

Quick Start

use Btx\SqlAiQuery\SqlAiQuery;

$result = SqlAiQuery::make()
    ->useOpenAI(apiKey: 'sk-...')          // or ->useGrok('xai-...')
    ->withSchemaFromPdo($pdo)              // auto-introspect live DB
    ->generate('Show me the 10 most recent orders with the customer name');

echo $result->getSql();
// SELECT o.id, c.name AS customer_name, o.total_amount, o.created_at
// FROM orders o
// JOIN customers c ON c.id = o.customer_id
// ORDER BY o.created_at DESC
// LIMIT 10;

Configuration

Choosing a Provider

// ChatGPT (default model: gpt-4o)
->useOpenAI(apiKey: 'sk-...', model: 'gpt-4o', temperature: 0.2, maxTokens: 2048)

// Grok (default model: grok-3)
->useGrok(apiKey: 'xai-...', model: 'grok-3', temperature: 0.2, maxTokens: 2048)

// Any other OpenAI-compatible provider
->useCustomAi(apiKey: '...', model: 'gpt-4o', temperature: 0.2, maxTokens: 2048)

// Custom provider (implement AIProviderInterface)
->useProvider(new MyCustomProvider(...))
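For useProvider(), you supply your own implementation of AIProviderInterface. The interface's actual method names are defined by the package and are not shown here; as a rough sketch, assuming it exposes a single complete() method that takes chat messages and returns the raw model output, a canned-response stub (handy for tests) might look like:

```php
<?php
// Hypothetical sketch: the real AIProviderInterface ships with the package
// and its method names may differ. We assume a single complete() method
// that receives chat messages and returns the raw model output.
interface AIProviderInterface
{
    /** @param array<array{role: string, content: string}> $messages */
    public function complete(array $messages): string;
}

// A stub provider returns canned SQL instead of calling a remote API,
// which makes the rest of the pipeline testable offline.
final class StubProvider implements AIProviderInterface
{
    public function __construct(private string $cannedSql) {}

    public function complete(array $messages): string
    {
        return $this->cannedSql;
    }
}

$provider = new StubProvider('SELECT 1;');
echo $provider->complete([['role' => 'user', 'content' => 'ping']]);
// SELECT 1;
```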

Providing the Schema

// Option A — Auto-introspect from a live PDO connection
->withSchemaFromPdo($pdo)
->withSchemaFromPdo($pdo, 'my_database')  // specify database name

// Option B — PHP array
->withSchemaFromArray([
    'users' => [
        'id'    => ['type' => 'BIGINT', 'nullable' => false, 'key' => 'PRI'],
        'email' => ['type' => 'VARCHAR(255)', 'nullable' => false],
    ],
])

// Option C — Raw DDL string
->withSchemaFromDdl('CREATE TABLE users (id INT PRIMARY KEY, ...)')

// Option D — Pre-formatted description string
->withSchema('TABLE users: id (BIGINT PRI), email (VARCHAR 255) ...')
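Whichever source you pick, the package ultimately hands the model a textual schema description. As a minimal sketch of how Option B's array could be flattened into something like Option D's string (the describeSchema() helper and the exact output format are illustrative assumptions, not the package's SchemaExtractor):

```php
<?php
// Hypothetical helper: flattens an Option-B schema array into a
// prompt-friendly string. The package's real SchemaExtractor output
// format may differ.
function describeSchema(array $schema): string
{
    $lines = [];
    foreach ($schema as $table => $columns) {
        $cols = [];
        foreach ($columns as $name => $meta) {
            $col = $name . ' ' . $meta['type'];
            if (($meta['key'] ?? '') === 'PRI') {
                $col .= ' PRIMARY KEY';
            }
            if (!($meta['nullable'] ?? true)) {
                $col .= ' NOT NULL';
            }
            $cols[] = $col;
        }
        $lines[] = "TABLE {$table}: " . implode(', ', $cols);
    }
    return implode("\n", $lines);
}

echo describeSchema([
    'users' => [
        'id'    => ['type' => 'BIGINT', 'nullable' => false, 'key' => 'PRI'],
        'email' => ['type' => 'VARCHAR(255)', 'nullable' => false],
    ],
]);
// TABLE users: id BIGINT PRIMARY KEY NOT NULL, email VARCHAR(255) NOT NULL
```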

Multi-turn Conversations

Every generate() call returns a QueryResult that carries a contextId.
Pass it back to continue the same conversation:

$generator = SqlAiQuery::make()
    ->useOpenAI('sk-...')
    ->withSchemaFromPdo($pdo)
    ->withStoragePath('/var/app/contexts')  // where to persist contexts
    ->build();

// First turn
$r1 = $generator->generate('Get all customers registered in the last 30 days');
$contextId = $r1->getContextId();

// Follow-up — AI remembers the previous query
$r2 = $generator->followUp('Now add the total number of orders for each', $contextId);

// Refine further
$r3 = $generator->followUp('Only keep those with more than 3 orders', $contextId);

Contexts are persisted as JSON files so they survive between requests / CLI runs.
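The on-disk format is an implementation detail of ContextStore; the following is only a sketch of the idea (file naming and JSON layout here are assumptions, not the package's actual format):

```php
<?php
// Minimal sketch of file-based context persistence. The package's actual
// file naming and JSON layout may differ.
$dir = sys_get_temp_dir() . '/sql-ai-contexts';
@mkdir($dir, 0777, true);

$contextId = 'ctx_' . bin2hex(random_bytes(4));
$messages  = [
    ['role' => 'system',    'content' => 'You write SQL for this schema: ...'],
    ['role' => 'user',      'content' => 'Get all customers registered in the last 30 days'],
    ['role' => 'assistant', 'content' => 'SELECT * FROM customers WHERE ...'],
];

// Save: one JSON file per conversation.
file_put_contents("{$dir}/{$contextId}.json", json_encode($messages, JSON_PRETTY_PRINT));

// Load in a later request / CLI run and continue where we left off.
$restored = json_decode(file_get_contents("{$dir}/{$contextId}.json"), true);
echo count($restored); // 3
```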

API Reference

SqlAiQuery (Builder)

Method Description
::make() Create a new builder instance
->useOpenAI($key, $model?, $temp?, $maxTokens?) Configure ChatGPT provider
->useGrok($key, $model?, $temp?, $maxTokens?) Configure Grok provider
->useProvider($provider) Inject a custom AIProviderInterface
->withSchemaFromPdo($pdo, $db?) Auto-introspect schema via PDO
->withSchemaFromArray($array) Schema from PHP array
->withSchemaFromDdl($ddl) Schema from DDL string
->withSchema($string) Schema from formatted string
->withStoragePath($path) Directory for context JSON files
->withMaxContextMessages($n) Max messages to keep per context (default 20)
->build() Return a QueryGenerator instance
->generate($request, $contextId?) Build + generate in one call

QueryGenerator

Method Description
generate($request, $contextId?) Generate SQL; starts fresh if $contextId is null
followUp($request, $contextId) Continue an existing conversation
fresh($request) Force a brand-new conversation
getContextStore() Access the ContextStore
deleteContext($contextId) Delete a stored context

QueryResult

Method Description
getSql() Extracted SQL query (strips Markdown fences)
getRawResponse() Full unprocessed AI response
getContextId() Context ID for continuing the conversation
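Models frequently wrap their SQL in Markdown code fences, which is why getSql() strips them. A standalone sketch of that kind of extraction (the package's own parsing may differ):

```php
<?php
// Sketch of stripping Markdown code fences from a model response, the
// kind of clean-up QueryResult::getSql() performs. The package's exact
// extraction logic may differ.
function extractSql(string $response): string
{
    // Prefer the contents of a fenced block if one is present.
    if (preg_match('/```(?:sql)?\s*(.*?)```/s', $response, $m)) {
        return trim($m[1]);
    }
    return trim($response);
}

$raw = "Here is your query:\n```sql\nSELECT * FROM orders LIMIT 10;\n```";
echo extractSql($raw);
// SELECT * FROM orders LIMIT 10;
```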

ConversationContext

Method Description
addSystem($content) Set the system prompt
addUser($content) Add a user message
addAssistant($content) Add an assistant message
getMessages() All messages as array
trimToLastN($n) Keep only the N most recent non-system messages
toJson() / fromJson() Serialize / deserialize
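Trimming keeps token usage bounded on long conversations. A sketch of the kind of logic trimToLastN($n) implies, assuming the system prompt is always preserved (the package's actual behaviour may differ in details):

```php
<?php
// Sketch: keep the system prompt plus the N most recent non-system
// messages. Assumed behaviour; the package's implementation may differ.
function trimToLastN(array $messages, int $n): array
{
    $system = array_values(array_filter($messages, fn ($m) => $m['role'] === 'system'));
    $rest   = array_values(array_filter($messages, fn ($m) => $m['role'] !== 'system'));
    return array_merge($system, array_slice($rest, -$n));
}

$history = [
    ['role' => 'system',    'content' => 'You write SQL.'],
    ['role' => 'user',      'content' => 'q1'],
    ['role' => 'assistant', 'content' => 'a1'],
    ['role' => 'user',      'content' => 'q2'],
    ['role' => 'assistant', 'content' => 'a2'],
];

$trimmed = trimToLastN($history, 2);
echo count($trimmed); // 3 (system prompt + last 2 messages)
```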

ContextStore

Method Description
save($context) Persist a context to disk
load($contextId) Load a context from disk
exists($contextId) Check if a context exists
delete($contextId) Remove a context
list() List all stored context IDs

Supported PDO Drivers

Driver Notes
mysql Full column metadata + foreign keys via information_schema
pgsql Tables and columns from information_schema
sqlite Tables and columns via PRAGMA table_info
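For SQLite, the introspection the table above refers to works roughly like this sketch (in-memory database for illustration; the formatted schema the package actually produces is its own):

```php
<?php
// Sketch of SQLite introspection via PRAGMA table_info, the mechanism
// the sqlite driver row above refers to. Uses an in-memory database.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT NOT NULL)');

// List user tables, then describe each one.
$tables = $pdo->query(
    "SELECT name FROM sqlite_master WHERE type = 'table'"
)->fetchAll(PDO::FETCH_COLUMN);

foreach ($tables as $table) {
    foreach ($pdo->query("PRAGMA table_info({$table})") as $col) {
        printf("%s.%s %s%s\n", $table, $col['name'], $col['type'],
               $col['notnull'] ? ' NOT NULL' : '');
    }
}
// users.id INTEGER
// users.email TEXT NOT NULL
```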

Project Structure

src/
├── Contracts/
│   └── AIProviderInterface.php   # Provider contract
├── Context/
│   ├── ConversationContext.php   # Manages message history
│   └── ContextStore.php          # File-based persistence
├── Exceptions/
│   ├── SqlAiQueryException.php
│   └── ProviderException.php
├── Providers/
│   ├── OpenAIProvider.php        # ChatGPT adapter
│   └── GrokProvider.php          # Grok adapter (OpenAI-compatible)
├── Schema/
│   └── SchemaExtractor.php       # PDO / array / DDL → formatted schema
├── QueryGenerator.php            # Core engine
├── QueryResult.php               # Response wrapper + SQL extractor
└── SqlAiQuery.php                # Fluent builder (main entry-point)

License

MIT