SQL-Rooms with DuckDB - Local In-Browser
Natural Language to SQL with DuckDB. Data stays local to your browser.
Features
- DuckDB running locally in your browser
- Process large files
- Text-to-SQL, Charts
Additional Information
Content to be added.
About the App
SQL Rooms AI @ Tigzig - User Guide
1. Set Up API Keys (Required)
Before using SQL Rooms AI, you must configure your API key:
- Click the API Keys button in the top menu bar
- Enter your API key for at least one provider:
- OpenAI - Requires paid account
- Google Gemini - FREE tier available (recommended for getting started)
- Anthropic Claude - Requires paid account
- Click Save Keys
Need an API key? Click the "Get API Key" link next to each provider in the API Keys modal.
2. Select AI Model
After setting up your API key, choose your preferred model from the dropdown in the top menu:
- OpenAI models: gpt-4o-mini, gpt-4.1-nano
- Google models: gemini-2.0-flash, gemini-2.5-flash-lite, gemini-2.5-flash
- Anthropic models: claude-sonnet-4-5
You can switch models at any time.
3. Quick Start Example
To see how SQL Rooms AI works, check the saved session "Tour de France cycling analysis" in the History dropdown:
- View example queries requesting data in table format
- See how to ask for charts and visualizations
- Learn the natural language query patterns
Example queries you can try:
- "Show me the first 10 rows of the data"
- "What are the column names and types?"
- "Create a chart showing trends over time"
- "Calculate summary statistics for all numeric columns"
Core Features
Natural Language Queries
Simply type your question in the input box at the bottom of the screen and press Enter. The AI will:
- Understand your request
- Generate and execute SQL queries
- Display results in tables or charts
- Explain the analysis
How AI Understands Your Data:
- With each request, the AI receives the database schema (table structures, column names, and types)
- Sample rows from each table are included (up to 10 rows per table)
- The AI knows the structure but may not know statistical distributions of your specific data
For New Datasets:
- The AI won't initially know categorical variable distributions or data patterns
- You'll need to explore the data first with queries like "SUMMARIZE tablename" or "show me sample data"
- Alternatively, upload a separate data dictionary file with metadata
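The exploration step above can be done directly in the SQL editor. A minimal sketch, where `mytable` and `category_col` are placeholder names for your own table and a categorical column:

```sql
-- SUMMARIZE gives per-column stats (min, max, null %, approx. distinct counts)
SUMMARIZE mytable;

-- Peek at the raw rows
SELECT * FROM mytable LIMIT 20;

-- Distribution of a categorical column the AI cannot know up front
SELECT category_col, COUNT(*) AS n
FROM mytable
GROUP BY category_col
ORDER BY n DESC;
```

Running these first (or asking the AI to) gives both you and the model the distributional context that the schema alone does not carry.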
Data Import
Import File (Button at top of Data Sources panel)
Upload data files in multiple formats:
- CSV - Comma-separated values
- TSV - Tab-separated values
- Pipe-delimited (.pipe, .psv) - Pipe-separated values
- TXT - Text files with auto-delimiter detection
- Parquet - Columnar binary format
- JSON - Structured data format
Auto-Delimiter Detection:
- For .txt files, the system automatically detects whether data is comma, tab, or pipe-delimited
- No manual configuration needed
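DuckDB itself ships a CSV sniffer that behaves this way, so the detection can be sketched with DuckDB's own functions (the file name is a placeholder, and the exact mechanism the app uses is an assumption):

```sql
-- read_csv_auto sniffs the delimiter (comma, tab, pipe), header, and column types
SELECT * FROM read_csv_auto('data.txt');

-- Explicit form you can fall back to in the SQL editor if detection guesses wrong
SELECT * FROM read_csv('data.txt', delim='|', header=true);
```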
How to import:
- Click Import File button
- Select one or more files
- Files are loaded into tables (table names derived from file names)
- All data stays in your browser - nothing uploaded to servers
Import Database (Button at bottom of Data Sources panel)
Import complete DuckDB database files (.db or .duckdb):
- Click Import Database
- Select a .db or .duckdb file
- Review/edit the schema name
- Click Import
- All tables from the database are imported into a new schema
What happens:
- Creates a new schema in your database
- Imports all tables with their data
- Tables are immediately available for AI queries
- Use schema.table syntax to reference imported tables
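A sketch of the schema-qualified syntax, with `imported`, `sales`, and `customers` as placeholder names:

```sql
-- Tables imported from a .duckdb file live in their own schema
SELECT * FROM imported.sales LIMIT 10;

-- They can be joined with tables loaded from files into the main schema
SELECT c.name, SUM(s.amount) AS total
FROM imported.sales AS s
JOIN main.customers AS c ON s.customer_id = c.id
GROUP BY c.name;
```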
Data Export
Export Table
Export individual tables in your preferred format:
- Click Export Table button (bottom of Data Sources panel)
- Select the table from dropdown
- Choose format:
- CSV - Comma-separated, compatible with Excel/spreadsheets
- Pipe-delimited (TXT) - Alternative delimiter for data with commas
- Parquet - Compressed binary format for analytics tools
- Click Export
- File downloads with a timestamp (e.g., tablename_2025-10-06.csv)
Download Database
Export your entire database as a compressed archive:
- Click Download Database button (bottom of Data Sources panel)
- System creates a ZIP file containing:
- schema.sql - CREATE TABLE statements for all tables
- table1.parquet, table2.parquet, etc. - Table data in Parquet format
- README.md - Instructions for using the exported data
- ZIP file downloads automatically
Benefits:
- Parquet format is 80-90% smaller than CSV
- Preserves data types and structure
- Can be imported into Python, R, Jupyter, or other DuckDB instances
- README includes example code for loading data
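Loading the exported Parquet files into another DuckDB instance (CLI, Python, or R bindings) is a one-liner; `table1.parquet` stands in for any file from the ZIP:

```sql
-- Recreate a table from the export
CREATE TABLE table1 AS SELECT * FROM read_parquet('table1.parquet');

-- Or query the Parquet file directly, without importing it at all
SELECT COUNT(*) FROM 'table1.parquet';
```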
Session Management
History Dropdown (Top menu bar)
- New Session - Start fresh analysis session
- Past 10 Sessions - Access recent sessions (only sessions with content are saved)
- Rename Sessions - Click edit icon next to session name
- Sessions automatically save as you work
Session Limits:
- System keeps last 10 sessions with content
- Example sessions are always preserved
- Older sessions are automatically removed
Current Session
- Session name displayed in top menu (click to rename)
- Delete current session using trash icon
- Sessions persist across browser restarts
Data Management
Clear All Data (Bottom of Data Sources panel)
Permanently deletes:
- All uploaded files and tables
- All imported database schemas
- All data in the main schema
Important:
- This action cannot be undone
- Chat history is preserved
- You'll need to re-import data to continue analysis
Database Panel (Left Side)
The database icon button toggles the Data Sources panel showing:
- Table structure and schemas
- Column names and types
- Row counts
- Import/export controls
SQL Editor (Terminal Icon)
Click the terminal icon in the top menu to:
- Write and execute custom SQL queries
- Access advanced DuckDB features
- Run DDL operations (CREATE, DROP, ALTER tables)
- Execute multi-statement queries
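A sketch of a multi-statement script you could paste into the editor; all table and column names here are placeholders:

```sql
-- Statements run in sequence: build a summary table, rename a column, clean up
CREATE TABLE summary AS
SELECT category_col, COUNT(*) AS n
FROM mytable
GROUP BY category_col;

ALTER TABLE summary RENAME COLUMN n TO row_count;

DROP TABLE IF EXISTS old_table;
```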
Charts and Visualizations
To create charts, simply ask in natural language:
- "Draw a time series of events by month"
- "Show me a bar chart of categories"
- "Create a scatter plot of X vs Y"
- "Visualize the distribution of values"
The AI will:
- Query the necessary data
- Generate appropriate visualization
- Display interactive chart
- Explain the visualization
Chart Types Available:
- Line charts (time series)
- Bar charts (categorical data)
- Scatter plots (relationships)
- And more based on your data
Data Privacy
All Local Processing:
- Data stays in your browser (uses DuckDB-Wasm)
- No data uploaded to servers
- Files are stored in browser memory only
- Only API calls go to AI providers (OpenAI/Google/Anthropic)
What Gets Sent to AI:
- Your natural language questions
- Database schema (table/column structure)
- Sample rows from tables (up to 10 rows per table)
- Query results (up to 100 rows)
What Does NOT Get Sent:
- Your complete datasets
- Individual files you import
- Full table contents
Tips for Effective Use
Start with data exploration for new datasets:
- "SUMMARIZE tablename"
- "Show me the first 20 rows"
- "What are the unique values in column_name?"
Be specific in your requests:
- Good: "Show me top 10 customers by total sales in 2024"
- Less specific: "Show me some customer data"
Reference table names when working with multiple tables:
- "Join sales and customers tables on customer_id"
- "Compare metrics between table1 and table2"
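For a sense of what specificity buys you: a request like "top 10 customers by total sales in 2024" gives the AI enough detail to generate roughly the following (table and column names are placeholders from the prompt, not guaranteed output):

```sql
SELECT c.customer_name, SUM(s.amount) AS total_sales
FROM sales AS s
JOIN customers AS c ON s.customer_id = c.customer_id
WHERE s.sale_date BETWEEN DATE '2024-01-01' AND DATE '2024-12-31'
GROUP BY c.customer_name
ORDER BY total_sales DESC
LIMIT 10;
```

A vague request like "show me some customer data" leaves the aggregation, filter, and ordering unspecified, so the result is far less predictable.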
Use the History dropdown to revisit previous analyses
Switch models if you get better results with different AI providers
Save complex queries by creating new sessions for different analysis tasks
Troubleshooting
AI not responding?
- Check that API key is configured (click API Keys button)
- Verify you selected a model from the dropdown
- Check browser console for errors
Import failed?
- Verify file format is supported
- Check file isn't corrupted
- Try exporting from source in a different format
Chart not displaying?
- Ensure query returns appropriate data structure
- Try rephrasing your visualization request
- Check if data has the necessary columns for the chart type
Session stuck?
- Refresh the page
- Start a new session
- Clear browser cache if issues persist
Source Code and Build Guides
Build It
GitHub Repo:
https://github.com/amararun/sql-rooms-tigzig-s
Refer to the README and the app architecture/deployment notes for further customization.