AI-Native Development, Natively
A next-generation terminal-based AI coding assistant that combines the best features of open-source AI CLI tools with native integration into the AINative platform ecosystem.
- Features
- Installation
- Quick Start
- Configuration
- Usage
- Completed Features
- Development
- Project Structure
- Contributing
- License
- Documentation
- Support
- Acknowledgments
- Multi-Provider AI Support: Anthropic Claude, OpenAI GPT, Google Gemini, AWS Bedrock, Azure OpenAI, and Ollama
- Beautiful TUI: Sophisticated Bubble Tea-based terminal interface
- AINative Platform Integration: Native access to ZeroDB, Design Tokens, Strapi CMS, and RLHF systems
- Hybrid Authentication: JWT/OAuth 2.0 for AINative services, API keys for LLM providers
- Streaming Responses: Real-time AI responses with Server-Sent Events
- Cross-Platform: macOS, Linux, and Windows support
Download and run the installation script:

```bash
curl -fsSL https://raw.githubusercontent.com/AINative-Studio/ainative-code/main/install.sh | bash
```

This script will:
- Detect your platform and architecture automatically
- Download the latest release
- Verify checksums for security
- Install the binary to `/usr/local/bin`
Download and run the PowerShell installation script:

```powershell
irm https://raw.githubusercontent.com/AINative-Studio/ainative-code/main/install.ps1 | iex
```

This script will:
- Detect your architecture
- Download the latest release
- Verify checksums for security
- Install to `%LOCALAPPDATA%\Programs\AINativeCode`
- Add to your PATH automatically
```bash
# Add the AINative Studio tap
brew tap ainative-studio/tap

# Install AINative Code
brew install ainative-code

# Verify installation
ainative-code version
```

- Scoop (Windows)
- Chocolatey (Windows)
- APT/YUM repositories (Linux)
For manual installation or to download specific versions, visit the releases page and download the appropriate archive for your platform:
Supported Platforms:
- Linux (AMD64, ARM64)
- macOS (Intel, Apple Silicon)
- Windows (AMD64, ARM64)
See the Installation Guide for detailed manual installation instructions.
```bash
docker pull ainativestudio/ainative-code:latest
docker run -it --rm ainativestudio/ainative-code:latest
```

After installation, verify that AINative Code is working:

```bash
ainative-code version
```

Get started with AINative's cloud authentication and hosted inference:
1. Start the Python Backend:

   ```bash
   cd python-backend
   uvicorn app.main:app --reload
   ```

2. Login to AINative:

   ```bash
   ainative-code auth login-backend \
     --email your-email@example.com \
     --password your-password
   ```

3. Send Your First Chat:

   ```bash
   ainative-code chat-ainative \
     --message "Hello! Tell me about AINative" \
     --auto-provider
   ```
Learn More:
- Getting Started Guide - Complete setup instructions
- Authentication Guide - Manage credentials and tokens
- Hosted Inference Guide - Explore chat features
- API Reference - Detailed command documentation
Alternatively, use direct API key authentication:
1. Initialize configuration:

   ```bash
   ainative-code setup
   ```

2. Configure your preferred LLM provider:

   ```bash
   ainative-code config set provider anthropic
   ainative-code config set anthropic.api_key "your-api-key"
   ```

3. Start coding:

   ```bash
   ainative-code chat
   ```
Configuration file location: `~/.config/ainative-code/config.yaml`
All configuration options can be set via environment variables with the AINATIVE_CODE_ prefix:
```bash
# Basic configuration
export AINATIVE_CODE_PROVIDER=openai
export AINATIVE_CODE_MODEL=gpt-4

# API keys (recommended for security)
export AINATIVE_CODE_LLM_OPENAI_API_KEY=sk-...
export AINATIVE_CODE_LLM_ANTHROPIC_API_KEY=sk-ant-...

# Run the CLI
ainative-code chat
```

Configuration precedence (highest to lowest):

1. Command-line flags (`--provider openai`)
2. Environment variables (`AINATIVE_CODE_PROVIDER=openai`)
3. Config file (`~/.config/ainative-code/config.yaml`)
4. Default values
For a complete list of supported environment variables, see Environment Variables Documentation.
```yaml
# LLM Provider Configuration
providers:
  anthropic:
    api_key: "$(pass show anthropic)"
    model: "claude-3-5-sonnet-20241022"
    max_tokens: 4096
    temperature: 0.7
  openai:
    api_key: "${OPENAI_API_KEY}"
    model: "gpt-4"
    max_tokens: 4096

# AINative Platform Configuration
ainative:
  auth:
    token_cache: "~/.config/ainative-code/tokens.json"
    auto_refresh: true
  zerodb:
    endpoint: "https://zerodb.ainative.studio"
  strapi:
    endpoint: "https://cms.ainative.studio"

# TUI Settings
ui:
  theme: "dark"
  colors:
    primary: "#6366F1"
    secondary: "#8B5CF6"
    success: "#10B981"
    error: "#EF4444"
```

```bash
# Start interactive chat
ainative-code chat

# Chat with specific model
ainative-code chat --model claude-3-opus-20240229

# One-shot chat
ainative-code chat "Explain how to implement OAuth 2.0"
```

```bash
# Generate code from prompt
ainative-code generate "Create a REST API handler for user authentication"

# Generate and save to file
ainative-code generate "Create a REST API handler" -o handler.go
```

```bash
# Query ZeroDB
ainative-code zerodb query "SELECT * FROM users WHERE active = true"

# Extract design tokens
ainative-code design-tokens extract --format json

# Sync with Strapi CMS
ainative-code strapi sync
```

The project includes a production-ready structured logging system with:
- Structured Logging: JSON and text output formats for easy parsing and debugging
- Multiple Log Levels: DEBUG, INFO, WARN, ERROR, FATAL with configurable minimum level
- Context-Aware Logging: Automatic inclusion of request IDs, session IDs, and user IDs from Go context
- Log Rotation: Automatic rotation based on file size, age, and backup count using lumberjack
- High Performance: ~2μs per log operation, zero allocations for disabled log levels
- Thread-Safe: Global logger with mutex protection for concurrent use
- Flexible Configuration: YAML-based or programmatic configuration
```go
import (
	"context"

	"github.com/AINative-studio/ainative-code/internal/logger"
)

func main() {
	// Use global logger with default configuration
	logger.Info("Application started")

	// Structured logging with fields
	logger.InfoWithFields("User logged in", map[string]interface{}{
		"user_id": "user123",
		"email":   "user@example.com",
	})

	// Context-aware logging
	ctx := logger.WithRequestID(context.Background(), "req-123")
	log := logger.WithContext(ctx)
	log.Info("Processing request") // Automatically includes request_id
}
```

Performance Benchmarks (Apple M3):
| Operation | Time/op | Allocations |
|---|---|---|
| Simple message | 2.0 μs | 0 allocs |
| Formatted message | 2.2 μs | 1 alloc |
| Structured fields (5) | 2.9 μs | 10 allocs |
| Context-aware | 2.2 μs | 0 allocs |
| Disabled level | 0.7 ns | 0 allocs |
See docs/logging.md for complete logging documentation.
- Go 1.21 or higher
- Make (optional, for using Makefile)
```bash
# Clone the repository
git clone https://github.com/AINative-studio/ainative-code.git
cd ainative-code

# Build
make build

# Run tests
make test

# Run linter
make lint

# Install locally
make install
```

```bash
# Run all tests
make test

# Run tests with coverage
make coverage

# Run integration tests
make test-integration

# Run specific test
go test -v ./internal/llm/...
```

```text
ainative-code/
├── cmd/
│   └── ainative-code/   # Main CLI entry point
├── internal/            # Private application code
│   ├── auth/            # Authentication logic
│   ├── llm/             # LLM provider implementations
│   ├── tui/             # Terminal UI components
│   ├── config/          # Configuration management
│   ├── api/             # API clients (ZeroDB, Strapi, etc.)
│   └── database/        # Local SQLite database
├── pkg/                 # Public library code
├── configs/             # Configuration files
├── docs/                # Documentation
├── scripts/             # Build and utility scripts
├── tests/               # Integration and E2E tests
└── .github/
    └── workflows/       # CI/CD workflows
```
We welcome contributions! Please see CONTRIBUTING.md for details on our development process, coding standards, and how to submit pull requests.
Before contributing, please:
- Read our Code of Conduct
- Check existing issues and pull requests
- Review the Development Guide
Quick contribution steps:
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Make your changes and add tests
4. Ensure all tests pass (`make test`)
5. Commit your changes (`git commit -m 'Add amazing feature'`)
6. Push to the branch (`git push origin feature/amazing-feature`)
7. Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
Copyright © 2024 AINative Studio. All rights reserved.
AINative Code - AI-Native Development, Natively
Brand Colors:
- Primary: #6366F1 (Indigo)
- Secondary: #8B5CF6 (Purple)
- Success: #10B981 (Green)
- Error: #EF4444 (Red)
Comprehensive documentation is available in the /docs directory:
- Architecture Guide - System design and technical architecture
- User Guide - Getting started and usage instructions
- API Reference - Detailed API documentation
- Development Guide - Contributing and development setup
- Examples - Code examples and use cases
- Documentation: https://docs.ainative.studio/code
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Email: support@ainative.studio
Built with:
- Bubble Tea - TUI framework
- Cobra - CLI framework
- Viper - Configuration management
- zerolog - High-performance logging
Inspired by projects like:
- Aider
- GitHub Copilot CLI
- Cursor
AI-Native Development, Natively