AINative Code


AI-Native Development, Natively

A next-generation terminal-based AI coding assistant that combines the best features of open-source AI CLI tools with native integration with the AINative platform ecosystem.


Features

  • Multi-Provider AI Support: Anthropic Claude, OpenAI GPT, Google Gemini, AWS Bedrock, Azure OpenAI, and Ollama
  • Beautiful TUI: Sophisticated Bubble Tea-based terminal interface
  • AINative Platform Integration: Native access to ZeroDB, Design Tokens, Strapi CMS, and RLHF systems
  • Hybrid Authentication: JWT/OAuth 2.0 for AINative services, API keys for LLM providers
  • Streaming Responses: Real-time AI responses with Server-Sent Events
  • Cross-Platform: macOS, Linux, and Windows support

Installation

Quick Install

Linux and macOS

Download and run the installation script:

curl -fsSL https://raw.githubusercontent.com/AINative-Studio/ainative-code/main/install.sh | bash

This script will:

  • Detect your platform and architecture automatically
  • Download the latest release
  • Verify checksums for security
  • Install the binary to /usr/local/bin

Windows

Download and run the PowerShell installation script:

irm https://raw.githubusercontent.com/AINative-Studio/ainative-code/main/install.ps1 | iex

This script will:

  • Detect your architecture
  • Download the latest release
  • Verify checksums for security
  • Install to %LOCALAPPDATA%\Programs\AINativeCode
  • Add to your PATH automatically

Package Managers

Homebrew (macOS and Linux)

# Add the AINative Studio tap
brew tap ainative-studio/tap

# Install AINative Code
brew install ainative-code

# Verify installation
ainative-code version

Coming Soon

  • Scoop (Windows)
  • Chocolatey (Windows)
  • APT/YUM repositories (Linux)

Manual Installation

For manual installation or to download specific versions, visit the releases page and download the appropriate archive for your platform:

Supported Platforms:

  • Linux (AMD64, ARM64)
  • macOS (Intel, Apple Silicon)
  • Windows (AMD64, ARM64)

See the Installation Guide for detailed manual installation instructions.

Docker

docker pull ainativestudio/ainative-code:latest
docker run -it --rm ainativestudio/ainative-code:latest

Verify Installation

After installation, verify that AINative Code is working:

ainative-code version

Quick Start

AINative Cloud (Recommended)

Get started with AINative's cloud authentication and hosted inference:

  1. Start the Python Backend:

    cd python-backend
    uvicorn app.main:app --reload
  2. Login to AINative:

    ainative-code auth login-backend \
      --email your-email@example.com \
      --password your-password
  3. Send Your First Chat:

    ainative-code chat-ainative \
      --message "Hello! Tell me about AINative" \
      --auto-provider


Traditional Setup (API Keys)

Alternatively, use direct API key authentication:

  1. Initialize configuration:

    ainative-code setup
  2. Configure your preferred LLM provider:

    ainative-code config set provider anthropic
    ainative-code config set anthropic.api_key "your-api-key"
  3. Start coding:

    ainative-code chat

Configuration

Configuration file location: ~/.config/ainative-code/config.yaml

Environment Variables

All configuration options can be set via environment variables with the AINATIVE_CODE_ prefix:

# Basic configuration
export AINATIVE_CODE_PROVIDER=openai
export AINATIVE_CODE_MODEL=gpt-4

# API keys (recommended for security)
export AINATIVE_CODE_LLM_OPENAI_API_KEY=sk-...
export AINATIVE_CODE_LLM_ANTHROPIC_API_KEY=sk-ant-...

# Run the CLI
ainative-code chat

Configuration precedence (highest to lowest):

  1. Command-line flags (--provider openai)
  2. Environment variables (AINATIVE_CODE_PROVIDER=openai)
  3. Config file (~/.config/ainative-code/config.yaml)
  4. Default values

For a complete list of supported environment variables, see Environment Variables Documentation.

Example Configuration File

# LLM Provider Configuration
providers:
  anthropic:
    api_key: "$(pass show anthropic)"
    model: "claude-3-5-sonnet-20241022"
    max_tokens: 4096
    temperature: 0.7

  openai:
    api_key: "${OPENAI_API_KEY}"
    model: "gpt-4"
    max_tokens: 4096

# AINative Platform Configuration
ainative:
  auth:
    token_cache: "~/.config/ainative-code/tokens.json"
    auto_refresh: true

  zerodb:
    endpoint: "https://zerodb.ainative.studio"

  strapi:
    endpoint: "https://cms.ainative.studio"

# TUI Settings
ui:
  theme: "dark"
  colors:
    primary: "#6366F1"
    secondary: "#8B5CF6"
    success: "#10B981"
    error: "#EF4444"

Usage

Chat Mode

# Start interactive chat
ainative-code chat

# Chat with specific model
ainative-code chat --model claude-3-opus-20240229

# One-shot chat
ainative-code chat "Explain how to implement OAuth 2.0"

Code Generation

# Generate code from prompt
ainative-code generate "Create a REST API handler for user authentication"

# Generate and save to file
ainative-code generate "Create a REST API handler" -o handler.go

AINative Platform Operations

# Query ZeroDB
ainative-code zerodb query "SELECT * FROM users WHERE active = true"

# Extract design tokens
ainative-code design-tokens extract --format json

# Sync with Strapi CMS
ainative-code strapi sync

Completed Features

Logging System (TASK-009) ✅

The project includes a production-ready structured logging system with:

  • Structured Logging: JSON and text output formats for easy parsing and debugging
  • Multiple Log Levels: DEBUG, INFO, WARN, ERROR, FATAL with configurable minimum level
  • Context-Aware Logging: Automatic inclusion of request IDs, session IDs, and user IDs from Go context
  • Log Rotation: Automatic rotation based on file size, age, and backup count using lumberjack
  • High Performance: ~2μs per log operation, zero allocations for disabled log levels
  • Thread-Safe: Global logger with mutex protection for concurrent use
  • Flexible Configuration: YAML-based or programmatic configuration

Logging Quick Start

import (
    "context"

    "github.com/AINative-studio/ainative-code/internal/logger"
)

func main() {
    // Use global logger with default configuration
    logger.Info("Application started")

    // Structured logging with fields
    logger.InfoWithFields("User logged in", map[string]interface{}{
        "user_id": "user123",
        "email": "user@example.com",
    })

    // Context-aware logging
    ctx := logger.WithRequestID(context.Background(), "req-123")
    log := logger.WithContext(ctx)
    log.Info("Processing request") // Automatically includes request_id
}

Performance Benchmarks (Apple M3):

Operation               Time/op   Allocations
Simple message          2.0 μs    0 allocs
Formatted message       2.2 μs    1 alloc
Structured fields (5)   2.9 μs    10 allocs
Context-aware           2.2 μs    0 allocs
Disabled level          0.7 ns    0 allocs

See docs/logging.md for complete logging documentation.

Development

Prerequisites

  • Go 1.21 or higher
  • Make (optional, for using Makefile)

Building from Source

# Clone the repository
git clone https://github.com/AINative-studio/ainative-code.git
cd ainative-code

# Build
make build

# Run tests
make test

# Run linter
make lint

# Install locally
make install

Running Tests

# Run all tests
make test

# Run tests with coverage
make coverage

# Run integration tests
make test-integration

# Run specific test
go test -v ./internal/llm/...

Project Structure

ainative-code/
├── cmd/
│   └── ainative-code/      # Main CLI entry point
├── internal/               # Private application code
│   ├── auth/              # Authentication logic
│   ├── llm/               # LLM provider implementations
│   ├── tui/               # Terminal UI components
│   ├── config/            # Configuration management
│   ├── api/               # API clients (ZeroDB, Strapi, etc.)
│   └── database/          # Local SQLite database
├── pkg/                   # Public library code
├── configs/               # Configuration files
├── docs/                  # Documentation
├── scripts/               # Build and utility scripts
├── tests/                 # Integration and E2E tests
└── .github/
    └── workflows/         # CI/CD workflows

Contributing

We welcome contributions! Please see CONTRIBUTING.md for details on our development process, coding standards, and how to submit pull requests.

Before contributing, please:

  1. Read our Code of Conduct
  2. Check existing issues and pull requests
  3. Review the Development Guide

Quick contribution steps:

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Make your changes and add tests
  4. Ensure all tests pass (make test)
  5. Commit your changes (git commit -m 'Add amazing feature')
  6. Push to the branch (git push origin feature/amazing-feature)
  7. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Copyright © 2024 AINative Studio. All rights reserved.

Brand

AINative Code - AI-Native Development, Natively

Brand Colors:

  • Primary: #6366F1 (Indigo)
  • Secondary: #8B5CF6 (Purple)
  • Success: #10B981 (Green)
  • Error: #EF4444 (Red)

Documentation

Comprehensive documentation is available in the /docs directory.

Support

Acknowledgments

Inspired by projects like:

  • Aider
  • GitHub Copilot CLI
  • Cursor

AI-Native Development, Natively
