
Nvidia AI Studio

A native macOS AI coding assistant powered by NVIDIA NIM — with agentic file access, GitHub integration, and Apple's Liquid Glass interface, built entirely in Swift.



What is this?

Nvidia AI Studio is a standalone macOS app that gives you a local AI coding assistant with real access to your filesystem. Unlike browser-based tools, this app runs natively on your Mac — it can read your files, write code, run shell commands, commit to GitHub, and work across your entire project, all from a single interface.

It connects to the NVIDIA NIM API, giving you access to over 16 frontier models including DeepSeek, Kimi, Qwen, Llama, and Mistral — for free during the NVIDIA NIM preview.
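Since NIM exposes an OpenAI-compatible API, you can sanity-check your key from the command line. A minimal sketch, assuming NVIDIA's public integrate.api.nvidia.com endpoint; the model id is illustrative and may differ from what the app uses:

```shell
# Build a chat-completions payload for the NVIDIA NIM endpoint.
# Model id and endpoint URL are assumptions from NVIDIA's public docs.
payload='{"model": "deepseek-ai/deepseek-v3", "messages": [{"role": "user", "content": "Hello"}], "stream": false}'
echo "$payload"

# To actually send it (requires a valid nvapi- key):
# curl https://integrate.api.nvidia.com/v1/chat/completions \
#   -H "Authorization: Bearer $NVIDIA_NIM_API_KEY" \
#   -H "Content-Type: application/json" \
#   -d "$payload"
```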


Screenshots

SPLASH SCREEN

MAIN INTERFACE

THEME SELECTOR AND BACKGROUND TINT

SKILLS SELECTOR

AVAILABLE MODELS — OVER 100 MODELS AVAILABLE THROUGH NVIDIA NIM

YOU CAN ALSO CHOOSE ANTHROPIC, OPENAI, OR BYOK

SSH CONNECTION

BACKGROUND AGENTS

MCP SERVERS AND CONNECTORS

GITHUB INTEGRATION

Features

🤖 Multi-Model & Multi-Provider Support

Switch between 16+ frontier models mid-conversation. Supports multiple AI providers:

| Provider | Models | Highlights |
| --- | --- | --- |
| NVIDIA NIM | DeepSeek V3.2, Kimi K2.5, Qwen3 Coder 480B, Llama 3.3, Mistral, etc. | Free preview, 128K–256K context |
| Anthropic | Claude 4 Sonnet, Claude 4 Opus | Best reasoning |
| OpenAI | GPT-4o, GPT-4o-mini | Vision + function calling |
| Custom | Any OpenAI-compatible endpoint | Self-hosted models |

Each model is labelled with its capabilities:

  • 👁️ Vision models — attach images directly to your messages
  • 🧠 Thinking models — extended reasoning with configurable depth (Low / Medium / High / Off)

🛠️ Agentic Skills (Tool Calling)

The AI can take real actions on your Mac through a set of built-in skills with an autonomous agent loop (up to 10 iterations):

| Skill | What it does |
| --- | --- |
| read_file | Read any file on your system |
| write_file | Create or overwrite files |
| list_directory | Browse directory contents |
| search_files | Grep across your codebase |
| run_command | Execute shell commands |
| git | Run any git operation |
| ssh_command | Execute commands on a remote VPS |
| image_generation | Generate images via NVIDIA NIM |
| fetch_images | Fetch and inject images for vision-capable models |
| web_search / web_fetch | Search the web and fetch page content |

Tool calls are displayed as GlassCode-style inline pills showing the tool name, edited filename, +additions / -deletions diff stats, and expandable results.
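For illustration, a read_file tool call in the OpenAI function-calling format (which NIM's API follows) might look like the JSON below; the id and path values are hypothetical:

```shell
# Hypothetical shape of a read_file tool call emitted by the model
# (OpenAI-style function calling; exact values are made up).
json=$(cat <<'EOF'
{
  "tool_calls": [{
    "id": "call_abc123",
    "type": "function",
    "function": {
      "name": "read_file",
      "arguments": "{\"path\": \"Sources/App/ContentView.swift\"}"
    }
  }]
}
EOF
)
echo "$json"
```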

🔒 Full Access vs Sandboxed Mode

Control how much of your system the AI can see:

  • Full Access — the AI can read, write, and run commands anywhere on your Mac
  • Sandboxed — restricted to the active workspace folder; all file operations outside that folder are blocked at the skill level

Switch modes live at any point in a conversation from the bottom toolbar.
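A sketch of how skill-level sandbox enforcement can work, assuming a path-prefix check against the workspace root (paths are hypothetical; the app's actual logic may differ):

```shell
# Sandboxed-mode sketch: normalize the target path, then allow the
# operation only if it stays inside the workspace folder.
workspace="/Users/me/Projects/MyApp"
target="/Users/me/Projects/MyApp/../secrets.txt"

# Resolve ".." segments without touching the filesystem.
resolved="$(python3 -c 'import os, sys; print(os.path.normpath(sys.argv[1]))' "$target")"

case "$resolved/" in
  "$workspace"/*) verdict="allowed" ;;
  *)              verdict="blocked" ;;
esac
echo "$verdict"
```

Here the `..` escape resolves to a path outside the workspace, so the operation is blocked.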

📁 Workspace Management

  • Open Workspace — pick any project folder as your working context
  • Sessions are grouped by project in the sidebar
  • Right-click context menu on folders: New Thread, Rename, Remove Workspace
  • Show less / Show more to collapse folder listings
  • New threads automatically inherit the active workspace path
  • Export threads as .txt via the share button

πŸ™ GitHub Integration

Connect your GitHub account via OAuth Device Flow or Personal Access Token:

Setup:

  1. Click Settings β†’ GitHub
  2. Click Connect with GitHub (opens Device Flow) or paste a Personal Access Token
  3. Authorize the app on github.com

Once connected:

  • Clone Repository β€” browse all your repos and clone with one click
  • Commit & Push β€” see changed files with diff stats, write a commit message, and push β€” no Terminal needed
  • Token is stored securely in the macOS Keychain

🪟 Apple Liquid Glass UI

Built natively with macOS 26 Tahoe's Liquid Glass design language:

  • .glassEffect() used throughout — sidebar, message bubbles, input area, toolbar, tool call pills, toasts
  • .buttonStyle(.glass) for all toolbar buttons
  • Translucent window with adjustable opacity and blur sliders in Settings
  • 7 built-in color themes: Dark, Midnight, Ocean, Forest, Sunset, Nord, Light
  • Smooth spring animations on panels, messages, and state transitions

💬 Premium Chat Experience

  • Markdown rendering — full GFM with syntax-highlighted code blocks, headings, lists, blockquotes
  • "Worked for Xm Ys" timing badge after each assistant response
  • Inline diff pills — "Edited filename.swift +14 -5 >" for file edit tool calls
  • Streaming dots animation while the model is generating
  • Collapsible reasoning — see the model's chain-of-thought with character count
  • User & assistant avatars with subtle glow circles
  • Context compression — auto-summarizes older messages when context usage exceeds 80%
  • Context usage indicator — circular ring showing how much of the model's context window is used

📊 Usage Tracking

  • Track token usage per model, session, and provider
  • View total prompt/completion tokens over time
  • Available from the sidebar Usage panel or the Tokens toolbar button

⚡ Background Agents

  • Multiple agents can work in the background
  • Live badge counter on the Agents toolbar button
  • Floating panel in the chat view shows running agents

🔔 Native Notifications

  • Desktop notification when a response completes (with model name)
  • Permission requested on first launch

🔌 MCP (Model Context Protocol) Support

  • Configure external MCP servers in Settings → MCP
  • Auto-connect on app launch
  • Extend the app's capabilities with custom tool servers

Requirements

  • macOS 26.0 (Tahoe) or later — required for Liquid Glass APIs
  • Apple Silicon or Intel Mac
  • A free NVIDIA NIM API key

Installation

Option 1 — Download DMG (easiest)

  1. Download Nvidia-AI-Studio.dmg from the Releases page
  2. Open the DMG and drag the app to your Applications folder
  3. Launch the app, go to Settings → API Keys, and add your NVIDIA NIM key

Option 2 — Build from source

```shell
# Clone the repo
git clone https://github.com/brunocurado/NvidiaAIStudio.git
cd NvidiaAIStudio

# Add your API key to a .env file
echo "NVIDIA_NIM_API_KEY=nvapi-your-key-here" > .env

# Build and package
bash build_app.sh release

# The app will be at:
# NvidiaAIStudio/build/Nvidia AI Studio.app
```

Requirements for building:

  • Xcode 26+ with Swift 6.2
  • macOS 26.0 (Tahoe)

Getting Your NVIDIA NIM API Key

  1. Go to build.nvidia.com
  2. Create a free account
  3. Navigate to your profile → API Keys
  4. Generate a key — it starts with nvapi-
  5. Paste it in Settings → API Keys inside the app

The free tier includes generous usage limits across all available models.


Project Structure

```
NvidiaAIStudio/
├── App/
│   ├── NvidiaAIStudioApp.swift    # Entry point, window config, Liquid Glass
│   └── AppState.swift             # Global observable state
├── Models/
│   ├── AIModel.swift              # Model definitions + multi-provider defaults
│   ├── AppTheme.swift             # 7 color themes + glass effect config
│   ├── Message.swift              # Chat message + tool call types
│   ├── Session.swift              # Conversation session with project grouping
│   └── SystemPrompt.swift         # Dynamic system prompt with workspace context
├── Services/
│   ├── NVIDIAAPIService.swift     # Streaming API client (OpenAI-compatible)
│   ├── OpenAIAPIService.swift     # OpenAI/Anthropic provider
│   ├── ModelFetcher.swift         # Live model list from NVIDIA NIM
│   ├── GitHubService.swift        # OAuth Device Flow + REST API
│   └── MCPManager.swift           # Model Context Protocol server manager
├── Skills/
│   ├── Skill.swift                # Protocol + SkillRegistry + sandbox enforcement
│   ├── FileSkills.swift           # File operations + shell commands
│   ├── GitSkill.swift             # Git operations
│   ├── SSHSkill.swift             # Remote SSH execution
│   ├── WebSkills.swift            # Web search + fetch
│   └── ImageGenerationSkill.swift # NVIDIA image generation
├── ViewModels/
│   └── ChatViewModel.swift        # Agent loop, streaming, tool execution, context compression
├── Utilities/
│   └── KeychainHelper.swift       # Secure storage for API keys + GitHub tokens
└── Views/
    ├── ContentView.swift           # Main 3-column layout + toolbar
    ├── OnboardingView.swift        # First-launch setup
    ├── Chat/
    │   ├── ChatView.swift          # Message list + background agents panel
    │   ├── InputAreaView.swift     # Rich input with attachments + model picker
    │   └── MessageBubbleView.swift # Markdown bubbles, tool pills, timing badges
    ├── Sidebar/
    │   └── SidebarView.swift       # Threads, workspaces, skills, usage, settings
    ├── Settings/
    │   └── SettingsView.swift      # API Keys, GitHub, Theme, Models, MCP tabs
    ├── RightPanel/
    │   └── RightPanelView.swift    # Git diff viewer + terminal
    ├── Components/
    │   └── ToastView.swift         # Notification toasts
    ├── GitPanelView.swift          # Commit & Push panel
    ├── CloneRepoView.swift         # Repository browser + clone
    └── SkillsPanelView.swift       # Skills toggle panel
```

Keyboard Shortcuts

| Action | Shortcut |
| --- | --- |
| Send message | Enter |
| New line in message | Shift + Enter |
| Commit & Push (in Git panel) | Cmd + Enter |
| Open settings | Cmd + , |

Configuration via .env

You can pre-configure the app by placing a .env file in the same directory as the app binary:

NVIDIA_NIM_API_KEY=nvapi-your-key-here

The app auto-loads this key on first launch if no key is configured in Settings.
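A sketch of the lookup this implies, assuming a simple KEY=value parse of the .env file (the app's actual loader may differ):

```shell
# .env loader sketch: read NVIDIA_NIM_API_KEY from a .env file
# when no key is configured in Settings yet.
cat > .env <<'EOF'
NVIDIA_NIM_API_KEY=nvapi-your-key-here
EOF

# Take the first matching line and everything after the first "=".
key="$(grep '^NVIDIA_NIM_API_KEY=' .env | head -n1 | cut -d= -f2-)"
echo "$key"
```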


Dependencies

All functionality is built on native Apple frameworks: SwiftUI, AppKit, Foundation, and Security.


Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you'd like to change.

  1. Fork the repo
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

MIT License — see LICENSE for details.


Acknowledgements

  • Built with NVIDIA NIM β€” access to world-class AI models through a single OpenAI-compatible API.
  • Designed with Apple's Liquid Glass design language on macOS 26 Tahoe.
