GitHub’s research lab. The team that built Copilot before it was a product. Current work includes agentic workflows, LLM-powered code navigation, and GitHub Spark. Past projects that shipped: Copilot, Monaspace (code font family), Copilot for Pull Requests. About 13 people: principal researchers and research engineers. Worth following to see what GitHub is thinking about before it shows up in the product.
pwarnock-cc-plugins is a curated marketplace of Claude Code plugins and
reusable skills, organized into functional categories with quality gates that
prioritize depth, security review, and real-world testing over volume. It
provides 11 plugins across 5 categories and 7 marketplace skills for extending
Claude Code’s capabilities.
Core Features
Plugin Categories
Integrations: Connect Claude Code to external services (Notion CRM,
Strong’s Concordance for biblical study)
Developer Tools: Curated best practices for Vercel and Prisma that go
beyond generic documentation
Workflow: Cross-session orchestration via gastown-parallel-workflow and
session retrospectives
Knowledge & Context: Three-tier progressive disclosure system for managing
large codebase context (constitution, trigger tables, subsystem maps)
Search & Discovery: Semantic search across millions of open source
examples via githits-mcp
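As a rough illustration of how a marketplace like this is wired into Claude Code, the plugin slash commands follow the documented add/install pattern; the repository path and plugin name below are guesses inferred from the marketplace name, so verify them against the project's README:

```
/plugin marketplace add pwarnock/cc-plugins
/plugin install githits-mcp@pwarnock-cc-plugins
```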
Crush represents Charmbracelet’s entry into the AI coding assistant space,
bringing their signature terminal-first philosophy and “glamourous” design
aesthetic to AI-powered development. Built as a native terminal application,
Crush distinguishes itself through its broad multi-model support,
flexible configuration options, and deep integration with developer workflows
through LSP and MCP protocols.
Core Features
Terminal-First Design
Native CLI Experience: Fully optimized for terminal workflows with
keyboard-driven interaction
Cross-Platform Support: First-class support on macOS, Linux, Windows
(PowerShell and WSL), FreeBSD, OpenBSD, and NetBSD
Minimal Resource Footprint: Lightweight Go-based implementation with
fast startup and response times
Session Management: Multiple work sessions with context preservation per
project
Multi-Model Architecture
Provider Flexibility: Choose from Anthropic, OpenAI, Groq, OpenRouter,
Google Gemini, Cerebras, HuggingFace, VertexAI, Amazon Bedrock, and more
Custom Provider Support: Add your own OpenAI-compatible or
Anthropic-compatible APIs
Model Switching: Switch LLMs mid-session while preserving context and
conversation history
Cost Tracking: Built-in cost tracking and token usage monitoring
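To illustrate the custom-provider mechanism, here is a sketch of a crush.json fragment registering a hypothetical local OpenAI-compatible endpoint; the provider name, URL, and model entries are placeholders, and the exact key names should be checked against Crush's configuration docs:

```json
{
  "providers": {
    "local-llm": {
      "type": "openai",
      "base_url": "http://localhost:11434/v1",
      "api_key": "$LOCAL_LLM_API_KEY",
      "models": [
        { "id": "llama3.3", "name": "Llama 3.3 70B" }
      ]
    }
  }
}
```

Referencing the API key as an environment variable keeps credentials out of the project-local config file.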
Enhanced Context Awareness
LSP Integration: Uses Language Server Protocol for additional context,
just like modern IDEs
MCP Support: Extensible via Model Context Protocol (stdio, http, and sse
transport types)
Project Initialization: Analyzes codebase and creates context files for
future sessions
Git Integration: Native support for version control workflows with
attribution options
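A minimal sketch of how MCP servers plug into the JSON configuration, showing one stdio and one http transport; the server names and commands are examples, and exact field names should be confirmed against the Crush documentation:

```json
{
  "mcp": {
    "filesystem": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    },
    "docs": {
      "type": "http",
      "url": "https://example.com/mcp"
    }
  }
}
```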
Technical Specifications
Language: Go (98%) with minimal Smarty templating
Installation: Homebrew, NPM, Winget, Scoop, Arch Linux, Nix, Debian/Ubuntu,
Fedora/RHEL, or direct binary download
Configuration: JSON-based with project-local and global options
License: FSL-1.1-MIT (Functional Source License with a future MIT grant)
Repository: 15.4k+ stars, 883+ forks on GitHub
Release Cadence: Regular updates with 75+ releases