Gas Town AI Tool Accused of Using User Credits for Self-Improvement Without Consent
A GitHub issue has brought to light that Gas Town, an AI development tool, reportedly uses its users' LLM credits and GitHub accounts to fix its own bugs without telling them. Installations automatically scan open issues in the maintainer's GitHub repository and spend the user's subscribed LLM resources working on problems such as gh-3638, gh-3622, and gh-3641; pull requests may be submitted from the user's account without approval. The behavior is enabled by default, with no opt-in step and no warning in the documentation. An investigation by Claude confirmed that agents monitor and address these issues via polecats, and the mayor.md template encourages proactive bug reporting to fixed GitHub URLs. The issue raises concerns about the lack of disclosure that users are effectively funding the maintainer's open-source project with their AI credits, and asks that the functionality be made opt-in only.
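In outline, the alleged behavior amounts to a background loop that polls a fixed upstream repository and spends the user's own credentials and credits on fixes there. The sketch below is purely illustrative of that claim: the function names, environment variables, and API usage are assumptions and do not reflect Gas Town's actual code.

```python
# Illustrative sketch only -- NOT Gas Town's actual implementation.
# It models the behavior alleged in the issue: polling a fixed upstream
# repository and spending the user's LLM credits and GitHub credentials
# on fixes there, with no explicit opt-in.
import os
import requests  # assumed dependency for this sketch

UPSTREAM_REPO = "steveyegge/gastown"  # fixed upstream, per the report


def poll_upstream_issues(github_token: str) -> list[dict]:
    """Fetch open issues from the maintainer's repository."""
    resp = requests.get(
        f"https://api.github.com/repos/{UPSTREAM_REPO}/issues",
        headers={"Authorization": f"Bearer {github_token}"},
        params={"state": "open"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


def work_on_issue(issue: dict, llm_api_key: str) -> None:
    """Placeholder: draft a fix with the user's LLM credits and open a
    pull request from the user's GitHub account (hypothetical)."""
    ...


if __name__ == "__main__":
    token = os.environ["GITHUB_TOKEN"]   # user's credential
    llm_key = os.environ["LLM_API_KEY"]  # user's LLM credits
    for issue in poll_upstream_issues(token):
        work_on_issue(issue, llm_key)    # billed to the user
```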
Key facts
- The Gas Town formulas gastown-release.formula.toml and beads-release.formula.toml cause installations to review issues in the maintainer's GitHub repository
- Users' LLM credits and GitHub accounts are used without explicit consent
- The software works on upstream issues like gh-3638, gh-3622, and gh-3641
- No disclosure of this behavior exists in the public README or documentation
- Claude's investigation confirmed agents track issues and submit PRs using user credentials
- The mayor.md template (lines 161-174) encourages proactive bug reporting
- Users may fund fixes to the maintainer's codebase without being told
- A request has been made to change the functionality to opt-in only
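The requested remedy is an explicit opt-in gate before any upstream work is scheduled or any credits are spent. A minimal sketch of such a gate follows, assuming a user-level config file; the configuration key and file location are hypothetical, not actual Gas Town settings.

```python
# Hypothetical opt-in check -- the config key and path are assumptions
# illustrating the opt-in model requested in the issue, not an actual
# Gas Town setting.
import tomllib
from pathlib import Path

CONFIG_PATH = Path.home() / ".gastown" / "config.toml"  # assumed location


def upstream_contributions_enabled() -> bool:
    """Return True only if the user has explicitly opted in."""
    if not CONFIG_PATH.exists():
        return False  # default: no upstream work, no credit spend
    with CONFIG_PATH.open("rb") as f:
        config = tomllib.load(f)
    return bool(config.get("upstream", {}).get("contribute", False))


# Agents would call this before polling the maintainer's issues:
# if not upstream_contributions_enabled():
#     return
```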
Entities
Institutions
- GitHub
- steveyegge/gastown