Sitr
5.0 (5 ratings) · Extension · Accessibility · 14 users
Overview
Automatically sanitize sensitive information in prompts before submitting to LLM interfaces
Sitr automatically sanitizes sensitive, censored, or custom-defined keywords in AI prompts before they are submitted to large language model (LLM) interfaces. This ensures safer interactions by filtering out undesired or inappropriate terms while preserving the core intent of your input. Whether you're working in a professional environment, need to comply with content policies, or simply want to protect personal or confidential data, Sitr provides seamless protection without disrupting your workflow. Lightweight, efficient, and privacy-conscious, it runs silently in the background to enhance prompt hygiene and support responsible AI usage.
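Sitr's implementation is not published on this page, but the behavior it describes — replacing sensitive or custom-defined keywords in a prompt before it reaches an LLM interface — can be sketched roughly as follows. The keyword list, replacement token, and function name here are illustrative assumptions, not Sitr's actual configuration or API.

```javascript
// Minimal sketch of keyword-based prompt sanitization as described above.
// Keywords and the "[REDACTED]" token are assumptions for illustration.
function sanitizePrompt(prompt, keywords, replacement = "[REDACTED]") {
  let sanitized = prompt;
  for (const keyword of keywords) {
    // Escape regex metacharacters in the keyword, then match case-insensitively.
    const escaped = keyword.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
    sanitized = sanitized.replace(new RegExp(escaped, "gi"), replacement);
  }
  return sanitized;
}

// Example: strip a hypothetical internal project name before submission.
const clean = sanitizePrompt(
  "Summarize the Project Falcon roadmap",
  ["Project Falcon"]
);
// clean === "Summarize the [REDACTED] roadmap"
```

A real extension would run a function like this in a content script, intercepting the prompt text field before the form submits, so the original keywords never leave the browser.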
Details
- Version: 1.1.4
- Updated: September 8, 2025
- Offered by: LAVA EFF
- Size: 30.13 KiB
- Languages: English
- Developer email: kurtkey.business@gmail.com
- Non-trader: This developer has not identified itself as a trader. For consumers in the European Union, please note that consumer rights do not apply to contracts between you and this developer.
Privacy
The developer has disclosed that it will not collect or use your data. To learn more, see the developer’s privacy policy.
This developer declares that your data is
- Not being sold to third parties, outside of the approved use cases
- Not being used or transferred for purposes that are unrelated to the item's core functionality
- Not being used or transferred to determine creditworthiness or for lending purposes