
Granola Privacy Alert: AI Notes Viewable via Link and Used for Training by Default
Users of the AI-powered note-taking application Granola are being advised to review their privacy settings following revelations about how their data is accessed and used. Although the company markets its service as "private by default," the platform currently allows anyone with a note's link to view it. Furthermore, Granola uses customer notes for internal AI training unless individuals manually opt out. Because Granola is positioned as an AI notepad for professionals, these default configurations have raised concerns about the actual level of privacy afforded to its user base. This report explores the discrepancy between the marketing claims and the functional reality of Granola's data handling policies, as reported by The Verge.
Key Takeaways
- Link Accessibility: Despite claims of being private, any individual possessing a specific link can view a user's Granola notes.
- AI Training Defaults: Granola utilizes user-generated notes for internal AI training by default.
- Opt-Out Requirement: Users must manually change their settings to prevent their data from being used for AI model development.
- Privacy Discrepancy: There is a notable gap between Granola's "private by default" marketing and its actual data sharing and training configurations.
In-Depth Analysis
The Reality of 'Private by Default' Claims
Granola markets itself as an AI notepad designed for professional use, emphasizing a commitment to privacy. However, the current implementation means that notes are accessible to anyone who has the corresponding link. This challenges the conventional understanding of "private," since it relies on the secrecy of a URL rather than on restricted access controls or authentication, an approach often described as security through obscurity. For users handling sensitive professional information, this default state poses a real risk if links are shared inadvertently or discovered by unauthorized parties.
Data Utilization for AI Development
Beyond the visibility of notes, Granola's policy regarding internal AI training has come under scrutiny. The platform automatically opts users into a program in which their notes are used to train the company's internal AI models. While many AI companies seek user data to improve their algorithms, making this the default setting, combined with the link-based accessibility, reflects an industry trend in which user data serves as a primary resource for product iteration. Users who wish to keep their notes fully confidential must navigate the application's settings to explicitly opt out of AI training.
Industry Impact
The situation with Granola underscores a growing tension in the AI software industry between user privacy and the data requirements of machine learning. As more "AI-first" productivity tools enter the market, the definition of "private by default" is becoming increasingly fluid. This case serves as a cautionary example, suggesting that transparency about link-based sharing and AI training opt-outs is critical for maintaining user trust. It also highlights the responsibility of users to audit the privacy settings of AI tools, even when those tools are marketed as secure professional solutions.
Frequently Asked Questions
Question: Can anyone see my Granola notes without my permission?
Based on the report, anyone who obtains the specific link to your note can view its content, as this is the default setting for the application.
Question: Does Granola use my personal notes to train their AI?
Yes, Granola uses notes for internal AI training by default. Users must manually opt out if they do not want their data used for this purpose.
Question: How does Granola describe its own privacy policy?
Granola describes its notes as being "private by default," despite the link-sharing and AI training configurations currently in place.


