Communicating with parents about AI

Details about Flint's student data privacy, educator oversight of AI use, and control over AI response accuracy.

Written by Sohan Choudhury
Updated over 11 months ago

As a school administrator, you might be fielding questions from parents about your school's stance on AI.

Below is an overview of common areas of parent concern, and how Flint addresses each.

Student data privacy

Tools like ChatGPT (and the thousands of consumer-facing AI tools out there) do not protect student data. The hidden cost of these tools is that any data that students put in (e.g. personal experiences, questions about sensitive topics) is then used by AI companies to train future versions of AI models. Similar to social media use, the "product" for these AI companies is your data.

Flint doesn't use any student data to train AI models. Data is stored only so that students, parents, teachers, and school administrators can access it. That means that students can use AI with appropriate oversight, without their data being the hidden product.

Equitable access

Tools like ChatGPT restrict access to users over the age of 13, denying younger students the ability to develop AI literacy early on. Additionally, the vast difference in quality between free tiers (e.g. regular ChatGPT) and paid tiers (e.g. ChatGPT Plus) means that not all students are on a level playing field when using AI.

Flint allows students under the age of 13 to access AI, while working directly with schools to ensure COPPA compliance. Additionally, students are never the end customers of Flint. That means that when a school rolls out Flint, every student gets access to the most robust and accurate AI models (e.g. GPT-4) without paying anything themselves.

Educator oversight of AI use

Many of the concerns with irresponsible AI use by students today (think students using ChatGPT to write their essays) stem from the fact that there's no oversight of students using AI. Any child with an email address can sign up for an AI tool and easily lie about their age.

Flint allows educators to view exactly how students are using AI. Every student interaction with AI is tracked and can be viewed by teachers as well as school administrators. In special cases (e.g. based on teacher concern or a parent request), administrators have the ability to request an export of student data from Flint.

Controlling the behavior of AI in an educational context

AI tools like ChatGPT can answer any question about anything — or at least they attempt to. This may be great for personal productivity, but makes these tools difficult to use educationally. Because tools like ChatGPT are designed to satisfy the end user, they'll give students the immediate answer to any question instead of challenging students to think through problems themselves.

Flint allows teachers to control exactly how AI interacts with students. This means that the AI won't give away the answer to a student. Instead, it will challenge students to think critically and adjust the difficulty of assessment or review on the fly. If a student strays off topic, the AI will gently nudge the student back to the learning objective that the teacher has provided, and all chat data can be viewed by the teacher.

Ensuring accuracy in an educational context

AI models are trained on millions of documents, ranging from textbooks to news articles to online forums. This makes these AI models seemingly knowledgeable on any topic, but dive deeper and you'll find that AI might make content up when it doesn't know the answer, or fail to provide specific sources for its knowledge.

On Flint, teachers can provide the AI with content from their class — whether that's a worksheet, textbook chapter, lecture slides, lesson plan, or link to a recent news article. The AI will then pull from the materials provided by the teacher when interacting with students, ensuring accuracy and relevant context.

AI for teacher augmentation, not replacement

Students are already using AI for homework help, through tools like ChatGPT and many other apps (just search "AI homework" on the App Store). In the long run, if AI use continues to be driven primarily by students, there's a risk that the importance of teachers as the source of learning is undermined in the minds of students.

Flint changes the equation by letting teachers augment themselves with AI instead of feeling as if they are competing against it. Teachers have incredible context on the needs of their students, and can use Flint to multiply their own effectiveness, such as by providing students with more personalized help in the classroom.
