We Started Lens Academy: Scalable Education on Superintelligence Risk

The number of people who deeply understand superintelligence risk is far too small. There's a growing pipeline of people entering AI Safety, but most of the available onboarding covers the field broadly, touching on many topics without going deep on the parts we think matter most. People come out having been exposed to AI Safety ideas, but often can't explain why alignment is genuinely hard, or think strategically about what to work on. We think the gap between "I've heard of AI Safety" and "I understand why this might end everything, and can articulate it" is one of the most important gaps to close.

We started Lens Academy to close that gap. Lens Academy is a free, nonprofit AI Safety education platform focused specifically on misaligned superintelligence: why it's the central risk, why alignment is hard, and how to think about what to work on. The teaching combines:

- Reading articles and watching videos
- Exercises and tests (e.g. you get a question and a free text box to answer)
- 1-on-1 AI tutoring that helps you work through concepts and arguments throughout the week
- Weekly group discussions where ideas land, get challenged, and become real

Help Lens help the AI Safety community by becoming a navigator

One of the best ways you and the LW community can help is by signing up as a navigator[1]. We'll be running a beta in April, for which we'll need navigators, and many more in the future. Register your interest by leaving your email here: https://lensacademy.org/enroll [2]

Designed to reach millions

Lens has no application process and no waitlists. Once we have enough scale, people can sign up and start within days. The platform is designed to scale by eliminating several factors that tend to add cost and effort to platforms like this, while maintaining (or improving on) quality. We're aiming for a marginal cost per student of under $10, and we accept anyone without gatekeeping.

How we teach

Most AI Safety education measures exposure: did you read this article? Did you watch this video? We instead try to measure understanding: can you explain this concept? Can you apply it in a new context? There's a well-documented gap between feeling like you've learned something and actually having learned it. We're building for the latter, using active learning, measured learning outcomes, and (planned) spaced repetition.

In practice, the learning experience interleaves content with active engagement. For example: you read a section, the tutor asks you to explain something in your own words, you answer, and the tutor helps you refine it. We can show (parts of) articles with our own annotations, embed video clips, select sections from podcasts, and run AI voice roleplay scenarios. If a topic hooks you, optional material lets you go deeper, on-platform. This is still a work in progress: we already have optional articles and videos (and segments of them), and we're working toward more personalized, interest-based curricula.

Our learning outcomes are defined separately from course material, which makes it straightforward for experts to review whether our course actually teaches the things that matter most. (We're still working on a better interface for such reviews.)

What we've built

Besides the course platform itself and the first version of our Intro course, we've shipped:

- AI tutoring with sidebar chat, in-line chat, and voice roleplay mode
- AI-assessed tests: questions and (voice) roleplay conversations that are evaluated by AI
- Fully automated operations: signup, group scheduling, Discord channel creation, Google Calendar invites, progress tracking, joining another group for one meeting or permanently, postponing group meetings by a week, and attendance tracking. We'll keep expanding our set of automated operations as we grow, so that we can continue to scale our userbase indefinitely.
- Discord breakout rooms, custom-built because Discord doesn't offer them natively
- Facilitator panel for monitoring student progress across groups
- PromptLab for testing and iterating on AI tutor system prompts
- Content validation tools for checking course material formatting and structure
- Lens Editor: a collaborative online markdown editor compatible with Obsidian (basically a custom Notion replacement; more on this in a future post)

Where we are

We're early. Our alpha cohort (20 students, 7 weeks) is wrapping up soon. We're preparing for a beta run of our course in early April. The team is small: Luc as a full-time founder, with Chris Lons, Slava, Al, and Ouro contributing part-time on course content and strategy. A lot has been built, and a lot more remains.

How you can help

We'd love feedback. If you're experienced in AI Safety, skim the curriculum and tell us what's missing or wrong.[3] If you know people entering the field, point them our way. If you're interested in facilitating (or participating as a student), leave your email at https://lensacademy.org/enroll.

We're also looking for a co-founder: most likely a full-stack product builder who can do code, design, and product thinking. More on that in an upcoming post.

More updates to come. Want to get notified? Leave your email at https://lensacademy.substack.com/subscribe

Links

- Website: lensacademy.org
- Course overview: https://lensacademy.org/course (review our curriculum here)
- Talk to us on Discord: https://discord.gg/nn7HrjFZ8E
- Register interest: https://lensacademy.org/enroll
- Get notified for future posts: https://lensacademy.substack.com/subscribe

[1] Navigators are the people who facilitate our course's group meetings.

[2] The "enroll" page is for both students and navigators. We'll send you an email as soon as you can actually sign up, along with more information that lets you decide whether you want to.

[3] Note that the course is under very active development and still pretty scrappy at the moment. We currently plan to keep focusing on helping people understand why ASI alignment is hard, and on helping them navigate the pre-paradigmatic landscape of AI Safety. The structure and material of the course will probably change a lot, though. We're happy to receive suggestions on our focus, our structure, and our course content.

