# MATS Autumn 2026 Fellowship Applications Now Open—Apply by June 7
**TL;DR:** MATS Autumn 2026 applications are now open. It’s a 10-week, fully funded research fellowship (Sep 28 – Dec 4, 2026) in AI alignment, security, and governance, with mentorship from researchers at Anthropic, Google DeepMind, OpenAI, Redwood, AI Futures Project, and more. This cohort also launches two new tracks: a Founding & Field-Building track and a Biosecurity track. Fellows receive a $5,000/month stipend and an $8,000/month compute budget, plus housing, meals, and travel. Apply by June 7, 2026 AoE at matsprogram.org/apply.

This is also MATS’s first Autumn cohort – part of our shift to running three fellowships per year to expand capacity for AI safety research and talent development.

## About MATS

MATS Research is an educational research nonprofit dedicated to solving the talent pipeline bottleneck in AI alignment and security research. We believe reducing risks from powerful AI is one of the world’s most urgent and talent-constrained challenges, and that ambitious people from a wide range of backgrounds and career stages can meaningfully contribute to this work. That’s why we’re training the next generation of AI safety researchers and founders.

## Program details

The Autumn 2026 cohort runs from September 28 to December 4, 2026, based primarily in Berkeley, California and London, UK. Fellows receive:

- $5,000/month stipend + $8,000/month compute budget
- Office space in Berkeley or London (depending on mentor preference)
- Housing, meals, and travel covered
- J-1 visa support if needed
- Mentorship from world-class researchers and a dedicated research manager
- A close-knit cohort, regular seminars and workshops with industry experts, and an active global alumni network
- Over 80% of fellows receive an extension to continue their fellowship for 6 to 12 months with ongoing mentorship, support, and funding ($7,680/month stipend + $8,000/month compute)

## Research tracks

Applicants can apply to one or more of the following tracks. Each track page describes the research agenda, the mentors involved, and what we’re looking for in applicants; we encourage prospective fellows to read the relevant track pages before applying:

- Empirical
- Theory
- Strategy and Forecasting
- Policy and Governance
- Systems Security
- Biosecurity
- Founding and Field-Building

### New this cohort

Two of these tracks are new for Autumn 2026:

- The **Founding and Field-Building** track is for founders, field-builders, and high-agency generalists looking to launch new AI safety initiatives.
- The **Biosecurity** track focuses on preventing catastrophic biological risk from AI.

## Our record

- 527 alumni, 100+ mentors
- 200+ publications with 12,300+ citations
- 80% of alumni who graduated before 2025 are working in AI safety/security
- 10% have co-founded active AI safety startups
- 30+ initiatives founded by alumni

## Who we’re looking for

MATS explicitly looks for talent that traditional pipelines might overlook. We welcome technical researchers without prior ML experience who can demonstrate strong reasoning and research potential. We also encourage applications from policy professionals with strong writing skills, familiarity with governmental processes, and the technical literacy to engage with AI systems – particularly those with backgrounds in national security, cybersecurity, US–China relations, biosecurity, and/or nuclear policy.

## Remote participation

While we prefer fellows to participate in person from Berkeley (with some streams based out of London), we understand this may not always be feasible, and we are open to remote participation for exceptional candidates on a case-by-case basis.

**Apply by June 7, 2026 AoE → matsprogram.org/apply**

It only takes 1–2 hours to apply. Please share this post with people you know who’d be a strong fit.
