Pooyan Fazli is an assistant professor and the director of the People and Robots Laboratory (PeRL) in The GAME School and the Media and Immersive eXperience (MIX) Center at Arizona State University. He received his Ph.D. in computer science from the University of British Columbia. Previously, he was a postdoctoral fellow in the CORAL Research Group at Carnegie Mellon University and in the Laboratory for Computational Intelligence at the University of British Columbia.
Center Affiliations
I am affiliated with the Center for Human, Artificial Intelligence, and Robot Teaming (CHART).
Research Interests
Robot Learning, Human-Robot Interaction, Vision and Language, Multimodal Learning, Video Understanding
Prospective Postdocs, Students, Visitors, and Interns
If you are interested in a position in our lab, please fill out this form.
Sponsors
Our work is generously supported by the National Science Foundation (NSF), National Institutes of Health (NIH), National Aeronautics and Space Administration (NASA), PIT-UN, Google, Amazon, Ability Central Foundation, and ASU.
News
| 02/2026           | My interview with KJZZ's The Show on space medicine and rural healthcare: Listen here |
| 11/2025           | Our NASA Prize was featured on ASU News and Newswise. |
| 07/2025           | NASA MPLAN Prize on audio-visual question answering in space medicine |
| 12/2024           | Our Google grant on AI+X research pathways was featured on ASU News. |
| 11/2024           | Google exploreCSR award ($125,000) |
| 10/2024           | HIRBI Grant ($10,000) |
| 10/2024           | PIT-UN Grant ($25,000) |
| 04/2024           | We launched ViDScribe, an AI platform that empowers blind and low vision users by providing automated audio descriptions for online videos. |
| 11/2023           | Featured on ASU News |
| 09/2023           | Our work on video accessibility was featured on ABC 15. |
| 07/2023           | NIH R01 grant (~$3.2M) on automated video description for blind and low vision users |
| 06/2023           | ASU CHART grant ($10,012) on superhuman performance in autonomous robot teaming applications (SPARTA) |
| 08/2022           | NSF grant ($94,997, Grand Total: $599,974) to develop an edge-based approach to robust multi-robot systems in dynamic environments |
| 04/2022           | ASEE CyBR-MSI grant ($10,000) |
| 02/2022           | Our work on democratizing AI was featured on Google TensorFlow's blog. |
| 11/2021           | Grant award ($6,500) from Google TensorFlow on learning responsible AI for social impact |
| 10/2021           | RSCA grant ($17,885) on safe and resilient autonomous navigation for service robots |
| 08/2021           | Google exploreCSR award ($32,000) on democratizing AI and promoting AI fairness, accountability, transparency, and ethics |
| 05/2021           | I am a Faculty in Residence at Google in Mountain View, CA. |
| 01/2021           | Grant award ($20,000) from Amazon on human-aware robot navigation in indoor environments |
| 11/2020           | Grant award ($99,948) from the Ability Central Foundation on video accessibility for blind and low vision individuals |
| 10/2020           | I am a visiting faculty in the Department of Computer Science at the University of Copenhagen. |
| 08/2020           | NSF grant ($999,987) to promote diversity in the AI workforce and encourage students from underrepresented groups to pursue research and careers in AI |
| 08/2020           | NSF grant ($749,304) to develop safe and secure autonomous robots |
| 12/2019           | Grant award ($102,500) from the Ability Central Foundation on video accessibility for blind and low vision individuals |
| 04/2019           | Speaker and panelist in the Deep Humanities and Arts Symposium in San Jose, CA |
| 02/2019           | Grant award ($6,000) from the Center for Computing in Life Sciences on learning to navigate like humans |
| 12/2018           | Our paper titled "Online Learning of Human Navigational Intentions" was a finalist for the Best Paper Award at ICSR 2018. |
| 12/2018           | Our paper titled "Predicting the Target in Human-Robot Manipulation Tasks" was a finalist for the Best Interactive Paper Award at ICSR 2018. |