This is the ERIC Lab at UCSC.

About Us

Welcome to the UCSC Embodied and Responsible Interaction and Communication (ERIC) Lab! ERIC also stands for Epic Research In Computing. Our lab’s research interests include Natural Language Processing, Computer Vision, and Machine Learning, with an emphasis on building Embodied AI agents that can communicate with humans using natural language to perform real-world multimodal tasks.

Recent News

  • Our lab received a research grant from Microsoft. Thanks Microsoft!
  • Our lab received a gift award from Adobe. Thanks Adobe!
  • Our lab received multiple gift awards from eBay and Snap. Thanks!
  • Two workshops are accepted to ACL 2024! We will be co-organizing the 3rd Workshop on Advances in Language and Vision Research (ALVR 2024) and the Fourth International Combined Workshop on Spatial Language Understanding and Grounded Communication for Robotics (SpLU-RoboNLP 2024) in Bangkok, Thailand.
  • Prof. Xin (Eric) Wang gave an invited talk at Yale University (10/2023).
  • Three papers accepted to NeurIPS 2023! Congratulations to all authors!
  • Three papers accepted to EMNLP 2023! Congratulations to all authors!
  • Our Athena Team won Second Place ($50K) in Alexa Prize SocialBot Grand Challenge 5 (Scientific Innovation Track)! Check out Amazon News for more information!
  • Our SlugJARVIS Team won Third Place ($50K) in the Alexa Prize SimBot Challenge! Check out UCSC News for more information!
  • Our ESC paper is accepted to ICML 2023!
  • Our SlugJARVIS team advances to the finals of the inaugural Alexa Prize SimBot Challenge! Check out Amazon News for more information, and UCSC News for coverage of our three teams across all three Alexa Prize Challenges!
  • Prof. Xin (Eric) Wang is co-organizing the 5th Workshop on Closing the Loop Between Vision and Language (CLVL) at ICCV 2023!
  • Prof. Xin (Eric) Wang is serving as Area Chair for NeurIPS 2023!
  • Two papers on (1) Training-Free Structured Diffusion Guidance and (2) Neuro-Symbolic Procedural Planning with Commonsense Prompting (Spotlight) are accepted to ICLR 2023!
  • Three papers on (1) Multimodal Graph Transformer, (2) Imagination-Based Automatic Evaluation, and (3) Imagination-Guided Open-Ended Text Generation are accepted to EACL 2023!
  • Our Sage team received an Amazon Alexa Prize Award to work on Alexa Prize TaskBot Challenge 2. Thanks Amazon!
  • Our paper "Parameter-Efficient Model Adaptation for Vision Transformers" is accepted to AAAI 2023!
  • Our Athena team received an Amazon Alexa Prize Award to work on Alexa Prize SocialBot Grand Challenge 5. Thanks Amazon!
  • Our paper "CPL: Counterfactual Prompt Learning for Vision and Language Models" is accepted to EMNLP 2022!
  • Our paper "VLMbench: A Compositional Benchmark for Vision-and-Language Manipulation" is accepted to NeurIPS 2022 (Datasets and Benchmarks)!
  • Our papers on (1) Privacy-preserving Federated Vision-and-Language Navigation and (2) Language-guided Artistic Style Transfer are accepted to ECCV 2022!
  • Our SlugJARVIS team won the Alexa Prize SimBot Public Benchmark Challenge!
  • One paper on Understanding Instance-Level Impact of Fairness Constraints accepted to ICML 2022!
  • Congrats to the Ph.D. and undergrad students in our lab for securing Summer 2022 research internships at Google Research, Adobe Research, Samsung Research, the Department of Energy (Science Undergraduate Laboratory), and elsewhere!
  • Two papers accepted to NAACL 2022! Topics include (1) Imagination-Augmented Natural Language Understanding and (2) Diagnosing Vision-and-Language Navigation.
  • Two papers accepted to CVPR 2022! Topics include (1) Compositional Temporal Grounding and (2) Language-based Video Editing.
  • Three papers accepted to ACL 2022! Topics include (1) Vision-and-Language Navigation Survey, (2) Multilingual Fairness, and (3) Interpretable Research Replication Prediction.
  • We have received a Google Faculty Research Award. Thanks Google!
  • Our SlugJARVIS team received an Amazon Alexa Prize Award to work on Alexa Prize SimBot Challenge. Thanks Amazon!
  • We have received the AAII Interdisciplinary Research Award.
  • Our paper on Mitigating Gender Bias in Image Search is accepted to EMNLP 2021 as an Oral paper.
  • We have received Google Cloud Research Credits.
  • Our VALUE paper is accepted to NeurIPS 2021 (Datasets and Benchmarks).