A11Y at CSUN


Our research seeks to improve the accessibility of interactive technologies. Accessibility not only benefits people with disabilities but also empowers everyone. Recent advances in AI have enabled new forms of human-computer interaction characterized by greater adaptability and better human-machine symbiosis. Our research uses AI to accelerate the development of assistive technology and to enhance the accessibility of user interfaces. We are looking for graduate and undergraduate students to participate in the following areas:

  • Adaptive Human-Technology Interaction (including mobile)
  • Automated Web Accessibility Inspection
  • Explainable AI and Visualization
  • AI (Computer) Education

You are welcome to visit the CSUN Human-computing Lab @ JD2222.

Recent Graduate Research Projects:

  • Investigating How People with Disabilities Disclose Difficulties on YouTube by Jaime Garcia and Summayah Waseem
  • AI Visualization for Multi-Agent Robotics by Gage Aschenbrenner and Ramin Roufeh
  • Automated Speech Recognition for Instructional Content by Timothy Spengler
  • Visual Biofeedback in Speech Rehabilitation by Luan Ta
  • Behavioral Biometrics Classification by Shen Huang
  • Web Accessibility Suggestions through Short-text Classification by Gerardo Rodriguez
  • Deep Convolutional GANs in Procedural Terrain Generation Systems by Edgar Lopez-Garcia
  • Procedural Terrain Generation in Virtual Reality by Ryan Vitacion