Sumon Biswas

Assistant Professor
Department of Computer and Data Sciences
Case School of Engineering

Teaching Information

Teaching Schedule

  • CSDS 447: Responsible AI Engineering
  • CSDS 393/493: Software Engineering
  • CSDS 600: Special Topics

Research Information

Research Interests

My research focuses on the intersection of Software Engineering and AI, developing models, verification techniques, and design principles for responsible AI systems. Combining formal and empirical methods, I aim to ensure algorithmic fairness and safety across end-to-end AI pipelines and LLM-enabled workflows, through analysis of various software abstractions and their real-world implementations.

[For prospective students] I’m seeking multiple self-motivated students (BS, MS, and PhD) to join my research group. If you are interested, please email me your CV and unofficial transcripts.

External Appointments

  • Distinguished Reviewer for ACM Transactions on Software Engineering and Methodology (TOSEM).
  • Panelist for National Science Foundation (NSF) grant proposal reviews.
  • Reviewer for the journals: IEEE Transactions on Software Engineering (TSE), IEEE Software, Empirical Software Engineering (EMSE), Transactions on Affective Computing (TAC), and Information and Software Technology (IST).
  • Program Committee Member for the IEEE/ACM International Conference on Software Engineering (ICSE 2024–25), the ACM International Conference on the Foundations of Software Engineering (FSE 2026), and the IEEE/ACM International Conference on Automated Software Engineering (ASE 2023–24).

Publications

Education

  • Postdoc, Carnegie Mellon University
  • PhD, Computer Science, Iowa State University
  • MS, Computer Science, Iowa State University

Additional Information

Personal website: https://sumonbis.github.io

Currently, I’m focusing on foundation models and LLMs, with an emphasis on safety and the responsible deployment of AI agents and systems. Our lab runs the state-of-the-art AISC2 cluster, comprising five HGX H200 servers with a total of 40 NVIDIA H200 GPUs (141 GB memory each). If you’re excited to push the frontiers of LLMs, let’s talk.