S-Eval

S-Eval: Automatic and Adaptive Test Generation for Benchmarking Safety Evaluation of Large Language Models


How to clone S-Eval

Clone via HTTPS

git clone https://github.com/IS2Lab/S-Eval.git

Clone via SSH

git clone git@github.com:IS2Lab/S-Eval.git

Download ZIP

Download master.zip
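If you want the ZIP snapshot from a script, GitHub serves branch archives at a predictable path (`/archive/refs/heads/<branch>.zip`). A minimal shell sketch, assuming the default branch is `master` as the master.zip link above suggests:

```shell
# Build the standard GitHub snapshot URL for a branch archive.
OWNER=IS2Lab
REPO=S-Eval
BRANCH=master   # assumed default branch, per the master.zip link above
ZIP_URL="https://github.com/${OWNER}/${REPO}/archive/refs/heads/${BRANCH}.zip"
echo "${ZIP_URL}"

# To fetch and unpack (requires curl and unzip), uncomment:
# curl -L -o "${REPO}.zip" "${ZIP_URL}" && unzip "${REPO}.zip"
```

The commented-out `curl` line is left inert so the sketch runs without network access; the URL pattern itself is GitHub's standard archive convention.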

Found an issue?

Report bugs or request features on the S-Eval issue tracker:

https://github.com/IS2Lab/S-Eval/issues