LLM-eval-survey
The official GitHub page for the survey paper "A Survey on Evaluation of Large Language Models".
How to download and set up LLM-eval-survey
Open a terminal and run:
git clone https://github.com/MLGroupJLU/LLM-eval-survey.git
git clone creates a local copy (clone) of the LLM-eval-survey repository.
You pass git clone a repository URL; it supports several network protocols and corresponding URL formats.
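The two most common URL formats are HTTPS and SSH. A minimal sketch of how the same repository is addressed in each (owner and repository name taken from the clone command above):

```shell
# Build the HTTPS and SSH clone URLs for the same GitHub repository.
OWNER=MLGroupJLU
REPO=LLM-eval-survey
HTTPS_URL="https://github.com/${OWNER}/${REPO}.git"   # works anonymously for read-only clones
SSH_URL="git@github.com:${OWNER}/${REPO}.git"         # requires an SSH key registered on GitHub
echo "$HTTPS_URL"
echo "$SSH_URL"
```

HTTPS needs no setup for read-only access; SSH avoids password/token prompts once a key is registered with your GitHub account.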
You can also download LLM-eval-survey as a zip archive: https://github.com/MLGroupJLU/LLM-eval-survey/archive/master.zip
Or clone LLM-eval-survey over SSH:
[email protected]:MLGroupJLU/LLM-eval-survey.git
If you run into problems with LLM-eval-survey
You can open an issue on the repository's issue tracker: https://github.com/MLGroupJLU/LLM-eval-survey/issues
Repositories similar to LLM-eval-survey
Here you can find LLM-eval-survey alternatives and analogs
netdata, primesieve, fashion-mnist, FrameworkBenchmarks, BenchmarkDotNet, jmeter, awesome-semantic-segmentation, sysbench, hyperfine, tsung, benchmark_results, across, web-frameworks, php-framework-benchmark, jsperf.com, go-web-framework-benchmark, huststore, phoronix-test-suite, Attabench, ann-benchmarks, sbt-jmh, caffenet-benchmark, chillout, IocPerformance, prophiler, TBCF, NBench, sympact, awesome-http-benchmark, BlurTestAndroid