Microsoft Academic shut down on the last day of last year, so now only Google Scholar and Semantic Scholar remain. Microsoft Academic applied semantic processing to the literature — for example, it could distinguish papers a scholar published while employed at an institution from papers the scholar merely co-authored with that institution …
Microsoft has now launched a new academic search. I tried it: Chinese queries don't work, but English ones are fine and basically everything can be found. As for Microsoft re-entering academic …
NeurIPS Reproducibility Challenge Report: Adapting Neural Networks for the Estimation of Treatment Effects ...
Several post-training quantization methods have been applied to large language models (LLMs), and have been shown to perform well down to 8-bits.
LLM-QAT: Data-Free Quantization Aware Training for Large Language Models