  1. What are the pros and cons of Microsoft Academic compared with Google Scholar? - Zhihu

    Microsoft Academic shut down on the last day of last year; only Google Scholar and Semantic Scholar are left now. Microsoft Academic applied semantic processing to the literature, for example distinguishing papers a scholar published while working at a given institution from papers the scholar and that institution jointly …

  2. How would you rate Microsoft Academic Search? - Zhihu

    Microsoft has now launched a new academic search. I gave it a try: it does not work for Chinese, but English works reasonably well and finds most things. As for Microsoft once again entering the aca…

  3. NeurIPS Reproducibility Challenge Report: Adapting Neural Networks for the Estimation of Treatment Effects ...

  4. LLM-QAT: Data-Free Quantization Aware Training for Large Language...

    Several post-training quantization methods have been applied to large language models (LLMs), and have been shown to perform well down to 8-bits.

  5. LLM-QAT: Data-Free Quantization Aware Training for Large Language Models (Anonymous ACL submission)

    Abstract: Several post-training quantization methods … (a rough int8 sketch follows this list)

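The two LLM-QAT results above mention post-training quantization methods performing well down to 8 bits. As a rough illustration of the underlying idea only, and not of the paper's data-free quantization-aware training method, a minimal symmetric per-tensor int8 round trip in NumPy might look like the sketch below; the helper names are hypothetical:

    import numpy as np

    def quantize_int8(x):
        # Hypothetical helper: symmetric per-tensor int8 quantization.
        # Map floats onto [-127, 127] using a single scale factor.
        max_abs = float(np.max(np.abs(x)))
        scale = max_abs / 127.0 if max_abs > 0 else 1.0
        q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
        return q, scale

    def dequantize_int8(q, scale):
        # Recover an approximate float tensor from int8 values and the scale.
        return q.astype(np.float32) * scale

    # Round-trip a random weight tensor and measure the quantization error.
    w = np.random.randn(4, 4).astype(np.float32)
    q, s = quantize_int8(w)
    w_hat = dequantize_int8(q, s)
    print("max abs error:", np.max(np.abs(w - w_hat)))

The printed round-trip error is the quantization noise that post-training methods try to keep small without retraining the model.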