• @Veritas@lemmy.ml
    1 year ago

    I’m most excited about the upcoming Vicuna 65B and other LLMs with 100k+ token context windows that can basically take a whole book or a large source-code base as input.