



Can’t, he’s too busy with Tasker profiles.


That’s news to me, unless you’re only referring to the smaller models. Any chance you can run a model that exceeds your RAM capacity yet?


Sadly, we’ll most likely see an influx of regulation right when it’s broadly accessible to the general public to run locally.


Those poor raccoons have no idea they’ve picked the wrong side. They’d have been better off taking after moles and burrowing beneath our climate catastrophes.


Sorry for asking, but what are the chances this gets an iOS release?


deleted by creator
I’m not knowledgeable in this area, but I wish there were a way to partition the model and stream the partitions over the input, allowing some kind of serial processing of models that exceed memory. Like if I could allocate 32 GB of RAM and still process a 500 GB model, just at a roughly 15× slower rate (500/32 ≈ 15.6).
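A minimal sketch of that idea, assuming the model can be split into layer-sized partitions stored as separate files: load one partition into RAM, apply it to the activations, free it, then load the next. All the file names and the toy "model" here are hypothetical; real runtimes that do something similar (e.g. memory-mapped weight loading) involve far more engineering.

```python
# Toy demo: stream a model's layers from disk one at a time so peak RAM
# holds only a single partition instead of the whole model.
import os
import tempfile
import numpy as np

rng = np.random.default_rng(0)
n_layers, dim = 4, 8

# Pretend each layer's weights live in their own file on disk.
tmpdir = tempfile.mkdtemp()
paths = []
for i in range(n_layers):
    p = os.path.join(tmpdir, f"layer_{i}.npy")
    np.save(p, rng.standard_normal((dim, dim)).astype(np.float32))
    paths.append(p)

def stream_forward(x, layer_paths):
    """Apply each layer in sequence, loading and freeing weights as we go."""
    for p in layer_paths:
        w = np.load(p)        # read one partition into RAM
        x = np.tanh(x @ w)    # apply it to the activations
        del w                 # drop it before loading the next partition
    return x

out = stream_forward(np.ones(dim, dtype=np.float32), paths)
print(out.shape)
```

The trade-off is exactly the one described above: every token (or batch) now pays the cost of re-reading all the weights from disk, so throughput drops by roughly the ratio of model size to available RAM, usually worse once disk bandwidth is the bottleneck.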