at the moment, yes. The one possible silver lining with all of the current hardware crunch is that it _should_ force some hardware advancements. The last couple of years, hardware has been kinda boring. My M1 Max is still zippy as all hell and doesn't really need to be upgraded, unless I'm committing to local AI inference.
I kinda assume phones are going to be battery-powered for the foreseeable future. "Gaming" phones with better cooling do exist, but they're a tiny niche. Most local AI users will want to serve their inference needs through a very different kind of system.
Yes, but the battery tech itself is improving. We're already seeing new phones approach 8000 mAh internal batteries, which is large enough that you can splurge on compute and still have some left over at the end of the day.
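A quick back-of-envelope sketch of that claim. All of the power-draw and usage figures below are assumptions picked for illustration, not measurements from any specific phone:

```python
# Rough energy budget for an 8000 mAh phone battery.
# Every figure besides the 8000 mAh capacity is an assumed placeholder.

NOMINAL_VOLTAGE_V = 3.85   # typical nominal Li-ion cell voltage
CAPACITY_MAH = 8000

battery_wh = CAPACITY_MAH / 1000 * NOMINAL_VOLTAGE_V  # ~30.8 Wh

baseline_draw_w = 1.0      # assumed average screen-on, light-use draw
inference_draw_w = 6.0     # assumed NPU/GPU draw during local inference
daily_use_hours = 5.0      # assumed screen-on time per day
inference_hours = 1.0      # assumed daily on-device inference time

energy_used_wh = (baseline_draw_w * daily_use_hours
                  + inference_draw_w * inference_hours)
remaining_wh = battery_wh - energy_used_wh

print(f"battery: {battery_wh:.1f} Wh, "
      f"used: {energy_used_wh:.1f} Wh, "
      f"left: {remaining_wh:.1f} Wh")
```

Under these made-up numbers the battery holds about 30.8 Wh, an hour of heavy inference plus normal use burns about 11 Wh, and you'd end the day with plenty of margin. Real draw varies a lot with the model size and the chip, so treat this as a shape-of-the-argument sketch only.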
I'm very curious what kind of hardware advancements you're imagining. Because we're already kind of near a physical wall regarding heat dissipation on phones.
I mean hey, maybe foundational physics will surprise the world with a radical breakthrough that disappears heat into a black hole or something, but I sure wouldn't hold my breath
eGPU cradles, presumably, for people with intense local model execution requirements, until it can be made to work on the device itself? Exactly like the POS dongles Square had until tap to pay became more widespread?
This is 100% AI's fault. It's a mix of more commits coming in, most likely code quality degradation, and I wouldn't be surprised if capex that could be used to help with load is going toward AI instead.
I have a lot of sway over which git+CI/CD system my corporate overlord uses. I'm very GitHub-alternative curious right now, so if anyone is pushing an alternative git+CI/CD stack, I want to hear about it.
It really has been infuriating lately. Between this and my company's proxy screwing with HTTP/2 at least once a day, the frustration is very, very real. While I'm nowhere near as invested in GitHub, its decline does make me sad.