Thank you so much. I have to look up terms like AR Systems and TOPS. My high school computer lab had a 32k Apple that was a challenge to graph a sinusoid on. 🤓 The computer lab back in college would 'time share' with the computers at Dartmouth. 🐕 To even attempt to fathom 10-billion parameters⚡🧮 as "small" blows up this olde dog's notion of knew trix. Great overview ⏰⚡💰🕶️📲 of the scale value and time sequence of AI going forward...
My opinion is SLMs are for verticalized application-layer products that are hyper-specialists in a specific field. They will be deep, not broad.
LLMs will continue to push the frontier and will be the models professionals use for extremely compute-intensive tasks. Perhaps asking a question that the model will sit and think about for weeks.
this is good
slm > llm indeed
Great for the ecosystem!
lfg
cheers
Well, if SLM glasses can stop people from walking straight into me while they're engrossed in their phones, then I can't wait.
Hahaha true product market fit!
Do you know whether “federated inference” is also technically feasible (at least theoretically) similar to “Federated learning strengthens this design by enabling local training with aggregated updates”?
Meaning, can a local cohort of compute be leveraged to create transient capacity for more involved problems?
It's possible in theory, but in practice it might be better to use a bigger central computer. It could be useful, though, in a world where blockchain meets AI.
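For what it's worth, the "local cohort" idea is at least sketchable as pipeline-parallel inference: split a model layer-wise across nearby devices and pass activations between them, so no single device holds the whole thing. A toy sketch below; every name, dimension, and weight is invented for illustration (a real system would ship tensors over a network and fight latency the whole way):

```python
# Toy sketch of "federated inference" via pipeline parallelism:
# a model too big for one node is split layer-wise across a local
# cohort, and activations hop from device to device.
import random

random.seed(0)

def make_layer(n_in, n_out):
    # Tiny dense layer with random weights (stand-in for a model shard).
    return [[random.uniform(-0.1, 0.1) for _ in range(n_in)]
            for _ in range(n_out)]

def apply_layer(weights, x):
    # y_j = relu(sum_i w_ji * x_i)
    return [max(0.0, sum(w * xi for w, xi in zip(row, x)))
            for row in weights]

class LocalNode:
    """One device in the cohort, holding a contiguous shard of layers."""
    def __init__(self, name, layers):
        self.name = name
        self.layers = layers

    def forward(self, x):
        for layer in self.layers:
            x = apply_layer(layer, x)
        return x

# Split a 6-layer "model" across three nearby devices (names invented).
dims = [8, 16, 16, 16, 16, 16, 4]
layers = [make_layer(dims[i], dims[i + 1]) for i in range(6)]
cohort = [
    LocalNode("phone", layers[0:2]),
    LocalNode("laptop", layers[2:4]),
    LocalNode("tablet", layers[4:6]),
]

def federated_infer(cohort, x):
    # Activations pass node to node; no device sees the full model.
    for node in cohort:
        x = node.forward(x)
    return x

output = federated_infer(cohort, [1.0] * 8)
print(len(output))  # 4 values from the final shard
```

The catch, as noted above, is that each hop adds network latency, which is why a bigger central computer usually wins in practice.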
It's certainly a brave knew world. 🌐👨🏻💻🔮🌄🎯
"On Knew Things" 🏁🏆
About the time I get to accept 🐢 one knew set of tech acronyms, a knewer one pops up! 🐇
Better build more electric capacity for the grid. I figure most kids ⚡ will want both LLMs & SLMs 😏
Definitely need more energy!
I am not bullish on Apple given their current failures in incorporating AI. The winners might surprise us:
Google with the Pixel brand
OpenAI with wearables
Meta probably with glasses
And don't sleep on startups moving faster than incumbents can
Yes, I think SLMs will be built into consumer products and nearly invisible. Phones don't say "powered by the internet!" We just expect them to be.
Application layer companies will charge by subscription or usage
Frontier labs will charge subscription + tokens after a certain limit and the competitive advantage will be in how cheap you can price per token
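That tiered model (flat subscription, then per-token charges past an included quota) is easy to make concrete. A toy bill calculator; every price and quota here is a made-up number, not any lab's actual rates:

```python
# Toy illustration of subscription + overage-token pricing.
# All figures are invented for the example.

def monthly_bill(tokens_used, subscription=20.0,
                 included_tokens=1_000_000, price_per_token=2e-6):
    # Pay the flat fee; tokens beyond the quota are billed per token.
    overage = max(0, tokens_used - included_tokens)
    return subscription + overage * price_per_token

print(monthly_bill(800_000))    # under quota: just the $20 subscription
print(monthly_bill(3_000_000))  # $20 base + 2M overage tokens, roughly $24
```

Under this structure, the whole competition collapses into `price_per_token`, which is the point above: whoever serves tokens cheapest wins the margin.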