13 Comments
Cyrus Johnson "AI Counsel":

this is good

SLM > LLM indeed

Matthew Harris:

Great for the ecosystem!

Cyrus Johnson "AI Counsel":

lfg

cheers

Steve:

Well, if SLM glasses can keep people from walking straight into me while they’re engrossed in their phones, then I can’t wait.

Matthew Harris:

Hahaha, true product-market fit!

Dierken:

Do you know whether “federated inference” is also technically feasible (at least theoretically), similar to “Federated learning strengthens this design by enabling local training with aggregated updates”?

Meaning, can a local cohort of compute be leveraged to create transient capacity for more involved problems?

Matthew Harris:

It’s possible in theory, but in practice it might be better to use a bigger central computer. It could be useful, though, in a world where blockchain and AI intersect.
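
To make the “local cohort of compute” idea concrete, here is a minimal, purely hypothetical sketch of federated inference: a few nearby devices each run their own on-device SLM on the same prompt, and the most confident answer wins. The peer names and the pick-the-best aggregation rule are invented for illustration; a real system would also need peer discovery, trust, and latency handling.

```python
import asyncio
import random

# Hypothetical "federated inference" sketch: a local cohort of devices,
# each with its own SLM, pooled for one harder query. Peer names and the
# aggregation rule are made up for illustration.
PEERS = ["phone", "glasses", "laptop"]

async def peer_inference(peer: str, prompt: str) -> tuple[str, float]:
    """Stand-in for a call to a peer's on-device SLM.
    Returns (answer, self-reported confidence)."""
    await asyncio.sleep(random.uniform(0.05, 0.2))  # simulated network + compute time
    return f"{peer} answer to: {prompt}", random.random()

async def federated_answer(prompt: str) -> str:
    """Fan the prompt out to every reachable peer, keep the most confident reply."""
    results = await asyncio.gather(*(peer_inference(p, prompt) for p in PEERS))
    best_answer, _ = max(results, key=lambda r: r[1])
    return best_answer

if __name__ == "__main__":
    print(asyncio.run(federated_answer("Plan a three-day trip on a $500 budget")))
```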

Robert C Culwell:

It's certainly a brave knew world. 🌐👨🏻‍💻🔮🌄🎯

"On Knew Things" 🏁🏆

About the time I get to accept 🐢 one knew set of tech acronyms, a knewer one pops up! 🐇

Better build more electro capacity for the grid, I figure most kids ⚡will want both LLM's & SLM's😏

Matthew Harris:

Definitely need more energy!

Robert C Culwell:

Thank you so much. I have to look up terms like AR Systems and TOPS. My high school computer lab had a 32k Apple that was a challenge to graph a sinusoid on. 🤓 The computer lab back in college would 'time share' with the computers at Dartmouth. 🐕 To even attempt to fathom 10-billion parameters⚡🧮 as "small" blows up this olde dog's notion of knew trix. Great overview ⏰⚡💰🕶️📲 of the scale value and time sequence of AI going forward...

[Comment removed, Aug 31]
Matthew Harris:

My opinion is that SLMs will power verticalized application-layer products that are hyper-specialists in a specific field. They will be deep, not broad.

LLMs will continue to push the frontier and will be the models professionals use for extremely compute-intensive tasks, perhaps a question the model sits and thinks about for weeks.

[Comment removed, Aug 31]
Matthew Harris:

I am not bullish on Apple given their current failures in incorporating AI. The winners might surprise us:

Google with the Pixel brand

OpenAI with wearables

Meta probably with glasses

And don't sleep on startups moving faster than incumbents can.

Matthew Harris:

Yes, I think SLMs will be built into consumer products and be nearly invisible. Phones don’t say “powered by the internet!”; we just expect them to be.

Application-layer companies will charge by subscription or usage.

Frontier labs will charge a subscription plus tokens after a certain limit, and the competitive advantage will be in how cheaply you can price per token.
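
As a toy illustration of that pricing model (every number is invented, not anyone's actual rates), a “subscription plus metered overage” bill might be computed like this:

```python
# Toy arithmetic for "subscription + tokens after a certain limit" pricing.
# Every figure below is an assumption made up for illustration.
SUBSCRIPTION_USD = 20.00        # flat monthly fee (assumed)
INCLUDED_TOKENS = 2_000_000     # tokens covered by the subscription (assumed)
USD_PER_1K_OVERAGE = 0.002      # price per 1,000 tokens beyond the limit (assumed)

def monthly_bill(tokens_used: int) -> float:
    """Flat subscription, plus metered overage once the included tokens run out."""
    overage = max(0, tokens_used - INCLUDED_TOKENS)
    return SUBSCRIPTION_USD + (overage / 1_000) * USD_PER_1K_OVERAGE

print(monthly_bill(1_500_000))  # under the limit -> 20.0
print(monthly_bill(5_000_000))  # 3M overage tokens -> 20.0 + 6.0 = 26.0
```

The competitive lever in this sketch is the last constant: whoever can drive the per-token overage price lowest wins on price.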
