Hacker News

It's just concept space. The entire LLM works in this space once the embedding layer is done. It's not really that novel at all.


This was my thought. Literally everything inside a neural network is a “latent space”, starting with the embeddings used to map categorical features in the first layer.
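As a minimal sketch of that point (a toy, hand-rolled example, not any particular library's API): an embedding layer is just a learned lookup table that maps discrete token IDs to dense vectors, and everything downstream of it operates on those latent vectors.

```python
import numpy as np

# Toy embedding layer: a lookup table mapping token IDs to dense vectors.
# In a real model this table is learned; here it's random for illustration.
vocab_size, dim = 10, 4
rng = np.random.default_rng(0)
embedding_table = rng.normal(size=(vocab_size, dim))

token_ids = np.array([3, 1, 7])              # categorical inputs (token IDs)
latent_vectors = embedding_table[token_ids]  # each row is a point in latent space

print(latent_vectors.shape)  # (3, 4): all later layers work on vectors like these
```

Every transformation after this lookup (attention, MLPs, normalization) is a map from this vector space to itself, which is the sense in which the whole model "works in concept space."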

Latent space is where the magic literally happens.


Completely agree. Have you seen this?

https://sakana.ai/asal/



