[[Intelligence]] x [[Generalization]]
[#51 FRANCOIS CHOLLET - Intelligence and Generalisation - YouTube](https://youtu.be/J0p_thJJnoo?si=VsoI2D6y5SK5Qm2l)
For intelligence, you need to be optimizing for generality itself.
Generalization is the ability to mine previous experience to make sense of future novel situations.
Generalization describes a knowledge differential: it characterizes the ratio between the information you already have and the space of possible future situations.
Generalization power is sensitivity to abstract analogies: to what extent can we analogize the knowledge we already have into simulacra that apply widely across the experience space?
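A rough way to write that ratio down, loosely following the paper's "skill-acquisition efficiency" framing (my paraphrase; Chollet's actual formula also weights tasks by generalization difficulty):

$$\text{intelligence} \propto \frac{\text{skill acquired across novel situations}}{\text{priors} + \text{experience}}$$

The more skill a system wrings out of a fixed budget of priors and experience, the more general it is.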
I like his analysis of LLMs in this framework: https://twitter.com/fchollet/status/1763692655408779455. And here's his paper:
[[1911.01547] On the Measure of Intelligence](https://arxiv.org/abs/1911.01547)
His in-practice benchmark for generalization, ARC (the Abstraction and Reasoning Corpus) from the paper above, is nice.
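For reference, a minimal sketch of what an ARC task looks like programmatically, assuming the JSON layout from the public fchollet/ARC repo (`train`/`test` lists of `input`/`output` integer grids); `solve` here is a hypothetical placeholder, not a real solver:

```python
import json

def load_arc_task(path: str) -> dict:
    """Load one ARC task: a JSON dict with 'train' and 'test' lists,
    each item holding an 'input' and 'output' grid of ints 0-9."""
    with open(path) as f:
        return json.load(f)

def solve(grid: list[list[int]]) -> list[list[int]]:
    # Hypothetical: a real solver must infer the transformation from the
    # few 'train' demonstration pairs and apply it to each 'test' input.
    raise NotImplementedError

def is_solved(task: dict) -> bool:
    # A task counts as solved only if every predicted test output grid
    # matches the expected grid exactly.
    return all(solve(pair["input"]) == pair["output"] for pair in task["test"])
```

The few-shot structure is the point: each task gives only a handful of demonstration pairs, so memorization doesn't help and the solver has to generalize.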
How do we make these ideas as practical as possible? Did you, or someone else, attempt to build a more generalization-maximizing cost function from them?
Maybe you could design a cost function that incentivizes more predictive constructions of alternative world models? One toy interpretation is sketched below.
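Purely my own sketch (not from the talk or paper): train a small ensemble of alternative world models, reward each for predictive accuracy, and penalize the ensemble for collapsing into a single model, so the alternatives stay genuinely alternative. All names, and the negative-variance diversity term, are assumptions:

```python
import torch
import torch.nn as nn

class WorldModel(nn.Module):
    """One candidate world model: predicts the next observation
    from the current observation and action."""
    def __init__(self, obs_dim: int, act_dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim + act_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, obs_dim),
        )

    def forward(self, obs: torch.Tensor, act: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([obs, act], dim=-1))

def alternative_worlds_loss(models, obs, act, next_obs, diversity_weight=0.1):
    """Hypothetical cost: every candidate model should predict the next
    observation well, while the ensemble is discouraged from collapsing
    into identical predictions (low variance across models is penalized)."""
    preds = torch.stack([m(obs, act) for m in models])          # (K, batch, obs_dim)
    predictive = ((preds - next_obs.unsqueeze(0)) ** 2).mean()  # all models should predict well
    collapse = -((preds - preds.mean(dim=0, keepdim=True)) ** 2).mean()  # negative ensemble variance
    return predictive + diversity_weight * collapse

# Usage on dummy data:
models = [WorldModel(obs_dim=8, act_dim=2) for _ in range(4)]
obs, act, next_obs = torch.randn(32, 8), torch.randn(32, 2), torch.randn(32, 8)
loss = alternative_worlds_loss(models, obs, act, next_obs)
```

The negative-variance term is a crude proxy for "alternative"; something that measures disagreement on held-out novel states would track generalization pressure more directly.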