I often feel like a shapeshifting fluid, a general fuzzy meta-hypergraph with plastic ecosystems of nested subagents, shapeshifting between centralized and decentralized network configurations. You can just design your own identity.

What would you say is your primary way of thinking, in your experience? Language? Abstract? Visual? Multimodal? Graphs? Fuzzy? Symbolic? All sorts of combinations? Can't put it into words? https://x.com/BangL93/status/1908128095967592485

Technically I was contrasting fuzzy and symbolic with each other, while the other things can be subsets of those two, depending on how you define it all. Or you can also contrast them with neural in the connectionist sense. And you can see it as a spectrum, with subsymbolic and neurosymbolic stuff in between. I think you can say that tons of the things I mentioned live in a structured high-dimensional space of possible qualia, with many of them as discrete or continuous dimensions (or something in the middle, with phase shifts). Hypergraphs are also interesting and sometimes make sense in phenomenology, or metagraphs, or hypermetagraphs 😄 (there's a toy sketch of a fuzzy hypergraph at the end of this message). Markov blankets can be useful as well, and of course the whole QRI coupled-oscillators etc. stuff.

I also find it fascinating that when you explore different scientific fields, you train the mind to use different elementary structures and different ways of composing them. To a first, very high-level approximation: a lot of programmers think in discrete symbolic code, engineers think in engineering diagrams, geometric mathematicians think in shapes, algebraic mathematicians compose algebraic symbols from axioms into theorems, physicists think in rates of change, graph theorists think in graphs, category theorists think in similarities between abstract graphs across scales, systems scientists think in dynamical complex systems across scales, etc. And there's still amazing, gigantic diversity and nuance to it all. And you can combine all of this into hybrid or more meta ways of thinking.

Yes, I think tons of these different ways of thinking I mentioned in my previous messages have literal, mathematically distinguishable neural correlates in neural dynamics. I think some cut more fundamentally into the brain's architecture than others. And there are lots of commonalities between them, like manipulating invariances, so group theory and symmetries sit underneath a lot of them.

It's nice that stuff like meditation, exercise, psychedelics, or just constant learning can counter the lowering of neuroplasticity with age, though to be honest my neuroplasticity baseline may sometimes be too high, because I love to juggle different worldviews so often. Are you a dense model or a mixture-of-experts model?
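To make the fuzzy-metahypergraph image a bit more concrete, here is a minimal toy sketch in Python. This is my own illustrative construction, not any established formalism: hyperedges connect arbitrary sets of nodes, and each node belongs to a hyperedge with a degree in [0, 1] rather than all-or-nothing. The "meta" part would just mean additionally letting hyperedges themselves appear as nodes inside other hyperedges, which I leave out to keep the toy small.

```python
from dataclasses import dataclass, field

@dataclass
class FuzzyHypergraph:
    """Toy fuzzy hypergraph: hyperedges are named node sets, and each
    node belongs to a hyperedge with a membership degree in [0, 1]."""
    nodes: set = field(default_factory=set)
    edges: dict = field(default_factory=dict)  # name -> {node: degree}

    def add_edge(self, name, memberships):
        for node, degree in memberships.items():
            if not 0.0 <= degree <= 1.0:
                raise ValueError(f"membership for {node!r} must be in [0, 1]")
            self.nodes.add(node)
        self.edges[name] = dict(memberships)

    def degree(self, node):
        """Sum of a node's membership degrees across all hyperedges."""
        return sum(e.get(node, 0.0) for e in self.edges.values())

# Hypothetical example: loosely overlapping "ways of thinking"
g = FuzzyHypergraph()
g.add_edge("symbolic", {"code": 0.9, "algebra": 0.8, "graphs": 0.4})
g.add_edge("fuzzy", {"intuition": 0.9, "graphs": 0.6, "imagery": 0.7})
print(g.degree("graphs"))  # 1.0 -- partial membership in both clusters
```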
Maybe we think in fuzzy metahypergraphs.

The more perspectives I accumulate, the more contradictions I get in my world model, and that inhibits my decision making. And any attempt to resolve the contradictions under some heuristic basically closes you internally into a narrower perspective or set of perspectives. I guess one approach to not introducing contradictions into your world model is to do attentive compression, ignoring as much as possible of the data that contradicts your existing perspectives. But that gets you into all sorts of closed-mindedness and confirmation-bias traps, which you can never fully escape anyway.

Do you approximate your world model with affine transformations plus nonlinear activation functions, polynomials, sines, pseudorandom noise signals (reservoir computing), or some super-exotic magic that approximates arbitrary functions and generalizes well enough to venture out of distribution, beyond classical language? (There's a toy sketch of a couple of these at the end of this message.)

What is your most common way of conceptualizing yourself? Do you see yourself as a human that's part of society, according to natural-language narratives? Do you prefer to see yourself as a computational system that can do all sorts of arbitrary computations? Do you try to see yourself more in natural-language terms, computer-science terms, physics terms, pure-math terms, etc.? Combinations? Personality-wise, I prefer the computational-system lens more on average. How about you? What would you change in your brain's architecture?

Self-awareness of self-awareness of self-awareness of self-awareness.

I wonder what the feeling of "understanding the nature of reality" correlates with on the neural level, because there are so many people who think they "understand the nature of reality", yet their truths are completely incompatible with each other, and they're all so extremely confident in them. I just run on the heuristic of building more and more predictive models of everything; the more predictive models I have in various domains, the closer they are to scientific "truth". Outside of that, it feels like you can postulate an infinite number of arbitrary axioms, arbitrary stories, arbitrary narratives. Why should any of them be more "true" than any other?
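Since the world-model question above names a few concrete function-approximation families, here is a minimal toy contrast, with made-up sizes and random data, between a single affine-plus-nonlinearity layer and an echo-state-style reservoir update. In the reservoir case a fixed random recurrent matrix does the nonlinear mixing of the input history, and only a linear readout on top of the state would ever be trained; this is just a sketch of the idea, not a tuned implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Affine transformation + nonlinear activation (one MLP-style layer) ---
W, b = rng.normal(size=(16, 4)), rng.normal(size=16)

def mlp_layer(x):
    return np.tanh(W @ x + b)  # affine map followed by a nonlinearity

# --- Echo-state-style reservoir update (reservoir computing) ---
n_res = 64
W_in = rng.normal(scale=0.5, size=(n_res, 4))
W_res = rng.normal(size=(n_res, n_res))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))  # spectral radius < 1

def reservoir_step(state, x):
    # Fixed random recurrence nonlinearly mixes input history into the state.
    return np.tanh(W_res @ state + W_in @ x)

state = np.zeros(n_res)
for t in range(10):                      # feed a toy input sequence
    x_t = rng.normal(size=4)
    state = reservoir_step(state, x_t)   # state now encodes recent history

print(mlp_layer(rng.normal(size=4)).shape, state.shape)
```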
It often feels so arbitrary. So one of the only explanations that makes sense to me now is that it just "feels" true, and that there is some neural correlate of that feeling, but it doesn't have to be empirically predictive at all. People cannot deal with unknowns or a lack of answers, in my anecdotal experience, so they just fill them with whatever. And when new scientific evidence stops fitting the stories that used to fill those unknowns, you start crusading against it, like a lot of religions do with science. But not always; some people seem to just update the religious narrative to fit better with existing science. I think it's also kind of a self-defense mechanism.

But I also think a lot of people label deep meditative states as "truth" precisely because they feel so good, while the scientific reason might be more like higher resting dopamine, or some other cocktail of neurotransmitters, or relaxed baseline neural activity (possibly more symmetrical activity) correlated with wellbeing. It takes a very strong mind to consider different answers to the big questions and not go insane.

Like I said before: the more perspectives I accumulate, the more contradictions I get in my world model, and that also inhibits my decision making and value construction; resolving them under some heuristic narrows you, and ignoring contradicting data just trains closed-mindedness and confirmation bias. But living in a simplified world model with few contradictions, one that makes sense to you, feels much better, as long as the environment isn't actively destabilizing it with way too much conflicting data; you can always just train yourself into bigger closed-mindedness and confirmation bias. Ideologies and religions get born when a big enough group of people shares the same compressive, simplified model of reality, or mutates some existing memeplex, all under various incentives that often aren't primarily about maximizing empirical predictive power the way science tries to be.

Yes, I think about this hypothesis often: the symmetry theory of valence might hold because of a biological incentive to favor model simplicity. Simpler models have more symmetries. And as a result of this incentive we get a lot of great models in physics based on symmetries, but also a lot of underfitted folk models of physics from the general public, like all the sacred geometry stuff, which also feels euphoric, so it's still at least great art and a therapy tool.
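To gesture at the "simpler models have more symmetries" point with numbers rather than vibes, here is a toy sketch of my own (not anything from QRI's actual symmetry-theory-of-valence work): fitting an even, i.e. symmetric, target with a polynomial. Imposing the symmetry f(x) = f(-x) kills the odd coefficients and roughly halves the parameter count while giving essentially the same fit.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-2, 2, 200)
y = np.cos(2 * x) + 0.05 * rng.normal(size=x.size)  # even (symmetric) target

# Unconstrained degree-6 polynomial in x: 7 free coefficients.
coeffs_full = np.polyfit(x, y, deg=6)

# Symmetry-constrained model f(x) = g(x^2): a degree-3 polynomial in x^2,
# i.e. only even powers of x survive -- 4 free coefficients.
coeffs_even = np.polyfit(x**2, y, deg=3)

err_full = np.mean((np.polyval(coeffs_full, x) - y) ** 2)
err_even = np.mean((np.polyval(coeffs_even, x**2) - y) ** 2)

print(f"full model: {coeffs_full.size} params, mse={err_full:.4f}")
print(f"even model: {coeffs_even.size} params, mse={err_even:.4f}")
```

The point being gestured at: a symmetry constraint is a form of compression, so a simplicity-favoring learner tends to reach for symmetric models, whether or not the world actually has that symmetry.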
I'm at this moment meditating on love for all mental representations, since in my phenomenology that seems to generate symmetries by minimizing self/other and ingroup/outgroup categorizations. Love for others, love for fantasy characters, love for animals, love for objects, love for nature, love for concepts, love for science, love for technology, love for math, love for philosophy, love for existence, love for anything, just unconditional love itself on its own.

I wonder what the neural correlate is of the feeling that you're God, which seems to come up often in religions. I suspect it happens when the brain's self model (the neural representations under the "self" label) gets expanded to cover your whole world model, which should correlate with more decentralized representations, maybe with some difference in the prefrontal cortex. [Self model - Wikipedia](https://en.wikipedia.org/wiki/Self_model)

I prefer seeing every belief my brain comes up with as just a useful model for some purpose, generated by some computational process, an approximating model of the world and of my organism, the way artificial neural networks form representations, with a lot of useful evolutionary biases, where the map is never the territory.

I see free will as a fun model you can play with on a philosophical or more experiential level. I suspect that if your self model represents itself as having free will, the brain generates more dopamine or activates the relevant circuits more, because you believe you have higher agency, where agency is the ability to predict and control future states of the world using your models, in the cybernetic sense, like a more advanced thermostat.

When I did 5-MeO-DMT (legally), my brain's inner world engine's space and time totally disintegrated for a bit, but I don't take that as objective truth; I see it as playing with my brain's physics engine. It's fascinating. I often wonder how the brain constructs the physics engine that models the world approximately, constantly grounded in incoming sense data, which can go in arbitrary directions in dreams, meditation, on substances, etc., yet is still limited by its architecture.

You can check out some Joscha Bach to get more metacognition and agency over your beliefs: [https://www.youtube.com/watch?v=nnMiDK2okIE](https://www.youtube.com/watch?v=nnMiDK2okIE)

I think it's a great heuristic to never 100% believe the brain's beliefs, being a good Bayesian.
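A minimal sketch of the "never 100% believe" heuristic in Bayesian terms (a toy of my own, not anything from the linked talk): as long as the prior isn't exactly 0 or 1 and each piece of evidence has a finite likelihood ratio, the posterior stays strictly inside (0, 1), so there is always room left to update.

```python
def bayes_update(prior, likelihood_ratio):
    """Update P(H) given evidence E with P(E|H) / P(E|not H) = likelihood_ratio."""
    odds = prior / (1.0 - prior)
    posterior_odds = odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Pile on strong evidence for the same belief, starting from a 50/50 prior.
p = 0.5
for _ in range(5):
    p = bayes_update(p, likelihood_ratio=20.0)  # each observation is 20:1 in favor
    print(f"{p:.10f}")
# p creeps toward 1 but never reaches it -- a finite amount of evidence
# never justifies literally 100% confidence.
```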