To go deeper: some animals react with curiosity, others with fear, but only a few understand what the mirror does and use it to inspect themselves.
I like to describe it as a “force multiplier” along the lines of a powered suit.
You put in small inputs, and they echo out into a vast virtual space, where they are compared and connected with countless billions of possible associations. What you get back is a kind of amplification of what you put in. If you make even a remotely leading suggestion in your question or prompt, that tiny suggestion will also get massively boosted in the background; this is part of why some LLMs can go off the rails with some users. If you don't take care with exactly what you're putting in, you will get wildly unexpected results.
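To make the "leading suggestion" point concrete, here is a minimal sketch (the prompts are invented examples, not from any real system) contrasting a question that smuggles in its own conclusion with a neutral rewording of the same question:

```python
# Hypothetical example prompts illustrating leading vs. neutral phrasing.
# A leading prompt embeds assumptions that a model tends to amplify
# rather than question; a neutral prompt leaves the conclusion open.

leading = "Why is Python so much slower than C for every workload?"
neutral = "How does Python's performance compare with C, and for which workloads?"

# The leading version presupposes "so much slower" and "every workload";
# a model answering it will usually elaborate on that framing instead of
# checking whether it is true.
print("Leading:", leading)
print("Neutral:", neutral)
```

The difference looks tiny on the page, which is exactly the point: the amplification happens on the model's side, not in the length of the prompt.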
also, it’s devil tech so there’s that.