So, node weights. In a neural network, the weights determine exclusively what path is taken through the network. But this is no different from a high-dimensionality multinomial logit problem, where you're going to get multicollinearity. There are ways to collapse those dimensions that I don't think are used in AI and might actually improve the output. I can't remember what the technique is called; it's what I did at EAB.

The other issue: take simple word completion, like "the cat ran ___". You're going to have so many options that are basically tied that the choice comes down to rounding and sorting. The candidates are statistically identical, but it matters how they sort. Each layer of a neural network should be evaluated with statistical rigor to determine whether it actually adds information. It may not, and so one-shot AI needs to consider whether to ask more questions instead.

> [!ai]+ AI
>
> **Key Points:**
> - **Node weights in neural networks:** Determine the path taken through the network, similar to a multinomial logit problem.
> - **Multicollinearity:** High dimensionality in node weights can lead to multicollinearity, where multiple nodes have similar effects.
> - **Dimensionality reduction:** Techniques from other fields (e.g., EAB) could be applied to reduce dimensionality and improve output.
> - **Word completion:** In simple word completion tasks, many options may be statistically identical, leading to arbitrary sorting and potential inaccuracies.
> - **Statistical evaluation:** Each layer of a neural network should be evaluated statistically to determine its contribution to information gain.
> - **One-shot AI:** If a layer does not add significant information, one-shot AI systems should consider asking additional questions.
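The tied-candidates point can be sketched numerically. Below is a minimal, hypothetical illustration (the logit values and vocabulary are invented, not from any real model): when several next-token logits differ only at the sixth decimal place, the softmax probabilities are statistically indistinguishable, yet `np.argmax` still returns a single winner, determined by rounding and by its first-occurrence tie-breaking order.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a vector of logits."""
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical next-token logits for "the cat ran ___".
# Three candidates are near-ties; one is clearly worse.
vocab = ["away", "fast", "off", "home"]
logits = np.array([2.300000, 2.300001, 2.299999, 1.0])

p = softmax(logits)

# The top three probabilities agree to ~6 decimal places,
# so the "choice" among them hinges on tiny float differences
# and on argmax returning the first maximal index.
best = vocab[int(np.argmax(p))]
```

Here `best` comes out as `"fast"` only because its logit is larger by 1e-6; reordering the candidates or perturbing the logits at that scale would change the winner, which is the sense in which the outcome "comes down to rounding and sorting."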