> > In 1953, Enrico Fermi criticized Dyson’s model by quoting Johnny von Neumann: “With four parameters I can fit an elephant, and with five I can make him wiggle his trunk.”

For those who are interested, you can watch Freeman Dyson recount this conversation in his own words in an interview: https://youtu.be/hV41QEKiMlM
Adding any finite number of parameters is strictly better than adding an infinity of parameters (i.e. an arbitrary distribution of dark matter chosen to match the observations).
It tends to be a parameter that can be derived from reasoning and assumptions. This contrasts with free parameters, where you say "and we have no idea what this value should be, so we'll measure it."
> It's missing a mechanism to explore, search and expand its experience.

Can't we create an agent system that can search the internet and choose what data to train itself with?
I mean, the devil is in the details. In reinforcement learning, the target moves! In deep learning, you often do things like early stopping to prevent too much optimization.
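The early-stopping idea the comment mentions can be sketched in a few lines. This is a minimal, generic sketch, not any particular library's API; `train_step` and `val_loss` are hypothetical callbacks standing in for a real training loop and a held-out evaluation.

```python
def train_with_early_stopping(train_step, val_loss, max_epochs=100, patience=5):
    """Stop once validation loss hasn't improved for `patience` epochs.

    `train_step` and `val_loss` are hypothetical callbacks: one training
    pass and one held-out evaluation, respectively.
    """
    best = float("inf")
    bad_epochs = 0
    for _ in range(max_epochs):
        train_step()        # one pass over the training data
        loss = val_loss()   # evaluate on held-out data
        if loss < best:
            best, bad_epochs = loss, 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                break       # further optimization would likely overfit
    return best
```

The point is exactly the comment's: you deliberately stop optimizing before the training objective is fully minimized, because past some point you are fitting noise rather than signal.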
The number of parameters is just the wrong metric; it should be the amount of information contained in the parameter values: their entropy, Kolmogorov complexity, or something along those lines.
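A toy illustration of this point (my own construction, not from the comment): two "models" with the same parameter count can carry wildly different amounts of information, here measured as the empirical entropy of their quantized parameter values.

```python
import math
from collections import Counter

def entropy_bits(params, precision=3):
    """Empirical entropy (bits per parameter) of values rounded
    to `precision` decimal places."""
    quantized = [round(p, precision) for p in params]
    n = len(quantized)
    counts = Counter(quantized)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Both "models" have 1000 parameters...
simple = [0.5] * 1000                                # all identical
rich = [math.sin(i * 0.7918) for i in range(1000)]   # arbitrary-looking values

# ...but `simple` needs ~0 bits per parameter to describe,
# while `rich` needs many.
```

By parameter count alone the two are identical; by description length they are not, which is the comment's objection to counting parameters.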
I wish there were more humor on arXiv.

If I could make a discovery in my own time, without using company resources, I would absolutely publish it in the most humorous way possible.
Joke titles and/or author lists are also quite popular, e.g. the Greenberg, Greenberger, Greenbergest paper[1], a paper with a cat coauthor whose title I can’t seem to recall (but I’m sure there’s more than one I’ve encountered), or even the venerable, unfortunate in its joke but foundational in its substance Alpher, Bethe, Gamow paper[2]. Somewhat closer to home, I think computer scientist Conor McBride[3] is the champion of paper titles (entries include “Elimination with a motive”, “The gentle art of levitation”, “I am not a number: I am a free variable”, “Clowns to the left of me, jokers to the right”, and “Doo bee doo bee doo”) and sometimes of code in papers. (Yes, it’s working code; yes, it’s crystal clear in the context of the paper.)

[1] https://arxiv.org/abs/hep-ph/9306225
[2] https://en.wikipedia.org/wiki/Alpher%E2%80%93Bethe%E2%80%93G...
IIUC:

A real-parameter Fourier series, r(theta) = sum_k r_k cos(k theta), can only draw a "wiggly circle": a figure with one point on each radial ray from the origin. A complex-parameter series, z(theta) = sum_k z_k e^(i k theta), can draw more squiggly figures (epicycles): the pen can backtrack as the drawing arm rotates, as each parameter can move the point somewhere on a small circle around the point computed from the previous parameters (and recursively). Obligatory 3B1B: https://m.youtube.com/watch?v=r6sGWTCMz2k

Since a complex parameter is 2 real parameters, we should compare the best 4-cosine curve to the best 2-complex-exponential curve.
I love the ironic side of the article. Perhaps they should add the reason for it, from Fermi's and von Neumann's perspective. When you are building a model of reality in physics, if something doesn't fit the experiments, you can't just add a parameter (or more), vary it, and fit the data. Ideally the model should have zero parameters, or the fewest possible; at an even deeper level, the parameters should emerge naturally from some simple assumptions. With 4 parameters you don't know whether you are really capturing a true aspect of reality or just fitting the data of some experiment.