bob1029 3 days ago

I was recently struggling to find a good way to randomly construct a well-connected, recurrent topology of neurons until I encountered percolation theory.

There is a basic natural-log scaling rule (roughly ln(n) connections per element for a network of n elements) that essentially guarantees a well-connected topology, even with random connections, as long as each element is assigned at least that minimum number of connections.

The required fanout at each order of magnitude of network size goes something like:

  10:              ~3 connections
  100:             ~5 connections
  1,000:           ~7 connections
  10,000:          ~10 connections
  100,000:         ~12 connections
  1,000,000:       ~14 connections 
  100,000,000,000: ~26 connections
I've been able to avoid a lot of complicated code by leveraging this.
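
To make that concrete, here's a minimal sketch of the wiring rule (not my actual code, just the idea): give every node roughly ln(n) random outgoing links, then check with a BFS that everything lands in one component.

  import math
  import random
  from collections import deque

  def wire(n, seed=0):
      """Assign each node ceil(ln n) distinct random outgoing links."""
      rng = random.Random(seed)
      fanout = math.ceil(math.log(n))
      adj = {}
      for i in range(n):
          targets = set()
          while len(targets) < fanout:
              j = rng.randrange(n)
              if j != i:
                  targets.add(j)
          adj[i] = targets
      return adj

  def weakly_connected(adj):
      """BFS following links in either direction; one component == well connected."""
      und = {i: set() for i in adj}
      for i, targets in adj.items():
          for j in targets:
              und[i].add(j)
              und[j].add(i)
      seen, todo = {0}, deque([0])
      while todo:
          for j in und[todo.popleft()]:
              if j not in seen:
                  seen.add(j)
                  todo.append(j)
      return len(seen) == len(adj)

  # ~12 links per node at n = 100,000; this comes back True essentially every time.
  print(weakly_connected(wire(100_000)))
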
  • C-x_C-f 3 days ago

    What do you mean by a well-connected topology? If you mean that you can reach every neuron from any neuron, then the number of connections you need is asymptotically n log n / 2 (not up to a constant factor or anything, just n log n / 2 on the nose; it's a sharp threshold), see [0]. In general, when percolation is done on just n nodes without extra structure, it's called the Erdős–Rényi model [0], and most mathematicians simply call it "the" random graph model.

    [0] https://en.wikipedia.org/wiki/Erd%C5%91s%E2%80%93R%C3%A9nyi_....
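
    A quick way to see how sharp the threshold is: sample G(n, M) graphs around M = n log n / 2 edges and count how often they come out connected. A rough sketch, assuming networkx is available (not tied to anything above):

      import math
      import networkx as nx

      n = 2000
      threshold = n * math.log(n) / 2   # ~7601 edges for n = 2000
      for factor in (0.5, 1.0, 1.5):
          m = int(factor * threshold)
          hits = sum(nx.is_connected(nx.gnm_random_graph(n, m, seed=s)) for s in range(50))
          print(f"{m} edges ({factor:.1f}x threshold): connected in {hits}/50 trials")

    Well below the threshold the graphs are essentially never connected, well above it they almost always are, and right at n log n / 2 the limiting probability of connectivity is 1/e ≈ 0.37.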

esafak 3 days ago

Relevant to the study of phase transitions in machine learning; e.g., https://openreview.net/forum?id=0pLCDJVVRD

  • mindcrime 2 days ago

    Thanks for sharing that! A high-level (not at all fleshed out) notion that percolation could have some applications to AI/ML is what led me to look into this in the first place. :-)

physicsguy 3 days ago

I loved studying this stuff as an undergrad. It was one of my favourite courses.