Currently there are five power levels:
0: 1-3 eU/t, Tin wire
1: 1-32 eU/t, Copper wire (1x insulation)
2: 128 eU/t, Gold wire (2x insulation)
3: 512 eU/t, Glass/Diamond wire (3 rubber of insulation)
4: 2048 eU/t, Refined Iron (HV)
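For later calculations, the tier list above can be jotted down as a small table (just a sketch; `TIERS` is my own name, the values are copied from the list, and I take tier 0 at its 3 eU/t upper bound):

```python
# Power tiers as listed above: tier -> (max eU/t, cable type).
TIERS = {
    0: (3, "Tin wire"),
    1: (32, "Copper wire (1x insulation)"),
    2: (128, "Gold wire (2x insulation)"),
    3: (512, "Glass/Diamond wire (3x insulation)"),
    4: (2048, "Refined Iron (HV)"),
}

for tier, (max_eu, cable) in TIERS.items():
    print(f"Tier {tier}: up to {max_eu} eU/t over {cable}")
```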
According to the documentation on the wiki/forums, the corresponding lossless distances (loss being determined /only/ by cable type and run length) include, for example:
3: 19 or 39 blocks (different sources give different values)
However, in the real world, using a higher grade of cable only decreases losses for the same signal over the same run length. So I would expect a more normalized result metric, of the form tier: lossless blocks @ packet size:
0: 1.25 blocks @ 32 eU/t *
1: 4 blocks @ <= 32 eU/t
2: 8 blocks @ <= 32 eU/t
3: 128 blocks @ <= 32 eU/t **
4: 64 blocks @ <= 32 eU/t
2: 2 blocks @ 128 eU/t
3: 64 blocks @ 128 eU/t **
4: 16 blocks @ 128 eU/t
3: 19 blocks @ 512 eU/t ** (same as the current value?)
4: 4 blocks @ 512 eU/t
4: 1 block @ 2048 eU/t
* Except this cable would burn/blow up first.
** Uses such a costly component in its construction.
In other words, I think the loss metric should be relative to packet size, and transferring a smaller packet should benefit proportionally to the reduction in packet size. (So transferring 32 eU/t over 128 eU/t-rated cable would get 4x the blocks-before-loss distance; 1 eU/t (a solar panel, for example) would get 128x the distance.)
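The proposed scaling rule can be sketched as a small helper (a sketch only; `base_distance`, `rated_eu`, and `packet_eu` are my own names, and the numbers come from the parenthetical example above, not from the mod itself):

```python
def lossless_distance(base_distance: float, rated_eu: int, packet_eu: int) -> float:
    """Blocks travelled before the first eU of loss, assuming the
    lossless distance scales inversely with packet size relative to
    the cable's rated packet (the rule proposed above)."""
    if packet_eu > rated_eu:
        raise ValueError("packet exceeds cable rating (cable would burn)")
    return base_distance * rated_eu / packet_eu

# A 128 eU/t-rated cable with a 1-block lossless run at full load:
print(lossless_distance(1, 128, 32))  # 32 eU/t packet -> 4.0 (4x the distance)
print(lossless_distance(1, 128, 1))   # 1 eU/t (solar panel) -> 128.0 (128x)
```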