Okay, well, what I’m saying is that it depends on how many amps you’re pulling, and for how long, to determine how many watts get lost as heat. It isn’t exactly like a pipeline width, since the heat added by the current typically increases the resistance, so it becomes a bit of a feedback cycle: the longer you heat the wire, the more resistance you add, and the more resistance you add, the more you heat the wire.
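To put a rough number on that feedback, here’s a quick sketch using the standard linear temperature model for copper, R(T) = R₂₀ · (1 + α·(T − 20 °C)), with α ≈ 0.00393/°C. The function name and the example resistance value are just illustrations, not from anything above:

```python
ALPHA_CU = 0.00393  # copper temperature coefficient of resistance, per deg C

def resistance_at_temp(r_20c, temp_c):
    """Resistance of a copper conductor at temp_c, given its value at 20 C.

    Linear approximation; good enough for back-of-envelope wire heating.
    """
    return r_20c * (1 + ALPHA_CU * (temp_c - 20))

# Example: a 1.263 milli-ohm run of wire that heats up to 100 C
print(resistance_at_temp(0.001263, 100))  # ~31% higher than at 20 C
```

So a wire that climbs from room temperature to 100 °C picks up roughly a third more resistance, which in turn dissipates more heat at the same current.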
If you want to assume a continuous 130 A through a wire at a given voltage, then you can calculate the voltage drop, but if you sustain that load you’ll keep adding heat and increasing the resistance until, in the extreme, you melt the conductor. Also, one thing we haven’t discussed yet but that’s relevant: battery amps vs. motor amps. Typically you’ll have much higher motor amps than battery amps due to the duty cycle — with a BLDC motor the voltage is switched on/off, on/off, so you get bursts of high amperage as voltage is applied to the coils.
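The battery-vs-motor-amps relationship above can be sketched to first order: under PWM, average battery current is roughly motor (phase) current times the duty cycle. This ignores switching losses and ripple, and the numbers here are made-up examples, not measurements:

```python
def battery_current(motor_amps, duty):
    """First-order battery-side current for a PWM-driven motor.

    motor_amps: phase current during the 'on' portion of the cycle
    duty: fraction of the time the voltage is applied (0.0 to 1.0)
    Ignores switching losses and current ripple.
    """
    return motor_amps * duty

# 130 A at the motor, 40% duty cycle -> only ~52 A from the battery
print(battery_current(130, 0.4))  # 52.0
```

That’s why a controller can report 130 A of motor current while the battery only sees a fraction of it.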
Assuming 50 V and 14 AWG (2.525 Ω per 1000 ft), and 6 inches of phase wire: 0.5 ft / 1000 ft × 2.525 Ω = 0.001263 Ω, or 1.263 mΩ. At 130 A you’d have 130² × 1.263 mΩ ≈ 21 W bled off in the wire (let me know if I botched the math somewhere or used the wrong equation here… very possible).
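Here’s that same calculation as a small script, so anyone can swap in their own gauge, length, or current and check me (the 2.525 Ω/1000 ft figure for 14 AWG is from the post above; everything else is just P = I²R):

```python
OHMS_PER_1000FT_14AWG = 2.525  # 14 AWG copper, from standard wire tables

def i2r_loss_watts(amps, length_ft, ohms_per_1000ft):
    """Power dissipated in a wire run as heat, via P = I^2 * R."""
    resistance = (length_ft / 1000.0) * ohms_per_1000ft
    return amps ** 2 * resistance

# 6 inches (0.5 ft) of 14 AWG phase wire at 130 A
print(i2r_loss_watts(130, 0.5, OHMS_PER_1000FT_14AWG))  # ~21.3 W
```

So the ~21 W figure checks out, at least at room-temperature resistance — it only goes up as the wire heats.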
Anyway, all the calcs and numbers aside, getting some real-world data on the temperature and the actual amp draw will help more than any amount of academic argument.