The language is a bit cryptic and Torque / s is not a physically meaningful quantity. But if I understand correctly what you wanted to say, then you are right: the faster you go, the more power and torque you need (duh). You can use the usual calculators to estimate what motor specs you need for your case:
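If you want to do the estimate yourself instead of using a calculator, here is a minimal sketch of the physics behind them: power to hold a speed on flat ground is (rolling resistance + aero drag) times speed. All the numbers (mass, Crr, CdA) are illustrative assumptions, not values from this thread.

```python
# Rough estimate of the mechanical power needed to hold a given speed
# on flat ground. All parameter values are illustrative assumptions.

RHO = 1.2   # air density, kg/m^3
G = 9.81    # gravity, m/s^2

def power_needed(v, mass=100.0, crr=0.015, cda=0.6):
    """Power [W] to overcome rolling resistance and aero drag at speed v [m/s]."""
    f_roll = crr * mass * G           # rolling resistance force [N]
    f_drag = 0.5 * RHO * cda * v**2   # aerodynamic drag force [N]
    return (f_roll + f_drag) * v      # P = F * v

for kmh in (20, 30, 40):
    v = kmh / 3.6
    print(f"{kmh} km/h -> {power_needed(v):.0f} W")
```

This is only the mechanical output power at the wheel; divide by motor and drivetrain efficiency to size the electrical input.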
But I don’t really understand how you connect it to gear reduction. The way it actually connects: if you are not limited by the specs of your components, the gear ratio doesn’t matter for your example (speaking from a physics point of view), because a lower reduction just makes the motor deliver the same acceleration at a higher torque. The output power is the same, though, because the motor’s rpm is correspondingly lower:
P = F * v = T_wheel * w_wheel (T is torque, w is angular velocity)
T_wheel = R * T_motor (R is the gear reduction)
w_wheel = w_motor / R
P = F * v
= T_wheel * w_wheel
= R * T_motor * w_motor / R
= T_motor * w_motor
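You can check the cancellation numerically. The sketch below uses the usual convention that a reduction R multiplies wheel torque and divides wheel speed (the product cancels either way round); the operating point numbers are arbitrary.

```python
# Numeric sanity check: wheel power is independent of the gear reduction R.
# Convention: a reduction R multiplies torque and divides angular velocity.

def wheel_power(t_motor, w_motor, r):
    t_wheel = r * t_motor   # wheel torque [Nm]
    w_wheel = w_motor / r   # wheel angular velocity [rad/s]
    return t_wheel * w_wheel

# same motor operating point (2 Nm at 300 rad/s), three different reductions
for r in (1.0, 2.5, 4.0):
    print(f"R = {r}: P = {wheel_power(t_motor=2.0, w_motor=300.0, r=r):.1f} W")
```

Every line prints the same power, exactly as the algebra above says.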
As you can see, the power is constant and the gear reduction cancels out. But in reality that’s not what matters, because what your calculation leaves out is that motor efficiency is lower at low rpm and low torque load. That’s why a high gear reduction is more efficient: lower motor efficiency means the INPUT power goes up for the same output power. It has nothing to do with friction, just motor efficiency. So get the highest possible gear reduction / lowest KV that still gives you your desired max speed. Then you are most efficient.
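The "highest reduction that still reaches your desired max speed" rule is easy to compute: the motor's no-load rpm (roughly KV × battery voltage) divided by the wheel rpm needed at top speed gives the largest usable reduction. A sketch, where KV, voltage, wheel diameter and target speed are all assumed example numbers:

```python
import math

# Sketch: largest gear reduction R that still reaches the desired top speed.
# All input numbers below are assumed examples, not values from the thread.

def max_reduction(kv, volts, wheel_d_m, v_max_kmh):
    """Largest R such that the wheel can still spin fast enough at full throttle."""
    motor_rpm = kv * volts                                   # no-load rpm (optimistic)
    wheel_rpm_needed = (v_max_kmh / 3.6) / (math.pi * wheel_d_m) * 60
    return motor_rpm / wheel_rpm_needed

# e.g. 190 KV motor on 36 V, 90 mm wheel, 40 km/h target
print(f"max reduction: {max_reduction(190, 36, 0.09, 40):.2f}")
```

In practice you would leave some margin below this, since the motor sags below its no-load rpm under load.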
No, the opposite. Hub motors are usually inefficient because their KV is too high. They are effectively “geared” to run at 60+ km/h, because you can only fit so much copper into a motor of a given size, which puts a lower limit on the KV. This is the reason why you want a gear reduction for a BLDC motor in the first place.
If you run the hub motors at a sufficiently low torque load (large motors, many motors, not too many hills or aggressive accelerations, …) the overall efficiency of a hub setup can even be higher, because the lower rolling friction (no belts) makes up for the lower motor efficiency.
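The trade-off is just a product of two efficiencies: motor efficiency at the operating point times drivetrain efficiency. A back-of-the-envelope comparison, where all percentages are made-up illustrative assumptions:

```python
# Back-of-the-envelope: overall efficiency = motor_eff * drivetrain_eff.
# All percentages below are illustrative assumptions, not measurements.

setups = {
    "hub":    {"motor_eff": 0.80, "drive_eff": 1.00},  # direct drive, no belt losses
    "belted": {"motor_eff": 0.90, "drive_eff": 0.87},  # better motor point, belt losses
}

for name, s in setups.items():
    overall = s["motor_eff"] * s["drive_eff"]
    print(f"{name}: {overall:.3f}")
```

With these example numbers the hub setup comes out ahead overall despite the worse motor efficiency; with a heavy torque load (hills, acceleration) the hub motor's efficiency drops and the balance flips.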