
maurymarkowitz t1_ixucdsb wrote

These stories come up every few years and then disappear again for the simple reason that it’s the dumbest idea in history.

Losses in transmission are about 50%. Losses due to nighttime are 50%. Immediately the advantages are not so obvious… losses due to weather and all other downstream effects on Earth are about another 50%. Losses due to increased "weathering" in space, which is an extremely hostile environment, are also about 50% (actually more). So in the end, flying the panel into space instead of the Mojave gets you the same amount of lifetime power but costs you hundreds or thousands of times more.
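Because the losses multiply, the comparison is easy to sketch. Using the ballpark 50% figures above (these are the comment's rough numbers, not measured values), a minimal sanity check:

```python
# Rough lifetime-output comparison using the comment's ballpark 50% figures.
# All values are fractions of output retained, multiplied together.

# Space-based panel: sun 24/7, but microwave transmission loses ~half,
# and space "weathering" (radiation, thermal cycling) loses ~half again.
space = 1.0 * 0.5 * 0.5   # transmission * space degradation

# Ground-based panel: night halves output, weather/atmosphere halve it again.
ground = 1.0 * 0.5 * 0.5  # night * weather

print(space, ground)  # both come out to 0.25: same lifetime power per panel
```

Same net fraction either way, so the only remaining difference is launch cost.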

Now let’s talk ground space. The article states it uses less, but that’s only on rainbow unicorn planet. Here on Earth, where microwave radiation is strictly controlled, the maximum allowed power density is about 100 times less than sunlight, so the land usage is 100 times as large. That limit is controlled by the ITU among others, and they are already on record stating they will not allow any changes, because doing so would wipe out space- and ground-based transmissions across a wide band. Consider all the arguing about low-power 5G towers near airports, and then multiply the power level 100 times…
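To see what that density limit means for land area, here is an illustrative back-of-the-envelope calculation. The ~1000 W/m² sunlight figure is a standard round number; the 100x-lower microwave limit is the comment's claim; the 100 MW plant size is a hypothetical chosen just for scale:

```python
# Back-of-the-envelope rectenna land area under the comment's claimed limit.
sunlight_w_per_m2 = 1000.0                      # typical peak surface sunlight
microwave_w_per_m2 = sunlight_w_per_m2 / 100.0  # claimed allowed density: 10 W/m^2

plant_watts = 100e6                             # hypothetical 100 MW delivered
area_m2 = plant_watts / microwave_w_per_m2      # receiving area required
area_km2 = area_m2 / 1e6

print(area_km2)  # 10.0 km^2 of rectenna for a single 100 MW plant
```

Under those assumptions a modest 100 MW receiver needs about 10 km², versus roughly 0.1 km² for the same power collected at full sunlight density.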

This is never going to happen, and everyone who’s looked at it knows this, but the space nerds keep saying it’s the next sliced bread.
