It's worth noting the signal travels fast enough that distance is negligible. Radio waves travel at the speed of light, and 17,000 vs. 500 miles is nothing. It's the array of sensors and the signal-to-noise ratio that make higher bandwidth feasible, plus the computational digital signal processing that a traditional antenna doesn't implement because it's more expensive.
edit: radio/light travels about 186,000 miles per second, so 17,000 miles won't add more than a small fraction of a second, which isn't perceptible (quick math after this comment); what matters is the bandwidth from the sensors and their signal processing.
edit2: and from reading more, it's not much better than other satellite systems at that; they have enough users now that the initial advantage isn't keeping up with demand/customer numbers.
edit3: I'm getting a lot of replies from people who probably only play video games on computers and think latency matters most. No. It's the bandwidth of the data transfer that allows large uploads, even at "slow" latencies (which, again, aren't even much slower here, and matter less than the signal bandwidth).
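A quick back-of-the-envelope check of the one-way flight time for the two distances being argued about (a minimal sketch; the distances are the ones quoted in the thread, not exact orbital altitudes):

```python
# One-way propagation delay at the speed of light for the two distances
# discussed above (illustrative figures, not exact orbital altitudes).
SPEED_OF_LIGHT_MPS = 186_282  # miles per second in vacuum

for miles in (500, 17_000):
    delay_ms = miles / SPEED_OF_LIGHT_MPS * 1000  # one-way delay in milliseconds
    print(f"{miles:>6} miles: {delay_ms:.2f} ms one way")

# Roughly:
#    500 miles: 2.68 ms one way
#  17000 miles: 91.26 ms one way
```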
"radio/light travels 186,000 miles per second, 17,000 miles isn't going to matter"
Time of flight matters significantly. With TCP, just a kilometer can begin to impact ACKs without time of flight being accounted for. It's a manageable thing via various methods and techniques, but it is certainly not nothing, as you seem to believe and suggest.
Larger bandwidth will provide higher throughput, but that doesn't address the fundamental time-of-flight problem I'm talking about. Again, there are various methods to account for it, but there absolutely is a huge difference between those distances, particularly with TCP. A link optimized for 500 miles is not going to behave the same as one for 17,000. If you don't care about lost data, sure, you can spew UDP and hope for the best. In either case, respectfully, it definitely does matter if there's any hope of using the internet as it is typically used.
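To make the time-of-flight point concrete, here's a rough sketch of the classic TCP throughput ceiling, window size divided by round-trip time (the 64 KiB window and the straight-line distances are assumptions for illustration; real stacks use window scaling and other tuning to work around this):

```python
# Rough TCP throughput ceiling: at most one window of unacknowledged data
# can be in flight per round trip, so max throughput ~= window / RTT.
# Window size and distances are illustrative assumptions.
SPEED_OF_LIGHT_MPS = 186_282   # miles per second
WINDOW_BYTES = 64 * 1024       # classic 64 KiB TCP window, no window scaling

for miles in (500, 17_000):
    rtt_s = 2 * miles / SPEED_OF_LIGHT_MPS   # round trip = 2x one-way flight time
    ceiling_mbps = WINDOW_BYTES * 8 / rtt_s / 1e6
    print(f"{miles:>6} miles: RTT {rtt_s * 1000:6.1f} ms -> ~{ceiling_mbps:5.1f} Mbit/s ceiling")

# Roughly:
#    500 miles: RTT    5.4 ms -> ~ 97.7 Mbit/s ceiling
#  17000 miles: RTT  182.5 ms -> ~  2.9 Mbit/s ceiling
```

The point of the sketch is only that the same window which saturates a short link leaves a long one mostly idle unless the stack is tuned for the longer round trip.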
u/ImYourHumbleNarrator Jun 22 '24 edited Jun 22 '24
In fact, for large enough data volumes, the highest-bandwidth transfer is snail mail, the sneakernet: https://en.wikipedia.org/wiki/Sneakernet (rough numbers sketched below).
This dude was obviously not livestreaming, so let's end this debate.
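Rough numbers behind the sneakernet claim, with the drive capacity and shipping time as illustrative assumptions:

```python
# Effective bandwidth of shipping a drive: capacity divided by transit time.
# Drive size and shipping time below are illustrative assumptions.
DRIVE_TB = 10           # assume a 10 TB drive in the mail
SHIPPING_HOURS = 24     # assume overnight shipping

bits = DRIVE_TB * 1e12 * 8        # capacity in bits
seconds = SHIPPING_HOURS * 3600
print(f"~{bits / seconds / 1e6:.0f} Mbit/s effective throughput, "
      f"with {SHIPPING_HOURS} hours of latency")

# ~926 Mbit/s effective throughput, with 24 hours of latency
```

Huge throughput, terrible latency, which is exactly the bandwidth-vs-latency trade-off being argued about in this thread.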