Tuesday, June 20, 2023

13. Fiber & Satellite: Latency in real world applications

 In practice, the actual latency, or ping, experienced when transmitting data from one place to another is higher than just the time it takes for light to travel the distance. This is due to a number of additional factors including routing, processing delays at each network device, the quality of the transmission medium, and more.

When data is transmitted through a fiber optic cable, it isn't simply sent directly from one place to another. It must pass through various devices such as switches, routers, and servers, and each of these devices takes a certain amount of time to process the data. The sum of these processing delays and the propagation time over the physical distance is what we experience as latency.

Transatlantic fiber optic cables, such as those connecting London and New York, generally achieve real-world round-trip latencies of around 70-75 milliseconds, or roughly 35 milliseconds each way. This is partly because the actual path of the fiber optic cables is longer than the direct 'as-the-crow-flies' distance, and partly because of the routing and processing delays described above.

For geostationary satellites, the latency is much higher due to the greater distance the signal must travel. A typical round-trip latency for geostationary satellite communication is around 500-600 milliseconds. This includes the time for the signal to travel up to the satellite, for the satellite to process and retransmit the signal, and for the signal to travel back down to Earth.

Again, these are typical values and actual latency can vary based on a number of factors including the specific equipment being used, the quality of the transmission medium, and the amount of network congestion.

TLDR; In reality the link latency differs depending on which fiber and which route is taken. For satellites, 500 ms is a pretty good average for geostationary orbit.

12. Fiber & Satellite Latency

When calculating how long it will take for light or any electromagnetic wave to travel a certain distance, we generally use the speed of light, which is approximately 299,792 kilometers per second in vacuum. However, the speed of light slows down when passing through different media such as glass or fiber optic cables. In fiber optic cables, the speed of light is about two-thirds of its speed in a vacuum, or about 200,000 kilometers per second.

For the purpose of this response, let's look at the following distances:

The great-circle distance from London to New York is approximately 5,585 kilometers.

The average altitude of a geostationary satellite is about 35,786 kilometers above the Earth's equator.

If we are calculating for fiber optic cables:

London to New York: 5585 km / 200,000 km/sec ≈ 0.028 seconds or 28 milliseconds.

For a geostationary satellite, we need to account for the signal going up to the satellite and back down again, thus:

London to satellite and back: (35,786 km * 2) / 299,792 km/sec ≈ 0.239 seconds or about 239 milliseconds.
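These two back-of-the-envelope figures can be reproduced with a short sketch, using the distances and speeds quoted above (the constant and function names are my own):

```python
# Rough propagation delays for the examples in this post, assuming
# 5,585 km London-New York and a 35,786 km geostationary altitude.

C_VACUUM_KM_S = 299_792   # speed of light in vacuum, km/s
C_FIBER_KM_S = 200_000    # roughly 2/3 of c inside optical fiber, km/s

def delay_ms(distance_km: float, speed_km_s: float) -> float:
    """Propagation delay in milliseconds for a given path length."""
    return distance_km / speed_km_s * 1000

fiber_one_way = delay_ms(5_585, C_FIBER_KM_S)          # about 28 ms
sat_round_trip = delay_ms(35_786 * 2, C_VACUUM_KM_S)   # about 239 ms

print(f"Fiber London-New York, one way: {fiber_one_way:.1f} ms")
print(f"Geostationary satellite, up and down: {sat_round_trip:.1f} ms")
```

This covers propagation only; the processing and routing delays mentioned below are on top of these numbers.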

These are very rough estimates and only cover the travel time of light or an electromagnetic wave. They do not account for any additional delays that might occur due to the processing and retransmission of signals at each end, or routing delays as signals pass through various network devices. For instance, it takes more than just the propagation delay for a signal to travel through a fiber optic cable - there are also delays associated with processing and routing the data.

Lastly, it's also important to note that the routing of data over the Internet often doesn't follow the most direct geographical path. For instance, data traveling from London to New York might not take a direct path, but instead might be routed through various Internet backbone networks, potentially increasing the travel distance and time. Similarly, signals to geostationary satellites aren't sent straight up, but rather must be sent at an angle, further increasing the travel distance.


TLDR; from the distance and the speed of light (and other electromagnetic waves) alone, the London to New York fiber link takes about 28 ms one way in theory, and because satellites sit in geostationary orbit, distance alone gives about 239 ms for the round trip.


Sunday, June 18, 2023

11. Power from wifi networks, so where does it actually go?

Radio waves, such as those used by WiFi networks, are a form of electromagnetic radiation. When your WiFi router broadcasts a signal, it's essentially sending out electromagnetic waves into the surrounding environment. These waves spread out in all directions and, in an ideal scenario, would be picked up by devices that are designed to receive such signals, such as your laptop or smartphone.

However, if there are no devices to receive the signal, or if the signal extends beyond the range of any devices, it doesn't simply vanish. The Inverse Square Law mentioned earlier comes into play here as well. As the waves spread out, their intensity decreases with the square of the distance from the source. In other words, the further away you go from the router, the weaker the WiFi signal becomes.

Over time and distance, the energy of these waves is dispersed and absorbed by the environment – walls, furniture, and even the air itself. The absorbed energy typically turns into a small amount of heat, but the amount is so minimal it would be virtually impossible to measure with everyday tools.

This energy doesn't exactly go to waste, because it was never meant to be recovered. It served its purpose: to carry information wirelessly over a distance. Even if no device picks up the signal, that doesn't negate the purpose of the energy expenditure.

Moreover, while the concept of harnessing energy from WiFi signals or other ambient electromagnetic waves might seem appealing, it's not currently feasible as a significant power source. The energy present in these signals is extremely low. An MIT study found that, even in an area with a strong WiFi signal, the total power available to be harvested is about 100 microwatts - far too little to charge a modern smartphone.

Thus, while our environment is full of energy in various forms, not all energy is easily or practically harvestable. Understanding this can help us better focus our efforts on realistic and sustainable ways to generate and save energy.

TLDR; the energy from wifi signals is eventually turned into heat. There is no magical place where all that wifi energy is stored, waiting for you to tap into it. Wifi signals are not a very good way to transmit energy across a distance.

10. Free power from thin air

The Reality Check: Conservation of Energy and Inverse Square Law - Why We Can't Harness Unlimited Free Energy From Thin Air

The quest for unlimited, cost-free energy is as old as civilization itself. A world powered by abundant and free energy seems like a utopia - a world where the shackles of limited resources don't restrict progress, where technology advances without leaving a carbon footprint, and where energy poverty is a thing of the past. Yet, despite technological advancements, this dream seems as elusive as ever. Two significant scientific principles stand in our way - the Conservation of Energy and the Inverse Square Law. In this post, we'll delve into how these laws of physics prevent us from extracting unlimited free energy from thin air.

The Barrier of Conservation of Energy

The principle of Conservation of Energy, often attributed to the work of physicist Hermann von Helmholtz, states that energy cannot be created or destroyed; it can only be transferred or transformed from one form to another. This is a fundamental law of nature and serves as the foundation of all physics and engineering disciplines.

To understand why it poses a problem for our quest for free energy, consider this: when you push a car, you use your muscular energy to set it in motion. As the car moves, this energy is converted into kinetic energy (the energy of motion). But once the car stops, where does the energy go? It transforms into heat due to friction and sound due to the vibrations caused. At no point does the energy disappear; it simply changes form.

In the context of energy extraction from the air, this law implies that the 'free' energy has to come from somewhere. Air molecules do have kinetic energy due to their motion, but harnessing this energy would slow them down, reducing their temperature. This could have significant and potentially catastrophic climatic effects.

The Hurdle of Inverse Square Law

The Inverse Square Law, which governs a wide range of physical phenomena from gravity to electromagnetism, states that a specified physical quantity or intensity is inversely proportional to the square of the distance from the source of that physical quantity.

The intensity of sunlight, for example, falls off with the square of the distance from the Sun. The solar power reaching Earth is therefore finite: large, but fixed by our distance from the Sun and the area we can cover with collectors.

If we consider harnessing energy from the electromagnetic fields in the air (a proposal by some 'free energy' proponents), we run into the Inverse Square Law. The strength of the electromagnetic field reduces rapidly with distance - the amount of energy you can extract from the air diminishes dramatically as you move away from the source.

Conclusion

While the dream of free, unlimited energy is an admirable one, our current understanding of physics, as defined by principles like the Conservation of Energy and the Inverse Square Law, restricts our ability to extract infinite energy from the air.

However, it's important to remember that these laws don't doom us to a future of energy scarcity. Rather, they guide us toward more realistic, sustainable solutions. The laws of physics may constrain us, but they also challenge us to innovate. After all, the greatest achievements of humanity often come when we work with nature's laws, not against them.

To tackle the world's energy challenges, we must focus our efforts on improving efficiency, reducing energy waste, and developing renewable energy sources that work harmoniously with our planet's natural systems. This is our most reliable route towards a sustainable energy future.


Friday, June 16, 2023

5. Conservation of energy

How does conservation of energy have to be taken into account when trying to receive wifi power?

In terms of wireless communication, the conservation of energy principle states that the energy transmitted by the source (in this case, your Wi-Fi transmitter) is spread across a certain area (defined by the transmission range), and the energy captured by the receiver (your antenna) depends on its location within that range, its gain, and the frequency of the signal, among other factors.

When a signal is transmitted wirelessly, it radiates outwards from the source in a spherical pattern (assuming an isotropic antenna). This spreading means the energy is distributed across an ever-increasing area as the distance from the source increases. As a result, the energy available to any receiver diminishes with the square of the distance from the source, according to the inverse-square law.
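A minimal sketch of that spreading, assuming a hypothetical isotropic 100 mW transmitter (the figure and function name are mine, chosen for illustration):

```python
import math

def power_density_mw_per_m2(tx_power_mw: float, distance_m: float) -> float:
    """Inverse-square law: transmitted power spread over a sphere of area 4*pi*d^2."""
    return tx_power_mw / (4 * math.pi * distance_m ** 2)

# Power density at a few distances from the transmitter.
for d in (1, 2, 10):
    print(f"{d:>2} m: {power_density_mw_per_m2(100, d):.4f} mW/m^2")
```

Doubling the distance quarters the power density, which is the inverse-square law in action.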

From the perspective of energy conservation, it's impossible for one or more receivers to capture more energy than what was initially transmitted. Even if you had an infinite number of antennas, they couldn't capture more energy than what was transmitted in the first place.

Furthermore, a large part of the transmitted energy is lost as it propagates through space, due to factors such as absorption by objects and the atmosphere, reflection, diffraction, and scattering. All of these factors contribute to the path loss of the signal.

Thus, while you could theoretically add more antennas to try to capture more of the signal, the practical limitations due to interference between antennas, the spread of the signal, and other factors make it impractical to try to capture all the transmitted energy. Instead, efforts are usually focused on improving the efficiency and sensitivity of the receiver, and optimizing the placement and type of antennas, to capture as much of the signal as possible within the practical constraints.

I had this awesome original idea, just to be stomped by the laws of physics and conservation of energy. Why does the real physical world do this? Why can't I just get free energy everywhere? It would be so much easier.



Thursday, June 15, 2023

4. 100,000 antennas for maximum wifi power

At 10 meters away, how many receiving antennas do I need to setup to get that full 100mW power all back to me?

First, it's important to understand that receiving antennas do not add up power linearly, because their effective apertures overlap and their signals can interfere with each other.

In an idealized, theoretical scenario where you could perfectly combine signals without any loss or interference, you would calculate the number of antennas needed based on the amount of power each antenna receives.

From the previous calculation, a single antenna 10 meters away would receive about 1μW (0.001mW) of power. The original power transmitted is 100mW.

Therefore, to capture the full 100mW with antennas each receiving 0.001mW, you would theoretically need 100mW / 0.001mW = 100,000 antennas.
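That division, as a sketch (purely theoretical; it ignores the interference and aperture limits that make this impossible in practice):

```python
import math

def antennas_needed(tx_power_mw: float, rx_power_mw_each: float) -> int:
    """Idealized count: total transmit power divided by what one antenna captures."""
    return math.ceil(tx_power_mw / rx_power_mw_each)

# 100 mW transmitted, 0.001 mW (1 uW) captured per antenna.
print(antennas_needed(100, 0.001))  # 100000
```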

But this is highly impractical and not feasible in a real-world situation due to the complexities of radio wave propagation, interference between antennas, and physical constraints. There's also the fact that antennas have a limit to how much they can capture based on their effective aperture, and adding more antennas will not necessarily increase the total amount of power captured if they start to interfere with each other.

In practice, it would be better to use techniques such as increasing the gain of the antennas, using directional antennas, reducing interference, or using multiple-input and multiple-output (MIMO) techniques to improve the signal quality and data rate, rather than trying to capture the full transmit power.


Wednesday, June 14, 2023

3. 10 meters away from the source

If you move to 10 meters away from the source, you substitute this new distance into the Friis Transmission Equation:

Pr = Pt + Gt + Gr + 20log10(λ/(4πd))

We previously calculated:

    • The transmitted power (Pt) as 20dBm

    • The wavelength (λ) as approximately 0.125 meters

    • The gains of the transmitting and receiving antennas (Gt and Gr) as 5dBi each

Now, the distance (d) between the antennas becomes 10 meters:

Pr = 20dBm + 5dBi + 5dBi + 20log10(0.125/(4π*10)) ≈ 20dBm + 10dB - 60dB = -30dBm

So, the received power would be around -30dBm. This value can be converted back to milliwatts using the formula Power (in mW) = 10^(Pr/10), which results in approximately 0.001mW or 1μW.
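The same calculation as a short sketch in code, using the figures from this post (the function name is my own):

```python
import math

def friis_rx_dbm(pt_dbm: float, gt_dbi: float, gr_dbi: float,
                 wavelength_m: float, distance_m: float) -> float:
    """Friis equation in dB form: Pr = Pt + Gt + Gr + 20*log10(lambda / (4*pi*d))."""
    return pt_dbm + gt_dbi + gr_dbi + 20 * math.log10(
        wavelength_m / (4 * math.pi * distance_m))

# Pt = 20 dBm, Gt = Gr = 5 dBi, wavelength 0.125 m, distance 10 m.
pr = friis_rx_dbm(20, 5, 5, 0.125, 10)
print(f"Received power: {pr:.1f} dBm")             # about -30 dBm
print(f"In milliwatts: {10 ** (pr / 10):.6f} mW")  # about 0.001 mW, i.e. 1 uW
```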

Once again, this is a theoretical calculation. In a practical scenario, you may receive more or less power due to factors such as obstructions, antenna patterns, interference, and other complexities of radio wave propagation. Your receiver also needs to be capable of receiving at such low power levels.

