
PumpDragn t1_j0k1tlz wrote

The short answer is that they either pay a premium for “green” energy on grids where it is readily available, or offset their usage by funding green energy projects across the nation to make these claims.

Is every watt used from a green source? Definitely not. Are they really offsetting enough to be truly carbon neutral? Hard to say. That is the goal though.
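Just to make the accounting concrete, here is a rough sketch of how that kind of “carbon neutral” claim is usually built up from consumption, the local grid mix, and purchased offsets. All of the numbers below are made up for illustration; this is not Google’s actual bookkeeping.

```python
# Illustrative offset accounting (assumed numbers, not any company's real figures).

annual_consumption_mwh = 1_000_000   # hypothetical fleet-wide electricity use
grid_carbon_intensity = 0.4          # assumed tCO2e per MWh for the local grid
green_ppa_mwh = 600_000              # MWh covered by premium "green" power purchases
offset_projects_tco2e = 180_000      # assumed credits from funded projects elsewhere

# Emissions from the share of load not covered by green power purchases
residual_emissions = (annual_consumption_mwh - green_ppa_mwh) * grid_carbon_intensity

net_emissions = residual_emissions - offset_projects_tco2e
print(f"Residual emissions: {residual_emissions:,.0f} tCO2e")
print(f"Net after offsets:  {net_emissions:,.0f} tCO2e")  # <= 0 is the "carbon neutral" claim
```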


PumpDragn t1_j0k1k35 wrote

For reals. The “best” solution to this problem is choosing the ideal location and designing the hardware to withstand wider temperature/humidity ranges.

The Dalles is pretty dry from what I understand, so they use the water to humidify AND cool the air. Building in a location where the outside air conditions are suitable for the servers year round is the best solution from a water usage standpoint.
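For a sense of scale, here is a back-of-envelope sketch of why spraying water into dry air cools it: every kilogram of water that evaporates absorbs roughly 2.26 MJ of heat. The heat load below is a number I picked for illustration, not The Dalles’ actual figure.

```python
# Rough evaporative-cooling estimate (assumed heat load, real latent heat of water).

latent_heat_mj_per_kg = 2.26   # latent heat of vaporization of water, MJ/kg
heat_load_mw = 10              # hypothetical server heat load to reject, MW

# Heat rejected per second = water evaporated per second * latent heat (MW = MJ/s)
water_kg_per_s = heat_load_mw / latent_heat_mj_per_kg
water_liters_per_day = water_kg_per_s * 86_400   # ~1 kg of water ≈ 1 liter

print(f"~{water_kg_per_s:.1f} kg/s of water, ~{water_liters_per_day:,.0f} L/day")
```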

However, there are other challenges with that, such as the availability of usable energy and of a workforce to build the data centers. If we located all data centers in these “ideal” regions, we would have latency issues for people farther away, along with reliability issues from concentrating so many assets in one location.

The Google data center next door to me uses chillers for cooling and saves water as a result. As mentioned above, the trade-off here is that they use a significant amount of electricity, which may or may not be the more environmentally friendly solution depending on the source of their energy.
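Here is a toy comparison of that trade-off. The coefficients of performance and grid intensities are assumptions I made up to show the shape of the calculation; the point is just that chillers draw more electricity, so how “green” they are depends on the grid feeding them.

```python
# Toy chiller-vs-evaporative comparison (all coefficients are illustrative assumptions).

heat_load_mw = 10
hours_per_year = 8760

# Assumed coefficients of performance (heat removed per unit of electricity)
cop_evaporative = 20.0   # mostly fans and pumps
cop_chiller = 5.0        # compressors do the heavy lifting

for grid_intensity in (0.1, 0.4, 0.8):   # tCO2e per MWh, clean -> dirty grid
    evap_mwh = heat_load_mw * hours_per_year / cop_evaporative
    chiller_mwh = heat_load_mw * hours_per_year / cop_chiller
    extra_emissions = (chiller_mwh - evap_mwh) * grid_intensity
    print(f"grid {grid_intensity} tCO2e/MWh -> chillers emit ~{extra_emissions:,.0f} tCO2e/yr more")
```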

I’m not saying they can’t do better, but minimizing environmental impact is a huge priority in modern data center design, and they are constantly evaluating and improving their designs for this reason.
