• KairuByte@lemmy.dbzer0.com · 9 months ago

    I’m actually curious about the water usage. How is it being used such that it’s completely removed from circulation? Or is it simply used once, dumped into the city’s return, and used again?

    Because water running through a datacenter… sounds like perfectly drinkable water. Maybe a little warm?

    • time_fo_that@lemmy.world · 9 months ago

      In every water-cooling loop for consumer-grade computing hardware, the water cycles through for hundreds (thousands?) of hours before servicing is needed. I think it would be pretty easy for a company with such massive resources to have some sort of small on-site water treatment facility or filtration system. Swimming pools filter their water, so why can’t data centers?

      I don’t think the water would be potable after running through that sort of hardware, because the piping probably isn’t safe for transporting drinking water, especially at high temperatures when different chemicals could leach into the water. There are also fittings, lubricants, anti-microbial additives, etc. that would further complicate things.

      • diomnep@lemmynsfw.com · 9 months ago

        Doubtful that this is using traditional per-device water cooling. I’m betting this is traditional hot aisle/cold aisle cooling with an evaporative component in the HVAC coolant loop to reduce the power requirements of the system as a whole.
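        To put rough numbers on why evaporation is what actually consumes the water (a back-of-envelope sketch in Python; the ~2,400 kJ/kg latent heat of vaporization is standard physics, but the 20 MW heat load is a made-up illustrative figure, not anything published about this facility):

        ```python
        # Back-of-envelope estimate of water consumed by an evaporative
        # (adiabatic) cooling stage. The heat load below is an illustrative
        # assumption, not a figure from any actual data center.

        LATENT_HEAT_KJ_PER_KG = 2400  # approx. latent heat of vaporization near ambient temp

        def water_evaporated_m3_per_hour(heat_load_mw: float) -> float:
            """Water consumed per hour if the full heat load is rejected by evaporation."""
            heat_kj_per_hour = heat_load_mw * 1000 * 3600  # MW -> kJ/h
            kg_per_hour = heat_kj_per_hour / LATENT_HEAT_KJ_PER_KG
            return kg_per_hour / 1000  # ~1000 kg of water per m^3

        # A hypothetical 20 MW facility rejecting all of its heat evaporatively:
        print(f"{water_evaporated_m3_per_hour(20):.0f} m^3/h")  # ~30 m^3/h
        ```

        That water leaves the site as vapor, which is the sense in which it’s “removed from circulation” rather than sent back through the city’s return.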

        Lemmy loves to shit on MS, but they are constantly innovating in efficient data centers because spending less to operate directly reduces their expenditures, which translates directly to profit on a service with fixed prices and multi-year reservations like Azure.

        You can bet that if the solution were as simple as what you suggest, they would have been doing it for years, but the thermal considerations for one machine and the thermal considerations for 100,000 machines are not the same. The #1 priority when operating that many systems is to use as little power as possible, because power is not only the biggest expense but also the primary limiting factor on the total number of systems you can host.
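        For a sense of scale (again, a sketch with illustrative numbers only; the per-server wattage and the PUE values are assumptions, not Azure figures):

        ```python
        # Rough fleet power budget. Server draw and PUE are illustrative
        # assumptions, not measured values from any real facility.

        SERVERS = 100_000
        WATTS_PER_SERVER = 500   # assumed average draw per machine
        PUE = 1.2                # power usage effectiveness: total power / IT power

        it_power_mw = SERVERS * WATTS_PER_SERVER / 1e6
        total_power_mw = it_power_mw * PUE

        print(f"IT load:    {it_power_mw:.0f} MW")     # 50 MW
        print(f"Total load: {total_power_mw:.0f} MW")  # 60 MW
        ```

        At that scale, dropping PUE from 1.5 to 1.2 frees up roughly 15 MW, which is exactly why trading some water for evaporative cooling pencils out.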