This is not much of a question, more like a possible way to get some more attention to the Julia language.
Everybody is talking about how we need to save the planet and use less energy.
I think it could be a great way to promote the Julia language: compare the power consumption between, let's say, Java/JavaScript and Julia. Maybe a crazy idea, but sometimes they work.
Let's say a benchmark shows savings of as little as 10% (I believe it's more). There are a lot of servers out there, and a switch from resource-hungry languages to Julia could contribute to saving energy.
So if Julia could save a lot of Watts, Julia is also a "Green" language and people should hear about it.
I don't know, it's just a dream; I hope it will come true.
I wonder if Julia is not actually worse than others in this area: the same code is compiled over and over again on countless computers every day, contrary to other languages (which reuse compiled code at least for the dependencies). I think for many users, e.g. the typical Jupyter or Pluto notebook, the CPU spends more time compiling code than processing data. Maybe code running on servers is more efficient, but my hunch is that currently there is more code running on individual computers than on servers.
Even on servers, I remember feeling bad when I set up CI on GitHub projects because of the huge energy costs implied (I assumed) for each commit on a Julia project, compared to the equivalent Python project, which will of course "compile" the project code but not the dependencies.
Maybe I'm totally wrong and server work already outweighs desktop/laptop work… Also, it would take only a few truly huge computations to offset the compilation costs on individual computers, so I could be wrong in this way too. And anyway this might change in the future, as there are plans to cache compiled code, I think.
I wouldn't feel too bad about CI. Let's do a bit of fun order-of-magnitude analysis!
Your average laptop draws ten to a few tens of Watts at idle. Let's overestimate at 100 W at max load.
My typical packages take less than a minute to run through tests. Arguably for a good dev experience it should be < 10 s, perhaps, but let's call it 100 s.
Let's imagine several CI configurations for different Julia versions and operating systems. 10, say.
So all up, a single CI run is 100 J/s * 100 s * 10 = 100 kJ, which is probably an overestimate for most (but not all) packages I've written. For comparison (a quick Julia sketch of this arithmetic follows the list):
That's about the chemical energy in a small tomato.
The typical human needs on the order of 10_000 kJ per day of food energy. That's ~100 CI runs per day per person, or at least tens of CI runs even if you're starving.
There's around 30_000 kJ of chemical energy in a litre of gasoline, which is ~300 CI runs. If you're willing to drive 5 km (0.25 L of fuel, say) then you should be willing to do 75 CI runs!
The world average total energy usage per person is apparently around 20_000 kWh per year (see e.g. Energy use per person - Our World in Data). This is ~200_000 kJ per day, or 2000 CI runs per day.
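For anyone who wants to tweak the assumptions, here is the same arithmetic as a tiny Julia sketch (all numbers are the rough estimates above, not measurements):

```julia
# Back-of-envelope check of the estimates above (not measurements).
power_W     = 100                                       # overestimated laptop draw at full load
test_time_s = 100                                       # one CI job
n_configs   = 10                                        # Julia versions × operating systems
ci_run_kJ   = power_W * test_time_s * n_configs / 1000  # ≈ 100 kJ per full CI run

food_per_day_kJ   = 10_000                  # typical daily food energy per person
gasoline_per_L_kJ = 30_000                  # chemical energy in a litre of gasoline
world_avg_kJ_day  = 20_000 * 3_600 / 365    # 20_000 kWh/year converted to kJ/day

println("one CI run ≈ ", ci_run_kJ, " kJ")
println("CI runs per day of food:    ", round(Int, food_per_day_kJ / ci_run_kJ))
println("CI runs per 0.25 L of fuel: ", round(Int, 0.25 * gasoline_per_L_kJ / ci_run_kJ))
println("CI runs per day, world avg: ", round(Int, world_avg_kJ_day / ci_run_kJ))
```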
Largely, I don't think computing is something the average user needs to be worried about! Other energy-intensive human activities are much more concerning.
I'm not so sure about your calculations though - if that is so, why does mining cryptocurrencies use soooo much energy (more than some countries)?
Server farms do use quite a bit of energy - so if you can get 10x - 100x more work done running Julia code rather than, say, Python, wouldn't that make a significant difference?
That's interesting to put computer activities in context, but I don't think comparisons with unrelated domains make sense when considering the energy consumption of Julia specifically… It's an excuse that can always be used; there's always something else consuming more energy.
I think it absolutely makes sense to compare domains quantitatively, in terms of making life decisions: "should I participate in this or that activity?"
If choosing Julia vs some other language is 1% or less of the energy used just to keep me living a normal life, then there's no reason to waste time worrying about that choice.
I agree that when considering my general choice of an activity, 1% is not that much. But if I can save 0.8% of my energy consumption just by switching language, and still accomplish my goals, that's also very relevant! I think that's the core question of this thread…
To get back to the original topic here (which was a nice and positive message, thanks!)
I think it will be extremely difficult to quantitatively show that Julia saves energy. Things to consider:
Julia should be used "in production" as part of a large, potentially ongoing computation or a very widely deployed application. These are the cases where a significant amount of energy is actually used.
It should replace the fixed amount of work that another system was doing and do it more efficiently or better.
The hard part is that many systems don't have a fixed amount of computational work to do. It seems we can always consume as much compute as possible, given hardware and software improvements over many years. The actual limitation on energy use from computing seems more economic than computational.
For example, HPC workloads do consume a lot of energy, but a lot of scientific workloads have the property that the simulation just expands to fill the compute available. Larger meshes, higher resolution, more samples. So the amount of energy used in these situations where Julia is great may be somewhat independent of the language, and more dependent on the budget to buy hardware and compute time. There's better scientific return with the ability to do higher resolutions or whatever. But I think making a clear comparison is really difficult!
Crypto is another good example of where the practical energy usage of the system has nothing to do with computational efficiency of the components. Better efficiency just means the network adjusts to make hashing more difficult and the same total amount of energy is used. When bitcoin went to GPUs and then ASICs nothing really changed energy-wise. By design!
This has been discussed here a few times. Although I mostly agree with @c42f's point that it is not clear whether these measures actually mean anything for real-world energy use in computing, there are these two studies:
In this one Julia is not initially present, but somewhere in there one can find a link to an updated version in which Julia is included and holds a fair position:
And there is this Nature Astronomy commentary, which includes Julia:
Perhaps that's not what this post is about, but it seems worthwhile to mention that Julia is very much used in green tech, from climate modelling to energy grid optimization. We're using it as well in an industrial setting to reduce our client's energy consumption. Those initiatives can move the needle a lot, even when the algorithms themselves take a lot of CPU time to run…
If you interleave the 2020 results with the broader, original results, you'll see Julia bested by C, Rust, C++, and Ada, but besting everyone else from OCaml to Fortran to Go to Java (especially considering the Java efficiency caveat detailed in: https://www.sciencedirect.com/science/article/abs/pii/S0167642321000022).
And I assume Julia has made efficiency improvements since v1.3.1 and will continue to improve, so pretty cool. I didn't assume it would be energy efficient compared to compiled languages w/ GC.
More than 100 minutes is common for the primary packages I work on.
Now, I'm suddenly concerned about the amount of energy I'm apparently using.
But I suspect they're using much less than 100 W, given that CI is restricted to a single core (2 threads).
This is great. Effectively a case where Julia is being used to make a fixed amount of work more efficient. And if it's not computational work but an industrial process (or whatever), the amount of energy saved could be huge. (Assuming that your client has a fixed amount of work and they don't scale up their operations until they're using the same amount of energy again?)
Yeah, overconsumption and the Jevons paradox make any kind of energy-efficiency work a bit depressing, but there are other effects in play and hopefully it's still worthwhile. Sigh.
Two ideas where I think Julia has strong potential to yield massive energy savings, if researchers extend Julia's already excellent solvers/automatic differentiation and build on current research implemented in other languages:
Gradient estimation or more efficient gradient-free optimization/tuning of discrete simulators
Where I work we have (a thousand? more?) cloud VMs running nearly every day, running long (e.g. 6-CPU-hour per iteration) simulations of CPUs and GPUs in development. It's not my day job, but I've been on the look-out for innovations (papers, etc.) that would allow us to take these discontinuous/discrete simulations (often done using grid search or primitive black-box optimization) and somehow speed up the search, ideally using gradient estimation techniques so SGD would "just work". If anybody is aware of recent breakthroughs in this area (especially Julia-based ones), I think in relatively little time I could rally some engineers in the company and rewrite some of the simulators from C++ to Julia to take advantage of it.
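To make the plumbing concrete, here is a minimal, purely illustrative sketch: `run_simulation` is a hypothetical stand-in for an expensive discrete simulator, and I'm assuming Optim.jl's gradient-free Nelder-Mead as the black-box optimizer (not the gradient-estimation approach itself):

```julia
using Optim   # provides gradient-free optimizers such as NelderMead

# Hypothetical stand-in for an expensive, discontinuous simulator: takes a
# vector of design parameters and returns a cost (energy, latency, ...).
function run_simulation(params)
    # In reality this would launch a 6-CPU-hour simulation; here it's a toy
    # discontinuous objective, just to show the optimizer plumbing.
    return sum(abs2, round.(params .- 3.7)) + 0.1 * sum(abs, params)
end

x0 = zeros(4)   # initial guess for the design parameters
result = optimize(run_simulation, x0, NelderMead(),
                  Optim.Options(iterations = 200))

println("best parameters: ", Optim.minimizer(result))
println("best cost:       ", Optim.minimum(result))
```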
One thing that will save energy is choosing the correct type for calculations.
Look at the machine learning community's use of Float32 and the recent lower-precision types from Nvidia.
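As a rough illustration of why precision matters, here is a small Julia sketch that only shows memory footprint and runtime rather than Watts (measuring actual energy would need a power meter or the CPU's RAPL counters):

```julia
using BenchmarkTools

# Halving precision halves the bytes moved through the memory bus, which is
# often where much of the energy goes for array-heavy workloads.
n   = 10^7
x64 = rand(Float64, n)
x32 = rand(Float32, n)

println("Float64 array: ", Base.summarysize(x64) / 1e6, " MB")
println("Float32 array: ", Base.summarysize(x32) / 1e6, " MB")

@btime sum($x64)   # typically slower: twice the memory traffic
@btime sum($x32)
```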
I read a fantastic paper years ago on the topic of watts per instruction and choosing types - sadly I did not keep the reference. If anyone has it please let me know.