Good question, PaPaw! The savings is real, and here's the simple logic as to "how". For the sake of discussion, assume that in both scenarios the outdoor temperature (OAT) is the same… say 40 degrees F. With a 60 degree set point, your furnace (or whatever energy source) must heat your interior space from 40 degrees to 60 degrees, which represents a 20 degree delta (difference). In the second scenario, with a 70 degree set point, the furnace must maintain a 30 degree delta.
In both scenarios you are supplying the same amount of energy (BTUs) to raise the interior from the 40 degree OAT to the 60 degree thermostat set point. The additional 10 degrees in the second scenario is where the savings is realized: holding the interior that extra 10 degrees warmer requires a constant additional supply of BTUs, hour after hour, for as long as the set point stays there.
Think of it this way… the larger the delta, the more total energy is required to maintain that higher set point, because heat loss through the building envelope is roughly proportional to the delta. The larger the delta… the greater the heat loss. This is a basic law of heat transfer that can't be broken.
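If numbers help, here's a minimal sketch of that proportionality. It assumes a simple steady-state conduction model (heat loss = U × A × delta) and a made-up overall loss coefficient of 500 BTU/hr per degree F; the point is the ratio between the two set points, not the absolute numbers.

```python
# Minimal sketch: steady-state heat loss proportional to indoor/outdoor delta.
# The U*A coefficient below is a hypothetical value chosen for illustration.

def heat_loss_btu_per_hr(indoor_f: float, outdoor_f: float,
                         ua_btu_per_hr_f: float = 500.0) -> float:
    """Heat loss in BTU/hr for a given indoor/outdoor temperature delta."""
    delta = indoor_f - outdoor_f
    return ua_btu_per_hr_f * max(delta, 0.0)

oat = 40.0                                  # outdoor air temperature, deg F
loss_60 = heat_loss_btu_per_hr(60.0, oat)   # 20 degree delta
loss_70 = heat_loss_btu_per_hr(70.0, oat)   # 30 degree delta

print(f"Loss at 60 F set point: {loss_60:,.0f} BTU/hr")   # 10,000
print(f"Loss at 70 F set point: {loss_70:,.0f} BTU/hr")   # 15,000
print(f"Extra fuel at 70 F: {loss_70 / loss_60 - 1:.0%}")  # ~50% more
```

With these assumed numbers, going from a 20 degree delta to a 30 degree delta means the furnace has to replace about 50% more heat every hour, which is where the fuel savings at the lower set point comes from.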
And here's a factual tidbit to consider as well… "heat always travels from hot to cold."
Unless you have gaps or openings, the cold doesn't really "travel inside"; the heat travels outward through insulation, metal, fiberglass… all materials… always.
Hope this helps you better understand whether you're saving, and why.
(also a PawPaw)