a paper on how to figure out the effect of hills on the length of a hike
Posted: Wed Apr 14, 2021 2:40 pm
I wrote a paper recently on how to figure out the effect of hills on the length of a run or hike. I thought this might be interesting to people on this forum.
Paper (open access): https://www.biorxiv.org/content/10.1101/2021.04.03.438339v1
Software: http://www.lightandmatter.com/kcals (open source)
The software has a mode for running and a mode for walking. The paper tests its predictions against real-world running data.
For those who don't want to click through and read the whole paper, here's a brief summary of the results in the form of a couple of graphs.
The first graph shows two hypotheses about the effect of hill climbing on how much energy you expend. The solid line is from treadmill data. The dashed line is the rule that every 100 meters of elevation gain counts as 600 meters of extra distance.
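That dashed-line rule is simple enough to sketch in a few lines of code. This is just my illustration of the rule as stated above (one meter of gain counts as six extra meters of distance); the function name and interface are mine, not from the paper or the software:

```python
def effective_distance(distance_m, elevation_gain_m, factor=6.0):
    """Flat-equivalent distance under the simple dashed-line rule:
    every meter of elevation gain counts as `factor` extra meters
    of distance (100 m of gain = 600 m extra).
    Illustrative sketch only, not the paper's final model."""
    return distance_m + factor * elevation_gain_m

# A 10 km route with 500 m of gain counts as 13 km of flat running.
print(effective_distance(10_000, 500))  # 13000.0
```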
I collected publicly available results from races in the LA area and tested these hypotheses. I constructed four tests, which I label a, b, c, and d. Neither model was consistent with all four tests. The second graph summarizes the results.
Each circled part of a graph marks a feature of that model that the race data contradict. For example, the part of the dashed-line model circled and labeled "a" is wrong in real life: a gentle downhill slope really does make you more efficient than running on the flats.
Based on these results, I came up with a third model that fits the data better than either of the first two. It's the same as the solid (treadmill) curve, except that at the bottom of the curve I replace it with the dotted line. In other words, the treadmill results seem basically right, except that in real-world outdoor conditions, people aren't as fast running downhill as lab energy measurements would suggest. Their energy consumption per mile probably is low on descents, but for other reasons they simply can't run downhill fast enough to keep their rate of energy consumption at the maximum their lungs and heart can sustain.
I have an open-source app (link above) that lets you upload a GPS track and estimate the effective length of any given route, in either running or walking mode. As an example of an application: I was thinking it would be super fun to run from Manker Flat to Wrightwood and back in the same day, but I doubted whether I was in good enough shape to complete it. So I used the software to calculate the energy consumption for that run and compared it with the hardest similar run I'd done, a loop of Baldy and the Three T's. The Wrightwood out-and-back came out to about 50% more energy, which told me that no, I probably couldn't do it safely in a day.
Rather than the actual energy consumption in calories, the most useful number the software prints is usually what I call the climb factor, CF. This is defined as the percentage of your energy that is spent on hill climbing, compared to doing the same route on level ground. For example, the Mount Wilson Trail Race has a climb factor of 26%, which isn't very high because it's an out-and-back, and you're going downhill on the way home. By comparison, the Baldy Run to the Top has a climb factor of 48%, much higher because it's an uphill-only race. I'd like to spread the idea that elevation gain isn't really a great measure of exertion, and that the climb factor is more meaningful.
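Here's my reading of that definition as a small sketch. The software computes the two energy figures itself from the GPS track; the function and the sample numbers below are hypothetical, chosen only to show how a CF around 26% could arise:

```python
def climb_factor(energy_route_kcal, energy_flat_kcal):
    """Climb factor (CF): the fraction of the route's total energy
    attributable to hills, i.e. the extra energy relative to a flat
    route of the same length, expressed as a percentage of the total.
    (My interpretation of the definition; illustrative only.)"""
    extra = energy_route_kcal - energy_flat_kcal
    return 100.0 * extra / energy_route_kcal

# Hypothetical numbers: a route costing 1000 kcal that would cost
# 740 kcal on level ground has a climb factor of 26%.
print(round(climb_factor(1000, 740)))  # 26
```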
I've never been the kind of guy who wanted to geek out too much on sports statistics, but using the software has helped me a lot with planning what kinds of runs I could have fun doing and do safely.