That appears to be a commonly held belief, but I see no physical process by which the apex of an edge would see more load because of more obtuse geometry behind it (at a given edge angle, to be clear), nor any tests confirming it. It seems like a poorly considered argument that more force to cut = more degradation of the edge. It ignores any consideration of which parts of the knife contribute to the increased difficulty of cutting, and assumes that the increased load must be evenly distributed over the entire knife. I would think a simple summation of the forces acting on the blade would show that assumption to be clearly inaccurate.
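To make that summation concrete, here is a rough sketch of the force balance I have in mind for a straight push cut. The symbols are my own shorthand, not taken from any study, and the split into three terms is an assumption for illustration:

```latex
% Rough force balance for a push cut (my own notation, for illustration only):
% F_total = total force needed to advance the blade through the material
% F_apex  = force carried at the apex itself (actually severing the material)
% F_wedge = force from the bevels/shoulders wedging the material apart
% F_drag  = friction on the flats, spine, or anything else rubbing the material
\[
  F_{\text{total}} = F_{\text{apex}} + F_{\text{wedge}} + F_{\text{drag}}
\]
% Thicker geometry behind the apex raises F_wedge and F_drag, and therefore
% F_total, but nothing in this balance requires F_apex to rise along with it.
```

In other words, the extra force shows up in the terms belonging to the parts of the knife that actually got thicker or draggier; it doesn't automatically get shared out to the apex.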
As a thought experiment: if I took a given knife and glued a large flange to the spine, making it drag quite a bit more, would you expect that to change the loads on the edge in any way? I would suggest the added drag ends up mostly on the flange, depending on how the cardboard now contacts the knife, and unless the flange somehow changes how the cardboard splits at the edge, there's no reason it would have any effect on the edge. If anything, one could envision the flange forcing the cardboard apart and letting the edge fall into a cut that is already under tension. I'm not suggesting the flange would increase edge retention here; my point is that there needs to be some explanation, or a conclusive demonstration, that more obtuse edge geometry above the apex results in more wear at the apex.
Maybe someday someone will do a study on it?