:thumbup: Excellent and loaded questions.
When testing my edges, I look for a balance point between micro-roll and mini-chip when the edge takes a damaging impact. For such an impact, too soft = a very big roll; too hard = a large crescent chip. This 'balance point' depends solely on the quality of the matrix: the finer the grain and the smaller the carbides, the higher the balance point will be. A higher balance point means higher hardness and higher toughness go together.
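A textbook way to see why a finer matrix buys both at once is the Hall-Petch relation (standard metallurgy, not something I measured on these edges):

\sigma_y = \sigma_0 + k_y \, d^{-1/2}

where d is the grain diameter and \sigma_0, k_y are material constants. Shrinking d raises the stress needed to start plastic flow (rolling), while the extra grain boundaries also deflect cracks (chipping), so grain refinement is about the only lever that moves hardness and toughness in the same direction.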
I think an ideal mode of edge degradation (for the non-wearing type) would be micro-roll & mini-chip, because this type of damage runs less deep than a big roll/ripple or a large chip. Say, as a kid, you were asked to build high-arch bridges, 3" and 10" long, out of Lego. The arch sets the bending radius. Without much hesitation, you would choose micro Lego instead of those honkin' 1"-thick bricks, right?
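To put rough numbers on the Lego picture (my own back-of-envelope, using the standard sagitta formula, reading the 3" arch as roughly a 3" bend radius): a brick of length c laid along an arch of radius r steps off the curve by about

s \approx \frac{c^2}{8r}

So a 1" brick on a 3" radius misses the arc by roughly 1/(8 \cdot 3) ≈ 0.04", while a 1/4" micro brick misses by only ≈ 0.003". Quartering the unit size cuts the step error 16x, which is the same reason a fine-grained edge can give way along a shallow micro-roll instead of failing in one deep chunk.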
Technically, this balance mode also applies to the 'wearing' degradation type, especially when carbides serve as saw teeth. So ideally, the matrix holds onto each carbide as strongly & toughly as possible, and the ideal degradation is the release of an individual carbide rather than tearing out a chunk of matrix along with it. That way, the intact matrix protects the other carbides, avoiding a cascading failure. In the super-high-alloy case, well, there isn't much matrix spacing between particles/carbides, so sometimes a cascade (big chip) is unavoidable if the matrix can't absorb the damaging energy. With an incoming .50 cal, I would rather be behind a single wall of sandbags than a single wall of bricks.
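One hedged way to frame 'release one carbide vs. lose a chunk' is the Griffith/Irwin critical flaw size (the numbers below are illustrative assumptions, not measurements on any particular steel):

a_c \approx \frac{1}{\pi} \left( \frac{K_{Ic}}{\sigma} \right)^2

Assuming a matrix toughness K_{Ic} of ~20 MPa·sqrt(m) and a local stress spike of ~2000 MPa at the apex, a_c comes out near 30 µm. The pit left by a single 1-3 µm carbide pull-out sits far below that critical size, but in a super-high-alloy edge where carbides nearly touch, one pull-out can link up into a flaw tens of microns long, and that is the cascade (big chip) going critical.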
My current HT for low-Cr steels with 0.9-1.2% carbon yields similarly ultra-fine grain sizes. 52100 has slightly more & larger CrC than W2, so it gives up a tiny bit of toughness for a gain in wear resistance.
What about the edge toughness difference at the higher Rc's? Are they similar in edge degradation, i.e., the way the edge starts to micro-roll or chip?