I read the paper in the link above. The cryo samples were harder and more wear resistant than the non cryo'd samples, but not as tough. That was their conclusion.
Since hardness and toughness are inversely related, this comes as no surprise to me.
That cryo does indeed improve wear resistance is important to me, because a more wear resistant cutting tool is a better cutting tool, IMO. If toughness is more important than edge holding, skip the cryo.
I think the real question here is: is some level of retained Austenite beneficial to the overall performance of the knife? Or, stated another way, does the highest achievable level of Martensite result in the best blade?
I don't think that can be answered in a "one size fits all" blanket statement.
One problem I have with the method used in the article is that the specimens were not all tested at the same hardness. So, it was an apples-to-oranges comparison. If the non-cryo'd piece had been tested at Rc60, like the cryo'd piece, what would the difference in toughness have been? Certainly greater, but not as much greater.
I have been experimenting with non-cryo'd CPM S30V blades, at the recommendation of Scott Devanna from Crucible. I will try to report back after I finish some up (today) and sharpen them. Since people aren't breaking my knives in use, my interest here is more in edge degradation and failure during cutting than in total blade failure.
I will say, it is easy to get frustrated over all this. Usually, the more education and information you have, the more confident you are in choosing a course of action. Not so in this area.