REQUEST: Knife throwing and metal fatigue, materials science. Let's see some data.

chiral.grolim · Universal Kydex Sheath Extension · Joined Dec 2, 2008 · Messages: 6,422
I posted a thread in the knife testing area a while back trying to prove or refute the myth that throwing a knife not specifically heat-treated for throwing necessarily results in fracture and therefore constitutes gross abuse.

So, does anyone have any light to shed on this? And by "light" I mean a mathematical demonstration based on physics and/or demonstrative empirical data?
 
While I have no data to offer you, I do have a follow-up question as well. Why is it that manufacturers of knives designed for throwing heat-treat their blades to a much lower RC hardness than those that are not designed for that specific purpose? I have read that throwing knives typically have an RC in the high 40s to low/mid 50s. If heat-treat weren't a consideration, then why not harden all knives to the 58-60 RC that we see in "hard use" knives?

Ok, that was two questions...
 
I am not a knife maker or metallurgist, but there are a couple of things that I have gathered regarding heat-treating steel to a purpose.

From wikipedia: Austenite Behavior in plain-carbon steel - As austenite cools, it often transforms into a mixture of ferrite and cementite as the carbon diffuses. Depending on alloy composition and rate of cooling, pearlite may form. If the rate of cooling is very fast, the alloy may experience a large lattice distortion known as martensitic transformation in which it transforms into a BCT-structure instead of into ferrite and cementite. In industry, this is a very important case, as the carbon is not allowed to diffuse due to the cooling speed, which results in the formation of hard martensite. The rate of cooling determines the relative proportions of martensite, ferrite, and cementite, and therefore determines the mechanical properties of the resulting steel, such as hardness and tensile strength. Quenching (to induce martensitic transformation), followed by tempering will transform some of the brittle martensite into tempered martensite. If a low-hardenability steel is quenched, a significant amount of austenite will be retained in the microstructure.


So that regards tempering steel to the appropriate microstructure/carbide matrix desired to maximize performance for an intended use. Further data I have gleaned is that steel hardness is almost directly proportional to ultimate tensile strength and compressive strength (resistance to permanent deformation). Softer steel will yield and bend plastically well before fracture; harder steel will resist deformation, bending only elastically (non-permanently) until fracture. That regards strength. Toughness (measured via the Charpy impact test), on the other hand, is somewhat inversely proportional to the final tempered hardness of steel, reaching a maximum for 1095 at ~50 HRC with 125 ft·lbs compared to 20 ft·lbs at 60 HRC. Torsional impact toughness responds similarly. http://www.panix.com/~alvinj/graph1095.jpg
So, to maximize impact toughness of a knife, putting it at low HRC is optimal... but this leaves a soft edge that bends and compresses easily. Sooo.... how brittle is 20 ft*lbs impact toughness on a knife designed for cutting? THAT is really the question, I think, if "hard use" knives in 1095 appear plenty tough at 20 ft*lbs. Other knives might be tougher, but how tough is necessary to endure the stresses of throwing (esp. if throwing isn't the primary intended use)?
 
I am copying this post from my thread in the Knife Testing area, just to bump up this discussion. I realized that I'd made some poor calculations previously, so I am attempting to rectify the matter and see what comes out of pushing this discussion further.

For those curious, this all began with bumping up against the widespread myth that 1095 knives hardened to 60 Rc are too brittle to throw without inducing failure. Given the plethora of empirical evidence debunking this myth (thousands of throws with such knives that did NOT result in failure vs. a few throws that did), I wondered at its persistence and wanted to investigate the physics behind knives breaking when thrown...

One real quick thought:

50 km/h (≈ 14 m/s) to standstill in, let's say, 1 cm (= 0.01 m).
Assuming constant deceleration, the stopping time is t = 2d/v = 0.02/14 ≈ 0.00143 s,
so acceleration = 14/0.00143 ≈ 9800 m/s^2, or about 1000 g's for just under one and a half milliseconds.

If your knife weighs in at 500g, it is going to experience a load near the tip equivalent to an axial load of 500 kg. There are a number of simplifications assumed eg: constant acceleration ( it will actually start low and end high as a broader cross-section enters the target ), rotational forces low compared to axial and so on, but I think it gives a flavor. And you can change the parameters to match your own case, eg: 2cm penetration halves the result.
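The back-of-the-envelope numbers above check out; here is the same calculation as a few lines of Python (the 1 cm stop distance and 500 g mass are the post's example figures, not measurements):

```python
# Deceleration of a thrown knife stopping in the target,
# assuming constant deceleration over the penetration depth.
v = 14.0        # impact speed, m/s (≈ 50 km/h)
d = 0.01        # penetration depth, m (1 cm, example figure)
m = 0.5         # knife mass, kg (500 g, example figure)

t = 2 * d / v               # stopping time under constant deceleration
a = v / t                   # deceleration, m/s^2
g_load = a / 9.81           # in multiples of g
force_kgf = m * a / 9.81    # equivalent axial load in kgf

print(f"stop time ≈ {t*1000:.2f} ms, a = {a:.0f} m/s^2 ≈ {g_load:.0f} g")
print(f"axial load ≈ {force_kgf:.0f} kgf")
```

Doubling the penetration depth to 2 cm halves both the deceleration and the load, as noted above.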

If you hit something hard, it makes for a huge load, but it is also less likely that it will fully stop your knife, so swings and roundabouts a bit.

I know that this is responding to an old post, but I wanted to point out, like Lagrangian, that calculating G-force is a step backward from looking at ENERGY and its distribution. Now I have realized that in my previous post I included some gross miscalculations, so here is that post with corrections:

Let's assume our hardened 1095 knife is 340 g, approximately 30 cm long x 4 cm wide x 0.5 cm thick (net 60 cc), with its center of mass traveling at 14 m/s. Kinetic energy at impact = 33.32 J, and if the blade absorbs ALL of this energy at impact (a perfect ricochet), that's 0.555 J/cc (joules per cubic centimeter).

I found some data graphing impact toughness (via Charpy) of 1095 steel at ~60 HRC in the 15 - 20 foot-pound range. 1 foot-pound = 1.35581795 joules, so 15 ft-lbs ≈ 20.4 joules. Now, I'll need someone to explain to me what unit of volume/area that value covers, but if it is the standard Charpy bar (1 cm x 1 cm x 5.5 cm = 5.5 cc), that's 3.7 J/cc.

That's 6.67X the amount of energy per cc required to endure the impact of the knife being thrown...
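A quick sanity check on that arithmetic in Python, mirroring the post's per-cc comparison (all figures are the ones given above: 340 g knife, 60 cc volume, 14 m/s, 15 ft·lb Charpy, 5.5 cc standard specimen):

```python
# Compare impact energy density to Charpy toughness density,
# using the figures from the post.
m = 0.340                      # knife mass, kg
v = 14.0                       # impact speed, m/s
vol = 30 * 4 * 0.5             # knife volume, cc

ke = 0.5 * m * v**2            # kinetic energy at impact, J
e_per_cc = ke / vol            # energy density if absorbed uniformly, J/cc

FT_LB = 1.35581795             # joules per foot-pound
charpy_j = 15 * FT_LB          # 15 ft·lb Charpy value in joules
specimen_cc = 1.0 * 1.0 * 5.5  # standard Charpy bar: 1 x 1 x 5.5 cm
charpy_per_cc = charpy_j / specimen_cc

print(f"KE = {ke:.2f} J; {e_per_cc:.3f} J/cc vs Charpy {charpy_per_cc:.2f} J/cc")
print(f"margin: {charpy_per_cc / e_per_cc:.2f}x")
```

The margin comes out around 6.7x, matching the figure above to within rounding.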


Is it incorrect to divide the Charpy value into unit-volumes? Perhaps so, since the energy of the pendulum in the impact-test makes contact at only a small part of the test-object... perhaps it makes more sense to work with cross-sectional areas, 1cm^2 for standard Charpy test samples. Soooo... 1095 @ 60 Rc can handle 20.4 J/cm^2 impact energy. Our throw puts 33.32 J into the knife's center of mass. 33.32 > 20.4, so the energy is present to induce fracture in a 1 cm^2 cross-section, indeed up to a 1.63 cm^2 cross-section, but our knife is 2 cm^2 (4 x 0.5). Even if ALL the energy of the traveling knife was reflected back into it on impact, it could withstand 20% more prior to fracture.
Usually a thrown knife experiences only a few rotations in flight and I'm not sure how to incorporate that energy into the equation - someone else, please? - but would that energy be able to cover the remaining 20%? Unlikely.
And here we must remember that the likelihood of ALL the energy being reflected back into the knife and absorbed is quite low. A "stick" transfers the majority of this energy into the target. A "bounce" dissipates energy into the continued motion of the blade and vibration (which we already know should NOT induce fracture). To induce fracture, we need enough energy focused at a small-enough cross-section upon impact before dissipation.
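On the rotational-energy question raised above: a rough estimate is possible by modeling the knife as a uniform rod spinning about its center, so the rotational kinetic energy is (1/2)Iω² with I = mL²/12. The 3 rev/s spin rate below is an assumed, illustrative figure, not a measurement:

```python
import math

# Rough rotational-energy estimate for a thrown knife, modeled as a
# uniform rod spinning about its center: I = m * L^2 / 12.
m = 0.340            # knife mass, kg (post's figure)
L = 0.30             # knife length, m (post's figure)
rev_per_s = 3.0      # ASSUMED spin rate, illustrative only

I = m * L**2 / 12                  # moment of inertia, kg·m^2
omega = 2 * math.pi * rev_per_s    # angular speed, rad/s
ke_rot = 0.5 * I * omega**2        # rotational kinetic energy, J
ke_trans = 0.5 * m * 14.0**2       # translational KE from earlier, J

print(f"rotational KE ≈ {ke_rot:.2f} J "
      f"({100 * ke_rot / ke_trans:.1f}% of translational)")
```

At a few revolutions per second the rotational energy is well under a joule, i.e. only a percent or two of the translational energy, which supports the "Unlikely" above: spin cannot plausibly supply the missing ~20%.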

Now, one might argue that knives are not designed as a solid block with such square dimensions - the cross-section varies along the length in the form of edges, tips, etc. - and that is true. The smaller cross-sectional areas are more susceptible to fracture: less energy is required to snap off the thin tip of a knife or the edge of the blade than is required to induce catastrophic failure at a thicker portion.
If we give our sample knife a full-flat grind on each side, we reduce its cross-section to 1 cm^2, lower as it approaches a radiused tip. Now we just need a throw where the tip becomes a focal point for sufficient energy on impact, perhaps one where it embeds only 1 cm into the target (as proposed). For a full-flat-ground blade 4 cm wide x 0.5 cm thick, the edge angle is ~3.6 dps. If this same angle is ground in a radius at the tip (single-edge blade), the knife is 0.12 cm thick and 2.65 cm wide at 1 cm from the tip, net 0.32 cm^2. At 20.4 J/cm^2 impact toughness, a mere 6.5 J focused at this cross-section could induce fracture - about 20% of the energy in the blade as it flies. So, on impact, is >80% of the energy dissipated? If so, no fracture at that point will occur unless the blade was flawed there to begin with.
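The tip-geometry numbers work out as claimed; here is the same calculation in sketch form (dimensions are the ones given in the post):

```python
# Fracture-energy threshold at the tip cross-section, using the
# post's geometry: 0.12 cm thick x 2.65 cm wide at 1 cm from the tip.
charpy_j_per_cm2 = 20.4            # post's figure for 1095 @ 60 HRC
area_tip = 0.12 * 2.65             # cm^2, about 0.32
e_fracture = charpy_j_per_cm2 * area_tip   # J needed to snap that section

ke = 33.32                         # J, total impact energy from earlier
frac = e_fracture / ke             # fraction of impact energy required

print(f"tip section {area_tip:.2f} cm^2 -> "
      f"{e_fracture:.1f} J to fracture ({100*frac:.0f}% of impact energy)")
```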

Boy, knowing how much energy is dissipated by other pathways on impact is pretty important to calculating what sort of impact is required to fracture a knife!

This would make the matter quite simple regarding analysis of broken knives. If you have a general idea of how much energy was put into the throw and can measure the cross-section of the fracture, you can compare this to the Charpy value and deduce whether or not the blade performed within expected parameters, i.e. if the energy used to induce the fracture exceeded the expected toughness level. If the fracture energy value was below expected values, then the knife was flawed before the throw. Of course, deducing the energy level of the blade on impact might be more difficult than that as well. What if it was a really hard throw? What if it was rotating at very high speeds?
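The forensic test described in that paragraph is simple enough to sketch: given an estimate of the throw energy and a measurement of the fracture cross-section, compare the implied energy demand against the Charpy value. The function and parameter names here are illustrative, and the default toughness is the post's 20.4 J/cm² figure:

```python
def within_expected_toughness(throw_energy_j, fracture_area_cm2,
                              charpy_j_per_cm2=20.4):
    """Return True if the throw carried at least as much energy as the
    Charpy value says that cross-section should withstand -- i.e. the
    blade performed within expected parameters. False suggests the
    fracture happened below the expected threshold (a flawed blade)."""
    required = charpy_j_per_cm2 * fracture_area_cm2
    return throw_energy_j >= required

# A 33.32 J throw snapping a 0.32 cm^2 tip: plausible without a flaw.
print(within_expected_toughness(33.32, 0.32))   # True
# The same throw snapping a full 2 cm^2 section: suspicious.
print(within_expected_toughness(33.32, 2.0))    # False
```

As noted above, the hard part in practice is estimating the throw energy and how much of it actually focused at the fracture, not the comparison itself.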

The discussion continues...
 
My guess is that part of why throwing non-throwing knives is considered abuse comes down to other construction factors. Features that almost all throwing knives have, like full tangs, may be absent in bowies and other knives that people throw. This leads to damage that is not related to the treatment of the metal. I think that most throwing-knife breaks are probably the result of poor throws where the knife sticks at an angle side to side, putting torque on the blade, similar to if it were used for prying.
Sorry I didn't have any more specific info in the physics of the problem. I have enjoyed reading what you have posted, thanks for sharing.
Good luck finding more info, Unicyclone
 