Think about how all our nuclear tests raised carbon-14 levels so much that the spike can be used as a marker to determine the age of an object anywhere in the world.
Nuclear testing did change atmospheric C14 levels significantly [1]. Conventional carbon dating becomes unreliable for material from after the '50s, when large-scale atmospheric nuclear testing started, because it assumes a roughly stable atmospheric C14 level.
OK, but in my understanding, C14 dating was not really used to date anything recent in the first place. Usually it's used to date old items, monuments, bones, etc...
I think the suggestion is that C14 dating would be useful for dating recent events (perhaps at accident or crime scenes) had nuclear tests not been so common in the '50s. Humanity lost a useful tool (dating recent events) so that we could develop more devastating nuclear weapons a bit faster than we otherwise would have.
Phrased like that it sounds a shame, but I have no idea whether it's true. My understanding of C14 dating is that it relies on the half-life of C14 being predictable. But my understanding of radioactive decay is that it isn't perfectly deterministic, which suggests to me that you might need some time for the decay to trend toward its expected average rate.
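That worry is easy to check numerically. Each atom decays at random, but a real sample contains an enormous number of C14 atoms, so the law of large numbers makes the aggregate decay curve extremely predictable from the start. A minimal Monte Carlo sketch (the atom count and half-life value here are just illustrative; C14's half-life is about 5,730 years):

```python
import random

random.seed(0)

HALF_LIFE = 5730.0   # C14 half-life in years
N_ATOMS = 1_000_000  # tiny compared to a real sample, big enough to see the trend
T = 5730             # simulate exactly one half-life

# Each atom independently survives T years with probability 0.5 ** (T / HALF_LIFE).
survivors = sum(random.random() < 0.5 ** (T / HALF_LIFE) for _ in range(N_ATOMS))
expected = N_ATOMS * 0.5 ** (T / HALF_LIFE)

print(survivors, expected)  # survivors lands within a fraction of a percent of 500000.0
```

The relative fluctuation scales like 1/sqrt(N), so with the ~10^10 or more C14 atoms in even a gram-sized sample, the randomness of individual decays is negligible; there's no "warm-up" period needed.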
Sorry, but that does not make sense. When there's C14 inside an old artifact or a skeleton, the fact that atmospheric C14 has doubled over the past 50 years does not change the measurement of the C14 extracted from old items.
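Right: the age comes from the fraction of C14 remaining relative to what the organism had at death, not relative to today's atmosphere. A minimal sketch of the standard decay equation (the helper name `c14_age` is just illustrative):

```python
import math

HALF_LIFE = 5730.0  # C14 half-life in years

def c14_age(ratio_remaining):
    """Years elapsed, given the fraction of the original C14 still present."""
    return HALF_LIFE * math.log(1 / ratio_remaining) / math.log(2)

# Half the original C14 left -> exactly one half-life has passed.
print(round(c14_age(0.5)))   # 5730

# Today's doubled atmospheric C14 appears nowhere in this formula;
# only the sample's own starting level at time of death matters.
```

The bomb pulse only complicates things for organisms that were alive after the 1950s, because for them the starting level itself is uncertain (or, viewed the other way, the pulse becomes a dating signal in its own right).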
Then think about our crazy high cancer rates.