Except that it doesn't seem to. You only notice a rise in serum creatinine levels, or any other marker of uraemia, after fully 50% of the kidneys' function has been lost. In other words, as kidney donors know, you can make do with just one kidney! Such an extravagance would surely be penalised by evolution, for those resources could be better diverted into other things - like finding food, fighting rivals and mating (!).
What is the solution to the paradox? I don't know, but two possible reasons for the maintenance of excess renal function in our ancestral past come to mind:
- Perhaps this increased capacity was required more frequently than we think. Prior to civilisation, perhaps our diets carried a higher toxin load - a load that we now bypass thanks both to millennia of selective breeding for more human-friendly food and to better food processing. Furthermore, perhaps states of hypovolaemia (which are liable to cause acute renal failure) were more common then. I'm thinking of blood loss and dehydration...
- More subtly, perhaps normal bodily function is impaired by a more modest decline in renal function than we currently appreciate. For instance, perhaps growth is relatively stunted by even a 10% decline in renal function - which, at steady state, would raise retained toxin concentrations by roughly 11%, since concentration scales inversely with clearance (1/0.9 ≈ 1.11). We only think the kidneys are so lavishly wasteful because the first abnormality that we can detect happens long after this point.
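The arithmetic behind that second point can be sketched with a simple one-compartment mass-balance model: a toxin produced at a constant rate and removed by renal clearance settles at a concentration inversely proportional to that clearance. This is purely illustrative - the production rate and baseline clearance here are arbitrary units, not physiological values:

```python
# Illustrative one-compartment steady-state model (arbitrary units).
# At steady state, production = clearance * concentration,
# so concentration = production / clearance: retained toxin levels
# scale inversely with remaining renal function.

def steady_state_concentration(production, clearance):
    """Steady-state plasma concentration for a constantly produced solute."""
    return production / clearance

baseline = steady_state_concentration(production=1.0, clearance=1.0)

for remaining_function in (1.0, 0.9, 0.5):
    conc = steady_state_concentration(1.0, remaining_function)
    rise = (conc / baseline - 1) * 100
    print(f"{remaining_function:.0%} function -> concentration x{conc:.2f} (+{rise:.0f}%)")
```

The hyperbolic shape of this relationship captures both halves of the paradox: a 10% loss of function raises retained toxins by only ~11% (too subtle for our usual markers to flag), while concentrations don't double until fully half the clearance is gone - which is exactly when serum creatinine finally becomes noticeably abnormal.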
I am indebted to this article for the second insight.