The distaste is for complexity. Adding new functions to reduce duplication can add complexity, it can reduce complexity, and it can reduce complexity for a while and then increase it in the long run. Recall Alan Perlis's famous quote: "It is better to have 100 functions operate on one data structure than 10 functions on 10 data structures." Sometimes dependencies are really cheap - when they are the right abstraction. I think the article put it well.
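Here's a toy sketch of what Perlis is getting at (the names and data are mine, not from the article): when many small functions share one generic data structure, they compose without adapters.

```typescript
// Many small functions over one generic shape. Because they all speak
// Row / Row[], any pipeline of them just works.
type Row = Record<string, unknown>;

const pick = (keys: string[]) => (row: Row): Row =>
  Object.fromEntries(keys.map((k) => [k, row[k]]));

const filterBy = (pred: (row: Row) => boolean) => (rows: Row[]): Row[] =>
  rows.filter(pred);

const sortBy = (key: string) => (rows: Row[]): Row[] =>
  [...rows].sort((a, b) => String(a[key]).localeCompare(String(b[key])));

// Usage: no glue code needed between the stages.
const users: Row[] = [
  { name: "ada", role: "admin" },
  { name: "bob", role: "user" },
];
const admins = sortBy("name")(filterBy((r) => r.role === "admin")(users));
console.log(admins.map(pick(["name"])));
```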
Rails itself is an example of an abstraction that reduces complexity for a while and then adds complexity once you reach a certain size. It was the right abstraction at first; then the requirements changed and it slipped over into being the wrong abstraction. Here is an insightful comment about why that is both inevitable and doesn't matter: https://news.ycombinator.com/item?id=11028885
Here's a very objective and powerful way to measure complexity: dependencies and volatility.
Otherwise we're all saying "complex" but not being clear and likely meaning different things.
For example, a lot of people believe that "not easy" = "complex", but, as Rich Hickey articulates, that's a counterproductive way to think about complexity. (See http://www.infoq.com/presentations/Simple-Made-Easy)
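To make "dependencies and volatility" concrete, here's a sketch of Robert C. Martin's instability metric, which the principles cited below build on: I = Ce / (Ca + Ce), where Ce is fan-out (what a module depends on) and Ca is fan-in (what depends on it). The module names and graph are made up for illustration.

```typescript
// module -> modules it depends on (a hypothetical graph)
type DepGraph = Record<string, string[]>;

function instability(graph: DepGraph, module: string): number {
  const ce = (graph[module] ?? []).length; // efferent: outgoing deps
  const ca = Object.entries(graph)         // afferent: incoming deps
    .filter(([name, deps]) => name !== module && deps.includes(module))
    .length;
  return ca + ce === 0 ? 0 : ce / (ca + ce); // 0 = maximally stable, 1 = maximally volatile
}

// "core" depends on nothing and everything depends on it,
// so it had better stay stable.
const graph: DepGraph = {
  core: [],
  billing: ["core"],
  ui: ["core", "billing"],
};

console.log(instability(graph, "core"));    // 0   -> stable
console.log(instability(graph, "billing")); // 0.5
console.log(instability(graph, "ui"));      // 1   -> volatile, fine at the edge
```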
If your system's design results in your stable components depending on volatile components, your system is complex. Volatile components change often, and those changes ripple into your stable components, effectively rendering them volatile too. Now the whole system is volatile, and changes to it become very hard to reason about - hence complex.
This problem is addressed by, among others, the Stable Dependencies Principle (http://c2.com/cgi/wiki?StableDependenciesPrinciple), which states that dependencies should point in the direction of stability. A related one is the Stable Abstractions Principle (http://c2.com/cgi/wiki?StableAbstractionsPrinciple), which states that a component should be as abstract as it is stable.
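Here's a minimal sketch of what following both principles looks like in practice (the payment domain and names are invented, not from the linked pages): invert the dependency so the volatile detail depends on a stable abstraction, rather than the stable core depending on the detail.

```typescript
// Stable *and* abstract (Stable Abstractions Principle): an interface
// that rarely needs to change.
interface PaymentGateway {
  charge(cents: number): Promise<void>;
}

// Stable core logic depends only on the abstraction, so gateway churn
// no longer ripples into it (Stable Dependencies Principle).
async function checkout(gateway: PaymentGateway, cents: number): Promise<void> {
  if (cents <= 0) throw new Error("nothing to charge");
  await gateway.charge(cents);
}

// Volatile detail: this class can change weekly without touching
// checkout(), because the dependency points from volatile to stable.
class StripeGateway implements PaymentGateway {
  async charge(cents: number): Promise<void> {
    console.log(`charging ${cents} cents via Stripe`);
  }
}

checkout(new StripeGateway(), 500);
```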