It might not be the best analogy, but think about the controller the same way you would think about a spider's web. Its sole job is to catch flies (requests) for the spider (the underlying layers) to digest. The web can catch and hold smaller or larger flies (models). The web's role is not to digest the prey, although it could be abused for that purpose. The thinner and cleaner the web, the easier it is for the spider to make a living.
You can apply much the same logic to your MVC application. The huge and nasty functions you describe are most likely behavior of the model, and they belong in the model (note that the model is not only the object that's being displayed in the view). If the behavior of the model changes, it's the model that should change, not the controller that handles it.
Also, keeping them as private methods in the controller would only clutter it and make it hard to maintain. It also sets a bad precedent: other people involved in development will be tempted to do the same, since they've seen it done before in the project.
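As a framework-agnostic sketch (all class and field names here are hypothetical, not from any particular MVC framework), a thin controller only translates the request and delegates the behavior to the model:

```python
class Order:
    """Model: owns the business rules."""

    def __init__(self, items):
        self.items = items  # list of (price, quantity) pairs

    def total(self):
        # Business behavior lives here, not in the controller.
        return sum(price * qty for price, qty in self.items)


class OrderController:
    """Controller: catches the request and hands it to the model."""

    def show_total(self, request):
        order = Order(request["items"])
        # The controller only adapts between HTTP-ish shapes and the model.
        return {"status": 200, "body": {"total": order.total()}}


controller = OrderController()
response = controller.show_total({"items": [(10.0, 2), (5.0, 1)]})
print(response["body"]["total"])  # 25.0
```

If the totaling rules change (discounts, taxes), only `Order` changes; the controller stays a thin web.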
Yes, this “cheating” is real, and yes, cyclomatic complexity is not an ideal measure of subjective code simplicity. It is still a very useful code quality indicator.
When we extract a function, we introduce an abstraction. Any abstraction has a cognitive overhead; it is not free. The art in this refactoring is choosing abstractions carefully, so that understanding the abstraction is simpler than understanding the extracted code. This is likely (but not necessarily) the case when the new abstraction can be used to simplify multiple pieces of code, but even a single-use abstraction can be useful when it is sufficiently powerful.
Arbitrarily breaking a function up has no value. That doesn't create any abstractions; it just obfuscates the actual control flow. What used to be a lot of complexity in one place is now a lot of complexity all over the place: the code has become less cohesive.
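To make the contrast concrete (a hypothetical example), compare an arbitrary split with an extraction that names a real concept:

```python
# Arbitrary split: _step1/_step2 carry no meaning, so the reader must
# still read both bodies to follow the control flow.
def process_arbitrary(values):
    cleaned = _step1(values)
    return _step2(cleaned)

def _step1(values):
    return [v for v in values if v is not None]

def _step2(values):
    return sum(values) / len(values) if values else 0.0


# Meaningful extraction: each function is a concept you can name,
# test, and reuse on its own.
def drop_missing(values):
    """Remove None entries (a self-explanatory, reusable abstraction)."""
    return [v for v in values if v is not None]

def mean(values):
    """Arithmetic mean, defined as 0.0 for an empty list."""
    return sum(values) / len(values) if values else 0.0

def process(values):
    return mean(drop_missing(values))


print(process([1.0, None, 3.0]))  # 2.0
```

The executable code is identical; what differs is whether the names let a reader stop reading.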
Instead of simple per-function cyclomatic complexity, it would make more sense to calculate the cyclomatic complexity of the part of the code that is unit-tested and understood together. In this sense, there is limited value in introducing private helper functions that offer no abstraction and have no well-defined interface; eliminating repetition would be the only benefit. In contrast, modules, classes, or functions that can be tested, understood, and documented separately are more likely to represent useful abstractions, and there is little point in counting their complexity toward the code that uses them.
I don't know of any tool that does this kind of calculation. As a proxy, I use the testing effort, with the goal of high branch coverage for the system under test. When the code is decoupled by the use of good abstractions, anything of interest can be tested fairly directly without much effort. If there are parts of the code that refuse to be covered easily, this might indicate that those parts are candidates for abstraction. This tends to be the case with nested control flow constructs, and also with multiple unrelated sequential conditionals or loops.
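A hypothetical sketch of this signal: a pricing decision that was buried in request-handling code is hard to reach in tests, but once extracted as a pure function with a small interface, every branch can be covered with one-line tests:

```python
def shipping_cost(order_total, country, is_express):
    """Pure decision logic, extracted so every branch is directly testable."""
    if country == "US":
        base = 0.0 if order_total >= 50 else 5.0
    else:
        base = 15.0  # flat international rate in this example
    return base + (10.0 if is_express else 0.0)


# Branch coverage is now trivial; no request setup or mocking needed.
assert shipping_cost(60, "US", False) == 0.0
assert shipping_cost(40, "US", False) == 5.0
assert shipping_cost(40, "DE", True) == 25.0
```

When the same assertions would require constructing a whole request pipeline, that friction is the hint that the decision wants to be its own abstraction.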
Finally, it's worth pointing out that subjective complexity is more important than the numbers produced by a tool. Automatic quality checks are likely to miss many things that make code difficult for humans to understand, such as issues with design or naming, but also overuse of reflection and metaprogramming. And some kinds of code are easier for humans to understand than for the tools. The switch/case construct is the prime example. For a human, 20 cases look maybe twice as complicated as 5 cases, as long as the cases are simple and don't interact with each other. Tabular formats are super easy for us to understand! But a linter sees 20 branches, which is way over any acceptable cyclomatic complexity threshold.
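When the cases really are a table, one option is to write them as data, which both reads like a table and removes the branches entirely (a sketch; the mapping shown is just an illustration):

```python
# A 20-case switch over status codes is really a lookup table.
# Written as a dict, adding a row adds no branches.
HTTP_REASONS = {
    200: "OK",
    201: "Created",
    204: "No Content",
    301: "Moved Permanently",
    400: "Bad Request",
    404: "Not Found",
    500: "Internal Server Error",
    # ... extend freely without growing cyclomatic complexity.
}

def reason_phrase(status):
    return HTTP_REASONS.get(status, "Unknown")

print(reason_phrase(404))  # Not Found
```

That said, a plain switch over such a table is often perfectly readable too, which is exactly the point: the linter's branch count and the human's perceived complexity diverge here.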
For that reason, any linter worth its salt can be selectively turned off for a small code region. Linter warnings are just a suggestion to investigate. If your investigation concludes that the warning was a false positive and that the code is in fact easy to understand, then please don't refactor that function.
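For example, with pylint (other linters have analogous inline directives), a warning can be silenced for one function only; the opcode names below are hypothetical:

```python
def decode_opcode(op):  # pylint: disable=too-many-branches
    # A flat, table-like dispatch: many branches for the linter,
    # but trivially readable for a human.
    if op == 0x00:
        return "NOP"
    if op == 0x01:
        return "LOAD"
    if op == 0x02:
        return "STORE"
    if op == 0x03:
        return "ADD"
    return "UNKNOWN"
```

The suppression is scoped to the function, so the check stays active everywhere else in the codebase.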
Best Answer
The canonical answer is to apply the Single Responsibility principle:
Also, classes must have Cohesion
On the other hand, the symptoms of low cohesion are exactly what you describe:
As such, my advice is to review your class structure and refactor appropriately. You don't need very large classes, but you certainly want classes that do one clear thing well.
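As a hypothetical sketch of such a refactoring: a class that both computes a report and delivers it has two reasons to change, so splitting it along those responsibilities restores cohesion (delivery is stubbed here for illustration):

```python
class ReportGenerator:
    """One responsibility: produce the report content."""

    def generate(self, sales):
        return f"Total sales: {sum(sales)}"


class ReportSender:
    """One responsibility: deliver a report (delivery stubbed out)."""

    def send(self, report, recipient):
        # A real implementation would talk to a mail gateway;
        # returning a string keeps the sketch self-contained.
        return f"Sent to {recipient}: {report}"


report = ReportGenerator().generate([100, 250])
result = ReportSender().send(report, "boss@example.com")
print(result)  # Sent to boss@example.com: Total sales: 350
```

Now a change to the report format touches only `ReportGenerator`, and a change to the delivery channel touches only `ReportSender`.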