I Come to Honor Newton, Not Cancel Him
Newton never performed a regression. Neither did Leibniz. That’s not to slight their genius; it’s to question their relevance in today’s classrooms. An American STEM student spends years studying the cathedrals of pre-computational math and may never learn that the world now yields more easily to brute-force computing.
We still teach students to worship elegance: closed-form solutions, hand-cranked integrals, blackboard proofs of beauty. But the real world doesn’t care about elegance. Orbits, pandemics, bridges, and profit models all yield more readily to iteration. You can model them with a few dozen lines of code more easily than with any calculation by hand. Many brilliant students hit the wall of symbolic integration and think, This is pointless. They’re half right—it is pointless to do by hand a very difficult task that all but the most ascetic professionals now perform by computer.
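Here is the kind of thing I mean, sketched in Python: a toy epidemic written as three difference equations. Every parameter below is invented for illustration; the point is that a loop, not a formula, does the predicting.

```python
# A toy epidemic, modeled as difference equations instead of
# differential ones; every number here is invented for illustration.
population = 1_000_000
infected = 100.0
susceptible = population - infected
recovered = 0.0
beta, gamma = 0.3, 0.1   # assumed daily infection and recovery rates

for day in range(1, 181):
    new_infections = beta * infected * susceptible / population
    new_recoveries = gamma * infected
    susceptible -= new_infections
    infected += new_infections - new_recoveries
    recovered += new_recoveries
    if day % 30 == 0:
        print(f"day {day:3d}: {infected:>9,.0f} infected")
```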
Infinite series succumb easily to iteration. The concept of an “infinite” series becomes powerful once you can count to nearly infinity (and yes, I know a computer reaches only an infinitesimal fraction of it, but that’s semantic nitpicking). If a series hasn’t converged by the hundred-billionth term, it probably won’t. Limits can be taught in a day or two in a computer lab: a limit is just a very small delta, and a 64-bit float can represent deltas as small as 2⁻¹⁰²². Letting students feel the beating heart of calculus is worth some inelegant reasoning.
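One way to show it, using a series whose limit happens to be known (π²/6) so the brute-force sum can be checked:

```python
import math

# Brute-force an "infinite" series: sum 1/n^2 for ten million terms
# and compare against the known limit, pi^2 / 6.
total = 0.0
for n in range(1, 10_000_001):
    total += 1.0 / (n * n)

print(f"partial sum: {total:.6f}")             # 1.644934
print(f"pi^2 / 6:    {math.pi ** 2 / 6:.6f}")  # 1.644934
```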
The student who asks “why is this the best way?” instead of dutifully performing complex hand integrations is told he lacks discipline. Yet that student is most like Newton himself—impatient, experimental, hungry for what works. Our curriculum rewards endurance over curiosity.
The tragedy is what it does to vivid minds: the ones who might love math. They are told to memorize, to derive, to wait for understanding. They learn the ritual before the power. The grinds keep going because grades are their currency. The restless, often brilliant ones drift away.
The way forward isn’t to abolish math but to reclaim its purpose. Treat symbolic calculus as heritage—something to appreciate, not endure. Much of it, especially its best symbols, is worth teaching. But the most useful parts are the ones most easily applied.
Teach that dx means “a really small slice of x,” and that dy/dx means “change in y over change in x.” Teach that ∫ₐᵇ x dx is the area under y = x between a and b, and that an antiderivative is the shortcut for computing it. Teach how to read Σ notation. That’s ninety percent of what matters in calculus once you have a computer and want to describe the world with numbers. Don’t let the hard parts of symbolic calculus deter anyone from applied math.
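All of that fits in a few lines once a loop is available. A sketch, taking ∫₁⁴ x dx as the example:

```python
# The area under y = x from a to b, computed as a sum of thin slices,
# then checked against the antiderivative shortcut (b^2 - a^2) / 2.
a, b, slices = 1.0, 4.0, 1_000_000
dx = (b - a) / slices

area = sum((a + (i + 0.5) * dx) * dx for i in range(slices))  # midpoint rule

print(f"summed slices:   {area:.6f}")                 # 7.500000
print(f"(b^2 - a^2) / 2: {(b * b - a * a) / 2:.6f}")  # 7.500000
```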
Give them loops, random numbers, regressions, and difference equations. Show that prediction isn’t magic; it’s iteration. Once they can simulate, they can think. Once they can think, they can build.
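A sketch of what that first lesson might look like, with illustrative numbers throughout: simulate noisy measurements of a known line, then recover it with least squares written as plain loops.

```python
import random

# Simulate noisy measurements of the line y = 2x + 1, then recover the
# slope and intercept with least squares written as plain loops.
random.seed(42)
xs = [0.1 * i for i in range(100)]
ys = [2.0 * x + 1.0 + random.gauss(0, 0.5) for x in xs]

mean_x = sum(xs) / len(xs)
mean_y = sum(ys) / len(ys)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
    / sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

print(f"recovered: y = {slope:.2f}x + {intercept:.2f}")  # close to y = 2x + 1
```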
I prefer partitioning to symbolic calculus. It’s more natural in two or three dimensions, and iteration makes higher dimensions almost intuitive. I hate seeing the world butchered and oversimplified to fit the limits of hieroglyphs. Making the world move on a screen—watching it respond to your parameters—isn’t cheating. It’s comprehension.
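Here is partitioning at its crudest, and that is the point: chop a square into a fine grid (the grid size is an arbitrary choice) and count the cells that land inside the unit circle.

```python
import math

# Partition the square [-1, 1] x [-1, 1] into a fine grid and add up
# the cells whose centers land inside the unit circle; the total
# approaches pi. The grid size is an arbitrary choice.
cells = 2_000
dx = 2.0 / cells
area = 0.0
for i in range(cells):
    x = -1.0 + (i + 0.5) * dx
    for j in range(cells):
        y = -1.0 + (j + 0.5) * dx
        if x * x + y * y <= 1.0:
            area += dx * dx

print(f"partitioned area: {area:.4f}")  # close to pi
print(f"pi:               {math.pi:.4f}")
```

Add a third loop and you have volume; iteration scales where hieroglyphs strain.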
I’m a criminal defense lawyer who still feels affection for math. I don’t want outdated pedagogy to turn my son away from what might become a wholesome passion.
We don’t need more students trained to copy 19th-century proofs. We need builders who can model the world, not recite it. Let the next Newton learn to loop, not to grind. The power that once lived in a quill now lives in a compiler, and it’s time the curriculum caught up.


