Meme of two women fighting while a man smokes a pipe in the background.
The women fighting are labeled “mathematicians defining pi” and “engineers just using 3 because it’s within tolerance”
The man smoking is labeled “astrophysicists” and the pipe is labeled “pi = 1”
Is it actually? I'll admit I'm pretty rusty on time complexity, but naively I'd think that pi being irrational would technically make even reading or writing it from memory an undecidable problem.
It all depends on the precision you need. You could evaluate an infinite series out to whatever precision you want, but for most use cases it's just a double baked into the binary itself, hence O(1).
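To make "baked into the binary" concrete, here's a quick Python sketch (Python's `math.pi` is exactly this kind of constant): what's stored isn't pi at all, it's a nearby rational with a power-of-two denominator.

```python
import math

# math.pi is a fixed 64-bit double: a rational approximation chosen once, not pi itself
print(math.pi)                     # 3.141592653589793
print(math.pi.as_integer_ratio())  # (884279719003555, 281474976710656), i.e. m / 2**48
```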
It's usually a constant (or several constants, at varying precisions and sizes).
If you’re trying to calculate it, then it’s quite difficult.
If you just want to use it in a computer program, most programming languages provide it as a constant you can request. You get to pick whether you want single or double precision, but either way loading it is a single instruction on modern computers.
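Rough sketch of that precision choice in Python (numpy is an assumption here, used only to get a 32-bit float type, since Python's own floats are always doubles):

```python
import math
import numpy as np  # assumption: numpy is available, used only for a single-precision type

pi_double = math.pi              # Python floats are IEEE 754 double precision
pi_single = np.float32(math.pi)  # the same constant rounded to single precision

print(f"{pi_double:.20f}")         # 3.14159265358979311600 (good to ~16 digits)
print(f"{float(pi_single):.20f}")  # 3.14159274101257324219 (good to ~7 digits)
```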
Do said instructions produce pi, though, or just a functional approximation of pi? I absolutely buy that approximate pi is O(1), but it still seems like a problem involving a true irrational number should be undecidable on any real Turing machine.
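To quantify the gap (a small sketch, assuming the mpmath library is installed for arbitrary-precision arithmetic):

```python
import math
from mpmath import mp  # assumption: mpmath is installed

mp.dps = 30             # carry 30 decimal digits
print(mp.pi)            # 3.14159265358979323846264338328
print(mp.pi - math.pi)  # about 1.2246e-16: the error permanently baked into the double
```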
What would be the “n” in that Big O notation, though?
If you're saying you want accuracy out to n digits, then there are algorithms with specific complexities for computing that. But the result is still just an approximation, so those are no better than the real-world approach of simply looking the constant up rather than calculating it anew.
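As one concrete (simplified) instance of "an algorithm with a specific complexity": Machin's formula, pi/4 = 4·arctan(1/5) − arctan(1/239), evaluated in fixed-point integer arithmetic, gets n digits in roughly O(n²) time (O(n) series terms, each touching O(n)-digit integers). Record-setting computations use faster schemes, e.g. Chudnovsky with binary splitting at roughly O(n log³ n).

```python
def pi_digits(n):
    """Return floor(pi * 10**n): the digit 3 followed by the first n decimals of pi.
    Machin's formula, pi/4 = 4*arctan(1/5) - arctan(1/239), in fixed point."""
    guard = 10                 # extra digits to absorb truncation error
    unity = 10 ** (n + guard)  # this integer represents 1.0

    def arctan_inv(x):
        # arctan(1/x) = 1/x - 1/(3*x**3) + 1/(5*x**5) - ..., scaled by unity
        total = term = unity // x
        k, sign = 3, -1
        while term:
            term //= x * x
            total += sign * (term // k)
            k, sign = k + 2, -sign
        return total

    return (4 * (4 * arctan_inv(5) - arctan_inv(239))) // 10 ** guard

print(pi_digits(50))
# 314159265358979323846264338327950288419716939937510
```

The ten guard digits soak up the per-term truncation error from the integer divisions; without them the last few digits could come out wrong.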
I guess n would go to infinity in the limit I'm after. I'm coming at this from a “musing about theoretical complexity” angle rather than from actually needing to use, or know how to use, pi on modern systems.
For the record, I realize how incredibly pedantic I’m being about the difference between the irrational pi and rational approximations of pi that end up being actually useful. That being said, computational complexity has enough math formalism stink on it that pedantry seems encouraged