No, that's the joke. In O-notation you always keep only the dominant term, so O(n!²) doesn't exist; it's only O(n!).

Otherwise there would never be O(1) or O(n), because O(1) would imply the algorithm has only a single instruction, and the same would go for O(n).

T = O(n) means that there exists a single constant k such that T(n) < k·n for all sufficiently large n. Therefore O(n!^2) is not the same as O(n!), but, for example, 10·n!, 10000·n!, and n! + n^2 (note the plus) are all O(n!).
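To make that concrete, here's a quick Python sketch (my own illustration, not part of the original comment): it prints the ratio f(n)/n! for a few candidate functions. The ones that are O(n!) stay below a fixed constant, while n!^2/n! is just n!, so no single k ever works for it.

```python
import math

# Ratio f(n) / n! for a few candidate functions: if the ratio stays below
# some fixed constant k, then f is O(n!); if it grows without bound, it isn't.
for n in range(1, 11):
    fact = math.factorial(n)
    print(f"n={n:2d}  10*n!/n! = {10 * fact / fact:6.1f}"
          f"  (n!+n^2)/n! = {(fact + n**2) / fact:6.2f}"
          f"  n!^2/n! = {fact**2 // fact}")
```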

Another way to think about this: suppose you believe that O(n) and O(n^2) are distinct. Now plug in only numbers that are factorials (2, 6, 24, …): with m = n!, the pair n! and n!^2 looks exactly like m and m^2, so the two classes are just as distinct. (Rough sketch below.)
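A rough numeric version of that substitution (again just my own sketch, glossing over the formal change of variable):

```python
import math

# Substitution m = n!: at factorial inputs the pair (n!, n!^2) is exactly
# the pair (m, m^2), so separating O(n!) from O(n!^2) is the same problem
# as separating O(m) from O(m^2).
for n in range(2, 7):
    m = math.factorial(n)   # m runs through 2, 6, 24, 120, 720
    print(f"n={n}  m = n! = {m:>5d}  n!^2 = m^2 = {m * m:>10d}")
```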
