O(1) - read as "O of one" - denotes constant time: no matter how much data you put into the algorithm or program, the execution time stays roughly the same.
Good examples are simple assignments, basic arithmetic, comparisons, and accessing a list element by index.
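A minimal sketch of the last example: indexing into a Python list is O(1), so the lookup takes the same time whether the list holds three items or a million.

```python
def first_item(items):
    # Index access jumps straight to the element; nothing is scanned,
    # so the cost does not grow with len(items).
    return items[0]

small = [1, 2, 3]
large = list(range(1_000_000))

# Both calls perform the same constant amount of work.
print(first_item(small))  # -> 1
print(first_item(large))  # -> 0
```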
### See also
1. [[Big O Notation]]