Informally, the logical depth of a string x to a significance level s is the time required to compute x by a program no more than s bits longer than the shortest program that computes x.[1]
Formally, let p* be the shortest program that computes a string x on some universal computer U. Then the logical depth of x to the significance level s is given by

D_s(x) = min{ T(p) : (|p| − |p*| ≤ s) ∧ (U(p) = x) },

where T(p) is the number of computation steps that p made on U to produce x and halt.
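The definition can be illustrated with a small sketch. Logical depth is uncomputable in general (it quantifies over all halting programs), so the example below assumes a hypothetical, pre-tabulated finite set of programs for a toy machine, each given as a (code, output, steps) triple; the data values are invented for illustration only:

```python
def logical_depth(x, programs, s):
    """Toy logical depth of string x at significance level s.

    programs: iterable of (code, output, steps) triples, where code is a
    bitstring for a hypothetical machine and steps is its halting time.
    This enumerates a fixed finite table -- true logical depth cannot be
    computed this way.
    """
    # Keep only programs that produce x.
    producers = [(code, steps) for code, out, steps in programs if out == x]
    if not producers:
        return None
    # |p*|: length of the shortest program producing x.
    min_len = min(len(code) for code, _ in producers)
    # Minimum running time over programs at most s bits longer than p*.
    return min(steps for code, steps in producers
               if len(code) - min_len <= s)

# Invented example table: a short-but-slow and a longer-but-fast
# program both print "xx".
programs = [
    ("0", "xx", 20),
    ("110", "xx", 3),
    ("10", "yy", 5),
]

print(logical_depth("xx", programs, 0))  # only the shortest program counts
print(logical_depth("xx", programs, 2))  # longer, faster program now allowed
```

At s = 0 only the minimal program is admitted, so the depth is its (slow) running time; raising s admits the slightly longer program, which lowers the depth to its faster running time. This mirrors how the significance level trades description length against computation time in the definition.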
Bennett, Charles H. (1988), "Logical Depth and Physical Complexity", in Herken, Rolf (ed.), The Universal Turing Machine: a Half-Century Survey, Oxford U. Press, pp. 227–257, CiteSeerX 10.1.1.70.4331.