Logical depth is a measure of complexity for individual strings, devised by Charles H. Bennett, based on the computational resources needed to recreate a given piece of information. It differs from Kolmogorov complexity in that it considers the running time of a program of nearly minimal length that produces the string, rather than the length of the minimal program itself.
Formally, in the context of some universal computer $U$, the logical depth of a string $x$ to significance level $s$ is given by

$$\min\{T(p) : (|p| - |p^{*}| < s) \wedge (U(p) = x)\},$$

the running time $T(p)$ of the fastest program $p$ that produces $x$ and is no more than $s$ longer than the minimal program $p^{*}$.
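Logical depth is uncomputable in general, since the minimum ranges over all programs, including non-halting ones. The definition can nonetheless be illustrated at toy scale. The following Python sketch, an illustration rather than anything from Bennett's paper, uses a small Brainfuck interpreter as a stand-in universal computer $U$ and brute-forces the bounded analogue of the formula above over all programs up to a fixed length and step budget; the names, the bounds `max_len` and `max_steps`, and the target string are assumptions chosen for demonstration.

```python
import itertools

def run_bf(prog, max_steps=1000):
    """Interpret a Brainfuck program with no input.

    Returns (output, steps) on normal termination, or None if the
    program is malformed or exceeds the step budget."""
    stack, match = [], {}
    for i, c in enumerate(prog):
        if c == '[':
            stack.append(i)
        elif c == ']':
            if not stack:
                return None          # unbalanced ']'
            j = stack.pop()
            match[i], match[j] = j, i
    if stack:
        return None                  # unbalanced '['
    tape, ptr, pc, steps, out = [0] * 64, 0, 0, 0, []
    while pc < len(prog):
        if steps >= max_steps:
            return None              # treat as non-halting
        c = prog[pc]
        if c == '+':
            tape[ptr] = (tape[ptr] + 1) % 256
        elif c == '-':
            tape[ptr] = (tape[ptr] - 1) % 256
        elif c == '>':
            ptr = (ptr + 1) % len(tape)
        elif c == '<':
            ptr = (ptr - 1) % len(tape)
        elif c == '.':
            out.append(chr(tape[ptr]))
        elif c == '[' and tape[ptr] == 0:
            pc = match[pc]           # skip the loop body
        elif c == ']' and tape[ptr] != 0:
            pc = match[pc]           # jump back to the loop head
        pc += 1
        steps += 1
    return ''.join(out), steps

def bounded_logical_depth(x, s, max_len=5, max_steps=1000):
    """min{T(p) : |p| - |p*| < s and U(p) = x}, restricted to programs
    of length <= max_len that halt within max_steps steps."""
    hits = []                        # (|p|, T(p)) for every p with U(p) = x
    for n in range(1, max_len + 1):
        for cmds in itertools.product('+-<>[].', repeat=n):
            result = run_bf(''.join(cmds), max_steps)
            if result is not None and result[0] == x:
                hits.append((n, result[1]))
    if not hits:
        return None                  # x not produced within the bounds
    shortest = min(n for n, _ in hits)   # |p*| within this search space
    return min(t for n, t in hits if n - shortest < s)

if __name__ == '__main__':
    x = chr(3)                       # target string: the single byte 0x03
    for s in (1, 2, 3):
        print(f's={s}: bounded depth = {bounded_logical_depth(x, s)}')
```

For the one-byte target chr(3), the shortest producer in this search space is "+++.", so the programs admitted at each significance level are those at most $s-1$ symbols longer than it, and the reported depth is the fewest interpreter steps among them.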
See also
Effective complexity
Self-dissimilarity
Forecasting complexity
Sophistication (complexity theory)
References
Bennett, Charles H. (1988), "Logical Depth and Physical Complexity", in Herken, Rolf (ed.), The Universal Turing Machine: a Half-Century Survey, Oxford University Press, pp. 227–257, CiteSeerX 10.1.1.70.4331
Craig, Edward (1998), "Computability and Information, Section 6: Logical depth", Routledge Encyclopedia of Philosophy, Vol. 10: Index, Taylor & Francis, p. 481, ISBN 9780415073103