Article

Gács-Kučera theorem

Journal

THEORETICAL COMPUTER SCIENCE
Volume 929, Pages 172-173

Publisher

ELSEVIER
DOI: 10.1016/j.tcs.2022.06.040

Keywords

Randomness; Complexity; Reducibility


Abstract

The Gács-Kučera theorem [2,3,5], tightened by Barmpalias and Lewis-Pye [1], wtt-reduces (weak truth-table) each infinite sequence to a Kolmogorov/Martin-Löf random one and is broadly used in various areas of mathematics and computer science. Its early proofs are somewhat cumbersome, but using some general concepts yields a significant simplification, illustrated below. (c) 2022 Elsevier B.V. All rights reserved.
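Read formally, the statement in the abstract amounts to the following (a sketch in the abstract's own terms; the notation $\{0,1\}^{\omega}$ for infinite binary sequences and $\le_{\mathrm{wtt}}$ for weak truth-table reducibility is assumed, not taken from the page):

$$\forall X \in \{0,1\}^{\omega}\ \exists R \in \{0,1\}^{\omega}:\ R \text{ is Martin-Löf random and } X \le_{\mathrm{wtt}} R.$$

That is, every infinite sequence can be computed from some algorithmically random sequence by a reduction whose oracle use is computably bounded; the Barmpalias and Lewis-Pye result [1] tightens the bounds on this reduction.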

