See Also
Gwern
“Can You Unsort Lists for Diversity?”, Gwern 2019
“The sort --key Trick”, Gwern 2014
“Resorting Media Ratings”, Gwern 2015
Links
“Classical Sorting Algorithms As a Model of Morphogenesis: Self-Sorting Arrays Reveal Unexpected Competencies in a Minimal Model of Basal Intelligence”, Zhang et al 2023
“Learning Transformer Programs”, Friedman et al 2023
“Tracr: Compiled Transformers As a Laboratory for Interpretability”, Lindner et al 2023
“A Generalist Neural Algorithmic Learner”, Ibarz et al 2022
“Learning With Differentiable Algorithms”, Petersen 2022
“Vectorized and Performance-Portable Quicksort”, Blacher et al 2022
“Is This the Simplest (and Most Surprising) Sorting Algorithm Ever?”, Fung 2021
“RASP: Thinking Like Transformers”, Weiss et al 2021
“Why Are tar.xz Files 15× Smaller When Using Python’s Tar Library Compared to macOS tar?”, Lindestøkke 2021
“PiRank: Learning To Rank via Differentiable Sorting”, Swezey et al 2020
“Engineering In-Place (Shared-Memory) Sorting Algorithms”, Axtmann et al 2020
“Sparse Sinkhorn Attention”, Tay et al 2020
“Fast Differentiable Sorting and Ranking”, Blondel et al 2020
“Stochastic Optimization of Sorting Networks via Continuous Relaxations”, Grover et al 2019
“In-Place Parallel Super Scalar Samplesort (IPS4o)”, Axtmann et al 2017
“Programming With a Differentiable Forth Interpreter”, Bošnjak et al 2016
“BlockQuicksort: How Branch Mispredictions Don’t Affect Quicksort”, Edelkamp & Weiß 2016
“Adaptive Computation Time for Recurrent Neural Networks”, Graves 2016
“Pointer Networks”, Vinyals et al 2015
“Neural Turing Machines”, Graves et al 2014
“How Inefficient Can a Sort Algorithm Be?”, Lerma 2014
“Sorting from Noisy Information”, Braverman & Mossel 2009
“SimHash: Hash-Based Similarity Detection”, Sadowski & Levin 2007
“Noisy Sorting Without Resampling”, Braverman & Mossel 2007
“Noisy Binary Search and Its Applications”, Karp & Kleinberg 2007
“Proving 50-Year-Old Sorting Networks Optimal: Part 1”
“Zero Tolerance for Bias”
Sort By Magic
Annotations sorted by machine learning into inferred 'tags'. This provides an alternative way to browse: instead of by date order, one can browse in topic order. The 'sorted' list has been automatically clustered into multiple sections & auto-labeled for easier browsing.
Beginning with the newest annotation, it uses the embedding of each annotation to attempt to create a list of nearest-neighbor annotations, creating a progression of topics. For more details, see the link; a rough sketch of the ordering procedure is given after the tag list below.
compression
transformer-lab interpretability
differentiable-sorting fast-sort adaptive-sort vectorized-sort neural-sort
hashing
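The ordering described above is essentially a greedy nearest-neighbor walk through embedding space. A minimal illustrative sketch in Python (not the site’s actual implementation; `sort_by_similarity`, `titles`, and `vecs` are hypothetical names, and each annotation is assumed to already have an embedding vector):

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def sort_by_similarity(annotations, embeddings):
    """Greedy nearest-neighbor ordering: start from the newest annotation
    (index 0), then repeatedly append the not-yet-used annotation whose
    embedding is most similar to the last one chosen."""
    remaining = list(range(len(annotations)))
    order = [remaining.pop(0)]  # newest annotation first
    while remaining:
        last = embeddings[order[-1]]
        nearest = max(remaining, key=lambda i: cosine(embeddings[i], last))
        remaining.remove(nearest)
        order.append(nearest)
    return [annotations[i] for i in order]

# Toy example with 3-dimensional embeddings, newest annotation first:
titles = ["Fast Differentiable Sorting", "Sparse Sinkhorn Attention", "SimHash"]
vecs = [np.array([1.0, 0.1, 0.0]),
        np.array([0.9, 0.2, 0.1]),
        np.array([0.0, 0.1, 1.0])]
print(sort_by_similarity(titles, vecs))
# -> ['Fast Differentiable Sorting', 'Sparse Sinkhorn Attention', 'SimHash']
```

A single greedy pass like this only approximates a good reading order; finding the globally best “topic tour” would be a traveling-salesman-style problem.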
Wikipedia
Miscellaneous
- https://ae.iti.kit.edu/documents/people/sanders/papers/KalSan06.pdf
- https://buttondown.email/hillelwayne/archive/when-would-you-ever-want-bubblesort/
- https://danlark.org/2022/04/20/changing-stdsort-at-googles-scale-and-beyond/
- https://github.com/amargaritov/starlit#starlit-algorithm-description
- https://opensource.googleblog.com/2022/06/Vectorized%20and%20performance%20portable%20Quicksort.html
- https://randomascii.wordpress.com/2021/02/16/arranging-invisible-icons-in-quadratic-time/
- https://timepedia.blogspot.com/2009/08/on-reducing-size-of-compressed.html
- https://timepedia.blogspot.com/2009/11/traveling-salesman-problem-and.html
- https://wrap.warwick.ac.uk/61087/7/WRAP_cs-rr-360.pdf#page=2
- https://www.antoniomallia.it/sorted-integers-compression-with-elias-fano-encoding.html
Bibliography
- https://arxiv.org/abs/2106.06981: “RASP: Thinking Like Transformers”, Weiss et al 2021