The Betty White Award for 2022
Computational Complexity Blog
4d ago
In Dec 2021 I noted in this post, which was my 1000th post ever (according to Ken Regan, see here), that Betty White had the misfortune of dying on Dec 31, 2021, AFTER the People we said goodbye to in 2021 articles had already appeared. That might not be quite right, since when she died it was already Jan 1 SOMEWHERE in the world. I learned a new phrase- AoE- Anywhere on Earth. But no, she was not mentioned in the People we said goodbye to in 2022 articles either. The Betty White Award goes to a celebrity who had the same fate- dying too late in the year to be mentioned in those artic…
Meta-Complexity
Computational Complexity Blog
1w ago
I'm sure many of you long-time readers are asking, "Why all this big focus on machine learning in your posts and tweets? You are the 'Computational Complexity' blog! You've barely said a word about meta-complexity." So what is meta-complexity? From what I can tell, the term goes back a few years but really came into wide use in computational complexity in the past year. The Computational Complexity Conference held an invited talk on meta-complexity by Rahul Santhanam, and the Simons Institute is hosting a research program this spring on the topic. As the name suggests, meta-complexity studies t…
ChatGPT tried to write an obit for Martin Davis. What it got right, wrong, and what to make of it.
Computational Complexity Blog
1w ago
When Martin Davis passed away, Lance emailed me what he got from using ChatGPT to do an obit. Here it is, and I also note what it got wrong. ------------------------------------------------------------------------- Born in 1928 in Brooklyn, New York, Davis received his bachelor's degree from Brooklyn College in 1950 [WRONG-1948] and his Ph.D. from the University of Chicago in 1953 [WRONG- it was Princeton in 1950]. He went on to have a distinguished career, with positions at the Institute for Advanced Study, the University of California, Berkeley, and New York University, where he spent t…
Semantic Search for the Blog
Computational Complexity Blog
2w ago
As Google started to limit academic storage, I started looking at Google Takeout and wondering what I could do with all that data. I downloaded all the posts from the blog, since we use Google's Blogger, and ran them through OpenAI's Ada embedding. The Ada embedding maps text of up to 8192 tokens into a point on the 1536-dimensional unit sphere. You can measure the similarity between two embeddings via a simple dot product, giving you the cosine of the angle between them. So I created a semantic search for the blog. Go ahead and try it out. You can enter a sear…
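Since the post describes the search as a dot product of unit-length embeddings, here is a minimal sketch of that ranking step, assuming the post embeddings have already been fetched (e.g., from OpenAI's embedding endpoint) and stored as unit vectors; the function and variable names below are illustrative, not the blog's actual code.

```python
# Minimal sketch: rank blog posts against a query embedding. Because the
# embeddings are assumed to be unit-length, cosine similarity reduces to a
# plain dot product. Fetching the embeddings themselves happens elsewhere.
import numpy as np

def top_matches(query_emb, post_embs, titles, k=5):
    """Return the k titles whose embeddings are closest to the query embedding."""
    query = np.asarray(query_emb)      # shape: (1536,)
    posts = np.asarray(post_embs)      # shape: (num_posts, 1536)
    sims = posts @ query               # dot products = cosines for unit vectors
    best = np.argsort(sims)[::-1][:k]  # indices of the k largest similarities
    return [(titles[i], float(sims[i])) for i in best]
```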
Martin Davis Passed Away on Jan 1, 2023
Computational Complexity Blog
2w ago
As you probably already know from other sources, Martin Davis passed away on Jan 1, 2023, at the age of 94. His wife Virginia died a few hours later. He majored in math at Brooklyn College and graduated in 1948. He got his PhD under Alonzo Church at Princeton in 1950. Two years for a PhD seems fast! His PhD was on recursion theory. In it he conjectured that Hilbert's Tenth Problem (below) is undecidable. He is known for the following (if you know more, please leave a comment). 1) Hilbert's Tenth Problem. Hilbert posed 23 problems in the year 1900 for mathematicians to work on ove…
Positional Encoding
Computational Complexity Blog
3w ago
Given the excitement over ChatGPT, I spent part of the winter recess trying to understand the underlying technology of Transformers. After trying various tutorials, I found the best explanation comes in the original 2017 paper, Attention Is All You Need. This is my attempt to figure out positional encoding in transformers, with some naive questions. Let me know if I'm completely off base. Earlier models like Recurrent Neural Nets and Convolutional NNs worked under the assumption that information close by was more likely to be correlated. Machine learning seems to improve as the models make fewe…
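For readers who want to see what the encoding actually computes, here is a small sketch of the sinusoidal positional encoding defined in Attention Is All You Need, where PE(pos, 2i) = sin(pos / 10000^(2i/d_model)) and PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model)); the code is my own illustration, not taken from the paper, and it assumes d_model is even.

```python
# A small sketch of the sinusoidal positional encoding from "Attention Is All You Need":
#   PE(pos, 2i)   = sin(pos / 10000^(2i / d_model))
#   PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))
# Assumes d_model is even.
import numpy as np

def positional_encoding(seq_len, d_model):
    """Return a (seq_len, d_model) array of encodings to add to token embeddings."""
    positions = np.arange(seq_len)[:, None]                 # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]                # even dimensions 0, 2, 4, ...
    angles = positions / np.power(10000.0, dims / d_model)  # (seq_len, d_model/2)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                            # sines at even indices
    pe[:, 1::2] = np.cos(angles)                            # cosines at odd indices
    return pe
```

Because the frequencies vary geometrically across dimensions, nearby positions get similar vectors while distant positions diverge, which is how the model recovers order without recurrence.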
Complexity Year in Review 2022
Computational Complexity Blog
1M ago
Complexity result of the year goes to NP-Hardness of Learning Programs and Partial MCSP by Shuichi Hirahara. Consider the following version of Occam's Razor: find the smallest program consistent with your data. Hirahara shows that this problem is NP-hard under randomized reductions. The paper also takes a major step toward understanding the complexity of the Minimum Circuit Size Problem, a major area of study in computational complexity over the past few years. Runner-up goes to Maximum Flow and Minimum-Cost Flow in Almost-Linear Time by Li Chen, Rasmus Kyng, Yang P. Liu, Richard Peng, Maxi…
Voter Suppression, Harvard Style
Computational Complexity Blog
1M ago
The following appeared on Harry Lewis's blog, here, hence it is written in his voice, though it is co-authored. You'll see why later. I then have some comments on it. ------------------------------------------ Voter Suppression, Harvard-Style (This piece is jointly authored by Harry Lewis (Harvard) and Bill Gasarch (University of Maryland at College Park). Harry was Bill's PhD advisor.) There are elections in Hong Kong, but to get on the ballot you have to be nominated by a committee controlled by the Beijing government. Elections for the Harvard Board of Overseers—one of Harva…
Harry Lewis's talk on The Birth of Binary on Dec 8 (That's today!)
Computational Complexity Blog
1M ago
More information on Harry Lewis's talk is in his blog post about it: here. 1) On Dec 8, 2022, Harry Lewis is giving the annual Thoralf Skolem Memorial Lecture on The Birth of Binary. It's in Oslo. BOO- I can't get there! It will be on Zoom- YEAH, I can see it! Here is the link: here. It will be at 7:15 AM East Coast time- BOO- I need my sleep! (Plus, I am posting this after it's over…
The Future of Education is Personal
Computational Complexity Blog
1M ago
With all the excitement about ChatGPT, how will machine learning disrupt education, say five to ten years down the road? My guess: individualized tutors. Imagine a tutor working with you, teaching you important concepts, walking you through examples, answering your questions, and going at your own pace, like the Oxford system. The Oxford tutorial system doesn't scale well, at least not with human tutors. But we can scale it using machine learning, and we're not far from being able to do so. Such tutors will be infinitely patient, have full knowledge of all written materi…
