Algorithms for the 21st Century

Thursday, September 28, 2006 - 7:00pm
Steve Johnson, MathWorks

Many of the algorithms we use today were designed to improve performance on machines that were much smaller and slower than today's. The analysis of, and justification for, these algorithms usually rests on assumptions that no longer hold when today's machines process large datasets. Examples of these assumptions are that all memory references are equally costly, that memory access doesn't matter in analyzing algorithms, and that caches make small strides more efficient than large strides. The talk provides more questions than answers, but shows empirical data that raises many questions about traditional data structure design for large data.

Lecturer Biography: 

Steve worked for Bell Labs for almost 20 years, where he wrote Yacc, lint, and the Portable C compiler, and co-authored the first AT&T 32-bit Unix port. He then spent 15 years in Silicon Valley working for several startups including Ardent Computer and Transmeta. For the last 4 1/2 years, he has worked on the MATLAB language at The MathWorks in Natick, MA.