I think the point being made is that the quality of software hasn't kept pace with the enormous gains in hardware. For example, the Web is progress in some sense, but it's a pretty ugly solution, made from poorly conceived "standards" all held together with duct tape. Alan Kay also has this view:
http://www.drdobbs.com/architecture-and-design/interview-with-alan-kay/240003442
Quote:
The Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs.
I think, though, that there has undeniably been some progress (consider the Internet and networking in general). We can also look at the state of Unix systems. Back in the '80s, scripting would have been done with the Bourne shell and awk. These days we also have very popular interpreted languages like Python and Ruby. These fill the big gap between C and simple shell scripts, and help us avoid the ugliness of Perl.
Languages like Go also show real progress in concurrent programming, and the same might be said of Erlang. So there is substantial progress, but it happens fairly slowly, and most people are still happy to reinvent the wheel for the Nth time.
The big failure, in my opinion, has been in mainstream graphical programs like web browsers. There is still no elegant way to build a GUI application: no clear set of primitives from which one can be composed in a modular way. GUI applications are still ugly, and little progress has been made here.