What has happened is that, as the web has become mainstream among developers, the technology has come to be well understood. The positive side is that coding to web standards and building robust applications is now commonplace. The negative side is that applications have become constrained by the current state of the art; with design patterns, best practices, and architectures essentially settled, there is less and less innovation.
Much of that lack of innovation can be traced to standards bodies. The web clearly became popular and powerful through adherence to standards: standards made interoperable browsers possible, everyone agreed (at one level or another) to work together, and the web is the result. The problem is that we continue to look to standards bodies for innovation. This has obviously not worked, as the progression of the web has once again been driven by innovative ideas and libraries rather than by new standards.
The issue now is that the web is ready for a true version 2.0 of its de facto APIs, but we are having trouble moving past second-system syndrome. The standards have only grown more bloated while adding little value, and there is now a legacy burden that did not exist when the web first emerged. In general, it is hard to see where the web is headed and how to get there.
At the end of the day, every developer can help improve the web simply by trying something new. The web matters because it has changed the lives of its users, and improving it is far from an entirely technical task. Innovative user interface ideas and new ways of working with data all advance the state of the art and, more importantly, make the web better for users. Even though some of the web's core technologies are showing their age, that doesn't mean we as developers can't keep working to make it more useful.