Forrester CEO George Colony just posted a warning for web-focused companies and strategists – the web is dead, and apps are where it’s at.
I agree with him: I see an onrushing shift from dumb, flat, cloud-based web pages to the exciting, interactive and powerful apps we’re now loading onto our phones and tablets.
But this “new world” is, in fact, nothing new. It’s simply the middle tick of a pendulum that’s been madly swinging since the dawn of computing – oscillating between centralized computing and local control. Back in the early nineties we would have called this mid-point client-server computing – the beginning of a decentralized model of building programs that was derailed by HTML and the WWW, which were themselves a throwback to mainframe-based time-sharing.
A little history puts this supposedly new trend in perspective, and can offer some guidance on the power and pitfalls of the latest pendulum swing.
Back when computing first emerged, with machines like ENIAC, computers were simply too expensive for any one person to monopolize. IBM’s mainframe business rose to prominence because its machines could time-share: many, many computing jobs ran on the same huge machine, each getting a small slice of the system’s resources and processor time.
These central systems were fronted by a web of dumb terminals – video screens no smarter than a TV, which sucked, remora-like, off the tiny time-slices allocated to them. They displayed alphanumeric characters, but couldn’t do graphics or, indeed, any sort of local processing.
Over time those terminals got a bit smarter, and front-end software began to enable more interactive forms by treating each screen as a panel. In 1974 one of the most popular of these, ISPF, offered some semblance of local control via programmable function keys – which is why we still have them on our PCs today. But all the processing power remained in the central mainframe – the server, or what we might call the cloud today.
Then the personal computer exploded onto the scene in the late 70s and early 80s, representing a complete swing of the pendulum from shared to local control. Early PCs were self-contained units that ran local programs – or applications – and processed everything locally. The first age of intelligent local apps dawned, with MultiMate, VisiCalc, Lotus 1-2-3, and Microsoft Office emerging as big winners. It wasn’t until the mid-80s that PCs started routinely connecting to bigger computers, and to each other, over phone lines and local area networks – and the pendulum started swinging backwards.
By the late 80s, networks of PCs were starting to talk to bigger computers, known as servers. A new model of computing developed, where smart front-end programs ran on local PCs and performed much of the work – but offloaded some processing, and almost all data storage, to SQL database servers located either down the hall or across the country.
The client-server model tried to balance the power of intelligent local machines against the time-sharing model of centralized computing, storage and data manipulation. Ultimately, though, many systems failed due to an inability to scale – the complexity of transaction queuing and synchronization proved too daunting.
And suddenly the web was upon us. The browser – which was nothing more than an early ISPF-style panel front-ending a centralized computer – took off, enabled by a universal network and a protocol that allowed every computer – and every page (or panel) – on the extended network to be linked to every other one.
Thus the last 15 years of computing have been dominated by a very mainframe-like model: relatively unintelligent front-ends (the browser) and smart back ends that store the data and contain the complexity. But as the browser became more capable, developers started to take advantage of the power of the client (your Mac or PC), building more complex local apps – aka plug-ins – and the pendulum started swinging back again.
And now we’ve come full circle. Local, intelligent apps, backed by powerful processing and data storage in the cloud, are the new black. The good news is that we’ve worked through a number of the back-end complexity issues as we built bigger and bigger web servers, and the local-versus-cloud process isolation problems are better understood.
I’m a big fan of local applications: they run faster, allow more creative solutions, and can lead to more intuitive apps. Connect them to the wide variety of servers on the internet, and you’ve got an even better chance of building something amazing. But none of this is new. Screen scraping, time-sharing, client-server, cloud computing, desktop computing, browser computing and now the app-internet are all just ticks on the arc of the pendulum. And that pendulum is finally swinging back to the center.
Remember the other failings of client-server? Will the pendulum swing all the way back to completely local control? Let me know what you think in the comments.