The Golden Age of Web Application Development.

In a recent episode of This American Life, host Ira Glass suggested that we are now living through a “Golden Age” of television, a time when the major networks are responding to increasing competition with greater experimentation and better quality programming overall. He suggested that we may not recognize we are living through this era until after it is over and we are left to ponder fond memories.

I suggest that we are experiencing a similar “Golden Age” of Web application development. I think it’s been going on for some time now, taking shape in a recognizable form in 2005 and really hitting its stride in 2006. And I suspect it may have peaked already, and that in a few years, some of us may look back on this time and romanticize it.

Perhaps the two most recognizable developments in this Golden Age are our evolving definition of Web 2.0 (of which social networking, tagging, and rich AJAX interfaces are probably the most recognizable components) and the popularity and subsequent impact of Ruby on Rails and its clones.

Other signposts of this era include the increasing use and acceptance of interpreted languages over compiled languages and the formulation of REST. When I think about the way that concepts from Web 2.0 applications, Rails and REST have converged in my consciousness as a Web application developer, I am shocked to realize that after more than a decade in this field, I am only now beginning to feel that I really understand how to design and build these types of applications. As some have suggested, perhaps we are really just beginning to define “Web 1.0”.

Still other general trends have emerged to foster this Golden Age. Linux has seen enough acceptance on both the desktop and the server that we aren’t as preoccupied as we once were with questions like “Is Linux ready for the desktop?” or “Is Linux ready for the enterprise?” I’ve also noticed that whereas in the not-too-distant past, conservative managers were skeptical of adopting any open source software, the pendulum has swung hard the other way, and these same managers are now prejudiced against proprietary software. I can barely remember the last time I heard ASP or ColdFusion taken seriously as a possible development platform, and now, when weighing technology choices, a proprietary license is usually acknowledged as a risk of adoption.

But it seems that in this Golden Age, many managers and developers alike are overwhelmed by the changes and the choices confronting them in many areas. Perhaps the most visible technology alternatives currently confronting us are the dozens of Web application frameworks being actively developed across all the major languages in use. Even within Ruby there are other frameworks besides Rails, although they don’t get nearly as much attention. As I’ve said in this forum in the past, I think that being able to choose from many good alternatives is a nice problem to have, but it seems that many don’t share this perspective and might not characterize this era as a Golden Age at all, but as a time of uncomfortable upheaval.

I think this is just a classic example of “analysis paralysis”. People are uncomfortable with the number of options available to them, and when the moment comes to choose just one option and discard the others, the fear of making the wrong choice is overwhelming.

I’d like to offer a different view.

It’s my belief that the very difficulty of a decision is evidence that the perceived merits and shortcomings of the available options are so closely balanced that the negative consequences of choosing any one alternative are likely to be minimal compared to those of choosing any other. The real advantage for you and your organization lies in making a reasoned choice promptly, when the time is right, rather than wasting time trying to predict what will ultimately prove to be the best choice.

Not only is predicting the future notoriously error-prone, but so is interpreting events of the past. For a non-technological example, consider the controversy surrounding President Bush’s decision to invade Iraq and the subsequent handling of that country’s occupation. Or, read the nakedly and unapologetically racist accounts of the perceived failures of Reconstruction (the policy of reforming the South after the American Civil War) written in the early 20th century by historians respected at the time. The truth is that we can’t even know what consequences and ramifications a different decision made in the past might have produced. We are unlikely to ever know what the very best decision might be or could have been, and that’s OK. The important thing is to act, to make a decision, when the time is right to do so, and thereby begin to benefit from that particular decision as much as possible, as quickly as possible.

And I suspect you already know how to make reasoned technology choices for yourself and your organization. You are attracted to the productivity of Ruby on Rails, but you work in a PHP shop? Then a Rails clone like CakePHP may be a reasonable, incremental, low-risk choice palatable to your organization, even if it doesn’t have the same level of hype as the framework that inspired it. Don’t get hung up on emotionally charged topics of questionable value, like performance benchmarks or religious debates around programming languages; instead, make prudent decisions for you and your organization just as you have done in the past. Trust that process and trust your own judgment. After all, rendering these judgments is a fundamental part of your job.

I think that we tend to overestimate the amount of risk in these types of decisions. Every decision involves some risk, and on the one hand, the pace of change seems to increase these risks because we can’t see very far ahead into the future with much certainty. But on the other hand, the pace of technological change can actually mitigate the impact of our technology decisions. Are you worried that if you adopt CakePHP instead of Ruby on Rails, you will regret it in some way? Well, the truth is, even in the worst case scenario, you won’t have to regret it for long. In a few years, the technology landscape will be significantly different than it is today and you will have the opportunity to make the same decisions all over again, and you will have another chance to get it right.

So, go ahead and use CakePHP without worrying that you are somehow missing the Rails boat. You’ll be using a similar approach, learning similar concepts, and you will still be in a good position to choose from the next generation of Web application frameworks in a few years. What’s that? You think Ruby on Rails will still be the darling of the Web development community a few years from now? I seem to recall having similar thoughts when I first learned Struts! (How can it get any easier than this?) You might be right, but I wouldn’t bet on it.
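
If it helps to see the parallel, here is a minimal sketch of a CakePHP controller in the 1.x style of this era. The PostsController and its Post model are hypothetical examples, and the exact method names have varied between CakePHP releases, but the Rails-inspired convention-over-configuration approach comes through clearly:

    <?php
    // A minimal, hypothetical CakePHP 1.x-era controller. As in Rails,
    // the framework infers the Post model and the matching view
    // templates from naming conventions, so almost no configuration
    // is required.
    class PostsController extends AppController
    {
        var $name = 'Posts';

        // Answers /posts/index and hands a list of posts to the
        // conventionally located index view.
        function index()
        {
            // findAll() queries the conventionally named posts table.
            $this->set('posts', $this->Post->findAll());
        }

        // Answers /posts/view/42, with 42 arriving as $id.
        function view($id = null)
        {
            $this->Post->id = $id;
            $this->set('post', $this->Post->read());
        }
    }
    ?>

Swap the PHP syntax for Ruby and you have, almost line for line, the corresponding Rails controller, which is exactly why the concepts you learn in one transfer so readily to the other.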

Change is not only constant, it’s healthy, and it keeps our work interesting. You should be comfortable with the knowledge that at any given time, you are likely to be replacing at least one tool in your personal bag of tricks. It may not always be as fundamental as your programming language or your application framework; it might be one of your libraries, your preferred IDE, a new version control system or something else, but it will always be something. Whenever the benefits of using a new tool, minus the costs of learning it, are perceived to be greater than the benefits of continuing to use your current tool, you should not be afraid to act on the first reasonable opportunity to adopt it.

Sometimes the decision to change will be made for you. For example, the rest of your team may decide to adopt CakePHP when you wanted to learn Ruby on Rails, or you may go to work in an organization that uses PostgreSQL instead of your pet database, MySQL. More than once, I have seen decisions made entirely because of office politics and self interest, such as when an organization spent over a quarter of a million dollars a year on licensing and support for a content management system written in ColdFusion, because the manager who made the decision was friends with the owner of the company that made the software. In these cases, no technical, business, or organizational review is even attempted.

These situations can be very frustrating, but they do happen, so my advice is just to get over it. In cases when you feel forced to learn something new, remember that you are still learning, you will see things from a different perspective and you will be a better developer as a result. (OK, maybe not with a ColdFusion system, but in most cases.) The time you take to learn something new is an investment in your personal and professional growth that adds value to your organization and future potential employers.

Managers are notorious for wanting to decide on “THE ARCHITECTURE” that their organization will support in perpetuity, believing that the investment of time and staff resources can be maximally leveraged by freezing as many technology decisions as possible for as long as possible. For some reason, they don’t seem to remember that they decide on THE ARCHITECTURE every few years. The truth is that the danger of “multiplying architectures” just isn’t real in most organizations. Your organization’s underlying architecture is fluid: it flows over time, changing shape incrementally in relation to the changing technology landscape.

I think the greater risk, both for individuals and organizations, is that in trying to freeze the organization’s architecture, the organization may fail to assess the diminishing value of some technologies and to invest current human capital in future possibilities. I once witnessed a department embark on a major new project with a very innovative idea, but they decided to develop the system using a technology they had been using for a decade: Perl. Developing and stabilizing the software took longer than it should have, and once this was accomplished, nobody wanted to use it, because the risk of investing in Perl was too great; its current value and future prospects were too low.

The rationale was that, as an organization, they would extract as much return as possible from a past investment of resources in Perl. But that past investment had long since paid back what it could when compared with more modern alternatives, and the system now presented a disproportionate cost to the organization. It was difficult to find anyone internally, or hire anyone externally, who wanted to support such an old language. (And what if they had found someone eager to work in this dead language? Is that the type of employee you really want to hire? How valuable will that person be to the organization in the future?) As a result, the cost of supporting this system was much greater than if the decision had been made to build it on a newer architecture, even if, at the time, the organization didn’t possess the particular expertise needed. In the end, the department made the right decision and cut its losses, choosing not to develop the system further, a decision made shortly after its first stable release. Unfortunately, any possibility of further developing the underlying innovative idea of this system ended with it.

Another reason why “multiplying architectures” tends not to be a problem is that the systems we build have a finite, and typically quite brief, life span of their own, one that may have nothing to do with the underlying technologies used to build them. Aside from the increasing costs of maintaining older technologies, at some point these systems fail to serve users as well as newer alternatives. If organizations aren’t willing to decide when to “kill off” or migrate certain applications, their customers and users will make that decision for them, abandoning these applications once they stop delivering a certain level of value or when a new application emerges offering more. Either way, old platforms tend to be minimized and eventually abandoned through the attrition of the systems built on them.

Of course, there are times when delaying a decision or a change is the right course of action. If you don’t have to make a decision quickly, the costs of making the wrong decision are genuinely high, and none of the alternatives presents clear advantages or disadvantages, then delaying the decision to a time when you hope to have more information about the various alternatives or when the risks are more acceptable may be the best course of action.

But I believe we often err too far toward the other end of the continuum. As developers, the time and energy spent learning a new framework, a new library, a new IDE, or whatever else, is not a cost but an investment, one that will pay off in greater knowledge and productivity and put us in a better position to keep learning new skills as time passes. We owe it to ourselves to keep pace with the changes in our environment and be as valuable as we can be.

So start basking in the warm light of this Golden Age, with all its choices and opportunities. It will be over before you know it.