How Web Developers Have Evolved Over the Years

How the web developer role has changed in the past 20 years.

The work needed to develop a website has changed dramatically since the Internet went mainstream a little more than 20 years ago. Since then, the Web has grown from a handful of scientific articles to a massive distribution system for everything from text content to multimedia to games and applications.

The IT roles associated with the Web have evolved quite a bit as well, with the original “webmaster” role becoming the “web developer” position. Here’s a look at how the web developer job has evolved over the past two decades.

Static HTML

At the beginning of the Web in the early 1990s, Hypertext Markup Language (HTML) had no dynamic or interactive capabilities. The HTML and Hypertext Transfer Protocol (HTTP) specifications were designed to help people easily pull documents from a server and read them through a text interface.

There were no “web developers” then – just a small group of people formatting their text using HTML, posting it on web servers and hoping others would find it.

Webmasters: Early web developers

The server-side Common Gateway Interface (CGI) system allowed websites to interact with users. Web pages could send data to the server, and CGI scripts would create a different web page based on the user’s request. As the Web started to gain traction, CGI was the only programming model available and provided enough functionality for programmers to truly “develop” for the Web.

Early web developers, or “webmasters,” didn’t just write code. They were also responsible for site design, server maintenance and marketing efforts, such as early search engine optimization (SEO) work.

The CGI model of web development was basically the POSIX shell script model (used by Unix and Linux) with few useful tools or guides available to an aspiring web developer. Development and debugging were slow and painful processes because the languages and systems for working in CGI were never meant for web development.

First, they lacked built-in functionality for working with HTML and HTTP. Second, instead of lending itself to creating a full-fledged application, the CGI programming model was more suitable for building a collection of dynamically created web pages.

Webmasters needed to have a very firm understanding of HTML and HTTP to accomplish even the simplest of tasks using the CGI programming model. Because what we know as the modern Web was still in its infancy, help files and a few books were the only real guides available to a web developer.
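The CGI flow described above can be sketched in a few lines. This is an illustrative modern-Python sketch, not a period-accurate script: a CGI program reads the query string from an environment variable, then writes an HTTP header block, a blank line, and the HTML document to standard output. The `name` parameter is a made-up example, not something from the article.

```python
#!/usr/bin/env python3
# Sketch of the CGI model: the web server sets environment variables
# (such as QUERY_STRING) and runs the script once per request; whatever
# the script prints becomes the HTTP response.
import os
from urllib.parse import parse_qs

def render_page(query_string: str) -> str:
    # Parse "name=Alice" style parameters; fall back to a default.
    params = parse_qs(query_string)
    name = params.get("name", ["world"])[0]
    body = f"<html><body><h1>Hello, {name}!</h1></body></html>"
    # CGI output: headers, then a blank line, then the document itself.
    return "Content-Type: text/html\r\n\r\n" + body

if __name__ == "__main__":
    print(render_page(os.environ.get("QUERY_STRING", "")), end="")
```

Note how the script must hand-assemble both the HTTP headers and the HTML markup itself, which illustrates the article's point that CGI offered no built-in help for either.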

PHP, ASP, Java and MySQL change the game

The mid-1990s brought the first wave of web-specific development technologies. Languages like PHP, ASP and Java quickly displaced the CGI model.

Equally exciting, the open source MySQL database made it possible for inexpensive web hosts to offer a database backend to websites. Dedicated web development tools, such as those built into Eclipse and Visual Studio, emerged as well, making it easier to work with increasingly large and complex codebases.

These factors combined to put a website that was more than just a collection of static HTML pages within reach of organizations of all sizes and budgets. In turn, web development projects became sophisticated enough for web developers to grow beyond the webmaster job title.

Other responsibilities of webmasters were assumed by dedicated web designers, system administrators and others. A number of specialized functions such as SEO work became job roles of their own, and experts in these areas were often brought in on a project basis. Web developers, meanwhile, focused on writing code.

Modern web development

The current phase of web development, beginning in the early 2000s, has seen the biggest changes occur in the development process itself. As the Web became a faster, easier way to deliver new applications than writing desktop or client-server software, Agile methodologies arose to help web developers create quality applications and respond to shifting needs and market opportunities.

Today, a web developer can use mature management techniques and tools that are very specific to the job at hand. Completing basic tasks is no longer a struggle; instead, the modern web developer has the ability to create and deliver amazing applications in record time.

Web developer qualifications

The Salary Guide from Robert Half Technology lists the starting compensation of a web developer in the United States as $78,500 to $129,500 per year, a 6.4 percent increase over 2015 figures. Web developers should have some or all of these qualifications:

  • Bachelor’s degree in computer science or a related field
  • Several years of web-related experience
  • Strong communication skills
  • The ability to work both individually and as part of a team

In addition, employers will look for a web developer to be well versed in web technologies and tools such as AJAX, ColdFusion, JavaScript, SOAP, HTML/DHTML, LAMP and others, according to the guide.

There is no doubt that the constant evolution of the business world, technology and IT industry will continue to change the role of the web developer. Web development has grown from an interesting way to publish documents to the most visible way applications are created. What will the next 20 years bring?

If you're looking for a web developer job, check out our current job listings. To find out what web developers are being paid in your region, download our Salary Guide:

Get the Salary Guide