Open Data · Community Project

DevDrift

How the programming profession changed · 1970–2026

The skills expected from senior developers, decade by decade.


Reading the Scale

All charts use a 0–10 scale calibrated to average expectations for a senior developer role.

0 · Not on the radar. The skill didn't exist yet or nobody considered it relevant.
5 · Helps you get the job. Having it sets you apart, but not having it won't disqualify you.
8 · Not having it keeps you out. Most teams treat this as a hard requirement for senior candidates.
10 · Not even considered a candidate without it. The skill is so fundamental it's assumed, not tested.

Then vs Now

1970s

Programming meant knowing your hardware — memory addresses, register layouts, instruction sets. A "developer" was usually a mathematician or electrical engineer working on a mainframe in a university lab or a defense contractor's basement. Teams were tiny, projects took years, and shipping meant handing over a physical tape. The tools were a line editor, a compiler, and if you were lucky, a debugger. There was no internet, no package manager, no Stack Overflow. You learned from printed manuals, colleagues, and trial and error — mostly error.

1980s

The personal computer changed who could write software. BASIC, Pascal, and C gave anyone with a desk the power to build something. The first IDEs appeared. Software became a product you could buy in a shrink-wrapped box at a store. But the ecosystem was still small — you picked one language, one platform, one compiler. You learned from thick reference books and local user groups. The entire industry still fit inside a few thousand companies, and most programmers never collaborated with someone they hadn't met in person.

1990s

The web broke the walls down. Object-oriented programming went mainstream with C++ and Java. Design patterns, refactoring, and the first whispers of agile appeared. Linux proved that strangers on the internet could build an operating system together. A developer now had to think about GUIs, networks, and users who weren't engineers. The job stopped being "write correct code" and started being "build something people can use." The skill ceiling quietly doubled, but nobody updated the job description.

2000s

The dot-com boom pulled millions into the field, and the bust that followed couldn't push them back out. Web development exploded — HTML, CSS, JavaScript, PHP, and MySQL became the new bread and butter. Agile went from manifesto to methodology. Version control moved from CVS to SVN to the early days of Git. The stack started fragmenting: frontend, backend, database, and deployment became separate specializations. Google showed that scale was a problem worth solving, and suddenly every startup wanted to solve it too.

2010s

The cloud ate the server room. AWS, Docker, Kubernetes — infrastructure became code. Mobile-first design rewired how developers thought about interfaces. Git and GitHub transformed collaboration from emailing patches to pull requests. The stack exploded: React, Angular, Node, microservices, CI/CD pipelines, monitoring, observability. A senior developer was now expected to be part architect, part operations engineer, part security consultant. The number of things you needed to know to be considered senior roughly doubled — again.

2020s

AI appeared out of nowhere and rewrote the rules overnight. Remote work became the default. Security and supply-chain attacks moved from edge cases to daily concerns. The stack kept growing — but now AI tools started writing code alongside you, reviewing your pull requests, and generating tests. Being a senior developer in 2026 means navigating more complexity than any single person was ever expected to handle, while deciding which parts of your job to delegate to a machine that learns faster than you do.

Skill Expectations Timeline

The skills that defined a programmer — language mastery, algorithms, data structures — quietly lost ground to everything that surrounds programming: cloud, security, observability, CI/CD. SQL and design patterns peaked and declined. Soft skills went from footnote to non-negotiable. Then AI compressed an entire skill category from zero to must-have in five years. The job title stayed the same; the job description was rewritten from scratch — twice.

Developer Population

In 1970 fewer than 100,000 people worldwide wrote code — mostly academics and defense contractors working on mainframes. The personal computer changed that: by the mid-1980s anyone with a desk could program. Then the web exploded the numbers again — the dot-com boom pulled millions into the field between 1995 and 2000, and even the bust that followed barely dented the trajectory. Smartphones, cloud computing, and the startup gold rush of the 2010s kept the curve steep, while coding bootcamps and free online courses lowered the barrier to entry almost to zero. The result: the developer population roughly doubles every eight years. That growth rate is exactly why senior developers are so scarce — when half of all developers have less than eight years of experience at any given moment, seniority is a mathematical minority, not a failure of hiring.
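The doubling arithmetic above fits in a few lines. This is a sketch of the trend described in the text — pure exponential growth with no attrition, anchored to the ~100,000 developers of 1970 — not real survey data:

```python
# Why "doubling every 8 years" implies half the field has < 8 years of experience.
# Model and parameters are simplifications taken from the text, not survey data.

DOUBLING_PERIOD = 8  # years

def population(year, base_year=1970, base_pop=100_000):
    """Developer population under the doubling-every-8-years model."""
    return base_pop * 2 ** ((year - base_year) / DOUBLING_PERIOD)

def share_with_less_experience_than(years, at_year=2026):
    """Fraction of developers who joined within the last `years` years."""
    now = population(at_year)
    then = population(at_year - years)
    return (now - then) / now

print(f"{share_with_less_experience_than(8):.0%}")  # prints 50%
```

Whatever the anchor year, the model always puts exactly half the population inside the last doubling period — which is the "mathematical minority" argument in one line.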

Software Complexity Growth

In the 1970s a whole program fit on a stack of punch cards. C and Unix made code portable, PCs multiplied the audience, but the tools stayed simple. The web broke that — software suddenly had to run on unknown browsers, talk to remote servers, and handle millions of users. Open source turned shared dependencies into an avalanche nobody fully controls. The security surface grew in lockstep; the stack fragmented into containers, orchestrators, cloud services, and dozens of other tools. All of it compounds: more packages mean more vulnerabilities, more technologies mean more integration surface, more decision points mean more places to break.

Linux LOC

Lines of code in the Linux kernel — the most widely deployed software on earth, running phones, servers, cars, and satellites. Its growth is a thermometer for platform complexity: when the kernel doubles, drivers, syscalls, and edge cases double with it, and every layer built on top inherits that weight. From 10K lines in 1991 to over 40 million today.
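The growth rate behind that trajectory takes one logarithm to estimate. The 10K and 40M figures come from the text; treating "today" as 2024 is an assumption:

```python
import math

# Back-of-envelope: how often has the Linux kernel doubled in size?
# ~10K lines in 1991, ~40M+ "today" (assumed here to mean 2024).
loc_1991, loc_now = 10_000, 40_000_000
years = 2024 - 1991

doublings = math.log2(loc_now / loc_1991)
print(f"{doublings:.1f} doublings, one every {years / doublings:.1f} years")
# prints: 12.0 doublings, one every 2.8 years
```

A doubling every ~3 years is faster than the developer population's every-8-years pace — one way to read why the complexity curve outruns the people available to manage it.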

Branches

Total decision points — every if, loop, and pattern-match — across a project and its entire dependency tree. Modern applications ship with thousands of transitive dependencies, each adding their own branching logic. This is the single best proxy for how much behavior a senior developer is implicitly responsible for when they hit "deploy".

npm

Total packages in the npm registry — the world's largest package registry and a proxy for ecosystem fragmentation. More packages mean more choices for every task, more transitive dependencies to audit, and more surface area for supply-chain attacks. A senior developer in 2010 might have evaluated a handful of libraries; today they inherit hundreds the moment they scaffold a project.
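The direct-versus-transitive gap is easy to see on a toy graph. The package names below are invented for illustration; real resolvers also juggle versions, ranges, and lockfiles:

```python
from collections import deque

# Invented dependency graph: package -> list of direct dependencies.
deps = {
    "my-app": ["framework", "test-runner"],
    "framework": ["dom-utils", "scheduler"],
    "test-runner": ["assertion-lib", "dom-utils"],
    "dom-utils": ["tiny-helper"],
    "scheduler": [],
    "assertion-lib": ["tiny-helper"],
    "tiny-helper": [],
}

def transitive_deps(pkg, graph):
    """All packages reachable from `pkg`, excluding `pkg` itself (BFS)."""
    seen, queue = set(), deque(graph.get(pkg, []))
    while queue:
        dep = queue.popleft()
        if dep not in seen:
            seen.add(dep)
            queue.extend(graph.get(dep, []))
    return seen

print(len(deps["my-app"]), "direct,", len(transitive_deps("my-app", deps)),
      "transitive")  # prints: 2 direct, 6 transitive
```

Even this seven-package toy triples the direct count; real npm trees routinely multiply it by a hundred.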

CVEs

Publicly disclosed security vulnerabilities per year. More CVEs mean more patches to apply, more dependency upgrades to evaluate, and more time spent on security reviews instead of features. A direct measure of the "maintenance tax" that growing complexity imposes on every team.

Stack Size

The number of distinct technologies a senior developer is expected to navigate in a typical project. In 1970 that meant an editor and a language. Today a single full-stack project may touch a framework, a bundler, a type system, a CSS preprocessor, a test runner, a CI pipeline, a container runtime, an orchestrator, a cloud provider, and a dozen SaaS integrations. This is why "just learning the language" stopped being enough decades ago.

About & Methodology

Data is heuristic — not absolute truth. Validated against Stack Overflow Developer Surveys, JetBrains Developer Ecosystem Reports, DORA State of DevOps, and CNCF Annual Surveys.

This is an open community resource. Contributions, corrections, and new skill proposals are welcome via GitHub.