When I was younger, it seemed that everyone I worked with in technology knew how to build a computer. Most knew how to work with a BIOS, were comfortable at a command line, and could assemble complex compiler directives into a makefile. Over time, many people, especially Windows and macOS users, have become focused on the tools they need for their jobs, with little knowledge of how computers process instructions or the low-level operations they perform.
There are plenty of very talented developers out there, and many great data modelers, but as I work with those who try to make the transition to DevOps, I see a lot of uncertainty and tentative behavior. They often approach builds, automated tests, and deployments as though these are completely new skills to learn, software that is foreign to them.
I wonder if the increased use of automation will make this worse in the future. Plenty of companies are already looking to low-code and no-code solutions as a way to cope with a shortage of development staff. Will that be exacerbated as more and more automation is put in place that mocks up a shell (or more) of a project and handles work for developers and even operations staff? Will they struggle to debug complex problems, which are the problems most likely to occur in modern software?
There is an interesting article about surgeons who may be losing some skills with the advent of robots assisting in surgery. While there are some worrisome aspects to this for me, as someone getting older and possibly needing medical care, I find some of it applicable to the world of software. How well do we apprentice people new to our environment and give them the chance to build new skills? Often we have senior people taking on the interesting work, making data modeling decisions, troubleshooting issues, and more. Do we give more junior people the chance to get hands-on experience, and perhaps to take charge and lead others? To learn to actually be the one making the decisions?
In a few places, I’ve seen senior people fixing bugs while junior people develop code. That seemed strange at first, until I realized that fixing problems is likely something I want the stronger developer handling, not the weaker one. These don’t even have to be issues in production. In a DevOps world, I might have senior people looking at and fixing the bugs caught in CI. While I appreciate giving someone the chance to correct their own mistakes, I also think that refactoring or improving code might be something senior people are better positioned to tackle.
Of course, I think rotating people in and out of roles, giving them a chance to experience different sides of our industry, is a good idea. Having Ops people learn how development works, and developers learn how Ops works, can bring about empathy, understanding, and some new skills, and it is likely to create a better culture of collaboration, a tenet of DevOps.
What skills are we worried about learning? Or losing? To what extent ought we try to ensure others grow wider and deeper beyond their core skills? I find this to be an area that the best DevOps companies do well. They have champions who can provide assistance, knowledge, and teaching, not just do the work. Companies that only pretend to adopt DevOps aim for specialization, letting others do the work when they can. Over time, I think they’ll end up like some of today’s surgeons: practicing parts of their craft for the first time, without training, only when someone else isn’t available.
Listen to the podcast at Libsyn, Stitcher, Spotify, or iTunes.