The AI-driven transformation of almost every area of technology is set to continue. DevOps is certainly no exception: AI can be applied across the software development and delivery process, automating operations and enabling more predictive, proactive approaches to releases and updates.
This shift comes as the market moves increasingly towards cloud computing. According to Gartner, by 2025 more than 85% of organizations will have a cloud computing strategy, and as many as 95% of new digital workloads will occur in the cloud. The push towards the cloud is fueled by the demand for agility, scalability, and efficiency in how software is developed, and incorporating AI into DevOps methodologies will only accelerate it.
For the past year and a half, my colleagues and I have been testing all the DevOps areas we can think of where generative AI could play an important role, both for the comfort of engineers and for the results we provide to our clients.
Based on that work, we have compiled a ranked list of the five areas that have improved the most over the last year thanks to our use of generative AI tools.
So, let's dive into it together!
First place, and the most significant impact on our day-to-day work, goes to generative AI's ability to produce Bash and PowerShell scripts much faster and at a higher quality than ever before. Everyone involved loves it, for several reasons I would like to highlight:
Many of the scripts we need day to day are tedious and repetitive, and let's be honest: you won't find an engineer who enjoys writing yet another script to move files or apply a simple transformation. It's fun the first and second time, but after you've written them thousands of times, it gets pretty boring. With generative AI, we managed, in some cases, to reduce the time spent writing scripts by up to 70%.
Not all of our DevOps engineers have developer backgrounds. Many come from SysOps or NetOps positions and find it difficult to write more advanced scripts. With AI, they are more motivated to script and can produce solutions they wouldn't have been able to write on their own.
We create more code, covering edge cases that would otherwise die at the bottom of the backlog. This is hard to explain in the abstract, so let me give you an example. We had to write a script that moved corrupted files from the primary location to a backup location for later investigation. The catch was that the primary location was a Windows server and the backup location was Linux. Why does that matter? Previously, we wouldn't have bothered with a renaming policy for file names that were valid on Windows but invalid on Linux. With generative AI, it took one extra sentence in the prompt, and we headed off problems that might only have surfaced a year from now.
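To make this concrete, here is a trimmed-down sketch of what such a script can look like. The paths, the ".bad" marker for corrupted files, and the exact renaming rule are illustrative placeholders, not our production code:

```bash
#!/usr/bin/env bash
# Sketch: move corrupted files from a mounted Windows share to a Linux backup
# location, renaming anything that could cause trouble on the target filesystem.
# Paths and the corruption marker (*.bad) are hypothetical placeholders.
set -euo pipefail

SRC="/mnt/windows-share/incoming"   # primary location (assumed CIFS mount)
DST="/var/backups/corrupted"        # backup location for later investigation

mkdir -p "$DST"

find "$SRC" -type f -name '*.bad' -print0 | while IFS= read -r -d '' file; do
  base="$(basename "$file")"
  # Replace any character outside a conservative allow-list with an underscore.
  safe="$(printf '%s' "$base" | tr -c 'A-Za-z0-9._-' '_')"
  target="$DST/$safe"
  # Avoid silently overwriting a previously moved file with the same name.
  if [ -e "$target" ]; then
    target="$DST/$(date +%s)_$safe"
  fi
  mv -- "$file" "$target"
done
```

The edge-case handling here (the allow-list rename and the collision check) is exactly the kind of detail that used to get cut; now it costs one more sentence in the prompt.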
A strong second place is held by the general ability of chatbots like ChatGPT to distill vast amounts of knowledge from the internet in seconds. Generative AI is an absolute lifesaver here; the public cloud landscape changes so rapidly that we often struggle to keep up with current offerings and the changes made to services.
We absolutely hated re-reading the same documentation for the same services every six months to find one sentence that changed between versions. After a while, you get sloppy and miss that something changed, push new infra to production, and ...
Now, we can work through all the relevant services, documentation, and troubleshooting guides much faster, at the click of a button. That said, there is still room for improvement, especially around preliminary calculations for services.
Third place goes to navigating our cloud environments. Generative AI isn't quite there yet, but it feels only months away, and we can already see how enormous a benefit this will be to our work.
Imagine being able to ask a chatbot whether a newly deployed app in one cluster will be able to reach a service in a second cluster, and getting an answer based on your live cloud environment.
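To give a sense of what that assistant would be doing behind the scenes, here is roughly the manual check it would automate. The context names, namespace, and service URL below are purely illustrative, not a real environment:

```bash
# Hypothetical manual check an AI assistant could run for us: start a throwaway
# pod in cluster A and probe the service exposed from cluster B.
# Context names, namespace, and URL are placeholders.
kubectl --context cluster-a -n apps run reach-test \
  --image=curlimages/curl --restart=Never --rm -i --command -- \
  curl -s -o /dev/null -w '%{http_code}\n' --max-time 5 \
  http://service-b.example.internal:8080/healthz
```

Today we still run checks like this by hand; the promise is a chatbot that answers the question directly from the live environment.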
Fourth place, CI/CD pipelines, could almost be considered an extension of AI-assisted code, but it deserves its own spot for two reasons: one positive and one rather negative. The positive is that AI has enabled developers to write their own pipelines more than ever before. The negative is that it often fails to respect the predefined custom templates we try to enforce for all pipelines and instead assembles pipeline tasks from whatever it knows from its training data.
It's important to mention that, as with assisted code, we now also create many more CI/CD pipeline tasks that we wouldn't previously have considered, either because they would have taken too long or because we weren't quite sure how to implement them. With AI, we started including additional security and quality checks that improved our overall DORA metrics.
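To give an idea of what that looks like in practice, here is a minimal sketch of one such AI-drafted security gate, written as a shell step so it can be called from our shared pipeline template rather than replacing it. The tool choice (Trivy) and the severity threshold are illustrative assumptions:

```bash
#!/usr/bin/env bash
# Sketch of a security gate added as a pipeline step: fail the build if the
# container image contains HIGH or CRITICAL vulnerabilities.
# The scanner (Trivy) and the threshold are illustrative choices, not a mandate.
set -euo pipefail

IMAGE="${1:?usage: $0 <image:tag>}"

trivy image --exit-code 1 --severity HIGH,CRITICAL "$IMAGE"
```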
In fifth place is the biggest struggle DevOps has always had: the question, "Is it faster to do it manually or to write a script for it?" Tasks like renaming files, splitting documents, and setting up new projects always sat somewhere in between. Now we have managed to shift to a much more script-oriented way of working: at our fingertips, we have AI-generated scripts that can do almost anything we can describe.
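A typical example of a task that used to sit on that borderline, written as the kind of throwaway script we now generate on demand (the file name and chunk size are illustrative):

```bash
#!/usr/bin/env bash
# Split a large CSV into 10,000-row chunks, repeating the header in each part.
# report.csv and the chunk size are illustrative placeholders.
set -euo pipefail

head -n 1 report.csv > header.csv
tail -n +2 report.csv | split -l 10000 - chunk_

for f in chunk_*; do
  cat header.csv "$f" > "part_${f#chunk_}.csv" && rm "$f"
done
rm header.csv
```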
It will also be interesting to see how, in the near future, this shifts many jobs towards scripting. We expect to see far more low-code and no-code skills among back-office employees, who will learn how to ask generative AI for scripts and gain a basic understanding of how to verify that a script does what they need. In that world, developers can stop chasing such low-hanging fruit and concentrate on the things that matter most for a company's success.
Finally, as a bonus, let's see what the future might bring. Did you know that, beyond code and scripts, generative AI can also generate configurations and layouts for monitoring dashboards? This fantastic capability lets us build single-use dashboards dedicated to, for instance, monitoring peak hours and peak resource usage during Black Friday, focused on services that have recently been misbehaving or are slated for significant upgrades or maintenance windows.
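As a rough illustration, once the AI has produced the dashboard definition, publishing it is a one-liner against Grafana's dashboard API. The URL, token variable, and file name below are placeholders, and the JSON file is assumed to contain the payload that API expects:

```bash
# Push an AI-generated, single-use dashboard (e.g. a Black Friday board) to Grafana.
# The Grafana URL, token variable, and file name are placeholders.
curl -sS -X POST "https://grafana.example.com/api/dashboards/db" \
  -H "Authorization: Bearer ${GRAFANA_TOKEN}" \
  -H "Content-Type: application/json" \
  -d @blackfriday-dashboard.json
```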
So, this is our list of the five most significant improvements we currently see in our day-to-day jobs, plus a bonus that might prove very helpful in the future. But with the pace at which everything is changing, we expect to see further improvements this year, especially around FinOps, policies, and general recommendations for architecture in our cloud environments.
At Ciklum, we’re already enjoying great success in helping DevOps teams address these challenges and unlock AI’s potential across their workflows. We use a combination of cutting-edge AI technologies, deep domain knowledge in software development and tailored solutions, empowering businesses to automate and optimize DevOps processes. As a result, they’re benefitting from enhanced productivity, accelerated time-to-market, and greater agility in their software development.
To find out more about how AI-driven DevOps can transform your innovation and competitive edge, talk to our team today, or explore our resource library for more insights.