As we step further into the AI-driven era, the buzz around how it’s reshaping the tech landscape is loud and clear. Those of us knee-deep in enterprise software are seeing firsthand how AI tools like GitHub Copilot and ChatGPT are making coding a bit less daunting.
AI coding is taking over as we speak
Now imagine a world where you tell your computer what you need over a cup of coffee, and it just… gets it done. No endless lines of code, just straightforward instructions. We’re not quite there yet, but that’s the direction we’re headed, judging by tools like Devin, GitHub Copilot Workspace (see below) and a wave of sprouting open source projects like SWE-Agent, Devika and OpenDevin.
And let’s not forget that coding tasks are already being taken on by GitHub Copilot and ChatGPT. 92% of programmers are using AI tools (and another survey here). Repetitive tasks that still require cognitive effort, such as creating unit tests, are so much easier and less tedious with an LLM at your fingertips. At this point it’s no longer a question of whether AI will impact software engineering roles; it already is. So we have to ask: what’s next?
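As a rough illustration of what that looks like in practice, here’s a minimal sketch of generating unit tests with an LLM, assuming the v1-style OpenAI Python SDK and an OPENAI_API_KEY in the environment; `apply_discount` is just a stand-in for any function in your own codebase, and the generated tests still need human review.

```python
from openai import OpenAI  # assumes the v1-style OpenAI Python SDK

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Stand-in for any function from your own codebase.
SOURCE = '''
def apply_discount(price: float, percent: float) -> float:
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)
'''

response = client.chat.completions.create(
    model="gpt-4o",  # any capable chat model will do
    messages=[{
        "role": "user",
        "content": "Write pytest unit tests for this function, covering edge cases:\n" + SOURCE,
    }],
)

print(response.choices[0].message.content)  # review before committing
```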
AI-infused applications
The transformative potential of AI in traditional software applications extends well beyond simple coding assistance. Integrating large language models (LLMs) from OpenAI and other AI providers into applications offers a significant leap in how we develop and interact with software. This capability transforms how applications can be used and the range of tasks they can perform.
For example, in traditional customer service applications, LLMs can power chatbots that not only respond to customer inquiries but also understand context, manage follow-ups, and personalize communications based on previous interactions. These AI-powered chatbots can handle a multitude of customer service tasks simultaneously, reducing wait times and increasing customer satisfaction.
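To make that concrete, here is a minimal sketch of a context-aware chat loop, assuming the v1-style OpenAI Python SDK; the system prompt and model choice are illustrative, and a real customer service bot would add persistence, retrieval of customer history and error handling.

```python
from openai import OpenAI  # assumes the v1-style OpenAI Python SDK

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = "You are a friendly customer service assistant."  # illustrative

def chat_loop():
    # The growing message list is what gives the bot context for follow-ups.
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]
    while True:
        user_input = input("Customer: ").strip()
        if not user_input:
            break
        messages.append({"role": "user", "content": user_input})
        response = client.chat.completions.create(
            model="gpt-4o",      # any chat-capable model
            messages=messages,   # full history = conversational context
        )
        reply = response.choices[0].message.content
        messages.append({"role": "assistant", "content": reply})
        print("Bot:", reply)

if __name__ == "__main__":
    chat_loop()
```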
In the realm of document management, LLMs offer capabilities such as summarization, keyword extraction, and semantic search. This allows users to quickly find the most relevant information without manually sifting through vast amounts of data. It’s particularly beneficial in legal and research-based fields, where time and accuracy in information retrieval are crucial.
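As a simplified illustration of the semantic search part, the sketch below embeds a handful of snippets and ranks them against a query by cosine similarity, assuming the OpenAI embeddings endpoint; the `DOCUMENTS` list is purely illustrative, and a production system would chunk real documents and store the vectors in a vector database.

```python
import numpy as np
from openai import OpenAI  # assumes the v1-style OpenAI Python SDK

client = OpenAI()

# Illustrative in-memory "corpus"; a real system would chunk and index documents.
DOCUMENTS = [
    "The supplier contract renews automatically every 12 months.",
    "Termination requires 90 days written notice from either party.",
    "Liability is capped at the fees paid in the preceding year.",
]

def embed(texts):
    # Map text to vectors; semantically similar texts end up close together.
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

def semantic_search(query, top_k=2):
    doc_vecs = embed(DOCUMENTS)
    query_vec = embed([query])[0]
    # Cosine similarity between the query and every document snippet.
    sims = doc_vecs @ query_vec / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec)
    )
    best = np.argsort(sims)[::-1][:top_k]
    return [(DOCUMENTS[i], float(sims[i])) for i in best]

print(semantic_search("How can we cancel the agreement?"))
```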
These examples highlight just a few ways LLMs are enhancing traditional applications, making them not only more efficient but also significantly more intelligent and responsive to human needs. The application of LLMs is a game-changer, turning static systems into dynamic assistants capable of supporting complex decision-making processes and interactions.
The case against software development
While the rapid advancement of AI and LLMs is undoubtedly transforming software development, it raises a compelling argument about the diminishing necessity for custom applications in many enterprise contexts. A significant portion of custom software serves primarily as an interface for accessing and manipulating data. These systems are tailored to fit specific business processes and data workflows, often becoming deeply entrenched in the organization’s operational fabric.
However, this customization comes at a cost—both financially and in terms of maintenance. Custom applications require continuous updates, patches, and modifications to remain secure and functional. Furthermore, the complexity of these systems can lead to a reliance on specialized knowledge to manage and operate them.
Looking into the future, as AI systems become more adept at understanding and processing natural language, the landscape will shift dramatically. At some point we could see AI interfaces become so intuitive and capable that they can directly interact with your data without the traditional intermediary layers of applications. Users could simply communicate their needs via natural language, and the AI would retrieve, update, or manipulate data as requested. This capability would not only streamline operations by reducing the need for multiple specialized applications but also enhance flexibility in how data is accessed and used.
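One pattern that already points in this direction is tool (or function) calling: the model turns a natural-language request into a structured call that the application executes against its own data. Below is a minimal sketch, assuming the v1-style OpenAI Python SDK; `get_open_orders` is a hypothetical stand-in for a database query or internal API.

```python
import json
from openai import OpenAI  # assumes the v1-style OpenAI Python SDK

client = OpenAI()

# Hypothetical data-access function; in reality this would hit a database or API.
def get_open_orders(customer_id: str):
    return [{"order_id": "A-1001", "status": "shipped"}]

TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_open_orders",
        "description": "List open orders for a customer.",
        "parameters": {
            "type": "object",
            "properties": {"customer_id": {"type": "string"}},
            "required": ["customer_id"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Which orders are still open for customer 42?"}],
    tools=TOOLS,
)

message = response.choices[0].message
if message.tool_calls:  # the model asked the app to run a tool instead of answering directly
    call = message.tool_calls[0]
    if call.function.name == "get_open_orders":
        args = json.loads(call.function.arguments)
        print(get_open_orders(**args))
```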
And this isn’t just imaginary talk; we see huge efforts in the space of making AI talk to your own data. Cloud providers are already on top of this. Researchers are finding innovative ways to bridge the gap between token window limitations and an extensive amount of custom data using advanced RAG systems. And LLMs are already being released that are specifically trained for these kinds of tasks.
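For the curious, here is a minimal sketch of the RAG pattern those efforts build on: retrieve only the most relevant chunks of your own data and inject them into the prompt, so the model answers grounded questions without blowing past its token window. It assumes the v1-style OpenAI Python SDK and any retriever with the shape of the semantic search sketch above.

```python
from openai import OpenAI  # assumes the v1-style OpenAI Python SDK

client = OpenAI()

def answer_with_rag(question, retrieve):
    # `retrieve` is any function returning (chunk, score) pairs, e.g. the
    # semantic_search sketch above or a query against a vector database.
    chunks = retrieve(question)
    context = "\n\n".join(text for text, _score in chunks)
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```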
In such a future, the role of custom applications would be reduced to handling highly specialized tasks that require specific, non-standardized processes or where regulatory and security concerns necessitate tightly controlled environments. For the vast majority of data interaction needs, a sophisticated AI would provide a more efficient, cost-effective, and user-friendly solution. This shift would not render all software development obsolete but would certainly recalibrate the value and function of custom applications within the enterprise software ecosystem.
Wrapping it up: What should we do now?
For those of us navigating the shifting sands of IT, the emergence of AI and AGI presents a dual-edged sword: on one side, vast opportunities for innovation and efficiency; on the other, the challenge of adapting to a rapidly evolving tech landscape. But as we delve deeper into this new era, one thing becomes clear: staying on the sidelines isn’t an option.
Practical advice for the road ahead
While AI and AGI technologies are still finding their footing, ignoring them isn’t a luxury businesses can afford. Here’s a bit of advice for incorporating these technologies into your IT landscape:
Start small but think big
Begin by exploring AI tools that can assist with current tasks. This could be as simple as automating repetitive tasks, building your own AI chat or using AI for data analysis. These small steps can provide insights into how AI might fit into larger operations.
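For example, a first experiment with AI for data analysis can be as small as handing the summary statistics of an existing export to an LLM and asking for observations, as in this sketch (assuming the v1-style OpenAI Python SDK and a hypothetical monthly_sales.csv export):

```python
import pandas as pd
from openai import OpenAI  # assumes the v1-style OpenAI Python SDK

client = OpenAI()

# Hypothetical export from an existing system; any tabular data works.
df = pd.read_csv("monthly_sales.csv")
summary = df.describe().to_string()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": "Here are summary statistics of our monthly sales data. "
                   "Point out anything unusual and suggest follow-up questions:\n" + summary,
    }],
)

print(response.choices[0].message.content)
```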
Familiarize yourself and your team with AI capabilities and limitations. Understanding what AI can and cannot do helps in setting realistic expectations and spotting opportunities for integration.
Get more agile
That goes for both your machine learning models and your workforce. Ensure your team has the skills needed to work alongside AI. This might mean investing in training or hiring new talent with the necessary expertise.
The AI field is evolving rapidly. Bring as much agility as you can to your software development and project management to accommodate the pace of change. This means being ready to pivot when a new AI breakthrough happens or when an experiment doesn’t pan out as expected.