Summary
Keeping AI projects on track and under budget frustrates many business and technical leaders. From how the project is run to the fundamental differences between Artificial Intelligence (AI) and traditional software development projects, plenty of factors contribute to time and cost overruns. Sadly, overrun projects often become failed projects. According to Forbes, 60-80% of AI projects fail.
This article summarizes “Agile for AI”, a methodology created by Synergise AI that applies the principles of Agile to AI projects while accounting for the realities of AI workflows. Synergise’s “Agile for AI” is a blended approach designed specifically for AI projects; it focuses on giving key stakeholders the visibility they need to make timely, informed decisions about scope, resources, and timelines.
*Please note: This article assumes the reader is generally familiar with the Agile framework and with traditional project management practices from the Project Management Institute (PMI), and it outlines how AI workflows fit alongside them. It also assumes the appropriate mix of business and technical talent is available to estimate and execute the work efficiently. (Quick Plug: Talk to us about our Full-Stack team of AI engineers for tech help - Contact us here!)
Agile for AI
The “Agile for AI” process is a blend of three approaches:
- Traditional project management - where the project is defined and managed in phases
- Agile - using iterative, incremental cycles
- AI workflows - "collecting, manipulating, and transforming data to be used as inputs for coding, training, evaluating and interpreting, fine-tuning, and implementing complex mathematical models"
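
To make that AI workflow concrete, here is a minimal sketch of the loop in Python using scikit-learn. The dataset and model are placeholders chosen purely for illustration; they are not part of the Agile for AI methodology itself.

```python
# A minimal, illustrative sketch of an AI workflow: collect and transform data,
# train a model, evaluate it, and let the results drive fine-tuning and
# implementation. Dataset and model choices here are placeholders.
from sklearn.datasets import load_breast_cancer          # stand-in data source
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# 1. Collect and transform data into model inputs.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# 2. Train a candidate model.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# 3. Evaluate and interpret; the results inform fine-tuning and, eventually,
#    the decision to implement the model in production.
print(f"accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```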
Within this blended approach, there are four phases:
- Evaluation
- Data Assessment
- Proof of Concept
- Full-Stack AI Project
A different process is appropriate for each phase of a typical AI project. Four themes also run through all of the phases: evaluate the viability of the project before moving to the next phase, choose the process that is appropriate at each point, focus on business value, and bring the right team to each task. In combination, this phase-gated approach promotes transparent, accountable, on-time, and under-budget AI projects.
Summary of Phases and Processes
Evaluation Phase – A Short Series of Short Sprints
When beginning a new AI project, it’s important to consider the viability of the project. Synergise’s Agile for AI process starts with evaluating overall organizational readiness in terms of time, team, and resources; inventorying existing infrastructure and technology architectures; mapping business workflows; and thinking outside the box by considering other factors that could feed into the analysis of where AI could benefit the company. By working through these considerations up front, the company can identify the projects needed to proactively address the roadblocks the AI project is likely to hit.
In many cases, two to four one-week sprints with well-defined deliverables suffice to complete the evaluation. The “Definition of Done” for the evaluation phase, in short, is whether enough information has been gathered to support a decision given the current state of the organization and its business priorities.
Data Assessment Phase – A Defined Series of Spikes
The second phase of Agile for AI is all about data assessment through the lens of discovery and documentation. You’ll need quantitative and qualitative descriptions of both internal and external data sources. You’ll also need to identify any gaps in the data needed for the desired AI project’s use cases and plan how you will bridge them.
Data assessment may be, in part, discovery. Organizations that already have up-to-date data inventories (data lakes, etc.) will move through this phase more quickly than others. From a process perspective, a good starting point is to define an overall “time box”, a fixed amount of time allocated to data assessment (e.g., 8 weeks). That time box is then split into a series of shorter time boxes (in Scrum terminology, “spikes”), each with a defined duration and deliverables.
The Definition of Done for each spike may be: do we have enough information about this data source to reason about its applicability to the use cases? If the answer is no, another spike may be authorized with the specific objective of reducing the uncertainty to an acceptable level.
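
For illustration only, here is one way a team might capture a data-assessment time box as a plan of spikes in code. The field names, data sources, durations, and use cases below are hypothetical, not prescribed by Agile for AI.

```python
# Illustrative only: a data-assessment time box expressed as a plan of spikes,
# each with a fixed duration, a deliverable, and a Definition of Done.
from dataclasses import dataclass

@dataclass
class Spike:
    data_source: str
    duration_weeks: int
    deliverable: str
    definition_of_done: str

time_box_weeks = 8  # overall budget for the data assessment phase (hypothetical)
spikes = [
    Spike("CRM database", 2, "Profile of customer records and known gaps",
          "Enough information to judge applicability to the churn use case"),
    Spike("Support ticket archive", 2, "Text quality and coverage report",
          "Enough information to judge applicability to the triage use case"),
]

# The spikes must fit within the agreed time box.
assert sum(s.duration_weeks for s in spikes) <= time_box_weeks
```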
The go/no-go decision comes down to whether the available and acquirable data supports the business objectives and use cases of interest to your organization.
Proof-of-Concept (POC) – A Defined Investment
One or more POCs are performed to reduce the technical risks associated with a particular business initiative to an acceptable level. Given that the data assessment is in place and the right AI talent is available, a POC consists of a series of experiments using different models, data treatments, and training regimens to ascertain whether the desired performance metrics (e.g., accuracy, latency) can be met. The duration varies by project, company, and industry, but typically ranges from 4 to 12 weeks.
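
As a rough sketch of what such experiments can look like, the snippet below runs two candidate models against example accuracy and latency targets using scikit-learn. The models, the synthetic dataset, and the thresholds are illustrative assumptions, not prescriptions of the methodology.

```python
# Sketch of a POC experiment loop: try candidate models and check each against
# hypothetical business-driven performance targets.
import time
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

TARGET_ACCURACY = 0.90      # hypothetical accuracy target
TARGET_LATENCY_MS = 1.0     # hypothetical per-sample inference latency target

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

candidates = {
    "logistic regression + scaling": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

for name, model in candidates.items():
    model.fit(X_train, y_train)
    start = time.perf_counter()
    predictions = model.predict(X_test)
    latency_ms = (time.perf_counter() - start) * 1000 / len(X_test)
    accuracy = accuracy_score(y_test, predictions)
    meets = accuracy >= TARGET_ACCURACY and latency_ms <= TARGET_LATENCY_MS
    print(f"{name}: accuracy={accuracy:.3f}, latency={latency_ms:.3f} ms/sample, meets targets: {meets}")
```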
The key deliverable is a working system that provides sufficient information to support the business decision. The go/no-go decision is whether the technical risks are sufficiently understood to warrant funding a full-scale AI project implementation.
From a process perspective, it’s impossible to prove that a technical goal cannot be reached. Stakeholders (project owners, technologists, etc.) may feel that “if we just keep trying” they will identify a viable solution. Hence, the “defined investment” approach specifies the time and resources available to the POC, where the level of funding depends on both the return on investment (ROI) if the project succeeds and an evaluation of the level of risk.
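
As a simple illustration (not an Agile for AI formula), the arithmetic below shows one way a defined investment might be sized from the expected benefit and an estimated probability of success. Every number in it is hypothetical.

```python
# Illustrative arithmetic only: sizing a "defined investment" for a POC from
# expected ROI and an estimated probability of success.
expected_annual_benefit = 500_000   # value if the project succeeds ($), hypothetical
probability_of_success = 0.6        # risk estimate informed by the data assessment
cap_fraction = 0.10                 # willing to risk at most 10% of the expected value

poc_budget = expected_annual_benefit * probability_of_success * cap_fraction
print(f"Defined POC investment: ${poc_budget:,.0f}")  # -> $30,000
```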
Periodically throughout the development of the POC, the team should ask themselves: Is the goal still clear? Does the data support achieving that goal? If the answer is no, they can and should roll back to the appropriate earlier phase (evaluation or data assessment).
Similarly, if the team realizes the cost or risk associated with continuing the POC is growing, they should ask themselves if the POC should be discontinued altogether.
A Full-Stack AI Project – Agile/Scrum
Given a successful data assessment, POC, and ROI analysis, a project to design, build, and deploy an AI system may be considered for funding by the business leadership. Applying a typical Agile/Scrum process in conjunction with a “Full-Stack AI team” allows monitoring and management similar to software engineering projects: sprints are defined, tasks estimated, work completed, demos given, and the process tuned via retrospectives. A full project could last from a month to six months or more.
The desired outcome, of course, is a completed, in-production system that meets the business objectives including service level agreements (SLAs). However, even in this more deterministic phase of an AI project, it’s important to have periodic reviews to ensure technical risk is mitigated or accepted, priorities haven’t changed, and the focus on adding business value continues.
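
As one illustrative example of what a periodic review might automate, the snippet below compares observed production metrics against SLA targets. The metric names, targets, and observed values are assumptions for demonstration only.

```python
# Hypothetical SLA check for a periodic review: compare observed production
# metrics against agreed targets and flag any breaches.
slas = {"p95_latency_ms": 200, "accuracy": 0.92, "uptime_pct": 99.5}
observed = {"p95_latency_ms": 180, "accuracy": 0.94, "uptime_pct": 99.7}

for metric, target in slas.items():
    value = observed[metric]
    # Latency targets are upper bounds; the other metrics are lower bounds.
    ok = value <= target if metric.endswith("_ms") else value >= target
    print(f"{metric}: observed={value}, target={target}, {'OK' if ok else 'BREACH'}")
```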
Conclusion
Running a simple yet sophisticated process like Agile for AI - evaluate, assess, prove, and build - has proven effective in managing diverse AI projects across a variety of industries. It’s broad, scalable, and flexible enough to handle everything from the smallest company’s AI goals to enterprise-grade efforts. Have questions on how Agile for AI could work for your company?
Contact our team of AI experts today for a free consultation to help you get started, or download our free guide to assess if your business is AI-ready.
For more information on everything AI implementation, check out our growing guide here.