3 Ways to Reduce the Cost of AI
You want to see ROI on AI initiatives — who doesn’t? After all, the potential is huge.
Maximizing ROI requires wide deployment across the enterprise.
The Problem?
[Chart: revenue, cost, and profit as the number of AI use cases grows]
Beyond the initial high-value use cases, then, the key to driving massive ROI from AI is deploying it widely while controlling costs. But how?
Reuse & Recycle AI Project Components
Reuse is the simple concept of avoiding rework in AI projects, from small details like code snippets to the macro-level, like the finding and cleaning of data. Common sense and economics tell us not to start from scratch every time, and that is exactly the principle behind reuse in AI projects.
Let’s dig into what’s perhaps the most costly aspect of AI projects: data cleaning and preparation. It’s a hefty, often tedious, and time-consuming task. That said, data cleaning and preparation are critical parts of an AI project, and if not executed well, can translate into poor quality models as well as increased risk through the entire model lifecycle.
Your first AI use case(s) are likely low-hanging fruit and have more value than the 10th, 50th, or 100th use case, so the marginal value of use cases decreases overall.
We talked about reusing and recycling AI project components, but let’s take that to the next level. Making real money with AI requires massively increasing the number of use cases being addressed across the organization.
How can you empower anyone (not just people on a data team) to leverage the work done on existing AI projects to spin up new ones, potentially uncovering previously untapped use cases that bring a lot more value than expected?
Sharing the cost incurred from an initial AI project results in many use cases for the price of one, so to speak. However, being able to leverage one project to spur another requires Radical Transparency and The Right Tools.
The surfacing of these hidden use cases often comes from the work of analysts or business users. It is one of the keys to data democratization and eventually to Everyday AI, where it’s not just data scientists that are bringing value from data, but the business itself.
The AI project lifecycle is rarely linear, and there are different people involved at every stage, which means lots of potential rework and inefficiencies along the way. Here are three main areas where introducing efficiency — for example, through a centralized AI platform like Dataiku — can help control costs.
The packaging, release, and operationalization of data, analytics, and AI projects is complex, and without a way to do it consistently, it can be extremely time consuming.
This is a massive cost not only in person hours, but also in lost revenue for the time the machine learning model is not in production and benefiting the business. Multiply this not by one model but by hundreds, and the cost is debilitating.
Dataiku has robust support for deployment of models into production (including one-click deployment on the cloud with Kubernetes), easing the operationalization of AI projects.
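To illustrate what a consistent release step can look like (a minimal sketch under assumed conventions, not Dataiku's deployment mechanism; the `package_model` helper and the bundle layout are hypothetical), every model can be packaged the same way before it ships:

```python
import json
import pickle
import time
import zipfile
from pathlib import Path

def package_model(model, name: str, version: str, out_dir: str = ".") -> Path:
    """Bundle a trained model with its metadata into a single release artifact,
    so every deployment follows the same repeatable packaging step."""
    artifact = Path(out_dir) / f"{name}-{version}.zip"
    metadata = {
        "name": name,
        "version": version,
        "packaged_at": time.strftime("%Y-%m-%dT%H:%M:%S"),
    }
    with zipfile.ZipFile(artifact, "w") as bundle:
        # Metadata travels with the model, so any team can inspect a release
        bundle.writestr("metadata.json", json.dumps(metadata, indent=2))
        bundle.writestr("model.pkl", pickle.dumps(model))
    return artifact
```

With a standard artifact like this, the release process stops being bespoke per model, which is where the person-hour savings come from.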
1. Anyone at the organization can easily access information, including who is working on which AI projects with what data, how that data is being transformed, what models are being built based on that data, and so on.
Controlling Costs Is Just the Tip of the Iceberg
We’ve seen here why reducing costs is a critically important component of successful AI initiatives. But how can organizations do it? By ensuring — via investments in the right technology, including AI platforms like Dataiku — execution on these four points.
In other words, controlling costs requires removing friction — with that, you’re well on your way to successfully realizing AI at scale.
Drive ROI With Dataiku
Forrester's The Total Economic Impact™ Of Dataiku reveals that organizations save 75% of data scientists' time and reduce manual, repeated reporting tasks by 90% with the platform.
Source: Gartner, What Is Artificial Intelligence? Ignore the Hype; Here’s Where to Start, 15 March 2022
ROI on leveraging AI techniques ranges from about 20% to more than 800%.
Scale, dimension, and reach across the enterprise are the real returns on investment in AI.
Facilitate More Use Cases for the Price of One
Introduce Efficiency Across the AI Lifecycle
Radical Transparency
For example, how can someone from marketing build off of a use case developed in the customer service department if neither knows what AI projects the other is working on, much less can access and leverage those components?
The Right Tools
Dataiku makes AI and data accessible to anyone across the organization, from data scientists to analysts to business people with only basic spreadsheet skills.
However, if you’re like most organizations today, the cost of maintenance and the cost of executing each use case are likely increasing.
At some point, profit from AI initiatives decreases due to increased costs and stagnating revenue.
Operationalization, or Pushing to Production
[Chart: How long does it take to release a first model in production? Source: Dataiku]
So reducing this cost is not necessarily about simply limiting time spent or outsourcing the work, but rather about finding smarter, more efficient ways to ensure people across the organization aren’t wasting time finding or cleaning data that has already been prepared by someone else.
For example, what if you could provide a built-in, centralized, and structured catalog of data treatments (from data sources to data preparation, algorithms, and more) for easy consumption? Enter: AI tools. Platforms such as Dataiku help teams and individuals alike systemize processes, using common elements to get to business value faster.
For example, in Dataiku, data can be prepared once and used across multiple projects, code snippets can be packaged for reuse by other data scientists, and plugins or applications can be leveraged even by non-technical business users to promote reuse and prevent costly chaos.
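To make the idea of a shared catalog concrete, here is a minimal plain-Python sketch (not Dataiku's actual catalog; `PREP_CATALOG`, `register_prep`, and the step names are hypothetical) of how publishing data treatments once lets any project reuse them by name:

```python
from typing import Callable, Dict, List

# A minimal, illustrative catalog of reusable data-preparation steps
PREP_CATALOG: Dict[str, Callable[[List[str]], List[str]]] = {}

def register_prep(name: str):
    """Decorator: publish a preparation step to the shared catalog."""
    def wrap(fn):
        PREP_CATALOG[name] = fn
        return fn
    return wrap

@register_prep("strip_and_lower")
def strip_and_lower(rows: List[str]) -> List[str]:
    """Normalize whitespace and case, written once, reused everywhere."""
    return [r.strip().lower() for r in rows]

@register_prep("drop_empty")
def drop_empty(rows: List[str]) -> List[str]:
    """Remove blank records."""
    return [r for r in rows if r]

def run_pipeline(rows: List[str], steps: List[str]) -> List[str]:
    """Apply catalog steps by name; any project can reuse the same cleaning."""
    for name in steps:
        rows = PREP_CATALOG[name](rows)
    return rows
```

The point is not the cleaning logic itself, but that a second project composes existing steps by name instead of rewriting them from scratch.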
2. Data experts can create and share assets to be used across the organization, including things like feature stores, a portfolio of data treatments, or even entire AI projects packaged as easy-to-use applications.
3. Anyone at the organization can take, reuse, and adapt AI project work (whether micro, like data preparation, or macro, like AI applications) done by those data experts.
4. Leaders at the organization can ensure the quality of AI projects via AI Governance.
“By having these reusable data pipelines and data products, [we have] streamlined our operational side of development. We’re talking about savings in the range of $4 million plus.”
—Team Lead, Analytics Innovation | Pharmaceutical Company
Changes in Underlying Architecture
It’s not just models that need to be maintained, but architecture as well, especially as technologies in the AI space are seemingly moving at the speed of light. That means switching from one to another happens often, and when it does, it can be costly.
For example, even though the cloud is growing in popularity, most companies will take a hybrid approach, investing in AI platforms like Dataiku that sit on top of the underlying architecture to provide a consistent user experience for working with data no matter where it is stored.
In addition, as organizations’ data teams or centers of excellence grow and as more staff outside of those data professionals start working with data, having a modern approach to architecture that allows for scaling up and down of resources is critical to reducing overall costs associated with AI.
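One way to picture that consistent layer over hybrid infrastructure (an illustrative sketch, not Dataiku's architecture; `DataStore` and `InMemoryStore` are hypothetical names) is a single data-access interface that hides where data actually lives:

```python
from abc import ABC, abstractmethod
from typing import Dict

class DataStore(ABC):
    """One interface for reading and writing data, wherever it is stored.
    Swapping on-premises storage for cloud storage means swapping the
    implementation, not retraining every user on a new workflow."""

    @abstractmethod
    def read(self, key: str) -> bytes: ...

    @abstractmethod
    def write(self, key: str, data: bytes) -> None: ...

class InMemoryStore(DataStore):
    """Stand-in local backend; a cloud-backed store would expose the same API."""

    def __init__(self) -> None:
        self._data: Dict[str, bytes] = {}

    def read(self, key: str) -> bytes:
        return self._data[key]

    def write(self, key: str, data: bytes) -> None:
        self._data[key] = data
```

Because every backend satisfies the same interface, projects built on one storage layer keep working when the architecture underneath changes.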
Model Maintenance (MLOps)
Putting a model into production is an important milestone, but it’s far from the end of the journey. Once a model is developed and deployed, the challenge becomes regularly monitoring and refreshing it to ensure it continues to perform well as conditions or data change.
That means continual AI project maintenance cannot be ignored (or at least not without an effect on profit). Depending on the use case, in the best case the model becomes less and less effective; in the worst case, it becomes harmful to and costly for the business.
MLOps has emerged as a way of controlling the cost of maintenance, shifting from a one-off task handled by a different person for each model — usually the original data scientist who worked on the project — into a systematized, centralized task.
Dataiku has robust MLOps capabilities and makes it easy not only to deploy, but to monitor and manage AI projects in production.
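As an illustration of what systematized monitoring can look like (a generic sketch, not Dataiku's MLOps implementation; the `psi` helper and the 0.2 threshold rule of thumb are assumptions), a scheduled job might compare live feature distributions against their training-time baselines using the Population Stability Index:

```python
import math
from typing import List, Sequence

def psi(expected: Sequence[float], actual: Sequence[float], bins: int = 10) -> float:
    """Population Stability Index between training-time and live feature values.
    A common rule of thumb: PSI above ~0.2 suggests meaningful drift and is
    worth flagging for a retraining check."""
    lo = min(min(expected), min(actual))
    hi = max(max(expected), max(actual))
    width = (hi - lo) / bins or 1.0  # guard against constant-valued features

    def hist(values: Sequence[float]) -> List[float]:
        counts = [0] * bins
        for v in values:
            counts[min(int((v - lo) / width), bins - 1)] += 1
        total = len(values)
        # Small floor avoids log(0) for empty bins
        return [max(c / total, 1e-6) for c in counts]

    e, a = hist(expected), hist(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))
```

Run centrally on a schedule for every production model, a check like this turns maintenance from an ad hoc, per-data-scientist task into a systematized one.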
©2022 Dataiku. All rights reserved.