Data analytics and machine learning (ML) can be game-changers for business, but projects often don’t deliver the expected results.
Foundry’s State of the CIO 2024 report reveals that 80% of CIOs are exploring AI additions to their tech stack, and 74% are working more closely with business leaders on AI applications. Despite all the buzz around AI, only 54% of CIOs report increased IT budgets. Security and rising technology costs are higher priorities than AI investments.
Excitement about AI can lead to irrational decisions. A recent study shows that while key success metrics for analytics projects are ROI, revenue growth, and improved efficiencies, only 32% of respondents successfully deploy more than 60% of their ML models. Additionally, over 50% don’t regularly measure the performance of analytics projects, suggesting that many fail to deliver business value.
High deployment rates aren’t always expected because translating business objectives into accurate models and workflows requires experimentation. But organizations that underperform may cut spending or fall behind competitors.
While technical issues are common, organizational and process-related problems also play a big role. In “Winning with Data Science,” authors Friedman and Swaminathan emphasize the need for business leaders to collaborate with data science teams. Siegel in “The AI Playbook” calls ML deployment a “rare art” that requires clear deployment and prediction goals.
I’ve looked into these organizational issues, and here’s what data science teams can do better. Remember, deployment is just the beginning. To drive ROI, growth, and efficiencies, teams must ensure business teams use the analytics capabilities provided.
Why Analytics and ML Efforts Fall Short
- Analytics Aren’t Integrated into End-User Workflows
Data science teams must understand how models and analytics connect to end-user workflows. It’s hard to gain adoption when a predictive model isn’t integrated into the system where decisions are made.
“When designing AI solutions, start with the user experience to drive business impacts,” says Soumendra Mohanty, chief strategy officer at Tredence. Interview end users to learn their problems instead of giving them disconnected dashboards.
Solution: Start model development with a vision statement for delivering value and integrating analytics solutions into existing business processes.
- Poor Collaboration Between Data Scientists and Developers
Achieving an end-user-adopted workflow requires collaboration between data scientists and software developers.
“A common issue is not having a proper interaction framework between data scientists and developers,” says Rita Priori, CTO at Thirona. Teams must align on the next steps to ensure smooth integration.
Solution: Create agile data science teams that bring on different skill sets during the analytics lifecycle. Early planning may include Six Sigma and UX specialists; later, involve software developers to plan implementation.
- Lack of Attention to Change Management
Expecting end users to adopt ML-enabled workflows without change management is a mistake. Change management helps ensure that new tools are embraced.
“Align tech and business teams and get employee buy-in from the start,” says Lux Narayan, CEO of StreamAlive. Ensure streamlined communication and regular alignment between teams.
Solution: Include stakeholders and selected end-users in drafting the vision statement, reviewing impacts on workflow, and defining success criteria. Involve them in sprint planning and reviews.
- Not Learning Lessons from Experiments
Data scientists understand the iterative nature of their work, but they must also iterate on user experiences and workflows based on feedback.
“AI can enhance user experiences or become a bothersome feature,” says Cody De Arkland of LaunchDarkly. Use experiments to ensure positive user sentiment.
Solution: Implement A/B testing to measure the impact of different implementations and survey end-users. Ensure applications and workflows are observable to capture performance and usability issues.
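As a minimal sketch of the A/B testing step, the snippet below compares conversion rates between a control workflow and an ML-assisted variant using a two-proportion z-test. The sample counts and the churn-style scenario are illustrative assumptions, not figures from the article.

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: variant B embeds model recommendations in the workflow
z = two_proportion_ztest(conv_a=120, n_a=1000, conv_b=160, n_b=1000)
significant = abs(z) > 1.96  # roughly 95% confidence, two-sided
```

In practice, teams often reach for a statistics library such as statsmodels rather than hand-rolling the test, but the decision logic (predefine the metric, collect enough samples, then check significance before rolling out) is the same.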
- Delivering Analytics Without Automation or Integration
More data and predictions are great, but they can create more work if not integrated into decision-making platforms. Automation should be a priority.
“Analytics must be scalable and integrated,” says Vanja Josifovski, CEO of Kumo.
Solution: Use embedded analytics to integrate visuals into user interfaces. APIs offer richer integrations, enabling developers to create new value for users.
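One way an API-based integration can deliver predictions to a user interface is as a structured payload rather than a raw score. The field names below are illustrative assumptions, not from a specific product; the point is that shipping drivers and a model version alongside the score lets the UI explain results and supports auditing.

```python
import json

def prediction_response(customer_id, score, drivers):
    """Package a model score so a UI widget or downstream system can embed it.

    Hypothetical payload shape for an embedded-analytics prediction API.
    """
    return json.dumps({
        "customer_id": customer_id,
        "churn_score": round(score, 3),
        "top_drivers": drivers,      # lets the UI explain the prediction
        "model_version": "v1.2",     # supports auditing and rollback
    })

payload = prediction_response("C-1001", 0.8274, ["late_payments", "support_tickets"])
```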
- Proofs of Concept Without Production Results
Too many proofs of concept (POCs) without production results indicate misalignment with strategic direction or undefined priorities.
“A top reason AI efforts fail is a lack of strategic direction,” says Hillary Ashton, chief product officer of Teradata. “Use reusable data products to create trust in AI.”
Solution: Leadership should guide priorities and promote workflow changes when models are production-ready. Focus on creating reusable data sets, ML models, and visualization components.
- Leadership and Talent Skills Gap
Organizations need leadership talent and skills to keep up with AI advancements. A culture of lifelong learning is essential.
“Lack of viable tech talent can derail analytics efforts,” says Krishnan Venkata, chief client officer of LatentView Analytics.
Solution: Address skill gaps through hiring, training, and partnerships. Ensure the team has enough business acumen to relate analytics to business needs. Define AI leadership roles and invest in multidisciplinary teams and integrated AI platforms.
Thoughts
Despite the hype, AI success depends on fixing organizational and process issues. Establish leadership roles, set priorities, drive collaboration, and promote learning activities to ensure analytics investments deliver value.