
Causal AI for Business: The Quiet Revolution Changing How Companies Think
Correlation does not equal causation…
Correlation does not equal causation…
Correlation does not equal causation…
Correlation does not equal causation…
Correlation does not equal causation…
You’ve probably heard this mantra in a statistics class—or seen it scrawled across a whiteboard somewhere. It’s a core principle of data science: just because two variables move together doesn’t mean one caused the other.
The classic example? Ice cream sales and shark attacks both rise in summer. But it’s not because sharks enjoy dessert—there’s a third variable: hot weather.
Everyone accepts this. Statisticians, analysts, even researchers in artificial intelligence.
But here’s the problem:
In business, we act as if correlation does imply causation.
And we do it every day.
This quiet contradiction sits at the heart of how modern companies use data—
and the Causal AI revolution is here to fix it.
Current Practice: How Businesses Use Data Today
Let’s start by understanding the typical business data journey—from collection to decision-making.
Data is collected, often passively, through company operations: sales records, support tickets, customer interactions, and increasingly, product usage logs. These form the “data wells” that analysts draw from.
Then comes exploratory analysis. The business analyst examines trends, correlations, and anomalies. They look for patterns that might explain—or at least hint at—business outcomes. For example:
- If a customer complains and isn’t refunded within 7 days, there’s a 95% chance they won’t return.
- If a student stays up to date in weeks 1–6, there’s a 99% chance they’ll finish the course.
These are probabilistic associations, not causal claims. Analysts present them with careful language:
“There is insufficient evidence to confirm a statistically significant relationship…”
In short, the analyst remains a scientist. They stay true to the statistical mantra:
Correlation does not imply causation.
The Contradiction (or the Crime)
But here’s the contradiction—the crime hiding in plain sight.
It’s not the analyst who acts on the data. It’s the decision-maker—the head of operations, the marketing lead, the director. They’re the ones who must decide:
- Should we automate more refunds, based on that customer complaint correlation?
- Should we make Weeks 1–6 easier, hoping to increase course completion rates?
And in making those decisions, they often do what the analyst wouldn’t dare:
They assume causality.
They pull the lever on A hoping B will rise. It’s rarely said out loud, but it’s there—the leap from association to intervention.
Now, we don’t blame the business leader for doing this. In fact, they’re often the best positioned to make these calls. They use intuition, experience, and gut feel—those same instincts that have broken the “correlation ≠ causation” rule a thousand times before.
But let’s be honest:
The analyst chose which correlation to spotlight. They knew exactly how it would be read. They just refused to get their hands dirty.
This isn’t a critique of one company or a few analysts—it’s the standard model of how data analytics is taught, even at the Master’s level. It’s seen as clean, scientific, rigorous.
The analyst wears the white coat (Master of Science).
The manager takes the leap (Master of Arts).
But this tidy division hides a real weakness. Data is being used to imply causation—but without the tools or language to do it responsibly.
The Causal AI Revolution – We CAN Infer Causation
Let’s now bring you up to speed on a little-known revolution that is happening across certain domains, soon to be many, and it’s all to do with causality. Because causal judgment is left to the business leader’s intuition, it is of course difficult to teach a computer to develop that intuition and to judge causality. Very difficult, in fact, and this is one way experts can still catch an AI model in the Turing Test: by asking questions about causality.
We have long known that causality is a thing. It’s not magic, and we infer causality every day. In fact, if you have kids, you will have noticed even babies ‘getting what they want’. A scream doesn’t just correlate with mama coming running; it causes it. The cylinder block goes through only the cylinder hole. A wave to a stranger is normally returned. It’s only recently, however, that causality has been studied formally, by scholars such as Judea Pearl (winner of the Turing Award). A new mathematical language has been developed, along with programming tools that help ‘teach’ computers to think causally.
So what’s the main breakthrough? It’s not that correlation can imply causation; the mantra above isn’t refuted, but perhaps we can stop singing it so loudly. The breakthrough is to first create a ‘causal model’: a set of assumptions about how the variables relate to one another. Only once those assumptions are written down as a causal model do we know what data needs to be collected. Specific data is then collected (even created) so that we can feel confident about inferring causation!
Two Examples of Causal Thinking in Business
Let’s look at two simple examples where a business might act on correlation—and how causal thinking would completely reshape the analysis and the data strategy.
1. Fast Refunds = Better Reviews?
A customer service analyst notices a pattern: when customers get their refund quickly, they’re more likely to leave a positive review. Based on this, someone might suggest we automate and speed up refunds. That sounds good—but what if we’re mistaking correlation for causation?
A causal thinker would say: maybe it’s not the speed of the refund causing the good review. Maybe it’s that some types of complaints—say, about late delivery—get refunded quickly and aren’t that serious, so customers feel generous. Or maybe customers who get refunds fast had better delivery experiences in the first place.
We can’t know what causes what unless we model the situation.
Causal model:
Complaint Type and Delivery Experience each influence both the refund speed and the review score.
To actually infer causality, we’d need data on:
- the type of complaint
- the delivery delay
- the product category
- whether the refund was automatic or reviewed
- customer history, like whether they’re a long-time shopper
With this richer dataset and a causal model, we could use tools like DoWhy to simulate:
“If we speed up refunds, will the review score improve for the same customer and complaint type?”
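To make that concrete, here is a minimal sketch of what such an analysis could look like with the open-source DoWhy library mentioned above. The column names (fast_refund, positive_review and the listed confounders) are hypothetical stand-ins for the richer dataset, not fields from any real system.

```python
# Minimal sketch, assuming the enriched refund dataset described above.
# All column names are hypothetical.
import pandas as pd
from dowhy import CausalModel

df = pd.read_csv("refunds.csv")  # hypothetical export of the enriched data

model = CausalModel(
    data=df,
    treatment="fast_refund",        # e.g. 1 if refunded within 48 hours
    outcome="positive_review",      # e.g. 1 if the review is 4+ stars
    common_causes=[
        "complaint_type",
        "delivery_delay",
        "product_category",
        "long_time_customer",
    ],
)

# 1. Identify the estimand implied by the assumed causal model
estimand = model.identify_effect(proceed_when_unidentifiable=True)

# 2. Estimate the effect of a fast refund on the review
estimate = model.estimate_effect(
    estimand, method_name="backdoor.propensity_score_matching"
)
print("Estimated effect of a fast refund:", estimate.value)

# 3. Stress-test the result against a placebo treatment
refutation = model.refute_estimate(
    estimand, estimate, method_name="placebo_treatment_refuter"
)
print(refutation)
```

The point is not the particular estimator; it is that the assumptions are stated explicitly before any effect is read off the data.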
2. Week 6 Completion = Course Completion?
In an education platform, data shows students who complete the first 6 weeks of a course are very likely to complete the whole thing. Management decides to invest heavily in making weeks 1–6 easier. Again—correlation may be fooling us.
What if the students who make it to week 6 are just the most motivated or best prepared from the start? Maybe they were going to finish anyway, no matter how hard the course was.
Causal model:
Motivation and Prior Knowledge both influence whether someone gets past week 6 and whether they finish the course.
To even begin estimating a causal effect, we’d need to measure:
- prior ability (via GPA or a placement score)
- effort (study hours logged, login frequency)
- external constraints (job hours, family status)
- module difficulty for Weeks 1–6
With that, we could ask:
“If all students completed Week 6 (even the low-motivation ones), would we actually see an increase in course completion?”
Only a causal model can help answer that.
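As a sketch of how that intervention question might be posed in code, again using DoWhy and assuming the graph can be passed in DOT format (the node and column names are hypothetical; in practice, motivation and prior knowledge would be proxied by the measurements listed above):

```python
# Minimal sketch; node and column names are hypothetical.
import pandas as pd
from dowhy import CausalModel

df = pd.read_csv("course_progress.csv")  # hypothetical student-level dataset

# Encode the assumptions: motivation and prior knowledge drive both
# Week-6 completion and finishing the course.
graph = """
digraph {
    motivation -> completed_week6;
    motivation -> completed_course;
    prior_knowledge -> completed_week6;
    prior_knowledge -> completed_course;
    completed_week6 -> completed_course;
}
"""

model = CausalModel(
    data=df,
    treatment="completed_week6",
    outcome="completed_course",
    graph=graph,
)

estimand = model.identify_effect()
estimate = model.estimate_effect(
    estimand, method_name="backdoor.propensity_score_stratification"
)

# Roughly: if every student completed Week 6, how much would
# course completion actually rise?
print("Estimated causal effect of completing Week 6:", estimate.value)
```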
The Future of Business Decision Making
In both examples—the refund policy and the course design—the naive move might work, or it might backfire. The correlations are tempting, but without understanding the underlying system, we risk making expensive mistakes.
If we don’t ask the right causal questions and collect the right data, we’re building business decisions on sand.
Causal AI gives us the language and tools to think in terms of interventions, not just associations. That’s what sets it apart from traditional analytics—and why it’s going to reshape how businesses reason, test, and act.
Business has always run on causality. Now, finally, analytics can catch up!

Hands-On with Microsoft Copilot: What Works (and What Doesn’t) in Business Analytics
This post documents my hands-on experience testing Microsoft Copilot for data analysis as part of my Masters in Business Analytics coursework. For more thoughts on human-AI collaboration in analytics, see my previous post: “The Human Layer: What AI Can’t Replace in Data Analytics.”
Last week, while manually creating scatter plots and correlation matrices in Excel, it struck me how many tedious steps were involved—putting columns together, sorting, selecting data ranges, removing defaults, adding axis labels variable by variable. There had to be a better way.
That’s when I decided to test Microsoft Copilot’s data analysis capabilities, using the same car sales dataset we’d been working with in class. What I discovered was both impressive and illuminating about the current state of AI in business analytics.
The Promise: AI-Powered Analysis in Minutes
Microsoft Copilot comes integrated across Office 365 applications—Teams, Outlook, Word, Excel, and more. While our team frequently uses it for meeting summaries, I’d never tried the data analysis features in Excel.
The setup couldn’t be simpler. No installation, no coding required. Click the Copilot logo, and it immediately reads your spreadsheet data. It offers options like “Basic Analysis” or “Advanced Analysis” using Python, among other preset prompts.
I loaded the car sales data from my Masters course and clicked “Advanced Analysis.” Within minutes, Copilot had created a new tab filled with charts and insights, leveraging “Python in Excel”—Microsoft’s partnership with Anaconda that brings Python capabilities directly into spreadsheets.
What Worked Impressively Well
Prompt 1: “Create an advanced analysis of my data in this Excel sheet”
The results were genuinely impressive. Copilot generated multiple visualization types that made analytical sense:
- A histogram of car prices showing distribution patterns
- Car price vs. year scatter plot revealing depreciation trends
- Car price vs. mileage scatter plot showing usage impact
- Boxplots comparing prices across engine sizes and fuel types
What struck me was that the computer doesn’t “understand” cars or automotive markets, yet it created informative graphics by intelligently spotting correlations in the data. This would be an excellent starting point for any analyst—the kind of exploratory analysis that typically takes an hour was completed in minutes.
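For readers curious what that kind of analysis roughly corresponds to under the hood, here is a minimal hand-written sketch of similar exploratory charts in plain pandas and matplotlib. The column names (price, year, mileage, fuel_type) are assumptions about the car sales dataset, not Copilot’s actual output.

```python
# A rough approximation of the exploratory charts described above,
# written in plain pandas/matplotlib. Column names are assumed.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("car_sales.csv")  # hypothetical export of the class dataset

fig, axes = plt.subplots(2, 2, figsize=(12, 8))

# Distribution of car prices
df["price"].plot.hist(bins=30, ax=axes[0, 0], title="Car price distribution")

# Depreciation: price vs. year
df.plot.scatter(x="year", y="price", ax=axes[0, 1], title="Price vs. year")

# Usage impact: price vs. mileage
df.plot.scatter(x="mileage", y="price", ax=axes[1, 0], title="Price vs. mileage")

# Price spread by fuel type
df.boxplot(column="price", by="fuel_type", ax=axes[1, 1])
axes[1, 1].set_title("Price by fuel type")

fig.suptitle("")  # drop pandas' automatic boxplot supertitle
plt.tight_layout()
plt.show()
```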
Prompt 4: “Create a boxplot of car prices with outliers highlighted”
Prompt 5: “Create a table which includes details of the outliers from the above boxplot”
Both of these worked perfectly, likely because the prompts were crystal clear about the expected output. The outlier analysis was particularly well-executed, providing both visual identification and detailed data tables.
Where Human Expertise Became Essential
Prompt 2: “Create a correlation matrix for each of the variables, with sparklines in each of the empty cells and histograms for the cells which are correlated with themselves”
Copilot successfully created a correlation heatmap, but I immediately spotted quality issues that required human intervention. The visualization used red and blue color coding, which violates IBCS (International Business Communication Standards) accessibility recommendations for users who are colorblind.
This was perfect for exploratory purposes, allowing me as the analyst to identify correlating variables for further investigation. But it would need significant modification before presenting to stakeholders.
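As an illustration of the kind of manual correction an analyst might apply, here is a minimal sketch of regenerating the correlation heatmap with a perceptually uniform, colorblind-friendly palette, assuming pandas, matplotlib and seaborn are available:

```python
# Regenerating the correlation heatmap with a colorblind-safe palette.
# The numeric columns are whatever the car sales sheet actually contains.
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns

df = pd.read_csv("car_sales.csv")
corr = df.select_dtypes("number").corr()

plt.figure(figsize=(8, 6))
sns.heatmap(
    corr,
    annot=True,        # print the correlation value in each cell
    fmt=".2f",
    cmap="cividis",    # perceptually uniform, colorblind-friendly colormap
    vmin=-1, vmax=1,
)
plt.title("Correlation matrix (colorblind-safe palette)")
plt.tight_layout()
plt.show()
```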
Prompt 3: “Try again to create the sparklines graphs for each of the correlations”
This simply didn’t work, despite being one of Copilot’s own suggested prompts. No clear explanation why—it just failed to execute.
The Broader Implications
This hands-on experience crystallized something important about the current state of AI in business analytics. Copilot is genuinely transformative—it shifts the goalposts of what can be achieved in terms of both time and skill level.
The democratization effect is real. Data-literate people who aren’t necessarily trained analysts—sales managers, inventory managers, researchers—can now load data and quickly generate meaningful charts. They can iterate through prompts until they have something useful for exploratory or explanatory purposes.
But this doesn’t eliminate the need for skilled analysts. Instead, it elevates our role. The bar has been raised, and what’s expected of top business analysts has risen with it.
Top analysts will increasingly be hired based on distinctly human skills:
- Telling compelling stories, not just publishing incoherent graphs. This includes choosing appropriate visualizations, ensuring accessibility, and crafting narratives that resonate with specific audiences.
- Orchestrating AI tools with sophistication to extract timely signals that business executives can act upon faster than ever.
- Applying domain expertise to spot quality issues, context problems, and strategic implications that AI simply can’t recognize.
What This Means for Analytics Education
This experience also highlighted a gap in how we prepare future analysts. While the theoretical foundations learned at university remain important—timeless principles of design, statistical thinking, and narrative construction—fluency with AI tools like O365 Copilot is becoming essential for top positions.
The future analyst needs to master both the art of asking the right questions and the science of orchestrating AI tools to find answers efficiently.
The Takeaway
Microsoft Copilot for Excel is genuinely impressive and will be a permanent part of my analytics toolkit. It excels at rapid prototyping, exploratory analysis, and handling routine visualization tasks that used to consume significant time.
But my testing confirmed what I’ve observed across all my analytics projects: AI is incredibly powerful when guided by human expertise, but it requires that guidance to deliver truly valuable insights.
The most successful analysts won’t be those who resist these tools or those who blindly trust their outputs. They’ll be the ones who understand how to collaborate with AI—leveraging its speed and computational power while applying human judgment, domain knowledge, and strategic thinking to create analytics that truly drive business value.
Want to see more examples of human-AI collaboration in analytics? Check out my portfolio projects and read about “The Human Layer” in data analytics on the blog.

The Human Layer: What AI Can’t Replace in Data Analytics
This post is a reflection on the data analytics projects I’ve been involved in, as well as my Masters in Business Analytics. You can view the published projects in my portfolio at: www.georgelindley.com/portfolio.
In an era where ChatGPT can generate charts and Claude can write code, many wonder: do we still need human data analysts? After completing my Masters in Business Analytics and working on several high-impact projects, I believe the answer is a resounding yes—but the role is evolving.
The future belongs to analysts who can work with AI, not those who fear being replaced by it. Through my portfolio projects, I’ve discovered that while AI excels at processing data and generating initial outputs, the most critical decisions still require human judgment, domain expertise, and strategic thinking.
Let me show you what I mean through a detailed example from my flagship project.
The Regulated Plants Project: A Case Study in Human-AI Collaboration
In our Regulated Plants Geospatial Analysis, AI helped me process thousands of data points and generate initial visualisations. But the crucial insight came from my understanding of regulatory compliance patterns and stakeholder needs—recognizing that enforcement agencies would need county-level clustering for practical field operations, not just raw botanical data. No AI tool could have made that contextual leap from data to actionable intelligence.
Where Human Expertise Was Irreplaceable
1. Data Curation and Domain Knowledge: The ‘gold’ of the Regulated Plants Database project is the curated database of plant species that are regulated in different states and provinces around the world. Crucially, this data isn’t available in one place and lacks standardized formatting. There are different types of regulations, varying plant taxonomies, and it requires a subject matter expert to find this data and standardize the regulations. That’s precisely why we secured backing from United Nations University and UC Davis Department of Plant Sciences.
2. Building Trust and Institutional Partnerships: Our team reached out to UNU and UC Davis for approval to use their logos and domains as a seal of trust. This took multiple meetings from our project lead—this ‘seal of approval’ was built on handshakes and website reviews by IT personnel. AI cannot build these human relationships or earn institutional trust.
3. Narrative Choice and Data Visualization Strategy: While AI tools can process data into any chart or graph format you request, the human element is crucial in choosing exactly which visualizations to use for accurately and effectively communicating the chosen narrative. This depends on understanding your specific audience—which could be one of many different stakeholder groups.
4. Aesthetic Design and User Experience: The aesthetics of the website and the charts/tables were initially created by AI, but were chosen and modified by humans. This taste and sense of design—creating experiences that resonate with human users—may never be fully replaceable by AI, as it’s for humans, by humans.
5. Advanced Analytics for Visual Storytelling: Currently, I’m not aware of an LLM that can independently find and download geojson data from GitHub repositories, minimize file sizes, and pair it with geographical data from CSV files or SQL tables. This was essential for representing our CSV file of regulations through an interactive map. I’m sure this capability will evolve, but for now, this type of advanced analytics requires significant human orchestration.
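To give a flavour of the orchestration involved, here is a minimal sketch assuming pandas, geopandas and folium; the file names, columns and choropleth setup are illustrative only, not the project’s actual sources or code.

```python
# Illustrative sketch of pairing a GeoJSON boundary file with a CSV of
# regulations and rendering an interactive map. File names and column
# names are hypothetical, not the project's actual sources.
import pandas as pd
import geopandas as gpd
import folium

# Boundary geometries (e.g. states/provinces) downloaded separately
regions = gpd.read_file("regions.geojson")

# The curated regulations table, one row per region
regulations = pd.read_csv("regulations.csv")  # columns: region_code, n_regulated

# Join the tabular data onto the geometries
merged = regions.merge(regulations, on="region_code", how="left")

# Build an interactive choropleth map
m = folium.Map(location=[20, 0], zoom_start=2)
folium.Choropleth(
    geo_data=merged.to_json(),
    data=merged,
    columns=["region_code", "n_regulated"],
    key_on="feature.properties.region_code",
    fill_color="YlGn",
    legend_name="Number of regulated plant species",
).add_to(m)
m.save("regulated_plants_map.html")
```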
How AI Accelerated Our Work
1. Claude Code for Web Development: I used Claude Code to build the website under strict programming guidance. Having built many websites by hand using the Flask web framework, I was able to effectively guide the LLM in creating the various files and folders:
- Created code to transform the CSV master sheet into an SQL database. When we update the CSV sheet, the code automatically updates the database (see the sketch after this list).
- Built the Flask app under strict prompting, including JavaScript/Ajax files that combine geojson data by country with data returned by the Python code.
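The first of those steps, refreshing the SQL database from the CSV master sheet, might look roughly like this minimal sketch (table and column names are hypothetical, not the project’s real schema):

```python
# Minimal sketch of the CSV-to-database refresh step described above.
# Table and column names are illustrative only.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("sqlite:///regulated_plants.db")

def refresh_database(csv_path: str = "master_sheet.csv") -> None:
    """Reload the master CSV and overwrite the SQL table with its contents."""
    df = pd.read_csv(csv_path)
    # Replace the existing table so the database always mirrors the sheet
    df.to_sql("regulations", engine, if_exists="replace", index=False)

if __name__ == "__main__":
    refresh_database()
```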
2. Railway.app for Deployment: Railway.app provides the cloud deployment infrastructure that takes our code from development to production. It automatically reads our GitHub repository, sets up the necessary server environment, manages dependencies, and transforms our Flask application into the live web application you see today. This streamlined deployment process allowed us to focus on the analytics and user experience rather than server management.
The Future of Human + AI Analytics
AI is transformative for analytics—an absolute godsend for practitioners in this field. AI has democratized data analytics, allowing sales managers, researchers, and others to quickly create visualizations. This also empowers data analysts to create more advanced, more visually appealing, and more effective analytics by using LLMs strategically.
However, this democratization doesn’t eliminate the need for skilled analysts—it elevates our role. While anyone can now create a basic chart, it takes human expertise to:
- Ask the right questions of the data
- Understand the business context and implications
- Navigate complex stakeholder needs
- Build trust and institutional relationships
- Make strategic decisions about what story to tell and how to tell it
This pattern of human-AI collaboration repeated across all my projects—from supply chain optimization requiring deep understanding of operational constraints, to financial modeling where regulatory knowledge shaped analytical approaches.
Why This Matters for Your Organization
Organizations that understand this distinction—between AI-generated outputs and human-guided insights—will have a significant competitive advantage. They need analysts who can harness AI’s power while providing the strategic thinking, domain expertise, and stakeholder management that only humans can deliver.
The question isn’t whether AI will replace data analysts. It’s whether your organization will have analysts who can effectively collaborate with AI to deliver insights that truly drive business value.

Best Sales Advice I Ever Received
The following sales advice has helped me deliver the outstanding revenue growth I’ve achieved at Pearson. But don’t think you have to work in sales for sales advice to be useful. Think about this any time you have to ask for something or pitch something: a raise, help, even a date.
Back in 2017 I was new to sales and my boss-to-be kept scratching at the fact that I didn’t have sales experience. So I asked my sales director friend for some sales advice. By repeating the advice like a mantra, I was able to reposition myself in a sales light and land the job of my dreams. And my friend said…
To do well in sales, you need to be liked, trusted and respected.
Frederic Michaelson
Simple sales advice, right? Let’s look at each of those three key areas in more detail to unlock its power:
✅ We buy from people we like
If you’re going to enter into a business relationship with a company, it means you’re going to spend quite a bit of time with their key representatives. Now, would you buy the excellent product that comes with a pushy representative you dread meeting again? Or would you buy a similar, perhaps slightly inferior product from someone whose support you actually enjoy? Hey, maybe you’ll even add their star salesman on LinkedIn and send them a few photos from your holiday over WhatsApp. Sales is a very human pursuit.
✅ We buy from people we trust
Repeated, complex solutions delivered with a partner are more desirable than a one-time transactional relationship. You, as the customer, may depend on the solution or service for many of your own deliverables. Simply put, if you go with the wrong service provider, you may struggle to do your own job well. We purchase from people we trust because we need to believe the benefits of the solution will transfer into workplace results. In turn, this helps you, the customer, look good in front of your line manager! With the services of company X and sales star Y just one phone call away, you will shine in your position.
✅ We buy from people we respect
Finally we have ‘respect’, which differs from trust insofar as respect means holding someone in high esteem, whereas trust means having confidence in someone to deliver on promises. We buy from people we respect because we quietly hope their success, intelligence, or prowess will rub off on us. And it’s not a bad reason to select a solution or service from that provider, because with B2B sales you’ll be spending a lot of time with the key representatives. A well-organised, punctual, knowledgeable sales star encourages similar professionalism in others.