In April, we had the privilege of representing Harmelin Media at the 2025 Tableau Conference in San Diego. As a media agency built on data-driven insights and committed to leveraging technology to enhance the work we do on behalf of our clients, we were thrilled to join the 8,000+ data professionals, BI analysts, and industry experts, all united by a shared passion for working with data to inform, innovate, and inspire across industries.

Learning from the Best: Industry Use Cases
Across the conference floor and in numerous breakout sessions, speakers demonstrated how advanced Tableau users are addressing real-world business challenges. From leading brands to independent consultants and academic researchers, each presenter illustrated bespoke solutions tailored to their organization’s needs, all powered by Tableau. Below are a few of the standout use cases we encountered:
- Retail footprint optimization: A regional retailer shared how they used spatial analytics and Tableau’s new geospatial functions to identify “white space” markets.
- Integrated coding functionality: Biztory demonstrated how they bring Python code into Tableau calculations to supercharge their analysis and workflows (see the sketch after this list).
- Dynamic visualization: Toyota walked through how they monitor and optimize their warehouse operations, combining CAD data with Tableau’s visualization logic to identify waste in their routing processes.
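Tableau supports this kind of Python-in-calculations pattern through its analytics extensions, most commonly TabPy. The sketch below is a minimal, hypothetical example (not Biztory’s actual workflow), assuming a TabPy server running locally on its default port; the endpoint name and outlier logic are placeholders.

```python
# Minimal sketch: deploy a Python function to TabPy so Tableau calculated
# fields can call it. Assumes a local TabPy server on the default port (9004);
# the endpoint name and outlier logic are illustrative placeholders.
import statistics
from tabpy.tabpy_tools.client import Client

def anomaly_flags(values):
    """Flag values more than two standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    return [abs(v - mean) > 2 * stdev for v in values]

client = Client("http://localhost:9004/")
client.deploy("anomaly_flags", anomaly_flags,
              "Flags outliers in a list of numeric values", override=True)

# A Tableau calculated field could then call the deployed endpoint, e.g.:
# SCRIPT_BOOL("return tabpy.query('anomaly_flags', _arg1)['response']", SUM([Sales]))
```

Wrapping shared statistical logic in an endpoint like this keeps the Python in one governed place while letting any workbook reuse it.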
These success stories provided concrete inspiration for how our own team can enhance the reporting and analysis capabilities of what we build with Tableau.
A Glimpse of What’s Ahead: New Features on the Horizon
In the heart of the conference hall, often surrounded by autonomous robots (like Boston Dynamics’ Spot robot dog or the chess-playing SenseRobot, against which we suffered numerous defeats), Tableau’s product team showcased the platform’s roadmap for the next 6–12 months, with a clear emphasis on tighter Salesforce and Data Pro integrations. Attendees celebrated these demonstrations, excited by the potential to strengthen collaboration between data and business teams, empower frontline users with real-time, actionable intelligence, and drive broader trust in the insights generated. Here are a few of the capabilities showcased:
- Augmented data prep: A more intuitive interface for cleaning and blending disparate data sources, powered by machine learning suggestions that accelerate “pre-flight” checks.
- Native Salesforce data connectors: Streamlined sync with Sales Cloud and Service Cloud objects, reducing ETL complexity and ensuring real-time visibility into leads, opportunities, and customer cases.
- Tableau CRM (Data Pro) unification: A single interface for blending Tableau dashboards with Salesforce Einstein Analytics lenses, so sales and marketing teams can explore CRM KPIs without switching platforms.
- Embedded Tableau in Salesforce: New Lightning components and pre-built templates make it trivial to drop interactive visualizations directly into Salesforce pages, boosting adoption among frontline users.
These enhancements reinforce Tableau’s commitment to bridging analytics and CRM — continuously expanding the ways Salesforce and Tableau work together to deliver seamless, end-to-end insights. Seeing these improvements reassured us that Tableau continues to prioritize both depth (for power users) and ease of use (for less technical stakeholders) — a consistent theme throughout the conference.
The Keynote Controversy: AI for All (But at What Cost?)
The highly anticipated keynote started in classic Tableau fashion. Ryan Aytay, President & CEO of Tableau, strolled onto the circular stage, welcoming the thousands of “Data Fam” faithful who packed the auditorium with anticipation. Many attendees had seen keynotes like this numerous times, but this year’s took on a different tone. After Aytay applauded the achievements of the platform and the large community that supports it, excitement mixed with apprehension as he pivoted the conversation to Tableau’s impending AI future.
He acknowledged the uncertainty many practitioners feel about the dramatic acceleration of AI capabilities and its insertion into nearly every aspect of our daily work. Aytay shared a positive take on this shifting landscape, but also felt the need to explicitly address the prime concern on the minds of most attendees in that packed auditorium: Will we be replaced?
“Your job isn’t going away, it’s transforming.”
This is a reality that data professionals (and, increasingly, everyone in the workforce) wrestle with, but for those witnessing the rapid rise of AI on the front lines of tech, the direction is abundantly clear. Whether it’s appointment scheduling, email drafting, content creation, or coding, it is humbling to confront a proficiency we simply can’t match, one that requires us to rethink how we can provide value above and beyond our current roles. Many struggle with the thought of the very tasks and skills we’ve honed over the years quietly being handed off to an AI, forcing us to adapt to a new way of working out of necessity, but those with a positive outlook on change see this as an opportunity.
Aytay explained that AI will give the workforce superpowers, shifting day-to-day responsibilities away from manual, tedious processes toward work of deeper importance and empowering people to focus on strategic analysis, creative problem-solving, and high-impact decision-making rather than data wrangling. AI is now so ubiquitous that the acclimation process is becoming easier for anyone willing to experiment. Whether you are an early adopter or reject AI outright, there is no denying that nearly every application, tool, and software solution built today includes an AI component aimed at streamlining workplace efficiency. Tableau is no different.
Aytay then introduced “Tableau Next,” the “world’s first agentic analytics platform.” The crowd was dazzled by the demonstration of autonomous capabilities that may amplify their productivity and truly transform their role as data professionals in their organization and, in turn, the organizations themselves. Such features included:
- Agentic data orchestration: Logically identifying relationships for more rapid data prep and synthesis.
- Autonomous research capabilities: Identifying trends and key observations within the data to surface relevant insights that might otherwise remain hidden from analysts or take considerable time for a human to uncover.
- Automated report generation: Rapidly generating clear data visualizations and explanations to easily interpret the insights found by the agent, with primary implications/considerations highlighted for key stakeholders.
After a few live demos of these agentic capabilities, Aytay was later joined on stage by Ravi Malick, CIO of Box, who shared his thoughts on Tableau Next, Agentforce, and the “Concierge” chatbot, which provides bespoke data visualizations and insights from your data when prompted with natural language.
By having a “conversation with your data” and getting answers surfaced by an agent at any moment, on any device, Malick and his team found enormous value in direct access to their data paired with AI assistance. He raved about no longer needing to wait for his team to determine the appropriate data sources, pull the data, and transform it, all aspects of “foraging for data” that typically take significant time, effort, and coordination. Now everyone, from C-suite executives to marketing assistants, can query and interpret data without an analyst needing to be in the room.
Therein lay the irony of the grand keynote. On one hand, Tableau celebrated the “Data Fam” and promoted community-centric product enhancements positioned as being for their benefit; on the other, it immediately pivoted to applauding the superior capabilities of AI in this “era of change” and the handoff of human-centric tasks, in front of an audience of the very professionals now facing displacement. The crowd was conflicted.
“Seems like they are replacing developers and analysts. Those whose entire job is to work with tools like Tableau for dashboards and analysis didn’t ask for this.”
“I disagree; it’s expanding the role of the data analyst to create full end-to-end solutions.”
As with most nuanced topics, the truth lies somewhere in the middle. It’s natural to feel uneasy about shifting roles and the prospect of relinquishing control to an emerging technology, but AI’s impact is undeniable, and when guided thoughtfully, it can drive efficiency, innovation, and growth at unprecedented scale. There is truth in Aytay’s perspective that the transformation data professionals must undergo in this agentic era mirrors the transformation our organizations need to embrace. To thrive in an AI-driven world, companies must fundamentally reimagine how they operate, moving from siloed reporting to delivering timely, trustworthy data and insights at every level. Data democratization empowers every team to act with speed and intelligence, yet without the right guardrails, it can introduce serious risks. With that balance in mind, here are the key advantages and potential challenges of opening analytics to the entire organization.
Pros of Data Democratization
- Faster decision-making: Teams can surface insights on their own, reducing dependency on centralized requests.
- Increased innovation: Cross-functional access sparks new hypotheses, prototypes, and data-driven products.
- Empowered non-technical users: Business stakeholders iterate on analyses and validate ideas without waiting for IT or analytics.
- Continuous learning culture: As users engage directly with data, organizational data literacy and critical thinking improve.
Cons & Caveats
- Risk of misinterpretation: Non-experts may draw flawed conclusions or overlook data quality issues.
- Metric fragmentation: Inconsistent definitions (e.g. “active user,” “conversion”) lead to conflicting reports.
- Rogue dataset proliferation: Users may build and share their own data sets at scale without formal review, exposing the organization to potential errors, risks, and compliance issues.
- Governance burden: Immense organizational overhead of defining, communicating, and enforcing policies around data access, usage, and protection to limit non-compliant behavior.
- Duplicate effort: Teams reinvent the wheel with redundant dashboards and analyses.
The promise of democratization isn’t without its hurdles. Organizations must balance the speed and innovation it delivers against the governance, security, and consistency challenges it can introduce, planning thoughtfully for both.
What This Means for Harmelin Media
With years of data-driven rigor woven into Harmelin’s operational fabric — and having already begun to harness light AI enhancements in our media planning and analytics — we now stand at a pivotal juncture. We must chart a path forward supported by substantive AI understanding, avoiding the twin pitfalls of overconfidence from a superficial knowledge of advanced technologies and the naiveté that dismisses AI’s promise until we fall behind. As we open the floodgates to self-service analytics and intelligent automation, every layer of our company — from culture and skills to structure and infrastructure — must evolve in lockstep to fully realize the promise of this new era. To capitalize on democratization while containing its risks, we need to build new pillars of both governance and enablement in the future:
- Dedicated AI governance: Attention and resources will need to be allocated to oversee policy, mitigate bias, and ensure compliance as self-service tools with agentic capabilities are adopted by every department across the agency.
- Query-monitoring & streamlining: As widespread adoption rolls out across teams, we will need to develop a monitoring function to track common queries, identify high-value use cases, and build reusable assets so that redundancies are limited.
- Semantic-layer architects: To ensure all applications operate from our agency’s “source-of-truth” standard operating procedures, we will need to design and maintain the shared business logic layer that powers every AI-driven workflow (a simple sketch follows this list).
- Process translators & liaisons: We will need to deploy technical representatives to work directly with SMEs and business units to codify complex decision-making into AI-actionable processes.
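To make the semantic-layer idea concrete, here is a minimal, hypothetical sketch of centrally defined metrics that any dashboard, query, or AI agent could pull from rather than redefining terms on its own. The metric names and formulas below are placeholders, not actual Harmelin definitions.

```python
# Hypothetical sketch of a shared semantic layer: metric definitions live in one
# governed module so every dashboard, query, and AI agent uses identical logic.
from dataclasses import dataclass

@dataclass(frozen=True)
class Metric:
    name: str         # canonical metric name used across all tools
    expression: str   # the agreed-upon calculation, defined exactly once
    description: str  # plain-language definition for non-technical users

# Placeholder definitions; real ones would be reviewed and version-controlled.
METRICS = {
    "conversion_rate": Metric(
        name="conversion_rate",
        expression="SUM(conversions) / NULLIF(SUM(clicks), 0)",
        description="Conversions divided by clicks, guarding against zero clicks.",
    ),
    "active_user_count": Metric(
        name="active_user_count",
        expression="COUNT(DISTINCT user_id) FILTER (WHERE last_seen >= CURRENT_DATE - 30)",
        description="Users with any activity in the trailing 30 days.",
    ),
}

def get_metric(name: str) -> Metric:
    """Look up a governed metric definition; raises KeyError if it is not defined here."""
    return METRICS[name]
```

A shared layer like this directly addresses the metric-fragmentation risk noted above: when “active user” or “conversion” has one definition, reports built by different teams (or generated by an agent) stop contradicting one another.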
We don’t claim to have all the answers — this is uncharted territory for Harmelin and many of our peers in the industry. However, our proven track record of embracing technological shifts, combined with a highly skilled staff and a culture of continuous curiosity, uniquely positions us to capitalize on these emerging opportunities.
Final Thoughts
The 2025 Tableau Conference in San Diego — truly the data hub of the universe — underscored just how far the platform has evolved and how invaluable it remains to our global community of analysts, developers, and business users. Alongside that progress, an even more profound wave of AI innovation is cresting, one that will touch every facet of our work. Change is arriving at unprecedented speed, but the conference made clear that organizations and individuals everywhere can tap into a vibrant, passionate, and resilient data community — one designed to unite us all as we navigate and shape the AI-driven future together.
For more information, visit harmelin.com, or connect with us on LinkedIn or Facebook.