
Walmart sees "tangible results" from GenAI deployment

"We're learning and applying AI and machine learning to solve the practical opportunities right in front of us."

A Walmart shopper inside one of its stores

Walmart has shared an update on its work to roll out GenAI and its experiments with a personal assistant for both shoppers and staff.

In its latest results, the retailer reported a 5.5% increase in revenue to $169.6 billion on the back of a 27% improvement in eCommerce.

On an earnings call, Doug McMillon, CEO, said: "We're seeing early tangible results from the deployment of generative AI. I'm a little hesitant to talk about AI because I know someone will hear this in the months and years to come and chuckle about how old-school it sounds given how fast things are changing.

"But it's important to convey that we're learning and applying generative AI, AI, and machine learning to solve the practical opportunities right in front of us."

McMillon said Walmart was putting its "valuable" datasets "to work" to improve customer experience and help its staff perform their daily work.

"GenAI has helped us improve our product catalogue by mentioning the personal shopping assistant we're building," he continued. "We've had it in beta form for five months, and it continues to improve. I'm excited about how it will improve the customer experience in the months and years to come, enabling us to provide a better experience than the one that starts by typing into a search bar and getting a list of results to choose from."

A graphic showing Walmart's distribution chain

Roughly 15 months ago, the company launched a GenAI tool called My Assistant for its US office workers. That tool is now deployed across 13 countries, and 50,000 employees have used its conversational interface to ask 1.5 million questions, whether to pull data on people-related metrics such as hiring and retention or to find answers to policy queries, such as how to order discount cards.

"We'll continue to build on these use cases to enable more productivity and help identify the next best task for our associates in stores and clubs," the CEO added. "Just as we're enhancing the customer experience with GenAI, we're working to remove friction for our associates, so they can do high-value work that they enjoy like serving our customers and being merchants. I continue to be excited about how our associates are learning and changing the way they think and work."

Building GenAI tools at Walmart

Walmart's tech teams use a variety of tools to build their AI assistants. Its engineers have previously written about using LlamaIndex, a framework for building knowledge assistants by connecting LLMs to enterprise data.
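
As an illustration of that pattern (a minimal sketch, not code from Walmart's engineering blog; the folder path and question are placeholders), a LlamaIndex knowledge assistant can index a set of internal documents and answer questions against them:

# Minimal sketch: build a vector index over local documents with LlamaIndex
# and query it through an LLM-backed query engine.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Load documents from an illustrative local folder of policy files.
documents = SimpleDirectoryReader("./policy_docs").load_data()

# Build an in-memory vector index and expose a query interface over it.
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

# Ask a question grounded in the indexed enterprise data.
response = query_engine.query("How do associates order a discount card?")
print(response)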

The retailer also uses LangChain to simplify the integration of large language models (LLMs) like GPT into its operations by managing prompts, chaining model calls, and connecting to external data sources such as databases and APIs. This enables Walmart to build advanced AI-powered applications for customer engagement, inventory management, or internal decision-making.
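
As a hedged sketch of what such a chain looks like (the model name, prompt and input are illustrative, not Walmart's), LangChain lets developers template a prompt, call a chat model and parse the output in a few lines:

# Illustrative LangChain chain: prompt template -> chat model -> text output.
# Assumes OPENAI_API_KEY is set in the environment.
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

prompt = ChatPromptTemplate.from_template(
    "Summarize the return policy for this product category: {category}"
)
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

# Pipe the pieces together; invoke() runs one request through the chain.
chain = prompt | llm | StrOutputParser()
print(chain.invoke({"category": "electronics"}))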

Apache Beam gives it a unified framework for processing large-scale data, whether in real-time streams or batch jobs. Walmart uses Beam for tasks like transforming, aggregating, and analyzing data from multiple sources, supporting ETL operations, event processing, and machine learning pipelines.
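
A toy Beam pipeline (not Walmart's, with made-up file names) shows the shape of such a job; the same transforms apply whether the source is a bounded file or an unbounded stream:

# Count order events per store: read, parse, aggregate, write.
import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromText("orders.csv")
        | "KeyByStore" >> beam.Map(lambda line: (line.split(",")[0], 1))
        | "CountPerStore" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda store, n: f"{store},{n}")
        | "Write" >> beam.io.WriteToText("orders_per_store")
    )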

Additionally, Google Cloud Dataflow, with its distributed architecture and automated scaling, helps Walmart manage high-throughput data processing efficiently. By handling failures and optimizing resources, Dataflow ensures reliable execution of large-scale, concurrent tasks that are critical for its dynamic retail operations and real-time analytics.
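
Running a Beam pipeline on Dataflow is largely a matter of pipeline options; the project, region, bucket and worker limit below are placeholders for illustration:

# Sketch of options for submitting a Beam job to Google Cloud Dataflow.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-gcp-project",            # placeholder project ID
    region="us-central1",
    temp_location="gs://my-bucket/tmp",  # placeholder staging bucket
    autoscaling_algorithm="THROUGHPUT_BASED",  # let Dataflow scale workers
    max_num_workers=50,
)

# The same transforms shown above would be attached to this pipeline.
pipeline = beam.Pipeline(options=options)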

"Google Cloud Dataflow is an efficient and powerful tool for large-scale data processing tasks," software engineer Sahaja Puligadda wrote. "It is like having a personal assistant that can juggle both batch and streaming data, all while being easy to use, reliable, and able to grow with your business. It is like the superhero of data tools, providing real-time insights, powering up BI dashboards, and making data integration a walk in the park."

One of the challenges Walmart has tackled with these tools is making parallel (simultaneous) API calls to LLMs, which it achieves by running Apache Beam and the LangChain framework on Dataflow.
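
A hedged sketch of that pattern, assuming an OpenAI-style chat model rather than whatever endpoint Walmart actually calls: wrapping the LLM call in a Beam DoFn lets Dataflow fan the requests out across elements and workers in parallel:

import apache_beam as beam
from langchain_openai import ChatOpenAI  # assumes OPENAI_API_KEY is set

class CallLLM(beam.DoFn):
    """Calls an LLM once per element; Beam runs many elements concurrently."""

    def setup(self):
        # One client per worker, created when the worker starts up.
        self.llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

    def process(self, product_description):
        # Each element becomes an independent API call.
        reply = self.llm.invoke(
            f"Extract the brand and category from: {product_description}"
        )
        yield reply.content

with beam.Pipeline() as pipeline:
    (
        pipeline
        | beam.Create(["Great Value 2% Milk, 1 Gallon", "Ozark Trail 8-Person Tent"])
        | beam.ParDo(CallLLM())
        | beam.Map(print)
    )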

Richa Sharma, senior data scientist, wrote: "Modern AI applications often require integration with multiple large language model (LLM) APIs to leverage different offerings from different LLMs. The closed-source LLMs offer APIs that can readily be used for many real-world business problems.

"However, for large enterprises where data volumes are immense and key metrics such as reduced latency and improved throughput are crucial, it becomes essential to make parallel API calls to LLMs. These calls must be secure especially when dealing with sensitive enterprise information."

Walmart also uses retrieval-augmented generation (RAG) to combat hallucinations and works with open-source LLMs, including Llama, so it does not appear to have developed proprietary LLMs of its own.
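
A minimal RAG sketch under those assumptions (the documents, embedding model and locally served Llama model are stand-ins, not Walmart's setup): retrieve relevant passages first, then instruct the model to answer only from them:

from langchain_community.vectorstores import FAISS
from langchain_ollama import ChatOllama, OllamaEmbeddings

# Toy document store; in practice this would hold enterprise content.
docs = ["Discount cards can be ordered through the associate portal."]
store = FAISS.from_texts(docs, OllamaEmbeddings(model="llama3"))
retriever = store.as_retriever()

llm = ChatOllama(model="llama3")  # open-source Llama model served locally
question = "How do I order a discount card?"
context = "\n".join(d.page_content for d in retriever.invoke(question))

# Grounding the answer in retrieved context is what curbs hallucinations.
answer = llm.invoke(f"Answer using only this context:\n{context}\n\nQ: {question}")
print(answer.content)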

Read more about Walmart's work on its technical blog.
