Get paid faster: How Intuit’s new AI agents help businesses get funds up to 5 days faster and save 12 hours a month with autonomous workflows



Intuit has been on a journey with generative AI over the last several years, incorporating the technology into its services across QuickBooks, Credit Karma, TurboTax and Mailchimp.

Today the company is taking the next step with a series of AI agents that go beyond that to transform how small and mid-market businesses operate. These new agents work as a virtual team that automates workflows and provides real-time business insights. They include capabilities for payments, accounting and finance that will directly impact business operations. According to Intuit, customers save up to 12 hours per month and, on average, get paid up to five days faster thanks to the new agents.

“If you look at the trajectory of our AI experiences at Intuit in the early years, AI was built into the background, and with Intuit Assist, you saw a shift to provide information back to the customer,” Ashok Srivastava, chief AI and data officer at Intuit, told VentureBeat. “Now what you’re seeing is a complete redesign. The agents are actually doing work on behalf of the customer, with their permission.”

Technical architecture: From starter kit to production agents

Intuit has been working on the path from assistants to agentic AI for some time.

In September 2024, the company detailed its plans to use AI to automate complex tasks. It’s an approach built firmly on the company’s generative AI operating system (GenOS) platform, the foundation of its AI efforts.

Earlier this month, Intuit announced a series of efforts that further extend its capabilities. The company has developed its own prompt optimization service that will optimize queries for any large language model (LLM). It has also developed what it calls an intelligent data cognition layer for enterprise data that can understand different data sources required for enterprise workflows.
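Intuit has not published the internals of its prompt optimization service, but the core idea, adapting one raw query to the constraints and conventions of whichever LLM will serve it, can be sketched in a few lines. Everything here (the `ModelProfile` fields, the instruction text) is a hypothetical illustration, not Intuit's API:

```python
from dataclasses import dataclass

@dataclass
class ModelProfile:
    """Hypothetical per-model constraints a prompt optimizer might track."""
    name: str
    max_prompt_chars: int
    prefers_system_role: bool

def optimize_prompt(query: str, profile: ModelProfile) -> list[dict]:
    """Rewrite a raw user query into a message list tuned to one model.

    Trims the query to the model's budget and places standing instructions
    in a system message only when the target model supports that role.
    """
    instructions = "Answer using the business's own financial data when available."
    trimmed = query[: profile.max_prompt_chars]
    if profile.prefers_system_role:
        return [
            {"role": "system", "content": instructions},
            {"role": "user", "content": trimmed},
        ]
    # Fold instructions into the user turn for models without a system role.
    return [{"role": "user", "content": f"{instructions}\n\n{trimmed}"}]

messages = optimize_prompt(
    "Which invoices are most likely to be paid late this month?",
    ModelProfile(name="generic-llm", max_prompt_chars=4000, prefers_system_role=True),
)
```

A production service would add far more (few-shot selection, model-specific phrasing, caching), but the point of the design is the same: callers write one query, and the service handles per-model differences.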

Going a step further, Intuit developed an agent starter kit that builds on the company’s technical foundation to enable agentic AI development.

The agent portfolio: From cash flow to customer management

With the technical foundation in place, including agent starter kits, Intuit has built out a series of new agents that help business owners get things done.

Intuit’s agent suite demonstrates the technical sophistication required to move from predictive AI to autonomous workflow execution. Each agent coordinates prediction, natural language processing (NLP) and autonomous decision-making within complete business processes. They include:

Payments agent: Autonomously optimizes cash flow by predicting late payments, generating invoices and executing follow-up sequences. 

Accounting agent: Represents Intuit’s evolution from rules-based systems to autonomous bookkeeping. The agent now autonomously handles transaction categorization, reconciliation and workflow completion, delivering cleaner and more accurate books.

Finance agent: Automates strategic analysis traditionally requiring dedicated business intelligence (BI) tools and human analysts. Provides key performance indicator (KPI) analysis, scenario planning and forecasting based on how the company is doing against peer benchmarks while autonomously generating growth recommendations.

Intuit is also building out customer hub agents that will help with customer acquisition tasks. Payroll and project management agents are also part of future release plans.
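Intuit has not disclosed how the payments agent is implemented, but its described behavior, predicting which invoices will be paid late and scheduling follow-ups, can be illustrated with a deliberately toy sketch. The risk score and the three-day reminder window below are invented for illustration:

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Invoice:
    customer: str
    amount: float
    due: date
    avg_days_late: float  # customer's average lateness on past invoices

def late_payment_risk(inv: Invoice) -> float:
    """Toy risk score: scale historical lateness into [0, 1].

    A real agent would use a trained model over many features; this
    single-feature heuristic only shows where prediction plugs in.
    """
    return min(inv.avg_days_late / 30.0, 1.0)

def plan_followups(invoices: list[Invoice], threshold: float = 0.5) -> list[str]:
    """For risky invoices, schedule a reminder a few days before the due date."""
    actions = []
    for inv in invoices:
        if late_payment_risk(inv) >= threshold:
            remind_on = inv.due - timedelta(days=3)
            actions.append(f"Remind {inv.customer} on {remind_on.isoformat()}")
    return actions

today = date(2025, 7, 1)
actions = plan_followups([
    Invoice("Acme Co", 1200.0, today + timedelta(days=10), avg_days_late=21),
    Invoice("Beta LLC", 300.0, today + timedelta(days=10), avg_days_late=2),
])
```

The structure mirrors what the article describes: a prediction step gates an autonomous action step, with the threshold acting as the permission boundary a business owner could tune.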

Beyond conversational UI: Task-oriented agent design

The new agents mark an evolution in how AI is presented to users.

Intuit’s interface redesign reveals important user experience principles for enterprise agent deployment. Rather than bolting AI capabilities onto existing software, the company fundamentally restructured the QuickBooks user experience for AI.

“The user interface now is really oriented around the business tasks that need to be done,” Srivastava explained. “It allows for real-time insights and recommendations to come to the user directly.”

This task-centric approach contrasts with the chat-based interfaces dominating current enterprise AI tools. Instead of requiring users to learn prompting strategies or navigate conversational flows, the agents operate within existing business workflows. The system includes what Intuit calls a “business feed” that contextually surfaces agent actions and recommendations.
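Intuit hasn't published how its “business feed” is built; as one way to picture the design, a feed can be a simple prioritized stream of agent actions, with items awaiting the owner's approval surfaced ahead of informational updates. The field names below are assumptions:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class FeedItem:
    """One agent action or recommendation surfaced to the business owner."""
    agent: str
    summary: str
    needs_approval: bool
    created: datetime

def business_feed(items: list[FeedItem]) -> list[FeedItem]:
    """Order the feed: items awaiting approval first, then newest first."""
    return sorted(items, key=lambda i: (not i.needs_approval, -i.created.timestamp()))

now = datetime(2025, 7, 1, 12, 0)
feed = business_feed([
    FeedItem("finance", "Q3 cash flow forecast updated", False, now),
    FeedItem("payments", "Send reminder to Acme Co?", True, now - timedelta(hours=2)),
])
```

Note how this inverts the chat pattern: instead of the user prompting the system, the agents push concrete, approvable tasks into the user's existing workflow.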

Trust and verification: The closed-loop challenge

One of the most technically significant aspects of Intuit’s implementation addresses a critical challenge in autonomous agent deployment: verification and trust. Enterprise AI teams often struggle with the black box problem — how do you ensure AI agents are performing correctly when they operate autonomously?

“In order to build trust with artificial intelligence systems, we need to provide proof points back to the customer that what they think is happening is actually happening,” Srivastava emphasized. “That closed loop is very, very important.”

Intuit’s solution involves building verification capabilities directly into GenOS, allowing the system to provide evidence of agent actions and outcomes. For the payments agent, this means showing users that invoices were sent, tracking delivery and demonstrating the improvement in payment cycles that results from the agent’s actions.

This verification approach offers a template for enterprise teams deploying autonomous agents in high-stakes business processes. Rather than asking users to trust AI outputs, the system provides auditable trails and measurable outcomes.
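GenOS's verification layer isn't public, but the closed-loop pattern Srivastava describes, every autonomous action leaving evidence the customer can check, maps naturally onto an append-only audit trail. This is a minimal sketch under that assumption; the class and method names are illustrative:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AuditEvent:
    action: str      # e.g. "invoice_sent"
    evidence: str    # pointer to proof: a message id, a delivery receipt
    at: datetime

class AuditTrail:
    """Append-only record that lets a user verify what an agent did."""

    def __init__(self) -> None:
        self._events: list[AuditEvent] = []

    def record(self, action: str, evidence: str) -> None:
        """Log an action together with the evidence that it happened."""
        self._events.append(AuditEvent(action, evidence, datetime.now(timezone.utc)))

    def verify(self, action: str) -> bool:
        """Close the loop: confirm a claimed action has recorded evidence."""
        return any(e.action == action and e.evidence for e in self._events)

trail = AuditTrail()
trail.record("invoice_sent", "msg-id:8f3a")
trail.record("delivery_confirmed", "receipt:2025-06-30")
```

The design choice worth copying is that verification is a first-class capability of the platform, so every agent gets an auditable trail for free rather than each team bolting one on later.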

What this means for enterprises looking to get into agentic AI

Intuit’s evolution offers a concrete roadmap for enterprise teams planning autonomous AI implementations:

Focus on workflow completion, not conversation: Target specific business processes for end-to-end automation rather than building general-purpose chat interfaces.

Build agent orchestration infrastructure: Invest in platforms that coordinate prediction, language processing and autonomous execution within unified workflows, not isolated AI tools.

Design verification systems upfront: Include comprehensive audit trails, outcome tracking and user notifications as core capabilities rather than afterthoughts.

Map workflows before building technology: Use customer advisory programs to define agent capabilities based on actual operational challenges.

Plan for interface redesign: Optimize UX for agent-driven workflows rather than traditional software navigation patterns.

“As large language models become commoditized, the experiences that are built upon them become much more important,” Srivastava said.

