
Run simulations for Fin Procedures

Learn how to use Simulations to validate Fin Procedures, build confidence, and catch problems before they impact customers.

Written by Dawn
Updated over 2 weeks ago

Simulations allow you to validate Fin Procedures, build confidence in your automation, and catch problems before they impact your customers. By modeling full conversations, simulations help your team handle high-volume or complex scenarios, such as cancellations and refunds, with certainty.

Designed to replace time-consuming manual checks, simulations help you identify issues or gradual changes in Fin's behavior as your business logic evolves.

Note: Simulations for Procedures are currently in closed beta.


Accessing simulations

Simulations are located within the testing panel of a Procedure. To access them:

  1. Open the Procedure you wish to test.

  2. Click Test in the top right corner of the canvas.

  3. Select the Simulations tab in the right-hand panel.


Creating a simulation

You can create a simulation in two ways: using AI-generated suggestions for a quick start, or manually defining the scenario for full control.

AI-generated simulations

Based on your Procedure's instructions, Fin AI generates ready-made starter tests so you can begin simulating quickly.

  1. Open the Simulations tab in the right-hand panel of your Procedure.

  2. Under Suggested for these instructions, review the list of proposed scenarios (e.g., "Full cancellation request").

  3. Click the Play icon next to a suggestion to run it instantly.

  4. Alternatively, click Run all to execute all suggested simulations at once.

Manually created simulations

You can also build a simulation from scratch to test specific edge cases based on the Procedure instructions.

  1. In the Simulations tab, click + New.

  2. Simulation name: Give your simulation a clear title.

  3. Simulate as: Choose a specific user or brand to test personalization. You can select from a dropdown list of real users in your workspace.

  4. Customer's opening message: Enter the message the customer sends to start the conversation (e.g., "My card is blocked").

  5. Attach image (Optional): Upload an image (e.g., a screenshot of an error) to test how Fin handles visual context.

  6. Additional details: Add context for the AI, such as "Customer is frustrated" or "Customer is on a Basic Plan".

Define available data

The Customer data available to Fin section allows you to define the data Fin has access to during the test. This ensures you are testing against precise data values rather than relying on vague descriptions.

  • Simulation time: Set a specific date and time to test time-sensitive logic, like checking if a refund window has expired.

  • Attributes and Data Connectors: This section pre-populates with the data referenced in your Procedure. You can update these values (e.g., set People.Is.On.Repayment.Plan to True or False) to test different branching outcomes.

Important: Avoid typing specific data values into the Customer's opening message or Additional details fields, as Fin might miss them. Instead, enter exact values into the specific Attributes or Variables in this section.
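To see why precise values matter, here is a minimal Python sketch of the kind of time-sensitive branch described above: a refund-window check driven by a pinned simulation time and an exact attribute value. The attribute names and the 14-day window are illustrative assumptions, not part of the product.

```python
from datetime import datetime, timedelta

# Hypothetical values you would set in the
# "Customer data available to Fin" section of a simulation.
customer_data = {
    "order_date": datetime(2024, 6, 1),
}

# "Simulation time" lets you pin the clock to exercise this branch.
simulation_time = datetime(2024, 6, 20)

REFUND_WINDOW = timedelta(days=14)  # assumed policy, for illustration only

def refund_window_open(order_date, now):
    """Return True if the order is still within the refund window."""
    return now - order_date <= REFUND_WINDOW

# 19 days after the order, the window has expired, so a Procedure
# branching on this check should take the "expired" path.
print(refund_window_open(customer_data["order_date"], simulation_time))
```

Because the order date is an exact attribute value rather than free text in the opening message, the branch taken is fully determined by the data you set.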

Evaluate Fin's behavior

Define the criteria that must be true for the test to pass. Click + Add criteria and select:

  • Fin reply: Specify what Fin should (or should not) say during the conversation.

  • Attributes: Verify if an attribute was set, was not set, was equal to, or was not equal to a specific value.

  • Data connector: Verify if a connector is triggered, is not triggered, or is triggered exactly X times.

  • Instruction outcome: Check if the conversation Finished or was handed off to a teammate or specific team.

Once configured, click Save to store the test or Run to start it immediately.
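Conceptually, a simulation passes only when every defined criterion holds. The sketch below models that with hypothetical data shapes; it is not Intercom's internal implementation, just an illustration of the four criteria types listed above.

```python
# A minimal sketch of pass/fail evaluation: a test passes only if
# every criterion holds. Data shapes here are assumptions.

def evaluate(criteria, result):
    """Return True if the simulated run satisfies all criteria."""
    checks = {
        "fin_reply_contains": lambda v: v in result["transcript"],
        "attribute_equals": lambda v: result["attributes"].get(v[0]) == v[1],
        "connector_triggered_times": lambda v: result["connector_calls"].get(v[0], 0) == v[1],
        "outcome": lambda v: result["outcome"] == v,
    }
    return all(checks[kind](value) for kind, value in criteria)

# Example: a simulated run of a cancellation Procedure.
result = {
    "transcript": "I've cancelled your subscription and issued a refund.",
    "attributes": {"subscription_status": "cancelled"},
    "connector_calls": {"cancel_subscription": 1},
    "outcome": "finished",
}

criteria = [
    ("fin_reply_contains", "cancelled your subscription"),
    ("attribute_equals", ("subscription_status", "cancelled")),
    ("connector_triggered_times", ("cancel_subscription", 1)),
    ("outcome", "finished"),
]

print("Passed" if evaluate(criteria, result) else "Failed")
```

A single failed criterion fails the whole test, which is why overly rigid criteria (see the FAQs below) can fail runs where Fin actually solved the problem.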

Tip: When designing simulations, look at your branching logic. If a step implies multiple outcomes (e.g., "Check reason for card block"), create a separate test case for each path. This builds a "regression safety net" to ensure future updates don't break existing logic.


Running and reviewing results

Once you run a test, it appears in the Tests panel on the right-hand side with a status indicator:

  • Running: The test is actively being executed.

  • Passed: The test ran and successfully met all defined success criteria.

  • Failed: The test ran but did not meet the defined success criteria.

  • Queued: The test has been initiated but is waiting for the preceding simulation to finish before executing.

To investigate a result, click See conversation. This opens the full back-and-forth transcript between the simulated customer and Fin, making it easy to see exactly how the flow played out and why a test passed or failed.
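The status lifecycle above (tests run one at a time, each waiting in Queued until the preceding one finishes) can be sketched as a simple sequential runner. This is an illustration of the described behavior only, with a boolean standing in for the real criteria evaluation.

```python
from enum import Enum

class Status(Enum):
    QUEUED = "Queued"
    RUNNING = "Running"
    PASSED = "Passed"
    FAILED = "Failed"

def run_all(tests):
    """Run tests sequentially: each stays Queued until the prior one finishes."""
    statuses = {name: Status.QUEUED for name in tests}
    results = []
    for name, passes in tests.items():
        statuses[name] = Status.RUNNING
        # `passes` stands in for evaluating the test's success criteria.
        statuses[name] = Status.PASSED if passes else Status.FAILED
        results.append((name, statuses[name].value))
    return results

print(run_all({"Full cancellation request": True, "Refund outside window": False}))
```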


FAQs

Why use Simulations instead of manual testing?

Manual testing is great for quick spot checks or configuration reviews. However, Simulations allow you to validate Procedures at scale and ensure Fin performs reliably in complex, high-risk scenarios. Running them before every launch helps you catch unexpected behavior early.

What happens if a Simulation fails?

You can review the full simulated conversation to understand why Fin didn’t behave as expected, adjust your Procedure, and re-run the Simulation with no impact to customers.

Why is my simulation marked "Failed" even though Fin successfully solved the problem?

This usually happens when your Success Criteria are too rigid. For example, if you require Fin to "Ask for an Order ID," but Fin is smart enough to find the ID automatically, the test will fail because Fin skipped the question. Update your criteria to focus on the final outcome (e.g., "Procedure finished") rather than mandating specific intermediate steps.

Fin stops midway through the simulation. Why is it getting stuck?

Fin often stops if it hits a "dead end" in your instructions, such as checking a variable that turns out to be empty (e.g., People.signed_up). If you haven't told Fin what to do when data is missing, it will stop. Ensure your instructions have a "fallback" plan, such as: "Check if the variable has a value. If it is empty, ask the customer for the date."
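The fallback pattern in that instruction can be sketched as a simple branch: check whether the variable has a value before relying on it, instead of dead-ending. The attribute name and messages are illustrative, not real workspace data.

```python
# Sketch of the "fallback" pattern: handle the empty case explicitly
# so the flow never dead-ends on missing data.

def next_step(people_signed_up):
    if people_signed_up:  # variable has a value: proceed as normal
        return f"Continue using sign-up date {people_signed_up}"
    # Fallback mirroring the instruction:
    # "If it is empty, ask the customer for the date."
    return "Ask the customer for their sign-up date"

print(next_step(None))
print(next_step("2024-06-01"))
```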

Where should I enter test data like "Sign up dates" or "Order history"?

Avoid typing specific data values into the Customer's opening message or Additional details fields, as Fin might miss them. Instead, use the Customer data available to Fin section in the simulation setup. Enter exact values (e.g., 2024-06-01) into the specific Attributes or Variables you are testing.

My tool works in real life, but fails in the simulation. Why?

Simulations try to fetch data for the specific user selected in the Simulate as field. If you select a random test user who doesn't actually have data in your external system (like Shopify or Stripe), the tool will return nothing. We recommend creating a dedicated test user in your external system with real data, and selecting that profile in the Simulate as dropdown.

