At Attest, we prioritise your data’s privacy and security, and we want to be transparent about how we use AI to enhance our users’ experience.
How do we use AI in the Attest platform?
We use proprietary models that are built and managed by our in-house data experts, with limited use of third-party generative AI tools.
Proprietary models
This category refers to our approach to developing in-house AI models via multiple learning techniques, including Natural Language Processing (“NLP”).
These models are used as part of our data checks that help to ensure high quality survey responses. They are constantly reviewed to ensure a positive impact on the integrity of our customers’ research.
Third-party Generative AI
We are beginning to introduce a limited and carefully chosen selection of AWS’ AI models to complete tasks such as summarising open-text responses or transcripts from our video response feature.
These features are currently only available via a limited beta as we learn more about the value they provide to our customers.
Do the AI models have access to customer data or PII?
The AI models do not have access to our customers’ data; they utilise only respondents’ survey text, and are overseen and carefully monitored by our expert Data Team.
Survey responses do not contain any personally identifiable information, and only the anonymous response text data is utilised to ensure that individuals cannot be identified.
Is customer data used to train Attest’s AI models?
Attest does not use customers’ personal data to train any of our AI models. This applies to both third-party generative AI tools and our in-house proprietary models.
We also have broad opt-outs in place to ensure that any data provided by Attest is not used to train AWS’ models.
Where can I find more information about data protection and information security at Attest?
You’ll find more information about our stance on data protection and security on our Legal webpages, here.