At Attest, we prioritise your data’s privacy and security and want to be transparent about how we use AI to enhance our users’ experience.
How do we use AI in the Attest platform?
We use proprietary models that are built and managed by our in-house data experts, alongside limited use of carefully selected third-party generative AI tools.
Proprietary models
This category refers to our approach to developing in-house AI models using multiple learning techniques, including Natural Language Processing (“NLP”).
These models are used as part of our data checks that help to ensure high quality survey responses. They are constantly reviewed to ensure a positive impact on the integrity of our customers’ research.
Third-party Generative AI
We have introduced a limited and carefully chosen selection of AI models to complete tasks such as summarising open-text responses and transcripts from our video response feature.
Is customer data used to train Attest’s AI models?
Attest does not use customers’ personal data to train any of our AI models. This applies to both third-party Generative AI tools and our in-house proprietary models.
We also have broad opt-outs in place to ensure that any data provided by Attest is not used to train third-party models.
Where can I find more information about data protection and information security at Attest?
You’ll find more information about our stance on data protection and security on our Legal webpages, here.