Most common e-commerce solutions are aware of URLs with parameters and know that these URLs are not essential for search engine crawlers or for users.
Ideally, your e-commerce solution supports adding rules to your website's robots.txt file. The robots.txt file defines rules for user agents (crawlers identify themselves via user agents) and allows or disallows them from crawling specific URLs or folders.
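As a sketch, a robots.txt file that blocks crawlers from parameterized URLs might look like the following (the parameter names `sort` and `sessionid` are hypothetical examples, not taken from any specific shop system):

```
# Rules apply to all crawlers (all user agents)
User-agent: *
# Block URLs containing these query parameters
Disallow: /*?sort=
Disallow: /*?sessionid=
# Everything else remains crawlable
Allow: /
```

Note that the `*` wildcard in paths is supported by major crawlers such as Googlebot, but it is not part of the original robots.txt standard, so behavior may vary between search engines.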
Once you have a fully operational robots.txt, click on "Project-Settings" in the upper right corner of Ryte and choose "robots.txt behaviour" in the "Advanced analysis" tab. There you can choose whether to crawl everything on your domain or to crawl according to the robots.txt on your server.