Private Data Share Guide
Written by Shipwell TMS Support

Unleash the Power of Your Supply Chain Data with Shipwell's Private Data Sharing

Here at Shipwell, we understand the importance of having access to insightful data to drive informed decision-making in your shipping and logistics operations. That's why we're excited to introduce our new Private Data Sharing feature, designed to give you all of your supply chain data in a modern, fast analytical schema so that you can incorporate it into your own data pipelines and the BI tool of your choice.

Unlock New Levels of Collaboration:

With Private Data Sharing, you can securely access your data within Shipwell’s Data Warehouse at a replication frequency of your choice. This gives you:

  • Ready-to-Analyze Data:

    1. Pre-transformed and cleaned data: We've done the heavy lifting for you. No need to waste time cleaning and structuring your data.

    2. Star schema: This optimized schema facilitates lightning-fast analytics and BI reporting, allowing you to quickly extract valuable insights (an example query appears after this list).

  • Secure and Flexible Collaboration:

    1. Granular access control: Share data securely with partners and stakeholders. Grant specific permissions to individual data objects, ensuring only authorized users have access.

    2. Seamless integration with your existing tools: No need to learn new platforms. Shipwell's Private Data Sharing integrates seamlessly with your preferred BI tools, such as Power BI, Tableau, and Domo.

  • Continuous Updates and Consistency:

    1. Automated data pipeline: Receive new schema updates and data revisions as we launch new features and products. Stay up-to-date without manual intervention.

    2. Data consistency across your organization: Eliminate discrepancies and ensure everyone works with the same accurate and complete information.
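
To make the star-schema point concrete, here is a minimal sketch of the kind of BI-style query the shared schema is designed to serve, run through the snowflake-connector-python driver. The table, column, and connection names (fact_shipments, dim_carrier, dim_date, and so on) are illustrative placeholders, not the actual share schema; the schema documentation delivered with your share lists the real object names.

```python
# Minimal sketch: querying a hypothetical star schema in the shared database
# with snowflake-connector-python. All table, column, and connection values
# are illustrative placeholders, not Shipwell's actual schema.
import snowflake.connector

# Example BI-style query: shipment count and spend by carrier and month,
# joining a hypothetical fact table to two hypothetical dimension tables.
QUERY = """
SELECT
    d.year_month,
    c.carrier_name,
    COUNT(*)          AS shipment_count,
    SUM(f.total_cost) AS total_spend
FROM fact_shipments AS f
JOIN dim_carrier    AS c ON f.carrier_key   = c.carrier_key
JOIN dim_date       AS d ON f.ship_date_key = d.date_key
GROUP BY d.year_month, c.carrier_name
ORDER BY d.year_month, total_spend DESC
"""

conn = snowflake.connector.connect(
    account="your_account_identifier",  # placeholder
    user="your_user",                   # placeholder
    password="your_password",           # placeholder; prefer key-pair auth or SSO
    warehouse="ANALYTICS_WH",           # placeholder warehouse
    database="SHIPWELL_SHARE",          # placeholder name of the mounted share
    schema="ANALYTICS",                 # placeholder schema
)
try:
    cur = conn.cursor()
    for year_month, carrier, count, spend in cur.execute(QUERY):
        print(f"{year_month} {carrier}: {count} shipments, {spend} total spend")
finally:
    conn.close()
```

Equivalent SQL can typically be issued directly from BI tools such as Power BI, Tableau, or Domo once they are connected to the shared database.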

Flexibility and Control:

We believe that you should have complete control over your data. That's why Private Data Sharing allows you to:

  • Choose What You Share: Select specific data sets for sharing, ensuring that sensitive information remains protected.

  • Grant Precise Access: Define user roles and permissions to control who has access to specific data objects (a consumer-side example follows this list).

  • Maintain Ownership: Your data remains securely within your Snowflake account, ensuring data integrity and compliance.
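
As a hedged, consumer-side illustration of "Grant Precise Access": depending on how your share is set up, your Snowflake administrator can mount it as a database and decide which internal roles may query it. Every identifier below (the share, database, role, and user names) is a hypothetical placeholder.

```python
# Minimal sketch: consumer-side access control for a mounted data share.
# All identifiers (share, database, role, user) are hypothetical placeholders.
import snowflake.connector

STATEMENTS = [
    # Mount the share as a read-only database (run once, by an admin role).
    "CREATE DATABASE SHIPWELL_SHARE "
    "FROM SHARE SHIPWELL_PROVIDER_ACCOUNT.SHIPWELL_DATA_SHARE",
    # Create a dedicated role for users who should see the shared data.
    "CREATE ROLE IF NOT EXISTS SUPPLY_CHAIN_ANALYST",
    # Imported (shared) databases are granted to roles via IMPORTED PRIVILEGES.
    "GRANT IMPORTED PRIVILEGES ON DATABASE SHIPWELL_SHARE "
    "TO ROLE SUPPLY_CHAIN_ANALYST",
    # Give the role to a specific user.
    "GRANT ROLE SUPPLY_CHAIN_ANALYST TO USER JANE_DOE",
]

conn = snowflake.connector.connect(
    account="your_account_identifier",  # placeholder
    user="your_admin_user",             # placeholder
    password="your_password",           # placeholder
    role="ACCOUNTADMIN",                # mounting a share typically needs an admin role
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```

Note that, in Snowflake, an imported database is granted to consumer roles as a whole via IMPORTED PRIVILEGES; the per-object permissions inside the share itself are managed on the provider side, which is part of what the Shipwell team configures for you.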

Data Latency:

Stay informed with our consistent data refresh cycles:

  • In-platform Analytics: Access data with an average latency of 30 minutes. Reports are typically set to a 30-minute refresh cycle.

  • External Data Shares: Receive daily updates of shared data at 12 AM UTC by default. This refresh cycle can be configured based on your business requirements; a quick freshness check is sketched below.
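
If you want to confirm freshness yourself, a quick check along these lines can help. It assumes the shared tables expose a load-timestamp column, shown here as a hypothetical _loaded_at on a hypothetical fact_shipments table; the real table and column names are in the schema documentation.

```python
# Minimal sketch: checking how fresh the shared data is.
# Assumes a hypothetical load-timestamp column (_loaded_at) on a
# hypothetical fact table; check the schema docs for the real names.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account_identifier",  # placeholder
    user="your_user",                   # placeholder
    password="your_password",           # placeholder
    warehouse="ANALYTICS_WH",           # placeholder
    database="SHIPWELL_SHARE",          # placeholder
    schema="ANALYTICS",                 # placeholder
)
try:
    cur = conn.cursor()
    cur.execute(
        "SELECT MAX(_loaded_at) AS last_load, "
        "       DATEDIFF('minute', MAX(_loaded_at), CURRENT_TIMESTAMP()) AS minutes_behind "
        "FROM fact_shipments"
    )
    last_load, minutes_behind = cur.fetchone()
    print(f"Last load: {last_load} ({minutes_behind} minutes behind)")
finally:
    conn.close()
```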

Easy Implementation:

Getting started is simple:

  1. Discuss your needs: Contact our Customer Success team to discuss your data-sharing requirements and confirm eligibility.

  2. We'll set it up: Our team will configure the data share according to your specifications.

  3. We grant access: We'll grant authorized users access to specific data objects within the share.

  4. We’ll train your data team: We will meet with your data team to review the schema and bring them up to speed.

  5. Start analyzing: You and your collaborators can now access and analyze the shared data; a minimal end-to-end example follows.
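
As a sketch of step 5, the snippet below pulls recent rows from a shared table into a pandas DataFrame for ad-hoc analysis or a notebook workflow. It assumes the share is mounted as a database called SHIPWELL_SHARE and that a fact_shipments table exists (both hypothetical names), and it requires the pandas extra of the connector (pip install "snowflake-connector-python[pandas]").

```python
# Minimal sketch: pulling shared data into pandas for ad-hoc analysis.
# Requires: pip install "snowflake-connector-python[pandas]"
# All identifiers are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account_identifier",  # placeholder
    user="your_user",                   # placeholder
    password="your_password",           # placeholder
    warehouse="ANALYTICS_WH",           # placeholder
    database="SHIPWELL_SHARE",          # placeholder name of the mounted share
    schema="ANALYTICS",                 # placeholder
)
try:
    cur = conn.cursor()
    cur.execute(
        "SELECT * FROM fact_shipments "  # hypothetical fact table
        "WHERE ship_date >= DATEADD('day', -30, CURRENT_DATE())"
    )
    df = cur.fetch_pandas_all()  # fetch the full result set as one DataFrame
    print(df.head())
    print(f"{len(df)} shipments in the last 30 days")
finally:
    conn.close()
```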

Additional Support:

We're here to support you every step of the way:

  • Detailed Documentation: Access our comprehensive documentation for in-depth instructions and FAQs, along with schema diagrams.

  • Dedicated Support: Our experienced support team is available to answer any questions and provide assistance.

Ready to unlock the full potential of your data?

Contact us today to learn how Shipwell's Private Data Sharing can empower you to collaborate more effectively, gain deeper insights, and make data-driven decisions with confidence.

Shipwell’s Modern Data Architecture

At Shipwell, we believe that data is the lifeblood of informed decision-making. That's why we've invested in building a modern data architecture that's designed to deliver accurate, reliable, and actionable insights to our users. This architecture comprises several key components:

  1. Orchestration: We leverage cutting-edge orchestration tools to automate data pipelines and ensure smooth data flow across our entire system. This ensures data is consistently ingested, processed, and transformed, eliminating the risk of errors and delays.

  2. Monitoring: We continuously monitor our data pipelines and infrastructure to detect any anomalies or performance issues. This proactive approach allows us to quickly identify and address potential problems before they impact data quality or availability.

  3. Data Quality Assurance: We place a strong emphasis on data quality, implementing rigorous data QA procedures throughout our data lifecycle. This includes data validation, cleansing, and enrichment, ensuring the data we deliver is accurate, complete, and trustworthy.

  4. ELT (Extract, Load, Transform): We utilize an efficient ELT approach to data processing, extracting data from various sources, loading it into our data warehouse, and then transforming it for analysis. This approach allows us to store raw data for future reference while also providing readily accessible data sets for immediate analysis (a generic sketch of the pattern follows this list).

  5. Data Warehouse: We rely on a robust data warehouse powered by Snowflake, a leading cloud-based data platform. This data warehouse provides a centralized repository for all our data, enabling us to perform complex queries and generate comprehensive reports that provide invaluable insights into our operations.
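
To illustrate the ELT pattern in the abstract (this is a generic sketch with made-up table names, not Shipwell's actual pipeline), raw records are landed in the warehouse first, and the reshaping happens afterwards, inside the warehouse, in SQL:

```python
# Generic ELT illustration, not Shipwell's actual pipeline: raw data is
# loaded first, then transformed inside the warehouse with SQL.
# All identifiers are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account_identifier",  # placeholder
    user="your_user",                   # placeholder
    password="your_password",           # placeholder
    warehouse="LOAD_WH",                # placeholder
    database="RAW",                     # placeholder landing database
    schema="SOURCE_SYSTEM",             # placeholder
)
try:
    cur = conn.cursor()

    # Load: land raw records as-is (shown here with a simple INSERT ... SELECT;
    # real pipelines would typically use COPY INTO from a stage).
    cur.execute(
        "CREATE TABLE IF NOT EXISTS raw_shipments "
        "(id STRING, payload VARIANT, loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP())"
    )
    cur.execute(
        "INSERT INTO raw_shipments (id, payload) "
        "SELECT 'SHIP-001', PARSE_JSON('{\"status\": \"delivered\", \"cost\": 412.50}')"
    )

    # Transform: reshape the raw records into an analysis-ready table,
    # entirely inside the warehouse.
    cur.execute(
        "CREATE OR REPLACE TABLE clean_shipments AS "
        "SELECT id, "
        "       payload:status::STRING AS status, "
        "       payload:cost::NUMBER(12,2) AS total_cost, "
        "       loaded_at "
        "FROM raw_shipments"
    )
finally:
    conn.close()
```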

By combining these key components, Shipwell's modern data architecture ensures that our users have access to the data they need, when they need it, to make informed decisions and optimize their logistics operations.

FAQs

  1. Our in-platform analytics and reporting are updated every 5 minutes to 1 hour, depending on the data source. What is the average update time, and is there a list of reports and their data replication times?

    1. The data latency for in-platform analytics is 30 minutes on average. Analytics reports are refreshed on schedules of daily at 2 AM, daily at 4 AM, or every hour; the vast majority of reports are refreshed every 30 minutes.

  2. For the external data share process, how often is that data replicated and updated?

    1. Currently, this is set to daily at 14:00 UTC, but it can be changed based on the customer’s specification.

  3. If a customer uses Power BI, Tableau, Domo, or any other third-party BI tool, will they need a Snowflake account to access the data?

    1. Yes, the consumer of the private data share will need a Snowflake account and will pay a nominal consumption charge to use the private data share.

  4. Is there a cost associated with setting up and maintaining the data share?

    1. Yes, there is a $1,000/month fee from Shipwell to cover the initial setup, ongoing maintenance, initial training, and consistent updates to the schemas and data share.

  5. If there is a change in the data model and we propagate that change to the data share, will they also have to update it on their end? Will the data share need to be migrated? If so, what is our notification process?

    1. The customer is in full control of when and how frequently they move the data to their account.

    2. There will be periodic model and data changes, given the additional reporting and analytical requirements of the customer.

    3. Every non-breaking change will result in a new version, and we will maintain the older version for 6 months. We will communicate these changes as part of the bi-weekly deploy process.
