Bring Your Own Storage: Using S3 or GCS for App Distribution

When you distribute mobile app builds to your testing team, those builds have to live somewhere. For many teams, the default storage provided by their distribution platform is perfectly fine. But for some organizations, "somewhere" needs to be very specific. Healthcare companies bound by HIPAA. European companies subject to GDPR data residency requirements. Enterprises with internal policies that mandate all artifacts stay within company-controlled infrastructure. Government contractors with strict data sovereignty rules.

If your compliance team has ever asked "where are our app binaries stored?" and you could not give a precise answer, this article is for you. We will cover why custom storage matters, how TestApp.io's Bring Your Own Storage feature works, and the practical steps to set it up with Amazon S3, Google Cloud Storage, or Backblaze B2.

Why Custom Storage Matters

App builds are not just code. They contain proprietary business logic, API endpoints, embedded credentials (hopefully not, but often yes), and sometimes sensitive data used for testing. Where these files are stored has real compliance and security implications.

Data Sovereignty and Residency

Data residency laws require that certain data stays within specific geographic boundaries. GDPR, for instance, restricts transfers of EU residents' personal data outside the European Economic Area unless specific safeguards are in place. If your app is built for a European market and your builds are stored in a US data center by default, your compliance team has a legitimate concern.

With custom storage, you control the region. Create an S3 bucket in eu-west-1 or a GCS bucket in europe-west3, and your builds stay where your compliance requirements say they should.

Regulatory Compliance

HIPAA, SOC 2, ISO 27001, FedRAMP: these frameworks all have requirements around data handling, access controls, and audit trails. When your builds live in your own cloud storage, you inherit the compliance controls you have already set up for that cloud account. Your existing encryption-at-rest configuration, access logging, lifecycle policies, and IAM rules all apply automatically.

This is significantly easier than trying to validate that a third-party platform's storage meets all your compliance requirements. Your cloud account is already audited. Your builds are just another set of objects in it.

Company Security Policy

Many organizations have internal security policies that require all production artifacts to reside in company-managed infrastructure, regardless of specific regulatory requirements. This is a reasonable security posture. Fewer third-party storage locations mean a smaller attack surface and simpler access auditing.

Supported Storage Providers

TestApp.io supports three storage providers for Bring Your Own Storage:

Amazon S3

The most widely used object storage service. If your organization is on AWS, this is the natural choice. You get full control over bucket region, encryption, versioning, lifecycle policies, and IAM-based access controls. S3 also supports compliance-relevant features like Object Lock (WORM storage) and detailed access logging via CloudTrail.

Google Cloud Storage

For organizations on Google Cloud Platform, GCS provides equivalent capabilities: regional and multi-regional buckets, customer-managed encryption keys, IAM integration, and audit logging via Cloud Audit Logs. If your CI/CD pipeline already runs on GCP (Cloud Build, for example), keeping your builds in GCS reduces cross-cloud data transfer.

Backblaze B2

A cost-effective alternative for teams that need custom storage but do not require the full feature set of AWS or GCP. Backblaze B2 offers S3-compatible APIs, straightforward pricing, and data center locations in the US and EU. For teams where budget is a consideration and compliance requirements are moderate, B2 is a practical choice.

How It Works: The Architecture

The key concept is straightforward: your app builds are stored in your bucket, while TestApp.io handles distribution.

When a build is uploaded (either manually or through the ta-cli command-line tool from your CI/CD pipeline), the binary goes directly to your configured storage bucket. TestApp.io manages the metadata, distribution links, QR codes, install flow, and access control. Testers still install builds through TestApp.io's interface, mobile app, or shared links. They do not need direct access to your S3 or GCS bucket.

This separation is important. You get the compliance benefits of controlling where data lives, without losing the distribution convenience of a purpose-built platform. Your testers do not need AWS credentials or GCP access. They just tap a link and install.

Setup Guide: Step by Step

Here is the practical walkthrough for each provider. For the most up-to-date instructions and screenshots, check help.testapp.io.

Amazon S3 Setup

  1. Create a dedicated bucket. In the AWS console, create a new S3 bucket for your TestApp.io builds. Choose a region that aligns with your data residency requirements. Use a clear naming convention like yourcompany-testappio-builds.
  2. Configure bucket settings. Enable encryption at rest (SSE-S3 or SSE-KMS, depending on your compliance requirements). Enable versioning if you want to retain previous builds even after deletion. Set lifecycle rules if you want builds to automatically transition to cheaper storage classes or expire after a certain period.
  3. Create IAM credentials. Create a dedicated IAM user or role with permissions scoped to only the TestApp.io bucket. The minimum permissions needed are s3:PutObject, s3:GetObject, s3:DeleteObject, and s3:ListBucket on the specific bucket. Follow the principle of least privilege.
  4. Configure in TestApp.io. In your organization settings, enter the bucket name, region, access key ID, and secret access key.
  5. Validate. TestApp.io will validate the connection by performing a test write and read to your bucket. If validation succeeds, you are ready to go.
  6. Enable. Activate the external storage configuration. New builds will now be stored in your S3 bucket.
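The scoped policy from step 3 can be sketched as an IAM policy document. This is a minimal example, not TestApp.io's official policy; the bucket name yourcompany-testappio-builds is the example naming convention from step 1, so substitute your own. Note that the object actions apply to objects inside the bucket (the /* suffix), while s3:ListBucket applies to the bucket itself.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "TestAppIoObjectAccess",
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::yourcompany-testappio-builds/*"
    },
    {
      "Sid": "TestAppIoListBucket",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::yourcompany-testappio-builds"
    }
  ]
}
```

Attach this as an inline or managed policy on the dedicated IAM user, and nothing else. If validation in step 5 fails with an access error, a missing s3:ListBucket statement on the bucket ARN is a common culprit.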

Google Cloud Storage Setup

  1. Create a dedicated bucket. In the GCP console, create a new Cloud Storage bucket. Choose a location type (region, dual-region, or multi-region) based on your requirements. Regional is usually the right choice for compliance scenarios.
  2. Configure bucket settings. Set the default storage class (Standard for active builds, Nearline or Coldline for archival). Configure encryption using Google-managed keys or customer-managed encryption keys (CMEK) through Cloud KMS.
  3. Create a service account. Create a dedicated service account with the Storage Object Admin role scoped to the specific bucket. Generate a JSON key file for this service account.
  4. Configure in TestApp.io. Enter the bucket name, project ID, and service account credentials in your organization settings.
  5. Validate and enable. Same validation flow as S3: test write, test read, confirm, activate.
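For step 3, the JSON key file that GCP generates for a service account follows a standard shape. The abridged example below uses placeholder values (the service account name testappio-uploader and project ID your-gcp-project are illustrative); the real file also contains additional fields such as auth_uri and client_x509_cert_url. Treat the whole file as a secret.

```json
{
  "type": "service_account",
  "project_id": "your-gcp-project",
  "private_key_id": "PLACEHOLDER",
  "private_key": "-----BEGIN PRIVATE KEY-----\nPLACEHOLDER\n-----END PRIVATE KEY-----\n",
  "client_email": "testappio-uploader@your-gcp-project.iam.gserviceaccount.com",
  "client_id": "PLACEHOLDER",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```

The client_email field is what you grant the bucket-scoped Storage Object Admin role to; keep the key file out of version control and rotate it on the same schedule as your other service credentials.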

Backblaze B2 Setup

  1. Create a dedicated bucket. In the Backblaze console, create a new B2 bucket. Choose your preferred data center location.
  2. Create application keys. Generate a new application key scoped to the specific bucket with read and write permissions.
  3. Configure in TestApp.io. Enter the bucket name, key ID, and application key.
  4. Validate and enable. Same validation flow: test connection, confirm, activate.

Managing Your Storage Configuration

Once configured, TestApp.io provides clear visibility into your external storage status.

Status Indicators

Your storage configuration shows one of three states:

  • Active: External storage is enabled and working. Builds are being stored in your bucket.
  • Disabled: External storage is configured but not active. Your configuration (bucket name, credentials, etc.) is saved, but builds use default storage.
  • Error: There is a problem with the connection. This could be expired credentials, a deleted bucket, or changed permissions. The error state lets you know something needs attention without silently failing.

Enable and Disable Without Losing Configuration

One particularly useful feature: you can disable external storage without losing your configuration. If you need to temporarily switch back to default storage (for troubleshooting, during a credential rotation, or for any other reason), you can disable and re-enable without re-entering all your bucket and credential details.

Edit Settings

You can update your storage configuration at any time. Need to rotate credentials? Update the access key without changing the bucket. Need to switch regions? Update the bucket configuration. Changes take effect for new uploads; existing builds remain where they were stored.

Practical Considerations

Before setting up custom storage, consider these practical points.

Cost

You are responsible for the storage costs in your cloud account. For most teams, this is negligible. A typical mobile app build is 50-200 MB. Even at 10 builds per week, you are adding roughly 0.5-2 GB per week, which costs pennies on any cloud provider. But if you retain builds indefinitely and build frequently, implement lifecycle policies to archive or delete old builds automatically.
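The arithmetic above is easy to adapt to your own numbers. A back-of-envelope sketch, using a 150 MB build (the midpoint of the 50-200 MB range) and an illustrative S3 Standard rate of $0.023 per GB-month (check your provider's current pricing):

```python
# Back-of-envelope storage cost for retained builds.
build_size_gb = 0.15        # ~150 MB per build, midpoint of 50-200 MB
builds_per_week = 10
weeks_retained = 52         # retain one year of builds with no lifecycle expiry

stored_gb = build_size_gb * builds_per_week * weeks_retained
price_per_gb_month = 0.023  # illustrative S3 Standard rate; varies by provider/region

monthly_cost = stored_gb * price_per_gb_month
print(f"{stored_gb:.0f} GB retained -> ${monthly_cost:.2f}/month")
# prints: 78 GB retained -> $1.79/month
```

Even a full year of builds at this cadence costs under two dollars a month, which is why lifecycle policies matter more for tidiness than for budget at typical team scale.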

Credential Management

Treat the credentials you give TestApp.io with the same care as any service credential. Use dedicated IAM users or service accounts with minimum required permissions. Rotate credentials on a regular schedule (quarterly is a reasonable default). Monitor access logs for unexpected activity.

Network Performance

Build uploads go to your storage bucket, so the upload speed is determined by the network path between the uploader and your bucket. If your CI/CD pipeline runs in the same cloud region as your bucket, uploads will be fast. If your developers are uploading manually from a different continent, consider a bucket region that balances compliance requirements with upload performance.

Backup and Disaster Recovery

Your standard cloud backup and DR practices apply. Enable versioning to protect against accidental deletion. Set up cross-region replication if your DR requirements demand it. TestApp.io manages the distribution metadata, but the binaries are in your bucket and subject to your backup policies.
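For S3, a cross-region replication rule is configured as JSON on the source bucket. The sketch below assumes a destination DR bucket and a replication IAM role you have already created (both names and the account ID are placeholders); versioning must be enabled on both buckets for replication to work.

```json
{
  "Role": "arn:aws:iam::123456789012:role/s3-replication-role",
  "Rules": [
    {
      "ID": "ReplicateBuildsToDR",
      "Status": "Enabled",
      "Priority": 1,
      "Filter": {},
      "DeleteMarkerReplication": { "Status": "Disabled" },
      "Destination": { "Bucket": "arn:aws:s3:::yourcompany-testappio-builds-dr" }
    }
  ]
}
```

Because TestApp.io only needs access to the primary bucket, the DR copy can sit behind stricter access controls and a colder storage class.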

Who Is This For?

Bring Your Own Storage is available on the Pro plan. It is designed for teams where one or more of the following is true:

  • You have regulatory requirements that dictate where build artifacts must be stored.
  • Your company security policy requires all data to reside in company-managed infrastructure.
  • You need audit trails and access controls that integrate with your existing cloud IAM setup.
  • You operate in a regulated industry (healthcare, finance, government) where data handling is scrutinized.

If none of these apply and default storage works fine for your team, there is no need to add the complexity of managing your own bucket. But if compliance is a concern, this feature exists so you do not have to choose between meeting your requirements and having a functional distribution workflow.

For more details on the Pro plan and its features, visit testapp.io.

Getting Started

Setting up custom storage takes about 15 minutes if you already have a cloud account:

  1. Create a dedicated bucket in your preferred provider (S3, GCS, or Backblaze B2).
  2. Create scoped credentials with minimum required permissions.
  3. Enter the configuration in TestApp.io's organization settings.
  4. Validate the connection.
  5. Enable external storage.

From that point forward, every build uploaded through TestApp.io, whether manually or through your CI/CD pipeline, lands in your bucket. Your compliance team can point to a specific bucket in a specific region managed by your cloud account. Your distribution workflow stays exactly the same.

That is the point. Compliance should not require sacrificing convenience. Your testers still install via link or QR code. Your CI/CD pipeline still uploads via ta-cli. The only difference is where the bytes land, and now you control that.

Visit help.testapp.io for detailed setup guides with screenshots for each storage provider.