# S3 Cache for DataForSEO Data

SEO Utils can cache DataForSEO API responses in S3-compatible storage, so your teammates can reuse the same data without paying for the same API calls twice.

For example, if you check the backlink data for site A, SEO Utils uploads that data to your S3 bucket. When a teammate later checks backlinks for the same site, SEO Utils pulls the data from S3 instead of calling the DataForSEO API again.
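Conceptually, this is a read-through cache. The sketch below illustrates the pattern only — it is not SEO Utils internals. An in-memory dict stands in for the S3 bucket, and `fetch_backlinks` is a hypothetical stand-in for a paid DataForSEO request:

```python
import hashlib
import json

bucket = {}  # stands in for the shared S3 bucket

def fetch_backlinks(domain):
    # Hypothetical stand-in for a paid DataForSEO API request.
    return {"domain": domain, "backlinks": 1234}

def cache_key(endpoint, params):
    # Derive a stable key from the request so identical lookups collide.
    raw = endpoint + json.dumps(params, sort_keys=True)
    return hashlib.sha256(raw.encode()).hexdigest()

def get_backlinks(domain):
    key = cache_key("backlinks", {"target": domain})
    if key in bucket:               # someone on the team fetched this already?
        return bucket[key]          # read from S3 — no API fee
    data = fetch_backlinks(domain)  # otherwise pay for the API call once
    bucket[key] = data              # and upload the response for the team
    return data
```

The key is derived from the full request, so two teammates asking for the same data hit the same cached object.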

{% hint style="info" %}
S3 cache supports **AWS S3** (Recommended), **Cloudflare R2**, **DigitalOcean Spaces**, **MinIO**, and any other S3-compatible storage service.
{% endhint %}

## Setting Up S3 Cache

{% stepper %}
{% step %}
**Create a Bucket**

{% tabs %}
{% tab title="AWS S3 (Recommended)" %}

1. Log in to [AWS Console](https://console.aws.amazon.com/)
2. Go to **S3** → **Create bucket** (search "S3" in the header search bar if you can't find it)
3. Enter a bucket name (e.g., `my-seo-cache`)
4. Select your preferred region (e.g., `us-east-1`)
5. Keep **Block all public access** enabled
6. Click **Create bucket**

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fgit-blob-f0ce09ed0d723d9eab67cac0569ac2f6cb46d8bb%2Fs3-create-bucket.png?alt=media" alt=""><figcaption><p>Create an S3 bucket on AWS</p></figcaption></figure>
{% endtab %}

{% tab title="Cloudflare R2" %}

1. Go to your Cloudflare dashboard
2. Navigate to **R2** and click **Create Bucket**
3. Enter a bucket name and select a location
{% endtab %}

{% tab title="DigitalOcean Spaces" %}

1. Go to your DigitalOcean dashboard
2. Navigate to **Spaces** and click **Create Space**
3. Select a region and enter a name
{% endtab %}

{% tab title="MinIO" %}
Run `mc mb myminio/my-seo-cache` from the MinIO client.
{% endtab %}
{% endtabs %}

{% hint style="warning" %}
The bucket should be **private**. SEO Utils authenticates using Access Key and Secret Key — no public access is needed.
{% endhint %}
{% endstep %}

{% step %}
**Get Your Access Credentials**

You'll need an **Access Key ID** and **Secret Access Key** with read/write permissions to the bucket.

{% tabs %}
{% tab title="AWS S3 (Recommended)" %}
**Create an IAM Policy**

1. Go to **IAM** in the AWS Console (search "IAM" in the header search bar)

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fgit-blob-17007f6723f915632250a9721178fd166ae5ac14%2Fs3-iam-search.png?alt=media" alt=""><figcaption><p>Search for IAM in the AWS Console</p></figcaption></figure>

2. Go to **Policies** → **Create policy**
3. Select **S3** as the service

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fgit-blob-f95f524529a2aff241b6b34aaaad54245ca813b6%2Fs3-iam-select-service.png?alt=media" alt=""><figcaption><p>Select S3 as the service</p></figcaption></figure>

4. Click the **JSON** tab and paste:

```json
{
   "Version": "2012-10-17",
   "Statement": [
      {
         "Effect": "Allow",
         "Action": [
            "s3:GetObject",
            "s3:PutObject",
            "s3:DeleteObject",
            "s3:ListBucket"
         ],
         "Resource": [
            "arn:aws:s3:::YOUR-BUCKET-NAME",
            "arn:aws:s3:::YOUR-BUCKET-NAME/*"
         ]
      }
   ]
}
```

5. Replace `YOUR-BUCKET-NAME` with your actual bucket name
6. Click **Next** → Name it `SEOUtilsS3Access` → **Create policy**
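If you manage several buckets, the policy document above can also be generated rather than hand-edited. This sketch simply fills the bucket name into the same JSON shown above (no AWS API calls involved; the function name is illustrative):

```python
import json

def seo_utils_policy(bucket_name):
    # Build the same IAM policy document shown above for a given bucket.
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject",
                       "s3:DeleteObject", "s3:ListBucket"],
            # Bucket-level ARN for ListBucket, object-level ARN for the rest.
            "Resource": [f"arn:aws:s3:::{bucket_name}",
                         f"arn:aws:s3:::{bucket_name}/*"],
        }],
    }

print(json.dumps(seo_utils_policy("my-seo-cache"), indent=2))
```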

**Create an IAM User**

1. Go to **IAM** → **Users** → **Create user**
2. Enter a name: `seo-utils-s3` → Click **Next**
3. Select **Attach policies directly**
4. Search for `SEOUtilsS3Access` and check the box
5. Click **Next** → **Create user**
6. Open the user → **Security credentials** tab
7. Click **Create access key** → Select **Third-party service**
8. Click **Create access key** → **Save both keys**

{% hint style="warning" %}
**Save your keys now.** The Secret Access Key is only shown once. Store it in a password manager.
{% endhint %}
{% endtab %}

{% tab title="Cloudflare R2" %}

1. Go to **R2 > Manage R2 API Tokens** in your Cloudflare dashboard
2. Click **Create API Token**
3. Select **Object Read & Write** permission
4. Copy the **Access Key ID** and **Secret Access Key**
{% endtab %}

{% tab title="DigitalOcean Spaces" %}

1. Go to **API > Spaces Keys** in your DigitalOcean dashboard
2. Click **Generate New Key**
3. Copy the **Key** and **Secret**
{% endtab %}
{% endtabs %}
{% endstep %}

{% step %}
**Configure S3 Cache in SEO Utils**

Open SEO Utils and go to **Settings > Services**.

Scroll down to the **S3 Cache Settings** section and fill in:

* **Access Key ID** — Your S3 access key
* **Secret Access Key** — Your S3 secret key
* **Bucket Name** — The name of the bucket you created
* **Region** — The region of your bucket (e.g., `us-east-1`). Leave empty or use `auto` for Cloudflare R2
* **Endpoint URL** — Leave empty for AWS S3. For other providers, enter the endpoint URL (e.g., `https://<account-id>.r2.cloudflarestorage.com` for Cloudflare R2)

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fiz8F0pChq7YZwtW3jlGh%2FXnapper-2026-02-13-17.39.44.png?alt=media&#x26;token=c7f14bc2-c0be-43d6-9fe5-0ea69d48ebc7" alt=""><figcaption><p>S3 Cache Settings in SEO Utils</p></figcaption></figure>
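These fields correspond to a standard S3 client configuration. As a rough sketch of the region/endpoint rules above (the provider names and defaults here are illustrative assumptions, not SEO Utils internals):

```python
def s3_settings(provider, bucket, region=None, account_id=None):
    # Map the form fields above onto each provider's expected values.
    if provider == "aws":
        # AWS S3: set a real region, leave the endpoint URL empty.
        return {"bucket": bucket, "region": region or "us-east-1",
                "endpoint": ""}
    if provider == "r2":
        # Cloudflare R2: region is "auto", endpoint includes your account ID.
        return {"bucket": bucket, "region": "auto",
                "endpoint": f"https://{account_id}.r2.cloudflarestorage.com"}
    raise ValueError(f"unknown provider: {provider}")
```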
{% endstep %}

{% step %}
**Test the Connection**

Click the **"Test Connection"** button to verify that SEO Utils can connect to your bucket. You should see a success message.

{% hint style="info" %}
You don't need to save the settings before testing. The Test Connection button uses the values currently entered in the form.
{% endhint %}
{% endstep %}

{% step %}
**Save Settings**

Click the **"Save"** button to save your S3 cache settings.

From now on, every response SEO Utils pulls from DataForSEO is automatically uploaded to your S3 bucket.
{% endstep %}
{% endstepper %}

## Sharing with Your Team

To share cached data with your teammates:

1. Share the **Access Key ID**, **Secret Access Key**, **Bucket Name**, **Region**, and **Endpoint URL** with your team
2. Each teammate enters the same credentials in their SEO Utils app under **Settings > Services > S3 Cache Settings**

Once configured, everyone on the team will read from and write to the same S3 bucket, avoiding duplicate DataForSEO API requests.

## Automatic Cache Cleanup

SEO Utils automatically deletes cached data older than **7 days**, so cached results never go stale. If you need fresher data sooner, you can:

* Click the **"Purge Cache"** button in S3 Cache Settings to delete all cached data immediately
* Or delete specific files directly from your S3 bucket
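The 7-day rule is a simple age check on each cached object. A minimal sketch of that logic, with an in-memory mapping standing in for S3 object metadata (the function name is hypothetical):

```python
from datetime import datetime, timedelta, timezone

MAX_AGE = timedelta(days=7)

def expired_keys(objects, now=None):
    # `objects` maps cache key -> last-modified timestamp,
    # standing in for S3 object metadata.
    # Return the keys of cached objects older than 7 days.
    now = now or datetime.now(timezone.utc)
    return [key for key, modified in objects.items()
            if now - modified > MAX_AGE]
```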

{% hint style="warning" %}
If you delete the S3 bucket or revoke the access credentials, remove them from the SEO Utils settings to avoid errors.
{% endhint %}

## Migrating from Google Drive Cache

If you were previously using Google Drive to cache DataForSEO data, simply configure your S3 cache settings as described above. SEO Utils will start using S3 for all new cache operations. Your old Google Drive cache data will no longer be used.

{% hint style="info" %}
The Google Drive settings in SEO Utils are now only used for [exporting content outlines to Google Docs](https://help.seoutils.app/content-struct#setup-google-drive-api-integration). If you don't use that feature, you can remove your Google Drive settings.
{% endhint %}
