go.mod Hackery for Compatibility Testing

- By Aval Kumar

TL;DR

When evolving a Go library, you can ensure backward compatibility by importing two versions of the same module into a single test file. This is achieved by defining a synthetic module path for the older version and using the replace directive in go.mod to point it to a specific version. This allows for direct, in-process testing between versions without publishing temporary tags.


The Challenge of Evolving Go Libraries

Maintaining backward compatibility is a cornerstone of reliable software engineering, especially for shared Go libraries. A seemingly simple change to a serialization format, an encryption algorithm, or a public API can ripple across dozens of services, leading to subtle bugs that only surface in production.

While working on an internal library, our team faced a common but critical requirement: we needed to guarantee that a new version (v1) could seamlessly interoperate with its predecessor (v0). Specifically:

  • Data encrypted by a client using v0 must be decryptable by a server running v1.
  • Data encrypted by a client using v1 must be decryptable by a server still on v0.

The Go module system, however, resolves each module path to a single version in the dependency graph, so it does not natively support importing the same module twice at different versions. This presents a challenge: how can you have both v0 and v1 of your library co-exist in the same test to verify they work together?

The Solution: replace Directive Trick

The solution lies in “tricking” the Go toolchain. We can create a synthetic module path (an alias) for the old version of our library and use a replace directive in go.mod to point that alias at a specific older commit on the main branch.

This allows us to import both the “current” version and the “aliased old” version into the same test file. Note: the synthetic module path (.../v0-compat) is simply redirected to an older commit of the same library.


When to Use This Technique

This method is most powerful for testing regressions in:

  • Data Serialization: Ensuring new code can read data written by old code (JSON, Protobuf, Gob, etc.) and vice-versa.
  • Cryptographic Payloads: Verifying that changes to encryption schemes or key management don’t prevent decryption.
  • Behavioral Logic: Testing stateful interactions where both old and new logic must produce a compatible outcome.

It is less suited to catching public API signature changes (e.g., a function’s parameters changing); those are typically caught by the compiler, and tools like apidiff are better suited for that job.

Step-by-Step Implementation

Let’s assume our library’s module path is github.com/your-org/go-library.

Step 1: Define a Synthetic Path in go.mod

First, we need to tell Go about our aliased module and where it should point. We’ll invent a new module path, like github.com/your-org/go-library/v0-compat, and replace it with a specific commit hash of our library’s v0.

In your go.mod file, add the following lines:

// go.mod
// 1. The `require` directive makes the synthetic path a known dependency.
//    The zero version is just a convention; it does not exist and only acts as
//    a placeholder, since `replace` will override it anyway.
require github.com/your-org/go-library/v0-compat v0.0.0-00010101000000-000000000000

// 2. The `replace` directive tells Go: "When you see an import for the synthetic
//    module path, use this specific commit of the real module instead."
replace github.com/your-org/go-library/v0-compat => github.com/your-org/go-library v0.0.0-20250805083831-60276723e0d1

Why the require with a zero version? The require directive is necessary to make the synthetic module path (.../v0-compat) known to the Go toolchain. Without it, go get or go mod tidy would complain that the module is not used. The zero version (v0.0.0-...) is a placeholder; the replace directive ensures that this version is never actually fetched, redirecting the compiler to the commit hash you specified instead.
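Tip: if you are not sure which pseudo-version string corresponds to the old commit, the Go toolchain can compute it for you (assuming the commit is reachable in your module’s VCS or module proxy):

go list -m github.com/your-org/go-library@60276723e0d1
# prints: github.com/your-org/go-library v0.0.0-20250805083831-60276723e0d1

Paste the printed version into the replace directive above.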

Step 2: Import Both Versions with Aliases

In your compatibility test file, you can now import both versions. Using named imports (aliases) is crucial to avoid naming collisions.

// compat_test.go
package crypt_test

import (
    "context"
    "testing"

    "github.com/stretchr/testify/require"

    // Current version (v1) is imported using the standard module path
    v1_crypt "github.com/your-org/go-library/crypt"

    // Old version (v0) is imported using our synthetic module path
    v0_crypt "github.com/your-org/go-library/v0-compat/crypt"
)

Step 3: Write the Compatibility Test

With both versions imported, you can now write tests that instantiate clients from each version and verify they can communicate.

// compat_test.go
func TestCrossVersionCompatibility(t *testing.T) {
    ctx := context.Background()
    plaintext := []byte("backward compatibility is important")

    // Instantiate encryptors/decryptors from both versions
    v0Encryptor := v0_crypt.NewEncryptor()
    v0Decryptor := v0_crypt.NewDecryptor()

    v1Encryptor := v1_crypt.NewEncryptor()
    v1Decryptor := v1_crypt.NewDecryptor()

    t.Run("v0 encrypts, v1 decrypts", func(t *testing.T) {
        // Encrypt with the old version
        ciphertext, err := v0Encryptor.Encrypt(ctx, plaintext)
        require.NoError(t, err)

        // Decrypt with the new version
        decryptedText, err := v1Decryptor.Decrypt(ctx, ciphertext)
        require.NoError(t, err)

        // Verify the result
        require.Equal(t, plaintext, decryptedText)
    })

    t.Run("v1 encrypts, v0 decrypts", func(t *testing.T) {
        // Encrypt with the new version
        ciphertext, err := v1Encryptor.Encrypt(ctx, plaintext)
        require.NoError(t, err)

        // Decrypt with the old version
        decryptedText, err := v0Decryptor.Decrypt(ctx, ciphertext)
        require.NoError(t, err)

        // Verify the result
        require.Equal(t, plaintext, decryptedText)
    })
}

This test provides high-fidelity assurance that your changes haven’t broken interoperability.

To log which version was used for which operation, you can add a simple fmt.Println to each implementation and run the tests with the -v flag (go test -v):

func (f *Encryptor) Encrypt(
	ctx context.Context,
	clrSrc any, // clear source
	encTgt any, // encrypted target
) error {
	fmt.Println("I'm v1 encrypt")
	// changed encryption logic below
	// ...
}

Alternative Strategies for Compatibility Testing

This replace trick is one of several tools. A robust strategy often combines multiple approaches.

  1. Golden Files

    • How: Store the output (e.g., a serialized JSON file) from the old version of the code in your testdata directory. Your test then runs the new code and compares its output against this “golden” file (a minimal sketch follows this list).
    • Pros: Simple, self-contained, and great for deterministic outputs.
    • Cons: Can be brittle. If the output format needs to change, all golden files must be regenerated. Doesn’t test the old code’s ability to read new data.
  2. Embedded Legacy Code

    • How: Copy the minimal required logic from the old version directly into a test-only package (e.g., testdata/v0_legacy).
    • Pros: No go.mod manipulation needed; tests are fully self-contained.
    • Cons: The copied code can become stale and no longer represent the actual old version. It can also bloat the repository.
  3. API Compatibility Tools

    • How: Use tools like golang.org/x/exp/apidiff or gorelease to statically analyze your public API and report breaking changes (e.g., removed functions, changed method signatures).
    • Pros: Fast, automated, and excellent for catching contract-breaking API changes in CI.
    • Cons: Cannot detect logical or behavioral regressions. It only checks the “shape” of the API, not what it does.
  4. Service-Level Testing

    • How: Run the old and new versions of your code as separate services (e.g., in Docker containers) and test their interactions over the network.
    • Pros: The most realistic form of testing, as it mimics a real production deployment.
    • Cons: Significantly more complex, slower, and resource-intensive to set up and maintain.
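For comparison with the replace trick, here is a minimal sketch of the golden-file approach from option 1. The Record type, the JSON encoding, and the testdata/record.golden path are illustrative assumptions, not part of the library discussed above.

// golden_test.go
package mylib_test

import (
	"encoding/json"
	"os"
	"path/filepath"
	"testing"

	"github.com/stretchr/testify/require"
)

// Record is a stand-in for whatever structure your library serializes.
type Record struct {
	ID   int    `json:"id"`
	Name string `json:"name"`
}

func TestGoldenCompatibility(t *testing.T) {
	// testdata/record.golden was produced by the old (v0) code and committed to the repo.
	golden, err := os.ReadFile(filepath.Join("testdata", "record.golden"))
	require.NoError(t, err)

	// Serialize the same input with the current code and compare against the golden output.
	current, err := json.Marshal(Record{ID: 1, Name: "compat"})
	require.NoError(t, err)
	require.JSONEq(t, string(golden), string(current))
}

The main limitation, as noted above, is that this only checks one direction: the new code reading old output.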

Conclusion

The go.mod replace directive is more than just a tool for forking dependencies; it’s a powerful mechanism for ensuring the graceful evolution of your Go libraries. By enabling direct, in-process compatibility tests, you can validate data formats and complex behaviors with confidence, long before your code reaches production.

When combined with static API analysis and golden file testing, this technique provides a comprehensive strategy for building robust, backward-compatible software that your users can depend on.

About Aval Kumar

Senior Software Engineer at Kablamo