What is the Template Method Pattern?

The Template Method pattern is a behavioral design pattern that defines the skeleton of an algorithm in a base class and lets subclasses override specific steps without changing the algorithm’s structure. Think of it like a recipe - the overall cooking process is the same (prepare ingredients, cook, serve), but the specific steps can vary depending on what you’re making.

In Go, we don’t have traditional inheritance, so we implement this pattern with composition and interfaces, which arguably makes it even more flexible and more idiomatic.
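
Before the full scenario, here is a minimal sketch of the idea using the recipe analogy above (the Recipe interface and CookMeal function are illustrative names, not part of the later example): the template function fixes the order of the steps, and each implementation decides how a step is performed.

// Recipe defines the steps that vary between dishes.
type Recipe interface {
    PrepareIngredients()
    Cook()
    Serve()
}

// CookMeal is the template method: it fixes the order of the steps,
// while each Recipe implementation decides how a step is performed.
func CookMeal(r Recipe) {
    r.PrepareIngredients()
    r.Cook()
    r.Serve()
}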

Let’s start with a scenario: Data Processing Pipeline

Imagine you’re building a data processing system that needs to handle different types of data sources (CSV files, JSON APIs, databases) but follows the same general workflow: connect to source, validate data, transform data, and save results. The overall process is the same, but each step varies depending on the data source.

Without the Template Method Pattern

Here’s how you might handle this without the Template Method pattern:

type CSVProcessor struct{}

func (c *CSVProcessor) ProcessData(source string) error {
    // CSV-specific processing - lots of duplicated workflow logic
    log.Println("Starting CSV processing...")
    
    // Connect
    file, err := os.Open(source)
    if err != nil {
        return err
    }
    defer file.Close()
    
    // Validate
    reader := csv.NewReader(file)
    records, err := reader.ReadAll()
    if err != nil {
        return err
    }
    
    if len(records) == 0 {
        return errors.New("no data found")
    }
    
    // Transform
    var transformedData []map[string]interface{}
    headers := records[0]
    for _, record := range records[1:] {
        row := make(map[string]interface{})
        for i, value := range record {
            if i < len(headers) {
                row[headers[i]] = value
            }
        }
        transformedData = append(transformedData, row)
    }
    
    // Save
    output, err := json.Marshal(transformedData)
    if err != nil {
        return err
    }
    err = os.WriteFile("output.json", output, 0644)
    
    log.Println("CSV processing completed")
    return err
}

type JSONProcessor struct{}

func (j *JSONProcessor) ProcessData(source string) error {
    // JSON-specific processing - similar workflow but different implementation
    log.Println("Starting JSON processing...")
    
    // Connect
    resp, err := http.Get(source)
    if err != nil {
        return err
    }
    defer resp.Body.Close()
    
    // Validate
    if resp.StatusCode != 200 {
        return fmt.Errorf("HTTP error: %d", resp.StatusCode)
    }
    
    var data []map[string]interface{}
    err = json.NewDecoder(resp.Body).Decode(&data)
    if err != nil {
        return err
    }
    
    if len(data) == 0 {
        return errors.New("no data found")
    }
    
    // Transform (different transformation logic)
    for _, item := range data {
        // JSON-specific transformations
        if timestamp, ok := item["timestamp"]; ok {
            item["processed_at"] = timestamp
        }
    }
    
    // Save
    output, err := json.Marshal(data)
    if err != nil {
        return err
    }
    err = os.WriteFile("output.json", output, 0644)
    
    log.Println("JSON processing completed")
    return err
}

// This approach has duplicated workflow logic and is hard to maintain

With the Template Method Pattern

Let’s refactor this using the Template Method pattern:

package main

import (
    "encoding/csv"
    "encoding/json"
    "fmt"
    "io/ioutil"
    "log"
    "net/http"
    "os"
    "strconv"
    "strings"
    "time"
)

// Step interfaces - define the contract for each step
type DataConnector interface {
    Connect(source string) (interface{}, error)
    Disconnect(connection interface{}) error
}

type DataValidator interface {
    Validate(data interface{}) error
}

type DataTransformer interface {
    Transform(data interface{}) ([]map[string]interface{}, error)
}

type DataSaver interface {
    Save(data []map[string]interface{}, destination string) error
}

// Template method interface
type DataProcessor interface {
    DataConnector
    DataValidator
    DataTransformer
    DataSaver
    ProcessData(source, destination string) error
}

// Base processor with template method
type BaseDataProcessor struct {
    processor DataProcessor
}

func NewBaseDataProcessor(processor DataProcessor) *BaseDataProcessor {
    return &BaseDataProcessor{
        processor: processor,
    }
}

// Template method - defines the algorithm skeleton
func (b *BaseDataProcessor) ProcessData(source, destination string) error {
    log.Printf("Starting data processing pipeline for: %s", source)
    startTime := time.Now()
    
    // Step 1: Connect to data source
    log.Println("Step 1: Connecting to data source...")
    connection, err := b.processor.Connect(source)
    if err != nil {
        return fmt.Errorf("connection failed: %w", err)
    }
    defer b.processor.Disconnect(connection)
    
    // Step 2: Validate data
    log.Println("Step 2: Validating data...")
    err = b.processor.Validate(connection)
    if err != nil {
        return fmt.Errorf("validation failed: %w", err)
    }
    
    // Step 3: Transform data
    log.Println("Step 3: Transforming data...")
    transformedData, err := b.processor.Transform(connection)
    if err != nil {
        return fmt.Errorf("transformation failed: %w", err)
    }
    
    // Step 4: Save data
    log.Println("Step 4: Saving data...")
    err = b.processor.Save(transformedData, destination)
    if err != nil {
        return fmt.Errorf("save failed: %w", err)
    }
    
    duration := time.Since(startTime)
    log.Printf("Data processing completed successfully in %v", duration)
    return nil
}

// Concrete implementation for CSV processing
type CSVDataProcessor struct {
    *BaseDataProcessor
}

func NewCSVDataProcessor() *CSVDataProcessor {
    processor := &CSVDataProcessor{}
    processor.BaseDataProcessor = NewBaseDataProcessor(processor)
    return processor
}

func (c *CSVDataProcessor) Connect(source string) (interface{}, error) {
    log.Printf("Opening CSV file: %s", source)
    file, err := os.Open(source)
    if err != nil {
        return nil, err
    }
    
    reader := csv.NewReader(file)
    records, err := reader.ReadAll()
    if err != nil {
        file.Close()
        return nil, err
    }
    
    return &CSVConnection{file: file, records: records}, nil
}

func (c *CSVDataProcessor) Disconnect(connection interface{}) error {
    if csvConn, ok := connection.(*CSVConnection); ok {
        log.Println("Closing CSV file")
        return csvConn.file.Close()
    }
    return nil
}

func (c *CSVDataProcessor) Validate(data interface{}) error {
    csvConn := data.(*CSVConnection)
    
    if len(csvConn.records) == 0 {
        return fmt.Errorf("CSV file is empty")
    }
    
    if len(csvConn.records) < 2 {
        return fmt.Errorf("CSV file must have at least a header and one data row")
    }
    
    log.Printf("CSV validation passed: %d rows found", len(csvConn.records))
    return nil
}

func (c *CSVDataProcessor) Transform(data interface{}) ([]map[string]interface{}, error) {
    csvConn := data.(*CSVConnection)
    records := csvConn.records
    
    var result []map[string]interface{}
    headers := records[0]
    
    for i, record := range records[1:] {
        row := make(map[string]interface{})
        
        for j, value := range record {
            if j < len(headers) {
                header := strings.TrimSpace(headers[j])
                
                // Try to convert to appropriate types
                if intVal, err := strconv.Atoi(value); err == nil {
                    row[header] = intVal
                } else if floatVal, err := strconv.ParseFloat(value, 64); err == nil {
                    row[header] = floatVal
                } else if boolVal, err := strconv.ParseBool(value); err == nil {
                    row[header] = boolVal
                } else {
                    row[header] = strings.TrimSpace(value)
                }
            }
        }
        
        // Add metadata
        row["_row_number"] = i + 2 // +2 because we skip header and arrays are 0-indexed
        row["_processed_at"] = time.Now().Format(time.RFC3339)
        
        result = append(result, row)
    }
    
    log.Printf("CSV transformation completed: %d records processed", len(result))
    return result, nil
}

func (c *CSVDataProcessor) Save(data []map[string]interface{}, destination string) error {
    output, err := json.MarshalIndent(data, "", "  ")
    if err != nil {
        return err
    }
    
    err = os.WriteFile(destination, output, 0644)
    if err != nil {
        return err
    }
    
    log.Printf("CSV data saved to: %s (%d bytes)", destination, len(output))
    return nil
}

// Helper struct for CSV connection
type CSVConnection struct {
    file    *os.File
    records [][]string
}

// Concrete implementation for JSON API processing
type JSONAPIProcessor struct {
    *BaseDataProcessor
    client *http.Client
}

func NewJSONAPIProcessor() *JSONAPIProcessor {
    processor := &JSONAPIProcessor{
        client: &http.Client{Timeout: 30 * time.Second},
    }
    processor.BaseDataProcessor = NewBaseDataProcessor(processor)
    return processor
}

func (j *JSONAPIProcessor) Connect(source string) (interface{}, error) {
    log.Printf("Fetching data from API: %s", source)
    
    resp, err := j.client.Get(source)
    if err != nil {
        return nil, err
    }
    
    return &APIConnection{response: resp}, nil
}

func (j *JSONAPIProcessor) Disconnect(connection interface{}) error {
    if apiConn, ok := connection.(*APIConnection); ok {
        log.Println("Closing API connection")
        return apiConn.response.Body.Close()
    }
    return nil
}

func (j *JSONAPIProcessor) Validate(data interface{}) error {
    apiConn := data.(*APIConnection)
    
    if apiConn.response.StatusCode != http.StatusOK {
        return fmt.Errorf("API returned status code: %d", apiConn.response.StatusCode)
    }
    
    contentType := apiConn.response.Header.Get("Content-Type")
    if !strings.Contains(contentType, "application/json") {
        return fmt.Errorf("expected JSON content type, got: %s", contentType)
    }
    
    log.Printf("API validation passed: status %d", apiConn.response.StatusCode)
    return nil
}

func (j *JSONAPIProcessor) Transform(data interface{}) ([]map[string]interface{}, error) {
    apiConn := data.(*APIConnection)
    
    var rawData []map[string]interface{}
    err := json.NewDecoder(apiConn.response.Body).Decode(&rawData)
    if err != nil {
        return nil, err
    }
    
    // Transform the data
    var result []map[string]interface{}
    for i, item := range rawData {
        transformedItem := make(map[string]interface{})
        
        // Copy original data
        for k, v := range item {
            transformedItem[k] = v
        }
        
        // Add API-specific transformations
        transformedItem["_api_source"] = true
        transformedItem["_record_index"] = i
        transformedItem["_fetched_at"] = time.Now().Format(time.RFC3339)
        
        // Normalize timestamp fields
        if timestamp, ok := item["timestamp"]; ok {
            transformedItem["normalized_timestamp"] = timestamp
        }
        if createdAt, ok := item["created_at"]; ok {
            transformedItem["normalized_timestamp"] = createdAt
        }
        
        result = append(result, transformedItem)
    }
    
    log.Printf("API transformation completed: %d records processed", len(result))
    return result, nil
}

func (j *JSONAPIProcessor) Save(data []map[string]interface{}, destination string) error {
    // Create a wrapper with metadata
    output := map[string]interface{}{
        "metadata": map[string]interface{}{
            "source":      "json_api",
            "record_count": len(data),
            "processed_at": time.Now().Format(time.RFC3339),
        },
        "data": data,
    }
    
    jsonOutput, err := json.MarshalIndent(output, "", "  ")
    if err != nil {
        return err
    }
    
    err = os.WriteFile(destination, jsonOutput, 0644)
    if err != nil {
        return err
    }
    
    log.Printf("API data saved to: %s (%d bytes)", destination, len(jsonOutput))
    return nil
}

// Helper struct for API connection
type APIConnection struct {
    response *http.Response
}

// Database processor example
type DatabaseProcessor struct {
    *BaseDataProcessor
    connectionString string
}

func NewDatabaseProcessor(connectionString string) *DatabaseProcessor {
    processor := &DatabaseProcessor{
        connectionString: connectionString,
    }
    processor.BaseDataProcessor = NewBaseDataProcessor(processor)
    return processor
}

func (d *DatabaseProcessor) Connect(source string) (interface{}, error) {
    log.Printf("Connecting to database: %s", d.connectionString)
    // Simulate database connection
    return &DBConnection{query: source, connected: true}, nil
}

func (d *DatabaseProcessor) Disconnect(connection interface{}) error {
    if dbConn, ok := connection.(*DBConnection); ok {
        log.Println("Closing database connection")
        dbConn.connected = false
    }
    return nil
}

func (d *DatabaseProcessor) Validate(data interface{}) error {
    dbConn := data.(*DBConnection)
    if !dbConn.connected {
        return fmt.Errorf("database connection is not active")
    }
    
    if strings.TrimSpace(dbConn.query) == "" {
        return fmt.Errorf("query cannot be empty")
    }
    
    log.Println("Database validation passed")
    return nil
}

func (d *DatabaseProcessor) Transform(data interface{}) ([]map[string]interface{}, error) {
    dbConn := data.(*DBConnection)
    
    // Simulate database query execution
    log.Printf("Executing query: %s", dbConn.query)
    
    // Mock database results
    result := []map[string]interface{}{
        {
            "id":         1,
            "name":       "John Doe",
            "email":      "[email protected]",
            "created_at": time.Now().Add(-24 * time.Hour).Format(time.RFC3339),
        },
        {
            "id":         2,
            "name":       "Jane Smith",
            "email":      "[email protected]",
            "created_at": time.Now().Add(-12 * time.Hour).Format(time.RFC3339),
        },
    }
    
    // Add database-specific metadata
    for i, record := range result {
        record["_db_source"] = true
        record["_query"] = dbConn.query
        record["_result_index"] = i
        record["_extracted_at"] = time.Now().Format(time.RFC3339)
    }
    
    log.Printf("Database transformation completed: %d records processed", len(result))
    return result, nil
}

func (d *DatabaseProcessor) Save(data []map[string]interface{}, destination string) error {
    // Database-specific save format
    output := map[string]interface{}{
        "metadata": map[string]interface{}{
            "source":           "database",
            "connection_string": d.connectionString,
            "record_count":     len(data),
            "exported_at":      time.Now().Format(time.RFC3339),
        },
        "records": data,
    }
    
    jsonOutput, err := json.MarshalIndent(output, "", "  ")
    if err != nil {
        return err
    }
    
    err = os.WriteFile(destination, jsonOutput, 0644)
    if err != nil {
        return err
    }
    
    log.Printf("Database data saved to: %s (%d bytes)", destination, len(jsonOutput))
    return nil
}

// Helper struct for database connection
type DBConnection struct {
    query     string
    connected bool
}

func main() {
    fmt.Println("=== Data Processing with Template Method Pattern ===")
    
    // Create sample CSV file for testing
    csvData := `name,age,email,salary
John Doe,30,john@example.com,50000
Jane Smith,25,jane@example.com,60000
Bob Johnson,35,bob@example.com,55000`
    
    err := ioutil.WriteFile("sample.csv", []byte(csvData), 0644)
    if err != nil {
        log.Fatalf("Failed to create sample CSV: %v", err)
    }
    
    // Process CSV data
    fmt.Println("\n--- Processing CSV Data ---")
    csvProcessor := NewCSVDataProcessor()
    err = csvProcessor.ProcessData("sample.csv", "csv_output.json")
    if err != nil {
        log.Printf("CSV processing failed: %v", err)
    }
    
    // Process JSON API data (using a mock API)
    fmt.Println("\n--- Processing JSON API Data ---")
    apiProcessor := NewJSONAPIProcessor()
    // Note: this makes a real HTTP request, so it needs network access and may fail offline
    err = apiProcessor.ProcessData("https://jsonplaceholder.typicode.com/users", "api_output.json")
    if err != nil {
        log.Printf("API processing failed (expected): %v", err)
    }
    
    // Process Database data
    fmt.Println("\n--- Processing Database Data ---")
    dbProcessor := NewDatabaseProcessor("postgres://localhost:5432/mydb")
    err = dbProcessor.ProcessData("SELECT * FROM users", "db_output.json")
    if err != nil {
        log.Printf("Database processing failed: %v", err)
    }
    
    // Cleanup
    os.Remove("sample.csv")
    
    fmt.Println("\n=== Template Method Pattern Demo Completed ===")
}
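
One nice side effect of splitting the workflow into interface methods is that each step can be unit tested in isolation. A rough sketch of such a test, assuming it lives in a _test.go file in the same package as the code above:

func TestCSVTransform(t *testing.T) {
    processor := NewCSVDataProcessor()

    // Build a CSVConnection directly instead of opening a real file.
    conn := &CSVConnection{
        records: [][]string{
            {"name", "age"},
            {"John Doe", "30"},
        },
    }

    rows, err := processor.Transform(conn)
    if err != nil {
        t.Fatalf("Transform returned an error: %v", err)
    }
    if len(rows) != 1 {
        t.Fatalf("expected 1 row, got %d", len(rows))
    }
    // "30" should have been converted to an int by the type detection logic.
    if rows[0]["age"] != 30 {
        t.Errorf("expected age 30 as int, got %v", rows[0]["age"])
    }
}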

Advanced Template Method with Hooks

You can add optional hooks for even more flexibility:

type ProcessorWithHooks interface {
    DataProcessor
    BeforeConnect(source string) error
    AfterConnect(connection interface{}) error
    BeforeValidate(data interface{}) error
    AfterValidate(data interface{}) error
    BeforeTransform(data interface{}) error
    AfterTransform(data []map[string]interface{}) error
    BeforeSave(data []map[string]interface{}, destination string) error
    AfterSave(destination string) error
}

type BaseProcessorWithHooks struct {
    processor ProcessorWithHooks
}

func (b *BaseProcessorWithHooks) ProcessData(source, destination string) error {
    // Before connect hook
    if err := b.processor.BeforeConnect(source); err != nil {
        return err
    }
    
    connection, err := b.processor.Connect(source)
    if err != nil {
        return err
    }
    defer b.processor.Disconnect(connection)
    
    // After connect hook
    if err := b.processor.AfterConnect(connection); err != nil {
        return err
    }
    
    // Continue with other steps and hooks...
    return nil
}
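
To keep the hook interface from forcing every processor to implement all eight methods, one common approach is to provide a default type with no-op hooks that concrete processors embed, overriding only what they need. A sketch of that idea (NoOpHooks and AuditingHooks are illustrative names, not part of the code above):

// NoOpHooks provides empty implementations of every hook, so a concrete
// processor only has to override the hooks it actually cares about.
type NoOpHooks struct{}

func (NoOpHooks) BeforeConnect(source string) error         { return nil }
func (NoOpHooks) AfterConnect(connection interface{}) error { return nil }
func (NoOpHooks) BeforeValidate(data interface{}) error     { return nil }
func (NoOpHooks) AfterValidate(data interface{}) error      { return nil }
func (NoOpHooks) BeforeTransform(data interface{}) error    { return nil }
func (NoOpHooks) AfterTransform(data []map[string]interface{}) error { return nil }
func (NoOpHooks) BeforeSave(data []map[string]interface{}, destination string) error {
    return nil
}
func (NoOpHooks) AfterSave(destination string) error { return nil }

// A processor that only wants post-save auditing embeds NoOpHooks and
// overrides just AfterSave; the remaining hooks fall through to the no-ops.
type AuditingHooks struct {
    NoOpHooks
}

func (AuditingHooks) AfterSave(destination string) error {
    log.Printf("audit: wrote %s", destination)
    return nil
}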

Real-world Use Cases

Here’s where I commonly use the Template Method pattern in Go:

  1. Data Processing Pipelines: ETL operations with different data sources
  2. HTTP Request Handlers: Common request processing with custom business logic (see the sketch after this list)
  3. File Processing: Different file formats with common processing steps
  4. Testing Frameworks: Test setup, execution, and teardown with custom test logic
  5. Authentication Systems: Common auth flow with different providers
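
As an example of use case 2, a shared handler template can fix the timing, error handling, and response encoding while each endpoint supplies only its business logic. This is an illustrative sketch, not part of the pipeline code above; it assumes the standard net/http, encoding/json, log, and time packages:

// EndpointLogic is the step that varies per endpoint.
type EndpointLogic interface {
    Handle(r *http.Request) (interface{}, error)
}

// JSONHandler is the template: timing, error handling, and JSON encoding
// stay fixed, while the business logic is supplied by each endpoint.
func JSONHandler(logic EndpointLogic) http.HandlerFunc {
    return func(w http.ResponseWriter, r *http.Request) {
        start := time.Now()

        result, err := logic.Handle(r)
        if err != nil {
            http.Error(w, err.Error(), http.StatusInternalServerError)
            return
        }

        w.Header().Set("Content-Type", "application/json")
        if err := json.NewEncoder(w).Encode(result); err != nil {
            log.Printf("encoding response failed: %v", err)
        }
        log.Printf("%s %s handled in %v", r.Method, r.URL.Path, time.Since(start))
    }
}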

Benefits of Template Method Pattern

  1. Code Reuse: Common algorithm structure is reused across implementations
  2. Consistency: Ensures all implementations follow the same workflow
  3. Flexibility: Allows customization of specific steps without changing the overall flow
  4. Maintainability: Changes to the algorithm structure only need to be made in one place
  5. Extensibility: Easy to add new implementations by implementing the required interfaces

Caveats

While the Template Method pattern is useful, consider these limitations:

  1. Interface Complexity: Can lead to large interfaces with many methods (a function-based variant, sketched after this list, can help)
  2. Rigid Structure: The algorithm structure is fixed and can be hard to change
  3. Debugging Complexity: Flow control across multiple objects can be hard to trace
  4. Over-engineering: May be overkill for simple scenarios
  5. Go-specific: Go lacks traditional inheritance, so the pattern has to be expressed through composition and interfaces
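
Regarding caveat 1, a lighter-weight variant in Go is to pass the varying steps as plain functions instead of defining one large interface. This standalone sketch (PipelineSteps and RunPipeline are illustrative names) mirrors the template above but skips the connect/disconnect bookkeeping:

// PipelineSteps carries the varying steps as plain functions.
type PipelineSteps struct {
    Connect   func(source string) (interface{}, error)
    Validate  func(data interface{}) error
    Transform func(data interface{}) ([]map[string]interface{}, error)
    Save      func(data []map[string]interface{}, destination string) error
}

// RunPipeline is the template method over the function-based steps.
func RunPipeline(steps PipelineSteps, source, destination string) error {
    conn, err := steps.Connect(source)
    if err != nil {
        return fmt.Errorf("connection failed: %w", err)
    }
    if err := steps.Validate(conn); err != nil {
        return fmt.Errorf("validation failed: %w", err)
    }
    rows, err := steps.Transform(conn)
    if err != nil {
        return fmt.Errorf("transformation failed: %w", err)
    }
    if err := steps.Save(rows, destination); err != nil {
        return fmt.Errorf("save failed: %w", err)
    }
    return nil
}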

Thank you

Thank you for reading! The Template Method pattern is excellent for defining consistent workflows while allowing customization of specific steps. In Go, using interfaces and composition makes this pattern even more flexible and testable than traditional inheritance-based implementations. Please drop an email at [email protected] if you would like to share any feedback or suggestions. Peace!