DEV Community

Ge Ji

Dart Lesson 23: Getting Started with Unit Testing — Ensuring Code Quality

In the previous lessons, we explored the fundamentals of performance optimization and learned techniques to improve program efficiency at the code level. Today, we'll focus on unit testing — a critical practice for ensuring code quality that helps us catch issues early in development, reduce production failures, and make code more maintainable and refactor-friendly.

I. Unit Testing Basics: Why Do We Need Unit Tests?

Unit testing involves testing the smallest testable units of software (typically functions, methods, or classes). Its core values include:

  1. Early issue detection: Catching logical errors during development before they reach production.
  2. Safe refactoring: Quickly verifying that functionality remains intact when modifying or refactoring code.
  3. Documentation value: Test cases serve as living documentation, clearly showing expected function behavior.
  4. Promoting modular design: Writing testable code encourages creating loosely coupled, single-responsibility modules.

In the Dart ecosystem, the officially recommended unit testing framework is the test package, which provides a concise API and rich assertion methods to simplify test writing.


II. Test Framework (test package) Installation and Configuration

1. Create a New Project (if you don't have an existing one)

First, we need a Dart project (Flutter projects work too, with the same configuration):

# Create a basic Dart console project
dart create -t console-simple my_test_project
cd my_test_project

2. Add the test dependency

Add test as a development dependency in pubspec.yaml:

name: my_test_project
description: A simple console application.

environment:
  sdk: '>=3.0.0 <4.0.0'

# Production dependencies (required for runtime)
dependencies:

# Development dependencies (only needed for development and testing)
dev_dependencies:
  test: ^1.25.15  # Add the test package

Save the file and install dependencies with:

dart pub get

3. Project Structure Preparation

Following Dart community conventions, we typically place test code in a test folder at the project root, corresponding to the lib folder:

my_test_project/
├── lib/
│   └── math_utils.dart  # Code to be tested
├── test/
│   └── math_utils_test.dart  # Test code
└── pubspec.yaml

Let's first create some utility functions to test in lib/math_utils.dart:

// lib/math_utils.dart
/// Addition function
int add(int a, int b) {
  return a + b;
}

/// Subtraction function
int subtract(int a, int b) {
  return a - b;
}

/// Multiplication function
int multiply(int a, int b) {
  return a * b;
}

/// Division function (integer division only, returns null if divisor is 0)
int? divide(int a, int b) {
  if (b == 0) return null;
  return a ~/ b;
}

III. Writing Your First Test Case: test() and expect()

1. Basic Structure of a Test File

Create test code in test/math_utils_test.dart with this basic structure:

// test/math_utils_test.dart
// Import the test framework
import 'package:test/test.dart';
// Import the code to be tested
import 'package:my_test_project/math_utils.dart';

void main() {
  // Test group: typically named after the module being tested
  group('MathUtils', () {
    // Test case: testing the add function
    test('add should return sum of two numbers', () {
      // Test logic
    });
  });
}
  • group(): Used to group related test cases. The first parameter is the group name, and the second is a function containing test cases.
  • test(): Defines a single test case. The first parameter is a description (clearly stating what's being tested), and the second is the test logic function.
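Alongside group() and test(), the test package provides setUp() and tearDown() hooks that run before and after each test in the enclosing group — handy for shared fixtures. A minimal sketch using a plain list as the fixture:

```dart
import 'package:test/test.dart';

void main() {
  group('List fixture', () {
    late List<int> numbers; // Fresh fixture for every test

    // Runs before each test in this group
    setUp(() {
      numbers = [1, 2, 3];
    });

    // Runs after each test in this group
    tearDown(() {
      numbers.clear();
    });

    test('fixture starts with three elements', () {
      expect(numbers.length, 3);
    });

    test('modifications do not leak between tests', () {
      numbers.add(4);
      expect(numbers, [1, 2, 3, 4]);
    });
  });
}
```

Because setUp() re-creates the list before every test, the second test sees a clean [1, 2, 3] regardless of what earlier tests did.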

2. Using expect() for Assertions

expect(actual, matcher) is the core function in testing, used to verify that actual results match expectations:

  • actual: The actual result (e.g., function return value).
  • matcher: The expected matcher (can be a specific value, type, or special matching rule).

Let's write test cases for each function in math_utils.dart:

// test/math_utils_test.dart
import 'package:test/test.dart';
import 'package:my_test_project/math_utils.dart';

void main() {
  group('MathUtils', () {
    // Testing add function
    test('add(2, 3) should return 5', () {
      final result = add(2, 3);
      expect(result, 5); // Verify result is 5
    });

    test('add(-1, 1) should return 0', () {
      expect(add(-1, 1), 0); // Concise version
    });

    // Testing subtract function
    test('subtract(5, 3) should return 2', () {
      expect(subtract(5, 3), 2);
    });

    test('subtract(3, 5) should return -2', () {
      expect(subtract(3, 5), -2);
    });

    // Testing multiply function
    test('multiply(4, 5) should return 20', () {
      expect(multiply(4, 5), 20);
    });

    test('multiply(0, 10) should return 0', () {
      expect(multiply(0, 10), 0);
    });

    // Testing divide function
    group('divide', () {
      test('divide(10, 2) should return 5', () {
        expect(divide(10, 2), 5);
      });

      test('divide(7, 3) should return 2 (integer division)', () {
        expect(divide(7, 3), 2);
      });

      test('divide(5, 0) should return null', () {
        expect(divide(5, 0), null);
      });
    });
  });
}

3. Running Tests

Execute this command from the project root to run all tests:

dart test
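You can also run a single test file, or filter tests by description, using options that dart test supports:

```shell
# Run only one test file
dart test test/math_utils_test.dart

# Run only tests whose description matches a pattern
dart test --name "divide"
```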

If all tests pass, you'll see output similar to:

00:00 +9: All tests passed!

If a test fails (for example, if we intentionally modify the add function to return a - b), you'll see detailed error information:

00:00 +0 -1: MathUtils add(2, 3) should return 5 [E]
  Expected: 5
    Actual: -1
  package:test_api/src/expect/expect.dart 135:31  expect
  test/math_utils_test.dart 8:7                   main.<fn>.<fn>
00:00 +0 -1: Some tests failed.

4. Common Matchers

Beyond direct value comparison, the test package provides rich matchers for various testing scenarios:

  • equals(value): Checks for equality (deep comparison, good for collections and objects). Example: expect([1, 2], equals([1, 2]))
  • isA<Type>(): Checks the type. Example: expect(5, isA<int>())
  • isNull / isNotNull: Checks for null. Example: expect(divide(5, 0), isNull)
  • greaterThan(value) / lessThan(value): Compares magnitudes. Example: expect(5, greaterThan(3))
  • contains(value): Checks whether a collection contains an element. Example: expect([1, 2, 3], contains(2))
  • throwsA(matcher): Checks that the specified exception is thrown. Example: expect(() => int.parse('abc'), throwsA(isA<FormatException>()))
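A few of these matchers in action, exercised against the math_utils functions defined above:

```dart
import 'package:test/test.dart';
import 'package:my_test_project/math_utils.dart';

void main() {
  test('matcher sampler', () {
    expect(add(2, 3), isA<int>());                   // Type check
    expect(add(2, 3), greaterThan(4));               // Magnitude comparison
    expect([add(1, 1), add(2, 2)], equals([2, 4]));  // Deep equality on a list
    expect([1, 2, 3], contains(multiply(1, 2)));     // Collection membership
    expect(divide(5, 0), isNull);                    // Null check
    expect(() => int.parse('abc'),
        throwsA(isA<FormatException>()));            // Exception check
  });
}
```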

Example: Enhancing tests with special matchers. Note that the divide function defined earlier does not throw for negative divisors, so this test assumes a variant of divide that throws an ArgumentError (with a message containing 'negative') when b is negative:

test('divide should throw ArgumentError when b is negative', () {
  // Testing whether an exception is thrown for a negative divisor
  expect(
    () => divide(10, -2),
    throwsA(isA<ArgumentError>().having(
      (e) => e.message,
      'message',
      contains('negative'),
    )),
  );
});

IV. Viewing Test Coverage: Ensuring Critical Code is Tested

Test coverage is a metric measuring test completeness, representing the proportion of code covered by tests. High coverage doesn't guarantee bug-free code, but low coverage usually indicates untested risk areas.

1. Installing Coverage Tools

To view test coverage, install the coverage package:

dart pub global activate coverage

Make sure ~/.pub-cache/bin is on your PATH (otherwise the format_coverage command won't be available directly).
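On macOS or Linux, that typically means appending a line like this to your shell profile (~/.bashrc, ~/.zshrc, etc.):

```shell
# Make globally activated pub executables (like format_coverage) available
export PATH="$PATH":"$HOME/.pub-cache/bin"
```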

2. Generating Coverage Data

Run these commands to execute tests and generate coverage data:

# Run tests and collect coverage data
dart test --coverage=coverage

# Convert coverage data to LCOV format (universal coverage report format)
format_coverage --lcov --in=coverage --out=coverage/lcov.info --packages=.dart_tool/package_config.json --report-on=lib

A lcov.info file will be generated in the coverage folder upon success.

3. Viewing Coverage Reports

There are several ways to view LCOV format reports:

  • Using VS Code Extension: Install the Coverage Gutters extension, right-click the lcov.info file, and select Watch to visually see covered (green) and uncovered (red) code.
  • Generating HTML Reports: Use the genhtml tool (requires lcov package installation):
# Ubuntu/Debian
sudo apt-get install lcov

# MacOS (with Homebrew)
brew install lcov

# Generate HTML report
genhtml coverage/lcov.info -o coverage/html

# Open the report in browser
open coverage/html/index.html  # MacOS
xdg-open coverage/html/index.html  # Linux

The HTML report shows coverage percentages for each file and specifically which lines aren't tested.

4. Key Code Testing Principles

While high coverage is a goal, it's more important to cover critical code and scenarios. Here are some practical principles:

  1. Cover core business logic: Prioritize testing functions handling data and business rules — errors here have the biggest impact.
  2. Cover boundary conditions:
    • Numeric types: maximum values, minimum values, zero, negative numbers
    • Strings: empty strings, special characters, very long strings
    • Collections: empty collections, single-element collections, large collections
  3. Cover exception scenarios:
    • Invalid inputs (e.g., division by zero, null parameters)
    • Error conditions like network failures or missing files
  4. Avoid over-testing:
    • No need to test simple getters/setters (unless they contain special logic)
    • No need to test dependent third-party libraries (assuming they're well-tested)
    • No need to test obviously error-free code just to reach 100% coverage

Example: Adding boundary tests for the divide function

test('divide with boundary integer values', () {
  const maxInt = 9223372036854775807; // Largest 64-bit int value
  expect(divide(maxInt, 1), maxInt);
  expect(divide(0, maxInt), 0);
});

V. Advanced Testing Techniques

1. Testing Asynchronous Code

Many operations in Dart are asynchronous (e.g., file I/O, network requests). Testing asynchronous code requires async/await:

// Asynchronous function to test (lib/data_fetcher.dart)
Future<int> fetchData() async {
  // Simulate network request
  await Future.delayed(Duration(milliseconds: 100));
  return 42;
}

// Test code (test/data_fetcher_test.dart)
import 'package:test/test.dart';
import 'package:my_test_project/data_fetcher.dart';

void main() {
  test('fetchData should return 42', () async {
    // Use async/await for asynchronous operations
    final result = await fetchData();
    expect(result, 42);
  });
}
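The test package can also verify Streams using asynchronous matchers such as emitsInOrder and emitsDone, paired with expectLater (which returns a Future that completes when the match finishes). A small sketch with an illustrative countdown stream:

```dart
import 'package:test/test.dart';

/// A simple countdown stream to test (illustrative)
Stream<int> countdown(int from) async* {
  for (var i = from; i > 0; i--) {
    yield i;
  }
}

void main() {
  test('countdown emits values in order and then closes', () async {
    await expectLater(
      countdown(3),
      emitsInOrder([3, 2, 1, emitsDone]),
    );
  });
}
```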

2. Test Timeout Control

For asynchronous tests that might hang, you can set timeout limits:

test('fetchData should complete within 200ms', () async {
  final result = await fetchData().timeout(Duration(milliseconds: 200));
  expect(result, 42);
}, timeout: Timeout(Duration(milliseconds: 300))); // Overall test timeout

3. Using Mocks to Isolate Dependencies

When testing code that depends on external systems (e.g., databases, APIs), use mock objects to isolate dependencies and ensure test stability.

First add the mockito dependency:

dev_dependencies:
  test: ^1.25.15
  mockito: ^5.4.0  # Mock framework
  build_runner: ^2.4.0  # Required for generating mock code

Example: Testing code that depends on an API client using mocks

// lib/user_repository.dart
import 'user_api_client.dart';

class UserRepository {
  final UserApiClient apiClient;

  UserRepository(this.apiClient);

  Future<String> getUserName(int id) async {
    return await apiClient.fetchUserName(id);
  }
}

// lib/user_api_client.dart
abstract class UserApiClient {
  Future<String> fetchUserName(int id);
}

// Test code (test/user_repository_test.dart)
import 'package:test/test.dart';
import 'package:mockito/annotations.dart';
import 'package:mockito/mockito.dart';
import 'package:my_test_project/user_repository.dart';
import 'package:my_test_project/user_api_client.dart';

// Import the generated mocks (created by build_runner)
import 'user_repository_test.mocks.dart';

// Ask mockito to generate a null-safe MockUserApiClient
@GenerateMocks([UserApiClient])
void main() {
  late UserRepository repository;
  late MockUserApiClient mockApiClient;

  // Initialize before each test
  setUp(() {
    mockApiClient = MockUserApiClient();
    repository = UserRepository(mockApiClient);
  });

  test('getUserName should return name from api client', () async {
    // Configure mock behavior: return "Alice" when fetchUserName(1) is called
    when(mockApiClient.fetchUserName(1)).thenAnswer((_) async => "Alice");

    // Test the repository
    final name = await repository.getUserName(1);

    // Verify the result
    expect(name, "Alice");
    // Verify the mock method was called exactly once
    verify(mockApiClient.fetchUserName(1)).called(1);
  });
}

Generate mock code:

dart run build_runner build

4. Parameterized Tests

When multiple test cases have identical logic but different inputs and outputs, use parameterized tests to reduce code duplication:

import 'package:test/test.dart';
import 'package:my_test_project/math_utils.dart';

void main() {
  group('add parameterized tests', () {
    // Test data: input a, input b, expected result
    final testCases = [
      {'a': 2, 'b': 3, 'expected': 5},
      {'a': -1, 'b': 1, 'expected': 0},
      {'a': 0, 'b': 0, 'expected': 0},
      {'a': 100, 'b': -50, 'expected': 50},
    ];

    for (final testCase in testCases) {
      test('add(${testCase['a']}, ${testCase['b']}) should return ${testCase['expected']}', () {
        expect(add(testCase['a'] as int, testCase['b'] as int), testCase['expected']);
      });
    }
  });
}

VI. Integrating Tests in CI/CD

To ensure tests pass with every code submission, integrate testing into your continuous integration (CI) process. For GitHub Actions, create .github/workflows/test.yml:

name: Tests

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v3

      - name: Set up Dart
        uses: dart-lang/setup-dart@v1

      - name: Get dependencies
        run: dart pub get

      - name: Run tests
        run: dart test

      - name: Check code coverage
        run: |
          dart pub global activate coverage
          dart test --coverage=coverage
          dart pub global run coverage:format_coverage --lcov --in=coverage --out=coverage/lcov.info --packages=.dart_tool/package_config.json --report-on=lib

This configuration automatically runs tests and reports results for every code push or pull request.
