DEV Community

Rahul Singh

Posted on • Originally published at aicodereview.cc

How to Export Azure DevOps Data to Excel (6 Methods, 2026)

Why export Azure DevOps data to Excel

Teams export Azure DevOps data to Excel for reporting, audits, stakeholder presentations, offline analysis, and data migration. While Azure DevOps has built-in dashboards and Analytics views, Excel remains the tool that everyone on a team - including project managers, QA leads, and executives who never log into Azure DevOps - already knows how to use.

Common scenarios where exporting to Excel makes sense:

  • Sprint reports - pulling work item data for burndown charts and velocity calculations outside of Azure DevOps
  • Compliance audits - exporting code review history, test results, and pipeline logs for SOC 2 or ISO 27001 documentation
  • Stakeholder updates - creating filtered views of project status that non-technical team members can read
  • Data migration - moving work items between Azure DevOps organizations or to another platform
  • Custom analysis - running pivot tables, VLOOKUP formulas, or statistical analysis that Azure DevOps dashboards do not support
  • Offline access - having a snapshot of project data when internet access is unavailable

This guide covers six methods to get data out of Azure DevOps and into Excel, ranked from simplest to most powerful. Each method has different strengths depending on what type of data you need and how much automation you want.

Method 1 - Built-in Open in Excel (work items)

The fastest way to export Azure DevOps work items to Excel is the built-in Office Integration feature. This creates a live, bidirectional connection between Excel and Azure DevOps - meaning you can not only read data but also edit work items directly in Excel and publish changes back.

Prerequisites

Before using this method, you need:

  1. Excel 2016 or later installed on your machine (Excel for Microsoft 365 recommended)
  2. Azure DevOps Office Integration plugin - included with Visual Studio 2019+, or available as a standalone download from the Visual Studio Marketplace
  3. Permission to read work items in the Azure DevOps project

Step-by-step process

Step 1 - Create or open a work item query

Navigate to your Azure DevOps project. Go to Boards > Queries. Either open an existing saved query or create a new one. For example, to export all user stories in the current sprint:

Work Item Type = User Story
AND State <> Removed
AND Iteration Path = @CurrentIteration

Make sure the query returns the columns you want in Excel. Click Column Options to add or remove fields like Story Points, Assigned To, Area Path, Tags, and any custom fields your team uses.

Step 2 - Click Open in Excel

With the query results displayed, click the Open in Excel button in the toolbar. This button appears at the top of the query results page. If you do not see it, confirm the Office Integration plugin is installed.

Azure DevOps will generate an .xlsx file and open it in Excel with the Team plugin connected. The spreadsheet will contain one row per work item with all the columns from your query.

Step 3 - Work with the data in Excel

Once the data is in Excel, you can:

  • Sort and filter using standard Excel features
  • Create pivot tables from the work item data
  • Add conditional formatting to highlight blocked items or overdue tasks
  • Use formulas to calculate custom metrics

Step 4 - Refresh data (optional)

Because this is a live connection, you can click Refresh in the Team ribbon tab to pull the latest data from Azure DevOps at any time. This is useful for recurring reports where you want to update the numbers without re-exporting.

Limitations of the built-in method

  • Only works on Windows (the Office Integration plugin does not support macOS)
  • Limited to work items - you cannot export pipeline data, test results, or repository statistics this way
  • Maximum of 50,000 work items per query
  • Requires the desktop Excel application - Excel for the web does not support the plugin

Method 2 - CSV export from the web UI

If you are on macOS or Linux, or simply do not want to install the Office Integration plugin, the CSV export is the next simplest option.

Step-by-step process

Step 1 - Run your work item query

Go to Boards > Queries in the Azure DevOps web portal. Open or create a query with the columns and filters you need.

Step 2 - Export to CSV

Click the three-dot menu (more actions) at the top right of the query results. Select Export to CSV. Azure DevOps will generate a .csv file and download it to your browser's default download location.

Step 3 - Open in Excel

Open the CSV file in Excel. If your data contains special characters or non-English text, use Excel's Data > From Text/CSV import wizard and set the encoding to UTF-8 to avoid garbled characters.

Step 4 - Format and save as .xlsx

Once imported, format the data as needed and save as an .xlsx file to preserve formatting, formulas, and multiple sheets.

CSV export tips

  • Column selection matters - only columns visible in the query results will appear in the CSV. Add all the fields you need before exporting.
  • HTML fields get messy - rich text fields like Description and Acceptance Criteria export as raw HTML in CSV. You can convert line-break tags into real line breaks with a formula like =SUBSTITUTE(A1,"<br>",CHAR(10)) (with wrap text enabled on the cell). Avoid wrapping this in CLEAN(), which strips the CHAR(10) characters it just inserted. For full tag removal, a script is more reliable.
  • Dates may need reformatting - depending on your locale, date fields might import as text. Use Excel's Text to Columns or DATEVALUE() function to convert them.
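For heavier cleanup than an Excel formula can handle, a short script can strip the HTML before the file ever reaches Excel. This is a sketch that assumes a UTF-8 CSV export with rich-text columns named Description and Acceptance Criteria - adjust the column names to match your query:

```python
import csv
import html
import re

def strip_html(value):
    """Remove HTML tags and decode entities from a rich-text field."""
    # Turn <br> variants into newlines so line breaks survive the cleanup
    value = re.sub(r"<br\s*/?>", "\n", value, flags=re.IGNORECASE)
    # Drop all remaining tags
    value = re.sub(r"<[^>]+>", "", value)
    # Decode entities like &amp; and &nbsp;
    return html.unescape(value).strip()

def clean_csv(in_path, out_path, columns=("Description", "Acceptance Criteria")):
    """Rewrite a CSV export with HTML stripped from the given columns."""
    with open(in_path, newline="", encoding="utf-8") as src, \
         open(out_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            for col in columns:
                if col in row and row[col]:
                    row[col] = strip_html(row[col])
            writer.writerow(row)
```

Run it once on the downloaded CSV, then import the cleaned file into Excel as usual.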

Method 3 - REST API export with PowerShell or Python

When you need data that the built-in export does not cover - pipeline runs, pull request details, code review comments, build logs, or any data beyond work items - the Azure DevOps REST API is the way to go.

Exporting work items with PowerShell

This script runs a WIQL query and exports the results to a CSV file that Excel can open:

# Configuration
$org = "your-organization"
$project = "your-project"
$pat = "your-personal-access-token"
$outputFile = "work-items-export.csv"

# Create auth header
$base64Auth = [Convert]::ToBase64String(
  [Text.Encoding]::ASCII.GetBytes(":$pat")
)
$headers = @{
  Authorization = "Basic $base64Auth"
  "Content-Type" = "application/json"
}

# Step 1 - Run a WIQL query to get work item IDs
$wiqlBody = @{
  query = "SELECT [System.Id] FROM WorkItems WHERE [System.TeamProject] = '$project' AND [System.WorkItemType] = 'User Story' AND [System.State] <> 'Removed' ORDER BY [System.Id]"
} | ConvertTo-Json

$wiqlUrl = "https://dev.azure.com/$org/$project/_apis/wit/wiql?api-version=7.1"
$wiqlResult = Invoke-RestMethod -Uri $wiqlUrl -Method Post -Headers $headers -Body $wiqlBody

# Step 2 - Fetch full work item details in batches of 200
$ids = $wiqlResult.workItems.id
$allWorkItems = @()

for ($i = 0; $i -lt $ids.Count; $i += 200) {
    $batch = $ids[$i..([Math]::Min($i + 199, $ids.Count - 1))]
    $idsParam = $batch -join ","
    $detailUrl = "https://dev.azure.com/$org/$project/_apis/wit/workitems?ids=$idsParam&`$expand=fields&api-version=7.1"
    $details = Invoke-RestMethod -Uri $detailUrl -Method Get -Headers $headers
    $allWorkItems += $details.value
}

# Step 3 - Convert to flat objects and export as CSV
$rows = $allWorkItems | ForEach-Object {
    [PSCustomObject]@{
        ID          = $_.id
        Title       = $_.fields.'System.Title'
        State       = $_.fields.'System.State'
        AssignedTo  = $_.fields.'System.AssignedTo'.displayName
        WorkItemType = $_.fields.'System.WorkItemType'
        StoryPoints = $_.fields.'Microsoft.VSTS.Scheduling.StoryPoints'
        AreaPath    = $_.fields.'System.AreaPath'
        IterationPath = $_.fields.'System.IterationPath'
        CreatedDate = $_.fields.'System.CreatedDate'
        ChangedDate = $_.fields.'System.ChangedDate'
        Tags        = $_.fields.'System.Tags'
    }
}

$rows | Export-Csv -Path $outputFile -NoTypeInformation -Encoding UTF8
Write-Host "Exported $($rows.Count) work items to $outputFile"

Exporting pipeline runs with Python

This Python script pulls pipeline run data - something the built-in export cannot do at all:

import base64
import csv

import requests

# Configuration
ORG = "your-organization"
PROJECT = "your-project"
PAT = "your-personal-access-token"
OUTPUT_FILE = "pipeline-runs-export.csv"

# Auth setup
credentials = base64.b64encode(f":{PAT}".encode()).decode()
headers = {
    "Authorization": f"Basic {credentials}",
    "Content-Type": "application/json"
}

def get_pipeline_runs():
    """Fetch all pipeline runs with pagination."""
    all_runs = []
    url = (
        f"https://dev.azure.com/{ORG}/{PROJECT}"
        f"/_apis/pipelines?api-version=7.1"
    )
    pipelines = requests.get(url, headers=headers).json()

    for pipeline in pipelines.get("value", []):
        runs_url = (
            f"https://dev.azure.com/{ORG}/{PROJECT}"
            f"/_apis/pipelines/{pipeline['id']}"
            f"/runs?api-version=7.1"
        )
        runs = requests.get(runs_url, headers=headers).json()

        for run in runs.get("value", []):
            all_runs.append({
                "Pipeline": pipeline["name"],
                "RunID": run["id"],
                "State": run["state"],
                "Result": run.get("result", "N/A"),
                "CreatedDate": run["createdDate"],
                "FinishedDate": run.get("finishedDate", "N/A"),
                "SourceBranch": run.get(
                    "resources", {}
                ).get(
                    "repositories", {}
                ).get(
                    "self", {}
                ).get("refName", "N/A"),
            })

    return all_runs

# Export to CSV
runs = get_pipeline_runs()
if runs:
    with open(OUTPUT_FILE, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=runs[0].keys())
        writer.writeheader()
        writer.writerows(runs)
    print(f"Exported {len(runs)} pipeline runs to {OUTPUT_FILE}")

Exporting pull request and code review data

For teams that need code review audit trails, this Python snippet exports pull request data including reviewer votes:

def get_pull_requests(repo_name, status="completed"):
    """Export pull requests with reviewer details."""
    url = (
        f"https://dev.azure.com/{ORG}/{PROJECT}"
        f"/_apis/git/repositories/{repo_name}"
        f"/pullrequests?searchCriteria.status={status}"
        f"&$top=1000&api-version=7.1"
    )
    response = requests.get(url, headers=headers).json()
    rows = []

    for pr in response.get("value", []):
        reviewers = ", ".join(
            f"{r['displayName']} ({r['vote']})"
            for r in pr.get("reviewers", [])
        )
        rows.append({
            "PR_ID": pr["pullRequestId"],
            "Title": pr["title"],
            "CreatedBy": pr["createdBy"]["displayName"],
            "CreatedDate": pr["creationDate"],
            "ClosedDate": pr.get("closedDate", "N/A"),
            "SourceBranch": pr["sourceRefName"],
            "TargetBranch": pr["targetRefName"],
            "Status": pr["status"],
            "MergeStatus": pr.get("mergeStatus", "N/A"),
            "Reviewers": reviewers,
        })

    return rows

The vote values in Azure DevOps reviewer data map to: 10 = Approved, 5 = Approved with suggestions, 0 = No vote, -5 = Wait for author, -10 = Rejected. Knowing this is essential when building compliance reports from exported data.
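To make exported reports readable, you can translate those numeric votes during export. A small helper using the mapping above (the function names are hypothetical, not part of any API):

```python
# Reviewer vote codes as documented for Azure DevOps pull requests
VOTE_LABELS = {
    10: "Approved",
    5: "Approved with suggestions",
    0: "No vote",
    -5: "Wait for author",
    -10: "Rejected",
}

def label_vote(vote):
    """Translate a numeric reviewer vote into its display label."""
    return VOTE_LABELS.get(vote, f"Unknown ({vote})")

def format_reviewers(reviewers):
    """Render a PR's reviewer list as 'Name (label)' pairs for one CSV cell."""
    return ", ".join(
        f"{r['displayName']} ({label_vote(r['vote'])})" for r in reviewers
    )
```

Swapping format_reviewers into the get_pull_requests snippet above gives auditors human-readable vote labels instead of raw integers.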

REST API tips

  • Personal Access Tokens (PAT) - generate one at https://dev.azure.com/{org}/_usersSettings/tokens with the minimum required scopes (Work Items Read, Code Read, Build Read depending on what you are exporting)
  • Pagination - most API endpoints return a maximum of 200-1000 items per request. Use the continuationToken or $skip/$top parameters to page through large result sets.
  • Rate limits - Azure DevOps throttles heavy per-user usage with a sliding time window of usage units rather than a fixed requests-per-second cap. Add a small delay between batch requests on large exports, and back off when you see 429 responses or a Retry-After header.
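The pagination tip above can be wrapped into a reusable helper. This is a sketch only: it assumes an endpoint that honors continuation tokens (for example, the reporting endpoints); endpoints that do not will need $skip/$top paging instead.

```python
import requests

def get_all_pages(url, headers, page_size=200):
    """Follow x-ms-continuationtoken response headers until all pages arrive."""
    items = []
    token = None
    while True:
        params = {"$top": page_size}
        if token:
            params["continuationToken"] = token
        resp = requests.get(url, headers=headers, params=params)
        resp.raise_for_status()
        items.extend(resp.json().get("value", []))
        # Azure DevOps sets this header only when more results remain
        token = resp.headers.get("x-ms-continuationtoken")
        if not token:
            return items
```

Call it with the same headers dictionary built in the scripts above and it will keep fetching until the server stops returning a token.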

Method 4 - OData Analytics feeds in Excel

Azure DevOps Analytics is an OData-based reporting service that provides optimized, queryable access to work tracking, pipeline, and test data. The major advantage over the REST API is that Analytics is designed for reporting - it supports aggregations, filtering, and projections at the server level, so you pull less data over the wire.

Connecting Excel to Analytics OData

Step 1 - Get your Analytics URL

The base URL format is:

https://analytics.dev.azure.com/{organization}/{project}/_odata/v4.0-preview/

Common entity sets you can query:

Entity                 URL Suffix           Data Type
Work items             WorkItems            Current state of all work items
Work item snapshots    WorkItemSnapshot     Historical daily snapshots
Work item revisions    WorkItemRevisions    Every change to every work item
Pipeline runs          PipelineRuns         Build and release pipeline execution data
Test results           TestResultsDaily     Aggregated daily test pass/fail data
Test runs              TestRuns             Individual test run metadata

Step 2 - Open Excel and connect to OData

  1. Open Excel and go to Data > Get Data > From Other Sources > From OData Feed
  2. Paste the Analytics URL, for example: https://analytics.dev.azure.com/myorg/myproject/_odata/v4.0-preview/WorkItems?$select=WorkItemId,Title,State,WorkItemType,StoryPoints,AssignedTo&$filter=WorkItemType eq 'User Story'
  3. When prompted for authentication, select Basic and enter an empty username with your PAT as the password. Alternatively, select Organizational Account if your Excel is signed into the same Azure AD tenant.
  4. Excel will load the data into the Power Query editor where you can apply additional transformations.
  5. Click Close & Load to bring the data into a worksheet.

Step 3 - Build your report

With the data in Excel, create pivot tables, charts, and dashboards. The connection is refreshable - right-click the data table and select Refresh to pull the latest data from Azure DevOps at any time.

Useful OData query examples

All active bugs with priority and severity:

WorkItems?$select=WorkItemId,Title,State,Priority,Severity,AssignedTo,CreatedDate
&$filter=WorkItemType eq 'Bug' and State ne 'Closed' and State ne 'Removed'
&$orderby=Priority asc

Work item history for sprint analysis:

WorkItemSnapshot?$apply=filter(Iteration/IterationPath eq 'MyProject\Sprint 15'
and WorkItemType eq 'User Story')
/groupby((DateSK,State),aggregate($count as Count))
&$orderby=DateSK asc

Pipeline success rate by definition:

PipelineRuns?$apply=groupby((PipelineName,RunOutcome),
aggregate($count as RunCount))

OData Analytics advantages

  • Server-side filtering - only the data you need crosses the network, making it much faster than pulling everything via REST API
  • Aggregation support - you can get counts, sums, and averages without downloading raw rows
  • Historical data - snapshot entities provide daily point-in-time data that the REST API does not offer
  • Refreshable in Excel - set up the query once and refresh it whenever you need updated numbers

Method 5 - Power BI as an intermediate step

While this guide focuses on Excel, Power BI deserves mention because it has the most polished integration with Azure DevOps Analytics, and you can easily export from Power BI to Excel.

When to use the Power BI route

  • You need complex data transformations before the data lands in Excel
  • You want to combine data from multiple Azure DevOps projects into a single report
  • Your organization already has Power BI licenses
  • You need scheduled automatic refreshes (Power BI Service can refresh on a schedule and email Excel exports)

Step-by-step Power BI to Excel workflow

Step 1 - Install the Azure DevOps Power BI connector

Open Power BI Desktop. Click Get Data > Online Services > Azure DevOps. If you do not see it, update Power BI Desktop to the latest version.

Step 2 - Connect and build your report

Enter your organization URL and authenticate. Select the data tables you need. Use Power Query to filter, join, and transform the data. Build visualizations if desired - or skip straight to the data model.

Step 3 - Export to Excel

In Power BI Desktop, go to the data view. Select the table you want to export. Click Export Data and choose CSV or Excel format. Alternatively, publish the report to Power BI Service and use the Analyze in Excel feature to create a live-connected Excel workbook.

Step 4 - Schedule automatic exports (Power BI Service)

If you need recurring Excel exports, publish your report to Power BI Service, set up scheduled refresh, and use Power Automate to email the refreshed data as an Excel attachment on a schedule.

Method 6 - Third-party tools and extensions

The Visual Studio Marketplace and third-party ecosystem offer several tools that simplify Azure DevOps data export.

Azure DevOps Marketplace extensions

Enhanced Export (by Artiso) - adds a one-click export button to query results with more formatting options than the built-in CSV export. Supports exporting to .xlsx format directly with preserved column widths and header formatting.

WIQL to OData (by Microsoft) - converts work item queries to OData URLs that you can paste directly into Excel's OData connector. Useful if you are comfortable writing WIQL but not OData syntax.

Excel Export PRO - a marketplace extension that supports exporting work items with parent-child hierarchies preserved in the Excel output, which the built-in CSV export flattens.

Power Automate (formerly Microsoft Flow)

Power Automate can create automated workflows that export Azure DevOps data to Excel on a schedule:

  1. Use the Azure DevOps connector trigger (e.g., "When a work item is updated")
  2. Add an Excel Online (Business) action to write data to an Excel file in OneDrive or SharePoint
  3. Set up a recurrence trigger for daily or weekly full exports

This approach is particularly useful for teams that want a continuously updated Excel dashboard without manual exports.

Azure CLI

The Azure DevOps CLI extension (az devops) provides command-line access to most Azure DevOps data. Combined with tools like jq for JSON processing, you can script exports:

# Install the Azure DevOps extension
az extension add --name azure-devops

# Login and set defaults
az devops configure --defaults organization=https://dev.azure.com/myorg project=myproject

# Export work items from a query to JSON, then convert to CSV
az boards query --wiql "SELECT [System.Id], [System.Title], [System.State] FROM WorkItems WHERE [System.WorkItemType] = 'Bug'" --output json | \
  python3 -c "
import csv, json, sys

data = json.load(sys.stdin)
writer = csv.writer(sys.stdout)
writer.writerow(['ID', 'Title', 'State'])
for item in data:
    fields = item.get('fields', {})
    writer.writerow([
        item['id'],
        fields.get('System.Title', ''),
        fields.get('System.State', '')
    ])
" > bugs-export.csv

Exporting test results to Excel

Test result data requires special handling because Azure DevOps stores it differently from work items.

Using the REST API

def export_test_results(plan_id):
    """Export test results for a specific test plan."""
    # Get test suites in the plan
    suites_url = (
        f"https://dev.azure.com/{ORG}/{PROJECT}"
        f"/_apis/testplan/Plans/{plan_id}"
        f"/suites?api-version=7.1"
    )
    suites = requests.get(suites_url, headers=headers).json()
    all_results = []

    for suite in suites.get("value", []):
        # Get test points (test case + configuration pairs)
        points_url = (
            f"https://dev.azure.com/{ORG}/{PROJECT}"
            f"/_apis/testplan/Plans/{plan_id}"
            f"/suites/{suite['id']}"
            f"/TestPoint?api-version=7.1"
        )
        points = requests.get(points_url, headers=headers).json()

        for point in points.get("value", []):
            results = point.get("results", {})
            all_results.append({
                "SuiteName": suite["name"],
                "TestCaseId": point.get("testCaseReference", {}).get("id"),
                "TestCaseName": point.get("testCaseReference", {}).get("name"),
                "Configuration": point.get("configuration", {}).get("name"),
                "Outcome": results.get("lastResultState", "Not Run"),
                "Tester": point.get("tester", {}).get("displayName"),
                "LastRunDate": results.get("lastResultDetails", {}).get("dateCompleted"),
            })

    return all_results

Using the OData Analytics feed

For aggregated test data - especially trend analysis - the OData feed is more efficient:

https://analytics.dev.azure.com/{org}/{project}/_odata/v4.0-preview/TestResultsDaily?
$apply=filter(DateSK ge 20260101 and DateSK le 20260320)
/groupby((DateSK,TestOutcome),aggregate(ResultCount with sum as TotalCount))
&$orderby=DateSK asc

Paste this URL into Excel's OData connector to get a daily breakdown of test pass/fail counts that you can chart over time.
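DateSK values in the Analytics feeds are integer surrogate keys in yyyymmdd form. If you post-process the feed with a script instead of Power Query, a small converter (hypothetical helper name) turns them back into real dates for charting:

```python
from datetime import date

def datesk_to_date(datesk):
    """Convert an Analytics DateSK surrogate key (yyyymmdd integer) to a date."""
    s = str(datesk)
    return date(int(s[:4]), int(s[4:6]), int(s[6:8]))
```

In Power Query itself, the equivalent is a custom column that parses the integer as text.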

Best practices for Azure DevOps data exports

Security considerations

  • Never hard-code PATs in scripts - use environment variables, Azure Key Vault, or credential managers
  • Use minimum-scope PATs - if you only need to export work items, do not create a full-access token
  • Rotate PATs regularly - set expiration dates of 90 days or less
  • Audit exports for sensitive data - work item descriptions and comments may contain customer information, credentials, or internal URLs that should not be shared in Excel files sent via email
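As a concrete example of the first point, the scripts in this guide can read the PAT from an environment variable instead of a hard-coded string. A minimal sketch, assuming the token was exported beforehand as AZDO_PAT (a name chosen here for illustration):

```python
import base64
import os

def auth_headers():
    """Build a Basic auth header from a PAT stored in the environment.

    Assumes the token was exported first, e.g.:
      export AZDO_PAT="your-token"
    """
    pat = os.environ.get("AZDO_PAT")
    if not pat:
        raise RuntimeError("Set the AZDO_PAT environment variable first")
    # Azure DevOps Basic auth uses an empty username and the PAT as password
    token = base64.b64encode(f":{pat}".encode()).decode()
    return {
        "Authorization": f"Basic {token}",
        "Content-Type": "application/json",
    }
```

Drop this in place of the hard-coded $pat / PAT variables in the earlier scripts so the token never lands in version control.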

Performance optimization

  • Use OData $select - only request the fields you need instead of pulling entire entities
  • Apply $filter server-side - filtering in Excel after downloading all data wastes bandwidth and time
  • Batch REST API requests - use the batch endpoint (_apis/wit/workitemsbatch) instead of individual GET requests
  • Cache results locally - if you run the same export daily, compare timestamps and only pull changed items using the ChangedDate field
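The batch-endpoint tip can be sketched as follows, using the documented _apis/wit/workitemsbatch endpoint (which accepts up to 200 IDs per POST). Function names here are illustrative:

```python
import requests

def chunks(ids, size=200):
    """Split an ID list into slices the batch endpoint will accept."""
    return [ids[i:i + size] for i in range(0, len(ids), size)]

def fetch_work_items_batch(org, project, headers, ids, fields):
    """POST IDs to workitemsbatch, 200 per call, one round trip per batch."""
    url = (
        f"https://dev.azure.com/{org}/{project}"
        f"/_apis/wit/workitemsbatch?api-version=7.1"
    )
    items = []
    for batch in chunks(ids):
        resp = requests.post(url, headers=headers,
                             json={"ids": batch, "fields": fields})
        resp.raise_for_status()
        items.extend(resp.json().get("value", []))
    return items
```

Compared with one GET per work item, this cuts a 10,000-item export from 10,000 HTTP requests to 50.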

Data freshness

  • REST API - real-time data, reflects the current state immediately
  • Analytics OData - data is typically 2-5 minutes behind, with some entities refreshing every 24 hours (snapshots)
  • Built-in Excel export - real-time data at the moment of export, but static until you refresh
  • Power BI - depends on your configured refresh schedule (minimum 30 minutes on Power BI Service)

Handling large datasets

If your Azure DevOps project has tens of thousands of work items or months of pipeline history:

  1. Partition your queries - split by area path, iteration, date range, or work item type instead of exporting everything at once
  2. Use continuation tokens - the REST API returns a x-ms-continuationtoken header when more results are available. Always check for it.
  3. Consider Azure Data Factory - for truly large-scale exports (millions of rows), Azure Data Factory has native Azure DevOps connectors and can write directly to Azure SQL or Data Lake, which Excel can then query
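Query partitioning (point 1 above) can be automated. This sketch splits an export into month-sized WIQL queries on ChangedDate; the helper names are hypothetical and the date literal format assumes WIQL's yyyy-mm-dd form:

```python
from datetime import date

def month_ranges(start, end):
    """Yield (from, to) date pairs covering [start, end) one month at a time."""
    cur = start
    while cur < end:
        if cur.month == 12:
            nxt = date(cur.year + 1, 1, 1)
        else:
            nxt = date(cur.year, cur.month + 1, 1)
        yield cur, min(nxt, end)
        cur = nxt

def partitioned_wiql(project, start, end):
    """Build one WIQL query per month so no single export hits result limits."""
    for lo, hi in month_ranges(start, end):
        yield (
            f"SELECT [System.Id] FROM WorkItems "
            f"WHERE [System.TeamProject] = '{project}' "
            f"AND [System.ChangedDate] >= '{lo}' "
            f"AND [System.ChangedDate] < '{hi}'"
        )
```

Feed each generated query through the WIQL export script from Method 3 and concatenate the CSVs at the end.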

Comparison of all six methods

Method             Data Types                     Automation            Platform           Difficulty
Open in Excel      Work items only                Manual with refresh   Windows only       Easy
CSV export         Work items only                Manual                Any browser        Easy
REST API           Everything                     Fully scriptable      Any platform       Medium
OData Analytics    Work items, pipelines, tests   Refreshable in Excel  Any platform       Medium
Power BI           Everything via Analytics       Scheduled refresh     Windows (Desktop)  Medium-Hard
Third-party tools  Varies by tool                 Varies                Varies             Easy-Medium

For most teams, the recommended approach is:

  • Use the built-in CSV export for quick one-time exports of work item queries
  • Use OData Analytics in Excel for recurring reports that need refreshable data
  • Use REST API scripts when you need pipeline, PR, or code review data that the built-in tools do not cover
  • Use Power BI when you need to combine data from multiple projects or create organization-wide dashboards that ultimately get shared as Excel files

Troubleshooting common issues

"Open in Excel" button is missing - install the Azure DevOps Office Integration 2019 plugin from the Visual Studio Marketplace. If already installed, check that your Excel version is 2016 or later and that the Team add-in is enabled in Excel's Add-ins manager (File > Options > Add-ins).

CSV export has garbled characters - the file is UTF-8 encoded but Excel may default to a different encoding. Use Data > From Text/CSV and explicitly select UTF-8 encoding during import.

OData feed returns 401 Unauthorized - verify your PAT has the Analytics read scope enabled. Go to your PAT settings and ensure the "Analytics (read)" permission is checked.

REST API returns 203 Non-Authoritative - this usually means your PAT has expired or the Authorization header is malformed. Regenerate the PAT and double-check the Base64 encoding.

Excel Power Query times out - the OData query is returning too much data. Add $filter and $select parameters to reduce the result set. Also check your network connection and proxy settings.

Data does not match Azure DevOps UI - Analytics data can lag 2-5 minutes behind the live system. For real-time accuracy, use the REST API instead of Analytics OData feeds.

Conclusion

Exporting Azure DevOps data to Excel does not have to be painful. For work items, the built-in tools handle most use cases with a few clicks. For pipeline data, test results, and code review audit trails, the REST API and OData Analytics feeds give you access to everything Azure DevOps stores - you just need a script and a PAT.

The key is choosing the right method for your specific use case. One-time exports are best served by CSV. Recurring reports belong in OData-connected Excel workbooks. Automated compliance workflows should use Power Automate or scheduled scripts. And large-scale analytics across multiple projects should flow through Power BI before landing in Excel.

Start with the simplest method that meets your requirements, and only move to more complex approaches when you hit limitations with the simpler ones.

Frequently Asked Questions

Can I export Azure DevOps work items directly to Excel?

Yes. Azure DevOps has a built-in Open in Excel button on the Queries page that launches a connected Excel session via the Azure DevOps Office Integration plugin. You can also export query results as a CSV file and open that in Excel manually. Both methods work with any work item query you have saved or created.

How do I install the Azure DevOps Excel plugin?

The Azure DevOps Office Integration plugin is included with Visual Studio 2019 and later. If you only have Excel, download the standalone Azure DevOps Office Integration installer from the Visual Studio Marketplace. After installation, you will see a Team ribbon tab in Excel with options to connect to Azure DevOps and pull work item data.

Can I export Azure DevOps pipeline data to Excel?

Azure DevOps does not have a built-in one-click export for pipeline data. You need to use the REST API to pull pipeline runs, stages, jobs, and task results as JSON, then convert that JSON to CSV or load it directly into Excel using Power Query. The Analytics service also exposes pipeline data through OData feeds that Excel can consume.

What is the row limit when exporting Azure DevOps data to Excel?

The built-in Open in Excel feature supports up to 50,000 work items per query. CSV exports from the web UI are also capped at the query result limit. If you need more than 50,000 rows, use the REST API with pagination (continuation tokens) or connect through Power BI and OData, which can handle millions of records.

How do I export Azure DevOps test results to Excel?

Use the Test Results REST API endpoint to pull test run data as JSON. You can also use the Analytics OData feed at https://analytics.dev.azure.com/{org}/{project}/_odata/v4.0-preview/TestResultsDaily. Both methods let you load data into Excel via Power Query or convert JSON to CSV with a script.

Is Power BI required to export Azure DevOps analytics data?

No. Power BI is one option, but you can also connect Excel directly to Azure DevOps Analytics using OData feeds. In Excel, go to Data > Get Data > From OData Feed and paste your Analytics URL. Excel will load the data into a table that you can refresh, filter, and pivot without needing Power BI at all.

