dataflows-consumption-cli
Created by: microsoft

Update Check — ONCE PER SESSION (mandatory)

The first time this skill is used in a session, run the check-updates skill before proceeding.

npx skills add https://github.com/microsoft/skills-for-fabric --skill dataflows-consumption-cli

  • GitHub Copilot CLI / VS Code: invoke the check-updates skill.
  • Claude Code / Cowork / Cursor / Windsurf / Codex: compare local vs remote package.json version.
  • Skip if the check was already performed earlier in this session.

CRITICAL NOTES

  1. To find workspace details (including the workspace ID) from a workspace name: list all workspaces, then filter with JMESPath
  2. To find a dataflow by name: list all dataflows in the workspace and filter by displayName client-side — there is no server-side name filter
  3. getDefinition is a POST, not GET — even though it reads data

Dataflows Gen2 — Consumption via CLI

Table of Contents

| Task | Reference | Notes |
| --- | --- | --- |
| Finding Workspaces and Items in Fabric | COMMON-CLI.md § Finding Workspaces and Items in Fabric | Mandatory: READ link first |
| Fabric Topology & Key Concepts | COMMON-CORE.md § Fabric Topology & Key Concepts | |
| Environment URLs | COMMON-CORE.md § Environment URLs | |
| Authentication & Token Acquisition | COMMON-CORE.md § Authentication & Token Acquisition | Wrong audience = 401; read before any auth issue |
| Core Control-Plane REST APIs | COMMON-CORE.md § Core Control-Plane REST APIs | Includes pagination, LRO polling, and rate-limiting patterns |
| Job Execution | COMMON-CORE.md § Job Execution | |
| Gotchas, Best Practices & Troubleshooting | COMMON-CORE.md § Gotchas, Best Practices & Troubleshooting | |
| Tool Selection Rationale | COMMON-CLI.md § Tool Selection Rationale | |
| Authentication Recipes | COMMON-CLI.md § Authentication Recipes | az login flows and token acquisition |
| Fabric Control-Plane API via az rest | COMMON-CLI.md § Fabric Control-Plane API via az rest | Always pass --resource; includes pagination and LRO helpers |
| Job Execution (CLI) | COMMON-CLI.md § Job Execution | |
| Gotchas & Troubleshooting (CLI-Specific) | COMMON-CLI.md § Gotchas & Troubleshooting (CLI-Specific) | az rest audience, shell escaping, token expiry |
| Quick Reference | COMMON-CLI.md § Quick Reference | az rest template + token audience/tool matrix |
| Consumption Capability Matrix | DATAFLOWS-CONSUMPTION-CORE.md § Consumption Capability Matrix | Read first: shows which ops are available |
| REST API Surface (Consumption) | DATAFLOWS-CONSUMPTION-CORE.md § REST API Surface | List, Get, Parameters, getDefinition, Jobs |
| Dataflow Definition Exploration | DATAFLOWS-CONSUMPTION-CORE.md § Dataflow Definition Exploration | Decode mashup.pq, queryMetadata.json, .platform |
| Parameter Discovery and Analysis | DATAFLOWS-CONSUMPTION-CORE.md § Parameter Discovery and Analysis | Types, formats, M code patterns |
| Refresh and Job Monitoring | DATAFLOWS-CONSUMPTION-CORE.md § Refresh and Job Monitoring | LRO pattern, job instances, polling best practices |
| Agentic Exploration Pattern | DATAFLOWS-CONSUMPTION-CORE.md § Agentic Exploration Pattern | 6-step discovery sequence |
| Security and Permissions Model | DATAFLOWS-CONSUMPTION-CORE.md § Security and Permissions Model | Permission matrix by operation |
| Common Errors | DATAFLOWS-CONSUMPTION-CORE.md § Common Errors | Error codes and resolutions |
| Gotchas and Troubleshooting Reference | DATAFLOWS-CONSUMPTION-CORE.md § Gotchas and Troubleshooting | 12 numbered issues with cause + resolution |
| Quick Reference One-Liners | consumption-cli-quickref.md | az rest one-liners for all consumption ops |
| Discovery Patterns | discovery-queries.md | Definition decoding, parameter extraction, connection analysis |
| Script Templates | script-templates.md | Copy-paste bash and PowerShell templates |
| Tool Stack | SKILL.md § Tool Stack | |
| Connection | SKILL.md § Connection | |
| Agentic Exploration ("Chat With My Dataflows") | SKILL.md § Agentic Exploration | Start here for dataflow exploration |

Tool Stack

| Tool | Role | Install |
| --- | --- | --- |
| az CLI | Primary: auth (az login), Fabric REST API via az rest | Pre-installed in most dev environments |
| curl | Alternative HTTP client for REST calls | Pre-installed |
| jq | Parse JSON responses, extract fields, format output | Pre-installed or trivial |
| base64 | Decode definition parts from base64 | Built into bash; PowerShell uses [Convert]::FromBase64String |
| bash/pwsh | Script execution | Pre-installed |

Agent check — verify before first operation:

az account show >/dev/null 2>&1 || echo "RUN: az login"
command -v jq >/dev/null 2>&1 || echo "INSTALL: apt-get install jq OR brew install jq"

Connection

Resolve Workspace ID and Dataflow ID

Per COMMON-CLI.md Finding Workspaces and Items in Fabric:

# Find workspace ID by name
WS_ID=$(az rest --method get \
  --resource "https://api.fabric.microsoft.com" \
  --url "https://api.fabric.microsoft.com/v1/workspaces" \
  --query "value[?displayName=='My Workspace'].id" --output tsv)

# Find dataflow ID by name within workspace
DF_ID=$(az rest --method get \
  --resource "https://api.fabric.microsoft.com" \
  --url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/dataflows" \
  --query "value[?displayName=='Sales Data Pipeline'].id" --output tsv)

Reusable Connection Variables

# Set once at script top
WS_ID="<workspaceId>"
DF_ID="<dataflowId>"
API="https://api.fabric.microsoft.com/v1"
AZ="az rest --resource https://api.fabric.microsoft.com"
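As a quick sanity check, the variables above compose request URLs like this. The IDs below are hypothetical placeholders; substitute the ones discovered via the list APIs:

```shell
# Hypothetical placeholder IDs — replace with values discovered via the list APIs
WS_ID="00000000-0000-0000-0000-000000000001"
DF_ID="00000000-0000-0000-0000-000000000002"
API="https://api.fabric.microsoft.com/v1"
AZ="az rest --resource https://api.fabric.microsoft.com"

# Expand $AZ unquoted so it word-splits into the command prefix, e.g.:
#   $AZ --method get --url "$API/workspaces/$WS_ID/dataflows/$DF_ID"
echo "$API/workspaces/$WS_ID/dataflows/$DF_ID"
```

The unquoted `$AZ` expansion works here because the stored command contains no arguments with embedded spaces.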

Agentic Exploration ("Chat With My Dataflows")

Discovery Sequence

Run these in order to fully explore a workspace's dataflows. See references/discovery-queries.md for extended patterns.

# 1. List workspaces → find target
az rest --method get --resource "https://api.fabric.microsoft.com" \
  --url "$API/workspaces" --query "value[].{name:displayName, id:id}" -o table

# 2. List dataflows → enumerate all
az rest --method get --resource "https://api.fabric.microsoft.com" \
  --url "$API/workspaces/$WS_ID/dataflows" \
  --query "value[].{name:displayName, id:id, desc:description}" -o table

# 3. Get dataflow properties
az rest --method get --resource "https://api.fabric.microsoft.com" \
  --url "$API/workspaces/$WS_ID/dataflows/$DF_ID"

# 4. Discover parameters
az rest --method get --resource "https://api.fabric.microsoft.com" \
  --url "$API/workspaces/$WS_ID/dataflows/$DF_ID/parameters" \
  --query "value[].{name:name, type:type, required:isRequired, default:defaultValue}" -o table

# 5. Get definition → decode mashup.pq
RESPONSE=$(az rest --method post --resource "https://api.fabric.microsoft.com" \
  --url "$API/workspaces/$WS_ID/dataflows/$DF_ID/getDefinition")
echo "$RESPONSE" | jq -r '.definition.parts[] | select(.path=="mashup.pq") | .payload' | base64 --decode

# 6. Check job history
az rest --method get --resource "https://api.fabric.microsoft.com" \
  --url "$API/workspaces/$WS_ID/items/$DF_ID/jobs/instances" \
  --query "value[].{status:status, type:invokeType, start:startTimeUtc, end:endTimeUtc, error:failureReason}" -o table

Agentic Workflow

  1. Discover → Run Steps 1–3 to list and identify dataflows.
  2. Parameters → Step 4 to understand inputs and defaults.
  3. Definition → Step 5 to inspect M queries, connections, staging config.
  4. Monitor → Step 6 for refresh history and error patterns.
  5. Iterate → Drill into specific queries or connection details.
  6. Present → Summarize findings or generate a reusable script (see script-templates.md).

Gotchas, Rules, Troubleshooting

For full platform gotchas: DATAFLOWS-CONSUMPTION-CORE.md Gotchas and Troubleshooting Reference and COMMON-CLI.md Gotchas & Troubleshooting (CLI-Specific).

MUST DO

  • Always az login first — az rest uses the active session. No session → cryptic failure.
  • Always --resource "https://api.fabric.microsoft.com" — wrong audience = 401.
  • Handle pagination — repeat requests with continuationToken until absent/null.
  • Handle LRO for getDefinition — may return 202 Accepted with Location header; poll until complete.
  • Decode base64 before inspecting — definition parts are base64-encoded.
  • Use POST for getDefinition — it is NOT a GET endpoint.
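The pagination rule above can be sketched as a loop. Here fetch_page is a hypothetical mock standing in for the real az rest call (which would pass the token back, e.g. as ?continuationToken=<token> on the list URL); python3 is used for JSON handling so the sketch runs without jq:

```shell
# Pagination sketch: keep requesting until continuationToken is absent.
# fetch_page is a MOCK standing in for something like:
#   az rest --method get --resource https://api.fabric.microsoft.com \
#     --url "$API/workspaces/$WS_ID/dataflows?continuationToken=$1"
fetch_page() {
  if [ -z "$1" ]; then echo '{"value":[{"id":"a"}],"continuationToken":"t1"}'
  else echo '{"value":[{"id":"b"}]}'; fi
}

TOKEN=""; ALL="[]"
while :; do
  PAGE=$(fetch_page "$TOKEN")
  # Append this page's items to the accumulated array
  ALL=$(python3 -c 'import json,sys; print(json.dumps(json.loads(sys.argv[1]) + json.loads(sys.argv[2])["value"]))' "$ALL" "$PAGE")
  # Empty string when the response carries no continuationToken
  TOKEN=$(python3 -c 'import json,sys; print(json.loads(sys.argv[1]).get("continuationToken",""))' "$PAGE")
  [ -z "$TOKEN" ] && break
done
echo "$ALL"   # all pages merged
```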

AVOID

  • Hardcoded GUIDs — always discover via list-then-filter pattern.
  • Assuming getDefinition is GET — it is POST (common mistake).
  • Ignoring pagination — list endpoints may return partial results.
  • Polling too aggressively — respect Retry-After headers on 429s.
  • Expecting getDefinition with Viewer role — requires Read+Write (Contributor+).

PREFER

  • az rest over raw curl — handles auth automatically.
  • List-then-filter pattern — no server-side name filter for dataflows.
  • Exponential backoff for job polling — 5s → 10s → 20s → 30s cap.
  • jq for response parsing — cleaner than shell string manipulation.
  • JMESPath --query for simple field extraction directly in az rest.
  • Env vars (WS_ID, DF_ID, API) for script reuse.
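The recommended backoff schedule (5s → 10s → 20s → 30s cap) can be sketched as below. The polling call and sleep are left as comments so the sketch runs instantly; JOB_ID and the status check are illustrative assumptions:

```shell
# Exponential backoff sketch for job polling: 5s -> 10s -> 20s -> 30s cap
DELAY=5
for ATTEMPT in 1 2 3 4 5; do
  # In real use, poll here and stop once the job completes, e.g. (JOB_ID hypothetical):
  #   STATUS=$($AZ --method get \
  #     --url "$API/workspaces/$WS_ID/items/$DF_ID/jobs/instances/$JOB_ID" \
  #     --query status -o tsv)
  #   [ "$STATUS" != "InProgress" ] && break
  echo "attempt $ATTEMPT: next wait ${DELAY}s"
  # sleep "$DELAY"   # uncomment in real use
  DELAY=$(( DELAY * 2 )); [ "$DELAY" -gt 30 ] && DELAY=30
done
```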

TROUBLESHOOTING

| Symptom | Cause | Fix |
| --- | --- | --- |
| 401 Unauthorized | Token expired or wrong audience | az login; ensure --resource "https://api.fabric.microsoft.com" |
| 403 Forbidden on getDefinition | Viewer role (Read-only) | Requires Contributor role or higher (Read+Write) |
| 404 Not Found | Wrong workspace or dataflow ID | Re-discover via List Dataflows API |
| getDefinition returns 202 | Large definition or server load | Poll the Location header URL until the operation completes |
| Empty parameters array | Dataflow has no parameters | Expected behavior; check mashup.pq for IsParameterQuery |
| Base64 decode shows garbled text | BOM in encoded content | Strip UTF-8 BOM (\xEF\xBB\xBF) when decoding |
| 429 TooManyRequests | Rate limited | Respect Retry-After header; implement exponential backoff |
| Duplicate results in list | Re-using stale continuationToken | Always use the token from the most recent response |
| OperationNotSupportedForItem | Wrong item type | Verify item is type Dataflow via Get Item |
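For the BOM symptom in the table, decoding with Python's utf-8-sig codec strips the BOM automatically. A minimal helper sketch (decode_part is a hypothetical name):

```shell
# Decode a base64 definition part, stripping a leading UTF-8 BOM if present
decode_part() {
  python3 -c 'import base64,sys; sys.stdout.write(base64.b64decode(sys.argv[1]).decode("utf-8-sig"))' "$1"
}

# "77u/aGVsbG8=" is base64 for a UTF-8 BOM followed by "hello"
decode_part "77u/aGVsbG8="   # prints: hello
```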

Examples

Example 1: List All Dataflows in a Workspace

az rest --method get \
  --url "https://api.fabric.microsoft.com/v1/workspaces/${WS_ID}/items?type=Dataflow" \
  --resource "https://api.fabric.microsoft.com" \
  --query "value[].{Name:displayName, Id:id, Type:type}" -o table

Example 2: Decode a Dataflow Definition

# Step 1: Request the definition (POST). Large definitions return 202 Accepted with
# a Location header pointing at the operation (small ones may return 200 with the
# body directly). curl is used here because az rest does not expose response headers.
TOKEN=$(az account get-access-token \
  --resource "https://api.fabric.microsoft.com" --query accessToken -o tsv)
LOCATION=$(curl -s -D - -o /dev/null -X POST \
  -H "Authorization: Bearer $TOKEN" -H "Content-Length: 0" \
  "https://api.fabric.microsoft.com/v1/workspaces/${WS_ID}/items/${DF_ID}/getDefinition" \
  | grep -i "^location:" | awk '{print $2}' | tr -d '\r')

# Step 2: Poll the operation URL until its status is Succeeded, then fetch the result
while [ "$(az rest --method get --url "${LOCATION}" \
    --resource "https://api.fabric.microsoft.com" --query status -o tsv)" != "Succeeded" ]; do
  sleep 5
done
DEF=$(az rest --method get --url "${LOCATION}/result" \
  --resource "https://api.fabric.microsoft.com")

# Step 3: Decode mashup.pq to see the Power Query M code
echo "$DEF" | python3 -c "
import json, base64, sys
parts = json.load(sys.stdin)['definition']['parts']
for p in parts:
    if p['path'] == 'mashup.pq':
        print(base64.b64decode(p['payload']).decode('utf-8'))
"

Example 3: Check Refresh Job History

# Get recent job instances for a dataflow
az rest --method get \
  --url "https://api.fabric.microsoft.com/v1/workspaces/${WS_ID}/items/${DF_ID}/jobs/instances?limit=5" \
  --resource "https://api.fabric.microsoft.com" \
  --query "value[].{Status:status, Start:startTimeUtc, End:endTimeUtc, Id:id}" -o table

Example 4: Discover Parameters from Definition

# After decoding the definition (see Example 2), extract parameters:
echo "$DEF" | python3 -c "
import json, base64, sys
parts = json.load(sys.stdin)['definition']['parts']
for p in parts:
    if p['path'] == 'queryMetadata.json':
        meta = json.loads(base64.b64decode(p['payload']).decode('utf-8'))
        for qname, qmeta in meta.get('queriesMetadata', {}).items():
            if qmeta.get('queryGroupId') == 'parameters' or 'IsParameterQuery' in str(qmeta):
                print(f'Parameter: {qname}')
"
