
Power Query functions and example queries

Combine the functions exposed by the Cognite Data Fusion (REST API) connector for Power BI with Power Query to fetch and transform data from the Cognite API, and create reports and dashboards in Microsoft Power BI and Microsoft Excel.

:::caution Beta
The features described in this section are currently in beta testing with selected customers and are subject to change.
:::

Copy and adapt the queries and Power Query functions on this page to fit your business needs.

Utility functions

Convert timestamps from/to epoch

CDF resource types expect and return timestamps as milliseconds since the Unix epoch. Power Query has no built-in method for parsing this format into a datetimezone type representing a timezone-aware timestamp. CDF data models instead represent timestamps for primitive fields in the ISO 8601 format.

Use the functions below to convert between a datetimezone variable and milliseconds since the Unix epoch and from a datetimezone variable to text in ISO 8601 format.

ConvertDateTimeZoneToMs

```m title="Convert datetimezone to milliseconds since epoch"
(dtz as datetimezone) as number =>
let
    // Convert the input DateTimeZone to UTC
    UtcDateTime = DateTimeZone.RemoveZone(DateTimeZone.SwitchZone(dtz, 0)),
    // Define the Unix epoch start
    UnixEpochStart = #datetime(1970, 1, 1, 0, 0, 0),
    // Calculate the duration between the input date and Unix epoch start
    Delta = UtcDateTime - UnixEpochStart,
    // Convert duration to total milliseconds
    TotalMilliseconds = Duration.TotalSeconds(Delta) * 1000
in
    TotalMilliseconds
```

ConvertMsToDateTimeZone

```m title="Convert milliseconds since epoch to datetimezone"
(ms as number) as datetimezone =>
let
    // Convert ms to seconds
    SecondsSinceEpoch = ms / 1000,
    // Create a duration
    DurationSinceEpoch = #duration(0, 0, 0, SecondsSinceEpoch),
    // Add duration to Unix epoch start to get UTC datetime
    UnixEpochStart = #datetime(1970, 1, 1, 0, 0, 0),
    UtcDateTime = UnixEpochStart + DurationSinceEpoch,
    // Tag the datetime as UTC, then convert it to the local time zone
    LocalDateTimeZone = DateTimeZone.ToLocal(DateTime.AddZone(UtcDateTime, 0))
in
    LocalDateTimeZone
```

ConvertDateTimeZoneToIso

```m title="Convert DateTimeZone to ISO 8601 text representation"
(dtz as datetimezone) as text =>
let
    // Use DateTimeZone.ToText with an ISO 8601 format string
    Result = DateTimeZone.ToText(dtz, [Format = "yyyy-MM-ddTHH:mm:sszzz", Culture = "en-US"])
in
    Result
```
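As a quick check, you can combine the three functions in a round trip. The sketch below assumes you've saved them under the names ConvertDateTimeZoneToMs, ConvertMsToDateTimeZone, and ConvertDateTimeZoneToIso:

```m
let
    // 1 July 2024 12:00 UTC as a timezone-aware value
    Original = #datetimezone(2024, 7, 1, 12, 0, 0, 0, 0),
    // Milliseconds since the Unix epoch
    Ms = ConvertDateTimeZoneToMs(Original),
    // Back to a datetimezone in the local time zone; the instant is unchanged
    RoundTrip = ConvertMsToDateTimeZone(Ms),
    // Text in ISO 8601 format, useful for data model filter values
    Iso = ConvertDateTimeZoneToIso(Original)
in
    [Milliseconds = Ms, RoundTrip = RoundTrip, Iso = Iso]
```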

Add function

To add a new function in Power Query, select Get Data > Blank Query and write your function or copy one of the functions below.

Time deltas

It’s common to define start and end timestamps based on time deltas. The values will be updated when a dataset is refreshed. In the example below, EndTime is the current time, and StartTime is 7 days before EndTime. You can adapt this example to use different time deltas.

```m title="Time deltas"
CurrentTime = DateTimeZone.LocalNow(),
EndTime = CurrentTime,
StartTime = CurrentTime - #duration(7, 0, 0, 0)
```
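Time deltas combine naturally with the conversion utilities above. For example, a sketch that turns the 7-day window into the millisecond values CDF endpoints expect (assumes the ConvertDateTimeZoneToMs function is defined as shown earlier):

```m
let
    CurrentTime = DateTimeZone.LocalNow(),
    EndTime = CurrentTime,
    StartTime = CurrentTime - #duration(7, 0, 0, 0),
    // Millisecond values ready to use in a CDF request body
    StartMs = Number.Round(ConvertDateTimeZoneToMs(StartTime)),
    EndMs = Number.Round(ConvertDateTimeZoneToMs(EndTime))
in
    [start = StartMs, end = EndMs]
```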

Common GET request

When you fetch data from CDF with the GetCDF function from the CDF REST API connector for Power BI, you must use query parameters to pass filters that select the data to fetch.

The example below shows how you can add the externalIdPrefix and limit query parameters from the /timeseries endpoint to the URL to filter data on the server side.

```m title="List all time series instances with an externalId starting with a specific string"
let
    Source = GetCDF("/timeseries?externalIdPrefix=EVE&limit=1000")
in
    Source
```
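You can also build the query string from variables, so the filter is easy to change in one place. A minimal sketch (the prefix and limit values are placeholders):

```m
let
    Prefix = "EVE",
    Limit = 1000,
    // Uri.EscapeDataString protects characters that aren't valid in a URL
    Url = "/timeseries?externalIdPrefix=" & Uri.EscapeDataString(Prefix)
        & "&limit=" & Number.ToText(Limit),
    Source = GetCDF(Url)
in
    Source
```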

Common POST request

When you fetch data from CDF with the [PostCDF](/cdf/dashboards/references/rest/powerbi_rest_functions#postcdf) function, you must write a request body that selects the data to fetch. The function accepts a text representation of the JSON body, but you can also write the body as a Power Query record and convert it to JSON text before passing the value to the PostCDF function.

```m title="List all data modeling instances for a view using the DMS API"
let
    SpaceExternalId = "Geography",
    ViewExternalId = "City",
    ViewVersion = "1",
    Body = [
        sources = {
            [
                source = [
                    type = "view",
                    space = SpaceExternalId,
                    externalId = ViewExternalId,
                    version = ViewVersion
                ]
            ]
        },
        limit = 1000
    ],
    BodyText = Text.FromBinary(Json.FromValue(Body)),
    Source = PostCDF("/models/instances/list", BodyText)
in
    Source
```

Alternatively, you can manually write the POST body as text, but you need to escape double quotes (") with another set of double-quote characters:

```m
let
    BodyText = "{""sources"": [{""source"": {""type"": ""view"", ""space"": ""Geography"", ""externalId"": ""City"", ""version"": ""1""}}], ""limit"": 1000}",
    Source = PostCDF("/models/instances/list", BodyText)
in
    Source
```

If you need to reuse a POST request, you can transform it into a Power Query function. For example:

```m
(SpaceExternalId as text, ViewExternalId as text, ViewVersion as text) as table =>
let
    Body = [
        sources = {
            [
                source = [
                    type = "view",
                    space = SpaceExternalId,
                    externalId = ViewExternalId,
                    version = ViewVersion
                ]
            ]
        },
        limit = 1000
    ],
    BodyText = Text.FromBinary(Json.FromValue(Body)),
    Source = PostCDF("/models/instances/list", BodyText)
in
    Source
```

You can define the function name by right-clicking on the entry in the query list in the Power Query editor and selecting Rename.

If the function above is named ListInstancesDMS, you can use it in a new query by entering the field values in Power Query or by writing a new query:

```m
let
    Source = ListInstancesDMS("Geography", "City", "1")
in
    Source
```
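The instances returned by the DMS list endpoint carry their view properties in nested records, keyed first by space and then by "externalId/version" of the view. A sketch of expanding a name property into its own column (this assumes the response shape just described and a view property named name; adapt the keys to your model):

```m
let
    Source = ListInstancesDMS("Geography", "City", "1"),
    // properties -> space -> "externalId/version" -> the view's fields
    ExpandSpace = Table.ExpandRecordColumn(Source, "properties", {"Geography"}, {"Geography"}),
    ExpandView = Table.ExpandRecordColumn(ExpandSpace, "Geography", {"City/1"}, {"City/1"}),
    ExpandFields = Table.ExpandRecordColumn(ExpandView, "City/1", {"name"}, {"name"})
in
    ExpandFields
```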

GraphQL requests

When you fetch data from CDF using the GraphQL function, you must write a GraphQL request to select which data to fetch from a specific data model. The function expects you to specify the external ID of the space, the external ID of the data model, the version of the data model, the GraphQL query to run, and optionally a set of variables to be used in the query.

The query below uses the GraphQL syntax and passes the variables as JSON texts. Using variables in the query makes it easier to parameterize and use with external values.

```m title="List all work orders with an end date greater than a specific date"
let
    Source = GraphQL(
        "cdf_idm",
        "CogniteProcessIndustries",
        "v1",
        "query MyQuery($cursor: String, $endTime: Timestamp) {#(lf)  listCogniteMaintenanceOrder(#(lf)    first: 1000#(lf)    after: $cursor#(lf)    filter: {endTime: {gte: $endTime}}#(lf)  ) {#(lf)    items {#(lf)      name#(lf)      type#(lf)      startTime#(lf)      endTime#(lf)      priority#(lf)    }#(lf)    pageInfo {#(lf)      endCursor#(lf)      hasNextPage#(lf)    }#(lf)  }#(lf)}",
        "{""endTime"": ""2024-10-01T00:00:00+02:00""}"
    )
in
    Source
```

The M language used by Power Query currently doesn't support multiline strings, so the query must be on a single line. The #(lf) represents a line-break character. In the example above, the query was pasted into the text area field in Power Query and the variables were passed as JSON text. Notice how Power BI added the line breaks to the original query and expressed it as a single-line text variable.

Alternatively, you can write the query as multiple single-line texts and use the Text.Combine function to add the line breaks to the query. You can define the variables as Power Query records and convert them to JSON text before passing them to the GraphQL function. For example, see how the ConvertDateTimeZoneToIso function converts a datetimezone variable to a text representation of the ISO 8601 format and then passes it as a variable to the query.

```m
let
    // This could be a parameter or referenced from another query
    EndTime = #datetimezone(2024, 10, 1, 0, 0, 0, 2, 0),
    VariablesRecord = [
        endTime = ConvertDateTimeZoneToIso(EndTime)
    ],
    VariablesText = Text.FromBinary(Json.FromValue(VariablesRecord)),
    Query = Text.Combine({
        "query MyQuery($cursor: String, $endTime: Timestamp) {",
        "  listCogniteMaintenanceOrder(",
        "    first: 1000",
        "    after: $cursor",
        "    filter: {endTime: {gte: $endTime}}",
        "  ) {",
        "    items {",
        "      name",
        "      type",
        "      startTime",
        "      endTime",
        "      priority",
        "    }",
        "    pageInfo {",
        "      endCursor",
        "      hasNextPage",
        "    }",
        "  }",
        "}"
    }, "#(lf)"),
    Data = GraphQL(
        "cdf_idm",
        "CogniteProcessIndustries",
        "v1",
        Query,
        VariablesText
    )
in
    Data
```
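Since the query above already requests pageInfo, you can drive the $cursor variable yourself when a result spans several pages. A sketch of a helper that builds the variables text for each page (the cursor value shown is a placeholder; take it from the endCursor of the previous page):

```m
let
    MakeVariables = (endTime as datetimezone, optional cursor as nullable text) as text =>
        let
            Base = [endTime = ConvertDateTimeZoneToIso(endTime)],
            // Only include the cursor field after the first page
            WithCursor = if cursor <> null then Record.AddField(Base, "cursor", cursor) else Base
        in
            Text.FromBinary(Json.FromValue(WithCursor)),
    FirstPage = MakeVariables(#datetimezone(2024, 10, 1, 0, 0, 0, 2, 0)),
    NextPage = MakeVariables(#datetimezone(2024, 10, 1, 0, 0, 0, 2, 0), "<endCursor from the previous page>")
in
    [first = FirstPage, next = NextPage]
```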

Advanced examples

Depending on the shape of the Cognite API response, you may need additional Power Query transformations to fetch and transform the data. Copy and adapt the examples below to fit your business needs.

Fetch sequence rows with the PostCDF function

The Power Query function below fetches and processes sequence rows data for a sequence external ID from CDF. It sends a POST request, extracts column information, expands nested response data, and reorganizes it into tabular format. The function handles data type conversion, removes unnecessary fields, and groups data by row. The final output is a well-structured table with correctly typed columns.

```m title="Fetch sequence rows with PostCDF"
(externalId as text) as table =>
let
    RequestBody = "{""externalId"": """ & externalId & """, ""limit"": 10000}",
    Response = PostCDF("/sequences/data/list", RequestBody),
    // Extract columns information from the first page
    FirstPage = Response{0},
    Columns = FirstPage[columns],
    ColumnNames = List.Transform(Columns, each [externalId]),
    ColumnTypes = List.Transform(Columns, each
        if [valueType] = "STRING" then type text else
        if [valueType] = "DOUBLE" then type number else
        if [valueType] = "LONG" then Int64.Type
        else type any
    ),
    // Extract the 'values' from each row
    Rows = Table.ExpandListColumn(Response, "rows"),
    ValuesTable = Table.ExpandRecordColumn(Rows, "rows", {"rowNumber", "values"}, {"rows.rowNumber", "rows.values"}),
    RemoveColumns = Table.RemoveColumns(ValuesTable, {"id", "externalId", "columns", "nextCursor"}),
    ExpandValues = Table.ExpandListColumn(RemoveColumns, "rows.values"),
    // Group by rowNumber and create a record for each row
    GroupedRows = Table.Group(ExpandValues, {"rows.rowNumber"}, {
        {"RowData", (t) => Record.FromList(t[rows.values], ColumnNames)}
    }),
    // Expand the RowData column
    ExpandRows = Table.ExpandRecordColumn(GroupedRows, "RowData", ColumnNames),
    // Set column data types
    FinalTable = Table.TransformColumnTypes(ExpandRows, List.Zip({ColumnNames, ColumnTypes}))
in
    FinalTable
```

To use the function (assuming you've named it RetrieveSequenceRows):

```m
let
    Source = RetrieveSequenceRows("sequence-externalId")
in
    Source
```

Fetch instances from the DMS query endpoint with the PostCDF function

The Power Query function below retrieves and processes data modeling instances for a DMS query. It paginates through the response, extracts the instances, and expands the nested data.

```m title="Fetch instances from the DMS query endpoint with PostCDF"
(query as text) as table =>
    let
        FetchPage = (query as text, optional cursors as nullable record) as table =>
            let
                Query = Json.Document(query),
                UpdatedQuery =
                    if cursors <> null then
                        let
                            // Get all field names of both records
                            QueryWithFields = Record.FieldNames(Query[with]),
                            QuerySelectFields = Record.FieldNames(Query[select]),
                            CursorsFields = Record.FieldNames(cursors),
                            // Find the intersection of field names
                            CommonFields = List.Intersect({QueryWithFields, QuerySelectFields, CursorsFields}),
                            // Create new records containing only the common fields
                            UpdatedQueryWithAndSelect = Record.TransformFields(
                                Query,
                                {
                                    {"with", each Record.SelectFields(_, CommonFields)},
                                    {"select", each Record.SelectFields(_, CommonFields)}
                                }
                            )
                        in
                            UpdatedQueryWithAndSelect
                    else
                        Query,
                // Add cursors if they are provided
                UpdatedQueryWithCursors =
                    if cursors <> null then
                        Record.AddField(UpdatedQuery, "cursors", cursors)
                    else
                        UpdatedQuery,
                FinalBody = Text.FromBinary(Json.FromValue(UpdatedQueryWithCursors)),
                Response = PostCDF("/models/instances/query", FinalBody)
            in
                Response,
        // Helper function to create next cursor record from result table
        CreateNextCursorRecordFromTable = (inputTable as table) as record =>
            let
                RecordsList = List.Transform(
                    Table.ToRecords(inputTable), each Record.FromList({[nextCursor]}, {[resultExpression]})
                ),
                CombinedRecord = Record.Combine(RecordsList)
            in
                CombinedRecord,
        // Helper function to check if all cursors are null
        AllCursorsNull = (cursorsRecord as record) as logical =>
            let
                CursorValues = Record.ToList(cursorsRecord),
                NullCount = List.Count(List.Select(CursorValues, each _ = null))
            in
                NullCount = List.Count(CursorValues),
        // Helper function to aggregate items from all pages and convert to tables
        AggregateResults = (results as list) as table =>
            let
                // Combine all tables
                CombinedTable = Table.Combine(results),
                // Group by resultExpression and convert items to tables
                GroupedTable = Table.Group(
                    CombinedTable,
                    {"resultExpression"},
                    {
                        {
                            "items",
                            each
                                Table.FromRecords(
                                    List.Combine(List.Transform([items], each if Value.Is(_, type list) then _ else {_}))
                                ),
                            type table
                        }
                    }
                )
            in
                GroupedTable,
        // Main pagination logic
        FetchAllPages = () as list =>
            let
                // Initialize accumulator
                InitialAcc = [
                    results = {},
                    currentCursors = null,
                    hasMore = true
                ],
                // Pagination function
                PaginationFunction = (acc as record) =>
                    let
                        CurrentPage = FetchPage(query, acc[currentCursors]),
                        NextCursors = CreateNextCursorRecordFromTable(CurrentPage),
                        HasMoreResults = not AllCursorsNull(NextCursors) and Table.RowCount(CurrentPage) > 0,
                        UpdatedResults = List.Combine({acc[results], {CurrentPage}})
                    in
                        [
                            results = UpdatedResults,
                            currentCursors = NextCursors,
                            hasMore = HasMoreResults
                        ],
                // Keep fetching until no more results
                AllResults = List.Generate(
                    () => InitialAcc, each _[hasMore], each PaginationFunction(_), each _[results]
                ),
                // Get the final list of results
                FinalResults = List.Last(AllResults)
            in
                FinalResults,
        // Execute pagination and combine results
        AllPages = FetchAllPages(),
        FinalTable = AggregateResults(AllPages)
    in
        FinalTable
```

To use the function (assuming you've named it QueryDMS):

```m
let
    Query = [
        with = [
            cities = [
                nodes = [
                    filter = [
                        hasData = {
                            [
                                space = "Geography",
                                externalId = "City",
                                version = "1",
                                #"type" = "view"
                            ]
                        }
                    ],
                    chainTo = "destination",
                    direction = "outwards"
                ]
            ],
            countries = [
                nodes = [
                    filter = [
                        hasData = {
                            [
                                space = "Geography",
                                externalId = "Country",
                                version = "1",
                                #"type" = "view"
                            ]
                        }
                    ],
                    chainTo = "destination",
                    direction = "outwards"
                ]
            ]
        ],
        select = [
            cities = [
                sources = {
                    [
                        source = [
                            space = "Geography",
                            externalId = "City",
                            version = "1",
                            #"type" = "view"
                        ],
                        properties = {
                            "name"
                        }
                    ]
                }
            ],
            countries = [
                sources = {
                    [
                        source = [
                            space = "Geography",
                            externalId = "Country",
                            version = "1",
                            #"type" = "view"
                        ],
                        properties = {
                            "name"
                        }
                    ]
                }
            ]
        ]
    ],
    QueryText = Text.FromBinary(Json.FromValue(Query)),
    Source = QueryDMS(QueryText)
in
    Source
```
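The function returns one row per result expression, with the matching instances as a nested table in the items column. To continue working with a single result set, look up its row by name (assuming the cities and countries result expressions from the query above):

```m
let
    // QueryText is the JSON body built in the previous example
    Results = QueryDMS(QueryText),
    // Select the nested items table for the "cities" result expression
    Cities = Results{[resultExpression = "cities"]}[items]
in
    Cities
```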

:::tip
Add filters like the hasData filter in the example above to avoid fetching all instances from CDF.
:::

Fetch time series datapoints with the PostCDF function

The Power Query function below retrieves and processes aggregated time series datapoints for a time series within a time range. It converts local timezone inputs to UTC for the CDF API requests, supports multiple aggregates and custom granularity, and handles data pagination. The function then converts the returned UTC timestamps back to the local timezone, expands the nested API response, and outputs a well-formatted table with properly typed columns. It also includes local timestamps and decimal aggregate values.

The example uses the ConvertDateTimeZoneToMs and ConvertMsToDateTimeZone functions to convert timestamps.

```m title="Fetch time series datapoints with PostCDF"
(
    item as record,
    start as datetimezone,
    optional end as nullable datetimezone,
    optional aggregates as nullable text,
    optional granularity as nullable text,
    optional targetUnit as nullable text,
    optional targetUnitSystem as nullable text,
    optional timeZone as nullable text
) =>
let
    // Function to detect query type based on item record structure
    DetectQueryType = (item as record) =>
        let
            Fields = Record.FieldNames(item),
            HasId = List.Contains(Fields, "id"),
            HasExternalId = List.Contains(Fields, "externalId"),
            HasSpace = List.Contains(Fields, "space"),
            FieldCount = List.Count(Fields),
            QueryType =
                if HasId and not HasExternalId and not HasSpace and FieldCount = 1 then
                    "id"
                else if HasExternalId and not HasId and not HasSpace and FieldCount = 1 then
                    "externalId"
                else if HasExternalId and HasSpace and not HasId and FieldCount = 2 then
                    "instanceId"
                else
                    Error.Record(
                        "Invalid item content",
                        "The item record does not match any supported query type",
                        item
                    )
        in
            QueryType,
    // Detect query type
    queryType = DetectQueryType(item),
    // Determine limit based on presence of aggregates
    limit = if aggregates <> null then 10000 else 100000,
    // Convert aggregates from comma-separated string to list format accepted by the API
    AggregatesList = Text.Split(aggregates, ","),
    AggregatesTrimmedList = List.Transform(AggregatesList, each Text.Trim(_)),
    StartMs = Number.Round(ConvertDateTimeZoneToMs(start)),
    EndMs = Number.Round(ConvertDateTimeZoneToMs(end)),
    // Function to fetch a single page of data
    FetchPage = (cursor as nullable text) =>
        let
            // Build body item
            bodyItem =
                if queryType = "id" then
                    [id = Record.Field(item, "id")]
                        & (if targetUnit <> null then [targetUnit = targetUnit] else [])
                        & (if targetUnitSystem <> null then [targetUnitSystem = targetUnitSystem] else [])
                        & (if cursor <> null then [cursor = cursor] else [])
                else if queryType = "externalId" then
                    [externalId = Record.Field(item, "externalId")]
                        & (if targetUnit <> null then [targetUnit = targetUnit] else [])
                        & (if targetUnitSystem <> null then [targetUnitSystem = targetUnitSystem] else [])
                        & (if cursor <> null then [cursor = cursor] else [])
                else if queryType = "instanceId" then
                    [
                        instanceId = [
                            space = Record.Field(item, "space"),
                            externalId = Record.Field(item, "externalId")
                        ]
                    ]
                        & (if targetUnit <> null then [targetUnit = targetUnit] else [])
                        & (if targetUnitSystem <> null then [targetUnitSystem = targetUnitSystem] else [])
                        & (if cursor <> null then [cursor = cursor] else [])
                else
                    error "Invalid query type",
            // Build request body
            body =
                [
                    items = {bodyItem},
                    limit = limit,
                    ignoreUnknownIds = true,
                    start = Text.From(StartMs)
                ]
                    & (if end <> null then [end = Text.From(EndMs)] else [])
                    & (if aggregates <> null then [aggregates = AggregatesTrimmedList] else [])
                    & (if granularity <> null then [granularity = granularity] else [])
                    & (if timeZone <> null then [timeZone = timeZone] else []),
            Response = PostCDF("/timeseries/data/list", Text.FromBinary(Json.FromValue(body))),
            // Try to fetch the cursor from the first item in the response
            FirstItem =
                if Type.Is(Value.Type(Response), type table) and Table.RowCount(Response) > 0 then
                    Table.First(Response)
                else
                    null,
            NextCursor = if FirstItem <> null then Record.FieldOrDefault(FirstItem, "nextCursor", null) else null,
            // Handles empty response and extracts data points when present
            FinalItemsList =
                if Table.HasColumns(Response, "datapoints") then
                    let
                        // Clean up the response table
                        ColumnsToRemove = {"nextCursor", "isStep", "unit"},
                        ColumnsPresent = List.Intersect({Table.ColumnNames(Response), ColumnsToRemove}),
                        CleanedTable = Table.RemoveColumns(Response, ColumnsPresent),
                        // Expand the "datapoints" column
                        ExpandedDatapointsList = Table.ExpandListColumn(CleanedTable, "datapoints"),
                        // Handles the case where the list of "datapoints" is empty
                        FinalDataPointsList =
                            if List.NonNullCount(ExpandedDatapointsList[datapoints]) > 0 then
                                let
                                    // Extract a sample record to determine available fields dynamically
                                    SampleRecord = ExpandedDatapointsList[datapoints]{0},
                                    AvailableFields = Record.FieldNames(SampleRecord),
                                    // Expand the "datapoints" records using the available fields
                                    ExpandedDatapointsRecords = Table.ExpandRecordColumn(
                                        ExpandedDatapointsList, "datapoints", AvailableFields, AvailableFields
                                    ),
                                    DataPointsList = Table.ToRecords(ExpandedDatapointsRecords)
                                in
                                    DataPointsList
                            else
                                {}
                    in
                        FinalDataPointsList
                else
                    Table.ToRecords(Response)
        in
            {FinalItemsList, NextCursor},
    // Recursive function to accumulate all pages of data
    AccumulateData = (cursor as nullable text, accumulatedItems as list) =>
        let
            CurrentPage = FetchPage(cursor),
            NewItems = CurrentPage{0},
            NextCursor = CurrentPage{1},
            UpdatedAccumulatedItems = accumulatedItems & NewItems,
            Result =
                if NextCursor <> null then
                    @AccumulateData(NextCursor, UpdatedAccumulatedItems)
                else
                    UpdatedAccumulatedItems
        in
            Result,
    // Fetch all data
    AllItems = AccumulateData(null, {}),
    // Convert the accumulated items to a table
    ConvertToTable =
        if List.IsEmpty(AllItems) then
            Table.FromList({}, Splitter.SplitByNothing(), null, null, ExtraValues.Error)
        else
            Table.FromList(AllItems, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    // Expand the table column and convert timestamps
    ExpandedTable =
        if not Table.IsEmpty(ConvertToTable) and Table.HasColumns(ConvertToTable, "Column1") then
            let
                TmpTable = Table.ExpandRecordColumn(
                    ConvertToTable, "Column1", Record.FieldNames(ConvertToTable{0}[Column1])
                ),
                // timestamp should always be present when there are datapoints
                FixType = Table.TransformColumnTypes(TmpTable, {{"timestamp", Int64.Type}}),
                ParseTimestamp = Table.TransformColumns(FixType, {"timestamp", each ConvertMsToDateTimeZone(_)}),
                ParsedWithType = Table.TransformColumnTypes(ParseTimestamp, {{"timestamp", type datetimezone}}),
                // Check if the time series is of type string
                FirstEntry = ParsedWithType{0},
                IsString = FirstEntry[isString],
                CleanedTable = Table.RemoveColumns(ParsedWithType, {"isString"}),
                // Convert aggregate/value columns to decimal number
                ValuesAsDecimal =
                    if aggregates <> null then
                        Table.TransformColumnTypes(
                            CleanedTable, List.Transform(AggregatesTrimmedList, each {_, type number})
                        )
                    else if IsString then
                        CleanedTable
                    else
                        Table.TransformColumnTypes(
                            CleanedTable, List.Transform({"value"}, each {_, type number})
                        ),
                // Check if "id" column is present and convert to integer
                IdAsInteger =
                    if Table.HasColumns(ValuesAsDecimal, "id") then
                        Table.TransformColumnTypes(ValuesAsDecimal, {{"id", Int64.Type}})
                    else
                        ValuesAsDecimal
            in
                IdAsInteger
        else
            ConvertToTable
in
    ExpandedTable
```

The function is more complex than the previous examples and handles many different scenarios: pagination, data type conversion, and nested data expansion. To use the function (assuming you've named it RetrieveDataPoints):

```m
let
    Source = RetrieveDataPoints(
        [ externalId = "EVE-TI-FORNEBU-01-3" ],
        #datetimezone(2024, 10, 1, 0, 0, 0, 2, 0),
        #datetimezone(2024, 10, 13, 10, 0, 0, 2, 0),
        "average,max,min",
        "1d",
        null,
        "SI",
        "Europe/Oslo"
    )
in
    Source
```

Based on this function, you can create another function to iterate over a list of time series external IDs and combine the results into a large table. The list can be a column in another table, where you, for example, filter time series. You can adapt the function to iterate over a list of internal IDs or instance IDs.

```m
(
    externalIds as list,
    start as datetimezone,
    end as datetimezone,
    aggregates as text,
    granularity as text,
    optional targetUnitSystem as nullable text,
    optional timeZone as nullable text
) =>
let
    // Iterate over each externalId and get corresponding table
    TablesList = List.Transform(
        externalIds,
        each RetrieveDataPoints(
            [ externalId = _ ],
            start,
            end,
            aggregates,
            granularity,
            null,
            targetUnitSystem,
            timeZone
        )
    ),
    // Combine all tables into one
    CombinedTable = Table.Combine(TablesList)
in
    CombinedTable
```

To use the function (assuming you've named it RetrieveDataPointsMultipleTs):

```m
let
    Source = RetrieveDataPointsMultipleTs(
        {"EVE-TI-FORNEBU-01-2", "EVE-TI-FORNEBU-01-3"},
        #datetimezone(2024, 10, 1, 0, 0, 0, 2, 0),
        #datetimezone(2024, 10, 13, 10, 0, 0, 2, 0),
        "average,max,min",
        "1d",
        "SI",
        "Europe/Oslo"
    )
in
    Source
```
