Use Computed Attributes

Computed attributes allow you to create buckets that are defined by ranges of a metric: a computed attribute breaks the values of an underlying attribute into buckets at logical thresholds.

For example, you may want to group companies into three sizes based on their number of employees:

  • Small – up to 50 employees
  • Medium – 51 to 200 employees
  • Large – more than 200 employees

To do so, you use computed attributes.
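The bucketing logic behind such a computed attribute can be sketched in plain Python (the thresholds mirror the example above; the function name is illustrative):

```python
def company_size(employees: int) -> str:
    """Assign a company to a size bucket, mirroring the thresholds above."""
    if employees <= 50:
        return "Small"
    if employees <= 200:
        return "Medium"
    return "Large"

# Each company falls into exactly one bucket.
print(company_size(35))   # Small
print(company_size(120))  # Medium
print(company_size(900))  # Large
```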

Computed attributes appear together with all your attributes under the Manage tab, and you can use and manage them like other attributes (for example, apply as dashboard filters).

Computed Attributes and Underlying Metrics

Computed attributes are dynamic. When you change the definition of the metric that a computed attribute is computed from, the buckets are recalculated according to the new metric definition. For example, if you rank sales representatives into buckets by quarterly revenue and then change the metric to monthly revenue, the representatives are re-bucketed by monthly revenue.

When you upload new data, this data is grouped into the buckets in the computed attribute.

Limitations

  • Computed attributes do not appear and cannot be managed in CloudConnect logical data models.
  • You cannot update buckets in a computed attribute. To update buckets, you have to delete the computed attribute and create a new one.

Create a Computed Attribute

When you create a computed attribute, a dataset for that computed attribute is also created as a separate dataset in the logical data model.

A computed attribute can be based on only one metric and only one attribute.

Steps:

  1. Click Manage on the workspace toolbar. 

     The administration page opens.

  2. On the Data tab, click Attributes. The list of all workspace attributes opens.

  3. Click Create Computed Attribute.

     The screen for creating a computed attribute opens.

  4. Select the attribute to break into buckets.

  5. Select the metric to break the attribute by.

  6. Define the buckets and their threshold values.

  7. Enter a meaningful name for the computed attribute.

  8. Click Create Computed Attribute. The attribute is created, and the attribute details page opens. A dataset with the same name as the computed attribute is also created. You can review it on the Data Sets page (see Manage Datasets).

You can immediately start using the computed attribute.

Delete a Computed Attribute

You cannot directly delete a computed attribute. To delete a computed attribute, you have to delete its dataset. Deleting the dataset deletes both the dataset and its associated computed attribute.

Steps:

  1. Click Manage on the toolbar. The administration page opens.
  2. On the Data tab, click Data Sets. The list of all workspace datasets opens.
  3. Click the dataset that is related to the computed attribute you want to delete. By default, the dataset has the same name as the related computed attribute. The dataset details page opens.
  4. Click Delete. You are asked to confirm the deletion.
  5. Click Delete to confirm. The dataset and related computed attribute are deleted from the workspace. The list of available workspace datasets opens.

Migrate Computed Attributes between Workspaces

Computed attributes are built on attributes, metrics, and a relation between the metric values and the attribute and its values. When migrating objects (reports, dashboards) that contain computed attributes, you have to migrate the components of the computed attributes separately (by updating the logical data model and by direct object migration) and in a predefined order so that the computed attributes are correctly assembled in the target workspace.

The following procedure describes how to migrate computed attributes and the objects that use them (dashboards, reports) between workspaces. This procedure alternates between two workspaces: source and target. It may be helpful to use a text editor for staging the code that you copy and paste.

Steps:

  1. In the source workspace, go to the gray page for getting a link to view the logical data model (LDM) of the workspace with computed attributes included:

    https://secure.gooddata.com/gdc/projects/{source_workspace_id}/model/view?includeCA=true&includeDeprecated=true&includeGrain=true
    
    • includeCA=true includes computed attributes in the workspace LDM.

    • includeDeprecated=true includes deprecated attributes and facts in the workspace LDM.

    • includeGrain=true includes the Fact Table Grain in the workspace LDM.

    The link to a polling resource that displays the workspace LDM is returned.

  2. Click the polling link. The JSON structure that describes the workspace LDM is displayed. The JSON contains the modelMetadata section that confirms that the returned LDM includes computed attributes.

    "projectModel": {  
      "modelMetadata": {
        "containCA": true
      }
    ...
    
  3. Copy the section of the returned JSON structure starting with the following segment:

    {
      "projectModel": {
        ...
    

  1. In the target workspace, go to the gray page for generating a MAQL DDL script with computed attributes included:

    https://secure.gooddata.com/gdc/projects/{target_workspace_id}/model/diff?includeCA=true&includeDeprecated=true&includeGrain=true
    
    • includeCA=true includes computed attributes in the workspace LDM.

    • includeDeprecated=true includes deprecated attributes and facts in the workspace LDM.

    • includeGrain=true includes the Fact Table Grain in the workspace LDM.

  2. Paste the copied JSON structure from the source workspace into the gray page form, and click Create Diff. The link for polling for status of the diff and MAQL DDL generation is returned.

  3. Poll for the status until the gray page returns the diff and MAQL DDLs. The diff between the source workspace LDM and the target workspace LDM is represented by script operations that you can execute on the target workspace LDM to make it match the source workspace LDM (specifically, to recreate the components of the computed attributes from the source workspace in the target workspace).

  4. In the returned MAQL DDLs, locate the updateScript -> maqlDdlChunks section. For example:

    "updateScript": {
      "maqlDdlChunks": [
        "CREATE FOLDER {ffld.employee} VISUAL(TITLE \"Employee\") TYPE FACT;\nCREATE FACT {fact.employee.age} VISUAL(TITLE \"Employee Age\", FOLDER {ffld.employee}) AS {f_employee.f_age};\nALTER DATASET {dataset.employee} ADD {fact.employee.age};\nSYNCHRONIZE {dataset.employee};"
      ]
    }
    
  5. Copy and store the MAQL statement (referred to as “MAQL statement 1” later in this procedure):

    CREATE FOLDER {ffld.employee} VISUAL(TITLE \"Employee\") TYPE FACT;\nCREATE FACT {fact.employee.age} VISUAL(TITLE \"Employee Age\", FOLDER {ffld.employee}) AS {f_employee.f_age};\nALTER DATASET {dataset.employee} ADD {fact.employee.age};\nSYNCHRONIZE {dataset.employee};
    
  6. Go back to the returned MAQL DDLs, and locate the computedAttributesScript -> maqlDdlChunks section. For example:

    "computedAttributesScript": {
      "maqlDdlChunks": [
        "alter attribute {attr.comp.nPhsOoM} add relations to {attr.payroll.store} as case when {aabqCCZAJaKw} <= 1000 then {attr.comp.nPhsOoM?\"Small\"}, when {aabqCCZAJaKw} <= 2000 then {attr.comp.nPhsOoM?\"Medium\"}, when {aabqCCZAJaKw} > 2000 then {attr.comp.nPhsOoM?\"Large\"} else {attr.comp.nPhsOoM?\"\"} end"
      ]
    }
    
  7. Copy and store the MAQL statement (referred to as “MAQL statement 2” later in this procedure):

    "alter attribute {attr.comp.nPhsOoM} add relations to {attr.payroll.store} as case when {aabqCCZAJaKw} <= 1000 then {attr.comp.nPhsOoM?\"Small\"}, when {aabqCCZAJaKw} <= 2000 then {attr.comp.nPhsOoM?\"Medium\"}, when {aabqCCZAJaKw} > 2000 then {attr.comp.nPhsOoM?\"Large\"} else {attr.comp.nPhsOoM?\"\"} end
    
  8. Use a JSON unescape tool (for example, a free online one) or your text editor to make the following changes in both MAQL statement 1 and MAQL statement 2:

    1. Remove all instances of \n.
    2. Replace \" with ".

    You are now ready to apply the MAQL statement to the target workspace LDM to recreate the components of the computed attribute.

  9. In the target workspace, go to the gray page for updating the LDM:

    https://secure.gooddata.com/gdc/md/{target_workspace_id}/ldm/manage2
    
  10. Paste the fixed MAQL statement 1 into the gray page form, and click submit. The link for polling for status of the LDM update is returned.

  11. Poll for the status until the gray page returns OK as task status. The LDM of the target workspace is updated and now contains the non-metric part of the computed attributes from the source workspace (that is, the attribute that the migrated computed attribute is built on). You are now going to migrate the objects that use the computed attributes (dashboards, reports, including the metric that the migrated computed attribute uses).

  12. Read Migrate Selected Objects between Workspaces and migrate the objects that use the computed attributes from the source workspace to the target workspace. To do so, perform partial metadata export from the source workspace and partial metadata import to the target workspace. The metrics that the computed attributes are built on are migrated as part of the migrated objects. When you are done, the target workspace contains the metrics that are used in the computed attributes and the objects that use the computed attributes.

  13. In the target workspace, go to the gray page for updating the LDM again:

    https://secure.gooddata.com/gdc/md/{target_workspace_id}/ldm/manage2
    
  14. Paste the fixed MAQL statement 2 into the gray page form, and click submit. The link for polling for status of the LDM update is returned.

  15. Poll for the status until the gray page returns OK as task status. The LDM of the target workspace is updated and now contains the metric-attribute relation part of the computed attributes from the source workspace.

  16. Log in to the GoodData Portal and go to the target workspace. Verify that the migrated objects are displayed correctly, and the computed attributes are applied.
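The manual copy-and-clean work in steps 4 through 8 of the target-workspace procedure can also be scripted. The sketch below locates both maqlDdlChunks sections in the diff response (the exact nesting of the response is not shown in this procedure, so the lookup is recursive) and applies the cleanup rules. Note that parsing the response with a JSON parser already turns \" into "; the remaining step is to remove the \n separators, which the sketch replaces with spaces so adjacent statements stay separated. Function names are illustrative.

```python
def find_chunks(node, section):
    """Recursively locate `section` -> maqlDdlChunks anywhere in the
    parsed diff response."""
    if isinstance(node, dict):
        if section in node and "maqlDdlChunks" in node[section]:
            return node[section]["maqlDdlChunks"]
        for value in node.values():
            found = find_chunks(value, section)
            if found is not None:
                return found
    elif isinstance(node, list):
        for item in node:
            found = find_chunks(item, section)
            if found is not None:
                return found
    return None

def clean_statement(chunks):
    # JSON parsing has already unescaped \" into "; per the procedure,
    # the remaining newline separators are removed (replaced with
    # spaces so the semicolon-terminated statements stay separated).
    return " ".join(chunk.replace("\n", " ") for chunk in chunks).strip()
```

Usage, assuming `diff` is the parsed diff response: `maql_1 = clean_statement(find_chunks(diff, "updateScript"))` and `maql_2 = clean_statement(find_chunks(diff, "computedAttributesScript"))`.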
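Several steps in this procedure poll a gray-page link until the task completes. A generic sketch of that loop is below; the HTTP call and the completion check are injected, because performing the request needs an authenticated session and the exact JSON shape of the status response varies between gray pages and is not shown in this procedure.

```python
import time

def poll(fetch, link, is_done, interval=2.0, max_tries=30):
    """Call fetch(link) until is_done(response) is true.

    fetch performs the authenticated HTTP GET and returns the parsed
    JSON body; is_done inspects it for the OK status described in the
    procedure. Both are injected (hypothetical signatures) because the
    response shape differs between gray pages."""
    for _ in range(max_tries):
        body = fetch(link)
        if is_done(body):
            return body
        time.sleep(interval)
    raise TimeoutError(f"task at {link} did not finish in time")
```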