
Automate replication of row-level security from AWS Lake Formation to Amazon QuickSight

Amazon QuickSight is a cloud-powered, serverless, and embeddable business intelligence (BI) service that makes it straightforward to deliver insights to your organization. As a fully managed service, Amazon QuickSight lets you create and publish interactive dashboards that can then be accessed from different devices and embedded into your applications, portals, and websites.

When authors create datasets, build dashboards, and share them with end-users, the users see the same data as the author, unless row-level security (RLS) is enabled in the Amazon QuickSight dataset. Amazon QuickSight also provides options to pass a reader's identity to a data source using trusted identity propagation and apply RLS at the source. To learn more, see Centrally manage permissions for tables and views accessed from Amazon QuickSight with trusted identity propagation and Simplify access management with Amazon Redshift and AWS Lake Formation for users in an external identity provider.

However, there are several requirements when using trusted identity propagation with Amazon QuickSight:

  • The authentication method for Amazon QuickSight must be AWS IAM Identity Center.
  • A dataset created using trusted identity propagation must be a direct query dataset in Amazon QuickSight. QuickSight SPICE can't be used with trusted identity propagation. This is because when using SPICE, data is imported (replicated), and therefore the entitlements at the source can't be applied when readers access the dashboard.

This post outlines a solution to automatically replicate the entitlements for readers from the source (AWS Lake Formation) to Amazon QuickSight. This solution can be used even when the authentication method in Amazon QuickSight is not IAM Identity Center, and it works with both direct query and SPICE datasets in Amazon QuickSight. This lets you take advantage of the automatic scaling that comes with SPICE. Although we focus on a Lake Formation table that exists in the same account, you can extend the solution to cross-account tables as well. When extracting data filter rules for a table in another account, the execution role must have the necessary access to the tables in the other account.

Use case overview

For this post, let's consider a large financial institution that has implemented Lake Formation as its central data lake and entitlement management system. The institution aims to streamline access control and maintain a single source of truth for data permissions across its entire data ecosystem. By using Lake Formation for entitlement management, the financial institution can maintain a robust, scalable, and compliant data access control system that serves as the foundation for its data-driven operations and analytics initiatives. This approach is particularly important for maintaining compliance with financial regulations and preserving data security. The analytics team wants to build an Amazon QuickSight dashboard for the data and business teams.

Solution overview

This solution uses AWS Lake Formation and Amazon QuickSight APIs to extract, transform, and store AWS Lake Formation data filters in a format that can be used in QuickSight.

The solution has four key steps:

  1. Extract the row-level security (data filters) and the permissions on those data filters for the tables of interest from AWS Lake Formation.
  2. Transform the data filters and permissions into a format usable in Amazon QuickSight.
  3. Store the transformed rules and permissions in Amazon S3.
  4. Create a rules dataset in Amazon QuickSight.

We use the following key services: AWS Lake Formation, AWS Lambda, Amazon Simple Storage Service (Amazon S3), Amazon Athena, and Amazon QuickSight.

The following diagram illustrates the solution architecture.

Prerequisites

To implement this solution, you must have the following in place in the same account:

  1. AWS Lake Formation
  2. Amazon QuickSight
  3. AWS Identity and Access Management (IAM) permissions: make sure you have the necessary IAM permissions to perform operations across all the services mentioned in the solution overview above
  4. An AWS Lake Formation table with data filters and the right permissions
  5. Amazon QuickSight principals (users or groups)

The following sections show how to create the Amazon QuickSight groups and the AWS Lake Formation tables and data filters.

Create groups in QuickSight

Create two groups in Amazon QuickSight: QuickSight_Readers and QuickSight_Authors. For instructions, see Create a group with the QuickSight console.

You can then form the Amazon Resource Names (ARNs) of the groups as follows. These will be used when granting permission to the data filters in AWS Lake Formation.

  • arn:aws:quicksight:<>:<>:group/<>/QuickSight_Readers
  • arn:aws:quicksight:<>:<>:group/<>/QuickSight_Authors

You can also get the ARNs of the groups by running the Amazon QuickSight CLI command list-groups. The following screenshot shows the output.
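If you prefer the AWS SDK to the CLI, the following minimal sketch (assuming Python with boto3, a hypothetical account ID, and the default namespace) lists the groups and prints their ARNs:

import boto3

quicksight = boto3.client('quicksight')

# The account ID and namespace are placeholders; substitute your own values.
response = quicksight.list_groups(
    AwsAccountId='123456789012',
    Namespace='default'
)
for group in response['GroupList']:
    print(group['GroupName'], group['Arn'])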

Create a table in AWS Lake Formation

The following section is for example purposes and is not required for production use of this solution. Complete the following steps to create a table in AWS Lake Formation using sample data. In this post, the table is called saas_sales.

  1. Download the file Saas Sales.csv.
  2. Upload the file to an Amazon S3 location.
  3. Create a table in AWS Lake Formation.

Create row-level security (data filter) in AWS Lake Formation

In AWS Lake Formation, data filters are used to filter the data in a table for a user or group. Complete the following steps to create a data filter (a scripted boto3 alternative is sketched after these steps):

  1. Create a data filter called QuickSightReaderFilter on the table saas_sales. For Row-level access, enter the expression segment='Enterprise'.
  2. Grant the QuickSight_Readers group access to this data filter. Use the reader group ARN from the first step for SAML users and groups.
  3. Grant the QuickSight_Authors group full access to the table. Use the author group ARN from the first step for SAML users and groups.
  4. (Optional) Create another table called second_table, create another data filter called SecondFilter, and grant permission to the QuickSight_Readers group.
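If you prefer to script the preceding steps rather than use the console, the following sketch shows the equivalent boto3 calls for the reader filter; the account ID, Region, and database name are hypothetical placeholders, and this snippet is illustrative only (it is not part of the packaged Lambda code).

import boto3

lf = boto3.client('lakeformation')

catalog_id = '123456789012'  # hypothetical account ID
reader_group_arn = 'arn:aws:quicksight:us-east-1:123456789012:group/default/QuickSight_Readers'  # hypothetical

# Create the row-level data filter on the saas_sales table.
lf.create_data_cells_filter(
    TableData={
        'TableCatalogId': catalog_id,
        'DatabaseName': 'sales_db',  # hypothetical database name
        'TableName': 'saas_sales',
        'Name': 'QuickSightReaderFilter',
        'RowFilter': {'FilterExpression': "segment='Enterprise'"},
        'ColumnWildcard': {'ExcludedColumnNames': []}
    }
)

# Grant the readers group SELECT on the data filter.
lf.grant_permissions(
    Principal={'DataLakePrincipalIdentifier': reader_group_arn},
    Resource={
        'DataCellsFilter': {
            'TableCatalogId': catalog_id,
            'DatabaseName': 'sales_db',
            'TableName': 'saas_sales',
            'Name': 'QuickSightReaderFilter'
        }
    },
    Permissions=['SELECT']
)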

Now that you have set up the table, permissions, and data filters, you can extract the row-level access details for the QuickSight_Readers and QuickSight_Authors groups and the saas_sales table from AWS Lake Formation, and create the rules dataset in Amazon QuickSight for the saas_sales table.

Extract and transform data filters and permissions from AWS Lake Formation using a Lambda function

In AWS Lake Formation, data filters are created per table, and there can be many tables in AWS Lake Formation. However, for a team or a project, the BI developer is typically interested in only a specific set of tables. Therefore, choose a list of tables to track and update the data filters for. In a batch process, for each table in AWS Lake Formation, extract the data filter definitions and write them to Amazon S3 using the AWS Lake Formation and Amazon S3 APIs.

We use the following AWS Lake Formation APIs to extract the data filter details and permissions:

  • ListDataCellsFilter – This API is used to list all the data filters in each table that is required for the project
  • ListPermissions – This API is used to retrieve the permissions for each of the data filters extracted using the ListDataCellsFilter API

The Lambda function covers three parts of the solution:

  • Extract the data filters and the permissions on those data filters for the tables of interest from AWS Lake Formation
  • Transform the data filters and permissions into a format usable in Amazon QuickSight
  • Persist the transformed data

Complete the following steps to create the AWS Lambda function:

  1. On the Lambda console, create a function called Lake_Formation_QuickSight_RLS. Use Python 3.12 as the runtime and create a new execution role.
  2. Configure the Lambda function timeout to 2 minutes. This can vary depending on the number of tables to be parsed and the number of data filters to be transformed.
  3. Attach the following permissions to the Lambda execution role:
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "VisualEditor0",
                "Effect": "Allow",
                "Action": [
                    "lakeformation:ListDataCellsFilter",
                    "lakeformation:ListPermissions"
                ],
                "Resource": "*"
            },
            {
                "Sid": "VisualEditor1",
                "Effect": "Allow",
                "Action": "s3:PutObject",
                "Resource": "arn:aws:s3:::/*"
            }
        ]
    }

  4. Set the following environment variables for the Lambda function:

    Name           Value
    S3Bucket       S3 bucket where the output files will be stored
    tablesToTrack  List of tables to track, as a JSON array converted to a string
    Tmp            /tmp

The Lambda function gets the list of tables and the S3 bucket details from the environment variables. The list of tables is given as a JSON array converted to a string. The JSON format is shown in the following code. The values for CatalogId, DatabaseName, and Name can be fetched from the AWS Lake Formation console.

[
  {
    "CatalogId": "String",
    "DatabaseName": "String",
    "Name": "String"
  }
]
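For example, the tablesToTrack value for the saas_sales table used in this post could look like the following string (the account ID and database name are hypothetical placeholders):

[{"CatalogId": "123456789012", "DatabaseName": "sales_db", "Name": "saas_sales"}]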

  5. Add a folder named tmp.
  6. Download the zip file Lake_Formation_QuickSight_RLS.zip.
    Note: This is sample code for non-production usage. You should work with your security and legal teams to meet your organizational security, regulatory, and compliance requirements before deployment.
  7. For the Lambda function code, upload the downloaded .zip file to the Lambda function on the Code tab.
  8. Provide the necessary access to the execution role in AWS Lake Formation. Although IAM permissions are attached to the Lambda execution role, explicit permission must also be granted to the role in AWS Lake Formation for the Lambda function to read the details about the data filters; granting it read-only administrator access keeps that permission minimal. For more details, see Viewing data filters.
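One way to script this last grant is to add the execution role as a Lake Formation read-only administrator. The following sketch assumes boto3 and a hypothetical role ARN; because PutDataLakeSettings replaces the entire settings object, it reads the current settings first and appends the role.

import boto3

lf = boto3.client('lakeformation')
role_arn = 'arn:aws:iam::123456789012:role/Lake_Formation_QuickSight_RLS-role'  # hypothetical

# Read the current data lake settings, add the role as a read-only admin, and write the settings back.
settings = lf.get_data_lake_settings()['DataLakeSettings']
read_only_admins = settings.get('ReadOnlyAdmins', [])
if not any(a['DataLakePrincipalIdentifier'] == role_arn for a in read_only_admins):
    read_only_admins.append({'DataLakePrincipalIdentifier': role_arn})
settings['ReadOnlyAdmins'] = read_only_admins
lf.put_data_lake_settings(DataLakeSettings=settings)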

In the following sections, we explain what the Lambda function code does in more detail.

Extract data filters and permissions for data filters and tables in AWS Lake Formation

The first flow of the code takes the list of tables as input and extracts the table and data filter permissions as well as the data filter rules. The approach is to get the permissions both for the full table and for the data filters applied to the table. This way, both full access (table level) and partial access (data filter) can be extracted.

...
....
tablesToTrack = json.loads(os.environ["tablesToTrack"])
lf_client = boto3.client('lakeformation')

# For each table in the list, get the data filter rules attached to the table.
for table in tablesToTrack:
    df_response = lf_client.list_data_cells_filter(
        Table=table
    )
    d_filters += df_response["DataCellsFilters"]

    # Also, for each table in the list, get the list of permissions at the table level.
    # This determines who has access to all rows in the table.
    tresponse = lf_client.list_permissions(
        Resource={
            "Table": table
        }
    )
    d_permissions += tresponse["PrincipalResourcePermissions"]

transformDataFilterRules(d_filters)

# For each data filter fetched above, get the permissions.
# This determines the row-level security for the tables.
for filter in d_filters:
    p_response = lf_client.list_permissions(
        Resource={
            "DataCellsFilter": {
                "DatabaseName": filter["DatabaseName"],
                "Name": filter["Name"],
                "TableCatalogId": filter["TableCatalogId"],
                "TableName": filter["TableName"]
            }
        }
    )
    d_permissions += p_response["PrincipalResourcePermissions"]

transformFilterandTablePermissions(d_permissions)

Transform data filter definitions into a format usable in Amazon QuickSight

The extracted permissions and filters are transformed to create a rules dataset in Amazon QuickSight. There are different ways to define data filters. The following figure illustrates some example transformations.

The function transformDataFilterRules in the following code can transform some of the OR and AND conditions into a format acceptable to Amazon QuickSight. The following details are available in the transformed format:

  • Lake Formation catalog ID
  • Lake Formation database name
  • Lake Formation table name
  • Lake Formation data filter name
  • List of columns from all the tables provided in the input for which the data filter rules are defined

See the following code:

def transformDataFilterRules(rules):
    global complete_transformed_filter_rules
    transformed_filter_rules = []
    filter_to_extract = []
    complete_transformed_filter_rules = []
    # The first four output columns identify the filter; additional columns are
    # added dynamically, one per column referenced in a filter expression.
    col_headers = []
    col_headers.append("catalog")
    col_headers.append("database")
    col_headers.append("table")
    col_headers.append("filter")

    for rule in rules:
        print(rule)
        catalog = rule["TableCatalogId"]
        database = rule["DatabaseName"]
        table = rule["TableName"]
        filter = rule["Name"]
        row = []
        row.append(catalog)
        row.append(database)
        row.append(table)
        row.append(filter)
        logger.info(f"row==={row}")

        # Split the filter expression into individual conditions.
        f_conditions = re.split(' OR | or | and | AND ', rule["RowFilter"]["FilterExpression"])

        for f_condition in f_conditions:
            logger.info(f"f_condition={f_condition}")
            f_condition = f_condition.replace("(", "")
            f_condition = f_condition.replace(")", "")
            filter_rule_column = f_condition.split("=")
            if len(filter_rule_column) > 1:
                filter_rule_column[0] = filter_rule_column[0].strip()
                if not filter_rule_column[0].strip() in col_headers:
                    col_headers.append(filter_rule_column[0].strip())
                i = col_headers.index(filter_rule_column[0].strip())
                j = i - (len(row) - 1)
                if j > 0:
                    for x in range(1, j):
                        row.append("")
                logger.info(f"i={i} j={j} {filter_rule_column[1]}")
                row.insert(i, filter_rule_column[1].replace("'", ""))
                print(row)
                transformed_filter_rules.append(','.join(row))

                row = []
                row.append(catalog)
                row.append(database)
                row.append(table)
                row.append(filter)

    # Pad every row to the full set of columns and prepend the header row.
    max_columns = len(col_headers)
    complete_transformed_filter_rules = []
    for rule in transformed_filter_rules:
        r = rule.split(",")
        to_fill = max_columns - len(r)
        if to_fill > 0:
            for x in range(1, to_fill + 1):
                r.append("")
        complete_transformed_filter_rules.append(','.join(r))

    complete_transformed_filter_rules.insert(0, ','.join(col_headers))

The following figure is an example of the transformed file. The file contains the columns for both tables. When creating a rules dataset for a specific table, the records are filtered for that table when pulled into Amazon QuickSight.
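As a hypothetical illustration of that shape, a filter expression of segment='Enterprise' on saas_sales and country='US' on second_table would produce a file like the following (the catalog, database, and column values are placeholders):

catalog,database,table,filter,segment,country
123456789012,sales_db,saas_sales,QuickSightReaderFilter,Enterprise,
123456789012,sales_db,second_table,SecondFilter,,US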

The function transformFilterandTablePermissions in the following code snippet combines and transforms the table and data filter permissions into a flat structure that contains the following columns:

  • Amazon QuickSight group ARN
  • Lake Formation catalog ID
  • Lake Formation database name
  • Lake Formation table name
  • Lake Formation data filter name

See the following code:

def transformFilterandTablePermissions(permissions):
    global transformed_table_permissions, transformed_filter_permissions
    # Read and set table-level access
    transformed_table_permissions = []
    transformed_filter_permissions = []
    transformed_filter_permissions.insert(0, "group,catalog,database,table,filter")
    transformed_table_permissions.insert(0, "group,catalog,database,table")

    for permission in permissions:
        group = ""
        database = ""
        table = ""
        catalog = ""

        p = permission["Permissions"]

        if "DESCRIBE" in p or "SELECT" in p:

            group = permission["Principal"]["DataLakePrincipalIdentifier"]
            if "Database" in permission["Resource"]:
                catalog = permission["Resource"]["Database"]["CatalogId"]
                database = permission["Resource"]["Database"]["Name"]
                table = "*"
                transformed_table_permissions.append(group + "," + catalog + "," + database + "," + table)
                transformed_filter_permissions.append(group + "," + catalog + "," + database + "," + table)
            elif "TableWithColumns" in permission["Resource"] or "Table" in permission["Resource"]:
                if "TableWithColumns" in permission["Resource"]:
                    catalog = permission["Resource"]["TableWithColumns"]["CatalogId"]
                    database = permission["Resource"]["TableWithColumns"]["DatabaseName"]
                    table = permission["Resource"]["TableWithColumns"]["Name"]
                elif "Table" in permission["Resource"]:
                    catalog = permission["Resource"]["Table"]["CatalogId"]
                    database = permission["Resource"]["Table"]["DatabaseName"]
                    table = permission["Resource"]["Table"]["Name"]
                transformed_table_permissions.append(group + "," + catalog + "," + database + "," + table)
                transformed_filter_permissions.append(group + "," + catalog + "," + database + "," + table)
            elif "DataCellsFilter" in permission["Resource"]:
                catalog = permission["Resource"]["DataCellsFilter"]["TableCatalogId"]
                database = permission["Resource"]["DataCellsFilter"]["DatabaseName"]
                table = permission["Resource"]["DataCellsFilter"]["TableName"]
                filter = permission["Resource"]["DataCellsFilter"]["Name"]
                transformed_filter_permissions.append(group + "," + catalog + "," + database + "," + table + "," + filter)

The following figure is an example of the extracted data filter and table permissions. AWS Lake Formation can have data filters applied to any principal; however, we focus on the Amazon QuickSight principals:

  • The QuickSight_Authors ARN has full access to two tables. This is determined by transforming the table-level permissions together with the data filter permissions.
  • The QuickSight_Readers ARN has restricted access based on filter conditions.
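As a hypothetical illustration, the transformed_filter_permissions.csv file could contain rows like the following; the author row carries no filter name because the group has full table access, whereas the reader row references QuickSightReaderFilter (the ARNs, account ID, and database name are placeholders):

group,catalog,database,table,filter
arn:aws:quicksight:us-east-1:123456789012:group/default/QuickSight_Authors,123456789012,sales_db,saas_sales
arn:aws:quicksight:us-east-1:123456789012:group/default/QuickSight_Readers,123456789012,sales_db,saas_sales,QuickSightReaderFilter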

Store the transformed rules and permissions in two separate files in Amazon S3

The transformed rules and permissions are then persisted in a data store. In this solution, the transformed rules are written to an Amazon S3 location in CSV format. The names of the files created by the Lambda function are:

  • transformed_filter_permissions.csv
  • transformed_filter_rules.csv

See the following code:

with open("/tmp/transformed_table_permissions.csv", "w") as txt_file:
    for line in transformed_table_permissions:
        txt_file.write(line + "\n")  # works with any number of elements in a line

s3 = boto3.resource('s3')
s3.meta.client.upload_file(Filename="/tmp/transformed_table_permissions.csv", Bucket=os.environ['S3Bucket'], Key="table-permissions/transformed_table_permissions.csv")

with open("/tmp/transformed_filter_permissions.csv", "w") as txt_file:
    for line in transformed_filter_permissions:
        txt_file.write(line + "\n")  # works with any number of elements in a line

s3.meta.client.upload_file(Filename="/tmp/transformed_filter_permissions.csv", Bucket=os.environ['S3Bucket'], Key="filter-permissions/transformed_filter_permissions.csv")

with open("/tmp/transformed_filter_rules.csv", "w") as txt_file:
    for line in complete_transformed_filter_rules:
        txt_file.write(line + "\n")  # works with any number of elements in a line

s3.meta.client.upload_file(Filename="/tmp/transformed_filter_rules.csv", Bucket=os.environ['S3Bucket'], Key="filter-rules/transformed_filter_rules.csv")

Create a rules dataset in Amazon QuickSight

In this section, we walk through the steps to create a rules dataset in Amazon QuickSight.

Create a table in Lake Formation for the files

The first step is to create a table in AWS Lake Formation for each of the two files, transformed_filter_permissions.csv and transformed_filter_rules.csv.

Although you can use an Amazon S3 connector directly in Amazon QuickSight, creating a table and building the rules dataset through an Athena connector gives you the flexibility to write custom SQL and use direct query. For the steps to bring an Amazon S3 location into AWS Lake Formation, see Creating tables.

For this post, the tables for the files are created in a separate database called quicksight_lf_transformation.

Grant permission on the tables to the QuickSight_Authors group

Grant permission in AWS Lake Formation on the two tables to the QuickSight_Authors group. This is essential for Amazon QuickSight authors to create a rules dataset in Amazon QuickSight. The following screenshot shows the permission details.

Create a rules dataset in Amazon QuickSight

Amazon QuickSight supports both user-level and group-level RLS. In this post, we use groups to enable RLS. To create the rules dataset, you first join the filter permissions table with the filter rules table on the columns catalog, database, table, and filter. Then you filter the permissions to include only the Amazon QuickSight principals, and include only the columns required for the dataset. The objective in this solution is to build a rules dataset for the saas_sales table.

Complete the following steps:

  1. On the Amazon QuickSight console, create a new Athena dataset.
  2. Specify the following:
    1. For Catalog, choose AWSDataCatalog.
    2. For Database, choose quicksight_lf_transformation.
    3. For Table, choose filter_permissions.
  3. Choose Edit/Preview data.
  4. Choose Add data.
  5. Choose Add source.
  6. Select Athena.
  7. Specify the following:
    1. For Catalog, choose AWSDataCatalog.
    2. For Database, choose quicksight_lf_transformation.
    3. For Table, choose filter_rules.
  8. Join the permissions table with the data filter rules table on the catalog, database, table, and filter columns.
  9. Rename the column group to GroupArn. This needs to be done before the filter is applied.
  10. Filter the data where the column table equals saas_sales.
  11. Filter the data where the column group starts with arn:aws:quicksight (Amazon QuickSight principals).
  12. Exclude fields that are not part of the saas_sales table.
  13. Change Query mode to SPICE.
  14. Publish the dataset.

If your organization has a mapping of other principals to an Amazon QuickSight group or user, you can apply that mapping before joining the tables.

You can also write the following custom SQL to achieve the same result:

SELECT a."group" as GroupArn, segment FROM "quicksight_lf_transformation"."filter_permissions" as a
left join
"quicksight_lf_transformation"."filter_rules" as b
on
a.catalog = b.catalog and
a.database = b.database and
a."table" = b."table" and
a.filter = b.filter
where a."table" = 'saas_sales'
and a."group" like 'arn:aws:quicksight%'

  15. Name the dataset LakeFormationRLSDataSet and publish the dataset.

Test the row-level security

Now you're ready to test the row-level security by publishing a dashboard as a user in the QuickSight_Authors group and then viewing the dashboard as a user in the QuickSight_Readers group.

Publish a dashboard as a QuickSight_Authors group user

As an author who belongs to the QuickSight_Authors group, the user will be able to see the saas_sales table in the Athena connector and all the data in the table. As shown in this section, all three segments are visible to the author when creating an analysis and viewing the published dashboard.

  1. Create a dataset by pulling data from the saas_sales table using the Athena connector.
  2. Attach LakeFormationRLSDataSet as the RLS dataset for the saas_sales dataset (a scripted alternative is sketched after these steps). For instructions, see Using row-level security with user-based rules to restrict access to a dataset.
  3. Create an analysis using the saas_sales dataset as an author who belongs to the QuickSight_Authors group.
  4. Publish the dashboard.
  5. Share the dashboard with the QuickSight_Readers group.
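Attaching the rules dataset is typically done on the QuickSight console, but if you manage datasets through the API, a sketch along the following lines (assuming boto3, hypothetical account and dataset IDs, and that the saas_sales dataset already exists) shows where the rules dataset ARN is referenced:

import boto3

qs = boto3.client('quicksight')
account_id = '123456789012'                      # hypothetical
saas_sales_dataset_id = 'saas-sales-dataset-id'  # hypothetical
rules_dataset_arn = 'arn:aws:quicksight:us-east-1:123456789012:dataset/lakeformation-rls-dataset-id'  # hypothetical

# Re-apply the existing saas_sales dataset definition with a
# RowLevelPermissionDataSet that points at LakeFormationRLSDataSet.
ds = qs.describe_data_set(AwsAccountId=account_id, DataSetId=saas_sales_dataset_id)['DataSet']
qs.update_data_set(
    AwsAccountId=account_id,
    DataSetId=saas_sales_dataset_id,
    Name=ds['Name'],
    PhysicalTableMap=ds['PhysicalTableMap'],
    LogicalTableMap=ds.get('LogicalTableMap', {}),
    ImportMode=ds['ImportMode'],
    RowLevelPermissionDataSet={
        'Arn': rules_dataset_arn,
        'PermissionPolicy': 'GRANT_ACCESS',
        'FormatVersion': 'VERSION_1',
        'Status': 'ENABLED'
    }
)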

View the dashboard as a QuickSight_Readers group user

Complete the following steps to view the dashboard as a QuickSight_Readers group user:

  1. Log in to Amazon QuickSight as a reader who belongs to the QuickSight_Readers group.

The user will be able to see only the Enterprise segment.

  2. Now, change the RLS in AWS Lake Formation, and set the segment to SMB for the QuickSightReaderFilter.
  3. Run the Lambda function to export and transform the new data filter rules.
  4. Refresh the SPICE dataset LakeFormationRLSDataSet in Amazon QuickSight (a scripted refresh is sketched at the end of this section).
  5. When the refresh is complete, refresh the dashboard in the reader login.

Now the reader user will see the SMB data.
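If you want to script the SPICE refresh from step 4 instead of using the console, a sketch like the following (assuming boto3 and hypothetical account and dataset IDs) starts a new ingestion for the rules dataset:

import uuid
import boto3

qs = boto3.client('quicksight')

# Start a SPICE refresh (ingestion) for LakeFormationRLSDataSet.
qs.create_ingestion(
    AwsAccountId='123456789012',               # hypothetical
    DataSetId='lakeformation-rls-dataset-id',  # hypothetical
    IngestionId=str(uuid.uuid4())
)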

Cleanup

Amazon QuickSight resources

  1. Delete the Amazon QuickSight dashboard and analysis you created
  2. Delete the datasets saas_sales and LakeFormationRLSDataSet
  3. Delete the Athena data source
  4. Delete the QuickSight groups using the DeleteGroup API

AWS Lake Formation resources

  1. Delete the database quicksight_lf_transformation created in AWS Lake Formation
  2. Revoke the permissions given to the Lambda execution role
  3. Delete the saas_sales table and the data filters you created
  4. If you used an AWS Glue crawler to create the tables in AWS Lake Formation, remove the Glue crawler as well

Compute resources

  1. Delete the AWS Lambda function you created
  2. Delete the AWS Lambda execution role associated with the function

Storage resources

  1. Empty the contents of the Amazon S3 bucket created for this solution
  2. Delete the Amazon S3 bucket

Conclusion

This post explained how to automatically replicate row-level security from AWS Lake Formation to Amazon QuickSight. This makes sure that the SPICE dataset in QuickSight can use the row-level access defined in Lake Formation.

This solution can also be extended to other data sources. The logic to programmatically extract the entitlements from the source and transform them into the Amazon QuickSight format will vary by source. After the extract and transform are in place, it can scale to multiple teams in the organization. Although this post laid out a basic approach, the automation needs to be either scheduled to run periodically or triggered by events such as data filter changes or grants and revokes of AWS Lake Formation permissions, to make sure the entitlements stay in sync between AWS Lake Formation and Amazon QuickSight.
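As one possible way to schedule the sync, the following sketch (assuming boto3, an hourly schedule, and a hypothetical Lambda function ARN) creates an Amazon EventBridge rule that invokes the Lambda function periodically; an event-driven variant could instead react to Lake Formation permission changes captured through AWS CloudTrail.

import boto3

events = boto3.client('events')
lambda_arn = 'arn:aws:lambda:us-east-1:123456789012:function:Lake_Formation_QuickSight_RLS'  # hypothetical

# Create (or update) an hourly schedule and point it at the Lambda function.
events.put_rule(
    Name='lf-quicksight-rls-sync',
    ScheduleExpression='rate(1 hour)',
    State='ENABLED'
)
events.put_targets(
    Rule='lf-quicksight-rls-sync',
    Targets=[{'Id': 'rls-sync-lambda', 'Arn': lambda_arn}]
)
# Note: the Lambda function also needs a resource-based policy (lambda add-permission)
# that allows events.amazonaws.com to invoke it; that step is omitted here.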

Try out this solution for your own use case, and share your feedback in the comments.


About the Authors

Vetri Natarajan is a Specialist Solutions Architect for Amazon QuickSight. Vetri has 15 years of experience implementing enterprise business intelligence (BI) solutions and greenfield data products. Vetri specializes in integrating BI solutions with business applications and enabling data-driven decisions.

Ismael Murillo is a Solutions Architect for Amazon QuickSight. Before joining AWS, Ismael worked in Amazon Logistics (AMZL) with delivery station management and delivery service providers, working actively with customers in the field. Ismael focused on last-mile delivery and delivery success, and designed and implemented many innovative solutions to help reduce cost and improve delivery success. He is also a United States Army veteran and served for eleven years.
