
How it is built: the SSAS memory report

So recently I created a new SSAS memory report that you can use to analyze the memory usage of your SSAS servers; read more on the actual report here. In this blog post we'll look at how this report was built.

First, let's look at what we need to get the information for tabular models with compatibility level 1200 or higher. The most detailed information about a tabular model can be obtained by querying the DMV DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS. This gives us plenty of data to work with, like:

- Database name
- Server name
- Table (actually the dimension name)
- Column
- Partition
- Record count
- Size
- Compression type
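As a quick sanity check, this DMV can also be queried directly against a single database, for example in an MDX query window in SSMS. This is a sketch; the column list is just a selection of the fields mentioned above:

```
select DATABASE_NAME, DIMENSION_NAME, COLUMN_ID, PARTITION_NAME,
       RECORDS_COUNT, USED_SIZE, COMPRESSION_TYPE
from $system.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS
```

SSAS DMVs support this restricted SQL syntax (a column list plus simple WHERE and ORDER BY clauses), which is the same query style used in the M expressions below.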

And more, but with the above we should be able to build a great report. Unfortunately there is one issue: this query can only be sent per database, not for the entire server. This used to be a problem with the previous memory reports, but with Power Query we can solve it. What we will do is run a second discover, "DBSCHEMA_CATALOGS", that gives us all the databases on the server. Here we also filter out any database that doesn't have compatibility level 1200 or higher. We then use the results of that query to send the DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS query per database.

So to build this, the first thing I did was create a query in Power BI Desktop that gives a list of all the databases on my server using "DBSCHEMA_CATALOGS". Unfortunately you have to specify a database on the server for each connection to SSAS, while this should be an optional argument in AnalysisServices.Database. I have asked the M folks to put it on the backlog, but I can live with this for now; this scenario is not that common.

This gives me the list of databases:


[screenshot: the list of databases returned by the query]

Next, I want the user to be able to type in the server name and database through a prompt instead of changing the M script, so I created 2 parameters:


[screenshot: the Server and Database parameters]

Next I use these 2 parameters in my M expression:

let
    Source = AnalysisServices.Database(Server, Database, [Query="select * from $system.DBSCHEMA_CATALOGS#(lf)where compatibility_level >= '1200'"]),
    #"Renamed Columns" = Table.RenameColumns(Source,{{"CATALOG_NAME", "DatabaseName"}}),
    #"Removed Columns" = Table.RemoveColumns(#"Renamed Columns",{"DATE_QUERIED", "CURRENTLY_USED", "POPULARITY"}),
    #"Added Custom" = Table.AddColumn(#"Removed Columns", "Server", each Server)
in
    #"Added Custom"

Now that was the easy part. Next I need to use the list of databases to get the information per database. I first do one manually, so I again import directly against my SSAS server using "DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS".

This gives me the information I care about for a single database on the server (I renamed and removed some columns):


[screenshot: segment information for a single database]

This is the query I used, again leveraging the parameters I defined earlier:

let
    Source = AnalysisServices.Database(Server, Database, [Query="select * from $system.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS"]),
    #"Renamed Columns" = Table.RenameColumns(Source,{{"DATABASE_NAME", "Cube"}, {"MEASURE_GROUP_NAME", "Table"}, {"PARTITION_NAME", "Partition"}}),
    #"Removed Columns" = Table.RemoveColumns(#"Renamed Columns",{"DIMENSION_NAME"}),
    #"Renamed Columns1" = Table.RenameColumns(#"Removed Columns",{{"TABLE_ID", "TableId"}, {"COLUMN_ID", "ColumnId"}, {"SEGMENT_NUMBER", "SegmentNr"}, {"TABLE_PARTITION_NUMBER", "TablePartitionNr"}, {"RECORDS_COUNT", "RecordCount"}, {"ALLOCATED_SIZE", "AllocatedSize"}, {"USED_SIZE", "UsedSize"}, {"COMPRESSION_TYPE", "CompressionType"}, {"BITS_COUNT", "BitsCount"}, {"BOOKMARK_BITS_COUNT", "BookmarkBits"}, {"VERTIPAQ_STATE", "VertiPaqState"}})
in
    #"Renamed Columns1"

Now for the interesting part: we need to call the above code for each database on the server. To do this, I create a function out of the M script above:

let
    Source = (Database as text, Server as text) => let
        Source = AnalysisServices.Database(Server, Database, [Query="select * from $system.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS"]),
        #"Renamed Columns" = Table.RenameColumns(Source,{{"DATABASE_NAME", "Cube"}, {"MEASURE_GROUP_NAME", "Table"}, {"PARTITION_NAME", "Partition"}}),
        #"Removed Columns" = Table.RemoveColumns(#"Renamed Columns",{"DIMENSION_NAME"}),
        #"Renamed Columns1" = Table.RenameColumns(#"Removed Columns",{{"TABLE_ID", "TableId"}, {"COLUMN_ID", "ColumnId"}, {"SEGMENT_NUMBER", "SegmentNr"}, {"TABLE_PARTITION_NUMBER", "TablePartitionNr"}, {"RECORDS_COUNT", "RecordCount"}, {"ALLOCATED_SIZE", "AllocatedSize"}, {"USED_SIZE", "UsedSize"}, {"COMPRESSION_TYPE", "CompressionType"}, {"BITS_COUNT", "BitsCount"}, {"BOOKMARK_BITS_COUNT", "BookmarkBits"}, {"VERTIPAQ_STATE", "VertiPaqState"}})
    in
        #"Renamed Columns1"
in
    Source

This allows me to pass in Database and Server and run the code. The function is called “fnDatabaseSize”:


[screenshot: the fnDatabaseSize function]
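To try the function out, it can also be invoked manually from a blank query. This is a sketch; the database and server names are placeholders:

```
fnDatabaseSize("MyTabularDatabase", "localhost")
```

Note the argument order matches the function signature: database first, then server.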

Now we need to pass in that information automatically from the data. I do this using the following M script:

let
    Source = Databases,
    #"Removed Columns" = Table.RemoveColumns(Source,{"DESCRIPTION", "ROLES", "DATE_MODIFIED", "COMPATIBILITY_LEVEL", "TYPE", "VERSION", "DATABASE_ID"}),
    #"Invoked Custom Function" = Table.AddColumn(#"Removed Columns", "fnDatabaseSize", each fnDatabaseSize([DatabaseName], [Server])),
    #"Expanded fnDatabaseSize" = Table.ExpandTableColumn(#"Invoked Custom Function", "fnDatabaseSize", {"AllocatedSize", "BitsCount", "BookmarkBits", "ColumnId", "CompressionType", "Cube", "CUBE_NAME", "Partition", "RecordCount", "SegmentNr", "Table", "TableId", "TablePartitionNr", "UsedSize", "VertiPaqState"}, {"AllocatedSize", "BitsCount", "BookmarkBits", "ColumnId", "CompressionType", "Cube", "CUBE_NAME", "Partition", "RecordCount", "SegmentNr", "Table", "TableId", "TablePartitionNr", "UsedSize", "VertiPaqState"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Expanded fnDatabaseSize",{{"AllocatedSize", Int64.Type}, {"BitsCount", Int64.Type}, {"BookmarkBits", Int64.Type}, {"RecordCount", Int64.Type}, {"SegmentNr", Int64.Type}, {"TablePartitionNr", Int64.Type}, {"UsedSize", Int64.Type}})
in
    #"Changed Type"

I get to these steps by doing the following in the UI. First, I base my query on the information retrieved from the Databases query, and then I remove all the columns except name and server:


[screenshot: the query with only the DatabaseName and Server columns]

And then for the magic part, in the next step I invoke a custom function:


[screenshot: the Invoke Custom Function dialog]

Fortunately, Power BI Desktop automatically recognizes the parameters in my function and asks me which column values it should use to fill them in.

This results in a new table being added for each database:


[screenshot: a nested table column added for each database]

I can now expand and clean this up to get the full results.
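With the expanded table in place, a quick sanity check, such as the total UsedSize per table, can be added as an extra grouping step. This is a sketch; the step names are assumptions based on the query above:

```
#"Grouped Rows" = Table.Group(#"Changed Type", {"DatabaseName", "Table"},
    {{"TotalUsedSize", each List.Sum([UsedSize]), Int64.Type}})
```

This kind of aggregation can of course also be done with a measure in the report itself rather than in the query.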


[screenshot: the expanded and cleaned-up results]

This gives me all the information I need for all the databases on my server. Unfortunately, you will probably have gotten an error message about data privacy or the firewall; what happens is that Power Query is trying to prevent some sort of data injection.
