
FinOps Blog

What’s new in FinOps toolkit 0.8 – February 2025

flanakin
Mar 12, 2025

Whether you consider yourself a FinOps practitioner, are enthusiastic about driving cloud efficiency and maximizing the value you get from the cloud, or were just asked to look at ways to reduce cost, the FinOps toolkit has something for you. This month, you'll find a complete refresh of Power BI with a new design, greatly improved performance, and the ability to calculate reservation savings for both EA and MCA accounts; FinOps hubs have a new Data Explorer dashboard and a simpler public networking architecture; and there are many more small updates and improvements across the board. Read on for details!


New to the FinOps toolkit?

In case you haven't heard, the FinOps toolkit is an open-source collection of tools and resources that help you learn, adopt, and implement FinOps in the Microsoft Cloud. The foundation of the toolkit is the Implementing FinOps guide that helps you get started with FinOps whether you're using native tools in the Azure portal, looking for ways to automate and extend those tools, or if you're looking to build your own FinOps tools and reports. To learn more about the toolkit, how to provide feedback, or how to contribute, see FinOps toolkit documentation.

Website refresh with documentation on Microsoft Learn

Before we get into each of the tool updates, I want to take a quick moment to call out an update to the FinOps toolkit website, which many of you are familiar with. Over the last few months, you may have noticed that we started moving documentation to Microsoft Learn. With that content migration complete, we simplified the FinOps toolkit website to provide high-level details about each of the tools with links out to the documentation as needed. Nothing major here, but it is a small update that we hope will help you find the most relevant content faster.

If you find there's anything we can do to streamline discovery of information or improve the site in general, please don't hesitate to let us know! And, as an open-source project, we're looking for people who have React development experience to help us expand this to include deployment and management experiences as well. If interested in this or any contribution, please email us at ftk-support@microsoft.com to get involved.

Power BI report design refresh

In the 0.8 release, Power BI reports saw some of the most significant updates we've had in a while. The most obvious one is the visual design refresh, which anyone who used the previous release will be able to spot immediately after opening the latest reports.

The new reports align with the same design language we use in the Azure portal to bring a consistent, familiar experience. This starts on the redesigned Get started page for each report. The Get started page helps set context on what the report does and how to set it up.

Select the Connect your data button for details about how to configure the report, whether you haven't set it up yet or need to make a change.

If you run into any issues, select the Get help button at the bottom-right of the page for some quick troubleshooting steps. This provides some of the same steps as you'll find in the new FinOps toolkit help + support page.

Moving past the Get started page, you'll also see that each report page was updated to move the filters to the left, making a little more room for the main visuals. As part of this update, we also updated all visuals across both the storage and KQL reports to ensure they both have the latest and greatest changes.

I suppose the last thing I should call out is that every page now includes a “Give feedback” link. I'd like to encourage you to submit feedback via these links to let us know what works well and what doesn't. The feedback we collect here is an important part of how we plan and prioritize work. Alternatively, you're also welcome to create and vote on issues in our GitHub repository. Each release we'll strive to address at least one of the top 10 feedback requests, so this is a great way to let us know what's most important to you!

Calculating savings for both EA and MCA accounts

If you've ever tried to quantify cost savings or calculate Effective Savings Rate (ESR), you probably know list and contracted cost are not always available in Cost Management. Now, in FinOps toolkit 0.8, you can add these missing prices in Power BI to facilitate a more accurate and complete savings estimate.

Before I get into the specifics, I should note that there are three primary ways to connect your data to FinOps toolkit Power BI reports. You can connect reports:

  • Directly to FOCUS data exported to a storage account you created.
  • To a FinOps hub storage account ingestion container.
  • To a FinOps hub Data Explorer cluster.

Each option provides additive benefits where FinOps hubs with Data Explorer offers the best performance, scalability, and functionality, like populating missing prices to facilitate cost savings calculations. This was available in FinOps hubs 0.7, so anyone who deployed FinOps hubs with Data Explorer need only export price sheets to take advantage of the feature.

Unfortunately, storage reports didn't include the same option. That is, until the latest 0.8 release, which introduced a new Experimental: Add Missing Prices parameter. When enabled, the report combines costs and prices together to populate the missing prices and calculate more accurate savings.

Please be aware that this is labeled “experimental” because both the cost and price datasets can be large, and combining them can significantly increase your data refresh times. If you're already struggling with slow refresh times, you may want to consider using FinOps hubs with Data Explorer. In general, we recommend FinOps hubs with Data Explorer for any account that monitors over $100K in total spend. (Your time is typically more valuable than the extra $125 per month.)

To enable the feature, start by creating a Cost Management export for the price sheet. Then update parameters for your report to set the Experimental: Add Missing Prices parameter to true. Once enabled, you'll start to see additional savings from reservations. While this data is available in all reports, you can generally see savings on three pages within the Rate optimization report.
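
To illustrate the idea, here's a hypothetical Python sketch of what "adding missing prices" means conceptually: join exported price sheet rows onto cost rows that lack list or contracted prices, then derive the missing costs. The column names are standard FOCUS columns, but the join key and logic shown here are simplified assumptions; the actual report logic lives in Power Query and is more involved.

```python
def add_missing_prices(costs, prices):
    """Fill in missing list/contracted prices on cost rows by joining the
    exported price sheet. Simplified illustration; not the report's code."""
    # Assumed join key: SkuPriceId (a FOCUS column); the real join is richer.
    price_by_sku = {p["SkuPriceId"]: p for p in prices}
    for row in costs:
        price = price_by_sku.get(row.get("SkuPriceId"))
        if not price:
            continue  # No matching price sheet row; leave the cost row as-is.
        if not row.get("ListCost"):
            # Derive list cost from the exported unit price and quantity.
            row["ListUnitPrice"] = price["ListUnitPrice"]
            row["ListCost"] = row["PricingQuantity"] * price["ListUnitPrice"]
        if not row.get("ContractedCost"):
            row["ContractedUnitPrice"] = price["ContractedUnitPrice"]
            row["ContractedCost"] = row["PricingQuantity"] * price["ContractedUnitPrice"]
    return costs
```

With the missing list and contracted costs populated, savings calculations like ESR become meaningful for rows that Cost Management exported without prices.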

The Summary page shows a high-level breakdown of your cost with the details that help you quantify negotiated discount and commitment discount savings. In this release, you'll also find Effective Savings Rate (ESR), which shows your total savings compared to the list cost (what you would have paid with no discounts).
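
As a quick illustration of the metric (my own sketch, not toolkit code), ESR is simply savings expressed as a share of list cost:

```python
def effective_savings_rate(effective_cost, list_cost):
    """Effective Savings Rate: the share of list cost saved through
    negotiated and commitment discounts. 0.0 means no savings."""
    if list_cost == 0:
        return 0.0  # Avoid dividing by zero when there's no list cost.
    return (list_cost - effective_cost) / list_cost
```

For example, paying $70 in effective cost against $100 in list cost yields a 30% ESR.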

The Total savings page is new in this release and shows that same cost and savings breakdown over time.

And lastly, the Commitment discount savings page gives you the clearest picture of the fix for MCA accounts by showing the contracted cost and savings for each reservation instance.

If savings are important for your organization, try the new Add Missing Prices option and let us know how it works for you. And again, if you experience significant delays in data refresh times, consider deploying FinOps hubs with Data Explorer, our recommended solution at scale.

Performance improvements for Power BI reports

Between gradually increased load times for storage reports and learnings from the initial release of KQL reports in 0.7, we knew it was time to optimize both sets of reports. And we think you'll be pretty excited about the updates.

For those using storage reports, we introduced a new Deprecated: Perform Extra Query Optimization parameter that disables some legacy capabilities that you may not even be using:

  • Support for FOCUS 1.0-preview.
  • Tracking data quality issues with the x_SourceChanges column.
  • Fixing x_SkuTerm values to be numbers for MCA.
  • Informative x_FreeReason column to explain why a row might have no cost.
  • Unique name columns to help distinguish between multiple objects with the same display name.

Most organizations aren't using these and can safely disable this option. For now, we're leaving this option enabled by default to give people time to remove dependencies. We do plan to disable this option by default in the future and remove the option altogether to simplify the report and improve performance.

Cosmetic and informational transforms will be disabled by default in 0.9 and removed on or after July 1, 2025 to improve Power BI performance. If you rely on any of these changes, please let us know by creating an issue in GitHub to request that we extend this date or keep the changes indefinitely.

For those using KQL reports with FinOps hubs with Data Explorer, you'll notice a much more significant change. Instead of summarized queries that pull a subset of data, KQL reports now query the full dataset using a single query. This is made possible through a Power BI feature called DirectQuery, which generates and runs queries against the cluster at report time rather than importing data during a refresh. What may take hours to pull into a storage report takes seconds in KQL reports. The difference is astounding.

Let me state this more explicitly: If you're struggling with long refresh times or need to set up incremental refresh on your storage reports, you should strongly consider switching to FinOps hubs with Data Explorer. You'll get full fidelity against the entire dataset with less configuration.

Important note for organizations that spend over $100K

I've already stated this a few times, but for those skimming the announcement, I want to share that we've learned a lot over the past few months as organizations big and small moved from storage to KQL reports in Power BI. With a base cost of $130 per month, we are now recommending that any organization who needs to monitor more than $100,000 in spend should deploy FinOps hubs with Data Explorer.

While we won't remove storage as an option for those interested in a low-cost, low-setup solution, we do recognize that Data Explorer offers the best overall value to cost. And as we look at our roadmap, it's also important to note that Data Explorer will be critical as we expand to cover every FinOps capability. From allocation through unit economics, most capabilities require an analytical engine to break down, analyze, and even re-aggregate costs. At less than 0.2% of your total spend, we think you'll agree that the return is worth it. Most organizations see this as soon as they open a KQL report and it pulls data in seconds when they've been waiting for hours.

Give it a shot and let us know what you think. We're always looking for ways to improve your experience. We think this is one of the biggest ways to improve and the great thing is it's already available!

New Data Explorer dashboard for FinOps hubs

With the addition of Data Explorer in FinOps hubs 0.7, we now have access to a new reporting tool built into Azure Data Explorer and available for free to all users! Data Explorer dashboards offer a lighter-weight reporting experience that sits directly on the data layer, removing some of the complexities of Power BI reporting. Of course, Data Explorer dashboards aren't a complete replacement for Power BI. If you need to combine data from multiple sources, Power BI will still be the best option with its vast collection of connectors. This is just another option you have in your toolbelt. In fact, whether you use Power BI reports or not, we definitely recommend deploying the Data Explorer dashboard.

Deploying the dashboard is easy. You import the dashboard from a file, connect it to your database, and you're ready to go! And once you set up the dashboard, you'll find pages organized in alignment with the FinOps Framework, similar to the Power BI reports. You'll find a few extra capabilities broken out in the dashboard compared to Power BI, but the functionality is generally consistent between the two, with some slight implementation differences that leverage the benefits of each platform.

If you're familiar with the Power BI reports, you may notice that the dashboard isn't directly comparable page for page. I encourage you to explore what's available and make your own determination about which tool works best for you and your stakeholders.

Before I move on to the next topic, let me call out my favorite page in the dashboard: The Data ingestion page. Similar to Power BI, the Data ingestion page includes details about the cost of FinOps hubs, but much more interesting than that is the ingested data, which is broken down per dataset and per month. This gives you an at-a-glance view of what data you have and what you don't! This level of visibility is immensely helpful when troubleshooting data availability or even deciding when it's time to expand to cover more historical data!

Whether you choose to keep or replace your existing Power BI reports, we hope you'll try the Data Explorer dashboard and let us know what you think. It's free and easy to set up. To get started, see Configure the Data Explorer dashboard.

About the FinOps hubs data model

While on the subject of Data Explorer, I'd also like to call out some new, updated, and even deprecated KQL functions available in FinOps hubs as well as how to learn more about these and other functions and tables.

I'll start by calling out that FinOps hubs with Data Explorer established a model for data ingestion that prioritizes backward compatibility. This may not be evident now, with support only for FOCUS 1.0, but you will see it as we expand to support newer FOCUS releases. This is a lot to explain, so I won't get into it here; instead, I'll point you to where you can learn more at the end of this section.

For now, let me say that you'll find two sets of functions in the Hub database: versioned and unversioned. For instance, Costs() returns all costs with the latest supported FOCUS schema (version 1.0 today), while Costs_v1_0() will always return FOCUS 1.0 data. This means that, if we were to implement FOCUS 1.1, Costs() would return FOCUS 1.1 data and Costs_v1_0() would continue to return FOCUS 1.0, whether the data was ingested with 1.0, 1.1, or even 1.0-preview, which we continue to support.

I can cover this more in-depth in a separate blog post. There's a lot to versioning, and I'm very proud of what we're doing here to help you balance up-to-date tooling without impacting existing reports. (This is another benefit of KQL reports over storage reports.) The key takeaway is to always use versioned functions for tooling and reports that shouldn't change over time, and use unversioned functions for ad-hoc queries where you always want the latest schema.
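
To make the pattern concrete, here's a minimal Python sketch of the versioning idea. The KQL function names Costs() and Costs_v1_0() are real, but this mirror, its helper, and the single column rename it demonstrates are illustrative assumptions, not toolkit code.

```python
def costs_v1_0(rows):
    """Always return rows shaped to the FOCUS 1.0 schema, regardless of
    which schema version the data was originally ingested with."""
    return [_to_focus_1_0(row) for row in rows]

def costs(rows):
    """Unversioned entry point: returns the latest supported FOCUS schema.
    If FOCUS 1.1 support were added, this would switch to a costs_v1_1()."""
    return costs_v1_0(rows)  # Latest supported schema today is 1.0.

def _to_focus_1_0(row):
    # Example of a backward-compatible mapping: FOCUS 1.0-preview used a
    # "ChargeType" column, which FOCUS 1.0 renamed to "ChargeCategory".
    row = dict(row)
    if "ChargeType" in row and "ChargeCategory" not in row:
        row["ChargeCategory"] = row.pop("ChargeType")
    return row
```

A report pinned to costs_v1_0() keeps working when the unversioned costs() moves to a newer schema, which is the stability guarantee described above.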

Beyond these basic data access functions, we also offer 15 helper functions for common reporting needs. I won't go over them all here, but will call out a few additions, updates, and replacements.

Most importantly, we identified some performance and memory issues with the parse_resourceid() function when run at scale for large accounts. We resolved them by extracting a separate resource_type() function that looks up resource type display names. This is mostly used within internal data ingestion but is also available for your own queries. The main callout: if you experienced any memory issues during data ingestion in 0.7, please try 0.8. We're seeing some amazing performance and scale numbers with the latest update.

As you can imagine, FinOps reports use a lot of dates. And with that, date formatting is mandatory. In 0.8, we renamed the daterange() function to datestring() to better represent its capabilities and also extracted a new monthstring() function for cases when you only need the month name.

  • datestring(datetime, [datetime]) returns a formatted date or date range abbreviated based on the current date (e.g., “Jan 1”, “Jan-Feb 2025”, “Dec 15, 2024-Jan 14, 2025”).
  • monthstring(datetime, [length]) returns the name of the month at a given string length (e.g., default = “January”, 3 = “Jan”, 1 = “J”).
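
To show how these formatting rules play out, here's a rough Python approximation of the two functions (the real ones are KQL; the exact abbreviation rules, especially around edge cases, are my assumption based on the examples above):

```python
from datetime import datetime, timedelta

def monthstring(dt, length=None):
    """Month name at a given length: default "January", 3 -> "Jan", 1 -> "J"."""
    name = dt.strftime("%B")
    return name[:length] if length else name

def datestring(start, end=None, today=None):
    """Abbreviated date or date-range string, loosely mirroring the examples
    in the post. The real function's rules may differ in edge cases."""
    today = today or datetime.now()
    if end is None or start.date() == end.date():
        # Single date: omit the year when it matches the current year.
        text = f"{monthstring(start, 3)} {start.day}"
        return text if start.year == today.year else f"{text}, {start.year}"
    if start.year != end.year:
        # Range spans two years: spell out both full dates.
        return (f"{monthstring(start, 3)} {start.day}, {start.year}-"
                f"{monthstring(end, 3)} {end.day}, {end.year}")
    if start.day == 1 and (end + timedelta(days=1)).day == 1:
        # Whole calendar months: "Jan-Feb 2025".
        return f"{monthstring(start, 3)}-{monthstring(end, 3)} {end.year}"
    return (f"{monthstring(start, 3)} {start.day}-"
            f"{monthstring(end, 3)} {end.day}, {end.year}")
```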

We also updated the numberstring() function to support decimal numbers. (You can imagine how that might be important for cost reporting!)

  • numberstring(num, [abbrev]) returns a formatted string representation of the number based on a few simple rules that only show a maximum of three numbers and a magnitude abbreviation (e.g., 1234 = “1.23K”, 12345678 = “12.3M”).
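
The "maximum of three numbers" rule can be sketched in Python like this (an approximation of the described behavior, not the toolkit's KQL implementation):

```python
def numberstring(num, abbrev=True):
    """Format a number with at most three significant digits and a magnitude
    suffix, e.g. 1234 -> "1.23K", 12345678 -> "12.3M". Illustrative sketch."""
    suffixes = [(1e12, "T"), (1e9, "B"), (1e6, "M"), (1e3, "K")]
    sign = "-" if num < 0 else ""
    n = abs(num)
    if abbrev:
        for scale, suffix in suffixes:
            if n >= scale:
                scaled = n / scale
                # Keep three significant digits: 1.23, 12.3, or 123.
                if scaled < 10:
                    text = f"{scaled:.2f}"
                elif scaled < 100:
                    text = f"{scaled:.1f}"
                else:
                    text = f"{scaled:.0f}"
                return f"{sign}{text}{suffix}"
    return f"{sign}{n:g}"
```

Decimal support matters here because costs are rarely whole numbers; a value like 12.5 passes through unabbreviated.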

And of course, these are just a few of the functions we have available. To learn more about the data model available in Power BI or Data Explorer, see FinOps hubs data model. This article will share details about managed datasets in FinOps hubs, Power BI functions used in both KQL and storage reports, Power BI tables, and KQL functions available in both Power BI and Data Explorer dashboards. If you're curious about the tables, functions, and even details about how versioning works, this will be a good reference to remember.

Simplified network architecture for public routing

In 0.7, we introduced a much-anticipated feature to enable FinOps hubs with private network routing (aka, private endpoints). As part of this update, we added all FinOps hubs components into a dedicated, isolated network for increased security. And after the release, we started to receive immediate feedback from those who prefer the original public routing option from 0.6 and before, which was not hosted within an isolated network.

Based on this feedback, we updated the public routing option to exclude networking components. This update simplifies the deployment and better aligns with what most organizations are looking for when using public routing.

We also published new documentation to explain both the public and private routing options in detail. If you're curious about the differences or planning to switch to one or the other, you'll want to start with Configure private networking in FinOps hubs. Configuring private networking requires some forethought, so we recommend you engage your network admins early to streamline the setup process, including peering and routing from your VPN into the isolated FinOps hubs network.

I also want to take a quick moment to thank everyone who shared their feedback about the networking changes. This was an amazing opportunity to see our tiny open-source community come together. We rallied, discussed options openly, and pivoted our direction to align with the community's preferred design. I'm looking forward to many more open discussions and decisions like this. The FinOps toolkit is for the community, by the community, and this has never been more apparent than over the last few months. Thank you all for making this community shine!

Managing exports and hubs with PowerShell

We probably don't do a good enough job raising awareness about the FinOps toolkit PowerShell module. Every time I introduce people to it, they always come back to me glowing with feedback about how much time it saved them. And with that, we made some small tweaks based on feedback we heard from FinOps toolkit users. Specifically, we updated commands for creating and reading Cost Management exports, and deleting FinOps hubs. Let's start with exports…

The New-FinOpsCostExport command creates a new export. But it does more than a simple create call. One of the more exciting options is -Backfill, which lets you backfill up to seven years of historical data with a single call! That part isn't new, though. In 0.8, we updated New-FinOpsCostExport to also create price sheet, reservation recommendation, and reservation transaction exports, along with new options for reservation recommendations and system-assigned identities.

The Get-FinOpsCostExport command retrieves all exports on the current scope based on a set of filters. While updating other commands, we also changed it to return a more comprehensive object and renamed some of the properties to be clearer about their intent.

And just to call out another popular command: Start-FinOpsCostExport triggers a run of an existing export. This is most often used when backfilling FinOps hubs but works in any scenario. It's also what New-FinOpsCostExport uses under the covers.

Lastly, we were asked to improve the confirmation experience for the Remove-FinOpsHub command (#1187). Now, the command shows the list of resources that will be deleted before confirming the deletion. Simple, but helpful.

There's a lot we can do with PowerShell. So much it's hard to know where to go next. If you find yourself looking for anything in particular, please don't hesitate to let us know! We're generally waiting for a signal from people like you who need not just automation scripts, but any tools in the FinOps space. So if you find something missing, create an issue to let us know how we can help!

Other new and noteworthy updates

Many small improvements and bug fixes go into each release, so covering everything in detail can be a lot to take in. But I do want to call out a few other small things that you may be interested in.

In FinOps hubs:

  • Cleaned up ResourceType values that have internal resource type IDs (for example, microsoft.compute/virtualmachines).
  • Updated the default setting for Data Explorer trusted external tenants from “All tenants” to “My tenant only”.
    • This change may cause breaking issues for Data Explorer clusters accessed by users from external tenants.
  • Updated CommitmentDiscountUsage_transform_v1_0() to use parse_resourceid().
  • Documentation updates covering required permissions and supported datasets.
  • Fixed timezones for Data Factory triggers to resolve an issue where triggers would not start due to an unrecognized timezone.
  • Fixed an issue where x_ResourceType used the wrong value.
    • This fix resolves the issue for all newly ingested data.
    • To fix historical data, reingest data using the ingestion_ExecuteETL Data Factory pipeline.
  • Added missing request body to fix the false positive config_RunExportJobs pipeline validation errors in Data Factory.
  • Deprecated the monthsago() KQL function. Please use the built-in startofmonth(datetime, [offset]) function instead.
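
For anyone replacing monthsago() in their own queries, startofmonth(datetime, [offset]) is a built-in KQL function; a hedged Python equivalent of what it computes looks like this:

```python
from datetime import datetime

def startofmonth(dt, offset=0):
    """Python approximation of KQL's built-in startofmonth(datetime, [offset]):
    the first day of dt's month, shifted by `offset` months."""
    # Convert to a flat month count so negative offsets cross year boundaries.
    months = dt.year * 12 + (dt.month - 1) + offset
    return datetime(months // 12, months % 12 + 1, 1)
```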

In Power BI reports:

  • Added the Pricing units open dataset to support price sheet data cleanup.
  • Added PricingUnit and x_PricingBlockSize columns to the Prices table.
  • Added Effective Savings Rate (ESR) to Cost summary and Rate optimization reports.
  • Expanded the columns in the commitment discount purchases page and updated to show recurring purchases separately.
  • Fixed a date handling bug that resulted in a “We cannot apply operator >= to types List and Number” error (#1180). If you run into issues, set the report locale explicitly to the locale of the desired date format.

In FinOps workbooks:

  • On the Optimization workbook Commitment discounts tab, added Azure Arc Windows license management.
  • On the Optimization workbook, enabled the “Export to CSV” option on the Idle backups query.
  • On the Optimization workbook, corrected VM processor details on the Compute tab query.

In Azure optimization engine:

  • Improved multi-tenancy support with Azure Lighthouse guidance.

In open data:

  • Added 4 new region mappings to existing regions.
  • Added the “1000 TB” pricing unit.
  • Added 45 new and updated 52 existing resource types.
  • Added 4 new resource type to service mappings.

Thanking our community

As we approach the two-year anniversary of our first public release, I have to look back and acknowledge how far we've come. We all want to do more and move faster, which makes it easy to get lost in the day-to-day work our community does and lose sight of the progress we're making. There are honestly too many people to thank, so I won't go into listing everyone, but I do want to send out an extra special thank you to the non-Microsoft contributors who are making this community and its tools better.

I'll start the list off strong with Roland Krummenacher, a consultant who specializes in Azure optimization. He and his team built a tool similar to FinOps hubs and, after seeing 0.7 ship with Data Explorer, rearchitected their tool to extend FinOps hubs. Roland's team helps clients optimize their environment and build custom extensions to FinOps hubs that drive value realization. We're collaborating regularly to build a plan on how to bring some of their extensions into the toolkit. Several 0.8 improvements were made because of our collaboration with Roland.

Next up is Graham Murphy, a FinOps professional who's been using FinOps hubs since the early days. Graham has always been amazingly collaborative. He extended FinOps hubs to bring in GCP and AWS FOCUS data and often shares his experiences with the FinOps community on Slack. Graham is also part of the FOCUS project, which has proven useful as well.

Speaking of FOCUS, Brian Wyka is an engineer who provided some feedback on our FOCUS documentation. But what impressed me most is that not only did Brian give us feedback, but he also engaged deeply in our pull request to address his feedback. It was amazing to see him stick to the topic through to the end.

Similar to Graham, John Lundell is a FinOps practitioner who also extended FinOps hubs and is sharing his experiences with the community. John took the time to document his approach for using FinOps hubs to get data into Microsoft Fabric. For those interested, check out Sharing how we are enhancing the toolkit for bill-back purposes.

Eladio Rincón Herrera has been with us for over a year now. The thing that really stands out to me about Eladio is the depth of context he gives. This has helped immensely several times as we've narrowed down issues that not only he but others were facing. Eladio's engagement in our discussion forums has helped many others both directly and indirectly. It's always a pleasure to work with Eladio!

Psilantropy has also been with us for over a year. They have been quite prolific over that time as well, sharing ideas, issues, and supporting discussions across four separate tools! Their reports are always extremely detailed and immensely helpful in pinpointing the underlying problem or fully understanding the desired feature request.

And now for someone who holds a special place in my heart: Patrick K. Patrick is an architect who leveraged FinOps hubs within his organization and needed to add private endpoints. He took the time to submit a pull request to contribute those changes back to the product. This was our first major external pull request, which is what made it so special. This spun up many discussions and debates on approaches that took time to get in, but I always look back to Patrick as the one who really kickstarted the effort with that first pull request!

Of course, this isn't everyone. I had to trim the list of people down a few times to really focus on a select few. (I'm sure I'll feel guilty about skipping someone later!) And that doesn't even count all the Microsoft employees who make the FinOps toolkit successful – both in contributions and through supporting the community. I'm truly humbled when I see how this community has grown and continues to thrive! Thank you all!

What's next

As we rounded out 2024, I have to say I was quite proud of what we were able to achieve. And coming into 2025, I was expecting a lightweight initial release. But we ended up doing much more than we expected, which is great. We saw some amazing (and unexpected) improvements in this release. And while I'd love to say we're going to focus on small updates, I have to admit we have some lofty goals.

Here are a few of the things we're looking at in the coming months:

  • FinOps hubs will add support for ingesting data into Microsoft Fabric eventhouses and introduce recommendations, similar to what you see in Azure Optimization Engine and FinOps workbooks.
  • Power BI reports will add support for Microsoft Fabric lakehouses.
  • FinOps hubs and Power BI will both get updated to the latest FOCUS release.
  • FinOps workbooks will continue to get recurring updates, expand to more FinOps capabilities, and add cost from FinOps hubs.
  • Azure Optimization Engine will continue to receive small updates as we begin to bring some capabilities into FinOps hubs in upcoming releases.
  • Each release, we'll try to pick at least one of the highest voted issues (based on 👍 votes) to continue to evolve based on your feedback, so keep the feedback coming!

To learn more, check out the FinOps toolkit roadmap, and please let us know if there's anything you'd like to see in a future release. Whether you're using native products, automating and extending those products, or using custom solutions, we're here to help make FinOps easier to adopt and implement.

 

Updated Mar 10, 2025
Version 1.0