Black Diamond | OneStream Solutions | https://blackdiamondadvisory.com

How Many Connector Rules Do We Really Need? | https://blackdiamondadvisory.com/2022/04/12/connector-business-rules/ | Tue, 12 Apr 2022


How Many Connector Rules Do We Really Need?

Not as many as you may think (or were taught)

Fellow OneStream enthusiast, if your list of OneStream Connector Business Rules looks like the below then this blog is for you. If it doesn’t, I hope you still find it useful or at least mildly interesting!

Figure: Why, oh why, did I not see the light earlier?

Now, I must confess, many of my early solutions looked like this: it is the common, generally taught method, and it feels like the natural way to set up Connector Business Rules, given the way we link a DataSource to a Connector Rule.

Figure: The old way, one rule per DataSource

The Epiphany

As you may be aware, I am an inherently lazy developer and have a strong preference for building reusable functions wherever possible. It was as I was creating a myriad of API connector rules for a client that I realized the error of my ways. Most of the logic in these rules is the same: declare a source database connection, call that connection, get the data, and return it to the stage.

Why, I asked myself, if I have a standard OAuth2 security-handling function in an Extender rule to facilitate standardization (and save me the hassle of rewriting it), am I copying, pasting, and tweaking the same rule over and over?

As you read through this post, you will (hopefully) come to understand why a lazy developer is a good developer: writing and testing core functionality and then reusing it means said developer produces quality solutions, does not go bonkers with boredom writing nearly identical code ad infinitum, and is now free to go on to bigger and better things. This practice actually is not laziness, it is competence. And laziness.

One Rule to Rule Them All

How do we do it?

Having had my epiphany moment, I deleted all the connector rules I had created to that point (to avoid the temptation to just continue with what I had built, and to force myself to deliver something new) and set about creating one to rule them all. For this blog I will use REST APIs returning JSON data as the example; however, the same methodology is applicable to any source type, be it API, SQL, SAP, FTP, and so on.

But how does the rule know what to connect to and what the data looks like?

There were two key questions to answer in making a single rule work: how does the rule know which DataSource is initiating it, and how do we flex the fields returned?

Getting the DataSource Name

This is easy enough: when a DataSource in OneStream (or almost anything else) calls a business rule, the rule knows what told it to do something. In a Connector rule, this information lives in the args object.

Figure: But where do we get the DataSourceName? Oh, there it is

The name of the DataSource for the import process is captured in args.DataSourceName. Problem one solved!
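Connector rules are typically written in VB.NET, but the dispatch idea is language-neutral. Here is a minimal sketch of it in Python, with made-up DataSource names and a plain dict standing in for the args object:

```python
# One connector entry point serving many DataSources: the args object
# tells us which DataSource called us, so a single rule can branch
# instead of being copied once per source.

def main(args):
    handlers = {
        "DS_TrialBalance_ERP1": lambda: "fields or data for ERP1",
        "DS_TrialBalance_ERP2": lambda: "fields or data for ERP2",
    }
    handler = handlers.get(args["DataSourceName"])
    if handler is None:
        raise ValueError(f"No handler for DataSource {args['DataSourceName']}")
    return handler()
```

In the real rule, each handler would return either the field list or the data, depending on which connector action OneStream is requesting.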

Structuring the data: data field types

My preferred method here is to use custom classes created within the connector business rule.

Figure: Give a class a home

The Example

For brevity we will show two API calls, but this methodology will work with as many as you like. Additionally, because we are only looking at two examples, I am going to keep the logic simple. As the volume of sources increases, I would advise further optimizing the code: for example, one rule to hold the logic, another to hold the classes, and a config file to hold the connection details.

I love a class

Without further ado let’s go look at one of the Trial Balance classes.

Figure: Each DataSource needs two classes

First off, you may notice I have lied to you (I will try not to do this again): we create two classes here, not the one I led you to believe.

  1. The first is a simple flat object used to provide the field list to the DataSource for mapping fields to dimensions; it is also used to create a datatable to hold the flattened JSON string returned by the API.
  2. The second represents the nested JSON structure returned by the API; this is how our friend Newtonsoft.Json knows how to deserialize the nested JSON string. I may have forgotten to mention that we need additional References to make all this work.
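In the rule itself these are VB.NET classes deserialized by Newtonsoft.Json; the same two-class split can be sketched in Python (class and field names here are invented for illustration):

```python
import json
from dataclasses import dataclass

# Class 1: the flat shape -- one member per field we expose to the
# DataSource and per column of the staging datatable.
@dataclass
class TrialBalanceRow:
    Company: str
    Account: str
    Amount: float

# Class 2: mirrors the nested JSON envelope the API actually returns,
# playing the role a Newtonsoft.Json target class plays in VB.NET.
@dataclass
class TrialBalanceResponse:
    company: str
    lines: list  # each item: {"account": ..., "amount": ...}

def deserialize(payload: str) -> TrialBalanceResponse:
    raw = json.loads(payload)
    return TrialBalanceResponse(company=raw["company"], lines=raw["lines"])

def flatten(resp: TrialBalanceResponse) -> list:
    # Nested response -> one flat row per line item.
    return [TrialBalanceRow(resp.company, line["account"], line["amount"])
            for line in resp.lines]
```

The flat class drives both the field list and the row layout, so the two shapes can never drift apart.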

Figure: The additional References we need for this connector methodology

The key ones I will explain here are Newtonsoft.Json (explained above), Microsoft.Identity.Client (required if you have an OAuth2-protected API), and the last two, which were the beginnings of my epiphany moment: a Business Rule in OneStream’s Extender section that I use to manage all OAuth2 security token calls.

Getting the fields

To get the field list for the DataSource we just need to utilize the class we created and return the field names to the Main function. Here you can see our first use of args.DataSourceName, which is used to return the correct class object for the DataSource. Then we use a For-Each loop to walk the class, add the field names to a List, and return it:

Figure: Getting the fields from our classes

Et voilà, we have field names in our DataSource to map to our Source Dimensions.
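Sketched in Python, with `dataclasses.fields` standing in for .NET reflection (the class and DataSource names are invented):

```python
from dataclasses import dataclass, fields

@dataclass
class TrialBalanceRow:   # the flat class for one DataSource
    Company: str
    Account: str
    Amount: float

# args.DataSourceName picks the class; reflection supplies the names.
FLAT_CLASSES = {"DS_TrialBalance_ERP1": TrialBalanceRow}

def get_field_list(data_source_name: str) -> list:
    cls = FLAT_CLASSES[data_source_name]
    return [f.name for f in fields(cls)]
```

Because the names come straight off the class, adding a field to the class automatically adds it to the DataSource's field list.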

Figure: We have fields to map!

The URL

Often a URL will have additional components required to identify or filter the records we require. Below is an example of how this could look.

https://apicalls.mydomain.com/SendTBFromERP1ToOneStream?company=myCompanyID&postingEndDate=myPeriodEndDate

The company and posting-date parameters may vary, may not exist, or may have additional elements. We vary these with a simple dictionary, passing in the items the API needs to accept our call.

Figure: Handling varying URL components

This will generate URL strings like the ones below:

  1. https://apicalls.mydomain.com/SendTBFromERP1ToOneStream?Company=BlackDiamondAdvisory&PostingEndDate=20221231
  2. https://apicalls.mydomain.com/SendTBFromERP2ToOneStream?Company=BlackDiamondAdvisoryUKLtd&StartDate=20220101&EndDate=20221231&Fixed=True
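A sketch of that dictionary approach in Python (the endpoints and parameter values are the made-up examples from above):

```python
from urllib.parse import urlencode

# Per-DataSource endpoint and query parameters, kept in plain dicts.
SOURCES = {
    "DS_TrialBalance_ERP1": (
        "https://apicalls.mydomain.com/SendTBFromERP1ToOneStream",
        {"Company": "BlackDiamondAdvisory", "PostingEndDate": "20221231"},
    ),
    "DS_TrialBalance_ERP2": (
        "https://apicalls.mydomain.com/SendTBFromERP2ToOneStream",
        {"Company": "BlackDiamondAdvisoryUKLtd", "StartDate": "20220101",
         "EndDate": "20221231", "Fixed": "True"},
    ),
}

def build_url(data_source_name: str) -> str:
    base, params = SOURCES[data_source_name]
    return f"{base}?{urlencode(params)}"
```

Adding a new source is then just a new dictionary entry, not a new rule.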

The Data

Finally, the data, which is why we’re all here I assume.

For each DataSource we use the relevant class object to manage deconstructing the nested JSON data into a flat data string. After that, all we need to do is create a datatable, put the data in it, and return it to the stage.

Figure: Get me some data!
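Put together, the per-DataSource data step is only a few reusable lines. A Python sketch with the HTTP call stubbed out and a list of tuples standing in for the staging datatable:

```python
# Sketch of the shared data step: fetch the JSON, deserialize it,
# flatten it, and hand back a "datatable" (here, a list of tuples).
# fetch_json is a stub; the real rule makes the actual HTTP call.

def fetch_json(url: str) -> dict:
    return {"company": "BDA", "lines": [{"account": "4000", "amount": 125.0}]}

def get_data_table(url: str) -> list:
    raw = fetch_json(url)
    # One flat row per nested line item, columns matching the field list.
    return [(raw["company"], line["account"], line["amount"])
            for line in raw["lines"]]
```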

The Conclusion

As always with OneStream there are at least five ways to do anything. The above is just one illustration of simplifying Connector Rules.

A key factor in making the above methodology a success is to start by identifying the source databases, URL endpoints, and/or fileservers needed for data. Do take into consideration that there may be additional sources required after your initial build is in Production! Categorize these into sources that make sense to group together and go from there. The categorization is completely up to you; my recommendation is to think along the lines of source type (for example, REST API versus SQL) and/or source endpoint. For example, if you have 10 imports from multiple systems (TB, FA, AP, AR, etc.), it may make sense to create one rule per system.

One thing is for sure, save yourself time and stop creating a rule for each DataSource!

No Calc To Calc | https://blackdiamondadvisory.com/2022/03/29/no-calc-to-calc/ | Tue, 29 Mar 2022


The Gateway to DM Sequences via Workflow Profiles – “No Calculate”

Have you ever wondered what you can do with the calculation type “No Calculate” when setting up a Calculation Definition in your Workflow Profile? It’s sort of odd: Calculation Definition implies a calculation; No Calculate does precisely the opposite. “No Calculate”, despite its name, is a powerful technique that can kick off a Data Management Sequence (to, um, calculate) from a Workflow Profile Process step, which opens the world to endless types of different scenarios (or at least calculations running after data loads).


Use Case

In this example, we will be loading new growth factors and executing a business rule to calculate forecasted dollar amounts. The key here is that the calculation ought to fire directly after the data load.

Pre-Reqs

1. A Workflow Profile Child that at a minimum has a “Process” step in it. In my example, we will use the “Import, Validate, Process” Workflow name.


2. You will need to have a Data Management Sequence set up and working (and of course one that calls a Business Rule). In this example, I have created a DM Sequence and Step called “NoCalc_Demo”. Spaces are allowed if you wish!


3. You’ll need to have an Entity to run this job from. I have created an entity called “NoCalc” to launch this rule. While you can use this entity in the DM Step, this is not always the entity that will be calculated in the DM Step. A separate, calculation-only Entity is a good idea because it segregates calculations that run on load from the real consolidation process.


4. Lastly, you will need a Data Quality Event Handler. If you do not already have this Business Rule created, navigate to Application->Tools->Business Rules and create a new rule with a type of “Data Quality Event Handler”. The name of the rule – “DataQualityEventHandler” – is assigned automatically on creation of the rule and cannot be changed. You will only ever have one of this rule type.


Now you will need to place some code into the business rule to identify a “No Calculate” calculation type. Two key areas in the business rule that I would like to call out are on lines 45 and 67 below: the first identifies the No Calculate process type in the background (automagically), and the second starts the Data Management Sequence with a filter value (this will come into play in a later step).

Neither BDA nor I will take any credit for writing this rule, as it was passed down from the Legend himself. No changes are necessary to this business rule.
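For readers who just want the shape of it, the handler's control flow boils down to a few lines. This is Python pseudocode of that flow, not the actual rule (the event fields and names here are invented):

```python
# Control-flow sketch of the DataQualityEventHandler: when the Process
# step raises a "No Calculate" calc type, start the Data Management
# sequence named in the step's Filter Value.

def handle_event(event, start_sequence):
    if event["operation"] == "ProcessCube" and event["calc_type"] == "NoCalculate":
        start_sequence(event["filter_value"])  # Filter Value = DM Sequence name
        return True   # handled
    return False
```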

Figure: The DataQualityEventHandler rule

Where To Find It If You’re Not Keen on Lots of Typing

The above is a bit hard to read and isn’t populated when the rule is created. It is available in GolfStream and is a direct copy and paste (be sure to do a validate just in case). There’s only one DataQualityEventHandler rule so it’ll not be hard to find; look for it under the Extensibility Rules hierarchy.

Connect the Dots

At this point, we have all the necessary components to kick off the DM Sequence from our WF Profile. The linkage between the Workflow Profile Child and Data Management can now be done in just a few easy steps:

1. Within the WF Profile Calculation Definition setup you will need to add an entry with the following details:

    • Entity – NoCalc (Step 3 from the Pre-Reqs)
    • Parent – You can leave this at Unassigned
    • Calc Type – No Calculate – hence the reason you are reading this blog! Seriously, this is not terribly intuitive, but this really and truly is the Calc Type.
    • Scenario Filter Type – If you would like to only run for certain Scenario Types you can select a value.
    • Confirmed – Checkbox Y/N value. In this example we will leave it blank.
    • Order – Defined order of operations that will be unique to your use case. In this case I only have one step so I will leave it at 0.
    • Filter Value – Must equal the Data Management Sequence name. Check line 67 of Tom’s code to see that this value is what is being started in the DM Sequence. You must type in the name of the Sequence name directly so either have a good memory or write it down on a piece of paper; there is no dropdown control to give you the names of the Sequences.


2. You are done!

Final Product

Once we have completed the setup, we can go ahead and step through the process and test it out. So here we go…

1. Imported data successfully.

2. Validated data successfully.

3. Loaded and processed the cube successfully.

4. Data Management Sequence executed successfully, as shown in the Task Activity monitor.

We can also review our cube views to make sure the numbers are calculated as anticipated post-process. For this demo, we have the original data plus a growth factor, and then a second slice below it that contains the calculated data.

Pre Process

Figure: Data before processing

Post Process

Figure: Data after processing

Automation Nirvana Achieved

The requirement to run a calculation (or a series of calculations) after a data load is a common use case. That calculation must be run manually unless the No Calc-to-Calculate Workflow Process properties are defined and the (written for you on creation) DataQualityEventHandler rule is created.

With those two components in hand, with a simple and reusable Business Rule you can unleash the power of OneStream’s Workflow engine to create simple and integrated processes! What could possibly be easier?

Keeping Your Balance With Unbalanced Math | https://blackdiamondadvisory.com/2022/03/15/keeping-your-balance-with-unbalanced-math/ | Tue, 15 Mar 2022

The Riddle of the Docs

Gentle Reader, if you – as your author most definitely was when he first researched it – are somewhat puzzled when reading about Unbalanced Math, this post is for you: it is a powerful method that illustrates OneStream’s flexibility and utility, prevents some pretty dramatic error messages, and stops you from writing stupendously ill-thought-out, nonoptimal solutions to simple problems. Seriously, if you write Finance Business Rules, you need to understand this.

Oh sure, I know all of this now, but as I first read through the documentation, I simply couldn’t figure it out. Why is it broken out separately from plain old api.Data.Calculate? “Unbalanced” is such an odd name. How do I use it? In short: what on earth are they going on about in the Design and Reference Guide, and why should I care?

I’m sad to relate that as I read (and re-read) on, I became more confused. In particular, the references to Data Buffers just seemed…odd. I had to – as with practically everything else that involves OneStream and Yr. Obt. Svt. – figure this out by actually using it in a concrete, functional use case of my own.

TL;DR, but you should

“Unbalanced” simply means that when a target member tuple in an api.Data.Calculate statement does not mirror the dimensions in the source member tuples, it is unbalanced in its dimensionality; when the tuples – which translate to Data Buffers – are unbalanced, a standard calculation will fail.

In Itsy Bitsy Words

A simple example is a centrally stored rate that is applied to multiple Entities and UD members. In pseudocode, it would look like:

Distribution = Sales * Distribution_Rate at No Geography at No Product

This looks simple enough. Sales is in multiple States (sorry, international readers) and Products. There is a single rate for the entire country by month. Multiplying Sales by that centrally stored expense rate for all States and Products should be a trifle.

Balanced Tuple Dimensions

A#Sales could be at O#Import and O#Forms, so O#BeforeAdj will be the Origin dimension member. These rate calculations only make sense at V#Periodic so that too will be explicitly set in the method. All calculations go to O#Forms. So far, so good.

First Pass, But Fails

That formula might look something like this:

Note that the U1#Total_Products.Base member filter will apply A#Distribution’s calculated results to every Product where A#Sales exists.

Also note that U1#No_Product is in A#Distribution_Rate’s tuple but not in the target A#Distribution’s tuple; this is no accident as the calculation must write to many Products using a rate stored at a single calculation-only driver Product.

Data Management

The super simple Data Management job cycles through just two States – enough for illustrative purposes and no more.


The data is equally simple, as is the Excel proof-of-concept math.

Seriously, this is x = y * z. How hard could this be?

Harder Than Anticipated

Figure: A suitably dramatic error

No, OneStream doesn’t throw an error quite like that, but it’s almost as bad:

Figure: The actual error message

Ouch.

Despite the error message’s length, OneStream is pretty clear about the issue: A#Distribution_Rate has an explicit U1#No_Product member definition and A#Distribution does not; the member filter of U1#Total_Products.Base does not balance U1.

The error message states that either a specific target U1 member should be used or U1#All could apply the calculated results to, well, all U1 members.

Please Do Not Do This

U1#All will work in this very specific context:

Ta da:

Figure: The calculated results

Don’t, Just Don’t

There are warnings in the Design and Reference Guide to be very, very, very careful when using the All keyword because it can lead to data – perhaps quite a lot of data – being in places neither you nor anyone else might expect which is generally viewed as a Bad Thing. OneStream are not shy about pointing this out:

Figure: The Design and Reference Guide’s warning about the All keyword

The author in your, um, author appreciates the “please do not do this” note which he suspects came from Product Support or Product Development or practically everyone who works for OneStream.

Having shown you the wrong way to do this, let’s try the right way: Unbalanced Math.

The Four Faces of Unbalanced Math

There are four unbalanced functions: AddUnbalanced, SubtractUnbalanced, DivideUnbalanced, and MultiplyUnbalanced, the last of which is the focus of this blog post. See the Design and Reference Guide for more detail on the first three.

I think of the functions as following (super roughly) this pattern:

x = y times z (where z is out of balance with x), plus the unbalanced bits of z that aren’t mentioned in x

In this use case, x, y, and z, as well as the missing bits, must be wrapped in MultiplyUnbalanced, and of course the whole thing is encapsulated within an api.Data.Calculate statement.

In All Its Glory

What does it take? Is it as complicated as I first thought?

Repeating U1#No_Product in that third parameter is all it takes. Easy-peasy.

NB — E#No_Geography isn’t out of balance because E#Pennsylvania and E#South_Carolina as defined in the Data Management Step are implicitly in the target Data Unit tuple.

I’ve modified A#Distribution_Rate to illustrate the impact:

Figure: The modified rate and the calculated results

That’s all there is to it. Also, this prevents OneStream’s documentation team (and the rest of that company) from a deep existential despair that ensues when #All is used. Win, win.
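To see what the third parameter is doing, here is a toy model of the buffers in Python (this is not OneStream’s API, just the idea): each buffer is a dict keyed by Product, the rate lives only at No_Product, and “repeating the missing member” means applying that single stored value across every member of the balanced source.

```python
# Toy model: Sales by Product versus a rate stored only at No_Product.
sales = {"Widgets": 1000.0, "Gadgets": 500.0}
rate = {"No_Product": 0.05}

def multiply_unbalanced(target_source, rate_source, missing_member):
    # Repeat the value stored at the single driver member across every
    # member of the balanced source -- the essence of MultiplyUnbalanced.
    r = rate_source[missing_member]
    return {product: amount * r for product, amount in target_source.items()}

distribution = multiply_unbalanced(sales, rate, "No_Product")
```

A key-by-key multiply of the two dicts would find no matching keys at all; naming the missing member is what lets the single rate meet every Product.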

As Always, Easier Once Done

OneStream’s functionality can sometimes be difficult to suss out but with a bit of experimentation, it will give up its secrets. The reward is usually worth the struggle.

Rate calculations are common across all OneStream applications. If you have not yet run across a requirement to perform Unbalanced Math, you will. It’s easy and powerful. Use it, and don’t use #All.

Be seeing you.

 

Black Diamond Advisory Announces New Director in the East Region and Continues Expansion of Global OneStream Talent | https://blackdiamondadvisory.com/2022/03/09/black-diamond-advisory-announces-new-director-in-the-east-region-and-continues-expansion-of-global-onestream-talent/ | Wed, 09 Mar 2022


Black Diamond Advisory Announces New Director in the East Region and Continues Expansion of Global OneStream Talent


OneStream Diamond partner extends footprint for growth with advisory services

Philadelphia, PA January 31, 2022 (GLOBE NEWSWIRE) – Black Diamond Advisory announces its continued expansion adding a new Director to its Global Advisory team in Philadelphia. Black Diamond Advisory is the leading Global Digital Finance Transformation firm and a OneStream Diamond Partner.

Client demands for the expansion of Global Advisory services prompted this latest strategic move by Black Diamond, the firm built to transform companies by creating an industry powerhouse of top talent from the most respected leaders in OneStream technology, together with consulting leaders in digital finance transformation.

The Global Advisory team brings together individuals with extensive OneStream cross-market sector experience and a proven track-record of delivering successful large-scale OneStream projects.

Cameron Lackpour joins the Black Diamond team as Director, Center of Excellence. His 30-year Performance Management career started with Comshare’s mainframe products, Arbor’s Essbase, Hyperion’s Planning, and OneStream’s OneStream XF. Cameron was the editor in chief of the Developing Essbase Applications books, is co-author of the recently released OneStream Planning: The Why, How, and When, has spoken internationally on Performance Management, has presented at numerous technical conferences, and is an Oracle ACE Director Alumni. Cameron runs a technical blog, https://www.thetruthaboutcpm.com, and is cohost of the Epmconversations podcast, which covers Performance Management in its many facets.

“Cameron brings an unparalleled level of expertise to Black Diamond’s Center of Excellence. His perspective and experience are a welcome addition to the team,” says Randy Werder, Black Diamond CEO.

About Black Diamond Advisory

Black Diamond is the leading Global Digital Finance Transformation firm and a OneStream Diamond Partner. Our services include Financial Transformation, Change Management, Process Automation, and OneStream Solutions. We are a global partner operating in the U.S., Canada, and Europe. As a single firm with truly global capability, Black Diamond is committed to meeting the combined needs of the CFO and Controller, as well as IT and Business Unit Leaders. The firm knows that the solution to a company’s digital finance transformation is expert implementation and ongoing collaboration.

Our industry practices include Manufacturing & Industrial, Hospitality & Retail, Financial Services, Insurance, Energy & Utilities, Healthcare, Private Equity, Technology, Media & Telecommunications, Travel, Transport and Logistics.

We lead with our talent of “Experts Only” and develop a unique platform for each of our clients combining finance and operational data into interactive dashboards and real-time analytics. Our firm has a single mission of 100% Customer Success that is 100% aligned with the OneStream executive leadership.

About OneStream Software

OneStream Software provides a market-leading intelligent finance platform that reduces the complexity of financial operations. OneStream unleashes the power of finance by unifying corporate performance management (CPM) processes such as planning, financial close & consolidation, reporting and analytics through a single, extensible solution. We empower the enterprise with financial and operational insights to support faster and more informed decision-making. All in a cloud platform designed to continually evolve and scale with your organization.

OneStream is an independent software company with over 900 customers, 200 implementation partners, and over 1,000 employees; our primary mission is to deliver 100% customer success.

PRESS CONTACT 

Randy Werder
CEO, Black Diamond Advisory

T: (407)758-7382

E: rwerder@blackdiamondadvisory.com

Black Diamond Advisory Announces New Director in the South Region and Continues Expansion of Global OneStream Talent | https://blackdiamondadvisory.com/2022/03/09/black-diamond-advisory-announces-new-director-in-the-south-region-and-continues-expansion-of-global-onestream-talent/ | Wed, 09 Mar 2022


Black Diamond Advisory Announces New Director in the South Region and Continues Expansion of Global OneStream Talent


OneStream Diamond partner extends footprint for growth with Advisory Services

Charlotte, NC January 31, 2022 (GLOBE NEWSWIRE) – Black Diamond Advisory announces its continued expansion adding a new Director to its Global Advisory team in Charlotte. Black Diamond Advisory is the leading Global Digital Finance Transformation firm and a OneStream Diamond Partner.

Client demands for the expansion of Global Advisory services prompted this latest strategic move by Black Diamond, the firm built to transform companies by creating an industry powerhouse of top talent from the most respected leaders in OneStream technology, together with consulting leaders in digital finance transformation.

The Global Advisory team brings together individuals with extensive OneStream cross-market sector experience and a proven track-record of delivering successful large-scale OneStream projects.

Celvin Kattookaran joins the Black Diamond team as Director. He has over 17 years of experience in the industry, focusing on Hyperion and OneStream products. He is an Oracle ACE Director. Celvin has successfully implemented enterprise-wide performance management solutions across many industries, with heavy experience in the OneStream platform and various specialty applications. Celvin is an avid blogger and shares his ideas and utilities on his blog, cpminsights.com. He is the co-author of OneStream Planning: The Why, How and When. He is also a host on the podcast EPMConversations.com.

“The addition of Celvin brings unmatched OneStream technical depth to the Black Diamond team. His experience and insight in delivering complex solutions across industries paves the way for continued Black Diamond expansion,” says Randy Werder, Black Diamond CEO.

 

About Black Diamond Advisory

Black Diamond is the leading Global Digital Finance Transformation firm and a OneStream Diamond Partner. Our services include Financial Transformation, Change Management, Process Automation, and OneStream Solutions. We are a global partner operating in the U.S., Canada, and Europe. As a single firm with truly global capability, Black Diamond is committed to meeting the combined needs of the CFO and Controller, as well as IT and Business Unit Leaders. The firm knows that the solution to a company’s digital finance transformation is expert implementation and ongoing collaboration.

Our industry practices include Manufacturing & Industrial, Hospitality & Retail, Financial Services, Insurance, Energy & Utilities, Healthcare, Private Equity, Technology, Media & Telecommunications, Travel, Transport and Logistics.

We lead with our talent of “Experts Only” and develop a unique platform for each of our clients combining finance and operational data into interactive dashboards and real-time analytics. Our firm has a single mission of 100% Customer Success that is 100% aligned with the OneStream executive leadership.

About OneStream Software

OneStream Software provides a market-leading intelligent finance platform that reduces the complexity of financial operations. OneStream unleashes the power of finance by unifying corporate performance management (CPM) processes such as planning, financial close & consolidation, reporting and analytics through a single, extensible solution. We empower the enterprise with financial and operational insights to support faster and more informed decision-making. All in a cloud platform designed to continually evolve and scale with your organization.

OneStream is an independent software company with over 900 customers, 200 implementation partners, and over 1,000 employees; our primary mission is to deliver 100% customer success.

PRESS CONTACT 

Randy Werder
CEO, Black Diamond Advisory

T: (407)758-7382

E: rwerder@blackdiamondadvisory.com

 

How Does OneStream Store Data? https://blackdiamondadvisory.com/2022/03/01/how-does-onestream-store-data/ Tue, 01 Mar 2022 18:50:08 +0000 https://blackdiamondadvisory.com/?p=1421 A Financial Analyst Perspective Working with OneStream software for the first time, coming from a financial analyst background, I was always a bit confused by how data was stored for the reports I was creating or the hacked-together solutions I was deploying through OneStream reporting objects. I was familiar with data warehousing concepts and had...

The post How Does OneStream Store Data? appeared first on Black Diamond | OneStream Solutions.

]]>
A Financial Analyst Perspective

Working with OneStream software for the first time, coming from a financial analyst background, I was always a bit confused by how data was stored for the reports I was creating or the hacked-together solutions I was deploying through OneStream reporting objects. I was familiar with data warehousing concepts and had worked almost entirely within a Kimball-designed data warehouse (creating Power BI reports, throwing data into Excel sheets, crunching data in Python), but when I switched over to OneStream’s cube-like data solutions, it didn’t quite click how to manipulate data within OneStream until I saw the SQL tables under the hood.

It’s probably also worth noting that when you start out as a financial analyst, you tend to understand the OneStream application backwards, because the entire system is usually already built by the time they even let you in to mess around with things like Cube Views or Report Books. So hopefully, this perspective is at least somewhat relatable to those in similar roles.

Essentially, data within OneStream – except for dynamically aggregated results – is, in its purest form, stored in data records in MS SQL Server tables that use a “DataRecordYYYY” naming convention, where YYYY is the year the data is recorded in. Each row in this table has a column for each of the 18 dimensions, along with additional columns for the monthly values M1 through M12 and their corresponding statuses.

Something like below:

The extra column PartitionId is what gets used by OneStream’s in-memory engines to split up processing by EntityId
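As a rough sketch of what one of those rows looks like, here is a simplified Python model. The field names are illustrative assumptions, not OneStream’s actual column names:

```python
from dataclasses import dataclass, field

# Simplified sketch of one row in a DataRecordYYYY table.
# Field names are illustrative assumptions, not the real OneStream schema.
@dataclass
class DataRecordRow:
    partition_id: int   # used by the in-memory engines to split processing
    entity_id: int
    parent_id: int
    cons_id: int
    scenario_id: int
    view_id: int
    account_id: int
    flow_id: int
    origin_id: int
    ic_id: int
    ud_ids: tuple       # UD1..UD8 member ids
    values: list = field(default_factory=lambda: [0.0] * 12)      # M1..M12
    statuses: list = field(default_factory=lambda: ["NoData"] * 12)

row = DataRecordRow(partition_id=1, entity_id=101, parent_id=100, cons_id=1,
                    scenario_id=5, view_id=0, account_id=4001, flow_id=0,
                    origin_id=1, ic_id=0, ud_ids=(0,) * 8)
row.values[0], row.statuses[0] = 1250.0, "Input"  # a January value
print(len(row.values), row.statuses[0])
```

The key idea is that one row pins down every dimension intersection for a whole year, with the twelve month values and their statuses stored alongside it.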

Disclaimer: I am assuming you know how api.Data.Calculate() works as well as how to write Member Filters

So with that in mind, let’s look at an example of when you run an api.Data.Calculate() function within a business rule or stored formula:

api.Data.Calculate("A#SomeAccount = A#SomeOtherAccount")

When you run this api.Data.Calculate() function through a business rule or stored value formula, OneStream does some finagling in the background (using functions that manipulate the data in-memory): it looks at all existing instances of A#SomeOtherAccount and sets A#SomeAccount to A#SomeOtherAccount’s values for the data unit you’re running the function for (where a Data Unit is defined as Cube, Entity, Parent, Consolidation, Scenario, Time).

However, a data unit only covers some of the columns in the SQL table – the full listing of dimensions is the following:

  • Cube
  • Entity
  • Parent
  • Consolidation
  • Scenario
  • Time
  • View
  • Account
  • Flow
  • Origin
  • Intercompany
  • UD1 – UD8

If the above api.Data.Calculate() example were run through a Custom Calculate Data Management step executing a Finance Business Rule (with only the Data Unit defined and the POV left blank), the function would go out and look for every single row in the appropriate DataRecordYYYY table (again, for the relevant data unit) and perform the calculation for A#SomeAccount using A#SomeOtherAccount’s values for all the other dimension combinations where data exists.

The important bit here is that it will only grab the dimension combinations where data exists – for every other dimension combination that has no data, OneStream’s finance engine will not go through the entire database and copy over zeroes. To riff on this a bit more – api.Data.Calculate() can only see rows within the relevant “DataRecordYYYY” table when doing comparisons. This limits the size of the data in the tables, improves performance by exploiting sparsity, and keeps the results meaningful.
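To make that sparsity behavior concrete, here is a small Python sketch (purely illustrative, not OneStream code): cube data modeled as a sparse dictionary, where copying one account to another only visits the intersections where source rows actually exist.

```python
# Toy model: cube data as a sparse dict keyed by (account, ud1), each key
# holding 12 monthly values. This mirrors the idea that the calculation
# only visits rows that exist in the DataRecordYYYY table -- illustrative
# only, not how OneStream is actually implemented.
data = {
    ("GrossSales", "None"):     [100.0] * 12,
    ("GrossSales", "Product1"): [0.0, 0.0, 50.0] + [0.0] * 9,
    # Note: no ("GrossSales", "Product2") row exists at all.
}

def calculate_copy(data, target_acct, source_acct):
    """Copy source-account values to the target account, but only for the
    dimension intersections where a source row already exists."""
    for (acct, ud1), values in list(data.items()):
        if acct == source_acct:
            data[(target_acct, ud1)] = list(values)

calculate_copy(data, "Salary", "GrossSales")

# Salary rows were created only where GrossSales rows existed:
print(sorted(k for k in data if k[0] == "Salary"))
# → [('Salary', 'None'), ('Salary', 'Product1')]
```

No ("Salary", "Product2") row appears, because the engine never saw a source row for it – it does not walk every theoretical intersection and write zeroes.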

As a financial analyst, if you were tasked with creating some calculation and filling data for some intersection, it is extremely important that you understand which base members you’re doing the calculation for, because if you leave everything wide open like above without a fundamental understanding of how the data is structured, you could end up creating a ton of data accidentally. In the case above, the finance engine will quite literally grab every single data point related to A#SomeOtherAccount for the relevant data unit and create the necessary intersections for A#SomeAccount within the SQL table (which may or may not be what you wanted to happen; but obviously, if it’s something you’re trying to do intentionally, then go for it).

To harp on a bit more about data existing: I mean that the specific dimension combination would need at least one value populated for the year.

So, to continue with the api.Data.Calculate() example – let’s say I wanted to copy all the values in A#GrossSales to A#Salary shown in this cube view below (for some weird reason):

The second row level in this cube view is a UD1 that has several products; it serves only to put some data in a different dimension combination specific to this example:

Now, if I run this code within a finance business rule (for the relevant data unit within this cube view):

api.Data.Calculate("A#Salary = A#GrossSales")

This is what happens:

For those values to copy over to A#Salary:U1#None, data must have existed in A#GrossSales:U1#None; if it didn’t, nothing would get copied over. Immediately upon copying, a corresponding row in the relevant SQL table shows up with all dimensions specified and values populated.

If I look at the cell POV for the T#2021M1 entry for A#Salary:U1#None in this cube view:

These exact dimensions would be (must be) populated in the SQL table (as MemberIds, not names) along with every M1-M12 value and a corresponding status. You can check for yourself in System -> Database if you try this on your own instance.

Moreover, suppose I wanted to copy some data from A#GrossSales:U1#Product1 to A#Salary:U1#Product1 where data was sparsely populated like below:

In this case, the time periods that show no data will be thrown into the database as zeroes with a cell status of ‘No Data’ for January to April. From June to December, zeroes will also be put in their place, but with a status of ‘Calculated/Derived’ instead.

Since those values are tagged with a ‘No Data’ status, they show up as blanks in the cube view; the other months, which are in the database as zeroes but with a status of ‘Calculated/Derived’, show up as grey:
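The status-driven display can be sketched in a few lines of illustrative Python (the status names and the blank/grey mapping are assumptions based on the example above, not OneStream internals):

```python
# Illustrative sketch: cell status, not the stored value, decides how a
# cell renders in the cube view. Status names are assumptions based on
# the example above, not OneStream internals.
def display(value, status):
    if status == "NoData":
        return ""              # renders as a blank cell
    if status == "CalculatedDerived":
        return f"{value:g}"    # renders, but greyed out
    return f"{value:g}"        # normal input cell

cells = [(0.0, "NoData"), (75.0, "Input"), (0.0, "CalculatedDerived")]
print([display(v, s) for v, s in cells])
# → ['', '75', '0']
```

The point is that two cells can both hold a stored zero yet render differently, purely because of their status.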

This is a fairly long example using api.Data.Calculate(), but this entire process of OneStream throwing data into the respective DataRecordYYYY tables isn’t exclusive to this function – it’s what happens for anything and everything that is involved with storing cube data in OneStream (not to be confused with stage data, which has its own tables). Another example would be the input of data using the Forms Origin from a Cube View specifically set up for data entry – the same kind of thing happens.

Moreover, just because this data record table exists as a SQL object doesn’t mean you can just update M1-M12 values and statuses with a SQL update statement. OneStream deals with consolidations (and really any calculation in the application) using its own calculation engine that references the data record tables, and there’s more to it than just updating the data record table. It’s also advisable never to do this to any of the formally created OneStream SQL tables unless you absolutely know what it will do within the system; just experimenting on data like this will most likely corrupt your application. Feel free to try and break things on your own experimental application though; in fact, I actively encourage you to do so because some of this stuff just isn’t documented as well as it should be.

Warnings aside, understanding how OneStream stores its data (even on the surface like this) definitely gave me a better mental model of what was going on in the background whenever I imported data, saved it through a form, or calculated it through a business rule. I’m the type of person that needs to understand structures at their most granular level when designing solutions, so visually seeing how the data sat in the SQL database not only made it easier to design processes through business rules, but it also gave me a grounding of how I should be thinking about OneStream’s data in general (so as to combat situations where I have no idea why data isn’t showing up when I plop Member Filters into Cube Views or why data that I calculated isn’t as I expect).

There’s way more to talk about when it comes to OneStream’s data, but this should be a good starting point for those just getting into it.

 

The post How Does OneStream Store Data? appeared first on Black Diamond | OneStream Solutions.

]]>
What’s in a (Workflow) Name https://blackdiamondadvisory.com/2022/02/09/whats-in-a-workflow-name/ Wed, 09 Feb 2022 17:01:01 +0000 https://blackdiamondadvisory.com/?p=1364 It has been your author’s observation that the glue that holds OneStream applications together – Workflow – is a victim of terminological inexactitude. No, not that odious euphemism, but instead the common usage of just one term – “Workflow Profile” – for the four (arguably five) different Workflow Profile types. When we OneStream practitioners use the same word to mean many things, we confuse ourselves, make mistakes, and generally make everyone who touches the application unhappy. Happy is more fun.

The post What’s in a (Workflow) Name appeared first on Black Diamond | OneStream Solutions.

]]>

Many names, much confusion, and it’s all rather important

It has been your author’s observation that the glue that holds OneStream applications together – Workflow – is a victim of terminological inexactitude. No, not that odious euphemism, but instead the common usage of just one term – “Workflow Profile” – for the four (arguably five) different Workflow Profile types. When we OneStream practitioners use the same word to mean many things, we confuse ourselves, make mistakes, and generally make everyone who touches the application unhappy. Happy is more fun.

To avoid that state of Workflow-induced despair, we need a commonly agreed-upon taxonomy, and then we must use it. Happily, OneStream has created those Workflow types and definitions (they could not do otherwise), so the path to understanding, then, is to get all OneStream practitioners to comprehend and adhere to those taxonomical definitions. Working with a tool as sophisticated as Workflow requires terminological exactitude so that we all understand what on earth we’re talking about.

Four, just four

As noted, there are four main Workflow types: Cube Root Workflow Profile, Default Workflow Profile, Workflow Profile, and Workflow Child Profile.

How hard could it possibly be? Let’s find out.

A note

This post is not a comprehensive guide to Workflow. For that, see the OneStream Design and Reference Guide and its Workflow Guides section.

Cube Root Workflow Profile

The Cube Root Workflow Profile is defined at the Cube level by Scenario Type via Scenario Suffixes. Think of using Scenario Suffixes at the Cube level as a sort of extended (OneStream uses the term “varying”) Workflow, since it allows your application to segregate Workflow by Scenario Type.

In the above example, the Scenario Type Suffixes are Actual and Forecast. The values are arbitrary – they could be Potato or Happy, though more likely the ones shown, or Budget or LRP. Try to use something meaningful.

Missing Scenario Types

Scenario Types are good practice because they allow explicit assignments of an Entity to more than one Workflow Profile (more anon on this term). Even if there is no immediate need for them, applications have a way of growing, and it’s best to have Scenario Types in place when they do. There’s no need to assign Suffixes to each Scenario Type; just one will do as a start, expanding as many times as needed.

Naming confusion

A note about Scenario Types – don’t conflate a Scenario named “Plan” with a Scenario Type of “Plan”; the only logical and functional link is the one you, Gentle Reader, define in the tool. Although a “Plan” Scenario Type can certainly have a Suffix of “Plan” and be used in the “Plan” Scenario, it isn’t required. This sort of identical naming convention, while appealing on its face, breaks down if there is more than one Scenario that logically shares a Scenario Type, which is often the case in planning applications. Whew.

As with everything OneStream, there are many ways to approach a requirement, none of which are exactly wrong but some of which are not quite as good as others. Your application’s needs will dictate what is best.

Using the example below of three Scenario Types (Actual, Forecast, and Plan) with two different Scenario Suffixes (Actual and Forecast), when a new Cube Root Profile is created, the two Scenario Types of Actual and Forecast appear; the Workflow Scenario Type Suffix defines the Workflow Cube Root Profiles, not the Scenario Types themselves.

Creating a Cube Root Workflow Profile

The naming convention is CubeName_ScenarioType.

Clicking on either choice will create the Cube Root Workflow Profile Name. For the purposes of this post, only Sample_Forecast will be used.

It’s easy to identify in the Workflow Profile editor hierarchy as it’s at the very tippy top and has a cube icon to the left of the name:

Default Workflow Profile

Once the Cube Root Workflow Profile is created, the Default Workflow Profile Sample_Forecast_Default appears automatically.

A Default Workflow Profile connects the Cube’s Entities and Workflow itself. All Entities are by default assigned to the Workflow – note that the Entity Assignment property sheet does not exist in the Default Workflow Profile.

Some of its salient characteristics are:

  1. It is named CubeName_WorkflowSuffix_Default.
  2. It joins Cube Entities to Workflow.
  3. It cannot be deleted.
  4. Only Administrators should be able to see it.
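Taken together, the two naming conventions seen so far – CubeName_ScenarioType for the Cube Root Workflow Profile and CubeName_WorkflowSuffix_Default for the Default Workflow Profile – can be sketched as simple string templates (illustrative Python; OneStream generates these names for you):

```python
# Illustrative only: OneStream builds these names automatically; this just
# spells out the two conventions described above.
def cube_root_profile_name(cube: str, scenario_type_suffix: str) -> str:
    # Cube Root Workflow Profile: CubeName_ScenarioType
    return f"{cube}_{scenario_type_suffix}"

def default_profile_name(cube: str, workflow_suffix: str) -> str:
    # Default Workflow Profile: CubeName_WorkflowSuffix_Default
    return f"{cube}_{workflow_suffix}_Default"

print(cube_root_profile_name("Sample", "Forecast"))   # → Sample_Forecast
print(default_profile_name("Sample", "Forecast"))     # → Sample_Forecast_Default
```

Seeing the two templates side by side makes it easy to spot which profile type a name in the Workflow hierarchy refers to.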

As with all Workflow Profiles, the Workflow Child Profiles of Import, Forms, and Adj appear below the Workflow Profile name.

A click on the Import Child Profile (this is just illustrative – don’t actually use the Default Workflow Profile) shows that two Scenario Types are available: Forecast and Plan. These are the Scenario Types that share the Workflow Suffix “Forecast”.

Scenarios with Scenario Types

The Plan Scenario has a Scenario Type of “Forecast”. (Remember what I said about the potential for confusion? Here it is.)

Scenario Types are linked to the Scenario Type property in the Scenario itself. This relationship defines the Scenarios in OnePlace. Whew, again.

All you really have to know is that if a Cube’s Workflow has defined Scenario Types, the Workflow is extended and a Scenario tagged with that Scenario Type is now part of that Workflow; only Scenarios with that Scenario Type will appear in OnePlace.

Whew, again and again.

Workflow Profile

Workflow Profile Types

There are three Workflow Profile Types: Review, Base Input, and Parent Input.

As this post is written by Mr. Planning, I’ll confidently state that Base Input is the overwhelmingly most used type in planning applications, although of course Review and Parent Input are used as well; Consolidation applications are far more likely to use all three types. As the purview of this post is not All Things Workflow but instead Workflow terminology, only Base Input will be examined.

Workflow Child Profile

We have now almost reached the end of our Workflow taxonomical journey.

Note that by default, the Workflow Child Profiles of Import, Forms, and Adj have been automatically created.

Workflow Child Profile types are tied to the Origin dimension, with a fairly logical grouping of Import with Import, Forms with Forms, and Adj with AdjInput.
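That grouping can be written down as a simple lookup table (illustrative Python, just restating the mapping described above):

```python
# Mapping of Workflow Child Profile type to its Origin dimension member,
# per the grouping described above (illustrative, not a OneStream API).
CHILD_PROFILE_TO_ORIGIN = {
    "Import": "Import",
    "Forms":  "Forms",
    "Adj":    "AdjInput",
}

print(CHILD_PROFILE_TO_ORIGIN["Adj"])  # → AdjInput
```

The only non-obvious pairing is Adj, which maps to the AdjInput Origin member rather than a member named Adj.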

That’s it – we are at the bottom of the Workflow Profile tree, with the Workflow Child Profile as the most atomic element.

There is one more we-call-it-Workflow-Profile-but-really-it’s-something-else element: Workflow Names.

Workflow Names

Workflow Names are confusingly called “Default Workflow” when a Workflow Child Profile is created:

They are called Workflow Names within a Workflow Child Profile:

Think of Workflow Names as the actions that drive Workflow. Given that the property sheet for a Workflow Child Profile uses “Workflow Names”, it seems most logical to use that term when referring to the many, many, many actions (almost 60) they support. Whew, one last time.

As an example, in the Workflow Child Profile Import, I can use the traditional Import, Validate, Load Workflow Name to load data:

Or I can use Direct and change the way data is loaded into the Cube:

Do we have unanimity? Close to it? We should.

Workflow is the core structure of OneStream application data processing. Workflow is sophisticated and powerful. Its potential is great, as is its potential to go sideways if discussed and thought about incorrectly.

To use it correctly, we must mean what we say by using the right terms in the right place.

Thus:

  1. Cube Root Workflow Profiles are the topmost level of the Workflow hierarchy. They are tied to Scenario Types via Workflow Suffixes. Scenarios that have a matching Scenario Type are visible in OnePlace.
  2. The Default Workflow Profile is automatically generated when a Cube Root Workflow Profile is created. It bridges Cube Entities and Workflow. Do not use it.
  3. Below the main Cube Root Workflow Profile parent, Workflow Profiles join data, metadata, and users.
  4. Workflow Child Profiles are where users interact with Workflow, be it data loads, forms, or adjustments, via Workflow Name action types.

That’s it.

Be seeing you.

The post What’s in a (Workflow) Name appeared first on Black Diamond | OneStream Solutions.

]]>
BDA Announces New Director in Mid-West Region https://blackdiamondadvisory.com/2022/01/12/bda-announces-new-director-in-mid-west-region/ Wed, 12 Jan 2022 20:27:57 +0000 https://blackdiamondadvisory.com/?p=1355 The post BDA Announces New Director in Mid-West Region appeared first on Black Diamond | OneStream Solutions.

]]>

Black Diamond Advisory Announces New Director in the Mid-West Region and Continues Expansion of Global OneStream Talent

OneStream Diamond partner extends footprint for growth with advisory services

St. Louis, MO December 16, 2021 (GLOBE NEWSWIRE) – Black Diamond Advisory announces its continued expansion into the Mid-West, adding a new Director to its Global Advisory team in St. Louis, Missouri. Black Diamond Advisory is the leading Global Digital Finance Transformation firm and a OneStream Diamond Partner.

Client demands for the expansion of Global Advisory services prompted this latest strategic move by Black Diamond, the firm built to transform companies by creating an industry powerhouse of top talent from the most respected leaders in OneStream technology, together with consulting leaders in digital finance transformation.  

The Global Advisory team brings together individuals with extensive OneStream cross-market sector experience and a proven track-record of delivering successful large-scale OneStream projects.  

Extending the practice in North America is Michael Vannoni, who joins the Black Diamond team as Director, Global Advisory.  Michael joins us from Centene Corporation where he led the organization and teams responsible for delivering multiple successful OneStream implementations in Spain and in the US. During his tenure at Centene Michael also directed support teams, large projects, and strategic initiatives for the organization related to their ERP, CPM, EPM, and RPA platforms. Michael possesses a diverse blend of functional, technical, project management, and change management knowledge that has been cultivated during his 20+ year career in consulting and industry roles.  With experience in a wide range of industry verticals and strategy, Michael is uniquely suited to assist our clients with selection and implementation of best-in-class technology within their financial systems roadmap that positions them for long-term success.  

“We are really excited Mike is now part of our team. Mike brings industry expertise combined with OneStream and global knowledge, enabling him to be a key leader in our Global Advisory Practice,” says Randy Werder, Black Diamond CEO.

About Black Diamond Advisory
Black Diamond is the leading Global Digital Finance Transformation firm and OneStream Diamond Partner. Our services include Financial Transformation, Change Management, Process Automation, and OneStream Solutions. We are a global partner operating in the U.S., Canada, and Europe. As a single firm with truly global capability, Black Diamond is committed to meeting the combined needs of the CFO and Controller, as well as IT and Business Unit Leaders. The firm knows that the solution to a company’s digital finance transformation is expert implementation and ongoing collaboration.

Our industry practices include Manufacturing & Industrial, Hospitality & Retail, Financial Services, Insurance, Energy & Utilities, Healthcare, Private Equity, Technology, Media & Telecommunications, Travel, Transport & Logistics.

We lead with our talent of “Experts Only” and develop a unique platform for each of our clients combining finance and operational data into interactive dashboards and real-time analytics.  Our firm has a single mission of 100% Customer Success that is directly aligned with the OneStream executive leadership.  

About OneStream Software
OneStream Software provides a market-leading intelligent finance platform that reduces the complexity of financial operations. OneStream unleashes the power of finance by unifying corporate performance management (CPM) processes such as planning, financial close & consolidation, reporting and analytics through a single, extensible solution. We empower the enterprise with financial and operational insights to support faster and more informed decision-making. All in a cloud platform designed to continually evolve and scale with your organization.

OneStream is an independent software company backed by private equity investors KKR, D1 Capital Partners, Tiger Global and IGSB. With over 900 customers, 200 implementation partners and over 1000 employees, our primary mission is to deliver 100% customer success.

 

PRESS CONTACT 
Randy Werder 
CEO, Black Diamond Advisory
T: (407)758-7382
E: rwerder@blackdiamondadvisory.com 

 

The post BDA Announces New Director in Mid-West Region appeared first on Black Diamond | OneStream Solutions.

]]>
McCain Customer Webinar https://blackdiamondadvisory.com/2021/11/16/mccain-customer-webinar/ Tue, 16 Nov 2021 22:33:09 +0000 https://blackdiamondadvisory.com/?p=1340 The post McCain Customer Webinar appeared first on Black Diamond | OneStream Solutions.

]]>

McCain Customer Webinar

The post McCain Customer Webinar appeared first on Black Diamond | OneStream Solutions.

]]>
Why OneStream https://blackdiamondadvisory.com/2021/11/16/why-onestream/ Tue, 16 Nov 2021 22:30:31 +0000 https://blackdiamondadvisory.com/?p=1339 The post Why OneStream appeared first on Black Diamond | OneStream Solutions.

]]>

Why OneStream

The post Why OneStream appeared first on Black Diamond | OneStream Solutions.

]]>