Just wanted to share this in case anyone else wants to use the code. I wanted to create a custom color series for pie charts instead of using a single color or the "rainbow" default (I am looping through data to specify the series, not listing each one like in this example). So I made this expression rule:
with(
  /* Build the mirrored tail: drop the last color, drop the first color,
     then reverse what's left */
  local!halfColorSeries: reverse(
    remove(
      remove(ri!colorsList, length(ri!colorsList)),
      1
    )
  ),
  /* Append the tail so the palette "ping-pongs" (A B C D E D C B)
     instead of jumping from the last color straight back to the first */
  local!fullColorSeries: append(ri!colorsList, local!halfColorSeries),
  /* Wrap the 1-based index around the mirrored series */
  local!colorSeriesIndex: mod(ri!index - 1, length(local!fullColorSeries)) + 1,
  local!fullColorSeries[local!colorSeriesIndex]
)
For the color parameter in a!chartSeries(), call the expression rule, passing a list of hex colors as strings (ri!colorsList) and the data index (ri!index).
I used this color palette for mine to match our app's branding: {"#276bf2", "#93aefd", "#ecf1ff", "#8598c7", "#00488f"}
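For intuition, here's what the rule builds from that palette (just a trace of the locals, not code you need to run):

/* ri!colorsList:         {"#276bf2", "#93aefd", "#ecf1ff", "#8598c7", "#00488f"} */
/* local!halfColorSeries: {"#8598c7", "#ecf1ff", "#93aefd"}                       */
/* local!fullColorSeries: {"#276bf2", "#93aefd", "#ecf1ff", "#8598c7", "#00488f", */
/*                         "#8598c7", "#ecf1ff", "#93aefd"}                       */

So with more than five data points, the colors repeat in a back-and-forth pattern rather than restarting at the first color.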
a!pieChartField(
  series: {
    a!forEach(
      items: local!datasubset,
      expression: a!chartSeries(
        color: rule!GetColorByIndex(local!colors, fv!index),
        ...
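Putting it together, here's a minimal self-contained sketch (the sample data, field names, and chart label are placeholders, and rule!GetColorByIndex is just what I named the expression rule):

a!localVariables(
  /* Placeholder data standing in for local!datasubset; swap in your own query */
  local!data: {
    a!map(label: "North", value: 120),
    a!map(label: "South", value: 80),
    a!map(label: "East", value: 65),
    a!map(label: "West", value: 45),
    a!map(label: "Central", value: 30),
    a!map(label: "Remote", value: 25)
  },
  local!colors: {"#276bf2", "#93aefd", "#ecf1ff", "#8598c7", "#00488f"},
  a!pieChartField(
    label: "Sales by Region",
    series: a!forEach(
      items: local!data,
      expression: a!chartSeries(
        label: fv!item.label,
        data: fv!item.value,
        /* fv!index is 1-based, which is what the rule expects */
        color: rule!GetColorByIndex(local!colors, fv!index)
      )
    )
  )
)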
I was able to create this:
The expression rule expects a list of at least two strings and an integer; I didn't add null checks, validation, or error handling. Interested to see if it can be done in a better way. Thanks!
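If you do want guard rails, one option is a thin wrapper rule with the same inputs (a sketch; the grey fallback color is arbitrary):

if(
  or(
    isnull(ri!colorsList),
    isnull(ri!index),
    length(ri!colorsList) < 2
  ),
  /* Fallback when inputs are missing or the palette is too short */
  "#cccccc",
  rule!GetColorByIndex(ri!colorsList, ri!index)
)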
Just curious: why is the color-picking expression that complex? Below is my simpler example:
index(ri!colorList, 1 + mod(ri!index - 1, length(ri!colorList)), "#a0a0a0")
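A quick trace with a three-color list (assuming a 1-based fv!index; the third argument to index() is the fallback when the lookup fails):

/* ri!colorList: {"#276bf2", "#93aefd", "#ecf1ff"}             */
/* ri!index = 1  ->  1 + mod(0, 3) = 1  ->  "#276bf2"          */
/* ri!index = 2  ->  1 + mod(1, 3) = 2  ->  "#93aefd"          */
/* ri!index = 3  ->  1 + mod(2, 3) = 3  ->  "#ecf1ff"          */
/* ri!index = 4  ->  1 + mod(3, 3) = 1  ->  wraps to "#276bf2" */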