Hello Community! This week is our quarterly CA Agile Central Hackathon, and I've written an app that I think you're going to love. Have you had Teams that struggle to estimate consistently and predictably?
If so, you should try out my Estimation Calibration app, which guides you through analyzing your Team's cycle times by plan estimate to achieve more consistent and predictable estimation. Below are an overview and instructions for using it. If you try it, I'd love to hear your feedback by the end of the week (Friday January 5) so I can make the app even better and potentially include it in my Hackathon demo video. I'm passionate about using data to guide Teams to their highest performance yet, and I hope you are too!
The app starts by showing your Team's cycle times by estimate for a given Release:
Although this graph is valuable as-is for a quick check of consistent delivery, it's only the start of the app's flow to guide you through analysis. Clicking "Calibrate Estimates" shows the data in box plots to highlight the cycle time distribution by plan estimate:
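To make the box-plot step concrete, here's a minimal sketch of the per-estimate statistics such a plot needs. This is illustrative, not the app's actual source; the field names `planEstimate` and `cycleTime` are assumptions.

```javascript
// Compute box-plot statistics (min, Q1, median, Q3, max) for a list of
// cycle times, using linear interpolation between sorted values.
function quartiles(values) {
    const sorted = values.slice().sort((a, b) => a - b);
    const q = p => {
        const idx = (sorted.length - 1) * p;
        const lo = Math.floor(idx), hi = Math.ceil(idx);
        return sorted[lo] + (sorted[hi] - sorted[lo]) * (idx - lo);
    };
    return {
        min: sorted[0],
        q1: q(0.25),
        median: q(0.5),
        q3: q(0.75),
        max: sorted[sorted.length - 1]
    };
}

// Group stories by plan estimate and compute one box plot per estimate.
function boxPlotsByEstimate(stories) {
    const groups = {};
    stories.forEach(s => {
        (groups[s.planEstimate] = groups[s.planEstimate] || []).push(s.cycleTime);
    });
    return Object.keys(groups).map(est => ({
        estimate: Number(est),
        stats: quartiles(groups[est])
    }));
}
```

A tight box (small gap between Q1 and Q3) for an estimate means the Team delivers stories of that size in a consistent amount of time.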
Ideally we'd see tight box plots that shift upward as the estimates increase. That's rarely the case for a Team, so the app's next step is identifying the estimate where your Team is already most consistent. By starting with what your Team is already good at, the app helps transition to what could be improved:
Using this anchor, the app then defines ideal cycle times for each estimate. It scales the anchor's median cycle time by the relative difference in plan estimate to create ideal cycle times for each of the other estimates. You can already start to see which types of stories need the most calibration:
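The scaling step above can be sketched in a few lines. This is my reading of the description (proportional scaling from the anchor's median), not the app's exact code, and all names are hypothetical.

```javascript
// Given the anchor estimate and its median cycle time, derive an ideal
// cycle time for every other estimate by scaling proportionally to the
// ratio of plan estimates.
function idealCycleTimes(anchorEstimate, anchorMedian, estimates) {
    const ideals = {};
    estimates.forEach(est => {
        ideals[est] = anchorMedian * (est / anchorEstimate);
    });
    return ideals;
}
```

For example, if your Team's 3-point stories anchor at a median of 6 business days, a 5-point story would get an ideal cycle time of 10 business days.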
Using these ideal cycle-time bands per estimate, the app applies them back to the scatter plot to show which stories fell inside the ideal band and which fell outside. Since we want our Agile Teams to make incremental improvements, the app identifies the top five stories that could use the most adjustment, rather than having the Team focus on every problem:
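One plausible way to pick those top five is to rank out-of-band stories by how far they miss their band. This is a hedged sketch under that assumption; the story and band shapes are invented for illustration.

```javascript
// Rank stories by how far their cycle time falls outside the ideal band
// for their estimate, and return the worst offenders (default: top five).
function topCalibrationTargets(stories, bands, count = 5) {
    return stories
        .map(s => {
            const band = bands[s.planEstimate];
            const deviation = s.cycleTime < band.low ? band.low - s.cycleTime
                            : s.cycleTime > band.high ? s.cycleTime - band.high
                            : 0;
            return { story: s, deviation };
        })
        .filter(x => x.deviation > 0)           // in-band stories need no adjustment
        .sort((a, b) => b.deviation - a.deviation)
        .slice(0, count);
}
```

Capping the list keeps the retrospective focused on a handful of concrete stories instead of every scatter-plot outlier.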
Finally, the app guides your Team into a retrospective to identify actions that could be used in the future to make better estimate choices next time for more consistent cycle times:
If you've never used a custom app in Agile Central, you're missing out! Check out Extend CA Agile Central With Apps | CA Agile Central Help for instructions. Specifically, you'll be using a Custom HTML app (Custom HTML | CA Agile Central Help) to copy my code into Agile Central. If you need any help while installing the app, just let me know!
My app can be found at https://raw.githubusercontent.com/wkammersell-ca/estimation-calibration/master/deploy/App-uncompressed.html. As this was built during a Hackathon, it is unsupported by CA. The app only reads your data and doesn't try to update or write to your Agile Central data.
Again, any feedback or comments are greatly appreciated, and have an awesome start to 2018!
wkammersell this is awesome! The simple workflow makes this a great tool for retrospectives - can't wait to try it with my team!
Under my breath just now, I said the exact same thing. "THIS IS AWESOME!" I really really really appreciate this app! Thank you wkammersell
Oh, William. You dun good!
wkammersell I'm sure we're doing something wrong. That, or our data isn't right. For us, on load, the app shows a single line:
"There is no data. Check if there are stories with PlanEstimate assigned to your selected timebox which were marked in-progress and accepted within the timebox."
I'm confused about the 'assigned to your selected timebox' part. On load, the app never renders any controls in the GUI, only the message above. Screenshot attached. Any ideas?
Thanks for trying it out Agile_Central_Tooler! The app needs to be on a release-filtered custom page. Your screenshot doesn't seem to go high enough for me to see if that filter is up there. You can make a custom page be release-filtered if you click the gear in the upper right > Edit Page > Show Filter at the bottom.
If you think having it rely on a release filter makes it not usable, please let me know. Primarily I did the release filter to make the app more performant and to match the use case of a release retrospective, and I could try removing that limitation and see how it does.
Appreciate the work you put into creating these apps, wkammersell - especially the logic and calculations for complex analysis like this. We find that Agile Central tool owners and tool managers rarely recognize the value of custom app engineering and management on this platform. I find myself scouting custom apps wherever I can find them. Coming from you, this is a treat.
Approach 1. Originally, using your app code, we created a custom app in our subscription's app catalog because, after internal app governance review and approval, we'll make this a standard app available to everyone in the subscription: Admin -> Subscription -> Apps -> Add New. When creating a custom app for the subscription's App Catalog, there are options to include three separate timebox filters (iterations, releases, milestones) in the app itself. We included a release filter here, then added the app to a custom page and dashboard pages. Nothing. Then we removed the release filter on the custom app and re-added the app to the page. Still nothing.
Approach 2. We created a custom release-filtered page and added the app to this page. App works like a charm!
If possible, we'd like to use approach 1 because the App Catalog is our standard app delivery method. Also, from our experience, we think self-contained apps are more scalable. Scoping the release filter to the app itself instead of the page might be more flexible. But whatever the final app release, it would help to have more scope-details so that admins know exactly how to setup working apps inside the subscription. First impressions, from the tooling world, the app looks really good. I'll be sure to pass along reactions regarding business value from our Enterprise Agile Coaches.
Agile_Central_Tooler - I made a change so that the Release filter should show up in the app even if it's not on a release-filtered page. If you grab the code again from the link in the post it should work for you. If not, please let me know.
No start/end date filter yet, but that's next on my backlog.
Perfect! Thanks, wkammersell. Added the app to a few dashboard pages and it works great.
William, yup, really a great app to show teams the variability in their estimates. It leads to retro discussion. But maybe you could add some functionality. My team runs Kanban and is not release-centric, and the app is. I need to demo the app for the team, but there is no way for me to look back (unless I create a "private" release for the project and assign stories to the prior timebox). So there is a workaround... but it sure would be great to be able to set a start/end date when the app is added, without the workaround. Feel free to contact me for a demo of my issue. firstname.lastname@example.org
Interesting situation: wanting to leverage this app without using release timebox objects (which have start and end dates) but rather, manually input start & end dates. The painful work-around you suggested creates an unnecessary mess. I smell a Kanban version of this app...
Is it just a faint whiff from far away, or a beautiful aroma that is just next door?
I was so confused when I got your comment out of context in my inbox! I think the idea of adding a start/end date picker is awesome, though I don't think I'll be able to get to it this Hackathon. I'll try, though, and if not I'll keep hacking on the app over the next weeks since folks are loving it.
Thanks, I can demo for my team now, with my work around. I think they will love it. I’m the Rally SME for my team, and I don’t want them to get distracted learning the tool.
kudos, wkammersell. ya, this is impressive
WaltDietz - if you grab the latest code, I added the ability to set a start/end date range if you're not on a release-filtered page. I plan to make a further update to it soon to save the start/end date values so you don't have to enter the start/end dates every time you use the app.
Did anyone ever tell you that you are awesome? I am!
Agile Coach - Affinity Products
302 457 3795
“You cannot test courage cautiously”
Looking forward to sharing this with my customers -- will be extremely useful -- great App! I particularly like the flow...
Hopefully folks have been enjoying the app! If you've had a chance to try it with a Team, I'd love to hear whether it led to performance improvements or experiments, and any ideas that could make it more effective in guiding a Team with data. From your feedback, the expectations were high, so I'd love to hear how reality went.
I've finished off the rest of the backlog I had for the app with some new additions.
If you update the app's source code from the original link (https://raw.githubusercontent.com/wkammersell-ca/estimation-calibration/master/deploy/App-uncompressed.html) you'll get all the newness.
I didn't win at this quarter's CA Agile Central Hackathon, but don't worry: if you love Agile Central awesomeness, the winners were even better. Personally I'm passionate about apps like this, and thank you all for the amazing responses. I've started a second app with this same philosophy of guiding teams to their best yet via data analysis, and I look forward to posting it in the Community when it's ready for feedback!
nice try, kamwi02. and it was so innovative, too. Is there a list of candidate entries, finalists, & winners available? Who won?
Thanks Jerry! I unfortunately don't have a list of the winners I can share right now, but the winners are getting funded and appear on our upcoming roadmap. You'll hopefully have a conversation with your account manager or receive an invite to join a call to see the roadmap soon.
I appreciate the awesomeness!!
Hi William. I love the app and my teams are getting value big time. However, I got questioned on how the definition of "accepted" is calculated. The issue is that there were more stories shown as accepted on their Kanban board than there were data points in the graph. Below is my comparison. 21 stories "qualified" to be used, but there are only 14 data points on the initial screen. What am I missing?
That's awesome to hear your teams are getting results with the app! I believe the reason for your graph is that there are multiple stories for some of those points. The app looks at the cycle times to the nearest quarter day, and it looks like many of the values in your time spent field will be on top of each other. There are several in the 2.85 to 2.88 cycle time range in your grid, for instance. Kudos to your team for being so consistent!
WaltDietz - sorry I missed the question about how accepted is calculated. Each work item has a couple of important dates: an In-Progress Date, set when the item first reaches the In-Progress schedule state, and an Accepted Date, set when it first reaches Accepted.
These dates will also be set when you skip them. For instance, if you have "Released" as an optional last schedule state after "Accepted" and you move a work item from "Completed" to "Released" skipping "Accepted", that will set the Accepted Date.
For this app, to make it simple to use, I'm calling "Cycle Time" the number of business days between the In Progress Date and Accepted Date, rounded to the nearest quarter day.
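A rough sketch of that cycle-time definition, assuming "business days" means weekdays only. This is an approximation for illustration, not the app's exact implementation.

```javascript
// Count the business-day time (weekdays only) between two Date objects,
// rounded to the nearest quarter day, walking the range one day at a time.
function businessDayCycleTime(inProgressDate, acceptedDate) {
    const msPerDay = 24 * 60 * 60 * 1000;
    let businessMs = 0;
    let cursor = new Date(inProgressDate);
    while (cursor < acceptedDate) {
        // Advance to the start of the next day, or stop at the accepted date.
        const next = new Date(Math.min(
            acceptedDate.getTime(),
            new Date(cursor).setHours(24, 0, 0, 0)
        ));
        const day = cursor.getDay();
        if (day !== 0 && day !== 6) {  // skip Sunday (0) and Saturday (6)
            businessMs += next - cursor;
        }
        cursor = next;
    }
    return Math.round((businessMs / msPerDay) * 4) / 4;  // nearest quarter day
}
```

So a story marked in-progress Friday at noon and accepted the following Monday at noon would count as 1.0 business days, since the weekend is skipped.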
If you want to write your own apps, this ability to easily access the in-progress and accepted dates for each work item is one of the most powerful parts of Agile Central. Because each Team can customize its flow, whether using a Kanban board or the Team Board, and that flow maps to schedule states, powerful reports can be written that work in any team environment.
If you have more questions, please let me know.
William, I understand your definition. But in my specific issue, there is ambiguity between the definition and what is produced. See the screenshots above. On the graph there are 14 data points, but on the list I exported from Rally, there are 21 that supposedly meet your definition. What am I missing?
Hi WaltDietz, I believe several of the stories correspond to the same dot on the graph. Specifically, items 16-20 all have the same estimate and very similar cycle times, as do items 10-12. Thus, those points on the graph are actually stacks of 3 to 5 stories sharing the same estimate and cycle time. The app doesn't currently have a way to show multiple entries corresponding to the same dot. That accounts for 6 of the missing dots, and I assume the seventh is that two of the other stories share a dot as well. My hunch is that item 21 is also in the same stack as items 16-20, since some of those were accepted on 1/21/18, a Sunday, so their business-day cycle time may be the same as item 21's.
If you'd like more detail, please let me know, and if you see this happening a lot, I should be able to have the app show all stories for a dot in its tooltip, or increase the size of a dot when multiple stories share it.
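To illustrate why dots stack: once cycle times are rounded to the nearest quarter day, stories with the same estimate and near-identical cycle times land on the exact same point. A hypothetical sketch (field names invented):

```javascript
// Group stories by (plan estimate, cycle time rounded to the nearest
// quarter day); each group is one visible dot on the scatter plot.
function groupDots(stories) {
    const dots = new Map();
    stories.forEach(s => {
        const cycle = Math.round(s.cycleTime * 4) / 4;
        const key = `${s.planEstimate}:${cycle}`;
        if (!dots.has(key)) {
            dots.set(key, { estimate: s.planEstimate, cycleTime: cycle, stories: [] });
        }
        dots.get(key).stories.push(s.id);
    });
    return Array.from(dots.values());
}
```

With this grouping, a tooltip could list every story ID in a dot's `stories` array, or the dot's radius could grow with `stories.length`.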
As a CSM the app is fine as is, I can explain the discrepancy just fine. As an end user, making the dot with multiples bigger, and mouse-over displays all the stories in the dot, would add even more to your awesomeness score. Which IMHO is pretty darn high.