Using Economic Value Measurement to Enrich Your Agile Process

Dean Salvucci

The Agile world talks a lot about delivering value to customers. But after a new custom software solution has been deployed and is in use, how often do we measure the economic value actually gained by the customer’s business? At worst, economic value is completely overlooked. At best, it’s an under-used metric: no one is accountable for it, we don’t have the data, and people resist change. As a result, there are no best practices yet for measuring true business value after deployment. I think it’s time for this to change, time to adopt Economic Value Measurement. (Note that Economic Value Measurement is not to be confused with the traditional project management term "Earned Value Management".)

First let’s look at the typical Agile pattern. We have a Portfolio Backlog of Business Problems, presented in the form of new Business Features or Architectural Features that lay the groundwork for continued functional delivery. Most organizations have some up-front method of prioritizing features on the backlog based on the expected outputs. 

Then the team breaks down features from the Portfolio Backlog into User Stories, delivers the defined functionality at a predictable cadence, and the project meets expected deadlines. Happy ending for all. We delivered what was requested at a high quality, and therefore we were successful.

This is how I worked for many years in Agile product companies. What made me a convert to Economic Value Measurement was a project for an educational institution, on which I served as a consultant and as the Project Manager. We were about to shift gears from App Modernization to New Feature Development.

Solid Working Relationship

Let me give you some back story on this project. It begins with a strong, 7-year relationship between Summa and the institution in question. This foundation gave our Summa team the credibility to innovate in Economic Value Measurement. 

The initial goal of the application modernization project was to migrate the institution's Student Information System (SIS) to a new S3 Java web app platform and from the old Ingres DB to Oracle, while retaining the original business functionality. Thanks to the hard work and dedication of a truly agile team (a mix of Summa employees and the client's employees), we found ourselves on the other side of a very successful migration with a backlog full of valuable new Business Features. 

By design, we did not give a great deal of thought up front to business value when prioritizing the work backlog. Our main goal was to complete the planned migration as quickly as possible. After everything on the backlog was migrated, we would start solving our business owners’ problems in the exciting new world of S3 Feature Development. 

Time for Tough Decisions

Now we entered the ‘Functionally Focused Phase’ of the project, needing to prioritize our Portfolio Backlog while providing value and fairness to the various departments of the educational institution vying for our attention. There was no shortage of features for the development team(s) to consider. But as development resources were limited, we couldn’t build everything right away. 

So here’s our dilemma: how do we (the S3 Managers—myself, another Summa consultant and the client's managers/stakeholders) prioritize our backlog of features across multiple departments, including Admission, Registrar and Student Accounts?

We considered Weighted Shortest Job First (WSJF), a popular prioritization model prescribed by the Scaled Agile Framework (SAFe). Its formula divides the cost of delay, which incorporates business value, by job size, and it helps large programs with multiple product owners prioritize. However, at this institution we were one team working with multiple product owners, and the academic calendar meant we could only work with one at a time. We decided on a lighter-weight model that included business value and a fairness factor for the various business owners. 
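For reference, SAFe's WSJF divides a feature's cost of delay by its job size. Here is a minimal sketch; the feature names and ratings are hypothetical, and SAFe's full cost-of-delay model has more nuance than this:

```python
# A sketch of SAFe-style WSJF scoring. Feature names and ratings are
# illustrative, not the institution's actual numbers.

def wsjf(business_value, time_criticality, risk_reduction, job_size):
    """WSJF = Cost of Delay / Job Size. All inputs are relative estimates
    (SAFe suggests a modified Fibonacci scale: 1, 2, 3, 5, 8, 13, 20)."""
    cost_of_delay = business_value + time_criticality + risk_reduction
    return cost_of_delay / job_size

features = [
    # (name, business value, time criticality, risk reduction, job size)
    ("Automate deposit processing", 8, 5, 3, 3),
    ("Course cross listing", 5, 3, 5, 8),
]

# Highest WSJF score first: small jobs with high cost of delay win.
ranked = sorted(features, key=lambda f: wsjf(*f[1:]), reverse=True)
for name, *scores in ranked:
    print(f"{name}: WSJF = {wsjf(*scores):.2f}")
```

In practice we found even this too heavy for a single team, which is why we fell back to the lighter-weight model described above.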

Becoming Economic Value-Heads

Having picked our model, we realized that we should look at the economic value of the proposed feature, not only pre-implementation, but also post-implementation, to see if we hit the mark and delivered the intended benefits. No matter how much thought and care goes into your prioritization and elaboration processes prior to software development, you can design the wrong solution, the environment can change, or some other factor can diminish the intended value.

In the spirit of continuous improvement, we decided to train ourselves to be economic value-heads throughout our entire process. Economic value should be near top of mind when a feature is just an idea in a business owner’s head, when it's first presented to the team, when it's deployed to production and beyond. Once we knew how previous decisions turned out, we could become better decision-makers. Basically, we created another feedback loop for continuous improvement.

The Fine Print

We realized right away that evaluating software development is part objective and part subjective. In manufacturing, the value of a product is conveniently objective: the cost of inputs vs. the sales of outputs, i.e. the end product. For software, however, user perception of the outputs (changes to existing software, or new functionality added) is a big factor in defining the value. This is particularly true for a non-product company: we depend on user experience to tell us if an internal software development project has delivered value.

While measuring software economic value is not an exact science, we can quantify some common benefits of software development.

  • Did this feature cut man hours for the end user?
  • Did its functionality allow the user to work more efficiently?
  • Did it reduce the risk of rework, an audit and possible fine?
  • Did we provide future scalability that will extend the software’s life?
  • Did system performance improve?

And then, the most subjective question: did users like it?   

The industry hasn’t established standards for measuring the economic value of software development, though there are a few frameworks out there, and there is not much data on how to build these metrics into software development. Pulling from what’s available, we decided to incorporate the following:

  • New functionality
  • Extended life
  • Reduced risk
  • Time savings
  • Improved performance
  • Customer satisfaction

In the end, we assumed that there must be a relationship between the features we delivered, the perceived value, and whether the functionality met the user’s needs.
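A lightweight scoring sheet along these lines might look like the following sketch. The category ratings and the fairness boost are invented for illustration; our actual model was informal and negotiated with the business owners.

```python
# Illustrative scoring across the six value categories listed above, with a
# simple fairness boost for departments that have waited longest. All the
# numbers here are hypothetical.

CATEGORIES = ["new_functionality", "extended_life", "reduced_risk",
              "time_savings", "improved_performance", "customer_satisfaction"]

def value_score(ratings, fairness_boost=0.0):
    """ratings: category -> 1..5 rating gathered in the business owner
    interviews; fairness_boost nudges long-waiting departments upward."""
    base = sum(ratings.get(c, 0) for c in CATEGORIES)
    return base * (1.0 + fairness_boost)

deposit_feature = {"time_savings": 5, "new_functionality": 3,
                   "reduced_risk": 3, "extended_life": 2,
                   "customer_satisfaction": 4}
print(value_score(deposit_feature, fairness_boost=0.1))
```

The point is not the arithmetic but the conversation: scoring each category forces the business owner and the team to state, before development starts, where they expect the value to come from.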

Our Economic Value Measurement Process

At a high level, our process looked something like this:

  1. Introductory meeting where the Business Owners describe the Business Problem that we are trying to solve to the Development Team.
  2. Development Team comes up with potential approaches to solving the problem along with a high level Epic size.
  3. Based on the proposed solution and its size, the Business Owners decide whether to include it in their Feature Development period.
  4. Elaboration Meetings with Business Owners begin to further define the User Stories that must be completed by the Development Team.
  5. We perform an Economic Value interview with Business Owners in which we analyze the current work process. This could include a Value Stream Analysis.
  6. We Develop, Test and Deploy the feature to Production using our Lean/Kanban workflow.
  7. After the feature has been in Production for at least 1 month, we perform a post-implementation interview in which we examine the same workflow with the newly implemented feature.
  8. We share what we learned with the Business Owners and other stakeholders—how our decisions turned out about which features to work on, and what value was actually delivered.

How It Worked: Case Study 1

The feature was NTIA – update deposit automatically. The problem was too much manual work in updating the S3 app and database when freshmen sent in their acceptance deposit. We learned that Admission often spends several hours every day (including weekends) processing incoming students’ deposits during admission season.

Ninety percent of incoming freshmen pay their deposit by credit card. We wanted to automate the process for this large group with the F47 feature, so that Admission would only have 10% of deposits to handle manually. So our main goal for this feature was time savings, but there were also elements of new functionality and reduced risk, extended life and customer satisfaction.

We were very pleased with the outcome:

  • Time savings: the feature reduced manual processing time from 15–20 hours per week to less than 1 hour per week!

[Figure: Economic_Value_Agile_Process_chart_1.jpg]

  • New functionality and reduced risk: the client gained greater security and more information with less work.
    • Added authentication of applicant against S3 Student data for added security and less opportunity for confusion or mistakes
    • No more need for applicant to type personal info, which is now populated dynamically
    • Dynamic collection of colleges chosen by applicants not going to the institution in question, adding clarity to the process and requiring less manual oversight by Dean of Admission
  • Extended life: the institution now has the luxury to rewrite, upgrade or decommission the legacy web apps in the future.
    • Moved 90% of important business functionality from the hard-to-maintain legacy web apps to the more easily maintainable S3 system
    • Poised for better compatibility with third party apps in the future as code for automated processing is accessible through secured web services
  • Customer satisfaction

What We Learned: Saving Time Is A Winner

What did the S3 managers learn from this case? For this particular feature, the benefits were highly visible due to the huge time savings provided by the new functionality. 

  • Time-saving features should continue to be high-priority, particularly when the cost is low and the benefits are measurable. The Admission feature was completed in just 4 weeks and will provide savings for many years to come
  • Value is increased if we add in the new tasks that staff can now accomplish in the time saved by automating the process
  • Cases like this demonstrate value and build credibility for future budget requests within any organization
  • If we take this study one step further, we can actually calculate a more precise cost-benefit of the feature to demonstrate the dollar value of our project. By measuring the cost of the man hours required during the admission process vs. the cost of the man hours that went into the development of the feature, we can find a breakeven point
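That breakeven calculation can be sketched quickly. Only the 15–20 hours/week savings comes from the case study; the rates, developer hours and season length below are assumptions, since the article does not give them:

```python
# Back-of-the-envelope breakeven for the deposit-automation feature.
# Only the 15-20 h/week -> <1 h/week savings comes from the case study;
# the rates, developer hours and season length are assumptions.

dev_hours = 4 * 40             # 4-week feature, assuming one full-time developer
dev_rate = 100                 # assumed blended development cost, $/hour
staff_rate = 30                # assumed admission staff cost, $/hour

hours_saved_per_week = 17 - 1  # midpoint of 15-20 h/week, minus the 1 h remaining
weeks_per_season = 12          # assumed length of the admission season

dev_cost = dev_hours * dev_rate
savings_per_season = hours_saved_per_week * weeks_per_season * staff_rate

seasons_to_breakeven = dev_cost / savings_per_season
print(f"Breakeven after {seasons_to_breakeven:.1f} admission seasons")
```

Even with deliberately conservative assumptions, a calculation like this turns "we saved time" into a dollar figure that stakeholders can weigh against the next budget request.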

How It Worked: Case Study 2

The feature in this case was Course Cross Listing, basically a data cleanup effort. Universities often offer courses across more than one academic department. But the S3 data model did not properly define such courses. As a result, any course existing in the cross-list table got applied to all semesters and all sections within S3 regardless of whether the course was actually offered. Accurate cross-listing required manual processing of data from a paper form.

In addition, cross-list data was overwritten when a new semester schedule was built, while the current semester was still in session. There was no historical record for users to find out what was cross-listed in previous semesters. As a workaround, staff needed to output data and archive for reference purposes before building each new semester within S3.

Naturally, the underlying data also created problems with various business processes such as scheduling rooms (using 25 Live, a 3rd party app), scheduling final exams and faculty course evaluations. 

Not surprisingly, many users were confused and unhappy with the current process. We saw an opportunity to save time and extend process life by eliminating the need for the Registrar to archive data before creating each new semester schedule. Just as important, we also saw a huge opportunity to improve customer satisfaction by eliminating confusion within the URO as well as across various Academic Departments.  

  • Time savings: simplified processes, as the illustration shows, saved significant time
    • Improved regular path and reduced rework due to improved data clarity
    • Eliminated need to archive data before making course schedules
    • Saved 4 days in course scheduling by reducing mistakes made due to data ambiguity as well as 25 Live room scheduling exceptions
    • Saved 2 days during final exams as registrar no longer has to review cross-listed courses one by one

Course Scheduling Diagram Pre-implementation

[Figure: Economic_Value_Agile_Process_chart_2a.jpg]

Potential for rework exists due to scheduling mistakes, which are discovered due to the overall lack of data clarity, and also due to location exceptions that occur when running 25 Live, which is the room scheduling application.

Course Scheduling Diagram Post-implementation

[Figure: Economic_Value_Agile_Process_chart_2b.jpg]

Post-implementation, the majority of the rework has been eliminated, resulting in a 4-day savings on average.

  • Extended life
    • Process is easier to maintain and sustain with archiving step eliminated
    • The presence of historical data in the DB makes life less confusing for everyone, particularly when it comes to answering questions or reporting on old data
  • Customer satisfaction
    • Business users report an overwhelming level of satisfaction, ‘5+ on a scale of 1 to 5’
    • Word from other academic departments and institutional research & analysis regarding improved data clarity has also been extremely positive
    • No more confusion regarding faculty course evaluations. Previously, due to confusing data, faculty members teaching a cross-listed course may not have been included. Now, it is clear who should be included in faculty course evaluations, and no one is excluded

What We Learned: Value of MVP

You may not have time to solve every business problem with your proposed solution, but it is worth determining the MVP (Minimum Viable Product) and delivering some value to end users. When we planned this feature as a team, we realized that we did not have time to tackle all of the Registrar's pain points. We still delivered time savings and improved customer satisfaction while extending the life of the product and process, so that we could potentially circle back one day and ‘fix everything’.

  • The success of the MVP will make us less hesitant to take on a partial solution in the future
  • Customer satisfaction can be as powerful as time savings when it comes to end users. We should consider developing functionality that improves the user experience even if that is the primary benefit

Conclusion

The big lesson of Economic Value Measurement is about continuous improvement. After each development period, the team conducts a retro and looks for improvements. The business value of past decisions should influence future choices about what features to work on. It only makes sense to follow up after delivery to evaluate those decisions. I highly recommend adding Economic Value Measurement into your feedback loop and finding ways to improve your decision making during feature prioritization.


Dean Salvucci
ABOUT THE AUTHOR

Dean is a Project Manager, SAFe Agilist and Certified Scrum Product Owner on Summa's PMO Team. He has 20 years of experience in software development, spanning the healthcare, banking/finance and higher education industries. He has managed on-site and offshore teams utilizing Agile Methodologies and is an advocate for Lean/Kanban. During his downtime, he enjoys playing basketball with his two sons, making wine and following the Steelers. He is also a bit of a film buff and a big fan of many different genres of music, especially live concerts.