Webinar Replay: Navigating the Software Project Journey

How to Plot and Control the Course to Successful Outcomes — Every software project is a journey. Perhaps it's the excitement of the adventure that led you to a career in software development. Every software project has unique characteristics and challenges - technology, development methods, business goals, constraints, etc. - that have the potential to take the project off course. Yet the fundamental behavior of software projects is well known, measurable, and predictable. Navigation requires data about heading, position, environmental conditions, and speed, plus skills for interpreting the data and making course adjustments accordingly. In this webinar, Laura Zuber will share software tracking and forecasting best practices based on simple core metrics (project size, schedule, effort, defects, and productivity). Visually compare project actuals to the project plan, then forecast the most likely delivery date, cost, and quality based on actual performance to date. Tracking provides early warning, and forecasting lets you evaluate multiple strategies to get your project back on track. Both tools promote timely, effective course corrections to reach your project goals.


My name is Kate Armel, and I would like to welcome all of you to today's webinar, Navigating the Software Project Journey: How to Plot and Control the Course to Successful Outcomes. Before we get started, I'd like to make you aware of a few technical items. Everyone watching this presentation is currently on mute. To ask a question at any point in the presentation, just use the Q&A dialogue at the right side of your screen. Laura will field as many of these questions as she has time for at the end of the presentation. If she doesn't have time to get to your question, don't worry. We'll save all the questions we don't get to and refer them to Laura so she can get you some answers.

And I'd just like to tell you a little bit about our presenter today, Laura Zuber. Laura has 27 years of experience in software development, consulting, training, and support. She has conducted training and coaching sessions for all QSM SLIM-Suite tools and helped customers implement SLIM across a wide variety of processes and platforms. Laura has managed software development projects, served as a senior software process improvement specialist, performed process assessments, designed and implemented best practices, and authored numerous training programs. She is a Certified Scrum Master and SAFe Agilist.

Thank you, Kate. I appreciate that, and I appreciate all of you joining today as well, taking time out of your busy schedules. You know, to successfully navigate software projects in today's complex and dynamic environment, you need an accurate picture of where you are, where you're going, and when you'll get there. You trust your GPS app to navigate across town or across the country, and likewise you need a reliable picture of your project landscape so that you can see if your software release is on course. Your GPS, you know, will compute your average speed. It varies, right? Maybe there's an accident and your speed goes down, and it will show you how far you've gone, and then also that you have this far to go. Your arrival time will also get adjusted if you stop. So, your highly visible software project has started, and as time goes on, how do you circle back to your original plan or estimate to make sure that the actual work to be done is taking you where you want to go?

Your software project journey is dictated by your plan. Reliable plans are constructed from reliable estimates. Too many firms aren't doing empirically based estimates. They start with a goal-based estimate - wanting to win the work or meet the sales goal - rather than a data-driven estimate. The question is, who's really driving the bus? Both approaches involve uncertainty. That's just the nature of estimating in the software development world; we have to deal with uncertainty. However, data-driven estimates are focused primarily on the size or scope of the application - making sure we're focusing on understanding what it's going to take to deliver the final product - and also on our performance, the efficiency of our development organization. But even if you have the best estimate in the world and the best plan, every plan encounters reality, right? Something is going to happen.

For over 45 years, the QSM SLIM-Suite of applications has helped thousands of organizations manage software projects throughout the lifecycle. The SLIM estimating and forecasting methodology is what I'm going to focus on today, with these two products: SLIM-Estimate and SLIM-Control - mostly SLIM-Control, of course. Our methodology is top-down and scope-based, and it models the non-linear behavior of software projects. We said a moment ago that reliable plans are constructed from reliable estimates. The problem that SLIM-Estimate solves is moving companies away from goal-based estimates and toward data-driven estimates. The problem SLIM-Control solves is tracking the actual data so you can see, literally, how it lines up to the plan, and it tells you where you'll end up.

The Project Management Institute (PMI) has well-defined and recommended practices for managing software projects, and QSM has mapped our SLIM tool features to show how we support them. Today, I want to focus on a few simple steps, some of which you're probably already doing, that go beyond task, resource, and cost tracking to provide real insight into what's going on so you can make informed decisions about the best way to move forward. I'll briefly list the steps here and then share details about each.

1.      The first thing we want to do, of course, is enter the plan and then compare it to trends. What we mean here is some historical data. You can use the QSM industry database, which I'll talk about in a little bit, or a best practice would be to gather your own history and compare your estimates and plans to your own history.

2.      The second step is to enter actual data - just a few core metrics, plus any other metrics that you want - every reporting period.

3.      And then we're going to pay attention to tracking the actuals versus the plan over time, at regular intervals, with some charts that provide a visual map of where you are.

4.      Then at appropriate points along the course we want to calculate some forecasts based on where we are today. Where are we going to end up? You can run multiple forecasts and compare them - try out different scenarios ahead of time. Then you can decide what is the best course to take. We call those What-if Scenarios.

5.      The final simple step really is to select the route, or the forecast, that achieves your most important goals. You may not be able to get everything done that you originally wanted to, but determine the most important thing for you to achieve to keep this project moving forward.
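The five steps above boil down to recording a small set of actuals each period and holding them up against the plan. As a minimal sketch of that idea - with field names invented for illustration, not SLIM-Control's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class PeriodActuals:
    """One reporting period's core-metric actuals (step 2).
    Field names here are illustrative, not SLIM-Control's schema."""
    period: str            # e.g. "2024-03"
    size_done: float       # features/stories completed to date
    effort_hours: float    # person-hours logged this period
    defects_found: int
    notes: str = ""        # explanatory notes for the audit trail

@dataclass
class ProjectTracking:
    """Holds the plan assumptions and the growing list of actuals
    so they can be charted against each other (step 3)."""
    plan: dict                              # metric name -> planned values
    actuals: list = field(default_factory=list)

    def record(self, period: PeriodActuals):
        self.actuals.append(period)

proj = ProjectTracking(plan={"size": [20, 60, 120, 180, 220, 240]})
proj.record(PeriodActuals("2024-01", 15, 640, 3, "slow staffing ramp"))
```

Keeping the notes field alongside the numbers is what later makes each period's deviations explainable.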

Before describing the tracking and forecasting process, let's talk a little bit about how your software plan is derived. Many companies estimate by creating detailed plans based on the roles needed for a task, or on a task list, both commonly referred to as bottom-up estimates. The last big project I worked on was a very large pharmacy workflow application. We had departments for almost every role, so we relied on role-based estimates.

These approaches have challenges, the main one being a lack of focus on the total scope of the project. Plus, there's difficulty getting that detailed data in a timely fashion anyway. These can take a long time. It's not a bad way to go - some really good plans can come out this way - it's just that they're fraught with a lot of challenges. SLIM's scope-based method uses your estimate of the product size and the productivity or efficiency of your development environment to determine time and effort. This is a typical output from SLIM-Estimate. It lets you calculate a defensible estimate in a very short time by using just five core metrics, which is what I've highlighted over here. We typically show the core metrics for the main development effort, plus any other activities you might have included in your lifecycle. We only have the main development here. We've got duration, effort, and cost estimated for this release. It also shows staffing levels and a quality or reliability measure. The size is a generic unit of measure for software size that we sometimes use. And here's the efficiency, so you can see we're staying very high level. These two charts over here are scatter plots. The software size is plotted on the X axis, and for this particular case, we've got the estimated duration for the project and the estimated effort. This little blue dot here is the current estimate solution, and the lines are statistical trend lines. The other data points are our own historical projects. So, this is the way that we can look at the estimate or the plan and sanity-check it against industry data, which can be super powerful. A best practice would be to also have some of your own data. Before we ever get started with the project, the best practice is to have a very good plan.

QSM's industry database contains over 14,000 completed projects. It provides the size and productivity data not generally available early in the lifecycle. Not only do you get that benefit, but you can use it to sanity-check your estimates and identify unrealistic expectations.

If you have used SLIM-Estimate - and you don't have to - you import that estimate into SLIM-Control, which is the tracking and forecasting tool, and it creates this plan showing each of the major management metrics. Here's the schedule. Pretty simple - just a single bar on a Gantt chart. We've got our staffing plan. This one is level loaded. This is the build-up of the construction. That's the generic implementation unit that came from the estimate, but we added another size measure of features. You can add as many as you want. You want to ask yourself ahead of time: what is super important for me to track? QSM works with large organizations using the Scaled Agile Framework, and we've helped them track progress based on several size measures as they are decomposed. You might start with capabilities and also track epics, features, and stories. It's not that hard, and it is extremely valuable to see how these track with each other and understand their relationships, their relative sizes, and dependencies. Again, you don't have to use SLIM-Estimate to use SLIM-Control. It can work with any plan you've got. You can enter a custom plan.


Five core metrics are used throughout the SLIM-Suite of tools. For each tracking period you will enter the actual data for the first four metrics. So size - how many epics have we built out yet, or how many stories have been completed? Time is simply a matter of entering the start and end dates, and maybe some milestone dates for certain activities like requirements and design. When did we start requirements and design? When do we plan to finish those milestones? The start and end dates help mark the actual time. We were going to start on June 1st, but we actually didn't get started until the 15th - that's the difference between plan and actual. Effort is primarily expressed in terms of hours worked or full-time-equivalent staffing, and you have your choice about which actual data is easiest for you to go and get. Then defects found. One thing I really loved about the SLIM tools when I was first introduced was the modeling of how many defects you should expect. Even if you didn't have a plan for defects, counting your actual defects and making sure that they're decreasing before you ship is really an important concept. The fifth core metric is called the Productivity Index, which in SLIM is a whole-project metric. When the plan comes from SLIM-Estimate it includes the implied Productivity Index, the efficiency level. It indicates the capability that you need to meet in order to deliver on your goals.

The plan in SLIM-Control is simply those metrics that you want to track during the project. We call them plan assumptions, and this is a screen of what they might look like. You can see over here that there are not a lot - just the five core metrics. Productivity doesn't show up here. There are several defect metrics because they're broken out by severity class. You could track just the total defects if you want. For each metric, SLIM-Control wants to know when it should start seeing actual data for that particular measure. Staffing would start at the very beginning of the project, for whatever set of activities are being performed first, and staffing would be captured all the way through the project. Hopefully people do not stop working. The size measure that we happen to be tracking wouldn't start until the development and testing efforts start, and it will go all the way through the end of the lifecycle. Indicate how many items you expect to build. This is thousands of Implementation Units, roughly equivalent to a line of code, but you may have features completed or some other size unit. For example, we expect to complete 240 features. So, whatever your measure is, you just tell it when to start, when it ends, and how many you expect to build.

The other thing that happens in SLIM-Control is this idea of a production pattern. This is typically an "S" curve: you start slow and it builds up. The other thing you might want to think about is effort. Is it easy for me to get the number of staff working on the project, or do I have a nice system where I can go and pull the person-hours out? So again, just very simple - not a whole lot of information. Once you've got the plan entered, then, like the estimate, we want to compare the plan to industry trends. Now, if it has come from SLIM-Estimate then you may have already done this step, but what you're looking for here is that good feeling that this is a good plan. This particular example is the same release that QSM did, but in this case, these trend lines are constructed from our own history. You can see it gives you a good feeling that this plan is very much falling in line with what we've been able to do in the past. One of our consultants was supporting a large defense system. He did an estimate and plan and made sure they were consistent with the trends, and management rejected what he came up with. They wanted it faster, of course. He quickly replanned in SLIM-Estimate to reflect their goals, and that was quite aggressive. When the completed project came in, it was almost on the dot to our consultant's original estimate. It took them twice as long as they wanted, so there's really a lot of ROI if you can avoid this kind of disaster. Similarly, for one of our own SLIM-Suite releases we determined the date based on a sales goal. We got it out on time, but we had to do four point releases because the quality wasn't great. So trying to release early just wasn't a good idea. We went back to management and said let's never do this again. Future releases since that time have been extremely good quality, and it also reduces maintenance. Quality is very much a part of your plan.
This is an important, easy step.
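To make the "S"-shaped production pattern concrete, here is a minimal Python sketch of a cumulative Rayleigh curve - the slow-start, ramp-up, taper shape described above. The parameters (240 features, production rate peaking at month six) are illustrative assumptions, not SLIM's actual internals.

```python
import math

def rayleigh_s_curve(total_units, peak_month, months):
    """Cumulative Rayleigh ("S") curve: slow start, ramp-up, taper.

    total_units: planned size at completion (e.g. features)
    peak_month:  month at which the production *rate* peaks
    months:      iterable of month indices to evaluate
    """
    a = 1.0 / (2.0 * peak_month ** 2)  # shape parameter
    return [total_units * (1.0 - math.exp(-a * t * t)) for t in months]

# hypothetical plan: 240 features, production rate peaking at month 6
plan = rayleigh_s_curve(total_units=240, peak_month=6, months=range(0, 13))
```

Plotting these cumulative values against the reporting periods gives the familiar "S" shape that planned construction follows, and it is this planned curve that each period's actuals are later compared against.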

The second step is to go ahead and start collecting that actual data, which requires a little bit of planning of its own. What data is available for the size measure? Features, stories, story points, lines of code? There are probably other measures that you're used to working with, and it is nice these days that people are dealing with size much more explicitly than they used to. I think we can thank Agile development for that. For time, again, some start and end dates for a set of activities - fairly straightforward. The effort I've talked about already: either person-hours or full-time equivalents, but we want to get them if we can, and, if your development process supports it, tie these hours to a set of development activities. Then of course, defects are a nice-to-have. You don't have to collect defects, but if you can, then collect defects found, perhaps how many were fixed, and then ultimately calculate the reliability of the application. Where is the data stored? You know, we have a lot of this data in other tools that we use. It could be a PM tool or some other kind of project tracking software. A lot of information about sizing, effort, or the number of tasks could be in JIRA or another application like Rally - there are all kinds. If you're working with lines of code, then you can use your configuration library and pull a line-of-code count. We do this at QSM. We do both: we have a tracking system for our requirements and bugs - we call the requirements features - and we'll get a line-of-code count for each month and see how many features we have completed. Then we can see how those relate to each other. Lastly, there may be some conversions required - probably not a lot. For effort we may want to map to SLIM phases, and you may have to aggregate over a certain reporting period. Let's say you're only gathering actuals monthly, but you have actual data for every week; then, of course, you might need to aggregate those weekly values - the kind of thing I'm sure you're used to already.

So, like the plan assumptions, this is an image of what it would look like to enter your actual data. You can see that for each reporting period we just come in here and say, well, how many people were working this month, and how much functionality did we complete this month? It could be weeks or sprints. There are defects by category. A nice feature is that for each reporting period and each metric, SLIM-Control will tell you what the plan value is, so you already get an immediate barometer of what's going on. Another neat thing to do - a good practice - is to put in some explanatory notes that might help remind yourself of what was going on this reporting period, or maybe help somebody else help you figure out what to do about it. In this case we were making a note that the defects were high this reporting period, but we understand that it was not necessarily coding issues: we upgraded some packages in our platform and that caused a few issues.

Once you have the plan and actuals, here's the most interesting and exciting part: the visuals of actuals versus plan. These control charts provide great insight. We're going to talk about forecasting in a few minutes, but tracking alone, by itself, can be extremely valuable. These are some example charts. This is the number of staff on the project - you can see it's quite a long project. This is the total number of tasks to be completed, and this is actually some of the lifecycle deliverables. What you're looking at is the plan - how things should shape up over time - as the blue line. Associated with those in SLIM-Control are control bounds, the yellow configurable area that you set up for your tolerance for deviations from plan. We don't want to overreact if something is not exactly on plan; in fact, if something is exactly matching plan, I would actually be suspicious. You should expect a little variability, maybe not with staffing so much. We do find that it is a common problem that projects get behind because they just don't get the staff on board as quickly as they needed to, like in this example. When you deviate beyond the control bound you see the traffic lights on your chart, and of course red is no good. You're getting that GPS early warning, and it causes you to ask questions. Why are we behind? That may prompt you to go and collect another metric to get more insight into what's really going on. Development work is hard and collecting data takes effort, but it's super worth it.
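The control-bound idea can be sketched in a few lines. This is a hypothetical illustration, not SLIM-Control's actual logic: the band widths and the metric values below are assumptions.

```python
def traffic_light(actual, plan, tolerance=0.10):
    """Classify a reporting-period metric against its plan value.

    tolerance is the configurable control bound: deviations within
    +/-10% of plan stay green, within double the bound go yellow,
    anything worse is red. (Band widths here are illustrative,
    not SLIM-Control defaults.)
    """
    if plan == 0:
        return "green" if actual == 0 else "red"
    deviation = abs(actual - plan) / plan
    if deviation <= tolerance:
        return "green"
    if deviation <= 2 * tolerance:
        return "yellow"
    return "red"

# e.g. 95 units completed against a plan of 100 stays green,
# 85 turns yellow, and 70 trips the red early warning
statuses = [traffic_light(a, 100) for a in (95, 85, 70)]
```

The value of the band is exactly what the talk describes: it keeps you from overreacting to normal variability while still flagging real drift early.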

What you're looking at here are some charts from another one of our consultants, who was doing some expert witness work. She reconstructed the vendor's project tracking of only effort and time, but she added the size. What are you getting for all that time and effort and product development? The construction and deliverables fell behind right away - you can see that for sure on these two charts, and you didn't even have to forecast to see it. Not having this kind of visual, the customer didn't know what questions to ask. Again, this is a good approach for anyone who's doing vendor management. This view shows the progression of the deliverables and how far behind they were. An important and unique benefit of SLIM-Control is the calculation of the implied Productivity Index. Each reporting period we've got the Productivity Index - what we planned versus what efficiency we are actually achieving. It turns out that when our consultant reconstructed and validated the vendor's estimate, it was considered reasonable against industry trends. It actually was a decent plan, but their actual PI was far below that plan, and that shows one of the reasons this project was in jeopardy.

The next step: before we compare multiple forecasts, we have to calculate at least one. Once you've identified some issues, this will help you figure out what should be done. We're going to extrapolate how things are going - same scope, same staffing. We're just going to extrapolate our current plan out and see where we end up. And then we can do some what-iffing - explore the range of potential outcomes. SLIM-Control provides three types of forecasts:

·         Curve Fit is the most common one and the first one you should do. It forecasts by finding a theoretical Rayleigh curve that best fits your actual data for each metric. For example, the product construction and the defects or features each have their own forecast. Early in the lifecycle, milestones - another component of your plan - can carry more weight in the forecast. Are we reaching those dates that we thought we would? Later in the project, the actual product construction and the defects carry more weight. Those are defaults within the tool, but you can of course specify your own weights. You have to ask yourself which metric you believe the most - which is the most reliable - and then in your forecast you can put more weight on that particular metric.
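As a rough illustration of curve fitting - a simplified stand-in for SLIM-Control's forecast, with made-up actuals - a grid-search least-squares fit of a cumulative Rayleigh curve to six months of completed-feature counts looks like this:

```python
import math

def rayleigh_cum(t, total, peak):
    """Cumulative Rayleigh value at month t."""
    a = 1.0 / (2.0 * peak ** 2)
    return total * (1.0 - math.exp(-a * t * t))

def fit_rayleigh(actuals):
    """Grid-search least-squares fit of a cumulative Rayleigh curve
    to per-month actuals; returns (total_at_completion, peak_month).
    A crude sketch of the idea, not SLIM-Control's algorithm."""
    months = range(1, len(actuals) + 1)
    best, best_err = None, float("inf")
    for total in range(100, 501, 5):
        for peak10 in range(20, 121):        # peak month 2.0 .. 12.0
            peak = peak10 / 10.0
            err = sum((rayleigh_cum(t, total, peak) - obs) ** 2
                      for t, obs in zip(months, actuals))
            if err < best_err:
                best, best_err = (total, peak), err
    return best

# six months of completed-feature actuals (invented numbers)
actuals = [3, 13, 28, 48, 70, 94]
total, peak = fit_rayleigh(actuals)
```

The fitted total and peak then extrapolate where construction will actually finish, which is the essence of a forecast-to-complete from actual performance to date.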

·         Another kind of forecasting is a Trade-Off forecast. I briefly mentioned that SLIM models a non-linear trade-off between time and effort. Software development is not manufacturing: putting twice as many people on a project to compress the schedule is just not going to get you what you think it does. Here, you enter a new staffing level and SLIM-Control will calculate your new end date. You can see the schedule and cost implications for that scenario. Notice these forecasting assumptions look just like the plan assumptions. Maybe the first time you run your curve forecast you're not going to change anything, but then you run another scenario and say, we think they might want to add 10 more features - what would that do to our project? In one of our recent releases at QSM we overestimated the size, and so our new forecasts were actually based on reducing the scope. It can go either way.
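The non-linear time-effort trade-off comes from Putnam's software equation. Here is a simplified sketch; the size and productivity-parameter values are invented for illustration, and the raw parameter used is not SLIM's published Productivity Index scale.

```python
def effort_for_schedule(size, process_productivity, months):
    """Effort implied by a simplified form of Putnam's software
    equation, Size = PP * Effort^(1/3) * Time^(4/3), solved for
    Effort. Effort scales with 1 / Time^4, so modest schedule
    compression gets very expensive."""
    return (size / (process_productivity * months ** (4.0 / 3.0))) ** 3

# hypothetical 100k-unit product with an invented productivity parameter
base = effort_for_schedule(100_000, 800, 12)   # nominal 12-month plan
crash = effort_for_schedule(100_000, 800, 9)   # compress to 9 months
ratio = crash / base                           # (12/9)**4, about 3.2x
```

Under this model, compressing the schedule by a quarter roughly triples the effort - which is why doubling the staff rarely halves the delivery date.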

·         The third kind of forecast is what we call a Maintenance Forecast. This is for when development is over: we're ready to release and we want to make sure that we reach our quality goal. This could be software for medical devices, or some other kind of application that needs to be of high quality. You iterate on different dates and see what the reliability would be based on working, let's say, another three months to remove the defects that are currently in the system. Just because the schedule says the project is over doesn't mean that the product really is good enough to go out the door, and this can help you figure that out.
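A maintenance-forecast style question - ship now, or work three more months? - can be sketched with the same Rayleigh assumption applied to defect discovery. The defect totals and dates below are invented, not a calibrated SLIM model.

```python
import math

def defects_remaining(total_defects, peak_month, ship_month):
    """Expected latent defects at ship time, assuming defect
    discovery follows a Rayleigh curve. All parameters here are
    illustrative."""
    a = 1.0 / (2.0 * peak_month ** 2)
    found = total_defects * (1.0 - math.exp(-a * ship_month ** 2))
    return total_defects - found

ship_now = defects_remaining(500, 8, 12)     # release at month 12
ship_later = defects_remaining(500, 8, 15)   # work three more months
```

Comparing the two numbers is the essence of iterating on candidate ship dates until the projected latent-defect level meets your quality goal.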

Once we run a forecast, this is what it looks like. We've got our Gantt chart - a little bit more exciting for this project than the one I showed earlier. These are four development phases. The plan is the blue bar, and then we've started collecting actuals. This set of activities finished later than we thought, and so did this one. The actuals tell us where we are today; this little green arrow is the as-of date. The status report tells us the deviations down here. We've got our actuals, this is our plan, and the forecast will clearly give us a picture of where we're going to end up based on how we're performing today. And you can use this to negotiate. If the Productivity Index is not up to snuff, a lot of times it is not going to magically get better - you have to go with what you've got. In this situation, you might need to reduce the functionality. We can't get it all to you now, so let's do a two-release scenario: we'll get you the most important features now and deliver the rest in the next release. Hopefully you have some room to negotiate. You may not, but this will certainly help communicate the possibilities to the decision makers. More than anything else, you can visualize from the top of the hill what this looks like and communicate it to others. You can't really get that from a spreadsheet. Spreadsheets and detailed plans and tracking are good data, but not always in a digestible form.

Another good practice is to take what we call a snapshot of the tracking metrics each reporting period. These are four different months' worth of tracking up here, and these are the metrics that we're tracking. You can see that progression from green - hopefully staying green, but some of them not so much - and it shows you how things went downhill, so you're capturing that. It includes the traffic-light status every reporting period. Here you can select the charts for each reporting period that show the most important outcomes, good or bad - hopefully some good - and use this notes section to explain your findings and your recommended action. You could point out what was going on this month with the two charts and some descriptive data here, and then you can see the trends over time. It provides an audit trail. Too often, I think, project plans aren't formally updated or re-baselined, so this kind of history can be extremely valuable.

You've seen one example of a forecast. The power of all of the SLIM tools is the ability to quickly and easily explore a range of potential outcomes and compare them. Don't settle for one potential solution - run multiple forecasts. The actuals are what the actuals are, but we can run what-if scenarios and log multiple forecasts. This leads to the next slide, but it goes along with this slide too: we can turn a forecast into the plan. Once you decide which forecast is best - the one that makes the most sense to you based on the weights you gave the metrics - you're going to turn that into the new plan. You can continue to do your analysis, and maybe a month from now you might want to replan again. Hopefully things aren't fluctuating that much, but the possibility is there. What might be different about each forecast? Well, the reporting period, for one. It's going to be as of a certain date - either a more recent forecast or an older one - and the forecast assumptions would be different. Again, that idea of let's see what happens if we reduce the scope or add a couple more people to the project. You can decide which metrics or which indicators carry the most weight in your forecast. You probably want one that's optimistic, because you know your boss is going to ask you about achieving those aggressive goals. You could run a forecast that's optimistic and one that's pessimistic, and then maybe have the one that you really favor as your recommended forecast.

The last step is to make the forecast the current plan. You can see here we don't have that blue line anymore - it's over here, because this particular chart is designed to show all three. But here it shows that the history of the current plan is the actuals, and then we're just using that green forecast line moving forward. We baselined this plan, and this is what we expect to happen as of today.

Software project management is hard. When I was a project manager, I heard colleagues say that project managers get all the responsibility and none of the authority. But hopefully taking the steps that we've talked about today helps stakeholders understand the situation and helps them buy in, or at least fosters some successful negotiations for better outcomes. To keep your project heading in the right direction, you want to:

·         Base plans on known capabilities.

·         Include product and quality metrics as part of the metrics you're tracking and looking at. Those are the real progress indicators.

·         Capture actual data at regular intervals. We're only talking five or fewer metrics.

·         Create some management assessment reports with charts if you can. Communicating well with others goes a long, long way.

·         Forecast to complete and include what-if scenarios. Include at least a handful, then take that trusted forecast and make it your new plan.

Thanks again for your time and attention today. I hope that some of this was helpful to you. I'll be happy to answer any questions if you have any.

Question: Do your tools integrate with JIRA or any other software management packages?

Yes. We have an API for SLIM-Control and SLIM-Estimate as well. We've built a prototype of an integration between SLIM-Estimate and JIRA, and a full integration between SLIM-Estimate and Clarity, the PPM tool. QSM is happy to work with you to do the same for SLIM-Control. Often, creating an Excel integration using a data export from one of those tools is a good place to start.

Question: You were talking about productivity earlier, on basing your plans on your known capabilities. How do you measure productivity?

That is a good question. SLIM's Productivity Index is unique to our approach. It comes from the original work by Larry Putnam, Sr. and his development of the software production equation. Basically, it's a measure of the overall efficiency of your development environment. It accounts for the influences of factors like your tools and methods, the technical complexity of the work that you're trying to do, and management style. It is calculated from completed projects: we just need to know the actual software size when the project is over, how much calendar time you spent, and how much effort; then we can calculate the Productivity Index. If you want to know more, send us an e-mail. We also have several articles on the website.
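To illustrate the calculation just described: under a simplified form of Putnam's software equation, a raw productivity parameter falls out of the three completed-project numbers. The figures below are invented, and SLIM maps this raw parameter onto its small-integer Productivity Index scale via a published lookup table that is not reproduced here.

```python
def productivity_parameter(size, effort_pm, duration_months):
    """Back out a raw process-productivity parameter from a
    completed project, using the simplified software equation
    Size = PP * Effort^(1/3) * Duration^(4/3).
    SLIM reports a Productivity Index derived from this parameter;
    the conversion table is omitted here."""
    return size / (effort_pm ** (1.0 / 3.0)
                   * duration_months ** (4.0 / 3.0))

# hypothetical completed project: 100k size units, 94 person-months,
# 12 calendar months
pp = productivity_parameter(100_000, 94, 12)
```

Because the parameter is backed out of completed projects, it captures whatever your environment actually achieved - tools, methods, complexity, and management style all rolled into one number.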

Question: You talked earlier about running multiple forecasts, or about what-iffing. Can you give some examples of the types of things you might vary, or why you would do this, with the optimistic and pessimistic forecasts? Can you just expand on that a little bit?

Sure. You know, the software size is pretty important, and early in the estimating and planning process we may not have as much data as we'd like in our project estimate and plan. We may be off on the scope a little bit, so that's one of the first things I would look at. OK, I think we're going to deliver, let's say, 2,000 story points - maybe that's the estimate we were working with, or a certain number of stories. That's my plan. But when I forecast, I would look at the actuals: maybe as of this point in the project we should have completed 300, let's say, and we've completed 250, so we're slightly behind. My first forecast would be just sticking with that estimate, but I might also bump it up a little bit to see what might happen if we underestimated, or perhaps we have to add more work. I could also run one that says we've only done 250, but I think we understand the application pretty well - why don't we run a forecast based on the scope being smaller than we planned? I could do the same kind of scenario for staffing. We have a small development team at QSM, and we know what our plan should be, but people get diverted to other projects and we don't always get everybody working that we need to. So I could run a what-if where we don't get everybody. Or, if you are behind and you think you can remedy that with a bigger staff, then you would run that scenario. Kind of common-sense stuff.

Question: Does your tool support earned value analysis?

It does. It's kind of a secondary feature we have in there, because a lot of project managers are used to it. We talked about tracking and forecasting based on different metrics in SLIM-Control. You pick the metric that you think best represents the breakdown of the tasks to be completed, or the size - whatever you think is the best size measure. Both the plan and the actual data have to have been entered through the current date; then the Earned Value Wizard computes the metrics that earned value is built upon.
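The wizard itself is SLIM's, but the indicators earned value is built upon are the standard ones from earned value management. A minimal sketch of those formulas:

```python
def earned_value_metrics(pv, ev, ac):
    """Standard earned-value indicators (not SLIM-specific):

    pv -- planned value (budgeted cost of work scheduled)
    ev -- earned value  (budgeted cost of work performed)
    ac -- actual cost   (actual cost of work performed)
    """
    return {
        "SV":  ev - pv,   # schedule variance (negative = behind)
        "CV":  ev - ac,   # cost variance (negative = over budget)
        "SPI": ev / pv,   # schedule performance index (<1 = behind)
        "CPI": ev / ac,   # cost performance index (<1 = over budget)
    }

# Hypothetical status: planned 300 units of value by now,
# earned 250, at an actual cost of 280.
m = earned_value_metrics(pv=300, ev=250, ac=280)
```

An SPI below 1 flags a schedule slip and a CPI below 1 flags a cost overrun, which is exactly the early-warning role the stoplight charts play in tracking.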

Question: About the trade-off forecasts. Does your tool consider the timing of the changes you're making?

Yes. In other words, if you add the same number of staff at different points in the project, does it have the same effect on the delivery date?

Well, each forecast has an as-of date, and they would be independent. So let's say we have 5 people and I want to run a forecast that says, what if I bumped it up to 8? That would be one forecast, and I would log and save it. And then maybe next month we have 6, but we're not at eight, and I might want to play with it again. Is that what you mean? They're just independent forecasts, and your actuals are going to be the most reliable. Is that answering what you're asking?

Well, what I was thinking, this is Kate, is that one of the neat things with SLIM-Control, and I use it every month to monitor our software development project, is that it lets you visualize things - it gives you a visual representation, a way of predicting what will happen if you do various things. And one thing we've seen a lot, which our data (the QSM industry database) backs up, is that corrections made early on - it's very much like when you're driving a car - corrections made early in the project can have a much larger impact. That's where those stoplights come in, warning you when something's coming off track so that you can make those early course corrections. For instance, you might be able to get away with adding a person or two at the beginning of the project. If you wait until almost the end of the project and add one or two people, most of what you're going to see is a hit to your quality, because those people don't have the time to get up to speed and actually become familiar with the project. So often what you see is a spike in mistakes - what we call defects.

Excellent point. Yeah. Thank you for that.

Actually, I had one more thought.

Go for it.

The other thing that's maybe not intuitive is that a lot of times the reason management adds people to a project is that they're trying to bring in the schedule. That can work on smaller projects with an early course correction, because again, there's enough time for those new staff members to get up to speed. When you do it late in the project, sometimes you can actually push the project out even later: not only do you not get any schedule compression, but you end up later than you would have been if you had just kept going the way you were. And that's one of the nice things about SLIM-Control - it can show you that in real time, because it has all of this trade-off behavior, which is inherent in the data, built into the tool.

Great. Thanks. Well, if there aren't any more questions, again, I really appreciate your time today. I hope that this was valuable. As we indicated at the beginning, we will make the recording available. Let us know if you have other questions - you can send us an e-mail. Have a good rest of your day. Thanks again, folks. Appreciate it.