Archive for July, 2011

20 Questions to Understand What Your Vendor Means by “Re-Architect”

Friday, July 29th, 2011

I think the term “re-architect” needs to be stricken from our vocabulary.

Whether you’re building a home or a system, the architecture is the first thing you decide upon.  It’s fundamental.  Just as the architecture for a home lays out the blueprint for the physical structure, the architecture of a system lays out the blueprint from which components and functionality are developed. Before the first batch of concrete is poured or the first line of code is written, you put that blueprint in place, and from it every piece of the home or system is defined and developed.

With large-scale systems like policy administration, every line of code is written with that architecture in mind and it is core to everything in the system.  The architecture defines the very essence of the system – it is beyond “foundational”.  Once you go down an architectural path, you might be able to make changes to it, but there’s no re-doing the architecture unless you throw everything out and start from scratch.

So when vendors throw around the term “re-architect”, I think it can be very misleading.  “Re-architect” seems to imply that they have applied some fundamental, structural changes to the software without compromising the system itself; yet to truly “re-architect” anything would mean a complete overhaul of the architecture that defines it. So it’s kind of a misnomer.  If they had actually rewritten the system, they would say “we did a complete 100% rewrite from the ground up.”  If they didn’t rewrite it, then there is NO WAY the system is not compromised by its legacy architecture.  When you’re purchasing a system – large or small – there’s a fair amount of legacy architecture and legacy code that comes with the package in almost every case.  You may be OK with that, but you need to have a good appreciation for the extent of it.

Along the lines of finding the “freshness date” of the system’s architecture – as well as just doing some good due diligence – here are 20 key questions I think you need to be asking your vendor as part of the assessment:

  1. When was the last time you “re-architected”?  What parts were not rewritten and why?
  2. Why was the “re-architecting” deemed necessary? And what was the overall impact to the system in terms of % of code rewritten?
  3. What year was the oldest code in your system written?  (I suggest you have the vendor make that a contractual representation, so you get an honest answer.)
  4. For your policy administration system, do you have a monolithic or component-based object structure within your code?  If component-based, do your components reference each other within the code?  Are there database joins that go across your components?  (See the short code sketch after this list for what that kind of coupling can look like.)
  5. How many lines of code does your software contain, by language?  (The purpose here is to gain an understanding of the ratio of newer code to legacy or older code.)
  6. How many customers have converted or committed to converting to this new architecture? For those who have converted, what was the average project duration?
  7. How many lines of code did you write for each of your last three new client implementations?  How long did those implementations take?  (Make sure you can reconcile the lines of code to the duration and that it passes the “smell” test – in other words “if there’s that little code, why did it take so long?”)
  8. Assuming coding changes are required (and they will be), what restrictions are there on using external consultants?   Here the vendor will likely represent that you don’t need to make changes to the core system – do not under any circumstances accept that answer.
  9. How many developers do you have supporting clients offshore and onshore on this platform today?  (Given the low amount of programming you will be told is required, make sure you can reconcile this answer to the answer in #7.)
  10. Am I forced to use your proprietary rules engine?  How easy is it to swap out?
  11. What contractual guarantees are you willing to make about maintaining this platform and delivering future releases?
  12. What is the biggest implementation (# of policies) anyone is running on one of the past two releases of the platform we are buying?  (Work to make sure that the answer IS for the platform you are buying.)
  13. List your current clients for the platform I am buying and the version of the platform each is running.   (Accept that some clients can’t be named, but insist on a complete list.)  Exactly what functionality are they using, and what is the product mix?
  14. Can I have the attendee list from your last user group meeting?  This IS NOT highly confidential and they should be able to give it to you.
  15. Who was the last policy admin customer to go into production for individual life?  Individual annuity?  Which released version did each of those companies use?
  16. How long did it take your last policy admin customer to go into production (from initial requirements through production)?
  17. Over the past five years, how many companies selected your platform and started projects but never ended up going into production?  (This is information you can learn for yourself in the industry.)
  18. How many policy administration customers do you have that haven’t made a significant number of code changes for customization?  Please name them.
  19. How many policy administration customers have done production upgrades to a newer version over the past five years?
  20. How many customers have terminated their maintenance agreements in the past three years?
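
To make question 4 a little more concrete, here is a minimal sketch of the two coupling signs that question probes for: components that reference each other directly in code, and SQL joins that reach across component boundaries. The Java below is purely illustrative – the class and table names are invented and it is not any particular vendor’s code; the two classes sit in one file only so the sketch stands on its own.

// Hypothetical illustration only: invented class and table names.
// Imagine PolicyService living in a vendor's "policy" component and
// BillingAccount in its "billing" component.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

class BillingAccount {                          // notionally the billing component
    private final String policyId;
    BillingAccount(String policyId) { this.policyId = policyId; }
}

public class PolicyService {                    // notionally the policy component

    // Sign 1: the policy component constructs a billing class directly,
    // so the two "components" cannot be versioned or replaced independently.
    public BillingAccount accountForPolicy(String policyId) {
        return new BillingAccount(policyId);
    }

    // Sign 2: a single SQL statement joins tables owned by different
    // components, so the database schemas are coupled even where the code is not.
    public double outstandingPremium(Connection conn, String policyId)
            throws SQLException {
        String sql =
              "SELECT b.amount_due "
            + "FROM policy p "                  // policy component's table
            + "JOIN billing_account b ON b.policy_id = p.policy_id "  // billing component's table
            + "WHERE p.policy_id = ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, policyId);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getDouble("amount_due") : 0.0;
            }
        }
    }
}

In a genuinely decoupled design, the policy component would ask the billing component for that amount through an interface or service call, and each component’s tables would stay on its own side of any join. The more often you find code like the above, the more the “components” are components in name only – and the more the legacy architecture is still calling the shots.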


John Gorman

FAST

How Do You Define “Good Productivity” for Software Developers?

Friday, July 22nd, 2011

When I started running in my twenties, I was using a treadmill and doing 10-minute miles for three or four miles.  I felt pretty good about that.  Then I ran a five-mile race and found out that 8-minute miles over a longer run were kind of the minimum for a pretty good recreational runner, so that’s what I ran after that.

Now my son runs high school cross country, and 5½-minute miles for a 5k (3.11 miles) is pretty much where you need to be to be competitive.  Benchmarks are set, and each year new freshmen come in and shave minutes off their times to meet the established standards.

Basically, how good “good” is comes down to perspective and benchmarks.  Measuring speed, of course, is very easy and very objective.  There are teammates and years and years of records to compare against, so it’s easy to say what good is.

With software development, we really don’t have a great way of measuring “good” productivity.  Sure, we can measure quality and ability to meet commitments, but how do we know the productivity was good?  And how do you incent people to be good?

How can we tell the difference between the developer who runs 10-minute miles (and looks good doing it) and the guy who runs 5-minute miles?  How do we reward them?

Do you ever feel like your developers are walking at a 20-minute-mile pace and passing it off as “running”?


John Gorman

FAST