I've been thinking about the projects I've worked in and with, and how they compare to each other in terms of engineering practices and how the software is developed. I noticed that when thinking and talking about these things it's quite hard to quantify the level of sophistication a specific project has reached, and there are no really good measurements for it.
Sure, CMMI is one way of quantifying this, but to be a bit unfair it's more aimed at consultants in suit and tie than at developers or managers wondering how sophisticated their development environment is. Also, CMMI and certifications like ISO are way too document centric instead of people centric. By this I mean that the level of sophistication you're judged to be at is often based on how nicely written the procedures in your binders are. I'd rather have this quantification based on what you actually do in your project, and the level should be determined by how far you've embraced good engineering practices and agile thinking.
So I started thinking about this and came up with some examples of different levels, somewhat like the Nokia Test for Scrum is a litmus test for whether you're doing Scrum or not. Note that these things shouldn't be read as a prescriptive "do this or else you're doing it wrong", but rather as a pointer to how far you've come in optimizing your software development process.
Level 1 - Ad-hoc
- Source code is on a shared server or a dev's box
- Releases are done on an ad-hoc basis and are made from private builds on the developers' machines
- Deployment is done by replacing files on a server or changing code on the server by hand
- The debugger is often used when developing new code
- Small or no thought is given to application design or architecture
- Bugs are tracked in an Excel or text document
Level 2 - Managed
- Source code is handled in an SCM tool
- Some test code is written; mainly smallish unit tests and/or some error-prone integration tests
- Automated testing is done with a point-and-click tool (e.g. Mercury QTP)
- Test code often has dependencies which aren't provisioned in the test setup code (e.g. database contents); see the sketch after this list
- Test coverage is measured
- The debugger is used when bugs arise
- Everyone in the technical team sits together in an open room
- Bugs are tracked in some sort of system (Bugzilla, Team Foundation Server, etc.)
- Thought is given to splitting major releases into iterations
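To make that test-dependency point concrete, here's a minimal sketch of the alternative, assuming JUnit 4 and an in-memory HSQLDB (the CustomerRepositoryTest name and the customer table are made up for the illustration): the test provisions its own database contents in the setup code instead of relying on whatever happens to be lying around in a shared database.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import static org.junit.Assert.assertTrue;

public class CustomerRepositoryTest {

    private Connection connection;

    @Before
    public void provisionTestData() throws Exception {
        // Use an in-memory database so the test owns its own data
        // instead of depending on the contents of a shared server.
        Class.forName("org.hsqldb.jdbcDriver");
        connection = DriverManager.getConnection("jdbc:hsqldb:mem:testdb", "sa", "");
        Statement stmt = connection.createStatement();
        stmt.execute("CREATE TABLE customer (id INT PRIMARY KEY, name VARCHAR(50))");
        stmt.execute("INSERT INTO customer VALUES (1, 'Alice')");
        stmt.close();
    }

    @Test
    public void findsProvisionedCustomer() throws Exception {
        Statement stmt = connection.createStatement();
        ResultSet rs = stmt.executeQuery("SELECT name FROM customer WHERE id = 1");
        assertTrue(rs.next());
        rs.close();
        stmt.close();
    }

    @After
    public void tearDown() throws Exception {
        // Shut the in-memory database down so the next test
        // starts from a clean slate.
        connection.createStatement().execute("SHUTDOWN");
        connection.close();
    }
}
```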
Level 3 - Continuous improvement
- A long-term SCM plan exists (covering more than two major releases, plus minor and patch releases)
- The team uses TDD
- There is a high percentage of test case coverage
- Tests are both on unit level and integration level
- Deployment to test environments is automated
- Incoming defects are validated by writing tests; see the sketch after this list
- The domain expert works closely with the team, and meets with them regularly
- Development is done in iterations
- A retrospective is held after each iteration
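To show what I mean by validating incoming defects with tests, here's a minimal sketch, again assuming JUnit 4; the PriceCalculator class and the defect report are made up for the illustration. The test is written to reproduce the reported bug, fails until the fix is in place, and then stays in the suite as a regression guard.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class PriceCalculatorDefectTest {

    // Hypothetical production class, inlined here to keep the
    // sketch self-contained. The defect was a wrong rounding mode
    // (RoundingMode.UP instead of HALF_UP).
    static class PriceCalculator {
        BigDecimal applyDiscount(BigDecimal price, BigDecimal discount) {
            return price.multiply(BigDecimal.ONE.subtract(discount))
                        .setScale(2, RoundingMode.HALF_UP);
        }
    }

    // Hypothetical defect report #1234: a 25% discount on 99.95
    // came back as 74.97 instead of 74.96. Write the test first,
    // watch it fail, then fix the production code.
    @Test
    public void bug1234_discountOn9995IsRoundedCorrectly() {
        PriceCalculator calculator = new PriceCalculator();
        assertEquals(new BigDecimal("74.96"),
                     calculator.applyDiscount(new BigDecimal("99.95"),
                                              new BigDecimal("0.25")));
    }
}
```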
Level 4 - Optimizing
- When test case coverage decreases, questions are raised during the daily standup meeting
- There is Osmotic Communication within the team (both business and IT representatives)
- Acceptance tests are automated
- Smoke tests are run after the (automated) deployment; see the sketch after this list
- The debugger is frowned upon
- Deliveries are made frequently, so there is seldom a need to develop a patch
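And for the smoke test item, a minimal sketch assuming JUnit 4 and a plain HTTP check; the URL is made up and would be a pipeline parameter in real life. The point is only to verify that the freshly deployed application is alive at all; the real functional checks belong in the automated acceptance test suite.

```java
import java.net.HttpURLConnection;
import java.net.URL;

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class DeploymentSmokeTest {

    // Hypothetical URL of the environment the automated deployment
    // just pushed to; in a real pipeline it would be a parameter.
    private static final String BASE_URL = "http://test-server:8080/app";

    @Test
    public void applicationRespondsAfterDeployment() throws Exception {
        HttpURLConnection connection =
                (HttpURLConnection) new URL(BASE_URL + "/health").openConnection();
        connection.setConnectTimeout(5000);
        connection.setReadTimeout(5000);

        // A smoke test only checks that the application is alive;
        // functional verification lives in the acceptance tests.
        assertEquals(200, connection.getResponseCode());
        connection.disconnect();
    }
}
```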
Now, dear reader (if you've come this far), what do you think of this idea? Is it something worth developing a bit more? Are there more than four levels? Should we divide this horizontally into different categories ("test", "requirements gathering", etc.)?
2 comments:
I think you're on to something here - a more formalized set of criteria for maturity is almost necessary to gain wide acceptance for Agile practices in many enterprises, and these criteria seem relevant.
..although, level 2: "debugger is used", level 4: "debugger is frowned upon"? A bit contradictory. :)
Personally, I've never used a debugger (in fact I've only used IDEs for the last 4 years, in an almost 10 year career) - I write tests to reproduce bugs whenever I come across them.
I don't even know how to use the Eclipse debugger, and I have no interest in finding it out..
Well, I've noticed (at least in the Microsoft camp, where I've got the most experience) that people tend to use the integrated IDE debugger frequently. As you get more and more experience, though, you tend to wield other swords and use testing as the primary way of finding defects. Hence the transition: 1 - new features = debugger -> 2 - bugs = debugger -> 4 - the debugger is frowned upon. But your mileage may vary, especially if you move between different dev platforms.
Eclipse debugger? Start the server in debug mode, set a breakpoint, switch to the debug perspective and you're all set. :)