I've been a computer programmer for over 20 years now, enduring framework after framework, changes in technology, changes in how we should develop, being told this is better, or that is better. Sometimes I really get fed up with it.
Experienced programmers just want to develop a system or application the way we know it should be done, without having new tech forced on us just for the sake of it - especially while we are developing that project. I, for one, want to produce an end result on time, and one that I'm pleased with and confident in.
The state of development projects never seems to get any better either. How many failed I.T. projects have been highlighted over the years, costing governments millions and taking years to complete or getting scrapped?
Surely, with the rise in various methodologies, one of them must work? Or maybe we are doomed because the fundamentals of I.T. development are just wrong?
Traditional Factories
Looking back at my days as a Mechanical Engineer (some 25 years ago), things were different. Working for a manufacturing company that developed railway components, I saw that the correct processes and procedures were in place to produce components that were developed and delivered on time and were properly quality controlled.
In a factory, a product or component is not developed on the fly. It has gone through a process of R & D, been quality controlled, reviewed and revisited, and only after that is it adopted as viable for inclusion in larger projects or sold as a product in its own right.
Where I worked, each part of the process was timed, with Route Cards showing the process of taking a raw piece of material through each defined step to finally end up with a finished product. Each part of the process required a Job Card, allowing the individual worker to indicate the time taken for their bit of the work. All this was fed back into the EFACS system (does this exist any more?) and used to more accurately estimate production times etc.
In a factory, if a new product is being designed, then the R & D to do that is a project in itself and can be time boxed if necessary.
The I.T. Factory
In I.T., we don't have the same rules. We must develop a system, include new technologies/techniques/frameworks as we go (often just for the sake of it), take on more complexity and still give accurate estimates on how we are tracking and whether we are on time. Naturally, we can only guess, and quite often we are miles away from the true figures.
We are expected to do the R & D while we develop, be agile, handle change, work with Project Managers who add their spin on how things should work and, worse, tell us one thing and tell the project sponsors another...
It's no wonder that projects fail. And even when a project doesn't fail, how much of that software is re-written for version 2? And how much has to be re-written because the business process has changed?
The funny thing is, the end users of the software don't care whether it uses this framework or that framework or technology. They just want it to work, make their jobs easier and help them perform their role more efficiently.
Instead, they end up with software that is already creaking at the seams, is slow, buggy, does not make their lives any easier and sometimes makes them worse.
So, how can this be fixed?
1) Start dedicating resources to R & D projects. This will allow us to focus on new technology in a controlled manner, which can be time boxed and give the development team a chance to understand how 3rd party frameworks should be used.
2) Don't do R & D inside a delivery project.
3) Use the skills of the team. If they need to learn something for the next project, then it must be recognized that time needs to be allocated for that learning.
4) Learn from environments that work and have been working efficiently for years.