The software engineering world is in danger of repeating the mistake it made with objects two decades ago.
Back then there were legacy "structured" languages like C and Ada, and new exciting "object oriented" languages like Smalltalk and Eiffel. C++ was promoted as a "middle way" that let you "choose the best tool for the job". This made the pure OO languages look extremist. So, it was argued, if you had a problem best solved by structured programming you could do that, and if you were doing an application with objects in it then you could use those. It also meant that your old C programmers could pick up the tool and start using it immediately without having to relearn how to design a program.
"Aversion to Extremes" is a well-known cognitive bias, and these arguments play up to it, but of course it didn't work well in practice. OO features didn't dovetail neatly with the existing structured features, leading to a combinatorial explosion in the rules defining how the various features interacted. The mess was not helped by experienced structured programmers who felt they should use the sexy new OO features; the result was often a conventional structured design with some random virtual functions sprinkled around.
The book "Industrial Strength C++" is a case in point. It is basically a catalogue of C++ language features that interact in dangerous ways. http://www.amazon.com/Industrial-Strength-Recommendations-In...
Today we have the same story happening again. On one hand we have legacy OO languages like Java and C++, and on the other hand we have functional languages like Scheme and Haskell. So along come "hybrid functional" languages like Scala which basically make the same promise as C++: if your problem has lots of objects then you can carry on doing the same OO designs you know and love, but if you think that these magic first-class functions would be useful in some complicated algorithms then you can use those as well. And it's going to fail for the same reasons that C++ failed: the OO and functional features don't interact well, so we are going to have lots of messy rules about them that cause subtle bugs, along with attempts by OO programmers to use chains of map and filter functions that work inefficiently because the compiler can't optimise them. And in ten years' time there will be an "Industrial Strength Scala" book consisting of a long list of features that should be avoided if you want reliable software.