Here are some nice posts by Hung Jung Lu on comp.lang.python, 2004/03.

...

I use C++ and Python every day. Let us be fair and point out some good things about each of them.

(a) In a compiled language like C++, changing function prototypes and variable names is comfortable, because the compiler will find all the spots that you need to change. In Python, you do not have the same level of comfort. Sure, there are other techniques, but it's different from clicking a button.

(b) Cameron said something very true in my opinion: for large projects, you want Python. But he said so without giving more details. So let me add some comments.

In my opinion, the essence of software development is code/task factorization. It seems such a trivial concept, but if you really think about it, goto statements, loops, functions, classes, arrays, pointers, OOP, macros/templates, metaprogramming, AOP, databases, etc., just about every single technique in programming has its basis in the concept of code/task factorization. Take classes and inheritance, for instance: basically, you factor out the common parts of two classes and push them up into a common parent class. To go one level deeper, my belief is that at the bottom, all human intellectual activities are based on factorization: no more, no less.
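To make the classes-and-inheritance case concrete, here is a minimal Python sketch (the class names are invented for the example): the name-handling code that would otherwise be duplicated in two classes is pushed up into a common parent.

```python
class Animal:
    """Parent class holding the code factored out of Dog and Cat."""
    def __init__(self, name):
        self.name = name

    def greet(self):
        return "I am " + self.name


class Dog(Animal):
    def speak(self):
        return self.name + " says woof"


class Cat(Animal):
    def speak(self):
        return self.name + " says meow"
```

Without the parent class, both Dog and Cat would carry their own copy of `__init__` and `greet`; the inheritance is nothing but the factorization made explicit.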

In large projects, you'll find that you need to factor out even more. Let us take an example. Suppose you write an application, and later on you realize that you need to make it transactional: that is, if some exception happens, you want to roll back the changes. This kind of major afterthought is terrible for languages without metaprogramming capabilities: to add the new feature, you will have to make modifications in hundreds or thousands of spots. Another example: suppose your software is versioned; moreover, you have different versions for the application and for the data file format, and your application needs to work with legacy file formats. Again, without metaprogramming capabilities, your code will have many redundant lines, or be cluttered with tons of if-statements or switch-statements. A similar problem: you have several different clients that buy your application, and they each want some different extra features. Again, without metaprogramming, your code will either be hard to write (using virtual functions, function pointers, and/or templates in C++) or be cluttered with if-else and switch statements (a terrible practice that will make your code unmaintainable).
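As one hedged illustration of how the transactional afterthought can be retrofitted in Python without touching every call site: a single decorator (the `transactional` name and the snapshot-based rollback here are invented for the sketch, not a standard library facility) can be applied to each method in one spot, or injected into whole classes via a metaclass.

```python
import copy
import functools

def transactional(method):
    """Roll back the object's attributes if the method raises.

    One decorator line per method replaces hand-written rollback
    code scattered over hundreds of call sites.
    """
    @functools.wraps(method)
    def wrapper(self, *args, **kwargs):
        snapshot = copy.deepcopy(self.__dict__)   # save state before the call
        try:
            return method(self, *args, **kwargs)
        except Exception:
            self.__dict__.clear()
            self.__dict__.update(snapshot)        # roll back on failure
            raise
    return wrapper


class Account:
    def __init__(self, balance):
        self.balance = balance

    @transactional
    def withdraw(self, amount):
        self.balance -= amount
        if self.balance < 0:
            raise ValueError("insufficient funds")
```

A failed `withdraw` raises, but the balance is restored to its value before the call; the rollback logic lives in exactly one place.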

As your project grows more and more complex (it becomes threaded, there are many new client requirements, support for legacy versions, distributed computing in a cluster, etc.), you will realize more and more that you need to factorize efficiently; otherwise your pain will be unbearable.

When you have reached that point, you'll come to appreciate simplicity and purity in a language. Frankly, Python is good but still not good enough.

For large projects, if you use a rigid language, then your best bet is to use tons of programmers coding trivial interfaces and APIs to make up for the shortcomings of the language. In flexible languages like Python, you can often use metaprogramming features to factor out the common areas. At that point, I think that issues like automatically finding name changes, as I mentioned in point (a), become small issues, because you will have bigger concerns. The fact that you may miss a name change or a function header change is not the thing that will kill you. The fact that your entire system is unmaintainable is the thing that will kill you. Don't look at individual bugs when you are talking about large projects, because your worry should not be there: your worry should be focused on how to make your system maintainable. Bugs can and will be fixed. But if your language does not allow you to factorize efficiently, at the end of the day, that's what's going to kill you.

regards,

Hung Jung

and:

Harald Massa:

 >> Can you explain to me in easy words, why it is NOT possible to integrate 
 >> prototypes into Python to stand "side by side" with classes?

It is possible, but you will not be able to retroactively apply it to many existing objects. You will only be able to do things with your new customized objects.

For instance, there is a class called module, and in Python you cannot add attributes to it. Similarly, there is the metaclass type, and you cannot add attributes to it or insert hooks into it.

Either you start afresh with prototypes from the ground up, or you won't be able to modify the behavior of existing Python objects.
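The point is easy to demonstrate at the interpreter: instances of your own classes are open to new attributes, but the built-in module type and the metaclass `type` refuse them with a TypeError.

```python
import types

class Mine(object):
    pass

m = Mine()
m.extra = 42          # your own instances are open: this works fine

# The built-in module type itself, however, is closed:
try:
    types.ModuleType.extra = 42
    module_is_open = True
except TypeError:
    module_is_open = False

# So is the metaclass `type`:
try:
    type.extra = 42
    type_is_open = True
except TypeError:
    type_is_open = False
```

Both assignments to the built-in types fail, which is exactly why a prototype mechanism cannot be retroactively grafted onto existing system objects.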

I believe there have already been some previous attempts along the lines of what you have described.

 >> I never had a problem to "add an attribute" to an existing object; I really 
 >> can't see why it should be more than some small hacks to allow "adding a 
 >> function to an existing object". 

Sure, adding an attribute to your own objects is not an issue. Adding attributes to, and modifying the behavior of, other people's objects is the issue. These "other people's objects" include system objects and objects created by third parties.

The "other people" often include yourself. It is hard to explain. Maybe I can suggest reading my previous posting:

http://groups.google.com/groups?q=g:thl1486542349d&dq=&hl=en&lr=&ie=UTF-8&oe=UTF-8&selm=8ef9bea6.0403260837.72a8fade%40posting.google.com&rnum=27

There are quite a few software development needs that one only discovers when one goes to large projects, with various components, maybe even in different languages.

It is only when things get complex that you wish you had a clean and pure foundation. When your projects are small, deficiencies and impurities in your language don't matter too much.

-----------------------

I think the current way OOP is taught is kind of bad. The lectures start with definitions of classes, inheritance, virtual functions, etc.

As I have mentioned a few times in this mailing list, software engineering, and all human intellectual activities, ultimately come down to factorization (of code, or of tasks). From simple algebra to supersymmetric quantum field theory, it has been so. From goto statements to OOP to metaclasses to AOP to prototype-based, it has been so.

Instead of starting with dogmas and axioms, people can probably better focus on factorization and how it happened. People didn't just invent OOP or prototype-based languages out of the blue, nor did they come up with database normalization rules out of the blue. People arrived at these devices because they observed: (1) similar tasks or code spots all over the place, that is, they discovered a symmetry, a repetitive pattern, which often was becoming painful to deal with; (2) they then figured out a way to factorize the code or organize the tasks, so as to factor out the common part and make their lives less painful.

It's only after (2) that a new concept or technology gets invented, and from then on people know that in the future they can start right away with the new approach, instead of writing the same code in two spots and later having to factor it out.

------------------

I often don't know how to take it when I see people talking about OOP in terms of definitions like polymorphism, data hiding, etc., as if those definitions were of utmost importance. To me, OOP is just a tool for factorizing code, just like using for-loops and functions to factor out repetitive code. Polymorphism, data hiding, etc. are all secondary features: code factorization is the heart and soul of OOP. Class-based OOP is one way of factorizing. Prototype-based is just another way of factorizing, which seems more elegant: instead of two concepts (classes and instances), you unify them and have only one concept (objects). Moreover, in a prototype-based language like Io, even scopes and objects are unified.
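For readers who have never seen a prototype-based system, a rough sketch of the flavor can be written in Python itself (the `Proto` class and its `clone` method are invented for this illustration): there are no classes in the model, only objects that clone other objects and delegate attribute lookups to their prototype.

```python
class Proto(object):
    """A toy prototype-style object: no classes in the model,
    only objects delegating attribute lookups to a parent object."""

    def __init__(self, parent=None, **slots):
        self.__dict__['_parent'] = parent
        self.__dict__.update(slots)

    def __getattr__(self, name):
        # Called only when normal lookup fails: delegate to the prototype.
        if self._parent is not None:
            return getattr(self._parent, name)
        raise AttributeError(name)

    def clone(self, **slots):
        """Make a new object whose prototype is this one."""
        return Proto(parent=self, **slots)


animal = Proto(legs=4)
dog = animal.clone(sound="woof")   # 'legs' is found via delegation
```

Changing `animal.legs` later is immediately visible through `dog`, because `dog` holds no copy: it simply delegates. One mechanism (objects plus delegation) does the work that classes and instances split between them.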

In C++, many new programmers get confused about the usage of macros, templates (generics in Java and C#), and multiple inheritance (mix-ins). Sure, they are harder to read. But behind each device, the simple and ultimate goal is nothing but code factorization. Metaprogramming in Python? The same thing.
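The mix-in case makes the "nothing but factorization" claim very visible. A hedged Python sketch (class names invented here): all the comparison operators are factored into one mix-in, and any class that supplies a single `_key` method inherits the whole set.

```python
class ComparableMixin(object):
    """Mix-in factoring all rich comparisons out of client classes.
    A class only needs to provide _key() to get them all for free."""

    def __eq__(self, other):
        return self._key() == other._key()

    def __lt__(self, other):
        return self._key() < other._key()

    def __le__(self, other):
        return self._key() <= other._key()


class Money(ComparableMixin):
    def __init__(self, cents):
        self.cents = cents

    def _key(self):
        return self.cents
```

Every further class that mixes in ComparableMixin avoids re-writing the same three operators; the multiple-inheritance machinery is just the delivery vehicle for that factored-out code.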

A CS professor friend of mine once said: "all problems in CS are solved by just one more level of indexing," which has been very true in my experience. I would like to say further that if someone truly understands factorization and applies it at every moment, then he/she should not only be awarded a Ph.D. in CS but perhaps also a Nobel Prize.

Hung Jung