Wednesday, November 17, 2004

Boa's _custom_classes

Shame on me: if I had bothered to read the help, I could long ago have started subclassing wx controls "inside the Boa GUI builder".

class wxFrame1(wxFrame):
    _custom_classes = {'wxTreeCtrl': ['MyTreeCtrl', 'MyOtherTreeCtrl']}

    def _init_utils(self):
        pass

python static typing should be protocol adaptation

This post is about this link

Glad to hear that. As for me, I'd love it if function arguments, only,
could bear an 'as' clause (making 'as' a keyword is long overdue):

def x(y as foo, z as bar):
    body of x

translating into:

def x(y, z):
    y = adapt(y, foo)
    z = adapt(z, bar)
    body of x

where builtin 'adapt' does protocol adaptation. On the other hand, if
the semantics of those 'as' clauses were something like:

def x(y, z):
    if type(y) is not foo: raise TypeError, 'y must be foo'
    if type(z) is not bar: raise TypeError, 'z must be bar'
    body of x

then that's the day I move to Ruby, or any other language that remained
halfway sensible (if any).

Of course, adaptation is already possible using decorator functions, and with PyProtocols one can also have all kinds of contracts; the simplest would be function dispatch conditions, like:

def foo(x):   # when some condition on x holds
    ...

def foo(x):   # when some other condition holds
    ...
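The decorator route mentioned above can be sketched in plain Python. The names `adapts` and the minimal `adapt` below are my own illustrative choices, not PyProtocols API; a real `adapt` in the spirit of PEP 246 would also consult protocol-side adapters and a registry, which this toy version skips:

```python
def adapt(obj, protocol):
    # Minimal stand-in for PEP 246's adapt(): pass the object through
    # if it already satisfies the protocol, otherwise ask the object's
    # own __conform__ hook, otherwise give up.
    if isinstance(obj, protocol):
        return obj
    conform = getattr(obj, '__conform__', None)
    if conform is not None:
        adapted = conform(protocol)
        if adapted is not None:
            return adapted
    raise TypeError('cannot adapt %r to %s' % (obj, protocol.__name__))

def adapts(*protocols):
    # Decorator: adapt each positional argument to its protocol before
    # the call, like the proposed 'def x(y as foo, z as bar)' syntax.
    def decorator(fn):
        def wrapper(*args):
            return fn(*[adapt(a, p) for a, p in zip(args, protocols)])
        return wrapper
    return decorator

@adapts(str, int)
def x(y, z):
    return y * z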

But anyway, I believe some syntax in the language for this kind of thing would make people actually use it more often.
I believe there are lots of people coming to Python from C-, VB-, or Java-like languages who have a harder time using these features, while others coming from Lisp, who have had them all since forever, will say: finally, oh God, what took you so long.
Designing with unfamiliar concepts is never going to be easy. To put this in paradoxical terms, nobody ever loves anything unless they already do.

Thursday, November 11, 2004

Creepy Polar Express - Uncanny Valley

This post is about this link

Uncanny Valley

Monday, November 08, 2004


This post is about this link

Loredana Groza on

abstractions save time working, but not learning

This post is about this link

Today I read this great article again and somehow I can't take this out of my mind:

So the abstractions save us time working, but they don't save us time learning

And all this means that paradoxically, even as we have higher and higher level programming tools with better and better abstractions, becoming a proficient programmer is getting harder and harder.

Computers are useless. They can only give answers

Evan Jones:
"Computers are useless. They can only give answers" - Pablo Picasso

Nice motto; however, I guess they can make humans raise questions too. But the topic of what makes humans raise questions seems insanely hard to reason about. And what counts as a valid new question, and what is merely a reformulation, is another such topic.

It's just me being bored and having nothing better to do.
But is it ever different, in such topics, anywhere?

Sunday, November 07, 2004

Bloom Filters

This post is about this link

The first thing for me would be to implement a quick exists(txt) over a given huge collection of texts. From there I can start searching.
Maybe the vector could be a "hash table" with indexes into the collection as values: provided the hash functions have a very low collision ratio between themselves, the bits in the vector would then point to indexes in the collection instead of just seen/not-seen, and we could have a very fast search.
It would also make a nice implementation of an "e-mail and beyond" black-list/white-list.
The only question is how you come up with k different hashing mechanisms within a limited space like (1..m), and what the best k and m are for a given length of your collection. I guess the answer is: God bless the mathematicians.

Update: Now that I think a little more about it, the hash-table idea is all blabla, since m would probably need to be as big as the collection length, or somewhere near it, to keep overlapping from happening, and so it cannot be fixed from the start, which I believe was the whole point of this (there are k different hashes precisely to minimize the impact of a given, relatively small m).
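The plain Bloom filter itself can be sketched quickly. The class and method names are mine; deriving the k "different" hash functions by salting one strong hash is a common trick, and for n stored items the mathematicians' standard answer to the best-parameters question is k ≈ (m/n)·ln 2:

```python
import hashlib

class BloomFilter:
    """An m-bit vector with k hash functions, each derived by salting
    one strong hash -- one answer to getting k different hashing
    mechanisms within a limited space like (1..m)."""

    def __init__(self, m, k):
        self.m = m
        self.k = k
        self.bits = [False] * m

    def _indexes(self, text):
        # Salt the input with 0..k-1 to get k independent-looking hashes,
        # each reduced into the (0..m-1) bit range.
        for salt in range(self.k):
            digest = hashlib.md5(('%d:%s' % (salt, text)).encode('utf-8'))
            yield int(digest.hexdigest(), 16) % self.m

    def add(self, text):
        for i in self._indexes(text):
            self.bits[i] = True

    def exists(self, text):
        # A False answer is always correct; True may be a false positive.
        return all(self.bits[i] for i in self._indexes(text))
```

Note that `exists` is exactly the quick membership test discussed above: fast and compact, at the price of occasional false positives.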

Wednesday, November 03, 2004

http pull / push

This post is about this link

For push:
the server sends Content-Type: multipart/x-mixed-replace and then sends each part whenever it wants.
The browser will keep waiting for parts and present each part as a different page.
-in ASP, do Response.Flush for each part and "sleep" until the next part?
-is CGI OK, since the scripts will stay alive for so long?
-how is the server to cope with many such requests? which is better: separate processes (CGI, out-of-process ASP) or in-server-process scripts (pooled ASP)?
-what about Apache: is CGI or mod_python better?
-what about third parties like Webware?
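The push side described above can be sketched as a CGI-style Python script. The boundary string, the part bodies, and the delay are arbitrary choices of mine; real code would need error handling for clients that disconnect:

```python
import time

BOUNDARY = 'push-boundary'   # arbitrary boundary string

def push_parts(out, parts, delay=0):
    # One multipart/x-mixed-replace header, then each part in turn;
    # the browser replaces the page with every new part it receives.
    out.write('Content-Type: multipart/x-mixed-replace;'
              ' boundary=%s\r\n\r\n' % BOUNDARY)
    for body in parts:
        out.write('--%s\r\n' % BOUNDARY)
        out.write('Content-Type: text/html\r\n')
        out.write('Content-Length: %d\r\n\r\n' % len(body))
        out.write(body + '\r\n')
        out.flush()            # flush so the client sees the part now
        if delay:
            time.sleep(delay)  # "sleep" until the next part
    out.write('--%s--\r\n' % BOUNDARY)

if __name__ == '__main__':
    import sys
    push_parts(sys.stdout,
               ['<html><body>Update %d</body></html>' % n
                for n in range(3)],
               delay=1)
```

The explicit flush after each part is the CGI equivalent of the Response.Flush idea for ASP: without it, output buffering would hold all the parts back until the script exits, defeating the push.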

How does one do incremental updates of only one part of the page?
Maybe via JavaScript window.XMLHttpRequest and document.write or element.innerHTML?
Maybe the MIME part is some JavaScript in a hidden (1-pixel) frame that updates the rest of the page (frames)?

It kind of starts to smell already, but I should give it a try and eventually look for more info. For now I think I'll stay with pull via JavaScript XMLHttpRequest or HTML <meta refresh>.