2006-11-14

Current reading

Following the announcement of Mozilla Tamarin - the adaptation of Adobe's ActionScript virtual machine to give JIT compilation of JavaScript in Gecko applications - and the starting up of the SmallTalk.rb project to get Ruby running on a better VM, I've been doing a bit of reading:

StrongTalk - fast, typed Smalltalk
- interestingly, unlike ActionScript, the typing is orthogonal to the speed optimisations
- everything is implemented as mixins: functions which add features to classes (sketched below)
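
In JavaScript terms, the mixin-as-function idea looks roughly like this - a toy sketch with made-up names, just to show the shape of it:

    // A mixin is just a function that adds features to a class (here, to a constructor's prototype).
    function Enumerable(klass) {
      klass.prototype.map = function (fn) {
        var out = [];
        for (var i = 0; i < this.length; i++) out.push(fn(this.item(i)));
        return out;
      };
      return klass;
    }

    // A class with no inheritance relationship to anything; the mixin composes the feature in.
    function ResultList(items) { this.items = items; this.length = items.length; }
    ResultList.prototype.item = function (i) { return this.items[i]; };
    Enumerable(ResultList);

    new ResultList(["a", "b"]).map(function (x) { return x.toUpperCase(); }); // ["A", "B"]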

Mirrors - a mechanism for separating reflection from conventional object APIs
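
Again in JavaScript rather than Smalltalk, and not the real API, the separation amounts to something like this:

    // Reflection lives in a separate mirror object, so the domain object's own API stays clean,
    // and reflective access can be controlled (or remoted) by controlling who gets mirrors.
    function ObjectMirror(target) { this.target = target; }
    ObjectMirror.prototype.propertyNames = function () {
      var names = [];
      for (var name in this.target) names.push(name);
      return names;
    };
    ObjectMirror.prototype.get = function (name) { return this.target[name]; };

    var account = { owner: "alice", balance: 42 };
    var mirror = new ObjectMirror(account);
    mirror.propertyNames(); // ["owner", "balance"] - code holding only 'account' never sees this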

JS Typechecking via JS-Sourcerer, or a more formal approach whose notation I only half know
- JS-Sourcerer doesn't have a model for the interfaces to XUL, and doesn't accept IDL to define them (though it should be possible to write a converter from IDL to its proprietary type definition language)
- there's also a more complete modal type model; I need to investigate more of the theory behind it
- at the least you can infer types at any execution point, and then assume that changes to objects or prototypes relax those assumptions
- I need to find out whether most type inference systems for dynamic languages are Prolog-like, and whether a Rete-like system would be more suitable
- that's assuming the cost of the inference is actually worth it in execution terms; it may be better to do some pattern matching, then let constraint violations push execution back to the interpreter (sketched after this list), so choosing an inference engine suited to the pattern matching may be more important
- Type Feedback vs. Concrete Type Inference - maybe TI in the bytecode generator, then TF in the VM?
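
To make that fallback idea concrete, here's a hand-waving sketch of what I mean - a specialised fast path guarded by the types that were inferred or observed, bailing back to a generic path standing in for the interpreter when the guard fails (all the names here are invented):

    // Guarded fast path: assumes both arguments are numbers, as inference or feedback suggested.
    function addFast(a, b, bailout) {
      if (typeof a !== "number" || typeof b !== "number") {
        return bailout(a, b);        // constraint violated: push execution back to the interpreter
      }
      return a + b;                  // specialised code, no generic dispatch
    }

    // Generic path standing in for the interpreter's slow-but-correct semantics.
    function addGeneric(a, b) { return a + b; }

    addFast(2, 3, addGeneric);       // stays on the fast path
    addFast("2", 3, addGeneric);     // guard fails, falls back to the generic path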

And, in other news, there's the re-branding of the Semantic Web as Web3.0. Personally, I've been using Web3 for community-driven 3D on the web. Let's see if RDF hacking gets as popular as World of Warcraft machinima or Virtual Life bartering.


TME


2006-04-25

Art, state of, one year on from last year (the).

Previous frustrations with XMLHttpRequest - and, more recently, finding that DeltaV didn't appear to be supported even in Firefox at work - may be eased if a bit of sensible flexibility gets the W3C spec to conform to the HTTP RFC's extension-method = token rule rather than a vendor-specific white-list.
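
What I'd like to be able to rely on is simply passing a DeltaV method name through XMLHttpRequest and having it go out on the wire, rather than it depending on a white-list - roughly the sketch below, with a made-up URL:

    // Whether this works currently depends on the browser's method white-list,
    // not on HTTP's extension-method = token rule.
    var req = new XMLHttpRequest();
    req.open("VERSION-CONTROL", "http://example.org/repo/page.svg", false);
    req.send(null);
    // The other DeltaV methods - REPORT, CHECKOUT, CHECKIN, MKACTIVITY - hit the same restriction.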

I'd still really like a browser-based, version-controlled graph drawing tool for modelling and knowledge capture. But with the WhatWG's canvas and the SVG support in Firefox stable enough that I'm writing production code based on them, and the real possibility of single-page applications such as this wiki using Amazon's Simple Storage Service, I'm thinking of retiring the Java-based, server-side image code of my LapisBlue, which I never got round to connecting to a versioned store anyway.
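
The client-side rendering that replaces the server-side image code is nothing exotic; assuming a canvas element with id "diagram", something like this is enough for boxes-and-arrows drawing:

    // Draw one edge and two nodes of a graph on a WhatWG canvas.
    var ctx = document.getElementById("diagram").getContext("2d");
    function drawNode(x, y, label) {
      ctx.beginPath();
      ctx.arc(x, y, 20, 0, Math.PI * 2, true);
      ctx.stroke();
      if (ctx.fillText) ctx.fillText(label, x - 15, y + 4); // text support arrived later than paths
    }
    ctx.beginPath();
    ctx.moveTo(60, 60);
    ctx.lineTo(180, 120);
    ctx.stroke();
    drawNode(60, 60, "model");
    drawNode(180, 120, "view");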

So I'm thinking of retiring LapisBlue, since I'm paying monthly for a full-featured server solution that's not getting any use, whereas I can pay for a tiny amount of data storage and get the clients to do the rendering work now. Though proper version control would be nice, saving deltas or labelled versions to S3 should also be possible, more fun than configuring a Tomcat installation that pulls in a thousand or so libraries, and not reliant on extension methods as Subversion's DeltaV implementation is. What you lose is a queryable database, but I'm thinking of using it for a pattern wiki rather than anything else.
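
Saving a labelled version could be as simple as a PUT to a key that encodes the page and the label - something like the sketch below, with a made-up bucket and key scheme, and with the AWS request signing (the Date header and the HMAC-SHA1 Authorization header) left out for brevity:

    // PUT one labelled version of a wiki page straight to S3 from the browser.
    // The bucket name and key layout are invented; a real request also needs
    // the signed "Authorization: AWS accessKey:signature" and Date headers.
    function saveVersion(page, label, content) {
      var req = new XMLHttpRequest();
      req.open("PUT", "http://s3.amazonaws.com/patternwiki/" + page + "/" + label, false);
      req.setRequestHeader("Content-Type", "image/svg+xml");
      req.send(content);
      return req.status === 200;
    }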

In other news, I got rather excited over the weekend thinking about using SSE for a faster 'byte' code interpreter, and resurrecting kin - my toy language for graph-matching-based code generators that generate simulation models defined generically on traits, which I'd partly implemented on the JVM - as a scripting language plugin for the Gecko platform. If you can SIMD the graph matching, and either SIMD the bytecode interpretation too or (since kin uses pure visitor functions a lot) mix SIMD-optimised blocks with scripting, you may get close to Java performance without having to track Sun's generics cruft.

It's still easier for me to write Java than C++, especially when you need to use libraries - each library having its own code conventions and memory management model - or Lisp for that matter (since I've done far more Java than Lisp in anger), but for many things JavaScript's good enough. The only things I've written in Java this year have been something to test an algorithm for work, which could have been written in anything really, and an annealing-based graph layout, which ran too slowly in JS to be usable. But annealing graphs is exactly the kind of problem kin would be suited to, and designed to parallelise, so it may be that the Java world gets even smaller for me.

I'm not sure how useful web-based simulation tools would be, and suspect that a good enough interpreter plus a really, really good code generator would be a better match for a lot of the problems I like thinking about than trying to do anything like Sun's HotSpot, brilliant though it is.

Third point of this summary - I'm also excited about building distributed clusters of collaborating applications and services on top of XMPP. It's something I've been pushing at work, and I've got enough of the infrastructure there that the rest of my team are starting to play with it, building models and connecting them to XUL UIs with XMPP pub-sub. I've got till mid-June to build it out to a full distributed system with service discovery, which means a mix of quite easy XML binding and some fairly hard concurrency work to get the simulators' execution model and the pub-sub code working well without excessive threads or critical sections.
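
The pub-sub side is plain XEP-0060; a simulator publishing a model update amounts to something like the sketch below, where the JIDs and node names are made up and sendStanza stands in for whatever the XMPP client library provides:

    // Publish a model update to a pubsub node; XUL UIs subscribed to that node get it pushed.
    function publishUpdate(node, payloadXml) {
      var stanza =
        "<iq type='set' to='pubsub.example.org' id='publish1'>" +
          "<pubsub xmlns='http://jabber.org/protocol/pubsub'>" +
            "<publish node='" + node + "'>" +
              "<item>" + payloadXml + "</item>" +
            "</publish>" +
          "</pubsub>" +
        "</iq>";
      sendStanza(stanza); // hypothetical: hand the stanza to the XMPP connection
    }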

Oh, and I'm going to XTech 2006 next month. It's nice to be working for somewhere that's not too stingy to send its people away again.


TME
