Swish introduction of Web 2.0

Link: http://www.youtube.com/watch?v=6gmP4nk0EOE


We are teaching the machine.
The machine is us.



8 ideas that will revolutionize the 21st century - Ben Hammersley

Link: http://lesblogs.vpod.tv/2005/12/15h4516h45_8_id.html

Which, to put the meat before the fish, are:

  • Information wants to be free

  • Zero distance

  • Mass amateurisation

  • More is much more

  • True names

  • Viral behaviour

  • Everything is personal

  • Ubiquitous computing

His early historical examples of technologies which met a limit - horses and swords - were portrayed as self-limiting. The last Pattern sabre was better than the previous Pattern sabre, which in turn was the best cavalry sword of its day. You don't see them much in wall displays, as they're not very pretty. You could make a better sword still, but someone invented trench warfare and the machine gun. So swords are now only decoration or sporting equipment, where their evolution is intentionally limited by the Fédération Internationale d'Escrime. Similarly, you could engineer better horses by genetic enhancement and composite augmentation of the skeleton, but we have cars and planes for transport, and what you can do to race horses is limited by the relevant authorities.

The history of these technologies is not that they hit an internal limit, but that they were made obsolete by other technologies, became items of nostalgia and leisure interest, and had their evolution consciously limited by aesthetic concerns or sportsmanship.

Comparing these technologies to 'information technology', Hammersley put up a log/lin exponential graph showing Moore's law, but then admitted to not knowing what 'exponential' actually means.
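To pin down what 'exponential' does mean here: a quantity that doubles over some fixed period, so it plots as a straight line on a log/lin graph. A sketch, using the commonly quoted illustrative figures (the Intel 4004's 2,300 transistors in 1971, a roughly two-year doubling period) rather than anything from the talk:

```javascript
// Moore's law as commonly paraphrased: transistor count doubles every
// fixed period. Exponential growth means count = base * 2^(elapsed/period).
function transistorCount(baseCount, years, doublingYears) {
  return baseCount * Math.pow(2, years / doublingYears);
}

// 4004's 2,300 transistors (1971), projected 30 years at a 2-year doubling:
const projected = transistorCount(2300, 30, 2);
console.log(projected); // 75366400 - roughly the right order of magnitude
// for a circa-2001 CPU (the Pentium 4 had about 42 million)
```

The point of the log/lin plot is exactly this: each extra doubling period multiplies, rather than adds, a fixed amount.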

You cannot compare swords to 'information technology'. Even the Wikipedia Moore's Law illustration shows five different technologies just in the computational hardware domain - a better comparison would be swords to valves, and 'war-fighting technology' to IT. People still use valves for niche applications - guitar amps, some retro kit - but the mass market moved on to a different technology. Whereas we dropped as much explosive in the first three days of Gulf War I as in World War 2, so war-fighting is still on a pretty high up-curve (though I haven't enough meaningful data points to say whether it's exponential, or even whether it matters if it is).

What does differentiate information technologies from other technologies, as Hammersley points out, is that they augment the creation of technologies. Improvements in the sword relied on improvements in metallurgy, which relied in turn on the rapid industrialisation of the late 19th century. Although factories and foundries make the material for better factories and foundries, IT, and particularly media for open communication between otherwise disjoint groups, allows intellectual bootstrapping.

On the octet above, Hammersley stops his explanation after the first three.

He illustrates only the third - mass amateurisation - with a direct example: comparing his early experiments with cine film to current consumer-level video camera technology. Nowadays, you can get good at video at lower personal cost. That has more to do with the market than with democratising the technology - with a cine camera (or a tin-can), you own the technology, not Apple or Sony.

When considering the first, he points out that governments push back against it. This may be true, but technology streams churn faster than legislation. Censorship at a level that democracies will tolerate has to target a particular technology, and so has the effect of bringing forward its obsolescence.

Zero distance matters because specialist communities can form, which then mature into thought shops as the technologies they promote become commonplace - the evolution of the xml-dev mailing list is a case in point.

When talking about blogging, Hammersley says 'Blogging is all of these things'. Blogging, podcasting and videocasting are current technologies that illustrate these. Blogging seems to have out-competed BBSes, which out-competed email lists, for certain modes of discussion. Podcasting and video don't seem to promote discussion (this is a written reply to a video, not me on-camera), so IME have less of an effect on viral behaviour. That is different for non-techies - it's not unusual for something on myTube to have a video response - but you really do have to distinguish the creation of technologies from their adoption and their consumerisation.

Certain video blogs - such as the videocasts on Jon's Radio - may have an acceleration effect on technology creation, but the vast majority don't. You also cannot absorb technology any more quickly - watching Jon demonstrate something doesn't mean you don't have to learn how to use it.

One group of people - the developers of Microsoft's Outlook Web Access - invented XMLHttpRequest, which allows you to develop richer web applications. Seven years later, we have Ajax toolkits, many professional web application creators use it, and there are now companies offering simple Web2.0 hosting - the technology is moving from adoption by specialists into amateurisation. Personally, I'm bored already.
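For anyone who missed that transition, the whole Ajax wave rests on one small idiom. A minimal sketch of the period's cross-browser request creation - the `ActiveXObject` branch covers the older Internet Explorer versions where the OWA team's invention first shipped, and the URL is a placeholder:

```javascript
// The mid-2000s cross-browser idiom for obtaining a request object.
// XMLHttpRequest is the now-standard form; ActiveXObject("Microsoft.XMLHTTP")
// is the original IE incarnation. Returns null outside a browser.
function createRequest() {
  if (typeof XMLHttpRequest !== "undefined") {
    return new XMLHttpRequest();
  }
  if (typeof ActiveXObject !== "undefined") {
    return new ActiveXObject("Microsoft.XMLHTTP");
  }
  return null; // not a browser environment
}

// Typical Ajax use: an asynchronous GET with a callback, no page reload.
const req = createRequest();
if (req) {
  req.open("GET", "/some/resource", true);
  req.onreadystatechange = function () {
    if (req.readyState === 4 && req.status === 200) {
      console.log(req.responseText);
    }
  };
  req.send(null);
}
```

The toolkits mentioned above mostly exist to paper over exactly this kind of branching, plus the bookkeeping around `readyState`.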

I'm not aware of non-specialists who create technology (not all specialists are professionals). Exponential growth in videocasts may not affect the specialists - you require higher information density, and you have to learn your stuff. Being able to put a video blog together is nothing to do with being able to squeeze more transistors onto a chip, or even to create a video blogging service.

The scaling effect of ubiquitous communication will only accelerate technological change up to the point where those people creating technology are fully in communication with everyone they need to be. I'm already limited not by my ability to find the information I need to create, or people to talk with, but by my ability to absorb the information. I don't believe IT will provide much more acceleration in the creation phase. What viral communication does do is lower the time from adoption to consumerisation.

The 19th century engineers who pioneered the industrial revolution also believed that they 'were the flat-mate[s] of Leonardo da Vinci'; everyone is a child of their own renaissance. Since the early 20th century, it has been impossible for one individual to understand all that humans know of maths or of physics. I'm sure I understood a higher percentage of computing technology when I was a teenager than I do now - I've made a conscious decision to drop anything to do with low-level graphics hardware programming, which I did my MSc project in 14 years ago.

Our renaissance has a higher churn - Web2.0 will have a much shorter lifespan than the small-sword.

I can see two outcomes - either IT really is different, and we have a future in which more and more people rely on technologies further and further from the understanding of the average individual, or the history of the horse and sword will repeat itself, and the movement of technologies from professional into amateur use will constrain their evolution to whatever is democratically acceptable.




You can't buy icecream from Amazon.



Art, state of, one year on from last year (the).

Previous frustrations with XMLHttpRequest, and more recently finding that DeltaV didn't appear to be supported even in Firefox at work, may be resolved if a bit of sensible flexibility gets the W3C spec to conform to the HTTP RFC's extension-method = token production rather than a vendor-specific white-list.

I'd still really like a browser-based, version-controlled graph-drawing tool for modelling and knowledge capture. But with the WhatWG's canvas, and support for SVG in Firefox stable enough that I'm writing production code based on it, and the real possibility of single-page applications such as this wiki using Amazon Simple Storage Service (S3), I'm thinking of retiring the Java-based, server-side image code of my LapisBlue, which I never got round to connecting to a versioned store anyway.

So I'm thinking of retiring LapisBlue, since I'm paying monthly for a full-featured server solution that's not getting any use, whereas I can pay for a tiny amount of data storage and get the clients to do the rendering work now. Though proper version control would be nicer, saving deltas or labelled versions to S3 should also be possible, more fun than configuring a Tomcat installation that pulls in a thousand or so libraries, and not reliant on extension methods as Subversion's DeltaV implementation is. What you lose is a queryable database, but I'm thinking of using it for a pattern wiki rather than anything else.
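One way labelled versions could work without a versioning server: encode the version into the object key, so S3's lexicographically ordered listings give you version history for free. A hypothetical sketch - the key scheme and names here are made up for illustration, and the actual signed S3 PUT/GET calls are omitted:

```javascript
// Hypothetical key scheme for storing labelled wiki-page versions in S3.
// Keys sort lexicographically in S3 listings, so zero-padding the version
// number makes a prefix listing return versions in order.
function versionKey(page, version) {
  const padded = String(version).padStart(6, "0");
  // Escape the page name so '/' in a title can't collide with the
  // key hierarchy used for listing.
  return "wiki/" + encodeURIComponent(page) + "/v" + padded;
}

console.log(versionKey("FrontPage", 3));  // "wiki/FrontPage/v000003"
console.log(versionKey("a/b page", 12));  // "wiki/a%2Fb%20page/v000012"
```

Listing the prefix `wiki/FrontPage/` would then enumerate that page's versions oldest-first; the queryable-database gap mentioned above remains, since you can only list by prefix, not query by content.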

In other news, I got rather excited over the weekend thinking about using SSE for a faster 'byte' code interpreter, and resurrecting kin - my toy language for graph-matching-based code generators that generate simulation models defined generically on traits, which I'd partly implemented on the JVM - as a scripting language plugin for the Gecko platform. If you can SIMD the graph matching, and maybe also either SIMD the bytecode scripting or (since kin uses pure visitor functions a lot) use SIMD-optimised blocks with scripting, you may get close to Java performance without having to track Sun's generics cruft.

It's still easier for me to write Java than C++, especially when you need to use libraries - each library having its own code conventions and memory management model - or Lisp for that matter (since I've done far more Java than Lisp in anger), but for many things JavaScript's good enough. The only things I've written in Java this year have been something to test an algorithm for work, which could have been written in anything really, and an annealing-based graph layout, which ran too slow in JS to be usable. But annealing graphs is exactly the sort of thing kin would be suited to, and designed to parallelise, so it may be that the Java world gets even smaller for me.
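For reference, the annealing idea itself is tiny - the cost of the layout version is in evaluating an energy function over every node pair, not in the control loop. A minimal sketch (illustrative only, not the layout code referred to above), minimising a one-dimensional function by random perturbation:

```javascript
// Minimal simulated annealing: perturb the current state, always accept
// improvements, accept worsenings with probability exp(-delta/temperature),
// and cool the temperature geometrically so the search settles down.
function anneal(f, x0, steps) {
  let x = x0;
  let fx = f(x0);
  let temperature = 1.0;
  for (let i = 0; i < steps; i++) {
    const candidate = x + (Math.random() - 0.5);
    const fc = f(candidate);
    const delta = fc - fx;
    if (delta < 0 || Math.random() < Math.exp(-delta / temperature)) {
      x = candidate;
      fx = fc;
    }
    temperature *= 0.999; // geometric cooling schedule
  }
  return x;
}

// Minimise f(x) = (x - 3)^2 starting from 0:
const best = anneal((x) => (x - 3) * (x - 3), 0, 10000);
console.log(best); // close to 3
```

A graph layout replaces the scalar `x` with node positions and `f` with an energy summing edge-length and node-repulsion terms - which is why the inner loop, evaluated millions of times, punishes a slow interpreter.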

I'm not sure how useful web-based simulation tools would be, and suspect a good enough interpreter + a really, really good code generator would be a better match to a lot of the problems I like thinking about than trying to do anything like Sun's Hotspot, brilliant though it is.

Third point of this summary - I'm also excited about building distributed clusters of collaborating applications and services on top of xmpp. It's something I've been pushing at work, and I've got enough of the infrastructure there that the rest of my team are starting to play with it, building models and connecting them to XUL UIs with xmpp pub-sub. I've got till mid-June to build it out to a full distributed system with service discovery, which means a mix of quite easy xml binding and some fairly hard concurrency work to get the simulators' execution model and the pubsub code working well without excessive threads or critical sections.
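The topology is the interesting part, independent of the wire protocol. A sketch of the pub-sub shape - this is just an in-memory stand-in, not real XMPP; in the actual system the broker role is played by an XMPP server speaking XEP-0060 pub-sub, and all the names here are invented for illustration:

```javascript
// In-memory sketch of the pub-sub topology: simulators publish state to
// named nodes, UIs subscribe to the nodes they care about. Publisher and
// subscriber never reference each other directly.
function createBroker() {
  const nodes = new Map(); // node name -> array of subscriber callbacks
  return {
    subscribe(node, callback) {
      if (!nodes.has(node)) nodes.set(node, []);
      nodes.get(node).push(callback);
    },
    publish(node, payload) {
      for (const cb of nodes.get(node) || []) cb(payload);
    },
  };
}

// A model publishes telemetry; a stand-in for a XUL UI subscribes:
const broker = createBroker();
const received = [];
broker.subscribe("sim/telemetry", (msg) => received.push(msg));
broker.publish("sim/telemetry", { t: 0, value: 42 });
console.log(received.length); // 1
```

The decoupling is what makes the concurrency work above tractable: the simulator's execution model only ever touches the broker, never a UI thread.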

Oh, and I'm going to XTech2006 next month. It's nice to be working for somewhere that's not too stingy to send its people away again.

