Anthropomorphizing Technology

I’ve just read an extract of Clay Shirky’s Cognitive Surplus book in the Times, along with a very good interview with him and other web gurus. Unfortunately you have to pay to get Times articles these days (hmmm, ironic) but there’s a good review of it in the Guardian. There are lots of good videos on YouTube of him talking about the concept of cognitive surplus, so I encourage you to listen to them.

Clay Shirky

…anyhow, I could spend the rest of the year dissecting Shirky’s writing because I love his enthusiasm and agree with much of what he says, but what I wanted to get out in this post is the fact that people are really anthropomorphizing technology. Shirky himself does it, with particularly emotional prose about the internet and how using it in 1992 was an emotional experience for him (his brain flipped out!), his compatriots do it when they write about the internet and technology, and we’re all doing it as a society.

I was out drinking with Martin Weller the other week (always a bad idea) and we got to talking about the fact that friends of ours talk about a piece of technology with such irrational love and affection that to an outsider it seems bizarre, but to us it’s quite normal, even though we might not always share their love of a particular technology. Some people at the OU, for example, love FirstClass because we’ve used it at the OU since the mid-90s, and some feel a kind of ownership of it that others might not.

It’s not just ownership though, but a sense that the technology is life enhancing. Take the recent ‘buzz’ about the iPad. When all is said and done the iPad is not a big technological leap forward from tablet PCs, or indeed from Apple’s own iPhone, but it got to people emotionally in a way that I haven’t seen a technology do to the same extent before. It was slightly scary to see the reaction of some people to it, and how they talk about it as if it were a living, breathing thing.

I think there are two distinct patterns here.

1. A kind of addictive quality to new technology where it fills a gap that people never knew they had.

2. A sense of ownership and stakeholding for technology that has been around a long time and has given that person a wholesome experience over a sustained period of time so that they have become personally involved with the technology in a way they wouldn’t have imagined when they first saw it.

Both of these have parallels with relationship building: the instant attraction of new lovers, and the slowly growing deep love of long-term relationships.

…I don’t think that’s a coincidence.

Convergence v Specialism

I’m very interested in the trend, with devices such as the Xbox 360, towards a convergence of media types and delivery: the Xbox now supports Sky TV over broadband via Sky Player. Stephen Nuttall from Sky was quoted as saying: ‘Our partnership with Xbox is a further example of our commitment to put choice and control in the hands of customers.’

I’m particularly interested in the ‘blurring’, or perhaps integration is a better word, between the different media types. The idea of interactivity around watching a football match whilst downloading stats and interacting with other fans is cool; concepts around adding value to experiences through ‘back channel’ activities are becoming more prevalent, as are ‘on demand’ services.

I think the really interesting stuff will come when the boundaries between an interactive TV experience, a gaming experience and an internet experience all disappear, to the extent that they become platform neutral and coherent rather than bolt-on things. The announcement of the Boxee Box earlier this month is a step in the right direction; it really is opening up rich resources and putting power in the hands of users. It also means that you no longer need to get content ‘produced’ on a TV channel in order to reach a large audience: consumers become producers.

I’m very interested in using gaming technology and interactive TV in more powerful ways to develop engagement and learning; supported by the internet, they become extremely powerful tools.

kids, computers and change

I haven’t been blogging for a while because I’ve been involved in the logistics of moving a university department to a new building this week (think herding cats and you get the picture). Now that I’m back I’m going to make up for it with a bit of a stream of consciousness about loosely connected topics.

1. My kids have managed to ruin my computing at home by spilling water over the keyboard, which has taken out the ‘n’ key and the space bar. I tried using the on-screen keyboard and it’s like pulling teeth – worse than trying to do a full blog post via texting. I found it seriously hard work, so I’ve resorted to the laptop. It’s these small uphill battles that make my online experiences so erratic, or perhaps I’m just making excuses; other people seem to be ‘always on’, but in my experience power plugs, network connectivity, kids and various other things always get in the way. I talked to a colleague about the transient nature of my online stuff when we were trying out Plaxo recently. I was trying to get my Twitter friends into it and then tried using my Facebook contacts, but the interface started to annoy me: I was five seconds in without success and about five seconds away from giving up. So I do think that the ten-second rule that NN (Nielsen Norman), the usability gurus, used to apply to websites still holds (in my experience) for web apps. In 2006 the BBC had this down to four seconds for commercial sites selling goods.

2. Janet Street-Porter did a rant in her editorial in the Independent on Sunday about how all our details are being exposed and exploited by, for example, YouTube, and about studies showing that people who use the internet and social networks for long periods have trouble making real friends, so that relationships for the next generation are going to suffer. I hardly ever agree with JSP, and my views are significantly different from hers on this, but I do think that getting the public/private balance right on the internet is difficult. I tend to be very cagey about myself because I prefer to keep my private life a closed book; knowledge is power and you never know when that slip of the tongue might come back to bite you. Other people, however, are totally open, and I find this refreshing but also a bit disconcerting. I’m a very shy person and I expect that comes through in how I act online and choose to reveal myself in the virtual world. I don’t worry about how kids will deal with real relationships, by the way. They’re just finding new ways to communicate, not replacing the old ones but enhancing them.

Twitter trends

I see that Tony Hirst has added a post with some nice graphs showing trends in the potential growth of Twitter. What is it they say about lies, damned lies and statistics? It’s interesting to note trend data like this because things such as Second Life get big blips in popularity, and I think that happens when something new has been created within the space. Twitter shows steady growth, but as Tony says, the figures lie. Twitter is certainly filling a gap that existed, judging by the response to my recent post and the number of people who argued very strongly in favour of it.

A friend has recommended that I read a book called “The Future of the Internet and How to Stop It” (Jonathan Zittrain). There’s a review of it in this month’s BBC Focus magazine too. According to the review, he argues that the end of the internet as we know it will come from a lack of creativity, with people turning away from the web towards more ‘locked down’ solutions because of the lack of control and the prevalence of malware and viruses. I haven’t read it yet but it does sound like an interesting read. I’m off to get it.

My failing memory and fear of going outside

I have just started to use Remember the Milk, which is designed to track all those things that you always mean to do but never actually get around to doing. It’s got a good range of web 2.0 style integrations with Twitter, phones, Google Calendar and so on. My problem currently is that I’ve not had the time to remember to put reminders into RTM to remind myself to do stuff. I’m now having to put a reminder into RTM to remind me to use RTM to put stuff in! It must be my age.

Actually, one of the developers in my team (Nick) is using RTM and the RTM API to provide some elements of the Social:learn project. I think Social:learn is a fantastic concept in that it’s exploring methods of learning that are much less institutionally (provider) focused and much more learner focused, which is exactly how it should be. The investment is relatively lightweight at this stage as it’s largely gluing, adapting and sharing data across a series of existing tools and architectures, but the potential is huge if even one of these applications takes off, and I think it’s exactly the right approach to explore in the ‘post-VLE’ era, where people are less concerned about where they get information from than about what the data is and how it will help them.
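
To give a flavour of how lightweight that gluing can be, here’s a minimal sketch of calling the RTM REST API from Python. The API key, shared secret and auth token are placeholders you’d get from RTM, and this is just an illustration of the general pattern of signed REST calls, not a peek at Nick’s actual code.

```python
# Minimal sketch of a signed call to the Remember the Milk REST API.
# API_KEY, SHARED_SECRET and AUTH_TOKEN are placeholders.
import hashlib
import urllib.parse
import urllib.request

REST_URL = "https://api.rememberthemilk.com/services/rest/"
API_KEY = "your-api-key"        # placeholder
SHARED_SECRET = "your-secret"   # placeholder
AUTH_TOKEN = "your-auth-token"  # placeholder, obtained via RTM's auth flow

def sign(params):
    # RTM's api_sig scheme: sort the parameters by key, concatenate the
    # key/value pairs, prepend the shared secret, then MD5 the lot.
    raw = SHARED_SECRET + "".join(k + v for k, v in sorted(params.items()))
    return hashlib.md5(raw.encode("utf-8")).hexdigest()

def rtm_call(method, **extra):
    params = {"method": method, "api_key": API_KEY,
              "auth_token": AUTH_TOKEN, "format": "json"}
    params.update(extra)
    params["api_sig"] = sign(params)
    url = REST_URL + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

# e.g. fetch the task lists for the authenticated user
print(rtm_call("rtm.tasks.getList"))
```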

In the same vein, colleagues of mine are discussing moving wholesale away from using institutional systems to provide email, scheduling, document sharing and many other business functions, and instead using external providers (e.g. Google). The arguments against doing this are now being outweighed by the arguments for it. I still have some reservations though, so a group of us are going to explore this and work alongside the central services provider at the OU to see how well these things meet staff needs.

Here are some of the arguments against (in my opinion):

1. Stuff is less secure and more open to attack.

2. External providers can disappear or have services out of action when they’re needed.

3. External providers have no responsibility to maintain the (free) services for users.

4. The amount of space you get may not be adequate for your needs.

5. There is no “institutional branding” on emails etc. coming from external engines.

6. What happens if things go missing? There’s no backup or retrieval mechanism.

7. There is no (institutional) support for dealing with configuring external clients or services.

8. Stuff coming through external providers may be prone to interception or blocking.

9. It doesn’t integrate with other institutional services.

Here are my responses…

1. The security on external services is now as good as security internally. The bigger, more established players have invested far more time and money in ringfencing and securing data than any public sector institution could.

2. The datacentres used by most external hosting providers have levels of redundancy which, again, outstrip anything that could be provided by a single institution. Their businesses rely on keeping the services up 100% of the time, so they have massive contingency and failover options in place to ensure that individual parts can be removed without the service failing. (Matt Mullenweg, the guy who developed WordPress, gave a good talk on this very topic at the Future of Web Apps conference last year, which I went to.)

3. This is true, although advertising revenue helps ensure that providers have an incentive to maintain free services, and these are also how people get ‘hooked in’ to the next tier of services, so dropping them would be catastrophic. The big players also rely on the number of users they attract, so a failing free service would soon stop them from operating.

4. This is no longer true; in fact external providers can generally offer more space than any institutional service provider. Email is an example of this, with Gmail being vast in size compared to the meagre limit set by the institution (50MB?).

5. This can be worked around. You can rewrite the headers so that mail shows as coming from another account, and you can add the institutional signature to emails and so on. I think this is problematic though, and the header tricks can mean that your mails get trapped by spam filters. I would suggest, however, that it may not be the most terrible thing in the world if emails come from an account which doesn’t have the institutional domain. There have been occasions, for example, when local email has been down and the IT people here have all switched to Gmail, other mail providers and external IM systems in order to keep in touch and keep exchanges of information running.
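
As a rough illustration of that header work (the addresses, SMTP relay and credentials below are all placeholders), here’s what it might look like using Python’s standard library. Note that the From/Sender mismatch is exactly the kind of thing spam filters flag, which is the trap I mentioned.

```python
# Sketch: send via an external provider while presenting an institutional
# address. All addresses, the relay and the credentials are placeholders.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "j.bloggs@example.ac.uk"      # institutional address shown to recipients
msg["Sender"] = "j.bloggs@gmail.com"        # the account actually doing the sending
msg["Reply-To"] = "j.bloggs@example.ac.uk"  # keep replies on the institutional side
msg["To"] = "colleague@example.org"
msg["Subject"] = "Sent from an external provider"
msg.set_content("Body text...\n\n--\nJoe Bloggs\nExample University")  # institutional signature

with smtplib.SMTP("smtp.gmail.com", 587) as smtp:  # placeholder relay
    smtp.starttls()
    smtp.login("j.bloggs@gmail.com", "app-password")  # placeholder credentials
    smtp.send_message(msg)
```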

6. There can be backup and retrieval; I think it’s a matter of how you manage your account. You can, for example, get POP mail to keep a copy on the server while also downloading, so you have versions stored by your local mail client, and you can set up routines to do this automatically and periodically.
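
For instance, here’s a minimal sketch of such a routine in Python (the host and credentials are placeholders). POP3 only deletes messages when you explicitly ask it to, so retrieving them leaves the server copies intact; run this from cron or a scheduled task and you have your automatic, periodic backup.

```python
# Sketch: back up a POP3 mailbox to local .eml files, leaving the
# originals on the server. Host and credentials are placeholders.
import poplib
from pathlib import Path

HOST = "pop.example.com"     # placeholder
USER = "j.bloggs@example.com"
PASSWORD = "app-password"    # placeholder

backup_dir = Path("mail-backup")
backup_dir.mkdir(exist_ok=True)

box = poplib.POP3_SSL(HOST)
box.user(USER)
box.pass_(PASSWORD)

count, _size = box.stat()
for i in range(1, count + 1):
    _resp, lines, _octets = box.retr(i)  # fetches message i; does NOT delete it
    (backup_dir / f"message-{i:05d}.eml").write_bytes(b"\r\n".join(lines))

box.quit()
print(f"Backed up {count} messages to {backup_dir}/")
```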

7. I think empowering users to help themselves is always a good thing and takes the burden off IT support. External systems tend to be very easy to use and configure, in order to attract customers. I see that as a good thing for organisations.

8. There is some truth in this; however, it’s a manageable issue. I’ve heard of people not receiving mail they should have, and others being blocked or blacklisted because the mail server they use is on some blacklist. It’s manageable because if you find it happening you can do things about it: you can switch to another mail server, you can reduce the likelihood of your email account being used by spam bots, and you can watch what filters are being applied to incoming or outgoing mail.
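
The first diagnostic step if this starts happening is to check whether your mail server’s IP address is actually listed on one of the common DNS blacklists. A minimal sketch (Spamhaus’s ‘zen’ zone is just one example list):

```python
# Sketch: check an IP against a DNS blacklist (DNSBL). An IP is listed
# if the reversed-octet lookup under the zone resolves to an A record.
import socket

def on_blacklist(ip, zone="zen.spamhaus.org"):
    """Return True if `ip` is listed in the given DNSBL zone."""
    reversed_ip = ".".join(reversed(ip.split(".")))  # 1.2.3.4 -> 4.3.2.1
    try:
        socket.gethostbyname(f"{reversed_ip}.{zone}")  # any answer = listed
        return True
    except socket.gaierror:  # NXDOMAIN: not listed
        return False

# 127.0.0.2 is the conventional DNSBL test entry, so this should print True
# (assuming your resolver is allowed to query the zone directly).
print(on_blacklist("127.0.0.2"))
```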

9. I would say the opposite is true. Institutional services tend to be siloed, whereas internet services tend to have open APIs and talk happily to many other tools and services. If they don’t, you can build the integrators yourself.

Finally, the reasons FOR going outside…

1. Cross-browser, cross-platform, cross-system, cross-organisational access to services.

2. No issues or barriers to use.

3. More space to use and store data.

4. Better for sharing.

5. Doesn’t require a complex infrastructure to use (similar to 1, but slightly different in that I’m talking about dependencies and platform requirements, for example having to use a VPN from home and having the Office 200x suite installed on top of Windows).

6. Always available!

Corporate Authentication Systems (hell!)

I’ve been struggling recently with the ‘enterprise security system’ in place at the OU. This is some obscure system invented in-house (by sadists) to authenticate people against our systems.

It works OK most of the time, but it’s not standards-based. It doesn’t talk LDAP. It doesn’t talk to other authentication systems in any meaningful way. You need to set it up on every service you run. You need to set up ‘tokens’ in every directory of every web server where it’s installed to tell it who to allow in. And so on.
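
For contrast, here’s a minimal sketch of what standards-based authentication looks like, binding against a directory with the python-ldap library. The server URI and DN layout are placeholder assumptions, not the OU’s actual directory.

```python
# Sketch: authenticate a user with a simple LDAP bind. The server and
# DN structure below are placeholders, not the OU's real directory.
import ldap

def authenticate(username, password):
    """Return True if the directory accepts this username/password pair."""
    if not password:
        return False  # an empty password would perform an anonymous bind
    conn = ldap.initialize("ldap://ldap.example.ac.uk")  # placeholder server
    dn = f"uid={username},ou=staff,dc=example,dc=ac,dc=uk"  # placeholder DN layout
    try:
        conn.simple_bind_s(dn, password)  # the bind itself is the credential check
        return True
    except ldap.INVALID_CREDENTIALS:
        return False
    finally:
        conn.unbind_s()
```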

We have a myriad of great systems in the university, but they are hamstrung by the fact that we can’t do any kind of meaningful pass-through authentication. Luckily a colleague of mine has invented a mechanism for getting the system to work in harmony with OpenID, and we’re close to achieving a way of working with other systems more meaningfully in the future. I’m still very frustrated, though, because although the current system works reasonably well for people in the OU, there is no reasonable way of allowing ‘authenticated visitor’ or ‘logged-in public’ access. We can of course merge authentication systems for particular services (as I do), but this causes problems later when the same visitors want to access other OU services.
I’m not sure how much of a problem this is elsewhere, but I would guess that the lack of a decent authentication and user verification service has set the OU back several years in development time, because every new project with a mixed user community (OpenLearn being the most recent example) has to find some sort of individual workaround. Central services don’t see a problem because most of the services they provide are staff-only (or student-only), so it’s simple for them; anyone else doing development across user spheres just has to find their own solution.

Rant over. I’m off for a bath now!

Policing the internet..

This is a topic we covered as part of the “Future of Web Content” discussion, and I wondered how long it would be before things in the real world started to catch up. Not long, it turns out: we’ve now got proposals for policing the internet, specifically to find and remove people who may be illegally downloading music. There’s an article about it on the BBC News website.

Four things that interest me about the proposal:

1. ISPs say it will be impossible to realise. I think that’s in line with what I was suggesting in my piece on FOWC.

2. The method of removing service is interesting because it’s also a method I proposed for dealing with virus spreaders, except mine was more subtle: the idea of cutting them off gradually from the web. The problem, as pointed out today on the BBC, is that it’s indiscriminate, because if you cut off based on an IP address you cut off a whole family (or internet cafe, or library terminal, etc.); it’s not necessarily targeting the individual.

3. The internet is jammed full of kiddie porn, suicide websites and freakishly mad and deviant stuff, and it’s interesting that the first attempt in this country to police it ‘en masse’ comes about because fat cats are worried about losing their royalties from record sales. It’s driven by commerce, not by any kind of moral, social or conscience-driven imperative. I think this is quite shameful personally. I’m not against protecting copyright, but there are other methods of protecting it, and other things to police.

4. The people who do it will find a way around it within a few weeks, so it’s pointless.