Three tips for Microsoft about developing better software…

I’ll probably think of more, but these are my current three big gripes about using MS software…

1. When shutting down an operating system, assume that people want to save and close applications and then shut the computer down. Do not wait for a response from every open application or document. Please, PLEASE take the initiative: if a user requests that the computer shut down, the logical conclusion is that they want it shut down NOW! (“shutdown -h now”). By all means allow options for other shutdown behaviours, but make the default closing apps and shutting the darn system down without user intervention.

2. Don’t shove all the less obvious functions under a generic “Windows” icon. I don’t understand why everything went from sitting sensibly under menus to being shoved into one catch-all that requires you to navigate down a tree again. I don’t want to have to customise my app each time I buy a new machine or log in somewhere else; if I can’t have my customisations at all times, then I want a menu system that’s easier to use than previous versions, not harder.

3. Always assume that people use other tools as well as Microsoft ones. I applaud the opening up of the client side to cross-browser/cross-platform (see my blog post on Silverlight), but please consider that people may write and produce content in different tools. Back-end integration with other developers’ apps, and the ability to work with content that wasn’t developed in MS tools, would benefit Microsoft as well as everyone else.


My failing memory and fear of going outside

I have just started to use Remember the Milk, which is set up to handle all those things you always mean to do but never actually get around to doing. It’s got a good range of web 2.0-type integrations with Twitter, phones, Google Calendar, etc. My problem is that I haven’t yet had time to remember to put reminders into RTM to remind myself to do stuff. I’m now having to put a reminder into RTM to remind me to use RTM to put stuff in! It must be my age.

Actually, one of the developers in my team (Nick) is using RTM and the RTM API to provide some elements of the Social:learn project. I think Social:learn is a fantastic concept in that it’s exploring methods of learning that are much less institutionally (provider) focused and much more learner focused, which is exactly how it should be. The investment is relatively lightweight at this stage, as it’s largely gluing, adapting and sharing data across a series of existing tools and architectures, but the potential is huge if even one of these applications takes off. I think it’s exactly the right approach to explore in the ‘post-VLE’ era, where people are less concerned about where they get information from than about what the data is and how it will help them.
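For anyone curious what coding against the RTM API involves: every call has to carry an api_sig, which RTM documents as an MD5 of your shared secret followed by the request parameters concatenated in key order. Here’s a minimal sketch of that signing step (the key, secret and task values are placeholders, not real credentials):

```python
import hashlib

def rtm_sign(params: dict, shared_secret: str) -> str:
    """Compute the api_sig the Remember the Milk API expects:
    MD5 hex digest of the shared secret followed by every
    parameter name and value concatenated in sorted key order."""
    flat = "".join(k + params[k] for k in sorted(params))
    return hashlib.md5((shared_secret + flat).encode("utf-8")).hexdigest()

# Signing a hypothetical rtm.tasks.add call:
params = {"method": "rtm.tasks.add", "api_key": "KEY", "name": "buy milk"}
sig = rtm_sign(params, "SECRET")
```

The resulting sig gets appended to the request as the api_sig parameter; the nice thing is that the whole scheme is simple enough to glue into pretty much any tool.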

In the same vein, colleagues of mine are discussing moving wholesale away from institutional systems for email, scheduling, document sharing and many other business functions, and over to external providers instead (e.g. Google). The arguments against doing this are now being outweighed by the arguments for it. I still have some reservations, though, so a group of us are going to explore this and work alongside the central services provider at the OU to see how well these things meet staff needs.

Here are some of the arguments against (in my opinion):

1. Stuff is less secure and more open to attack.

2. External providers can disappear or have services out of action when they’re needed.

3. External providers have no responsibility to maintain the (free) services for users.

4. The amount of space you get may not be adequate for your needs.

5. There is no “institutional branding” on emails etc. coming from external engines.

6. If things go missing, there’s no backup or retrieval mechanism.

7. There is no (institutional) support for dealing with configuring external clients or services.

8. Stuff coming through external providers may be prone to interception or blocking.

9. It doesn’t integrate with other institutional services.

Here are my responses…

1. The security on external services is now as good as internal security. The bigger, more established players have invested far more time and money in ring-fencing and securing data than anything a public sector institution could manage.

2. The data centres used by most external hosting providers have levels of redundancy that again outstrip anything a single institution could provide. Their businesses rely on keeping services up 100% of the time, and they have massive contingency and failover options in place so that individual parts can be removed without the service failing. (There’s a good talk on this very topic that I went to last year at the Future of Web Apps conference, by Matt Mullenweg, the guy who developed WordPress.)

3. This is true, although advertising revenue gives them an incentive to maintain free services, and the free offerings are also how people get ‘hooked in’ to the next tier of services, so dropping them would be catastrophic. The big players also rely on the number of users they attract, so a failing free service would soon stop them from operating.

4. This is no longer true; in fact, external providers can generally offer more space than any institutional service can match. Email is an example of this, with Gmail being vast compared to the meagre limit set by the institution (50Mb?).

5. This can be got around. You can work on the headers so they show the mail as coming from another account, and you can add the institutional signature to emails. I think this is problematic, though, as rewritten headers can mean your mails get trapped by spam filters. I’d also suggest it may not be the most terrible thing in the world if emails come from an account without the institutional domain. There have been occasions, for example, when local email has been down and the IT people here have all switched to Gmail, other mail providers and external IM systems in order to keep in touch and keep exchanges of information running.
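As a sketch of the header approach described above: you can send from the external account but set a Reply-To pointing back at the institutional address and append the institutional signature. The addresses and names here are hypothetical, purely for illustration:

```python
from email.message import EmailMessage

# Hypothetical addresses for illustration only.
EXTERNAL_ADDR = "j.bloggs@gmail.com"       # the account actually sending
INSTITUTIONAL_ADDR = "j.bloggs@open.ac.uk" # the identity you want to present
SIGNATURE = "\n--\nJoe Bloggs\nThe Open University"

def institutional_message(subject: str, body: str) -> EmailMessage:
    """Build a message sent via an external account that still
    presents the institutional identity in headers and signature."""
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = EXTERNAL_ADDR
    msg["Reply-To"] = INSTITUTIONAL_ADDR  # replies go to the institutional account
    msg.set_content(body + SIGNATURE)
    return msg

msg = institutional_message("Minutes", "Attached are the minutes.")
```

A message built this way would then be handed to the external provider’s SMTP server as usual; the Reply-To keeps the conversation anchored to the institutional address.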

6. There can be backup and retrieval; it comes down to how you manage your account. You can, for example, set POP mail to keep a copy on the server while also downloading, so your local mail client periodically stores its own versions. You can set up routines to do this automatically.
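The ‘leave mail on server’ behaviour is just a matter of never issuing the POP delete command after retrieving. A minimal sketch with Python’s poplib (the server name and credentials are placeholders):

```python
import poplib  # stdlib POP3 client; needed only for a real connection

def download_keeping_copies(conn):
    """Fetch every message in the mailbox but never issue DELE,
    so the server keeps its copy ('leave mail on server')."""
    count, _size = conn.stat()
    messages = []
    for i in range(1, count + 1):
        _resp, lines, _octets = conn.retr(i)
        messages.append(b"\n".join(lines))
    # Note: no conn.dele(i) calls, so nothing is removed server-side.
    return messages

# Real usage (assumption: a POP3-over-SSL mailbox):
# conn = poplib.POP3_SSL("pop.example.org")
# conn.user("me"); conn.pass_("secret")
# backups = download_keeping_copies(conn)
# conn.quit()
```

Run on a schedule, something like this gives you a rolling local backup of a hosted mailbox without disturbing the copy the provider holds.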

7. I think empowering users to help themselves is always a good thing and takes the burden off IT support. External systems tend to be very easy to use and configure, since that’s how they attract customers. I see that as a good thing for organisations.

8. There is some truth in this, but it’s a manageable issue. I’ve heard of people not receiving mail they should have, and of others being blocked because the mail server they use is on some blacklist. It’s manageable because if you find it happening you can do something about it: switch to another mail server, reduce the likelihood of your address being harvested by spam bots, and watch what filters are being applied to incoming or outgoing mail.

9. I would say the opposite is true. Institutional services tend to be siloed; internet services tend to have open APIs and talk happily to many other tools and services. If they don’t, you can build the integrators yourself.
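As a sketch of what “build the integrators yourself” can look like in practice, here’s a minimal piece of glue code that pulls tasks from one service’s JSON export and reshapes them for another service’s calendar format. All the field names here are invented for illustration; real APIs document their own schemas:

```python
import json

def tasks_to_calendar(tasks_json: str) -> list:
    """Glue code: turn a task service's JSON export into entries
    for a calendar service. Field names are hypothetical."""
    tasks = json.loads(tasks_json)
    return [
        {"title": t["name"], "date": t["due"]}
        for t in tasks
        if t.get("due")  # only dated tasks belong on a calendar
    ]

# Example feed (hypothetical shape): one dated task, one undated.
feed = '[{"name": "buy milk", "due": "2008-05-01"}, {"name": "someday"}]'
entries = tasks_to_calendar(feed)
```

The point isn’t this particular transformation; it’s that when both ends speak open formats, the integration is a few lines of glue rather than a procurement exercise.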

Finally, the reasons FOR going outside…

1. Cross-browser, cross-platform, cross-system, cross-organisational access to services.

2. No issues or barriers to use.

3. More space to use and store data.

4. Better for sharing.

5. Doesn’t require a complex infrastructure to use (similar to 1, but slightly different in that I’m talking about the dependencies and platform requirements, for example having to use a VPN from home and having the Office 200x suite installed on top of Windows).

6. Always available!