One of the problems I've encountered is that most people (users and, to an extent, designers) are socially conditioned to believe that tasks should be difficult. They expect a learning curve that isn't always logical. It stems from childhood, when we don't understand something and are told "that's just the way it is".
Users have become conditioned to accept a hundred-page manual for a toaster oven because the underlying tasks are often complex, so they expect the method of accomplishing those tasks to require complexity too.
Take writing a letter. The part that actually counts (thinking of what you want to say, ordering it into sentences, etc.) is fairly difficult for most people, and learning to write at all takes years to master. So why should using a radically different paradigm (a word processor) be any different?
As for designers, ignoring the fact that we are usually designing for people unlike ourselves, we have to work within the existing framework of usability/acceptability.
Take the usability differences between Windows, Mac and Linux - the uproar if someone develops an application following the conventions of a different system is immense. Now imagine developing an entirely new way of working that is radically different but technically easier to use. Transition scares people.
At the moment I'm working on a research project looking into ubiquitous computing. One of the things I've found is that people:
- Can't focus on the task rather than the tools - they've been conditioned to believe the tool is the task.
- Find change, even change for the better, scary - again, the tool is the task.
I think what needs to happen is a gradual progression. Even if every developer in the world suddenly got turned on to the idea of HCI, the shock to the system would be incredible! Two examples:
- Windows - I find each version gets easier to use, but it also retains an (annoying) amount of legacy usability to help wean people into a new way of working.
- Palm Pilot - a fairly different approach to computing that fits symbiotically with existing technology.
Most schools separate students into either Code Monkeys or Grad Students. One day, everyone's first lesson in CompSci will be "Usability 101"... one day!
Every software module has a UI - how else could you write code that uses it? e.g. Poor UI - Foo.doThingthatDoesstuff(); Good UI - a name that tells the caller what actually happens, something like invoice.addLineItem(item);
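To make the contrast concrete, here's a minimal Java sketch. Foo, Invoice, and all the method names are invented for illustration - they're not from any real library:

```java
// Poor "UI": the name tells the caller nothing about what the module does
// or how it fits their task.
class Foo {
    static void doThingThatDoesStuff() {
        // ...mystery work...
    }
}

// Better "UI": the names match the caller's mental model of the task,
// so the code reads like the problem it solves.
class Invoice {
    private int totalCents = 0;

    void addLineItemCents(int cents) {
        totalCents += cents;
    }

    int totalDueCents() {
        return totalCents;
    }
}

public class Demo {
    public static void main(String[] args) {
        Invoice invoice = new Invoice();
        invoice.addLineItemCents(999); // $9.99
        invoice.addLineItemCents(500); // $5.00
        System.out.println(invoice.totalDueCents()); // prints 1499
    }
}
```

The point isn't the arithmetic - it's that a caller who has never seen Invoice can predict what each call does, which is exactly the "usable code" argument.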
It's the classic case of code readability and commenting. I look back on some of my earlier non-trivial programs and it gives me a headache to try to read them, mostly because my understanding of how I work was incomplete.
HCI doesn't just cover "end users" - it also covers the other users of your code. If you can teach people about mental models, it should drip through into their coding. If you look at undergrad group projects at university, much of the cause of failure is different people having different models of the problem. Convincing people to write usable code is the first step to having them write usable programs 🙂