30 March 2007

Screencast tutorials

Since bandwidth availability has improved, the last few years have seen more videos and screencam/screencast tutorials on websites, for both marketing and training purposes. They're getting easier to make and deliver, with more sophisticated purpose-built software, some of which is cheap or even free.

Tutorials featuring a video capture of screen activity have been a stock elearning item in the corporate training arena for a long time now, and software like Flash is used to build components that allow scaffolded interaction to create a 'show-me' or 'let-me-try' environment for employees learning to use company software systems.

In Higher Education, bite-sized screen activity tutorials can help students learn to use new tools, such as statistics software, a wiki or a learning management system.

Different types of screencast tutorials

  • Screen video only - this 'show me' approach gives a view of an expert user's desktop as they demonstrate a task or process. This can be done with or without narration, but generally benefits greatly from a synced voice-over explanation.

  • Enhanced screen video - this captures user activity as above, but helps the learner focus on the important points by including highlights of certain screen elements, explanatory balloons, arrows, etc. overlaying the video itself.

  • Interactive tutorial - the 'let me try' approach guides the learner through a protected process of actually interacting with the software to be learned. The learner can click buttons or control selected screen elements to move through a task, but only certain options are available at any one time and they may be following specific instructions about what to do next. Learning by doing leads to better retention, and the screencast tutorial allows it to happen in a 'safe' directed environment where wrong choices don't lead them into trouble. Check out the excellent article on varying levels of interactivity in simulations/tutorials by web-based learning author William Horton.

  • Narrated tutorial - This can be video demo or interactive. The narration describes or guides the user through the process being taught. Narration is generally a very effective approach as it contextualizes the screen activity as it happens, helps to focus the learner on what's important and helps sustain learner attention (it's so easy to be distracted by phone and email noises when you're 'merely' watching something). To put it another way, narration makes optimal use of one's cognitive processing capabilities (see note on Richard Mayer's research.)

  • Audio enhanced tutorial - Even if a tutorial doesn't include narration, it can still include other audio, such as clicks, music, or feedback sounds for wrong and right answers. Use these with caution (and only after reading the Mayer research) as irrelevant audio, such as unnecessary music can become distracting or confusing. Repetitive sounds can become irritating (imagine a loud buzz after every wrong answer.)

  • Text or subtitled tutorial - when narration is not an option (and in addition to potential accessibility and hardware issues, a good, well-recorded narration can be hard to come by), many tutorials use subtitle explanations or text balloons pointing to things as they happen. This can be tricky to manage well, as there are various pitfalls. People read at different speeds, and following the onscreen activity while trying to take in the contextual text can impede the learning process. (When you're reading the text, you're missing the activity and vice versa.) If you take this approach for demonstrating screen activity, balloons are generally preferable to subtitles, not only because the balloon can point specifically to the action to which it's referring, but because it allows the visual information to be displayed in close proximity, which eliminates the back-and-forth eye movement required by subtitling.

Note: Richard Mayer's findings on the effectiveness of different combinations of visuals, text and narration are essential prerequisites to good screencast tutorials. As a starting point, my earlier post "Mayer's Principles for the Design of Multimedia Learning" provides a summary of the essentials.

Educational screencasts can be short and sweet nuggets you self-narrate and throw together in a half hour, or sophisticated interactive tutorials, requiring programming skills for software simulation, professional narration and scripted instructional design.

Screencasting Tools

Software like Camtasia, CamStudio (open source), Snapz Pro (Mac), ScreenCam and others make short screencasting easy, and some are more full-featured than others. Adobe's Captivate allows an interactive experience with more than just video to be built easily. This Captivate review in Digital Web Magazine has the complete story. [Update, 21 Oct 2011: since Snow Leopard you can use QuickTime Player to record the screen and audio narration directly from your Mac, with no software download required.]

The latest version of Captivate reportedly allows for the easy creation and management of branched, scenario-based training (these are those 'what would you do next' simulations). In the case of screencasting, this is in the 'let me try' interactive category, but takes it a step further and can represent a "test me" category. Simulations can make for extremely useful and engaging elearning but have been cumbersome to create and manage. Learners are faced with a set of options at each of a number of decision-making points. For a simulation to be at all realistic, and not just glorified true/false, different learner responses must branch out towards various results, so the options grow exponentially. I've spent many a bleary-eyed afternoon trying to organize, complete and visually represent dozens of these simulation screens, so the prospect of a tool making it "so easy a SME could do it" is truly enticing.

Simulations can be static and slide based or video based. One customer service training program for a large bank had the learner looking directly at a customer (on video) and interacting with them to deal with their customer complaint. The learner would listen to their complaint, and select from a list of options about what to do next. Depending on their choice, the customer would respond differently (eg."Well okay, I guess I can try that" or "Thanks, that will help a lot!") and the simulation would continue down the decision making tree. Of course these video simulations are a very different tutorial beastie to screencasting, so forgive the apparent tangent, but there's certainly overlap and room for integration of the two.

How to Screencast

From the technical perspective, screencasts on how to (you said it) screencast are available at showMeDo.com. From the conceptual angle, veteran screencaster Jon Udell looks at other uses for the screencast and describes the process of creating one, as does the handy article How to create a screencast with a Mac at Digital Web Magazine, which includes cheap software options for the Mac. Udell also highlights interesting variations on the theme, like interview screencasts. Why not have two people narrating?


Examples

For ideas, inspiration or 'what not to do', check out myscreencast.com, or click through the selected list below.
  • User controlled, no sound
    This pyDev tutorial acts like a slide presentation in that the user must click next to see the next step. This allows users to go at their own pace. It's interesting to consider how a more interactive approach could have been taken with this one. Instead of using a next/back button, why not prompt the user to click relevant buttons to move the task forward -- as though the learner were helping to complete the process. For example, the second balloon says "pressing ctrl+1 brings us the choices we have". Then on the next screen, these choices appear. Why not have the user actually press ctrl+1 to make that happen?
  • Screen video with sound
    This Adobe Captivate tutorial on adding audio feedback to buttons demonstrates a 'buzz and bell' button feedback action, which, although possibly of debatable benefit, illustrates a potentially useful concept. I worked on an interactive tutorial that, at well distributed points, responded to the user with narrated feedback (eg. "that's right" or "Try that one again"). The responses were varied to make the effect more natural (less conspicuously repetitive) and to keep it from getting too irritating. The phrase variation, quality of the professional narrator, and modest use of the device were keys to its success.
  • Screen video with narration


Publicly available interactive screencast tutorials are much harder to come by, since they're much more difficult and time-consuming to produce. Please drop a line if you know of one.

And just for a lark, don't miss the ACLU's pizza ordering screencast! (okay, not technically a tutorial, but it's engaging and gets a point across.)

23 March 2007

50 Ways to experience the web - Designing for ALL possibilities

There may be 50 ways to leave your lover and 100 ways to skin a cat, but what web designers are worried about these days is the perpetually increasing number of ways in which users might be viewing their sites...




From computer ubergeeks to "12:00 Flashers" (every appliance in their house flashes 12:00*), users are coming at websites in a growing variety of ways. They may be using a screen reader to listen to your site, or talking to it from a car. And that’s to say nothing of the unique personal characteristics each user brings to the prospect. Besides varying levels of technical ability, they may be colour blind or low vision, deaf or hearing impaired, and their native language could be French, Zulu or Chinese. It’s enough to send a well-intentioned designer leaping over the proverbial edge.

Likewise, mobile and 3G technologies are clearly no flash in the pan. A mobile phone used to be a humble bulky thing that made a phone call (how quaint). Now it’s practically making dinner and walking the dog. People are playing online video games, getting live stock quotes and even e-learning with their mobiles, and according to ubiquitous technologist Bob Kummerfeld, it won’t stop there. “We are entering an age of ‘ubiquitous’ or ‘pervasive’ computing,” says Kummerfeld, co-director of the Smart Internet Technology Group at the University of Sydney. “Computers are getting smaller and more powerful and disappearing into everyday appliances, but are connected to the global internet.” Projects at the University of Sydney’s Pervasive Computing Laboratory use Bluetooth technology to customise services for wandering users. “For example, we have an electronic notice board that can determine who is standing in front of it and customise the information appropriately,” says Kummerfeld. “Another project involves building a coffee table that can display images on its surface and allow the user to manipulate and share them with others. ‘Gestures’ are used to control the system instead of a keyboard and mouse.”

It’s no longer about a portable object but an environment that’s embedded with technology, so seamlessly that you forget it’s there. So in the near future we may be dealing with users viewing our websites from table tops, soda dispensers and toll booths. For a peek at the future of human interface design, check out the Hit Lab.

But don’t worry, this is not a cheap plug for a “how to design for refrigerators and coffee tables” short course. This is simply another very good reason to design with web standards. How do you manage an infinite variety of possible technical user scenarios? By tackling them one by one? No way. This is where accessibility comes in – it’s the method to tame the madness. Most people think of accessibility as compliance guidelines that ensure your site can be used by people with disabilities. This is true, but it’s not the whole picture. Adherence to accessibility guidelines also means that people using old devices, new devices, handhelds and cutting edge technologies will have a better chance at accessing your site. You’re adding a lot of value at once when you make your site accessible, and if you’re already designing with standards then most of the work is already done.

So what does it take to make a site accessible? An accessible website is, essentially, one that’s built with standard HTML and CSS, and has some additional bits thrown in (like alt text for all images). It adds immense flexibility to the way your site can be experienced and, naturally, this means more visitors, and for businesses, more customers.
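As a loose sketch of those "additional bits" (the page title, filenames and text here are invented for illustration), a minimal accessible page might look something like this:

```html
<!DOCTYPE html>
<!-- declaring the page language helps screen readers pick the right voice -->
<html lang="en">
<head>
  <title>Enrolment guide</title>
  <!-- presentation is kept in CSS, not hard-coded into the markup -->
  <link rel="stylesheet" href="styles.css">
</head>
<body>
  <h1>Enrolment guide</h1>
  <!-- alt text gives non-visual users the content of the image -->
  <img src="enrolment-form.png"
       alt="The enrolment form, with the Submit button highlighted">
  <p>Fill in the form and click Submit.</p>
</body>
</html>
```

Real headings (h1, h2 and so on, rather than styled paragraphs) are part of the same deal: they give screen readers and other devices a structure to navigate by.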

Unsurprisingly, some of the best resources on accessibility are available online. Joe Clark has generously made his entire book Building Accessible Websites available on the web. If you prefer the incremental approach, you can take accessibility one day at a time with Dive Into Accessibility and learn all you need to know in 30 days. No matter how you get up to speed with accessibility, you’ll need to make a stop at the W3C -- the authoritative hub for all web standards, including accessibility. They even put together a list of guidelines that form the basis for legal compliance in many countries including Australia and the UK. There are only 14 guidelines, and they are broken down into levels of importance: priorities 1, 2 and 3. To be compliant, you only have to meet the minimum set of priority 1 checkpoints, which isn't too tricky. So if you're in a rush, go straight to the guidelines and apply just the 1st priority set. When you’re done, you’ll be able to boast a Level A compliant site. And, if you don’t find the colour combination too disagreeable, you can even add their compliance logo to your pages.

In terms of catching up on standard html and CSS, the ever popular Designing with Web Standards by Jeffrey Zeldman is a classic place to start. If you already know the basics and want some ideas for implementing your designs, Dan Cederholm’s Web Standards Solutions is a handy little reference. And don’t worry, you don’t have to chuck out your layout tables to meet accessibility requirements. In fact, if you think building to standards means compromising on design options, then you still haven’t seen the world renowned CSS Zen garden.

It may take a little burst of autodidactic enthusiasm to get past the hurdle of learning how to make an accessible site, but it will never go unrewarded, as the need to design with this kind of flexibility is not going away. There’s also an increasing demand for accessibility-savvy designers as top companies begin to realise the business benefits, as well as the need to comply with the law. That being said, the real reward lies in the yogic peace of mind that follows. At the end of a long day, it’s an unparalleled feeling…being able to sit back, flip open a can of a ginseng-enriched beverage, and relax, knowing that no matter who’s looking at your site, no matter where they are, it’s pretty much going to be okay.

Experience it yourself

Get a first-hand feel for how others are doing it. Here are 6 fresh ways to experience the web without having to leave your desk.
  • Experience the talking web. Experience the web as a user with visual impairments does when they access a site with software that reads the web to them. Try the screen reader simulation or watch a blind user do it himself in this brief but amazing video. If you’re really inspired, you can download a demo of IBM Home Page Reader.
  • Change your screen size. These Favelets allow you to view a web page on oodles of different screen sizes including PocketPC, TV safe and our old friend 640x480.
  • Disable your browser. The Firefox ‘Web Developer’ extension is a handy toolbar for the Firefox browser that has a disable menu which allows you to disable images, cookies, Java, JavaScript, page colours and more, at the click of a mouse. It also makes it easy to validate a page for standards compliance.
  • Go text-only. Try the Lynx browser and see how your site fares in text only. This is also a second-best way to experience a screen reader, since someone using a reader to listen to a site is only hearing text. You can get a taste without having to download the browser at lynxview.
  • Move backwards. The backwards compatibility viewer is a dandy little time machine dedicated to graceful degradation. It gives you a peek at how users with older equipment and older browsers are seeing your site today.
  • Kill the mouse. Unplug your mouse. Go ahead, just do it, you can plug it back in later, I promise. It actually is possible to navigate the web without one. Oh it’s not easy, but it’s possible, and that’s how many people do it. People with mobility impairments like arthritis and those using devices that have no mouse (eg. mobile phones) are pros at it. If you try it out, you’ll find that accessibility compliant sites are much easier to navigate this way. You may even discover the purpose of the “skip navigation” link. Hint: you’ll have to use the Tab key to get around.
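That "skip navigation" link, incidentally, is usually just the first focusable thing on the page. A sketch of the common markup (the ids and link text here are illustrative, not prescribed):

```html
<body>
  <!-- first link on the page: lets keyboard users jump past the navigation -->
  <a href="#content">Skip navigation</a>

  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/products">Products</a></li>
    <!-- ...the dozens of links a keyboard user would otherwise tab through... -->
  </ul>

  <!-- the skip link lands here, at the start of the actual content -->
  <div id="content">
    <h1>Page content starts here</h1>
  </div>
</body>
```

Tab through a long page with and without one of these and the point makes itself.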

* Reference from “Welcome to the Internet Helpdesk” by comedy troupe Three Dead Trolls in a Baggie.

This article was originally printed in Desktop magazine.


16 March 2007

Data -> Information -> Knowledge -> Wisdom

Understanding the difference between data, information, knowledge and even wisdom, and how a designer can participate in the transformation from one to the other, is a critical part of creating learning experiences.

Allow me to borrow a phrase (coined by my endocrinologist, Dr. John O'Dae) -- we're now living in a society characterized by data-rich ignorance.

Have you ever spent hours on a Google hunt searching for some holy grail of an answer only to come out with loads more data, and loads more confusion? We've never had so much information accessible to us so easily, but does it lead to more knowledge, better understanding or more wisdom? We do know it leads to information overload, information anxiety and interesting new conditions like cybercondria. (eg. "I'm freaking out, I think I might have cybercondria")

But it's not all bad news.
After all, there are also those times when you do find that grail googling, and you learn something along the way.

As designers our mission is to organize this data better, help transform it into meaningful information, help prevent the overload, and use it for experiences that facilitate learning.

The article Information Interaction Design: A Unified Field Theory of Design [pdf] presents an interesting look at data, information, knowledge, and wisdom and how the roles of information design, interaction design and sensorial design come together equally for all forms of communication - from web and TV to performance - to create what, in essence, are learning experiences. Check it out.

- "What do you do?"
- "Oh I'm a designer...I'm like a big can of bug spray for the nasty infestation of data-rich ignorance."
-"...right.......would you look at the time?"

14 March 2007

Adobe Captivate - job opp

Adobe Captivate (previously Macromedia) is a handy no-programming-required training development tool for creating subtitled/narrated tutorials, simulations and demonstrations. We've used it for simple, try-it-yourself software training tutorials and how-to's to help students use our LMS, CMS and wiki.

Here's a contract job opportunity in Sydney for a developer with Captivate experience. This was originally posted to the Elearning Network of Australasia job forum...


Original Post:


Forum: Elnet General
Thread: Captivate contract
Author: Martine Barclay (grantandmartine@smartchat.net.au)

Want to work in the professional services industry?

I am looking for an experienced Captivate developer for a short-medium term
contract based in Sydney.

Immediate start.

Give you a call if you would like to know more.

Martine Barclay
T: 93357653
E: mbarclay@kpmg.com.au




06 March 2007

Design Essentials: HCI, User Experience and Usability

HCI, User Experience, User-centred Design, Usability, Information Design, Information Architecture, Interaction Design, Accessibility and Universal Design...it gets hard to know where one starts and the other ends. The usability of all this terminology is sorely lacking in good information design. There seems to be so much overlap and redundancy, and is there really a difference among them? If so, what is it? Let's get quick and dirty...

Human-Computer Interaction (HCI)

HCI, you will generally find, refers to a field of research. There are more journals and conferences than how-to books on this topic. It's a large field of research having to do with how human beings interact with computers, websites, and digital systems. Therefore it's founded on gathering information about human perceptions and behaviours and how computer systems can be made to better support them. Discoveries and principles of HCI feed into and form a research basis for other specialties (usability, interaction design, etc.)
>> Example: "Display layouts should accommodate the fact that people can be sidetracked by the smallest movement in the outer part of their visual fields, so only important areas should be specified by moving or blinking visuals." (see "HCI and your website" by Nicky Danino at Sitepoint)

User-Experience or Experience Design

... is more of an umbrella term that can deal with more than just your desktop computer and a lot more than just websites. It's a holistic term that can cover branding, functionality, usability, information architecture, graphic design, and interaction design, and deals with the overall satisfaction a user has when dealing with a product or system. It includes perceptions and emotional responses as well as ease of use and function. (check out this BBC article on emotional design and the wow factor for another angle.)
>> Example: "The user experience should be comfortable, intuitive, consistent and trustworthy" (see "Brand value and the user experience" by Kelly Goto, at Digital Web Magazine)

Usability or Usability Engineering

...is more specific, though a part of the user experience, and is most often used in reference to software and websites. It's about applying scientific method to making websites as easy and intuitive to use as possible for their audience. One key aspect of Usability is systematic user testing. Usability tests can be run on a prototype, beta or current public version of a website. The results can be fed back into improvements to solve usability errors (where users get stuck) thus improving the user's experience of the site. Usability tends to have less to do with making a site attractive, warm or brand consistent and more specifically to do with making it easy to use -- facilitating tasks and functionality.
>> Example: "Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution." (From Jakob Nielsen's Heuristics for User Interface Design)

Information Architecture

IA can be thought of as part of the user-experience design. One goal of IA is also good usability in the sense that IA is trying to make it as easy and intuitive as possible for the user to access and use information throughout the site. The difference is that IA deals more specifically with "findability" or the organization, structuring, categorization, labelling, navigating and tagging of information in a website. It includes the blueprint as well as the navigation options and search keywords. There are usually more IA-specific tasks at the beginning of a web project (as the categories and pathways are planned, mapped out and prototyped).
>> Example: The site should support multiple ways to reach content, as in search, site-wide navigation, site index, site map, etc. (from Louis Rosenfeld's Information Architecture heuristics)

Interaction design

...often refers very specifically to the elements of interface design that involve action between the computer and user (less about graphics, more about user goals, tasks and processes) for websites, software, and games. Games in particular are a good example as they are highly interactive. In other words, there is a lot of information being passed back and forth from user to computer and a lot of specific and sophisticated tasks a user is trying to complete. Interaction design, as a research as well as a practice discipline, works out how to do this more seamlessly and make it more engaging for users. It deals with issues of interactive control (how much to entice, guide, or coerce a user into doing one thing or another - clicking a button, entering the wizard cave, etc.) among many other things.
>> Example: Users should be able to respond to the computer at their leisure. The computer, on the other hand, needs to respond immediately to the user. Computer response time to a user action should be kept under a second 90% of the time. (From The Art of Interactive Design, by Chris Crawford)

Accessibility and Universal Design

...are in essence an aspect of usability, and therefore of the overall user experience, but are specifically associated with ensuring users with disabilities are not left out. Beyond that, the accessibility standards which have been set out to ensure this also ensure users on all sorts of computer systems, browsers, software types and mobile devices are not left out because of the technology they use. In a sense it's about making sure a website is usable across all platforms and across all abilities, universally, catering for the inevitable differences among users. Whether you're using a refreshable braille display, a mobile phone or your voice to navigate, accessibility standards make sure a site design is flexible enough to cater to your needs. Accessibility is the only element on this list that is mandated by law.
>> Example: All images should have alternate text included in the HTML so vision impaired users using screen reader technology, those using text-only browsers or users with slow connections, can still access the content of the image. (see www.w3.org/wai)
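A sketch of that example in practice (the filenames and alt text are invented for illustration). Note that purely decorative images get an empty alt attribute, so screen readers pass over them silently rather than announcing a meaningless filename:

```html
<!-- informative image: the alt text carries the same content as the picture -->
<img src="sales-chart.gif"
     alt="Bar chart showing sales rising from 40 to 65 units between March and June">

<!-- purely decorative image: empty alt tells screen readers to skip it -->
<img src="corner-flourish.gif" alt="">
```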

Bringing it all together

The differences among each of these elements of the user experience become more pronounced the larger a website project gets. For a small site, they are essentially all dealt with by the same person or small team that deals with the interface/graphic design or the web development. For mammoth sites (like large corporate sites), there are teams of people each dedicated to a particular specialty and tasks need to get broken down. One team may be researching search analytics and leading card sorting exercises to categorize products, another team conducts usability tests on a prototype, and another works with the visual designers on creating an interface that appropriately reflects corporate image, while yet another group may be ensuring this design is incorporating accessibility standards, etc.
The one thing they all have in common is the idea that people come first.
It's the idea that you're basing your design and build of a product, application or website on the person it's for, what they need, how they respond and what they're like. The ideal is that a person shouldn't have to change their behaviour to accommodate the idiosyncrasies of a computer system. Instead, the system should work to seamlessly adapt to their behaviour and goals. Which of course, makes sense. Technology isn't made to please itself or boss us around (not yet anyway). It's made to support human activity. If it doesn't make a human's task easier, it's essentially pointless.
Disclaimer: These are, naturally, totally incomplete generalized overviews of each discipline, that many would argue over until late in the night, and which would be impossible to fairly define on one blog post. They're just intended for other designers like myself in visual design and elearning, to help add a bit more clarity to a messy web of terminology. You gotta start somewhere.