Displays in a vibrating environment: How NASA addressed a blurry problem


The effect of a synchronized strobed display on reducing motion-blur readability issues in a high-vibration environment. Or, more specifically: turning a mess into less of a mess.

The stereotypes are true: the boost into space from sea level is a shaky, G-infested carnival ride with every Fourier component you care to name. NASA faced just such a problem on its troubled Ares I project. Some rockets have a dominant resonance along the long axis termed ‘pogo’ (like the stick), and in a human-rated vehicle this means a dominant vibration mode passes through to the passengers. In the case of the Ares I, this was on the order of 0.7 g at about 12 Hz, working out to around 5 mm of motion. If the computer displays the passengers are looking at do not have the same damping and resonance characteristics as their own eyes and heads, motion blur will make the displays unreadable, as simulated above.

A solution tested was to strobe the display in the same way LED-based displays are dimmed: a square-wave duty cycle is applied so that the display is actually off some of the time. The duty cycle is synchronized to the main vibrational component of the pogo motion, removing the worst of the motion blur at the expense of some brightness (this simulated view assumes that brightness can be boosted somewhat to compensate).
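The idea can be sketched in a few lines. This is not NASA's implementation, just a minimal illustration under simplifying assumptions: the vibration is a single sinusoid at a known frequency with a displacement peak at t = 0, whereas in practice the phase reference would come from an accelerometer tracking the actual motion.

```javascript
// Sketch: light the display only during a window centred on the
// displacement peaks of the dominant vibration mode.
function backlightOn(timeSec, freqHz, dutyCycle) {
  const phase = ((timeSec * freqHz) % 1 + 1) % 1; // cycle fraction, 0..1
  const halfWindow = dutyCycle / 2;
  // Lit only within +/- halfWindow of the peak (phase 0 or 1)
  return phase < halfWindow || phase > 1 - halfWindow;
}

// 12 Hz pogo mode, 20% duty cycle:
console.log(backlightOn(0, 12, 0.2));      // true  (at a displacement peak)
console.log(backlightOn(1 / 24, 12, 0.2)); // false (half a cycle later)
```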

 

Trading brightness for clarity: the effect of strobing a display in sync with a sinusoidal vibration mode


When compensating for a single sinusoidal mode, the loss in brightness is not that great if the strobe window is phase-matched to a displacement peak of the motion, as shown. A blur reduction of about 90% is possible with a 20% duty cycle, at the cost of an 80% loss in brightness.
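That 20%/90% figure can be sanity-checked. For a sinusoid x(t) = A·cos(2πft), a strobe window of duty cycle d centred on a displacement peak spans phase ±πd, so the display sweeps A·(1 − cos(πd)) of displacement while lit; dividing by the full peak-to-peak excursion (2A) gives the residual blur fraction:

```javascript
// Residual motion blur (as a fraction of peak-to-peak displacement)
// for a strobe of duty cycle d centred on a displacement peak of a
// sinusoidal vibration mode.
function residualBlurFraction(dutyCycle) {
  return (1 - Math.cos(Math.PI * dutyCycle)) / 2;
}

console.log(residualBlurFraction(0.2).toFixed(3)); // "0.095"
```

At a 20% duty cycle the residual blur is about 9.5% of the unstrobed excursion, i.e. roughly the 90% reduction quoted above.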

Read the article and see the demo video here:

http://gizmodo.com/5880850/how-nasa-solved-a-100-million-problem-for-five-bucks

How to effectively study

Alan Turing Statue

Alan Turing: “We can only see a short distance ahead, but we can see plenty there that needs to be done.”

You never stop learning, which means you never stop studying. Sometimes the hardest things to learn are those that don’t have a concrete test or exam at the end. How do you know how well you did in a race if there are no hurdles, laps, timer or finish line? That’s part of being an adult, and actually a part of a Human Factors model where your “comfort zone” must be stretched into an area where you are uncomfortable, but, as it turns out, competent.

A good way to stretch yourself in the direction of learning something new is not just to read the manual. Humans are built to learn by doing, so working through examples and practice exams is generally more effective than linear reading alone.

Why?

 


UX Science – measurement gets the answer

The simplification of user interfaces has been proceeding quickly now that the last vestiges of skeuomorphism are gone. Like any new technology, the first iterations of an interface must be familiar to its users. Early cars looked like carriages, early lightbulbs behaved like gaslight, early televisions looked like radios, and the first home computers worked like typewriters (and still do!).

But any design trend ultimately overshoots the mark. In this case, iconography has arguably become oversimplified, and buttons without outlines or contrast fill are being used because retina-class displays support the fine line widths.

Curt Arledge addresses one basic question about this direction in user interfaces: does an outlined or contrast-filled button have better usability? Check out his results here.

In summary, two things seem to matter. First, the users’ familiarity with the icon type: the common iconographic vocabulary that all interfaces share to a great degree. Second, user testing is still required, since differences appear in counter-intuitive places, and some design decisions affect usability less than expected.

Javascript performance across many browsers – run your tests and share the results here

jsperf is like a horse race: it lets you set up your own test cases and run them across multiple browsers to see the performance differences. One recent test I was checking out compared “Math.floor(4.6)” versus “~~4.6”. It is a powerful tool for diagnosing cross-browser performance issues, and for browsing what other people have been benchmarking.
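One caveat worth knowing before swapping one for the other: the two expressions are not equivalent. `~~` (double bitwise NOT) coerces to a signed 32-bit integer and truncates toward zero, while `Math.floor` rounds toward negative infinity and works over the full double range:

```javascript
// Math.floor and ~~ agree only for non-negative numbers
// that fit in a signed 32-bit integer.
console.log(Math.floor(4.6));  // 4
console.log(~~4.6);            // 4
console.log(Math.floor(-4.6)); // -5 (rounds toward -Infinity)
console.log(~~-4.6);           // -4 (truncates toward zero)
console.log(~~2147483648.5);   // -2147483648 (wraps at 32 bits)
```

So whatever the benchmark says, `~~` is only a drop-in replacement when the inputs are known to be non-negative and within 32-bit range.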