Technical Training… Training

“People aren’t puzzle pieces, turn-key products, or canned beans. They are investments and copilots, and they will form connections within and outside your organization that you never would have guessed.”

Fuelling people is probably one of the tougher things to do in an organization. Training programs, interesting work, mentorship, matched teams, surveys, open-door policies, and carefully tailored workloads are all a nice set of sliders and controls an organization can play with and optimize. But the most difficult thing is still creating training around an institutional knowledge set that includes some really hard-to-capture aspects: gut feel, interpersonal relations, estimating workloads and performance, turning measurements into decisions, and understanding people’s true capacities and capabilities.

Smashing has an interesting take on setting up technical courses for technical people inside a technical organization. A few key points, which I’ll add a bit to:

  • Give context and purpose to what you are teaching. The big picture: how will the learner be stronger within the organization, and how will that make the organization stronger (i.e. a better place to be!)
  • Teach by example – for instance, the ‘cookbook style’, where a specific need is solved, allowing the learner to auto-generalize (something humans are almost too good at). See the section below on splitting learning into foundations and differences.
  • Have some awesome homework. Learning is, in the end, doing, and watching someone else cook doesn’t make you a chef. Make the homework rewarding, possibly collaborative, and flexible to meet the motivations and interests of the learner. It’s also a breeding ground for real questions and feedback, not the polite stuff or pile-ons you might get immediately after a presentation.
  • Learn how the training went – get feedback, and get it immediately as well as later on. You have to learn how to teach people, after all, and accurate feedback is more than one data point in time. The sayings “let it sink in” and “experience is the best teacher” exist for a very good reason!

Splitting Learning into Foundations and Differences

My favourite techniques for extending technical knowledge are the “Parallels” method and the “Hub Analogy” method.

Parallels Method: if someone knows how to solve a particular problem in Java, map each step over to the target language – say, Objective-C – and line up the equivalent functionality. The differences are then much more easily explained, because they stand out from this common basis. This method works great when moving laterally through equivalent topics.
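
As a toy sketch of the method (using Python rather than Objective-C as the target language, purely for illustration), the familiar steps are written out and each one is lined up with its equivalent, so only the differences need explaining:

```python
# Parallels method, illustrated: the familiar Java solution rides along
# as comments, and each step is lined up with its Python equivalent.

# Java:  List<String> names = new ArrayList<>();
names = []                      # no declared type, no 'new'

# Java:  names.add("Ada"); names.add("Grace");
names.append("Ada")             # same operation, different name
names.append("Grace")

# Java:  for (String n : names) { System.out.println(n.toUpperCase()); }
for n in names:                 # for-each is the default loop form
    print(n.upper())            # no semicolons, lowercase method names

# The differences now stand out from the common basis: typing, object
# creation, and iteration syntax - everything else maps one-to-one.
```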

Hub Analogy Method: I recently had the opportunity to give a presentation to new pilots on how airplanes land at an airport. Aviation terms and language make no sense to a newcomer, especially in five minutes, so I started with the idea that landing at an airport is a lot like going through a Tim Horton’s drive-thru: you line up, follow the signs, make a radio call with your request, and keep yourself away from other traffic. This had the benefit of allowing us to ‘hang’ new ideas off a solid mental model everyone is familiar with. For example, you can call the Tim Horton’s person the ‘Tower’ and introduce the concept of ‘runway clearance’ as the equivalent of ‘please drive up to the second window’. A familiar and flexible hub analogy improves student recall by splitting new learning into a Foundation and Connected Differences, i.e. Hub and Spokes. This method works best when introducing less familiar or totally unfamiliar topics. (It is most notoriously misused in science documentaries as the classic units of measure: ‘Human Hairs’, ‘Elephants’, ‘Golf Balls’, and ‘Football Fields’.)

No technique is going to work as well without examples. Popular books from Gladwell and Kahneman are completely saturated with examples because they know that we learn by generalizing, not by making up specifics after memorizing some abstract framework. Humans evolved to think, “I don’t eat that fish, because that one time I did was pretty bad; therefore, no yellow-striped fish for me of any kind,” not, “anything that is sending a signal that it is poisonous is highly visible as opposed to camouflaged, therefore I will not try that fish over there.” Examples are also a form of storytelling, which is just a way of conveying a personal experience, like “In and Out: That One Time I Ate That Yellow-Striped Yuck Fish”. We love stories because humans are empathy machines, and we blur our Specifics into Generalizations.

Displays in a vibrating environment: How NASA addressed a blurry problem


The effect of a synchronized, strobed display in reducing motion-blur readability issues in a high-vibration environment – or, more specifically, turning a mess into less of a mess.

The stereotypes are true: the boost into space from sea level is a shaky, G-infested carnival ride with every Fourier component you care to name. NASA had exactly this problem as part of its troubled Ares I project. Some rockets have a dominant resonance frequency along the long axis, termed ‘pogo’ (like the stick), and in human-rated vehicles this means a dominant vibration mode passes to the passengers. In the case of the Ares I, this was on the order of 0.7 g at about 12 Hz, working out to around 5 mm of motion. If the computer displays the passengers are looking at do not have the same damping and resonance characteristics as their own eyes and heads, motion blur will make the displays unreadable, as simulated above.

A solution tested was to strobe the display in the same way LED-based displays are dimmed: a square-wave duty cycle is applied so that the display is actually off some of the time. The duty cycle is synchronized to the main vibrational component of the pogo motion, removing the worst of the motion blur at the expense of some brightness (this simulated view assumes the brightness can be boosted somewhat to compensate).
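
The effect is easy to sketch numerically. Below is a minimal model of the idea (my own sketch, not NASA’s test code): sample one period of a sinusoidal displacement, keep only the samples where a peak-centred duty-cycle window has the display lit, and measure how far the image moves while lit.

```python
import math

def residual_blur(duty_cycle, amplitude_mm=2.5, samples=100_000):
    """Fraction of the full motion blur left when the display is lit
    for `duty_cycle` of each vibration period, with the lit window
    centred on a displacement peak (where the image moves slowest).
    The amplitude is arbitrary - it cancels out of the ratio."""
    lit = []
    for i in range(samples):
        phase = i / samples                     # one period, 0..1
        offset = min(phase, 1.0 - phase)        # distance from the peak at 0
        if offset <= duty_cycle / 2:            # display is on
            lit.append(amplitude_mm * math.cos(2 * math.pi * phase))
    blur = max(lit) - min(lit)                  # image travel while lit
    return blur / (2 * amplitude_mm)            # vs. unstrobed peak-to-peak

for duty in (1.0, 0.5, 0.2, 0.1):
    print(f"duty {duty:.0%}: blur reduced by {1 - residual_blur(duty):.0%}")
```

At a 20% duty cycle this prints a 90% blur reduction, which matches the figure quoted below.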

Trading brightness for clarity: the effect of strobing a display in sync with a sinusoidal vibration mode

When compensating for a single sinusoidal mode, the loss in brightness is not that great if the duty cycle of the strobing is phase-matched to a displacement peak of the motion, as shown. A vibration reduction of 90% is possible with a 20% duty cycle, i.e. an 80% loss in brightness.
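
That pair of numbers also checks out in closed form (a back-of-the-envelope derivation, assuming a single pure sinusoid and a lit window centred on a displacement peak): with displacement x(t) = A·cos(2πft), a duty cycle D keeps the display lit within ±πD of the peak phase, so the image travels A·(1 − cos(πD)) while lit, against an unstrobed swing of 2A. The blur reduction is therefore (1 + cos(πD))/2 = cos²(πD/2), and at D = 0.2 that is cos²(18°) ≈ 0.90 – the 90% reduction above, for the 80% brightness cost of a 20% duty cycle.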

Read the article and see the demo video here:

http://gizmodo.com/5880850/how-nasa-solved-a-100-million-problem-for-five-bucks

UX Science – measurement gets the answer

The simplification of user interfaces has been proceeding quickly now that the last vestiges of skeuomorphism are gone. Like any new technology, the first iterations of an interface must be familiar to its users: early cars looked like carriages, early lightbulbs behaved like gaslight, early televisions looked like radios, and the first home computers worked like typewriters (and still do!).

But any design trend ultimately overshoots the mark; in this case, iconography has possibly become oversimplified, and buttons without outlines or contrast fill are being used because retina-class displays support the fine line widths.

Curt Arledge addresses one basic question in this user-interface direction: does an outlined or contrast-filled button have better usability? Check out his results here.

In summary, two things seem to matter. First, the user’s familiarity with the icon type – the common language that all interfaces share, to a great degree, in the iconography alphabet. Second, that user testing is still required, since differences appear in counter-intuitive places, and some design decisions affect usability less than expected.
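
That last point is worth making concrete. The measurement that settles a question like “outlined vs. borderless” is just a controlled comparison plus a significance test; here is a minimal sketch, with made-up success counts rather than Arledge’s data:

```python
import math
from statistics import NormalDist

def two_proportion_z(hits_a, n_a, hits_b, n_b):
    """Two-proportion z-test: is the gap between two success rates
    bigger than sampling noise alone would explain?"""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))    # two-tailed
    return z, p_value

# Hypothetical first-tap success counts, 200 users per button variant.
z, p = two_proportion_z(hits_a=176, n_a=200,    # outlined button
                        hits_b=154, n_b=200)    # borderless button
print(f"z = {z:.2f}, p = {p:.3f}")  # a small p means the gap is unlikely to be chance
```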

The Robotics Behind the Gravity Movie

One of many prototype space suits you and I can see, but not try on, at Kennedy Space Center

First, go see Gravity in 3D IMAX. Why? Because patriotism: the ISS is partly Canadian, the Canadarm and Dextre appear in the movie, and IMAX is a Canadian invention. Also, IMAX has been to the real ISS. 3D is almost mandatory in a space movie, where there are fewer environmental cues to indicate relative position – like, say, the Earth 300 miles below you, or the peaks of the Himalayas 1% closer.

The movie itself was made with the contribution of robotic arms to lend a complete sense of zero-G choreography. Cameras have been controlled robotically for at least 40 years for special-effects shots. Motion-control cameras were developed for Star Wars (computer controlled) and 2001: A Space Odyssey (mechanical), since those films required repeated camera passes over the same spaceship models, allowing each pass’s film to be overlaid with the next, lining up lights, backgrounds, and engine glow perfectly. No human could match the repeatability. Check out this video about John Dykstra’s pioneering work on robotic camera control.

But robots have another capability beyond endless, perfect repetition: the motion can be planned, carefully, along paths that would tax a camera operator – simultaneous pan, tilt, zoom, dolly, and rack, in smooth, elegant arcs. In the world of CG, cameras have always had this freedom to move on gentle curves, maintaining flawless lock on a subject (in fact, it takes quite a bit of talent to add back in the ‘human’ elements of a camera operator, as demonstrated so well in Battlestar Galactica).

Putting those virtual camera moves into a real camera is something that Bot and Dolly handled in Gravity – their demo reel shows the alien smoothness and confidence of a robotic camera in action. Another demo reel by The Marmalade shows the new high-framerate possibilities of precise high speed motion control using their Spike robot camera system.

The technology is nothing without the illusion-creating setup. One robot can be fitted with the camera; another can be fitted with, say, an actor/astronaut, or a key light. Moving these two carefully can ‘null out’ any hint that a scene was filmed in gravity, since the gravity we expect to come from one direction in a scene can now come from anywhere. In fact, full-motion flight simulators use the same trick of ‘redirecting gravity’ to simulate the feel a pilot would have in a real aircraft. Like any good special effect, the trick is to fool the senses, not to simulate reality.
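
The geometry behind that ‘null out’ trick is worth spelling out: the image depends only on the camera’s pose relative to the subject, so any rigid motion applied to both robots at once – including the motion that redirects apparent gravity – is invisible on film. A toy check with homogeneous transforms (an illustration of the principle, not Bot and Dolly’s software):

```python
import numpy as np

def pose(yaw_deg, translation):
    """A 4x4 homogeneous transform: rotation about z plus a translation."""
    t = np.radians(yaw_deg)
    m = np.eye(4)
    m[:2, :2] = [[np.cos(t), -np.sin(t)],
                 [np.sin(t),  np.cos(t)]]
    m[:3, 3] = translation
    return m

camera = pose(30, [0.0, -2.0, 1.5])     # camera robot's world pose
actor  = pose(10, [0.0,  0.0, 1.5])     # actor rig's world pose

# What the lens records is the actor expressed in camera coordinates:
relative = np.linalg.inv(camera) @ actor

# Move BOTH rigs with the same extra rigid motion - the kind of shared
# move that lets "down" point anywhere the shot needs it to.
shared = pose(55, [3.0, 1.0, -0.5])
relative_moved = np.linalg.inv(shared @ camera) @ (shared @ actor)

print(np.allclose(relative, relative_moved))    # True: the shot is unchanged
```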

The big change in this technology is the real-time speed (i.e. fast!) at which the cameras can now be moved with high accuracy. The control software and motors have been refined so that the possibilities of motion are now beyond what a human operator can do. For example, here is an industrial pancake sorting robot (!), performing at about 3 pancakes per second, and another playing perfect pool.

Of course, you need a director who is up to using these tools, and this was the case. Alfonso Cuarón has apparently wanted to be two things – an astronaut and a director – and being a director pays better. He is also known for his very long continuous shots with no cuts, and the opening scene, as one of the characters in the movie says, “breaks the record”. The cinematography really is the core of the movie, since the alien world of zero G, where momentum is dominant, is the antagonist. (Check out a demo by ISS astronaut Mike Fossum on angular momentum before you see the movie!)

The difficulty is that the camera motion is now the star of the show, meaning the actors have to adapt to the shot, and the shot has to be pre-visualized and planned more carefully, as described here by the director. Although the control system opens up new possibilities, organic control and intuitive direction of the tool become the next challenge. “Hey Robocam, orbit around Sandra B’s head as she squints into the sun setting behind the rim of the Earth”… no? You only understand 6-degree-of-freedom target points and motion splines? And you want a union? Hmm.
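
That joke is closer to the real workflow than it sounds: a planned move is handed to the rig as a stream of position-plus-orientation targets, with the ‘keep looking at the actor’ part computed for every frame. A hedged sketch of what an orbit-and-look-at request reduces to (hypothetical helper names, not any real motion-control API):

```python
import numpy as np

def orbit_shot(target, radius, height, frames=240):
    """Camera positions orbiting `target`, plus a per-frame look-at
    direction - the raw material of a 6-DoF motion-control path."""
    path = []
    for i in range(frames):
        angle = 2 * np.pi * i / frames          # one smooth, even arc
        position = target + np.array([radius * np.cos(angle),
                                      radius * np.sin(angle),
                                      height])
        forward = target - position             # always aim at the subject
        forward /= np.linalg.norm(forward)
        path.append((position, forward))
    return path

# "Orbit around Sandra B's head": a 10-second circle at 24 fps.
head = np.array([0.0, 0.0, 1.7])                # subject position, metres
for position, forward in orbit_shot(head, radius=2.0, height=0.3)[:3]:
    print(np.round(position, 2), "->", np.round(forward, 2))
```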

For a great Astronaut’s View on the movie, check out Mark Kelly’s write-up at the Washington Post.

Check out this interview with the Cinematographer, Emmanuel Lubezki, who discusses the virtual lighting challenges in the movie.

For an excellent in-depth view of the movie’s tech, check out CG Society’s article.

And if you think the plot line is somewhat infeasible, this overview from ESA might convince you otherwise: space junk is a huge problem for astronauts, and satellites and manned missions really do have to get out of the way periodically!

If you are interested in the unique editing and shot style of the movie, contrast it with the “traditional repertoire” of editing techniques for moving from one shot to the next. This oldschool web site gives a great overview, with long-duration shots, or “plan séquence”, near the bottom.

Track space junk (and stuff that most certainly isn’t junk) using SpaceJunk for Android, Night Sky for iOS, or n2yo’s site for desktop.