The fuzziest side of data is probably uncertainty. Visualising Data has a quick overview of viz methods for showing what you don’t know: “Visualizing Uncertainty” (visualisingdata.com).
Ideally, an agency for which you work as a freelancer has hired you as the square peg to fit into a square hole. A bit like software, there is a distinct interface between you and the people you are working with – approvals, in-feeds, timelines, and expectations.
Reality is much different. And for you, that is a good thing.
The golden rule is: expect to contribute more, do more, and give more. In the end, you will get more. Here’s why:
A tactile watch with 7-segment digits (pseudo-digital) that raise and lower – good for the sight-impaired, of course, but also a well-executed design in general, one of the cleanest I have seen, especially in the watch world, where ornate details can get a bit Byzantine.
The details of the clasp and band anodization are particularly striking, but what I admire most is that the designer is polling visitors to see if the market would support production of this watch. Using the internet as a test market is almost disarmingly modest and plays on some of the best aspects of social web interaction.
Finally, he could have made the interface American Braille (in fact the backside of the watch hints at this feature), but instead has taken the wider view that such a novel and elegant interface would appeal broadly. It plays on the very human desire to figure out puzzles and learn [small] new skills. We are learning creatures, and the connection between novelty and this aspect of human nature should not be overlooked by designers. The flip side is that too much of a departure from the ordinary can be too steep a learning curve for some (learning Braille, for example, or any of the binary-display type watches).
A client provides you raw traffic files for a web site. Google Analytics has some great interpretive tools to get you started – some of the most interesting come close to one of the ‘idealized’ goals of web site information architecture analysis. Which one? The suite of tools that determine click paths through a site: following a User’s tracks through the site and determining entry and exit points.
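The idea can be sketched in a few lines. This is a minimal, hypothetical reconstruction – the row format (session id, timestamp, URL) and the page names are invented for illustration; a real Google Analytics export or server log would need its own parsing.

```python
# Sketch: reconstructing per-session click paths from raw traffic rows.
# The (session_id, timestamp, url) row format is an assumption.
from collections import defaultdict

def click_paths(rows):
    """Group (session_id, timestamp, url) rows into time-ordered paths."""
    sessions = defaultdict(list)
    for session_id, ts, url in rows:
        sessions[session_id].append((ts, url))
    # Sort each session by timestamp and keep only the URLs.
    return {sid: [url for _, url in sorted(hits)]
            for sid, hits in sessions.items()}

rows = [
    ("a", 1, "/home"), ("a", 2, "/watches"), ("a", 3, "/checkout"),
    ("b", 1, "/home"), ("b", 2, "/about"),
]
paths = click_paths(rows)
entries = {p[0] for p in paths.values()}   # entry points
exits = {p[-1] for p in paths.values()}    # exit points
```

From these paths, entry and exit points fall out as the first and last page of each session – the raw material for everything that follows.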
This is passive information – literally, it is the past. Aside from usability testing, focus groups, and my mom, you have no model for determining how a given change in the Info Architecture, layout, etc., etc. will alter user pathways through a site.
Ultimately it is these paths you are trying to predict, steer, and design to.
Where does pattern mining come in? It’s a term that basically separates work associated with modeling and predicting patterns from merely revealing them. In machine learning there are a host of tools designed to first categorize (mine), then organize (recognize patterns), then predict the continuation of those patterns.
But there is more than strictly linear extrapolation. Is a User on your site clicking on a $500 watch because they have a budget of $500 and want to buy a gift, because they want a new watch, or because they were referred? If they click on another, much less expensive watch – it may be the ‘need a watch’ option, in which case you may have enough of a pattern to model what else they would click on were they to spend the next 24 hours on the site checking out the whole database. Instead, a pattern predictor could take this step for you and present the options immediately. This is passive pattern recognition, and the interaction between the pattern predictor and the User can be recursive. Is it advisable to present several options to assist in tuning the predictor – “Are you looking for a gift? Are you looking for a new watch? What is your budget?” Ask Clippy. Maybe in some cases, but in most, passive observation is the best Machine Learning tool.
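One of the simplest forms this kind of passive predictor can take is a first-order Markov model of click transitions: count which page tends to follow which, then surface the most likely next pages. A hedged sketch – the paths and page names are invented, and real behavior usually needs more context than the single previous page:

```python
# Sketch of passive pattern prediction: a first-order Markov model
# built from observed click paths. Example paths are invented.
from collections import Counter, defaultdict

def train(paths):
    """Count page-to-page transitions across observed click paths."""
    transitions = defaultdict(Counter)
    for path in paths:
        for here, nxt in zip(path, path[1:]):
            transitions[here][nxt] += 1
    return transitions

def predict_next(transitions, page, k=3):
    """Most likely next pages after `page`, by observed frequency."""
    return [p for p, _ in transitions[page].most_common(k)]

paths = [
    ["/home", "/watches", "/watch-500", "/watch-89"],
    ["/home", "/watches", "/watch-89", "/checkout"],
    ["/home", "/sale", "/watch-89"],
]
model = train(paths)
predict_next(model, "/watches")  # likely next clicks after /watches
```

Nothing here asks the User anything; the model is tuned entirely by watching, which is the point.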
Ultimately a good predictor is not just for shopping experiences – as design tools they can provide objective, empirical methods of guiding new Information Architectures. Different pattern predictors, trained from different mined patterns in web site traffic data, can effectively act as simulated Users.
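A “simulated User” can be as simple as a random walk over a trained transition model: replay thousands of plausible sessions against a proposed architecture and measure where they end up. The transition counts below are invented for illustration, not drawn from real traffic.

```python
# Sketch of a simulated User: random-walk a transition model to generate
# plausible sessions through a proposed information architecture.
# Transition counts are invented for illustration.
import random
from collections import Counter

transitions = {
    "/home":    Counter({"/watches": 8, "/about": 2}),
    "/watches": Counter({"/watch-89": 6, "/checkout": 4}),
}

def simulate(start, steps, rng):
    """Walk the transition model from `start` for up to `steps` clicks."""
    path = [start]
    for _ in range(steps):
        nxt = transitions.get(path[-1])
        if not nxt:
            break  # exit page reached
        pages, weights = zip(*nxt.items())
        path.append(rng.choices(pages, weights=weights)[0])
    return path

rng = random.Random(0)  # fixed seed for a repeatable run
runs = [simulate("/home", 5, rng) for _ in range(1000)]
# Fraction of simulated sessions that reach checkout:
rate = sum("/checkout" in p for p in runs) / len(runs)
```

Swap in a transition model trained on a rival layout and the same simulation gives you a comparable checkout rate – a crude but empirical way to compare Information Architectures before shipping either one.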
Is this straightforward? No. It may involve natural language processing and other intangibles. An ongoing issue in machine learning is that the learning algorithm may not be learning what you expect. But if these issues are carefully addressed, it may be possible to use web site traffic data to train algorithms to act as real-time beta testers. Imagine laying out an information architecture where the click-through percentages varied in real time as you designed, changing the position and sizing of wireframe elements!