I happen to have a small number of codes that can be used to download the eBook version (in full glorious technicolor) for free via the Packt website.
I’m not 100% clear just how many I have but I will find out shortly: it could be as many as 7 or as few as 3.
However, the reason I’m publishing this now, without certainty over numbers, is that the codes will expire shortly, so I need to make them available in a random draw to knowledge-thirsty folks like yourselves!
So, if you would like a chance to get a free copy of my (yet to be, but surely imminently) award-winning eBook then please add your name in the comments field below and I will add you to the random prize draw. Please include your email address in the appropriate box when you submit your comment so I can get in touch with you.
I will close the draw at 10am CET tomorrow (Tuesday) then randomly pull out a sufficient number of names.
To read more about the book I have a dedicated page that outlines the contents and has some sample excerpts.
** Update: Prize draw made! **
And the five randomly drawn winners are…
Published yesterday comes the newest episode of the PBS ‘Off Book’ series in the form of a nice video titled “The Art of Data Visualization”, a subject close to the topic of some of my recent talks. A few out there on Twitter have picked up on this already but I thought I would post it on here to reach a wider audience.
As Lisa Romagnoli – the Associate Producer – explains “Off Book is a web series from PBS that explores cutting edge art and technology and the people behind it. This episode features Edward Tufte and breaks down the ways in which we present and digest data today.”
Humans have a powerful capacity to process visual information, skills that date far back in our evolutionary lineage. And since the advent of science, we have employed intricate visual strategies to communicate data, often utilizing design principles that draw on these basic cognitive skills. In a modern world where we have far more data than we can process, the practice of data visualization has gained even more importance. From scientific visualization to pop infographics, designers are increasingly tasked with incorporating data into the media experience. Data has emerged as such a critical part of modern life that it has entered into the realm of art, where data-driven visual experiences challenge viewers to find personal meaning from a sea of information, a task that is increasingly present in every aspect of our information-infused lives.
This is a guest post by Johanna Morariu (@j_morariu) and Ann Emery (@annkemery) who work for Innovation Network, a nonprofit evaluation, research, and consulting firm in Washington, DC.
You convened stakeholders from all aspects of the evaluation: grantees, their funders, community members, even service recipients. You designed a theory of change that demonstrates how a national initiative will achieve its goals over the duration of a grant. You collected data through interviews, surveys, and other assessments. You spent dozens (hundreds?) of hours analyzing the data. You summarized key findings through a 30-page evaluation report—and even included 30 more pages of appendices with details about the analyses. After 12 months of blood, sweat, and tears, you’re finally ready to share the evaluation findings with your client.
Our strongest weapon against the Dusty Shelf Report is data visualization. These three tactics have proven most powerful:
Tactic #1: Captivate the readers with visuals. We’re also battling the Sea of Similar Reports. Our readers are not immune. Like us, they’re entrenched in a constant downpour of reports, memos, and emails. To make matters worse, the typical evaluation report is about 80% text and 20% visuals—paragraphs of text with one small chart per page. In our State of Evaluation 2012 research, we used the opposite tactic: 80% visuals and 20% text. As we suspected, the visualizations started conversations, made the information more accessible, and helped the (dry) research results stay afloat in the Sea of Similar Reports.
To grab the reader’s attention, we often deviate from the typical one-chart-per-page evaluation report format.
Tactic #2: Choose the design that’s right for the client, not the design that’s right for the data. Yep, we know. This is a major dataviz sin. We hear you. Last year, we developed a social network analysis (SNA) for a foundation. Our SNA followed by-the-book design principles. But there was a major problem: it didn’t fit the client’s dataviz literacy level. They were dataviz novices, and preferred to ‘visualize’ their data in a table. The SNA was completely unusable, thrown onto the heap of Dusty Shelf Reports. Now, we’re strategically moving them forward, bit by bit, to get them to a point where they benefit from SNA. Until then, we’re using tables. Lesson learned: have a contingency plan. What’s beautiful in theory doesn’t always apply to real-world situations.
In evaluation, we use social network maps to visualize relationships between key players in a network.
Tactic #3: Strengthen the dataviz literacy of your troops. Even though we’re visual thinkers, our clients might not be. Lead with visuals that your readers are familiar with (like bar charts). Emphasize data they’re interested in (like contextual details). With a few adjustments, you can overcome even the most lackluster bar charts. In this example, we combined a closed-ended survey question (the stacked bar chart) and a corresponding open-ended survey question (the quotes). Uniting quantitative and qualitative data provides the context they’re craving. Once they’re engaged in the conversation and comfortable with the basics, you can move on to advanced visualizations.
Throughout the evaluation field, surveys are one of the most common ways to collect data.
So, how do you combat the Dusty Shelf Report? Leave your comments below or get in touch via our twitter accounts: Johanna Morariu (@j_morariu) and Ann Emery (@annkemery). Thanks!
This is a guest post from Ben Harrow, Digital Editor at 72Point and at News by Design, a news site built around infographics – a platform that shows off infographics that tell a newsworthy story in a structured and visually dynamic way.
The term ‘infographic boom’ really grates.
As the product and its presence both continually improve, data visualisation is definitely riding a wave – and I guess that means it’s surfing its way into the mainstream media.
With the likes of The Guardian still setting the bar when it comes to quantity and quality, other news orgs like The Press Association are increasingly getting on board, delivering to the rest of the mainstream UK press and ensuring that the tabloids begin to take note.
Even PRs are seeing greater success with agency produced graphics, hoovering up name-checks and giving the business of infographics a wider appeal.
But how are they being received? Is data visualisation, in all its forms, being implemented more regularly just because it’s the next new thing? And will there be a place, long term, for graphics and visualisations in the mainstream press, both on and offline, or will the quality collapse and the interest dwindle?
Discussing data – the four kinds of ‘infographic’
I’m not even going to go into the definition of ‘infographic’ – to me, it’s a single piece of data visualisation. An information graphic. An informative graphic. The meaning has changed and will continue to change, as all meanings do.
I will however, break them down into categories because, for me, they all fall into pretty different camps:
The Beautiful - we’re talking Pitch Interactive’s Drone Strikes and visualising population with NPR – often large, complex sets of data visualised in unique, intuitive and often visually staggering or stunning ways.
The Brilliant - this is the upper echelon of what is, essentially, standard procedure – we’re talking about well researched, data-driven, reactive and consistently high quality graphics, growing out of the newsrooms of The Guardian, The Washington Post and The Economist (although they have the tendency to rely on basic graphs).
The Basic - there’s nothing wrong with basic – we’re talking about typically vertical infographics on pop culture, issues in the news or survey-based statistics, the best of which find their way into the national newspapers online (occasionally in print) and across sector-specific news websites. Well-referenced research and clean, clear design, often created by an agency or a freelance/in-house designer.
The Brave - now, there IS something wrong with this. We’re talking poorly designed, poorly referenced (or not referenced at all) or no original research. Often been done a million times before with marginally different ideas or design. The bane of my life on News by Design, and the graphic equivalent of press release spam. Basically ‘The Basic’, but done all kinds of wrong.
Now, the vast majority will fit into the middle two categories – most of ‘The Brave’ won’t go viral or even see the light of day, and ‘The Beautiful’ are few and far between and, oddly, don’t pick up major coverage (because they are often impossible to embed or are a story in themselves, not needing coverage on another platform).
So if we’re talking mainstream media, you don’t really get to see the very best and the very worst.
The UK papers, and a dash across the pond
The Guardian obviously have a monopoly, of sorts, on ‘The Brilliant’ in the UK – when you think about data and data vis, you go straight to the Datablog (but The Economist are consistent, and another good example).
This means that the vast majority of ‘Brilliant’ infographics or data visualisations are only ever produced in-house, whether by their own designers or as part of a wider project attached to the organisation – and they are only ever distributed via one platform. There’s no PR or social media effort, so to speak, simply self publication.
This means that, unlike the standard infographic, which is intended to ‘live’ for a long time and remain as viral as possible for as long as possible, they have a shelf life. Which is a curious thing.
As a result, and as is the nature of the type of publication, the ‘Brilliant’ are reactive to the news agenda, feature high quality research and data journalism and are much more suited to print, where they are beginning to appear more and more often (most interestingly as a regular feature in The Metro).
It’s basically journalism vs. public relations all over again.
(When it comes to the US, they are, on the whole, kicking the UK when it comes to consistently producing quality graphics – across the likes of The New York Times and The Washington Post, they have more people consistently covering important issues visually – plus, The Guardian have around a third of their readers in the US, and the Economist is partly US too – selfish, is what that is).
Marketing, PA, PR and social media shenanigans
So marketers saw the potential of these infographics to go viral. We all know the story. Content marketing, blahblahblah.
The better of ‘The Basics’ are attached to good brands and good PR companies, who now have the potential to achieve huge coverage on some of the most popular news websites in the world – and it does make sense to try and take advantage.
Mashable is a brilliant example of an early (and continuous) adopter – develop a good news angle, a strong topline and a decent quality infographic and you’re golden. Even we’ve done it, as part of the day job. And it’s genuine, good quality coverage.
And the Press Association are making the process more mainstream, regularly sending out graphics as a picture desk would send out photos – as a resource for journalists to enhance and enlighten stories.
The quality and interest vary, but it’s an enticing prospect and something which could further wedge the door open, so to speak, to allow room for infographics to become the norm in the mainstream press.
It’s definitely a nice change from the self-producing market leaders (Guardian, Economist) having the monopoly on beautiful, visual things – but only if we’re heading in the right direction.
For me, there are two reasons why infographics in the mainstream media are becoming pollutants – because of either the clients, or the quality.
For example, take, ironically, ‘The State of Infographics’ infographic that went viral as anything earlier this year – really good looking piece and some really good data.
But look at the ‘client’ – topmarketingschools.net. I refuse to hyperlink, but take a look. Take an actual look. I dare you.
Brands doing infographics based on subject matter relevant to their public perception are my favourite. They show understanding, wit and intelligence, and the willingness to do something new.
But when it’s marketing spam, for a website that’s useful to no-one with content that has no relevance to your brand?
That’s when you’re polluting the pool.
The same goes for poor quality infographics – if you’re going to do it, do it right. Don’t spend less than £100, or use a company you don’t know, and just do it for the sake of doing it, believing that if it’s an infographic, it’ll go viral.
Don’t use research you’ve seen on a different infographic (or no research at all, or unreferenced research), repeating the process just because you’ll catch some marketing value by proxy (Gangnam Style. Ooofffftt).
I can understand that not everyone can afford an intense marketing effort, but there are young freelancers who will really try for you, and survey companies that will offer you research. And there are always original ideas.
We don’t always expect ‘The Beautiful’ – and as I said, they don’t often go viral – but if people strove to rise above the average, there’d be no arguments from any corner.
In short, marketing or not, it’s about the story. Graphic and data vis. are great tools, but so are photos, news copy and video. Don’t use something simply because it’s trending, or simply for the sake of using it.
Think of a brilliant story, and use whatever medium will tell it best. It will make the best journalism, the best marketing tool, and the most viral product.
And that’s coming from someone who’s worked in data journalism, is a PR, and loves talking about infographics.
One collection of projects that most intrigued me came from the work of Kate McLean and her fascinating sensory maps: mapping the smells, sounds and tastes of a city. Here is an example of a wonderfully-titled project ‘Auld Reekie’ showing the smell patterns of Edinburgh.
Kate is a designer, photographer and a lecturer, and describes herself as a ‘sensory researcher’ producing work that “challenges the paradigm that graphic design can rely on the visual. We have 5 senses… let’s use them”. I love the quote below, welcoming visitors to Kate’s homepage, as it perfectly captures the incredibly evocative and enduring memories we can create from our non-visual senses.
There are so many dimensions to Kate’s work. On one level she creates abstract visual representations of cityscapes using shape, contours and colours to encode the smells that characterise different parts of a city.
The cities that seem to provide the most striking experiences are those older, non-homogeneous cities with a varied tapestry of cultures and communities. She has done work on Glasgow, Edinburgh, Newport (RI), Paris and New York, and right now she is working on a smell map project for Amsterdam and has just returned from an exhibition of a 3D Smell Map of Milan.
Her process involves undertaking ‘smell walks’ of a city to capture for herself the essence of a city through its aroma. She will then record these experiences forming a ‘smell’ sketch layer of notes on top of a map of the city. From this research Kate will create a visual portrayal of the city, as we see in the image at the top, but she also goes a stage further by making up individual scents using natural ingredients that best reflect the smells identified. These aren’t part of the maps themselves but are incorporated into the wider exhibition, as she describes in an article for the Daily Mail:
Each scent is stored in its own bottle which is stored in a small cabinet underneath the map. I prefer to keep the contents of the bottles hidden so that the audience cannot rely on visual cues to identify the smells.
I have learned how to distill rose petals, to create a perfume of stinky cheese, to depict the smell of penguins at the zoo without harming a single penguin. I can fabricate the smell of a building site and of boy’s toilets in primary schools*.
The image below shows another interactive approach to creating this work, this time conducted for a Paris smell map project. Here, the audience were invited to smell a scent, consider their memory or recollection of that smell, and then note the feeling they identified with it before placing their note on the map in the location they most associated with that smell.
Kate has also worked on a ‘taste map‘ project in Edinburgh using slabs of beef dripping (that eventually melted!) to illustrate the different levels of fat content in typical diets based on (I think) the available cuisine/food outlets around the city. She has also worked on a tactile representation of Edinburgh.
You can keep track of Kate’s fascinating research and design process through her blog and follow her updates on Twitter (@katemclean). For further information, you can listen to Kate chatting about ‘Smell and the City’ (…the lesser known TV series!) on radio in Rhode Island last summer, and here is a presentation she gave on ‘Representing Smell’.
* I think I received that aftershave for Christmas a couple of years ago…
Last year I had the pleasure of taking part (in a very small way!) in the Big Dive EU, an intensive 5 week training program based in Turin, Italy aimed at boosting a new generation of data scientists and visualisation developers.
After a successful first edition, TOP-IX, together with Axant, ISI Foundation and Todo have announced there will be a second edition of this event and I’m more than happy to share details and help spread the word:
The demand for data scientists is growing exponentially and we are just at the beginning of an exciting Big Data era in the IT world. This is your chance to boost your data science skills diving into the BIG DATA universe. Big Dive EU is like a street-fighting gym where high value datasets are the raw material in the hands of a bunch of ambitious smart geeks tutored and mentored by experts in three key areas: Development, Visualization and Data Science.
Here is an outline of the disciplines that are being covered:
# DEVELOPMENT
Web application with Python
API and Data fetching
Data Storage (Relational/ key-value databases)
Hadoop and MapReduce
Performance Optimization and Profiling
# VISUALIZATION
Brief history of data visualization
Showcase of relevant data visualization projects
The manifesto of data visualization
Visual paradigms for dataviz
Step by step creation of the first dataviz project through creative coding
# DATA SCIENCE
Network science: represent and analyze network data
Natural language processing and semantics
The second edition will start on June 3rd, 2013 and run through to July 5th with full-time lessons Monday through to Friday of each week. There are only 20 places available for this fantastic training opportunity. Applications close on 19th May so act quickly. Pricing information can be found here but for more details in general visit the website at www.bigdive.eu, email the organisers at firstname.lastname@example.org or check out the twitter feed @bigdive_eu.
This is a guest post from Dr Paula McLeod who has one of the most interesting jobs (and challenges!) I’ve heard of for a long time. In September of 2012 Paula was appointed as statistician for St. Helena on a two-year fixed term contract. Very few of you may have heard of St. Helena. It is a small volcanic Island in the South Atlantic Ocean, a British territory with only 4500 residents. It is such a remote island that it was used to imprison Napoleon Bonaparte in 1815 and it takes five days to sail by RMS St Helena to Cape Town for access to travel hubs. The development of an airport scheduled for completion in 2015/2016 is expected to transform the island’s fortunes.
This week Paula will be hosting a local event as part of the worldwide Big Data Week, so I invited Paula to share with readers the unique challenges she is facing in her role, such as “transforming the quality and range of available statistics, support users in accessing and interpreting data”. Paula is very keen to invite any readers to share with her any suggestions of resources or support that you think may be of help in the ongoing development of the island’s capability.
St Helena is a small island heading for big change. One of the most isolated islands in the world, St Helena has been used as a stopover for passing ships on the pre-Suez canal route from Europe to Asia and South Africa, and as a place of exile.
The isolation is true both physically and in terms of communications – our internet connection speeds range from 128Kbps to 2Mbps, depending on the depth of your pocket. Although improving the speed of our broadband services is dependent on securing funding to re-route a planned trans-Atlantic fibre connection, the days of physical isolation are numbered. St Helena is facing up to irrevocable change with the impending arrival of air access – construction is well underway for an airport scheduled for operational completion in February 2016.
As we head into this change, St Helena has a higher than ever demand for reliable data on the people, the economy and the environment. The statistics need to be bang up-to-date (where possible) and accessible to all (always!). To build trust, provide accountability and enable the community at large to engage with and support the change process, everyone must have access to the same information. This means many changes in the way we collect data, process information, and then report and disseminate. We need to modify approaches to provide immediate information, accessible to all and presented in a way people can understand regardless of level of education. I don’t believe that these are radical ideas, but making an idea a reality is not always easy, especially when it involves many technical and skilled processes.
Before coming to the island I had only ever worked in the UK, in academia and the civil service. I took for granted the many experts who surrounded me. If advice and support couldn’t be found in my office or through professional contacts then it was often a Google search away. Training courses and workshops are readily available. Where needed, consultants and contractors can be brought in to fill vital roles in a project for a day, a week, or as and when required.
On St Helena, sending a team member on a one-day training course requires upwards of a month away from the office to allow for travel time. Online seminars and e-learning are a growing area which we would love to engage with… but these, unsurprisingly, are not tailored to our internet capacity. It’s not unreasonable to make use of YouTube, until you are in a place where your download allowance is 500MB a month. It is exciting to see seminars being made available online. Finding them is not always straightforward. Again, unsurprisingly, browsing is not a quick process when most websites are not designed for limited bandwidth.
The thirst for knowledge and advancement found here is admirable. When information is presented well, there is genuine delight. An hour-long session showing Hans Rosling’s documentary “The Joy of Stats” has the most amazing impact. You may be able to watch this online any time you like, or order the DVD for next-day delivery. For St Helena it took eight weeks to arrive by post.
We want to engage and inspire people but that is almost the easy part. Keeping up interest is more difficult. It is slightly grieving that, having initiated excitement about data, we are unable to back it up with convincing examples of where this can be put to good use. Data collection and collaboration on St Helena is in its infancy. Migration towards electronic databases is in progress but currently exists as disparate, isolated solutions. The need for change towards conformity and collaboration is recognized but difficult to achieve. This reticence to engage with new techniques and technologies is repeated the world over and it is fantastic to be able to join with international initiatives, such as Big Data and Statistics 2013, in order to make progress.
It is hardly surprising that data is difficult to get hold of and poorly used when it is difficult to show the benefits. Why should anyone be expected to be burdened with sharing sensitive information about themselves or their business if there is no apparent benefit (other than the pleadings of a dedicated enumerator!)?
Some prime data issues on St Helena are:
Fear of information: there is a strong suspicion that information will be used against people. Disclosure risk is difficult to manage in a small population. The island population currently stands at a little under 4,300. To be unique is the one common characteristic! That said, many people are open to the concept but need to be convinced of the benefits that come from sharing information on income, wages and so on. Businesses and those responsible for production need an even greater level of persuasion that collaboration on use of data will benefit their business as well as (not just?) that of their neighbour.
Limited use of business data: from monitoring stock levels to market strategy there is a keen need for better use of data. This isn’t a support service businesses can buy in – the skills simply aren’t available on island. The solution is training and support, but people need convincing that the expense and effort are worth it. We don’t have local examples so we look to provide inspiration from overseas… we just need to pinpoint the right stories!
Lack of understanding: we just can’t “see” the benefits of using data well. The UK media is a wealth of information and the way this is presented is constantly improving. If further information is required it is a few clicks or a trip to the library away. These are straightforward activities which just aren’t as easy here. Our local library is not blessed with many core references on use and understanding of data or statistics. On an island where the median income is less than £6,500 all purchases must be very carefully considered and clearly justified.
The island needs to be shown the benefits that come from making personal and business data available for use by those equipped with the expertise to manipulate, analyze and present information. Equally important is the ability to correctly interpret the information with which they are presented – to know whether this information is correct, fundamentally flawed, or being cherry-picked to support a particular business need or political stance.
This isn’t a sorry story from an isolated British territory, heavily dependent on UK aid. This is a story of a small island growing in population, economy and potential. We need to develop in the use and understanding of data to ensure we are equipped to deal with our entry into the international arena. This is a journey that many others have started on, perhaps just a little bit ahead of us and with a greater ability to accommodate change. Capacity within the St Helena Statistics Office is limited and our needs are many-legged (see below). If you have any suggestions of resources or support that you think may be of help in this development then we will be very grateful to hear from you; you can email me via email@example.com.
You can follow Paula’s personal experience of this genuine life experience on her blog ‘Small Island Stats‘.
At the end of each month I pull together a collection of links to some of the most relevant, interesting or thought-provoking web content I’ve come across during the previous month. If you follow me on Twitter you will see many of these items shared as soon as I find them. Here’s the latest collection from March 2013.
Includes static and interactive visualisation examples, infographics and galleries/collections of relevant imagery.
Peoplemov.in | Updated interactive slope graph by Carlo Zapponi to show where the migrant populations of the world are moving from and to as of 2010.
Economist | Simple but effectively executed video graphic to explain the trend-bucking recent rise in music sales
Washington Post | Where Americans go to work: commuting in and out of counties nationwide
Wired | Microsoft whiteboard unites big data, predictive drawing and autocorrect
University of Lincoln | Gallery of projects from the Graphic Design course, trying out visual storytelling
ChronoZoom | Interactive tool to convey and contrast the staggering difference in sizes of the Cosmos, Life and Humanity
National Post Graphics | Impressive collection of graphics from the National Post Canadian (online?) newspaper.
Reuters | A multi-media, multi-chaptered mini-site exploring the different dimensions of ‘Connected China’ (work by Fathom)
Behance | Cost to Cost: ‘visualising holidays for all budgets represented by a world map where the position of the different cities is not based on the real distance but on the price of low-cost flights from Italy and the cost of 3 nights in hotel during the week of Christmas’
Huffington Post | Exploring the people who have died in gun-related incidents since Newtown…
The Why Axis | Second great post from Bryan Connor, this exploring the Washington Post’s project ‘How long will we live – and how well?’
Datavisualization.ch | Design narrative from Interactive Things about their work on the ‘Life After Fukushima’ project
Masters of Media | Interesting piece discussing colour, framed by Moritz’s famous ‘colour is difficult’ quote
Bloomberg TV | ‘A look at how this graphic form of business analytics and intelligence software is helping companies analyze data, forcing C-suite executives to develop what some call visual literacy’
Learning & Development
These links cover tutorials, learning opportunities, case-studies, how-tos etc.
Telling Information | ‘Before you hit the ‘chart’ button…’, useful advice from Lulu about choosing the right chart to illustrate different points about data
Eager Eyes | Robert explores the idea that ‘visual representation gives numbers and concepts a reality they don’t otherwise have’
Data Viz Blog | A review of Alberto Cairo’s MOOC ‘Introduction to Infographics and Data Visualisation’
Eager Eyes | A better definition of chart junk (particularly interesting comments/discussion)
Stamen | Stamen announce their new mapping project titled ‘here’
Includes announcements within the field, brand new sites, new (to me) sites, new books and generally interesting developments.
VisualizingEconomics | New book by Catherine Mulbrandon ‘An Illustrated Guide to Income in the United States’
UK Data Service | Newly discovered site with a collection of datasets providing ‘a comprehensive resource funded by the ESRC to support researchers, teachers and policymakers who depend on high-quality social and economic data’
Graphics Link | Conference: 17th International Conference Information Visualisation, 15th to 18th July 2013 in London
I’ve only had intermittent WiFi access over the past 48 hours but earlier today I caught a discussion relating to the percentage of delegates attending data visualisation training who are female.
Any ongoing discussion about gender balance and participation of women in our field is of great value and having trained over 800 delegates since November 2011 I felt I had a good sample size from which to draw some analysis relating to this issue. Instinctively, I responded to Lynn Cherny’s tweet below with an estimate of about 35% of delegates being female but I decided to do some quick work to firm this up.
Instinctively feels better than the 25% from census, ~35%? RT @arnicas: Folks who teach infovis courses: what’s your gender ratio in class?
I went through my records and marked up the gender of those attending my public training classes. I don’t have sufficiently detailed records of those who attended private training events, which make up about half of this dataset, so I discounted these from the investigation. This gave me a sample total of 430 people. A quick bit of analysis and presentation in Excel is shown below, click for a larger view.
Aside from revealing my incredible powers of astuteness (35% vs. 36.5% is pretty good, let’s face it!), the main headline from this analysis is that the percentage of attendees who are female is higher than among those participating in the recent data visualisation census. Perhaps this shows a healthier ‘pipeline’ of females entering/learning about the field.
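For anyone curious, the quick tally behind these figures is easy to reproduce. This is only a hypothetical sketch of the calculation, not my actual Excel workbook: the record format and the 157/430 split are assumptions back-derived from the 430-person sample and ~36.5% result quoted above.

```python
# Hypothetical reconstruction of the gender tally described in the post.
# Each attendance record is assumed to carry a marked-up 'gender' field;
# we count genders and express each as a percentage of the sample.
from collections import Counter

def gender_breakdown(records):
    """Return {gender: percentage} for a list of attendance records."""
    counts = Counter(r["gender"] for r in records)
    total = sum(counts.values())
    return {g: round(100 * n / total, 1) for g, n in counts.items()}

# Illustrative sample consistent with the 430-delegate dataset:
# 157 female delegates is roughly the quoted 36.5%.
records = [{"gender": "female"}] * 157 + [{"gender": "male"}] * 273

print(gender_breakdown(records))  # {'female': 36.5, 'male': 63.5}
```

The same one-liner logic (count, divide, round) is all the Excel version is doing behind its chart.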
I’m in the midst of inviting a few folks to contribute guest posts to profile their work, ideas or knowledge. This guest post comes from Jurian Baas of Silk, who explains how you can use his tool to create and publish simple data visualisations.
It’s exciting to see how people start to recognize the importance of data visualisations. Good data journalism and visualisations help to make sense of the enormous amounts of data being made available each day. But while there is a much needed growth in visualisation tools, their learning curve is steep. Unless you are a mathematician or programmer, it’s hard to know where to begin.
We are building Silk to make it easier for everyone to get to work with collections of data. A Silk site lets you import or create datasets, and publish articles and interactive visualisations. The data and analyses live in one place, so your visitors can be encouraged to play around with the parameters of the visualisations. We aspire to make working with data available to everyone, just like blogging platforms such as WordPress and Tumblr did for publishing.
To show you how our platform works, let’s see how a Silk site gets made. I’ll use myself as an example. I’m a big movie fan, and I created a Silk site about pirated Oscar movies. I found a dataset with information about when Oscar-nominated movies got leaked on piracy sites, and imported the spreadsheet into my Silk site. For each row in the spreadsheet, a page is created that holds all the information from the columns.
I used the IMDB to add actors, directors and other information to the movie pages. This makes each movie page interesting in itself.
On the homepage, I placed interactive visualisations to show which movies are leaked early, and which genres are leaked the most. The graphs use the properties on the imported pages. Anyone visiting the site can hit the ‘explore’ button on each widget and play around with the data. In addition to graphs, you can also insert interactive maps if your site has location data (see The Guardian’s Silk site for an example).
That’s all there is to it! I hope some of the readers of Visualising Data find value in our platform. We are constantly trying to improve it, and welcome any kind of suggestion and commentary. You can read more about the tool here, or check out this collection of Silk sites. I’m on Twitter as @jurb.