How to make a data visualisation with Infogr.am

Infogr.am is a free online tool that helps you make quick and beautiful interactive data visualisations like the one I prepared for my online journalism blog. Its interface is intuitive and user-friendly, and most of its tools are drag-and-drop, which makes Infogr.am easy to operate.

The first step is to choose your data and plan what you want to present in your interactive visualisation. I opted for a data set from the Organic Market Report 2014 compiled by the Soil Association (available on demand).

Once you sign up to Infogr.am and start your creative process, you are invited to choose one of the ready-made templates.

Choose a colour palette that you want to go for and click “Use design”. A dashboard with editable elements appears, to which you can add a chart, a map, text, a photo or a video from the menu on the right.

Double-click on each element to edit it: change the text or open a chart menu. First, give your visualisation a title. Edit the existing chart or add a new one, making sure you choose the right type of chart for the type of data you have. Double-click on the chart, and an Excel-like spreadsheet appears where you can paste your data.
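If your source data lives in an Excel file, it can help to trim it down to just the columns your chart needs before pasting. Here is a minimal sketch, assuming pandas is installed; the file name and column names are hypothetical examples, not the actual Soil Association data.

```python
# A minimal sketch for preparing paste-ready chart data
# (file name and column names are hypothetical examples).
import pandas as pd

df = pd.read_excel("organic_market_report_2014.xlsx")

# Keep only the two columns the chart needs: a label and a value.
subset = df[["Category", "Sales"]]

# Tab-separated text pastes cleanly into spreadsheet-style grids.
subset.to_csv("chart_data.tsv", sep="\t", index=False)
print(subset.head())
```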

After the final tweaks to your data, go to the second tab, “Settings”. Depending on the chart you chose, you will find different editing options here: colours, directions, chart size and others.


Pay close attention to how you manipulate your chart: it should present the data in a clear and easily understandable way.

After you have finished adjusting your chart, click “Done” and go on to add more elements to your visualisation.

Infogr.am is a great tool, especially for beginners in data-driven journalism, yet it has a few major limitations:

  1. It is impossible to copy-paste text to and from text boxes, which makes typing time-consuming and rather laborious.
  2. As you manipulate the data in the Excel-like spreadsheet, the preview of the chart is unavailable, so you may have to save and re-edit the chart several times before you achieve the effect you want.
  3. It would be useful to be able to caption the charts directly, as opposed to having to add chart titles and captions as separate elements to your visualisation.

How to scrape data without coding? A step-by-step tutorial on import.io

Import.io (pronounced import-eye-oh) lets you scrape data from any website into a searchable database. It is perfect for gathering, aggregating and analysing data from websites without the need for coding skills. As Sally Hadadi told Journalism.co.uk, the idea is to “democratise” data: “We want journalists to get the best information possible to encourage and enhance unique, powerful pieces of work and generally make their research much easier.” Different uses for journalists, supplemented by case studies, can be found here.
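If you are curious what a tool like this automates behind the scenes, here is a minimal hand-coded equivalent of a simple scrape. This is only an illustrative sketch: the URL and CSS selectors are hypothetical placeholders, not import.io’s actual internals.

```python
# A sketch of what a scraper automates: fetch a page and pull
# repeated fields out of its HTML (URL and selectors are
# hypothetical examples).
import requests
from bs4 import BeautifulSoup

url = "https://example.com/search?q=orphanages+london"
soup = BeautifulSoup(requests.get(url).text, "html.parser")

rows = []
for entry in soup.select("div.result"):  # one block per search result
    rows.append({
        "name": entry.select_one("h3").get_text(strip=True),
        "address": entry.select_one(".address").get_text(strip=True),
    })

print(rows)
```

import.io wraps this pattern in a point-and-click interface, which is what the rest of this tutorial walks through.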

After downloading and opening the import.io browser, copy the URL of the page you want to scrape into it. I decided to scrape the search results page for orphanages in London.

After opening the website, press the tiny pink button in the top right corner of the browser and follow up with “Let’s get cracking!” in the bottom right menu that has just appeared.

Then, choose the type of scraping you want to perform. In my case, it’s a Crawler (we’ll be getting data from multiple similar pages on the same site).

And confirm the URL of the website you want to scrape by clicking “I’m there”.

As advised, choose “Detect optimal settings” and confirm the suggested settings.

In the “Rows per page” menu, select the format in which data appears on the website: “single” or “multiple”. I’m opting for “multiple”, as my URL is a listing of multiple search results.

Now the time has come to “train your rows”, i.e. mark which parts of the website you are interested in scraping. Hover over an entire “entry” or “paragraph”…

…and the entry will be highlighted in pink or blue. Press “Train rows”.

Repeat the operation with the next entry/paragraph so that the scraper gets the hang of the pattern of your selections. Two examples should suffice. Scroll down to the bottom of the page to make sure that all entries down to the last one are selected (i.e. highlighted alternately in pink or blue).

If they are, press “I’ve got all 50 rows” (the number depends on how many rows you have selected).

Now it’s time to focus on the particular chunks of data you would like to extract. My entries consist of the name of the orphanage, its address, phone number and a short description, so I will extract each of those into a separate column. Let’s start by adding a column, “name”.

Next, highlight the name of the first orphanage in the list and press “Train”.

Your table should automatically fill in with the names of all the orphanages in the list.

If it didn’t, try tweaking your selection a bit. Then add another column, “address”, and extract the orphanage’s address by highlighting the two lines of the address and “training” the rows.

Repeat the operation for “phone number” and “description”. By the end, your table should have a filled column for each field.

*Before passing on to the next column, it is worth checking whether all the rows have filled in. If not, you may need to highlight and “train” individual elements.
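Conceptually, each trained row is just a record with one value per column. Here is a sketch of the structure, with hypothetical placeholder values rather than real scraped entries, plus the kind of completeness check the note above suggests:

```python
# Hypothetical placeholder rows illustrating the table's structure;
# these are not real scraped entries.
rows = [
    {
        "name": "Example Children's Home",
        "address": "1 Example Street, London",
        "phone": "020 0000 0000",
        "description": "A short description of the service.",
    },
    # ...one record per row in the table
]

# Flag any rows with empty fields before moving on to the next column.
for i, row in enumerate(rows):
    missing = [field for field, value in row.items() if not value]
    if missing:
        print(f"Row {i} is missing: {missing}")
```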

Once you’ve grabbed all that you need, click “I’ve got what I need”. The menu will now ask you if you want to scrape more pages. In this case, the search yielded two pages of results, so I will add another page. To do this, go back to your website in your regular browser, go to page 2 (or any other page) of your search results and copy the URL. Paste it into the import.io browser and confirm by clicking “I’m there”.

The scraper should automatically fill in your table for page 2. Click “I’ve got all 45 rows” and “I’ve got what I needed”.

You need to add at least 5 pages, which is a bit frustrating with a smaller data set like this one. The way around it is to add page 2 a couple of times and delete the unnecessary rows in the final table.

Once the cheating is done, click “I’m done training!” and “Upload to import.io”.

Give your Crawler a name, e.g. “Orphanages in London”, and wait for import.io to upload your data. Then, run the crawler.

Make sure that the page depth is 10 and click “Go”. If you’re scraping a huge dataset with several pages of search results, you can copy your URLs into Excel, highlight them and drag down with the black cross (in the bottom right of the cell) to generate a comprehensive list. Paste it into the “Where to start?” window and press “Go”.
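If you would rather not build the URL list by dragging in Excel, a few lines of code can generate it instead. The pagination pattern below is a hypothetical example; adapt it to however your site numbers its result pages.

```python
# Generate a list of paginated search-result URLs to paste into the
# "Where to start?" window (the URL pattern is a hypothetical example).
base = "https://example.com/search?q=orphanages+london&page={}"
urls = [base.format(n) for n in range(1, 11)]  # pages 1 to 10

print("\n".join(urls))
```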

After the crawling is complete, you can download your data in Excel, HTML, JSON or CSV format.

As a result, we obtain a data set which can be easily turned into a map of orphanages in London.
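Before mapping, it is worth a quick sanity check of the export. Here is a short sketch, assuming you downloaded the CSV; the file name is a hypothetical example, and the columns match the ones trained above.

```python
# Load the exported CSV and check what came through
# (file name is a hypothetical example).
import csv

with open("orphanages_in_london.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

print(f"{len(rows)} rows downloaded")
print(rows[0]["name"], "-", rows[0]["address"])
```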

Do you have any further tips for import.io extraction? Do you know any other good scrapers? Share your thoughts in the comments below.

Hint: If you need to structure and clean your data, here’s how to do it.

In the meantime, look out for another post in which I will explain the next step: how to visualise the data you have.