Solution: the biggest earthquake in the UK this century

Download the data

import requests

quakes = requests.get(
    "http://earthquake.usgs.gov/fdsnws/event/1/query.geojson",
    params={
        "starttime": "2000-01-01",
        "maxlatitude": "58.723",
        "minlatitude": "50.008",
        "maxlongitude": "1.67",
        "minlongitude": "-9.756",
        "minmagnitude": "1",
        "endtime": "2018-10-11",
        "orderby": "time-asc",
    },
)
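
Before parsing the response, it is worth confirming that the request actually succeeded. A minimal check (not part of the original solution) using the Response object's raise_for_status method:

quakes.raise_for_status()  # raises an HTTPError if the server returned a 4xx or 5xx status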

Parse the data as JSON

import json
quakes.text[0:200]
'{"type":"FeatureCollection","metadata":{"generated":1659971964000,"url":"https://earthquake.usgs.gov/fdsnws/event/1/query.geojson?starttime=2000-01-01&maxlatitude=58.723&minlatitude=50.008&maxlongitud'
requests_json = json.loads(quakes.text)

Note that the requests library has native JSON support, so you could do this instead: requests_json = quakes.json()
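
For example, assuming the request above succeeded, these two lines produce the same dictionary:

requests_json = json.loads(quakes.text)  # explicit parsing with the json module
requests_json = quakes.json()            # equivalent shortcut provided by requests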

Investigate the data to discover how it is structured

There is no foolproof way of doing this. A good first step is to see the type of our data!

type(requests_json)
dict

Now we can navigate through this dictionary to see how the information is stored in the nested dictionaries and lists. The keys method can indicate what kind of information each dictionary holds, and the len function tells us how many entries are contained in a list. How you explore is up to you!

requests_json.keys()
dict_keys(['type', 'metadata', 'features', 'bbox'])
type(requests_json["features"])
list
len(requests_json["features"])
120
requests_json["features"][0]
{'type': 'Feature',
 'properties': {'mag': 2.6,
  'place': '12 km NNW of Penrith, United Kingdom',
  'time': 956553055700,
  'updated': 1415322596133,
  'tz': None,
  'url': 'https://earthquake.usgs.gov/earthquakes/eventpage/usp0009rst',
  'detail': 'https://earthquake.usgs.gov/fdsnws/event/1/query?eventid=usp0009rst&format=geojson',
  'felt': None,
  'cdi': None,
  'mmi': None,
  'alert': None,
  'status': 'reviewed',
  'tsunami': 0,
  'sig': 104,
  'net': 'us',
  'code': 'p0009rst',
  'ids': ',usp0009rst,',
  'sources': ',us,',
  'types': ',impact-text,origin,phase-data,',
  'nst': None,
  'dmin': None,
  'rms': None,
  'gap': None,
  'magType': 'ml',
  'type': 'earthquake',
  'title': 'M 2.6 - 12 km NNW of Penrith, United Kingdom'},
 'geometry': {'type': 'Point', 'coordinates': [-2.81, 54.77, 14]},
 'id': 'usp0009rst'}
requests_json["features"][0].keys()
dict_keys(['type', 'properties', 'geometry', 'id'])

It looks like the coordinates are in the geometry section and the magnitude is in the properties section.

requests_json["features"][0]["geometry"]
{'type': 'Point', 'coordinates': [-2.81, 54.77, 14]}
requests_json["features"][0]["properties"].keys()
dict_keys(['mag', 'place', 'time', 'updated', 'tz', 'url', 'detail', 'felt', 'cdi', 'mmi', 'alert', 'status', 'tsunami', 'sig', 'net', 'code', 'ids', 'sources', 'types', 'nst', 'dmin', 'rms', 'gap', 'magType', 'type', 'title'])
requests_json["features"][0]["properties"]["mag"]
2.6
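
Before writing a search loop, a quick sanity check is to pull every magnitude out with a list comprehension; a small sketch (not part of the original solution) based on the structure explored above:

magnitudes = [feature["properties"]["mag"] for feature in requests_json["features"]]
max(magnitudes)  # the largest magnitude, without yet knowing which quake it belongs to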

Find the largest quake

quakes = requests_json["features"]  # reuse the name `quakes` for the list of earthquake features
largest_so_far = quakes[0]
for quake in quakes:
    if quake["properties"]["mag"] > largest_so_far["properties"]["mag"]:
        largest_so_far = quake
largest_so_far["properties"]["mag"]
4.8
lat = largest_so_far["geometry"]["coordinates"][1]
long = largest_so_far["geometry"]["coordinates"][0]
print("Latitude: {} Longitude: {}".format(lat, long))
Latitude: 52.52 Longitude: -2.15
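
An equivalent, more compact approach is Python's built-in max with a key function; a sketch assuming requests_json from above:

largest = max(requests_json["features"], key=lambda quake: quake["properties"]["mag"])
largest["properties"]["mag"], largest["geometry"]["coordinates"]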

Get a map at the point of the quake

import requests


def request_map_at(lat, long, satellite=True, zoom=10, size=(400, 400)):
    base = "https://static-maps.yandex.ru/1.x/?"

    params = dict(
        z=zoom,
        size="{},{}".format(size[0], size[1]),
        ll="{},{}".format(long, lat),
        l="sat" if satellite else "map",
        lang="en_US",
    )

    return requests.get(base, params=params)
map_png = request_map_at(lat, long, zoom=10, satellite=False)
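
If you are not working in a notebook, you could instead write the returned PNG bytes to a file and open it with any image viewer; a minimal sketch (the filename is just an example):

with open("quake_map.png", "wb") as f:
    f.write(map_png.content)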

Display the map

from IPython.display import Image

Image(map_png.content)
(Output: a static map image centred on the location of the largest quake.)