When did the two highest mega-floods in the Amazon occur?
We found two simulated mega-floods that deviate strongly from what can be expected from the observed record. Here we perform a first analysis to figure out when they happened.
##Load packages
import xarray as xr
import matplotlib.pyplot as plt
import numpy as np
import cartopy
import cartopy.crs as ccrs
##This is so all expressions get printed within Jupyter, not only the last one
from IPython.core.interactiveshell import InteractiveShell
InteractiveShell.ast_node_interactivity = "all"
We load the data from van der Wiel et al. (2019), openly available through https://zenodo.org/record/2536396.
Note the warning message regarding the time axis below. I suspect this is because time is just a counter from 1 to 2000 -> there are 2000 present-climate and 2000 future-climate years.
dirname = r'/home/tike/Discharge/'
Global_discharge = xr.open_dataset(dirname + 'Zenodo/presentYearMax.nc')
Global_discharge
/home/tike/miniconda3/envs/exp/lib/python3.8/site-packages/xarray/coding/times.py:426: SerializationWarning: Unable to decode time axis into full numpy.datetime64 objects, continuing using cftime.datetime objects instead, reason: dates out of range
dtype = _decode_cf_datetime_dtype(data, units, calendar, self.use_cftime)
/home/tike/miniconda3/envs/exp/lib/python3.8/site-packages/numpy/core/_asarray.py:85: SerializationWarning: Unable to decode time axis into full numpy.datetime64 objects, continuing using cftime.datetime objects instead, reason: dates out of range
return array(a, dtype, copy=False, order=order)
Dimensions: (lat: 360, lon: 720, time: 2000)
Coordinates:
  * time  (time) object 0001-01-01 00:00:00 ... 2000-01-01 00:00:00 (long_name: Days since 1901-01-01)
  * lat   (lat) float32 89.75 89.25 88.75 ... -89.25 -89.75 (units: degrees_north)
  * lon   (lon) float32 -179.75 -179.25 ... 179.25 179.75 (units: degrees_east)
Data variables:
    discharge  (time, lat, lon) float32 (units: m3s-1) [518400000 values with dtype=float32]
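If the decoded calendar dates are not needed, the warning can be avoided by keeping the raw time numbers. This is only a sketch and is not used in the rest of the analysis; decode_times is a standard keyword of xr.open_dataset.
##Optional sketch: open the file without decoding the time axis, so 'time' stays as plain numbers
##and xarray does not raise the SerializationWarning shown above.
Global_discharge_raw = xr.open_dataset(dirname + 'Zenodo/presentYearMax.nc', decode_times=False)
Global_discharge_raw['time']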
Selecting the streamflow timeseries for the mouth of the Amazon
The files contain 2000 years of annual maxima of monthly streamflow. We use the grid cell at the mouth of the river and select which year resulted in the largest flood.
##Spatial domain of the Amazon basin
lats = [5, -17]
lons = [-80, -45]
## We cut out the Amazon region
Amazon_discharge = Global_discharge['discharge'].sel(lon=slice(lons[0], lons[1]),
lat=slice(lats[0], lats[1]))
Amazon_discharge
Amazon_timeseries = Global_discharge['discharge'].sel(lon=-51.75, lat=-1.25)
discharge (time: 2000, lat: 44, lon: 70)
[6160000 values with dtype=float32]
Coordinates:
  * time  (time) object 0001-01-01 00:00:00 ... 2000-01-01 00:00:00 (long_name: Days since 1901-01-01)
  * lat   (lat) float32 4.75 4.25 3.75 ... -16.25 -16.75 (units: degrees_north)
  * lon   (lon) float32 -79.75 -79.25 ... -45.75 -45.25 (units: degrees_east)
Attributes: standard_name: discharge, units: m3s-1
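As a quick sanity check (a sketch, not part of the original notebook), we can map the time mean of the annual maxima over the Amazon domain and mark the selected mouth grid cell (lon=-51.75, lat=-1.25):
##Sketch: map the 2000-year mean of the annual maxima over the Amazon domain and
##mark the grid cell used for the timeseries at the mouth of the river.
fig = plt.figure(figsize=(8, 5))
ax = plt.axes(projection=ccrs.PlateCarree())
Amazon_discharge.mean('time').plot(ax=ax, transform=ccrs.PlateCarree(), robust=True)
ax.coastlines()
ax.plot(-51.75, -1.25, 'r*', markersize=12, transform=ccrs.PlateCarree())
ax.set_extent([lons[0], lons[1], lats[1], lats[0]], crs=ccrs.PlateCarree())
plt.show()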
Mega-flood indices
We have now selected the timeseries for the mouth of the Amazon. The next step is to extract the two highest events, which deviate strongly from the rest (see the Extreme value analysis).
The highest flood occurs in year 1568 (index 1567 in Python). The second highest in year 1731 (index 1730 in Python).
For megaflood1 this corresponds to: Year 2037, Start 13, Ensemble 13.
And for megaflood2 this corresponds to: Year 2035, Start 14, Ensemble 21.
# List from Niko:
# S01-E01-2035 S01-E01-2036 S01-E01-2037 S01-E01-2038 S01-E01-2039 S01-E02-2035 ... S01-E16-2039 S02-E01-2035 ...
Starts=np.arange(1,17)
Ensembles=np.arange(0,25)
years=np.arange(2035,2040)
Starts
Ensembles
years
array([ 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16])
array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16,
17, 18, 19, 20, 21, 22, 23, 24])
array([2035, 2036, 2037, 2038, 2039])
years_list=np.tile(years,len(Starts)*len(Ensembles))
Starts_list=np.array([])
for start in Starts:
    Starts_list = np.append(Starts_list, np.repeat(start,len(years)*len(Ensembles)))
Ensembles_list_singlestart=np.array([])
for ensemble in Ensembles:
    Ensembles_list_singlestart = np.append(Ensembles_list_singlestart, np.repeat(ensemble,len(years)))
Ensembles_list=np.tile(Ensembles_list_singlestart,len(Starts))
years_list[0:200]
Ensembles_list[0:200]
Starts_list[0:200]
len(years_list)
len(Starts_list)
len(Ensembles_list)
array([2035, 2036, 2037, 2038, 2039, 2035, 2036, 2037, 2038, 2039, 2035,
2036, 2037, 2038, 2039, 2035, 2036, 2037, 2038, 2039, 2035, 2036,
2037, 2038, 2039, 2035, 2036, 2037, 2038, 2039, 2035, 2036, 2037,
2038, 2039, 2035, 2036, 2037, 2038, 2039, 2035, 2036, 2037, 2038,
2039, 2035, 2036, 2037, 2038, 2039, 2035, 2036, 2037, 2038, 2039,
2035, 2036, 2037, 2038, 2039, 2035, 2036, 2037, 2038, 2039, 2035,
2036, 2037, 2038, 2039, 2035, 2036, 2037, 2038, 2039, 2035, 2036,
2037, 2038, 2039, 2035, 2036, 2037, 2038, 2039, 2035, 2036, 2037,
2038, 2039, 2035, 2036, 2037, 2038, 2039, 2035, 2036, 2037, 2038,
2039, 2035, 2036, 2037, 2038, 2039, 2035, 2036, 2037, 2038, 2039,
2035, 2036, 2037, 2038, 2039, 2035, 2036, 2037, 2038, 2039, 2035,
2036, 2037, 2038, 2039, 2035, 2036, 2037, 2038, 2039, 2035, 2036,
2037, 2038, 2039, 2035, 2036, 2037, 2038, 2039, 2035, 2036, 2037,
2038, 2039, 2035, 2036, 2037, 2038, 2039, 2035, 2036, 2037, 2038,
2039, 2035, 2036, 2037, 2038, 2039, 2035, 2036, 2037, 2038, 2039,
2035, 2036, 2037, 2038, 2039, 2035, 2036, 2037, 2038, 2039, 2035,
2036, 2037, 2038, 2039, 2035, 2036, 2037, 2038, 2039, 2035, 2036,
2037, 2038, 2039, 2035, 2036, 2037, 2038, 2039, 2035, 2036, 2037,
2038, 2039])
array([ 0., 0., 0., 0., 0., 1., 1., 1., 1., 1., 2., 2., 2.,
2., 2., 3., 3., 3., 3., 3., 4., 4., 4., 4., 4., 5.,
5., 5., 5., 5., 6., 6., 6., 6., 6., 7., 7., 7., 7.,
7., 8., 8., 8., 8., 8., 9., 9., 9., 9., 9., 10., 10.,
10., 10., 10., 11., 11., 11., 11., 11., 12., 12., 12., 12., 12.,
13., 13., 13., 13., 13., 14., 14., 14., 14., 14., 15., 15., 15.,
15., 15., 16., 16., 16., 16., 16., 17., 17., 17., 17., 17., 18.,
18., 18., 18., 18., 19., 19., 19., 19., 19., 20., 20., 20., 20.,
20., 21., 21., 21., 21., 21., 22., 22., 22., 22., 22., 23., 23.,
23., 23., 23., 24., 24., 24., 24., 24., 0., 0., 0., 0., 0.,
1., 1., 1., 1., 1., 2., 2., 2., 2., 2., 3., 3., 3.,
3., 3., 4., 4., 4., 4., 4., 5., 5., 5., 5., 5., 6.,
6., 6., 6., 6., 7., 7., 7., 7., 7., 8., 8., 8., 8.,
8., 9., 9., 9., 9., 9., 10., 10., 10., 10., 10., 11., 11.,
11., 11., 11., 12., 12., 12., 12., 12., 13., 13., 13., 13., 13.,
14., 14., 14., 14., 14.])
array([1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
1., 1., 1., 1., 1., 1., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2.,
2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2.,
2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2.,
2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2.,
2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2., 2.])
2000
2000
2000
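All three lists have length 2000, one entry per simulated year. The same lists can also be built more compactly; the sketch below (not part of the original notebook) uses np.meshgrid with indexing='ij', which reproduces the ordering start -> ensemble -> year, with the year varying fastest:
##Sketch of an equivalent, vectorised construction of the three lists.
##With indexing='ij' the flattened arrays vary year fastest, then ensemble, then start,
##matching the ordering of the list from Niko above.
S_grid, E_grid, Y_grid = np.meshgrid(Starts, Ensembles, years, indexing='ij')
np.array_equal(Y_grid.ravel(), years_list)
np.array_equal(S_grid.ravel(), Starts_list)
np.array_equal(E_grid.ravel(), Ensembles_list)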
##Select the highest flood
megaflood1 = Amazon_timeseries.isel(
time=np.argmax(Amazon_timeseries))
megaflood1
## time = 1568, so index is 1567:
Amazon_timeseries[1567]
discharge ()
array(1157535.8, dtype=float32)
Coordinates: time 1568-01-01 00:00:00, lat -1.25, lon -51.75
Attributes: standard_name: discharge, units: m3s-1
(The second output, for Amazon_timeseries[1567], is identical, confirming that the maximum is at index 1567.)
##Mask the maximum to NaN so that argmax, which skips NaNs here, returns the second-highest event
arg2=np.argmax(Amazon_timeseries.where(Amazon_timeseries<megaflood1))
megaflood2 = Amazon_timeseries.isel(time=arg2)
megaflood2
discharge ()
array(940480.25, dtype=float32)
Coordinates: time 1731-01-01 00:00:00, lat -1.25, lon -51.75
Attributes: standard_name: discharge, units: m3s-1
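For context, a quick plot (a sketch, not in the original notebook) of the full 2000-year series with the two mega-floods marked:
##Sketch: plot the 2000 annual maxima and highlight the two mega-floods.
fig, ax = plt.subplots(figsize=(10, 4))
ax.plot(np.arange(1, len(Amazon_timeseries) + 1), Amazon_timeseries.values, lw=0.5, color='grey')
ax.scatter([1568, 1731], [float(megaflood1), float(megaflood2)], color='red', zorder=3)
ax.set_xlabel('Simulated year (index + 1)')
ax.set_ylabel('Annual maximum discharge (m3 s-1)')
plt.show()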
year_megaflood1=years_list[1567]
Start_megaflood1=Starts_list[1567]
Ensemble_megaflood1=Ensembles_list[1567]
year_megaflood1
Start_megaflood1
Ensemble_megaflood1
2037
13.0
13.0
year_megaflood2=years_list[1730]
Start_megaflood2=Starts_list[1730]
Ensemble_megaflood2=Ensembles_list[1730]
year_megaflood2
Start_megaflood2
Ensemble_megaflood2
2035
14.0
21.0
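As a cross-check (not part of the original notebook), the same mapping can be derived arithmetically from the flood index, since the year varies fastest, then the ensemble, then the start:
##Hypothetical helper: convert a 0-based flood index into (year, start, ensemble),
##assuming the ordering year -> ensemble -> start used to build the lists above.
def index_to_run(i, n_years=5, n_ensembles=25):
    year = 2035 + (i % n_years)                 ## years run 2035..2039
    ensemble = (i // n_years) % n_ensembles     ## ensembles are numbered 0..24
    start = i // (n_years * n_ensembles) + 1    ## starts are numbered 1..16
    return year, start, ensemble

index_to_run(1567)   ## (2037, 13, 13) -> megaflood1
index_to_run(1730)   ## (2035, 14, 21) -> megaflood2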