Authors: Andrew Huang, Steve Nesbitt

ATMS 305 HW 7 (redo opportunity)

Due Wednesday, March 29 at 9:30 AM.

Copy and paste the answers you had before. If you want this assignment regraded, send [email protected] an email when you're ready.

Problem 1

I've included the Champaign weather data in this assignment (daily_wx_data_champaign.csv), as well as the climate indices data files from last week (*.data). I've also included the code that loads the daily weather data into a pandas data frame called daily_data, while the climate indices are loaded into a pandas data frame called climate_inds.

# Data prep again - daily weather data is in daily_data and climate index data is in climate_inds

import numpy as np
import pandas as pd

daily_data = pd.read_csv('daily_wx_data_champaign.csv',sep=',',skiprows=7)

cols = ['TMIN','TMAX','PRCP','SNOW','SNWD']

for col in cols:
    daily_data.loc[daily_data[col] == 'M', col] = np.nan   # 'M' marks missing data
    daily_data.loc[daily_data[col] == 'T', col] = 0.005    # 'T' marks trace precipitation
    daily_data[col] = daily_data[col].astype('float')
    
daily_data['Date']=pd.to_datetime(daily_data['Date'])

data=pd.read_table('censo.data',delim_whitespace=True,header=None,skiprows=1,skipfooter=2,engine='python')
data2=pd.read_table('pdo.data',delim_whitespace=True,header=None,skiprows=1,skipfooter=4,engine='python')
data3=pd.read_table('nao.data',delim_whitespace=True,header=None,skiprows=1,skipfooter=3,engine='python')
newdf=pd.DataFrame()
newdf2=pd.DataFrame()
newdf3=pd.DataFrame()
newdf['Date']=pd.date_range(start='1948-01-01',end='2016-12-01',freq="MS")
newdf2['Date']=pd.date_range(start='1948-01-01',end='2017-12-01',freq="MS")
newdf3['Date']=pd.date_range(start='1948-01-01',end='2017-12-01',freq="MS")
newdf['ENSO']=data.loc[:,1:].stack().values
newdf2['PDO']=data2.loc[:,1:].stack().values
newdf3['NAO']=data3.loc[:,1:].stack().values
newdf_all=pd.merge(newdf,newdf2,how='inner',on='Date')
climate_inds=pd.merge(newdf_all,newdf3,how='inner',on='Date')
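The `.loc[:,1:].stack().values` step above is what converts the year-by-month layout of the `*.data` files into one long monthly series. A minimal sketch of the same reshaping with made-up toy values (3 months per row instead of 12, for brevity):

```python
import pandas as pd

# Toy wide-format table mimicking the *.data files: column 0 is the
# year, the remaining columns hold monthly index values.
wide = pd.DataFrame([[1948, 0.26, 0.43, 0.72],
                     [1949, 0.18, 0.13, -0.55]])

# .loc[:, 1:] drops the year column; .stack() walks the remaining
# values row by row, yielding one long chronological series.
long_vals = wide.loc[:, 1:].stack().values
long_vals.tolist()  # [0.26, 0.43, 0.72, 0.18, 0.13, -0.55]
```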

(a) Plot a time series of the climate data, with the x-axis being "Time (years)", and add a legend for each index.

In [59]:
import numpy as np
import pandas as pd

daily_data = pd.read_csv('daily_wx_data_champaign.csv',sep=',',skiprows=7)

cols = ['TMIN','TMAX','PRCP','SNOW','SNWD']

for col in cols:
    daily_data.loc[daily_data[col] == 'M', col] = np.nan   # 'M' marks missing data
    daily_data.loc[daily_data[col] == 'T', col] = 0.005    # 'T' marks trace precipitation
    daily_data[col] = daily_data[col].astype('float')

daily_data['Date']=pd.to_datetime(daily_data['Date'])

data=pd.read_table('censo.data',delim_whitespace=True,header=None,skiprows=1,skipfooter=2,engine='python')
data2=pd.read_table('pdo.data',delim_whitespace=True,header=None,skiprows=1,skipfooter=4,engine='python')
data3=pd.read_table('nao.data',delim_whitespace=True,header=None,skiprows=1,skipfooter=3,engine='python')
newdf=pd.DataFrame()
newdf2=pd.DataFrame()
newdf3=pd.DataFrame()
newdf['Date']=pd.date_range(start='1948-01-01',end='2016-12-01',freq="MS")
newdf2['Date']=pd.date_range(start='1948-01-01',end='2017-12-01',freq="MS")
newdf3['Date']=pd.date_range(start='1948-01-01',end='2017-12-01',freq="MS")
newdf['ENSO']=data.loc[:,1:].stack().values
newdf2['PDO']=data2.loc[:,1:].stack().values
newdf3['NAO']=data3.loc[:,1:].stack().values
newdf_all=pd.merge(newdf,newdf2,how='inner',on='Date')
climate_inds=pd.merge(newdf_all,newdf3,how='inner',on='Date')
climate_inds
Date ENSO PDO NAO
0 1948-01-01 0.26 -0.11 -99.90
1 1948-02-01 0.43 -0.74 -99.90
2 1948-03-01 0.72 -0.03 -99.90
3 1948-04-01 0.21 -1.33 -99.90
4 1948-05-01 0.24 -0.23 -99.90
5 1948-06-01 0.41 0.08 -99.90
6 1948-07-01 0.08 -0.92 -99.90
7 1948-08-01 0.25 -1.56 -99.90
8 1948-09-01 0.45 -1.74 -99.90
9 1948-10-01 -0.64 -1.32 -99.90
10 1948-11-01 -0.21 -0.89 -99.90
11 1948-12-01 0.58 -1.70 -99.90
12 1949-01-01 0.18 -2.01 -99.90
13 1949-02-01 0.13 -3.60 -99.90
14 1949-03-01 -0.55 -1.00 -99.90
15 1949-04-01 0.15 -0.53 -99.90
16 1949-05-01 0.43 -1.07 -99.90
17 1949-06-01 0.00 -0.70 -99.90
18 1949-07-01 -0.06 -0.56 -99.90
19 1949-08-01 0.08 -1.30 -99.90
20 1949-09-01 -0.42 -0.93 -99.90
21 1949-10-01 -0.59 -1.41 -99.90
22 1949-11-01 -0.29 -0.83 -99.90
23 1949-12-01 -0.91 -0.80 -99.90
24 1950-01-01 -0.74 -2.13 0.56
25 1950-02-01 -1.70 -2.91 0.01
26 1950-03-01 -1.43 -1.13 -0.78
27 1950-04-01 -1.29 -1.20 0.65
28 1950-05-01 -1.22 -2.23 -0.50
29 1950-06-01 -1.69 -1.77 0.25
... ... ... ... ...
798 2014-07-01 0.42 0.70 0.21
799 2014-08-01 0.70 0.67 -2.28
800 2014-09-01 0.97 1.08 1.72
801 2014-10-01 0.76 1.49 -0.87
802 2014-11-01 1.24 1.72 0.58
803 2014-12-01 0.87 2.51 1.63
804 2015-01-01 0.94 2.45 1.57
805 2015-02-01 0.14 2.30 1.05
806 2015-03-01 0.91 2.00 1.12
807 2015-04-01 0.83 1.44 0.64
808 2015-05-01 1.69 1.20 0.19
809 2015-06-01 1.53 1.54 0.24
810 2015-07-01 2.34 1.84 -3.14
811 2015-08-01 2.60 1.56 -1.10
812 2015-09-01 2.92 1.94 -0.49
813 2015-10-01 2.77 1.47 0.99
814 2015-11-01 1.99 0.86 1.70
815 2015-12-01 1.91 1.01 1.99
816 2016-01-01 2.96 1.53 -0.37
817 2016-02-01 3.05 1.75 1.35
818 2016-03-01 1.23 2.40 0.37
819 2016-04-01 1.78 2.62 0.26
820 2016-05-01 0.07 2.35 -0.67
821 2016-06-01 -0.40 2.03 -0.13
822 2016-07-01 -0.56 1.25 -1.72
823 2016-08-01 -0.93 0.52 -2.24
824 2016-09-01 -1.25 0.45 0.74
825 2016-10-01 -0.25 0.56 0.96
826 2016-11-01 -0.14 1.88 -0.31
827 2016-12-01 -0.39 1.17 0.35

828 rows × 4 columns

In [60]:
climate_inds['ENSO'].values
array([ 0.26,  0.43,  0.72, ..., -0.25, -0.14, -0.39])
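One way to sketch the plot for (a), shown here with a synthetic stand-in for climate_inds (in the notebook you would use the merged frame built above). Note that the NAO column uses -99.90 as a missing-data sentinel before 1950, which should be masked so it does not stretch the y-axis:

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend for this sketch
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

# Hypothetical stand-in for climate_inds (values are random)
rng = np.random.RandomState(0)
climate_inds = pd.DataFrame({'Date': pd.date_range('1948-01-01', periods=60, freq='MS'),
                             'ENSO': rng.randn(60),
                             'PDO': rng.randn(60),
                             'NAO': rng.randn(60)})
climate_inds.loc[:23, 'NAO'] = -99.90  # pre-1950 sentinel, as in the real file

# Mask the sentinel before plotting
plot_data = climate_inds.replace(-99.90, np.nan)

fig, ax = plt.subplots()
for ind in ['ENSO', 'PDO', 'NAO']:
    ax.plot(plot_data['Date'], plot_data[ind], label=ind)
ax.set_xlabel('Time (years)')
ax.set_ylabel('Index value')
ax.legend()
```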

(b) Plot a time series of the average high and low, and record high and low for the period 1880-2016.

In [ ]:
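A possible starting point for (b), sketched with synthetic daily data (the real daily_data frame comes from the CSV loaded in Problem 1, and would cover 1880-2016). Yearly statistics can be taken with a groupby on the year:

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend for this sketch
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

# Hypothetical stand-in for daily_data: a seasonal cycle plus noise
dates = pd.date_range('1948-01-01', '1950-12-31', freq='D')
rng = np.random.RandomState(1)
daily_data = pd.DataFrame({'Date': dates})
daily_data['TMAX'] = 60 + 25*np.sin(2*np.pi*dates.dayofyear/365.) + rng.randn(len(dates))
daily_data['TMIN'] = daily_data['TMAX'] - 15

# Yearly average and record highs/lows
grp = daily_data.groupby(daily_data['Date'].dt.year)
stats = pd.DataFrame({'avg high': grp['TMAX'].mean(),
                      'record high': grp['TMAX'].max(),
                      'avg low': grp['TMIN'].mean(),
                      'record low': grp['TMIN'].min()})

fig, ax = plt.subplots()
stats.plot(ax=ax)
ax.set_xlabel('Time (years)')
ax.set_ylabel('Temperature (F)')
ax.legend()
```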

(c) Plot a 3-panel scatter plot using the plot command of monthly average high temperature vs. ENSO, PDO, and NAO for the period 1950-2016. Be sure to label each plot accordingly (titles, axis labels, units if applicable).

In [ ]:
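For (c), the daily TMAX would first be averaged to monthly values (e.g. with `resample('MS').mean()` on a Date index) and then merged with climate_inds on Date. A sketch of the 3-panel layout with hypothetical monthly data:

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend for this sketch
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

# Hypothetical monthly-mean highs and index values
rng = np.random.RandomState(2)
dates = pd.date_range('1950-01-01', periods=120, freq='MS')
monthly_high = pd.DataFrame({'Date': dates,
                             'TMAX': 60 + 25*np.sin(2*np.pi*dates.month/12.)})
climate_inds = pd.DataFrame({'Date': dates,
                             'ENSO': rng.randn(120),
                             'PDO': rng.randn(120),
                             'NAO': rng.randn(120)})

# Inner merge keeps only months present in both frames
merged = pd.merge(monthly_high, climate_inds, how='inner', on='Date')

fig, axes = plt.subplots(1, 3, figsize=(12, 4))
for ax, ind in zip(axes, ['ENSO', 'PDO', 'NAO']):
    ax.plot(merged[ind], merged['TMAX'], '.')
    ax.set_xlabel(ind + ' index')
    ax.set_ylabel('Monthly mean high (F)')
    ax.set_title('TMAX vs. ' + ind)
plt.tight_layout()
```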

Problem 2

For this problem, you will use the global surface temperature anomaly (departure from average) data set from NASA Goddard Institute for Space Studies (GISS). The dataset is available on OpenDAP, meaning that you can access the data remotely through xarray.

To load the data into an xarray dataset called data:

import xarray as xr

data=xr.open_dataset('https://www.esrl.noaa.gov/psd/thredds/dodsC/Datasets/gistemp/combined/250km/air.2x2.250.mon.anom.comb.nc')

(a) Create a pandas data frame containing monthly time series from 1901 to 2017 of mean global temperature, and at the grid point closest to Champaign-Urbana (latitude 40N, longitude 88W).

In [61]:
%pylab inline
import xarray as xr
data=xr.open_dataset('https://www.esrl.noaa.gov/psd/thredds/dodsC/Datasets/gistemp/combined/250km/air.2x2.250.mon.anom.comb.nc')
global_time_series=data['air'].mean(dim=['lon','lat'])
Populating the interactive namespace from numpy and matplotlib
In [62]:
global_time_series_subset=global_time_series.sel(time=slice('1901-01-01','2017-01-01'))
In [63]:
champaign_time_series_subset=data['air'].sel(lat=40.,lon=-88.+360.,method='nearest')
champaign_time_series_subset=champaign_time_series_subset.sel(time=slice('1901-01-01','2017-01-01'))
champaign_time_series_subset
<xarray.DataArray 'air' (time: 1393)>
array([ 2.14887476, -4.54907036, -0.11208001, ...,  3.6890409 , -0.67572999,
        4.57961941])
Coordinates:
    lat      float32 41.0
    lon      float32 273.0
  * time     (time) datetime64[ns] 1901-01-01 1901-02-01 1901-03-01 ...
Attributes:
    long_name: Monthly Average Temperature Anomalies
    valid_range: [-25. 25.]
    units: degC
    precision: 2
    var_desc: Air Temperature: 250km smoothing combined
    dataset: GISS Surface Temperature Analysis (GISTEMP)
    level_desc: Surface
    statistic: Anomaly
    parent_stat: Individual obs
    cell_methods: time: anomaly (monthly from values)
    standard_name: air_temperature_anomaly
    actual_range: [ -20.74464989  9999. ]
    _ChunkSize: [ 1 90 180]
In [64]:
pdframe=champaign_time_series_subset.to_dataframe()
pdframe
air lat lon
time
1901-01-01 2.148875 41.0 273.0
1901-02-01 -4.549070 41.0 273.0
1901-03-01 -0.112080 41.0 273.0
1901-04-01 -1.268340 41.0 273.0
1901-05-01 -0.822690 41.0 273.0
1901-06-01 1.292585 41.0 273.0
1901-07-01 3.419330 41.0 273.0
1901-08-01 0.839621 41.0 273.0
1901-09-01 0.039970 41.0 273.0
1901-10-01 0.569880 41.0 273.0
1901-11-01 -1.913354 41.0 273.0
1901-12-01 -2.497545 41.0 273.0
1902-01-01 1.241045 41.0 273.0
1902-02-01 -4.336280 41.0 273.0
1902-03-01 2.226600 41.0 273.0
1902-04-01 -0.879625 41.0 273.0
1902-05-01 1.774370 41.0 273.0
1902-06-01 -1.651875 41.0 273.0
1902-07-01 0.334765 41.0 273.0
1902-08-01 -1.400959 41.0 273.0
1902-09-01 -1.731610 41.0 273.0
1902-10-01 0.754615 41.0 273.0
1902-11-01 3.917701 41.0 273.0
1902-12-01 -1.297650 41.0 273.0
1903-01-01 0.791595 41.0 273.0
1903-02-01 -0.532330 41.0 273.0
1903-03-01 3.727795 41.0 273.0
1903-04-01 0.004295 41.0 273.0
1903-05-01 1.624905 41.0 273.0
1903-06-01 -3.041575 41.0 273.0
... ... ... ...
2014-08-01 0.246421 41.0 273.0
2014-09-01 -0.741530 41.0 273.0
2014-10-01 -0.430990 41.0 273.0
2014-11-01 -3.243764 41.0 273.0
2014-12-01 2.330320 41.0 273.0
2015-01-01 -0.100380 41.0 273.0
2015-02-01 -5.614221 41.0 273.0
2015-03-01 -0.913440 41.0 273.0
2015-04-01 0.875785 41.0 273.0
2015-05-01 1.951535 41.0 273.0
2015-06-01 -0.080650 41.0 273.0
2015-07-01 -0.844920 41.0 273.0
2015-08-01 -0.563179 41.0 273.0
2015-09-01 2.317325 41.0 273.0
2015-10-01 0.810130 41.0 273.0
2015-11-01 2.896001 41.0 273.0
2015-12-01 6.489240 41.0 273.0
2016-01-01 1.573245 41.0 273.0
2016-02-01 2.807405 41.0 273.0
2016-03-01 4.639920 41.0 273.0
2016-04-01 0.166800 41.0 273.0
2016-05-01 -0.003800 41.0 273.0
2016-06-01 1.297495 41.0 273.0
2016-07-01 0.372935 41.0 273.0
2016-08-01 1.910696 41.0 273.0
2016-09-01 2.613180 41.0 273.0
2016-10-01 2.727770 41.0 273.0
2016-11-01 3.689041 41.0 273.0
2016-12-01 -0.675730 41.0 273.0
2017-01-01 4.579619 41.0 273.0

1393 rows × 3 columns


(b) Plot time series created in (a) in a two panel plot, stacked vertically. Use symbols and lines for each time series. Do this globally and for the grid point closest to Champaign-Urbana. Include appropriate titles (with units) and legends.

In [39]:
ax1 = plt.subplot(2,1,1)
ax1.plot(global_time_series_subset['time'],global_time_series_subset)
ax1.set_xlabel('Time (years)')
ax1.set_ylabel('T anomaly (C)')
ax1.set_title('Global')
ax2 = plt.subplot(2,1,2)
ax2.plot(champaign_time_series_subset['time'],champaign_time_series_subset)
ax2.set_xlabel('Time (years)')
ax2.set_ylabel('T anomaly (C)')
ax2.set_title('Shampoo banana')
plt.tight_layout()

(c) Create a scatterplot (using the scatter command) of ENSO index on the y-axis, PDO index on the x-axis, colored by Champaign-Urbana GISS monthly temperature anomaly. Be sure to include a title, axis labels, units, and a colorbar. Hint: You're going to have to create a new pandas data frame that covers the same time periods so each of the datasets match up in time.

In [12]:
# Here we are going to use random data just for illustration
import numpy as np
import matplotlib.pyplot as plt

x=np.random.randint(0,100,size=25)  # PDO index would go here
y=np.random.randint(0,100,size=25)  # ENSO index would go here
z=np.random.rand(25)                # C-U temperature anomaly

# play around with different cmaps and other settings if you want
plt.scatter(x, y, c=z, s=100, cmap=plt.cm.cool, edgecolors='None', alpha=0.75)
plt.xlabel('Some random data [unitless]')
plt.ylabel('Some other random data [unitless]')
cbar=plt.colorbar()
cbar.set_label('Time to Easter (days)')
plt.title('Random data in really bright colors')
<matplotlib.text.Text at 0x7f52668053c8>
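Following the hint, rather than assigning raw `.values` into a column (which assumes the two series are already the same length and aligned in time), an inner merge on Date guarantees the rows line up. A sketch with hypothetical stand-ins for the two frames:

```python
import numpy as np
import pandas as pd

# Hypothetical stand-ins: climate_inds covers 1948 onward, while the
# Champaign GISS anomaly series starts in 1901
rng = np.random.RandomState(3)
climate_inds = pd.DataFrame({'Date': pd.date_range('1948-01-01', periods=36, freq='MS'),
                             'ENSO': rng.randn(36),
                             'PDO': rng.randn(36)})
giss = pd.DataFrame({'Date': pd.date_range('1901-01-01', periods=1400, freq='MS'),
                     'anom': rng.randn(1400)})

# Inner merge keeps only the overlapping months, so every row has
# matching index values and a temperature anomaly
matched = pd.merge(climate_inds, giss, how='inner', on='Date')
```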
In [42]:
ax1=plt.subplot(1,1,1)
ax1.plot(climate_inds['PDO'],climate_inds['ENSO'],'.')
ax1.set_xlabel('PDO')
ax1.set_ylabel('ENSO')
<matplotlib.text.Text at 0x7f5f3d35d898>
In [43]:
champaign_time_series_subset=data['air'].sel(lat=40.,lon=-88.+360.,method='nearest')
champaign_time_series_subset=champaign_time_series_subset.sel(time=slice('1948-01-01','2016-12-01'))
In [46]:
climate_inds['Shampoo']=champaign_time_series_subset.values
In [47]:
climate_inds
Date ENSO PDO NAO Shampoo
0 1948-01-01 0.26 -0.11 -99.90 -1.963040
1 1948-02-01 0.43 -0.74 -99.90 0.122245
2 1948-03-01 0.72 -0.03 -99.90 0.540310
3 1948-04-01 0.21 -1.33 -99.90 2.090145
4 1948-05-01 0.24 -0.23 -99.90 -1.429335
5 1948-06-01 0.41 0.08 -99.90 -0.264860
6 1948-07-01 0.08 -0.92 -99.90 0.365600
7 1948-08-01 0.25 -1.56 -99.90 0.457111
8 1948-09-01 0.45 -1.74 -99.90 1.215015
9 1948-10-01 -0.64 -1.32 -99.90 -1.594350
10 1948-11-01 -0.21 -0.89 -99.90 2.071966
11 1948-12-01 0.58 -1.70 -99.90 1.498210
12 1949-01-01 0.18 -2.01 -99.90 3.826670
13 1949-02-01 0.13 -3.60 -99.90 1.986925
14 1949-03-01 -0.55 -1.00 -99.90 0.907950
15 1949-04-01 0.15 -0.53 -99.90 -0.785005
16 1949-05-01 0.43 -1.07 -99.90 0.880065
17 1949-06-01 0.00 -0.70 -99.90 1.709075
18 1949-07-01 -0.06 -0.56 -99.90 1.938615
19 1949-08-01 0.08 -1.30 -99.90 0.534321
20 1949-09-01 -0.42 -0.93 -99.90 -3.056800
21 1949-10-01 -0.59 -1.41 -99.90 2.088940
22 1949-11-01 -0.29 -0.83 -99.90 0.413606
23 1949-12-01 -0.91 -0.80 -99.90 2.423145
24 1950-01-01 -0.74 -2.13 0.56 5.202685
25 1950-02-01 -1.70 -2.91 0.01 0.302845
26 1950-03-01 -1.43 -1.13 -0.78 -1.808610
27 1950-04-01 -1.29 -1.20 0.65 -3.613455
28 1950-05-01 -1.22 -2.23 -0.50 0.629835
29 1950-06-01 -1.69 -1.77 0.25 -0.771305
... ... ... ... ... ...
798 2014-07-01 0.42 0.70 0.21 -2.557265
799 2014-08-01 0.70 0.67 -2.28 0.246421
800 2014-09-01 0.97 1.08 1.72 -0.741530
801 2014-10-01 0.76 1.49 -0.87 -0.430990
802 2014-11-01 1.24 1.72 0.58 -3.243764
803 2014-12-01 0.87 2.51 1.63 2.330320
804 2015-01-01 0.94 2.45 1.57 -0.100380
805 2015-02-01 0.14 2.30 1.05 -5.614221
806 2015-03-01 0.91 2.00 1.12 -0.913440
807 2015-04-01 0.83 1.44 0.64 0.875785
808 2015-05-01 1.69 1.20 0.19 1.951535
809 2015-06-01 1.53 1.54 0.24 -0.080650
810 2015-07-01 2.34 1.84 -3.14 -0.844920
811 2015-08-01 2.60 1.56 -1.10 -0.563179
812 2015-09-01 2.92 1.94 -0.49 2.317325
813 2015-10-01 2.77 1.47 0.99 0.810130
814 2015-11-01 1.99 0.86 1.70 2.896001
815 2015-12-01 1.91 1.01 1.99 6.489240
816 2016-01-01 2.96 1.53 -0.37 1.573245
817 2016-02-01 3.05 1.75 1.35 2.807405
818 2016-03-01 1.23 2.40 0.37 4.639920
819 2016-04-01 1.78 2.62 0.26 0.166800
820 2016-05-01 0.07 2.35 -0.67 -0.003800
821 2016-06-01 -0.40 2.03 -0.13 1.297495
822 2016-07-01 -0.56 1.25 -1.72 0.372935
823 2016-08-01 -0.93 0.52 -2.24 1.910696
824 2016-09-01 -1.25 0.45 0.74 2.613180
825 2016-10-01 -0.25 0.56 0.96 2.727770
826 2016-11-01 -0.14 1.88 -0.31 3.689041
827 2016-12-01 -0.39 1.17 0.35 -0.675730

828 rows × 5 columns

In [55]:
ax1=plt.subplot(1,1,1)
plt.scatter(climate_inds['PDO'],climate_inds['ENSO'],c=climate_inds['Shampoo'],cmap='jet')
plt.colorbar()
plt.xlabel('PDO')
plt.ylabel('ENSO')
<matplotlib.text.Text at 0x7f5f3cc5b4e0>
In [57]:
import xarray as xr
In [58]:
xr.open_dataset('https://www.esrl.noaa.gov/psd/thredds/dodsC/Datasets/gistemp/combined/250km/air.2x2.250.mon.anom.comb.nc')
<xarray.Dataset>
Dimensions:  (lat: 90, lon: 180, time: 1646)
Coordinates:
  * lat      (lat) float32 89.0 87.0 85.0 83.0 81.0 79.0 77.0 75.0 73.0 71.0 ...
  * lon      (lon) float32 1.0 3.0 5.0 7.0 9.0 11.0 13.0 15.0 17.0 19.0 21.0 ...
  * time     (time) datetime64[ns] 1880-01-01 1880-02-01 1880-03-01 ...
Data variables:
    air      (time, lat, lon) float64 ...
Attributes:
    title: GISS Surface Temperature Analysis (GISTEMP): 250km smoothing combined land/ocean
    comments: This is the 250km smoothed combined land,ocean version of the dataset on a 2x2 grid. SST dataset used at NASA is now ERSST
    platform: Analysis
    Source: http://data.giss.nasa.gov/gistemp/ source and http://data.giss.nasa.gov/pub/gistemp/ data
    Documentation: http://www.esrl.noaa.gov/psd/data/gridded/data.gistemp.html
    references: http://www.esrl.noaa.gov/psd/data/gridded/data.gistemp.html
    Conventions: COARDS
    history: created at PSD Jun 2009 by CAS. Converted to chunked, deflated non-packed NetCDF4 2014/06
    dataset_title: GISS Surface Temperature Analysis
    DODS_EXTRA.Unlimited_Dimension: time
    EXTRA_DIMENSION.nbnds: 2
In [ ]: