Hi Max, in the "SF for TS" paper you called your novel model "SeFT-ATTN". I expected to find that exact name in this repo so I could see how you implemented it, but I can't find it anywhere. Does the "deep_set_attention.py" file contain the SeFT-ATTN implementation?