My name is Blake Holman, and I'll be talking about sustained space and cumulative complexity trade-offs for data-dependent memory-hard functions. This is joint work with Jeremiah Blocki.

Memory-hard functions are hash functions whose computational costs are dominated by memory costs. Our intuitive goal is to force an attacker to lock up a large amount of memory for a long amount of time. Memory-hard functions protect low-entropy secrets like passwords from brute-force attacks, and they're also used for egalitarian proofs of work.

A natural question is: how can we formalize our intuitive notion of memory hardness? One way is through cumulative complexity, which is the sum of the memory used at each time step. As an example, scrypt is a widely used memory-hard function. It has been proven to have maximal cumulative complexity of n squared, where n is the running-time parameter. But does this maximal cumulative complexity actually align with our intuition of memory hardness? The answer is no, because you can evaluate scrypt using constant memory for n squared time. So scrypt still has maximal cumulative complexity, but it doesn't match our intuitive goal of forcing an attacker to allocate a large amount of space for a long amount of time.

A second approach for defining memory hardness is sustained space complexity. This measures the time spent above a memory threshold s. It matches our intuitive goal for memory-hard functions, because an attacker with high sustained space complexity must allocate a lot of space, s, for a long time, t. This is stricter than cumulative complexity, because such an attacker has cumulative cost at least s times t.

Now a natural question is whether we can construct a memory-hard function for which any evaluation strategy sustains n space for n steps. The answer is no: it turns out that any memory-hard function can be evaluated using n over log n space. However, this attack is impractical because it may have exponential runtime. Now the key question is whether we can ensure that
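To make the two cost measures concrete, here is a minimal sketch (not from the talk; the function names and toy trace are illustrative). It takes an evaluation strategy's memory usage at each time step and computes its cumulative complexity and its sustained space at a threshold s, then checks the s-times-t lower bound mentioned above.

```python
# Illustrative sketch: memory usage of an evaluation strategy is given as a
# list, one entry per time step.

def cumulative_complexity(trace):
    """Cumulative complexity: sum of memory used over all time steps."""
    return sum(trace)

def sustained_space(trace, s):
    """Sustained space at threshold s: number of steps with memory >= s."""
    return sum(1 for m in trace if m >= s)

# Toy trace: ramp up to 4 units of memory, hold for 3 steps, ramp down.
trace = [1, 2, 3, 4, 4, 4, 3, 2, 1]

s = 4
cc = cumulative_complexity(trace)   # 24
t = sustained_space(trace, s)       # 3 steps at or above s = 4

# An attacker sustaining space s for t steps pays cumulative cost >= s * t,
# which is why sustained space is the stricter of the two measures.
assert cc >= s * t
```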
an attacker with low sustained space complexity has very large cumulative complexity.

In this work, our goal is to construct memory-hard functions such that any strategy either sustains n space for n steps or has cumulative complexity much greater than n squared. We've already seen that scrypt can be computed using constant space for n squared steps. So we examine two memory-hard functions with high trade-offs, and then we give a theoretical construction with near-optimal trade-offs.

Hybrid DRSample has the property that any strategy either sustains n over log n space for n steps or has cumulative complexity n cubed over log n.

Argon2id won the Password Hashing Competition in 2015 and is available in many cryptographic libraries. We show that any strategy either sustains almost n space for n steps, or it has cumulative complexity slightly more than n squared.

Finally, we give a construction in which any strategy either sustains n space for n steps or has cumulative complexity almost n cubed.