Programming languages usually try to minimise undefined behaviour (UB). What if you did the opposite? How much UB could a language have whilst still being (theoretically) usable?
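For context, here is a minimal C sketch (my own illustration, not from any particular language spec discussion) of the kind of UB most languages work hard to rule out. Each line compiles cleanly, yet the C standard assigns it no defined meaning, so the compiler may do anything at all:

```c
#include <limits.h>

int main(void) {
    int x = INT_MAX;
    x = x + 1;      /* signed integer overflow: undefined behaviour */

    int a[4];
    int y = a[4];   /* out-of-bounds read (valid indices are 0..3): undefined behaviour */

    int *p = 0;
    int z = *p;     /* dereferencing a null pointer: undefined behaviour */

    return y + z;
}
```

A conventional language tries to make every one of these either a compile-time error or a well-defined runtime event; the question above asks what happens if you deliberately go the other way.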