AI systems can create, propagate, support, and automate bias in decision-making processes. To mitigate biased decisions, we need both to understand the origin of the bias and to define what it means for an algorithm to make fair decisions. Most group fairness notions assess a model's equality of outcome by computing statistical metrics on its outputs. We argue that these output metrics encounter intrinsic obstacles and present comp...