Sony AI released a dataset that tests the fairness and bias of AI models. It’s called the Fair Human-Centric Image Benchmark (FHIBE, pronounced like “Phoebe”). The company describes it as the “first publicly available, globally diverse,…