Downloads of Wisdom
How AI can perpetuate racist/sexist imbalances
The same thinking applies to solution architects, marketing teams and anyone looking at sets of data to make “informed” or “accurate” decisions: don't assume the machine is right. The machine is making predictions from its inputs, and for the most part those inputs are still running on outdated belief systems.
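To make that concrete, here is a minimal, hypothetical sketch (not from the article, and with all names and numbers invented): a model trained on historical decisions that were themselves skewed will reproduce the skew, even when the sensitive attribute is never given to it directly.

```python
# Hypothetical illustration: a model trained on biased historical decisions
# reproduces that bias, even though it looks "accurate" on its own data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Two groups with identical underlying ability (an invented "skill" score).
group = rng.integers(0, 2, n)          # 0 = group A, 1 = group B
skill = rng.normal(0, 1, n)

# Historical hiring decisions: skill mattered, but group B was penalised.
hired = (skill - 1.0 * group + rng.normal(0, 0.5, n)) > 0

# Even without "group" as a feature, a proxy correlated with it
# (postcode, school, etc.) lets the model rediscover the old prejudice.
proxy = group + rng.normal(0, 0.3, n)
X = np.column_stack([skill, proxy])

model = LogisticRegression().fit(X, hired)
pred = model.predict(X)

print("Predicted hire rate, group A:", pred[group == 0].mean())
print("Predicted hire rate, group B:", pred[group == 1].mean())
# Equal skill in, unequal outcomes out: the machine is "right" only about
# the biased past it was shown.
```

The sketch assumes scikit-learn and fabricated data; the only point it makes is the one above, that if the inputs encode the old imbalance, the outputs will too.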
COLORINTECH Leadership Series / Mike Bugembe, founder of lens.ai
Part of the barriers and obstacles faced by minority groups in the business and tech space lies not just in the experience of working in tech, but in the architecture of the data being used to inform much of AI's decision making. More important still is how this helps perpetuate imbalances at a global scale, and mostly without any regulation.