KEY INSIGHTS
- There are two million AI-related job postings in the United States.
- Black Americans have fewer AI-related job opportunities.
- Consumers could have more control over AI-driven personal purchase algorithms in the near future.
The Brookings Institution released a report on the emerging, fast-growing artificial intelligence (AI) economy across the United States, where AI-related projects are under way at federal facilities and universities. The report finds that the number of start-ups offering AI solutions has grown five-fold over the past decade.
While the market for AI-related jobs is growing, it remains a small slice of the overall U.S. labor market, with AI job postings accounting for two million, or one percent, of almost 22 million U.S. job postings.
Black, Hispanic and Indigenous workers combined make up five percent of Silicon Valley, according to a Wired survey. A TrustRadius survey found that 65 percent of minority respondents say diversity in the tech industry has increased over the last decade.
Some metro areas are growing as tech-provider hubs: Atlanta's tech industry is 85 percent diverse, Dallas's is 72 percent and Chicago's is 71 percent.
According to the report, only a few U.S. metro areas have a considerable AI-related job presence, though many Southern states are open to becoming centers for AI-related work; in Georgia, Alabama, Florida and Tennessee, only two percent of jobs are available. Across the nation, levels of AI adoption in the workforce vary widely.
A few metro areas have moderately high capacity to implement AI, with the San Francisco Bay Area the most AI-ready, according to Brookings. On the report's indexed per-capita measure, Austin, Texas has 229.2 AI companies per million workers, compared with 792.2 for San Francisco-Oakland-Berkeley and 135.9 for the New York metro area.
While AI could advance workplace equity and economic growth, companies have to earn consumer trust. A Harvard Law Review analysis examined how AI adoption can harm users: minorities and women face worsening social inequality when algorithmic bias fuels discrimination.
Machine-learning bias has stifled Black Americans in both public- and private-sector jobs. In response, businesses are adopting data-science approaches to vet algorithmic outcomes before releasing them to the world. That process includes examining AI models, specifying the function each model should reflect, and reviewing input data and post-processed estimates. Testing these functions can nip bias in the bud.
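As a rough illustration of what that kind of pre-release testing can look like, the sketch below runs a simple audit on a model's outputs: it computes the rate of favorable decisions per demographic group and applies the widely used four-fifths rule. The data, group labels and threshold are all hypothetical and are not drawn from the Brookings report or any specific company's pipeline.

```python
# Hypothetical pre-release bias audit: compare a model's favorable-outcome
# rates across demographic groups before the model ships.
from collections import defaultdict

def selection_rates(predictions, groups):
    """Share of favorable predictions (1s) for each demographic group."""
    totals, favorable = defaultdict(int), defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        favorable[group] += pred
    return {g: favorable[g] / totals[g] for g in totals}

def four_fifths_check(rates):
    """Flag groups whose selection rate falls below 80% of the highest rate."""
    best = max(rates.values())
    return {g: rate / best >= 0.8 for g, rate in rates.items()}

if __name__ == "__main__":
    # Toy model outputs: 1 = favorable decision, 0 = unfavorable.
    preds  = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 0]
    groups = ["A", "A", "A", "A", "A", "A", "B", "B", "B", "B", "B", "B"]
    rates = selection_rates(preds, groups)
    print(rates)                     # roughly {'A': 0.67, 'B': 0.17}
    print(four_fifths_check(rates))  # group B fails the four-fifths rule
```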
Amazon built an algorithm to automatically review and score job applicants’ CVs. The tool was found to score male applicants higher than female ones, penalizing résumés that contained words like “women’s.” Amazon shut the algorithm down.
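One way to surface that kind of behavior before launch is a counterfactual test: score the same résumé with gendered terms swapped and check whether the score moves. The sketch below is a hypothetical illustration; `score_resume` is a deliberately biased stand-in for a model under review, not Amazon’s actual system.

```python
# Hypothetical counterfactual check for a CV-scoring model: swap gendered
# terms and see whether the score changes.
GENDERED_SWAPS = {"women's": "men's", "women": "men", "she": "he", "her": "his"}

def swap_gendered_terms(text):
    """Return the text with gendered terms replaced by their counterparts."""
    return " ".join(GENDERED_SWAPS.get(word, word) for word in text.lower().split())

def score_resume(text):
    # Toy scorer that penalizes the word "women's" -- exactly the kind of
    # learned behavior a counterfactual test is meant to surface.
    return 1.0 - 0.3 * text.lower().split().count("women's")

if __name__ == "__main__":
    resume = "Captain of the women's chess club and experienced software engineer"
    original = score_resume(resume)
    swapped = score_resume(swap_gendered_terms(resume))
    if abs(original - swapped) > 1e-6:
        print(f"Gendered-term sensitivity detected: {original:.2f} vs {swapped:.2f}")
```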
Airbnb created an AI-based smart-pricing tool to increase income for hosts on the platform. The tool compensates for hosts’ limited information about competing Airbnb properties, hotel rates and other demand signals they need to price their properties correctly. A Harvard study found that daily revenue for hosts who used smart pricing increased by 8.6 percent. Despite this, the racial revenue gap between Black and white hosts widened when adopters and non-adopters are considered together.
The report also found that the tool reduced listing prices equally well regardless of race. The software grew Black hosts’ revenue because the demand curve for Black hosts’ properties was more responsive to price changes than the demand curve for comparable properties owned by white hosts. Still, the tool’s launch widened Airbnb’s racial revenue gap because Airbnb did not address the root reasons Black hosts did not adopt the tool at the same rate as their white peers.
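A back-of-the-envelope sketch can show how those two findings fit together: adopters in the more price-responsive group gain more per host, yet the group-level gap still widens when adoption in that group is much lower. Every number below is assumed for illustration; none come from the Harvard study.

```python
# Hypothetical sketch: a larger per-adopter revenue boost for one group can
# coexist with a widening group-level gap when that group adopts the tool less.

def revenue_after_price_cut(price, bookings, cut, elasticity):
    """Nightly revenue after a price cut, with bookings responding per the elasticity."""
    new_price = price * (1 - cut)
    new_bookings = bookings * (1 - elasticity * cut)  # elasticity is negative
    return new_price * new_bookings

def group_average(baseline, adopter_revenue, adoption_rate):
    """Average revenue across a group when only a fraction of hosts adopts the tool."""
    return adoption_rate * adopter_revenue + (1 - adoption_rate) * baseline

# Illustrative baselines with a pre-existing revenue gap between the groups.
black_base, white_base = 90 * 10, 100 * 10                     # $900 vs $1,000

# More price-responsive demand means Black adopters gain more per host...
black_adopter = revenue_after_price_cut(90, 10, 0.10, -2.0)    # 81 * 12.0 = 972
white_adopter = revenue_after_price_cut(100, 10, 0.10, -1.8)   # 90 * 11.8 = 1062

# ...but adoption is far lower (0.27 vs 0.50 is roughly "46 percent less likely").
black_avg = group_average(black_base, black_adopter, adoption_rate=0.27)
white_avg = group_average(white_base, white_adopter, adoption_rate=0.50)

print(round(white_base - black_base))  # gap before the tool: 100
print(round(white_avg - black_avg))    # gap after: ~112, wider despite larger per-adopter gains
```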
“Rich data, dedicated models and accurate predictions do not automatically lead to good individual and societal impact,” Shunyuan Zhang, Associate Professor at Harvard Business School, told The Plug. “Public and private sectors need to address the concerns that algorithms might produce biased outcomes against disadvantaged groups.”
The Airbnb example illustrates the market conditions algorithms can create. One question it raises is whether such algorithms should be regulated to address racial and economic disparities. Black Airbnb hosts were 46 percent less likely to use the smart-pricing tool; Airbnb could encourage adoption by offering promotions or, more importantly, by figuring out why Black hosts are hesitant to use the tool.
Recent legislation seeks to mandate more transparency in AI algorithms. The Algorithmic Justice and Online Platform Transparency Act of 2021 would prohibit discriminatory uses of personal information in algorithmic processes, and the California Consumer Privacy Act (CCPA) gives consumers more control over their online information.
“The advent of consumers tailoring their own AI-driven personal purchase algorithms is on the horizon,” Zhang said. “The issues and challenges—uninterpretable, lack of trust, discriminatory outcome, etc.— in AI will apply with even greater force.”