ChatGPT caught regulators by surprise when it set off a new AI race. As companies have rushed to develop and release ever more powerful models, lawmakers and regulators around the world have sought to catch up and rein in development.
As governments spin up new AI programs, regulators around the world are urgently trying to hire AI experts. But some of the job ads are raising eyebrows and even chuckles among AI researchers and engineers for offering wages that, amid the current AI boom, look pitiful.
The European AI Office, which will be central to the implementation of the EU’s AI Act, listed vacancies early this month and wants applicants to begin work in the fall. They include openings for technology specialists in AI with a master’s degree in computer science or engineering and at least one year’s experience, at a seniority level that suggests an annual salary from €47,320 ($51,730).
Across La Manche, the UK government’s Department for Science, Innovation & Technology is also seeking AI experts. One opening is for a Head of the International AI Safety Report, who would help shepherd a landmark global report that stems from the UK’s global AI Safety Summit last year. The ad says “expertise in frontier AI safety and/or demonstrable experience of upskilling quickly in a complex new policy area” is essential. The salary offered is £64,660 ($82,730) a year.
Although the EU salary is quoted net of tax, both figures are far lower than the eye-watering sums being offered within the industry. Levels.fyi, which compiles verified tech industry compensation data, reports that the median total compensation for workers at OpenAI is $560,000, a figure that includes stock grants, which are standard in the tech industry. The lowest compensation it has verified at the ChatGPT maker, for a recruiter, is $190,000.
At Anthropic, OpenAI’s Amazon-backed rival and creator of the Claude chatbot, the median compensation of $212,500 still far outstrips what regulators are currently offering. The 25th percentile for jobs in machine learning and AI there is $172,500, according to Levels.fyi. Stock grants included in tech industry compensation packages can turn into huge windfalls if a company’s value increases. OpenAI is currently valued at $80 billion following a February 2024 share tender first reported by The New York Times.
“There’s a brain drain happening across every government across the world,” says Nolan Church, cofounder and CEO at FairComp, a company tracking salary data to help workers negotiate better pay. “Part of the reason why is that private companies not only have a better working environment, but also will offer significantly higher salaries.”
Church worries that competition between private companies will also further widen the gap between the private and public sectors. “I personally believe the government should be attracting the best and the brightest,” he says, “but how can you convince the best and the brightest to take a massive pay cut?”
Outside the Ballpark
It’s not new for government jobs to pay significantly less than those in industry, but in the current AI boom the disconnect is potentially more significant and urgent. Tech companies and corporations in other industries rushing to embrace the technology are competing fiercely for AI-savvy talent. The rapid pace of developments in AI means regulators need to move fast.
Jack Clark, a cofounder of Anthropic, posted on X comparing the EU AI Office’s salary offer unfavorably to tech industry internships. “I appreciate governments are working within their own constraints, but if you want to carry out some ambitious regulation of the AI sector then you need to pay a decent wage,” he wrote. “You don't need to be competitive with industry, but you definitely need to be in the ballpark.”
The European AI Office did not respond to a request for comment by the time of publication. A statement from Ian Hogarth, chair of the UK’s AI Safety Institute, provided by the Department for Science, Innovation & Technology, says that his organization has “rapidly” recruited 31 engineers and researchers from companies including OpenAI and DeepMind. “Demand to take part in our work evaluating frontier AI models is not slowing down,” he said. “While we do benchmark our salaries against those on offer in industry, the technical experts that are joining us from the top of their fields do so seeking more than a high salary. They are joining to contribute to a critical mission to make sure these models are safe.”
Others hoping regulators and public bodies like the AI Safety Institute can provide a counterweight to tech industry power are less upbeat. The AI Safety Institute, for instance, wants to probe how models work to ensure they’re operating safely. “Empowering government assessors to directly test models is a promising approach to surfacing safety issues, but the success of any program is dependent on the resources it is provided with,” says Harry Law, a researcher in the history and philosophy of AI at the University of Cambridge. “Governments clearly cannot always offer competitive compensation for those with experience working with AI in industry, but they can relax existing rules to meet people in the middle.”
Law says the AI Safety Institute appears to be attempting to offer compensation that sits between low government salaries and the more attractive packages found in industry, “but only partially closes the gap.” Targeting recent graduates, as the European office explicitly does, could also help supplement the pool of more experienced candidates willing to trade pay for principles.
Government and public sector AI organizations can offer candidates unique non-monetary benefits. “It's obvious the public sector can't compete with OpenAI but, combined with goals to ‘not be evil,’ it could be less laughable,” says Lilian Edwards, professor of law at Newcastle University. She thinks that “principle premium” makes the EU and UK agencies’ offers more attractive.
Not everyone is convinced that approach will work. “I think what's going to end up happening is you're going to have these archaic governments with poor talent,” says Church. “This talent is at a premium today. And, you know, the brain drain will continue.”