Is the inbox getting a promotion, and the inbox reader being sidelined? In a wide-ranging interview about Microsoft's AI plans, Microsoft AI CEO Mustafa Suleyman attached a dark timeline to a long-circulating fear: that the contemporary workplace is an unusually automatable environment. His claim was blunt: within 12 to 18 months, AI systems would reach "human-level performance on most, if not all, professional tasks," and much of white-collar work would be automated outright. His named targets were the desk-based pillars of the service economy: law, accounting, marketing, and project management.

The 18-month horizon Suleyman sketched lands in a workplace already being restructured around software that can draft, summarize, reconcile, and route information faster than any human staffer. This is not just text generation but agentic systems that can accept a goal, break it into steps, and execute those steps across tools once managed by specialists. That shift, less "chatbot," more junior operator with permission to act, helps explain why such predictions keep resurfacing. It also explains why the impact, if it comes, will be felt first in jobs built on repeatable digital processes: preparing documents, assembling slide decks, triaging tickets, and translating messy requests into structured outputs. Such tasks rarely constitute an entire job, but they often consume most of its hours.
The example Suleyman leans on is software engineering, because it offers measurable outputs: code written, tests run, pull requests merged. He described teams that already use AI to generate most of their code, shifting human effort toward debugging, review, design, and integration. Microsoft CEO Satya Nadella has said that more than a quarter of the company's code is now written by AI, a measure of how quickly the tooling has been woven into daily work.
But the adoption curve differs from the displacement curve. Evidence from professional services has looked more incremental than apocalyptic. A 2025 Thomson Reuters report found that lawyers, accountants, and auditors applied AI to narrow tasks such as document review and routine analysis, not wholesale replacement. A study by the nonprofit Model Evaluation and Threat Research found that AI assistance increased developers' working time by about 20 percent, a realism tax paid in checking outputs, fixing subtle bugs, and turning ambiguous instructions into dependable results. In most companies, the reality has been AI plus oversight, not "AI instead of staff."
Economic signals have also been uneven. Torsten Slok, chief economist at Apollo Global Management, noted that Big Tech profit margins rose by over 20 percent in late 2025 while the rest of the market barely moved, suggesting that the gains have concentrated where AI is not only widely used but tightly coupled to digital products. Meanwhile, the labor market has shown early, noisy signs of change rather than one clean break: Challenger, Gray & Christmas estimated that around 55,000 lost jobs so far could be attributed to AI, though many companies have been cutting headcount without tying those decisions to automation.
What makes Suleyman's prediction distinctive is not only the deadline but the proposed mechanism: AI models tailored to each institution and role, built quickly and cheaply. Creating a new model, he argued, will become as easy as launching a podcast or a blog, letting organizations build systems fitted to their own procedures and constraints. In the same interview he described Microsoft AI's core mission as building its own frontier foundation models, part capability race, part self-sufficiency, so the company can shape how these systems are deployed across the software stack offices already use.
The underlying claim is not that all jobs disappear, but that most office work does once it is made readable to machines: inputs normalized, decisions logged, exceptions handled by escalation. When that happens, the center of gravity in white-collar work shifts to oversight, accountability, and the hard-to-template moments: negotiation, judgment under uncertainty, and responsibility for outcomes when the automation is wrong.

