Calculating Cloud Computing Expenses
Coined in 2006 by then Google CEO Eric Schmidt, “cloud computing” is a general term for any shared pool of computing resources that a business and its employees can access from any device, at any time and from anywhere. In recent years, cloud computing has become an essential resource for company growth. For taxpayers, correctly calculating the tax treatment of cloud computing expenses for research and development activities has become increasingly important.
Originally, companies relied on cloud computing to manage, manipulate and optimize their data while developing their products in-house. With increased modernization, companies have opted to move all their computing resources to the cloud. Now, cloud computing is most commonly referenced in tandem with a cloud service provider that offers a “public cloud” model. One of the most widely used public cloud providers is Amazon Web Services, which alone brought in an estimated $25 billion in revenue in 2018. This means tens of millions of dollars are spent daily on cloud computing costs associated with a variety of daily business activities, including internal resource management, outside business development, economic projection, software-as-a-service computing, and even research and development.
When determining the eligibility of a company’s cloud computing costs as qualified research expenses, U.S. Code § 41 offers little guidance. U.S. Code § 41(b)(2)(A)(iii) defines such costs as “any amount paid or incurred to another person for the right to use computers in the conduct of qualified research.” In 1985, the Treasury Department issued 26 CFR § 1.41-2 to define which computing costs were qualified research expenses. The regulations state the cost of rented or leased computers may be a qualified research expense as long as the computers are owned and operated by someone other than the taxpayer. Additionally, the computer processing must be located off the taxpayer’s premises and the taxpayer must not be the primary user of the computer.
In the 1980s, computers used for research and development were primarily owned by universities or scientific institutions. These computers were massive, room-sized machines made up of towers of processors and fans, yet they only produced a fraction of the computing power of a modern-day cellphone. Most companies at the leading edge of new discoveries were hugely constrained by limited accessibility to computers. Thus, 26 CFR § 1.41-2 was a valuable addition to both the government and the companies claiming the credit.
The original goal of the R&D tax credit was to reduce the financial risk of technological innovation. By giving companies the opportunity to claim computing costs as a qualified expense, the credit made companies more likely to invest in renting or leasing university computers to run their tests, simulations and calculations. By the 1990s, as computers became more accessible and more powerful, companies realized that computers were a necessity to stay competitive. Major companies went about building their own room-sized computers. And with further modernization, the processing power of these massive machines was compressed to the size of laptops, allowing most businesses to run tests, simulations and calculations without ever leaving their desks.
With each advancement in computers and computing technology, the need for renting or leasing computers diminished. As a result, the conversation around qualified research expenses tied to rented or leased computing utilities became rare. However, with the sudden emergence and importance of cloud computing, the question of which computing costs are qualified expenses and which aren’t has returned to the forefront of the R&D tax credit.
While cloud computing has become a staple for millions of businesses throughout the country, the U.S. tax code has yet to adapt its laws to explicitly capture cloud computing expenses tied to a company’s research and development. While it seems cloud computing expenses should be qualified expenses considering they pass the criteria put forward by 26 CFR § 1.41-2, a company’s expenses today face much stricter scrutiny. In 1985, when businesses rented or leased computers, the cost was qualified as long as it passed the criteria of U.S. Code § 41(b). Now, when assessing cloud computing expenses for qualified research expenses, companies need to dissect each invoice to understand which components of their cloud computing expenses are qualified. Most cloud computing service providers now offer more than just computing power, such as file storage, data migration and application hosting, which are all nonqualified expenses.
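The invoice-dissection step described above can be sketched in code. The snippet below is purely illustrative: the category labels, amounts and the assumption that raw compute time is the qualified component are hypothetical, and actual qualification depends on analysis under U.S. Code § 41 and 26 CFR § 1.41-2, not on a label match.

```python
# Illustrative sketch only: separating a hypothetical cloud invoice into
# potentially qualified compute costs and nonqualified service costs.
# This is not tax advice; real classification requires legal analysis.

# Only raw compute time is treated as potentially qualified here (assumption);
# everything else is routed to the nonqualified bucket for review.
QUALIFIED_CATEGORIES = {"compute"}

def split_invoice(line_items):
    """Split (category, amount) pairs into qualified and other totals."""
    qualified = 0.0
    other = 0.0
    for category, amount in line_items:
        if category in QUALIFIED_CATEGORIES:
            qualified += amount
        else:
            other += amount
    return qualified, other

# Hypothetical monthly invoice from a cloud service provider.
invoice = [
    ("compute", 12_000.00),          # potentially qualified under § 41(b)(2)
    ("file storage", 1_800.00),      # nonqualified per the criteria above
    ("data migration", 450.00),      # nonqualified
    ("application hosting", 2_300.00),  # nonqualified
]

qualified_total, other_total = split_invoice(invoice)
print(qualified_total, other_total)  # 12000.0 4550.0
```

Note the conservative design: anything not explicitly identified as compute falls into the review bucket, mirroring the stricter scrutiny these expenses face today.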
Understanding the history of the R&D tax credit around computer leasing allows us to avoid common mistakes when claiming cloud computing as a qualified research expense. Cloud computing as a service inherently fulfills the criteria of computer leasing as a qualified expense put forward by 26 CFR § 1.41-2. However, cloud computing is often incorrectly claimed as a contract research expense under U.S. Code § 41(b)(1)(B). Moreover, cloud service providers often offer services such as data security, optical character recognition and financial computing management. These expenses, like the nonqualified expenses mentioned earlier, are often included in a cloud service provider’s monthly bill. This often leads companies to incorrectly claim every dollar paid to a cloud service provider as a qualified expense.
As businesses accrue ever-increasing cloud computing expenses for research and development activities, it is vital for tax specialists to counsel their clients on the opportunity to claim cloud computing as a computer leasing expense, while also having the knowledge and experience to distinguish qualified from nonqualified expenses.