The tech industry loves its garage start-up stories. From Hewlett-Packard to Google, stories of bootstrapped companies that grew into giants have inspired generations of entrepreneurs.
But start-ups trying to keep up with today’s cutting-edge technology, such as the artificial intelligence behind chatbots like ChatGPT and Google Bard, require massive amounts of funding and computing power, perhaps making those inspirational stories a thing of the past.
In 2019, Aidan Gomez and Nick Frosst left Google to form an AI start-up in Toronto called Cohere that could compete with their former employer. After several months, they went back to Google and asked if it would sell them the enormous computing power they needed to build their own AI technology. After the arrangement was personally approved by Google’s chief executive, Sundar Pichai, the tech giant gave them what they wanted.
“It’s ‘Game of Thrones.’ That’s it,” said David Katz, a partner at Radical Ventures, the first investor in Cohere. Big companies like Google, Microsoft and Amazon control the chips, he said. “They’re controlling computing power,” he said. “They’re choosing who gets it.”
It is hard to build a groundbreaking AI start-up without the backing of the “hyperscalers” that control the large data centers capable of running AI systems. That has put the industry’s established giants back in the driver’s seat, leading to what many expect to be the most significant change in the tech industry in decades.
OpenAI, the start-up behind ChatGPT, recently raised $10 billion from Microsoft. It will put most of that money back into Microsoft’s hands as it pays for time on huge clusters of computer servers operated by the bigger company. These clusters, comprising thousands of specialized computer chips, are essential for improving and expanding the skills of ChatGPT and similar technologies.
Competitors may not be able to keep pace with OpenAI unless they gain access to a similar amount of computing power. Cohere recently raised $270 million, bringing its total funding to over $440 million. It will use most of that money to buy computing power from companies like Google.
Other start-ups have made similar arrangements, most notably a Silicon Valley company called Anthropic, which was founded in 2021 by a group of former OpenAI researchers; Character.AI, founded by two leading Google researchers; and Inflection AI, founded by a former Google executive. Inflection raised $1.3 billion in funding last week, bringing its total to $1.5 billion.
At Google, Mr. Gomez was part of the small research team that designed the Transformer, the foundational technology used to build chatbots like ChatGPT and Google Bard.
The Transformer is a powerful example of what scientists call a neural network: a mathematical system that learns skills by analyzing data. Neural networks have existed for years, helping to power everything from talking digital assistants like Siri to instant translation services like Google Translate.

Transformers took this idea into new territory. Running on hundreds or even thousands of computer chips, they can analyze far more data, far more quickly.
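At the heart of the Transformer is an operation called attention, which lets every word in a piece of text weigh its relationship to every other word. The sketch below is a toy illustration in plain Python, not anything resembling a production system; real models run this math across billions of parameters on thousands of chips:

```python
import math

def attention(queries, keys, values):
    """Scaled dot-product attention over small lists of vectors.

    For each query, score every key by similarity, convert the
    scores into weights with softmax, and return the weighted
    average of the value vectors.
    """
    dim = len(keys[0])
    outputs = []
    for q in queries:
        # Dot-product similarity between this query and every key.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(dim)
                  for k in keys]
        # Softmax: exponentiate and normalize so weights sum to 1.
        peak = max(scores)
        exps = [math.exp(s - peak) for s in scores]
        total = sum(exps)
        weights = [e / total for e in exps]
        # Blend the value vectors according to the weights.
        outputs.append([sum(w * v[i] for w, v in zip(weights, values))
                        for i in range(len(values[0]))])
    return outputs

# Three token vectors attending to one another (self-attention).
tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
out = attention(tokens, tokens, tokens)
```

Each output is a convex blend of the inputs, which is why stacking many such layers lets a model relate distant parts of a text to one another.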
Using this technology, companies such as Google and OpenAI began building systems that learned from massive amounts of digital text, including Wikipedia articles, digital books, and chat logs. As these systems analyzed more and more data, they learned to generate text themselves, including term papers, blog posts, poetry, and computer code.
These systems – called large language models – now underpin chatbots such as Google Bard and ChatGPT.
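Large language models are vastly more sophisticated, but the core idea of learning to predict the next word from example text can be sketched with a toy bigram model. This is purely illustrative and assumes nothing about how GPT-4, Bard, or Cohere’s models actually work:

```python
import random
from collections import defaultdict

def train_bigram_model(text):
    """Record, for each word, the words observed to follow it."""
    model = defaultdict(list)
    words = text.split()
    for current, nxt in zip(words, words[1:]):
        model[current].append(nxt)
    return model

def generate(model, start, length=8, seed=0):
    """Walk the model, sampling a plausible next word at each step."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        followers = model.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

# A tiny "corpus"; real systems train on trillions of words.
corpus = "the model reads text and the model writes text"
model = train_bigram_model(corpus)
sample = generate(model, "the")
```

Scaling this next-word-prediction idea up to enormous neural networks and enormous corpora is, in rough outline, what turns predicted words into coherent essays and code.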
Long before the advent of ChatGPT, Mr. Gomez left Google to start his own company with Mr. Frosst and another Toronto entrepreneur, Ivan Zhang. The aim was to build a large language model that could rival Google’s.
At Google, he and his fellow researchers had access to an almost unlimited supply of computing power. After leaving the company, he needed something similar. So he and his co-founders bought it from Google, which sells access to the same kind of chips through its cloud computing services.
Over the next three years, Cohere built a large language model that few others can rival. Now it is selling that technology to other businesses. The idea is to provide what any company needs to build and run its own AI applications, from chatbots to search engines to personal tutors.
“The strategy is to create a platform that other people can build on and use,” Mr. Gomez said.
OpenAI provides a service along the same lines called GPT-4, which is already being used by many businesses to build chatbots and other applications. The technology can analyze, create and edit text, and it will soon handle images and sounds as well. OpenAI is building a version of GPT-4 that can examine a picture, instantly describe it and even answer questions about it.
Microsoft’s chief executive, Satya Nadella, said the company’s arrangement with OpenAI is one of many mutually beneficial relationships it has developed with smaller competitors over the years. “I grew up in a company that always did these kinds of deals with other companies,” he told The New York Times earlier this year.
As the industry races to match GPT-4, entrepreneurs, investors and pundits are debating who will be the ultimate winner. Most agree that OpenAI is the leader in the field. But Cohere and a small group of other companies are building similar technology.
The tech giants are in a strong position because they have the vast resources needed to advance these systems further than anyone else. Google also holds a patent on the Transformer, the fundamental technology behind the AI systems that Cohere and many other companies are building.
But there is one wild card: open source software.
Meta, another giant with the computing power needed to create the next wave of AI, recently open-sourced its latest large language model, which means anyone can reuse it and build on top of it. Many in the field believe that this kind of freely available software will allow anyone to compete.
“The collective mind of every researcher on Earth will beat any company,” said Amr Awadallah, chief executive of the AI start-up Vectara and a former Google executive. But those researchers will still have to pay for access to the data centers of much larger competitors.