'Feedback Loops' Are Valuable, And GPU Hoarding Was To 'Catch Up To TikTok'

Mark Zuckerberg
CEO and founder of Meta Platforms, Inc.

The CEO of a company is the head of the pack, and is in a position to embody the essence of the company's vision and values.

Mark Zuckerberg is the founder and CEO of Meta Platforms, the company that owns and operates Facebook, Instagram, Threads, and WhatsApp, among other products and services.

Zuckerberg's role involves selling the company's vision to investors, stakeholders, and the public; he is constantly pitching ideas, forging partnerships, and negotiating deals to propel the company forward. But he is also the chief who has to showcase the company's strengths, capabilities, and innovations.

And as a leader, he needs to see things before others do, and anticipate what's going to happen before it happens.

When it comes to generative AI, he may not have predicted it well enough, but he came prepared. And seeing how high the demand is, he doesn't want to be bound to just obtaining training data.

Mark Zuckerberg, CEO and founder of Meta.
"The thing that I think is going to be more valuable is the feedback loops rather than any kind of upfront corpus."

Feedback loops are used to train, retrain, and improve AI models over time based on their previous outputs.

In essence, a model's previous outputs are evaluated, and the results are fed back into training, so the model is informed when it is making mistakes.

These signals let AI models know when they make an error, for example, and provide them with data to adjust their future behavior.
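
To make this concrete, here's a minimal, hypothetical sketch of a feedback loop. Nothing here reflects Meta's actual systems; the model, feedback, and training functions are invented for illustration. The model produces an output, the output is scored against a target, and the resulting error signal adjusts the model's future predictions.

```python
# A toy feedback loop (illustrative only, not Meta's pipeline):
# the model acts, its output is evaluated, and the error signal
# is fed back to adjust future behavior.

def model(x, weight):
    """A deliberately tiny 'model': a single learned weight."""
    return weight * x

def feedback(prediction, target):
    """Error signal derived from the model's own previous output."""
    return target - prediction

def train_with_feedback_loop(data, weight=0.0, lr=0.1, epochs=20):
    for _ in range(epochs):
        for x, target in data:
            prediction = model(x, weight)          # model acts
            error = feedback(prediction, target)   # output is evaluated
            weight += lr * error * x               # feedback adjusts the model
    return weight

# Purely from feedback on its own outputs, the weight converges to 2.0.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
print(train_with_feedback_loop(data))
```

At production scale the error signal comes from sources like user engagement, human ratings, or automated checks rather than labeled targets, but the loop is the same: output, evaluation, adjustment.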

The race to train AI models has so increased the demand for data that third parties are now creating synthetic data just to meet it.

This is something Meta and other players are looking at.

" I think there's going to be a lot in synthetic data, where you are having the models trying to churn on different problems and see which paths end up working, and then use that to reinforce."

Zuckerberg sees feedback loops as the key to building powerful AI models, but also knows that there are risks in relying on them.

Models could reinforce some of their own mistakes, limitations, and biases if they're not trained on "good data" to begin with.

Sourcing new data for their insatiable AI models to consume, which theoretically will make them smarter, is now an obsession for companies racing to dominate AI.

It's worth noting that Zuckerberg, just like pretty much everyone else, failed to see the generative AI wave coming. But he did prepare for it.

He predicted that AI would rise in prominence, and because of that, he led Meta to build AI using the massive stockpile of Nvidia GPUs he had secured.

Before OpenAI launched ChatGPT, Meta had already acquired a hoard of GPUs, not to build generative AI tools, but to change its algorithm and build the related training infrastructure so Reels could "catch up to what TikTok was doing."

"I basically looked at [needing to catch up with TikTok] and I was like, 'Hey, we have to make sure that we're never in this situation again,'" Zuckerberg said.

But fortunately for Zuckerberg and his company, he decided to order twice as many GPUs as Meta needed, just in case.

"Our normal principle is there's going to be something on the horizon that we can't see yet."

Before the generative AI hype took off, Zuckerberg was still focused on the metaverse

"It came from being behind…it wasn't like, oh, I was so far ahead," Zuckerberg added. "Actually, most of the time, I think where we kind of make some decisions that end up seeming good is because we messed something up before and just do not want to repeat the mistake."