ALISON BEARD: Welcome to the HBR IdeaCast from Harvard Business Review. I’m Alison Beard.
Baidu launched in 2000 as a search engine platform. Fast-forward two decades, and it’s now one of the few companies in the world that offers a full AI stack. Its core businesses span mobile, cloud, intelligent driving, and other growth initiatives. And its products and services have attracted hundreds of millions of users and hundreds of thousands of enterprise customers. Today’s guest is leading all of that. For the third episode in our special series on the future of business, we’ll hear from Robin Li, the co-founder, chief executive, and chairman of Baidu. He explains how his company has built generative AI into its business, the technology trends he’s keeping an eye on, and how he anticipates these tools will transform our lives. Robin spoke to HBR editor-in-chief, Adi Ignatius, and took questions from the audience during our recent virtual Future of Business conference. Here’s their conversation.
ADI IGNATIUS: So Robin Li, I know it is very late for you in Beijing, so we appreciate your joining us live for this. Welcome.
ROBIN LI: Hi, Adi, thank you for having me. It’s great to be here.
ADI IGNATIUS: Well, it’s great to have you. Before we start, let me just remind everyone in the audience to put any questions you have for Robin in the Ask the Speaker chat, and I will try to get to as many as I can later. But Robin, let’s get to it. Your company, Baidu, introduced a ChatGPT-like product, ERNIE Bot, last year, which, last I saw, has more than 300 million users. I assume it’s been a learning experience for you. Can you talk a little bit about what you’ve learned since the first version came out and how it has evolved? Just tell us a little bit about all that.
ROBIN LI: Yeah, sure. We launched ERNIE Bot on March 16th of last year. I think that was the first ChatGPT-like chatbot from any public company in the world. Because we had been investing in AI, especially natural language-related AI, for quite a few years, we were able to quickly launch a chatbot based on our large language models. Over the past year and a half, a lot has happened. The technology has evolved very quickly and dramatically. What have we learned? There are a lot of things I should mention. The first is that users, developers, and customers care not only about the efficacy of the model; they also care about response speed, and they care about inference cost. So after March of last year, we rolled out a series of language models, or foundation models, to satisfy all kinds of different needs in different scenarios, meaning the model sizes vary greatly and the inference costs can be very different too. In certain cases, users don’t mind waiting 10 seconds to get the best answer, and in other scenarios you have to respond very, very quickly, with sub-second response times. And in terms of cost, we’ve been able to reduce the cost by about 99 percent, meaning the current inference cost is about one percent of what it was when we first launched. Having said all of that, I would say the most significant change we’ve seen over the past 18 to 20 months is in the accuracy of the answers from the large language models. Over the past 18 months, that problem has pretty much been solved, meaning that when you talk to a chatbot based on a frontier model, you can basically trust the answer. That’s a huge difference.
ADI IGNATIUS: Now, from my perspective, and maybe this is a US perspective, there was a huge wave of excitement about AI, particularly with the release of generative AI products. You’ve talked about search, but there don’t seem to be as many interesting use cases as maybe some of us had expected by now. So I’m interested in your view: are we in an AI bubble at this point? What’s the trajectory of the technology?
ROBIN LI: I think, like many other technology waves, a bubble is kind of inevitable. When you pass the stage of initial excitement, people are disappointed that the technology doesn’t meet the high expectations generated by that excitement. We’ve seen this many times: when the internet took off in the mid to late ’90s, there was a huge bubble. With the mobile internet, similar things happened. And this time, with generative AI, I think we will go through that kind of period too. But I think it also helps: it will wash out a lot of the fake innovations, or the products that don’t have market fit. After that, probably one percent of the companies will stand out, become huge, and create tremendous value for people and for society. I think we are just going through this kind of process. This year the sector is probably cooler than last year, but I think it’s also healthier than last year.
ADI IGNATIUS: And what’s the right business model? I mean, some large models, Meta’s Llama, for example, are open source. Others are closed source, like OpenAI’s GPT. Baidu, I think, has advocated for a closed-source approach. What’s the thinking behind that, and how does that set Baidu up to capitalize on AI technology?
ROBIN LI: Yeah, you mentioned closed source, but I would prefer to call it a commercial-grade model, a commercial-grade foundation model. When you look at the most advanced language models or foundation models, most of them are closed. And when people talk about open source, it’s kind of misleading to me. It’s different from the open source of Linux or Python, because with an open-source model, what you basically get is a bunch of parameters. You don’t know how those parameters were derived, and you have no way of changing those parameters. So it doesn’t have the effect of many, many people from different parts of the world contributing back to the main branch and making it better and better. Language models are very different: you can use a so-called open-source model to do things, but it’s very hard for you to contribute back. Another perception people have about open source is that it’s free, or at least cheaper than the commercial ones. But in the foundation model area, that’s also not true. As for our business model, we try to support all kinds of applications and all kinds of customers, both external customers, our cloud computing customers, and internal customers like Baidu Search, Baidu Maps, and [indecipherable]. There are lots of applications that leverage the power of ERNIE Bot, and basically we charge the inference cost for those API calls.
ADI IGNATIUS: So I want to switch now to robotaxis. A few days ago, Tesla announced its robotaxi plan, and Waymo has expanded its service. What do you think? Are we at the point where large-scale development of robotaxis, including your own product, has arrived? Is this the moment?
ROBIN LI: Yeah, we’ve been investing in self-driving technology for over a decade. It took us a very long time to get to this point, meaning that we now have more than 400 cars in the city of Wuhan, which covers about 9 million people, and a lot of people in that city are already used to taking a robotaxi. Our brand is called Apollo Go. They pay a fare that’s typically cheaper than a regular taxi. And I think the technology is ready in certain restricted areas; it’s not ready for anywhere, anytime. In professional terms, level five means you can drive anytime, anywhere. We are at level four, which means that once you know which area you are operating in, you can get rid of the human driver and provide a ride-hailing service. We cannot do that in the most crowded, most complicated traffic areas yet, but we can do it in most areas of most cities. I think the US is probably at a similar stage. Right now, the bottleneck is more about regulation. In most cities around the world, taxi service without a driver is not allowed yet, so we basically go to those select few cities where regulations allow us to operate such a service. And it’s a gradual process. It’s probably going to take another decade for the robotaxi service to be really everywhere, to become mainstream. But it will gradually become a service that people prefer.
ADI IGNATIUS: So you’ve talked about the inevitability of robotaxis putting human drivers out of work. I’m curious about your perspective, because you can’t talk about AI without at least discussing the issue of job displacement more broadly. What do you think? Do you think AI, generative AI, will replace humans on a large scale? And if so, how do we prepare for that?
ROBIN LI: Yeah, a lot of people compare the generative AI revolution to the industrial revolution. If you look back at the industrial revolution, similar things happened. A lot of old jobs went away, but more new jobs were created. Every time a technology revolution happens, the jobs that get lost are the hardest, toughest jobs, the jobs that are not so pleasant for human beings. And the new jobs that get created are more comfortable, more decent, and less stressful. I’m optimistic that this wave of innovation, of generative AI, will do the same thing. Another point I would like to make is that this kind of process is not something that’s going to happen overnight. It will take ten years, maybe 20 years, maybe 30 years. So human beings have time to prepare, and we need to be proactive. Companies, organizations, governments, and ordinary people all need to prepare for that kind of paradigm shift.
ADI IGNATIUS: So I’d love to talk about China more specifically and China’s approach to AI. Do you see a difference between China’s path to generative AI development and the approach taken by the rest of the world?
ROBIN LI: Yeah, I do see some differences. The most obvious difference is that China is more application-driven. We hear more about what kinds of applications can benefit from these frontier models, and a lot of startups are trying to find ways to leverage the power of foundation models. For a company like Baidu, our strategy is to reconstruct and rebuild almost every one of our existing products based on the ERNIE Bot foundation model. We have already seen very big changes in our existing products, search being the first and foremost. Right now, over 18 percent of Baidu search results are generated by ERNIE Bot. We also have this phenomenon in China of live-streaming shopping. I know it’s not that popular in the US, but it’s a big business in China. Live-streaming requires a real human devoted to that kind of service full time, but we can now create digital humans to do live-streaming shopping. The scripts can be generated by ERNIE Bot. It looks like a real person; it’s not an avatar per se, and it can look very real. Sometimes the consumers, the shoppers, just cannot tell whether it’s a digital human or a real human. And during the live stream, the digital human can also interact with the audience, answer questions, and react to their shopping activities, things like that. I think we have just gotten started. There are lots and lots of use cases we are seeing where, by leveraging the power of generative AI, businesses can get much better ROI: they can get more revenue and they can save more costs.
ADI IGNATIUS: So just to follow up on that, and I don’t want to get into politics at all, but there is certainly a concern in the US about dealing with Chinese tech companies (TikTok is facing a situation now): that the data could be misused, that our data could be… Just given the differences between the Chinese system and the American or Western system, the concern is that data is not safe, that the government could have access to it, et cetera, et cetera. How do you respond to those concerns, in terms of your company and your products?
ROBIN LI: Well, first of all, Baidu is a NASDAQ-listed public company, and we comply with any applicable law. Whatever practices we have, we fully disclose them. We do respect user privacy. We have a data compliance committee within our company, with very high-ranking executives chairing it, and we take this very seriously. Chinese consumers are no different from consumers in the rest of the world: they care about their data, and they don’t want other people looking at it. In order to gain the trust of our users, we need to do the right thing, and in order to remain a publicly listed company in the US, we also need to comply with all the applicable laws in the US.
ADI IGNATIUS: Okay. I want to get to a couple of audience questions. There are a couple, one from CLJ and one from Vicky; I’m not sure where they’re from. But they’re both asking about sustainability and environmental costs. So here’s one: “How do you think about the environmental costs of AI when you consider return on investment? When is a use case too taxing on the environment to be valuable as a product or to remain valuable to society?”
ROBIN LI: Yeah, that’s a very good question. But I would argue that the efficiency gains and the value created through this kind of innovation will more than offset the power consumption. With generative AI, we will find ways to generate green energy faster than we would without it. We will be able to complete other kinds of tasks better, faster, and at lower cost. So net-net, I think generative AI will have a positive impact on the environment.
ADI IGNATIUS: All right, we probably have time for just one more question. This is an audience question from Brian, and do what you want with this one. Brian asks, “What do you think the world will look like in 10 to 15 years? What will our interactions with technology be like then?”
ROBIN LI: Yeah, it’s very exciting to think about that. I think generative AI is really disruptive. It will give a lot of ordinary people the power of a programmer. What does that mean? Nowadays, engineers and programmers get paid, let’s say, $200,000 per year, and that’s because they are quite powerful: they can program things, they can create software that’s very valuable. And I think it probably won’t take 10 years; we should see this in around five to ten years. A person who can speak a natural language, be it English or Chinese, will have the power of a software engineer. You can imagine how much productivity we can gain just by having that kind of capability. When I was in college, we learned programming languages, but what we learned was assembly language. Nobody uses assembly language to program anymore; nowadays, people use Python or C++. Five or ten years from now, nobody will use Python or C++ anymore. They will just use English or Chinese to do whatever they want, and everyone can do that. So think about that. The world will be completely different ten years from now.
ALISON BEARD: That was Robin Li, the co-founder, chief executive, and chairman of Baidu, speaking to HBR editor-in-chief Adi Ignatius at our recent virtual Future of Business conference. I hope you listen to all of our Future of Business series and all of the episodes we have on the HBR IdeaCast about leadership, strategy, and the future of work. Find us at hbr.org/podcasts or search HBR in Apple Podcasts, Spotify, or wherever you listen. And if you don’t already subscribe to HBR, please do. It’s the best way to support our show. Go to hbr.org/subscribe to learn more. Thanks to our team: Senior Producers Anne Saini and Mary Dooe, Associate Producer Hannah Bates, Audio Product Manager Ian Fox, and Senior Production Specialist Rob Eckhardt. And thanks to you for listening to the HBR IdeaCast. I’m Alison Beard.