Assorted Recent Talks on AI

Bill Gates on AI and the rapidly evolving future of computing

“Godfather of artificial intelligence” talks impact and potential of AI

GTC 2023 Keynote with NVIDIA CEO Jensen Huang

For Discussion: What is your take on the following comments by Bill Gates on the future of programming?

Kevin Scott: One of the things for me that has been really fascinating, and I think I’m going to say this just as a reminder to folks who are thinking about pursuing careers in Computer Science and becoming programmers. I spent most of my training as a computer scientist, and the early part of my career, as a systems person. I wrote compilers, tons of assembly language, and designed programming languages, which I know you did as well. I feel a lot of the things that I studied in grad school, such as parallel optimization and high-performance computer architectures, have become relevant again. I left grad school, went to Google, and thought I would never use any of this stuff ever again. But then, all of a sudden, now we’re building supercomputers to train models, and these things are relevant again. I think it’s interesting. I wonder what Bill Gates, the young programmer, would be working on if you were in the mix right now, writing code for these things, because there are so many interesting things to work on. But what do you think you, as a 20-something-year-old young programmer, would get really excited about in this stack of technology?

Bill Gates: Well, there is an element of this that’s fairly mathematical. I feel lucky that I did a lot of math. That was a gateway to programming for me, including all the crazy stuff with numerical matrices and their properties. There are people who came to programming without that math background who do need to go and get a little bit of the math. I’m not saying it’s super hard, but they should go and do it so that when they see those funny equations, they’re comfortable with them, because a lot of the computation will be that thing instead of classic programming.
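
Gates’s point that “a lot of the computation will be that thing instead of classic programming” is easiest to see in a small example. The sketch below is a hypothetical illustration (assuming Python with NumPy, which is not mentioned in the talk): a single neural-network layer is essentially one matrix multiplication plus a nonlinearity, rather than branch-heavy classic code.

    # A single neural-network layer: outputs = activation(inputs @ weights + bias).
    # The "program" here is linear algebra, not branching logic.
    import numpy as np

    rng = np.random.default_rng(0)
    inputs = rng.standard_normal((32, 784))    # a batch of 32 flattened inputs
    weights = rng.standard_normal((784, 128))  # the layer's parameters
    bias = np.zeros(128)

    outputs = np.maximum(inputs @ weights + bias, 0.0)  # ReLU activation
    print(outputs.shape)  # (32, 128)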

The paradox is that when I started out, writing tiny programs was super necessary. The original Macintosh was a 128K machine, 128K bytes, 22K of which was the bitmap screen. Almost nobody could write programs to fit in there. Microsoft, our approach, our tools, let us write code for that machine, and really only we and Apple succeeded. Then when it became 512K, a few people succeeded, but even that, people found very difficult. I remember thinking, as memory got to be 4 gigabytes, that all these programmers don’t understand discipline and optimization; they’re just allowed to waste resources. But now, with these things where you’re operating with billions of parameters, the old questions come back: can I skip some of those parameters? Can I simplify some of those parameters? Can I precompute various things? If I have many models, can I keep deltas between models instead of storing each one in full? All the optimizations that made sense on those very resource-limited machines, well, some of them come back in this world where, when you’re going to do billions of operations, or literally hundreds of billions of operations, we are pushing the absolute limit of the cost and performance of these computers.
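
The optimizations Gates lists map onto familiar model-compression ideas. Below is a minimal, hypothetical sketch (again assuming Python with NumPy and toy numbers, none of which come from the talk) of three of them: skipping small parameters (magnitude pruning), simplifying parameters (8-bit quantization), and keeping only the delta between two related models instead of two full copies.

    import numpy as np

    rng = np.random.default_rng(1)
    weights = rng.standard_normal(1_000_000).astype(np.float32)  # toy stand-in for a model

    # 1) Skip parameters: zero out the smallest 90% by magnitude (pruning).
    threshold = np.quantile(np.abs(weights), 0.9)
    pruned = np.where(np.abs(weights) >= threshold, weights, 0.0)

    # 2) Simplify parameters: round to 8-bit integers (uniform quantization).
    scale = np.abs(weights).max() / 127.0
    quantized = np.round(weights / scale).astype(np.int8)   # 4x smaller than float32
    dequantized = quantized.astype(np.float32) * scale      # approximate reconstruction

    # 3) Keep deltas between models: store a fine-tuned variant as a sparse diff.
    finetuned = weights.copy()
    finetuned[:1000] += 0.01                  # pretend only a small slice changed
    delta_idx = np.nonzero(finetuned - weights)[0]
    delta_val = (finetuned - weights)[delta_idx]

    print("kept after pruning:", np.count_nonzero(pruned))
    print("max quantization error:", float(np.abs(weights - dequantized).max()))
    print("parameters stored in delta:", len(delta_idx))

Real systems use far more sophisticated versions of these tricks, but the resource-accounting instinct Gates describes is the same.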

One thing that is very impressive is that the speed-ups, even in the last six months, on some of these things have been better than expected. That’s fantastic because you get the hardware speedup and the software speedup multiplied together. That raises the question: how resource-bottlenecked will we be over the next couple of years? That’s less clear to me now that these improvements are taking place, although I still worry about it, and about how we make sure that companies broadly, and Microsoft in particular, allocate those resources in a smart way. Understanding algorithms, understanding why certain things are fast and slow, that is fun. The systems work that in my early career was about just one computer, and later a network of computers, is now about data centers with millions of CPUs, and it’s incredible the optimization that can be done there. Just how the power supplies work, or how the network connections work. Anyway, in almost every area of computer science, including database techniques and programming techniques, this forces us to think in a really new way.
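
Gates’s aside that hardware and software speedups multiply rather than add is simple arithmetic, but worth making concrete; the figures below are purely hypothetical.

    # Hypothetical figures: a 3x faster accelerator combined with software
    # (kernel/algorithm) improvements worth 2.5x gives 7.5x overall, not 5.5x.
    hardware_speedup = 3.0
    software_speedup = 2.5
    print(f"combined speedup: {hardware_speedup * software_speedup:.1f}x")  # 7.5x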

--

Ismail Ali Manik

Uni. of Adelaide & Columbia Uni NY alum; World Bank, PFM, Global Development, Public Policy, Education, Economics, book-reviews, MindMaps, @iamaniku