The Aarthi and Sriram Show

EP 84: The Martin Casado Interview: Inside SB 1047, California's AI safety bill

Episode Notes

Sriram and Aarthi are joined by Martin Casado, a general partner at Andreessen Horowitz, to discuss the future of California's AI Safety Bill, SB 1047. The bill, which sets out to regulate AI at the model level, is slated for a California Assembly vote in August. If it passes, one signature from Governor Gavin Newsom would cement it into California law.

Show notes:

(00:00) Trailer and intro on SB 1047

(00:43) Welcome Martin Casado of a16z

(02:30) What is SB 1047?

(04:10) What is the origin of the SB 1047 bill?

(06:14) Should AI be regulated?

(10:20) Is AI a paradigm shift?

(13:36) Who is funding this bill? 'Baptists vs. bootleggers' and Nick Bostrom's book Superintelligence as the origin point

(16:35) Scott Wiener’s support of the bill

(19:34) How open source has benefited the software world, and the risks this bill poses to open source

(22:10) Are large models more dangerous?

(24:10) Is there a correlation between model size and associated risk?

(26:45) How would Martin frame AI regulation? What would a better approach look like?

(28:46) Nancy Pelosi opposes the bill, while some prominent researchers support it. Who makes up the two camps, and what motivates them?

(33:10) Why does Pelosi oppose the bill?

(35:00) Leopold Aschenbrenner and the "Situational Awareness" essays

(37:20) How non-systems people view systems: computer systems are not parametric but sigmoidal

(41:30) AI is the ultimate 'deus ex machina' (god from the machine)

(44:15) Sriram compares AI regulation to a Game of Thrones episode

(46:00) Anthropic’s investment in AI safety

(48:15) If you’re an AI founder, what can you do about this bill today?

(50:00) Why is this bill a personal issue for Martin?

Other episodes you might enjoy:

EP 74 - How To Fix Google's WOKE AI Disaster

EP 63 - Lessons From Networking In Silicon Valley

EP 59 - Why We Moved to London, The Elon Musk Book, Should You Get An MBA