Business, Innovation & Managing Life Q&A (April 5, 2023)
1 hour 30 minutes
Should I become a programmer? At what age do you think kids should start learning computer-related skills? Should programming be a core class for students, like math and English?
What do you think are good ways to introduce computational thinking to kids?
But can you really reach the point of asking whether something you want to do can be solved computationally without going through at least some trial-and-error process?
"Human-AI coauthorship" is what I call it now.
What would be some examples of the differences between programming, mathematical thinking and computational thinking? Or is there a difference? Is this just a colloquial thing?
Would you consider hiring someone without a technical background?
What is the minimum body of knowledge one should gather before being able to produce meaningful ideas in one research area?
What was the hardest part in starting Wolfram Research?
What are your thoughts on learning things outside of your domain of expertise? How should one balance their time between diving deep into their primary domain and exploring things outside of that?
What valuable new products will Wolfram Research build using AI in the next decade? What ideas do you have that you hope others build?
What do you think is going to happen in the next five years with AIs? What's the next big "surprise" thing like ChatGPT you think will come?
What's the worst thing that could happen with AI?
Are you concerned that we are building our murderer? Or that we would have to simulate worlds free of outside influence to determine the genuine intentions/alignment of an AI?
Which is better: ChatGPT calling a plugin, or a plugin/standalone calling ChatGPT? Depends on the application, probably.
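The two call directions in that question can be sketched in a few lines. This is a minimal illustration, not any real API: `chat_model` and `weather_plugin` are hypothetical stubs standing in for an LLM endpoint and an external service.

```python
# Hypothetical stubs: stand-ins for an LLM call and an external plugin/API.
def chat_model(prompt: str) -> str:
    return f"model-reply({prompt})"

def weather_plugin(city: str) -> str:
    return f"sunny in {city}"

# (a) Model-driven: the chat model is the front end; its runtime
# invokes a plugin when the conversation calls for it.
def model_drives(user_msg: str) -> str:
    decision = chat_model(user_msg)   # model decides a tool is needed
    return weather_plugin("Paris")    # runtime executes the tool call

# (b) App-driven: a standalone application does the work, then calls
# the chat model as one subroutine (e.g. to phrase the answer).
def app_drives(city: str) -> str:
    forecast = weather_plugin(city)
    return chat_model(f"Explain this forecast: {forecast}")
```

Pattern (a) fits open-ended conversation where the model decides when to use tools; pattern (b) fits applications that keep control of the workflow and use the model for one well-scoped step, which is why "depends on the application" is a reasonable answer.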
I'd love for an AI to be able to, for instance, teach me chess in the most optimal way by figuring out my weaknesses and how to reinforce my learning.
One thing to consider: If the galaxy is incredibly vast, why wouldn't an AI just leave Earth so that it can gather resources elsewhere? Or it could even explore the universe. Staying on Earth seems like it'd be very limiting to an AI or superintelligence.
How can one NOT get left behind socially and economically in the wake of AI innovation?
One thing I was thinking earlier: what we're going to see now is the "automation of AI," where lots of websites and APIs each do one machine learning task well, and data gets handed off from one model to the next.
I like the idea of LLMs acting as the core interface module for a "soup" of APIs in a cognitive/hybrid AI architecture.